Neural network classifier stuck at 85% accuracy on the MNIST dataset
I am writing a backpropagation neural network from scratch in C# to classify images from the MNIST dataset. The network is fully connected, with layout (784, 800, 10): 784 neurons in the input layer, one hidden layer with 800 neurons, and 10 neurons in the output layer. As the activation function I use the sigmoid (values from 0 to 1). I don't use biases. My learning rate is 0.002 and my mini-batch size is 60. Accuracy rises from 75% to 85% and then falls off:

epoch 1 - 75%
epoch 2 - 79%
epoch 3 - 81%
epoch 4 - 83%
epoch 5 - 84.5%
epoch 6 - 85%
epoch 7 - 84.5%
epoch 10 - 80%
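For reference, the forward pass of the network described above (784 inputs, 800 hidden sigmoid units, 10 sigmoid outputs, no biases) can be sketched as follows. This is a minimal Python/NumPy sketch of the stated architecture, not the poster's C# code, and the small-random weight initialization is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes from the question: 784 inputs, 800 hidden, 10 outputs.
n_in, n_hidden, n_out = 784, 800, 10

# Small random weights; no bias terms, matching the question.
W1 = rng.normal(0.0, 0.01, (n_in, n_hidden))
W2 = rng.normal(0.0, 0.01, (n_hidden, n_out))

def forward(x):
    """Forward pass for one mini-batch of flattened images, shape (batch, 784)."""
    h = sigmoid(x @ W1)   # hidden activations, shape (batch, 800)
    y = sigmoid(h @ W2)   # output activations, shape (batch, 10)
    return y

batch = rng.random((60, n_in))   # mini-batch of 60, as in the question
out = forward(batch)
print(out.shape)  # (60, 10)
```

Because every output goes through a sigmoid, each of the 10 output activations lies strictly between 0 and 1, and the predicted class is simply the index of the largest one.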
My question: can I go beyond that without a CNN, and if so, how? (I have tried different learning rates, from 0.01 to 0.0001, and different batch sizes, from 10 to 1000, neither of which improved the accuracy.)
Answer by sjhalayka · Jan 13, 2020 at 12:22 AM
85% is not that bad. If your network were learning nothing, accuracy would be closer to 10% (random guessing over 10 classes).
Anyway, have you tried altering the number of neurons in your hidden layer? When I create a hidden layer, I generally give it sqrt(num_input_neurons*num_output_neurons). Let me know how that works out for you.
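As a quick check, that rule of thumb applied to the question's network (784 inputs, 10 outputs) suggests a much smaller hidden layer than the 800 neurons currently used. A minimal sketch of the computation (one heuristic among many, not a guarantee):

```python
import math

def suggested_hidden_size(n_inputs, n_outputs):
    """Rule of thumb from the answer: hidden size ~ sqrt(inputs * outputs)."""
    return round(math.sqrt(n_inputs * n_outputs))

# For the MNIST network in the question: sqrt(784 * 10) = sqrt(7840) ~ 88.5
print(suggested_hidden_size(784, 10))  # 89
```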