Implementation of a Multi-Layer Perceptron with a single hidden layer of configurable width. Training uses Stochastic Gradient Descent (SGD).
[1] - The Backpropagation Algorithm for Training Neural Networks