Feed-Forward Neural Network Training Algorithm
A Feed-Forward Neural Network Training Algorithm is a neural network training algorithm that can be implemented by a feed-forward neural network training system (to solve a feed-forward neural network training task, i.e., to train a feed-forward neural network).
- AKA: FF Network Trainer.
- Context:
- It can range from being a Perceptron Training Algorithm to being a Deep Feed-Forward Neural Network Training Algorithm.
- …
- Example(s):
- a Backpropagation of Error (BP) Algorithm (see the reference below).
- Counter-Example(s):
- See: Convolution ANN Training Algorithm, Stochastic Gradient Descent Algorithm, Artificial Neural Networks.
References
2017a
- (Munro, 2017) ⇒ Munro, P. (2017). "Backpropagation". In: Sammut, C., & Webb, G.I. (eds.) "Encyclopedia of Machine Learning and Data Mining". Springer, Boston, MA.
- QUOTE: Backpropagation of error (henceforth BP) is a method for training feed-forward neural networks (see Artificial Neural Networks). A specific implementation of BP is an iterative procedure that adjusts network weight parameters according to the gradient of an error measure. The procedure is implemented by computing an error value for each output unit, and by backpropagating the error values through the network.
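The following is a minimal sketch, not taken from Munro's entry, of the iterative BP procedure described in the quote, for a single-hidden-layer feed-forward network with sigmoid units and a squared-error measure. The network size, learning rate, and function name `train_bp` are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, Y, n_hidden=8, lr=0.5, epochs=5000, seed=0):
    """Illustrative backpropagation for a one-hidden-layer feed-forward net.

    X: (n_samples, n_inputs); Y: (n_samples, n_outputs) with targets in [0, 1].
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))   # input -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, Y.shape[1]))   # hidden -> output weights
    b2 = np.zeros(Y.shape[1])

    for _ in range(epochs):
        # Forward pass through the feed-forward network.
        H = sigmoid(X @ W1 + b1)          # hidden activations
        O = sigmoid(H @ W2 + b2)          # output activations

        # Error value for each output unit: gradient of 0.5 * squared error
        # w.r.t. the output pre-activations, using sigmoid'(a) = O * (1 - O).
        delta_out = (O - Y) * O * (1.0 - O)

        # Backpropagate the error values to the hidden units.
        delta_hid = (delta_out @ W2.T) * H * (1.0 - H)

        # Adjust the weight parameters along the negative error gradient.
        W2 -= lr * H.T @ delta_out
        b2 -= lr * delta_out.sum(axis=0)
        W1 -= lr * X.T @ delta_hid
        b1 -= lr * delta_hid.sum(axis=0)

    return W1, b1, W2, b2

# Example call on the XOR task (purely illustrative data).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
params = train_bp(X, Y)
```

The loop mirrors the two steps named in the quote: an error value is computed at each output unit, then those errors are propagated backwards to obtain gradients for every weight layer, which are used to adjust the weights.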