1994 TrainingFeedforwardNetworkswith
- (Hagan & Menhaj, 1994) ⇒ Martin T. Hagan, and Mohammad B. Menhaj. (1994). “Training Feedforward Networks with the Marquardt Algorithm.” In: IEEE Transactions on Neural Networks Journal, 5(6). doi:10.1109/72.329697
Subject Headings:
Notes
Cited By
- http://scholar.google.com/scholar?q=%221994%22+Training+Feedforward+Networks+with+the+Marquardt+Algorithm
- http://dl.acm.org/citation.cfm?id=2325862.2328440&preflayout=flat#citedby
Quotes
Abstract
The Marquardt algorithm for nonlinear least squares is presented and is incorporated into the backpropagation algorithm for training feedforward neural networks. The algorithm is tested on several function approximation problems, and is compared with a conjugate gradient algorithm and a variable learning rate algorithm. It is found that the Marquardt algorithm is much more efficient than either of the other techniques when the network contains no more than a few hundred weights.
…
III. MARQUARDT-LEVENBERG MODIFICATION
While backpropagation is a steepest descent algorithm, the Marquardt-Levenberg algorithm [14] is an approximation to Newton's method. Suppose that we have a function [math]\displaystyle{ V(\underline{x}) }[/math] which we want to minimize with respect to the parameter vector [math]\displaystyle{ \underline{x} }[/math]; then Newton's method would be [math]\displaystyle{ \Delta \underline{x} = -[\triangledown^2 V(\underline{x})]^{-1} \triangledown V(\underline{x}) \ \ (16) }[/math] where [math]\displaystyle{ \triangledown^2 V(\underline{x}) }[/math] is the Hessian matrix and [math]\displaystyle{ \triangledown V(\underline{x}) }[/math] is the gradient. If we assume that [math]\displaystyle{ V(\underline{x}) }[/math] is a sum of squares function
- [math]\displaystyle{ V(\underline{x}) = \sum^N_{i=1} e^2_i(\underline{x}) \ \ (17) }[/math] …
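The quoted passage can be made concrete with a small sketch. For a sum-of-squares function as in Eq. (17), the Marquardt-Levenberg method replaces the Hessian in Newton's step (Eq. 16) with [math]\displaystyle{ J^T J + \mu I }[/math], where [math]\displaystyle{ J }[/math] is the Jacobian of the residuals. The problem below (fitting [math]\displaystyle{ a e^{bt} }[/math] to synthetic data, with a fixed damping parameter `mu`) is a hypothetical illustration, not an example from the paper, which adapts the update for neural-network training:

```python
import numpy as np

def lm_step(x, residual_fn, jacobian_fn, mu):
    """One Marquardt-Levenberg update for minimizing V(x) = sum_i e_i(x)^2.

    Approximates Newton's step by replacing the Hessian with
    J^T J + mu*I, where J is the Jacobian of the residual vector e(x).
    """
    e = residual_fn(x)          # residual vector e(x)
    J = jacobian_fn(x)          # Jacobian matrix d e_i / d x_j
    n = x.size
    # Solve (J^T J + mu I) dx = -J^T e  rather than forming an inverse
    dx = np.linalg.solve(J.T @ J + mu * np.eye(n), -J.T @ e)
    return x + dx

# Hypothetical curve-fitting example: y = a * exp(b * t)
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)           # noiseless synthetic data

def residuals(p):
    a, b = p
    return a * np.exp(b * t) - y

def jacobian(p):
    a, b = p
    # Columns: d e / d a  and  d e / d b
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

p = np.array([1.0, -1.0])            # initial guess
mu = 1e-2                            # fixed damping for this sketch
for _ in range(20):
    p = lm_step(p, residuals, jacobian, mu)

print(np.round(p, 3))                # converges toward [2.0, -1.5]
```

In practice (and in the paper's algorithm) `mu` is adapted each iteration: decreased when a step reduces [math]\displaystyle{ V(\underline{x}) }[/math] (approaching Newton behavior) and increased otherwise (approaching small-step gradient descent).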
References
Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year
---|---|---|---|---|---|---|---|---|---
Martin T. Hagan; Mohammad B. Menhaj | 5(6) | 1994 | Training Feedforward Networks with the Marquardt Algorithm | | IEEE Transactions on Neural Networks | | 10.1109/72.329697 | | 1994