Error Surface
See: Error, Surface, Backpropagation, Gradient Descent, Quadratic Error Surface, Loss Function.
References
2008
- (Wilson, 2008a) ⇒ Bill Wilson. (2008). “The Machine Learning Dictionary for COMP9414.” University of New South Wales, Australia.
- error surface: When the total error of a backpropagation-trained neural network is expressed as a function of the weights and graphed (to the extent that this is possible with a large number of weights), the resulting surface is termed the error surface. The course of learning can be traced on the error surface: since learning is supposed to reduce error, as the learning algorithm changes the weights, the current point on the error surface should descend into a valley of the error surface. The "point" defined by the current set of weights is termed a point in weight space; thus weight space is the set of all possible values of the weights. See also local minimum and gradient descent.
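
The descent described above can be sketched numerically. The following is a minimal illustration (not from Wilson's dictionary) using an assumed toy model with a single weight, so the error surface is just a parabola over a one-dimensional weight space; gradient descent traces a path down that surface toward its minimum.

```python
import numpy as np

# Assumed toy dataset: fit y = w * x with a single weight w.
# The data is perfectly fit by w = 2, so the error surface has its
# minimum (zero error) at that point in weight space.
X = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])

def total_error(w):
    """Total squared error as a function of the weight w.

    This function *is* the error surface: for each point w in
    weight space it gives the height of the surface.
    """
    return np.sum((w * X - y) ** 2)

# Gradient descent: repeatedly step downhill on the error surface.
w = 0.0            # starting point in weight space
lr = 0.01          # learning rate (step size)
trace = [(w, total_error(w))]
for _ in range(100):
    grad = np.sum(2 * (w * X - y) * X)   # d(total_error)/dw
    w -= lr * grad                       # move against the gradient
    trace.append((w, total_error(w)))

print(f"final weight: {w:.4f}, final error: {total_error(w):.6f}")
```

The `trace` list records the path taken through weight space: each entry pairs a weight value with its error, and plotting it against the error surface would show the point sliding down into the valley at w = 2, exactly the picture the definition describes.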