Stochastic Approximation Algorithm
A Stochastic Approximation Algorithm is an Iterative Stochastic Optimization Algorithm that can solve an approximation task.
- AKA: SA.
- Counter-Example(s):
- See: Stochastic Gradient Descent Algorithm.
References
2009
- http://en.wikipedia.org/wiki/Stochastic_approximation
- Stochastic approximation methods are a family of iterative stochastic optimization algorithms that attempt to find zeroes or extrema of functions which cannot be computed directly, but only estimated via noisy observations. The first, and prototypical, algorithms of this kind were the Robbins-Monro and Kiefer-Wolfowitz algorithms.
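The prototypical Robbins-Monro recursion can be sketched in a few lines: it seeks a root of a function observable only through noise, using a step-size schedule a_n with Σ a_n = ∞ and Σ a_n² < ∞. The target function, noise model, and parameter values below are illustrative assumptions, not taken from the source:

```python
import random

def robbins_monro(noisy_f, x0, n_steps=10_000, a=1.0):
    """Robbins-Monro iteration x_{n+1} = x_n - a_n * Y_n, where Y_n is a
    noisy observation of f(x_n) and a_n = a/(n+1) satisfies the classical
    step-size conditions (sum a_n = inf, sum a_n^2 < inf)."""
    x = x0
    for n in range(n_steps):
        a_n = a / (n + 1)
        x -= a_n * noisy_f(x)
    return x

# Toy problem (illustrative): find the root of f(x) = x - 2,
# observed through additive Gaussian noise.
random.seed(0)
noisy_f = lambda x: (x - 2.0) + random.gauss(0.0, 1.0)
root = robbins_monro(noisy_f, x0=0.0)
```

With decreasing steps the noise averages out, so `root` lands close to the true zero at 2 even though no single observation of `f` is exact.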
2003
- (Kushner & Yin, 2003) ⇒ Harold Joseph Kushner, and George Yin. (2003). “Stochastic Approximation and Recursive Algorithms and Applications, 2nd edition." Springer. ISBN:0387008942
- The book presents a thorough development of the modern theory of stochastic approximation or recursive stochastic algorithms for both constrained and unconstrained problems. There is a complete development of both probability one and weak convergence methods for very general noise processes. The proofs of convergence use the ODE method, the most powerful to date, with which the asymptotic behavior is characterized by the limit behavior of a mean ODE. The assumptions and proof methods are designed to cover the needs of recent applications. The development proceeds from simple to complex problems, allowing the underlying ideas to be more easily understood. Rate of convergence, iterate averaging, high-dimensional problems, stability-ODE methods, two-time-scale, asynchronous and decentralized algorithms, general correlated and state-dependent noise, perturbed test function methods, and large deviations methods are covered. Many motivational examples from learning theory, ergodic cost problems for discrete event systems, wireless communications, adaptive control, signal processing, and elsewhere illustrate the application of the theory. This second edition is a thorough revision, although the main features and the structure remain unchanged.
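The "ODE method" mentioned in this blurb connects the discrete recursion to a limiting differential equation; in generic notation (the symbols here are standard in the field, not taken verbatim from the book):

```latex
\theta_{n+1} = \theta_n + \epsilon_n Y_n, \qquad
Y_n = \bar{g}(\theta_n) + \xi_n, \qquad
\epsilon_n \to 0, \quad \sum_n \epsilon_n = \infty,
```

so that, as the step sizes shrink, the interpolated iterates track the mean ODE \(\dot{\theta} = \bar{g}(\theta)\), and the asymptotic behavior of \(\theta_n\) is characterized by the limit sets of that ODE.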
2002
- (Chen, 2002) ⇒ Hanfu Chen. (2002). “Stochastic Approximation and Its Applications." Springer. ISBN:1402008066
- This book presents the recent development of stochastic approximation algorithms with expanding truncations based on the TS (trajectory-subsequence) method, a newly developed method for convergence analysis. This approach is so powerful that conditions used for guaranteeing convergence have been considerably weakened in comparison with those applied in the classical probability and ODE methods. The general convergence theorem is presented for sample paths and is proved in a purely deterministic way. The sample-path description of theorems is particularly convenient for applications. Convergence theory takes both observation noise and structural error of the regression function into consideration. Convergence rates, asymptotic normality and other asymptotic properties are presented as well. Applications of the developed theory to global optimization, blind channel identification, adaptive filtering, system parameter identification, adaptive stabilization and other problems arising from engineering fields are demonstrated.
1952
- (Kiefer & Wolfowitz, 1952) ⇒ J. Kiefer and J. Wolfowitz. (1952). “Stochastic Estimation of the Maximum of a Regression Function.” In: Annals of Mathematical Statistics 23(3).
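The Kiefer-Wolfowitz scheme cited here seeks the maximum of a regression function observable only through noise, replacing the unavailable gradient with a two-point finite-difference estimate. A minimal sketch, with an illustrative quadratic target and noise model that are assumptions, not from the paper:

```python
import random

def kiefer_wolfowitz(noisy_f, x0, n_steps=20_000, a=1.0, c=1.0):
    """Kiefer-Wolfowitz iteration: ascend toward the maximizer of an unknown
    regression function M(x) using finite-difference gradient estimates from
    noisy evaluations, with gains a_n = a/n and widths c_n = c/n^(1/3)."""
    x = x0
    for n in range(1, n_steps + 1):
        a_n = a / n
        c_n = c / n ** (1.0 / 3.0)
        # Two noisy evaluations give a noisy central-difference slope.
        grad_est = (noisy_f(x + c_n) - noisy_f(x - c_n)) / (2.0 * c_n)
        x += a_n * grad_est  # ascent step: we seek a maximum
    return x

# Toy problem (illustrative): maximize M(x) = -(x - 3)^2 from noisy values.
random.seed(0)
noisy_f = lambda x: -(x - 3.0) ** 2 + random.gauss(0.0, 0.5)
x_max = kiefer_wolfowitz(noisy_f, x0=0.0)
```

The shrinking difference width c_n trades bias against variance in the gradient estimate, which is why Kiefer-Wolfowitz converges more slowly than Robbins-Monro applied with direct gradient observations.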
1951
- (Robbins & Monro, 1951) ⇒ Herbert Robbins, and Sutton Monro. (1951). “A Stochastic Approximation Method.” In: Annals of Mathematical Statistics, 22(3).