Non-Linear Programming Algorithm
A Non-Linear Programming Algorithm is a continuous optimization algorithm that can be implemented by a non-linear programming system (to solve a non-linear programming task).
- Example(s):
- a Gauss–Newton Algorithm.
- a Sequential Quadratic Programming (SQP) Algorithm.
- …
- Counter-Example(s):
- a Linear Programming Algorithm.
- See: Space Mapping, Geometric Programming, Signomial, Posynomial, Quadratically Constrained Quadratic Program, Linear-Fractional Programming, Fractional Programming, Nonlinear Complementarity Problem, Non-Linear Least Squares, Gauss–Newton Algorithm.
References
2016
- (Wikipedia, 2016) ⇒ http://wikipedia.org/wiki/list_of_numerical_analysis_topics#Nonlinear_programming Retrieved:2016-4-4.
- Nonlinear programming — the most general optimization problem in the usual framework
- Special cases of nonlinear programming:
- See Linear programming and Convex optimization above
- Geometric programming — problems involving signomials or posynomials
- Signomial — similar to polynomials, but exponents need not be integers
- Posynomial — a signomial with positive coefficients
- Quadratically constrained quadratic program.
- Linear-fractional programming — objective is ratio of linear functions, constraints are linear
- Fractional programming — objective is ratio of nonlinear functions, constraints are linear
- Nonlinear complementarity problem (NCP) — find x such that x ≥ 0, f(x) ≥ 0 and x^T f(x) = 0
- Least squares — the objective function is a sum of squares
- Non-linear least squares.
- Gauss–Newton algorithm (a minimal sketch appears below this list).
- BHHH algorithm — variant of Gauss–Newton in econometrics
- Generalized Gauss–Newton method — for constrained nonlinear least-squares problems
- Levenberg–Marquardt algorithm.
- Iteratively reweighted least squares (IRLS) — solves a weighted least-squares problem at every iteration (sketched below this list)
- Partial least squares — statistical techniques similar to principal components analysis
- Mathematical programming with equilibrium constraints — constraints include variational inequalities or complementarities
- Univariate optimization:
- Golden section search (sketched below this list).
- Successive parabolic interpolation — based on quadratic interpolation through the last three iterates
- General algorithms:
- Concepts:
- Descent direction.
- Guess value — the initial guess for a solution with which an algorithm starts
- Line search.
- Gradient method — method that uses the gradient as the search direction
- Gradient descent (sketched with a backtracking line search below this list).
- Landweber iteration — mainly used for ill-posed problems
- Successive linear programming (SLP) — replace the problem by a linear programming problem, solve that, and repeat
- Sequential quadratic programming (SQP) — replace the problem by a quadratic programming problem, solve that, and repeat (a SciPy-based example appears below this list)
- Newton's method in optimization (sketched below this list).
- See also under Newton algorithm in the section Finding roots of nonlinear equations.
- Nonlinear conjugate gradient method.
- Derivative-free methods
- Coordinate descent — move in one of the coordinate directions (sketched below this list)
- Adaptive coordinate descent — adapt coordinate directions to objective function
- Random coordinate descent — randomized version
- Nelder–Mead method.
- Pattern search (optimization).
- Powell's method — based on conjugate search directions
- Rosenbrock methods — derivative-free methods, similar to Nelder–Mead but with guaranteed convergence
- Augmented Lagrangian method — replaces constrained problems by unconstrained problems with a term added to the objective function (sketched below this list)
- Ternary search.
- Tabu search.
- Guided Local Search — modification of search algorithms which builds up penalties during a search
- Reactive search optimization (RSO) — the algorithm adapts its parameters automatically
- MM algorithm — majorize-minimization, a wide framework of methods
- Least absolute deviations.
- Adaptive projected subgradient method.
- Nearest neighbor search.
- Space mapping — uses "coarse" (ideal or low-fidelity) and "fine" (practical or high-fidelity) models
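Several of the methods above are compact enough to sketch in code. The examples that follow are minimal Python illustrations: every function name, tolerance, and data point is chosen for illustration, not taken from the source. First, the Gauss–Newton algorithm for nonlinear least squares, which linearizes the residual vector r(x) at the current iterate and solves a linear least-squares problem for the step:

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Gauss-Newton: solve the linearized least-squares problem
    min_p ||J(x) p + r(x)||^2 for each step p."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)  # least-squares step
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x

# Illustrative fit of y = a * exp(b * t) to made-up data points
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([2.0, 2.7, 3.6, 4.9])
residual = lambda x: x[0] * np.exp(x[1] * t) - y
jacobian = lambda x: np.column_stack([np.exp(x[1] * t),
                                      x[0] * t * np.exp(x[1] * t)])
print(gauss_newton(residual, jacobian, [1.0, 0.1]))  # ~[2.0, 0.3]
```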
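Iteratively reweighted least squares is easiest to see on the least-absolute-deviations problem also listed above: each pass solves a weighted least-squares problem whose weights 1/|r_i| damp the influence of large residuals. A sketch under that interpretation, with illustrative data:

```python
import numpy as np

def irls_lad(A, b, n_iter=50, eps=1e-8):
    """min ||Ax - b||_1 via IRLS: each pass is a weighted least-squares
    solve with weights w_i = 1 / max(|r_i|, eps)."""
    x, *_ = np.linalg.lstsq(A, b, rcond=None)   # start from ordinary LS
    for _ in range(n_iter):
        r = A @ x - b
        w = np.sqrt(1.0 / np.maximum(np.abs(r), eps))
        x, *_ = np.linalg.lstsq(w[:, None] * A, w * b, rcond=None)
    return x

# Illustrative robust line fit: the last point is an outlier
A = np.column_stack([np.ones(5), np.arange(5.0)])
b = np.array([0.0, 1.0, 2.0, 3.0, 40.0])
print(irls_lad(A, b))   # ~[0, 1]: the outlier is effectively ignored
```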
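Golden section search brackets a minimum of a unimodal univariate function and shrinks the bracket by the golden ratio each step. A minimal sketch (this simplified form re-evaluates the objective at both probe points each iteration; the classical formulation reuses one of them):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by golden-section bracketing."""
    invphi = (math.sqrt(5) - 1) / 2      # 1/phi ~= 0.618
    while abs(b - a) > tol:
        c = b - invphi * (b - a)         # interior probe points
        d = a + invphi * (b - a)
        if f(c) < f(d):
            b = d                        # minimum lies in [a, d]
        else:
            a = c                        # minimum lies in [c, b]
    return (a + b) / 2

# Illustrative: minimize (x - 2)^2 on [0, 5]
print(golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0))  # ~2.0
```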
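Gradient descent with a backtracking (Armijo) line search ties together three of the listed concepts: the descent direction, the line search, and the gradient method itself. A sketch with hypothetical names and constants:

```python
import numpy as np

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # gradient small: stop
            break
        d = -g                           # descent direction
        t = 1.0                          # trial step length
        while f(x + t * d) > f(x) - 0.5 * t * np.dot(g, g):
            t *= 0.5                     # backtrack until sufficient decrease
        x = x + t * d
    return x

# Illustrative: minimize (x - 1)^2 + 10 y^2
f = lambda v: (v[0] - 1) ** 2 + 10 * v[1] ** 2
grad = lambda v: np.array([2 * (v[0] - 1), 20 * v[1]])
print(gradient_descent(f, grad, [5.0, 3.0]))  # ~[1, 0]
```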
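Sequential quadratic programming is usually called through a library rather than hand-coded. The example below uses SciPy's SLSQP implementation (scipy.optimize.minimize with method='SLSQP'), a sequential least-squares QP variant of the idea; the objective and constraints are illustrative:

```python
from scipy.optimize import minimize

# Illustrative problem: minimize (x - 1)^2 + (y - 2.5)^2 subject to
# three linear inequality constraints and nonnegativity bounds.
fun = lambda x: (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2
cons = [{'type': 'ineq', 'fun': lambda x: x[0] - 2 * x[1] + 2},
        {'type': 'ineq', 'fun': lambda x: -x[0] - 2 * x[1] + 6},
        {'type': 'ineq', 'fun': lambda x: -x[0] + 2 * x[1] + 2}]
res = minimize(fun, x0=[2.0, 0.0], method='SLSQP',
               bounds=[(0, None), (0, None)], constraints=cons)
print(res.x)  # ~[1.4, 1.7]
```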
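Newton's method in optimization replaces the raw gradient direction with the solution of the linear system H(x) p = -∇f(x). A sketch that assumes the Hessian is supplied and stays invertible along the path:

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method for minimization: step p solves H(x) p = -g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess(x), -g)   # Newton step
        x = x + p
    return x

# Illustrative: minimize (x - 1)^2 + (y + 2)^4
grad = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 2) ** 3])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 12 * (v[1] + 2) ** 2]])
print(newton_minimize(grad, hess, [0.0, 0.0]))  # ~[1, -2]
```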
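Coordinate descent, in its simplest derivative-free form, tries a fixed step along each axis in turn and halves the step when no move improves the objective. The step rule below is one crude choice among many:

```python
import numpy as np

def coordinate_descent(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Cycle through the coordinates, trying +/- step along each;
    halve the step whenever a full cycle makes no progress."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if f(trial) < f(x):    # keep any improving move
                    x, improved = trial, True
                    break
        if not improved:
            step *= 0.5                # stuck: refine the grid
            if step < tol:
                break
    return x

f = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2   # illustrative objective
print(coordinate_descent(f, [0.0, 0.0]))          # ~[1, -2]
```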
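Finally, the augmented Lagrangian method for a single equality constraint h(x) = 0: approximately minimize f(x) + λ h(x) + (μ/2) h(x)^2, then update λ ← λ + μ h(x) and repeat. The inner minimizer below is a deliberately crude fixed-step gradient loop, and all parameters are illustrative:

```python
import numpy as np

def augmented_lagrangian(grad_f, h, grad_h, x0, mu=10.0, n_outer=20):
    """Equality-constrained minimization via the augmented Lagrangian
    L_A(x) = f(x) + lam * h(x) + (mu / 2) * h(x)**2."""
    x = np.asarray(x0, dtype=float)
    lam = 0.0
    for _ in range(n_outer):
        for _ in range(2000):          # crude fixed-step inner minimizer
            g = grad_f(x) + (lam + mu * h(x)) * grad_h(x)
            x = x - 1e-3 * g
        lam += mu * h(x)               # first-order multiplier update
    return x

# Illustrative: minimize x^2 + y^2 subject to x + y = 2  ->  (1, 1)
grad_f = lambda v: 2 * v
h = lambda v: v[0] + v[1] - 2
grad_h = lambda v: np.array([1.0, 1.0])
print(augmented_lagrangian(grad_f, h, grad_h, [0.0, 0.0]))  # ~[1, 1]
```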