Total-Space Optimization Algorithm
A Total-Space Optimization Algorithm is an Optimization Algorithm that attempts to find the global optimum of a function over the entire feasible region, rather than only a local optimum (see the sketch after the list below).
- AKA: Global Optimization.
- Example(s): a Simulated Annealing Algorithm, an Evolutionary Algorithm, a Branch-and-Bound Algorithm.
- See: Local Optimization, Convex Optimization, Metaheuristic.
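The local/global distinction can be made concrete with a minimal sketch (illustrative only, not from the referenced sources): it assumes SciPy is available, uses a hand-picked multimodal test function, and contrasts a local optimizer started from a poor initial point with a stochastic search over the whole feasible region.

```python
# Minimal sketch (illustrative assumptions): local vs. global optimization
# on a multimodal 1-D test function with global minimum at x = 1.
import numpy as np
from scipy.optimize import minimize, differential_evolution

def f(x):
    """Multimodal test function: global minimum at x = 1, local minima near other integers."""
    x = np.asarray(x)
    return float((x[0] - 1.0) ** 2 + 5.0 * (1.0 - np.cos(2.0 * np.pi * x[0])))

bounds = [(-5.0, 5.0)]  # the feasible region searched by the global method

# Local optimization: gradient-based descent from a poor starting point.
local = minimize(f, x0=[-3.0], method="L-BFGS-B", bounds=bounds)

# Global optimization: a stochastic search over the whole feasible region.
glob = differential_evolution(f, bounds, seed=0)

print(f"local  result: x = {local.x[0]:+.3f}, f = {local.fun:.3f}")  # stuck near x = -3
print(f"global result: x = {glob.x[0]:+.3f}, f = {glob.fun:.3f}")    # near x = +1
```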
References
2018
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Global_optimization Retrieved:2018-4-22.
- Global optimization is a branch of applied mathematics and numerical analysis that deals with finding the global extrema (minima or maxima) of a function or a set of functions according to some criteria. Typically, a set of bounds and more general constraints is also present, and the decision variables are optimized while respecting those constraints.
Global optimization is distinguished from local optimization by its focus on finding the minimum or maximum over all input values, as opposed to finding local minima or maxima.
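To illustrate optimizing decision variables under both bounds and a more general constraint, the following is a minimal sketch; the objective, box bounds, and ring-shaped constraint are illustrative assumptions, not taken from the source, and it relies on SciPy's differential_evolution, which accepts bound and nonlinear constraints.

```python
# Minimal sketch (illustrative assumptions): bound- and constraint-aware global search.
import numpy as np
from scipy.optimize import NonlinearConstraint, differential_evolution

def objective(x):
    # Multimodal objective over two decision variables (illustrative choice).
    return np.sin(3.0 * x[0]) + np.sin(3.0 * x[1]) + 0.1 * (x[0] ** 2 + x[1] ** 2)

bounds = [(-3.0, 3.0), (-3.0, 3.0)]  # box bounds on the decision variables
ring = NonlinearConstraint(lambda x: x[0] ** 2 + x[1] ** 2, 1.0, 4.0)  # 1 <= ||x||^2 <= 4

result = differential_evolution(objective, bounds, constraints=(ring,), seed=0)
print(result.x, result.fun)  # best feasible point found and its objective value
```

Here the unconstrained minimizer lies inside the excluded disk, so the constraint is active at the reported solution.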
2009
- (Weise, 2009) ⇒ Thomas Weise. (2009). “Global Optimization Algorithms: Theory and Application" (2nd ed.). Self-published.
- QUOTE: Figure 1.1 sketches a rough taxonomy of global optimization methods. Generally, optimization algorithms can be divided into two basic classes: deterministic and probabilistic algorithms. Deterministic algorithms (see also Definition 30.11 on page 550) are most often used if a clear relation between the characteristics of the possible solutions and their utility for a given problem exists. Then, the search space can efficiently be explored using, for example, a divide and conquer scheme. If the relation between a solution candidate and its “fitness” is not so obvious or too complicated, or the dimensionality of the search space is very high, it becomes harder to solve a problem deterministically. Trying to do so would possibly result in exhaustive enumeration of the search space, which is not feasible even for relatively small problems (...)
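A small sketch (illustrative only, not from Weise's book) of the point above: exhaustive deterministic enumeration of a gridded search space grows exponentially with dimensionality, while a probabilistic search keeps a fixed evaluation budget; the test function, grid resolution, and budget are arbitrary assumptions.

```python
# Minimal sketch (illustrative assumptions): deterministic exhaustive enumeration
# vs. probabilistic random search as dimensionality grows.
import itertools
import random

def sphere(x):
    """Simple separable objective; global minimum at the origin."""
    return sum(v * v for v in x)

def grid_search(dim, points_per_axis=11, lo=-5.0, hi=5.0):
    """Deterministic exhaustive enumeration: points_per_axis ** dim evaluations."""
    axis = [lo + i * (hi - lo) / (points_per_axis - 1) for i in range(points_per_axis)]
    return min(itertools.product(axis, repeat=dim), key=sphere)

def random_search(dim, budget=10_000, lo=-5.0, hi=5.0, seed=0):
    """Probabilistic search: the evaluation budget is fixed regardless of dimensionality."""
    rng = random.Random(seed)
    return min(([rng.uniform(lo, hi) for _ in range(dim)] for _ in range(budget)), key=sphere)

for d in (2, 4, 6):
    grid_best = sphere(grid_search(d))    # 11 ** d evaluations
    rand_best = sphere(random_search(d))  # always 10,000 evaluations
    print(f"dim={d}: grid evals={11 ** d:>9,d}  grid best={grid_best:.3f}  "
          f"random best={rand_best:.3f}")
```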
2008
- (Xiang et al., 2008) ⇒ Shiming Xiang, Feiping Nie, and Changshui Zhang. (2008). “Learning a Mahalanobis Distance Metric for Data Clustering and Classification.” In: Pattern Recognition 41.