Numerical Optimization System
A Numerical Optimization System is an optimization system that implements a numerical optimization algorithm to solve a numerical optimization task (a minimal code sketch of such a system appears after the See links below).
- AKA: Mathematical Optimization System, Optimizer.
- Example(s): a TensorFlow Optimizer, such as one based on Momentum, AdaGrad, Adam, or RMSProp (see the ML Glossary quote below).
- Counter-Example(s):
- See: Operations Research, Domain of a Function, Mathematics, Optimization Problem, Maxima And Minima, Function of a Real Variable, Argument of a Function, Value (Mathematics), Applied Mathematics.
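As a concrete illustration of the definition above, here is a minimal sketch of such a system in Python. Everything in it (the function names, the finite-difference gradient, the learning rate, the example task) is an illustrative assumption, not a reference implementation.

```python
# Minimal sketch of a numerical optimization system (illustrative only):
# it takes a numerical optimization task (an objective function plus a
# starting point) and applies a numerical optimization algorithm
# (gradient descent with a finite-difference gradient) to solve it.

def numerical_gradient(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def gradient_descent(f, x0, learning_rate=0.1, steps=100):
    """Repeatedly step against the gradient to reduce f."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * numerical_gradient(f, x)
    return x

# Example task: minimize f(x) = (x - 3)^2, whose minimizer is x = 3.
print(gradient_descent(lambda x: (x - 3.0) ** 2, x0=0.0))  # ~3.0
```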
References
2018a
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Mathematical_optimization Retrieved:2018-5-13.
- In mathematics, computer science and operations research, mathematical optimization or mathematical programming, alternatively spelled optimisation, is the selection of a best element (with regard to some criterion) from some set of available alternatives. [1]
In the simplest case, an optimization problem consists of maximizing or minimizing a real function by systematically choosing input values from within an allowed set and computing the value of the function. The generalization of optimization theory and techniques to other formulations constitutes a large area of applied mathematics. More generally, optimization includes finding "best available" values of some objective function given a defined domain (or input), including a variety of different types of objective functions and different types of domains.
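The "simplest case" described in the quote can be rendered directly in code. The hedged sketch below systematically chooses input values from an allowed set, computes the value of the function at each, and selects the best element; the particular objective and allowed set are assumptions chosen for illustration.

```python
# Direct rendering of the definition above: select a best element (here,
# the minimizer) from a set of available alternatives by computing the
# objective at every allowed input. Objective and domain are illustrative.

def minimize_over_set(f, allowed_inputs):
    """Return the element of allowed_inputs with the smallest f value."""
    return min(allowed_inputs, key=f)

# Allowed set: 1001 evenly spaced points in the interval [-5, 5].
allowed = [-5.0 + 10.0 * i / 1000 for i in range(1001)]
best = minimize_over_set(lambda x: x ** 2 - 4.0 * x, allowed)
print(best, best ** 2 - 4.0 * best)  # ~2.0 and ~-4.0
```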
2018b
- (ML Glossary, 2018) ⇒ Optimizer. In: Machine Learning Glossary. https://developers.google.com/machine-learning/glossary/ Retrieved:2018-5-13.
- QUOTE: A specific implementation of the gradient descent algorithm. TensorFlow's base class for optimizers is `tf.train.Optimizer`. Different optimizers may leverage one or more of the following concepts to enhance the effectiveness of gradient descent on a given training set:
  - momentum (Momentum)
  - update frequency (AdaGrad = ADAptive GRADient descent; Adam = ADAptive with Momentum; RMSProp)
  - sparsity/regularization (Ftrl)
  - more complex math (Proximal, and others)

  You might even imagine an NN-driven optimizer.
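To make one of the listed concepts concrete, the sketch below applies classical momentum to plain gradient descent. It is generic Python, not TensorFlow's actual `tf.train.Optimizer` code, and the hyperparameter values are illustrative assumptions.

```python
# Hedged sketch of the "momentum" concept from the list above: a velocity
# term accumulates past gradients, damping oscillation and speeding up
# progress along consistently descending directions. Not TensorFlow code.

def momentum_gradient_descent(grad, x0, learning_rate=0.01,
                              momentum=0.9, steps=200):
    """Gradient descent where each update blends the previous velocity
    with the current gradient step."""
    x, velocity = x0, 0.0
    for _ in range(steps):
        velocity = momentum * velocity - learning_rate * grad(x)
        x += velocity
    return x

# Example: minimize f(x) = x^2 (gradient 2x); the optimum is x = 0.
print(momentum_gradient_descent(lambda x: 2.0 * x, x0=5.0))  # ~0.0
```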
- ↑ "The Nature of Mathematical Programming ," Mathematical Programming Glossary, INFORMS Computing Society.