Overview
Optimization is the process of finding the inputs at which a function hits its maximum or minimum values.
That is, given a function
{% y = f(x) %}
to find the maximum is to find the value {% x' %} such that
{% f(x') \geq f(x) %}
for any other value of {% x %} in the domain.
Finding the minimum of a function is equivalent to finding the maximum of the negated function {% -f(x) %}, so the two problems are interchangeable.
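This min/max duality can be illustrated with a tiny numeric sketch (the function and the grid are illustrative choices, not from the text):

```python
# Maximizing f is equivalent to minimizing -f: both are achieved at the same x.
def f(x):
    return -(x - 3) ** 2 + 5  # peak at x = 3

xs = [i / 100 for i in range(0, 601)]  # grid over [0, 6]

x_max = max(xs, key=f)                       # maximize f directly
x_min_of_neg = min(xs, key=lambda x: -f(x))  # minimize -f instead

print(x_max, x_min_of_neg)  # both 3.0
```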
Global vs Local Min/Max
A function can achieve a maximum or minimum in a small section of the domain under consideration, even if that point is not a minimum or maximum over the whole domain.
A point {% x_0 %} in the domain of the function is considered a local optimal point if there is an {% \epsilon > 0 %} such that every {% x %} with
{% d(x,x_0) < \epsilon %}
satisfies {% f(x_0) \leq f(x) %} (a local minimum) or {% f(x_0) \geq f(x) %} (a local maximum). Here,
{% d(x,y) %}
is a
metric
on the space under consideration.
The following is a function with two local minima, one of which is the global minimum, one local maximum, and no global maximum.
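A concrete quartic with this shape is {% f(x) = x^4 - 8x^2 + x %} (an illustrative choice): it has two local minima, a local maximum between them, and grows without bound, so it has no global maximum. A coarse grid scan locates the minima:

```python
def f(x):
    return x ** 4 - 8 * x ** 2 + x

# Scan a grid over [-4, 4] and flag points lower than both neighbours.
xs = [i / 100 for i in range(-400, 401)]
ys = [f(x) for x in xs]

local_minima = [xs[i] for i in range(1, len(xs) - 1)
                if ys[i] < ys[i - 1] and ys[i] < ys[i + 1]]

print(local_minima)  # two local minima; the one near x = -2 is the global one
```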
Analytic Methods
Analytic methods provide ways to find the extrema of a function using only methods from standard analysis (linear algebra and calculus).
- Derivative Method: the standard calculus method of finding the locations where the derivative of the function in question is equal to zero (the critical points), then classifying each one.
- Lagrange Multipliers: a multivariable calculus method for optimizing a function subject to a set of differentiable constraints.
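As a minimal illustration of the derivative method (the quadratic here is an arbitrary example):

```python
# Derivative method on f(x) = x^2 - 4x + 1.
# f'(x) = 2x - 4, so the only critical point is x = 2;
# f''(x) = 2 > 0, so it is a minimum.
def f(x):
    return x ** 2 - 4 * x + 1

x_star = 4 / 2  # solve f'(x) = 2x - 4 = 0 by hand
print(x_star, f(x_star))  # 2.0 -3.0

# Sanity check: nearby points are no smaller.
assert f(x_star) <= min(f(x_star - 0.1), f(x_star + 0.1))
```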
Algorithms
- Random Search : Random search is a brute-force method of randomly sampling the input space for the optimal points. It is inefficient at finding points close to the optimum, but can be used to quickly reduce the input space to one that can be subjected to more finely tuned methods.
- Golden Section : The golden section algorithm finds an optimal point of a single-variable function by repeatedly shrinking a bracketing interval, with the interior test points placed according to the golden ratio.
- Linear and Non-linear Programming : methods for optimizing an objective function subject to constraints; in linear programming both the objective and the constraints are linear, while non-linear programming relaxes that restriction.
- Gradient Descent : Gradient descent is an algorithm for finding a minimum of a given multivariable function. It is a local algorithm, that is, it starts at a given point and uses information about the function around that point to determine which point to evaluate next. Technically, it computes the gradient, the rate of change of the function along each axis of the inputs, which points in the direction of fastest increase, and then steps in the opposite direction by a given step size.
- Newton's Method : Newton's method finds a stationary point by applying the Newton-Raphson root-finding iteration to the derivative of the function. Convergence is guaranteed only under certain conditions, and often requires that the initial guess is "close" to the answer in some sense.
- Genetic Algorithm : an algorithm inspired by natural selection and most often associated with machine learning.
- Convex Optimization
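The random search described above can be sketched in a few lines (the quadratic objective, bounds, sample count, and seed are illustrative choices):

```python
import random

def f(x):
    return (x - 1.5) ** 2  # minimum at x = 1.5

random.seed(0)  # fixed seed so the sketch is reproducible

best_x, best_y = None, float("inf")
for _ in range(10_000):
    x = random.uniform(-10, 10)  # sample the input space uniformly
    y = f(x)
    if y < best_y:               # keep the best point seen so far
        best_x, best_y = x, y

print(best_x, best_y)  # best_x lands close to 1.5, but not exactly on it
```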
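The golden section idea, shrinking a bracketing interval with golden-ratio test points, can be sketched as follows (the objective and bracket are illustrative):

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Narrow [a, b] around a minimum of a unimodal f using the golden ratio."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c                # minimum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                # minimum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2

x = golden_section_min(lambda x: (x - 2) ** 2, 0, 5)
print(round(x, 4))  # close to 2.0
```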
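The gradient descent steps described above, compute the gradient, move against it by a step size, can be sketched on a simple two-variable bowl (the function, start point, and step size are illustrative):

```python
def grad_f(x, y):
    # Gradient of f(x, y) = (x - 1)^2 + (y + 2)^2, which has its minimum at (1, -2).
    return 2 * (x - 1), 2 * (y + 2)

x, y = 0.0, 0.0  # starting point
step = 0.1       # fixed step size
for _ in range(200):
    gx, gy = grad_f(x, y)
    x, y = x - step * gx, y - step * gy  # move against the gradient

print(round(x, 4), round(y, 4))  # converges to (1.0, -2.0)
```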
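Newton's method for optimization, Newton-Raphson applied to the derivative, can be sketched as below; note how the initial guess is chosen near the stationary point we want (the quartic and starting point are illustrative):

```python
def newton_optimize(fprime, fsecond, x0, iters=20):
    """Find a stationary point of f by Newton-Raphson iteration on f'."""
    x = x0
    for _ in range(iters):
        x = x - fprime(x) / fsecond(x)
    return x

# f(x) = x^4 - 3x^2, so f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6.
# Stationary points: x = 0 and x = +/- sqrt(1.5).
x = newton_optimize(lambda x: 4 * x ** 3 - 6 * x,
                    lambda x: 12 * x ** 2 - 6,
                    x0=2.0)  # start "close" to the right-hand minimum
print(round(x, 6))  # close to sqrt(1.5), about 1.224745
```

A different starting point can converge to a different stationary point (or diverge), which is why the "close" initial guess matters.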
Additional Topics
- Optimal Control
- Optimization with TensorFlow - uses the optimizers in the TensorFlow library to optimize a function