Convex Optimization

Overview


A function {% f %} is convex if
{% f(\theta x + (1-\theta)y) \leq \theta f(x) + (1 - \theta)f(y) %}
for all {% x, y %} in its (convex) domain and all {% 0 \leq \theta \leq 1 %}.
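The defining inequality can be checked numerically. As a minimal sketch (the function {% f(x) = x^2 %} is just an illustrative choice), the following samples random points and chords and verifies that the function never rises above the chord:

```python
import random

def f(x):
    return x * x  # a simple convex function, used here only as an example

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-10.0, 10.0), random.uniform(-10.0, 10.0)
    theta = random.random()
    # left side: f evaluated on the chord; right side: the chord itself
    lhs = f(theta * x + (1 - theta) * y)
    rhs = theta * f(x) + (1 - theta) * f(y)
    assert lhs <= rhs + 1e-9  # small tolerance for floating-point error
```

A sampled check like this can refute convexity (one failing triple suffices) but of course cannot prove it.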

Although a convex optimization problem may not admit an analytic (closed-form) solution, it can generally be solved quickly and reliably by numerical methods.

Topics


  • Properties of Convex Functions
    • Any local minimum is a global minimum - so for an unconstrained problem, a solution can be found reliably using gradient descent.
    • First-order condition: {% f(\vec{y}) \geq f(\vec{x}) + \nabla f(\vec{x})^T (\vec{y}-\vec{x}) %} for all {% \vec{x}, \vec{y} %} in the domain, i.e. the first-order Taylor approximation is a global underestimator.
  • Operations that Preserve Convexity
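The "local minima are global" property is what makes gradient descent dependable here. A minimal sketch, using the hypothetical convex objective {% f(x) = x^2 + 2x %} (global minimum at {% x = -1 %}): every starting point converges to the same minimizer.

```python
def f(x):
    return x * x + 2.0 * x   # convex; unique global minimum at x = -1

def grad(x):
    return 2.0 * x + 2.0     # derivative of f

def gradient_descent(x0, lr=0.1, steps=200):
    """Plain gradient descent with a fixed step size."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Different starting points all reach the same (global) minimum.
for x0 in (-10.0, 0.0, 7.5):
    assert abs(gradient_descent(x0) - (-1.0)) < 1e-6
```

The step size here is chosen small enough for this quadratic; in general it must be matched to the function's smoothness.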

General Convex Optimization Problem


The general convex optimization problem can be stated as
{% \text{minimize} \quad f_0(x) %}
{% \text{subject to} \quad f_i(x) \leq 0, \quad i = 1, \dots, m %}
{% \phantom{\text{subject to}} \quad h_j(x) = 0, \quad j = 1, \dots, p %}
where {% f_0, f_1, \dots, f_m %} are convex and the equality constraint functions {% h_j %} are affine.
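For a constrained problem, one simple numerical approach is projected gradient descent: take a gradient step on {% f_0 %}, then project back onto the feasible set. A minimal sketch under assumed data (objective {% f_0(x) = (x-3)^2 %} with the single inequality constraint {% x - 1 \leq 0 %}, so the constrained optimum is {% x^\star = 1 %}):

```python
def f0(x):
    return (x - 3.0) ** 2    # convex objective (example data, not from the text)

def grad_f0(x):
    return 2.0 * (x - 3.0)

def project(x):
    # Euclidean projection onto the feasible set {x : x - 1 <= 0}
    return min(x, 1.0)

def projected_gradient_descent(x0, lr=0.1, steps=500):
    x = x0
    for _ in range(steps):
        x = project(x - lr * grad_f0(x))
    return x

x_star = projected_gradient_descent(10.0)
assert abs(x_star - 1.0) < 1e-6   # unconstrained minimum 3 is infeasible
```

Projection is cheap here because the feasible set is a half-line; for general constraint sets, off-the-shelf solvers are the practical route.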