An optimization problem whose objective $f$ is a convex function and whose feasible region $\mathcal{F}$ is a convex set.
[Figures: a convex set vs. a nonconvex set; a convex function vs. a nonconvex function]
Convex Combination
Conceptually: a point on the line segment between two points
Given $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$, a convex combination of them is any point of the form $\mathbf{z}=\theta\mathbf{x}+(1-\theta)\mathbf{y}$ where $\theta \in [0,1]$.
When $\theta \in (0,1)$, $\mathbf{z}$ is called a strict convex combination of $\mathbf{x}$, $\mathbf{y}$.
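For example, with $\mathbf{x}=(0,0)$, $\mathbf{y}=(2,4)$, and $\theta=\tfrac{1}{4}$ (values chosen only for illustration),
$$\begin{equation}
\mathbf{z} = \tfrac{1}{4}(0,0) + \tfrac{3}{4}(2,4) = (1.5, 3), \nonumber
\end{equation}$$
a point three quarters of the way from $\mathbf{x}$ to $\mathbf{y}$.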
Convex Sets
Conceptually: Any convex combination of two points in the set is also in the set
Mathematically: A set $\mathcal{F}$ is convex if $\forall \mathbf{x},\mathbf{y}\in \mathcal{F}, \forall \theta \in [0,1]$,
$$\begin{equation}
\mathbf{z} = \theta \mathbf{x} + (1-\theta) \mathbf{y} \in \mathcal{F} \nonumber
\end{equation}$$
Examples: which of the following sets are convex?
$\mathcal{F} = \mathcal{F}_1 \bigcap \mathcal{F}_2$, where $\mathcal{F}_1$ and $\mathcal{F}_2$ are convex sets. (Convex: the intersection of convex sets is always convex.)
$\mathcal{F} = \mathcal{F}_1 \bigcup \mathcal{F}_2$, where $\mathcal{F}_1$ and $\mathcal{F}_2$ are convex sets. (Not necessarily convex.)
$\mathcal{F} = \mathbb{Z}^n$. (Not convex.)
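A quick counterexample, with intervals and points chosen here only for illustration, shows why the last two fail the definition:
$$\begin{equation}
\mathcal{F}_1=[0,1],\ \mathcal{F}_2=[2,3]:\quad \tfrac{1}{2}\cdot 1+\tfrac{1}{2}\cdot 2=1.5\notin \mathcal{F}_1\cup \mathcal{F}_2; \qquad \tfrac{1}{2}\cdot 0+\tfrac{1}{2}\cdot 1=0.5\notin\mathbb{Z}. \nonumber
\end{equation}$$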
Convex Function
Conceptually: the value at any point between two points is no greater than the corresponding weighted average of the values at those two points
Let $\mathcal{F}$ be a convex set. A function $f:\mathcal{F}\rightarrow\mathbb{R}$ is convex in $\mathcal{F}$ if $\forall \mathbf{x},\mathbf{y} \in \mathcal{F}, \forall \theta \in [0,1]$,
$$\begin{equation}
f(\theta \mathbf{x} + (1-\theta)\mathbf{y}) \leq \theta f(\mathbf{x}) + (1-\theta) f(\mathbf{y}) \nonumber
\end{equation}$$
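For example, $f(x)=x^2$ satisfies this inequality for all $x,y\in\mathbb{R}$ and $\theta\in[0,1]$, since
$$\begin{equation}
\theta x^2 + (1-\theta)y^2 - \big(\theta x + (1-\theta)y\big)^2 = \theta(1-\theta)(x-y)^2 \geq 0 \nonumber
\end{equation}$$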
Convexity is preserved under composition with an affine transformation
$$\begin{equation}
\mbox{If } g \mbox{ is convex and } f(\mathbf{x}) = g(A\mathbf{x} + \mathbf{b}), \mbox{ then } f \mbox{ is convex.} \nonumber
\end{equation}$$
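For instance, since $g(u)=u^2$ is convex, any function of the following form is convex as well (a standard illustration of this rule):
$$\begin{equation}
f(\mathbf{x}) = (\mathbf{a}^\top\mathbf{x} + b)^2 = g(\mathbf{a}^\top\mathbf{x} + b), \quad \mathbf{a}\in\mathbb{R}^n,\ b\in\mathbb{R} \nonumber
\end{equation}$$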
If $f$ is a twice differentiable function of one variable, $f$ is convex on an interval $[a,b]\subseteq \mathbb{R}$ iff (if and only if) its second derivative satisfies $f''(x) \geq 0$ for all $x$ in $[a,b]$
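For example (standard one-variable checks):
$$\begin{equation}
f(x)=x^2:\ f''(x)=2\geq 0 \ \mbox{(convex)}; \qquad f(x)=e^{x}:\ f''(x)=e^{x}> 0 \ \mbox{(convex)} \nonumber
\end{equation}$$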
Convex Optimization: Definition
If $f$ is a twice continuously differentiable function of $n$ variables, $f$ is convex on $\mathcal{F}$ iff its Hessian matrix of second partial derivatives is positive semidefinite on the interior of $\mathcal{F}$.
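A minimal numerical sketch of this test in Python, assuming a quadratic objective $f(\mathbf{x})=\tfrac{1}{2}\mathbf{x}^\top Q\mathbf{x}$ whose Hessian is the constant matrix $Q$ (the matrix below is chosen only for illustration):

```python
import numpy as np

# Hessian of f(x) = 0.5 * x^T Q x is the constant matrix Q.
Q = np.array([[2.0, -1.0],
              [-1.0, 2.0]])   # example matrix, chosen for illustration

# f is convex iff Q is positive semidefinite,
# i.e. all eigenvalues of the (symmetric) Hessian are >= 0.
eigenvalues = np.linalg.eigvalsh(Q)
print("eigenvalues:", eigenvalues)                      # [1. 3.]
print("convex:", bool(np.all(eigenvalues >= -1e-12)))   # small tolerance for round-off
```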
Let $\mathcal{F}$ be a convex set. A function $f:\mathcal{F}\rightarrow\mathbb{R}$ is concave in $\mathcal{F}$ if $\forall \mathbf{x},\mathbf{y}\in\mathcal{F}, \forall \theta\in[0,1]$,
$$\begin{equation}
f(\theta \mathbf{x} + (1-\theta)\mathbf{y}) \geq \theta f(\mathbf{x}) + (1-\theta) f(\mathbf{y}) \nonumber
\end{equation}$$
Equivalently, $f$ is concave iff $-f$ is convex.
Given an optimization problem, a point $\mathbf{x}\in\mathbb{R}^n$ is globally optimal if $\mathbf{x}\in\mathcal{F}$ and $\forall \mathbf{y}\in\mathcal{F}$, $f(\mathbf{x})\leq f(\mathbf{y})$
Given an optimization problem, a point $\mathbf{x}\in\mathbb{R}^n$ is locally optimal if $\mathbf{x}\in\mathcal{F}$ and $\exists R>0$ such that $\forall \mathbf{y}:\mathbf{y}\in\mathcal{F}$ and $\|\mathbf{x}-\mathbf{y}\|_2\leq R$, $f(\mathbf{x})\leq f(\mathbf{y})$
Theorem 1: For a convex optimization problem, all locally optimal points are globally optimal
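Sketch of the argument: suppose $\mathbf{x}$ is locally optimal but some $\mathbf{y}\in\mathcal{F}$ has $f(\mathbf{y})<f(\mathbf{x})$. Convexity of $\mathcal{F}$ puts every $\mathbf{z}=\theta\mathbf{x}+(1-\theta)\mathbf{y}$ in $\mathcal{F}$, and convexity of $f$ gives $f(\mathbf{z})\leq\theta f(\mathbf{x})+(1-\theta)f(\mathbf{y})<f(\mathbf{x})$ for $\theta\in(0,1)$; choosing $\theta$ close to $1$ places such $\mathbf{z}$ within any radius $R$ of $\mathbf{x}$, contradicting local optimality.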
Convex Optimization: How to Solve?
Recall discrete setting
Local search
Iteratively improving an assignment
Continuous and differentiable setting
Iteratively improving the value of $\mathbf{x}$
Based on the gradient
Convex Optimization: How to Solve?
For $f:\mathbb{R}^n\rightarrow\mathbb{R}$, the gradient $\nabla f(\mathbf{x}) = \left(\frac{\partial f}{\partial x_1}, \ldots, \frac{\partial f}{\partial x_n}\right)$ is the vector of partial derivatives
A multi-variable generalization of the derivative
Points in the direction of steepest increase of $f$
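A minimal gradient-descent sketch, assuming a simple convex quadratic objective and a fixed step size (both chosen here only for illustration); on a convex problem, this kind of iterative improvement heads toward a global optimum:

```python
import numpy as np

def f(x):
    """Example convex objective: f(x) = (x1 - 3)^2 + 2 * x2^2."""
    return (x[0] - 3.0) ** 2 + 2.0 * x[1] ** 2

def grad_f(x):
    """Gradient of f: the vector of partial derivatives."""
    return np.array([2.0 * (x[0] - 3.0), 4.0 * x[1]])

x = np.array([10.0, -5.0])   # arbitrary starting point
step = 0.1                   # fixed step size, chosen for illustration

for _ in range(200):
    # Move against the gradient: the direction of steepest decrease in f.
    x = x - step * grad_f(x)

print(x, f(x))   # approaches the global minimizer (3, 0), where f = 0
```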