# Axel Ringh: Continuous time convex optimization

**Time:**
Fri 2018-03-09, 13.15–14.15

**Lecturer:**
Axel Ringh, KTH

**Location:**
Room 3418, Lindstedtsvägen 25, 4th floor, Department of Mathematics, KTH

In this talk we will consider finite-dimensional convex optimization from a continuous-time perspective. In particular, this will be done by viewing the solutions of optimization problems as equilibrium points of certain ODEs. The simplest of these is the gradient flow, a first-order ODE. Using Lyapunov stability theory, we show that the gradient flow is guaranteed to converge to a globally optimal solution. Moreover, by applying different numerical discretizations to the gradient flow we recover well-known algorithms such as steepest descent and the proximal point algorithm. Finally, we will also look at some second-order ODEs, which give rise to the heavy ball method and Nesterov acceleration.
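The connection between discretizations of the gradient flow and classical algorithms can be sketched on a toy problem. The example below is my own illustration, not material from the talk: for the gradient flow $\dot{x}(t) = -\nabla f(x(t))$ with $f(x) = \tfrac{1}{2}(x-3)^2$, the explicit (forward) Euler step yields steepest descent, the implicit (backward) Euler step yields the proximal point iteration, and a standard discretization of a second-order ODE with friction yields the heavy ball method. The specific step sizes and momentum parameter are arbitrary choices for demonstration.

```python
def grad(x):
    """Gradient of f(x) = 0.5 * (x - 3)^2, whose unique minimizer x* = 3
    is the equilibrium point of the gradient flow x'(t) = -grad(x(t))."""
    return x - 3.0

def forward_euler(x0, h, n):
    """Explicit Euler step x_{k+1} = x_k - h * grad(x_k): steepest descent."""
    x = x0
    for _ in range(n):
        x = x - h * grad(x)
    return x

def backward_euler(x0, h, n):
    """Implicit Euler step x_{k+1} = x_k - h * grad(x_{k+1}): the proximal
    point iteration. For this quadratic f the implicit equation is linear,
    so it can be solved in closed form: x_{k+1} = (x_k + 3h) / (1 + h)."""
    x = x0
    for _ in range(n):
        x = (x + 3.0 * h) / (1.0 + h)
    return x

def heavy_ball(x0, h, beta, n):
    """Discretizing the second-order ODE x'' + a x' + grad(x) = 0 gives the
    heavy-ball iteration x_{k+1} = x_k - h * grad(x_k) + beta * (x_k - x_{k-1})."""
    x_prev, x = x0, x0
    for _ in range(n):
        x, x_prev = x - h * grad(x) + beta * (x - x_prev), x
    return x

print(forward_euler(0.0, 0.1, 200))        # approaches the minimizer 3
print(backward_euler(0.0, 0.1, 200))       # approaches the minimizer 3
print(heavy_ball(0.0, 0.1, 0.5, 200))      # approaches the minimizer 3
```

Note the qualitative difference: forward Euler only converges for sufficiently small step sizes, whereas backward Euler (the proximal point method) is stable for any step size h > 0, mirroring the stability properties of the underlying discretization schemes.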