Large-Scale Optimization With Machine Learning Applications

Time: Tue 2019-11-26, 10:00

Location: D2, Lindstedtsvägen 5, D-huset, Kungliga Tekniska högskolan, Stockholm

Language: English

Subject area: Optimization and Systems Theory, Electrical Engineering

Doctoral student: Vien Van Mai, Reglerteknik

Opponent: Professor Alexandre d'Aspremont, École Normale Supérieure

Supervisor: Professor Mikael Johansson, Reglerteknik

Abstract

This thesis develops efficient algorithms for solving fundamental engineering problems in data science and machine learning, with a focus on acceleration techniques that improve the convergence times of optimization algorithms. First, we show how problem structure can be exploited to accelerate the solution of highly structured problems such as the generalized eigenvalue problem and elastic-net regression. We then consider Anderson acceleration, a generic and parameter-free extrapolation scheme, and show how it can be adapted to accelerate the practical convergence of proximal gradient methods on a broad class of non-smooth problems. For each method developed in the thesis, we design novel algorithms, analyze their convergence rates, and validate them in experiments on real-world data sets.
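To illustrate the kind of extrapolation scheme the abstract refers to, the sketch below implements classical Anderson acceleration for a generic fixed-point iteration x ← g(x). This is a textbook version with a window of m past residuals, not the thesis's adaptation to proximal gradient methods; the function name, window size, and tolerances are illustrative choices.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, max_iter=100, tol=1e-10):
    """Anderson acceleration of the fixed-point iteration x <- g(x).

    Keeps the last m residuals f_k = g(x_k) - x_k and extrapolates by
    mixing previous iterates with least-squares coefficients
    (a generic textbook sketch, not the thesis's algorithm).
    """
    x = np.asarray(x0, dtype=float)
    X, F = [x], [g(x) - x]              # histories of iterates and residuals
    x_new = x
    for _ in range(max_iter):
        k = len(F)
        if k == 1:
            x_new = X[0] + F[0]         # plain fixed-point step: g(x0)
        else:
            # Solve min_gamma || f_k - dF gamma || over the residual
            # differences, then combine iterate and residual histories.
            dF = np.column_stack([F[i + 1] - F[i] for i in range(k - 1)])
            dX = np.column_stack([X[i + 1] - X[i] for i in range(k - 1)])
            gamma, *_ = np.linalg.lstsq(dF, F[-1], rcond=None)
            x_new = X[-1] + F[-1] - (dX + dF) @ gamma
        f_new = g(x_new) - x_new
        X.append(x_new)
        F.append(f_new)
        if len(F) > m:                  # drop the oldest history entry
            X.pop(0)
            F.pop(0)
        if np.linalg.norm(f_new) < tol:
            break
    return x_new
```

For example, applied to g(x) = cos(x), the scheme converges to the fixed point x ≈ 0.7391 in far fewer iterations than the plain iteration; the same parameter-free mechanism is what makes Anderson acceleration attractive as a generic wrapper around slowly converging first-order methods.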

urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-263147