
Before choosing course

The course is based on the course offered at UC Berkeley by Prof. M. Johansson and Prof. L. El Ghaoui in 2012, and on the course A. Proutiere gave at the Hamilton Institute and IIT Mumbai in 2012 and 2013. The course has four parts. In the first part, we explore recent advances in first-order methods for convex optimisation, which constitute the main building block for many of the more advanced algorithms developed later in the course. The second part focuses on algorithms for distributed optimisation under computation and communication constraints; our starting point here is mathematical decomposition techniques traditionally developed for exploiting structure in large-scale optimisation. The third part is devoted to distributed stochastic optimisation techniques, including stochastic approximation and simulation-based methods. In the last part, we present recent advances in the theory of distributed learning in repeated games.

No course offering is available for the current semester, nor for previous or coming semesters.
* Retrieved from Course syllabus FEL3311 (Spring 2019–)

Content and learning outcomes

Course contents

1.    Convexity

2.    Gradient and subgradient methods

3.    Duality and conjugate functions

4.    Proximal algorithms

5.    Limits of performance

6.    Accelerated methods

7.    Coordinate descent

8.    Conditional gradient

9.    Monotone operators

10.    Operator splitting methods

11.    Stochastic gradient descent

12.    Variance reduction techniques and limits of performance

13.    Newton and quasi-Newton methods

14.    Nonsmooth and stochastic second-order methods

15.    Conjugate gradients

16.    Sequential convex programming

17.    Architectures and algorithms for parallel optimisation

18.    Decomposition and parallelization

19.    Asynchrony I – time-varying update rates and information delays

20.    Asynchronous computations II – random effects and communication efficiency
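To make topics 2 and 4 above concrete, the following is a minimal proximal-gradient (ISTA) sketch for a lasso instance in Python/NumPy; the data, step-size rule, and function names are illustrative assumptions rather than course code.

```python
# Minimal proximal-gradient (ISTA) sketch for the lasso problem
#   minimize  0.5*||A x - b||^2 + lam*||x||_1
# Illustrative only; the problem instance and names are assumptions, not course code.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, num_iters=500):
    """Proximal gradient method with constant step 1/L, L = ||A||_2^2."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)           # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Example usage on random data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista(A, b, lam=0.1)
```

The constant step 1/L, with L the squared spectral norm of A, is the standard choice for the smooth part; accelerated variants (topic 6) modify this same iteration with momentum.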

Intended learning outcomes

After the course the student will be able to:

  • know basic terminology and concepts in convex optimization.
  • design and analyze optimization algorithms for convex optimization problems.
  • characterize the limits of performance for first-order methods.
  • analyze and use modern methods for scalable convex optimization, including: gradient and subgradient methods, proximal algorithms, coordinate descent, conditional gradient, conjugate gradient, operator splitting and quasi-Newton methods. 
  • handle stochastic effects in optimization problems using stochastic gradient descent and variance reduction techniques. 
  • describe modern architectures for parallel numerical computing.
  • use duality and decomposition for parallelization of optimization algorithms.
  • assess the impact of asynchrony and information delay on iterative algorithms.
  • apply techniques for reducing information exchange between computing nodes.
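To make the stochastic setting in the outcomes above concrete, here is a minimal sketch of single-sample stochastic gradient descent for a least-squares objective in Python/NumPy; the problem, step size, and function name are illustrative assumptions. Variance reduction techniques covered in the course build on updates of this form by correcting the stochastic gradient estimate.

```python
# Minimal single-sample SGD sketch for least squares,
#   f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2
# Illustrative assumptions only (problem, step size, names); not course code.
import numpy as np

def sgd_least_squares(A, b, step=0.01, epochs=20, seed=0):
    """Plain SGD with a constant step size; one sample per update."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):            # one pass over shuffled samples
            grad_i = (A[i] @ x - b[i]) * A[i]   # unbiased estimate of grad f(x)
            x -= step * grad_i
    return x
```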

Course Disposition

20 lectures (2 per week)

10 hand-ins

Project and take-home exam

Literature and preparations

Specific prerequisites

A basic course in convex optimization (e.g. EL3300), and at least one PhD-level course on convex analysis (SF3810 or EL3370).

Recommended prerequisites

No information inserted

Equipment

No information inserted

Literature

Introductory Lectures on Convex Optimization: A Basic Course, Y. Nesterov.

Introduction to Optimization, B. T. Polyak.

Research papers and lecture notes.

Examination and completion

If the course is discontinued, students may request to be examined during the following two academic years.

Grading scale

P, F

Examination

  • EXA1 - Examination, 8.0 credits, grading scale: P, F

Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.

The examiner may apply another examination format when re-examining individual students.

Other requirements for final grade

Passing grade on homework assignments, project, and exam.

Opportunity to complete the requirements via supplementary examination

No information inserted

Opportunity to raise an approved grade via renewed examination

No information inserted

Examiner

Mikael Johansson

Alexandre Proutiere

Ethical approach

  • All members of a group are responsible for the group's work.
  • In any assessment, every student shall honestly disclose any help received and sources used.
  • In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.

Further information

Course web

Further information about the course can be found on the Course web at the link below. Information on the Course web will later be moved to this site.

Course web FEL3311

Offered by

EECS/Decision and Control Systems

Main field of study

No information inserted

Education cycle

Third cycle

Add-on studies

No information inserted

Postgraduate course

Postgraduate courses at EECS/Decision and Control Systems