FJL3380 Theoretical Foundations of Machine Learning

Due to the large number of participants, the first lecture will be in room E2!

Welcome to the course homepage of FJL3380 Theoretical Foundations of Machine Learning.

This advanced PhD course introduces the basic concepts and mathematical ideas underlying the theory of Machine Learning (ML). The course covers central results of statistical learning theory (e.g., VC theory) and the main ML subfields, including supervised learning (linear classification and regression, SVM, and deep learning), unsupervised learning (clustering), and reinforcement learning.

Lecturers: Alexandre Proutiere and Cristian Rojas 

Course literature: S. Shalev-Shwartz and S. Ben-David. Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.

Keywords: Supervised and unsupervised learning; regression and classification; stochastic optimization; concentration inequalities; VC theory; SVM; deep learning; clustering; reinforcement learning; online stochastic optimization.

Learning outcomes: After the course, the student should:

  • know the essential theoretical tools used in modern machine learning
    • concentration of measure in probability theory
    • stochastic optimization methods
    • VC theory
  • know the historical development of supervised and unsupervised learning algorithms
  • understand the advantages and drawbacks of deep learning
  • know the basic reinforcement learning algorithms and their modern versions
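As a small taste of the stochastic optimization methods listed above, here is a minimal, self-contained sketch of stochastic gradient descent fitting a least-squares regression line. It is purely illustrative (the data, learning rate, and function names are made up for this example) and is not part of the course material:

```python
import random

def sgd_least_squares(points, lr=0.1, epochs=100, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on the squared loss."""
    random.seed(seed)
    data = list(points)  # copy so the caller's list is not shuffled
    w, b = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)          # visit samples in random order each epoch
        for x, y in data:
            err = (w * x + b) - y     # prediction error on one sample
            w -= lr * 2 * err * x     # gradient of err**2 with respect to w
            b -= lr * 2 * err         # gradient of err**2 with respect to b
    return w, b

# Noise-free data from the line y = 2x + 1; SGD should recover w = 2, b = 1.
pts = [(x / 10, 2 * (x / 10) + 1) for x in range(10)]
w, b = sgd_least_squares(pts)
```

Because the squared loss is convex and the data here are noise-free, the iterates converge to the true parameters; with noisy data one would typically decay the learning rate over time.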

Prerequisites: Basic knowledge of linear algebra and probability theory.

Requirements for final pass grade: To pass the course, successful completion of a 72-hour home exam and a final project is required. The project consists of reading a few recent papers published at relevant conferences (NIPS, ICML) on a selected topic (e.g., theoretical justifications of deep learning) and writing a state-of-the-art report on the topic, covering historical developments, recent results, and open problems (at least 5 pages, double column).

Pace: Two or three lectures will be given per week.

Course material: The full course schedule and lecture slides will become available under the tab Course schedule and material, visible to those registered in the course.

Registration: If you are interested in taking this course, please sign up by writing your full name and KTH email address at the doodle:

If you do not have a KTH account, please ask for one at, since otherwise you will not be able to access the course material.
