ID2223 Scalable Machine Learning and Deep Learning 7.5 credits

Skalbar maskininlärning och djupinlärning


Course information

Content and learning outcomes

Course contents *

Topics:

  • Machine learning algorithms
  • Scalable frameworks to parallelize machine learning algorithms
  • Distributed machine learning algorithms, e.g., distributed linear regression and distributed logistic regression
  • Linear algebra, probability theory and numerical computation
  • Deep neural networks
  • Regularization and optimization for training deep neural networks
  • Sequence modelling
  • Applications of deep learning
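To illustrate the "distributed logistic regression" topic above, here is a minimal NumPy sketch of data-parallel training: each simulated worker holds one shard of the data, computes a local gradient, and a driver averages the results in a synchronous step. The sharding and averaging scheme, function names, and hyperparameters are illustrative assumptions, not the course's prescribed implementation or framework.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def shard_gradient(w, X, y):
    """Gradient of the logistic loss on one data shard (one 'worker')."""
    p = sigmoid(X @ w)
    return X.T @ (p - y) / len(y)

def distributed_logistic_regression(X, y, n_shards=4, lr=0.5, epochs=200):
    """Data-parallel training: local gradients per shard, then a
    synchronous average (all-reduce / parameter-server style step)."""
    Xs = np.array_split(X, n_shards)
    ys = np.array_split(y, n_shards)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grads = [shard_gradient(w, Xi, yi) for Xi, yi in zip(Xs, ys)]
        w -= lr * np.mean(grads, axis=0)  # driver aggregates and updates
    return w

# Synthetic linearly separable data, labelled by the sign of x0 + x1.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = distributed_logistic_regression(X, y)
acc = np.mean((sigmoid(X @ w) > 0.5) == y)
```

Because the shards are equal-sized, averaging the local gradients reproduces the full-batch gradient exactly; in a real cluster the interesting trade-offs are communication cost and staleness when this synchrony is relaxed.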

Intended learning outcomes *

The course covers the fundamentals of distributed machine learning algorithms and of deep learning. We will cover the basics of machine learning and introduce techniques and systems that enable machine learning algorithms to be efficiently parallelized. The course complements courses in machine learning and distributed systems, with a focus both on deep learning and on the intersection between distributed systems and machine learning. The course prepares students for Master's degree projects and Ph.D. studies in the areas of data science and distributed computing.

The main objective of this course is to provide the students with a solid foundation for understanding large-scale machine learning algorithms, in particular, deep learning, and their application areas.

On successful completion of the course, the student will:

  • be able to re-implement a classical machine learning algorithm as a scalable machine learning algorithm
  • be able to design and train a layered neural network system
  • be able to apply a trained layered neural network system to make useful predictions or classifications in an application area
  • be able to elaborate on the performance trade-offs that arise when parallelizing machine learning algorithms, as well as the limitations in different network environments
  • be able to identify appropriate distributed machine learning algorithms to efficiently solve classification and pattern recognition problems.
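As a small concrete instance of the second outcome above (designing and training a layered neural network), the sketch below trains a two-layer network on XOR, a task no single linear layer can solve. This is purely illustrative; the architecture, initialization, and hyperparameters are assumptions and have no connection to the course's actual assignments.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, y):
    """Binary cross-entropy loss."""
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

rng = np.random.default_rng(1)

# XOR is not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A 2-4-1 network: tanh hidden units, sigmoid output.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

initial_loss = bce(sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2), y)

lr = 0.5
for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the cross-entropy loss.
    dp = (p - y) / len(X)
    dW2 = h.T @ dp; db2 = dp.sum(axis=0)
    dh = (dp @ W2.T) * (1 - h ** 2)   # tanh'(a) = 1 - tanh(a)^2
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Plain gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

final_loss = bce(sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2), y)
```

The forward/backward structure shown here is the same pattern that the regularization and optimization topics of the course build on, just at a much larger scale.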

Course Disposition

No information inserted

Literature and preparations

Specific prerequisites *

No information inserted

Recommended prerequisites

Basic knowledge of distributed systems, programming models, and programming languages (Scala, Java, Python).

It is preferable that you have some training in, or have taken a course in, the following areas: machine learning, linear algebra, and probability theory.

Equipment

No information inserted

Literature

Course material is derived from recent research publications as well as the following textbook:

Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, 2016.

Examination and completion

Grading scale *

A, B, C, D, E, FX, F

Examination *

  • LAB1 - Programming Assignments, 3.0 credits, Grading scale: P, F
  • TEN1 - Examination, 4.5 credits, Grading scale: A, B, C, D, E, FX, F

Based on a recommendation from KTH's coordinator for disabilities, the examiner will decide how to adapt an examination for students with a documented disability.

The examiner may apply another examination format when re-examining individual students.

Written examination. Laboratory tasks.

Opportunity to complete the requirements via supplementary examination

No information inserted

Opportunity to raise an approved grade via renewed examination

No information inserted

Examiner

Amir Payberah

Further information

Course web

Further information about the course can be found on the Course web at the link below. Information on the Course web will later be moved to this site.

Course web ID2223

Offered by

EECS/Computer Science

Main field of study *

Computer Science and Engineering

Education cycle *

Second cycle

Add-on studies

No information inserted

Ethical approach *

  • All members of a group are responsible for the group's work.
  • In any assessment, every student shall honestly disclose any help received and sources used.
  • In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.

Supplementary information

In this course, the EECS code of honor applies, see: http://www.kth.se/en/eecs/utbildning/hederskodex.