FEP3260 Fundamentals of Machine Learning over Networks 10.0 credits
This course covers the fundamentals of machine learning over networks (MLoNs). It starts from a conventional single-agent setting, where one server runs a convex or nonconvex optimization problem to learn an unknown function, and introduces several approaches to this seemingly simple yet fundamental problem. We then introduce an abstract form of MLoNs, present centralized and distributed solution approaches, and illustrate them by training a deep neural network over a network. The course covers various important aspects of MLoNs, including optimality, computational complexity, communication complexity, security, large-scale learning, online learning, MLoNs with partial information, and several application areas. As most of these topics are areas of active research, the course is not based on a single textbook but builds on a series of key publications in the field.
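To make the centralized-versus-distributed contrast concrete, the following is a minimal sketch of decentralized gradient descent (DGD) for a least-squares problem, where each agent holds private data and only exchanges iterates with its neighbors. The ring topology, Metropolis-style mixing weights, step size, and all variable names are illustrative assumptions, not taken from the course material.

```python
import numpy as np

# Hedged sketch: decentralized gradient descent (DGD) over a 4-node ring.
# Each agent i holds private data (A_i, b_i) defining the local objective
# f_i(x) = 0.5 * ||A_i x - b_i||^2; the network minimizes sum_i f_i(x).

rng = np.random.default_rng(0)
n_agents, d = 4, 3
x_true = rng.normal(size=d)  # ground-truth model (illustrative)

A = [rng.normal(size=(10, d)) for _ in range(n_agents)]
b = [Ai @ x_true for Ai in A]  # noiseless data, so all agents agree on x_true

# Doubly stochastic mixing matrix for the ring (each agent averages with
# its two neighbors); row-stochasticity models the consensus step.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = np.zeros((n_agents, d))  # row i: agent i's local iterate
alpha = 0.01                 # constant step size (assumed small enough)

for _ in range(1000):
    # Local gradients of f_i at each agent's own iterate.
    grads = np.stack([A[i].T @ (A[i] @ x[i] - b[i]) for i in range(n_agents)])
    # One DGD iteration: consensus (mix with neighbors) + local gradient step.
    x = W @ x - alpha * grads

err = float(np.max(np.abs(x - x_true)))  # all agents approach the minimizer
```

In this noiseless case every local objective shares the same minimizer, so DGD with a constant step size drives all agents to it exactly; with heterogeneous data, a constant step only reaches a neighborhood of the optimum, which is one of the trade-offs the course examines.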
Application
For course offering
Autumn 2023 Start 28 Aug 2023 programme students
Application code
51211
Content and learning outcomes
Course contents
- Lecture 1: Introduction
- Lecture 2: Centralized Convex ML
- Lecture 3: Centralized Nonconvex ML
- Lecture 4: Distributed ML
- Lecture 5: ADMM, guest lecturer
- Lecture 6: Communication Efficiency
- Lecture 7: Deep Neural Networks
- Lecture 8: Computer Assignment Session and Homework
- Lecture 9: Special Topic 1: Large-scale ML
- Lecture 10: Special Topic 2: Security in MLoNs
- Lecture 11: Special Topic 3: Online MLoNs
- Lecture 12: Special Topic 4: MLoNs with partial knowledge
- Lecture 13: Special Topic 5: Application Areas and Open Research Problems
Intended learning outcomes
After the course, the student should be able to:
- model basic ML problems as optimization problems, using the tools and training provided in the course
- present the basic theories of large-scale ML, distributed ML, and MLoNs
- explain how such problems are solved and the pros and cons of various approaches, drawing on hands-on experience in solving them
- review recent topics in ML and MLoNs, including communication efficiency, security, and MLoNs with partial knowledge
- apply the background and skills required to do research in this growing field
Literature and preparations
Specific prerequisites
Basic knowledge of convex optimization and probability theory is required to follow the course.
Recommended prerequisites
Equipment
Literature
Examination and completion
If the course is discontinued, students may request to be examined during the following two academic years.
Grading scale
Examination
- EXA1 - Examination, 10.0 credits, grading scale: P, F
Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.
The examiner may apply another examination format when re-examining individual students.
Other requirements for final grade
- Attending at least 11 lectures (out of 13)
- 45 min oral presentation of a selected topic in one of the Special Topic lectures
- 80% on homework problems and computer assignments
- Project (preferably on a problem related to the student’s own research)
Opportunity to complete the requirements via supplementary examination
Opportunity to raise an approved grade via renewed examination
Examiner
Ethical approach
- All members of a group are responsible for the group's work.
- In any assessment, every student shall honestly disclose any help received and sources used.
- In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.