News feed
Do you have questions about the course?
If you are registered on a current course offering, see the course room in Canvas. You will find the right course room under "Courses" in the personal menu.
If you are not registered, see the course memo for DD2431 or contact your student office, study counsellor, or education office.
In the news feed you will find updates to pages, the schedule, and posts from teachers (when they also need to reach previously registered students).
Me too.
I was finally registered (probably by my coordinator).
Teacher Atsuto Maki edited 27 August 2014
Lecture
Teacher Atsuto Maki edited 29 August 2014
Teacher: Atsuto Maki
Teacher Atsuto Maki edited 29 August 2014
Lecture 6, Classification Introduction
Schedule administrator edited 1 September 2014
FR4
Lecture
Schedule administrator edited 5 September 2014
Teacher changed from Giampiero Salvi to Atsuto Maki
Teacher Atsuto Maki edited 15 September 2014
Lecture 6, Regression Introduction
Teacher: Atsuto Maki
Teacher Atsuto Maki edited 17 September 2014
Topics:
* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides from this lecture: Slides for Lecture 6
Teacher Atsuto Maki edited 4 October 2014
Topics:
* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides from this lecture: Slides for Lecture 6
Teacher Atsuto Maki edited 5 October 2014
Topics:
* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides from this lecture: Slides for Lecture 6
Related reading:
Chapter 3 and 6.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 15 October 2014
Topics:
* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides for Lecture 6
Related reading:
Chapter 3 and 6.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 9 June 2015
Topics:
* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides for Lecture 6
Related reading:
Chapter 3 and 6.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
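To make the regularized-regression topics above concrete, here is a minimal NumPy sketch (not part of the course materials; `ridge_fit` is a name chosen here) of the closed-form ridge solution w = (XᵀX + λI)⁻¹Xᵀy:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^{-1} X'y.
    lam = 0 recovers ordinary least squares; larger lam shrinks w."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Noise-free toy data with y = 2 * x
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])
w_ols = ridge_fit(X, y, lam=0.0)     # -> [2.0]
w_ridge = ridge_fit(X, y, lam=10.0)  # -> [1.5], shrunk toward zero
```

Lasso replaces the squared penalty with an L1 penalty and has no closed form, which is why it is usually solved iteratively.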
Teacher Atsuto Maki edited 29 August 2014
Lecture 10, Ensemble Methods
Teacher Atsuto Maki edited 1 October 2014
Topics:
* Wisdom of Crowd
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10
Teacher Atsuto Maki edited 2 October 2014
Topics:
* Wisdom of Crowd
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10
Teacher Atsuto Maki edited 4 October 2014
Topics:
* Wisdom of Crowd
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10
Teacher Atsuto Maki edited 5 October 2014
Topics:
* Wisdom of Crowd
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10
Related reading:
Chapter 8.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 9 June 2015
Topics:
* Wisdom of Crowd
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10
Related reading:
Chapter 8.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
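The "wisdom of crowd" effect behind these ensemble methods can be demonstrated in a few lines (an illustrative simulation, not course code): independent weak classifiers that are each right 70% of the time become far more accurate under majority voting.

```python
import numpy as np

rng = np.random.default_rng(42)

# 25 independent weak classifiers, each correct with probability 0.7.
n_trials, n_voters, p = 10_000, 25, 0.7
correct = rng.random((n_trials, n_voters)) < p          # per-voter correctness
single_acc = correct[:, 0].mean()                       # one weak classifier
vote_acc = (correct.sum(axis=1) > n_voters / 2).mean()  # majority vote
# single_acc is near 0.7; vote_acc is close to 1
```

The gain relies on the voters being independent, which is why bagging and decision forests work to decorrelate their base learners (bootstrap resampling, random feature subsets).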
Teacher Atsuto Maki edited 29 August 2014
Lecture 9, Learning Theory
Teacher Örjan Ekeberg edited 29 September 2014
Topics:
* Concepts and Hypotheses
* PAC-Learning
* VC-Dimension
Slides from lecture 9
Teacher Atsuto Maki edited 4 October 2014
Topics:
* Concepts and Hypotheses
* PAC-Learning
* VC-Dimension
Slides from lecture 9
Teacher Atsuto Maki edited 29 August 2014
Lecture 8, Classification with Separating Hyperplanes
Teacher Örjan Ekeberg edited 24 September 2014
Topics:
* Linear separation
* Structural risk minimization
* Support vector machines
* Kernels
* Non-separable Classes
Slides from lecture 8
Teacher Atsuto Maki edited 4 October 2014
Topics:
* Linear separation
* Structural risk minimization
* Support vector machines
* Kernels
* Non-separable Classes
Slides from lecture 8
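The kernel topic from this lecture can be made concrete with the Gaussian (RBF) kernel; the sketch below (illustrative only, not course code) computes the kernel matrix that a kernelized support vector machine would consume:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    A linear separator in this implicit feature space is non-linear in
    the original input space -- the kernel trick."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

X = np.array([[0.0, 0.0], [1.0, 0.0]])
K = rbf_kernel(X, X)  # symmetric, ones on the diagonal, exp(-1) off it
```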
Teacher Atsuto Maki edited 29 August 2014
Teacher: Atsuto Maki
Teacher Atsuto Maki edited 29 August 2014
Lecture 7, Regression Introduction
Schedule administrator edited 5 September 2014
Teacher changed from Giampiero Salvi to Atsuto Maki
Teacher Atsuto Maki edited 15 September 2014
Lecture 7, Classification Introduction
Teacher: Atsuto Maki
Teacher Atsuto Maki edited 21 September 2014
Topics:
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
* Naive Bayes
Slides from this lecture: Slides for Lecture 7
Slides for Lecture 7 (part II)
Teacher Atsuto Maki edited 22 September 2014
Topics:
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
* Naive Bayes
Slides from this lecture: Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)
Teacher Atsuto Maki edited 22 September 2014
Topics:
* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides from this lecture: Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)
Teacher Atsuto Maki edited 4 October 2014
Topics:
* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides from this lecture: Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)
Teacher Atsuto Maki edited 5 October 2014
Topics:
* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)
Related reading:
Chapter 4 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 9 June 2015
Topics:
* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)
Related reading:
Chapter 4 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
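As an illustration of the Naive Bayes topic above, here is a minimal Bernoulli sketch (not course code; `nb_fit` and `nb_predict` are names chosen here) that estimates class priors and per-feature likelihoods and classifies by maximum posterior:

```python
import numpy as np

def nb_fit(X, y):
    """Bernoulli naive Bayes: estimate P(class) and P(feature=1 | class)
    with Laplace smoothing, assuming features independent given the class."""
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])
    cond = np.array([(X[y == c].sum(axis=0) + 1.0) / ((y == c).sum() + 2.0)
                     for c in classes])
    return classes, priors, cond

def nb_predict(X, classes, priors, cond):
    """Pick the class maximising log P(c) + sum_j log P(x_j | c)."""
    log_post = (np.log(priors)
                + X @ np.log(cond).T
                + (1.0 - X) @ np.log(1.0 - cond).T)
    return classes[np.argmax(log_post, axis=1)]

X = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]])
y = np.array([1, 1, 0, 0])
classes, priors, cond = nb_fit(X, y)
pred = nb_predict(X, classes, priors, cond)  # feature 0 carries the signal
```

Working in log space avoids underflow when many conditionally independent features are multiplied together.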
Teacher Atsuto Maki edited 29 August 2014
Lecture 5, Challenges to machine learning
Teacher Atsuto Maki edited 14 September 2014
Topics:
* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5
Related reading: Chapter 2 from An Introduction to Statistical Learning, James et al.
Available online at: http://www-bcf.usc.edu/~gareth/ISL/data.html
Teacher Atsuto Maki edited 14 September 2014
Topics:
* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5
Related reading: Chapter 2 from An Introduction to Statistical Learning, James et al. Available online at: http://www-bcf.usc.edu/~gareth/ISL/data.html
Teacher Atsuto Maki edited 14 September 2014
Topics:
* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5
Teacher Atsuto Maki edited 5 October 2014
Topics:
* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5
Related reading:
Chapter 2 and 6.4 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 9 June 2015
Topics:
* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides for Lecture 5
Related reading:
Chapter 2 and 6.4 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Giampiero Salvi edited 8 July 2014
Teacher: Giampiero Salvi
Teacher Atsuto Maki edited 29 August 2014
Lecture 4, Probability II
Teacher Giampiero Salvi edited 3 September 2014
Teacher: Giampiero Salvi
Topics:
* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:
Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Schedule administrator edited 5 September 2014
Teacher changed from Örjan Ekeberg to Giampiero Salvi
Teacher Giampiero Salvi edited 10 September 2014
Teacher: Giampiero Salvi
Topics:
* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:
Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Handouts: Handouts for Lecture 04
Teacher Giampiero Salvi edited 10 September 2014
Teacher: Giampiero Salvi
Topics:
* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:
Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Handouts: Handouts for Lecture 04
Teacher Giampiero Salvi edited 11 September 2014
Teacher: Giampiero Salvi
Topics:
* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:
Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Handouts: Handouts for Lecture 04
Teacher Atsuto Maki edited 4 October 2014
Teacher: Giampiero Salvi
Topics:
* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:
Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Handouts: Handouts for Lecture 04
Teacher Atsuto Maki edited 29 August 2014
Lecture 3, Probability I
Teacher Giampiero Salvi edited 3 September 2014
Topics:
* Probability theory
* Common probability distributions
* Bayes rule and machine learning
Selected reading:
Chapters 2, 3, 5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Teacher Giampiero Salvi edited 3 September 2014
Topics:
* Probability theory
* Common probability distributions
* Bayes rule and machine learning
* Fitting probability models
Selected reading:
Chapters 2-5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Teacher Giampiero Salvi edited 8 September 2014
Topics:
* Probability theory
* Common probability distributions
* Bayes rule and machine learning
* Fitting probability models
Handouts: 03-probtheory-2x2.pdf
Selected reading:
Chapters 2-5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Teacher Atsuto Maki edited 4 October 2014
Topics:
* Probability theory
* Common probability distributions
* Bayes rule and machine learning
* Fitting probability models
Handouts: 03-probtheory-2x2.pdf
Selected reading:
Chapters 2-5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/
Teacher Atsuto Maki edited 29 August 2014
Lecture 1, Introduction
Teacher Atsuto Maki edited 29 August 2014
Structure of the Course
* What will the course cover?
* How are labs and examination handled?
Learning Machines
* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Teacher Atsuto Maki edited 1 September 2014
Structure of the Course
* What will the course cover?
* How are labs and examination handled?
Learning Machines
* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides from this lecture: Slides for Lecture 1
Slides from this lecture: Slides for Lecture 1 (part II)
Teacher Atsuto Maki edited 4 October 2014
Structure of the Course
* What will the course cover?
* How are labs and examination handled?
Learning Machines
* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides from this lecture: Slides for Lecture 1
Slides from this lecture: Slides for Lecture 1 (part II)
Teacher Atsuto Maki edited 5 October 2014
Structure of the Course
* What will the course cover?
* How are labs and examination handled?
Learning Machines
* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides for Lecture 1
Slides for Lecture 1 (part II)
Teacher Atsuto Maki edited 9 June 2015
Structure of the Course
* What will the course cover?
* How are labs and examination handled?
Learning Machines
* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides for Lecture 1
Slides for Lecture 1 (part II)
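The nearest neighbour classifier mentioned above fits in a few lines; this is a minimal NumPy sketch (not course code; `nearest_neighbour_predict` is a name chosen here) of the 1-NN rule:

```python
import numpy as np

def nearest_neighbour_predict(X_train, y_train, X_test):
    """1-nearest-neighbour classifier: each test point gets the label of
    its closest training point in Euclidean distance."""
    # Pairwise squared distances, shape (n_test, n_train)
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    return y_train[np.argmin(d2, axis=1)]

X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0.2, 0.1], [4.8, 5.5]])
pred = nearest_neighbour_predict(X_train, y_train, X_test)  # -> [0, 1]
```

There is no training phase at all: the training set itself is the model, which makes 1-NN a natural first example of a "learning machine".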
Teacher Atsuto Maki edited 29 August 2014
Lecture 2, Decision Trees
Teacher Atsuto Maki edited 2 September 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* Bias - Variance trade-off
Teacher Atsuto Maki edited 3 September 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2
Teacher Atsuto Maki edited 4 September 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2
Teacher Atsuto Maki edited 4 October 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2
Teacher Atsuto Maki edited 5 October 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2
Related reading:
Chapter 8.1 from An Introduction to Statistical Learning with Applications in R (Springer, 2013). Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 5 October 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides for Lecture 2
Related reading:
Chapter 8.1 from An Introduction to Statistical Learning with Applications in R (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 15 October 2014
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides for Lecture 2
Related reading:
Chapter 8.1 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 9 June 2015
* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides for Lecture 2
Related reading:
Chapter 8.1 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
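The entropy and information-gain questions above can be answered concretely in a few lines (an illustrative sketch, not course code; `entropy` and `information_gain` are names chosen here):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(labels, split_mask):
    """Entropy reduction from splitting `labels` by a yes/no question."""
    n = len(labels)
    left, right = labels[split_mask], labels[~split_mask]
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

y = np.array([0, 0, 1, 1])
perfect = np.array([True, True, False, False])  # separates the classes
useless = np.array([True, False, True, False])  # ignores the classes
# information_gain(y, perfect) == 1.0, information_gain(y, useless) == 0.0
```

A decision-tree learner such as ID3 greedily asks the question with the highest information gain at each node.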
Teacher Atsuto Maki edited 29 August 2014
Lecture 11, Learning Representation
Teacher Atsuto Maki edited 4 October 2014
Lecture 11, Dimensionality Reduction
Teacher Atsuto Maki edited 6 October 2014
Topics:
* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides from lecture 11
Related reading:
Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 6 October 2014
Topics:
* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides from lecture 11
Related reading:
Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 6 October 2014
Topics:
* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides for Lecture 11
Related reading:
Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
Teacher Atsuto Maki edited 9 June 2015
Topics:
* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides for Lecture 11
Related reading:
Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)
Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/
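The PCA topic above can be sketched via an eigendecomposition of the sample covariance matrix (illustrative only, not course code; `pca` is a name chosen here):

```python
import numpy as np

def pca(X, k):
    """PCA via eigendecomposition of the sample covariance matrix.
    Returns the top-k principal directions and the projected data."""
    Xc = X - X.mean(axis=0)                  # centre the data
    cov = Xc.T @ Xc / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    components = eigvecs[:, ::-1][:, :k]     # top-k directions
    return components, Xc @ components

# Points lying (almost) along the line y = x: the first principal
# direction should be close to (1, 1) / sqrt(2), up to sign.
X = np.array([[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.8]])
components, Z = pca(X, 1)
```

Since PCA uses no labels, it is the course's running example of unsupervised learning of a representation.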
Schedule administrator edited 13 May 2014
Monday 13 October 2014, 07:00 - 09:00
D1
Teacher Atsuto Maki edited 29 August 2014
Lecture 12, Summary
Teacher Atsuto Maki edited 13 October 2014
Slides for Lecture 12
Teacher Atsuto Maki edited 13 October 2014
Slides for Lecture 12
This says "The address given is protected by one of the group's administrators." I'm "admitted" to this course but still not registered. Is there something I'm supposed to do, or will the administration eventually register me?