
News feed


In the News feed you find updates to pages, the schedule, and posts from teachers (when they also need to reach previously registered students).

September 2014
during HT 2014

Atsuto Maki created the page 19 August 2014

Teacher Atsuto Maki changed the permissions 19 August 2014

Can thereby be read by teachers and changed by teachers.

Teacher Atsuto Maki changed the permissions 27 August 2014

Can thereby be read by students and teachers and changed by teachers.

Teacher Atsuto Maki changed the permissions 28 August 2014

Can thereby be read by all logged-in users and changed by teachers.

Teacher Atsuto Maki changed the permissions 4 September 2014

Can thereby be read by everyone and changed by teachers.
commented 9 September 2014

This says "The address given is protected by one of the group's administrators." I'm "admitted" to this course but still not registered. Is there something I'm supposed to do, or will the administration eventually register me?

commented 9 September 2014

Me too.

commented 10 September 2014

I was finally registered (probably by my coordinator).

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 27 August 2014

Lecture

Teacher Atsuto Maki edited 29 August 2014

Teacher: Atsuto Maki

Teacher Atsuto Maki edited 29 August 2014

Lecture 6, Classification Introduction

Schedule administrator edited 1 September 2014

FR4

Lecture

Schedule administrator edited 5 September 2014

Teacher changed from Giampiero Salvi (u12rf6rn) to Atsuto Maki (u1elx760)

Teacher Atsuto Maki edited 15 September 2014

Lecture 6, Regression Introduction

Teacher: Atsuto Maki

Teacher Atsuto Maki edited 17 September 2014

Topics:

* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides from this lecture: Slides for Lecture 6

Teacher Atsuto Maki edited 4 October 2014

Topics:


* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides from this lecture: Slides for Lecture 6

Teacher Atsuto Maki edited 5 October 2014

Topics:


* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides from this lecture: Slides for Lecture 6

Related reading:

Chapters 3 and 6.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 15 October 2014

Topics:


* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides for Lecture 6

Related reading:

Chapters 3 and 6.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 9 June 2015

Topics:


* Function approximation
* Linear regression
* RANSAC
* Nearest Neighbours regression
* Linear regression + regularization
* Ridge regression
* Lasso
Slides for Lecture 6

Related reading:

Chapters 3 and 6.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

 
April 2014
during HT 2014
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 10, Ensemble Methods

Teacher Atsuto Maki edited 1 October 2014

Topics:

* Wisdom of Crowds
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10

Teacher Atsuto Maki edited 2 October 2014

Topics:

* Wisdom of Crowds
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10



Teacher Atsuto Maki edited 4 October 2014

Topics:

* Wisdom of Crowds
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10

Teacher Atsuto Maki edited 5 October 2014

Topics:

* Wisdom of Crowds
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10

Related reading:

Chapter 8.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 9 June 2015

Topics:

* Wisdom of Crowds
* Why combine classifiers?
* Bagging
* Decision Forests
* Boosting
Slides for Lecture 10

Related reading:

Chapter 8.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 9, Learning Theory

Teacher Örjan Ekeberg edited 29 September 2014

Topics:

* Concepts and Hypotheses
* PAC-Learning
* VC-Dimension
Slides from lecture 9

Teacher Atsuto Maki edited 4 October 2014

Topics:


* Concepts and Hypotheses
* PAC-Learning
* VC-Dimension
Slides from lecture 9

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 8, Classification with Separating Hyperplanes

Teacher Örjan Ekeberg edited 24 September 2014

Topics:

* Linear separation
* Structural risk minimization
* Support vector machines
* Kernels
* Non-separable Classes
Slides from lecture 8

Teacher Atsuto Maki edited 4 October 2014

Topics:


* Linear separation
* Structural risk minimization
* Support vector machines
* Kernels
* Non-separable Classes
Slides from lecture 8

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Teacher: Atsuto Maki

Teacher Atsuto Maki edited 29 August 2014

Lecture 7, Regression Introduction

Schedule administrator edited 5 September 2014

Teacher changed from Giampiero Salvi (u12rf6rn) to Atsuto Maki (u1elx760)

Teacher Atsuto Maki edited 15 September 2014

Lecture 7, Classification Introduction

Teacher: Atsuto Maki

Teacher Atsuto Maki edited 21 September 2014

Topics:

* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
* Naive Bayes
Slides from this lecture: Slides for Lecture 7, Slides for Lecture 7 (part II)

Teacher Atsuto Maki edited 22 September 2014

Topics:

* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
* Naive Bayes
Slides from this lecture: Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)



Teacher Atsuto Maki edited 22 September 2014

Topics:

* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides from this lecture: Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)

Teacher Atsuto Maki edited 4 October 2014

Topics:


* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides from this lecture: Slides for Lecture 7 (part I) , Slides for Lecture 7 (part II)

Teacher Atsuto Maki edited 5 October 2014

Topics:

* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides from this lecture: Slides for Lecture 7 (part I), Slides for Lecture 7 (part II)

Related reading:

Chapter 4 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 9 June 2015

Topics:


* Naive Bayes
* Logistic regression
* Inference and decision
* Discriminative function
* Discriminative vs Generative model
Slides for Lecture 7 (part I) , Slides for Lecture 7 (part II)

Related reading:

Chapter 4 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 5, Challenges to machine learning

Teacher Atsuto Maki edited 14 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5

Related reading: Chapter 2 from An Introduction to Statistical Learning, James et al.

Available online at: http://www-bcf.usc.edu/~gareth/ISL/data.html

Teacher Atsuto Maki edited 14 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5

Related reading: Chapter 2 from An Introduction to Statistical Learning, James et al. Available online at: http://www-bcf.usc.edu/~gareth/ISL/data.html

Teacher Atsuto Maki edited 14 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5

Teacher Atsuto Maki edited 15 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5

Teacher Atsuto Maki edited 15 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5



Teacher Atsuto Maki edited 15 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5



Teacher Atsuto Maki edited 15 September 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5

Teacher Atsuto Maki edited 5 October 2014

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides from this lecture: Slides for Lecture 5

Related reading:

Chapters 2 and 6.4 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 9 June 2015

Topics:

* Challenges to machine learning
* Model complexity and overfitting
* The curse of dimensionality
* Concepts of prediction errors
* The bias-variance trade-off
Slides for Lecture 5

Related reading:

Chapters 2 and 6.4 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Giampiero Salvi edited 8 July 2014

Teacher: Giampiero Salvi

Teacher Atsuto Maki edited 29 August 2014

Lecture 4, Probability II

Teacher Giampiero Salvi edited 3 September 2014

Teacher: Giampiero Salvi

Topics:

* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:

Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Schedule administrator edited 5 September 2014

Teacher changed from Örjan Ekeberg (u1oppomp) to Giampiero Salvi (u12rf6rn)

Teacher Giampiero Salvi edited 10 September 2014

Teacher: Giampiero Salvi

Topics:


* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:

Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Handouts: Handouts for Lecture 04



Teacher Giampiero Salvi edited 10 September 2014

Teacher: Giampiero Salvi

Topics:


* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:

Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Handouts: Handouts for Lecture 04

Teacher Giampiero Salvi edited 11 September 2014

Teacher: Giampiero Salvi

Topics:


* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:

Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Handouts: Handouts for Lecture 04

Teacher Atsuto Maki edited 4 October 2014

Teacher: Giampiero Salvi

Topics:


* Fitting probability models (continued)
* Model selection (Occam's Razor)
* Unsupervised learning and Expectation Maximization
Selected reading:

Chapters 4 and 7 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Handouts: Handouts for Lecture 04

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 3, Probability I

Teacher Giampiero Salvi edited 3 September 2014

Topics:

* Probability theory
* Common probability distributions
* Bayes rule and machine learning
Selected reading:

Chapters 2, 3, 5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/



Teacher Giampiero Salvi edited 3 September 2014

Topics:


* Probability theory
* Common probability distributions
* Bayes rule and machine learning
* Fitting probability models
Selected reading:

Chapters 2-5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Teacher Giampiero Salvi edited 8 September 2014

Topics:


* Probability theory
* Common probability distributions
* Bayes rule and machine learning
* Fitting probability models
Handouts: 03-probtheory-2x2.pdf

Selected reading:

Chapters 2-5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

Teacher Atsuto Maki edited 4 October 2014

Topics:


* Probability theory
* Common probability distributions
* Bayes rule and machine learning
* Fitting probability models
Handouts: 03-probtheory-2x2.pdf

Selected reading:

Chapters 2-5 from Computer Vision: Models, Learning, and Inference, Simon J.D. Prince. Available online at: http://www.computervisionmodels.com/

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 1, Introduction

Teacher Atsuto Maki edited 29 August 2014

Structure of the Course

* What will the course cover?
* How are labs and examination handled?
Learning Machines

* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?

Teacher Atsuto Maki edited 1 September 2014

Structure of the Course

* What will the course cover?
* How are labs and examination handled?
Learning Machines

* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides from this lecture: Slides for Lecture 1

Slides from this lecture: Slides for Lecture 1 (part II)

Teacher Atsuto Maki edited 4 October 2014

Structure of the Course

* What will the course cover?
* How are labs and examination handled?
Learning Machines

* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides from this lecture: Slides for Lecture 1

Slides from this lecture: Slides for Lecture 1 (part II)

Teacher Atsuto Maki edited 5 October 2014

Structure of the Course

* What will the course cover?
* How are labs and examination handled?
Learning Machines

* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides from this lecture: Slides for Lecture 1

Slides for Lecture 1 (part II)

Teacher Atsuto Maki edited 9 June 2015

Structure of the Course

* What will the course cover?
* How are labs and examination handled?
Learning Machines

* What do we mean by a "Learning Machine"?
* Supervised vs Unsupervised learning?
* What can learning algorithms be used for?
* How can a simple learning program be constructed?
* What is a Nearest neighbour classifier?
Slides for Lecture 1

Slides for Lecture 1 (part II)

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 2, Decision Trees

Teacher Atsuto Maki edited 2 September 2014


* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* Bias - Variance trade-off

Teacher Atsuto Maki edited 3 September 2014

* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* Bias - Variance trade-off
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2

Teacher Atsuto Maki edited 4 September 2014

* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2

Teacher Atsuto Maki edited 4 October 2014


* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2

Teacher Atsuto Maki edited 5 October 2014

* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides from this lecture: Slides for Lecture 2

Related reading:

An Introduction to Statistical Learning with Applications in R (Springer, 2013). Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Chapter 8.1

Teacher Atsuto Maki edited 5 October 2014


* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides for Lecture 2

Related reading:

Chapter 8.1 from An Introduction to Statistical Learning with Applications in R (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 15 October 2014


* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides for Lecture 2

Related reading:

Chapter 8.1 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 9 June 2015

* What is a Decision Tree?
* When are decision trees useful?
* How can one select what questions to ask?
* What do we mean by Entropy for a data set?
* What do we mean by the Information Gain of a question?
* What is the problem of overfitting? Minimizing training error?
* What extensions will be possible for improvement?
Slides for Lecture 2

Related reading:

Chapter 8.1 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

 
Schedule administrator created the event 2 April 2014

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 11, Learning Representation

Teacher Atsuto Maki edited 4 October 2014

Lecture 11, Dimensionality Reduction

Teacher Atsuto Maki edited 6 October 2014

Topics:

* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides from lecture 11

Related reading:

Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/



Teacher Atsuto Maki edited 6 October 2014

Topics:


* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides from lecture 11

Related reading:

Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 6 October 2014

Topics:


* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides for Lecture 11

Related reading:

Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

Teacher Atsuto Maki edited 9 June 2015

Topics:


* Concept of subspace
* Similarity measure
* Subspace method
* Unsupervised Learning
* Principal Component Analysis (PCA)
Slides for Lecture 11

Related reading:

Chapter 10.2 from An Introduction to Statistical Learning (Springer, 2013)

Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. Available online: http://www-bcf.usc.edu/~gareth/ISL/

 
during HT 2014
Schedule administrator created the event 2 April 2014
Schedule administrator edited 13 May 2014

Monday 13 October 2014, 17:00 - 19:00

D1

changed the permissions 15 May 2014

Can thereby be read by everyone and changed by teachers.
Teacher Atsuto Maki edited 29 August 2014

Lecture 12, Summary

Teacher Atsuto Maki edited 13 October 2014

Slides for Lecture 12

Teacher Atsuto Maki edited 13 October 2014

Slides for Lecture 12