Module | Material covered | Class details, online material, and homework

Module 1: Basics (1 Lecture)

Material covered:
- What is learning?
- Version spaces
- Sample complexity
- Training set/test set split
- Point estimation
  - MLE
  - Bayesian
  - MAP
- Bias-Variance tradeoff

Mon., Jan 15: ** No Class. MLK B-Day **

Module 2: Linear models (3 Lectures)

Material covered:
- Linear regression [Applet]
  http://www.mste.uiuc.edu/users/exner/java.f/leastsquares/
- Bias-Variance tradeoff
- Overfitting
- Bayes optimal classifier
- Naive Bayes [Applet]
  http://www.cs.technion.ac.il/~rani/LocBoost/
- Logistic regression [Applet]
- Discriminative vs. generative models [Applet]

Mon., Jan. 22:
- Lecture: Gaussians, Linear Regression, Bias-Variance Tradeoff [Slides] [Annotated]
- Readings: Bishop 1.1 to 1.4; 3.1, 3.1.1, 3.1.4, 3.1.5; 3.2; 3.3, 3.3.1, 3.3.2

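Since linear regression anchors the Module 2 topic list, here is a minimal sketch of ordinary least squares for a single feature, using the 1-D closed form (slope = covariance/variance). The function and variable names are illustrative, not from the course materials:

```python
def fit_least_squares(xs, ys):
    """Fit y = w*x + b by minimizing squared error (1-D closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: sample covariance of (x, y) divided by sample variance of x.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    b = mean_y - w * mean_x  # intercept: line passes through the means
    return w, b

# Data generated exactly by y = 2x + 1, so the fit recovers w = 2, b = 1.
w, b = fit_least_squares([0, 1, 2, 3], [1, 3, 5, 7])
```

The same covariance/variance structure generalizes to the multivariate normal-equations solution covered in the Bishop readings.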
Module 3: Non-linear models and model selection (5 Lectures)

Material covered:
- Decision trees [Applet]
- Overfitting, again
- Regularization
- MDL
- Cross-validation
- Boosting [AdaBoost Applet]
  www.cse.ucsd.edu/~yfreund/adaboost
- Instance-based learning [Applet]
  www.site.uottawa.ca/~gcaron/applets.htm
  - K-nearest neighbors
  - Kernels
- Neural nets [CMU Course]
  www.cs.cmu.edu/afs/cs/academic/class/15782-s04/

Mon., Feb. 12:
- Lecture: Cross Validation, Simple Model Selection, Regularization, MDL, Neural Nets [Slides] [Annotated]
- Readings: (Bishop 1.3) Model Selection / Cross Validation; (Bishop 3.1.4) Regularized least squares; (Bishop 5.1) Feed-forward Network Functions

Wed., Feb. 14:
- Lecture: Neural Nets, Instance-based Learning [Slides] [Annotated]
- Readings: (Bishop 5.1) Feed-forward Network Functions; (Bishop 5.2) Network Training; (Bishop 5.3) Error Backpropagation

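Cross-validation from the Module 3 list reduces to a simple index-partitioning scheme; a minimal sketch of generating k-fold train/validation splits (the function name is illustrative, not from the course materials):

```python
def k_fold_splits(n, k):
    """Yield (train_indices, val_indices) pairs for k-fold cross-validation.

    Each of the n examples appears in exactly one validation fold; the
    remaining examples form that fold's training set.
    """
    # Distribute n examples as evenly as possible across k folds.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, val
        start += size
```

In use, a model is trained on each `train` set and scored on the matching `val` set, and the k scores are averaged to estimate generalization error for model selection.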
Module 4: Margin-based approaches (2 Lectures)

Material covered:
- SVMs [Applets]
  www.site.uottawa.ca/~gcaron/applets.htm
- Kernel trick

Module 5: Learning theory (3 Lectures)

Material covered:
- Sample complexity
- PAC learning [Applets]
  www.site.uottawa.ca/~gcaron/applets.htm
- Error bounds
- VC-dimension
- Margin-based bounds
- Large-deviation bounds
  - Hoeffding's inequality, Chernoff bound
- Mistake bounds
- No Free Lunch theorem

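Hoeffding's inequality, listed above under large-deviation bounds, is the workhorse behind the module's sample-complexity results. A standard statement for a fixed classifier with true error $p$ and empirical error $\hat{p}$ on $m$ i.i.d. samples (this is the textbook form, not copied from the course's own notes):

\[
P\big(|\hat{p} - p| \ge \epsilon\big) \;\le\; 2\,e^{-2 m \epsilon^2},
\]

so setting the right-hand side to $\delta$ and solving gives the sample-complexity bound

\[
m \;\ge\; \frac{1}{2\epsilon^2}\,\ln\frac{2}{\delta},
\]

which guarantees $|\hat{p} - p| \le \epsilon$ with probability at least $1 - \delta$.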
Mid-term Exam: all material thus far

Spring break

Mon., Mar. 12: ** No class **
Wed., Mar. 14: ** No class **

Module 6: Structured models (4 Lectures)

Material covered:
- HMMs
  - Forwards-backwards
  - Viterbi
  - Supervised learning
- Graphical models

Module 7: Unsupervised and semi-supervised learning (6 Lectures)

Material covered:
- K-means
- Expectation Maximization (EM)
  - Mixtures of Gaussians
  - For training Bayes nets
  - For training HMMs
- Combining labeled and unlabeled data
  - EM
  - Reweighting labeled data
  - Co-training
  - Unlabeled data and model selection
- Dimensionality reduction
- Feature selection

Mon., Apr. 2:
- Lecture: Bayes Nets - Structure Learning; Clustering - K-means & Gaussian Mixture Models [Slides] [Annotated]
- Readings: (Bishop 9.1, 9.2) K-means, Mixtures of Gaussians

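K-means, the first topic in Module 7, alternates the same two steps EM does for Gaussian mixtures, with hard assignments. A minimal 1-D sketch of Lloyd's algorithm (function name and initialization are illustrative, not from the course materials):

```python
def kmeans_1d(points, centers, iters=10):
    """Lloyd's algorithm for 1-D points: alternate assignment and mean update."""
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster
        # (keep a center in place if its cluster is empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Two well-separated groups; centers converge to the group means 0.5 and 9.5.
centers = kmeans_1d([0.0, 1.0, 9.0, 10.0], [0.5, 8.0])
```

Replacing the hard nearest-center assignment with posterior responsibilities, and the means with weighted means plus variances, gives the EM updates for the Gaussian mixture model in the Bishop 9.2 reading.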
Module 8: Learning to make decisions (3 Lectures)

Material covered:
- Markov decision processes
- Reinforcement learning

Module 9: Advanced topics (3 Lectures)

Material covered:
- Text data
- Hierarchical Bayesian models
- Tackling very large datasets
- Active learning
- Overview of follow-up classes

Project Poster Session

Fri., May 4: Newell-Simon Hall Atrium, 2:00-5:00pm

Project Paper

Thur., May 10: Project paper due

Final Exam: all material thus far

Tuesday, May 15th, 1-4 p.m.
Location: Baker Hall, Room A51