This page lists the readings for each lecture. The instructors will include comments and pointers to other resources that may help you get the most out of the readings.
Mon., Sep. 10:
- (Bishop - 2.1) This section gives many details on the Bayesian and maximum likelihood results for the binomial example Carlos covered today.
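- Example (ours, not part of the assigned reading): a minimal sketch of the two estimates for the binomial example; the flip counts and the Beta(2, 2) prior are made-up illustrations.

    # Binomial example: n coin flips, k heads (made-up numbers).
    n, k = 10, 7

    # Maximum likelihood estimate of the head probability.
    theta_mle = k / n                              # 0.7

    # Bayesian treatment with a conjugate Beta(a, b) prior:
    # the posterior is Beta(a + k, b + n - k).
    a, b = 2.0, 2.0                                # prior hyperparameters (an assumption)
    post_a, post_b = a + k, b + n - k              # Beta(9, 5)
    theta_post_mean = post_a / (post_a + post_b)   # 9/14, about 0.64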
Tue., Sep. 11:
Recitation 1 -- Probability Review
- (Bishop - 1.2) A good review of the probability concepts needed for this course
- We have not checked all of these articles for correctness, but we do recommend brushing up with the Wikipedia articles for these topics:
Wed., Sep. 12:
- (Bishop - 1.1 to 1.4) Introduces curve fitting, reviews probability theory, introduces Gaussians, and covers the famous "curse of dimensionality"
- (Bishop - 3.1, 3.1.1, 3.1.4, 3.1.5, 3.2, 3.3, 3.3.1, 3.3.2) Regression, linear basis function models, bias-variance decomposition, and Bayesian linear regression
- (Bishop - 1.5.5) Covers loss functions for regression and discusses minimizing expected loss
- Completely Optional: Joey's quickly written notes on the matrix MLE for regression. Corrections are welcome. [PDF] [Mathematica 6 Notebook]
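- Example (ours): a minimal numpy sketch of the matrix MLE for regression, i.e. the normal equations w = (Phi^T Phi)^{-1} Phi^T t; the toy data is made up, and np.linalg.lstsq would be the numerically safer way to solve the same system.

    import numpy as np

    # Made-up data: t = 2x + 1 plus noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=50)
    t = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)

    # Design matrix with a bias column.
    Phi = np.column_stack([np.ones_like(x), x])

    # Matrix MLE: w = (Phi^T Phi)^{-1} Phi^T t.
    w = np.linalg.solve(Phi.T @ Phi, Phi.T @ t)
    print(w)   # roughly [1.0, 2.0]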
Mon., Sep. 17:
- (Bishop - 1.5.0, 1.5.1, 1.5.4) Picking a decision boundary
- Mitchell's chapter (all sections) on Naive Bayes and Logistic Regression
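- Example (ours): a minimal Gaussian Naive Bayes sketch as a companion to the Mitchell chapter; the structure follows the chapter, but the function names and the smoothing constant are our choices.

    import numpy as np

    def gnb_fit(X, y):
        # Estimate a prior, per-feature means, and per-feature variances per class.
        params = {}
        for c in np.unique(y):
            Xc = X[y == c]
            params[c] = (len(Xc) / len(X),       # P(Y = c)
                         Xc.mean(axis=0),        # per-feature means
                         Xc.var(axis=0) + 1e-9)  # per-feature variances, smoothed
        return params

    def gnb_predict(params, x):
        # Choose the class maximizing log P(c) + sum_j log N(x_j | mu, var).
        def score(c):
            prior, mu, var = params[c]
            return np.log(prior) - 0.5 * np.sum(
                np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        return max(params, key=score)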
Wed., Sep. 19:
- (Bishop - 3.2) Bias-variance decomposition (simulated in the sketch after this list)
- (Bishop - 1.3) Discusses model selection using a test set
- Mitchell's chapter (all sections) on Naive Bayes and Logistic Regression
- Optional Reading: Ng and Jordan's NIPS 2001 paper on Discriminative versus Generative Learning [pdf] [ps]
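- Example (ours): the bias-variance decomposition is easy to see numerically. This sketch refits a polynomial on many resampled datasets and measures squared bias and variance at one test point; the sine target, noise level, and degrees are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(1)
    f = np.sin            # "true" function (arbitrary choice)
    x0 = 1.0              # test point

    def fit_and_predict(degree, n=25):
        # Fit a degree-d polynomial to one noisy dataset; predict at x0.
        x = rng.uniform(0.0, np.pi, n)
        t = f(x) + 0.2 * rng.standard_normal(n)
        return np.polyval(np.polyfit(x, t, degree), x0)

    for degree in (1, 9):
        preds = np.array([fit_and_predict(degree) for _ in range(500)])
        print(degree, (preds.mean() - f(x0)) ** 2, preds.var())
    # Low degree: larger bias, small variance; high degree: the reverse.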
Mon., Sep. 24:
- (Bishop - 4.0, 4.2, 4.3, 4.4, 4.5) Linear models for classification
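- Example (ours): for the discriminative side of the chapter (4.3.2), a minimal batch-gradient-descent sketch of logistic regression using the Phi^T (y - t) gradient derived there; the step size and iteration count are arbitrary.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def fit_logistic(Phi, t, lr=0.1, iters=2000):
        # Gradient descent on the cross-entropy error; t holds 0/1 labels.
        w = np.zeros(Phi.shape[1])
        for _ in range(iters):
            y = sigmoid(Phi @ w)              # predicted P(t = 1 | x)
            w -= lr * Phi.T @ (y - t) / len(t)
        return w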
Wed., Sep. 26:
- (Bishop - 1.6) Information Theory (see the entropy sketch after this list)
- (Bishop - 14.4) Tree-based Models
- Recommended Reading: Quantities of Information Wikipedia entry
- Recommended Reading: Nils Nilsson's chapter on Decision Trees (all sections)
- Optional Review of Boolean Logic/DNF: Nils Nilsson's chapter on Boolean Functions (first 4 pages)
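- Example (ours): a small self-contained sketch of entropy and information gain, the quantities these readings build on; the toy split at the end is made up.

    import math
    from collections import Counter

    def entropy(labels):
        # H(Y) = -sum_c p_c log2 p_c, in bits.
        n = len(labels)
        return -sum((k / n) * math.log2(k / n) for k in Counter(labels).values())

    def information_gain(labels, groups):
        # Gain of a split = H(parent) - weighted average of child entropies.
        n = len(labels)
        return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

    # A perfectly separating split on balanced labels gains one full bit.
    print(information_gain([1, 1, 0, 0], [[1, 1], [0, 0]]))   # 1.0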
Mon., Oct. 1:
- (Bishop - 14.3) Boosting (see the AdaBoost sketch after this list)
- Schapire's Boosting Tutorial
- Optional Reading: Multi-class AdaBoost paper by Zhu, Rosset, Zou, and Hastie.
- Additional Resource: Schapire Boosting Tutorial Video.
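- Example (ours): a compact AdaBoost sketch matching the weighting scheme in Schapire's tutorial. Labels are in {-1, +1}, and the weak learners are stand-ins: any functions mapping X to {-1, +1} predictions (decision stumps would be the usual choice).

    import numpy as np

    def adaboost(X, y, weak_learners, rounds=10):
        n = len(y)
        w = np.full(n, 1.0 / n)                    # example weights
        ensemble = []
        for _ in range(rounds):
            # Greedily pick the weak hypothesis with least weighted error.
            eps, h = min(((np.sum(w[h(X) != y]), h) for h in weak_learners),
                         key=lambda pair: pair[0])
            if eps >= 0.5:
                break                              # no better than chance
            eps = max(eps, 1e-12)                  # guard the log below
            alpha = 0.5 * np.log((1 - eps) / eps)
            w *= np.exp(-alpha * y * h(X))         # up-weight the mistakes
            w /= w.sum()
            ensemble.append((alpha, h))
        return lambda X: np.sign(sum(a * h(X) for a, h in ensemble))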
Wed., Oct. 3:
- (Bishop 1.3) Model Selection / Cross Validation
- (Bishop 3.1.4) Regularized least squares (the two are combined in the sketch after this list)
- (Bishop 5.1) Feed-forward Network Functions
- Optional Reading: Ron Kohavi's paper, A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection.
- Additional Resource: Minimum Description Length website
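- Example (ours): the 1.3 and 3.1.4 readings combine naturally: choose the ridge penalty by cross-validation. A sketch using the closed-form regularized solution from 3.1.4; the fold count and candidate lambdas would be up to you.

    import numpy as np

    def ridge_fit(Phi, t, lam):
        # Regularized least squares: w = (lam*I + Phi^T Phi)^{-1} Phi^T t.
        d = Phi.shape[1]
        return np.linalg.solve(lam * np.eye(d) + Phi.T @ Phi, Phi.T @ t)

    def cv_error(Phi, t, lam, k=5):
        # Mean squared validation error, averaged over k folds.
        folds = np.array_split(np.arange(len(t)), k)
        errs = []
        for fold in folds:
            train = np.setdiff1d(np.arange(len(t)), fold)
            w = ridge_fit(Phi[train], t[train], lam)
            errs.append(np.mean((Phi[fold] @ w - t[fold]) ** 2))
        return np.mean(errs)

    # Model selection: best = min(lambdas, key=lambda l: cv_error(Phi, t, l))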
Mon., Oct. 8:
- (Bishop 5.1) Feed-forward Network Functions
- (Bishop 5.2) Network Training
- (Bishop 5.3) Error Backpropagation
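- Example (ours): a minimal numpy rendering of 5.3 for one hidden layer of tanh units with a linear output and squared error; the weight shapes and the gradient-step usage below are our conventions, not the book's.

    import numpy as np

    def forward_backward(x, t, W1, W2):
        # Forward pass.
        a = W1 @ x                      # hidden pre-activations
        z = np.tanh(a)                  # hidden activations
        y = W2 @ z                      # linear outputs
        # Backward pass: propagate the output error through the tanh layer.
        delta2 = y - t                  # output error for squared loss
        delta1 = (1.0 - z ** 2) * (W2.T @ delta2)
        return np.outer(delta2, z), np.outer(delta1, x)   # dE/dW2, dE/dW1

    # One gradient step (learning rate is arbitrary):
    #   g2, g1 = forward_backward(x, t, W1, W2)
    #   W2 -= 0.01 * g2; W1 -= 0.01 * g1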
Wed., Oct. 10:
- (Bishop 2.5) Nonparametric Methods
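- Example (ours): a one-function sketch of the Parzen/kernel density estimator from 2.5, for 1-D data with a Gaussian kernel; the bandwidth h is the free choice the section discusses.

    import numpy as np

    def parzen_density(x, data, h):
        # p(x) = (1/N) sum_n N(x | x_n, h^2) for 1-D samples in `data`.
        return np.mean(np.exp(-(x - data) ** 2 / (2.0 * h ** 2))
                       / np.sqrt(2.0 * np.pi * h ** 2))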
Mon., Oct. 15:
- (Bishop 6.1, 6.2) Kernels (see the kernel-trick sketch after this list)
- (Bishop 7.1) Maximum Margin Classifiers
- Hearst 1998: a high-level presentation of SVMs
- Burges 1998: a detailed SVM tutorial
- Optional Reading: Platt 1998: Training SVMs with Sequential Minimal Optimization
- Additional Resource: Smola video tutorial on SVM (see Part 3)
- Additional Resource: Scholkopf video tutorial on kernels
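- Example (ours): the kernel trick in one check: k(x, z) = (x . z)^2 equals an ordinary dot product in a fixed feature space, which is why the SVM's decision function sum_i alpha_i y_i k(x_i, x) + b never needs the feature map explicitly. A sketch for 2-D inputs.

    import numpy as np

    def phi(x):
        # Explicit feature map for the quadratic kernel on 2-D inputs.
        return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

    x, z = np.array([1.0, 2.0]), np.array([3.0, 4.0])
    print((x @ z) ** 2, phi(x) @ phi(z))   # both print 121.0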
Wed., Oct. 17:
Mon., Oct. 22:
- (Mitchell Chapter 7) Computational Learning Theory (a worked sample-complexity bound follows this list)
- (Optional) John Langford's tutorial on generalization bounds
- Additional Resource: John Shawe-Taylor video tutorial on statistical learning theory
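- Example (ours): a worked instance of the bound in Mitchell's chapter for a consistent learner over a finite hypothesis class H; the numbers plugged in are arbitrary.

    import math

    def sample_complexity(h_size, eps, delta):
        # m >= (1/eps) * (ln|H| + ln(1/delta)) examples suffice for a consistent
        # learner to have true error <= eps with probability >= 1 - delta.
        return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / eps)

    print(sample_complexity(2 ** 10, eps=0.1, delta=0.05))   # 100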
Wed., Oct. 24:
Mon., Oct. 29:
- (Bishop 8.1, 8.2) Bayesian Networks, Conditional Independence
Wed., Oct. 31:
- (Bishop 8.4.1, 8.4.2) Inference in Chain/Tree structures
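- Example (ours): on a chain, inference reduces to passing messages node to node, so each marginal costs a few matrix-vector products instead of a sum over the full joint table. A sketch; the two-state numbers are made up.

    import numpy as np

    def chain_marginal(prior, transitions):
        # P(X_n) in a chain X_1 -> ... -> X_n via forward messages:
        # each T has T[i, j] = P(X_{m+1} = j | X_m = i).
        p = prior
        for T in transitions:
            p = p @ T
        return p

    print(chain_marginal(np.array([0.6, 0.4]),
                         [np.array([[0.9, 0.1],
                                    [0.3, 0.7]])]))   # [0.66, 0.34]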
Mon., Nov. 5:
Wed., Nov. 7:
- Additional Reading: Heckerman BN Learning Tutorial (see the counting sketch after this list)
- Additional Reading: Tree-Augmented Naive Bayes paper
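- Example (ours): in the fully observed case, learning a Bayes net's parameters reduces to counting. A minimal CPT-estimation sketch with a Dirichlet-style pseudocount, in the spirit of Heckerman's tutorial; the function shape and names are our own.

    import numpy as np

    def learn_cpt(parent_vals, child_vals, k_parent, k_child, alpha=1.0):
        # Estimate P(child | parent) from co-occurrence counts; alpha = 0
        # gives the ML estimate, alpha > 0 a smoothed MAP-style estimate.
        counts = np.full((k_parent, k_child), alpha)
        for p, c in zip(parent_vals, child_vals):
            counts[p, c] += 1
        return counts / counts.sum(axis=1, keepdims=True)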
Mon., Nov. 12:
- (Bishop 9.1, 9.2) K-means, Mixtures of Gaussians
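- Example (ours): a compact K-means (Lloyd's algorithm) sketch matching 9.1; initializing from randomly sampled data points is one common choice, not the book's prescription.

    import numpy as np

    def kmeans(X, k, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # Assignment step: nearest center for every point.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each center moves to the mean of its points.
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        return centers, labels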
Wed., Nov. 14:
- Neal and Hinton's EM paper
- (Bishop 9.3, 9.4) EM
- Ghahramani, "An introduction to HMMs and Bayesian Networks"
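- Example (ours): a minimal EM loop for a two-component 1-D Gaussian mixture (9.2-9.3). The initialization is a crude heuristic of ours; Neal and Hinton's point is that the E- and M-steps both increase one shared free-energy objective.

    import numpy as np

    def gauss(x, m, v):
        return np.exp(-(x - m) ** 2 / (2.0 * v)) / np.sqrt(2.0 * np.pi * v)

    def em_gmm(x, iters=50):
        # Crude initialization (an assumption, not from the readings).
        pi = 0.5
        mu = np.array([x.min(), x.max()])
        var = np.array([x.var(), x.var()])
        for _ in range(iters):
            # E-step: responsibility of component 0 for each point.
            p0 = pi * gauss(x, mu[0], var[0])
            p1 = (1.0 - pi) * gauss(x, mu[1], var[1])
            r = p0 / (p0 + p1)
            # M-step: re-estimate mixing weight, means, and variances.
            pi = r.mean()
            mu = np.array([(r * x).sum() / r.sum(),
                           ((1 - r) * x).sum() / (1 - r).sum()])
            var = np.array([(r * (x - mu[0]) ** 2).sum() / r.sum(),
                            ((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()])
        return pi, mu, var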
Mon., Nov. 19:
- Xiaojin Zhu's survey of semi-supervised learning (see the self-training sketch after this list)
- Optional: Blum and Mitchell co-training paper
- Optional reading: Joachims Transductive SVMs paper
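- Example (ours): self-training is the simplest scheme in Zhu's survey: fit on the labeled pool, then absorb the unlabeled points the model is most confident about. A sketch using scikit-learn's LogisticRegression purely as a stand-in base classifier; the confidence threshold is arbitrary.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def self_train(X_lab, y_lab, X_unl, threshold=0.95, rounds=10):
        clf = LogisticRegression().fit(X_lab, y_lab)
        for _ in range(rounds):
            if len(X_unl) == 0:
                break
            proba = clf.predict_proba(X_unl)
            confident = proba.max(axis=1) >= threshold
            if not confident.any():
                break
            # Move confidently pseudo-labeled points into the labeled pool.
            X_lab = np.vstack([X_lab, X_unl[confident]])
            y_lab = np.concatenate(
                [y_lab, clf.classes_[proba.argmax(axis=1)[confident]]])
            X_unl = X_unl[~confident]
            clf = LogisticRegression().fit(X_lab, y_lab)
        return clf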
Wed., Nov. 21:
- NO CLASS: Thanksgiving
Mon., Nov. 26:
Wed., Nov. 28:
Mon., Dec. 3: