10-715 Fall 2022: Advanced Introduction to Machine Learning

This course is designed for Ph.D. students whose primary field of study is machine learning, or who intend to make machine learning methodological research a main focus of their thesis. It will give students a thorough grounding in the algorithms, mathematics, theories, and insights needed to conduct in-depth research and build applications in machine learning. The topics of this course will in part parallel those covered in the general graduate machine learning course (10-701), but with a greater emphasis on depth in theory and algorithms. The course will also include additional recent topics such as fairness in machine learning.

IMPORTANT NOTE: Students entering the class are expected to have a pre-existing strong working knowledge of linear algebra (e.g., mathematical representation of subspaces, singular value decomposition), probability (e.g., multivariate Gaussians, Bayes' rule, conditional expectation), calculus (e.g., derivative of a vector with respect to another vector), and programming (we'll mostly be supporting Python). This class is best for you if you have machine learning at the CORE of your studies/research and want to understand the fundamentals. This class will be HEAVY and will move FAST. If machine learning is an auxiliary component of your studies/research, or if you do not have the required background, you may consider the general graduate Machine Learning course (10-701) or the Masters-level Machine Learning course (10-601). Click here for an ML course comparison. Note that machine learning itself is NOT a prerequisite for this course.
Also note that by departmental policy, this course is open only to graduate students (5th-year masters students are allowed).
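As a rough self-check of the expected background (an illustrative sketch, not official course material), the snippet below exercises the three prerequisite areas named above using NumPy: reconstructing a matrix from its SVD, applying Bayes' rule numerically, and verifying a vector-by-vector derivative (a Jacobian) by finite differences. If any of it feels unfamiliar, 10-701 or 10-601 may be a better fit.

```python
# Hypothetical prerequisite self-check; all names and numbers here are
# illustrative, not drawn from course materials.
import numpy as np

rng = np.random.default_rng(0)

# Linear algebra: the SVD factors A into U diag(s) V^T.
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)  # factorization reconstructs A

# Probability: Bayes' rule, P(D|+) = P(+|D)P(D) / P(+).
p_d, p_pos_d, p_pos_nd = 0.01, 0.95, 0.05  # made-up disease-test numbers
p_d_pos = p_pos_d * p_d / (p_pos_d * p_d + p_pos_nd * (1 - p_d))
print(round(p_d_pos, 3))  # -> 0.161: far below the 0.95 test accuracy

# Calculus: the Jacobian of f(x) = W x with respect to x is W itself.
W = rng.standard_normal((2, 3))
x = rng.standard_normal(3)
eps = 1e-6
# Finite-difference estimate of the Jacobian, column by column.
J = np.array([(W @ (x + eps * e) - W @ x) / eps for e in np.eye(3)]).T
assert np.allclose(J, W, atol=1e-4)
```

If the SVD reconstruction, the base-rate effect in the Bayes calculation, and the Jacobian check all read as routine, the mathematical prerequisites are likely in place.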

Waitlist: The waitlist will be processed near the end of summer.

Time and location: The class is scheduled for Monday and Wednesday, 10.10am to 11.30am, at a venue TBD. We will reserve Fridays for recitations, which will be held 10.10am to 11.30am in the same location.

Units: 12

Instructor: Nihar B. Shah

Textbook: [SB] Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David (available online)

Last year's videos: The course content is similar to the offering in previous years. You can find previous years' videos here.

Course staff and contact details:
Nihar Shah: nihars at cs.cmu.edu      Office hours by appointment. To set up a meeting, please send Nihar an email with your availability, as well as the topics you would like to discuss (e.g., specific lectures or paper dissection choice).

Here is the syllabus from the 2021 edition. The content will remain the same in 2022; only the dates will change. Note that most lectures will be taught on the "board", and hence there are no "slides".
Date   | Topic                                                          | References/Remarks

PART I: DEPTH
Aug 30 | Logistics and introduction to ML                               | SB Chapter 2
Sep 1  | Perceptrons: Hope, hopelessness, and hope again                | SB Chapter 9
Sep 6  | No class (Labor Day)                                           |
Sep 8  | Optimization for ML                                            | Notes
Sep 10 | Recitation: Optimization                                       |
Sep 13 | Support vector machines                                        | SB Chapter 15
Sep 15 | Kernel methods 1                                               | SB Chapter 16
Sep 17 | Recitation: Tail bounds                                        |
Sep 20 | Kernel methods 2                                               | SB Chapter 16
Sep 22 | Learning theory 1                                              | SB Chapters 2-5
Sep 24 | Recitation: Linear regression, logistic regression             |
Sep 27 | Recitation: MLE and MAP [Note: this is a recitation on a Monday since Nihar is giving a tutorial at the same time] |
Sep 29 | Learning theory 2                                              | SB Chapters 2-6
Oct 1  | Learning theory 3 [Note: this is a regular lecture on a Friday to make up for Monday] | SB Chapters 2-6
Oct 4  | Learning theory 4                                              | SB Chapters 6-7
Oct 6  | Midterm                                                        | All material in previous lectures
Oct 8  | Recitation: Rademacher complexity                              |

PART II: BREADTH
Oct 11 | Neural networks 1: Introduction; also, midterm discussion      | SB Chapter 20
Oct 13 | Neural networks 2: Representation power                        |
Oct 18 | Neural networks 3: Training, automatic differentiation, CNNs, ResNet, etc. |
Oct 20 | Theory paper dissection                                        |
Oct 25 | Model complexity, cross-validation, bias-variance tradeoff, interpolation regime, and neural networks 4 (neural architecture search) |
Oct 27 | Decision trees, random forests, bagging, bootstrapping         | SB Chapter 18
Nov 1  | Unsupervised learning: Clustering                              | SB Chapter 22
Nov 3  | Dimensionality reduction                                       | SB Chapter 23
Nov 8  | Boosting                                                       | SB Chapter 10
Nov 10 | Online learning                                                | SB Chapter 21
Nov 15 | Semi-supervised learning, active learning, multi-armed bandits | Transductive SVM, Active learning, Multi-armed bandits, Ranking via MABs
Nov 17 | Reinforcement learning                                         | Survey
Nov 22 | Graphical models, causality                                    | Graphical models
Nov 24 | No class (Thanksgiving break)                                  |
Nov 29 | Fairness, interpretability, explainability                     | Hiring example, Paper 1, Paper 2
Dec 1  | Applied paper dissection                                       |


Accommodations for Students with Disabilities: If you have a disability and have an accommodations letter from the Disability Resources office, I encourage you to discuss your accommodations and needs with me as early in the semester as possible. I will work with you to ensure that accommodations are provided as appropriate. If you suspect that you may have a disability and would benefit from accommodations but are not yet registered with the Office of Disability Resources, I encourage you to contact them at access@andrew.cmu.edu.

Finally, to all students: Please take care! The COVID pandemic can lead to stress and uncertainty and can take its toll on our mental health. Make sure to move regularly, eat well, and reach out to your support system or me if you need to. We can all benefit from support in times of stress, and this semester is no exception.