10-715 Fall 2023: Advanced Introduction to Machine Learning

This course is designed for Ph.D. students whose primary field of study is machine learning, or who intend to make machine learning methodological research a main focus of their thesis. It will give students a thorough grounding in the algorithms, mathematics, theories, and insights needed to do in-depth research and applications in machine learning. The topics of this course will in part parallel those covered in the general graduate machine learning course (10-701), but with a greater emphasis on depth in theory.

IMPORTANT NOTE: Students entering the class are expected to have a pre-existing strong working knowledge of linear algebra (e.g., mathematical representation of subspaces, singular value decomposition), probability (e.g., multivariate Gaussians, Bayes' rule, conditional expectation), calculus (e.g., derivative of a vector with respect to another vector), and programming (we'll mostly be supporting Python). This class is best for you if you have machine learning at the CORE of your studies/research and want to understand the fundamentals. This class will be HEAVY and will move FAST. If machine learning is an auxiliary component of your studies/research, or if you do not have the required background, consider the general graduate Machine Learning course (10-701) or the Masters-level Machine Learning course (10-601). Click here for an ML course comparison. Note that machine learning itself is NOT a prerequisite for this course.
Also note that by departmental policy, this course is open only to graduate students (fifth-year masters students are allowed).
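As a rough self-check of the prerequisites above, you should find each of the following routine. This is an illustrative sketch (the matrices and probabilities are made up, not course material): an SVD reconstruction, a Bayes' rule computation, and a finite-difference check that the derivative of a linear map x ↦ Ax is A itself.

```python
import numpy as np

# Linear algebra: singular value decomposition of a small (made-up) matrix.
A = np.array([[3.0, 0.0], [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)
# Reconstructing A from its SVD recovers the original matrix.
assert np.allclose(U @ np.diag(s) @ Vt, A)

# Probability: Bayes' rule with illustrative numbers.
# P(D|+) = P(+|D) P(D) / [P(+|D) P(D) + P(+|not D) P(not D)]
p_d, p_pos_given_d, p_pos_given_not_d = 0.01, 0.95, 0.05
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos

# Calculus: the Jacobian of f(x) = A x with respect to x is A;
# verify with central finite differences.
def f(x):
    return A @ x

x0 = np.array([1.0, 2.0])
eps = 1e-6
jac = np.column_stack([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
                       for e in np.eye(2)])
assert np.allclose(jac, A, atol=1e-4)
```

If any step here feels unfamiliar, 10-701 or 10-601 may be a better fit.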

Waitlist: The waitlist will be processed near the end of summer.

Time and location: The class is scheduled for Monday and Wednesday, 2pm to 3:20pm, in POS 152. We will reserve Fridays for recitations, which will be held 2pm to 3:20pm in the same location.

Units: 12

Instructor: Nihar B. Shah

Grading and other logistics: Will be discussed in the first lecture.

Textbook: [SB] Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David (available online)

Last year's videos: The course content is similar to the offering in previous years. You can find previous years' videos here.

Syllabus and lecture schedule. Note that most lectures will be taught on the "board", and hence there are no "slides".
Date | Topic | References/Remarks

PART I: DEPTH
Aug 28 | Logistics and introduction to ML | SB chapter 2
Aug 30 | Perceptrons: Hope, hopelessness, and hope again | SB chapter 9
Sep 1 | Optimization for ML [Note: this is a regular lecture on Friday to make up for Sep 20] | Notes
Sep 6 | Support vector machines | SB chapter 15
Sep 8 | Recitation: Optimization
Sep 11 | Kernel methods 1 | SB chapter 16
Sep 13 | Kernel methods 2 | SB chapter 16
Sep 15 | Recitation: Tail bounds
Sep 18 | Learning theory 1 | SB chapters 2-5
Sep 20 | No class [Make-up class was on Sep 1]
Sep 22 | Recitation: Linear regression, logistic regression
Sep 25 | Learning theory 2 | SB chapters 2-6
Sep 27 | Learning theory 3 | SB chapters 2-6
Sep 29 | Recitation: MLE and MAP
Oct 2 | Learning theory 4 | SB chapters 6-7
Oct 4 | Midterm | All material in previous lectures

PART II: BREADTH
Oct 9 | Neural networks 1: Introduction. Also, midterm discussion. | SB chapter 20
Oct 11 | Neural networks 2: Representation power
Oct 23 | Neural networks 3: Training, automatic differentiation, CNNs, etc.
Oct 25 | Theory paper dissection
Oct 30 | Model complexity, cross-validation, bias-variance tradeoff, interpolation regime, and Neural networks 4 (neural architecture search)
Nov 1 | (Large) language models
Nov 6 | Unsupervised learning: Clustering, dimensionality reduction, diffusion models | SB chapters 22, 23
Nov 8 | Decision trees, random forests, bagging, bootstrapping | SB chapter 18
Nov 13 | Online learning | SB chapter 21
Nov 15 | Semi-supervised learning, Active learning, Multi-armed bandits | Transductive SVM, Active learning, Multi-armed bandits, Ranking via MABs
Nov 20 | Reinforcement learning 1 | Survey
Nov 27 | Reinforcement learning 2 and RL from Human Feedback (RLHF)
Nov 29 | Applied paper dissection
Dec 4 | Graphical models, Causality, Fairness, Interpretability, Alignment | Graphical models, Hiring example, Paper 1, Paper 2
Dec 6 | Final exam | All course material


Accommodations for Students with Disabilities: If you have a disability and have an accommodations letter from the Disability Resources office, I encourage you to discuss your accommodations and needs with me as early in the semester as possible. I will work with you to ensure that accommodations are provided as appropriate. If you suspect that you may have a disability and would benefit from accommodations but are not yet registered with the Office of Disability Resources, I encourage you to contact them at access@andrew.cmu.edu.