10-715 Fall 2020: Advanced Introduction to Machine Learning

This course is designed for Ph.D. students whose primary field of study is machine learning, or who intend to make machine learning methodological research a main focus of their thesis. It will give students a thorough grounding in the algorithms, mathematics, theories, and insights needed to do in-depth research and applications in machine learning. The topics of this course will in part parallel those covered in the general graduate machine learning course (10-701), but with a greater emphasis on depth in theory and algorithms. The course will also include additional recent topics such as fairness in machine learning.

IMPORTANT NOTE: Students entering the class are expected to have a pre-existing strong working knowledge of linear algebra (e.g., mathematical representation of subspaces, singular value decomposition), probability (e.g., multivariate Gaussians, Bayes' rule, conditional expectation), calculus (e.g., derivative of a vector with respect to another vector), and programming (we'll mostly be supporting Python). This class is best for you if machine learning is at the CORE of your studies/research and you want to understand the fundamentals. This class will be HEAVY and will move FAST. If machine learning is an auxiliary component of your studies/research, or if you do not have the required background, you may consider the general graduate Machine Learning course (10-701) or the Masters-level Machine Learning course (10-601). Click here for an ML course comparison.
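As a quick self-check of this background (this snippet is an illustration, not part of the course materials), students should find both computations below routine: reconstructing a matrix from its singular value decomposition, and applying Bayes' rule to a simple diagnostic-test scenario with made-up numbers.

```python
import numpy as np

# Self-check 1 (linear algebra): singular value decomposition.
# Reconstructing A from U, s, V^T should recover the original matrix.
A = np.array([[3.0, 1.0], [1.0, 3.0], [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)

# Self-check 2 (probability): Bayes' rule with hypothetical numbers.
# P(disease) = 0.01, P(+|disease) = 0.95, P(+|healthy) = 0.05.
prior, sensitivity, false_pos = 0.01, 0.95, 0.05
posterior = (sensitivity * prior) / (
    sensitivity * prior + false_pos * (1 - prior)
)
# P(disease|+) is only about 0.16: base rates matter.
assert abs(posterior - 0.1610) < 1e-3
```

If either step requires looking anything up beyond syntax, it may be worth reviewing the corresponding prerequisite material before the course begins.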

Waitlist: If you are on the waitlist and meet the aforementioned prerequisites, please send an email to the instructor and cc Diane Stidle (stidle@andrew.cmu.edu) outlining how you meet them (e.g., courses you have taken on these topics). Note that machine learning itself is NOT a prerequisite for this course. Also note that by departmental policy, this course is open only to graduate students (5th-year masters students are allowed). We expect that the waitlist will be cleared after the first week of classes, when everyone has had a chance to experience this class (and the other classes they are choosing between) and decide whether this class is the right fit for them.

Time and location: The class is scheduled for Monday and Wednesday, 11:40am to 1:00pm, over Zoom. Several Fridays, 11:40am to 1:00pm, will be used for recitations, also over Zoom. The Zoom links will be available on Diderot.

Units: 12

Instructor: Nihar B. Shah

Textbook: [SB] Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David (available online)

Course staff and contact details:
Nihar Shah: nihars at cs dot cmu dot edu      Office hours by appointment. To set up a meeting, please send Nihar an email with your availability, as well as the topics you would like to discuss (e.g., specific lectures or paper dissection choice).

Cristian Challu : cchallu at andrew.cmu.edu      Office hours details on Diderot
Kin Gutiérrez Olivares : kdgutier at andrew.cmu.edu      Office hours details on Diderot
Shenghao Wu : shenghaw at andrew.cmu.edu      Office hours details on Diderot

Syllabus and tentative schedule (subject to change): Note that most lectures will be taught on the "board", and hence there are no "slides".
Date   | Topic                                                                 | References/Remarks
PART I: DEPTH
Aug 31 | Logistics and introduction to ML                                      | SB chapter 2
Sep 2  | Perceptrons: Hope, hopelessness, and hope again                       | SB chapter 9
Sep 7  | No class (Labor Day)                                                  |
Sep 9  | Optimization for ML                                                   | Notes
Sep 11 | Recitation: Optimization                                              |
Sep 14 | Support vector machines                                               | SB chapter 15
Sep 16 | Kernel methods 1                                                      | SB chapter 16
Sep 18 | Recitation: Tail bounds                                               |
Sep 21 | Kernel methods 2                                                      | SB chapter 16
Sep 23 | Learning theory 1                                                     | SB chapters 2-5
Sep 25 | Recitation: Linear regression, logistic regression                    |
Sep 28 | Learning theory 2                                                     | SB chapters 2-6
Sep 30 | Learning theory 3                                                     | SB chapters 2-6
Oct 2  | Recitation: MLE and MAP                                               |
Oct 5  | Learning theory 4                                                     | SB chapters 6-7
Oct 7  | Midterm                                                               | All material in previous lectures
Oct 9  | Recitation: Rademacher complexity                                     |
PART II: BREADTH
Oct 12 | Neural networks 1: Introduction. Also, midterm discussion.            | SB chapter 20
Oct 14 | Neural networks 2: Representation power                               |
Oct 19 | Neural networks 3: Training, automatic differentiation, CNNs, ResNets, etc. |
Oct 21 | Theory paper dissection                                               |
Oct 26 | Model complexity, cross-validation, bias-variance tradeoff, interpolation regime, and neural networks 4 (neural architecture search) |
Oct 28 | Decision trees, random forests, bagging, bootstrapping                | SB chapter 18
Nov 2  | Unsupervised learning: Clustering                                     | SB chapter 22
Nov 4  | Dimensionality reduction                                              | SB chapter 23
Nov 9  | Boosting                                                              | SB chapter 10
Nov 11 | Online learning                                                       | SB chapter 21
Nov 16 | Semi-supervised learning, active learning, multi-armed bandits        | Transductive SVM, Active learning, Multi-armed bandits, Ranking via MABs
Nov 18 | Reinforcement learning                                                | Survey
Nov 23 | Graphical models                                                      | Graphical models
Nov 25 | No class (Thanksgiving break)                                         |
Nov 30 | Fairness                                                              | Hiring example, Paper 1, Paper 2, In peer review
Dec 2  | Interpretability, explainability                                      | Guest lecture by Hima Lakkaraju
Dec 7  | Causality                                                             |
Dec 9  | Applied paper dissection                                              |


Accommodations for Students with Disabilities: If you have a disability and have an accommodations letter from the Disability Resources office, I encourage you to discuss your accommodations and needs with me as early in the semester as possible. I will work with you to ensure that accommodations are provided as appropriate. If you suspect that you may have a disability and would benefit from accommodations but are not yet registered with the Office of Disability Resources, I encourage you to contact them at access@andrew.cmu.edu.

Finally, to all students: Please take care! This semester is unlike any other. We are all under a lot of stress and uncertainty at this time. Attending Zoom classes all day can take its toll on our mental health. Make sure to move regularly, eat well, and reach out to your support system or me if you need to. We can all benefit from support in times of stress, and this semester is no exception.