Course Syllabus

This course is open to graduate students, as well as advanced undergraduates with sufficient mathematical maturity. Undergraduate registration requires approval: please submit your consent request online and email Nati (nati@ttic.edu) to secure approval. This course is also offered as part of the IDEAL Special Quarter on Theory of Deep Learning (https://www.ideal.northwestern.edu/special-quarters/fall-2020/). If you are not a UChicago or TTIC student, you may request registration by registering for the special quarter; this includes Northwestern students, TTIC, UChicago, and Northwestern post-docs and other scholars, and students and scholars from other institutions.


If you are a University of Chicago or TTIC student and would like to register for the class but cannot because the class is full, please email us at ttic31120-staff@ttic.edu to be (temporarily) added to Canvas manually and gain access to the class. We expect registration spots to open up.


Note: This course is not intended as an introductory machine learning course, and is recommended mostly for students familiar with machine learning problems and methods. Although all formulations and methods are carefully introduced and defined, very few examples of empirical behavior are explored, as students will be assumed to have seen these previously. However, mathematically inclined students who prefer learning about concepts through definitions and theorems rather than examples and experimentation, and who are interested in exploring the mathematical aspects of machine learning, might choose to take this course even without having previously completed an introductory machine learning course.


Lectures: Tuesdays and Thursdays, 2:40-4:00pm (Zoom)

Instructor: Nati Srebro (nati@ttic.edu).

TA: Omar Montasser (omar@ttic.edu)

Office hours (TA): Thursdays 4-5pm.

Office hours for Nati: TBA.

Contact: Please use the alias ttic31120-staff@ttic.edu for queries about registration, special arrangements, extensions, conflicts, and other individual circumstances. For questions about class material or homework, please use the Canvas forum.


Course Description

The purpose of this course is to gain a deeper understanding of machine learning by formalizing learning mathematically, studying both statistical and computational aspects of learning, and understanding how these two aspects are inseparable. The course is intended both for students who use machine learning methods and would like to understand them better so as to use them more effectively, and for students interested in the mathematical aspects of learning or who intend to rigorously study or develop learning algorithms.

We will discuss classic results and recent advances in statistical learning theory, touch on computational learning theory, and explore the relationship with stochastic optimization and online regret analysis. Our emphasis will be on concept development and on obtaining a rigorous quantitative understanding of machine learning. We will also study techniques for analyzing and proving performance guarantees for learning methods.

Past incarnations: 2018, 2016.


Prerequisites

  • Mathematical maturity, as obtained, e.g., in a rigorous analysis course
  • Discrete Math (specifically combinatorics and asymptotic notation)
  • Probability Theory
  • Introduction to Machine Learning
  • Algorithms; Basic Complexity Theory (NP-Hardness)

Familiarity with Convex Optimization and Computational Complexity, as well as background in Statistics, can be helpful but is not required.


Specific Topics

  • The Statistical Model (Learning Based on IID Samples):

    • The PAC (Probably Approximately Correct) and Agnostic PAC models.
    • Generalized Learning and relationship to Stochastic Optimization
    • VC Theory: Cardinality, VC dimension, and the growth function
    • Description Length Bounds, Structural Risk Minimization
    • Uniform Learning and No-Free-Lunch Theorems
    • PAC-Bayes
    • Compression Bounds
    • Scale-sensitive classes: Fat Shattering Dimension, Covering Numbers, and Rademacher Averages
    • Tight Characterization of Learning in terms of the VC and Fat Shattering Dimensions
    • Relative bounds, "Optimistic" rates, and Local Rademacher Analysis
    • Boosting, including three different views of AdaBoost (traditional PAC view; learning a linear predictor with coordinate descent; boosting the margin and l1 as its dual)
  • Online Learning, Online Optimization and Online Regret:

    • The Perceptron Rule and Online Gradient Descent (a minimal Perceptron sketch appears after this topic list)
    • Experts and the Winnow Rule
    • Strong convexity, stability, and online learning in Banach spaces as a unifying theory.
    • Bregman Divergence and Online Mirror Descent
    • Online to Batch Conversion
  • Computational Lower Bounds:

    • Computational Hardness of Proper Learning
    • Cryptographic Hardness of Learning
  • Deep Learning through the lens of Learning Theory
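
To give a flavor of the online-learning topics above, here is a minimal sketch of the Perceptron rule in Python. This is not part of the official course materials; the synthetic data and all names in it are illustrative assumptions.

    import numpy as np

    def perceptron(examples):
        # Classic Perceptron rule: predict with sign(<w, x>) and, on each
        # mistake, update w <- w + y*x.  On any sequence that is linearly
        # separable with margin gamma by a unit-norm predictor, with
        # instances of norm at most R, it makes at most (R/gamma)^2 mistakes.
        w = None
        mistakes = 0
        for x, y in examples:              # y is a label in {-1, +1}
            if w is None:
                w = np.zeros_like(x, dtype=float)
            if y * np.dot(w, x) <= 0:      # mistake (or zero margin)
                w = w + y * x              # the Perceptron update
                mistakes += 1
        return w, mistakes

    # Illustrative linearly separable data (an assumption, not course data):
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X @ np.ones(5))
    w, m = perceptron(zip(X, y))
    print("mistakes:", m)

The mistake bound quoted in the comment is the classical Novikoff guarantee; the course's online-learning unit develops bounds of this type.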

Requirements and Grading

  • Problem sets. About 5 problem sets, some of which introduce additional material not covered in class.
    • Each of the 5 homeworks counts towards 5% of the grade, if its score is higher than the final exam score.
    • Collaboration on the homeworks is encouraged, but each student must write up the solutions independently.
    • Although the grading structure allows missing a homework, students should consider the homeworks a required component of the course: some of the course material will be taught through the homeworks, and missing homeworks will likely result in students not succeeding on the exam.
  • Final Exam.
  • Scribing or editing a scribed lecture or topic is required, will be graded, and counts for 30% of the grade.
  • Optional challenge problems on some problem sets.
    • Each challenge problem can be used towards 5-70% of the grade, replacing the final exam (the number of points is indicated on the problem; partial solutions might result in fewer percentage points).
    • Doing at least one such problem is required for an A or A+.
    • Students may accumulate up to 70 points from the homeworks and challenge problems. If the sum of the percentage points from the homeworks and challenge problems is 70% or higher, students need not take the final (a sketch of this computation follows this list).
    • Collaboration on challenge problems is allowed in small groups working on the problem together, and such collaboration should be reported in the submission.
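
For concreteness, here is an unofficial sketch of how the components above might combine into a final grade. The exact way the final exam fills in for homework and challenge points is our reading of the rules, so treat the function below and its inputs as illustrative assumptions, not official policy.

    def course_grade(hw_challenge_points, scribe_score, final_score=0.0):
        # hw_challenge_points: percentage points earned from the homeworks
        #   (up to 5 each) and challenge problems, capped at 70 per the rules
        # scribe_score: scribing grade in [0, 100]; scribing is 30% overall
        # final_score: final exam grade in [0, 100]; under our reading it
        #   covers whatever part of the 70-point pool is not already earned
        pool = min(hw_challenge_points, 70.0)          # the 70-point cap
        pool += (70.0 - pool) * final_score / 100.0    # final fills the rest
        return pool + 30.0 * scribe_score / 100.0

    # A student with all 25 homework points and a 45-point challenge problem
    # reaches the 70-point cap and need not take the final:
    print(course_grade(25 + 45, scribe_score=90))      # 70 + 27 = 97.0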


Scribes

Scribe Sign-up: here.


Recommended Texts

We will not be following a specific text, but some of the material is covered in:
