Course Syllabus

This is the Canvas web page for the Fall 2018 course "Computational and Statistical Learning Theory", taught at TTIC. This course is open to graduate students, as well as advanced undergraduates with sufficient mathematical maturity. Undergraduate registration requires approval; please submit your consent request online and contact Nati (nati@ttic.edu) via email to secure approval.

 

Note: This course is not intended as an introductory machine learning course, and is recommended mostly for students already familiar with machine learning problems and methods. Although all formulations and methods are carefully introduced and defined, very few examples of empirical behavior are explored, as students are assumed to have seen these previously. However, mathematically inclined students who prefer learning about concepts through definitions and theorems rather than through examples and experimentation, and who are interested in exploring the mathematical rather than the practical aspects of machine learning, may choose to take this course even without having previously completed an introductory machine learning course.

 

 

Lectures: Monday and Wednesday - 1:30-2:50pm (TTIC Room 526)

Instructor: Nati Srebro (nati@ttic.edu).

TAs: Rachit Nimavat (nimavat@ttic.edu) and Blake Woodworth (blake@ttic.edu)

Office hours: Tuesday 1-2pm (Blake) and Wednesday 3-4pm (Rachit) in TTI Library on 4th floor.

Office hours for Nati: Monday and Wednesday (1-1:30pm) in his office on the 4th floor.

Contact: Please use the alias ttic31120-staff@ttic.edu for queries about extensions, conflicts, special arrangements, registration, etc. If you have any specific question about class material or homework please use the Canvas forum.

 

Course Description

The purpose of this course is to gain a deeper understanding of machine learning by formalizing learning mathematically, studying both statistical and computational aspects of learning, and understanding how these two aspects are inseparable. The course is intended both for students who use machine learning methods and would like to understand them better so as to use them more effectively, and for students interested in the mathematical aspects of learning or who intend to rigorously study or develop learning algorithms.

We will discuss classic results and recent advances in statistical learning theory (mostly under the agnostic PAC model), touch on computational learning theory, and also explore the relationship with stochastic optimization and online regret analysis. Our emphasis will be on concept development and on obtaining a rigorous quantitative understanding of machine learning. We will also study techniques for analyzing and proving performance guarantees for learning methods.
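As an informal preview of the kind of quantitative guarantee studied in the course (this is the standard cardinality-based agnostic PAC bound for a finite hypothesis class, stated here only as a sketch; all notation will be introduced carefully in class): for a finite hypothesis class $\mathcal{H}$, a loss bounded in $[0,1]$, and an i.i.d. sample $S$ of size $m$, with probability at least $1-\delta$ over the draw of $S$,
$$\sup_{h \in \mathcal{H}} \left| L_{\mathcal{D}}(h) - L_S(h) \right| \;\le\; \sqrt{\frac{\log\left(2|\mathcal{H}|/\delta\right)}{2m}},$$
where $L_{\mathcal{D}}(h)$ denotes the population error and $L_S(h)$ the empirical error on $S$. The bound follows from Hoeffding's inequality combined with a union bound over $\mathcal{H}$.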

Past incarnation: 2016.

 

Prerequisites

  • Mathematical maturity, as obtained, e.g., in a rigorous analysis course.
  • Discrete Math (specifically combinatorics and asymptotic notation)
  • Probability Theory
  • Introduction to Machine Learning
  • Algorithms; Basic Complexity Theory (NP-Hardness)

Familiarity with Convex Optimization and Computational Complexity, as well as background in Statistics, can be helpful but is not required.

 

Specific Topics

  • The Statistical Model (Learning Based on an IID Sample):

    • The PAC (Probably Approximately Correct) and Agnostic PAC models.
    • Stochastic Optimization
    • Cardinality Bounds
    • Description Length Bounds
    • PAC-Bayes
    • Compression Bounds
    • The Growth Function and VC Dimension
    • VC Subgraph Dimension and Fat Shattering Dimension
    • Tight Characterization of Learning in terms of the VC and Fat Shattering Dimensions
    • Covering Numbers
    • Rademacher Averages, including Local Rademacher Analysis
  • Uniform Learning and No-Free-Lunch Theorems

  • Online Learning, Online Optimization and Online Regret:

    • The Perceptron Rule and Online Gradient Descent
    • Experts and the Winnow Rule
    • Bregman Divergence and Online Mirror Descent
    • Online to Batch Conversion
  • Computational Lower Bounds:

    • Computational Hardness of Proper Learning
    • Cryptographic Hardness of Learning
  • Additional Topics:

    • Stability Based Analysis
    • Boosting: Weak Learning and the Margin Interpretation of Boosting

 

Requirements and Grading

  • Problem sets. About 4-5 problem sets, some of which introduce additional material not covered in class. (Required)
  • Final Exam. (Required)
  • Scribe notes for a lecture. (Extra credit. Not required, but expected for a grade of "A" or "A+")
  • Optional challenge problems on some problem sets. (Extra credit. Not required, but expected for a grade of "A+")
  • Required Reading: Some material will be covered through assigned reading, rather than in class.
  • Grading: Your grade will be based on the final exam and the problem sets, which contribute equally to the final grade.

 

Scribes

Scribe Sign-up: here.

 

Recommended Texts

We will not be following a specific text, but some of the material is covered in:
