Course Syllabus

Convex Optimization (Winter 2020)

TTIC 31070 / CAAM 31015 / CMSC 35470 / BUSF 36903 / STAT 31015

The class will be held Mondays and Wednesdays 1:30pm-2:50pm in TTIC 526B.

Final: Monday, March 16th, 1:30-4:30pm.

Instructor: Nati Srebro.  Office hours: Monday 3-4pm.

TAs: Omar Montasser, Lijia Zhou, Zhen Dai.

Recitations: Monday 4-5pm and Tuesday 4-5pm, both in TTIC 526.
TA office hours:
  • Omar Montasser: Tuesday 3-4pm, TTIC 405
  • Zhen Dai: Wednesday 4-5pm, Jones 308
  • Lijia Zhou: Thursday 5-6pm, TTIC 526

Staff email list: convex-optimization-2020-staff@ttic.edu.

Whom should I contact and where should I ask questions?

Questions about the course material, including questions about the lectures, the slides, or the textbooks --> ask on Canvas, in person during TA or instructor office hours, or during lectures or recitations. Do not ask by direct email (we will not answer such questions via email).

Clarifications and technical problems with the homework, or general questions about course logistics (not specific to you) --> ask on Canvas (preferred) or during TA office hours. Please post on Canvas instead of emailing us.

Help with the homework --> seek help during TA office hours.

Questions and concerns about grading, including mistakes in grading, late submissions, extensions and other special arrangements or concerns that are specific to you --> email us at convex-optimization-2020-staff@ttic.edu. Please use this email address instead of emailing a staff person individually.

General comments or feedback --> via Canvas (optionally anonymously), to the staff email list, or directly to the relevant staff member.

Personally sensitive issue --> you may contact any staff member you are comfortable contacting. 

Course Description

The course will cover techniques in unconstrained and constrained convex optimization and a practical introduction to convex duality. The course will focus on (1) formulating and understanding convex optimization problems and studying their properties; (2) presenting and understanding optimization approaches; and (3) understanding the dual problem. Limited theoretical analysis of convergence properties of methods will be presented. Examples will be mostly from data fitting, statistics and machine learning.

Specific Topics:

  • Formalization of optimization problems
  • Smooth unconstrained optimization: Gradient Descent, Conjugate Gradient Descent, Newton and Quasi-Newton methods; Line Search methods
  • Standard formulations of constrained optimization: Linear, Quadratic, Conic and Semi-Definite Programming
  • KKT optimality conditions
  • Lagrangian duality, constraint qualification, weak and strong duality
  • Fenchel conjugacy and its relationship to Lagrangian duality
  • Multi-objective optimization
  • Equality constrained Newton method
  • Log Barrier (Central Path) methods, and Primal-Dual optimization methods
  • Overview of the simplex method as an example of an active set method
  • Overview of the first-order oracle model, subgradient and separation oracles, and the Ellipsoid algorithm
  • Large-scale subgradient methods: subgradient descent, mirror descent, stochastic subgradient descent
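As a flavor of the first topic block (smooth unconstrained optimization), here is an illustrative sketch, not course-provided material, of plain gradient descent with a fixed step size on a least-squares objective; the matrix, step size, and iteration count below are arbitrary choices for the example:

```python
import numpy as np

def gradient_descent(A, b, step=0.01, iters=500):
    """Minimize f(x) = 0.5*||A x - b||^2 by gradient descent with a fixed step."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)  # gradient of 0.5*||Ax - b||^2
        x -= step * grad
    return x

# Small example: the minimizer of this overdetermined system is x = [1, 1].
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])
x = gradient_descent(A, b)
```

A fixed step size only converges when it is small enough relative to the largest eigenvalue of A^T A; line search methods, also listed above, remove the need to tune it by hand.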

 

Text Books

The required textbook for the class is:

The book is available online. About 75% of the material covered in the class can be found in the above book.

Supplemental recommended books:

 

Requirements and Grading:

There will be weekly homework assignments. Assignments must be typed (not handwritten) and submitted electronically as PDF. Collaborative work on the homework (except for the preparation evaluation) is encouraged, but each student must write up their solutions independently. Most assignments also include a NumPy programming section. 50% of the grade is based on the final exam; for the remaining 50%, each homework grade is replaced by the final exam grade if the latter is higher, according to the following formula for the final grade:

0.5*FINAL + 0.5*( \sum_i max{HW_i, FINAL} / #HW )
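The formula can be checked numerically; the scores in the example below are hypothetical:

```python
def final_grade(final, hw_grades):
    """0.5*FINAL + 0.5*(sum_i max{HW_i, FINAL} / #HW)."""
    hw_term = sum(max(hw, final) for hw in hw_grades) / len(hw_grades)
    return 0.5 * final + 0.5 * hw_term

# A strong final lifts every homework grade below it:
# here the 70 and 80 are each replaced by the final's 90.
grade = final_grade(90, [70, 95, 80])
```

Note that a homework grade can only help: max{HW_i, FINAL} is never below FINAL, so skipping a homework costs nothing beyond losing the chance to raise that term.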

Prerequisites:

Linear Algebra, Vector Calculus and Algorithms at the undergraduate level OR Matrix Computation (STAT/CAAM 30900). CMSC 25300 can replace Linear Algebra and Vector Calculus, but an algorithms course would still be required.

 

Registration:

  • PhD students in programs where the course is listed (TTIC, STAT, CAM, CS or Booth) may register independently after verifying that they have the required prerequisites.
  • Other graduate students, including master's students and students in other PhD programs, may register if they have taken Matrix Computation, or after emailing the instructor to explain their prerequisites and receiving permission.
  • Undergraduate students may register only after receiving permission from the instructor, based on their prerequisites and background.

 

Preparation evaluation:

In any case, all students must complete a take-home evaluation covering the prerequisite material and assessing how well prepared they are for the class. This evaluation is mandatory for all students and does not count toward the final grade. Collaboration is NOT allowed on this evaluation. Students will receive a letter grade on the Monday after the evaluation is due:
  • Students scoring A on this evaluation will be welcome to take the class.
  • Students scoring B on this evaluation will be allowed to take the class if they choose to, but should review or restudy the prerequisites.
  • Students scoring C will be advised not to take the class until mastering the prerequisites.

 
