Course Syllabus
Convex Optimization (Winter 2018)
TTIC 31070 / CMSC 35470 / BUSF 36903 / STAT 21015
The class will be held Mondays and Wednesdays, 12:00 noon-1:20pm, in Social Sciences 122.
Instructor: Nati Srebro.
TAs: Blake Woodworth, Angela Wu, Haoyang Liu, and Greg Naisat
Recitations: There will be two weekly recitations covering background and supplementary material for the course, held at two times: Monday 4:00-5:00pm in TTIC 530 and Tuesday 4:30-5:30pm in Ryerson 276. Both sessions cover the same material, so there is no need to attend both.
Office Hours: Angela - Tuesday 2:00-3:00pm in Ryerson 163, Greg - Wednesday 2:30-3:30pm in TTIC 4th floor commons, Blake - Thursday 2:30-3:30pm in TTIC 404, Haoyang - Friday 10:00-11:00am in Jones 304
Staff email list: convex-optimization-2018-staff@ttic.edu. If you have questions about course administrivia, email us here. For questions about course material, please post your questions under the discussions tab so that other students can benefit from the answer.
If your name was not on the signup sheet in the first lecture, or if you did not attend the first lecture, please fill out the following form.
Course Description
The course will cover techniques in unconstrained and constrained convex optimization and a practical introduction to convex duality. The course will focus on (1) formulating and understanding convex optimization problems and studying their properties; (2) presenting and understanding optimization approaches; and (3) understanding the dual problem. Limited theoretical analysis of convergence properties of methods will be presented. Examples will be mostly from data fitting, statistics and machine learning.
Specific Topics:
- Formalization of optimization problems
- Smooth unconstrained optimization: Gradient Descent, Conjugate Gradient Descent, Newton and Quasi-Newton methods; Line Search methods (see the short sketch after this list)
- Standard formulations of constrained optimization: Linear, Quadratic, Conic and Semi-Definite Programming
- KKT optimality conditions
- Lagrangian duality, constraint qualification, weak and strong duality
- Fenchel conjugacy and its relationship to Lagrangian duality
- Multi-objective optimization
- Equality constrained Newton method
- Log Barrier (Central Path) methods, and Primal-Dual optimization methods
- Overview of the simplex method as an example of an active set method
- Overview of the first-order oracle model, subgradient and separation oracles, and the Ellipsoid algorithm
- Large-scale subgradient methods: subgradient descent, mirror descent, stochastic subgradient descent
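As a concrete taste of the first topic above, and of the kind of NumPy exercises that appear in the homework, here is a minimal sketch of gradient descent with backtracking line search on a least-squares objective. The problem data, step-size constants, and stopping tolerance are illustrative choices, not part of the course materials.

```python
import numpy as np

def gradient_descent(A, b, x0, tol=1e-8, max_iter=1000):
    """Minimize f(x) = 0.5 * ||Ax - b||^2 by gradient descent with backtracking."""
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)              # gradient of the least-squares objective
        if np.linalg.norm(grad) < tol:        # stop when near-stationary
            break
        # Backtracking line search (Armijo condition); alpha, beta are illustrative
        t, alpha, beta = 1.0, 0.25, 0.5
        f_x = 0.5 * np.linalg.norm(A @ x - b) ** 2
        while 0.5 * np.linalg.norm(A @ (x - t * grad) - b) ** 2 > f_x - alpha * t * (grad @ grad):
            t *= beta
        x = x - t * grad
    return x

# Tiny usage example on random data; the result should match np.linalg.lstsq.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_hat = gradient_descent(A, b, np.zeros(10))
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0], atol=1e-4))
```

The same skeleton adapts to the other methods listed above by changing the descent direction and the step-size rule.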
Text Books
The required textbook for the class is:
The book is available online here. About 75% of the material covered in the class can be found in the above book.
Supplemental recommended books:
- Bubeck: Convex Optimization: Algorithms and Complexity (Foundations and Trends in Machine Learning, 2015)
- Bertsekas: Nonlinear Programming, 2nd Edition (Athena Scientific 1999)
- Nocedal and Wright: Numerical Optimization, 2nd Edition (Springer 2006)
- Bertsekas with Nedic and Ozdaglar: Convex Analysis and Optimization (Athena Scientific 2003)
- Ben-Tal and Nemirovski: Lecture Notes on Modern Convex Optimization (2013)
- Nemirovski: Information-Based Complexity of Convex Programming (1994/5)
Requirements and Grading:
There will be roughly 7-8 weekly homework assignments, counting toward 50% of the grade. Assignments must be typed (not handwritten) and submitted electronically as PDF. Collaboration on the homework is encouraged, but each student must write up their solutions independently. Most assignments also include a NumPy programming section. The remaining 50% of the grade will be based on the final exam.
Homework submissions:
- Submit 1) a separate PDF "writeup" file containing your derivations, answers, and a copy of any code you wrote, and 2) all other relevant files, either separately or as a zip file.
- If you submit multiple versions, leave a comment indicating which version you would like graded. Also, double-check that you actually submitted the files you meant to.
- Questions about grading: leave comments on the assignment submission, and also email convex-optimization-2018-staff@ttic.edu with the subject line "GRADEQUES HW3" (for example) so that the correct person addresses the issue.
Lectures
Assignments