The basics
Communications and questions for this course will be managed through the course page on Piazza, which you can find here.
When: TuTh 11:00am - 12:15pm | Where: CSI 3120
Instructor: Tom Goldstein | Office Hours: Th 12:15-1:30pm, AVW 3141
TA: Ali Shafahi | Office Hours: TBD
TA: Zheng Xu | Office Hours: Fri 2:25-3:25, AVW 3228
Final Exam: TBD
Coursework & grading
Homework will be assigned approximately once per week. Homework assignments will consist of both programming exercises and theoretical problem sets. Programming assignments are cumulative: you will need the results of early assignments to complete assignments given later in the course. The completed homework assignments you turn in must represent your own work.
The approximate grade breakdown of the course will be
- 50% homework
- 25% midterm exam
- 25% final exam
Note: This is a PhD qualifying course for computer science.
Homework
Assignments will be distributed in the form of Jupyter notebooks using Python 3. If you’ve never used notebooks before, you can find a tutorial here. For each assignment, you’ll turn in both a notebook and a PDF of your completed notebook.
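If you haven’t exported a notebook to PDF before, nbconvert (which ships with Jupyter) is one way to do it. Here is a minimal sketch; the filename hw1.ipynb is just a placeholder, and PDF export requires a LaTeX installation:

```python
# Minimal sketch: export a completed notebook to PDF with nbconvert.
# Equivalent to running `jupyter nbconvert --to pdf hw1.ipynb` in a terminal.
# The filename is a placeholder; PDF export requires a LaTeX installation.
from nbconvert import PDFExporter

pdf_bytes, _resources = PDFExporter().from_filename("hw1.ipynb")
with open("hw1.pdf", "wb") as f:
    f.write(pdf_bytes)
```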
Homework will be turned in using Elms/Canvas. Follow this link to Elms, and then look for the course page after logging in.
Homework 1: Linear algebra review (Due Feb 6)
Homework 2: More linear algebra (Due Feb 13)
Homework 3: Gradients (Due Feb 20)
For this assignment, you’ll need to download the notebook importer, and place it in the same folder as the notebook.
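If the importer follows the standard Jupyter notebook-loader recipe, simply importing it registers a finder on sys.meta_path, after which notebooks sitting in the same folder can be imported like ordinary Python modules. A rough sketch of the intended workflow (the module and notebook names below are hypothetical placeholders, not the course’s actual filenames):

```python
# Rough sketch, assuming the importer follows the standard Jupyter
# notebook-loader recipe. All names here are hypothetical placeholders.
import notebook_importer   # the downloaded importer, saved next to your notebook

# After the import above, .ipynb files in the same folder behave like modules:
import some_notebook       # would execute and load some_notebook.ipynb
```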
Homework 4: PySmorch (Due March 5)
For this assignment, you’ll need the notebook importer, and also this utility script.
Homework 5: Convex Functions (Due March 17)
Homework 6: Gradient Methods (Due April 11)
Homework 7: Splitting Methods (Due May 14)
Lecture Slides
These slides might get updated shortly before or after lecture…
Course Overview
Linear Algebra Review
Optimization Problems
TV, FFT, and Calculus
Quadratic Forms
Convex Functions
Gradient Methods
Quasi-Newton Methods
Duality
Proximal Methods
Lagrangian Methods
Random topics | MCMC code example
Linear Programming
Semidefinite Programming
Interior Point Method
Book & Other Sources
All course materials are available for free online. Suggested reading material for various topics includes:
- Derivatives and gradients: Convex Optimization by Boyd and Vandenberghe, Appendix A
- Numerical Linear Algebra: Numerical Linear Algebra by Trefethen and Bau
- Randomized matrix factorizations: Finding Structure with Randomness
- L1 models and sparsity: Sparse modeling for Image and Vision Processing
- Convex functions and gradient methods: Convex Optimization by Boyd and Vandenberghe
- Convergence rates for gradient methods: Optimal Rates in Convex Optimization
- Proximal methods: A Field Guide to Forward-Backward Splitting
- ADMM: Fast Alternating Direction Optimization Methods
- Consensus ADMM: Distributed Optimization and Statistical Learning
- Unwrapped ADMM: Unwrapping ADMM
- PDHG: Adaptive Primal-Dual Hybrid Gradient Methods
- SGD: Incremental Gradient, Subgradient, and Proximal Methods
- SGD convergence rates: Stochastic Gradient Descent for Non-Smooth Optimization
- Monte-Carlo: An Introduction to MCMC for Machine Learning
- Barrier Methods: Convex Optimization by Boyd and Vandenberghe, chapter 11
- Primal-Dual Interior Point Methods: Nocedal and Wright, chapter 14
- Semidefinite programming: Vandenberghe and Boyd
- Metric learning: Distance Metric Learning for LMNN