Lectures

The course comprises 10 lectures. These will be held on campus in Uppsala and will neither be recorded nor broadcast via Zoom.

Suggested solutions for some of the recommended exercises will be added to the exercise sheet continuously throughout the course.

For each lecture, the lecturer, suggested reading (cursory reading in parentheses), recommended exercises, and lecture notes are listed below.

1. Introduction, course structure, foundations (JS)
   Reading: [WR] 1, 2.1-2.3
   Notes: Jupyter notebook, Colab, PDF

2. Convex functions, descent methods (SM)
   Reading: [WR] 2.4-2.5, 3-3.2
   Exercises: 2.7, 3.3
   Notes: PDF

3. Line search, momentum, conjugate gradients (JS)
   Reading: [WR] 3.4-3.5, 4.1, (4.2-4.4), 4.5; [NW] pp. 101-112
   Exercises: 4.4, 4.10
   Notes: PDF

4. SGD, coordinate descent (SM)
   Reading: [WR] 5.1-5.5, (6)
   Notes: PDF

5. Numerical linear algebra (JS)
   Reading: [BV] Appendix C
   Notes: PDF

6. Randomized linear algebra (SM)
   Reading: [MT] 1, 4-4.3, 4.8, 8, 10
   Notes: PDF

7. First-order constrained optimization, projected gradient, subgradients (JS)
   Reading: [WR] 7.0-7.3, 8.0-8.2
   Notes: PDF

8. Nonsmooth optimization (SM)
   Reading: [WR] 8.4-8.6, 9
   Notes: PDF

9. Duality theory and algorithms (JS)
   Reading: [WR] 10
   Notes: PDF

10. Differentiation and adjoints (SM)
    Reading: [WR] 11
    Notes: PDF

The main book for the course is

  • [WR] Wright, S. J., & Recht, B. (2022). Optimization for data analysis. Cambridge University Press.

We do not cover all aspects of the suggested chapters in the lectures. For some lectures, we use additional material. In particular,

  • [NW] Nocedal, J., & Wright, S. J. (2006). Numerical optimization (2nd ed.). Springer.
  • [BV] Boyd, S., & Vandenberghe, L. (2004). Convex optimization. Cambridge University Press.
  • [MT] Martinsson, P.-G., & Tropp, J. A. (2020). Randomized numerical linear algebra: Foundations and algorithms. Acta Numerica, 29, 403-572.

Lecturers:
JS = Jens Sjölund
SM = Sebastian Mair