Department of Computer Science | Institute of Theoretical Computer Science | CADMO

Theory of Combinatorial Algorithms

Prof. Emo Welzl and Prof. Bernd Gärtner

Optimization for Data Science (2018)

Lecturers: Bernd Gärtner (CAB G31.1),
David Steurer (CAB H37.1).
Assistants: Viktor Gal (CAB F52.1),
Gleb Novikov (CAB H31.1),
Manuel Wettstein (CAB G19.2).
Lectures: Mon 15-16, HG E1.1,
Tue 10-12, HG D3.2.
Credit Points: 8CP for Data Science Master and Informatik Master (261-5110-00L, 3V + 2U + 2A)
Language: English
Contents: This course provides an overview of modern optimization methods, with applications in machine learning and data science in particular.
  • In the first part of the course, we discuss how classic first- and second-order methods such as gradient descent and Newton's method can be adapted to scale to large datasets, in theory and in practice. We also cover some newer algorithms and paradigms that have been developed specifically in the context of data science. The emphasis is not so much on applying these methods (many of which are covered in other courses), but on understanding and analyzing the methods themselves.
  • In the second part, we discuss convex programming relaxations as a powerful and versatile paradigm for designing efficient algorithms for computational problems arising in data science. We develop a unified perspective on this paradigm through the lens of the sum-of-squares semidefinite programming hierarchy. As applications, we discuss non-negative matrix factorization, compressed sensing and sparse linear regression, matrix completion and phase retrieval, as well as robust estimation.
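To give a concrete flavor of the first-order methods analyzed in Part 1, here is a minimal sketch of plain gradient descent on a least-squares objective f(x) = ||Ax - b||². The matrix, the fixed step size 1/L (with smoothness constant L = 2·σ_max(A)²), and the iteration count are illustrative choices, not part of the course material.

```python
import numpy as np

# Gradient descent on f(x) = ||Ax - b||^2, a simple smooth convex objective.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)

L = 2 * np.linalg.norm(A, 2) ** 2      # smoothness constant: 2 * sigma_max(A)^2
x = np.zeros(5)
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b)       # gradient of ||Ax - b||^2
    x = x - grad / L                   # fixed step size 1/L

x_star = np.linalg.lstsq(A, b, rcond=None)[0]   # closed-form least-squares optimum
print(np.linalg.norm(x - x_star))               # distance to the optimum
```

Since this particular f is smooth and strongly convex, the fixed step size 1/L already yields linear convergence; when and how fast such schemes converge is exactly what the lectures analyze.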
Literature:
Prerequisites: As background, we require material taught in the course "252-0209-00L Algorithms, Probability, and Computing". Participants need not have actually taken that course, but they should be prepared to catch up if necessary.

Exams, Special Assignments and Grading

Grading: There will be a written exam in the examination session. Furthermore, there will be two mandatory written special assignments during the semester. The final grade for the whole course is calculated as a weighted average of the exam grade (80%) and the special-assignment grades (20%).
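Spelled out, the stated weighting amounts to the following small calculation. Note that averaging the two special-assignment grades with equal weight is an assumption for illustration; the page does not specify how the two assignment grades combine.

```python
# Weighted final grade: 80% exam, 20% special assignments.
# Equal weighting of the two assignment grades is an assumption,
# not something stated on this page.
def final_grade(exam, sa1, sa2):
    return 0.8 * exam + 0.2 * (sa1 + sa2) / 2

print(final_grade(5.0, 6.0, 5.5))
```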
Special Assignments: Twice during the semester, we will hand out specially marked exercises or term projects; the written part of the solutions is expected to be typeset in LaTeX or similar. Solutions will be graded, and these grades account for 20% of the final grade. Assignments may be discussed with colleagues, but we expect an independent writeup.
Exam: Date to be determined. The exam is written and closed-book; it lasts 120 minutes, and no written material is permitted.

Regular Exercises

The exercise classes take place as follows.

Note: Exercise class B has been canceled due to low attendance. Please go to A instead.

Group Time Room Assistant
A Tue 13-15 HG D3.2 Manuel Wettstein
B Tue 13-15 CHN G22 Gleb Novikov

Preliminary Schedule

This schedule is preliminary and subject to change. The exact schedule will be updated on a weekly basis.

Note: The first special assignment has been published.

Note: Lecture notes are now password protected. Please use your nethz login.

Note: The second special assignment has been published.

Exercises and special assignments are listed under their due date.

Week 8
  Mon 19.2.18  Theory of Convex Functions (1.1, 1.2, 1.3 until Lem. 1.6)
               Due: Exercises 1, 2, 3, 4, 8
  Tue 20.2.18  Theory of Convex Functions (1.3.1, 1.3.2, 1.3.3, 1.4.1, 1.4.2)

Week 9
  Mon 26.2.18  Theory of Convex Functions (1.4.3, 1.5.1, 1.6.1); Extra material (1.5.2, 1.5.3, 1.5.4)
  Tue 27.2.18  Theory of Convex Functions (1.6.2); Gradient Descent (2.1, 2.2, 2.3, 2.4 until Def. 2.3)

Week 10
  Mon 5.3.18   Gradient Descent (2.4, 2.5, 2.6 w/o Thm. 2.11)
               Due: Exercises 6, 7, 10, 12
  Tue 6.3.18   Gradient Descent (Thm. 2.11); Projected Gradient Descent (3.1, 3.2, 3.3)

Week 11
  Mon 12.3.18  Projected Gradient Descent (3.5)
               Due: Exercises 15, 17, 18
  Tue 13.3.18  Subgradient Descent (4.1, 4.2, 4.3); Stochastic Gradient Descent (5.1, 5.2, 5.2.1)

Week 12
  Mon 19.3.18  Stochastic Gradient Descent (5.2.2, 5.2.3)
               Due: Exercises 22, 23, 26
  Tue 20.3.18  Newton's Method (6.1, 6.2, 6.3 until proof of Thm. 6.4)

Week 13
  Mon 26.3.18  Quasi-Newton Methods (7.1, 7.2, 7.3, 7.4 until Thm. 7.1)
               Due: Special Assignment 1; Exercises 27, 28, 29, 30, 31
  Tue 27.3.18  Quasi-Newton Methods (7.4.1, 7.4.2, 7.4.3, 7.4.4)

Week 14
  Mon 2.4.18   Easter – No Lecture! (no exercise class)
  Tue 3.4.18   Easter – No Lecture!

Week 15
  Mon 9.4.18   Part 2 (May 28): Introduction (1, 2)
               Due: Exercises 34, 35, 36, 37
  Tue 10.4.18  Regression (3.1, 3.2, 3.3)

Week 16
  Mon 16.4.18  Sechseläuten – No Lecture!
               Due: Exercises 3.a, 3.b, 3.c, 3.d, 3.e
  Tue 17.4.18  Regression (3.4, 3.5, 3.6)

Week 17
  Mon 23.4.18  Regression (3.7)
               Due: Exercises 3.f, 3.g, 3.h
  Tue 24.4.18  Regression (3.8)

Week 18
  Mon 30.4.18  (no exercise class)
               Due: Exercise 3.i
  Tue 1.5.18   Labor Day – No Lecture!

Week 19
  Mon 7.5.18   Due: Exercises 3.m, 3.n (in-class, no preparation expected)
  Tue 8.5.18

Week 20
  Mon 14.5.18  Due: Exercises 3.j, 3.k, 3.l
  Tue 15.5.18

Week 21
  Mon 21.5.18  Pentecost – No Lecture!
               Due: Exercises 4.a, 4.b
  Tue 22.5.18

Week 22
  Mon 28.5.18  Due: Special Assignment 2; Exercise 5.a (in-class, no preparation expected)
  Tue 29.5.18