This is the public course website for *Machine Learning Theory*, Spring 2024.

- No lecture on 05/28.
- On Tuesday, 05/28, 5:00–6:35 pm, I will give a talk entitled "The Reasoning Power of LLMs That Will Change the World" in the Mathematics department's Gauss Colloquium. Those of you who are interested are encouraged to attend.
- No lecture on 06/04 and 06/06.

- Homework 1, due 03/13, 5pm.
- Homework 2, due 03/27, 5pm.
- Homework 3, due 04/09, 5pm.
- Homework 4, due 04/25, 5pm.
- Homework 5, due 05/09, 5pm.
- Homework 6, due 05/31, 5pm.
- Homework 7, due 06/21, 5pm.

This course focuses on understanding the mathematical theory (but not necessarily the modern "deep learning theory") behind machine learning algorithms and aims to use mathematical reasoning to design better machine learning methods. The topics include reproducing kernel Hilbert spaces and kernel methods, gradient descent, Rademacher complexity, Newton's method, stochastic gradient descent, and continuous-time models of gradient descent.

Course material will be posted on this website. eTL will be used for announcements, homework submission, and posting homework and exam scores.

Ernest K. Ryu, 27-205

Uijeong Jang

Jaeyeon Kim

Jongmin Lee

TaeHo Yoon

Tuesdays and Thursdays 12:30–1:45pm at 28-102.

This class will have in-person midterm and final exams.

- Midterm exam: Thursday, 04/11, 5:00–9:00pm, location TBD.
- Final exam: Monday, 06/24, 9:00am–1:00pm, location TBD.

Homework 30%, midterm exam 30%, final exam 40%.

Good knowledge of the following subjects is required.

- Differential calculus and linear algebra.
- Undergraduate analysis: $\epsilon$-$\delta$ argument, compact sets, uniform convergence.
- Basic ODEs: Initial value problem.
- Basic Fourier analysis.
- Probability theory: Conditional expectation, central limit theorem, and law of large numbers.
- Mathematical maturity: The focus of this course will be on rigorous mathematical proofs, as opposed to derivations or implementations.

Background knowledge in the following areas is helpful but not necessary.

- This course will have no programming component; we will analyze (implementable) algorithms, but we will not implement them.
- Background in machine learning or deep learning is not required, although prior exposure will help motivate the subject matter.
- Measure-theoretic probability theory is not required, but we will touch on measure-theoretic considerations in passing.
- Optimization theory (convex or non-convex) is not assumed.