Mathematical Machine Learning Theory, Spring 2024

This is the public course website for Mathematical Machine Learning Theory, Spring 2024.


Announcements

  • No lecture on 04/23 and 04/25.

Homework



Course Description

This course focuses on understanding the mathematical theory (but not necessarily the modern "deep learning theory") behind machine learning algorithms and aims to use mathematical reasoning to design better machine learning methods. The topics include reproducing kernel Hilbert spaces and kernel methods, gradient descent, Rademacher complexity, Newton's method, stochastic gradient descent, and continuous-time models of gradient descent.
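
As a brief illustration of the flavor of two of these topics, gradient descent with step size $\alpha > 0$ applied to a differentiable objective $f\colon \mathbb{R}^n \to \mathbb{R}$ iterates

$$x_{k+1} = x_k - \alpha \nabla f(x_k), \qquad k = 0, 1, 2, \ldots,$$

and letting $\alpha \to 0$ yields the gradient flow ODE $\dot{X}(t) = -\nabla f(X(t))$, a prototypical continuous-time model of gradient descent.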



Course Information

Course material will be posted on this website. eTL will be used for announcements, homework submission, and the release of homework and exam scores.

Instructor

Ernest K. Ryu, 27-205

Teaching Assistants

Uijeong Jang

Jaeyeon Kim

Jongmin Lee

TaeHo Yoon

Lectures

Tuesdays and Thursdays 12:30–1:45pm at 28-102.

Exams

This class will have in-person midterm and final exams.

  • Midterm exam: Thursday, 04/11, 5:00–9:00pm, location TBD.
  • Final exam: Monday, 06/24, 9:00am–1:00pm, location TBD.

Grading

Homework 30%, midterm exam 30%, final exam 40%.

Prerequisites

Good knowledge of the following subjects is required.

  • Differential calculus and linear algebra.
  • Undergraduate analysis: $\epsilon$-$\delta$ argument, compact sets, uniform convergence.
  • Basic ODEs: Initial value problem.
  • Basic Fourier analysis.
  • Probability theory: Conditional expectation, central limit theorem, and law of large numbers.
  • Mathematical maturity: The focus of this course will be on rigorous mathematical proofs, as opposed to derivations or implementations.

Non-Prerequisites

Background knowledge in the following areas is helpful but not necessary.

  • This course will have no programming component; we will analyze (implementable) algorithms, but we will not implement them.
  • Background in machine learning or deep learning is not required, although prior exposure will help motivate the subject matter.
  • Measure-theoretic probability theory is not required, but we will touch on measure-theoretic considerations in passing.
  • Optimization theory (convex or non-convex) is not assumed.