This is the public course website for Machine Learning Theory, Spring 2024.
This course focuses on understanding the mathematical theory behind machine learning algorithms (though not necessarily the modern "deep learning theory") and aims to use mathematical reasoning to design better machine learning methods. The topics include reproducing kernel Hilbert spaces and kernel methods, gradient descent, Rademacher complexity, Newton's method, stochastic gradient descent, and continuous-time models of gradient descent.
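As a small illustration of one of the listed topics, the following is a minimal gradient descent sketch on a quadratic objective; the matrix, vector, step size, and iteration count are arbitrary choices for demonstration and are not taken from the course material.

```python
import numpy as np

# Gradient descent on the quadratic f(x) = 0.5 * x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite (illustrative)
b = np.array([1.0, -1.0])               # illustrative right-hand side

def grad(x):
    # Gradient of the quadratic objective: A x - b.
    return A @ x - b

x = np.zeros(2)   # starting point
step = 0.1        # constant step size, small enough for this A
for _ in range(200):
    x = x - step * grad(x)

print(x)                      # iterate after 200 steps
print(np.linalg.solve(A, b))  # exact minimizer, for comparison
```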
Course material will be posted on this website. eTL will be used for announcements, homework submission, and the release of homework and exam scores.
Ernest K. Ryu, 27-205,
Tuesdays and Thursdays 12:30–1:45pm at 28-102.
This class will have in-person midterm and final exams.
Homework 30%, midterm exam 30%, final exam 40%.
Good knowledge of the following subjects is required.
Background knowledge in the following areas is helpful but not necessary.