News

This page contains recent news from our group.

23.05 New paper with KRAFTON AI Research Center

Excited to announce our new paper Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback, which is joint work with the KRAFTON AI Research Center. This work is our first collaboration with an industry research lab, and it represents a new initiative to pursue experimental AI research alongside the theoretical work we have been doing. (This paper has no theorems or proofs!) The paper proposes a method for aligning diffusion probabilistic models with human preferences using minimal (roughly 3 minutes of) human feedback.

23.05 New paper with Asuman Ozdaglar's group at MIT

Excited to announce our new paper Time-Reversed Dissipation Induces Duality Between Minimizing Gradient Norm and Function Value, which is joint work with Jaeyeon Kim of my group and Chanwoo Park and Asuman Ozdaglar of MIT. This work presents a new duality principle between accelerated convex optimization algorithms for reducing function values and reducing gradient magnitude. This work is, to the best of our knowledge, the first instance of a duality of optimization algorithms (rather than optimization problems), and we are optimistic that this new discovery will open the door to many new interesting questions.

23.04 ICML papers

Our group has 2 papers accepted to ICML:

ICML, together with NeurIPS, is considered the top machine learning conference.

23.04.26 UNIST AI Seminar

I will give a talk at the UNIST Artificial Intelligence Graduate Seminar entitled Deep Learning, Which Defies Theoretical Understanding, and the Mathematicians Building New Theories for New Open Problems (이론적 이해를 거부하는 딥러닝과 새로운 난제를 위한 새로운 이론을 건축하는 수학자들).

23.04 Stanford Course on Large-Scale Optimization

During Spring quarter, I will be teaching a course entitled Large-Scale Convex Optimization: Algorithms and Analyses via Monotone Operators (EE 392F) at Stanford.

23.03.15–04.12 KSIAM Lecture Series

I am giving a 5-part weekly lecture series entitled Diffusion Probabilistic Models and Text-to-Image Generative Models hosted by KSIAM.

23.02.22 KSIAM-NIMS AI Winter School

I gave a tutorial entitled Diffusion Probabilistic Models at the KSIAM-NIMS AI Winter School.

23.02 On Sabbatical at Stanford

I'm on sabbatical! I will spend my year at Stanford with my PhD advisor Stephen Boyd.

23.02 Excellent Research Award (SNU)

I received the Excellent Research Award from the College of Natural Science of Seoul National University.

23.01 Mathematical Programming 2022 Meritorious Service Award

I received the 2022 Meritorious Service Award from the journal Mathematical Programming.

22.12.03 International Conference on Matrix Theory with Applications to Combinatorics, Optimization and Data Science

I gave a talk at the International Conference on Matrix Theory with Applications to Combinatorics, Optimization and Data Science entitled Continuous-Time Analysis of AGM via Conservation Laws in Dilated Coordinate Systems.

22.11.18 Naver & Korean AI Association Conference

I gave a talk at the Naver & Korean AI Association Conference entitled Neural Tangent Kernel Analysis of Deep Narrow Neural Networks.

22.11.14 Korea University colloquium

I gave a talk at Korea University entitled Infinitely Large Neural Networks.

22.11.07 Samsung Science & Technology Annual Forum

I gave a talk at the Samsung Science & Technology Annual Forum 2022 entitled Deep Learning, Which Defies Theoretical Understanding, and the Mathematicians Building New Theories for New Open Problems (이론적 이해를 거부하는 딥러닝과 새로운 난제를 위한 새로운 이론을 건축하는 수학자들).

22.10.20 KMS Invited Speaker

I gave a talk at the Korean Mathematical Society Annual Meeting (대한수학회 가을 연구발표회) entitled Continuous-Time Analysis of AGM via Conservation Laws in Dilated Coordinate Systems.

22.10 INFORMS

I gave an in-person talk at the 2022 INFORMS Annual Meeting in Indianapolis entitled Non-Nesterov Acceleration Methods in First-order Optimization.

22.10.07 Korean Women Mathematical Society Leaders Forum

I gave an in-person talk at the Korean Women Mathematical Society Leaders Forum (한국여성수리과학회 리더스 포럼) entitled Infinitely Large Neural Networks.

22.09.29 MIT ORC Fall Seminar

I gave an in-person talk at the MIT ORC Fall Seminar entitled Non-Nesterov Acceleration Methods and Their Computer-Assisted Discovery Via the Performance Estimation Problem.

22.09 Uijeong Jang

Uijeong joined our research group as a Ph.D. student.

22.08.17 MathDNN enrollment reaches 300

Enrollment for Mathematical Foundations of Deep Neural Networks has reached 300 students! This is the largest enrollment of a single course on the SNU campus this semester. We have an exciting semester ahead of us.

22.08 Soheun: KFAS Fellowship

Soheun was selected to be part of the KFAS Fellowship (고등교육재단 장학금) 2023 cohort. The KFAS fellowship is one of the most prestigious fellowships in Korea, providing full financial support for five years of graduate studies.

22.08.11 KSIAM-MINDS-NIMS Conference

I gave a talk entitled Continuous-Time Analysis of AGM via Conservation Laws in Dilated Coordinate Systems at the KSIAM-MINDS-NIMS International Conference on Machine Learning and PDEs.

22.08.03 SNU AI Summer School

I gave a talk entitled NTK Analysis of Deep Narrow Neural Networks at the SNU AI Summer School.

22.08.01 Tutorial at Korean AI Association Summer Conference

I gave a tutorial entitled Optimization for ML at the Korean AI Association Summer Conference.

22.07 Taeho and Jongmin: Youlchon AI Research Fellowship (율촌 AI 장학생)

Taeho and Jongmin received the Youlchon AI Research Fellowship (율촌 AI 장학생).

22.07 ICML

I attended ICML and gave three presentations, two as long talks.

22.06 Jaewook and Jisun: Google Travel Grant

Jaewook and Jisun received Google travel grants to attend ICML 2022.

22.05 ICML papers

Our group has 3 papers accepted to ICML, 2 as long talks:

ICML, together with NeurIPS, is considered the top machine learning conference.

22.05.28 KSIAM Tutorial

I gave a tutorial at the KSIAM Spring Conference entitled Infinitely Large Neural Networks.

22.05.27 POSTECH math colloquium

I gave a talk at the POSTECH math colloquium.

22.05.06 ASCC Tutorial

I gave a tutorial at the Asian Control Conference entitled Mathematical and Practical Foundations of Backpropagation.

22.04 Chanwoo: Joining MIT EECS (Ph.D.)

Chanwoo will be joining the MIT EECS Ph.D. program in Fall 2022. We wish Chanwoo the best for his very promising academic career.

22.04.21 NIMS Colloquium

I gave a talk entitled Neural Tangent Kernel Analysis of Deep Narrow Neural Networks at the NIMS Colloquium (산업수학 콜로퀴움).

22.03.31 Lecture for Recent Topics in Artificial Intelligence

I gave a lecture entitled Infinitely Large Neural Networks for the class Recent Topics in Artificial Intelligence (최신 인공지능 기술) at SNU.

22.03 New PEP paper with Das Gupta and Parys of MIT

Excited to announce our new paper Branch-and-Bound Performance Estimation Programming: A Unified Methodology for Constructing Optimal Optimization Methods, which is joint work with Shuvomoy Das Gupta and Bart P.G. Van Parys of MIT. This work substantially generalizes the performance estimation problem, a computer-assisted methodology for finding first-order optimization methods, by utilizing branch-and-bound algorithms to solve non-convex QCQPs to certifiable global optimality.

22.02 AISTATS paper

Our group has 1 paper accepted to AISTATS.

This paper was written in collaboration with Youngsuk Park and Bernie Wang of Amazon AWS AI Labs.

22.01 Sehyun: Outstanding TA Award

Sehyun received the Outstanding TA Award for his work for the Mathematical Foundations of Deep Neural Networks course in Fall 2021.

22.01 Donghwan Rho

Donghwan has joined our research group as a Ph.D. student.

22.01.14 Bio Imaging and Signal Processing Winter School (바이오영상신호처리 겨울학교)

I gave a talk entitled Plug-and-Play Methods Provably Converge with Properly Trained Denoisers at the Bio Imaging and Signal Processing Winter School (바이오영상신호처리 겨울학교).

21.12.08 NeurIPS presentation

From 4:30 to 6:00 p.m. PST, we presented our work A Geometric Structure of Acceleration and Its Role in Making Gradients Small Fast.

21.12 AI Times interview

I was interviewed by the AI Times. (article)

21.11.23 Gauss Colloquium

I gave a talk entitled A Role of Mathematicians in Deep Learning Research at the mathematics department's Gauss Colloquium.

21.10 1000 Citations

My Google Scholar profile has exceeded 1000 citations.

21.10 NeurIPS paper

Our group has 1 paper accepted to NeurIPS.

NeurIPS, together with ICML, is considered the top machine learning conference.

21.10.26 INFORMS Annual Meeting

I gave a talk at the 2021 INFORMS Annual Meeting in Anaheim.

21.10.21 KMS Invited Speaker

I gave a talk at the Korean Mathematical Society Annual Meeting as an invited speaker. (대한수학회 가을 연구발표회 응용수학 상설분과 발표)

21.10.14 PDE and Applied Math Seminar at UC Riverside

I gave a talk at the PDE and Applied Math Seminar at UC Riverside.

21.10.11 One World Optimization Seminar

I gave a talk at the One World Optimization Seminar. (Video)

21.10.01 The AI Korea 2021 Conference

I gave a talk at AI Korea 2021.

21.09 Chanwoo: KFAS Fellowship

Chanwoo was selected to be part of the KFAS Fellowship (고등교육재단 장학금) 2022 cohort. The KFAS fellowship is one of the most prestigious fellowships in Korea, providing full financial support for five years of graduate studies.

21.09.30 KMS Symposium for AI and University-Level Mathematics

I gave a talk at the KMS 2021 Symposium for AI and University-Level Mathematics.

21.08 Industrial and Mathematical Data Analytics Research Center (산업수학센터) interview

I was interviewed by a student reporter of the Industrial and Mathematical Data Analytics Research Center (산업수학센터) of SNU. (article)

21.08.09 SNU AI Summer School

I gave a talk at the SNU AI Summer School. (Video)

21.06.25 CJK-SIAM Mini Symposium

I gave a talk at the (Chinese-Japanese-Korean)-SIAM Mini Symposium: Emerging Mathematics in AI at the KSIAM Spring Conference.

21.05 Visiting student: Shuvomoy Das Gupta

Shuvomoy (Ph.D. student, MIT ORC) will join our research group for the summer (May–Aug) of 2021 as a full-time visiting student.

21.05 Book publication

The book Large-Scale Convex Optimization via Monotone Operators by myself and Wotao Yin will be published by Cambridge University Press, which is considered one of the most prestigious academic publishers.

21.05 ICML papers

Our group has 2 papers accepted to ICML, one as a long talk:

ICML, together with NeurIPS, is considered the top machine learning conference.

21.04.08 SNU math colloquium

I gave a talk at the SNU math colloquium.

21.04.01 KAIST math colloquium

I gave a talk at the KAIST math colloquium.

21.04 Samsung Science & Technology Foundation grant

The grant proposal entitled A Theory of the Many Accelerations in Optimization and Machine Learning (기계 학습과 최적화 알고리즘의 가속에 대한 통합 이론) was accepted for funding by the Samsung Science & Technology Foundation. The SSTF grant is considered to be the most prestigious research grant in Korea. I am honored and grateful for the award and the support. 삼성미래기술육성재단 (No. SSTF-BA2101-02) (중앙일보 article)

21.03 Amazon Web Services AI Labs intern: TaeHo Yoon

TaeHo will intern at the AWS AI Labs for the summer of 2021 (and therefore will be temporarily away from our research group).

21.03 SRG paper accepted at Mathematical Programming

The scaled relative graph (SRG) paper by myself, Robert Hannah, and Wotao Yin has been accepted for publication in Mathematical Programming, which is considered the top journal in mathematical optimization.

21.03 Joo Young Choi

Joo Young has joined our research group as a Ph.D. student.

21.01 Gyumin Roh

Gyumin has joined our research group as an undergraduate intern.

20.12.14 & 20.12.22 Bielefeld, Germany Lecture Series: Mathematics of Infinitely Wide Neural Networks

I gave a two-part lecture series on the Mathematics of Infinitely Wide Neural Networks for the Bielefeld-Seoul International Research Training Group 2235.

20.12.04 Workshop on Optimization, Probability, and Simulation

I gave a talk at WOPS20, hosted by the Chinese University of Hong Kong, Shenzhen. (Video)

20.11.25 Variational Analysis and Optimisation Webinar

I gave a talk at the Variational Analysis and Optimisation Webinar, organized by the Australian Mathematical Society. (Video)

20.11.19 SKKU (성균관대학교) math colloquium

I gave a talk at the SKKU math colloquium entitled Minimax Optimization in Machine Learning.

20.11 Jaewook Suh

Jaewook has joined our research group as a Ph.D. student.

20.10.24 KMS Invited Speaker

I gave a talk at the Korean Mathematical Society Annual Meeting as an invited speaker. (대한수학회 가을 연구발표회 응용수학 상설분과 발표)

20.10 Jongmin Lee

Jongmin has joined our research group as a Ph.D. student.

20.09.02 Yonsei University math colloquium

I gave a talk at the Yonsei math colloquium entitled Minimax Optimization in Machine Learning.

20.08 Sehyun Kwon

Sehyun has joined our research group as an M.S. student.

20.06 Chanwoo Park

Chanwoo has joined our research group as an undergraduate intern.

20.06 TaeHo Yoon

TaeHo has joined our research group as a Ph.D. student.

20.06 Jisun Park

Jisun has joined our research group as a Ph.D. student.

20.05 National Research Foundation grant

The grant proposal entitled Utilizing Minimax Game Structures to Efficiently Train Generative Adversarial Networks (최소최대 게임 구조를 활용한 적대 신경망의 효율적 훈련) was accepted for funding by the National Research Foundation. I am grateful for the support. (기본연구 No. 2020R1F1A1A01072877)

20.03 Moved to SNU

I have joined the Department of Mathematical Sciences at Seoul National University as a tenure-track assistant professor.

19.12 Left UCLA

I have left the Department of Mathematics at UCLA to move to Seoul, Korea. I will dearly miss the collaboration with Wotao Yin.