This page lists recent news from our group.
I gave an in-person talk at Berkeley's Simons Institute for the Theory of Computing.
I received the KSIAM Outstanding Young Investigator Award (젊은 연구자상).
Our group has 4 papers accepted to NeurIPS:
NeurIPS and ICML are considered the top machine learning conferences.
I gave an in-person talk at the 2023 INFORMS Annual Meeting in Phoenix entitled Computer-Assisted Design of Accelerated Composite Optimization Methods: OptISTA.
I gave an in-person talk at the Stanford University Information Systems Laboratory (ISL) Colloquium.
I gave an in-person talk at the Northwestern University Industrial Engineering and Management Sciences (IEMS) department seminar.
I gave an in-person talk at the Colorado School of Mines Applied Mathematics and Statistics Colloquium.
I gave an in-person talk at the University of Wisconsin–Madison Systems-Information-Learning-Optimization Seminar.
Jinhee was selected to be part of the KFAS Fellowship (고등교육재단 장학금) 2024 cohort. The KFAS fellowship is one of the most prestigious fellowships in Korea, providing full financial support for five years of graduate studies.
I will give a talk entitled Toward a grand unified theory of accelerations in optimization and machine learning at the UNIST Workshop on Optimization & Machine Learning.
I will give a tutorial entitled Diffusion Probabilistic Models and Text-to-Image Generative Models at Inha University.
The SNU math department hosted the 10-10 Summer School on Mathematics of Deep Learning and AI. I was the main organizer, and I gave a talk entitled Toward a grand unified theory of accelerations in optimization and machine learning.
The SNU math department hosted Professor Terence Tao to give a lecture entitled A counterexample to the periodic tiling conjecture.
The Samsung Science & Technology Foundation hosted the Global Research Symposium (GRS) on Mathematical Theory of AI. I am part of the organizing committee, and I will give a talk entitled Toward a grand unified theory of accelerations in optimization and machine learning.
I gave a tutorial entitled Diffusion Probabilistic Models and Text-to-Image Generative Models at the Short Course on Information and Learning Theory (정보 및 학습이론 단기강좌) hosted by the Coding and Information Theory Research Society (부호 및 정보이론 연구회).
I attended ICML and presented two papers.
I gave a tutorial entitled Stochastic Differential Equations + Deep Learning = Diffusion Probabilistic Models at the Workshop on Artificial Intelligence for Industrial Mathematics.
I gave an in-person talk at the SIAM Conference on Optimization entitled Non-Nesterov Acceleration Methods in First-order Optimization. (Slides)
Jaesung joined our research group as a Ph.D. student.
Excited to announce our new paper Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback, which is joint work with the KRAFTON AI Research Center. This work is our first collaboration with an industry research lab, and it represents a new initiative to pursue experimental AI research alongside the theoretical work we have been doing. (This paper has no theorems or proofs!) This paper proposes a method for aligning diffusion probabilistic models with human preferences by utilizing minimal (~3 minutes) human feedback.
Excited to announce our new paper Time-Reversed Dissipation Induces Duality Between Minimizing Gradient Norm and Function Value, which is joint work with Jaeyeon Kim of my group and Chanwoo Park and Asuman Ozdaglar of MIT. This work presents a new duality principle between accelerated convex optimization algorithms for reducing function values and reducing gradient magnitude. This work is, to the best of our knowledge, the first instance of a duality of optimization algorithms (rather than optimization problems), and we are optimistic that this new discovery will open the door to many new interesting questions.
Hyunsik joined our research group as an M.S. student.
Our group has 2 papers accepted to ICML:
ICML and NeurIPS are considered the top machine learning conferences.
I gave a talk at the UNIST Artificial Intelligence Graduate Seminar entitled Deep Learning That Defies Theoretical Understanding and the Mathematicians Building New Theories for New Challenges (이론적 이해를 거부하는 딥러닝과 새로운 난제를 위한 새로운 이론을 건축하는 수학자들).
During Spring quarter, I will be teaching a course entitled Large-Scale Convex Optimization: Algorithms and Analyses via Monotone Operators (EE 392F) at Stanford.
I gave a 5-part weekly lecture series entitled Diffusion Probabilistic Models and Text-to-Image Generative Models hosted by KSIAM.
I gave a tutorial entitled Diffusion Probabilistic Models at the KSIAM-NIMS AI Winter School.
I'm on sabbatical! I will spend the year at Stanford with my Ph.D. advisor Stephen Boyd.
I received the Excellent Research Award from the College of Natural Science of Seoul National University.
I received the 2022 Meritorious Service Award from the journal Mathematical Programming.
I gave a talk at the International Conference on Matrix Theory with Applications to Combinatorics, Optimization and Data Science entitled Continuous-Time Analysis of AGM via Conservation Laws in Dilated Coordinate Systems.
I gave a talk at the Naver & Korean AI Association Conference entitled Neural Tangent Kernel Analysis of Deep Narrow Neural Networks.
I gave a talk at Korea University entitled Infinitely Large Neural Networks.
I gave a talk at the Samsung Science & Technology Annual Forum 2022 entitled Deep Learning That Defies Theoretical Understanding and the Mathematicians Building New Theories for New Challenges (이론적 이해를 거부하는 딥러닝과 새로운 난제를 위한 새로운 이론을 건축하는 수학자들).
I gave a talk at the Korean Mathematical Society Annual Meeting entitled Continuous-Time Analysis of AGM via Conservation Laws in Dilated Coordinate Systems. (대한수학회 가을 연구발표회)
I gave an in-person talk at the 2022 INFORMS Annual Meeting in Indianapolis entitled Non-Nesterov Acceleration Methods in First-order Optimization.
I gave an in-person talk at the Korean Women in Mathematical Sciences Leaders Forum (한국여성수리과학회 리더스 포럼) entitled Infinitely Large Neural Networks.
I gave an in-person talk at the MIT ORC Fall Seminar entitled Non-Nesterov Acceleration Methods and Their Computer-Assisted Discovery Via the Performance Estimation Problem.
Uijeong joined our research group as a Ph.D. student.
Enrollment for Mathematical Foundations of Deep Neural Networks has reached 300 students! This is the largest enrollment of a single course on the SNU campus this semester. We have an exciting semester ahead of us.
Soheun was selected to be part of the KFAS Fellowship (고등교육재단 장학금) 2023 cohort. The KFAS fellowship is one of the most prestigious fellowships in Korea, providing full financial support for five years of graduate studies.
I gave a talk entitled Continuous-Time Analysis of AGM via Conservation Laws in Dilated Coordinate Systems at the KSIAM-MINDS-NIMS International Conference on Machine Learning and PDEs.
I gave a talk entitled NTK Analysis of Deep Narrow Neural Networks at the SNU AI Summer School.
I gave a tutorial entitled Optimization for ML at the Korean AI Association Summer Conference.
Taeho and Jongmin received the Youlchon AI Research Fellowship (율촌 AI 장학생).
I attended ICML and gave three presentations, two as long talks.
Jaewook and Jisun received the Google travel grant award to attend ICML 2022.
Our group has 3 papers accepted to ICML, 2 as long talks:
ICML and NeurIPS are considered the top machine learning conferences.
I gave a tutorial at the KSIAM Spring Conference entitled Infinitely Large Neural Networks.
I gave a talk at the POSTECH math colloquium.
I gave a tutorial at the Asian Control Conference entitled Mathematical and Practical Foundations of Backpropagation.
Chanwoo will be joining the MIT EECS Ph.D. program in Fall 2022. We wish Chanwoo the best for his very promising academic career.
I gave a talk entitled Neural Tangent Kernel Analysis of Deep Narrow Neural Networks at the NIMS Industrial Mathematics Colloquium (산업수학 콜로퀴움).
I gave a lecture entitled Infinitely Large Neural Networks for the class Recent Topics in Artificial Intelligence (최신 인공지능 기술) at SNU.
Excited to announce our new paper Branch-and-Bound Performance Estimation Programming: A Unified Methodology for Constructing Optimal Optimization Methods, which is joint work with Shuvomoy Das Gupta and Bart P.G. Van Parys of MIT. This work substantially generalizes the performance estimation problem, a computer-assisted methodology for finding first-order optimization methods, by utilizing branch-and-bound algorithms to solve non-convex QCQPs to certifiable global optimality.
Our group has 1 paper accepted to AISTATS:
This paper was written in collaboration with Youngsuk Park and Bernie Wang of Amazon AWS AI Labs.
Sehyun received the Outstanding TA Award for his work for the Mathematical Foundations of Deep Neural Networks course in Fall 2021.
Donghwan has joined our research group as a Ph.D. student.
I gave a talk entitled Plug-and-Play Methods Provably Converge with Properly Trained Denoisers at the Bio Imaging and Signal Processing Winter School (바이오영상신호처리 겨울학교).
From 4:30 to 6:00 p.m. PST, we presented our work A Geometric Structure of Acceleration and Its Role in Making Gradients Small Fast.
I was interviewed by the AI Times. (article)
I gave a talk entitled A Role of Mathematicians in Deep Learning Research at the Mathematics Department's Gauss Colloquium.
My Google Scholar profile has exceeded 1000 citations.
Our group has 1 paper accepted to NeurIPS:
NeurIPS and ICML are considered the top machine learning conferences.
I gave a talk at the 2021 INFORMS Annual Meeting in Anaheim.
I gave a talk at the Korean Mathematical Society Annual Meeting as an invited speaker. (대한수학회 가을 연구발표회 응용수학 상설분과 발표)
I gave a talk at the PDE and Applied Math Seminar at UC Riverside.
I gave a talk at the One World Optimization Seminar. (Video)
I gave a talk at AI Korea 2021.
Chanwoo was selected to be part of the KFAS Fellowship (고등교육재단 장학금) 2022 cohort. The KFAS fellowship is one of the most prestigious fellowships in Korea, providing full financial support for five years of graduate studies.
I gave a talk at the KMS 2021 Symposium for AI and University-Level Mathematics.
I was interviewed by a student reporter of the Industrial and Mathematical Data Analytics Research Center (산업수학센터) of SNU. (article)
I gave a talk at the 2021 EIMS (이화여대 수리과학연구소) International Conference on Computational Mathematics.
I gave a talk at the SNU AI Summer School. (Video)
I gave a talk at the (Chinese-Japanese-Korean)-SIAM Mini Symposium: Emerging Mathematics in AI at the KSIAM Spring Conference.
Shuvomoy (Ph.D. student, MIT ORC) will join our research group for the summer (May–Aug) of 2021 as a full-time visiting student.
The book Large-Scale Convex Optimization via Monotone Operators, coauthored with Wotao Yin, will be published by Cambridge University Press, which is considered one of the most prestigious academic publishers.
Our group has 2 papers accepted to ICML, one as a long talk:
ICML and NeurIPS are considered the top machine learning conferences.
I gave a talk at the SNU math colloquium.
I gave a talk at the KAIST math colloquium.
The grant proposal entitled A Theory of the Many Accelerations in Optimization and Machine Learning (기계 학습과 최적화 알고리즘의 가속에 대한 통합 이론) was accepted for funding by the Samsung Science & Technology Foundation. The SSTF grant is considered the most prestigious research grant in Korea. I am honored and grateful for the award and the support. 삼성미래기술육성재단 (No. SSTF-BA2101-02) (중앙일보 article)
TaeHo will intern at AWS AI Labs for the summer of 2021 (and therefore will be temporarily away from our research group).
Joo Young has joined our research group as a Ph.D. student.
I gave a two-part lecture series on the Mathematics of Infinitely Wide Neural Networks for the Bielefeld-Seoul International Research Training Group 2235.
I gave a talk at the Variational Analysis and Optimisation Webinar, organized by the Australian Mathematical Society. (Video)
I gave a talk at the SKKU math colloquium entitled Minimax Optimization in Machine Learning.
Jaewook has joined our research group as a Ph.D. student.
I gave a talk at the Korean Mathematical Society Annual Meeting as an invited speaker. (대한수학회 가을 연구발표회 응용수학 상설분과 발표)
Jongmin has joined our research group as a Ph.D. student.
I gave a talk at the Yonsei math colloquium entitled Minimax Optimization in Machine Learning.
Sehyun has joined our research group as an M.S. student.
TaeHo has joined our research group as a Ph.D. student.
Jisun has joined our research group as a Ph.D. student.
The grant proposal entitled Utilizing Minimax Game Structures to Efficiently Train Generative Adversarial Networks (최소최대 게임 구조를 활용한 적대 신경망의 효율적 훈련) was accepted for funding by the National Research Foundation. I am grateful for the support. (기본연구 No. 2020R1F1A1A01072877)
I have joined the Department of Mathematical Sciences at Seoul National University as a tenure-track assistant professor.
I have left the Department of Mathematics at UCLA to move to Seoul, Korea. I will dearly miss the collaboration with Wotao Yin.