Hello, the Graduate School of Data Science is pleased to host the upcoming BK21 x ERC Seminar as detailed below. We warmly invite your interest and participation.
Date & Time: Monday, July 7, 2025, at 2:00 PM
Venue: Room B102, Building 43-2, Seoul National University
Speaker: Prof. Ernest Ryu, UCLA
Title: Theory of the LoRA fine-tuning landscape
Abstract:
Low-rank adaptation (LoRA) has emerged as a standard approach for fine-tuning large foundation models due to its strong empirical performance, but the theoretical foundations of LoRA remain underexplored. In this talk, we present two analyses that demystify why LoRA often succeeds. In the first analysis, we adopt the Neural Tangent Kernel (NTK) assumption, which posits that the first-order Taylor expansion remains accurate throughout fine-tuning. We show that when fine-tuning with $N$ data points, there exists a low-rank solution of rank $r\lesssim \sqrt{N}$ that can be found using stochastic gradient descent (SGD), as the loss landscape has no spurious local minima. In the second analysis, we assume a priori that a low-rank update exists but do not rely on any linearization. We show that LoRA training converges to a global minimizer with low rank and small magnitude because spurious local minima have high rank and large magnitude, and the zero-initialization and weight decay induce an implicit bias toward the low-rank, small-magnitude region of the parameter space, where the global minima reside.
This talk is based on the following two papers:
- U. Jang, J. D. Lee, and E. K. Ryu, LoRA training in the NTK regime has no spurious local minima, International Conference on Machine Learning (Oral, top 144/9473=1.5% of papers), 2024.
- J. Kim, J. Kim, and E. K. Ryu, LoRA training provably converges to a low-rank global minimum or it fails loudly (but it probably won't fail), International Conference on Machine Learning (Oral, top 120/12107=1.0% of papers), 2025.
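For attendees less familiar with LoRA, below is a minimal sketch (not the speaker's code or the papers' setup) of the object studied in the abstract: a frozen pretrained weight W with a trainable low-rank update B A, where B is zero-initialized so fine-tuning starts at the pretrained model, and weight decay is applied to the low-rank factors. The layer sizes, rank, learning rate, and weight-decay value are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer W plus a trainable low-rank update B @ A (LoRA)."""

    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        # Pretrained weight W stays frozen during fine-tuning.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        # Low-rank factors: A is small random, B is zero, so B @ A = 0 at
        # initialization and fine-tuning starts exactly at the pretrained model.
        self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, rank))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Effective weight is W + B @ A, applied without forming the sum explicitly.
        return self.base(x) + x @ self.A.T @ self.B.T

# Illustrative usage: rank r much smaller than the layer dimensions, with SGD
# and weight decay on the LoRA factors only (hypothetical hyperparameters).
layer = LoRALinear(in_features=512, out_features=512, rank=8)
opt = torch.optim.SGD([layer.A, layer.B], lr=1e-2, weight_decay=1e-4)

x = torch.randn(32, 512)
target = torch.randn(32, 512)
loss = ((layer(x) - target) ** 2).mean()
loss.backward()
opt.step()
```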
Bio:
Ernest Ryu is an assistant professor in the Department of Mathematics at UCLA. His current research focuses on applied mathematics, deep learning, and optimization.
Professor Ryu received a B.S. in Physics and Electrical Engineering with honors from the California Institute of Technology in 2010, and an M.S. in Statistics and a Ph.D. in Computational and Mathematical Engineering from Stanford University in 2016, where his dissertation received the Gene Golub Best Thesis Award. In 2016, he joined the Department of Mathematics at UCLA as an Assistant Adjunct Professor. In 2020, he joined the Department of Mathematical Sciences at Seoul National University as a tenure-track faculty member. In 2024, he returned to UCLA as an assistant professor.