Lee Optimization Group
2026
SeedFlood: A Step Toward Scalable Decentralized Training of LLMs
Jihun Kim, Namhoon Lee
arXiv 2026
Mitigating Staleness in Asynchronous Pipeline Parallelism via Basis Rotation
Hyunji Jung*, Sungbin Shin*, Namhoon Lee
arXiv 2026
The Unseen Frontier: Pushing the Limits of LLM Sparsity with Surrogate-Free ADMM
Kwanhee Lee, Hyeondo Jang, Dongyeop Lee, Dan Alistarh, Namhoon Lee
ICLR 2026; JKAIA 2025 Best Paper Award
2025
MemIEC: A Step Toward Continual and Compositional Knowledge Editing
Jin Seong*, Jiyun Park*, Wencke Liermann, Hongseok Choi, Yoonji Nam, Hyun Kim, Soojong Lim, Namhoon Lee
NeurIPS 2025
An Analysis of Concept Bottleneck Models: Measuring, Understanding, and Mitigating the Impact of Noisy Annotations
Seonghwan Park, Jueun Mun, Donghyun Oh, Namhoon Lee
NeurIPS 2025
Critical Influence of Overparameterization on Sharpness-aware Minimization
Sungbin Shin*, Dongyeop Lee*, Maksym Andriushchenko, Namhoon Lee
UAI 2025; ICML 2023 Workshop on High-dimensional Learning Dynamics; JKAIA 2023 Best Paper Award
SAFE: Finding Sparse and Flat Minima to Improve Pruning
Dongyeop Lee, Kwanhee Lee, Jinseok Chung, Namhoon Lee
ICML 2025 (Spotlight)
SASSHA: Sharpness-aware Adaptive Second-order Optimization with Stable Hessian Approximation
Dahun Shin*, Dongyeop Lee*, Jinseok Chung, Namhoon Lee
ICML 2025
ZIP: An Efficient Zeroth-order Prompt Tuning for Black-box Vision-Language Models
Seonghwan Park, Jaehyeon Jeong, Yongjun Kim, Jaeho Lee, Namhoon Lee
ICLR 2025
2024
Rethinking Pruning Large Language Models: Benefits and Pitfalls of Reconstruction Error Minimization
Sungbin Shin, Wonpyo Park, Jaeho Lee, Namhoon Lee
EMNLP 2024
The Role of Masking for Efficient Supervised Knowledge Distillation of Vision Transformers
Seungwoo Son, Jegwang Ryu, Namhoon Lee, Jaeho Lee
ECCV 2024; ICLR 2023 Workshop
JaxPruner: A Concise Library for Sparsity Research
Joo Hyung Lee, Wonpyo Park, Nicole Mitchell, Jonathan Pilault, Johan Obando-Ceron, Han-Byul Kim, Namhoon Lee, Elias Frantar, Yun Long, Amir Yazdanbakhsh, Shivani Agrawal, Suvinay Subramanian, Xin Wang, Sheng-Chun Kao, Xingyao Zhang, Trevor Gale, Aart Bik, Woohyun Han, Milen Ferev, Zhonglin Han, Hong-Seok Kim, Yann Dauphin, Gintare Karolina Dziugaite, Pablo Samuel Castro, Utku Evci
CPAL 2024 (Oral); ICLR 2023 Workshop on Sparsity in Neural Networks
2023
Pruning Neural Networks with Velocity-Constrained Optimization
Donghyun Oh*, Jinseok Chung*, Namhoon Lee
NeurIPS 2023 Workshop on Optimization for Machine Learning
A Closer Look at the Intervention Procedure of Concept Bottleneck Models
Sungbin Shin, Yohan Jo, Sungsoo Ahn, Namhoon Lee
ICML 2023; NeurIPS 2022 Workshop on Trustworthy and Socially Responsible ML
On the Effectiveness of Sharpness-Aware Minimization with Large Mini-batches
Jinseok Chung, Seonghwan Park, Jaeho Lee, Namhoon Lee
ICML 2023 Workshop on High-dimensional Learning Dynamics
FedFwd: Federated Learning without Backpropagation
Seonghwan Park, Dahun Shin, Jinseok Chung, Namhoon Lee
ICML 2023 Workshop on Federated Learning and Analytics in Practice
Semi-Supervised Concept Bottleneck Models
Jeeon Bae, Sungbin Shin, Namhoon Lee
ICML 2023 Workshop on Artificial Intelligence and Human-Computer Interaction
Almost Sure Last Iterate Convergence of Sharpness-Aware Minimization
Kyungchun Nam, Jinseok Chung, Namhoon Lee
ICLR 2023 Tiny Papers
Previous
Meta-Learning Sparse Implicit Neural Representations
Jaeho Lee*, Jihoon Tack*, Namhoon Lee, Jinwoo Shin
NeurIPS 2021; SNN 2021
Understanding the Effects of Data Parallelism and Sparsity on Neural Network Training
Namhoon Lee, Thalaiyasingam Ajanthan, Philip H. S. Torr, Martin Jaggi
ICLR 2021
Optimal Mini-Batch Size for Stochastic Gradient Methods
Namhoon Lee, Philip H. S. Torr, Richard Hartley
ICLR 2021 Workshop on Science and Engineering of Deep Learning
Toward Efficient Deep Learning with Sparse Neural Networks
Namhoon Lee
Ph.D. Thesis, University of Oxford
A Signal Propagation Perspective for Pruning Neural Networks at Initialization
Namhoon Lee, Thalaiyasingam Ajanthan, Stephen Gould, Philip H. S. Torr
ICLR 2020 (Spotlight)
Data Parallelism in Training Sparse Neural Networks
Namhoon Lee, Philip H. S. Torr, Martin Jaggi
ICLR 2020 Workshop on Practical ML
SNIP: Single-shot Network Pruning based on Connection Sensitivity
Namhoon Lee, Thalaiyasingam Ajanthan, Philip H. S. Torr
ICLR 2019
Learn to Pay Attention
Saumya Jetley, Nicholas Lord, Namhoon Lee, Philip H. S. Torr
ICLR 2018
Synthesizing a Scene-Specific Pedestrian Detector and Pose Estimator for Static Video Surveillance
Hironori Hattori*, Namhoon Lee*, Vishnu N. Boddeti, Fares Beainy, Kris M. Kitani, Takeo Kanade
IJCV 2018
DESIRE: Distant Future Prediction in Dynamic Scenes with Interacting Agents
Namhoon Lee, Wongun Choi, Paul Vernaza, Christopher B. Choy, Philip H. S. Torr, Manmohan Chandraker
CVPR 2017 (Spotlight)
Forecasting Interactive Dynamics of Pedestrians with Fictitious Play
Wei-Chiu Ma, De-An Huang, Namhoon Lee, Kris M. Kitani
CVPR 2017
Predicting Wide Receiver Trajectories in American Football
Namhoon Lee, Kris M. Kitani
WACV 2016 (Oral)
Modeling of Dynamic Environments for Visual Forecasting of American Football Plays
Namhoon Lee (committee: Kris M. Kitani, Martial Hebert, Sebastian Scherer)
M.S. Thesis, Carnegie Mellon University, Dec 2015