We pursue research in machine learning and optimization.
To this end, we develop theories and algorithms using computational and mathematical tools.
Our ultimate goal is to provide robust and provable solutions to challenging problems in artificial intelligence, particularly those in large-scale settings.
We are passionate about translating our findings into practical applications that can benefit society.
For further details on our research directions and ongoing projects, please see the Research page.
Recent News
Feb 2026
Our lab has been selected for the "Early-Career Researcher Infrastructure Support Program" (NRF), one of the most competitive grants in Korea. With 500M KRW in funding, we will build a high-performance compute infrastructure to power our research on extreme compression of hyperscale AI foundation models.
Feb 2026
Our new paper is now available on arXiv! In this work, we introduce SeedFlood, a decentralized LLM training framework that enables model-size-independent communication cost and perfect consensus.
Feb 2026
Our new paper is now available on arXiv! In this work, we propose a basis rotation approach to effectively address the gradient staleness problem in asynchronous pipeline parallelism.
Jan 2026
Our paper on extreme LLM sparsity (ELSA) has been accepted to ICLR 2026.
Acknowledgements
Our research is generously supported by multiple organizations, including government agencies
(NRF, IITP), industry partners (Google, Samsung, Naver, Intel), and academic institutions
(POSTECH, Yonsei).