We pursue research in machine learning and optimization. To this end,
we develop theories and algorithms using computational and
mathematical tools. Our ultimate goal is to provide robust and
provable solutions to challenging problems in artificial intelligence,
particularly those in large-scale settings. We are passionate about
translating our findings into practical applications that can benefit
society. For further details on our research directions and ongoing
projects, please refer to the
Research.
Recent News
Apr 2026
🎓 Our paper on uncertainty quantification in ICL has been
accepted to ACL 2026. In this work, we introduce self-function
vectors to directly decompose uncertainty, and propose a novel
framework to evaluate these disentangled sources.
Apr 2026
🤝 Our lab has been selected for the IITP Efficient AI National
R&D Program (2026–2029), a national research program with
KRW 7.5B in total funding, together with leading partners
including SqueezeBits.
Feb 2026
🤝 Our lab has been selected for the "Early-Career Researcher
Infrastructure Support Program" (NRF) — one of the most
competitive grants in Korea. With KRW 500M in funding, we
will build a high-performance compute infrastructure to power our
research on extreme compression of hyperscale AI foundation
models.
Feb 2026
📃 Our new paper is now available on
arXiv! In this work, we introduce SeedFlood, a decentralized LLM
training framework that achieves communication cost independent
of model size while guaranteeing perfect consensus.
Feb 2026
📃 Our new paper is now available on
arXiv! In this work, we propose a basis rotation approach to
effectively address the gradient staleness problem in asynchronous
pipeline parallelism.
Acknowledgements
Our research is generously supported by multiple organizations
including government agencies (NRF, IITP), industry (Google, Samsung,
Naver, Intel), and academic institutions (POSTECH, Yonsei).