2nd-year PhD student at the OSI LAB, KAIST AI

💌 itsnamgyu[at]kaist.ac.kr · Google Scholar · Twitter · GitHub · LinkedIn

I work on efficient reasoning in LLMs. Reasoning Teachers (C4) pioneered LLM reasoning distillation. Block Transformer (C7) demonstrated a global-to-local approach to inference-optimized architectures that mitigates prefill compute and KV cache memory costs. Most recently, we showed that self-training elicits concise reasoning in LLMs (C8).

Last updated Mar 6th, 2025


Education

Sogang University | BS in CSE (Feb 18, 2016 – Aug 17, 2021) 3.87/4.3 (Summa Cum Laude)

KAIST | MS in AI (Aug 30, 2021 – Aug 27, 2023) OSI LAB (Advisor: Se-Young Yun)

KAIST | PhD in AI (Aug 28, 2023 – ) OSI LAB (Advisor: Se-Young Yun) XFact (Co-Advisor: James Thorne)


Highlighted Publications

[C8] Self-Training Elicits Concise Reasoning in Large Language Models (preprint) [link]

Tergel Munkhbat*, Namgyu Ho*, Seo Hyun Kim*, Yongjin Yang, Yujin Kim, and Se-Young Yun

[C7] Block Transformer: Global-to-Local Language Modeling for Fast Inference (NeurIPS 2024) [link]

Namgyu Ho*, Sangmin Bae*, Taehyeon Kim, Hyunjik Jo, Yireun Kim, Tal Schuster, Adam Fisch, James Thorne, and Se-Young Yun

[C4] Large Language Models Are Reasoning Teachers (ACL 2023) [link]

Namgyu Ho, Laura Schmid, and Se-Young Yun

[C3] Understanding Cross-Domain Few-Shot Learning Based on Domain Similarity and Few-Shot Difficulty (NeurIPS 2022) [link]

Jaehoon Oh*, Sungnyun Kim*, Namgyu Ho*, Jin-Hwa Kim, Hwanjun Song, and Se-Young Yun

Other Publications

[C10] The BiGGen Bench: A Principled Benchmark for Fine-grained Evaluation of Language Models with Language Models (NAACL 2025) [link]

(19th author)

[C6] Carpe Diem: On the Evaluation of World Knowledge in Lifelong Language Models (NAACL 2024) [link]

Yujin Kim, Jaehong Yoon, Seonghyeon Ye, Sangmin Bae, Namgyu Ho, Sung Ju Hwang, and Se-Young Yun

[C5] HARE: Explainable Hate Speech Detection with Step-by-Step Reasoning (EMNLP 2023 Findings) [link]

Yongjin Yang*, Joonkee Kim*, Yujin Kim*, Namgyu Ho, James Thorne, and Se-Young Yun

[C2] ReFine: Re-Randomization before Fine-Tuning for Cross-Domain Few-Shot Learning (CIKM 2022) [link]

Jaehoon Oh*, Sungnyun Kim*, Namgyu Ho*, Jin-Hwa Kim, Hwanjun Song, and Se-Young Yun

[J2] Estimation of Cardiac Short Axis Slice Levels with a Cascaded Deep Convolutional and Recurrent Neural Network Model (Tomography 8, 2023) [link]

Namgyu Ho and Yoon-Chul Kim

[J1] Evaluation of transfer learning in deep convolutional neural network models for cardiac short axis slice classification (Scientific Reports 11, 2021) [link]

Namgyu Ho* and Yoon-Chul Kim*

[C1] Cardiac short-axis slice range classification via transfer learning: Evaluation of seven popular deep CNNs (ISMRM 2019) [link]

Namgyu Ho*, Yoon-Chul Kim*, and Yeon Hyeon Choe


Experience

Research Intern @ EXAONE Lab, LG AI Research (Dec 12, 2023 – Jun 14,2024) (Seoul)

Initiated and led research on the Block Transformer architecture for efficient inference [C7], with 9 authors including stakeholders and advisors at LG and KAIST and collaborators at Google DeepMind.

Research Intern @ EXAONE Lab, LG AI Research (Dec 12, 2023 – Jun 14, 2024) (Seoul)

Initiated and led the migration of the entire frontend of ZionTech’s flagship cloud service “Wavity” to the Angular framework; the migrated frontend has been in production since summer 2020.

(Unofficial) Undergraduate Researcher @ Samsung Medical Center (2018, 2020 – 2021) (Seoul)

Applied transfer learning with CNNs and RNNs to cardiac MRI classification, under the supervision of Prof. Yoon-Chul Kim.


Honors & Activities

Naver D2 Campus Fest 2019, 2nd Place (Jan 2019 – Feb 2019)

Open-source project using Python/Django

Alpha Sigma Nu (2018 – 2019)

The honor society of Jesuit colleges and universities

Microsoft Student Partners (2018/2019)

Hosted tech evangelism events at Sogang University

President @ Release (Student Organization) (Mar 2017 – Aug 2018)

CS student organization for software development at Sogang University. Hosted seminars, hackathons, etc., sponsored by Naver and Microsoft.


Scholarships

Silicon Valley Data Science Program, Sogang University (Aug 2017)

Academic Excellence Scholarship, Sogang University (Spring 2017)