Sung-Yub Kim

Staff Engineer @ AI Center, Samsung Electronics


sungyub.kim@mli.kaist.ac.kr

Seoul, South Korea

I’m a researcher on the LLM team @ AI Center, Samsung Electronics. I received my Ph.D. from KAIST, where I studied machine learning and deep learning theory.

Research Interests

  • Post-hoc Analysis & Adaptation of Foundation Models: Developing methods to analyze and adapt pre-trained large language models (LLMs) and vision-language models (VLMs) for downstream tasks without full retraining.
  • Explainability and Interpretability: Understanding the internal mechanisms of foundation models to improve transparency and trustworthiness.
  • Generalization via Loss Landscape: Investigating the generalization properties of deep neural networks through the lens of loss landscape geometry and optimization dynamics.
  • Efficient ML Systems: Exploring distributed training, sequence parallelism, and memory-efficient attention mechanisms for scaling large models.

Professional Timeline

Period          Role                       Organization
2024 – Present  Staff Engineer             AI Center, Samsung Electronics
2019 – 2024     Ph.D. in Machine Learning  KAIST

Collaboration

I’m always open to research collaborations, especially in the areas of LLM efficiency, interpretability, and distributed training. Feel free to reach out via email or connect on LinkedIn.

For more details, see my CV.


News

Jun 04, 2025 New blog series on FlashAttention (v1 → v2 → v3) published! Check out the series page.
Jun 02, 2025 Blog series on Sequence Parallelism completed: Ring Attention → Blockwise RingAttention → DeepSpeed Ulysses → USP. Read the series.
May 30, 2025 New blog series on Distributed Training started — covering Tensor Parallelism, Pipeline Parallelism, and Activation Recomputation. Read more.

Selected Publications

  1. ICLR
    LANTERN: Accelerating Visual Autoregressive Models with Relaxed Speculative Decoding
    Doohyuk Jang, Sihwan Park, June Yong Yang, Yeonsung Jung, Jihun Yun, Souvik Kundu, and 2 more authors
    In ICLR, Co-Corresponding Author, 2025
  2. NeurIPS
    GEX: A flexible method for approximating influence via Geometric Ensemble
    Sung-Yub Kim, Kyungsu Kim, and Eunho Yang
    In NeurIPS, 1st Author, 2023
  3. ICML
    RGE: A Repulsive Graph Rectification for Node Classification via Influence
    Jaeyun Song, Sung-Yub Kim, and Eunho Yang
    In ICML, Co-1st Author, 2023
  4. ICLR
    Scale-invariant Bayesian Neural Networks with Connectivity Tangent Kernel
    Sung-Yub Kim, Sihwan Park, Kyungsu Kim, and Eunho Yang
    In ICLR, 1st Author, 2023