Jinuk Kim
Hello!
I am a third-year PhD student in the Machine Learning Lab, Department of Computer Science, Seoul National University, advised by Hyun Oh Song.
My research interests lie in building efficient machine learning systems by solving tractable discrete and continuous optimization problems.
I received my Bachelor's degree in Statistics from Seoul National University in 2023.
CV / Scholar / Github / Twitter / Blog / LinkedIn
Updates
- May 2025 One paper accepted to ICML 2025 (GuidedQuant).
- Aug 2024 I will be joining Google as a Student Researcher.
- May 2024 One paper accepted to ICML 2024 (LayerMerge).
- Aug 2023 I will be joining Samsung Advanced Institute of Technology as a Research Intern.
Research Highlights
- GuidedQuant proposes an improved objective function and quantization method for large language models.
GuidedQuant: Large Language Model Quantization via Exploiting End Loss Guidance
Jinuk Kim, Marwa El Halabi, Wonpyo Park, Clemens JS Schaefer, Deokjae Lee, Yeonhong Park, Jae W. Lee, Hyun Oh Song
ICML, 2025
Paper | Code | Bibtex
We propose GuidedQuant, a novel quantization approach that integrates gradient information from the end loss into the layer-wise quantization objective. Additionally, we introduce LNQ, a non-uniform scalar quantization algorithm which is guaranteed to monotonically decrease the quantization objective value.
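A minimal sketch of the general idea in NumPy (the names and the exact weighting below are illustrative assumptions, not the paper's formulation): the standard layer-wise reconstruction objective is reweighted by per-output saliency derived from end-loss gradients.

import numpy as np

# Sketch: gradient-guided layer-wise quantization objective.
# W: original weights (out x in), W_q: quantized weights,
# X: calibration inputs (in x tokens),
# g: per-output saliency derived from end-loss gradients (out,).
def plain_objective(W, W_q, X):
    E = (W - W_q) @ X              # layer output reconstruction error
    return np.sum(E ** 2)          # standard ||W X - W_q X||_F^2

def guided_objective(W, W_q, X, g):
    E = (W - W_q) @ X
    # Outputs the end loss is sensitive to contribute more to the objective.
    return np.sum(g[:, None] * E ** 2)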
LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging
Jinuk Kim, Marwa El Halabi, Mingi Ji, Hyun Oh Song
ICML, 2024
Paper | Code | Project page | Poster | Bibtex
We propose LayerMerge, a novel depth compression method that selects which activation layers and convolution layers to remove in order to achieve a desired inference speed-up while minimizing performance loss.
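As a toy illustration of the underlying selection problem (this knapsack-style search is my own simplification, not the paper's algorithm; the latency and importance inputs are hypothetical), layer removal can be framed as choosing which layers to drop under a latency target:

# Toy sketch: pick layers to drop so the saved latency meets a target
# while the summed importance (a proxy for accuracy loss) is minimized.
def select_layers_to_drop(latency_saved, importance, target):
    best = {0: (0.0, [])}  # latency saved -> (min importance loss, layers)
    for i, (lat, imp) in enumerate(zip(latency_saved, importance)):
        for saved, (loss, layers) in list(best.items()):
            key, cand = saved + lat, (loss + imp, layers + [i])
            if key not in best or cand[0] < best[key][0]:
                best[key] = cand
    feasible = [v for k, v in best.items() if k >= target]
    return min(feasible)[1] if feasible else None

# e.g., returns [2, 3]: saves 5 latency units at importance cost 0.6
select_layers_to_drop([3, 2, 4, 1], [0.9, 0.2, 0.5, 0.1], target=5)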
Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming
Jinuk Kim*, Yeonwoo Jeong*, Deokjae Lee, Hyun Oh Song
ICML, 2023
Paper | Code | Blog | Bibtex
We propose a subset selection problem that replaces inefficient activation layers with identity functions and optimally merges consecutive convolution operations into shallow equivalent convolution operations to reduce inference latency.
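The merging step can be illustrated concretely for the simplest case: two consecutive convolutions with no nonlinearity in between compose into a single equivalent convolution. The PyTorch sketch below (my toy example with 1x1 kernels; the paper handles general kernel sizes) checks the equivalence numerically.

import torch
import torch.nn as nn

# Two consecutive 1x1 convolutions with no activation in between...
c_in, c_mid, c_out = 8, 16, 4
conv1 = nn.Conv2d(c_in, c_mid, kernel_size=1)
conv2 = nn.Conv2d(c_mid, c_out, kernel_size=1)

# ...collapse into one: y = W2 (W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2).
merged = nn.Conv2d(c_in, c_out, kernel_size=1)
with torch.no_grad():
    W1 = conv1.weight.squeeze(-1).squeeze(-1)   # (c_mid, c_in)
    W2 = conv2.weight.squeeze(-1).squeeze(-1)   # (c_out, c_mid)
    merged.weight.copy_((W2 @ W1).unsqueeze(-1).unsqueeze(-1))
    merged.bias.copy_(W2 @ conv1.bias + conv2.bias)

x = torch.randn(1, c_in, 32, 32)
assert torch.allclose(conv2(conv1(x)), merged(x), atol=1e-5)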
Dataset Condensation via Efficient Synthetic-Data Parameterization
Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
ICML, 2022
Paper | Code | Bibtex
We propose a novel condensation framework that generates multiple synthetic examples within a limited storage budget via an efficient parameterization that exploits data regularity, together with an effective optimization technique.
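One way to picture the parameterization (a rough sketch under my own assumptions about the factor and shapes, not the paper's exact design): store the synthetic data at reduced resolution so the same pixel budget decodes into several training examples.

import torch
import torch.nn.functional as F

budget_images, c, h, w = 10, 3, 32, 32
factor = 2  # each stored low-res image stands in for part of a full image

# Same pixel budget as 10 full-resolution images, stored as 40 half-res ones.
synthetic = torch.randn(budget_images * factor ** 2, c, h // factor,
                        w // factor, requires_grad=True)

def decode(params):
    # Upsample the stored parameters back to the training resolution.
    return F.interpolate(params, size=(h, w), mode="bilinear",
                         align_corners=False)

train_images = decode(synthetic)  # (40, 3, 32, 32) from a 10-image budget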
SNU Board
Code
An Android/iOS service that collects notices from SNU department websites and gathers them in one place (Android / Aug 2021 / 100+ MAU / 80+ WAU / 1,000+ downloads).
Academic Services
- Reviewer: TMLR (2024-), ICML (2025-), NeurIPS (2025-)