About Me

I am an NLP researcher at Kakao Brain. I received a Master's degree from the KAIST Graduate School of Artificial Intelligence, under the advisement of Jaegul Choo. Before that, I obtained a Bachelor's degree in Business and Technology Management and Industrial Engineering at KAIST.

I am interested in addressing trustworthy and practical AI/ML challenges in the real world. My recent efforts center on enhancing the coding and reasoning capabilities of LLMs and employing LLMs as tool agents. In the past, I have contributed to projects addressing large-scale foundation model training, in-context learning optimization, fair and robust language generation, multimodal question-answering, and unbiased representation learning. I am also interested in the robustness and fairness of language models, knowledge-augmented LMs, and multimodal generative AI.

Latest News

Sep 2023   🎤 Invited talk at Singapore Management University
Jun 2023   🚀 Promoted to Project Leader at Kakao Brain
Jan 2023   🏆 Paper accepted to EACL 2023 findings
Aug 2022   🏆 Paper accepted to WACV 2023
Feb 2022   🎓 Received my Master's degree in Artificial Intelligence
Oct 2021   🚀 Working at Kakao Brain as an AI Research Scientist
Oct 2021   🏆 Paper accepted to NeurIPS 2021 as an Oral presentation
Jul 2021   🏆 Paper accepted to ICCV 2021
Jul 2021   📍 Starting my internship at Kakao Brain
Aug 2020   📍 Working at Naver Papago as a Collaborative Researcher
Aug 2020   🎓 Graduated cum laude with a Bachelor's degree in Industrial Engineering & Business Management
Aug 2020   🚀 Working at Classum as a Data Analyst & Marketer

Publications

BiaSwap: Removing Dataset Bias with Bias-Tailored Swapping Augmentation
Eungyeup Kim*, Jihyeon Lee*, Jaegul Choo
International Conference on Computer Vision (ICCV), 2021
[Paper]

Learning Debiased Representation via Disentangled Feature Augmentation
Jungsoo Lee*, Eungyeup Kim*, Juyoung Lee, Jihyeon Lee, Jaegul Choo
Conference on Neural Information Processing Systems (NeurIPS), 2021, Oral Presentation (< 1% acceptance rate)
[Paper] [Code]

Dense but Efficient VideoQA for Intricate Compositional Reasoning
Jihyeon Lee*, Wooyoung Kang*, Eunsol Kim
Winter Conference on Applications of Computer Vision (WACV), 2023
[Paper]

PePe: Personalized Post-editing Model utilizing User-generated Post-edits
Jihyeon Lee*, Taehee Kim*, Yunwon Tae*, Chunbok Park, Jaegul Choo
Findings of the European Chapter of the Association for Computational Linguistics (EACL), 2023
[Paper]

Exploiting the Potential of Seq2Seq Models as Robust Few-Shot Learners
Jihyeon Lee*, Dain Kim*, Doohae Jung*, Boseop Kim, Kyoung-Woon On
arXiv preprint
[Paper]