Published: 2026-01-05 02:30:59

K-EXAONE Explained! Supercharging the IT Industry 🚀

Super-short summary: The multilingual AI model "K-EXAONE" is seriously impressive! It could shake up IT ✨

Gal-Style Sparkle Points ✨

● Made in Korea 🇰🇷! Multilingual support with a global outlook 🌎
● Handles long documents with ease, so you catch information at top speed 👀
● Limitless potential to change the future of the IT industry 💎

Detailed Explanation


K-EXAONE Technical Report

Eunbi Choi / Kibong Choi / Seokhee Hong / Junwon Hwang / Hyojin Jeon / Hyunjik Jo / Joonkee Kim / Seonghwan Kim / Soyeon Kim / Sunkyoung Kim / Yireun Kim / Yongil Kim / Haeju Lee / Jinsik Lee / Kyungmin Lee / Sangha Park / Heuiyeen Yeen / Hwan Chang / Stanley Jungkyu Choi / Yejin Choi / Jiwon Ham / Kijeong Jeon / Geunyeong Jeong / Gerrard Jeongwon Jo / Yonghwan Jo / Jiyeon Jung / Naeun Kang / Dohoon Kim / Euisoon Kim / Hayeon Kim / Hyosang Kim / Hyunseo Kim / Jieun Kim / Minu Kim / Myoungshin Kim / Unsol Kim / Youchul Kim / YoungJin Kim / Chaeeun Lee / Chaeyoon Lee / Changhun Lee / Dahm Lee / Edward Hwayoung Lee / Honglak Lee / Jinsang Lee / Jiyoung Lee / Sangeun Lee / Seungwon Lim / Solji Lim / Woohyung Lim / Chanwoo Moon / Jaewoo Park / Jinho Park / Yongmin Park / Hyerin Seo / Wooseok Seo / Yongwoo Song / Sejong Yang / Sihoon Yang / Chang En Yea / Sihyuk Yi / Chansik Yoon / Dongkeun Yoon / Sangyeon Yoon / Hyeongu Yun

This technical report presents K-EXAONE, a large-scale multilingual language model developed by LG AI Research. K-EXAONE is built on a Mixture-of-Experts architecture with 236B total parameters, activating 23B parameters during inference. It supports a 256K-token context window and covers six languages: Korean, English, Spanish, German, Japanese, and Vietnamese. We evaluate K-EXAONE on a comprehensive benchmark suite spanning reasoning, agentic, general, Korean, and multilingual abilities. Across these evaluations, K-EXAONE demonstrates performance comparable to open-weight models of similar size. K-EXAONE, designed to advance AI for a better life, is positioned as a powerful proprietary AI foundation model for a wide range of industrial and research applications.
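The abstract notes that K-EXAONE activates only 23B of its 236B parameters per inference step, which is the defining property of a Mixture-of-Experts (MoE) architecture: a gating network routes each token to a small subset of expert networks, so most parameters sit idle for any given token. The report does not disclose K-EXAONE's routing details, so the following is a minimal, hypothetical sketch of generic top-k MoE routing with toy dimensions, not the model's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration only; K-EXAONE's real expert count and
# hidden size are not stated in the abstract.
d_model, n_experts, top_k = 8, 10, 1

# One weight matrix per expert, plus a gating matrix that scores
# each token against every expert.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
gate_w = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ gate_w                            # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # chosen expert indices per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, top[t]]
        weights = np.exp(chosen - chosen.max())
        weights /= weights.sum()                   # softmax over selected experts only
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])      # only top_k experts run per token
    return out, top

x = rng.standard_normal((4, d_model))
y, routed = moe_forward(x)

# Each token exercises top_k of n_experts expert matrices, so the active
# parameter fraction is roughly top_k / n_experts -- the same idea behind
# activating 23B of 236B parameters at inference time.
active_frac = top_k / n_experts
```

In this toy setup only 1 of 10 expert matrices runs per token, mirroring (at a much smaller scale) how an MoE model keeps inference cost proportional to its active parameters rather than its total parameter count.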

cs / cs.CL / cs.AI