Published: 2025/11/7 18:13:01

Tabular-data AI "Orion-MSP" is born! ✨ Super-efficient ICL that changes the future!

TL;DR: Orion-MSP, an AI for tabular data (you know, fancy table-shaped data!), learns blazing fast with high accuracy, so business is about to get a mega upgrade~!

🌟 Gal-style sparkle points ✨

● Multi-scale processing means it really gets the deep relationships in your data! E.g., it can look closely at things like the link between age and income ♪
● Sparse attention cuts the compute, so training is lightning fast! Huge datasets? No problem ☆
● Perceiver memory shares learned information across components for an accuracy boost! Smart and cute, basically unbeatable, right?
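To make the sparse-attention point concrete, here's a minimal NumPy sketch of a block-sparse attention mask combining windowed, global, and random patterns, the three ingredients named in the paper. This is an illustration of the general idea, not Orion-MSP's actual implementation (the function name and parameters are invented for this example):

```python
import numpy as np

def block_sparse_mask(n_tokens, window=2, n_global=1, n_random=1, seed=0):
    """Boolean attention mask mixing windowed, global, and random links.

    Illustrative sketch of block-sparse attention: each token attends to
    a local window plus a few random long-range tokens, while a small set
    of global tokens attends to (and is attended by) everything.
    """
    rng = np.random.default_rng(seed)
    mask = np.zeros((n_tokens, n_tokens), dtype=bool)
    for i in range(n_tokens):
        # Windowed pattern: attend to nearby tokens (including self).
        lo, hi = max(0, i - window), min(n_tokens, i + window + 1)
        mask[i, lo:hi] = True
        # Random pattern: a few long-range links keep the graph connected.
        mask[i, rng.choice(n_tokens, size=n_random, replace=False)] = True
    # Global pattern: designated tokens see everything and are seen by all.
    mask[:n_global, :] = True
    mask[:, :n_global] = True
    return mask

mask = block_sparse_mask(8)
print(mask.sum(), "of", mask.size, "entries are attended")
```

Because each row has only O(window + n_global + n_random) true entries, attention cost grows roughly linearly in table width instead of quadratically, which is the scaling benefit the summary above is excited about.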

Now for the detailed breakdown~!

Read the rest in the 「らくらく論文」 app

Orion-MSP: Multi-Scale Sparse Attention for Tabular In-Context Learning

Mohamed Bouadi / Pratinav Seth / Aditya Tanna / Vinay Kumar Sankarapu

Tabular data remain the predominant format for real-world applications. Yet, developing effective neural models for tabular data remains challenging due to heterogeneous feature types and complex interactions occurring at multiple scales. Recent advances in tabular in-context learning (ICL), such as TabPFN and TabICL, have achieved state-of-the-art performance comparable to gradient-boosted trees (GBTs) without task-specific fine-tuning. However, current architectures exhibit key limitations: (1) single-scale feature processing that overlooks hierarchical dependencies, (2) dense attention with quadratic scaling in table width, and (3) strictly sequential component processing that prevents iterative representation refinement and cross-component communication. To address these challenges, we introduce Orion-MSP, a tabular ICL architecture featuring three key innovations: (1) multi-scale processing to capture hierarchical feature interactions; (2) block-sparse attention combining windowed, global, and random patterns for scalable efficiency and long-range connectivity; and (3) a Perceiver-style memory enabling safe bidirectional information flow across components. Across diverse benchmarks, Orion-MSP matches or surpasses state-of-the-art performance while scaling effectively to high-dimensional tables, establishing a new standard for efficient tabular in-context learning. The model is publicly available at https://github.com/Lexsi-Labs/Orion-MSP .
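The abstract's third innovation, a Perceiver-style memory, boils down to a small fixed set of latent slots cross-attending over all tokens so that components can exchange information cheaply. Here's a hedged NumPy sketch of that read step (names like `perceiver_read` are invented here; this is the generic Perceiver cross-attention pattern, not the paper's code):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def perceiver_read(latents, tokens):
    """Cross-attention read: a few latent memory slots attend over all
    input tokens, compressing them into a fixed-size shared memory.

    Cost is O(n_latents * n_tokens), so it stays cheap even for wide
    tables; downstream components can then read from the same memory,
    enabling the bidirectional information flow the abstract describes.
    """
    d = latents.shape[-1]
    scores = latents @ tokens.T / np.sqrt(d)   # (n_latents, n_tokens)
    return softmax(scores) @ tokens            # (n_latents, d)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(100, 16))   # e.g. 100 column/sample embeddings
latents = rng.normal(size=(8, 16))    # 8 shared memory slots
memory = perceiver_read(latents, tokens)
print(memory.shape)
```

The key design point is that the memory size is fixed regardless of table width, so it acts as a bottleneck that components can safely write to and read from without quadratic cross-component attention.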

cs / cs.AI / cs.LG