Published: 2026/1/5 13:09:42

The Dark Side of DSM 👿 What on Earth Is the Homogeneity Trap⁉️

Ultra-short summary: They discovered the "Homogeneity Trap" that drags down deep learning model performance! A revolution for the IT industry 💥

✨ Gyaru-Style Sparkle Points ✨

● DSM (doubly-stochastic matrices), the stabilizing magic 🧙‍♀️, turned out to be a pitfall!
● They gave the cause of lost expressivity (how awesome the model is) a snappy name: the Homogeneity Trap 💖
● And they spell out concretely what to do so the IT industry can use AI even better 🌟



The Homogeneity Trap: Spectral Collapse in Doubly-Stochastic Deep Networks

Yizhi Liu

Doubly-stochastic matrices (DSM) are increasingly utilized in structure-preserving deep architectures -- such as Optimal Transport layers and Sinkhorn-based attention -- to enforce numerical stability and probabilistic interpretability. In this work, we identify a critical spectral degradation phenomenon inherent to these constraints, termed the Homogeneity Trap. We demonstrate that the maximum-entropy bias, typical of Sinkhorn-based projections, drives the mixing operator towards the uniform barycenter, thereby suppressing the subdominant singular value \sigma_2 and filtering out high-frequency feature components. We derive a spectral bound linking \sigma_2 to the network's effective depth, showing that high-entropy constraints restrict feature transformation to a shallow effective receptive field. Furthermore, we formally demonstrate that Layer Normalization fails to mitigate this collapse in noise-dominated regimes; specifically, when spectral filtering degrades the Signal-to-Noise Ratio (SNR) below a critical threshold, geometric structure is irreversibly lost to noise-induced orthogonal collapse. Our findings highlight a fundamental trade-off between entropic stability and spectral expressivity in DSM-constrained networks.
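The spectral suppression the abstract describes is easy to reproduce numerically: a doubly-stochastic matrix always has leading singular value 1, so the subdominant singular value σ₂ measures how much non-uniform structure survives the mixing step. The sketch below is not the authors' code — the 8×8 random score matrix, the iteration count, and the two temperature values are illustrative assumptions — but it shows the claimed trend: a high-entropy (large-temperature) Sinkhorn projection lands closer to the uniform barycenter and shrinks σ₂ relative to a low-entropy one.

```python
import numpy as np

def sinkhorn(K, iters=500):
    """Project a positive matrix onto the doubly-stochastic set by
    alternating row/column normalization (Sinkhorn-Knopp)."""
    P = K.astype(float).copy()
    for _ in range(iters):
        P /= P.sum(axis=1, keepdims=True)  # rows sum to 1
        P /= P.sum(axis=0, keepdims=True)  # columns sum to 1
    return P

rng = np.random.default_rng(0)
C = rng.normal(size=(8, 8))  # illustrative random score matrix

sigma2 = {}
for eps in (0.1, 10.0):  # low vs. high entropic temperature (assumed values)
    P = sinkhorn(np.exp(C / eps))
    s = np.linalg.svd(P, compute_uv=False)
    sigma2[eps] = s[1]  # subdominant singular value sigma_2
    print(f"eps={eps:>4}: sigma_2 = {s[1]:.4f}")
```

At the high temperature the projected matrix is nearly uniform, so σ₂ collapses toward 0 and repeated application of the operator filters out all but the constant component — the shallow effective receptive field the abstract derives.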

cs / cs.LG / cs.AI