Published: 2025/12/17 7:54:27

Blazing-Fast DNNs! OCSPruner Sparks an IT Revolution 💥

Ultra-short summary: Compress DNNs in a single training cycle and make the IT industry happy! 🚀

🌟 Gyaru-Style Sparkle Points ✨

● One-cycle training is divine! 💖 It's a time-saver that does training and pruning in a single pass!
● It runs even on low-spec devices, so it's perfect for smartphones and IoT 📱✨
● Lower development costs and faster time to market! The future of IT companies is looking seriously bright, right? 😎

On to the detailed explanation~!


One-Cycle Structured Pruning via Stability-Driven Subnetwork Search

Deepak Ghimire / Dayoung Kil / Seonghwan Jeong / Jaesik Park / Seong-heum Kim

Existing structured pruning methods typically rely on multi-stage training procedures that incur high computational costs. Pruning at initialization aims to reduce this burden but often suffers from degraded performance. To address these limitations, we propose an efficient one-cycle structured pruning framework that integrates pre-training, pruning, and fine-tuning into a single training cycle without sacrificing accuracy. The key idea is to identify an optimal sub-network during the early stages of training, guided by norm-based group saliency criteria and structured sparsity regularization. We introduce a novel pruning indicator that detects a stable pruning epoch by measuring the similarity between pruning sub-networks across consecutive training epochs. In addition, group sparsity regularization accelerates convergence, further reducing overall training time. Extensive experiments on CIFAR-10, CIFAR-100, and ImageNet using VGG, ResNet, and MobileNet architectures demonstrate that the proposed method achieves state-of-the-art accuracy while being among the most efficient structured pruning frameworks in terms of training cost. Code is available at https://github.com/ghimiredhikura/OCSPruner.
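To make the pipeline concrete, here is a minimal PyTorch sketch of the two mechanisms the abstract names: norm-based group saliency with a stability indicator that compares candidate sub-networks across consecutive epochs, plus a group-sparsity penalty added to the training loss. The function names, the Jaccard similarity metric, and the 0.98 threshold are illustrative assumptions, not details taken from the paper; see the linked repository for the authors' actual implementation.

```python
# Hypothetical sketch of stability-driven one-cycle pruning: rank filters by an
# L2 (norm-based) group saliency, form a candidate sub-network mask each epoch,
# and declare a "stable pruning epoch" once consecutive masks overlap strongly.
import torch
import torch.nn as nn


def filter_saliency(conv: nn.Conv2d) -> torch.Tensor:
    """L2 norm of each output filter's weight group (norm-based group saliency)."""
    return conv.weight.detach().flatten(1).norm(p=2, dim=1)


def candidate_mask(model: nn.Module, prune_ratio: float) -> set:
    """Keep the top-(1 - prune_ratio) filters per conv layer; return kept-filter IDs."""
    kept = set()
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            scores = filter_saliency(module)
            n_keep = max(1, int(round(scores.numel() * (1.0 - prune_ratio))))
            for idx in scores.topk(n_keep).indices.tolist():
                kept.add((name, idx))
    return kept


def mask_similarity(prev: set, curr: set) -> float:
    """Jaccard overlap between consecutive epochs' candidate sub-networks (assumed metric)."""
    if not prev and not curr:
        return 1.0
    return len(prev & curr) / len(prev | curr)


def group_sparsity_penalty(model: nn.Module, lam: float = 1e-4) -> torch.Tensor:
    """Group-lasso style regularizer: sum of per-filter L2 norms, added to the loss."""
    penalty = sum(m.weight.flatten(1).norm(p=2, dim=1).sum()
                  for m in model.modules() if isinstance(m, nn.Conv2d))
    return lam * penalty


# Usage inside a single training cycle (0.98 threshold is an assumption):
# prev_mask = None
# for epoch in range(num_epochs):
#     train_one_epoch(model)              # standard training step (not shown)
#     curr_mask = candidate_mask(model, prune_ratio=0.5)
#     if prev_mask is not None and mask_similarity(prev_mask, curr_mask) > 0.98:
#         prune_to(model, curr_mask)      # hypothetical: materialize the sub-network
#         break                           # then fine-tune for the remaining epochs
#     prev_mask = curr_mask
```

The design point worth noting is that pruning is triggered by mask stability rather than a fixed schedule, which is what lets pre-training, pruning, and fine-tuning share a single training cycle.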

cs / cs.CV / cs.AI / cs.LG