Published: 2025/12/17 6:27:17

Save memory! Continual-learning SketchOGD is born 💖

It learns multiple tasks smartly and saves memory too — isn't SketchOGD amazing? ✨

Gal-style sparkle points ✨

● Say bye-bye to the catastrophic forgetting problem 👋
● Squeeze memory usage down with matrix sketching 💰
● Usable across all kinds of services — it widens AI's possibilities 🚀

Detailed explanation

Read the rest in the「らくらく論文」app

SketchOGD: Memory-Efficient Continual Learning

Youngjae Min / Benjamin Wright / Jeremy Bernstein / Navid Azizan

When machine learning models are trained continually on a sequence of tasks, they are often liable to forget what they learned on previous tasks, a phenomenon known as catastrophic forgetting. Proposed solutions to catastrophic forgetting tend to involve storing information about past tasks, meaning that memory usage is a chief consideration in determining their practicality. This paper develops a memory-efficient solution to catastrophic forgetting using the idea of matrix sketching, in the context of a simple continual learning algorithm known as orthogonal gradient descent (OGD). OGD finds weight updates that aim to preserve performance on prior datapoints, using gradients of the model on those datapoints. However, since the memory cost of storing prior model gradients grows with the runtime of the algorithm, OGD is ill-suited to continual learning over long time horizons. To address this problem, we propose SketchOGD. SketchOGD employs an online sketching algorithm to compress model gradients as they are encountered into a matrix of a fixed, user-determined size. In contrast to existing memory-efficient variants of OGD, SketchOGD runs online without the need for advance knowledge of the total number of tasks, is simple to implement, and is more amenable to analysis. We provide theoretical guarantees on the approximation error of the relevant sketches under a novel metric suited to the downstream task of OGD. Experimentally, we find that SketchOGD tends to outperform current state-of-the-art variants of OGD given a fixed memory budget.
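To make the two ingredients of the abstract concrete, here is a minimal NumPy sketch (not the paper's actual implementation) of the core idea: gradients are streamed into a fixed-size matrix sketch (Frequent Directions is used here as an illustrative online sketching algorithm; the paper analyzes its own sketch variants), and an OGD-style step projects a candidate weight update onto the orthogonal complement of the sketched gradient directions. All names (`fd_update`, `ogd_project`, the sketch size `ell`) are my own for illustration.

```python
import numpy as np

def fd_update(sketch, grad):
    """Stream one model gradient (a length-d vector) into a fixed
    ell-by-d Frequent Directions sketch. When the sketch is full,
    an SVD shrink frees rows so memory never grows with runtime."""
    ell, d = sketch.shape
    empty = np.where(~sketch.any(axis=1))[0]
    if len(empty) == 0:
        # Sketch is full: shrink singular values so the smallest
        # directions become exact zero rows, then reuse them.
        _, s, Vt = np.linalg.svd(sketch, full_matrices=False)
        s2 = np.maximum(s**2 - s[ell // 2] ** 2, 0.0)
        sketch = np.sqrt(s2)[:, None] * Vt
        empty = np.where(~sketch.any(axis=1))[0]
    sketch[empty[0]] = grad
    return sketch

def ogd_project(update, sketch):
    """OGD-style step: remove from a candidate update every component
    lying in the span of the sketched past gradients, so the step is
    (approximately) orthogonal to directions that mattered before."""
    rows = sketch[sketch.any(axis=1)]
    if rows.size == 0:
        return update
    Q, _ = np.linalg.qr(rows.T)  # orthonormal basis of the row space
    return update - Q @ (Q.T @ update)

# Usage: stream 7 gradients through a 4-row sketch in d = 12 dims,
# then project a new update against the compressed history.
d, ell = 12, 4
rng = np.random.default_rng(0)
S = np.zeros((ell, d))
for _ in range(7):
    S = fd_update(S, rng.normal(size=d))
g = rng.normal(size=d)
p = ogd_project(g, S)
```

The point of the sketch is visible in the shapes: however many gradients arrive, `S` stays `ell × d`, whereas vanilla OGD would store one row per gradient.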

cs / cs.LG / cs.AI / stat.ML