TL;DR: Tech that uses AI to make partial differential equation (PDE) computations super fast!
✨ Gal-Style Sparkle Points ✨
● Simulations get blazing fast, so development time shrinks! ✨
● No extra data needed, so you can use it right away 💖
● AI smartly helps out with the hard computations 😉
Now for the detailed explanation~!
Data-driven acceleration of scientific computing workflows has been a high-profile aim of machine learning (ML) for science, with numerical simulation of transient partial differential equations (PDEs) being one of the main applications. The focus thus far has been on methods that require classical simulations to train, which, combined with the data-hungriness and optimization challenges of neural networks, has made it difficult to demonstrate a convincing advantage over strong classical baselines. We consider an alternative paradigm in which the learner uses a classical solver's own data to accelerate it, enabling a one-shot speedup of the simulation. Concretely, since transient PDEs often require solving a sequence of related linear systems, the feedback from repeated calls to a linear solver such as preconditioned conjugate gradient (PCG) can be used by a bandit algorithm to online-learn an adaptive sequence of solver configurations (e.g. preconditioners). The method we develop, PCGBandit, is implemented directly on top of the popular open source software OpenFOAM, which we use to show its effectiveness on a set of fluid and magnetohydrodynamics (MHD) problems.
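PCGBandit itself is implemented inside OpenFOAM, so the following is only a toy sketch of the core idea in Python/SciPy: a UCB-style bandit picks between two preconditioner "arms" (no preconditioner vs. Jacobi) for a drifting sequence of related SPD linear systems, using the observed CG iteration count as the cost signal. All names here (`make_system`, `solve_with_arm`, `ucb_run`), the specific arms, and the UCB rule are illustrative assumptions, not the paper's actual algorithm or configuration space.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg


def make_system(n, t):
    # A simple SPD tridiagonal system whose diagonal drifts with time step t,
    # mimicking the sequence of related linear systems in a transient PDE.
    main = 2.0 + 0.1 * t + np.linspace(0.0, 1.0, n)
    off = -np.ones(n - 1)
    A = diags([off, main, off], [-1, 0, 1], format="csr")
    b = np.ones(n)
    return A, b


def solve_with_arm(A, b, arm):
    # Solve with CG; the iteration count (measured via callback) is the
    # bandit's per-solve cost. Arm 0: no preconditioner; arm 1: Jacobi.
    count = {"it": 0}

    def cb(xk):
        count["it"] += 1

    M = None if arm == 0 else diags(1.0 / A.diagonal())
    x, info = cg(A, b, M=M, callback=cb)
    return x, count["it"]


def ucb_run(n_steps=50, n=200, c=2.0):
    # Online-learn which preconditioner to use across the time-step sequence.
    n_arms = 2
    counts = np.zeros(n_arms)
    mean_cost = np.zeros(n_arms)
    choices = []
    for t in range(n_steps):
        if t < n_arms:
            arm = t  # play each arm once to initialize
        else:
            # UCB on negative cost: favor low mean iteration count,
            # plus an exploration bonus for rarely tried arms.
            bonus = c * np.sqrt(np.log(t + 1) / counts)
            arm = int(np.argmax(-mean_cost + bonus))
        A, b = make_system(n, t)
        _, its = solve_with_arm(A, b, arm)
        counts[arm] += 1
        mean_cost[arm] += (its - mean_cost[arm]) / counts[arm]
        choices.append(arm)
    return choices, mean_cost
```

In this sketch the arm whose running mean iteration count stays lowest gets chosen most often as the simulation proceeds, which is the one-shot, no-training-data flavor of speedup the abstract describes: the solver's own feedback during a single run drives the adaptation.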