Academic Paper

Jiachen Yao, Chang Su

About the Authors

Jiachen Yao is an undergraduate (Class of 2020) in Information and Computing Science at Zhili College. He has received honors including the National Scholarship, the Science and Technology Innovation Excellence Scholarship, and the Zhili College Dean's Award, and was selected for the 16th cohort of the Spark Program and the Stanford UGVR summer research program. His research interests are physics-inspired machine learning and representation learning.

Chang Su is an undergraduate (Class of 2020) in Information and Computing Science at Zhili College. He has received honors including the National Scholarship, the Tsinghua University Comprehensive Excellence Scholarship, the Academic Excellence Scholarship, the Tsinghua University Outstanding Student Leader award, and a gold medal in the ICPC Asia Nanjing Regional Contest. His research interest is physics-inspired machine learning.

Citation

Yao, J.#, Su, C.#, Hao, Z., Liu, S., Su, H., & Zhu, J.* (2023, July). MultiAdam: Parameter-wise scale-invariant optimizer for multiscale training of physics-informed neural networks. International Conference on Machine Learning, 39702-39721.

Abstract

Physics-informed Neural Networks (PINNs) have recently achieved remarkable progress in solving Partial Differential Equations (PDEs) in various fields by minimizing a weighted sum of PDE loss and boundary loss. However, there are several critical challenges in the training of PINNs, including the lack of theoretical frameworks and the imbalance between PDE loss and boundary loss. In this paper, we present an analysis of second-order non-homogeneous PDEs, which are classified into three categories and applicable to various common problems. We also characterize the connections between the training loss and actual error, guaranteeing convergence under mild conditions. The theoretical analysis inspires us to further propose MultiAdam, a scale-invariant optimizer that leverages gradient momentum to balance the loss terms parameter-wise. Extensive experimental results on multiple problems from different physical domains demonstrate that our MultiAdam solver can improve the predictive accuracy by 1-2 orders of magnitude compared with strong baselines.
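The core idea described in the abstract, keeping separate gradient moments for each loss term so that their very different magnitudes do not drown each other out, can be illustrated with a short sketch. This is not the authors' exact algorithm, only a minimal NumPy illustration of a MultiAdam-style update in which each loss term (e.g. PDE loss and boundary loss) gets its own Adam-style first and second moments, and the per-term normalized updates are averaged; the function name `multiadam_step` and the `state` layout are hypothetical.

```python
import numpy as np

def multiadam_step(params, grads_per_loss, state, lr=1e-3,
                   beta1=0.9, beta2=0.999, eps=1e-8):
    """One MultiAdam-style update (illustrative sketch only).

    Each loss term keeps its own Adam moments, so its gradient is
    normalized by its own second moment. The normalized updates are
    then averaged, making the combined step insensitive to the raw
    scale of any single loss term.
    """
    state["t"] += 1
    t = state["t"]
    update = np.zeros_like(params)
    for i, g in enumerate(grads_per_loss):
        # Per-loss-term exponential moving averages of g and g^2.
        m = beta1 * state["m"][i] + (1 - beta1) * g
        v = beta2 * state["v"][i] + (1 - beta2) * g * g
        state["m"][i], state["v"][i] = m, v
        # Standard Adam bias correction.
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        # Each term contributes a unit-scale, sign-like direction.
        update += m_hat / (np.sqrt(v_hat) + eps)
    # Average so every loss term has equal weight in the step.
    return params - lr * update / len(grads_per_loss)
```

Because each gradient is divided by the square root of its own second moment, a boundary-loss gradient of magnitude 1e-6 and a PDE-loss gradient of magnitude 1e6 produce updates of comparable size, which is the scale-invariance property the abstract refers to.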