Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge
Paper ID: 2602.15184
Published at: ICLR 2026
Authors: Siying Ma, Mehrdad M. Zadeh, Mauricio Soroco, Wuyang Chen, Jiguo Cao, Vijay Ganesh
Affiliations: Simon Fraser University; Georgia Institute of Technology
Project Page: https://sites.google.com/view/sciml-fundemental-pde
We propose a multiphysics training framework that jointly learns from both original PDEs and their simplified basic forms (decomposed fundamental physics terms). This approach improves data efficiency, long-term physical consistency, and out-of-distribution generalization across 1D/2D/3D PDE problems. The method is architecture-agnostic and yields consistent improvements in normalized root-mean-square error (nRMSE).
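As a rough illustration of the joint-training idea, the sketch below combines a loss on the original (full) PDE data with a loss on the decomposed basic-form data. This is only a minimal sketch under stated assumptions, not the paper's implementation: the function name `joint_multiphysics_loss`, the weighting parameter `alpha`, and the use of nRMSE as the per-task loss are all illustrative choices, not taken from the source.

```python
import numpy as np

def nrmse(pred, target, eps=1e-12):
    """Normalized root-mean-square error between prediction and target."""
    return np.sqrt(np.mean((pred - target) ** 2)) / (np.sqrt(np.mean(target ** 2)) + eps)

def joint_multiphysics_loss(pred_full, target_full, pred_basic, target_basic, alpha=0.5):
    """Hypothetical joint objective: a weighted sum of the loss on the
    original PDE solutions and the loss on the simplified basic-form
    (decomposed fundamental physics) solutions. `alpha` trades off the
    two terms; the actual framework's weighting may differ."""
    return alpha * nrmse(pred_full, target_full) + (1 - alpha) * nrmse(pred_basic, target_basic)
```

Because the objective only wraps whatever per-task loss the operator is trained with, it is architecture-agnostic in the same spirit as the framework described above: any neural operator producing `pred_full` and `pred_basic` can be plugged in.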
@inproceedings{ma2026learning,
  title={Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge},
  author={Ma, Siying and Zadeh, Mehrdad M. and Soroco, Mauricio and Chen, Wuyang and Cao, Jiguo and Ganesh, Vijay},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2026}
}
License: MIT