Fundamental Physics Neural Operators

Paper: Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge
Published at: ICLR 2026
Authors: Siying Ma, Mehrdad M. Zadeh, Mauricio Soroco, Wuyang Chen, Jiguo Cao, Vijay Ganesh
Affiliations: Simon Fraser University, Georgia Institute of Technology
Project Page: https://sites.google.com/view/sciml-fundemental-pde

Overview

We propose a multiphysics training framework that jointly learns from original PDEs and their simplified basic forms (decomposed fundamental physics terms). This approach improves data efficiency, long-term physical consistency, and out-of-distribution generalization on 1D, 2D, and 3D PDE problems. The method is architecture-agnostic and yields consistent nRMSE reductions across the neural operators we evaluate.
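The joint training idea above can be sketched as a combined objective: a supervised loss on the original PDE plus a weighted loss on its decomposed fundamental terms. This is a minimal illustration, not the authors' code; the function names, the MSE choice, and the weighting `lam` are assumptions for the sketch.

```python
import numpy as np

def multiphysics_loss(pred_full, y_full, pred_basic, y_basic, lam=0.5):
    """Combined objective for multiphysics training (illustrative sketch).

    pred_full / y_full:  operator predictions and targets for the original PDE
    pred_basic / y_basic: predictions and targets for the simplified basic form
    lam: weight on the fundamental-physics term (assumed value, not from the paper)
    """
    mse = lambda a, b: float(np.mean((np.asarray(a) - np.asarray(b)) ** 2))
    return mse(pred_full, y_full) + lam * mse(pred_basic, y_basic)

# Toy usage with random arrays standing in for solution fields
rng = np.random.default_rng(0)
u_full, t_full = rng.normal(size=(8, 32)), rng.normal(size=(8, 32))
u_basic, t_basic = rng.normal(size=(8, 32)), rng.normal(size=(8, 32))
loss = multiphysics_loss(u_full, t_full, u_basic, t_basic)
print(loss)
```

In practice this scalar would be minimized with any gradient-based optimizer over the operator's parameters; because the loss only composes two supervised terms, it applies unchanged to any backbone, which is what makes the framework architecture-agnostic.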

Citation

@inproceedings{ma2026learning,
  title={Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge},
  author={Ma, Siying and Zadeh, Mehrdad M. and Soroco, Mauricio and Chen, Wuyang and Cao, Jiguo and Ganesh, Vijay},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2026}
}

License

MIT


Paper for delta-lab-ai/fundamental-physics-neural-operators