
NeuralOperator: Learning in Infinite Dimensions

NeuralOperator is an open-source PyTorch library that implements neural operator architectures (notably Fourier Neural Operators) for learning mappings between function spaces. It targets scientific machine learning tasks such as PDE modeling and provides resolution-invariant operators, tensorized (Tucker) variants for parameter efficiency, and ready-to-use training scripts and examples.

Introduction

NeuralOperator is a comprehensive PyTorch library for learning neural operators, with official implementations of Fourier Neural Operators (FNO) and related operator-learning architectures. Neural operators learn mappings between function spaces (e.g., inputs and solutions of PDEs) rather than mappings between finite-dimensional vectors; this enables models trained at one spatial/temporal resolution to generalize to different resolutions (resolution invariance).

Key features

  • Official implementations of Fourier Neural Operators (FNO) and other operator architectures.
  • Resolution-invariant operator learning: trained operators can be applied at different grid resolutions without retraining.
  • Tensorization support (e.g., Tucker factorization) to reduce parameter count and improve efficiency (TFNO variants).
  • Integration with the PyTorch ecosystem, plus ready-to-run training scripts and demo problems.
  • Documentation site with guides and practical notes, and recommended citation entries for academic use.
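The resolution invariance described above comes from operating in a truncated Fourier basis: a spectral layer keeps only a fixed number of low-frequency modes, so the same learned weights apply at any grid size. The following is an illustrative sketch of that idea in plain PyTorch (not the library's actual implementation; all names here are for exposition only):

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Toy spectral convolution: mix channels on a fixed set of Fourier modes."""

    def __init__(self, channels, n_modes):
        super().__init__()
        self.n_modes = n_modes
        # learned complex weights for the retained low-frequency modes
        scale = 1.0 / channels
        self.weight = torch.nn.Parameter(
            scale * torch.randn(channels, channels, n_modes, dtype=torch.cfloat)
        )

    def forward(self, x):                 # x: (batch, channels, n_points)
        x_ft = torch.fft.rfft(x)          # to frequency domain
        out_ft = torch.zeros_like(x_ft)
        m = min(self.n_modes, x_ft.shape[-1])
        # channel mixing restricted to the first m modes
        out_ft[:, :, :m] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :m], self.weight[:, :, :m]
        )
        return torch.fft.irfft(out_ft, n=x.shape[-1])  # back to physical space

layer = SpectralConv1d(channels=4, n_modes=8)
coarse = layer(torch.randn(2, 4, 64))    # 64-point grid
fine = layer(torch.randn(2, 4, 256))     # 256-point grid, same weights
```

Because the layer only ever touches the first few modes, the identical parameters act on both the coarse and the fine discretization; this is the mechanism FNO-style architectures build on.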

Quickstart (example)

from neuralop.models import FNO
 
operator = FNO(n_modes=(64, 64),
               hidden_channels=64,
               in_channels=2,
               out_channels=1)

Tensorized variant example (Tucker TFNO):

from neuralop.models import TFNO
 
operator = TFNO(n_modes=(64, 64),
                hidden_channels=64,
                in_channels=2,
                out_channels=1,
                factorization='tucker',
                implementation='factorized',
                rank=0.1)

Installation
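The library is distributed on PyPI (the package name `neuraloperator` is assumed here; check the repository's README for the current install instructions):

```shell
# stable release from PyPI
pip install neuraloperator

# or the development version from GitHub
pip install git+https://github.com/neuraloperator/neuraloperator
```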

Who is it for

NeuralOperator is aimed at researchers and engineers working on scientific machine learning, numerical simulation, and physics-informed ML who need models that generalize across discretizations. It is suitable for experimenting with operator-learning architectures, benchmarking against PDE problems, and integrating efficient tensorized variants for large-scale problems.

Citations and further reading

The repository provides recommended citation entries (including the 2023 JMLR neural operator paper and subsequent library/guides) and links to practical guides and documentation for deeper understanding and reproducible experiments.

Information

  • Website: github.com
  • Authors: NeuralOperator (GitHub organization)
  • Published date: 2020/10/13