ENGLISH | 简体中文
Partial differential equations (PDEs) are closely related to numerous physical phenomena and engineering applications, covering fields such as airfoil design, electromagnetic field simulation, and stress analysis. In these applications, PDEs often need to be solved repeatedly. Traditional numerical algorithms are highly accurate, but they typically consume a significant amount of computational resources and time. Neural operator methods based on deep learning, proposed in recent years, have greatly improved the speed of solving PDEs. However, they struggle to generalize to new forms of PDEs, and often suffer from high training costs and limited training data.
We developed the PDEformer model series to address these issues. This is a class of end-to-end solution prediction models that can directly handle nearly any form of PDE, eliminating the need for customized architecture design and training for each new PDE, thereby significantly reducing deployment costs and improving solution efficiency. PDEformer-1, the model developed for one-dimensional PDEs, has been open-sourced previously. The current PDEformer-2 model for two-dimensional PDEs, pretrained on a dataset of approximately 40 TB, can directly handle 2D PDEs with different computational domains, boundary conditions, numbers of variables, and time dependencies, and can quickly produce predicted solutions at arbitrary spatio-temporal locations. In addition, as a differentiable surrogate model for the forward problem, PDEformer-2 can also be used to solve various inverse problems, estimating scalar coefficients, source term fields, or wave velocity fields from noisy spatio-temporal scatter observations. This lays a promising foundation for the model to support research on physical phenomena and engineering applications in fields such as fluid dynamics and electromagnetics.
We consider two-dimensional PDEs defined on $(t, r) \in [0, 1] \times \Omega$, where $r = (x, y)$ denotes the spatial coordinate and $\Omega \subseteq [0, 1]^2$ is the computational domain. The equations are assumed to take the general form

$$\mathcal{F}(u_1, u_2, \dots, c_1, c_2, \dots, s_1(r), s_2(r), \dots) = 0 \quad \text{in } \Omega,$$

$$\mathcal{B}_i(u_1, u_2, \dots, c_{i1}, c_{i2}, \dots, s_{i1}(r), s_{i2}(r), \dots) = 0 \quad \text{on } \Gamma_i, \quad i = 1, 2, \dots,$$

where $u_1, u_2, \dots$ are the unknown field variables, $c_1, c_2, \dots, c_{i1}, c_{i2}, \dots \in \mathbb{R}$ are real-valued coefficients, $s_1(r), s_2(r), \dots, s_{i1}(r), \dots$ are scalar functions (which may serve as initial conditions or coefficient fields in the equation), and $\Gamma_1, \Gamma_2, \dots \subseteq \partial\Omega$ are the parts of the domain boundary on which the corresponding boundary conditions $\mathcal{B}_i$ are imposed. PDEformer-2 takes such a symbolic and numeric specification of a PDE as input and directly predicts its solution $u_1, u_2, \dots$.
As shown in the figure, PDEformer-2 first formulates the symbolic expression of the PDE as a computational graph, and uses a scalar encoder and a function encoder to embed the numeric information of the PDE into the node features of this graph. PDEformer-2 then encodes the computational graph with a graph Transformer, and decodes the resulting latent vectors with an implicit neural representation (INR) to obtain the predicted values of each solution component of the PDE at the queried spatio-temporal coordinates. A more detailed explanation of the working principle of the model can be found in the introduction of PDEformer-1.
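For intuition only, the toy NumPy sketch below mirrors this encode-decode data flow with random stand-ins; all names, shapes, and dimensions are illustrative assumptions, and none of it is taken from the actual PDEformer-2 code.

```python
# Toy NumPy stand-in for the encode-decode data flow described above.
# Everything here (names, shapes, dimensions, random weights) is illustrative;
# it is NOT the PDEformer-2 implementation.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_nodes = 16, 8   # assumed embedding width and number of graph nodes

# 1. Scalar/function encoders turn the numeric information attached to each
#    node of the PDE computational graph into node feature vectors.
node_features = rng.normal(size=(n_nodes, d_model))

# 2. A graph Transformer mixes information across nodes; a single
#    attention-style averaging step stands in for it here.
attn = np.exp(node_features @ node_features.T / np.sqrt(d_model))
attn /= attn.sum(axis=1, keepdims=True)
latent = (attn @ node_features).mean(axis=0)   # latent vector conditioning the decoder

# 3. An implicit neural representation (INR) decodes the latent vector at any
#    queried spatio-temporal coordinate; a tiny random MLP stands in for it.
W1 = rng.normal(size=(d_model + 3, 32))
W2 = rng.normal(size=32)

def predict(t, x, y):
    inp = np.concatenate([latent, [t, x, y]])
    return float(np.tanh(inp @ W1) @ W2)   # predicted solution value at (t, x, y)

print(predict(0.5, 0.25, 0.75))
```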
To handle the complex domain shapes and boundary locations that may appear in two-dimensional equations, PDEformer-2 represents them as signed distance functions (SDFs) and embeds this information into the computational graph via the function encoder. The example in the following figure shows how a computational graph represents Dirichlet boundary conditions on a square domain:
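As a minimal illustration of the idea (independent of the figure), the sketch below evaluates the signed distance function of a disk on a regular grid with NumPy; the shape, radius, and grid resolution are arbitrary choices for demonstration, not part of the PDEformer-2 pipeline.

```python
import numpy as np

# Signed distance function of a disk of radius 0.3 centred at (0.5, 0.5),
# sampled on a 128 x 128 grid: negative inside the domain, positive outside,
# and zero exactly on the boundary.
x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128), indexing="ij")
sdf = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.3
```

An SDF sampled this way is simply another scalar field on the grid, so the same function encoder that embeds initial conditions or coefficient fields can also embed the domain geometry.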
Please first make sure that MindSpore is successfully installed, as instructed in the Installation Tutorial. Other dependencies can be installed using the following command:
```bash
pip3 install -r pip-requirements.txt
```
We provide configuration files for PDEformer models with different numbers of parameters in the configs/inference folder. The details are as follows:
| Model | Parameters | Configuration File | Checkpoint File |
| --- | --- | --- | --- |
| PDEformer2-S | 27.75M | configs/inference/model-S.yaml | model-S.ckpt |
| PDEformer2-M | 71.07M | configs/inference/model-M.yaml | model-M.ckpt |
| PDEformer2-L | 82.65M | configs/inference/model-L.yaml | model-L.ckpt |
The example code below demonstrates how to use PDEformer-2 to predict the solution of a given PDE, taking the nonlinear conservation law $u_t + (u^2)_x - 0.3 u_y = 0$ as an example. Before running the code, please download the pretrained weights model-M.ckpt from Gitee AI, and change the value of the model.load_ckpt entry in configs/inference/model-M.yaml to the path of the downloaded weight file.
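For reference, the relevant part of the configuration file might look like the excerpt below; only the model.load_ckpt key is taken from the description above, and the surrounding YAML structure is an assumption.

```yaml
# Illustrative excerpt of configs/inference/model-M.yaml
model:
  load_ckpt: /path/to/model-M.ckpt   # path of the downloaded weight file
```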
```python
import numpy as np
from mindspore import context
from src import load_config, get_model, PDENodesCollector
from src.inference import infer_plot_2d, x_fenc, y_fenc

# Basic settings
context.set_context(mode=context.PYNATIVE_MODE, device_target="CPU")
config, _ = load_config("configs/inference/model-M.yaml")
model = get_model(config)

# Specify the PDE to be solved: u_t + (u^2)_x - 0.3 u_y = 0
pde = PDENodesCollector()
u = pde.new_uf()
u_ic = np.sin(2 * np.pi * x_fenc) * np.cos(4 * np.pi * y_fenc)
pde.set_ic(u, u_ic, x=x_fenc, y=y_fenc)
pde.sum_eq0(pde.dt(u), pde.dx(pde.square(u)), pde.dy(-0.3 * u))

# Predict the solution using PDEformer (with spatial resolution 32) and plot
pde_dag = pde.gen_dag(config)
x_plot, y_plot = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32), indexing="ij")
u_pred = infer_plot_2d(model, pde_dag, x_plot, y_plot)
```
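As a further, unverified illustration that reuses only the calls appearing above, the same collector can express a different equation, for example the linear advection equation $v_t + 0.5 v_x - 0.2 v_y = 0$:

```python
# Another PDE built with the same API calls demonstrated above (illustrative only).
pde2 = PDENodesCollector()
v = pde2.new_uf()
v_ic = np.sin(2 * np.pi * x_fenc) * np.cos(2 * np.pi * y_fenc)
pde2.set_ic(v, v_ic, x=x_fenc, y=y_fenc)
pde2.sum_eq0(pde2.dt(v), pde2.dx(0.5 * v), pde2.dy(-0.2 * v))  # v_t + 0.5 v_x - 0.2 v_y = 0
v_pred = infer_plot_2d(model, pde2.gen_dag(config), x_plot, y_plot)
```

Since the pretrained model handles many PDE forms directly, switching to a different equation only changes the computational graph that is fed to the same pretrained weights; no retraining is involved.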
For more examples, please refer to the interactive notebook PDEformer_inference.ipynb.