
v0.7.0

@Yang-LONG Yang-LONG released this 20 Jun 09:07
b233dc1

MindSpore Chemistry

Major Features and Enhancements

Force prediction

  • [STABLE] NequIP: An E(3)-equivariant graph neural network that leverages the equivariant computing library to train efficiently and deliver highly accurate predictions of molecular energies and forces from atomic information.
  • [STABLE] Allegro: A strictly local equivariant model that leverages the same equivariant computing library to train efficiently and deliver highly accurate predictions of molecular energies and forces from atomic information.

DFT Prediction

  • [STABLE] DeephE3nn: An equivariant neural network based on the E(3) group, designed to predict Hamiltonians using atomic structures.

Property Prediction

  • [STABLE] Matformer: Leveraging graph neural networks and Transformer architectures to predict diverse properties of crystalline materials.

Structure Generation

  • [STABLE] DiffCSP: New feature. A crystal structure prediction method based on diffusion models, designed to learn structural distributions from stable crystal data. It predicts crystal structures by jointly generating the lattice and atomic coordinates, and leverages a periodic E(3)-equivariant denoising model to better capture the geometric properties of crystals. It is significantly cheaper in computational resources than traditional methods based on Density Functional Theory (DFT) and performs remarkably well on crystal structure prediction tasks.

MindSpore Flow

Major Features and Improvements

Data Driven

  • [STABLE] Burgers_SNO/Navier_Stokes_SNO2D/Navier_Stokes_SNO3D: Applications solving the one-dimensional Burgers equation and the two-/three-dimensional Navier-Stokes equations with the Spectral Neural Operator under the data-driven method are added.

  • [STABLE] API-SNO1D/2D/3D: Spectral Neural Operator (including SNO and U-SNO) APIs are added. They utilize polynomial transformations to carry out computations in a spectral space, similar to the FNO architecture. Their advantage lies in effectively reducing the systematic bias caused by aliasing errors.
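
    The core idea behind an SNO layer can be illustrated in plain NumPy (this is a minimal sketch of the technique, not the mindflow API; all names below are illustrative): transform samples to polynomial (here Chebyshev) coefficients, apply a learned linear map to the leading modes, and transform back.

    ```python
    import numpy as np
    from numpy.polynomial import chebyshev as C

    # One SNO-style spectral layer (sketch): Chebyshev transform ->
    # linear map on the leading coefficients -> inverse transform.
    n, modes = 33, 16
    x = np.cos(np.pi * np.arange(n) / (n - 1))   # Chebyshev-Gauss-Lobatto nodes
    f = np.sin(np.pi * x)                        # input function sampled at the nodes

    coeffs = C.chebfit(x, f, deg=n - 1)          # forward polynomial transform
    W = np.eye(modes)                            # "learned" spectral weights (identity here)
    out_coeffs = np.zeros_like(coeffs)
    out_coeffs[:modes] = W @ coeffs[:modes]      # act only on the low-order modes
    g = C.chebval(x, out_coeffs)                 # inverse transform back to the nodes

    print(float(np.max(np.abs(g - f))) < 1e-6)   # True: truncation error is tiny
    ```

    Because the smooth input's Chebyshev coefficients decay rapidly, truncating to 16 modes loses almost nothing; in a trained SNO, W would be learned and act per channel.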

  • [STABLE] API-Attention: Refactored the most commonly used Transformer-style network interfaces, such as Attention, MultiHeadAttention, AttentionBlock, and the ViT network.
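
    The computation these interfaces wrap is standard scaled dot-product attention; a minimal NumPy sketch (illustrative only, not the mindflow API):

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def scaled_dot_product_attention(q, k, v):
        """q, k, v: (batch, seq, dim). Returns (output, attention weights)."""
        d = q.shape[-1]
        scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)  # (batch, seq, seq)
        weights = softmax(scores, axis=-1)
        return weights @ v, weights

    rng = np.random.default_rng(0)
    q = rng.standard_normal((2, 4, 8))
    out, w = scaled_dot_product_attention(q, q, q)      # self-attention
    print(out.shape)                      # (2, 4, 8)
    print(np.allclose(w.sum(axis=-1), 1.0))  # True: each row of weights sums to 1
    ```

    MultiHeadAttention splits the feature dimension into several such attention computations in parallel and concatenates the results.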

  • [STABLE] API-Diffusion: A complete set of training and inference interfaces for diffusion models is added, supporting the two mainstream diffusion methods DDPM and DDIM. The entire diffusion-model training and inference workflow can be completed through the simple, easy-to-use Diffusion Scheduler, Diffusion Trainer, Diffusion Pipeline, and Diffusion Transformer interfaces.
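
    The mechanics a DDPM scheduler manages can be sketched in a few lines of NumPy (a hedged illustration of the standard DDPM equations, not the mindflow interfaces; all names are mine):

    ```python
    import numpy as np

    # Linear beta schedule, as in the original DDPM formulation.
    T = 1000
    betas = np.linspace(1e-4, 0.02, T)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)

    def add_noise(x0, t, rng):
        """Forward process q(x_t | x_0): x_t = sqrt(a_bar_t)*x0 + sqrt(1 - a_bar_t)*eps."""
        eps = rng.standard_normal(x0.shape)
        xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
        return xt, eps

    def ddpm_step(xt, t, eps_pred, rng):
        """One reverse (denoising) step; eps_pred would come from the trained network."""
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (xt - coef * eps_pred) / np.sqrt(alphas[t])
        if t > 0:
            mean = mean + np.sqrt(betas[t]) * rng.standard_normal(xt.shape)
        return mean

    rng = np.random.default_rng(0)
    x0 = rng.standard_normal((4, 4))
    xt, eps = add_noise(x0, t=500, rng=rng)       # training: noise a clean sample
    x_prev = ddpm_step(xt, t=500, eps_pred=eps, rng=rng)  # inference: one denoising step
    print(x_prev.shape)  # (4, 4)
    ```

    DDIM differs only in the reverse step, taking a deterministic (or partially stochastic) update that allows much larger step sizes at sampling time.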

  • [STABLE] API-Refactor_Core: Refactored mindflow.core by merging mindflow.common, mindflow.loss, and mindflow.operators.

  • [RESEARCH] CascadeNet: The CascadeNet case is added. It uses surface pressure, the Reynolds number, and a small number of wake velocity measurement points as inputs to predict the spatiotemporal field of cylinder-wake fluctuation velocity through a generative adversarial network with a scale-transfer topology.

  • [RESEARCH] MultiScaleGNN: A multi-scale graph neural network case for solving the large-scale pressure Poisson equation is added; it supports using the projection method (also known as the fractional-step method) to solve the incompressible Navier-Stokes equations.

  • [RESEARCH] TurbineUQ: A case study of turbine stage flow-field prediction and uncertainty optimization design is added, combining the Monte Carlo method with deep learning to quantitatively evaluate uncertainty.

Data-Mechanism Fusion

  • [STABLE] PhyMPGN: An application of PhyMPGN, a physical-equation-solving model based on graph neural networks, to the problem of flow around a cylinder is added. PhyMPGN can solve the Burgers, FitzHugh-Nagumo, Gray-Scott, and other equations on unstructured grids. The related paper has been accepted as an ICLR 2025 Spotlight.

  • [RESEARCH] Heat_Conduction: A case study of steady-state heat-conduction field prediction driven jointly by data and physics is added.

  • [RESEARCH] SuperPosition: SDNO, an operator neural network based on the superposition principle, is added for predicting the temperature field of complex flow patterns in aircraft-engine internal-flow cascades.

Physics Driven

  • [RESEARCH] NSFNets: Navier-Stokes Flow Networks (NSFNets) are added, based on a highly cited method for solving ill-posed problems (such as partially missing boundary conditions or inverse problems).

Solver

  • [STABLE] CBS solver: An application of the CBS acoustic-equation solver for solving two-dimensional acoustic equations in complex parameter fields is added. The CBS solver solves the acoustic equation in the frequency domain and has spectral accuracy in all spatial directions, achieving higher accuracy than the finite difference method. Reference: Osnabrugge et al., 2016.

Optimizer

  • [STABLE] API-AdaHessian second-order optimizer: The AdaHessian second-order optimizer, which uses the second-order information provided by the diagonal elements of the Hessian matrix in its optimization calculations, is added. In tests it achieved a loss reduction of over 20% compared with Adam under the same number of steps.
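
    The diagonal Hessian information AdaHessian relies on is typically estimated with Hutchinson's method, using only Hessian-vector products. A minimal NumPy sketch of that estimator on a known quadratic (illustrative only; not the mindflow optimizer API):

    ```python
    import numpy as np

    def hessian_diag_hutchinson(hvp, dim, n_samples, rng):
        """Estimate diag(H) as E[z * (H z)] with Rademacher vectors z (Hutchinson's method)."""
        est = np.zeros(dim)
        for _ in range(n_samples):
            z = rng.choice([-1.0, 1.0], size=dim)
            est += z * hvp(z)
        return est / n_samples

    # Quadratic test problem f(x) = 0.5 x^T A x, whose Hessian is exactly A.
    rng = np.random.default_rng(0)
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 4.0]])
    hvp = lambda z: A @ z                  # Hessian-vector product (autodiff in practice)
    d = hessian_diag_hutchinson(hvp, 3, 200, rng)
    print(np.round(d, 3))                  # approximately diag(A) = [2, 3, 4]
    ```

    AdaHessian then plugs a smoothed version of this diagonal into an Adam-style update in place of the squared-gradient term, which is where its second-order advantage comes from.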

Foundation Model

  • [RESEARCH] PDEformer: PDEformer supports solving general time-dependent one- and two-dimensional partial differential equations, and as a foundation model achieves accuracy superior to domain-specific models in zero-shot settings.

MindSpore Earth

Major Feature and Improvements

Short-Term Precipitation Forecast

  • [STABLE] PreDiff: PreDiff is a novel latent diffusion model (LDM)-based method for short-term precipitation nowcasting. This approach proposes a two-stage pipeline specifically designed for data-driven Earth system forecasting models. Additionally, it introduces a knowledge control mechanism to guide the denoising diffusion process in the latent space. This mechanism ensures that the generated predictions align more closely with domain-specific knowledge, thereby enhancing the credibility of the model's forecasts.

Earthquake Early Warning

  • [RESEARCH] G-TEAM: A data-driven earthquake early warning model whose core architecture combines the sequence-modeling capability of the Transformer with the spatial-information extraction capability of a GNN. The model thus captures not only the temporal characteristics of seismic waveforms but also the propagation relationships of seismic waves across the station network, improving the prediction precision of magnitude and epicenter location. It can provide the epicenter location, magnitude, and seismic intensity distribution within 3 seconds after an earthquake occurs.

Medium-Range Global Predictions

  • [STABLE] GraphCast: The graphcastTp mode has been newly added. In the downstream medium-range precipitation case, a 0.5°×0.5° grid is adopted, and ERA5 data with an input resolution of 360×720 is used for training.

  • [STABLE] SKNO: A new SKNO model has been added, integrating the KNO model with the spherical harmonic transform (SHT) operator. The SKNO model is trained on 16 years of the ERA5 assimilation reanalysis dataset. It can predict global meteorological conditions with a temporal resolution of 6 hours and a spatial resolution of 1.4 degrees, including indicators such as temperature, humidity, and wind speed.