
🚀 XAD: Powerful Automatic Differentiation for C++ & Python

XAD is a comprehensive solution for automatic differentiation, combining ease of use with high performance. It is designed to help you differentiate complex applications with speed and precision, whether you are optimizing neural networks, solving scientific problems, or performing financial risk analysis.


🌟 Why XAD?

XAD is trusted by professionals for its speed, flexibility, and scalability across various fields:

  • Machine Learning & Deep Learning: Accelerate neural network training and model optimization.
  • Optimization in Engineering & Finance: Solve complex problems with high precision.
  • Numerical Analysis: Improve methods for solving differential equations efficiently.
  • Scientific Computing: Simulate physical systems and processes with precision.
  • Risk Management & Quantitative Finance: Assess and hedge risks in sophisticated financial models.
  • Computer Graphics: Optimize rendering algorithms for high-quality graphics.
  • Robotics: Enhance control and simulation for robotic systems.
  • Meteorology: Improve accuracy in weather prediction models.
  • Biotechnology: Model complex biological processes effectively.

Key Features

  • Forward & Adjoint Mode: Supports any order using operator overloading.
  • Checkpointing Support: Efficient tape memory management for large-scale applications.
  • External Function Interface: Seamlessly connect with external libraries.
  • Thread-Safe Tape: Ensure safe, concurrent operations.
  • Exception-Safe: Formal guarantees for stability and error handling.
  • High Performance: Optimized for speed and efficiency.
  • Proven in Production: Battle-tested in large-scale, mission-critical systems.

💻 Example

Calculate first-order derivatives of an arbitrary function with two inputs and one output using XAD in adjoint mode.

#include <XAD/XAD.hpp>
#include <iostream>

using mode = xad::adj<double>;     // adjoint mode over double
using Adouble = mode::active_type; // active data type
using Tape = mode::tape_type;      // tape type

Tape tape;                     // tape for recording derivatives
Adouble x0 = 1.3;              // initialise inputs
Adouble x1 = 5.2;
tape.registerInput(x0);        // register independent variables
tape.registerInput(x1);        // with the tape
tape.newRecording();           // start recording derivatives
Adouble y = func(x0, x1);      // run main function
tape.registerOutput(y);        // register the output variable
derivative(y) = 1.0;           // seed output adjoint to 1.0
tape.computeAdjoints();        // roll back adjoints to inputs
std::cout << "dy/dx0=" << derivative(x0) << "\n"
          << "dy/dx1=" << derivative(x1) << "\n";

🚀 Getting Started

git clone https://github.com/auto-differentiation/xad.git
cd xad
mkdir build
cd build
cmake ..
make

For more detailed guides, refer to our Installation Guide and explore Tutorials.

๐Ÿค Contributing

Want to get involved? We welcome contributors from all backgrounds! Check out our Contributing Guide and join the conversation in our Discussions.

๐Ÿ› Found a Bug?

Please report any issues through our Issue Tracker.

