
evosax: Evolution Strategies in JAX 🦎


Tired of having to handle asynchronous processes for neuroevolution? Do you want to leverage massive vectorization and high-throughput accelerators for Evolution Strategies? evosax is a comprehensive, high-performance library implementing Evolution Strategies (ES) in JAX. By leveraging XLA compilation and JAX's transformation primitives, evosax lets researchers and practitioners efficiently scale evolutionary algorithms to modern hardware accelerators without the traditional overhead of distributed implementations.

The API follows the classical ask-eval-tell cycle of ES, with full support for JAX's transformations (jit, vmap, lax.scan). The library includes 30+ evolution strategies, from classics like CMA-ES and Differential Evolution to modern approaches like OpenAI-ES and Diffusion Evolution.

Get started here πŸ‘‰ Colab

Basic evosax API Usage 🍲

import jax
import jax.numpy as jnp

from evosax.algorithms import CMA_ES

# Illustrative problem setup: optimize a 10-dimensional real vector
dummy_solution = jnp.zeros(10)
num_generations = 100

# Instantiate the search strategy
es = CMA_ES(population_size=32, solution=dummy_solution)
params = es.default_params

# Initialize state
key = jax.random.key(0)
state = es.init(key, params)

# Ask-Eval-Tell loop
for i in range(num_generations):
    key, key_ask, key_eval = jax.random.split(key, 3)

    # Generate a set of candidate solutions to evaluate
    population, state = es.ask(key_ask, state, params)

    # Evaluate the fitness of the population (toy sphere objective here;
    # plug in your own evaluation, using key_eval if it is stochastic)
    fitness = jnp.sum(population**2, axis=-1)

    # Update the evolution strategy
    state = es.tell(population, fitness, state, params)

# Get the best solution and its fitness
print(state.best_solution, state.best_fitness)
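
Because each generation is a pure function of the state and a PRNG key, the whole loop above can also be fused into a single XLA program with jax.lax.scan and jax.jit. A minimal sketch, reusing es, params, num_generations, and the toy sphere objective from the example above:

def generation(state, key):
    # One ask-eval-tell step, written so it can be scanned
    population, state = es.ask(key, state, params)
    fitness = jnp.sum(population**2, axis=-1)  # toy sphere objective
    state = es.tell(population, fitness, state, params)
    return state, state.best_fitness

key = jax.random.key(0)
state = es.init(key, params)
keys = jax.random.split(key, num_generations)  # one key per generation

# Compile the full loop into one XLA program and run it
run_scan = jax.jit(lambda s, ks: jax.lax.scan(generation, s, ks))
final_state, best_fitness_log = run_scan(state, keys)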

Implemented Evolution Strategies 🦎

| Strategy | Reference | Import | Example |
|---|---|---|---|
| Simple Evolution Strategy | Rechenberg (1978) | SimpleES | Colab |
| OpenAI-ES | Salimans et al. (2017) | Open_ES | Colab |
| CMA-ES | Hansen & Ostermeier (2001) | CMA_ES | Colab |
| Sep-CMA-ES | Ros & Hansen (2008) | Sep_CMA_ES | Colab |
| xNES | Wierstra et al. (2014) | XNES | Colab |
| SNES | Wierstra et al. (2014) | SNES | Colab |
| MA-ES | Bayer & Sendhoff (2017) | MA_ES | Colab |
| LM-MA-ES | Loshchilov et al. (2017) | LM_MA_ES | Colab |
| Rm-ES | Li & Zhang (2017) | Rm_ES | Colab |
| PGPE | Sehnke et al. (2010) | PGPE | Colab |
| ARS | Mania et al. (2018) | ARS | Colab |
| ESMC | Merchant et al. (2021) | ESMC | Colab |
| Persistent ES | Vicol et al. (2021) | PersistentES | Colab |
| Noise-Reuse ES | Li et al. (2023) | NoiseReuseES | Colab |
| CR-FM-NES | Nomura & Ono (2022) | CR_FM_NES | Colab |
| Guided ES | Maheswaranathan et al. (2018) | GuidedES | Colab |
| ASEBO | Choromanski et al. (2019) | ASEBO | Colab |
| Discovered ES | Lange et al. (2023a) | DES | Colab |
| Learned ES | Lange et al. (2023a) | LES | Colab |
| EvoTF | Lange et al. (2024) | EvoTF_ES | Colab |
| iAMaLGaM-Full | Bosman et al. (2013) | iAMaLGaM_Full | Colab |
| iAMaLGaM-Univariate | Bosman et al. (2013) | iAMaLGaM_Univariate | Colab |
| Gradientless Descent | Golovin et al. (2019) | GLD | Colab |
| Simulated Annealing | Rasdi Rere et al. (2015) | SimAnneal | Colab |
| Hill Climbing | Rasdi Rere et al. (2015) | SimAnneal | Colab |
| Random Search | Bergstra & Bengio (2012) | RandomSearch | Colab |
| SV-CMA-ES | Braun et al. (2024) | SV_CMA_ES | Colab |
| SV-OpenAI-ES | Liu et al. (2017) | SV_OpenES | Colab |
| Simple Genetic Algorithm | Such et al. (2017) | SimpleGA | Colab |
| MR15-GA | Rechenberg (1978) | MR15_GA | Colab |
| SAMR-GA | Clune et al. (2008) | SAMR_GA | Colab |
| GESMR-GA | Kumar et al. (2022) | GESMR_GA | Colab |
| LGA | Lange et al. (2023b) | LGA | Colab |
| Diffusion Evolution | Zhang et al. (2024) | DiffusionEvolution | Colab |
| Differential Evolution | Storn & Price (1997) | DE | Colab |
| Particle Swarm Optimization | Kennedy & Eberhart (1995) | PSO | Colab |
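
All of these strategies share the ask-eval-tell interface, so swapping algorithms is typically a one-line change. A hedged sketch, assuming each strategy accepts the same (population_size, solution) constructor arguments shown for CMA_ES above:

import jax
import jax.numpy as jnp

from evosax.algorithms import CMA_ES, DE, PSO

dummy_solution = jnp.zeros(10)  # illustrative 10-dimensional problem

# Each strategy drops into the same ask-eval-tell loop unchanged
for Strategy in [CMA_ES, DE, PSO]:
    es = Strategy(population_size=32, solution=dummy_solution)
    params = es.default_params
    state = es.init(jax.random.key(0), params)
    population, state = es.ask(jax.random.key(1), state, params)
    fitness = jnp.sum(population**2, axis=-1)  # toy sphere objective
    state = es.tell(population, fitness, state, params)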

Installation ⏳

You will need Python 3.10 or later, and a working JAX installation.

Then, install evosax from PyPI:

pip install evosax

To install the latest development version directly from GitHub, you can use:

pip install git+https://github.com/RobertTLange/evosax.git@main
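
To verify the installation, importing the package and querying its version via importlib (a standard check for any installed package, not evosax-specific) should succeed:

from importlib.metadata import version

import evosax  # should import without error

print(version("evosax"))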

Examples πŸ“–

Key Features πŸ’Ž

  • Comprehensive Algorithm Collection: 30+ classic and modern evolution strategies with a unified API
  • JAX Acceleration: Fully compatible with JAX transformations for speed and scalability
  • Vectorization & Parallelization: Fast execution on CPUs, GPUs, and TPUs (see the multi-seed sketch after this list)
  • Production Ready: Well-tested, documented, and used in research environments
  • Batteries Included: Comes with optimizers like ClipUp, fitness shaping, and restart strategies
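
To illustrate the vectorization point above: because full runs are pure functions, independent searches can be batched over random seeds with jax.vmap. A minimal sketch under the same illustrative sphere setup as the usage example:

import jax
import jax.numpy as jnp

from evosax.algorithms import CMA_ES

es = CMA_ES(population_size=32, solution=jnp.zeros(10))
params = es.default_params

def run(seed):
    # One full (short) ES run from a single integer seed
    key = jax.random.key(seed)
    state = es.init(key, params)

    def generation(state, key):
        population, state = es.ask(key, state, params)
        fitness = jnp.sum(population**2, axis=-1)  # toy sphere objective
        state = es.tell(population, fitness, state, params)
        return state, None

    state, _ = jax.lax.scan(generation, state, jax.random.split(key, 50))
    return state.best_fitness

# Run 8 independent searches in parallel on one accelerator
best_fitnesses = jax.vmap(run)(jnp.arange(8))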

Related Resources πŸ“š

  • πŸ“Ί Rob's MLC Research Jam Talk - Overview at the ML Collective Research Jam
  • πŸ“ Rob's 02/2021 Blog - Blog post on implementing CMA-ES in JAX
  • πŸ’» EvoJAX - Hardware-accelerated neuroevolution with great rollout wrappers
  • πŸ’» QDax - Quality-Diversity algorithms in JAX

Citing evosax ✏️

If you use evosax in your research, please cite the following paper:

@article{evosax2022github,
    author  = {Robert Tjarko Lange},
    title   = {evosax: JAX-based Evolution Strategies},
    journal = {arXiv preprint arXiv:2212.04180},
    year    = {2022},
}

Acknowledgements πŸ™

We acknowledge financial support by the Google TRC and the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy - EXC 2002/1 "Science of Intelligence" - project number 390523135.

Contributing πŸ‘·

Contributions are welcome! If you find a bug or are missing your favorite feature, please open an issue or submit a pull request following our contribution guidelines πŸ€—.

Disclaimer ⚠️

This repository contains independent reimplementations of LES and DES and is unrelated to Google DeepMind. The implementations have been tested to reproduce the official results on a range of tasks.