Explauto: A library to study, model and simulate intrinsically motivated multitask learning and exploration in virtual and robotic agents
Explauto is a framework developed by the Inria FLOWERS research team that provides a common interface for the implementation and benchmarking of active and online sensorimotor learning algorithms. In particular, it considers algorithms where exploration is driven by models of intrinsic motivation/curiosity at multiple levels of abstraction: from the active choice of parameterized motor primitives, to the choice of parameterized problems/goals, to the choice of learning strategies (e.g. deciding when to self-explore or to ask for input from an external expert). It is designed and maintained by Clément Moulin-Frier, Pierre Rouanet, and Sébastien Forestier.
Explauto provides a high-level API for easily defining, using and evaluating (a minimal usage sketch follows the list below):
- Virtual and robotics setups (Environment level)
- Sensorimotor learning iterative models (Sensorimotor level)
- Active choice of sensorimotor experiments (Interest level)
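As a quick illustration of these three levels, here is a minimal exploration-loop sketch adapted from the tutorials; the configuration and model names ('simple_arm', 'nearest_neighbor', 'random') are the ones used there, and the exact, up-to-date API is documented in the notebooks linked below:

```python
from explauto import Environment, Agent, SensorimotorModel, InterestModel

# Environment level: a simulated planar arm
# ('simple_arm' / 'mid_dimensional' is a configuration used in the tutorials).
environment = Environment.from_configuration('simple_arm', 'mid_dimensional')

# Sensorimotor level: an iterative forward/inverse model of the arm.
sm_model = SensorimotorModel.from_configuration(environment.conf,
                                                'nearest_neighbor', 'default')

# Interest level: chooses the next sensorimotor experiment
# (here a random choice in the motor space, i.e. motor babbling).
im_model = InterestModel.from_configuration(environment.conf,
                                            environment.conf.m_dims, 'random')

# An agent combines the sensorimotor and interest models.
agent = Agent(environment.conf, sm_model, im_model)

# Exploration loop: the agent produces a motor command, observes
# the sensory consequence in the environment, and learns from it.
for _ in range(100):
    m = agent.produce()
    s = environment.update(m)
    agent.perceive(s)
```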
It is cross-platform and has been tested on Linux, Windows and Mac OS. Do not hesitate to contact us if you want to get involved! It has been released under the GPLv3 license.
Explauto's scientific roots trace back to the Intelligent Adaptive Curiosity algorithmic architecture [Oudeyer, 2007], which was extended to a more general family of autonomous exploration architectures [Baranes, 2013] and recently expressed in a compact and unified formalism [Moulin-Frier, 2013]. We strongly recommend reading this short introduction to developmental robotics before going through the tutorials.
If you use the library in a scientific paper, please cite (follow the link for bibtex and pdf files):
Moulin-Frier, C.; Rouanet, P. & Oudeyer, P.-Y. (2014). Explauto: an open-source Python library to study autonomous exploration in developmental robotics. International Conference on Development and Learning (ICDL/Epirob), Genova, Italy.
Most of Explauto's documentation is written as IPython notebooks. If you do not know how to use them, please refer to the dedicated section.
More specific tutorials
- Setting environments
- Learning sensorimotor models
- Summary of available sensorimotor and interest models
- Learning sensorimotor models with sensorimotor context
- Learning sensorimotor models with context provided by environment
- Coming soon: Autonomous exploration using interest models
- Setting a basic experiment
- Comparing motor vs goal strategies
- Running a pool of experiments (see the sketch after this list)
- Introducing curiosity-driven exploration
- Poppy environment
- Fast-forward a previous experiment
- Tutorial on Active Model Babbling and a comparison with Motor Babbling and Goal Babbling
- Goal Babbling with direct optimization
- Learning to produce sounds with the DIVA vocal synthesizer and Dynamic Movement Primitives
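As an illustration of the experiment-level tutorials above (e.g. comparing motor vs goal babbling, or running a pool of experiments), here is a sketch adapted from them; the configuration and model names are assumptions taken from the tutorials, so refer to the notebooks for the exact settings:

```python
from explauto import ExperimentPool

# Compare motor babbling and goal babbling on the same simulated arm
# (configuration and model names taken from the tutorials).
pool = ExperimentPool.from_settings_product(
    environments=[('simple_arm', 'mid_dimensional')],
    babblings=['motor', 'goal'],
    interest_models=[('random', 'default')],
    sensorimotor_models=[('nearest_neighbor', 'default')],
    evaluate_at=[200, 500, 1000],
    same_testcases=True)

# Run every (environment, babbling, interest model, sensorimotor model)
# combination and collect the logs for later analysis and plotting.
logs = pool.run()
```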
Explauto's API documentation is available in HTML format here.
The best way to install Explauto at the moment is to clone the repository and use it in development mode. It is also available as a Python package. The core of Explauto depends on the following packages:
- Python 2.7 or 3.*
- numpy
- scipy
- scikit-learn
For more details, please refer to the installation section of the documentation.
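Once installed, a quick sanity check (a minimal sketch, assuming the configuration names used in the tutorials) is to import the library and instantiate a small virtual environment:

```python
# These imports should succeed if Explauto and its core dependencies
# (numpy, scipy, scikit-learn) are correctly installed.
from explauto import Environment

# 'simple_arm' / 'low_dimensional' is a configuration used in the tutorials.
env = Environment.from_configuration('simple_arm', 'low_dimensional')
print(env.conf.m_dims, env.conf.s_dims)
```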