Plangym is an open source Python library for developing and comparing planning algorithms by providing a standard API to communicate between algorithms and environments, as well as a standard set of environments compliant with that API.
Given that OpenAI's gym has become the de facto standard in the research community, plangym's API is designed to be as similar as possible to gym's API while allowing the environment state to be modified. Furthermore, it provides additional functionality for stepping environments in parallel, delayed environment initialization for dealing with environments that are difficult to serialize, compatibility with gym.Wrappers, and more.
Plangym currently supports the following environments:
- OpenAI gym classic control environments
- OpenAI gym Box2D environments
- OpenAI gym Atari 2600 environments
- DeepMind's dm_control environments
- Stable-retro environments
Stepping an environment with a given state and action:

```python
import plangym

env = plangym.make(name="CartPole-v0")
state, obs = env.reset()

state = state.copy()
action = env.action_space.sample()

data = env.step(state=state, action=action)
new_state, observ, reward, end, info = data
```
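Because `step` takes the state explicitly, the same snapshot can be reused to branch into several alternative actions from a single point, which is the basic operation of a planning algorithm. The following sketch only uses the `make`, `reset`, and `step` calls shown above; the branching loop itself is purely illustrative.

```python
import plangym

env = plangym.make(name="CartPole-v0")
state, obs = env.reset()

# Snapshot the current state and branch from it with several
# different actions; each call restores the same snapshot first,
# so the transitions are independent of one another.
root_state = state.copy()
branches = []
for _ in range(3):
    action = env.action_space.sample()
    new_state, observ, reward, end, info = env.step(state=root_state, action=action)
    branches.append((action, new_state, reward))
```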
Stepping a batch of states and actions:

```python
import plangym

env = plangym.make(name="CartPole-v0")
state, obs = env.reset()

states = [state.copy() for _ in range(10)]
actions = [env.action_space.sample() for _ in range(10)]

data = env.step_batch(states=states, actions=actions)
new_states, observs, rewards, ends, infos = data
```
Stepping a batch in parallel by passing `n_workers` when creating the environment:

```python
import plangym

env = plangym.make(name="MsPacman-v0", n_workers=2)
state, obs = env.reset()

states = [state.copy() for _ in range(10)]
actions = [env.action_space.sample() for _ in range(10)]

data = env.step_batch(states=states, actions=actions)
new_states, observs, rewards, ends, infos = data
```
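As an illustration of how the batched interface can be used for planning, the sketch below performs a simple one-step lookahead: it steps several candidate actions from the same state with `step_batch` and keeps the one with the highest immediate reward. The lookahead logic is an example built on the calls shown above, not part of plangym itself.

```python
import numpy
import plangym

env = plangym.make(name="MsPacman-v0", n_workers=2)
state, obs = env.reset()

# Evaluate several candidate actions from the same state in one batched call.
n_candidates = 10
states = [state.copy() for _ in range(n_candidates)]
actions = [env.action_space.sample() for _ in range(n_candidates)]
new_states, observs, rewards, ends, infos = env.step_batch(states=states, actions=actions)

# Greedy one-step lookahead: keep the action with the best immediate reward.
best = int(numpy.argmax(rewards))
best_action, best_state = actions[best], new_states[best]
```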
Plangym is tested on Ubuntu 20.04 and Ubuntu 21.04 with Python 3.7 and 3.8. Installing it with Python 3.6 will break AtariEnv, and RetroEnv does not yet support Python 3.9.
Assuming that the environment libraries you want to use are already installed, you can install plangym with pip by running:

```bash
pip3 install plangym
```
If you also want to install the environment libraries, first clone the repository:

```bash
git clone [email protected]:FragileTech/plangym.git
cd plangym
```
Install the system dependencies by running:

```bash
sudo apt-get install -y --no-install-suggests --no-install-recommends libglfw3 libglew-dev libgl1-mesa-glx libosmesa6 xvfb swig
```
To install MuJoCo, run:

```bash
make install-mujoco
```
Finally, install the project requirements and plangym:

```bash
pip install -r requirements.txt
pip install .
```
This is a summary of the upcoming improvements to the project:
- Improved documentation:
  - Adding specific tutorials for all the different types of supported environments.
  - Adding a developer guide section for incorporating new environments into plangym.
  - Improving the library docstrings with more examples and detailed information.
- Better gym integration:
  - Registering all plangym environments in gym under a namespace.
  - Offering more control over how the states are passed to `step`, `reset`, and `step_batch`.
  - Allowing the states to be returned inside the info dictionary.
- Adding new environments to plangym, such as:
  - Gym mujoco
  - Gym robotics
  - Gym-pybullet-drones
- Support for rendering in notebooks that are running on headless machines.
Plangym is released under the MIT license.
Contributions are very welcome! Please check the contributing guidelines before opening a pull request.
If you have any suggestions for improvement, or if you want to report a bug, please open an issue.