- Autonomous Drone Racing Project Course
- Table of Contents
- Documentation
- Required Packages - Overview
- Step-by-Step Installation
- Fork lsy_drone_racing
- Setting up your environment
- Package Installation
- Installation of your lsy_drone_racing fork (necessary for sim & real)
- Installation of crazyflow (necessary for sim & real)
- Install Motion Capture Tracking (necessary for real only)
- Install Models Repository (necessary for real only)
- Install Estimators Repository (necessary for real only)
- Install cflib (necessary for real only)
- Install Acados (necessary for MPC approaches in sim & real)
- USB Preparation for crazyradio (real only)
- cfclient (real only / optional)
- Using Docker
- Difficulty levels
- The online competition
- Creating your own controller
- Common errors
- Deployment
To get you started with the drone racing project, you can head over to our documentation page.
To run the LSY Autonomous Drone Racing project in simulation, you need two repositories:
- lsy_drone_racing (main branch): This repository contains the environments and scripts to simulate and deploy the drones in the racing challenge.
- crazyflow (main branch): This repository contains the drone simulation.
To run the LSY Autonomous Drone Racing project in the lab on real hardware, you need three repositories in addition to the ones required for the simulation:
- motion_capture_tracking (ros2 branch): This repository is a ROS 2 package that receives data from our VICON motion capture system and publishes it on the /tf topic via ROS 2.
- estimators (main branch): Estimators to accurately predict the drone state based on the VICON measurements.
- models (main branch): Dynamics models of the Crazyflie quadrotor.
The first step is to fork the lsy_drone_racing repository for your own group. This has two purposes: You automatically have your own repository with git version control, and it sets you up for taking part in the online competition and automated testing (see competition).
If you have never worked with GitHub before, see the docs on forking.
You need a working installation of ROS 2 Jazzy and Python 3.11 to deploy your controller on the real drone in the end.
We recommend RoboStack and micromamba for this. RoboStack lets you install your favorite ROS version independent of your OS. It builds on top of conda/mamba to do this.
Please follow the RoboStack Getting Started guide to create a ROS 2 Jazzy environment on Python 3.11 using micromamba or another package manager of your choice.
We provide a PC in the Lab on which you are allowed to run your controllers during deployment. Please create a new user for each team and follow the instructions regarding micromamba & robostack as noted above. Then, proceed with the Package Installation instructions.
If you only want to run the simulation, you can use your favorite conda/mamba/venv to install the packages. However, you will need a working installation of ROS2 if you want to deploy your controller on the real drone.
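Before moving on, it can help to sanity-check the environment you just activated. The following script is an illustrative sketch (not part of any repository) that flags the two most common setup mistakes, a wrong Python version and a missing ROS 2 CLI:

```python
import shutil
import sys


def check_environment() -> list:
    """Return a list of potential problems with the current environment."""
    problems = []
    # Deployment on the real drone expects Python 3.11.
    if sys.version_info[:2] != (3, 11):
        problems.append(
            f"Python {sys.version_info.major}.{sys.version_info.minor} active, 3.11 expected"
        )
    # ROS 2 is only required for deployment, not for the simulation.
    if shutil.which("ros2") is None:
        problems.append("ros2 CLI not found on PATH (needed for deployment only)")
    return problems


if __name__ == "__main__":
    for problem in check_environment():
        print("WARNING:", problem)
```

An empty output means both checks passed for your currently active environment.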
Note: Make sure you have activated your environment before installing the packages
First, clone the new fork from your own account by running
mkdir -p ~/repos && cd ~/repos
git clone https://github.com/<YOUR-USERNAME>/lsy_drone_racing.git
Now you can install the lsy_drone_racing package in editable mode from the repository root
cd ~/repos/lsy_drone_racing
pip install --upgrade pip
pip install -e .
In addition, you also need to install the crazyflow package
cd ~/repos
git clone https://github.com/utiasDSL/crazyflow.git
cd ~/repos/crazyflow
pip install -e .
Finally, you can test if the installation was successful by running
cd ~/repos/lsy_drone_racing
python scripts/sim.py
If everything is installed correctly, this opens the simulator and simulates a drone flying through four gates.
You can also install the extended dependencies with
cd ~/repos/lsy_drone_racing
pip install -e .[rl,test]
and check if all tests complete with
cd ~/repos/lsy_drone_racing
pytest tests
Create a ROS 2 workspace in which the package will be located:
mkdir -p ~/ros_ws/src
Clone the repository and build it using colcon. Make sure your robostack environment is activated.
cd ~/ros_ws/src
git clone --recurse-submodules https://github.com/utiasDSL/motion_capture_tracking
cd ../
colcon build --cmake-args -DCMAKE_POLICY_VERSION_MINIMUM=3.5
Test your installation: For this to work you have to be in the lab and be connected to our local network.
# Check connection to the vicon PC
ping 10.157.163.191
If this works, source the workspace and run the motion capture tracking node
source ~/ros_ws/install/setup.sh
ros2 launch motion_capture_tracking launch.py
Optional: Sourcing your workspace automatically
Every time you run the motion_capture_tracking node, you have to source the workspace first. You can either do this manually or automate it by adding a script to the activate.d
directory of your package manager's environment. The files within this directory are run when the environment is activated. Because the files are run in alphabetical order, we start our file name with "x".
For micromamba this would be:
echo "source $HOME/ros_ws/install/setup.sh" > ~/micromamba/envs/ros_env/etc/conda/activate.d/xcustom_activate.sh
cd ~/repos
# Download and install our models repository
git clone [email protected]:utiasDSL/models.git
cd models
pip install -e .
cd ~/repos
git clone [email protected]:utiasDSL/estimators.git
cd estimators
pip install -e .
Cflib is a library provided by Bitcraze to communicate with the Crazyflie drones via the radio. As it depends on numpy 1.x, but our repositories require numpy>=2.0, we first install cflib and then reinstall numpy 2.x.
pip install cflib
pip install -U numpy
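To verify that the reinstall actually left you on numpy 2.x, a quick check like the following can help. The numpy_is_v2 helper is illustrative, not part of any repository:

```python
from importlib.metadata import PackageNotFoundError, version


def numpy_is_v2(ver: str) -> bool:
    """Return True if a version string denotes numpy >= 2.0."""
    return int(ver.split(".")[0]) >= 2


if __name__ == "__main__":
    try:
        v = version("numpy")
    except PackageNotFoundError:
        print("numpy is not installed")
    else:
        print(f"numpy {v}:", "OK" if numpy_is_v2(v) else "run pip install -U numpy again")
```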
Acados is an optimal control framework that can be used to control the quadrotor with a model predictive controller. Even though the installation instructions can also be found on the acados webpage, we have summarized the installation for our recommended setup:
# Clone the repo and check out the correct branch, initialize submodules.
cd ~/repos
git clone https://github.com/acados/acados.git
cd acados
git checkout tags/v0.5.0
git submodule update --recursive --init
# Build the application
# Note: If you use Robostack, this might lead to issues. Try to build acados outside your environment if this is the case.
mkdir -p build
cd build
cmake -DACADOS_WITH_QPOASES=ON ..
# add more optional arguments e.g. -DACADOS_WITH_DAQP=ON, a list of CMake options is provided below
make install -j4
# In your environment, make sure you install the acados python interface:
# Note: If you build acados outside your environment previously, activate it again before executing the following commands.
cd ~/repos/acados
pip install -e interfaces/acados_template
# Make sure acados can be found by adding its location to the path. For robostack and micromamba, this would be:
echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:"$HOME/repos/acados/lib"' > ~/micromamba/envs/ros_env/etc/conda/activate.d/xcustom_acados_ld_library.sh
echo 'export ACADOS_SOURCE_DIR="$HOME/repos/acados"' > ~/micromamba/envs/ros_env/etc/conda/activate.d/xcustom_acados_source.sh
# Deactivate and activate your env again such that the previous two lines can take effect.
micromamba deactivate
micromamba activate ros_env
# Run a simple example from the acados example to make sure it works.
# If it asks whether you want to install the t_renderer package automatically, answer yes.
python3 ~/repos/acados/examples/acados_python/getting_started/minimal_example_ocp.py
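Because the two activate.d scripts above only take effect after reactivating the environment, a quick sanity check of the relevant variables can save debugging time. The acados_env_ok helper below is a sketch, not part of acados:

```python
import os


def acados_env_ok(environ: dict, acados_dir: str) -> list:
    """Return a list of problems with the acados-related environment variables."""
    problems = []
    if environ.get("ACADOS_SOURCE_DIR") != acados_dir:
        problems.append(f"ACADOS_SOURCE_DIR should be set to {acados_dir}")
    lib_dir = os.path.join(acados_dir, "lib")
    # LD_LIBRARY_PATH is a colon-separated list of directories.
    if lib_dir not in environ.get("LD_LIBRARY_PATH", "").split(":"):
        problems.append(f"LD_LIBRARY_PATH does not contain {lib_dir}")
    return problems


if __name__ == "__main__":
    acados_dir = os.path.expanduser("~/repos/acados")
    for problem in acados_env_ok(dict(os.environ), acados_dir):
        print("WARNING:", problem)
```

If either warning is printed, deactivate and reactivate the environment (or fix the two activate.d scripts) before running the acados example.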
Next, paste the following block into your terminal
cat <<EOF | sudo tee /etc/udev/rules.d/99-bitcraze.rules > /dev/null
# Crazyradio (normal operation)
SUBSYSTEM=="usb", ATTRS{idVendor}=="1915", ATTRS{idProduct}=="7777", MODE="0664", GROUP="plugdev"
# Bootloader
SUBSYSTEM=="usb", ATTRS{idVendor}=="1915", ATTRS{idProduct}=="0101", MODE="0664", GROUP="plugdev"
# Crazyflie (over USB)
SUBSYSTEM=="usb", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="5740", MODE="0664", GROUP="plugdev"
EOF
# USB preparation for crazyradio
sudo groupadd plugdev
sudo usermod -a -G plugdev $USER
# Apply changes
sudo udevadm control --reload-rules
sudo udevadm trigger
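The group change only takes effect after logging out and back in. To check whether your user already appears in the plugdev group, a small helper like this (illustrative only, using the standard grp module) can be used:

```python
import getpass
import grp


def user_in_group(user: str, group_name: str) -> bool:
    """Return True if the user is listed as a member of the group in /etc/group."""
    try:
        return user in grp.getgrnam(group_name).gr_mem
    except KeyError:
        # The group does not exist (yet).
        return False


if __name__ == "__main__":
    user = getpass.getuser()
    if user_in_group(user, "plugdev"):
        print(f"{user} is in plugdev")
    else:
        print(f"{user} is NOT in plugdev -- run the usermod command above and re-login")
```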
Optionally, you can also install cfclient to debug issues with the drones and configure IDs etc.
Note: We recommend doing this in a separate environment to prevent conflicts between package versions.
# (optional) Install cfclient
sudo apt install libxcb-xinerama0
pip install --upgrade pip
pip install cfclient
You can also run the simulation with Docker, albeit without the GUI at the moment. To test this, install Docker with Docker Compose on your system, and then run
docker compose --profile sim build
docker compose --profile sim up
After building, running the container should produce the following output:
sim-1 | INFO:__main__:Flight time (s): 8.466666666666667
sim-1 | Reason for termination: Task completed
sim-1 | Gates passed: 4
sim-1 |
sim-1 | 8.466666666666667
The complete problem is specified by a TOML file, e.g. level0.toml
The config folder contains settings for progressively harder scenarios:
| Evaluation Scenario | Rand. Inertial Properties | Randomized Obstacles, Gates | Notes |
|---|---|---|---|
| Level 0 | No | No | Perfect knowledge |
| Level 1 | Yes | No | Adaptive |
| Level 2 | Yes | Yes | Re-planning |
| sim2real | Real-life hardware | Yes | Sim2real transfer |
| Bonus | Yes | Yes | Multi-agent racing |
Warning: The bonus level has not yet been tested with students. You are not expected to solve this level. Only touch this if you have a solid solution already and want to take the challenge one level further.
You can choose which configuration to use with the --config command line option. To e.g. run the example controller on the hardest simulation scenario, you can use the following command
python scripts/sim.py --config config/level2.toml
During the semester, you will compete with the other teams on who is fastest to complete the drone race. You can see the current standings on the competition page on Kaggle, a popular ML competition website. The results of the competition will NOT influence your grade directly. However, they give you a sense of how performant and robust your approach is compared to others. In addition, the competition is an easy way for you to check if your code is running correctly. If there are errors in the automated testing, chances are your project also doesn't run on our systems. The competition will always use difficulty level 2.
To take part in the competition, you first have to create an account on Kaggle. Next, use this invite link to join the competition, go to the drone racing competition, click on "Rules", and accept the competition conditions. This step is necessary to allow submissions from your account.
The competition submission to Kaggle is fully automated. However, to make the automation work with your Kaggle account, you first have to save your credentials in GitHub. GitHub offers a way to safely store this information without giving anyone else access to it via its secrets. Start by opening your account settings on Kaggle, go to the API section and click on Create New Token. This will download a json file containing two keys: Your account username and an API key. Next, open your lsy_drone_racing GitHub repository in the browser and go to Settings -> Secrets and variables -> Actions
Note: You have to select the repository settings, not your account settings
Here you add two new repository secrets using the information from the json file you downloaded:
- Name: KaggleUsername Secret: INSERT_YOUR_USERNAME
- Name: KaggleKey Secret: INSERT_YOUR_KEY
The whole point of the steps you just took was to set you up to use the GitHub action defined in your repository's .github folder. This workflow runs every time you push changes to your repository's main or master branch. To prevent submitting every iteration of your code, you can create new branches and only merge them into the main branch once you have finished your changes. However, we recommend regularly updating your main branch to see how fast you are and whether the code runs without problems.
Note: The competition will count your fastest average lap time. If a submission performed worse than a previous iteration, it won't update your standing.
Note: The first time the test runs on your account, it will take longer than usual because it has to install all dependencies in GitHub. We cache this environment, so subsequent runs should be faster.
Warning: Kaggle only accepts 100 submissions per day. While we really hope you don't make 100 commits in a single day, we do mention it just in case.
Once you have pushed your latest iteration, a GitHub action runner will start testing your implementation. You can check the progress by clicking on the Actions tab of your repository. If the submission fails, you can check the errors there. Please let us know if something is not working as intended. If you need additional packages for your project, please make sure to update the environment.yaml file accordingly. Otherwise, the tests will fail. If you want to get a more detailed summary of your performance, you can have a look at the test output directly.
To implement your own controller, have a look at the example implementation. We recommend altering the existing example controller instead of creating your own file so you don't break the testing pipeline. Please also read through the documentation of the controller. You must not alter its function signatures. If you encounter problems implementing something with the given interface, contact one of the lecturers.
If you were able to install everything without any issues, but the simulation crashes when running the sim script, check the error messages for anything related to LIBGL and GLIBCXX_3.4.30. If you don't find any conclusive evidence about what happened, you can also run the simulation in verbose mode for LIBGL with
LIBGL_DEBUG=verbose python scripts/sim.py
Next, you should check if your system has the required library installed
strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX_3.4.30
or if it is installed in your mamba environment
strings /<PATH-TO-YOUR-MAMBA-ENV>/lib/libstdc++.so.6 | grep GLIBCXX_3.4.30
If neither of those yields any results, you are missing this library and can install it with
micromamba install -c conda-forge gcc=12.1.0
If the program still crashes and complains about not finding GLIBCXX_3.4.30, please update your LD_LIBRARY_PATH variable to point to your mamba environment's lib folder.
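The strings-and-grep check above can also be reproduced in Python if the strings utility is unavailable. The has_symbol helper below is illustrative and simply searches the shared library's raw bytes for the version tag:

```python
import sys


def has_symbol(library_path: str, symbol: bytes = b"GLIBCXX_3.4.30") -> bool:
    """Search a shared library's raw bytes for a version tag (like strings + grep)."""
    with open(library_path, "rb") as f:
        return symbol in f.read()


if __name__ == "__main__":
    # The default path matches the system check above; pass your mamba env's
    # libstdc++.so.6 as the first argument to check that one instead.
    path = sys.argv[1] if len(sys.argv) > 1 else "/usr/lib/x86_64-linux-gnu/libstdc++.so.6"
    try:
        print(f"{path}: {'found' if has_symbol(path) else 'missing'} GLIBCXX_3.4.30")
    except OSError as e:
        print(f"could not open {path}: {e}")
```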
Change the USB access permissions with
sudo chmod -R 777 /dev/bus/usb/
Make sure your drone is selected in the VICON system, otherwise it will not be tracked.
You will have to modify two config files before liftoff:
Please modify ~/repos/estimators/ros_nodes/estimators.toml in the estimators repository to include the up-to-date drone ID (in decimal).
Please either modify ~/repos/lsy_drone_racing/config/level2.toml or create your own custom config file to include the correct drone ID.
Note: The following should be run within your team's mamba environment.
You will need a total of three terminal windows for deploying your controller on the real drone:
Terminal 1 launches the motion_capture_tracking node and ensures that the positions of all VICON objects are published via ROS 2:
micromamba activate ros_env
ros2 launch motion_capture_tracking launch.py
Terminal 2 starts the estimator:
micromamba activate ros_env
cd ~/repos/estimators
python3 lsy_estimators/ros_nodes/ros2_node.py --settings ros_nodes/estimators.toml
Terminal 3 starts the deployment of the controller:
micromamba activate ros_env
cd ~/repos/lsy_drone_racing/scripts
python deploy.py --controller <your_controller.py> --config level2.toml
where <your_controller.py> implements a controller that inherits from lsy_drone_racing.control.BaseController.