This repository contains the ROS2-based stack for audio processing and sound-based localization on robots. The system was tested on the Crazyflie 2.1 drone and the e-puck2 robot, shown in the image above.
This repository is part of the experimental framework described in the paper:
F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield, and M. Vetterli, "Blind as a bat: audible
echolocation on small robots", IEEE Robotics and Automation Letters (Early Access), 2022.
The framework includes all components listed below. The components are kept modular so that researchers can focus on what is relevant to them. For instance, to recreate the audio extension deck and do further research on audio-based navigation for drones, only repository 2 is necessary; to run the audio-based algorithms in ROS on a different robot, repository 1 is enough as a starting point.
1. ROS processing pipeline (this repository), for processing audio for tasks such as obstacle detection and sound source localization.
2. Audio deck firmware, which also includes the PCB files for reproducing the audio deck.
3. Public datasets, for audio-based localization on the e-puck2 robot and the Crazyflie drone.
4. Crazyflie firmware (fork of the official vendor firmware), with an added audio deck driver.
5. GTSAM extension for performing factor-graph inference using the echolocation measurements.
The repository was built for Ubuntu 20.04 (Focal Fossa) with ROS2 (Galactic) and Python 3.8.12.
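To check that your setup matches these versions, you can run a few standard commands (this assumes ROS2 Galactic is already installed and sourced):

```bash
lsb_release -rs    # should print 20.04
echo $ROS_DISTRO   # should print galactic
python3 --version  # should print Python 3.8.x
```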
To install, make sure to clone this repo including its submodules (e.g. the datasets) by running:

```bash
git clone --recurse-submodules https://github.com/LCAV/audioROS
```
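If you already cloned the repo without the `--recurse-submodules` flag, the submodules can still be fetched afterwards:

```bash
git submodule update --init --recursive
```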
To install all packages contained in this repo, including dependencies, run (from the root of this repository):

```bash
sudo apt install python3-rosdep python3-colcon-common-extensions
sudo rosdep init
cp 19-custom.list /etc/ros/rosdep/sources.list.d/  # might need sudo here
rosdep update
rosdep install --from-path src/ --rosdistro $ROS_DISTRO
colcon build --symlink-install
. install/local_setup.bash
```
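Note that `install/local_setup.bash` needs to be sourced again in every new terminal (or added to your `~/.bashrc`). As a quick sanity check that the build succeeded, you can list the workspace packages:

```bash
# should list audio_bringup, audio_interfaces, audio_stack, etc.
ros2 pkg list | grep audio
```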
Update from May 25, 2022: because of an inconsistency in matplotlib, running all of the analysis notebooks (which use LaTeX rendering) requires the following two non-standard packages:

```bash
sudo apt install cm-super dvipng
```
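Before opening the notebooks, you can verify that matplotlib's LaTeX rendering works with a quick smoke test; this one-liner is only an illustrative check (not part of the repo) and should fail with a LaTeX/dvipng error if the packages above are missing:

```bash
python3 -c 'import matplotlib.pyplot as plt; plt.rcParams["text.usetex"] = True; plt.plot([0, 1]); plt.title(r"$x^2$"); plt.savefig("/tmp/usetex_test.png")'
```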
The stack is separated into the following modules:
- `audio_interfaces` (C++): custom ROS message definitions.
- `audio_bringup` (python): pipeline for recording measurements (`measurement_pipeline.py`) and launch files.
- `audio_gtsam` (python): classes to build a factor graph from audio and pose measurements.
- `audio_stack` (python): read and process audio signals.
- `audio_simulation` (python): simulated audio using pyroomacoustics.
- `audio_publisher` (python): publish signals from a file or the computer's audio input stream.
- `crazyflie_crtp` (python): publish audio signals received over CRTP from the Crazyflie drone.
- `crazyflie_demo` (python): demo of the drone detecting and avoiding walls.
- `crazyflie_description` (python): commonly used global parameters of the Crazyflie drone.
- `topic_plotter` (python): create plots of the different topics.
- `topic_writer` (python): convert data from topics to different formats (e.g. csv) and save them.
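The custom message definitions in `audio_interfaces` can be inspected directly from the command line once the workspace is sourced. The message name in the second command is only an example; use whatever names the first command prints:

```bash
# list all interfaces defined by the custom package
ros2 interface list | grep audio_interfaces
# show the fields of a specific message (name is an example)
ros2 interface show audio_interfaces/msg/Signals
```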
A diagram of the interactions of the different packages is shown below.
- To simulate the drone doing wall detection, run `ros2 launch audio_bringup wall_simulated.launch.py`.
- To simulate the drone doing sound source localization inside a room, run `ros2 launch audio_bringup doa_simulated.launch.py`.
- To do real-time wall detection on the Crazyflie drone, run `ros2 launch audio_bringup wall_real.launch.py`.
- To run a new measurement campaign, run `ros2 run audio_bringup measurement_pipeline`.
- To do real-time direction-of-arrival estimation (of an external sound source) on the Crazyflie drone, run `ros2 launch audio_bringup doa_real.launch.py`.
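While any of these pipelines is running, the standard ROS2 introspection tools are useful for checking the data flow. The topic name below is a placeholder; use `ros2 topic list` to find the actual names:

```bash
# list the topics currently being published
ros2 topic list
# print incoming messages on a topic (name is an example)
ros2 topic echo /audio/signals_f
# visualize the live node/topic graph (requires the rqt packages)
ros2 run rqt_graph rqt_graph
```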
This repo also contains functions to analyze the experimental data, grouped in the folder `python/`. The table below explains where to find what.

Note that the notebooks use preprocessed data to speed up analysis. To process data for the first time, you can use the examples listed in the Makefile. For instance, to preprocess the stepper-motor results of the Crazyflie reported in the thesis, run `make crazyflie_stepper_thesis`. Similarly, for the e-puck results, run `make epuck_results`. See `python/Makefile` for all options.
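To preview the commands behind a given Makefile target without executing them, GNU make's dry-run flag can be used; the target name here is taken from the example above, and the command assumes you run it from the repository root:

```bash
# print the commands the target would run, without executing them
make -C python -n crazyflie_stepper_thesis
```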
Descriptions:
| notebook | datasets | results |
|---|---|---|
| StepperAnalysis | 2021_07_08_stepper_fast | echolocation, distance and frequency slice, matrices [1] |
| | 2021_07_27_epuck_wall | echolocation, e-puck results [1] |
| CleanupAnalysis | 2021_07_08_stepper_fast | echolocation, matrix cleaning results [1] |
| DistanceFlying | 2021_10_12_flying | echolocation, drone flying towards walls [1] |
| | 2021_11_23_demo | echolocation, drone avoiding walls [1] |
| | 2022_01_27_demo | echolocation, drone avoiding whiteboards [1] |
| ApproachAngleFlying | 2021_05_04_linear | echolocation, approach angle results [2] |
| WallStudy | -- | echolocation, simulation results [1] |
| DoaAnalysis | 2021_10_12_doa_stepper | doa, experiments on stepper motor [2] |
| | 2021_10_12_doa_flying | doa, experiments with hovering drone [2] |
| DoaStudy | -- | doa, simulation results [2] |
We took inspiration from this helpful article for the structure of this repo.
Please refer to the publications below for more information.

[1] RA-L paper (main reference):
F. Dümbgen, A. Hoffet, M. Kolundžija, A. Scholefield, and M. Vetterli, "Blind as a bat: audible echolocation on small robots", IEEE Robotics and Automation Letters (Early Access), 2022.

[2] Ph.D. dissertation (including additional methods and experimental analysis):
F. Dümbgen, "Blind as a bat: spatial perception without sight", Ph.D. dissertation, École Polytechnique Fédérale de Lausanne, 2021.