This project is developed for ROS2 Humble using Gazebo Classic on Ubuntu 22.04. It consists of three packages within `src`, which together simulate an entity-tracking environment through sensor fusion of LIDAR and ADS-B data.
This repository also contains the Multi-tracking Training 10_10_24.pptx presentation, which serves as the official guide for the training, providing step-by-step instructions.
- drone_sim: Contains the Gazebo simulation environment, along with complementary nodes to move the drone and the cubes.
- multitracker: Node that acts as a multitracker for entities, using sensor data.
- sim_msgs: Package that contains the custom messages used in the project, such as `Adsb.msg`.
This project simulates a drone equipped with a LIDAR sensor that tracks three entities (cubes) through sensor data fusion. The cubes periodically publish their states via ADS-B messages on their respective topics, allowing the drone to track them in real time.
- The drone publishes its state on the `/drone/state` topic.
- The cubes publish their state on the `/cube/state` topic.
- If the LIDAR is used, the sensor will publish PointCloud messages on the `/drone/laser/scan` topic.
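As a minimal sketch of how a node might consume these topics, assuming the `Adsb` message type generated from `Adsb.msg` (field access is omitted because the exact field layout depends on the actual definition in `sim_msgs`):

```python
# Minimal sketch: subscribing to the simulation's state topics.
import rclpy
from rclpy.node import Node

from sim_msgs.msg import Adsb  # custom message from the sim_msgs package


class StateListener(Node):
    def __init__(self):
        super().__init__('state_listener')
        # Drone state published by the simulation
        self.create_subscription(Adsb, '/drone/state', self.drone_cb, 10)
        # Cube states published by the simulation
        self.create_subscription(Adsb, '/cube/state', self.cube_cb, 10)

    def drone_cb(self, msg):
        self.get_logger().info('Received drone ADS-B state')

    def cube_cb(self, msg):
        self.get_logger().info('Received cube ADS-B state')


def main():
    rclpy.init()
    node = StateListener()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```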
The multitracker node uses a Kalman filter to predict and update the state of the entities, and the tracking data is visualized in RViz2 through the `/cube/marker` topic.
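For reference, a generic predict/update cycle of a constant-velocity Kalman filter looks like the sketch below; the state layout and noise matrices used by the actual multitracker node may differ.

```python
# Generic constant-velocity Kalman filter sketch (2D position + velocity).
import numpy as np


class KalmanFilter:
    def __init__(self, dt=0.1):
        # State: [x, y, vx, vy]
        self.x = np.zeros((4, 1))
        self.P = np.eye(4)
        # Constant-velocity motion model
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        # Only position is measured (e.g. from an ADS-B track)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = 0.01 * np.eye(4)   # process noise
        self.R = 0.1 * np.eye(2)    # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z):
        z = np.reshape(z, (2, 1))
        y = z - self.H @ self.x                     # innovation
        S = self.H @ self.P @ self.H.T + self.R     # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x
```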
- ROS2 Humble
- Gazebo Classic
- Ubuntu 22.04
- Transform ADS-B tracks from global to local coordinates (see the conversion sketch after this list)
- Track Cube1 from its ADS-B data using a Kalman filter
- Create a list to track all 3 cubes at the same time
- Strengthen the multitracker (random disappearance, low-pass filter, ...)
- Add LIDAR measurements using a driver to get tracks
- Match LIDAR tracks with obstacles in the list
- New matrix in the Kalman filter for LIDAR tracks
- Final testing
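For the first item, one simple way to convert a global (latitude/longitude/altitude) ADS-B fix into local metric coordinates is a flat-earth (equirectangular) approximation around a reference point; the sketch below is purely illustrative and the project may use a different convention or library.

```python
# Sketch: converting a global ADS-B fix (lat/lon in degrees, altitude in metres)
# into local ENU coordinates relative to a reference point, using a flat-earth
# (equirectangular) approximation.
import math

EARTH_RADIUS = 6378137.0  # metres (WGS-84 equatorial radius)


def global_to_local(lat, lon, alt, ref_lat, ref_lon, ref_alt):
    """Return (east, north, up) of (lat, lon, alt) relative to the reference."""
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    north = d_lat * EARTH_RADIUS
    east = d_lon * EARTH_RADIUS * math.cos(math.radians(ref_lat))
    up = alt - ref_alt
    return east, north, up


# Example: a point roughly 111 m north of the reference
print(global_to_local(48.001, 2.0, 10.0, 48.0, 2.0, 0.0))
```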
- Build the project: First, build the `sim_msgs` package, followed by the rest of the project:
  `colcon build --packages-select sim_msgs`
  `colcon build`
- Source the environment: After building, execute:
  `source install/setup.bash`
- Launch the simulation: To start the Gazebo simulation environment, use the following command:
  `ros2 launch drone_sim drone_world_launch.py`
  In the launch file `drone_sim/launch/drone_world_launch.py`, on line 16, you can change `model.sdf` to `model_lidar.sdf` to include or exclude the LIDAR sensor on the drone (see the launch file sketch after these steps).
- Move the cubes randomly: To move the cubes randomly in the simulation, execute:
  `ros2 run drone_sim cube_mover`
- Move the drone randomly: To move the drone randomly, execute:
  `ros2 run drone_sim drone_mover`
- Launch the tracker node: The `tracker` node from the `multitracker` package uses a Kalman filter to predict and update the state of the tracked entities. To run this node, use:
  `ros2 run multitracker tracker`
  The tracking data for the cubes is published on the `/cube/marker` topic in Marker or MarkerArray format and can be visualized in RViz2. You can also launch the `tracker` node together with RViz2, with its adapted configuration, using:
  `ros2 launch multitracker tracker_launch.py`
- Error visualization: When you stop the `multitracker` node with `CTRL+C`, the RMSE of position and orientation for each tracked obstacle will be displayed. Additionally, the error data will be saved in `error_data.txt` in the workspace.
  To visualize the errors, run the Python script:
  `python3 /home/flyros/multitracker_sim_ws/error_plotter.py`
  This script generates graphs of the positional and orientational errors for each tracked obstacle.
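Regarding the launch step above, the model switch on line 16 of `drone_world_launch.py` might look roughly like the sketch below of a ROS2 Python launch file. Only the `model.sdf` / `model_lidar.sdf` switch comes from this README; the package paths and node arguments are assumptions for illustration.

```python
# Sketch of how drone_sim/launch/drone_world_launch.py might select the drone
# model. The actual launch file in the repository will differ.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    pkg_share = get_package_share_directory('drone_sim')

    # Switch between 'model.sdf' (no LIDAR) and 'model_lidar.sdf' (with LIDAR).
    drone_model = os.path.join(pkg_share, 'models', 'drone', 'model_lidar.sdf')

    spawn_drone = Node(
        package='gazebo_ros',
        executable='spawn_entity.py',
        arguments=['-entity', 'drone', '-file', drone_model],
        output='screen',
    )

    return LaunchDescription([spawn_drone])
```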
Note: This node currently tracks via ADS-B and LIDAR.
- `/drone/state`: Publishes the state of the drone.
- `/cube/state`: Publishes the state of the cubes.
- `/drone/laser/scan`: If the LIDAR is used, it publishes the sensor data in PointCloud format.
- `/cube/marker`: Publishes the tracking data in Marker format for visualization in RViz2.
The project uses a custom `Adsb.msg` message that contains the global state of the cubes and the drone. This message is used both for the simulation and for tracking the entities.
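As an illustration, a cube-side publisher of this message could look like the sketch below; the message fields are left unset because their names depend on the actual `Adsb.msg` definition, and the 1 Hz publish rate is an assumption.

```python
# Sketch: how a cube could periodically publish its global state as an ADS-B message.
import rclpy
from rclpy.node import Node

from sim_msgs.msg import Adsb


class CubeAdsbPublisher(Node):
    def __init__(self):
        super().__init__('cube_adsb_publisher')
        self.pub = self.create_publisher(Adsb, '/cube/state', 10)
        # Publish the state at 1 Hz; the real simulation may use another rate.
        self.create_timer(1.0, self.publish_state)

    def publish_state(self):
        msg = Adsb()
        # Fill in the global state fields defined in Adsb.msg here.
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(CubeAdsbPublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```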
This simulation project uses a drone equipped with LIDAR to track entities (cubes) in a Gazebo simulation environment. The states of the cubes are published using ADS-B messages and are processed by a Kalman filter in the multitracker node.