Launching VT&R3
A comprehensive video tutorial was created to familiarize new users with the user interface and to explore the features available in VT&R3. We recommend viewing this video before attempting to run any version of VT&R3. This demo will show you how to:
- Initialize the user interface
- Enter teach mode and manually drive a new path
- Perform loop closures
- Align the pose graph with the satellite imagery
- Teach additional branches to a graph
- Repeat along a path to user-defined waypoints
- Utilize multi-experience localization
- Load and initialize pose graphs from previous driving sessions
We also have some supplementary tutorials specific to the offline versions of Stereo SURF-Feature-Based T&R and LiDAR Point-Cloud-Based T&R. Please note that the offline data was recorded manually, so path tracking may not function correctly.
When using the VTR3 Sample Datasets, VT&R3 can be run in offline mode without connecting to any sensor or robot.
We use tmux and tmuxp to launch and run VT&R3. They are useful tools that allow multiple terminal sessions to be launched and accessed simultaneously in a single window.
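For reference, a tmuxp launch file is a small YAML description of a session. Below is a minimal illustrative sketch, not one of the actual VT&R3 launch files (those live under ${VTRSRC}/launch); the session name and pane commands are hypothetical:

```yaml
# Illustrative tmuxp config -- shows the general shape of a launch file,
# not the contents of the real VT&R3 ones.
session_name: vtr3-demo      # hypothetical session name
windows:
  - window_name: main
    panes:
      # each pane entry is a shell command run in its own terminal pane
      - source ${VTRSRC}/main/install/setup.bash
      - htop                 # e.g. a second pane for monitoring
```

Running `tmuxp load <file>` starts the session and attaches to it; you can detach with Ctrl-b d and reattach later with `tmux attach`.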
In addition, you must first:
- Install VT&R3+UI inside the `main` folder, including all of its dependencies.
- Install the Grizzly robot description packages inside the `robots` folder.
- Download the `utias_20210921_ros2bag` dataset from here into `${VTRDATA}`.
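Before launching, it can help to sanity-check that the expected directories are in place. A minimal sketch, where `check_dir` is a hypothetical helper and the fallback paths are placeholders for your own setup:

```shell
# Hypothetical prerequisite check -- verifies the expected directories exist.
check_dir() {
  # Print OK or MISSING for a given directory path.
  if [ -d "$1" ]; then echo "OK: $1"; else echo "MISSING: $1"; fi
}

check_dir "${VTRSRC:-/path/to/vtr3}/main"
check_dir "${VTRDATA:-/path/to/data}/utias_20210921_ros2bag"
```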
Launching VT&R3 to use stereo camera images as input:
```shell
# launch command
tmuxp load ${VTRSRC}/launch/offline_bumblebee_grizzly.launch.yaml
```
Data playback for teaching a path:
- Run these lines in a separate terminal after initializing the UI and entering teach mode.

```shell
# replay the first ros2 bag
source ${VTRSRC}/main/install/setup.bash
ros2 bag play ${VTRDATA}/utias_20210921_ros2bag/rosbag2_2021_09_21-16_25_56
```
Data playback for repeating a path:
- Run these lines in a separate terminal after teaching a path, entering repeat mode, and specifying repeat waypoints.

```shell
# replay the second ros2 bag
source ${VTRSRC}/main/install/setup.bash
ros2 bag play ${VTRDATA}/utias_20210921_ros2bag/rosbag2_2021_09_21-16_28_53
```

- Terminate VT&R3 (Ctrl-C once in the terminal)
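If playback misbehaves, inspecting a bag before replaying it can help. A small sketch, guarded so it is a no-op when `ros2` is not on your PATH (`have` is a hypothetical helper):

```shell
# Inspect a ros2 bag's topics and duration before replaying it.
have() { command -v "$1" >/dev/null 2>&1; }  # is a command available?

if have ros2; then
  ros2 bag info "${VTRDATA}/utias_20210921_ros2bag/rosbag2_2021_09_21-16_25_56"
else
  echo "ros2 not found -- source the ROS2 setup script first"
fi
```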
Launching VT&R3 to use LiDAR point-clouds as input:
```shell
# launch command
tmuxp load ${VTRSRC}/main/launch/offline_honeycomb_grizzly.launch.yaml
```
Data playback for teaching a path:
- Run these lines in a separate terminal after initializing the UI and entering teach mode.

```shell
# replay the first ros2 bag
source ${VTRSRC}/main/install/setup.bash
ros2 bag play ${VTRDATA}/utias_20210921_ros2bag/rosbag2_2021_09_21-16_25_56
```
Data playback for repeating a path:
- Run these lines in a separate terminal after teaching a path, entering repeat mode, and specifying repeat waypoints.

```shell
# replay the second ros2 bag
source ${VTRSRC}/main/install/setup.bash
ros2 bag play ${VTRDATA}/utias_20210921_ros2bag/rosbag2_2021_09_21-16_28_53
```

- Terminate VT&R3 (Ctrl-C once in the terminal)
Perform the following steps to start driving Grizzly manually. Steps marked with "*" should be done with help from a lab mate during your first attempt.
- * Unplug Grizzly from wall charger
- * Ensure 2 e-stops are active
- * Turn on red power switch, set key to run
- * Close panel
- * Configure Grizzly network.
- * Connect the Grizzly computer (via the yellow Ethernet cable) and the Xbox controller (via USB) to your laptop.
- Use grizzly_control.launch.yaml to start a tmux session that passes commands from the Xbox controller to Grizzly, by opening a new terminal outside of Docker and inputting the following command. * You will be asked to input the Grizzly computer's password in the bottom pane. Ask a lab mate for the password and controller key mappings.
```shell
tmuxp load ${GRIZZLY}/launch/control.launch.yaml
```
E-stop remote: the Grizzly's e-stop remote must be paired before it can be enabled. To pair:
- Press and hold the green button until the light is green
- Press release twice
- Press the green button until the light flashes

If the light is red, the battery is dead and must be replaced.
You should now be able to drive Grizzly manually (when emergency stop is released). Always keep the tmux session (launched by grizzly_control.launch.yaml) on while running (Stereo/LiDAR) VT&R3.
How it works: your machine receives commands from the Xbox controller and publishes them as ROS2 messages. The Grizzly computer receives the ROS2 messages and converts them to ROS1 messages, which are then received by the internal microcontroller.
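To watch this pipeline in action you can echo the intermediate topics. The topic names below (`/joy`, `/cmd_vel`) are common ROS conventions, not confirmed from the launch files, so treat them as assumptions and check the launch files for the real names:

```shell
# Peek at the controller pipeline (topic names are assumptions).
have() { command -v "$1" >/dev/null 2>&1; }  # is a command available?

if have ros2; then
  ros2 topic echo --once /joy      # raw Xbox controller messages from your laptop
  ros2 topic echo --once /cmd_vel  # velocity commands forwarded to the Grizzly
else
  echo "ros2 not found -- run this inside the VT&R3 environment"
fi
```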
Connect the Bumblebee XB3 camera (via USB-C) to your machine. (You may be asked to authenticate this connection.) Use online_bumblebee_grizzly.launch.yaml to launch VT&R3 and wait a few seconds. To launch the camera, open a terminal in Docker.
```shell
# compilation is only required once
cd $GRIZZLY/ros2/vision
source /opt/ros/humble/setup.bash
colcon build --symlink-install

# launch the camera driver
cd $GRIZZLY
tmuxp load launch/bumblebee.launch.yaml
```
You should then see left and right images received from the camera.
```shell
tmuxp load ${VTRSRC}/launch/online_bumblebee_grizzly.launch.yaml
```
You should now be able to use VT&R3. To start, open the Web Interface.
Connect the Honeycomb LiDAR (blue Ethernet-USB) to your machine. Use honeycomb.launch.yaml to turn on the sensor, which will then publish point-clouds as ROS1 and ROS2 messages (ros1_bridge is launched alongside).

```shell
tmuxp load ${GRIZZLY}/launch/honeycomb.launch.yaml
```
If VT&R3 was installed using Docker, open the vtr3 Docker container first. Use online_honeycomb_grizzly.launch.yaml to launch VT&R3:

```shell
tmuxp load ${VTRSRC}/launch/online_honeycomb_grizzly.launch.yaml
```

Open a web browser and navigate to localhost:5200.
You should now be able to use VT&R3. To start, open the Web Interface.
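If the page does not load, a quick reachability probe can distinguish a UI problem from a browser problem. A sketch using port 5200 as stated above; `check_ui` is a hypothetical helper:

```shell
# Probe the VT&R3 web UI port with curl.
check_ui() {
  # $1 = port; reports whether something answers on http://localhost:$1
  if curl -sf -o /dev/null --max-time 2 "http://localhost:$1"; then
    echo "UI reachable on port $1"
  else
    echo "UI not reachable on port $1 -- is VT&R3 running?"
  fi
}

check_ui 5200
```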