# Install Without Docker
VT&R3 requires a high-powered, NVIDIA-GPU-enabled machine that can run Ubuntu 20.04. Example:

- Lenovo ThinkPad P53 notebook: Intel Core i7-9750H processor, NVIDIA Quadro T2000 4GB GPU, 32GB DDR4 RAM

VT&R3 can handle data from either a stereo camera or a spinning LiDAR. Examples:

- stereo camera: FLIR Bumblebee XB3 FireWire
- LiDAR: Waymo Honeycomb, Velodyne Alpha Prime

See this page for how to use VT&R3 with your own camera/LiDAR sensors.

VT&R3 communicates with the underlying platform only through the system's kinematic controller inputs, so it can be run on any robot that accepts ROS `geometry_msgs/Twist` messages as control commands.
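For example, once your platform's driver is up, you can sanity-check this interface from the command line. A minimal sketch, assuming the controller subscribes to a `/cmd_vel` topic (replace with your robot's actual command topic):

```bash
# Publish a single zero-velocity Twist command (topic name /cmd_vel is an assumption)
ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
```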
The following environment variables are assumed present so that files and data can be placed in different locations on different machines. It is recommended to put them in `.bashrc`:

```bash
export VTRROOT=~/ASRL  # (INTERNAL default) root directory of VTR3 (this variable only initializes the following variables and will not be used anywhere else)
# you can change the following directories to anywhere appropriate
export VTRSRC=${VTRROOT}/vtr3   # source code of VTR3 (this repo)
export VTRDEPS=${VTRROOT}/deps  # system dependencies of VTR3
export VTRVENV=${VTRROOT}/venv  # python dependencies of VTR3 (not used at the moment)
export VTRDATA=${VTRROOT}/data  # datasets for VTR3
export VTRTEMP=${VTRROOT}/temp  # temporary data directory for testing
```

Remember to create these directories:

```bash
mkdir -p ${VTRSRC} ${VTRDEPS} ${VTRVENV} ${VTRDATA} ${VTRTEMP}
```
If the default values above are used, the final directory structure should look like:

```
|- ~/ASRL
  |- data              sample datasets
  |- temp              temporary files
  |- venv              a python virtual environment (NOTE: currently not used, but will be eventually)
  |- vtr3              source code and installation
    |- config          configuration files
    |- extra           sensor, robot and dataset specific add-ons
    |- launch          tmuxp launch files
    |- main            main VTR3 ROS2 packages
    |- robots          example robot description packages
  |- deps              system dependencies source code and installation
    |- opencv          OpenCV source code cloned from GitHub, installed to /usr/local/opencv_cuda/[lib,bin]
    |- opencv_contrib  OpenCV contrib source code cloned from GitHub, installed together with opencv
    |- proj            PROJ source code cloned from GitHub, installed to /usr/local/[lib,bin]
    |- vtr_ros2_deps   VTR3 dependencies from ROS2 packages
```
Download the VT&R3 source code and its submodules:

```bash
cd ${VTRSRC}
git clone git@github.com:utiasASRL/vtr3.git .
git submodule update --init --remote
```
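To confirm the submodules were fetched correctly (a quick sanity check):

```bash
git submodule status  # a leading '-' on any line means that submodule is not initialized
```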
## Machine specific settings

- Change the NVIDIA GPU compute capability in gpusurf line 16 based on your GPU model (defaults to 7.5); the sketch after this list shows one way to look it up.
- Change `OpenCV_DIR` in gpusurf line 21 and vtr_common line 48 to point to your OpenCV+CUDA installation (defaults to `/usr/local/opencv_cuda/lib/cmake/opencv4`). If you do not have an OpenCV+CUDA installation, keep the default value for now and continue to the next section; OpenCV will be installed to this location later.
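If you are unsure of your GPU's compute capability, recent NVIDIA drivers can report it directly; on older drivers, look your model up on NVIDIA's CUDA GPUs page instead:

```bash
# Prints name and compute capability, e.g. "Quadro T2000, 7.5" (requires a recent driver)
nvidia-smi --query-gpu=name,compute_cap --format=csv,noheader
```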
```bash
# Dependencies from Debian packages
sudo apt install -y tmux  # for launching VTR3
sudo apt install -y doxygen  # for building the documentation
sudo apt install -y nodejs npm protobuf-compiler  # for building the VTR web-based graphical user interface
sudo apt install -y libboost-all-dev libomp-dev  # boost and openmp, needed by multiple packages
sudo apt install -y libpcl-dev  # point cloud library, for LiDAR VTR
sudo apt install -y libcanberra-gtk-module libcanberra-gtk3-module  # for camera VTR image playback (see: https://github.com/utiasASRL/vtr3/issues/107)
sudo apt install -y libdc1394-22 libdc1394-22-dev  # (INTERNAL) for BumbleBee stereo camera
sudo apt install -y libbluetooth-dev libcwiid-dev  # (INTERNAL) for joystick drivers

# Dependencies from Python packages
cd ${VTRSRC} && pip3 install -r requirements.txt
```
## Install CUDA (>=11.3)

We recommend installing CUDA from Debian packages following the instructions here. Be sure to perform the post-installation actions. Do not forget to put the following line in `.bashrc`:

```bash
export PATH=/usr/local/cuda-<your cuda version, e.g. 11.3>/bin${PATH:+:${PATH}}
```

You can check the CUDA driver version using `nvidia-smi` and the CUDA toolkit version using `nvcc --version`.
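For example (a minimal sanity check; the reported versions will vary with your installation):

```bash
nvidia-smi      # shows the driver version and the highest CUDA version the driver supports
nvcc --version  # shows the toolkit version; should report >= 11.3
```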
## Install Eigen (>=3.3.7)

```bash
# Debian package
sudo apt -y install libeigen3-dev

# OR from source if preferred
mkdir -p ${VTRDEPS}/eigen && cd $_
git clone https://gitlab.com/libeigen/eigen.git . && git checkout 3.3.7
mkdir build && cd $_
cmake .. && sudo make install  # default install location is /usr/local/
```
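To verify which Eigen version is visible to build tools (a quick check; both install routes above ship a pkg-config file by default):

```bash
pkg-config --modversion eigen3  # should print 3.3.7 or newer
```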
## Install PROJ (>=8.0.0)

The instructions below are mostly copied from this page, except that the source code is cloned from GitHub.

Install dependencies:

```bash
sudo apt install cmake libsqlite3-dev sqlite3 libtiff-dev libcurl4-openssl-dev
```

Download PROJ from GitHub to `${VTRDEPS}` and check out the branch of the version you want to install:

```bash
mkdir -p ${VTRDEPS}/proj && cd $_
git clone https://github.com/OSGeo/PROJ.git .
git checkout <proj-version>  # e.g. <proj-version> = 8.0.0
```

Build and install PROJ:

```bash
mkdir -p ${VTRDEPS}/proj/build && cd $_
cmake ..
sudo cmake --build . --target install  # will install to /usr/local/[lib,bin]
export LD_LIBRARY_PATH=/usr/local/lib${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}  # put this in .bashrc
```
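After installing, refreshing the linker cache and printing the usage banner is a quick way to confirm the version (the exact banner format may differ between releases):

```bash
sudo ldconfig          # refresh the dynamic linker cache so /usr/local/lib is picked up
proj 2>&1 | head -n 1  # the usage banner starts with the release, e.g. "Rel. 8.0.0"
```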
## Install OpenCV (>=4.5.0)

Install OpenCV with CUDA from source to a custom location so that it does not conflict with the OpenCV installed from Debian packages. The instructions below are copied from this page, with the install location changed to `/usr/local/opencv_cuda` to be different from the default `/usr/local`.

Install dependencies:
```bash
sudo apt-get install build-essential cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev python3-dev python3-numpy
```

Download OpenCV and OpenCV Contrib from GitHub to `${VTRDEPS}`:

```bash
cd ${VTRDEPS}
git clone https://github.com/opencv/opencv.git
git clone https://github.com/opencv/opencv_contrib.git
```

Check out the version you want to install:

```bash
cd ${VTRDEPS}/opencv && git checkout <opencv-version>          # e.g. <opencv-version> = 4.5.0
cd ${VTRDEPS}/opencv_contrib && git checkout <opencv-version>  # e.g. <opencv-version> = 4.5.0
```
Build and install OpenCV:

```bash
mkdir -p ${VTRDEPS}/opencv/build && cd $_  # create build directory
# generate Makefiles (note that install prefix is customized to: /usr/local/opencv_cuda)
cmake -D CMAKE_BUILD_TYPE=RELEASE \
      -D CMAKE_INSTALL_PREFIX=/usr/local/opencv_cuda \
      -D OPENCV_EXTRA_MODULES_PATH=${VTRDEPS}/opencv_contrib/modules \
      -D PYTHON_DEFAULT_EXECUTABLE=/usr/bin/python3.8 \
      -D BUILD_opencv_python2=OFF \
      -D BUILD_opencv_python3=ON \
      -D WITH_OPENMP=ON \
      -D WITH_CUDA=ON \
      -D OPENCV_ENABLE_NONFREE=ON \
      -D OPENCV_GENERATE_PKGCONFIG=ON \
      -D WITH_TBB=ON \
      -D WITH_GTK=ON \
      -D WITH_FFMPEG=ON \
      -D BUILD_opencv_cudacodec=OFF \
      -D BUILD_EXAMPLES=ON ..
make -j<nproc>     # <nproc> is the number of cpu cores of your computer, e.g. 12 for the Lenovo P53
sudo make install  # copy libraries to /usr/local/opencv_cuda/[lib, include]
export LD_LIBRARY_PATH=/usr/local/opencv_cuda/lib${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}  # put this in .bashrc; the path should match CMAKE_INSTALL_PREFIX
```
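To confirm that the CUDA-enabled build is the one being picked up, you can query it from Python. A minimal sketch; because the install prefix is non-standard, the bindings may need to be added to `PYTHONPATH` first, and the exact site-packages path below is an assumption that depends on your Python and OpenCV versions:

```bash
# Path under the custom prefix is an assumption; check your install tree
export PYTHONPATH=/usr/local/opencv_cuda/lib/python3.8/site-packages${PYTHONPATH:+:${PYTHONPATH}}
python3 -c "import cv2; print(cv2.__version__, cv2.cuda.getCudaEnabledDeviceCount())"
```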
## Install ROS2 Foxy/Galactic (and optionally ROS1 Noetic + ros1_bridge)

Install ROS2 Foxy/Galactic binary packages or build from source. For a Debian package installation, the `test_msgs` package needs to be installed manually:

```bash
sudo apt install -y ros-<distro:foxy/galactic>-test-msgs  # given that ROS2 Foxy/Galactic is also installed from Debian packages
```

If you are working with robots or sensors that are ROS1- but not ROS2-enabled, also install ROS1 Noetic following the instructions here (install binary packages or build from source). ros1_bridge is required to pass messages between ROS1 and ROS2; it can be built from source or installed from Debian packages:

```bash
sudo apt install -y ros-<distro:foxy/galactic>-ros1-bridge  # given that ROS2 Foxy/Galactic is also installed from Debian packages
```
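Once both distributions are installed, the bridge is typically started as follows (shown for Noetic + Galactic installed from Debian packages; the sourcing order matters):

```bash
# source ROS1 first, then ROS2, then run the dynamic bridge
source /opt/ros/noetic/setup.bash
source /opt/ros/galactic/setup.bash
ros2 run ros1_bridge dynamic_bridge
```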
```bash
## install from Debian packages
sudo apt install -y ros-<distro:foxy/galactic>-xacro  # for robot descriptions
sudo apt install -y ros-<distro:foxy/galactic>-vision-opencv  # for camera T&R
sudo apt install -y ros-<distro:foxy/galactic>-perception-pcl ros-<distro:foxy/galactic>-pcl-ros  # for lidar T&R

## OR install from source if preferred
mkdir -p ${VTRDEPS}/vtr_ros2_deps/src

# xacro (for robot descriptions)
cd ${VTRDEPS}/vtr_ros2_deps/src
git clone https://github.com/ros/xacro.git ros2_xacro
cd ros2_xacro
git checkout 2.0.3

# vision opencv (for camera T&R)
cd ${VTRDEPS}/vtr_ros2_deps/src
git clone https://github.com/ros-perception/vision_opencv.git ros2_vision_opencv
cd ros2_vision_opencv
git checkout ros2

# ros2_pcl_msgs (for LiDAR T&R)
cd ${VTRDEPS}/vtr_ros2_deps/src
git clone https://github.com/ros-perception/pcl_msgs.git ros2_pcl_msgs
cd ros2_pcl_msgs
git checkout ros2

# ros2_perception (for LiDAR T&R)
cd ${VTRDEPS}/vtr_ros2_deps/src
git clone https://github.com/ros-perception/perception_pcl.git ros2_perception_pcl
cd ros2_perception_pcl
git checkout 2.2.0

# install all
cd ${VTRDEPS}/vtr_ros2_deps
source /opt/ros/galactic/setup.bash  # source the ROS2 workspace first, e.g. for Debian package install
colcon build --symlink-install
source ${VTRDEPS}/vtr_ros2_deps/install/setup.bash  # source the overlaid workspace
```
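To confirm the overlay provides the expected packages (a quick check; the grep pattern is just a convenience):

```bash
ros2 pkg list | grep -E 'xacro|cv_bridge|pcl_conversions'  # should list the dependencies installed above
```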
For external users, we assume that the necessary sensor drivers and robot description packages have been installed already, so that sensor data can be published to a ROS2 topic and robot frames and transformations are published via the `robot_state_publisher`.

- See this page for how to use VT&R3 with your own camera/LiDAR sensors and robots.
- See the ROS2 packages in this directory for some example robot description packages.

ASRL students: please follow the instructions on this page to install drivers for our sensors.

Our sample datasets for VT&R3 are collected using the UTIAS Grizzly platform equipped with a Waymo Honeycomb LiDAR and a FLIR Bumblebee XB3 stereo camera. To use these datasets, it is necessary to install the UTIAS Grizzly description packages in this directory:
```bash
# source the ROS2 workspace
source /opt/ros/galactic/setup.bash  # this is an example command, not necessarily the one you would use

# install UTIAS Grizzly description packages
cd ${VTRSRC}/robots/ros2
colcon build --symlink-install
source ${VTRSRC}/robots/ros2/install/setup.bash
```
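A quick way to confirm the description packages are now visible (the package name pattern is an assumption; check the directory above for the actual names):

```bash
ros2 pkg list | grep -i grizzly
```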
Source the ROS2 workspace with all dependencies installed, then build and install the VT&R3 packages and the web-based GUI:

```bash
# source the ROS2 workspace with all VTR3 dependencies installed
source ${VTRDEPS}/vtr_ros2_deps/install/setup.bash  # this is an example command, not necessarily the one you would use

# build and install VTR3 packages
cd ${VTRSRC}/main
colcon build --symlink-install

# build VTR3 web-based GUI
VTRUI=${VTRSRC}/main/src/vtr_ui/vtr_ui/frontend/vtr-ui
npm --prefix ${VTRUI} install ${VTRUI}
npm --prefix ${VTRUI} run build
```
After installing VT&R3, the in-source documentation can be accessed by opening `main/install/vtr_documentation/docs/html/index.html` in a browser.
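For example, to open it from the command line (assuming a desktop environment with `xdg-open` available):

```bash
xdg-open ${VTRSRC}/main/install/vtr_documentation/docs/html/index.html
```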