This system incrementally constructs a hierarchical 3D scene graph in real time on top of a dense mapping backend, for use in high-level planning.
Since this system runs on top of the dense mapper, you should first configure the environment for panoptic_mapping.
Please first move the RGB point cloud publisher source file to ./panoptic_mapping_ros/app
in the panoptic_mapping package and modify the CMake file accordingly so that this node gets built, as sketched below.
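A minimal sketch of this step, assuming a catkin workspace at ~/catkin_ws; the source file name and paths are placeholders, adapt them to your checkout:

# Paths and the source file name below are assumptions, adjust to your setup.
cp rgb_pointclouds_publisher.cpp ~/catkin_ws/src/panoptic_mapping/panoptic_mapping_ros/app/
# Add a matching executable target for app/rgb_pointclouds_publisher.cpp in
# panoptic_mapping_ros/CMakeLists.txt, then rebuild the package:
catkin build panoptic_mapping_ros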
From here on we assume you have already installed ROS, created a ROS workspace, and successfully run the panoptic mapper.
Common Libs
pip3 install numpy scipy pandas opencv-python
(The json module is part of the Python standard library and does not need to be installed separately.)
Point Cloud Libs
Follow the official instructions to install PCL and Open3D.
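One common route on Ubuntu 20.04 (not the only one) is to take PCL from apt and Open3D from pip; defer to the official instructions if your setup differs:

sudo apt install libpcl-dev    # PCL development headers and libraries
pip3 install open3d            # Open3D Python bindings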
Deep Learning Libs
Follow the official instructions to install PyTorch and PyTorch Geometric. PyTorch versions 1.8.0/1.9.0 with CUDA 11.1 or CPU-only builds have been tested.
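For example, a CUDA 11.1 build of PyTorch 1.9.0 plus PyTorch Geometric can be installed roughly as follows (wheel index URLs follow the respective official instructions; pick the CPU variants instead if you have no GPU):

# PyTorch 1.9.0 built against CUDA 11.1
pip3 install torch==1.9.0+cu111 torchvision==0.10.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html
# PyTorch Geometric and its compiled dependencies for the same torch/CUDA combination
pip3 install torch-scatter torch-sparse -f https://data.pyg.org/whl/torch-1.9.0+cu111.html
pip3 install torch-geometric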
The environment setup has only been tested on the following system:
Ubuntu 20.04
gcc 9.3.0
python 3.8.10
CUDA 11.1
If you run into unexpected errors, feel free to open an issue.
First, download the pre-trained relationship prediction model from 3DSSG, or directly from the link. Then unzip the file and put all *.pth
files into the ./src/SSG/CVPR21
directory.
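For example (the archive name below is only a placeholder for whatever file you downloaded):

# Archive name is a placeholder; use the file you actually downloaded.
unzip pretrained_model.zip -d pretrained_model
cp pretrained_model/*.pth ./src/SSG/CVPR21/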
Then, download the room scans from here. The archive contains the RGB-D input, modified labels, room segmentation annotations, etc. There are two scans in this zip file: flat
and large_flat
.
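For example (archive name and destination directory are placeholders):

# Archive name and destination are placeholders; adjust to your download.
unzip room_scans.zip -d ~/datasets/
ls ~/datasets/    # should list the two scans: flat  large_flat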
Please change all data paths in the launch file to your local paths. Then, with the panoptic mapper running in the background, you can simply run roslaunch scene_graph run.launch
to start the system.
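A typical two-terminal session might look like the following; the panoptic mapper launch file name is an assumption, use whichever launch file you normally start the mapper with:

# Terminal 1: start the panoptic mapper (launch file name is an assumption)
roslaunch panoptic_mapping_ros run.launch
# Terminal 2: start the scene graph system
roslaunch scene_graph run.launch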