
# RGB-Only Gaussian Splatting SLAM for Unbounded Outdoor Scenes

Sicheng Yu* · Chong Cheng* · Yifan Zhou · Xiaojun Yang · Hao Wang✉

The Hong Kong University of Science and Technology (Guangzhou)

(* Equal Contribution)

ICRA 2025

[Project page], [arXiv]

## Getting Started

### Installation

1. Clone OpenGS-SLAM.

```bash
git clone https://github.com/3DAgentWorld/OpenGS-SLAM.git --recursive
cd OpenGS-SLAM
```

2. Set up the environment.

```bash
conda env create -f environment.yml
conda activate opengs-slam
```

3. Compile the CUDA kernels for RoPE (as in CroCo v2 and DUSt3R).

```bash
cd croco/models/curope/
python setup.py build_ext --inplace
cd ../../../
```

Our test setup was:

- Ubuntu 20.04: `pytorch==2.1.0`, `torchvision==0.16.0`, `torchaudio==2.1.0`, `cudatoolkit=11.8`
- NVIDIA RTX A6000
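
To confirm your environment matches the versions above, you can run a quick check (a minimal sketch; it only inspects the installed PyTorch/CUDA stack and does not exercise the compiled RoPE kernels):

```python
# check_env.py -- illustrative sanity check of the PyTorch / CUDA setup
import torch
import torchvision

print("PyTorch:", torch.__version__)            # expected: 2.1.0
print("torchvision:", torchvision.__version__)  # expected: 0.16.0
print("CUDA build:", torch.version.cuda)        # expected: 11.8
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))  # e.g. NVIDIA RTX A6000
```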

## Checkpoints

You can download the `DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth` checkpoint from the DUSt3R code repository and save it to the `checkpoints` folder.

Alternatively, download it directly with the following commands:

```bash
mkdir -p checkpoints/
wget https://download.europe.naverlabs.com/ComputerVision/DUSt3R/DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth -P checkpoints/
```

Please note that you must agree to the DUSt3R license when using it.
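
To verify the download, you can load the checkpoint on the CPU first (a minimal sketch; it assumes a standard PyTorch checkpoint file and only inspects its top-level structure):

```python
# verify_checkpoint.py -- illustrative sketch: confirm the DUSt3R checkpoint loads
import os
import torch

ckpt_path = "checkpoints/DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth"
assert os.path.isfile(ckpt_path), f"checkpoint not found: {ckpt_path}"

# map_location='cpu' avoids touching the GPU just for this check
ckpt = torch.load(ckpt_path, map_location="cpu")
print("file size (MB):", round(os.path.getsize(ckpt_path) / 1e6, 1))
print("loaded object type:", type(ckpt).__name__)
if isinstance(ckpt, dict):
    print("top-level keys:", list(ckpt.keys()))
```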

## Downloading Datasets

The processed data for the 9 Waymo segments can be downloaded via Baidu or Google Drive.

## Run

```bash
## Taking segment-100613 as an example
CUDA_VISIBLE_DEVICES=0 python slam.py --config configs/mono/waymo/100613.yaml

## All 9 Waymo segments
bash run_waymo.sh
```
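
`run_waymo.sh` loops over the per-segment configs; the sketch below expresses the same idea in Python, assuming the segment configs all live under `configs/mono/waymo/` (only `100613.yaml` is confirmed above):

```python
# run_all_waymo.py -- illustrative alternative to run_waymo.sh
import glob
import os
import subprocess

# pin to GPU 0, matching the single-segment example above
env = dict(os.environ, CUDA_VISIBLE_DEVICES="0")

# assumed layout: one YAML config per Waymo segment
configs = sorted(glob.glob("configs/mono/waymo/*.yaml"))

for cfg in configs:
    print(f"=== running {cfg} ===")
    subprocess.run(["python", "slam.py", "--config", cfg], env=env, check=True)
```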

## Demo

- If you want to view the real-time interactive SLAM window, set `use_gui` under `Results` in `base_config.yaml` to `True` (see the sketch after this list).

- When running on an Ubuntu system, a GUI window will pop up.
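
If you prefer to flip the flag programmatically, the snippet below is a minimal sketch using PyYAML; it assumes `use_gui` is nested under a `Results` block, as the name `Results-use_gui` suggests, and the path should be adjusted to wherever `base_config.yaml` lives in the repository:

```python
# enable_gui.py -- illustrative sketch: enable the real-time SLAM window
import yaml  # PyYAML

path = "base_config.yaml"  # adjust to the actual location of base_config.yaml

with open(path) as f:
    cfg = yaml.safe_load(f)

# assumption: use_gui sits under a Results block ('Results-use_gui')
cfg["Results"]["use_gui"] = True

with open(path, "w") as f:
    yaml.safe_dump(cfg, f, default_flow_style=False, sort_keys=False)

print("use_gui ->", cfg["Results"]["use_gui"])
```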

## Run on Other Datasets

- Please organize your data in the expected format and modify the code in `utils/dataset.py` accordingly (an illustrative loader sketch follows this list).

- The depth map input interface is still retained in the code, although we did not use it for SLAM.
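
The interface actually expected is defined in `utils/dataset.py`; the sketch below is purely illustrative of the kind of monocular RGB loader you might adapt. The class name, directory layout, and intrinsics format are assumptions, not the repository's API:

```python
# my_dataset.py -- purely illustrative monocular RGB loader (not the repo's API)
import glob
import os

import cv2
import numpy as np


class MyMonocularDataset:
    """Hypothetical loader: sorted RGB frames plus shared pinhole intrinsics."""

    def __init__(self, root):
        # assumed layout: root/images/000000.png, 000001.png, ...
        self.image_paths = sorted(glob.glob(os.path.join(root, "images", "*.png")))
        # assumed format: intrinsics.txt holding fx fy cx cy on one line
        fx, fy, cx, cy = np.loadtxt(os.path.join(root, "intrinsics.txt"))
        self.K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        # return an RGB frame (H, W, 3, uint8) and the shared intrinsics
        bgr = cv2.imread(self.image_paths[idx], cv2.IMREAD_COLOR)
        rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)
        return rgb, self.K
```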

## Acknowledgement

- This work builds on 3DGS, MonoGS, and DUSt3R; thanks to the authors of these great works.

- For more details about the demo, please refer to MonoGS, as we use its visualization code.