nv-tlabs/3dgrut

This repository provides the official implementations of 3D Gaussian Ray Tracing (3DGRT) and 3D Gaussian Unscented Transform (3DGUT). Unlike traditional methods that rely on splatting, 3DGRT instead ray traces volumetric Gaussian particles. This enables support for distorted cameras with complex, time-dependent effects such as rolling shutter, and efficiently simulates the secondary rays required for rendering phenomena like reflection, refraction, and shadows. However, 3DGRT requires dedicated ray-tracing hardware and remains slower than 3DGS.

To mitigate this limitation, we also propose 3DGUT, which supports distorted cameras with complex, time-dependent effects within a rasterization framework, preserving its efficiency. By aligning the rendering formulations of 3DGRT and 3DGUT, we introduce a hybrid approach called 3DGRUT, which renders primary rays via rasterization and secondary rays via ray tracing, combining the strengths of both methods for improved performance and flexibility.

3D Gaussian Ray Tracing: Fast Tracing of Particle Scenes
Nicolas Moenne-Loccoz*, Ashkan Mirzaei*, Or Perel, Riccardo De Lutio, Janick Martinez Esturo,
Gavriel State, Sanja Fidler, Nicholas Sharp^, Zan Gojcic^ (*, ^ indicate equal contribution)
SIGGRAPH Asia 2024 (Journal Track)
Project page / Paper / Video / BibTeX

3DGUT: Enabling Distorted Cameras and Secondary Rays in Gaussian Splatting
Qi Wu*, Janick Martinez Esturo*, Ashkan Mirzaei,
Nicolas Moenne-Loccoz, Zan Gojcic (* indicates equal contribution)
CVPR 2025
Project page / Paper / Video / BibTeX

🔥 News

  • ✅ [2025/04] Stable release v1.0.0 tagged.
  • ✅ [2025/03] Initial code release!
  • ✅ [2025/02] 3DGUT was accepted to CVPR 2025!
  • ✅ [2024/08] 3DGRT was accepted to SIGGRAPH Asia 2024!


🔧 1. Dependencies and Installation

  • CUDA 11.8+ compatible system (see the quick check below this list)
  • For good performance with 3DGRT, we recommend using an NVIDIA GPU with Ray Tracing (RT) cores.
  • Currently, only Linux environments are supported by the included install script (Windows support coming soon!)
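
You can quickly confirm the first two requirements from a terminal (a generic sanity check, not part of the install script):

# CUDA toolkit version (needs 11.8+)
nvcc --version
# Driver version and GPU model; RT cores ship with NVIDIA RTX 20-series (Turing) GPUs and newer
nvidia-smi
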
NOTE: gcc versions > 11

Currently the codebase requires gcc <= 11. If your machine defaults to gcc-12 or newer (e.g., on Ubuntu 24.04), you may need to install and use gcc-11.

First, install gcc 11:

sudo apt-get install gcc-11 g++-11

Then run the install script with the optional WITH_GCC11 flag, which additionally configures the conda environment to use gcc-11:

./install_env.sh 3dgrut WITH_GCC11
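
Alternatively, if you manage compilers yourself, pointing the build at gcc-11 through the standard CC/CXX environment variables (a generic approach, not a feature of the install script) may also work:

# Select gcc-11 for subsequent builds via standard build variables
export CC=/usr/bin/gcc-11
export CXX=/usr/bin/g++-11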

To set up the environment using conda, first clone the repository and run the ./install_env.sh script:

git clone --recursive https://github.com/nv-tlabs/3dgrut.git
cd 3dgrut

# You can install each component step by step following install_env.sh
chmod +x install_env.sh
./install_env.sh 3dgrut
conda activate 3dgrut
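
Once the environment is active, a quick sanity check (assuming the install script set up a CUDA-enabled PyTorch, as the CUDA 11.8+ requirement implies):

# Should print the torch version and True if CUDA is usable
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"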

Running with Docker

Build the docker image:

git clone --recursive https://github.com/nv-tlabs/3dgrut.git
cd 3dgrut
docker build . -t 3dgrut

Run it:

xhost +local:root
docker run --rm -it --gpus=all --net=host --ipc=host -v $PWD:/workspace --runtime=nvidia -e DISPLAY 3dgrut

Note

Remember to set the DISPLAY environment variable if you are running on a remote server from the command line.
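
For example, when connected to a remote machine that runs an X server (a typical setup; adjust to your environment):

# e.g., use the remote machine's primary X display
export DISPLAY=:0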

💻 2. Train 3DGRT or 3DGUT scenes

We provide different configurations for training 3DGRT and 3DGUT models on common benchmark datasets. For example, you can download the NeRF Synthetic, MipNeRF360, or ScanNet++ datasets and then run one of the following commands:

# Train Lego with 3DGRT & 3DGUT
python train.py --config-name apps/nerf_synthetic_3dgrt.yaml path=data/nerf_synthetic/lego out_dir=runs experiment_name=lego_3dgrt
python train.py --config-name apps/nerf_synthetic_3dgut.yaml path=data/nerf_synthetic/lego out_dir=runs experiment_name=lego_3dgut

# Train Bonsai
python train.py --config-name apps/colmap_3dgrt.yaml path=data/mipnerf360/bonsai out_dir=runs experiment_name=bonsai_3dgrt dataset.downsample_factor=2 
python train.py --config-name apps/colmap_3dgut.yaml path=data/mipnerf360/bonsai out_dir=runs experiment_name=bonsai_3dgut dataset.downsample_factor=2 

# Train ScanNet++
python train.py --config-name apps/scannetpp_3dgut.yaml path=data/scannetpp/0a5c013435/dslr out_dir=runs experiment_name=0a5c013435_3dgut
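
To sweep a whole benchmark, a plain shell loop following the same pattern works; the sketch below (a convenience illustration, not a repository script) trains all eight NeRF Synthetic scenes with 3DGUT, assuming the standard lowercase NeRF Synthetic directory names:

# Train every NeRF Synthetic scene sequentially with 3DGUT
for scene in chair drums ficus hotdog lego materials mic ship; do
    python train.py --config-name apps/nerf_synthetic_3dgut.yaml \
        path=data/nerf_synthetic/${scene} out_dir=runs experiment_name=${scene}_3dgut
done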

Note

For ScanNet++, we expect the dataset to be preprocessed following FisheyeGS's method.

Note

If you're running from the PyCharm IDE, enable rich console through: Run Configuration > Modify Options > Emulate terminal in output console

🎥 3. Rendering from Checkpoints

Evaluate Checkpoint with Splatting / OptiX Tracer / Torch

python render.py --checkpoint runs/lego/ckpt_last.pt --out-dir outputs/eval

To visualize training progress interactively:

python train.py --config-name apps/nerf_synthetic_3dgut.yaml path=data/nerf_synthetic/lego with_gui=True 

To visualize a pre-trained checkpoint:

python train.py --config-name apps/nerf_synthetic_3dgut.yaml path=data/nerf_synthetic/lego with_gui=True test_last=False export_ingp.enabled=False resume=runs/lego/ckpt_last.pt 

Note

Remember to set the DISPLAY environment variable if you are running on a remote server from the command line.

On startup, you might see a black screen, but you can use the GUI to navigate to the correct camera views.

📋 4. Evaluations

We provide scripts to reproduce the results reported in our publications.

# Training
bash ./benchmark/mipnerf360_3dgut.sh <config-yaml>
# Rendering
bash ./benchmark/mipnerf360_3dgut_render.sh <results-folder>
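
For example, pairing the scripts above with the 3DGUT MipNeRF360 config used in the benchmarks below (and assuming results land in results/mipnerf360, matching the later rendering commands):

bash ./benchmark/mipnerf360_3dgut.sh paper/3dgut/unsorted_colmap.yaml
bash ./benchmark/mipnerf360_3dgut_render.sh results/mipnerf360
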
3DGRT Results Produced on RTX 5090

NeRF Synthetic Dataset

bash ./benchmark/nerf_synthetic.sh apps/nerf_synthetic_3dgrt.yaml
bash ./benchmark/nerf_synthetic_render.sh results/nerf_synthetic

Scene       PSNR    SSIM    Train (s)   FPS
Chair       35.85   0.988   556.4       299
Drums       25.87   0.953   462.8       389
Ficus       36.57   0.989   331.0       465
Hotdog      37.88   0.986   597.0       270
Lego        36.70   0.985   469.8       360
Materials   30.42   0.962   463.3       347
Mic         35.90   0.992   443.4       291
Ship        31.73   0.909   510.7       360
Average     33.87   0.971   479.3       347

MipNeRF360 Dataset

bash ./benchmark/mipnerf360.sh apps/colmap_3dgrt.yaml
bash ./benchmark/mipnerf360_render.sh results/mipnerf360

Scene       PSNR    SSIM    Train (s)   FPS
Bicycle     24.85   0.748   2335        66
Bonsai      31.95   0.942   3383        72
Counter     28.47   0.905   3247        62
Flowers     21.42   0.615   2090        86
Garden      26.97   0.852   2253        70
Kitchen     30.13   0.921   4837        39
Room        30.35   0.911   2734        73
Stump       26.37   0.770   1995        73
Treehill    22.08   0.622   2413        68
Average     27.22   0.817   2869        68

3DGUT Results Produced on RTX 5090

NeRF Synthetic Dataset

bash ./benchmark/nerf_synthetic.sh paper/3dgut/unsorted_nerf_synthetic.yaml
bash ./benchmark/nerf_synthetic_render.sh results/nerf_synthetic

Scene       PSNR    SSIM    Train (s)   FPS
Chair       35.61   0.988   265.6       599
Drums       25.99   0.953   254.1       694
Ficus       36.43   0.988   183.5       1053
Hotdog      38.11   0.986   184.8       952
Lego        36.47   0.984   221.7       826
Materials   30.39   0.960   194.3       1000
Mic         36.32   0.992   204.7       775
Ship        31.72   0.908   208.5       870
Average     33.88   0.970   214.6       846

MipNeRF360 Dataset

bash ./benchmark/mipnerf360.sh paper/3dgut/unsorted_colmap.yaml
bash ./benchmark/mipnerf360_render.sh results/mipnerf360

Scene       PSNR    SSIM    Train (s)   FPS
Bicycle     25.01   0.759   949.8       275
Bonsai      32.46   0.945   485.3       362
Counter     29.14   0.911   484.5       380
Flowers     21.45   0.612   782.0       253
Garden      27.18   0.856   810.2       316
Kitchen     31.16   0.928   664.8       275
Room        31.63   0.920   448.8       370
Stump       26.50   0.773   742.6       319
Treehill    22.35   0.627   809.6       299
Average     27.43   0.815   686.4       317

ScanNet++ Dataset

bash ./benchmark/scannetpp.sh paper/3dgut/unsorted_scannetpp.yaml
bash ./benchmark/scannetpp_render.sh results/scannetpp

Note

We followed FisheyeGS's convention to prepare the dataset for fair comparisons.

Scene        PSNR    SSIM    Train (s)   FPS
0a5c013435   29.67   0.930   292.3       389
8d563fc2cc   26.88   0.912   286.1       439
bb87c292ad   31.58   0.941   316.9       448
d415cc449b   28.12   0.871   394.6       483
e8ea9b4da8   33.47   0.954   280.8       394
fe1733741f   25.60   0.858   355.8       450
Average      29.22   0.911   321.1       434

🛝 5. Interactive Playground GUI

The playground allows interactive exploration of pretrained scenes, with ray-traced effects such as inserted objects, reflections, refractions, depth of field, and more.

Run the playground UI to visualize a pretrained scene with:

python playground.py --gs_object <ckpt_path>
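
For instance, to load the checkpoint rendered in Section 3 (the path assumes that earlier run):

python playground.py --gs_object runs/lego/ckpt_last.pt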

See Playground README for details.

🎓 6. Citations

@article{loccoz20243dgrt,
    author = {Nicolas Moenne-Loccoz and Ashkan Mirzaei and Or Perel and Riccardo de Lutio and Janick Martinez Esturo and Gavriel State and Sanja Fidler and Nicholas Sharp and Zan Gojcic},
    title = {3D Gaussian Ray Tracing: Fast Tracing of Particle Scenes},
    journal = {ACM Transactions on Graphics and SIGGRAPH Asia},
    year = {2024},
}
@article{wu20253dgut,
    title={3DGUT: Enabling Distorted Cameras and Secondary Rays in Gaussian Splatting},
    author={Wu, Qi and Martinez Esturo, Janick and Mirzaei, Ashkan and Moenne-Loccoz, Nicolas and Gojcic, Zan},
    journal = {Conference on Computer Vision and Pattern Recognition (CVPR)},
    year={2025}
}

🙏 7. Acknowledgements

We sincerely thank our colleagues for their valuable contributions to this project.

Thanks to Hassan Abu Alhaija, Ronnie Sharif, Beau Perschall, and Lars Fabiunke for assistance with assets; Greg Muthler, Magnus Andersson, Maksim Eisenstein, Tanki Zhang, Nathan Morrical, Dietger van Antwerpen, and John Burgess for performance feedback; Thomas Müller, Merlin Nimier-David, and Carsten Kolve for inspiration and pointers; Ziyu Chen, Clement Fuji-Tsang, Masha Shugrina, and George Kopanas for technical and experiment assistance; and Ramana Kiran and Shailesh Mishra for typo fixes.