
Vosh

This repository contains a PyTorch implementation of the paper: Voxel-Mesh Hybrid Representation for Real-Time View Synthesis by Meshing Density Field (TVCG 2024).

Install

git clone https://github.com/zachzhang07/vosh.git
cd vosh

Install with pip

conda create -n vosh python==3.8.13
conda activate vosh

pip install torch==1.10.1+cu111 torchvision==0.11.2+cu111 torchaudio==0.10.1 -f https://download.pytorch.org/whl/cu111/torch_stable.html

pip install -r requirements.txt

# nvdiffrast
pip install git+https://github.com/NVlabs/nvdiffrast/
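
To quickly verify the environment, you can check that PyTorch sees the GPU and that nvdiffrast imports cleanly (a minimal sanity check, not part of the official instructions):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python -c "import nvdiffrast.torch as dr; print('nvdiffrast OK')"

Both commands should succeed before training; the second only checks the Python bindings, since the rendering context is created lazily at first use.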

Tested environments

  • Ubuntu 20.04 with torch 1.10.1 & CUDA 11.1 on RTX 4090 and RTX 3090.

Usage

We mainly support COLMAP-format datasets such as Mip-NeRF 360. Please download the data and place it under ../data/.
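
For reference, a scene folder is expected to roughly follow the official Mip-NeRF 360 release layout (a posed COLMAP reconstruction plus images); the sketch below is an assumption based on that convention, not an exhaustive listing:

../data/360_v2/
  bicycle/
    images/      # full-resolution RGB images
    images_4/    # downsampled copies shipped with the official release
    sparse/0/    # COLMAP model (cameras.bin, images.bin, points3D.bin)
  garden/
  ...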

For custom datasets:

# prepare your video or images under ../data/custom, and run COLMAP (assumed to be installed):
python scripts/colmap2nerf.py --video ../data/custom/video.mp4 --run_colmap # if using a video
python scripts/colmap2nerf.py --images ../data/custom/images/ --run_colmap # if using images
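
If preprocessing succeeds, the script should leave the camera poses next to the images; colmap2nerf.py in the torch-ngp/nerf2mesh family typically writes a transforms.json, so treat the exact filenames below as an assumption based on that lineage:

ls ../data/custom
# expected (approximately): images/  transforms.json  colmap_sparse/  colmap.db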

Basics

The first run will take some time to compile the CUDA extensions.

## train and eval
# mip-nerf 360
python main_vol.py ../data/360_v2/bicycle/ --workspace ../output/bicycle --contract
python main_mesh.py ../data/360_v2/bicycle/ --vol_path ../output/bicycle \
  --workspace ../output/bicycle_mesh
python main_vosh.py ../data/360_v2/bicycle/ --vol_path ../output/bicycle_mesh \
  --workspace ../output/bicycle_base --lambda_mesh_weight 0.001 --mesh_select 0.9 \
  --keep_center 0.25 --lambda_bg_weight 0.01
python main_vosh.py ../data/360_v2/bicycle/ --vol_path ../output/bicycle_mesh \
  --workspace ../output/bicycle_light --lambda_mesh_weight 0.01 --mesh_select 1.0 \
  --keep_center 0.25 --lambda_bg_weight 0.01 --use_mesh_occ_grid --mesh_check_ratio 8
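
The same three-stage pipeline (volume training, mesh extraction, hybrid Vosh training) should carry over to a custom COLMAP scene; the sketch below simply reuses the commands above with hypothetical paths, and the hyper-parameters will likely need per-scene tuning:

# custom scene (hypothetical paths; hyper-parameters may need tuning)
python main_vol.py ../data/custom/ --workspace ../output/custom --contract
python main_mesh.py ../data/custom/ --vol_path ../output/custom \
  --workspace ../output/custom_mesh
python main_vosh.py ../data/custom/ --vol_path ../output/custom_mesh \
  --workspace ../output/custom_base --lambda_mesh_weight 0.001 --mesh_select 0.9 \
  --keep_center 0.25 --lambda_bg_weight 0.01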

To evaluate Vosh on the 7 scenes of the Mip-NeRF 360 dataset, simply run:

python full_eval_360.py ../data/360_v2/ --workspace ../output/

Please check full_eval_360.py for the hyper-parameters used for different kinds of scenes, and main_*.py for all available options.
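
Assuming the entry scripts expose a standard argparse interface, the full option list for any stage can be printed with, for example:

python main_vosh.py --help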

Acknowledgement

This codebase borrows heavily from torch-merf and nerf2mesh. Many thanks to Jiaxiang.

Citation

@ARTICLE{10759307,
  author={Zhang, Chenhao and Zhou, Yongyang and Zhang, Lei},
  journal={IEEE Transactions on Visualization and Computer Graphics}, 
  title={Voxel-Mesh Hybrid Representation for Real-Time View Synthesis by Meshing Density Field}, 
  year={2024},
  volume={},
  number={},
  pages={1-13},
  doi={10.1109/TVCG.2024.3502672}}
