- [30/8/2024] Radar Fields has been accepted as a spotlight at the ECCV 2024 workshop on Neural Fields Beyond Conventional Cameras
- [1/8/2024] We presented Radar Fields at SIGGRAPH 2024
```bash
# Set up the conda environment & install dependencies
conda env create -f environment.yml
conda activate radarfields

# Install tiny-cuda-nn
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch

# Install Radar Fields
pip install -e .

# Verify the install (should print "1.0.0")
python -c "import radarfields; print(radarfields.__version__)"
```
```bash
# Run a pre-trained demo model
python demo.py --config configs/radarfields.ini --demo --demo_name [DEMO_NAME]

# Run a training job (pass the preprocessing .json for your sequence)
python main.py --config configs/radarfields.ini --name [NAME] --seq [SEQUENCE_NAME] --preprocess_file [PATH_TO_PREPROCESS_JSON]

# List all command-line options
python main.py --help
```
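For instance, a filled-in training invocation might look like the following (a sketch only: the experiment name, sequence name, and preprocess path are hypothetical placeholders, not files shipped with the repo):

```bash
# Hypothetical example; substitute your own sequence and preprocess file
python main.py --config configs/radarfields.ini \
    --name my_experiment \
    --seq scene_01 \
    --preprocess_file preprocess/scene_01.json
```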
Several pre-trained models are available for download. These can be run through `demo.py` without downloading any datasets.
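For example, after downloading a pre-trained model, pass its name to the demo script (a sketch; `example_scene` below is a hypothetical placeholder for one of the downloaded model names):

```bash
# Hypothetical example: "example_scene" stands in for a downloaded model
python demo.py --config configs/radarfields.ini --demo --demo_name example_scene
```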
If you use Radar Fields in your research, please cite our paper:

```bibtex
@inproceedings{radarfields,
  author = {Borts, David and Liang, Erich and Broedermann, Tim and Ramazzina, Andrea and Walz, Stefanie and Palladin, Edoardo and Sun, Jipeng and Brueggemann, David and Sakaridis, Christos and Van Gool, Luc and Bijelic, Mario and Heide, Felix},
  title = {Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar},
  year = {2024},
  isbn = {9798400705250},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3641519.3657510},
  doi = {10.1145/3641519.3657510},
  booktitle = {ACM SIGGRAPH 2024 Conference Papers},
  articleno = {130},
  numpages = {10},
  keywords = {neural rendering, radar},
  location = {Denver, CO, USA},
  series = {SIGGRAPH '24}
}
```
The general structure and layout of this codebase were inspired by LiDAR-NeRF and torch-ngp. We also rely on tiny-cuda-nn for our networks and encodings, and on Nerfstudio for pose optimization.