This is a PyTorch implementation of the CVPR 2023 paper *ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data*.
- Code for VOS is coming soon ...
- [2023/06/08] Released the train set (v1).
- [2023/05/09] Released the test sets (v1).
- [2023/04/20] Released the code for VOT.
- VOT_test_set: [BaiduNetdisk], VOS_test_set: [BaiduNetdisk]
- All: [BaiduNetdisk], [OneDrive(Available before 2023.7.15)]
```shell
# 1. Clone this repo
git clone https://github.com/lawrence-cj/ARKitTrack.git
cd ARKitTrack

# 2. Create the conda env
conda env create -f art_env.yml
conda activate art

# 3. Install mmcv-full, mmdet, and mmdet3d for BEV pooling (from bevfusion)
pip install openmim
mim install mmcv-full==1.4.0
mim install mmdet==2.20.0
python setup.py develop  # mmdet3d
```
Run the following command to set paths for this project:

```shell
python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir ./output
```

After running this command, you can also modify the paths by editing these two files: `lib/train/admin/local.py` and `lib/test/evaluation/local.py`.
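For reference, the generated local files are plain Python settings you can edit by hand. The sketch below is only illustrative: the class and attribute names (`EnvironmentSettings`, `results_path`, the per-dataset `*_dir` fields) are assumptions based on similar tracking codebases, so check the files the script actually generates.

```python
# Hypothetical sketch of lib/test/evaluation/local.py after generation.
# Attribute names here are assumptions, not the repo's guaranteed API.
class EnvironmentSettings:
    def __init__(self):
        self.results_path = './output/test/tracking_results'  # where raw results are written
        self.depthtrack_dir = './data/depthtrack'             # DepthTrack test set
        self.cdtb_dir = './data/cdtb'                         # CDTB test set
        self.arkittrack_dir = './data/arkittrack'             # ARKitTrack test set


def local_env_settings():
    return EnvironmentSettings()
```

Pointing these fields at your own data locations is all the path setup amounts to.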
Download our trained models from Google Drive and uncompress them to `output/checkpoints/`.

Change the corresponding dataset paths in `lib/test/evaluation/local.py`.
Run the following command to test on different datasets:

```shell
python tracking/test.py --tracker art --param vitb_384_mae_ce_32x4_ep300 --dataset depthtrack --threads 2 --num_gpus 2
```

- `--config vitb_384_mae_ce_32x4_ep300` is used for cdtb and depthtrack.
- `--config vitb_384_mae_ce_32x4_ep300_art` is used for arkittrack.
- `--debug 1` enables visualization.
- `--dataset` options: [depthtrack, cdtb, arkit].
The raw results are stored in Google Drive.
Download the pre-trained weights from Google Drive and uncompress them to `pretrained_models/`.

Change the corresponding dataset paths in `lib/train/admin/local.py`.
Run the following command to train for VOT:

```shell
python tracking/train.py --script art --config vitb_384_mae_ce_32x4_ep300 --save_dir ./output --mode multiple --nproc_per_node 2
```

- `--config vitb_384_mae_ce_32x4_ep300`: train with depthtrack; test on cdtb and depthtrack.
- `--config vitb_384_mae_ce_32x4_ep300_art`: train with arkittrack; test on arkittrack.
- You can modify the config yaml files for your own datasets.
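When adapting a config yaml to your own dataset, the fields to touch are the training dataset entries. The snippet below mimics that edit on a plain Python dict; the `DATA.TRAIN.DATASETS_NAME` / `DATASETS_RATIO` key names follow OSTrack-style configs (which this project builds on) and are an assumption, so verify them against the actual yaml files.

```python
# Illustrative only: an OSTrack-style training-data section modeled as a
# Python dict. Key names are assumptions; check the project's actual yaml.
config = {
    "DATA": {
        "TRAIN": {
            "DATASETS_NAME": ["DepthTrack"],  # datasets sampled during training
            "DATASETS_RATIO": [1],            # relative sampling weight per dataset
        }
    }
}

# Swapping in your own dataset means editing the same two lists in the yaml.
config["DATA"]["TRAIN"]["DATASETS_NAME"] = ["MyRGBDSet"]  # hypothetical dataset name
config["DATA"]["TRAIN"]["DATASETS_RATIO"] = [1]
```

The dataset name you use must also be registered with the project's dataset loaders; the yaml edit alone only tells the sampler what to look for.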
Thanks to the OSTrack and BEVFusion projects, which helped us quickly implement our ideas.
```bibtex
@InProceedings{Zhao_2023_CVPR,
    author    = {Zhao, Haojie and Chen, Junsong and Wang, Lijun and Lu, Huchuan},
    title     = {ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {5126-5135}
}
```
This project is under the MIT license. See LICENSE for details.