
PTB-TIR: A Thermal Infrared Pedestrian Tracking Benchmark (TMM19)

This toolkit is used to evaluate trackers on the thermal infrared pedestrian tracking benchmark PTB-TIR. [Paper] [Project]

News

  • [2019-11-4] We corrected some annotation mistakes, so the results differ slightly from those reported in the paper.
  • [2019-11-4] We evaluated more trackers on the benchmark and provide their results in the raw results.

Download dataset and raw results

Usage

  1. Download this toolkit and unzip it on your computer.
  2. Download and unzip the raw results, then put them into the results folder of the toolkit.
  3. Download and unzip the dataset, then put it into the toolkit folder.
  4. Run run_evaluation.m and run_speed.m to draw the result plots.
  5. Configure configTrackers.m, then use run_tracker_interface.m to run your own tracker on the benchmark (see the sketch after this list).
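
For step 5, the snippet below is a minimal sketch of a tracker entry in configTrackers.m. It assumes the OTB-style descriptor format (a cell array of structs with name and namePaper fields), and MyTracker is a hypothetical tracker name; consult the shipped configTrackers.m for the exact fields that run_tracker_interface.m expects.

% Minimal sketch of configTrackers.m, assuming OTB-style descriptors;
% the field names here are assumptions, not the definitive interface.
function trackers = configTrackers()
    % 'name'      : function/folder name of the tracker implementation
    % 'namePaper' : label shown in the result plots
    trackers = {struct('name', 'KCF',       'namePaper', 'KCF'), ...
                struct('name', 'MyTracker', 'namePaper', 'MyTracker')};  % hypothetical entry
end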

Result plots

(Figure: result plots of the evaluated trackers on PTB-TIR.)

Trackers and codes

TIR trackers

  • CMD-DiMP. Sun J, et al., Unsupervised cross-modal distillation for thermal infrared tracking, ACM MM, 2021. [Github]
  • MMNet. Liu Q, et al., Multi-task driven feature model for thermal infrared tracking, AAAI, 2020. [Github]
  • ECO-stir. Zhang L, et al., Synthetic data generation for end-to-end thermal infrared tracking, TIP, 2019. [Github]
  • MLSSNet. Liu Q, et al., Learning deep multi-level similarity for thermal infrared object tracking, TMM, 2020. [Github]
  • HSSNet. Li X, et al., Hierarchical spatial-aware Siamese network for thermal infrared object tracking, KBS, 2019. [Github]
  • MCFTS. Liu Q, et al., Deep convolutional neural networks for thermal infrared object tracking, KBS, 2017. [Github]

RGB trackers

  • ECO. Danelljan M, et al., ECO: Efficient convolution operators for tracking, CVPR, 2017. [Github]
  • DeepSTRCF. Li F, et al., Learning spatial-temporal regularized correlation filters for visual tracking, CVPR, 2018. [Github]
  • MDNet. Nam H, et al., Learning multi-domain convolutional neural networks for visual tracking, CVPR, 2016. [Github]
  • SRDCF. Danelljan M, et al., Learning spatially regularized correlation filters for visual tracking, ICCV, 2015. [Project]
  • VITAL. Song Y, et al., VITAL: Visual tracking via adversarial learning, CVPR, 2018. [Github]
  • TADT. Li X, et al., Target-aware deep tracking, CVPR, 2019. [Github]
  • MCCT. Wang N, et al., Multi-cue correlation filters for robust visual tracking, CVPR, 2018. [Github]
  • Staple. Bertinetto L, et al., Staple: Complementary learners for real-time tracking, CVPR, 2016. [Github]
  • DSST. Danelljan M, et al., Accurate scale estimation for robust visual tracking, BMVC, 2014. [Github]
  • UDT. Wang N, et al., Unsupervised deep tracking, CVPR, 2019. [Github]
  • CREST. Song Y, et al., CREST: Convolutional residual learning for visual tracking, ICCV, 2017. [Github]
  • SiamFC. Bertinetto L, et al., Fully-convolutional Siamese networks for object tracking, ECCVW, 2016. [Github]
  • SiamFC-tri. Dong X, et al., Triplet loss in Siamese network for object tracking, ECCV, 2018. [Github]
  • HDT. Qi Y, et al., Hedged deep tracking, CVPR, 2016. [Project]
  • CFNet. Valmadre J, et al., End-to-end representation learning for correlation filter based tracking, CVPR, 2017. [Github]
  • HCF. Ma C, et al., Hierarchical convolutional features for visual tracking, ICCV, 2015. [Github]
  • L1APG. Bao C, et al., Real time robust L1 tracker using accelerated proximal gradient approach, CVPR, 2012. [Project]
  • SVM. Wang N, et al., Understanding and diagnosing visual tracking systems, ICCV, 2015. [Project]
  • KCF. Henriques J, et al., High-speed tracking with kernelized correlation filters, TPAMI, 2015. [Project]
  • DSiam. Guo Q, et al., Learning dynamic Siamese network for visual object tracking, ICCV, 2017. [Github]

Citation

If you use this benchmark, please consider citing our paper:

@article{PTB-TIR,
  title={PTB-TIR: A Thermal Infrared Pedestrian Tracking Benchmark},
  author={Liu, Qiao and He, Zhenyu and Li, Xin and Zheng, Yuan},
  journal={IEEE Transactions on Multimedia},
  year={2019},
  doi={10.1109/TMM.2019.2932615}
}

Contact

Feedback and comments are welcome! Feel free to contact us via [email protected] or [email protected].