Non-deep Networks
arXiv:2110.07641
Ankit Goyal, Alexey Bochkovskiy, Jia Deng, Vladlen Koltun

Overview: Depth is the hallmark of DNNs, but more depth means more sequential computation and higher latency. This raises the question: is it possible to build high-performing "non-deep" neural networks? We show that it is: for the first time, a network with a depth of just 12 achieves top-1 accuracy over 80% on ImageNet, 96% on CIFAR10, and 81% on CIFAR100. We also show that a network with a low-depth (12) backbone can achieve an AP of 48% on MS-COCO.
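Since the official code is not yet released (see below), the following is only a minimal, hypothetical sketch of the general idea of trading depth for parallel width: a classifier whose longest path is just a handful of layers, built from a shared stem, several convolutional streams run side by side, and a fusion plus linear head. The class names, stream count, and channel widths are illustrative assumptions, not the architecture from the paper.

```python
# Hypothetical sketch of a "shallow but wide" classifier.
# Names, widths, and the number of streams are illustrative
# assumptions, NOT the official architecture from the paper.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """3x3 conv -> batch norm -> ReLU (one unit of depth)."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))


class ShallowParallelNet(nn.Module):
    """Toy classifier with a short longest path: a shared stem, several
    parallel streams, and a fusion + linear head. Extra capacity comes
    from width and parallelism rather than depth."""
    def __init__(self, num_classes=10, width=64, num_streams=3):
        super().__init__()
        self.stem = ConvBlock(3, width, stride=2)
        # Each stream adds the same small amount of depth; streams run in parallel.
        self.streams = nn.ModuleList(
            nn.Sequential(ConvBlock(width, width), ConvBlock(width, width))
            for _ in range(num_streams)
        )
        self.fuse = ConvBlock(width * num_streams, width)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(width, num_classes)

    def forward(self, x):
        x = self.stem(x)
        # Concatenate the parallel streams along the channel dimension, then fuse.
        x = torch.cat([stream(x) for stream in self.streams], dim=1)
        x = self.pool(self.fuse(x)).flatten(1)
        return self.head(x)


if __name__ == "__main__":
    model = ShallowParallelNet(num_classes=10)
    out = model(torch.randn(2, 3, 32, 32))  # e.g. CIFAR-sized input
    print(out.shape)  # torch.Size([2, 10])
```

The key design point the sketch illustrates is that the parallel streams do not increase the length of the longest layer-to-layer path, so adding streams grows capacity without adding sequential computation.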

If you find our work useful, please consider citing it:

@article{goyal2021nondeep,
  title={Non-deep Networks},
  author={Goyal, Ankit and Bochkovskiy, Alexey and Deng, Jia and Koltun, Vladlen},
  journal={arXiv:2110.07641},
  year={2021}
}

Code Coming Soon!
