Enhanced Training-Free Neural Architecture Search (ETE-NAS)

MIT licensed

Installation

  1. Clone this repo.
  2. Install dependencies:
pip install -r requirements.txt
  3. Prepare the dataset:

  • Please follow the guideline here to prepare the ImageNet16 dataset.
  • Download NAS-Bench-201-v1_0-e61699.pth from here and generate the simplified API:
python3 -m lib.nas_201_api -p <path to NAS-Bench-201-v1_0-e61699.pth> -o NAS_data/NAS-Bench-201-v1_0-e61699-simple.pkl
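
To sanity-check the generated file, you can load it with Python's pickle module. This is a minimal sketch, assuming the output is an ordinary pickle; the internal layout of the simplified API object is repo-specific, so the code only confirms the file deserializes.

# Sanity check: confirm the simplified NAS-Bench-201 file deserializes.
# Run from the repo root so any repo-defined classes (e.g. under lib/)
# are importable during unpickling. The object's layout is not inspected.
import pickle

with open("NAS_data/NAS-Bench-201-v1_0-e61699-simple.pkl", "rb") as f:
    api = pickle.load(f)

print(type(api))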

Usage

1. Search

python3 prune_etenas.py --save_dir ./output/prune-nas-bench-201/cifar10 --max_nodes 4 --dataset cifar10 --data_path NAS_data/cifar.python --search_space_name nas-bench-201 --super_type basic --arch_nas_dataset NAS_data/NAS-Bench-201-v1_0-e61699-simple.pkl --track_running_stats 1 --workers 0 --precision 3 --init kaiming_norm --repeat 3 --rand_seed 0 --batch_size 72 --prune_number 1 --rankers_config configs/rankers_configs/frob.json
python3 prune_etenas.py --save_dir ./output/prune-nas-bench-201/cifar100 --max_nodes 4 --dataset cifar100 --data_path NAS_data/cifar.python --search_space_name nas-bench-201 --super_type basic --arch_nas_dataset NAS_data/NAS-Bench-201-v1_0-e61699-simple.pkl --track_running_stats 1 --workers 0 --precision 3 --init kaiming_norm --repeat 3 --rand_seed 0 --batch_size 72 --prune_number 1 --rankers_config configs/rankers_configs/frob.json
python3 prune_etenas.py --save_dir ./output/prune-nas-bench-201/ImageNet16 --max_nodes 4 --dataset ImageNet16-120 --data_path NAS_data/ImageNet16 --search_space_name nas-bench-201 --super_type basic --arch_nas_dataset NAS_data/NAS-Bench-201-v1_0-e61699-simple.pkl --track_running_stats 1 --workers 0 --precision 3 --init kaiming_norm --repeat 3 --rand_seed 0 --batch_size 72 --prune_number 1 --rankers_config configs/rankers_configs/frob.json
python3 prune_etenas.py --save_dir ./output/prune-darts/cifar10 --max_nodes 4 --dataset cifar10 --data_path NAS_data/cifar.python --search_space_name darts --super_type nasnet-super --track_running_stats 1 --workers 0 --precision 3 --init kaiming_norm --repeat 3 --rand_seed 0 --batch_size 72 --prune_number 3 --rankers_config configs/rankers_configs/frob.json
python3 prune_etenas.py --save_dir ./output/prune-darts/ImageNet16 --max_nodes 4 --dataset ImageNet16-120 --data_path NAS_data/ImageNet16 --search_space_name darts --super_type nasnet-super --track_running_stats 1 --workers 0 --precision 3 --init kaiming_norm --repeat 3 --rand_seed 0 --batch_size 72 --prune_number 3 --rankers_config configs/rankers_configs/frob.json
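
Each run scores candidate operations with the training-free ranker selected by --rankers_config (frob.json above). As a rough illustration of a Frobenius-norm-style training-free metric, the sketch below scores an untrained network by the Frobenius norm of its empirical NTK Gram matrix. This is an assumption-laden toy example, not the repository's ranker; the actual metric in frob.json may differ.

# Toy Frobenius-norm-style training-free score (illustrative only;
# NOT the repo's implementation). Scores an untrained model by
# ||NTK||_F, where NTK[i, j] = <d f(x_i)/d theta, d f(x_j)/d theta>.
import torch
import torch.nn as nn

def ntk_frobenius_score(model: nn.Module, inputs: torch.Tensor) -> float:
    params = [p for p in model.parameters() if p.requires_grad]
    grads = []
    for x in inputs:                          # per-sample parameter gradients
        out = model(x.unsqueeze(0)).sum()     # scalar output per sample
        g = torch.autograd.grad(out, params)
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    jac = torch.stack(grads)                  # (batch, n_params) Jacobian
    ntk = jac @ jac.t()                       # empirical NTK Gram matrix
    return torch.linalg.norm(ntk, ord="fro").item()

# Example: score a toy CIFAR-shaped network on random data.
net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10))
print(ntk_frobenius_score(net, torch.randn(8, 3, 32, 32)))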

2. Evaluation

  • For architectures searched on nas-bench-201, the accuracies are available immediately at the end of the search (from the console output).
  • For architectures searched on darts, please use DARTS_evaluation to train the searched architecture from scratch and evaluate it. You can use make_list_of_genotypes.py to aggregate all found genotypes for DARTS.

Acknowledgement
