This repository contains the official PyTorch implementation of "Do Your Best and Get Enough Rest for Continual Learning", accepted at CVPR 2025 (see our paper, slides, and poster).
---
Clone this repository and install the requirements:

```bash
git clone https://github.com/hankyul2/ViewBatchModel.git
cd ViewBatchModel
pip install -r requirements.txt
```
---
Train ResNet-18 on S-CIFAR-10 using iCaRL as the baseline method with a buffer size of 200:
iCaRL

```bash
CUDA_VISIBLE_DEVICES=0 python utils/main.py --model icarl --load_best_args --dataset seq-cifar10 --buffer_size 200 --seed 1993 --savecheck 1 --ckpt_name icarl_r1_s1993
```
Ours-iCaRL

```bash
CUDA_VISIBLE_DEVICES=4 python utils/main.py --model icarl --load_best_args --dataset seq-cifar10 --buffer_size 200 --aug-repeat 4 --prog-aug 5 --seed 1993 --flag hard_aug --savecheck 1 --ckpt_name icarl_r4_hard_aug_s1993
```
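For intuition, `--aug-repeat 4` presumably sets the number of augmented views of each sample that share a mini-batch (the checkpoint names `r1`/`r4` reflect this), while `--prog-aug 5` selects the strong/progressive augmentation setting; see the repository code for the authoritative behavior. Below is a minimal sketch of the view-repeat idea with hypothetical names, not the repository's actual sampler:

```python
import torch
from torch.utils.data import Sampler

class ViewBatchSampler(Sampler):
    """Hypothetical sketch: yield each dataset index n_views times in a row,
    so a DataLoader with random transforms places several independently
    augmented views of the same sample inside one mini-batch."""

    def __init__(self, dataset_len: int, n_views: int = 4):
        self.dataset_len = dataset_len
        self.n_views = n_views

    def __iter__(self):
        for idx in torch.randperm(self.dataset_len).tolist():
            for _ in range(self.n_views):  # repeat -> multiple views per sample
                yield idx

    def __len__(self):
        return self.dataset_len * self.n_views
```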
---
Validate the trained networks using the saved checkpoints:

iCaRL

```bash
CUDA_VISIBLE_DEVICES=0 python utils/main.py --model icarl --load_best_args --dataset seq-cifar10 --buffer_size 200 --seed 1993 --loadcheck checkpoints/icarl_r1_s1993_cifar10_t0.pth --start_from 0 --stop_after 0 --inference_only 1
```
Ours-iCaRL

```bash
CUDA_VISIBLE_DEVICES=4 python utils/main.py --model icarl --load_best_args --dataset seq-cifar10 --buffer_size 200 --seed 1997 --loadcheck checkpoints/icarl_r4_hard_aug_s1997_cifar10_t0.pth --start_from 0 --stop_after 0 --inference_only 1
```
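If you want to peek inside a saved checkpoint outside of `utils/main.py`, a plain `torch.load` works; the key layout is an assumption, so inspect it before relying on any particular entry:

```python
import torch

# Load a checkpoint on CPU and list its top-level keys (the layout is an
# assumption; adjust to whatever utils/main.py actually saves).
state = torch.load('checkpoints/icarl_r1_s1993_cifar10_t0.pth', map_location='cpu')
if isinstance(state, dict):
    print(list(state.keys())[:10])
```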
---
See `scripts/icarl` for more commands to reproduce Table 6 in the paper. Also, check `datasets/utils/continual_dataset.py#L24` for view-batch replay and `models/icarl.py#L78` for view-batch SSL.
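One way to picture view-batch SSL: once a mini-batch contains several views of each sample, a self-supervised term can tie those views together. The function below is a schematic cross-view consistency loss under that reading, not the paper's actual objective; `models/icarl.py#L78` is the authoritative implementation:

```python
import torch
import torch.nn.functional as F

def view_consistency_loss(logits: torch.Tensor, n_views: int) -> torch.Tensor:
    """Schematic cross-view SSL term: pull the predictions of the n_views
    augmented views of each sample toward their consensus. Assumes views of
    the same sample are adjacent in the batch. Illustrative only."""
    b = logits.size(0) // n_views                      # number of distinct samples
    p = F.softmax(logits, dim=1).view(b, n_views, -1)  # (samples, views, classes)
    mean_p = p.mean(dim=1, keepdim=True)               # consensus over views
    # KL(view || consensus), averaged over views and samples
    return (p * (p.clamp_min(1e-8).log() - mean_p.clamp_min(1e-8).log())).sum(-1).mean()
```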
After the paper was accepted, we reran everything to provide complete logs and checkpoints for Table 6 in the paper. Our exact environment was:
```
torch==1.12.1+cu113
torchvision==0.13.1+cu113
timm==1.0.7
numpy==1.24.4
```
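For reference, these pins can be installed with pip; the `cu113` wheels come from the usual PyTorch extra index (adjust to your CUDA setup):

```bash
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 \
    --extra-index-url https://download.pytorch.org/whl/cu113
pip install timm==1.0.7 numpy==1.24.4
```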
The table below reproduces Table 6 of our paper, which contains the main ablation study for the proposed method. AVG is the mean of CIL and TIL accuracy, and ∆ is the gain in AVG over the iCaRL baseline.
Method | View-batch Replay | Strong Augmentation | View-batch SSL | Forgetting (⬇️) | CIL (⬆️) | TIL (⬆️) | AVG | ∆ |
---|---|---|---|---|---|---|---|---|
iCaRL | ❌ | ❌ | ❌ | 28.05±4.21 | 63.58±2.64 | 90.32±3.19 | 76.95 | - |
iCaRL | ❌ | ✅ | ❌ | 22.16±0.91 | 65.33±1.05 | 89.33±0.58 | 77.33 | +0.38 |
iCaRL | ✅ | ❌ | ❌ | 18.72±1.76 | 67.21±0.42 | 91.63±0.98 | 79.42 | +2.47 |
iCaRL | ✅ | ✅ | ❌ | 18.29±0.91 | 67.16±0.75 | 91.02±0.97 | 79.09 | +2.14 |
iCaRL | ✅ | ✅ | ✅ | 13.81±1.58 | 69.25±0.41 | 92.73±0.57 | 80.99 | +4.04 |
The WandB project linked below provides the complete logs produced while training the models in the table above. It includes:
- command line
- metrics
- console outputs
- environments
WandB project link: https://wandb.ai/gregor99/view_batch_model
The tables below provide the checkpoints saved at the end of each task while training the models in the table above.
seed=1993
Method | View-batch Replay | Strong Augmentation | View-batch SSL | Task 1 | Task 2 | Task 3 | Task 4 | Task 5 |
---|---|---|---|---|---|---|---|---|
iCaRL | ❌ | ❌ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ❌ | ✅ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ❌ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ✅ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ✅ | ✅ | ckpt | ckpt | ckpt | ckpt | ckpt |
seed=1996
Method | View-batch Replay | Strong Augmentation | View-batch SSL | Task 1 | Task 2 | Task 3 | Task 4 | Task 5 |
---|---|---|---|---|---|---|---|---|
iCaRL | ❌ | ❌ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ❌ | ✅ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ❌ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ✅ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ✅ | ✅ | ckpt | ckpt | ckpt | ckpt | ckpt |
seed=1997
Method | View-batch Replay | Strong Augmentation | View-batch SSL | Task 1 | Task 2 | Task 3 | Task 4 | Task 5 |
---|---|---|---|---|---|---|---|---|
iCaRL | ❌ | ❌ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ❌ | ✅ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ❌ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ✅ | ❌ | ckpt | ckpt | ckpt | ckpt | ckpt |
iCaRL | ✅ | ✅ | ✅ | ckpt | ckpt | ckpt | ckpt | ckpt |
This project is heavily based on Mammoth. We sincerely thank its authors for sharing such a great library as an open-source project.