# FOCIL: Finetune-and-Freeze for Online Class Incremental Learning by Training Randomly Pruned Sparse Experts

FOCIL is a method proposed for Online Class-Incremental Learning. It fine-tunes the main backbone continually by training a randomly pruned sparse subnetwork for each task, then freezes the trained connections to prevent forgetting. FOCIL also adaptively determines the sparsity level and learning rate per task, and achieves nearly zero forgetting across all tasks without expanding the network or storing replay data.
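The per-task loop described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the repository's implementation: the toy weight matrix, the `random_prune_mask` helper, the fixed sparsity values, and the fake gradient are all assumptions made for demonstration. It shows the two key mechanics: each task trains only a randomly selected sparse subset of the still-free connections, and those connections are frozen afterwards so later tasks cannot overwrite them.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_prune_mask(shape, sparsity, free):
    # Hypothetical helper: randomly pick a subset of the still-free
    # (unfrozen) connections to serve as this task's sparse expert.
    mask = np.zeros(shape, dtype=bool)
    free_idx = np.flatnonzero(free)
    k = int(round((1.0 - sparsity) * free.size))
    chosen = rng.choice(free_idx, size=min(k, free_idx.size), replace=False)
    mask.flat[chosen] = True
    return mask

# Toy weight matrix standing in for one backbone layer.
W = np.zeros((8, 8))
frozen = np.zeros_like(W, dtype=bool)  # connections locked by earlier tasks

for task, sparsity in enumerate([0.8, 0.8]):  # per-task sparsity (adaptive in FOCIL)
    mask = random_prune_mask(W.shape, sparsity, ~frozen)
    grad = np.ones_like(W)      # stand-in for the gradient of the task loss
    W += -0.1 * grad * mask     # update only this task's sparse subnetwork
    frozen |= mask              # freeze trained connections -> no forgetting
```

Because each task's mask is drawn only from unfrozen connections, the per-task experts are disjoint and earlier tasks' weights are never modified.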

## How to Use

For example, to train on 20-task CIFAR-100, run:

```shell
python main.py \
       --dataset cifar100 \
       --num_classes 100 \
       --num_tasks 20 \
       --num_classes_per_task 5 \
       --backbone resnet18
```

## License

This project is released under the MIT license included in this repository.

## Acknowledgments

We thank the following repositories for providing helpful components and functions used in our work.