Paper • How To Use • Cite • License • Acknowledgments • Contact
FOCIL: Finetune-and-Freeze for Online Class Incremental Learning by Training Randomly Pruned Sparse Experts
FOCIL is a method proposed for Online Class-Incremental Learning. It fine-tunes the main backbone continually by training a randomly pruned sparse subnetwork for each task, then freezes the trained connections to prevent forgetting. FOCIL also adaptively determines the sparsity level and learning rate per task, and achieves nearly zero forgetting across all tasks without expanding the network or storing replay data.
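The per-task mechanism above can be sketched in PyTorch. This is a minimal illustration, not the repository's actual API: the class and function names (`SparseExpertLinear`, `random_task_mask`, `begin_task`, `end_task`) are hypothetical, and the real implementation operates on a full backbone with adaptive sparsity rather than a single linear layer with a fixed rate.

```python
import torch
import torch.nn as nn

def random_task_mask(weight, sparsity, frozen):
    """Randomly select still-free weights for the new task (illustrative helper)."""
    free = ~frozen                               # weights no earlier task has claimed
    keep = torch.rand_like(weight) > sparsity    # random pruning at the given sparsity
    return keep & free

class SparseExpertLinear(nn.Module):
    """One layer of a shared backbone with per-task random sparse subnetworks."""

    def __init__(self, in_f, out_f):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_f, in_f) * 0.01)
        self.register_buffer("frozen", torch.zeros(out_f, in_f, dtype=torch.bool))
        self.task_mask = None

    def begin_task(self, sparsity):
        # draw a random sparse subnetwork from the still-unfrozen weights
        self.task_mask = random_task_mask(self.weight, sparsity, self.frozen)

    def end_task(self):
        # freeze the connections trained for this task to prevent forgetting
        self.frozen |= self.task_mask

    def forward(self, x):
        # only frozen (earlier-task) and current-task weights are active
        active = (self.frozen | self.task_mask).float()
        return x @ (self.weight * active).t()

layer = SparseExpertLinear(8, 4)
layer.begin_task(sparsity=0.8)
out = layer(torch.randn(2, 8))

# during training, gradients outside the current task's mask are zeroed,
# so frozen weights from earlier tasks are never updated:
out.sum().backward()
layer.weight.grad *= layer.task_mask.float()
layer.end_task()
```

In this sketch, freezing is enforced by masking gradients rather than by detaching tensors, which keeps the whole backbone in a single parameter set while guaranteeing that earlier tasks' connections stay untouched.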
For example, to train 20-task CIFAR-100, run:

```bash
python main.py \
    --dataset cifar100 \
    --num_classes 100 \
    --num_tasks 20 \
    --num_classes_per_task 5 \
    --backbone resnet18
```
This project is released under the MIT license; see the license file in this repository.
We thank the following repositories for providing helpful components and functions used in our work.