📣 Published as a conference paper at AAAI 2025.
This repository implements the key experiments of the paper *Till the Layers Collapse: Compressing a Deep Neural Network Through the Lenses of Batch Normalization Layers*, and in particular our method, Till the Layers Collapse (TLC), which reduces the depth of over-parameterized deep neural networks.
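At a high level, TLC uses batch normalization statistics to decide which layers contribute little and can be collapsed. The sketch below is a simplified illustration of that idea, not the paper's exact criterion: it ranks (hypothetical) layers by the mean absolute BN scale γ, so layers with near-zero γ surface as removal candidates.

```python
def rank_layers_by_bn_scale(bn_gammas):
    """Rank layers from least to most important by mean |gamma|.

    bn_gammas: dict mapping layer name -> list of BN scale (gamma) values.
    Returns layer names sorted ascending by mean absolute gamma, so the
    first entries are the strongest candidates to collapse.
    """
    importance = {
        name: sum(abs(g) for g in gammas) / len(gammas)
        for name, gammas in bn_gammas.items()
    }
    return sorted(importance, key=importance.get)

# Toy example: "block2" has near-zero gammas, so it ranks first for removal.
gammas = {
    "block1": [0.9, 1.1, 0.8],
    "block2": [0.01, 0.02, 0.015],
    "block3": [0.5, 0.6, 0.4],
}
print(rank_layers_by_bn_scale(gammas))  # ['block2', 'block3', 'block1']
```

The actual TLC policy is described in the paper; this snippet only conveys the intuition of reading layer importance "through the lenses" of BN parameters.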
To test TLC with MobileNetv2 on CIFAR-10 following the same policy as in the paper, you can run:
```shell
python TLC_MobileNetV2.py
```
To test TLC with ResNet-18 on another dataset, like VLCS, you can run:
```shell
python TLC_ResNet-18.py --dataset VLCS --DATA_DIR $DATASET_PATH
```
Before executing, please ensure that the target dataset already exists on your machine. The following datasets are supported:
- CIFAR-10
- Tiny-ImageNet-200
- PACS
- VLCS
- ImageNet
- QNLI
- RTE
- SST-2
The following architectures are supported:
- VGG-16 (BN version)
- ResNet-18
- Swin-T
- MobileNetv2
- BERT
- RoBERTa
The extended version of the paper is available at Till the Layers Collapse: Compressing a Deep Neural Network Through the Lenses of Batch Normalization Layers.
If you find this useful for your research, please cite the following paper.
```bibtex
@article{liao2024till,
  title={Till the Layers Collapse: Compressing a Deep Neural Network through the Lenses of Batch Normalization Layers},
  author={Liao, Zhu and Hezbri, Nour and Qu{\'e}tu, Victor and Nguyen, Van-Tam and Tartaglione, Enzo},
  journal={arXiv preprint arXiv:2412.15077},
  year={2024}
}
```