
Till the Layers Collapse: Compressing a Deep Neural Network Through the Lenses of Batch Normalization Layers


LTCI, Télécom Paris, Institut Polytechnique de Paris

📣 Published as a conference paper at AAAI 2025.

This repository implements the key experiments of the paper Till the Layers Collapse: Compressing a Deep Neural Network Through the Lenses of Batch Normalization Layers, and in particular our method, Till the Layers Collapse (TLC), which reduces the depth of over-parameterized deep neural networks.

[Figure] Overview of the key steps of TLC: identification of the layer to remove, removal of its irrelevant channels, linearization of the remaining ones, and removal of the layer.
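
As a rough illustration of the identification step, here is a minimal sketch (not the authors' implementation; the choice of model and the mean-|gamma| scoring rule are our assumptions) of how one might rank layers by the magnitude of their batch normalization scale parameters:

import torch.nn as nn
from torchvision.models import resnet18

# Hypothetical setup: any BN-equipped network stands in for the paper's models.
# For an untrained model all gammas are initialized to 1, so in practice this
# ranking is only meaningful on a trained checkpoint.
model = resnet18()

# Score each BatchNorm layer by the mean absolute value of its scale (gamma).
# A small mean |gamma| suggests the layer's channels carry little signal,
# making the layer a plausible removal candidate.
scores = {
    name: module.weight.detach().abs().mean().item()
    for name, module in model.named_modules()
    if isinstance(module, nn.BatchNorm2d)
}

candidate = min(scores, key=scores.get)
print(f"Weakest BN layer: {candidate} (mean |gamma| = {scores[candidate]:.4f})")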

Example Runs

To test TLC with MobileNetv2 on CIFAR-10 following the same policy as in the paper, you can run:

python TLC_MobileNetV2.py

To test TLC with ResNet-18 on another dataset, like VLCS, you can run:

python TLC_ResNet-18.py --dataset VLCS --DATA_DIR $DATASET_PATH

Before executing, please ensure that the target dataset already exists on your machine.
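
For instance, assuming VLCS has been downloaded to ~/datasets/VLCS (a hypothetical location), the invocation above would be:

export DATASET_PATH=~/datasets/VLCS
python TLC_ResNet-18.py --dataset VLCS --DATA_DIR $DATASET_PATH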

List of available datasets

  • CIFAR-10
  • Tiny-ImageNet-200
  • PACS
  • VLCS
  • ImageNet
  • QNLI
  • RTE
  • SST-2

List of available architectures

  • VGG-16 (BN version)
  • ResNet-18
  • Swin-T
  • MobileNetv2
  • BERT
  • RoBERTa

Extended version

The extended version of the paper, Till the Layers Collapse: Compressing a Deep Neural Network Through the Lenses of Batch Normalization Layers, is available on arXiv: https://arxiv.org/abs/2412.15077.

Citation

If you find this useful for your research, please cite the following paper.

@article{liao2024till,
  title={Till the Layers Collapse: Compressing a Deep Neural Network through the Lenses of Batch Normalization Layers},
  author={Liao, Zhu and Hezbri, Nour and Qu{\'e}tu, Victor and Nguyen, Van-Tam and Tartaglione, Enzo},
  journal={arXiv preprint arXiv:2412.15077},
  year={2024}
}
