
[ICLR 2025] Find Your Optimal Assignments On-the-fly: A Holistic Framework for Clustered Federated Learning


LINs-lab/HCFL


If our project helps you, please give us a star ⭐ and cite our paper!

Requirements:

Pillow == 8.1.2
tqdm
scikit-learn == 0.21.3
numpy == 1.19.0
torch == 1.2.0
matplotlib == 3.1.1
networkx == 2.5.1
cvxpy
torchvision
tensorboard
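The pinned dependencies above can be collected into a `requirements.txt` file (a sketch, mirroring the list as written) and installed with `pip install -r requirements.txt`:

```
Pillow==8.1.2
tqdm
scikit-learn==0.21.3
numpy==1.19.0
torch==1.2.0
matplotlib==3.1.1
networkx==2.5.1
cvxpy
torchvision
tensorboard
```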

Configurations:

  • Aggregator: see the FedIASAggregator class in aggregator.py; follow the annotations to choose the backbone algorithm, aggregation method, and similarity metric.
  • Client similarity: see the SplitLearnersEnsemble class in learners/learners_ensemble.py; follow the annotations to choose whether gradients or prototypes are used to compute the similarity.
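As a rough illustration only (not the repository's actual implementation), a gradient- or prototype-based client similarity typically reduces to a pairwise cosine similarity over flattened per-client vectors. A minimal NumPy sketch, with the hypothetical function name pairwise_cosine_similarity:

```python
import numpy as np

def pairwise_cosine_similarity(client_vectors):
    """Pairwise cosine similarity between clients.

    client_vectors: (n_clients, dim) array with one flattened gradient
    (or prototype) vector per client.
    Returns an (n_clients, n_clients) similarity matrix.
    """
    norms = np.linalg.norm(client_vectors, axis=1, keepdims=True)
    unit = client_vectors / np.clip(norms, 1e-12, None)  # avoid divide-by-zero
    return unit @ unit.T

# Three toy clients: orthogonal, orthogonal, and a mix of both.
sims = pairwise_cosine_similarity(np.array([[1.0, 0.0],
                                            [0.0, 1.0],
                                            [1.0, 1.0]]))
```

A clustering step can then threshold or spectrally partition this matrix to group similar clients.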

Example scripts:

python run_experiment.py cifar10-c-2swap FedEM_SW --n_learners 2 --n_rounds 200 --bz 128 --lr 0.03 --lr_scheduler constant --log_freq 1 --device 1 --optimizer sgd --seed 1 --verbose 1 --suffix 03-lr-03-resnet-split-005-04-1 --split
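To repeat the run over several seeds, the example command can be wrapped in a small shell loop (a sketch with some flags elided for brevity; `echo` is used here only to print the generated commands instead of launching them):

```shell
for seed in 1 2 3; do
  echo python run_experiment.py cifar10-c-2swap FedEM_SW \
    --n_learners 2 --n_rounds 200 --bz 128 --lr 0.03 --lr_scheduler constant \
    --seed "$seed" --split
done
```

Dropping the `echo` executes the experiments one after another.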

Hints:

  • All supported algorithms are listed in constants.py. The default models are available in models.py. Detailed explanations of the arguments can be found in args.py.
  • You first need to use the scripts in create_c to construct the datasets.
  • The --split flag indicates using a shared feature extractor for all clients.
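Conceptually, the --split setting means every client keeps its own head while all clients reference one shared feature-extractor object, so extractor updates are visible to everyone at once. A toy pure-Python sketch (class and attribute names are hypothetical, not the repository's):

```python
class SharedExtractor:
    """Feature extractor whose parameters are shared across all clients."""
    def __init__(self, weight=2.0):
        self.weight = weight  # stands in for the extractor's parameters

    def extract(self, xs):
        return [self.weight * x for x in xs]


class ClientModel:
    """Per-client head stacked on top of the shared extractor."""
    def __init__(self, extractor, bias):
        self.extractor = extractor  # the *same* object for every client
        self.bias = bias            # client-specific parameters

    def forward(self, xs):
        return [f + self.bias for f in self.extractor.extract(xs)]


shared = SharedExtractor()
clients = [ClientModel(shared, bias=i) for i in range(3)]

# Updating the shared extractor is immediately visible to every client.
shared.weight = 3.0
outputs = [c.forward([1.0]) for c in clients]
```

Without --split, each client would instead hold its own extractor copy and only the aggregator would reconcile them.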

Acknowledgement

We are grateful to the awesome projects that this repository builds upon.

Bibliography

If you find this repository helpful for your project, please consider citing:

@inproceedings{guo2025enhancing,
  title={Enhancing Clustered Federated Learning: Integration of Strategies and Improved Methodologies},
  author={Yongxin Guo and Xiaoying Tang and Tao Lin},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=zPDpdk3V8L}
}
