Implementation of the ICML 2023 paper DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization.
To compare our work with other decentralized optimizers, we also implemented MSDA, ADOM+ (with and without Multi-Consensus), and the optimizer described in the Continuized Framework.
Simply run the main script, for example:
python main.py --optimizer_name "DADAO" --n_workers 10 --classification True --graph_type "random_geom" --t_max 200
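To compare optimizers under identical settings, the same flags can be reused while varying --optimizer_name. Below is a minimal sketch; the baseline identifier strings ("MSDA", "ADOM+") are assumptions, so check them against the choices accepted by main.py:

# Sketch: run DADAO and two baselines on the same random geometric graph.
# Only the flags from the example above are used; the baseline name
# strings are assumed and may differ from those expected by main.py.
for opt in "DADAO" "MSDA" "ADOM+"; do
    python main.py --optimizer_name "$opt" --n_workers 10 \
        --classification True --graph_type "random_geom" --t_max 200
done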
In our Examples Notebook, we provide further examples of how to run the implemented optimizers, along with a small exploration of the datasets and graphs considered.
If you use this code, please cite:

@InProceedings{pmlr-v202-nabli23a,
  title     = {{DADAO}: Decoupled Accelerated Decentralized Asynchronous Optimization},
  author    = {Nabli, Adel and Oyallon, Edouard},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {25604--25626},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
}