DADAO

Implementation of the ICML 2023 paper DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization.
To compare our work to other decentralized optimizers, we also implemented MSDA, ADOM+ (with and without Multi-Consensus), and the optimizer described in the Continuized Framework.
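
At a high level, DADAO decouples local gradient computations from pairwise gossip communications by modeling them as two independent Poisson point processes. The toy simulation below is a minimal sketch of that event model only, not code from this repository: the quadratic objectives, rates, learning rate, and complete-graph gossip are all illustrative assumptions, and the full algorithm additionally uses accelerated dual updates not shown here.

import numpy as np

# Toy illustration only (not this repository's code): gradient steps and
# gossip steps are events of two independent Poisson point processes, so
# computation and communication are decoupled in continuous time.

rng = np.random.default_rng(0)

n_workers = 4        # hypothetical network size
rate_grad = 1.0      # gradient-event rate per worker (assumed)
rate_gossip = 2.0    # global rate of pairwise communication events (assumed)
t_max = 50.0         # simulation horizon in continuous time

# Local objectives f_i(x) = 0.5 * (x - b_i)^2, so the global optimum is mean(b).
b = rng.normal(size=n_workers)
x = np.zeros(n_workers)
lr = 0.5

t = 0.0
while True:
    # Two competing exponential clocks; the earlier one fires. Memorylessness
    # of the exponential lets us redraw both waiting times after every event.
    dt_grad = rng.exponential(1.0 / (rate_grad * n_workers))
    dt_gossip = rng.exponential(1.0 / rate_gossip)
    t += min(dt_grad, dt_gossip)
    if t > t_max:
        break
    if dt_grad < dt_gossip:
        # Gradient event: one worker takes a local step, no communication.
        i = rng.integers(n_workers)
        x[i] -= lr * (x[i] - b[i])
    else:
        # Gossip event: two workers average their iterates, no gradient
        # (complete-graph connectivity assumed for simplicity).
        i, j = rng.choice(n_workers, size=2, replace=False)
        x[i] = x[j] = 0.5 * (x[i] + x[j])

print("consensus spread:", x.std(), "| distance to optimum:", abs(x.mean() - b.mean()))

Because the two clocks are independent, the communication and computation rates can be tuned separately, which is the trade-off the paper exploits.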

Requirements

Usage

Simply run the main script, e.g., as follows:

python main.py --optimizer_name "DADAO" --n_workers 10 --classification True --graph_type "random_geom" --t_max 200

In our Examples Notebook, we provide further examples of how to run the implemented optimizers, along with a small exploration of the datasets and graphs considered.
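
For instance, a small driver script could sweep over the implemented optimizers. This is a hypothetical helper, not part of the repository; the optimizer-name strings below are assumptions to be checked against main.py's argument parser.

import subprocess

# Hypothetical driver (not part of the repository): reruns main.py once per
# optimizer to reproduce a comparison like the one in the Examples Notebook.
for name in ["DADAO", "MSDA", "ADOM+", "Continuized"]:  # names are assumptions
    subprocess.run(
        ["python", "main.py",
         "--optimizer_name", name,
         "--n_workers", "10",
         "--classification", "True",
         "--graph_type", "random_geom",
         "--t_max", "200"],
        check=True,
    )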

Citation

@InProceedings{pmlr-v202-nabli23a,
  title     = {{DADAO}: Decoupled Accelerated Decentralized Asynchronous Optimization},
  author    = {Nabli, Adel and Oyallon, Edouard},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {25604--25626},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
}
