From 7a7c93aefffc9494c39e7b170c07cb06d8c09c4c Mon Sep 17 00:00:00 2001
From: Alexander Borzunov
Date: Wed, 20 Jul 2022 17:01:45 +0400
Subject: [PATCH] Add links to "Example Use Cases" (#497)

I think some people are interested in the "Example Use Cases" section because
they'd like to know what has already been built with hivemind, and other
people would like to take a look at the code if they've already started to use
hivemind and want some code examples.

Currently, the sahajBERT link leads to the sahajBERT repo, which doesn't
describe much about the project itself. Conversely, it's hard to find the repo
with the code by following the CALM and "Training Transformers Together"
links. This PR adds more useful links to each of the projects.
---
 README.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 6c944f5a6..c6042f596 100644
--- a/README.md
+++ b/README.md
@@ -26,16 +26,16 @@ large model on hundreds of computers from different universities, companies, and

 To learn more about the ideas behind this library, see the [full list](https://github.com/learning-at-home/hivemind/tree/refer-to-discord-in-docs#citation) of our papers below.

-## Example Applications and Use Cases
+## Example Use Cases

 This section lists projects that leverage hivemind for decentralized training.
 If you have succesfully trained a model or created a downstream repository with the help of our library,
 feel free to submit a pull request that adds your project to this list.

-* [sahajBERT](https://github.com/tanmoyio/sahajbert) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.
-* [CALM](https://github.com/NCAI-Research/CALM) (Collaborative Arabic Language Model) — a masked language model trained on a combination of Arabic datasets.
-* [Training Transformers Together](https://training-transformers-together.github.io/) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
-* [HivemindStrategy](https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.strategies.HivemindStrategy.html) in PyTorch Lightning allows adapting your existing pipelines to training over slow network with unreliable peers.
+* **sahajBERT** ([blog post](https://huggingface.co/blog/collaborative-training), [code](https://github.com/tanmoyio/sahajbert)) — a collaboratively pretrained ALBERT-xlarge for the Bengali language.
+* **CALM** ([webpage](https://huggingface.co/CALM), [code](https://github.com/NCAI-Research/CALM)) — a masked language model trained on a combination of Arabic datasets.
+* **Training Transformers Together** ([webpage](https://training-transformers-together.github.io/), [code](https://github.com/learning-at-home/dalle-hivemind)) — a NeurIPS 2021 demonstration that trained a collaborative text-to-image Transformer model.
+* **HivemindStrategy** ([docs](https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.strategies.HivemindStrategy.html)) in PyTorch Lightning allows adapting your existing pipelines to training over slow network with unreliable peers.

 ## Installation
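Since the last bullet in the new list only points to documentation, here is a minimal sketch of what adapting a pipeline to HivemindStrategy looks like, modeled on the linked PyTorch Lightning docs page. It is an illustration, not part of the patch: `TinyModule`, the synthetic dataset, and the `target_batch_size` value are hypothetical placeholders.

```python
# A minimal sketch of HivemindStrategy usage, modeled on the PyTorch Lightning
# docs linked above. TinyModule and the dataset are hypothetical placeholders.
import pytorch_lightning as pl
import torch
from pytorch_lightning.strategies import HivemindStrategy
from torch import nn
from torch.utils.data import DataLoader, TensorDataset


class TinyModule(pl.LightningModule):
    """A toy regression module, only here to make the example self-contained."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=1e-3)


train_loader = DataLoader(
    TensorDataset(torch.randn(1024, 32), torch.randn(1024, 1)), batch_size=64
)

# Peers that launch this script with the same configuration discover each other
# over the hivemind DHT and accumulate gradients toward the shared target batch.
trainer = pl.Trainer(
    strategy=HivemindStrategy(target_batch_size=8192),  # illustrative value
    accelerator="gpu",
    devices=1,
    max_epochs=1,
)
trainer.fit(TinyModule(), train_loader)
```

Per the linked docs, additional machines can join the same collaborative run by passing the first peer's address via the strategy's `initial_peers` argument, after which gradient averaging proceeds even if individual peers drop out.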