justheuristic committed on
Commit 4a376d7 · 1 Parent(s): fce987b

Update README.md

Files changed (1):
  1. README.md +3 -2
README.md CHANGED
@@ -19,9 +19,11 @@ pinned: false
 <div class="lg:col-span-3">
 <img src="https://raw.githubusercontent.com/NCAI-Research/CALM/main/assets/logo.png" width="380" alt="CALM Logo" />
 <p class="mb-2">
-This organization is a part of the NeurIPS 2021 demonstration <u><a href="https://training-transformers-together.github.io/">"Training Transformers Together"</a></u>.
+CALM: Collaborative Arabic Language Model
 </p>
 <p class="mb-2">
+The CALM project is a joint effort led by <a href="https://sdaia.gov.sa/ncai/?Lang=en">NCAI</a> in collaboration with <a href="https://yandex.com/">Yandex</a> and <a href="https://huggingface.co/">HuggingFace</a> to train an Arabic language model with volunteers from around the globe. The project is an adaptation of the framework proposed at the NeurIPS 2021 demonstration: <a href="https://huggingface.co/training-transformers-together">Training Transformers Together</a>.
+TODO
 In this demo, we train a model similar to <u><a target="_blank" href="https://openai.com/blog/dall-e/">OpenAI DALL-E</a></u> —
 a Transformer "language model" that generates images from text descriptions.
 Training happens collaboratively — volunteers from all over the Internet contribute to the training using hardware available to them.
@@ -59,7 +61,6 @@ pinned: false
 
 #
 
-The CALM project is a joint effort led by <a href="https://sdaia.gov.sa/ncai/?Lang=en">NCAI</a> in collaboration with <a href="https://yandex.com/">Yandex</a> and <a href="https://huggingface.co/">HuggingFace</a> to train an Arabic language model with volunteers from around the globe. The project is an adaptation of the framework proposed at the NeurIPS 2021 demonstration: <a href="https://huggingface.co/training-transformers-together">Training Transformers Together</a>.
 
 One of the main obstacles facing many researchers in the Arabic NLP community is the lack of the computing resources needed to train large models. Models with leading performance on Arabic NLP tasks, such as <a href="https://github.com/aub-mind/arabert">AraBERT</a>, <a href="https://github.com/CAMeL-Lab/CAMeLBERT">CamelBERT</a>, <a href="https://huggingface.co/aubmindlab/araelectra-base-generator">AraELECTRA</a>, and <a href="https://huggingface.co/qarib">QARiB</a>, took days to train on TPUs. In the spirit of democratizing AI and enabling the community, a core value at NCAI, CALM aims to demonstrate the effectiveness of collaborative training and form a community of volunteers for Arabic NLP researchers with basic-level cloud GPUs who wish to train their own models collaboratively.
 
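For context on what "contributing hardware" means here, below is a minimal sketch of how a volunteer peer could join a collaborative run with the <a href="https://github.com/learning-at-home/hivemind">hivemind</a> library that underlies the Training Transformers Together setup. The model, `run_id`, and batch sizes are hypothetical placeholders for illustration, not the actual CALM configuration.

```python
import torch
import torch.nn.functional as F
import hivemind

# Stand-in model and optimizer; a real run would load the actual Transformer.
model = torch.nn.Linear(512, 512)
base_opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Join (or here, bootstrap) the swarm's distributed hash table.
# A volunteer would pass initial_peers=[...] with a multiaddr published
# by the run's organizers; started standalone in this sketch.
dht = hivemind.DHT(start=True)
print("visible multiaddrs:", dht.get_visible_maddrs())

# hivemind.Optimizer performs local steps and averages updates with other
# peers once the swarm has jointly accumulated target_batch_size samples.
# run_id is a hypothetical experiment name shared by all participants.
opt = hivemind.Optimizer(
    dht=dht,
    run_id="calm-demo",
    batch_size_per_step=4,     # samples this peer contributes per step
    target_batch_size=4096,    # global batch size per collaborative update
    optimizer=base_opt,
    use_local_updates=True,
    verbose=True,
)

# Ordinary PyTorch training loop on synthetic data; each step reports
# progress to the swarm and joins averaging when a round is ready.
for _ in range(10):
    x = torch.randn(4, 512)
    loss = F.mse_loss(model(x), x)
    loss.backward()
    opt.step()
    opt.zero_grad()
```

Every volunteer runs essentially the same script; the DHT handles peer discovery, so peers can join or leave while the run is in progress.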