Ontocord.AI committed on
Commit 05a81b9 · 1 Parent(s): 736b5a5

Update README.md

Files changed (1):
  1. README.md +3 -0
README.md CHANGED
 
 Volunteers from:
 Bedrock AI, TurkuNLP, ETH, Redmond.AI, Incite, MICS CentraleSupelec, Centro de Excelência em Inteligência Artificial, VietAI, Technion - Israel Institute of Technology, Nous Research, University of Western Australia, KoboldAI Community, LAION.AI, Mila, Luleå University of Technology, Juelich Supercomputing Center, Tokyo Tech, RIKEN, Together

+ - [Try out our current proof of concept](https://huggingface.co/Multi-Domain-Expert-Layers/MDEL-theblackcat-chat-5-experts/)

 Open sourcing AI models can lead to increased innovation, accessibility, transparency, and community building. However, we need a mechanism to train more capable models in an efficient and modular way.

 We will be using a variant of the c-BTM (https://arxiv.org/pdf/2303.14177v1.pdf) method and will be focusing on models around 7-20B parameters.

+ ## We are also adding multimodal training (both understanding and generation) and multilingual training, with a context length of at least 65K tokens.
+
 If you are interested in contributing to this project, please reach out to us and learn more about how you can get involved at [email protected].

  Let's work together to create open-source models that benefit everyone! 🤝 #AI #MDEL #Supercomputers #Summit #OpenSource #Innovation #VolunteersNeeded #OpenScience #DemocratizeAI
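For context on the method mentioned in the diff: c-BTM trains separate expert language models on unsupervised (k-means) clusters of the corpus, then ensembles the experts' next-token distributions at inference time, weighted by the context's cluster membership. The toy sketch below illustrates only that ensembling step; every name, shape, and number in it is an illustrative assumption, not code from the MDEL project or the c-BTM release.

```python
import numpy as np

# Toy sketch of c-BTM-style inference: each "expert" LM is trained on one
# k-means cluster of the corpus; at inference, the experts' next-token
# distributions are mixed, weighted by the context's cluster membership.
# All values below are illustrative stand-ins, not real model outputs.

rng = np.random.default_rng(0)

K = 3       # number of experts / clusters (hypothetical)
VOCAB = 5   # tiny vocabulary for illustration
DIM = 8     # embedding dimension for illustration

# Stand-ins for the k-means centroids and a context embedding.
centroids = rng.normal(size=(K, DIM))
context_embedding = rng.normal(size=DIM)

def cluster_weights(x, centroids, T=0.1):
    """Cluster posterior: softmax over negative squared distances.
    The temperature T sharpens (small T) or flattens the mixture."""
    d2 = ((centroids - x) ** 2).sum(axis=1)
    logits = -d2 / T
    logits -= logits.max()          # numerical stability
    w = np.exp(logits)
    return w / w.sum()

# Stand-ins for each expert's next-token probability distribution.
expert_probs = rng.dirichlet(np.ones(VOCAB), size=K)   # shape (K, VOCAB)

w = cluster_weights(context_embedding, centroids)      # shape (K,)
mixed = w @ expert_probs                               # shape (VOCAB,)

print("mixture weights:", np.round(w, 3))
print("ensembled next-token probs:", np.round(mixed, 3))
```

Because `mixed` is a convex combination of probability distributions, it is itself a valid distribution; sampling or greedy decoding then proceeds from it exactly as with a single model.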