Ontocord.AI committed
Commit 71c8660
Parent(s): 07c1961
Update README.md

README.md CHANGED
@@ -32,4 +32,4 @@ If you are interested in contributing to this project, please reach out to us an
 
 Let's work together to create open-source models that benefit everyone! 🤝 #AI #MDEL #Supercomputers #Summit #OpenSource #Innovation #VolunteersNeeded #OpenScience #DemocratizeAI
 
-** Why did we change the term "Layer" to "Learning"? Because we are exploring, in addition layerwise experts, also working with different adapters and architecture like Flamingo (https://arxiv.org/abs/2204.14198), EMU (https://arxiv.org/abs/2307.05222) and a novel multi-node architecture for training loras we call lora-x, which will allow us to swap out different component experts to improve the performance of the model.
+** Why did we change the term "Layer" to "Learning"? Because we are exploring, in addition to layerwise experts, also working with different adapters and architecture like Flamingo (https://arxiv.org/abs/2204.14198), EMU (https://arxiv.org/abs/2307.05222) and a novel multi-node architecture for training loras we call lora-x, which will allow us to swap out different component experts to improve the performance of the model.