Defetya committed on
Commit 4e8d00b · verified · 1 Parent(s): 61eddf2

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -1,5 +1,5 @@
  ---
  license: apache-2.0
  ---
- openllama_v2 3B second-stage pre-trained on OSCAR with 4k sequence length. Model has seen about 5B tokens for now; weights will be updated as the training goes on.
+ openllama_v2 3B second-stage pre-trained on the Russian part of OSCAR with 4k sequence length. Model has seen about 5B tokens for now; weights will be updated as the training goes on.
  Achieves 3.8 perplexity on the evaluation dataset. Will be further pre-trained on a wiki dataset with 16K context length.
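Since the README describes a causal LM checkpoint hosted on the Hub, a minimal loading sketch with Hugging Face Transformers may help; the repo id below is a placeholder (the commit does not state the actual model id), and the prompt simply reflects that the second-stage data is Russian OSCAR.

```python
# Minimal sketch: loading and sampling from the checkpoint with Transformers.
# "Defetya/openllama-v2-3b" is a placeholder repo id, not confirmed by this commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Defetya/openllama-v2-3b"  # placeholder; replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 3B model fits comfortably in fp16 on a single GPU
    device_map="auto",
)

prompt = "Москва - столица"  # Russian prompt, matching the OSCAR-ru pre-training data
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```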