Update README.md
README.md CHANGED

@@ -1,10 +1,10 @@
 ---
-license:
+license: apache-2.0
 ---
 
 ## Overview
 
-The
+The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. This is the chat model finetuned on a diverse range of synthetic dialogues generated by ChatGPT.
 
 ## Variants
 
@@ -33,5 +33,5 @@ The Phi-3, state-of-the-art open model trained with the Phi-3 datasets that incl
 
 - **Author:** Microsoft
 - **Converter:** [Homebrew](https://www.homebrew.ltd/)
-- **Original License:** [
-- **Papers:** [
+- **Original License:** [License](https://huggingface.co/datasets/choosealicense/licenses/blob/main/markdown/apache-2.0.md)
+- **Papers:** [Tinyllama Paper](https://arxiv.org/abs/2401.02385)