# Model Card for Lite-Oute-1-65M-smol-smoltalk

This model is a fine-tuned version of [OuteAI/Lite-Oute-1-65M](https://huggingface.co/OuteAI/Lite-Oute-1-65M), trained on the smol-smoltalk dataset for 1 epoch. Lite-Oute-1-65M (Base) is an experimental ultra-compact base model in the Lite series, built on the LLaMA architecture and comprising approximately 65 million parameters.

It was fine-tuned using [TRL](https://github.com/huggingface/trl). Below, the fine-tuned model is evaluated on common benchmarks.
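Since smol-smoltalk is a chat-style dataset, prompts should be built in the messages format and rendered with the tokenizer's chat template. A minimal usage sketch follows; the repo id `OuteAI/Lite-Oute-1-65M-smol-smoltalk` is an assumption inferred from this card's title, so substitute the actual published checkpoint name.

```python
# Hedged usage sketch for the fine-tuned chat model.
# NOTE: the repo id below is assumed from the model card title,
# not confirmed by it -- replace with the real checkpoint id.
MODEL_ID = "OuteAI/Lite-Oute-1-65M-smol-smoltalk"

# Chat-format prompt, matching the smol-smoltalk data layout.
messages = [{"role": "user", "content": "What is the capital of France?"}]


def generate(model_id: str = MODEL_ID, max_new_tokens: int = 64) -> str:
    # Imported inside the function so the sketch can be read and
    # tested without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Render the messages with the model's chat template and append
    # the generation prompt so the model answers as the assistant.
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate())
```

At 65M parameters the model is small enough to run comfortably on CPU, so no device placement or quantization is needed for a quick trial.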
## Benchmarks: