Update README.md
README.md CHANGED
```diff
@@ -6,6 +6,7 @@ datasets:
 ---
 
 
+
 # Model Details
 
 The TinyCodeLM family of tiny language models (LMs) is a collection of fully open-source pretrained and instruction tuned generative code models in 150M and 400M sizes. These models are pretrained on a mixture of open-source web text and Python code. The instruction tuned TinyCodeLM models are optimized for Python code synthesis, and are trained on [synthetic edit sequence data generated with the LintSeq algorithm](https://arxiv.org/abs/2410.02749).
@@ -41,7 +42,7 @@ TinyCodeLM models were pretrained from scratch on a single H100 node (four GPUs)
 | :----------- | -----------------: | -----------------: |
 | HumanEval, pass@1 | 12.8 | 13.4 |
 | HumanEval, pass@10 | 20.6 | 20.9 |
-| MBPP(+), pass@1 | 13.6 |
+| MBPP(+), pass@1 | 13.6 | 19.4 |
 | MBPP(+), pass@10 | 24.4 | 29.9 |
 
 
```
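For context on the model card being edited: it describes instruction tuned LMs for Python code synthesis, evaluated with pass@k on HumanEval and MBPP(+). Below is a minimal generation sketch using Hugging Face Transformers; it is not part of the commit. The repo ID is an assumption (check the Hub for the actual TinyCodeLM checkpoint names), and the prompt is illustrative.

```python
# A minimal sketch, assuming the checkpoints are hosted on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "upiter/TinyCodeLM-400M"  # hypothetical repo ID; substitute the real one

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "Write a Python function that returns the nth Fibonacci number."
inputs = tokenizer(prompt, return_tensors="pt")

# Sample one completion. Benchmark numbers like those in the table above
# (pass@1, pass@10) are computed by drawing k such samples per problem and
# checking each against the benchmark's unit tests.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```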