Update README.md
README.md
CHANGED
@@ -24,6 +24,9 @@ Despite being trained on only 72 billion tokens of text, the models outperform m

**Instruction Tuning Data** TinyCodeLMs are instruction tuned on paired instruction and Python edit sequence data. These edit sequences are generated with the LintSeq algorithm over a source dataset of paired instruction and Python programs drawn from the Magicoder and StarCoder2 OSS-Instruct datasets (Wei et al., 2024).

+# Training Details
+TinyCodeLM models were pretrained from scratch on a single H100 node (four GPUs) for two epochs. Pretraining took about two days and six days for the two model sizes, respectively. Instruction tuning was conducted on a single H100 GPU using DeepSpeed and took no more than several hours.
+
# Benchmarks

**Pretrained (Temperature 0)**
@@ -54,3 +57,6 @@
primaryClass={cs.LG}
}
```
+
+# Safety
+This work explores data-driven mechanisms for improving the quality of language model-generated code. Our synthetic data generation method relies on open-source data and our experiments leverage open-source software and resources. It is important to acknowledge that all language models for code synthesis have the potential to be misused, whether intentionally or unintentionally, for the generation of code with vulnerabilities and/or malicious behaviors. Any and all model-generated code has the potential to be harmful and must not be executed without precautions.
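
The instruction-tuning paragraph above references the LintSeq algorithm for converting instruction-paired Python programs into edit sequences. The generator itself is not part of this change; purely as an illustration of the general idea (delete lines while a linter still accepts the program, then reverse the trajectory into insertion edits written as diffs), here is a minimal, hypothetical sketch. It uses `ast.parse` as a stand-in linter and `difflib` for the edit representation, and it is not the released LintSeq implementation.

```python
import ast
import difflib
import random


def lints(source: str) -> bool:
    """Stand-in 'linter': only checks that the program still parses.
    A real pipeline would call an actual Python linter here."""
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False


def sample_edit_sequence(program: str, rng: random.Random) -> list[str]:
    """Delete random lines while the remainder still passes the check,
    then reverse the trajectory into a sequence of insertion edits,
    each rendered as a unified diff."""
    states = [program.splitlines()]
    current = program.splitlines()
    while current:
        order = list(range(len(current)))
        rng.shuffle(order)
        for i in order:
            trial = current[:i] + current[i + 1:]
            if lints("\n".join(trial)):
                current = trial
                break
        else:
            break  # no single-line deletion keeps the program valid
        states.append(current)
    states.reverse()  # smallest program first
    return [
        "\n".join(difflib.unified_diff(before, after, lineterm=""))
        for before, after in zip(states, states[1:])
    ]


if __name__ == "__main__":
    src = "def add(a, b):\n    return a + b\n\nprint(add(1, 2))"
    for edit in sample_edit_sequence(src, random.Random(0)):
        print(edit, end="\n\n")
```

Each emitted diff stands for one synthetic edit; in the dataset described above, such edit sequences are paired with instructions for tuning.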
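
The training details note that instruction tuning ran on a single H100 GPU with DeepSpeed. The actual training script and configuration are not included here; the sketch below only shows what a minimal single-GPU DeepSpeed loop could look like (ZeRO stage 2, bf16), with a placeholder model and random data standing in for the language model and the tokenized instruction and edit-sequence pairs. None of the hyperparameters are the ones used for TinyCodeLM.

```python
import torch
import deepspeed
from torch.utils.data import DataLoader, TensorDataset

# Illustrative single-GPU DeepSpeed config (ZeRO stage 2, bf16); values are
# placeholders, not the hyperparameters used for TinyCodeLM.
ds_config = {
    "train_micro_batch_size_per_gpu": 8,
    "gradient_accumulation_steps": 4,
    "bf16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "AdamW", "params": {"lr": 2e-5}},
}

# Placeholder model and random data standing in for the LM and the tokenized
# instruction / edit-sequence pairs.
model = torch.nn.Linear(512, 512)
dataset = TensorDataset(torch.randn(256, 512), torch.randn(256, 512))
loader = DataLoader(dataset, batch_size=8)

engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for inputs, targets in loader:
    inputs = inputs.to(engine.device, dtype=torch.bfloat16)
    targets = targets.to(engine.device, dtype=torch.bfloat16)
    loss = torch.nn.functional.mse_loss(engine(inputs), targets)
    engine.backward(loss)  # DeepSpeed scales the loss and accumulates gradients
    engine.step()          # optimizer step every gradient_accumulation_steps batches
```

A script like this would typically be started with the DeepSpeed launcher, for example `deepspeed --num_gpus 1 train.py`.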