---
license: bigcode-openrail-m
datasets:
- bigcode/self-oss-instruct-sc2-exec-filter-50k
base_model:
- bigcode/starcoder2-3b
library_name: transformers
---
StarCoder2-3B fine-tuned in the same way as https://huggingface.co/bigcode/starcoder2-15b-instruct-v0.1, using https://huggingface.co/datasets/bigcode/self-oss-instruct-sc2-exec-filter-50k.
* Epochs: 1
* Learning rate: 0.0001
* LoRA rank: 8
* Batch size: 16
* Evaluation split: 0
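
The hyperparameters above could be expressed as a `peft`/`transformers` training configuration roughly like the following. This is a minimal sketch, not the actual training script: the output directory, LoRA alpha/dropout, and target modules are assumptions not stated in this card.

```python
from peft import LoraConfig
from transformers import TrainingArguments

# LoRA adapter configuration; rank comes from this card,
# alpha and dropout are assumed defaults.
lora_config = LoraConfig(
    r=8,                 # LoRA rank (from this card)
    lora_alpha=16,       # assumption, not stated in the card
    lora_dropout=0.05,   # assumption, not stated in the card
    task_type="CAUSAL_LM",
)

# Trainer hyperparameters matching the list above.
training_args = TrainingArguments(
    output_dir="starcoder2-3b-instruct",  # hypothetical path
    num_train_epochs=1,                   # from this card
    learning_rate=1e-4,                   # from this card
    per_device_train_batch_size=16,       # from this card
)
```

With no evaluation split (0, as listed above), no `eval_dataset` would be passed to the trainer.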