|
---
library_name: peft
license: other
base_model: deepseek-ai/deepseek-coder-1.3b-base
tags:
- generated_from_trainer
model-index:
- name: lemexp-task1-template_full-deepseek-coder-1.3b-base-ddp-8lr
  results: []
---
|
|
|
|
|
|
# lemexp-task1-template_full-deepseek-coder-1.3b-base-ddp-8lr |
|
|
|
This model is a [PEFT](https://huggingface.co/docs/peft) adapter fine-tuned from [deepseek-ai/deepseek-coder-1.3b-base](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-base) on a dataset that is not documented in this card.
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.1692 |
|
|
|
## Model description |
|
|
|
This repository contains a PEFT adapter (not full model weights) for `deepseek-ai/deepseek-coder-1.3b-base`, trained with distributed data parallelism across 8 GPUs (see the hyperparameters below). Going by the model name, it targets the lemexp task 1 `template_full` setup, but the task and training data are otherwise undocumented.
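
A minimal inference sketch, assuming the adapter is loaded with `peft` on top of the base model. The adapter repo id below is a placeholder, and the prompt format is an assumption, since the training data is not documented:

```python
# Sketch only: load the base model and apply this PEFT adapter on top of it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "deepseek-ai/deepseek-coder-1.3b-base"
adapter_id = "your-username/lemexp-task1-template_full-deepseek-coder-1.3b-base-ddp-8lr"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)

# Wrap the base model with the fine-tuned adapter weights.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# The expected prompt format is undocumented; plain text is assumed here.
prompt = "theorem example_lemma :"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```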
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):
|
- learning_rate: 0.0008 |
|
- train_batch_size: 2 |
|
- eval_batch_size: 2 |
|
- seed: 42 |
|
- distributed_type: multi-GPU |
|
- num_devices: 8 |
|
- total_train_batch_size: 16 |
|
- total_eval_batch_size: 16 |
|
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
|
- lr_scheduler_type: linear |
|
- num_epochs: 18 |
|
- mixed_precision_training: Native AMP |
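
A minimal sketch of how these settings might map to `transformers.TrainingArguments`; only the values listed above come from this card, while the output directory (and the training script itself) are assumptions. Launched across 8 processes (e.g. via `torchrun --nproc_per_node=8`), the per-device batch size of 2 yields the reported total batch size of 8 × 2 = 16:

```python
# Sketch only: reconstructs the reported hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lemexp-task1-template_full-deepseek-coder-1.3b-base-ddp-8lr",  # assumed
    learning_rate=8e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=18,
    fp16=True,  # "Native AMP" mixed-precision training
)
```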
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-------:|:------:|:---------------:|
| 0.3452 | 0.2000 | 2902 | 0.3366 |
| 0.3188 | 0.4001 | 5804 | 0.3166 |
| 0.3088 | 0.6001 | 8706 | 0.3066 |
| 0.2989 | 0.8001 | 11608 | 0.2897 |
| 0.296 | 1.0001 | 14510 | 0.2956 |
| 0.2886 | 1.2002 | 17412 | 0.2853 |
| 0.2838 | 1.4002 | 20314 | 0.2798 |
| 0.2836 | 1.6002 | 23216 | 0.2750 |
| 0.278 | 1.8002 | 26118 | 0.2719 |
| 0.2749 | 2.0003 | 29020 | 0.2711 |
| 0.2722 | 2.2003 | 31922 | 0.2696 |
| 0.2681 | 2.4003 | 34824 | 0.2643 |
| 0.2698 | 2.6004 | 37726 | 0.2591 |
| 0.2634 | 2.8004 | 40628 | 0.2585 |
| 0.2598 | 3.0004 | 43530 | 0.2632 |
| 0.2537 | 3.2004 | 46432 | 0.2581 |
| 0.2529 | 3.4005 | 49334 | 0.2516 |
| 0.2544 | 3.6005 | 52236 | 0.2481 |
| 0.2531 | 3.8005 | 55138 | 0.2475 |
| 0.2505 | 4.0006 | 58040 | 0.2523 |
| 0.2434 | 4.2006 | 60942 | 0.2439 |
| 0.2439 | 4.4006 | 63844 | 0.2396 |
| 0.2415 | 4.6006 | 66746 | 0.2358 |
| 0.2375 | 4.8007 | 69648 | 0.2346 |
| 0.2395 | 5.0007 | 72550 | 0.2391 |
| 0.2319 | 5.2007 | 75452 | 0.2318 |
| 0.2296 | 5.4007 | 78354 | 0.2310 |
| 0.2315 | 5.6008 | 81256 | 0.2277 |
| 0.2264 | 5.8008 | 84158 | 0.2249 |
| 0.2246 | 6.0008 | 87060 | 0.2228 |
| 0.2196 | 6.2009 | 89962 | 0.2217 |
| 0.216 | 6.4009 | 92864 | 0.2182 |
| 0.2181 | 6.6009 | 95766 | 0.2181 |
| 0.2156 | 6.8009 | 98668 | 0.2180 |
| 0.2144 | 7.0010 | 101570 | 0.2139 |
| 0.2063 | 7.2010 | 104472 | 0.2130 |
| 0.2055 | 7.4010 | 107374 | 0.2101 |
| 0.2051 | 7.6010 | 110276 | 0.2074 |
| 0.2058 | 7.8011 | 113178 | 0.2065 |
| 0.2037 | 8.0011 | 116080 | 0.2044 |
| 0.1965 | 8.2011 | 118982 | 0.2010 |
| 0.1935 | 8.4012 | 121884 | 0.2014 |
| 0.1945 | 8.6012 | 124786 | 0.1984 |
| 0.1916 | 8.8012 | 127688 | 0.1983 |
| 0.19 | 9.0012 | 130590 | 0.1946 |
| 0.182 | 9.2013 | 133492 | 0.1936 |
| 0.1831 | 9.4013 | 136394 | 0.1905 |
| 0.1831 | 9.6013 | 139296 | 0.1874 |
| 0.1779 | 9.8014 | 142198 | 0.1892 |
| 0.1781 | 10.0014 | 145100 | 0.1873 |
| 0.1706 | 10.2014 | 148002 | 0.1840 |
| 0.1678 | 10.4014 | 150904 | 0.1832 |
| 0.168 | 10.6015 | 153806 | 0.1817 |
| 0.1675 | 10.8015 | 156708 | 0.1791 |
| 0.165 | 11.0015 | 159610 | 0.1767 |
| 0.1572 | 11.2015 | 162512 | 0.1780 |
| 0.1566 | 11.4016 | 165414 | 0.1770 |
| 0.1563 | 11.6016 | 168316 | 0.1738 |
| 0.1549 | 11.8016 | 171218 | 0.1734 |
| 0.1568 | 12.0017 | 174120 | 0.1779 |
| 0.1814 | 12.2017 | 177022 | 0.1961 |
| 0.1855 | 12.4017 | 179924 | 0.1945 |
| 0.1863 | 12.6017 | 182826 | 0.1942 |
| 0.186 | 12.8018 | 185728 | 0.1949 |
| 0.1855 | 13.0018 | 188630 | 0.1927 |
| 0.1791 | 13.2018 | 191532 | 0.1921 |
| 0.1785 | 13.4018 | 194434 | 0.1915 |
| 0.179 | 13.6019 | 197336 | 0.1902 |
| 0.1775 | 13.8019 | 200238 | 0.1895 |
| 0.1791 | 14.0019 | 203140 | 0.1873 |
| 0.169 | 14.2020 | 206042 | 0.1900 |
| 0.1719 | 14.4020 | 208944 | 0.1858 |
| 0.1698 | 14.6020 | 211846 | 0.1825 |
| 0.1707 | 14.8020 | 214748 | 0.1810 |
| 0.168 | 15.0021 | 217650 | 0.1814 |
| 0.1614 | 15.2021 | 220552 | 0.1810 |
| 0.1611 | 15.4021 | 223454 | 0.1780 |
| 0.1615 | 15.6022 | 226356 | 0.1768 |
| 0.1636 | 15.8022 | 229258 | 0.1769 |
| 0.1595 | 16.0022 | 232160 | 0.1771 |
| 0.1527 | 16.2022 | 235062 | 0.1749 |
| 0.1521 | 16.4023 | 237964 | 0.1738 |
| 0.1527 | 16.6023 | 240866 | 0.1727 |
| 0.1529 | 16.8023 | 243768 | 0.1713 |
| 0.1494 | 17.0023 | 246670 | 0.1721 |
| 0.1452 | 17.2024 | 249572 | 0.1719 |
| 0.1436 | 17.4024 | 252474 | 0.1699 |
| 0.1445 | 17.6024 | 255376 | 0.1696 |
| 0.1424 | 17.8025 | 258278 | 0.1692 |
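
Validation loss improves steadily until about epoch 12, regresses sharply at epoch 12.2 (0.1734 → 0.1961), then recovers over the remaining epochs; the final value of 0.1692 at epoch 17.8 matches the evaluation loss reported at the top of this card.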
|
|
|
|
|
### Framework versions |
|
|
|
- PEFT 0.14.0 |
|
- Transformers 4.47.0 |
|
- Pytorch 2.5.1+cu124 |
|
- Datasets 3.2.0 |
|
- Tokenizers 0.21.0 |
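
To approximate this environment, the pinned versions above can be installed with, e.g., `pip install peft==0.14.0 transformers==4.47.0 datasets==3.2.0 tokenizers==0.21.0`, together with a PyTorch 2.5.1 build matching your CUDA setup (the card records `2.5.1+cu124`).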