---
license: apache-2.0
library_name: peft
tags:
- unsloth
- generated_from_trainer
base_model: mistralai/Mistral-7B-v0.3
model-index:
- name: mistral_7b_v_Magiccoder_evol_10k
  results: []
---
# mistral_7b_v_Magiccoder_evol_10k

This model is a fine-tuned version of [mistralai/Mistral-7B-v0.3](https://huggingface.co/mistralai/Mistral-7B-v0.3) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1499
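
Below is a minimal inference sketch for loading this adapter with PEFT. The `adapter_id` is assumed from the model name and may differ from the actual Hub path; the prompt is purely illustrative.

```python
# Minimal inference sketch: load the base model, then attach this LoRA adapter.
# NOTE: `adapter_id` is assumed from the model name; replace with the actual Hub path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.3"
adapter_id = "mistral_7b_v_Magiccoder_evol_10k"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the adapter weights

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```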
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows this list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 0.02 (fractional, i.e. effectively a 2% warmup ratio)
- num_epochs: 1
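
As a rough reproduction aid, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows. This is a sketch only (model, dataset, and Trainer wiring omitted), and it assumes the fractional warmup value is meant as a ratio:

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters.
# Assumption: the 0.02 warmup value is a ratio, not a step count.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="mistral_7b_v_Magiccoder_evol_10k",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 * 8 = total train batch size of 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```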
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.1885        | 0.0262 | 4    | 1.1900          |
| 1.0966        | 0.0523 | 8    | 1.1442          |
| 1.1468        | 0.0785 | 12   | 1.1514          |
| 1.0845        | 0.1047 | 16   | 1.1671          |
| 1.1413        | 0.1308 | 20   | 1.1635          |
| 1.0557        | 0.1570 | 24   | 1.1689          |
| 1.1949        | 0.1832 | 28   | 1.1682          |
| 1.149         | 0.2093 | 32   | 1.1674          |
| 1.0952        | 0.2355 | 36   | 1.1541          |
| 1.1551        | 0.2617 | 40   | 1.1687          |
| 1.1864        | 0.2878 | 44   | 1.1547          |
| 1.119         | 0.3140 | 48   | 1.1576          |
| 1.141         | 0.3401 | 52   | 1.1748          |
| 1.118         | 0.3663 | 56   | 1.1625          |
| 1.1186        | 0.3925 | 60   | 1.1571          |
| 1.1766        | 0.4186 | 64   | 1.1620          |
| 1.0801        | 0.4448 | 68   | 1.1534          |
| 1.0816        | 0.4710 | 72   | 1.1579          |
| 1.087         | 0.4971 | 76   | 1.1575          |
| 1.1822        | 0.5233 | 80   | 1.1619          |
| 1.0812        | 0.5495 | 84   | 1.1607          |
| 1.1626        | 0.5756 | 88   | 1.1611          |
| 1.21          | 0.6018 | 92   | 1.1624          |
| 1.1947        | 0.6280 | 96   | 1.1555          |
| 1.1154        | 0.6541 | 100  | 1.1518          |
| 1.1488        | 0.6803 | 104  | 1.1587          |
| 1.1402        | 0.7065 | 108  | 1.1595          |
| 1.0249        | 0.7326 | 112  | 1.1574          |
| 1.1102        | 0.7588 | 116  | 1.1472          |
| 1.1072        | 0.7850 | 120  | 1.1464          |
| 1.1382        | 0.8111 | 124  | 1.1473          |
| 1.1457        | 0.8373 | 128  | 1.1477          |
| 1.156         | 0.8635 | 132  | 1.1483          |
| 1.1037        | 0.8896 | 136  | 1.1488          |
| 1.2025        | 0.9158 | 140  | 1.1492          |
| 1.0551        | 0.9419 | 144  | 1.1496          |
| 1.0823        | 0.9681 | 148  | 1.1499          |
| 1.2344        | 0.9943 | 152  | 1.1499          |
### Framework versions
- PEFT 0.7.1
- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
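
To match this environment, the pinned versions above can be installed directly. This is a suggested command, not one taken from the original training setup; the index URL is the standard PyTorch wheel index for CUDA 12.1:

```bash
pip install peft==0.7.1 transformers==4.40.2 datasets==2.19.1 tokenizers==0.19.1
pip install torch==2.3.0 --index-url https://download.pytorch.org/whl/cu121
```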