# Labira/LabiraPJOK_6_50
This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset. It achieves the following results at the final training epoch:
- Train Loss: 0.2339
- Validation Loss: 5.5381
- Epoch: 45
## Model description
More information needed
## Intended uses & limitations
More information needed
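
Since the author has not documented intended uses, the following is only a minimal loading sketch via the `transformers` pipeline API. It assumes the checkpoint carries a question-answering head, which this card does not actually state; the placeholder strings are hypothetical and would be replaced with real Indonesian-language inputs.

```python
from transformers import pipeline

# Sketch only: "question-answering" is an assumption, since the card
# does not state the task. Swap the task string if it differs.
qa = pipeline(
    "question-answering",
    model="Labira/LabiraPJOK_6_50",
    framework="tf",  # the checkpoint was trained with TensorFlow/Keras
)

# Hypothetical inputs; replace with a real question and supporting passage.
result = qa(question="...", context="...")
print(result["answer"], result["score"])
```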
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 150, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
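
For reference, the serialized optimizer config above corresponds to the following Keras construction. This is a sketch reconstructed from the config dump; the original training script is not part of this card.

```python
import tensorflow as tf

# With power=1.0, PolynomialDecay reduces the learning rate linearly
# from 2e-05 to 0.0 over the first 150 optimization steps.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=150,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```

Note that with `cycle=False`, the learning rate stays at `end_learning_rate` (here 0.0) once the 150 decay steps are exhausted.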
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 5.9288     | 5.8645          | 0     |
| 5.3504     | 5.5642          | 1     |
| 4.8415     | 5.2861          | 2     |
| 4.4262     | 5.0405          | 3     |
| 3.9393     | 4.8319          | 4     |
| 3.5618     | 4.6846          | 5     |
| 3.1397     | 4.5537          | 6     |
| 2.8672     | 4.5351          | 7     |
| 2.5237     | 4.5660          | 8     |
| 2.2248     | 4.6591          | 9     |
| 1.9959     | 4.7339          | 10    |
| 1.7472     | 4.7277          | 11    |
| 1.4278     | 4.8903          | 12    |
| 1.3894     | 5.0373          | 13    |
| 1.0967     | 5.0278          | 14    |
| 1.1221     | 4.9946          | 15    |
| 0.7442     | 5.2586          | 16    |
| 0.8612     | 5.2843          | 17    |
| 0.5931     | 5.1700          | 18    |
| 0.6821     | 5.1365          | 19    |
| 0.5318     | 5.3756          | 20    |
| 0.6532     | 5.3596          | 21    |
| 0.4709     | 5.0514          | 22    |
| 0.5231     | 5.2925          | 23    |
| 0.4409     | 5.4717          | 24    |
| 0.4190     | 5.3165          | 25    |
| 0.2947     | 5.2838          | 26    |
| 0.4589     | 5.3584          | 27    |
| 0.3687     | 5.3363          | 28    |
| 0.4503     | 5.3796          | 29    |
| 0.3816     | 5.5059          | 30    |
| 0.2797     | 5.6630          | 31    |
| 0.3414     | 5.7868          | 32    |
| 0.2765     | 5.8455          | 33    |
| 0.3072     | 5.6132          | 34    |
| 0.3149     | 5.3748          | 35    |
| 0.2645     | 5.2937          | 36    |
| 0.2593     | 5.2920          | 37    |
| 0.2050     | 5.3847          | 38    |
| 0.2114     | 5.4801          | 39    |
| 0.2337     | 5.5506          | 40    |
| 0.1893     | 5.5690          | 41    |
| 0.2540     | 5.5731          | 42    |
| 0.2297     | 5.5711          | 43    |
| 0.3589     | 5.5490          | 44    |
| 0.2339     | 5.5381          | 45    |
### Framework versions
- Transformers 4.44.2
- TensorFlow 2.17.0
- Datasets 3.0.1
- Tokenizers 0.19.1