---
library_name: transformers
license: mit
base_model: Labira/LabiraPJOK_3x_50
tags:
  - generated_from_keras_callback
model-index:
  - name: Labira/LabiraPJOK_5x_50
    results: []
---

# Labira/LabiraPJOK_5x_50

This model is a fine-tuned version of [Labira/LabiraPJOK_3x_50](https://huggingface.co/Labira/LabiraPJOK_3x_50) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 0.0274
- Validation Loss: 3.1328
- Epoch: 44

## Model description

More information needed

## Intended uses & limitations

More information needed
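
The card does not document the intended task, so the sketch below loads the checkpoint with the generic TensorFlow auto classes only; swap `TFAutoModel` for the appropriate `TFAutoModelFor*` class once the task head is known. The Indonesian sample input is a hypothetical placeholder.

```python
from transformers import AutoTokenizer, TFAutoModel

model_id = "Labira/LabiraPJOK_5x_50"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModel.from_pretrained(model_id)  # generic encoder, no task head

# Encode a sample input and inspect the encoder output shape.
inputs = tokenizer("Contoh teks masukan.", return_tensors="tf")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```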

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 150, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
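
For readability, the serialized optimizer config above corresponds to the following Keras construction (a sketch; all values are copied verbatim from the config):

```python
import tensorflow as tf

# With power=1.0 and cycle=False, PolynomialDecay is a plain linear ramp
# from 2e-05 down to 0.0 over 150 training steps.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=150,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam as configured on this card; weight decay and gradient clipping are
# unset. The config also records jit_compile=True for the original run.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```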

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.2696     | 2.9298          | 0     |
| 1.9565     | 2.6626          | 1     |
| 1.6690     | 2.7837          | 2     |
| 1.1679     | 2.9196          | 3     |
| 1.0975     | 2.8046          | 4     |
| 0.8930     | 2.6822          | 5     |
| 0.6527     | 2.6118          | 6     |
| 0.5637     | 2.5444          | 7     |
| 0.4854     | 2.5175          | 8     |
| 0.4389     | 2.5464          | 9     |
| 0.3206     | 2.5893          | 10    |
| 0.3225     | 2.6538          | 11    |
| 0.1880     | 2.7504          | 12    |
| 0.1288     | 2.8371          | 13    |
| 0.1381     | 2.9128          | 14    |
| 0.0994     | 2.9468          | 15    |
| 0.1544     | 2.9312          | 16    |
| 0.0978     | 2.9279          | 17    |
| 0.0492     | 2.9426          | 18    |
| 0.0612     | 2.9733          | 19    |
| 0.1016     | 3.0228          | 20    |
| 0.0554     | 3.0772          | 21    |
| 0.0768     | 3.1331          | 22    |
| 0.0277     | 3.1720          | 23    |
| 0.0403     | 3.1906          | 24    |
| 0.0730     | 3.1962          | 25    |
| 0.0204     | 3.1958          | 26    |
| 0.0731     | 3.1981          | 27    |
| 0.0414     | 3.1874          | 28    |
| 0.0316     | 3.1657          | 29    |
| 0.0324     | 3.1507          | 30    |
| 0.0526     | 3.1275          | 31    |
| 0.0369     | 3.1141          | 32    |
| 0.0406     | 3.1091          | 33    |
| 0.0214     | 3.1127          | 34    |
| 0.0209     | 3.1207          | 35    |
| 0.0139     | 3.1172          | 36    |
| 0.0215     | 3.1166          | 37    |
| 0.0140     | 3.1168          | 38    |
| 0.0277     | 3.1187          | 39    |
| 0.0131     | 3.1214          | 40    |
| 0.0184     | 3.1226          | 41    |
| 0.0286     | 3.1256          | 42    |
| 0.0144     | 3.1297          | 43    |
| 0.0274     | 3.1328          | 44    |

### Framework versions

- Transformers 4.44.2
- TensorFlow 2.17.0
- Datasets 3.0.1
- Tokenizers 0.19.1