---
license: cc-by-4.0
base_model: vesteinn/DanskBERT
tags:
  - generated_from_trainer
model-index:
  - name: MeMo_BERT-WSD-DanskBERT
    results: []
---

# MeMo_BERT-WSD-DanskBERT

This model is a fine-tuned version of [vesteinn/DanskBERT](https://huggingface.co/vesteinn/DanskBERT) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.7755
- F1-score: 0.5209
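The card does not state how the F1-score is averaged over the word-sense classes. As a point of reference only (not necessarily the metric used here), a macro-averaged F1 can be computed as:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute F1 per class, then take the unweighted mean.

    Note: this is a reference implementation for illustration; the model card
    does not specify which averaging the reported 0.5209 uses.
    """
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for label in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)
```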

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
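
The card does not report warmup steps, so the sketch below assumes zero warmup (the Hugging Face `Trainer` default). With 61 optimizer steps per epoch over 20 epochs (1220 total steps, matching the step counts in the results table), the linear schedule decays the learning rate from 5e-05 to 0. `linear_lr` is a hypothetical helper, not part of any library:

```python
def linear_lr(step, total_steps=1220, base_lr=5e-5, warmup_steps=0):
    """Linear schedule with optional warmup: ramp up over warmup_steps,
    then decay linearly to 0 at total_steps.

    Defaults are taken from this run (assuming zero warmup, which the
    card does not state explicitly).
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training (step 610) the learning rate has decayed to 2.5e-05.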

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1-score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 61   | 1.4766          | 0.1229   |
| No log        | 2.0   | 122  | 1.4366          | 0.1229   |
| No log        | 3.0   | 183  | 1.3636          | 0.2462   |
| No log        | 4.0   | 244  | 1.2889          | 0.3692   |
| No log        | 5.0   | 305  | 1.4150          | 0.3786   |
| No log        | 6.0   | 366  | 1.5581          | 0.3409   |
| No log        | 7.0   | 427  | 1.6512          | 0.4664   |
| No log        | 8.0   | 488  | 1.7405          | 0.4661   |
| 0.9424        | 9.0   | 549  | 1.7755          | 0.5209   |
| 0.9424        | 10.0  | 610  | 2.4738          | 0.4351   |
| 0.9424        | 11.0  | 671  | 2.4721          | 0.4858   |
| 0.9424        | 12.0  | 732  | 2.9449          | 0.4491   |
| 0.9424        | 13.0  | 793  | 2.8346          | 0.4528   |
| 0.9424        | 14.0  | 854  | 3.0715          | 0.4845   |
| 0.9424        | 15.0  | 915  | 3.1416          | 0.4520   |
| 0.9424        | 16.0  | 976  | 3.0893          | 0.5197   |
| 0.1197        | 17.0  | 1037 | 3.1668          | 0.4764   |
| 0.1197        | 18.0  | 1098 | 3.2142          | 0.4656   |
| 0.1197        | 19.0  | 1159 | 3.2174          | 0.5087   |
| 0.1197        | 20.0  | 1220 | 3.2239          | 0.5087   |
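
The reported checkpoint (loss 1.7755, F1 0.5209) corresponds to the epoch-9 row, which is the best validation F1 of the run; later epochs improve training loss but validation loss keeps rising, a typical overfitting pattern. A quick sanity check over the table:

```python
# (epoch, validation loss, F1) rows copied from the training-results table
results = [
    (1, 1.4766, 0.1229), (2, 1.4366, 0.1229), (3, 1.3636, 0.2462),
    (4, 1.2889, 0.3692), (5, 1.4150, 0.3786), (6, 1.5581, 0.3409),
    (7, 1.6512, 0.4664), (8, 1.7405, 0.4661), (9, 1.7755, 0.5209),
    (10, 2.4738, 0.4351), (11, 2.4721, 0.4858), (12, 2.9449, 0.4491),
    (13, 2.8346, 0.4528), (14, 3.0715, 0.4845), (15, 3.1416, 0.4520),
    (16, 3.0893, 0.5197), (17, 3.1668, 0.4764), (18, 3.2142, 0.4656),
    (19, 3.2174, 0.5087), (20, 3.2239, 0.5087),
]

# Best epoch by validation F1 -- matches the reported checkpoint (epoch 9)
best_epoch, best_loss, best_f1 = max(results, key=lambda r: r[2])
```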

## Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2