# robbert-2023-dutch-base-ft-nlp-xxl
This model is a fine-tuned version of [DTAI-KULeuven/robbert-2023-dutch-base](https://huggingface.co/DTAI-KULeuven/robbert-2023-dutch-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.0118
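The card does not state the training objective, so interpreting this number involves an assumption: if it is a mean cross-entropy in nats, as the `Trainer` reports for masked-language-modeling runs, it corresponds to a perplexity of roughly exp(2.0118) ≈ 7.5.

```python
import math

# Assuming the reported validation loss is a mean cross-entropy in nats
# (the training objective is not documented), the implied perplexity is:
print(math.exp(2.0118))  # ≈ 7.48
```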
## Model description
More information needed
Intended uses & limitations
More information needed
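In the absence of documented intended uses, the sketch below shows one hypothetical way to load the checkpoint. It assumes the model was fine-tuned with a masked-language-modeling objective and is published under the repository ID `helena-balabin/robbert-2023-dutch-base-ft-nlp-xxl`; neither is confirmed by this card.

```python
from transformers import pipeline

# Hypothetical usage sketch: assumes the checkpoint is a masked-language model;
# the card does not document the actual training objective or downstream task.
fill_mask = pipeline(
    "fill-mask",
    model="helena-balabin/robbert-2023-dutch-base-ft-nlp-xxl",
)

# RobBERT-2023 uses a RoBERTa-style tokenizer, so the mask token is "<mask>".
for prediction in fill_mask("Er staat een <mask> in mijn tuin."):
    print(prediction["token_str"], round(prediction["score"], 3))
```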
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2000
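As a minimal sketch, these values map onto `TrainingArguments` in Transformers 4.38 roughly as follows; the output directory and the evaluation/logging cadence are assumptions (the results table logs a validation loss every 10 steps), and the model, dataset, and data collator are not documented in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="robbert-2023-dutch-base-ft-nlp-xxl",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=2000,
    evaluation_strategy="steps",
    eval_steps=10,    # assumed from the 10-step cadence in the results table
    logging_steps=10,
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default
# AdamW configuration, so no explicit optimizer override is needed.
```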
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
2.8326 | 0.06 | 10 | 2.6788 |
2.7533 | 0.12 | 20 | 2.5468 |
2.4636 | 0.19 | 30 | 2.5083 |
2.6891 | 0.25 | 40 | 2.4572 |
2.5285 | 0.31 | 50 | 2.4016 |
2.5102 | 0.37 | 60 | 2.4493 |
2.5021 | 0.43 | 70 | 2.3338 |
2.4623 | 0.5 | 80 | 2.3530 |
2.3883 | 0.56 | 90 | 2.3881 |
2.4773 | 0.62 | 100 | 2.3410 |
2.4389 | 0.68 | 110 | 2.3148 |
2.3577 | 0.75 | 120 | 2.3326 |
2.3497 | 0.81 | 130 | 2.3429 |
2.3806 | 0.87 | 140 | 2.2916 |
2.433 | 0.93 | 150 | 2.2801 |
2.4703 | 0.99 | 160 | 2.2703 |
2.1623 | 1.06 | 170 | 2.3148 |
2.3273 | 1.12 | 180 | 2.2596 |
2.2054 | 1.18 | 190 | 2.1914 |
2.3115 | 1.24 | 200 | 2.2161 |
2.109 | 1.3 | 210 | 2.1979 |
2.375 | 1.37 | 220 | 2.2155 |
2.2816 | 1.43 | 230 | 2.1992 |
2.3764 | 1.49 | 240 | 2.1825 |
2.1229 | 1.55 | 250 | 2.2547 |
2.1761 | 1.61 | 260 | 2.1983 |
2.2285 | 1.68 | 270 | 2.2590 |
2.3079 | 1.74 | 280 | 2.1666 |
2.2963 | 1.8 | 290 | 2.2389 |
2.3471 | 1.86 | 300 | 2.1583 |
2.2031 | 1.93 | 310 | 2.2457 |
2.3073 | 1.99 | 320 | 2.2102 |
2.1813 | 2.05 | 330 | 2.1898 |
2.1958 | 2.11 | 340 | 2.2095 |
2.2239 | 2.17 | 350 | 2.2107 |
2.1024 | 2.24 | 360 | 2.2168 |
2.1895 | 2.3 | 370 | 2.1944 |
2.1631 | 2.36 | 380 | 2.2287 |
2.1258 | 2.42 | 390 | 2.1830 |
2.236 | 2.48 | 400 | 2.1641 |
2.1493 | 2.55 | 410 | 2.1377 |
2.1368 | 2.61 | 420 | 2.1640 |
2.1932 | 2.67 | 430 | 2.2102 |
2.2071 | 2.73 | 440 | 2.1461 |
2.2059 | 2.8 | 450 | 2.2398 |
2.2088 | 2.86 | 460 | 2.1055 |
2.2002 | 2.92 | 470 | 2.2272 |
2.1892 | 2.98 | 480 | 2.1622 |
2.1382 | 3.04 | 490 | 2.1392 |
2.0724 | 3.11 | 500 | 2.1669 |
2.09 | 3.17 | 510 | 2.1585 |
2.1398 | 3.23 | 520 | 2.1565 |
2.1023 | 3.29 | 530 | 2.1532 |
1.9628 | 3.35 | 540 | 2.1312 |
2.1294 | 3.42 | 550 | 2.1337 |
2.0734 | 3.48 | 560 | 2.1854 |
2.0503 | 3.54 | 570 | 2.1351 |
1.9727 | 3.6 | 580 | 2.1715 |
2.0652 | 3.66 | 590 | 2.1348 |
1.9942 | 3.73 | 600 | 2.2555 |
2.0017 | 3.79 | 610 | 2.1412 |
2.0962 | 3.85 | 620 | 2.1442 |
2.1212 | 3.91 | 630 | 2.1866 |
2.0276 | 3.98 | 640 | 2.0766 |
2.0726 | 4.04 | 650 | 2.0432 |
2.0554 | 4.1 | 660 | 2.1925 |
1.9865 | 4.16 | 670 | 2.1344 |
1.9676 | 4.22 | 680 | 2.1379 |
2.0355 | 4.29 | 690 | 2.1465 |
1.9982 | 4.35 | 700 | 2.0861 |
2.0307 | 4.41 | 710 | 2.1359 |
2.1014 | 4.47 | 720 | 2.0703 |
1.9608 | 4.53 | 730 | 2.0898 |
2.1068 | 4.6 | 740 | 2.2018 |
2.0099 | 4.66 | 750 | 2.1502 |
2.0715 | 4.72 | 760 | 2.0592 |
2.1272 | 4.78 | 770 | 2.1833 |
2.1069 | 4.84 | 780 | 2.0944 |
1.96 | 4.91 | 790 | 2.1344 |
2.0613 | 4.97 | 800 | 2.1366 |
1.9297 | 5.03 | 810 | 2.0956 |
2.0172 | 5.09 | 820 | 2.1792 |
2.0134 | 5.16 | 830 | 2.0792 |
1.9867 | 5.22 | 840 | 2.1058 |
1.9391 | 5.28 | 850 | 2.1820 |
1.8802 | 5.34 | 860 | 2.1274 |
1.9789 | 5.4 | 870 | 2.0956 |
2.0665 | 5.47 | 880 | 2.1209 |
2.0909 | 5.53 | 890 | 2.1557 |
1.9261 | 5.59 | 900 | 2.0976 |
2.0246 | 5.65 | 910 | 2.1127 |
1.9727 | 5.71 | 920 | 2.1670 |
1.8429 | 5.78 | 930 | 2.0906 |
2.001 | 5.84 | 940 | 2.0951 |
1.9363 | 5.9 | 950 | 2.0593 |
2.0033 | 5.96 | 960 | 2.0947 |
1.9868 | 6.02 | 970 | 2.0643 |
1.9011 | 6.09 | 980 | 2.1598 |
1.9562 | 6.15 | 990 | 2.0961 |
1.8923 | 6.21 | 1000 | 2.1436 |
1.9066 | 6.27 | 1010 | 2.0773 |
1.9805 | 6.34 | 1020 | 2.1261 |
1.829 | 6.4 | 1030 | 2.0962 |
1.8745 | 6.46 | 1040 | 2.0881 |
1.8518 | 6.52 | 1050 | 2.0200 |
1.9164 | 6.58 | 1060 | 2.0809 |
1.7968 | 6.65 | 1070 | 2.1169 |
1.9029 | 6.71 | 1080 | 2.0290 |
1.9383 | 6.77 | 1090 | 2.0806 |
1.8375 | 6.83 | 1100 | 2.0816 |
1.8289 | 6.89 | 1110 | 2.0660 |
1.894 | 6.96 | 1120 | 2.0229 |
1.843 | 7.02 | 1130 | 2.1239 |
1.8515 | 7.08 | 1140 | 2.0687 |
1.8899 | 7.14 | 1150 | 2.0832 |
1.903 | 7.2 | 1160 | 2.0882 |
1.8505 | 7.27 | 1170 | 2.0213 |
1.8155 | 7.33 | 1180 | 2.0808 |
1.9355 | 7.39 | 1190 | 2.0649 |
1.8213 | 7.45 | 1200 | 2.0817 |
1.9897 | 7.52 | 1210 | 2.1589 |
1.8044 | 7.58 | 1220 | 2.1288 |
1.9347 | 7.64 | 1230 | 2.0927 |
1.9311 | 7.7 | 1240 | 2.0180 |
1.922 | 7.76 | 1250 | 2.0163 |
1.8572 | 7.83 | 1260 | 2.0632 |
1.8858 | 7.89 | 1270 | 2.0255 |
1.8692 | 7.95 | 1280 | 2.0807 |
1.9486 | 8.01 | 1290 | 2.0829 |
1.8184 | 8.07 | 1300 | 2.0721 |
1.884 | 8.14 | 1310 | 2.0809 |
1.7928 | 8.2 | 1320 | 2.0462 |
1.8337 | 8.26 | 1330 | 2.0486 |
1.8443 | 8.32 | 1340 | 2.0113 |
1.8546 | 8.39 | 1350 | 2.0348 |
1.9359 | 8.45 | 1360 | 1.9960 |
1.874 | 8.51 | 1370 | 2.0198 |
1.9366 | 8.57 | 1380 | 2.1198 |
1.8081 | 8.63 | 1390 | 2.0964 |
1.8655 | 8.7 | 1400 | 2.0571 |
1.8357 | 8.76 | 1410 | 2.0432 |
1.8409 | 8.82 | 1420 | 2.0679 |
1.7785 | 8.88 | 1430 | 2.0930 |
1.766 | 8.94 | 1440 | 2.1041 |
1.8542 | 9.01 | 1450 | 2.0035 |
1.7403 | 9.07 | 1460 | 2.0662 |
1.8109 | 9.13 | 1470 | 1.9674 |
1.8191 | 9.19 | 1480 | 2.0274 |
1.7713 | 9.25 | 1490 | 2.1420 |
1.7628 | 9.32 | 1500 | 2.0899 |
1.8273 | 9.38 | 1510 | 1.9969 |
1.7786 | 9.44 | 1520 | 2.0089 |
1.7618 | 9.5 | 1530 | 2.0572 |
1.8247 | 9.57 | 1540 | 2.0710 |
1.7363 | 9.63 | 1550 | 1.9818 |
1.8374 | 9.69 | 1560 | 2.0177 |
1.8838 | 9.75 | 1570 | 2.0528 |
1.709 | 9.81 | 1580 | 1.9890 |
1.8743 | 9.88 | 1590 | 2.0105 |
1.855 | 9.94 | 1600 | 1.9971 |
1.8659 | 10.0 | 1610 | 2.0052 |
1.8172 | 10.06 | 1620 | 2.0004 |
1.7537 | 10.12 | 1630 | 2.1136 |
1.7822 | 10.19 | 1640 | 2.0685 |
1.7855 | 10.25 | 1650 | 2.0326 |
1.7825 | 10.31 | 1660 | 2.0402 |
1.7391 | 10.37 | 1670 | 2.0100 |
1.755 | 10.43 | 1680 | 2.0587 |
1.7649 | 10.5 | 1690 | 2.0548 |
1.7742 | 10.56 | 1700 | 2.0025 |
1.8407 | 10.62 | 1710 | 2.0164 |
1.828 | 10.68 | 1720 | 1.9975 |
1.7487 | 10.75 | 1730 | 2.0598 |
1.7521 | 10.81 | 1740 | 2.0318 |
1.7253 | 10.87 | 1750 | 2.1049 |
1.7245 | 10.93 | 1760 | 2.0569 |
1.8093 | 10.99 | 1770 | 1.9909 |
1.6967 | 11.06 | 1780 | 2.0660 |
1.7274 | 11.12 | 1790 | 2.0615 |
1.901 | 11.18 | 1800 | 2.0775 |
1.7667 | 11.24 | 1810 | 2.0470 |
1.8173 | 11.3 | 1820 | 2.0141 |
1.6841 | 11.37 | 1830 | 2.0541 |
1.7374 | 11.43 | 1840 | 2.0526 |
1.7307 | 11.49 | 1850 | 2.0060 |
1.7778 | 11.55 | 1860 | 2.0601 |
1.7656 | 11.61 | 1870 | 2.0358 |
1.7167 | 11.68 | 1880 | 2.1360 |
1.7 | 11.74 | 1890 | 2.0746 |
1.833 | 11.8 | 1900 | 2.0382 |
1.7076 | 11.86 | 1910 | 1.9974 |
1.7491 | 11.93 | 1920 | 2.0558 |
1.7912 | 11.99 | 1930 | 2.0598 |
1.7654 | 12.05 | 1940 | 2.0048 |
1.6612 | 12.11 | 1950 | 2.0457 |
1.7856 | 12.17 | 1960 | 2.0841 |
1.8026 | 12.24 | 1970 | 2.1041 |
1.696 | 12.3 | 1980 | 2.0776 |
1.7901 | 12.36 | 1990 | 2.0176 |
1.7881 | 12.42 | 2000 | 2.0118 |
### Framework versions
- Transformers 4.38.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0