fydhfzh committed
Commit 43be8f5 · verified · 1 Parent(s): d927aa8

End of training
README.md CHANGED
@@ -20,12 +20,12 @@ should probably proofread and complete it, then remove this comment. -->
  
  This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.6244
- - Accuracy: 0.8471
- - Precision: 0.8748
- - Recall: 0.8471
- - F1: 0.8488
- - Binary: 0.8939
+ - Loss: 0.8709
+ - Accuracy: 0.8083
+ - Precision: 0.8377
+ - Recall: 0.8083
+ - F1: 0.8018
+ - Binary: 0.8650
  
  ## Model description
  
@@ -44,7 +44,7 @@ More information needed
  ### Training hyperparameters
  
  The following hyperparameters were used during training:
- - learning_rate: 5e-05
+ - learning_rate: 3e-05
  - train_batch_size: 32
  - eval_batch_size: 32
  - seed: 42
@@ -59,63 +59,63 @@ The following hyperparameters were used during training:
  
  | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
  |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
- | No log | 0.17 | 50 | 4.1183 | 0.0437 | 0.0069 | 0.0437 | 0.0092 | 0.3226 |
- | No log | 0.35 | 100 | 3.7518 | 0.0461 | 0.0070 | 0.0461 | 0.0098 | 0.3272 |
- | No log | 0.52 | 150 | 3.5442 | 0.0825 | 0.0303 | 0.0825 | 0.0290 | 0.3527 |
- | No log | 0.69 | 200 | 3.3675 | 0.1068 | 0.0360 | 0.1068 | 0.0473 | 0.3697 |
- | No log | 0.86 | 250 | 3.2266 | 0.1214 | 0.0494 | 0.1214 | 0.0498 | 0.3784 |
- | 3.7123 | 1.04 | 300 | 3.0319 | 0.2015 | 0.1672 | 0.2015 | 0.1388 | 0.4388 |
- | 3.7123 | 1.21 | 350 | 2.8840 | 0.2694 | 0.2341 | 0.2694 | 0.2132 | 0.4871 |
- | 3.7123 | 1.38 | 400 | 2.6910 | 0.3859 | 0.3375 | 0.3859 | 0.3187 | 0.5694 |
- | 3.7123 | 1.55 | 450 | 2.5564 | 0.4223 | 0.4081 | 0.4223 | 0.3680 | 0.5934 |
- | 3.7123 | 1.73 | 500 | 2.4118 | 0.4684 | 0.4887 | 0.4684 | 0.4192 | 0.6265 |
- | 3.7123 | 1.9 | 550 | 2.2664 | 0.5170 | 0.4814 | 0.5170 | 0.4632 | 0.6612 |
- | 2.7681 | 2.07 | 600 | 2.1127 | 0.6214 | 0.6252 | 0.6214 | 0.5918 | 0.7350 |
- | 2.7681 | 2.24 | 650 | 1.9688 | 0.6165 | 0.6309 | 0.6165 | 0.5828 | 0.7308 |
- | 2.7681 | 2.42 | 700 | 1.8190 | 0.6553 | 0.6452 | 0.6553 | 0.6192 | 0.7580 |
- | 2.7681 | 2.59 | 750 | 1.7073 | 0.6626 | 0.6724 | 0.6626 | 0.6309 | 0.7631 |
- | 2.7681 | 2.76 | 800 | 1.6781 | 0.6553 | 0.6746 | 0.6553 | 0.6282 | 0.7573 |
- | 2.7681 | 2.93 | 850 | 1.5448 | 0.6942 | 0.7092 | 0.6942 | 0.6662 | 0.7852 |
- | 2.0663 | 3.11 | 900 | 1.4279 | 0.75 | 0.7486 | 0.75 | 0.7261 | 0.8252 |
- | 2.0663 | 3.28 | 950 | 1.3753 | 0.7403 | 0.7526 | 0.7403 | 0.7185 | 0.8184 |
- | 2.0663 | 3.45 | 1000 | 1.2878 | 0.7718 | 0.7685 | 0.7718 | 0.7524 | 0.8396 |
- | 2.0663 | 3.62 | 1050 | 1.2287 | 0.7670 | 0.7899 | 0.7670 | 0.7543 | 0.8354 |
- | 2.0663 | 3.8 | 1100 | 1.1488 | 0.7791 | 0.8148 | 0.7791 | 0.7687 | 0.8454 |
- | 2.0663 | 3.97 | 1150 | 1.1336 | 0.7718 | 0.8118 | 0.7718 | 0.7645 | 0.8396 |
- | 1.5924 | 4.14 | 1200 | 1.0735 | 0.7840 | 0.8185 | 0.7840 | 0.7763 | 0.8488 |
- | 1.5924 | 4.31 | 1250 | 1.0218 | 0.8058 | 0.8291 | 0.8058 | 0.7960 | 0.8641 |
- | 1.5924 | 4.49 | 1300 | 0.9783 | 0.7937 | 0.8176 | 0.7937 | 0.7868 | 0.8556 |
- | 1.5924 | 4.66 | 1350 | 0.9595 | 0.8083 | 0.8247 | 0.8083 | 0.8020 | 0.8667 |
- | 1.5924 | 4.83 | 1400 | 0.9167 | 0.8155 | 0.8394 | 0.8155 | 0.8072 | 0.8718 |
- | 1.2937 | 5.0 | 1450 | 0.8874 | 0.8083 | 0.8392 | 0.8083 | 0.8068 | 0.8658 |
- | 1.2937 | 5.18 | 1500 | 0.8918 | 0.7913 | 0.8305 | 0.7913 | 0.7893 | 0.8541 |
- | 1.2937 | 5.35 | 1550 | 0.8706 | 0.7937 | 0.8411 | 0.7937 | 0.7948 | 0.8558 |
- | 1.2937 | 5.52 | 1600 | 0.8037 | 0.8350 | 0.8804 | 0.8350 | 0.8347 | 0.8845 |
- | 1.2937 | 5.69 | 1650 | 0.8356 | 0.8010 | 0.8518 | 0.8010 | 0.8019 | 0.8626 |
- | 1.2937 | 5.87 | 1700 | 0.8261 | 0.8131 | 0.8491 | 0.8131 | 0.8159 | 0.8694 |
- | 1.0749 | 6.04 | 1750 | 0.7995 | 0.8131 | 0.8568 | 0.8131 | 0.8145 | 0.8721 |
- | 1.0749 | 6.21 | 1800 | 0.7737 | 0.8301 | 0.8674 | 0.8301 | 0.8315 | 0.8830 |
- | 1.0749 | 6.38 | 1850 | 0.7524 | 0.8228 | 0.8660 | 0.8228 | 0.8236 | 0.8755 |
- | 1.0749 | 6.56 | 1900 | 0.7203 | 0.8301 | 0.8728 | 0.8301 | 0.8342 | 0.8830 |
- | 1.0749 | 6.73 | 1950 | 0.7239 | 0.8398 | 0.8751 | 0.8398 | 0.8426 | 0.8881 |
- | 1.0749 | 6.9 | 2000 | 0.6872 | 0.8422 | 0.8797 | 0.8422 | 0.8454 | 0.8896 |
- | 0.916 | 7.08 | 2050 | 0.6973 | 0.8398 | 0.8791 | 0.8398 | 0.8431 | 0.8879 |
- | 0.916 | 7.25 | 2100 | 0.6895 | 0.8350 | 0.8783 | 0.8350 | 0.8403 | 0.8845 |
- | 0.916 | 7.42 | 2150 | 0.6613 | 0.8495 | 0.8832 | 0.8495 | 0.8510 | 0.8947 |
- | 0.916 | 7.59 | 2200 | 0.6550 | 0.8325 | 0.8633 | 0.8325 | 0.8348 | 0.8847 |
- | 0.916 | 7.77 | 2250 | 0.6565 | 0.8422 | 0.8746 | 0.8422 | 0.8434 | 0.8896 |
- | 0.916 | 7.94 | 2300 | 0.6689 | 0.8350 | 0.8777 | 0.8350 | 0.8381 | 0.8864 |
- | 0.7992 | 8.11 | 2350 | 0.6816 | 0.8252 | 0.8624 | 0.8252 | 0.8236 | 0.8786 |
- | 0.7992 | 8.28 | 2400 | 0.6394 | 0.8447 | 0.8784 | 0.8447 | 0.8465 | 0.8932 |
- | 0.7992 | 8.46 | 2450 | 0.6732 | 0.8252 | 0.8502 | 0.8252 | 0.8219 | 0.8786 |
- | 0.7992 | 8.63 | 2500 | 0.6593 | 0.8544 | 0.8836 | 0.8544 | 0.8542 | 0.8990 |
- | 0.7992 | 8.8 | 2550 | 0.6510 | 0.8374 | 0.8690 | 0.8374 | 0.8365 | 0.8871 |
- | 0.7992 | 8.97 | 2600 | 0.6500 | 0.8398 | 0.8761 | 0.8398 | 0.8417 | 0.8888 |
- | 0.7376 | 9.15 | 2650 | 0.6393 | 0.8374 | 0.8665 | 0.8374 | 0.8381 | 0.8871 |
- | 0.7376 | 9.32 | 2700 | 0.6284 | 0.8422 | 0.8716 | 0.8422 | 0.8435 | 0.8905 |
- | 0.7376 | 9.49 | 2750 | 0.6225 | 0.8471 | 0.8761 | 0.8471 | 0.8483 | 0.8939 |
- | 0.7376 | 9.66 | 2800 | 0.6219 | 0.8519 | 0.8836 | 0.8519 | 0.8544 | 0.8973 |
- | 0.7376 | 9.84 | 2850 | 0.6244 | 0.8471 | 0.8748 | 0.8471 | 0.8488 | 0.8939 |
+ | No log | 0.17 | 50 | 4.2449 | 0.0510 | 0.0085 | 0.0510 | 0.0115 | 0.3218 |
+ | No log | 0.35 | 100 | 3.9731 | 0.0461 | 0.0045 | 0.0461 | 0.0074 | 0.3265 |
+ | No log | 0.52 | 150 | 3.7773 | 0.0437 | 0.0097 | 0.0437 | 0.0124 | 0.3301 |
+ | No log | 0.69 | 200 | 3.6063 | 0.1019 | 0.0598 | 0.1019 | 0.0574 | 0.3701 |
+ | No log | 0.86 | 250 | 3.4003 | 0.1990 | 0.1624 | 0.1990 | 0.1291 | 0.4379 |
+ | 3.8908 | 1.04 | 300 | 3.2347 | 0.25 | 0.2359 | 0.25 | 0.1908 | 0.4728 |
+ | 3.8908 | 1.21 | 350 | 3.0941 | 0.3131 | 0.2684 | 0.3131 | 0.2507 | 0.5177 |
+ | 3.8908 | 1.38 | 400 | 2.9380 | 0.3835 | 0.3642 | 0.3835 | 0.3147 | 0.5670 |
+ | 3.8908 | 1.55 | 450 | 2.8310 | 0.3956 | 0.3552 | 0.3956 | 0.3255 | 0.5750 |
+ | 3.8908 | 1.73 | 500 | 2.6928 | 0.4709 | 0.4197 | 0.4709 | 0.4024 | 0.6282 |
+ | 3.8908 | 1.9 | 550 | 2.6116 | 0.5194 | 0.4641 | 0.5194 | 0.4569 | 0.6629 |
+ | 3.027 | 2.07 | 600 | 2.4653 | 0.5704 | 0.5395 | 0.5704 | 0.5124 | 0.6988 |
+ | 3.027 | 2.24 | 650 | 2.3361 | 0.6044 | 0.5338 | 0.6044 | 0.5442 | 0.7223 |
+ | 3.027 | 2.42 | 700 | 2.2435 | 0.5947 | 0.5439 | 0.5947 | 0.5324 | 0.7141 |
+ | 3.027 | 2.59 | 750 | 2.1533 | 0.6141 | 0.6006 | 0.6141 | 0.5678 | 0.7291 |
+ | 3.027 | 2.76 | 800 | 2.0622 | 0.6189 | 0.6313 | 0.6189 | 0.5815 | 0.7325 |
+ | 3.027 | 2.93 | 850 | 1.9734 | 0.6578 | 0.6757 | 0.6578 | 0.6267 | 0.7597 |
+ | 2.4367 | 3.11 | 900 | 1.9059 | 0.6699 | 0.6688 | 0.6699 | 0.6289 | 0.7675 |
+ | 2.4367 | 3.28 | 950 | 1.8291 | 0.6990 | 0.7240 | 0.6990 | 0.6767 | 0.7886 |
+ | 2.4367 | 3.45 | 1000 | 1.7507 | 0.7257 | 0.7486 | 0.7257 | 0.6920 | 0.8066 |
+ | 2.4367 | 3.62 | 1050 | 1.6782 | 0.7330 | 0.7475 | 0.7330 | 0.7153 | 0.8124 |
+ | 2.4367 | 3.8 | 1100 | 1.6353 | 0.7184 | 0.7445 | 0.7184 | 0.6941 | 0.8015 |
+ | 2.4367 | 3.97 | 1150 | 1.5548 | 0.7694 | 0.7840 | 0.7694 | 0.7505 | 0.8379 |
+ | 2.0129 | 4.14 | 1200 | 1.5002 | 0.7330 | 0.7432 | 0.7330 | 0.7112 | 0.8133 |
+ | 2.0129 | 4.31 | 1250 | 1.4655 | 0.7524 | 0.7706 | 0.7524 | 0.7339 | 0.8260 |
+ | 2.0129 | 4.49 | 1300 | 1.4149 | 0.7718 | 0.7931 | 0.7718 | 0.7529 | 0.8396 |
+ | 2.0129 | 4.66 | 1350 | 1.3751 | 0.7743 | 0.7927 | 0.7743 | 0.7544 | 0.8422 |
+ | 2.0129 | 4.83 | 1400 | 1.3503 | 0.7767 | 0.8061 | 0.7767 | 0.7693 | 0.8422 |
+ | 1.7091 | 5.0 | 1450 | 1.3102 | 0.7816 | 0.8084 | 0.7816 | 0.7712 | 0.8473 |
+ | 1.7091 | 5.18 | 1500 | 1.3020 | 0.7621 | 0.8003 | 0.7621 | 0.7538 | 0.8328 |
+ | 1.7091 | 5.35 | 1550 | 1.2288 | 0.7816 | 0.8013 | 0.7816 | 0.7707 | 0.8473 |
+ | 1.7091 | 5.52 | 1600 | 1.2042 | 0.7961 | 0.8235 | 0.7961 | 0.7874 | 0.8575 |
+ | 1.7091 | 5.69 | 1650 | 1.1769 | 0.8010 | 0.8371 | 0.8010 | 0.7946 | 0.8592 |
+ | 1.7091 | 5.87 | 1700 | 1.1471 | 0.8107 | 0.8524 | 0.8107 | 0.8051 | 0.8667 |
+ | 1.4785 | 6.04 | 1750 | 1.1107 | 0.8228 | 0.8478 | 0.8228 | 0.8138 | 0.8745 |
+ | 1.4785 | 6.21 | 1800 | 1.0799 | 0.8131 | 0.8378 | 0.8131 | 0.8027 | 0.8684 |
+ | 1.4785 | 6.38 | 1850 | 1.0704 | 0.8010 | 0.8229 | 0.8010 | 0.7915 | 0.8600 |
+ | 1.4785 | 6.56 | 1900 | 1.0424 | 0.8058 | 0.8281 | 0.8058 | 0.7998 | 0.8633 |
+ | 1.4785 | 6.73 | 1950 | 1.0279 | 0.7985 | 0.8194 | 0.7985 | 0.7905 | 0.8592 |
+ | 1.4785 | 6.9 | 2000 | 1.0122 | 0.8180 | 0.8408 | 0.8180 | 0.8069 | 0.8711 |
+ | 1.3046 | 7.08 | 2050 | 0.9924 | 0.8180 | 0.8366 | 0.8180 | 0.8087 | 0.8711 |
+ | 1.3046 | 7.25 | 2100 | 0.9909 | 0.8058 | 0.8320 | 0.8058 | 0.7980 | 0.8633 |
+ | 1.3046 | 7.42 | 2150 | 0.9724 | 0.8058 | 0.8355 | 0.8058 | 0.8001 | 0.8643 |
+ | 1.3046 | 7.59 | 2200 | 0.9406 | 0.8083 | 0.8345 | 0.8083 | 0.8032 | 0.8660 |
+ | 1.3046 | 7.77 | 2250 | 0.9495 | 0.8010 | 0.8290 | 0.8010 | 0.7949 | 0.8592 |
+ | 1.3046 | 7.94 | 2300 | 0.9283 | 0.8107 | 0.8343 | 0.8107 | 0.8035 | 0.8667 |
+ | 1.1807 | 8.11 | 2350 | 0.9242 | 0.8155 | 0.8392 | 0.8155 | 0.8096 | 0.8711 |
+ | 1.1807 | 8.28 | 2400 | 0.9066 | 0.8204 | 0.8427 | 0.8204 | 0.8131 | 0.8735 |
+ | 1.1807 | 8.46 | 2450 | 0.9049 | 0.8252 | 0.8451 | 0.8252 | 0.8149 | 0.8769 |
+ | 1.1807 | 8.63 | 2500 | 0.9019 | 0.8204 | 0.8484 | 0.8204 | 0.8117 | 0.8735 |
+ | 1.1807 | 8.8 | 2550 | 0.8914 | 0.8131 | 0.8356 | 0.8131 | 0.8051 | 0.8684 |
+ | 1.1807 | 8.97 | 2600 | 0.8953 | 0.8083 | 0.8403 | 0.8083 | 0.8014 | 0.8643 |
+ | 1.1094 | 9.15 | 2650 | 0.8824 | 0.8107 | 0.8450 | 0.8107 | 0.8067 | 0.8667 |
+ | 1.1094 | 9.32 | 2700 | 0.8800 | 0.8058 | 0.8317 | 0.8058 | 0.7964 | 0.8633 |
+ | 1.1094 | 9.49 | 2750 | 0.8729 | 0.8180 | 0.8450 | 0.8180 | 0.8113 | 0.8718 |
+ | 1.1094 | 9.66 | 2800 | 0.8704 | 0.8107 | 0.8383 | 0.8107 | 0.8037 | 0.8667 |
+ | 1.1094 | 9.84 | 2850 | 0.8709 | 0.8083 | 0.8377 | 0.8083 | 0.8018 | 0.8650 |
  
  
  ### Framework versions
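A side note on the metric columns in the table above: Accuracy and Recall are identical in every row, which is the signature of support-weighted per-class recall (e.g. what scikit-learn's `average="weighted"` produces). The commit does not show the actual evaluation code, so the function below is only a pure-Python sketch of that averaging convention, not the model's real `compute_metrics`:

```python
from collections import Counter

def weighted_metrics(y_true, y_pred):
    """Accuracy plus support-weighted precision/recall/F1 for multi-class labels.

    With weighted averaging, recall always equals accuracy, which matches
    the identical Accuracy and Recall columns in the training log above.
    """
    n = len(y_true)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n
    support = Counter(y_true)  # number of gold examples per class
    precision = recall = f1 = 0.0
    for cls, count in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        pred_pos = sum(1 for p in y_pred if p == cls)
        p_cls = tp / pred_pos if pred_pos else 0.0   # per-class precision
        r_cls = tp / count                           # per-class recall
        f_cls = 2 * p_cls * r_cls / (p_cls + r_cls) if (p_cls + r_cls) else 0.0
        weight = count / n                           # support weight
        precision += weight * p_cls
        recall += weight * r_cls
        f1 += weight * f_cls
    return accuracy, precision, recall, f1
```

This also explains why Precision can exceed F1 while Recall tracks Accuracy exactly throughout the table.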
runs/Jun16_16-45-55_LAPTOP-1GID9RGH/events.out.tfevents.1718531156.LAPTOP-1GID9RGH.22340.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:3fba68822dc9d3efaf31f52f30c8ad7e58bbd41656e0139279ee351bc7184177
- size 37607
+ oid sha256:b79745c7b8082108dd3289ab1991dad88a0ad7a0eb9572ff250b0104a6639165
+ size 42037
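The tfevents change above touches only a Git LFS pointer file: the repository stores three `key value` lines (`version`, `oid`, `size`) rather than the TensorBoard binary itself, so the diff swaps the hash and byte count. A minimal sketch of reading such a pointer, assuming only the three-field format shown in this diff:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer contents from the diff above.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:b79745c7b8082108dd3289ab1991dad88a0ad7a0eb9572ff250b0104a6639165\n"
    "size 42037\n"
)
```

The `oid` is the SHA-256 of the real file, which LFS uses to fetch it from the remote object store on checkout.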