End of training
README.md CHANGED

@@ -20,12 +20,12 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8709
- Accuracy: 0.8083
- Precision: 0.8377
- Recall: 0.8083
- F1: 0.8018
- Binary: 0.8650
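In every reported row of this card, accuracy equals recall, which is consistent with weighted averaging over classes. As a hedged sketch (not the authors' actual code) of a `compute_metrics` function that would report these four metrics to a `Trainer`; the `Binary` column appears to be a custom metric and is not reproduced here:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Report accuracy/precision/recall/F1 from (logits, labels).

    Weighted averaging is an assumption inferred from the card's numbers,
    not something the card states.
    """
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```

Note that with weighted averaging, recall is mathematically identical to plain accuracy, which matches the identical Accuracy and Recall columns in the table below.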

## Model description
@@ -44,7 +44,7 @@ More information needed

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
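Assuming the standard `transformers` `Trainer` workflow that this auto-generated card format comes from, the listed values would map onto `TrainingArguments` roughly as below. Settings not shown in this excerpt (epochs, optimizer, LR scheduler, warmup) are left to library defaults, and `output_dir` is a placeholder:

```python
# Mapping of the card's hyperparameters onto TrainingArguments field names
# (a sketch, not the authors' training script).
hyperparameters = {
    "learning_rate": 3e-05,
    "per_device_train_batch_size": 32,  # train_batch_size: 32
    "per_device_eval_batch_size": 32,   # eval_batch_size: 32
    "seed": 42,
}

try:
    from transformers import TrainingArguments
    # output_dir is a hypothetical placeholder, not from the card.
    args = TrainingArguments(output_dir="wav2vec2-base-finetuned", **hyperparameters)
except Exception:
    # transformers (or its backends) not installed; the mapping above still stands.
    args = None
```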
@@ -59,63 +59,63 @@

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| No log | 0.17 | 50 | 4.2449 | 0.0510 | 0.0085 | 0.0510 | 0.0115 | 0.3218 |
| No log | 0.35 | 100 | 3.9731 | 0.0461 | 0.0045 | 0.0461 | 0.0074 | 0.3265 |
| No log | 0.52 | 150 | 3.7773 | 0.0437 | 0.0097 | 0.0437 | 0.0124 | 0.3301 |
| No log | 0.69 | 200 | 3.6063 | 0.1019 | 0.0598 | 0.1019 | 0.0574 | 0.3701 |
| No log | 0.86 | 250 | 3.4003 | 0.1990 | 0.1624 | 0.1990 | 0.1291 | 0.4379 |
| 3.8908 | 1.04 | 300 | 3.2347 | 0.25 | 0.2359 | 0.25 | 0.1908 | 0.4728 |
| 3.8908 | 1.21 | 350 | 3.0941 | 0.3131 | 0.2684 | 0.3131 | 0.2507 | 0.5177 |
| 3.8908 | 1.38 | 400 | 2.9380 | 0.3835 | 0.3642 | 0.3835 | 0.3147 | 0.5670 |
| 3.8908 | 1.55 | 450 | 2.8310 | 0.3956 | 0.3552 | 0.3956 | 0.3255 | 0.5750 |
| 3.8908 | 1.73 | 500 | 2.6928 | 0.4709 | 0.4197 | 0.4709 | 0.4024 | 0.6282 |
| 3.8908 | 1.9 | 550 | 2.6116 | 0.5194 | 0.4641 | 0.5194 | 0.4569 | 0.6629 |
| 3.027 | 2.07 | 600 | 2.4653 | 0.5704 | 0.5395 | 0.5704 | 0.5124 | 0.6988 |
| 3.027 | 2.24 | 650 | 2.3361 | 0.6044 | 0.5338 | 0.6044 | 0.5442 | 0.7223 |
| 3.027 | 2.42 | 700 | 2.2435 | 0.5947 | 0.5439 | 0.5947 | 0.5324 | 0.7141 |
| 3.027 | 2.59 | 750 | 2.1533 | 0.6141 | 0.6006 | 0.6141 | 0.5678 | 0.7291 |
| 3.027 | 2.76 | 800 | 2.0622 | 0.6189 | 0.6313 | 0.6189 | 0.5815 | 0.7325 |
| 3.027 | 2.93 | 850 | 1.9734 | 0.6578 | 0.6757 | 0.6578 | 0.6267 | 0.7597 |
| 2.4367 | 3.11 | 900 | 1.9059 | 0.6699 | 0.6688 | 0.6699 | 0.6289 | 0.7675 |
| 2.4367 | 3.28 | 950 | 1.8291 | 0.6990 | 0.7240 | 0.6990 | 0.6767 | 0.7886 |
| 2.4367 | 3.45 | 1000 | 1.7507 | 0.7257 | 0.7486 | 0.7257 | 0.6920 | 0.8066 |
| 2.4367 | 3.62 | 1050 | 1.6782 | 0.7330 | 0.7475 | 0.7330 | 0.7153 | 0.8124 |
| 2.4367 | 3.8 | 1100 | 1.6353 | 0.7184 | 0.7445 | 0.7184 | 0.6941 | 0.8015 |
| 2.4367 | 3.97 | 1150 | 1.5548 | 0.7694 | 0.7840 | 0.7694 | 0.7505 | 0.8379 |
| 2.0129 | 4.14 | 1200 | 1.5002 | 0.7330 | 0.7432 | 0.7330 | 0.7112 | 0.8133 |
| 2.0129 | 4.31 | 1250 | 1.4655 | 0.7524 | 0.7706 | 0.7524 | 0.7339 | 0.8260 |
| 2.0129 | 4.49 | 1300 | 1.4149 | 0.7718 | 0.7931 | 0.7718 | 0.7529 | 0.8396 |
| 2.0129 | 4.66 | 1350 | 1.3751 | 0.7743 | 0.7927 | 0.7743 | 0.7544 | 0.8422 |
| 2.0129 | 4.83 | 1400 | 1.3503 | 0.7767 | 0.8061 | 0.7767 | 0.7693 | 0.8422 |
| 1.7091 | 5.0 | 1450 | 1.3102 | 0.7816 | 0.8084 | 0.7816 | 0.7712 | 0.8473 |
| 1.7091 | 5.18 | 1500 | 1.3020 | 0.7621 | 0.8003 | 0.7621 | 0.7538 | 0.8328 |
| 1.7091 | 5.35 | 1550 | 1.2288 | 0.7816 | 0.8013 | 0.7816 | 0.7707 | 0.8473 |
| 1.7091 | 5.52 | 1600 | 1.2042 | 0.7961 | 0.8235 | 0.7961 | 0.7874 | 0.8575 |
| 1.7091 | 5.69 | 1650 | 1.1769 | 0.8010 | 0.8371 | 0.8010 | 0.7946 | 0.8592 |
| 1.7091 | 5.87 | 1700 | 1.1471 | 0.8107 | 0.8524 | 0.8107 | 0.8051 | 0.8667 |
| 1.4785 | 6.04 | 1750 | 1.1107 | 0.8228 | 0.8478 | 0.8228 | 0.8138 | 0.8745 |
| 1.4785 | 6.21 | 1800 | 1.0799 | 0.8131 | 0.8378 | 0.8131 | 0.8027 | 0.8684 |
| 1.4785 | 6.38 | 1850 | 1.0704 | 0.8010 | 0.8229 | 0.8010 | 0.7915 | 0.8600 |
| 1.4785 | 6.56 | 1900 | 1.0424 | 0.8058 | 0.8281 | 0.8058 | 0.7998 | 0.8633 |
| 1.4785 | 6.73 | 1950 | 1.0279 | 0.7985 | 0.8194 | 0.7985 | 0.7905 | 0.8592 |
| 1.4785 | 6.9 | 2000 | 1.0122 | 0.8180 | 0.8408 | 0.8180 | 0.8069 | 0.8711 |
| 1.3046 | 7.08 | 2050 | 0.9924 | 0.8180 | 0.8366 | 0.8180 | 0.8087 | 0.8711 |
| 1.3046 | 7.25 | 2100 | 0.9909 | 0.8058 | 0.8320 | 0.8058 | 0.7980 | 0.8633 |
| 1.3046 | 7.42 | 2150 | 0.9724 | 0.8058 | 0.8355 | 0.8058 | 0.8001 | 0.8643 |
| 1.3046 | 7.59 | 2200 | 0.9406 | 0.8083 | 0.8345 | 0.8083 | 0.8032 | 0.8660 |
| 1.3046 | 7.77 | 2250 | 0.9495 | 0.8010 | 0.8290 | 0.8010 | 0.7949 | 0.8592 |
| 1.3046 | 7.94 | 2300 | 0.9283 | 0.8107 | 0.8343 | 0.8107 | 0.8035 | 0.8667 |
| 1.1807 | 8.11 | 2350 | 0.9242 | 0.8155 | 0.8392 | 0.8155 | 0.8096 | 0.8711 |
| 1.1807 | 8.28 | 2400 | 0.9066 | 0.8204 | 0.8427 | 0.8204 | 0.8131 | 0.8735 |
| 1.1807 | 8.46 | 2450 | 0.9049 | 0.8252 | 0.8451 | 0.8252 | 0.8149 | 0.8769 |
| 1.1807 | 8.63 | 2500 | 0.9019 | 0.8204 | 0.8484 | 0.8204 | 0.8117 | 0.8735 |
| 1.1807 | 8.8 | 2550 | 0.8914 | 0.8131 | 0.8356 | 0.8131 | 0.8051 | 0.8684 |
| 1.1807 | 8.97 | 2600 | 0.8953 | 0.8083 | 0.8403 | 0.8083 | 0.8014 | 0.8643 |
| 1.1094 | 9.15 | 2650 | 0.8824 | 0.8107 | 0.8450 | 0.8107 | 0.8067 | 0.8667 |
| 1.1094 | 9.32 | 2700 | 0.8800 | 0.8058 | 0.8317 | 0.8058 | 0.7964 | 0.8633 |
| 1.1094 | 9.49 | 2750 | 0.8729 | 0.8180 | 0.8450 | 0.8180 | 0.8113 | 0.8718 |
| 1.1094 | 9.66 | 2800 | 0.8704 | 0.8107 | 0.8383 | 0.8107 | 0.8037 | 0.8667 |
| 1.1094 | 9.84 | 2850 | 0.8709 | 0.8083 | 0.8377 | 0.8083 | 0.8018 | 0.8650 |
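As a rough cross-check on the log above (an inference from the table, not a figure stated in the card): epoch 1.04 logged at step 300 implies about 288 optimizer steps per epoch, so with train_batch_size 32 the training set holds on the order of 9,000 examples.

```python
# Back-of-the-envelope inference from the training-results table.
steps, epoch = 300, 1.04          # first row where the epoch counter passes 1.0
steps_per_epoch = steps / epoch   # ~288.5 steps per epoch
batch_size = 32                   # train_batch_size from the hyperparameters
approx_train_examples = steps_per_epoch * batch_size
print(round(steps_per_epoch), round(approx_train_examples))
```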

### Framework versions

runs/Jun16_16-45-55_LAPTOP-1GID9RGH/events.out.tfevents.1718531156.LAPTOP-1GID9RGH.22340.0 CHANGED

@@ -1,3 +1,3 @@

version https://git-lfs.github.com/spec/v1
oid sha256:b79745c7b8082108dd3289ab1991dad88a0ad7a0eb9572ff250b0104a6639165
size 42037
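The event file above is stored as a Git LFS pointer, which records only a sha256 oid and a byte size rather than the content itself. A minimal sketch of validating a downloaded file against such a pointer:

```python
import hashlib

def matches_lfs_pointer(data: bytes, oid_hex: str, size: int) -> bool:
    """True if `data` has the byte size and sha256 oid the LFS pointer records."""
    return len(data) == size and hashlib.sha256(data).hexdigest() == oid_hex

# Usage with a stand-in payload (the real tfevents bytes would come from the repo):
blob = b"example payload"
assert matches_lfs_pointer(blob, hashlib.sha256(blob).hexdigest(), len(blob))
```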