Bisher committed on
Commit 72cb157 · verified · 1 Parent(s): 29fb096

Model save

README.md CHANGED
@@ -20,18 +20,18 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Bisher/wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen](https://huggingface.co/Bisher/wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.4087
- - Accuracy: 0.9167
- - Precision: 0.9180
- - Recall: 0.9167
- - F1: 0.8908
- - Tp: 375
- - Tn: 17886
- - Fn: 1632
- - Fp: 27
- - Eer: 0.1674
- - Min Tdcf: 0.0312
- - Auc Roc: 0.8827
+ - Loss: 0.6355
+ - Accuracy: 0.9146
+ - Precision: 0.9161
+ - Recall: 0.9146
+ - F1: 0.8866
+ - Tp: 330
+ - Tn: 17889
+ - Fn: 1677
+ - Fp: 24
+ - Eer: 0.1639
+ - Min Tdcf: 0.0357
+ - Auc Roc: 0.9189
 
 ## Model description
 
@@ -51,11 +51,11 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0005
- - train_batch_size: 64
- - eval_batch_size: 64
+ - train_batch_size: 128
+ - eval_batch_size: 128
 - seed: 42
 - gradient_accumulation_steps: 4
- - total_train_batch_size: 256
+ - total_train_batch_size: 512
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
@@ -64,15 +64,36 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Tp | Tn | Fn | Fp | Eer | Min Tdcf | Auc Roc |
- |:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:----:|:-----:|:----:|:---:|:------:|:--------:|:-------:|
- | 0.7199 | 0.0103 | 5 | 0.6815 | 0.9488 | 0.9472 | 0.9488 | 0.9478 | 1413 | 17488 | 594 | 425 | 0.1520 | 0.0329 | 0.8893 |
- | 0.6841 | 0.0206 | 10 | 0.6499 | 0.9304 | 0.9301 | 0.9304 | 0.9154 | 675 | 17859 | 1332 | 54 | 0.2820 | 0.0325 | 0.7611 |
- | 0.6394 | 0.0309 | 15 | 0.5997 | 0.9079 | 0.9088 | 0.9079 | 0.8726 | 189 | 17896 | 1818 | 17 | 0.1166 | 0.0323 | 0.9573 |
- | 0.5966 | 0.0413 | 20 | 0.5522 | 0.9020 | 0.9083 | 0.9020 | 0.8583 | 57 | 17911 | 1950 | 2 | 0.0837 | 0.0330 | 0.9745 |
- | 0.5565 | 0.0516 | 25 | 0.5057 | 0.9030 | 0.9067 | 0.9030 | 0.8609 | 80 | 17908 | 1927 | 5 | 0.0872 | 0.0333 | 0.9712 |
- | 0.5236 | 0.0619 | 30 | 0.4542 | 0.9124 | 0.9141 | 0.9124 | 0.8822 | 284 | 17892 | 1723 | 21 | 0.0876 | 0.0315 | 0.9648 |
- | 0.4739 | 0.0722 | 35 | 0.4087 | 0.9167 | 0.9180 | 0.9167 | 0.8908 | 375 | 17886 | 1632 | 27 | 0.1674 | 0.0312 | 0.8827 |
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Tp | Tn | Fn | Fp | Eer | Min Tdcf | Auc Roc |
+ |:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:---:|:-----:|:----:|:---:|:------:|:--------:|:-------:|
+ | 0.6678 | 0.0206 | 5 | 0.6581 | 0.9410 | 0.9381 | 0.9410 | 0.9331 | 963 | 17782 | 1044 | 131 | 0.0717 | 0.0322 | 0.9729 |
+ | 0.6124 | 0.0412 | 10 | 0.5702 | 0.9060 | 0.9082 | 0.9060 | 0.8681 | 145 | 17902 | 1862 | 11 | 0.1978 | 0.0322 | 0.8640 |
+ | 0.5335 | 0.0619 | 15 | 0.5016 | 0.9016 | 0.9093 | 0.9016 | 0.8573 | 48 | 17912 | 1959 | 1 | 0.3503 | 0.0363 | 0.6924 |
+ | 0.4592 | 0.0825 | 20 | 0.4335 | 0.9094 | 0.9108 | 0.9094 | 0.8759 | 221 | 17895 | 1786 | 18 | 0.1774 | 0.0323 | 0.8683 |
+ | 0.3927 | 0.1031 | 25 | 0.3781 | 0.9104 | 0.9110 | 0.9104 | 0.8782 | 244 | 17891 | 1763 | 22 | 0.2297 | 0.0315 | 0.8110 |
+ | 0.3231 | 0.1237 | 30 | 0.3201 | 0.9138 | 0.9151 | 0.9138 | 0.8851 | 314 | 17889 | 1693 | 24 | 0.2900 | 0.0309 | 0.7424 |
+ | 0.2519 | 0.1443 | 35 | 0.2804 | 0.9196 | 0.9215 | 0.9196 | 0.8960 | 431 | 17887 | 1576 | 26 | 0.1141 | 0.0295 | 0.9340 |
+ | 0.1963 | 0.1649 | 40 | 0.2395 | 0.9346 | 0.9350 | 0.9346 | 0.9216 | 751 | 17866 | 1256 | 47 | 0.0898 | 0.0286 | 0.9623 |
+ | 0.1423 | 0.1856 | 45 | 0.3794 | 0.9048 | 0.9032 | 0.9048 | 0.8659 | 127 | 17897 | 1880 | 16 | 0.0901 | 0.0298 | 0.9659 |
+ | 0.1046 | 0.2062 | 50 | 0.3194 | 0.9287 | 0.9286 | 0.9287 | 0.9124 | 636 | 17863 | 1371 | 50 | 0.0751 | 0.0318 | 0.9767 |
+ | 0.0681 | 0.2268 | 55 | 0.4859 | 0.9021 | 0.9055 | 0.9021 | 0.8586 | 60 | 17909 | 1947 | 4 | 0.1709 | 0.0378 | 0.9015 |
+ | 0.0473 | 0.2474 | 60 | 0.5605 | 0.9100 | 0.9101 | 0.9100 | 0.8774 | 237 | 17890 | 1770 | 23 | 0.7055 | 0.0382 | 0.3149 |
+ | 0.0323 | 0.2680 | 65 | 0.5107 | 0.9164 | 0.9178 | 0.9164 | 0.8900 | 367 | 17887 | 1640 | 26 | 0.0703 | 0.0337 | 0.9791 |
+ | 0.0339 | 0.2887 | 70 | 0.8921 | 0.9026 | 0.9009 | 0.9026 | 0.8604 | 77 | 17903 | 1930 | 10 | 0.8316 | 0.0435 | 0.1773 |
+ | 0.0423 | 0.3093 | 75 | 0.8964 | 0.9030 | 0.8998 | 0.9030 | 0.8615 | 87 | 17900 | 1920 | 13 | 0.0732 | 0.0327 | 0.9753 |
+ | 0.0456 | 0.3299 | 80 | 1.0843 | 0.9013 | 0.8935 | 0.9013 | 0.8574 | 51 | 17902 | 1956 | 11 | 0.8520 | 0.0478 | 0.1126 |
+ | 0.0712 | 0.3505 | 85 | 0.8587 | 0.9023 | 0.8998 | 0.9023 | 0.8597 | 71 | 17903 | 1936 | 10 | 0.8665 | 0.0480 | 0.0990 |
+ | 0.0629 | 0.3711 | 90 | 0.4810 | 0.9267 | 0.9278 | 0.9267 | 0.9087 | 583 | 17877 | 1424 | 36 | 0.0848 | 0.0328 | 0.9683 |
+ | 0.0477 | 0.3918 | 95 | 0.9415 | 0.9094 | 0.9114 | 0.9094 | 0.8757 | 218 | 17897 | 1789 | 16 | 0.1219 | 0.0408 | 0.8890 |
+ | 0.0484 | 0.4124 | 100 | 0.7774 | 0.9150 | 0.9170 | 0.9150 | 0.8873 | 336 | 17891 | 1671 | 22 | 0.6906 | 0.0383 | 0.3129 |
+ | 0.0449 | 0.4330 | 105 | 0.3949 | 0.9197 | 0.9199 | 0.9197 | 0.8967 | 444 | 17876 | 1563 | 37 | 0.6527 | 0.0363 | 0.3629 |
+ | 0.0567 | 0.4536 | 110 | 0.5853 | 0.9232 | 0.9212 | 0.9232 | 0.9040 | 540 | 17850 | 1467 | 63 | 0.2192 | 0.0355 | 0.8158 |
+ | 0.0416 | 0.4742 | 115 | 0.7031 | 0.9036 | 0.9054 | 0.9036 | 0.8626 | 95 | 17905 | 1912 | 8 | 0.7633 | 0.0408 | 0.2549 |
+ | 0.1778 | 0.4948 | 120 | 0.5440 | 0.9033 | 0.9093 | 0.9033 | 0.8613 | 83 | 17910 | 1924 | 3 | 0.7389 | 0.0398 | 0.2838 |
+ | 0.036 | 0.5155 | 125 | 0.5825 | 0.9161 | 0.9187 | 0.9161 | 0.8892 | 356 | 17893 | 1651 | 20 | 0.1538 | 0.0366 | 0.9078 |
+ | 0.0797 | 0.5361 | 130 | 0.5864 | 0.9027 | 0.9059 | 0.9027 | 0.8601 | 73 | 17908 | 1934 | 5 | 0.2236 | 0.0407 | 0.7721 |
+ | 0.0669 | 0.5567 | 135 | 0.4264 | 0.9036 | 0.9079 | 0.9036 | 0.8623 | 92 | 17908 | 1915 | 5 | 0.1597 | 0.0370 | 0.8791 |
+ | 0.0353 | 0.5773 | 140 | 0.6355 | 0.9146 | 0.9161 | 0.9146 | 0.8866 | 330 | 17889 | 1677 | 24 | 0.1639 | 0.0357 | 0.9189 |
 
 
 ### Framework versions
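
The headline numbers in the updated card can be re-derived from the counts it reports. The snippet below is a minimal sanity-check sketch, not part of the commit: it assumes single-device training and that Precision/Recall/F1 are support-weighted averages (the reported Recall equals Accuracy, which is what weighted averaging yields); all values are taken verbatim from the README diff above.

```python
# Sanity check of the updated model card (assumes single-device training
# and support-weighted metric averaging; numbers copied from the diff).

tp, tn, fn, fp = 330, 17889, 1677, 24      # confusion counts at step 140
total = tp + tn + fn + fp                  # 19920 evaluation examples

accuracy = (tp + tn) / total               # 0.9146, matching the reported Accuracy
# Support-weighted recall reduces to accuracy, matching the reported Recall: 0.9146.

# Effective batch size implied by the hyperparameters:
# per-device train batch 128 * gradient_accumulation_steps 4 (single device assumed).
effective_batch = 128 * 4                  # 512, matching total_train_batch_size

print(f"accuracy={accuracy:.4f}  effective_batch={effective_batch}")
```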
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:1abb318390f5ce2befbbfa09fbd4ac944b7cf260dabe9d6172818a4d33746f67
+ oid sha256:716261a859a0abee9241eb3ab719267384e210bb4855078dea742a1cb1c7d0b7
 size 378449016
runs/Aug24_17-38-04_2d795081614d/events.out.tfevents.1724521087.2d795081614d.23.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:853d44b6601998d505c45a876efce524a16d73c6e1ee065a8ff48252b6e37d07
- size 34998
+ oid sha256:64798dc3d487ffc82a6597964379823d69b8a2fb568ea878d96bff7ecdc354dc
+ size 35352