| modelId | author | last_modified | downloads | likes | library_name | tags | pipeline_tag | createdAt | card |
|---|---|---|---|---|---|---|---|---|---|
| mlx-community/Llama-3.1-Tulu-3-8B-3bit | mlx-community | 2024-11-25T07:06:03Z | 79 | 0 | transformers | ["transformers", "safetensors", "llama", "text-generation", "mlx", "conversational", "en", "dataset:allenai/RLVR-GSM-MATH-IF-Mixed-Constraints", "base_model:allenai/Llama-3.1-Tulu-3-8B", "base_model:quantized:allenai/Llama-3.1-Tulu-3-8B", "license:llama3.1", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "3-bit", "region:us"] | text-generation | 2024-11-25T06:56:17Z |

---
license: llama3.1
language:
- en
pipeline_tag: text-generation
datasets:
- allenai/RLVR-GSM-MATH-IF-Mixed-Constraints
base_model: allenai/Llama-3.1-Tulu-3-8B
library_name: transformers
tags:
- mlx
---
# mlx-community/Llama-3.1-Tulu-3-8B-3bit
The model [mlx-community/Llama-3.1-Tulu-3-8B-3bit](https://huggingface.co/mlx-community/Llama-3.1-Tulu-3-8B-3bit) was converted to MLX format from [allenai/Llama-3.1-Tulu-3-8B](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B) using mlx-lm version **0.20.0**.
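For reference, a quantization like this one can be reproduced with mlx-lm's `convert` entry point. This is a minimal sketch, assuming the `convert()` signature (`mlx_path`, `quantize`, `q_bits`) as exported by mlx-lm around version 0.20; the output path is illustrative, not taken from this card:
```python
from mlx_lm import convert

# Quantize the original checkpoint to 3 bits and write an MLX-format copy.
# mlx_path is illustrative; quantize/q_bits follow the mlx-lm 0.20-era
# convert() signature (an assumption, not recorded in this card).
convert(
    "allenai/Llama-3.1-Tulu-3-8B",
    mlx_path="Llama-3.1-Tulu-3-8B-3bit",
    quantize=True,
    q_bits=3,
)
```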
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Load the 3-bit quantized model and its tokenizer from the Hub
model, tokenizer = load("mlx-community/Llama-3.1-Tulu-3-8B-3bit")

prompt = "hello"

# Wrap the prompt with the model's chat template, if one is defined
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
| MayBashendy/Arabic_FineTuningAraBERT_AugV5_k30_task2_organization_fold1 | MayBashendy | 2024-11-25T07:05:19Z | 161 | 0 | transformers | ["transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-25T06:52:32Z |

---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k30_task2_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k30_task2_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6013
- Qwk: 0.3824
- Mse: 0.6013
- Rmse: 0.7754
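For context, `Qwk` is Cohen's quadratic weighted kappa, an agreement metric for ordinal labels (note that `Loss` and `Mse` coincide above, consistent with an MSE objective). Here is a minimal sketch of how such metrics can be computed with scikit-learn, on hypothetical labels, since the evaluation data is not part of this card:
```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical ordinal gold labels and (rounded) model predictions.
y_true = np.array([0, 1, 2, 1, 0, 2, 1])
y_pred = np.array([0, 1, 1, 1, 0, 2, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```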
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
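These settings correspond to a standard `transformers` `Trainer` run; the following is a minimal sketch of the equivalent configuration, assuming the usual `TrainingArguments` names. The `num_labels=1` regression head is an assumption inferred from the MSE-based metrics, and the output directory is illustrative:
```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=1 gives a regression head (an assumption based on the
# MSE/RMSE metrics reported in this card).
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

# Mirrors the hyperparameters listed above; the Adam betas/epsilon are
# the transformers defaults, which match the values in this card.
args = TrainingArguments(
    output_dir="Arabic_FineTuningAraBERT_AugV5_k30_task2_organization_fold1",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```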
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0085 | 2 | 4.9846 | 0.0 | 4.9846 | 2.2326 |
| No log | 0.0169 | 4 | 2.7783 | -0.0435 | 2.7783 | 1.6668 |
| No log | 0.0254 | 6 | 2.4845 | -0.0930 | 2.4845 | 1.5762 |
| No log | 0.0339 | 8 | 2.3609 | -0.0930 | 2.3609 | 1.5365 |
| No log | 0.0424 | 10 | 1.1486 | 0.0250 | 1.1486 | 1.0717 |
| No log | 0.0508 | 12 | 0.8418 | 0.0308 | 0.8418 | 0.9175 |
| No log | 0.0593 | 14 | 0.8548 | 0.0 | 0.8548 | 0.9245 |
| No log | 0.0678 | 16 | 1.0644 | 0.0548 | 1.0644 | 1.0317 |
| No log | 0.0763 | 18 | 1.2562 | 0.0 | 1.2562 | 1.1208 |
| No log | 0.0847 | 20 | 1.6354 | 0.1064 | 1.6354 | 1.2788 |
| No log | 0.0932 | 22 | 1.7384 | 0.1064 | 1.7384 | 1.3185 |
| No log | 0.1017 | 24 | 1.7531 | 0.1386 | 1.7531 | 1.3240 |
| No log | 0.1102 | 26 | 1.6695 | 0.1064 | 1.6695 | 1.2921 |
| No log | 0.1186 | 28 | 1.6332 | 0.0250 | 1.6332 | 1.2780 |
| No log | 0.1271 | 30 | 1.7753 | -0.0174 | 1.7753 | 1.3324 |
| No log | 0.1356 | 32 | 2.0626 | 0.0400 | 2.0626 | 1.4362 |
| No log | 0.1441 | 34 | 1.9541 | 0.0140 | 1.9541 | 1.3979 |
| No log | 0.1525 | 36 | 1.5675 | 0.0250 | 1.5675 | 1.2520 |
| No log | 0.1610 | 38 | 1.1454 | 0.0 | 1.1454 | 1.0702 |
| No log | 0.1695 | 40 | 0.9870 | 0.0 | 0.9870 | 0.9935 |
| No log | 0.1780 | 42 | 0.9782 | 0.0308 | 0.9782 | 0.9890 |
| No log | 0.1864 | 44 | 1.2010 | 0.0 | 1.2010 | 1.0959 |
| No log | 0.1949 | 46 | 1.2901 | 0.0548 | 1.2901 | 1.1358 |
| No log | 0.2034 | 48 | 1.2114 | 0.0 | 1.2114 | 1.1006 |
| No log | 0.2119 | 50 | 0.9399 | 0.0308 | 0.9399 | 0.9695 |
| No log | 0.2203 | 52 | 0.8501 | 0.1563 | 0.8501 | 0.9220 |
| No log | 0.2288 | 54 | 0.8556 | 0.1563 | 0.8556 | 0.9250 |
| No log | 0.2373 | 56 | 1.0025 | 0.1563 | 1.0025 | 1.0013 |
| No log | 0.2458 | 58 | 1.0097 | 0.1563 | 1.0097 | 1.0048 |
| No log | 0.2542 | 60 | 0.8621 | 0.1563 | 0.8621 | 0.9285 |
| No log | 0.2627 | 62 | 0.7828 | 0.1563 | 0.7828 | 0.8848 |
| No log | 0.2712 | 64 | 0.8131 | 0.0308 | 0.8131 | 0.9017 |
| No log | 0.2797 | 66 | 0.9763 | 0.0 | 0.9763 | 0.9881 |
| No log | 0.2881 | 68 | 1.3842 | 0.0548 | 1.3842 | 1.1765 |
| No log | 0.2966 | 70 | 1.6848 | 0.1064 | 1.6848 | 1.2980 |
| No log | 0.3051 | 72 | 1.6081 | 0.1386 | 1.6081 | 1.2681 |
| No log | 0.3136 | 74 | 1.5103 | 0.0250 | 1.5103 | 1.2290 |
| No log | 0.3220 | 76 | 1.2469 | 0.0 | 1.2469 | 1.1166 |
| No log | 0.3305 | 78 | 1.0046 | 0.0 | 1.0046 | 1.0023 |
| No log | 0.3390 | 80 | 0.8476 | 0.0 | 0.8476 | 0.9206 |
| No log | 0.3475 | 82 | 0.8810 | 0.0833 | 0.8810 | 0.9386 |
| No log | 0.3559 | 84 | 1.0385 | 0.0548 | 1.0385 | 1.0191 |
| No log | 0.3644 | 86 | 1.0100 | 0.0548 | 1.0100 | 1.0050 |
| No log | 0.3729 | 88 | 1.1480 | 0.0548 | 1.1480 | 1.0715 |
| No log | 0.3814 | 90 | 1.3726 | 0.0690 | 1.3726 | 1.1716 |
| No log | 0.3898 | 92 | 1.7874 | -0.0147 | 1.7874 | 1.3369 |
| No log | 0.3983 | 94 | 1.9458 | 0.0 | 1.9458 | 1.3949 |
| No log | 0.4068 | 96 | 1.6799 | 0.0348 | 1.6799 | 1.2961 |
| No log | 0.4153 | 98 | 1.2717 | 0.0548 | 1.2717 | 1.1277 |
| No log | 0.4237 | 100 | 0.9616 | 0.0 | 0.9616 | 0.9806 |
| No log | 0.4322 | 102 | 0.7217 | 0.1905 | 0.7217 | 0.8495 |
| No log | 0.4407 | 104 | 0.6286 | 0.2759 | 0.6286 | 0.7929 |
| No log | 0.4492 | 106 | 0.6240 | 0.1724 | 0.6240 | 0.7899 |
| No log | 0.4576 | 108 | 0.7408 | 0.1563 | 0.7408 | 0.8607 |
| No log | 0.4661 | 110 | 0.9964 | 0.1667 | 0.9964 | 0.9982 |
| No log | 0.4746 | 112 | 1.1956 | 0.0690 | 1.1956 | 1.0934 |
| No log | 0.4831 | 114 | 1.1623 | 0.0548 | 1.1623 | 1.0781 |
| No log | 0.4915 | 116 | 0.9291 | 0.1667 | 0.9291 | 0.9639 |
| No log | 0.5 | 118 | 0.7805 | 0.2286 | 0.7805 | 0.8834 |
| No log | 0.5085 | 120 | 0.6523 | 0.3226 | 0.6523 | 0.8077 |
| No log | 0.5169 | 122 | 0.6236 | 0.3000 | 0.6236 | 0.7897 |
| No log | 0.5254 | 124 | 0.5932 | 0.3390 | 0.5932 | 0.7702 |
| No log | 0.5339 | 126 | 0.6148 | 0.1356 | 0.6148 | 0.7841 |
| No log | 0.5424 | 128 | 0.6271 | 0.1356 | 0.6271 | 0.7919 |
| No log | 0.5508 | 130 | 0.6021 | 0.1356 | 0.6021 | 0.7759 |
| No log | 0.5593 | 132 | 0.6323 | 0.2000 | 0.6323 | 0.7951 |
| No log | 0.5678 | 134 | 0.7753 | 0.1429 | 0.7753 | 0.8805 |
| No log | 0.5763 | 136 | 0.8951 | 0.1972 | 0.8951 | 0.9461 |
| No log | 0.5847 | 138 | 0.7571 | 0.0323 | 0.7571 | 0.8701 |
| No log | 0.5932 | 140 | 0.7787 | 0.0870 | 0.7787 | 0.8824 |
| No log | 0.6017 | 142 | 0.7690 | 0.0870 | 0.7690 | 0.8769 |
| No log | 0.6102 | 144 | 0.7274 | 0.0656 | 0.7274 | 0.8529 |
| No log | 0.6186 | 146 | 0.6709 | 0.0 | 0.6709 | 0.8191 |
| No log | 0.6271 | 148 | 0.5784 | 0.0426 | 0.5784 | 0.7605 |
| No log | 0.6356 | 150 | 0.5980 | -0.0408 | 0.5980 | 0.7733 |
| No log | 0.6441 | 152 | 0.8672 | 0.1818 | 0.8672 | 0.9313 |
| No log | 0.6525 | 154 | 1.0752 | 0.0741 | 1.0752 | 1.0369 |
| No log | 0.6610 | 156 | 1.0092 | 0.1600 | 1.0092 | 1.0046 |
| No log | 0.6695 | 158 | 0.6873 | -0.0755 | 0.6873 | 0.8291 |
| No log | 0.6780 | 160 | 0.6090 | 0.0870 | 0.6090 | 0.7804 |
| No log | 0.6864 | 162 | 0.6660 | 0.1176 | 0.6660 | 0.8161 |
| No log | 0.6949 | 164 | 0.8205 | 0.0 | 0.8205 | 0.9058 |
| No log | 0.7034 | 166 | 0.9718 | 0.0 | 0.9718 | 0.9858 |
| No log | 0.7119 | 168 | 1.0832 | -0.0563 | 1.0832 | 1.0408 |
| No log | 0.7203 | 170 | 0.9288 | 0.0 | 0.9288 | 0.9637 |
| No log | 0.7288 | 172 | 0.9606 | -0.1887 | 0.9606 | 0.9801 |
| No log | 0.7373 | 174 | 0.9067 | -0.1538 | 0.9067 | 0.9522 |
| No log | 0.7458 | 176 | 0.7862 | -0.1176 | 0.7862 | 0.8867 |
| No log | 0.7542 | 178 | 0.6902 | 0.0 | 0.6902 | 0.8308 |
| No log | 0.7627 | 180 | 0.6745 | 0.0 | 0.6745 | 0.8213 |
| No log | 0.7712 | 182 | 0.6862 | -0.1633 | 0.6862 | 0.8284 |
| No log | 0.7797 | 184 | 0.8204 | -0.0714 | 0.8204 | 0.9058 |
| No log | 0.7881 | 186 | 1.1496 | -0.0274 | 1.1496 | 1.0722 |
| No log | 0.7966 | 188 | 1.0213 | 0.0 | 1.0213 | 1.0106 |
| No log | 0.8051 | 190 | 0.7732 | -0.0800 | 0.7732 | 0.8793 |
| No log | 0.8136 | 192 | 0.6955 | -0.1633 | 0.6955 | 0.8339 |
| No log | 0.8220 | 194 | 0.6953 | -0.0800 | 0.6953 | 0.8339 |
| No log | 0.8305 | 196 | 0.7862 | 0.0 | 0.7862 | 0.8867 |
| No log | 0.8390 | 198 | 0.8025 | 0.0 | 0.8025 | 0.8958 |
| No log | 0.8475 | 200 | 0.7130 | -0.1176 | 0.7130 | 0.8444 |
| No log | 0.8559 | 202 | 0.6461 | 0.0377 | 0.6461 | 0.8038 |
| No log | 0.8644 | 204 | 0.6529 | 0.2105 | 0.6529 | 0.8080 |
| No log | 0.8729 | 206 | 0.6537 | 0.2105 | 0.6537 | 0.8085 |
| No log | 0.8814 | 208 | 0.6195 | 0.1818 | 0.6195 | 0.7871 |
| No log | 0.8898 | 210 | 0.6311 | 0.0377 | 0.6311 | 0.7944 |
| No log | 0.8983 | 212 | 0.7078 | 0.0 | 0.7078 | 0.8413 |
| No log | 0.9068 | 214 | 0.7843 | -0.0755 | 0.7843 | 0.8856 |
| No log | 0.9153 | 216 | 0.7300 | -0.1176 | 0.7300 | 0.8544 |
| No log | 0.9237 | 218 | 0.7828 | 0.1000 | 0.7828 | 0.8848 |
| No log | 0.9322 | 220 | 0.8510 | 0.0625 | 0.8510 | 0.9225 |
| No log | 0.9407 | 222 | 0.8896 | 0.2388 | 0.8896 | 0.9432 |
| No log | 0.9492 | 224 | 0.9913 | 0.0833 | 0.9913 | 0.9956 |
| No log | 0.9576 | 226 | 1.0287 | 0.0571 | 1.0287 | 1.0142 |
| No log | 0.9661 | 228 | 1.1535 | -0.1481 | 1.1535 | 1.0740 |
| No log | 0.9746 | 230 | 1.1085 | -0.0274 | 1.1085 | 1.0528 |
| No log | 0.9831 | 232 | 1.0922 | -0.0500 | 1.0922 | 1.0451 |
| No log | 0.9915 | 234 | 1.0094 | 0.0833 | 1.0094 | 1.0047 |
| No log | 1.0 | 236 | 0.7129 | 0.1176 | 0.7129 | 0.8444 |
| No log | 1.0085 | 238 | 0.5695 | 0.0870 | 0.5695 | 0.7547 |
| No log | 1.0169 | 240 | 0.5150 | 0.125 | 0.5150 | 0.7176 |
| No log | 1.0254 | 242 | 0.5393 | 0.0426 | 0.5393 | 0.7344 |
| No log | 1.0339 | 244 | 0.6926 | -0.0755 | 0.6926 | 0.8322 |
| No log | 1.0424 | 246 | 1.1901 | -0.0274 | 1.1901 | 1.0909 |
| No log | 1.0508 | 248 | 1.4232 | 0.0625 | 1.4232 | 1.1930 |
| No log | 1.0593 | 250 | 1.2571 | 0.0488 | 1.2571 | 1.1212 |
| No log | 1.0678 | 252 | 0.9456 | -0.2187 | 0.9456 | 0.9724 |
| No log | 1.0763 | 254 | 0.7507 | 0.1290 | 0.7507 | 0.8664 |
| No log | 1.0847 | 256 | 0.7673 | 0.1639 | 0.7673 | 0.8760 |
| No log | 1.0932 | 258 | 0.8362 | 0.1639 | 0.8362 | 0.9144 |
| No log | 1.1017 | 260 | 0.8464 | 0.1639 | 0.8464 | 0.9200 |
| No log | 1.1102 | 262 | 0.8394 | 0.1639 | 0.8394 | 0.9162 |
| No log | 1.1186 | 264 | 0.8383 | 0.1724 | 0.8383 | 0.9156 |
| No log | 1.1271 | 266 | 0.8344 | 0.1724 | 0.8344 | 0.9135 |
| No log | 1.1356 | 268 | 0.8512 | 0.2258 | 0.8512 | 0.9226 |
| No log | 1.1441 | 270 | 0.8368 | 0.2258 | 0.8368 | 0.9148 |
| No log | 1.1525 | 272 | 0.8175 | 0.2258 | 0.8175 | 0.9042 |
| No log | 1.1610 | 274 | 0.8128 | 0.0625 | 0.8128 | 0.9015 |
| No log | 1.1695 | 276 | 0.8289 | 0.0339 | 0.8289 | 0.9104 |
| No log | 1.1780 | 278 | 0.7954 | 0.0625 | 0.7954 | 0.8919 |
| No log | 1.1864 | 280 | 0.6798 | 0.2623 | 0.6798 | 0.8245 |
| No log | 1.1949 | 282 | 0.6009 | 0.2000 | 0.6009 | 0.7752 |
| No log | 1.2034 | 284 | 0.5345 | 0.3226 | 0.5345 | 0.7311 |
| No log | 1.2119 | 286 | 0.4812 | 0.3226 | 0.4812 | 0.6937 |
| No log | 1.2203 | 288 | 0.5024 | 0.4407 | 0.5024 | 0.7088 |
| No log | 1.2288 | 290 | 0.6220 | 0.2258 | 0.6220 | 0.7887 |
| No log | 1.2373 | 292 | 0.7984 | 0.1818 | 0.7984 | 0.8935 |
| No log | 1.2458 | 294 | 0.8350 | 0.0294 | 0.8350 | 0.9138 |
| No log | 1.2542 | 296 | 0.6925 | 0.1724 | 0.6925 | 0.8322 |
| No log | 1.2627 | 298 | 0.5311 | 0.2623 | 0.5311 | 0.7288 |
| No log | 1.2712 | 300 | 0.4770 | 0.3793 | 0.4770 | 0.6906 |
| No log | 1.2797 | 302 | 0.4306 | 0.3774 | 0.4306 | 0.6562 |
| No log | 1.2881 | 304 | 0.4102 | 0.3333 | 0.4102 | 0.6405 |
| No log | 1.2966 | 306 | 0.4032 | 0.6038 | 0.4032 | 0.6350 |
| No log | 1.3051 | 308 | 0.4236 | 0.4643 | 0.4236 | 0.6508 |
| No log | 1.3136 | 310 | 0.5006 | 0.3793 | 0.5006 | 0.7075 |
| No log | 1.3220 | 312 | 0.5824 | 0.2623 | 0.5824 | 0.7631 |
| No log | 1.3305 | 314 | 0.6049 | 0.2258 | 0.6049 | 0.7778 |
| No log | 1.3390 | 316 | 0.5743 | 0.2623 | 0.5743 | 0.7578 |
| No log | 1.3475 | 318 | 0.5400 | 0.3793 | 0.5400 | 0.7348 |
| No log | 1.3559 | 320 | 0.4797 | 0.3793 | 0.4797 | 0.6926 |
| No log | 1.3644 | 322 | 0.5111 | 0.1509 | 0.5111 | 0.7149 |
| No log | 1.3729 | 324 | 0.5497 | 0.1176 | 0.5497 | 0.7414 |
| No log | 1.3814 | 326 | 0.5740 | 0.1111 | 0.5740 | 0.7576 |
| No log | 1.3898 | 328 | 0.5745 | 0.1356 | 0.5745 | 0.7579 |
| No log | 1.3983 | 330 | 0.5666 | 0.3438 | 0.5666 | 0.7527 |
| No log | 1.4068 | 332 | 0.5764 | 0.2941 | 0.5764 | 0.7592 |
| No log | 1.4153 | 334 | 0.6065 | 0.1231 | 0.6065 | 0.7788 |
| No log | 1.4237 | 336 | 0.6737 | 0.1356 | 0.6737 | 0.8208 |
| No log | 1.4322 | 338 | 0.7064 | 0.0656 | 0.7064 | 0.8405 |
| No log | 1.4407 | 340 | 0.6955 | 0.0 | 0.6955 | 0.8340 |
| No log | 1.4492 | 342 | 0.6422 | 0.0690 | 0.6422 | 0.8014 |
| No log | 1.4576 | 344 | 0.6339 | 0.2817 | 0.6339 | 0.7962 |
| No log | 1.4661 | 346 | 0.6697 | 0.2388 | 0.6697 | 0.8184 |
| No log | 1.4746 | 348 | 0.7118 | 0.2059 | 0.7118 | 0.8437 |
| No log | 1.4831 | 350 | 0.7447 | 0.2727 | 0.7447 | 0.8629 |
| No log | 1.4915 | 352 | 0.7713 | 0.2154 | 0.7713 | 0.8782 |
| No log | 1.5 | 354 | 0.7710 | 0.2154 | 0.7710 | 0.8781 |
| No log | 1.5085 | 356 | 0.7897 | 0.2154 | 0.7897 | 0.8886 |
| No log | 1.5169 | 358 | 0.7457 | 0.2941 | 0.7457 | 0.8636 |
| No log | 1.5254 | 360 | 0.6800 | 0.2941 | 0.6800 | 0.8246 |
| No log | 1.5339 | 362 | 0.6548 | 0.2941 | 0.6548 | 0.8092 |
| No log | 1.5424 | 364 | 0.6682 | 0.2941 | 0.6682 | 0.8174 |
| No log | 1.5508 | 366 | 0.6874 | 0.2941 | 0.6874 | 0.8291 |
| No log | 1.5593 | 368 | 0.7087 | 0.1818 | 0.7087 | 0.8418 |
| No log | 1.5678 | 370 | 0.7270 | 0.2388 | 0.7270 | 0.8526 |
| No log | 1.5763 | 372 | 0.7462 | 0.2941 | 0.7462 | 0.8638 |
| No log | 1.5847 | 374 | 0.7635 | 0.2286 | 0.7635 | 0.8738 |
| No log | 1.5932 | 376 | 0.8034 | 0.1667 | 0.8034 | 0.8963 |
| No log | 1.6017 | 378 | 0.8388 | 0.1972 | 0.8388 | 0.9159 |
| No log | 1.6102 | 380 | 0.8260 | 0.2609 | 0.8260 | 0.9088 |
| No log | 1.6186 | 382 | 0.7977 | 0.1429 | 0.7977 | 0.8931 |
| No log | 1.6271 | 384 | 0.7859 | 0.1429 | 0.7859 | 0.8865 |
| No log | 1.6356 | 386 | 0.7563 | 0.1972 | 0.7563 | 0.8697 |
| No log | 1.6441 | 388 | 0.7316 | 0.1972 | 0.7316 | 0.8553 |
| No log | 1.6525 | 390 | 0.6839 | 0.1818 | 0.6839 | 0.8270 |
| No log | 1.6610 | 392 | 0.6509 | 0.1429 | 0.6509 | 0.8068 |
| No log | 1.6695 | 394 | 0.6536 | -0.0385 | 0.6536 | 0.8085 |
| No log | 1.6780 | 396 | 0.6801 | 0.0 | 0.6801 | 0.8247 |
| No log | 1.6864 | 398 | 0.7038 | 0.0727 | 0.7038 | 0.8389 |
| No log | 1.6949 | 400 | 0.6728 | -0.0385 | 0.6728 | 0.8203 |
| No log | 1.7034 | 402 | 0.6631 | 0.1639 | 0.6631 | 0.8143 |
| No log | 1.7119 | 404 | 0.6919 | 0.1818 | 0.6919 | 0.8318 |
| No log | 1.7203 | 406 | 0.7347 | 0.1493 | 0.7347 | 0.8571 |
| No log | 1.7288 | 408 | 0.7817 | 0.1176 | 0.7817 | 0.8842 |
| No log | 1.7373 | 410 | 0.8126 | 0.1493 | 0.8126 | 0.9014 |
| No log | 1.7458 | 412 | 0.8119 | 0.2727 | 0.8119 | 0.9010 |
| No log | 1.7542 | 414 | 0.8065 | 0.3077 | 0.8065 | 0.8980 |
| No log | 1.7627 | 416 | 0.7725 | 0.3143 | 0.7725 | 0.8789 |
| No log | 1.7712 | 418 | 0.7280 | 0.3143 | 0.7280 | 0.8532 |
| No log | 1.7797 | 420 | 0.6939 | 0.2727 | 0.6939 | 0.8330 |
| No log | 1.7881 | 422 | 0.6829 | 0.1905 | 0.6829 | 0.8264 |
| No log | 1.7966 | 424 | 0.6839 | 0.1905 | 0.6839 | 0.8270 |
| No log | 1.8051 | 426 | 0.6341 | 0.3226 | 0.6341 | 0.7963 |
| No log | 1.8136 | 428 | 0.5962 | 0.3607 | 0.5962 | 0.7721 |
| No log | 1.8220 | 430 | 0.6079 | 0.3077 | 0.6079 | 0.7797 |
| No log | 1.8305 | 432 | 0.5977 | 0.3000 | 0.5977 | 0.7731 |
| No log | 1.8390 | 434 | 0.5995 | 0.3607 | 0.5995 | 0.7743 |
| No log | 1.8475 | 436 | 0.5914 | 0.4348 | 0.5914 | 0.7690 |
| No log | 1.8559 | 438 | 0.6061 | 0.4 | 0.6061 | 0.7785 |
| No log | 1.8644 | 440 | 0.6010 | 0.4348 | 0.6010 | 0.7752 |
| No log | 1.8729 | 442 | 0.5801 | 0.3607 | 0.5801 | 0.7616 |
| No log | 1.8814 | 444 | 0.5709 | 0.3607 | 0.5709 | 0.7556 |
| No log | 1.8898 | 446 | 0.5684 | 0.3390 | 0.5684 | 0.7539 |
| No log | 1.8983 | 448 | 0.5838 | 0.3793 | 0.5838 | 0.7641 |
| No log | 1.9068 | 450 | 0.6183 | 0.3158 | 0.6183 | 0.7863 |
| No log | 1.9153 | 452 | 0.6187 | 0.2759 | 0.6187 | 0.7866 |
| No log | 1.9237 | 454 | 0.5949 | 0.3000 | 0.5949 | 0.7713 |
| No log | 1.9322 | 456 | 0.5834 | 0.3158 | 0.5834 | 0.7638 |
| No log | 1.9407 | 458 | 0.5810 | 0.25 | 0.5810 | 0.7622 |
| No log | 1.9492 | 460 | 0.5842 | 0.1818 | 0.5842 | 0.7643 |
| No log | 1.9576 | 462 | 0.5900 | 0.2373 | 0.5900 | 0.7681 |
| No log | 1.9661 | 464 | 0.6288 | 0.2759 | 0.6288 | 0.7930 |
| No log | 1.9746 | 466 | 0.6883 | 0.2623 | 0.6883 | 0.8296 |
| No log | 1.9831 | 468 | 0.7137 | 0.2727 | 0.7137 | 0.8448 |
| No log | 1.9915 | 470 | 0.6744 | 0.2623 | 0.6744 | 0.8212 |
| No log | 2.0 | 472 | 0.6207 | 0.2500 | 0.6207 | 0.7879 |
| No log | 2.0085 | 474 | 0.5803 | 0.2759 | 0.5803 | 0.7618 |
| No log | 2.0169 | 476 | 0.5620 | 0.3000 | 0.5620 | 0.7497 |
| No log | 2.0254 | 478 | 0.5767 | 0.3793 | 0.5767 | 0.7594 |
| No log | 2.0339 | 480 | 0.5911 | 0.2642 | 0.5911 | 0.7689 |
| No log | 2.0424 | 482 | 0.5770 | 0.2642 | 0.5770 | 0.7596 |
| No log | 2.0508 | 484 | 0.5588 | 0.25 | 0.5588 | 0.7476 |
| No log | 2.0593 | 486 | 0.5449 | 0.3000 | 0.5449 | 0.7382 |
| No log | 2.0678 | 488 | 0.5507 | 0.3390 | 0.5507 | 0.7421 |
| No log | 2.0763 | 490 | 0.5591 | 0.3390 | 0.5591 | 0.7477 |
| No log | 2.0847 | 492 | 0.5720 | 0.3000 | 0.5720 | 0.7563 |
| No log | 2.0932 | 494 | 0.5911 | 0.3000 | 0.5911 | 0.7688 |
| No log | 2.1017 | 496 | 0.5823 | 0.3000 | 0.5823 | 0.7631 |
| No log | 2.1102 | 498 | 0.5359 | 0.3607 | 0.5359 | 0.7321 |
| 0.5108 | 2.1186 | 500 | 0.5103 | 0.4 | 0.5103 | 0.7143 |
| 0.5108 | 2.1271 | 502 | 0.4860 | 0.4 | 0.4860 | 0.6971 |
| 0.5108 | 2.1356 | 504 | 0.4990 | 0.4 | 0.4990 | 0.7064 |
| 0.5108 | 2.1441 | 506 | 0.4772 | 0.2909 | 0.4772 | 0.6908 |
| 0.5108 | 2.1525 | 508 | 0.4802 | 0.2353 | 0.4802 | 0.6929 |
| 0.5108 | 2.1610 | 510 | 0.4969 | 0.2353 | 0.4969 | 0.7049 |
| 0.5108 | 2.1695 | 512 | 0.5081 | 0.25 | 0.5081 | 0.7128 |
| 0.5108 | 2.1780 | 514 | 0.5197 | 0.25 | 0.5197 | 0.7209 |
| 0.5108 | 2.1864 | 516 | 0.5265 | 0.3607 | 0.5265 | 0.7256 |
| 0.5108 | 2.1949 | 518 | 0.5458 | 0.3607 | 0.5458 | 0.7388 |
| 0.5108 | 2.2034 | 520 | 0.5505 | 0.3607 | 0.5505 | 0.7420 |
| 0.5108 | 2.2119 | 522 | 0.5350 | 0.3607 | 0.5350 | 0.7314 |
| 0.5108 | 2.2203 | 524 | 0.5246 | 0.3607 | 0.5246 | 0.7243 |
| 0.5108 | 2.2288 | 526 | 0.5191 | 0.3607 | 0.5191 | 0.7205 |
| 0.5108 | 2.2373 | 528 | 0.5190 | 0.3607 | 0.5190 | 0.7204 |
| 0.5108 | 2.2458 | 530 | 0.5143 | 0.3607 | 0.5143 | 0.7171 |
| 0.5108 | 2.2542 | 532 | 0.5185 | 0.3607 | 0.5185 | 0.7201 |
| 0.5108 | 2.2627 | 534 | 0.5201 | 0.3607 | 0.5201 | 0.7212 |
| 0.5108 | 2.2712 | 536 | 0.5247 | 0.4194 | 0.5247 | 0.7243 |
| 0.5108 | 2.2797 | 538 | 0.5614 | 0.4375 | 0.5614 | 0.7492 |
| 0.5108 | 2.2881 | 540 | 0.6054 | 0.3077 | 0.6054 | 0.7781 |
| 0.5108 | 2.2966 | 542 | 0.5957 | 0.3077 | 0.5957 | 0.7718 |
| 0.5108 | 2.3051 | 544 | 0.5729 | 0.3810 | 0.5729 | 0.7569 |
| 0.5108 | 2.3136 | 546 | 0.5456 | 0.4194 | 0.5456 | 0.7386 |
| 0.5108 | 2.3220 | 548 | 0.5389 | 0.4194 | 0.5389 | 0.7341 |
| 0.5108 | 2.3305 | 550 | 0.5123 | 0.4194 | 0.5123 | 0.7158 |
| 0.5108 | 2.3390 | 552 | 0.5084 | 0.4194 | 0.5084 | 0.7130 |
| 0.5108 | 2.3475 | 554 | 0.5188 | 0.4194 | 0.5188 | 0.7203 |
| 0.5108 | 2.3559 | 556 | 0.5511 | 0.25 | 0.5511 | 0.7423 |
| 0.5108 | 2.3644 | 558 | 0.5978 | 0.3636 | 0.5978 | 0.7732 |
| 0.5108 | 2.3729 | 560 | 0.6212 | 0.3636 | 0.6212 | 0.7882 |
| 0.5108 | 2.3814 | 562 | 0.6590 | 0.3636 | 0.6590 | 0.8118 |
| 0.5108 | 2.3898 | 564 | 0.6595 | 0.1905 | 0.6595 | 0.8121 |
| 0.5108 | 2.3983 | 566 | 0.6426 | 0.3000 | 0.6426 | 0.8016 |
| 0.5108 | 2.4068 | 568 | 0.6770 | 0.2609 | 0.6770 | 0.8228 |
| 0.5108 | 2.4153 | 570 | 0.6881 | 0.2609 | 0.6881 | 0.8295 |
| 0.5108 | 2.4237 | 572 | 0.6955 | 0.25 | 0.6955 | 0.8340 |
| 0.5108 | 2.4322 | 574 | 0.7135 | 0.3226 | 0.7135 | 0.8447 |
| 0.5108 | 2.4407 | 576 | 0.7552 | 0.3544 | 0.7552 | 0.8690 |
| 0.5108 | 2.4492 | 578 | 0.7267 | 0.4545 | 0.7267 | 0.8525 |
| 0.5108 | 2.4576 | 580 | 0.6259 | 0.3810 | 0.6259 | 0.7912 |
| 0.5108 | 2.4661 | 582 | 0.5674 | 0.3226 | 0.5674 | 0.7532 |
| 0.5108 | 2.4746 | 584 | 0.5555 | 0.3607 | 0.5555 | 0.7453 |
| 0.5108 | 2.4831 | 586 | 0.5651 | 0.3810 | 0.5651 | 0.7517 |
| 0.5108 | 2.4915 | 588 | 0.5856 | 0.3810 | 0.5856 | 0.7653 |
| 0.5108 | 2.5 | 590 | 0.6261 | 0.3636 | 0.6261 | 0.7913 |
| 0.5108 | 2.5085 | 592 | 0.6240 | 0.3077 | 0.6240 | 0.7899 |
| 0.5108 | 2.5169 | 594 | 0.5905 | 0.3810 | 0.5905 | 0.7684 |
| 0.5108 | 2.5254 | 596 | 0.5630 | 0.2623 | 0.5630 | 0.7503 |
| 0.5108 | 2.5339 | 598 | 0.5630 | 0.3000 | 0.5630 | 0.7503 |
| 0.5108 | 2.5424 | 600 | 0.5714 | 0.3390 | 0.5714 | 0.7559 |
| 0.5108 | 2.5508 | 602 | 0.5580 | 0.4 | 0.5580 | 0.7470 |
| 0.5108 | 2.5593 | 604 | 0.5162 | 0.4 | 0.5162 | 0.7185 |
| 0.5108 | 2.5678 | 606 | 0.4889 | 0.4194 | 0.4889 | 0.6992 |
| 0.5108 | 2.5763 | 608 | 0.4826 | 0.3077 | 0.4826 | 0.6947 |
| 0.5108 | 2.5847 | 610 | 0.4698 | 0.3077 | 0.4698 | 0.6854 |
| 0.5108 | 2.5932 | 612 | 0.4523 | 0.3529 | 0.4523 | 0.6726 |
| 0.5108 | 2.6017 | 614 | 0.4344 | 0.2800 | 0.4344 | 0.6591 |
| 0.5108 | 2.6102 | 616 | 0.4152 | 0.5385 | 0.4152 | 0.6444 |
| 0.5108 | 2.6186 | 618 | 0.4244 | 0.5385 | 0.4244 | 0.6514 |
| 0.5108 | 2.6271 | 620 | 0.4536 | 0.2353 | 0.4536 | 0.6735 |
| 0.5108 | 2.6356 | 622 | 0.4894 | 0.2353 | 0.4894 | 0.6995 |
| 0.5108 | 2.6441 | 624 | 0.5292 | 0.2353 | 0.5292 | 0.7275 |
| 0.5108 | 2.6525 | 626 | 0.5516 | 0.2353 | 0.5516 | 0.7427 |
| 0.5108 | 2.6610 | 628 | 0.5485 | 0.3571 | 0.5485 | 0.7406 |
| 0.5108 | 2.6695 | 630 | 0.5386 | 0.3607 | 0.5386 | 0.7339 |
| 0.5108 | 2.6780 | 632 | 0.5531 | 0.3000 | 0.5531 | 0.7437 |
| 0.5108 | 2.6864 | 634 | 0.5728 | 0.3390 | 0.5728 | 0.7569 |
| 0.5108 | 2.6949 | 636 | 0.5594 | 0.3000 | 0.5594 | 0.7479 |
| 0.5108 | 2.7034 | 638 | 0.5573 | 0.2623 | 0.5573 | 0.7465 |
| 0.5108 | 2.7119 | 640 | 0.5782 | 0.3438 | 0.5782 | 0.7604 |
| 0.5108 | 2.7203 | 642 | 0.6039 | 0.4000 | 0.6039 | 0.7771 |
| 0.5108 | 2.7288 | 644 | 0.5831 | 0.4000 | 0.5831 | 0.7636 |
| 0.5108 | 2.7373 | 646 | 0.5602 | 0.3810 | 0.5602 | 0.7485 |
| 0.5108 | 2.7458 | 648 | 0.5741 | 0.2727 | 0.5741 | 0.7577 |
| 0.5108 | 2.7542 | 650 | 0.5943 | 0.3143 | 0.5943 | 0.7709 |
| 0.5108 | 2.7627 | 652 | 0.5842 | 0.3143 | 0.5842 | 0.7643 |
| 0.5108 | 2.7712 | 654 | 0.5617 | 0.2623 | 0.5617 | 0.7495 |
| 0.5108 | 2.7797 | 656 | 0.5488 | 0.3810 | 0.5488 | 0.7408 |
| 0.5108 | 2.7881 | 658 | 0.5210 | 0.4923 | 0.5210 | 0.7218 |
| 0.5108 | 2.7966 | 660 | 0.5237 | 0.4590 | 0.5237 | 0.7237 |
| 0.5108 | 2.8051 | 662 | 0.5303 | 0.1724 | 0.5303 | 0.7282 |
| 0.5108 | 2.8136 | 664 | 0.5262 | 0.1724 | 0.5262 | 0.7254 |
| 0.5108 | 2.8220 | 666 | 0.4825 | 0.4923 | 0.4825 | 0.6947 |
| 0.5108 | 2.8305 | 668 | 0.4692 | 0.4375 | 0.4692 | 0.6850 |
| 0.5108 | 2.8390 | 670 | 0.5254 | 0.3636 | 0.5254 | 0.7248 |
| 0.5108 | 2.8475 | 672 | 0.5867 | 0.3200 | 0.5867 | 0.7660 |
| 0.5108 | 2.8559 | 674 | 0.6016 | 0.3200 | 0.6016 | 0.7757 |
| 0.5108 | 2.8644 | 676 | 0.5894 | 0.3636 | 0.5894 | 0.7677 |
| 0.5108 | 2.8729 | 678 | 0.5627 | 0.3607 | 0.5627 | 0.7502 |
| 0.5108 | 2.8814 | 680 | 0.5461 | 0.3810 | 0.5461 | 0.7390 |
| 0.5108 | 2.8898 | 682 | 0.5298 | 0.4375 | 0.5298 | 0.7279 |
| 0.5108 | 2.8983 | 684 | 0.5028 | 0.3810 | 0.5028 | 0.7091 |
| 0.5108 | 2.9068 | 686 | 0.4874 | 0.3810 | 0.4874 | 0.6981 |
| 0.5108 | 2.9153 | 688 | 0.4894 | 0.4194 | 0.4894 | 0.6995 |
| 0.5108 | 2.9237 | 690 | 0.4887 | 0.3810 | 0.4887 | 0.6990 |
| 0.5108 | 2.9322 | 692 | 0.4862 | 0.4375 | 0.4862 | 0.6973 |
| 0.5108 | 2.9407 | 694 | 0.5050 | 0.4375 | 0.5050 | 0.7106 |
| 0.5108 | 2.9492 | 696 | 0.5312 | 0.5 | 0.5312 | 0.7288 |
| 0.5108 | 2.9576 | 698 | 0.5417 | 0.5 | 0.5417 | 0.7360 |
| 0.5108 | 2.9661 | 700 | 0.5272 | 0.5 | 0.5272 | 0.7261 |
| 0.5108 | 2.9746 | 702 | 0.5098 | 0.4375 | 0.5098 | 0.7140 |
| 0.5108 | 2.9831 | 704 | 0.5121 | 0.4375 | 0.5121 | 0.7156 |
| 0.5108 | 2.9915 | 706 | 0.5459 | 0.5 | 0.5459 | 0.7389 |
| 0.5108 | 3.0 | 708 | 0.5914 | 0.3607 | 0.5914 | 0.7690 |
| 0.5108 | 3.0085 | 710 | 0.6515 | 0.3607 | 0.6515 | 0.8072 |
| 0.5108 | 3.0169 | 712 | 0.6620 | 0.3607 | 0.6620 | 0.8136 |
| 0.5108 | 3.0254 | 714 | 0.6477 | 0.3636 | 0.6477 | 0.8048 |
| 0.5108 | 3.0339 | 716 | 0.6328 | 0.3810 | 0.6328 | 0.7955 |
| 0.5108 | 3.0424 | 718 | 0.6484 | 0.2817 | 0.6484 | 0.8052 |
| 0.5108 | 3.0508 | 720 | 0.6513 | 0.2817 | 0.6513 | 0.8070 |
| 0.5108 | 3.0593 | 722 | 0.6294 | 0.2727 | 0.6294 | 0.7934 |
| 0.5108 | 3.0678 | 724 | 0.6097 | 0.3810 | 0.6097 | 0.7809 |
| 0.5108 | 3.0763 | 726 | 0.6100 | 0.3077 | 0.6100 | 0.7810 |
| 0.5108 | 3.0847 | 728 | 0.5879 | 0.3000 | 0.5879 | 0.7667 |
| 0.5108 | 3.0932 | 730 | 0.5822 | 0.3000 | 0.5822 | 0.7630 |
| 0.5108 | 3.1017 | 732 | 0.5733 | 0.3793 | 0.5733 | 0.7571 |
| 0.5108 | 3.1102 | 734 | 0.5778 | 0.3158 | 0.5778 | 0.7601 |
| 0.5108 | 3.1186 | 736 | 0.5648 | 0.3226 | 0.5648 | 0.7515 |
| 0.5108 | 3.1271 | 738 | 0.5425 | 0.2623 | 0.5425 | 0.7366 |
| 0.5108 | 3.1356 | 740 | 0.5178 | 0.3000 | 0.5178 | 0.7196 |
| 0.5108 | 3.1441 | 742 | 0.5053 | 0.3607 | 0.5053 | 0.7109 |
| 0.5108 | 3.1525 | 744 | 0.4950 | 0.3607 | 0.4950 | 0.7036 |
| 0.5108 | 3.1610 | 746 | 0.4791 | 0.3793 | 0.4791 | 0.6922 |
| 0.5108 | 3.1695 | 748 | 0.4727 | 0.3793 | 0.4727 | 0.6876 |
| 0.5108 | 3.1780 | 750 | 0.4738 | 0.4407 | 0.4738 | 0.6884 |
| 0.5108 | 3.1864 | 752 | 0.4700 | 0.4407 | 0.4700 | 0.6856 |
| 0.5108 | 3.1949 | 754 | 0.4555 | 0.4407 | 0.4555 | 0.6749 |
| 0.5108 | 3.2034 | 756 | 0.4568 | 0.4407 | 0.4568 | 0.6758 |
| 0.5108 | 3.2119 | 758 | 0.4558 | 0.3793 | 0.4558 | 0.6751 |
| 0.5108 | 3.2203 | 760 | 0.4632 | 0.3793 | 0.4632 | 0.6806 |
| 0.5108 | 3.2288 | 762 | 0.4654 | 0.4179 | 0.4654 | 0.6822 |
| 0.5108 | 3.2373 | 764 | 0.4622 | 0.4179 | 0.4622 | 0.6799 |
| 0.5108 | 3.2458 | 766 | 0.4564 | 0.4179 | 0.4564 | 0.6756 |
| 0.5108 | 3.2542 | 768 | 0.4580 | 0.4348 | 0.4580 | 0.6767 |
| 0.5108 | 3.2627 | 770 | 0.4627 | 0.4348 | 0.4627 | 0.6802 |
| 0.5108 | 3.2712 | 772 | 0.4718 | 0.4156 | 0.4718 | 0.6869 |
| 0.5108 | 3.2797 | 774 | 0.4936 | 0.3684 | 0.4936 | 0.7026 |
| 0.5108 | 3.2881 | 776 | 0.4979 | 0.3684 | 0.4979 | 0.7056 |
| 0.5108 | 3.2966 | 778 | 0.4769 | 0.4156 | 0.4769 | 0.6906 |
| 0.5108 | 3.3051 | 780 | 0.4517 | 0.4 | 0.4517 | 0.6721 |
| 0.5108 | 3.3136 | 782 | 0.4405 | 0.4507 | 0.4405 | 0.6637 |
| 0.5108 | 3.3220 | 784 | 0.4364 | 0.4507 | 0.4364 | 0.6606 |
| 0.5108 | 3.3305 | 786 | 0.4413 | 0.4545 | 0.4413 | 0.6643 |
| 0.5108 | 3.3390 | 788 | 0.4424 | 0.5161 | 0.4424 | 0.6651 |
| 0.5108 | 3.3475 | 790 | 0.4546 | 0.5161 | 0.4546 | 0.6742 |
| 0.5108 | 3.3559 | 792 | 0.4564 | 0.4590 | 0.4564 | 0.6756 |
| 0.5108 | 3.3644 | 794 | 0.4621 | 0.4 | 0.4621 | 0.6798 |
| 0.5108 | 3.3729 | 796 | 0.4775 | 0.4 | 0.4775 | 0.6910 |
| 0.5108 | 3.3814 | 798 | 0.4930 | 0.3810 | 0.4930 | 0.7021 |
| 0.5108 | 3.3898 | 800 | 0.5086 | 0.4194 | 0.5086 | 0.7131 |
| 0.5108 | 3.3983 | 802 | 0.5194 | 0.4194 | 0.5194 | 0.7207 |
| 0.5108 | 3.4068 | 804 | 0.5242 | 0.3810 | 0.5242 | 0.7240 |
| 0.5108 | 3.4153 | 806 | 0.5361 | 0.4407 | 0.5361 | 0.7322 |
| 0.5108 | 3.4237 | 808 | 0.5430 | 0.4407 | 0.5430 | 0.7369 |
| 0.5108 | 3.4322 | 810 | 0.5690 | 0.5 | 0.5690 | 0.7543 |
| 0.5108 | 3.4407 | 812 | 0.5676 | 0.5 | 0.5676 | 0.7534 |
| 0.5108 | 3.4492 | 814 | 0.5536 | 0.3793 | 0.5536 | 0.7440 |
| 0.5108 | 3.4576 | 816 | 0.5476 | 0.3158 | 0.5476 | 0.7400 |
| 0.5108 | 3.4661 | 818 | 0.5548 | 0.2909 | 0.5548 | 0.7448 |
| 0.5108 | 3.4746 | 820 | 0.5555 | 0.25 | 0.5555 | 0.7453 |
| 0.5108 | 3.4831 | 822 | 0.5615 | 0.25 | 0.5615 | 0.7493 |
| 0.5108 | 3.4915 | 824 | 0.5753 | 0.25 | 0.5753 | 0.7585 |
| 0.5108 | 3.5 | 826 | 0.5846 | 0.3158 | 0.5846 | 0.7646 |
| 0.5108 | 3.5085 | 828 | 0.5926 | 0.3158 | 0.5926 | 0.7698 |
| 0.5108 | 3.5169 | 830 | 0.6211 | 0.25 | 0.6211 | 0.7881 |
| 0.5108 | 3.5254 | 832 | 0.6430 | 0.3000 | 0.6430 | 0.8019 |
| 0.5108 | 3.5339 | 834 | 0.6573 | 0.3077 | 0.6573 | 0.8107 |
| 0.5108 | 3.5424 | 836 | 0.6594 | 0.3143 | 0.6594 | 0.8121 |
| 0.5108 | 3.5508 | 838 | 0.6475 | 0.3662 | 0.6475 | 0.8047 |
| 0.5108 | 3.5593 | 840 | 0.6373 | 0.3226 | 0.6373 | 0.7983 |
| 0.5108 | 3.5678 | 842 | 0.6237 | 0.4375 | 0.6237 | 0.7897 |
| 0.5108 | 3.5763 | 844 | 0.5983 | 0.4923 | 0.5983 | 0.7735 |
| 0.5108 | 3.5847 | 846 | 0.5629 | 0.4375 | 0.5629 | 0.7503 |
| 0.5108 | 3.5932 | 848 | 0.5421 | 0.3810 | 0.5421 | 0.7363 |
| 0.5108 | 3.6017 | 850 | 0.5360 | 0.4179 | 0.5360 | 0.7321 |
| 0.5108 | 3.6102 | 852 | 0.5471 | 0.3636 | 0.5471 | 0.7396 |
| 0.5108 | 3.6186 | 854 | 0.5490 | 0.4179 | 0.5490 | 0.7409 |
| 0.5108 | 3.6271 | 856 | 0.5480 | 0.3824 | 0.5480 | 0.7403 |
| 0.5108 | 3.6356 | 858 | 0.5532 | 0.3824 | 0.5532 | 0.7438 |
| 0.5108 | 3.6441 | 860 | 0.5674 | 0.3824 | 0.5674 | 0.7533 |
| 0.5108 | 3.6525 | 862 | 0.5739 | 0.3824 | 0.5739 | 0.7575 |
| 0.5108 | 3.6610 | 864 | 0.5718 | 0.3824 | 0.5718 | 0.7562 |
| 0.5108 | 3.6695 | 866 | 0.5737 | 0.3824 | 0.5737 | 0.7574 |
| 0.5108 | 3.6780 | 868 | 0.5649 | 0.4179 | 0.5649 | 0.7516 |
| 0.5108 | 3.6864 | 870 | 0.5735 | 0.4156 | 0.5735 | 0.7573 |
| 0.5108 | 3.6949 | 872 | 0.5917 | 0.4156 | 0.5917 | 0.7692 |
| 0.5108 | 3.7034 | 874 | 0.6026 | 0.3684 | 0.6026 | 0.7762 |
| 0.5108 | 3.7119 | 876 | 0.6106 | 0.3200 | 0.6106 | 0.7814 |
| 0.5108 | 3.7203 | 878 | 0.6055 | 0.3684 | 0.6055 | 0.7782 |
| 0.5108 | 3.7288 | 880 | 0.6169 | 0.3662 | 0.6169 | 0.7854 |
| 0.5108 | 3.7373 | 882 | 0.6429 | 0.4348 | 0.6429 | 0.8018 |
| 0.5108 | 3.7458 | 884 | 0.6409 | 0.4923 | 0.6409 | 0.8006 |
| 0.5108 | 3.7542 | 886 | 0.6136 | 0.4857 | 0.6136 | 0.7834 |
| 0.5108 | 3.7627 | 888 | 0.5750 | 0.3824 | 0.5750 | 0.7583 |
| 0.5108 | 3.7712 | 890 | 0.5598 | 0.3836 | 0.5598 | 0.7482 |
| 0.5108 | 3.7797 | 892 | 0.5628 | 0.4923 | 0.5628 | 0.7502 |
| 0.5108 | 3.7881 | 894 | 0.5766 | 0.4923 | 0.5766 | 0.7593 |
| 0.5108 | 3.7966 | 896 | 0.5957 | 0.4923 | 0.5957 | 0.7718 |
| 0.5108 | 3.8051 | 898 | 0.6155 | 0.4923 | 0.6155 | 0.7846 |
| 0.5108 | 3.8136 | 900 | 0.6098 | 0.4923 | 0.6098 | 0.7809 |
| 0.5108 | 3.8220 | 902 | 0.6045 | 0.4923 | 0.6045 | 0.7775 |
| 0.5108 | 3.8305 | 904 | 0.5919 | 0.4857 | 0.5919 | 0.7693 |
| 0.5108 | 3.8390 | 906 | 0.5792 | 0.3824 | 0.5792 | 0.7611 |
| 0.5108 | 3.8475 | 908 | 0.5727 | 0.3836 | 0.5727 | 0.7568 |
| 0.5108 | 3.8559 | 910 | 0.5616 | 0.4857 | 0.5616 | 0.7494 |
| 0.5108 | 3.8644 | 912 | 0.5720 | 0.4507 | 0.5720 | 0.7563 |
| 0.5108 | 3.8729 | 914 | 0.5718 | 0.4545 | 0.5718 | 0.7562 |
| 0.5108 | 3.8814 | 916 | 0.5635 | 0.4545 | 0.5635 | 0.7507 |
| 0.5108 | 3.8898 | 918 | 0.5559 | 0.5075 | 0.5559 | 0.7456 |
| 0.5108 | 3.8983 | 920 | 0.5529 | 0.4545 | 0.5529 | 0.7436 |
| 0.5108 | 3.9068 | 922 | 0.5525 | 0.4545 | 0.5525 | 0.7433 |
| 0.5108 | 3.9153 | 924 | 0.5724 | 0.4348 | 0.5724 | 0.7566 |
| 0.5108 | 3.9237 | 926 | 0.6046 | 0.4348 | 0.6046 | 0.7775 |
| 0.5108 | 3.9322 | 928 | 0.6424 | 0.3284 | 0.6424 | 0.8015 |
| 0.5108 | 3.9407 | 930 | 0.6767 | 0.2727 | 0.6767 | 0.8226 |
| 0.5108 | 3.9492 | 932 | 0.7011 | 0.2727 | 0.7011 | 0.8373 |
| 0.5108 | 3.9576 | 934 | 0.7359 | 0.3200 | 0.7359 | 0.8579 |
| 0.5108 | 3.9661 | 936 | 0.7508 | 0.3200 | 0.7508 | 0.8665 |
| 0.5108 | 3.9746 | 938 | 0.7501 | 0.3200 | 0.7501 | 0.8661 |
| 0.5108 | 3.9831 | 940 | 0.7185 | 0.2727 | 0.7185 | 0.8476 |
| 0.5108 | 3.9915 | 942 | 0.6685 | 0.3284 | 0.6685 | 0.8176 |
| 0.5108 | 4.0 | 944 | 0.6317 | 0.3824 | 0.6317 | 0.7948 |
| 0.5108 | 4.0085 | 946 | 0.6028 | 0.4000 | 0.6028 | 0.7764 |
| 0.5108 | 4.0169 | 948 | 0.5981 | 0.4545 | 0.5981 | 0.7734 |
| 0.5108 | 4.0254 | 950 | 0.6054 | 0.4545 | 0.6054 | 0.7781 |
| 0.5108 | 4.0339 | 952 | 0.5890 | 0.4545 | 0.5890 | 0.7675 |
| 0.5108 | 4.0424 | 954 | 0.5603 | 0.4507 | 0.5603 | 0.7485 |
| 0.5108 | 4.0508 | 956 | 0.5356 | 0.3824 | 0.5356 | 0.7318 |
| 0.5108 | 4.0593 | 958 | 0.5306 | 0.3824 | 0.5306 | 0.7285 |
| 0.5108 | 4.0678 | 960 | 0.5232 | 0.3824 | 0.5232 | 0.7233 |
| 0.5108 | 4.0763 | 962 | 0.5198 | 0.4348 | 0.5198 | 0.7210 |
| 0.5108 | 4.0847 | 964 | 0.5395 | 0.4545 | 0.5395 | 0.7345 |
| 0.5108 | 4.0932 | 966 | 0.5643 | 0.5075 | 0.5643 | 0.7512 |
| 0.5108 | 4.1017 | 968 | 0.5569 | 0.5075 | 0.5569 | 0.7462 |
| 0.5108 | 4.1102 | 970 | 0.5400 | 0.4545 | 0.5400 | 0.7349 |
| 0.5108 | 4.1186 | 972 | 0.5230 | 0.4348 | 0.5230 | 0.7232 |
| 0.5108 | 4.1271 | 974 | 0.5345 | 0.3284 | 0.5345 | 0.7311 |
| 0.5108 | 4.1356 | 976 | 0.5515 | 0.3284 | 0.5515 | 0.7426 |
| 0.5108 | 4.1441 | 978 | 0.5544 | 0.3824 | 0.5544 | 0.7446 |
| 0.5108 | 4.1525 | 980 | 0.5585 | 0.4857 | 0.5585 | 0.7473 |
| 0.5108 | 4.1610 | 982 | 0.5657 | 0.4923 | 0.5657 | 0.7521 |
| 0.5108 | 4.1695 | 984 | 0.5799 | 0.5075 | 0.5799 | 0.7615 |
| 0.5108 | 4.1780 | 986 | 0.6032 | 0.5075 | 0.6032 | 0.7766 |
| 0.5108 | 4.1864 | 988 | 0.6150 | 0.4923 | 0.6150 | 0.7842 |
| 0.5108 | 4.1949 | 990 | 0.6188 | 0.4375 | 0.6188 | 0.7866 |
| 0.5108 | 4.2034 | 992 | 0.6129 | 0.4375 | 0.6129 | 0.7829 |
| 0.5108 | 4.2119 | 994 | 0.6120 | 0.4375 | 0.6120 | 0.7823 |
| 0.5108 | 4.2203 | 996 | 0.5959 | 0.3824 | 0.5959 | 0.7719 |
| 0.5108 | 4.2288 | 998 | 0.5875 | 0.3824 | 0.5875 | 0.7665 |
| 0.1182 | 4.2373 | 1000 | 0.5877 | 0.3824 | 0.5877 | 0.7666 |
| 0.1182 | 4.2458 | 1002 | 0.5955 | 0.3824 | 0.5955 | 0.7717 |
| 0.1182 | 4.2542 | 1004 | 0.5980 | 0.3824 | 0.5980 | 0.7733 |
| 0.1182 | 4.2627 | 1006 | 0.6054 | 0.3824 | 0.6054 | 0.7781 |
| 0.1182 | 4.2712 | 1008 | 0.6206 | 0.3824 | 0.6206 | 0.7878 |
| 0.1182 | 4.2797 | 1010 | 0.6443 | 0.3284 | 0.6443 | 0.8027 |
| 0.1182 | 4.2881 | 1012 | 0.6663 | 0.3284 | 0.6663 | 0.8163 |
| 0.1182 | 4.2966 | 1014 | 0.6712 | 0.3284 | 0.6712 | 0.8193 |
| 0.1182 | 4.3051 | 1016 | 0.6834 | 0.2727 | 0.6834 | 0.8267 |
| 0.1182 | 4.3136 | 1018 | 0.6827 | 0.3284 | 0.6827 | 0.8263 |
| 0.1182 | 4.3220 | 1020 | 0.6820 | 0.3824 | 0.6820 | 0.8259 |
| 0.1182 | 4.3305 | 1022 | 0.6882 | 0.3824 | 0.6882 | 0.8296 |
| 0.1182 | 4.3390 | 1024 | 0.6849 | 0.3824 | 0.6849 | 0.8276 |
| 0.1182 | 4.3475 | 1026 | 0.6758 | 0.3284 | 0.6758 | 0.8221 |
| 0.1182 | 4.3559 | 1028 | 0.6741 | 0.3284 | 0.6741 | 0.8210 |
| 0.1182 | 4.3644 | 1030 | 0.6875 | 0.3636 | 0.6875 | 0.8292 |
| 0.1182 | 4.3729 | 1032 | 0.6721 | 0.3636 | 0.6721 | 0.8198 |
| 0.1182 | 4.3814 | 1034 | 0.6358 | 0.3284 | 0.6358 | 0.7974 |
| 0.1182 | 4.3898 | 1036 | 0.6167 | 0.3284 | 0.6167 | 0.7853 |
| 0.1182 | 4.3983 | 1038 | 0.6266 | 0.4375 | 0.6266 | 0.7915 |
| 0.1182 | 4.4068 | 1040 | 0.6358 | 0.3284 | 0.6358 | 0.7974 |
| 0.1182 | 4.4153 | 1042 | 0.6250 | 0.4348 | 0.6250 | 0.7906 |
| 0.1182 | 4.4237 | 1044 | 0.6261 | 0.3284 | 0.6261 | 0.7913 |
| 0.1182 | 4.4322 | 1046 | 0.6347 | 0.3284 | 0.6347 | 0.7967 |
| 0.1182 | 4.4407 | 1048 | 0.6355 | 0.3284 | 0.6355 | 0.7972 |
| 0.1182 | 4.4492 | 1050 | 0.6208 | 0.3824 | 0.6208 | 0.7879 |
| 0.1182 | 4.4576 | 1052 | 0.6135 | 0.3824 | 0.6135 | 0.7833 |
| 0.1182 | 4.4661 | 1054 | 0.6211 | 0.3824 | 0.6211 | 0.7881 |
| 0.1182 | 4.4746 | 1056 | 0.6292 | 0.3824 | 0.6292 | 0.7932 |
| 0.1182 | 4.4831 | 1058 | 0.6325 | 0.4857 | 0.6325 | 0.7953 |
| 0.1182 | 4.4915 | 1060 | 0.6347 | 0.3824 | 0.6347 | 0.7967 |
| 0.1182 | 4.5 | 1062 | 0.6385 | 0.3824 | 0.6385 | 0.7991 |
| 0.1182 | 4.5085 | 1064 | 0.6432 | 0.3284 | 0.6432 | 0.8020 |
| 0.1182 | 4.5169 | 1066 | 0.6505 | 0.3284 | 0.6505 | 0.8065 |
| 0.1182 | 4.5254 | 1068 | 0.6547 | 0.3284 | 0.6547 | 0.8092 |
| 0.1182 | 4.5339 | 1070 | 0.6473 | 0.3284 | 0.6473 | 0.8045 |
| 0.1182 | 4.5424 | 1072 | 0.6273 | 0.3284 | 0.6273 | 0.7920 |
| 0.1182 | 4.5508 | 1074 | 0.6097 | 0.3226 | 0.6097 | 0.7808 |
| 0.1182 | 4.5593 | 1076 | 0.6027 | 0.3810 | 0.6027 | 0.7764 |
| 0.1182 | 4.5678 | 1078 | 0.6118 | 0.3607 | 0.6118 | 0.7822 |
| 0.1182 | 4.5763 | 1080 | 0.6330 | 0.3226 | 0.6330 | 0.7956 |
| 0.1182 | 4.5847 | 1082 | 0.6369 | 0.3000 | 0.6369 | 0.7981 |
| 0.1182 | 4.5932 | 1084 | 0.6424 | 0.3810 | 0.6424 | 0.8015 |
| 0.1182 | 4.6017 | 1086 | 0.6562 | 0.3284 | 0.6562 | 0.8100 |
| 0.1182 | 4.6102 | 1088 | 0.7087 | 0.2727 | 0.7087 | 0.8418 |
| 0.1182 | 4.6186 | 1090 | 0.7632 | 0.1316 | 0.7632 | 0.8736 |
| 0.1182 | 4.6271 | 1092 | 0.7733 | 0.1316 | 0.7733 | 0.8794 |
| 0.1182 | 4.6356 | 1094 | 0.7507 | 0.1127 | 0.7507 | 0.8664 |
| 0.1182 | 4.6441 | 1096 | 0.7489 | 0.3824 | 0.7489 | 0.8654 |
| 0.1182 | 4.6525 | 1098 | 0.7680 | 0.2154 | 0.7680 | 0.8763 |
| 0.1182 | 4.6610 | 1100 | 0.7746 | 0.2623 | 0.7746 | 0.8801 |
| 0.1182 | 4.6695 | 1102 | 0.7577 | 0.2623 | 0.7577 | 0.8705 |
| 0.1182 | 4.6780 | 1104 | 0.7157 | 0.2623 | 0.7157 | 0.8460 |
| 0.1182 | 4.6864 | 1106 | 0.6616 | 0.4375 | 0.6616 | 0.8134 |
| 0.1182 | 4.6949 | 1108 | 0.6219 | 0.3284 | 0.6219 | 0.7886 |
| 0.1182 | 4.7034 | 1110 | 0.6027 | 0.3284 | 0.6027 | 0.7763 |
| 0.1182 | 4.7119 | 1112 | 0.5915 | 0.3284 | 0.5915 | 0.7691 |
| 0.1182 | 4.7203 | 1114 | 0.5863 | 0.3824 | 0.5863 | 0.7657 |
| 0.1182 | 4.7288 | 1116 | 0.5894 | 0.3824 | 0.5894 | 0.7677 |
| 0.1182 | 4.7373 | 1118 | 0.5965 | 0.3824 | 0.5965 | 0.7723 |
| 0.1182 | 4.7458 | 1120 | 0.6013 | 0.4923 | 0.6013 | 0.7754 |
| 0.1182 | 4.7542 | 1122 | 0.6103 | 0.4923 | 0.6103 | 0.7812 |
| 0.1182 | 4.7627 | 1124 | 0.6288 | 0.4375 | 0.6288 | 0.7929 |
| 0.1182 | 4.7712 | 1126 | 0.6385 | 0.3284 | 0.6385 | 0.7991 |
| 0.1182 | 4.7797 | 1128 | 0.6424 | 0.3284 | 0.6424 | 0.8015 |
| 0.1182 | 4.7881 | 1130 | 0.6356 | 0.3284 | 0.6356 | 0.7972 |
| 0.1182 | 4.7966 | 1132 | 0.6214 | 0.3284 | 0.6214 | 0.7883 |
| 0.1182 | 4.8051 | 1134 | 0.6280 | 0.3810 | 0.6280 | 0.7925 |
| 0.1182 | 4.8136 | 1136 | 0.6204 | 0.4375 | 0.6204 | 0.7877 |
| 0.1182 | 4.8220 | 1138 | 0.5909 | 0.3810 | 0.5909 | 0.7687 |
| 0.1182 | 4.8305 | 1140 | 0.5645 | 0.3824 | 0.5645 | 0.7513 |
| 0.1182 | 4.8390 | 1142 | 0.5638 | 0.3824 | 0.5638 | 0.7509 |
| 0.1182 | 4.8475 | 1144 | 0.5665 | 0.3824 | 0.5665 | 0.7527 |
| 0.1182 | 4.8559 | 1146 | 0.5632 | 0.3810 | 0.5632 | 0.7505 |
| 0.1182 | 4.8644 | 1148 | 0.5790 | 0.3810 | 0.5790 | 0.7609 |
| 0.1182 | 4.8729 | 1150 | 0.5949 | 0.3810 | 0.5949 | 0.7713 |
| 0.1182 | 4.8814 | 1152 | 0.5988 | 0.3810 | 0.5988 | 0.7738 |
| 0.1182 | 4.8898 | 1154 | 0.5832 | 0.3226 | 0.5832 | 0.7637 |
| 0.1182 | 4.8983 | 1156 | 0.5633 | 0.3226 | 0.5633 | 0.7505 |
| 0.1182 | 4.9068 | 1158 | 0.5462 | 0.3810 | 0.5462 | 0.7391 |
| 0.1182 | 4.9153 | 1160 | 0.5455 | 0.3810 | 0.5455 | 0.7386 |
| 0.1182 | 4.9237 | 1162 | 0.5412 | 0.3810 | 0.5412 | 0.7357 |
| 0.1182 | 4.9322 | 1164 | 0.5540 | 0.3810 | 0.5540 | 0.7443 |
| 0.1182 | 4.9407 | 1166 | 0.5741 | 0.4375 | 0.5741 | 0.7577 |
| 0.1182 | 4.9492 | 1168 | 0.5651 | 0.3810 | 0.5651 | 0.7517 |
| 0.1182 | 4.9576 | 1170 | 0.5476 | 0.3810 | 0.5476 | 0.7400 |
| 0.1182 | 4.9661 | 1172 | 0.5423 | 0.3810 | 0.5423 | 0.7364 |
| 0.1182 | 4.9746 | 1174 | 0.5573 | 0.3226 | 0.5573 | 0.7466 |
| 0.1182 | 4.9831 | 1176 | 0.5807 | 0.3226 | 0.5807 | 0.7621 |
| 0.1182 | 4.9915 | 1178 | 0.5863 | 0.3226 | 0.5863 | 0.7657 |
| 0.1182 | 5.0 | 1180 | 0.5937 | 0.3284 | 0.5937 | 0.7705 |
| 0.1182 | 5.0085 | 1182 | 0.6080 | 0.3284 | 0.6080 | 0.7797 |
| 0.1182 | 5.0169 | 1184 | 0.6198 | 0.3284 | 0.6198 | 0.7873 |
| 0.1182 | 5.0254 | 1186 | 0.6262 | 0.3284 | 0.6262 | 0.7913 |
| 0.1182 | 5.0339 | 1188 | 0.6268 | 0.3284 | 0.6268 | 0.7917 |
| 0.1182 | 5.0424 | 1190 | 0.6284 | 0.3284 | 0.6284 | 0.7927 |
| 0.1182 | 5.0508 | 1192 | 0.6226 | 0.3284 | 0.6226 | 0.7891 |
| 0.1182 | 5.0593 | 1194 | 0.6131 | 0.3824 | 0.6131 | 0.7830 |
| 0.1182 | 5.0678 | 1196 | 0.6085 | 0.3824 | 0.6085 | 0.7800 |
| 0.1182 | 5.0763 | 1198 | 0.6053 | 0.4348 | 0.6053 | 0.7780 |
| 0.1182 | 5.0847 | 1200 | 0.5968 | 0.4348 | 0.5968 | 0.7725 |
| 0.1182 | 5.0932 | 1202 | 0.5866 | 0.4348 | 0.5866 | 0.7659 |
| 0.1182 | 5.1017 | 1204 | 0.5644 | 0.4348 | 0.5644 | 0.7513 |
| 0.1182 | 5.1102 | 1206 | 0.5384 | 0.4857 | 0.5384 | 0.7338 |
| 0.1182 | 5.1186 | 1208 | 0.5291 | 0.4923 | 0.5291 | 0.7274 |
| 0.1182 | 5.1271 | 1210 | 0.5329 | 0.5455 | 0.5329 | 0.7300 |
| 0.1182 | 5.1356 | 1212 | 0.5301 | 0.4923 | 0.5301 | 0.7281 |
| 0.1182 | 5.1441 | 1214 | 0.5379 | 0.4923 | 0.5379 | 0.7334 |
| 0.1182 | 5.1525 | 1216 | 0.5472 | 0.4348 | 0.5472 | 0.7397 |
| 0.1182 | 5.1610 | 1218 | 0.5550 | 0.4857 | 0.5550 | 0.7450 |
| 0.1182 | 5.1695 | 1220 | 0.5671 | 0.4857 | 0.5671 | 0.7530 |
| 0.1182 | 5.1780 | 1222 | 0.5761 | 0.4923 | 0.5761 | 0.7590 |
| 0.1182 | 5.1864 | 1224 | 0.5896 | 0.4923 | 0.5896 | 0.7678 |
| 0.1182 | 5.1949 | 1226 | 0.5840 | 0.4923 | 0.5840 | 0.7642 |
| 0.1182 | 5.2034 | 1228 | 0.5720 | 0.4923 | 0.5720 | 0.7563 |
| 0.1182 | 5.2119 | 1230 | 0.5467 | 0.3824 | 0.5467 | 0.7394 |
| 0.1182 | 5.2203 | 1232 | 0.5245 | 0.3333 | 0.5245 | 0.7242 |
| 0.1182 | 5.2288 | 1234 | 0.5064 | 0.3333 | 0.5064 | 0.7116 |
| 0.1182 | 5.2373 | 1236 | 0.4964 | 0.3836 | 0.4964 | 0.7046 |
| 0.1182 | 5.2458 | 1238 | 0.4959 | 0.3836 | 0.4959 | 0.7042 |
| 0.1182 | 5.2542 | 1240 | 0.5105 | 0.3810 | 0.5105 | 0.7145 |
| 0.1182 | 5.2627 | 1242 | 0.5320 | 0.4923 | 0.5320 | 0.7294 |
| 0.1182 | 5.2712 | 1244 | 0.5389 | 0.4923 | 0.5389 | 0.7341 |
| 0.1182 | 5.2797 | 1246 | 0.5299 | 0.4923 | 0.5299 | 0.7279 |
| 0.1182 | 5.2881 | 1248 | 0.5146 | 0.3810 | 0.5146 | 0.7173 |
| 0.1182 | 5.2966 | 1250 | 0.5087 | 0.3810 | 0.5087 | 0.7133 |
| 0.1182 | 5.3051 | 1252 | 0.5200 | 0.3810 | 0.5200 | 0.7211 |
| 0.1182 | 5.3136 | 1254 | 0.5333 | 0.3810 | 0.5333 | 0.7302 |
| 0.1182 | 5.3220 | 1256 | 0.5474 | 0.3226 | 0.5474 | 0.7399 |
| 0.1182 | 5.3305 | 1258 | 0.5548 | 0.3284 | 0.5548 | 0.7449 |
| 0.1182 | 5.3390 | 1260 | 0.5619 | 0.3284 | 0.5619 | 0.7496 |
| 0.1182 | 5.3475 | 1262 | 0.5746 | 0.3284 | 0.5746 | 0.7580 |
| 0.1182 | 5.3559 | 1264 | 0.5741 | 0.3284 | 0.5741 | 0.7577 |
| 0.1182 | 5.3644 | 1266 | 0.5688 | 0.3824 | 0.5688 | 0.7542 |
| 0.1182 | 5.3729 | 1268 | 0.5653 | 0.3824 | 0.5653 | 0.7519 |
| 0.1182 | 5.3814 | 1270 | 0.5682 | 0.3824 | 0.5682 | 0.7538 |
| 0.1182 | 5.3898 | 1272 | 0.5636 | 0.3824 | 0.5636 | 0.7508 |
| 0.1182 | 5.3983 | 1274 | 0.5630 | 0.3824 | 0.5630 | 0.7504 |
| 0.1182 | 5.4068 | 1276 | 0.5656 | 0.4348 | 0.5656 | 0.7520 |
| 0.1182 | 5.4153 | 1278 | 0.5573 | 0.3824 | 0.5573 | 0.7465 |
| 0.1182 | 5.4237 | 1280 | 0.5514 | 0.3824 | 0.5514 | 0.7426 |
| 0.1182 | 5.4322 | 1282 | 0.5640 | 0.3284 | 0.5640 | 0.7510 |
| 0.1182 | 5.4407 | 1284 | 0.5868 | 0.3284 | 0.5868 | 0.7660 |
| 0.1182 | 5.4492 | 1286 | 0.5936 | 0.2727 | 0.5936 | 0.7705 |
| 0.1182 | 5.4576 | 1288 | 0.5780 | 0.3284 | 0.5780 | 0.7602 |
| 0.1182 | 5.4661 | 1290 | 0.5623 | 0.3226 | 0.5623 | 0.7499 |
| 0.1182 | 5.4746 | 1292 | 0.5660 | 0.4923 | 0.5660 | 0.7524 |
| 0.1182 | 5.4831 | 1294 | 0.5747 | 0.4923 | 0.5747 | 0.7581 |
| 0.1182 | 5.4915 | 1296 | 0.5874 | 0.4179 | 0.5874 | 0.7664 |
| 0.1182 | 5.5 | 1298 | 0.5770 | 0.4179 | 0.5770 | 0.7596 |
| 0.1182 | 5.5085 | 1300 | 0.5770 | 0.4545 | 0.5770 | 0.7596 |
| 0.1182 | 5.5169 | 1302 | 0.5803 | 0.4545 | 0.5803 | 0.7617 |
| 0.1182 | 5.5254 | 1304 | 0.5804 | 0.4923 | 0.5804 | 0.7619 |
| 0.1182 | 5.5339 | 1306 | 0.5761 | 0.4923 | 0.5761 | 0.7590 |
| 0.1182 | 5.5424 | 1308 | 0.5913 | 0.4923 | 0.5913 | 0.7689 |
| 0.1182 | 5.5508 | 1310 | 0.6111 | 0.4923 | 0.6111 | 0.7817 |
| 0.1182 | 5.5593 | 1312 | 0.6349 | 0.4375 | 0.6349 | 0.7968 |
| 0.1182 | 5.5678 | 1314 | 0.6520 | 0.3810 | 0.6520 | 0.8074 |
| 0.1182 | 5.5763 | 1316 | 0.6593 | 0.3810 | 0.6593 | 0.8120 |
| 0.1182 | 5.5847 | 1318 | 0.6740 | 0.3810 | 0.6740 | 0.8210 |
| 0.1182 | 5.5932 | 1320 | 0.6709 | 0.3810 | 0.6709 | 0.8191 |
| 0.1182 | 5.6017 | 1322 | 0.6613 | 0.3810 | 0.6613 | 0.8132 |
| 0.1182 | 5.6102 | 1324 | 0.6341 | 0.3226 | 0.6341 | 0.7963 |
| 0.1182 | 5.6186 | 1326 | 0.6085 | 0.2623 | 0.6085 | 0.7800 |
| 0.1182 | 5.6271 | 1328 | 0.6003 | 0.2727 | 0.6003 | 0.7748 |
| 0.1182 | 5.6356 | 1330 | 0.5986 | 0.3077 | 0.5986 | 0.7737 |
| 0.1182 | 5.6441 | 1332 | 0.5872 | 0.3077 | 0.5872 | 0.7663 |
| 0.1182 | 5.6525 | 1334 | 0.5658 | 0.2727 | 0.5658 | 0.7522 |
| 0.1182 | 5.6610 | 1336 | 0.5629 | 0.3810 | 0.5629 | 0.7503 |
| 0.1182 | 5.6695 | 1338 | 0.5857 | 0.4923 | 0.5857 | 0.7653 |
| 0.1182 | 5.6780 | 1340 | 0.5979 | 0.4923 | 0.5979 | 0.7732 |
| 0.1182 | 5.6864 | 1342 | 0.5975 | 0.4923 | 0.5975 | 0.7730 |
| 0.1182 | 5.6949 | 1344 | 0.5867 | 0.4923 | 0.5867 | 0.7660 |
| 0.1182 | 5.7034 | 1346 | 0.5715 | 0.4375 | 0.5715 | 0.7560 |
| 0.1182 | 5.7119 | 1348 | 0.5546 | 0.4375 | 0.5546 | 0.7447 |
| 0.1182 | 5.7203 | 1350 | 0.5535 | 0.3226 | 0.5535 | 0.7440 |
| 0.1182 | 5.7288 | 1352 | 0.5558 | 0.3226 | 0.5558 | 0.7455 |
| 0.1182 | 5.7373 | 1354 | 0.5652 | 0.3284 | 0.5652 | 0.7518 |
| 0.1182 | 5.7458 | 1356 | 0.5669 | 0.3284 | 0.5669 | 0.7529 |
| 0.1182 | 5.7542 | 1358 | 0.5584 | 0.3284 | 0.5584 | 0.7472 |
| 0.1182 | 5.7627 | 1360 | 0.5626 | 0.3226 | 0.5626 | 0.7501 |
| 0.1182 | 5.7712 | 1362 | 0.5771 | 0.3810 | 0.5771 | 0.7597 |
| 0.1182 | 5.7797 | 1364 | 0.6143 | 0.4923 | 0.6143 | 0.7838 |
| 0.1182 | 5.7881 | 1366 | 0.6468 | 0.4923 | 0.6468 | 0.8042 |
| 0.1182 | 5.7966 | 1368 | 0.6523 | 0.4923 | 0.6523 | 0.8076 |
| 0.1182 | 5.8051 | 1370 | 0.6426 | 0.4375 | 0.6426 | 0.8016 |
| 0.1182 | 5.8136 | 1372 | 0.6235 | 0.3226 | 0.6235 | 0.7896 |
| 0.1182 | 5.8220 | 1374 | 0.6141 | 0.3226 | 0.6141 | 0.7837 |
| 0.1182 | 5.8305 | 1376 | 0.6153 | 0.3284 | 0.6153 | 0.7844 |
| 0.1182 | 5.8390 | 1378 | 0.6135 | 0.3284 | 0.6135 | 0.7832 |
| 0.1182 | 5.8475 | 1380 | 0.6054 | 0.3226 | 0.6054 | 0.7781 |
| 0.1182 | 5.8559 | 1382 | 0.5985 | 0.3226 | 0.5985 | 0.7737 |
| 0.1182 | 5.8644 | 1384 | 0.6053 | 0.3226 | 0.6053 | 0.7780 |
| 0.1182 | 5.8729 | 1386 | 0.6170 | 0.3226 | 0.6170 | 0.7855 |
| 0.1182 | 5.8814 | 1388 | 0.6181 | 0.3226 | 0.6181 | 0.7862 |
| 0.1182 | 5.8898 | 1390 | 0.6153 | 0.3226 | 0.6153 | 0.7844 |
| 0.1182 | 5.8983 | 1392 | 0.6131 | 0.3284 | 0.6131 | 0.7830 |
| 0.1182 | 5.9068 | 1394 | 0.6074 | 0.3284 | 0.6074 | 0.7793 |
| 0.1182 | 5.9153 | 1396 | 0.6053 | 0.3284 | 0.6053 | 0.7780 |
| 0.1182 | 5.9237 | 1398 | 0.6075 | 0.3284 | 0.6075 | 0.7794 |
| 0.1182 | 5.9322 | 1400 | 0.6062 | 0.4375 | 0.6062 | 0.7786 |
| 0.1182 | 5.9407 | 1402 | 0.6278 | 0.4923 | 0.6278 | 0.7923 |
| 0.1182 | 5.9492 | 1404 | 0.6666 | 0.4658 | 0.6666 | 0.8164 |
| 0.1182 | 5.9576 | 1406 | 0.6853 | 0.4658 | 0.6853 | 0.8278 |
| 0.1182 | 5.9661 | 1408 | 0.6770 | 0.4658 | 0.6770 | 0.8228 |
| 0.1182 | 5.9746 | 1410 | 0.6605 | 0.4375 | 0.6605 | 0.8127 |
| 0.1182 | 5.9831 | 1412 | 0.6621 | 0.3284 | 0.6621 | 0.8137 |
| 0.1182 | 5.9915 | 1414 | 0.6817 | 0.3284 | 0.6817 | 0.8257 |
| 0.1182 | 6.0 | 1416 | 0.7088 | 0.2817 | 0.7088 | 0.8419 |
| 0.1182 | 6.0085 | 1418 | 0.7196 | 0.2817 | 0.7196 | 0.8483 |
| 0.1182 | 6.0169 | 1420 | 0.7091 | 0.3333 | 0.7091 | 0.8421 |
| 0.1182 | 6.0254 | 1422 | 0.6999 | 0.3284 | 0.6999 | 0.8366 |
| 0.1182 | 6.0339 | 1424 | 0.6956 | 0.3284 | 0.6956 | 0.8340 |
| 0.1182 | 6.0424 | 1426 | 0.7107 | 0.3438 | 0.7107 | 0.8430 |
| 0.1182 | 6.0508 | 1428 | 0.7203 | 0.2727 | 0.7203 | 0.8487 |
| 0.1182 | 6.0593 | 1430 | 0.7055 | 0.2727 | 0.7055 | 0.8400 |
| 0.1182 | 6.0678 | 1432 | 0.6722 | 0.3438 | 0.6722 | 0.8199 |
| 0.1182 | 6.0763 | 1434 | 0.6329 | 0.3226 | 0.6329 | 0.7956 |
| 0.1182 | 6.0847 | 1436 | 0.6043 | 0.3226 | 0.6043 | 0.7774 |
| 0.1182 | 6.0932 | 1438 | 0.5784 | 0.3226 | 0.5784 | 0.7605 |
| 0.1182 | 6.1017 | 1440 | 0.5689 | 0.3226 | 0.5689 | 0.7542 |
| 0.1182 | 6.1102 | 1442 | 0.5725 | 0.3284 | 0.5725 | 0.7566 |
| 0.1182 | 6.1186 | 1444 | 0.5733 | 0.3226 | 0.5733 | 0.7572 |
| 0.1182 | 6.1271 | 1446 | 0.5670 | 0.3810 | 0.5670 | 0.7530 |
| 0.1182 | 6.1356 | 1448 | 0.5682 | 0.3810 | 0.5682 | 0.7538 |
| 0.1182 | 6.1441 | 1450 | 0.5834 | 0.4375 | 0.5834 | 0.7638 |
| 0.1182 | 6.1525 | 1452 | 0.6052 | 0.4923 | 0.6052 | 0.7779 |
| 0.1182 | 6.1610 | 1454 | 0.6078 | 0.4375 | 0.6078 | 0.7796 |
| 0.1182 | 6.1695 | 1456 | 0.6177 | 0.4375 | 0.6177 | 0.7859 |
| 0.1182 | 6.1780 | 1458 | 0.6264 | 0.4375 | 0.6264 | 0.7914 |
| 0.1182 | 6.1864 | 1460 | 0.6254 | 0.3810 | 0.6254 | 0.7908 |
| 0.1182 | 6.1949 | 1462 | 0.6332 | 0.2623 | 0.6332 | 0.7957 |
| 0.1182 | 6.2034 | 1464 | 0.6505 | 0.2623 | 0.6505 | 0.8066 |
| 0.1182 | 6.2119 | 1466 | 0.6630 | 0.2623 | 0.6630 | 0.8143 |
| 0.1182 | 6.2203 | 1468 | 0.6842 | 0.3226 | 0.6842 | 0.8272 |
| 0.1182 | 6.2288 | 1470 | 0.7012 | 0.3226 | 0.7012 | 0.8374 |
| 0.1182 | 6.2373 | 1472 | 0.7152 | 0.3226 | 0.7152 | 0.8457 |
| 0.1182 | 6.2458 | 1474 | 0.7227 | 0.3284 | 0.7227 | 0.8501 |
| 0.1182 | 6.2542 | 1476 | 0.7211 | 0.3284 | 0.7211 | 0.8492 |
| 0.1182 | 6.2627 | 1478 | 0.7199 | 0.2727 | 0.7199 | 0.8485 |
| 0.1182 | 6.2712 | 1480 | 0.7238 | 0.2727 | 0.7238 | 0.8508 |
| 0.1182 | 6.2797 | 1482 | 0.7248 | 0.2727 | 0.7248 | 0.8513 |
| 0.1182 | 6.2881 | 1484 | 0.7152 | 0.2727 | 0.7152 | 0.8457 |
| 0.1182 | 6.2966 | 1486 | 0.7034 | 0.2727 | 0.7034 | 0.8387 |
| 0.1182 | 6.3051 | 1488 | 0.6964 | 0.2727 | 0.6964 | 0.8345 |
| 0.1182 | 6.3136 | 1490 | 0.6893 | 0.2727 | 0.6893 | 0.8302 |
| 0.1182 | 6.3220 | 1492 | 0.6842 | 0.2817 | 0.6842 | 0.8272 |
| 0.1182 | 6.3305 | 1494 | 0.6795 | 0.2817 | 0.6795 | 0.8243 |
| 0.1182 | 6.3390 | 1496 | 0.6785 | 0.3143 | 0.6785 | 0.8237 |
| 0.1182 | 6.3475 | 1498 | 0.6782 | 0.3200 | 0.6782 | 0.8235 |
| 0.0761 | 6.3559 | 1500 | 0.6639 | 0.3143 | 0.6639 | 0.8148 |
| 0.0761 | 6.3644 | 1502 | 0.6365 | 0.2727 | 0.6365 | 0.7978 |
| 0.0761 | 6.3729 | 1504 | 0.6108 | 0.3226 | 0.6108 | 0.7815 |
| 0.0761 | 6.3814 | 1506 | 0.6022 | 0.3810 | 0.6022 | 0.7760 |
| 0.0761 | 6.3898 | 1508 | 0.6196 | 0.4545 | 0.6196 | 0.7872 |
| 0.0761 | 6.3983 | 1510 | 0.6434 | 0.2857 | 0.6434 | 0.8021 |
| 0.0761 | 6.4068 | 1512 | 0.6407 | 0.2857 | 0.6407 | 0.8004 |
| 0.0761 | 6.4153 | 1514 | 0.6096 | 0.3226 | 0.6096 | 0.7808 |
| 0.0761 | 6.4237 | 1516 | 0.5768 | 0.4923 | 0.5768 | 0.7595 |
| 0.0761 | 6.4322 | 1518 | 0.5610 | 0.3226 | 0.5610 | 0.7490 |
| 0.0761 | 6.4407 | 1520 | 0.5672 | 0.3226 | 0.5672 | 0.7531 |
| 0.0761 | 6.4492 | 1522 | 0.5877 | 0.3636 | 0.5877 | 0.7666 |
| 0.0761 | 6.4576 | 1524 | 0.5846 | 0.3662 | 0.5846 | 0.7646 |
| 0.0761 | 6.4661 | 1526 | 0.5749 | 0.3662 | 0.5749 | 0.7582 |
| 0.0761 | 6.4746 | 1528 | 0.5624 | 0.3636 | 0.5624 | 0.7499 |
| 0.0761 | 6.4831 | 1530 | 0.5549 | 0.3226 | 0.5549 | 0.7449 |
| 0.0761 | 6.4915 | 1532 | 0.5570 | 0.3810 | 0.5570 | 0.7463 |
| 0.0761 | 6.5 | 1534 | 0.5692 | 0.4375 | 0.5692 | 0.7545 |
| 0.0761 | 6.5085 | 1536 | 0.5842 | 0.4923 | 0.5842 | 0.7643 |
| 0.0761 | 6.5169 | 1538 | 0.5927 | 0.4923 | 0.5927 | 0.7699 |
| 0.0761 | 6.5254 | 1540 | 0.5905 | 0.4923 | 0.5905 | 0.7684 |
| 0.0761 | 6.5339 | 1542 | 0.5771 | 0.4923 | 0.5771 | 0.7597 |
| 0.0761 | 6.5424 | 1544 | 0.5658 | 0.4375 | 0.5658 | 0.7522 |
| 0.0761 | 6.5508 | 1546 | 0.5559 | 0.3810 | 0.5559 | 0.7456 |
| 0.0761 | 6.5593 | 1548 | 0.5539 | 0.3226 | 0.5539 | 0.7442 |
| 0.0761 | 6.5678 | 1550 | 0.5553 | 0.3226 | 0.5553 | 0.7452 |
| 0.0761 | 6.5763 | 1552 | 0.5599 | 0.3226 | 0.5599 | 0.7482 |
| 0.0761 | 6.5847 | 1554 | 0.5739 | 0.3226 | 0.5739 | 0.7576 |
| 0.0761 | 6.5932 | 1556 | 0.5842 | 0.3226 | 0.5842 | 0.7643 |
| 0.0761 | 6.6017 | 1558 | 0.5953 | 0.3226 | 0.5953 | 0.7716 |
| 0.0761 | 6.6102 | 1560 | 0.6007 | 0.3226 | 0.6007 | 0.7750 |
| 0.0761 | 6.6186 | 1562 | 0.6107 | 0.3636 | 0.6107 | 0.7815 |
| 0.0761 | 6.6271 | 1564 | 0.6314 | 0.3636 | 0.6314 | 0.7946 |
| 0.0761 | 6.6356 | 1566 | 0.6485 | 0.3636 | 0.6485 | 0.8053 |
| 0.0761 | 6.6441 | 1568 | 0.6547 | 0.3284 | 0.6547 | 0.8091 |
| 0.0761 | 6.6525 | 1570 | 0.6599 | 0.3284 | 0.6599 | 0.8123 |
| 0.0761 | 6.6610 | 1572 | 0.6664 | 0.3226 | 0.6664 | 0.8163 |
| 0.0761 | 6.6695 | 1574 | 0.6737 | 0.4375 | 0.6737 | 0.8208 |
| 0.0761 | 6.6780 | 1576 | 0.6831 | 0.2727 | 0.6831 | 0.8265 |
| 0.0761 | 6.6864 | 1578 | 0.6873 | 0.2727 | 0.6873 | 0.8290 |
| 0.0761 | 6.6949 | 1580 | 0.6716 | 0.2727 | 0.6716 | 0.8195 |
| 0.0761 | 6.7034 | 1582 | 0.6416 | 0.4375 | 0.6416 | 0.8010 |
| 0.0761 | 6.7119 | 1584 | 0.6165 | 0.4375 | 0.6165 | 0.7852 |
| 0.0761 | 6.7203 | 1586 | 0.5987 | 0.3226 | 0.5987 | 0.7737 |
| 0.0761 | 6.7288 | 1588 | 0.5888 | 0.3226 | 0.5888 | 0.7673 |
| 0.0761 | 6.7373 | 1590 | 0.5900 | 0.3226 | 0.5900 | 0.7681 |
| 0.0761 | 6.7458 | 1592 | 0.5943 | 0.3636 | 0.5943 | 0.7709 |
| 0.0761 | 6.7542 | 1594 | 0.5950 | 0.3636 | 0.5950 | 0.7713 |
| 0.0761 | 6.7627 | 1596 | 0.6000 | 0.3636 | 0.6000 | 0.7746 |
| 0.0761 | 6.7712 | 1598 | 0.6001 | 0.3284 | 0.6001 | 0.7747 |
| 0.0761 | 6.7797 | 1600 | 0.5980 | 0.3226 | 0.5980 | 0.7733 |
| 0.0761 | 6.7881 | 1602 | 0.5984 | 0.3226 | 0.5984 | 0.7735 |
| 0.0761 | 6.7966 | 1604 | 0.6032 | 0.3226 | 0.6032 | 0.7767 |
| 0.0761 | 6.8051 | 1606 | 0.6122 | 0.4923 | 0.6122 | 0.7824 |
| 0.0761 | 6.8136 | 1608 | 0.6182 | 0.4923 | 0.6182 | 0.7862 |
| 0.0761 | 6.8220 | 1610 | 0.6149 | 0.4923 | 0.6149 | 0.7842 |
| 0.0761 | 6.8305 | 1612 | 0.6095 | 0.4923 | 0.6095 | 0.7807 |
| 0.0761 | 6.8390 | 1614 | 0.6058 | 0.4923 | 0.6058 | 0.7784 |
| 0.0761 | 6.8475 | 1616 | 0.6007 | 0.3810 | 0.6007 | 0.7750 |
| 0.0761 | 6.8559 | 1618 | 0.6004 | 0.3226 | 0.6004 | 0.7749 |
| 0.0761 | 6.8644 | 1620 | 0.6104 | 0.3284 | 0.6104 | 0.7813 |
| 0.0761 | 6.8729 | 1622 | 0.6118 | 0.3636 | 0.6118 | 0.7822 |
| 0.0761 | 6.8814 | 1624 | 0.5972 | 0.3226 | 0.5972 | 0.7728 |
| 0.0761 | 6.8898 | 1626 | 0.5843 | 0.3226 | 0.5843 | 0.7644 |
| 0.0761 | 6.8983 | 1628 | 0.5764 | 0.3810 | 0.5764 | 0.7592 |
| 0.0761 | 6.9068 | 1630 | 0.5702 | 0.4375 | 0.5702 | 0.7551 |
| 0.0761 | 6.9153 | 1632 | 0.5607 | 0.4375 | 0.5607 | 0.7488 |
| 0.0761 | 6.9237 | 1634 | 0.5577 | 0.4375 | 0.5577 | 0.7468 |
| 0.0761 | 6.9322 | 1636 | 0.5498 | 0.4375 | 0.5498 | 0.7415 |
| 0.0761 | 6.9407 | 1638 | 0.5481 | 0.4375 | 0.5481 | 0.7404 |
| 0.0761 | 6.9492 | 1640 | 0.5504 | 0.4375 | 0.5504 | 0.7419 |
| 0.0761 | 6.9576 | 1642 | 0.5490 | 0.4375 | 0.5490 | 0.7409 |
| 0.0761 | 6.9661 | 1644 | 0.5446 | 0.4375 | 0.5446 | 0.7380 |
| 0.0761 | 6.9746 | 1646 | 0.5437 | 0.4375 | 0.5437 | 0.7374 |
| 0.0761 | 6.9831 | 1648 | 0.5388 | 0.4923 | 0.5388 | 0.7341 |
| 0.0761 | 6.9915 | 1650 | 0.5335 | 0.4923 | 0.5335 | 0.7304 |
| 0.0761 | 7.0 | 1652 | 0.5257 | 0.5455 | 0.5257 | 0.7250 |
| 0.0761 | 7.0085 | 1654 | 0.5207 | 0.5455 | 0.5207 | 0.7216 |
| 0.0761 | 7.0169 | 1656 | 0.5257 | 0.5075 | 0.5257 | 0.7250 |
| 0.0761 | 7.0254 | 1658 | 0.5319 | 0.5075 | 0.5319 | 0.7293 |
| 0.0761 | 7.0339 | 1660 | 0.5393 | 0.3824 | 0.5393 | 0.7344 |
| 0.0761 | 7.0424 | 1662 | 0.5380 | 0.3824 | 0.5380 | 0.7335 |
| 0.0761 | 7.0508 | 1664 | 0.5287 | 0.5075 | 0.5287 | 0.7271 |
| 0.0761 | 7.0593 | 1666 | 0.5204 | 0.5455 | 0.5204 | 0.7214 |
| 0.0761 | 7.0678 | 1668 | 0.5083 | 0.4923 | 0.5083 | 0.7130 |
| 0.0761 | 7.0763 | 1670 | 0.5037 | 0.4923 | 0.5037 | 0.7097 |
| 0.0761 | 7.0847 | 1672 | 0.4993 | 0.4375 | 0.4993 | 0.7066 |
| 0.0761 | 7.0932 | 1674 | 0.5096 | 0.3810 | 0.5096 | 0.7139 |
| 0.0761 | 7.1017 | 1676 | 0.5222 | 0.3810 | 0.5222 | 0.7227 |
| 0.0761 | 7.1102 | 1678 | 0.5338 | 0.3810 | 0.5338 | 0.7306 |
| 0.0761 | 7.1186 | 1680 | 0.5458 | 0.3810 | 0.5458 | 0.7388 |
| 0.0761 | 7.1271 | 1682 | 0.5626 | 0.4375 | 0.5626 | 0.7501 |
| 0.0761 | 7.1356 | 1684 | 0.5762 | 0.4375 | 0.5762 | 0.7591 |
| 0.0761 | 7.1441 | 1686 | 0.5809 | 0.4375 | 0.5809 | 0.7621 |
| 0.0761 | 7.1525 | 1688 | 0.5759 | 0.4375 | 0.5759 | 0.7589 |
| 0.0761 | 7.1610 | 1690 | 0.5665 | 0.4375 | 0.5665 | 0.7527 |
| 0.0761 | 7.1695 | 1692 | 0.5576 | 0.4375 | 0.5576 | 0.7468 |
| 0.0761 | 7.1780 | 1694 | 0.5476 | 0.4375 | 0.5476 | 0.7400 |
| 0.0761 | 7.1864 | 1696 | 0.5338 | 0.3810 | 0.5338 | 0.7306 |
| 0.0761 | 7.1949 | 1698 | 0.5226 | 0.3810 | 0.5226 | 0.7229 |
| 0.0761 | 7.2034 | 1700 | 0.5164 | 0.3810 | 0.5164 | 0.7186 |
| 0.0761 | 7.2119 | 1702 | 0.5203 | 0.3810 | 0.5203 | 0.7213 |
| 0.0761 | 7.2203 | 1704 | 0.5286 | 0.3810 | 0.5286 | 0.7271 |
| 0.0761 | 7.2288 | 1706 | 0.5403 | 0.3810 | 0.5403 | 0.7350 |
| 0.0761 | 7.2373 | 1708 | 0.5533 | 0.3810 | 0.5533 | 0.7439 |
| 0.0761 | 7.2458 | 1710 | 0.5738 | 0.4375 | 0.5738 | 0.7575 |
| 0.0761 | 7.2542 | 1712 | 0.5945 | 0.4375 | 0.5945 | 0.7710 |
| 0.0761 | 7.2627 | 1714 | 0.6127 | 0.4375 | 0.6127 | 0.7828 |
| 0.0761 | 7.2712 | 1716 | 0.6273 | 0.4375 | 0.6273 | 0.7920 |
| 0.0761 | 7.2797 | 1718 | 0.6315 | 0.3824 | 0.6315 | 0.7947 |
| 0.0761 | 7.2881 | 1720 | 0.6393 | 0.3284 | 0.6393 | 0.7996 |
| 0.0761 | 7.2966 | 1722 | 0.6532 | 0.3333 | 0.6532 | 0.8082 |
| 0.0761 | 7.3051 | 1724 | 0.6603 | 0.3662 | 0.6603 | 0.8126 |
| 0.0761 | 7.3136 | 1726 | 0.6492 | 0.3662 | 0.6492 | 0.8057 |
| 0.0761 | 7.3220 | 1728 | 0.6407 | 0.3662 | 0.6407 | 0.8004 |
| 0.0761 | 7.3305 | 1730 | 0.6292 | 0.3284 | 0.6292 | 0.7932 |
| 0.0761 | 7.3390 | 1732 | 0.6196 | 0.3284 | 0.6196 | 0.7871 |
| 0.0761 | 7.3475 | 1734 | 0.6152 | 0.3824 | 0.6152 | 0.7844 |
| 0.0761 | 7.3559 | 1736 | 0.6182 | 0.4348 | 0.6182 | 0.7863 |
| 0.0761 | 7.3644 | 1738 | 0.6266 | 0.4375 | 0.6266 | 0.7916 |
| 0.0761 | 7.3729 | 1740 | 0.6345 | 0.4375 | 0.6345 | 0.7965 |
| 0.0761 | 7.3814 | 1742 | 0.6431 | 0.4375 | 0.6431 | 0.8019 |
| 0.0761 | 7.3898 | 1744 | 0.6537 | 0.4375 | 0.6537 | 0.8085 |
| 0.0761 | 7.3983 | 1746 | 0.6661 | 0.4375 | 0.6661 | 0.8162 |
| 0.0761 | 7.4068 | 1748 | 0.6699 | 0.4375 | 0.6699 | 0.8185 |
| 0.0761 | 7.4153 | 1750 | 0.6641 | 0.4348 | 0.6641 | 0.8149 |
| 0.0761 | 7.4237 | 1752 | 0.6514 | 0.4348 | 0.6514 | 0.8071 |
| 0.0761 | 7.4322 | 1754 | 0.6419 | 0.4348 | 0.6419 | 0.8012 |
| 0.0761 | 7.4407 | 1756 | 0.6333 | 0.3824 | 0.6333 | 0.7958 |
| 0.0761 | 7.4492 | 1758 | 0.6277 | 0.3824 | 0.6277 | 0.7923 |
| 0.0761 | 7.4576 | 1760 | 0.6217 | 0.3824 | 0.6217 | 0.7885 |
| 0.0761 | 7.4661 | 1762 | 0.6154 | 0.4348 | 0.6154 | 0.7845 |
| 0.0761 | 7.4746 | 1764 | 0.6098 | 0.4923 | 0.6098 | 0.7809 |
| 0.0761 | 7.4831 | 1766 | 0.6056 | 0.4923 | 0.6056 | 0.7782 |
| 0.0761 | 7.4915 | 1768 | 0.6057 | 0.4923 | 0.6057 | 0.7783 |
| 0.0761 | 7.5 | 1770 | 0.6067 | 0.4923 | 0.6067 | 0.7789 |
| 0.0761 | 7.5085 | 1772 | 0.6063 | 0.4923 | 0.6063 | 0.7786 |
| 0.0761 | 7.5169 | 1774 | 0.6019 | 0.4923 | 0.6019 | 0.7758 |
| 0.0761 | 7.5254 | 1776 | 0.6033 | 0.4923 | 0.6033 | 0.7768 |
| 0.0761 | 7.5339 | 1778 | 0.6014 | 0.4923 | 0.6014 | 0.7755 |
| 0.0761 | 7.5424 | 1780 | 0.6011 | 0.4923 | 0.6011 | 0.7753 |
| 0.0761 | 7.5508 | 1782 | 0.6052 | 0.4923 | 0.6052 | 0.7780 |
| 0.0761 | 7.5593 | 1784 | 0.6065 | 0.4923 | 0.6065 | 0.7788 |
| 0.0761 | 7.5678 | 1786 | 0.6053 | 0.4923 | 0.6053 | 0.7780 |
| 0.0761 | 7.5763 | 1788 | 0.6000 | 0.4923 | 0.6000 | 0.7746 |
| 0.0761 | 7.5847 | 1790 | 0.5995 | 0.4375 | 0.5995 | 0.7743 |
| 0.0761 | 7.5932 | 1792 | 0.6027 | 0.4375 | 0.6027 | 0.7763 |
| 0.0761 | 7.6017 | 1794 | 0.6101 | 0.4375 | 0.6101 | 0.7811 |
| 0.0761 | 7.6102 | 1796 | 0.6191 | 0.4375 | 0.6191 | 0.7868 |
| 0.0761 | 7.6186 | 1798 | 0.6282 | 0.4375 | 0.6282 | 0.7926 |
| 0.0761 | 7.6271 | 1800 | 0.6391 | 0.4375 | 0.6391 | 0.7994 |
| 0.0761 | 7.6356 | 1802 | 0.6403 | 0.3810 | 0.6403 | 0.8002 |
| 0.0761 | 7.6441 | 1804 | 0.6376 | 0.3284 | 0.6376 | 0.7985 |
| 0.0761 | 7.6525 | 1806 | 0.6365 | 0.2727 | 0.6365 | 0.7978 |
| 0.0761 | 7.6610 | 1808 | 0.6347 | 0.2727 | 0.6347 | 0.7967 |
| 0.0761 | 7.6695 | 1810 | 0.6328 | 0.2727 | 0.6328 | 0.7955 |
| 0.0761 | 7.6780 | 1812 | 0.6305 | 0.2727 | 0.6305 | 0.7940 |
| 0.0761 | 7.6864 | 1814 | 0.6230 | 0.2727 | 0.6230 | 0.7893 |
| 0.0761 | 7.6949 | 1816 | 0.6129 | 0.2727 | 0.6129 | 0.7829 |
| 0.0761 | 7.7034 | 1818 | 0.6013 | 0.3226 | 0.6013 | 0.7754 |
| 0.0761 | 7.7119 | 1820 | 0.5941 | 0.4375 | 0.5941 | 0.7708 |
| 0.0761 | 7.7203 | 1822 | 0.5930 | 0.4375 | 0.5930 | 0.7701 |
| 0.0761 | 7.7288 | 1824 | 0.5931 | 0.4375 | 0.5931 | 0.7701 |
| 0.0761 | 7.7373 | 1826 | 0.5896 | 0.4375 | 0.5896 | 0.7678 |
| 0.0761 | 7.7458 | 1828 | 0.5879 | 0.4375 | 0.5879 | 0.7668 |
| 0.0761 | 7.7542 | 1830 | 0.5870 | 0.4375 | 0.5870 | 0.7661 |
| 0.0761 | 7.7627 | 1832 | 0.5868 | 0.4375 | 0.5868 | 0.7660 |
| 0.0761 | 7.7712 | 1834 | 0.5844 | 0.4375 | 0.5844 | 0.7644 |
| 0.0761 | 7.7797 | 1836 | 0.5813 | 0.3810 | 0.5813 | 0.7624 |
| 0.0761 | 7.7881 | 1838 | 0.5791 | 0.3810 | 0.5791 | 0.7610 |
| 0.0761 | 7.7966 | 1840 | 0.5842 | 0.3824 | 0.5842 | 0.7643 |
| 0.0761 | 7.8051 | 1842 | 0.5912 | 0.3284 | 0.5912 | 0.7689 |
| 0.0761 | 7.8136 | 1844 | 0.5948 | 0.3284 | 0.5948 | 0.7712 |
| 0.0761 | 7.8220 | 1846 | 0.6001 | 0.3824 | 0.6001 | 0.7746 |
| 0.0761 | 7.8305 | 1848 | 0.6085 | 0.3824 | 0.6085 | 0.7801 |
| 0.0761 | 7.8390 | 1850 | 0.6204 | 0.3824 | 0.6204 | 0.7876 |
| 0.0761 | 7.8475 | 1852 | 0.6288 | 0.4348 | 0.6288 | 0.7930 |
| 0.0761 | 7.8559 | 1854 | 0.6361 | 0.4348 | 0.6361 | 0.7976 |
| 0.0761 | 7.8644 | 1856 | 0.6405 | 0.4348 | 0.6405 | 0.8003 |
| 0.0761 | 7.8729 | 1858 | 0.6442 | 0.4348 | 0.6442 | 0.8026 |
| 0.0761 | 7.8814 | 1860 | 0.6478 | 0.4348 | 0.6478 | 0.8049 |
| 0.0761 | 7.8898 | 1862 | 0.6502 | 0.3824 | 0.6502 | 0.8064 |
| 0.0761 | 7.8983 | 1864 | 0.6502 | 0.2727 | 0.6502 | 0.8063 |
| 0.0761 | 7.9068 | 1866 | 0.6552 | 0.3143 | 0.6552 | 0.8095 |
| 0.0761 | 7.9153 | 1868 | 0.6536 | 0.3200 | 0.6536 | 0.8085 |
| 0.0761 | 7.9237 | 1870 | 0.6437 | 0.2727 | 0.6437 | 0.8023 |
| 0.0761 | 7.9322 | 1872 | 0.6314 | 0.3824 | 0.6314 | 0.7946 |
| 0.0761 | 7.9407 | 1874 | 0.6200 | 0.3824 | 0.6200 | 0.7874 |
| 0.0761 | 7.9492 | 1876 | 0.6131 | 0.3824 | 0.6131 | 0.7830 |
| 0.0761 | 7.9576 | 1878 | 0.6023 | 0.3810 | 0.6023 | 0.7761 |
| 0.0761 | 7.9661 | 1880 | 0.5966 | 0.4375 | 0.5966 | 0.7724 |
| 0.0761 | 7.9746 | 1882 | 0.5976 | 0.4923 | 0.5976 | 0.7731 |
| 0.0761 | 7.9831 | 1884 | 0.5977 | 0.4545 | 0.5977 | 0.7731 |
| 0.0761 | 7.9915 | 1886 | 0.5940 | 0.4545 | 0.5940 | 0.7707 |
| 0.0761 | 8.0 | 1888 | 0.5891 | 0.4923 | 0.5891 | 0.7675 |
| 0.0761 | 8.0085 | 1890 | 0.5837 | 0.4923 | 0.5837 | 0.7640 |
| 0.0761 | 8.0169 | 1892 | 0.5845 | 0.4375 | 0.5845 | 0.7645 |
| 0.0761 | 8.0254 | 1894 | 0.5862 | 0.4375 | 0.5862 | 0.7657 |
| 0.0761 | 8.0339 | 1896 | 0.5899 | 0.3810 | 0.5899 | 0.7681 |
| 0.0761 | 8.0424 | 1898 | 0.5969 | 0.3810 | 0.5969 | 0.7726 |
| 0.0761 | 8.0508 | 1900 | 0.6086 | 0.2727 | 0.6086 | 0.7801 |
| 0.0761 | 8.0593 | 1902 | 0.6231 | 0.2727 | 0.6231 | 0.7894 |
| 0.0761 | 8.0678 | 1904 | 0.6356 | 0.2727 | 0.6356 | 0.7973 |
| 0.0761 | 8.0763 | 1906 | 0.6427 | 0.2727 | 0.6427 | 0.8017 |
| 0.0761 | 8.0847 | 1908 | 0.6449 | 0.3284 | 0.6449 | 0.8030 |
| 0.0761 | 8.0932 | 1910 | 0.6483 | 0.3284 | 0.6483 | 0.8052 |
| 0.0761 | 8.1017 | 1912 | 0.6483 | 0.3284 | 0.6483 | 0.8052 |
| 0.0761 | 8.1102 | 1914 | 0.6444 | 0.3284 | 0.6444 | 0.8028 |
| 0.0761 | 8.1186 | 1916 | 0.6467 | 0.4348 | 0.6467 | 0.8042 |
| 0.0761 | 8.1271 | 1918 | 0.6494 | 0.4348 | 0.6494 | 0.8059 |
| 0.0761 | 8.1356 | 1920 | 0.6475 | 0.4375 | 0.6475 | 0.8047 |
| 0.0761 | 8.1441 | 1922 | 0.6456 | 0.4375 | 0.6456 | 0.8035 |
| 0.0761 | 8.1525 | 1924 | 0.6516 | 0.4375 | 0.6516 | 0.8072 |
| 0.0761 | 8.1610 | 1926 | 0.6575 | 0.3077 | 0.6575 | 0.8109 |
| 0.0761 | 8.1695 | 1928 | 0.6634 | 0.3077 | 0.6634 | 0.8145 |
| 0.0761 | 8.1780 | 1930 | 0.6640 | 0.3077 | 0.6640 | 0.8149 |
| 0.0761 | 8.1864 | 1932 | 0.6639 | 0.3077 | 0.6639 | 0.8148 |
| 0.0761 | 8.1949 | 1934 | 0.6602 | 0.4375 | 0.6602 | 0.8125 |
| 0.0761 | 8.2034 | 1936 | 0.6562 | 0.4348 | 0.6562 | 0.8101 |
| 0.0761 | 8.2119 | 1938 | 0.6491 | 0.4348 | 0.6491 | 0.8057 |
| 0.0761 | 8.2203 | 1940 | 0.6374 | 0.4348 | 0.6374 | 0.7984 |
| 0.0761 | 8.2288 | 1942 | 0.6262 | 0.3824 | 0.6262 | 0.7913 |
| 0.0761 | 8.2373 | 1944 | 0.6149 | 0.3824 | 0.6149 | 0.7842 |
| 0.0761 | 8.2458 | 1946 | 0.6040 | 0.3824 | 0.6040 | 0.7772 |
| 0.0761 | 8.2542 | 1948 | 0.5938 | 0.3824 | 0.5938 | 0.7706 |
| 0.0761 | 8.2627 | 1950 | 0.5816 | 0.3810 | 0.5816 | 0.7627 |
| 0.0761 | 8.2712 | 1952 | 0.5750 | 0.4375 | 0.5750 | 0.7583 |
| 0.0761 | 8.2797 | 1954 | 0.5721 | 0.4375 | 0.5721 | 0.7564 |
| 0.0761 | 8.2881 | 1956 | 0.5722 | 0.4375 | 0.5722 | 0.7564 |
| 0.0761 | 8.2966 | 1958 | 0.5754 | 0.4923 | 0.5754 | 0.7585 |
| 0.0761 | 8.3051 | 1960 | 0.5776 | 0.4923 | 0.5776 | 0.7600 |
| 0.0761 | 8.3136 | 1962 | 0.5793 | 0.4923 | 0.5793 | 0.7611 |
| 0.0761 | 8.3220 | 1964 | 0.5859 | 0.4923 | 0.5859 | 0.7654 |
| 0.0761 | 8.3305 | 1966 | 0.5928 | 0.4923 | 0.5928 | 0.7699 |
| 0.0761 | 8.3390 | 1968 | 0.5970 | 0.4923 | 0.5970 | 0.7726 |
| 0.0761 | 8.3475 | 1970 | 0.6003 | 0.4375 | 0.6003 | 0.7748 |
| 0.0761 | 8.3559 | 1972 | 0.6055 | 0.4375 | 0.6055 | 0.7781 |
| 0.0761 | 8.3644 | 1974 | 0.6117 | 0.4348 | 0.6117 | 0.7821 |
| 0.0761 | 8.3729 | 1976 | 0.6208 | 0.4348 | 0.6208 | 0.7879 |
| 0.0761 | 8.3814 | 1978 | 0.6308 | 0.3284 | 0.6308 | 0.7943 |
| 0.0761 | 8.3898 | 1980 | 0.6399 | 0.3284 | 0.6399 | 0.7999 |
| 0.0761 | 8.3983 | 1982 | 0.6441 | 0.3284 | 0.6441 | 0.8025 |
| 0.0761 | 8.4068 | 1984 | 0.6424 | 0.3284 | 0.6424 | 0.8015 |
| 0.0761 | 8.4153 | 1986 | 0.6408 | 0.3284 | 0.6408 | 0.8005 |
| 0.0761 | 8.4237 | 1988 | 0.6375 | 0.3824 | 0.6375 | 0.7985 |
| 0.0761 | 8.4322 | 1990 | 0.6356 | 0.4348 | 0.6356 | 0.7972 |
| 0.0761 | 8.4407 | 1992 | 0.6344 | 0.4348 | 0.6344 | 0.7965 |
| 0.0761 | 8.4492 | 1994 | 0.6352 | 0.4348 | 0.6352 | 0.7970 |
| 0.0761 | 8.4576 | 1996 | 0.6335 | 0.4348 | 0.6335 | 0.7959 |
| 0.0761 | 8.4661 | 1998 | 0.6312 | 0.4348 | 0.6312 | 0.7945 |
| 0.0545 | 8.4746 | 2000 | 0.6289 | 0.4348 | 0.6289 | 0.7930 |
| 0.0545 | 8.4831 | 2002 | 0.6272 | 0.4348 | 0.6272 | 0.7920 |
| 0.0545 | 8.4915 | 2004 | 0.6278 | 0.4348 | 0.6278 | 0.7924 |
| 0.0545 | 8.5 | 2006 | 0.6281 | 0.4348 | 0.6281 | 0.7925 |
| 0.0545 | 8.5085 | 2008 | 0.6271 | 0.4348 | 0.6271 | 0.7919 |
| 0.0545 | 8.5169 | 2010 | 0.6285 | 0.4348 | 0.6285 | 0.7928 |
| 0.0545 | 8.5254 | 2012 | 0.6306 | 0.4348 | 0.6306 | 0.7941 |
| 0.0545 | 8.5339 | 2014 | 0.6314 | 0.4348 | 0.6314 | 0.7946 |
| 0.0545 | 8.5424 | 2016 | 0.6306 | 0.4348 | 0.6306 | 0.7941 |
| 0.0545 | 8.5508 | 2018 | 0.6317 | 0.4348 | 0.6317 | 0.7948 |
| 0.0545 | 8.5593 | 2020 | 0.6329 | 0.4348 | 0.6329 | 0.7956 |
| 0.0545 | 8.5678 | 2022 | 0.6365 | 0.3824 | 0.6365 | 0.7978 |
| 0.0545 | 8.5763 | 2024 | 0.6372 | 0.3824 | 0.6372 | 0.7983 |
| 0.0545 | 8.5847 | 2026 | 0.6366 | 0.3824 | 0.6366 | 0.7979 |
| 0.0545 | 8.5932 | 2028 | 0.6332 | 0.3824 | 0.6332 | 0.7957 |
| 0.0545 | 8.6017 | 2030 | 0.6304 | 0.3824 | 0.6304 | 0.7940 |
| 0.0545 | 8.6102 | 2032 | 0.6244 | 0.4348 | 0.6244 | 0.7902 |
| 0.0545 | 8.6186 | 2034 | 0.6173 | 0.4348 | 0.6173 | 0.7857 |
| 0.0545 | 8.6271 | 2036 | 0.6133 | 0.4348 | 0.6133 | 0.7832 |
| 0.0545 | 8.6356 | 2038 | 0.6086 | 0.4348 | 0.6086 | 0.7801 |
| 0.0545 | 8.6441 | 2040 | 0.6026 | 0.4375 | 0.6026 | 0.7762 |
| 0.0545 | 8.6525 | 2042 | 0.5979 | 0.4375 | 0.5979 | 0.7732 |
| 0.0545 | 8.6610 | 2044 | 0.5960 | 0.4375 | 0.5960 | 0.7720 |
| 0.0545 | 8.6695 | 2046 | 0.5952 | 0.4375 | 0.5952 | 0.7715 |
| 0.0545 | 8.6780 | 2048 | 0.5949 | 0.4348 | 0.5949 | 0.7713 |
| 0.0545 | 8.6864 | 2050 | 0.5932 | 0.4348 | 0.5932 | 0.7702 |
| 0.0545 | 8.6949 | 2052 | 0.5915 | 0.3824 | 0.5915 | 0.7691 |
| 0.0545 | 8.7034 | 2054 | 0.5930 | 0.3284 | 0.5930 | 0.7701 |
| 0.0545 | 8.7119 | 2056 | 0.5955 | 0.3284 | 0.5955 | 0.7717 |
| 0.0545 | 8.7203 | 2058 | 0.5995 | 0.3284 | 0.5995 | 0.7743 |
| 0.0545 | 8.7288 | 2060 | 0.6003 | 0.3824 | 0.6003 | 0.7748 |
| 0.0545 | 8.7373 | 2062 | 0.6009 | 0.4348 | 0.6009 | 0.7752 |
| 0.0545 | 8.7458 | 2064 | 0.6014 | 0.4348 | 0.6014 | 0.7755 |
| 0.0545 | 8.7542 | 2066 | 0.6039 | 0.4348 | 0.6039 | 0.7771 |
| 0.0545 | 8.7627 | 2068 | 0.6098 | 0.4348 | 0.6098 | 0.7809 |
| 0.0545 | 8.7712 | 2070 | 0.6170 | 0.4348 | 0.6170 | 0.7855 |
| 0.0545 | 8.7797 | 2072 | 0.6213 | 0.4348 | 0.6213 | 0.7882 |
| 0.0545 | 8.7881 | 2074 | 0.6244 | 0.4348 | 0.6244 | 0.7902 |
| 0.0545 | 8.7966 | 2076 | 0.6256 | 0.4348 | 0.6256 | 0.7910 |
| 0.0545 | 8.8051 | 2078 | 0.6287 | 0.4348 | 0.6287 | 0.7929 |
| 0.0545 | 8.8136 | 2080 | 0.6324 | 0.4348 | 0.6324 | 0.7953 |
| 0.0545 | 8.8220 | 2082 | 0.6369 | 0.4348 | 0.6369 | 0.7981 |
| 0.0545 | 8.8305 | 2084 | 0.6377 | 0.4348 | 0.6377 | 0.7986 |
| 0.0545 | 8.8390 | 2086 | 0.6365 | 0.4348 | 0.6365 | 0.7978 |
| 0.0545 | 8.8475 | 2088 | 0.6343 | 0.4348 | 0.6343 | 0.7964 |
| 0.0545 | 8.8559 | 2090 | 0.6307 | 0.4348 | 0.6307 | 0.7941 |
| 0.0545 | 8.8644 | 2092 | 0.6266 | 0.4348 | 0.6266 | 0.7916 |
| 0.0545 | 8.8729 | 2094 | 0.6250 | 0.4348 | 0.6250 | 0.7905 |
| 0.0545 | 8.8814 | 2096 | 0.6221 | 0.4348 | 0.6221 | 0.7887 |
| 0.0545 | 8.8898 | 2098 | 0.6173 | 0.4348 | 0.6173 | 0.7857 |
| 0.0545 | 8.8983 | 2100 | 0.6146 | 0.4348 | 0.6146 | 0.7840 |
| 0.0545 | 8.9068 | 2102 | 0.6141 | 0.4375 | 0.6141 | 0.7836 |
| 0.0545 | 8.9153 | 2104 | 0.6139 | 0.4375 | 0.6139 | 0.7835 |
| 0.0545 | 8.9237 | 2106 | 0.6115 | 0.4375 | 0.6115 | 0.7820 |
| 0.0545 | 8.9322 | 2108 | 0.6081 | 0.4375 | 0.6081 | 0.7798 |
| 0.0545 | 8.9407 | 2110 | 0.6071 | 0.4375 | 0.6071 | 0.7792 |
| 0.0545 | 8.9492 | 2112 | 0.6059 | 0.4375 | 0.6059 | 0.7784 |
| 0.0545 | 8.9576 | 2114 | 0.6039 | 0.4375 | 0.6039 | 0.7771 |
| 0.0545 | 8.9661 | 2116 | 0.6028 | 0.4375 | 0.6028 | 0.7764 |
| 0.0545 | 8.9746 | 2118 | 0.6005 | 0.4375 | 0.6005 | 0.7749 |
| 0.0545 | 8.9831 | 2120 | 0.5984 | 0.4375 | 0.5984 | 0.7736 |
| 0.0545 | 8.9915 | 2122 | 0.5969 | 0.4375 | 0.5969 | 0.7726 |
| 0.0545 | 9.0 | 2124 | 0.5961 | 0.4375 | 0.5961 | 0.7721 |
| 0.0545 | 9.0085 | 2126 | 0.5976 | 0.4375 | 0.5976 | 0.7730 |
| 0.0545 | 9.0169 | 2128 | 0.5990 | 0.4375 | 0.5990 | 0.7739 |
| 0.0545 | 9.0254 | 2130 | 0.6007 | 0.4375 | 0.6007 | 0.7751 |
| 0.0545 | 9.0339 | 2132 | 0.6034 | 0.4375 | 0.6034 | 0.7768 |
| 0.0545 | 9.0424 | 2134 | 0.6079 | 0.4348 | 0.6079 | 0.7797 |
| 0.0545 | 9.0508 | 2136 | 0.6117 | 0.3824 | 0.6117 | 0.7821 |
| 0.0545 | 9.0593 | 2138 | 0.6132 | 0.3824 | 0.6132 | 0.7831 |
| 0.0545 | 9.0678 | 2140 | 0.6141 | 0.3284 | 0.6141 | 0.7836 |
| 0.0545 | 9.0763 | 2142 | 0.6150 | 0.3284 | 0.6150 | 0.7842 |
| 0.0545 | 9.0847 | 2144 | 0.6157 | 0.3284 | 0.6157 | 0.7847 |
| 0.0545 | 9.0932 | 2146 | 0.6145 | 0.3284 | 0.6145 | 0.7839 |
| 0.0545 | 9.1017 | 2148 | 0.6114 | 0.3636 | 0.6114 | 0.7819 |
| 0.0545 | 9.1102 | 2150 | 0.6073 | 0.3284 | 0.6073 | 0.7793 |
| 0.0545 | 9.1186 | 2152 | 0.6043 | 0.3284 | 0.6043 | 0.7774 |
| 0.0545 | 9.1271 | 2154 | 0.6010 | 0.3824 | 0.6010 | 0.7753 |
| 0.0545 | 9.1356 | 2156 | 0.5978 | 0.3824 | 0.5978 | 0.7731 |
| 0.0545 | 9.1441 | 2158 | 0.5959 | 0.3824 | 0.5959 | 0.7719 |
| 0.0545 | 9.1525 | 2160 | 0.5927 | 0.3824 | 0.5927 | 0.7698 |
| 0.0545 | 9.1610 | 2162 | 0.5894 | 0.3810 | 0.5894 | 0.7677 |
| 0.0545 | 9.1695 | 2164 | 0.5851 | 0.3810 | 0.5851 | 0.7649 |
| 0.0545 | 9.1780 | 2166 | 0.5836 | 0.4375 | 0.5836 | 0.7639 |
| 0.0545 | 9.1864 | 2168 | 0.5844 | 0.4375 | 0.5844 | 0.7644 |
| 0.0545 | 9.1949 | 2170 | 0.5858 | 0.4375 | 0.5858 | 0.7654 |
| 0.0545 | 9.2034 | 2172 | 0.5858 | 0.4375 | 0.5858 | 0.7653 |
| 0.0545 | 9.2119 | 2174 | 0.5853 | 0.4375 | 0.5853 | 0.7650 |
| 0.0545 | 9.2203 | 2176 | 0.5859 | 0.3810 | 0.5859 | 0.7655 |
| 0.0545 | 9.2288 | 2178 | 0.5871 | 0.3810 | 0.5871 | 0.7662 |
| 0.0545 | 9.2373 | 2180 | 0.5880 | 0.4375 | 0.5880 | 0.7668 |
| 0.0545 | 9.2458 | 2182 | 0.5897 | 0.4375 | 0.5897 | 0.7679 |
| 0.0545 | 9.2542 | 2184 | 0.5922 | 0.4375 | 0.5922 | 0.7696 |
| 0.0545 | 9.2627 | 2186 | 0.5933 | 0.4375 | 0.5933 | 0.7702 |
| 0.0545 | 9.2712 | 2188 | 0.5942 | 0.4375 | 0.5942 | 0.7709 |
| 0.0545 | 9.2797 | 2190 | 0.5960 | 0.4375 | 0.5960 | 0.7720 |
| 0.0545 | 9.2881 | 2192 | 0.5963 | 0.4375 | 0.5963 | 0.7722 |
| 0.0545 | 9.2966 | 2194 | 0.5954 | 0.4375 | 0.5954 | 0.7716 |
| 0.0545 | 9.3051 | 2196 | 0.5944 | 0.4375 | 0.5944 | 0.7710 |
| 0.0545 | 9.3136 | 2198 | 0.5941 | 0.4375 | 0.5941 | 0.7708 |
| 0.0545 | 9.3220 | 2200 | 0.5947 | 0.4375 | 0.5947 | 0.7712 |
| 0.0545 | 9.3305 | 2202 | 0.5952 | 0.4375 | 0.5952 | 0.7715 |
| 0.0545 | 9.3390 | 2204 | 0.5975 | 0.3810 | 0.5975 | 0.7730 |
| 0.0545 | 9.3475 | 2206 | 0.5999 | 0.3824 | 0.5999 | 0.7745 |
| 0.0545 | 9.3559 | 2208 | 0.6006 | 0.3824 | 0.6006 | 0.7750 |
| 0.0545 | 9.3644 | 2210 | 0.6008 | 0.3824 | 0.6008 | 0.7751 |
| 0.0545 | 9.3729 | 2212 | 0.6000 | 0.3824 | 0.6000 | 0.7746 |
| 0.0545 | 9.3814 | 2214 | 0.5988 | 0.3824 | 0.5988 | 0.7738 |
| 0.0545 | 9.3898 | 2216 | 0.5978 | 0.3824 | 0.5978 | 0.7732 |
| 0.0545 | 9.3983 | 2218 | 0.5988 | 0.3824 | 0.5988 | 0.7738 |
| 0.0545 | 9.4068 | 2220 | 0.5992 | 0.3824 | 0.5992 | 0.7741 |
| 0.0545 | 9.4153 | 2222 | 0.5989 | 0.3824 | 0.5989 | 0.7739 |
| 0.0545 | 9.4237 | 2224 | 0.5981 | 0.3824 | 0.5981 | 0.7734 |
| 0.0545 | 9.4322 | 2226 | 0.5988 | 0.3824 | 0.5988 | 0.7738 |
| 0.0545 | 9.4407 | 2228 | 0.5998 | 0.3824 | 0.5998 | 0.7745 |
| 0.0545 | 9.4492 | 2230 | 0.6009 | 0.3824 | 0.6009 | 0.7752 |
| 0.0545 | 9.4576 | 2232 | 0.6005 | 0.3824 | 0.6005 | 0.7749 |
| 0.0545 | 9.4661 | 2234 | 0.5995 | 0.3824 | 0.5995 | 0.7743 |
| 0.0545 | 9.4746 | 2236 | 0.5980 | 0.3824 | 0.5980 | 0.7733 |
| 0.0545 | 9.4831 | 2238 | 0.5959 | 0.3824 | 0.5959 | 0.7720 |
| 0.0545 | 9.4915 | 2240 | 0.5943 | 0.3824 | 0.5943 | 0.7709 |
| 0.0545 | 9.5 | 2242 | 0.5930 | 0.3810 | 0.5930 | 0.7700 |
| 0.0545 | 9.5085 | 2244 | 0.5927 | 0.4375 | 0.5927 | 0.7699 |
| 0.0545 | 9.5169 | 2246 | 0.5933 | 0.4375 | 0.5933 | 0.7702 |
| 0.0545 | 9.5254 | 2248 | 0.5944 | 0.4375 | 0.5944 | 0.7710 |
| 0.0545 | 9.5339 | 2250 | 0.5955 | 0.4375 | 0.5955 | 0.7717 |
| 0.0545 | 9.5424 | 2252 | 0.5964 | 0.4375 | 0.5964 | 0.7723 |
| 0.0545 | 9.5508 | 2254 | 0.5977 | 0.4375 | 0.5977 | 0.7731 |
| 0.0545 | 9.5593 | 2256 | 0.5982 | 0.4375 | 0.5982 | 0.7734 |
| 0.0545 | 9.5678 | 2258 | 0.5987 | 0.4375 | 0.5987 | 0.7738 |
| 0.0545 | 9.5763 | 2260 | 0.5993 | 0.4375 | 0.5993 | 0.7741 |
| 0.0545 | 9.5847 | 2262 | 0.6000 | 0.4375 | 0.6000 | 0.7746 |
| 0.0545 | 9.5932 | 2264 | 0.6002 | 0.4375 | 0.6002 | 0.7747 |
| 0.0545 | 9.6017 | 2266 | 0.6000 | 0.4375 | 0.6000 | 0.7746 |
| 0.0545 | 9.6102 | 2268 | 0.5995 | 0.4375 | 0.5995 | 0.7742 |
| 0.0545 | 9.6186 | 2270 | 0.5989 | 0.4375 | 0.5989 | 0.7739 |
| 0.0545 | 9.6271 | 2272 | 0.5989 | 0.4348 | 0.5989 | 0.7739 |
| 0.0545 | 9.6356 | 2274 | 0.5989 | 0.3824 | 0.5989 | 0.7739 |
| 0.0545 | 9.6441 | 2276 | 0.5985 | 0.3824 | 0.5985 | 0.7736 |
| 0.0545 | 9.6525 | 2278 | 0.5986 | 0.3824 | 0.5986 | 0.7737 |
| 0.0545 | 9.6610 | 2280 | 0.5986 | 0.3824 | 0.5986 | 0.7737 |
| 0.0545 | 9.6695 | 2282 | 0.5993 | 0.3824 | 0.5993 | 0.7742 |
| 0.0545 | 9.6780 | 2284 | 0.6002 | 0.3824 | 0.6002 | 0.7747 |
| 0.0545 | 9.6864 | 2286 | 0.6009 | 0.3824 | 0.6009 | 0.7752 |
| 0.0545 | 9.6949 | 2288 | 0.6014 | 0.3824 | 0.6014 | 0.7755 |
| 0.0545 | 9.7034 | 2290 | 0.6012 | 0.3824 | 0.6012 | 0.7754 |
| 0.0545 | 9.7119 | 2292 | 0.6012 | 0.3824 | 0.6012 | 0.7754 |
| 0.0545 | 9.7203 | 2294 | 0.6011 | 0.3824 | 0.6011 | 0.7753 |
| 0.0545 | 9.7288 | 2296 | 0.6011 | 0.3824 | 0.6011 | 0.7753 |
| 0.0545 | 9.7373 | 2298 | 0.6010 | 0.3824 | 0.6010 | 0.7752 |
| 0.0545 | 9.7458 | 2300 | 0.6011 | 0.3824 | 0.6011 | 0.7753 |
| 0.0545 | 9.7542 | 2302 | 0.6011 | 0.3824 | 0.6011 | 0.7753 |
| 0.0545 | 9.7627 | 2304 | 0.6012 | 0.3824 | 0.6012 | 0.7753 |
| 0.0545 | 9.7712 | 2306 | 0.6010 | 0.3824 | 0.6010 | 0.7752 |
| 0.0545 | 9.7797 | 2308 | 0.6006 | 0.3824 | 0.6006 | 0.7750 |
| 0.0545 | 9.7881 | 2310 | 0.6011 | 0.3824 | 0.6011 | 0.7753 |
| 0.0545 | 9.7966 | 2312 | 0.6013 | 0.3824 | 0.6013 | 0.7754 |
| 0.0545 | 9.8051 | 2314 | 0.6014 | 0.3824 | 0.6014 | 0.7755 |
| 0.0545 | 9.8136 | 2316 | 0.6015 | 0.3824 | 0.6015 | 0.7756 |
| 0.0545 | 9.8220 | 2318 | 0.6017 | 0.3824 | 0.6017 | 0.7757 |
| 0.0545 | 9.8305 | 2320 | 0.6020 | 0.3824 | 0.6020 | 0.7759 |
| 0.0545 | 9.8390 | 2322 | 0.6019 | 0.3824 | 0.6019 | 0.7758 |
| 0.0545 | 9.8475 | 2324 | 0.6021 | 0.3824 | 0.6021 | 0.7759 |
| 0.0545 | 9.8559 | 2326 | 0.6022 | 0.3824 | 0.6022 | 0.7760 |
| 0.0545 | 9.8644 | 2328 | 0.6021 | 0.3824 | 0.6021 | 0.7759 |
| 0.0545 | 9.8729 | 2330 | 0.6020 | 0.3824 | 0.6020 | 0.7759 |
| 0.0545 | 9.8814 | 2332 | 0.6021 | 0.3824 | 0.6021 | 0.7760 |
| 0.0545 | 9.8898 | 2334 | 0.6022 | 0.3824 | 0.6022 | 0.7760 |
| 0.0545 | 9.8983 | 2336 | 0.6023 | 0.3824 | 0.6023 | 0.7761 |
| 0.0545 | 9.9068 | 2338 | 0.6022 | 0.3824 | 0.6022 | 0.7760 |
| 0.0545 | 9.9153 | 2340 | 0.6022 | 0.3824 | 0.6022 | 0.7760 |
| 0.0545 | 9.9237 | 2342 | 0.6022 | 0.3824 | 0.6022 | 0.7760 |
| 0.0545 | 9.9322 | 2344 | 0.6022 | 0.3824 | 0.6022 | 0.7760 |
| 0.0545 | 9.9407 | 2346 | 0.6020 | 0.3824 | 0.6020 | 0.7759 |
| 0.0545 | 9.9492 | 2348 | 0.6019 | 0.3824 | 0.6019 | 0.7758 |
| 0.0545 | 9.9576 | 2350 | 0.6017 | 0.3824 | 0.6017 | 0.7757 |
| 0.0545 | 9.9661 | 2352 | 0.6015 | 0.3824 | 0.6015 | 0.7756 |
| 0.0545 | 9.9746 | 2354 | 0.6014 | 0.3824 | 0.6014 | 0.7755 |
| 0.0545 | 9.9831 | 2356 | 0.6013 | 0.3824 | 0.6013 | 0.7755 |
| 0.0545 | 9.9915 | 2358 | 0.6013 | 0.3824 | 0.6013 | 0.7754 |
| 0.0545 | 10.0 | 2360 | 0.6013 | 0.3824 | 0.6013 | 0.7754 |
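The Qwk, Mse, and Rmse columns above are presumably quadratic-weighted Cohen's kappa, mean squared error, and root mean squared error over ordinal scores. A minimal sketch of how such metrics can be reproduced with scikit-learn, assuming integer gold and predicted labels (the card does not publish its evaluation code):

```python
# Hypothetical labels -- the card does not document its evaluation data.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 1, 0]
y_pred = [0, 1, 1, 1, 0]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk column
mse = mean_squared_error(y_true, y_pred)                      # Mse column
rmse = mse ** 0.5                                             # Rmse column
print(qwk, mse, rmse)
```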
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Shinyaaa/Travel-05-v1-on-RPC-10-v1 | Shinyaaa | 2024-11-25T07:04:40Z | 103 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T07:04:12Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
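A minimal sketch, assuming the checkpoint is used as a BERT feature extractor (the repo tags list `bert` and `feature-extraction`); the input sentence is illustrative:

```python
import torch
from transformers import AutoTokenizer, AutoModel

repo_id = "Shinyaaa/Travel-05-v1-on-RPC-10-v1"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

inputs = tokenizer("A sample travel query.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden state into one sentence embedding.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```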
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
bjbjbj/business-news-generator | bjbjbj | 2024-11-25T07:04:38Z | 138 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"base_model:HuggingFaceTB/SmolLM-135M",
"base_model:finetune:HuggingFaceTB/SmolLM-135M",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-23T07:10:14Z | ---
library_name: transformers
license: apache-2.0
base_model: HuggingFaceTB/SmolLM-135M
tags:
- generated_from_trainer
model-index:
- name: business-news-generator
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# business-news-generator
This model is a fine-tuned version of [HuggingFaceTB/SmolLM-135M](https://huggingface.co/HuggingFaceTB/SmolLM-135M) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.2262
## Model description
More information needed
## Intended uses & limitations
More information needed
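A minimal inference sketch for this checkpoint; the prompt string is illustrative rather than taken from the (undocumented) training data:

```python
from transformers import pipeline

generator = pipeline("text-generation", model="bjbjbj/business-news-generator")

# Sample a short continuation from the fine-tuned SmolLM checkpoint.
out = generator("Breaking business news:", max_new_tokens=60, do_sample=True)
print(out[0]["generated_text"])
```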
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.1445 | 0.32 | 200 | 3.3104 |
| 2.8324 | 0.64 | 400 | 3.2118 |
| 2.6586 | 0.96 | 600 | 3.0967 |
| 1.6904 | 1.28 | 800 | 3.2338 |
| 1.5063 | 1.6 | 1000 | 3.2210 |
| 1.4548 | 1.92 | 1200 | 3.2262 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1
- Datasets 2.16.1
- Tokenizers 0.20.3
|
jaewon0916/xlm-roberta-base-finetuned-panx-it | jaewon0916 | 2024-11-25T07:01:06Z | 134 | 0 | transformers | [
"transformers",
"safetensors",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-25T06:58:18Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-it
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-it
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7532
- F1: 0.3009
## Model description
More information needed
## Intended uses & limitations
More information needed
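A minimal sketch, assuming the checkpoint is an Italian NER tagger in the PAN-X style (suggested by the repo name and the `token-classification` tag); the example sentence is illustrative:

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="jaewon0916/xlm-roberta-base-finetuned-panx-it",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)
print(ner("Leonardo da Vinci è nato in Toscana."))
```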
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.2305 | 1.0 | 70 | 1.0231 | 0.1365 |
| 0.9253 | 2.0 | 140 | 0.8115 | 0.2711 |
| 0.7816 | 3.0 | 210 | 0.7532 | 0.3009 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Nguyen17/Dev50 | Nguyen17 | 2024-11-25T07:00:37Z | 30 | 0 | diffusers | [
"diffusers",
"safetensors",
"trl",
"ddpo",
"reinforcement-learning",
"text-to-image",
"stable-diffusion",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | 2024-11-25T06:59:28Z | ---
license: apache-2.0
tags:
- trl
- ddpo
- diffusers
- reinforcement-learning
- text-to-image
- stable-diffusion
---
# TRL DDPO Model
This is a diffusion model that has been fine-tuned with reinforcement learning to guide the model outputs according to a value function or human feedback. The model can be used for image generation conditioned on text.
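A minimal sketch of loading the fine-tuned pipeline with 🤗 Diffusers, assuming the full pipeline was pushed to this repo (the tags list `diffusers:StableDiffusionPipeline`); the prompt and dtype/device choices are illustrative:

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Nguyen17/Dev50", torch_dtype=torch.float16
).to("cuda")  # assumes a CUDA GPU is available

image = pipe("a watercolor fox in a forest").images[0]
image.save("sample.png")
```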
|
jaewon0916/xlm-roberta-base-finetuned-panx-fr | jaewon0916 | 2024-11-25T06:58:08Z | 124 | 0 | transformers | [
"transformers",
"safetensors",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-25T06:53:37Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-fr
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-fr
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6818
- F1: 0.4052
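The F1 above is typically the entity-level micro F1 over IOB tag sequences for token-classification Trainer runs; a minimal sketch of that computation with seqeval, using hypothetical tag sequences (the card does not publish its evaluation code):

```python
from seqeval.metrics import f1_score

# Hypothetical gold and predicted IOB sequences, one list per sentence.
y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "O"]]
print(f1_score(y_true, y_pred))
```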
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.0766 | 1.0 | 191 | 0.8136 | 0.2819 |
| 0.7658 | 2.0 | 382 | 0.7302 | 0.3516 |
| 0.6456 | 3.0 | 573 | 0.6818 | 0.4052 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
bjbjbj/classifier-chapter4 | bjbjbj | 2024-11-25T06:57:02Z | 106 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T07:00:56Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: classifier-chapter4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# classifier-chapter4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2394
- Accuracy: 0.9261
- F1: 0.9260
## Model description
More information needed
## Intended uses & limitations
More information needed
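A minimal usage sketch; the input sentence is illustrative, and the label set comes from the checkpoint config since the card does not document the dataset:

```python
from transformers import pipeline

clf = pipeline("text-classification", model="bjbjbj/classifier-chapter4", top_k=None)
print(clf("I am thrilled with these results!"))  # scores for every label
```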
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 313 | 0.2599 | 0.9105 | 0.9102 |
| 0.2993 | 2.0 | 626 | 0.2394 | 0.9261 | 0.9260 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1
- Datasets 2.16.1
- Tokenizers 0.20.3
|
Rich-J/subnet29_upload_c02_N25_0 | Rich-J | 2024-11-25T06:48:37Z | 35 | 0 | transformers | [
"transformers",
"safetensors",
"phi3",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T06:44:49Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
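A minimal sketch, assuming a causal language-model checkpoint (the tags list `phi3` and `text-generation`); the prompt is illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Rich-J/subnet29_upload_c02_N25_0"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```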
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
cogifa/roberta-base-klue-ynat-classification | cogifa | 2024-11-25T06:48:09Z | 106 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T06:47:09Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
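A minimal sketch, assuming a KLUE-YNAT topic classifier (per the repo name); the Korean headline is illustrative and the label names come from the checkpoint config:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "cogifa/roberta-base-klue-ynat-classification"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("삼성전자, 새로운 반도체 공장 착공", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]

pred = probs.argmax().item()
print(model.config.id2label.get(pred, pred), probs[pred].item())
```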
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
BigHuggyD/TheDrummer_Behemoth-123B-v2.2_exl2_4.0bpw_h6 | BigHuggyD | 2024-11-25T06:33:16Z | 12 | 0 | null | [
"safetensors",
"mistral",
"license:other",
"4-bit",
"exl2",
"region:us"
] | null | 2024-11-25T06:29:12Z | ---
license: other
---
# Join our Discord! https://discord.gg/Nbv9pQ88Xb
## Nearly 2500 members strong 💪
### Now with more channels! A hub for creatives and makers alike!
---
[BeaverAI](https://huggingface.co/BeaverAI) proudly presents...
# Behemoth 123B v2.2 🦣
> Nothing in the void is foreign to us. The place we go is the place we belong.

## Links
- Original: https://huggingface.co/TheDrummer/Behemoth-123B-v2.2
- GGUF: https://huggingface.co/TheDrummer/Behemoth-123B-v2.2-GGUF
- iMatrix: https://huggingface.co/bartowski/Behemoth-123B-v2.2-GGUF (recommended for smaller quants)
## Description
Behemoth v2.x is a finetune of the new Largestral 2411 with system prompt support. Testers have noted that **everything** felt improved.
### Usage
Testers say this frankenformat maximizes the model's potential: **Metharme** with Mistral's new system tokens
- `[SYSTEM_PROMPT] <|system|>{{system_message}}[/SYSTEM_PROMPT]<|user|>{{user_message}}<|model|>{{assistant_message}}`
- `<|system|>[SYSTEM_PROMPT] {{system_message}}[/SYSTEM_PROMPT]<|user|>{{user_message}}<|model|>{{assistant_message}}`
*Take note that the opening system tag `[SYSTEM_PROMPT]` SHOULD ALWAYS be followed by a whitespace, as shown in the templates above.*
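A minimal sketch that renders the first frankenformat variant above into a ready-to-complete prompt; the helper name and example strings are illustrative:

```python
def build_prompt(system_message: str, user_message: str) -> str:
    # Metharme tags wrapped in Mistral's new system tokens, per the card.
    return (
        f"[SYSTEM_PROMPT] <|system|>{system_message}[/SYSTEM_PROMPT]"
        f"<|user|>{user_message}<|model|>"
    )

print(build_prompt("You are a helpful storyteller.", "Tell me about the void."))
```

Generation then continues after the final `<|model|>` tag.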
Complete SillyTavern Settings in BeaverAI Club: https://discord.com/channels/1238219753324281886/1309968730301792370/1309968730301792370
### Versions
- [v2.0](https://huggingface.co/TheDrummer/Behemoth-123B-v2) is equivalent to Behemoth v1.0 (Classic)
- [v2.1](https://huggingface.co/TheDrummer/Behemoth-123B-v2.1) is equivalent to Behemoth v1.1 (Creative Boost)
- [v2.2](https://huggingface.co/TheDrummer/Behemoth-123B-v2.2) is an improvement of Behemoth v2.1 (Creative++)
## Special Thanks
Thank you to each and everyone who donated/subscribed in [Ko-Fi](https://ko-fi.com/thedrummer) 🙇 I hope to never disappoint!
```
Toasty Pigeon
theguywhogamesalot
Grozi
F
Marinara
Ko-fi Supporter
Grozi
Phaelon
ONTHEREDTEAM
EvarinSharath'fe(USM-Valor)
Silva
Dakkidaze
AlexTheVP
Pseudo
Kistara
Dr. Fjut
Grozi 🥈
KinjiHakari777
dustywintr
Syd
HumbleConsumer
Syd
Ko-fi Supporter
Arkamist
joe 🥇
Toad
Lied
Konnect
Kistara
Grozi 🥉
SleepDeprived3
Luigi
Nestor
```
https://ko-fi.com/thedrummer/leaderboard
```
Finetuned by yours truly,
Drummer
```

|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k25_task2_organization_fold0 | MayBashendy | 2024-11-25T06:30:11Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T06:20:52Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k25_task2_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k25_task2_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4392
- Qwk: 0.4809
- Mse: 0.4392
- Rmse: 0.6627
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
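A minimal sketch mapping these hyperparameters onto 🤗 `TrainingArguments`; `output_dir` and any setting not listed above are assumptions:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Arabic_FineTuningAraBERT_AugV5_k25_task2_organization_fold0",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```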
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0113 | 2 | 3.2414 | 0.0 | 3.2414 | 1.8004 |
| No log | 0.0226 | 4 | 2.5741 | 0.0455 | 2.5741 | 1.6044 |
| No log | 0.0339 | 6 | 1.5514 | -0.0328 | 1.5514 | 1.2456 |
| No log | 0.0452 | 8 | 0.8388 | 0.0 | 0.8388 | 0.9159 |
| No log | 0.0565 | 10 | 0.8797 | 0.0335 | 0.8797 | 0.9379 |
| No log | 0.0678 | 12 | 1.0452 | 0.0 | 1.0452 | 1.0223 |
| No log | 0.0791 | 14 | 1.2591 | 0.0 | 1.2591 | 1.1221 |
| No log | 0.0904 | 16 | 1.6081 | 0.0173 | 1.6081 | 1.2681 |
| No log | 0.1017 | 18 | 1.6282 | 0.0901 | 1.6282 | 1.2760 |
| No log | 0.1130 | 20 | 1.6987 | 0.2288 | 1.6987 | 1.3033 |
| No log | 0.1243 | 22 | 1.2547 | 0.0 | 1.2547 | 1.1201 |
| No log | 0.1356 | 24 | 0.9827 | 0.0 | 0.9827 | 0.9913 |
| No log | 0.1469 | 26 | 0.8649 | 0.0 | 0.8649 | 0.9300 |
| No log | 0.1582 | 28 | 1.0445 | 0.0 | 1.0445 | 1.0220 |
| No log | 0.1695 | 30 | 1.4237 | 0.0 | 1.4237 | 1.1932 |
| No log | 0.1808 | 32 | 1.3904 | 0.0 | 1.3904 | 1.1792 |
| No log | 0.1921 | 34 | 1.2406 | 0.0 | 1.2406 | 1.1138 |
| No log | 0.2034 | 36 | 1.2689 | 0.0 | 1.2689 | 1.1265 |
| No log | 0.2147 | 38 | 1.1627 | 0.0 | 1.1627 | 1.0783 |
| No log | 0.2260 | 40 | 0.7983 | 0.0 | 0.7983 | 0.8935 |
| No log | 0.2373 | 42 | 0.7508 | 0.2821 | 0.7508 | 0.8665 |
| No log | 0.2486 | 44 | 0.7582 | 0.1564 | 0.7582 | 0.8708 |
| No log | 0.2599 | 46 | 0.7744 | 0.1564 | 0.7744 | 0.8800 |
| No log | 0.2712 | 48 | 0.8455 | 0.0 | 0.8455 | 0.9195 |
| No log | 0.2825 | 50 | 1.0165 | 0.0 | 1.0165 | 1.0082 |
| No log | 0.2938 | 52 | 1.0678 | 0.0 | 1.0678 | 1.0333 |
| No log | 0.3051 | 54 | 1.2480 | 0.0 | 1.2480 | 1.1171 |
| No log | 0.3164 | 56 | 1.6066 | 0.0965 | 1.6066 | 1.2675 |
| No log | 0.3277 | 58 | 1.7120 | 0.1600 | 1.7120 | 1.3084 |
| No log | 0.3390 | 60 | 1.7195 | 0.1600 | 1.7195 | 1.3113 |
| No log | 0.3503 | 62 | 1.5850 | 0.0466 | 1.5850 | 1.2590 |
| No log | 0.3616 | 64 | 1.2942 | 0.0 | 1.2942 | 1.1376 |
| No log | 0.3729 | 66 | 0.9887 | 0.0 | 0.9887 | 0.9943 |
| No log | 0.3842 | 68 | 0.8069 | 0.1213 | 0.8069 | 0.8983 |
| No log | 0.3955 | 70 | 1.0145 | 0.1564 | 1.0145 | 1.0072 |
| No log | 0.4068 | 72 | 1.3736 | 0.2057 | 1.3736 | 1.1720 |
| No log | 0.4181 | 74 | 1.4391 | 0.1934 | 1.4391 | 1.1996 |
| No log | 0.4294 | 76 | 1.0152 | 0.1923 | 1.0152 | 1.0075 |
| No log | 0.4407 | 78 | 0.8406 | 0.3197 | 0.8406 | 0.9169 |
| No log | 0.4520 | 80 | 0.8244 | 0.3197 | 0.8244 | 0.9079 |
| No log | 0.4633 | 82 | 0.9664 | 0.1923 | 0.9664 | 0.9831 |
| No log | 0.4746 | 84 | 1.0293 | 0.1564 | 1.0293 | 1.0146 |
| No log | 0.4859 | 86 | 0.9284 | 0.1564 | 0.9284 | 0.9635 |
| No log | 0.4972 | 88 | 0.7896 | 0.2821 | 0.7896 | 0.8886 |
| No log | 0.5085 | 90 | 0.6214 | 0.3197 | 0.6214 | 0.7883 |
| No log | 0.5198 | 92 | 0.5258 | 0.3695 | 0.5258 | 0.7251 |
| No log | 0.5311 | 94 | 0.5250 | 0.4407 | 0.5250 | 0.7245 |
| No log | 0.5424 | 96 | 0.5335 | 0.4407 | 0.5335 | 0.7304 |
| No log | 0.5537 | 98 | 0.5150 | 0.4976 | 0.5150 | 0.7176 |
| No log | 0.5650 | 100 | 0.5363 | 0.4112 | 0.5363 | 0.7323 |
| No log | 0.5763 | 102 | 0.6601 | 0.4120 | 0.6601 | 0.8124 |
| No log | 0.5876 | 104 | 0.9470 | 0.2476 | 0.9470 | 0.9732 |
| No log | 0.5989 | 106 | 1.0452 | 0.2476 | 1.0452 | 1.0223 |
| No log | 0.6102 | 108 | 1.0612 | 0.1923 | 1.0612 | 1.0302 |
| No log | 0.6215 | 110 | 0.9987 | 0.0 | 0.9987 | 0.9994 |
| No log | 0.6328 | 112 | 0.8322 | 0.2821 | 0.8322 | 0.9123 |
| No log | 0.6441 | 114 | 0.6804 | 0.3581 | 0.6804 | 0.8249 |
| No log | 0.6554 | 116 | 0.6311 | 0.1941 | 0.6311 | 0.7944 |
| No log | 0.6667 | 118 | 0.6328 | 0.1941 | 0.6328 | 0.7955 |
| No log | 0.6780 | 120 | 0.6713 | 0.2500 | 0.6713 | 0.8193 |
| No log | 0.6893 | 122 | 0.7995 | 0.4040 | 0.7995 | 0.8942 |
| No log | 0.7006 | 124 | 0.8639 | 0.4040 | 0.8639 | 0.9295 |
| No log | 0.7119 | 126 | 0.6866 | 0.2500 | 0.6866 | 0.8286 |
| No log | 0.7232 | 128 | 0.6175 | 0.3288 | 0.6175 | 0.7858 |
| No log | 0.7345 | 130 | 0.5690 | 0.3288 | 0.5690 | 0.7543 |
| No log | 0.7458 | 132 | 0.5508 | 0.3695 | 0.5508 | 0.7422 |
| No log | 0.7571 | 134 | 0.5416 | 0.4112 | 0.5416 | 0.7359 |
| No log | 0.7684 | 136 | 0.5838 | 0.3837 | 0.5838 | 0.7640 |
| No log | 0.7797 | 138 | 0.8243 | 0.1923 | 0.8243 | 0.9079 |
| No log | 0.7910 | 140 | 0.9018 | 0.1923 | 0.9018 | 0.9496 |
| No log | 0.8023 | 142 | 0.8673 | 0.1923 | 0.8673 | 0.9313 |
| No log | 0.8136 | 144 | 0.6219 | 0.4371 | 0.6219 | 0.7886 |
| No log | 0.8249 | 146 | 0.5932 | 0.3288 | 0.5932 | 0.7702 |
| No log | 0.8362 | 148 | 0.7514 | 0.2794 | 0.7514 | 0.8668 |
| No log | 0.8475 | 150 | 0.7653 | 0.2794 | 0.7653 | 0.8748 |
| No log | 0.8588 | 152 | 0.6623 | 0.3131 | 0.6623 | 0.8138 |
| No log | 0.8701 | 154 | 0.6382 | 0.3745 | 0.6382 | 0.7989 |
| No log | 0.8814 | 156 | 0.7028 | 0.4576 | 0.7028 | 0.8383 |
| No log | 0.8927 | 158 | 0.7373 | 0.4576 | 0.7373 | 0.8587 |
| No log | 0.9040 | 160 | 0.6450 | 0.3000 | 0.6450 | 0.8031 |
| No log | 0.9153 | 162 | 0.7201 | 0.0567 | 0.7201 | 0.8486 |
| No log | 0.9266 | 164 | 0.8674 | 0.2373 | 0.8674 | 0.9313 |
| No log | 0.9379 | 166 | 0.8617 | 0.0957 | 0.8617 | 0.9283 |
| No log | 0.9492 | 168 | 0.8417 | -0.0678 | 0.8417 | 0.9175 |
| No log | 0.9605 | 170 | 0.7874 | 0.0957 | 0.7874 | 0.8873 |
| No log | 0.9718 | 172 | 0.8247 | 0.0957 | 0.8247 | 0.9081 |
| No log | 0.9831 | 174 | 0.8265 | 0.1168 | 0.8265 | 0.9091 |
| No log | 0.9944 | 176 | 0.8412 | 0.0957 | 0.8412 | 0.9172 |
| No log | 1.0056 | 178 | 0.8865 | 0.0957 | 0.8865 | 0.9415 |
| No log | 1.0169 | 180 | 1.0339 | -0.0553 | 1.0339 | 1.0168 |
| No log | 1.0282 | 182 | 1.0656 | -0.0553 | 1.0656 | 1.0323 |
| No log | 1.0395 | 184 | 0.9069 | 0.1340 | 0.9069 | 0.9523 |
| No log | 1.0508 | 186 | 0.6893 | 0.2373 | 0.6893 | 0.8303 |
| No log | 1.0621 | 188 | 0.6527 | 0.3000 | 0.6527 | 0.8079 |
| No log | 1.0734 | 190 | 0.9295 | 0.2476 | 0.9295 | 0.9641 |
| No log | 1.0847 | 192 | 1.0650 | 0.2197 | 1.0650 | 1.0320 |
| No log | 1.0960 | 194 | 0.8959 | 0.2476 | 0.8959 | 0.9465 |
| No log | 1.1073 | 196 | 0.6693 | 0.3197 | 0.6693 | 0.8181 |
| No log | 1.1186 | 198 | 0.5554 | 0.4638 | 0.5554 | 0.7453 |
| No log | 1.1299 | 200 | 0.5749 | 0.2533 | 0.5749 | 0.7582 |
| No log | 1.1412 | 202 | 0.6095 | 0.2759 | 0.6095 | 0.7807 |
| No log | 1.1525 | 204 | 0.6001 | 0.2364 | 0.6001 | 0.7747 |
| No log | 1.1638 | 206 | 0.5762 | 0.2851 | 0.5762 | 0.7591 |
| No log | 1.1751 | 208 | 0.8031 | 0.4717 | 0.8031 | 0.8962 |
| No log | 1.1864 | 210 | 1.0008 | 0.3895 | 1.0008 | 1.0004 |
| No log | 1.1977 | 212 | 0.9781 | 0.3895 | 0.9781 | 0.9890 |
| No log | 1.2090 | 214 | 0.8263 | 0.3011 | 0.8263 | 0.9090 |
| No log | 1.2203 | 216 | 0.7056 | 0.1783 | 0.7056 | 0.8400 |
| No log | 1.2316 | 218 | 0.7353 | 0.1978 | 0.7353 | 0.8575 |
| No log | 1.2429 | 220 | 0.7483 | 0.1558 | 0.7483 | 0.8650 |
| No log | 1.2542 | 222 | 0.7097 | 0.1978 | 0.7097 | 0.8424 |
| No log | 1.2655 | 224 | 0.6746 | 0.1962 | 0.6746 | 0.8213 |
| No log | 1.2768 | 226 | 0.7119 | 0.1370 | 0.7119 | 0.8438 |
| No log | 1.2881 | 228 | 0.7503 | 0.2286 | 0.7503 | 0.8662 |
| No log | 1.2994 | 230 | 0.7623 | 0.2286 | 0.7623 | 0.8731 |
| No log | 1.3107 | 232 | 0.6817 | 0.1755 | 0.6817 | 0.8257 |
| No log | 1.3220 | 234 | 0.6406 | 0.3467 | 0.6406 | 0.8004 |
| No log | 1.3333 | 236 | 0.6629 | 0.1978 | 0.6629 | 0.8142 |
| No log | 1.3446 | 238 | 0.6949 | 0.1978 | 0.6949 | 0.8336 |
| No log | 1.3559 | 240 | 0.7094 | 0.1978 | 0.7094 | 0.8422 |
| No log | 1.3672 | 242 | 0.7175 | 0.3662 | 0.7175 | 0.8470 |
| No log | 1.3785 | 244 | 0.7420 | 0.2862 | 0.7420 | 0.8614 |
| No log | 1.3898 | 246 | 0.7761 | 0.2782 | 0.7761 | 0.8810 |
| No log | 1.4011 | 248 | 0.7792 | 0.3182 | 0.7792 | 0.8827 |
| No log | 1.4124 | 250 | 0.7226 | 0.2553 | 0.7226 | 0.8501 |
| No log | 1.4237 | 252 | 0.6524 | 0.2323 | 0.6524 | 0.8077 |
| No log | 1.4350 | 254 | 0.6175 | 0.3077 | 0.6175 | 0.7858 |
| No log | 1.4463 | 256 | 0.6222 | 0.3318 | 0.6222 | 0.7888 |
| No log | 1.4576 | 258 | 0.6229 | 0.3318 | 0.6229 | 0.7893 |
| No log | 1.4689 | 260 | 0.6164 | 0.2921 | 0.6164 | 0.7851 |
| No log | 1.4802 | 262 | 0.6150 | 0.3077 | 0.6150 | 0.7842 |
| No log | 1.4915 | 264 | 0.6486 | 0.1755 | 0.6486 | 0.8053 |
| No log | 1.5028 | 266 | 0.6637 | 0.2125 | 0.6637 | 0.8147 |
| No log | 1.5141 | 268 | 0.6362 | 0.1783 | 0.6362 | 0.7977 |
| No log | 1.5254 | 270 | 0.6199 | 0.1978 | 0.6199 | 0.7873 |
| No log | 1.5367 | 272 | 0.6465 | 0.2588 | 0.6465 | 0.8041 |
| No log | 1.5480 | 274 | 0.6589 | 0.2588 | 0.6589 | 0.8117 |
| No log | 1.5593 | 276 | 0.6344 | 0.2759 | 0.6344 | 0.7965 |
| No log | 1.5706 | 278 | 0.6187 | 0.3163 | 0.6187 | 0.7866 |
| No log | 1.5819 | 280 | 0.6372 | 0.3636 | 0.6372 | 0.7982 |
| No log | 1.5932 | 282 | 0.6221 | 0.2588 | 0.6221 | 0.7888 |
| No log | 1.6045 | 284 | 0.5790 | 0.3163 | 0.5790 | 0.7609 |
| No log | 1.6158 | 286 | 0.5939 | 0.0339 | 0.5939 | 0.7706 |
| No log | 1.6271 | 288 | 0.6373 | 0.2725 | 0.6373 | 0.7983 |
| No log | 1.6384 | 290 | 0.6570 | 0.3288 | 0.6570 | 0.8105 |
| No log | 1.6497 | 292 | 0.6150 | 0.3131 | 0.6150 | 0.7842 |
| No log | 1.6610 | 294 | 0.5544 | 0.2184 | 0.5544 | 0.7446 |
| No log | 1.6723 | 296 | 0.5367 | 0.1340 | 0.5367 | 0.7326 |
| No log | 1.6836 | 298 | 0.5432 | 0.0679 | 0.5432 | 0.7370 |
| No log | 1.6949 | 300 | 0.5260 | 0.2184 | 0.5260 | 0.7253 |
| No log | 1.7062 | 302 | 0.4961 | 0.2794 | 0.4961 | 0.7043 |
| No log | 1.7175 | 304 | 0.4855 | 0.5093 | 0.4855 | 0.6968 |
| No log | 1.7288 | 306 | 0.4743 | 0.5935 | 0.4743 | 0.6887 |
| No log | 1.7401 | 308 | 0.4807 | 0.2364 | 0.4807 | 0.6933 |
| No log | 1.7514 | 310 | 0.5060 | 0.2364 | 0.5060 | 0.7113 |
| No log | 1.7627 | 312 | 0.5331 | 0.2186 | 0.5331 | 0.7301 |
| No log | 1.7740 | 314 | 0.5857 | 0.3687 | 0.5857 | 0.7653 |
| No log | 1.7853 | 316 | 0.5590 | 0.3636 | 0.5590 | 0.7477 |
| No log | 1.7966 | 318 | 0.4658 | 0.3318 | 0.4658 | 0.6825 |
| No log | 1.8079 | 320 | 0.4378 | 0.4522 | 0.4378 | 0.6616 |
| No log | 1.8192 | 322 | 0.4629 | 0.4638 | 0.4629 | 0.6804 |
| No log | 1.8305 | 324 | 0.4675 | 0.4129 | 0.4675 | 0.6837 |
| No log | 1.8418 | 326 | 0.4903 | 0.4638 | 0.4903 | 0.7002 |
| No log | 1.8531 | 328 | 0.5181 | 0.4368 | 0.5181 | 0.7198 |
| No log | 1.8644 | 330 | 0.5264 | 0.3875 | 0.5264 | 0.7256 |
| No log | 1.8757 | 332 | 0.5062 | 0.4129 | 0.5062 | 0.7115 |
| No log | 1.8870 | 334 | 0.4889 | 0.4000 | 0.4889 | 0.6992 |
| No log | 1.8983 | 336 | 0.5004 | 0.3724 | 0.5004 | 0.7074 |
| No log | 1.9096 | 338 | 0.5241 | 0.3824 | 0.5241 | 0.7240 |
| No log | 1.9209 | 340 | 0.5282 | 0.3390 | 0.5282 | 0.7268 |
| No log | 1.9322 | 342 | 0.5235 | 0.3824 | 0.5235 | 0.7235 |
| No log | 1.9435 | 344 | 0.5077 | 0.3971 | 0.5077 | 0.7126 |
| No log | 1.9548 | 346 | 0.5250 | 0.3875 | 0.5250 | 0.7246 |
| No log | 1.9661 | 348 | 0.5554 | 0.4023 | 0.5554 | 0.7453 |
| No log | 1.9774 | 350 | 0.6009 | 0.3684 | 0.6009 | 0.7752 |
| No log | 1.9887 | 352 | 0.5896 | 0.4023 | 0.5896 | 0.7678 |
| No log | 2.0 | 354 | 0.5744 | 0.4023 | 0.5744 | 0.7579 |
| No log | 2.0113 | 356 | 0.5643 | 0.2851 | 0.5643 | 0.7512 |
| No log | 2.0226 | 358 | 0.5758 | 0.3226 | 0.5758 | 0.7588 |
| No log | 2.0339 | 360 | 0.6377 | 0.1985 | 0.6377 | 0.7986 |
| No log | 2.0452 | 362 | 0.6693 | 0.1985 | 0.6693 | 0.8181 |
| No log | 2.0565 | 364 | 0.6380 | 0.3163 | 0.6380 | 0.7988 |
| No log | 2.0678 | 366 | 0.5837 | 0.3467 | 0.5837 | 0.7640 |
| No log | 2.0791 | 368 | 0.6123 | 0.3563 | 0.6123 | 0.7825 |
| No log | 2.0904 | 370 | 0.6975 | 0.3527 | 0.6975 | 0.8352 |
| No log | 2.1017 | 372 | 0.7080 | 0.3527 | 0.7080 | 0.8415 |
| No log | 2.1130 | 374 | 0.6293 | 0.4146 | 0.6293 | 0.7933 |
| No log | 2.1243 | 376 | 0.5666 | 0.3368 | 0.5666 | 0.7528 |
| No log | 2.1356 | 378 | 0.5912 | 0.3163 | 0.5912 | 0.7689 |
| No log | 2.1469 | 380 | 0.6305 | 0.3163 | 0.6305 | 0.7940 |
| No log | 2.1582 | 382 | 0.6629 | 0.2588 | 0.6629 | 0.8142 |
| No log | 2.1695 | 384 | 0.6556 | 0.2588 | 0.6556 | 0.8097 |
| No log | 2.1808 | 386 | 0.6180 | 0.1340 | 0.6180 | 0.7861 |
| No log | 2.1921 | 388 | 0.5951 | 0.1985 | 0.5951 | 0.7714 |
| No log | 2.2034 | 390 | 0.5959 | 0.2373 | 0.5959 | 0.7720 |
| No log | 2.2147 | 392 | 0.6003 | 0.2323 | 0.6003 | 0.7748 |
| No log | 2.2260 | 394 | 0.5922 | 0.2696 | 0.5922 | 0.7696 |
| No log | 2.2373 | 396 | 0.5862 | 0.4024 | 0.5862 | 0.7656 |
| No log | 2.2486 | 398 | 0.5922 | 0.4394 | 0.5922 | 0.7696 |
| No log | 2.2599 | 400 | 0.6419 | 0.3163 | 0.6419 | 0.8012 |
| No log | 2.2712 | 402 | 0.7166 | 0.5 | 0.7166 | 0.8465 |
| No log | 2.2825 | 404 | 0.7347 | 0.5 | 0.7347 | 0.8571 |
| No log | 2.2938 | 406 | 0.6779 | 0.4154 | 0.6779 | 0.8234 |
| No log | 2.3051 | 408 | 0.5721 | 0.2364 | 0.5721 | 0.7564 |
| No log | 2.3164 | 410 | 0.5161 | 0.3971 | 0.5161 | 0.7184 |
| No log | 2.3277 | 412 | 0.4998 | 0.4112 | 0.4998 | 0.7070 |
| No log | 2.3390 | 414 | 0.5170 | 0.4023 | 0.5170 | 0.7190 |
| No log | 2.3503 | 416 | 0.5748 | 0.3473 | 0.5748 | 0.7582 |
| No log | 2.3616 | 418 | 0.5682 | 0.5009 | 0.5682 | 0.7538 |
| No log | 2.3729 | 420 | 0.4735 | 0.4474 | 0.4735 | 0.6881 |
| No log | 2.3842 | 422 | 0.4254 | 0.3546 | 0.4254 | 0.6522 |
| No log | 2.3955 | 424 | 0.4990 | 0.5157 | 0.4990 | 0.7064 |
| No log | 2.4068 | 426 | 0.6519 | 0.4154 | 0.6519 | 0.8074 |
| No log | 2.4181 | 428 | 0.6684 | 0.4154 | 0.6684 | 0.8176 |
| No log | 2.4294 | 430 | 0.6012 | 0.4690 | 0.6012 | 0.7754 |
| No log | 2.4407 | 432 | 0.5473 | 0.4690 | 0.5473 | 0.7398 |
| No log | 2.4520 | 434 | 0.5048 | 0.4024 | 0.5048 | 0.7105 |
| No log | 2.4633 | 436 | 0.5006 | 0.3787 | 0.5006 | 0.7075 |
| No log | 2.4746 | 438 | 0.5077 | 0.2851 | 0.5077 | 0.7125 |
| No log | 2.4859 | 440 | 0.5033 | 0.2851 | 0.5033 | 0.7095 |
| No log | 2.4972 | 442 | 0.5035 | 0.2851 | 0.5035 | 0.7096 |
| No log | 2.5085 | 444 | 0.4914 | 0.2553 | 0.4914 | 0.7010 |
| No log | 2.5198 | 446 | 0.4781 | 0.4247 | 0.4781 | 0.6914 |
| No log | 2.5311 | 448 | 0.4639 | 0.4247 | 0.4639 | 0.6811 |
| No log | 2.5424 | 450 | 0.4465 | 0.4247 | 0.4465 | 0.6682 |
| No log | 2.5537 | 452 | 0.4480 | 0.4247 | 0.4480 | 0.6693 |
| No log | 2.5650 | 454 | 0.4443 | 0.4247 | 0.4443 | 0.6666 |
| No log | 2.5763 | 456 | 0.4214 | 0.4665 | 0.4214 | 0.6492 |
| No log | 2.5876 | 458 | 0.4237 | 0.4247 | 0.4237 | 0.6509 |
| No log | 2.5989 | 460 | 0.4264 | 0.4665 | 0.4264 | 0.6530 |
| No log | 2.6102 | 462 | 0.4188 | 0.4665 | 0.4188 | 0.6472 |
| No log | 2.6215 | 464 | 0.4198 | 0.4665 | 0.4198 | 0.6479 |
| No log | 2.6328 | 466 | 0.4194 | 0.4665 | 0.4194 | 0.6476 |
| No log | 2.6441 | 468 | 0.4145 | 0.4665 | 0.4145 | 0.6438 |
| No log | 2.6554 | 470 | 0.4371 | 0.4407 | 0.4371 | 0.6612 |
| No log | 2.6667 | 472 | 0.4583 | 0.4268 | 0.4583 | 0.6770 |
| No log | 2.6780 | 474 | 0.4664 | 0.4268 | 0.4664 | 0.6829 |
| No log | 2.6893 | 476 | 0.4519 | 0.5753 | 0.4519 | 0.6723 |
| No log | 2.7006 | 478 | 0.4481 | 0.5227 | 0.4481 | 0.6694 |
| No log | 2.7119 | 480 | 0.4425 | 0.4809 | 0.4425 | 0.6652 |
| No log | 2.7232 | 482 | 0.4314 | 0.5333 | 0.4314 | 0.6568 |
| No log | 2.7345 | 484 | 0.4255 | 0.5846 | 0.4255 | 0.6523 |
| No log | 2.7458 | 486 | 0.4183 | 0.5333 | 0.4183 | 0.6468 |
| No log | 2.7571 | 488 | 0.4070 | 0.4809 | 0.4070 | 0.6379 |
| No log | 2.7684 | 490 | 0.4038 | 0.3724 | 0.4038 | 0.6355 |
| No log | 2.7797 | 492 | 0.4052 | 0.3724 | 0.4052 | 0.6365 |
| No log | 2.7910 | 494 | 0.4077 | 0.5227 | 0.4077 | 0.6385 |
| No log | 2.8023 | 496 | 0.4168 | 0.5817 | 0.4168 | 0.6456 |
| No log | 2.8136 | 498 | 0.4274 | 0.5817 | 0.4274 | 0.6537 |
| 0.4465 | 2.8249 | 500 | 0.4390 | 0.5817 | 0.4390 | 0.6626 |
| 0.4465 | 2.8362 | 502 | 0.4662 | 0.5444 | 0.4662 | 0.6828 |
| 0.4465 | 2.8475 | 504 | 0.4872 | 0.5078 | 0.4872 | 0.6980 |
| 0.4465 | 2.8588 | 506 | 0.4716 | 0.5078 | 0.4716 | 0.6867 |
| 0.4465 | 2.8701 | 508 | 0.4570 | 0.4024 | 0.4570 | 0.6760 |
| 0.4465 | 2.8814 | 510 | 0.4611 | 0.4024 | 0.4611 | 0.6791 |
| 0.4465 | 2.8927 | 512 | 0.4657 | 0.4024 | 0.4657 | 0.6825 |
| 0.4465 | 2.9040 | 514 | 0.4773 | 0.3662 | 0.4773 | 0.6909 |
| 0.4465 | 2.9153 | 516 | 0.4773 | 0.3662 | 0.4773 | 0.6909 |
| 0.4465 | 2.9266 | 518 | 0.4858 | 0.4024 | 0.4858 | 0.6970 |
| 0.4465 | 2.9379 | 520 | 0.4941 | 0.3318 | 0.4941 | 0.7029 |
| 0.4465 | 2.9492 | 522 | 0.5012 | 0.3318 | 0.5012 | 0.7079 |
| 0.4465 | 2.9605 | 524 | 0.5181 | 0.4638 | 0.5181 | 0.7198 |
| 0.4465 | 2.9718 | 526 | 0.5412 | 0.3875 | 0.5412 | 0.7357 |
| 0.4465 | 2.9831 | 528 | 0.5694 | 0.4 | 0.5694 | 0.7546 |
| 0.4465 | 2.9944 | 530 | 0.5704 | 0.3505 | 0.5704 | 0.7552 |
| 0.4465 | 3.0056 | 532 | 0.5585 | 0.3875 | 0.5585 | 0.7473 |
| 0.4465 | 3.0169 | 534 | 0.5347 | 0.2696 | 0.5347 | 0.7312 |
| 0.4465 | 3.0282 | 536 | 0.5485 | 0.3318 | 0.5485 | 0.7406 |
| 0.4465 | 3.0395 | 538 | 0.5811 | 0.2588 | 0.5811 | 0.7623 |
| 0.4465 | 3.0508 | 540 | 0.5915 | 0.3163 | 0.5915 | 0.7691 |
| 0.4465 | 3.0621 | 542 | 0.5743 | 0.2759 | 0.5743 | 0.7578 |
| 0.4465 | 3.0734 | 544 | 0.5644 | 0.2759 | 0.5644 | 0.7513 |
| 0.4465 | 3.0847 | 546 | 0.5633 | 0.2759 | 0.5633 | 0.7505 |
| 0.4465 | 3.0960 | 548 | 0.5508 | 0.3163 | 0.5508 | 0.7422 |
| 0.4465 | 3.1073 | 550 | 0.5237 | 0.2759 | 0.5237 | 0.7237 |
| 0.4465 | 3.1186 | 552 | 0.5114 | 0.2759 | 0.5114 | 0.7151 |
| 0.4465 | 3.1299 | 554 | 0.5065 | 0.3318 | 0.5065 | 0.7117 |
| 0.4465 | 3.1412 | 556 | 0.5074 | 0.3318 | 0.5074 | 0.7123 |
| 0.4465 | 3.1525 | 558 | 0.5093 | 0.4400 | 0.5093 | 0.7136 |
| 0.4465 | 3.1638 | 560 | 0.5058 | 0.4400 | 0.5058 | 0.7112 |
| 0.4465 | 3.1751 | 562 | 0.5000 | 0.4400 | 0.5000 | 0.7071 |
| 0.4465 | 3.1864 | 564 | 0.4945 | 0.4400 | 0.4945 | 0.7032 |
| 0.4465 | 3.1977 | 566 | 0.4965 | 0.4400 | 0.4965 | 0.7046 |
| 0.4465 | 3.2090 | 568 | 0.4969 | 0.4522 | 0.4969 | 0.7049 |
| 0.4465 | 3.2203 | 570 | 0.5008 | 0.4000 | 0.5008 | 0.7076 |
| 0.4465 | 3.2316 | 572 | 0.5170 | 0.2921 | 0.5170 | 0.7190 |
| 0.4465 | 3.2429 | 574 | 0.5391 | 0.2921 | 0.5391 | 0.7342 |
| 0.4465 | 3.2542 | 576 | 0.5455 | 0.2921 | 0.5455 | 0.7386 |
| 0.4465 | 3.2655 | 578 | 0.5534 | 0.3171 | 0.5534 | 0.7439 |
| 0.4465 | 3.2768 | 580 | 0.5653 | 0.4507 | 0.5653 | 0.7519 |
| 0.4465 | 3.2881 | 582 | 0.5725 | 0.4507 | 0.5725 | 0.7566 |
| 0.4465 | 3.2994 | 584 | 0.5731 | 0.4507 | 0.5731 | 0.7571 |
| 0.4465 | 3.3107 | 586 | 0.5839 | 0.4507 | 0.5839 | 0.7641 |
| 0.4465 | 3.3220 | 588 | 0.5812 | 0.3171 | 0.5812 | 0.7624 |
| 0.4465 | 3.3333 | 590 | 0.5637 | 0.4507 | 0.5637 | 0.7508 |
| 0.4465 | 3.3446 | 592 | 0.5454 | 0.4507 | 0.5454 | 0.7385 |
| 0.4465 | 3.3559 | 594 | 0.5324 | 0.4507 | 0.5324 | 0.7296 |
| 0.4465 | 3.3672 | 596 | 0.5263 | 0.4400 | 0.5263 | 0.7254 |
| 0.4465 | 3.3785 | 598 | 0.5257 | 0.4522 | 0.5257 | 0.7250 |
| 0.4465 | 3.3898 | 600 | 0.5214 | 0.5032 | 0.5214 | 0.7220 |
| 0.4465 | 3.4011 | 602 | 0.5204 | 0.5078 | 0.5204 | 0.7214 |
| 0.4465 | 3.4124 | 604 | 0.5161 | 0.5078 | 0.5161 | 0.7184 |
| 0.4465 | 3.4237 | 606 | 0.5124 | 0.5032 | 0.5124 | 0.7158 |
| 0.4465 | 3.4350 | 608 | 0.5136 | 0.4400 | 0.5136 | 0.7166 |
| 0.4465 | 3.4463 | 610 | 0.5164 | 0.4400 | 0.5164 | 0.7186 |
| 0.4465 | 3.4576 | 612 | 0.5196 | 0.4400 | 0.5196 | 0.7209 |
| 0.4465 | 3.4689 | 614 | 0.5222 | 0.4400 | 0.5222 | 0.7226 |
| 0.4465 | 3.4802 | 616 | 0.5086 | 0.4400 | 0.5086 | 0.7132 |
| 0.4465 | 3.4915 | 618 | 0.5053 | 0.4400 | 0.5053 | 0.7108 |
| 0.4465 | 3.5028 | 620 | 0.5121 | 0.4400 | 0.5121 | 0.7156 |
| 0.4465 | 3.5141 | 622 | 0.5119 | 0.4400 | 0.5119 | 0.7155 |
| 0.4465 | 3.5254 | 624 | 0.5073 | 0.4400 | 0.5073 | 0.7122 |
| 0.4465 | 3.5367 | 626 | 0.5073 | 0.4400 | 0.5073 | 0.7123 |
| 0.4465 | 3.5480 | 628 | 0.5042 | 0.4400 | 0.5042 | 0.7101 |
| 0.4465 | 3.5593 | 630 | 0.4885 | 0.4400 | 0.4885 | 0.6989 |
| 0.4465 | 3.5706 | 632 | 0.4743 | 0.4400 | 0.4743 | 0.6887 |
| 0.4465 | 3.5819 | 634 | 0.4654 | 0.5032 | 0.4654 | 0.6822 |
| 0.4465 | 3.5932 | 636 | 0.4936 | 0.5078 | 0.4936 | 0.7026 |
| 0.4465 | 3.6045 | 638 | 0.5285 | 0.4368 | 0.5285 | 0.7270 |
| 0.4465 | 3.6158 | 640 | 0.5326 | 0.4720 | 0.5326 | 0.7298 |
| 0.4465 | 3.6271 | 642 | 0.5139 | 0.5078 | 0.5139 | 0.7169 |
| 0.4465 | 3.6384 | 644 | 0.5008 | 0.5078 | 0.5008 | 0.7076 |
| 0.4465 | 3.6497 | 646 | 0.4767 | 0.5444 | 0.4767 | 0.6905 |
| 0.4465 | 3.6610 | 648 | 0.4853 | 0.6067 | 0.4853 | 0.6967 |
| 0.4465 | 3.6723 | 650 | 0.4875 | 0.4385 | 0.4875 | 0.6982 |
| 0.4465 | 3.6836 | 652 | 0.4763 | 0.5954 | 0.4763 | 0.6902 |
| 0.4465 | 3.6949 | 654 | 0.4780 | 0.5444 | 0.4780 | 0.6914 |
| 0.4465 | 3.7062 | 656 | 0.5243 | 0.4720 | 0.5243 | 0.7241 |
| 0.4465 | 3.7175 | 658 | 0.6392 | 0.5008 | 0.6392 | 0.7995 |
| 0.4465 | 3.7288 | 660 | 0.7302 | 0.4717 | 0.7302 | 0.8545 |
| 0.4465 | 3.7401 | 662 | 0.7064 | 0.5008 | 0.7064 | 0.8405 |
| 0.4465 | 3.7514 | 664 | 0.6051 | 0.4146 | 0.6051 | 0.7779 |
| 0.4465 | 3.7627 | 666 | 0.5215 | 0.5078 | 0.5215 | 0.7221 |
| 0.4465 | 3.7740 | 668 | 0.5084 | 0.5444 | 0.5084 | 0.7130 |
| 0.4465 | 3.7853 | 670 | 0.5046 | 0.4809 | 0.5046 | 0.7103 |
| 0.4465 | 3.7966 | 672 | 0.5143 | 0.3318 | 0.5143 | 0.7171 |
| 0.4465 | 3.8079 | 674 | 0.5317 | 0.3318 | 0.5317 | 0.7291 |
| 0.4465 | 3.8192 | 676 | 0.5368 | 0.3318 | 0.5368 | 0.7327 |
| 0.4465 | 3.8305 | 678 | 0.5248 | 0.3318 | 0.5248 | 0.7244 |
| 0.4465 | 3.8418 | 680 | 0.5111 | 0.3318 | 0.5111 | 0.7149 |
| 0.4465 | 3.8531 | 682 | 0.4964 | 0.4400 | 0.4964 | 0.7046 |
| 0.4465 | 3.8644 | 684 | 0.4899 | 0.4400 | 0.4899 | 0.7000 |
| 0.4465 | 3.8757 | 686 | 0.4790 | 0.4809 | 0.4790 | 0.6921 |
| 0.4465 | 3.8870 | 688 | 0.4711 | 0.4809 | 0.4711 | 0.6863 |
| 0.4465 | 3.8983 | 690 | 0.4635 | 0.4809 | 0.4635 | 0.6808 |
| 0.4465 | 3.9096 | 692 | 0.4546 | 0.4809 | 0.4546 | 0.6743 |
| 0.4465 | 3.9209 | 694 | 0.4507 | 0.4809 | 0.4507 | 0.6713 |
| 0.4465 | 3.9322 | 696 | 0.4506 | 0.4809 | 0.4506 | 0.6712 |
| 0.4465 | 3.9435 | 698 | 0.4437 | 0.4809 | 0.4437 | 0.6661 |
| 0.4465 | 3.9548 | 700 | 0.4325 | 0.4923 | 0.4325 | 0.6576 |
| 0.4465 | 3.9661 | 702 | 0.4276 | 0.4923 | 0.4276 | 0.6539 |
| 0.4465 | 3.9774 | 704 | 0.4232 | 0.5435 | 0.4232 | 0.6506 |
| 0.4465 | 3.9887 | 706 | 0.4169 | 0.5435 | 0.4169 | 0.6457 |
| 0.4465 | 4.0 | 708 | 0.4177 | 0.5435 | 0.4177 | 0.6463 |
| 0.4465 | 4.0113 | 710 | 0.4155 | 0.5435 | 0.4155 | 0.6446 |
| 0.4465 | 4.0226 | 712 | 0.4193 | 0.5846 | 0.4193 | 0.6475 |
| 0.4465 | 4.0339 | 714 | 0.4242 | 0.5846 | 0.4242 | 0.6513 |
| 0.4465 | 4.0452 | 716 | 0.4304 | 0.5846 | 0.4304 | 0.6560 |
| 0.4465 | 4.0565 | 718 | 0.4430 | 0.5435 | 0.4430 | 0.6656 |
| 0.4465 | 4.0678 | 720 | 0.4544 | 0.4539 | 0.4544 | 0.6741 |
| 0.4465 | 4.0791 | 722 | 0.4633 | 0.3875 | 0.4633 | 0.6807 |
| 0.4465 | 4.0904 | 724 | 0.4923 | 0.4371 | 0.4923 | 0.7016 |
| 0.4465 | 4.1017 | 726 | 0.5318 | 0.5009 | 0.5318 | 0.7292 |
| 0.4465 | 4.1130 | 728 | 0.5512 | 0.5009 | 0.5512 | 0.7424 |
| 0.4465 | 4.1243 | 730 | 0.5320 | 0.5685 | 0.5320 | 0.7294 |
| 0.4465 | 4.1356 | 732 | 0.4736 | 0.4787 | 0.4736 | 0.6882 |
| 0.4465 | 4.1469 | 734 | 0.4361 | 0.5444 | 0.4361 | 0.6604 |
| 0.4465 | 4.1582 | 736 | 0.4319 | 0.5817 | 0.4319 | 0.6572 |
| 0.4465 | 4.1695 | 738 | 0.4345 | 0.5352 | 0.4345 | 0.6591 |
| 0.4465 | 4.1808 | 740 | 0.4339 | 0.5352 | 0.4339 | 0.6587 |
| 0.4465 | 4.1921 | 742 | 0.4326 | 0.5352 | 0.4326 | 0.6577 |
| 0.4465 | 4.2034 | 744 | 0.4257 | 0.5352 | 0.4257 | 0.6525 |
| 0.4465 | 4.2147 | 746 | 0.4146 | 0.5352 | 0.4146 | 0.6439 |
| 0.4465 | 4.2260 | 748 | 0.4106 | 0.5846 | 0.4106 | 0.6408 |
| 0.4465 | 4.2373 | 750 | 0.4131 | 0.5333 | 0.4131 | 0.6427 |
| 0.4465 | 4.2486 | 752 | 0.4331 | 0.4273 | 0.4331 | 0.6581 |
| 0.4465 | 4.2599 | 754 | 0.4519 | 0.3724 | 0.4519 | 0.6722 |
| 0.4465 | 4.2712 | 756 | 0.4581 | 0.3724 | 0.4581 | 0.6768 |
| 0.4465 | 4.2825 | 758 | 0.4506 | 0.3724 | 0.4506 | 0.6712 |
| 0.4465 | 4.2938 | 760 | 0.4264 | 0.4809 | 0.4264 | 0.6530 |
| 0.4465 | 4.3051 | 762 | 0.4173 | 0.4809 | 0.4173 | 0.6460 |
| 0.4465 | 4.3164 | 764 | 0.4213 | 0.5846 | 0.4213 | 0.6491 |
| 0.4465 | 4.3277 | 766 | 0.4285 | 0.5817 | 0.4285 | 0.6546 |
| 0.4465 | 4.3390 | 768 | 0.4374 | 0.5444 | 0.4374 | 0.6614 |
| 0.4465 | 4.3503 | 770 | 0.4381 | 0.5817 | 0.4381 | 0.6619 |
| 0.4465 | 4.3616 | 772 | 0.4393 | 0.4809 | 0.4393 | 0.6628 |
| 0.4465 | 4.3729 | 774 | 0.4553 | 0.4277 | 0.4553 | 0.6747 |
| 0.4465 | 4.3842 | 776 | 0.4714 | 0.5097 | 0.4714 | 0.6866 |
| 0.4465 | 4.3955 | 778 | 0.4878 | 0.4615 | 0.4878 | 0.6984 |
| 0.4465 | 4.4068 | 780 | 0.4745 | 0.5045 | 0.4745 | 0.6888 |
| 0.4465 | 4.4181 | 782 | 0.4537 | 0.4809 | 0.4537 | 0.6736 |
| 0.4465 | 4.4294 | 784 | 0.4555 | 0.5817 | 0.4555 | 0.6749 |
| 0.4465 | 4.4407 | 786 | 0.4618 | 0.5817 | 0.4618 | 0.6795 |
| 0.4465 | 4.4520 | 788 | 0.4654 | 0.5817 | 0.4654 | 0.6822 |
| 0.4465 | 4.4633 | 790 | 0.4626 | 0.5817 | 0.4626 | 0.6802 |
| 0.4465 | 4.4746 | 792 | 0.4606 | 0.5817 | 0.4606 | 0.6786 |
| 0.4465 | 4.4859 | 794 | 0.4575 | 0.5817 | 0.4575 | 0.6764 |
| 0.4465 | 4.4972 | 796 | 0.4481 | 0.5817 | 0.4481 | 0.6694 |
| 0.4465 | 4.5085 | 798 | 0.4386 | 0.5846 | 0.4386 | 0.6623 |
| 0.4465 | 4.5198 | 800 | 0.4293 | 0.5846 | 0.4293 | 0.6552 |
| 0.4465 | 4.5311 | 802 | 0.4270 | 0.6119 | 0.4270 | 0.6534 |
| 0.4465 | 4.5424 | 804 | 0.4202 | 0.6585 | 0.4202 | 0.6482 |
| 0.4465 | 4.5537 | 806 | 0.4094 | 0.5435 | 0.4094 | 0.6399 |
| 0.4465 | 4.5650 | 808 | 0.4107 | 0.5078 | 0.4107 | 0.6409 |
| 0.4465 | 4.5763 | 810 | 0.4302 | 0.4787 | 0.4302 | 0.6559 |
| 0.4465 | 4.5876 | 812 | 0.4490 | 0.4464 | 0.4490 | 0.6700 |
| 0.4465 | 4.5989 | 814 | 0.4337 | 0.5116 | 0.4337 | 0.6585 |
| 0.4465 | 4.6102 | 816 | 0.4214 | 0.5078 | 0.4214 | 0.6491 |
| 0.4465 | 4.6215 | 818 | 0.4220 | 0.6197 | 0.4220 | 0.6496 |
| 0.4465 | 4.6328 | 820 | 0.4430 | 0.5643 | 0.4430 | 0.6656 |
| 0.4465 | 4.6441 | 822 | 0.4525 | 0.5643 | 0.4525 | 0.6727 |
| 0.4465 | 4.6554 | 824 | 0.4370 | 0.4809 | 0.4370 | 0.6611 |
| 0.4465 | 4.6667 | 826 | 0.4345 | 0.5435 | 0.4345 | 0.6592 |
| 0.4465 | 4.6780 | 828 | 0.4599 | 0.5078 | 0.4599 | 0.6781 |
| 0.4465 | 4.6893 | 830 | 0.5018 | 0.4878 | 0.5018 | 0.7084 |
| 0.4465 | 4.7006 | 832 | 0.5327 | 0.3801 | 0.5327 | 0.7299 |
| 0.4465 | 4.7119 | 834 | 0.5238 | 0.4878 | 0.5238 | 0.7237 |
| 0.4465 | 4.7232 | 836 | 0.4892 | 0.4787 | 0.4892 | 0.6994 |
| 0.4465 | 4.7345 | 838 | 0.4575 | 0.5078 | 0.4575 | 0.6764 |
| 0.4465 | 4.7458 | 840 | 0.4412 | 0.5333 | 0.4412 | 0.6642 |
| 0.4465 | 4.7571 | 842 | 0.4387 | 0.5333 | 0.4387 | 0.6623 |
| 0.4465 | 4.7684 | 844 | 0.4451 | 0.4923 | 0.4451 | 0.6671 |
| 0.4465 | 4.7797 | 846 | 0.4556 | 0.5078 | 0.4556 | 0.6750 |
| 0.4465 | 4.7910 | 848 | 0.4629 | 0.5451 | 0.4629 | 0.6804 |
| 0.4465 | 4.8023 | 850 | 0.4644 | 0.5451 | 0.4644 | 0.6815 |
| 0.4465 | 4.8136 | 852 | 0.4587 | 0.5451 | 0.4587 | 0.6773 |
| 0.4465 | 4.8249 | 854 | 0.4489 | 0.5444 | 0.4489 | 0.6700 |
| 0.4465 | 4.8362 | 856 | 0.4407 | 0.5817 | 0.4407 | 0.6639 |
| 0.4465 | 4.8475 | 858 | 0.4356 | 0.5444 | 0.4356 | 0.6600 |
| 0.4465 | 4.8588 | 860 | 0.4302 | 0.5444 | 0.4302 | 0.6559 |
| 0.4465 | 4.8701 | 862 | 0.4324 | 0.5078 | 0.4324 | 0.6575 |
| 0.4465 | 4.8814 | 864 | 0.4336 | 0.5078 | 0.4336 | 0.6585 |
| 0.4465 | 4.8927 | 866 | 0.4303 | 0.5078 | 0.4303 | 0.6559 |
| 0.4465 | 4.9040 | 868 | 0.4198 | 0.5435 | 0.4198 | 0.6479 |
| 0.4465 | 4.9153 | 870 | 0.4153 | 0.5333 | 0.4153 | 0.6444 |
| 0.4465 | 4.9266 | 872 | 0.4148 | 0.4809 | 0.4148 | 0.6440 |
| 0.4465 | 4.9379 | 874 | 0.4149 | 0.4809 | 0.4149 | 0.6441 |
| 0.4465 | 4.9492 | 876 | 0.4143 | 0.4923 | 0.4143 | 0.6437 |
| 0.4465 | 4.9605 | 878 | 0.4148 | 0.4522 | 0.4148 | 0.6441 |
| 0.4465 | 4.9718 | 880 | 0.4272 | 0.5078 | 0.4272 | 0.6536 |
| 0.4465 | 4.9831 | 882 | 0.4346 | 0.5078 | 0.4346 | 0.6592 |
| 0.4465 | 4.9944 | 884 | 0.4235 | 0.5032 | 0.4235 | 0.6507 |
| 0.4465 | 5.0056 | 886 | 0.4137 | 0.4522 | 0.4137 | 0.6432 |
| 0.4465 | 5.0169 | 888 | 0.4144 | 0.4522 | 0.4144 | 0.6437 |
| 0.4465 | 5.0282 | 890 | 0.4212 | 0.4809 | 0.4212 | 0.6490 |
| 0.4465 | 5.0395 | 892 | 0.4605 | 0.4809 | 0.4605 | 0.6786 |
| 0.4465 | 5.0508 | 894 | 0.5217 | 0.4637 | 0.5217 | 0.7223 |
| 0.4465 | 5.0621 | 896 | 0.5529 | 0.5385 | 0.5529 | 0.7436 |
| 0.4465 | 5.0734 | 898 | 0.5380 | 0.6223 | 0.5380 | 0.7335 |
| 0.4465 | 5.0847 | 900 | 0.4932 | 0.5191 | 0.4932 | 0.7022 |
| 0.4465 | 5.0960 | 902 | 0.4437 | 0.4000 | 0.4437 | 0.6661 |
| 0.4465 | 5.1073 | 904 | 0.4378 | 0.5078 | 0.4378 | 0.6617 |
| 0.4465 | 5.1186 | 906 | 0.4569 | 0.4720 | 0.4569 | 0.6759 |
| 0.4465 | 5.1299 | 908 | 0.4633 | 0.4720 | 0.4633 | 0.6807 |
| 0.4465 | 5.1412 | 910 | 0.4561 | 0.4720 | 0.4561 | 0.6754 |
| 0.4465 | 5.1525 | 912 | 0.4475 | 0.5078 | 0.4475 | 0.6690 |
| 0.4465 | 5.1638 | 914 | 0.4412 | 0.5032 | 0.4412 | 0.6643 |
| 0.4465 | 5.1751 | 916 | 0.4352 | 0.4522 | 0.4352 | 0.6597 |
| 0.4465 | 5.1864 | 918 | 0.4393 | 0.4000 | 0.4393 | 0.6628 |
| 0.4465 | 5.1977 | 920 | 0.4478 | 0.4809 | 0.4478 | 0.6692 |
| 0.4465 | 5.2090 | 922 | 0.4646 | 0.4809 | 0.4646 | 0.6816 |
| 0.4465 | 5.2203 | 924 | 0.4759 | 0.3724 | 0.4759 | 0.6899 |
| 0.4465 | 5.2316 | 926 | 0.4705 | 0.4273 | 0.4705 | 0.6859 |
| 0.4465 | 5.2429 | 928 | 0.4630 | 0.4809 | 0.4630 | 0.6804 |
| 0.4465 | 5.2542 | 930 | 0.4481 | 0.4000 | 0.4481 | 0.6694 |
| 0.4465 | 5.2655 | 932 | 0.4348 | 0.4000 | 0.4348 | 0.6594 |
| 0.4465 | 5.2768 | 934 | 0.4343 | 0.4143 | 0.4343 | 0.6590 |
| 0.4465 | 5.2881 | 936 | 0.4362 | 0.5078 | 0.4362 | 0.6604 |
| 0.4465 | 5.2994 | 938 | 0.4328 | 0.5078 | 0.4328 | 0.6579 |
| 0.4465 | 5.3107 | 940 | 0.4335 | 0.4143 | 0.4335 | 0.6584 |
| 0.4465 | 5.3220 | 942 | 0.4330 | 0.4143 | 0.4330 | 0.6580 |
| 0.4465 | 5.3333 | 944 | 0.4423 | 0.4000 | 0.4423 | 0.6651 |
| 0.4465 | 5.3446 | 946 | 0.4562 | 0.3467 | 0.4562 | 0.6754 |
| 0.4465 | 5.3559 | 948 | 0.4675 | 0.4772 | 0.4675 | 0.6838 |
| 0.4465 | 5.3672 | 950 | 0.4723 | 0.4772 | 0.4723 | 0.6873 |
| 0.4465 | 5.3785 | 952 | 0.4630 | 0.3865 | 0.4630 | 0.6804 |
| 0.4465 | 5.3898 | 954 | 0.4609 | 0.3865 | 0.4609 | 0.6789 |
| 0.4465 | 5.4011 | 956 | 0.4560 | 0.4273 | 0.4560 | 0.6753 |
| 0.4465 | 5.4124 | 958 | 0.4533 | 0.4273 | 0.4533 | 0.6733 |
| 0.4465 | 5.4237 | 960 | 0.4580 | 0.3724 | 0.4580 | 0.6768 |
| 0.4465 | 5.4350 | 962 | 0.4630 | 0.4661 | 0.4630 | 0.6805 |
| 0.4465 | 5.4463 | 964 | 0.4549 | 0.3724 | 0.4549 | 0.6744 |
| 0.4465 | 5.4576 | 966 | 0.4473 | 0.4809 | 0.4473 | 0.6688 |
| 0.4465 | 5.4689 | 968 | 0.4392 | 0.4000 | 0.4392 | 0.6627 |
| 0.4465 | 5.4802 | 970 | 0.4369 | 0.4000 | 0.4369 | 0.6610 |
| 0.4465 | 5.4915 | 972 | 0.4415 | 0.5643 | 0.4415 | 0.6644 |
| 0.4465 | 5.5028 | 974 | 0.4484 | 0.5643 | 0.4484 | 0.6696 |
| 0.4465 | 5.5141 | 976 | 0.4552 | 0.5549 | 0.4552 | 0.6747 |
| 0.4465 | 5.5254 | 978 | 0.4459 | 0.5549 | 0.4459 | 0.6678 |
| 0.4465 | 5.5367 | 980 | 0.4409 | 0.5549 | 0.4409 | 0.6640 |
| 0.4465 | 5.5480 | 982 | 0.4362 | 0.5549 | 0.4362 | 0.6604 |
| 0.4465 | 5.5593 | 984 | 0.4487 | 0.6223 | 0.4487 | 0.6699 |
| 0.4465 | 5.5706 | 986 | 0.4567 | 0.6223 | 0.4567 | 0.6758 |
| 0.4465 | 5.5819 | 988 | 0.4720 | 0.6223 | 0.4720 | 0.6871 |
| 0.4465 | 5.5932 | 990 | 0.4785 | 0.6223 | 0.4785 | 0.6917 |
| 0.4465 | 5.6045 | 992 | 0.4618 | 0.6223 | 0.4618 | 0.6796 |
| 0.4465 | 5.6158 | 994 | 0.4421 | 0.6223 | 0.4421 | 0.6649 |
| 0.4465 | 5.6271 | 996 | 0.4203 | 0.5257 | 0.4203 | 0.6483 |
| 0.4465 | 5.6384 | 998 | 0.4075 | 0.4000 | 0.4075 | 0.6384 |
| 0.0988 | 5.6497 | 1000 | 0.4072 | 0.5032 | 0.4072 | 0.6381 |
| 0.0988 | 5.6610 | 1002 | 0.4134 | 0.5032 | 0.4134 | 0.6430 |
| 0.0988 | 5.6723 | 1004 | 0.4162 | 0.4000 | 0.4162 | 0.6451 |
| 0.0988 | 5.6836 | 1006 | 0.4251 | 0.4400 | 0.4251 | 0.6520 |
| 0.0988 | 5.6949 | 1008 | 0.4464 | 0.4809 | 0.4464 | 0.6681 |
| 0.0988 | 5.7062 | 1010 | 0.4608 | 0.4809 | 0.4608 | 0.6788 |
| 0.0988 | 5.7175 | 1012 | 0.4527 | 0.4809 | 0.4527 | 0.6728 |
| 0.0988 | 5.7288 | 1014 | 0.4442 | 0.4809 | 0.4442 | 0.6665 |
| 0.0988 | 5.7401 | 1016 | 0.4272 | 0.4809 | 0.4272 | 0.6536 |
| 0.0988 | 5.7514 | 1018 | 0.4207 | 0.4809 | 0.4207 | 0.6486 |
| 0.0988 | 5.7627 | 1020 | 0.4200 | 0.4809 | 0.4200 | 0.6481 |
| 0.0988 | 5.7740 | 1022 | 0.4201 | 0.4809 | 0.4201 | 0.6482 |
| 0.0988 | 5.7853 | 1024 | 0.4270 | 0.4809 | 0.4270 | 0.6535 |
| 0.0988 | 5.7966 | 1026 | 0.4248 | 0.4809 | 0.4248 | 0.6517 |
| 0.0988 | 5.8079 | 1028 | 0.4215 | 0.4400 | 0.4215 | 0.6493 |
| 0.0988 | 5.8192 | 1030 | 0.4191 | 0.4000 | 0.4191 | 0.6474 |
| 0.0988 | 5.8305 | 1032 | 0.4183 | 0.4000 | 0.4183 | 0.6468 |
| 0.0988 | 5.8418 | 1034 | 0.4176 | 0.4000 | 0.4176 | 0.6462 |
| 0.0988 | 5.8531 | 1036 | 0.4163 | 0.4522 | 0.4163 | 0.6452 |
| 0.0988 | 5.8644 | 1038 | 0.4198 | 0.5032 | 0.4198 | 0.6479 |
| 0.0988 | 5.8757 | 1040 | 0.4130 | 0.5032 | 0.4130 | 0.6426 |
| 0.0988 | 5.8870 | 1042 | 0.4004 | 0.5032 | 0.4004 | 0.6327 |
| 0.0988 | 5.8983 | 1044 | 0.3932 | 0.5032 | 0.3932 | 0.6271 |
| 0.0988 | 5.9096 | 1046 | 0.3894 | 0.5032 | 0.3894 | 0.6240 |
| 0.0988 | 5.9209 | 1048 | 0.3884 | 0.5032 | 0.3884 | 0.6232 |
| 0.0988 | 5.9322 | 1050 | 0.3863 | 0.5032 | 0.3863 | 0.6216 |
| 0.0988 | 5.9435 | 1052 | 0.3857 | 0.5032 | 0.3857 | 0.6211 |
| 0.0988 | 5.9548 | 1054 | 0.3898 | 0.5032 | 0.3898 | 0.6243 |
| 0.0988 | 5.9661 | 1056 | 0.4028 | 0.4638 | 0.4028 | 0.6346 |
| 0.0988 | 5.9774 | 1058 | 0.4120 | 0.6021 | 0.4120 | 0.6419 |
| 0.0988 | 5.9887 | 1060 | 0.4150 | 0.4638 | 0.4150 | 0.6442 |
| 0.0988 | 6.0 | 1062 | 0.4138 | 0.4638 | 0.4138 | 0.6433 |
| 0.0988 | 6.0113 | 1064 | 0.4167 | 0.5032 | 0.4167 | 0.6455 |
| 0.0988 | 6.0226 | 1066 | 0.4215 | 0.5032 | 0.4215 | 0.6493 |
| 0.0988 | 6.0339 | 1068 | 0.4199 | 0.5032 | 0.4199 | 0.6480 |
| 0.0988 | 6.0452 | 1070 | 0.4087 | 0.5435 | 0.4087 | 0.6393 |
| 0.0988 | 6.0565 | 1072 | 0.4040 | 0.5435 | 0.4040 | 0.6356 |
| 0.0988 | 6.0678 | 1074 | 0.4025 | 0.5435 | 0.4025 | 0.6345 |
| 0.0988 | 6.0791 | 1076 | 0.4052 | 0.5032 | 0.4052 | 0.6365 |
| 0.0988 | 6.0904 | 1078 | 0.4130 | 0.4638 | 0.4130 | 0.6427 |
| 0.0988 | 6.1017 | 1080 | 0.4106 | 0.5032 | 0.4106 | 0.6408 |
| 0.0988 | 6.1130 | 1082 | 0.4025 | 0.5032 | 0.4025 | 0.6345 |
| 0.0988 | 6.1243 | 1084 | 0.4060 | 0.4394 | 0.4060 | 0.6372 |
| 0.0988 | 6.1356 | 1086 | 0.4146 | 0.4772 | 0.4146 | 0.6439 |
| 0.0988 | 6.1469 | 1088 | 0.4210 | 0.5549 | 0.4210 | 0.6489 |
| 0.0988 | 6.1582 | 1090 | 0.4217 | 0.5549 | 0.4217 | 0.6494 |
| 0.0988 | 6.1695 | 1092 | 0.4187 | 0.5191 | 0.4187 | 0.6470 |
| 0.0988 | 6.1808 | 1094 | 0.4188 | 0.5191 | 0.4188 | 0.6471 |
| 0.0988 | 6.1921 | 1096 | 0.4227 | 0.5191 | 0.4227 | 0.6501 |
| 0.0988 | 6.2034 | 1098 | 0.4365 | 0.5549 | 0.4365 | 0.6607 |
| 0.0988 | 6.2147 | 1100 | 0.4416 | 0.5549 | 0.4416 | 0.6646 |
| 0.0988 | 6.2260 | 1102 | 0.4518 | 0.5549 | 0.4518 | 0.6722 |
| 0.0988 | 6.2373 | 1104 | 0.4672 | 0.6223 | 0.4672 | 0.6835 |
| 0.0988 | 6.2486 | 1106 | 0.4855 | 0.6223 | 0.4855 | 0.6968 |
| 0.0988 | 6.2599 | 1108 | 0.4981 | 0.6223 | 0.4981 | 0.7058 |
| 0.0988 | 6.2712 | 1110 | 0.4790 | 0.6223 | 0.4790 | 0.6921 |
| 0.0988 | 6.2825 | 1112 | 0.4700 | 0.6223 | 0.4700 | 0.6855 |
| 0.0988 | 6.2938 | 1114 | 0.4627 | 0.6223 | 0.4627 | 0.6802 |
| 0.0988 | 6.3051 | 1116 | 0.4449 | 0.5549 | 0.4449 | 0.6670 |
| 0.0988 | 6.3164 | 1118 | 0.4193 | 0.5191 | 0.4193 | 0.6475 |
| 0.0988 | 6.3277 | 1120 | 0.4126 | 0.5032 | 0.4126 | 0.6423 |
| 0.0988 | 6.3390 | 1122 | 0.4204 | 0.4638 | 0.4204 | 0.6483 |
| 0.0988 | 6.3503 | 1124 | 0.4229 | 0.4638 | 0.4229 | 0.6503 |
| 0.0988 | 6.3616 | 1126 | 0.4183 | 0.4638 | 0.4183 | 0.6468 |
| 0.0988 | 6.3729 | 1128 | 0.4118 | 0.5032 | 0.4118 | 0.6417 |
| 0.0988 | 6.3842 | 1130 | 0.4093 | 0.5846 | 0.4093 | 0.6397 |
| 0.0988 | 6.3955 | 1132 | 0.4108 | 0.4772 | 0.4108 | 0.6409 |
| 0.0988 | 6.4068 | 1134 | 0.4229 | 0.4772 | 0.4229 | 0.6503 |
| 0.0988 | 6.4181 | 1136 | 0.4364 | 0.5549 | 0.4364 | 0.6606 |
| 0.0988 | 6.4294 | 1138 | 0.4395 | 0.5549 | 0.4395 | 0.6629 |
| 0.0988 | 6.4407 | 1140 | 0.4319 | 0.5549 | 0.4319 | 0.6572 |
| 0.0988 | 6.4520 | 1142 | 0.4098 | 0.4772 | 0.4098 | 0.6401 |
| 0.0988 | 6.4633 | 1144 | 0.3963 | 0.5732 | 0.3963 | 0.6296 |
| 0.0988 | 6.4746 | 1146 | 0.4031 | 0.5257 | 0.4031 | 0.6349 |
| 0.0988 | 6.4859 | 1148 | 0.4073 | 0.4772 | 0.4073 | 0.6382 |
| 0.0988 | 6.4972 | 1150 | 0.4056 | 0.5846 | 0.4056 | 0.6368 |
| 0.0988 | 6.5085 | 1152 | 0.4071 | 0.5846 | 0.4071 | 0.6381 |
| 0.0988 | 6.5198 | 1154 | 0.4098 | 0.5846 | 0.4098 | 0.6402 |
| 0.0988 | 6.5311 | 1156 | 0.4136 | 0.5846 | 0.4136 | 0.6431 |
| 0.0988 | 6.5424 | 1158 | 0.4193 | 0.5333 | 0.4193 | 0.6476 |
| 0.0988 | 6.5537 | 1160 | 0.4220 | 0.5333 | 0.4220 | 0.6496 |
| 0.0988 | 6.5650 | 1162 | 0.4244 | 0.5333 | 0.4244 | 0.6515 |
| 0.0988 | 6.5763 | 1164 | 0.4271 | 0.5333 | 0.4271 | 0.6535 |
| 0.0988 | 6.5876 | 1166 | 0.4305 | 0.4809 | 0.4305 | 0.6561 |
| 0.0988 | 6.5989 | 1168 | 0.4423 | 0.4772 | 0.4423 | 0.6651 |
| 0.0988 | 6.6102 | 1170 | 0.4541 | 0.4772 | 0.4541 | 0.6738 |
| 0.0988 | 6.6215 | 1172 | 0.4646 | 0.5549 | 0.4646 | 0.6816 |
| 0.0988 | 6.6328 | 1174 | 0.4707 | 0.5549 | 0.4707 | 0.6861 |
| 0.0988 | 6.6441 | 1176 | 0.4864 | 0.5549 | 0.4864 | 0.6974 |
| 0.0988 | 6.6554 | 1178 | 0.4977 | 0.5097 | 0.4977 | 0.7055 |
| 0.0988 | 6.6667 | 1180 | 0.4977 | 0.5097 | 0.4977 | 0.7055 |
| 0.0988 | 6.6780 | 1182 | 0.5003 | 0.5097 | 0.5003 | 0.7073 |
| 0.0988 | 6.6893 | 1184 | 0.4962 | 0.5097 | 0.4962 | 0.7044 |
| 0.0988 | 6.7006 | 1186 | 0.4928 | 0.4273 | 0.4928 | 0.7020 |
| 0.0988 | 6.7119 | 1188 | 0.4880 | 0.4809 | 0.4880 | 0.6986 |
| 0.0988 | 6.7232 | 1190 | 0.4853 | 0.4809 | 0.4853 | 0.6966 |
| 0.0988 | 6.7345 | 1192 | 0.4861 | 0.4923 | 0.4861 | 0.6972 |
| 0.0988 | 6.7458 | 1194 | 0.4865 | 0.4522 | 0.4865 | 0.6975 |
| 0.0988 | 6.7571 | 1196 | 0.4847 | 0.4809 | 0.4847 | 0.6962 |
| 0.0988 | 6.7684 | 1198 | 0.4815 | 0.4809 | 0.4815 | 0.6939 |
| 0.0988 | 6.7797 | 1200 | 0.4804 | 0.4809 | 0.4804 | 0.6931 |
| 0.0988 | 6.7910 | 1202 | 0.4865 | 0.4273 | 0.4865 | 0.6975 |
| 0.0988 | 6.8023 | 1204 | 0.4966 | 0.4273 | 0.4966 | 0.7047 |
| 0.0988 | 6.8136 | 1206 | 0.5046 | 0.4273 | 0.5046 | 0.7103 |
| 0.0988 | 6.8249 | 1208 | 0.5042 | 0.4273 | 0.5042 | 0.7100 |
| 0.0988 | 6.8362 | 1210 | 0.4981 | 0.4273 | 0.4981 | 0.7058 |
| 0.0988 | 6.8475 | 1212 | 0.4902 | 0.4273 | 0.4902 | 0.7002 |
| 0.0988 | 6.8588 | 1214 | 0.4776 | 0.4273 | 0.4776 | 0.6911 |
| 0.0988 | 6.8701 | 1216 | 0.4665 | 0.4273 | 0.4665 | 0.6830 |
| 0.0988 | 6.8814 | 1218 | 0.4604 | 0.4809 | 0.4604 | 0.6786 |
| 0.0988 | 6.8927 | 1220 | 0.4583 | 0.4809 | 0.4583 | 0.6770 |
| 0.0988 | 6.9040 | 1222 | 0.4558 | 0.4809 | 0.4558 | 0.6752 |
| 0.0988 | 6.9153 | 1224 | 0.4541 | 0.5333 | 0.4541 | 0.6739 |
| 0.0988 | 6.9266 | 1226 | 0.4545 | 0.5352 | 0.4545 | 0.6742 |
| 0.0988 | 6.9379 | 1228 | 0.4543 | 0.5352 | 0.4543 | 0.6740 |
| 0.0988 | 6.9492 | 1230 | 0.4504 | 0.5444 | 0.4504 | 0.6711 |
| 0.0988 | 6.9605 | 1232 | 0.4414 | 0.5817 | 0.4414 | 0.6644 |
| 0.0988 | 6.9718 | 1234 | 0.4309 | 0.5352 | 0.4309 | 0.6564 |
| 0.0988 | 6.9831 | 1236 | 0.4261 | 0.5333 | 0.4261 | 0.6528 |
| 0.0988 | 6.9944 | 1238 | 0.4287 | 0.4273 | 0.4287 | 0.6548 |
| 0.0988 | 7.0056 | 1240 | 0.4327 | 0.4277 | 0.4327 | 0.6578 |
| 0.0988 | 7.0169 | 1242 | 0.4294 | 0.4273 | 0.4294 | 0.6553 |
| 0.0988 | 7.0282 | 1244 | 0.4292 | 0.4273 | 0.4292 | 0.6551 |
| 0.0988 | 7.0395 | 1246 | 0.4300 | 0.4273 | 0.4300 | 0.6558 |
| 0.0988 | 7.0508 | 1248 | 0.4307 | 0.4273 | 0.4307 | 0.6563 |
| 0.0988 | 7.0621 | 1250 | 0.4335 | 0.4273 | 0.4335 | 0.6584 |
| 0.0988 | 7.0734 | 1252 | 0.4405 | 0.4273 | 0.4405 | 0.6637 |
| 0.0988 | 7.0847 | 1254 | 0.4501 | 0.4273 | 0.4501 | 0.6709 |
| 0.0988 | 7.0960 | 1256 | 0.4569 | 0.4273 | 0.4569 | 0.6759 |
| 0.0988 | 7.1073 | 1258 | 0.4653 | 0.4273 | 0.4653 | 0.6821 |
| 0.0988 | 7.1186 | 1260 | 0.4700 | 0.4277 | 0.4700 | 0.6856 |
| 0.0988 | 7.1299 | 1262 | 0.4704 | 0.4277 | 0.4704 | 0.6859 |
| 0.0988 | 7.1412 | 1264 | 0.4605 | 0.4273 | 0.4605 | 0.6786 |
| 0.0988 | 7.1525 | 1266 | 0.4483 | 0.4273 | 0.4483 | 0.6696 |
| 0.0988 | 7.1638 | 1268 | 0.4398 | 0.4273 | 0.4398 | 0.6632 |
| 0.0988 | 7.1751 | 1270 | 0.4391 | 0.4273 | 0.4391 | 0.6626 |
| 0.0988 | 7.1864 | 1272 | 0.4416 | 0.4273 | 0.4416 | 0.6645 |
| 0.0988 | 7.1977 | 1274 | 0.4437 | 0.4273 | 0.4437 | 0.6661 |
| 0.0988 | 7.2090 | 1276 | 0.4491 | 0.4277 | 0.4491 | 0.6701 |
| 0.0988 | 7.2203 | 1278 | 0.4594 | 0.5097 | 0.4594 | 0.6778 |
| 0.0988 | 7.2316 | 1280 | 0.4715 | 0.5097 | 0.4715 | 0.6866 |
| 0.0988 | 7.2429 | 1282 | 0.4764 | 0.5097 | 0.4764 | 0.6902 |
| 0.0988 | 7.2542 | 1284 | 0.4765 | 0.5097 | 0.4765 | 0.6903 |
| 0.0988 | 7.2655 | 1286 | 0.4666 | 0.5097 | 0.4666 | 0.6831 |
| 0.0988 | 7.2768 | 1288 | 0.4527 | 0.4772 | 0.4527 | 0.6728 |
| 0.0988 | 7.2881 | 1290 | 0.4452 | 0.5333 | 0.4452 | 0.6672 |
| 0.0988 | 7.2994 | 1292 | 0.4436 | 0.5333 | 0.4436 | 0.6661 |
| 0.0988 | 7.3107 | 1294 | 0.4444 | 0.5333 | 0.4444 | 0.6666 |
| 0.0988 | 7.3220 | 1296 | 0.4455 | 0.5333 | 0.4455 | 0.6675 |
| 0.0988 | 7.3333 | 1298 | 0.4487 | 0.4809 | 0.4487 | 0.6699 |
| 0.0988 | 7.3446 | 1300 | 0.4534 | 0.4273 | 0.4534 | 0.6734 |
| 0.0988 | 7.3559 | 1302 | 0.4618 | 0.4273 | 0.4618 | 0.6796 |
| 0.0988 | 7.3672 | 1304 | 0.4752 | 0.5097 | 0.4752 | 0.6893 |
| 0.0988 | 7.3785 | 1306 | 0.4858 | 0.5097 | 0.4858 | 0.6970 |
| 0.0988 | 7.3898 | 1308 | 0.4921 | 0.5097 | 0.4921 | 0.7015 |
| 0.0988 | 7.4011 | 1310 | 0.4869 | 0.5097 | 0.4869 | 0.6977 |
| 0.0988 | 7.4124 | 1312 | 0.4757 | 0.5097 | 0.4757 | 0.6897 |
| 0.0988 | 7.4237 | 1314 | 0.4695 | 0.5097 | 0.4695 | 0.6852 |
| 0.0988 | 7.4350 | 1316 | 0.4613 | 0.5097 | 0.4613 | 0.6792 |
| 0.0988 | 7.4463 | 1318 | 0.4576 | 0.5097 | 0.4576 | 0.6765 |
| 0.0988 | 7.4576 | 1320 | 0.4512 | 0.5097 | 0.4512 | 0.6717 |
| 0.0988 | 7.4689 | 1322 | 0.4470 | 0.5097 | 0.4470 | 0.6686 |
| 0.0988 | 7.4802 | 1324 | 0.4455 | 0.5097 | 0.4455 | 0.6674 |
| 0.0988 | 7.4915 | 1326 | 0.4449 | 0.5097 | 0.4449 | 0.6670 |
| 0.0988 | 7.5028 | 1328 | 0.4458 | 0.5097 | 0.4458 | 0.6676 |
| 0.0988 | 7.5141 | 1330 | 0.4463 | 0.5097 | 0.4463 | 0.6680 |
| 0.0988 | 7.5254 | 1332 | 0.4449 | 0.5097 | 0.4449 | 0.6670 |
| 0.0988 | 7.5367 | 1334 | 0.4435 | 0.5097 | 0.4435 | 0.6660 |
| 0.0988 | 7.5480 | 1336 | 0.4435 | 0.5097 | 0.4435 | 0.6660 |
| 0.0988 | 7.5593 | 1338 | 0.4426 | 0.5097 | 0.4426 | 0.6653 |
| 0.0988 | 7.5706 | 1340 | 0.4399 | 0.5097 | 0.4399 | 0.6633 |
| 0.0988 | 7.5819 | 1342 | 0.4408 | 0.5097 | 0.4408 | 0.6639 |
| 0.0988 | 7.5932 | 1344 | 0.4404 | 0.5097 | 0.4404 | 0.6637 |
| 0.0988 | 7.6045 | 1346 | 0.4381 | 0.4277 | 0.4381 | 0.6619 |
| 0.0988 | 7.6158 | 1348 | 0.4355 | 0.4277 | 0.4355 | 0.6599 |
| 0.0988 | 7.6271 | 1350 | 0.4316 | 0.4277 | 0.4316 | 0.6570 |
| 0.0988 | 7.6384 | 1352 | 0.4317 | 0.4277 | 0.4317 | 0.6571 |
| 0.0988 | 7.6497 | 1354 | 0.4393 | 0.4277 | 0.4393 | 0.6628 |
| 0.0988 | 7.6610 | 1356 | 0.4461 | 0.4277 | 0.4461 | 0.6679 |
| 0.0988 | 7.6723 | 1358 | 0.4399 | 0.4277 | 0.4399 | 0.6632 |
| 0.0988 | 7.6836 | 1360 | 0.4282 | 0.4277 | 0.4282 | 0.6544 |
| 0.0988 | 7.6949 | 1362 | 0.4170 | 0.4809 | 0.4170 | 0.6458 |
| 0.0988 | 7.7062 | 1364 | 0.4102 | 0.5846 | 0.4102 | 0.6405 |
| 0.0988 | 7.7175 | 1366 | 0.4086 | 0.5846 | 0.4086 | 0.6392 |
| 0.0988 | 7.7288 | 1368 | 0.4114 | 0.5846 | 0.4114 | 0.6414 |
| 0.0988 | 7.7401 | 1370 | 0.4134 | 0.5846 | 0.4134 | 0.6429 |
| 0.0988 | 7.7514 | 1372 | 0.4170 | 0.5846 | 0.4170 | 0.6458 |
| 0.0988 | 7.7627 | 1374 | 0.4199 | 0.4809 | 0.4199 | 0.6480 |
| 0.0988 | 7.7740 | 1376 | 0.4246 | 0.4772 | 0.4246 | 0.6516 |
| 0.0988 | 7.7853 | 1378 | 0.4277 | 0.4772 | 0.4277 | 0.6540 |
| 0.0988 | 7.7966 | 1380 | 0.4294 | 0.4772 | 0.4294 | 0.6553 |
| 0.0988 | 7.8079 | 1382 | 0.4277 | 0.4809 | 0.4277 | 0.6540 |
| 0.0988 | 7.8192 | 1384 | 0.4280 | 0.5333 | 0.4280 | 0.6542 |
| 0.0988 | 7.8305 | 1386 | 0.4298 | 0.5846 | 0.4298 | 0.6556 |
| 0.0988 | 7.8418 | 1388 | 0.4323 | 0.5333 | 0.4323 | 0.6575 |
| 0.0988 | 7.8531 | 1390 | 0.4359 | 0.5333 | 0.4359 | 0.6603 |
| 0.0988 | 7.8644 | 1392 | 0.4397 | 0.5333 | 0.4397 | 0.6631 |
| 0.0988 | 7.8757 | 1394 | 0.4422 | 0.4809 | 0.4422 | 0.6650 |
| 0.0988 | 7.8870 | 1396 | 0.4445 | 0.4809 | 0.4445 | 0.6667 |
| 0.0988 | 7.8983 | 1398 | 0.4468 | 0.4809 | 0.4468 | 0.6684 |
| 0.0988 | 7.9096 | 1400 | 0.4511 | 0.4809 | 0.4511 | 0.6716 |
| 0.0988 | 7.9209 | 1402 | 0.4541 | 0.4809 | 0.4541 | 0.6739 |
| 0.0988 | 7.9322 | 1404 | 0.4571 | 0.4809 | 0.4571 | 0.6761 |
| 0.0988 | 7.9435 | 1406 | 0.4594 | 0.4809 | 0.4594 | 0.6778 |
| 0.0988 | 7.9548 | 1408 | 0.4587 | 0.4809 | 0.4587 | 0.6773 |
| 0.0988 | 7.9661 | 1410 | 0.4563 | 0.4809 | 0.4563 | 0.6755 |
| 0.0988 | 7.9774 | 1412 | 0.4531 | 0.5333 | 0.4531 | 0.6731 |
| 0.0988 | 7.9887 | 1414 | 0.4517 | 0.5333 | 0.4517 | 0.6721 |
| 0.0988 | 8.0 | 1416 | 0.4526 | 0.5846 | 0.4526 | 0.6727 |
| 0.0988 | 8.0113 | 1418 | 0.4544 | 0.5846 | 0.4544 | 0.6741 |
| 0.0988 | 8.0226 | 1420 | 0.4547 | 0.5846 | 0.4547 | 0.6743 |
| 0.0988 | 8.0339 | 1422 | 0.4542 | 0.5846 | 0.4542 | 0.6739 |
| 0.0988 | 8.0452 | 1424 | 0.4578 | 0.5032 | 0.4578 | 0.6766 |
| 0.0988 | 8.0565 | 1426 | 0.4637 | 0.5032 | 0.4637 | 0.6809 |
| 0.0988 | 8.0678 | 1428 | 0.4652 | 0.5032 | 0.4652 | 0.6821 |
| 0.0988 | 8.0791 | 1430 | 0.4604 | 0.5032 | 0.4604 | 0.6785 |
| 0.0988 | 8.0904 | 1432 | 0.4545 | 0.5435 | 0.4545 | 0.6742 |
| 0.0988 | 8.1017 | 1434 | 0.4513 | 0.5846 | 0.4513 | 0.6718 |
| 0.0988 | 8.1130 | 1436 | 0.4475 | 0.5846 | 0.4475 | 0.6689 |
| 0.0988 | 8.1243 | 1438 | 0.4451 | 0.5846 | 0.4451 | 0.6671 |
| 0.0988 | 8.1356 | 1440 | 0.4460 | 0.5846 | 0.4460 | 0.6679 |
| 0.0988 | 8.1469 | 1442 | 0.4475 | 0.5846 | 0.4475 | 0.6689 |
| 0.0988 | 8.1582 | 1444 | 0.4498 | 0.5846 | 0.4498 | 0.6707 |
| 0.0988 | 8.1695 | 1446 | 0.4527 | 0.5846 | 0.4527 | 0.6729 |
| 0.0988 | 8.1808 | 1448 | 0.4564 | 0.5846 | 0.4564 | 0.6756 |
| 0.0988 | 8.1921 | 1450 | 0.4581 | 0.5846 | 0.4581 | 0.6768 |
| 0.0988 | 8.2034 | 1452 | 0.4585 | 0.5846 | 0.4585 | 0.6771 |
| 0.0988 | 8.2147 | 1454 | 0.4596 | 0.5846 | 0.4596 | 0.6779 |
| 0.0988 | 8.2260 | 1456 | 0.4606 | 0.5846 | 0.4606 | 0.6787 |
| 0.0988 | 8.2373 | 1458 | 0.4621 | 0.4809 | 0.4621 | 0.6798 |
| 0.0988 | 8.2486 | 1460 | 0.4619 | 0.4809 | 0.4619 | 0.6796 |
| 0.0988 | 8.2599 | 1462 | 0.4609 | 0.4809 | 0.4609 | 0.6789 |
| 0.0988 | 8.2712 | 1464 | 0.4593 | 0.4809 | 0.4593 | 0.6777 |
| 0.0988 | 8.2825 | 1466 | 0.4572 | 0.5333 | 0.4572 | 0.6762 |
| 0.0988 | 8.2938 | 1468 | 0.4564 | 0.5846 | 0.4564 | 0.6756 |
| 0.0988 | 8.3051 | 1470 | 0.4567 | 0.4809 | 0.4567 | 0.6758 |
| 0.0988 | 8.3164 | 1472 | 0.4565 | 0.4809 | 0.4565 | 0.6756 |
| 0.0988 | 8.3277 | 1474 | 0.4571 | 0.4809 | 0.4571 | 0.6761 |
| 0.0988 | 8.3390 | 1476 | 0.4572 | 0.4809 | 0.4572 | 0.6762 |
| 0.0988 | 8.3503 | 1478 | 0.4562 | 0.4809 | 0.4562 | 0.6754 |
| 0.0988 | 8.3616 | 1480 | 0.4539 | 0.4809 | 0.4539 | 0.6737 |
| 0.0988 | 8.3729 | 1482 | 0.4503 | 0.4809 | 0.4503 | 0.6710 |
| 0.0988 | 8.3842 | 1484 | 0.4476 | 0.4809 | 0.4476 | 0.6690 |
| 0.0988 | 8.3955 | 1486 | 0.4464 | 0.4809 | 0.4464 | 0.6682 |
| 0.0988 | 8.4068 | 1488 | 0.4456 | 0.4809 | 0.4456 | 0.6675 |
| 0.0988 | 8.4181 | 1490 | 0.4445 | 0.4809 | 0.4445 | 0.6667 |
| 0.0988 | 8.4294 | 1492 | 0.4429 | 0.4809 | 0.4429 | 0.6655 |
| 0.0988 | 8.4407 | 1494 | 0.4428 | 0.4809 | 0.4428 | 0.6654 |
| 0.0988 | 8.4520 | 1496 | 0.4417 | 0.4809 | 0.4417 | 0.6646 |
| 0.0988 | 8.4633 | 1498 | 0.4423 | 0.4809 | 0.4423 | 0.6650 |
| 0.0574 | 8.4746 | 1500 | 0.4422 | 0.4809 | 0.4422 | 0.6650 |
| 0.0574 | 8.4859 | 1502 | 0.4432 | 0.4273 | 0.4432 | 0.6657 |
| 0.0574 | 8.4972 | 1504 | 0.4445 | 0.4273 | 0.4445 | 0.6667 |
| 0.0574 | 8.5085 | 1506 | 0.4435 | 0.4273 | 0.4435 | 0.6660 |
| 0.0574 | 8.5198 | 1508 | 0.4439 | 0.4273 | 0.4439 | 0.6663 |
| 0.0574 | 8.5311 | 1510 | 0.4453 | 0.4273 | 0.4453 | 0.6673 |
| 0.0574 | 8.5424 | 1512 | 0.4468 | 0.4273 | 0.4468 | 0.6685 |
| 0.0574 | 8.5537 | 1514 | 0.4473 | 0.4809 | 0.4473 | 0.6688 |
| 0.0574 | 8.5650 | 1516 | 0.4455 | 0.4809 | 0.4455 | 0.6674 |
| 0.0574 | 8.5763 | 1518 | 0.4452 | 0.4809 | 0.4452 | 0.6672 |
| 0.0574 | 8.5876 | 1520 | 0.4466 | 0.4809 | 0.4466 | 0.6683 |
| 0.0574 | 8.5989 | 1522 | 0.4493 | 0.4809 | 0.4493 | 0.6703 |
| 0.0574 | 8.6102 | 1524 | 0.4512 | 0.4809 | 0.4512 | 0.6717 |
| 0.0574 | 8.6215 | 1526 | 0.4526 | 0.4809 | 0.4526 | 0.6728 |
| 0.0574 | 8.6328 | 1528 | 0.4546 | 0.4809 | 0.4546 | 0.6743 |
| 0.0574 | 8.6441 | 1530 | 0.4557 | 0.4809 | 0.4557 | 0.6751 |
| 0.0574 | 8.6554 | 1532 | 0.4581 | 0.4809 | 0.4581 | 0.6768 |
| 0.0574 | 8.6667 | 1534 | 0.4599 | 0.4809 | 0.4599 | 0.6782 |
| 0.0574 | 8.6780 | 1536 | 0.4617 | 0.4809 | 0.4617 | 0.6795 |
| 0.0574 | 8.6893 | 1538 | 0.4632 | 0.4809 | 0.4632 | 0.6806 |
| 0.0574 | 8.7006 | 1540 | 0.4646 | 0.4809 | 0.4646 | 0.6816 |
| 0.0574 | 8.7119 | 1542 | 0.4664 | 0.4809 | 0.4664 | 0.6829 |
| 0.0574 | 8.7232 | 1544 | 0.4672 | 0.4809 | 0.4672 | 0.6836 |
| 0.0574 | 8.7345 | 1546 | 0.4662 | 0.4809 | 0.4662 | 0.6828 |
| 0.0574 | 8.7458 | 1548 | 0.4643 | 0.4809 | 0.4643 | 0.6814 |
| 0.0574 | 8.7571 | 1550 | 0.4626 | 0.4809 | 0.4626 | 0.6801 |
| 0.0574 | 8.7684 | 1552 | 0.4612 | 0.4809 | 0.4612 | 0.6791 |
| 0.0574 | 8.7797 | 1554 | 0.4593 | 0.4809 | 0.4593 | 0.6777 |
| 0.0574 | 8.7910 | 1556 | 0.4575 | 0.4809 | 0.4575 | 0.6764 |
| 0.0574 | 8.8023 | 1558 | 0.4559 | 0.4809 | 0.4559 | 0.6752 |
| 0.0574 | 8.8136 | 1560 | 0.4525 | 0.4809 | 0.4525 | 0.6727 |
| 0.0574 | 8.8249 | 1562 | 0.4512 | 0.4809 | 0.4512 | 0.6717 |
| 0.0574 | 8.8362 | 1564 | 0.4501 | 0.4809 | 0.4501 | 0.6709 |
| 0.0574 | 8.8475 | 1566 | 0.4483 | 0.4809 | 0.4483 | 0.6696 |
| 0.0574 | 8.8588 | 1568 | 0.4482 | 0.4809 | 0.4482 | 0.6695 |
| 0.0574 | 8.8701 | 1570 | 0.4480 | 0.4772 | 0.4480 | 0.6693 |
| 0.0574 | 8.8814 | 1572 | 0.4477 | 0.4772 | 0.4477 | 0.6691 |
| 0.0574 | 8.8927 | 1574 | 0.4485 | 0.4772 | 0.4485 | 0.6697 |
| 0.0574 | 8.9040 | 1576 | 0.4496 | 0.5549 | 0.4496 | 0.6705 |
| 0.0574 | 8.9153 | 1578 | 0.4471 | 0.4772 | 0.4471 | 0.6686 |
| 0.0574 | 8.9266 | 1580 | 0.4437 | 0.4772 | 0.4437 | 0.6661 |
| 0.0574 | 8.9379 | 1582 | 0.4403 | 0.4809 | 0.4403 | 0.6635 |
| 0.0574 | 8.9492 | 1584 | 0.4370 | 0.4809 | 0.4370 | 0.6611 |
| 0.0574 | 8.9605 | 1586 | 0.4367 | 0.4809 | 0.4367 | 0.6608 |
| 0.0574 | 8.9718 | 1588 | 0.4372 | 0.4809 | 0.4372 | 0.6612 |
| 0.0574 | 8.9831 | 1590 | 0.4379 | 0.4772 | 0.4379 | 0.6618 |
| 0.0574 | 8.9944 | 1592 | 0.4381 | 0.4772 | 0.4381 | 0.6619 |
| 0.0574 | 9.0056 | 1594 | 0.4379 | 0.5549 | 0.4379 | 0.6617 |
| 0.0574 | 9.0169 | 1596 | 0.4385 | 0.5549 | 0.4385 | 0.6622 |
| 0.0574 | 9.0282 | 1598 | 0.4374 | 0.5549 | 0.4374 | 0.6614 |
| 0.0574 | 9.0395 | 1600 | 0.4389 | 0.5549 | 0.4389 | 0.6625 |
| 0.0574 | 9.0508 | 1602 | 0.4441 | 0.5549 | 0.4441 | 0.6664 |
| 0.0574 | 9.0621 | 1604 | 0.4510 | 0.5097 | 0.4510 | 0.6715 |
| 0.0574 | 9.0734 | 1606 | 0.4560 | 0.5097 | 0.4560 | 0.6753 |
| 0.0574 | 9.0847 | 1608 | 0.4625 | 0.5097 | 0.4625 | 0.6800 |
| 0.0574 | 9.0960 | 1610 | 0.4676 | 0.5097 | 0.4676 | 0.6838 |
| 0.0574 | 9.1073 | 1612 | 0.4698 | 0.5097 | 0.4698 | 0.6855 |
| 0.0574 | 9.1186 | 1614 | 0.4682 | 0.5097 | 0.4682 | 0.6842 |
| 0.0574 | 9.1299 | 1616 | 0.4637 | 0.5097 | 0.4637 | 0.6810 |
| 0.0574 | 9.1412 | 1618 | 0.4590 | 0.5097 | 0.4590 | 0.6775 |
| 0.0574 | 9.1525 | 1620 | 0.4531 | 0.5097 | 0.4531 | 0.6731 |
| 0.0574 | 9.1638 | 1622 | 0.4462 | 0.5549 | 0.4462 | 0.6680 |
| 0.0574 | 9.1751 | 1624 | 0.4387 | 0.4772 | 0.4387 | 0.6623 |
| 0.0574 | 9.1864 | 1626 | 0.4325 | 0.4809 | 0.4325 | 0.6576 |
| 0.0574 | 9.1977 | 1628 | 0.4273 | 0.4809 | 0.4273 | 0.6537 |
| 0.0574 | 9.2090 | 1630 | 0.4231 | 0.5333 | 0.4231 | 0.6505 |
| 0.0574 | 9.2203 | 1632 | 0.4205 | 0.5333 | 0.4205 | 0.6485 |
| 0.0574 | 9.2316 | 1634 | 0.4184 | 0.5333 | 0.4184 | 0.6469 |
| 0.0574 | 9.2429 | 1636 | 0.4171 | 0.5333 | 0.4171 | 0.6458 |
| 0.0574 | 9.2542 | 1638 | 0.4165 | 0.5333 | 0.4165 | 0.6454 |
| 0.0574 | 9.2655 | 1640 | 0.4163 | 0.5333 | 0.4163 | 0.6452 |
| 0.0574 | 9.2768 | 1642 | 0.4165 | 0.5333 | 0.4165 | 0.6454 |
| 0.0574 | 9.2881 | 1644 | 0.4168 | 0.5333 | 0.4168 | 0.6456 |
| 0.0574 | 9.2994 | 1646 | 0.4176 | 0.5333 | 0.4176 | 0.6462 |
| 0.0574 | 9.3107 | 1648 | 0.4185 | 0.5333 | 0.4185 | 0.6469 |
| 0.0574 | 9.3220 | 1650 | 0.4194 | 0.5333 | 0.4194 | 0.6476 |
| 0.0574 | 9.3333 | 1652 | 0.4196 | 0.5333 | 0.4196 | 0.6477 |
| 0.0574 | 9.3446 | 1654 | 0.4202 | 0.5333 | 0.4202 | 0.6482 |
| 0.0574 | 9.3559 | 1656 | 0.4214 | 0.5333 | 0.4214 | 0.6491 |
| 0.0574 | 9.3672 | 1658 | 0.4228 | 0.5333 | 0.4228 | 0.6502 |
| 0.0574 | 9.3785 | 1660 | 0.4241 | 0.5333 | 0.4241 | 0.6512 |
| 0.0574 | 9.3898 | 1662 | 0.4253 | 0.5333 | 0.4253 | 0.6521 |
| 0.0574 | 9.4011 | 1664 | 0.4266 | 0.5333 | 0.4266 | 0.6532 |
| 0.0574 | 9.4124 | 1666 | 0.4279 | 0.5333 | 0.4279 | 0.6542 |
| 0.0574 | 9.4237 | 1668 | 0.4289 | 0.5333 | 0.4289 | 0.6549 |
| 0.0574 | 9.4350 | 1670 | 0.4306 | 0.5333 | 0.4306 | 0.6562 |
| 0.0574 | 9.4463 | 1672 | 0.4328 | 0.5333 | 0.4328 | 0.6579 |
| 0.0574 | 9.4576 | 1674 | 0.4349 | 0.4809 | 0.4349 | 0.6595 |
| 0.0574 | 9.4689 | 1676 | 0.4374 | 0.4809 | 0.4374 | 0.6614 |
| 0.0574 | 9.4802 | 1678 | 0.4401 | 0.4809 | 0.4401 | 0.6634 |
| 0.0574 | 9.4915 | 1680 | 0.4421 | 0.4809 | 0.4421 | 0.6649 |
| 0.0574 | 9.5028 | 1682 | 0.4434 | 0.4809 | 0.4434 | 0.6659 |
| 0.0574 | 9.5141 | 1684 | 0.4449 | 0.4809 | 0.4449 | 0.6670 |
| 0.0574 | 9.5254 | 1686 | 0.4452 | 0.4809 | 0.4452 | 0.6673 |
| 0.0574 | 9.5367 | 1688 | 0.4460 | 0.4809 | 0.4460 | 0.6678 |
| 0.0574 | 9.5480 | 1690 | 0.4466 | 0.4809 | 0.4466 | 0.6683 |
| 0.0574 | 9.5593 | 1692 | 0.4467 | 0.4809 | 0.4467 | 0.6683 |
| 0.0574 | 9.5706 | 1694 | 0.4471 | 0.4809 | 0.4471 | 0.6687 |
| 0.0574 | 9.5819 | 1696 | 0.4479 | 0.4809 | 0.4479 | 0.6693 |
| 0.0574 | 9.5932 | 1698 | 0.4487 | 0.4809 | 0.4487 | 0.6698 |
| 0.0574 | 9.6045 | 1700 | 0.4491 | 0.4809 | 0.4491 | 0.6702 |
| 0.0574 | 9.6158 | 1702 | 0.4498 | 0.4809 | 0.4498 | 0.6706 |
| 0.0574 | 9.6271 | 1704 | 0.4499 | 0.4809 | 0.4499 | 0.6708 |
| 0.0574 | 9.6384 | 1706 | 0.4493 | 0.4809 | 0.4493 | 0.6703 |
| 0.0574 | 9.6497 | 1708 | 0.4484 | 0.4809 | 0.4484 | 0.6696 |
| 0.0574 | 9.6610 | 1710 | 0.4475 | 0.4809 | 0.4475 | 0.6690 |
| 0.0574 | 9.6723 | 1712 | 0.4466 | 0.4809 | 0.4466 | 0.6683 |
| 0.0574 | 9.6836 | 1714 | 0.4460 | 0.4809 | 0.4460 | 0.6678 |
| 0.0574 | 9.6949 | 1716 | 0.4450 | 0.4809 | 0.4450 | 0.6671 |
| 0.0574 | 9.7062 | 1718 | 0.4449 | 0.4809 | 0.4449 | 0.6670 |
| 0.0574 | 9.7175 | 1720 | 0.4450 | 0.4809 | 0.4450 | 0.6671 |
| 0.0574 | 9.7288 | 1722 | 0.4443 | 0.4809 | 0.4443 | 0.6666 |
| 0.0574 | 9.7401 | 1724 | 0.4435 | 0.4809 | 0.4435 | 0.6659 |
| 0.0574 | 9.7514 | 1726 | 0.4434 | 0.4809 | 0.4434 | 0.6659 |
| 0.0574 | 9.7627 | 1728 | 0.4439 | 0.4809 | 0.4439 | 0.6662 |
| 0.0574 | 9.7740 | 1730 | 0.4448 | 0.4809 | 0.4448 | 0.6669 |
| 0.0574 | 9.7853 | 1732 | 0.4450 | 0.4809 | 0.4450 | 0.6671 |
| 0.0574 | 9.7966 | 1734 | 0.4445 | 0.4809 | 0.4445 | 0.6667 |
| 0.0574 | 9.8079 | 1736 | 0.4443 | 0.4809 | 0.4443 | 0.6666 |
| 0.0574 | 9.8192 | 1738 | 0.4437 | 0.4809 | 0.4437 | 0.6661 |
| 0.0574 | 9.8305 | 1740 | 0.4430 | 0.4809 | 0.4430 | 0.6656 |
| 0.0574 | 9.8418 | 1742 | 0.4424 | 0.4809 | 0.4424 | 0.6651 |
| 0.0574 | 9.8531 | 1744 | 0.4418 | 0.4809 | 0.4418 | 0.6647 |
| 0.0574 | 9.8644 | 1746 | 0.4413 | 0.4809 | 0.4413 | 0.6643 |
| 0.0574 | 9.8757 | 1748 | 0.4408 | 0.4809 | 0.4408 | 0.6640 |
| 0.0574 | 9.8870 | 1750 | 0.4407 | 0.4809 | 0.4407 | 0.6638 |
| 0.0574 | 9.8983 | 1752 | 0.4406 | 0.4809 | 0.4406 | 0.6638 |
| 0.0574 | 9.9096 | 1754 | 0.4403 | 0.4809 | 0.4403 | 0.6636 |
| 0.0574 | 9.9209 | 1756 | 0.4401 | 0.4809 | 0.4401 | 0.6634 |
| 0.0574 | 9.9322 | 1758 | 0.4399 | 0.4809 | 0.4399 | 0.6633 |
| 0.0574 | 9.9435 | 1760 | 0.4397 | 0.4809 | 0.4397 | 0.6631 |
| 0.0574 | 9.9548 | 1762 | 0.4395 | 0.4809 | 0.4395 | 0.6630 |
| 0.0574 | 9.9661 | 1764 | 0.4394 | 0.4809 | 0.4394 | 0.6629 |
| 0.0574 | 9.9774 | 1766 | 0.4393 | 0.4809 | 0.4393 | 0.6628 |
| 0.0574 | 9.9887 | 1768 | 0.4393 | 0.4809 | 0.4393 | 0.6628 |
| 0.0574 | 10.0 | 1770 | 0.4392 | 0.4809 | 0.4392 | 0.6627 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
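Reproducing this environment is easiest by pinning the versions above; a minimal sketch (assuming the CUDA 11.8 PyTorch wheel implied by `2.4.0+cu118`):

```bash
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```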
|
hayatoshibahara/distilhubert-finetuned-gtzan | hayatoshibahara | 2024-11-25T06:30:02Z | 159 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"hubert",
"audio-classification",
"generated_from_trainer",
"dataset:marsyas/gtzan",
"base_model:ntu-spml/distilhubert",
"base_model:finetune:ntu-spml/distilhubert",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | audio-classification | 2024-11-25T05:43:59Z | ---
library_name: transformers
license: apache-2.0
base_model: ntu-spml/distilhubert
tags:
- generated_from_trainer
datasets:
- marsyas/gtzan
metrics:
- accuracy
model-index:
- name: distilhubert-finetuned-gtzan
results:
- task:
name: Audio Classification
type: audio-classification
dataset:
name: GTZAN
type: marsyas/gtzan
config: all
split: None
args: all
metrics:
- name: Accuracy
type: accuracy
value: 0.82
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilhubert-finetuned-gtzan
This model is a fine-tuned version of [ntu-spml/distilhubert](https://huggingface.co/ntu-spml/distilhubert) on the GTZAN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6298
- Accuracy: 0.82
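The card omits a usage section; a minimal inference sketch with the standard `transformers` audio-classification pipeline would look like the following (the audio path is illustrative, and decoding needs `ffmpeg` or `soundfile` available):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline(
    "audio-classification",
    model="hayatoshibahara/distilhubert-finetuned-gtzan",
)

# Classify a local clip into one of the ten GTZAN genres.
predictions = classifier("example_clip.wav")  # illustrative path
print(predictions)  # [{"label": ..., "score": ...}, ...]
```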
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
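A minimal sketch of the same configuration expressed as `transformers` `TrainingArguments` (the `output_dir` is illustrative; dataset loading and the `Trainer` call are omitted):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilhubert-finetuned-gtzan",  # illustrative
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",        # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    fp16=True,                  # native AMP mixed precision
)
```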
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9297 | 1.0 | 113 | 1.8304 | 0.52 |
| 1.1908 | 2.0 | 226 | 1.2680 | 0.62 |
| 0.997 | 3.0 | 339 | 1.0931 | 0.66 |
| 0.6903 | 4.0 | 452 | 0.8967 | 0.71 |
| 0.5372 | 5.0 | 565 | 0.7350 | 0.8 |
| 0.4878 | 6.0 | 678 | 0.6679 | 0.82 |
| 0.2688 | 7.0 | 791 | 0.6463 | 0.8 |
| 0.1214 | 8.0 | 904 | 0.6345 | 0.82 |
| 0.1472 | 9.0 | 1017 | 0.6368 | 0.83 |
| 0.0925 | 10.0 | 1130 | 0.6298 | 0.82 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu124
- Datasets 3.0.2
- Tokenizers 0.20.3
|
BoringAnt1793/metehan-toxic-spans-bert-small | BoringAnt1793 | 2024-11-25T06:23:46Z | 105 | 0 | transformers | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:45:46Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
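Until this section is filled in, a minimal sketch assuming the standard `transformers` text-classification pipeline applies to this checkpoint (the task and label set are not documented here, so treat the output as illustrative):

```python
from transformers import pipeline

# Model id taken from this repository; label semantics are undocumented.
classifier = pipeline(
    "text-classification",
    model="BoringAnt1793/metehan-toxic-spans-bert-small",
)
print(classifier("You are a wonderful person."))
```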
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k20_task2_organization_fold1 | MayBashendy | 2024-11-25T06:19:45Z | 160 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T21:18:34Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k20_task2_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k20_task2_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7076
- Qwk: 0.2857
- Mse: 0.7076
- Rmse: 0.8412
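Qwk here is presumably quadratic weighted Cohen's kappa, the usual agreement metric for ordinal scores, and Rmse is simply the square root of Mse; a minimal sketch of how the three relate, using `scikit-learn` on illustrative labels:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold scores and predictions on an ordinal scale.
y_true = np.array([0, 1, 2, 1, 0, 2])
y_pred = np.array([0, 1, 1, 1, 0, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # RMSE is the square root of MSE
print(qwk, mse, rmse)
```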
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the optimizer sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
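These settings map onto a `transformers` `TrainingArguments` object roughly as follows; this is a sketch, and the output directory plus anything not listed above are assumptions:
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # betas=(0.9, 0.999) and epsilon=1e-08 are the transformers Adam defaults
)
```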
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0124 | 2 | 4.0570 | 0.0177 | 4.0570 | 2.0142 |
| No log | 0.0248 | 4 | 2.1502 | -0.0699 | 2.1502 | 1.4664 |
| No log | 0.0373 | 6 | 1.5104 | -0.0851 | 1.5104 | 1.2290 |
| No log | 0.0497 | 8 | 1.1280 | 0.0 | 1.1280 | 1.0621 |
| No log | 0.0621 | 10 | 1.1115 | -0.0615 | 1.1115 | 1.0543 |
| No log | 0.0745 | 12 | 1.0992 | 0.0 | 1.0992 | 1.0484 |
| No log | 0.0870 | 14 | 1.2175 | 0.0 | 1.2175 | 1.1034 |
| No log | 0.0994 | 16 | 1.3496 | 0.0 | 1.3496 | 1.1617 |
| No log | 0.1118 | 18 | 1.3816 | 0.0 | 1.3816 | 1.1754 |
| No log | 0.1242 | 20 | 1.4290 | -0.0274 | 1.4290 | 1.1954 |
| No log | 0.1366 | 22 | 1.4453 | 0.0 | 1.4453 | 1.2022 |
| No log | 0.1491 | 24 | 1.2434 | 0.0 | 1.2434 | 1.1151 |
| No log | 0.1615 | 26 | 1.2372 | 0.0 | 1.2372 | 1.1123 |
| No log | 0.1739 | 28 | 1.1607 | 0.0 | 1.1607 | 1.0773 |
| No log | 0.1863 | 30 | 1.0016 | 0.0 | 1.0016 | 1.0008 |
| No log | 0.1988 | 32 | 0.9569 | 0.0 | 0.9569 | 0.9782 |
| No log | 0.2112 | 34 | 1.0188 | 0.0 | 1.0188 | 1.0094 |
| No log | 0.2236 | 36 | 1.1075 | 0.0 | 1.1075 | 1.0524 |
| No log | 0.2360 | 38 | 1.2507 | -0.0274 | 1.2507 | 1.1184 |
| No log | 0.2484 | 40 | 1.4305 | 0.1064 | 1.4305 | 1.1960 |
| No log | 0.2609 | 42 | 1.4081 | 0.0792 | 1.4081 | 1.1866 |
| No log | 0.2733 | 44 | 1.2032 | 0.0690 | 1.2032 | 1.0969 |
| No log | 0.2857 | 46 | 1.2310 | 0.0250 | 1.2310 | 1.1095 |
| No log | 0.2981 | 48 | 1.3572 | 0.0792 | 1.3572 | 1.1650 |
| No log | 0.3106 | 50 | 1.1948 | 0.0 | 1.1948 | 1.0931 |
| No log | 0.3230 | 52 | 0.9325 | -0.0615 | 0.9325 | 0.9657 |
| No log | 0.3354 | 54 | 0.8528 | 0.0 | 0.8528 | 0.9235 |
| No log | 0.3478 | 56 | 0.8369 | 0.0 | 0.8369 | 0.9148 |
| No log | 0.3602 | 58 | 0.8264 | 0.0308 | 0.8264 | 0.9091 |
| No log | 0.3727 | 60 | 0.8184 | 0.0 | 0.8184 | 0.9047 |
| No log | 0.3851 | 62 | 0.9077 | 0.0 | 0.9077 | 0.9527 |
| No log | 0.3975 | 64 | 0.9715 | 0.0 | 0.9715 | 0.9857 |
| No log | 0.4099 | 66 | 1.0118 | 0.0 | 1.0118 | 1.0059 |
| No log | 0.4224 | 68 | 0.9735 | 0.0 | 0.9735 | 0.9867 |
| No log | 0.4348 | 70 | 0.8121 | 0.0308 | 0.8121 | 0.9012 |
| No log | 0.4472 | 72 | 0.7334 | 0.1639 | 0.7334 | 0.8564 |
| No log | 0.4596 | 74 | 0.7769 | 0.0339 | 0.7769 | 0.8814 |
| No log | 0.4720 | 76 | 0.8242 | 0.0656 | 0.8242 | 0.9079 |
| No log | 0.4845 | 78 | 0.8417 | 0.0323 | 0.8417 | 0.9174 |
| No log | 0.4969 | 80 | 0.8034 | 0.0323 | 0.8034 | 0.8963 |
| No log | 0.5093 | 82 | 0.7686 | 0.0323 | 0.7686 | 0.8767 |
| No log | 0.5217 | 84 | 0.6756 | 0.1053 | 0.6756 | 0.8220 |
| No log | 0.5342 | 86 | 0.6261 | 0.1111 | 0.6261 | 0.7913 |
| No log | 0.5466 | 88 | 0.6811 | 0.0690 | 0.6811 | 0.8253 |
| No log | 0.5590 | 90 | 0.6879 | 0.0 | 0.6879 | 0.8294 |
| No log | 0.5714 | 92 | 0.6654 | 0.0727 | 0.6654 | 0.8157 |
| No log | 0.5839 | 94 | 0.6573 | 0.1923 | 0.6573 | 0.8107 |
| No log | 0.5963 | 96 | 0.7331 | 0.1176 | 0.7331 | 0.8562 |
| No log | 0.6087 | 98 | 0.7406 | 0.1176 | 0.7406 | 0.8606 |
| No log | 0.6211 | 100 | 0.6981 | 0.1724 | 0.6981 | 0.8355 |
| No log | 0.6335 | 102 | 0.7587 | 0.1739 | 0.7587 | 0.8710 |
| No log | 0.6460 | 104 | 0.7635 | 0.1739 | 0.7635 | 0.8738 |
| No log | 0.6584 | 106 | 0.7313 | 0.1356 | 0.7313 | 0.8551 |
| No log | 0.6708 | 108 | 0.7733 | 0.4156 | 0.7733 | 0.8794 |
| No log | 0.6832 | 110 | 0.8023 | 0.4156 | 0.8023 | 0.8957 |
| No log | 0.6957 | 112 | 0.7892 | 0.4156 | 0.7892 | 0.8884 |
| No log | 0.7081 | 114 | 0.8050 | 0.4156 | 0.8050 | 0.8972 |
| No log | 0.7205 | 116 | 0.8214 | 0.4156 | 0.8214 | 0.9063 |
| No log | 0.7329 | 118 | 0.8121 | 0.3478 | 0.8121 | 0.9011 |
| No log | 0.7453 | 120 | 0.7958 | 0.1639 | 0.7958 | 0.8921 |
| No log | 0.7578 | 122 | 0.8632 | 0.3333 | 0.8632 | 0.9291 |
| No log | 0.7702 | 124 | 1.0959 | -0.1471 | 1.0959 | 1.0468 |
| No log | 0.7826 | 126 | 1.1273 | -0.1096 | 1.1273 | 1.0617 |
| No log | 0.7950 | 128 | 0.9483 | 0.1493 | 0.9483 | 0.9738 |
| No log | 0.8075 | 130 | 0.8337 | 0.2105 | 0.8337 | 0.9131 |
| No log | 0.8199 | 132 | 0.9088 | 0.3571 | 0.9088 | 0.9533 |
| No log | 0.8323 | 134 | 1.0736 | 0.1429 | 1.0736 | 1.0361 |
| No log | 0.8447 | 136 | 1.2419 | 0.1429 | 1.2419 | 1.1144 |
| No log | 0.8571 | 138 | 1.2021 | 0.1429 | 1.2021 | 1.0964 |
| No log | 0.8696 | 140 | 0.9687 | 0.3721 | 0.9687 | 0.9842 |
| No log | 0.8820 | 142 | 0.8568 | 0.1818 | 0.8568 | 0.9257 |
| No log | 0.8944 | 144 | 0.8414 | 0.1818 | 0.8414 | 0.9173 |
| No log | 0.9068 | 146 | 0.8720 | 0.2388 | 0.8720 | 0.9338 |
| No log | 0.9193 | 148 | 0.9359 | 0.2388 | 0.9359 | 0.9674 |
| No log | 0.9317 | 150 | 0.9625 | 0.2388 | 0.9625 | 0.9811 |
| No log | 0.9441 | 152 | 0.9125 | 0.0656 | 0.9125 | 0.9552 |
| No log | 0.9565 | 154 | 0.8702 | 0.1739 | 0.8702 | 0.9329 |
| No log | 0.9689 | 156 | 0.8679 | 0.1739 | 0.8679 | 0.9316 |
| No log | 0.9814 | 158 | 0.9000 | 0.3077 | 0.9000 | 0.9487 |
| No log | 0.9938 | 160 | 0.9578 | 0.1176 | 0.9578 | 0.9787 |
| No log | 1.0062 | 162 | 1.0722 | 0.1127 | 1.0722 | 1.0355 |
| No log | 1.0186 | 164 | 1.2149 | 0.0 | 1.2149 | 1.1022 |
| No log | 1.0311 | 166 | 1.2342 | 0.0217 | 1.2342 | 1.1110 |
| No log | 1.0435 | 168 | 1.0780 | 0.0571 | 1.0780 | 1.0383 |
| No log | 1.0559 | 170 | 0.8539 | 0.1356 | 0.8539 | 0.9241 |
| No log | 1.0683 | 172 | 0.7505 | 0.1562 | 0.7505 | 0.8663 |
| No log | 1.0807 | 174 | 0.7467 | 0.1356 | 0.7467 | 0.8641 |
| No log | 1.0932 | 176 | 0.8414 | 0.1053 | 0.8414 | 0.9173 |
| No log | 1.1056 | 178 | 1.0172 | 0.0308 | 1.0172 | 1.0086 |
| No log | 1.1180 | 180 | 1.1166 | 0.0571 | 1.1166 | 1.0567 |
| No log | 1.1304 | 182 | 1.0697 | 0.0571 | 1.0697 | 1.0343 |
| No log | 1.1429 | 184 | 0.8948 | -0.0678 | 0.8948 | 0.9459 |
| No log | 1.1553 | 186 | 0.7133 | 0.0357 | 0.7133 | 0.8445 |
| No log | 1.1677 | 188 | 0.6611 | 0.1111 | 0.6611 | 0.8131 |
| No log | 1.1801 | 190 | 0.6927 | 0.1724 | 0.6927 | 0.8323 |
| No log | 1.1925 | 192 | 0.7486 | 0.3014 | 0.7486 | 0.8652 |
| No log | 1.2050 | 194 | 0.8367 | 0.3377 | 0.8367 | 0.9147 |
| No log | 1.2174 | 196 | 0.9288 | 0.2683 | 0.9288 | 0.9637 |
| No log | 1.2298 | 198 | 1.0184 | 0.2667 | 1.0184 | 1.0091 |
| No log | 1.2422 | 200 | 1.0315 | 0.2174 | 1.0315 | 1.0156 |
| No log | 1.2547 | 202 | 0.9773 | 0.3014 | 0.9773 | 0.9886 |
| No log | 1.2671 | 204 | 0.9870 | 0.2703 | 0.9870 | 0.9935 |
| No log | 1.2795 | 206 | 0.9870 | 0.3014 | 0.9870 | 0.9935 |
| No log | 1.2919 | 208 | 0.9847 | 0.3077 | 0.9847 | 0.9923 |
| No log | 1.3043 | 210 | 1.0383 | 0.2826 | 1.0383 | 1.0190 |
| No log | 1.3168 | 212 | 1.0040 | 0.2000 | 1.0040 | 1.0020 |
| No log | 1.3292 | 214 | 0.9129 | 0.2683 | 0.9129 | 0.9555 |
| No log | 1.3416 | 216 | 0.8224 | 0.3377 | 0.8224 | 0.9069 |
| No log | 1.3540 | 218 | 0.7268 | 0.2857 | 0.7268 | 0.8525 |
| No log | 1.3665 | 220 | 0.6927 | 0.2857 | 0.6927 | 0.8323 |
| No log | 1.3789 | 222 | 0.6983 | 0.2857 | 0.6983 | 0.8357 |
| No log | 1.3913 | 224 | 0.7293 | 0.1724 | 0.7293 | 0.8540 |
| No log | 1.4037 | 226 | 0.8535 | 0.1111 | 0.8535 | 0.9238 |
| No log | 1.4161 | 228 | 0.9195 | 0.1639 | 0.9195 | 0.9589 |
| No log | 1.4286 | 230 | 0.9083 | 0.1562 | 0.9083 | 0.9530 |
| No log | 1.4410 | 232 | 0.9771 | 0.2785 | 0.9771 | 0.9885 |
| No log | 1.4534 | 234 | 1.0745 | 0.1818 | 1.0745 | 1.0366 |
| No log | 1.4658 | 236 | 1.1192 | 0.1290 | 1.1192 | 1.0579 |
| No log | 1.4783 | 238 | 1.1329 | 0.1600 | 1.1329 | 1.0644 |
| No log | 1.4907 | 240 | 1.1361 | 0.1600 | 1.1361 | 1.0659 |
| No log | 1.5031 | 242 | 1.0863 | 0.2105 | 1.0863 | 1.0423 |
| No log | 1.5155 | 244 | 1.1139 | 0.2418 | 1.1139 | 1.0554 |
| No log | 1.5280 | 246 | 1.1750 | 0.2569 | 1.1750 | 1.0840 |
| No log | 1.5404 | 248 | 1.1860 | 0.1913 | 1.1860 | 1.0890 |
| No log | 1.5528 | 250 | 1.0269 | 0.1935 | 1.0269 | 1.0134 |
| No log | 1.5652 | 252 | 0.8683 | 0.2817 | 0.8683 | 0.9318 |
| No log | 1.5776 | 254 | 0.7512 | 0.3333 | 0.7512 | 0.8667 |
| No log | 1.5901 | 256 | 0.6983 | 0.3284 | 0.6983 | 0.8356 |
| No log | 1.6025 | 258 | 0.6604 | 0.3226 | 0.6604 | 0.8126 |
| No log | 1.6149 | 260 | 0.6606 | 0.1509 | 0.6606 | 0.8127 |
| No log | 1.6273 | 262 | 0.6495 | 0.1509 | 0.6495 | 0.8059 |
| No log | 1.6398 | 264 | 0.6440 | 0.1509 | 0.6440 | 0.8025 |
| No log | 1.6522 | 266 | 0.7127 | 0.3390 | 0.7127 | 0.8442 |
| No log | 1.6646 | 268 | 0.7290 | 0.3390 | 0.7290 | 0.8538 |
| No log | 1.6770 | 270 | 0.7561 | 0.3390 | 0.7561 | 0.8695 |
| No log | 1.6894 | 272 | 0.7289 | 0.3390 | 0.7289 | 0.8537 |
| No log | 1.7019 | 274 | 0.6979 | 0.2759 | 0.6979 | 0.8354 |
| No log | 1.7143 | 276 | 0.7178 | 0.2857 | 0.7178 | 0.8473 |
| No log | 1.7267 | 278 | 0.7525 | 0.3014 | 0.7525 | 0.8675 |
| No log | 1.7391 | 280 | 0.7760 | 0.2609 | 0.7760 | 0.8809 |
| No log | 1.7516 | 282 | 0.8087 | 0.25 | 0.8087 | 0.8993 |
| No log | 1.7640 | 284 | 0.8162 | 0.3636 | 0.8162 | 0.9034 |
| No log | 1.7764 | 286 | 0.8040 | 0.3250 | 0.8040 | 0.8966 |
| No log | 1.7888 | 288 | 0.7789 | 0.3250 | 0.7789 | 0.8826 |
| No log | 1.8012 | 290 | 0.7824 | 0.2222 | 0.7824 | 0.8845 |
| No log | 1.8137 | 292 | 0.7249 | 0.3377 | 0.7249 | 0.8514 |
| No log | 1.8261 | 294 | 0.7117 | 0.3333 | 0.7117 | 0.8436 |
| No log | 1.8385 | 296 | 0.7486 | 0.3377 | 0.7486 | 0.8652 |
| No log | 1.8509 | 298 | 0.7513 | 0.3377 | 0.7513 | 0.8668 |
| No log | 1.8634 | 300 | 0.7721 | 0.3143 | 0.7721 | 0.8787 |
| No log | 1.8758 | 302 | 0.8003 | 0.2857 | 0.8003 | 0.8946 |
| No log | 1.8882 | 304 | 0.8410 | 0.3000 | 0.8410 | 0.9171 |
| No log | 1.9006 | 306 | 0.7868 | 0.2759 | 0.7868 | 0.8870 |
| No log | 1.9130 | 308 | 0.6970 | 0.1509 | 0.6970 | 0.8349 |
| No log | 1.9255 | 310 | 0.6647 | 0.1509 | 0.6647 | 0.8153 |
| No log | 1.9379 | 312 | 0.6437 | 0.1509 | 0.6437 | 0.8023 |
| No log | 1.9503 | 314 | 0.6754 | 0.1509 | 0.6754 | 0.8218 |
| No log | 1.9627 | 316 | 0.7035 | -0.1786 | 0.7035 | 0.8388 |
| No log | 1.9752 | 318 | 0.6922 | -0.1786 | 0.6922 | 0.8320 |
| No log | 1.9876 | 320 | 0.6776 | 0.1509 | 0.6776 | 0.8232 |
| No log | 2.0 | 322 | 0.6737 | 0.1509 | 0.6737 | 0.8208 |
| No log | 2.0124 | 324 | 0.6817 | 0.1509 | 0.6817 | 0.8257 |
| No log | 2.0248 | 326 | 0.7215 | 0.1509 | 0.7215 | 0.8494 |
| No log | 2.0373 | 328 | 0.7375 | 0.1509 | 0.7375 | 0.8588 |
| No log | 2.0497 | 330 | 0.7324 | 0.2258 | 0.7324 | 0.8558 |
| No log | 2.0621 | 332 | 0.7700 | 0.2388 | 0.7700 | 0.8775 |
| No log | 2.0745 | 334 | 0.7910 | 0.2258 | 0.7910 | 0.8894 |
| No log | 2.0870 | 336 | 0.8006 | 0.25 | 0.8006 | 0.8947 |
| No log | 2.0994 | 338 | 0.7879 | 0.2941 | 0.7879 | 0.8876 |
| No log | 2.1118 | 340 | 0.7768 | 0.2941 | 0.7768 | 0.8813 |
| No log | 2.1242 | 342 | 0.7615 | 0.2941 | 0.7615 | 0.8726 |
| No log | 2.1366 | 344 | 0.7298 | 0.2857 | 0.7298 | 0.8543 |
| No log | 2.1491 | 346 | 0.7210 | 0.2857 | 0.7210 | 0.8491 |
| No log | 2.1615 | 348 | 0.7455 | 0.2857 | 0.7455 | 0.8634 |
| No log | 2.1739 | 350 | 0.7795 | 0.3143 | 0.7795 | 0.8829 |
| No log | 2.1863 | 352 | 0.8062 | 0.1818 | 0.8062 | 0.8979 |
| No log | 2.1988 | 354 | 0.7726 | 0.2154 | 0.7726 | 0.8790 |
| No log | 2.2112 | 356 | 0.7667 | 0.2154 | 0.7667 | 0.8756 |
| No log | 2.2236 | 358 | 0.7412 | 0.1724 | 0.7412 | 0.8609 |
| No log | 2.2360 | 360 | 0.7796 | 0.0323 | 0.7796 | 0.8830 |
| No log | 2.2484 | 362 | 0.8774 | 0.0548 | 0.8774 | 0.9367 |
| No log | 2.2609 | 364 | 0.9749 | 0.0741 | 0.9749 | 0.9874 |
| No log | 2.2733 | 366 | 1.0432 | 0.0842 | 1.0432 | 1.0214 |
| No log | 2.2857 | 368 | 1.0245 | 0.0842 | 1.0245 | 1.0122 |
| No log | 2.2981 | 370 | 0.8443 | 0.1290 | 0.8443 | 0.9188 |
| No log | 2.3106 | 372 | 0.7381 | 0.2857 | 0.7381 | 0.8591 |
| No log | 2.3230 | 374 | 0.7511 | 0.25 | 0.7511 | 0.8667 |
| No log | 2.3354 | 376 | 0.7832 | 0.2941 | 0.7832 | 0.8850 |
| No log | 2.3478 | 378 | 0.8233 | 0.1739 | 0.8233 | 0.9074 |
| No log | 2.3602 | 380 | 0.8686 | 0.0323 | 0.8686 | 0.9320 |
| No log | 2.3727 | 382 | 1.0153 | 0.0571 | 1.0153 | 1.0076 |
| No log | 2.3851 | 384 | 1.0590 | 0.0769 | 1.0590 | 1.0291 |
| No log | 2.3975 | 386 | 0.9979 | 0.0526 | 0.9979 | 0.9989 |
| No log | 2.4099 | 388 | 0.8686 | -0.0328 | 0.8686 | 0.9320 |
| No log | 2.4224 | 390 | 0.8371 | 0.2941 | 0.8371 | 0.9149 |
| No log | 2.4348 | 392 | 0.8850 | 0.2388 | 0.8850 | 0.9408 |
| No log | 2.4472 | 394 | 0.8838 | 0.2388 | 0.8838 | 0.9401 |
| No log | 2.4596 | 396 | 0.8567 | 0.2388 | 0.8567 | 0.9256 |
| No log | 2.4720 | 398 | 0.8711 | 0.2286 | 0.8711 | 0.9333 |
| No log | 2.4845 | 400 | 0.9763 | -0.0286 | 0.9763 | 0.9881 |
| No log | 2.4969 | 402 | 0.9782 | 0.0800 | 0.9782 | 0.9890 |
| No log | 2.5093 | 404 | 0.9153 | 0.1081 | 0.9153 | 0.9567 |
| No log | 2.5217 | 406 | 0.8091 | 0.2388 | 0.8091 | 0.8995 |
| No log | 2.5342 | 408 | 0.7445 | 0.2388 | 0.7445 | 0.8628 |
| No log | 2.5466 | 410 | 0.7126 | 0.2388 | 0.7126 | 0.8441 |
| No log | 2.5590 | 412 | 0.7214 | 0.3390 | 0.7214 | 0.8493 |
| No log | 2.5714 | 414 | 0.8108 | 0.0 | 0.8108 | 0.9005 |
| No log | 2.5839 | 416 | 0.7998 | 0.0 | 0.7998 | 0.8943 |
| No log | 2.5963 | 418 | 0.7387 | 0.2222 | 0.7387 | 0.8595 |
| No log | 2.6087 | 420 | 0.6918 | 0.2857 | 0.6918 | 0.8317 |
| No log | 2.6211 | 422 | 0.6721 | 0.3284 | 0.6721 | 0.8198 |
| No log | 2.6335 | 424 | 0.6689 | 0.3284 | 0.6689 | 0.8179 |
| No log | 2.6460 | 426 | 0.6765 | 0.2857 | 0.6765 | 0.8225 |
| No log | 2.6584 | 428 | 0.6585 | 0.2857 | 0.6585 | 0.8115 |
| No log | 2.6708 | 430 | 0.6561 | 0.1509 | 0.6561 | 0.8100 |
| No log | 2.6832 | 432 | 0.6484 | 0.1509 | 0.6484 | 0.8052 |
| No log | 2.6957 | 434 | 0.6691 | 0.1509 | 0.6691 | 0.8180 |
| No log | 2.7081 | 436 | 0.7095 | 0.1509 | 0.7095 | 0.8423 |
| No log | 2.7205 | 438 | 0.7828 | 0.1639 | 0.7828 | 0.8848 |
| No log | 2.7329 | 440 | 0.7843 | 0.1639 | 0.7843 | 0.8856 |
| No log | 2.7453 | 442 | 0.7449 | 0.3200 | 0.7449 | 0.8631 |
| No log | 2.7578 | 444 | 0.7407 | 0.2941 | 0.7407 | 0.8607 |
| No log | 2.7702 | 446 | 0.7204 | 0.3284 | 0.7204 | 0.8487 |
| No log | 2.7826 | 448 | 0.7097 | 0.3284 | 0.7097 | 0.8425 |
| No log | 2.7950 | 450 | 0.6811 | 0.3284 | 0.6811 | 0.8253 |
| No log | 2.8075 | 452 | 0.6853 | 0.2759 | 0.6853 | 0.8278 |
| No log | 2.8199 | 454 | 0.7341 | -0.0364 | 0.7341 | 0.8568 |
| No log | 2.8323 | 456 | 0.7522 | 0.0323 | 0.7522 | 0.8673 |
| No log | 2.8447 | 458 | 0.7271 | -0.0364 | 0.7271 | 0.8527 |
| No log | 2.8571 | 460 | 0.7085 | -0.0364 | 0.7085 | 0.8417 |
| No log | 2.8696 | 462 | 0.6747 | 0.2941 | 0.6747 | 0.8214 |
| No log | 2.8820 | 464 | 0.6818 | 0.2941 | 0.6818 | 0.8257 |
| No log | 2.8944 | 466 | 0.7340 | 0.1818 | 0.7340 | 0.8567 |
| No log | 2.9068 | 468 | 0.8133 | 0.1818 | 0.8133 | 0.9018 |
| No log | 2.9193 | 470 | 0.7769 | 0.1481 | 0.7769 | 0.8814 |
| No log | 2.9317 | 472 | 0.7522 | 0.1892 | 0.7522 | 0.8673 |
| No log | 2.9441 | 474 | 0.6936 | 0.3514 | 0.6936 | 0.8328 |
| No log | 2.9565 | 476 | 0.6684 | 0.3284 | 0.6684 | 0.8176 |
| No log | 2.9689 | 478 | 0.6725 | 0.3284 | 0.6725 | 0.8201 |
| No log | 2.9814 | 480 | 0.6617 | 0.3158 | 0.6617 | 0.8134 |
| No log | 2.9938 | 482 | 0.6314 | 0.3284 | 0.6314 | 0.7946 |
| No log | 3.0062 | 484 | 0.6163 | 0.3284 | 0.6163 | 0.7851 |
| No log | 3.0186 | 486 | 0.6255 | 0.3284 | 0.6255 | 0.7909 |
| No log | 3.0311 | 488 | 0.6376 | 0.3226 | 0.6376 | 0.7985 |
| No log | 3.0435 | 490 | 0.6485 | 0.2857 | 0.6485 | 0.8053 |
| No log | 3.0559 | 492 | 0.6570 | 0.3284 | 0.6570 | 0.8106 |
| No log | 3.0683 | 494 | 0.6769 | 0.3284 | 0.6769 | 0.8228 |
| No log | 3.0807 | 496 | 0.6952 | 0.2059 | 0.6952 | 0.8338 |
| No log | 3.0932 | 498 | 0.7368 | 0.1639 | 0.7368 | 0.8584 |
| 0.4266 | 3.1056 | 500 | 0.7326 | 0.2388 | 0.7326 | 0.8559 |
| 0.4266 | 3.1180 | 502 | 0.6543 | 0.2000 | 0.6543 | 0.8089 |
| 0.4266 | 3.1304 | 504 | 0.5778 | 0.3158 | 0.5778 | 0.7602 |
| 0.4266 | 3.1429 | 506 | 0.5592 | 0.3226 | 0.5592 | 0.7478 |
| 0.4266 | 3.1553 | 508 | 0.5760 | 0.3284 | 0.5760 | 0.7589 |
| 0.4266 | 3.1677 | 510 | 0.6366 | 0.2000 | 0.6366 | 0.7979 |
| 0.4266 | 3.1801 | 512 | 0.6948 | 0.2623 | 0.6948 | 0.8335 |
| 0.4266 | 3.1925 | 514 | 0.6929 | 0.2154 | 0.6929 | 0.8324 |
| 0.4266 | 3.2050 | 516 | 0.6655 | 0.3284 | 0.6655 | 0.8158 |
| 0.4266 | 3.2174 | 518 | 0.6948 | 0.3284 | 0.6948 | 0.8335 |
| 0.4266 | 3.2298 | 520 | 0.6944 | 0.3284 | 0.6944 | 0.8333 |
| 0.4266 | 3.2422 | 522 | 0.7021 | 0.3284 | 0.7021 | 0.8379 |
| 0.4266 | 3.2547 | 524 | 0.7627 | 0.2154 | 0.7627 | 0.8733 |
| 0.4266 | 3.2671 | 526 | 0.8194 | 0.1892 | 0.8194 | 0.9052 |
| 0.4266 | 3.2795 | 528 | 0.7895 | 0.2388 | 0.7895 | 0.8885 |
| 0.4266 | 3.2919 | 530 | 0.7143 | 0.2857 | 0.7143 | 0.8452 |
| 0.4266 | 3.3043 | 532 | 0.6657 | 0.3284 | 0.6657 | 0.8159 |
| 0.4266 | 3.3168 | 534 | 0.7034 | 0.3333 | 0.7034 | 0.8387 |
| 0.4266 | 3.3292 | 536 | 0.7461 | 0.3377 | 0.7461 | 0.8638 |
| 0.4266 | 3.3416 | 538 | 0.7618 | 0.3333 | 0.7618 | 0.8728 |
| 0.4266 | 3.3540 | 540 | 0.8006 | 0.3077 | 0.8006 | 0.8947 |
| 0.4266 | 3.3665 | 542 | 0.8273 | 0.2785 | 0.8273 | 0.9095 |
| 0.4266 | 3.3789 | 544 | 0.8252 | 0.1750 | 0.8252 | 0.9084 |
| 0.4266 | 3.3913 | 546 | 0.8008 | 0.2785 | 0.8008 | 0.8949 |
| 0.4266 | 3.4037 | 548 | 0.7888 | 0.3077 | 0.7888 | 0.8882 |
| 0.4266 | 3.4161 | 550 | 0.7803 | 0.3077 | 0.7803 | 0.8833 |
| 0.4266 | 3.4286 | 552 | 0.7398 | 0.3077 | 0.7398 | 0.8601 |
| 0.4266 | 3.4410 | 554 | 0.7205 | 0.3077 | 0.7205 | 0.8488 |
| 0.4266 | 3.4534 | 556 | 0.6958 | 0.3077 | 0.6958 | 0.8342 |
| 0.4266 | 3.4658 | 558 | 0.7040 | 0.2857 | 0.7040 | 0.8391 |
| 0.4266 | 3.4783 | 560 | 0.7144 | 0.2857 | 0.7144 | 0.8452 |
| 0.4266 | 3.4907 | 562 | 0.7251 | 0.2857 | 0.7251 | 0.8515 |
| 0.4266 | 3.5031 | 564 | 0.7205 | 0.2941 | 0.7205 | 0.8488 |
| 0.4266 | 3.5155 | 566 | 0.7131 | 0.2941 | 0.7131 | 0.8445 |
| 0.4266 | 3.5280 | 568 | 0.7018 | 0.2857 | 0.7018 | 0.8377 |
| 0.4266 | 3.5404 | 570 | 0.6977 | 0.2857 | 0.6977 | 0.8353 |
| 0.4266 | 3.5528 | 572 | 0.7055 | 0.2941 | 0.7055 | 0.8399 |
| 0.4266 | 3.5652 | 574 | 0.7090 | 0.2857 | 0.7090 | 0.8421 |
| 0.4266 | 3.5776 | 576 | 0.7288 | 0.2857 | 0.7288 | 0.8537 |
| 0.4266 | 3.5901 | 578 | 0.7876 | 0.1562 | 0.7876 | 0.8875 |
| 0.4266 | 3.6025 | 580 | 0.8372 | 0.1127 | 0.8372 | 0.9150 |
| 0.4266 | 3.6149 | 582 | 0.9269 | 0.1628 | 0.9269 | 0.9628 |
| 0.4266 | 3.6273 | 584 | 0.9634 | 0.1758 | 0.9634 | 0.9815 |
| 0.4266 | 3.6398 | 586 | 0.9428 | 0.1758 | 0.9428 | 0.9710 |
| 0.4266 | 3.6522 | 588 | 0.8930 | 0.24 | 0.8930 | 0.9450 |
| 0.4266 | 3.6646 | 590 | 0.8447 | 0.3014 | 0.8447 | 0.9191 |
| 0.4266 | 3.6770 | 592 | 0.7942 | 0.2941 | 0.7942 | 0.8912 |
| 0.4266 | 3.6894 | 594 | 0.7630 | 0.2941 | 0.7630 | 0.8735 |
| 0.4266 | 3.7019 | 596 | 0.7832 | 0.2857 | 0.7832 | 0.8850 |
| 0.4266 | 3.7143 | 598 | 0.9030 | 0.1039 | 0.9030 | 0.9503 |
| 0.4266 | 3.7267 | 600 | 0.9781 | 0.0506 | 0.9781 | 0.9890 |
| 0.4266 | 3.7391 | 602 | 0.9774 | 0.0506 | 0.9774 | 0.9887 |
| 0.4266 | 3.7516 | 604 | 0.8865 | 0.0526 | 0.8865 | 0.9415 |
| 0.4266 | 3.7640 | 606 | 0.7948 | 0.1509 | 0.7948 | 0.8915 |
| 0.4266 | 3.7764 | 608 | 0.7520 | 0.2941 | 0.7520 | 0.8672 |
| 0.4266 | 3.7888 | 610 | 0.7646 | 0.2941 | 0.7646 | 0.8744 |
| 0.4266 | 3.8012 | 612 | 0.7966 | 0.2941 | 0.7966 | 0.8925 |
| 0.4266 | 3.8137 | 614 | 0.8506 | 0.1429 | 0.8506 | 0.9223 |
| 0.4266 | 3.8261 | 616 | 0.9083 | 0.0526 | 0.9083 | 0.9530 |
| 0.4266 | 3.8385 | 618 | 0.9897 | 0.0526 | 0.9897 | 0.9948 |
| 0.4266 | 3.8509 | 620 | 0.9823 | 0.0526 | 0.9823 | 0.9911 |
| 0.4266 | 3.8634 | 622 | 0.9032 | 0.0 | 0.9032 | 0.9503 |
| 0.4266 | 3.8758 | 624 | 0.8134 | 0.1509 | 0.8134 | 0.9019 |
| 0.4266 | 3.8882 | 626 | 0.7599 | 0.2857 | 0.7599 | 0.8717 |
| 0.4266 | 3.9006 | 628 | 0.7446 | 0.2941 | 0.7446 | 0.8629 |
| 0.4266 | 3.9130 | 630 | 0.7442 | 0.2941 | 0.7442 | 0.8626 |
| 0.4266 | 3.9255 | 632 | 0.7653 | 0.1509 | 0.7653 | 0.8748 |
| 0.4266 | 3.9379 | 634 | 0.7603 | 0.1509 | 0.7603 | 0.8720 |
| 0.4266 | 3.9503 | 636 | 0.7194 | 0.2857 | 0.7194 | 0.8482 |
| 0.4266 | 3.9627 | 638 | 0.6960 | 0.2857 | 0.6960 | 0.8343 |
| 0.4266 | 3.9752 | 640 | 0.6744 | 0.2857 | 0.6744 | 0.8212 |
| 0.4266 | 3.9876 | 642 | 0.6691 | 0.2857 | 0.6691 | 0.8180 |
| 0.4266 | 4.0 | 644 | 0.6765 | 0.2857 | 0.6765 | 0.8225 |
| 0.4266 | 4.0124 | 646 | 0.6956 | 0.1509 | 0.6956 | 0.8340 |
| 0.4266 | 4.0248 | 648 | 0.7465 | -0.0364 | 0.7465 | 0.8640 |
| 0.4266 | 4.0373 | 650 | 0.8008 | -0.0364 | 0.8008 | 0.8949 |
| 0.4266 | 4.0497 | 652 | 0.8510 | -0.1053 | 0.8510 | 0.9225 |
| 0.4266 | 4.0621 | 654 | 0.8115 | -0.0364 | 0.8115 | 0.9008 |
| 0.4266 | 4.0745 | 656 | 0.7574 | 0.1111 | 0.7574 | 0.8703 |
| 0.4266 | 4.0870 | 658 | 0.7415 | 0.2759 | 0.7415 | 0.8611 |
| 0.4266 | 4.0994 | 660 | 0.7373 | 0.2759 | 0.7373 | 0.8587 |
| 0.4266 | 4.1118 | 662 | 0.7298 | 0.2857 | 0.7298 | 0.8543 |
| 0.4266 | 4.1242 | 664 | 0.7304 | 0.2759 | 0.7304 | 0.8547 |
| 0.4266 | 4.1366 | 666 | 0.7536 | 0.1111 | 0.7536 | 0.8681 |
| 0.4266 | 4.1491 | 668 | 0.7632 | -0.0364 | 0.7632 | 0.8736 |
| 0.4266 | 4.1615 | 670 | 0.7770 | -0.0364 | 0.7770 | 0.8815 |
| 0.4266 | 4.1739 | 672 | 0.7535 | 0.1111 | 0.7535 | 0.8680 |
| 0.4266 | 4.1863 | 674 | 0.7430 | 0.2759 | 0.7430 | 0.8620 |
| 0.4266 | 4.1988 | 676 | 0.7464 | 0.2759 | 0.7464 | 0.8640 |
| 0.4266 | 4.2112 | 678 | 0.7391 | 0.2857 | 0.7391 | 0.8597 |
| 0.4266 | 4.2236 | 680 | 0.7357 | 0.2941 | 0.7357 | 0.8577 |
| 0.4266 | 4.2360 | 682 | 0.7383 | 0.2388 | 0.7383 | 0.8592 |
| 0.4266 | 4.2484 | 684 | 0.7507 | 0.25 | 0.7507 | 0.8664 |
| 0.4266 | 4.2609 | 686 | 0.7307 | 0.2388 | 0.7307 | 0.8548 |
| 0.4266 | 4.2733 | 688 | 0.6995 | 0.2258 | 0.6995 | 0.8364 |
| 0.4266 | 4.2857 | 690 | 0.7004 | 0.1509 | 0.7004 | 0.8369 |
| 0.4266 | 4.2981 | 692 | 0.7214 | 0.1111 | 0.7214 | 0.8494 |
| 0.4266 | 4.3106 | 694 | 0.7374 | 0.1818 | 0.7374 | 0.8587 |
| 0.4266 | 4.3230 | 696 | 0.7395 | 0.1111 | 0.7395 | 0.8600 |
| 0.4266 | 4.3354 | 698 | 0.7337 | 0.1111 | 0.7337 | 0.8566 |
| 0.4266 | 4.3478 | 700 | 0.7198 | 0.1509 | 0.7198 | 0.8484 |
| 0.4266 | 4.3602 | 702 | 0.7289 | 0.1509 | 0.7289 | 0.8538 |
| 0.4266 | 4.3727 | 704 | 0.7468 | 0.1111 | 0.7468 | 0.8642 |
| 0.4266 | 4.3851 | 706 | 0.7697 | 0.1111 | 0.7697 | 0.8773 |
| 0.4266 | 4.3975 | 708 | 0.7816 | 0.1111 | 0.7816 | 0.8841 |
| 0.4266 | 4.4099 | 710 | 0.7893 | 0.1111 | 0.7893 | 0.8884 |
| 0.4266 | 4.4224 | 712 | 0.7687 | 0.1111 | 0.7687 | 0.8768 |
| 0.4266 | 4.4348 | 714 | 0.7383 | 0.2759 | 0.7383 | 0.8593 |
| 0.4266 | 4.4472 | 716 | 0.7235 | 0.2857 | 0.7235 | 0.8506 |
| 0.4266 | 4.4596 | 718 | 0.7262 | 0.1509 | 0.7262 | 0.8522 |
| 0.4266 | 4.4720 | 720 | 0.7357 | 0.1509 | 0.7357 | 0.8577 |
| 0.4266 | 4.4845 | 722 | 0.7723 | -0.0364 | 0.7723 | 0.8788 |
| 0.4266 | 4.4969 | 724 | 0.7941 | -0.0364 | 0.7941 | 0.8911 |
| 0.4266 | 4.5093 | 726 | 0.8092 | 0.0323 | 0.8092 | 0.8995 |
| 0.4266 | 4.5217 | 728 | 0.8069 | 0.0597 | 0.8069 | 0.8983 |
| 0.4266 | 4.5342 | 730 | 0.8273 | 0.1667 | 0.8273 | 0.9096 |
| 0.4266 | 4.5466 | 732 | 0.8418 | 0.1667 | 0.8418 | 0.9175 |
| 0.4266 | 4.5590 | 734 | 0.8272 | 0.3200 | 0.8272 | 0.9095 |
| 0.4266 | 4.5714 | 736 | 0.7980 | 0.2941 | 0.7980 | 0.8933 |
| 0.4266 | 4.5839 | 738 | 0.7644 | 0.2941 | 0.7644 | 0.8743 |
| 0.4266 | 4.5963 | 740 | 0.7178 | 0.2941 | 0.7178 | 0.8472 |
| 0.4266 | 4.6087 | 742 | 0.6999 | 0.2941 | 0.6999 | 0.8366 |
| 0.4266 | 4.6211 | 744 | 0.6763 | 0.2941 | 0.6763 | 0.8224 |
| 0.4266 | 4.6335 | 746 | 0.6710 | 0.2388 | 0.6710 | 0.8191 |
| 0.4266 | 4.6460 | 748 | 0.6692 | 0.2857 | 0.6692 | 0.8180 |
| 0.4266 | 4.6584 | 750 | 0.6993 | 0.1562 | 0.6993 | 0.8363 |
| 0.4266 | 4.6708 | 752 | 0.7744 | 0.2703 | 0.7744 | 0.8800 |
| 0.4266 | 4.6832 | 754 | 0.8123 | 0.2222 | 0.8123 | 0.9013 |
| 0.4266 | 4.6957 | 756 | 0.7722 | 0.2388 | 0.7722 | 0.8787 |
| 0.4266 | 4.7081 | 758 | 0.7326 | 0.1562 | 0.7326 | 0.8559 |
| 0.4266 | 4.7205 | 760 | 0.7155 | 0.2941 | 0.7155 | 0.8459 |
| 0.4266 | 4.7329 | 762 | 0.7271 | 0.2388 | 0.7271 | 0.8527 |
| 0.4266 | 4.7453 | 764 | 0.7267 | 0.2388 | 0.7267 | 0.8525 |
| 0.4266 | 4.7578 | 766 | 0.7218 | 0.2941 | 0.7218 | 0.8496 |
| 0.4266 | 4.7702 | 768 | 0.7169 | 0.2388 | 0.7169 | 0.8467 |
| 0.4266 | 4.7826 | 770 | 0.7131 | 0.2388 | 0.7131 | 0.8444 |
| 0.4266 | 4.7950 | 772 | 0.7172 | 0.2388 | 0.7172 | 0.8469 |
| 0.4266 | 4.8075 | 774 | 0.7207 | 0.2388 | 0.7207 | 0.8489 |
| 0.4266 | 4.8199 | 776 | 0.7283 | 0.2941 | 0.7283 | 0.8534 |
| 0.4266 | 4.8323 | 778 | 0.7570 | 0.25 | 0.7570 | 0.8701 |
| 0.4266 | 4.8447 | 780 | 0.7870 | 0.2192 | 0.7870 | 0.8871 |
| 0.4266 | 4.8571 | 782 | 0.7688 | 0.1290 | 0.7688 | 0.8768 |
| 0.4266 | 4.8696 | 784 | 0.7430 | 0.2623 | 0.7430 | 0.8620 |
| 0.4266 | 4.8820 | 786 | 0.6951 | 0.25 | 0.6951 | 0.8337 |
| 0.4266 | 4.8944 | 788 | 0.6571 | 0.25 | 0.6571 | 0.8106 |
| 0.4266 | 4.9068 | 790 | 0.6603 | 0.25 | 0.6603 | 0.8126 |
| 0.4266 | 4.9193 | 792 | 0.6635 | 0.25 | 0.6635 | 0.8145 |
| 0.4266 | 4.9317 | 794 | 0.6560 | 0.2388 | 0.6560 | 0.8099 |
| 0.4266 | 4.9441 | 796 | 0.6551 | 0.25 | 0.6551 | 0.8094 |
| 0.4266 | 4.9565 | 798 | 0.6643 | 0.25 | 0.6643 | 0.8151 |
| 0.4266 | 4.9689 | 800 | 0.6697 | 0.25 | 0.6697 | 0.8183 |
| 0.4266 | 4.9814 | 802 | 0.6758 | 0.2857 | 0.6758 | 0.8221 |
| 0.4266 | 4.9938 | 804 | 0.6930 | 0.2388 | 0.6930 | 0.8325 |
| 0.4266 | 5.0062 | 806 | 0.7269 | 0.25 | 0.7269 | 0.8526 |
| 0.4266 | 5.0186 | 808 | 0.7400 | 0.3014 | 0.7400 | 0.8602 |
| 0.4266 | 5.0311 | 810 | 0.7569 | 0.2703 | 0.7569 | 0.8700 |
| 0.4266 | 5.0435 | 812 | 0.7811 | 0.25 | 0.7811 | 0.8838 |
| 0.4266 | 5.0559 | 814 | 0.8123 | 0.1818 | 0.8123 | 0.9013 |
| 0.4266 | 5.0683 | 816 | 0.8168 | 0.1818 | 0.8168 | 0.9037 |
| 0.4266 | 5.0807 | 818 | 0.8233 | 0.2192 | 0.8233 | 0.9074 |
| 0.4266 | 5.0932 | 820 | 0.7968 | 0.1231 | 0.7968 | 0.8926 |
| 0.4266 | 5.1056 | 822 | 0.7616 | 0.1231 | 0.7616 | 0.8727 |
| 0.4266 | 5.1180 | 824 | 0.7441 | 0.2941 | 0.7441 | 0.8626 |
| 0.4266 | 5.1304 | 826 | 0.7620 | 0.2597 | 0.7620 | 0.8729 |
| 0.4266 | 5.1429 | 828 | 0.7971 | 0.3514 | 0.7971 | 0.8928 |
| 0.4266 | 5.1553 | 830 | 0.7907 | 0.3514 | 0.7907 | 0.8892 |
| 0.4266 | 5.1677 | 832 | 0.7433 | 0.3514 | 0.7433 | 0.8621 |
| 0.4266 | 5.1801 | 834 | 0.7008 | 0.2817 | 0.7008 | 0.8372 |
| 0.4266 | 5.1925 | 836 | 0.6635 | 0.2623 | 0.6635 | 0.8146 |
| 0.4266 | 5.2050 | 838 | 0.6674 | 0.2258 | 0.6674 | 0.8170 |
| 0.4266 | 5.2174 | 840 | 0.6817 | 0.2258 | 0.6817 | 0.8257 |
| 0.4266 | 5.2298 | 842 | 0.7162 | 0.2258 | 0.7162 | 0.8463 |
| 0.4266 | 5.2422 | 844 | 0.7657 | 0.25 | 0.7657 | 0.8751 |
| 0.4266 | 5.2547 | 846 | 0.8198 | 0.2597 | 0.8198 | 0.9054 |
| 0.4266 | 5.2671 | 848 | 0.8479 | 0.2895 | 0.8479 | 0.9208 |
| 0.4266 | 5.2795 | 850 | 0.8260 | 0.2597 | 0.8260 | 0.9088 |
| 0.4266 | 5.2919 | 852 | 0.8134 | 0.3077 | 0.8134 | 0.9019 |
| 0.4266 | 5.3043 | 854 | 0.7972 | 0.3014 | 0.7972 | 0.8929 |
| 0.4266 | 5.3168 | 856 | 0.7941 | 0.2597 | 0.7941 | 0.8911 |
| 0.4266 | 5.3292 | 858 | 0.7843 | 0.2597 | 0.7843 | 0.8856 |
| 0.4266 | 5.3416 | 860 | 0.7702 | 0.2597 | 0.7702 | 0.8776 |
| 0.4266 | 5.3540 | 862 | 0.7460 | 0.3014 | 0.7460 | 0.8637 |
| 0.4266 | 5.3665 | 864 | 0.7257 | 0.2941 | 0.7257 | 0.8519 |
| 0.4266 | 5.3789 | 866 | 0.7299 | 0.25 | 0.7299 | 0.8543 |
| 0.4266 | 5.3913 | 868 | 0.7473 | 0.1356 | 0.7473 | 0.8645 |
| 0.4266 | 5.4037 | 870 | 0.7615 | 0.1290 | 0.7615 | 0.8727 |
| 0.4266 | 5.4161 | 872 | 0.7599 | 0.1053 | 0.7599 | 0.8717 |
| 0.4266 | 5.4286 | 874 | 0.7533 | 0.1053 | 0.7533 | 0.8679 |
| 0.4266 | 5.4410 | 876 | 0.7428 | 0.1053 | 0.7428 | 0.8618 |
| 0.4266 | 5.4534 | 878 | 0.7316 | 0.2500 | 0.7316 | 0.8553 |
| 0.4266 | 5.4658 | 880 | 0.7369 | 0.1818 | 0.7369 | 0.8584 |
| 0.4266 | 5.4783 | 882 | 0.7370 | 0.1111 | 0.7370 | 0.8585 |
| 0.4266 | 5.4907 | 884 | 0.7621 | 0.1111 | 0.7621 | 0.8730 |
| 0.4266 | 5.5031 | 886 | 0.7998 | 0.0357 | 0.7998 | 0.8943 |
| 0.4266 | 5.5155 | 888 | 0.8465 | -0.0615 | 0.8465 | 0.9201 |
| 0.4266 | 5.5280 | 890 | 0.8343 | 0.0625 | 0.8343 | 0.9134 |
| 0.4266 | 5.5404 | 892 | 0.8443 | -0.1250 | 0.8443 | 0.9188 |
| 0.4266 | 5.5528 | 894 | 0.8393 | 0.0 | 0.8393 | 0.9161 |
| 0.4266 | 5.5652 | 896 | 0.8218 | 0.0656 | 0.8218 | 0.9065 |
| 0.4266 | 5.5776 | 898 | 0.8101 | 0.0656 | 0.8101 | 0.9001 |
| 0.4266 | 5.5901 | 900 | 0.8349 | 0.0909 | 0.8349 | 0.9137 |
| 0.4266 | 5.6025 | 902 | 0.9017 | 0.0656 | 0.9017 | 0.9496 |
| 0.4266 | 5.6149 | 904 | 0.9242 | -0.0299 | 0.9242 | 0.9613 |
| 0.4266 | 5.6273 | 906 | 0.9092 | 0.1127 | 0.9092 | 0.9535 |
| 0.4266 | 5.6398 | 908 | 0.8689 | 0.2105 | 0.8689 | 0.9321 |
| 0.4266 | 5.6522 | 910 | 0.8423 | 0.2388 | 0.8423 | 0.9178 |
| 0.4266 | 5.6646 | 912 | 0.8294 | 0.2388 | 0.8294 | 0.9107 |
| 0.4266 | 5.6770 | 914 | 0.8189 | 0.2059 | 0.8189 | 0.9049 |
| 0.4266 | 5.6894 | 916 | 0.8061 | 0.25 | 0.8061 | 0.8978 |
| 0.4266 | 5.7019 | 918 | 0.8022 | 0.1356 | 0.8022 | 0.8957 |
| 0.4266 | 5.7143 | 920 | 0.8017 | 0.0357 | 0.8017 | 0.8954 |
| 0.4266 | 5.7267 | 922 | 0.7832 | -0.1053 | 0.7832 | 0.8850 |
| 0.4266 | 5.7391 | 924 | 0.7572 | 0.1818 | 0.7572 | 0.8702 |
| 0.4266 | 5.7516 | 926 | 0.7415 | 0.1356 | 0.7415 | 0.8611 |
| 0.4266 | 5.7640 | 928 | 0.7412 | 0.1356 | 0.7412 | 0.8609 |
| 0.4266 | 5.7764 | 930 | 0.7496 | 0.1356 | 0.7496 | 0.8658 |
| 0.4266 | 5.7888 | 932 | 0.7494 | 0.1356 | 0.7494 | 0.8657 |
| 0.4266 | 5.8012 | 934 | 0.7536 | 0.1356 | 0.7536 | 0.8681 |
| 0.4266 | 5.8137 | 936 | 0.7557 | 0.1356 | 0.7557 | 0.8693 |
| 0.4266 | 5.8261 | 938 | 0.7761 | 0.1356 | 0.7761 | 0.8810 |
| 0.4266 | 5.8385 | 940 | 0.8119 | 0.0357 | 0.8119 | 0.9011 |
| 0.4266 | 5.8509 | 942 | 0.8384 | -0.1250 | 0.8384 | 0.9157 |
| 0.4266 | 5.8634 | 944 | 0.8457 | -0.1250 | 0.8457 | 0.9196 |
| 0.4266 | 5.8758 | 946 | 0.8307 | 0.0294 | 0.8307 | 0.9114 |
| 0.4266 | 5.8882 | 948 | 0.8057 | 0.0 | 0.8057 | 0.8976 |
| 0.4266 | 5.9006 | 950 | 0.7627 | 0.25 | 0.7627 | 0.8733 |
| 0.4266 | 5.9130 | 952 | 0.7389 | 0.2857 | 0.7389 | 0.8596 |
| 0.4266 | 5.9255 | 954 | 0.7299 | 0.2857 | 0.7299 | 0.8543 |
| 0.4266 | 5.9379 | 956 | 0.7116 | 0.2857 | 0.7116 | 0.8435 |
| 0.4266 | 5.9503 | 958 | 0.7005 | 0.2941 | 0.7005 | 0.8370 |
| 0.4266 | 5.9627 | 960 | 0.6994 | 0.2941 | 0.6994 | 0.8363 |
| 0.4266 | 5.9752 | 962 | 0.7006 | 0.2941 | 0.7006 | 0.8370 |
| 0.4266 | 5.9876 | 964 | 0.7093 | 0.2941 | 0.7093 | 0.8422 |
| 0.4266 | 6.0 | 966 | 0.7301 | 0.25 | 0.7301 | 0.8544 |
| 0.4266 | 6.0124 | 968 | 0.7548 | 0.1231 | 0.7548 | 0.8688 |
| 0.4266 | 6.0248 | 970 | 0.7686 | 0.1231 | 0.7686 | 0.8767 |
| 0.4266 | 6.0373 | 972 | 0.7757 | 0.1429 | 0.7757 | 0.8807 |
| 0.4266 | 6.0497 | 974 | 0.7705 | 0.2609 | 0.7705 | 0.8778 |
| 0.4266 | 6.0621 | 976 | 0.7683 | 0.2609 | 0.7683 | 0.8765 |
| 0.4266 | 6.0745 | 978 | 0.7616 | 0.2609 | 0.7616 | 0.8727 |
| 0.4266 | 6.0870 | 980 | 0.7568 | 0.2609 | 0.7568 | 0.8700 |
| 0.4266 | 6.0994 | 982 | 0.7422 | 0.2609 | 0.7422 | 0.8615 |
| 0.4266 | 6.1118 | 984 | 0.7278 | 0.25 | 0.7278 | 0.8531 |
| 0.4266 | 6.1242 | 986 | 0.7170 | 0.25 | 0.7170 | 0.8468 |
| 0.4266 | 6.1366 | 988 | 0.7015 | 0.25 | 0.7015 | 0.8376 |
| 0.4266 | 6.1491 | 990 | 0.6965 | 0.25 | 0.6965 | 0.8346 |
| 0.4266 | 6.1615 | 992 | 0.7081 | 0.25 | 0.7081 | 0.8415 |
| 0.4266 | 6.1739 | 994 | 0.7158 | 0.1818 | 0.7158 | 0.8461 |
| 0.4266 | 6.1863 | 996 | 0.7216 | 0.2388 | 0.7216 | 0.8495 |
| 0.4266 | 6.1988 | 998 | 0.7342 | 0.1290 | 0.7342 | 0.8569 |
| 0.1071 | 6.2112 | 1000 | 0.7368 | 0.2388 | 0.7368 | 0.8584 |
| 0.1071 | 6.2236 | 1002 | 0.7318 | 0.3077 | 0.7318 | 0.8555 |
| 0.1071 | 6.2360 | 1004 | 0.7247 | 0.25 | 0.7247 | 0.8513 |
| 0.1071 | 6.2484 | 1006 | 0.7294 | 0.2941 | 0.7294 | 0.8540 |
| 0.1071 | 6.2609 | 1008 | 0.7352 | 0.2941 | 0.7352 | 0.8574 |
| 0.1071 | 6.2733 | 1010 | 0.7267 | 0.2941 | 0.7267 | 0.8525 |
| 0.1071 | 6.2857 | 1012 | 0.7178 | 0.2857 | 0.7178 | 0.8472 |
| 0.1071 | 6.2981 | 1014 | 0.7235 | 0.25 | 0.7235 | 0.8506 |
| 0.1071 | 6.3106 | 1016 | 0.7304 | 0.25 | 0.7304 | 0.8546 |
| 0.1071 | 6.3230 | 1018 | 0.7299 | 0.25 | 0.7299 | 0.8543 |
| 0.1071 | 6.3354 | 1020 | 0.7232 | 0.25 | 0.7232 | 0.8504 |
| 0.1071 | 6.3478 | 1022 | 0.7234 | 0.2857 | 0.7234 | 0.8506 |
| 0.1071 | 6.3602 | 1024 | 0.7378 | 0.2258 | 0.7378 | 0.8589 |
| 0.1071 | 6.3727 | 1026 | 0.7655 | 0.2727 | 0.7655 | 0.8749 |
| 0.1071 | 6.3851 | 1028 | 0.7918 | 0.2817 | 0.7918 | 0.8898 |
| 0.1071 | 6.3975 | 1030 | 0.7940 | 0.2388 | 0.7940 | 0.8911 |
| 0.1071 | 6.4099 | 1032 | 0.7859 | 0.2941 | 0.7859 | 0.8865 |
| 0.1071 | 6.4224 | 1034 | 0.7828 | 0.2941 | 0.7828 | 0.8847 |
| 0.1071 | 6.4348 | 1036 | 0.7852 | 0.2941 | 0.7852 | 0.8861 |
| 0.1071 | 6.4472 | 1038 | 0.7823 | 0.2857 | 0.7823 | 0.8845 |
| 0.1071 | 6.4596 | 1040 | 0.7777 | 0.25 | 0.7777 | 0.8819 |
| 0.1071 | 6.4720 | 1042 | 0.7706 | 0.25 | 0.7706 | 0.8778 |
| 0.1071 | 6.4845 | 1044 | 0.7618 | 0.25 | 0.7618 | 0.8728 |
| 0.1071 | 6.4969 | 1046 | 0.7462 | 0.25 | 0.7462 | 0.8638 |
| 0.1071 | 6.5093 | 1048 | 0.7301 | 0.25 | 0.7301 | 0.8545 |
| 0.1071 | 6.5217 | 1050 | 0.7170 | 0.25 | 0.7170 | 0.8468 |
| 0.1071 | 6.5342 | 1052 | 0.7145 | 0.2857 | 0.7145 | 0.8453 |
| 0.1071 | 6.5466 | 1054 | 0.7269 | 0.2857 | 0.7269 | 0.8526 |
| 0.1071 | 6.5590 | 1056 | 0.7341 | 0.2857 | 0.7341 | 0.8568 |
| 0.1071 | 6.5714 | 1058 | 0.7494 | 0.2727 | 0.7494 | 0.8657 |
| 0.1071 | 6.5839 | 1060 | 0.7540 | 0.2727 | 0.7540 | 0.8683 |
| 0.1071 | 6.5963 | 1062 | 0.7460 | 0.2388 | 0.7460 | 0.8637 |
| 0.1071 | 6.6087 | 1064 | 0.7431 | 0.2857 | 0.7431 | 0.8620 |
| 0.1071 | 6.6211 | 1066 | 0.7503 | 0.2857 | 0.7503 | 0.8662 |
| 0.1071 | 6.6335 | 1068 | 0.7616 | 0.25 | 0.7616 | 0.8727 |
| 0.1071 | 6.6460 | 1070 | 0.7699 | 0.1231 | 0.7699 | 0.8774 |
| 0.1071 | 6.6584 | 1072 | 0.7613 | 0.25 | 0.7613 | 0.8726 |
| 0.1071 | 6.6708 | 1074 | 0.7477 | 0.25 | 0.7477 | 0.8647 |
| 0.1071 | 6.6832 | 1076 | 0.7447 | 0.2941 | 0.7447 | 0.8630 |
| 0.1071 | 6.6957 | 1078 | 0.7663 | 0.2388 | 0.7663 | 0.8754 |
| 0.1071 | 6.7081 | 1080 | 0.7886 | 0.2817 | 0.7886 | 0.8880 |
| 0.1071 | 6.7205 | 1082 | 0.8094 | 0.2703 | 0.8094 | 0.8997 |
| 0.1071 | 6.7329 | 1084 | 0.7958 | 0.2192 | 0.7958 | 0.8921 |
| 0.1071 | 6.7453 | 1086 | 0.7612 | 0.2727 | 0.7612 | 0.8724 |
| 0.1071 | 6.7578 | 1088 | 0.7302 | 0.2623 | 0.7302 | 0.8545 |
| 0.1071 | 6.7702 | 1090 | 0.7108 | 0.2258 | 0.7108 | 0.8431 |
| 0.1071 | 6.7826 | 1092 | 0.6987 | 0.2857 | 0.6987 | 0.8359 |
| 0.1071 | 6.7950 | 1094 | 0.6951 | 0.2857 | 0.6951 | 0.8337 |
| 0.1071 | 6.8075 | 1096 | 0.6953 | 0.1724 | 0.6953 | 0.8338 |
| 0.1071 | 6.8199 | 1098 | 0.6946 | 0.1724 | 0.6946 | 0.8334 |
| 0.1071 | 6.8323 | 1100 | 0.6970 | 0.2258 | 0.6970 | 0.8348 |
| 0.1071 | 6.8447 | 1102 | 0.7036 | 0.2258 | 0.7036 | 0.8388 |
| 0.1071 | 6.8571 | 1104 | 0.7064 | 0.2258 | 0.7064 | 0.8405 |
| 0.1071 | 6.8696 | 1106 | 0.7030 | 0.2258 | 0.7030 | 0.8384 |
| 0.1071 | 6.8820 | 1108 | 0.6982 | 0.2258 | 0.6982 | 0.8356 |
| 0.1071 | 6.8944 | 1110 | 0.6998 | 0.2258 | 0.6998 | 0.8366 |
| 0.1071 | 6.9068 | 1112 | 0.7035 | 0.2857 | 0.7035 | 0.8388 |
| 0.1071 | 6.9193 | 1114 | 0.7076 | 0.2857 | 0.7076 | 0.8412 |
| 0.1071 | 6.9317 | 1116 | 0.7083 | 0.2857 | 0.7083 | 0.8416 |
| 0.1071 | 6.9441 | 1118 | 0.7110 | 0.2857 | 0.7110 | 0.8432 |
| 0.1071 | 6.9565 | 1120 | 0.7124 | 0.2857 | 0.7124 | 0.8440 |
| 0.1071 | 6.9689 | 1122 | 0.7124 | 0.2857 | 0.7124 | 0.8441 |
| 0.1071 | 6.9814 | 1124 | 0.7153 | 0.2857 | 0.7153 | 0.8458 |
| 0.1071 | 6.9938 | 1126 | 0.7227 | 0.2857 | 0.7227 | 0.8501 |
| 0.1071 | 7.0062 | 1128 | 0.7203 | 0.2857 | 0.7203 | 0.8487 |
| 0.1071 | 7.0186 | 1130 | 0.7237 | 0.2857 | 0.7237 | 0.8507 |
| 0.1071 | 7.0311 | 1132 | 0.7246 | 0.2857 | 0.7246 | 0.8513 |
| 0.1071 | 7.0435 | 1134 | 0.7249 | 0.2857 | 0.7249 | 0.8514 |
| 0.1071 | 7.0559 | 1136 | 0.7290 | 0.2857 | 0.7290 | 0.8538 |
| 0.1071 | 7.0683 | 1138 | 0.7325 | 0.2857 | 0.7325 | 0.8558 |
| 0.1071 | 7.0807 | 1140 | 0.7347 | 0.2857 | 0.7347 | 0.8571 |
| 0.1071 | 7.0932 | 1142 | 0.7294 | 0.2857 | 0.7294 | 0.8540 |
| 0.1071 | 7.1056 | 1144 | 0.7256 | 0.2857 | 0.7256 | 0.8518 |
| 0.1071 | 7.1180 | 1146 | 0.7273 | 0.2857 | 0.7273 | 0.8528 |
| 0.1071 | 7.1304 | 1148 | 0.7303 | 0.2857 | 0.7303 | 0.8546 |
| 0.1071 | 7.1429 | 1150 | 0.7345 | 0.2857 | 0.7345 | 0.8570 |
| 0.1071 | 7.1553 | 1152 | 0.7407 | 0.2857 | 0.7407 | 0.8606 |
| 0.1071 | 7.1677 | 1154 | 0.7353 | 0.2857 | 0.7353 | 0.8575 |
| 0.1071 | 7.1801 | 1156 | 0.7254 | 0.2857 | 0.7254 | 0.8517 |
| 0.1071 | 7.1925 | 1158 | 0.7150 | 0.2857 | 0.7150 | 0.8456 |
| 0.1071 | 7.2050 | 1160 | 0.7069 | 0.2857 | 0.7069 | 0.8408 |
| 0.1071 | 7.2174 | 1162 | 0.6994 | 0.2857 | 0.6994 | 0.8363 |
| 0.1071 | 7.2298 | 1164 | 0.6966 | 0.2857 | 0.6966 | 0.8346 |
| 0.1071 | 7.2422 | 1166 | 0.6935 | 0.2857 | 0.6935 | 0.8328 |
| 0.1071 | 7.2547 | 1168 | 0.6879 | 0.2857 | 0.6879 | 0.8294 |
| 0.1071 | 7.2671 | 1170 | 0.6853 | 0.2857 | 0.6853 | 0.8278 |
| 0.1071 | 7.2795 | 1172 | 0.6900 | 0.2857 | 0.6900 | 0.8306 |
| 0.1071 | 7.2919 | 1174 | 0.6990 | 0.2857 | 0.6990 | 0.8361 |
| 0.1071 | 7.3043 | 1176 | 0.7135 | 0.2258 | 0.7135 | 0.8447 |
| 0.1071 | 7.3168 | 1178 | 0.7242 | 0.2258 | 0.7242 | 0.8510 |
| 0.1071 | 7.3292 | 1180 | 0.7322 | 0.2388 | 0.7322 | 0.8557 |
| 0.1071 | 7.3416 | 1182 | 0.7388 | 0.2388 | 0.7388 | 0.8595 |
| 0.1071 | 7.3540 | 1184 | 0.7412 | 0.2388 | 0.7412 | 0.8609 |
| 0.1071 | 7.3665 | 1186 | 0.7428 | 0.2941 | 0.7428 | 0.8618 |
| 0.1071 | 7.3789 | 1188 | 0.7428 | 0.2941 | 0.7428 | 0.8619 |
| 0.1071 | 7.3913 | 1190 | 0.7364 | 0.2941 | 0.7364 | 0.8581 |
| 0.1071 | 7.4037 | 1192 | 0.7346 | 0.2388 | 0.7346 | 0.8571 |
| 0.1071 | 7.4161 | 1194 | 0.7359 | 0.2388 | 0.7359 | 0.8578 |
| 0.1071 | 7.4286 | 1196 | 0.7349 | 0.2941 | 0.7349 | 0.8572 |
| 0.1071 | 7.4410 | 1198 | 0.7345 | 0.2941 | 0.7345 | 0.8570 |
| 0.1071 | 7.4534 | 1200 | 0.7325 | 0.2941 | 0.7325 | 0.8558 |
| 0.1071 | 7.4658 | 1202 | 0.7244 | 0.2388 | 0.7244 | 0.8511 |
| 0.1071 | 7.4783 | 1204 | 0.7219 | 0.2388 | 0.7219 | 0.8497 |
| 0.1071 | 7.4907 | 1206 | 0.7164 | 0.2727 | 0.7164 | 0.8464 |
| 0.1071 | 7.5031 | 1208 | 0.7198 | 0.2727 | 0.7198 | 0.8484 |
| 0.1071 | 7.5155 | 1210 | 0.7115 | 0.2727 | 0.7115 | 0.8435 |
| 0.1071 | 7.5280 | 1212 | 0.7126 | 0.2727 | 0.7126 | 0.8442 |
| 0.1071 | 7.5404 | 1214 | 0.7167 | 0.2727 | 0.7167 | 0.8466 |
| 0.1071 | 7.5528 | 1216 | 0.7152 | 0.2388 | 0.7152 | 0.8457 |
| 0.1071 | 7.5652 | 1218 | 0.7134 | 0.2388 | 0.7134 | 0.8447 |
| 0.1071 | 7.5776 | 1220 | 0.7114 | 0.2857 | 0.7114 | 0.8434 |
| 0.1071 | 7.5901 | 1222 | 0.7096 | 0.2857 | 0.7096 | 0.8424 |
| 0.1071 | 7.6025 | 1224 | 0.7084 | 0.2857 | 0.7084 | 0.8416 |
| 0.1071 | 7.6149 | 1226 | 0.7097 | 0.2857 | 0.7097 | 0.8424 |
| 0.1071 | 7.6273 | 1228 | 0.7146 | 0.2857 | 0.7146 | 0.8454 |
| 0.1071 | 7.6398 | 1230 | 0.7247 | 0.2857 | 0.7247 | 0.8513 |
| 0.1071 | 7.6522 | 1232 | 0.7356 | 0.1724 | 0.7356 | 0.8577 |
| 0.1071 | 7.6646 | 1234 | 0.7412 | 0.2857 | 0.7412 | 0.8609 |
| 0.1071 | 7.6770 | 1236 | 0.7447 | 0.2857 | 0.7447 | 0.8630 |
| 0.1071 | 7.6894 | 1238 | 0.7419 | 0.2857 | 0.7419 | 0.8613 |
| 0.1071 | 7.7019 | 1240 | 0.7371 | 0.2941 | 0.7371 | 0.8585 |
| 0.1071 | 7.7143 | 1242 | 0.7369 | 0.2388 | 0.7369 | 0.8584 |
| 0.1071 | 7.7267 | 1244 | 0.7366 | 0.2388 | 0.7366 | 0.8582 |
| 0.1071 | 7.7391 | 1246 | 0.7383 | 0.2388 | 0.7383 | 0.8592 |
| 0.1071 | 7.7516 | 1248 | 0.7300 | 0.2817 | 0.7300 | 0.8544 |
| 0.1071 | 7.7640 | 1250 | 0.7128 | 0.2727 | 0.7128 | 0.8443 |
| 0.1071 | 7.7764 | 1252 | 0.7000 | 0.2388 | 0.7000 | 0.8367 |
| 0.1071 | 7.7888 | 1254 | 0.6915 | 0.2258 | 0.6915 | 0.8316 |
| 0.1071 | 7.8012 | 1256 | 0.6843 | 0.2258 | 0.6843 | 0.8272 |
| 0.1071 | 7.8137 | 1258 | 0.6761 | 0.2258 | 0.6761 | 0.8223 |
| 0.1071 | 7.8261 | 1260 | 0.6751 | 0.2258 | 0.6751 | 0.8216 |
| 0.1071 | 7.8385 | 1262 | 0.6751 | 0.2258 | 0.6751 | 0.8217 |
| 0.1071 | 7.8509 | 1264 | 0.6768 | 0.2258 | 0.6768 | 0.8227 |
| 0.1071 | 7.8634 | 1266 | 0.6749 | 0.1053 | 0.6749 | 0.8215 |
| 0.1071 | 7.8758 | 1268 | 0.6742 | 0.1053 | 0.6742 | 0.8211 |
| 0.1071 | 7.8882 | 1270 | 0.6764 | 0.1724 | 0.6764 | 0.8225 |
| 0.1071 | 7.9006 | 1272 | 0.6820 | 0.1724 | 0.6820 | 0.8258 |
| 0.1071 | 7.9130 | 1274 | 0.6891 | 0.1356 | 0.6891 | 0.8301 |
| 0.1071 | 7.9255 | 1276 | 0.6932 | 0.1356 | 0.6932 | 0.8326 |
| 0.1071 | 7.9379 | 1278 | 0.6965 | 0.25 | 0.6965 | 0.8346 |
| 0.1071 | 7.9503 | 1280 | 0.6975 | 0.25 | 0.6975 | 0.8352 |
| 0.1071 | 7.9627 | 1282 | 0.6986 | 0.25 | 0.6986 | 0.8358 |
| 0.1071 | 7.9752 | 1284 | 0.6972 | 0.2857 | 0.6972 | 0.8350 |
| 0.1071 | 7.9876 | 1286 | 0.6959 | 0.2857 | 0.6959 | 0.8342 |
| 0.1071 | 8.0 | 1288 | 0.6966 | 0.2941 | 0.6966 | 0.8347 |
| 0.1071 | 8.0124 | 1290 | 0.7008 | 0.2941 | 0.7008 | 0.8372 |
| 0.1071 | 8.0248 | 1292 | 0.7062 | 0.2941 | 0.7062 | 0.8404 |
| 0.1071 | 8.0373 | 1294 | 0.7066 | 0.2941 | 0.7066 | 0.8406 |
| 0.1071 | 8.0497 | 1296 | 0.7056 | 0.2941 | 0.7056 | 0.8400 |
| 0.1071 | 8.0621 | 1298 | 0.7040 | 0.2857 | 0.7040 | 0.8390 |
| 0.1071 | 8.0745 | 1300 | 0.7056 | 0.2857 | 0.7056 | 0.8400 |
| 0.1071 | 8.0870 | 1302 | 0.7076 | 0.25 | 0.7076 | 0.8412 |
| 0.1071 | 8.0994 | 1304 | 0.7050 | 0.25 | 0.7050 | 0.8396 |
| 0.1071 | 8.1118 | 1306 | 0.6981 | 0.2857 | 0.6981 | 0.8355 |
| 0.1071 | 8.1242 | 1308 | 0.6892 | 0.2857 | 0.6892 | 0.8302 |
| 0.1071 | 8.1366 | 1310 | 0.6819 | 0.2857 | 0.6819 | 0.8258 |
| 0.1071 | 8.1491 | 1312 | 0.6803 | 0.2857 | 0.6803 | 0.8248 |
| 0.1071 | 8.1615 | 1314 | 0.6792 | 0.2857 | 0.6792 | 0.8241 |
| 0.1071 | 8.1739 | 1316 | 0.6796 | 0.2857 | 0.6796 | 0.8244 |
| 0.1071 | 8.1863 | 1318 | 0.6828 | 0.2857 | 0.6828 | 0.8263 |
| 0.1071 | 8.1988 | 1320 | 0.6897 | 0.2857 | 0.6897 | 0.8305 |
| 0.1071 | 8.2112 | 1322 | 0.6996 | 0.2941 | 0.6996 | 0.8364 |
| 0.1071 | 8.2236 | 1324 | 0.7112 | 0.2941 | 0.7112 | 0.8433 |
| 0.1071 | 8.2360 | 1326 | 0.7205 | 0.2941 | 0.7205 | 0.8488 |
| 0.1071 | 8.2484 | 1328 | 0.7290 | 0.2941 | 0.7290 | 0.8538 |
| 0.1071 | 8.2609 | 1330 | 0.7372 | 0.2941 | 0.7372 | 0.8586 |
| 0.1071 | 8.2733 | 1332 | 0.7417 | 0.2941 | 0.7417 | 0.8612 |
| 0.1071 | 8.2857 | 1334 | 0.7433 | 0.2941 | 0.7433 | 0.8621 |
| 0.1071 | 8.2981 | 1336 | 0.7457 | 0.2941 | 0.7457 | 0.8635 |
| 0.1071 | 8.3106 | 1338 | 0.7460 | 0.2941 | 0.7460 | 0.8637 |
| 0.1071 | 8.3230 | 1340 | 0.7450 | 0.2941 | 0.7450 | 0.8632 |
| 0.1071 | 8.3354 | 1342 | 0.7457 | 0.2941 | 0.7457 | 0.8635 |
| 0.1071 | 8.3478 | 1344 | 0.7379 | 0.2941 | 0.7379 | 0.8590 |
| 0.1071 | 8.3602 | 1346 | 0.7268 | 0.2941 | 0.7268 | 0.8525 |
| 0.1071 | 8.3727 | 1348 | 0.7126 | 0.2941 | 0.7126 | 0.8441 |
| 0.1071 | 8.3851 | 1350 | 0.7045 | 0.2857 | 0.7045 | 0.8393 |
| 0.1071 | 8.3975 | 1352 | 0.6989 | 0.2857 | 0.6989 | 0.8360 |
| 0.1071 | 8.4099 | 1354 | 0.6968 | 0.2857 | 0.6968 | 0.8348 |
| 0.1071 | 8.4224 | 1356 | 0.6980 | 0.2857 | 0.6980 | 0.8355 |
| 0.1071 | 8.4348 | 1358 | 0.7021 | 0.2857 | 0.7021 | 0.8379 |
| 0.1071 | 8.4472 | 1360 | 0.7052 | 0.2857 | 0.7052 | 0.8397 |
| 0.1071 | 8.4596 | 1362 | 0.7096 | 0.2857 | 0.7096 | 0.8424 |
| 0.1071 | 8.4720 | 1364 | 0.7163 | 0.2857 | 0.7163 | 0.8463 |
| 0.1071 | 8.4845 | 1366 | 0.7172 | 0.2857 | 0.7172 | 0.8469 |
| 0.1071 | 8.4969 | 1368 | 0.7170 | 0.2857 | 0.7170 | 0.8467 |
| 0.1071 | 8.5093 | 1370 | 0.7170 | 0.2857 | 0.7170 | 0.8467 |
| 0.1071 | 8.5217 | 1372 | 0.7148 | 0.2857 | 0.7148 | 0.8454 |
| 0.1071 | 8.5342 | 1374 | 0.7113 | 0.2857 | 0.7113 | 0.8434 |
| 0.1071 | 8.5466 | 1376 | 0.7059 | 0.2857 | 0.7059 | 0.8402 |
| 0.1071 | 8.5590 | 1378 | 0.7015 | 0.2857 | 0.7015 | 0.8375 |
| 0.1071 | 8.5714 | 1380 | 0.6976 | 0.2857 | 0.6976 | 0.8352 |
| 0.1071 | 8.5839 | 1382 | 0.6962 | 0.2857 | 0.6962 | 0.8344 |
| 0.1071 | 8.5963 | 1384 | 0.6953 | 0.2857 | 0.6953 | 0.8339 |
| 0.1071 | 8.6087 | 1386 | 0.6966 | 0.2857 | 0.6966 | 0.8346 |
| 0.1071 | 8.6211 | 1388 | 0.7004 | 0.2857 | 0.7004 | 0.8369 |
| 0.1071 | 8.6335 | 1390 | 0.7013 | 0.2857 | 0.7013 | 0.8374 |
| 0.1071 | 8.6460 | 1392 | 0.7007 | 0.2857 | 0.7007 | 0.8371 |
| 0.1071 | 8.6584 | 1394 | 0.7012 | 0.2857 | 0.7012 | 0.8374 |
| 0.1071 | 8.6708 | 1396 | 0.7031 | 0.2857 | 0.7031 | 0.8385 |
| 0.1071 | 8.6832 | 1398 | 0.7030 | 0.2857 | 0.7030 | 0.8384 |
| 0.1071 | 8.6957 | 1400 | 0.7036 | 0.2941 | 0.7036 | 0.8388 |
| 0.1071 | 8.7081 | 1402 | 0.7048 | 0.2941 | 0.7048 | 0.8395 |
| 0.1071 | 8.7205 | 1404 | 0.7046 | 0.2941 | 0.7046 | 0.8394 |
| 0.1071 | 8.7329 | 1406 | 0.7053 | 0.2941 | 0.7053 | 0.8398 |
| 0.1071 | 8.7453 | 1408 | 0.7089 | 0.2941 | 0.7089 | 0.8420 |
| 0.1071 | 8.7578 | 1410 | 0.7119 | 0.2941 | 0.7119 | 0.8438 |
| 0.1071 | 8.7702 | 1412 | 0.7129 | 0.2857 | 0.7129 | 0.8443 |
| 0.1071 | 8.7826 | 1414 | 0.7141 | 0.2857 | 0.7141 | 0.8450 |
| 0.1071 | 8.7950 | 1416 | 0.7149 | 0.2857 | 0.7149 | 0.8455 |
| 0.1071 | 8.8075 | 1418 | 0.7184 | 0.2857 | 0.7184 | 0.8476 |
| 0.1071 | 8.8199 | 1420 | 0.7227 | 0.2857 | 0.7227 | 0.8501 |
| 0.1071 | 8.8323 | 1422 | 0.7273 | 0.2857 | 0.7273 | 0.8528 |
| 0.1071 | 8.8447 | 1424 | 0.7331 | 0.2857 | 0.7331 | 0.8562 |
| 0.1071 | 8.8571 | 1426 | 0.7355 | 0.2857 | 0.7355 | 0.8576 |
| 0.1071 | 8.8696 | 1428 | 0.7389 | 0.2857 | 0.7389 | 0.8596 |
| 0.1071 | 8.8820 | 1430 | 0.7424 | 0.2941 | 0.7424 | 0.8616 |
| 0.1071 | 8.8944 | 1432 | 0.7430 | 0.2941 | 0.7430 | 0.8620 |
| 0.1071 | 8.9068 | 1434 | 0.7438 | 0.2941 | 0.7438 | 0.8625 |
| 0.1071 | 8.9193 | 1436 | 0.7439 | 0.2941 | 0.7439 | 0.8625 |
| 0.1071 | 8.9317 | 1438 | 0.7452 | 0.2941 | 0.7452 | 0.8633 |
| 0.1071 | 8.9441 | 1440 | 0.7480 | 0.2941 | 0.7480 | 0.8649 |
| 0.1071 | 8.9565 | 1442 | 0.7492 | 0.2941 | 0.7492 | 0.8656 |
| 0.1071 | 8.9689 | 1444 | 0.7493 | 0.2941 | 0.7493 | 0.8656 |
| 0.1071 | 8.9814 | 1446 | 0.7460 | 0.2941 | 0.7460 | 0.8637 |
| 0.1071 | 8.9938 | 1448 | 0.7416 | 0.2941 | 0.7416 | 0.8612 |
| 0.1071 | 9.0062 | 1450 | 0.7376 | 0.2941 | 0.7376 | 0.8588 |
| 0.1071 | 9.0186 | 1452 | 0.7352 | 0.2941 | 0.7352 | 0.8575 |
| 0.1071 | 9.0311 | 1454 | 0.7347 | 0.2941 | 0.7347 | 0.8571 |
| 0.1071 | 9.0435 | 1456 | 0.7322 | 0.2941 | 0.7322 | 0.8557 |
| 0.1071 | 9.0559 | 1458 | 0.7301 | 0.2857 | 0.7301 | 0.8545 |
| 0.1071 | 9.0683 | 1460 | 0.7285 | 0.2857 | 0.7285 | 0.8535 |
| 0.1071 | 9.0807 | 1462 | 0.7258 | 0.2857 | 0.7258 | 0.8520 |
| 0.1071 | 9.0932 | 1464 | 0.7242 | 0.2941 | 0.7242 | 0.8510 |
| 0.1071 | 9.1056 | 1466 | 0.7214 | 0.2941 | 0.7214 | 0.8493 |
| 0.1071 | 9.1180 | 1468 | 0.7194 | 0.2941 | 0.7194 | 0.8482 |
| 0.1071 | 9.1304 | 1470 | 0.7185 | 0.2941 | 0.7185 | 0.8477 |
| 0.1071 | 9.1429 | 1472 | 0.7159 | 0.2941 | 0.7159 | 0.8461 |
| 0.1071 | 9.1553 | 1474 | 0.7147 | 0.2941 | 0.7147 | 0.8454 |
| 0.1071 | 9.1677 | 1476 | 0.7133 | 0.2941 | 0.7133 | 0.8446 |
| 0.1071 | 9.1801 | 1478 | 0.7120 | 0.2941 | 0.7120 | 0.8438 |
| 0.1071 | 9.1925 | 1480 | 0.7130 | 0.2941 | 0.7130 | 0.8444 |
| 0.1071 | 9.2050 | 1482 | 0.7133 | 0.2857 | 0.7133 | 0.8446 |
| 0.1071 | 9.2174 | 1484 | 0.7142 | 0.2857 | 0.7142 | 0.8451 |
| 0.1071 | 9.2298 | 1486 | 0.7162 | 0.2857 | 0.7162 | 0.8463 |
| 0.1071 | 9.2422 | 1488 | 0.7173 | 0.2857 | 0.7173 | 0.8470 |
| 0.1071 | 9.2547 | 1490 | 0.7170 | 0.2857 | 0.7170 | 0.8468 |
| 0.1071 | 9.2671 | 1492 | 0.7167 | 0.2857 | 0.7167 | 0.8466 |
| 0.1071 | 9.2795 | 1494 | 0.7161 | 0.2857 | 0.7161 | 0.8462 |
| 0.1071 | 9.2919 | 1496 | 0.7161 | 0.2857 | 0.7161 | 0.8462 |
| 0.1071 | 9.3043 | 1498 | 0.7165 | 0.2857 | 0.7165 | 0.8465 |
| 0.062 | 9.3168 | 1500 | 0.7167 | 0.2857 | 0.7167 | 0.8466 |
| 0.062 | 9.3292 | 1502 | 0.7166 | 0.2857 | 0.7166 | 0.8465 |
| 0.062 | 9.3416 | 1504 | 0.7151 | 0.2857 | 0.7151 | 0.8456 |
| 0.062 | 9.3540 | 1506 | 0.7145 | 0.2857 | 0.7145 | 0.8453 |
| 0.062 | 9.3665 | 1508 | 0.7153 | 0.2857 | 0.7153 | 0.8457 |
| 0.062 | 9.3789 | 1510 | 0.7172 | 0.2941 | 0.7172 | 0.8468 |
| 0.062 | 9.3913 | 1512 | 0.7182 | 0.2941 | 0.7182 | 0.8475 |
| 0.062 | 9.4037 | 1514 | 0.7204 | 0.2941 | 0.7204 | 0.8487 |
| 0.062 | 9.4161 | 1516 | 0.7219 | 0.2941 | 0.7219 | 0.8496 |
| 0.062 | 9.4286 | 1518 | 0.7229 | 0.2941 | 0.7229 | 0.8502 |
| 0.062 | 9.4410 | 1520 | 0.7241 | 0.2941 | 0.7241 | 0.8509 |
| 0.062 | 9.4534 | 1522 | 0.7238 | 0.2941 | 0.7238 | 0.8507 |
| 0.062 | 9.4658 | 1524 | 0.7231 | 0.2941 | 0.7231 | 0.8503 |
| 0.062 | 9.4783 | 1526 | 0.7227 | 0.2941 | 0.7227 | 0.8501 |
| 0.062 | 9.4907 | 1528 | 0.7219 | 0.2941 | 0.7219 | 0.8496 |
| 0.062 | 9.5031 | 1530 | 0.7215 | 0.2941 | 0.7215 | 0.8494 |
| 0.062 | 9.5155 | 1532 | 0.7212 | 0.2941 | 0.7212 | 0.8493 |
| 0.062 | 9.5280 | 1534 | 0.7218 | 0.2941 | 0.7218 | 0.8496 |
| 0.062 | 9.5404 | 1536 | 0.7222 | 0.2941 | 0.7222 | 0.8498 |
| 0.062 | 9.5528 | 1538 | 0.7215 | 0.2941 | 0.7215 | 0.8494 |
| 0.062 | 9.5652 | 1540 | 0.7201 | 0.2941 | 0.7201 | 0.8486 |
| 0.062 | 9.5776 | 1542 | 0.7189 | 0.2941 | 0.7189 | 0.8479 |
| 0.062 | 9.5901 | 1544 | 0.7177 | 0.2941 | 0.7177 | 0.8472 |
| 0.062 | 9.6025 | 1546 | 0.7169 | 0.2941 | 0.7169 | 0.8467 |
| 0.062 | 9.6149 | 1548 | 0.7151 | 0.2941 | 0.7151 | 0.8456 |
| 0.062 | 9.6273 | 1550 | 0.7139 | 0.2941 | 0.7139 | 0.8449 |
| 0.062 | 9.6398 | 1552 | 0.7131 | 0.2941 | 0.7131 | 0.8444 |
| 0.062 | 9.6522 | 1554 | 0.7119 | 0.2941 | 0.7119 | 0.8437 |
| 0.062 | 9.6646 | 1556 | 0.7103 | 0.2857 | 0.7103 | 0.8428 |
| 0.062 | 9.6770 | 1558 | 0.7092 | 0.2857 | 0.7092 | 0.8421 |
| 0.062 | 9.6894 | 1560 | 0.7083 | 0.2857 | 0.7083 | 0.8416 |
| 0.062 | 9.7019 | 1562 | 0.7074 | 0.2857 | 0.7074 | 0.8411 |
| 0.062 | 9.7143 | 1564 | 0.7069 | 0.2857 | 0.7069 | 0.8408 |
| 0.062 | 9.7267 | 1566 | 0.7064 | 0.2857 | 0.7064 | 0.8405 |
| 0.062 | 9.7391 | 1568 | 0.7063 | 0.2857 | 0.7063 | 0.8404 |
| 0.062 | 9.7516 | 1570 | 0.7063 | 0.2857 | 0.7063 | 0.8404 |
| 0.062 | 9.7640 | 1572 | 0.7064 | 0.2857 | 0.7064 | 0.8405 |
| 0.062 | 9.7764 | 1574 | 0.7068 | 0.2857 | 0.7068 | 0.8407 |
| 0.062 | 9.7888 | 1576 | 0.7069 | 0.2857 | 0.7069 | 0.8408 |
| 0.062 | 9.8012 | 1578 | 0.7069 | 0.2857 | 0.7069 | 0.8408 |
| 0.062 | 9.8137 | 1580 | 0.7069 | 0.2857 | 0.7069 | 0.8408 |
| 0.062 | 9.8261 | 1582 | 0.7070 | 0.2857 | 0.7070 | 0.8408 |
| 0.062 | 9.8385 | 1584 | 0.7066 | 0.2857 | 0.7066 | 0.8406 |
| 0.062 | 9.8509 | 1586 | 0.7062 | 0.2857 | 0.7062 | 0.8404 |
| 0.062 | 9.8634 | 1588 | 0.7061 | 0.2857 | 0.7061 | 0.8403 |
| 0.062 | 9.8758 | 1590 | 0.7061 | 0.2857 | 0.7061 | 0.8403 |
| 0.062 | 9.8882 | 1592 | 0.7062 | 0.2857 | 0.7062 | 0.8404 |
| 0.062 | 9.9006 | 1594 | 0.7063 | 0.2857 | 0.7063 | 0.8404 |
| 0.062 | 9.9130 | 1596 | 0.7066 | 0.2857 | 0.7066 | 0.8406 |
| 0.062 | 9.9255 | 1598 | 0.7069 | 0.2857 | 0.7069 | 0.8408 |
| 0.062 | 9.9379 | 1600 | 0.7072 | 0.2857 | 0.7072 | 0.8410 |
| 0.062 | 9.9503 | 1602 | 0.7074 | 0.2857 | 0.7074 | 0.8411 |
| 0.062 | 9.9627 | 1604 | 0.7075 | 0.2857 | 0.7075 | 0.8411 |
| 0.062 | 9.9752 | 1606 | 0.7076 | 0.2857 | 0.7076 | 0.8412 |
| 0.062 | 9.9876 | 1608 | 0.7076 | 0.2857 | 0.7076 | 0.8412 |
| 0.062 | 10.0 | 1610 | 0.7076 | 0.2857 | 0.7076 | 0.8412 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Eugeoter/noob-sdxl-controlnet-scribble_pidinet | Eugeoter | 2024-11-25T06:16:51Z | 8,078 | 0 | diffusers | [
"diffusers",
"safetensors",
"stable-diffusion",
"stable-diffusion-xl",
"controlnet",
"text-to-image",
"en",
"base_model:Laxhar/noobai-xl-EarlyAccess",
"base_model:adapter:Laxhar/noobai-xl-EarlyAccess",
"license:other",
"region:us"
] | text-to-image | 2024-11-25T05:38:08Z | ---
license: other
license_name: fair-ai-public-license-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
library_name: diffusers
language:
- en
base_model:
- Laxhar/sdxl_noob
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-xl
- controlnet
- diffusers
--- |
TheHierophant/Fimbulvetr-11B-Attention-V0.1-test | TheHierophant | 2024-11-25T06:09:22Z | 39 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"base_model:Sao10K/Fimbulvetr-11B-v2",
"base_model:finetune:Sao10K/Fimbulvetr-11B-v2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-23T03:40:34Z | ---
base_model:
- Sao10K/Fimbulvetr-11B-v2
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
- model: Sao10K/Fimbulvetr-11B-v2
layer_range: [0, 16]
parameters:
attention:
- filter: grouped_qk_proj
clusters: 8
value: 1.2
attention_type: hierarchical
local_attention: 0.5
global_attention: 1.5
dynamic_weighting: true
rope_scaling: 10000
significance: 0.85
mlp:
filter: intermediate_proj
value: 14336
compression: true
dynamic_capacity: true
regularization:
filter: attention_dropout
value: 0.0
- sources:
- model: Sao10K/Fimbulvetr-11B-v2
layer_range: [16, 32]
parameters:
attention:
- filter: grouped_qk_proj
clusters: 8
value: 1.3
attention_type: hierarchical
local_attention: 0.6
global_attention: 1.4
dynamic_weighting: true
rope_scaling: 10000
significance: 0.80
mlp:
filter: intermediate_proj
value: 14336
compression: true
dynamic_capacity: true
regularization:
filter: attention_dropout
value: 0.0
- sources:
- model: Sao10K/Fimbulvetr-11B-v2
layer_range: [32, 48]
parameters:
attention:
- filter: grouped_qk_proj
clusters: 8
value: 1.5
attention_type: hierarchical
local_attention: 0.7
global_attention: 1.6
dynamic_weighting: true
rope_scaling: 10000
significance: 0.9
mlp:
filter: intermediate_proj
value: 14336
compression: true
dynamic_capacity: true
regularization:
filter: attention_dropout
value: 0.0
merge_method: passthrough
dtype: bfloat16
```
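No usage notes ship with the merge. Assuming the output follows standard Llama checkpoint conventions, it should load with plain `transformers`; a minimal sketch (the prompt is a placeholder):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "TheHierophant/Fimbulvetr-11B-Attention-V0.1-test"
tokenizer = AutoTokenizer.from_pretrained(repo)
# bfloat16 matches the merge config's `dtype: bfloat16`
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

inputs = tokenizer("Once upon a time", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```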
|
LEESIHYUN/xlm-roberta-base-finetuned-panx-de | LEESIHYUN | 2024-11-25T06:04:39Z | 103 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-03-02T06:14:28Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-de
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-de
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1376
- F1: 0.8650
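The card does not include a usage example; assuming the checkpoint is published under this repo id, German NER inference can be sketched as follows (the sentence is an invented example):
```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="LEESIHYUN/xlm-roberta-base-finetuned-panx-de",  # repo id from this card
    aggregation_strategy="simple",  # merge word pieces into entity spans
)
print(ner("Angela Merkel wohnte in Berlin."))
```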
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.267 | 1.0 | 525 | 0.1502 | 0.8266 |
| 0.1269 | 2.0 | 1050 | 0.1352 | 0.8574 |
| 0.0793 | 3.0 | 1575 | 0.1376 | 0.8650 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Gummybear05/wav2vec2-E50_freq_speed_pause2 | Gummybear05 | 2024-11-25T06:03:48Z | 8 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:facebook/wav2vec2-xls-r-300m",
"base_model:finetune:facebook/wav2vec2-xls-r-300m",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-25T03:27:33Z | ---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-E50_freq_speed_pause2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-E50_freq_speed_pause2
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2004
- Cer: 25.1351
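Assuming the checkpoint is available under this repo id, transcription can be sketched with the ASR pipeline (the audio path is a placeholder):
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Gummybear05/wav2vec2-E50_freq_speed_pause2",  # repo id from this card
)
# "sample.wav" is a placeholder; the pipeline decodes and resamples input audio
print(asr("sample.wav")["text"])
```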
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 31.4584 | 0.1289 | 200 | 5.0025 | 100.0 |
| 4.882 | 0.2579 | 400 | 4.6922 | 100.0 |
| 4.7642 | 0.3868 | 600 | 4.7290 | 100.0 |
| 4.7287 | 0.5158 | 800 | 4.6828 | 100.0 |
| 4.6641 | 0.6447 | 1000 | 4.6322 | 100.0 |
| 4.6322 | 0.7737 | 1200 | 4.5289 | 100.0 |
| 4.5965 | 0.9026 | 1400 | 4.5190 | 98.8132 |
| 4.4551 | 1.0316 | 1600 | 4.3994 | 97.3678 |
| 3.9667 | 1.1605 | 1800 | 3.5939 | 65.2409 |
| 3.138 | 1.2895 | 2000 | 2.8840 | 52.5147 |
| 2.6722 | 1.4184 | 2200 | 2.4844 | 44.6122 |
| 2.3367 | 1.5474 | 2400 | 2.1465 | 39.5300 |
| 2.1071 | 1.6763 | 2600 | 1.9978 | 37.5206 |
| 1.9574 | 1.8053 | 2800 | 1.8497 | 35.2585 |
| 1.7583 | 1.9342 | 3000 | 1.6906 | 33.4195 |
| 1.6158 | 2.0632 | 3200 | 1.5764 | 31.8096 |
| 1.4885 | 2.1921 | 3400 | 1.4695 | 30.5582 |
| 1.3927 | 2.3211 | 3600 | 1.4137 | 29.6710 |
| 1.3595 | 2.4500 | 3800 | 1.3518 | 27.6146 |
| 1.2957 | 2.5790 | 4000 | 1.2965 | 26.9036 |
| 1.2472 | 2.7079 | 4200 | 1.2612 | 25.9929 |
| 1.1913 | 2.8369 | 4400 | 1.2208 | 25.4289 |
| 1.1683 | 2.9658 | 4600 | 1.2004 | 25.1351 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
ICharmU/finetuning-sentiment-model-3000-samples | ICharmU | 2024-11-25T06:01:15Z | 105 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T05:20:05Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: finetuning-sentiment-model-3000-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuning-sentiment-model-3000-samples
**Note:** Only 300 samples were used due to time limitations.
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.3936
- eval_model_preparation_time: 0.0032
- eval_accuracy: 0.8333
- eval_f1: 0.8397
- eval_runtime: 191.9771
- eval_samples_per_second: 1.563
- eval_steps_per_second: 1.563
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
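This card omits a usage example; a minimal sketch with the standard `transformers` text-classification pipeline (the input sentence is illustrative):

```python
from transformers import pipeline

# Minimal usage sketch for the fine-tuned sentiment classifier.
classifier = pipeline(
    "text-classification",
    model="ICharmU/finetuning-sentiment-model-3000-samples",
)
print(classifier("This movie was surprisingly good!"))
```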
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
jaewon0916/xlm-roberta-base-finetuned-panx-de | jaewon0916 | 2024-11-25T05:57:29Z | 125 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-12T07:15:24Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-de
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-de
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3641
- F1: 0.5317
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6831 | 1.0 | 525 | 0.4823 | 0.3416 |
| 0.4381 | 2.0 | 1050 | 0.4039 | 0.4552 |
| 0.3256 | 3.0 | 1575 | 0.3641 | 0.5317 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
DBMe/Monstral-123B-2.85bpw-h6-exl2 | DBMe | 2024-11-25T05:51:33Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"chat",
"conversational",
"en",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"exl2",
"region:us"
] | text-generation | 2024-11-24T04:41:44Z | ---
license: other
license_name: mrl
language:
- en
tags:
- chat
pipeline_tag: text-generation
library_name: transformers
---
Quantized model => https://huggingface.co/MarsupialAI/Monstral-123B
**Quantization Details:**
Quantization is done using turboderp's ExLlamaV2 v0.2.4.
I use the default calibration datasets and arguments. The repo also includes a "measurement.json" file, which was used during the quantization process.
For models with bits per weight (BPW) over 6.0, I default to quantizing the `lm_head` layer at 8 bits instead of the standard 6 bits.
---
**Who are you? What's with these weird BPWs on [insert model here]?**
I specialize in optimized EXL2 quantization for models in the 70B to 100B+ range, specifically tailored for 48GB VRAM setups. My rig is built using 2 x 3090s with a Ryzen APU (APU used solely for desktop output—no VRAM wasted on the 3090s). I use TabbyAPI for inference, targeting context sizes between 32K and 64K.
Every model I upload includes a `config.yml` file with my ideal TabbyAPI settings. If you're using my config, don’t forget to set `PYTORCH_CUDA_ALLOC_CONF=backend:cudaMallocAsync` to save some VRAM.
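On Linux, the variable can be exported in the shell that launches TabbyAPI:

```
export PYTORCH_CUDA_ALLOC_CONF=backend:cudaMallocAsync
```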
|
MayBashendy/ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold3 | MayBashendy | 2024-11-25T05:50:53Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T05:18:18Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8864
- Qwk: 0.3550
- Mse: 0.8864
- Rmse: 0.9415
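For reference, the Qwk metric reported above is the quadratic weighted kappa commonly used in automated essay scoring:

$$
\kappa = 1 - \frac{\sum_{i,j} w_{ij}\, O_{ij}}{\sum_{i,j} w_{ij}\, E_{ij}},
\qquad w_{ij} = \frac{(i-j)^2}{(N-1)^2}
$$

where O is the observed rating matrix, E the expected matrix under rater independence, and N the number of score classes.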
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0008 | 2 | 9.5211 | 0.0085 | 9.5211 | 3.0856 |
| No log | 0.0016 | 4 | 7.7638 | 0.0 | 7.7638 | 2.7864 |
| No log | 0.0024 | 6 | 7.1927 | 0.0 | 7.1927 | 2.6819 |
| No log | 0.0032 | 8 | 6.4727 | -0.0012 | 6.4727 | 2.5442 |
| No log | 0.0041 | 10 | 5.3579 | 0.0007 | 5.3579 | 2.3147 |
| No log | 0.0049 | 12 | 4.3309 | 0.0 | 4.3309 | 2.0811 |
| No log | 0.0057 | 14 | 3.4456 | 0.0078 | 3.4456 | 1.8562 |
| No log | 0.0065 | 16 | 2.8072 | 0.0144 | 2.8072 | 1.6755 |
| No log | 0.0073 | 18 | 1.7915 | 0.0449 | 1.7915 | 1.3385 |
| No log | 0.0081 | 20 | 1.3687 | 0.0213 | 1.3687 | 1.1699 |
| No log | 0.0089 | 22 | 1.0646 | 0.0213 | 1.0646 | 1.0318 |
| No log | 0.0097 | 24 | 0.8607 | 0.3235 | 0.8607 | 0.9278 |
| No log | 0.0105 | 26 | 0.7966 | 0.0910 | 0.7966 | 0.8925 |
| No log | 0.0113 | 28 | 0.8009 | 0.0648 | 0.8009 | 0.8949 |
| No log | 0.0122 | 30 | 0.8471 | 0.0648 | 0.8471 | 0.9204 |
| No log | 0.0130 | 32 | 0.8928 | 0.0325 | 0.8928 | 0.9449 |
| No log | 0.0138 | 34 | 0.9072 | 0.0164 | 0.9072 | 0.9525 |
| No log | 0.0146 | 36 | 0.9057 | 0.0 | 0.9057 | 0.9517 |
| No log | 0.0154 | 38 | 0.9030 | 0.0 | 0.9030 | 0.9503 |
| No log | 0.0162 | 40 | 0.9555 | 0.0164 | 0.9555 | 0.9775 |
| No log | 0.0170 | 42 | 1.0194 | 0.0164 | 1.0194 | 1.0096 |
| No log | 0.0178 | 44 | 1.1108 | 0.0164 | 1.1108 | 1.0539 |
| No log | 0.0186 | 46 | 1.0026 | 0.0164 | 1.0026 | 1.0013 |
| No log | 0.0195 | 48 | 0.8869 | 0.0164 | 0.8869 | 0.9417 |
| No log | 0.0203 | 50 | 0.8344 | 0.0 | 0.8344 | 0.9135 |
| No log | 0.0211 | 52 | 0.8327 | 0.0 | 0.8327 | 0.9125 |
| No log | 0.0219 | 54 | 0.8921 | 0.0 | 0.8921 | 0.9445 |
| No log | 0.0227 | 56 | 1.2001 | 0.0250 | 1.2001 | 1.0955 |
| No log | 0.0235 | 58 | 1.2927 | 0.0559 | 1.2927 | 1.1370 |
| No log | 0.0243 | 60 | 0.9422 | 0.0 | 0.9422 | 0.9707 |
| No log | 0.0251 | 62 | 0.9049 | 0.0 | 0.9049 | 0.9513 |
| No log | 0.0259 | 64 | 0.9336 | 0.0 | 0.9336 | 0.9662 |
| No log | 0.0268 | 66 | 1.0105 | 0.0 | 1.0105 | 1.0052 |
| No log | 0.0276 | 68 | 1.1159 | 0.0001 | 1.1159 | 1.0564 |
| No log | 0.0284 | 70 | 0.9173 | 0.0 | 0.9173 | 0.9578 |
| No log | 0.0292 | 72 | 0.8965 | 0.0 | 0.8965 | 0.9468 |
| No log | 0.0300 | 74 | 0.9306 | 0.0 | 0.9306 | 0.9647 |
| No log | 0.0308 | 76 | 0.9083 | 0.0 | 0.9083 | 0.9530 |
| No log | 0.0316 | 78 | 0.9239 | 0.0075 | 0.9239 | 0.9612 |
| No log | 0.0324 | 80 | 1.0984 | 0.1563 | 1.0984 | 1.0481 |
| No log | 0.0332 | 82 | 1.0258 | 0.1584 | 1.0258 | 1.0128 |
| No log | 0.0340 | 84 | 0.8683 | 0.1654 | 0.8683 | 0.9319 |
| No log | 0.0349 | 86 | 0.8879 | 0.1509 | 0.8879 | 0.9423 |
| No log | 0.0357 | 88 | 1.0538 | 0.1724 | 1.0538 | 1.0265 |
| No log | 0.0365 | 90 | 1.2321 | 0.1690 | 1.2321 | 1.1100 |
| No log | 0.0373 | 92 | 0.9262 | 0.1549 | 0.9262 | 0.9624 |
| No log | 0.0381 | 94 | 0.8781 | 0.1837 | 0.8781 | 0.9371 |
| No log | 0.0389 | 96 | 1.0370 | 0.1922 | 1.0370 | 1.0183 |
| No log | 0.0397 | 98 | 1.3008 | 0.1475 | 1.3008 | 1.1405 |
| No log | 0.0405 | 100 | 1.4885 | 0.1036 | 1.4885 | 1.2200 |
| No log | 0.0413 | 102 | 1.1735 | 0.2053 | 1.1735 | 1.0833 |
| No log | 0.0422 | 104 | 0.8042 | 0.0802 | 0.8042 | 0.8968 |
| No log | 0.0430 | 106 | 0.7891 | 0.0852 | 0.7891 | 0.8883 |
| No log | 0.0438 | 108 | 0.7999 | 0.0296 | 0.7999 | 0.8944 |
| No log | 0.0446 | 110 | 0.9821 | 0.0671 | 0.9821 | 0.9910 |
| No log | 0.0454 | 112 | 1.3540 | 0.1995 | 1.3540 | 1.1636 |
| No log | 0.0462 | 114 | 1.4034 | 0.1837 | 1.4034 | 1.1847 |
| No log | 0.0470 | 116 | 1.1613 | 0.1814 | 1.1613 | 1.0776 |
| No log | 0.0478 | 118 | 0.8645 | 0.0406 | 0.8645 | 0.9298 |
| No log | 0.0486 | 120 | 0.8526 | 0.0684 | 0.8526 | 0.9234 |
| No log | 0.0495 | 122 | 1.1374 | 0.1793 | 1.1374 | 1.0665 |
| No log | 0.0503 | 124 | 1.4875 | 0.1797 | 1.4875 | 1.2196 |
| No log | 0.0511 | 126 | 1.5333 | 0.1617 | 1.5333 | 1.2383 |
| No log | 0.0519 | 128 | 1.4762 | 0.1496 | 1.4762 | 1.2150 |
| No log | 0.0527 | 130 | 1.3425 | 0.1505 | 1.3425 | 1.1586 |
| No log | 0.0535 | 132 | 1.4790 | 0.1362 | 1.4790 | 1.2162 |
| No log | 0.0543 | 134 | 1.6983 | 0.0960 | 1.6983 | 1.3032 |
| No log | 0.0551 | 136 | 1.6694 | 0.1184 | 1.6694 | 1.2920 |
| No log | 0.0559 | 138 | 1.3125 | 0.1790 | 1.3125 | 1.1456 |
| No log | 0.0567 | 140 | 0.9324 | 0.2274 | 0.9324 | 0.9656 |
| No log | 0.0576 | 142 | 0.9334 | 0.2387 | 0.9334 | 0.9661 |
| No log | 0.0584 | 144 | 1.2946 | 0.1722 | 1.2946 | 1.1378 |
| No log | 0.0592 | 146 | 1.5237 | 0.1302 | 1.5237 | 1.2344 |
| No log | 0.0600 | 148 | 1.2725 | 0.2041 | 1.2725 | 1.1281 |
| No log | 0.0608 | 150 | 1.0451 | 0.2604 | 1.0451 | 1.0223 |
| No log | 0.0616 | 152 | 0.9733 | 0.2814 | 0.9733 | 0.9865 |
| No log | 0.0624 | 154 | 0.8388 | 0.3151 | 0.8388 | 0.9159 |
| No log | 0.0632 | 156 | 0.9445 | 0.2922 | 0.9445 | 0.9718 |
| No log | 0.0640 | 158 | 1.2082 | 0.2450 | 1.2082 | 1.0992 |
| No log | 0.0649 | 160 | 1.2216 | 0.2540 | 1.2216 | 1.1053 |
| No log | 0.0657 | 162 | 0.8965 | 0.2518 | 0.8965 | 0.9468 |
| No log | 0.0665 | 164 | 0.7406 | 0.1494 | 0.7406 | 0.8606 |
| No log | 0.0673 | 166 | 0.7469 | 0.1497 | 0.7469 | 0.8643 |
| No log | 0.0681 | 168 | 0.7615 | 0.2034 | 0.7615 | 0.8726 |
| No log | 0.0689 | 170 | 0.9080 | 0.2664 | 0.9080 | 0.9529 |
| No log | 0.0697 | 172 | 1.3884 | 0.2180 | 1.3884 | 1.1783 |
| No log | 0.0705 | 174 | 1.2009 | 0.2437 | 1.2009 | 1.0958 |
| No log | 0.0713 | 176 | 0.9239 | 0.2802 | 0.9239 | 0.9612 |
| No log | 0.0722 | 178 | 1.0926 | 0.2625 | 1.0926 | 1.0453 |
| No log | 0.0730 | 180 | 1.4207 | 0.2061 | 1.4207 | 1.1919 |
| No log | 0.0738 | 182 | 1.1721 | 0.2385 | 1.1721 | 1.0827 |
| No log | 0.0746 | 184 | 1.1990 | 0.2205 | 1.1990 | 1.0950 |
| No log | 0.0754 | 186 | 1.0254 | 0.2137 | 1.0254 | 1.0126 |
| No log | 0.0762 | 188 | 0.8561 | 0.1982 | 0.8561 | 0.9253 |
| No log | 0.0770 | 190 | 0.9205 | 0.2474 | 0.9205 | 0.9594 |
| No log | 0.0778 | 192 | 1.1717 | 0.1869 | 1.1717 | 1.0824 |
| No log | 0.0786 | 194 | 1.4316 | 0.1669 | 1.4316 | 1.1965 |
| No log | 0.0794 | 196 | 1.1419 | 0.1964 | 1.1419 | 1.0686 |
| No log | 0.0803 | 198 | 0.8386 | 0.2808 | 0.8386 | 0.9157 |
| No log | 0.0811 | 200 | 0.8245 | 0.2511 | 0.8245 | 0.9080 |
| No log | 0.0819 | 202 | 0.9247 | 0.2598 | 0.9247 | 0.9616 |
| No log | 0.0827 | 204 | 0.8609 | 0.3103 | 0.8609 | 0.9279 |
| No log | 0.0835 | 206 | 0.9117 | 0.3048 | 0.9117 | 0.9548 |
| No log | 0.0843 | 208 | 1.1003 | 0.2614 | 1.1003 | 1.0490 |
| No log | 0.0851 | 210 | 0.8235 | 0.3402 | 0.8235 | 0.9075 |
| No log | 0.0859 | 212 | 0.7206 | 0.2885 | 0.7206 | 0.8489 |
| No log | 0.0867 | 214 | 0.7273 | 0.2624 | 0.7273 | 0.8528 |
| No log | 0.0876 | 216 | 0.7721 | 0.2794 | 0.7721 | 0.8787 |
| No log | 0.0884 | 218 | 0.7771 | 0.2494 | 0.7771 | 0.8815 |
| No log | 0.0892 | 220 | 0.7609 | 0.2048 | 0.7609 | 0.8723 |
| No log | 0.0900 | 222 | 0.7760 | 0.1930 | 0.7760 | 0.8809 |
| No log | 0.0908 | 224 | 0.8475 | 0.1722 | 0.8475 | 0.9206 |
| No log | 0.0916 | 226 | 0.9025 | 0.1739 | 0.9025 | 0.9500 |
| No log | 0.0924 | 228 | 0.8373 | 0.2369 | 0.8373 | 0.9150 |
| No log | 0.0932 | 230 | 0.8519 | 0.2628 | 0.8519 | 0.9230 |
| No log | 0.0940 | 232 | 0.9719 | 0.2743 | 0.9719 | 0.9858 |
| No log | 0.0949 | 234 | 1.0164 | 0.2726 | 1.0164 | 1.0082 |
| No log | 0.0957 | 236 | 0.8096 | 0.2940 | 0.8096 | 0.8998 |
| No log | 0.0965 | 238 | 0.8164 | 0.2987 | 0.8164 | 0.9036 |
| No log | 0.0973 | 240 | 1.1212 | 0.2574 | 1.1212 | 1.0589 |
| No log | 0.0981 | 242 | 1.2868 | 0.2254 | 1.2868 | 1.1344 |
| No log | 0.0989 | 244 | 0.9503 | 0.2740 | 0.9503 | 0.9749 |
| No log | 0.0997 | 246 | 0.8232 | 0.1960 | 0.8232 | 0.9073 |
| No log | 0.1005 | 248 | 0.8321 | 0.1973 | 0.8321 | 0.9122 |
| No log | 0.1013 | 250 | 0.9043 | 0.2290 | 0.9043 | 0.9509 |
| No log | 0.1021 | 252 | 1.2968 | 0.1984 | 1.2968 | 1.1388 |
| No log | 0.1030 | 254 | 1.2826 | 0.1930 | 1.2826 | 1.1325 |
| No log | 0.1038 | 256 | 0.9921 | 0.2212 | 0.9921 | 0.9961 |
| No log | 0.1046 | 258 | 0.8900 | 0.1663 | 0.8900 | 0.9434 |
| No log | 0.1054 | 260 | 0.9361 | 0.1714 | 0.9361 | 0.9675 |
| No log | 0.1062 | 262 | 1.1213 | 0.2056 | 1.1213 | 1.0589 |
| No log | 0.1070 | 264 | 1.2654 | 0.1960 | 1.2654 | 1.1249 |
| No log | 0.1078 | 266 | 1.1232 | 0.2289 | 1.1232 | 1.0598 |
| No log | 0.1086 | 268 | 0.9849 | 0.2136 | 0.9849 | 0.9924 |
| No log | 0.1094 | 270 | 1.1004 | 0.2157 | 1.1004 | 1.0490 |
| No log | 0.1103 | 272 | 1.3683 | 0.1988 | 1.3683 | 1.1697 |
| No log | 0.1111 | 274 | 1.2697 | 0.1941 | 1.2697 | 1.1268 |
| No log | 0.1119 | 276 | 1.1292 | 0.2055 | 1.1292 | 1.0626 |
| No log | 0.1127 | 278 | 1.0772 | 0.1996 | 1.0772 | 1.0379 |
| No log | 0.1135 | 280 | 1.1942 | 0.1848 | 1.1942 | 1.0928 |
| No log | 0.1143 | 282 | 1.3719 | 0.1531 | 1.3719 | 1.1713 |
| No log | 0.1151 | 284 | 1.0921 | 0.1868 | 1.0921 | 1.0450 |
| No log | 0.1159 | 286 | 1.1128 | 0.1879 | 1.1128 | 1.0549 |
| No log | 0.1167 | 288 | 1.3590 | 0.1511 | 1.3590 | 1.1657 |
| No log | 0.1176 | 290 | 1.4723 | 0.1514 | 1.4723 | 1.2134 |
| No log | 0.1184 | 292 | 1.4840 | 0.1467 | 1.4840 | 1.2182 |
| No log | 0.1192 | 294 | 1.2196 | 0.1978 | 1.2196 | 1.1043 |
| No log | 0.1200 | 296 | 0.9451 | 0.1739 | 0.9451 | 0.9721 |
| No log | 0.1208 | 298 | 0.9789 | 0.1786 | 0.9789 | 0.9894 |
| No log | 0.1216 | 300 | 1.0237 | 0.2253 | 1.0237 | 1.0118 |
| No log | 0.1224 | 302 | 1.2818 | 0.2214 | 1.2818 | 1.1322 |
| No log | 0.1232 | 304 | 1.4200 | 0.2055 | 1.4200 | 1.1917 |
| No log | 0.1240 | 306 | 1.0439 | 0.2383 | 1.0439 | 1.0217 |
| No log | 0.1248 | 308 | 0.8017 | 0.2183 | 0.8017 | 0.8954 |
| No log | 0.1257 | 310 | 0.8045 | 0.2286 | 0.8045 | 0.8969 |
| No log | 0.1265 | 312 | 1.0384 | 0.2339 | 1.0384 | 1.0190 |
| No log | 0.1273 | 314 | 1.3067 | 0.2233 | 1.3067 | 1.1431 |
| No log | 0.1281 | 316 | 1.0476 | 0.2413 | 1.0476 | 1.0235 |
| No log | 0.1289 | 318 | 0.8234 | 0.2037 | 0.8234 | 0.9074 |
| No log | 0.1297 | 320 | 0.7986 | 0.2109 | 0.7986 | 0.8937 |
| No log | 0.1305 | 322 | 0.9317 | 0.2364 | 0.9317 | 0.9653 |
| No log | 0.1313 | 324 | 1.1979 | 0.2197 | 1.1979 | 1.0945 |
| No log | 0.1321 | 326 | 1.0247 | 0.2571 | 1.0247 | 1.0123 |
| No log | 0.1330 | 328 | 0.7914 | 0.2415 | 0.7914 | 0.8896 |
| No log | 0.1338 | 330 | 0.7712 | 0.1931 | 0.7712 | 0.8782 |
| No log | 0.1346 | 332 | 0.8331 | 0.2360 | 0.8331 | 0.9128 |
| No log | 0.1354 | 334 | 0.9862 | 0.2632 | 0.9862 | 0.9931 |
| No log | 0.1362 | 336 | 1.1504 | 0.2135 | 1.1504 | 1.0726 |
| No log | 0.1370 | 338 | 0.9062 | 0.2811 | 0.9062 | 0.9520 |
| No log | 0.1378 | 340 | 0.8195 | 0.2506 | 0.8195 | 0.9052 |
| No log | 0.1386 | 342 | 0.8229 | 0.2625 | 0.8229 | 0.9071 |
| No log | 0.1394 | 344 | 0.9846 | 0.3087 | 0.9846 | 0.9923 |
| No log | 0.1403 | 346 | 0.9594 | 0.2868 | 0.9594 | 0.9795 |
| No log | 0.1411 | 348 | 0.7996 | 0.2404 | 0.7996 | 0.8942 |
| No log | 0.1419 | 350 | 0.7957 | 0.2499 | 0.7957 | 0.8920 |
| No log | 0.1427 | 352 | 0.8933 | 0.2527 | 0.8933 | 0.9452 |
| No log | 0.1435 | 354 | 0.8482 | 0.2634 | 0.8482 | 0.9210 |
| No log | 0.1443 | 356 | 0.7788 | 0.2659 | 0.7788 | 0.8825 |
| No log | 0.1451 | 358 | 0.7845 | 0.2428 | 0.7845 | 0.8857 |
| No log | 0.1459 | 360 | 0.7883 | 0.2731 | 0.7883 | 0.8879 |
| No log | 0.1467 | 362 | 0.8310 | 0.2785 | 0.8310 | 0.9116 |
| No log | 0.1475 | 364 | 0.8319 | 0.2857 | 0.8319 | 0.9121 |
| No log | 0.1484 | 366 | 0.7908 | 0.2593 | 0.7908 | 0.8893 |
| No log | 0.1492 | 368 | 0.8194 | 0.2318 | 0.8194 | 0.9052 |
| No log | 0.1500 | 370 | 0.8003 | 0.2160 | 0.8003 | 0.8946 |
| No log | 0.1508 | 372 | 0.8383 | 0.2854 | 0.8383 | 0.9156 |
| No log | 0.1516 | 374 | 0.8361 | 0.2787 | 0.8361 | 0.9144 |
| No log | 0.1524 | 376 | 0.7749 | 0.2542 | 0.7749 | 0.8803 |
| No log | 0.1532 | 378 | 0.8254 | 0.2040 | 0.8254 | 0.9085 |
| No log | 0.1540 | 380 | 0.8412 | 0.2267 | 0.8412 | 0.9172 |
| No log | 0.1548 | 382 | 0.7438 | 0.2968 | 0.7438 | 0.8624 |
| No log | 0.1557 | 384 | 0.7877 | 0.3415 | 0.7877 | 0.8875 |
| No log | 0.1565 | 386 | 0.8279 | 0.3333 | 0.8279 | 0.9099 |
| No log | 0.1573 | 388 | 0.7186 | 0.3515 | 0.7186 | 0.8477 |
| No log | 0.1581 | 390 | 0.7607 | 0.2778 | 0.7607 | 0.8722 |
| No log | 0.1589 | 392 | 0.7769 | 0.2683 | 0.7769 | 0.8814 |
| No log | 0.1597 | 394 | 0.7069 | 0.3299 | 0.7069 | 0.8408 |
| No log | 0.1605 | 396 | 0.7091 | 0.3601 | 0.7091 | 0.8421 |
| No log | 0.1613 | 398 | 0.7039 | 0.3409 | 0.7039 | 0.8390 |
| No log | 0.1621 | 400 | 0.7179 | 0.3585 | 0.7179 | 0.8473 |
| No log | 0.1630 | 402 | 0.7321 | 0.3374 | 0.7321 | 0.8557 |
| No log | 0.1638 | 404 | 0.7686 | 0.3583 | 0.7686 | 0.8767 |
| No log | 0.1646 | 406 | 0.7637 | 0.3413 | 0.7637 | 0.8739 |
| No log | 0.1654 | 408 | 0.7411 | 0.3383 | 0.7411 | 0.8608 |
| No log | 0.1662 | 410 | 0.7352 | 0.3274 | 0.7352 | 0.8574 |
| No log | 0.1670 | 412 | 0.7548 | 0.3264 | 0.7548 | 0.8688 |
| No log | 0.1678 | 414 | 0.8822 | 0.3071 | 0.8822 | 0.9392 |
| No log | 0.1686 | 416 | 0.9175 | 0.3171 | 0.9175 | 0.9579 |
| No log | 0.1694 | 418 | 0.7634 | 0.3313 | 0.7634 | 0.8738 |
| No log | 0.1702 | 420 | 0.7216 | 0.3252 | 0.7216 | 0.8494 |
| No log | 0.1711 | 422 | 0.7332 | 0.3080 | 0.7332 | 0.8562 |
| No log | 0.1719 | 424 | 0.7549 | 0.3732 | 0.7549 | 0.8688 |
| No log | 0.1727 | 426 | 0.9698 | 0.3516 | 0.9698 | 0.9848 |
| No log | 0.1735 | 428 | 0.8549 | 0.3605 | 0.8549 | 0.9246 |
| No log | 0.1743 | 430 | 0.6854 | 0.3681 | 0.6854 | 0.8279 |
| No log | 0.1751 | 432 | 0.7212 | 0.3046 | 0.7212 | 0.8492 |
| No log | 0.1759 | 434 | 0.7133 | 0.3028 | 0.7133 | 0.8446 |
| No log | 0.1767 | 436 | 0.6673 | 0.3231 | 0.6673 | 0.8169 |
| No log | 0.1775 | 438 | 0.7067 | 0.3245 | 0.7067 | 0.8406 |
| No log | 0.1784 | 440 | 0.6928 | 0.3203 | 0.6928 | 0.8324 |
| No log | 0.1792 | 442 | 0.7009 | 0.3174 | 0.7009 | 0.8372 |
| No log | 0.1800 | 444 | 0.7195 | 0.3302 | 0.7195 | 0.8482 |
| No log | 0.1808 | 446 | 0.8087 | 0.3160 | 0.8087 | 0.8993 |
| No log | 0.1816 | 448 | 0.8538 | 0.3002 | 0.8538 | 0.9240 |
| No log | 0.1824 | 450 | 0.7595 | 0.2963 | 0.7595 | 0.8715 |
| No log | 0.1832 | 452 | 0.7829 | 0.2588 | 0.7829 | 0.8848 |
| No log | 0.1840 | 454 | 0.7573 | 0.2883 | 0.7573 | 0.8702 |
| No log | 0.1848 | 456 | 0.8051 | 0.2721 | 0.8051 | 0.8973 |
| No log | 0.1857 | 458 | 0.8523 | 0.2872 | 0.8523 | 0.9232 |
| No log | 0.1865 | 460 | 0.7410 | 0.3264 | 0.7410 | 0.8608 |
| No log | 0.1873 | 462 | 0.7136 | 0.3443 | 0.7136 | 0.8447 |
| No log | 0.1881 | 464 | 0.7115 | 0.3424 | 0.7115 | 0.8435 |
| No log | 0.1889 | 466 | 0.7652 | 0.3633 | 0.7652 | 0.8748 |
| No log | 0.1897 | 468 | 0.7898 | 0.3585 | 0.7898 | 0.8887 |
| No log | 0.1905 | 470 | 0.7537 | 0.3257 | 0.7537 | 0.8682 |
| No log | 0.1913 | 472 | 0.7553 | 0.3522 | 0.7553 | 0.8691 |
| No log | 0.1921 | 474 | 0.8120 | 0.3425 | 0.8120 | 0.9011 |
| No log | 0.1929 | 476 | 0.7625 | 0.3299 | 0.7625 | 0.8732 |
| No log | 0.1938 | 478 | 0.6969 | 0.3343 | 0.6969 | 0.8348 |
| No log | 0.1946 | 480 | 0.7400 | 0.3047 | 0.7400 | 0.8602 |
| No log | 0.1954 | 482 | 0.7084 | 0.3224 | 0.7084 | 0.8416 |
| No log | 0.1962 | 484 | 0.6966 | 0.3188 | 0.6966 | 0.8346 |
| No log | 0.1970 | 486 | 0.7850 | 0.3462 | 0.7850 | 0.8860 |
| No log | 0.1978 | 488 | 0.7492 | 0.3324 | 0.7492 | 0.8656 |
| No log | 0.1986 | 490 | 0.7379 | 0.3049 | 0.7379 | 0.8590 |
| No log | 0.1994 | 492 | 0.7495 | 0.2910 | 0.7495 | 0.8657 |
| No log | 0.2002 | 494 | 0.7539 | 0.3325 | 0.7539 | 0.8683 |
| No log | 0.2011 | 496 | 0.7623 | 0.3659 | 0.7623 | 0.8731 |
| No log | 0.2019 | 498 | 0.6939 | 0.3530 | 0.6939 | 0.8330 |
| 0.934 | 0.2027 | 500 | 0.7732 | 0.3011 | 0.7732 | 0.8793 |
| 0.934 | 0.2035 | 502 | 0.7525 | 0.2977 | 0.7525 | 0.8674 |
| 0.934 | 0.2043 | 504 | 0.6796 | 0.3690 | 0.6796 | 0.8244 |
| 0.934 | 0.2051 | 506 | 0.7122 | 0.3229 | 0.7122 | 0.8439 |
| 0.934 | 0.2059 | 508 | 0.7012 | 0.3381 | 0.7012 | 0.8374 |
| 0.934 | 0.2067 | 510 | 0.7835 | 0.2812 | 0.7835 | 0.8851 |
| 0.934 | 0.2075 | 512 | 0.8539 | 0.2447 | 0.8539 | 0.9241 |
| 0.934 | 0.2084 | 514 | 0.7376 | 0.3324 | 0.7376 | 0.8588 |
| 0.934 | 0.2092 | 516 | 0.7424 | 0.3530 | 0.7424 | 0.8616 |
| 0.934 | 0.2100 | 518 | 0.7216 | 0.3606 | 0.7216 | 0.8495 |
| 0.934 | 0.2108 | 520 | 0.6917 | 0.3578 | 0.6917 | 0.8317 |
| 0.934 | 0.2116 | 522 | 0.6888 | 0.3626 | 0.6888 | 0.8300 |
| 0.934 | 0.2124 | 524 | 0.6776 | 0.3776 | 0.6776 | 0.8232 |
| 0.934 | 0.2132 | 526 | 0.6715 | 0.3753 | 0.6715 | 0.8194 |
| 0.934 | 0.2140 | 528 | 0.6793 | 0.3954 | 0.6793 | 0.8242 |
| 0.934 | 0.2148 | 530 | 0.7225 | 0.4331 | 0.7225 | 0.8500 |
| 0.934 | 0.2156 | 532 | 0.6937 | 0.4113 | 0.6937 | 0.8329 |
| 0.934 | 0.2165 | 534 | 0.6909 | 0.4100 | 0.6909 | 0.8312 |
| 0.934 | 0.2173 | 536 | 0.6858 | 0.4127 | 0.6858 | 0.8281 |
| 0.934 | 0.2181 | 538 | 0.6928 | 0.3865 | 0.6928 | 0.8323 |
| 0.934 | 0.2189 | 540 | 0.6935 | 0.3767 | 0.6935 | 0.8328 |
| 0.934 | 0.2197 | 542 | 0.7385 | 0.3379 | 0.7385 | 0.8593 |
| 0.934 | 0.2205 | 544 | 0.7458 | 0.3394 | 0.7458 | 0.8636 |
| 0.934 | 0.2213 | 546 | 0.7457 | 0.3384 | 0.7457 | 0.8635 |
| 0.934 | 0.2221 | 548 | 0.7649 | 0.3267 | 0.7649 | 0.8746 |
| 0.934 | 0.2229 | 550 | 0.7700 | 0.3611 | 0.7700 | 0.8775 |
| 0.934 | 0.2238 | 552 | 0.7702 | 0.3446 | 0.7702 | 0.8776 |
| 0.934 | 0.2246 | 554 | 0.7567 | 0.3477 | 0.7567 | 0.8699 |
| 0.934 | 0.2254 | 556 | 0.7433 | 0.3535 | 0.7433 | 0.8621 |
| 0.934 | 0.2262 | 558 | 0.8157 | 0.3443 | 0.8157 | 0.9032 |
| 0.934 | 0.2270 | 560 | 0.8270 | 0.3320 | 0.8270 | 0.9094 |
| 0.934 | 0.2278 | 562 | 0.7428 | 0.3275 | 0.7428 | 0.8618 |
| 0.934 | 0.2286 | 564 | 0.8466 | 0.2571 | 0.8466 | 0.9201 |
| 0.934 | 0.2294 | 566 | 0.8050 | 0.2791 | 0.8050 | 0.8972 |
| 0.934 | 0.2302 | 568 | 0.7260 | 0.3424 | 0.7260 | 0.8521 |
| 0.934 | 0.2310 | 570 | 0.7500 | 0.3789 | 0.7500 | 0.8660 |
| 0.934 | 0.2319 | 572 | 0.7383 | 0.4091 | 0.7383 | 0.8593 |
| 0.934 | 0.2327 | 574 | 0.6799 | 0.3774 | 0.6799 | 0.8246 |
| 0.934 | 0.2335 | 576 | 0.7321 | 0.3665 | 0.7321 | 0.8556 |
| 0.934 | 0.2343 | 578 | 0.6742 | 0.3778 | 0.6742 | 0.8211 |
| 0.934 | 0.2351 | 580 | 0.7255 | 0.4219 | 0.7255 | 0.8518 |
| 0.934 | 0.2359 | 582 | 0.8879 | 0.4102 | 0.8879 | 0.9423 |
| 0.934 | 0.2367 | 584 | 0.7819 | 0.4286 | 0.7819 | 0.8842 |
| 0.934 | 0.2375 | 586 | 0.6981 | 0.3954 | 0.6981 | 0.8355 |
| 0.934 | 0.2383 | 588 | 0.6966 | 0.3757 | 0.6966 | 0.8346 |
| 0.934 | 0.2392 | 590 | 0.7253 | 0.3962 | 0.7253 | 0.8516 |
| 0.934 | 0.2400 | 592 | 0.8609 | 0.4052 | 0.8609 | 0.9278 |
| 0.934 | 0.2408 | 594 | 0.7854 | 0.4049 | 0.7854 | 0.8862 |
| 0.934 | 0.2416 | 596 | 0.7298 | 0.3740 | 0.7298 | 0.8543 |
| 0.934 | 0.2424 | 598 | 0.7518 | 0.3440 | 0.7518 | 0.8671 |
| 0.934 | 0.2432 | 600 | 0.7397 | 0.3631 | 0.7397 | 0.8601 |
| 0.934 | 0.2440 | 602 | 0.7354 | 0.3522 | 0.7354 | 0.8576 |
| 0.934 | 0.2448 | 604 | 0.7321 | 0.3499 | 0.7321 | 0.8556 |
| 0.934 | 0.2456 | 606 | 0.7283 | 0.3426 | 0.7283 | 0.8534 |
| 0.934 | 0.2465 | 608 | 0.7193 | 0.3408 | 0.7193 | 0.8481 |
| 0.934 | 0.2473 | 610 | 0.7143 | 0.3347 | 0.7143 | 0.8451 |
| 0.934 | 0.2481 | 612 | 0.7169 | 0.3374 | 0.7169 | 0.8467 |
| 0.934 | 0.2489 | 614 | 0.7232 | 0.3311 | 0.7232 | 0.8504 |
| 0.934 | 0.2497 | 616 | 0.7712 | 0.3539 | 0.7712 | 0.8782 |
| 0.934 | 0.2505 | 618 | 0.8304 | 0.4071 | 0.8304 | 0.9113 |
| 0.934 | 0.2513 | 620 | 0.7238 | 0.3798 | 0.7238 | 0.8508 |
| 0.934 | 0.2521 | 622 | 0.6872 | 0.3748 | 0.6872 | 0.8290 |
| 0.934 | 0.2529 | 624 | 0.7574 | 0.4173 | 0.7574 | 0.8703 |
| 0.934 | 0.2537 | 626 | 0.7265 | 0.3957 | 0.7265 | 0.8523 |
| 0.934 | 0.2546 | 628 | 0.6639 | 0.3974 | 0.6639 | 0.8148 |
| 0.934 | 0.2554 | 630 | 0.6860 | 0.3997 | 0.6860 | 0.8283 |
| 0.934 | 0.2562 | 632 | 0.7070 | 0.4128 | 0.7070 | 0.8408 |
| 0.934 | 0.2570 | 634 | 0.6750 | 0.4125 | 0.6750 | 0.8216 |
| 0.934 | 0.2578 | 636 | 0.6760 | 0.4161 | 0.6760 | 0.8222 |
| 0.934 | 0.2586 | 638 | 0.7038 | 0.3801 | 0.7038 | 0.8389 |
| 0.934 | 0.2594 | 640 | 0.6844 | 0.3880 | 0.6844 | 0.8273 |
| 0.934 | 0.2602 | 642 | 0.7153 | 0.4000 | 0.7153 | 0.8457 |
| 0.934 | 0.2610 | 644 | 0.7135 | 0.3602 | 0.7135 | 0.8447 |
| 0.934 | 0.2619 | 646 | 0.7275 | 0.3557 | 0.7275 | 0.8529 |
| 0.934 | 0.2627 | 648 | 0.7135 | 0.3208 | 0.7135 | 0.8447 |
| 0.934 | 0.2635 | 650 | 0.7139 | 0.3283 | 0.7139 | 0.8449 |
| 0.934 | 0.2643 | 652 | 0.7229 | 0.3050 | 0.7229 | 0.8503 |
| 0.934 | 0.2651 | 654 | 0.7387 | 0.3294 | 0.7387 | 0.8595 |
| 0.934 | 0.2659 | 656 | 0.7159 | 0.3029 | 0.7159 | 0.8461 |
| 0.934 | 0.2667 | 658 | 0.7207 | 0.3032 | 0.7207 | 0.8489 |
| 0.934 | 0.2675 | 660 | 0.7676 | 0.3484 | 0.7676 | 0.8761 |
| 0.934 | 0.2683 | 662 | 0.9858 | 0.3364 | 0.9858 | 0.9929 |
| 0.934 | 0.2692 | 664 | 0.9152 | 0.3425 | 0.9152 | 0.9566 |
| 0.934 | 0.2700 | 666 | 0.7322 | 0.3056 | 0.7322 | 0.8557 |
| 0.934 | 0.2708 | 668 | 0.7542 | 0.3232 | 0.7542 | 0.8684 |
| 0.934 | 0.2716 | 670 | 0.7494 | 0.3128 | 0.7494 | 0.8657 |
| 0.934 | 0.2724 | 672 | 0.7823 | 0.3356 | 0.7823 | 0.8845 |
| 0.934 | 0.2732 | 674 | 0.9038 | 0.3323 | 0.9038 | 0.9507 |
| 0.934 | 0.2740 | 676 | 0.8235 | 0.3517 | 0.8235 | 0.9075 |
| 0.934 | 0.2748 | 678 | 0.7770 | 0.3484 | 0.7770 | 0.8815 |
| 0.934 | 0.2756 | 680 | 0.8314 | 0.3779 | 0.8314 | 0.9118 |
| 0.934 | 0.2764 | 682 | 0.7419 | 0.3595 | 0.7419 | 0.8614 |
| 0.934 | 0.2773 | 684 | 0.7566 | 0.3847 | 0.7566 | 0.8698 |
| 0.934 | 0.2781 | 686 | 0.7483 | 0.3807 | 0.7483 | 0.8651 |
| 0.934 | 0.2789 | 688 | 0.7094 | 0.3987 | 0.7094 | 0.8422 |
| 0.934 | 0.2797 | 690 | 0.7608 | 0.3915 | 0.7608 | 0.8722 |
| 0.934 | 0.2805 | 692 | 0.7558 | 0.3871 | 0.7558 | 0.8694 |
| 0.934 | 0.2813 | 694 | 0.6837 | 0.3781 | 0.6837 | 0.8269 |
| 0.934 | 0.2821 | 696 | 0.6994 | 0.3924 | 0.6994 | 0.8363 |
| 0.934 | 0.2829 | 698 | 0.7135 | 0.3682 | 0.7135 | 0.8447 |
| 0.934 | 0.2837 | 700 | 0.7303 | 0.3646 | 0.7303 | 0.8546 |
| 0.934 | 0.2846 | 702 | 0.9187 | 0.3878 | 0.9187 | 0.9585 |
| 0.934 | 0.2854 | 704 | 1.0355 | 0.3343 | 1.0355 | 1.0176 |
| 0.934 | 0.2862 | 706 | 0.8372 | 0.3466 | 0.8372 | 0.9150 |
| 0.934 | 0.2870 | 708 | 0.7284 | 0.3352 | 0.7284 | 0.8534 |
| 0.934 | 0.2878 | 710 | 0.7340 | 0.3405 | 0.7340 | 0.8567 |
| 0.934 | 0.2886 | 712 | 0.7525 | 0.3564 | 0.7525 | 0.8675 |
| 0.934 | 0.2894 | 714 | 0.8538 | 0.3778 | 0.8538 | 0.9240 |
| 0.934 | 0.2902 | 716 | 0.8531 | 0.3515 | 0.8531 | 0.9236 |
| 0.934 | 0.2910 | 718 | 0.7635 | 0.3579 | 0.7635 | 0.8738 |
| 0.934 | 0.2919 | 720 | 0.6918 | 0.3829 | 0.6918 | 0.8317 |
| 0.934 | 0.2927 | 722 | 0.6950 | 0.3832 | 0.6950 | 0.8337 |
| 0.934 | 0.2935 | 724 | 0.7626 | 0.3854 | 0.7626 | 0.8733 |
| 0.934 | 0.2943 | 726 | 0.7315 | 0.3791 | 0.7315 | 0.8553 |
| 0.934 | 0.2951 | 728 | 0.7178 | 0.3937 | 0.7178 | 0.8472 |
| 0.934 | 0.2959 | 730 | 0.6627 | 0.3991 | 0.6627 | 0.8141 |
| 0.934 | 0.2967 | 732 | 0.6847 | 0.4048 | 0.6847 | 0.8275 |
| 0.934 | 0.2975 | 734 | 0.6678 | 0.4100 | 0.6678 | 0.8172 |
| 0.934 | 0.2983 | 736 | 0.6926 | 0.4379 | 0.6926 | 0.8322 |
| 0.934 | 0.2991 | 738 | 0.6459 | 0.4401 | 0.6459 | 0.8037 |
| 0.934 | 0.3000 | 740 | 0.6589 | 0.4072 | 0.6589 | 0.8117 |
| 0.934 | 0.3008 | 742 | 0.6587 | 0.3983 | 0.6587 | 0.8116 |
| 0.934 | 0.3016 | 744 | 0.6703 | 0.4146 | 0.6703 | 0.8187 |
| 0.934 | 0.3024 | 746 | 0.6995 | 0.4391 | 0.6995 | 0.8364 |
| 0.934 | 0.3032 | 748 | 0.6893 | 0.4523 | 0.6893 | 0.8302 |
| 0.934 | 0.3040 | 750 | 0.6745 | 0.4088 | 0.6745 | 0.8213 |
| 0.934 | 0.3048 | 752 | 0.6554 | 0.4272 | 0.6554 | 0.8096 |
| 0.934 | 0.3056 | 754 | 0.6356 | 0.4144 | 0.6356 | 0.7973 |
| 0.934 | 0.3064 | 756 | 0.6508 | 0.4217 | 0.6508 | 0.8067 |
| 0.934 | 0.3073 | 758 | 0.6540 | 0.3948 | 0.6540 | 0.8087 |
| 0.934 | 0.3081 | 760 | 0.6684 | 0.3842 | 0.6684 | 0.8176 |
| 0.934 | 0.3089 | 762 | 0.6597 | 0.4044 | 0.6597 | 0.8122 |
| 0.934 | 0.3097 | 764 | 0.6868 | 0.4166 | 0.6868 | 0.8287 |
| 0.934 | 0.3105 | 766 | 0.7299 | 0.4065 | 0.7299 | 0.8543 |
| 0.934 | 0.3113 | 768 | 0.7622 | 0.3902 | 0.7622 | 0.8731 |
| 0.934 | 0.3121 | 770 | 0.7281 | 0.3694 | 0.7281 | 0.8533 |
| 0.934 | 0.3129 | 772 | 0.7006 | 0.3761 | 0.7006 | 0.8370 |
| 0.934 | 0.3137 | 774 | 0.7397 | 0.3228 | 0.7397 | 0.8600 |
| 0.934 | 0.3146 | 776 | 0.7176 | 0.3338 | 0.7176 | 0.8471 |
| 0.934 | 0.3154 | 778 | 0.7316 | 0.3787 | 0.7316 | 0.8553 |
| 0.934 | 0.3162 | 780 | 0.9242 | 0.3504 | 0.9242 | 0.9613 |
| 0.934 | 0.3170 | 782 | 0.8678 | 0.3709 | 0.8678 | 0.9315 |
| 0.934 | 0.3178 | 784 | 0.7681 | 0.3886 | 0.7681 | 0.8764 |
| 0.934 | 0.3186 | 786 | 0.7731 | 0.3373 | 0.7731 | 0.8793 |
| 0.934 | 0.3194 | 788 | 0.8053 | 0.3653 | 0.8053 | 0.8974 |
| 0.934 | 0.3202 | 790 | 0.8751 | 0.3961 | 0.8751 | 0.9355 |
| 0.934 | 0.3210 | 792 | 0.8788 | 0.4033 | 0.8788 | 0.9374 |
| 0.934 | 0.3218 | 794 | 0.8115 | 0.4170 | 0.8115 | 0.9008 |
| 0.934 | 0.3227 | 796 | 0.8006 | 0.4214 | 0.8006 | 0.8947 |
| 0.934 | 0.3235 | 798 | 0.7455 | 0.4489 | 0.7455 | 0.8634 |
| 0.934 | 0.3243 | 800 | 0.7228 | 0.4605 | 0.7228 | 0.8502 |
| 0.934 | 0.3251 | 802 | 0.7383 | 0.4564 | 0.7383 | 0.8592 |
| 0.934 | 0.3259 | 804 | 0.7061 | 0.4571 | 0.7061 | 0.8403 |
| 0.934 | 0.3267 | 806 | 0.7464 | 0.3885 | 0.7464 | 0.8639 |
| 0.934 | 0.3275 | 808 | 0.7697 | 0.3594 | 0.7697 | 0.8773 |
| 0.934 | 0.3283 | 810 | 0.7060 | 0.4426 | 0.7060 | 0.8403 |
| 0.934 | 0.3291 | 812 | 0.7866 | 0.4255 | 0.7866 | 0.8869 |
| 0.934 | 0.3300 | 814 | 0.7162 | 0.4573 | 0.7162 | 0.8463 |
| 0.934 | 0.3308 | 816 | 0.6738 | 0.3894 | 0.6738 | 0.8209 |
| 0.934 | 0.3316 | 818 | 0.6648 | 0.3906 | 0.6648 | 0.8153 |
| 0.934 | 0.3324 | 820 | 0.6586 | 0.4513 | 0.6586 | 0.8116 |
| 0.934 | 0.3332 | 822 | 0.6727 | 0.4380 | 0.6727 | 0.8202 |
| 0.934 | 0.3340 | 824 | 0.7372 | 0.4109 | 0.7372 | 0.8586 |
| 0.934 | 0.3348 | 826 | 0.7019 | 0.4102 | 0.7019 | 0.8378 |
| 0.934 | 0.3356 | 828 | 0.7158 | 0.4192 | 0.7158 | 0.8460 |
| 0.934 | 0.3364 | 830 | 0.7349 | 0.3643 | 0.7349 | 0.8573 |
| 0.934 | 0.3373 | 832 | 0.7396 | 0.4205 | 0.7396 | 0.8600 |
| 0.934 | 0.3381 | 834 | 0.7927 | 0.4165 | 0.7927 | 0.8903 |
| 0.934 | 0.3389 | 836 | 0.7650 | 0.4165 | 0.7650 | 0.8747 |
| 0.934 | 0.3397 | 838 | 0.7539 | 0.3790 | 0.7539 | 0.8682 |
| 0.934 | 0.3405 | 840 | 0.7319 | 0.3929 | 0.7319 | 0.8555 |
| 0.934 | 0.3413 | 842 | 0.7179 | 0.4108 | 0.7179 | 0.8473 |
| 0.934 | 0.3421 | 844 | 0.7300 | 0.3815 | 0.7300 | 0.8544 |
| 0.934 | 0.3429 | 846 | 0.7551 | 0.3745 | 0.7551 | 0.8690 |
| 0.934 | 0.3437 | 848 | 0.6887 | 0.4012 | 0.6887 | 0.8299 |
| 0.934 | 0.3445 | 850 | 0.6794 | 0.4001 | 0.6794 | 0.8243 |
| 0.934 | 0.3454 | 852 | 0.7130 | 0.3759 | 0.7130 | 0.8444 |
| 0.934 | 0.3462 | 854 | 0.7182 | 0.3759 | 0.7182 | 0.8475 |
| 0.934 | 0.3470 | 856 | 0.6991 | 0.3711 | 0.6991 | 0.8361 |
| 0.934 | 0.3478 | 858 | 0.6846 | 0.3777 | 0.6846 | 0.8274 |
| 0.934 | 0.3486 | 860 | 0.6650 | 0.4095 | 0.6650 | 0.8155 |
| 0.934 | 0.3494 | 862 | 0.7067 | 0.3623 | 0.7067 | 0.8407 |
| 0.934 | 0.3502 | 864 | 0.6873 | 0.3854 | 0.6873 | 0.8291 |
| 0.934 | 0.3510 | 866 | 0.7201 | 0.4202 | 0.7201 | 0.8486 |
| 0.934 | 0.3518 | 868 | 0.7567 | 0.3864 | 0.7567 | 0.8699 |
| 0.934 | 0.3527 | 870 | 0.6998 | 0.3632 | 0.6998 | 0.8365 |
| 0.934 | 0.3535 | 872 | 0.7986 | 0.3108 | 0.7986 | 0.8936 |
| 0.934 | 0.3543 | 874 | 0.7564 | 0.3029 | 0.7564 | 0.8697 |
| 0.934 | 0.3551 | 876 | 0.6892 | 0.3554 | 0.6892 | 0.8302 |
| 0.934 | 0.3559 | 878 | 0.7357 | 0.3721 | 0.7357 | 0.8577 |
| 0.934 | 0.3567 | 880 | 0.7102 | 0.3634 | 0.7102 | 0.8428 |
| 0.934 | 0.3575 | 882 | 0.7307 | 0.3191 | 0.7307 | 0.8548 |
| 0.934 | 0.3583 | 884 | 0.7217 | 0.3315 | 0.7217 | 0.8495 |
| 0.934 | 0.3591 | 886 | 0.7698 | 0.3531 | 0.7698 | 0.8774 |
| 0.934 | 0.3600 | 888 | 0.7576 | 0.3670 | 0.7576 | 0.8704 |
| 0.934 | 0.3608 | 890 | 0.7406 | 0.3426 | 0.7406 | 0.8606 |
| 0.934 | 0.3616 | 892 | 0.7359 | 0.3247 | 0.7359 | 0.8578 |
| 0.934 | 0.3624 | 894 | 0.6944 | 0.3896 | 0.6944 | 0.8333 |
| 0.934 | 0.3632 | 896 | 0.7006 | 0.3803 | 0.7006 | 0.8370 |
| 0.934 | 0.3640 | 898 | 0.6862 | 0.3855 | 0.6862 | 0.8284 |
| 0.934 | 0.3648 | 900 | 0.7162 | 0.4059 | 0.7162 | 0.8463 |
| 0.934 | 0.3656 | 902 | 0.7137 | 0.4027 | 0.7137 | 0.8448 |
| 0.934 | 0.3664 | 904 | 0.7440 | 0.3940 | 0.7440 | 0.8626 |
| 0.934 | 0.3672 | 906 | 0.7479 | 0.3840 | 0.7479 | 0.8648 |
| 0.934 | 0.3681 | 908 | 0.7836 | 0.3394 | 0.7836 | 0.8852 |
| 0.934 | 0.3689 | 910 | 0.7746 | 0.3592 | 0.7746 | 0.8801 |
| 0.934 | 0.3697 | 912 | 0.7737 | 0.3481 | 0.7737 | 0.8796 |
| 0.934 | 0.3705 | 914 | 0.7739 | 0.3487 | 0.7739 | 0.8797 |
| 0.934 | 0.3713 | 916 | 0.7981 | 0.3712 | 0.7981 | 0.8934 |
| 0.934 | 0.3721 | 918 | 0.7247 | 0.3931 | 0.7247 | 0.8513 |
| 0.934 | 0.3729 | 920 | 0.7135 | 0.3925 | 0.7135 | 0.8447 |
| 0.934 | 0.3737 | 922 | 0.8336 | 0.3658 | 0.8336 | 0.9130 |
| 0.934 | 0.3745 | 924 | 0.7210 | 0.3872 | 0.7210 | 0.8491 |
| 0.934 | 0.3754 | 926 | 0.6977 | 0.4294 | 0.6977 | 0.8353 |
| 0.934 | 0.3762 | 928 | 0.7136 | 0.4352 | 0.7136 | 0.8448 |
| 0.934 | 0.3770 | 930 | 0.6788 | 0.4170 | 0.6788 | 0.8239 |
| 0.934 | 0.3778 | 932 | 0.6737 | 0.4328 | 0.6737 | 0.8208 |
| 0.934 | 0.3786 | 934 | 0.7061 | 0.3858 | 0.7061 | 0.8403 |
| 0.934 | 0.3794 | 936 | 0.7671 | 0.3646 | 0.7671 | 0.8759 |
| 0.934 | 0.3802 | 938 | 0.7219 | 0.3971 | 0.7219 | 0.8496 |
| 0.934 | 0.3810 | 940 | 0.7239 | 0.3835 | 0.7239 | 0.8508 |
| 0.934 | 0.3818 | 942 | 0.7367 | 0.3564 | 0.7367 | 0.8583 |
| 0.934 | 0.3827 | 944 | 0.7590 | 0.3293 | 0.7590 | 0.8712 |
| 0.934 | 0.3835 | 946 | 0.7166 | 0.3716 | 0.7166 | 0.8465 |
| 0.934 | 0.3843 | 948 | 0.7190 | 0.3648 | 0.7190 | 0.8479 |
| 0.934 | 0.3851 | 950 | 0.7217 | 0.3541 | 0.7217 | 0.8495 |
| 0.934 | 0.3859 | 952 | 0.7924 | 0.3143 | 0.7924 | 0.8901 |
| 0.934 | 0.3867 | 954 | 0.7955 | 0.2995 | 0.7955 | 0.8919 |
| 0.934 | 0.3875 | 956 | 0.7022 | 0.3545 | 0.7022 | 0.8380 |
| 0.934 | 0.3883 | 958 | 0.7773 | 0.3942 | 0.7773 | 0.8817 |
| 0.934 | 0.3891 | 960 | 0.7980 | 0.3874 | 0.7980 | 0.8933 |
| 0.934 | 0.3899 | 962 | 0.7043 | 0.3479 | 0.7043 | 0.8392 |
| 0.934 | 0.3908 | 964 | 0.7732 | 0.2805 | 0.7732 | 0.8793 |
| 0.934 | 0.3916 | 966 | 0.7964 | 0.2759 | 0.7964 | 0.8924 |
| 0.934 | 0.3924 | 968 | 0.7550 | 0.3503 | 0.7550 | 0.8689 |
| 0.934 | 0.3932 | 970 | 0.8387 | 0.3502 | 0.8387 | 0.9158 |
| 0.934 | 0.3940 | 972 | 0.8067 | 0.3315 | 0.8067 | 0.8981 |
| 0.934 | 0.3948 | 974 | 0.7669 | 0.3488 | 0.7669 | 0.8757 |
| 0.934 | 0.3956 | 976 | 0.7505 | 0.3727 | 0.7505 | 0.8663 |
| 0.934 | 0.3964 | 978 | 0.7314 | 0.3617 | 0.7314 | 0.8552 |
| 0.934 | 0.3972 | 980 | 0.7958 | 0.3636 | 0.7958 | 0.8921 |
| 0.934 | 0.3981 | 982 | 0.8442 | 0.3637 | 0.8442 | 0.9188 |
| 0.934 | 0.3989 | 984 | 0.7538 | 0.3734 | 0.7538 | 0.8682 |
| 0.934 | 0.3997 | 986 | 0.7322 | 0.4015 | 0.7322 | 0.8557 |
| 0.934 | 0.4005 | 988 | 0.7365 | 0.4021 | 0.7365 | 0.8582 |
| 0.934 | 0.4013 | 990 | 0.7283 | 0.3794 | 0.7283 | 0.8534 |
| 0.934 | 0.4021 | 992 | 0.7212 | 0.3842 | 0.7212 | 0.8492 |
| 0.934 | 0.4029 | 994 | 0.7096 | 0.3888 | 0.7096 | 0.8424 |
| 0.934 | 0.4037 | 996 | 0.7261 | 0.3938 | 0.7261 | 0.8521 |
| 0.934 | 0.4045 | 998 | 0.7677 | 0.3873 | 0.7677 | 0.8762 |
| 0.3836 | 0.4054 | 1000 | 0.8145 | 0.3742 | 0.8145 | 0.9025 |
| 0.3836 | 0.4062 | 1002 | 0.8123 | 0.3717 | 0.8123 | 0.9013 |
| 0.3836 | 0.4070 | 1004 | 0.7417 | 0.3490 | 0.7417 | 0.8612 |
| 0.3836 | 0.4078 | 1006 | 0.7529 | 0.3642 | 0.7529 | 0.8677 |
| 0.3836 | 0.4086 | 1008 | 0.7902 | 0.3288 | 0.7902 | 0.8889 |
| 0.3836 | 0.4094 | 1010 | 0.8864 | 0.3550 | 0.8864 | 0.9415 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
amang1802/Llama3.2-1B-summary-length-exp7.1 | amang1802 | 2024-11-25T05:45:21Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T22:41:18Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
QuantFactory/Llama-3.1-Tulu-3-8B-DPO-GGUF | QuantFactory | 2024-11-25T05:44:08Z | 59 | 1 | transformers | [
"transformers",
"gguf",
"text-generation",
"en",
"dataset:allenai/llama-3.1-tulu-3-8b-preference-mixture",
"base_model:allenai/Llama-3.1-Tulu-3-8B-SFT",
"base_model:quantized:allenai/Llama-3.1-Tulu-3-8B-SFT",
"license:llama3.1",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-25T04:37:46Z |
---
license: llama3.1
language:
- en
pipeline_tag: text-generation
datasets:
- allenai/llama-3.1-tulu-3-8b-preference-mixture
base_model:
- allenai/Llama-3.1-Tulu-3-8B-SFT
library_name: transformers
---
[](https://hf.co/QuantFactory)
# QuantFactory/Llama-3.1-Tulu-3-8B-DPO-GGUF
This is quantized version of [allenai/Llama-3.1-Tulu-3-8B-DPO](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-DPO) created using llama.cpp
# Original Model Card
<img src="https://huggingface.co/datasets/allenai/blog-images/resolve/main/tulu3/Tulu3-logo.png" alt="Tulu 3 banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Llama-3.1-Tulu-3-8B-DPO
Tülu3 is a leading instruction following model family, offering fully open-source data, code, and recipes designed to serve as a comprehensive guide for modern post-training techniques.
Tülu3 is designed for state-of-the-art performance on a diversity of tasks in addition to chat, such as MATH, GSM8K, and IFEval.
## Model description
- **Model type:** A model trained on a mix of publicly available, synthetic and human-created datasets.
- **Language(s) (NLP):** Primarily English
- **License:** Llama 3.1 Community License Agreement
- **Finetuned from model:** allenai/Llama-3.1-Tulu-3-8B-SFT
### Model Sources
- **Training Repository:** https://github.com/allenai/open-instruct
- **Eval Repository:** https://github.com/allenai/olmes
- **Paper:** https://allenai.org/papers/tulu-3-report.pdf (arXiv soon)
- **Demo:** https://playground.allenai.org/
### Model Family
| **Stage** | **Llama 3.1 8B** | **Llama 3.1 70B** |
|----------------------|----------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|
| **Base Model** | [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) | [meta-llama/Llama-3.1-70B](https://huggingface.co/meta-llama/Llama-3.1-70B) |
| **SFT** | [allenai/Llama-3.1-Tulu-3-8B-SFT](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-SFT) | [allenai/Llama-3.1-Tulu-3-70B-SFT](https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-SFT) |
| **DPO** | [allenai/Llama-3.1-Tulu-3-8B-DPO](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-DPO) | [allenai/Llama-3.1-Tulu-3-70B-DPO](https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-DPO) |
| **Final Models (RLVR)** | [allenai/Llama-3.1-Tulu-3-8B](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B) | [allenai/Llama-3.1-Tulu-3-70B](https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B) |
| **Reward Model (RM)**| [allenai/Llama-3.1-Tulu-3-8B-RM](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-RM) | (Same as 8B) |
## Using the model
### Loading with HuggingFace
To load the model with HuggingFace, use the following snippet:
```
from transformers import AutoModelForCausalLM
tulu_model = AutoModelForCausalLM.from_pretrained("allenai/Llama-3.1-Tulu-3-8B-DPO")
```
### VLLM
As a Llama base model, the model can be easily served with:
```
vllm serve allenai/Llama-3.1-Tulu-3-8B-DPO
```
Note that given the long chat template of Llama, you may want to use `--max_model_len=8192`.
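Combining the two, a full serving command looks like:

```
vllm serve allenai/Llama-3.1-Tulu-3-8B-DPO --max_model_len=8192
```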
### Chat template
The chat template for our models is formatted as:
```
<|user|>\nHow are you doing?\n<|assistant|>\nI'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
```
Or with new lines expanded:
```
<|user|>
How are you doing?
<|assistant|>
I'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
```
It is embedded within the tokenizer as well, for `tokenizer.apply_chat_template`.
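For example, a short sketch that renders this template with the repo's tokenizer:

```
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/Llama-3.1-Tulu-3-8B-DPO")
messages = [{"role": "user", "content": "How are you doing?"}]
# Prints the template shown above, ending with the assistant turn marker
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```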
### System prompt
In Ai2 demos, we use this system prompt by default:
```
You are Tulu 3, a helpful and harmless AI Assistant built by the Allen Institute for AI.
```
The model has not been trained with a specific system prompt in mind.
### Bias, Risks, and Limitations
The Tülu3 models have limited safety training, but are not deployed automatically with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
It is also unknown what the size and composition of the corpus used to train the base Llama 3.1 models was; however, it is likely to have included a mix of web data and technical sources like books and code.
See the Falcon 180B model card for an example of this.
## Performance
| Benchmark (eval) | Tülu 3 SFT 8B | Tülu 3 DPO 8B | Tülu 3 8B | Llama 3.1 8B Instruct | Qwen 2.5 7B Instruct | Magpie 8B | Gemma 2 9B Instruct | Ministral 8B Instruct |
|---------------------------------|----------------|----------------|------------|------------------------|----------------------|-----------|---------------------|-----------------------|
| **Avg.** | 60.4 | 64.4 | **64.8** | 62.2 | 57.8 | 44.7 | 55.2 | 58.3 |
| **MMLU (0 shot, CoT)** | 65.9 | 68.7 | 68.2 | 71.2 | **76.6** | 62.0 | 74.6 | 68.5 |
| **PopQA (15 shot)** | **29.3** | 29.3 | 29.1 | 20.2 | 18.1 | 22.5 | 28.3 | 20.2 |
| **TruthfulQA (6 shot)** | 46.8 | 56.1 | 55.0 | 55.1 | **63.1** | 57.0 | 61.4 | 55.5 |
| **BigBenchHard (3 shot, CoT)** | **67.9** | 65.8 | 66.0 | 62.8 | 21.7 | 0.9 | 2.5 | 56.2 |
| **DROP (3 shot)** | 61.3 | 62.5 | **62.6** | 61.5 | 54.4 | 49.4 | 58.8 | 56.2 |
| **MATH (4 shot CoT, Flex)** | 31.5 | 42.0 | **43.7** | 42.5 | 14.8 | 5.1 | 29.8 | 40.0 |
| **GSM8K (8 shot, CoT)** | 76.2 | 84.3 | **87.6** | 83.4 | 83.8 | 61.2 | 79.7 | 80.0 |
| **HumanEval (pass@10)** | 86.2 | 83.9 | 83.9 | 86.3 | **93.1** | 75.4 | 71.7 | 91.0 |
| **HumanEval+ (pass@10)** | 81.4 | 78.6 | 79.2 | 82.9 | **89.7** | 69.1 | 67.0 | 88.5 |
| **IFEval (prompt loose)** | 72.8 | 81.1 | **82.4** | 80.6 | 74.7 | 38.8 | 69.9 | 56.4 |
| **AlpacaEval 2 (LC % win)** | 12.4 | 33.5 | 34.5 | 24.2 | 29.0 | **49.0** | 43.7 | 31.4 |
| **Safety (6 task avg.)** | **93.1** | 87.2 | 85.5 | 75.2 | 75.0 | 46.4 | 75.5 | 56.2 |
| Benchmark (eval) | Tülu 3 70B SFT | Tülu 3 DPO 70B | Tülu 3 70B | Llama 3.1 70B Instruct | Qwen 2.5 72B Instruct | Hermes 3 Llama 3.1 70B | Nemotron Llama 3.1 70B |
|---------------------------------|-----------------|-----------------|-------------|-------------------------|-----------------------|------------------------|-------------------------|
| **Avg.** | 72.6 | 75.9 | **76.0** | 73.4 | 71.5 | 68.3 | 65.5 |
| **MMLU (0 shot, CoT)** | 78.9 | 83.3 | 83.1 | 85.3 | **85.5** | 80.4 | 83.8 |
| **PopQA (15 shot)** | **48.6** | 46.3 | 46.5 | 46.4 | 30.6 | 48.1 | 36.4 |
| **TruthfulQA (6 shot)** | 55.7 | 67.9 | 67.6 | 66.8 | **69.9** | 66.5 | 62.6 |
| **BigBenchHard (3 shot, CoT)** | **82.7** | 81.8 | 82.0 | 73.8 | 67.2 | 82.1 | 0.7 |
| **DROP (3 shot)** | **77.2** | 74.1 | 74.3 | 77.0 | 34.2 | 73.2 | 68.8 |
| **MATH (4 shot CoT, Flex)** | 53.7 | 62.3 | 63.0 | 56.4 | **74.3** | 41.9 | 55.0 |
| **GSM8K (8 shot, CoT)** | 91.1 | 93.5 | 93.5 | **93.7** | 89.5 | 90.0 | 84.7 |
| **HumanEval (pass@10)** | 92.9 | 92.4 | 92.4 | 93.6 | 94.0 | 89.6 | **94.1** |
| **HumanEval+ (pass@10)** | 87.3 | 88.4 | 88.0 | 89.5 | **90.8** | 85.9 | 85.5 |
| **IFEval (prompt loose)** | 82.1 | 82.6 | 83.2 | **88.0** | 87.6 | 76.0 | 79.9 |
| **AlpacaEval 2 (LC % win)** | 26.3 | 49.6 | 49.8 | 33.4 | 47.7 | 28.4 | **66.1** |
| **Safety (6 task avg.)** | **94.4** | 89.0 | 88.3 | 76.5 | 87.0 | 57.9 | 69.0 |
## Hyperparameters
DPO:
- **Learning Rate**: 5 × 10⁻⁷ (8B), 2 × 10⁻⁷ (70B)
- **Learning Rate Schedule**: Linear
- **Batch Size (effective)**: 32 (8B), 128 (70B)
- **Max Sequence Length**: 2,048
- **Epochs**: 1
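For reference, these hyperparameters apply to the standard DPO objective (the KL-strength beta used here is not listed in this card):

$$
\mathcal{L}_{\mathrm{DPO}}(\theta) = -\,\mathbb{E}_{(x,\, y_w,\, y_l)}\left[\log \sigma\!\left(\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}\right)\right]
$$

where y_w and y_l are the chosen and rejected responses from the preference mixture and π_ref is the SFT checkpoint.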
## License and use
All Llama 3.1 Tülu3 models are released under Meta's [Llama 3.1 Community License Agreement](https://www.llama.com/llama3_1/license/).
Llama 3.1 is licensed under the Llama 3.1 Community License, Copyright © Meta Platforms, Inc.
Tülu3 is intended for research and educational use.
For more information, please see our [Responsible Use Guidelines](https://allenai.org/responsible-use).
The models have been fine-tuned using a dataset mix with outputs generated from third party models and are subject to additional terms:
[Gemma Terms of Use](https://ai.google.dev/gemma/terms) and [Qwen License Agreement](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE) (models were improved using Qwen 2.5).
## Citation
If Tülu3 or any of the related materials were helpful to your work, please cite:
```
@article{lambert2024tulu3,
title = {Tülu 3: Pushing Frontiers in Open Language Model Post-Training},
author = {
Nathan Lambert and
Jacob Morrison and
Valentina Pyatkin and
Shengyi Huang and
Hamish Ivison and
Faeze Brahman and
Lester James V. Miranda and
Alisa Liu and
Nouha Dziri and
Shane Lyu and
Yuling Gu and
Saumya Malik and
Victoria Graf and
Jena D. Hwang and
Jiangjiang Yang and
Ronan Le Bras and
Oyvind Tafjord and
Chris Wilhelm and
Luca Soldaini and
Noah A. Smith and
Yizhong Wang and
Pradeep Dasigi and
Hannaneh Hajishirzi
},
year = {2024},
email = {[email protected]}
}
```
|
ericflo/Llama-3.2-3B-COT | ericflo | 2024-11-25T05:42:53Z | 166 | 1 | transformers | [
"transformers",
"safetensors",
"gguf",
"llama",
"text-generation",
"llama-3.2",
"thought-chain",
"instruction-finetuning",
"conversational",
"base_model:meta-llama/Llama-3.2-3B",
"base_model:quantized:meta-llama/Llama-3.2-3B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T00:55:48Z | ---
license: apache-2.0
base_model:
- meta-llama/Llama-3.2-3B
tags:
- llama-3.2
- thought-chain
- instruction-finetuning
- transformers
library_name: transformers
pipeline_tag: text-generation
---
# Thought-Ranked Llama 3.2 3B
## Model Description
This model is a fine-tuned version of Meta's Llama 3.2 3B (Base) that has been specially trained to generate high-quality thought processes before producing answers. The model underwent 4 rounds of specialized fine-tuning using a thought-chain ranking approach.
(Weekend project, just a few hundred steps of training)
### Training Process
1. **Initial Generation**: For each training sample, the model generates multiple thought chains by prefixing different thought tokens: `<thought>{char}</thought>` for each character in `[a-zA-Z0-9]`. Each thought chain is allowed up to 128 tokens.
2. **Answer Generation**: Following each thought chain, the model generates a complete answer with up to 2048 tokens (steps 1 and 2 are sketched in code after this list).
3. **Ranking & Selection**: An external LLM ranking system evaluates the quality of answers without seeing the thought processes, creating a ranking of the most effective thought patterns.
4. **Final Training**: The model is then trained on the highest-ranked thought-answer pairs, learning to generate the most effective thought patterns autonomously.
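A hedged sketch of steps 1 and 2, under the assumption that the thought prefix is injected directly into the prompt; the `generate` helper and prompt layout are illustrative, not taken from the actual training code:

```python
import string

# 62 single-character thought prefixes: a-z, A-Z, 0-9
THOUGHT_CHARS = string.ascii_letters + string.digits

def candidate_answers(question, generate):
    """Produce one (thought, answer) pair per prefix character.

    `generate(prompt, max_new_tokens)` stands in for a greedy
    model.generate call, matching the temperature=0.0 decoding
    described below.
    """
    pairs = []
    for ch in THOUGHT_CHARS:
        # Step 1: complete a thought chain seeded with one character
        thought = generate(f"{question}\n<thought>{ch}", max_new_tokens=128)
        # Step 2: generate a full answer conditioned on that thought
        answer = generate(
            f"{question}\n<thought>{ch}{thought}</thought>\n", max_new_tokens=2048
        )
        pairs.append((ch, thought, answer))
    return pairs
```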
### Key Features
- **Thought Chain Generation**: The model has learned to generate explicit thought processes before providing answers
- **Greedy Sampling**: Uses greedy sampling for both thought generation and final answers
- **Length Parameters**:
- Thought chains: Up to 128 tokens
- Final answers: Up to 2048 tokens
### Model Architecture
- Base model: Llama 3.2 3B (Base)
- Architecture: Transformer-based language model
- Parameters: ~3.2 billion
- Training Strategy: Supervised Fine-Tuning (SFT) with thought-chain ranking
## Intended Use
This model is designed for tasks that benefit from explicit reasoning chains, including but not limited to:
- Problem-solving
- Mathematical reasoning
- Logical deduction
- Step-by-step explanations
- Complex decision making
### Out-of-Scope Uses
- Direct deployment without safety measures
- Applications requiring guaranteed accuracy
- Critical decision-making without human oversight
- Tasks requiring capabilities beyond the base Llama 3.2 3B model
## Training Details
### Training Data
The model was trained using:
- Sample questions paired with multiple thought variations
- Thought chains generated using systematic character prefixes
- Rankings derived from LLM evaluation of answer quality
### Training Procedure
1. **Thought Generation Phase**
- Generated 62 variations of thoughts per sample (a-z, A-Z, 0-9)
- Sampled with temperature=0.0
- Maximum thought length: 128 tokens
2. **Answer Generation Phase**
- Generated completions following each thought chain
- Maximum answer length: 2048 tokens
- Sampled with temperature=0.0
3. **Ranking Phase**
- External LLM evaluated answer quality
- Ranking performed without access to thought chains
- Selected highest-performing thought-answer pairs
4. **Final Training Phase**
- Fine-tuned on best-performing thought-answer combinations
- 4 complete rounds of training
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("ericflo/Llama-3.2-3B-COT")
tokenizer = AutoTokenizer.from_pretrained("ericflo/Llama-3.2-3B-COT")

# Example usage
prompt = "Solve this math problem: 2x + 3 = 7"
input_ids = tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}],
    add_generation_prompt=True,
    return_tensors="pt",
)

# Generate a response; the model emits its <thought>...</thought> chain
# before the final answer. Greedy decoding matches the training setup.
output = model.generate(
    input_ids,
    max_new_tokens=2048,
    do_sample=False,
)
response = tokenizer.decode(output[0])
print(response)
```
## Limitations
- Limited to the capabilities of the base Llama 3.2 3B model
- May generate thought chains that are not always optimal
- Performance depends on the quality of the LLM ranking system used during training
- Training process may not capture all possible effective thought patterns
- Limited by the context window of the base model
## Ethical Considerations
- The model inherits biases from the base Llama 3.2 3B model
- Generated thought chains should be reviewed for accuracy and appropriateness
- The model's reasoning process should not be relied upon for critical decisions without human verification
- Users should implement appropriate content filtering and safety measures
## Citation
If you use this model in your research, please cite:
```bibtex
@misc{thought-ranked-llama,
title={Thought-Ranked Llama 3.2: Fine-tuning Language Models with Ranked Thought Chains},
author={Eric Florenzano},
year={2024},
howpublished={\url{https://huggingface.co/ericflo/Llama-3.2-3B-COT}}
}
``` |
initial01/videomae-base-finetuned-ucf101-subset | initial01 | 2024-11-25T05:41:38Z | 63 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"videomae",
"video-classification",
"generated_from_trainer",
"base_model:MCG-NJU/videomae-base",
"base_model:finetune:MCG-NJU/videomae-base",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] | video-classification | 2024-11-25T05:27:06Z | ---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-finetuned-ucf101-subset
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# videomae-base-finetuned-ucf101-subset
This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4339
- Accuracy: 0.8429
## Model description
More information needed
## Intended uses & limitations
More information needed
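Pending author details, a minimal inference sketch assuming standard VideoMAE preprocessing (16 frames per clip; the label set comes from the checkpoint config):
```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

repo = "initial01/videomae-base-finetuned-ucf101-subset"
processor = VideoMAEImageProcessor.from_pretrained(repo)
model = VideoMAEForVideoClassification.from_pretrained(repo)

# 16 dummy RGB frames; replace with frames sampled from a real clip.
video = list(np.random.randint(0, 255, (16, 224, 224, 3), dtype=np.uint8))
inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])
```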
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 148
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 2.1071 | 0.2568 | 38 | 1.8574 | 0.2714 |
| 1.0387 | 1.2568 | 76 | 0.9456 | 0.8 |
| 0.4169 | 2.2568 | 114 | 0.6413 | 0.7857 |
| 0.2935 | 3.2297 | 148 | 0.4339 | 0.8429 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
PhillipGuo/llama-3-manual_interp-forget_first_16_unsplit-inject_True-1 | PhillipGuo | 2024-11-25T05:40:27Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T05:36:23Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
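In the absence of author-provided instructions, a plain loading sketch inferred from the repo's Llama text-generation tags:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "PhillipGuo/llama-3-manual_interp-forget_first_16_unsplit-inject_True-1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)
```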
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
PhillipGuo/llama-3-localized_ct_mlps-forget_first_16_unsplit-inject_True-1 | PhillipGuo | 2024-11-25T05:37:54Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T05:35:24Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
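No usage details are documented; a plain transformers loading sketch based on the repo's Llama tags:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "PhillipGuo/llama-3-localized_ct_mlps-forget_first_16_unsplit-inject_True-1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)
```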
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
guan06/mt5-base-finetuned-xsum | guan06 | 2024-11-25T05:37:15Z | 113 | 0 | transformers | [
"transformers",
"safetensors",
"mt5",
"text2text-generation",
"generated_from_trainer",
"base_model:google/mt5-base",
"base_model:finetune:google/mt5-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-24T10:25:52Z | ---
library_name: transformers
license: apache-2.0
base_model: google/mt5-base
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: mt5-base-finetuned-xsum
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mt5-base-finetuned-xsum
This model is a fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: nan
- Rouge1: 1.5134
- Rouge2: 0.2001
- Rougel: 1.4917
- Rougelsum: 1.4788
- Gen Len: 8.6992
## Model description
More information needed
## Intended uses & limitations
More information needed
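Given the nan validation loss reported above, outputs from this checkpoint may be unreliable; still, a minimal usage sketch for an mT5 text2text model looks like:
```python
from transformers import pipeline

summarizer = pipeline("text2text-generation", model="guan06/mt5-base-finetuned-xsum")
print(summarizer("Article text to summarize ...", max_new_tokens=20)[0]["generated_text"])
```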
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 0.0 | 1.0 | 559 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 2.0 | 1118 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 3.0 | 1677 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 4.0 | 2236 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 5.0 | 2795 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 6.0 | 3354 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 7.0 | 3913 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 8.0 | 4472 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 9.0 | 5031 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 10.0 | 5590 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 11.0 | 6149 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 12.0 | 6708 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 13.0 | 7267 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 14.0 | 7826 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 15.0 | 8385 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 16.0 | 8944 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 17.0 | 9503 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 18.0 | 10062 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 19.0 | 10621 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 20.0 | 11180 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 21.0 | 11739 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 22.0 | 12298 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 23.0 | 12857 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 24.0 | 13416 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 25.0 | 13975 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 26.0 | 14534 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 27.0 | 15093 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 28.0 | 15652 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 29.0 | 16211 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 30.0 | 16770 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 31.0 | 17329 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 32.0 | 17888 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 33.0 | 18447 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 34.0 | 19006 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 35.0 | 19565 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 36.0 | 20124 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 37.0 | 20683 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 38.0 | 21242 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 39.0 | 21801 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 40.0 | 22360 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 41.0 | 22919 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 42.0 | 23478 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 43.0 | 24037 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 44.0 | 24596 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 45.0 | 25155 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 46.0 | 25714 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 47.0 | 26273 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 48.0 | 26832 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 49.0 | 27391 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
| 0.0 | 50.0 | 27950 | nan | 1.5134 | 0.2001 | 1.4917 | 1.4788 | 8.6992 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu118
- Datasets 3.1.0
- Tokenizers 0.20.3
|
heongjun127/test4 | heongjun127 | 2024-11-25T05:34:41Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"exaone",
"text-generation",
"trl",
"sft",
"conversational",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | text-generation | 2024-11-25T05:21:42Z | ---
library_name: transformers
tags:
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
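A minimal loading sketch; the `custom_code` tag suggests EXAONE's remote modeling code is used, so `trust_remote_code=True` is assumed to be required:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("heongjun127/test4", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("heongjun127/test4", trust_remote_code=True)
```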
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
nkadoor/cnn_news_summary_model_trained_on_reduced_data | nkadoor | 2024-11-25T05:30:43Z | 116 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"base_model:finetune:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-25T04:47:46Z | ---
library_name: transformers
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: cnn_news_summary_model_trained_on_reduced_data
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# cnn_news_summary_model_trained_on_reduced_data
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6040
- Rouge1: 0.2183
- Rouge2: 0.0946
- Rougel: 0.1842
- Rougelsum: 0.1842
- Generated Length: 19.0
## Model description
More information needed
## Intended uses & limitations
More information needed
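A minimal usage sketch (the evaluation above capped generations at 19 tokens):
```python
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="nkadoor/cnn_news_summary_model_trained_on_reduced_data",
)
article = "..."  # a CNN-style news article
# Depending on how the data was preprocessed, you may need to prepend
# T5's "summarize: " task prefix to the article yourself.
print(summarizer(article, max_length=19)[0]["summary_text"])
```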
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Generated Length |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------------:|
| No log | 1.0 | 431 | 1.6239 | 0.217 | 0.0934 | 0.1826 | 0.1826 | 19.0 |
| 1.9203 | 2.0 | 862 | 1.6075 | 0.2167 | 0.0938 | 0.1826 | 0.1827 | 19.0 |
| 1.822 | 3.0 | 1293 | 1.6040 | 0.2183 | 0.0946 | 0.1842 | 0.1842 | 19.0 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Shinyaaa/Travel-20-v1 | Shinyaaa | 2024-11-25T05:28:45Z | 102 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T05:28:18Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
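A minimal feature-extraction sketch based on the repo's BERT tags (the pooling strategy is an assumption, not documented here):
```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Shinyaaa/Travel-20-v1")
model = AutoModel.from_pretrained("Shinyaaa/Travel-20-v1")

inputs = tokenizer("A weekend trip to the coast.", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, seq_len, hidden)
sentence_embedding = hidden.mean(dim=1)         # simple mean pooling
```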
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
FrancisNweke/bh_llm | FrancisNweke | 2024-11-25T05:25:50Z | 128 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"en",
"base_model:unsloth/Llama-3.2-3B",
"base_model:finetune:unsloth/Llama-3.2-3B",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T05:24:03Z | ---
base_model: unsloth/Llama-3.2-3B
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
---
# Uploaded model
- **Developed by:** FrancisNweke
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Llama-3.2-3B
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
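A minimal inference sketch, assuming the uploaded weights are a full (merged) checkpoint per the repo tags — Unsloth itself is only needed at training time:
```python
from transformers import pipeline

generator = pipeline("text-generation", model="FrancisNweke/bh_llm")
print(generator("Hello, how can you help me?", max_new_tokens=64)[0]["generated_text"])
```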
|
aankurkumar/llama-3.2-3b-it-eva-bcil-chatbot | aankurkumar | 2024-11-25T05:24:19Z | 127 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T05:21:39Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
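A minimal chat sketch inferred from the `conversational` tag (the prompt format is otherwise undocumented; requires a transformers version whose text-generation pipeline accepts chat messages):
```python
from transformers import pipeline

chat = pipeline("text-generation", model="aankurkumar/llama-3.2-3b-it-eva-bcil-chatbot")
messages = [{"role": "user", "content": "Hello! What can you do?"}]
print(chat(messages, max_new_tokens=64)[0]["generated_text"])
```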
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Yebin46/llama1B | Yebin46 | 2024-11-25T05:23:25Z | 1,920 | 0 | null | [
"safetensors",
"llama",
"ko",
"dataset:taeshahn/ko-lima",
"base_model:meta-llama/Llama-3.2-1B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-1B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2024-11-25T03:40:11Z | ---
datasets:
- taeshahn/ko-lima
base_model:
- meta-llama/Llama-3.2-1B-Instruct
license: apache-2.0
language:
- ko
---
This model is just for testing uploads to the leaderboard. 😅
We fine-tuned the Llama-3.2-1B-Instruct model on a Korean instruction dataset containing 1,000 samples using LoRA (Unsloth). |
cvapict/distilbert-base-multilingual-cased-aoe-test3 | cvapict | 2024-11-25T05:18:22Z | 105 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-multilingual-cased",
"base_model:finetune:distilbert/distilbert-base-multilingual-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T05:17:48Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-multilingual-cased-aoe-test3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-multilingual-cased-aoe-test3
This model is a fine-tuned version of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1259
- Accuracy: 0.9563
## Model description
More information needed
## Intended uses & limitations
More information needed
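A minimal classification sketch (the label names map through the checkpoint config and are not documented on this card):
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="cvapict/distilbert-base-multilingual-cased-aoe-test3",
)
print(classifier("Example input sentence."))  # [{'label': ..., 'score': ...}]
```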
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1037 | 1.0 | 375 | 0.1388 | 0.954 |
| 0.0979 | 2.0 | 750 | 0.1259 | 0.9563 |
| 0.0865 | 3.0 | 1125 | 0.1417 | 0.9427 |
| 0.0593 | 4.0 | 1500 | 0.1857 | 0.9467 |
| 0.0076 | 5.0 | 1875 | 0.2010 | 0.9453 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF | mradermacher | 2024-11-25T05:12:19Z | 45 | 0 | transformers | [
"transformers",
"gguf",
"ko",
"en",
"base_model:KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1",
"base_model:quantized:KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-22T04:15:36Z | ---
base_model: KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1
language:
- ko
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
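As one concrete (hypothetical) route in Python, `llama-cpp-python` can pull a single quant straight from this repo:
```python
from llama_cpp import Llama

# Downloads the matching Q4_K_M file from this repo on first use.
llm = Llama.from_pretrained(
    repo_id="mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF",
    filename="*Q4_K_M.gguf",
)
print(llm("Hello", max_tokens=64)["choices"][0]["text"])
```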
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.IQ4_XS.gguf) | IQ4_XS | 6.0 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q4_K_S.gguf) | Q4_K_S | 6.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q5_K_S.gguf) | Q5_K_S | 7.6 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q5_K_M.gguf) | Q5_K_M | 7.8 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q6_K.gguf) | Q6_K | 9.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.Q8_0.gguf) | Q8_0 | 11.6 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v0.1.f16.gguf) | f16 | 21.7 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/free-solar-slerp-v0.3-GGUF | mradermacher | 2024-11-25T05:12:14Z | 18 | 0 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:freewheelin/free-solar-slerp-v0.3",
"base_model:quantized:freewheelin/free-solar-slerp-v0.3",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2024-11-22T04:32:52Z | ---
base_model: freewheelin/free-solar-slerp-v0.3
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/freewheelin/free-solar-slerp-v0.3
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q2_K.gguf) | Q2_K | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q3_K_M.gguf) | Q3_K_M | 5.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.IQ4_XS.gguf) | IQ4_XS | 6.0 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.3 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q4_K_S.gguf) | Q4_K_S | 6.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q4_K_M.gguf) | Q4_K_M | 6.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q5_K_S.gguf) | Q5_K_S | 7.6 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q5_K_M.gguf) | Q5_K_M | 7.8 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q6_K.gguf) | Q6_K | 9.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.Q8_0.gguf) | Q8_0 | 11.6 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF/resolve/main/free-solar-slerp-v0.3.f16.gguf) | f16 | 21.8 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF | mradermacher | 2024-11-25T05:11:24Z | 24 | 0 | transformers | [
"transformers",
"gguf",
"en",
"base_model:ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3",
"base_model:quantized:ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-22T17:07:21Z | ---
base_model: ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q2_K.gguf) | Q2_K | 12.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q3_K_S.gguf) | Q3_K_S | 14.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q3_K_M.gguf) | Q3_K_M | 16.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q3_K_L.gguf) | Q3_K_L | 17.3 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.IQ4_XS.gguf) | IQ4_XS | 18.0 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q4_K_S.gguf) | Q4_K_S | 18.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q4_K_M.gguf) | Q4_K_M | 20.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q5_K_S.gguf) | Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q5_K_M.gguf) | Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q6_K.gguf) | Q6_K | 27.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-32B-ArliAI-RPMax-v1.3-GGUF/resolve/main/Qwen2.5-32B-ArliAI-RPMax-v1.3.Q8_0.gguf) | Q8_0 | 34.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/BreezePipe-7B-merge-GGUF | mradermacher | 2024-11-25T05:11:08Z | 66 | 0 | transformers | [
"transformers",
"gguf",
"merge",
"mergekit",
"lazymergekit",
"MediaTek-Research/Breeze-7B-Instruct-v0.1",
"Azure99/blossom-v4-mistral-7b",
"en",
"base_model:Heng666/BreezePipe-7B-merge",
"base_model:quantized:Heng666/BreezePipe-7B-merge",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-24T00:51:42Z | ---
base_model: Heng666/BreezePipe-7B-merge
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- merge
- mergekit
- lazymergekit
- MediaTek-Research/Breeze-7B-Instruct-v0.1
- Azure99/blossom-v4-mistral-7b
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/Heng666/BreezePipe-7B-merge
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
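For orientation, here is a minimal sketch (not part of the original upload) of loading one of the quants listed below with `llama-cpp-python`; the quant choice, context size, and prompt are illustrative assumptions, not recommendations from this card.
```python
# Hedged example: download the Q4_K_M quant from this repo and run a prompt.
# Assumes `pip install llama-cpp-python huggingface_hub`.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

path = hf_hub_download(
    repo_id="mradermacher/BreezePipe-7B-merge-GGUF",
    filename="BreezePipe-7B-merge.Q4_K_M.gguf",
)

llm = Llama(model_path=path, n_ctx=4096)  # context size chosen arbitrarily
out = llm("Q: What is a GGUF file?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```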
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q2_K.gguf) | Q2_K | 3.5 | |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q3_K_S.gguf) | Q3_K_S | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q3_K_M.gguf) | Q3_K_M | 4.5 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q3_K_L.gguf) | Q3_K_L | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.IQ4_XS.gguf) | IQ4_XS | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q4_0_4_4.gguf) | Q4_0_4_4 | 5.3 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q4_K_S.gguf) | Q4_K_S | 5.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q4_K_M.gguf) | Q4_K_M | 5.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q5_K_S.gguf) | Q5_K_S | 6.4 | |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q5_K_M.gguf) | Q5_K_M | 6.5 | |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q6_K.gguf) | Q6_K | 7.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.Q8_0.gguf) | Q8_0 | 9.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/BreezePipe-7B-merge-GGUF/resolve/main/BreezePipe-7B-merge.f16.gguf) | f16 | 18.3 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/S-SOLAR-10.7B-v2.0-GGUF | mradermacher | 2024-11-25T05:10:23Z | 18 | 0 | transformers | [
"transformers",
"gguf",
"en",
"base_model:hwkwon/S-SOLAR-10.7B-v2.0",
"base_model:quantized:hwkwon/S-SOLAR-10.7B-v2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-24T03:13:57Z | ---
base_model: hwkwon/S-SOLAR-10.7B-v2.0
language:
- en
library_name: transformers
quantized_by: mradermacher
tags: []
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/hwkwon/S-SOLAR-10.7B-v2.0
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.IQ4_XS.gguf) | IQ4_XS | 6.0 | |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q4_K_S.gguf) | Q4_K_S | 6.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q5_K_S.gguf) | Q5_K_S | 7.6 | |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q5_K_M.gguf) | Q5_K_M | 7.8 | |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q6_K.gguf) | Q6_K | 9.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.Q8_0.gguf) | Q8_0 | 11.6 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/S-SOLAR-10.7B-v2.0-GGUF/resolve/main/S-SOLAR-10.7B-v2.0.f16.gguf) | f16 | 21.7 | 16 bpw, overkill |
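If you want to cross-check the table above against what is actually hosted, here is a small hedged sketch (not from the original upload) that enumerates the `.gguf` files in this repo with `huggingface_hub`:
```python
# List every GGUF file currently in the repo; useful for scripting downloads.
from huggingface_hub import list_repo_files

gguf_files = sorted(
    f for f in list_repo_files("mradermacher/S-SOLAR-10.7B-v2.0-GGUF")
    if f.endswith(".gguf")
)
for name in gguf_files:
    print(name)
```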
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/CodeLlama-SDSAT_L7_13B-GGUF | mradermacher | 2024-11-25T05:10:15Z | 18 | 0 | transformers | [
"transformers",
"gguf",
"en",
"base_model:ainergy/CodeLlama-SDSAT_L7_13B",
"base_model:quantized:ainergy/CodeLlama-SDSAT_L7_13B",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-24T03:48:42Z | ---
base_model: ainergy/CodeLlama-SDSAT_L7_13B
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/ainergy/CodeLlama-SDSAT_L7_13B
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q2_K.gguf) | Q2_K | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q3_K_S.gguf) | Q3_K_S | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q3_K_M.gguf) | Q3_K_M | 6.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q3_K_L.gguf) | Q3_K_L | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.IQ4_XS.gguf) | IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q4_K_S.gguf) | Q4_K_S | 7.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q4_K_M.gguf) | Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q5_K_S.gguf) | Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q5_K_M.gguf) | Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q6_K.gguf) | Q6_K | 10.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/CodeLlama-SDSAT_L7_13B-GGUF/resolve/main/CodeLlama-SDSAT_L7_13B.Q8_0.gguf) | Q8_0 | 13.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF | mradermacher | 2024-11-25T05:08:59Z | 173 | 0 | transformers | [
"transformers",
"gguf",
"Llama",
"Llama-CPP",
"SmolTalk",
"ollama",
"bin",
"en",
"dataset:HuggingFaceTB/smoltalk",
"base_model:prithivMLmods/Llama-SmolTalk-3.2-1B-Instruct",
"base_model:quantized:prithivMLmods/Llama-SmolTalk-3.2-1B-Instruct",
"license:creativeml-openrail-m",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-24T05:32:06Z | ---
base_model: prithivMLmods/Llama-SmolTalk-3.2-1B-Instruct
datasets:
- HuggingFaceTB/smoltalk
language:
- en
library_name: transformers
license: creativeml-openrail-m
quantized_by: mradermacher
tags:
- Llama
- Llama-CPP
- SmolTalk
- ollama
- bin
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/prithivMLmods/Llama-SmolTalk-3.2-1B-Instruct
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
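None of the files in this repo are large enough to be split, but for repos where multi-part files do appear, they are typically plain byte-splits that can be concatenated in order (check the repo's notes; some split schemes instead require llama.cpp's `gguf-split` tool). A hedged Python sketch, with illustrative file names:
```python
# Reassemble "<name>.gguf.part1ofN", "<name>.gguf.part2ofN", ... into one file.
import re
import shutil
from pathlib import Path

name = "some-model.Q6_K.gguf"  # placeholder, not a file from this repo
parts = sorted(
    Path(".").glob(f"{name}.part*of*"),
    key=lambda p: int(re.search(r"part(\d+)of", p.name).group(1)),
)
with open(name, "wb") as out:
    for part in parts:
        with part.open("rb") as src:
            shutil.copyfileobj(src, out)  # append parts in numeric order
```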
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ1_M.gguf) | i1-IQ1_M | 0.5 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ2_S.gguf) | i1-IQ2_S | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ2_M.gguf) | i1-IQ2_M | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q2_K.gguf) | i1-Q2_K | 0.7 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.7 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ3_S.gguf) | i1-IQ3_S | 0.7 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ3_M.gguf) | i1-IQ3_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.8 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-IQ4_XS.gguf) | i1-IQ4_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 0.9 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 0.9 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 0.9 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q4_0.gguf) | i1-Q4_0 | 0.9 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q4_K_S.gguf) | i1-Q4_K_S | 0.9 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q4_K_M.gguf) | i1-Q4_K_M | 0.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/Llama-SmolTalk-3.2-1B-Instruct-i1-GGUF/resolve/main/Llama-SmolTalk-3.2-1B-Instruct.i1-Q6_K.gguf) | i1-Q6_K | 1.1 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
myfi/semantic-embedding_2 | myfi | 2024-11-25T05:03:35Z | 435 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"mpnet",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:7851",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:microsoft/mpnet-base",
"base_model:finetune:microsoft/mpnet-base",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-25T05:03:18Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:7851
- loss:MultipleNegativesRankingLoss
base_model: microsoft/mpnet-base
widget:
- source_sentence: did I gain any profits over the past 10 days
sentences:
- Which stocks have a strong potential to see a 10% increase in the next 10 months?
- Did I make any money from trading in the last 10 days
- Which stocks have a strong potential to go up by 10% in the next 10 months?
- source_sentence: Can you show me my holdings?
sentences:
- Reveal my highest-risk assets
- Display my riskiest investment holdings
- 'I''d like to see my portfolio details '
- source_sentence: Do I have any stocks in my portfolio?
sentences:
- Are there any shares of stock included in my portfolio?
- Unfold my individualized fintech recommendations
- What's the numerical assessment of my portfolio?
- source_sentence: View my report card
sentences:
- Which sectors are the most attractive to investors in my portfolio
- Recalibrate portfolio from stocks to mutual fund holdings
- Get my account overview
- source_sentence: Which of my investments have the highest volatility?
sentences:
- Can I see a yearly analysis of my returns
- Have I committed resources to any equity-driven investment funds?
- Which of my assets show the most pronounced fluctuations in market value?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on microsoft/mpnet-base
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("pawan2411/semantic-embedding_2")
# Run inference
sentences = [
'Which of my investments have the highest volatility?',
'Which of my assets show the most pronounced fluctuations in market value?',
'Can I see a yearly analysis of my returns',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 7,851 training samples
* Columns: <code>sentence_0</code> and <code>sentence_1</code>
* Approximate statistics based on the first 1000 samples:
| | sentence_0 | sentence_1 |
|:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 9.57 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 12.07 tokens</li><li>max: 27 tokens</li></ul> |
* Samples:
| sentence_0 | sentence_1 |
|:----------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
| <code>Show me how to switch my stock portfolio to mutual funds</code> | <code>What steps should I take to replace my stock holdings with mutual fund investments?</code> |
| <code>View my holdings</code> | <code>See my investment portfolio</code> |
| <code>How did my portfolio perform last week ?</code> | <code>Can you give me a rundown of my portfolio's performance for the past week?</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
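For readers unfamiliar with this loss, here is a minimal sketch of how a `MultipleNegativesRankingLoss` with the parameters above is constructed in `sentence-transformers`; the actual training script is not part of this card, so treat this as an assumption-laden reconstruction.
```python
# Construct the loss exactly as parameterized above; cosine similarity is
# the default similarity_fct, and scale=20.0 matches the reported value.
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("microsoft/mpnet-base")  # the base model named above
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)
```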
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `num_train_epochs`: 50
- `multi_dataset_batch_sampler`: round_robin
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 50
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin
</details>
### Training Logs
| Epoch | Step | Training Loss |
|:--------:|:-----:|:-------------:|
| 4.0650 | 500 | 2.1067 |
| 8.1301 | 1000 | 0.8233 |
| 12.1951 | 1500 | 0.6455 |
| 16.2602 | 2000 | 0.5768 |
| 20.3252 | 2500 | 0.5378 |
| 24.3902 | 3000 | 0.5155 |
| 28.4553 | 3500 | 0.499 |
| 32.5203 | 4000 | 0.4906 |
| 36.5854 | 4500 | 0.4841 |
| 40.6504 | 5000 | 0.4801 |
| 44.7154 | 5500 | 0.4746 |
| 48.7805 | 6000 | 0.4718 |
| 52.8455 | 6500 | 0.47 |
| 56.9106 | 7000 | 0.468 |
| 60.9756 | 7500 | 0.4655 |
| 65.0407 | 8000 | 0.4634 |
| 69.1057 | 8500 | 0.462 |
| 73.1707 | 9000 | 0.4604 |
| 77.2358 | 9500 | 0.46 |
| 81.3008 | 10000 | 0.4598 |
| 85.3659 | 10500 | 0.458 |
| 89.4309 | 11000 | 0.4574 |
| 93.4959 | 11500 | 0.4566 |
| 97.5610 | 12000 | 0.4565 |
| 101.6260 | 12500 | 0.4558 |
| 105.6911 | 13000 | 0.455 |
| 109.7561 | 13500 | 0.4551 |
| 113.8211 | 14000 | 0.455 |
| 117.8862 | 14500 | 0.4544 |
| 121.9512 | 15000 | 0.4533 |
| 126.0163 | 15500 | 0.4543 |
| 130.0813 | 16000 | 0.4535 |
| 134.1463 | 16500 | 0.4532 |
| 138.2114 | 17000 | 0.4522 |
| 142.2764 | 17500 | 0.4536 |
| 146.3415 | 18000 | 0.4521 |
| 4.0650 | 500 | 0.4898 |
| 8.1301 | 1000 | 0.4737 |
| 12.1951 | 1500 | 0.4681 |
| 16.2602 | 2000 | 0.4669 |
| 20.3252 | 2500 | 0.4645 |
| 24.3902 | 3000 | 0.4626 |
| 28.4553 | 3500 | 0.4586 |
| 32.5203 | 4000 | 0.4568 |
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu121
- Accelerate: 1.1.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF | mradermacher | 2024-11-25T05:00:08Z | 185 | 1 | transformers | [
"transformers",
"gguf",
"moe",
"merge",
"mergekit",
"en",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-24T19:47:53Z | ---
base_model: Aratako/Beyonder-4x7B-v3-random-lora
language:
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
tags:
- moe
- merge
- mergekit
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/Aratako/Beyonder-4x7B-v3-random-lora
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ1_S.gguf) | i1-IQ1_S | 5.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ1_M.gguf) | i1-IQ1_M | 5.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 6.5 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ2_XS.gguf) | i1-IQ2_XS | 7.2 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ2_S.gguf) | i1-IQ2_S | 7.4 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ2_M.gguf) | i1-IQ2_M | 8.1 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q2_K.gguf) | i1-Q2_K | 8.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 9.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ3_XS.gguf) | i1-IQ3_XS | 10.0 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q3_K_S.gguf) | i1-Q3_K_S | 10.5 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ3_S.gguf) | i1-IQ3_S | 10.6 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ3_M.gguf) | i1-IQ3_M | 10.7 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q3_K_M.gguf) | i1-Q3_K_M | 11.7 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q3_K_L.gguf) | i1-Q3_K_L | 12.6 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-IQ4_XS.gguf) | i1-IQ4_XS | 13.0 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q4_0.gguf) | i1-Q4_0 | 13.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q4_K_S.gguf) | i1-Q4_K_S | 13.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q4_K_M.gguf) | i1-Q4_K_M | 14.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q5_K_S.gguf) | i1-Q5_K_S | 16.7 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q5_K_M.gguf) | i1-Q5_K_M | 17.2 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.i1-Q6_K.gguf) | i1-Q6_K | 19.9 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
cmzheng/bert-base-uncased-finetuned-imdb-mlm | cmzheng | 2024-11-25T04:57:27Z | 194 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-25T04:03:24Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: bert-base-uncased-finetuned-imdb-mlm
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-imdb-mlm
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0236
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent code sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
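As referenced above, a hedged sketch of `TrainingArguments` matching these values; the output directory is a placeholder, not taken from the original run.
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-imdb-mlm",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    fp16=True,  # "Native AMP" mixed precision
)
```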
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.328 | 1.0 | 958 | 2.0760 |
| 2.1638 | 2.0 | 1916 | 2.0397 |
| 2.1222 | 3.0 | 2874 | 2.0249 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
peter198477/girls | peter198477 | 2024-11-25T04:56:22Z | 9 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"region:us"
] | text-to-image | 2024-11-25T04:55:17Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: '-'
output:
url: images/799885153188003873.png
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: KiSS
---
# gls
<Gallery />
## Trigger words
You should use `KiSS` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/peter198477/girls/tree/main) them in the Files & versions tab.
|
mradermacher/Beyonder-4x7B-v3-random-lora-GGUF | mradermacher | 2024-11-25T04:52:16Z | 71 | 0 | transformers | [
"transformers",
"gguf",
"moe",
"merge",
"mergekit",
"en",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T23:53:57Z | ---
base_model: Aratako/Beyonder-4x7B-v3-random-lora
language:
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
tags:
- moe
- merge
- mergekit
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/Aratako/Beyonder-4x7B-v3-random-lora
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q2_K.gguf) | Q2_K | 8.9 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q3_K_S.gguf) | Q3_K_S | 10.5 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q3_K_M.gguf) | Q3_K_M | 11.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q3_K_L.gguf) | Q3_K_L | 12.6 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.IQ4_XS.gguf) | IQ4_XS | 13.1 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q4_K_S.gguf) | Q4_K_S | 13.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q4_K_M.gguf) | Q4_K_M | 14.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q5_K_S.gguf) | Q5_K_S | 16.7 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q5_K_M.gguf) | Q5_K_M | 17.2 | |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q6_K.gguf) | Q6_K | 19.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Beyonder-4x7B-v3-random-lora-GGUF/resolve/main/Beyonder-4x7B-v3-random-lora.Q8_0.gguf) | Q8_0 | 25.8 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
ohaengman/NoobAI_XL_V-Pred-0.65s | ohaengman | 2024-11-25T04:36:53Z | 5 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:Laxhar/noobai-XL-Vpred-0.65",
"base_model:adapter:Laxhar/noobai-XL-Vpred-0.65",
"region:us"
] | text-to-image | 2024-11-25T04:25:42Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: '-'
output:
url: images/WHITE.png
base_model: Laxhar/noobai-XL-Vpred-0.65
instance_prompt: null
---
# noobai-XL_V-Pred-0.65s
<Gallery />
## Download model
Weights for this model are available in Safetensors format.
[Download](/ohaengman/NoobAI_XL_V-Pred-0.65s/tree/main) them in the Files & versions tab.
|
amang1802/Llama3.2-1B-summary-length-exp7 | amang1802 | 2024-11-25T04:32:49Z | 95 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T07:29:31Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
- Summary Length PPO experiment #7
- No KL divergence in loss
## Model Details
- Dataset size: 16384
- Epochs: 1
- Batch Size: 16 * 4 (w/ 4 GPUs)
Optimizer args: Torch AdamW defaults, except:
- LR = 0.00001 |
rohan105/Llama-2-7b-chat-finetune | rohan105 | 2024-11-25T04:31:20Z | 5 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:mlabonne/guanaco-llama2-1k",
"base_model:NousResearch/Llama-2-7b-chat-hf",
"base_model:finetune:NousResearch/Llama-2-7b-chat-hf",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T11:09:44Z | ---
datasets:
- mlabonne/guanaco-llama2-1k
language:
- en
base_model:
- NousResearch/Llama-2-7b-chat-hf
pipeline_tag: text-generation
library_name: transformers
--- |
OpenVINO/gemma-2b-it-int8-ov | OpenVINO | 2024-11-25T04:23:51Z | 9 | 0 | null | [
"openvino",
"gemma",
"base_model:google/gemma-2b-it",
"base_model:quantized:google/gemma-2b-it",
"license:gemma",
"region:us"
] | null | 2024-10-30T07:10:51Z | ---
license: gemma
license_link: https://choosealicense.com/licenses/gemma/
base_model: google/gemma-2b-it
base_model_relation: quantized
---
# gemma-2b-it-int8-ov
* Model creator: [google](https://huggingface.co/google)
* Original model: [gemma-2b-it](https://huggingface.co/google/gemma-2b-it)
## Description
This is the [gemma-2b-it](https://huggingface.co/google/gemma-2b-it) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to INT8 by [NNCF](https://github.com/openvinotoolkit/nncf).
## Quantization Parameters
Weight compression was performed using `nncf.compress_weights` with the following parameters:
* mode: **int8_asym**
* ratio: **1**
For more information on quantization, check the [OpenVINO model optimization guide](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html).
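Below is a hedged sketch (not from the original card) of the `nncf.compress_weights` call these parameters describe; the IR path is a placeholder for an already-exported OpenVINO model, not a file shipped in this repo.
```
import openvino as ov
import nncf

core = ov.Core()
ov_model = core.read_model("gemma-2b-it/openvino_model.xml")  # placeholder path
compressed = nncf.compress_weights(
    ov_model,
    mode=nncf.CompressWeightsMode.INT8_ASYM,  # mode: int8_asym, as above
    ratio=1.0,                                # ratio: 1, as above
)
ov.save_model(compressed, "gemma-2b-it-int8/openvino_model.xml")
```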
## Compatibility
The provided OpenVINO™ IR model is compatible with:
* OpenVINO version 2024.5.0 and higher
* Optimum Intel 1.21.0 and higher
## Running Model Inference with [Optimum Intel](https://huggingface.co/docs/optimum/intel/index)
1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend:
```
pip install optimum[openvino]
```
2. Run model inference:
```
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM
model_id = "OpenVINO/gemma-2b-it-int8-ov"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html).
## Running Model Inference with [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai)
1. Install packages required for using OpenVINO GenAI.
```
pip install openvino-genai huggingface_hub
```
2. Download model from HuggingFace Hub
```
import huggingface_hub as hf_hub
model_id = "OpenVINO/gemma-2b-it-int8-ov"
model_path = "gemma-2b-it-int8-ov"
hf_hub.snapshot_download(model_id, local_dir=model_path)
```
3. Run model inference:
```
import openvino_genai as ov_genai
device = "CPU"
pipe = ov_genai.LLMPipeline(model_path, device)
print(pipe.generate("What is OpenVINO?", max_length=200))
```
More GenAI usage examples can be found in the OpenVINO GenAI library [docs](https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md) and [samples](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#openvino-genai-samples).
## Limitations
Check the [original model card](https://huggingface.co/google/gemma-2b-it) for limitations.
## Legal information
The original model is distributed under [gemma](https://choosealicense.com/licenses/gemma/) license. More details can be found in [original model card](https://huggingface.co/google/gemma-2b-it).
## Disclaimer
Intel is committed to respecting human rights and avoiding causing or contributing to adverse impacts on human rights. See [Intel’s Global Human Rights Principles](https://www.intel.com/content/dam/www/central-libraries/us/en/documents/policy-human-rights.pdf). Intel’s products and software are intended only to be used in applications that do not cause or contribute to adverse impacts on human rights.
|
OpenVINO/gemma-2-9b-it-int8-ov | OpenVINO | 2024-11-25T04:21:12Z | 22 | 0 | null | [
"openvino",
"gemma2",
"base_model:google/gemma-2-9b-it",
"base_model:quantized:google/gemma-2-9b-it",
"license:gemma",
"region:us"
] | null | 2024-10-23T09:54:50Z | ---
license: gemma
license_link: https://choosealicense.com/licenses/gemma/
base_model: google/gemma-2-9b-it
base_model_relation: quantized
---
# gemma-2-9b-it-int8-ov
* Model creator: [google](https://huggingface.co/google)
* Original model: [gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it)
## Description
This is the [gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to INT8 by [NNCF](https://github.com/openvinotoolkit/nncf).
## Quantization Parameters
Weight compression was performed using `nncf.compress_weights` with the following parameters:
* mode: **int8_asym**
* ratio: **1**
For more information on quantization, check the [OpenVINO model optimization guide](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html).
## Compatibility
The provided OpenVINO™ IR model is compatible with:
* OpenVINO version 2024.5.0 and higher
* Optimum Intel 1.21.0 and higher
## Running Model Inference with [Optimum Intel](https://huggingface.co/docs/optimum/intel/index)
1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend:
```
pip install optimum[openvino]
```
2. Run model inference:
```
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM
model_id = "OpenVINO/gemma-2-9b-it-int8-ov"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html).
## Running Model Inference with [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai)
1. Install packages required for using OpenVINO GenAI.
```
pip install openvino-genai huggingface_hub
```
2. Download model from HuggingFace Hub
```
import huggingface_hub as hf_hub
model_id = "OpenVINO/gemma-2-9b-it-int8-ov"
model_path = "gemma-2-9b-it-int8-ov"
hf_hub.snapshot_download(model_id, local_dir=model_path)
```
3. Run model inference:
```
import openvino_genai as ov_genai
device = "CPU"
pipe = ov_genai.LLMPipeline(model_path, device)
print(pipe.generate("What is OpenVINO?", max_length=200))
```
More GenAI usage examples can be found in the OpenVINO GenAI library [docs](https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md) and [samples](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#openvino-genai-samples).
## Limitations
Check the [original model card](https://huggingface.co/google/gemma-2-9b-it) for limitations.
## Legal information
The original model is distributed under [gemma](https://ai.google.dev/gemma/terms) license. More details can be found in [original model card](https://huggingface.co/google/gemma-2-9b-it).
## Disclaimer
Intel is committed to respecting human rights and avoiding causing or contributing to adverse impacts on human rights. See [Intel’s Global Human Rights Principles](https://www.intel.com/content/dam/www/central-libraries/us/en/documents/policy-human-rights.pdf). Intel’s products and software are intended only to be used in applications that do not cause or contribute to adverse impacts on human rights.
|
Shinyaaa/Travel-10-v1 | Shinyaaa | 2024-11-25T04:19:39Z | 103 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T04:19:12Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
RajeevkumarYadana/gita-text-generation-gpt2 | RajeevkumarYadana | 2024-11-25T04:18:44Z | 130 | 0 | transformers | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T04:18:06Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
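In the absence of a published snippet, here is a minimal sketch using the 🤗 `pipeline` API; loading the checkpoint as a standard GPT-2 text-generation model is an assumption, and the prompt and sampling settings are illustrative:

```python
from transformers import pipeline

# Load the checkpoint as a standard text-generation pipeline (assumed GPT-2).
generator = pipeline(
    "text-generation",
    model="RajeevkumarYadana/gita-text-generation-gpt2",
)

# Illustrative prompt; generation settings are arbitrary defaults.
result = generator("The path of selfless action", max_new_tokens=50, do_sample=True)
print(result[0]["generated_text"])
```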
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
OpenVINO/gemma-2-9b-it-int4-ov | OpenVINO | 2024-11-25T04:18:05Z | 7 | 0 | null | [
"openvino",
"gemma2",
"base_model:google/gemma-2-9b-it",
"base_model:quantized:google/gemma-2-9b-it",
"license:gemma",
"region:us"
] | null | 2024-10-23T09:51:00Z | ---
license: gemma
license_link: https://choosealicense.com/licenses/gemma/
base_model: google/gemma-2-9b-it
base_model_relation: quantized
---
# gemma-2-9b-it-int4-ov
* Model creator: [google](https://huggingface.co/google)
* Original model: [gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it)
## Description
This is the [gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to INT4 by [NNCF](https://github.com/openvinotoolkit/nncf).
## Quantization Parameters
Weight compression was performed using `nncf.compress_weights` with the following parameters:
* mode: **int4_asym**
* ratio: **1**
* group_size: **128**
For more information on quantization, check the [OpenVINO model optimization guide](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html).
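As an illustration, the compression step described above roughly corresponds to the following NNCF call; the model-loading boilerplate and file paths are assumptions:

```python
import nncf
import openvino as ov
from nncf import CompressWeightsMode

# Load the full-precision OpenVINO IR model (paths are illustrative).
core = ov.Core()
model = core.read_model("gemma-2-9b-it/openvino_model.xml")

# INT4 asymmetric weight compression with the parameters listed above.
compressed = nncf.compress_weights(
    model,
    mode=CompressWeightsMode.INT4_ASYM,
    ratio=1.0,
    group_size=128,
)
ov.save_model(compressed, "gemma-2-9b-it-int4-ov/openvino_model.xml")
```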
## Compatibility
The provided OpenVINO™ IR model is compatible with:
* OpenVINO version 2024.5.0 and higher
* Optimum Intel 1.21.0 and higher
## Running Model Inference with [Optimum Intel](https://huggingface.co/docs/optimum/intel/index)
1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend:
```
pip install optimum[openvino]
```
2. Run model inference:
```
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM
model_id = "OpenVINO/gemma-2-9b-it-int4-ov"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html).
## Running Model Inference with [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai)
1. Install packages required for using OpenVINO GenAI.
```
pip install openvino-genai huggingface_hub
```
2. Download the model from the Hugging Face Hub:
```
import huggingface_hub as hf_hub
model_id = "OpenVINO/gemma-2-9b-it-int4-ov"
model_path = "gemma-2-9b-it-int4-ov"
hf_hub.snapshot_download(model_id, local_dir=model_path)
```
3. Run model inference:
```
import openvino_genai as ov_genai
device = "CPU"
pipe = ov_genai.LLMPipeline(model_path, device)
print(pipe.generate("What is OpenVINO?", max_length=200))
```
More GenAI usage examples can be found in the OpenVINO GenAI library [docs](https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md) and [samples](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#openvino-genai-samples).
## Limitations
Check the [original model card](https://huggingface.co/google/gemma-2-9b-it) for limitations.
## Legal information
The original model is distributed under the [Gemma](https://choosealicense.com/licenses/gemma/) license. More details can be found in the [original model card](https://huggingface.co/google/gemma-2-9b-it).
## Disclaimer
Intel is committed to respecting human rights and avoiding causing or contributing to adverse impacts on human rights. See [Intel’s Global Human Rights Principles](https://www.intel.com/content/dam/www/central-libraries/us/en/documents/policy-human-rights.pdf). Intel’s products and software are intended only to be used in applications that do not cause or contribute to adverse impacts on human rights.
|
mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF | mradermacher | 2024-11-25T04:15:59Z | 159 | 0 | transformers | [
"transformers",
"gguf",
"Axolotl",
"Deepspeed",
"en",
"dataset:GusPuffy/python-decompiler-37-0.7-train",
"base_model:GusPuffy/sentient-simulations-pydecompiler-3.7-6.7b-v0.9",
"base_model:quantized:GusPuffy/sentient-simulations-pydecompiler-3.7-6.7b-v0.9",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2024-11-25T02:13:06Z | ---
base_model: GusPuffy/sentient-simulations-pydecompiler-3.7-6.7b-v0.9
datasets:
- GusPuffy/python-decompiler-37-0.7-train
language:
- en
library_name: transformers
license: other
license_link: LICENSE
license_name: deepseek-license
quantized_by: mradermacher
tags:
- Axolotl
- Deepspeed
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/GusPuffy/sentient-simulations-pydecompiler-3.7-6.7b-v0.9
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
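For a quick programmatic check, here is a minimal sketch with the `llama-cpp-python` bindings (an assumption; any GGUF-capable runtime works). The file name matches the Q4_K_M entry in the table below:

```python
from llama_cpp import Llama

# Point at a downloaded quant; Q4_K_M is the "fast, recommended" pick below.
llm = Llama(model_path="sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q4_K_M.gguf")

# One-shot completion; the prompt and settings are illustrative.
out = llm("def hello():", max_tokens=64)
print(out["choices"][0]["text"])
```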
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q2_K.gguf) | Q2_K | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q3_K_S.gguf) | Q3_K_S | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q3_K_M.gguf) | Q3_K_M | 3.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q3_K_L.gguf) | Q3_K_L | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.IQ4_XS.gguf) | IQ4_XS | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q4_0_4_4.gguf) | Q4_0_4_4 | 3.9 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q4_K_S.gguf) | Q4_K_S | 4.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q4_K_M.gguf) | Q4_K_M | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q5_K_S.gguf) | Q5_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q5_K_M.gguf) | Q5_K_M | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q6_K.gguf) | Q6_K | 5.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.Q8_0.gguf) | Q8_0 | 7.3 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/sentient-simulations-pydecompiler-3.7-6.7b-v0.9-GGUF/resolve/main/sentient-simulations-pydecompiler-3.7-6.7b-v0.9.f16.gguf) | f16 | 13.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
OpenVINO/Phi-3.5-mini-instruct-int4-ov | OpenVINO | 2024-11-25T04:08:01Z | 188 | 3 | null | [
"openvino",
"phi3",
"custom_code",
"base_model:microsoft/Phi-3.5-mini-instruct",
"base_model:quantized:microsoft/Phi-3.5-mini-instruct",
"license:mit",
"region:us"
] | null | 2024-11-20T11:19:58Z | ---
license: mit
license_link: https://choosealicense.com/licenses/mit/
base_model: microsoft/Phi-3.5-mini-instruct
base_model_relation: quantized
---
# Phi-3.5-mini-instruct-int4-ov
* Model creator: [microsoft](https://huggingface.co/microsoft)
* Original model: [Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)
## Description
This is the [Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to INT4 by [NNCF](https://github.com/openvinotoolkit/nncf).
## Quantization Parameters
Weight compression was performed using `nncf.compress_weights` with the following parameters:
* mode: **int4_asym**
* ratio: **1**
* group_size: **128**
For more information on quantization, check the [OpenVINO model optimization guide](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html).
## Compatibility
The provided OpenVINO™ IR model is compatible with:
* OpenVINO version 2024.5.0 and higher
* Optimum Intel 1.21.0 and higher
## Running Model Inference with [Optimum Intel](https://huggingface.co/docs/optimum/intel/index)
1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend:
```
pip install optimum[openvino]
```
2. Run model inference:
```
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM
model_id = "OpenVINO/Phi-3.5-mini-instruct-int4-ov"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html).
## Running Model Inference with [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai)
1. Install packages required for using OpenVINO GenAI.
```
pip install openvino-genai huggingface_hub
```
2. Download model from HuggingFace Hub
```
import huggingface_hub as hf_hub
model_id = "OpenVINO/Phi-3.5-mini-instruct-int4-ov"
model_path = "Phi-3.5-mini-instruct-int4-ov"
hf_hub.snapshot_download(model_id, local_dir=model_path)
```
3. Run model inference:
```
import openvino_genai as ov_genai
device = "CPU"
pipe = ov_genai.LLMPipeline(model_path, device)
print(pipe.generate("What is OpenVINO?", max_length=200))
```
More GenAI usage examples can be found in the OpenVINO GenAI library [docs](https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md) and [samples](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#openvino-genai-samples).
## Limitations
Check the [original model card](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) for limitations.
## Legal information
The original model is distributed under the [MIT](https://choosealicense.com/licenses/mit/) license. More details can be found in the [original model card](https://huggingface.co/microsoft/Phi-3.5-mini-instruct).
## Disclaimer
Intel is committed to respecting human rights and avoiding causing or contributing to adverse impacts on human rights. See [Intel’s Global Human Rights Principles](https://www.intel.com/content/dam/www/central-libraries/us/en/documents/policy-human-rights.pdf). Intel’s products and software are intended only to be used in applications that do not cause or contribute to adverse impacts on human rights.
|
OpenVINO/Phi-3.5-mini-instruct-int8-ov | OpenVINO | 2024-11-25T04:06:09Z | 1,774 | 0 | null | [
"openvino",
"phi3",
"custom_code",
"base_model:microsoft/Phi-3.5-mini-instruct",
"base_model:quantized:microsoft/Phi-3.5-mini-instruct",
"license:mit",
"region:us"
] | null | 2024-11-20T11:52:22Z | ---
license: mit
license_link: https://choosealicense.com/licenses/mit/
base_model: microsoft/Phi-3.5-mini-instruct
base_model_relation: quantized
---
# Phi-3.5-mini-instruct-int8-ov
* Model creator: [microsoft](https://huggingface.co/microsoft)
* Original model: [Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)
## Description
This is the [Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to INT8 by [NNCF](https://github.com/openvinotoolkit/nncf).
## Quantization Parameters
Weight compression was performed using `nncf.compress_weights` with the following parameters:
* mode: **int8_asym**
* ratio: **1**
For more information on quantization, check the [OpenVINO model optimization guide](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html).
## Compatibility
The provided OpenVINO™ IR model is compatible with:
* OpenVINO version 2024.5.0 and higher
* Optimum Intel 1.21.0 and higher
## Running Model Inference with [Optimum Intel](https://huggingface.co/docs/optimum/intel/index)
1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend:
```
pip install optimum[openvino]
```
2. Run model inference:
```
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM
model_id = "OpenVINO/Phi-3.5-mini-instruct-int8-ov"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html).
## Running Model Inference with [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai)
1. Install packages required for using OpenVINO GenAI.
```
pip install openvino-genai huggingface_hub
```
2. Download the model from the Hugging Face Hub:
```
import huggingface_hub as hf_hub
model_id = "OpenVINO/Phi-3.5-mini-instruct-int8-ov"
model_path = "Phi-3.5-mini-instruct-int8-ov"
hf_hub.snapshot_download(model_id, local_dir=model_path)
```
3. Run model inference:
```
import openvino_genai as ov_genai
device = "CPU"
pipe = ov_genai.LLMPipeline(model_path, device)
print(pipe.generate("What is OpenVINO?", max_length=200))
```
More GenAI usage examples can be found in the OpenVINO GenAI library [docs](https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md) and [samples](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#openvino-genai-samples).
## Limitations
Check the [original model card](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) for limitations.
## Legal information
The original model is distributed under the [MIT](https://choosealicense.com/licenses/mit/) license. More details can be found in the [original model card](https://huggingface.co/microsoft/Phi-3.5-mini-instruct).
## Disclaimer
Intel is committed to respecting human rights and avoiding causing or contributing to adverse impacts on human rights. See [Intel’s Global Human Rights Principles](https://www.intel.com/content/dam/www/central-libraries/us/en/documents/policy-human-rights.pdf). Intel’s products and software are intended only to be used in applications that do not cause or contribute to adverse impacts on human rights.
|
XueyingJia/pythia-1b-deduped-hh-online-dpo-full-merge | XueyingJia | 2024-11-25T04:06:00Z | 128 | 0 | transformers | [
"transformers",
"safetensors",
"gpt_neox",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T04:04:33Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
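As a placeholder for the missing snippet, here is a minimal sketch with the 🤗 `pipeline` API; the Human/Assistant framing follows the Anthropic HH convention suggested by the model name and is an assumption:

```python
from transformers import pipeline

# Load the DPO-tuned checkpoint as a plain text-generation pipeline.
generator = pipeline(
    "text-generation",
    model="XueyingJia/pythia-1b-deduped-hh-online-dpo-full-merge",
)

# Single-turn prompt in the assumed HH format; settings are illustrative.
result = generator("Human: How do I brew good coffee?\n\nAssistant:", max_new_tokens=64)
print(result[0]["generated_text"])
```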
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold1 | MayBashendy | 2024-11-25T04:04:16Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T03:24:50Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold1
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set (a sketch for reproducing these metrics follows the list):
- Loss: 0.7463
- Qwk: 0.4000
- Mse: 0.7463
- Rmse: 0.8639
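These can be reproduced with scikit-learn, assuming Qwk denotes the quadratic weighted kappa computed on integer score labels:

```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def report(y_true, y_pred):
    # Quadratic weighted kappa on discrete (integer) scores.
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return qwk, mse, mse ** 0.5  # Qwk, Mse, Rmse
```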
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
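A minimal sketch mapping these values onto 🤗 `TrainingArguments`; `output_dir` and every field not listed above are assumptions:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k25_task1_organization_fold1",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```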
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:------:|
| No log | 0.0008 | 2 | 12.5364 | 0.0 | 12.5364 | 3.5407 |
| No log | 0.0016 | 4 | 11.1525 | -0.0160 | 11.1525 | 3.3395 |
| No log | 0.0024 | 6 | 9.9888 | 0.0 | 9.9888 | 3.1605 |
| No log | 0.0032 | 8 | 8.7236 | 0.0 | 8.7236 | 2.9536 |
| No log | 0.0040 | 10 | 7.4626 | 0.0 | 7.4626 | 2.7318 |
| No log | 0.0048 | 12 | 6.3933 | 0.0 | 6.3933 | 2.5285 |
| No log | 0.0056 | 14 | 5.6891 | 0.0267 | 5.6891 | 2.3852 |
| No log | 0.0064 | 16 | 4.6276 | 0.0104 | 4.6276 | 2.1512 |
| No log | 0.0072 | 18 | 3.9970 | 0.0040 | 3.9970 | 1.9992 |
| No log | 0.0080 | 20 | 3.2027 | 0.0 | 3.2027 | 1.7896 |
| No log | 0.0088 | 22 | 2.5899 | 0.0029 | 2.5899 | 1.6093 |
| No log | 0.0096 | 24 | 1.9127 | 0.0874 | 1.9127 | 1.3830 |
| No log | 0.0104 | 26 | 1.4006 | 0.0315 | 1.4006 | 1.1835 |
| No log | 0.0112 | 28 | 1.0870 | 0.0 | 1.0870 | 1.0426 |
| No log | 0.0120 | 30 | 0.8971 | 0.2824 | 0.8971 | 0.9471 |
| No log | 0.0128 | 32 | 0.8539 | 0.2151 | 0.8539 | 0.9241 |
| No log | 0.0136 | 34 | 0.8956 | 0.0784 | 0.8956 | 0.9464 |
| No log | 0.0144 | 36 | 1.1286 | 0.0286 | 1.1286 | 1.0624 |
| No log | 0.0152 | 38 | 1.1801 | 0.0286 | 1.1801 | 1.0863 |
| No log | 0.0160 | 40 | 1.1688 | 0.0286 | 1.1688 | 1.0811 |
| No log | 0.0168 | 42 | 1.4385 | 0.1666 | 1.4385 | 1.1994 |
| No log | 0.0176 | 44 | 1.1952 | 0.0143 | 1.1952 | 1.0932 |
| No log | 0.0184 | 46 | 0.9882 | 0.0286 | 0.9882 | 0.9941 |
| No log | 0.0192 | 48 | 1.3300 | 0.1326 | 1.3300 | 1.1533 |
| No log | 0.0200 | 50 | 1.1779 | 0.0288 | 1.1779 | 1.0853 |
| No log | 0.0208 | 52 | 1.2238 | 0.0114 | 1.2238 | 1.1062 |
| No log | 0.0216 | 54 | 1.6299 | 0.0792 | 1.6299 | 1.2767 |
| No log | 0.0224 | 56 | 1.7557 | 0.0810 | 1.7557 | 1.3250 |
| No log | 0.0232 | 58 | 1.3004 | 0.0964 | 1.3004 | 1.1403 |
| No log | 0.0240 | 60 | 0.9798 | 0.0143 | 0.9798 | 0.9899 |
| No log | 0.0248 | 62 | 0.9078 | 0.1435 | 0.9078 | 0.9528 |
| No log | 0.0256 | 64 | 0.8934 | 0.1042 | 0.8934 | 0.9452 |
| No log | 0.0264 | 66 | 1.0052 | 0.0143 | 1.0052 | 1.0026 |
| No log | 0.0272 | 68 | 1.4093 | 0.1946 | 1.4093 | 1.1872 |
| No log | 0.0280 | 70 | 1.4672 | 0.2492 | 1.4672 | 1.2113 |
| No log | 0.0288 | 72 | 1.1845 | 0.0 | 1.1845 | 1.0884 |
| No log | 0.0296 | 74 | 0.9752 | 0.0 | 0.9752 | 0.9875 |
| No log | 0.0304 | 76 | 0.9763 | 0.0 | 0.9763 | 0.9881 |
| No log | 0.0312 | 78 | 0.9743 | 0.0 | 0.9743 | 0.9870 |
| No log | 0.0320 | 80 | 1.0843 | 0.0382 | 1.0843 | 1.0413 |
| No log | 0.0328 | 82 | 1.4109 | 0.2328 | 1.4109 | 1.1878 |
| No log | 0.0336 | 84 | 1.3546 | 0.2373 | 1.3546 | 1.1639 |
| No log | 0.0344 | 86 | 1.0914 | 0.0479 | 1.0914 | 1.0447 |
| No log | 0.0352 | 88 | 0.8533 | 0.0554 | 0.8533 | 0.9238 |
| No log | 0.0360 | 90 | 0.8429 | 0.0855 | 0.8429 | 0.9181 |
| No log | 0.0368 | 92 | 0.9985 | 0.0143 | 0.9985 | 0.9993 |
| No log | 0.0376 | 94 | 1.2016 | 0.1743 | 1.2016 | 1.0962 |
| No log | 0.0384 | 96 | 1.2282 | 0.1360 | 1.2282 | 1.1083 |
| No log | 0.0392 | 98 | 0.9237 | 0.0760 | 0.9237 | 0.9611 |
| No log | 0.0400 | 100 | 1.2095 | 0.1163 | 1.2095 | 1.0998 |
| No log | 0.0408 | 102 | 1.5339 | 0.0549 | 1.5339 | 1.2385 |
| No log | 0.0416 | 104 | 1.3577 | 0.0794 | 1.3577 | 1.1652 |
| No log | 0.0423 | 106 | 1.2889 | 0.1725 | 1.2889 | 1.1353 |
| No log | 0.0431 | 108 | 1.2676 | 0.1825 | 1.2676 | 1.1259 |
| No log | 0.0439 | 110 | 1.5355 | 0.0991 | 1.5355 | 1.2391 |
| No log | 0.0447 | 112 | 1.6357 | 0.1192 | 1.6357 | 1.2789 |
| No log | 0.0455 | 114 | 1.1254 | 0.1430 | 1.1254 | 1.0608 |
| No log | 0.0463 | 116 | 0.9216 | 0.0975 | 0.9216 | 0.9600 |
| No log | 0.0471 | 118 | 1.0344 | 0.0406 | 1.0344 | 1.0171 |
| No log | 0.0479 | 120 | 1.4224 | 0.1321 | 1.4224 | 1.1926 |
| No log | 0.0487 | 122 | 1.7926 | 0.1040 | 1.7926 | 1.3389 |
| No log | 0.0495 | 124 | 1.6757 | 0.1077 | 1.6757 | 1.2945 |
| No log | 0.0503 | 126 | 1.5021 | 0.1105 | 1.5021 | 1.2256 |
| No log | 0.0511 | 128 | 1.1740 | 0.1715 | 1.1740 | 1.0835 |
| No log | 0.0519 | 130 | 1.1760 | 0.2776 | 1.1760 | 1.0844 |
| No log | 0.0527 | 132 | 1.3749 | 0.2115 | 1.3749 | 1.1726 |
| No log | 0.0535 | 134 | 1.5542 | 0.1662 | 1.5542 | 1.2467 |
| No log | 0.0543 | 136 | 1.5883 | 0.1627 | 1.5883 | 1.2603 |
| No log | 0.0551 | 138 | 1.2843 | 0.2609 | 1.2843 | 1.1333 |
| No log | 0.0559 | 140 | 0.8714 | 0.2441 | 0.8714 | 0.9335 |
| No log | 0.0567 | 142 | 0.8751 | 0.2685 | 0.8751 | 0.9354 |
| No log | 0.0575 | 144 | 1.2942 | 0.2671 | 1.2942 | 1.1376 |
| No log | 0.0583 | 146 | 1.6978 | 0.1420 | 1.6978 | 1.3030 |
| No log | 0.0591 | 148 | 1.5195 | 0.1461 | 1.5195 | 1.2327 |
| No log | 0.0599 | 150 | 1.0821 | 0.2596 | 1.0821 | 1.0402 |
| No log | 0.0607 | 152 | 0.8844 | 0.1948 | 0.8844 | 0.9404 |
| No log | 0.0615 | 154 | 0.9557 | 0.2519 | 0.9557 | 0.9776 |
| No log | 0.0623 | 156 | 1.3424 | 0.2039 | 1.3424 | 1.1586 |
| No log | 0.0631 | 158 | 1.6028 | 0.1356 | 1.6028 | 1.2660 |
| No log | 0.0639 | 160 | 1.3255 | 0.2053 | 1.3255 | 1.1513 |
| No log | 0.0647 | 162 | 0.9709 | 0.1919 | 0.9709 | 0.9854 |
| No log | 0.0655 | 164 | 0.9274 | 0.1686 | 0.9274 | 0.9630 |
| No log | 0.0663 | 166 | 1.1772 | 0.2614 | 1.1772 | 1.0850 |
| No log | 0.0671 | 168 | 1.2372 | 0.2491 | 1.2372 | 1.1123 |
| No log | 0.0679 | 170 | 1.0801 | 0.2329 | 1.0801 | 1.0393 |
| No log | 0.0687 | 172 | 1.1099 | 0.2181 | 1.1099 | 1.0535 |
| No log | 0.0695 | 174 | 1.3584 | 0.2296 | 1.3584 | 1.1655 |
| No log | 0.0703 | 176 | 1.0876 | 0.2360 | 1.0876 | 1.0429 |
| No log | 0.0711 | 178 | 0.8627 | 0.1393 | 0.8627 | 0.9288 |
| No log | 0.0719 | 180 | 0.9684 | 0.2065 | 0.9684 | 0.9841 |
| No log | 0.0727 | 182 | 1.4415 | 0.1928 | 1.4415 | 1.2006 |
| No log | 0.0735 | 184 | 1.4636 | 0.1953 | 1.4636 | 1.2098 |
| No log | 0.0743 | 186 | 1.1302 | 0.2872 | 1.1302 | 1.0631 |
| No log | 0.0751 | 188 | 1.0122 | 0.3170 | 1.0122 | 1.0061 |
| No log | 0.0759 | 190 | 0.9612 | 0.3601 | 0.9612 | 0.9804 |
| No log | 0.0767 | 192 | 1.0452 | 0.3538 | 1.0452 | 1.0224 |
| No log | 0.0775 | 194 | 1.1695 | 0.2895 | 1.1695 | 1.0814 |
| No log | 0.0783 | 196 | 1.2627 | 0.2585 | 1.2627 | 1.1237 |
| No log | 0.0791 | 198 | 1.0438 | 0.2963 | 1.0438 | 1.0216 |
| No log | 0.0799 | 200 | 0.9337 | 0.1763 | 0.9337 | 0.9663 |
| No log | 0.0807 | 202 | 0.9027 | 0.1917 | 0.9027 | 0.9501 |
| No log | 0.0815 | 204 | 0.8791 | 0.3047 | 0.8791 | 0.9376 |
| No log | 0.0823 | 206 | 0.9659 | 0.3603 | 0.9659 | 0.9828 |
| No log | 0.0831 | 208 | 1.4072 | 0.2708 | 1.4072 | 1.1862 |
| No log | 0.0839 | 210 | 1.5363 | 0.2065 | 1.5363 | 1.2395 |
| No log | 0.0847 | 212 | 1.3325 | 0.1893 | 1.3325 | 1.1543 |
| No log | 0.0855 | 214 | 1.3344 | 0.1330 | 1.3344 | 1.1552 |
| No log | 0.0863 | 216 | 1.3101 | 0.2006 | 1.3101 | 1.1446 |
| No log | 0.0871 | 218 | 1.2483 | 0.2959 | 1.2483 | 1.1173 |
| No log | 0.0879 | 220 | 1.2783 | 0.2928 | 1.2783 | 1.1306 |
| No log | 0.0887 | 222 | 1.7220 | 0.1584 | 1.7220 | 1.3123 |
| No log | 0.0895 | 224 | 1.6620 | 0.1675 | 1.6620 | 1.2892 |
| No log | 0.0903 | 226 | 1.1786 | 0.3006 | 1.1786 | 1.0856 |
| No log | 0.0911 | 228 | 1.1009 | 0.3130 | 1.1009 | 1.0493 |
| No log | 0.0919 | 230 | 1.2078 | 0.2863 | 1.2078 | 1.0990 |
| No log | 0.0927 | 232 | 1.3470 | 0.2585 | 1.3470 | 1.1606 |
| No log | 0.0935 | 234 | 1.5202 | 0.1941 | 1.5202 | 1.2330 |
| No log | 0.0943 | 236 | 1.2990 | 0.3074 | 1.2990 | 1.1397 |
| No log | 0.0951 | 238 | 0.9377 | 0.3472 | 0.9377 | 0.9684 |
| No log | 0.0959 | 240 | 1.0600 | 0.3230 | 1.0600 | 1.0295 |
| No log | 0.0967 | 242 | 1.3813 | 0.2685 | 1.3813 | 1.1753 |
| No log | 0.0975 | 244 | 1.1218 | 0.3457 | 1.1218 | 1.0591 |
| No log | 0.0983 | 246 | 0.7807 | 0.4298 | 0.7807 | 0.8836 |
| No log | 0.0991 | 248 | 0.7438 | 0.4674 | 0.7438 | 0.8624 |
| No log | 0.0999 | 250 | 0.8610 | 0.4088 | 0.8610 | 0.9279 |
| No log | 0.1007 | 252 | 1.1899 | 0.3468 | 1.1899 | 1.0908 |
| No log | 0.1015 | 254 | 1.0604 | 0.3657 | 1.0604 | 1.0298 |
| No log | 0.1023 | 256 | 0.6528 | 0.4855 | 0.6528 | 0.8080 |
| No log | 0.1031 | 258 | 0.6399 | 0.4781 | 0.6399 | 0.7999 |
| No log | 0.1039 | 260 | 0.9305 | 0.3998 | 0.9305 | 0.9646 |
| No log | 0.1047 | 262 | 0.9391 | 0.3979 | 0.9391 | 0.9691 |
| No log | 0.1055 | 264 | 0.6409 | 0.4512 | 0.6409 | 0.8005 |
| No log | 0.1063 | 266 | 0.6004 | 0.4484 | 0.6004 | 0.7748 |
| No log | 0.1071 | 268 | 0.6568 | 0.4432 | 0.6568 | 0.8104 |
| No log | 0.1079 | 270 | 1.0885 | 0.3496 | 1.0885 | 1.0433 |
| No log | 0.1087 | 272 | 1.2839 | 0.3097 | 1.2839 | 1.1331 |
| No log | 0.1095 | 274 | 0.9332 | 0.3792 | 0.9332 | 0.9660 |
| No log | 0.1103 | 276 | 0.6763 | 0.4599 | 0.6763 | 0.8224 |
| No log | 0.1111 | 278 | 0.6953 | 0.4662 | 0.6953 | 0.8339 |
| No log | 0.1119 | 280 | 1.0206 | 0.3489 | 1.0206 | 1.0102 |
| No log | 0.1127 | 282 | 1.1292 | 0.3310 | 1.1292 | 1.0627 |
| No log | 0.1135 | 284 | 0.8370 | 0.4004 | 0.8370 | 0.9149 |
| No log | 0.1143 | 286 | 0.6465 | 0.3951 | 0.6465 | 0.8041 |
| No log | 0.1151 | 288 | 0.6448 | 0.3973 | 0.6448 | 0.8030 |
| No log | 0.1159 | 290 | 0.7694 | 0.4308 | 0.7694 | 0.8772 |
| No log | 0.1167 | 292 | 1.0659 | 0.3378 | 1.0659 | 1.0324 |
| No log | 0.1175 | 294 | 1.0262 | 0.3561 | 1.0262 | 1.0130 |
| No log | 0.1183 | 296 | 0.7268 | 0.4098 | 0.7268 | 0.8525 |
| No log | 0.1191 | 298 | 0.7072 | 0.3904 | 0.7072 | 0.8410 |
| No log | 0.1199 | 300 | 0.8822 | 0.3615 | 0.8822 | 0.9393 |
| No log | 0.1207 | 302 | 1.2739 | 0.2792 | 1.2739 | 1.1287 |
| No log | 0.1215 | 304 | 1.2911 | 0.2797 | 1.2911 | 1.1363 |
| No log | 0.1223 | 306 | 1.1955 | 0.2920 | 1.1955 | 1.0934 |
| No log | 0.1231 | 308 | 0.8481 | 0.3675 | 0.8481 | 0.9209 |
| No log | 0.1239 | 310 | 0.7381 | 0.3944 | 0.7381 | 0.8591 |
| No log | 0.1247 | 312 | 0.7137 | 0.3962 | 0.7137 | 0.8448 |
| No log | 0.1254 | 314 | 0.8761 | 0.3774 | 0.8761 | 0.9360 |
| No log | 0.1262 | 316 | 1.2618 | 0.2962 | 1.2618 | 1.1233 |
| No log | 0.1270 | 318 | 1.1543 | 0.3289 | 1.1543 | 1.0744 |
| No log | 0.1278 | 320 | 0.8734 | 0.3970 | 0.8734 | 0.9346 |
| No log | 0.1286 | 322 | 0.9017 | 0.3704 | 0.9017 | 0.9496 |
| No log | 0.1294 | 324 | 1.1495 | 0.3398 | 1.1495 | 1.0721 |
| No log | 0.1302 | 326 | 1.0744 | 0.3524 | 1.0744 | 1.0365 |
| No log | 0.1310 | 328 | 0.9964 | 0.3654 | 0.9964 | 0.9982 |
| No log | 0.1318 | 330 | 0.6856 | 0.4306 | 0.6856 | 0.8280 |
| No log | 0.1326 | 332 | 0.6345 | 0.4778 | 0.6345 | 0.7965 |
| No log | 0.1334 | 334 | 0.6367 | 0.4502 | 0.6367 | 0.7979 |
| No log | 0.1342 | 336 | 0.8254 | 0.4201 | 0.8254 | 0.9085 |
| No log | 0.1350 | 338 | 1.2568 | 0.3226 | 1.2568 | 1.1211 |
| No log | 0.1358 | 340 | 1.2057 | 0.3229 | 1.2057 | 1.0981 |
| No log | 0.1366 | 342 | 0.8126 | 0.3878 | 0.8126 | 0.9014 |
| No log | 0.1374 | 344 | 0.6800 | 0.4254 | 0.6800 | 0.8246 |
| No log | 0.1382 | 346 | 0.6839 | 0.4263 | 0.6839 | 0.8270 |
| No log | 0.1390 | 348 | 0.8903 | 0.3510 | 0.8903 | 0.9435 |
| No log | 0.1398 | 350 | 1.1309 | 0.3194 | 1.1309 | 1.0634 |
| No log | 0.1406 | 352 | 0.9973 | 0.3531 | 0.9973 | 0.9986 |
| No log | 0.1414 | 354 | 0.8027 | 0.4076 | 0.8027 | 0.8959 |
| No log | 0.1422 | 356 | 0.7669 | 0.4423 | 0.7669 | 0.8757 |
| No log | 0.1430 | 358 | 0.8864 | 0.3703 | 0.8864 | 0.9415 |
| No log | 0.1438 | 360 | 1.2031 | 0.3268 | 1.2031 | 1.0969 |
| No log | 0.1446 | 362 | 1.2285 | 0.3115 | 1.2285 | 1.1084 |
| No log | 0.1454 | 364 | 0.9024 | 0.3560 | 0.9024 | 0.9500 |
| No log | 0.1462 | 366 | 0.8495 | 0.3500 | 0.8495 | 0.9217 |
| No log | 0.1470 | 368 | 1.0747 | 0.3476 | 1.0747 | 1.0367 |
| No log | 0.1478 | 370 | 1.4146 | 0.2501 | 1.4146 | 1.1894 |
| No log | 0.1486 | 372 | 1.2111 | 0.3011 | 1.2111 | 1.1005 |
| No log | 0.1494 | 374 | 0.8902 | 0.3637 | 0.8902 | 0.9435 |
| No log | 0.1502 | 376 | 0.7482 | 0.4161 | 0.7482 | 0.8650 |
| No log | 0.1510 | 378 | 0.8112 | 0.3823 | 0.8112 | 0.9007 |
| No log | 0.1518 | 380 | 1.0830 | 0.3262 | 1.0830 | 1.0407 |
| No log | 0.1526 | 382 | 1.0731 | 0.3331 | 1.0731 | 1.0359 |
| No log | 0.1534 | 384 | 0.8153 | 0.4001 | 0.8153 | 0.9030 |
| No log | 0.1542 | 386 | 0.7727 | 0.4264 | 0.7727 | 0.8790 |
| No log | 0.1550 | 388 | 0.8084 | 0.4070 | 0.8084 | 0.8991 |
| No log | 0.1558 | 390 | 1.0025 | 0.3665 | 1.0025 | 1.0012 |
| No log | 0.1566 | 392 | 1.1693 | 0.3682 | 1.1693 | 1.0814 |
| No log | 0.1574 | 394 | 0.8559 | 0.3950 | 0.8559 | 0.9252 |
| No log | 0.1582 | 396 | 0.7534 | 0.3810 | 0.7534 | 0.8680 |
| No log | 0.1590 | 398 | 0.7330 | 0.3733 | 0.7330 | 0.8562 |
| No log | 0.1598 | 400 | 0.9134 | 0.3729 | 0.9134 | 0.9557 |
| No log | 0.1606 | 402 | 1.3460 | 0.2934 | 1.3460 | 1.1602 |
| No log | 0.1614 | 404 | 1.1877 | 0.3083 | 1.1877 | 1.0898 |
| No log | 0.1622 | 406 | 0.7778 | 0.3765 | 0.7778 | 0.8819 |
| No log | 0.1630 | 408 | 0.6914 | 0.3697 | 0.6914 | 0.8315 |
| No log | 0.1638 | 410 | 0.6978 | 0.3536 | 0.6978 | 0.8353 |
| No log | 0.1646 | 412 | 0.8052 | 0.3578 | 0.8052 | 0.8973 |
| No log | 0.1654 | 414 | 1.0860 | 0.3102 | 1.0860 | 1.0421 |
| No log | 0.1662 | 416 | 1.0397 | 0.3350 | 1.0397 | 1.0197 |
| No log | 0.1670 | 418 | 0.7614 | 0.3781 | 0.7614 | 0.8726 |
| No log | 0.1678 | 420 | 0.7603 | 0.3579 | 0.7603 | 0.8720 |
| No log | 0.1686 | 422 | 0.7471 | 0.3517 | 0.7471 | 0.8644 |
| No log | 0.1694 | 424 | 0.8497 | 0.3817 | 0.8497 | 0.9218 |
| No log | 0.1702 | 426 | 1.1458 | 0.3436 | 1.1458 | 1.0704 |
| No log | 0.1710 | 428 | 1.0305 | 0.3592 | 1.0305 | 1.0151 |
| No log | 0.1718 | 430 | 0.7768 | 0.4066 | 0.7768 | 0.8814 |
| No log | 0.1726 | 432 | 0.7884 | 0.4068 | 0.7884 | 0.8879 |
| No log | 0.1734 | 434 | 0.7346 | 0.4292 | 0.7346 | 0.8571 |
| No log | 0.1742 | 436 | 0.7491 | 0.4241 | 0.7491 | 0.8655 |
| No log | 0.1750 | 438 | 0.8910 | 0.3996 | 0.8910 | 0.9440 |
| No log | 0.1758 | 440 | 0.9190 | 0.4092 | 0.9190 | 0.9587 |
| No log | 0.1766 | 442 | 0.7915 | 0.4385 | 0.7915 | 0.8897 |
| No log | 0.1774 | 444 | 0.7731 | 0.4487 | 0.7731 | 0.8792 |
| No log | 0.1782 | 446 | 0.6820 | 0.4821 | 0.6820 | 0.8258 |
| No log | 0.1790 | 448 | 0.7872 | 0.4591 | 0.7872 | 0.8873 |
| No log | 0.1798 | 450 | 1.0227 | 0.3914 | 1.0227 | 1.0113 |
| No log | 0.1806 | 452 | 0.8452 | 0.4296 | 0.8452 | 0.9194 |
| No log | 0.1814 | 454 | 0.6397 | 0.4300 | 0.6397 | 0.7998 |
| No log | 0.1822 | 456 | 0.6577 | 0.3829 | 0.6577 | 0.8110 |
| No log | 0.1830 | 458 | 0.6485 | 0.4237 | 0.6485 | 0.8053 |
| No log | 0.1838 | 460 | 0.9086 | 0.3954 | 0.9086 | 0.9532 |
| No log | 0.1846 | 462 | 1.1451 | 0.3515 | 1.1451 | 1.0701 |
| No log | 0.1854 | 464 | 0.9471 | 0.3803 | 0.9471 | 0.9732 |
| No log | 0.1862 | 466 | 0.6768 | 0.4406 | 0.6768 | 0.8227 |
| No log | 0.1870 | 468 | 0.6689 | 0.3969 | 0.6689 | 0.8179 |
| No log | 0.1878 | 470 | 0.7163 | 0.4281 | 0.7163 | 0.8464 |
| No log | 0.1886 | 472 | 0.9622 | 0.3560 | 0.9622 | 0.9809 |
| No log | 0.1894 | 474 | 1.1929 | 0.3220 | 1.1929 | 1.0922 |
| No log | 0.1902 | 476 | 1.0833 | 0.3431 | 1.0833 | 1.0408 |
| No log | 0.1910 | 478 | 0.8673 | 0.3702 | 0.8673 | 0.9313 |
| No log | 0.1918 | 480 | 0.8892 | 0.3680 | 0.8892 | 0.9430 |
| No log | 0.1926 | 482 | 1.1001 | 0.3345 | 1.1001 | 1.0489 |
| No log | 0.1934 | 484 | 1.1616 | 0.3138 | 1.1616 | 1.0778 |
| No log | 0.1942 | 486 | 1.0211 | 0.3560 | 1.0211 | 1.0105 |
| No log | 0.1950 | 488 | 0.8079 | 0.3908 | 0.8079 | 0.8988 |
| No log | 0.1958 | 490 | 0.7697 | 0.3710 | 0.7697 | 0.8773 |
| No log | 0.1966 | 492 | 0.8135 | 0.3925 | 0.8135 | 0.9020 |
| No log | 0.1974 | 494 | 1.0768 | 0.3434 | 1.0768 | 1.0377 |
| No log | 0.1982 | 496 | 1.1909 | 0.3084 | 1.1909 | 1.0913 |
| No log | 0.1990 | 498 | 0.9839 | 0.3817 | 0.9839 | 0.9919 |
| 1.1074 | 0.1998 | 500 | 0.7495 | 0.4030 | 0.7495 | 0.8657 |
| 1.1074 | 0.2006 | 502 | 0.7155 | 0.4028 | 0.7155 | 0.8459 |
| 1.1074 | 0.2014 | 504 | 0.7544 | 0.4012 | 0.7544 | 0.8685 |
| 1.1074 | 0.2022 | 506 | 0.7652 | 0.4291 | 0.7652 | 0.8748 |
| 1.1074 | 0.2030 | 508 | 0.6663 | 0.4415 | 0.6663 | 0.8163 |
| 1.1074 | 0.2038 | 510 | 0.7129 | 0.3673 | 0.7129 | 0.8443 |
| 1.1074 | 0.2046 | 512 | 0.6884 | 0.4261 | 0.6884 | 0.8297 |
| 1.1074 | 0.2054 | 514 | 0.6800 | 0.4596 | 0.6800 | 0.8246 |
| 1.1074 | 0.2062 | 516 | 0.8036 | 0.4156 | 0.8036 | 0.8965 |
| 1.1074 | 0.2070 | 518 | 0.9078 | 0.3917 | 0.9078 | 0.9528 |
| 1.1074 | 0.2078 | 520 | 0.7711 | 0.4206 | 0.7711 | 0.8781 |
| 1.1074 | 0.2085 | 522 | 0.7139 | 0.4483 | 0.7139 | 0.8449 |
| 1.1074 | 0.2093 | 524 | 0.7279 | 0.4535 | 0.7279 | 0.8532 |
| 1.1074 | 0.2101 | 526 | 0.7937 | 0.4202 | 0.7937 | 0.8909 |
| 1.1074 | 0.2109 | 528 | 0.7076 | 0.4518 | 0.7076 | 0.8412 |
| 1.1074 | 0.2117 | 530 | 0.6913 | 0.4232 | 0.6913 | 0.8314 |
| 1.1074 | 0.2125 | 532 | 0.6960 | 0.3950 | 0.6960 | 0.8342 |
| 1.1074 | 0.2133 | 534 | 0.6646 | 0.4543 | 0.6646 | 0.8152 |
| 1.1074 | 0.2141 | 536 | 0.8829 | 0.4173 | 0.8829 | 0.9396 |
| 1.1074 | 0.2149 | 538 | 0.8820 | 0.4158 | 0.8820 | 0.9392 |
| 1.1074 | 0.2157 | 540 | 0.6666 | 0.4355 | 0.6666 | 0.8164 |
| 1.1074 | 0.2165 | 542 | 0.6511 | 0.4442 | 0.6511 | 0.8069 |
| 1.1074 | 0.2173 | 544 | 0.6422 | 0.4576 | 0.6422 | 0.8014 |
| 1.1074 | 0.2181 | 546 | 0.7412 | 0.4201 | 0.7412 | 0.8609 |
| 1.1074 | 0.2189 | 548 | 1.0071 | 0.3739 | 1.0071 | 1.0035 |
| 1.1074 | 0.2197 | 550 | 0.9787 | 0.3750 | 0.9787 | 0.9893 |
| 1.1074 | 0.2205 | 552 | 0.7447 | 0.3841 | 0.7447 | 0.8630 |
| 1.1074 | 0.2213 | 554 | 0.7213 | 0.4261 | 0.7213 | 0.8493 |
| 1.1074 | 0.2221 | 556 | 0.7908 | 0.3973 | 0.7908 | 0.8893 |
| 1.1074 | 0.2229 | 558 | 1.0802 | 0.3720 | 1.0802 | 1.0393 |
| 1.1074 | 0.2237 | 560 | 1.0632 | 0.3854 | 1.0632 | 1.0311 |
| 1.1074 | 0.2245 | 562 | 0.7583 | 0.4263 | 0.7583 | 0.8708 |
| 1.1074 | 0.2253 | 564 | 0.6554 | 0.4575 | 0.6554 | 0.8096 |
| 1.1074 | 0.2261 | 566 | 0.6469 | 0.4679 | 0.6469 | 0.8043 |
| 1.1074 | 0.2269 | 568 | 0.6993 | 0.4338 | 0.6993 | 0.8362 |
| 1.1074 | 0.2277 | 570 | 0.9641 | 0.4021 | 0.9641 | 0.9819 |
| 1.1074 | 0.2285 | 572 | 0.9744 | 0.4096 | 0.9744 | 0.9871 |
| 1.1074 | 0.2293 | 574 | 0.7320 | 0.4200 | 0.7320 | 0.8556 |
| 1.1074 | 0.2301 | 576 | 0.6752 | 0.4443 | 0.6752 | 0.8217 |
| 1.1074 | 0.2309 | 578 | 0.7463 | 0.4186 | 0.7463 | 0.8639 |
| 1.1074 | 0.2317 | 580 | 0.9634 | 0.4143 | 0.9634 | 0.9815 |
| 1.1074 | 0.2325 | 582 | 0.8819 | 0.4236 | 0.8819 | 0.9391 |
| 1.1074 | 0.2333 | 584 | 0.7393 | 0.4268 | 0.7393 | 0.8598 |
| 1.1074 | 0.2341 | 586 | 0.7501 | 0.4422 | 0.7501 | 0.8661 |
| 1.1074 | 0.2349 | 588 | 0.9491 | 0.4135 | 0.9491 | 0.9742 |
| 1.1074 | 0.2357 | 590 | 0.9795 | 0.4235 | 0.9795 | 0.9897 |
| 1.1074 | 0.2365 | 592 | 0.7808 | 0.4312 | 0.7808 | 0.8837 |
| 1.1074 | 0.2373 | 594 | 0.6632 | 0.4517 | 0.6632 | 0.8143 |
| 1.1074 | 0.2381 | 596 | 0.6745 | 0.4534 | 0.6745 | 0.8213 |
| 1.1074 | 0.2389 | 598 | 0.8610 | 0.3929 | 0.8610 | 0.9279 |
| 1.1074 | 0.2397 | 600 | 0.8937 | 0.3880 | 0.8937 | 0.9453 |
| 1.1074 | 0.2405 | 602 | 0.7321 | 0.4083 | 0.7321 | 0.8556 |
| 1.1074 | 0.2413 | 604 | 0.6317 | 0.4418 | 0.6317 | 0.7948 |
| 1.1074 | 0.2421 | 606 | 0.6394 | 0.4708 | 0.6394 | 0.7996 |
| 1.1074 | 0.2429 | 608 | 0.8026 | 0.4345 | 0.8026 | 0.8959 |
| 1.1074 | 0.2437 | 610 | 0.9036 | 0.4290 | 0.9036 | 0.9506 |
| 1.1074 | 0.2445 | 612 | 0.8307 | 0.4223 | 0.8307 | 0.9114 |
| 1.1074 | 0.2453 | 614 | 0.7197 | 0.4207 | 0.7197 | 0.8483 |
| 1.1074 | 0.2461 | 616 | 0.7962 | 0.3991 | 0.7962 | 0.8923 |
| 1.1074 | 0.2469 | 618 | 0.9289 | 0.3954 | 0.9289 | 0.9638 |
| 1.1074 | 0.2477 | 620 | 0.8027 | 0.3998 | 0.8027 | 0.8959 |
| 1.1074 | 0.2485 | 622 | 0.7094 | 0.3872 | 0.7094 | 0.8423 |
| 1.1074 | 0.2493 | 624 | 0.7463 | 0.4000 | 0.7463 | 0.8639 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF | mradermacher | 2024-11-25T03:59:54Z | 33 | 1 | transformers | [
"transformers",
"gguf",
"en",
"dataset:mostlyai/datallm-instructs-v2",
"base_model:mostlyai/datallm-v2-mixtral-8x7b-v0.1",
"base_model:quantized:mostlyai/datallm-v2-mixtral-8x7b-v0.1",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T23:23:07Z | ---
base_model: mostlyai/datallm-v2-mixtral-8x7b-v0.1
datasets:
- mostlyai/datallm-instructs-v2
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/mostlyai/datallm-v2-mixtral-8x7b-v0.1
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q2_K.gguf) | Q2_K | 17.4 | |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q3_K_S.gguf) | Q3_K_S | 20.5 | |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q3_K_M.gguf) | Q3_K_M | 22.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q3_K_L.gguf) | Q3_K_L | 24.3 | |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.IQ4_XS.gguf) | IQ4_XS | 25.5 | |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q4_K_S.gguf) | Q4_K_S | 26.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q4_K_M.gguf) | Q4_K_M | 28.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q5_K_S.gguf) | Q5_K_S | 32.3 | |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q5_K_M.gguf) | Q5_K_M | 33.3 | |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q6_K.gguf) | Q6_K | 38.5 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/datallm-v2-mixtral-8x7b-v0.1-GGUF/resolve/main/datallm-v2-mixtral-8x7b-v0.1.Q8_0.gguf) | Q8_0 | 49.7 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
gaianet/SmolLM2-1.7B-Instruct-GGUF | gaianet | 2024-11-25T03:55:35Z | 21 | 1 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation",
"en",
"base_model:HuggingFaceTB/SmolLM2-1.7B-Instruct",
"base_model:quantized:HuggingFaceTB/SmolLM2-1.7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-19T14:25:39Z | ---
base_model: HuggingFaceTB/SmolLM2-1.7B-Instruct
license: apache-2.0
library_name: transformers
model_creator: HuggingFaceTB
model_name: SmolLM2-1.7B-Instruct
quantized_by: Second State Inc.
language:
- en
---
# SmolLM2-1.7B-Instruct-GGUF
## Original Model
[HuggingFaceTB/SmolLM2-1.7B-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct)
## Run with GaiaNet
**Prompt template:**
prompt template: `chatml`
**Context size:**
chat_ctx_size: `2048`
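For reference, `chatml` wraps each turn in `<|im_start|>`/`<|im_end|>` markers; here is a sketch of a formatted prompt (the system text is an illustrative assumption):

```python
# ChatML-formatted prompt matching the `chatml` template above.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"  # assumed system text
    "<|im_start|>user\nWhat is GaiaNet?<|im_end|>\n"
    "<|im_start|>assistant\n"
)
```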
**Run with GaiaNet:**
- Quick start: https://docs.gaianet.ai/node-guide/quick-start
- Customize your node: https://docs.gaianet.ai/node-guide/customize
*Quantized with llama.cpp b4120*
|
Shinyaaa/RPC-10-v1 | Shinyaaa | 2024-11-25T03:48:09Z | 103 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T03:47:43Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
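As a placeholder for the missing snippet, here is a minimal feature-extraction sketch with 🤗 Transformers; loading the checkpoint as a standard BERT encoder and the mean-pooling choice are assumptions:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the checkpoint as a plain BERT encoder for feature extraction.
tokenizer = AutoTokenizer.from_pretrained("Shinyaaa/RPC-10-v1")
model = AutoModel.from_pretrained("Shinyaaa/RPC-10-v1")

inputs = tokenizer("An example sentence.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into a single sentence vector (pooling assumed).
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```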
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF | irinachengsc | 2024-11-25T03:33:51Z | 17 | 0 | null | [
"gguf",
"llama-cpp",
"gguf-my-repo",
"fr",
"it",
"de",
"es",
"en",
"base_model:mistralai/Mixtral-8x7B-Instruct-v0.1",
"base_model:quantized:mistralai/Mixtral-8x7B-Instruct-v0.1",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T03:31:58Z | ---
language:
- fr
- it
- de
- es
- en
license: apache-2.0
base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
inference:
parameters:
temperature: 0.5
widget:
- messages:
- role: user
content: What is your favorite condiment?
extra_gated_description: If you want to learn more about how we process your personal
data, please read our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
tags:
- llama-cpp
- gguf-my-repo
---
# irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF
This model was converted to GGUF format from [`mistralai/Mixtral-8x7B-Instruct-v0.1`](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF --hf-file mixtral-8x7b-instruct-v0.1-q4_0.gguf -p "The meaning of life and the universe is"
```
### Server:
```bash
llama-server --hf-repo irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF --hf-file mixtral-8x7b-instruct-v0.1-q4_0.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF --hf-file mixtral-8x7b-instruct-v0.1-q4_0.gguf -p "The meaning of life and the universe is"
```
or
```
./llama-server --hf-repo irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF --hf-file mixtral-8x7b-instruct-v0.1-q4_0.gguf -c 2048
```
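If you prefer Python over the CLI, the same GGUF file can be loaded through the `llama-cpp-python` bindings; this is a minimal sketch, with `n_ctx` mirroring the server example above and sampling settings left at their defaults:
```python
# pip install llama-cpp-python huggingface-hub
from llama_cpp import Llama

# Fetches the quantized file from the Hub and loads it locally.
llm = Llama.from_pretrained(
    repo_id="irinachengsc/Mixtral-8x7B-Instruct-v0.1-Q4_0-GGUF",
    filename="mixtral-8x7b-instruct-v0.1-q4_0.gguf",
    n_ctx=2048,
)

out = llm("The meaning of life and the universe is", max_tokens=64)
print(out["choices"][0]["text"])
```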
|
g-assismoraes/deberta-semeval25-fulltrain-translateen | g-assismoraes | 2024-11-25T03:26:50Z | 160 | 0 | transformers | [
"transformers",
"safetensors",
"deberta-v2",
"text-classification",
"generated_from_trainer",
"base_model:microsoft/deberta-v3-base",
"base_model:finetune:microsoft/deberta-v3-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T03:15:32Z | ---
library_name: transformers
license: mit
base_model: microsoft/deberta-v3-base
tags:
- generated_from_trainer
model-index:
- name: deberta-semeval25-fulltrain-translateen
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-semeval25-fulltrain-translateen
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 10
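For readers reproducing this setup, the values above map onto the Hugging Face `Trainer` API roughly as follows (a sketch; `output_dir` and the surrounding dataset/model wiring are assumptions):
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="deberta-semeval25-fulltrain-translateen",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```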
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
mjm4dl/intent_slot_train_v2_llama_3_1_8B_64_e2_cosine_sched | mjm4dl | 2024-11-25T03:17:55Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:meta-llama/Llama-3.1-8B-Instruct",
"base_model:finetune:meta-llama/Llama-3.1-8B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T03:15:33Z | ---
base_model: meta-llama/Llama-3.1-8B-Instruct
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
---
# Uploaded model
- **Developed by:** mjm4dl
- **License:** apache-2.0
- **Finetuned from model:** meta-llama/Llama-3.1-8B-Instruct
This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
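A minimal reloading sketch with Unsloth for fast inference (`max_seq_length` and 4-bit loading are assumptions, not settings reported by the author):
```python
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="mjm4dl/intent_slot_train_v2_llama_3_1_8B_64_e2_cosine_sched",
    max_seq_length=2048,  # assumed
    load_in_4bit=True,    # assumed
)
FastLanguageModel.for_inference(model)  # switch on Unsloth's fast inference path
```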
|
Turbo-AI/gte-base-v1__trim_vocab-1024 | Turbo-AI | 2024-11-25T03:17:08Z | 19 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"new",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:131843",
"loss:CachedMultipleNegativesRankingLoss",
"custom_code",
"arxiv:1908.10084",
"arxiv:2101.06983",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-25T03:15:40Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:131843
- loss:CachedMultipleNegativesRankingLoss
widget:
- source_sentence: Tính thuế thu nhập chuyển nhượng bất động sản từ các căn cứ nào?
sentences:
- 'Căn cứ tính thuế đối với thu nhập từ chuyển nhượng bất động sản
Căn cứ tính thuế đối với thu nhập từ chuyển nhượng bất động sản là giá chuyển
nhượng từng lần và thuế suất.
1. Giá chuyển nhượng
a) Giá chuyển nhượng đối với chuyển nhượng quyền sử dụng đất không có công trình
xây dựng trên đất là giá ghi trên hợp đồng chuyển nhượng tại thời điểm chuyển
nhượng.
Trường hợp trên hợp đồng chuyển nhượng không ghi giá hoặc giá trên hợp đồng chuyển
nhượng thấp hơn giá đất do Uỷ ban nhân dân cấp tỉnh quy định tại thời điểm chuyển
nhượng thì giá chuyển nhượng sẽ được xác định theo bảng giá đất do Uỷ ban nhân
dân cấp tỉnh quy định tại thời điểm chuyển nhượng.
…
2. Thuế suất
Thuế suất đối với chuyển nhượng bất động sản là 2% trên giá chuyển nhượng hoặc
giá cho thuê lại.
….
4. Cách tính thuế
a) Thuế thu nhập cá nhân đối với thu nhập từ chuyển nhượng bất động sản được xác
định như sau:
Thuế thu nhập cá nhân phải nộp = Giá chuyển nhượng x Thuế suất 2%'
- 'Căn cứ tính thuế
Căn cứ tính thuế thu nhập từ chuyển nhượng bất động sản là thu nhập tính thuế
và thuế suất.
Thu nhập tính thuế bằng (=) thu nhập chịu thuế trừ (-) các khoản lỗ của hoạt động
chuyển nhượng bất động sản của các năm trước (nếu có).
1. Thu nhập chịu thuế.
Thu nhập chịu thuế từ chuyển nhượng bất động sản được xác định bằng doanh thu
thu được từ hoạt động chuyển nhượng bất động sản trừ giá vốn của bất động sản
và các khoản chi phí được trừ liên quan đến hoạt động chuyển nhượng bất động sản.
a) Doanh thu từ hoạt động chuyển nhượng bất động sản.
a.1) Doanh thu từ hoạt động chuyển nhượng bất động sản được xác định theo giá
thực tế chuyển nhượng bất động sản theo hợp đồng chuyển nhượng, mua bán bất động
sản phù hợp với quy định của pháp luật (bao gồm cả các khoản phụ thu và phí thu
thêm nếu có).
Trường hợp giá chuyển quyền sử dụng đất theo hợp đồng chuyển nhượng, mua bán bất
động sản thấp hơn giá đất tại bảng giá đất do Ủy ban nhân dân tỉnh, thành phố
trực thuộc Trung ương quy định tại thời điểm ký hợp đồng chuyển nhượng bất động
sản thì tính theo giá đất do Ủy ban nhân dân tỉnh, thành phố trực thuộc Trung
ương quy định tại thời điểm ký hợp đồng chuyển nhượng bất động sản.
...
b) Chi phí chuyển nhượng bất động sản:
...
b.2) Chi phí chuyển nhượng bất động sản được trừ bao gồm:
- Giá vốn của đất chuyển quyền được xác định phù hợp với nguồn gốc quyền sử dụng
đất, cụ thể như sau:
+ Đối với đất Nhà nước giao có thu tiền sử dụng đất, thu tiền cho thuê đất thì
giá vốn là số tiền sử dụng đất, số tiền cho thuê đất thực nộp Ngân sách Nhà nước;
+ Đối với đất nhận quyền sử dụng của tổ chức, cá nhân khác thì căn cứ vào hợp
đồng và chứng từ trả tiền hợp pháp khi nhận quyền sử dụng đất, quyền thuê đất;
trường hợp không có hợp đồng và chứng từ trả tiền hợp pháp thì giá vốn được tính
theo giá do Ủy ban nhân dân tỉnh, thành phố trực thuộc Trung ương quy định tại
thời điểm doanh nghiệp nhận chuyển nhượng bất động sản.
...
- Chi phí đền bù thiệt hại về đất.
- Chi phí đền bù thiệt hại về hoa màu.
- Chi phí bồi thường, hỗ trợ, tái định cư và chi phí tổ chức thực hiện bồi thường,
hỗ trợ, tái định cư theo quy định của pháp luật.
Các khoản chi phí bồi thường, đền bù, hỗ trợ, tái định cư và chi phí tổ chức thực
hiện bồi thường, hỗ trợ, tái định cư nêu trên nếu không có hóa đơn thì được lập
Bảng kê ghi rõ: tên; địa chỉ của người nhận; số tiền đền bù, hỗ trợ; chữ ký của
người nhận tiền và được chính quyền phường, xã nơi có đất được đền bù, hỗ trợ
xác nhận theo đúng quy định của pháp luật về bồi thường, hỗ trợ và tái định cư
khi Nhà nước thu hồi đất.
- Các loại phí, lệ phí theo quy định của pháp luật liên quan đến cấp quyền sử
dụng đất.
- Chi phí cải tạo đất, san lấp mặt bằng.
- Chi phí đầu tư xây dựng kết cấu hạ tầng như đường giao thông, điện, cấp nước,
thoát nước, bưu chính viễn thông...
- Giá trị kết cấu hạ tầng, công trình kiến trúc có trên đất.
- Các khoản chi phí khác liên quan đến bất động sản được chuyển nhượng.
Trường hợp doanh nghiệp có hoạt động kinh doanh nhiều ngành nghề khác nhau thì
phải hạch toán riêng các khoản chi phí. Trường hợp không hạch toán riêng được
chi phí của từng hoạt động thì chi phí chung được phân bổ theo tỷ lệ giữa doanh
thu từ chuyển nhượng bất động sản so với tổng doanh thu của doanh nghiệp.
Không được tính vào chi phí chuyển nhượng bất động sản các khoản chi phí đã được
Nhà nước thanh toán hoặc thanh toán bằng nguồn vốn khác.
2. Thuế suất thuế thu nhập doanh nghiệp đối với hoạt động chuyển nhượng bất động
sản là 22% (từ ngày 01/01/2016 là 20%).'
- 'Căn cứ tính thuế đối với thu nhập từ chuyển nhượng bất động sản
...
2. Thuế suất
Thuế suất đối với chuyển nhượng bất động sản là 2% trên giá chuyển nhượng hoặc
giá cho thuê lại.
...
4. Cách tính thuế
a) Thuế thu nhập cá nhân đối với thu nhập từ chuyển nhượng bất động sản được xác
định như sau:
Thuế thu nhập cá nhân phải nộp = Giá chuyển nhượng x Thuế suất 2%
b) Trường hợp chuyển nhượng bất sản là đồng sở hữu thì nghĩa vụ thuế được xác
định riêng cho từng người nộp thuế theo tỷ lệ sở hữu bất động sản. Căn cứ xác
định tỷ lệ sở hữu là tài liệu hợp pháp như: thoả thuận góp vốn ban đầu, di chúc
hoặc quyết định phân chia của toà án,... Trường hợp không có tài liệu hợp pháp
thì nghĩa vụ thuế của từng người nộp thuế được xác định theo tỷ lệ bình quân.'
- 'Căn cứ tính thuế đối với thu nhập từ chuyển nhượng bất động sản
Căn cứ tính thuế đối với thu nhập từ chuyển nhượng bất động sản là giá chuyển
nhượng từng lần và thuế suất.
1. Giá chuyển nhượng
a) Giá chuyển nhượng đối với chuyển nhượng quyền sử dụng đất không có công trình
xây dựng trên đất là giá ghi trên hợp đồng chuyển nhượng tại thời điểm chuyển
nhượng.
Trường hợp trên hợp đồng chuyển nhượng không ghi giá hoặc giá trên hợp đồng chuyển
nhượng thấp hơn giá đất do Uỷ ban nhân dân cấp tỉnh quy định tại thời điểm chuyển
nhượng thì giá chuyển nhượng sẽ được xác định theo bảng giá đất do Uỷ ban nhân
dân cấp tỉnh quy định tại thời điểm chuyển nhượng.
b) Giá chuyển nhượng đối với chuyển nhượng quyền sử dụng đất gắn với công trình
xây dựng trên đất, kể cả công trình xây dựng hình thành trong tương lai là giá
ghi trên hợp đồng chuyển nhượng tại thời điểm chuyển nhượng.
Trường hợp trên hợp đồng chuyển nhượng không ghi giá đất hoặc giá đất trên hợp
đồng chuyển nhượng thấp hơn giá do Ủy ban nhân dân cấp tỉnh quy định thì giá chuyển
nhượng đất là giá do Ủy ban nhân dân cấp tỉnh quy định tại thời điểm chuyển nhượng
theo quy định của pháp luật về đất đai.
Trường hợp chuyển nhượng nhà gắn liền với đất thì phần giá trị nhà, kết cấu hạ
tầng và công trình kiến trúc gắn liền với đất được xác định căn cứ theo giá tính
lệ phí trước bạ nhà do Ủy ban nhân dân cấp tỉnh quy định. Trường hợp Ủy ban nhân
dân cấp tỉnh không có quy định giá tính lệ phí trước bạ nhà thì căn cứ vào quy
định của Bộ Xây dựng về phân loại nhà, về tiêu chuẩn, định mức xây dựng cơ bản,
về giá trị còn lại thực tế của công trình trên đất.
Đối với công trình xây dựng hình thành trong tương lai, trường hợp hợp đồng không
ghi giá chuyển nhượng hoặc giá chuyển nhượng thấp hơn tỷ lệ góp vốn trên tổng
giá trị hợp đồng nhân (x) với giá đất và giá tính lệ phí trước bạ công trình xây
dựng do Ủy ban nhân dân cấp tỉnh quy định thì giá chuyển nhượng được xác định
theo giá Uỷ ban nhân (x) với tỷ lệ góp vốn trên tổng giá trị hợp đồng. Trường
hợp Ủy ban nhân dân cấp tỉnh chưa có quy định về đơn giá thì áp dụng theo suất
vốn đầu tư xây dựng công trình do Bộ Xây dựng công bố, đang áp dụng tại thời điểm
chuyển nhượng.
c) Giá chuyển nhượng đối với chuyển nhượng quyền thuê đất, thuê mặt nước là giá
ghi trên hợp đồng tại thời điểm chuyển nhượng quyền thuê mặt đất, thuê mặt nước.
Trường hợp đơn giá cho thuê lại trên hợp đồng thấp hơn giá do Uỷ ban nhân dân
tỉnh quy định tại thời điểm cho thuê lại thì giá cho thuê lại được xác định căn
cứ theo bảng giá do Uỷ ban nhân dân tỉnh quy định.
2. Thuế suất
Thuế suất đối với chuyển nhượng bất động sản là 2% trên giá chuyển nhượng hoặc
giá cho thuê lại.
3. Thời điểm tính thuế từ chuyển nhượng bất động sản được xác định như sau:
- Trường hợp hợp đồng chuyển nhượng không có thỏa thuận bên mua là người nộp thuế
thay cho bên bán thì thời điểm tính thuế là thời điểm hợp đồng chuyển nhượng có
hiệu lực theo quy định của pháp luật;
- Trường hợp hợp đồng chuyển nhượng có thỏa thuận bên mua là người nộp thuế thay
cho bên bán thì thời điểm tính thuế là thời điểm làm thủ tục đăng ký quyền sở
hữu, quyền sử dụng bất động sản. Trường hợp cá nhân nhận chuyển nhượng nhà ở hình
thành trong tương lai, quyền sử dụng đất gắn với công trình xây dựng tương lai
là thời điểm cá nhân nộp hồ sơ khai thuế với cơ quan thuế.
4. Cách tính thuế
a) Thuế thu nhập cá nhân đối với thu nhập từ chuyển nhượng bất động sản được xác
định như sau:
Thuế thu nhập cá nhân phải nộp = Giá chuyển nhượng x Thuế suất 2%
b) Trường hợp chuyển nhượng bất sản là đồng sở hữu thì nghĩa vụ thuế được xác
định riêng cho từng người nộp thuế theo tỷ lệ sở hữu bất động sản. Căn cứ xác
định tỷ lệ sở hữu là tài liệu hợp pháp như: thoả thuận góp vốn ban đầu, di chúc
hoặc quyết định phân chia của toà án,... Trường hợp không có tài liệu hợp pháp
thì nghĩa vụ thuế của từng người nộp thuế được xác định theo tỷ lệ bình quân.'
- source_sentence: Có được thay biển số xe đấu giá trúng vào chiếc xe đang có biển
số xe cũ của mình không?
sentences:
- 'Quyền và nghĩa vụ của người trúng đấu giá biển số xe ô tô; người nhận chuyển
nhượng, trao đổi, được tặng cho, thừa kế xe ô tô gắn biển số trúng đấu giá
...
2. Nghĩa vụ của người trúng đấu giá biển số xe ô tô bao gồm:
...
c) Không được chuyển nhượng, trao đổi, tặng cho, để thừa kế biển số xe ô tô trúng
đấu giá, trừ trường hợp chuyển nhượng, trao đổi, tặng cho, để thừa kế xe ô tô
gắn biển số trúng đấu giá.'
- "Quyền và nghĩa vụ của người trúng đấu giá biển số xe ô tô; người nhận chuyển\
\ nhượng, trao đổi, được tặng cho, thừa kế xe ô tô gắn biển số trúng đấu giá\n\
1. Quyền của người trúng đấu giá biển số xe ô tô bao gồm:\na) Được cấp văn bản\
\ xác nhận biển số xe ô tô trúng đấu giá sau khi nộp đủ số tiền trúng đấu giá;\n\
b) Được đăng ký biển số xe ô tô trúng đấu giá gắn với xe ô tô thuộc sở hữu của\
\ mình tại cơ quan công an nơi quản lý biển số xe ô tô trúng đấu giá hoặc nơi\
\ người trúng đấu giá đăng ký thường trú, đặt trụ sở;\nc) Được giữ lại biển số\
\ xe ô tô trúng đấu giá trong trường hợp xe ô tô bị mất, hư hỏng không thể sử\
\ dụng được hoặc được chuyển nhượng, trao đổi, tặng cho để đăng ký cho xe khác\
\ thuộc sở hữu của mình trong thời hạn 12 tháng kể từ thời điểm xe ô tô bị mất,\
\ hư hỏng không thể sử dụng được hoặc được chuyển giao quyền sở hữu;\nd) Được\
\ cấp lại biển số xe ô tô trúng đấu giá, văn bản xác nhận biển số xe ô tô trúng\
\ đấu giákhi bị mất, bị mờ, hỏng;\nđ) Trong thời hạn 12 tháng kể từ ngày được\
\ cấp văn bản xác nhận biển số xe ô tô trúng đấu giá, nếu người trúng đấu giá\
\ chết nhưng chưa thực hiện thủ tục đăng ký xe ô tô để gắn biển số trúng đấu giá\
\ thì biển số xe ô tô trúng đấu giá được chuyển vào hệ thống đăng ký, quản lý\
\ xe, người thừa kế theo quy định của pháp luật về thừa kế được nhận số tiền người\
\ trúng đấu giá đã nộp sau khi trừ các khoản chi phí tổ chức đấu giá.\n2. Nghĩa\
\ vụ của người trúng đấu giá biển số xe ô tô bao gồm:\na) Nộp đủ số tiền trúng\
\ đấu giá trong thời hạn 15 ngày kể từ ngày có văn bản phê duyệt kết quả đấu giá;\
\ tiền trúng đấu giá không bao gồm lệ phí đăng ký, cấp biển số xe ô tô;\nb) Thực\
\ hiện thủ tục đăng ký xe ô tô để gắn biển số trúng đấu giá trong thời hạn 12\
\ tháng kể từ ngày được cấp văn bản xác nhận biển số xe ô tô trúng đấu giá; trường\
\ hợp sự kiện bất khả kháng hoặc trở ngại khách quan thì thời hạn này được kéo\
\ dài thêm nhưng tối đa không quá 06 tháng. Sau thời hạn quy định, người trúng\
\ đấu giá biển số xe ô tô không thực hiện thủ tục đăng ký xe ô tô để gắn biển\
\ số trúng đấu giá thì biển số xe ô tô trúng đấu giá được chuyển vào hệ thống\
\ đăng ký, quản lý xe và người trúng đấu giá không được hoàn trả số tiền trúng\
\ đấu giá đã nộp; \nc) Không được chuyển nhượng, trao đổi, tặng cho, để thừa kế\
\ biển số xe ô tô trúng đấu giá, trừ trường hợp chuyển nhượng, trao đổi, tặng\
\ cho, để thừa kế xe ô tô gắn biển số trúng đấu giá.\n3. Quyền và nghĩa vụ của\
\ người nhận chuyển nhượng, trao đổi, được tặng cho, thừa kế xe ô tô gắn biển\
\ số trúng đấu giá thực hiện theo quy định của pháp luật về quy trình cấp, thu\
\ hồi đăng ký, biển số phương tiện giao thông cơ giới đường bộ. "
- 'Hồ sơ đăng ký, cấp biển số xe trúng đấu giá
1. Đối với xe chưa đăng ký
a) Giấy tờ đăng ký xe theo quy định tại Điều 8 Thông tư này;
b) Giấy xác nhận biển số xe trúng đấu giá do Cục Cảnh sát giao thông cấp, còn
thời hạn sử dụng; trường hợp quá thời hạn thì phải có thêm giấy xác nhận gia hạn
do Cục Cảnh sát giao thông cấp.
2. Đối với xe đã đăng ký thuộc quyền sở hữu của tổ chức, cá nhân trúng đấu giá
a) Giấy khai đăng ký xe;
b) Chứng nhận đăng ký xe và biển số xe;
Trường hợp cơ quan thực hiện đăng ký, cấp biển số ô tô trúng đấu giá khác cơ quan
quản lý hồ sơ xe đã đăng ký của tổ chức, cá nhân trúng đấu giá thì chủ xe phải
làm thủ tục thu hồi đối với xe đã đăng ký đó;
c) Giấy xác nhận biển số xe trúng đấu giá do Cục Cảnh sát giao thông cấp, còn
thời hạn sử dụng; trường hợp quá thời hạn thì phải có thêm giấy xác nhận gia hạn
do Cục Cảnh sát giao thông cấp.'
- 'Thủ tục đăng ký xe
...
3. Trường hợp chuyển quyền sở hữu xe kèm theo biển số xe trúng đấu giá
a) Chủ xe nộp hồ sơ và làm thủ tục thu hồi theo quy định tại khoản 1 Điều 14,
khoản 1 Điều 15 Thông tư này, chủ xe không phải nộp lại biển số xe trúng đấu giá
nhưng phải nộp bản sao chứng từ chuyển quyền sở hữu xe và xuất trình bản chính
để đối chiếu (chứng từ chuyển quyền sở hữu phải thể hiện rõ nội dung chuyển quyền
sở hữu xe kèm theo biển số trúng đấu giá);
b) Tổ chức, cá nhân nhận chuyển quyền sở hữu xe nộp hồ sơ và làm thủ tục đăng
ký sang tên xe theo quy định tại khoản 2 Điều 14, khoản 2 Điều 15 Thông tư này
và được đăng ký, giữ nguyên biển số xe trúng đấu giá (chứng từ chuyển quyền sở
hữu phải thể hiện rõ nội dung chuyển quyền sở hữu xe kèm theo biển số trúng đấu
giá).
Tổ chức, cá nhân đã nhận chuyển quyền sở hữu xe kèm theo biển số xe trúng đấu
giá, không được tiếp tục chuyển quyền sở hữu xe kèm theo biển số xe trúng đấu
giá cho tổ chức, cá nhân khác; được chuyển quyền sở hữu xe theo quy định của pháp
luật.'
- source_sentence: Thủ tục thu hồi thẻ giám định viên được quy định như thế nào?
sentences:
- '“Điều 1. Sửa đổi, bổ sung một số điều của Luật Giám định tư pháp
...
6. Sửa đổi, bổ sung Điều 10 như sau:
“Điều 10. Thẩm quyền, trình tự, thủ tục miễn nhiệm giám định viên tư pháp và thu
hồi thẻ giám định viên tư pháp
[...]
2. Hồ sơ đề nghị miễn nhiệm giám định viên tư pháp bao gồm:
a) Văn bản đề nghị miễn nhiệm giám định viên tư pháp của cơ quan, tổ chức quản
lý giám định viên tư pháp hoặc đơn xin miễn nhiệm của giám định viên tư pháp;
b) Văn bản, giấy tờ chứng minh giám định viên tư pháp thuộc một trong các trường
hợp quy định tại khoản 1 Điều này.
[...]”'
- 'Thu hồi Thẻ giám định viên
…
2. Thủ tục thu hồi Thẻ giám định viên
Khi có căn cứ thu hồi Thẻ giám định viên theo quy định tại khoản 1 Điều này, Cục
Trồng trọt thực hiện các thủ tục sau đây:
a) Cục trưởng Cục Trồng trọt ký quyết định thu hồi Thẻ giám định viên;
b) Xoá tên khỏi Sổ đăng ký quốc gia về người hoạt động giám định quyền đối với
giống cây trồng;
c) Công bố trên Website của Văn phòng bảo hộ giống cây trồng mới trong thời hạn
năm (05) ngày kể từ ngày ký quyết định.'
- 'Trình tự, thủ tục bổ nhiệm, cấp thẻ, miễn nhiệm, thu hồi thẻ giám định viên tư
pháp
1. Bổ nhiệm, cấp thẻ giám định viên tư pháp:
a) Đơn vị thuộc Ngân hàng Nhà nước lập hồ sơ bổ nhiệm, cấp thẻ giám định viên
tư pháp theo quy định tại khoản 1 Điều 6 Thông tư này gửi Vụ Tổ chức cán bộ;
b) Trong thời hạn tối đa 07 ngày kể từ ngày nhận đủ hồ sơ hợp lệ, Vụ Tổ chức cán
bộ trình Thống đốc Ngân hàng Nhà nước quyết định bổ nhiệm, cấp thẻ giám định viên
tư pháp. Trường hợp từ chối, Vụ Tổ chức cán bộ có văn bản gửi đơn vị đề nghị và
nêu rõ lý do.
2. Miễn nhiệm, thu hồi thẻ giám định viên tư pháp:
a) Đơn vị thuộc Ngân hàng Nhà nước lập hồ sơ miễn nhiệm, thu hồi thẻ giám định
viên tư pháp theo quy định tại khoản 2 Điều 6 Thông tư này gửi Vụ tổ chức cán
bộ;
b) Trong thời hạn tối đa 07 ngày kể từ ngày nhận đủ hồ sơ hợp lệ, Vụ Tổ chức cán
bộ trình Thống đốc Ngân hàng Nhà nước quyết định miễn nhiệm, thu hồi thẻ giám
định viên tư pháp. Trường hợp từ chối, Vụ Tổ chức cán bộ có văn bản gửi đơn vị
đề nghị và nêu rõ lý do.
3. Trong thời hạn tối đa 03 ngày kể từ ngày có quyết định bổ nhiệm, miễn nhiệm,
Vụ Tổ chức cán bộ lập danh sách giám định viên tư pháp, điều chỉnh danh sách giám
định viên tư pháp trình Thống đốc Ngân hàng Nhà nước ký gửi Bộ Tư pháp, đồng thời
gửi Vụ Truyền thông để thực hiện đăng tải danh sách trên Cổng thông tin điện tử
Ngân hàng Nhà nước, gửi Cơ quan Thanh tra, giám sát ngân hàng để theo dõi.'
- 'Thủ tục miễn nhiệm và thu hồi thẻ giám định viên pháp y và giám định viên pháp
y tâm thần
...
3. Thủ tục miễn nhiệm giám định viên pháp y và giám định viên pháp y tâm thần.
a) Tại cấp Trung ương
Cơ quan đề nghị miễn nhiệm giám định viên pháp y, giám định viên pháp y tâm thần
lập hồ sơ đề nghị miễn nhiệm theo quy định tại khoản 1 Điều này, gửi đến Bộ Y
tế (qua Vụ Tổ chức cán bộ). Vụ Tổ chức cán bộ, Bộ Y tế chủ trì, phối hợp với Cục
Quản lý Khám, chữa bệnh và Vụ Pháp chế rà soát hồ sơ.
Trong thời hạn 10 ngày, kể từ ngày nhận được hồ sơ đầy đủ, hợp lệ, Vụ Tổ chức
cán bộ trình Bộ trưởng Bộ Y tế xem xét quyết định miễn nhiệm, thu hồi thẻ giám
định viên pháp y, giám định viên pháp y tâm thần và điều chỉnh danh sách giám
định viên trên Cổng Thông tin điện tử của Bộ Y tế, đồng thời gửi Bộ Tư pháp để
điều chỉnh danh sách chung về giám định viên tư pháp. Trường hợp không miễn nhiệm
thì Bộ Y tế thông báo cho cơ quan đề nghị bằng văn bản và nêu rõ lý do.
b) Tại cấp tỉnh/thành phố trực thuộc Trung ương
Cơ quan đề nghị miễn nhiệm giám định viên pháp y, giám định viên pháp y tâm thần
lập hồ sơ đề nghị miễn nhiệm theo quy định tại khoản 1 Điều này gửi đến Sở Y tế.
Sở Y tế phối hợp với Sở Tư pháp rà soát hồ sơ.
Trong thời hạn 10 ngày, kể từ ngày nhận được hồ sơ đầy đủ, hợp lệ, Sở Y tế trình
Chủ tịch Ủy ban nhân dân cấp tỉnh xem xét quyết định miễn nhiệm, thu hồi thẻ giám
định viên pháp y, giám định viên pháp y tâm thần và điều chỉnh danh sách giám
định viên trên Cổng Thông tin điện tử của Ủy ban nhân dân cấp tỉnh, đồng thời
gửi Bộ Tư pháp để điều chỉnh danh sách chung về giám định viên tư pháp. Trường
hợp không miễn nhiệm thì Sở Y tế thông báo cho cơ quan đề nghị miễn nhiệm bằng
văn bản và nêu rõ lý do.'
- source_sentence: Hồ sơ cán bộ, công chức trong tổ chức công đoàn được quản lý, sử
dụng và bảo quản theo chế độ nào?
sentences:
- 'Phân cấp quản lý hồ sơ
1. Tổng Liên đoàn Lao động Việt Nam quản lý hồ sơ:
- Hồ sơ cán bộ, công chức, người lao động tại cơ quan Tổng Liên đoàn.
- Hồ sơ của chủ tịch, phó chủ tịch công đoàn ngành Trung ương và tương đương,
công đoàn tổng công ty trực thuộc Tổng Liên đoàn.
- Hồ sơ của cấp trưởng, cấp phó, kế toán trưởng các đơn vị trực thuộc Tổng Liên
đoàn.
- Sơ yếu lý lịch và bản bổ sung lý lịch hàng năm của các ủy viên ban thường vụ,
ủy viên ủy ban kiểm tra các liên đoàn lao động tỉnh, thành phố, công đoàn ngành
Trung ương và tương đương, công đoàn tổng công ty trực thuộc Tổng Liên đoàn.
2. Liên đoàn lao động tỉnh, thành phố, công đoàn ngành Trung ương và tương đương,
công đoàn tổng công ty trực thuộc Tổng Liên đoàn (gọi chung là công đoàn cấp tỉnh)
quản lý hồ sơ cán bộ thuộc diện quản lý của đơn vị mình gồm:
- Cán bộ, công chức, người lao động tại cơ quan công đoàn cấp tỉnh.
...'
- 'Nguyên tắc quản lý hồ sơ
1. Bảo đảm sự quản lý thống nhất của Tổng Liên đoàn Lao động Việt Nam trong công
tác quản lý hồ sơ cán bộ công đoàn.
2. Công tác xây dựng và quản lý hồ sơ cán bộ công đoàn được thực hiện thống nhất,
khoa học và phải phản ánh được đầy đủ, chính xác thông tin của từng cán bộ công
đoàn từ khi được tuyển dụng hoặc tiếp nhận vào làm việc tại các cơ quan, đơn vị
công đoàn cho đến khi không làm việc tại cơ quan, đơn vị công đoàn.
3. Hồ sơ cán bộ công đoàn được quản lý, sử dụng và bảo quản theo chế độ tài liệu
mật do nhà nước quy định, chỉ những người được Thủ trưởng cơ quan hoặc người có
thẩm quyền quản lý hồ sơ đồng ý bằng văn bản mới được nghiên cứu, sử dụng và khai
thác hồ sơ của cán bộ công đoàn. Nghiêm cấm việc tự ý phát tán thông tin trong
hồ sơ cán bộ công đoàn.
4. Cán bộ công đoàn có trách nhiệm kê khai đầy đủ, rõ ràng, chính xác và chịu
trách nhiệm về tính trung thực của những thông tin trong hồ sơ do mình kê khai.
Những tài liệu do cán bộ công đoàn kê khai phải được cơ quan quản lý xác nhận.
5. Hồ sơ cán bộ công đoàn phải được xây dựng, lưu trữ, và bảo quản bằng công nghệ
thông tin để quản lý, sử dụng và khai thác nhanh chóng, chính xác, có hiệu quả,
đáp ứng yêu cầu quản lý, phân tích chất lượng đội ngũ cán bộ công đoàn.'
- 'Chuyển giao, tiếp nhận hồ sơ
...
4. Cán bộ công đoàn nghỉ hưu, chuyển công tác, thôi việc, hoặc bị kỷ luật buộc
thôi việc và từ trần thì việc chuyển giao và lưu trữ hồ sơ được thực hiện như
sau:
a) Cán bộ công đoàn nghỉ hưu, thôi việc hoặc bị kỷ luật buộc thôi việc được nhận
một bản sao “Sơ yếu lý lịch cán bộ, công chức” và các Quyết định liên quan. Hồ
sơ gốc vẫn do cơ quan, tổ chức, đơn vị có thẩm quyền quản lý hồ sơ cán bộ công
đoàn lưu giữ, bảo quản và đưa vào nhóm cán bộ công đoàn thôi việc. Cơ quan có
thẩm quyền quản lý cán bộ công đoàn chỉ được xác nhận và cấp lại bản sao “Sơ yếu
lý lịch cán bộ, công chức” khi có yêu cầu bằng văn bản và trên cơ sở hồ sơ gốc
lưu trữ;
b) Đối với cán bộ công đoàn từ trần thì gia đình được nhận một bản sao “Sơ yếu
lý lịch cán bộ, công chức”. Hồ sơ gốc vẫn do cơ quan, tổ chức, đơn vị quản lý
hồ sơ cán bộ công đoàn lưu giữ, bảo quản;
c) Đối với cán bộ công đoàn chuyển công tác hoặc không tiếp tục làm việc tại cơ
quan, đơn vị của tổ chức công đoàn được nhận một bản sao “Sơ yếu lý lịch cán bộ,
công chức” của bản thân. Hồ sơ gốc vẫn do cơ quan, tổ chức, đơn vị quản lý cán
bộ công đoàn đó lưu giữ, bảo quản và chỉ được chuyển giao cho các cơ quan, tổ
chức, đơn vị khác quản lý khi các cơ quan, tổ chức, đơn vị đó có yêu cầu bằng
văn bản.'
- 'Lưu giữ, bảo quản hồ sơ cán bộ công đoàn
...
3. Quy trình lưu giữ hồ sơ cán bộ công đoàn được thực hiện như sau:
a) Kiểm tra và xử lý để bảo đảm các tài liệu được lưu trữ trong thành phần hồ
sơ là những tài liệu chính thức, tin cậy và có giá trị pháp lý;
b) Loại bỏ những tài liệu trùng lặp, thừa, chỉ giữ lại mỗi loại tài liệu một bản;
c) Trường hợp cần hủy tài liệu trong thành phần hồ sơ cán bộ công đoàn phải thành
lập hội đồng hủy tài liệu hồ sơ. Hội đồng hủy tài liệu hồ sơ cán bộ công đoàn
do người đứng đầu cơ quan quản lý cán bộ công đoàn quyết định. Khi tiến hành tiêu
hủy phải lập biên bản ghi rõ lý do hủy, cơ quan có thẩm quyền cho phép hủy tài
liệu, hồ sơ cán bộ công đoàn, danh mục tài liệu hủy, ngày và nơi hủy. Biên bản
hủy phải lưu trong thành phần hồ sơ cán bộ công đoàn.
4. Chế độ bảo quản hồ sơ cán bộ công đoàn theo chế độ bảo mật của nhà nước và
phải đảm bảo các chế độ và điều kiện cơ sở vật chất, trang thiết bị gồm:
a) Trang thiết bị và phương tiện bảo quản hồ sơ giấy gồm: Phòng để lưu giữ hồ
sơ riêng biệt với phòng làm việc hoặc tủ lưu giữ hồ sơ.... bảo đảm hồ sơ cán bộ
công đoàn được lưu giữ lâu dài, an toàn và bảo mật;
...'
- source_sentence: Khi công chức chuyển đổi vị trí công tác thì cơ quan nào có trách
nhiệm chuyển giao quyền truy cập tài khoản hồ sơ điện tử của công chức?
sentences:
- 'Chuyển, tiếp nhận hồ sơ điện tử
1. Trường hợp công chức, viên chức được bổ nhiệm, điều động, luân chuyển, chuyển
đổi công tác đến cơ quan, đơn vị khác thuộc Bộ mà dẫn tới việc thay đổi về thẩm
quyền quản lý hồ sơ công chức, viên chức thì cùng với việc chuyển giao hồ sơ giấy
theo quy định tại khoản 2 Điều 15 Quy chế này, cơ quan đang quản lý hồ sơ công
chức, viên chức có văn bản gửi Cục Công nghệ thông tin đề nghị chuyển giao quyền
truy cập tài khoản hồ sơ điện tử của công chức, viên chức sang cơ quan, đơn vị
mới có thẩm quyền quản lý.
2. Trường hợp tiếp nhận công chức, viên chức từ cơ quan ngoài Bộ Tư pháp thì trong
thời hạn 15 ngày kể từ ngày nhận chuyển giao hồ sơ giấy, cơ quan quản lý hồ sơ
công chức, viên chức có trách nhiệm lập hồ sơ điện tử của công chức, viên chức,
phê duyệt, lưu trữ trên Phần mềm quản lý hồ sơ.
3. Trường hợp công chức, viên chức được biệt phái; thuyên chuyển đến cơ quan,
đơn vị khác của nhà nước; nghỉ hưu; chuyển công tác ra khỏi cơ quan, đơn vị của
nhà nước; thôi việc; bị kỷ luật buộc thôi việc; chết thì hồ sơ điện tử của công
chức, viên chức đó vẫn được lưu trữ trên Phần mềm quản lý hồ sơ.'
- 'Thẩm quyền và trách nhiệm của công chức trực tiếp làm công tác quản lý hồ sơ
công chức
1. Chủ động đề xuất kế hoạch, biện pháp quản lý, sử dụng và khai thác hồ sơ, hồ
sơ điện tử công chức.
2. Tổ chức việc bổ sung các tài liệu vào hồ sơ công chức bảo đảm kịp thời, chính
xác.
3. Tổ chức việc sắp xếp, bảo quản, lưu giữ hồ sơ.
4. Cung cấp số liệu, tư liệu nhanh, chính xác.
5. Nghiên cứu, phát hiện các vấn đề chưa rõ hoặc mâu thuẫn trong hồ sơ công chức
và những vấn đề nảy sinh trong công tác quản lý hồ sơ, báo cáo người đứng đầu
cơ quan có thẩm quyền quản lý công chức xem xét, xử lý.
6. Đôn đốc, thu thập đầy đủ các thành phần tài liệu trong hồ sơ công chức thuộc
thẩm quyền quản lý của cơ quan, tổ chức, đơn vị mình.
7. Tổ chức phục vụ nghiên cứu, sử dụng và khai thác hồ sơ công chức.
8. Thực hiện nguyên tắc bảo mật hồ sơ, phát hiện và kiến nghị với người có thẩm
quyền về những vấn đề phát sinh trong công tác quản lý hồ sơ công chức để có biện
pháp giải quyết kịp thời.
9. Thường xuyên học tập, trao đổi kinh nghiệm, bồi dưỡng nâng cao trình độ chuyên
môn, nghiệp vụ.'
- 'Chuyển giao, tiếp nhận, lưu trữ thông tin hồ sơ điện tử cá nhân
1. Trường hợp công chức, viên chức có quyết định điều động, luân chuyển, chuyển
đổi vị trí công tác, bổ nhiệm chức vụ, ngạch công chức, chức danh nghề nghiệp
làm thay đổi thẩm quyền quản lý hồ sơ công chức, viên chức giữa các cơ quan, tổ
chức trực thuộc Bộ, trong thời hạn 15 ngày kể từ ngày quyết định có hiệu lực,
Thủ trưởng đơn vị đang quản lý hồ sơ công chức, viên chức có trách nhiệm cập nhật
thông tin hồ sơ điện tử cá nhân đó đến thời điểm công tác tại đơn vị; đồng thời
hủy bỏ các quyền (nếu có) của cá nhân đó trên hệ thống liên quan đến phạm vi dữ
liệu của đơn vị.
Đơn vị tiếp nhận công chức, viên chức có trách nhiệm tiếp tục cập nhật hồ sơ điện
tử cá nhân của công chức, viên chức được chuyển đến theo quy định.
2. Trường hợp công chức, viên chức có quyết định biệt phái đến một đơn vị trực
thuộc Bộ, Thủ trưởng đơn vị tiếp nhận công chức, viên chức biệt phái có trách
nhiệm báo cáo, trao đổi thông tin liên quan đến công chức, viên chức đó trong
thời gian biệt phái cho cơ quan đang quản lý hồ sơ công chức, viên chức để cập
nhật, bổ sung thông tin hồ sơ điện tử cá nhân của công chức, viên chức đó.
…'
- 'Thẩm quyền và trách nhiệm của cơ quan quản lý hồ sơ công chức
1. Chấp hành sự chỉ đạo, kiểm tra và hướng dẫn nghiệp vụ về công tác hồ sơ, hồ
sơ điện tử công chức của cơ quan cấp trên, đồng thời hướng dẫn kiểm tra, đôn đốc
các cơ quan, tổ chức, đơn vị trực thuộc về công tác quản lý hồ sơ công chức.
2. Tổ chức thực hiện các quy định về bổ sung, chuyển giao, tiếp nhận, nghiên cứu,
sử dụng, khai thác, lưu trữ, bảo quản hồ sơ công chức theo quy định của Thông
tư này.
3. Giao nộp đầy đủ, kịp thời cho cơ quan, tổ chức, đơn vị quản lý công chức những
tài liệu liên quan đến hồ sơ công chức hiện đang công tác ở đơn vị mình.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@10
- cosine_precision@10
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@10
- dot_accuracy@10
- dot_precision@10
- dot_recall@10
- dot_ndcg@10
- dot_mrr@10
- dot_map@10
model-index:
- name: SentenceTransformer
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: Unknown
type: unknown
metrics:
- type: cosine_accuracy@10
value: 0.9818913480885312
name: Cosine Accuracy@10
- type: cosine_precision@10
value: 0.10684104627766601
name: Cosine Precision@10
- type: cosine_recall@10
value: 0.97719651240778
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8595980505645753
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8287486825716198
name: Cosine Mrr@10
- type: cosine_map@10
value: 0.8160047587109961
name: Cosine Map@10
- type: dot_accuracy@10
value: 0.971830985915493
name: Dot Accuracy@10
- type: dot_precision@10
value: 0.1058350100603622
name: Dot Precision@10
- type: dot_recall@10
value: 0.9668008048289738
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.8039262176278247
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.7591573249017911
name: Dot Mrr@10
- type: dot_map@10
value: 0.7455279294816518
name: Dot Map@10
---
# SentenceTransformer
This is a [sentence-transformers](https://www.SBERT.net) model. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** 1024 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: NewModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Turbo-AI/gte-base-v1__trim_vocab-1024")
# Run inference
sentences = [
'Khi công chức chuyển đổi vị trí công tác thì cơ quan nào có trách nhiệm chuyển giao quyền truy cập tài khoản hồ sơ điện tử của công chức?',
'Chuyển, tiếp nhận hồ sơ điện tử\n1. Trường hợp công chức, viên chức được bổ nhiệm, điều động, luân chuyển, chuyển đổi công tác đến cơ quan, đơn vị khác thuộc Bộ mà dẫn tới việc thay đổi về thẩm quyền quản lý hồ sơ công chức, viên chức thì cùng với việc chuyển giao hồ sơ giấy theo quy định tại khoản 2 Điều 15 Quy chế này, cơ quan đang quản lý hồ sơ công chức, viên chức có văn bản gửi Cục Công nghệ thông tin đề nghị chuyển giao quyền truy cập tài khoản hồ sơ điện tử của công chức, viên chức sang cơ quan, đơn vị mới có thẩm quyền quản lý.\n2. Trường hợp tiếp nhận công chức, viên chức từ cơ quan ngoài Bộ Tư pháp thì trong thời hạn 15 ngày kể từ ngày nhận chuyển giao hồ sơ giấy, cơ quan quản lý hồ sơ công chức, viên chức có trách nhiệm lập hồ sơ điện tử của công chức, viên chức, phê duyệt, lưu trữ trên Phần mềm quản lý hồ sơ.\n3. Trường hợp công chức, viên chức được biệt phái; thuyên chuyển đến cơ quan, đơn vị khác của nhà nước; nghỉ hưu; chuyển công tác ra khỏi cơ quan, đơn vị của nhà nước; thôi việc; bị kỷ luật buộc thôi việc; chết thì hồ sơ điện tử của công chức, viên chức đó vẫn được lưu trữ trên Phần mềm quản lý hồ sơ.',
'Thẩm quyền và trách nhiệm của cơ quan quản lý hồ sơ công chức\n1. Chấp hành sự chỉ đạo, kiểm tra và hướng dẫn nghiệp vụ về công tác hồ sơ, hồ sơ điện tử công chức của cơ quan cấp trên, đồng thời hướng dẫn kiểm tra, đôn đốc các cơ quan, tổ chức, đơn vị trực thuộc về công tác quản lý hồ sơ công chức.\n2. Tổ chức thực hiện các quy định về bổ sung, chuyển giao, tiếp nhận, nghiên cứu, sử dụng, khai thác, lưu trữ, bảo quản hồ sơ công chức theo quy định của Thông tư này.\n3. Giao nộp đầy đủ, kịp thời cho cơ quan, tổ chức, đơn vị quản lý công chức những tài liệu liên quan đến hồ sơ công chức hiện đang công tác ở đơn vị mình.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@10 | 0.9819 |
| cosine_precision@10 | 0.1068 |
| cosine_recall@10 | 0.9772 |
| cosine_ndcg@10 | 0.8596 |
| cosine_mrr@10 | 0.8287 |
| **cosine_map@10** | **0.816** |
| dot_accuracy@10 | 0.9718 |
| dot_precision@10 | 0.1058 |
| dot_recall@10 | 0.9668 |
| dot_ndcg@10 | 0.8039 |
| dot_mrr@10 | 0.7592 |
| dot_map@10 | 0.7455 |
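The numbers above come from the evaluator referenced earlier; the following sketch shows how such metrics are produced, using toy Vietnamese query/corpus pairs that are illustrative only:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Turbo-AI/gte-base-v1__trim_vocab-1024")

queries = {"q1": "Thuế suất chuyển nhượng bất động sản là bao nhiêu?"}
corpus = {"d1": "Thuế suất đối với chuyển nhượng bất động sản là 2% trên giá chuyển nhượng."}
relevant_docs = {"q1": {"d1"}}  # which corpus ids answer each query

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="demo")
results = evaluator(model)
# Depending on the installed sentence-transformers version, this returns the
# primary score or a dict of accuracy@k / precision@k / recall@k / ndcg@k /
# mrr@k / map@k values.
print(results)
```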
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 131,843 training samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative_0</code>, <code>negative_1</code>, and <code>negative_2</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative_0 | negative_1 | negative_2 |
|:--------|:----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|
| type | string | string | string | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 24.45 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 300.08 tokens</li><li>max: 1024 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 331.9 tokens</li><li>max: 1024 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 337.39 tokens</li><li>max: 1024 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 330.35 tokens</li><li>max: 1024 tokens</li></ul> |
* Samples:
| anchor | positive | negative_0 | negative_1 | negative_2 |
|:------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Người học ngành quản lý khai thác công trình thủy lợi trình độ cao đẳng phải có khả năng học tập và nâng cao trình độ như thế nào?</code> | <code>Khả năng học tập, nâng cao trình độ<br>- Khối lượng khối lượng kiến thức tối thiểu, yêu cầu về năng lực mà người học phải đạt được sau khi tốt nghiệp ngành, nghề Mộc xây dựng và trang trí nội thất, trình độ cao đẳng có thể tiếp tục phát triển ở các trình độ cao hơn;<br>- Người học sau tốt nghiệp có năng lực tự học, tự cập nhật những tiến bộ khoa học công nghệ trong phạm vi ngành, nghề để nâng cao trình độ hoặc học liên thông lên trình độ cao hơn trong cùng ngành, nghề hoặc trong nhóm ngành, nghề hoặc trong cùng lĩnh vực đào tạo./.<br>Người học ngành mộc xây dựng và trang trí nội thất trình độ cao đẳng phải có khả năng học tập, nâng cao trình độ như thế sau:<br>- Khối lượng khối lượng kiến thức tối thiểu, yêu cầu về năng lực mà người học phải đạt được sau khi tốt nghiệp ngành, nghề Mộc xây dựng và trang trí nội thất, trình độ cao đẳng có thể tiếp tục phát triển ở các trình độ cao hơn;<br>- Người học sau tốt nghiệp có năng lực tự học, tự cập nhật những tiến bộ khoa học công nghệ trong phạm vi ngành, nghề để nâng cao trình độ hoặc học liên thông lên trình độ cao hơn trong cùng ngành, nghề hoặc trong nhóm ngành, nghề hoặc trong cùng lĩnh vực đào tạo.</code> | <code>Khả năng học tập, nâng cao trình độ<br>- Khối lượng kiến thức tối thiểu, yêu cầu về năng lực mà người học phải đạt được sau khi tốt nghiệp ngành, nghề Quản trị dịch vụ giải trí, thể thao trình độ trung cấp có thể tiếp tục phát triển ở các trình độ cao hơn;<br>- Người học sau tốt nghiệp có năng lực tự học, tự cập nhật những tiến bộ khoa học công nghệ trong phạm vi ngành, nghề để nâng cao trình độ hoặc học liên thông lên trình độ cao hơn trong cùng ngành nghề hoặc trong nhóm ngành nghề hoặc trong cùng lĩnh vực đào tạo.</code> | <code>Đào tạo về quản lý, khai thác công trình thủy lợi<br>1. Các cơ sở đào tạo có chức năng, nhiệm vụ, năng lực phù hợp được tổ chức các khóa đào tạo, đào tạo lại, bồi dưỡng nâng cao năng lực, nghiệp vụ cho các đối tượng làm công tác quản lý, khai thác công trình thủy lợi, quản lý đập.<br>2. Bộ Nông nghiệp và Phát triển nông thôn xây dựng, ban hành chương trình, kế hoạch đào tạo, bồi dưỡng cho các đối tượng làm công tác quản lý, khai thác công trình thủy lợi, quản lý đập làm cơ sở để các cơ sở đào tạo và địa phương tổ chức triển khai thực hiện.</code> | <code>Điều 12. Trách nhiệm tuân thủ yêu cầu năng lực trong khai thác công trình thủy lợi<br>1. Tổ chức, cá nhân tham gia khai thác công trình thủy lợi phải có năng lực phù hợp với quy mô, yêu cầu kỹ thuật của công trình theo quy định của Nghị định này; chịu trách nhiệm trước pháp luật về những hậu quả, thiệt hại do việc không bảo đảm các yêu cầu về năng lực gây ra.<br>2. Định kỳ 05 năm, cá nhân trực tiếp làm nhiệm vụ quản lý, vận hành công trình thủy lợi, quản lý vận hành đập, hồ chứa nước phải tham gia lớp đào tạo, bồi dưỡng nghiệp vụ nâng cao năng lực quản lý, vận hành công trình thủy lợi, quản lý, vận hành đập, hồ chứa nước.<br>3. Đối với các tổ chức được giao khai thác nhiều loại hình công trình đầu mối, số lượng cán bộ, công nhân khai thác công trình thủy lợi theo yêu cầu quy định về đảm bảo năng lực phải tăng lên tương ứng.<br>4. Ngoài việc đáp ứng yêu cầu năng lực quy định tại Nghị định này, tổ chức, cá nhân khai thác công trình thủy lợi có sản xuất, kinh doanh hoạt động khác phải bảo đảm yêu cầu năng lực đối với ngành nghề kinh doanh đó theo quy định của pháp luật có liên quan.<br>5. 
Cơ quan chuyên môn quản lý nhà nước về thủy lợi kiểm tra, giám sát việc thực hiện quy định về năng lực đối với các tổ chức, cá nhân khai thác công trình thủy lợi quy định tại Nghị định này.</code> |
| <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật được quy định thế nào?</code> | <code>Nội dung lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>Trong phạm vi điều chỉnh của văn bản quy phạm pháp luật:<br>1. Xác định nội dung liên quan đến vấn đề bình đẳng giới hoặc vấn đề bất bình đẳng giới, phân biệt đối xử về giới.<br>2. Quy định các biện pháp cần thiết để thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới; dự báo tác động của các quy định đó đối với nam và nữ sau khi được ban hành.<br>3. Xác định nguồn nhân lực, tài chính cần thiết để triển khai các biện pháp thực hiện bình đẳng giới hoặc để giải quyết vấn đề bất bình đẳng giới, phân biệt đối xử về giới.</code> | <code>"Điều 21. Lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>1. Lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật bao gồm:<br>a) Xác định vấn đề giới và các biện pháp giải quyết trong lĩnh vực mà văn bản quy phạm pháp luật điều chỉnh;<br>b) Dự báo tác động của các quy định trong văn bản quy phạm pháp luật khi được ban hành đối với nữ và nam;<br>c) Xác định trách nhiệm và nguồn lực để giải quyết các vấn đề giới trong phạm vi văn bản quy phạm pháp luật điều chỉnh.<br>2. Cơ quan chủ trì soạn thảo văn bản quy phạm pháp luật có trách nhiệm lồng ghép vấn đề bình đẳng giới, chuẩn bị báo cáo việc lồng ghép vấn đề bình đẳng giới vào quá trình xây dựng văn bản quy phạm pháp luật theo các nội dung quy định tại khoản 1 Điều này và phụ lục thông tin, số liệu về giới có liên quan đến dự án, dự thảo văn bản quy phạm pháp luật.<br>3. Cơ quan thẩm định văn bản quy phạm pháp luật có trách nhiệm phối hợp với cơ quan quản lý nhà nước về bình đẳng giới đánh giá việc lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật. Nội dung đánh giá bao gồm:<br>a) Xác định vấn đề giới trong dự án, dự thảo;<br>b) Việc bảo đảm các nguyên tắc cơ bản về bình đẳng giới trong dự án, dự thảo;<br>c) Tính khả thi của việc giải quyết vấn đề giới được điều chỉnh trong dự án, dự thảo;<br>d) Việc thực hiện lồng ghép vấn đề bình đẳng giới trong xây dựng dự án, dự thảo theo các nội dung quy định tại khoản 1 Điều này.<br>4. Chính phủ quy định việc thực hiện lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật."</code> | <code>Yêu cầu và phạm vi lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>1. Bảo đảm các nguyên tắc cơ bản về bình đẳng giới trong nội dung, trình tự, thủ tục soạn thảo, ban hành, rà soát, hệ thống hóa văn bản quy phạm pháp luật theo quy định của Luật Ban hành văn bản quy phạm pháp luật và Luật Ban hành văn bản quy phạm pháp luật của Hội đồng nhân dân, Ủy ban nhân dân.<br>2. Lồng ghép vấn đề bình đẳng giới được áp dụng đối với các dự thảo văn bản quy phạm pháp luật được xác định có nội dung liên quan đến bình đẳng giới hoặc có vấn đề bất bình đẳng giới, phân biệt đối xử về giới trong phạm vi điều chỉnh của văn bản.</code> | <code>Nguyên tắc lồng ghép vấn đề bình đẳng giới trong xây dựng văn bản quy phạm pháp luật<br>1. Lồng ghép vấn đề bình đẳng giới được thực hiện trong toàn bộ quy trình xây dựng văn bản quy phạm pháp luật.<br>2. Bảo đảm không làm phát sinh bất bình đẳng giới, bảo đảm quyền của mỗi giới trong nội dung, trình tự, thủ tục soạn thảo, ban hành văn bản theo quy định.<br>3. Bảo đảm sự tham gia của cơ quan lao động, thương binh và xã hội, Hội liên hiệp phụ nữ Việt Nam. Huy động sự tham gia của Mặt trận Tổ quốc Việt Nam và các tổ chức thành viên, các tổ chức, cá nhân có liên quan theo quy định của pháp luật.</code> |
| <code>Sản phẩm phần mềm có được hưởng ưu đãi về thời gian miễn thuế, giảm thuế hay không? Nếu được thì trong vòng bao nhiêu năm?</code> | <code>"Điều 20. Ưu đãi về thời gian miễn thuế, giảm thuế<br>1. Miễn thuế bốn năm, giảm 50% số thuế phải nộp trong chín năm tiếp theo đối với:<br>a) Thu nhập của doanh nghiệp từ thực hiện dự án đầu tư quy định tại khoản 1 Điều 19 Thông tư số 78/2014/TT-BTC (được sửa đổi, bổ sung tại Khoản 1 Điều 11 Thông tư này)." </code> | <code>Mục I. ƯU ĐÃI THUẾ THU NHẬP DOANH NGHIỆP<br>1. Doanh nghiệp phần mềm mới thành lập được hưởng thuế suất thuế thu nhập doanh nghiệp 10% trong 15 năm, kể từ khi doanh nghiệp phần mềm mới thành lập bắt đầu hoạt động kinh doanh.<br>2. Doanh nghiệp phần mềm mới thành lập được miễn thuế thu nhập doanh nghiệp 04 năm, kể từ khi có thu nhập chịu thuế và được giảm 50% số thuế phải nộp trong 09 năm tiếp theo.<br>3. Doanh nghiệp phần mềm đã được cấp Giấy phép đầu tư hoặc Giấy chứng nhận ưu đãi đầu tư thì tiếp tục được hưởng ưu đãi về thuế thu nhập doanh nghiệp đã ghi trong Giấy phép đầu tư hoặc Giấy chứng nhận ưu đãi đầu tư. Trường hợp mức ưu đãi về thuế thu nhập doanh nghiệp (bao gồm cả thuế suất ưu đãi và thời gian miễn thuế, giảm thuế) ghi trong Giấy phép đầu tư, Giấy chứng nhận ưu đãi đầu tư thấp hơn mức ưu đãi theo hướng dẫn tại điểm 1 và 2, Mục I, Phần B, Thông tư này thì doanh nghiệp phần mềm có quyền lựa chọn hưởng các ưu đãi về thuế thu nhập doanh nghiệp theo hướng dẫn tại điểm 1 và 2, Mục I, Phần B, Thông tư này cho thời gian ưu đãi còn lại.<br>4. Đối với doanh nghiệp phần mềm có sản xuất kinh doanh hàng hoá, dịch vụ khác như: sản xuất lắp ráp máy vi tính, thiết bị điện tử, kinh doanh máy móc thiết bị..., doanh nghiệp phải tổ chức hạch toán riêng doanh thu, chi phí và thu nhập của hoạt động sản xuất sản phẩm và dịch vụ phần mềm để xác định số thuế thu nhập doanh nghiệp được hưởng ưu đãi thuế thu nhập doanh nghiệp. Trường hợp doanh nghiệp không hạch toán riêng được thì thu nhập từ hoạt động sản xuất sản phẩm và dịch vụ phần mềm được xác định theo tỷ lệ giữa doanh thu hoạt động sản xuất sản phẩm và dịch vụ phần mềm so với tổng doanh thu của doanh nghiệp.</code> | <code>Ưu tiên phát triển công nghiệp phần mềm, công nghiệp nội dung<br>1. Nhà nước áp dụng mức ưu đãi cao nhất cho các tổ chức, cá nhân tham gia hoạt động công nghiệp phần mềm, công nghiệp nội dung theo quy định của pháp luật, bao gồm:<br>a) Các tổ chức, cá nhân đầu tư sản xuất, kinh doanh phần mềm; sản xuất sản phẩm nội dung thông tin số được hưởng chế độ ưu đãi về thuế theo quy định của pháp luật thuế và ưu đãi trong việc sử dụng đất;<br>b) Các sản phẩm phần mềm và nội dung thông tin số được sản xuất tại Việt Nam và các dịch vụ phần mềm do các tổ chức, doanh nghiệp thuộc mọi thành phần kinh tế hoạt động tại Việt Nam cung cấp được áp dụng mức ưu đãi cao nhất về thuế giá trị gia tăng và thuế xuất khẩu.<br>2. Trong trường hợp tổ chức, doanh nghiệp ngoài việc tham gia hoạt động công nghiệp phần mềm, công nghiệp nội đung còn tham gia nhiều loại hình hoạt động khác thì chỉ được hưởng các chính sách ưu đãi quy định tại Điều này đối với các hoạt động sản xuất, cung cấp sản phẩm, dịch vụ phần mềm; sản xuất sản phẩm nội dung thông tin số.</code> | <code>1. Miễn thuế bốn năm, giảm 50% số thuế phải nộp trong chín năm tiếp theo đối với: <br>a) Thu nhập của doanh nghiệp từ thực hiện dự án đầu tư quy định tại khoản 1 Điều 19 Thông tư số 78/2014/TT-BTC (được sửa đổi, bổ sung tại Khoản 1 Điều 11 Thông tư này)<br>b) Thu nhập của doanh nghiệp từ thực hiện dự án đầu tư mới trong lĩnh vực xã hội hóa thực hiện tại địa bàn có điều kiện kinh tế - xã hội khó khăn hoặc đặc biệt khó khăn quy định tại Phụ lục ban hành kèm theo Nghị định số 218/2013/NĐ-CP .<br>2. Miễn thuế bốn năm, giảm 50% số thuế phải nộp trong năm năm tiếp theo đối với thu nhập của doanh nghiệp từ thực hiện dự án đầu tư mới trong lĩnh vực xã hội hóa thực hiện tại địa bàn không thuộc danh mục địa bàn có điều kiện kinh tế - xã hội khó khăn hoặc đặc biệt khó khăn quy định tại Phụ lục ban hành kèm theo Nghị định số 218/2013/NĐ-CP của Chính phủ.<br>...<br>4. Thời gian miễn thuế, giảm thuế quy định tại Điều này được tính liên tục từ năm đầu tiên doanh nghiệp có thu nhập chịu thuế từ dự án đầu tư mới được hưởng ưu đãi thuế. Trường hợp doanh nghiệp không có thu nhập chịu thuế trong ba năm đầu, kể từ năm đầu tiên có doanh thu từ dự án đầu tư mới thì thời gian miễn thuế, giảm thuế được tính từ năm thứ tư dự án đầu tư mới phát sinh doanh thu. <br>Ví dụ: Năm 2014, doanh nghiệp A có dự án đầu tư mới sản xuất sản phẩm phần mềm, nếu năm 2014 doanh nghiệp A đã có thu nhập chịu thuế từ dự án sản xuất sản phẩm phần mềm thì thời gian miễn giảm thuế được tính liên tục kể từ năm 2014. Trường hợp dự án đầu tư mới sản xuất sản phẩm phần mềm của doanh nghiệp A phát sinh doanh thu từ năm 2014, đến năm 2016 dự án đầu tư mới của doanh nghiệp A vẫn chưa có thu nhập chịu thuế thì thời gian miễn giảm thuế được tính liên tục kể từ năm 2017. <br>Thời gian miễn thuế, giảm thuế đối với doanh nghiệp công nghệ cao, doanh nghiệp nông nghiệp ứng dụng công nghệ cao theo quy định nêu trên được tính từ năm được cấp Giấy chứng nhận công nhận là doanh nghiệp công nghệ cao, doanh nghiệp nông nghiệp ứng dụng công nghệ cao"</code> |
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
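For reference, a minimal sketch of how this loss could be instantiated with the parameters above (assuming sentence-transformers v3+; the base model name and `mini_batch_size` below are placeholders, not values taken from this card):

```python
from sentence_transformers import SentenceTransformer, losses

# Placeholder base model, not the actual checkpoint behind this card.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Cosine similarities (the default similarity_fct is cos_sim, matching the
# JSON above) are multiplied by scale=20.0 before the in-batch cross-entropy.
# The "cached" variant chunks the batch through a gradient cache so large
# batch sizes (here 128) fit in GPU memory.
loss = losses.CachedMultipleNegativesRankingLoss(
    model=model,
    scale=20.0,          # matches the "scale" parameter above
    mini_batch_size=32,  # assumption: chunk size for the gradient cache
)
```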
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `num_train_epochs`: 10
- `warmup_ratio`: 0.05
- `bf16`: True
- `load_best_model_at_end`: True
- `batch_sampler`: no_duplicates
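A minimal sketch of how these non-default values could be passed to the trainer's argument object (assuming sentence-transformers v3+; `output_dir` is a placeholder):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    num_train_epochs=10,
    warmup_ratio=0.05,
    bf16=True,
    load_best_model_at_end=True,
    # no_duplicates keeps repeated texts out of a batch, so the in-batch
    # negatives used by the ranking loss are not accidental positives.
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```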
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
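Putting the pieces together, a self-contained, hedged sketch of the training wiring (the toy rows, model name, and paths are illustrative only; evaluation wiring is omitted for brevity):

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.training_args import BatchSamplers

# Toy triplet rows; the real dataset pairs legal questions with relevant
# and non-relevant statute passages, as in the samples above.
train_dataset = Dataset.from_dict({
    "anchor": ["Câu hỏi pháp lý?"],
    "positive": ["Đoạn luật liên quan."],
    "negative": ["Đoạn luật không liên quan."],
})

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder
loss = losses.CachedMultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="output",
    per_device_train_batch_size=128,
    num_train_epochs=10,
    warmup_ratio=0.05,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```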
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | cosine_map@10 |
|:------:|:----:|:-------------:|:-------------:|
| 0.0010 | 1 | 1.3548 | - |
| 0.0019 | 2 | 1.3278 | - |
| 0.0029 | 3 | 1.5714 | - |
| 0.0039 | 4 | 1.688 | - |
| 0.0048 | 5 | 1.321 | - |
| 0.0058 | 6 | 1.5785 | - |
| 0.0068 | 7 | 1.4 | - |
| 0.0078 | 8 | 1.1968 | - |
| 0.0087 | 9 | 1.2794 | - |
| 0.0097 | 10 | 1.4283 | - |
| 0.0107 | 11 | 1.2814 | - |
| 0.0116 | 12 | 1.1217 | - |
| 0.0126 | 13 | 1.0995 | - |
| 0.0136 | 14 | 1.2857 | - |
| 0.0145 | 15 | 1.6785 | - |
| 0.0155 | 16 | 1.2856 | - |
| 0.0165 | 17 | 1.588 | - |
| 0.0175 | 18 | 1.4281 | - |
| 0.0184 | 19 | 1.5265 | - |
| 0.0194 | 20 | 1.5657 | - |
| 0.0204 | 21 | 1.2795 | - |
| 0.0213 | 22 | 1.3395 | - |
| 0.0223 | 23 | 1.3997 | - |
| 0.0233 | 24 | 1.5981 | - |
| 0.0242 | 25 | 1.3228 | - |
| 0.0252 | 26 | 1.9005 | - |
| 0.0262 | 27 | 1.4483 | - |
| 0.0272 | 28 | 1.4254 | - |
| 0.0281 | 29 | 1.3475 | - |
| 0.0291 | 30 | 1.2793 | - |
| 0.0301 | 31 | 1.5486 | - |
| 0.0310 | 32 | 1.6218 | - |
| 0.0320 | 33 | 1.3148 | - |
| 0.0330 | 34 | 1.5519 | - |
| 0.0339 | 35 | 1.3749 | - |
| 0.0349 | 36 | 1.5245 | - |
| 0.0359 | 37 | 1.2491 | - |
| 0.0369 | 38 | 1.2252 | - |
| 0.0378 | 39 | 1.4935 | - |
| 0.0388 | 40 | 1.4856 | - |
| 0.0398 | 41 | 1.2697 | - |
| 0.0407 | 42 | 1.2835 | - |
| 0.0417 | 43 | 1.5985 | - |
| 0.0427 | 44 | 1.3632 | - |
| 0.0436 | 45 | 1.377 | - |
| 0.0446 | 46 | 1.5715 | - |
| 0.0456 | 47 | 1.2594 | - |
| 0.0466 | 48 | 1.3104 | - |
| 0.0475 | 49 | 1.5241 | - |
| 0.0485 | 50 | 1.6861 | - |
| 0.0495 | 51 | 1.6746 | - |
| 0.0504 | 52 | 1.3369 | - |
| 0.0514 | 53 | 1.232 | - |
| 0.0524 | 54 | 1.3837 | - |
| 0.0533 | 55 | 1.3185 | - |
| 0.0543 | 56 | 1.4374 | - |
| 0.0553 | 57 | 1.7459 | - |
| 0.0563 | 58 | 1.5598 | - |
| 0.0572 | 59 | 1.5001 | - |
| 0.0582 | 60 | 1.5655 | - |
| 0.0592 | 61 | 1.5307 | - |
| 0.0601 | 62 | 1.3772 | - |
| 0.0611 | 63 | 1.2658 | - |
| 0.0621 | 64 | 1.3971 | - |
| 0.0630 | 65 | 1.3668 | - |
| 0.0640 | 66 | 1.5259 | - |
| 0.0650 | 67 | 1.2111 | - |
| 0.0660 | 68 | 1.2598 | - |
| 0.0669 | 69 | 1.2429 | - |
| 0.0679 | 70 | 1.3428 | - |
| 0.0689 | 71 | 1.115 | - |
| 0.0698 | 72 | 1.1894 | - |
| 0.0708 | 73 | 1.4045 | - |
| 0.0718 | 74 | 1.3301 | - |
| 0.0727 | 75 | 1.1178 | - |
| 0.0737 | 76 | 1.1563 | - |
| 0.0747 | 77 | 1.5223 | - |
| 0.0757 | 78 | 1.0415 | - |
| 0.0766 | 79 | 1.3928 | - |
| 0.0776 | 80 | 1.5593 | - |
| 0.0786 | 81 | 1.1636 | - |
| 0.0795 | 82 | 1.1725 | - |
| 0.0805 | 83 | 1.1873 | - |
| 0.0815 | 84 | 1.5505 | - |
| 0.0824 | 85 | 1.2596 | - |
| 0.0834 | 86 | 1.3247 | - |
| 0.0844 | 87 | 1.1035 | - |
| 0.0854 | 88 | 1.2047 | - |
| 0.0863 | 89 | 1.2387 | - |
| 0.0873 | 90 | 1.2032 | - |
| 0.0883 | 91 | 1.2809 | - |
| 0.0892 | 92 | 1.3073 | - |
| 0.0902 | 93 | 1.2669 | - |
| 0.0912 | 94 | 1.5633 | - |
| 0.0921 | 95 | 1.4179 | - |
| 0.0931 | 96 | 1.2724 | - |
| 0.0941 | 97 | 1.262 | - |
| 0.0951 | 98 | 1.3759 | - |
| 0.0960 | 99 | 1.1443 | - |
| 0.0970 | 100 | 1.1694 | 0.7078 |
| 0.0980 | 101 | 1.449 | - |
| 0.0989 | 102 | 1.0755 | - |
| 0.0999 | 103 | 1.2922 | - |
| 0.1009 | 104 | 1.2039 | - |
| 0.1018 | 105 | 1.2379 | - |
| 0.1028 | 106 | 1.2626 | - |
| 0.1038 | 107 | 1.3253 | - |
| 0.1048 | 108 | 1.3196 | - |
| 0.1057 | 109 | 1.0833 | - |
| 0.1067 | 110 | 1.3256 | - |
| 0.1077 | 111 | 1.3405 | - |
| 0.1086 | 112 | 1.1855 | - |
| 0.1096 | 113 | 1.0964 | - |
| 0.1106 | 114 | 1.2231 | - |
| 0.1115 | 115 | 1.1934 | - |
| 0.1125 | 116 | 1.2604 | - |
| 0.1135 | 117 | 1.2031 | - |
| 0.1145 | 118 | 1.1722 | - |
| 0.1154 | 119 | 1.3981 | - |
| 0.1164 | 120 | 1.2969 | - |
| 0.1174 | 121 | 1.2744 | - |
| 0.1183 | 122 | 1.1505 | - |
| 0.1193 | 123 | 0.9585 | - |
| 0.1203 | 124 | 1.1577 | - |
| 0.1212 | 125 | 1.3243 | - |
| 0.1222 | 126 | 1.2497 | - |
| 0.1232 | 127 | 1.1402 | - |
| 0.1242 | 128 | 1.131 | - |
| 0.1251 | 129 | 1.2245 | - |
| 0.1261 | 130 | 1.2254 | - |
| 0.1271 | 131 | 1.3789 | - |
| 0.1280 | 132 | 1.2799 | - |
| 0.1290 | 133 | 1.1926 | - |
| 0.1300 | 134 | 0.9283 | - |
| 0.1309 | 135 | 1.2868 | - |
| 0.1319 | 136 | 1.2746 | - |
| 0.1329 | 137 | 1.0359 | - |
| 0.1339 | 138 | 1.0168 | - |
| 0.1348 | 139 | 1.2419 | - |
| 0.1358 | 140 | 1.1359 | - |
| 0.1368 | 141 | 1.0765 | - |
| 0.1377 | 142 | 1.2262 | - |
| 0.1387 | 143 | 0.9495 | - |
| 0.1397 | 144 | 0.9723 | - |
| 0.1406 | 145 | 1.1321 | - |
| 0.1416 | 146 | 1.184 | - |
| 0.1426 | 147 | 1.2128 | - |
| 0.1435 | 148 | 1.1424 | - |
| 0.1445 | 149 | 1.0847 | - |
| 0.1455 | 150 | 1.3908 | - |
| 0.1465 | 151 | 1.4538 | - |
| 0.1474 | 152 | 0.9631 | - |
| 0.1484 | 153 | 1.3216 | - |
| 0.1494 | 154 | 1.2703 | - |
| 0.1503 | 155 | 1.1631 | - |
| 0.1513 | 156 | 1.141 | - |
| 0.1523 | 157 | 1.2933 | - |
| 0.1532 | 158 | 1.1562 | - |
| 0.1542 | 159 | 1.0216 | - |
| 0.1552 | 160 | 1.2608 | - |
| 0.1562 | 161 | 1.2552 | - |
| 0.1571 | 162 | 1.2438 | - |
| 0.1581 | 163 | 1.2391 | - |
| 0.1591 | 164 | 1.3711 | - |
| 0.1600 | 165 | 1.153 | - |
| 0.1610 | 166 | 1.3028 | - |
| 0.1620 | 167 | 1.1986 | - |
| 0.1629 | 168 | 1.2984 | - |
| 0.1639 | 169 | 1.1317 | - |
| 0.1649 | 170 | 1.1621 | - |
| 0.1659 | 171 | 1.2129 | - |
| 0.1668 | 172 | 1.2324 | - |
| 0.1678 | 173 | 1.2268 | - |
| 0.1688 | 174 | 1.1136 | - |
| 0.1697 | 175 | 1.0994 | - |
| 0.1707 | 176 | 1.1727 | - |
| 0.1717 | 177 | 1.0041 | - |
| 0.1726 | 178 | 0.8898 | - |
| 0.1736 | 179 | 1.3132 | - |
| 0.1746 | 180 | 1.1036 | - |
| 0.1756 | 181 | 1.1042 | - |
| 0.1765 | 182 | 1.2623 | - |
| 0.1775 | 183 | 1.104 | - |
| 0.1785 | 184 | 1.2267 | - |
| 0.1794 | 185 | 1.2608 | - |
| 0.1804 | 186 | 1.2426 | - |
| 0.1814 | 187 | 1.0772 | - |
| 0.1823 | 188 | 1.0582 | - |
| 0.1833 | 189 | 1.009 | - |
| 0.1843 | 190 | 1.1663 | - |
| 0.1853 | 191 | 1.0357 | - |
| 0.1862 | 192 | 1.1134 | - |
| 0.1872 | 193 | 1.0779 | - |
| 0.1882 | 194 | 1.0048 | - |
| 0.1891 | 195 | 1.0197 | - |
| 0.1901 | 196 | 1.4021 | - |
| 0.1911 | 197 | 1.3717 | - |
| 0.1920 | 198 | 1.3017 | - |
| 0.1930 | 199 | 1.0689 | - |
| 0.1940 | 200 | 1.1656 | 0.7364 |
| 0.1950 | 201 | 0.9706 | - |
| 0.1959 | 202 | 1.1652 | - |
| 0.1969 | 203 | 1.3547 | - |
| 0.1979 | 204 | 1.4233 | - |
| 0.1988 | 205 | 1.0874 | - |
| 0.1998 | 206 | 1.2017 | - |
| 0.2008 | 207 | 1.1905 | - |
| 0.2017 | 208 | 1.0878 | - |
| 0.2027 | 209 | 1.2459 | - |
| 0.2037 | 210 | 1.0648 | - |
| 0.2047 | 211 | 1.1462 | - |
| 0.2056 | 212 | 1.2667 | - |
| 0.2066 | 213 | 1.0226 | - |
| 0.2076 | 214 | 1.1385 | - |
| 0.2085 | 215 | 1.3379 | - |
| 0.2095 | 216 | 1.2006 | - |
| 0.2105 | 217 | 1.1559 | - |
| 0.2114 | 218 | 1.2308 | - |
| 0.2124 | 219 | 1.2824 | - |
| 0.2134 | 220 | 1.0961 | - |
| 0.2144 | 221 | 1.2229 | - |
| 0.2153 | 222 | 1.251 | - |
| 0.2163 | 223 | 1.0761 | - |
| 0.2173 | 224 | 1.3629 | - |
| 0.2182 | 225 | 0.9135 | - |
| 0.2192 | 226 | 1.1653 | - |
| 0.2202 | 227 | 1.109 | - |
| 0.2211 | 228 | 1.0082 | - |
| 0.2221 | 229 | 1.4515 | - |
| 0.2231 | 230 | 1.3488 | - |
| 0.2241 | 231 | 1.2158 | - |
| 0.2250 | 232 | 1.2533 | - |
| 0.2260 | 233 | 1.0845 | - |
| 0.2270 | 234 | 1.2498 | - |
| 0.2279 | 235 | 1.106 | - |
| 0.2289 | 236 | 1.2398 | - |
| 0.2299 | 237 | 1.1755 | - |
| 0.2308 | 238 | 1.0101 | - |
| 0.2318 | 239 | 1.0436 | - |
| 0.2328 | 240 | 1.0812 | - |
| 0.2338 | 241 | 1.1255 | - |
| 0.2347 | 242 | 1.104 | - |
| 0.2357 | 243 | 1.1667 | - |
| 0.2367 | 244 | 0.9677 | - |
| 0.2376 | 245 | 1.1672 | - |
| 0.2386 | 246 | 1.1139 | - |
| 0.2396 | 247 | 0.9859 | - |
| 0.2405 | 248 | 1.2074 | - |
| 0.2415 | 249 | 0.9183 | - |
| 0.2425 | 250 | 1.1708 | - |
| 0.2435 | 251 | 1.1623 | - |
| 0.2444 | 252 | 1.0374 | - |
| 0.2454 | 253 | 1.0173 | - |
| 0.2464 | 254 | 1.2728 | - |
| 0.2473 | 255 | 1.1842 | - |
| 0.2483 | 256 | 1.3049 | - |
| 0.2493 | 257 | 1.121 | - |
| 0.2502 | 258 | 1.04 | - |
| 0.2512 | 259 | 1.0904 | - |
| 0.2522 | 260 | 1.135 | - |
| 0.2532 | 261 | 1.1586 | - |
| 0.2541 | 262 | 1.2567 | - |
| 0.2551 | 263 | 1.1263 | - |
| 0.2561 | 264 | 0.9714 | - |
| 0.2570 | 265 | 1.1639 | - |
| 0.2580 | 266 | 1.2079 | - |
| 0.2590 | 267 | 1.2435 | - |
| 0.2599 | 268 | 1.3252 | - |
| 0.2609 | 269 | 1.3408 | - |
| 0.2619 | 270 | 1.1008 | - |
| 0.2629 | 271 | 1.0997 | - |
| 0.2638 | 272 | 1.1025 | - |
| 0.2648 | 273 | 1.1626 | - |
| 0.2658 | 274 | 1.2769 | - |
| 0.2667 | 275 | 1.1397 | - |
| 0.2677 | 276 | 1.0249 | - |
| 0.2687 | 277 | 1.1837 | - |
| 0.2696 | 278 | 1.167 | - |
| 0.2706 | 279 | 1.197 | - |
| 0.2716 | 280 | 1.3918 | - |
| 0.2726 | 281 | 0.9473 | - |
| 0.2735 | 282 | 1.0996 | - |
| 0.2745 | 283 | 1.0829 | - |
| 0.2755 | 284 | 1.0333 | - |
| 0.2764 | 285 | 0.9612 | - |
| 0.2774 | 286 | 1.1824 | - |
| 0.2784 | 287 | 1.0552 | - |
| 0.2793 | 288 | 1.099 | - |
| 0.2803 | 289 | 0.9097 | - |
| 0.2813 | 290 | 0.9924 | - |
| 0.2823 | 291 | 0.9563 | - |
| 0.2832 | 292 | 1.0526 | - |
| 0.2842 | 293 | 1.0729 | - |
| 0.2852 | 294 | 1.0283 | - |
| 0.2861 | 295 | 1.024 | - |
| 0.2871 | 296 | 1.1101 | - |
| 0.2881 | 297 | 1.2648 | - |
| 0.2890 | 298 | 1.132 | - |
| 0.2900 | 299 | 1.0156 | - |
| 0.2910 | 300 | 1.0502 | 0.7654 |
| 0.2919 | 301 | 0.9857 | - |
| 0.2929 | 302 | 1.1165 | - |
| 0.2939 | 303 | 1.1947 | - |
| 0.2949 | 304 | 1.3046 | - |
| 0.2958 | 305 | 1.183 | - |
| 0.2968 | 306 | 1.3003 | - |
| 0.2978 | 307 | 1.1527 | - |
| 0.2987 | 308 | 1.0434 | - |
| 0.2997 | 309 | 0.9812 | - |
| 0.3007 | 310 | 1.0613 | - |
| 0.3016 | 311 | 1.1769 | - |
| 0.3026 | 312 | 0.8909 | - |
| 0.3036 | 313 | 1.06 | - |
| 0.3046 | 314 | 1.4242 | - |
| 0.3055 | 315 | 1.1372 | - |
| 0.3065 | 316 | 1.2208 | - |
| 0.3075 | 317 | 1.2163 | - |
| 0.3084 | 318 | 1.1843 | - |
| 0.3094 | 319 | 1.1612 | - |
| 0.3104 | 320 | 1.0604 | - |
| 0.3113 | 321 | 0.8697 | - |
| 0.3123 | 322 | 1.0984 | - |
| 0.3133 | 323 | 1.0234 | - |
| 0.3143 | 324 | 1.0793 | - |
| 0.3152 | 325 | 1.023 | - |
| 0.3162 | 326 | 0.9799 | - |
| 0.3172 | 327 | 1.1197 | - |
| 0.3181 | 328 | 0.9841 | - |
| 0.3191 | 329 | 1.4618 | - |
| 0.3201 | 330 | 0.905 | - |
| 0.3210 | 331 | 1.0484 | - |
| 0.3220 | 332 | 1.142 | - |
| 0.3230 | 333 | 1.1191 | - |
| 0.3240 | 334 | 1.1052 | - |
| 0.3249 | 335 | 1.0759 | - |
| 0.3259 | 336 | 1.105 | - |
| 0.3269 | 337 | 1.1268 | - |
| 0.3278 | 338 | 0.9781 | - |
| 0.3288 | 339 | 1.0858 | - |
| 0.3298 | 340 | 1.1341 | - |
| 0.3307 | 341 | 0.9523 | - |
| 0.3317 | 342 | 1.0239 | - |
| 0.3327 | 343 | 1.019 | - |
| 0.3337 | 344 | 0.9879 | - |
| 0.3346 | 345 | 1.0123 | - |
| 0.3356 | 346 | 1.0628 | - |
| 0.3366 | 347 | 1.1316 | - |
| 0.3375 | 348 | 1.17 | - |
| 0.3385 | 349 | 1.2409 | - |
| 0.3395 | 350 | 1.0749 | - |
| 0.3404 | 351 | 1.1431 | - |
| 0.3414 | 352 | 1.0659 | - |
| 0.3424 | 353 | 1.0466 | - |
| 0.3434 | 354 | 1.1462 | - |
| 0.3443 | 355 | 1.175 | - |
| 0.3453 | 356 | 1.224 | - |
| 0.3463 | 357 | 0.8952 | - |
| 0.3472 | 358 | 1.0089 | - |
| 0.3482 | 359 | 0.9993 | - |
| 0.3492 | 360 | 0.9882 | - |
| 0.3501 | 361 | 0.7859 | - |
| 0.3511 | 362 | 1.0453 | - |
| 0.3521 | 363 | 0.9472 | - |
| 0.3531 | 364 | 1.2126 | - |
| 0.3540 | 365 | 1.1988 | - |
| 0.3550 | 366 | 0.9887 | - |
| 0.3560 | 367 | 1.1496 | - |
| 0.3569 | 368 | 1.0429 | - |
| 0.3579 | 369 | 1.0068 | - |
| 0.3589 | 370 | 1.038 | - |
| 0.3598 | 371 | 1.198 | - |
| 0.3608 | 372 | 1.1827 | - |
| 0.3618 | 373 | 0.9925 | - |
| 0.3628 | 374 | 0.9952 | - |
| 0.3637 | 375 | 1.1337 | - |
| 0.3647 | 376 | 1.1322 | - |
| 0.3657 | 377 | 1.2632 | - |
| 0.3666 | 378 | 1.2663 | - |
| 0.3676 | 379 | 1.3192 | - |
| 0.3686 | 380 | 0.9937 | - |
| 0.3695 | 381 | 1.068 | - |
| 0.3705 | 382 | 0.8849 | - |
| 0.3715 | 383 | 1.0335 | - |
| 0.3725 | 384 | 1.2943 | - |
| 0.3734 | 385 | 1.0779 | - |
| 0.3744 | 386 | 1.0837 | - |
| 0.3754 | 387 | 1.1756 | - |
| 0.3763 | 388 | 1.0215 | - |
| 0.3773 | 389 | 0.9023 | - |
| 0.3783 | 390 | 0.9316 | - |
| 0.3792 | 391 | 0.9476 | - |
| 0.3802 | 392 | 1.0946 | - |
| 0.3812 | 393 | 1.1369 | - |
| 0.3822 | 394 | 1.0125 | - |
| 0.3831 | 395 | 0.9411 | - |
| 0.3841 | 396 | 1.3418 | - |
| 0.3851 | 397 | 1.0482 | - |
| 0.3860 | 398 | 1.1375 | - |
| 0.3870 | 399 | 1.1676 | - |
| 0.3880 | 400 | 1.1086 | 0.7585 |
| 0.3889 | 401 | 0.9736 | - |
| 0.3899 | 402 | 1.022 | - |
| 0.3909 | 403 | 0.8769 | - |
| 0.3919 | 404 | 1.0097 | - |
| 0.3928 | 405 | 1.166 | - |
| 0.3938 | 406 | 1.0191 | - |
| 0.3948 | 407 | 0.8552 | - |
| 0.3957 | 408 | 0.9855 | - |
| 0.3967 | 409 | 0.9701 | - |
| 0.3977 | 410 | 0.9811 | - |
| 0.3986 | 411 | 1.109 | - |
| 0.3996 | 412 | 1.028 | - |
| 0.4006 | 413 | 0.9696 | - |
| 0.4016 | 414 | 0.9999 | - |
| 0.4025 | 415 | 1.1674 | - |
| 0.4035 | 416 | 1.0858 | - |
| 0.4045 | 417 | 1.0558 | - |
| 0.4054 | 418 | 1.1086 | - |
| 0.4064 | 419 | 0.9793 | - |
| 0.4074 | 420 | 1.0327 | - |
| 0.4083 | 421 | 1.0932 | - |
| 0.4093 | 422 | 1.1454 | - |
| 0.4103 | 423 | 0.9758 | - |
| 0.4113 | 424 | 1.0591 | - |
| 0.4122 | 425 | 1.0967 | - |
| 0.4132 | 426 | 1.044 | - |
| 0.4142 | 427 | 0.9475 | - |
| 0.4151 | 428 | 0.9888 | - |
| 0.4161 | 429 | 0.9394 | - |
| 0.4171 | 430 | 0.9903 | - |
| 0.4180 | 431 | 0.953 | - |
| 0.4190 | 432 | 0.9712 | - |
| 0.4200 | 433 | 1.0969 | - |
| 0.4210 | 434 | 1.0842 | - |
| 0.4219 | 435 | 0.8944 | - |
| 0.4229 | 436 | 0.7771 | - |
| 0.4239 | 437 | 1.3449 | - |
| 0.4248 | 438 | 1.1097 | - |
| 0.4258 | 439 | 1.2074 | - |
| 0.4268 | 440 | 0.9279 | - |
| 0.4277 | 441 | 1.2441 | - |
| 0.4287 | 442 | 0.9458 | - |
| 0.4297 | 443 | 1.0521 | - |
| 0.4306 | 444 | 1.0725 | - |
| 0.4316 | 445 | 1.0397 | - |
| 0.4326 | 446 | 1.2645 | - |
| 0.4336 | 447 | 1.2226 | - |
| 0.4345 | 448 | 1.2859 | - |
| 0.4355 | 449 | 1.0253 | - |
| 0.4365 | 450 | 0.9425 | - |
| 0.4374 | 451 | 1.0049 | - |
| 0.4384 | 452 | 1.0303 | - |
| 0.4394 | 453 | 1.2021 | - |
| 0.4403 | 454 | 0.9022 | - |
| 0.4413 | 455 | 1.26 | - |
| 0.4423 | 456 | 0.9062 | - |
| 0.4433 | 457 | 1.0418 | - |
| 0.4442 | 458 | 1.0206 | - |
| 0.4452 | 459 | 1.0049 | - |
| 0.4462 | 460 | 1.0927 | - |
| 0.4471 | 461 | 1.0166 | - |
| 0.4481 | 462 | 1.0161 | - |
| 0.4491 | 463 | 1.2102 | - |
| 0.4500 | 464 | 1.1485 | - |
| 0.4510 | 465 | 1.0363 | - |
| 0.4520 | 466 | 1.0765 | - |
| 0.4530 | 467 | 0.9655 | - |
| 0.4539 | 468 | 1.0744 | - |
| 0.4549 | 469 | 1.1902 | - |
| 0.4559 | 470 | 1.1392 | - |
| 0.4568 | 471 | 0.9389 | - |
| 0.4578 | 472 | 0.9925 | - |
| 0.4588 | 473 | 1.1502 | - |
| 0.4597 | 474 | 1.0407 | - |
| 0.4607 | 475 | 0.9539 | - |
| 0.4617 | 476 | 0.9028 | - |
| 0.4627 | 477 | 1.1409 | - |
| 0.4636 | 478 | 1.0653 | - |
| 0.4646 | 479 | 1.2056 | - |
| 0.4656 | 480 | 1.1113 | - |
| 0.4665 | 481 | 1.0278 | - |
| 0.4675 | 482 | 1.1144 | - |
| 0.4685 | 483 | 1.0295 | - |
| 0.4694 | 484 | 1.0195 | - |
| 0.4704 | 485 | 1.0378 | - |
| 0.4714 | 486 | 1.1373 | - |
| 0.4724 | 487 | 1.1413 | - |
| 0.4733 | 488 | 0.9819 | - |
| 0.4743 | 489 | 1.0604 | - |
| 0.4753 | 490 | 1.1592 | - |
| 0.4762 | 491 | 0.9545 | - |
| 0.4772 | 492 | 0.9767 | - |
| 0.4782 | 493 | 1.127 | - |
| 0.4791 | 494 | 1.0507 | - |
| 0.4801 | 495 | 0.9485 | - |
| 0.4811 | 496 | 0.8845 | - |
| 0.4821 | 497 | 1.0914 | - |
| 0.4830 | 498 | 0.9064 | - |
| 0.4840 | 499 | 0.9991 | - |
| 0.4850 | 500 | 1.0451 | 0.7634 |
| 0.4859 | 501 | 1.1429 | - |
| 0.4869 | 502 | 1.0411 | - |
| 0.4879 | 503 | 1.0788 | - |
| 0.4888 | 504 | 0.9389 | - |
| 0.4898 | 505 | 1.0161 | - |
| 0.4908 | 506 | 1.1338 | - |
| 0.4918 | 507 | 1.0242 | - |
| 0.4927 | 508 | 1.1131 | - |
| 0.4937 | 509 | 0.9932 | - |
| 0.4947 | 510 | 1.009 | - |
| 0.4956 | 511 | 0.9534 | - |
| 0.4966 | 512 | 1.0069 | - |
| 0.4976 | 513 | 1.2472 | - |
| 0.4985 | 514 | 1.122 | - |
| 0.4995 | 515 | 1.0673 | - |
| 0.5005 | 516 | 1.0541 | - |
| 0.5015 | 517 | 1.0598 | - |
| 0.5024 | 518 | 1.045 | - |
| 0.5034 | 519 | 1.0612 | - |
| 0.5044 | 520 | 0.9839 | - |
| 0.5053 | 521 | 0.951 | - |
| 0.5063 | 522 | 0.8447 | - |
| 0.5073 | 523 | 0.8521 | - |
| 0.5082 | 524 | 0.9845 | - |
| 0.5092 | 525 | 1.0788 | - |
| 0.5102 | 526 | 1.0702 | - |
| 0.5112 | 527 | 0.8576 | - |
| 0.5121 | 528 | 0.7558 | - |
| 0.5131 | 529 | 1.0439 | - |
| 0.5141 | 530 | 1.0969 | - |
| 0.5150 | 531 | 1.189 | - |
| 0.5160 | 532 | 0.9082 | - |
| 0.5170 | 533 | 1.1147 | - |
| 0.5179 | 534 | 1.1694 | - |
| 0.5189 | 535 | 0.8549 | - |
| 0.5199 | 536 | 1.1317 | - |
| 0.5209 | 537 | 1.114 | - |
| 0.5218 | 538 | 1.0999 | - |
| 0.5228 | 539 | 1.1403 | - |
| 0.5238 | 540 | 0.9576 | - |
| 0.5247 | 541 | 1.0732 | - |
| 0.5257 | 542 | 1.0934 | - |
| 0.5267 | 543 | 1.0736 | - |
| 0.5276 | 544 | 0.9017 | - |
| 0.5286 | 545 | 1.182 | - |
| 0.5296 | 546 | 1.0232 | - |
| 0.5306 | 547 | 1.0199 | - |
| 0.5315 | 548 | 1.203 | - |
| 0.5325 | 549 | 0.9899 | - |
| 0.5335 | 550 | 1.1647 | - |
| 0.5344 | 551 | 1.2184 | - |
| 0.5354 | 552 | 0.8703 | - |
| 0.5364 | 553 | 1.1434 | - |
| 0.5373 | 554 | 1.1051 | - |
| 0.5383 | 555 | 1.1752 | - |
| 0.5393 | 556 | 1.0698 | - |
| 0.5403 | 557 | 1.0935 | - |
| 0.5412 | 558 | 0.8637 | - |
| 0.5422 | 559 | 1.1394 | - |
| 0.5432 | 560 | 1.1028 | - |
| 0.5441 | 561 | 1.0835 | - |
| 0.5451 | 562 | 1.3288 | - |
| 0.5461 | 563 | 1.0221 | - |
| 0.5470 | 564 | 1.0331 | - |
| 0.5480 | 565 | 1.134 | - |
| 0.5490 | 566 | 0.8226 | - |
| 0.5500 | 567 | 1.0843 | - |
| 0.5509 | 568 | 1.1385 | - |
| 0.5519 | 569 | 0.9898 | - |
| 0.5529 | 570 | 0.98 | - |
| 0.5538 | 571 | 0.7958 | - |
| 0.5548 | 572 | 1.1537 | - |
| 0.5558 | 573 | 1.0687 | - |
| 0.5567 | 574 | 1.163 | - |
| 0.5577 | 575 | 0.8539 | - |
| 0.5587 | 576 | 1.0478 | - |
| 0.5597 | 577 | 1.0276 | - |
| 0.5606 | 578 | 1.0391 | - |
| 0.5616 | 579 | 1.0313 | - |
| 0.5626 | 580 | 1.1077 | - |
| 0.5635 | 581 | 0.9457 | - |
| 0.5645 | 582 | 1.2136 | - |
| 0.5655 | 583 | 1.0202 | - |
| 0.5664 | 584 | 1.0163 | - |
| 0.5674 | 585 | 1.2129 | - |
| 0.5684 | 586 | 0.9628 | - |
| 0.5694 | 587 | 1.0365 | - |
| 0.5703 | 588 | 1.0317 | - |
| 0.5713 | 589 | 0.9017 | - |
| 0.5723 | 590 | 0.8179 | - |
| 0.5732 | 591 | 0.9598 | - |
| 0.5742 | 592 | 0.9092 | - |
| 0.5752 | 593 | 1.0257 | - |
| 0.5761 | 594 | 0.9555 | - |
| 0.5771 | 595 | 1.0875 | - |
| 0.5781 | 596 | 1.1334 | - |
| 0.5790 | 597 | 1.075 | - |
| 0.5800 | 598 | 0.9825 | - |
| 0.5810 | 599 | 0.9609 | - |
| 0.5820 | 600 | 0.8968 | 0.7698 |
| 0.5829 | 601 | 1.1367 | - |
| 0.5839 | 602 | 1.1402 | - |
| 0.5849 | 603 | 1.1907 | - |
| 0.5858 | 604 | 1.3122 | - |
| 0.5868 | 605 | 1.1105 | - |
| 0.5878 | 606 | 0.8503 | - |
| 0.5887 | 607 | 0.9548 | - |
| 0.5897 | 608 | 0.8559 | - |
| 0.5907 | 609 | 1.0742 | - |
| 0.5917 | 610 | 1.1877 | - |
| 0.5926 | 611 | 0.8497 | - |
| 0.5936 | 612 | 0.9624 | - |
| 0.5946 | 613 | 1.2401 | - |
| 0.5955 | 614 | 1.132 | - |
| 0.5965 | 615 | 0.8969 | - |
| 0.5975 | 616 | 1.0496 | - |
| 0.5984 | 617 | 1.0005 | - |
| 0.5994 | 618 | 1.1439 | - |
| 0.6004 | 619 | 1.0735 | - |
| 0.6014 | 620 | 0.9692 | - |
| 0.6023 | 621 | 0.9886 | - |
| 0.6033 | 622 | 1.1511 | - |
| 0.6043 | 623 | 1.0945 | - |
| 0.6052 | 624 | 1.1312 | - |
| 0.6062 | 625 | 0.9951 | - |
| 0.6072 | 626 | 1.1972 | - |
| 0.6081 | 627 | 1.0338 | - |
| 0.6091 | 628 | 1.0568 | - |
| 0.6101 | 629 | 1.0572 | - |
| 0.6111 | 630 | 0.8318 | - |
| 0.6120 | 631 | 0.9429 | - |
| 0.6130 | 632 | 0.912 | - |
| 0.6140 | 633 | 1.0005 | - |
| 0.6149 | 634 | 0.9752 | - |
| 0.6159 | 635 | 0.845 | - |
| 0.6169 | 636 | 0.9303 | - |
| 0.6178 | 637 | 1.0656 | - |
| 0.6188 | 638 | 1.0395 | - |
| 0.6198 | 639 | 1.0108 | - |
| 0.6208 | 640 | 1.1792 | - |
| 0.6217 | 641 | 0.8825 | - |
| 0.6227 | 642 | 1.0048 | - |
| 0.6237 | 643 | 1.056 | - |
| 0.6246 | 644 | 1.0789 | - |
| 0.6256 | 645 | 0.9544 | - |
| 0.6266 | 646 | 0.9803 | - |
| 0.6275 | 647 | 1.0531 | - |
| 0.6285 | 648 | 1.0489 | - |
| 0.6295 | 649 | 1.1502 | - |
| 0.6305 | 650 | 1.0816 | - |
| 0.6314 | 651 | 0.9297 | - |
| 0.6324 | 652 | 0.9255 | - |
| 0.6334 | 653 | 0.6616 | - |
| 0.6343 | 654 | 1.0829 | - |
| 0.6353 | 655 | 1.183 | - |
| 0.6363 | 656 | 0.9131 | - |
| 0.6372 | 657 | 0.9559 | - |
| 0.6382 | 658 | 1.1332 | - |
| 0.6392 | 659 | 0.8958 | - |
| 0.6402 | 660 | 0.9512 | - |
| 0.6411 | 661 | 1.0395 | - |
| 0.6421 | 662 | 1.0129 | - |
| 0.6431 | 663 | 0.9519 | - |
| 0.6440 | 664 | 0.8938 | - |
| 0.6450 | 665 | 1.1163 | - |
| 0.6460 | 666 | 0.8499 | - |
| 0.6469 | 667 | 0.8961 | - |
| 0.6479 | 668 | 0.9473 | - |
| 0.6489 | 669 | 1.0553 | - |
| 0.6499 | 670 | 1.1022 | - |
| 0.6508 | 671 | 0.8882 | - |
| 0.6518 | 672 | 0.8653 | - |
| 0.6528 | 673 | 1.1437 | - |
| 0.6537 | 674 | 0.9091 | - |
| 0.6547 | 675 | 0.9815 | - |
| 0.6557 | 676 | 1.187 | - |
| 0.6566 | 677 | 0.8464 | - |
| 0.6576 | 678 | 1.0301 | - |
| 0.6586 | 679 | 1.2916 | - |
| 0.6596 | 680 | 1.023 | - |
| 0.6605 | 681 | 0.9788 | - |
| 0.6615 | 682 | 1.327 | - |
| 0.6625 | 683 | 0.8235 | - |
| 0.6634 | 684 | 0.9471 | - |
| 0.6644 | 685 | 1.0675 | - |
| 0.6654 | 686 | 0.7868 | - |
| 0.6663 | 687 | 0.95 | - |
| 0.6673 | 688 | 1.1434 | - |
| 0.6683 | 689 | 1.0451 | - |
| 0.6693 | 690 | 1.1038 | - |
| 0.6702 | 691 | 1.2937 | - |
| 0.6712 | 692 | 1.1106 | - |
| 0.6722 | 693 | 1.0265 | - |
| 0.6731 | 694 | 1.1148 | - |
| 0.6741 | 695 | 1.1206 | - |
| 0.6751 | 696 | 0.9779 | - |
| 0.6760 | 697 | 0.9706 | - |
| 0.6770 | 698 | 1.1312 | - |
| 0.6780 | 699 | 1.116 | - |
| 0.6790 | 700 | 0.951 | 0.7743 |
| 0.6799 | 701 | 0.8288 | - |
| 0.6809 | 702 | 1.0286 | - |
| 0.6819 | 703 | 0.9263 | - |
| 0.6828 | 704 | 1.079 | - |
| 0.6838 | 705 | 0.8845 | - |
| 0.6848 | 706 | 1.1676 | - |
| 0.6857 | 707 | 1.0349 | - |
| 0.6867 | 708 | 1.2269 | - |
| 0.6877 | 709 | 0.959 | - |
| 0.6887 | 710 | 0.9698 | - |
| 0.6896 | 711 | 1.0016 | - |
| 0.6906 | 712 | 0.9496 | - |
| 0.6916 | 713 | 1.1597 | - |
| 0.6925 | 714 | 1.0114 | - |
| 0.6935 | 715 | 1.017 | - |
| 0.6945 | 716 | 1.1233 | - |
| 0.6954 | 717 | 0.9365 | - |
| 0.6964 | 718 | 0.8629 | - |
| 0.6974 | 719 | 0.9872 | - |
| 0.6984 | 720 | 0.9775 | - |
| 0.6993 | 721 | 1.0152 | - |
| 0.7003 | 722 | 1.0435 | - |
| 0.7013 | 723 | 0.8754 | - |
| 0.7022 | 724 | 1.209 | - |
| 0.7032 | 725 | 1.1563 | - |
| 0.7042 | 726 | 0.948 | - |
| 0.7051 | 727 | 0.924 | - |
| 0.7061 | 728 | 1.1001 | - |
| 0.7071 | 729 | 1.1743 | - |
| 0.7081 | 730 | 1.0965 | - |
| 0.7090 | 731 | 0.8052 | - |
| 0.7100 | 732 | 1.066 | - |
| 0.7110 | 733 | 0.9235 | - |
| 0.7119 | 734 | 1.06 | - |
| 0.7129 | 735 | 1.0352 | - |
| 0.7139 | 736 | 1.1974 | - |
| 0.7148 | 737 | 1.0942 | - |
| 0.7158 | 738 | 1.0253 | - |
| 0.7168 | 739 | 1.0714 | - |
| 0.7177 | 740 | 1.0744 | - |
| 0.7187 | 741 | 1.0304 | - |
| 0.7197 | 742 | 1.0696 | - |
| 0.7207 | 743 | 1.0484 | - |
| 0.7216 | 744 | 0.8974 | - |
| 0.7226 | 745 | 0.9182 | - |
| 0.7236 | 746 | 1.0171 | - |
| 0.7245 | 747 | 1.2643 | - |
| 0.7255 | 748 | 1.1048 | - |
| 0.7265 | 749 | 0.926 | - |
| 0.7274 | 750 | 0.9786 | - |
| 0.7284 | 751 | 1.061 | - |
| 0.7294 | 752 | 1.0634 | - |
| 0.7304 | 753 | 0.8852 | - |
| 0.7313 | 754 | 0.955 | - |
| 0.7323 | 755 | 0.9501 | - |
| 0.7333 | 756 | 0.8055 | - |
| 0.7342 | 757 | 0.8329 | - |
| 0.7352 | 758 | 1.1392 | - |
| 0.7362 | 759 | 1.0669 | - |
| 0.7371 | 760 | 0.9245 | - |
| 0.7381 | 761 | 0.9757 | - |
| 0.7391 | 762 | 1.1642 | - |
| 0.7401 | 763 | 0.9897 | - |
| 0.7410 | 764 | 0.9399 | - |
| 0.7420 | 765 | 0.9005 | - |
| 0.7430 | 766 | 1.1521 | - |
| 0.7439 | 767 | 0.9101 | - |
| 0.7449 | 768 | 0.9139 | - |
| 0.7459 | 769 | 0.9832 | - |
| 0.7468 | 770 | 1.0531 | - |
| 0.7478 | 771 | 0.9897 | - |
| 0.7488 | 772 | 1.0826 | - |
| 0.7498 | 773 | 0.8405 | - |
| 0.7507 | 774 | 1.0506 | - |
| 0.7517 | 775 | 0.9899 | - |
| 0.7527 | 776 | 1.0265 | - |
| 0.7536 | 777 | 0.9145 | - |
| 0.7546 | 778 | 1.053 | - |
| 0.7556 | 779 | 1.2219 | - |
| 0.7565 | 780 | 1.0113 | - |
| 0.7575 | 781 | 1.096 | - |
| 0.7585 | 782 | 0.9848 | - |
| 0.7595 | 783 | 0.8231 | - |
| 0.7604 | 784 | 1.0052 | - |
| 0.7614 | 785 | 1.1283 | - |
| 0.7624 | 786 | 1.009 | - |
| 0.7633 | 787 | 1.0359 | - |
| 0.7643 | 788 | 1.0517 | - |
| 0.7653 | 789 | 1.0982 | - |
| 0.7662 | 790 | 0.8718 | - |
| 0.7672 | 791 | 1.0786 | - |
| 0.7682 | 792 | 1.1352 | - |
| 0.7692 | 793 | 1.0237 | - |
| 0.7701 | 794 | 0.9454 | - |
| 0.7711 | 795 | 0.8788 | - |
| 0.7721 | 796 | 0.9528 | - |
| 0.7730 | 797 | 0.9875 | - |
| 0.7740 | 798 | 1.0055 | - |
| 0.7750 | 799 | 0.8797 | - |
| 0.7759 | 800 | 0.9553 | 0.7850 |
| 0.7769 | 801 | 1.0058 | - |
| 0.7779 | 802 | 1.0594 | - |
| 0.7789 | 803 | 1.0803 | - |
| 0.7798 | 804 | 1.0505 | - |
| 0.7808 | 805 | 1.0356 | - |
| 0.7818 | 806 | 1.1827 | - |
| 0.7827 | 807 | 0.8951 | - |
| 0.7837 | 808 | 0.9158 | - |
| 0.7847 | 809 | 0.9735 | - |
| 0.7856 | 810 | 0.9962 | - |
| 0.7866 | 811 | 1.0038 | - |
| 0.7876 | 812 | 0.9331 | - |
| 0.7886 | 813 | 0.9398 | - |
| 0.7895 | 814 | 1.0149 | - |
| 0.7905 | 815 | 1.0502 | - |
| 0.7915 | 816 | 0.927 | - |
| 0.7924 | 817 | 1.0834 | - |
| 0.7934 | 818 | 1.1106 | - |
| 0.7944 | 819 | 0.9005 | - |
| 0.7953 | 820 | 0.8341 | - |
| 0.7963 | 821 | 1.0671 | - |
| 0.7973 | 822 | 1.0785 | - |
| 0.7983 | 823 | 1.0531 | - |
| 0.7992 | 824 | 1.0732 | - |
| 0.8002 | 825 | 1.0284 | - |
| 0.8012 | 826 | 1.054 | - |
| 0.8021 | 827 | 1.0111 | - |
| 0.8031 | 828 | 1.0668 | - |
| 0.8041 | 829 | 0.8917 | - |
| 0.8050 | 830 | 0.7971 | - |
| 0.8060 | 831 | 0.9066 | - |
| 0.8070 | 832 | 1.0575 | - |
| 0.8080 | 833 | 0.8642 | - |
| 0.8089 | 834 | 0.8548 | - |
| 0.8099 | 835 | 1.2178 | - |
| 0.8109 | 836 | 1.1365 | - |
| 0.8118 | 837 | 1.0819 | - |
| 0.8128 | 838 | 0.9707 | - |
| 0.8138 | 839 | 0.9713 | - |
| 0.8147 | 840 | 0.8953 | - |
| 0.8157 | 841 | 0.9213 | - |
| 0.8167 | 842 | 1.1724 | - |
| 0.8177 | 843 | 1.0114 | - |
| 0.8186 | 844 | 1.0272 | - |
| 0.8196 | 845 | 0.9459 | - |
| 0.8206 | 846 | 1.1893 | - |
| 0.8215 | 847 | 0.8021 | - |
| 0.8225 | 848 | 0.7971 | - |
| 0.8235 | 849 | 0.8631 | - |
| 0.8244 | 850 | 1.1306 | - |
| 0.8254 | 851 | 0.9815 | - |
| 0.8264 | 852 | 1.0511 | - |
| 0.8274 | 853 | 0.9648 | - |
| 0.8283 | 854 | 1.0144 | - |
| 0.8293 | 855 | 1.0894 | - |
| 0.8303 | 856 | 0.9498 | - |
| 0.8312 | 857 | 1.0919 | - |
| 0.8322 | 858 | 1.1252 | - |
| 0.8332 | 859 | 0.8702 | - |
| 0.8341 | 860 | 1.0995 | - |
| 0.8351 | 861 | 0.8141 | - |
| 0.8361 | 862 | 1.0677 | - |
| 0.8371 | 863 | 0.9817 | - |
| 0.8380 | 864 | 1.0592 | - |
| 0.8390 | 865 | 0.9584 | - |
| 0.8400 | 866 | 1.1485 | - |
| 0.8409 | 867 | 1.0698 | - |
| 0.8419 | 868 | 0.9717 | - |
| 0.8429 | 869 | 1.0263 | - |
| 0.8438 | 870 | 1.1607 | - |
| 0.8448 | 871 | 0.7513 | - |
| 0.8458 | 872 | 1.0434 | - |
| 0.8468 | 873 | 1.0811 | - |
| 0.8477 | 874 | 1.1221 | - |
| 0.8487 | 875 | 0.9802 | - |
| 0.8497 | 876 | 0.8221 | - |
| 0.8506 | 877 | 0.9487 | - |
| 0.8516 | 878 | 0.9615 | - |
| 0.8526 | 879 | 0.9094 | - |
| 0.8535 | 880 | 1.1254 | - |
| 0.8545 | 881 | 1.0389 | - |
| 0.8555 | 882 | 0.9059 | - |
| 0.8565 | 883 | 1.1106 | - |
| 0.8574 | 884 | 0.9647 | - |
| 0.8584 | 885 | 1.2276 | - |
| 0.8594 | 886 | 0.8228 | - |
| 0.8603 | 887 | 0.9616 | - |
| 0.8613 | 888 | 0.9603 | - |
| 0.8623 | 889 | 1.0795 | - |
| 0.8632 | 890 | 0.7917 | - |
| 0.8642 | 891 | 1.1794 | - |
| 0.8652 | 892 | 1.0716 | - |
| 0.8661 | 893 | 1.0956 | - |
| 0.8671 | 894 | 0.7817 | - |
| 0.8681 | 895 | 0.8731 | - |
| 0.8691 | 896 | 1.0859 | - |
| 0.8700 | 897 | 0.9853 | - |
| 0.8710 | 898 | 1.1447 | - |
| 0.8720 | 899 | 1.0804 | - |
| 0.8729 | 900 | 1.0068 | 0.7732 |
| 0.8739 | 901 | 0.8114 | - |
| 0.8749 | 902 | 1.0151 | - |
| 0.8758 | 903 | 0.9454 | - |
| 0.8768 | 904 | 0.9953 | - |
| 0.8778 | 905 | 1.0639 | - |
| 0.8788 | 906 | 1.1602 | - |
| 0.8797 | 907 | 0.8954 | - |
| 0.8807 | 908 | 0.9654 | - |
| 0.8817 | 909 | 0.9739 | - |
| 0.8826 | 910 | 0.9898 | - |
| 0.8836 | 911 | 0.8363 | - |
| 0.8846 | 912 | 1.3511 | - |
| 0.8855 | 913 | 1.0814 | - |
| 0.8865 | 914 | 0.9362 | - |
| 0.8875 | 915 | 0.9156 | - |
| 0.8885 | 916 | 1.1602 | - |
| 0.8894 | 917 | 1.1053 | - |
| 0.8904 | 918 | 0.9363 | - |
| 0.8914 | 919 | 1.0752 | - |
| 0.8923 | 920 | 1.0411 | - |
| 0.8933 | 921 | 0.9259 | - |
| 0.8943 | 922 | 0.8787 | - |
| 0.8952 | 923 | 0.9916 | - |
| 0.8962 | 924 | 0.9259 | - |
| 0.8972 | 925 | 1.0699 | - |
| 0.8982 | 926 | 1.0453 | - |
| 0.8991 | 927 | 0.9482 | - |
| 0.9001 | 928 | 0.9673 | - |
| 0.9011 | 929 | 1.0472 | - |
| 0.9020 | 930 | 1.0362 | - |
| 0.9030 | 931 | 1.0146 | - |
| 0.9040 | 932 | 1.2532 | - |
| 0.9049 | 933 | 1.0516 | - |
| 0.9059 | 934 | 0.7712 | - |
| 0.9069 | 935 | 1.0709 | - |
| 0.9079 | 936 | 1.2373 | - |
| 0.9088 | 937 | 1.0152 | - |
| 0.9098 | 938 | 0.9763 | - |
| 0.9108 | 939 | 0.8526 | - |
| 0.9117 | 940 | 0.9346 | - |
| 0.9127 | 941 | 0.9223 | - |
| 0.9137 | 942 | 0.9728 | - |
| 0.9146 | 943 | 0.7943 | - |
| 0.9156 | 944 | 0.9577 | - |
| 0.9166 | 945 | 1.0172 | - |
| 0.9176 | 946 | 0.8119 | - |
| 0.9185 | 947 | 0.9257 | - |
| 0.9195 | 948 | 1.1248 | - |
| 0.9205 | 949 | 1.0013 | - |
| 0.9214 | 950 | 0.8668 | - |
| 0.9224 | 951 | 0.8028 | - |
| 0.9234 | 952 | 1.0528 | - |
| 0.9243 | 953 | 1.1875 | - |
| 0.9253 | 954 | 0.9449 | - |
| 0.9263 | 955 | 1.1283 | - |
| 0.9273 | 956 | 1.0642 | - |
| 0.9282 | 957 | 0.8697 | - |
| 0.9292 | 958 | 0.9364 | - |
| 0.9302 | 959 | 0.9071 | - |
| 0.9311 | 960 | 0.9994 | - |
| 0.9321 | 961 | 0.9029 | - |
| 0.9331 | 962 | 0.8964 | - |
| 0.9340 | 963 | 1.0432 | - |
| 0.9350 | 964 | 1.1667 | - |
| 0.9360 | 965 | 1.12 | - |
| 0.9370 | 966 | 0.9149 | - |
| 0.9379 | 967 | 0.9626 | - |
| 0.9389 | 968 | 0.9908 | - |
| 0.9399 | 969 | 1.016 | - |
| 0.9408 | 970 | 0.9927 | - |
| 0.9418 | 971 | 0.9359 | - |
| 0.9428 | 972 | 0.9862 | - |
| 0.9437 | 973 | 0.9201 | - |
| 0.9447 | 974 | 0.924 | - |
| 0.9457 | 975 | 1.041 | - |
| 0.9467 | 976 | 0.8923 | - |
| 0.9476 | 977 | 0.9623 | - |
| 0.9486 | 978 | 0.7969 | - |
| 0.9496 | 979 | 1.029 | - |
| 0.9505 | 980 | 0.9091 | - |
| 0.9515 | 981 | 0.9959 | - |
| 0.9525 | 982 | 0.9284 | - |
| 0.9534 | 983 | 1.0337 | - |
| 0.9544 | 984 | 1.0271 | - |
| 0.9554 | 985 | 0.9689 | - |
| 0.9564 | 986 | 1.0873 | - |
| 0.9573 | 987 | 0.9134 | - |
| 0.9583 | 988 | 0.9464 | - |
| 0.9593 | 989 | 1.0108 | - |
| 0.9602 | 990 | 0.9301 | - |
| 0.9612 | 991 | 1.0997 | - |
| 0.9622 | 992 | 1.0195 | - |
| 0.9631 | 993 | 0.7899 | - |
| 0.9641 | 994 | 1.0238 | - |
| 0.9651 | 995 | 0.9627 | - |
| 0.9661 | 996 | 1.0571 | - |
| 0.9670 | 997 | 0.9942 | - |
| 0.9680 | 998 | 0.9517 | - |
| 0.9690 | 999 | 0.9362 | - |
| 0.9699 | 1000 | 0.9168 | 0.7758 |
| 0.9709 | 1001 | 0.9433 | - |
| 0.9719 | 1002 | 0.9075 | - |
| 0.9728 | 1003 | 0.789 | - |
| 0.9738 | 1004 | 0.9803 | - |
| 0.9748 | 1005 | 0.9036 | - |
| 0.9758 | 1006 | 1.2136 | - |
| 0.9767 | 1007 | 0.9326 | - |
| 0.9777 | 1008 | 1.234 | - |
| 0.9787 | 1009 | 0.9285 | - |
| 0.9796 | 1010 | 0.936 | - |
| 0.9806 | 1011 | 0.9184 | - |
| 0.9816 | 1012 | 1.0219 | - |
| 0.9825 | 1013 | 0.8375 | - |
| 0.9835 | 1014 | 0.7984 | - |
| 0.9845 | 1015 | 1.0286 | - |
| 0.9855 | 1016 | 1.0073 | - |
| 0.9864 | 1017 | 0.8438 | - |
| 0.9874 | 1018 | 0.9551 | - |
| 0.9884 | 1019 | 0.8861 | - |
| 0.9893 | 1020 | 0.9589 | - |
| 0.9903 | 1021 | 0.8096 | - |
| 0.9913 | 1022 | 0.9143 | - |
| 0.9922 | 1023 | 0.8689 | - |
| 0.9932 | 1024 | 0.8822 | - |
| 0.9942 | 1025 | 0.8862 | - |
| 0.9952 | 1026 | 1.1052 | - |
| 0.9961 | 1027 | 0.9032 | - |
| 0.9971 | 1028 | 0.7437 | - |
| 0.9981 | 1029 | 0.9992 | - |
| 0.9990 | 1030 | 1.101 | - |
| 1.0 | 1031 | 1.4964 | - |
| 1.0010 | 1032 | 1.2588 | - |
| 1.0010 | 1033 | 0.8675 | - |
| 1.0019 | 1034 | 0.8294 | - |
| 1.0029 | 1035 | 1.0767 | - |
| 1.0039 | 1036 | 1.0335 | - |
| 1.0048 | 1037 | 0.9359 | - |
| 1.0058 | 1038 | 1.1158 | - |
| 1.0068 | 1039 | 1.058 | - |
| 1.0078 | 1040 | 0.8808 | - |
| 1.0087 | 1041 | 0.887 | - |
| 1.0097 | 1042 | 0.8947 | - |
| 1.0107 | 1043 | 0.8709 | - |
| 1.0116 | 1044 | 0.8153 | - |
| 1.0126 | 1045 | 0.8091 | - |
| 1.0136 | 1046 | 0.9106 | - |
| 1.0145 | 1047 | 1.1256 | - |
| 1.0155 | 1048 | 0.8195 | - |
| 1.0165 | 1049 | 1.072 | - |
| 1.0175 | 1050 | 1.087 | - |
| 1.0184 | 1051 | 1.0114 | - |
| 1.0194 | 1052 | 1.0976 | - |
| 1.0204 | 1053 | 0.921 | - |
| 1.0213 | 1054 | 0.9462 | - |
| 1.0223 | 1055 | 1.0139 | - |
| 1.0233 | 1056 | 1.1217 | - |
| 1.0242 | 1057 | 0.8605 | - |
| 1.0252 | 1058 | 1.247 | - |
| 1.0262 | 1059 | 0.9156 | - |
| 1.0272 | 1060 | 0.9954 | - |
| 1.0281 | 1061 | 0.8935 | - |
| 1.0291 | 1062 | 0.8086 | - |
| 1.0301 | 1063 | 1.0395 | - |
| 1.0310 | 1064 | 1.0155 | - |
| 1.0320 | 1065 | 0.8537 | - |
| 1.0330 | 1066 | 1.0203 | - |
| 1.0339 | 1067 | 0.9523 | - |
| 1.0349 | 1068 | 1.1267 | - |
| 1.0359 | 1069 | 0.7492 | - |
| 1.0369 | 1070 | 0.8994 | - |
| 1.0378 | 1071 | 0.8827 | - |
| 1.0388 | 1072 | 0.9288 | - |
| 1.0398 | 1073 | 0.9542 | - |
| 1.0407 | 1074 | 0.8632 | - |
| 1.0417 | 1075 | 1.1675 | - |
| 1.0427 | 1076 | 0.9906 | - |
| 1.0436 | 1077 | 0.8954 | - |
| 1.0446 | 1078 | 1.0465 | - |
| 1.0456 | 1079 | 0.9034 | - |
| 1.0466 | 1080 | 0.9598 | - |
| 1.0475 | 1081 | 1.0112 | - |
| 1.0485 | 1082 | 1.0856 | - |
| 1.0495 | 1083 | 1.0647 | - |
| 1.0504 | 1084 | 0.8393 | - |
| 1.0514 | 1085 | 0.9565 | - |
| 1.0524 | 1086 | 0.8846 | - |
| 1.0533 | 1087 | 1.0093 | - |
| 1.0543 | 1088 | 0.9585 | - |
| 1.0553 | 1089 | 1.0605 | - |
| 1.0563 | 1090 | 1.0775 | - |
| 1.0572 | 1091 | 1.0286 | - |
| 1.0582 | 1092 | 1.1261 | - |
| 1.0592 | 1093 | 0.9954 | - |
| 1.0601 | 1094 | 1.0158 | - |
| 1.0611 | 1095 | 0.922 | - |
| 1.0621 | 1096 | 1.0617 | - |
| 1.0630 | 1097 | 0.9094 | - |
| 1.0640 | 1098 | 1.008 | - |
| 1.0650 | 1099 | 0.8282 | - |
| 1.0660 | 1100 | 0.8301 | 0.7853 |
| 1.0669 | 1101 | 0.8357 | - |
| 1.0679 | 1102 | 0.9063 | - |
| 1.0689 | 1103 | 0.7942 | - |
| 1.0698 | 1104 | 0.7446 | - |
| 1.0708 | 1105 | 1.0596 | - |
| 1.0718 | 1106 | 0.9936 | - |
| 1.0727 | 1107 | 0.8537 | - |
| 1.0737 | 1108 | 0.9637 | - |
| 1.0747 | 1109 | 1.0706 | - |
| 1.0757 | 1110 | 0.7171 | - |
| 1.0766 | 1111 | 1.1043 | - |
| 1.0776 | 1112 | 1.1395 | - |
| 1.0786 | 1113 | 0.8704 | - |
| 1.0795 | 1114 | 0.7569 | - |
| 1.0805 | 1115 | 1.0114 | - |
| 1.0815 | 1116 | 1.0144 | - |
| 1.0824 | 1117 | 0.9716 | - |
| 1.0834 | 1118 | 0.7612 | - |
| 1.0844 | 1119 | 0.7822 | - |
| 1.0854 | 1120 | 0.9572 | - |
| 1.0863 | 1121 | 0.9037 | - |
| 1.0873 | 1122 | 0.8627 | - |
| 1.0883 | 1123 | 1.0196 | - |
| 1.0892 | 1124 | 0.9917 | - |
| 1.0902 | 1125 | 0.9181 | - |
| 1.0912 | 1126 | 1.224 | - |
| 1.0921 | 1127 | 1.0823 | - |
| 1.0931 | 1128 | 1.0521 | - |
| 1.0941 | 1129 | 1.0782 | - |
| 1.0951 | 1130 | 1.0567 | - |
| 1.0960 | 1131 | 0.9204 | - |
| 1.0970 | 1132 | 0.9318 | - |
| 1.0980 | 1133 | 1.0245 | - |
| 1.0989 | 1134 | 0.8573 | - |
| 1.0999 | 1135 | 0.9509 | - |
| 1.1009 | 1136 | 0.9522 | - |
| 1.1018 | 1137 | 0.9393 | - |
| 1.1028 | 1138 | 0.9645 | - |
| 1.1038 | 1139 | 0.9974 | - |
| 1.1048 | 1140 | 0.9518 | - |
| 1.1057 | 1141 | 0.7764 | - |
| 1.1067 | 1142 | 0.9631 | - |
| 1.1077 | 1143 | 0.9927 | - |
| 1.1086 | 1144 | 0.8814 | - |
| 1.1096 | 1145 | 0.897 | - |
| 1.1106 | 1146 | 0.9564 | - |
| 1.1115 | 1147 | 0.895 | - |
| 1.1125 | 1148 | 0.9096 | - |
| 1.1135 | 1149 | 0.9751 | - |
| 1.1145 | 1150 | 0.9359 | - |
| 1.1154 | 1151 | 1.0769 | - |
| 1.1164 | 1152 | 0.999 | - |
| 1.1174 | 1153 | 1.0376 | - |
| 1.1183 | 1154 | 0.8127 | - |
| 1.1193 | 1155 | 0.7559 | - |
| 1.1203 | 1156 | 0.8768 | - |
| 1.1212 | 1157 | 0.9444 | - |
| 1.1222 | 1158 | 0.879 | - |
| 1.1232 | 1159 | 0.8841 | - |
| 1.1242 | 1160 | 1.0216 | - |
| 1.1251 | 1161 | 0.9782 | - |
| 1.1261 | 1162 | 0.8721 | - |
| 1.1271 | 1163 | 0.9624 | - |
| 1.1280 | 1164 | 0.9652 | - |
| 1.1290 | 1165 | 0.8734 | - |
| 1.1300 | 1166 | 0.7225 | - |
| 1.1309 | 1167 | 0.8565 | - |
| 1.1319 | 1168 | 0.9513 | - |
| 1.1329 | 1169 | 0.7248 | - |
| 1.1339 | 1170 | 0.7341 | - |
| 1.1348 | 1171 | 0.8635 | - |
| 1.1358 | 1172 | 0.9145 | - |
| 1.1368 | 1173 | 0.8385 | - |
| 1.1377 | 1174 | 0.9555 | - |
| 1.1387 | 1175 | 0.6751 | - |
| 1.1397 | 1176 | 0.7009 | - |
| 1.1406 | 1177 | 0.7856 | - |
| 1.1416 | 1178 | 0.9331 | - |
| 1.1426 | 1179 | 0.8542 | - |
| 1.1435 | 1180 | 0.8366 | - |
| 1.1445 | 1181 | 0.8267 | - |
| 1.1455 | 1182 | 1.1346 | - |
| 1.1465 | 1183 | 1.1498 | - |
| 1.1474 | 1184 | 0.7565 | - |
| 1.1484 | 1185 | 1.0709 | - |
| 1.1494 | 1186 | 0.9927 | - |
| 1.1503 | 1187 | 0.9678 | - |
| 1.1513 | 1188 | 0.916 | - |
| 1.1523 | 1189 | 1.0785 | - |
| 1.1532 | 1190 | 0.9407 | - |
| 1.1542 | 1191 | 0.8164 | - |
| 1.1552 | 1192 | 0.9886 | - |
| 1.1562 | 1193 | 1.051 | - |
| 1.1571 | 1194 | 0.9515 | - |
| 1.1581 | 1195 | 1.0084 | - |
| 1.1591 | 1196 | 0.9572 | - |
| 1.1600 | 1197 | 0.9123 | - |
| 1.1610 | 1198 | 0.9766 | - |
| 1.1620 | 1199 | 0.9623 | - |
| 1.1629 | 1200 | 1.0156 | 0.7876 |
| 1.1639 | 1201 | 0.964 | - |
| 1.1649 | 1202 | 0.9256 | - |
| 1.1659 | 1203 | 0.9928 | - |
| 1.1668 | 1204 | 1.0211 | - |
| 1.1678 | 1205 | 0.9243 | - |
| 1.1688 | 1206 | 0.8579 | - |
| 1.1697 | 1207 | 0.8466 | - |
| 1.1707 | 1208 | 1.0328 | - |
| 1.1717 | 1209 | 0.7937 | - |
| 1.1726 | 1210 | 0.6935 | - |
| 1.1736 | 1211 | 1.0592 | - |
| 1.1746 | 1212 | 0.7996 | - |
| 1.1756 | 1213 | 0.9154 | - |
| 1.1765 | 1214 | 0.9347 | - |
| 1.1775 | 1215 | 0.8974 | - |
| 1.1785 | 1216 | 0.9143 | - |
| 1.1794 | 1217 | 0.9666 | - |
| 1.1804 | 1218 | 0.9933 | - |
| 1.1814 | 1219 | 0.8474 | - |
| 1.1823 | 1220 | 0.8823 | - |
| 1.1833 | 1221 | 0.8623 | - |
| 1.1843 | 1222 | 0.8325 | - |
| 1.1853 | 1223 | 0.7443 | - |
| 1.1862 | 1224 | 0.9104 | - |
| 1.1872 | 1225 | 0.8507 | - |
| 1.1882 | 1226 | 0.6476 | - |
| 1.1891 | 1227 | 0.8584 | - |
| 1.1901 | 1228 | 1.0635 | - |
| 1.1911 | 1229 | 1.1582 | - |
| 1.1920 | 1230 | 1.0787 | - |
| 1.1930 | 1231 | 0.7793 | - |
| 1.1940 | 1232 | 0.8799 | - |
| 1.1950 | 1233 | 0.7783 | - |
| 1.1959 | 1234 | 0.8943 | - |
| 1.1969 | 1235 | 1.1208 | - |
| 1.1979 | 1236 | 1.1257 | - |
| 1.1988 | 1237 | 0.9184 | - |
| 1.1998 | 1238 | 0.9265 | - |
| 1.2008 | 1239 | 0.926 | - |
| 1.2017 | 1240 | 0.8261 | - |
| 1.2027 | 1241 | 0.9991 | - |
| 1.2037 | 1242 | 0.8113 | - |
| 1.2047 | 1243 | 0.876 | - |
| 1.2056 | 1244 | 0.9464 | - |
| 1.2066 | 1245 | 0.8153 | - |
| 1.2076 | 1246 | 0.8578 | - |
| 1.2085 | 1247 | 1.1457 | - |
| 1.2095 | 1248 | 0.8605 | - |
| 1.2105 | 1249 | 0.9466 | - |
| 1.2114 | 1250 | 0.8246 | - |
| 1.2124 | 1251 | 0.9013 | - |
| 1.2134 | 1252 | 0.8775 | - |
| 1.2144 | 1253 | 0.834 | - |
| 1.2153 | 1254 | 1.0859 | - |
| 1.2163 | 1255 | 0.9353 | - |
| 1.2173 | 1256 | 1.1261 | - |
| 1.2182 | 1257 | 0.7248 | - |
| 1.2192 | 1258 | 0.9393 | - |
| 1.2202 | 1259 | 0.8916 | - |
| 1.2211 | 1260 | 0.7358 | - |
| 1.2221 | 1261 | 1.204 | - |
| 1.2231 | 1262 | 1.0034 | - |
| 1.2241 | 1263 | 0.9008 | - |
| 1.2250 | 1264 | 0.9595 | - |
| 1.2260 | 1265 | 0.7679 | - |
| 1.2270 | 1266 | 1.0197 | - |
| 1.2279 | 1267 | 0.879 | - |
| 1.2289 | 1268 | 0.9903 | - |
| 1.2299 | 1269 | 0.909 | - |
| 1.2308 | 1270 | 0.7562 | - |
| 1.2318 | 1271 | 0.8426 | - |
| 1.2328 | 1272 | 0.7815 | - |
| 1.2338 | 1273 | 0.9019 | - |
| 1.2347 | 1274 | 0.8489 | - |
| 1.2357 | 1275 | 0.9321 | - |
| 1.2367 | 1276 | 0.7697 | - |
| 1.2376 | 1277 | 0.8796 | - |
| 1.2386 | 1278 | 0.9192 | - |
| 1.2396 | 1279 | 0.6848 | - |
| 1.2405 | 1280 | 0.8633 | - |
| 1.2415 | 1281 | 0.7201 | - |
| 1.2425 | 1282 | 0.8746 | - |
| 1.2435 | 1283 | 0.8757 | - |
| 1.2444 | 1284 | 0.8366 | - |
| 1.2454 | 1285 | 0.786 | - |
| 1.2464 | 1286 | 0.8817 | - |
| 1.2473 | 1287 | 0.9352 | - |
| 1.2483 | 1288 | 1.0692 | - |
| 1.2493 | 1289 | 0.8337 | - |
| 1.2502 | 1290 | 0.801 | - |
| 1.2512 | 1291 | 0.8109 | - |
| 1.2522 | 1292 | 0.947 | - |
| 1.2532 | 1293 | 0.8049 | - |
| 1.2541 | 1294 | 0.9311 | - |
| 1.2551 | 1295 | 0.8174 | - |
| 1.2561 | 1296 | 0.7876 | - |
| 1.2570 | 1297 | 0.8645 | - |
| 1.2580 | 1298 | 0.9518 | - |
| 1.2590 | 1299 | 0.9368 | - |
| 1.2599 | 1300 | 1.007 | 0.7874 |
| 1.2609 | 1301 | 0.9365 | - |
| 1.2619 | 1302 | 0.8543 | - |
| 1.2629 | 1303 | 0.8573 | - |
| 1.2638 | 1304 | 0.8061 | - |
| 1.2648 | 1305 | 0.7904 | - |
| 1.2658 | 1306 | 1.0943 | - |
| 1.2667 | 1307 | 0.8651 | - |
| 1.2677 | 1308 | 0.7755 | - |
| 1.2687 | 1309 | 0.9349 | - |
| 1.2696 | 1310 | 0.9932 | - |
| 1.2706 | 1311 | 0.8683 | - |
| 1.2716 | 1312 | 1.0831 | - |
| 1.2726 | 1313 | 0.668 | - |
| 1.2735 | 1314 | 0.7754 | - |
| 1.2745 | 1315 | 0.795 | - |
| 1.2755 | 1316 | 0.8559 | - |
| 1.2764 | 1317 | 0.7518 | - |
| 1.2774 | 1318 | 0.9431 | - |
| 1.2784 | 1319 | 0.7652 | - |
| 1.2793 | 1320 | 0.8715 | - |
| 1.2803 | 1321 | 0.7019 | - |
| 1.2813 | 1322 | 0.7422 | - |
| 1.2823 | 1323 | 0.7708 | - |
| 1.2832 | 1324 | 0.8813 | - |
| 1.2842 | 1325 | 0.8266 | - |
| 1.2852 | 1326 | 0.7693 | - |
| 1.2861 | 1327 | 0.7458 | - |
| 1.2871 | 1328 | 0.7913 | - |
| 1.2881 | 1329 | 0.8685 | - |
| 1.2890 | 1330 | 0.8531 | - |
| 1.2900 | 1331 | 0.7455 | - |
| 1.2910 | 1332 | 0.7994 | - |
| 1.2919 | 1333 | 0.7512 | - |
| 1.2929 | 1334 | 0.8219 | - |
| 1.2939 | 1335 | 0.9268 | - |
| 1.2949 | 1336 | 1.1041 | - |
| 1.2958 | 1337 | 0.8627 | - |
| 1.2968 | 1338 | 1.0383 | - |
| 1.2978 | 1339 | 0.8163 | - |
| 1.2987 | 1340 | 0.8756 | - |
| 1.2997 | 1341 | 0.7362 | - |
| 1.3007 | 1342 | 0.8636 | - |
| 1.3016 | 1343 | 0.9287 | - |
| 1.3026 | 1344 | 0.6461 | - |
| 1.3036 | 1345 | 0.8201 | - |
| 1.3046 | 1346 | 1.1362 | - |
| 1.3055 | 1347 | 0.8238 | - |
| 1.3065 | 1348 | 0.9655 | - |
| 1.3075 | 1349 | 0.9845 | - |
| 1.3084 | 1350 | 0.8321 | - |
| 1.3094 | 1351 | 1.0261 | - |
| 1.3104 | 1352 | 0.807 | - |
| 1.3113 | 1353 | 0.6478 | - |
| 1.3123 | 1354 | 0.7247 | - |
| 1.3133 | 1355 | 0.801 | - |
| 1.3143 | 1356 | 0.8335 | - |
| 1.3152 | 1357 | 0.808 | - |
| 1.3162 | 1358 | 0.6832 | - |
| 1.3172 | 1359 | 0.9396 | - |
| 1.3181 | 1360 | 0.692 | - |
| 1.3191 | 1361 | 1.0407 | - |
| 1.3201 | 1362 | 0.6717 | - |
| 1.3210 | 1363 | 0.8258 | - |
| 1.3220 | 1364 | 0.8678 | - |
| 1.3230 | 1365 | 0.8991 | - |
| 1.3240 | 1366 | 0.8041 | - |
| 1.3249 | 1367 | 0.7996 | - |
| 1.3259 | 1368 | 0.8647 | - |
| 1.3269 | 1369 | 0.8487 | - |
| 1.3278 | 1370 | 0.7463 | - |
| 1.3288 | 1371 | 0.7625 | - |
| 1.3298 | 1372 | 0.816 | - |
| 1.3307 | 1373 | 0.6854 | - |
| 1.3317 | 1374 | 0.8371 | - |
| 1.3327 | 1375 | 0.7786 | - |
| 1.3337 | 1376 | 0.6705 | - |
| 1.3346 | 1377 | 0.7191 | - |
| 1.3356 | 1378 | 0.7675 | - |
| 1.3366 | 1379 | 0.9576 | - |
| 1.3375 | 1380 | 0.8516 | - |
| 1.3385 | 1381 | 1.0449 | - |
| 1.3395 | 1382 | 0.8666 | - |
| 1.3404 | 1383 | 0.7942 | - |
| 1.3414 | 1384 | 0.7903 | - |
| 1.3424 | 1385 | 0.8479 | - |
| 1.3434 | 1386 | 0.875 | - |
| 1.3443 | 1387 | 0.8867 | - |
| 1.3453 | 1388 | 0.8504 | - |
| 1.3463 | 1389 | 0.5668 | - |
| 1.3472 | 1390 | 0.6684 | - |
| 1.3482 | 1391 | 0.7128 | - |
| 1.3492 | 1392 | 0.7491 | - |
| 1.3501 | 1393 | 0.5747 | - |
| 1.3511 | 1394 | 0.7758 | - |
| 1.3521 | 1395 | 0.7229 | - |
| 1.3531 | 1396 | 0.9459 | - |
| 1.3540 | 1397 | 0.8524 | - |
| 1.3550 | 1398 | 0.7258 | - |
| 1.3560 | 1399 | 0.8195 | - |
| 1.3569 | 1400 | 0.7631 | 0.7897 |
| 1.3579 | 1401 | 0.766 | - |
| 1.3589 | 1402 | 0.7246 | - |
| 1.3598 | 1403 | 0.8431 | - |
| 1.3608 | 1404 | 0.9004 | - |
| 1.3618 | 1405 | 0.6848 | - |
| 1.3628 | 1406 | 0.8385 | - |
| 1.3637 | 1407 | 0.9009 | - |
| 1.3647 | 1408 | 0.8355 | - |
| 1.3657 | 1409 | 0.9339 | - |
| 1.3666 | 1410 | 0.8778 | - |
| 1.3676 | 1411 | 1.0262 | - |
| 1.3686 | 1412 | 0.8082 | - |
| 1.3695 | 1413 | 0.7778 | - |
| 1.3705 | 1414 | 0.7046 | - |
| 1.3715 | 1415 | 0.7019 | - |
| 1.3725 | 1416 | 0.9853 | - |
| 1.3734 | 1417 | 0.8217 | - |
| 1.3744 | 1418 | 0.7665 | - |
| 1.3754 | 1419 | 0.875 | - |
| 1.3763 | 1420 | 0.7446 | - |
| 1.3773 | 1421 | 0.6599 | - |
| 1.3783 | 1422 | 0.6976 | - |
| 1.3792 | 1423 | 0.6713 | - |
| 1.3802 | 1424 | 0.9009 | - |
| 1.3812 | 1425 | 0.8795 | - |
| 1.3822 | 1426 | 0.7926 | - |
| 1.3831 | 1427 | 0.6909 | - |
| 1.3841 | 1428 | 1.0336 | - |
| 1.3851 | 1429 | 0.8579 | - |
| 1.3860 | 1430 | 0.7349 | - |
| 1.3870 | 1431 | 0.9453 | - |
| 1.3880 | 1432 | 0.8759 | - |
| 1.3889 | 1433 | 0.6249 | - |
| 1.3899 | 1434 | 0.7396 | - |
| 1.3909 | 1435 | 0.5758 | - |
| 1.3919 | 1436 | 0.6576 | - |
| 1.3928 | 1437 | 0.9228 | - |
| 1.3938 | 1438 | 0.7832 | - |
| 1.3948 | 1439 | 0.6235 | - |
| 1.3957 | 1440 | 0.7528 | - |
| 1.3967 | 1441 | 0.6852 | - |
| 1.3977 | 1442 | 0.7735 | - |
| 1.3986 | 1443 | 0.7852 | - |
| 1.3996 | 1444 | 0.7872 | - |
| 1.4006 | 1445 | 0.6813 | - |
| 1.4016 | 1446 | 0.7312 | - |
| 1.4025 | 1447 | 0.8524 | - |
| 1.4035 | 1448 | 0.8002 | - |
| 1.4045 | 1449 | 0.8489 | - |
| 1.4054 | 1450 | 0.8555 | - |
| 1.4064 | 1451 | 0.8035 | - |
| 1.4074 | 1452 | 0.7414 | - |
| 1.4083 | 1453 | 0.7767 | - |
| 1.4093 | 1454 | 0.8772 | - |
| 1.4103 | 1455 | 0.688 | - |
| 1.4113 | 1456 | 0.7272 | - |
| 1.4122 | 1457 | 0.7855 | - |
| 1.4132 | 1458 | 0.7596 | - |
| 1.4142 | 1459 | 0.7339 | - |
| 1.4151 | 1460 | 0.7369 | - |
| 1.4161 | 1461 | 0.6703 | - |
| 1.4171 | 1462 | 0.7413 | - |
| 1.4180 | 1463 | 0.7147 | - |
| 1.4190 | 1464 | 0.7449 | - |
| 1.4200 | 1465 | 0.8486 | - |
| 1.4210 | 1466 | 0.8065 | - |
| 1.4219 | 1467 | 0.647 | - |
| 1.4229 | 1468 | 0.5723 | - |
| 1.4239 | 1469 | 1.052 | - |
| 1.4248 | 1470 | 0.8359 | - |
| 1.4258 | 1471 | 0.8468 | - |
| 1.4268 | 1472 | 0.6374 | - |
| 1.4277 | 1473 | 0.8974 | - |
| 1.4287 | 1474 | 0.6532 | - |
| 1.4297 | 1475 | 0.7862 | - |
| 1.4306 | 1476 | 0.8004 | - |
| 1.4316 | 1477 | 0.7798 | - |
| 1.4326 | 1478 | 0.89 | - |
| 1.4336 | 1479 | 0.991 | - |
| 1.4345 | 1480 | 0.9278 | - |
| 1.4355 | 1481 | 0.7162 | - |
| 1.4365 | 1482 | 0.6632 | - |
| 1.4374 | 1483 | 0.6808 | - |
| 1.4384 | 1484 | 0.6875 | - |
| 1.4394 | 1485 | 0.8712 | - |
| 1.4403 | 1486 | 0.5778 | - |
| 1.4413 | 1487 | 0.9611 | - |
| 1.4423 | 1488 | 0.6821 | - |
| 1.4433 | 1489 | 0.7605 | - |
| 1.4442 | 1490 | 0.7431 | - |
| 1.4452 | 1491 | 0.7267 | - |
| 1.4462 | 1492 | 0.8073 | - |
| 1.4471 | 1493 | 0.7989 | - |
| 1.4481 | 1494 | 0.7369 | - |
| 1.4491 | 1495 | 0.8187 | - |
| 1.4500 | 1496 | 0.8339 | - |
| 1.4510 | 1497 | 0.7723 | - |
| 1.4520 | 1498 | 0.8331 | - |
| 1.4530 | 1499 | 0.7482 | - |
| 1.4539 | 1500 | 0.7019 | 0.8058 |
| 1.4549 | 1501 | 0.9959 | - |
| 1.4559 | 1502 | 0.8531 | - |
| 1.4568 | 1503 | 0.7493 | - |
| 1.4578 | 1504 | 0.6325 | - |
| 1.4588 | 1505 | 0.7645 | - |
| 1.4597 | 1506 | 0.7151 | - |
| 1.4607 | 1507 | 0.6742 | - |
| 1.4617 | 1508 | 0.6208 | - |
| 1.4627 | 1509 | 0.8318 | - |
| 1.4636 | 1510 | 0.7019 | - |
| 1.4646 | 1511 | 0.8984 | - |
| 1.4656 | 1512 | 0.883 | - |
| 1.4665 | 1513 | 0.6922 | - |
| 1.4675 | 1514 | 0.83 | - |
| 1.4685 | 1515 | 0.744 | - |
| 1.4694 | 1516 | 0.7566 | - |
| 1.4704 | 1517 | 0.6981 | - |
| 1.4714 | 1518 | 0.8996 | - |
| 1.4724 | 1519 | 0.8577 | - |
| 1.4733 | 1520 | 0.6758 | - |
| 1.4743 | 1521 | 0.8123 | - |
| 1.4753 | 1522 | 0.835 | - |
| 1.4762 | 1523 | 0.6785 | - |
| 1.4772 | 1524 | 0.6792 | - |
| 1.4782 | 1525 | 0.8448 | - |
| 1.4791 | 1526 | 0.7172 | - |
| 1.4801 | 1527 | 0.62 | - |
| 1.4811 | 1528 | 0.634 | - |
| 1.4821 | 1529 | 0.7962 | - |
| 1.4830 | 1530 | 0.6111 | - |
| 1.4840 | 1531 | 0.7702 | - |
| 1.4850 | 1532 | 0.8659 | - |
| 1.4859 | 1533 | 0.7547 | - |
| 1.4869 | 1534 | 0.7076 | - |
| 1.4879 | 1535 | 0.7396 | - |
| 1.4888 | 1536 | 0.7168 | - |
| 1.4898 | 1537 | 0.7296 | - |
| 1.4908 | 1538 | 0.869 | - |
| 1.4918 | 1539 | 0.6373 | - |
| 1.4927 | 1540 | 0.7523 | - |
| 1.4937 | 1541 | 0.6693 | - |
| 1.4947 | 1542 | 0.7123 | - |
| 1.4956 | 1543 | 0.6206 | - |
| 1.4966 | 1544 | 0.7445 | - |
| 1.4976 | 1545 | 0.8999 | - |
| 1.4985 | 1546 | 0.8598 | - |
| 1.4995 | 1547 | 0.7804 | - |
| 1.5005 | 1548 | 0.719 | - |
| 1.5015 | 1549 | 0.6686 | - |
| 1.5024 | 1550 | 0.7836 | - |
| 1.5034 | 1551 | 0.7813 | - |
| 1.5044 | 1552 | 0.7585 | - |
| 1.5053 | 1553 | 0.6581 | - |
| 1.5063 | 1554 | 0.5922 | - |
| 1.5073 | 1555 | 0.5956 | - |
| 1.5082 | 1556 | 0.6992 | - |
| 1.5092 | 1557 | 0.7576 | - |
| 1.5102 | 1558 | 0.7258 | - |
| 1.5112 | 1559 | 0.597 | - |
| 1.5121 | 1560 | 0.5803 | - |
| 1.5131 | 1561 | 0.7251 | - |
| 1.5141 | 1562 | 0.7514 | - |
| 1.5150 | 1563 | 0.8071 | - |
| 1.5160 | 1564 | 0.5921 | - |
| 1.5170 | 1565 | 0.7758 | - |
| 1.5179 | 1566 | 0.8269 | - |
| 1.5189 | 1567 | 0.5422 | - |
| 1.5199 | 1568 | 0.9072 | - |
| 1.5209 | 1569 | 0.8235 | - |
| 1.5218 | 1570 | 0.8102 | - |
| 1.5228 | 1571 | 0.7935 | - |
| 1.5238 | 1572 | 0.7218 | - |
| 1.5247 | 1573 | 0.709 | - |
| 1.5257 | 1574 | 0.7464 | - |
| 1.5267 | 1575 | 0.7907 | - |
| 1.5276 | 1576 | 0.6071 | - |
| 1.5286 | 1577 | 0.8561 | - |
| 1.5296 | 1578 | 0.7112 | - |
| 1.5306 | 1579 | 0.7987 | - |
| 1.5315 | 1580 | 0.8957 | - |
| 1.5325 | 1581 | 0.6504 | - |
| 1.5335 | 1582 | 0.7708 | - |
| 1.5344 | 1583 | 0.8088 | - |
| 1.5354 | 1584 | 0.6257 | - |
| 1.5364 | 1585 | 0.7473 | - |
| 1.5373 | 1586 | 0.7711 | - |
| 1.5383 | 1587 | 0.8638 | - |
| 1.5393 | 1588 | 0.7441 | - |
| 1.5403 | 1589 | 0.6848 | - |
| 1.5412 | 1590 | 0.5622 | - |
| 1.5422 | 1591 | 0.9032 | - |
| 1.5432 | 1592 | 0.7939 | - |
| 1.5441 | 1593 | 0.7537 | - |
| 1.5451 | 1594 | 1.0496 | - |
| 1.5461 | 1595 | 0.6859 | - |
| 1.5470 | 1596 | 0.6735 | - |
| 1.5480 | 1597 | 0.7802 | - |
| 1.5490 | 1598 | 0.5412 | - |
| 1.5500 | 1599 | 0.7435 | - |
| 1.5509 | 1600 | 0.8165 | 0.7961 |
| 1.5519 | 1601 | 0.8609 | - |
| 1.5529 | 1602 | 0.731 | - |
| 1.5538 | 1603 | 0.5897 | - |
| 1.5548 | 1604 | 0.8745 | - |
| 1.5558 | 1605 | 0.7755 | - |
| 1.5567 | 1606 | 0.7862 | - |
| 1.5577 | 1607 | 0.5873 | - |
| 1.5587 | 1608 | 0.7339 | - |
| 1.5597 | 1609 | 0.79 | - |
| 1.5606 | 1610 | 0.7787 | - |
| 1.5616 | 1611 | 0.7633 | - |
| 1.5626 | 1612 | 0.7421 | - |
| 1.5635 | 1613 | 0.7208 | - |
| 1.5645 | 1614 | 0.831 | - |
| 1.5655 | 1615 | 0.6685 | - |
| 1.5664 | 1616 | 0.7849 | - |
| 1.5674 | 1617 | 0.8241 | - |
| 1.5684 | 1618 | 0.6251 | - |
| 1.5694 | 1619 | 0.6991 | - |
| 1.5703 | 1620 | 0.7075 | - |
| 1.5713 | 1621 | 0.6251 | - |
| 1.5723 | 1622 | 0.5788 | - |
| 1.5732 | 1623 | 0.6897 | - |
| 1.5742 | 1624 | 0.5351 | - |
| 1.5752 | 1625 | 0.6771 | - |
| 1.5761 | 1626 | 0.688 | - |
| 1.5771 | 1627 | 0.772 | - |
| 1.5781 | 1628 | 0.7549 | - |
| 1.5790 | 1629 | 0.7748 | - |
| 1.5800 | 1630 | 0.667 | - |
| 1.5810 | 1631 | 0.5608 | - |
| 1.5820 | 1632 | 0.6907 | - |
| 1.5829 | 1633 | 0.7686 | - |
| 1.5839 | 1634 | 0.8906 | - |
| 1.5849 | 1635 | 0.8486 | - |
| 1.5858 | 1636 | 0.8854 | - |
| 1.5868 | 1637 | 0.7551 | - |
| 1.5878 | 1638 | 0.6147 | - |
| 1.5887 | 1639 | 0.6694 | - |
| 1.5897 | 1640 | 0.5965 | - |
| 1.5907 | 1641 | 0.7636 | - |
| 1.5917 | 1642 | 0.8391 | - |
| 1.5926 | 1643 | 0.6128 | - |
| 1.5936 | 1644 | 0.5919 | - |
| 1.5946 | 1645 | 0.7991 | - |
| 1.5955 | 1646 | 0.7661 | - |
| 1.5965 | 1647 | 0.618 | - |
| 1.5975 | 1648 | 0.7847 | - |
| 1.5984 | 1649 | 0.7109 | - |
| 1.5994 | 1650 | 0.9202 | - |
| 1.6004 | 1651 | 0.7567 | - |
| 1.6014 | 1652 | 0.6609 | - |
| 1.6023 | 1653 | 0.6905 | - |
| 1.6033 | 1654 | 0.852 | - |
| 1.6043 | 1655 | 0.7588 | - |
| 1.6052 | 1656 | 0.8175 | - |
| 1.6062 | 1657 | 0.7543 | - |
| 1.6072 | 1658 | 0.9037 | - |
| 1.6081 | 1659 | 0.7401 | - |
| 1.6091 | 1660 | 0.6967 | - |
| 1.6101 | 1661 | 0.7833 | - |
| 1.6111 | 1662 | 0.6 | - |
| 1.6120 | 1663 | 0.6552 | - |
| 1.6130 | 1664 | 0.5971 | - |
| 1.6140 | 1665 | 0.7289 | - |
| 1.6149 | 1666 | 0.5993 | - |
| 1.6159 | 1667 | 0.5717 | - |
| 1.6169 | 1668 | 0.5738 | - |
| 1.6178 | 1669 | 0.7154 | - |
| 1.6188 | 1670 | 0.7596 | - |
| 1.6198 | 1671 | 0.6952 | - |
| 1.6208 | 1672 | 0.842 | - |
| 1.6217 | 1673 | 0.6087 | - |
| 1.6227 | 1674 | 0.7497 | - |
| 1.6237 | 1675 | 0.72 | - |
| 1.6246 | 1676 | 0.8619 | - |
| 1.6256 | 1677 | 0.6896 | - |
| 1.6266 | 1678 | 0.7126 | - |
| 1.6275 | 1679 | 0.8727 | - |
| 1.6285 | 1680 | 0.7492 | - |
| 1.6295 | 1681 | 0.8323 | - |
| 1.6305 | 1682 | 0.8472 | - |
| 1.6314 | 1683 | 0.7048 | - |
| 1.6324 | 1684 | 0.6237 | - |
| 1.6334 | 1685 | 0.4773 | - |
| 1.6343 | 1686 | 0.757 | - |
| 1.6353 | 1687 | 0.8287 | - |
| 1.6363 | 1688 | 0.7019 | - |
| 1.6372 | 1689 | 0.6554 | - |
| 1.6382 | 1690 | 0.8718 | - |
| 1.6392 | 1691 | 0.6256 | - |
| 1.6402 | 1692 | 0.715 | - |
| 1.6411 | 1693 | 0.7061 | - |
| 1.6421 | 1694 | 0.7735 | - |
| 1.6431 | 1695 | 0.6913 | - |
| 1.6440 | 1696 | 0.6472 | - |
| 1.6450 | 1697 | 0.7388 | - |
| 1.6460 | 1698 | 0.5621 | - |
| 1.6469 | 1699 | 0.6833 | - |
| 1.6479 | 1700 | 0.6612 | 0.8113 |
| 1.6489 | 1701 | 0.7632 | - |
| 1.6499 | 1702 | 0.7379 | - |
| 1.6508 | 1703 | 0.6027 | - |
| 1.6518 | 1704 | 0.6581 | - |
| 1.6528 | 1705 | 0.9163 | - |
| 1.6537 | 1706 | 0.5908 | - |
| 1.6547 | 1707 | 0.6724 | - |
| 1.6557 | 1708 | 0.9015 | - |
| 1.6566 | 1709 | 0.602 | - |
| 1.6576 | 1710 | 0.764 | - |
| 1.6586 | 1711 | 0.9204 | - |
| 1.6596 | 1712 | 0.7874 | - |
| 1.6605 | 1713 | 0.6792 | - |
| 1.6615 | 1714 | 1.0571 | - |
| 1.6625 | 1715 | 0.5636 | - |
| 1.6634 | 1716 | 0.6821 | - |
| 1.6644 | 1717 | 0.7831 | - |
| 1.6654 | 1718 | 0.5413 | - |
| 1.6663 | 1719 | 0.5983 | - |
| 1.6673 | 1720 | 0.7914 | - |
| 1.6683 | 1721 | 0.7817 | - |
| 1.6693 | 1722 | 0.8008 | - |
| 1.6702 | 1723 | 0.9789 | - |
| 1.6712 | 1724 | 0.8115 | - |
| 1.6722 | 1725 | 0.8184 | - |
| 1.6731 | 1726 | 0.7656 | - |
| 1.6741 | 1727 | 0.7962 | - |
| 1.6751 | 1728 | 0.668 | - |
| 1.6760 | 1729 | 0.7339 | - |
| 1.6770 | 1730 | 0.8075 | - |
| 1.6780 | 1731 | 0.8269 | - |
| 1.6790 | 1732 | 0.6609 | - |
| 1.6799 | 1733 | 0.6131 | - |
| 1.6809 | 1734 | 0.663 | - |
| 1.6819 | 1735 | 0.6928 | - |
| 1.6828 | 1736 | 0.7256 | - |
| 1.6838 | 1737 | 0.6625 | - |
| 1.6848 | 1738 | 0.8669 | - |
| 1.6857 | 1739 | 0.7212 | - |
| 1.6867 | 1740 | 0.8138 | - |
| 1.6877 | 1741 | 0.666 | - |
| 1.6887 | 1742 | 0.6959 | - |
| 1.6896 | 1743 | 0.7597 | - |
| 1.6906 | 1744 | 0.6856 | - |
| 1.6916 | 1745 | 0.7972 | - |
| 1.6925 | 1746 | 0.7378 | - |
| 1.6935 | 1747 | 0.7512 | - |
| 1.6945 | 1748 | 0.7767 | - |
| 1.6954 | 1749 | 0.6285 | - |
| 1.6964 | 1750 | 0.6225 | - |
| 1.6974 | 1751 | 0.6832 | - |
| 1.6984 | 1752 | 0.7369 | - |
| 1.6993 | 1753 | 0.6652 | - |
| 1.7003 | 1754 | 0.7356 | - |
| 1.7013 | 1755 | 0.6351 | - |
| 1.7022 | 1756 | 0.8455 | - |
| 1.7032 | 1757 | 0.7938 | - |
| 1.7042 | 1758 | 0.6644 | - |
| 1.7051 | 1759 | 0.607 | - |
| 1.7061 | 1760 | 0.8006 | - |
| 1.7071 | 1761 | 0.823 | - |
| 1.7081 | 1762 | 0.806 | - |
| 1.7090 | 1763 | 0.5637 | - |
| 1.7100 | 1764 | 0.6686 | - |
| 1.7110 | 1765 | 0.6267 | - |
| 1.7119 | 1766 | 0.6483 | - |
| 1.7129 | 1767 | 0.6823 | - |
| 1.7139 | 1768 | 0.7953 | - |
| 1.7148 | 1769 | 0.7937 | - |
| 1.7158 | 1770 | 0.715 | - |
| 1.7168 | 1771 | 0.6956 | - |
| 1.7177 | 1772 | 0.7322 | - |
| 1.7187 | 1773 | 0.6683 | - |
| 1.7197 | 1774 | 0.7737 | - |
| 1.7207 | 1775 | 0.7517 | - |
| 1.7216 | 1776 | 0.6453 | - |
| 1.7226 | 1777 | 0.6973 | - |
| 1.7236 | 1778 | 0.7086 | - |
| 1.7245 | 1779 | 0.9034 | - |
| 1.7255 | 1780 | 0.8997 | - |
| 1.7265 | 1781 | 0.7065 | - |
| 1.7274 | 1782 | 0.7131 | - |
| 1.7284 | 1783 | 0.7341 | - |
| 1.7294 | 1784 | 0.837 | - |
| 1.7304 | 1785 | 0.665 | - |
| 1.7313 | 1786 | 0.7239 | - |
| 1.7323 | 1787 | 0.7013 | - |
| 1.7333 | 1788 | 0.5536 | - |
| 1.7342 | 1789 | 0.6315 | - |
| 1.7352 | 1790 | 0.7614 | - |
| 1.7362 | 1791 | 0.8372 | - |
| 1.7371 | 1792 | 0.619 | - |
| 1.7381 | 1793 | 0.7073 | - |
| 1.7391 | 1794 | 0.7967 | - |
| 1.7401 | 1795 | 0.7013 | - |
| 1.7410 | 1796 | 0.6824 | - |
| 1.7420 | 1797 | 0.6354 | - |
| 1.7430 | 1798 | 0.732 | - |
| 1.7439 | 1799 | 0.6849 | - |
| 1.7449 | 1800 | 0.6575 | 0.8046 |
| 1.7459 | 1801 | 0.6977 | - |
| 1.7468 | 1802 | 0.7931 | - |
| 1.7478 | 1803 | 0.7585 | - |
| 1.7488 | 1804 | 0.8337 | - |
| 1.7498 | 1805 | 0.6137 | - |
| 1.7507 | 1806 | 0.801 | - |
| 1.7517 | 1807 | 0.7391 | - |
| 1.7527 | 1808 | 0.6558 | - |
| 1.7536 | 1809 | 0.664 | - |
| 1.7546 | 1810 | 0.7474 | - |
| 1.7556 | 1811 | 0.8532 | - |
| 1.7565 | 1812 | 0.7538 | - |
| 1.7575 | 1813 | 0.852 | - |
| 1.7585 | 1814 | 0.7035 | - |
| 1.7595 | 1815 | 0.5078 | - |
| 1.7604 | 1816 | 0.7348 | - |
| 1.7614 | 1817 | 0.7589 | - |
| 1.7624 | 1818 | 0.717 | - |
| 1.7633 | 1819 | 0.6664 | - |
| 1.7643 | 1820 | 0.8432 | - |
| 1.7653 | 1821 | 0.8765 | - |
| 1.7662 | 1822 | 0.6047 | - |
| 1.7672 | 1823 | 0.7773 | - |
| 1.7682 | 1824 | 0.7824 | - |
| 1.7692 | 1825 | 0.7697 | - |
| 1.7701 | 1826 | 0.6739 | - |
| 1.7711 | 1827 | 0.6484 | - |
| 1.7721 | 1828 | 0.7148 | - |
| 1.7730 | 1829 | 0.7107 | - |
| 1.7740 | 1830 | 0.6807 | - |
| 1.7750 | 1831 | 0.5445 | - |
| 1.7759 | 1832 | 0.7003 | - |
| 1.7769 | 1833 | 0.7163 | - |
| 1.7779 | 1834 | 0.7772 | - |
| 1.7789 | 1835 | 0.7915 | - |
| 1.7798 | 1836 | 0.762 | - |
| 1.7808 | 1837 | 0.7434 | - |
| 1.7818 | 1838 | 0.9288 | - |
| 1.7827 | 1839 | 0.6533 | - |
| 1.7837 | 1840 | 0.6722 | - |
| 1.7847 | 1841 | 0.6979 | - |
| 1.7856 | 1842 | 0.6648 | - |
| 1.7866 | 1843 | 0.7768 | - |
| 1.7876 | 1844 | 0.6604 | - |
| 1.7886 | 1845 | 0.7512 | - |
| 1.7895 | 1846 | 0.7303 | - |
| 1.7905 | 1847 | 0.7366 | - |
| 1.7915 | 1848 | 0.6329 | - |
| 1.7924 | 1849 | 0.8459 | - |
| 1.7934 | 1850 | 0.842 | - |
| 1.7944 | 1851 | 0.5862 | - |
| 1.7953 | 1852 | 0.6383 | - |
| 1.7963 | 1853 | 0.8031 | - |
| 1.7973 | 1854 | 0.775 | - |
| 1.7983 | 1855 | 0.764 | - |
| 1.7992 | 1856 | 0.7254 | - |
| 1.8002 | 1857 | 0.7771 | - |
| 1.8012 | 1858 | 0.8341 | - |
| 1.8021 | 1859 | 0.739 | - |
| 1.8031 | 1860 | 0.7977 | - |
| 1.8041 | 1861 | 0.6467 | - |
| 1.8050 | 1862 | 0.5262 | - |
| 1.8060 | 1863 | 0.6503 | - |
| 1.8070 | 1864 | 0.7916 | - |
| 1.8080 | 1865 | 0.6324 | - |
| 1.8089 | 1866 | 0.6694 | - |
| 1.8099 | 1867 | 0.9165 | - |
| 1.8109 | 1868 | 0.8335 | - |
| 1.8118 | 1869 | 0.8097 | - |
| 1.8128 | 1870 | 0.7214 | - |
| 1.8138 | 1871 | 0.625 | - |
| 1.8147 | 1872 | 0.6869 | - |
| 1.8157 | 1873 | 0.647 | - |
| 1.8167 | 1874 | 0.8229 | - |
| 1.8177 | 1875 | 0.7434 | - |
| 1.8186 | 1876 | 0.7254 | - |
| 1.8196 | 1877 | 0.7173 | - |
| 1.8206 | 1878 | 0.9296 | - |
| 1.8215 | 1879 | 0.5835 | - |
| 1.8225 | 1880 | 0.6052 | - |
| 1.8235 | 1881 | 0.6361 | - |
| 1.8244 | 1882 | 0.8452 | - |
| 1.8254 | 1883 | 0.6942 | - |
| 1.8264 | 1884 | 0.7272 | - |
| 1.8274 | 1885 | 0.7507 | - |
| 1.8283 | 1886 | 0.7883 | - |
| 1.8293 | 1887 | 0.7448 | - |
| 1.8303 | 1888 | 0.6287 | - |
| 1.8312 | 1889 | 0.7264 | - |
| 1.8322 | 1890 | 0.914 | - |
| 1.8332 | 1891 | 0.6529 | - |
| 1.8341 | 1892 | 0.8242 | - |
| 1.8351 | 1893 | 0.596 | - |
| 1.8361 | 1894 | 0.7097 | - |
| 1.8371 | 1895 | 0.7529 | - |
| 1.8380 | 1896 | 0.7671 | - |
| 1.8390 | 1897 | 0.678 | - |
| 1.8400 | 1898 | 0.8757 | - |
| 1.8409 | 1899 | 0.7371 | - |
| 1.8419 | 1900 | 0.6828 | 0.7876 |
| 1.8429 | 1901 | 0.6667 | - |
| 1.8438 | 1902 | 0.8272 | - |
| 1.8448 | 1903 | 0.5297 | - |
| 1.8458 | 1904 | 0.7095 | - |
| 1.8468 | 1905 | 0.8089 | - |
| 1.8477 | 1906 | 0.8868 | - |
| 1.8487 | 1907 | 0.6937 | - |
| 1.8497 | 1908 | 0.5408 | - |
| 1.8506 | 1909 | 0.5756 | - |
| 1.8516 | 1910 | 0.7298 | - |
| 1.8526 | 1911 | 0.681 | - |
| 1.8535 | 1912 | 0.7994 | - |
| 1.8545 | 1913 | 0.7438 | - |
| 1.8555 | 1914 | 0.6668 | - |
| 1.8565 | 1915 | 0.8449 | - |
| 1.8574 | 1916 | 0.6834 | - |
| 1.8584 | 1917 | 0.9498 | - |
| 1.8594 | 1918 | 0.5706 | - |
| 1.8603 | 1919 | 0.6799 | - |
| 1.8613 | 1920 | 0.7774 | - |
| 1.8623 | 1921 | 0.7812 | - |
| 1.8632 | 1922 | 0.5577 | - |
| 1.8642 | 1923 | 0.8447 | - |
| 1.8652 | 1924 | 0.7711 | - |
| 1.8661 | 1925 | 0.7756 | - |
| 1.8671 | 1926 | 0.564 | - |
| 1.8681 | 1927 | 0.6679 | - |
| 1.8691 | 1928 | 0.9041 | - |
| 1.8700 | 1929 | 0.7227 | - |
| 1.8710 | 1930 | 0.894 | - |
| 1.8720 | 1931 | 0.8396 | - |
| 1.8729 | 1932 | 0.7066 | - |
| 1.8739 | 1933 | 0.5352 | - |
| 1.8749 | 1934 | 0.7371 | - |
| 1.8758 | 1935 | 0.6748 | - |
| 1.8768 | 1936 | 0.7177 | - |
| 1.8778 | 1937 | 0.8082 | - |
| 1.8788 | 1938 | 0.8527 | - |
| 1.8797 | 1939 | 0.6969 | - |
| 1.8807 | 1940 | 0.6269 | - |
| 1.8817 | 1941 | 0.6769 | - |
| 1.8826 | 1942 | 0.6839 | - |
| 1.8836 | 1943 | 0.6378 | - |
| 1.8846 | 1944 | 1.0035 | - |
| 1.8855 | 1945 | 0.7998 | - |
| 1.8865 | 1946 | 0.727 | - |
| 1.8875 | 1947 | 0.646 | - |
| 1.8885 | 1948 | 0.8097 | - |
| 1.8894 | 1949 | 0.7683 | - |
| 1.8904 | 1950 | 0.649 | - |
| 1.8914 | 1951 | 0.8022 | - |
| 1.8923 | 1952 | 0.764 | - |
| 1.8933 | 1953 | 0.5857 | - |
| 1.8943 | 1954 | 0.6373 | - |
| 1.8952 | 1955 | 0.7232 | - |
| 1.8962 | 1956 | 0.628 | - |
| 1.8972 | 1957 | 0.8118 | - |
| 1.8982 | 1958 | 0.7808 | - |
| 1.8991 | 1959 | 0.6535 | - |
| 1.9001 | 1960 | 0.7101 | - |
| 1.9011 | 1961 | 0.7428 | - |
| 1.9020 | 1962 | 0.8839 | - |
| 1.9030 | 1963 | 0.8365 | - |
| 1.9040 | 1964 | 0.9767 | - |
| 1.9049 | 1965 | 0.6979 | - |
| 1.9059 | 1966 | 0.5444 | - |
| 1.9069 | 1967 | 0.7702 | - |
| 1.9079 | 1968 | 0.9249 | - |
| 1.9088 | 1969 | 0.7246 | - |
| 1.9098 | 1970 | 0.7728 | - |
| 1.9108 | 1971 | 0.6503 | - |
| 1.9117 | 1972 | 0.695 | - |
| 1.9127 | 1973 | 0.577 | - |
| 1.9137 | 1974 | 0.7251 | - |
| 1.9146 | 1975 | 0.4864 | - |
| 1.9156 | 1976 | 0.742 | - |
| 1.9166 | 1977 | 0.7708 | - |
| 1.9176 | 1978 | 0.5806 | - |
| 1.9185 | 1979 | 0.6627 | - |
| 1.9195 | 1980 | 0.8659 | - |
| 1.9205 | 1981 | 0.7937 | - |
| 1.9214 | 1982 | 0.6552 | - |
| 1.9224 | 1983 | 0.6047 | - |
| 1.9234 | 1984 | 0.7473 | - |
| 1.9243 | 1985 | 0.8233 | - |
| 1.9253 | 1986 | 0.7255 | - |
| 1.9263 | 1987 | 0.8763 | - |
| 1.9273 | 1988 | 0.8215 | - |
| 1.9282 | 1989 | 0.6413 | - |
| 1.9292 | 1990 | 0.7078 | - |
| 1.9302 | 1991 | 0.6256 | - |
| 1.9311 | 1992 | 0.7109 | - |
| 1.9321 | 1993 | 0.6234 | - |
| 1.9331 | 1994 | 0.6345 | - |
| 1.9340 | 1995 | 0.7889 | - |
| 1.9350 | 1996 | 0.8912 | - |
| 1.9360 | 1997 | 0.8942 | - |
| 1.9370 | 1998 | 0.7109 | - |
| 1.9379 | 1999 | 0.6851 | - |
| 1.9389 | 2000 | 0.735 | 0.8081 |
| 1.9399 | 2001 | 0.7644 | - |
| 1.9408 | 2002 | 0.7399 | - |
| 1.9418 | 2003 | 0.6904 | - |
| 1.9428 | 2004 | 0.7046 | - |
| 1.9437 | 2005 | 0.6325 | - |
| 1.9447 | 2006 | 0.6424 | - |
| 1.9457 | 2007 | 0.8156 | - |
| 1.9467 | 2008 | 0.645 | - |
| 1.9476 | 2009 | 0.7741 | - |
| 1.9486 | 2010 | 0.5627 | - |
| 1.9496 | 2011 | 0.7891 | - |
| 1.9505 | 2012 | 0.7316 | - |
| 1.9515 | 2013 | 0.6753 | - |
| 1.9525 | 2014 | 0.6432 | - |
| 1.9534 | 2015 | 0.711 | - |
| 1.9544 | 2016 | 0.7305 | - |
| 1.9554 | 2017 | 0.7298 | - |
| 1.9564 | 2018 | 0.7882 | - |
| 1.9573 | 2019 | 0.5478 | - |
| 1.9583 | 2020 | 0.6051 | - |
| 1.9593 | 2021 | 0.7872 | - |
| 1.9602 | 2022 | 0.68 | - |
| 1.9612 | 2023 | 0.7561 | - |
| 1.9622 | 2024 | 0.7967 | - |
| 1.9631 | 2025 | 0.5483 | - |
| 1.9641 | 2026 | 0.845 | - |
| 1.9651 | 2027 | 0.6567 | - |
| 1.9661 | 2028 | 0.8371 | - |
| 1.9670 | 2029 | 0.7164 | - |
| 1.9680 | 2030 | 0.691 | - |
| 1.9690 | 2031 | 0.6514 | - |
| 1.9699 | 2032 | 0.5925 | - |
| 1.9709 | 2033 | 0.6455 | - |
| 1.9719 | 2034 | 0.6055 | - |
| 1.9728 | 2035 | 0.5931 | - |
| 1.9738 | 2036 | 0.7638 | - |
| 1.9748 | 2037 | 0.7142 | - |
| 1.9758 | 2038 | 0.9528 | - |
| 1.9767 | 2039 | 0.7095 | - |
| 1.9777 | 2040 | 0.9217 | - |
| 1.9787 | 2041 | 0.6688 | - |
| 1.9796 | 2042 | 0.6394 | - |
| 1.9806 | 2043 | 0.6851 | - |
| 1.9816 | 2044 | 0.775 | - |
| 1.9825 | 2045 | 0.5942 | - |
| 1.9835 | 2046 | 0.5549 | - |
| 1.9845 | 2047 | 0.6548 | - |
| 1.9855 | 2048 | 0.7167 | - |
| 1.9864 | 2049 | 0.5756 | - |
| 1.9874 | 2050 | 0.6826 | - |
| 1.9884 | 2051 | 0.597 | - |
| 1.9893 | 2052 | 0.7482 | - |
| 1.9903 | 2053 | 0.6438 | - |
| 1.9913 | 2054 | 0.6134 | - |
| 1.9922 | 2055 | 0.644 | - |
| 1.9932 | 2056 | 0.6556 | - |
| 1.9942 | 2057 | 0.7017 | - |
| 1.9952 | 2058 | 0.8051 | - |
| 1.9961 | 2059 | 0.6849 | - |
| 1.9971 | 2060 | 0.5369 | - |
| 1.9981 | 2061 | 0.7209 | - |
| 1.9990 | 2062 | 0.768 | - |
| 2.0000 | 2063 | 0.9505 | - |
| 2.0010 | 2064 | 0.6213 | - |
| 2.0010 | 2065 | 0.697 | - |
| 2.0019 | 2066 | 0.5945 | - |
| 2.0029 | 2067 | 0.8256 | - |
| 2.0039 | 2068 | 0.7616 | - |
| 2.0048 | 2069 | 0.6724 | - |
| 2.0058 | 2070 | 0.75 | - |
| 2.0068 | 2071 | 0.723 | - |
| 2.0078 | 2072 | 0.6326 | - |
| 2.0087 | 2073 | 0.6084 | - |
| 2.0097 | 2074 | 0.6465 | - |
| 2.0107 | 2075 | 0.5935 | - |
| 2.0116 | 2076 | 0.5946 | - |
| 2.0126 | 2077 | 0.5824 | - |
| 2.0136 | 2078 | 0.6075 | - |
| 2.0145 | 2079 | 0.7978 | - |
| 2.0155 | 2080 | 0.5604 | - |
| 2.0165 | 2081 | 0.7576 | - |
| 2.0175 | 2082 | 0.868 | - |
| 2.0184 | 2083 | 0.7906 | - |
| 2.0194 | 2084 | 0.8015 | - |
| 2.0204 | 2085 | 0.6843 | - |
| 2.0213 | 2086 | 0.6486 | - |
| 2.0223 | 2087 | 0.7035 | - |
| 2.0233 | 2088 | 0.7376 | - |
| 2.0242 | 2089 | 0.6083 | - |
| 2.0252 | 2090 | 0.9665 | - |
| 2.0262 | 2091 | 0.6457 | - |
| 2.0272 | 2092 | 0.7228 | - |
| 2.0281 | 2093 | 0.5987 | - |
| 2.0291 | 2094 | 0.6075 | - |
| 2.0301 | 2095 | 0.7552 | - |
| 2.0310 | 2096 | 0.7809 | - |
| 2.0320 | 2097 | 0.5553 | - |
| 2.0330 | 2098 | 0.7906 | - |
| 2.0339 | 2099 | 0.7123 | - |
| 2.0349 | 2100 | 0.8322 | 0.8038 |
| 2.0359 | 2101 | 0.4839 | - |
| 2.0369 | 2102 | 0.6673 | - |
| 2.0378 | 2103 | 0.6374 | - |
| 2.0388 | 2104 | 0.6439 | - |
| 2.0398 | 2105 | 0.6442 | - |
| 2.0407 | 2106 | 0.6666 | - |
| 2.0417 | 2107 | 0.852 | - |
| 2.0427 | 2108 | 0.7694 | - |
| 2.0436 | 2109 | 0.6081 | - |
| 2.0446 | 2110 | 0.8221 | - |
| 2.0456 | 2111 | 0.6266 | - |
| 2.0466 | 2112 | 0.7678 | - |
| 2.0475 | 2113 | 0.7202 | - |
| 2.0485 | 2114 | 0.7524 | - |
| 2.0495 | 2115 | 0.7356 | - |
| 2.0504 | 2116 | 0.5937 | - |
| 2.0514 | 2117 | 0.704 | - |
| 2.0524 | 2118 | 0.6464 | - |
| 2.0533 | 2119 | 0.7039 | - |
| 2.0543 | 2120 | 0.6733 | - |
| 2.0553 | 2121 | 0.7823 | - |
| 2.0563 | 2122 | 0.7805 | - |
| 2.0572 | 2123 | 0.7935 | - |
| 2.0582 | 2124 | 0.7556 | - |
| 2.0592 | 2125 | 0.665 | - |
| 2.0601 | 2126 | 0.7393 | - |
| 2.0611 | 2127 | 0.7254 | - |
| 2.0621 | 2128 | 0.7674 | - |
| 2.0630 | 2129 | 0.6455 | - |
| 2.0640 | 2130 | 0.687 | - |
| 2.0650 | 2131 | 0.562 | - |
| 2.0660 | 2132 | 0.59 | - |
| 2.0669 | 2133 | 0.6257 | - |
| 2.0679 | 2134 | 0.6269 | - |
| 2.0689 | 2135 | 0.5602 | - |
| 2.0698 | 2136 | 0.5691 | - |
| 2.0708 | 2137 | 0.7111 | - |
| 2.0718 | 2138 | 0.6943 | - |
| 2.0727 | 2139 | 0.6063 | - |
| 2.0737 | 2140 | 0.7773 | - |
| 2.0747 | 2141 | 0.8427 | - |
| 2.0757 | 2142 | 0.4501 | - |
| 2.0766 | 2143 | 0.7915 | - |
| 2.0776 | 2144 | 0.8073 | - |
| 2.0786 | 2145 | 0.641 | - |
| 2.0795 | 2146 | 0.5195 | - |
| 2.0805 | 2147 | 0.8016 | - |
| 2.0815 | 2148 | 0.8433 | - |
| 2.0824 | 2149 | 0.6949 | - |
| 2.0834 | 2150 | 0.5516 | - |
| 2.0844 | 2151 | 0.5592 | - |
| 2.0854 | 2152 | 0.7595 | - |
| 2.0863 | 2153 | 0.6842 | - |
| 2.0873 | 2154 | 0.6193 | - |
| 2.0883 | 2155 | 0.6777 | - |
| 2.0892 | 2156 | 0.7572 | - |
| 2.0902 | 2157 | 0.6439 | - |
| 2.0912 | 2158 | 0.8789 | - |
| 2.0921 | 2159 | 0.7792 | - |
| 2.0931 | 2160 | 0.743 | - |
| 2.0941 | 2161 | 0.7873 | - |
| 2.0951 | 2162 | 0.8388 | - |
| 2.0960 | 2163 | 0.5687 | - |
| 2.0970 | 2164 | 0.6197 | - |
| 2.0980 | 2165 | 0.7098 | - |
| 2.0989 | 2166 | 0.6194 | - |
| 2.0999 | 2167 | 0.6567 | - |
| 2.1009 | 2168 | 0.6158 | - |
| 2.1018 | 2169 | 0.6464 | - |
| 2.1028 | 2170 | 0.6537 | - |
| 2.1038 | 2171 | 0.7301 | - |
| 2.1048 | 2172 | 0.6635 | - |
| 2.1057 | 2173 | 0.5954 | - |
| 2.1067 | 2174 | 0.6943 | - |
| 2.1077 | 2175 | 0.6524 | - |
| 2.1086 | 2176 | 0.7287 | - |
| 2.1096 | 2177 | 0.6389 | - |
| 2.1106 | 2178 | 0.6938 | - |
| 2.1115 | 2179 | 0.6134 | - |
| 2.1125 | 2180 | 0.6583 | - |
| 2.1135 | 2181 | 0.755 | - |
| 2.1145 | 2182 | 0.7083 | - |
| 2.1154 | 2183 | 0.7313 | - |
| 2.1164 | 2184 | 0.7368 | - |
| 2.1174 | 2185 | 0.7395 | - |
| 2.1183 | 2186 | 0.5992 | - |
| 2.1193 | 2187 | 0.5644 | - |
| 2.1203 | 2188 | 0.6177 | - |
| 2.1212 | 2189 | 0.7857 | - |
| 2.1222 | 2190 | 0.6157 | - |
| 2.1232 | 2191 | 0.6741 | - |
| 2.1242 | 2192 | 0.694 | - |
| 2.1251 | 2193 | 0.7318 | - |
| 2.1261 | 2194 | 0.6434 | - |
| 2.1271 | 2195 | 0.6612 | - |
| 2.1280 | 2196 | 0.6803 | - |
| 2.1290 | 2197 | 0.5998 | - |
| 2.1300 | 2198 | 0.4957 | - |
| 2.1309 | 2199 | 0.6614 | - |
| 2.1319 | 2200 | 0.7134 | 0.8142 |
| 2.1329 | 2201 | 0.5306 | - |
| 2.1339 | 2202 | 0.5336 | - |
| 2.1348 | 2203 | 0.5839 | - |
| 2.1358 | 2204 | 0.7654 | - |
| 2.1368 | 2205 | 0.5474 | - |
| 2.1377 | 2206 | 0.5837 | - |
| 2.1387 | 2207 | 0.468 | - |
| 2.1397 | 2208 | 0.5691 | - |
| 2.1406 | 2209 | 0.6057 | - |
| 2.1416 | 2210 | 0.7195 | - |
| 2.1426 | 2211 | 0.5807 | - |
| 2.1435 | 2212 | 0.5785 | - |
| 2.1445 | 2213 | 0.6073 | - |
| 2.1455 | 2214 | 0.7759 | - |
| 2.1465 | 2215 | 0.8264 | - |
| 2.1474 | 2216 | 0.5129 | - |
| 2.1484 | 2217 | 0.7901 | - |
| 2.1494 | 2218 | 0.7649 | - |
| 2.1503 | 2219 | 0.6914 | - |
| 2.1513 | 2220 | 0.702 | - |
| 2.1523 | 2221 | 0.7743 | - |
| 2.1532 | 2222 | 0.6295 | - |
| 2.1542 | 2223 | 0.5996 | - |
| 2.1552 | 2224 | 0.7745 | - |
| 2.1562 | 2225 | 0.727 | - |
| 2.1571 | 2226 | 0.6576 | - |
| 2.1581 | 2227 | 0.7042 | - |
| 2.1591 | 2228 | 0.6019 | - |
| 2.1600 | 2229 | 0.7618 | - |
| 2.1610 | 2230 | 0.6631 | - |
| 2.1620 | 2231 | 0.7242 | - |
| 2.1629 | 2232 | 0.6439 | - |
| 2.1639 | 2233 | 0.7622 | - |
| 2.1649 | 2234 | 0.6449 | - |
| 2.1659 | 2235 | 0.696 | - |
| 2.1668 | 2236 | 0.7449 | - |
| 2.1678 | 2237 | 0.6975 | - |
| 2.1688 | 2238 | 0.6087 | - |
| 2.1697 | 2239 | 0.5393 | - |
| 2.1707 | 2240 | 0.7284 | - |
| 2.1717 | 2241 | 0.5937 | - |
| 2.1726 | 2242 | 0.5201 | - |
| 2.1736 | 2243 | 0.7753 | - |
| 2.1746 | 2244 | 0.6054 | - |
| 2.1756 | 2245 | 0.6011 | - |
| 2.1765 | 2246 | 0.6828 | - |
| 2.1775 | 2247 | 0.7113 | - |
| 2.1785 | 2248 | 0.7737 | - |
| 2.1794 | 2249 | 0.7298 | - |
| 2.1804 | 2250 | 0.7054 | - |
| 2.1814 | 2251 | 0.6387 | - |
| 2.1823 | 2252 | 0.7226 | - |
| 2.1833 | 2253 | 0.6498 | - |
| 2.1843 | 2254 | 0.5952 | - |
| 2.1853 | 2255 | 0.5159 | - |
| 2.1862 | 2256 | 0.7067 | - |
| 2.1872 | 2257 | 0.5254 | - |
| 2.1882 | 2258 | 0.4552 | - |
| 2.1891 | 2259 | 0.5587 | - |
| 2.1901 | 2260 | 0.7326 | - |
| 2.1911 | 2261 | 0.8482 | - |
| 2.1920 | 2262 | 0.845 | - |
| 2.1930 | 2263 | 0.6742 | - |
| 2.1940 | 2264 | 0.6717 | - |
| 2.1950 | 2265 | 0.562 | - |
| 2.1959 | 2266 | 0.6129 | - |
| 2.1969 | 2267 | 0.8403 | - |
| 2.1979 | 2268 | 0.7992 | - |
| 2.1988 | 2269 | 0.6775 | - |
| 2.1998 | 2270 | 0.6762 | - |
| 2.2008 | 2271 | 0.7047 | - |
| 2.2017 | 2272 | 0.5106 | - |
| 2.2027 | 2273 | 0.7229 | - |
| 2.2037 | 2274 | 0.4966 | - |
| 2.2047 | 2275 | 0.6378 | - |
| 2.2056 | 2276 | 0.7507 | - |
| 2.2066 | 2277 | 0.5889 | - |
| 2.2076 | 2278 | 0.5453 | - |
| 2.2085 | 2279 | 0.806 | - |
| 2.2095 | 2280 | 0.6133 | - |
| 2.2105 | 2281 | 0.6721 | - |
| 2.2114 | 2282 | 0.601 | - |
| 2.2124 | 2283 | 0.6726 | - |
| 2.2134 | 2284 | 0.6404 | - |
| 2.2144 | 2285 | 0.5994 | - |
| 2.2153 | 2286 | 0.8444 | - |
| 2.2163 | 2287 | 0.6977 | - |
| 2.2173 | 2288 | 0.8614 | - |
| 2.2182 | 2289 | 0.5399 | - |
| 2.2192 | 2290 | 0.7301 | - |
| 2.2202 | 2291 | 0.6688 | - |
| 2.2211 | 2292 | 0.5011 | - |
| 2.2221 | 2293 | 0.9408 | - |
| 2.2231 | 2294 | 0.7823 | - |
| 2.2241 | 2295 | 0.64 | - |
| 2.2250 | 2296 | 0.6913 | - |
| 2.2260 | 2297 | 0.561 | - |
| 2.2270 | 2298 | 0.7667 | - |
| 2.2279 | 2299 | 0.6357 | - |
| 2.2289 | 2300 | 0.6608 | 0.8050 |
| 2.2299 | 2301 | 0.6165 | - |
| 2.2308 | 2302 | 0.5348 | - |
| 2.2318 | 2303 | 0.5755 | - |
| 2.2328 | 2304 | 0.501 | - |
| 2.2338 | 2305 | 0.6129 | - |
| 2.2347 | 2306 | 0.5947 | - |
| 2.2357 | 2307 | 0.6587 | - |
| 2.2367 | 2308 | 0.5762 | - |
| 2.2376 | 2309 | 0.6447 | - |
| 2.2386 | 2310 | 0.6996 | - |
| 2.2396 | 2311 | 0.5453 | - |
| 2.2405 | 2312 | 0.6236 | - |
| 2.2415 | 2313 | 0.5003 | - |
| 2.2425 | 2314 | 0.6777 | - |
| 2.2435 | 2315 | 0.6202 | - |
| 2.2444 | 2316 | 0.5856 | - |
| 2.2454 | 2317 | 0.5762 | - |
| 2.2464 | 2318 | 0.6564 | - |
| 2.2473 | 2319 | 0.7147 | - |
| 2.2483 | 2320 | 0.8049 | - |
| 2.2493 | 2321 | 0.6364 | - |
| 2.2502 | 2322 | 0.6419 | - |
| 2.2512 | 2323 | 0.5596 | - |
| 2.2522 | 2324 | 0.6926 | - |
| 2.2532 | 2325 | 0.6572 | - |
| 2.2541 | 2326 | 0.568 | - |
| 2.2551 | 2327 | 0.6515 | - |
| 2.2561 | 2328 | 0.5985 | - |
| 2.2570 | 2329 | 0.6158 | - |
| 2.2580 | 2330 | 0.7656 | - |
| 2.2590 | 2331 | 0.6629 | - |
| 2.2599 | 2332 | 0.7722 | - |
| 2.2609 | 2333 | 0.6679 | - |
| 2.2619 | 2334 | 0.648 | - |
| 2.2629 | 2335 | 0.6359 | - |
| 2.2638 | 2336 | 0.5415 | - |
| 2.2648 | 2337 | 0.5266 | - |
| 2.2658 | 2338 | 0.8612 | - |
| 2.2667 | 2339 | 0.6214 | - |
| 2.2677 | 2340 | 0.5972 | - |
| 2.2687 | 2341 | 0.7087 | - |
| 2.2696 | 2342 | 0.7453 | - |
| 2.2706 | 2343 | 0.6149 | - |
| 2.2716 | 2344 | 0.8606 | - |
| 2.2726 | 2345 | 0.4198 | - |
| 2.2735 | 2346 | 0.6141 | - |
| 2.2745 | 2347 | 0.5736 | - |
| 2.2755 | 2348 | 0.6334 | - |
| 2.2764 | 2349 | 0.563 | - |
| 2.2774 | 2350 | 0.624 | - |
| 2.2784 | 2351 | 0.5621 | - |
| 2.2793 | 2352 | 0.7086 | - |
| 2.2803 | 2353 | 0.4849 | - |
| 2.2813 | 2354 | 0.5143 | - |
| 2.2823 | 2355 | 0.5659 | - |
| 2.2832 | 2356 | 0.6138 | - |
| 2.2842 | 2357 | 0.6163 | - |
| 2.2852 | 2358 | 0.5264 | - |
| 2.2861 | 2359 | 0.4825 | - |
| 2.2871 | 2360 | 0.5727 | - |
| 2.2881 | 2361 | 0.6577 | - |
| 2.2890 | 2362 | 0.6634 | - |
| 2.2900 | 2363 | 0.5883 | - |
| 2.2910 | 2364 | 0.5863 | - |
| 2.2919 | 2365 | 0.5052 | - |
| 2.2929 | 2366 | 0.6017 | - |
| 2.2939 | 2367 | 0.6322 | - |
| 2.2949 | 2368 | 0.8126 | - |
| 2.2958 | 2369 | 0.682 | - |
| 2.2968 | 2370 | 0.7833 | - |
| 2.2978 | 2371 | 0.5465 | - |
| 2.2987 | 2372 | 0.6079 | - |
| 2.2997 | 2373 | 0.5704 | - |
| 2.3007 | 2374 | 0.5803 | - |
| 2.3016 | 2375 | 0.6285 | - |
| 2.3026 | 2376 | 0.44 | - |
| 2.3036 | 2377 | 0.5384 | - |
| 2.3046 | 2378 | 0.8658 | - |
| 2.3055 | 2379 | 0.5611 | - |
| 2.3065 | 2380 | 0.7294 | - |
| 2.3075 | 2381 | 0.7659 | - |
| 2.3084 | 2382 | 0.5574 | - |
| 2.3094 | 2383 | 0.7914 | - |
| 2.3104 | 2384 | 0.597 | - |
| 2.3113 | 2385 | 0.4018 | - |
| 2.3123 | 2386 | 0.5309 | - |
| 2.3133 | 2387 | 0.6193 | - |
| 2.3143 | 2388 | 0.6986 | - |
| 2.3152 | 2389 | 0.5962 | - |
| 2.3162 | 2390 | 0.444 | - |
| 2.3172 | 2391 | 0.7489 | - |
| 2.3181 | 2392 | 0.5069 | - |
| 2.3191 | 2393 | 0.7428 | - |
| 2.3201 | 2394 | 0.506 | - |
| 2.3210 | 2395 | 0.6058 | - |
| 2.3220 | 2396 | 0.6388 | - |
| 2.3230 | 2397 | 0.6608 | - |
| 2.3240 | 2398 | 0.5509 | - |
| 2.3249 | 2399 | 0.5127 | - |
| 2.3259 | 2400 | 0.6843 | 0.7911 |
| 2.3269 | 2401 | 0.6531 | - |
| 2.3278 | 2402 | 0.556 | - |
| 2.3288 | 2403 | 0.5438 | - |
| 2.3298 | 2404 | 0.6147 | - |
| 2.3307 | 2405 | 0.5102 | - |
| 2.3317 | 2406 | 0.6377 | - |
| 2.3327 | 2407 | 0.5563 | - |
| 2.3337 | 2408 | 0.4994 | - |
| 2.3346 | 2409 | 0.4898 | - |
| 2.3356 | 2410 | 0.545 | - |
| 2.3366 | 2411 | 0.8057 | - |
| 2.3375 | 2412 | 0.6199 | - |
| 2.3385 | 2413 | 0.818 | - |
| 2.3395 | 2414 | 0.6664 | - |
| 2.3404 | 2415 | 0.5573 | - |
| 2.3414 | 2416 | 0.5708 | - |
| 2.3424 | 2417 | 0.6146 | - |
| 2.3434 | 2418 | 0.6748 | - |
| 2.3443 | 2419 | 0.6544 | - |
| 2.3453 | 2420 | 0.6185 | - |
| 2.3463 | 2421 | 0.415 | - |
| 2.3472 | 2422 | 0.515 | - |
| 2.3482 | 2423 | 0.4978 | - |
| 2.3492 | 2424 | 0.5656 | - |
| 2.3501 | 2425 | 0.4267 | - |
| 2.3511 | 2426 | 0.6412 | - |
| 2.3521 | 2427 | 0.5355 | - |
| 2.3531 | 2428 | 0.6927 | - |
| 2.3540 | 2429 | 0.6316 | - |
| 2.3550 | 2430 | 0.5382 | - |
| 2.3560 | 2431 | 0.607 | - |
| 2.3569 | 2432 | 0.5236 | - |
| 2.3579 | 2433 | 0.4755 | - |
| 2.3589 | 2434 | 0.5147 | - |
| 2.3598 | 2435 | 0.6042 | - |
| 2.3608 | 2436 | 0.6468 | - |
| 2.3618 | 2437 | 0.4508 | - |
| 2.3628 | 2438 | 0.6722 | - |
| 2.3637 | 2439 | 0.6243 | - |
| 2.3647 | 2440 | 0.5883 | - |
| 2.3657 | 2441 | 0.6832 | - |
| 2.3666 | 2442 | 0.693 | - |
| 2.3676 | 2443 | 0.6997 | - |
| 2.3686 | 2444 | 0.6196 | - |
| 2.3695 | 2445 | 0.6083 | - |
| 2.3705 | 2446 | 0.5101 | - |
| 2.3715 | 2447 | 0.4764 | - |
| 2.3725 | 2448 | 0.7069 | - |
| 2.3734 | 2449 | 0.5851 | - |
| 2.3744 | 2450 | 0.5511 | - |
| 2.3754 | 2451 | 0.6408 | - |
| 2.3763 | 2452 | 0.5199 | - |
| 2.3773 | 2453 | 0.4659 | - |
| 2.3783 | 2454 | 0.5275 | - |
| 2.3792 | 2455 | 0.5008 | - |
| 2.3802 | 2456 | 0.7143 | - |
| 2.3812 | 2457 | 0.6084 | - |
| 2.3822 | 2458 | 0.6363 | - |
| 2.3831 | 2459 | 0.5089 | - |
| 2.3841 | 2460 | 0.8134 | - |
| 2.3851 | 2461 | 0.6219 | - |
| 2.3860 | 2462 | 0.599 | - |
| 2.3870 | 2463 | 0.7124 | - |
| 2.3880 | 2464 | 0.6723 | - |
| 2.3889 | 2465 | 0.4625 | - |
| 2.3899 | 2466 | 0.5003 | - |
| 2.3909 | 2467 | 0.3922 | - |
| 2.3919 | 2468 | 0.5405 | - |
| 2.3928 | 2469 | 0.722 | - |
| 2.3938 | 2470 | 0.6158 | - |
| 2.3948 | 2471 | 0.4851 | - |
| 2.3957 | 2472 | 0.5411 | - |
| 2.3967 | 2473 | 0.5564 | - |
| 2.3977 | 2474 | 0.564 | - |
| 2.3986 | 2475 | 0.6014 | - |
| 2.3996 | 2476 | 0.5399 | - |
| 2.4006 | 2477 | 0.5258 | - |
| 2.4016 | 2478 | 0.5253 | - |
| 2.4025 | 2479 | 0.6459 | - |
| 2.4035 | 2480 | 0.6064 | - |
| 2.4045 | 2481 | 0.6295 | - |
| 2.4054 | 2482 | 0.6232 | - |
| 2.4064 | 2483 | 0.56 | - |
| 2.4074 | 2484 | 0.6047 | - |
| 2.4083 | 2485 | 0.6638 | - |
| 2.4093 | 2486 | 0.6538 | - |
| 2.4103 | 2487 | 0.5277 | - |
| 2.4113 | 2488 | 0.6008 | - |
| 2.4122 | 2489 | 0.5194 | - |
| 2.4132 | 2490 | 0.4957 | - |
| 2.4142 | 2491 | 0.4893 | - |
| 2.4151 | 2492 | 0.544 | - |
| 2.4161 | 2493 | 0.5037 | - |
| 2.4171 | 2494 | 0.577 | - |
| 2.4180 | 2495 | 0.5825 | - |
| 2.4190 | 2496 | 0.5512 | - |
| 2.4200 | 2497 | 0.6542 | - |
| 2.4210 | 2498 | 0.5824 | - |
| 2.4219 | 2499 | 0.4363 | - |
| 2.4229 | 2500 | 0.3686 | 0.7976 |
| 2.4239 | 2501 | 0.8667 | - |
| 2.4248 | 2502 | 0.6225 | - |
| 2.4258 | 2503 | 0.6417 | - |
| 2.4268 | 2504 | 0.4422 | - |
| 2.4277 | 2505 | 0.6904 | - |
| 2.4287 | 2506 | 0.4017 | - |
| 2.4297 | 2507 | 0.6005 | - |
| 2.4306 | 2508 | 0.6127 | - |
| 2.4316 | 2509 | 0.5502 | - |
| 2.4326 | 2510 | 0.6193 | - |
| 2.4336 | 2511 | 0.7539 | - |
| 2.4345 | 2512 | 0.6475 | - |
| 2.4355 | 2513 | 0.5581 | - |
| 2.4365 | 2514 | 0.5482 | - |
| 2.4374 | 2515 | 0.4977 | - |
| 2.4384 | 2516 | 0.5233 | - |
| 2.4394 | 2517 | 0.6893 | - |
| 2.4403 | 2518 | 0.4326 | - |
| 2.4413 | 2519 | 0.6881 | - |
| 2.4423 | 2520 | 0.4547 | - |
| 2.4433 | 2521 | 0.5573 | - |
| 2.4442 | 2522 | 0.5357 | - |
| 2.4452 | 2523 | 0.5211 | - |
| 2.4462 | 2524 | 0.579 | - |
| 2.4471 | 2525 | 0.5532 | - |
| 2.4481 | 2526 | 0.5533 | - |
| 2.4491 | 2527 | 0.6261 | - |
| 2.4500 | 2528 | 0.587 | - |
| 2.4510 | 2529 | 0.5719 | - |
| 2.4520 | 2530 | 0.6108 | - |
| 2.4530 | 2531 | 0.5609 | - |
| 2.4539 | 2532 | 0.4883 | - |
| 2.4549 | 2533 | 0.6455 | - |
| 2.4559 | 2534 | 0.6324 | - |
| 2.4568 | 2535 | 0.5311 | - |
| 2.4578 | 2536 | 0.5076 | - |
| 2.4588 | 2537 | 0.5674 | - |
| 2.4597 | 2538 | 0.5394 | - |
| 2.4607 | 2539 | 0.5041 | - |
| 2.4617 | 2540 | 0.4288 | - |
| 2.4627 | 2541 | 0.5671 | - |
| 2.4636 | 2542 | 0.5632 | - |
| 2.4646 | 2543 | 0.6149 | - |
| 2.4656 | 2544 | 0.6663 | - |
| 2.4665 | 2545 | 0.5429 | - |
| 2.4675 | 2546 | 0.6046 | - |
| 2.4685 | 2547 | 0.5275 | - |
| 2.4694 | 2548 | 0.5669 | - |
| 2.4704 | 2549 | 0.5259 | - |
| 2.4714 | 2550 | 0.6942 | - |
| 2.4724 | 2551 | 0.6705 | - |
| 2.4733 | 2552 | 0.4577 | - |
| 2.4743 | 2553 | 0.6303 | - |
| 2.4753 | 2554 | 0.5975 | - |
| 2.4762 | 2555 | 0.5383 | - |
| 2.4772 | 2556 | 0.4658 | - |
| 2.4782 | 2557 | 0.6125 | - |
| 2.4791 | 2558 | 0.5363 | - |
| 2.4801 | 2559 | 0.4766 | - |
| 2.4811 | 2560 | 0.4872 | - |
| 2.4821 | 2561 | 0.5881 | - |
| 2.4830 | 2562 | 0.4837 | - |
| 2.4840 | 2563 | 0.5513 | - |
| 2.4850 | 2564 | 0.6088 | - |
| 2.4859 | 2565 | 0.5902 | - |
| 2.4869 | 2566 | 0.5341 | - |
| 2.4879 | 2567 | 0.6034 | - |
| 2.4888 | 2568 | 0.5354 | - |
| 2.4898 | 2569 | 0.5055 | - |
| 2.4908 | 2570 | 0.6696 | - |
| 2.4918 | 2571 | 0.4503 | - |
| 2.4927 | 2572 | 0.53 | - |
| 2.4937 | 2573 | 0.4621 | - |
| 2.4947 | 2574 | 0.5489 | - |
| 2.4956 | 2575 | 0.4142 | - |
| 2.4966 | 2576 | 0.5252 | - |
| 2.4976 | 2577 | 0.7389 | - |
| 2.4985 | 2578 | 0.6858 | - |
| 2.4995 | 2579 | 0.616 | - |
| 2.5005 | 2580 | 0.5036 | - |
| 2.5015 | 2581 | 0.5524 | - |
| 2.5024 | 2582 | 0.6247 | - |
| 2.5034 | 2583 | 0.5257 | - |
| 2.5044 | 2584 | 0.5289 | - |
| 2.5053 | 2585 | 0.4754 | - |
| 2.5063 | 2586 | 0.4092 | - |
| 2.5073 | 2587 | 0.4759 | - |
| 2.5082 | 2588 | 0.5213 | - |
| 2.5092 | 2589 | 0.5248 | - |
| 2.5102 | 2590 | 0.4827 | - |
| 2.5112 | 2591 | 0.4829 | - |
| 2.5121 | 2592 | 0.457 | - |
| 2.5131 | 2593 | 0.5259 | - |
| 2.5141 | 2594 | 0.5555 | - |
| 2.5150 | 2595 | 0.5976 | - |
| 2.5160 | 2596 | 0.4375 | - |
| 2.5170 | 2597 | 0.5486 | - |
| 2.5179 | 2598 | 0.6078 | - |
| 2.5189 | 2599 | 0.3893 | - |
| 2.5199 | 2600 | 0.7378 | 0.7975 |
| 2.5209 | 2601 | 0.6217 | - |
| 2.5218 | 2602 | 0.6347 | - |
| 2.5228 | 2603 | 0.6141 | - |
| 2.5238 | 2604 | 0.5735 | - |
| 2.5247 | 2605 | 0.4771 | - |
| 2.5257 | 2606 | 0.5558 | - |
| 2.5267 | 2607 | 0.5981 | - |
| 2.5276 | 2608 | 0.4218 | - |
| 2.5286 | 2609 | 0.6544 | - |
| 2.5296 | 2610 | 0.5261 | - |
| 2.5306 | 2611 | 0.5354 | - |
| 2.5315 | 2612 | 0.6805 | - |
| 2.5325 | 2613 | 0.4851 | - |
| 2.5335 | 2614 | 0.564 | - |
| 2.5344 | 2615 | 0.6193 | - |
| 2.5354 | 2616 | 0.4883 | - |
| 2.5364 | 2617 | 0.532 | - |
| 2.5373 | 2618 | 0.6659 | - |
| 2.5383 | 2619 | 0.6307 | - |
| 2.5393 | 2620 | 0.551 | - |
| 2.5403 | 2621 | 0.4974 | - |
| 2.5412 | 2622 | 0.4089 | - |
| 2.5422 | 2623 | 0.613 | - |
| 2.5432 | 2624 | 0.5561 | - |
| 2.5441 | 2625 | 0.5308 | - |
| 2.5451 | 2626 | 0.8394 | - |
| 2.5461 | 2627 | 0.5317 | - |
| 2.5470 | 2628 | 0.5092 | - |
| 2.5480 | 2629 | 0.4916 | - |
| 2.5490 | 2630 | 0.4255 | - |
| 2.5500 | 2631 | 0.5399 | - |
| 2.5509 | 2632 | 0.5806 | - |
| 2.5519 | 2633 | 0.6364 | - |
| 2.5529 | 2634 | 0.5112 | - |
| 2.5538 | 2635 | 0.4603 | - |
| 2.5548 | 2636 | 0.6033 | - |
| 2.5558 | 2637 | 0.5756 | - |
| 2.5567 | 2638 | 0.5932 | - |
| 2.5577 | 2639 | 0.4626 | - |
| 2.5587 | 2640 | 0.5018 | - |
| 2.5597 | 2641 | 0.575 | - |
| 2.5606 | 2642 | 0.6023 | - |
| 2.5616 | 2643 | 0.6095 | - |
| 2.5626 | 2644 | 0.6539 | - |
| 2.5635 | 2645 | 0.4795 | - |
| 2.5645 | 2646 | 0.5672 | - |
| 2.5655 | 2647 | 0.4684 | - |
| 2.5664 | 2648 | 0.5452 | - |
| 2.5674 | 2649 | 0.6177 | - |
| 2.5684 | 2650 | 0.4874 | - |
| 2.5694 | 2651 | 0.512 | - |
| 2.5703 | 2652 | 0.5485 | - |
| 2.5713 | 2653 | 0.4808 | - |
| 2.5723 | 2654 | 0.3866 | - |
| 2.5732 | 2655 | 0.485 | - |
| 2.5742 | 2656 | 0.3588 | - |
| 2.5752 | 2657 | 0.4549 | - |
| 2.5761 | 2658 | 0.5455 | - |
| 2.5771 | 2659 | 0.6106 | - |
| 2.5781 | 2660 | 0.63 | - |
| 2.5790 | 2661 | 0.6071 | - |
| 2.5800 | 2662 | 0.5064 | - |
| 2.5810 | 2663 | 0.4227 | - |
| 2.5820 | 2664 | 0.4452 | - |
| 2.5829 | 2665 | 0.5478 | - |
| 2.5839 | 2666 | 0.6906 | - |
| 2.5849 | 2667 | 0.563 | - |
| 2.5858 | 2668 | 0.6233 | - |
| 2.5868 | 2669 | 0.5206 | - |
| 2.5878 | 2670 | 0.4622 | - |
| 2.5887 | 2671 | 0.4753 | - |
| 2.5897 | 2672 | 0.4132 | - |
| 2.5907 | 2673 | 0.6369 | - |
| 2.5917 | 2674 | 0.6229 | - |
| 2.5926 | 2675 | 0.4426 | - |
| 2.5936 | 2676 | 0.4372 | - |
| 2.5946 | 2677 | 0.5765 | - |
| 2.5955 | 2678 | 0.575 | - |
| 2.5965 | 2679 | 0.4613 | - |
| 2.5975 | 2680 | 0.561 | - |
| 2.5984 | 2681 | 0.514 | - |
| 2.5994 | 2682 | 0.6764 | - |
| 2.6004 | 2683 | 0.567 | - |
| 2.6014 | 2684 | 0.5336 | - |
| 2.6023 | 2685 | 0.5113 | - |
| 2.6033 | 2686 | 0.6015 | - |
| 2.6043 | 2687 | 0.5233 | - |
| 2.6052 | 2688 | 0.586 | - |
| 2.6062 | 2689 | 0.5228 | - |
| 2.6072 | 2690 | 0.6954 | - |
| 2.6081 | 2691 | 0.5381 | - |
| 2.6091 | 2692 | 0.5604 | - |
| 2.6101 | 2693 | 0.6094 | - |
| 2.6111 | 2694 | 0.4444 | - |
| 2.6120 | 2695 | 0.4708 | - |
| 2.6130 | 2696 | 0.4829 | - |
| 2.6140 | 2697 | 0.568 | - |
| 2.6149 | 2698 | 0.478 | - |
| 2.6159 | 2699 | 0.4591 | - |
| 2.6169 | 2700 | 0.4439 | 0.7997 |
| 2.6178 | 2701 | 0.5144 | - |
| 2.6188 | 2702 | 0.5368 | - |
| 2.6198 | 2703 | 0.5427 | - |
| 2.6208 | 2704 | 0.6137 | - |
| 2.6217 | 2705 | 0.4571 | - |
| 2.6227 | 2706 | 0.5479 | - |
| 2.6237 | 2707 | 0.5089 | - |
| 2.6246 | 2708 | 0.6946 | - |
| 2.6256 | 2709 | 0.5076 | - |
| 2.6266 | 2710 | 0.6072 | - |
| 2.6275 | 2711 | 0.6197 | - |
| 2.6285 | 2712 | 0.5605 | - |
| 2.6295 | 2713 | 0.5985 | - |
| 2.6305 | 2714 | 0.611 | - |
| 2.6314 | 2715 | 0.5473 | - |
| 2.6324 | 2716 | 0.4145 | - |
| 2.6334 | 2717 | 0.3962 | - |
| 2.6343 | 2718 | 0.5955 | - |
| 2.6353 | 2719 | 0.6185 | - |
| 2.6363 | 2720 | 0.527 | - |
| 2.6372 | 2721 | 0.5018 | - |
| 2.6382 | 2722 | 0.663 | - |
| 2.6392 | 2723 | 0.4816 | - |
| 2.6402 | 2724 | 0.5169 | - |
| 2.6411 | 2725 | 0.5677 | - |
| 2.6421 | 2726 | 0.5976 | - |
| 2.6431 | 2727 | 0.4865 | - |
| 2.6440 | 2728 | 0.4505 | - |
| 2.6450 | 2729 | 0.5861 | - |
| 2.6460 | 2730 | 0.3877 | - |
| 2.6469 | 2731 | 0.4313 | - |
| 2.6479 | 2732 | 0.4827 | - |
| 2.6489 | 2733 | 0.5711 | - |
| 2.6499 | 2734 | 0.5826 | - |
| 2.6508 | 2735 | 0.4632 | - |
| 2.6518 | 2736 | 0.4886 | - |
| 2.6528 | 2737 | 0.5857 | - |
| 2.6537 | 2738 | 0.3913 | - |
| 2.6547 | 2739 | 0.5086 | - |
| 2.6557 | 2740 | 0.7045 | - |
| 2.6566 | 2741 | 0.4195 | - |
| 2.6576 | 2742 | 0.5942 | - |
| 2.6586 | 2743 | 0.6162 | - |
| 2.6596 | 2744 | 0.5439 | - |
| 2.6605 | 2745 | 0.5012 | - |
| 2.6615 | 2746 | 0.79 | - |
| 2.6625 | 2747 | 0.4013 | - |
| 2.6634 | 2748 | 0.4883 | - |
| 2.6644 | 2749 | 0.5258 | - |
| 2.6654 | 2750 | 0.4169 | - |
| 2.6663 | 2751 | 0.3749 | - |
| 2.6673 | 2752 | 0.5412 | - |
| 2.6683 | 2753 | 0.6017 | - |
| 2.6693 | 2754 | 0.5658 | - |
| 2.6702 | 2755 | 0.7508 | - |
| 2.6712 | 2756 | 0.6179 | - |
| 2.6722 | 2757 | 0.6221 | - |
| 2.6731 | 2758 | 0.6117 | - |
| 2.6741 | 2759 | 0.7156 | - |
| 2.6751 | 2760 | 0.5204 | - |
| 2.6760 | 2761 | 0.5496 | - |
| 2.6770 | 2762 | 0.6063 | - |
| 2.6780 | 2763 | 0.6665 | - |
| 2.6790 | 2764 | 0.4716 | - |
| 2.6799 | 2765 | 0.4645 | - |
| 2.6809 | 2766 | 0.5171 | - |
| 2.6819 | 2767 | 0.5293 | - |
| 2.6828 | 2768 | 0.5263 | - |
| 2.6838 | 2769 | 0.4517 | - |
| 2.6848 | 2770 | 0.6595 | - |
| 2.6857 | 2771 | 0.535 | - |
| 2.6867 | 2772 | 0.5361 | - |
| 2.6877 | 2773 | 0.4418 | - |
| 2.6887 | 2774 | 0.4982 | - |
| 2.6896 | 2775 | 0.4709 | - |
| 2.6906 | 2776 | 0.5114 | - |
| 2.6916 | 2777 | 0.5641 | - |
| 2.6925 | 2778 | 0.5416 | - |
| 2.6935 | 2779 | 0.5552 | - |
| 2.6945 | 2780 | 0.5991 | - |
| 2.6954 | 2781 | 0.4499 | - |
| 2.6964 | 2782 | 0.4542 | - |
| 2.6974 | 2783 | 0.5265 | - |
| 2.6984 | 2784 | 0.5254 | - |
| 2.6993 | 2785 | 0.5247 | - |
| 2.7003 | 2786 | 0.5344 | - |
| 2.7013 | 2787 | 0.5097 | - |
| 2.7022 | 2788 | 0.6562 | - |
| 2.7032 | 2789 | 0.6604 | - |
| 2.7042 | 2790 | 0.5394 | - |
| 2.7051 | 2791 | 0.4455 | - |
| 2.7061 | 2792 | 0.5545 | - |
| 2.7071 | 2793 | 0.6043 | - |
| 2.7081 | 2794 | 0.6313 | - |
| 2.7090 | 2795 | 0.4341 | - |
| 2.7100 | 2796 | 0.472 | - |
| 2.7110 | 2797 | 0.4265 | - |
| 2.7119 | 2798 | 0.4814 | - |
| 2.7129 | 2799 | 0.4924 | - |
| 2.7139 | 2800 | 0.6586 | 0.7953 |
| 2.7148 | 2801 | 0.5459 | - |
| 2.7158 | 2802 | 0.602 | - |
| 2.7168 | 2803 | 0.4941 | - |
| 2.7177 | 2804 | 0.5797 | - |
| 2.7187 | 2805 | 0.4563 | - |
| 2.7197 | 2806 | 0.5839 | - |
| 2.7207 | 2807 | 0.5645 | - |
| 2.7216 | 2808 | 0.4621 | - |
| 2.7226 | 2809 | 0.4713 | - |
| 2.7236 | 2810 | 0.5251 | - |
| 2.7245 | 2811 | 0.7717 | - |
| 2.7255 | 2812 | 0.6003 | - |
| 2.7265 | 2813 | 0.4983 | - |
| 2.7274 | 2814 | 0.5394 | - |
| 2.7284 | 2815 | 0.5679 | - |
| 2.7294 | 2816 | 0.594 | - |
| 2.7304 | 2817 | 0.52 | - |
| 2.7313 | 2818 | 0.5036 | - |
| 2.7323 | 2819 | 0.4444 | - |
| 2.7333 | 2820 | 0.3981 | - |
| 2.7342 | 2821 | 0.447 | - |
| 2.7352 | 2822 | 0.5789 | - |
| 2.7362 | 2823 | 0.6871 | - |
| 2.7371 | 2824 | 0.4199 | - |
| 2.7381 | 2825 | 0.5459 | - |
| 2.7391 | 2826 | 0.5619 | - |
| 2.7401 | 2827 | 0.5157 | - |
| 2.7410 | 2828 | 0.4704 | - |
| 2.7420 | 2829 | 0.4609 | - |
| 2.7430 | 2830 | 0.5302 | - |
| 2.7439 | 2831 | 0.5111 | - |
| 2.7449 | 2832 | 0.4888 | - |
| 2.7459 | 2833 | 0.5704 | - |
| 2.7468 | 2834 | 0.5382 | - |
| 2.7478 | 2835 | 0.5218 | - |
| 2.7488 | 2836 | 0.5555 | - |
| 2.7498 | 2837 | 0.4535 | - |
| 2.7507 | 2838 | 0.5652 | - |
| 2.7517 | 2839 | 0.53 | - |
| 2.7527 | 2840 | 0.4659 | - |
| 2.7536 | 2841 | 0.4626 | - |
| 2.7546 | 2842 | 0.5034 | - |
| 2.7556 | 2843 | 0.6065 | - |
| 2.7565 | 2844 | 0.5343 | - |
| 2.7575 | 2845 | 0.6133 | - |
| 2.7585 | 2846 | 0.479 | - |
| 2.7595 | 2847 | 0.3796 | - |
| 2.7604 | 2848 | 0.52 | - |
| 2.7614 | 2849 | 0.5467 | - |
| 2.7624 | 2850 | 0.614 | - |
| 2.7633 | 2851 | 0.5326 | - |
| 2.7643 | 2852 | 0.627 | - |
| 2.7653 | 2853 | 0.6664 | - |
| 2.7662 | 2854 | 0.4338 | - |
| 2.7672 | 2855 | 0.5638 | - |
| 2.7682 | 2856 | 0.5336 | - |
| 2.7692 | 2857 | 0.5564 | - |
| 2.7701 | 2858 | 0.5245 | - |
| 2.7711 | 2859 | 0.4499 | - |
| 2.7721 | 2860 | 0.573 | - |
| 2.7730 | 2861 | 0.495 | - |
| 2.7740 | 2862 | 0.5491 | - |
| 2.7750 | 2863 | 0.4094 | - |
| 2.7759 | 2864 | 0.5092 | - |
| 2.7769 | 2865 | 0.4925 | - |
| 2.7779 | 2866 | 0.5885 | - |
| 2.7789 | 2867 | 0.5085 | - |
| 2.7798 | 2868 | 0.4995 | - |
| 2.7808 | 2869 | 0.5395 | - |
| 2.7818 | 2870 | 0.7295 | - |
| 2.7827 | 2871 | 0.4588 | - |
| 2.7837 | 2872 | 0.4756 | - |
| 2.7847 | 2873 | 0.5386 | - |
| 2.7856 | 2874 | 0.5446 | - |
| 2.7866 | 2875 | 0.5655 | - |
| 2.7876 | 2876 | 0.5231 | - |
| 2.7886 | 2877 | 0.5023 | - |
| 2.7895 | 2878 | 0.5628 | - |
| 2.7905 | 2879 | 0.581 | - |
| 2.7915 | 2880 | 0.4391 | - |
| 2.7924 | 2881 | 0.6174 | - |
| 2.7934 | 2882 | 0.664 | - |
| 2.7944 | 2883 | 0.4486 | - |
| 2.7953 | 2884 | 0.5085 | - |
| 2.7963 | 2885 | 0.6983 | - |
| 2.7973 | 2886 | 0.5854 | - |
| 2.7983 | 2887 | 0.6174 | - |
| 2.7992 | 2888 | 0.5427 | - |
| 2.8002 | 2889 | 0.5126 | - |
| 2.8012 | 2890 | 0.6293 | - |
| 2.8021 | 2891 | 0.5768 | - |
| 2.8031 | 2892 | 0.632 | - |
| 2.8041 | 2893 | 0.5053 | - |
| 2.8050 | 2894 | 0.4158 | - |
| 2.8060 | 2895 | 0.4752 | - |
| 2.8070 | 2896 | 0.5845 | - |
| 2.8080 | 2897 | 0.4671 | - |
| 2.8089 | 2898 | 0.5059 | - |
| 2.8099 | 2899 | 0.6917 | - |
| 2.8109 | 2900 | 0.6657 | 0.8000 |
| 2.8118 | 2901 | 0.6367 | - |
| 2.8128 | 2902 | 0.5312 | - |
| 2.8138 | 2903 | 0.4563 | - |
| 2.8147 | 2904 | 0.4352 | - |
| 2.8157 | 2905 | 0.5129 | - |
| 2.8167 | 2906 | 0.6363 | - |
| 2.8177 | 2907 | 0.5407 | - |
| 2.8186 | 2908 | 0.548 | - |
| 2.8196 | 2909 | 0.4679 | - |
| 2.8206 | 2910 | 0.7003 | - |
| 2.8215 | 2911 | 0.4061 | - |
| 2.8225 | 2912 | 0.4207 | - |
| 2.8235 | 2913 | 0.4606 | - |
| 2.8244 | 2914 | 0.5914 | - |
| 2.8254 | 2915 | 0.5725 | - |
| 2.8264 | 2916 | 0.5012 | - |
| 2.8274 | 2917 | 0.5847 | - |
| 2.8283 | 2918 | 0.5614 | - |
| 2.8293 | 2919 | 0.6195 | - |
| 2.8303 | 2920 | 0.4192 | - |
| 2.8312 | 2921 | 0.4858 | - |
| 2.8322 | 2922 | 0.681 | - |
| 2.8332 | 2923 | 0.5237 | - |
| 2.8341 | 2924 | 0.6461 | - |
| 2.8351 | 2925 | 0.4185 | - |
| 2.8361 | 2926 | 0.5061 | - |
| 2.8371 | 2927 | 0.4818 | - |
| 2.8380 | 2928 | 0.5655 | - |
| 2.8390 | 2929 | 0.5376 | - |
| 2.8400 | 2930 | 0.619 | - |
| 2.8409 | 2931 | 0.5344 | - |
| 2.8419 | 2932 | 0.5415 | - |
| 2.8429 | 2933 | 0.4876 | - |
| 2.8438 | 2934 | 0.617 | - |
| 2.8448 | 2935 | 0.3902 | - |
| 2.8458 | 2936 | 0.6009 | - |
| 2.8468 | 2937 | 0.5809 | - |
| 2.8477 | 2938 | 0.7075 | - |
| 2.8487 | 2939 | 0.5369 | - |
| 2.8497 | 2940 | 0.4337 | - |
| 2.8506 | 2941 | 0.4278 | - |
| 2.8516 | 2942 | 0.4996 | - |
| 2.8526 | 2943 | 0.5383 | - |
| 2.8535 | 2944 | 0.6566 | - |
| 2.8545 | 2945 | 0.5774 | - |
| 2.8555 | 2946 | 0.523 | - |
| 2.8565 | 2947 | 0.6451 | - |
| 2.8574 | 2948 | 0.5344 | - |
| 2.8584 | 2949 | 0.7905 | - |
| 2.8594 | 2950 | 0.4712 | - |
| 2.8603 | 2951 | 0.5191 | - |
| 2.8613 | 2952 | 0.532 | - |
| 2.8623 | 2953 | 0.6545 | - |
| 2.8632 | 2954 | 0.3878 | - |
| 2.8642 | 2955 | 0.6302 | - |
| 2.8652 | 2956 | 0.5886 | - |
| 2.8661 | 2957 | 0.5555 | - |
| 2.8671 | 2958 | 0.4484 | - |
| 2.8681 | 2959 | 0.4525 | - |
| 2.8691 | 2960 | 0.6344 | - |
| 2.8700 | 2961 | 0.5398 | - |
| 2.8710 | 2962 | 0.7107 | - |
| 2.8720 | 2963 | 0.6001 | - |
| 2.8729 | 2964 | 0.5972 | - |
| 2.8739 | 2965 | 0.3465 | - |
| 2.8749 | 2966 | 0.4921 | - |
| 2.8758 | 2967 | 0.5281 | - |
| 2.8768 | 2968 | 0.4714 | - |
| 2.8778 | 2969 | 0.5687 | - |
| 2.8788 | 2970 | 0.5918 | - |
| 2.8797 | 2971 | 0.5207 | - |
| 2.8807 | 2972 | 0.4783 | - |
| 2.8817 | 2973 | 0.4519 | - |
| 2.8826 | 2974 | 0.5052 | - |
| 2.8836 | 2975 | 0.4912 | - |
| 2.8846 | 2976 | 0.7495 | - |
| 2.8855 | 2977 | 0.6044 | - |
| 2.8865 | 2978 | 0.5533 | - |
| 2.8875 | 2979 | 0.4458 | - |
| 2.8885 | 2980 | 0.619 | - |
| 2.8894 | 2981 | 0.5832 | - |
| 2.8904 | 2982 | 0.4819 | - |
| 2.8914 | 2983 | 0.5894 | - |
| 2.8923 | 2984 | 0.591 | - |
| 2.8933 | 2985 | 0.4694 | - |
| 2.8943 | 2986 | 0.4542 | - |
| 2.8952 | 2987 | 0.6101 | - |
| 2.8962 | 2988 | 0.4969 | - |
| 2.8972 | 2989 | 0.5935 | - |
| 2.8982 | 2990 | 0.5854 | - |
| 2.8991 | 2991 | 0.5164 | - |
| 2.9001 | 2992 | 0.5713 | - |
| 2.9011 | 2993 | 0.5168 | - |
| 2.9020 | 2994 | 0.6482 | - |
| 2.9030 | 2995 | 0.556 | - |
| 2.9040 | 2996 | 0.6662 | - |
| 2.9049 | 2997 | 0.4963 | - |
| 2.9059 | 2998 | 0.4563 | - |
| 2.9069 | 2999 | 0.5894 | - |
| 2.9079 | 3000 | 0.6439 | 0.8014 |
| 2.9088 | 3001 | 0.51 | - |
| 2.9098 | 3002 | 0.5358 | - |
| 2.9108 | 3003 | 0.5784 | - |
| 2.9117 | 3004 | 0.5352 | - |
| 2.9127 | 3005 | 0.4505 | - |
| 2.9137 | 3006 | 0.5085 | - |
| 2.9146 | 3007 | 0.3523 | - |
| 2.9156 | 3008 | 0.549 | - |
| 2.9166 | 3009 | 0.5627 | - |
| 2.9176 | 3010 | 0.4198 | - |
| 2.9185 | 3011 | 0.4294 | - |
| 2.9195 | 3012 | 0.6771 | - |
| 2.9205 | 3013 | 0.5901 | - |
| 2.9214 | 3014 | 0.5111 | - |
| 2.9224 | 3015 | 0.463 | - |
| 2.9234 | 3016 | 0.5813 | - |
| 2.9243 | 3017 | 0.6286 | - |
| 2.9253 | 3018 | 0.5563 | - |
| 2.9263 | 3019 | 0.6788 | - |
| 2.9273 | 3020 | 0.6529 | - |
| 2.9282 | 3021 | 0.4504 | - |
| 2.9292 | 3022 | 0.4607 | - |
| 2.9302 | 3023 | 0.4252 | - |
| 2.9311 | 3024 | 0.5121 | - |
| 2.9321 | 3025 | 0.4833 | - |
| 2.9331 | 3026 | 0.4792 | - |
| 2.9340 | 3027 | 0.5602 | - |
| 2.9350 | 3028 | 0.6863 | - |
| 2.9360 | 3029 | 0.67 | - |
| 2.9370 | 3030 | 0.4937 | - |
| 2.9379 | 3031 | 0.5279 | - |
| 2.9389 | 3032 | 0.4885 | - |
| 2.9399 | 3033 | 0.6027 | - |
| 2.9408 | 3034 | 0.6025 | - |
| 2.9418 | 3035 | 0.4846 | - |
| 2.9428 | 3036 | 0.5371 | - |
| 2.9437 | 3037 | 0.4921 | - |
| 2.9447 | 3038 | 0.5322 | - |
| 2.9457 | 3039 | 0.6226 | - |
| 2.9467 | 3040 | 0.4313 | - |
| 2.9476 | 3041 | 0.5842 | - |
| 2.9486 | 3042 | 0.4039 | - |
| 2.9496 | 3043 | 0.6052 | - |
| 2.9505 | 3044 | 0.5546 | - |
| 2.9515 | 3045 | 0.4946 | - |
| 2.9525 | 3046 | 0.4972 | - |
| 2.9534 | 3047 | 0.5291 | - |
| 2.9544 | 3048 | 0.5968 | - |
| 2.9554 | 3049 | 0.5543 | - |
| 2.9564 | 3050 | 0.5804 | - |
| 2.9573 | 3051 | 0.4343 | - |
| 2.9583 | 3052 | 0.4654 | - |
| 2.9593 | 3053 | 0.5679 | - |
| 2.9602 | 3054 | 0.5784 | - |
| 2.9612 | 3055 | 0.6018 | - |
| 2.9622 | 3056 | 0.5565 | - |
| 2.9631 | 3057 | 0.403 | - |
| 2.9641 | 3058 | 0.6706 | - |
| 2.9651 | 3059 | 0.5176 | - |
| 2.9661 | 3060 | 0.6643 | - |
| 2.9670 | 3061 | 0.5339 | - |
| 2.9680 | 3062 | 0.5553 | - |
| 2.9690 | 3063 | 0.4865 | - |
| 2.9699 | 3064 | 0.4651 | - |
| 2.9709 | 3065 | 0.4227 | - |
| 2.9719 | 3066 | 0.4342 | - |
| 2.9728 | 3067 | 0.4079 | - |
| 2.9738 | 3068 | 0.5709 | - |
| 2.9748 | 3069 | 0.4366 | - |
| 2.9758 | 3070 | 0.6763 | - |
| 2.9767 | 3071 | 0.494 | - |
| 2.9777 | 3072 | 0.7012 | - |
| 2.9787 | 3073 | 0.5008 | - |
| 2.9796 | 3074 | 0.4857 | - |
| 2.9806 | 3075 | 0.4697 | - |
| 2.9816 | 3076 | 0.6056 | - |
| 2.9825 | 3077 | 0.4102 | - |
| 2.9835 | 3078 | 0.4017 | - |
| 2.9845 | 3079 | 0.5137 | - |
| 2.9855 | 3080 | 0.5583 | - |
| 2.9864 | 3081 | 0.3887 | - |
| 2.9874 | 3082 | 0.5383 | - |
| 2.9884 | 3083 | 0.4707 | - |
| 2.9893 | 3084 | 0.5424 | - |
| 2.9903 | 3085 | 0.4453 | - |
| 2.9913 | 3086 | 0.4668 | - |
| 2.9922 | 3087 | 0.5347 | - |
| 2.9932 | 3088 | 0.4929 | - |
| 2.9942 | 3089 | 0.5649 | - |
| 2.9952 | 3090 | 0.653 | - |
| 2.9961 | 3091 | 0.548 | - |
| 2.9971 | 3092 | 0.4389 | - |
| 2.9981 | 3093 | 0.5869 | - |
| 2.9990 | 3094 | 0.6079 | - |
| 3.0000 | 3095 | 0.6773 | - |
| 3.0010 | 3096 | 0.0515 | - |
| 3.0010 | 3097 | 0.5344 | - |
| 3.0019 | 3098 | 0.4454 | - |
| 3.0029 | 3099 | 0.6245 | - |
| 3.0039 | 3100 | 0.6673 | 0.8078 |
| 3.0048 | 3101 | 0.5138 | - |
| 3.0058 | 3102 | 0.5724 | - |
| 3.0068 | 3103 | 0.5311 | - |
| 3.0078 | 3104 | 0.4358 | - |
| 3.0087 | 3105 | 0.4539 | - |
| 3.0097 | 3106 | 0.5478 | - |
| 3.0107 | 3107 | 0.4591 | - |
| 3.0116 | 3108 | 0.4291 | - |
| 3.0126 | 3109 | 0.4066 | - |
| 3.0136 | 3110 | 0.4664 | - |
| 3.0145 | 3111 | 0.6209 | - |
| 3.0155 | 3112 | 0.4105 | - |
| 3.0165 | 3113 | 0.6471 | - |
| 3.0175 | 3114 | 0.6617 | - |
| 3.0184 | 3115 | 0.5458 | - |
| 3.0194 | 3116 | 0.5323 | - |
| 3.0204 | 3117 | 0.4924 | - |
| 3.0213 | 3118 | 0.5386 | - |
| 3.0223 | 3119 | 0.5362 | - |
| 3.0233 | 3120 | 0.61 | - |
| 3.0242 | 3121 | 0.452 | - |
| 3.0252 | 3122 | 0.6957 | - |
| 3.0262 | 3123 | 0.53 | - |
| 3.0272 | 3124 | 0.5339 | - |
| 3.0281 | 3125 | 0.4435 | - |
| 3.0291 | 3126 | 0.458 | - |
| 3.0301 | 3127 | 0.5616 | - |
| 3.0310 | 3128 | 0.5122 | - |
| 3.0320 | 3129 | 0.385 | - |
| 3.0330 | 3130 | 0.629 | - |
| 3.0339 | 3131 | 0.519 | - |
| 3.0349 | 3132 | 0.6359 | - |
| 3.0359 | 3133 | 0.3122 | - |
| 3.0369 | 3134 | 0.5211 | - |
| 3.0378 | 3135 | 0.5232 | - |
| 3.0388 | 3136 | 0.4651 | - |
| 3.0398 | 3137 | 0.4998 | - |
| 3.0407 | 3138 | 0.4265 | - |
| 3.0417 | 3139 | 0.6804 | - |
| 3.0427 | 3140 | 0.5763 | - |
| 3.0436 | 3141 | 0.4073 | - |
| 3.0446 | 3142 | 0.557 | - |
| 3.0456 | 3143 | 0.4501 | - |
| 3.0466 | 3144 | 0.5691 | - |
| 3.0475 | 3145 | 0.5862 | - |
| 3.0485 | 3146 | 0.5888 | - |
| 3.0495 | 3147 | 0.5681 | - |
| 3.0504 | 3148 | 0.4317 | - |
| 3.0514 | 3149 | 0.51 | - |
| 3.0524 | 3150 | 0.5184 | - |
| 3.0533 | 3151 | 0.5287 | - |
| 3.0543 | 3152 | 0.4446 | - |
| 3.0553 | 3153 | 0.5364 | - |
| 3.0563 | 3154 | 0.6492 | - |
| 3.0572 | 3155 | 0.556 | - |
| 3.0582 | 3156 | 0.5011 | - |
| 3.0592 | 3157 | 0.573 | - |
| 3.0601 | 3158 | 0.5718 | - |
| 3.0611 | 3159 | 0.5537 | - |
| 3.0621 | 3160 | 0.5882 | - |
| 3.0630 | 3161 | 0.4816 | - |
| 3.0640 | 3162 | 0.5055 | - |
| 3.0650 | 3163 | 0.4202 | - |
| 3.0660 | 3164 | 0.4387 | - |
| 3.0669 | 3165 | 0.4965 | - |
| 3.0679 | 3166 | 0.4305 | - |
| 3.0689 | 3167 | 0.4603 | - |
| 3.0698 | 3168 | 0.4236 | - |
| 3.0708 | 3169 | 0.6023 | - |
| 3.0718 | 3170 | 0.5018 | - |
| 3.0727 | 3171 | 0.4389 | - |
| 3.0737 | 3172 | 0.5838 | - |
| 3.0747 | 3173 | 0.6225 | - |
| 3.0757 | 3174 | 0.3819 | - |
| 3.0766 | 3175 | 0.5297 | - |
| 3.0776 | 3176 | 0.598 | - |
| 3.0786 | 3177 | 0.4745 | - |
| 3.0795 | 3178 | 0.4001 | - |
| 3.0805 | 3179 | 0.6107 | - |
| 3.0815 | 3180 | 0.566 | - |
| 3.0824 | 3181 | 0.505 | - |
| 3.0834 | 3182 | 0.4026 | - |
| 3.0844 | 3183 | 0.3939 | - |
| 3.0854 | 3184 | 0.5071 | - |
| 3.0863 | 3185 | 0.4875 | - |
| 3.0873 | 3186 | 0.4751 | - |
| 3.0883 | 3187 | 0.4809 | - |
| 3.0892 | 3188 | 0.5278 | - |
| 3.0902 | 3189 | 0.5013 | - |
| 3.0912 | 3190 | 0.6536 | - |
| 3.0921 | 3191 | 0.5431 | - |
| 3.0931 | 3192 | 0.5729 | - |
| 3.0941 | 3193 | 0.5663 | - |
| 3.0951 | 3194 | 0.6296 | - |
| 3.0960 | 3195 | 0.3926 | - |
| 3.0970 | 3196 | 0.448 | - |
| 3.0980 | 3197 | 0.5246 | - |
| 3.0989 | 3198 | 0.4462 | - |
| 3.0999 | 3199 | 0.4463 | - |
| 3.1009 | 3200 | 0.4516 | 0.7983 |
| 3.1018 | 3201 | 0.4641 | - |
| 3.1028 | 3202 | 0.4774 | - |
| 3.1038 | 3203 | 0.5402 | - |
| 3.1048 | 3204 | 0.4847 | - |
| 3.1057 | 3205 | 0.4452 | - |
| 3.1067 | 3206 | 0.5086 | - |
| 3.1077 | 3207 | 0.4923 | - |
| 3.1086 | 3208 | 0.5452 | - |
| 3.1096 | 3209 | 0.461 | - |
| 3.1106 | 3210 | 0.5373 | - |
| 3.1115 | 3211 | 0.4513 | - |
| 3.1125 | 3212 | 0.5363 | - |
| 3.1135 | 3213 | 0.5521 | - |
| 3.1145 | 3214 | 0.4831 | - |
| 3.1154 | 3215 | 0.5862 | - |
| 3.1164 | 3216 | 0.5112 | - |
| 3.1174 | 3217 | 0.5696 | - |
| 3.1183 | 3218 | 0.4873 | - |
| 3.1193 | 3219 | 0.4593 | - |
| 3.1203 | 3220 | 0.4948 | - |
| 3.1212 | 3221 | 0.5751 | - |
| 3.1222 | 3222 | 0.5094 | - |
| 3.1232 | 3223 | 0.4541 | - |
| 3.1242 | 3224 | 0.5554 | - |
| 3.1251 | 3225 | 0.561 | - |
| 3.1261 | 3226 | 0.4849 | - |
| 3.1271 | 3227 | 0.5016 | - |
| 3.1280 | 3228 | 0.506 | - |
| 3.1290 | 3229 | 0.4845 | - |
| 3.1300 | 3230 | 0.3193 | - |
| 3.1309 | 3231 | 0.4792 | - |
| 3.1319 | 3232 | 0.5386 | - |
| 3.1329 | 3233 | 0.4336 | - |
| 3.1339 | 3234 | 0.3778 | - |
| 3.1348 | 3235 | 0.3913 | - |
| 3.1358 | 3236 | 0.5693 | - |
| 3.1368 | 3237 | 0.4209 | - |
| 3.1377 | 3238 | 0.4399 | - |
| 3.1387 | 3239 | 0.3711 | - |
| 3.1397 | 3240 | 0.4135 | - |
| 3.1406 | 3241 | 0.3824 | - |
| 3.1416 | 3242 | 0.5343 | - |
| 3.1426 | 3243 | 0.4261 | - |
| 3.1435 | 3244 | 0.4817 | - |
| 3.1445 | 3245 | 0.5155 | - |
| 3.1455 | 3246 | 0.6736 | - |
| 3.1465 | 3247 | 0.5873 | - |
| 3.1474 | 3248 | 0.4254 | - |
| 3.1484 | 3249 | 0.5463 | - |
| 3.1494 | 3250 | 0.6273 | - |
| 3.1503 | 3251 | 0.525 | - |
| 3.1513 | 3252 | 0.4887 | - |
| 3.1523 | 3253 | 0.6385 | - |
| 3.1532 | 3254 | 0.4913 | - |
| 3.1542 | 3255 | 0.4714 | - |
| 3.1552 | 3256 | 0.5509 | - |
| 3.1562 | 3257 | 0.527 | - |
| 3.1571 | 3258 | 0.488 | - |
| 3.1581 | 3259 | 0.5227 | - |
| 3.1591 | 3260 | 0.4675 | - |
| 3.1600 | 3261 | 0.557 | - |
| 3.1610 | 3262 | 0.5151 | - |
| 3.1620 | 3263 | 0.558 | - |
| 3.1629 | 3264 | 0.4662 | - |
| 3.1639 | 3265 | 0.5883 | - |
| 3.1649 | 3266 | 0.4072 | - |
| 3.1659 | 3267 | 0.4966 | - |
| 3.1668 | 3268 | 0.5666 | - |
| 3.1678 | 3269 | 0.5493 | - |
| 3.1688 | 3270 | 0.4389 | - |
| 3.1697 | 3271 | 0.4192 | - |
| 3.1707 | 3272 | 0.5922 | - |
| 3.1717 | 3273 | 0.4428 | - |
| 3.1726 | 3274 | 0.3821 | - |
| 3.1736 | 3275 | 0.5755 | - |
| 3.1746 | 3276 | 0.4323 | - |
| 3.1756 | 3277 | 0.4906 | - |
| 3.1765 | 3278 | 0.5707 | - |
| 3.1775 | 3279 | 0.4835 | - |
| 3.1785 | 3280 | 0.5934 | - |
| 3.1794 | 3281 | 0.5617 | - |
| 3.1804 | 3282 | 0.5129 | - |
| 3.1814 | 3283 | 0.452 | - |
| 3.1823 | 3284 | 0.502 | - |
| 3.1833 | 3285 | 0.4694 | - |
| 3.1843 | 3286 | 0.4143 | - |
| 3.1853 | 3287 | 0.3865 | - |
| 3.1862 | 3288 | 0.5249 | - |
| 3.1872 | 3289 | 0.417 | - |
| 3.1882 | 3290 | 0.3483 | - |
| 3.1891 | 3291 | 0.4731 | - |
| 3.1901 | 3292 | 0.6014 | - |
| 3.1911 | 3293 | 0.6958 | - |
| 3.1920 | 3294 | 0.6789 | - |
| 3.1930 | 3295 | 0.4803 | - |
| 3.1940 | 3296 | 0.418 | - |
| 3.1950 | 3297 | 0.4434 | - |
| 3.1959 | 3298 | 0.4473 | - |
| 3.1969 | 3299 | 0.7174 | - |
| 3.1979 | 3300 | 0.6348 | 0.8131 |
| 3.1988 | 3301 | 0.5117 | - |
| 3.1998 | 3302 | 0.5238 | - |
| 3.2008 | 3303 | 0.561 | - |
| 3.2017 | 3304 | 0.4278 | - |
| 3.2027 | 3305 | 0.5663 | - |
| 3.2037 | 3306 | 0.4212 | - |
| 3.2047 | 3307 | 0.4614 | - |
| 3.2056 | 3308 | 0.5533 | - |
| 3.2066 | 3309 | 0.431 | - |
| 3.2076 | 3310 | 0.3801 | - |
| 3.2085 | 3311 | 0.6816 | - |
| 3.2095 | 3312 | 0.4225 | - |
| 3.2105 | 3313 | 0.5669 | - |
| 3.2114 | 3314 | 0.4613 | - |
| 3.2124 | 3315 | 0.4556 | - |
| 3.2134 | 3316 | 0.4616 | - |
| 3.2144 | 3317 | 0.4009 | - |
| 3.2153 | 3318 | 0.5823 | - |
| 3.2163 | 3319 | 0.4973 | - |
| 3.2173 | 3320 | 0.6411 | - |
| 3.2182 | 3321 | 0.3642 | - |
| 3.2192 | 3322 | 0.5652 | - |
| 3.2202 | 3323 | 0.4425 | - |
| 3.2211 | 3324 | 0.3724 | - |
| 3.2221 | 3325 | 0.7187 | - |
| 3.2231 | 3326 | 0.5821 | - |
| 3.2241 | 3327 | 0.5004 | - |
| 3.2250 | 3328 | 0.532 | - |
| 3.2260 | 3329 | 0.3882 | - |
| 3.2270 | 3330 | 0.6304 | - |
| 3.2279 | 3331 | 0.4924 | - |
| 3.2289 | 3332 | 0.5252 | - |
| 3.2299 | 3333 | 0.5052 | - |
| 3.2308 | 3334 | 0.4221 | - |
| 3.2318 | 3335 | 0.4072 | - |
| 3.2328 | 3336 | 0.3564 | - |
| 3.2338 | 3337 | 0.5028 | - |
| 3.2347 | 3338 | 0.4568 | - |
| 3.2357 | 3339 | 0.4984 | - |
| 3.2367 | 3340 | 0.3977 | - |
| 3.2376 | 3341 | 0.4813 | - |
| 3.2386 | 3342 | 0.4724 | - |
| 3.2396 | 3343 | 0.3894 | - |
| 3.2405 | 3344 | 0.4452 | - |
| 3.2415 | 3345 | 0.3997 | - |
| 3.2425 | 3346 | 0.4524 | - |
| 3.2435 | 3347 | 0.4286 | - |
| 3.2444 | 3348 | 0.3902 | - |
| 3.2454 | 3349 | 0.4071 | - |
| 3.2464 | 3350 | 0.5158 | - |
| 3.2473 | 3351 | 0.5197 | - |
| 3.2483 | 3352 | 0.5685 | - |
| 3.2493 | 3353 | 0.4642 | - |
| 3.2502 | 3354 | 0.4582 | - |
| 3.2512 | 3355 | 0.4574 | - |
| 3.2522 | 3356 | 0.5654 | - |
| 3.2532 | 3357 | 0.4213 | - |
| 3.2541 | 3358 | 0.3926 | - |
| 3.2551 | 3359 | 0.5482 | - |
| 3.2561 | 3360 | 0.4552 | - |
| 3.2570 | 3361 | 0.429 | - |
| 3.2580 | 3362 | 0.57 | - |
| 3.2590 | 3363 | 0.4936 | - |
| 3.2599 | 3364 | 0.6125 | - |
| 3.2609 | 3365 | 0.5156 | - |
| 3.2619 | 3366 | 0.4732 | - |
| 3.2629 | 3367 | 0.4846 | - |
| 3.2638 | 3368 | 0.3843 | - |
| 3.2648 | 3369 | 0.3907 | - |
| 3.2658 | 3370 | 0.5623 | - |
| 3.2667 | 3371 | 0.5098 | - |
| 3.2677 | 3372 | 0.4355 | - |
| 3.2687 | 3373 | 0.5103 | - |
| 3.2696 | 3374 | 0.4804 | - |
| 3.2706 | 3375 | 0.4725 | - |
| 3.2716 | 3376 | 0.6938 | - |
| 3.2726 | 3377 | 0.3176 | - |
| 3.2735 | 3378 | 0.4431 | - |
| 3.2745 | 3379 | 0.4189 | - |
| 3.2755 | 3380 | 0.5002 | - |
| 3.2764 | 3381 | 0.3881 | - |
| 3.2774 | 3382 | 0.504 | - |
| 3.2784 | 3383 | 0.4012 | - |
| 3.2793 | 3384 | 0.4937 | - |
| 3.2803 | 3385 | 0.3842 | - |
| 3.2813 | 3386 | 0.4006 | - |
| 3.2823 | 3387 | 0.4154 | - |
| 3.2832 | 3388 | 0.4882 | - |
| 3.2842 | 3389 | 0.4691 | - |
| 3.2852 | 3390 | 0.421 | - |
| 3.2861 | 3391 | 0.4096 | - |
| 3.2871 | 3392 | 0.4063 | - |
| 3.2881 | 3393 | 0.5175 | - |
| 3.2890 | 3394 | 0.4751 | - |
| 3.2900 | 3395 | 0.3965 | - |
| 3.2910 | 3396 | 0.4034 | - |
| 3.2919 | 3397 | 0.3853 | - |
| 3.2929 | 3398 | 0.4123 | - |
| 3.2939 | 3399 | 0.4858 | - |
| 3.2949 | 3400 | 0.5825 | 0.8150 |
| 3.2958 | 3401 | 0.5219 | - |
| 3.2968 | 3402 | 0.5124 | - |
| 3.2978 | 3403 | 0.427 | - |
| 3.2987 | 3404 | 0.4525 | - |
| 3.2997 | 3405 | 0.4217 | - |
| 3.3007 | 3406 | 0.4278 | - |
| 3.3016 | 3407 | 0.5375 | - |
| 3.3026 | 3408 | 0.3229 | - |
| 3.3036 | 3409 | 0.3919 | - |
| 3.3046 | 3410 | 0.638 | - |
| 3.3055 | 3411 | 0.3877 | - |
| 3.3065 | 3412 | 0.618 | - |
| 3.3075 | 3413 | 0.5634 | - |
| 3.3084 | 3414 | 0.4158 | - |
| 3.3094 | 3415 | 0.5145 | - |
| 3.3104 | 3416 | 0.4365 | - |
| 3.3113 | 3417 | 0.3235 | - |
| 3.3123 | 3418 | 0.4612 | - |
| 3.3133 | 3419 | 0.462 | - |
| 3.3143 | 3420 | 0.5114 | - |
| 3.3152 | 3421 | 0.4567 | - |
| 3.3162 | 3422 | 0.3353 | - |
| 3.3172 | 3423 | 0.5496 | - |
| 3.3181 | 3424 | 0.3419 | - |
| 3.3191 | 3425 | 0.5226 | - |
| 3.3201 | 3426 | 0.4126 | - |
| 3.3210 | 3427 | 0.4961 | - |
| 3.3220 | 3428 | 0.5129 | - |
| 3.3230 | 3429 | 0.4971 | - |
| 3.3240 | 3430 | 0.4148 | - |
| 3.3249 | 3431 | 0.4687 | - |
| 3.3259 | 3432 | 0.4899 | - |
| 3.3269 | 3433 | 0.516 | - |
| 3.3278 | 3434 | 0.4031 | - |
| 3.3288 | 3435 | 0.437 | - |
| 3.3298 | 3436 | 0.4724 | - |
| 3.3307 | 3437 | 0.3983 | - |
| 3.3317 | 3438 | 0.4843 | - |
| 3.3327 | 3439 | 0.442 | - |
| 3.3337 | 3440 | 0.3579 | - |
| 3.3346 | 3441 | 0.3791 | - |
| 3.3356 | 3442 | 0.4023 | - |
| 3.3366 | 3443 | 0.5747 | - |
| 3.3375 | 3444 | 0.4974 | - |
| 3.3385 | 3445 | 0.601 | - |
| 3.3395 | 3446 | 0.4845 | - |
| 3.3404 | 3447 | 0.4604 | - |
| 3.3414 | 3448 | 0.3828 | - |
| 3.3424 | 3449 | 0.4825 | - |
| 3.3434 | 3450 | 0.532 | - |
| 3.3443 | 3451 | 0.4833 | - |
| 3.3453 | 3452 | 0.4522 | - |
| 3.3463 | 3453 | 0.319 | - |
| 3.3472 | 3454 | 0.3463 | - |
| 3.3482 | 3455 | 0.3392 | - |
| 3.3492 | 3456 | 0.3658 | - |
| 3.3501 | 3457 | 0.3309 | - |
| 3.3511 | 3458 | 0.4834 | - |
| 3.3521 | 3459 | 0.3804 | - |
| 3.3531 | 3460 | 0.5847 | - |
| 3.3540 | 3461 | 0.4419 | - |
| 3.3550 | 3462 | 0.3977 | - |
| 3.3560 | 3463 | 0.4423 | - |
| 3.3569 | 3464 | 0.3822 | - |
| 3.3579 | 3465 | 0.3732 | - |
| 3.3589 | 3466 | 0.405 | - |
| 3.3598 | 3467 | 0.4749 | - |
| 3.3608 | 3468 | 0.4539 | - |
| 3.3618 | 3469 | 0.3058 | - |
| 3.3628 | 3470 | 0.5302 | - |
| 3.3637 | 3471 | 0.4536 | - |
| 3.3647 | 3472 | 0.4199 | - |
| 3.3657 | 3473 | 0.5341 | - |
| 3.3666 | 3474 | 0.5735 | - |
| 3.3676 | 3475 | 0.514 | - |
| 3.3686 | 3476 | 0.5231 | - |
| 3.3695 | 3477 | 0.4594 | - |
| 3.3705 | 3478 | 0.3358 | - |
| 3.3715 | 3479 | 0.3932 | - |
| 3.3725 | 3480 | 0.5607 | - |
| 3.3734 | 3481 | 0.4457 | - |
| 3.3744 | 3482 | 0.4267 | - |
| 3.3754 | 3483 | 0.4922 | - |
| 3.3763 | 3484 | 0.3383 | - |
| 3.3773 | 3485 | 0.3428 | - |
| 3.3783 | 3486 | 0.356 | - |
| 3.3792 | 3487 | 0.4013 | - |
| 3.3802 | 3488 | 0.5301 | - |
| 3.3812 | 3489 | 0.4206 | - |
| 3.3822 | 3490 | 0.455 | - |
| 3.3831 | 3491 | 0.4102 | - |
| 3.3841 | 3492 | 0.6372 | - |
| 3.3851 | 3493 | 0.4915 | - |
| 3.3860 | 3494 | 0.422 | - |
| 3.3870 | 3495 | 0.489 | - |
| 3.3880 | 3496 | 0.5241 | - |
| 3.3889 | 3497 | 0.3588 | - |
| 3.3899 | 3498 | 0.3673 | - |
| 3.3909 | 3499 | 0.2671 | - |
| 3.3919 | 3500 | 0.3608 | 0.8030 |
| 3.3928 | 3501 | 0.5118 | - |
| 3.3938 | 3502 | 0.4458 | - |
| 3.3948 | 3503 | 0.3381 | - |
| 3.3957 | 3504 | 0.4165 | - |
| 3.3967 | 3505 | 0.4312 | - |
| 3.3977 | 3506 | 0.4684 | - |
| 3.3986 | 3507 | 0.4656 | - |
| 3.3996 | 3508 | 0.4178 | - |
| 3.4006 | 3509 | 0.3459 | - |
| 3.4016 | 3510 | 0.3724 | - |
| 3.4025 | 3511 | 0.4297 | - |
| 3.4035 | 3512 | 0.4561 | - |
| 3.4045 | 3513 | 0.4929 | - |
| 3.4054 | 3514 | 0.5017 | - |
| 3.4064 | 3515 | 0.4132 | - |
| 3.4074 | 3516 | 0.4774 | - |
| 3.4083 | 3517 | 0.4701 | - |
| 3.4093 | 3518 | 0.4608 | - |
| 3.4103 | 3519 | 0.3473 | - |
| 3.4113 | 3520 | 0.3823 | - |
| 3.4122 | 3521 | 0.4266 | - |
| 3.4132 | 3522 | 0.3965 | - |
| 3.4142 | 3523 | 0.3949 | - |
| 3.4151 | 3524 | 0.3762 | - |
| 3.4161 | 3525 | 0.3395 | - |
| 3.4171 | 3526 | 0.4036 | - |
| 3.4180 | 3527 | 0.4113 | - |
| 3.4190 | 3528 | 0.4668 | - |
| 3.4200 | 3529 | 0.5248 | - |
| 3.4210 | 3530 | 0.4969 | - |
| 3.4219 | 3531 | 0.3323 | - |
| 3.4229 | 3532 | 0.2793 | - |
| 3.4239 | 3533 | 0.6917 | - |
| 3.4248 | 3534 | 0.5031 | - |
| 3.4258 | 3535 | 0.4994 | - |
| 3.4268 | 3536 | 0.3574 | - |
| 3.4277 | 3537 | 0.4902 | - |
| 3.4287 | 3538 | 0.3429 | - |
| 3.4297 | 3539 | 0.4414 | - |
| 3.4306 | 3540 | 0.4441 | - |
| 3.4316 | 3541 | 0.4787 | - |
| 3.4326 | 3542 | 0.4248 | - |
| 3.4336 | 3543 | 0.541 | - |
| 3.4345 | 3544 | 0.5773 | - |
| 3.4355 | 3545 | 0.401 | - |
| 3.4365 | 3546 | 0.4109 | - |
| 3.4374 | 3547 | 0.3471 | - |
| 3.4384 | 3548 | 0.3787 | - |
| 3.4394 | 3549 | 0.5363 | - |
| 3.4403 | 3550 | 0.3314 | - |
| 3.4413 | 3551 | 0.5858 | - |
| 3.4423 | 3552 | 0.3498 | - |
| 3.4433 | 3553 | 0.3994 | - |
| 3.4442 | 3554 | 0.3987 | - |
| 3.4452 | 3555 | 0.378 | - |
| 3.4462 | 3556 | 0.4476 | - |
| 3.4471 | 3557 | 0.4087 | - |
| 3.4481 | 3558 | 0.4219 | - |
| 3.4491 | 3559 | 0.5072 | - |
| 3.4500 | 3560 | 0.5149 | - |
| 3.4510 | 3561 | 0.3893 | - |
| 3.4520 | 3562 | 0.4882 | - |
| 3.4530 | 3563 | 0.4545 | - |
| 3.4539 | 3564 | 0.3661 | - |
| 3.4549 | 3565 | 0.4733 | - |
| 3.4559 | 3566 | 0.4813 | - |
| 3.4568 | 3567 | 0.4057 | - |
| 3.4578 | 3568 | 0.3625 | - |
| 3.4588 | 3569 | 0.4606 | - |
| 3.4597 | 3570 | 0.4266 | - |
| 3.4607 | 3571 | 0.3528 | - |
| 3.4617 | 3572 | 0.3684 | - |
| 3.4627 | 3573 | 0.4401 | - |
| 3.4636 | 3574 | 0.407 | - |
| 3.4646 | 3575 | 0.4784 | - |
| 3.4656 | 3576 | 0.4955 | - |
| 3.4665 | 3577 | 0.3492 | - |
| 3.4675 | 3578 | 0.3782 | - |
| 3.4685 | 3579 | 0.4151 | - |
| 3.4694 | 3580 | 0.4342 | - |
| 3.4704 | 3581 | 0.4303 | - |
| 3.4714 | 3582 | 0.5193 | - |
| 3.4724 | 3583 | 0.5302 | - |
| 3.4733 | 3584 | 0.3799 | - |
| 3.4743 | 3585 | 0.54 | - |
| 3.4753 | 3586 | 0.4049 | - |
| 3.4762 | 3587 | 0.3952 | - |
| 3.4772 | 3588 | 0.3275 | - |
| 3.4782 | 3589 | 0.4778 | - |
| 3.4791 | 3590 | 0.3788 | - |
| 3.4801 | 3591 | 0.3899 | - |
| 3.4811 | 3592 | 0.3418 | - |
| 3.4821 | 3593 | 0.4578 | - |
| 3.4830 | 3594 | 0.3596 | - |
| 3.4840 | 3595 | 0.4139 | - |
| 3.4850 | 3596 | 0.4767 | - |
| 3.4859 | 3597 | 0.448 | - |
| 3.4869 | 3598 | 0.3755 | - |
| 3.4879 | 3599 | 0.4778 | - |
| 3.4888 | 3600 | 0.4404 | 0.8001 |
| 3.4898 | 3601 | 0.3704 | - |
| 3.4908 | 3602 | 0.467 | - |
| 3.4918 | 3603 | 0.3182 | - |
| 3.4927 | 3604 | 0.3727 | - |
| 3.4937 | 3605 | 0.3694 | - |
| 3.4947 | 3606 | 0.3656 | - |
| 3.4956 | 3607 | 0.3327 | - |
| 3.4966 | 3608 | 0.403 | - |
| 3.4976 | 3609 | 0.5006 | - |
| 3.4985 | 3610 | 0.5148 | - |
| 3.4995 | 3611 | 0.4384 | - |
| 3.5005 | 3612 | 0.3437 | - |
| 3.5015 | 3613 | 0.4135 | - |
| 3.5024 | 3614 | 0.4989 | - |
| 3.5034 | 3615 | 0.3678 | - |
| 3.5044 | 3616 | 0.4617 | - |
| 3.5053 | 3617 | 0.3554 | - |
| 3.5063 | 3618 | 0.2999 | - |
| 3.5073 | 3619 | 0.3692 | - |
| 3.5082 | 3620 | 0.4134 | - |
| 3.5092 | 3621 | 0.3747 | - |
| 3.5102 | 3622 | 0.3959 | - |
| 3.5112 | 3623 | 0.3711 | - |
| 3.5121 | 3624 | 0.3299 | - |
| 3.5131 | 3625 | 0.3937 | - |
| 3.5141 | 3626 | 0.4108 | - |
| 3.5150 | 3627 | 0.4789 | - |
| 3.5160 | 3628 | 0.3525 | - |
| 3.5170 | 3629 | 0.4139 | - |
| 3.5179 | 3630 | 0.4921 | - |
| 3.5189 | 3631 | 0.3238 | - |
| 3.5199 | 3632 | 0.5281 | - |
| 3.5209 | 3633 | 0.4325 | - |
| 3.5218 | 3634 | 0.5742 | - |
| 3.5228 | 3635 | 0.4245 | - |
| 3.5238 | 3636 | 0.5078 | - |
| 3.5247 | 3637 | 0.3728 | - |
| 3.5257 | 3638 | 0.4122 | - |
| 3.5267 | 3639 | 0.4705 | - |
| 3.5276 | 3640 | 0.2589 | - |
| 3.5286 | 3641 | 0.4108 | - |
| 3.5296 | 3642 | 0.4371 | - |
| 3.5306 | 3643 | 0.4901 | - |
| 3.5315 | 3644 | 0.5387 | - |
| 3.5325 | 3645 | 0.3321 | - |
| 3.5335 | 3646 | 0.455 | - |
| 3.5344 | 3647 | 0.4172 | - |
| 3.5354 | 3648 | 0.319 | - |
| 3.5364 | 3649 | 0.4057 | - |
| 3.5373 | 3650 | 0.468 | - |
| 3.5383 | 3651 | 0.4682 | - |
| 3.5393 | 3652 | 0.3819 | - |
| 3.5403 | 3653 | 0.3513 | - |
| 3.5412 | 3654 | 0.3394 | - |
| 3.5422 | 3655 | 0.5314 | - |
| 3.5432 | 3656 | 0.3946 | - |
| 3.5441 | 3657 | 0.396 | - |
| 3.5451 | 3658 | 0.6828 | - |
| 3.5461 | 3659 | 0.4183 | - |
| 3.5470 | 3660 | 0.3627 | - |
| 3.5480 | 3661 | 0.3765 | - |
| 3.5490 | 3662 | 0.3029 | - |
| 3.5500 | 3663 | 0.3892 | - |
| 3.5509 | 3664 | 0.3999 | - |
| 3.5519 | 3665 | 0.4495 | - |
| 3.5529 | 3666 | 0.357 | - |
| 3.5538 | 3667 | 0.363 | - |
| 3.5548 | 3668 | 0.5102 | - |
| 3.5558 | 3669 | 0.4169 | - |
| 3.5567 | 3670 | 0.4409 | - |
| 3.5577 | 3671 | 0.3497 | - |
| 3.5587 | 3672 | 0.3539 | - |
| 3.5597 | 3673 | 0.4057 | - |
| 3.5606 | 3674 | 0.415 | - |
| 3.5616 | 3675 | 0.4695 | - |
| 3.5626 | 3676 | 0.5622 | - |
| 3.5635 | 3677 | 0.387 | - |
| 3.5645 | 3678 | 0.4526 | - |
| 3.5655 | 3679 | 0.3413 | - |
| 3.5664 | 3680 | 0.4194 | - |
| 3.5674 | 3681 | 0.4702 | - |
| 3.5684 | 3682 | 0.3607 | - |
| 3.5694 | 3683 | 0.3358 | - |
| 3.5703 | 3684 | 0.4055 | - |
| 3.5713 | 3685 | 0.3584 | - |
| 3.5723 | 3686 | 0.2802 | - |
| 3.5732 | 3687 | 0.3645 | - |
| 3.5742 | 3688 | 0.2816 | - |
| 3.5752 | 3689 | 0.3974 | - |
| 3.5761 | 3690 | 0.3937 | - |
| 3.5771 | 3691 | 0.4436 | - |
| 3.5781 | 3692 | 0.4419 | - |
| 3.5790 | 3693 | 0.4494 | - |
| 3.5800 | 3694 | 0.3798 | - |
| 3.5810 | 3695 | 0.2571 | - |
| 3.5820 | 3696 | 0.3516 | - |
| 3.5829 | 3697 | 0.4189 | - |
| 3.5839 | 3698 | 0.4664 | - |
| 3.5849 | 3699 | 0.4192 | - |
| 3.5858 | 3700 | 0.4813 | 0.8000 |
| 3.5868 | 3701 | 0.385 | - |
| 3.5878 | 3702 | 0.3108 | - |
| 3.5887 | 3703 | 0.383 | - |
| 3.5897 | 3704 | 0.3024 | - |
| 3.5907 | 3705 | 0.4519 | - |
| 3.5917 | 3706 | 0.4803 | - |
| 3.5926 | 3707 | 0.3218 | - |
| 3.5936 | 3708 | 0.3151 | - |
| 3.5946 | 3709 | 0.5071 | - |
| 3.5955 | 3710 | 0.4937 | - |
| 3.5965 | 3711 | 0.3379 | - |
| 3.5975 | 3712 | 0.4283 | - |
| 3.5984 | 3713 | 0.387 | - |
| 3.5994 | 3714 | 0.54 | - |
| 3.6004 | 3715 | 0.4213 | - |
| 3.6014 | 3716 | 0.3996 | - |
| 3.6023 | 3717 | 0.3809 | - |
| 3.6033 | 3718 | 0.4711 | - |
| 3.6043 | 3719 | 0.362 | - |
| 3.6052 | 3720 | 0.4748 | - |
| 3.6062 | 3721 | 0.3924 | - |
| 3.6072 | 3722 | 0.5444 | - |
| 3.6081 | 3723 | 0.3995 | - |
| 3.6091 | 3724 | 0.3921 | - |
| 3.6101 | 3725 | 0.4433 | - |
| 3.6111 | 3726 | 0.3291 | - |
| 3.6120 | 3727 | 0.4252 | - |
| 3.6130 | 3728 | 0.3554 | - |
| 3.6140 | 3729 | 0.4005 | - |
| 3.6149 | 3730 | 0.3362 | - |
| 3.6159 | 3731 | 0.3135 | - |
| 3.6169 | 3732 | 0.3227 | - |
| 3.6178 | 3733 | 0.451 | - |
| 3.6188 | 3734 | 0.3908 | - |
| 3.6198 | 3735 | 0.4168 | - |
| 3.6208 | 3736 | 0.4965 | - |
| 3.6217 | 3737 | 0.362 | - |
| 3.6227 | 3738 | 0.4275 | - |
| 3.6237 | 3739 | 0.4233 | - |
| 3.6246 | 3740 | 0.6025 | - |
| 3.6256 | 3741 | 0.4275 | - |
| 3.6266 | 3742 | 0.3993 | - |
| 3.6275 | 3743 | 0.4789 | - |
| 3.6285 | 3744 | 0.4669 | - |
| 3.6295 | 3745 | 0.4682 | - |
| 3.6305 | 3746 | 0.4692 | - |
| 3.6314 | 3747 | 0.4453 | - |
| 3.6324 | 3748 | 0.3302 | - |
| 3.6334 | 3749 | 0.2798 | - |
| 3.6343 | 3750 | 0.4601 | - |
| 3.6353 | 3751 | 0.4263 | - |
| 3.6363 | 3752 | 0.3555 | - |
| 3.6372 | 3753 | 0.3778 | - |
| 3.6382 | 3754 | 0.4983 | - |
| 3.6392 | 3755 | 0.3218 | - |
| 3.6402 | 3756 | 0.4061 | - |
| 3.6411 | 3757 | 0.4383 | - |
| 3.6421 | 3758 | 0.4411 | - |
| 3.6431 | 3759 | 0.4033 | - |
| 3.6440 | 3760 | 0.3243 | - |
| 3.6450 | 3761 | 0.5027 | - |
| 3.6460 | 3762 | 0.3207 | - |
| 3.6469 | 3763 | 0.3654 | - |
| 3.6479 | 3764 | 0.3756 | - |
| 3.6489 | 3765 | 0.4538 | - |
| 3.6499 | 3766 | 0.5007 | - |
| 3.6508 | 3767 | 0.3319 | - |
| 3.6518 | 3768 | 0.4131 | - |
| 3.6528 | 3769 | 0.4431 | - |
| 3.6537 | 3770 | 0.2785 | - |
| 3.6547 | 3771 | 0.3737 | - |
| 3.6557 | 3772 | 0.5274 | - |
| 3.6566 | 3773 | 0.3482 | - |
| 3.6576 | 3774 | 0.4883 | - |
| 3.6586 | 3775 | 0.4975 | - |
| 3.6596 | 3776 | 0.4304 | - |
| 3.6605 | 3777 | 0.4065 | - |
| 3.6615 | 3778 | 0.5716 | - |
| 3.6625 | 3779 | 0.3042 | - |
| 3.6634 | 3780 | 0.4003 | - |
| 3.6644 | 3781 | 0.4539 | - |
| 3.6654 | 3782 | 0.3347 | - |
| 3.6663 | 3783 | 0.2877 | - |
| 3.6673 | 3784 | 0.4373 | - |
| 3.6683 | 3785 | 0.4771 | - |
| 3.6693 | 3786 | 0.4265 | - |
| 3.6702 | 3787 | 0.601 | - |
| 3.6712 | 3788 | 0.4453 | - |
| 3.6722 | 3789 | 0.4775 | - |
| 3.6731 | 3790 | 0.4476 | - |
| 3.6741 | 3791 | 0.5138 | - |
| 3.6751 | 3792 | 0.3594 | - |
| 3.6760 | 3793 | 0.4125 | - |
| 3.6770 | 3794 | 0.5095 | - |
| 3.6780 | 3795 | 0.4566 | - |
| 3.6790 | 3796 | 0.3554 | - |
| 3.6799 | 3797 | 0.3749 | - |
| 3.6809 | 3798 | 0.36 | - |
| 3.6819 | 3799 | 0.386 | - |
| 3.6828 | 3800 | 0.3737 | 0.8007 |
| 3.6838 | 3801 | 0.313 | - |
| 3.6848 | 3802 | 0.5457 | - |
| 3.6857 | 3803 | 0.3392 | - |
| 3.6867 | 3804 | 0.3941 | - |
| 3.6877 | 3805 | 0.3172 | - |
| 3.6887 | 3806 | 0.359 | - |
| 3.6896 | 3807 | 0.3474 | - |
| 3.6906 | 3808 | 0.3619 | - |
| 3.6916 | 3809 | 0.3864 | - |
| 3.6925 | 3810 | 0.4471 | - |
| 3.6935 | 3811 | 0.4188 | - |
| 3.6945 | 3812 | 0.4313 | - |
| 3.6954 | 3813 | 0.372 | - |
| 3.6964 | 3814 | 0.3627 | - |
| 3.6974 | 3815 | 0.4122 | - |
| 3.6984 | 3816 | 0.3819 | - |
| 3.6993 | 3817 | 0.3508 | - |
| 3.7003 | 3818 | 0.3956 | - |
| 3.7013 | 3819 | 0.3982 | - |
| 3.7022 | 3820 | 0.4967 | - |
| 3.7032 | 3821 | 0.482 | - |
| 3.7042 | 3822 | 0.3598 | - |
| 3.7051 | 3823 | 0.3464 | - |
| 3.7061 | 3824 | 0.4343 | - |
| 3.7071 | 3825 | 0.4902 | - |
| 3.7081 | 3826 | 0.5374 | - |
| 3.7090 | 3827 | 0.3286 | - |
| 3.7100 | 3828 | 0.3407 | - |
| 3.7110 | 3829 | 0.284 | - |
| 3.7119 | 3830 | 0.3565 | - |
| 3.7129 | 3831 | 0.3444 | - |
| 3.7139 | 3832 | 0.5146 | - |
| 3.7148 | 3833 | 0.42 | - |
| 3.7158 | 3834 | 0.48 | - |
| 3.7168 | 3835 | 0.3609 | - |
| 3.7177 | 3836 | 0.477 | - |
| 3.7187 | 3837 | 0.3587 | - |
| 3.7197 | 3838 | 0.42 | - |
| 3.7207 | 3839 | 0.4201 | - |
| 3.7216 | 3840 | 0.3422 | - |
| 3.7226 | 3841 | 0.3674 | - |
| 3.7236 | 3842 | 0.3897 | - |
| 3.7245 | 3843 | 0.5181 | - |
| 3.7255 | 3844 | 0.4457 | - |
| 3.7265 | 3845 | 0.4331 | - |
| 3.7274 | 3846 | 0.4465 | - |
| 3.7284 | 3847 | 0.4778 | - |
| 3.7294 | 3848 | 0.4875 | - |
| 3.7304 | 3849 | 0.3721 | - |
| 3.7313 | 3850 | 0.4242 | - |
| 3.7323 | 3851 | 0.3572 | - |
| 3.7333 | 3852 | 0.2938 | - |
| 3.7342 | 3853 | 0.3892 | - |
| 3.7352 | 3854 | 0.4191 | - |
| 3.7362 | 3855 | 0.5058 | - |
| 3.7371 | 3856 | 0.3018 | - |
| 3.7381 | 3857 | 0.4478 | - |
| 3.7391 | 3858 | 0.4196 | - |
| 3.7401 | 3859 | 0.3779 | - |
| 3.7410 | 3860 | 0.4109 | - |
| 3.7420 | 3861 | 0.3662 | - |
| 3.7430 | 3862 | 0.3948 | - |
| 3.7439 | 3863 | 0.4355 | - |
| 3.7449 | 3864 | 0.3177 | - |
| 3.7459 | 3865 | 0.4354 | - |
| 3.7468 | 3866 | 0.4179 | - |
| 3.7478 | 3867 | 0.4025 | - |
| 3.7488 | 3868 | 0.4815 | - |
| 3.7498 | 3869 | 0.3568 | - |
| 3.7507 | 3870 | 0.4873 | - |
| 3.7517 | 3871 | 0.3923 | - |
| 3.7527 | 3872 | 0.323 | - |
| 3.7536 | 3873 | 0.3468 | - |
| 3.7546 | 3874 | 0.4178 | - |
| 3.7556 | 3875 | 0.5326 | - |
| 3.7565 | 3876 | 0.4381 | - |
| 3.7575 | 3877 | 0.4434 | - |
| 3.7585 | 3878 | 0.3941 | - |
| 3.7595 | 3879 | 0.2817 | - |
| 3.7604 | 3880 | 0.4286 | - |
| 3.7614 | 3881 | 0.4535 | - |
| 3.7624 | 3882 | 0.4137 | - |
| 3.7633 | 3883 | 0.4289 | - |
| 3.7643 | 3884 | 0.4739 | - |
| 3.7653 | 3885 | 0.5499 | - |
| 3.7662 | 3886 | 0.3653 | - |
| 3.7672 | 3887 | 0.4609 | - |
| 3.7682 | 3888 | 0.4395 | - |
| 3.7692 | 3889 | 0.4833 | - |
| 3.7701 | 3890 | 0.353 | - |
| 3.7711 | 3891 | 0.3584 | - |
| 3.7721 | 3892 | 0.4722 | - |
| 3.7730 | 3893 | 0.3595 | - |
| 3.7740 | 3894 | 0.4321 | - |
| 3.7750 | 3895 | 0.3281 | - |
| 3.7759 | 3896 | 0.3938 | - |
| 3.7769 | 3897 | 0.4071 | - |
| 3.7779 | 3898 | 0.493 | - |
| 3.7789 | 3899 | 0.4255 | - |
| 3.7798 | 3900 | 0.4097 | 0.8075 |
| 3.7808 | 3901 | 0.4024 | - |
| 3.7818 | 3902 | 0.5435 | - |
| 3.7827 | 3903 | 0.3145 | - |
| 3.7837 | 3904 | 0.3779 | - |
| 3.7847 | 3905 | 0.3576 | - |
| 3.7856 | 3906 | 0.4215 | - |
| 3.7866 | 3907 | 0.4326 | - |
| 3.7876 | 3908 | 0.3755 | - |
| 3.7886 | 3909 | 0.3781 | - |
| 3.7895 | 3910 | 0.4601 | - |
| 3.7905 | 3911 | 0.4832 | - |
| 3.7915 | 3912 | 0.3378 | - |
| 3.7924 | 3913 | 0.3951 | - |
| 3.7934 | 3914 | 0.5037 | - |
| 3.7944 | 3915 | 0.2934 | - |
| 3.7953 | 3916 | 0.3283 | - |
| 3.7963 | 3917 | 0.4938 | - |
| 3.7973 | 3918 | 0.4575 | - |
| 3.7983 | 3919 | 0.4861 | - |
| 3.7992 | 3920 | 0.3876 | - |
| 3.8002 | 3921 | 0.3772 | - |
| 3.8012 | 3922 | 0.505 | - |
| 3.8021 | 3923 | 0.4055 | - |
| 3.8031 | 3924 | 0.4615 | - |
| 3.8041 | 3925 | 0.3793 | - |
| 3.8050 | 3926 | 0.3452 | - |
| 3.8060 | 3927 | 0.3948 | - |
| 3.8070 | 3928 | 0.4749 | - |
| 3.8080 | 3929 | 0.362 | - |
| 3.8089 | 3930 | 0.3142 | - |
| 3.8099 | 3931 | 0.4973 | - |
| 3.8109 | 3932 | 0.4744 | - |
| 3.8118 | 3933 | 0.469 | - |
| 3.8128 | 3934 | 0.4228 | - |
| 3.8138 | 3935 | 0.3884 | - |
| 3.8147 | 3936 | 0.4161 | - |
| 3.8157 | 3937 | 0.363 | - |
| 3.8167 | 3938 | 0.4782 | - |
| 3.8177 | 3939 | 0.3397 | - |
| 3.8186 | 3940 | 0.4088 | - |
| 3.8196 | 3941 | 0.355 | - |
| 3.8206 | 3942 | 0.5584 | - |
| 3.8215 | 3943 | 0.3059 | - |
| 3.8225 | 3944 | 0.3212 | - |
| 3.8235 | 3945 | 0.3323 | - |
| 3.8244 | 3946 | 0.5085 | - |
| 3.8254 | 3947 | 0.4276 | - |
| 3.8264 | 3948 | 0.3809 | - |
| 3.8274 | 3949 | 0.5036 | - |
| 3.8283 | 3950 | 0.4532 | - |
| 3.8293 | 3951 | 0.4455 | - |
| 3.8303 | 3952 | 0.3247 | - |
| 3.8312 | 3953 | 0.3792 | - |
| 3.8322 | 3954 | 0.5139 | - |
| 3.8332 | 3955 | 0.3971 | - |
| 3.8341 | 3956 | 0.4994 | - |
| 3.8351 | 3957 | 0.3376 | - |
| 3.8361 | 3958 | 0.3506 | - |
| 3.8371 | 3959 | 0.3549 | - |
| 3.8380 | 3960 | 0.3741 | - |
| 3.8390 | 3961 | 0.4146 | - |
| 3.8400 | 3962 | 0.5096 | - |
| 3.8409 | 3963 | 0.3958 | - |
| 3.8419 | 3964 | 0.3555 | - |
| 3.8429 | 3965 | 0.3995 | - |
| 3.8438 | 3966 | 0.4461 | - |
| 3.8448 | 3967 | 0.291 | - |
| 3.8458 | 3968 | 0.4799 | - |
| 3.8468 | 3969 | 0.4635 | - |
| 3.8477 | 3970 | 0.5236 | - |
| 3.8487 | 3971 | 0.3948 | - |
| 3.8497 | 3972 | 0.3188 | - |
| 3.8506 | 3973 | 0.3616 | - |
| 3.8516 | 3974 | 0.3948 | - |
| 3.8526 | 3975 | 0.3852 | - |
| 3.8535 | 3976 | 0.4645 | - |
| 3.8545 | 3977 | 0.3571 | - |
| 3.8555 | 3978 | 0.3232 | - |
| 3.8565 | 3979 | 0.5128 | - |
| 3.8574 | 3980 | 0.405 | - |
| 3.8584 | 3981 | 0.6004 | - |
| 3.8594 | 3982 | 0.36 | - |
| 3.8603 | 3983 | 0.4346 | - |
| 3.8613 | 3984 | 0.4296 | - |
| 3.8623 | 3985 | 0.4806 | - |
| 3.8632 | 3986 | 0.2918 | - |
| 3.8642 | 3987 | 0.5149 | - |
| 3.8652 | 3988 | 0.4247 | - |
| 3.8661 | 3989 | 0.4492 | - |
| 3.8671 | 3990 | 0.3662 | - |
| 3.8681 | 3991 | 0.339 | - |
| 3.8691 | 3992 | 0.5478 | - |
| 3.8700 | 3993 | 0.3783 | - |
| 3.8710 | 3994 | 0.5345 | - |
| 3.8720 | 3995 | 0.488 | - |
| 3.8729 | 3996 | 0.4659 | - |
| 3.8739 | 3997 | 0.2756 | - |
| 3.8749 | 3998 | 0.4055 | - |
| 3.8758 | 3999 | 0.4062 | - |
| 3.8768 | 4000 | 0.3754 | 0.8106 |
| 3.8778 | 4001 | 0.4341 | - |
| 3.8788 | 4002 | 0.4363 | - |
| 3.8797 | 4003 | 0.3813 | - |
| 3.8807 | 4004 | 0.3798 | - |
| 3.8817 | 4005 | 0.3193 | - |
| 3.8826 | 4006 | 0.3686 | - |
| 3.8836 | 4007 | 0.3831 | - |
| 3.8846 | 4008 | 0.5797 | - |
| 3.8855 | 4009 | 0.4513 | - |
| 3.8865 | 4010 | 0.4369 | - |
| 3.8875 | 4011 | 0.3231 | - |
| 3.8885 | 4012 | 0.4595 | - |
| 3.8894 | 4013 | 0.412 | - |
| 3.8904 | 4014 | 0.3706 | - |
| 3.8914 | 4015 | 0.452 | - |
| 3.8923 | 4016 | 0.4972 | - |
| 3.8933 | 4017 | 0.3665 | - |
| 3.8943 | 4018 | 0.3562 | - |
| 3.8952 | 4019 | 0.4332 | - |
| 3.8962 | 4020 | 0.3807 | - |
| 3.8972 | 4021 | 0.4893 | - |
| 3.8982 | 4022 | 0.4206 | - |
| 3.8991 | 4023 | 0.4137 | - |
| 3.9001 | 4024 | 0.4588 | - |
| 3.9011 | 4025 | 0.4161 | - |
| 3.9020 | 4026 | 0.5312 | - |
| 3.9030 | 4027 | 0.4152 | - |
| 3.9040 | 4028 | 0.5372 | - |
| 3.9049 | 4029 | 0.3955 | - |
| 3.9059 | 4030 | 0.3258 | - |
| 3.9069 | 4031 | 0.471 | - |
| 3.9079 | 4032 | 0.557 | - |
| 3.9088 | 4033 | 0.3726 | - |
| 3.9098 | 4034 | 0.4754 | - |
| 3.9108 | 4035 | 0.4224 | - |
| 3.9117 | 4036 | 0.43 | - |
| 3.9127 | 4037 | 0.3452 | - |
| 3.9137 | 4038 | 0.3867 | - |
| 3.9146 | 4039 | 0.29 | - |
| 3.9156 | 4040 | 0.4259 | - |
| 3.9166 | 4041 | 0.4024 | - |
| 3.9176 | 4042 | 0.3295 | - |
| 3.9185 | 4043 | 0.2796 | - |
| 3.9195 | 4044 | 0.5039 | - |
| 3.9205 | 4045 | 0.4599 | - |
| 3.9214 | 4046 | 0.3674 | - |
| 3.9224 | 4047 | 0.3631 | - |
| 3.9234 | 4048 | 0.4369 | - |
| 3.9243 | 4049 | 0.4591 | - |
| 3.9253 | 4050 | 0.4569 | - |
| 3.9263 | 4051 | 0.4937 | - |
| 3.9273 | 4052 | 0.4751 | - |
| 3.9282 | 4053 | 0.3978 | - |
| 3.9292 | 4054 | 0.3624 | - |
| 3.9302 | 4055 | 0.375 | - |
| 3.9311 | 4056 | 0.3646 | - |
| 3.9321 | 4057 | 0.3742 | - |
| 3.9331 | 4058 | 0.3249 | - |
| 3.9340 | 4059 | 0.4627 | - |
| 3.9350 | 4060 | 0.4368 | - |
| 3.9360 | 4061 | 0.5225 | - |
| 3.9370 | 4062 | 0.3872 | - |
| 3.9379 | 4063 | 0.4439 | - |
| 3.9389 | 4064 | 0.3902 | - |
| 3.9399 | 4065 | 0.4534 | - |
| 3.9408 | 4066 | 0.4111 | - |
| 3.9418 | 4067 | 0.3876 | - |
| 3.9428 | 4068 | 0.4835 | - |
| 3.9437 | 4069 | 0.3555 | - |
| 3.9447 | 4070 | 0.3769 | - |
| 3.9457 | 4071 | 0.4693 | - |
| 3.9467 | 4072 | 0.3485 | - |
| 3.9476 | 4073 | 0.4704 | - |
| 3.9486 | 4074 | 0.2929 | - |
| 3.9496 | 4075 | 0.4668 | - |
| 3.9505 | 4076 | 0.4186 | - |
| 3.9515 | 4077 | 0.4458 | - |
| 3.9525 | 4078 | 0.3272 | - |
| 3.9534 | 4079 | 0.3829 | - |
| 3.9544 | 4080 | 0.4873 | - |
| 3.9554 | 4081 | 0.4058 | - |
| 3.9564 | 4082 | 0.3986 | - |
| 3.9573 | 4083 | 0.2891 | - |
| 3.9583 | 4084 | 0.3566 | - |
| 3.9593 | 4085 | 0.4851 | - |
| 3.9602 | 4086 | 0.441 | - |
| 3.9612 | 4087 | 0.4485 | - |
| 3.9622 | 4088 | 0.432 | - |
| 3.9631 | 4089 | 0.3028 | - |
| 3.9641 | 4090 | 0.4969 | - |
| 3.9651 | 4091 | 0.3974 | - |
| 3.9661 | 4092 | 0.5434 | - |
| 3.9670 | 4093 | 0.4325 | - |
| 3.9680 | 4094 | 0.4328 | - |
| 3.9690 | 4095 | 0.3956 | - |
| 3.9699 | 4096 | 0.3292 | - |
| 3.9709 | 4097 | 0.3426 | - |
| 3.9719 | 4098 | 0.3434 | - |
| 3.9728 | 4099 | 0.3095 | - |
| 3.9738 | 4100 | 0.4029 | 0.8160 |
</details>
### Framework Versions
- Python: 3.10.6
- Sentence Transformers: 3.3.0.dev0
- Transformers: 4.45.2
- PyTorch: 2.4.1+cu118
- Accelerate: 0.34.0
- Datasets: 2.21.0
- Tokenizers: 0.20.2
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### CachedMultipleNegativesRankingLoss
```bibtex
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
TrgTuan10/llava-v1.6-mistral-7b-hf | TrgTuan10 | 2024-11-25T02:51:38Z | 6 | 0 | null | [
"safetensors",
"llava_next",
"vision",
"image-text-to-text",
"conversational",
"en",
"arxiv:2310.03744",
"license:apache-2.0",
"region:us"
] | image-text-to-text | 2024-11-24T05:20:22Z | ---
license: apache-2.0
tags:
- vision
- image-text-to-text
language:
- en
pipeline_tag: image-text-to-text
inference: true
---
# LLaVa-Next, leveraging [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) as LLM
The LLaVA-NeXT model was proposed in [LLaVA-NeXT: Improved reasoning, OCR, and world knowledge](https://llava-vl.github.io/blog/2024-01-30-llava-next/) by Haotian Liu, Chunyuan Li, Yuheng Li, Bo Li, Yuanhan Zhang, Sheng Shen, Yong Jae Lee. LLaVa-NeXT (also called LLaVa-1.6) improves upon [LLaVa-1.5](https://huggingface.co/transformers/main/model_doc/llava.html) by increasing the input image resolution and training on an improved visual instruction tuning dataset to improve OCR and common sense reasoning.
Disclaimer: The team releasing LLaVa-NeXT did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
LLaVa combines a pre-trained large language model with a pre-trained vision encoder for multimodal chatbot use cases. LLaVA 1.6 improves on LLaVA 1.5 by:
- Using [Mistral-7B](https://mistral.ai/news/announcing-mistral-7b/) (for this checkpoint) and [Nous-Hermes-2-Yi-34B](https://huggingface.co/NousResearch/Nous-Hermes-2-Yi-34B), which have better commercial licenses and bilingual support
- A more diverse and higher-quality data mixture
- Dynamic high resolution

## Intended uses & limitations
You can use the raw model for tasks like image captioning, visual question answering, and multimodal chatbot use cases. See the [model hub](https://huggingface.co/models?search=llava-hf) to look for other versions for a task that interests you.
### How to use
Here's the prompt template for this model:
```
"[INST] <image>\nWhat is shown in this image? [/INST]"
```
You can load and use the model as follows:
```python
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration
import torch
from PIL import Image
import requests
processor = LlavaNextProcessor.from_pretrained("llava-hf/llava-v1.6-mistral-7b-hf")
model = LlavaNextForConditionalGeneration.from_pretrained("llava-hf/llava-v1.6-mistral-7b-hf", torch_dtype=torch.float16, low_cpu_mem_usage=True)
model.to("cuda:0")
# prepare image and text prompt, using the appropriate prompt template
url = "https://github.com/haotian-liu/LLaVA/blob/1a91fc274d7c35a9b50b3cb29c4247ae5837ce39/images/llava_v1_5_radar.jpg?raw=true"
image = Image.open(requests.get(url, stream=True).raw)
# Define a chat history and use `apply_chat_template` to get correctly formatted prompt
# Each value in "content" has to be a list of dicts with types ("text", "image")
conversation = [
{
"role": "user",
"content": [
{"type": "text", "text": "What is shown in this image?"},
{"type": "image"},
],
},
]
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
inputs = processor(images=image, text=prompt, return_tensors="pt").to("cuda:0")
# autoregressively complete prompt
output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))
```
### Model optimization
#### 4-bit quantization through `bitsandbytes` library
First make sure to install `bitsandbytes` (`pip install bitsandbytes`) and that you have access to a CUDA-compatible GPU device. Then simply change the snippet above as follows:
```diff
model = LlavaNextForConditionalGeneration.from_pretrained(
model_id,
torch_dtype=torch.float16,
low_cpu_mem_usage=True,
+ load_in_4bit=True
)
```
#### Use Flash-Attention 2 to further speed-up generation
First make sure to install `flash-attn`; refer to the [original repository of Flash Attention](https://github.com/Dao-AILab/flash-attention) for installation instructions. Then simply change the snippet above as follows:
```diff
model = LlavaNextForConditionalGeneration.from_pretrained(
model_id,
torch_dtype=torch.float16,
low_cpu_mem_usage=True,
+ use_flash_attention_2=True
).to(0)
```
### BibTeX entry and citation info
```bibtex
@misc{liu2023improved,
title={Improved Baselines with Visual Instruction Tuning},
author={Haotian Liu and Chunyuan Li and Yuheng Li and Yong Jae Lee},
year={2023},
eprint={2310.03744},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
cvapict/distilbert-base-multilingual-cased-aoe-test2 | cvapict | 2024-11-25T02:32:34Z | 105 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-multilingual-cased",
"base_model:finetune:distilbert/distilbert-base-multilingual-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T02:32:17Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-multilingual-cased-aoe-test2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-multilingual-cased-aoe-test2
This model is a fine-tuned version of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1055
- Accuracy: 0.9727
## Model description
More information needed
## Intended uses & limitations
More information needed
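As a minimal usage sketch (the label names and score semantics depend on the undocumented fine-tuning data, so treat the example input and output as assumptions), the checkpoint can be loaded with the standard `transformers` pipeline:

```python
from transformers import pipeline

# Load the fine-tuned classifier directly from the Hub
classifier = pipeline(
    "text-classification",
    model="cvapict/distilbert-base-multilingual-cased-aoe-test2",
)

# The base model is multilingual, so non-English inputs should work too;
# the returned label set comes from the (undocumented) fine-tuning data.
print(classifier("Este es un texto de ejemplo."))
```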
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
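For reference, the hyperparameters above map onto `TrainingArguments` roughly as follows (a sketch only; `output_dir` and any option not listed above are assumptions):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="distilbert-base-multilingual-cased-aoe-test2",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```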
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1014 | 1.0 | 375 | 0.0816 | 0.9717 |
| 0.103 | 2.0 | 750 | 0.0845 | 0.9667 |
| 0.0438 | 3.0 | 1125 | 0.1055 | 0.9727 |
| 0.0367 | 4.0 | 1500 | 0.1231 | 0.9677 |
| 0.0031 | 5.0 | 1875 | 0.1312 | 0.966 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
mradermacher/MFANNv0.3-GGUF | mradermacher | 2024-11-25T02:26:07Z | 5 | 0 | transformers | [
"transformers",
"gguf",
"en",
"base_model:netcat420/MFANNv0.3",
"base_model:quantized:netcat420/MFANNv0.3",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T01:39:42Z | ---
base_model: netcat420/MFANNv0.3
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/netcat420/MFANNv0.3
<!-- provided-files -->
Weighted/imatrix quants are not available from me at this time. If they do not show up within a week or so after the static ones, I have probably not planned for them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
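As a rough usage sketch (assuming a local llama.cpp build; the binary name — `llama-cli`, formerly `main` — and flags vary between llama.cpp versions), a single quant can be downloaded and run like this:

```bash
# Download one quant file from this repo, then run it with llama.cpp
huggingface-cli download mradermacher/MFANNv0.3-GGUF MFANNv0.3.Q4_K_M.gguf --local-dir .
./llama-cli -m MFANNv0.3.Q4_K_M.gguf -p "Hello, how are you?" -n 128
```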
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/MFANNv0.3-GGUF/resolve/main/MFANNv0.3.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
MikeRoz/TheDrummer_Behemoth-123B-v2.2-4.0bpw-h6-exl2 | MikeRoz | 2024-11-25T02:26:04Z | 6 | 0 | null | [
"safetensors",
"mistral",
"license:other",
"4-bit",
"exl2",
"region:us"
] | null | 2024-11-24T22:32:37Z | ---
license: other
---
# Join our Discord! https://discord.gg/Nbv9pQ88Xb
## Nearly 2500 members strong 💪
### Now with more channels! A hub for creatives and makers alike!
---
[BeaverAI](https://huggingface.co/BeaverAI) proudly presents...
# Behemoth 123B v2.2 🦣
> Nothing in the void is foreign to us. The place we go is the place we belong.

## Links
- Original: https://huggingface.co/TheDrummer/Behemoth-123B-v2.2
- GGUF: https://huggingface.co/TheDrummer/Behemoth-123B-v2.2-GGUF
- iMatrix: https://huggingface.co/bartowski/Behemoth-123B-v2.2-GGUF (recommended for smaller quants)
## Description
Behemoth v2.x is a finetune of the new Largestral 2411 with system prompt support. Testers have noted that **everything** felt improved.
### Usage
Testers say this frankenformat maximizes the model's potential: **Metharme** with Mistral's new system tokens
- `[SYSTEM_PROMPT] <|system|>{{system_message}}[/SYSTEM_PROMPT]<|user|>{{user_message}}<|model|>{{assistant_message}}`
- `<|system|>[SYSTEM_PROMPT] {{system_message}}[/SYSTEM_PROMPT]<|user|>{{user_message}}<|model|>{{assistant_message}}`
*Take note that the opening `[SYSTEM_PROMPT]` tag SHOULD ALWAYS be followed by a single whitespace.*
Complete SillyTavern Settings in BeaverAI Club: https://discord.com/channels/1238219753324281886/1309968730301792370/1309968730301792370
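For illustration, here is a small helper (a sketch of the first template variant above only, not an official template implementation) that assembles a single-turn prompt in this format:

```python
def build_prompt(system_message: str, user_message: str) -> str:
    # First variant above; note the single whitespace after [SYSTEM_PROMPT],
    # which testers report is required.
    return (
        f"[SYSTEM_PROMPT] <|system|>{system_message}[/SYSTEM_PROMPT]"
        f"<|user|>{user_message}<|model|>"
    )

prompt = build_prompt("You are a creative storyteller.", "Write an opening line.")
print(prompt)
```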
### Versions
- [v2.0](https://huggingface.co/TheDrummer/Behemoth-123B-v2) is equivalent to Behemoth v1.0 (Classic)
- [v2.1](https://huggingface.co/TheDrummer/Behemoth-123B-v2.1) is equivalent to Behemoth v1.1 (Creative Boost)
- [v2.2](https://huggingface.co/TheDrummer/Behemoth-123B-v2.2) is an improvement of Behemoth v2.1 (Creative++)
## Special Thanks
Thank you to each and everyone who donated/subscribed in [Ko-Fi](https://ko-fi.com/thedrummer) 🙇 I hope to never disappoint!
```
Toasty Pigeon
theguywhogamesalot
Grozi
F
Marinara
Ko-fi Supporter
Grozi
Phaelon
ONTHEREDTEAM
EvarinSharath'fe(USM-Valor)
Silva
Dakkidaze
AlexTheVP
Pseudo
Kistara
Dr. Fjut
Grozi 🥈
KinjiHakari777
dustywintr
Syd
HumbleConsumer
Syd
Ko-fi Supporter
Arkamist
joe 🥇
Toad
Lied
Konnect
Kistara
Grozi 🥉
SleepDeprived3
Luigi
Nestor
```
https://ko-fi.com/thedrummer/leaderboard
```
Finetuned by yours truly,
Drummer
```

|
VijayChoudhari/speecht5_finetuned_common_voice_17_0_mr | VijayChoudhari | 2024-11-25T02:17:06Z | 74 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"speecht5",
"text-to-audio",
"generated_from_trainer",
"dataset:common_voice_17_0",
"base_model:microsoft/speecht5_tts",
"base_model:finetune:microsoft/speecht5_tts",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-to-audio | 2024-11-24T11:58:52Z | ---
library_name: transformers
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
datasets:
- common_voice_17_0
model-index:
- name: speecht5_finetuned_common_voice_17_0_mr
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned_common_voice_17_0_mr
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the common_voice_17_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5241
## Model description
More information needed
## Intended uses & limitations
More information needed
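As a hedged inference sketch (the Marathi input text and the CMU Arctic speaker embedding are assumptions; a Marathi speaker x-vector would match the fine-tuning better):

```python
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

model_id = "VijayChoudhari/speecht5_finetuned_common_voice_17_0_mr"
processor = SpeechT5Processor.from_pretrained(model_id)
model = SpeechT5ForTextToSpeech.from_pretrained(model_id)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# SpeechT5 needs a speaker embedding; CMU Arctic x-vectors are a common stand-in.
xvectors = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(xvectors[7306]["xvector"]).unsqueeze(0)

inputs = processor(text="नमस्कार, तुम्ही कसे आहात?", return_tensors="pt")
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```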
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: ADAMW_TORCH (AdamW) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| 4.6254 | 16.0962 | 1000 | 0.5405 |
| 4.489 | 32.1924 | 2000 | 0.5284 |
| 4.4293 | 48.2886 | 3000 | 0.5270 |
| 4.3461 | 64.3848 | 4000 | 0.5241 |
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cpu
- Datasets 3.0.3.dev0
- Tokenizers 0.20.1
|
moritzbur/lilt-GottBERT-base-xfund-de | moritzbur | 2024-11-25T02:16:51Z | 5 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"lilt",
"token-classification",
"generated_from_trainer",
"dataset:xfund",
"base_model:moritzbur/lilt-GottBERT-base",
"base_model:finetune:moritzbur/lilt-GottBERT-base",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-24T23:54:00Z | ---
library_name: transformers
base_model: moritzbur/lilt-GottBERT-base
tags:
- generated_from_trainer
datasets:
- xfund
model-index:
- name: lilt-GottBERT-base-xfund-de
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# lilt-GottBERT-base-xfund-de
This model is a fine-tuned version of [moritzbur/lilt-GottBERT-base](https://huggingface.co/moritzbur/lilt-GottBERT-base) on the xfund dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7402
- Answer: {'precision': 0.7931914893617021, 'recall': 0.8589861751152074, 'f1': 0.8247787610619469, 'number': 1085}
- Header: {'precision': 0.5581395348837209, 'recall': 0.41379310344827586, 'f1': 0.4752475247524752, 'number': 58}
- Question: {'precision': 0.7877906976744186, 'recall': 0.7465564738292011, 'f1': 0.7666195190947666, 'number': 726}
- Overall Precision: 0.7859
- Overall Recall: 0.8015
- Overall F1: 0.7936
- Overall Accuracy: 0.7255
## Model description
More information needed
## Intended uses & limitations
More information needed
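As a hedged inference sketch (LiLT consumes OCR words plus bounding boxes normalized to a 0-1000 page grid; the example words, boxes, and the availability of a fast tokenizer in this repo are assumptions):

```python
import torch
from transformers import AutoTokenizer, LiltForTokenClassification

model_id = "moritzbur/lilt-GottBERT-base-xfund-de"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LiltForTokenClassification.from_pretrained(model_id)

# Words and their layout boxes, as produced by an upstream OCR step
words = ["Name:", "Mustermann"]
word_boxes = [[48, 84, 156, 98], [160, 84, 254, 98]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to one box per subword token (special tokens get zeros)
token_boxes = [
    word_boxes[idx] if idx is not None else [0, 0, 0, 0]
    for idx in encoding.word_ids()
]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
labels = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]
print(labels)
```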
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 2000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.0373 | 20.0 | 200 | 1.8211 | {'precision': 0.7350565428109854, 'recall': 0.8387096774193549, 'f1': 0.7834696513129574, 'number': 1085} | {'precision': 0.5135135135135135, 'recall': 0.3275862068965517, 'f1': 0.4, 'number': 58} | {'precision': 0.7130102040816326, 'recall': 0.7699724517906336, 'f1': 0.7403973509933776, 'number': 726} | 0.7227 | 0.7961 | 0.7576 | 0.7076 |
| 0.0345 | 40.0 | 400 | 2.1454 | {'precision': 0.7412698412698413, 'recall': 0.8608294930875576, 'f1': 0.796588486140725, 'number': 1085} | {'precision': 0.48148148148148145, 'recall': 0.4482758620689655, 'f1': 0.4642857142857143, 'number': 58} | {'precision': 0.6554809843400448, 'recall': 0.8071625344352618, 'f1': 0.7234567901234568, 'number': 726} | 0.7002 | 0.8272 | 0.7584 | 0.6866 |
| 0.0114 | 60.0 | 600 | 2.0185 | {'precision': 0.8492723492723493, 'recall': 0.7529953917050691, 'f1': 0.7982413287738153, 'number': 1085} | {'precision': 0.7857142857142857, 'recall': 0.3793103448275862, 'f1': 0.5116279069767441, 'number': 58} | {'precision': 0.7317073170731707, 'recall': 0.7851239669421488, 'f1': 0.7574750830564784, 'number': 726} | 0.7965 | 0.7539 | 0.7746 | 0.7294 |
| 0.0043 | 80.0 | 800 | 1.7402 | {'precision': 0.7931914893617021, 'recall': 0.8589861751152074, 'f1': 0.8247787610619469, 'number': 1085} | {'precision': 0.5581395348837209, 'recall': 0.41379310344827586, 'f1': 0.4752475247524752, 'number': 58} | {'precision': 0.7877906976744186, 'recall': 0.7465564738292011, 'f1': 0.7666195190947666, 'number': 726} | 0.7859 | 0.8015 | 0.7936 | 0.7255 |
| 0.0013 | 100.0 | 1000 | 1.8975 | {'precision': 0.8072727272727273, 'recall': 0.8184331797235023, 'f1': 0.8128146453089244, 'number': 1085} | {'precision': 0.5, 'recall': 0.41379310344827586, 'f1': 0.4528301886792453, 'number': 58} | {'precision': 0.7246022031823746, 'recall': 0.8154269972451791, 'f1': 0.7673363577446531, 'number': 726} | 0.7654 | 0.8047 | 0.7846 | 0.7248 |
| 0.0009 | 120.0 | 1200 | 1.8875 | {'precision': 0.8050314465408805, 'recall': 0.8258064516129032, 'f1': 0.8152866242038216, 'number': 1085} | {'precision': 0.6666666666666666, 'recall': 0.3793103448275862, 'f1': 0.48351648351648346, 'number': 58} | {'precision': 0.7094017094017094, 'recall': 0.800275482093664, 'f1': 0.7521035598705502, 'number': 726} | 0.7628 | 0.8020 | 0.7820 | 0.7334 |
| 0.0003 | 140.0 | 1400 | 1.9918 | {'precision': 0.8246575342465754, 'recall': 0.832258064516129, 'f1': 0.8284403669724771, 'number': 1085} | {'precision': 0.4716981132075472, 'recall': 0.43103448275862066, 'f1': 0.45045045045045046, 'number': 58} | {'precision': 0.7354430379746836, 'recall': 0.800275482093664, 'f1': 0.766490765171504, 'number': 726} | 0.7786 | 0.8074 | 0.7928 | 0.7316 |
| 0.0003 | 160.0 | 1600 | 2.4537 | {'precision': 0.7632850241545893, 'recall': 0.8737327188940092, 'f1': 0.8147829823807479, 'number': 1085} | {'precision': 0.6857142857142857, 'recall': 0.41379310344827586, 'f1': 0.5161290322580646, 'number': 58} | {'precision': 0.7536231884057971, 'recall': 0.7878787878787878, 'f1': 0.7703703703703704, 'number': 726} | 0.7583 | 0.8261 | 0.7908 | 0.6903 |
| 0.0004 | 180.0 | 1800 | 2.1619 | {'precision': 0.785593220338983, 'recall': 0.8543778801843318, 'f1': 0.8185430463576159, 'number': 1085} | {'precision': 0.5641025641025641, 'recall': 0.3793103448275862, 'f1': 0.4536082474226804, 'number': 58} | {'precision': 0.7718579234972678, 'recall': 0.778236914600551, 'f1': 0.7750342935528121, 'number': 726} | 0.7760 | 0.8101 | 0.7927 | 0.7197 |
| 0.0003 | 200.0 | 2000 | 2.1507 | {'precision': 0.7948051948051948, 'recall': 0.8460829493087557, 'f1': 0.8196428571428571, 'number': 1085} | {'precision': 0.631578947368421, 'recall': 0.41379310344827586, 'f1': 0.5, 'number': 58} | {'precision': 0.7438551099611902, 'recall': 0.7920110192837465, 'f1': 0.7671781187458305, 'number': 726} | 0.7716 | 0.8117 | 0.7911 | 0.7207 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
Shinyaaa/outputs_sup_simcse_v3 | Shinyaaa | 2024-11-25T02:16:13Z | 103 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T02:15:45Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
maidalun1020/bce-reranker-base_v1 | maidalun1020 | 2024-11-25T02:16:09Z | 16,228 | 180 | sentence-transformers | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"text-classification",
"transformers",
"en",
"zh",
"ja",
"ko",
"license:apache-2.0",
"region:us"
] | text-classification | 2023-12-29T07:37:26Z | ---
license: apache-2.0
pipeline_tag: text-classification
tags:
- transformers
- sentence-transformers
language:
- en
- zh
- ja
- ko
---
<!--
* @Description:
* @Author: shenlei
* @Date: 2023-12-19 10:31:41
* @LastEditTime: 2024-01-10 00:17:02
* @LastEditors: shenlei
-->
<h1 align="center">BCEmbedding: Bilingual and Crosslingual Embedding for RAG</h1>
<p align="center">
<a href="https://github.com/netease-youdao/BCEmbedding/blob/master/LICENSE">
<img src="https://img.shields.io/badge/license-Apache--2.0-yellow">
</a>
<a href="https://twitter.com/YDopensource">
<img src="https://img.shields.io/badge/follow-%40YDOpenSource-1DA1F2?logo=twitter&style={style}">
</a>
</p>
For the latest and most detailed information about bce-reranker-base_v1 (including "Updates"), please check:
<p align="left">
<a href="https://github.com/netease-youdao/BCEmbedding">GitHub</a>
</p>
## Highlights:
- Multilingual and crosslingual capability in English, Chinese, Japanese and Korean;
- RAG-optimized and adapted to many real-world domains, including Education, Law, Finance, Medical, Literature, FAQ, Textbook, Wikipedia, etc.;
- <a href="https://github.com/netease-youdao/BCEmbedding">BCEmbedding</a> handles reranking of long passages beyond the 512-token limit;
- `RerankerModel` provides **meaningful absolute similarity scores**; a threshold of 0.35 or 0.4 is recommended for filtering low-quality passages;
- **Best practice**: recall the top 50-100 passages with [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1), then rerank them with [bce-reranker-base_v1](https://huggingface.co/maidalun1020/bce-reranker-base_v1) and keep the top 5-10 for precision, as shown in the sketch below.
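A minimal sketch of this two-stage recall-then-rerank pipeline with the `BCEmbedding` package (method names follow the project README; treat the exact signatures and score normalization as assumptions):

```python
import numpy as np
from BCEmbedding import EmbeddingModel, RerankerModel

query = "What is BCEmbedding?"
passages = ["BCEmbedding is a bilingual and crosslingual embedding toolkit.",
            "An unrelated passage about cooking."]

# Stage 1: dense recall with the embedding model (keep top 50-100 in practice).
# Embeddings are assumed to be unit-normalized, so a dot product is cosine similarity.
embedder = EmbeddingModel(model_name_or_path="maidalun1020/bce-embedding-base_v1")
query_emb = embedder.encode([query])
passage_embs = embedder.encode(passages)
recall_scores = (query_emb @ passage_embs.T)[0]
candidates = np.argsort(-recall_scores)[:100]

# Stage 2: precision reranking; passages scoring below ~0.35-0.4 can be dropped
reranker = RerankerModel(model_name_or_path="maidalun1020/bce-reranker-base_v1")
rerank_scores = reranker.compute_score([[query, passages[i]] for i in candidates])
print(sorted(zip(rerank_scores, candidates), reverse=True))
```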
## News:
- **Technical Blog** for `BCEmbedding`: [为RAG而生-BCEmbedding技术报告](https://zhuanlan.zhihu.com/p/681370855) ("Born for RAG: the BCEmbedding Technical Report", in Chinese)
- Related link for **EmbeddingModel**: [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1)
## Third-party Examples:
- RAG applications: [QAnything](https://github.com/netease-youdao/qanything), [HuixiangDou](https://github.com/InternLM/HuixiangDou), [ChatPDF](https://github.com/shibing624/ChatPDF).
- Efficient inference frameworks: [ChatLLM.cpp](https://github.com/foldl/chatllm.cpp), [Xinference](https://github.com/xorbitsai/inference), [mindnlp (Huawei GPU)](https://github.com/mindspore-lab/mindnlp/tree/master/llm/inference/bce).


-----------------------------------------
<details open="open">
<summary>Click to Open Contents</summary>
- <a href="#-bilingual-and-crosslingual-superiority" target="_Self">🌐 Bilingual and Crosslingual Superiority</a>
- <a href="#-key-features" target="_Self">💡 Key Features</a>
- <a href="#-latest-updates" target="_Self">🚀 Latest Updates</a>
- <a href="#-model-list" target="_Self">🍎 Model List</a>
- <a href="#-manual" target="_Self">📖 Manual</a>
- <a href="#installation" target="_Self">Installation</a>
- <a href="#quick-start" target="_Self">Quick Start (`transformers`, `sentence-transformers`)</a>
- <a href="#integrations-for-rag-frameworks" target="_Self">Integrations for RAG Frameworks (`langchain`, `llama_index`)</a>
- <a href="#%EF%B8%8F-evaluation" target="_Self">⚙️ Evaluation</a>
- <a href="#evaluate-semantic-representation-by-mteb" target="_Self">Evaluate Semantic Representation by MTEB</a>
- <a href="#evaluate-rag-by-llamaindex" target="_Self">Evaluate RAG by LlamaIndex</a>
- <a href="#-leaderboard" target="_Self">📈 Leaderboard</a>
- <a href="#semantic-representation-evaluations-in-mteb" target="_Self">Semantic Representation Evaluations in MTEB</a>
- <a href="#rag-evaluations-in-llamaindex" target="_Self">RAG Evaluations in LlamaIndex</a>
- <a href="#-youdaos-bcembedding-api" target="_Self">🛠 Youdao's BCEmbedding API</a>
- <a href="#-wechat-group" target="_Self">🧲 WeChat Group</a>
- <a href="#%EF%B8%8F-citation" target="_Self">✏️ Citation</a>
- <a href="#-license" target="_Self">🔐 License</a>
- <a href="#-related-links" target="_Self">🔗 Related Links</a>
</details>
<br>
**B**ilingual and **C**rosslingual **Embedding** (`BCEmbedding`), developed by NetEase Youdao, encompasses `EmbeddingModel` and `RerankerModel`. The `EmbeddingModel` specializes in generating semantic vectors, playing a crucial role in semantic search and question-answering, and the `RerankerModel` excels at refining search results and ranking tasks.
`BCEmbedding` serves as the cornerstone of Youdao's Retrieval Augmented Generation (RAG) implementation, notably [QAnything](http://qanything.ai) [[github](https://github.com/netease-youdao/qanything)], an open-source project widely integrated in various Youdao products like [Youdao Speed Reading](https://read.youdao.com/#/home) and [Youdao Translation](https://fanyi.youdao.com/download-Mac?keyfrom=fanyiweb_navigation).
Distinguished for its bilingual and crosslingual proficiency, `BCEmbedding` excels in bridging Chinese and English linguistic gaps, which achieves
- **High performance on <a href="#semantic-representation-evaluations-in-mteb">Semantic Representation Evaluations in MTEB</a>**;
- **A new benchmark in the realm of <a href="#rag-evaluations-in-llamaindex">RAG Evaluations in LlamaIndex</a>**.
## 🌐 Bilingual and Crosslingual Superiority
Existing embedding models often encounter performance challenges in bilingual and crosslingual scenarios, particularly in Chinese, English and their crosslingual tasks. `BCEmbedding`, leveraging the strength of Youdao's translation engine, excels in delivering superior performance across monolingual, bilingual, and crosslingual settings.
`EmbeddingModel` supports ***Chinese (ch) and English (en)*** (more languages support will come soon), while `RerankerModel` supports ***Chinese (ch), English (en), Japanese (ja) and Korean (ko)***.
## 💡 Key Features
- **Bilingual and Crosslingual Proficiency**: Powered by Youdao's translation engine, excelling in Chinese, English and their crosslingual retrieval tasks, with upcoming support for additional languages.
- **RAG-Optimized**: Tailored for diverse RAG tasks including **translation, summarization, and question answering**, ensuring accurate **query understanding**. See <a href="#rag-evaluations-in-llamaindex">RAG Evaluations in LlamaIndex</a>.
- **Efficient and Precise Retrieval**: A dual-encoder `EmbeddingModel` for efficient first-stage retrieval, followed by a cross-encoder `RerankerModel` for enhanced precision and deeper semantic analysis in the second stage (see the sketch after this list).
- **Broad Domain Adaptability**: Trained on diverse datasets for superior performance across various fields.
- **User-Friendly Design**: Instruction-free and versatile; no query instruction needs to be designed for each task.
- **Meaningful Reranking Scores**: `RerankerModel` provides meaningful relevance scores (not just an ordering) that can filter out irrelevant passages, improve result quality, and optimize large language model performance.
- **Proven in Production**: Successfully implemented and validated in Youdao's products.
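To make the two-stage design concrete, here is a minimal retrieve-then-rerank sketch. It uses the `EmbeddingModel`/`RerankerModel` APIs shown in the Quick Start below; the toy corpus, the shortlist size, and the assumption that `encode` returns a NumPy-compatible array are illustrative, not prescribed by the library.
```python
import numpy as np
from BCEmbedding import EmbeddingModel, RerankerModel

query = 'input_query'
corpus = ['passage_0', 'passage_1', 'passage_2']  # placeholder document chunks

# stage 1: efficient dual-encoder retrieval with EmbeddingModel
embedder = EmbeddingModel(model_name_or_path="maidalun1020/bce-embedding-base_v1")
corpus_embeddings = np.asarray(embedder.encode(corpus))  # (num_passages, dim)
query_embedding = np.asarray(embedder.encode([query]))   # (1, dim)
similarities = (query_embedding @ corpus_embeddings.T).ravel()
top_k = similarities.argsort()[::-1][:2]                 # shortlist the 2 best candidates
candidates = [corpus[i] for i in top_k]

# stage 2: precise cross-encoder reranking of the shortlist with RerankerModel
reranker = RerankerModel(model_name_or_path="maidalun1020/bce-reranker-base_v1")
rerank_results = reranker.rerank(query, candidates)
```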
## 🚀 Latest Updates
- ***2024-01-03***: **Model Releases** - [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1) and [bce-reranker-base_v1](https://huggingface.co/maidalun1020/bce-reranker-base_v1) are available.
- ***2024-01-03***: **Eval Datasets** [[CrosslingualMultiDomainsDataset](https://huggingface.co/datasets/maidalun1020/CrosslingualMultiDomainsDataset)] - Evaluate the performance of RAG, using [LlamaIndex](https://github.com/run-llama/llama_index).
- ***2024-01-03***: **Eval Datasets** [[Details](https://github.com/netease-youdao/BCEmbedding/blob/master/BCEmbedding/evaluation/c_mteb/Retrieval.py)] - Evaluate the performance of crosslingual semantic representation, using [MTEB](https://github.com/embeddings-benchmark/mteb).
## 🍎 Model List
| Model Name | Model Type | Languages | Parameters | Weights |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|
| bce-embedding-base_v1 | `EmbeddingModel` | ch, en | 279M | [download](https://huggingface.co/maidalun1020/bce-embedding-base_v1) |
| bce-reranker-base_v1 | `RerankerModel` | ch, en, ja, ko | 279M | [download](https://huggingface.co/maidalun1020/bce-reranker-base_v1) |
## 📖 Manual
### Installation
First, create a conda environment and activate it.
```bash
conda create --name bce python=3.10 -y
conda activate bce
```
Then install `BCEmbedding` (minimal installation):
```bash
pip install BCEmbedding==0.1.1
```
Or install from source:
```bash
git clone [email protected]:netease-youdao/BCEmbedding.git
cd BCEmbedding
pip install -v -e .
```
### Quick Start
#### 1. Based on `BCEmbedding`
Use `EmbeddingModel`; the `cls` [pooler](./BCEmbedding/models/embedding.py#L24) is the default.
```python
from BCEmbedding import EmbeddingModel
# list of sentences
sentences = ['sentence_0', 'sentence_1', ...]
# init embedding model
model = EmbeddingModel(model_name_or_path="maidalun1020/bce-embedding-base_v1")
# extract embeddings
embeddings = model.encode(sentences)
```
Use `RerankerModel` to calculate relevant scores and rerank:
```python
from BCEmbedding import RerankerModel
# your query and corresponding passages
query = 'input_query'
passages = ['passage_0', 'passage_1', ...]
# construct sentence pairs
sentence_pairs = [[query, passage] for passage in passages]
# init reranker model
model = RerankerModel(model_name_or_path="maidalun1020/bce-reranker-base_v1")
# method 0: calculate scores of sentence pairs
scores = model.compute_score(sentence_pairs)
# method 1: rerank passages
rerank_results = model.rerank(query, passages)
```
NOTE:
- The [`RerankerModel.rerank`](./BCEmbedding/models/reranker.py#L137) method includes an advanced preprocessing step, used in our production systems, for constructing `sentence_pairs` when the passages are very long.
#### 2. Based on `transformers`
For `EmbeddingModel`:
```python
from transformers import AutoModel, AutoTokenizer
# list of sentences
sentences = ['sentence_0', 'sentence_1', ...]
# init model and tokenizer
tokenizer = AutoTokenizer.from_pretrained('maidalun1020/bce-embedding-base_v1')
model = AutoModel.from_pretrained('maidalun1020/bce-embedding-base_v1')
device = 'cuda' # if no GPU, set "cpu"
model.to(device)
# get inputs
inputs = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")
inputs_on_device = {k: v.to(device) for k, v in inputs.items()}
# get embeddings
outputs = model(**inputs_on_device, return_dict=True)
embeddings = outputs.last_hidden_state[:, 0] # cls pooler
embeddings = embeddings / embeddings.norm(dim=1, keepdim=True) # normalize
```
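Because the embeddings above are L2-normalized, cosine similarity reduces to a plain dot product. A tiny follow-up sketch (the choice of sentence indices is ours, for illustration):
```python
# cosine similarity between the first sentence and the rest;
# a dot product is valid here only because embeddings were normalized above
similarities = embeddings[0] @ embeddings[1:].T
print(similarities)
```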
For `RerankerModel`:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
# your query and corresponding passages, paired as in the BCEmbedding example above
query = 'input_query'
passages = ['passage_0', 'passage_1']
sentence_pairs = [[query, passage] for passage in passages]
# init model and tokenizer
tokenizer = AutoTokenizer.from_pretrained('maidalun1020/bce-reranker-base_v1')
model = AutoModelForSequenceClassification.from_pretrained('maidalun1020/bce-reranker-base_v1')
device = 'cuda' # if no GPU, set "cpu"
model.to(device)
# get inputs
inputs = tokenizer(sentence_pairs, padding=True, truncation=True, max_length=512, return_tensors="pt")
inputs_on_device = {k: v.to(device) for k, v in inputs.items()}
# calculate scores
scores = model(**inputs_on_device, return_dict=True).logits.view(-1,).float()
scores = torch.sigmoid(scores)
```
#### 3. Based on `sentence_transformers`
For `EmbeddingModel`:
```python
from sentence_transformers import SentenceTransformer
# list of sentences
sentences = ['sentence_0', 'sentence_1', ...]
# init embedding model
## sentence-transformers was recently updated, so first clear "`SENTENCE_TRANSFORMERS_HOME`/maidalun1020_bce-embedding-base_v1" or "~/.cache/torch/sentence_transformers/maidalun1020_bce-embedding-base_v1" to download the new version.
model = SentenceTransformer("maidalun1020/bce-embedding-base_v1")
# extract embeddings
embeddings = model.encode(sentences, normalize_embeddings=True)
```
For `RerankerModel`:
```python
from sentence_transformers import CrossEncoder
# init reranker model
model = CrossEncoder('maidalun1020/bce-reranker-base_v1', max_length=512)
# calculate scores of sentence pairs, constructed as in the examples above
sentence_pairs = [['input_query', 'passage_0'], ['input_query', 'passage_1']]
scores = model.predict(sentence_pairs)
```
### Integrations for RAG Frameworks
#### 1. Used in `langchain`
```python
from langchain.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_community.vectorstores.utils import DistanceStrategy
query = 'apples'
passages = [
'I like apples',
'I like oranges',
'Apples and oranges are fruits'
]
# init embedding model
model_name = 'maidalun1020/bce-embedding-base_v1'
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'batch_size': 64, 'normalize_embeddings': True, 'show_progress_bar': False}
embed_model = HuggingFaceEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs
)
# example #1. extract embeddings
query_embedding = embed_model.embed_query(query)
passages_embeddings = embed_model.embed_documents(passages)
# example #2. langchain retriever example
faiss_vectorstore = FAISS.from_texts(passages, embed_model, distance_strategy=DistanceStrategy.MAX_INNER_PRODUCT)
retriever = faiss_vectorstore.as_retriever(search_type="similarity", search_kwargs={"score_threshold": 0.5, "k": 3})
related_passages = retriever.get_relevant_documents(query)
```
#### 2. Used in `llama_index`
```python
import os
from llama_index.embeddings import HuggingFaceEmbedding
from llama_index import VectorStoreIndex, ServiceContext, SimpleDirectoryReader
from llama_index.node_parser import SimpleNodeParser
from llama_index.llms import OpenAI
query = 'apples'
passages = [
'I like apples',
'I like oranges',
'Apples and oranges are fruits'
]
# init embedding model
model_args = {'model_name': 'maidalun1020/bce-embedding-base_v1', 'max_length': 512, 'embed_batch_size': 64, 'device': 'cuda'}
embed_model = HuggingFaceEmbedding(**model_args)
# example #1. extract embeddings
query_embedding = embed_model.get_query_embedding(query)
passages_embeddings = embed_model.get_text_embedding_batch(passages)
# example #2. rag example
llm = OpenAI(model='gpt-3.5-turbo-0613', api_key=os.environ.get('OPENAI_API_KEY'), api_base=os.environ.get('OPENAI_BASE_URL'))
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
documents = SimpleDirectoryReader(input_files=["BCEmbedding/tools/eval_rag/eval_pdfs/Comp_en_llama2.pdf"]).load_data()
node_parser = SimpleNodeParser.from_defaults(chunk_size=512)
nodes = node_parser.get_nodes_from_documents(documents[0:36])
index = VectorStoreIndex(nodes, service_context=service_context)
query_engine = index.as_query_engine()
response = query_engine.query("What is llama?")
```
## ⚙️ Evaluation
### Evaluate Semantic Representation by MTEB
We provide evaluation tools for `embedding` and `reranker` models, based on [MTEB](https://github.com/embeddings-benchmark/mteb) and [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB).
#### 1. Embedding Models
Just run the following command to evaluate `your_embedding_model` (e.g. `maidalun1020/bce-embedding-base_v1`) in **bilingual and crosslingual settings** (e.g. `["en", "zh", "en-zh", "zh-en"]`):
```bash
python BCEmbedding/tools/eval_mteb/eval_embedding_mteb.py --model_name_or_path maidalun1020/bce-embedding-base_v1 --pooler cls
```
The total evaluation covers ***114 datasets*** across the six task types **"Retrieval", "STS", "PairClassification", "Classification", "Reranking" and "Clustering"**.
***NOTE:***
- **All models are evaluated in their recommended pooling method (`pooler`)**.
- `mean` pooler: "jina-embeddings-v2-base-en", "m3e-base", "m3e-large", "e5-large-v2", "multilingual-e5-base", "multilingual-e5-large" and "gte-large".
- `cls` pooler: Other models.
- "jina-embeddings-v2-base-en" model should be loaded with `trust_remote_code`.
```bash
python BCEmbedding/tools/eval_mteb/eval_embedding_mteb.py --model_name_or_path {moka-ai/m3e-base | moka-ai/m3e-large} --pooler mean
python BCEmbedding/tools/eval_mteb/eval_embedding_mteb.py --model_name_or_path jinaai/jina-embeddings-v2-base-en --pooler mean --trust_remote_code
```
#### 2. Reranker Models
Run the following command to evaluate `your_reranker_model` (e.g. `maidalun1020/bce-reranker-base_v1`) in **bilingual and crosslingual settings** (e.g. `["en", "zh", "en-zh", "zh-en"]`):
```bash
python BCEmbedding/tools/eval_mteb/eval_reranker_mteb.py --model_name_or_path maidalun1020/bce-reranker-base_v1
```
The evaluation covers ***12 datasets*** of the **"Reranking"** task.
#### 3. Metrics Visualization Tool
We provide a one-click script that summarizes the evaluation results of `embedding` and `reranker` models into a markdown file; see [Embedding Models Evaluation Summary](https://github.com/netease-youdao/BCEmbedding/blob/master/Docs/EvaluationSummary/embedding_eval_summary.md) and [Reranker Models Evaluation Summary](https://github.com/netease-youdao/BCEmbedding/blob/master/Docs/EvaluationSummary/reranker_eval_summary.md).
```bash
python BCEmbedding/evaluation/mteb/summarize_eval_results.py --results_dir {your_embedding_results_dir | your_reranker_results_dir}
```
### Evaluate RAG by LlamaIndex
[LlamaIndex](https://github.com/run-llama/llama_index) is a well-known open-source data framework for LLM-based applications, particularly RAG. Recently, the [LlamaIndex Blog](https://blog.llamaindex.ai/boosting-rag-picking-the-best-embedding-reranker-models-42d079022e83) evaluated popular embedding and reranker models in a RAG pipeline and attracted great attention. Below, we follow its pipeline to evaluate our `BCEmbedding`.
First, install LlamaIndex:
```bash
pip install llama-index==0.9.22
```
#### 1. Metrics Definition
- Hit Rate:
Hit rate calculates the fraction of queries where the correct answer is found within the top-k retrieved documents. In simpler terms, it measures how often the system gets it right within its top few guesses. ***The larger, the better.***
- Mean Reciprocal Rank (MRR):
For each query, MRR evaluates the system's accuracy by looking at the rank of the highest-placed relevant document. Specifically, it is the average of the reciprocals of these ranks across all queries. So, if the first relevant document is the top result, the reciprocal rank is 1; if it is second, the reciprocal rank is 1/2, and so on. ***The larger, the better.***
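As a concrete illustration of the two metrics, here is a minimal, self-contained sketch; the helper names and toy ranks are ours, not taken from the evaluation scripts:
```python
def hit_rate(ranks, k=5):
    """Fraction of queries whose first relevant document lands in the top-k."""
    return sum(1 for r in ranks if r is not None and r <= k) / len(ranks)

def mrr(ranks):
    """Mean reciprocal rank; queries with no relevant document (None) contribute 0."""
    return sum(1.0 / r for r in ranks if r is not None) / len(ranks)

# rank of the first relevant document per query (None = not retrieved at all)
ranks = [1, 3, None, 2]
print(hit_rate(ranks, k=5))  # 0.75
print(mrr(ranks))            # (1 + 1/3 + 0 + 1/2) / 4 ≈ 0.458
```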
#### 2. Reproduce [LlamaIndex Blog](https://blog.llamaindex.ai/boosting-rag-picking-the-best-embedding-reranker-models-42d079022e83)
In order to compare our `BCEmbedding` with other embedding and reranker models fairly, we provide a one-click script to reproduce the results of the LlamaIndex Blog, including our `BCEmbedding`:
```bash
# There should be two GPUs available at least.
CUDA_VISIBLE_DEVICES=0,1 python BCEmbedding/tools/eval_rag/eval_llamaindex_reproduce.py
```
Then, summarize the evaluation results by:
```bash
python BCEmbedding/tools/eval_rag/summarize_eval_results.py --results_dir results/rag_reproduce_results
```
Results reproduced from the LlamaIndex Blog can be checked in the ***[Reproduced Summary of RAG Evaluation](https://github.com/netease-youdao/BCEmbedding/blob/master/Docs/EvaluationSummary/rag_eval_reproduced_summary.md)***, with some obvious ***conclusions***:
- In the `WithoutReranker` setting (column-wise comparison), our `bce-embedding-base_v1` outperforms all the other embedding models.
- With the embedding model fixed (row-wise comparison), our `bce-reranker-base_v1` achieves the best performance.
- ***The combination of `bce-embedding-base_v1` and `bce-reranker-base_v1` is SOTA.***
#### 3. Broad Domain Adaptability
The evaluation in the [LlamaIndex Blog](https://blog.llamaindex.ai/boosting-rag-picking-the-best-embedding-reranker-models-42d079022e83) is **monolingual, small-scale, and domain-specific** (covering only the "llama2" paper). To evaluate **broad domain adaptability together with bilingual and crosslingual capability**, we follow the blog's method to build a multi-domain evaluation dataset (including "Computer Science", "Physics", "Biology", "Economics", "Math", and "Quantitative Finance"), named [CrosslingualMultiDomainsDataset](https://huggingface.co/datasets/maidalun1020/CrosslingualMultiDomainsDataset), **constructed with OpenAI's `gpt-4-1106-preview` to ensure high quality**.
First, run the following command to evaluate the most popular and powerful embedding and reranker models:
```bash
# There should be two GPUs available at least.
CUDA_VISIBLE_DEVICES=0,1 python BCEmbedding/tools/eval_rag/eval_llamaindex_multiple_domains.py
```
Then, run the following script to summarize the evaluation results:
```bash
python BCEmbedding/tools/eval_rag/summarize_eval_results.py --results_dir results/rag_results
```
The summary of the multiple-domain evaluations can be seen in <a href="#1-multiple-domains-scenarios">Multiple Domains Scenarios</a>.
## 📈 Leaderboard
### Semantic Representation Evaluations in MTEB
#### 1. Embedding Models
| Model | Dimensions | Pooler | Instructions | Retrieval (47) | STS (19) | PairClassification (5) | Classification (21) | Reranking (12) | Clustering (15) | ***AVG*** (119) |
|:--------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| bge-base-en-v1.5 | 768 | `cls` | Need | 37.14 | 55.06 | 75.45 | 59.73 | 43.00 | 37.74 | 47.19 |
| bge-base-zh-v1.5 | 768 | `cls` | Need | 47.63 | 63.72 | 77.40 | 63.38 | 54.95 | 32.56 | 53.62 |
| bge-large-en-v1.5 | 1024 | `cls` | Need | 37.18 | 54.09 | 75.00 | 59.24 | 42.47 | 37.32 | 46.80 |
| bge-large-zh-v1.5 | 1024 | `cls` | Need | 47.58 | 64.73 | 79.14 | 64.19 | 55.98 | 33.26 | 54.23 |
| e5-large-v2 | 1024 | `mean` | Need | 35.98 | 55.23 | 75.28 | 59.53 | 42.12 | 36.51 | 46.52 |
| gte-large | 1024 | `mean` | Free | 36.68 | 55.22 | 74.29 | 57.73 | 42.44 | 38.51 | 46.67 |
| gte-large-zh | 1024 | `cls` | Free | 41.15 | 64.62 | 77.58 | 62.04 | 55.62 | 33.03 | 51.51 |
| jina-embeddings-v2-base-en | 768 | `mean` | Free | 31.58 | 54.28 | 74.84 | 58.42 | 41.16 | 34.67 | 44.29 |
| m3e-base | 768 | `mean` | Free | 46.29 | 63.93 | 71.84 | 64.08 | 52.38 | 37.84 | 53.54 |
| m3e-large | 1024 | `mean` | Free | 34.85 | 59.74 | 67.69 | 60.07 | 48.99 | 31.62 | 46.78 |
| multilingual-e5-base | 768 | `mean` | Need | 54.73 | 65.49 | 76.97 | 69.72 | 55.01 | 38.44 | 58.34 |
| multilingual-e5-large | 1024 | `mean` | Need | 56.76 | 66.79 | 78.80 | 71.61 | 56.49 | 43.09 | 60.50 |
| ***bce-embedding-base_v1*** | 768 | `cls` | Free | 57.60 | 65.73 | 74.96 | 69.00 | 57.29 | 38.95 | 59.43 |
***NOTE:***
- Our ***bce-embedding-base_v1*** outperforms other open-source embedding models of comparable size, and is only slightly behind the best large models.
- ***114 datasets*** across **"Retrieval", "STS", "PairClassification", "Classification", "Reranking" and "Clustering"** in the `["en", "zh", "en-zh", "zh-en"]` setting.
- The [crosslingual evaluation datasets](https://github.com/netease-youdao/BCEmbedding/blob/master/BCEmbedding/evaluation/c_mteb/Retrieval.py) we released belong to the `Retrieval` task.
- For more evaluation details, please check the [Embedding Models Evaluation Summary](https://github.com/netease-youdao/BCEmbedding/blob/master/Docs/EvaluationSummary/embedding_eval_summary.md).
#### 2. Reranker Models
| Model | Reranking (12) | ***AVG*** (12) |
| :--------------------------------- | :-------------: | :--------------------: |
| bge-reranker-base | 59.04 | 59.04 |
| bge-reranker-large | 60.86 | 60.86 |
| ***bce-reranker-base_v1*** | **61.29** | ***61.29*** |
***NOTE:***
- Our ***bce-reranker-base_v1*** outperforms other open-source reranker models.
- ***12 datasets*** of **"Reranking"** in the `["en", "zh", "en-zh", "zh-en"]` setting.
- For more evaluation details, please check the [Reranker Models Evaluation Summary](https://github.com/netease-youdao/BCEmbedding/blob/master/Docs/EvaluationSummary/reranker_eval_summary.md).
### RAG Evaluations in LlamaIndex
#### 1. Multiple Domains Scenarios

***NOTE:***
- Evaluated in the **["en", "zh", "en-zh", "zh-en"] setting**.
- In the `WithoutReranker` setting (column-wise comparison), our `bce-embedding-base_v1` outperforms all the other embedding models, both open and closed source.
- With the embedding model fixed (row-wise comparison), our `bce-reranker-base_v1` achieves the best performance, again among both open- and closed-source rerankers.
- **The combination of `bce-embedding-base_v1` and `bce-reranker-base_v1` is SOTA**.
## 🛠 Youdao's BCEmbedding API
For users who prefer a hassle-free experience without downloading and configuring the model themselves, `BCEmbedding` is readily accessible through Youdao's API. This offers a streamlined and efficient way to integrate `BCEmbedding` into your projects, bypassing the complexities of manual setup and maintenance. Detailed instructions and comprehensive API documentation are available at [Youdao BCEmbedding API](https://ai.youdao.com/DOCSIRMA/html/aigc/api/embedding/index.html).
## 🧲 WeChat Group
Welcome to scan the QR code below and join the official WeChat group.

## ✏️ Citation
If you use `BCEmbedding` in your research or project, please cite it as below and give it a star:
```
@misc{youdao_bcembedding_2023,
title={BCEmbedding: Bilingual and Crosslingual Embedding for RAG},
author={NetEase Youdao, Inc.},
year={2023},
howpublished={\url{https://github.com/netease-youdao/BCEmbedding}}
}
```
## 🔐 License
`BCEmbedding` is licensed under the [Apache 2.0 License](https://github.com/netease-youdao/BCEmbedding/blob/master/LICENSE).
## 🔗 Related Links
[Netease Youdao - QAnything](https://github.com/netease-youdao/qanything)
[FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding)
[MTEB](https://github.com/embeddings-benchmark/mteb)
[C_MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB)
[LlamaIndex](https://github.com/run-llama/llama_index) | [LlamaIndex Blog](https://blog.llamaindex.ai/boosting-rag-picking-the-best-embedding-reranker-models-42d079022e83) |
artificialguybr/LLAMA3.2-1B-Synthia-II-Redmond-gguf | artificialguybr | 2024-11-25T01:51:54Z | 226 | 1 | transformers | [
"transformers",
"gguf",
"instruct",
"finetune",
"chatml",
"gpt4",
"synthetic data",
"distillation",
"facebook",
"meta",
"pytorch",
"llama",
"llama-3",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"dataset:migtissera/Synthia-v1.5-II",
"base_model:artificialguybr/LLAMA3.2-1B-Synthia-II-Redmond",
"base_model:quantized:artificialguybr/LLAMA3.2-1B-Synthia-II-Redmond",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-25T01:44:53Z | ---
base_model: artificialguybr/LLAMA3.2-1B-Synthia-II-Redmond
datasets:
- migtissera/Synthia-v1.5-II
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
library_name: transformers
license: apache-2.0
quantized_by: artificialguybr
tags:
- instruct
- finetune
- chatml
- gpt4
- synthetic data
- distillation
- facebook
- meta
- pytorch
- llama
- llama-3
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
Thanks [Redmond.AI](https://redmond.ai/) for GPU Sponsor!
Quantization for: https://huggingface.co/artificialguybr/LLAMA3.2-1B-Synthia-II-Redmond
## How to use
If you are unsure how to use GGUF files, look at the [TheBloke
READMEs](https://huggingface.co/TheBloke/CodeLlama-70B-Python-GGUF) for
more details, including on how to concatenate multi-part files.
|
gwisuk/klue-roberta-base-klue-sts | gwisuk | 2024-11-25T01:51:54Z | 6 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-25T01:51:05Z | ---
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 657 with parameters:
```
{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 4,
"evaluation_steps": 1000,
"evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 100,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> |
jla25/squareV3 | jla25 | 2024-11-25T01:42:08Z | 110 | 0 | transformers | [
"transformers",
"safetensors",
"m2m_100",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-12T11:30:18Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
artificialguybr/LLAMA3.2-1B-Synthia-II-Redmond | artificialguybr | 2024-11-25T01:40:37Z | 120 | 1 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"generated_from_trainer",
"facebook",
"meta",
"llama-3",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"base_model:NousResearch/Llama-3.2-1B",
"base_model:finetune:NousResearch/Llama-3.2-1B",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T01:39:35Z | ---
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
library_name: transformers
pipeline_tag: text-generation
license: llama3.2
base_model: NousResearch/Llama-3.2-1B
tags:
- generated_from_trainer
- facebook
- meta
- pytorch
- llama
- llama-3
model-index:
- name: llama3.2-1b-synthia-II
results: []
---
# Llama 3.2 1B - Synthia-v1.5-II - Redmond - Fine-tuned Model
This model is a fine-tuned version of [NousResearch/Llama-3.2-1B](https://huggingface.co/NousResearch/Llama-3.2-1B) on the [Synthia-v1.5-II](https://huggingface.co/datasets/migtissera/Synthia-v1.5-II) dataset.
Thanks [RedmondAI](https://redmond.ai) for all the GPU Support!
## Model Description
The base model is Llama 3.2 1B, a multilingual large language model developed by Meta. This version has been fine-tuned on the Synthia-v1.5-II instruction dataset to improve its instruction-following capabilities.
### Training Data
The model was fine-tuned on Synthia-v1.5-II.
### Training Procedure
The model was trained with the following hyperparameters:
- Learning rate: 2e-05
- Train batch size: 1
- Eval batch size: 1
- Seed: 42
- Gradient accumulation steps: 8
- Total train batch size: 8
- Optimizer: Paged AdamW 8bit (betas=(0.9,0.999), epsilon=1e-08)
- LR scheduler: Cosine with 100 warmup steps
- Number of epochs: 3
### Framework Versions
- Transformers 4.46.1
- Pytorch 2.3.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.3
## Intended Use
This model is intended for:
- Instruction following tasks
- Conversational AI applications
- Research and development in natural language processing
## Training Infrastructure
The model was trained using the Axolotl framework version 0.5.0.
## License
This model is subject to the Llama 3.2 Community License Agreement. Users must comply with all terms and conditions specified in the license.
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
|
liiinn/crisis-bert | liiinn | 2024-11-25T01:22:17Z | 52 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2024-11-25T01:21:14Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
RE-N-Y/pickscore-clip-weighted | RE-N-Y | 2024-11-25T01:04:09Z | 7 | 0 | preferences | [
"preferences",
"safetensors",
"model_hub_mixin",
"pytorch_model_hub_mixin",
"region:us"
] | null | 2024-11-25T01:02:46Z | ---
library_name: preferences
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
---
This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- Library: https://github.com/RE-N-Y/finebooru
- Docs: [More Information Needed] |
bartowski/Teleut-7b-GGUF | bartowski | 2024-11-25T00:57:00Z | 96 | 1 | null | [
"gguf",
"text-generation",
"dataset:allenai/tulu-3-sft-mixture",
"base_model:allura-org/Teleut-7b",
"base_model:quantized:allura-org/Teleut-7b",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-25T00:32:23Z | ---
quantized_by: bartowski
pipeline_tag: text-generation
datasets:
- allenai/tulu-3-sft-mixture
base_model: allura-org/Teleut-7b
license: apache-2.0
---
## Llamacpp imatrix Quantizations of Teleut-7b
Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b4132">b4132</a> for quantization.
Original model: https://huggingface.co/allura-org/Teleut-7b
All quants made using imatrix option with dataset from [here](https://gist.github.com/bartowski1182/eb213dccb3571f863da82e99418f81e8)
Run them in [LM Studio](https://lmstudio.ai/)
## Prompt format
```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
## Download a file (not the whole branch) from below:
| Filename | Quant type | File Size | Split | Description |
| -------- | ---------- | --------- | ----- | ----------- |
| [Teleut-7b-f16.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-f16.gguf) | f16 | 15.24GB | false | Full F16 weights. |
| [Teleut-7b-Q8_0.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q8_0.gguf) | Q8_0 | 8.10GB | false | Extremely high quality, generally unneeded but max available quant. |
| [Teleut-7b-Q6_K_L.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q6_K_L.gguf) | Q6_K_L | 6.52GB | false | Uses Q8_0 for embed and output weights. Very high quality, near perfect, *recommended*. |
| [Teleut-7b-Q6_K.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q6_K.gguf) | Q6_K | 6.25GB | false | Very high quality, near perfect, *recommended*. |
| [Teleut-7b-Q5_K_L.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q5_K_L.gguf) | Q5_K_L | 5.78GB | false | Uses Q8_0 for embed and output weights. High quality, *recommended*. |
| [Teleut-7b-Q5_K_M.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q5_K_M.gguf) | Q5_K_M | 5.44GB | false | High quality, *recommended*. |
| [Teleut-7b-Q5_K_S.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q5_K_S.gguf) | Q5_K_S | 5.32GB | false | High quality, *recommended*. |
| [Teleut-7b-Q4_K_L.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_K_L.gguf) | Q4_K_L | 5.09GB | false | Uses Q8_0 for embed and output weights. Good quality, *recommended*. |
| [Teleut-7b-Q4_K_M.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_K_M.gguf) | Q4_K_M | 4.68GB | false | Good quality, default size for most use cases, *recommended*. |
| [Teleut-7b-Q3_K_XL.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q3_K_XL.gguf) | Q3_K_XL | 4.57GB | false | Uses Q8_0 for embed and output weights. Lower quality but usable, good for low RAM availability. |
| [Teleut-7b-Q4_K_S.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_K_S.gguf) | Q4_K_S | 4.46GB | false | Slightly lower quality with more space savings, *recommended*. |
| [Teleut-7b-Q4_0.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_0.gguf) | Q4_0 | 4.44GB | false | Legacy format, generally not worth using over similarly sized formats |
| [Teleut-7b-Q4_0_8_8.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_0_8_8.gguf) | Q4_0_8_8 | 4.43GB | false | Optimized for ARM and AVX inference. Requires 'sve' support for ARM (see details below). *Don't use on Mac*. |
| [Teleut-7b-Q4_0_4_8.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_0_4_8.gguf) | Q4_0_4_8 | 4.43GB | false | Optimized for ARM inference. Requires 'i8mm' support (see details below). *Don't use on Mac*. |
| [Teleut-7b-Q4_0_4_4.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q4_0_4_4.gguf) | Q4_0_4_4 | 4.43GB | false | Optimized for ARM inference. Should work well on all ARM chips, not for use with GPUs. *Don't use on Mac*. |
| [Teleut-7b-IQ4_XS.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-IQ4_XS.gguf) | IQ4_XS | 4.22GB | false | Decent quality, smaller than Q4_K_S with similar performance, *recommended*. |
| [Teleut-7b-Q3_K_L.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q3_K_L.gguf) | Q3_K_L | 4.09GB | false | Lower quality but usable, good for low RAM availability. |
| [Teleut-7b-Q3_K_M.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q3_K_M.gguf) | Q3_K_M | 3.81GB | false | Low quality. |
| [Teleut-7b-IQ3_M.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-IQ3_M.gguf) | IQ3_M | 3.57GB | false | Medium-low quality, new method with decent performance comparable to Q3_K_M. |
| [Teleut-7b-Q2_K_L.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q2_K_L.gguf) | Q2_K_L | 3.55GB | false | Uses Q8_0 for embed and output weights. Very low quality but surprisingly usable. |
| [Teleut-7b-Q3_K_S.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q3_K_S.gguf) | Q3_K_S | 3.49GB | false | Low quality, not recommended. |
| [Teleut-7b-IQ3_XS.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-IQ3_XS.gguf) | IQ3_XS | 3.35GB | false | Lower quality, new method with decent performance, slightly better than Q3_K_S. |
| [Teleut-7b-Q2_K.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-Q2_K.gguf) | Q2_K | 3.02GB | false | Very low quality but surprisingly usable. |
| [Teleut-7b-IQ2_M.gguf](https://huggingface.co/bartowski/Teleut-7b-GGUF/blob/main/Teleut-7b-IQ2_M.gguf) | IQ2_M | 2.78GB | false | Relatively low quality, uses SOTA techniques to be surprisingly usable. |
## Embed/output weights
Some of these quants (Q3_K_XL, Q4_K_L etc) are the standard quantization method with the embeddings and output weights quantized to Q8_0 instead of what they would normally default to.
## Downloading using huggingface-cli
<details>
<summary>Click to view download instructions</summary>
First, make sure you have huggingface-cli installed:
```
pip install -U "huggingface_hub[cli]"
```
Then, you can target the specific file you want:
```
huggingface-cli download bartowski/Teleut-7b-GGUF --include "Teleut-7b-Q4_K_M.gguf" --local-dir ./
```
If the model is bigger than 50GB, it will have been split into multiple files. In order to download them all to a local folder, run:
```
huggingface-cli download bartowski/Teleut-7b-GGUF --include "Teleut-7b-Q8_0/*" --local-dir ./
```
You can either specify a new local-dir (Teleut-7b-Q8_0) or download them all in place (./)
</details>
## Q4_0_X_X information
<details>
<summary>Click to view Q4_0_X_X information</summary>
These are *NOT* for Metal (Apple) or GPU (nvidia/AMD/intel) offloading, only ARM chips (and certain AVX2/AVX512 CPUs).
If you're using an ARM chip, the Q4_0_X_X quants will have a substantial speedup. Check out Q4_0_4_4 speed comparisons [on the original pull request](https://github.com/ggerganov/llama.cpp/pull/5780#pullrequestreview-21657544660)
To check which one would work best for your ARM chip, you can check [AArch64 SoC features](https://gpages.juszkiewicz.com.pl/arm-socs-table/arm-socs.html) (thanks EloyOn!).
If you're using a CPU that supports AVX2 or AVX512 (typically server CPUs and AMD's latest Zen5 CPUs) and are not offloading to a GPU, the Q4_0_8_8 may offer a nice speedup as well:
<details>
<summary>Click to view benchmarks on an AVX2 system (EPYC7702)</summary>
| model | size | params | backend | threads | test | t/s | % (vs Q4_0) |
| ------------------------------ | ---------: | ---------: | ---------- | ------: | ------------: | -------------------: |-------------: |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | pp512 | 204.03 ± 1.03 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | pp1024 | 282.92 ± 0.19 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | pp2048 | 259.49 ± 0.44 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | tg128 | 39.12 ± 0.27 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | tg256 | 39.31 ± 0.69 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | tg512 | 40.52 ± 0.03 | 100% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | pp512 | 301.02 ± 1.74 | 147% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | pp1024 | 287.23 ± 0.20 | 101% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | pp2048 | 262.77 ± 1.81 | 101% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | tg128 | 18.80 ± 0.99 | 48% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | tg256 | 24.46 ± 3.04 | 83% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | tg512 | 36.32 ± 3.59 | 90% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | pp512 | 271.71 ± 3.53 | 133% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | pp1024 | 279.86 ± 45.63 | 100% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | pp2048 | 320.77 ± 5.00 | 124% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | tg128 | 43.51 ± 0.05 | 111% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | tg256 | 43.35 ± 0.09 | 110% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | tg512 | 42.60 ± 0.31 | 105% |
Q4_0_8_8 offers a nice bump to prompt processing and a small bump to text generation
</details>
</details>
## Which file should I choose?
<details>
<summary>Click here for details</summary>
A great write-up with charts showing various performances is provided by Artefact2 [here](https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9)
The first thing to figure out is how big a model you can run. To do this, you'll need to figure out how much RAM and/or VRAM you have.
If you want your model running as FAST as possible, you'll want to fit the whole thing on your GPU's VRAM. Aim for a quant with a file size 1-2GB smaller than your GPU's total VRAM.
If you want the absolute maximum quality, add both your system RAM and your GPU's VRAM together, then similarly grab a quant with a file size 1-2GB smaller than that total.
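As a toy illustration of that rule of thumb (this helper is ours, not part of llama.cpp or LM Studio):
```python
def max_quant_file_gb(memory_gb: float, headroom_gb: float = 1.5) -> float:
    """Largest quant file that fits in the given memory with 1-2GB of headroom."""
    return memory_gb - headroom_gb

# e.g. an 8GB GPU leaves ~6.5GB, so Q6_K (6.25GB) is the largest fit from the table above
print(max_quant_file_gb(8.0))
```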
Next, you'll need to decide if you want to use an 'I-quant' or a 'K-quant'.
If you don't want to think too much, grab one of the K-quants. These are in format 'QX_K_X', like Q5_K_M.
If you want to get more into the weeds, you can check out this extremely useful feature chart:
[llama.cpp feature matrix](https://github.com/ggerganov/llama.cpp/wiki/Feature-matrix)
But basically, if you're aiming for below Q4, and you're running cuBLAS (Nvidia) or rocBLAS (AMD), you should look towards the I-quants. These are in format IQX_X, like IQ3_M. These are newer and offer better performance for their size.
These I-quants can also be used on CPU and Apple Metal, but will be slower than their K-quant equivalent, so speed vs performance is a tradeoff you'll have to decide.
The I-quants are *not* compatible with Vulkan, which also supports AMD, so if you have an AMD card double check if you're using the rocBLAS build or the Vulkan build. At the time of writing this, LM Studio has a preview with ROCm support, and other inference engines have specific builds for ROCm.
</details>
## Credits
Thank you kalomaze and Dampf for assistance in creating the imatrix calibration dataset.
Thank you ZeroWw for the inspiration to experiment with embed/output.
Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
|
perian/bert-base-uncased-finetuned-biasbios-mlm | perian | 2024-11-25T00:56:30Z | 195 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"generated_from_trainer",
"dataset:social_bias_frames",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-24T18:10:18Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
datasets:
- social_bias_frames
model-index:
- name: bert-base-uncased-finetuned-biasbios-mlm
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-biasbios-mlm
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the social_bias_frames dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7074
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 412 | 0.7723 |
| 1.0566 | 2.0 | 824 | 0.7228 |
| 0.88 | 3.0 | 1236 | 0.6870 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
verbiate/Marco-o1-Q8-mlx | verbiate | 2024-11-25T00:56:12Z | 77 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mlx",
"conversational",
"base_model:AIDC-AI/Marco-o1",
"base_model:quantized:AIDC-AI/Marco-o1",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"8-bit",
"region:us"
] | text-generation | 2024-11-25T00:55:42Z | ---
license: apache-2.0
library_name: transformers
inference: false
tags:
- mlx
base_model: AIDC-AI/Marco-o1
---
# verbiate/Marco-o1-Q8-mlx
The Model [verbiate/Marco-o1-Q8-mlx](https://huggingface.co/verbiate/Marco-o1-Q8-mlx) was converted to MLX format from [AIDC-AI/Marco-o1](https://huggingface.co/AIDC-AI/Marco-o1) using mlx-lm version **0.19.2**.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Load the 8-bit quantized model and its tokenizer from the Hub
model, tokenizer = load("verbiate/Marco-o1-Q8-mlx")

prompt = "hello"

# Apply the model's chat template when one is defined
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
|
mfidabel/whisper-tiny-gn | mfidabel | 2024-11-25T00:44:52Z | 150 | 0 | transformers | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-25T00:44:41Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
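In lieu of an official snippet, the repo tags suggest this is a Whisper checkpoint, so a hedged sketch with the standard 🤗 Transformers ASR pipeline should work (the audio path below is a placeholder):
```python
from transformers import pipeline

# The repo tags indicate a Whisper automatic-speech-recognition model
asr = pipeline(
    "automatic-speech-recognition", model="mfidabel/whisper-tiny-gn"
)

# "sample.wav" is a placeholder path to a local audio file
result = asr("sample.wav")
print(result["text"])
```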
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mradermacher/Sue_Ann_11B-GGUF | mradermacher | 2024-11-25T00:42:07Z | 19 | 2 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:Datters/Sue_Ann_11B",
"base_model:quantized:Datters/Sue_Ann_11B",
"endpoints_compatible",
"region:us"
] | null | 2024-11-24T15:33:08Z | ---
base_model: Datters/Sue_Ann_11B
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/Datters/Sue_Ann_11B
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
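As one concrete option, here is a hedged `llama-cpp-python` sketch (assumes the package is installed and that you have downloaded one of the files from the table below; the quant choice is just an example):
```python
from llama_cpp import Llama

# Load a downloaded quant; Q4_K_M is the "fast, recommended" pick below
llm = Llama(model_path="Sue_Ann_11B.Q4_K_M.gguf", n_ctx=4096)

output = llm("Write a haiku about quantization.", max_tokens=64)
print(output["choices"][0]["text"])
```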
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.IQ4_XS.gguf) | IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q4_K_S.gguf) | Q4_K_S | 6.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q5_K_S.gguf) | Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q5_K_M.gguf) | Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q6_K.gguf) | Q6_K | 8.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.Q8_0.gguf) | Q8_0 | 11.5 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Sue_Ann_11B-GGUF/resolve/main/Sue_Ann_11B.f16.gguf) | f16 | 21.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/Marco-o1-i1-GGUF | mradermacher | 2024-11-25T00:34:07Z | 142 | 2 | transformers | [
"transformers",
"gguf",
"en",
"base_model:AIDC-AI/Marco-o1",
"base_model:quantized:AIDC-AI/Marco-o1",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-24T19:45:45Z | ---
base_model: AIDC-AI/Marco-o1
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/AIDC-AI/Marco-o1
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Marco-o1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
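To grab a single file from the table below without cloning the whole repo, a hedged `huggingface_hub` sketch (the filename matches the table's links):
```python
from huggingface_hub import hf_hub_download

# Download one quant from this repo; i1-Q4_K_M is the "fast, recommended" pick
path = hf_hub_download(
    repo_id="mradermacher/Marco-o1-i1-GGUF",
    filename="Marco-o1.i1-Q4_K_M.gguf",
)
print(path)  # local cache path of the downloaded GGUF file
```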
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ1_S.gguf) | i1-IQ1_S | 2.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ1_M.gguf) | i1-IQ1_M | 2.1 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ2_S.gguf) | i1-IQ2_S | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ2_M.gguf) | i1-IQ2_M | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q2_K.gguf) | i1-Q2_K | 3.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.2 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ3_S.gguf) | i1-IQ3_S | 3.6 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ3_M.gguf) | i1-IQ3_M | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.2 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 4.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.5 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.5 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q4_0.gguf) | i1-Q4_0 | 4.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.6 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/Marco-o1-i1-GGUF/resolve/main/Marco-o1.i1-Q6_K.gguf) | i1-Q6_K | 6.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
Marqo/dunzhang-stella_en_400M_v5 | Marqo | 2024-11-25T00:25:59Z | 2,403 | 7 | sentence-transformers | [
"sentence-transformers",
"pytorch",
"safetensors",
"new",
"feature-extraction",
"mteb",
"transformers",
"sentence-similarity",
"custom_code",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-09-25T20:01:18Z | ---
model-index:
- name: stella_en_400M_v5
results:
- dataset:
config: en
name: MTEB AmazonCounterfactualClassification (en)
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
split: test
type: mteb/amazon_counterfactual
metrics:
- type: accuracy
value: 92.35820895522387
- type: ap
value: 70.81322736988783
- type: ap_weighted
value: 70.81322736988783
- type: f1
value: 88.9505466159595
- type: f1_weighted
value: 92.68630932872613
- type: main_score
value: 92.35820895522387
task:
type: Classification
- dataset:
config: default
name: MTEB AmazonPolarityClassification
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
split: test
type: mteb/amazon_polarity
metrics:
- type: accuracy
value: 97.1945
- type: ap
value: 96.08192192244094
- type: ap_weighted
value: 96.08192192244094
- type: f1
value: 97.1936887167346
- type: f1_weighted
value: 97.1936887167346
- type: main_score
value: 97.1945
task:
type: Classification
- dataset:
config: en
name: MTEB AmazonReviewsClassification (en)
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
split: test
type: mteb/amazon_reviews_multi
metrics:
- type: accuracy
value: 59.528000000000006
- type: f1
value: 59.21016819840188
- type: f1_weighted
value: 59.21016819840188
- type: main_score
value: 59.528000000000006
task:
type: Classification
- dataset:
config: default
name: MTEB ArguAna
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
split: test
type: mteb/arguana
metrics:
- type: main_score
value: 64.24
- type: map_at_1
value: 40.398
- type: map_at_10
value: 56.215
- type: map_at_100
value: 56.833999999999996
- type: map_at_1000
value: 56.835
- type: map_at_20
value: 56.747
- type: map_at_3
value: 52.181
- type: map_at_5
value: 54.628
- type: mrr_at_1
value: 41.25177809388336
- type: mrr_at_10
value: 56.570762491815216
- type: mrr_at_100
value: 57.17548614361504
- type: mrr_at_1000
value: 57.176650626377466
- type: mrr_at_20
value: 57.08916253512566
- type: mrr_at_3
value: 52.47747747747754
- type: mrr_at_5
value: 54.94547178757718
- type: nauc_map_at_1000_diff1
value: 22.408086887100158
- type: nauc_map_at_1000_max
value: -8.730419096847543
- type: nauc_map_at_1000_std
value: -17.789262741255737
- type: nauc_map_at_100_diff1
value: 22.407371684274025
- type: nauc_map_at_100_max
value: -8.732263549026266
- type: nauc_map_at_100_std
value: -17.79550515579994
- type: nauc_map_at_10_diff1
value: 21.925005073301246
- type: nauc_map_at_10_max
value: -8.990323944492134
- type: nauc_map_at_10_std
value: -18.199246301671458
- type: nauc_map_at_1_diff1
value: 26.23276644969203
- type: nauc_map_at_1_max
value: -12.376511389571245
- type: nauc_map_at_1_std
value: -18.11411715207284
- type: nauc_map_at_20_diff1
value: 22.32455790850922
- type: nauc_map_at_20_max
value: -8.664671547236034
- type: nauc_map_at_20_std
value: -17.8290016125137
- type: nauc_map_at_3_diff1
value: 22.395462147465064
- type: nauc_map_at_3_max
value: -8.206580750918844
- type: nauc_map_at_3_std
value: -17.604490446911484
- type: nauc_map_at_5_diff1
value: 21.95307379904799
- type: nauc_map_at_5_max
value: -8.03958102978443
- type: nauc_map_at_5_std
value: -17.36578866595004
- type: nauc_mrr_at_1000_diff1
value: 20.124236798365587
- type: nauc_mrr_at_1000_max
value: -9.587376069575898
- type: nauc_mrr_at_1000_std
value: -17.79191612151833
- type: nauc_mrr_at_100_diff1
value: 20.123612603474033
- type: nauc_mrr_at_100_max
value: -9.589187218607831
- type: nauc_mrr_at_100_std
value: -17.7981617777748
- type: nauc_mrr_at_10_diff1
value: 19.723683875738075
- type: nauc_mrr_at_10_max
value: -9.774151729178815
- type: nauc_mrr_at_10_std
value: -18.168668675495162
- type: nauc_mrr_at_1_diff1
value: 23.945332059908132
- type: nauc_mrr_at_1_max
value: -12.260461466152819
- type: nauc_mrr_at_1_std
value: -18.007194922921148
- type: nauc_mrr_at_20_diff1
value: 20.04819461810257
- type: nauc_mrr_at_20_max
value: -9.518368283588936
- type: nauc_mrr_at_20_std
value: -17.831608149836136
- type: nauc_mrr_at_3_diff1
value: 19.8571785245832
- type: nauc_mrr_at_3_max
value: -9.464375021240478
- type: nauc_mrr_at_3_std
value: -17.728533927330453
- type: nauc_mrr_at_5_diff1
value: 19.670313652167827
- type: nauc_mrr_at_5_max
value: -8.966372585728434
- type: nauc_mrr_at_5_std
value: -17.468955834324817
- type: nauc_ndcg_at_1000_diff1
value: 21.863049281767417
- type: nauc_ndcg_at_1000_max
value: -8.18698520924057
- type: nauc_ndcg_at_1000_std
value: -17.634483364794804
- type: nauc_ndcg_at_100_diff1
value: 21.849924385738586
- type: nauc_ndcg_at_100_max
value: -8.226437560889345
- type: nauc_ndcg_at_100_std
value: -17.774648478087002
- type: nauc_ndcg_at_10_diff1
value: 19.888395590413573
- type: nauc_ndcg_at_10_max
value: -8.968706085632382
- type: nauc_ndcg_at_10_std
value: -19.31386964628115
- type: nauc_ndcg_at_1_diff1
value: 26.23276644969203
- type: nauc_ndcg_at_1_max
value: -12.376511389571245
- type: nauc_ndcg_at_1_std
value: -18.11411715207284
- type: nauc_ndcg_at_20_diff1
value: 21.38413342416933
- type: nauc_ndcg_at_20_max
value: -7.636238194084164
- type: nauc_ndcg_at_20_std
value: -17.946390844693028
- type: nauc_ndcg_at_3_diff1
value: 21.29169165029195
- type: nauc_ndcg_at_3_max
value: -6.793840499730093
- type: nauc_ndcg_at_3_std
value: -17.52359001586737
- type: nauc_ndcg_at_5_diff1
value: 20.238297656671364
- type: nauc_ndcg_at_5_max
value: -6.424992706950072
- type: nauc_ndcg_at_5_std
value: -17.082391132291356
- type: nauc_precision_at_1000_diff1
value: -7.05195108528572
- type: nauc_precision_at_1000_max
value: 34.439879624882145
- type: nauc_precision_at_1000_std
value: 68.72436351659353
- type: nauc_precision_at_100_diff1
value: -2.769464113932605
- type: nauc_precision_at_100_max
value: 9.89562961226698
- type: nauc_precision_at_100_std
value: -0.5880967482224028
- type: nauc_precision_at_10_diff1
value: 2.1371544726832323
- type: nauc_precision_at_10_max
value: -11.93051325147756
- type: nauc_precision_at_10_std
value: -30.83144187392059
- type: nauc_precision_at_1_diff1
value: 26.23276644969203
- type: nauc_precision_at_1_max
value: -12.376511389571245
- type: nauc_precision_at_1_std
value: -18.11411715207284
- type: nauc_precision_at_20_diff1
value: 3.780146814257504
- type: nauc_precision_at_20_max
value: 17.06527540214615
- type: nauc_precision_at_20_std
value: -20.36832563035565
- type: nauc_precision_at_3_diff1
value: 17.63894384012077
- type: nauc_precision_at_3_max
value: -2.0220490624638887
- type: nauc_precision_at_3_std
value: -17.285601413493918
- type: nauc_precision_at_5_diff1
value: 12.557855071944601
- type: nauc_precision_at_5_max
value: 0.5840236463956658
- type: nauc_precision_at_5_std
value: -15.827224420217846
- type: nauc_recall_at_1000_diff1
value: -7.051951085286463
- type: nauc_recall_at_1000_max
value: 34.43987962487738
- type: nauc_recall_at_1000_std
value: 68.724363516591
- type: nauc_recall_at_100_diff1
value: -2.769464113930314
- type: nauc_recall_at_100_max
value: 9.895629612270017
- type: nauc_recall_at_100_std
value: -0.58809674821745
- type: nauc_recall_at_10_diff1
value: 2.1371544726834495
- type: nauc_recall_at_10_max
value: -11.930513251477253
- type: nauc_recall_at_10_std
value: -30.83144187392047
- type: nauc_recall_at_1_diff1
value: 26.23276644969203
- type: nauc_recall_at_1_max
value: -12.376511389571245
- type: nauc_recall_at_1_std
value: -18.11411715207284
- type: nauc_recall_at_20_diff1
value: 3.7801468142575922
- type: nauc_recall_at_20_max
value: 17.0652754021456
- type: nauc_recall_at_20_std
value: -20.36832563035559
- type: nauc_recall_at_3_diff1
value: 17.63894384012074
- type: nauc_recall_at_3_max
value: -2.02204906246383
- type: nauc_recall_at_3_std
value: -17.28560141349386
- type: nauc_recall_at_5_diff1
value: 12.55785507194463
- type: nauc_recall_at_5_max
value: 0.5840236463957296
- type: nauc_recall_at_5_std
value: -15.827224420217856
- type: ndcg_at_1
value: 40.398
- type: ndcg_at_10
value: 64.24
- type: ndcg_at_100
value: 66.631
- type: ndcg_at_1000
value: 66.65100000000001
- type: ndcg_at_20
value: 66.086
- type: ndcg_at_3
value: 55.938
- type: ndcg_at_5
value: 60.370000000000005
- type: precision_at_1
value: 40.398
- type: precision_at_10
value: 8.962
- type: precision_at_100
value: 0.9950000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.836
- type: precision_at_3
value: 22.262
- type: precision_at_5
value: 15.519
- type: recall_at_1
value: 40.398
- type: recall_at_10
value: 89.616
- type: recall_at_100
value: 99.502
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 96.72800000000001
- type: recall_at_3
value: 66.78500000000001
- type: recall_at_5
value: 77.596
task:
type: Retrieval
- dataset:
config: default
name: MTEB ArxivClusteringP2P
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
split: test
type: mteb/arxiv-clustering-p2p
metrics:
- type: main_score
value: 55.1564333205451
- type: v_measure
value: 55.1564333205451
- type: v_measure_std
value: 14.696883012214512
task:
type: Clustering
- dataset:
config: default
name: MTEB ArxivClusteringS2S
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
split: test
type: mteb/arxiv-clustering-s2s
metrics:
- type: main_score
value: 49.823698316694795
- type: v_measure
value: 49.823698316694795
- type: v_measure_std
value: 14.951660654298186
task:
type: Clustering
- dataset:
config: default
name: MTEB AskUbuntuDupQuestions
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
split: test
type: mteb/askubuntudupquestions-reranking
metrics:
- type: main_score
value: 66.15294503553424
- type: map
value: 66.15294503553424
- type: mrr
value: 78.53438420612935
- type: nAUC_map_diff1
value: 12.569697092717997
- type: nAUC_map_max
value: 21.50670312412572
- type: nAUC_map_std
value: 16.943786429229064
- type: nAUC_mrr_diff1
value: 15.590272897361238
- type: nAUC_mrr_max
value: 34.96072022474653
- type: nAUC_mrr_std
value: 21.649217605241045
task:
type: Reranking
- dataset:
config: default
name: MTEB BIOSSES
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
split: test
type: mteb/biosses-sts
metrics:
- type: cosine_pearson
value: 85.7824546319275
- type: cosine_spearman
value: 83.29587385660628
- type: euclidean_pearson
value: 84.58764190565167
- type: euclidean_spearman
value: 83.30069324352772
- type: main_score
value: 83.29587385660628
- type: manhattan_pearson
value: 84.95996839947179
- type: manhattan_spearman
value: 83.87480271054358
- type: pearson
value: 85.7824546319275
- type: spearman
value: 83.29587385660628
task:
type: STS
- dataset:
config: default
name: MTEB Banking77Classification
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
split: test
type: mteb/banking77
metrics:
- type: accuracy
value: 89.30194805194806
- type: f1
value: 89.26182507266391
- type: f1_weighted
value: 89.26182507266391
- type: main_score
value: 89.30194805194806
task:
type: Classification
- dataset:
config: default
name: MTEB BiorxivClusteringP2P
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
split: test
type: mteb/biorxiv-clustering-p2p
metrics:
- type: main_score
value: 50.67972171889736
- type: v_measure
value: 50.67972171889736
- type: v_measure_std
value: 0.7687409980036303
task:
type: Clustering
- dataset:
config: default
name: MTEB BiorxivClusteringS2S
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
split: test
type: mteb/biorxiv-clustering-s2s
metrics:
- type: main_score
value: 45.80539715556144
- type: v_measure
value: 45.80539715556144
- type: v_measure_std
value: 0.9601346216579142
task:
type: Clustering
- dataset:
config: default
name: MTEB CQADupstackRetrieval
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
split: test
type: mteb/cqadupstack
metrics:
- type: main_score
value: 44.361250000000005
- type: map_at_1
value: 28.304499999999997
- type: map_at_10
value: 38.54841666666666
- type: map_at_100
value: 39.83141666666667
- type: map_at_1000
value: 39.944750000000006
- type: map_at_20
value: 39.25341666666667
- type: map_at_3
value: 35.406749999999995
- type: map_at_5
value: 37.15558333333333
- type: mrr_at_1
value: 34.09077232860122
- type: mrr_at_10
value: 43.15445393211421
- type: mrr_at_100
value: 43.98645286848257
- type: mrr_at_1000
value: 44.037631313469404
- type: mrr_at_20
value: 43.64045813249614
- type: mrr_at_3
value: 40.674138648480486
- type: mrr_at_5
value: 42.106251182620255
- type: nauc_map_at_1000_diff1
value: 46.250011739434996
- type: nauc_map_at_1000_max
value: 30.13664446260598
- type: nauc_map_at_1000_std
value: 5.422301791618935
- type: nauc_map_at_100_diff1
value: 46.253631351999395
- type: nauc_map_at_100_max
value: 30.12612918885181
- type: nauc_map_at_100_std
value: 5.367077019987172
- type: nauc_map_at_10_diff1
value: 46.328171341741346
- type: nauc_map_at_10_max
value: 29.80274612581464
- type: nauc_map_at_10_std
value: 4.62996685176396
- type: nauc_map_at_1_diff1
value: 51.56118117729493
- type: nauc_map_at_1_max
value: 27.94885243863768
- type: nauc_map_at_1_std
value: 1.700366508927356
- type: nauc_map_at_20_diff1
value: 46.286750260299094
- type: nauc_map_at_20_max
value: 29.979205290353278
- type: nauc_map_at_20_std
value: 5.010588412441873
- type: nauc_map_at_3_diff1
value: 47.10018183619064
- type: nauc_map_at_3_max
value: 29.062318206078753
- type: nauc_map_at_3_std
value: 3.2235696254694197
- type: nauc_map_at_5_diff1
value: 46.41971733050039
- type: nauc_map_at_5_max
value: 29.456798617695657
- type: nauc_map_at_5_std
value: 4.0921691023077145
- type: nauc_mrr_at_1000_diff1
value: 45.88888977975723
- type: nauc_mrr_at_1000_max
value: 32.162138978089544
- type: nauc_mrr_at_1000_std
value: 6.2811943424217915
- type: nauc_mrr_at_100_diff1
value: 45.87480433011124
- type: nauc_mrr_at_100_max
value: 32.16011334212834
- type: nauc_mrr_at_100_std
value: 6.2865717772421785
- type: nauc_mrr_at_10_diff1
value: 45.849652904658825
- type: nauc_mrr_at_10_max
value: 32.13847916232293
- type: nauc_mrr_at_10_std
value: 6.105718728141999
- type: nauc_mrr_at_1_diff1
value: 51.013730325062156
- type: nauc_mrr_at_1_max
value: 32.77457396492779
- type: nauc_mrr_at_1_std
value: 4.415684893471724
- type: nauc_mrr_at_20_diff1
value: 45.86663046255274
- type: nauc_mrr_at_20_max
value: 32.15219360697865
- type: nauc_mrr_at_20_std
value: 6.19603046412763
- type: nauc_mrr_at_3_diff1
value: 46.522376582423185
- type: nauc_mrr_at_3_max
value: 32.18259009733714
- type: nauc_mrr_at_3_std
value: 5.288000648220897
- type: nauc_mrr_at_5_diff1
value: 45.86611481369745
- type: nauc_mrr_at_5_max
value: 32.14261639054921
- type: nauc_mrr_at_5_std
value: 5.8811238177073735
- type: nauc_ndcg_at_1000_diff1
value: 44.5055097547565
- type: nauc_ndcg_at_1000_max
value: 31.149682057975458
- type: nauc_ndcg_at_1000_std
value: 8.157937194901333
- type: nauc_ndcg_at_100_diff1
value: 44.12398363638596
- type: nauc_ndcg_at_100_max
value: 30.878064321409994
- type: nauc_ndcg_at_100_std
value: 8.40493441452808
- type: nauc_ndcg_at_10_diff1
value: 44.200093505221474
- type: nauc_ndcg_at_10_max
value: 30.15267107733158
- type: nauc_ndcg_at_10_std
value: 6.407495361566107
- type: nauc_ndcg_at_1_diff1
value: 51.013730325062156
- type: nauc_ndcg_at_1_max
value: 32.77457396492779
- type: nauc_ndcg_at_1_std
value: 4.415684893471724
- type: nauc_ndcg_at_20_diff1
value: 44.16988321564116
- type: nauc_ndcg_at_20_max
value: 30.333532500651213
- type: nauc_ndcg_at_20_std
value: 7.10024701386895
- type: nauc_ndcg_at_3_diff1
value: 45.35982873879988
- type: nauc_ndcg_at_3_max
value: 30.288312457948702
- type: nauc_ndcg_at_3_std
value: 4.653900898293395
- type: nauc_ndcg_at_5_diff1
value: 44.324558115380185
- type: nauc_ndcg_at_5_max
value: 30.048149698941373
- type: nauc_ndcg_at_5_std
value: 5.6684459618413205
- type: nauc_precision_at_1000_diff1
value: -7.282175798304458
- type: nauc_precision_at_1000_max
value: 7.820142031765352
- type: nauc_precision_at_1000_std
value: 11.736131836431172
- type: nauc_precision_at_100_diff1
value: 1.0222940256506976
- type: nauc_precision_at_100_max
value: 16.12346497070298
- type: nauc_precision_at_100_std
value: 18.202607395247874
- type: nauc_precision_at_10_diff1
value: 18.289439185857837
- type: nauc_precision_at_10_max
value: 26.116517399154375
- type: nauc_precision_at_10_std
value: 13.921214069982302
- type: nauc_precision_at_1_diff1
value: 51.013730325062156
- type: nauc_precision_at_1_max
value: 32.77457396492779
- type: nauc_precision_at_1_std
value: 4.415684893471724
- type: nauc_precision_at_20_diff1
value: 12.365165405210886
- type: nauc_precision_at_20_max
value: 22.946297258937367
- type: nauc_precision_at_20_std
value: 16.13862870358933
- type: nauc_precision_at_3_diff1
value: 32.063423642849685
- type: nauc_precision_at_3_max
value: 30.140965811989407
- type: nauc_precision_at_3_std
value: 8.501746262550146
- type: nauc_precision_at_5_diff1
value: 24.777203357717948
- type: nauc_precision_at_5_max
value: 28.401579566848472
- type: nauc_precision_at_5_std
value: 11.643246774390914
- type: nauc_recall_at_1000_diff1
value: 30.04216463401409
- type: nauc_recall_at_1000_max
value: 34.98067760563842
- type: nauc_recall_at_1000_std
value: 48.01453905250591
- type: nauc_recall_at_100_diff1
value: 31.193415507513972
- type: nauc_recall_at_100_max
value: 28.69740149270981
- type: nauc_recall_at_100_std
value: 25.20960758920368
- type: nauc_recall_at_10_diff1
value: 36.18870823636506
- type: nauc_recall_at_10_max
value: 26.005625231341238
- type: nauc_recall_at_10_std
value: 8.891983977041376
- type: nauc_recall_at_1_diff1
value: 51.56118117729493
- type: nauc_recall_at_1_max
value: 27.94885243863768
- type: nauc_recall_at_1_std
value: 1.700366508927356
- type: nauc_recall_at_20_diff1
value: 34.93996118564803
- type: nauc_recall_at_20_max
value: 26.149961715956138
- type: nauc_recall_at_20_std
value: 12.0657502367633
- type: nauc_recall_at_3_diff1
value: 40.80743946709512
- type: nauc_recall_at_3_max
value: 26.443127773025783
- type: nauc_recall_at_3_std
value: 3.7011448604241477
- type: nauc_recall_at_5_diff1
value: 37.608535157055776
- type: nauc_recall_at_5_max
value: 26.168016189725822
- type: nauc_recall_at_5_std
value: 6.344191564595316
- type: ndcg_at_1
value: 34.09083333333333
- type: ndcg_at_10
value: 44.361250000000005
- type: ndcg_at_100
value: 49.586166666666664
- type: ndcg_at_1000
value: 51.623583333333336
- type: ndcg_at_20
value: 46.40158333333333
- type: ndcg_at_3
value: 39.27733333333333
- type: ndcg_at_5
value: 41.662333333333336
- type: precision_at_1
value: 34.09083333333333
- type: precision_at_10
value: 7.957000000000002
- type: precision_at_100
value: 1.2521666666666669
- type: precision_at_1000
value: 0.16125
- type: precision_at_20
value: 4.6755
- type: precision_at_3
value: 18.402083333333334
- type: precision_at_5
value: 13.104333333333335
- type: recall_at_1
value: 28.304499999999997
- type: recall_at_10
value: 56.80666666666667
- type: recall_at_100
value: 79.66208333333334
- type: recall_at_1000
value: 93.6455
- type: recall_at_20
value: 64.2495
- type: recall_at_3
value: 42.431333333333335
- type: recall_at_5
value: 48.665416666666665
task:
type: Retrieval
- dataset:
config: default
name: MTEB ClimateFEVER
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
split: test
type: mteb/climate-fever
metrics:
- type: main_score
value: 43.525999999999996
- type: map_at_1
value: 19.291
- type: map_at_10
value: 33.471000000000004
- type: map_at_100
value: 35.388999999999996
- type: map_at_1000
value: 35.568
- type: map_at_20
value: 34.496
- type: map_at_3
value: 28.713
- type: map_at_5
value: 31.384
- type: mrr_at_1
value: 43.77850162866449
- type: mrr_at_10
value: 56.28576598934912
- type: mrr_at_100
value: 56.8588518168194
- type: mrr_at_1000
value: 56.878236725973544
- type: mrr_at_20
value: 56.6409328120183
- type: mrr_at_3
value: 53.56134636264935
- type: mrr_at_5
value: 55.27795874049956
- type: nauc_map_at_1000_diff1
value: 27.262513153363876
- type: nauc_map_at_1000_max
value: 40.099398684385584
- type: nauc_map_at_1000_std
value: 18.847812394005512
- type: nauc_map_at_100_diff1
value: 27.238993503030745
- type: nauc_map_at_100_max
value: 40.07730434492169
- type: nauc_map_at_100_std
value: 18.795349250833684
- type: nauc_map_at_10_diff1
value: 27.70929180366227
- type: nauc_map_at_10_max
value: 39.55987024970173
- type: nauc_map_at_10_std
value: 17.214881544648996
- type: nauc_map_at_1_diff1
value: 43.34155892182403
- type: nauc_map_at_1_max
value: 38.23324890148018
- type: nauc_map_at_1_std
value: 6.0781444393516075
- type: nauc_map_at_20_diff1
value: 27.311577477800103
- type: nauc_map_at_20_max
value: 39.624414083413456
- type: nauc_map_at_20_std
value: 18.149811054163287
- type: nauc_map_at_3_diff1
value: 30.475965062734367
- type: nauc_map_at_3_max
value: 38.49324825043695
- type: nauc_map_at_3_std
value: 13.357656038648487
- type: nauc_map_at_5_diff1
value: 28.425110095017747
- type: nauc_map_at_5_max
value: 39.017894870747796
- type: nauc_map_at_5_std
value: 15.543817194122564
- type: nauc_mrr_at_1000_diff1
value: 33.16689354701644
- type: nauc_mrr_at_1000_max
value: 41.70755363247148
- type: nauc_mrr_at_1000_std
value: 24.61667417463176
- type: nauc_mrr_at_100_diff1
value: 33.147229262917506
- type: nauc_mrr_at_100_max
value: 41.712455697170725
- type: nauc_mrr_at_100_std
value: 24.6418922043652
- type: nauc_mrr_at_10_diff1
value: 32.94185191112572
- type: nauc_mrr_at_10_max
value: 41.64272730141954
- type: nauc_mrr_at_10_std
value: 24.663391015702707
- type: nauc_mrr_at_1_diff1
value: 39.571969559016395
- type: nauc_mrr_at_1_max
value: 39.396249211263495
- type: nauc_mrr_at_1_std
value: 16.984149923258357
- type: nauc_mrr_at_20_diff1
value: 33.10040770334742
- type: nauc_mrr_at_20_max
value: 41.807565560083034
- type: nauc_mrr_at_20_std
value: 24.8064180365271
- type: nauc_mrr_at_3_diff1
value: 33.065406161485704
- type: nauc_mrr_at_3_max
value: 41.049510969934694
- type: nauc_mrr_at_3_std
value: 23.18371458928609
- type: nauc_mrr_at_5_diff1
value: 33.2389593543916
- type: nauc_mrr_at_5_max
value: 41.629486918949915
- type: nauc_mrr_at_5_std
value: 24.5777253036149
- type: nauc_ndcg_at_1000_diff1
value: 25.868840609197637
- type: nauc_ndcg_at_1000_max
value: 42.79564910784761
- type: nauc_ndcg_at_1000_std
value: 27.035091271680113
- type: nauc_ndcg_at_100_diff1
value: 25.019789319579942
- type: nauc_ndcg_at_100_max
value: 42.482345143533735
- type: nauc_ndcg_at_100_std
value: 26.76872010731345
- type: nauc_ndcg_at_10_diff1
value: 25.949464660653238
- type: nauc_ndcg_at_10_max
value: 40.79769544643906
- type: nauc_ndcg_at_10_std
value: 22.486116508973204
- type: nauc_ndcg_at_1_diff1
value: 39.571969559016395
- type: nauc_ndcg_at_1_max
value: 39.396249211263495
- type: nauc_ndcg_at_1_std
value: 16.984149923258357
- type: nauc_ndcg_at_20_diff1
value: 25.173455685962214
- type: nauc_ndcg_at_20_max
value: 40.88873540662413
- type: nauc_ndcg_at_20_std
value: 24.4451041955519
- type: nauc_ndcg_at_3_diff1
value: 28.185416070726333
- type: nauc_ndcg_at_3_max
value: 39.10600031163912
- type: nauc_ndcg_at_3_std
value: 18.42694044215541
- type: nauc_ndcg_at_5_diff1
value: 27.112647584005583
- type: nauc_ndcg_at_5_max
value: 40.154045682322526
- type: nauc_ndcg_at_5_std
value: 20.26822517176828
- type: nauc_precision_at_1000_diff1
value: -16.42087927044017
- type: nauc_precision_at_1000_max
value: 3.5326295053913
- type: nauc_precision_at_1000_std
value: 24.406810708493197
- type: nauc_precision_at_100_diff1
value: -12.17648135724982
- type: nauc_precision_at_100_max
value: 15.895489260126183
- type: nauc_precision_at_100_std
value: 32.48346122610907
- type: nauc_precision_at_10_diff1
value: -1.2493131347748072
- type: nauc_precision_at_10_max
value: 26.409459305604376
- type: nauc_precision_at_10_std
value: 31.115432019300016
- type: nauc_precision_at_1_diff1
value: 39.571969559016395
- type: nauc_precision_at_1_max
value: 39.396249211263495
- type: nauc_precision_at_1_std
value: 16.984149923258357
- type: nauc_precision_at_20_diff1
value: -6.597509397240593
- type: nauc_precision_at_20_max
value: 21.461984620659695
- type: nauc_precision_at_20_std
value: 32.9450259748889
- type: nauc_precision_at_3_diff1
value: 9.46378764865453
- type: nauc_precision_at_3_max
value: 32.03650819375425
- type: nauc_precision_at_3_std
value: 26.489382638510765
- type: nauc_precision_at_5_diff1
value: 3.5987036728169537
- type: nauc_precision_at_5_max
value: 30.633955978579703
- type: nauc_precision_at_5_std
value: 30.532430088014443
- type: nauc_recall_at_1000_diff1
value: 10.714633106872254
- type: nauc_recall_at_1000_max
value: 43.94958623961
- type: nauc_recall_at_1000_std
value: 51.78914468954123
- type: nauc_recall_at_100_diff1
value: 9.63781472255557
- type: nauc_recall_at_100_max
value: 38.50917465255336
- type: nauc_recall_at_100_std
value: 37.78623984642377
- type: nauc_recall_at_10_diff1
value: 16.480342820841688
- type: nauc_recall_at_10_max
value: 35.982566867357406
- type: nauc_recall_at_10_std
value: 23.30688188788895
- type: nauc_recall_at_1_diff1
value: 43.34155892182403
- type: nauc_recall_at_1_max
value: 38.23324890148018
- type: nauc_recall_at_1_std
value: 6.0781444393516075
- type: nauc_recall_at_20_diff1
value: 13.521048985146367
- type: nauc_recall_at_20_max
value: 34.62462209239834
- type: nauc_recall_at_20_std
value: 27.85924191501618
- type: nauc_recall_at_3_diff1
value: 23.57032748533523
- type: nauc_recall_at_3_max
value: 36.32703197635613
- type: nauc_recall_at_3_std
value: 15.730238734014337
- type: nauc_recall_at_5_diff1
value: 19.61387036368584
- type: nauc_recall_at_5_max
value: 36.22030835529556
- type: nauc_recall_at_5_std
value: 19.76310648649897
- type: ndcg_at_1
value: 43.779
- type: ndcg_at_10
value: 43.525999999999996
- type: ndcg_at_100
value: 50.138000000000005
- type: ndcg_at_1000
value: 52.991
- type: ndcg_at_20
value: 46.083
- type: ndcg_at_3
value: 38.002
- type: ndcg_at_5
value: 39.842
- type: precision_at_1
value: 43.779
- type: precision_at_10
value: 13.205
- type: precision_at_100
value: 2.051
- type: precision_at_1000
value: 0.259
- type: precision_at_20
value: 7.722999999999999
- type: precision_at_3
value: 28.903000000000002
- type: precision_at_5
value: 21.368000000000002
- type: recall_at_1
value: 19.291
- type: recall_at_10
value: 48.754
- type: recall_at_100
value: 70.97200000000001
- type: recall_at_1000
value: 86.611
- type: recall_at_20
value: 55.884
- type: recall_at_3
value: 34.101
- type: recall_at_5
value: 40.784
task:
type: Retrieval
- dataset:
config: default
name: MTEB DBPedia
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
split: test
type: mteb/dbpedia
metrics:
- type: main_score
value: 49.884
- type: map_at_1
value: 9.913
- type: map_at_10
value: 23.186999999999998
- type: map_at_100
value: 34.207
- type: map_at_1000
value: 36.318
- type: map_at_20
value: 27.419
- type: map_at_3
value: 15.656
- type: map_at_5
value: 18.945999999999998
- type: mrr_at_1
value: 75.75
- type: mrr_at_10
value: 82.16279761904761
- type: mrr_at_100
value: 82.48445635330299
- type: mrr_at_1000
value: 82.4870246719901
- type: mrr_at_20
value: 82.36203632968338
- type: mrr_at_3
value: 81.29166666666666
- type: mrr_at_5
value: 82.02916666666667
- type: nauc_map_at_1000_diff1
value: 17.0739966990996
- type: nauc_map_at_1000_max
value: 28.440065298437133
- type: nauc_map_at_1000_std
value: 20.83498154003865
- type: nauc_map_at_100_diff1
value: 17.75982086107111
- type: nauc_map_at_100_max
value: 26.87850835673573
- type: nauc_map_at_100_std
value: 18.350282298599275
- type: nauc_map_at_10_diff1
value: 17.15984258564116
- type: nauc_map_at_10_max
value: 10.846179132675553
- type: nauc_map_at_10_std
value: -6.263534464094614
- type: nauc_map_at_1_diff1
value: 24.014897777973694
- type: nauc_map_at_1_max
value: -4.556638938723358
- type: nauc_map_at_1_std
value: -22.7844467526989
- type: nauc_map_at_20_diff1
value: 16.3179372493187
- type: nauc_map_at_20_max
value: 17.176378915498915
- type: nauc_map_at_20_std
value: 1.9378637630340372
- type: nauc_map_at_3_diff1
value: 19.12786794046792
- type: nauc_map_at_3_max
value: 0.09063919305677291
- type: nauc_map_at_3_std
value: -16.713143158330492
- type: nauc_map_at_5_diff1
value: 18.76504725420023
- type: nauc_map_at_5_max
value: 5.040867712207419
- type: nauc_map_at_5_std
value: -12.382578318931165
- type: nauc_mrr_at_1000_diff1
value: 54.61266255011247
- type: nauc_mrr_at_1000_max
value: 60.83961280977112
- type: nauc_mrr_at_1000_std
value: 32.70429260443016
- type: nauc_mrr_at_100_diff1
value: 54.61346236538542
- type: nauc_mrr_at_100_max
value: 60.8407974416647
- type: nauc_mrr_at_100_std
value: 32.69272843993462
- type: nauc_mrr_at_10_diff1
value: 54.74633685810871
- type: nauc_mrr_at_10_max
value: 61.084525933097865
- type: nauc_mrr_at_10_std
value: 33.001220210025565
- type: nauc_mrr_at_1_diff1
value: 56.12708423835806
- type: nauc_mrr_at_1_max
value: 58.9314540998289
- type: nauc_mrr_at_1_std
value: 27.39422607651012
- type: nauc_mrr_at_20_diff1
value: 54.58896150245695
- type: nauc_mrr_at_20_max
value: 60.890929983464815
- type: nauc_mrr_at_20_std
value: 32.65559641276393
- type: nauc_mrr_at_3_diff1
value: 54.38229071443791
- type: nauc_mrr_at_3_max
value: 59.987849044098596
- type: nauc_mrr_at_3_std
value: 33.439813880719974
- type: nauc_mrr_at_5_diff1
value: 54.961790262449824
- type: nauc_mrr_at_5_max
value: 61.17705173908951
- type: nauc_mrr_at_5_std
value: 33.30939850734856
- type: nauc_ndcg_at_1000_diff1
value: 29.27465932507067
- type: nauc_ndcg_at_1000_max
value: 47.952543312315214
- type: nauc_ndcg_at_1000_std
value: 36.17132236391485
- type: nauc_ndcg_at_100_diff1
value: 28.63072328980134
- type: nauc_ndcg_at_100_max
value: 41.460833419186564
- type: nauc_ndcg_at_100_std
value: 27.157100358988135
- type: nauc_ndcg_at_10_diff1
value: 23.41488013023301
- type: nauc_ndcg_at_10_max
value: 39.27798133072349
- type: nauc_ndcg_at_10_std
value: 21.979241438928312
- type: nauc_ndcg_at_1_diff1
value: 46.12120543657642
- type: nauc_ndcg_at_1_max
value: 47.28452124039853
- type: nauc_ndcg_at_1_std
value: 19.799884708952543
- type: nauc_ndcg_at_20_diff1
value: 23.627669045115574
- type: nauc_ndcg_at_20_max
value: 35.88225062457673
- type: nauc_ndcg_at_20_std
value: 18.218628030529498
- type: nauc_ndcg_at_3_diff1
value: 25.37309228946118
- type: nauc_ndcg_at_3_max
value: 40.64426332992231
- type: nauc_ndcg_at_3_std
value: 24.608330645901482
- type: nauc_ndcg_at_5_diff1
value: 24.055798594999654
- type: nauc_ndcg_at_5_max
value: 41.16180524175431
- type: nauc_ndcg_at_5_std
value: 24.048305528761315
- type: nauc_precision_at_1000_diff1
value: -18.234943251015576
- type: nauc_precision_at_1000_max
value: 0.48708502364659184
- type: nauc_precision_at_1000_std
value: 2.4473601543134027
- type: nauc_precision_at_100_diff1
value: -3.0077810947381227
- type: nauc_precision_at_100_max
value: 25.27249321108913
- type: nauc_precision_at_100_std
value: 37.36575792126928
- type: nauc_precision_at_10_diff1
value: -0.2393778190297635
- type: nauc_precision_at_10_max
value: 36.40513293547299
- type: nauc_precision_at_10_std
value: 37.4827885766009
- type: nauc_precision_at_1_diff1
value: 56.12708423835806
- type: nauc_precision_at_1_max
value: 58.9314540998289
- type: nauc_precision_at_1_std
value: 27.39422607651012
- type: nauc_precision_at_20_diff1
value: -1.2010133229402933
- type: nauc_precision_at_20_max
value: 34.117541814385966
- type: nauc_precision_at_20_std
value: 39.13273254177449
- type: nauc_precision_at_3_diff1
value: 11.757378092198486
- type: nauc_precision_at_3_max
value: 42.637962482588875
- type: nauc_precision_at_3_std
value: 37.42465077352342
- type: nauc_precision_at_5_diff1
value: 7.233177203405101
- type: nauc_precision_at_5_max
value: 43.1663582897407
- type: nauc_precision_at_5_std
value: 38.848449220750055
- type: nauc_recall_at_1000_diff1
value: 27.33938551969145
- type: nauc_recall_at_1000_max
value: 45.5614254479334
- type: nauc_recall_at_1000_std
value: 50.58528916250458
- type: nauc_recall_at_100_diff1
value: 23.610383761920097
- type: nauc_recall_at_100_max
value: 31.422168485847184
- type: nauc_recall_at_100_std
value: 25.58649926458304
- type: nauc_recall_at_10_diff1
value: 14.62495111808408
- type: nauc_recall_at_10_max
value: 7.4295041277681095
- type: nauc_recall_at_10_std
value: -9.32297089600654
- type: nauc_recall_at_1_diff1
value: 24.014897777973694
- type: nauc_recall_at_1_max
value: -4.556638938723358
- type: nauc_recall_at_1_std
value: -22.7844467526989
- type: nauc_recall_at_20_diff1
value: 14.027862330014662
- type: nauc_recall_at_20_max
value: 12.437478731690844
- type: nauc_recall_at_20_std
value: -3.0740743798103676
- type: nauc_recall_at_3_diff1
value: 16.354018356566712
- type: nauc_recall_at_3_max
value: -2.9812231240997917
- type: nauc_recall_at_3_std
value: -18.27746460743442
- type: nauc_recall_at_5_diff1
value: 16.81486583473587
- type: nauc_recall_at_5_max
value: 2.420128513974744
- type: nauc_recall_at_5_std
value: -14.441820321214108
- type: ndcg_at_1
value: 63.87500000000001
- type: ndcg_at_10
value: 49.884
- type: ndcg_at_100
value: 54.738
- type: ndcg_at_1000
value: 61.635
- type: ndcg_at_20
value: 48.894999999999996
- type: ndcg_at_3
value: 54.287
- type: ndcg_at_5
value: 52.40899999999999
- type: precision_at_1
value: 75.75
- type: precision_at_10
value: 40.9
- type: precision_at_100
value: 13.139999999999999
- type: precision_at_1000
value: 2.533
- type: precision_at_20
value: 30.8
- type: precision_at_3
value: 57.667
- type: precision_at_5
value: 51.05
- type: recall_at_1
value: 9.913
- type: recall_at_10
value: 28.591
- type: recall_at_100
value: 61.017999999999994
- type: recall_at_1000
value: 83.383
- type: recall_at_20
value: 37.834
- type: recall_at_3
value: 17.049
- type: recall_at_5
value: 21.685
task:
type: Retrieval
- dataset:
config: default
name: MTEB EmotionClassification
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
split: test
type: mteb/emotion
metrics:
- type: accuracy
value: 78.77499999999999
- type: f1
value: 73.74058240799386
- type: f1_weighted
value: 79.78804377638227
- type: main_score
value: 78.77499999999999
task:
type: Classification
- dataset:
config: default
name: MTEB FEVER
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
split: test
type: mteb/fever
metrics:
- type: main_score
value: 90.986
- type: map_at_1
value: 81.601
- type: map_at_10
value: 88.242
- type: map_at_100
value: 88.46000000000001
- type: map_at_1000
value: 88.472
- type: map_at_20
value: 88.375
- type: map_at_3
value: 87.237
- type: map_at_5
value: 87.85300000000001
- type: mrr_at_1
value: 87.81878187818782
- type: mrr_at_10
value: 92.20301196786335
- type: mrr_at_100
value: 92.24884236673292
- type: mrr_at_1000
value: 92.2496338899362
- type: mrr_at_20
value: 92.23112073283473
- type: mrr_at_3
value: 91.77417741774165
- type: mrr_at_5
value: 92.03970397039689
- type: nauc_map_at_1000_diff1
value: 56.54670664910505
- type: nauc_map_at_1000_max
value: 33.08375749975477
- type: nauc_map_at_1000_std
value: 2.7491595418252865
- type: nauc_map_at_100_diff1
value: 56.50887688686924
- type: nauc_map_at_100_max
value: 33.075487189958494
- type: nauc_map_at_100_std
value: 2.7675869969253375
- type: nauc_map_at_10_diff1
value: 56.08080806610569
- type: nauc_map_at_10_max
value: 32.776972098819066
- type: nauc_map_at_10_std
value: 2.5904846711290097
- type: nauc_map_at_1_diff1
value: 60.645344065853145
- type: nauc_map_at_1_max
value: 31.232776777514797
- type: nauc_map_at_1_std
value: -1.1946138176109171
- type: nauc_map_at_20_diff1
value: 56.28378454162355
- type: nauc_map_at_20_max
value: 32.98207150385811
- type: nauc_map_at_20_std
value: 2.8469814040214025
- type: nauc_map_at_3_diff1
value: 55.81958007095375
- type: nauc_map_at_3_max
value: 31.602707711038313
- type: nauc_map_at_3_std
value: 0.8117019292273401
- type: nauc_map_at_5_diff1
value: 55.706025752316535
- type: nauc_map_at_5_max
value: 32.16032683604737
- type: nauc_map_at_5_std
value: 1.8853201503498669
- type: nauc_mrr_at_1000_diff1
value: 75.4997173366251
- type: nauc_mrr_at_1000_max
value: 41.49117135484116
- type: nauc_mrr_at_1000_std
value: -2.0636172883680852
- type: nauc_mrr_at_100_diff1
value: 75.50118860648519
- type: nauc_mrr_at_100_max
value: 41.49490161517194
- type: nauc_mrr_at_100_std
value: -2.057024385178682
- type: nauc_mrr_at_10_diff1
value: 75.47295153099428
- type: nauc_mrr_at_10_max
value: 41.55003304042536
- type: nauc_mrr_at_10_std
value: -2.0353663198929253
- type: nauc_mrr_at_1_diff1
value: 76.632058433229
- type: nauc_mrr_at_1_max
value: 39.754483718891656
- type: nauc_mrr_at_1_std
value: -2.962241058101701
- type: nauc_mrr_at_20_diff1
value: 75.47221882396194
- type: nauc_mrr_at_20_max
value: 41.50779280480839
- type: nauc_mrr_at_20_std
value: -1.9620212266426307
- type: nauc_mrr_at_3_diff1
value: 75.5682297897137
- type: nauc_mrr_at_3_max
value: 41.53543801506081
- type: nauc_mrr_at_3_std
value: -3.391681195945978
- type: nauc_mrr_at_5_diff1
value: 75.37562775183947
- type: nauc_mrr_at_5_max
value: 41.42028509006753
- type: nauc_mrr_at_5_std
value: -2.418698675622726
- type: nauc_ndcg_at_1000_diff1
value: 59.364557011624
- type: nauc_ndcg_at_1000_max
value: 35.4112238125149
- type: nauc_ndcg_at_1000_std
value: 3.717516193303376
- type: nauc_ndcg_at_100_diff1
value: 58.55706703023122
- type: nauc_ndcg_at_100_max
value: 35.352285999934594
- type: nauc_ndcg_at_100_std
value: 4.273437944266781
- type: nauc_ndcg_at_10_diff1
value: 56.77422701267037
- type: nauc_ndcg_at_10_max
value: 34.24909893882957
- type: nauc_ndcg_at_10_std
value: 4.178151434006727
- type: nauc_ndcg_at_1_diff1
value: 76.632058433229
- type: nauc_ndcg_at_1_max
value: 39.754483718891656
- type: nauc_ndcg_at_1_std
value: -2.962241058101701
- type: nauc_ndcg_at_20_diff1
value: 57.27343398231262
- type: nauc_ndcg_at_20_max
value: 34.7416626740278
- type: nauc_ndcg_at_20_std
value: 4.955858766014002
- type: nauc_ndcg_at_3_diff1
value: 57.69267803121093
- type: nauc_ndcg_at_3_max
value: 33.13744317023105
- type: nauc_ndcg_at_3_std
value: 0.40380284030057023
- type: nauc_ndcg_at_5_diff1
value: 56.57461019113917
- type: nauc_ndcg_at_5_max
value: 33.244657840804386
- type: nauc_ndcg_at_5_std
value: 2.5121440827702046
- type: nauc_precision_at_1000_diff1
value: -14.54492513449718
- type: nauc_precision_at_1000_max
value: -5.94552147573623
- type: nauc_precision_at_1000_std
value: 1.2446209816057374
- type: nauc_precision_at_100_diff1
value: -15.452676132568344
- type: nauc_precision_at_100_max
value: -3.760241749847617
- type: nauc_precision_at_100_std
value: 4.623534605290865
- type: nauc_precision_at_10_diff1
value: -12.712908026086176
- type: nauc_precision_at_10_max
value: 0.45241316994816805
- type: nauc_precision_at_10_std
value: 7.849478570138391
- type: nauc_precision_at_1_diff1
value: 76.632058433229
- type: nauc_precision_at_1_max
value: 39.754483718891656
- type: nauc_precision_at_1_std
value: -2.962241058101701
- type: nauc_precision_at_20_diff1
value: -14.514618673172041
- type: nauc_precision_at_20_max
value: -1.113635490621818
- type: nauc_precision_at_20_std
value: 8.599811730457576
- type: nauc_precision_at_3_diff1
value: 6.1367799850003815
- type: nauc_precision_at_3_max
value: 8.466271950897857
- type: nauc_precision_at_3_std
value: 1.7458051543195068
- type: nauc_precision_at_5_diff1
value: -5.804548945783379
- type: nauc_precision_at_5_max
value: 3.4060251839074818
- type: nauc_precision_at_5_std
value: 5.583410511782371
- type: nauc_recall_at_1000_diff1
value: 19.329432953574095
- type: nauc_recall_at_1000_max
value: 43.260442595158736
- type: nauc_recall_at_1000_std
value: 53.89644660661804
- type: nauc_recall_at_100_diff1
value: 21.265326296051235
- type: nauc_recall_at_100_max
value: 38.573000195373695
- type: nauc_recall_at_100_std
value: 42.169391082152785
- type: nauc_recall_at_10_diff1
value: 29.785129558987432
- type: nauc_recall_at_10_max
value: 28.379657867558034
- type: nauc_recall_at_10_std
value: 21.132574624091973
- type: nauc_recall_at_1_diff1
value: 60.645344065853145
- type: nauc_recall_at_1_max
value: 31.232776777514797
- type: nauc_recall_at_1_std
value: -1.1946138176109171
- type: nauc_recall_at_20_diff1
value: 25.88845612373954
- type: nauc_recall_at_20_max
value: 30.24785945821152
- type: nauc_recall_at_20_std
value: 31.73911437468067
- type: nauc_recall_at_3_diff1
value: 42.2968464797395
- type: nauc_recall_at_3_max
value: 26.494318009870018
- type: nauc_recall_at_3_std
value: 2.6045977160467544
- type: nauc_recall_at_5_diff1
value: 35.81340094401374
- type: nauc_recall_at_5_max
value: 25.91082947510634
- type: nauc_recall_at_5_std
value: 9.759404930864779
- type: ndcg_at_1
value: 87.819
- type: ndcg_at_10
value: 90.986
- type: ndcg_at_100
value: 91.69
- type: ndcg_at_1000
value: 91.863
- type: ndcg_at_20
value: 91.293
- type: ndcg_at_3
value: 89.621
- type: ndcg_at_5
value: 90.333
- type: precision_at_1
value: 87.819
- type: precision_at_10
value: 10.753
- type: precision_at_100
value: 1.138
- type: precision_at_1000
value: 0.117
- type: precision_at_20
value: 5.4879999999999995
- type: precision_at_3
value: 33.703
- type: precision_at_5
value: 20.831
- type: recall_at_1
value: 81.601
- type: recall_at_10
value: 95.44200000000001
- type: recall_at_100
value: 98.14399999999999
- type: recall_at_1000
value: 99.157
- type: recall_at_20
value: 96.43
- type: recall_at_3
value: 91.729
- type: recall_at_5
value: 93.552
task:
type: Retrieval
- dataset:
config: default
name: MTEB FiQA2018
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
split: test
type: mteb/fiqa
metrics:
- type: main_score
value: 56.056
- type: map_at_1
value: 28.666000000000004
- type: map_at_10
value: 47.437000000000005
- type: map_at_100
value: 49.537
- type: map_at_1000
value: 49.665
- type: map_at_20
value: 48.618
- type: map_at_3
value: 41.355
- type: map_at_5
value: 44.525
- type: mrr_at_1
value: 55.55555555555556
- type: mrr_at_10
value: 63.705173427395614
- type: mrr_at_100
value: 64.25449940779741
- type: mrr_at_1000
value: 64.27635581092147
- type: mrr_at_20
value: 64.03796029079103
- type: mrr_at_3
value: 61.49691358024688
- type: mrr_at_5
value: 62.73148148148143
- type: nauc_map_at_1000_diff1
value: 43.24282910397747
- type: nauc_map_at_1000_max
value: 28.506093180265644
- type: nauc_map_at_1000_std
value: -13.040508386155054
- type: nauc_map_at_100_diff1
value: 43.23650442904607
- type: nauc_map_at_100_max
value: 28.470565635459156
- type: nauc_map_at_100_std
value: -12.988098780714935
- type: nauc_map_at_10_diff1
value: 43.393840733087686
- type: nauc_map_at_10_max
value: 26.637302062720153
- type: nauc_map_at_10_std
value: -14.47500292113762
- type: nauc_map_at_1_diff1
value: 47.705150227211725
- type: nauc_map_at_1_max
value: 15.354189686550129
- type: nauc_map_at_1_std
value: -14.559819859039067
- type: nauc_map_at_20_diff1
value: 43.14121075706104
- type: nauc_map_at_20_max
value: 27.811170590408395
- type: nauc_map_at_20_std
value: -13.459413585283583
- type: nauc_map_at_3_diff1
value: 44.33938667720801
- type: nauc_map_at_3_max
value: 21.785619884549398
- type: nauc_map_at_3_std
value: -15.569980103071593
- type: nauc_map_at_5_diff1
value: 43.39280905665027
- type: nauc_map_at_5_max
value: 25.021492190645017
- type: nauc_map_at_5_std
value: -14.48856622187443
- type: nauc_mrr_at_1000_diff1
value: 52.971563939946286
- type: nauc_mrr_at_1000_max
value: 38.88019486172324
- type: nauc_mrr_at_1000_std
value: -12.412991642381616
- type: nauc_mrr_at_100_diff1
value: 52.978468139876945
- type: nauc_mrr_at_100_max
value: 38.89751787948751
- type: nauc_mrr_at_100_std
value: -12.3677876252269
- type: nauc_mrr_at_10_diff1
value: 52.78507148048174
- type: nauc_mrr_at_10_max
value: 38.55079809310022
- type: nauc_mrr_at_10_std
value: -12.944127025078755
- type: nauc_mrr_at_1_diff1
value: 55.52626805861546
- type: nauc_mrr_at_1_max
value: 40.49306809164979
- type: nauc_mrr_at_1_std
value: -12.886607701317681
- type: nauc_mrr_at_20_diff1
value: 52.9592152665678
- type: nauc_mrr_at_20_max
value: 38.88514014589964
- type: nauc_mrr_at_20_std
value: -12.434464359819444
- type: nauc_mrr_at_3_diff1
value: 52.73696844091174
- type: nauc_mrr_at_3_max
value: 38.61018727252859
- type: nauc_mrr_at_3_std
value: -13.123989867364166
- type: nauc_mrr_at_5_diff1
value: 53.037110010188
- type: nauc_mrr_at_5_max
value: 38.44770729849151
- type: nauc_mrr_at_5_std
value: -13.49318771828972
- type: nauc_ndcg_at_1000_diff1
value: 44.73813840091289
- type: nauc_ndcg_at_1000_max
value: 33.70113904685389
- type: nauc_ndcg_at_1000_std
value: -10.328687058192742
- type: nauc_ndcg_at_100_diff1
value: 44.595174119928835
- type: nauc_ndcg_at_100_max
value: 33.4788285112467
- type: nauc_ndcg_at_100_std
value: -8.695355259716946
- type: nauc_ndcg_at_10_diff1
value: 44.39837225263
- type: nauc_ndcg_at_10_max
value: 29.188289725593393
- type: nauc_ndcg_at_10_std
value: -13.67608323673103
- type: nauc_ndcg_at_1_diff1
value: 55.52626805861546
- type: nauc_ndcg_at_1_max
value: 40.49306809164979
- type: nauc_ndcg_at_1_std
value: -12.886607701317681
- type: nauc_ndcg_at_20_diff1
value: 44.24661739902305
- type: nauc_ndcg_at_20_max
value: 31.667868318249965
- type: nauc_ndcg_at_20_std
value: -10.65470780066342
- type: nauc_ndcg_at_3_diff1
value: 43.39857166975522
- type: nauc_ndcg_at_3_max
value: 31.764668313577495
- type: nauc_ndcg_at_3_std
value: -14.494866954678152
- type: nauc_ndcg_at_5_diff1
value: 43.16976647347281
- type: nauc_ndcg_at_5_max
value: 29.878329062643143
- type: nauc_ndcg_at_5_std
value: -13.987689089179739
- type: nauc_precision_at_1000_diff1
value: -9.807973252625484
- type: nauc_precision_at_1000_max
value: 26.6279603849494
- type: nauc_precision_at_1000_std
value: 7.113187103520632
- type: nauc_precision_at_100_diff1
value: -4.777149603323976
- type: nauc_precision_at_100_max
value: 31.03410463692187
- type: nauc_precision_at_100_std
value: 10.463144150275435
- type: nauc_precision_at_10_diff1
value: 8.691528703215962
- type: nauc_precision_at_10_max
value: 33.329579434123374
- type: nauc_precision_at_10_std
value: -0.8002015226329403
- type: nauc_precision_at_1_diff1
value: 55.52626805861546
- type: nauc_precision_at_1_max
value: 40.49306809164979
- type: nauc_precision_at_1_std
value: -12.886607701317681
- type: nauc_precision_at_20_diff1
value: 3.4564653474184284
- type: nauc_precision_at_20_max
value: 34.401070158471136
- type: nauc_precision_at_20_std
value: 5.813431200164549
- type: nauc_precision_at_3_diff1
value: 22.463219705462187
- type: nauc_precision_at_3_max
value: 34.77413976546924
- type: nauc_precision_at_3_std
value: -7.083890789741479
- type: nauc_precision_at_5_diff1
value: 14.011006004883154
- type: nauc_precision_at_5_max
value: 35.73655466853702
- type: nauc_precision_at_5_std
value: -2.8395172077771598
- type: nauc_recall_at_1000_diff1
value: 16.478046357391555
- type: nauc_recall_at_1000_max
value: 43.231704288282344
- type: nauc_recall_at_1000_std
value: 38.430684937573645
- type: nauc_recall_at_100_diff1
value: 30.764718344602436
- type: nauc_recall_at_100_max
value: 31.769050487166655
- type: nauc_recall_at_100_std
value: 23.48468311677149
- type: nauc_recall_at_10_diff1
value: 34.47339565324045
- type: nauc_recall_at_10_max
value: 19.054212335800454
- type: nauc_recall_at_10_std
value: -11.039734015330437
- type: nauc_recall_at_1_diff1
value: 47.705150227211725
- type: nauc_recall_at_1_max
value: 15.354189686550129
- type: nauc_recall_at_1_std
value: -14.559819859039067
- type: nauc_recall_at_20_diff1
value: 32.1011474016873
- type: nauc_recall_at_20_max
value: 25.546372988304423
- type: nauc_recall_at_20_std
value: -0.007233471152482897
- type: nauc_recall_at_3_diff1
value: 37.5708138019065
- type: nauc_recall_at_3_max
value: 16.66410785756736
- type: nauc_recall_at_3_std
value: -15.404817020108966
- type: nauc_recall_at_5_diff1
value: 35.714519648479595
- type: nauc_recall_at_5_max
value: 19.02075233009296
- type: nauc_recall_at_5_std
value: -13.180963359760725
- type: ndcg_at_1
value: 55.556000000000004
- type: ndcg_at_10
value: 56.056
- type: ndcg_at_100
value: 62.44
- type: ndcg_at_1000
value: 64.263
- type: ndcg_at_20
value: 58.638999999999996
- type: ndcg_at_3
value: 51.722
- type: ndcg_at_5
value: 52.701
- type: precision_at_1
value: 55.556000000000004
- type: precision_at_10
value: 15.679000000000002
- type: precision_at_100
value: 2.252
- type: precision_at_1000
value: 0.257
- type: precision_at_20
value: 9.02
- type: precision_at_3
value: 34.619
- type: precision_at_5
value: 25.093
- type: recall_at_1
value: 28.666000000000004
- type: recall_at_10
value: 63.717999999999996
- type: recall_at_100
value: 86.938
- type: recall_at_1000
value: 97.603
- type: recall_at_20
value: 71.649
- type: recall_at_3
value: 46.663
- type: recall_at_5
value: 53.313
task:
type: Retrieval
- dataset:
config: default
name: MTEB HotpotQA
revision: ab518f4d6fcca38d87c25209f94beba119d02014
split: test
type: mteb/hotpotqa
metrics:
- type: main_score
value: 71.74199999999999
- type: map_at_1
value: 41.729
- type: map_at_10
value: 63.168
- type: map_at_100
value: 64.132
- type: map_at_1000
value: 64.199
- type: map_at_20
value: 63.736000000000004
- type: map_at_3
value: 59.826
- type: map_at_5
value: 61.882000000000005
- type: mrr_at_1
value: 83.45712356515868
- type: mrr_at_10
value: 87.850342432719
- type: mrr_at_100
value: 88.0016320691113
- type: mrr_at_1000
value: 88.00576596968136
- type: mrr_at_20
value: 87.94463253190389
- type: mrr_at_3
value: 87.13706954760278
- type: mrr_at_5
value: 87.59419311276136
- type: nauc_map_at_1000_diff1
value: 13.635446621095054
- type: nauc_map_at_1000_max
value: 18.670632529445633
- type: nauc_map_at_1000_std
value: 10.444842636150575
- type: nauc_map_at_100_diff1
value: 13.599262398010783
- type: nauc_map_at_100_max
value: 18.636389405484806
- type: nauc_map_at_100_std
value: 10.460027483576043
- type: nauc_map_at_10_diff1
value: 13.235053919323942
- type: nauc_map_at_10_max
value: 18.252140477080047
- type: nauc_map_at_10_std
value: 9.9075337042203
- type: nauc_map_at_1_diff1
value: 76.51940497836482
- type: nauc_map_at_1_max
value: 51.251419487235474
- type: nauc_map_at_1_std
value: 0.16714896857146574
- type: nauc_map_at_20_diff1
value: 13.4178245722222
- type: nauc_map_at_20_max
value: 18.40988771210718
- type: nauc_map_at_20_std
value: 10.216685163366282
- type: nauc_map_at_3_diff1
value: 13.38370761663418
- type: nauc_map_at_3_max
value: 17.760962555456537
- type: nauc_map_at_3_std
value: 7.15741965624388
- type: nauc_map_at_5_diff1
value: 13.138133309724855
- type: nauc_map_at_5_max
value: 17.871761295251044
- type: nauc_map_at_5_std
value: 8.475147426940074
- type: nauc_mrr_at_1000_diff1
value: 75.82650818891959
- type: nauc_mrr_at_1000_max
value: 53.6736100668434
- type: nauc_mrr_at_1000_std
value: 1.8025016349213916
- type: nauc_mrr_at_100_diff1
value: 75.82530574210111
- type: nauc_mrr_at_100_max
value: 53.68067545829002
- type: nauc_mrr_at_100_std
value: 1.8147470536495791
- type: nauc_mrr_at_10_diff1
value: 75.8330135686799
- type: nauc_mrr_at_10_max
value: 53.78626885349077
- type: nauc_mrr_at_10_std
value: 1.7975782717226636
- type: nauc_mrr_at_1_diff1
value: 76.51940497836482
- type: nauc_mrr_at_1_max
value: 51.251419487235474
- type: nauc_mrr_at_1_std
value: 0.16714896857146574
- type: nauc_mrr_at_20_diff1
value: 75.82783382464166
- type: nauc_mrr_at_20_max
value: 53.68364567043885
- type: nauc_mrr_at_20_std
value: 1.742037904463963
- type: nauc_mrr_at_3_diff1
value: 75.6944609768663
- type: nauc_mrr_at_3_max
value: 53.803941340341666
- type: nauc_mrr_at_3_std
value: 1.1849945458077804
- type: nauc_mrr_at_5_diff1
value: 75.73006960604903
- type: nauc_mrr_at_5_max
value: 53.62223096420106
- type: nauc_mrr_at_5_std
value: 1.6144067563410909
- type: nauc_ndcg_at_1000_diff1
value: 21.58025241642726
- type: nauc_ndcg_at_1000_max
value: 24.675747527001153
- type: nauc_ndcg_at_1000_std
value: 13.075943547492718
- type: nauc_ndcg_at_100_diff1
value: 20.30260137544846
- type: nauc_ndcg_at_100_max
value: 23.757528813872018
- type: nauc_ndcg_at_100_std
value: 13.648994687574062
- type: nauc_ndcg_at_10_diff1
value: 18.995052360997818
- type: nauc_ndcg_at_10_max
value: 22.254260808196037
- type: nauc_ndcg_at_10_std
value: 11.27212390633054
- type: nauc_ndcg_at_1_diff1
value: 76.51940497836482
- type: nauc_ndcg_at_1_max
value: 51.251419487235474
- type: nauc_ndcg_at_1_std
value: 0.16714896857146574
- type: nauc_ndcg_at_20_diff1
value: 19.333742380695757
- type: nauc_ndcg_at_20_max
value: 22.527779834633364
- type: nauc_ndcg_at_20_std
value: 12.161009000707917
- type: nauc_ndcg_at_3_diff1
value: 20.013329040965534
- type: nauc_ndcg_at_3_max
value: 21.99692460311921
- type: nauc_ndcg_at_3_std
value: 6.8076290638386165
- type: nauc_ndcg_at_5_diff1
value: 19.08226315942471
- type: nauc_ndcg_at_5_max
value: 21.71185964294168
- type: nauc_ndcg_at_5_std
value: 8.671911269518214
- type: nauc_precision_at_1000_diff1
value: 2.4462475489446764
- type: nauc_precision_at_1000_max
value: 29.145662064268578
- type: nauc_precision_at_1000_std
value: 49.20704909525856
- type: nauc_precision_at_100_diff1
value: 0.11271196725540299
- type: nauc_precision_at_100_max
value: 17.37584606388067
- type: nauc_precision_at_100_std
value: 34.66099346244071
- type: nauc_precision_at_10_diff1
value: 2.9923183951227825
- type: nauc_precision_at_10_max
value: 14.261884731124264
- type: nauc_precision_at_10_std
value: 18.084188795498378
- type: nauc_precision_at_1_diff1
value: 76.51940497836482
- type: nauc_precision_at_1_max
value: 51.251419487235474
- type: nauc_precision_at_1_std
value: 0.16714896857146574
- type: nauc_precision_at_20_diff1
value: 1.9180293008303761
- type: nauc_precision_at_20_max
value: 13.832269193468512
- type: nauc_precision_at_20_std
value: 21.65284406055607
- type: nauc_precision_at_3_diff1
value: 7.226609484731811
- type: nauc_precision_at_3_max
value: 15.162908526977272
- type: nauc_precision_at_3_std
value: 8.451859972962776
- type: nauc_precision_at_5_diff1
value: 4.705236845538159
- type: nauc_precision_at_5_max
value: 14.022910843582666
- type: nauc_precision_at_5_std
value: 11.777269322821605
- type: nauc_recall_at_1000_diff1
value: 2.446247548945172
- type: nauc_recall_at_1000_max
value: 29.14566206426889
- type: nauc_recall_at_1000_std
value: 49.20704909525879
- type: nauc_recall_at_100_diff1
value: 0.1127119672553316
- type: nauc_recall_at_100_max
value: 17.37584606388062
- type: nauc_recall_at_100_std
value: 34.660993462440686
- type: nauc_recall_at_10_diff1
value: 2.9923183951227927
- type: nauc_recall_at_10_max
value: 14.261884731124299
- type: nauc_recall_at_10_std
value: 18.08418879549837
- type: nauc_recall_at_1_diff1
value: 76.51940497836482
- type: nauc_recall_at_1_max
value: 51.251419487235474
- type: nauc_recall_at_1_std
value: 0.16714896857146574
- type: nauc_recall_at_20_diff1
value: 1.918029300830432
- type: nauc_recall_at_20_max
value: 13.832269193468566
- type: nauc_recall_at_20_std
value: 21.65284406055605
- type: nauc_recall_at_3_diff1
value: 7.226609484731802
- type: nauc_recall_at_3_max
value: 15.162908526977182
- type: nauc_recall_at_3_std
value: 8.451859972962634
- type: nauc_recall_at_5_diff1
value: 4.705236845538197
- type: nauc_recall_at_5_max
value: 14.02291084358265
- type: nauc_recall_at_5_std
value: 11.777269322821638
- type: ndcg_at_1
value: 83.45700000000001
- type: ndcg_at_10
value: 71.74199999999999
- type: ndcg_at_100
value: 75.008
- type: ndcg_at_1000
value: 76.242
- type: ndcg_at_20
value: 73.114
- type: ndcg_at_3
value: 67.128
- type: ndcg_at_5
value: 69.645
- type: precision_at_1
value: 83.45700000000001
- type: precision_at_10
value: 14.747
- type: precision_at_100
value: 1.73
- type: precision_at_1000
value: 0.189
- type: precision_at_20
value: 7.8149999999999995
- type: precision_at_3
value: 42.323
- type: precision_at_5
value: 27.381
- type: recall_at_1
value: 41.729
- type: recall_at_10
value: 73.734
- type: recall_at_100
value: 86.502
- type: recall_at_1000
value: 94.60499999999999
- type: recall_at_20
value: 78.14999999999999
- type: recall_at_3
value: 63.483999999999995
- type: recall_at_5
value: 68.45400000000001
task:
type: Retrieval
- dataset:
config: default
name: MTEB ImdbClassification
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
split: test
type: mteb/imdb
metrics:
- type: accuracy
value: 96.4904
- type: ap
value: 94.85481918794709
- type: ap_weighted
value: 94.85481918794709
- type: f1
value: 96.4898592305707
- type: f1_weighted
value: 96.4898592305707
- type: main_score
value: 96.4904
task:
type: Classification
- dataset:
config: default
name: MTEB MSMARCO
revision: c5a29a104738b98a9e76336939199e264163d4a0
split: dev
type: mteb/msmarco
metrics:
- type: main_score
value: 43.692
- type: map_at_1
value: 23.751
- type: map_at_10
value: 36.553999999999995
- type: map_at_100
value: 37.721
- type: map_at_1000
value: 37.763999999999996
- type: map_at_20
value: 37.289
- type: map_at_3
value: 32.643
- type: map_at_5
value: 34.851
- type: mrr_at_1
value: 24.455587392550143
- type: mrr_at_10
value: 37.18388706963206
- type: mrr_at_100
value: 38.28330737932916
- type: mrr_at_1000
value: 38.32054399710817
- type: mrr_at_20
value: 37.8818001216278
- type: mrr_at_3
value: 33.35721107927405
- type: mrr_at_5
value: 35.52483285577843
- type: nauc_map_at_1000_diff1
value: 36.3576177260684
- type: nauc_map_at_1000_max
value: 7.854511605962703
- type: nauc_map_at_1000_std
value: -17.701121059746878
- type: nauc_map_at_100_diff1
value: 36.356075649230505
- type: nauc_map_at_100_max
value: 7.862168042999533
- type: nauc_map_at_100_std
value: -17.670102459097233
- type: nauc_map_at_10_diff1
value: 36.22122978875574
- type: nauc_map_at_10_max
value: 7.80848606967416
- type: nauc_map_at_10_std
value: -18.3265151386167
- type: nauc_map_at_1_diff1
value: 39.28605466408357
- type: nauc_map_at_1_max
value: 6.20202977590459
- type: nauc_map_at_1_std
value: -15.734334090045026
- type: nauc_map_at_20_diff1
value: 36.33637880909657
- type: nauc_map_at_20_max
value: 7.843437969476022
- type: nauc_map_at_20_std
value: -17.917533363025996
- type: nauc_map_at_3_diff1
value: 36.24864976076741
- type: nauc_map_at_3_max
value: 7.420345251835957
- type: nauc_map_at_3_std
value: -18.71678497722944
- type: nauc_map_at_5_diff1
value: 36.0789619291824
- type: nauc_map_at_5_max
value: 7.7314285669514495
- type: nauc_map_at_5_std
value: -18.748688764538706
- type: nauc_mrr_at_1000_diff1
value: 36.23912675623378
- type: nauc_mrr_at_1000_max
value: 7.690553436255147
- type: nauc_mrr_at_1000_std
value: -17.609526070212304
- type: nauc_mrr_at_100_diff1
value: 36.23782651189002
- type: nauc_mrr_at_100_max
value: 7.70075095171647
- type: nauc_mrr_at_100_std
value: -17.575714144960184
- type: nauc_mrr_at_10_diff1
value: 36.125229472534215
- type: nauc_mrr_at_10_max
value: 7.635472248755658
- type: nauc_mrr_at_10_std
value: -18.208166616511086
- type: nauc_mrr_at_1_diff1
value: 39.20986875554532
- type: nauc_mrr_at_1_max
value: 6.062668487561363
- type: nauc_mrr_at_1_std
value: -16.04130340817602
- type: nauc_mrr_at_20_diff1
value: 36.21207088739667
- type: nauc_mrr_at_20_max
value: 7.699610250145951
- type: nauc_mrr_at_20_std
value: -17.778245221724028
- type: nauc_mrr_at_3_diff1
value: 36.03957583885305
- type: nauc_mrr_at_3_max
value: 7.225515576504581
- type: nauc_mrr_at_3_std
value: -18.74478742943741
- type: nauc_mrr_at_5_diff1
value: 35.969152496648974
- type: nauc_mrr_at_5_max
value: 7.584059789018233
- type: nauc_mrr_at_5_std
value: -18.569374723129332
- type: nauc_ndcg_at_1000_diff1
value: 35.894655529841806
- type: nauc_ndcg_at_1000_max
value: 8.579327424366236
- type: nauc_ndcg_at_1000_std
value: -16.359677367747896
- type: nauc_ndcg_at_100_diff1
value: 35.89861902483983
- type: nauc_ndcg_at_100_max
value: 8.830873623962242
- type: nauc_ndcg_at_100_std
value: -15.173125564722978
- type: nauc_ndcg_at_10_diff1
value: 35.36499811105169
- type: nauc_ndcg_at_10_max
value: 8.449267180956992
- type: nauc_ndcg_at_10_std
value: -18.41978802362402
- type: nauc_ndcg_at_1_diff1
value: 39.15422481210622
- type: nauc_ndcg_at_1_max
value: 6.055515791928331
- type: nauc_ndcg_at_1_std
value: -16.042779610876252
- type: nauc_ndcg_at_20_diff1
value: 35.73402868264468
- type: nauc_ndcg_at_20_max
value: 8.695705518210847
- type: nauc_ndcg_at_20_std
value: -16.7735829470466
- type: nauc_ndcg_at_3_diff1
value: 35.31358242856231
- type: nauc_ndcg_at_3_max
value: 7.645692789058997
- type: nauc_ndcg_at_3_std
value: -19.460003734786874
- type: nauc_ndcg_at_5_diff1
value: 35.05216588927143
- type: nauc_ndcg_at_5_max
value: 8.216690520604715
- type: nauc_ndcg_at_5_std
value: -19.3982054492159
- type: nauc_precision_at_1000_diff1
value: -4.440002625111349
- type: nauc_precision_at_1000_max
value: 7.886988951901723
- type: nauc_precision_at_1000_std
value: 9.88111187048247
- type: nauc_precision_at_100_diff1
value: 15.728286119463325
- type: nauc_precision_at_100_max
value: 13.218650824470654
- type: nauc_precision_at_100_std
value: 16.113245895522553
- type: nauc_precision_at_10_diff1
value: 29.51218489610567
- type: nauc_precision_at_10_max
value: 10.197432401942912
- type: nauc_precision_at_10_std
value: -16.950603431359493
- type: nauc_precision_at_1_diff1
value: 39.15422481210622
- type: nauc_precision_at_1_max
value: 6.055515791928331
- type: nauc_precision_at_1_std
value: -16.042779610876252
- type: nauc_precision_at_20_diff1
value: 27.825993070397338
- type: nauc_precision_at_20_max
value: 11.437632287846007
- type: nauc_precision_at_20_std
value: -7.450353566405601
- type: nauc_precision_at_3_diff1
value: 32.14135556796588
- type: nauc_precision_at_3_max
value: 7.989252443574163
- type: nauc_precision_at_3_std
value: -21.566254595671055
- type: nauc_precision_at_5_diff1
value: 30.68778685307082
- type: nauc_precision_at_5_max
value: 9.332160758499892
- type: nauc_precision_at_5_std
value: -20.928554713448914
- type: nauc_recall_at_1000_diff1
value: 25.00810478716878
- type: nauc_recall_at_1000_max
value: 46.518165765201644
- type: nauc_recall_at_1000_std
value: 61.4734635576085
- type: nauc_recall_at_100_diff1
value: 33.895581318261726
- type: nauc_recall_at_100_max
value: 20.10706035872801
- type: nauc_recall_at_100_std
value: 24.204226584457047
- type: nauc_recall_at_10_diff1
value: 32.363127359576296
- type: nauc_recall_at_10_max
value: 10.729923804989545
- type: nauc_recall_at_10_std
value: -18.1335370184202
- type: nauc_recall_at_1_diff1
value: 39.28605466408357
- type: nauc_recall_at_1_max
value: 6.20202977590459
- type: nauc_recall_at_1_std
value: -15.734334090045026
- type: nauc_recall_at_20_diff1
value: 33.47804003169795
- type: nauc_recall_at_20_max
value: 12.781494765263382
- type: nauc_recall_at_20_std
value: -9.263970132202658
- type: nauc_recall_at_3_diff1
value: 32.71001429428999
- type: nauc_recall_at_3_max
value: 8.353439197382693
- type: nauc_recall_at_3_std
value: -21.235097744366954
- type: nauc_recall_at_5_diff1
value: 31.87451464963415
- type: nauc_recall_at_5_max
value: 9.635051450907305
- type: nauc_recall_at_5_std
value: -21.113235357132794
- type: ndcg_at_1
value: 24.47
- type: ndcg_at_10
value: 43.692
- type: ndcg_at_100
value: 49.211
- type: ndcg_at_1000
value: 50.244
- type: ndcg_at_20
value: 46.278000000000006
- type: ndcg_at_3
value: 35.719
- type: ndcg_at_5
value: 39.652
- type: precision_at_1
value: 24.47
- type: precision_at_10
value: 6.857
- type: precision_at_100
value: 0.9610000000000001
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 3.968
- type: precision_at_3
value: 15.181000000000001
- type: precision_at_5
value: 11.117
- type: recall_at_1
value: 23.751
- type: recall_at_10
value: 65.64
- type: recall_at_100
value: 90.967
- type: recall_at_1000
value: 98.738
- type: recall_at_20
value: 75.639
- type: recall_at_3
value: 43.927
- type: recall_at_5
value: 53.366
task:
type: Retrieval
- dataset:
config: en
name: MTEB MTOPDomainClassification (en)
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
split: test
type: mteb/mtop_domain
metrics:
- type: accuracy
value: 98.82580939352485
- type: f1
value: 98.75201754333801
- type: f1_weighted
value: 98.82795205108245
- type: main_score
value: 98.82580939352485
task:
type: Classification
- dataset:
config: en
name: MTEB MTOPIntentClassification (en)
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
split: test
type: mteb/mtop_intent
metrics:
- type: accuracy
value: 92.29822161422709
- type: f1
value: 77.75210224871594
- type: f1_weighted
value: 93.58661422540348
- type: main_score
value: 92.29822161422709
task:
type: Classification
- dataset:
config: en
name: MTEB MassiveIntentClassification (en)
revision: 4672e20407010da34463acc759c162ca9734bca6
split: test
type: mteb/amazon_massive_intent
metrics:
- type: accuracy
value: 85.17484868863484
- type: f1
value: 81.94484244487094
- type: f1_weighted
value: 85.21022593423332
- type: main_score
value: 85.17484868863484
task:
type: Classification
- dataset:
config: en
name: MTEB MassiveScenarioClassification (en)
revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8
split: test
type: mteb/amazon_massive_scenario
metrics:
- type: accuracy
value: 89.61667787491594
- type: f1
value: 89.02701927621264
- type: f1_weighted
value: 89.56306982022801
- type: main_score
value: 89.61667787491594
task:
type: Classification
- dataset:
config: default
name: MTEB MedrxivClusteringP2P
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
split: test
type: mteb/medrxiv-clustering-p2p
metrics:
- type: main_score
value: 46.318282423948574
- type: v_measure
value: 46.318282423948574
- type: v_measure_std
value: 0.9729055662461538
task:
type: Clustering
- dataset:
config: default
name: MTEB MedrxivClusteringS2S
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
split: test
type: mteb/medrxiv-clustering-s2s
metrics:
- type: main_score
value: 44.29033625273981
- type: v_measure
value: 44.29033625273981
- type: v_measure_std
value: 1.0596383629128594
task:
type: Clustering
- dataset:
config: default
name: MTEB MindSmallReranking
revision: 59042f120c80e8afa9cdbb224f67076cec0fc9a7
split: test
type: mteb/mind_small
metrics:
- type: main_score
value: 33.0526129239962
- type: map
value: 33.0526129239962
- type: mrr
value: 34.29260046890935
- type: nAUC_map_diff1
value: 12.579738077238032
- type: nAUC_map_max
value: -20.936629344962
- type: nAUC_map_std
value: -1.6096805784945216
- type: nAUC_mrr_diff1
value: 11.597584463580807
- type: nAUC_mrr_max
value: -15.723702838537504
- type: nAUC_mrr_std
value: 0.2719172965777737
task:
type: Reranking
- dataset:
config: default
name: MTEB NFCorpus
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
split: test
type: mteb/nfcorpus
metrics:
- type: main_score
value: 41.486000000000004
- type: map_at_1
value: 6.866
- type: map_at_10
value: 15.895999999999999
- type: map_at_100
value: 21.093
- type: map_at_1000
value: 23.067
- type: map_at_20
value: 18.125
- type: map_at_3
value: 11.421000000000001
- type: map_at_5
value: 13.415
- type: mrr_at_1
value: 52.63157894736842
- type: mrr_at_10
value: 61.486805248415166
- type: mrr_at_100
value: 62.08211009182091
- type: mrr_at_1000
value: 62.10828701365016
- type: mrr_at_20
value: 61.904411187915784
- type: mrr_at_3
value: 59.90712074303407
- type: mrr_at_5
value: 60.91331269349847
- type: nauc_map_at_1000_diff1
value: 25.484625278529403
- type: nauc_map_at_1000_max
value: 31.206600396418853
- type: nauc_map_at_1000_std
value: 15.569448072357156
- type: nauc_map_at_100_diff1
value: 27.636750226316764
- type: nauc_map_at_100_max
value: 29.66992681250722
- type: nauc_map_at_100_std
value: 10.570600484002671
- type: nauc_map_at_10_diff1
value: 32.76642525548697
- type: nauc_map_at_10_max
value: 21.459225397237663
- type: nauc_map_at_10_std
value: -3.546494734209264
- type: nauc_map_at_1_diff1
value: 48.8002894871328
- type: nauc_map_at_1_max
value: 5.7236722609868815
- type: nauc_map_at_1_std
value: -13.283554044471352
- type: nauc_map_at_20_diff1
value: 30.57169701502308
- type: nauc_map_at_20_max
value: 25.79666139518404
- type: nauc_map_at_20_std
value: 1.781732492989651
- type: nauc_map_at_3_diff1
value: 40.076315947201095
- type: nauc_map_at_3_max
value: 12.862524429140054
- type: nauc_map_at_3_std
value: -9.188349777126817
- type: nauc_map_at_5_diff1
value: 36.9918718052938
- type: nauc_map_at_5_max
value: 16.74234374361876
- type: nauc_map_at_5_std
value: -7.818523349307494
- type: nauc_mrr_at_1000_diff1
value: 26.88183002609805
- type: nauc_mrr_at_1000_max
value: 47.10209348428658
- type: nauc_mrr_at_1000_std
value: 32.067825924992924
- type: nauc_mrr_at_100_diff1
value: 26.871482491566745
- type: nauc_mrr_at_100_max
value: 47.11303868498556
- type: nauc_mrr_at_100_std
value: 32.08961428818868
- type: nauc_mrr_at_10_diff1
value: 26.6356914977722
- type: nauc_mrr_at_10_max
value: 47.091624558810366
- type: nauc_mrr_at_10_std
value: 31.942424120660164
- type: nauc_mrr_at_1_diff1
value: 28.19774198483673
- type: nauc_mrr_at_1_max
value: 41.44380927834253
- type: nauc_mrr_at_1_std
value: 25.18222691885917
- type: nauc_mrr_at_20_diff1
value: 26.86487347109452
- type: nauc_mrr_at_20_max
value: 47.1987778214726
- type: nauc_mrr_at_20_std
value: 32.143517921610034
- type: nauc_mrr_at_3_diff1
value: 27.34340373236422
- type: nauc_mrr_at_3_max
value: 46.358726506276646
- type: nauc_mrr_at_3_std
value: 31.74924155572593
- type: nauc_mrr_at_5_diff1
value: 27.209667205060672
- type: nauc_mrr_at_5_max
value: 46.79883369072009
- type: nauc_mrr_at_5_std
value: 31.655605306670758
- type: nauc_ndcg_at_1000_diff1
value: 18.940195769769687
- type: nauc_ndcg_at_1000_max
value: 46.48551313937331
- type: nauc_ndcg_at_1000_std
value: 33.64819502089232
- type: nauc_ndcg_at_100_diff1
value: 19.50885253809146
- type: nauc_ndcg_at_100_max
value: 40.53174462354878
- type: nauc_ndcg_at_100_std
value: 28.516152877751118
- type: nauc_ndcg_at_10_diff1
value: 16.01699218096564
- type: nauc_ndcg_at_10_max
value: 41.17322878314514
- type: nauc_ndcg_at_10_std
value: 29.002233224832196
- type: nauc_ndcg_at_1_diff1
value: 27.443547710102205
- type: nauc_ndcg_at_1_max
value: 40.66529763309582
- type: nauc_ndcg_at_1_std
value: 24.15016766225869
- type: nauc_ndcg_at_20_diff1
value: 17.541197675685062
- type: nauc_ndcg_at_20_max
value: 40.53231266973844
- type: nauc_ndcg_at_20_std
value: 29.54096347876548
- type: nauc_ndcg_at_3_diff1
value: 18.649628357473716
- type: nauc_ndcg_at_3_max
value: 41.18603570171764
- type: nauc_ndcg_at_3_std
value: 27.125524188420396
- type: nauc_ndcg_at_5_diff1
value: 17.519593751448483
- type: nauc_ndcg_at_5_max
value: 42.715997890377345
- type: nauc_ndcg_at_5_std
value: 27.902627839899868
- type: nauc_precision_at_1000_diff1
value: -15.528797630565155
- type: nauc_precision_at_1000_max
value: 13.741640921778671
- type: nauc_precision_at_1000_std
value: 44.50896053788372
- type: nauc_precision_at_100_diff1
value: -14.491464489721887
- type: nauc_precision_at_100_max
value: 23.136434418999457
- type: nauc_precision_at_100_std
value: 49.73145147863128
- type: nauc_precision_at_10_diff1
value: -4.829188942994277
- type: nauc_precision_at_10_max
value: 40.327612559528866
- type: nauc_precision_at_10_std
value: 39.34919529635044
- type: nauc_precision_at_1_diff1
value: 28.19774198483673
- type: nauc_precision_at_1_max
value: 41.44380927834253
- type: nauc_precision_at_1_std
value: 25.18222691885917
- type: nauc_precision_at_20_diff1
value: -7.210726293112847
- type: nauc_precision_at_20_max
value: 37.195679576636984
- type: nauc_precision_at_20_std
value: 45.4597096418357
- type: nauc_precision_at_3_diff1
value: 7.578219537774854
- type: nauc_precision_at_3_max
value: 41.59775233475654
- type: nauc_precision_at_3_std
value: 30.764584790895118
- type: nauc_precision_at_5_diff1
value: 1.655451789039598
- type: nauc_precision_at_5_max
value: 43.435739407610455
- type: nauc_precision_at_5_std
value: 33.42552263325999
- type: nauc_recall_at_1000_diff1
value: 5.030705700690516
- type: nauc_recall_at_1000_max
value: 19.108072570815583
- type: nauc_recall_at_1000_std
value: 14.697734974217308
- type: nauc_recall_at_100_diff1
value: 14.746540318132407
- type: nauc_recall_at_100_max
value: 21.798705033854795
- type: nauc_recall_at_100_std
value: 11.416195108842587
- type: nauc_recall_at_10_diff1
value: 25.548642427860486
- type: nauc_recall_at_10_max
value: 18.711677681987474
- type: nauc_recall_at_10_std
value: -5.988904818971677
- type: nauc_recall_at_1_diff1
value: 48.8002894871328
- type: nauc_recall_at_1_max
value: 5.7236722609868815
- type: nauc_recall_at_1_std
value: -13.283554044471352
- type: nauc_recall_at_20_diff1
value: 23.39140739154809
- type: nauc_recall_at_20_max
value: 19.351150636155474
- type: nauc_recall_at_20_std
value: -2.757280266915132
- type: nauc_recall_at_3_diff1
value: 38.17453576012812
- type: nauc_recall_at_3_max
value: 13.47003839643972
- type: nauc_recall_at_3_std
value: -8.75780163862688
- type: nauc_recall_at_5_diff1
value: 33.02812855226899
- type: nauc_recall_at_5_max
value: 15.477626408978477
- type: nauc_recall_at_5_std
value: -9.072206441070708
- type: ndcg_at_1
value: 50.773999999999994
- type: ndcg_at_10
value: 41.486000000000004
- type: ndcg_at_100
value: 39.051
- type: ndcg_at_1000
value: 48.106
- type: ndcg_at_20
value: 39.432
- type: ndcg_at_3
value: 47.428
- type: ndcg_at_5
value: 45.227000000000004
- type: precision_at_1
value: 52.632
- type: precision_at_10
value: 31.146
- type: precision_at_100
value: 10.328
- type: precision_at_1000
value: 2.432
- type: precision_at_20
value: 23.793
- type: precision_at_3
value: 45.201
- type: precision_at_5
value: 39.876
- type: recall_at_1
value: 6.866
- type: recall_at_10
value: 20.447000000000003
- type: recall_at_100
value: 40.607
- type: recall_at_1000
value: 73.411
- type: recall_at_20
value: 26.082
- type: recall_at_3
value: 12.484
- type: recall_at_5
value: 15.847
task:
type: Retrieval
- dataset:
config: default
name: MTEB NQ
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
split: test
type: mteb/nq
metrics:
- type: main_score
value: 69.072
- type: map_at_1
value: 45.483000000000004
- type: map_at_10
value: 62.050000000000004
- type: map_at_100
value: 62.693
- type: map_at_1000
value: 62.702999999999996
- type: map_at_20
value: 62.498
- type: map_at_3
value: 58.285
- type: map_at_5
value: 60.711000000000006
- type: mrr_at_1
value: 50.840092699884124
- type: mrr_at_10
value: 64.54635224116673
- type: mrr_at_100
value: 64.9526548702289
- type: mrr_at_1000
value: 64.95908460752281
- type: mrr_at_20
value: 64.82949565799959
- type: mrr_at_3
value: 61.89165701042856
- type: mrr_at_5
value: 63.632676709154026
- type: nauc_map_at_1000_diff1
value: 43.187285304185224
- type: nauc_map_at_1000_max
value: 32.39921659632756
- type: nauc_map_at_1000_std
value: -5.780901333066553
- type: nauc_map_at_100_diff1
value: 43.184487221204456
- type: nauc_map_at_100_max
value: 32.41176116347982
- type: nauc_map_at_100_std
value: -5.76422606662383
- type: nauc_map_at_10_diff1
value: 42.967066814031746
- type: nauc_map_at_10_max
value: 32.489617364418514
- type: nauc_map_at_10_std
value: -6.029045531102664
- type: nauc_map_at_1_diff1
value: 46.16376563218624
- type: nauc_map_at_1_max
value: 26.342624776802232
- type: nauc_map_at_1_std
value: -7.142171388751972
- type: nauc_map_at_20_diff1
value: 43.15894358608328
- type: nauc_map_at_20_max
value: 32.46492198956245
- type: nauc_map_at_20_std
value: -5.788373305449195
- type: nauc_map_at_3_diff1
value: 43.231752344608545
- type: nauc_map_at_3_max
value: 31.68003009949564
- type: nauc_map_at_3_std
value: -8.015235132765458
- type: nauc_map_at_5_diff1
value: 42.86197608819917
- type: nauc_map_at_5_max
value: 32.363857571094485
- type: nauc_map_at_5_std
value: -6.780487416387977
- type: nauc_mrr_at_1000_diff1
value: 43.40542912045782
- type: nauc_mrr_at_1000_max
value: 32.8461770324533
- type: nauc_mrr_at_1000_std
value: -3.6505425530008204
- type: nauc_mrr_at_100_diff1
value: 43.40233508014468
- type: nauc_mrr_at_100_max
value: 32.85598538385942
- type: nauc_mrr_at_100_std
value: -3.637477352635459
- type: nauc_mrr_at_10_diff1
value: 43.260179162806054
- type: nauc_mrr_at_10_max
value: 32.942643527040474
- type: nauc_mrr_at_10_std
value: -3.712052825320437
- type: nauc_mrr_at_1_diff1
value: 46.354919460881206
- type: nauc_mrr_at_1_max
value: 29.1760258591106
- type: nauc_mrr_at_1_std
value: -4.107225031227406
- type: nauc_mrr_at_20_diff1
value: 43.37092385434311
- type: nauc_mrr_at_20_max
value: 32.93390254712846
- type: nauc_mrr_at_20_std
value: -3.5719056112132006
- type: nauc_mrr_at_3_diff1
value: 43.1744474040527
- type: nauc_mrr_at_3_max
value: 32.741290559777994
- type: nauc_mrr_at_3_std
value: -4.72677925120697
- type: nauc_mrr_at_5_diff1
value: 43.108396819975674
- type: nauc_mrr_at_5_max
value: 32.970519514893084
- type: nauc_mrr_at_5_std
value: -4.090906158975974
- type: nauc_ndcg_at_1000_diff1
value: 42.786664193638714
- type: nauc_ndcg_at_1000_max
value: 33.65554095609296
- type: nauc_ndcg_at_1000_std
value: -4.024030130584482
- type: nauc_ndcg_at_100_diff1
value: 42.691246775210814
- type: nauc_ndcg_at_100_max
value: 34.063232335110875
- type: nauc_ndcg_at_100_std
value: -3.477813807415248
- type: nauc_ndcg_at_10_diff1
value: 41.90988990571757
- type: nauc_ndcg_at_10_max
value: 34.58934812881633
- type: nauc_ndcg_at_10_std
value: -4.3295110195497655
- type: nauc_ndcg_at_1_diff1
value: 46.354919460881206
- type: nauc_ndcg_at_1_max
value: 29.1760258591106
- type: nauc_ndcg_at_1_std
value: -4.107225031227406
- type: nauc_ndcg_at_20_diff1
value: 42.493206675867114
- type: nauc_ndcg_at_20_max
value: 34.562441307459544
- type: nauc_ndcg_at_20_std
value: -3.4456116866749107
- type: nauc_ndcg_at_3_diff1
value: 42.24180336502808
- type: nauc_ndcg_at_3_max
value: 33.064267018100594
- type: nauc_ndcg_at_3_std
value: -7.786248093572142
- type: nauc_ndcg_at_5_diff1
value: 41.692714787779565
- type: nauc_ndcg_at_5_max
value: 34.20502498949156
- type: nauc_ndcg_at_5_std
value: -5.979557859282785
- type: nauc_precision_at_1000_diff1
value: -13.779832506640702
- type: nauc_precision_at_1000_max
value: 1.243001688631421
- type: nauc_precision_at_1000_std
value: 17.351623398622323
- type: nauc_precision_at_100_diff1
value: -11.310526816290297
- type: nauc_precision_at_100_max
value: 5.771669506192959
- type: nauc_precision_at_100_std
value: 19.917795079540113
- type: nauc_precision_at_10_diff1
value: 2.163699384635286
- type: nauc_precision_at_10_max
value: 19.66440698458386
- type: nauc_precision_at_10_std
value: 13.689876348315726
- type: nauc_precision_at_1_diff1
value: 46.354919460881206
- type: nauc_precision_at_1_max
value: 29.1760258591106
- type: nauc_precision_at_1_std
value: -4.107225031227406
- type: nauc_precision_at_20_diff1
value: -3.038735879584471
- type: nauc_precision_at_20_max
value: 14.132968299701695
- type: nauc_precision_at_20_std
value: 17.78069734664346
- type: nauc_precision_at_3_diff1
value: 21.783760758070095
- type: nauc_precision_at_3_max
value: 30.244127986404497
- type: nauc_precision_at_3_std
value: -0.12411163467738723
- type: nauc_precision_at_5_diff1
value: 10.980635723302418
- type: nauc_precision_at_5_max
value: 25.302293738975575
- type: nauc_precision_at_5_std
value: 6.4740817488722024
- type: nauc_recall_at_1000_diff1
value: 34.10343772356593
- type: nauc_recall_at_1000_max
value: 80.72497340357538
- type: nauc_recall_at_1000_std
value: 69.54564103264093
- type: nauc_recall_at_100_diff1
value: 33.427719956774126
- type: nauc_recall_at_100_max
value: 71.54086768335449
- type: nauc_recall_at_100_std
value: 49.66157377654885
- type: nauc_recall_at_10_diff1
value: 33.70139560054039
- type: nauc_recall_at_10_max
value: 45.47878072860151
- type: nauc_recall_at_10_std
value: 1.4188516615716378
- type: nauc_recall_at_1_diff1
value: 46.16376563218624
- type: nauc_recall_at_1_max
value: 26.342624776802232
- type: nauc_recall_at_1_std
value: -7.142171388751972
- type: nauc_recall_at_20_diff1
value: 35.805379874970086
- type: nauc_recall_at_20_max
value: 51.80479822253392
- type: nauc_recall_at_20_std
value: 13.531467576460143
- type: nauc_recall_at_3_diff1
value: 37.288500141631616
- type: nauc_recall_at_3_max
value: 35.07078243516728
- type: nauc_recall_at_3_std
value: -10.452926441410405
- type: nauc_recall_at_5_diff1
value: 34.83186104526897
- type: nauc_recall_at_5_max
value: 39.58488976496973
- type: nauc_recall_at_5_std
value: -6.3049292065708835
- type: ndcg_at_1
value: 50.839999999999996
- type: ndcg_at_10
value: 69.072
- type: ndcg_at_100
value: 71.538
- type: ndcg_at_1000
value: 71.77799999999999
- type: ndcg_at_20
value: 70.41
- type: ndcg_at_3
value: 62.544999999999995
- type: ndcg_at_5
value: 66.33099999999999
- type: precision_at_1
value: 50.839999999999996
- type: precision_at_10
value: 10.495000000000001
- type: precision_at_100
value: 1.1900000000000002
- type: precision_at_1000
value: 0.121
- type: precision_at_20
value: 5.5809999999999995
- type: precision_at_3
value: 27.636
- type: precision_at_5
value: 18.864
- type: recall_at_1
value: 45.483000000000004
- type: recall_at_10
value: 87.483
- type: recall_at_100
value: 97.844
- type: recall_at_1000
value: 99.66199999999999
- type: recall_at_20
value: 92.294
- type: recall_at_3
value: 71.2
- type: recall_at_5
value: 79.753
task:
type: Retrieval
- dataset:
config: default
name: MTEB QuoraRetrieval
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
split: test
type: mteb/quora
metrics:
- type: main_score
value: 89.58
- type: map_at_1
value: 71.819
- type: map_at_10
value: 86.04899999999999
- type: map_at_100
value: 86.648
- type: map_at_1000
value: 86.66199999999999
- type: map_at_20
value: 86.441
- type: map_at_3
value: 83.114
- type: map_at_5
value: 84.981
- type: mrr_at_1
value: 82.62
- type: mrr_at_10
value: 88.62899999999979
- type: mrr_at_100
value: 88.70918591324215
- type: mrr_at_1000
value: 88.70973091492397
- type: mrr_at_20
value: 88.68914765317221
- type: mrr_at_3
value: 87.74999999999979
- type: mrr_at_5
value: 88.36799999999974
- type: nauc_map_at_1000_diff1
value: 77.89207709760448
- type: nauc_map_at_1000_max
value: 29.63371361495422
- type: nauc_map_at_1000_std
value: -48.628180385874344
- type: nauc_map_at_100_diff1
value: 77.89592179104915
- type: nauc_map_at_100_max
value: 29.617171506130756
- type: nauc_map_at_100_std
value: -48.66057170774648
- type: nauc_map_at_10_diff1
value: 78.0618161228185
- type: nauc_map_at_10_max
value: 29.178490609366737
- type: nauc_map_at_10_std
value: -50.74755004592002
- type: nauc_map_at_1_diff1
value: 81.64335579973574
- type: nauc_map_at_1_max
value: 21.813832226652174
- type: nauc_map_at_1_std
value: -42.57570978190876
- type: nauc_map_at_20_diff1
value: 77.9299081005938
- type: nauc_map_at_20_max
value: 29.458718470003888
- type: nauc_map_at_20_std
value: -49.63337236763102
- type: nauc_map_at_3_diff1
value: 78.72941448509229
- type: nauc_map_at_3_max
value: 26.600997896960056
- type: nauc_map_at_3_std
value: -51.889002227479885
- type: nauc_map_at_5_diff1
value: 78.31466610917171
- type: nauc_map_at_5_max
value: 28.09863984582896
- type: nauc_map_at_5_std
value: -52.14058096096497
- type: nauc_mrr_at_1000_diff1
value: 78.42667263739992
- type: nauc_mrr_at_1000_max
value: 31.98996235127974
- type: nauc_mrr_at_1000_std
value: -44.380439148429296
- type: nauc_mrr_at_100_diff1
value: 78.42661032698115
- type: nauc_mrr_at_100_max
value: 31.991652631740102
- type: nauc_mrr_at_100_std
value: -44.37854108460535
- type: nauc_mrr_at_10_diff1
value: 78.39126022544136
- type: nauc_mrr_at_10_max
value: 32.02023484451197
- type: nauc_mrr_at_10_std
value: -44.561252349176954
- type: nauc_mrr_at_1_diff1
value: 79.21630894647448
- type: nauc_mrr_at_1_max
value: 31.526303156060177
- type: nauc_mrr_at_1_std
value: -41.887504422443136
- type: nauc_mrr_at_20_diff1
value: 78.42548039170424
- type: nauc_mrr_at_20_max
value: 31.99588275070137
- type: nauc_mrr_at_20_std
value: -44.44957722627042
- type: nauc_mrr_at_3_diff1
value: 78.26165151833735
- type: nauc_mrr_at_3_max
value: 32.18028826126801
- type: nauc_mrr_at_3_std
value: -44.6998237213182
- type: nauc_mrr_at_5_diff1
value: 78.34786430903962
- type: nauc_mrr_at_5_max
value: 32.168476272879566
- type: nauc_mrr_at_5_std
value: -44.7915919956712
- type: nauc_ndcg_at_1000_diff1
value: 77.79198355957816
- type: nauc_ndcg_at_1000_max
value: 31.14363511518406
- type: nauc_ndcg_at_1000_std
value: -46.69335151274275
- type: nauc_ndcg_at_100_diff1
value: 77.79898090286419
- type: nauc_ndcg_at_100_max
value: 31.115103811629215
- type: nauc_ndcg_at_100_std
value: -46.73078913421965
- type: nauc_ndcg_at_10_diff1
value: 77.74856635461343
- type: nauc_ndcg_at_10_max
value: 30.279584686212747
- type: nauc_ndcg_at_10_std
value: -50.23514662356807
- type: nauc_ndcg_at_1_diff1
value: 79.17833000040999
- type: nauc_ndcg_at_1_max
value: 31.703788144510746
- type: nauc_ndcg_at_1_std
value: -41.854817402870715
- type: nauc_ndcg_at_20_diff1
value: 77.7380353804671
- type: nauc_ndcg_at_20_max
value: 30.622294129001553
- type: nauc_ndcg_at_20_std
value: -49.035794761065254
- type: nauc_ndcg_at_3_diff1
value: 77.41476880573593
- type: nauc_ndcg_at_3_max
value: 29.015949978243032
- type: nauc_ndcg_at_3_std
value: -49.78627087622648
- type: nauc_ndcg_at_5_diff1
value: 77.64439137502896
- type: nauc_ndcg_at_5_max
value: 29.444684897492206
- type: nauc_ndcg_at_5_std
value: -51.21908400252501
- type: nauc_precision_at_1000_diff1
value: -44.92396459446822
- type: nauc_precision_at_1000_max
value: -3.674153720989045
- type: nauc_precision_at_1000_std
value: 39.56552468277785
- type: nauc_precision_at_100_diff1
value: -44.75143023259094
- type: nauc_precision_at_100_max
value: -3.705280025140011
- type: nauc_precision_at_100_std
value: 39.433619999113326
- type: nauc_precision_at_10_diff1
value: -41.0651074726579
- type: nauc_precision_at_10_max
value: -0.21097985601783667
- type: nauc_precision_at_10_std
value: 26.24652824589493
- type: nauc_precision_at_1_diff1
value: 79.17833000040999
- type: nauc_precision_at_1_max
value: 31.703788144510746
- type: nauc_precision_at_1_std
value: -41.854817402870715
- type: nauc_precision_at_20_diff1
value: -43.368001340920294
- type: nauc_precision_at_20_max
value: -2.036990010399129
- type: nauc_precision_at_20_std
value: 32.37747041406297
- type: nauc_precision_at_3_diff1
value: -22.089307548346877
- type: nauc_precision_at_3_max
value: 6.2280973175296
- type: nauc_precision_at_3_std
value: 5.323992514036145
- type: nauc_precision_at_5_diff1
value: -34.07115055244003
- type: nauc_precision_at_5_max
value: 2.5955315789198834
- type: nauc_precision_at_5_std
value: 16.26096689407332
- type: nauc_recall_at_1000_diff1
value: 58.27703860947467
- type: nauc_recall_at_1000_max
value: 68.59835835315768
- type: nauc_recall_at_1000_std
value: 77.96687006056064
- type: nauc_recall_at_100_diff1
value: 73.24371223081737
- type: nauc_recall_at_100_max
value: 39.55925344664591
- type: nauc_recall_at_100_std
value: -32.25605030215798
- type: nauc_recall_at_10_diff1
value: 73.41261201339202
- type: nauc_recall_at_10_max
value: 26.822979434062926
- type: nauc_recall_at_10_std
value: -74.2909332592806
- type: nauc_recall_at_1_diff1
value: 81.64335579973574
- type: nauc_recall_at_1_max
value: 21.813832226652174
- type: nauc_recall_at_1_std
value: -42.57570978190876
- type: nauc_recall_at_20_diff1
value: 72.7621297920656
- type: nauc_recall_at_20_max
value: 26.02492304096079
- type: nauc_recall_at_20_std
value: -77.8724532438279
- type: nauc_recall_at_3_diff1
value: 75.25149312810714
- type: nauc_recall_at_3_max
value: 23.20545662481487
- type: nauc_recall_at_3_std
value: -59.69689982140521
- type: nauc_recall_at_5_diff1
value: 73.69807273001406
- type: nauc_recall_at_5_max
value: 24.073666798066057
- type: nauc_recall_at_5_std
value: -67.91121268130719
- type: ndcg_at_1
value: 82.64
- type: ndcg_at_10
value: 89.58
- type: ndcg_at_100
value: 90.606
- type: ndcg_at_1000
value: 90.676
- type: ndcg_at_20
value: 90.132
- type: ndcg_at_3
value: 86.88
- type: ndcg_at_5
value: 88.40299999999999
- type: precision_at_1
value: 82.64
- type: precision_at_10
value: 13.604
- type: precision_at_100
value: 1.539
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.188
- type: precision_at_3
value: 38.083
- type: precision_at_5
value: 25.018
- type: recall_at_1
value: 71.819
- type: recall_at_10
value: 96.34700000000001
- type: recall_at_100
value: 99.715
- type: recall_at_1000
value: 99.995
- type: recall_at_20
value: 98.073
- type: recall_at_3
value: 88.57300000000001
- type: recall_at_5
value: 92.908
task:
type: Retrieval
- dataset:
config: default
name: MTEB RedditClustering
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
split: test
type: mteb/reddit-clustering
metrics:
- type: main_score
value: 71.18966762070158
- type: v_measure
value: 71.18966762070158
- type: v_measure_std
value: 2.7498969054457048
task:
type: Clustering
- dataset:
config: default
name: MTEB RedditClusteringP2P
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
split: test
type: mteb/reddit-clustering-p2p
metrics:
- type: main_score
value: 74.42014716862516
- type: v_measure
value: 74.42014716862516
- type: v_measure_std
value: 9.909739891410648
task:
type: Clustering
- dataset:
config: default
name: MTEB SCIDOCS
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
split: test
type: mteb/scidocs
metrics:
- type: main_score
value: 25.041999999999998
- type: map_at_1
value: 5.893000000000001
- type: map_at_10
value: 15.260000000000002
- type: map_at_100
value: 18.084
- type: map_at_1000
value: 18.467
- type: map_at_20
value: 16.675
- type: map_at_3
value: 10.526
- type: map_at_5
value: 12.775
- type: mrr_at_1
value: 28.999999999999996
- type: mrr_at_10
value: 41.03575396825395
- type: mrr_at_100
value: 42.136771862785835
- type: mrr_at_1000
value: 42.16698555415099
- type: mrr_at_20
value: 41.707493696104315
- type: mrr_at_3
value: 37.34999999999998
- type: mrr_at_5
value: 39.59999999999995
- type: nauc_map_at_1000_diff1
value: 12.080002654911883
- type: nauc_map_at_1000_max
value: 29.813563682286276
- type: nauc_map_at_1000_std
value: 20.36659817908673
- type: nauc_map_at_100_diff1
value: 12.108735517749706
- type: nauc_map_at_100_max
value: 29.76830671710955
- type: nauc_map_at_100_std
value: 20.3433621032846
- type: nauc_map_at_10_diff1
value: 12.91575031185637
- type: nauc_map_at_10_max
value: 29.427600958386318
- type: nauc_map_at_10_std
value: 16.89867275177153
- type: nauc_map_at_1_diff1
value: 19.353069488987916
- type: nauc_map_at_1_max
value: 17.093914951159693
- type: nauc_map_at_1_std
value: 8.19886078055046
- type: nauc_map_at_20_diff1
value: 11.977233457943113
- type: nauc_map_at_20_max
value: 29.171812822948805
- type: nauc_map_at_20_std
value: 18.780517506173965
- type: nauc_map_at_3_diff1
value: 14.453129464176092
- type: nauc_map_at_3_max
value: 25.801958649112077
- type: nauc_map_at_3_std
value: 11.572823684429643
- type: nauc_map_at_5_diff1
value: 13.167155808104997
- type: nauc_map_at_5_max
value: 27.355626948365792
- type: nauc_map_at_5_std
value: 14.414151839192183
- type: nauc_mrr_at_1000_diff1
value: 17.262104643988636
- type: nauc_mrr_at_1000_max
value: 23.991373837217058
- type: nauc_mrr_at_1000_std
value: 12.44755488671623
- type: nauc_mrr_at_100_diff1
value: 17.267280132318703
- type: nauc_mrr_at_100_max
value: 24.022189287889294
- type: nauc_mrr_at_100_std
value: 12.480695500214788
- type: nauc_mrr_at_10_diff1
value: 17.012383998246268
- type: nauc_mrr_at_10_max
value: 24.192637911171722
- type: nauc_mrr_at_10_std
value: 12.524608847408917
- type: nauc_mrr_at_1_diff1
value: 19.43518811038007
- type: nauc_mrr_at_1_max
value: 17.747482933395602
- type: nauc_mrr_at_1_std
value: 8.410779775558684
- type: nauc_mrr_at_20_diff1
value: 17.202663281407446
- type: nauc_mrr_at_20_max
value: 24.091991130543118
- type: nauc_mrr_at_20_std
value: 12.503814263019908
- type: nauc_mrr_at_3_diff1
value: 17.52733013432995
- type: nauc_mrr_at_3_max
value: 23.569459518780214
- type: nauc_mrr_at_3_std
value: 11.770846827520726
- type: nauc_mrr_at_5_diff1
value: 17.10817561975543
- type: nauc_mrr_at_5_max
value: 23.945141435234678
- type: nauc_mrr_at_5_std
value: 12.034468615317719
- type: nauc_ndcg_at_1000_diff1
value: 12.317811393346936
- type: nauc_ndcg_at_1000_max
value: 30.809991350156103
- type: nauc_ndcg_at_1000_std
value: 24.517501065205067
- type: nauc_ndcg_at_100_diff1
value: 12.824804203182936
- type: nauc_ndcg_at_100_max
value: 30.895499817010748
- type: nauc_ndcg_at_100_std
value: 25.424376279745402
- type: nauc_ndcg_at_10_diff1
value: 13.32724552457439
- type: nauc_ndcg_at_10_max
value: 30.409088666807456
- type: nauc_ndcg_at_10_std
value: 18.216330475714113
- type: nauc_ndcg_at_1_diff1
value: 19.43518811038007
- type: nauc_ndcg_at_1_max
value: 17.747482933395602
- type: nauc_ndcg_at_1_std
value: 8.410779775558684
- type: nauc_ndcg_at_20_diff1
value: 12.224399111852902
- type: nauc_ndcg_at_20_max
value: 29.86352330445272
- type: nauc_ndcg_at_20_std
value: 21.196937851331807
- type: nauc_ndcg_at_3_diff1
value: 15.367489533734027
- type: nauc_ndcg_at_3_max
value: 26.76486390741532
- type: nauc_ndcg_at_3_std
value: 12.606077508789923
- type: nauc_ndcg_at_5_diff1
value: 13.831157482390935
- type: nauc_ndcg_at_5_max
value: 28.070226983968904
- type: nauc_ndcg_at_5_std
value: 15.236787943125435
- type: nauc_precision_at_1000_diff1
value: 0.016122957101357048
- type: nauc_precision_at_1000_max
value: 24.380929903557334
- type: nauc_precision_at_1000_std
value: 34.54045112720052
- type: nauc_precision_at_100_diff1
value: 7.255224788507301
- type: nauc_precision_at_100_max
value: 27.98453788447542
- type: nauc_precision_at_100_std
value: 35.38999555441665
- type: nauc_precision_at_10_diff1
value: 9.69185099834181
- type: nauc_precision_at_10_max
value: 32.532315522580454
- type: nauc_precision_at_10_std
value: 21.48948348473612
- type: nauc_precision_at_1_diff1
value: 19.43518811038007
- type: nauc_precision_at_1_max
value: 17.747482933395602
- type: nauc_precision_at_1_std
value: 8.410779775558684
- type: nauc_precision_at_20_diff1
value: 6.964076536695672
- type: nauc_precision_at_20_max
value: 29.30087236410044
- type: nauc_precision_at_20_std
value: 26.413625895571986
- type: nauc_precision_at_3_diff1
value: 14.145134359925155
- type: nauc_precision_at_3_max
value: 29.915650960808303
- type: nauc_precision_at_3_std
value: 14.095370019867797
- type: nauc_precision_at_5_diff1
value: 11.043933558522692
- type: nauc_precision_at_5_max
value: 30.93016505807111
- type: nauc_precision_at_5_std
value: 17.749256196062603
- type: nauc_recall_at_1000_diff1
value: -0.7776817772090345
- type: nauc_recall_at_1000_max
value: 23.094717340324518
- type: nauc_recall_at_1000_std
value: 37.189908681396425
- type: nauc_recall_at_100_diff1
value: 6.887748742013364
- type: nauc_recall_at_100_max
value: 27.00798435230277
- type: nauc_recall_at_100_std
value: 35.908147807345344
- type: nauc_recall_at_10_diff1
value: 9.605632017480751
- type: nauc_recall_at_10_max
value: 31.845202901168655
- type: nauc_recall_at_10_std
value: 21.497414586634683
- type: nauc_recall_at_1_diff1
value: 19.353069488987916
- type: nauc_recall_at_1_max
value: 17.093914951159693
- type: nauc_recall_at_1_std
value: 8.19886078055046
- type: nauc_recall_at_20_diff1
value: 6.927503731844782
- type: nauc_recall_at_20_max
value: 28.611698183338202
- type: nauc_recall_at_20_std
value: 26.69018660149911
- type: nauc_recall_at_3_diff1
value: 14.043724087062268
- type: nauc_recall_at_3_max
value: 29.269835821380465
- type: nauc_recall_at_3_std
value: 14.104419605998094
- type: nauc_recall_at_5_diff1
value: 11.017319452873336
- type: nauc_recall_at_5_max
value: 30.295720628306228
- type: nauc_recall_at_5_std
value: 17.758048545573825
- type: ndcg_at_1
value: 28.999999999999996
- type: ndcg_at_10
value: 25.041999999999998
- type: ndcg_at_100
value: 35.045
- type: ndcg_at_1000
value: 40.803
- type: ndcg_at_20
value: 28.584
- type: ndcg_at_3
value: 23.249
- type: ndcg_at_5
value: 20.533
- type: precision_at_1
value: 28.999999999999996
- type: precision_at_10
value: 13.120000000000001
- type: precision_at_100
value: 2.7470000000000003
- type: precision_at_1000
value: 0.41200000000000003
- type: precision_at_20
value: 8.584999999999999
- type: precision_at_3
value: 21.633
- type: precision_at_5
value: 18.099999999999998
- type: recall_at_1
value: 5.893000000000001
- type: recall_at_10
value: 26.567
- type: recall_at_100
value: 55.800000000000004
- type: recall_at_1000
value: 83.608
- type: recall_at_20
value: 34.86
- type: recall_at_3
value: 13.153
- type: recall_at_5
value: 18.323
task:
type: Retrieval
- dataset:
config: default
name: MTEB SICK-R
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
split: test
type: mteb/sickr-sts
metrics:
- type: cosine_pearson
value: 86.57284584320382
- type: cosine_spearman
value: 82.20531642680812
- type: euclidean_pearson
value: 83.94261758556554
- type: euclidean_spearman
value: 82.20721497738559
- type: main_score
value: 82.20531642680812
- type: manhattan_pearson
value: 84.15902154703083
- type: manhattan_spearman
value: 82.19506027155957
- type: pearson
value: 86.57284584320382
- type: spearman
value: 82.20531642680812
task:
type: STS
- dataset:
config: default
name: MTEB STS12
revision: a0d554a64d88156834ff5ae9920b964011b16384
split: test
type: mteb/sts12-sts
metrics:
- type: cosine_pearson
value: 86.28047602146931
- type: cosine_spearman
value: 79.51504881448884
- type: euclidean_pearson
value: 83.10545189967856
- type: euclidean_spearman
value: 79.50586960492797
- type: main_score
value: 79.51504881448884
- type: manhattan_pearson
value: 83.44244457500889
- type: manhattan_spearman
value: 79.730303339846
- type: pearson
value: 86.28047602146931
- type: spearman
value: 79.51504881448884
task:
type: STS
- dataset:
config: default
name: MTEB STS13
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
split: test
type: mteb/sts13-sts
metrics:
- type: cosine_pearson
value: 88.74723553048702
- type: cosine_spearman
value: 89.18936052329725
- type: euclidean_pearson
value: 88.90400878928668
- type: euclidean_spearman
value: 89.19174821431281
- type: main_score
value: 89.18936052329725
- type: manhattan_pearson
value: 88.81504628424054
- type: manhattan_spearman
value: 89.18063294142597
- type: pearson
value: 88.74723553048702
- type: spearman
value: 89.18936052329725
task:
type: STS
- dataset:
config: default
name: MTEB STS14
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
split: test
type: mteb/sts14-sts
metrics:
- type: cosine_pearson
value: 86.45403437836023
- type: cosine_spearman
value: 85.14654611519086
- type: euclidean_pearson
value: 85.87509624462743
- type: euclidean_spearman
value: 85.1391108856681
- type: main_score
value: 85.14654611519086
- type: manhattan_pearson
value: 85.96635794953866
- type: manhattan_spearman
value: 85.3271371527667
- type: pearson
value: 86.45403437836023
- type: spearman
value: 85.14654611519086
task:
type: STS
- dataset:
config: default
name: MTEB STS15
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
split: test
type: mteb/sts15-sts
metrics:
- type: cosine_pearson
value: 87.84742260009705
- type: cosine_spearman
value: 89.10215217191254
- type: euclidean_pearson
value: 88.97393286325477
- type: euclidean_spearman
value: 89.1014105509662
- type: main_score
value: 89.10215217191254
- type: manhattan_pearson
value: 89.31698781090151
- type: manhattan_spearman
value: 89.53000001764433
- type: pearson
value: 87.84742260009705
- type: spearman
value: 89.10215217191254
task:
type: STS
- dataset:
config: default
name: MTEB STS16
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
split: test
type: mteb/sts16-sts
metrics:
- type: cosine_pearson
value: 85.22397535461835
- type: cosine_spearman
value: 87.14066355879785
- type: euclidean_pearson
value: 86.31393364087295
- type: euclidean_spearman
value: 87.14018892702765
- type: main_score
value: 87.14066355879785
- type: manhattan_pearson
value: 86.36366855248434
- type: manhattan_spearman
value: 87.20858630423012
- type: pearson
value: 85.22397535461835
- type: spearman
value: 87.14066355879785
task:
type: STS
- dataset:
config: en-en
name: MTEB STS17 (en-en)
revision: faeb762787bd10488a50c8b5be4a3b82e411949c
split: test
type: mteb/sts17-crosslingual-sts
metrics:
- type: cosine_pearson
value: 90.66131612061355
- type: cosine_spearman
value: 90.97082650129164
- type: euclidean_pearson
value: 90.98181906744969
- type: euclidean_spearman
value: 90.99008476850047
- type: main_score
value: 90.97082650129164
- type: manhattan_pearson
value: 90.75245040709021
- type: manhattan_spearman
value: 90.6199877691265
- type: pearson
value: 90.66131612061355
- type: spearman
value: 90.97082650129164
task:
type: STS
- dataset:
config: en
name: MTEB STS22 (en)
revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3
split: test
type: mteb/sts22-crosslingual-sts
metrics:
- type: cosine_pearson
value: 67.270656447085
- type: cosine_spearman
value: 67.82870469746828
- type: euclidean_pearson
value: 69.03857775285664
- type: euclidean_spearman
value: 67.74455108773341
- type: main_score
value: 67.82870469746828
- type: manhattan_pearson
value: 69.25304172245812
- type: manhattan_spearman
value: 68.00987097916055
- type: pearson
value: 67.270656447085
- type: spearman
value: 67.82870469746828
task:
type: STS
- dataset:
config: default
name: MTEB STSBenchmark
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
split: test
type: mteb/stsbenchmark-sts
metrics:
- type: cosine_pearson
value: 87.17245205384889
- type: cosine_spearman
value: 87.7360146030987
- type: euclidean_pearson
value: 87.48919412794656
- type: euclidean_spearman
value: 87.7312047878383
- type: main_score
value: 87.7360146030987
- type: manhattan_pearson
value: 87.61476224354806
- type: manhattan_spearman
value: 87.95220889254693
- type: pearson
value: 87.17245205384889
- type: spearman
value: 87.7360146030987
task:
type: STS
- dataset:
config: default
name: MTEB SciDocsRR
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
split: test
type: mteb/scidocs-reranking
metrics:
- type: main_score
value: 88.43547871921146
- type: map
value: 88.43547871921146
- type: mrr
value: 96.5564473652709
- type: nAUC_map_diff1
value: -13.66029392579231
- type: nAUC_map_max
value: 50.325613574053506
- type: nAUC_map_std
value: 60.02986231275796
- type: nAUC_mrr_diff1
value: 23.83821476411125
- type: nAUC_mrr_max
value: 86.72643311769906
- type: nAUC_mrr_std
value: 72.12741063469213
task:
type: Reranking
- dataset:
config: default
name: MTEB SciFact
revision: 0228b52cf27578f30900b9e5271d331663a030d7
split: test
type: mteb/scifact
metrics:
- type: main_score
value: 78.233
- type: map_at_1
value: 61.49400000000001
- type: map_at_10
value: 73.30600000000001
- type: map_at_100
value: 73.719
- type: map_at_1000
value: 73.724
- type: map_at_20
value: 73.611
- type: map_at_3
value: 70.626
- type: map_at_5
value: 72.417
- type: mrr_at_1
value: 64.66666666666666
- type: mrr_at_10
value: 74.30357142857143
- type: mrr_at_100
value: 74.56950898079988
- type: mrr_at_1000
value: 74.57295833098681
- type: mrr_at_20
value: 74.46165223665226
- type: mrr_at_3
value: 72.3888888888889
- type: mrr_at_5
value: 73.60555555555557
- type: nauc_map_at_1000_diff1
value: 76.51524604780636
- type: nauc_map_at_1000_max
value: 53.48521938401881
- type: nauc_map_at_1000_std
value: -7.347799382158861
- type: nauc_map_at_100_diff1
value: 76.5122888096236
- type: nauc_map_at_100_max
value: 53.49221847471618
- type: nauc_map_at_100_std
value: -7.329683735681086
- type: nauc_map_at_10_diff1
value: 76.30928630674504
- type: nauc_map_at_10_max
value: 53.00102977185941
- type: nauc_map_at_10_std
value: -7.7467740085108705
- type: nauc_map_at_1_diff1
value: 79.54189281784247
- type: nauc_map_at_1_max
value: 46.630071622109526
- type: nauc_map_at_1_std
value: -14.395943134644112
- type: nauc_map_at_20_diff1
value: 76.41604361947962
- type: nauc_map_at_20_max
value: 53.578883876146875
- type: nauc_map_at_20_std
value: -7.403103451288041
- type: nauc_map_at_3_diff1
value: 76.25911617571941
- type: nauc_map_at_3_max
value: 49.140287380513605
- type: nauc_map_at_3_std
value: -11.35992449218983
- type: nauc_map_at_5_diff1
value: 76.35122077770336
- type: nauc_map_at_5_max
value: 52.1744367901208
- type: nauc_map_at_5_std
value: -7.85753955055384
- type: nauc_mrr_at_1000_diff1
value: 76.97223309515867
- type: nauc_mrr_at_1000_max
value: 57.263787498613326
- type: nauc_mrr_at_1000_std
value: -4.884090708840035
- type: nauc_mrr_at_100_diff1
value: 76.97312970894603
- type: nauc_mrr_at_100_max
value: 57.26850730446478
- type: nauc_mrr_at_100_std
value: -4.875200894216617
- type: nauc_mrr_at_10_diff1
value: 76.65927674223613
- type: nauc_mrr_at_10_max
value: 57.30979763941454
- type: nauc_mrr_at_10_std
value: -4.863331094022142
- type: nauc_mrr_at_1_diff1
value: 80.0454932568644
- type: nauc_mrr_at_1_max
value: 56.76038421319305
- type: nauc_mrr_at_1_std
value: -4.101939392632653
- type: nauc_mrr_at_20_diff1
value: 76.87237970440503
- type: nauc_mrr_at_20_max
value: 57.33843605225869
- type: nauc_mrr_at_20_std
value: -4.96248984417978
- type: nauc_mrr_at_3_diff1
value: 76.74130186666727
- type: nauc_mrr_at_3_max
value: 56.19313244846155
- type: nauc_mrr_at_3_std
value: -5.684365934009136
- type: nauc_mrr_at_5_diff1
value: 76.66406918799962
- type: nauc_mrr_at_5_max
value: 57.56110093228628
- type: nauc_mrr_at_5_std
value: -3.7464413085588073
- type: nauc_ndcg_at_1000_diff1
value: 76.19194173971773
- type: nauc_ndcg_at_1000_max
value: 55.57464600170693
- type: nauc_ndcg_at_1000_std
value: -6.0761689532372625
- type: nauc_ndcg_at_100_diff1
value: 76.14631273843654
- type: nauc_ndcg_at_100_max
value: 55.72246565373382
- type: nauc_ndcg_at_100_std
value: -5.595160698860595
- type: nauc_ndcg_at_10_diff1
value: 75.0108223611192
- type: nauc_ndcg_at_10_max
value: 55.27894212877493
- type: nauc_ndcg_at_10_std
value: -6.968331740214591
- type: nauc_ndcg_at_1_diff1
value: 80.0454932568644
- type: nauc_ndcg_at_1_max
value: 56.76038421319305
- type: nauc_ndcg_at_1_std
value: -4.101939392632653
- type: nauc_ndcg_at_20_diff1
value: 75.54887755702472
- type: nauc_ndcg_at_20_max
value: 56.406879417251496
- type: nauc_ndcg_at_20_std
value: -6.495231061329629
- type: nauc_ndcg_at_3_diff1
value: 75.03620356688509
- type: nauc_ndcg_at_3_max
value: 52.147381077773424
- type: nauc_ndcg_at_3_std
value: -8.448005688956199
- type: nauc_ndcg_at_5_diff1
value: 75.1195898074229
- type: nauc_ndcg_at_5_max
value: 54.2321033861173
- type: nauc_ndcg_at_5_std
value: -5.882690780895338
- type: nauc_precision_at_1000_diff1
value: -28.081979732100532
- type: nauc_precision_at_1000_max
value: 35.055348014832916
- type: nauc_precision_at_1000_std
value: 59.61280468927384
- type: nauc_precision_at_100_diff1
value: -25.112740730587458
- type: nauc_precision_at_100_max
value: 38.26331300116496
- type: nauc_precision_at_100_std
value: 62.46316222328831
- type: nauc_precision_at_10_diff1
value: -2.6766206473658833
- type: nauc_precision_at_10_max
value: 45.95321867204845
- type: nauc_precision_at_10_std
value: 45.07212468670564
- type: nauc_precision_at_1_diff1
value: 80.0454932568644
- type: nauc_precision_at_1_max
value: 56.76038421319305
- type: nauc_precision_at_1_std
value: -4.101939392632653
- type: nauc_precision_at_20_diff1
value: -10.698911116738385
- type: nauc_precision_at_20_max
value: 43.467275950182994
- type: nauc_precision_at_20_std
value: 48.00467321991766
- type: nauc_precision_at_3_diff1
value: 33.6344708541193
- type: nauc_precision_at_3_max
value: 49.309242331670504
- type: nauc_precision_at_3_std
value: 21.02940391379915
- type: nauc_precision_at_5_diff1
value: 13.560415600596318
- type: nauc_precision_at_5_max
value: 48.918726500100085
- type: nauc_precision_at_5_std
value: 39.940930429172184
- type: nauc_recall_at_1000_diff1
value: .nan
- type: nauc_recall_at_1000_max
value: .nan
- type: nauc_recall_at_1000_std
value: .nan
- type: nauc_recall_at_100_diff1
value: 70.82166199813196
- type: nauc_recall_at_100_max
value: 76.6106442577042
- type: nauc_recall_at_100_std
value: 66.47992530345513
- type: nauc_recall_at_10_diff1
value: 62.68908885556092
- type: nauc_recall_at_10_max
value: 58.14262437741839
- type: nauc_recall_at_10_std
value: -12.946717875063369
- type: nauc_recall_at_1_diff1
value: 79.54189281784247
- type: nauc_recall_at_1_max
value: 46.630071622109526
- type: nauc_recall_at_1_std
value: -14.395943134644112
- type: nauc_recall_at_20_diff1
value: 65.79470497876567
- type: nauc_recall_at_20_max
value: 71.68308183488456
- type: nauc_recall_at_20_std
value: -12.556850697268453
- type: nauc_recall_at_3_diff1
value: 68.3240211318129
- type: nauc_recall_at_3_max
value: 45.05998217275036
- type: nauc_recall_at_3_std
value: -14.23179772593869
- type: nauc_recall_at_5_diff1
value: 67.53366869904056
- type: nauc_recall_at_5_max
value: 53.57935627081027
- type: nauc_recall_at_5_std
value: -3.3271112904853393
- type: ndcg_at_1
value: 64.667
- type: ndcg_at_10
value: 78.233
- type: ndcg_at_100
value: 79.806
- type: ndcg_at_1000
value: 79.92099999999999
- type: ndcg_at_20
value: 79.006
- type: ndcg_at_3
value: 74.018
- type: ndcg_at_5
value: 76.334
- type: precision_at_1
value: 64.667
- type: precision_at_10
value: 10.4
- type: precision_at_100
value: 1.1199999999999999
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.383
- type: precision_at_3
value: 29.444
- type: precision_at_5
value: 19.467000000000002
- type: recall_at_1
value: 61.49400000000001
- type: recall_at_10
value: 92.156
- type: recall_at_100
value: 99.167
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 94.833
- type: recall_at_3
value: 80.833
- type: recall_at_5
value: 86.6
task:
type: Retrieval
- dataset:
config: default
name: MTEB SprintDuplicateQuestions
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
split: test
type: mteb/sprintduplicatequestions-pairclassification
metrics:
- type: cosine_accuracy
value: 99.8039603960396
- type: cosine_accuracy_threshold
value: 84.54211950302124
- type: cosine_ap
value: 95.59056372734358
- type: cosine_f1
value: 90.1394422310757
- type: cosine_f1_threshold
value: 84.54211950302124
- type: cosine_precision
value: 89.78174603174604
- type: cosine_recall
value: 90.5
- type: dot_accuracy
value: 99.80594059405941
- type: dot_accuracy_threshold
value: 85.57180166244507
- type: dot_ap
value: 95.53453431914399
- type: dot_f1
value: 90.10442565887618
- type: dot_f1_threshold
value: 84.59715843200684
- type: dot_precision
value: 89.61424332344214
- type: dot_recall
value: 90.60000000000001
- type: euclidean_accuracy
value: 99.8039603960396
- type: euclidean_accuracy_threshold
value: 53.253382444381714
- type: euclidean_ap
value: 95.5850992402159
- type: euclidean_f1
value: 90.09457441513192
- type: euclidean_f1_threshold
value: 55.725520849227905
- type: euclidean_precision
value: 89.69276511397423
- type: euclidean_recall
value: 90.5
- type: main_score
value: 95.7485189884476
- type: manhattan_accuracy
value: 99.81485148514851
- type: manhattan_accuracy_threshold
value: 3491.29638671875
- type: manhattan_ap
value: 95.7485189884476
- type: manhattan_f1
value: 90.464048954615
- type: manhattan_f1_threshold
value: 3491.29638671875
- type: manhattan_precision
value: 92.2996878251821
- type: manhattan_recall
value: 88.7
- type: max_ap
value: 95.7485189884476
- type: max_f1
value: 90.464048954615
- type: max_precision
value: 92.2996878251821
- type: max_recall
value: 90.60000000000001
- type: similarity_accuracy
value: 99.8039603960396
- type: similarity_accuracy_threshold
value: 84.54211950302124
- type: similarity_ap
value: 95.59056372734358
- type: similarity_f1
value: 90.1394422310757
- type: similarity_f1_threshold
value: 84.54211950302124
- type: similarity_precision
value: 89.78174603174604
- type: similarity_recall
value: 90.5
task:
type: PairClassification
- dataset:
config: default
name: MTEB StackExchangeClustering
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
split: test
type: mteb/stackexchange-clustering
metrics:
- type: main_score
value: 78.49205191950675
- type: v_measure
value: 78.49205191950675
- type: v_measure_std
value: 2.84869550699959
task:
type: Clustering
- dataset:
config: default
name: MTEB StackExchangeClusteringP2P
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
split: test
type: mteb/stackexchange-clustering-p2p
metrics:
- type: main_score
value: 48.90421736513028
- type: v_measure
value: 48.90421736513028
- type: v_measure_std
value: 1.6875865714471023
task:
type: Clustering
- dataset:
config: default
name: MTEB StackOverflowDupQuestions
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
split: test
type: mteb/stackoverflowdupquestions-reranking
metrics:
- type: main_score
value: 52.9874730481696
- type: map
value: 52.9874730481696
- type: mrr
value: 53.85867604617604
- type: nAUC_map_diff1
value: 39.633429293407616
- type: nAUC_map_max
value: 10.236807988858546
- type: nAUC_map_std
value: 10.276522217929674
- type: nAUC_mrr_diff1
value: 40.0543079218377
- type: nAUC_mrr_max
value: 10.96209807382042
- type: nAUC_mrr_std
value: 10.524400196109918
task:
type: Reranking
- dataset:
config: default
name: MTEB SummEval
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
split: test
type: mteb/summeval
metrics:
- type: cosine_pearson
value: 30.727801109114232
- type: cosine_spearman
value: 31.66058223980157
- type: dot_pearson
value: 30.78818248622866
- type: dot_spearman
value: 31.525158776890265
- type: main_score
value: 31.66058223980157
- type: pearson
value: 30.727801109114232
- type: spearman
value: 31.66058223980157
task:
type: Summarization
- dataset:
config: default
name: MTEB TRECCOVID
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
split: test
type: mteb/trec-covid
metrics:
- type: main_score
value: 85.206
- type: map_at_1
value: 0.246
- type: map_at_10
value: 2.1950000000000003
- type: map_at_100
value: 14.179
- type: map_at_1000
value: 35.037
- type: map_at_20
value: 4.143
- type: map_at_3
value: 0.7100000000000001
- type: map_at_5
value: 1.135
- type: mrr_at_1
value: 94.0
- type: mrr_at_10
value: 96.66666666666666
- type: mrr_at_100
value: 96.66666666666666
- type: mrr_at_1000
value: 96.66666666666666
- type: mrr_at_20
value: 96.66666666666666
- type: mrr_at_3
value: 96.66666666666666
- type: mrr_at_5
value: 96.66666666666666
- type: nauc_map_at_1000_diff1
value: -4.6264497624527525
- type: nauc_map_at_1000_max
value: 44.594457564749355
- type: nauc_map_at_1000_std
value: 73.17642341400133
- type: nauc_map_at_100_diff1
value: 23.451335157405726
- type: nauc_map_at_100_max
value: 25.426398857299525
- type: nauc_map_at_100_std
value: 64.07416694472633
- type: nauc_map_at_10_diff1
value: 46.57568738568346
- type: nauc_map_at_10_max
value: 9.693233249079238
- type: nauc_map_at_10_std
value: 28.549530265164357
- type: nauc_map_at_1_diff1
value: 53.48238396620123
- type: nauc_map_at_1_max
value: 0.33476619393733076
- type: nauc_map_at_1_std
value: 8.906362219128463
- type: nauc_map_at_20_diff1
value: 39.40719602207749
- type: nauc_map_at_20_max
value: 9.635915072074045
- type: nauc_map_at_20_std
value: 35.15634791346394
- type: nauc_map_at_3_diff1
value: 53.11784737840137
- type: nauc_map_at_3_max
value: 3.059682761072153
- type: nauc_map_at_3_std
value: 21.310633086556617
- type: nauc_map_at_5_diff1
value: 49.91570701185436
- type: nauc_map_at_5_max
value: 8.045082896244576
- type: nauc_map_at_5_std
value: 20.597686235051647
- type: nauc_mrr_at_1000_diff1
value: 41.98412698412726
- type: nauc_mrr_at_1000_max
value: 78.24463118580779
- type: nauc_mrr_at_1000_std
value: 0.30812324930028195
- type: nauc_mrr_at_100_diff1
value: 41.98412698412726
- type: nauc_mrr_at_100_max
value: 78.24463118580779
- type: nauc_mrr_at_100_std
value: 0.30812324930028195
- type: nauc_mrr_at_10_diff1
value: 41.98412698412726
- type: nauc_mrr_at_10_max
value: 78.24463118580779
- type: nauc_mrr_at_10_std
value: 0.30812324930028195
- type: nauc_mrr_at_1_diff1
value: 38.62433862433873
- type: nauc_mrr_at_1_max
value: 80.78120136943666
- type: nauc_mrr_at_1_std
value: -10.768751945222197
- type: nauc_mrr_at_20_diff1
value: 41.98412698412726
- type: nauc_mrr_at_20_max
value: 78.24463118580779
- type: nauc_mrr_at_20_std
value: 0.30812324930028195
- type: nauc_mrr_at_3_diff1
value: 41.98412698412726
- type: nauc_mrr_at_3_max
value: 78.24463118580779
- type: nauc_mrr_at_3_std
value: 0.30812324930028195
- type: nauc_mrr_at_5_diff1
value: 41.98412698412726
- type: nauc_mrr_at_5_max
value: 78.24463118580779
- type: nauc_mrr_at_5_std
value: 0.30812324930028195
- type: nauc_ndcg_at_1000_diff1
value: 0.5174948602880207
- type: nauc_ndcg_at_1000_max
value: 48.60686602077053
- type: nauc_ndcg_at_1000_std
value: 75.72456343175277
- type: nauc_ndcg_at_100_diff1
value: -20.747252137999254
- type: nauc_ndcg_at_100_max
value: 49.985132618254994
- type: nauc_ndcg_at_100_std
value: 61.096383293836574
- type: nauc_ndcg_at_10_diff1
value: 6.791377920463332
- type: nauc_ndcg_at_10_max
value: 57.50019332833286
- type: nauc_ndcg_at_10_std
value: 49.201028841219426
- type: nauc_ndcg_at_1_diff1
value: 54.92683440362145
- type: nauc_ndcg_at_1_max
value: 83.8667228129276
- type: nauc_ndcg_at_1_std
value: 1.6738604063586122
- type: nauc_ndcg_at_20_diff1
value: -5.1948699196314925
- type: nauc_ndcg_at_20_max
value: 54.483087684806556
- type: nauc_ndcg_at_20_std
value: 50.54823818118781
- type: nauc_ndcg_at_3_diff1
value: 26.267246500164372
- type: nauc_ndcg_at_3_max
value: 63.0173212926611
- type: nauc_ndcg_at_3_std
value: 41.025597406368256
- type: nauc_ndcg_at_5_diff1
value: 16.910185454343036
- type: nauc_ndcg_at_5_max
value: 60.9328683868778
- type: nauc_ndcg_at_5_std
value: 36.70169905857712
- type: nauc_precision_at_1000_diff1
value: -46.374447765983525
- type: nauc_precision_at_1000_max
value: 35.36052337813863
- type: nauc_precision_at_1000_std
value: 14.219220668161018
- type: nauc_precision_at_100_diff1
value: -29.7838083657744
- type: nauc_precision_at_100_max
value: 43.93589400385112
- type: nauc_precision_at_100_std
value: 55.425045718579945
- type: nauc_precision_at_10_diff1
value: -12.016613405227687
- type: nauc_precision_at_10_max
value: 57.79924427743131
- type: nauc_precision_at_10_std
value: 49.022036703550675
- type: nauc_precision_at_1_diff1
value: 38.62433862433873
- type: nauc_precision_at_1_max
value: 80.78120136943666
- type: nauc_precision_at_1_std
value: -10.768751945222197
- type: nauc_precision_at_20_diff1
value: -23.95633847880195
- type: nauc_precision_at_20_max
value: 48.34715917258276
- type: nauc_precision_at_20_std
value: 48.82198285255887
- type: nauc_precision_at_3_diff1
value: 6.871296905858807
- type: nauc_precision_at_3_max
value: 70.54805793285054
- type: nauc_precision_at_3_std
value: 44.65108624094803
- type: nauc_precision_at_5_diff1
value: -9.074932448759695
- type: nauc_precision_at_5_max
value: 67.41284242437573
- type: nauc_precision_at_5_std
value: 23.876891983919577
- type: nauc_recall_at_1000_diff1
value: 8.142288830293255
- type: nauc_recall_at_1000_max
value: 38.85182826835104
- type: nauc_recall_at_1000_std
value: 68.60783819217335
- type: nauc_recall_at_100_diff1
value: 34.262914076287466
- type: nauc_recall_at_100_max
value: 12.87009658528838
- type: nauc_recall_at_100_std
value: 56.21330603762995
- type: nauc_recall_at_10_diff1
value: 49.33830945338758
- type: nauc_recall_at_10_max
value: 0.3539875530671406
- type: nauc_recall_at_10_std
value: 26.85864465557644
- type: nauc_recall_at_1_diff1
value: 53.48238396620123
- type: nauc_recall_at_1_max
value: 0.33476619393733076
- type: nauc_recall_at_1_std
value: 8.906362219128463
- type: nauc_recall_at_20_diff1
value: 44.21928181266254
- type: nauc_recall_at_20_max
value: -0.9198356057088594
- type: nauc_recall_at_20_std
value: 31.484376992896784
- type: nauc_recall_at_3_diff1
value: 53.038093080990876
- type: nauc_recall_at_3_max
value: -1.4170895916973003
- type: nauc_recall_at_3_std
value: 21.890202855574497
- type: nauc_recall_at_5_diff1
value: 49.39742214825278
- type: nauc_recall_at_5_max
value: 2.8412267611894517
- type: nauc_recall_at_5_std
value: 18.01598921859512
- type: ndcg_at_1
value: 91.0
- type: ndcg_at_10
value: 85.206
- type: ndcg_at_100
value: 67.29
- type: ndcg_at_1000
value: 60.584
- type: ndcg_at_20
value: 82.321
- type: ndcg_at_3
value: 88.642
- type: ndcg_at_5
value: 87.063
- type: precision_at_1
value: 94.0
- type: precision_at_10
value: 89.8
- type: precision_at_100
value: 69.78
- type: precision_at_1000
value: 26.738
- type: precision_at_20
value: 87.2
- type: precision_at_3
value: 92.0
- type: precision_at_5
value: 90.8
- type: recall_at_1
value: 0.246
- type: recall_at_10
value: 2.344
- type: recall_at_100
value: 16.962
- type: recall_at_1000
value: 57.325
- type: recall_at_20
value: 4.517
- type: recall_at_3
value: 0.731
- type: recall_at_5
value: 1.1780000000000002
task:
type: Retrieval
- dataset:
config: default
name: MTEB Touche2020
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
split: test
type: mteb/touche2020
metrics:
- type: main_score
value: 31.455
- type: map_at_1
value: 2.9739999999999998
- type: map_at_10
value: 12.183
- type: map_at_100
value: 18.772
- type: map_at_1000
value: 20.415
- type: map_at_20
value: 14.451
- type: map_at_3
value: 6.507000000000001
- type: map_at_5
value: 8.66
- type: mrr_at_1
value: 40.816326530612244
- type: mrr_at_10
value: 57.70975056689341
- type: mrr_at_100
value: 58.18379126542391
- type: mrr_at_1000
value: 58.18379126542391
- type: mrr_at_20
value: 57.85552316164561
- type: mrr_at_3
value: 54.08163265306123
- type: mrr_at_5
value: 56.42857142857143
- type: nauc_map_at_1000_diff1
value: 3.1567471051481437
- type: nauc_map_at_1000_max
value: -1.5882060729791523
- type: nauc_map_at_1000_std
value: 18.69622198722074
- type: nauc_map_at_100_diff1
value: 3.3449677678147536
- type: nauc_map_at_100_max
value: -2.8928606866168405
- type: nauc_map_at_100_std
value: 15.789984947653412
- type: nauc_map_at_10_diff1
value: 2.9696743570444264
- type: nauc_map_at_10_max
value: -9.096749212011876
- type: nauc_map_at_10_std
value: -5.38545817258353
- type: nauc_map_at_1_diff1
value: 20.680780404542546
- type: nauc_map_at_1_max
value: -7.04722927447817
- type: nauc_map_at_1_std
value: -7.062494733973898
- type: nauc_map_at_20_diff1
value: 4.070437790119271
- type: nauc_map_at_20_max
value: -4.84491434686032
- type: nauc_map_at_20_std
value: 0.5846341109021014
- type: nauc_map_at_3_diff1
value: 11.9634978045925
- type: nauc_map_at_3_max
value: -8.27834591046608
- type: nauc_map_at_3_std
value: -8.687615453381065
- type: nauc_map_at_5_diff1
value: 0.9195191526009436
- type: nauc_map_at_5_max
value: -1.673813362719489
- type: nauc_map_at_5_std
value: -6.67549753473631
- type: nauc_mrr_at_1000_diff1
value: 19.877993208719573
- type: nauc_mrr_at_1000_max
value: -10.37776706406218
- type: nauc_mrr_at_1000_std
value: 7.132169578056367
- type: nauc_mrr_at_100_diff1
value: 19.877993208719573
- type: nauc_mrr_at_100_max
value: -10.37776706406218
- type: nauc_mrr_at_100_std
value: 7.132169578056367
- type: nauc_mrr_at_10_diff1
value: 20.414285568401457
- type: nauc_mrr_at_10_max
value: -9.677800295687861
- type: nauc_mrr_at_10_std
value: 8.001103690180859
- type: nauc_mrr_at_1_diff1
value: 22.393284073955723
- type: nauc_mrr_at_1_max
value: -5.889370191243167
- type: nauc_mrr_at_1_std
value: -1.5183536173658247
- type: nauc_mrr_at_20_diff1
value: 20.455564720604055
- type: nauc_mrr_at_20_max
value: -10.230642830103074
- type: nauc_mrr_at_20_std
value: 7.863582453266621
- type: nauc_mrr_at_3_diff1
value: 17.554895390732618
- type: nauc_mrr_at_3_max
value: -15.618463505555052
- type: nauc_mrr_at_3_std
value: 5.913231577966864
- type: nauc_mrr_at_5_diff1
value: 18.393678507779914
- type: nauc_mrr_at_5_max
value: -11.903593353147762
- type: nauc_mrr_at_5_std
value: 7.580745996262831
- type: nauc_ndcg_at_1000_diff1
value: 13.746937095530473
- type: nauc_ndcg_at_1000_max
value: -0.9319249687895838
- type: nauc_ndcg_at_1000_std
value: 38.56328031451904
- type: nauc_ndcg_at_100_diff1
value: 13.854865944415895
- type: nauc_ndcg_at_100_max
value: -7.142142012591404
- type: nauc_ndcg_at_100_std
value: 35.61341954818848
- type: nauc_ndcg_at_10_diff1
value: 9.010144273248759
- type: nauc_ndcg_at_10_max
value: -15.320014897424574
- type: nauc_ndcg_at_10_std
value: 2.84883880489144
- type: nauc_ndcg_at_1_diff1
value: 20.939533945592967
- type: nauc_ndcg_at_1_max
value: -6.387319972188946
- type: nauc_ndcg_at_1_std
value: -0.5258673122126726
- type: nauc_ndcg_at_20_diff1
value: 14.660827309009496
- type: nauc_ndcg_at_20_max
value: -13.476196120145994
- type: nauc_ndcg_at_20_std
value: 8.22391881710838
- type: nauc_ndcg_at_3_diff1
value: 13.429985227235935
- type: nauc_ndcg_at_3_max
value: -14.904544592570247
- type: nauc_ndcg_at_3_std
value: 1.599779998183342
- type: nauc_ndcg_at_5_diff1
value: 8.085466231900622
- type: nauc_ndcg_at_5_max
value: -9.09591969526831
- type: nauc_ndcg_at_5_std
value: 3.5794092637248505
- type: nauc_precision_at_1000_diff1
value: -9.31941215946743
- type: nauc_precision_at_1000_max
value: 31.52913520470716
- type: nauc_precision_at_1000_std
value: 22.720784312185856
- type: nauc_precision_at_100_diff1
value: 8.958548406995279
- type: nauc_precision_at_100_max
value: 15.100597910674104
- type: nauc_precision_at_100_std
value: 71.04548238175113
- type: nauc_precision_at_10_diff1
value: 12.4698194690008
- type: nauc_precision_at_10_max
value: -15.84870544871496
- type: nauc_precision_at_10_std
value: 7.575297622501928
- type: nauc_precision_at_1_diff1
value: 22.393284073955723
- type: nauc_precision_at_1_max
value: -5.889370191243167
- type: nauc_precision_at_1_std
value: -1.5183536173658247
- type: nauc_precision_at_20_diff1
value: 15.393505718138758
- type: nauc_precision_at_20_max
value: -3.70684298539384
- type: nauc_precision_at_20_std
value: 29.426137824970304
- type: nauc_precision_at_3_diff1
value: 9.997768085465394
- type: nauc_precision_at_3_max
value: -17.12224314347674
- type: nauc_precision_at_3_std
value: -1.343018166772313
- type: nauc_precision_at_5_diff1
value: 3.8936997437913554
- type: nauc_precision_at_5_max
value: -5.689104289687632
- type: nauc_precision_at_5_std
value: 3.181098051304285
- type: nauc_recall_at_1000_diff1
value: 9.908303508158387
- type: nauc_recall_at_1000_max
value: 6.174506592699848
- type: nauc_recall_at_1000_std
value: 77.41931114780012
- type: nauc_recall_at_100_diff1
value: 10.286839241876192
- type: nauc_recall_at_100_max
value: -6.6138697026666815
- type: nauc_recall_at_100_std
value: 49.608313692633224
- type: nauc_recall_at_10_diff1
value: 2.215545846659851
- type: nauc_recall_at_10_max
value: -17.83025802478445
- type: nauc_recall_at_10_std
value: -3.3784768673705465
- type: nauc_recall_at_1_diff1
value: 20.680780404542546
- type: nauc_recall_at_1_max
value: -7.04722927447817
- type: nauc_recall_at_1_std
value: -7.062494733973898
- type: nauc_recall_at_20_diff1
value: 6.974410239251615
- type: nauc_recall_at_20_max
value: -14.161147924731646
- type: nauc_recall_at_20_std
value: 9.328412057721454
- type: nauc_recall_at_3_diff1
value: 7.904589805754212
- type: nauc_recall_at_3_max
value: -12.1912388648593
- type: nauc_recall_at_3_std
value: -9.221542013385555
- type: nauc_recall_at_5_diff1
value: -3.2604132752706914
- type: nauc_recall_at_5_max
value: -6.886351441658915
- type: nauc_recall_at_5_std
value: -7.014252851712789
- type: ndcg_at_1
value: 39.796
- type: ndcg_at_10
value: 31.455
- type: ndcg_at_100
value: 42.388999999999996
- type: ndcg_at_1000
value: 53.556000000000004
- type: ndcg_at_20
value: 30.808000000000003
- type: ndcg_at_3
value: 35.831
- type: ndcg_at_5
value: 32.845
- type: precision_at_1
value: 40.816
- type: precision_at_10
value: 27.143
- type: precision_at_100
value: 8.449
- type: precision_at_1000
value: 1.6179999999999999
- type: precision_at_20
value: 19.387999999999998
- type: precision_at_3
value: 35.374
- type: precision_at_5
value: 31.019999999999996
- type: recall_at_1
value: 2.9739999999999998
- type: recall_at_10
value: 19.39
- type: recall_at_100
value: 51.636
- type: recall_at_1000
value: 86.99900000000001
- type: recall_at_20
value: 26.478
- type: recall_at_3
value: 7.703
- type: recall_at_5
value: 11.42
task:
type: Retrieval
- dataset:
config: default
name: MTEB ToxicConversationsClassification
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
split: test
type: mteb/toxic_conversations_50k
metrics:
- type: accuracy
value: 86.9384765625
- type: ap
value: 31.737513704141552
- type: ap_weighted
value: 31.737513704141552
- type: f1
value: 71.5490757306975
- type: f1_weighted
value: 89.14632533489856
- type: main_score
value: 86.9384765625
task:
type: Classification
- dataset:
config: default
name: MTEB TweetSentimentExtractionClassification
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
split: test
type: mteb/tweet_sentiment_extraction
metrics:
- type: accuracy
value: 73.57668364459535
- type: f1
value: 73.90467103648074
- type: f1_weighted
value: 73.42158415034704
- type: main_score
value: 73.57668364459535
task:
type: Classification
- dataset:
config: default
name: MTEB TwentyNewsgroupsClustering
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
split: test
type: mteb/twentynewsgroups-clustering
metrics:
- type: main_score
value: 58.574148097494685
- type: v_measure
value: 58.574148097494685
- type: v_measure_std
value: 0.9443161637490822
task:
type: Clustering
- dataset:
config: default
name: MTEB TwitterSemEval2015
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
split: test
type: mteb/twittersemeval2015-pairclassification
metrics:
- type: cosine_accuracy
value: 88.1385229778864
- type: cosine_accuracy_threshold
value: 83.86307954788208
- type: cosine_ap
value: 80.17965893449055
- type: cosine_f1
value: 73.0614300100705
- type: cosine_f1_threshold
value: 80.7942807674408
- type: cosine_precision
value: 69.8603755416466
- type: cosine_recall
value: 76.56992084432717
- type: dot_accuracy
value: 88.2100494724921
- type: dot_accuracy_threshold
value: 83.84793996810913
- type: dot_ap
value: 80.18603932881858
- type: dot_f1
value: 73.07643714466204
- type: dot_f1_threshold
value: 80.87586164474487
- type: dot_precision
value: 70.10909090909091
- type: dot_recall
value: 76.3060686015831
- type: euclidean_accuracy
value: 88.1385229778864
- type: euclidean_accuracy_threshold
value: 56.77661895751953
- type: euclidean_ap
value: 80.1784070881624
- type: euclidean_f1
value: 73.04830369529574
- type: euclidean_f1_threshold
value: 61.91838979721069
- type: euclidean_precision
value: 69.96859144720948
- type: euclidean_recall
value: 76.41160949868075
- type: main_score
value: 80.18603932881858
- type: manhattan_accuracy
value: 88.0431543184121
- type: manhattan_accuracy_threshold
value: 3755.6137084960938
- type: manhattan_ap
value: 79.98270453664578
- type: manhattan_f1
value: 72.68242015061023
- type: manhattan_f1_threshold
value: 3892.494583129883
- type: manhattan_precision
value: 71.54907975460122
- type: manhattan_recall
value: 73.85224274406332
- type: max_ap
value: 80.18603932881858
- type: max_f1
value: 73.07643714466204
- type: max_precision
value: 71.54907975460122
- type: max_recall
value: 76.56992084432717
- type: similarity_accuracy
value: 88.1385229778864
- type: similarity_accuracy_threshold
value: 83.86307954788208
- type: similarity_ap
value: 80.17965893449055
- type: similarity_f1
value: 73.0614300100705
- type: similarity_f1_threshold
value: 80.7942807674408
- type: similarity_precision
value: 69.8603755416466
- type: similarity_recall
value: 76.56992084432717
task:
type: PairClassification
- dataset:
config: default
name: MTEB TwitterURLCorpus
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
split: test
type: mteb/twitterurlcorpus-pairclassification
metrics:
- type: cosine_accuracy
value: 89.7892653393876
- type: cosine_accuracy_threshold
value: 79.69566583633423
- type: cosine_ap
value: 87.4579867302024
- type: cosine_f1
value: 79.91620843152658
- type: cosine_f1_threshold
value: 78.53609323501587
- type: cosine_precision
value: 77.7155329210622
- type: cosine_recall
value: 82.24514936864799
- type: dot_accuracy
value: 89.78732487289945
- type: dot_accuracy_threshold
value: 80.05315661430359
- type: dot_ap
value: 87.44916182456272
- type: dot_f1
value: 79.90419878751591
- type: dot_f1_threshold
value: 78.57890725135803
- type: dot_precision
value: 77.73409057812728
- type: dot_recall
value: 82.19895287958116
- type: euclidean_accuracy
value: 89.78538440641131
- type: euclidean_accuracy_threshold
value: 62.29925751686096
- type: euclidean_ap
value: 87.45904868911386
- type: euclidean_f1
value: 79.93127404474657
- type: euclidean_f1_threshold
value: 65.61101078987122
- type: euclidean_precision
value: 77.62060210373595
- type: euclidean_recall
value: 82.38373883584848
- type: main_score
value: 87.46554314325058
- type: manhattan_accuracy
value: 89.76597974152986
- type: manhattan_accuracy_threshold
value: 3988.5299682617188
- type: manhattan_ap
value: 87.46554314325058
- type: manhattan_f1
value: 79.97181740645973
- type: manhattan_f1_threshold
value: 4235.905838012695
- type: manhattan_precision
value: 77.13713427283783
- type: manhattan_recall
value: 83.02279026793964
- type: max_ap
value: 87.46554314325058
- type: max_f1
value: 79.97181740645973
- type: max_precision
value: 77.73409057812728
- type: max_recall
value: 83.02279026793964
- type: similarity_accuracy
value: 89.7892653393876
- type: similarity_accuracy_threshold
value: 79.69566583633423
- type: similarity_ap
value: 87.4579867302024
- type: similarity_f1
value: 79.91620843152658
- type: similarity_f1_threshold
value: 78.53609323501587
- type: similarity_precision
value: 77.7155329210622
- type: similarity_recall
value: 82.24514936864799
task:
type: PairClassification
tags:
- mteb
- sentence-transformers
- transformers
- sentence-similarity
license: mit
---
## Marqo Stella v2
This model is similar to the original [Dunzhang stella 400m model](https://huggingface.co/dunzhang/stella_en_400M_v5), but with a fused Matryoshka layer. The hierarchical structure of a Matryoshka layer reduces the computational overhead of generating embeddings while leaving relevance metrics unchanged.
## Transformers
```python
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.preprocessing import normalize
query_prompt = "Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: "
queries = [
"What are some ways to reduce stress?",
"What are the benefits of drinking green tea?",
]
queries = [query_prompt + query for query in queries]
# docs do not need any prompts
docs = [
"There are many effective ways to reduce stress. Some common techniques include deep breathing, meditation, and physical activity. Engaging in hobbies, spending time in nature, and connecting with loved ones can also help alleviate stress. Additionally, setting boundaries, practicing self-care, and learning to say no can prevent stress from building up.",
"Green tea has been consumed for centuries and is known for its potential health benefits. It contains antioxidants that may help protect the body against damage caused by free radicals. Regular consumption of green tea has been associated with improved heart health, enhanced cognitive function, and a reduced risk of certain types of cancer. The polyphenols in green tea may also have anti-inflammatory and weight loss properties.",
]
# Local path of the cloned model, or its Hugging Face Hub id
model_dir = "Marqo/dunzhang-stella_en_400M_v5"
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).cuda().eval()
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
with torch.no_grad():
input_data = tokenizer(queries, padding="longest", truncation=True, max_length=512, return_tensors="pt")
input_data = {k: v.cuda() for k, v in input_data.items()}
attention_mask = input_data["attention_mask"]
last_hidden_state = model(**input_data)[0]
last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
query_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
query_vectors = normalize(query_vectors.cpu().numpy())
# Embed the documents
with torch.no_grad():
input_data = tokenizer(docs, padding="longest", truncation=True, max_length=512, return_tensors="pt")
input_data = {k: v.cuda() for k, v in input_data.items()}
attention_mask = input_data["attention_mask"]
last_hidden_state = model(**input_data)[0]
last_hidden = last_hidden_state.masked_fill(~attention_mask[..., None].bool(), 0.0)
docs_vectors = last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
docs_vectors = normalize(docs_vectors.cpu().numpy())
print(query_vectors.shape, docs_vectors.shape)
# (2, 1024) (2, 1024)
similarities = query_vectors @ docs_vectors.T
print(similarities)
# [[0.8397531 0.29900077]
# [0.32818374 0.80954516]]
```
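Because the Matryoshka layer front-loads information into the leading dimensions, the 1024-d vectors can in principle be truncated to a shorter prefix and re-normalized for cheaper storage and search. A minimal sketch continuing the example above (the 256-d cut-off is an illustrative choice, not a documented operating point):
```python
import numpy as np

# Keep only the first 256 Matryoshka dimensions, then re-normalize to unit length.
dim = 256
q_small = query_vectors[:, :dim]
d_small = docs_vectors[:, :dim]
q_small = q_small / np.linalg.norm(q_small, axis=1, keepdims=True)
d_small = d_small / np.linalg.norm(d_small, axis=1, keepdims=True)

# Rankings should closely track the full-dimension scores above.
print((q_small @ d_small.T).round(3))
```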
|
ahmedheakl/asm2asm-qwen2.5coder-0.5b-200k-2ep | ahmedheakl | 2024-11-24T23:52:58Z | 25 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T13:15:37Z | ---
base_model: Qwen/Qwen2.5-Coder-0.5B-Instruct
library_name: transformers
model_name: asm2asm-qwen2.5coder-0.5b-200k-2ep
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for asm2asm-qwen2.5coder-0.5b-200k-2ep
This model is a fine-tuned version of [Qwen/Qwen2.5-Coder-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-0.5B-Instruct).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="ahmedheakl/asm2asm-qwen2.5coder-0.5b-200k-2ep", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/ahmed-heakl/huggingface/runs/nyjrtslb)
This model was trained with SFT.
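For reference, a minimal sketch of the TRL SFT entry point (the dataset and output directory are illustrative placeholders, not this model's actual training recipe):
```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("trl-lib/Capybara", split="train")  # placeholder dataset

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-Coder-0.5B-Instruct",
    args=SFTConfig(output_dir="asm2asm-qwen2.5coder-0.5b-sft"),
    train_dataset=dataset,
)
trainer.train()
```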
### Framework versions
- TRL: 0.12.1
- Transformers: 4.46.3
- Pytorch: 2.5.1+cu124
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
RE-N-Y/pickscore-siglip-weighted | RE-N-Y | 2024-11-24T23:49:24Z | 7 | 0 | preferences | [
"preferences",
"safetensors",
"model_hub_mixin",
"pytorch_model_hub_mixin",
"region:us"
] | null | 2024-11-24T23:48:19Z | ---
library_name: preferences
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
---
This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- Library: https://github.com/RE-N-Y/finebooru
- Docs: [More Information Needed] |
touhidulislam/BERTweet_retrain_2020_50 | touhidulislam | 2024-11-24T23:48:38Z | 179 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-24T23:48:20Z | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2020_50
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2020_50
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5680
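For a quick sanity check, the checkpoint can be exercised with the standard fill-mask pipeline (a minimal sketch; the example tweet is illustrative, and BERTweet uses `<mask>` as its mask token):
```python
from transformers import pipeline

fill = pipeline("fill-mask", model="touhidulislam/BERTweet_retrain_2020_50")
for pred in fill("COVID-19 cases are rising again , stay <mask> everyone ."):
    print(pred["token_str"], round(pred["score"], 3))
```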
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.9311 | 1.0 | 5840 | 2.6445 |
| 2.3951 | 2.0 | 11680 | 2.5876 |
| 2.5587 | 3.0 | 17520 | 2.5657 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
|
aehrc/uniformer_base_tl_384 | aehrc | 2024-11-24T23:48:25Z | 3,259 | 0 | transformers | [
"transformers",
"safetensors",
"uniformer",
"feature-extraction",
"custom_code",
"region:us"
] | feature-extraction | 2024-07-01T06:00:16Z | ---
library_name: transformers
tags: []
---
Hugging Face Transformers implementation of UniFormer-B with Large resolution fine-tuning (384x384): https://github.com/Sense-X/UniFormer/tree/main/image_classification#large-resolution-fine-tuning-384x384 |
BigHuggyD/TheDrummer_Behemoth-123B-v2.2_exl2_5.0bpw_h6 | BigHuggyD | 2024-11-24T23:41:13Z | 20 | 2 | null | [
"safetensors",
"mistral",
"license:other",
"5-bit",
"exl2",
"region:us"
] | null | 2024-11-24T23:34:53Z | ---
license: other
---
# Join our Discord! https://discord.gg/Nbv9pQ88Xb
## Nearly 2500 members strong 💪
### Now with more channels! A hub for creatives and makers alike!
---
[BeaverAI](https://huggingface.co/BeaverAI) proudly presents...
# Behemoth 123B v2.2 🦣
> Nothing in the void is foreign to us. The place we go is the place we belong.

## Links
- Original: https://huggingface.co/TheDrummer/Behemoth-123B-v2.2
- GGUF: https://huggingface.co/TheDrummer/Behemoth-123B-v2.2-GGUF
- iMatrix: https://huggingface.co/bartowski/Behemoth-123B-v2.2-GGUF (recommended for smaller quants)
## Description
Behemoth v2.x is a finetune of the new Largestral 2411 with system prompt support. Testers have noted that **everything** felt improved.
### Usage
Testers say this frankenformat maximizes the model's potential: **Metharme** with Mistral's new system tokens
- `[SYSTEM_PROMPT] <|system|>{{system_message}}[/SYSTEM_PROMPT]<|user|>{{user_message}}<|model|>{{assistant_message}}`
- `<|system|>[SYSTEM_PROMPT] {{system_message}}[/SYSTEM_PROMPT]<|user|>{{user_message}}<|model|>{{assistant_message}}`
*Take note that the opening `[SYSTEM_PROMPT]` tag SHOULD ALWAYS be followed by a whitespace, as shown in both templates above.*
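For clarity, a minimal sketch that assembles the first format above (the helper name is an assumption; the assistant turn is left open for the model to complete):
```python
def build_prompt(system_message: str, user_message: str) -> str:
    # Metharme roles wrapped in Mistral 2411 system tokens; note the space after [SYSTEM_PROMPT].
    return (
        f"[SYSTEM_PROMPT] <|system|>{system_message}[/SYSTEM_PROMPT]"
        f"<|user|>{user_message}<|model|>"
    )

print(build_prompt("You are a vivid storyteller.", "Begin the tale."))
```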
Complete SillyTavern Settings in BeaverAI Club: https://discord.com/channels/1238219753324281886/1309968730301792370/1309968730301792370
### Versions
- [v2.0](https://huggingface.co/TheDrummer/Behemoth-123B-v2) is equivalent to Behemoth v1.0 (Classic)
- [v2.1](https://huggingface.co/TheDrummer/Behemoth-123B-v2.1) is equivalent to Behemoth v1.1 (Creative Boost)
- [v2.2](https://huggingface.co/TheDrummer/Behemoth-123B-v2.2) is an improvement of Behemoth v2.1 (Creative++)
## Special Thanks
Thank you to each and everyone who donated/subscribed in [Ko-Fi](https://ko-fi.com/thedrummer) 🙇 I hope to never disappoint!
```
Toasty Pigeon
theguywhogamesalot
Grozi
F
Marinara
Ko-fi Supporter
Grozi
Phaelon
ONTHEREDTEAM
EvarinSharath'fe(USM-Valor)
Silva
Dakkidaze
AlexTheVP
Pseudo
Kistara
Dr. Fjut
Grozi 🥈
KinjiHakari777
dustywintr
Syd
HumbleConsumer
Syd
Ko-fi Supporter
Arkamist
joe 🥇
Toad
Lied
Konnect
Kistara
Grozi 🥉
SleepDeprived3
Luigi
Nestor
```
https://ko-fi.com/thedrummer/leaderboard
```
Finetuned by yours truly,
Drummer
```

|
touhidulislam/BERTweet_retrain_2022_53 | touhidulislam | 2024-11-24T23:35:58Z | 181 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-23T22:20:47Z | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2022_53
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2022_53
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4125
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.8653 | 1.0 | 5952 | 2.5238 |
| 2.6626 | 2.0 | 11904 | 2.4443 |
| 2.5307 | 3.0 | 17856 | 2.4060 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
|
braindao/iq-code-evmind-14b-instruct-v0.2411.0 | braindao | 2024-11-24T23:29:44Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"llama-factory",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T23:24:31Z | ---
library_name: transformers
tags:
- llama-factory
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF | Triangle104 | 2024-11-24T23:28:37Z | 20 | 0 | transformers | [
"transformers",
"gguf",
"llm",
"mistral",
"chat",
"instruct",
"it",
"abliterated",
"llama-cpp",
"gguf-my-repo",
"text-generation",
"en",
"base_model:byroneverson/Mistral-Small-Instruct-2409-abliterated",
"base_model:quantized:byroneverson/Mistral-Small-Instruct-2409-abliterated",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-24T23:27:15Z | ---
base_model: byroneverson/Mistral-Small-Instruct-2409-abliterated
license: other
license_name: mrl
license_link: https://mistral.ai/licenses/MRL-0.1.md
pipeline_tag: text-generation
language:
- en
tags:
- llm
- mistral
- chat
- instruct
- it
- abliterated
- llama-cpp
- gguf-my-repo
library_name: transformers
---
# Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF
This model was converted to GGUF format from [`byroneverson/Mistral-Small-Instruct-2409-abliterated`](https://huggingface.co/byroneverson/Mistral-Small-Instruct-2409-abliterated) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/byroneverson/Mistral-Small-Instruct-2409-abliterated) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF --hf-file mistral-small-instruct-2409-abliterated-q6_k.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF --hf-file mistral-small-instruct-2409-abliterated-q6_k.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any hardware-specific flags (e.g. `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF --hf-file mistral-small-instruct-2409-abliterated-q6_k.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF --hf-file mistral-small-instruct-2409-abliterated-q6_k.gguf -c 2048
```
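Alternatively, if you prefer Python bindings to the CLI, recent versions of `llama-cpp-python` can fetch GGUF files straight from the Hub. A minimal sketch, assuming `llama-cpp-python` and `huggingface_hub` are installed; `n_ctx` mirrors the server example above:
```python
from llama_cpp import Llama

# Download the Q6_K GGUF from this repo and load it.
llm = Llama.from_pretrained(
    repo_id="Triangle104/Mistral-Small-Instruct-2409-abliterated-Q6_K-GGUF",
    filename="mistral-small-instruct-2409-abliterated-q6_k.gguf",
    n_ctx=2048,  # context window
)
out = llm("The meaning to life and the universe is", max_tokens=64)
print(out["choices"][0]["text"])
```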
|
touhidulislam/BERTweet_retrain_2020_49 | touhidulislam | 2024-11-24T23:13:46Z | 180 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-24T23:13:25Z | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2020_49
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2020_49
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5818
## Model description
More information needed
## Intended uses & limitations
More information needed
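As a fill-mask checkpoint derived from `vinai/bertweet-base`, the model can be exercised through the standard `transformers` pipeline until more specific guidance is added. A minimal sketch, assuming the checkpoint is available on the Hub under this repo id (the BERTweet tokenizer may additionally require the `emoji` package for tweet normalization):
```python
from transformers import pipeline

# BERTweet is RoBERTa-based, so the mask token is <mask>.
fill = pipeline("fill-mask", model="touhidulislam/BERTweet_retrain_2020_49")
for pred in fill("I can't wait for the <mask> tonight!"):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```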
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.8991 | 1.0 | 5779 | 2.6638 |
| 2.7764 | 2.0 | 11558 | 2.6118 |
| 2.842 | 3.0 | 17337 | 2.6004 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
|
wwwtwwwt/whisper-tiny-Entertainment-Game-Commentary | wwwtwwwt | 2024-11-24T23:03:28Z | 139 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"en",
"dataset:wwwtwwwt/fineaudio-Entertainment",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-24T23:03:10Z | ---
library_name: transformers
language:
- en
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
datasets:
- wwwtwwwt/fineaudio-Entertainment
metrics:
- wer
model-index:
- name: Whisper Tiny En - Entertainment - Game Commentary
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: fineaudio-Entertainment-Game Commentary
type: wwwtwwwt/fineaudio-Entertainment
args: 'config: en, split: test'
metrics:
- name: Wer
type: wer
value: 46.31946283631152
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Tiny En - Entertainment - Game Commentary
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the fineaudio-Entertainment-Game Commentary dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8817
- Wer: 46.3195
## Model description
More information needed
## Intended uses & limitations
More information needed
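Pending fuller documentation, the checkpoint can be tried for short-form English transcription through the `automatic-speech-recognition` pipeline. A minimal sketch, assuming the checkpoint is on the Hub under this repo id and `audio.wav` is a local recording (the pipeline resamples audio to 16 kHz as needed):
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="wwwtwwwt/whisper-tiny-Entertainment-Game-Commentary",
)
result = asr("audio.wav")
print(result["text"])
```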
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
- mixed_precision_training: Native AMP
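For reference, a sketch of how these settings map onto `transformers` `Seq2SeqTrainingArguments`; `output_dir` and the evaluation/save cadence are placeholders, not values stated in this card:
```python
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-game-commentary",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    fp16=True,  # "Native AMP" mixed precision
)
```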
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.8341 | 0.5984 | 1000 | 0.9697 | 53.8799 |
| 0.6267 | 1.1969 | 2000 | 0.9055 | 49.3543 |
| 0.6058 | 1.7953 | 3000 | 0.8844 | 47.1311 |
| 0.5022 | 2.3938 | 4000 | 0.8817 | 46.3195 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.0
|