| Column | Type | Range / Values |
|:--|:--|:--|
| modelId | string | length 5 to 139 |
| author | string | length 2 to 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 to 2025-06-27 00:42:13 |
| downloads | int64 | 0 to 223M |
| likes | int64 | 0 to 11.7k |
| library_name | string | 499 distinct values |
| tags | sequence | length 1 to 4.05k |
| pipeline_tag | string | 54 distinct values |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 to 2025-06-27 00:40:00 |
| card | string | length 11 to 1.01M |
mini1013/master_cate_ac8 | mini1013 | 2024-11-25T10:14:43Z | 77 | 0 | setfit | ["setfit", "safetensors", "roberta", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:mini1013/master_domain", "base_model:finetune:mini1013/master_domain", "model-index", "region:us"] | text-classification | 2024-11-25T10:14:16Z |
---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: '[10%+๋ณต์620]๊ตญ๋ด์์ฐ ๋จ์์ฌ์ ์ต๋15์ผค๋ ํ์ดํฌ์ญ์ค ์ค๋ฆฌ์ฝ ๋ง์ ์๋ง ํ์ ๋ฌด์ง 25_์ฑ ๋ฐ๋ ์ด์ค์ค๋ฆฌ์ฝ_์ฌ์ฑ_๋ฒ ์ด์ง(4์ผค๋ ) ๋ฐ์ฅ๋์๋ง'
- text: W616 ๋ฐ๋ปํ ๋๊บผ์ด ์๋ฉด ํตํ์ผ ๋ฌด์ง ๊ธด ์๋ง ์ฌ์ ๋จ์ ๋น ์ฌ์ด์ฆ ์๋ฉด ๊ฒจ์ธ ๋ง์ ๋์ญ์ค W432 ๊ณจ์ง ํตํ์ผ ๋ง์ _S(225-245mm)_๋ธ๋ ์ญ์ค์์ด
- text: '[2์ฐจ 11/14 ์์ฝ๋ฐฐ์ก][23FW] HEMISH LEG WARMER - MELANGE GREY MELANGE GREY_FREE ์ฃผ์ํ์ฌ ํ์ ์ค(Types Co.,Ltd)'
- text: ๋ํฐํ ๋ฉด๋์ฌ ์๋ง ๊ตญ๋ด์์ฐ/์ค๋ชฉ/์ฅ๋ชฉ/์ค๋์ปค์ฆ/ํจ์ /ํ์ 25~26_26.๋จ๋ ๊ธฐ๋ชจ๋ง์ _์ฌ)2์ผค๋ / ๋ธ๋ ํฌํฌ์ญ์ค
- text: ๋ํฐ ์์ง ์๋ง ๋ฐ๊ฐ๋ฝ ์ฌ ํ๋น ์ญ์ค ๊ธฐ๋ชจ ๋ณด์จ ์ปฌ๋ฌ ์ฌ์ ๋๊บผ์ด ๋ฌด์ง ์ฐ๋ธ๋ผ์ด ๊น๋ฏผ์ฃผ
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.7735123253257968
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 2 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1.0   | <ul><li>'์์ ๊ฑฐ ๋ฑ์ฐ ๊ณจํ ๊ฒจ์ธ ๋ฐ ๋ค๋ฆฌํ ์ ๋ ๊ทธ์๋จธ ๋ธ๋ผ์ด ๋ํ์ฝ๋ฆฌ์ (Digital Plus Korea)'</li><li>'๊ตญ์ฐ ๋ฉด ํํ ๊ฒจ์ธ ๋ฐฉํ ํ ๋ค๋ฆฌ ์๋ฉด ํ ์ ๋ฐ ์์ฐ๋ถ ์ฐํ์ฉํ ์์กฑ๋์ฆ ๊ฒจ์ธ ๋ฐฉํ ๋ณด์จ ๊ธฐ๋ณธ ์๋ฉดํ ์ ๊ทธ๋ ์ด ์ธ์๋งค ์๋ง'</li><li>'์ธ๋ธ๋ค์ค ์ฌ์ ๋ ๊ทธ์๋จธ ์๋ฉด ์ฌ์ฑ ๋ฐํ ์ ๊ฒจ์ธ ๋ณด์จ SD001 ๊ทธ๋ ์ด_FREE ์์ด๋ณด๋ฆฌ'</li></ul> |
| 0.0   | <ul><li>'[๋งค์ฅ๋ฐ์ก] ๋ง๋ฆฌ๋ผ 11/6 ๋ฐฐ์ก 3PACK EMBROIDERY SOCKS multi OS ์์ด์์ค๋ง์ผ'</li><li>'์๋ธ๋ฆฌ๋ฐ์ด ํ๋ฌ์ค ์ฟ ์ ํธ๋ ์ด๋ ํฌ๋ฃจ ์ญ์ค(3์ผค๋ ) SX6888-100 024 '</li><li>'[๋กฏ๋ฐ๋ฐฑํ์ ]์ธ๋์๋จธ(๋ฐฑ) ์ ๋์น์ค UA ์ฝ์ด ์ฟผํฐ ์๋ง - 3์ผค๋ 1358344-100 1.LG ๋กฏ๋ฐ๋ฐฑํ์ _'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.7735 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac8")
# Run inference
preds = model("๋ํฐ ์์ง ์๋ง ๋ฐ๊ฐ๋ฝ ์ฌ ํ๋น ์ญ์ค ๊ธฐ๋ชจ ๋ณด์จ ์ปฌ๋ฌ ์ฌ์ ๋๊บผ์ด ๋ฌด์ง ์ฐ๋ธ๋ผ์ด ๊น๋ฏผ์ฃผ")
```
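If class probabilities are needed rather than hard labels, SetFit also exposes `predict_proba`. A minimal sketch, reusing the `model` object loaded above (the input strings are placeholders):
```python
# Returns one row of class probabilities per input (sketch; reuses `model` from above).
probs = model.predict_proba(["placeholder product title A", "placeholder product title B"])
print(probs)  # shape: (num_inputs, num_classes)
```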
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 4 | 10.82 | 24 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
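These values mirror the fields of SetFit's `TrainingArguments`. A minimal training sketch, under the assumption that a `train_dataset` with `text` and `label` columns is available (the actual training data is not published):
```python
from setfit import SetFitModel, Trainer, TrainingArguments
from sentence_transformers.losses import CosineSimilarityLoss

# train_dataset: placeholder for a datasets.Dataset with "text" and "label" columns.
model = SetFitModel.from_pretrained("mini1013/master_domain")
args = TrainingArguments(
    batch_size=(512, 512),             # (embedding phase, classifier phase)
    num_epochs=(20, 20),
    num_iterations=40,                 # contrastive pairs generated per example
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    seed=42,
)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```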
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0625 | 1 | 0.4226 | - |
| 3.125 | 50 | 0.0022 | - |
| 6.25 | 100 | 0.0001 | - |
| 9.375 | 150 | 0.0001 | - |
| 12.5 | 200 | 0.0001 | - |
| 15.625 | 250 | 0.0001 | - |
| 18.75 | 300 | 0.0001 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
|
mini1013/master_cate_ac7 | mini1013 | 2024-11-25T10:12:50Z | 137 | 0 | setfit | ["setfit", "safetensors", "roberta", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:mini1013/master_domain", "base_model:finetune:mini1013/master_domain", "model-index", "region:us"] | text-classification | 2024-11-25T10:12:28Z |
---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: ๊ฐ์ค์ฝ ๊ฐ์ฃฝ์ ์ฉ์ผ์์ฝ ์ํ ์นด์ํธ ์ค๋์ปค์ฆ 33์์ 100ml ๋คํฌ๋ธ๋ผ์ด ์ฃผ์ํ์ฌ๊ฐ์ค์ฝ
- text: ๋ ์ธ์์ฆ ์ฅํ ๋ฐฉ์ ๋ถ์ธ ์์ค์์ ์ ๋ฐ๋ณดํธ ๊ณ ๋ฌด ๋ฏธ๋๋ผ๋ฐฉ์ง ์ฌ์ฑ์ฉ H_M 34-36 ์ง์์ค
- text: ๊ฐ์ค์ฝ ๊ฐ์ฃฝ์ ์ฉ์ผ์์ฝ ๋๊ตฌ ํ์ธํธ ๊ฐ์ฃฝ์ท 100ml ๋ธ๋ผ์ด_๋ฌด๊ด ์ฃผ์ํ์ฌ ๊ฐ์ค์ฝ
- text: ์์ค์ ์์ด์ฌ๋ฆผ ์ธ์ ๊ธฐ๋ฅ์ฑ ์ ๋ฐ ๊น์ฐฝ 245mm ์ฃผ์ํ์ฌ ์์ฐฝ์์ฝ
- text: ๊น์ค ์๋ง ์ธ๊ฐ ๋ฐ ๋ณดํธ ๋ณด์จ ๋ฐฉํ ํธํ ์ด์ ๋กฑ ๋ถ์ธ ํ ์ฌ์ฑ ๋ฐฉ์์ปค๋ฒ ์ค์ ํ ๋จ์ฑ์ฉ ํ๋ฌ์ ์ฌ๋ฆฌ๋ธ/๋๊บผ์ด ๋ฒ์ ๋์ด 35_45 ํํฌ๊ณ ๋ฆด๋ผ
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.9254610935283204
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 7 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 6.0 | <ul><li>'๋ฑ์ฐํ๋ 1+1 ํต๋_๋ผ์ธ๋ค์ด๋น ์ ์ธ๊ณ๋ชฐ'</li><li>'๋ฑ์ฐํ๋ 1+1 ํธ์์คํธ_๋ธ๋ผ์ด ์ ์ธ๊ณ๋ชฐ'</li><li>'๋ชฝ๋ฒจ ์๋ ์ด์ค ํ๋ซ 4MM YELLOW JBSXXUZZ105 ์ ๋ฐ๋ ํ๋ ๋ฑ์ฐํ๋ 140 (์ฃผ)์ฝ์ด๋ฐธ๋ฅ'</li></ul> |
| 2.0   | <ul><li>'๊ณ ๊ธ ๊ฐ๋จ์คํํ ๊ตฌ๋๊ตฝ/์์๋ฐฉ์ง/์ถฉ๊ฒฉ์ํ/ํ์ดํ๊ตฝ ๋ธ๋_DD-107 ์๋ฏธ์ฆ'</li><li>'๋ฐ ๋ค๊ฟ์น ํจ๋ ์ฟ ์ ์ ๋ฐ ๊ตฌ๋ ์ด๋ํ ์ฌ์ด์ฆ ํด๋ ์ค์ด๊ธฐ ํจ๋ 6-ํผ๋_์์ด๋ณด๋ฆฌํ์ดํธ_One Size(2P) ์ ์คํธ์์'</li><li>'๊ณ ๊ธ ๊ฐ๋จ์คํํ ๊ตฌ๋๊ตฝ/์์๋ฐฉ์ง/์ถฉ๊ฒฉ์ํ/ํ์ดํ๊ตฝ ๋ธ๋_DD-092 ์๋ฏธ์ฆ'</li></ul> |
| 5.0 | <ul><li>'[์ฐ๋ฝ] ์๊ทธ๋์ฒ ๊น์ฐฝ ์์น ์ด๋ํ ๋ฑ์ฐํ ๊ตฐ๋ ๊ตฐ์ธ ๊ตฐํ ์์ ํ ํ๋ฐ ๊ธฐ๋ฅ์ฑ ํค๋์ด [0008]๊ทธ๋ฆฐ M(255 270) CJONSTYLE'</li><li>'[๋กฏ๋ฐ๋ฐฑํ์ ]์์ฝ(์์ฆ) ์ปดํฌํธ ์๋ธ๋ฆฌ๋ฐ์ด ์ธ์ ๋ฉ์ฆ 9059029-00101 ๋ธ๋_EU 39 ๋กฏ๋ฐ๋ฐฑํ์ _'</li><li>'๋ฑ์ฐํ ๊น์ฐฝ ๊ธฐ๋ฅ์ฑ ์ด๋ํ ํน์ ์คํฌ์ธ ์ ๋ฐ ํค๋์ด ๊ณจํํ XL275-295 ๋ง์ผํธ์ฆ'</li></ul> |
| 0.0   | <ul><li>'[ํ๋๋ฐฑํ์ ]๊ธ๊ฐ์ ํ ๋๋๋ก๋ฐ SHOSC0150SAM ํด๋์ฉ ๋ฏธ๋ ๊ตฌ๋ํค๋ผ [00001] ํด๋์ฉ ๊ตฌ๋์นผ (์ฃผ)ํ๋ํ์ผํ'</li><li>'์๋๊ฐ ์ฒดํฌ ์๊ฐ์ฃฝ ํด๋์ฉ ์ํผ navy 000 (์ฃผ)ํธ๋ผ์ด๋ณธ์ฆ'</li><li>'[๊ธ๊ฐ์ ํ](๊ด์ฃผ์ ์ธ๊ณ) ์ฝ๋ ์ ํด๋์ฉ ์ํผ ์คํธ ๋ฏธ๋ ๊ตฌ๋ ํค๋ผ N8MKA150/SHOSC0150SAM 10.5cm ์ ์ธ๊ณ๋ฐฑํ์ '</li></ul> |
| 4.0   | <ul><li>'๋น์ค๋๋ ๋จ์ฑ ์ฌ์ฑ 1ํ์ฉ๋น๋๋ง์ S M L ๋น์ฌ๋์ ๋ฐ ์ฌ๋ฆํ์ํ ์ ๋ฐ์ฐ๋น ์์_๋ ์ธ์ ๋ฐ์ปค๋ฒ ํฌ๋ช ๋ธ๋ฃจM ์คํ๋ฆฌ๋น'</li><li>'๋น์ฌ๋ ์ด์์ ์ธ ์ฌ์ฑ์ฉ ์ฑ๊ธ ์์ฆ ๊ฐ์ฃฝ ์ ๋ฐ ์ฌ์ฑ ํจ์ ๋ ์ธ์ ๋ฐ์ปค๋ฒ ๋ฉ์ค๋ฌ์ด์ฝ๋ 13_39 ์คํฐ๋ธ๋์๋ฒ'</li><li>'ํฌ๋ช ์์ฆ ํจ์ ์ํฐ ์ฅ๋ง ์ฌ์ฑ์ฅํ ๋ฏธ๋๋ผ๋ฐฉ์ง ํ์ XXL(43-45 ์ ํฉ)_๋ธ๋ฃจ-ํ์ด [๋ฏธ๋๋ผ๋ฐฉ์ง์ฐฝx2๋ ํ์ง] ๊ตฌ๋ฃก๊ธ๋ก๋ฒ'</li></ul> |
| 1.0   | <ul><li>'๊ณฐ๋์ด ๋ธ๋ ๊ฒ์ ํํธ ํ์ดํธ ๋จ์ ์ฑ์ธ ์ปคํ ์ง๋น์ถ ์๋น์ธ ์ฌํ ํ์ธ ํด๋ก๊ทธ ์ฐธ ์ฅ์์ ๋ฐ A set (๋ธ๋-ํ์ดํธ) ๋ด์ง(NYUZY)'</li><li>'์ํ ๊ธ์ํ ๋ฉํ๋ฝํ ๋๋ธ๋ ๋ฉํ๋ฐด๋ ์ ์ดํด๋ฆฝ ๋ฉํ๊ณ ์ ํ ํ๋ผ์คํฑ๊ณ ์ ํ ๊ณจ๋์ํ ํฉ๋์ํ ์ค๋ฒ์ํ ๊ธ์์ํ ๊ตต์๊ณจ๋(4๊ฐ) ์๋ ์ด์ค'</li><li>'(SAPHIR) ์ฌํผ๋ฅด ๋ ๋ ธ๋ฒ ์ดํ ์ปฌ๋ฌ ์ฌ์ํฌ๋ฆผ / ๊ฐ์ฃฝ ์ผ์์ ๋ฆฌ๋ ธ๋ฒ ์ดํ ๋ฏธ๋์๋ธ๋ผ์ด ์ ์ด์ ์ปดํผ๋'</li></ul> |
| 3.0   | <ul><li>'STRATTON ๋จ์ฑ์ฉ ์ผ๋๋ฌด ์ํธ๋ฆฌ- ๋ฏธ๊ตญ์ฐ, m / 9 - 10 ์์ฐ๋ฆฌ์ปดํผ๋'</li><li>'๋ฐ๋ณผ ์ฌ์ ์ ๋ฐ ๋จ์ ์ ๊ณจ๊ธฐ ๋ฐ๋ฑ ์ฌ์ฑํ์ดํ ๋ฐ๋ฑ ์ฝ์ฝ๋๋ผ'</li><li>'์์ค์ด๋ ์ ๊ธฐ ๊ธ์์ ๊ณจ๊ธฐ ๊ฒฝ์ฒฉํ์ ์ ๋ฌธ๊ฐ์ฉ ๋ ์ง๊ฐ๋ค ์ ์์ฉ ์ฌ์ฑ์ฉ js9997'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.9255 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac7")
# Run inference
preds = model("์์ค์ ์์ด์ฌ๋ฆผ ์ธ์ ๊ธฐ๋ฅ์ฑ ์ ๋ฐ ๊น์ฐฝ 245mm ์ฃผ์ํ์ฌ ์์ฐฝ์์ฝ")
```
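The returned values are the numeric class ids from the label table above (0.0 through 6.0), and batch inference is a matter of passing a list. A small sketch with placeholder titles:
```python
# Batch inference sketch: one numeric class id (0.0-6.0) per input title.
titles = ["placeholder product title A", "placeholder product title B"]
print(model.predict(titles))
```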
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 3 | 10.4257 | 27 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 50 |
| 2.0 | 50 |
| 3.0 | 50 |
| 4.0 | 50 |
| 5.0 | 50 |
| 6.0 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0182 | 1 | 0.3761 | - |
| 0.9091 | 50 | 0.2291 | - |
| 1.8182 | 100 | 0.033 | - |
| 2.7273 | 150 | 0.018 | - |
| 3.6364 | 200 | 0.0001 | - |
| 4.5455 | 250 | 0.0001 | - |
| 5.4545 | 300 | 0.0001 | - |
| 6.3636 | 350 | 0.0001 | - |
| 7.2727 | 400 | 0.0001 | - |
| 8.1818 | 450 | 0.0 | - |
| 9.0909 | 500 | 0.0 | - |
| 10.0 | 550 | 0.0 | - |
| 10.9091 | 600 | 0.0 | - |
| 11.8182 | 650 | 0.0 | - |
| 12.7273 | 700 | 0.0 | - |
| 13.6364 | 750 | 0.0 | - |
| 14.5455 | 800 | 0.0 | - |
| 15.4545 | 850 | 0.0 | - |
| 16.3636 | 900 | 0.0 | - |
| 17.2727 | 950 | 0.0001 | - |
| 18.1818 | 1000 | 0.0 | - |
| 19.0909 | 1050 | 0.0 | - |
| 20.0 | 1100 | 0.0 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k3_task3_organization_fold1 | MayBashendy | 2024-11-25T10:12:48Z | 161 | 0 | transformers | ["transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-25T10:11:10Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k3_task3_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k3_task3_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6023
- Qwk: 0.1852
- Mse: 0.6023
- Rmse: 0.7761
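Qwk here is the quadratic weighted Cohen's kappa. A minimal sketch of how these metrics are typically computed with scikit-learn, where `y_true` and `y_pred` are placeholders for the (unpublished) evaluation labels and model predictions:
```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# y_true / y_pred: placeholder arrays of gold labels and model predictions.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5  # e.g. 0.6023 ** 0.5 ~= 0.7761, matching the values above
```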
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
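These settings map directly onto `transformers.TrainingArguments`. A sketch of the equivalent configuration (the `output_dir` is illustrative; the Adam betas and epsilon listed above are the library defaults):
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Arabic_FineTuningAraBERT_AugV5_k3_task3_organization_fold1",  # illustrative
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```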
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.1111 | 2 | 3.2691 | 0.0041 | 3.2691 | 1.8081 |
| No log | 0.2222 | 4 | 1.5440 | -0.0168 | 1.5440 | 1.2426 |
| No log | 0.3333 | 6 | 1.0494 | -0.4426 | 1.0494 | 1.0244 |
| No log | 0.4444 | 8 | 0.7627 | 0.0 | 0.7627 | 0.8733 |
| No log | 0.5556 | 10 | 0.7207 | -0.0421 | 0.7207 | 0.8489 |
| No log | 0.6667 | 12 | 1.1277 | -0.1000 | 1.1277 | 1.0619 |
| No log | 0.7778 | 14 | 0.7978 | -0.2791 | 0.7978 | 0.8932 |
| No log | 0.8889 | 16 | 0.6799 | 0.0 | 0.6799 | 0.8246 |
| No log | 1.0 | 18 | 0.6912 | 0.0 | 0.6912 | 0.8314 |
| No log | 1.1111 | 20 | 0.7478 | -0.0233 | 0.7478 | 0.8647 |
| No log | 1.2222 | 22 | 0.7894 | -0.2692 | 0.7894 | 0.8885 |
| No log | 1.3333 | 24 | 0.8841 | -0.1074 | 0.8841 | 0.9403 |
| No log | 1.4444 | 26 | 0.9616 | 0.1646 | 0.9616 | 0.9806 |
| No log | 1.5556 | 28 | 0.8752 | 0.0403 | 0.8752 | 0.9355 |
| No log | 1.6667 | 30 | 0.7599 | -0.2655 | 0.7599 | 0.8717 |
| No log | 1.7778 | 32 | 0.8963 | 0.0571 | 0.8963 | 0.9467 |
| No log | 1.8889 | 34 | 1.2057 | 0.0 | 1.2057 | 1.0980 |
| No log | 2.0 | 36 | 1.1424 | 0.0 | 1.1424 | 1.0688 |
| No log | 2.1111 | 38 | 0.9274 | 0.0 | 0.9274 | 0.9630 |
| No log | 2.2222 | 40 | 0.6915 | 0.0 | 0.6915 | 0.8316 |
| No log | 2.3333 | 42 | 0.6934 | -0.0577 | 0.6934 | 0.8327 |
| No log | 2.4444 | 44 | 1.0658 | 0.0120 | 1.0658 | 1.0324 |
| No log | 2.5556 | 46 | 2.2327 | -0.0267 | 2.2327 | 1.4942 |
| No log | 2.6667 | 48 | 1.8881 | 0.0 | 1.8881 | 1.3741 |
| No log | 2.7778 | 50 | 1.3046 | 0.0 | 1.3046 | 1.1422 |
| No log | 2.8889 | 52 | 0.8297 | 0.1879 | 0.8297 | 0.9109 |
| No log | 3.0 | 54 | 0.6838 | 0.0 | 0.6838 | 0.8269 |
| No log | 3.1111 | 56 | 0.6598 | 0.0 | 0.6598 | 0.8123 |
| No log | 3.2222 | 58 | 0.6455 | 0.0 | 0.6455 | 0.8034 |
| No log | 3.3333 | 60 | 0.6495 | 0.0 | 0.6495 | 0.8059 |
| No log | 3.4444 | 62 | 0.6547 | 0.0 | 0.6547 | 0.8092 |
| No log | 3.5556 | 64 | 0.6983 | 0.0 | 0.6983 | 0.8357 |
| No log | 3.6667 | 66 | 0.8622 | -0.1074 | 0.8622 | 0.9286 |
| No log | 3.7778 | 68 | 1.0773 | 0.0 | 1.0773 | 1.0379 |
| No log | 3.8889 | 70 | 1.0490 | 0.0 | 1.0490 | 1.0242 |
| No log | 4.0 | 72 | 1.0322 | 0.0 | 1.0322 | 1.0160 |
| No log | 4.1111 | 74 | 0.9420 | 0.0 | 0.9420 | 0.9706 |
| No log | 4.2222 | 76 | 0.8430 | 0.0571 | 0.8430 | 0.9181 |
| No log | 4.3333 | 78 | 0.7792 | -0.0577 | 0.7792 | 0.8827 |
| No log | 4.4444 | 80 | 0.7126 | -0.0233 | 0.7126 | 0.8441 |
| No log | 4.5556 | 82 | 0.6694 | 0.0 | 0.6694 | 0.8182 |
| No log | 4.6667 | 84 | 0.6598 | 0.0 | 0.6598 | 0.8123 |
| No log | 4.7778 | 86 | 0.6827 | 0.0 | 0.6827 | 0.8263 |
| No log | 4.8889 | 88 | 0.7243 | 0.1239 | 0.7243 | 0.8511 |
| No log | 5.0 | 90 | 0.7197 | 0.1239 | 0.7197 | 0.8484 |
| No log | 5.1111 | 92 | 0.7406 | 0.0984 | 0.7406 | 0.8606 |
| No log | 5.2222 | 94 | 0.7973 | 0.0984 | 0.7973 | 0.8929 |
| No log | 5.3333 | 96 | 0.7799 | 0.0984 | 0.7799 | 0.8831 |
| No log | 5.4444 | 98 | 0.7187 | 0.1239 | 0.7187 | 0.8478 |
| No log | 5.5556 | 100 | 0.6871 | 0.1239 | 0.6871 | 0.8289 |
| No log | 5.6667 | 102 | 0.6141 | 0.0 | 0.6141 | 0.7836 |
| No log | 5.7778 | 104 | 0.5736 | 0.0 | 0.5736 | 0.7574 |
| No log | 5.8889 | 106 | 0.5768 | 0.0 | 0.5768 | 0.7595 |
| No log | 6.0 | 108 | 0.5742 | 0.0 | 0.5742 | 0.7577 |
| No log | 6.1111 | 110 | 0.5836 | 0.0 | 0.5836 | 0.7640 |
| No log | 6.2222 | 112 | 0.5875 | 0.0 | 0.5875 | 0.7665 |
| No log | 6.3333 | 114 | 0.5939 | 0.0 | 0.5939 | 0.7706 |
| No log | 6.4444 | 116 | 0.6115 | 0.0 | 0.6115 | 0.7820 |
| No log | 6.5556 | 118 | 0.6120 | 0.0 | 0.6120 | 0.7823 |
| No log | 6.6667 | 120 | 0.6020 | -0.0233 | 0.6020 | 0.7759 |
| No log | 6.7778 | 122 | 0.6050 | 0.1895 | 0.6050 | 0.7778 |
| No log | 6.8889 | 124 | 0.6034 | -0.0233 | 0.6034 | 0.7768 |
| No log | 7.0 | 126 | 0.6037 | -0.0233 | 0.6037 | 0.7770 |
| No log | 7.1111 | 128 | 0.6084 | -0.0233 | 0.6084 | 0.7800 |
| No log | 7.2222 | 130 | 0.6394 | 0.0 | 0.6394 | 0.7996 |
| No log | 7.3333 | 132 | 0.6494 | 0.0 | 0.6494 | 0.8059 |
| No log | 7.4444 | 134 | 0.6215 | -0.0233 | 0.6215 | 0.7884 |
| No log | 7.5556 | 136 | 0.6098 | 0.1895 | 0.6098 | 0.7809 |
| No log | 7.6667 | 138 | 0.6302 | 0.1538 | 0.6302 | 0.7939 |
| No log | 7.7778 | 140 | 0.6763 | 0.1239 | 0.6763 | 0.8224 |
| No log | 7.8889 | 142 | 0.6795 | 0.1239 | 0.6795 | 0.8243 |
| No log | 8.0 | 144 | 0.6233 | 0.1538 | 0.6233 | 0.7895 |
| No log | 8.1111 | 146 | 0.5968 | 0.1895 | 0.5968 | 0.7725 |
| No log | 8.2222 | 148 | 0.6159 | 0.0 | 0.6159 | 0.7848 |
| No log | 8.3333 | 150 | 0.6337 | 0.0 | 0.6337 | 0.7961 |
| No log | 8.4444 | 152 | 0.6319 | 0.0 | 0.6319 | 0.7949 |
| No log | 8.5556 | 154 | 0.6303 | 0.0 | 0.6303 | 0.7939 |
| No log | 8.6667 | 156 | 0.6209 | 0.0 | 0.6209 | 0.7880 |
| No log | 8.7778 | 158 | 0.6141 | 0.1852 | 0.6141 | 0.7837 |
| No log | 8.8889 | 160 | 0.6148 | 0.1852 | 0.6148 | 0.7841 |
| No log | 9.0 | 162 | 0.6149 | 0.1852 | 0.6149 | 0.7842 |
| No log | 9.1111 | 164 | 0.6135 | 0.1852 | 0.6135 | 0.7832 |
| No log | 9.2222 | 166 | 0.6099 | 0.1852 | 0.6099 | 0.7810 |
| No log | 9.3333 | 168 | 0.6068 | 0.1852 | 0.6068 | 0.7790 |
| No log | 9.4444 | 170 | 0.6060 | 0.1852 | 0.6060 | 0.7784 |
| No log | 9.5556 | 172 | 0.6050 | 0.1852 | 0.6050 | 0.7778 |
| No log | 9.6667 | 174 | 0.6033 | 0.1852 | 0.6033 | 0.7767 |
| No log | 9.7778 | 176 | 0.6029 | 0.1852 | 0.6029 | 0.7765 |
| No log | 9.8889 | 178 | 0.6023 | 0.1852 | 0.6023 | 0.7761 |
| No log | 10.0 | 180 | 0.6023 | 0.1852 | 0.6023 | 0.7761 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
braindao/iq-code-evmind-14b-instruct-v0.2411.1 | braindao | 2024-11-25T10:10:34Z | 8 | 0 | transformers | ["transformers", "safetensors", "qwen2", "text-generation", "llama-factory", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text-generation | 2024-11-25T10:05:01Z |
---
library_name: transformers
tags:
- llama-factory
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k2_task3_organization_fold1 | MayBashendy | 2024-11-25T10:07:49Z | 162 | 0 | transformers | ["transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-25T10:05:33Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k2_task3_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k2_task3_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6027
- Qwk: 0.0222
- Mse: 0.6027
- Rmse: 0.7763
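The checkpoint loads like any other `transformers` sequence-classification model. A minimal inference sketch (the input sentence is a placeholder):
```python
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="MayBashendy/Arabic_FineTuningAraBERT_AugV5_k2_task3_organization_fold1",
)
print(clf("نص عربي تجريبي للتصنيف"))  # placeholder Arabic input
```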
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.1429 | 2 | 3.6943 | 0.0 | 3.6943 | 1.9220 |
| No log | 0.2857 | 4 | 1.8695 | 0.0 | 1.8695 | 1.3673 |
| No log | 0.4286 | 6 | 1.0940 | 0.0 | 1.0940 | 1.0459 |
| No log | 0.5714 | 8 | 0.7397 | 0.3654 | 0.7397 | 0.8601 |
| No log | 0.7143 | 10 | 0.6617 | -0.0233 | 0.6617 | 0.8135 |
| No log | 0.8571 | 12 | 0.6396 | -0.0233 | 0.6396 | 0.7997 |
| No log | 1.0 | 14 | 0.7214 | -0.0577 | 0.7214 | 0.8493 |
| No log | 1.1429 | 16 | 0.8774 | 0.2143 | 0.8774 | 0.9367 |
| No log | 1.2857 | 18 | 0.7807 | 0.0984 | 0.7807 | 0.8836 |
| No log | 1.4286 | 20 | 0.6395 | 0.0 | 0.6395 | 0.7997 |
| No log | 1.5714 | 22 | 0.6691 | 0.0 | 0.6691 | 0.8180 |
| No log | 1.7143 | 24 | 0.8455 | -0.0916 | 0.8455 | 0.9195 |
| No log | 1.8571 | 26 | 1.2566 | 0.0 | 1.2566 | 1.1210 |
| No log | 2.0 | 28 | 1.9493 | 0.0 | 1.9493 | 1.3962 |
| No log | 2.1429 | 30 | 1.9458 | 0.0 | 1.9458 | 1.3949 |
| No log | 2.2857 | 32 | 1.4166 | 0.0 | 1.4166 | 1.1902 |
| No log | 2.4286 | 34 | 1.0109 | 0.0 | 1.0109 | 1.0054 |
| No log | 2.5714 | 36 | 0.7919 | 0.2143 | 0.7919 | 0.8899 |
| No log | 2.7143 | 38 | 0.8311 | 0.1879 | 0.8311 | 0.9117 |
| No log | 2.8571 | 40 | 0.9516 | 0.0 | 0.9516 | 0.9755 |
| No log | 3.0 | 42 | 0.9142 | 0.0 | 0.9142 | 0.9562 |
| No log | 3.1429 | 44 | 0.7883 | 0.0763 | 0.7883 | 0.8879 |
| No log | 3.2857 | 46 | 0.8340 | 0.0571 | 0.8340 | 0.9132 |
| No log | 3.4286 | 48 | 0.8201 | 0.0571 | 0.8201 | 0.9056 |
| No log | 3.5714 | 50 | 0.7335 | 0.1239 | 0.7335 | 0.8565 |
| No log | 3.7143 | 52 | 0.6410 | 0.0 | 0.6410 | 0.8006 |
| No log | 3.8571 | 54 | 0.6221 | 0.0 | 0.6221 | 0.7887 |
| No log | 4.0 | 56 | 0.6607 | 0.0 | 0.6607 | 0.8128 |
| No log | 4.1429 | 58 | 0.8841 | -0.0820 | 0.8841 | 0.9403 |
| No log | 4.2857 | 60 | 0.9114 | -0.0820 | 0.9114 | 0.9547 |
| No log | 4.4286 | 62 | 0.7034 | -0.0421 | 0.7034 | 0.8387 |
| No log | 4.5714 | 64 | 0.6314 | 0.0 | 0.6314 | 0.7946 |
| No log | 4.7143 | 66 | 0.5949 | 0.0 | 0.5949 | 0.7713 |
| No log | 4.8571 | 68 | 0.5761 | 0.0 | 0.5761 | 0.7590 |
| No log | 5.0 | 70 | 0.5803 | 0.0 | 0.5803 | 0.7618 |
| No log | 5.1429 | 72 | 0.6058 | 0.1895 | 0.6058 | 0.7783 |
| No log | 5.2857 | 74 | 0.5757 | 0.1895 | 0.5757 | 0.7588 |
| No log | 5.4286 | 76 | 0.5368 | 0.0 | 0.5368 | 0.7327 |
| No log | 5.5714 | 78 | 0.5330 | 0.0 | 0.5330 | 0.7301 |
| No log | 5.7143 | 80 | 0.5448 | 0.0 | 0.5448 | 0.7381 |
| No log | 5.8571 | 82 | 0.5658 | 0.0 | 0.5658 | 0.7522 |
| No log | 6.0 | 84 | 0.6706 | 0.0 | 0.6706 | 0.8189 |
| No log | 6.1429 | 86 | 0.8092 | 0.0222 | 0.8092 | 0.8996 |
| No log | 6.2857 | 88 | 0.8005 | 0.0222 | 0.8005 | 0.8947 |
| No log | 6.4286 | 90 | 0.6768 | 0.0222 | 0.6768 | 0.8227 |
| No log | 6.5714 | 92 | 0.5966 | 0.0 | 0.5966 | 0.7724 |
| No log | 6.7143 | 94 | 0.6005 | -0.0233 | 0.6005 | 0.7749 |
| No log | 6.8571 | 96 | 0.5849 | -0.0233 | 0.5849 | 0.7648 |
| No log | 7.0 | 98 | 0.5442 | 0.0 | 0.5442 | 0.7377 |
| No log | 7.1429 | 100 | 0.5926 | 0.0222 | 0.5926 | 0.7698 |
| No log | 7.2857 | 102 | 0.5723 | 0.0222 | 0.5723 | 0.7565 |
| No log | 7.4286 | 104 | 0.5054 | 0.0 | 0.5054 | 0.7109 |
| No log | 7.5714 | 106 | 0.4898 | 0.0 | 0.4898 | 0.6999 |
| No log | 7.7143 | 108 | 0.5001 | 0.0 | 0.5001 | 0.7072 |
| No log | 7.8571 | 110 | 0.5214 | -0.0421 | 0.5214 | 0.7221 |
| No log | 8.0 | 112 | 0.5205 | 0.0 | 0.5205 | 0.7215 |
| No log | 8.1429 | 114 | 0.5150 | 0.0 | 0.5150 | 0.7177 |
| No log | 8.2857 | 116 | 0.5367 | 0.2667 | 0.5367 | 0.7326 |
| No log | 8.4286 | 118 | 0.6132 | 0.2414 | 0.6132 | 0.7830 |
| No log | 8.5714 | 120 | 0.6227 | 0.2326 | 0.6227 | 0.7891 |
| No log | 8.7143 | 122 | 0.6027 | 0.2524 | 0.6027 | 0.7764 |
| No log | 8.8571 | 124 | 0.5833 | 0.0222 | 0.5833 | 0.7637 |
| No log | 9.0 | 126 | 0.5827 | 0.0222 | 0.5827 | 0.7634 |
| No log | 9.1429 | 128 | 0.5799 | 0.0 | 0.5799 | 0.7615 |
| No log | 9.2857 | 130 | 0.5873 | 0.0 | 0.5873 | 0.7664 |
| No log | 9.4286 | 132 | 0.5931 | 0.0222 | 0.5931 | 0.7701 |
| No log | 9.5714 | 134 | 0.5958 | 0.0222 | 0.5958 | 0.7719 |
| No log | 9.7143 | 136 | 0.5963 | 0.0222 | 0.5963 | 0.7722 |
| No log | 9.8571 | 138 | 0.6013 | 0.0222 | 0.6013 | 0.7754 |
| No log | 10.0 | 140 | 0.6027 | 0.0222 | 0.6027 | 0.7763 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
QuantFactory/Llama-Sentient-3.2-3B-Instruct-GGUF | QuantFactory | 2024-11-25T10:05:13Z | 406 | 2 | transformers | ["transformers", "gguf", "Llama", "Llama-Cpp", "Llama3.2", "Instruct", "3B", "bin", "Sentient", "text-generation", "en", "dataset:mlabonne/lmsys-arena-human-preference-55k-sharegpt", "base_model:meta-llama/Llama-3.2-3B-Instruct", "base_model:quantized:meta-llama/Llama-3.2-3B-Instruct", "license:creativeml-openrail-m", "endpoints_compatible", "region:us", "conversational"] | text-generation | 2024-11-22T04:56:16Z |
---
license: creativeml-openrail-m
datasets:
- mlabonne/lmsys-arena-human-preference-55k-sharegpt
language:
- en
base_model:
- meta-llama/Llama-3.2-3B-Instruct
pipeline_tag: text-generation
library_name: transformers
tags:
- Llama
- Llama-Cpp
- Llama3.2
- Instruct
- 3B
- bin
- Sentient
---
[](https://hf.co/QuantFactory)
# QuantFactory/Llama-Sentient-3.2-3B-Instruct-GGUF
This is a quantized version of [prithivMLmods/Llama-Sentient-3.2-3B-Instruct](https://huggingface.co/prithivMLmods/Llama-Sentient-3.2-3B-Instruct), created using llama.cpp.
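A minimal sketch of loading the GGUF weights with `llama-cpp-python`; the `filename` pattern is a hypothetical quant choice, so substitute a file that actually exists in the repo:
```python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="QuantFactory/Llama-Sentient-3.2-3B-Instruct-GGUF",
    filename="*Q4_K_M.gguf",  # hypothetical quant; pick one present in the repo
)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Introduce yourself in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```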
# Original Model Card
## Llama-Sentient-3.2-3B-Instruct Modelfile
| File Name | Size | Description | Upload Status |
|-----------------------------------------|--------------|-----------------------------------------|----------------|
| `.gitattributes` | 1.57 kB | Git attributes configuration file | Uploaded |
| `README.md` | 42 Bytes | Initial commit README | Uploaded |
| `config.json` | 1.04 kB | Configuration file | Uploaded |
| `generation_config.json` | 248 Bytes | Generation configuration file | Uploaded |
| `pytorch_model-00001-of-00002.bin` | 4.97 GB | PyTorch model file (part 1) | Uploaded (LFS) |
| `pytorch_model-00002-of-00002.bin` | 1.46 GB | PyTorch model file (part 2) | Uploaded (LFS) |
| `pytorch_model.bin.index.json` | 21.2 kB | Model index file | Uploaded |
| `special_tokens_map.json` | 477 Bytes | Special tokens mapping | Uploaded |
| `tokenizer.json` | 17.2 MB | Tokenizer JSON file | Uploaded (LFS) |
| `tokenizer_config.json` | 57.4 kB | Tokenizer configuration file | Uploaded |
| Model Type | Size | Context Length | Link |
|------------|------|----------------|------|
| GGUF | 3B | - | [๐ค Llama-Sentient-3.2-3B-Instruct-GGUF](https://huggingface.co/prithivMLmods/Llama-Sentient-3.2-3B-Instruct-GGUF) |
The **Llama-Sentient-3.2-3B-Instruct** model is a fine-tuned version of the **Llama-3.2-3B-Instruct** model, optimized for **text generation** tasks, particularly where instruction-following abilities are critical. This model is trained on the **mlabonne/lmsys-arena-human-preference-55k-sharegpt** dataset, which enhances its performance in conversational and advisory contexts, making it suitable for a wide range of applications.
### Key Use Cases:
1. **Conversational AI**: Engage in intelligent dialogue, offering coherent responses and following instructions, useful for customer support and virtual assistants.
2. **Text Generation**: Generate high-quality, contextually appropriate content such as articles, summaries, explanations, and other forms of written communication based on user prompts.
3. **Instruction Following**: Follow specific instructions with accuracy, making it ideal for tasks that require structured guidance, such as technical troubleshooting or educational assistance.
The model uses a **PyTorch-based architecture** and includes a range of necessary files such as configuration files, tokenizer files, and model weight files for deployment.
### Intended Applications:
- **Chatbots** for virtual assistance, customer support, or as personal digital assistants.
- **Content Creation Tools**, aiding in the generation of written materials, blog posts, or automated responses based on user inputs.
- **Educational and Training Systems**, providing explanations and guided learning experiences in various domains.
- **Human-AI Interaction** platforms, where the model can follow user instructions to provide personalized assistance or perform specific tasks.
With its strong foundation in instruction-following and conversational contexts, the **Llama-Sentient-3.2-3B-Instruct** model offers versatile applications for both general and specialized domains.
|
PotatoB/task_3-exp | PotatoB | 2024-11-25T10:03:32Z | 5 | 0 | null | ["safetensors", "mistral", "merge", "mergekit", "potatoB/task_2-1", "potatoB/task_1-2", "license:apache-2.0", "region:us"] | null | 2024-11-25T10:00:36Z |
---
license: apache-2.0
tags:
- merge
- mergekit
- potatoB/task_2-1
- potatoB/task_1-2
---
# task_3-exp
task_3-exp is a merged model generated for Model Kinship experiments, originating from:
* [potatoB/task_2-1](https://huggingface.co/potatoB/task_2-1)
* [potatoB/task_1-2](https://huggingface.co/potatoB/task_1-2)
## ๐งฉ Configuration
```yaml
slices:
- sources:
- model: potatoB/task_2-1
layer_range: [0, 32]
- model: potatoB/task_1-2
layer_range: [0, 32]
merge_method: slerp
base_model: potatoB/task_2-1
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: float16
```
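A sketch of applying a config like this with mergekit's Python API, assuming the YAML above is saved as `config.yaml` (paths and options are illustrative):
```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge recipe above and write the merged weights to ./task_3-exp (illustrative path).
with open("config.yaml") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))
run_merge(merge_config, out_path="./task_3-exp", options=MergeOptions(copy_tokenizer=True))
```
|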
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k1_task3_organization_fold1 | MayBashendy | 2024-11-25T10:02:13Z | 181 | 0 | transformers | ["transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-25T10:00:46Z |
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k1_task3_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k1_task3_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4881
- Qwk: 0.3654
- Mse: 0.4881
- Rmse: 0.6987
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.2 | 2 | 4.3733 | 0.0 | 4.3733 | 2.0912 |
| No log | 0.4 | 4 | 3.0656 | 0.0041 | 3.0656 | 1.7509 |
| No log | 0.6 | 6 | 1.9187 | 0.0 | 1.9187 | 1.3852 |
| No log | 0.8 | 8 | 1.2784 | 0.0 | 1.2784 | 1.1307 |
| No log | 1.0 | 10 | 1.0288 | 0.0 | 1.0288 | 1.0143 |
| No log | 1.2 | 12 | 0.8073 | 0.2143 | 0.8073 | 0.8985 |
| No log | 1.4 | 14 | 0.8244 | -0.0708 | 0.8244 | 0.9080 |
| No log | 1.6 | 16 | 0.8426 | -0.0708 | 0.8426 | 0.9179 |
| No log | 1.8 | 18 | 0.8425 | -0.2623 | 0.8425 | 0.9179 |
| No log | 2.0 | 20 | 0.7834 | -0.2737 | 0.7834 | 0.8851 |
| No log | 2.2 | 22 | 0.7676 | -0.0233 | 0.7676 | 0.8761 |
| No log | 2.4 | 24 | 0.7458 | -0.0233 | 0.7458 | 0.8636 |
| No log | 2.6 | 26 | 0.7393 | -0.0233 | 0.7393 | 0.8598 |
| No log | 2.8 | 28 | 0.7465 | -0.0233 | 0.7465 | 0.8640 |
| No log | 3.0 | 30 | 0.7807 | -0.0708 | 0.7807 | 0.8836 |
| No log | 3.2 | 32 | 0.7686 | -0.0708 | 0.7686 | 0.8767 |
| No log | 3.4 | 34 | 0.7733 | 0.1239 | 0.7733 | 0.8794 |
| No log | 3.6 | 36 | 0.7919 | -0.0820 | 0.7919 | 0.8899 |
| No log | 3.8 | 38 | 0.8260 | 0.0571 | 0.8260 | 0.9088 |
| No log | 4.0 | 40 | 0.8921 | 0.0253 | 0.8921 | 0.9445 |
| No log | 4.2 | 42 | 0.9072 | 0.0253 | 0.9072 | 0.9524 |
| No log | 4.4 | 44 | 0.8748 | 0.0403 | 0.8748 | 0.9353 |
| No log | 4.6 | 46 | 0.7669 | 0.1538 | 0.7669 | 0.8757 |
| No log | 4.8 | 48 | 0.7034 | 0.0 | 0.7034 | 0.8387 |
| No log | 5.0 | 50 | 0.7061 | 0.0 | 0.7061 | 0.8403 |
| No log | 5.2 | 52 | 0.7177 | -0.0233 | 0.7177 | 0.8472 |
| No log | 5.4 | 54 | 0.7338 | -0.0233 | 0.7338 | 0.8566 |
| No log | 5.6 | 56 | 0.7832 | 0.4590 | 0.7832 | 0.8850 |
| No log | 5.8 | 58 | 0.8263 | 0.0571 | 0.8263 | 0.9090 |
| No log | 6.0 | 60 | 0.9227 | 0.0253 | 0.9227 | 0.9606 |
| No log | 6.2 | 62 | 0.8976 | 0.0253 | 0.8976 | 0.9474 |
| No log | 6.4 | 64 | 0.8543 | 0.0403 | 0.8543 | 0.9243 |
| No log | 6.6 | 66 | 0.7962 | 0.2443 | 0.7962 | 0.8923 |
| No log | 6.8 | 68 | 0.7551 | 0.1239 | 0.7551 | 0.8690 |
| No log | 7.0 | 70 | 0.7269 | 0.0984 | 0.7269 | 0.8526 |
| No log | 7.2 | 72 | 0.6806 | 0.0984 | 0.6806 | 0.8250 |
| No log | 7.4 | 74 | 0.6234 | 0.1239 | 0.6234 | 0.7896 |
| No log | 7.6 | 76 | 0.5757 | 0.0984 | 0.5757 | 0.7587 |
| No log | 7.8 | 78 | 0.5394 | 0.3529 | 0.5394 | 0.7345 |
| No log | 8.0 | 80 | 0.4984 | 0.1239 | 0.4984 | 0.7059 |
| No log | 8.2 | 82 | 0.4521 | 0.3654 | 0.4521 | 0.6724 |
| No log | 8.4 | 84 | 0.4245 | 0.4884 | 0.4245 | 0.6516 |
| No log | 8.6 | 86 | 0.4220 | 0.4884 | 0.4220 | 0.6496 |
| No log | 8.8 | 88 | 0.4231 | 0.4884 | 0.4231 | 0.6505 |
| No log | 9.0 | 90 | 0.4275 | 0.4211 | 0.4275 | 0.6538 |
| No log | 9.2 | 92 | 0.4397 | 0.4211 | 0.4397 | 0.6631 |
| No log | 9.4 | 94 | 0.4567 | 0.4211 | 0.4567 | 0.6758 |
| No log | 9.6 | 96 | 0.4734 | 0.4211 | 0.4734 | 0.6880 |
| No log | 9.8 | 98 | 0.4866 | 0.3654 | 0.4866 | 0.6975 |
| No log | 10.0 | 100 | 0.4881 | 0.3654 | 0.4881 | 0.6987 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
mini1013/master_cate_ac4 | mini1013 | 2024-11-25T10:01:43Z | 89 | 0 | setfit | ["setfit", "safetensors", "roberta", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:mini1013/master_domain", "base_model:finetune:mini1013/master_domain", "model-index", "region:us"] | text-classification | 2024-11-25T10:01:22Z |
---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: ๊ธธ์ด์กฐ์ ์๊ฒฝ๊ณ ์ ๋ฐด๋ ์ฝ๋ฐ์นจ ํจ๋ ์ด๋ ์บ ํ ๋ฑ์ฐ ์ง๋ธ๋ผ์ด ์๋ฆฌ๋ชฝ๋
- text: ๋ ์ด๋ฐด ์๊ฒฝํ RB3691VF 2509 ๋จ์ ์ฌ์ ๋๊ทธ๋์๊ฒฝ ์์์ํ ์์จ์์ด์ํฐ
- text: ๋ฐ์ฐฉ ์คํฌ์ธ ์๊ฒฝ์ค ํ๋ค๋ฆผ๋ฐฉ์ง ์๊ฒฝ์คํธ๋ฉ ๋น์ค๋น
- text: '[ํ ๋ฐ์ดํ ]๋ฐ์ฒดํํฉํ ๋ฆฌ ๊ฐ์ฃฝ ์๊ฒฝ ์ผ์ด์ค 08 ์ค๋ ์ง_์ถ๊ฐ ์ ํจ_์ถ๊ฐ ์ ํจ ์ ์ธ๊ณ๋ชฐ'
- text: TUMI ํฌ๋ฏธ ์นด๋ณธ ํฐํ๋ ๋ช ํ ์๊ฒฝํ ๋ฉํ ์คํ์ด ๋จ์ ์ฌ์ ๊ณต์ฉ ์๊ฒฝ 04.TU10-0003-01 LFmall02
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.9104360692836626
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 6 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 5.0   | <ul><li>'์ด๊ฒฝ๋ ๊ตญ์ฐ ์๊ฒฝํ ๋ฒ ํ ์ธํ ์นด๋ณธ ํฐํ๋ ๋ฟํ ์๊ฒฝ 551-599_S571-2 ๋ธ๋ผ์ดํฌํค ENA์์ด์จ์ด'</li><li>'B019 ORIGINAL GLASS CRYSTAL GREEN '</li><li>'๋์๋ฐ์นด์ฆ์ค BROWLINE2 ํ๊ธํ ๊ทผ์ ์ธ์ ์ฐจ๋จ๋ ์ฆ ์์ด๋ผ์ดํฌ(EYE LIKE)'</li></ul> |
| 1.0   | <ul><li>'๋ ๋๋ ์๊ฐ์ฃฝ์ ๊ธ๋ผ์คํ์ฐ์นํด๋์ฉ์๊ฒฝ์ผ์ด์ค ์ด์ ๋ฏผ'</li><li>'์์ ์๊ฒฝ ์ฐ๋ ํ์ฐ์น ํธ๊ด ๋ผ์ฐ๋ ์ ๊ธ๋ผ์ค 3์ข ์ธํธ ์ ๊ทธ๋ผ์ค ํด๋ฆฝ ์๋ผ์ฐ๋ ํ๋ฆฝ ์จ ํด๋ฆฝ์ ๊ธ๋ผ์ค3์ข ์ธํธ_์ผ๋ฐ๋ธ๋ ํํฌ์'</li><li>'ํด๋์ฉ ๊ฐ์ฃฝ ์ ๊ธ๋ผ์ค ์๊ฒฝ ํ์ฐ์น ์ผ์ด์ค ๋ณด๊ดํจ ์ PU์๊ฒฝ์ผ์ด์ค_๊ทธ๋ ์ด ๋ผ์ดํํจ์ '</li></ul> |
| 3.0   | <ul><li>'์์ด์ ๊ฝ๋ฐฐ๊ธฐ์ธ์กฐ๊ฐ์ฃฝ์๊ฒฝ์ค10p์ธํธ์ ๊ธ๋ผ์ค์ค ์ ์ด๋๋ฆผ์ปค๋จธ์ค'</li><li>'์คํธ๋ฉ ์บ์ฃผ์ผ๋์์ธ์ค ์คํ ํผ์ค ์๊ฒฝ๊ฑธ์ด ๋ B ๋ํญ๊ท์ต'</li><li>'์ฒ์ฐ ํฌ๋ฆฌ์คํ ์๊ฒฝ ์ ๊ธ๋ผ์ค ๊ฑธ์ด ์ค ์์ ๋น์ฆ ๋นํฐ์ง ์์ค๋ ๋ง์คํฌ ์คํธ๋ฉ ๊ฒธ์ฉ ๋ธ๋ฃจ 3mm 70-75CM nouville'</li></ul> |
| 0.0   | <ul><li>'๊ฐค๋ฌ๋ฆฌ์ NIRNIR SUNGLASS 5 COLOR GREEN ๊ฐค๋ฌ๋ฆฌ์๋ชฐ'</li><li>'์ฌ์ ์ผ์์ด ๋ฟํ ์ ๊ทธ๋ผ์ค ์ฌ๊ทธ๋ผ์ค ๋จ์ RORGGE 2111 ์ํ์ ํ_2์ ๊ด๋ธ๋ ์จ๋ฌ์ด'</li><li>'๋ฎค์ฆ ์ํด ๋ฟํ ์ ๊ธ๋ผ์ค ์ฝ์ฝ์ ํธ์น๋ฐฑ'</li></ul> |
| 2.0   | <ul><li>'๋ก์๋ ์๊ฒฝ ์๊ตญ ์ฝํจ๋ ์ฝ๋ฐ์นจ ๋๋ฆผ ์ ๊ธ๋ผ์ค ์ฝ ํต์ฆ ๋ฐฉ์ง ํจ๋ ๊ต์ฒด ์คํฐ์ปค ์๊ฒฝ์ฝํจ๋ 1.8mm๏ผํ์ดํธ๏ผ_2.8mm(ํ์ดํธ) ๋ก์๋'</li><li>'[ํํฌ]๊ตญ์ฐ ๊ณ ๊ธ ์ด๊ทน์ธ์ฌ ๋ ์ฆ ์๊ฒฝ๋ฆ์ด ๊น์๋ฆผ๋ฐฉ์ง ํด๋ฆฌ๋ ํฌ๋ฆฌ๋ ์ ๊ธฐ์๊ฑด ์๊ฒฝ์ฒ ์ตs 05. knit ์๊ฒฝ๋ฆ์ด30๋งค 15x18cm_๋ธ๋ฃจ ๋ชจ์ํ ์ค'</li><li>'์์ฐ๋ฒ ๋ ์ฆ ์ผ์ด ํด๋ฆฌ๋ ํฐ์ 200๋งค ๋ฉ๋์'</li></ul> |
| 4.0   | <ul><li>'์ฐ๋ฆฌ์ค ์๊ฒฝ์ ๋ฆฌํจ ์๊ฒฝ์ผ์ด์ค ์ธํธ 6์ข ์๊ฒฝ์ผ์ด์ค์๋๋ชจ๋กค ์ง์์ด์น๊ธ๋ก๋ฒ'</li><li>'(์ด๊ฑฐ์ฐ) ํ๋ฆฌ๋ฏธ์ ๊ฐ์ฃฝ ์๊ฒฝ์ง ์๊ฒฝ์ผ์ด์ค ๊ฐ์ฃฝ์๊ฒฝ์ง ์ค์นด์ด ์ ์ด์ผ์ด'</li><li>'์คํธ๋ฉ ์๊ฒฝ์ผ์ด์ค ํด๋์ฉ ์๊ฒฝํ์ฐ์น ๊ฐ์ฃฝ์๊ฒฝ๋ณด๊ด์ง ์ ๊ธ๋ผ์ค๋ณด๊ด์ผ์ด์ค No.01 ์คํธ๋ฉ ์๊ฒฝ์ผ์ด์ค ๋ธ๋ ์ฌ์ ์'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.9104 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac4")
# Run inference
preds = model("๋ฐ์ฐฉ ์คํฌ์ธ ์๊ฒฝ์ค ํ๋ค๋ฆผ๋ฐฉ์ง ์๊ฒฝ์คํธ๋ฉ ๋น์ค๋น")
```
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 3 | 9.53 | 20 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 50 |
| 2.0 | 50 |
| 3.0 | 50 |
| 4.0 | 50 |
| 5.0 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0213 | 1 | 0.4524 | - |
| 1.0638 | 50 | 0.2583 | - |
| 2.1277 | 100 | 0.0642 | - |
| 3.1915 | 150 | 0.0781 | - |
| 4.2553 | 200 | 0.0806 | - |
| 5.3191 | 250 | 0.0391 | - |
| 6.3830 | 300 | 0.0011 | - |
| 7.4468 | 350 | 0.0003 | - |
| 8.5106 | 400 | 0.0001 | - |
| 9.5745 | 450 | 0.0001 | - |
| 10.6383 | 500 | 0.0 | - |
| 11.7021 | 550 | 0.0 | - |
| 12.7660 | 600 | 0.0 | - |
| 13.8298 | 650 | 0.0 | - |
| 14.8936 | 700 | 0.0 | - |
| 15.9574 | 750 | 0.0 | - |
| 17.0213 | 800 | 0.0 | - |
| 18.0851 | 850 | 0.0 | - |
| 19.1489 | 900 | 0.0 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
|
mini1013/master_cate_ac3 | mini1013 | 2024-11-25T09:57:32Z | 85 | 0 | setfit | ["setfit", "safetensors", "roberta", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:mini1013/master_domain", "base_model:finetune:mini1013/master_domain", "model-index", "region:us"] | text-classification | 2024-11-25T09:57:06Z |
---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: ์ด๋ฏ ๊ต์ฒด์ฉ ๊ฐ์ฃฝ ๋ฒจํธ๋ ๋ฒจํธ์ค ํ๋ฆฌ๋ ๋ฒจํธ ๊ฐ์ฃฝ ์๋ ์๋์ฉ 22_์๋๋ฒจํธ์ฉ ์ดํ๋ฆฌ๊ฐ์ฃฝ 3.3cm_์นด๋ฉ(42์ธ์น) ์์ค์ปดํผ๋
- text: ์ฌ์ฑ ์ฌ์ ํจ์ ์์ด๋ ๋ฐด๋ฉ ๋ฒจํธ ํจ๋ฉ ์ฝํธ ํ๋ฆฌ ํ๋ฆฌ๋ ์ํผ์ค ๊ฐ๋๊ฑด ์ฝ๋ ํจ๋ฉ๋ฒจํธ 088_(SH30)_์์ด๋ณด๋ฆฌ {SH30-Ivory} ์ค์ฐswell
- text: '[1 + 1]์ญ์ญ์คํ ๋์ด๋๋ ๋ฐด๋ฉ ๋ฒจํธ ๋จ์ฌ๊ณต์ฉ ์บ์ฅฌ์ผ ๋ฐ์ผ๋ฆฌ ๊ตฐ์ฉ ํ ํฐ์ปฌ ๋ฒจํธ 01. ๋์ด๋๋ ๋ฒจํธ 1+1_05. ๋คํฌ๋ธ๋ผ์ด_๋ผ์ดํธ๋ธ๋ผ์ด ์คํ ๋ฆฌ๋ชฐ2'
- text: '[๋ก์ ์ด] ์ ์ฅ ์บ์ฃผ์ผ ๊ฐ์ฃฝ ๋๋ธ ์์คํ๋ ๋ฉ๋นต NRMGSN011_BL ๋ธ๋_free '
- text: ๋ชจ๋์ต ๋จ์ ๊ฐ์ฃฝ ์ฒญ๋ฐ์ง๋ฒจํธ ์บ์ฃผ์ผ๋ฒจํธ ํ๋ฆฌ๋ ์ด๋์ ๊ฐ์ธ 7. ๋ธ๋ผ์ด D107_ํ๊ธ(์ ์์ฒด)_๋ณดํต๊ธธ์ด(36๊น์ง์ฐฉ์ฉ๊ฐ๋ฅ) ๋ชจ๋์พ
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.9649836541954232
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 3 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1.0   | <ul><li>'๊ณ ๋ฆฌ ์ง๊ฒ ๊ฐ๋ฐฉ ์ฌํ์ฉ ๋ฉ๋นต ํด๋ฆฝ ๋ค์ฉ๋ ์ผ๊ฐ๋ฒํด ํํฌ ์๋ก์ฐ๋ชฐ'</li><li>'ํจ์ ์ฌ์ฑ์์คํ๋ ์คํธ๋ฉ ์๋ณต ์ถ๊ทผ๋ฃฉ ์ ์ฅ ์ฝ์คํฌ ํฐ์ ํญ 2.5cm 120cm ๋งด๋งค2'</li><li>'ํจ์ ์ฌ์ฑ์์คํ๋ ์คํธ๋ฉ ์๋ณต ์ถ๊ทผ๋ฃฉ ์ ์ฅ ์ฝ์คํฌ ํ๋์ ํฐ์ ๋นจ๊ฐ์ ์ค๋ฌด๋ฌ ํญ2.5 120cm ๋งด๋งค2'</li></ul> |
| 2.0 | <ul><li>'Basic Leather Belt ๋ค์ด๋น_100cm ๋ง๋ฌ๋ฌธํ์ฌํ์ฌ'</li><li>'๋ค์ด์๋๋กค๋ ๋ฌ๋ธ๋ฆฌ ์ฌ์๋ฒจํธ 146276 ์์ฅ ๋ธ๋ผ์ด FCB0012CM_L 105 ๋ค์ํด๋ก๋ฒ๋ง์ผ'</li><li>'[๊ฐค๋ฌ๋ฆฌ์] ํค์ง์คํธ๋๋ฐฑHJBE2F406W2๋ธ๋ผ์ด ์คํฐ์น์ฅ์ ์๊ฐ์ฃฝ ์ฌ์ฑ ๋ฒจํธ(ํ์์๋) ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li></ul> |
| 0.0 | <ul><li>'(์ํฌํ๋ฆญ์ค)(๊ณต์ํ๋งค์ฒ)(23SS) ์ปจ๋ฒ ์ด์ด ๋ฒจํธ 32mm (AENSUX5577) BLACK_SM '</li><li>'[๊ฐค๋ฌ๋ฆฌ์] ํค์ง์คํธ๋๋ฐฑ HJBE2F775BK_ ๋ธ๋ ๋น๋ก๊ณ ๋ฒํด ๊ฐ์ฃฝ ์๋๋ฒจํธ(ํ์์๋) ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li><li>'๋ฅ์ค_ํธ๋๋ฐฑ (์ ๋ฌผํฌ์ฅ/์ผํ๋ฐฑ๋๋ด) ๋ธ๋ ์ฒดํฌ๋ฐฐ์ ๊ฐ์ฃฝ ์๋๋ฒจํธ DBBE3E990BK ๋กฏ๋ฐ๋ฐฑํ์ 2๊ด'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.9650 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac3")
# Run inference
preds = model("[๋ก์ ์ด] ์ ์ฅ ์บ์ฃผ์ผ ๊ฐ์ฃฝ ๋๋ธ ์์คํ๋ ๋ฉ๋นต NRMGSN011_BL ๋ธ๋_free ")
```
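The head predicts the numeric labels listed under Model Labels (`0.0`, `1.0`, `2.0`). A minimal sketch for mapping them to human-readable names follows; the names below are placeholders, since the underlying taxonomy is not published in this card:
```python
from setfit import SetFitModel
# Download from the Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac3")
# Placeholder names for the three numeric labels; substitute the real
# category names from the source taxonomy if you have them.
label_names = {0.0: "category_0", 1.0: "category_1", 2.0: "category_2"}
preds = model(["[๋ก์ ์ด] ์ ์ฅ ์บ์ฃผ์ผ ๊ฐ์ฃฝ ๋๋ธ ์์คํ๋ ๋ฉ๋นต NRMGSN011_BL ๋ธ๋_free "])
print([label_names[float(p)] for p in preds])
```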
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 3 | 9.6133 | 17 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 50 |
| 2.0 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
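These fields mirror SetFit's `TrainingArguments`. A minimal sketch of reproducing the setup is shown below; the two-row dataset is a stand-in, since the real training split is not published:
```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments
# Stand-in for the real (unpublished) training split: "text"/"label" columns.
train_dataset = Dataset.from_dict({"text": ["example title 1", "example title 2"], "label": [0.0, 1.0]})
model = SetFitModel.from_pretrained("mini1013/master_domain")
args = TrainingArguments(
    batch_size=(512, 512),              # (embedding phase, classifier phase)
    num_epochs=(20, 20),
    sampling_strategy="oversampling",
    num_iterations=40,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    warmup_proportion=0.1,
    seed=42,
)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```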
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0417 | 1 | 0.394 | - |
| 2.0833 | 50 | 0.0731 | - |
| 4.1667 | 100 | 0.0 | - |
| 6.25 | 150 | 0.0 | - |
| 8.3333 | 200 | 0.0 | - |
| 10.4167 | 250 | 0.0 | - |
| 12.5 | 300 | 0.0 | - |
| 14.5833 | 350 | 0.0 | - |
| 16.6667 | 400 | 0.0 | - |
| 18.75 | 450 | 0.0 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
omarelsayeed/LayoutReader90Small | omarelsayeed | 2024-11-25T09:55:03Z | 131 | 0 | transformers | [
"transformers",
"safetensors",
"layoutlmv3",
"token-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-25T09:54:58Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mini1013/master_cate_ac2 | mini1013 | 2024-11-25T09:55:02Z | 95 | 0 | setfit | [
"setfit",
"safetensors",
"roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:mini1013/master_domain",
"base_model:finetune:mini1013/master_domain",
"model-index",
"region:us"
] | text-classification | 2024-11-25T09:54:41Z | ---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: MLB [MLB] ๋ฃจํค ์ธ์คํธ๋ญ์ณ ๋ณผ์บก 24์ข
ํ1 203993 ์ ํ 20) 3ACP7701N-07ORL_F ์๋ํ๋ฆฌํฌ
- text: ๋จ์ฌ๊ณต์ฉ ๊ธฐ๋ณธ๊ตฐ๋ชจ 4์ปฌ๋ฌ EVE ์นดํค ์๋ธ๋ฆฌ์ฝ๊ตฟ
- text: ๊ณจ๋ด์์ด์ด๋ฒํทํ(T)7252 ๋ธ๋ผ์ด ๋ชจํฐ๋ธ์ฝ๋ฆฌ์
- text: ํจ์
์ธ๋ฒ๊ฑฐ์ง97 ๋ฒ ์ด์ง ๋ํ์ฝ๋ฆฌ์ (Digital Plus Korea)
- text: '[๋ฅ์ค](๊ฐ๋จ์ )DBHE4EL01W2 ๋ธ๋ผ์ด ์ฒดํฌ ๋ฉด ํํ
์บก ์ ์ธ๊ณ๋ฐฑํ์ '
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.8489339496048904
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 13 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 10.0 | <ul><li>'๋ฐ๋ก [Exclusive] Holiday Signature Ball Cap (20Colors) MINT GRAY ํฌ์ฑ๋ฆฐ์ง'</li><li>'(๊ณจ๋ผ) ๋จ๋๊ณต์ฉ (GL)CONTRAST STITCHED CAP (3 COLOR) WW9G3SAAC101 ์ฐํํฌ_FRE '</li><li>'๋ฐ๋ก [Exclusive] Holiday Signature Ball Cap (20Colors) STONE BLACK ํฌ์ฑ๋ฆฐ์ง'</li></ul> |
| 4.0 | <ul><li>'๊ฝ๋ฐฐ๊ธฐ ๋น๋ ๋ชจ์ ๋๊บผ์ด ๊ณจ๋ฌด ํธ ๋จ๊ฐ ์ฌ์ฑ ๊ฒจ์ธ ์บก ์ํ์นด ๋จ์ ์ปคํ ๋ํธ ์ฃผํฉ์_S(์์ด 32-52 cm) ์ค๋์ผ๋ ๋ธ'</li><li>'ํจ์๋ชจ์ ๋ฐฉํ ๋จ์ ๋ํธ ํ๋ ๊ฒจ์ธ ์ฅ๊ฐ ๊ฐ์ ์๋จธ ๋ํฐํ 3์ข์ธํธ ๊ธฐ๋ชจ ๋ธ๋ ๋ง์ดํด๋ก๋'</li><li>'ํธ๋ชจ์ ๋ฐ๋ปํ ๋์ ๋ชจ์ ์๋น ์ค๋๋จ์ฑ ๋ธ์ธ ๊ฒจ์ธ ์ต์06 ์์ค์ค์ง์ต'</li></ul> |
| 7.0 | <ul><li>'[ํํํด๋ฝ/๊ตฌ๊น์ค]๊ตฌ๊น์ค ๋ชจ์(์คํฌ์ธ /๋ฑ์ฐ/์ฌํ/๋ฐฉ์) BEST 7์ข๊ท ์ผ๊ฐ 763_๋ธ๋_D type ํํํด๋ฝ'</li><li>'์บ๊ณจ ์์๋์ด ์กํฐ๋นํฐ ๋ฒ์ผ 4480 ์ํฌ๋ฃจ M AKํ๋ผ์1๊ด'</li><li>'[๋ฒค์๋ชฝ](์ ์ธ๊ณ์ผํ์ )[23FW] WINTER BUCKET HAT - 2color NAVY_FREE ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด'</li></ul> |
| 3.0 | <ul><li>'๊ณ ํ์ฑ ๋ถ๋๋ฌ์ด ๋ฉ์ฌ ์๋จ ์ด๋์ผ์ธํ๋ ์ค์นดํ ๋๊ฑด ์ฐ๊ทธ๋ ์ด ๋๋ฆผํฝ์ณ์ค'</li><li>'[๋ก์ค์ฝ]๋ฐ๋ค๋ ์ค์นดํ ํค์ด๋ฐด๋ ํ์ด์ฆ๋ฆฌ ์์๊ฑด OLIVE DRAB_4051/Freesize ํจ์ํ๋ฌ์ค'</li><li>'ํ์ด์ฆ๋ฆฌ ๋ฐ๋ค๋ ํค์ด ๋จธ๋ฆฌ๋๊ฑด ๋น ์์๊ฑด ์ค์นดํ ๊ทธ๋ฆฐ ๋ณด๋ฌผ์ผ'</li></ul> |
| 1.0 | <ul><li>'๋ฐฉํ๋ชจ์2์ข๊ท๋ฌ์ด ํธ๋ชจ์ ๊ตฐ๋ฐค ์คํค ์ฉํ ํธ๋ํผํ ๋ง์คํฌ ์บก๋ฐฉํ๋ชจ์ 01.๋ถ๊ตฌ๋ฉ์ด๊ตฐ๋ฐฉ๋ชจ์ ์ ์ด์ผ์ด ์ํธ ๊ฐค๋ฌ๋ฆฌ'</li><li>'[MLB] ํจ๋ฉ ํธ๋ฃจํผ ๊ท๋ฌ์ด ํ(3AWMPH136-50BKS) ๋ธ๋-50BKS/59H ์์ด์ผ์ด์์ค์ค๋(์ฃผ) AKํ๋ผ์ ํํ์ '</li><li>'๊ฒจ์ธ ๊ณฐ๋์ด ํ๋ ๊ท๋ฌ์ด ๋ชจ์ ๋ชฉ๋์ด ๋๋ฌผ ํธ๋ชจ์ 05.๋ธ๋ผ์ด ์์ง์ผ์ด ์ฃผ์ํ์ฌ'</li></ul> |
| 9.0 | <ul><li>'์ค๋๋ฐฑ ํจ์๋ชจ์ snapback (ํฌํค)๊ทธ๋ ์ด์ค๋ ์ง ๋ฃจ๋๋ง์ผ'</li><li>'์ค๋๋ฐฑ ํจ์๋ชจ์ snapback ๋ ๋ ๋ฃจ๋๋ง์ผ'</li><li>'๊ณต์ฉ ๋ฉํ ์ํฌ์ธํธ ์ค๋๋ฐฑ ๋ด์์ํค์ค (32CP57111-50L) '</li></ul> |
| 0.0 | <ul><li>'๊ธฐ๋ณธ ๊ตฐ๋ชจ ๋ฒํทํ ๋ฐ๋ฆฌํฐ๋ฆฌ ์ฌ์ ๋นํฐ์ง๊ตฐ๋ชจ ๋ชจ์ ๋จ์ ๋ฒ์บฃํ ๋ธ๋ ์นดํค / FREE ์ฒด์ธ์ง๋น'</li><li>'๋นํฐ์ง ์์ฑ ๋๋ ์๋ฌธ ๋ ํฐ๋ง ์ฅ์ ํฌ์ธํธ ์ฃ์ง ๊ตฐ๋ชจ ๊ทธ๋ ์ด (์ฃผ)์ค๋ํด๋'</li><li>'์ง์ข์ ๊ตฐ๋ชจ ๋ชจ์(์ฐจ์ฝ/๊ตญ๋ด์์ฐ) ๋ค์ด๋น ํ๋ฆฌ๋ง์ผ'</li></ul> |
| 2.0 | <ul><li>'์ฌ์ ๊ฒจ์ธํ๋ฐ๋ป ๊ทน์ธ์ฌ ์ํธ๊ณฐ๋์ด๋จธ๋ฆฌ๋ ๊ท๋ง๊ฐ A24973_๋ฒ ์ด์ง_FREE ์ธ๋ธ์ ์ด์ค(7JS)'</li><li>'์ํธ ๊ณฐ๋์ด๊ท๋ง๊ฐ ๊ท๋๋ฆฌ ๋ฝ๊ธ์ด ๊ท๋ง๊ฐ ๋ฐฉํ๊ท๋ง๊ฐ ๋ชฉ๋๋ฆฌ ํ์ดํธ ํ์ฑ๋ง์ผ'</li><li>'์คํ์ผ ๋ํ๊ธฐ-36-๊ฝ๋ฐฐ๊ธฐ๋ฐฉํ๊ท๋ง๊ฐ ํํฌ ์ด๋ฏธ์ฐ'</li></ul> |
| 6.0 | <ul><li>'๊ตญ๋ด๋ฐ์ก MARITHE FRANCOIS GIRBAUD ๋ง๋ฆฌ๋ผ CABLE KNIT BEANIE blue 1MG23SHG112 ONE SIZE ์จ์ด๋ฉ'</li><li>'[๋งค์ฅ๋ฐ์ก] ๋ง๋ฆฌ๋ผ CLASSIC LOGO BEANIE black OS ์์ด์์ค๋ง์ผ'</li><li>'MARITHE FRANCOIS GIRBAUD CABLE KNIT BEANIE gray 1MG23SHG112 227185 ONE SIZE ์ํ๋ ์ค'</li></ul> |
| 8.0 | <ul><li>'๋น์์นด BIANCA (์ฌ์ฑ์ฉ) ๋๊ฐ/๋ด์ถ๋ด๋ก๊ณ _OS '</li><li>'[๋กฏ๋ฐ๋ฐฑํ์ ]ํ์ดํธ์์ฆ ๊ณต์ฉ UV ํ๋กํ์๋ฐ์ด์ ์๋์ 2.์์ด๋ณด๋ฆฌ ๋กฏ๋ฐ๋ฐฑํ์ _'</li><li>'ํ์ดํธ์์ฆ ์๋์ UV ํ๋กํ์์ฌ๋ฐ์ด์ 1์ข[00003] ์์ด๋ณด๋ฆฌ ํ๋ํ์ผํ'</li></ul> |
| 12.0 | <ul><li>'์บ๊ณจ ํํ์บก ์ธ ํ๋ ์คํ 504 K0873 ์ฌ๋ฆฌ์ค ์ธ 507 K0875 3107 ๋จ๋๊ณต์ฉ ๋ฒ ๋ ๋ชจ 3. K3107ST (Black)_SMALL ์ด์ธ์ฐ์ฆ'</li><li>'๋ค์ฉ๋ ํ์ฉ ์ง์ ์ข์์ ๋จ์ฒด ํจ์๋ชจ์ ํํ์บก ํ์ดํธ ๊ฐ์จ'</li><li>'์จ๋ฆฌ ์นดํ ๋ฐ๋ฆฌ์คํ ๋ชจ์ ๋ฒ ์ด์ปค ์บก ๋ง๋๋ก์คํ[๋ฃจ์ฆ๋ฃจ๋์ฃผ์ผ๋ฆฌ] ๋ธ๋ ์ฃผ์ํ์ฌ ์น์ด์ฆ'</li></ul> |
| 11.0 | <ul><li>'1631๋ด์ ๋ณผ์บก 6color / ๋จ๋๊ณต์ฉ๋ชจ์ ์บก๋ชจ์ ๊ทธ๋ฆฐ ๋ ์ด์ด๋์ปดํผ๋'</li><li>'ํจ์๋ฒ๊ฑฐ์ง0009 ๋ฒ๊ฑฐ์ง ๊ฐ์ ๋ชจ์ ์ฌ์ฑ ํจ์๋ฐค์ ๊ณจ๋์ฝ์คํธ'</li><li>'๊ฝ๋ฐฐ๊ธฐ๋ํธ๋ฒ๊ฑฐ์ง๋ชจ์B28016 ๊ฒ์ ํ๋ ์๋ฐ์ด๋ธ'</li></ul> |
| 5.0 | <ul><li>'๋ํธ ๋ฒ ๋ ๋ชจ S1450 ์ง์ฃผ๋ฐฉ์ธ ํํฌ ์ง์์ด์น๊ธ๋ก๋ฒ'</li><li>'[๋ฐ๋ฏผ์, ๋ผ์ด์ฆ ์๋น ์ฐฉ์ฉ] ์คํฐ๋ ๋ก๊ณ ์ธ ๋ฒ ๋ ๋ชจ ๋ธ๋ '</li><li>'/ ๋ฒ ์ด์ง ๋ ๋ ๋ด์ค๋ณด์ด์บก ๋นต๋ชจ์ (2color) ์์ด๋ณด๋ฆฌ_one size ๋กญ์ค(robs)'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.8489 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac2")
# Run inference
preds = model("๋จ์ฌ๊ณต์ฉ ๊ธฐ๋ณธ๊ตฐ๋ชจ 4์ปฌ๋ฌ EVE ์นดํค ์๋ธ๋ฆฌ์ฝ๊ตฟ")
```
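Because the classification head is a `LogisticRegression`, per-class scores are also available through `predict_proba`; a small sketch reusing the widget example above:
```python
from setfit import SetFitModel
model = SetFitModel.from_pretrained("mini1013/master_cate_ac2")
# One row of scores per input, one column per numeric label (0.0 .. 12.0).
probs = model.predict_proba(["๋จ์ฌ๊ณต์ฉ ๊ธฐ๋ณธ๊ตฐ๋ชจ 4์ปฌ๋ฌ EVE ์นดํค ์๋ธ๋ฆฌ์ฝ๊ตฟ"])
print(probs[0])
```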
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 3 | 9.5523 | 21 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 50 |
| 2.0 | 50 |
| 3.0 | 50 |
| 4.0 | 50 |
| 5.0 | 50 |
| 6.0 | 50 |
| 7.0 | 50 |
| 8.0 | 50 |
| 9.0 | 50 |
| 10.0 | 50 |
| 11.0 | 50 |
| 12.0 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0098 | 1 | 0.4348 | - |
| 0.4902 | 50 | 0.3427 | - |
| 0.9804 | 100 | 0.1921 | - |
| 1.4706 | 150 | 0.1061 | - |
| 1.9608 | 200 | 0.0544 | - |
| 2.4510 | 250 | 0.0384 | - |
| 2.9412 | 300 | 0.0155 | - |
| 3.4314 | 350 | 0.0128 | - |
| 3.9216 | 400 | 0.0177 | - |
| 4.4118 | 450 | 0.0082 | - |
| 4.9020 | 500 | 0.005 | - |
| 5.3922 | 550 | 0.0007 | - |
| 5.8824 | 600 | 0.0004 | - |
| 6.3725 | 650 | 0.0003 | - |
| 6.8627 | 700 | 0.0003 | - |
| 7.3529 | 750 | 0.0003 | - |
| 7.8431 | 800 | 0.0003 | - |
| 8.3333 | 850 | 0.0003 | - |
| 8.8235 | 900 | 0.0002 | - |
| 9.3137 | 950 | 0.0002 | - |
| 9.8039 | 1000 | 0.0001 | - |
| 10.2941 | 1050 | 0.0001 | - |
| 10.7843 | 1100 | 0.0001 | - |
| 11.2745 | 1150 | 0.0001 | - |
| 11.7647 | 1200 | 0.0001 | - |
| 12.2549 | 1250 | 0.0001 | - |
| 12.7451 | 1300 | 0.0001 | - |
| 13.2353 | 1350 | 0.0001 | - |
| 13.7255 | 1400 | 0.0001 | - |
| 14.2157 | 1450 | 0.0001 | - |
| 14.7059 | 1500 | 0.0001 | - |
| 15.1961 | 1550 | 0.0001 | - |
| 15.6863 | 1600 | 0.0001 | - |
| 16.1765 | 1650 | 0.0001 | - |
| 16.6667 | 1700 | 0.0001 | - |
| 17.1569 | 1750 | 0.0001 | - |
| 17.6471 | 1800 | 0.0001 | - |
| 18.1373 | 1850 | 0.0001 | - |
| 18.6275 | 1900 | 0.0001 | - |
| 19.1176 | 1950 | 0.0001 | - |
| 19.6078 | 2000 | 0.0001 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
ClaudeRitchie/tinyllama-vels-v1 | ClaudeRitchie | 2024-11-25T09:52:51Z | 130 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T09:50:35Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
BoltMonkey/SuperNeuralDreadDevil-8b | BoltMonkey | 2024-11-25T09:46:12Z | 43 | 1 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated",
"BoltMonkey/DreadMix",
"conversational",
"base_model:BoltMonkey/DreadMix",
"base_model:merge:BoltMonkey/DreadMix",
"base_model:BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated",
"base_model:merge:BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-13T03:25:41Z | ---
library_name: transformers
base_model:
- BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
- BoltMonkey/DreadMix
tags:
- merge
- mergekit
- lazymergekit
- BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
- BoltMonkey/DreadMix
pipeline_tag: text-generation
---
# SuperNeuralDreadDevil-8b
SuperNeuralDreadDevil-8b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated](https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated)
* [BoltMonkey/DreadMix](https://huggingface.co/BoltMonkey/DreadMix)
## ๐งฉ Configuration
```yaml
models:
- model: NousResearch/Meta-Llama-3.1-8B-Instruct
- model: BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
parameters:
density: 0.53
weight: 0.55
- model: BoltMonkey/DreadMix
parameters:
density: 0.53
weight: 0.45
merge_method: dare_ties
base_model: NousResearch/Meta-Llama-3.1-8B-Instruct
parameters:
int8_mask: true
dtype: bfloat16
```
## ๐ป Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "BoltMonkey/SuperNeuralDreadDevil-8b"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` |
PrunaAI/AmberYifan-Mistral-7B-v0.1-sft-dpo-10k-bnb-8bit-smashed | PrunaAI | 2024-11-25T09:42:42Z | 6 | 0 | null | [
"safetensors",
"mistral",
"pruna-ai",
"base_model:AmberYifan/Mistral-7B-v0.1-sft-dpo-10k",
"base_model:quantized:AmberYifan/Mistral-7B-v0.1-sft-dpo-10k",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2024-11-25T09:33:11Z | ---
thumbnail: "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"
base_model: AmberYifan/Mistral-7B-v0.1-sft-dpo-10k
metrics:
- memory_disk
- memory_inference
- inference_latency
- inference_throughput
- inference_CO2_emissions
- inference_energy_consumption
tags:
- pruna-ai
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://docs.pruna.ai/en/latest/setup/pip.html" target="_blank" rel="noopener noreferrer">
<img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/rskEr4BZJx)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentations to know more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed with llm-int8.
- ***How does the model quality change?*** The quality of the model output might vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained with the configuration described in `model/smash_config.json` and are measured after a hardware warmup. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running the benchmarks directly in your use-case conditions to see whether the smashed model benefits you.
- ***What is the model format?*** We use safetensors.
- ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.
- ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by syncing all GPU processes and stop measurement when all of them are executed. "Async" metrics are obtained without syncing all GPU processes and stop when the model output can be used by the CPU. We provide both metrics since both could be relevant depending on the use-case. We recommend to test the efficiency gains directly in your use-cases.
## Setup
You can run the smashed model with these steps:
0. Check that the requirements of the original repo AmberYifan/Mistral-7B-v0.1-sft-dpo-10k are installed. In particular, check the python, cuda, and transformers versions.
1. Make sure that you have installed quantization related packages.
```bash
pip install transformers accelerate 'bitsandbytes>0.37.0'
```
2. Load & run the model.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("PrunaAI/AmberYifan-Mistral-7B-v0.1-sft-dpo-10k-bnb-8bit-smashed", trust_remote_code=True, device_map='auto')
tokenizer = AutoTokenizer.from_pretrained("AmberYifan/Mistral-7B-v0.1-sft-dpo-10k")
input_ids = tokenizer("What is the color of prunes?", return_tensors='pt').to(model.device)["input_ids"]
outputs = model.generate(input_ids, max_new_tokens=216)
tokenizer.decode(outputs[0])
```
## Configurations
The configuration info are in `smash_config.json`.
## Credits & License
The license of the smashed model follows the license of the original model. Please check the license of the original model AmberYifan/Mistral-7B-v0.1-sft-dpo-10k, which provided the base model, before using this model. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on Pypi.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Do it by yourself [here](https://docs.pruna.ai/en/latest/setup/pip.html). |
glif-loradex-trainer/fabian3000_chillguy | glif-loradex-trainer | 2024-11-25T09:40:56Z | 846 | 1 | diffusers | [
"diffusers",
"text-to-image",
"template:sd-lora",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:finetune:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us",
"flux",
"lora",
"base_model:adapter:black-forest-labs/FLUX.1-dev"
] | text-to-image | 2024-11-25T09:40:37Z | ---
tags:
- diffusers
- text-to-image
- template:sd-lora
- base_model:black-forest-labs/FLUX.1-dev
- base_model:finetune:black-forest-labs/FLUX.1-dev
- license:other
- region:us
- flux
- lora
widget:
- output:
url: samples/1732527571108__000001500_0.jpg
text: chillguy as a spartan warrior
- output:
url: samples/1732527596137__000001500_1.jpg
text: chillguy with text saying I AM CHILL
- output:
url: samples/1732527621130__000001500_2.jpg
text: black and white photographic portrait of chillguy
base_model: black-forest-labs/FLUX.1-dev
trigger: chillguy
instance_prompt: chillguy
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
# chillguy
Model trained with [AI Toolkit by Ostris](https://github.com/ostris/ai-toolkit) under the [Glif Loradex program](https://huggingface.co/glif-loradex-trainer) by [Glif](https://glif.app) user `fabian3000`.
<Gallery />
## Trigger words
You should use `chillguy` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/glif-loradex-trainer/fabian3000_chillguy/tree/main) them in the Files & versions tab.
## License
This model is licensed under the [flux-1-dev-non-commercial-license](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md).
|
huihui-ai/Llama-3.2-3B-Instruct-abliterated | huihui-ai | 2024-11-25T09:39:09Z | 3,315 | 50 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"abliterated",
"uncensored",
"conversational",
"base_model:meta-llama/Llama-3.2-3B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-3B-Instruct",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-09-28T05:20:02Z | ---
library_name: transformers
license: llama3.2
base_model: meta-llama/Llama-3.2-3B-Instruct
tags:
- abliterated
- uncensored
---
# ๐ฆ Llama-3.2-3B-Instruct-abliterated
This is an uncensored version of Llama 3.2 3B Instruct created with abliteration (see [this article](https://huggingface.co/blog/mlabonne/abliteration) to know more about it).
Special thanks to [@FailSpy](https://huggingface.co/failspy) for the original code and technique. Please follow him if you're interested in abliterated models.
## ollama
You can use [huihui_ai/llama3.2-abliterate:3b](https://ollama.com/huihui_ai/llama3.2-abliterate:3b) directly,
```
ollama run huihui_ai/llama3.2-abliterate:3b
```
or create your own model using the following methods.
1. Download this model.
```
huggingface-cli download huihui-ai/Llama-3.2-3B-Instruct-abliterated --local-dir ./huihui-ai/Llama-3.2-3B-Instruct-abliterated
```
2. Get Llama-3.2-3B-Instruct model for reference.
```
ollama pull llama3.2
```
3. Export Llama-3.2-3B-Instruct model parameters.
```
ollama show llama3.2 --modelfile > Modelfile
```
4. Modify the Modelfile: remove all comment lines (indicated by #) before the "FROM" keyword, then replace the "FROM" line with the following content.
```
FROM huihui-ai/Llama-3.2-3B-Instruct-abliterated
```
5. Then use `ollama create` to build the quantized model.
```
ollama create --quantize q4_K_M -f Modelfile Llama-3.2-3B-Instruct-abliterated-q4_K_M
```
6. Run model
```
ollama run Llama-3.2-3B-Instruct-abliterated-q4_K_M
```
The running architecture is llama.
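If you would rather skip ollama, the safetensors weights in this repository also load directly with Transformers; a minimal sketch (the prompt and generation settings here are illustrative):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "huihui-ai/Llama-3.2-3B-Instruct-abliterated"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")
messages = [{"role": "user", "content": "Hello, who are you?"}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```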
## Evaluations
The following results were re-evaluated and are reported as the average over each test.
| Benchmark | Llama-3.2-3B-Instruct | Llama-3.2-3B-Instruct-abliterated |
|-------------|-----------------------|-----------------------------------|
| IF_Eval | 76.55 | **76.76** |
| MMLU Pro | 27.88 | **28.00** |
| TruthfulQA | 50.55 | **50.73** |
| BBH | 41.81 | **41.86** |
| GPQA | 28.39 | **28.41** |
The script used for evaluation can be found inside this repository under /eval.sh, or click [here](https://huggingface.co/huihui-ai/Llama-3.2-3B-Instruct-abliterated/blob/main/eval.sh)
|
huihui-ai/Llama-3.2-1B-Instruct-abliterated | huihui-ai | 2024-11-25T09:36:23Z | 632 | 6 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"abliterated",
"uncensored",
"conversational",
"base_model:meta-llama/Llama-3.2-1B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-1B-Instruct",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-01T18:46:39Z | ---
library_name: transformers
license: llama3.2
base_model: meta-llama/Llama-3.2-1B-Instruct
tags:
- abliterated
- uncensored
---
# ๐ฆ Llama-3.2-1B-Instruct-abliterated
This is an uncensored version of Llama 3.2 1B Instruct created with abliteration (see [this article](https://huggingface.co/blog/mlabonne/abliteration) to know more about it).
Special thanks to [@FailSpy](https://huggingface.co/failspy) for the original code and technique. Please follow him if you're interested in abliterated models.
## ollama
You can use [huihui_ai/llama3.2-abliterate:1b](https://ollama.com/huihui_ai/llama3.2-abliterate:1b) directly,
```
ollama run huihui_ai/llama3.2-abliterate:1b
```
## Evaluations
The following results were re-evaluated and are reported as the average over each test.
| Benchmark | Llama-3.2-1B-Instruct | Llama-3.2-1B-Instruct-abliterated |
|-------------|-----------------------|-----------------------------------|
| IF_Eval | **58.50** | 56.88 |
| MMLU Pro | **16.35** | 14.35 |
| TruthfulQA | **43.08** | 38.96 |
| BBH | **33.75** | 31.83 |
| GPQA | 25.96 | **26.39** |
The script used for evaluation can be found inside this repository under /eval.sh, or click [here](https://huggingface.co/huihui-ai/Llama-3.2-1B-Instruct-abliterated/blob/main/eval.sh)
|
mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF | mradermacher | 2024-11-25T09:34:00Z | 34 | 0 | transformers | [
"transformers",
"gguf",
"text-generation-inference",
"sft",
"chocolatine",
"fr",
"dataset:jpacifico/french-orca-pairs-culinary-9865",
"dataset:jpacifico/finetome_french_cook_definitions_v2",
"base_model:jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1",
"base_model:quantized:jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1",
"license:mit",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-25T09:10:57Z | ---
base_model: jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1
datasets:
- jpacifico/french-orca-pairs-culinary-9865
- jpacifico/finetome_french_cook_definitions_v2
language:
- fr
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- text-generation-inference
- transformers
- sft
- chocolatine
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
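As a quick start, the sketch below downloads one of the quants from the table and runs it with `llama-cpp-python`; the chosen file and prompt are illustrative:
```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # pip install llama-cpp-python
path = hf_hub_download(
    repo_id="mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF",
    filename="Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_K_M.gguf",
)
llm = Llama(model_path=path, n_ctx=4096)
out = llm.create_chat_completion(messages=[{"role": "user", "content": "Donne-moi une recette simple."}])
print(out["choices"][0]["message"]["content"])
```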
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ1_S.gguf) | i1-IQ1_S | 1.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ1_M.gguf) | i1-IQ1_M | 1.1 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ2_XS.gguf) | i1-IQ2_XS | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ2_S.gguf) | i1-IQ2_S | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ2_M.gguf) | i1-IQ2_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q2_K.gguf) | i1-Q2_K | 1.5 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 1.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ3_XS.gguf) | i1-IQ3_XS | 1.7 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ3_S.gguf) | i1-IQ3_S | 1.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ3_M.gguf) | i1-IQ3_M | 1.9 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q3_K_M.gguf) | i1-Q3_K_M | 2.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q3_K_L.gguf) | i1-Q3_K_L | 2.1 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-IQ4_XS.gguf) | i1-IQ4_XS | 2.2 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 2.3 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 2.3 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 2.3 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_0.gguf) | i1-Q4_0 | 2.3 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_K_S.gguf) | i1-Q4_K_S | 2.3 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q4_K_M.gguf) | i1-Q4_K_M | 2.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q5_K_S.gguf) | i1-Q5_K_S | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q5_K_M.gguf) | i1-Q5_K_M | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.i1-Q6_K.gguf) | i1-Q6_K | 3.2 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mini1013/master_cate_ac0 | mini1013 | 2024-11-25T09:33:58Z | 154 | 0 | setfit | [
"setfit",
"safetensors",
"roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:mini1013/master_domain",
"base_model:finetune:mini1013/master_domain",
"model-index",
"region:us"
] | text-classification | 2024-11-25T09:33:32Z | ---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: '[ํค์ง์คACC]HJBA3F885BK[13์ธ์น ๋
ธํธ๋ถ ์๋ฉ๊ฐ๋ฅ][KEVIN]๋ธ๋ ์ฐธ์ฅ์ ํฌ๋ก์ค ๊ฒธ์ฉ ๋ฏธ๋ ํ ํธ๋ฐฑ ์์ด์ผ์ด์์ค์ค๋ (์ฃผ)
AK์ธํฐ๋ท์ผํ๋ชฐ'
- text: ๋ง์ ค๋ ๋ฉ์ ์ ๋ฐฑ ํฌ๋ก์ค๋ฐฑ ์ฌ๋ง๋ฐฑ ํ์ ํ์ ํ์ ์ฌ์ฑ ๋จ์ ์บ์ฃผ์ผ ํฌ๋ก์ค ์ฌํ์ฉ ์ฌ๊ถ ํธ๋ํฐ ๋ณด์กฐ ํ์ ๊ฐ๋ฐฉ LKHS-304_B-์ฐํํฌ(+ํคํ๋)
๋๋ธ์ ํ
- text: ๋ง์ ค๋ ๋ฉ์ ์ ๋ฐฑ ํฌ๋ก์ค๋ฐฑ ์ฌ๋ง๋ฐฑ ํ์ ํ์ ํ์ ์ฌ์ฑ ๋จ์ ์บ์ฃผ์ผ ํฌ๋ก์ค ์ฌํ์ฉ ์ฌ๊ถ ํธ๋ํฐ ๋ณด์กฐ ํ์ ๊ฐ๋ฐฉ ML-1928_์ฐ๊ทธ๋ ์ด
๋๋ธ์ ํ
- text: '[๊ฐค๋ฌ๋ฆฌ์] JUBA4E021G2 [MATEO] ๊ทธ๋ ์ด ๋ก๊ณ ํ๋ฆฐํธ ์๋๋ฐฑ JUBA4E021G2 [MATEO] ๊ทธ๋ ์ด ๋ก๊ณ ํ๋ฆฐํธ ์๋๋ฐฑ
NSํ์ผํ_NS๋ชฐ'
- text: '[๋์ค์ปค๋ฒ๋ฆฌ](์ ์ธ๊ณ๊ฐ๋จ์ )[23N] ๋์ค์ปค๋ฒ๋ฆฌ ๋ฏธ๋ ์ฌ๋ง๋ฐฑ (DXSG0043N) IVD ๋คํฌ ์์ด๋ณด๋ฆฌ_F ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด'
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.8488667448221962
name: Metric
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 9 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 6.0 | <ul><li>'[์ง์คํ์ดํธ](๊ด์ฃผ์ ์ธ๊ณ)๋ธ๋ ํด๋์ ํด๋ฌ์น๋ฐฑ [JUWA2F392BK] ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด'</li><li>'์ฌํ ํด๋ฌ์น๋ฐฑ EOCFHX257BK/์์ค์ฝฐ์ด์ ๋ธ๋ ๋กฏ๋ฐ์ผํ(์ฃผ)'</li><li>'[๋ํ] ์ํํธ๊ทธ๋ ์ธ ํ์ฐ์น ๋ฒ ์ด์ง CG180263CL ๋ฒ ์ด์ง (์ฃผ)์จ์ ์ด์ด์์ '</li></ul> |
| 3.0 | <ul><li>'์์ง๋์ด๋๊ฐ๋จผ์ธ ๋ธ๋ ๋์ผ๋ก ํ ํธ๋ฐฑ 23F1H034BLACK ์ฃผ์ํ์ฌ ์ด๋์ด๋ญ์ค'</li><li>'[๊ฐ์ด๊ฑฐ] ํผํ๋ ๋ ์ฒด์ธ ์๋๋ฐฑ (+ํ๋ฉ์ง๊ฐ) ์บ๋ฌ๋ฉ ๋ธ๋ผ์ด (์ฃผ)์ฐ๋ฆฌํ์ผํ'</li><li>'ํ ํธ ๋ธ๋ฆฌํ ํฌ๋ก์ค๋ฐฑ FT8570 ๋ธ๋ ๊ธ๋ก๋ฆฌํ'</li></ul> |
| 4.0 | <ul><li>'์ฌ์์บ๋ฒ์ค ๊ฐ๋ฐฉ ์ฝ๋ ํฌ๋ก์ค๋ฐฑ ๋จ์์์ฝ๋ฐฑ ์ ๋ฐ BLUE ๊ณ ์ค๋ฐ'</li><li>'์ฌํ์ ์์ฝ๋ฐฑ ์์ด๋ณด๋ฆฌ ๊ฐ๋ฐฉ ๋จ๋๊ณต์ฉ ์บ์ฃผ์ผ ์ผํผ๋ฐฑ ์์ผ์ด์ '</li><li>'ํจ์์์ฝ๋ฐฑ ๋ฐ์ผ๋ฆฌ ๊ฐ๋ฐฉ ์บ์ฃผ์ผ ์๋๋ฐฑ ๋ธ๋ผ์ด ์ฌ์ '</li></ul> |
| 7.0 | <ul><li>'[๊ฐค๋ฌ๋ฆฌ์] 644040 2BKPI 1000 ONE SIZE ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li><li>'[๊ฐค๋ฌ๋ฆฌ์] ํค์ง์คํธ๋๋ฐฑ ๊ทธ๋ฆฐ ์์ฑ๊ฐ์ฃฝ ํฌ๋ก์ค ๊ฒธ์ฉ ํ ํธ๋ฐฑ HJBA3E301E2(ํ์์๋) ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li><li>'[๋ฉ์ข ํค์ธ ๋ค] ๋ก๊ณ ํ๋ฆฐํธ ์ฝํผ ํ ํธ๋ฐฑ ๋ธ๋ฃจ LW05102WW0008 BLUE_FREE ์ ์ธ๊ณ๋ชฐ'</li></ul> |
| 2.0 | <ul><li>'๋ฐ๋ฒ ๊ฐ์ฃฝ ์ฝํ์๋ฅ ๊ฐ๋ฐฉ ๋ธ๋ฆฌํ ์ผ์ด์ค UBA0004 NAVY ๋ด์ํธ๋ ์ด๋ฉ'</li><li>'[๋กฏ๋ฐ๋ฐฑํ์ ]์์ค์ฝฐ์ด์ 23FW ์ ์ ๊ฒฝ๋ ๋์ผ๋ก ๋ธํธ๋ถ ์๋ฉ ๋จ์ฌ ๋ฐ์ผ๋ฆฌ ํ ํธ ํฌ๋ก์ค๋ฐฑ EOCFHX258BK ๋กฏ๋ฐ๋ฐฑํ์ _'</li><li>'22FW ์ ์ ๋ด ํฌ๋ฉ ์ฌ๋ฆผ ์คํ์ด ์ฌํ ๋น์ฆ๋์ค ์บ์ฃผ์ผ ์๋ฅ๊ฐ๋ฐฉ ECBFHX227GY ๋กฏ๋ฐ๋ฐฑํ์ 1๊ด'</li></ul> |
| 1.0 | <ul><li>'NATIONALGEOGRAPHIC N225USD340 ๋ค์ด๋ธ ํ๋ฌ์ค V3 BLACK 240 ๋งฅ์คํฌ'</li><li>'๋ ์คํฌ์ญ ๋ณด์ด์ ๋ฐฑํฉ ๊ฒฝ๋ ๋์ผ๋ก ๋ณด๋ถ์ ๋ณต์กฐ๋ฆฌ ๊ฐ๋ฐฉ 7839 ํ๋ผ์ ํ์ด์ต'</li><li>'๋ ์คํฌ์ญ ๋ณด์ด์ ๋ฐฑํฉ ๊ฒฝ๋ Voyager Backpack 7839 ๋ธ๋ ํํ๋ํ'</li></ul> |
| 0.0 | <ul><li>'[๊ฐค๋ฌ๋ฆฌ์] ํค์ง์คํธ๋๋ฐฑHJBA2F770BK_ ๋ธ๋ ๋ก๊ณ ์ฅ์ ์๋ฆฌ๋ ๋ฉ์ ์ ธ๋ฐฑ(ํ์์๋) ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li><li>'๋ก์๋๋ก์ ํ์ฌ ๋ฉ์ฌ ํฌ์ผ ํฌ๋ก์ค ๋ฉ์ ์ ๋ฐฑ (์์ด๋ณด๋ฆฌ) ํฌ๋ก์ค๋ฐฑ FREE ๊ฐ๋ฐฉํ'</li><li>'[๋ณธ์ฌ๊ณต์] ํํ ๋ฉ์ ์ ๋ฐฑ ์ฌ์ฒผ S EOCBS04 008 ๋กฏ๋ฐ์์ด๋ชฐ'</li></ul> |
| 5.0 | <ul><li>'ํฉ์ธ์ดํ ๊ฐ๋ฐฉ GO ํฌ๋ก์ค๋ฐ๋ ๋ฐฑ 2.5L / PACSAFE URBAN ๋๋๋ฐฉ์ง ์ ๋ฝ ํด์ธ ์ฌํ ๋ฑ์ฐ ์ฌ๋ง๋ฐฑ ํฌ๋ก์ค๋ฐฑ RFID์ฐจ๋จ 1. ์ ํธ ๋ธ๋ (JET BLACK) ์๊ณ1์ํ์์น'</li><li>'์จํ์ฝ[Chantaco] ๋ ๋ ํฌ๋ก์ค๋ฐฑ BB NH3271C53N 000/๋ผ์ฝ์คํ๋กฏ๋ฐ์ผํ(์ฃผ)'</li><li>'ํฉ์ธ์ดํ ๊ฐ๋ฐฉ GO ํฌ๋ก์ค๋ฐ๋ ๋ฐฑ 2.5L / PACSAFE URBAN ๋๋๋ฐฉ์ง ์ ๋ฝ ํด์ธ ์ฌํ ๋ฑ์ฐ ์ฌ๋ง๋ฐฑ ํฌ๋ก์ค๋ฐฑ RFID์ฐจ๋จ 2. ๋ก์ฆ (ROSE) ์๊ณ1์ํ์์น'</li></ul> |
| 8.0 | <ul><li>'[๊ธฐํ๊ณต์์] ๋ฐ์ผ๋ฆฌ ์ฌ๋ง๋ฐฑ ํฌ๋ก์ค ํ์ ํ๋ฆฌ๊ฐ๋ฐฉ ์คํฌ์ธ ๋ฑ์ฐ ํ์ ํ๋ฆฌ์ ์ฌ๋ง๋ฐฑ ๋ณด์กฐ๊ฐ๋ฐฉ ๊ธ๋ก๋ฆฌ์ปค๋จธ์ค'</li><li>'๊ตฌ์ฐ GG ์บ๋ฒ์ค ํฌ์จ์ด ๋ฐธํธ๋ฐฑ ํ์ 630915 KY9KN 9886 ์ ๋์ธ'</li><li>'๋ฒจํธํ ํธ๋ํฐ ํ๋ฆฌ๊ฐ๋ฐฉ ๋จ์ ๋ฒจํธ๋ฐฑ ์ธ๋กํ ๊ฐ์ฃฝ ๋ฒจํธํ์ฐ์น ์ง๊ฐ ํ๋ฆฌ๋ฒจํธ์ผ์ด์ค ๋ธ๋ผ์ด ์์ฃผ๊ตฌ๋งค'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.8489 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_ac0")
# Run inference
preds = model("[๋์ค์ปค๋ฒ๋ฆฌ](์ ์ธ๊ณ๊ฐ๋จ์ )[23N] ๋์ค์ปค๋ฒ๋ฆฌ ๋ฏธ๋ ์ฌ๋ง๋ฐฑ (DXSG0043N) IVD ๋คํฌ ์์ด๋ณด๋ฆฌ_F ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด")
```
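The same call also accepts a batch; a short sketch classifying two product titles taken from this card at once:
```python
from setfit import SetFitModel
model = SetFitModel.from_pretrained("mini1013/master_cate_ac0")
titles = [
    "[๋์ค์ปค๋ฒ๋ฆฌ](์ ์ธ๊ณ๊ฐ๋จ์ )[23N] ๋์ค์ปค๋ฒ๋ฆฌ ๋ฏธ๋ ์ฌ๋ง๋ฐฑ (DXSG0043N) IVD ๋คํฌ ์์ด๋ณด๋ฆฌ_F ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด",
    "[์ง์คํ์ดํธ](๊ด์ฃผ์ ์ธ๊ณ)๋ธ๋ ํด๋์ ํด๋ฌ์น๋ฐฑ [JUWA2F392BK] ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด",
]
preds = model(titles)  # one numeric label (0.0 .. 8.0) per title
print(preds)
```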
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 4 | 9.2289 | 29 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 50 |
| 2.0 | 50 |
| 3.0 | 50 |
| 4.0 | 50 |
| 5.0 | 50 |
| 6.0 | 50 |
| 7.0 | 50 |
| 8.0 | 50 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0141 | 1 | 0.3958 | - |
| 0.7042 | 50 | 0.3012 | - |
| 1.4085 | 100 | 0.1811 | - |
| 2.1127 | 150 | 0.0599 | - |
| 2.8169 | 200 | 0.0333 | - |
| 3.5211 | 250 | 0.0169 | - |
| 4.2254 | 300 | 0.0005 | - |
| 4.9296 | 350 | 0.0003 | - |
| 5.6338 | 400 | 0.0002 | - |
| 6.3380 | 450 | 0.0003 | - |
| 7.0423 | 500 | 0.0001 | - |
| 7.7465 | 550 | 0.0001 | - |
| 8.4507 | 600 | 0.0001 | - |
| 9.1549 | 650 | 0.0001 | - |
| 9.8592 | 700 | 0.0001 | - |
| 10.5634 | 750 | 0.0 | - |
| 11.2676 | 800 | 0.0001 | - |
| 11.9718 | 850 | 0.0001 | - |
| 12.6761 | 900 | 0.0001 | - |
| 13.3803 | 950 | 0.0 | - |
| 14.0845 | 1000 | 0.0 | - |
| 14.7887 | 1050 | 0.0 | - |
| 15.4930 | 1100 | 0.0 | - |
| 16.1972 | 1150 | 0.0 | - |
| 16.9014 | 1200 | 0.0 | - |
| 17.6056 | 1250 | 0.0 | - |
| 18.3099 | 1300 | 0.0 | - |
| 19.0141 | 1350 | 0.0 | - |
| 19.7183 | 1400 | 0.0 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF | mradermacher | 2024-11-25T09:31:34Z | 83 | 0 | transformers | [
"transformers",
"gguf",
"text-generation-inference",
"sft",
"chocolatine",
"fr",
"dataset:jpacifico/french-orca-pairs-culinary-9865",
"dataset:jpacifico/finetome_french_cook_definitions_v2",
"base_model:jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1",
"base_model:quantized:jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1",
"license:mit",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T09:00:57Z | ---
base_model: jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1
datasets:
- jpacifico/french-orca-pairs-culinary-9865
- jpacifico/finetome_french_cook_definitions_v2
language:
- fr
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- text-generation-inference
- transformers
- sft
- chocolatine
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/jpacifico/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
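As one concrete, strictly illustrative option, a quant from the table below can be run locally with the `llama-cpp-python` bindings; the filename and prompt here are assumptions, not upstream instructions:
```python
from llama_cpp import Llama

# Example: the Q4_K_M file from the table below, downloaded beforehand.
llm = Llama(model_path="Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q4_K_M.gguf", n_ctx=4096)

out = llm("Donne-moi une recette simple de ratatouille.", max_tokens=200)
print(out["choices"][0]["text"])
```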
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q2_K.gguf) | Q2_K | 1.5 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q3_K_S.gguf) | Q3_K_S | 1.8 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q3_K_M.gguf) | Q3_K_M | 2.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q3_K_L.gguf) | Q3_K_L | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.IQ4_XS.gguf) | IQ4_XS | 2.2 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q4_0_4_4.gguf) | Q4_0_4_4 | 2.3 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q4_K_S.gguf) | Q4_K_S | 2.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q4_K_M.gguf) | Q4_K_M | 2.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q5_K_S.gguf) | Q5_K_S | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q5_K_M.gguf) | Q5_K_M | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q6_K.gguf) | Q6_K | 3.2 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.Q8_0.gguf) | Q8_0 | 4.2 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1-GGUF/resolve/main/Chocolatine-Cook-3B-combined-SFT-DPO-v0.1.f16.gguf) | f16 | 7.7 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
leinad-deinor/Llama3.2-3b-redeIT-XML-GGUF | leinad-deinor | 2024-11-25T09:22:10Z | 9 | 0 | null | [
"gguf",
"llama",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T08:13:52Z | ---
license: apache-2.0
---
|
mini1013/master_item_ac | mini1013 | 2024-11-25T09:19:32Z | 423 | 0 | setfit | [
"setfit",
"safetensors",
"roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:klue/roberta-base",
"base_model:finetune:klue/roberta-base",
"model-index",
"region:us"
] | text-classification | 2024-11-25T09:19:09Z | ---
base_model: klue/roberta-base
library_name: setfit
metrics:
- metric
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: '[์์ฒด์ ์] 14k ์ฝฉ์ฌ๋ค๋ฆฌ ์ฒด์ธ ๋ฐ์ง ํํฌ_D style(1ํผ ๊ตต๊ธฐ)_10ํธ (์ฃผ)์ ์ด๋์์ด์ธํฐ๋ด์
๋'
- text: ์ค๋ฆฌ์ฝ ๋์ ์ง๊ฐ ์ฌํ ์บ๋ฆญํฐ [on] ๋ธ๋์บฃ(๋์ ์ง๊ฐ) ๋น150
- text: ์ฒดํฌ ๋จ์ ๋ฒ ๋ ๋ชจ ์๋น ๋ชจ์ ํํ
์บก ํจ์
๋นต๋ชจ์ ์ธ์ถ ๋ฒ ์ด์ง์ฒดํฌ (4JS) ํฌ์ ์ด์ค
- text: TIMBERLAND ๋จ์ฑ ์จ๋ฒ 6์ธ์น ์ํฐํ๋ฃจํ ์์ปค๋ถ์ธ _TB0A1OIZC641 070(250) ๋น์ธ ์ปดํผ๋
- text: ๋ผ์ธ๋์คํ ํฌ์คํ ์คํฌ์ธ ์ฌ์ฑ ์ฌ์ฆํ ๋์คํ ๋ณผ๋ฃธ ๋ชจ๋ ๋ฏธ๋ํ 37_๋ธ๋ ์คํธ๋ ์ดํธ 3.5cm/๊ตฝ(๋ฉ์ฌ) ์ฌ๋์ต๋ค
inference: true
model-index:
- name: SetFit with klue/roberta-base
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: metric
value: 0.9385943021823656
name: Metric
---
# SetFit with klue/roberta-base
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [klue/roberta-base](https://huggingface.co/klue/roberta-base) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [klue/roberta-base](https://huggingface.co/klue/roberta-base)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 17 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 2.0 | <ul><li>'๋จ๋
๊ณต์ฉ ๋ฉํฐ์ค์นดํ ๋ชฉํ ์ ๋ฐ๋ค๋ ํค์ด๋ฐด๋ ๋๊ฑด ๋ธ๋ ๋น์ค๋๋ฐค'</li><li>'ํ๋ ๋ชจ์ ๊ท๋ฌ์ด ๊ฒจ์ธ ํธ๋ชจ์ ๋๋ฌผ ๋ชฉ๋์ด 03.๋ธ๋ผ์ด ๋ฟ์ต'</li><li>'ํ๋น ๋ท๋ชฉ๊ฐ๋ฆฌ๊ฐ ๋ฉ์ฌ ํตํ ์ ๊ฐ๋ ์์ธ์ ์ฐจ๋จ์ฌ์บก๊ฐ๋ ๊ทธ๋๋ชจ์ ์ฟจ๋ฉ์ฌ๋ชจ์_๊ทธ๋ ์ด ์์ค๋๋ธ์ ์ปดํผ๋'</li></ul> |
| 9.0 | <ul><li>'[LAP](๊ฐ๋จ์ )์๋ธ๋ผ ํธ๋ค ๋ฏธ๋ ํฌ๋ก์ค๋ฐฑ (AP7AB208) ์ ํธ๋ธ๋(ZB)_FREE ์ ์ธ๊ณ๋ฐฑํ์ '</li><li>'ํ์คํ
์ฌ๋ง๋ฐฑ ํ์ ๋ฏธ๋ ํฌ๋ก์ค ์๋๋ฐฑ ๊ทธ๋ฆฐ ๊นํ์ฒ '</li><li>'[๋ฉํธ๋ก์ํฐ]๋ด๋ด๋ฐฑ ํด๋ฌ์น๋ฐฑ ๋ฏธ๋ M233MQ3852Z ์์ด์ผ์ด์์ค์ค๋ (์ฃผ) AK์ธํฐ๋ท์ผํ๋ชฐ'</li></ul> |
| 15.0 | <ul><li>'ํฌ๋ฆฌ์ค๋ง์ค ๋ฑ์ง ๋ธ๋ก์น ๋ฐฐ์ง 19์ข
์ธํธ ๋ฐ ๋ฑ๊ฐ ๋ด์ ์ฌ์ด 5 ๊ตฌ๋งค๋ํ ์ด์'</li><li>'์ค๋์คํ๋์ค ODDSTUDIO ๋ฒ ์ด์ง ๋ํธ ์ฒดํฌ ๋จธํ๋ฌ - 21COLOR ๋ธ๋ CS์คํ์ด์ค'</li><li>'๋ฅ์ผ์ดํ ๋ฅ์ปคํ์ค ํ์ดํฌ์นด๋ผ ๋ ์ด์ด๋์นด๋ผ ์
์ธ ์นด๋ผ 1-์นด๋ผ-ํ์ดํธ ํ๋ณต๋๋ผ'</li></ul> |
| 13.0 | <ul><li>'ํ ์ฅฌ์ผ๋ฆฌ ๋ณด์ํจ ์ฌํ์ฉ ํฌ์ผ ๋ฏธ๋ ์
์ธ์ฌ๋ฆฌ ๋ณด๊ดํจ ์ผ์ด์ค Cํ์
-๋ฒ ์ด๋นํํฌ ์ ์ผ์ฌ'</li><li>'[๊ฐค๋ฌ๋ฆฌ์] [๋น์ค๋น๊ณจ๋] 14K ์ด์ด๋ณผ ๋ธ๋ฃจํ๋น
๋๋๋ง ๋ฐ์ง SRS39135 14K ํ์ดํธ๊ณจ๋_1ํธ ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li><li>'๋ฏธ๋๊ณจ๋ ๊น์ฒ์ 14K 18K ํธ๋ ๋ฒ ์ปคํ๋ง ๋จ์ ์ฌ์ ๊ธ๋ฐ์ง RJUC4047 RJUC4048 ๋ฒ ์ด์งํ๊ณ ์ฌํํ ๋์์ธ ์ฌ์_14K์๋ก์ฐ๊ณจ๋ ๋ฏธ๋๊ณจ๋ ๊น์ฒ์ '</li></ul> |
| 1.0 | <ul><li>'[๋ฒ ์ดํ์ฐ](์ ์ธ๊ณ๊ฐ๋จ์ )(BEARPAW) ๋จ์ฑ ํธ ์ฌ๋ฆฌํผ MARY MENS ๋ธ๋ K814001ND-M BLACK (K814001ND)_280 ์ฃผ์ํ์ฌ ์์ค์์ค์ง๋ท์ปด'</li><li>'๋
ธ์คํ์ด์ค ๋ฎฌ ์ฌ๋ฆฝ์จ ๋ธ์ด๋ชจ์
- NS93P53A ๋ธ๋_290 ๋กฏ๋ฐ๋ฐฑํ์ 2๊ด'</li><li>'์ฌ๋ฌด์ค ๋จ์ ์ฌ๋ฆฌํผ ๊ฐ์ฃฝ ๋จ์ฑ ๋น
์ฌ์ด์ฆ 48 47 ์ฌ๋ฌด์ฉ ์ ์
์์ฝ๋์ค๋ดํ blue_38 ๋ฆฌ๋ง106'</li></ul> |
| 7.0 | <ul><li>'๋ถ๋๋ฌ์ด ์ํธ๋ฆฌ ์ ๋ฐ์ฃผ๋ฆ๋ฐฉ์ง ์ ๋ฐ๋ชจ์์ ์ง ์ ๋ฐ์งํฑ 225 245 mm ์ปคํผ์ ๊ธฐ์ ๊ท'</li><li>'[๊ฐ์ฑ๋น] ๊ฟ์กฐํฉ ์ ๋๋น์ธ ์ธํธ ์บ๋ฆญํฐ ์ ๋ฐ ์
์ธ์ฌ๋ฆฌ ํฌ์ผ๋ชฌ ์ค๋ํผ ์ปค๋นํธ์์ SET ์ ๋ํ'</li><li>'MSMAX Jazz Dance Shoes Split Sole Men Dancing Sneakers High Top Boots for Women Silver 10.5 M Silver_11 Narrow ๋์ํธ479'</li></ul> |
| 11.0 | <ul><li>'์บ๋ฆฌ์ด ์ํธ์ผ์ด์ค ์๋ฉด ๊ฐ๋ฐฉํ ๊ธฐ๋ด์ฉ ๋ฐํด๊ฐ๋ฐฉ ํ์ดํธ_26์ธ์น ํผ์ค์จํธ๋ ์ด๋'</li><li>'ํด๋์ ํจ์ค ์ปค๋ฒ์ฌ๊ถ ํฌํธ์๋ ํฌํธํ์ฐ์น ํ์ฐ์น ์ฌํ์ง๊ฐ ํฌํธ ์ผ์ด์ค (01 ๋ ๋ชจ๋) ์ฃผ์ํ์ฌ์ ๋ง์ผ'</li><li>'ํด๋์ํจ์ค์ปค๋ฒ (์ํฐ์คํค๋ฐ ์ฌ๊ถ์ผ์ด์ค) (10๋ธ๋) JTEC'</li></ul> |
| 4.0 | <ul><li>'๊ณ ๊ธ ์๊ฒฝ์ง ์ ๊ธ๋ผ์ค์ง ํด๋์ฉ ์ผ์ด์ค ํ์ฐ์น ํ๋ ๋ณด๊ดํจ ๋ธ๋ ๋ค์จ๋ง์ผ'</li><li>'๊ณ ๊ธ ์ฌ ์นผ๋ผ ํฌ๋ฆฌ์คํ ๋ค์ค ๋น์ฆ ์๊ฒฝ ์ค ๋ง์คํฌ ๊ฑธ์ด ์ํ์ ํ_๋ธ๋(๊ณจ๋) ๋ฆฌ๋ฏธ๋ชฐ'</li><li>'์์ด์
๊ฝ๋ฐฐ๊ธฐ์ธ์กฐ๊ฐ์ฃฝ์๊ฒฝ์ค10p์ธํธ์ ๊ธ๋ผ์ค์ค ๋ง๋๋์ผ'</li></ul> |
| 14.0 | <ul><li>'[๊ฐค๋ฌ๋ฆฌ์] [Prada]ํ๋ผ๋ค 23FW ์ฌํผ์๋
ธ ๋ฐ์ง๊ฐ ๋ธ๋ 2MO004 QME F0002 2MO004 QME F0002 FREE ํํ๊ฐค๋ฌ๋ฆฌ์(์ฃผ)'</li><li>'๋ฅ์ค ์ก์ธ์๋ฆฌ [OSCAR][์ค์ค์นด][์ ๋ค์์ค ์ ์ฉ] ๋ค์ด๋น ํ๋ฆฌ๋ฏธ์ ํ ๊ณ ์์
๊ฐ์ฃฝ ์ฐจํค์ผ์ด์ค DBHO2F573N2 XXX ์ฃผ์ํ์ฌ LF'</li><li>'ํฐ๋ธ๋ผ์ด 23SS ๋จ์ฑ ํ๋ธ๊ทธ๋ ์ธ ๋จธ๋ํด๋ฆฝ ๋ธ๋ MAW025L 00198 001 ONE SIZE ์ฃผ์ํ์ฌ ์ด์ง๊ฒ์ธํฐ๋ด์
๋'</li></ul> |
| 0.0 | <ul><li>'[๋กฏ๋ฐ๋ฐฑํ์ ]๋ฅ์คACC [์ ๋ฌผํฌ์ฅ/์ผํ๋ฐฑ๋๋ด] [GRIDโ
ก] ๋ธ๋ผ์ด ํจํด๋ฐฐ์ ์๊ฐ์ฃฝ ํด๋ฌ์น๋ฐฑ DBBA2F266W3 ๋กฏ๋ฐ๋ฐฑํ์ _'</li><li>'๋ง๋ค๋ฆฌ๋๋ ํ ํธ๋ฐฑ PIETRO P4T05163 ์ํ์๋ชฐ'</li><li>'๋ด์
๋์ง์ค๊ทธ๋ํฝ N245ATO510 ๋ฒ ์ด์ง ์์ฝ๋ฐฑ BLACK TNSC'</li></ul> |
| 16.0 | <ul><li>'์ฌ๋ฆผ๋จธ๋ฆฌ ๋ฉํํ๋ ์ ๋ฐ๋จธ๋ฆฌ ๊ผฌ์ ์ง๊ฒํ 114 ์ ๊ด์คํธ 7cm ์ด์ง ์ํธ ํ๋ก๋์
(EG ART PRODUCTION)'</li><li>'๊ผฌ์ ๋ฉํํ๋ ์ ๋ฐ๋จธ๋ฆฌ ์ฌ๋ฆผ๋จธ๋ฆฌ ์ง๊ฒํ 114 ๋ฌด๊ด๋ก์ฆ 7cm ๋ค์ค๋ชฐ'</li><li>'ํผํผ ๋ฐฉ์ธํธ ์ฅ์ ๋ฏธ๋ ๋จธ๋ฆฌ๋ ํฌ์ธํธ ํค์ด๋ ํผํ 1P ์๊ฐ'</li></ul> |
| 8.0 | <ul><li>'๊ธฐ๋ชจ ๋กฑ ์ค๋ฒ ๋์ญ์ค ๊ฒจ์ธ ์คํํน ๋ค๋ฆฌ ์๋จธ ๋กฑ์ญ์ค ๋กฑ์๋ง ๋ฌด๋ฆ ๋ํ์ด ๋ธ๋ผ์ด ๋ฆฐ์ดํธ'</li><li>'์ต๋12์ผค๋ ๋จ์ฌ ๊ตญ์ฐ์๋ง ์ฅ๋ชฉ/๋ํธ/๊ท ์ผ๊ฐ/์ ์/์ค๋ชฉ/๋ฐ๋ชฉ/์๋ฉด/ํ์ 37~38_37.์ฌ)ํธ์ค ์ค๋ชฉ_4์ผค๋ / ๋ฒ๊ฑด๋ ํฌํฌ์ญ์ค'</li><li>'NY์ฝํผํด๋ฝ 5์ผค๋ ๊ตญ์ฐ ๊ทน์ธ์ฌ ๊ธฐ๋ชจ ๋กฑ ๋ฌด์๋ฐ ์์ฐ๋ถ ์๋ฉด์๋ง W8001-์ฌ์ฑ-์นด๋ฉ5์กฑ GSSHOP_'</li></ul> |
| 5.0 | <ul><li>'[ํ๊ตญ๊ธ๊ฑฐ๋์] ์๊ธ ์นด๋ค์ด์
๋ฐฐ์ง 1.875g ๋ถ๋ชจ๋ ์ถ์ ๋ช
์ ์์ ์์ผ ๊ธฐ๋
์ผ ๊ธฐ๋
์ถํ ๊ฐ์ฌ์ ๋ฌผ ์ฃผ์ํ์ฌ ํ๊ตญ๊ธ๊ฑฐ๋์๋์งํธ์์
'</li><li>'[ํ๊ตญ๊ธ๊ฑฐ๋์]ํ๊ตญ๊ธ๊ฑฐ๋์ ์๊ธ ์ฉ 37.5g [์๊ธ24K] ๋กฏ๋ฐ์์ด๋ชฐ'</li><li>'ํ๊ตญ๊ธ๊ฑฐ๋์ ์ค๋ฒ๋ฐ 1kg(1000g) ์ฃผ์ํ์ฌ ํ๊ตญ๊ธ๊ฑฐ๋์๋์งํธ์์
'</li></ul> |
| 10.0 | <ul><li>'์บ ํผ ๋ธ๋ฃจํฌ์ค ํธ๋ ์ฒผ์ ์ตํด๋ถ์ธ 346335 EU 39 ์ฃผ์ํ์ฌ ์๋น๋ฅด๊ธ๋ก๋ฒ์ปค๋จธ์ค(SUBIR Global Commerce)'</li><li>'์์ฝค๋ง๋ณด๋ ์์ปค ๋ถ์ธ DG3CW22519BLK ๋ธ๋_250 ๋กฏ๋ฐ์ผํ(์ฃผ) ํ๋ฆฌ๋ฏธ์์์ธ๋ ํ์๋น๋ผ์ค'</li><li>'๋ง๋ ์ฟ ํค ๊ฑฐ์คํ ์ค๋ดํ ๊ฑฐ์ค์ฌ๋ฆฌํผ ์ค๋ด์ฌ๋ฆฌํผ LWS ๊ทธ๋ ์ด265mm ์ํ๊ณต์์365'</li></ul> |
| 6.0 | <ul><li>'BOXY ๋ฐ์ ์์น์์ธ๋ BWS-S / BWS-F 1๊ตฌ ์๋ตํฐ1๊ฐ๋ก ์์์ ์ฌ์ฉ๊ฐ๋ฅ BWS-S(DG)์๋ตํฐ๋ฏธํฌํจ ์์น๋ท์ปด'</li><li>'์ง์ฅ GA-2100 2110 ์ง์์คํฌ ๋ฒ ์ ค ๋ฐด๋ ์ผ์ฒดํ ์ฉ๋ ๋ฉํ ์ฐ๋ ํ๋ฐด๋ ์ปค์คํ
์ต์
5:์ค๋ฒ+๋ธ๋๋ฒ ์ ค_1.์ผ๋ฐ๋ฒํด_ํ์ดํธ ๋ฐฉ์ธ๋ฐฉ์ธ'</li><li>'์คํ์ต ์นด์์ค MRW-200H-2B2 ๋จ์ฑ ์๋ชฉ์๊ณ c57 ์ ํ19. AW-49H-1B ์คํ์ต'</li></ul> |
| 3.0 | <ul><li>'๋จ์ ๋ฉ๋นต 2 5CM ๋จ์ฑ ๋ฐ ์ฌ์ฑ ์์คํ๋ ํด๋ฆฝ ์ฌ์ด๋ ํ์คํฐ ์คํ์ผ ํ์ฑ ๋ฐฑ ์์คํ๋ 05 ๋ฐ์ ๋นจ๊ฐ์ ํฌ๋ก์ฐ์คํ ์ด'</li><li>'๋ฉ๋นต ์ํ๋ฉ๋นต ์ฉ ๋ฉ๋นต ์ด๋ฆฐ์ด๋ฉ๋นต ๋ฉ๋นต ๋งฌ๋นต MinSellAmount ๋ชจ๋ฃจ๋ชจ๋ฃจ'</li><li>'[๋ฅ์ค ์ก์ธ์๋ฆฌ] [23FW] DBBE3F097BK ์ฌ์ฑ๋ฒจํธDD Symbol ๋ธ๋ DD๋ฉํ๋ฆญ ๊ณจ๋ ๋ฒํด ์ XXX '</li></ul> |
| 12.0 | <ul><li>'๋ฏธ๋ ํ ์ ์ฌ๋ฌด์ฉ ๊ด๋ชฉ ์์ ํํ ์ ๋ ๋๋ก์ฆ ๋ค์์ด๋ค'</li><li>'๋ฐฑํ์ ์ฌ์ฑ ๋จ์ฑ ์ฒ์ฐ ์๊ฐ์ฃฝ ์ฅ๊ฐ ์ค๋งํธํฐ ํฐ์น ํธ ์๊ฐ๋ฝ ๊ฒจ์ธ ๋ฐฉํ ๊ฐ์ฃฝ ์ปคํ ์ฅ๊ฐ 2.์ฌ์ฑ์ฉ/์ค์จ์ด๋/์ฐจ์ฝ ํ๋ ์ค'</li><li>'[์ ๋ฌผํฌ์ฅ] ์ธ ์บ์๋ฏธ์ดํผ๋ฐฉ ํ๊ฑฐํ ์ฅ๊ฐ JAGV2F310G2,JAGV2F311W2,JAGV2F312E2,JAGV2F313/์ง์คํ์ดํธ ๊ทธ๋ฆฐ ๋กฏ๋ฐ์ผํ(์ฃผ)'</li></ul> |
## Evaluation
### Metrics
| Label | Metric |
|:--------|:-------|
| **all** | 0.9386 |
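The card does not state which metric `metric` denotes. As a hedged sketch of how a comparable number could be re-checked (the texts and labels below are placeholders, not the original evaluation split):
```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_item_ac")

# Placeholder held-out data; substitute a real labeled test split.
texts = ["์ค๋ฆฌ์ฝ ๋์ ์ง๊ฐ ์ฌํ ์บ๋ฆญํฐ [on] ๋ธ๋์บฃ(๋์ ์ง๊ฐ) ๋น150"]
labels = [14.0]  # hypothetical gold label, for illustration only

preds = model.predict(texts)
accuracy = sum(float(p) == g for p, g in zip(preds, labels)) / len(labels)
print(accuracy)
```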
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the ๐ค Hub
model = SetFitModel.from_pretrained("mini1013/master_item_ac")
# Run inference
preds = model("์ค๋ฆฌ์ฝ ๋์ ์ง๊ฐ ์ฌํ ์บ๋ฆญํฐ [on] ๋ธ๋์บฃ(๋์ ์ง๊ฐ) ๋น150")
```
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 3 | 10.2537 | 30 |
| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 450 |
| 1.0 | 650 |
| 2.0 | 650 |
| 3.0 | 150 |
| 4.0 | 300 |
| 5.0 | 120 |
| 6.0 | 224 |
| 7.0 | 350 |
| 8.0 | 100 |
| 9.0 | 467 |
| 10.0 | 500 |
| 11.0 | 600 |
| 12.0 | 150 |
| 13.0 | 450 |
| 14.0 | 400 |
| 15.0 | 1000 |
| 16.0 | 250 |
### Training Hyperparameters
- batch_size: (512, 512)
- num_epochs: (20, 20)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 40
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:-----:|:-------------:|:---------------:|
| 0.0009 | 1 | 0.407 | - |
| 0.0469 | 50 | 0.3772 | - |
| 0.0939 | 100 | 0.3062 | - |
| 0.1408 | 150 | 0.2861 | - |
| 0.1878 | 200 | 0.2513 | - |
| 0.2347 | 250 | 0.2284 | - |
| 0.2817 | 300 | 0.1952 | - |
| 0.3286 | 350 | 0.149 | - |
| 0.3756 | 400 | 0.1154 | - |
| 0.4225 | 450 | 0.1042 | - |
| 0.4695 | 500 | 0.0802 | - |
| 0.5164 | 550 | 0.0765 | - |
| 0.5634 | 600 | 0.0767 | - |
| 0.6103 | 650 | 0.0475 | - |
| 0.6573 | 700 | 0.0535 | - |
| 0.7042 | 750 | 0.0293 | - |
| 0.7512 | 800 | 0.0388 | - |
| 0.7981 | 850 | 0.0156 | - |
| 0.8451 | 900 | 0.0348 | - |
| 0.8920 | 950 | 0.0241 | - |
| 0.9390 | 1000 | 0.023 | - |
| 0.9859 | 1050 | 0.0166 | - |
| 1.0329 | 1100 | 0.0124 | - |
| 1.0798 | 1150 | 0.0139 | - |
| 1.1268 | 1200 | 0.0122 | - |
| 1.1737 | 1250 | 0.0111 | - |
| 1.2207 | 1300 | 0.0062 | - |
| 1.2676 | 1350 | 0.0106 | - |
| 1.3146 | 1400 | 0.0112 | - |
| 1.3615 | 1450 | 0.0137 | - |
| 1.4085 | 1500 | 0.0154 | - |
| 1.4554 | 1550 | 0.0185 | - |
| 1.5023 | 1600 | 0.0248 | - |
| 1.5493 | 1650 | 0.0128 | - |
| 1.5962 | 1700 | 0.018 | - |
| 1.6432 | 1750 | 0.0013 | - |
| 1.6901 | 1800 | 0.0151 | - |
| 1.7371 | 1850 | 0.0208 | - |
| 1.7840 | 1900 | 0.0076 | - |
| 1.8310 | 1950 | 0.0138 | - |
| 1.8779 | 2000 | 0.0133 | - |
| 1.9249 | 2050 | 0.0131 | - |
| 1.9718 | 2100 | 0.0123 | - |
| 2.0188 | 2150 | 0.0165 | - |
| 2.0657 | 2200 | 0.0084 | - |
| 2.1127 | 2250 | 0.0062 | - |
| 2.1596 | 2300 | 0.0068 | - |
| 2.2066 | 2350 | 0.0023 | - |
| 2.2535 | 2400 | 0.006 | - |
| 2.3005 | 2450 | 0.0048 | - |
| 2.3474 | 2500 | 0.0016 | - |
| 2.3944 | 2550 | 0.0046 | - |
| 2.4413 | 2600 | 0.001 | - |
| 2.4883 | 2650 | 0.0022 | - |
| 2.5352 | 2700 | 0.0014 | - |
| 2.5822 | 2750 | 0.0004 | - |
| 2.6291 | 2800 | 0.0002 | - |
| 2.6761 | 2850 | 0.0004 | - |
| 2.7230 | 2900 | 0.0016 | - |
| 2.7700 | 2950 | 0.0018 | - |
| 2.8169 | 3000 | 0.0004 | - |
| 2.8638 | 3050 | 0.0001 | - |
| 2.9108 | 3100 | 0.0002 | - |
| 2.9577 | 3150 | 0.0018 | - |
| 3.0047 | 3200 | 0.0019 | - |
| 3.0516 | 3250 | 0.0001 | - |
| 3.0986 | 3300 | 0.0011 | - |
| 3.1455 | 3350 | 0.0001 | - |
| 3.1925 | 3400 | 0.0001 | - |
| 3.2394 | 3450 | 0.0002 | - |
| 3.2864 | 3500 | 0.0007 | - |
| 3.3333 | 3550 | 0.0001 | - |
| 3.3803 | 3600 | 0.0002 | - |
| 3.4272 | 3650 | 0.0001 | - |
| 3.4742 | 3700 | 0.0011 | - |
| 3.5211 | 3750 | 0.0013 | - |
| 3.5681 | 3800 | 0.0014 | - |
| 3.6150 | 3850 | 0.0001 | - |
| 3.6620 | 3900 | 0.0001 | - |
| 3.7089 | 3950 | 0.0002 | - |
| 3.7559 | 4000 | 0.0001 | - |
| 3.8028 | 4050 | 0.0014 | - |
| 3.8498 | 4100 | 0.0002 | - |
| 3.8967 | 4150 | 0.0001 | - |
| 3.9437 | 4200 | 0.0 | - |
| 3.9906 | 4250 | 0.0 | - |
| 4.0376 | 4300 | 0.0001 | - |
| 4.0845 | 4350 | 0.0002 | - |
| 4.1315 | 4400 | 0.0 | - |
| 4.1784 | 4450 | 0.0001 | - |
| 4.2254 | 4500 | 0.0 | - |
| 4.2723 | 4550 | 0.0 | - |
| 4.3192 | 4600 | 0.0003 | - |
| 4.3662 | 4650 | 0.0007 | - |
| 4.4131 | 4700 | 0.0 | - |
| 4.4601 | 4750 | 0.0001 | - |
| 4.5070 | 4800 | 0.0011 | - |
| 4.5540 | 4850 | 0.0003 | - |
| 4.6009 | 4900 | 0.0005 | - |
| 4.6479 | 4950 | 0.0001 | - |
| 4.6948 | 5000 | 0.0001 | - |
| 4.7418 | 5050 | 0.0001 | - |
| 4.7887 | 5100 | 0.0001 | - |
| 4.8357 | 5150 | 0.0 | - |
| 4.8826 | 5200 | 0.0 | - |
| 4.9296 | 5250 | 0.0 | - |
| 4.9765 | 5300 | 0.0001 | - |
| 5.0235 | 5350 | 0.0 | - |
| 5.0704 | 5400 | 0.0 | - |
| 5.1174 | 5450 | 0.0 | - |
| 5.1643 | 5500 | 0.0 | - |
| 5.2113 | 5550 | 0.0 | - |
| 5.2582 | 5600 | 0.0001 | - |
| 5.3052 | 5650 | 0.0 | - |
| 5.3521 | 5700 | 0.0 | - |
| 5.3991 | 5750 | 0.0 | - |
| 5.4460 | 5800 | 0.0 | - |
| 5.4930 | 5850 | 0.0 | - |
| 5.5399 | 5900 | 0.0 | - |
| 5.5869 | 5950 | 0.0 | - |
| 5.6338 | 6000 | 0.0 | - |
| 5.6808 | 6050 | 0.0 | - |
| 5.7277 | 6100 | 0.0 | - |
| 5.7746 | 6150 | 0.0 | - |
| 5.8216 | 6200 | 0.0 | - |
| 5.8685 | 6250 | 0.0 | - |
| 5.9155 | 6300 | 0.0001 | - |
| 5.9624 | 6350 | 0.0004 | - |
| 6.0094 | 6400 | 0.0007 | - |
| 6.0563 | 6450 | 0.0 | - |
| 6.1033 | 6500 | 0.0001 | - |
| 6.1502 | 6550 | 0.0 | - |
| 6.1972 | 6600 | 0.0001 | - |
| 6.2441 | 6650 | 0.0 | - |
| 6.2911 | 6700 | 0.0 | - |
| 6.3380 | 6750 | 0.0009 | - |
| 6.3850 | 6800 | 0.0 | - |
| 6.4319 | 6850 | 0.0001 | - |
| 6.4789 | 6900 | 0.0 | - |
| 6.5258 | 6950 | 0.0001 | - |
| 6.5728 | 7000 | 0.0 | - |
| 6.6197 | 7050 | 0.0 | - |
| 6.6667 | 7100 | 0.0 | - |
| 6.7136 | 7150 | 0.0 | - |
| 6.7606 | 7200 | 0.0001 | - |
| 6.8075 | 7250 | 0.0 | - |
| 6.8545 | 7300 | 0.0 | - |
| 6.9014 | 7350 | 0.0 | - |
| 6.9484 | 7400 | 0.0 | - |
| 6.9953 | 7450 | 0.0 | - |
| 7.0423 | 7500 | 0.0 | - |
| 7.0892 | 7550 | 0.0 | - |
| 7.1362 | 7600 | 0.0 | - |
| 7.1831 | 7650 | 0.0 | - |
| 7.2300 | 7700 | 0.0 | - |
| 7.2770 | 7750 | 0.0001 | - |
| 7.3239 | 7800 | 0.0 | - |
| 7.3709 | 7850 | 0.0 | - |
| 7.4178 | 7900 | 0.0 | - |
| 7.4648 | 7950 | 0.0 | - |
| 7.5117 | 8000 | 0.0 | - |
| 7.5587 | 8050 | 0.0 | - |
| 7.6056 | 8100 | 0.0 | - |
| 7.6526 | 8150 | 0.0024 | - |
| 7.6995 | 8200 | 0.0 | - |
| 7.7465 | 8250 | 0.0 | - |
| 7.7934 | 8300 | 0.0 | - |
| 7.8404 | 8350 | 0.0 | - |
| 7.8873 | 8400 | 0.0 | - |
| 7.9343 | 8450 | 0.0 | - |
| 7.9812 | 8500 | 0.0 | - |
| 8.0282 | 8550 | 0.0 | - |
| 8.0751 | 8600 | 0.0 | - |
| 8.1221 | 8650 | 0.0 | - |
| 8.1690 | 8700 | 0.0 | - |
| 8.2160 | 8750 | 0.0 | - |
| 8.2629 | 8800 | 0.0 | - |
| 8.3099 | 8850 | 0.0 | - |
| 8.3568 | 8900 | 0.0 | - |
| 8.4038 | 8950 | 0.0 | - |
| 8.4507 | 9000 | 0.0 | - |
| 8.4977 | 9050 | 0.0 | - |
| 8.5446 | 9100 | 0.0 | - |
| 8.5915 | 9150 | 0.0 | - |
| 8.6385 | 9200 | 0.0002 | - |
| 8.6854 | 9250 | 0.0003 | - |
| 8.7324 | 9300 | 0.0005 | - |
| 8.7793 | 9350 | 0.0001 | - |
| 8.8263 | 9400 | 0.0001 | - |
| 8.8732 | 9450 | 0.0001 | - |
| 8.9202 | 9500 | 0.0 | - |
| 8.9671 | 9550 | 0.0 | - |
| 9.0141 | 9600 | 0.0001 | - |
| 9.0610 | 9650 | 0.0001 | - |
| 9.1080 | 9700 | 0.0 | - |
| 9.1549 | 9750 | 0.0 | - |
| 9.2019 | 9800 | 0.0001 | - |
| 9.2488 | 9850 | 0.0 | - |
| 9.2958 | 9900 | 0.0 | - |
| 9.3427 | 9950 | 0.0 | - |
| 9.3897 | 10000 | 0.0 | - |
| 9.4366 | 10050 | 0.0 | - |
| 9.4836 | 10100 | 0.0 | - |
| 9.5305 | 10150 | 0.0 | - |
| 9.5775 | 10200 | 0.0 | - |
| 9.6244 | 10250 | 0.0 | - |
| 9.6714 | 10300 | 0.0 | - |
| 9.7183 | 10350 | 0.0 | - |
| 9.7653 | 10400 | 0.0 | - |
| 9.8122 | 10450 | 0.0 | - |
| 9.8592 | 10500 | 0.0016 | - |
| 9.9061 | 10550 | 0.0 | - |
| 9.9531 | 10600 | 0.0 | - |
| 10.0 | 10650 | 0.0 | - |
| 10.0469 | 10700 | 0.0003 | - |
| 10.0939 | 10750 | 0.0 | - |
| 10.1408 | 10800 | 0.0 | - |
| 10.1878 | 10850 | 0.0 | - |
| 10.2347 | 10900 | 0.0 | - |
| 10.2817 | 10950 | 0.0 | - |
| 10.3286 | 11000 | 0.0 | - |
| 10.3756 | 11050 | 0.0 | - |
| 10.4225 | 11100 | 0.0 | - |
| 10.4695 | 11150 | 0.0 | - |
| 10.5164 | 11200 | 0.0 | - |
| 10.5634 | 11250 | 0.0 | - |
| 10.6103 | 11300 | 0.0 | - |
| 10.6573 | 11350 | 0.0 | - |
| 10.7042 | 11400 | 0.0 | - |
| 10.7512 | 11450 | 0.0 | - |
| 10.7981 | 11500 | 0.0 | - |
| 10.8451 | 11550 | 0.0 | - |
| 10.8920 | 11600 | 0.0 | - |
| 10.9390 | 11650 | 0.0 | - |
| 10.9859 | 11700 | 0.0 | - |
| 11.0329 | 11750 | 0.0 | - |
| 11.0798 | 11800 | 0.0 | - |
| 11.1268 | 11850 | 0.0 | - |
| 11.1737 | 11900 | 0.0 | - |
| 11.2207 | 11950 | 0.0 | - |
| 11.2676 | 12000 | 0.0 | - |
| 11.3146 | 12050 | 0.0 | - |
| 11.3615 | 12100 | 0.0 | - |
| 11.4085 | 12150 | 0.0 | - |
| 11.4554 | 12200 | 0.0 | - |
| 11.5023 | 12250 | 0.0015 | - |
| 11.5493 | 12300 | 0.0 | - |
| 11.5962 | 12350 | 0.0 | - |
| 11.6432 | 12400 | 0.0 | - |
| 11.6901 | 12450 | 0.0 | - |
| 11.7371 | 12500 | 0.0 | - |
| 11.7840 | 12550 | 0.0002 | - |
| 11.8310 | 12600 | 0.0 | - |
| 11.8779 | 12650 | 0.0 | - |
| 11.9249 | 12700 | 0.0 | - |
| 11.9718 | 12750 | 0.0001 | - |
| 12.0188 | 12800 | 0.0 | - |
| 12.0657 | 12850 | 0.0 | - |
| 12.1127 | 12900 | 0.0 | - |
| 12.1596 | 12950 | 0.0001 | - |
| 12.2066 | 13000 | 0.0001 | - |
| 12.2535 | 13050 | 0.0 | - |
| 12.3005 | 13100 | 0.0 | - |
| 12.3474 | 13150 | 0.0001 | - |
| 12.3944 | 13200 | 0.0 | - |
| 12.4413 | 13250 | 0.0 | - |
| 12.4883 | 13300 | 0.0 | - |
| 12.5352 | 13350 | 0.0 | - |
| 12.5822 | 13400 | 0.0 | - |
| 12.6291 | 13450 | 0.0 | - |
| 12.6761 | 13500 | 0.0 | - |
| 12.7230 | 13550 | 0.0 | - |
| 12.7700 | 13600 | 0.0 | - |
| 12.8169 | 13650 | 0.0 | - |
| 12.8638 | 13700 | 0.0 | - |
| 12.9108 | 13750 | 0.0 | - |
| 12.9577 | 13800 | 0.0 | - |
| 13.0047 | 13850 | 0.0 | - |
| 13.0516 | 13900 | 0.0 | - |
| 13.0986 | 13950 | 0.0 | - |
| 13.1455 | 14000 | 0.0 | - |
| 13.1925 | 14050 | 0.0 | - |
| 13.2394 | 14100 | 0.0 | - |
| 13.2864 | 14150 | 0.0 | - |
| 13.3333 | 14200 | 0.0 | - |
| 13.3803 | 14250 | 0.0 | - |
| 13.4272 | 14300 | 0.0 | - |
| 13.4742 | 14350 | 0.0 | - |
| 13.5211 | 14400 | 0.0 | - |
| 13.5681 | 14450 | 0.0 | - |
| 13.6150 | 14500 | 0.0 | - |
| 13.6620 | 14550 | 0.0 | - |
| 13.7089 | 14600 | 0.0 | - |
| 13.7559 | 14650 | 0.0 | - |
| 13.8028 | 14700 | 0.0 | - |
| 13.8498 | 14750 | 0.0 | - |
| 13.8967 | 14800 | 0.0 | - |
| 13.9437 | 14850 | 0.0 | - |
| 13.9906 | 14900 | 0.0 | - |
| 14.0376 | 14950 | 0.0 | - |
| 14.0845 | 15000 | 0.0 | - |
| 14.1315 | 15050 | 0.0 | - |
| 14.1784 | 15100 | 0.0001 | - |
| 14.2254 | 15150 | 0.0 | - |
| 14.2723 | 15200 | 0.0 | - |
| 14.3192 | 15250 | 0.0 | - |
| 14.3662 | 15300 | 0.0 | - |
| 14.4131 | 15350 | 0.0 | - |
| 14.4601 | 15400 | 0.0 | - |
| 14.5070 | 15450 | 0.0 | - |
| 14.5540 | 15500 | 0.0 | - |
| 14.6009 | 15550 | 0.0 | - |
| 14.6479 | 15600 | 0.0 | - |
| 14.6948 | 15650 | 0.0 | - |
| 14.7418 | 15700 | 0.0 | - |
| 14.7887 | 15750 | 0.0 | - |
| 14.8357 | 15800 | 0.0 | - |
| 14.8826 | 15850 | 0.0 | - |
| 14.9296 | 15900 | 0.0 | - |
| 14.9765 | 15950 | 0.0 | - |
| 15.0235 | 16000 | 0.0 | - |
| 15.0704 | 16050 | 0.0 | - |
| 15.1174 | 16100 | 0.0 | - |
| 15.1643 | 16150 | 0.0 | - |
| 15.2113 | 16200 | 0.0 | - |
| 15.2582 | 16250 | 0.0 | - |
| 15.3052 | 16300 | 0.0 | - |
| 15.3521 | 16350 | 0.0 | - |
| 15.3991 | 16400 | 0.0 | - |
| 15.4460 | 16450 | 0.0 | - |
| 15.4930 | 16500 | 0.0 | - |
| 15.5399 | 16550 | 0.0 | - |
| 15.5869 | 16600 | 0.0 | - |
| 15.6338 | 16650 | 0.0 | - |
| 15.6808 | 16700 | 0.0 | - |
| 15.7277 | 16750 | 0.0 | - |
| 15.7746 | 16800 | 0.0 | - |
| 15.8216 | 16850 | 0.0 | - |
| 15.8685 | 16900 | 0.0 | - |
| 15.9155 | 16950 | 0.0 | - |
| 15.9624 | 17000 | 0.0 | - |
| 16.0094 | 17050 | 0.0 | - |
| 16.0563 | 17100 | 0.0 | - |
| 16.1033 | 17150 | 0.0 | - |
| 16.1502 | 17200 | 0.0 | - |
| 16.1972 | 17250 | 0.0 | - |
| 16.2441 | 17300 | 0.0 | - |
| 16.2911 | 17350 | 0.0 | - |
| 16.3380 | 17400 | 0.0 | - |
| 16.3850 | 17450 | 0.0 | - |
| 16.4319 | 17500 | 0.0 | - |
| 16.4789 | 17550 | 0.0 | - |
| 16.5258 | 17600 | 0.0 | - |
| 16.5728 | 17650 | 0.0 | - |
| 16.6197 | 17700 | 0.0 | - |
| 16.6667 | 17750 | 0.0 | - |
| 16.7136 | 17800 | 0.0 | - |
| 16.7606 | 17850 | 0.0 | - |
| 16.8075 | 17900 | 0.0 | - |
| 16.8545 | 17950 | 0.0 | - |
| 16.9014 | 18000 | 0.0 | - |
| 16.9484 | 18050 | 0.0 | - |
| 16.9953 | 18100 | 0.0 | - |
| 17.0423 | 18150 | 0.0 | - |
| 17.0892 | 18200 | 0.0 | - |
| 17.1362 | 18250 | 0.0 | - |
| 17.1831 | 18300 | 0.0 | - |
| 17.2300 | 18350 | 0.0 | - |
| 17.2770 | 18400 | 0.0 | - |
| 17.3239 | 18450 | 0.0 | - |
| 17.3709 | 18500 | 0.0 | - |
| 17.4178 | 18550 | 0.0 | - |
| 17.4648 | 18600 | 0.0 | - |
| 17.5117 | 18650 | 0.0 | - |
| 17.5587 | 18700 | 0.0 | - |
| 17.6056 | 18750 | 0.0 | - |
| 17.6526 | 18800 | 0.0 | - |
| 17.6995 | 18850 | 0.0 | - |
| 17.7465 | 18900 | 0.0 | - |
| 17.7934 | 18950 | 0.0 | - |
| 17.8404 | 19000 | 0.0 | - |
| 17.8873 | 19050 | 0.0 | - |
| 17.9343 | 19100 | 0.0 | - |
| 17.9812 | 19150 | 0.0 | - |
| 18.0282 | 19200 | 0.0 | - |
| 18.0751 | 19250 | 0.0 | - |
| 18.1221 | 19300 | 0.0 | - |
| 18.1690 | 19350 | 0.0 | - |
| 18.2160 | 19400 | 0.0 | - |
| 18.2629 | 19450 | 0.0 | - |
| 18.3099 | 19500 | 0.0 | - |
| 18.3568 | 19550 | 0.0 | - |
| 18.4038 | 19600 | 0.0 | - |
| 18.4507 | 19650 | 0.0 | - |
| 18.4977 | 19700 | 0.0 | - |
| 18.5446 | 19750 | 0.0 | - |
| 18.5915 | 19800 | 0.0 | - |
| 18.6385 | 19850 | 0.0 | - |
| 18.6854 | 19900 | 0.0 | - |
| 18.7324 | 19950 | 0.0 | - |
| 18.7793 | 20000 | 0.0 | - |
| 18.8263 | 20050 | 0.0 | - |
| 18.8732 | 20100 | 0.0 | - |
| 18.9202 | 20150 | 0.0 | - |
| 18.9671 | 20200 | 0.0 | - |
| 19.0141 | 20250 | 0.0 | - |
| 19.0610 | 20300 | 0.0 | - |
| 19.1080 | 20350 | 0.0 | - |
| 19.1549 | 20400 | 0.0 | - |
| 19.2019 | 20450 | 0.0 | - |
| 19.2488 | 20500 | 0.0 | - |
| 19.2958 | 20550 | 0.0 | - |
| 19.3427 | 20600 | 0.0 | - |
| 19.3897 | 20650 | 0.0 | - |
| 19.4366 | 20700 | 0.0 | - |
| 19.4836 | 20750 | 0.0 | - |
| 19.5305 | 20800 | 0.0 | - |
| 19.5775 | 20850 | 0.0 | - |
| 19.6244 | 20900 | 0.0 | - |
| 19.6714 | 20950 | 0.0 | - |
| 19.7183 | 21000 | 0.0 | - |
| 19.7653 | 21050 | 0.0 | - |
| 19.8122 | 21100 | 0.0 | - |
| 19.8592 | 21150 | 0.0 | - |
| 19.9061 | 21200 | 0.0 | - |
| 19.9531 | 21250 | 0.0 | - |
| 20.0 | 21300 | 0.0 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0.dev0
- Sentence Transformers: 3.1.1
- Transformers: 4.46.1
- PyTorch: 2.4.0+cu121
- Datasets: 2.20.0
- Tokenizers: 0.20.0
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
mradermacher/Ice0.41-22.11-RP-i1-GGUF | mradermacher | 2024-11-25T09:18:22Z | 179 | 3 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:icefog72/Ice0.41-22.11-RP",
"base_model:quantized:icefog72/Ice0.41-22.11-RP",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T09:21:22Z | ---
base_model: icefog72/Ice0.41-22.11-RP
language:
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/icefog72/Ice0.41-22.11-RP
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
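As a small illustration (an assumption, not part of the upstream README), a single quant from the table below can be fetched programmatically:
```python
from huggingface_hub import hf_hub_download

# Filename taken from the i1-Q4_K_M row of the table below.
path = hf_hub_download(
    repo_id="mradermacher/Ice0.41-22.11-RP-i1-GGUF",
    filename="Ice0.41-22.11-RP.i1-Q4_K_M.gguf",
)
print(path)  # local path to the downloaded GGUF file
```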
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ1_S.gguf) | i1-IQ1_S | 1.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ1_M.gguf) | i1-IQ1_M | 1.9 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ2_S.gguf) | i1-IQ2_S | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ2_M.gguf) | i1-IQ2_M | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q2_K.gguf) | i1-Q2_K | 2.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 2.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ3_S.gguf) | i1-IQ3_S | 3.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ3_M.gguf) | i1-IQ3_M | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.6 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q3_K_L.gguf) | i1-Q3_K_L | 3.9 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q4_0.gguf) | i1-Q4_0 | 4.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF/resolve/main/Ice0.41-22.11-RP.i1-Q6_K.gguf) | i1-Q6_K | 6.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF | mradermacher | 2024-11-25T09:17:42Z | 263 | 2 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:bunnycore/Tulu-3.1-8B-SuperNova",
"base_model:quantized:bunnycore/Tulu-3.1-8B-SuperNova",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T04:51:17Z | ---
base_model: bunnycore/Tulu-3.1-8B-SuperNova
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/bunnycore/Tulu-3.1-8B-SuperNova
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ1_S.gguf) | i1-IQ1_S | 2.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ1_M.gguf) | i1-IQ1_M | 2.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.5 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ2_S.gguf) | i1-IQ2_S | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ2_M.gguf) | i1-IQ2_M | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q2_K.gguf) | i1-Q2_K | 3.3 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ3_S.gguf) | i1-IQ3_S | 3.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ3_M.gguf) | i1-IQ3_M | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.1 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.4 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 4.8 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.8 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.8 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q4_0.gguf) | i1-Q4_0 | 4.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.7 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Tulu-3.1-8B-SuperNova-i1-GGUF/resolve/main/Tulu-3.1-8B-SuperNova.i1-Q6_K.gguf) | i1-Q6_K | 6.7 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF | mradermacher | 2024-11-25T09:17:34Z | 38 | 2 | transformers | [
"transformers",
"gguf",
"en",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T06:30:46Z | ---
base_model: ddh0/Qwen2.5-14B-Mixed-Instruct
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/ddh0/Qwen2.5-14B-Mixed-Instruct
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q2_K.gguf) | Q2_K | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q3_K_S.gguf) | Q3_K_S | 6.8 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q3_K_M.gguf) | Q3_K_M | 7.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q3_K_L.gguf) | Q3_K_L | 8.0 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.IQ4_XS.gguf) | IQ4_XS | 8.3 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q4_0_4_4.gguf) | Q4_0_4_4 | 8.6 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q4_K_S.gguf) | Q4_K_S | 8.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q4_K_M.gguf) | Q4_K_M | 9.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q5_K_S.gguf) | Q5_K_S | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q5_K_M.gguf) | Q5_K_M | 10.6 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q6_K.gguf) | Q6_K | 12.2 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-14B-Mixed-Instruct-GGUF/resolve/main/Qwen2.5-14B-Mixed-Instruct.Q8_0.gguf) | Q8_0 | 15.8 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
Sakalti/Kan1-2.5b | Sakalti | 2024-11-25T09:17:28Z | 129 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"arxiv:2403.19522",
"base_model:Qwen/Qwen2.5-7B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-7B-Instruct",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T09:15:02Z | ---
base_model:
- Qwen/Qwen2.5-7B-Instruct
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [Qwen/Qwen2.5-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct) as a base.
### Models Merged
No additional models were merged; every slice in the configuration below comes from the base model, [Qwen/Qwen2.5-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct).
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
- model: Qwen/Qwen2.5-7B-Instruct
layer_range: [0, 6]
- model: Qwen/Qwen2.5-7B-Instruct
layer_range: [7, 13]
- model: Qwen/Qwen2.5-7B-Instruct
layer_range: [14, 20]
- model: Qwen/Qwen2.5-7B-Instruct
layer_range: [21, 27]
merge_method: model_stock
base_model: Qwen/Qwen2.5-7B-Instruct
dtype: bfloat16
```
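For reference, a merge defined by the YAML above can be reproduced with mergekit's documented Python entry point (a sketch under the assumption that the configuration is saved as `config.yaml`):
```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above.
with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Write the merged model to ./merged; set cuda=True if a GPU is available.
run_merge(merge_config, out_path="./merged", options=MergeOptions(cuda=False, copy_tokenizer=True))
```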
|
mradermacher/NeuralDaredevil-12b-32k-GGUF | mradermacher | 2024-11-25T09:17:19Z | 121 | 2 | transformers | [
"transformers",
"gguf",
"merge",
"mergekit",
"lazymergekit",
"mlabonne/NeuralDaredevil-7B",
"en",
"base_model:mvpmaster/NeuralDaredevil-12b-32k",
"base_model:quantized:mvpmaster/NeuralDaredevil-12b-32k",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T18:19:56Z | ---
base_model: mvpmaster/NeuralDaredevil-12b-32k
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- merge
- mergekit
- lazymergekit
- mlabonne/NeuralDaredevil-7B
- mlabonne/NeuralDaredevil-7B
- mlabonne/NeuralDaredevil-7B
- mlabonne/NeuralDaredevil-7B
- mlabonne/NeuralDaredevil-7B
- mlabonne/NeuralDaredevil-7B
- mlabonne/NeuralDaredevil-7B
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/mvpmaster/NeuralDaredevil-12b-32k
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q2_K.gguf) | Q2_K | 4.7 | |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q3_K_S.gguf) | Q3_K_S | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q3_K_M.gguf) | Q3_K_M | 6.1 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q3_K_L.gguf) | Q3_K_L | 6.7 | |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.IQ4_XS.gguf) | IQ4_XS | 6.9 | |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q4_K_S.gguf) | Q4_K_S | 7.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q4_K_M.gguf) | Q4_K_M | 7.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q5_K_S.gguf) | Q5_K_S | 8.7 | |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q5_K_M.gguf) | Q5_K_M | 8.9 | |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q6_K.gguf) | Q6_K | 10.3 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/NeuralDaredevil-12b-32k-GGUF/resolve/main/NeuralDaredevil-12b-32k.Q8_0.gguf) | Q8_0 | 13.4 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
SeppeV/xlnet_ft_pref_10pc | SeppeV | 2024-11-25T09:10:11Z | 89 | 0 | transformers | [
"transformers",
"safetensors",
"xlnet",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T15:02:28Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
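Since this section is left as a placeholder, here is a minimal, hedged sketch based only on the repository tags (an XLNet checkpoint with a text-classification head); the label meanings are not documented here:
```python
from transformers import pipeline

# Assumes the checkpoint exposes a standard sequence-classification head.
clf = pipeline("text-classification", model="SeppeV/xlnet_ft_pref_10pc")
print(clf("An example sentence to score."))
```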
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
PrunaAI/ahmedheakl-asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer-bnb-8bit-smashed | PrunaAI | 2024-11-25T09:09:29Z | 6 | 0 | null | [
"safetensors",
"qwen2",
"pruna-ai",
"base_model:ahmedheakl/asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer",
"base_model:quantized:ahmedheakl/asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2024-11-24T09:08:22Z | ---
thumbnail: "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"
base_model: ahmedheakl/asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer
metrics:
- memory_disk
- memory_inference
- inference_latency
- inference_throughput
- inference_CO2_emissions
- inference_energy_consumption
tags:
- pruna-ai
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://docs.pruna.ai/en/latest/setup/pip.html" target="_blank" rel="noopener noreferrer">
<img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/rskEr4BZJx)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentation to learn more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed with llm-int8.
- ***How does the model quality change?*** The quality of the model output might vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained with the configuration described in `model/smash_config.json`, after a hardware warm-up. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running the benchmarks directly in your use-case conditions to see whether the smashed model benefits you.
- ***What is the model format?*** We use safetensors.
- ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.
- ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by syncing all GPU processes and stop measurement when all of them are executed. "Async" metrics are obtained without syncing all GPU processes and stop when the model output can be used by the CPU. We provide both metrics since both could be relevant depending on the use-case. We recommend to test the efficiency gains directly in your use-cases.
## Setup
You can run the smashed model with these steps:
0. Check that the requirements of the original repo ahmedheakl/asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer are installed. In particular, check the python, cuda, and transformers versions.
1. Make sure that you have installed quantization related packages.
```bash
pip install transformers accelerate 'bitsandbytes>0.37.0'
```
2. Load & run the model.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the 8-bit smashed model (weights quantized with bitsandbytes)
model = AutoModelForCausalLM.from_pretrained(
    "PrunaAI/ahmedheakl-asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer-bnb-8bit-smashed",
    trust_remote_code=True,
    device_map='auto',
)
# The tokenizer is unchanged, so it is loaded from the original base repo
tokenizer = AutoTokenizer.from_pretrained("ahmedheakl/asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer")

# Tokenize a prompt, generate, and decode the completion
input_ids = tokenizer("What is the color of prunes?", return_tensors='pt').to(model.device)["input_ids"]
outputs = model.generate(input_ids, max_new_tokens=216)
print(tokenizer.decode(outputs[0]))
```
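As a quick sanity check of the memory savings, you can inspect the loaded model's footprint. This is a minimal sketch assuming a recent `transformers` release, where `get_memory_footprint` is available on `PreTrainedModel`:
```python
# Rough check of the quantized model's memory usage (bytes -> GiB)
footprint_gib = model.get_memory_footprint() / (1024 ** 3)
print(f"Model memory footprint: {footprint_gib:.2f} GiB")
```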
## Configurations
The configuration info is in `smash_config.json`.
## Credits & License
The license of the smashed model follows the license of the original model. Please check the license of the original model, ahmedheakl/asm2asm-qwen2.5coder-0.5b-100k-2ep-tokenizer, which provided the base model, before using this one. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on Pypi.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Do it by yourself [here](https://docs.pruna.ai/en/latest/setup/pip.html). |
July-Tokyo/xlm-roberta-base-finetuned-panx-de | July-Tokyo | 2024-11-25T09:09:10Z | 133 | 0 | transformers | [
"transformers",
"safetensors",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-25T07:10:45Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-de
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-de
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1377
- F1: 0.8627
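For reference, a token-classification checkpoint like this can typically be used through the `transformers` pipeline. The snippet below is a hedged sketch: the repo id is this model's, but the exact entity label set depends on the (undocumented) training data, which by the model name appears to be German PAN-X:
```python
from transformers import pipeline

# Aggregate word-piece predictions into whole-entity spans
ner = pipeline(
    "token-classification",
    model="July-Tokyo/xlm-roberta-base-finetuned-panx-de",
    aggregation_strategy="simple",
)
print(ner("Jeff Dean arbeitet bei Google in Kalifornien."))
```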
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
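These settings map onto `transformers.TrainingArguments` roughly as sketched below; this is a reconstruction for illustration, not the original training script:
```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above (original script not provided)
training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-panx-de",
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    optim="adamw_torch",  # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```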
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2592 | 1.0 | 525 | 0.1594 | 0.8243 |
| 0.126 | 2.0 | 1050 | 0.1390 | 0.8513 |
| 0.0802 | 3.0 | 1575 | 0.1377 | 0.8627 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.1+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k60_task2_organization_fold0 | MayBashendy | 2024-11-25T09:09:00Z | 164 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T08:30:51Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k60_task2_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k60_task2_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5654
- Qwk: 0.4661
- Mse: 0.5654
- Rmse: 0.7519
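Given the QWK (quadratic weighted kappa), MSE, and RMSE metrics, the checkpoint appears to score text on a numeric scale. The sketch below shows one plausible way to load and query it, assuming a single-output regression head (the actual head configuration is not documented here):
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/Arabic_FineTuningAraBERT_AugV5_k60_task2_organization_fold0"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Score a sample Arabic sentence; a regression head returns a single logit
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score.item())
```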
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0049 | 2 | 3.7329 | 0.0 | 3.7329 | 1.9321 |
| No log | 0.0098 | 4 | 3.0481 | 0.0233 | 3.0481 | 1.7459 |
| No log | 0.0147 | 6 | 1.8078 | -0.0157 | 1.8078 | 1.3445 |
| No log | 0.0196 | 8 | 1.2965 | 0.0 | 1.2965 | 1.1386 |
| No log | 0.0245 | 10 | 1.2090 | 0.0 | 1.2090 | 1.0996 |
| No log | 0.0294 | 12 | 1.2281 | -0.0302 | 1.2281 | 1.1082 |
| No log | 0.0343 | 14 | 1.4012 | -0.0302 | 1.4012 | 1.1837 |
| No log | 0.0392 | 16 | 1.3026 | 0.0 | 1.3026 | 1.1413 |
| No log | 0.0441 | 18 | 1.2453 | 0.0 | 1.2453 | 1.1159 |
| No log | 0.0490 | 20 | 1.0166 | 0.0 | 1.0166 | 1.0083 |
| No log | 0.0539 | 22 | 0.7326 | 0.1213 | 0.7326 | 0.8559 |
| No log | 0.0588 | 24 | 0.8443 | 0.0 | 0.8443 | 0.9188 |
| No log | 0.0637 | 26 | 1.5283 | -0.0302 | 1.5283 | 1.2362 |
| No log | 0.0686 | 28 | 1.8904 | -0.1442 | 1.8904 | 1.3749 |
| No log | 0.0735 | 30 | 1.7874 | -0.1282 | 1.7874 | 1.3369 |
| No log | 0.0784 | 32 | 1.5789 | 0.0 | 1.5789 | 1.2566 |
| No log | 0.0833 | 34 | 1.5011 | 0.0 | 1.5011 | 1.2252 |
| No log | 0.0882 | 36 | 1.2510 | 0.0 | 1.2510 | 1.1185 |
| No log | 0.0931 | 38 | 1.0157 | 0.0 | 1.0157 | 1.0078 |
| No log | 0.0980 | 40 | 1.1595 | 0.0 | 1.1595 | 1.0768 |
| No log | 0.1029 | 42 | 1.1783 | 0.0 | 1.1783 | 1.0855 |
| No log | 0.1078 | 44 | 1.3290 | 0.0 | 1.3290 | 1.1528 |
| No log | 0.1127 | 46 | 1.7079 | 0.1811 | 1.7079 | 1.3069 |
| No log | 0.1176 | 48 | 1.5984 | 0.0577 | 1.5984 | 1.2643 |
| No log | 0.1225 | 50 | 1.5642 | 0.1550 | 1.5642 | 1.2507 |
| No log | 0.1275 | 52 | 1.5572 | 0.1811 | 1.5572 | 1.2479 |
| No log | 0.1324 | 54 | 1.3567 | 0.0491 | 1.3567 | 1.1648 |
| No log | 0.1373 | 56 | 1.5691 | 0.1811 | 1.5691 | 1.2526 |
| No log | 0.1422 | 58 | 1.4052 | 0.0924 | 1.4052 | 1.1854 |
| No log | 0.1471 | 60 | 0.8369 | 0.0 | 0.8369 | 0.9148 |
| No log | 0.1520 | 62 | 0.7136 | 0.2452 | 0.7136 | 0.8448 |
| No log | 0.1569 | 64 | 0.7666 | 0.1213 | 0.7666 | 0.8756 |
| No log | 0.1618 | 66 | 1.0461 | 0.0 | 1.0461 | 1.0228 |
| No log | 0.1667 | 68 | 1.6175 | 0.0641 | 1.6175 | 1.2718 |
| No log | 0.1716 | 70 | 1.8439 | 0.0265 | 1.8439 | 1.3579 |
| No log | 0.1765 | 72 | 1.6212 | 0.0924 | 1.6212 | 1.2733 |
| No log | 0.1814 | 74 | 1.2282 | 0.0 | 1.2282 | 1.1082 |
| No log | 0.1863 | 76 | 1.0309 | 0.0 | 1.0309 | 1.0153 |
| No log | 0.1912 | 78 | 1.2423 | 0.0491 | 1.2423 | 1.1146 |
| No log | 0.1961 | 80 | 1.8788 | 0.1170 | 1.8788 | 1.3707 |
| No log | 0.2010 | 82 | 2.6548 | 0.0 | 2.6548 | 1.6294 |
| No log | 0.2059 | 84 | 2.5991 | 0.0 | 2.5991 | 1.6122 |
| No log | 0.2108 | 86 | 2.1897 | -0.0073 | 2.1897 | 1.4798 |
| No log | 0.2157 | 88 | 1.4933 | 0.0491 | 1.4933 | 1.2220 |
| No log | 0.2206 | 90 | 0.9251 | 0.0 | 0.9251 | 0.9618 |
| No log | 0.2255 | 92 | 0.7122 | 0.2821 | 0.7122 | 0.8439 |
| No log | 0.2304 | 94 | 0.6805 | 0.2289 | 0.6805 | 0.8249 |
| No log | 0.2353 | 96 | 0.7859 | -0.0544 | 0.7859 | 0.8865 |
| No log | 0.2402 | 98 | 1.3142 | 0.0 | 1.3142 | 1.1464 |
| No log | 0.2451 | 100 | 1.8197 | 0.0955 | 1.8197 | 1.3490 |
| No log | 0.25 | 102 | 2.1004 | 0.0499 | 2.1004 | 1.4493 |
| No log | 0.2549 | 104 | 1.9572 | 0.1170 | 1.9572 | 1.3990 |
| No log | 0.2598 | 106 | 1.6868 | 0.125 | 1.6868 | 1.2988 |
| No log | 0.2647 | 108 | 1.4735 | 0.0 | 1.4735 | 1.2139 |
| No log | 0.2696 | 110 | 1.1563 | 0.0 | 1.1563 | 1.0753 |
| No log | 0.2745 | 112 | 1.0011 | 0.0 | 1.0011 | 1.0006 |
| No log | 0.2794 | 114 | 1.1042 | 0.0 | 1.1042 | 1.0508 |
| No log | 0.2843 | 116 | 1.0354 | 0.0 | 1.0354 | 1.0176 |
| No log | 0.2892 | 118 | 0.9569 | 0.0 | 0.9569 | 0.9782 |
| No log | 0.2941 | 120 | 1.0087 | 0.0 | 1.0087 | 1.0043 |
| No log | 0.2990 | 122 | 0.9712 | 0.0 | 0.9712 | 0.9855 |
| No log | 0.3039 | 124 | 0.8387 | -0.0544 | 0.8387 | 0.9158 |
| No log | 0.3088 | 126 | 0.8373 | -0.1667 | 0.8373 | 0.9150 |
| No log | 0.3137 | 128 | 0.9305 | 0.0335 | 0.9305 | 0.9646 |
| No log | 0.3186 | 130 | 1.0780 | 0.0 | 1.0780 | 1.0383 |
| No log | 0.3235 | 132 | 1.2888 | 0.0 | 1.2888 | 1.1352 |
| No log | 0.3284 | 134 | 1.3256 | 0.0 | 1.3256 | 1.1513 |
| No log | 0.3333 | 136 | 1.4511 | 0.0 | 1.4511 | 1.2046 |
| No log | 0.3382 | 138 | 1.4020 | 0.0 | 1.4020 | 1.1841 |
| No log | 0.3431 | 140 | 1.2453 | 0.0 | 1.2453 | 1.1159 |
| No log | 0.3480 | 142 | 1.0186 | 0.0 | 1.0186 | 1.0093 |
| No log | 0.3529 | 144 | 0.9046 | -0.1099 | 0.9046 | 0.9511 |
| No log | 0.3578 | 146 | 1.1096 | 0.0162 | 1.1096 | 1.0534 |
| No log | 0.3627 | 148 | 1.9099 | 0.0808 | 1.9099 | 1.3820 |
| No log | 0.3676 | 150 | 2.2249 | 0.0538 | 2.2249 | 1.4916 |
| No log | 0.3725 | 152 | 1.9898 | 0.0455 | 1.9898 | 1.4106 |
| No log | 0.3775 | 154 | 1.3677 | 0.0605 | 1.3677 | 1.1695 |
| No log | 0.3824 | 156 | 0.8731 | -0.0548 | 0.8731 | 0.9344 |
| No log | 0.3873 | 158 | 0.8028 | 0.2150 | 0.8028 | 0.8960 |
| No log | 0.3922 | 160 | 0.8322 | -0.0548 | 0.8322 | 0.9123 |
| No log | 0.3971 | 162 | 1.0129 | 0.0 | 1.0129 | 1.0064 |
| No log | 0.4020 | 164 | 1.3319 | 0.0491 | 1.3319 | 1.1541 |
| No log | 0.4069 | 166 | 1.5114 | -0.0096 | 1.5114 | 1.2294 |
| No log | 0.4118 | 168 | 1.4833 | -0.0328 | 1.4833 | 1.2179 |
| No log | 0.4167 | 170 | 1.3613 | 0.0173 | 1.3613 | 1.1667 |
| No log | 0.4216 | 172 | 1.1971 | 0.0491 | 1.1971 | 1.0941 |
| No log | 0.4265 | 174 | 1.0106 | 0.0 | 1.0106 | 1.0053 |
| No log | 0.4314 | 176 | 0.8181 | 0.1564 | 0.8181 | 0.9045 |
| No log | 0.4363 | 178 | 0.7188 | -0.0312 | 0.7188 | 0.8478 |
| No log | 0.4412 | 180 | 0.7157 | 0.0993 | 0.7157 | 0.8460 |
| No log | 0.4461 | 182 | 0.7602 | 0.0265 | 0.7602 | 0.8719 |
| No log | 0.4510 | 184 | 0.8625 | -0.0087 | 0.8625 | 0.9287 |
| No log | 0.4559 | 186 | 1.0581 | 0.0 | 1.0581 | 1.0287 |
| No log | 0.4608 | 188 | 1.1478 | 0.1213 | 1.1478 | 1.0714 |
| No log | 0.4657 | 190 | 1.1894 | 0.1213 | 1.1894 | 1.0906 |
| No log | 0.4706 | 192 | 1.1746 | 0.0436 | 1.1746 | 1.0838 |
| No log | 0.4755 | 194 | 1.2390 | 0.0109 | 1.2390 | 1.1131 |
| No log | 0.4804 | 196 | 1.1443 | 0.1939 | 1.1443 | 1.0697 |
| No log | 0.4853 | 198 | 1.0193 | 0.1429 | 1.0193 | 1.0096 |
| No log | 0.4902 | 200 | 0.9713 | -0.0185 | 0.9713 | 0.9855 |
| No log | 0.4951 | 202 | 0.9346 | 0.0411 | 0.9346 | 0.9667 |
| No log | 0.5 | 204 | 1.0566 | 0.1765 | 1.0566 | 1.0279 |
| No log | 0.5049 | 206 | 1.1853 | 0.1600 | 1.1853 | 1.0887 |
| No log | 0.5098 | 208 | 1.1745 | 0.0491 | 1.1745 | 1.0838 |
| No log | 0.5147 | 210 | 1.0121 | 0.1923 | 1.0121 | 1.0060 |
| No log | 0.5196 | 212 | 0.7318 | 0.2329 | 0.7318 | 0.8554 |
| No log | 0.5245 | 214 | 0.6922 | 0.0957 | 0.6922 | 0.8320 |
| No log | 0.5294 | 216 | 0.7221 | 0.0567 | 0.7221 | 0.8498 |
| No log | 0.5343 | 218 | 0.7596 | 0.1600 | 0.7596 | 0.8716 |
| No log | 0.5392 | 220 | 0.7837 | 0.1600 | 0.7837 | 0.8853 |
| No log | 0.5441 | 222 | 0.7752 | 0.0567 | 0.7752 | 0.8804 |
| No log | 0.5490 | 224 | 0.7915 | 0.2184 | 0.7915 | 0.8897 |
| No log | 0.5539 | 226 | 0.7819 | 0.2184 | 0.7819 | 0.8842 |
| No log | 0.5588 | 228 | 0.7789 | 0.1600 | 0.7789 | 0.8825 |
| No log | 0.5637 | 230 | 1.0059 | 0.1800 | 1.0059 | 1.0030 |
| No log | 0.5686 | 232 | 0.9666 | 0.0701 | 0.9666 | 0.9832 |
| No log | 0.5735 | 234 | 0.7999 | 0.1370 | 0.7999 | 0.8944 |
| No log | 0.5784 | 236 | 0.7399 | 0.1168 | 0.7399 | 0.8602 |
| No log | 0.5833 | 238 | 0.7587 | 0.1168 | 0.7587 | 0.8711 |
| No log | 0.5882 | 240 | 0.8089 | 0.0045 | 0.8089 | 0.8994 |
| No log | 0.5931 | 242 | 0.8514 | 0.0045 | 0.8514 | 0.9227 |
| No log | 0.5980 | 244 | 0.9308 | -0.0312 | 0.9308 | 0.9648 |
| No log | 0.6029 | 246 | 0.9141 | -0.0312 | 0.9141 | 0.9561 |
| No log | 0.6078 | 248 | 0.8402 | -0.0548 | 0.8402 | 0.9166 |
| No log | 0.6127 | 250 | 0.7696 | -0.0048 | 0.7696 | 0.8773 |
| No log | 0.6176 | 252 | 0.7865 | -0.0943 | 0.7865 | 0.8868 |
| No log | 0.6225 | 254 | 0.8473 | 0.1356 | 0.8473 | 0.9205 |
| No log | 0.6275 | 256 | 0.9396 | 0.1064 | 0.9396 | 0.9693 |
| No log | 0.6324 | 258 | 1.0712 | -0.0483 | 1.0712 | 1.0350 |
| No log | 0.6373 | 260 | 1.1024 | -0.0909 | 1.1024 | 1.0500 |
| No log | 0.6422 | 262 | 1.0499 | 0.0297 | 1.0499 | 1.0246 |
| No log | 0.6471 | 264 | 1.0232 | 0.1765 | 1.0232 | 1.0115 |
| No log | 0.6520 | 266 | 1.0093 | -0.0553 | 1.0093 | 1.0046 |
| No log | 0.6569 | 268 | 0.9457 | -0.0153 | 0.9457 | 0.9725 |
| No log | 0.6618 | 270 | 0.8648 | -0.0153 | 0.8648 | 0.9300 |
| No log | 0.6667 | 272 | 0.8738 | -0.0553 | 0.8738 | 0.9348 |
| No log | 0.6716 | 274 | 0.9585 | 0.0903 | 0.9585 | 0.9790 |
| No log | 0.6765 | 276 | 1.0674 | 0.1408 | 1.0674 | 1.0332 |
| No log | 0.6814 | 278 | 1.1280 | 0.0455 | 1.1280 | 1.0621 |
| No log | 0.6863 | 280 | 1.0904 | 0.1408 | 1.0904 | 1.0442 |
| No log | 0.6912 | 282 | 1.0795 | 0.0455 | 1.0795 | 1.0390 |
| No log | 0.6961 | 284 | 1.0543 | 0.0720 | 1.0543 | 1.0268 |
| No log | 0.7010 | 286 | 0.9815 | 0.0473 | 0.9815 | 0.9907 |
| No log | 0.7059 | 288 | 0.9048 | 0.1765 | 0.9048 | 0.9512 |
| No log | 0.7108 | 290 | 0.8590 | 0.1765 | 0.8590 | 0.9268 |
| No log | 0.7157 | 292 | 0.8301 | 0.1765 | 0.8301 | 0.9111 |
| No log | 0.7206 | 294 | 0.8156 | 0.0567 | 0.8156 | 0.9031 |
| No log | 0.7255 | 296 | 0.8623 | 0.0785 | 0.8623 | 0.9286 |
| No log | 0.7304 | 298 | 0.8934 | 0.0375 | 0.8934 | 0.9452 |
| No log | 0.7353 | 300 | 0.9323 | 0.0038 | 0.9323 | 0.9656 |
| No log | 0.7402 | 302 | 0.9764 | 0.0038 | 0.9764 | 0.9882 |
| No log | 0.7451 | 304 | 0.9810 | -0.0460 | 0.9810 | 0.9904 |
| No log | 0.75 | 306 | 0.9550 | 0.0170 | 0.9550 | 0.9773 |
| No log | 0.7549 | 308 | 0.9805 | 0.0375 | 0.9805 | 0.9902 |
| No log | 0.7598 | 310 | 0.9342 | 0.0041 | 0.9342 | 0.9665 |
| No log | 0.7647 | 312 | 0.9290 | -0.0286 | 0.9290 | 0.9638 |
| No log | 0.7696 | 314 | 0.8797 | -0.0185 | 0.8797 | 0.9379 |
| No log | 0.7745 | 316 | 0.8227 | -0.0794 | 0.8227 | 0.9070 |
| No log | 0.7794 | 318 | 0.7853 | 0.0957 | 0.7853 | 0.8861 |
| No log | 0.7843 | 320 | 0.7605 | 0.1356 | 0.7605 | 0.8721 |
| No log | 0.7892 | 322 | 0.7714 | 0.1765 | 0.7714 | 0.8783 |
| No log | 0.7941 | 324 | 0.7959 | 0.1765 | 0.7959 | 0.8921 |
| No log | 0.7990 | 326 | 0.8596 | 0.1765 | 0.8596 | 0.9271 |
| No log | 0.8039 | 328 | 0.9724 | 0.3029 | 0.9724 | 0.9861 |
| No log | 0.8088 | 330 | 0.9962 | 0.3029 | 0.9962 | 0.9981 |
| No log | 0.8137 | 332 | 0.9977 | 0.1635 | 0.9977 | 0.9988 |
| No log | 0.8186 | 334 | 1.0736 | 0.0321 | 1.0736 | 1.0362 |
| No log | 0.8235 | 336 | 1.0085 | 0.1818 | 1.0085 | 1.0042 |
| No log | 0.8284 | 338 | 0.9455 | 0.0099 | 0.9455 | 0.9724 |
| No log | 0.8333 | 340 | 0.9222 | 0.1356 | 0.9222 | 0.9603 |
| No log | 0.8382 | 342 | 0.8597 | 0.0957 | 0.8597 | 0.9272 |
| No log | 0.8431 | 344 | 0.8225 | 0.1356 | 0.8225 | 0.9069 |
| No log | 0.8480 | 346 | 0.7537 | 0.1765 | 0.7537 | 0.8682 |
| No log | 0.8529 | 348 | 0.7236 | 0.2184 | 0.7236 | 0.8506 |
| No log | 0.8578 | 350 | 0.7374 | 0.0503 | 0.7374 | 0.8587 |
| No log | 0.8627 | 352 | 0.8519 | 0.0099 | 0.8519 | 0.9230 |
| No log | 0.8676 | 354 | 0.9807 | 0.1765 | 0.9807 | 0.9903 |
| No log | 0.8725 | 356 | 1.0893 | 0.0516 | 1.0893 | 1.0437 |
| No log | 0.8775 | 358 | 1.1002 | -0.0876 | 1.1002 | 1.0489 |
| No log | 0.8824 | 360 | 1.0915 | 0.1429 | 1.0915 | 1.0448 |
| No log | 0.8873 | 362 | 1.0676 | 0.1715 | 1.0676 | 1.0332 |
| No log | 0.8922 | 364 | 0.9708 | 0.0833 | 0.9708 | 0.9853 |
| No log | 0.8971 | 366 | 0.8791 | 0.0916 | 0.8791 | 0.9376 |
| No log | 0.9020 | 368 | 0.7827 | 0.0099 | 0.7827 | 0.8847 |
| No log | 0.9069 | 370 | 0.7366 | 0.1356 | 0.7366 | 0.8582 |
| No log | 0.9118 | 372 | 0.7425 | 0.1168 | 0.7425 | 0.8617 |
| No log | 0.9167 | 374 | 0.7658 | 0.0045 | 0.7658 | 0.8751 |
| No log | 0.9216 | 376 | 0.7434 | 0.0567 | 0.7434 | 0.8622 |
| No log | 0.9265 | 378 | 0.7418 | 0.1356 | 0.7418 | 0.8613 |
| No log | 0.9314 | 380 | 0.8042 | 0.0099 | 0.8042 | 0.8968 |
| No log | 0.9363 | 382 | 0.9116 | 0.0099 | 0.9116 | 0.9548 |
| No log | 0.9412 | 384 | 0.9422 | 0.1765 | 0.9422 | 0.9707 |
| No log | 0.9461 | 386 | 0.9646 | 0.1765 | 0.9646 | 0.9821 |
| No log | 0.9510 | 388 | 0.9366 | 0.1765 | 0.9366 | 0.9678 |
| No log | 0.9559 | 390 | 0.8935 | 0.1765 | 0.8935 | 0.9452 |
| No log | 0.9608 | 392 | 0.8234 | 0.1765 | 0.8234 | 0.9074 |
| No log | 0.9657 | 394 | 0.7723 | 0.1356 | 0.7723 | 0.8788 |
| No log | 0.9706 | 396 | 0.7751 | 0.1356 | 0.7751 | 0.8804 |
| No log | 0.9755 | 398 | 0.7900 | 0.1962 | 0.7900 | 0.8888 |
| No log | 0.9804 | 400 | 0.8108 | 0.1962 | 0.8108 | 0.9005 |
| No log | 0.9853 | 402 | 0.8253 | 0.1765 | 0.8253 | 0.9085 |
| No log | 0.9902 | 404 | 0.8073 | 0.0099 | 0.8073 | 0.8985 |
| No log | 0.9951 | 406 | 0.8569 | 0.0099 | 0.8569 | 0.9257 |
| No log | 1.0 | 408 | 0.9797 | 0.0503 | 0.9797 | 0.9898 |
| No log | 1.0049 | 410 | 1.0478 | 0.0503 | 1.0478 | 1.0236 |
| No log | 1.0098 | 412 | 1.0258 | 0.0503 | 1.0258 | 1.0128 |
| No log | 1.0147 | 414 | 0.9693 | 0.0503 | 0.9693 | 0.9845 |
| No log | 1.0196 | 416 | 0.8774 | 0.0503 | 0.8774 | 0.9367 |
| No log | 1.0245 | 418 | 0.8861 | 0.0099 | 0.8861 | 0.9413 |
| No log | 1.0294 | 420 | 0.9214 | 0.0099 | 0.9214 | 0.9599 |
| No log | 1.0343 | 422 | 0.8879 | 0.0099 | 0.8879 | 0.9423 |
| No log | 1.0392 | 424 | 0.8490 | 0.1231 | 0.8490 | 0.9214 |
| No log | 1.0441 | 426 | 0.8146 | 0.0870 | 0.8146 | 0.9025 |
| No log | 1.0490 | 428 | 0.7657 | 0.1356 | 0.7657 | 0.8750 |
| No log | 1.0539 | 430 | 0.7545 | 0.0503 | 0.7545 | 0.8686 |
| No log | 1.0588 | 432 | 0.7598 | 0.0503 | 0.7598 | 0.8717 |
| No log | 1.0637 | 434 | 0.7718 | -0.0825 | 0.7718 | 0.8785 |
| No log | 1.0686 | 436 | 0.8087 | -0.0825 | 0.8086 | 0.8992 |
| No log | 1.0735 | 438 | 0.8054 | -0.0825 | 0.8054 | 0.8974 |
| No log | 1.0784 | 440 | 0.8101 | 0.0503 | 0.8101 | 0.9000 |
| No log | 1.0833 | 442 | 0.8115 | 0.0503 | 0.8115 | 0.9008 |
| No log | 1.0882 | 444 | 0.7931 | 0.2184 | 0.7931 | 0.8906 |
| No log | 1.0931 | 446 | 0.7737 | 0.1765 | 0.7737 | 0.8796 |
| No log | 1.0980 | 448 | 0.7484 | 0.1765 | 0.7484 | 0.8651 |
| No log | 1.1029 | 450 | 0.7459 | 0.2184 | 0.7459 | 0.8637 |
| No log | 1.1078 | 452 | 0.7613 | 0.2184 | 0.7613 | 0.8725 |
| No log | 1.1127 | 454 | 0.7604 | 0.2184 | 0.7604 | 0.8720 |
| No log | 1.1176 | 456 | 0.7483 | 0.2184 | 0.7483 | 0.8651 |
| No log | 1.1225 | 458 | 0.7503 | 0.2613 | 0.7503 | 0.8662 |
| No log | 1.1275 | 460 | 0.7585 | 0.0916 | 0.7585 | 0.8709 |
| No log | 1.1324 | 462 | 0.7548 | 0.2613 | 0.7548 | 0.8688 |
| No log | 1.1373 | 464 | 0.7606 | 0.1765 | 0.7606 | 0.8721 |
| No log | 1.1422 | 466 | 0.7849 | 0.1765 | 0.7849 | 0.8859 |
| No log | 1.1471 | 468 | 0.8086 | 0.1765 | 0.8086 | 0.8992 |
| No log | 1.1520 | 470 | 0.8458 | 0.1765 | 0.8458 | 0.9197 |
| No log | 1.1569 | 472 | 0.8762 | 0.1765 | 0.8762 | 0.9360 |
| No log | 1.1618 | 474 | 0.9001 | 0.1356 | 0.9001 | 0.9487 |
| No log | 1.1667 | 476 | 0.9094 | 0.1356 | 0.9094 | 0.9536 |
| No log | 1.1716 | 478 | 0.9015 | 0.1356 | 0.9015 | 0.9495 |
| No log | 1.1765 | 480 | 0.8821 | 0.1765 | 0.8821 | 0.9392 |
| No log | 1.1814 | 482 | 0.8844 | 0.3029 | 0.8844 | 0.9404 |
| No log | 1.1863 | 484 | 0.9158 | 0.1992 | 0.9158 | 0.9570 |
| No log | 1.1912 | 486 | 0.9036 | 0.1600 | 0.9036 | 0.9506 |
| No log | 1.1961 | 488 | 0.8359 | 0.0258 | 0.8359 | 0.9143 |
| No log | 1.2010 | 490 | 0.7704 | 0.0258 | 0.7704 | 0.8777 |
| No log | 1.2059 | 492 | 0.7179 | 0.0916 | 0.7179 | 0.8473 |
| No log | 1.2108 | 494 | 0.6796 | 0.2184 | 0.6796 | 0.8244 |
| No log | 1.2157 | 496 | 0.6828 | 0.1765 | 0.6828 | 0.8263 |
| No log | 1.2206 | 498 | 0.6992 | 0.2184 | 0.6992 | 0.8362 |
| 0.5578 | 1.2255 | 500 | 0.7241 | 0.2184 | 0.7241 | 0.8510 |
| 0.5578 | 1.2304 | 502 | 0.7504 | 0.0503 | 0.7504 | 0.8663 |
| 0.5578 | 1.2353 | 504 | 0.8165 | 0.0679 | 0.8165 | 0.9036 |
| 0.5578 | 1.2402 | 506 | 0.8493 | 0.0679 | 0.8493 | 0.9216 |
| 0.5578 | 1.2451 | 508 | 0.8073 | 0.0679 | 0.8073 | 0.8985 |
| 0.5578 | 1.25 | 510 | 0.7283 | 0.0258 | 0.7283 | 0.8534 |
| 0.5578 | 1.2549 | 512 | 0.6871 | 0.0503 | 0.6871 | 0.8289 |
| 0.5578 | 1.2598 | 514 | 0.6995 | 0.1783 | 0.6995 | 0.8364 |
| 0.5578 | 1.2647 | 516 | 0.6958 | 0.1783 | 0.6958 | 0.8341 |
| 0.5578 | 1.2696 | 518 | 0.6776 | 0.1356 | 0.6776 | 0.8232 |
| 0.5578 | 1.2745 | 520 | 0.6641 | 0.0503 | 0.6641 | 0.8149 |
| 0.5578 | 1.2794 | 522 | 0.6771 | 0.0258 | 0.6771 | 0.8228 |
| 0.5578 | 1.2843 | 524 | 0.7397 | 0.0258 | 0.7397 | 0.8601 |
| 0.5578 | 1.2892 | 526 | 0.7622 | 0.0258 | 0.7622 | 0.8731 |
| 0.5578 | 1.2941 | 528 | 0.7170 | 0.0258 | 0.7170 | 0.8468 |
| 0.5578 | 1.2990 | 530 | 0.6831 | -0.0153 | 0.6831 | 0.8265 |
| 0.5578 | 1.3039 | 532 | 0.6822 | -0.0153 | 0.6822 | 0.8259 |
| 0.5578 | 1.3088 | 534 | 0.6797 | -0.0153 | 0.6797 | 0.8244 |
| 0.5578 | 1.3137 | 536 | 0.6996 | -0.0153 | 0.6996 | 0.8364 |
| 0.5578 | 1.3186 | 538 | 0.7385 | 0.0258 | 0.7385 | 0.8594 |
| 0.5578 | 1.3235 | 540 | 0.7466 | -0.0153 | 0.7466 | 0.8641 |
| 0.5578 | 1.3284 | 542 | 0.7164 | -0.0153 | 0.7164 | 0.8464 |
| 0.5578 | 1.3333 | 544 | 0.7053 | -0.0153 | 0.7053 | 0.8398 |
| 0.5578 | 1.3382 | 546 | 0.7006 | 0.0503 | 0.7006 | 0.8370 |
| 0.5578 | 1.3431 | 548 | 0.7064 | 0.0099 | 0.7064 | 0.8405 |
| 0.5578 | 1.3480 | 550 | 0.7086 | 0.1765 | 0.7086 | 0.8418 |
| 0.5578 | 1.3529 | 552 | 0.7189 | 0.1765 | 0.7189 | 0.8479 |
| 0.5578 | 1.3578 | 554 | 0.7168 | 0.1765 | 0.7168 | 0.8466 |
| 0.5578 | 1.3627 | 556 | 0.7376 | 0.1765 | 0.7376 | 0.8588 |
| 0.5578 | 1.3676 | 558 | 0.7608 | 0.0828 | 0.7608 | 0.8723 |
| 0.5578 | 1.3725 | 560 | 0.7547 | 0.0503 | 0.7547 | 0.8688 |
| 0.5578 | 1.3775 | 562 | 0.7262 | 0.1765 | 0.7262 | 0.8522 |
| 0.5578 | 1.3824 | 564 | 0.7455 | 0.0503 | 0.7455 | 0.8634 |
| 0.5578 | 1.3873 | 566 | 0.8094 | 0.1818 | 0.8094 | 0.8997 |
| 0.5578 | 1.3922 | 568 | 0.7910 | 0.1209 | 0.7910 | 0.8894 |
| 0.5578 | 1.3971 | 570 | 0.7395 | 0.0916 | 0.7395 | 0.8599 |
| 0.5578 | 1.4020 | 572 | 0.6950 | 0.0503 | 0.6950 | 0.8336 |
| 0.5578 | 1.4069 | 574 | 0.6585 | 0.1765 | 0.6585 | 0.8115 |
| 0.5578 | 1.4118 | 576 | 0.6510 | 0.1765 | 0.6510 | 0.8068 |
| 0.5578 | 1.4167 | 578 | 0.6428 | 0.1765 | 0.6428 | 0.8018 |
| 0.5578 | 1.4216 | 580 | 0.6618 | 0.0503 | 0.6618 | 0.8135 |
| 0.5578 | 1.4265 | 582 | 0.6488 | 0.0099 | 0.6488 | 0.8055 |
| 0.5578 | 1.4314 | 584 | 0.6235 | -0.0294 | 0.6235 | 0.7896 |
| 0.5578 | 1.4363 | 586 | 0.6235 | 0.0735 | 0.6235 | 0.7896 |
| 0.5578 | 1.4412 | 588 | 0.6686 | 0.0099 | 0.6686 | 0.8177 |
| 0.5578 | 1.4461 | 590 | 0.7365 | 0.0916 | 0.7365 | 0.8582 |
| 0.5578 | 1.4510 | 592 | 0.7768 | 0.1793 | 0.7768 | 0.8814 |
| 0.5578 | 1.4559 | 594 | 0.7998 | 0.1793 | 0.7998 | 0.8943 |
| 0.5578 | 1.4608 | 596 | 0.8258 | 0.1635 | 0.8258 | 0.9087 |
| 0.5578 | 1.4657 | 598 | 0.8195 | 0.1765 | 0.8195 | 0.9053 |
| 0.5578 | 1.4706 | 600 | 0.8337 | 0.0455 | 0.8337 | 0.9131 |
| 0.5578 | 1.4755 | 602 | 0.7986 | 0.0503 | 0.7986 | 0.8936 |
| 0.5578 | 1.4804 | 604 | 0.7542 | 0.0099 | 0.7542 | 0.8684 |
| 0.5578 | 1.4853 | 606 | 0.7238 | 0.0503 | 0.7238 | 0.8508 |
| 0.5578 | 1.4902 | 608 | 0.7202 | 0.0503 | 0.7202 | 0.8486 |
| 0.5578 | 1.4951 | 610 | 0.6943 | 0.0503 | 0.6943 | 0.8333 |
| 0.5578 | 1.5 | 612 | 0.7182 | -0.0153 | 0.7182 | 0.8475 |
| 0.5578 | 1.5049 | 614 | 0.7282 | 0.0258 | 0.7282 | 0.8534 |
| 0.5578 | 1.5098 | 616 | 0.7412 | 0.0258 | 0.7412 | 0.8609 |
| 0.5578 | 1.5147 | 618 | 0.7442 | 0.0258 | 0.7442 | 0.8627 |
| 0.5578 | 1.5196 | 620 | 0.7727 | 0.0258 | 0.7727 | 0.8790 |
| 0.5578 | 1.5245 | 622 | 0.7509 | -0.0153 | 0.7509 | 0.8666 |
| 0.5578 | 1.5294 | 624 | 0.7460 | 0.0503 | 0.7460 | 0.8637 |
| 0.5578 | 1.5343 | 626 | 0.7664 | 0.2184 | 0.7664 | 0.8754 |
| 0.5578 | 1.5392 | 628 | 0.7797 | 0.0503 | 0.7797 | 0.8830 |
| 0.5578 | 1.5441 | 630 | 0.8092 | 0.0503 | 0.8092 | 0.8996 |
| 0.5578 | 1.5490 | 632 | 0.8579 | -0.0153 | 0.8579 | 0.9262 |
| 0.5578 | 1.5539 | 634 | 0.8892 | -0.0153 | 0.8892 | 0.9430 |
| 0.5578 | 1.5588 | 636 | 0.8795 | -0.0153 | 0.8795 | 0.9378 |
| 0.5578 | 1.5637 | 638 | 0.8237 | 0.0503 | 0.8237 | 0.9076 |
| 0.5578 | 1.5686 | 640 | 0.7623 | 0.0503 | 0.7623 | 0.8731 |
| 0.5578 | 1.5735 | 642 | 0.7500 | 0.0503 | 0.7500 | 0.8660 |
| 0.5578 | 1.5784 | 644 | 0.7688 | 0.0503 | 0.7688 | 0.8768 |
| 0.5578 | 1.5833 | 646 | 0.7593 | 0.0503 | 0.7593 | 0.8714 |
| 0.5578 | 1.5882 | 648 | 0.7676 | 0.0916 | 0.7676 | 0.8761 |
| 0.5578 | 1.5931 | 650 | 0.7992 | 0.2355 | 0.7992 | 0.8940 |
| 0.5578 | 1.5980 | 652 | 0.7329 | 0.0916 | 0.7329 | 0.8561 |
| 0.5578 | 1.6029 | 654 | 0.6586 | 0.1765 | 0.6586 | 0.8116 |
| 0.5578 | 1.6078 | 656 | 0.6617 | 0.2725 | 0.6617 | 0.8135 |
| 0.5578 | 1.6127 | 658 | 0.7002 | 0.2329 | 0.7002 | 0.8368 |
| 0.5578 | 1.6176 | 660 | 0.7021 | 0.2725 | 0.7021 | 0.8379 |
| 0.5578 | 1.6225 | 662 | 0.6746 | 0.3131 | 0.6746 | 0.8214 |
| 0.5578 | 1.6275 | 664 | 0.6984 | 0.2921 | 0.6984 | 0.8357 |
| 0.5578 | 1.6324 | 666 | 0.8072 | 0.1992 | 0.8072 | 0.8984 |
| 0.5578 | 1.6373 | 668 | 0.8233 | 0.1992 | 0.8233 | 0.9074 |
| 0.5578 | 1.6422 | 670 | 0.8055 | 0.2355 | 0.8055 | 0.8975 |
| 0.5578 | 1.6471 | 672 | 0.7299 | 0.2921 | 0.7299 | 0.8543 |
| 0.5578 | 1.6520 | 674 | 0.6800 | 0.1765 | 0.6800 | 0.8246 |
| 0.5578 | 1.6569 | 676 | 0.7202 | 0.1783 | 0.7202 | 0.8487 |
| 0.5578 | 1.6618 | 678 | 0.7445 | 0.1797 | 0.7445 | 0.8629 |
| 0.5578 | 1.6667 | 680 | 0.7278 | 0.1419 | 0.7278 | 0.8531 |
| 0.5578 | 1.6716 | 682 | 0.6878 | 0.0957 | 0.6878 | 0.8293 |
| 0.5578 | 1.6765 | 684 | 0.7109 | 0.1356 | 0.7109 | 0.8432 |
| 0.5578 | 1.6814 | 686 | 0.7583 | 0.2921 | 0.7583 | 0.8708 |
| 0.5578 | 1.6863 | 688 | 0.7626 | 0.2921 | 0.7626 | 0.8733 |
| 0.5578 | 1.6912 | 690 | 0.7425 | 0.1765 | 0.7425 | 0.8617 |
| 0.5578 | 1.6961 | 692 | 0.7350 | 0.1765 | 0.7350 | 0.8573 |
| 0.5578 | 1.7010 | 694 | 0.7285 | 0.1765 | 0.7285 | 0.8535 |
| 0.5578 | 1.7059 | 696 | 0.7034 | 0.1765 | 0.7034 | 0.8387 |
| 0.5578 | 1.7108 | 698 | 0.6936 | 0.1765 | 0.6936 | 0.8328 |
| 0.5578 | 1.7157 | 700 | 0.6978 | 0.1765 | 0.6978 | 0.8354 |
| 0.5578 | 1.7206 | 702 | 0.7344 | 0.1765 | 0.7344 | 0.8570 |
| 0.5578 | 1.7255 | 704 | 0.7590 | 0.2921 | 0.7590 | 0.8712 |
| 0.5578 | 1.7304 | 706 | 0.7798 | 0.2921 | 0.7798 | 0.8831 |
| 0.5578 | 1.7353 | 708 | 0.7821 | 0.1992 | 0.7821 | 0.8843 |
| 0.5578 | 1.7402 | 710 | 0.7444 | 0.2921 | 0.7444 | 0.8628 |
| 0.5578 | 1.7451 | 712 | 0.6936 | 0.2921 | 0.6936 | 0.8328 |
| 0.5578 | 1.75 | 714 | 0.6691 | 0.1765 | 0.6691 | 0.8180 |
| 0.5578 | 1.7549 | 716 | 0.6563 | 0.1356 | 0.6563 | 0.8101 |
| 0.5578 | 1.7598 | 718 | 0.6508 | 0.1356 | 0.6508 | 0.8067 |
| 0.5578 | 1.7647 | 720 | 0.6493 | 0.0099 | 0.6493 | 0.8058 |
| 0.5578 | 1.7696 | 722 | 0.6629 | 0.0503 | 0.6629 | 0.8142 |
| 0.5578 | 1.7745 | 724 | 0.7317 | 0.2186 | 0.7317 | 0.8554 |
| 0.5578 | 1.7794 | 726 | 0.8393 | 0.2727 | 0.8393 | 0.9162 |
| 0.5578 | 1.7843 | 728 | 0.8773 | 0.2846 | 0.8773 | 0.9367 |
| 0.5578 | 1.7892 | 730 | 0.7873 | 0.2186 | 0.7873 | 0.8873 |
| 0.5578 | 1.7941 | 732 | 0.7080 | 0.2184 | 0.7080 | 0.8414 |
| 0.5578 | 1.7990 | 734 | 0.6906 | 0.1356 | 0.6906 | 0.8310 |
| 0.5578 | 1.8039 | 736 | 0.6882 | 0.1356 | 0.6882 | 0.8296 |
| 0.5578 | 1.8088 | 738 | 0.6746 | 0.1356 | 0.6746 | 0.8213 |
| 0.5578 | 1.8137 | 740 | 0.6798 | 0.1765 | 0.6798 | 0.8245 |
| 0.5578 | 1.8186 | 742 | 0.6784 | 0.1765 | 0.6784 | 0.8236 |
| 0.5578 | 1.8235 | 744 | 0.6669 | 0.1765 | 0.6669 | 0.8167 |
| 0.5578 | 1.8284 | 746 | 0.6734 | 0.2184 | 0.6734 | 0.8206 |
| 0.5578 | 1.8333 | 748 | 0.6727 | 0.2184 | 0.6727 | 0.8202 |
| 0.5578 | 1.8382 | 750 | 0.6770 | 0.2613 | 0.6770 | 0.8228 |
| 0.5578 | 1.8431 | 752 | 0.6828 | 0.0916 | 0.6828 | 0.8263 |
| 0.5578 | 1.8480 | 754 | 0.6833 | 0.0258 | 0.6833 | 0.8266 |
| 0.5578 | 1.8529 | 756 | 0.7135 | 0.0258 | 0.7135 | 0.8447 |
| 0.5578 | 1.8578 | 758 | 0.7219 | -0.0153 | 0.7219 | 0.8496 |
| 0.5578 | 1.8627 | 760 | 0.7159 | -0.0153 | 0.7159 | 0.8461 |
| 0.5578 | 1.8676 | 762 | 0.7267 | 0.0099 | 0.7267 | 0.8525 |
| 0.5578 | 1.8725 | 764 | 0.7224 | 0.1356 | 0.7224 | 0.8499 |
| 0.5578 | 1.8775 | 766 | 0.7271 | 0.1356 | 0.7271 | 0.8527 |
| 0.5578 | 1.8824 | 768 | 0.7485 | 0.1765 | 0.7485 | 0.8651 |
| 0.5578 | 1.8873 | 770 | 0.7799 | 0.1765 | 0.7799 | 0.8831 |
| 0.5578 | 1.8922 | 772 | 0.7954 | 0.0099 | 0.7954 | 0.8919 |
| 0.5578 | 1.8971 | 774 | 0.7923 | 0.0099 | 0.7923 | 0.8901 |
| 0.5578 | 1.9020 | 776 | 0.7898 | 0.0099 | 0.7898 | 0.8887 |
| 0.5578 | 1.9069 | 778 | 0.8181 | 0.1793 | 0.8181 | 0.9045 |
| 0.5578 | 1.9118 | 780 | 0.8152 | 0.0503 | 0.8152 | 0.9029 |
| 0.5578 | 1.9167 | 782 | 0.7926 | 0.1765 | 0.7926 | 0.8903 |
| 0.5578 | 1.9216 | 784 | 0.7978 | 0.1765 | 0.7978 | 0.8932 |
| 0.5578 | 1.9265 | 786 | 0.8013 | 0.2184 | 0.8013 | 0.8951 |
| 0.5578 | 1.9314 | 788 | 0.8023 | 0.2184 | 0.8023 | 0.8957 |
| 0.5578 | 1.9363 | 790 | 0.7933 | 0.1765 | 0.7933 | 0.8907 |
| 0.5578 | 1.9412 | 792 | 0.7900 | 0.3318 | 0.7900 | 0.8888 |
| 0.5578 | 1.9461 | 794 | 0.7859 | 0.3318 | 0.7859 | 0.8865 |
| 0.5578 | 1.9510 | 796 | 0.7713 | 0.3318 | 0.7713 | 0.8782 |
| 0.5578 | 1.9559 | 798 | 0.7542 | 0.1765 | 0.7542 | 0.8685 |
| 0.5578 | 1.9608 | 800 | 0.7744 | 0.3318 | 0.7744 | 0.8800 |
| 0.5578 | 1.9657 | 802 | 0.8383 | 0.2186 | 0.8383 | 0.9156 |
| 0.5578 | 1.9706 | 804 | 0.8534 | 0.2186 | 0.8534 | 0.9238 |
| 0.5578 | 1.9755 | 806 | 0.8328 | 0.1793 | 0.8328 | 0.9126 |
| 0.5578 | 1.9804 | 808 | 0.7872 | 0.2921 | 0.7872 | 0.8872 |
| 0.5578 | 1.9853 | 810 | 0.7593 | 0.1765 | 0.7593 | 0.8714 |
| 0.5578 | 1.9902 | 812 | 0.7483 | 0.1765 | 0.7483 | 0.8651 |
| 0.5578 | 1.9951 | 814 | 0.7205 | 0.1765 | 0.7205 | 0.8488 |
| 0.5578 | 2.0 | 816 | 0.7090 | 0.0099 | 0.7090 | 0.8420 |
| 0.5578 | 2.0049 | 818 | 0.7199 | 0.0503 | 0.7199 | 0.8485 |
| 0.5578 | 2.0098 | 820 | 0.7702 | 0.1600 | 0.7702 | 0.8776 |
| 0.5578 | 2.0147 | 822 | 0.7948 | 0.1600 | 0.7948 | 0.8915 |
| 0.5578 | 2.0196 | 824 | 0.7714 | 0.1209 | 0.7714 | 0.8783 |
| 0.5578 | 2.0245 | 826 | 0.7442 | 0.0503 | 0.7442 | 0.8626 |
| 0.5578 | 2.0294 | 828 | 0.7438 | 0.0503 | 0.7438 | 0.8625 |
| 0.5578 | 2.0343 | 830 | 0.7616 | 0.0099 | 0.7616 | 0.8727 |
| 0.5578 | 2.0392 | 832 | 0.7911 | 0.0099 | 0.7911 | 0.8894 |
| 0.5578 | 2.0441 | 834 | 0.8279 | 0.0099 | 0.8279 | 0.9099 |
| 0.5578 | 2.0490 | 836 | 0.8528 | 0.0099 | 0.8528 | 0.9235 |
| 0.5578 | 2.0539 | 838 | 0.8694 | 0.1409 | 0.8694 | 0.9324 |
| 0.5578 | 2.0588 | 840 | 0.8550 | 0.0099 | 0.8550 | 0.9247 |
| 0.5578 | 2.0637 | 842 | 0.8447 | 0.0099 | 0.8447 | 0.9191 |
| 0.5578 | 2.0686 | 844 | 0.8239 | 0.0099 | 0.8239 | 0.9077 |
| 0.5578 | 2.0735 | 846 | 0.7818 | 0.0099 | 0.7818 | 0.8842 |
| 0.5578 | 2.0784 | 848 | 0.7487 | 0.0099 | 0.7487 | 0.8653 |
| 0.5578 | 2.0833 | 850 | 0.7326 | 0.0099 | 0.7326 | 0.8559 |
| 0.5578 | 2.0882 | 852 | 0.7200 | 0.0099 | 0.7200 | 0.8485 |
| 0.5578 | 2.0931 | 854 | 0.7178 | 0.0099 | 0.7178 | 0.8472 |
| 0.5578 | 2.0980 | 856 | 0.7123 | 0.0099 | 0.7123 | 0.8440 |
| 0.5578 | 2.1029 | 858 | 0.7103 | 0.0503 | 0.7103 | 0.8428 |
| 0.5578 | 2.1078 | 860 | 0.7040 | 0.0099 | 0.7040 | 0.8391 |
| 0.5578 | 2.1127 | 862 | 0.7110 | 0.0099 | 0.7110 | 0.8432 |
| 0.5578 | 2.1176 | 864 | 0.7208 | 0.0099 | 0.7208 | 0.8490 |
| 0.5578 | 2.1225 | 866 | 0.7310 | 0.0099 | 0.7310 | 0.8550 |
| 0.5578 | 2.1275 | 868 | 0.7427 | 0.0099 | 0.7427 | 0.8618 |
| 0.5578 | 2.1324 | 870 | 0.7558 | 0.0099 | 0.7558 | 0.8694 |
| 0.5578 | 2.1373 | 872 | 0.7712 | 0.0099 | 0.7712 | 0.8782 |
| 0.5578 | 2.1422 | 874 | 0.8072 | 0.0099 | 0.8072 | 0.8984 |
| 0.5578 | 2.1471 | 876 | 0.8080 | 0.1793 | 0.8080 | 0.8989 |
| 0.5578 | 2.1520 | 878 | 0.7558 | 0.0099 | 0.7558 | 0.8694 |
| 0.5578 | 2.1569 | 880 | 0.7004 | 0.0099 | 0.7004 | 0.8369 |
| 0.5578 | 2.1618 | 882 | 0.6875 | 0.0099 | 0.6875 | 0.8292 |
| 0.5578 | 2.1667 | 884 | 0.6856 | 0.0099 | 0.6856 | 0.8280 |
| 0.5578 | 2.1716 | 886 | 0.6961 | 0.0099 | 0.6961 | 0.8343 |
| 0.5578 | 2.1765 | 888 | 0.7183 | 0.0099 | 0.7183 | 0.8475 |
| 0.5578 | 2.1814 | 890 | 0.7322 | 0.0099 | 0.7322 | 0.8557 |
| 0.5578 | 2.1863 | 892 | 0.7432 | 0.0099 | 0.7432 | 0.8621 |
| 0.5578 | 2.1912 | 894 | 0.7627 | 0.0099 | 0.7627 | 0.8733 |
| 0.5578 | 2.1961 | 896 | 0.7871 | 0.0099 | 0.7871 | 0.8872 |
| 0.5578 | 2.2010 | 898 | 0.8083 | 0.0099 | 0.8083 | 0.8990 |
| 0.5578 | 2.2059 | 900 | 0.8043 | 0.0503 | 0.8043 | 0.8968 |
| 0.5578 | 2.2108 | 902 | 0.7885 | 0.0916 | 0.7885 | 0.8880 |
| 0.5578 | 2.2157 | 904 | 0.7669 | 0.2186 | 0.7669 | 0.8757 |
| 0.5578 | 2.2206 | 906 | 0.7204 | 0.2186 | 0.7204 | 0.8488 |
| 0.5578 | 2.2255 | 908 | 0.7269 | 0.2186 | 0.7269 | 0.8526 |
| 0.5578 | 2.2304 | 910 | 0.7296 | 0.2186 | 0.7296 | 0.8542 |
| 0.5578 | 2.2353 | 912 | 0.7265 | 0.1793 | 0.7265 | 0.8523 |
| 0.5578 | 2.2402 | 914 | 0.7163 | 0.1793 | 0.7163 | 0.8464 |
| 0.5578 | 2.2451 | 916 | 0.7555 | 0.1793 | 0.7555 | 0.8692 |
| 0.5578 | 2.25 | 918 | 0.7513 | 0.1793 | 0.7513 | 0.8668 |
| 0.5578 | 2.2549 | 920 | 0.7326 | 0.1793 | 0.7326 | 0.8559 |
| 0.5578 | 2.2598 | 922 | 0.7222 | 0.1793 | 0.7222 | 0.8498 |
| 0.5578 | 2.2647 | 924 | 0.7370 | 0.1793 | 0.7370 | 0.8585 |
| 0.5578 | 2.2696 | 926 | 0.7920 | 0.1992 | 0.7920 | 0.8900 |
| 0.5578 | 2.2745 | 928 | 0.8360 | 0.1992 | 0.8360 | 0.9143 |
| 0.5578 | 2.2794 | 930 | 0.7819 | 0.1992 | 0.7819 | 0.8843 |
| 0.5578 | 2.2843 | 932 | 0.7211 | 0.2533 | 0.7211 | 0.8492 |
| 0.5578 | 2.2892 | 934 | 0.6942 | 0.1356 | 0.6942 | 0.8332 |
| 0.5578 | 2.2941 | 936 | 0.6928 | 0.1356 | 0.6928 | 0.8323 |
| 0.5578 | 2.2990 | 938 | 0.6936 | 0.2533 | 0.6936 | 0.8328 |
| 0.5578 | 2.3039 | 940 | 0.6979 | 0.1409 | 0.6979 | 0.8354 |
| 0.5578 | 2.3088 | 942 | 0.6968 | 0.1409 | 0.6968 | 0.8347 |
| 0.5578 | 2.3137 | 944 | 0.6995 | 0.1409 | 0.6995 | 0.8363 |
| 0.5578 | 2.3186 | 946 | 0.7280 | 0.1409 | 0.7280 | 0.8532 |
| 0.5578 | 2.3235 | 948 | 0.7728 | 0.1635 | 0.7728 | 0.8791 |
| 0.5578 | 2.3284 | 950 | 0.7729 | 0.1635 | 0.7729 | 0.8792 |
| 0.5578 | 2.3333 | 952 | 0.7663 | 0.1635 | 0.7663 | 0.8754 |
| 0.5578 | 2.3382 | 954 | 0.7735 | 0.1635 | 0.7735 | 0.8795 |
| 0.5578 | 2.3431 | 956 | 0.7792 | 0.1409 | 0.7792 | 0.8827 |
| 0.5578 | 2.3480 | 958 | 0.7592 | 0.2921 | 0.7592 | 0.8713 |
| 0.5578 | 2.3529 | 960 | 0.7620 | 0.1409 | 0.7620 | 0.8729 |
| 0.5578 | 2.3578 | 962 | 0.7656 | 0.1793 | 0.7656 | 0.8750 |
| 0.5578 | 2.3627 | 964 | 0.7905 | 0.2588 | 0.7905 | 0.8891 |
| 0.5578 | 2.3676 | 966 | 0.7927 | 0.2588 | 0.7927 | 0.8904 |
| 0.5578 | 2.3725 | 968 | 0.7444 | 0.2588 | 0.7444 | 0.8628 |
| 0.5578 | 2.3775 | 970 | 0.7016 | 0.2186 | 0.7016 | 0.8376 |
| 0.5578 | 2.3824 | 972 | 0.6727 | 0.0503 | 0.6727 | 0.8202 |
| 0.5578 | 2.3873 | 974 | 0.6824 | 0.0099 | 0.6824 | 0.8261 |
| 0.5578 | 2.3922 | 976 | 0.6995 | 0.1409 | 0.6995 | 0.8364 |
| 0.5578 | 2.3971 | 978 | 0.7122 | 0.1409 | 0.7122 | 0.8439 |
| 0.5578 | 2.4020 | 980 | 0.7272 | 0.1793 | 0.7272 | 0.8528 |
| 0.5578 | 2.4069 | 982 | 0.7788 | 0.1793 | 0.7788 | 0.8825 |
| 0.5578 | 2.4118 | 984 | 0.8050 | 0.1793 | 0.8050 | 0.8972 |
| 0.5578 | 2.4167 | 986 | 0.8236 | 0.2186 | 0.8236 | 0.9075 |
| 0.5578 | 2.4216 | 988 | 0.7882 | 0.2588 | 0.7882 | 0.8878 |
| 0.5578 | 2.4265 | 990 | 0.7090 | 0.1793 | 0.7090 | 0.8420 |
| 0.5578 | 2.4314 | 992 | 0.6612 | 0.1793 | 0.6612 | 0.8132 |
| 0.5578 | 2.4363 | 994 | 0.6352 | 0.0099 | 0.6352 | 0.7970 |
| 0.5578 | 2.4412 | 996 | 0.6234 | 0.2553 | 0.6234 | 0.7895 |
| 0.5578 | 2.4461 | 998 | 0.6244 | 0.2553 | 0.6244 | 0.7902 |
| 0.1212 | 2.4510 | 1000 | 0.6264 | 0.2373 | 0.6264 | 0.7914 |
| 0.1212 | 2.4559 | 1002 | 0.6482 | 0.1409 | 0.6482 | 0.8051 |
| 0.1212 | 2.4608 | 1004 | 0.6848 | 0.1409 | 0.6848 | 0.8275 |
| 0.1212 | 2.4657 | 1006 | 0.7350 | 0.1793 | 0.7350 | 0.8573 |
| 0.1212 | 2.4706 | 1008 | 0.7820 | 0.1793 | 0.7820 | 0.8843 |
| 0.1212 | 2.4755 | 1010 | 0.8048 | 0.1793 | 0.8048 | 0.8971 |
| 0.1212 | 2.4804 | 1012 | 0.8372 | 0.2881 | 0.8372 | 0.9150 |
| 0.1212 | 2.4853 | 1014 | 0.8292 | 0.2881 | 0.8292 | 0.9106 |
| 0.1212 | 2.4902 | 1016 | 0.7600 | 0.1978 | 0.7600 | 0.8718 |
| 0.1212 | 2.4951 | 1018 | 0.7176 | 0.1978 | 0.7176 | 0.8471 |
| 0.1212 | 2.5 | 1020 | 0.6963 | 0.1978 | 0.6963 | 0.8344 |
| 0.1212 | 2.5049 | 1022 | 0.7406 | 0.2364 | 0.7406 | 0.8606 |
| 0.1212 | 2.5098 | 1024 | 0.7410 | 0.1793 | 0.7410 | 0.8608 |
| 0.1212 | 2.5147 | 1026 | 0.6967 | 0.2364 | 0.6967 | 0.8347 |
| 0.1212 | 2.5196 | 1028 | 0.6256 | 0.1978 | 0.6256 | 0.7910 |
| 0.1212 | 2.5245 | 1030 | 0.6013 | 0.1978 | 0.6013 | 0.7754 |
| 0.1212 | 2.5294 | 1032 | 0.6069 | 0.1141 | 0.6069 | 0.7790 |
| 0.1212 | 2.5343 | 1034 | 0.6189 | 0.0503 | 0.6189 | 0.7867 |
| 0.1212 | 2.5392 | 1036 | 0.6228 | 0.0503 | 0.6228 | 0.7892 |
| 0.1212 | 2.5441 | 1038 | 0.6630 | 0.0099 | 0.6630 | 0.8143 |
| 0.1212 | 2.5490 | 1040 | 0.6965 | 0.1765 | 0.6965 | 0.8346 |
| 0.1212 | 2.5539 | 1042 | 0.7601 | 0.2921 | 0.7601 | 0.8718 |
| 0.1212 | 2.5588 | 1044 | 0.8109 | 0.2921 | 0.8109 | 0.9005 |
| 0.1212 | 2.5637 | 1046 | 0.8554 | 0.2921 | 0.8554 | 0.9249 |
| 0.1212 | 2.5686 | 1048 | 0.9004 | 0.1503 | 0.9004 | 0.9489 |
| 0.1212 | 2.5735 | 1050 | 0.9211 | 0.0928 | 0.9211 | 0.9597 |
| 0.1212 | 2.5784 | 1052 | 0.9303 | 0.1463 | 0.9303 | 0.9645 |
| 0.1212 | 2.5833 | 1054 | 0.9738 | 0.1463 | 0.9738 | 0.9868 |
| 0.1212 | 2.5882 | 1056 | 0.9299 | 0.1463 | 0.9299 | 0.9643 |
| 0.1212 | 2.5931 | 1058 | 0.8688 | 0.2186 | 0.8688 | 0.9321 |
| 0.1212 | 2.5980 | 1060 | 0.7843 | 0.3318 | 0.7843 | 0.8856 |
| 0.1212 | 2.6029 | 1062 | 0.7275 | 0.1765 | 0.7275 | 0.8529 |
| 0.1212 | 2.6078 | 1064 | 0.7032 | 0.1356 | 0.7032 | 0.8386 |
| 0.1212 | 2.6127 | 1066 | 0.7016 | 0.0957 | 0.7016 | 0.8376 |
| 0.1212 | 2.6176 | 1068 | 0.7012 | 0.1356 | 0.7012 | 0.8374 |
| 0.1212 | 2.6225 | 1070 | 0.7049 | 0.1765 | 0.7049 | 0.8396 |
| 0.1212 | 2.6275 | 1072 | 0.7225 | 0.1765 | 0.7225 | 0.8500 |
| 0.1212 | 2.6324 | 1074 | 0.7325 | 0.1765 | 0.7325 | 0.8558 |
| 0.1212 | 2.6373 | 1076 | 0.7354 | 0.2921 | 0.7354 | 0.8575 |
| 0.1212 | 2.6422 | 1078 | 0.7511 | 0.2921 | 0.7511 | 0.8667 |
| 0.1212 | 2.6471 | 1080 | 0.7452 | 0.2921 | 0.7452 | 0.8633 |
| 0.1212 | 2.6520 | 1082 | 0.7244 | 0.3318 | 0.7244 | 0.8511 |
| 0.1212 | 2.6569 | 1084 | 0.7004 | 0.2921 | 0.7004 | 0.8369 |
| 0.1212 | 2.6618 | 1086 | 0.6903 | 0.2533 | 0.6903 | 0.8308 |
| 0.1212 | 2.6667 | 1088 | 0.6843 | 0.2533 | 0.6843 | 0.8272 |
| 0.1212 | 2.6716 | 1090 | 0.6789 | 0.3318 | 0.6789 | 0.8240 |
| 0.1212 | 2.6765 | 1092 | 0.6671 | 0.2533 | 0.6671 | 0.8168 |
| 0.1212 | 2.6814 | 1094 | 0.6698 | 0.2921 | 0.6698 | 0.8184 |
| 0.1212 | 2.6863 | 1096 | 0.6922 | 0.1793 | 0.6922 | 0.8320 |
| 0.1212 | 2.6912 | 1098 | 0.7097 | 0.2186 | 0.7097 | 0.8424 |
| 0.1212 | 2.6961 | 1100 | 0.7115 | 0.2588 | 0.7115 | 0.8435 |
| 0.1212 | 2.7010 | 1102 | 0.6823 | 0.2588 | 0.6823 | 0.8260 |
| 0.1212 | 2.7059 | 1104 | 0.6424 | 0.1793 | 0.6424 | 0.8015 |
| 0.1212 | 2.7108 | 1106 | 0.6226 | 0.1765 | 0.6226 | 0.7891 |
| 0.1212 | 2.7157 | 1108 | 0.6197 | 0.1765 | 0.6197 | 0.7872 |
| 0.1212 | 2.7206 | 1110 | 0.6313 | 0.1765 | 0.6313 | 0.7946 |
| 0.1212 | 2.7255 | 1112 | 0.6564 | 0.1793 | 0.6564 | 0.8102 |
| 0.1212 | 2.7304 | 1114 | 0.7129 | 0.2588 | 0.7129 | 0.8443 |
| 0.1212 | 2.7353 | 1116 | 0.7700 | 0.2588 | 0.7700 | 0.8775 |
| 0.1212 | 2.7402 | 1118 | 0.8459 | 0.3636 | 0.8459 | 0.9197 |
| 0.1212 | 2.7451 | 1120 | 0.8834 | 0.3687 | 0.8834 | 0.9399 |
| 0.1212 | 2.75 | 1122 | 0.8543 | 0.3687 | 0.8543 | 0.9243 |
| 0.1212 | 2.7549 | 1124 | 0.7792 | 0.1793 | 0.7792 | 0.8827 |
| 0.1212 | 2.7598 | 1126 | 0.7107 | 0.2154 | 0.7107 | 0.8430 |
| 0.1212 | 2.7647 | 1128 | 0.6777 | 0.2154 | 0.6777 | 0.8232 |
| 0.1212 | 2.7696 | 1130 | 0.6717 | 0.2154 | 0.6717 | 0.8196 |
| 0.1212 | 2.7745 | 1132 | 0.6612 | 0.0957 | 0.6612 | 0.8131 |
| 0.1212 | 2.7794 | 1134 | 0.6444 | 0.1560 | 0.6444 | 0.8028 |
| 0.1212 | 2.7843 | 1136 | 0.6364 | 0.1560 | 0.6364 | 0.7978 |
| 0.1212 | 2.7892 | 1138 | 0.6291 | 0.1560 | 0.6291 | 0.7932 |
| 0.1212 | 2.7941 | 1140 | 0.6268 | 0.1560 | 0.6268 | 0.7917 |
| 0.1212 | 2.7990 | 1142 | 0.6319 | 0.1560 | 0.6319 | 0.7949 |
| 0.1212 | 2.8039 | 1144 | 0.6426 | 0.0957 | 0.6426 | 0.8016 |
| 0.1212 | 2.8088 | 1146 | 0.6633 | 0.0099 | 0.6633 | 0.8144 |
| 0.1212 | 2.8137 | 1148 | 0.7144 | 0.2186 | 0.7144 | 0.8452 |
| 0.1212 | 2.8186 | 1150 | 0.7613 | 0.2186 | 0.7613 | 0.8726 |
| 0.1212 | 2.8235 | 1152 | 0.7801 | 0.1409 | 0.7801 | 0.8832 |
| 0.1212 | 2.8284 | 1154 | 0.7979 | 0.2533 | 0.7979 | 0.8933 |
| 0.1212 | 2.8333 | 1156 | 0.8013 | 0.2533 | 0.8013 | 0.8951 |
| 0.1212 | 2.8382 | 1158 | 0.8101 | 0.2533 | 0.8101 | 0.9001 |
| 0.1212 | 2.8431 | 1160 | 0.8162 | 0.1409 | 0.8162 | 0.9034 |
| 0.1212 | 2.8480 | 1162 | 0.7858 | 0.1793 | 0.7858 | 0.8864 |
| 0.1212 | 2.8529 | 1164 | 0.7463 | 0.1409 | 0.7463 | 0.8639 |
| 0.1212 | 2.8578 | 1166 | 0.7142 | 0.1409 | 0.7142 | 0.8451 |
| 0.1212 | 2.8627 | 1168 | 0.6781 | 0.0099 | 0.6781 | 0.8235 |
| 0.1212 | 2.8676 | 1170 | 0.6558 | -0.0294 | 0.6558 | 0.8098 |
| 0.1212 | 2.8725 | 1172 | 0.6414 | 0.1034 | 0.6414 | 0.8009 |
| 0.1212 | 2.8775 | 1174 | 0.6365 | -0.0294 | 0.6365 | 0.7978 |
| 0.1212 | 2.8824 | 1176 | 0.6352 | 0.1962 | 0.6352 | 0.7970 |
| 0.1212 | 2.8873 | 1178 | 0.6246 | 0.1962 | 0.6246 | 0.7903 |
| 0.1212 | 2.8922 | 1180 | 0.6277 | -0.0294 | 0.6277 | 0.7923 |
| 0.1212 | 2.8971 | 1182 | 0.6271 | 0.0339 | 0.6271 | 0.7919 |
| 0.1212 | 2.9020 | 1184 | 0.6218 | 0.1409 | 0.6218 | 0.7886 |
| 0.1212 | 2.9069 | 1186 | 0.6191 | 0.1793 | 0.6191 | 0.7868 |
| 0.1212 | 2.9118 | 1188 | 0.6122 | 0.0099 | 0.6122 | 0.7824 |
| 0.1212 | 2.9167 | 1190 | 0.6226 | 0.1962 | 0.6226 | 0.7890 |
| 0.1212 | 2.9216 | 1192 | 0.6327 | 0.1962 | 0.6327 | 0.7954 |
| 0.1212 | 2.9265 | 1194 | 0.6284 | 0.1962 | 0.6284 | 0.7927 |
| 0.1212 | 2.9314 | 1196 | 0.6272 | 0.0503 | 0.6272 | 0.7920 |
| 0.1212 | 2.9363 | 1198 | 0.6282 | 0.0916 | 0.6282 | 0.7926 |
| 0.1212 | 2.9412 | 1200 | 0.6260 | 0.0916 | 0.6260 | 0.7912 |
| 0.1212 | 2.9461 | 1202 | 0.6322 | 0.0916 | 0.6322 | 0.7951 |
| 0.1212 | 2.9510 | 1204 | 0.6453 | 0.2186 | 0.6453 | 0.8033 |
| 0.1212 | 2.9559 | 1206 | 0.6579 | 0.1793 | 0.6579 | 0.8111 |
| 0.1212 | 2.9608 | 1208 | 0.6697 | 0.2186 | 0.6697 | 0.8183 |
| 0.1212 | 2.9657 | 1210 | 0.6697 | 0.1793 | 0.6697 | 0.8183 |
| 0.1212 | 2.9706 | 1212 | 0.6698 | 0.1793 | 0.6698 | 0.8184 |
| 0.1212 | 2.9755 | 1214 | 0.6784 | 0.1793 | 0.6784 | 0.8236 |
| 0.1212 | 2.9804 | 1216 | 0.6913 | 0.1793 | 0.6913 | 0.8315 |
| 0.1212 | 2.9853 | 1218 | 0.6965 | 0.1793 | 0.6965 | 0.8346 |
| 0.1212 | 2.9902 | 1220 | 0.7009 | 0.1409 | 0.7009 | 0.8372 |
| 0.1212 | 2.9951 | 1222 | 0.7186 | 0.1034 | 0.7186 | 0.8477 |
| 0.1212 | 3.0 | 1224 | 0.7407 | 0.2154 | 0.7407 | 0.8606 |
| 0.1212 | 3.0049 | 1226 | 0.7650 | 0.2154 | 0.7650 | 0.8746 |
| 0.1212 | 3.0098 | 1228 | 0.7814 | 0.2154 | 0.7814 | 0.8840 |
| 0.1212 | 3.0147 | 1230 | 0.7895 | 0.2154 | 0.7895 | 0.8885 |
| 0.1212 | 3.0196 | 1232 | 0.7814 | 0.2154 | 0.7814 | 0.8840 |
| 0.1212 | 3.0245 | 1234 | 0.7751 | 0.0957 | 0.7751 | 0.8804 |
| 0.1212 | 3.0294 | 1236 | 0.7702 | 0.0957 | 0.7702 | 0.8776 |
| 0.1212 | 3.0343 | 1238 | 0.7634 | 0.1034 | 0.7634 | 0.8737 |
| 0.1212 | 3.0392 | 1240 | 0.7536 | 0.1034 | 0.7536 | 0.8681 |
| 0.1212 | 3.0441 | 1242 | 0.7258 | 0.1793 | 0.7258 | 0.8519 |
| 0.1212 | 3.0490 | 1244 | 0.6818 | 0.0503 | 0.6818 | 0.8257 |
| 0.1212 | 3.0539 | 1246 | 0.6520 | -0.0678 | 0.6520 | 0.8075 |
| 0.1212 | 3.0588 | 1248 | 0.6756 | 0.2150 | 0.6756 | 0.8219 |
| 0.1212 | 3.0637 | 1250 | 0.7224 | 0.2150 | 0.7224 | 0.8500 |
| 0.1212 | 3.0686 | 1252 | 0.7444 | 0.1755 | 0.7444 | 0.8628 |
| 0.1212 | 3.0735 | 1254 | 0.7450 | 0.2150 | 0.7450 | 0.8631 |
| 0.1212 | 3.0784 | 1256 | 0.7266 | 0.2150 | 0.7266 | 0.8524 |
| 0.1212 | 3.0833 | 1258 | 0.7112 | 0.0957 | 0.7112 | 0.8433 |
| 0.1212 | 3.0882 | 1260 | 0.7306 | -0.0294 | 0.7306 | 0.8548 |
| 0.1212 | 3.0931 | 1262 | 0.7970 | 0.2186 | 0.7970 | 0.8928 |
| 0.1212 | 3.0980 | 1264 | 0.8659 | 0.1162 | 0.8659 | 0.9305 |
| 0.1212 | 3.1029 | 1266 | 0.8734 | 0.1162 | 0.8734 | 0.9346 |
| 0.1212 | 3.1078 | 1268 | 0.8305 | 0.1992 | 0.8305 | 0.9113 |
| 0.1212 | 3.1127 | 1270 | 0.7934 | 0.2317 | 0.7934 | 0.8907 |
| 0.1212 | 3.1176 | 1272 | 0.7629 | 0.2696 | 0.7629 | 0.8735 |
| 0.1212 | 3.1225 | 1274 | 0.7458 | 0.3226 | 0.7458 | 0.8636 |
| 0.1212 | 3.1275 | 1276 | 0.7510 | 0.2696 | 0.7510 | 0.8666 |
| 0.1212 | 3.1324 | 1278 | 0.7766 | 0.2317 | 0.7766 | 0.8813 |
| 0.1212 | 3.1373 | 1280 | 0.8266 | 0.1848 | 0.8266 | 0.9092 |
| 0.1212 | 3.1422 | 1282 | 0.8591 | 0.2734 | 0.8591 | 0.9269 |
| 0.1212 | 3.1471 | 1284 | 0.8567 | 0.3037 | 0.8567 | 0.9256 |
| 0.1212 | 3.1520 | 1286 | 0.8112 | 0.3687 | 0.8112 | 0.9007 |
| 0.1212 | 3.1569 | 1288 | 0.7511 | 0.2588 | 0.7511 | 0.8666 |
| 0.1212 | 3.1618 | 1290 | 0.6872 | 0.2588 | 0.6872 | 0.8290 |
| 0.1212 | 3.1667 | 1292 | 0.6302 | 0.1978 | 0.6302 | 0.7939 |
| 0.1212 | 3.1716 | 1294 | 0.6109 | 0.0735 | 0.6109 | 0.7816 |
| 0.1212 | 3.1765 | 1296 | 0.6134 | 0.2553 | 0.6134 | 0.7832 |
| 0.1212 | 3.1814 | 1298 | 0.6129 | 0.2553 | 0.6129 | 0.7829 |
| 0.1212 | 3.1863 | 1300 | 0.6005 | 0.0339 | 0.6005 | 0.7749 |
| 0.1212 | 3.1912 | 1302 | 0.6036 | 0.0735 | 0.6036 | 0.7769 |
| 0.1212 | 3.1961 | 1304 | 0.6201 | 0.0503 | 0.6201 | 0.7875 |
| 0.1212 | 3.2010 | 1306 | 0.6512 | 0.2588 | 0.6512 | 0.8070 |
| 0.1212 | 3.2059 | 1308 | 0.7153 | 0.2588 | 0.7153 | 0.8457 |
| 0.1212 | 3.2108 | 1310 | 0.7597 | 0.2588 | 0.7597 | 0.8716 |
| 0.1212 | 3.2157 | 1312 | 0.7538 | 0.2588 | 0.7538 | 0.8682 |
| 0.1212 | 3.2206 | 1314 | 0.7161 | 0.2588 | 0.7161 | 0.8462 |
| 0.1212 | 3.2255 | 1316 | 0.6582 | 0.2588 | 0.6582 | 0.8113 |
| 0.1212 | 3.2304 | 1318 | 0.6287 | 0.0099 | 0.6287 | 0.7929 |
| 0.1212 | 3.2353 | 1320 | 0.6382 | 0.1560 | 0.6382 | 0.7989 |
| 0.1212 | 3.2402 | 1322 | 0.6511 | 0.1560 | 0.6511 | 0.8069 |
| 0.1212 | 3.2451 | 1324 | 0.6503 | 0.1356 | 0.6503 | 0.8064 |
| 0.1212 | 3.25 | 1326 | 0.6509 | -0.0294 | 0.6509 | 0.8068 |
| 0.1212 | 3.2549 | 1328 | 0.6657 | 0.1793 | 0.6657 | 0.8159 |
| 0.1212 | 3.2598 | 1330 | 0.6928 | 0.2186 | 0.6928 | 0.8323 |
| 0.1212 | 3.2647 | 1332 | 0.7036 | 0.2186 | 0.7036 | 0.8388 |
| 0.1212 | 3.2696 | 1334 | 0.7117 | 0.2186 | 0.7117 | 0.8436 |
| 0.1212 | 3.2745 | 1336 | 0.6872 | 0.1793 | 0.6872 | 0.8290 |
| 0.1212 | 3.2794 | 1338 | 0.6692 | 0.1409 | 0.6692 | 0.8181 |
| 0.1212 | 3.2843 | 1340 | 0.6413 | 0.0099 | 0.6413 | 0.8008 |
| 0.1212 | 3.2892 | 1342 | 0.6219 | 0.0735 | 0.6219 | 0.7886 |
| 0.1212 | 3.2941 | 1344 | 0.6120 | 0.1141 | 0.6120 | 0.7823 |
| 0.1212 | 3.2990 | 1346 | 0.6147 | 0.1558 | 0.6147 | 0.7840 |
| 0.1212 | 3.3039 | 1348 | 0.6333 | 0.1340 | 0.6333 | 0.7958 |
| 0.1212 | 3.3088 | 1350 | 0.6374 | 0.1340 | 0.6374 | 0.7984 |
| 0.1212 | 3.3137 | 1352 | 0.6323 | 0.1985 | 0.6323 | 0.7952 |
| 0.1212 | 3.3186 | 1354 | 0.6119 | 0.1985 | 0.6119 | 0.7822 |
| 0.1212 | 3.3235 | 1356 | 0.5826 | 0.1558 | 0.5826 | 0.7633 |
| 0.1212 | 3.3284 | 1358 | 0.5638 | 0.1141 | 0.5638 | 0.7508 |
| 0.1212 | 3.3333 | 1360 | 0.5615 | 0.2794 | 0.5615 | 0.7493 |
| 0.1212 | 3.3382 | 1362 | 0.5737 | 0.2794 | 0.5737 | 0.7574 |
| 0.1212 | 3.3431 | 1364 | 0.5984 | 0.2794 | 0.5984 | 0.7735 |
| 0.1212 | 3.3480 | 1366 | 0.6173 | 0.2794 | 0.6173 | 0.7857 |
| 0.1212 | 3.3529 | 1368 | 0.6508 | 0.3865 | 0.6508 | 0.8067 |
| 0.1212 | 3.3578 | 1370 | 0.7077 | 0.3163 | 0.7077 | 0.8412 |
| 0.1212 | 3.3627 | 1372 | 0.7377 | 0.3163 | 0.7377 | 0.8589 |
| 0.1212 | 3.3676 | 1374 | 0.7257 | 0.3163 | 0.7257 | 0.8519 |
| 0.1212 | 3.3725 | 1376 | 0.6912 | 0.3163 | 0.6912 | 0.8314 |
| 0.1212 | 3.3775 | 1378 | 0.6567 | 0.3163 | 0.6567 | 0.8104 |
| 0.1212 | 3.3824 | 1380 | 0.6102 | 0.3163 | 0.6102 | 0.7812 |
| 0.1212 | 3.3873 | 1382 | 0.6045 | 0.3163 | 0.6045 | 0.7775 |
| 0.1212 | 3.3922 | 1384 | 0.6079 | 0.3163 | 0.6079 | 0.7797 |
| 0.1212 | 3.3971 | 1386 | 0.6079 | 0.3163 | 0.6079 | 0.7797 |
| 0.1212 | 3.4020 | 1388 | 0.6066 | 0.3163 | 0.6066 | 0.7788 |
| 0.1212 | 3.4069 | 1390 | 0.5949 | 0.3163 | 0.5949 | 0.7713 |
| 0.1212 | 3.4118 | 1392 | 0.5952 | 0.3163 | 0.5952 | 0.7715 |
| 0.1212 | 3.4167 | 1394 | 0.6242 | 0.2588 | 0.6242 | 0.7901 |
| 0.1212 | 3.4216 | 1396 | 0.6646 | 0.2588 | 0.6646 | 0.8152 |
| 0.1212 | 3.4265 | 1398 | 0.6828 | 0.2588 | 0.6828 | 0.8263 |
| 0.1212 | 3.4314 | 1400 | 0.6958 | 0.2588 | 0.6958 | 0.8342 |
| 0.1212 | 3.4363 | 1402 | 0.6845 | 0.2588 | 0.6845 | 0.8274 |
| 0.1212 | 3.4412 | 1404 | 0.6459 | 0.2186 | 0.6459 | 0.8037 |
| 0.1212 | 3.4461 | 1406 | 0.6205 | 0.2154 | 0.6205 | 0.7877 |
| 0.1212 | 3.4510 | 1408 | 0.6152 | 0.2696 | 0.6152 | 0.7844 |
| 0.1212 | 3.4559 | 1410 | 0.6115 | 0.1560 | 0.6115 | 0.7820 |
| 0.1212 | 3.4608 | 1412 | 0.5999 | 0.2150 | 0.5999 | 0.7745 |
| 0.1212 | 3.4657 | 1414 | 0.5983 | 0.1978 | 0.5983 | 0.7735 |
| 0.1212 | 3.4706 | 1416 | 0.6313 | 0.3163 | 0.6313 | 0.7945 |
| 0.1212 | 3.4755 | 1418 | 0.6568 | 0.2588 | 0.6568 | 0.8105 |
| 0.1212 | 3.4804 | 1420 | 0.7026 | 0.2588 | 0.7026 | 0.8382 |
| 0.1212 | 3.4853 | 1422 | 0.7229 | 0.2588 | 0.7229 | 0.8503 |
| 0.1212 | 3.4902 | 1424 | 0.7069 | 0.2588 | 0.7069 | 0.8408 |
| 0.1212 | 3.4951 | 1426 | 0.6812 | 0.2588 | 0.6812 | 0.8253 |
| 0.1212 | 3.5 | 1428 | 0.6720 | 0.2588 | 0.6720 | 0.8197 |
| 0.1212 | 3.5049 | 1430 | 0.6791 | 0.2588 | 0.6791 | 0.8241 |
| 0.1212 | 3.5098 | 1432 | 0.6596 | 0.2588 | 0.6596 | 0.8121 |
| 0.1212 | 3.5147 | 1434 | 0.6479 | 0.1340 | 0.6479 | 0.8049 |
| 0.1212 | 3.5196 | 1436 | 0.6485 | -0.0048 | 0.6485 | 0.8053 |
| 0.1212 | 3.5245 | 1438 | 0.6534 | 0.0503 | 0.6534 | 0.8083 |
| 0.1212 | 3.5294 | 1440 | 0.6638 | 0.2588 | 0.6638 | 0.8147 |
| 0.1212 | 3.5343 | 1442 | 0.6916 | 0.2588 | 0.6916 | 0.8316 |
| 0.1212 | 3.5392 | 1444 | 0.7170 | 0.2588 | 0.7170 | 0.8468 |
| 0.1212 | 3.5441 | 1446 | 0.7221 | 0.2588 | 0.7221 | 0.8498 |
| 0.1212 | 3.5490 | 1448 | 0.7272 | 0.1793 | 0.7272 | 0.8528 |
| 0.1212 | 3.5539 | 1450 | 0.7159 | 0.2154 | 0.7159 | 0.8461 |
| 0.1212 | 3.5588 | 1452 | 0.7029 | 0.2696 | 0.7029 | 0.8384 |
| 0.1212 | 3.5637 | 1454 | 0.6966 | 0.2696 | 0.6966 | 0.8346 |
| 0.1212 | 3.5686 | 1456 | 0.6862 | 0.2696 | 0.6862 | 0.8284 |
| 0.1212 | 3.5735 | 1458 | 0.6811 | 0.2696 | 0.6811 | 0.8253 |
| 0.1212 | 3.5784 | 1460 | 0.6799 | 0.1978 | 0.6799 | 0.8246 |
| 0.1212 | 3.5833 | 1462 | 0.7066 | 0.1793 | 0.7066 | 0.8406 |
| 0.1212 | 3.5882 | 1464 | 0.7253 | 0.2588 | 0.7253 | 0.8517 |
| 0.1212 | 3.5931 | 1466 | 0.7126 | 0.2588 | 0.7126 | 0.8442 |
| 0.1212 | 3.5980 | 1468 | 0.7129 | 0.2588 | 0.7129 | 0.8443 |
| 0.1212 | 3.6029 | 1470 | 0.7152 | 0.2588 | 0.7152 | 0.8457 |
| 0.1212 | 3.6078 | 1472 | 0.7071 | 0.2588 | 0.7071 | 0.8409 |
| 0.1212 | 3.6127 | 1474 | 0.6837 | 0.1793 | 0.6837 | 0.8269 |
| 0.1212 | 3.6176 | 1476 | 0.6841 | 0.1978 | 0.6841 | 0.8271 |
| 0.1212 | 3.6225 | 1478 | 0.6879 | 0.3077 | 0.6879 | 0.8294 |
| 0.1212 | 3.6275 | 1480 | 0.6937 | 0.2696 | 0.6937 | 0.8329 |
| 0.1212 | 3.6324 | 1482 | 0.6995 | 0.2696 | 0.6995 | 0.8364 |
| 0.1212 | 3.6373 | 1484 | 0.7072 | 0.2696 | 0.7072 | 0.8409 |
| 0.1212 | 3.6422 | 1486 | 0.7205 | 0.2533 | 0.7205 | 0.8488 |
| 0.1212 | 3.6471 | 1488 | 0.7153 | 0.2533 | 0.7153 | 0.8457 |
| 0.1212 | 3.6520 | 1490 | 0.7201 | 0.1409 | 0.7201 | 0.8486 |
| 0.1212 | 3.6569 | 1492 | 0.7354 | 0.1793 | 0.7354 | 0.8576 |
| 0.1212 | 3.6618 | 1494 | 0.7457 | 0.2186 | 0.7457 | 0.8635 |
| 0.1212 | 3.6667 | 1496 | 0.7241 | 0.1793 | 0.7241 | 0.8510 |
| 0.1212 | 3.6716 | 1498 | 0.6961 | 0.1409 | 0.6961 | 0.8343 |
| 0.0791 | 3.6765 | 1500 | 0.6778 | 0.1356 | 0.6778 | 0.8233 |
| 0.0791 | 3.6814 | 1502 | 0.6761 | 0.1356 | 0.6761 | 0.8222 |
| 0.0791 | 3.6863 | 1504 | 0.6859 | 0.0957 | 0.6859 | 0.8282 |
| 0.0791 | 3.6912 | 1506 | 0.6944 | 0.0957 | 0.6944 | 0.8333 |
| 0.0791 | 3.6961 | 1508 | 0.7040 | 0.0957 | 0.7040 | 0.8390 |
| 0.0791 | 3.7010 | 1510 | 0.7152 | 0.1356 | 0.7152 | 0.8457 |
| 0.0791 | 3.7059 | 1512 | 0.7406 | 0.2588 | 0.7406 | 0.8606 |
| 0.0791 | 3.7108 | 1514 | 0.7760 | 0.2588 | 0.7760 | 0.8809 |
| 0.0791 | 3.7157 | 1516 | 0.7728 | 0.2588 | 0.7728 | 0.8791 |
| 0.0791 | 3.7206 | 1518 | 0.7336 | 0.2588 | 0.7336 | 0.8565 |
| 0.0791 | 3.7255 | 1520 | 0.7011 | 0.2588 | 0.7011 | 0.8373 |
| 0.0791 | 3.7304 | 1522 | 0.6835 | 0.1793 | 0.6835 | 0.8267 |
| 0.0791 | 3.7353 | 1524 | 0.6768 | 0.1409 | 0.6768 | 0.8227 |
| 0.0791 | 3.7402 | 1526 | 0.6799 | 0.1409 | 0.6799 | 0.8245 |
| 0.0791 | 3.7451 | 1528 | 0.6861 | 0.2921 | 0.6861 | 0.8283 |
| 0.0791 | 3.75 | 1530 | 0.6959 | 0.2533 | 0.6959 | 0.8342 |
| 0.0791 | 3.7549 | 1532 | 0.6940 | 0.2154 | 0.6940 | 0.8331 |
| 0.0791 | 3.7598 | 1534 | 0.6892 | 0.2154 | 0.6892 | 0.8302 |
| 0.0791 | 3.7647 | 1536 | 0.7028 | 0.2154 | 0.7028 | 0.8383 |
| 0.0791 | 3.7696 | 1538 | 0.7101 | 0.0957 | 0.7101 | 0.8427 |
| 0.0791 | 3.7745 | 1540 | 0.7086 | 0.2154 | 0.7086 | 0.8418 |
| 0.0791 | 3.7794 | 1542 | 0.7038 | 0.2154 | 0.7038 | 0.8389 |
| 0.0791 | 3.7843 | 1544 | 0.6954 | 0.2154 | 0.6954 | 0.8339 |
| 0.0791 | 3.7892 | 1546 | 0.7015 | 0.3318 | 0.7015 | 0.8376 |
| 0.0791 | 3.7941 | 1548 | 0.7003 | 0.1793 | 0.7003 | 0.8368 |
| 0.0791 | 3.7990 | 1550 | 0.6815 | 0.1793 | 0.6815 | 0.8255 |
| 0.0791 | 3.8039 | 1552 | 0.6841 | 0.2186 | 0.6841 | 0.8271 |
| 0.0791 | 3.8088 | 1554 | 0.6907 | 0.2186 | 0.6907 | 0.8311 |
| 0.0791 | 3.8137 | 1556 | 0.6936 | 0.2186 | 0.6936 | 0.8328 |
| 0.0791 | 3.8186 | 1558 | 0.6869 | 0.2186 | 0.6869 | 0.8288 |
| 0.0791 | 3.8235 | 1560 | 0.6744 | 0.2921 | 0.6744 | 0.8212 |
| 0.0791 | 3.8284 | 1562 | 0.6651 | 0.2154 | 0.6651 | 0.8156 |
| 0.0791 | 3.8333 | 1564 | 0.6676 | 0.2154 | 0.6676 | 0.8171 |
| 0.0791 | 3.8382 | 1566 | 0.6710 | 0.2154 | 0.6710 | 0.8191 |
| 0.0791 | 3.8431 | 1568 | 0.6762 | 0.2533 | 0.6762 | 0.8223 |
| 0.0791 | 3.8480 | 1570 | 0.6898 | 0.1034 | 0.6898 | 0.8305 |
| 0.0791 | 3.8529 | 1572 | 0.6950 | 0.2186 | 0.6950 | 0.8337 |
| 0.0791 | 3.8578 | 1574 | 0.6865 | 0.2186 | 0.6865 | 0.8286 |
| 0.0791 | 3.8627 | 1576 | 0.6698 | 0.1409 | 0.6698 | 0.8184 |
| 0.0791 | 3.8676 | 1578 | 0.6508 | 0.2154 | 0.6508 | 0.8067 |
| 0.0791 | 3.8725 | 1580 | 0.6741 | 0.2150 | 0.6741 | 0.8210 |
| 0.0791 | 3.8775 | 1582 | 0.7014 | 0.1755 | 0.7014 | 0.8375 |
| 0.0791 | 3.8824 | 1584 | 0.6867 | 0.2851 | 0.6867 | 0.8286 |
| 0.0791 | 3.8873 | 1586 | 0.6669 | 0.3226 | 0.6669 | 0.8166 |
| 0.0791 | 3.8922 | 1588 | 0.6563 | 0.3226 | 0.6563 | 0.8101 |
| 0.0791 | 3.8971 | 1590 | 0.6450 | 0.3077 | 0.6450 | 0.8031 |
| 0.0791 | 3.9020 | 1592 | 0.6484 | 0.3467 | 0.6484 | 0.8053 |
| 0.0791 | 3.9069 | 1594 | 0.6492 | 0.3467 | 0.6492 | 0.8057 |
| 0.0791 | 3.9118 | 1596 | 0.6567 | 0.3077 | 0.6567 | 0.8104 |
| 0.0791 | 3.9167 | 1598 | 0.6630 | 0.3077 | 0.6630 | 0.8143 |
| 0.0791 | 3.9216 | 1600 | 0.6768 | 0.3077 | 0.6768 | 0.8227 |
| 0.0791 | 3.9265 | 1602 | 0.6933 | 0.3077 | 0.6933 | 0.8327 |
| 0.0791 | 3.9314 | 1604 | 0.7030 | 0.1600 | 0.7030 | 0.8385 |
| 0.0791 | 3.9363 | 1606 | 0.7104 | 0.1978 | 0.7104 | 0.8428 |
| 0.0791 | 3.9412 | 1608 | 0.7090 | 0.1978 | 0.7090 | 0.8420 |
| 0.0791 | 3.9461 | 1610 | 0.7204 | 0.2364 | 0.7204 | 0.8488 |
| 0.0791 | 3.9510 | 1612 | 0.7198 | 0.2364 | 0.7198 | 0.8484 |
| 0.0791 | 3.9559 | 1614 | 0.7037 | 0.2364 | 0.7037 | 0.8388 |
| 0.0791 | 3.9608 | 1616 | 0.6855 | 0.2364 | 0.6855 | 0.8279 |
| 0.0791 | 3.9657 | 1618 | 0.6595 | 0.1600 | 0.6595 | 0.8121 |
| 0.0791 | 3.9706 | 1620 | 0.6500 | 0.1600 | 0.6500 | 0.8063 |
| 0.0791 | 3.9755 | 1622 | 0.6526 | 0.1600 | 0.6526 | 0.8078 |
| 0.0791 | 3.9804 | 1624 | 0.6712 | 0.2364 | 0.6712 | 0.8193 |
| 0.0791 | 3.9853 | 1626 | 0.7021 | 0.2186 | 0.7021 | 0.8379 |
| 0.0791 | 3.9902 | 1628 | 0.7048 | 0.2186 | 0.7048 | 0.8395 |
| 0.0791 | 3.9951 | 1630 | 0.6990 | 0.2186 | 0.6990 | 0.8360 |
| 0.0791 | 4.0000 | 1632 | 0.7031 | 0.2186 | 0.7031 | 0.8385 |
| 0.0791 | 4.0049 | 1634 | 0.6996 | 0.2186 | 0.6996 | 0.8364 |
| 0.0791 | 4.0098 | 1636 | 0.6952 | 0.1409 | 0.6952 | 0.8338 |
| 0.0791 | 4.0147 | 1638 | 0.7047 | 0.1034 | 0.7047 | 0.8395 |
| 0.0791 | 4.0196 | 1640 | 0.7232 | 0.1034 | 0.7232 | 0.8504 |
| 0.0791 | 4.0245 | 1642 | 0.7470 | 0.1034 | 0.7470 | 0.8643 |
| 0.0791 | 4.0294 | 1644 | 0.7678 | 0.1409 | 0.7678 | 0.8762 |
| 0.0791 | 4.0343 | 1646 | 0.8113 | 0.1793 | 0.8113 | 0.9007 |
| 0.0791 | 4.0392 | 1648 | 0.8505 | 0.3255 | 0.8505 | 0.9222 |
| 0.0791 | 4.0441 | 1650 | 0.8698 | 0.3255 | 0.8698 | 0.9326 |
| 0.0791 | 4.0490 | 1652 | 0.8610 | 0.3255 | 0.8610 | 0.9279 |
| 0.0791 | 4.0539 | 1654 | 0.8489 | 0.3255 | 0.8489 | 0.9213 |
| 0.0791 | 4.0588 | 1656 | 0.8233 | 0.1409 | 0.8233 | 0.9073 |
| 0.0791 | 4.0637 | 1658 | 0.8296 | 0.1409 | 0.8296 | 0.9108 |
| 0.0791 | 4.0686 | 1660 | 0.8387 | 0.2186 | 0.8387 | 0.9158 |
| 0.0791 | 4.0735 | 1662 | 0.8289 | 0.2186 | 0.8289 | 0.9104 |
| 0.0791 | 4.0784 | 1664 | 0.8289 | 0.2186 | 0.8289 | 0.9104 |
| 0.0791 | 4.0833 | 1666 | 0.8186 | 0.2186 | 0.8186 | 0.9048 |
| 0.0791 | 4.0882 | 1668 | 0.8049 | 0.1793 | 0.8049 | 0.8972 |
| 0.0791 | 4.0931 | 1670 | 0.7837 | 0.1793 | 0.7837 | 0.8853 |
| 0.0791 | 4.0980 | 1672 | 0.7665 | 0.1409 | 0.7665 | 0.8755 |
| 0.0791 | 4.1029 | 1674 | 0.7436 | 0.1600 | 0.7436 | 0.8623 |
| 0.0791 | 4.1078 | 1676 | 0.7272 | 0.1600 | 0.7272 | 0.8528 |
| 0.0791 | 4.1127 | 1678 | 0.7201 | 0.1978 | 0.7201 | 0.8486 |
| 0.0791 | 4.1176 | 1680 | 0.7042 | 0.1409 | 0.7042 | 0.8392 |
| 0.0791 | 4.1225 | 1682 | 0.6854 | 0.1409 | 0.6854 | 0.8279 |
| 0.0791 | 4.1275 | 1684 | 0.6862 | 0.1409 | 0.6862 | 0.8284 |
| 0.0791 | 4.1324 | 1686 | 0.6993 | 0.1793 | 0.6993 | 0.8362 |
| 0.0791 | 4.1373 | 1688 | 0.7311 | 0.2186 | 0.7311 | 0.8550 |
| 0.0791 | 4.1422 | 1690 | 0.7596 | 0.2186 | 0.7596 | 0.8715 |
| 0.0791 | 4.1471 | 1692 | 0.7471 | 0.2186 | 0.7471 | 0.8644 |
| 0.0791 | 4.1520 | 1694 | 0.7068 | 0.2186 | 0.7068 | 0.8407 |
| 0.0791 | 4.1569 | 1696 | 0.6582 | 0.1034 | 0.6582 | 0.8113 |
| 0.0791 | 4.1618 | 1698 | 0.6384 | 0.1962 | 0.6384 | 0.7990 |
| 0.0791 | 4.1667 | 1700 | 0.6446 | 0.2150 | 0.6446 | 0.8029 |
| 0.0791 | 4.1716 | 1702 | 0.6478 | 0.2150 | 0.6478 | 0.8048 |
| 0.0791 | 4.1765 | 1704 | 0.6374 | 0.2150 | 0.6374 | 0.7983 |
| 0.0791 | 4.1814 | 1706 | 0.6339 | 0.2553 | 0.6339 | 0.7962 |
| 0.0791 | 4.1863 | 1708 | 0.6347 | 0.1356 | 0.6347 | 0.7967 |
| 0.0791 | 4.1912 | 1710 | 0.6393 | 0.2921 | 0.6393 | 0.7996 |
| 0.0791 | 4.1961 | 1712 | 0.6482 | 0.1793 | 0.6482 | 0.8051 |
| 0.0791 | 4.2010 | 1714 | 0.6547 | 0.2186 | 0.6547 | 0.8091 |
| 0.0791 | 4.2059 | 1716 | 0.6433 | 0.2186 | 0.6433 | 0.8021 |
| 0.0791 | 4.2108 | 1718 | 0.6339 | 0.2186 | 0.6339 | 0.7962 |
| 0.0791 | 4.2157 | 1720 | 0.6289 | 0.0503 | 0.6289 | 0.7930 |
| 0.0791 | 4.2206 | 1722 | 0.6347 | 0.1793 | 0.6347 | 0.7967 |
| 0.0791 | 4.2255 | 1724 | 0.6359 | 0.1765 | 0.6359 | 0.7974 |
| 0.0791 | 4.2304 | 1726 | 0.6393 | 0.2921 | 0.6393 | 0.7996 |
| 0.0791 | 4.2353 | 1728 | 0.6440 | 0.3318 | 0.6440 | 0.8025 |
| 0.0791 | 4.2402 | 1730 | 0.6491 | 0.3318 | 0.6491 | 0.8056 |
| 0.0791 | 4.2451 | 1732 | 0.6437 | 0.2921 | 0.6437 | 0.8023 |
| 0.0791 | 4.2500 | 1734 | 0.6366 | 0.1765 | 0.6366 | 0.7979 |
| 0.0791 | 4.2549 | 1736 | 0.6388 | 0.1560 | 0.6388 | 0.7992 |
| 0.0791 | 4.2598 | 1738 | 0.6462 | 0.2150 | 0.6462 | 0.8039 |
| 0.0791 | 4.2647 | 1740 | 0.6544 | 0.2150 | 0.6544 | 0.8089 |
| 0.0791 | 4.2696 | 1742 | 0.6589 | 0.2696 | 0.6589 | 0.8117 |
| 0.0791 | 4.2745 | 1744 | 0.6620 | 0.2696 | 0.6620 | 0.8136 |
| 0.0791 | 4.2794 | 1746 | 0.6786 | 0.3318 | 0.6786 | 0.8237 |
| 0.0791 | 4.2843 | 1748 | 0.7055 | 0.2186 | 0.7055 | 0.8400 |
| 0.0791 | 4.2892 | 1750 | 0.7199 | 0.2186 | 0.7199 | 0.8485 |
| 0.0791 | 4.2941 | 1752 | 0.7248 | 0.2186 | 0.7248 | 0.8513 |
| 0.0791 | 4.2990 | 1754 | 0.7232 | 0.2186 | 0.7232 | 0.8504 |
| 0.0791 | 4.3039 | 1756 | 0.7045 | 0.1793 | 0.7045 | 0.8393 |
| 0.0791 | 4.3088 | 1758 | 0.6700 | 0.2921 | 0.6700 | 0.8185 |
| 0.0791 | 4.3137 | 1760 | 0.6459 | 0.3077 | 0.6459 | 0.8037 |
| 0.0791 | 4.3186 | 1762 | 0.6375 | 0.2696 | 0.6375 | 0.7984 |
| 0.0791 | 4.3235 | 1764 | 0.6266 | 0.3226 | 0.6266 | 0.7916 |
| 0.0791 | 4.3284 | 1766 | 0.6143 | 0.3609 | 0.6143 | 0.7838 |
| 0.0791 | 4.3333 | 1768 | 0.6055 | 0.3467 | 0.6055 | 0.7781 |
| 0.0791 | 4.3382 | 1770 | 0.6156 | 0.1409 | 0.6156 | 0.7846 |
| 0.0791 | 4.3431 | 1772 | 0.6485 | 0.2186 | 0.6485 | 0.8053 |
| 0.0791 | 4.3480 | 1774 | 0.6657 | 0.2588 | 0.6657 | 0.8159 |
| 0.0791 | 4.3529 | 1776 | 0.6797 | 0.2588 | 0.6797 | 0.8244 |
| 0.0791 | 4.3578 | 1778 | 0.6674 | 0.2588 | 0.6674 | 0.8170 |
| 0.0791 | 4.3627 | 1780 | 0.6352 | 0.2186 | 0.6352 | 0.7970 |
| 0.0791 | 4.3676 | 1782 | 0.6177 | 0.1793 | 0.6177 | 0.7859 |
| 0.0791 | 4.3725 | 1784 | 0.6099 | 0.2921 | 0.6099 | 0.7810 |
| 0.0791 | 4.3775 | 1786 | 0.6069 | 0.3467 | 0.6069 | 0.7791 |
| 0.0791 | 4.3824 | 1788 | 0.6065 | 0.2921 | 0.6065 | 0.7788 |
| 0.0791 | 4.3873 | 1790 | 0.6111 | 0.3318 | 0.6111 | 0.7817 |
| 0.0791 | 4.3922 | 1792 | 0.6225 | 0.2186 | 0.6225 | 0.7890 |
| 0.0791 | 4.3971 | 1794 | 0.6230 | 0.2186 | 0.6230 | 0.7893 |
| 0.0791 | 4.4020 | 1796 | 0.6145 | 0.2186 | 0.6145 | 0.7839 |
| 0.0791 | 4.4069 | 1798 | 0.6213 | 0.2186 | 0.6213 | 0.7882 |
| 0.0791 | 4.4118 | 1800 | 0.6351 | 0.2588 | 0.6351 | 0.7970 |
| 0.0791 | 4.4167 | 1802 | 0.6360 | 0.2186 | 0.6360 | 0.7975 |
| 0.0791 | 4.4216 | 1804 | 0.6447 | 0.2186 | 0.6447 | 0.8029 |
| 0.0791 | 4.4265 | 1806 | 0.6423 | 0.2186 | 0.6423 | 0.8015 |
| 0.0791 | 4.4314 | 1808 | 0.6185 | 0.3724 | 0.6185 | 0.7864 |
| 0.0791 | 4.4363 | 1810 | 0.6105 | 0.3724 | 0.6105 | 0.7814 |
| 0.0791 | 4.4412 | 1812 | 0.6064 | 0.3724 | 0.6064 | 0.7787 |
| 0.0791 | 4.4461 | 1814 | 0.6078 | 0.3724 | 0.6078 | 0.7796 |
| 0.0791 | 4.4510 | 1816 | 0.6086 | 0.3724 | 0.6086 | 0.7801 |
| 0.0791 | 4.4559 | 1818 | 0.5950 | 0.3318 | 0.5950 | 0.7713 |
| 0.0791 | 4.4608 | 1820 | 0.5914 | 0.1962 | 0.5914 | 0.7690 |
| 0.0791 | 4.4657 | 1822 | 0.5938 | 0.1962 | 0.5938 | 0.7706 |
| 0.0791 | 4.4706 | 1824 | 0.5969 | 0.1962 | 0.5969 | 0.7726 |
| 0.0791 | 4.4755 | 1826 | 0.6043 | 0.1962 | 0.6043 | 0.7774 |
| 0.0791 | 4.4804 | 1828 | 0.6081 | 0.1962 | 0.6081 | 0.7798 |
| 0.0791 | 4.4853 | 1830 | 0.5997 | 0.1962 | 0.5997 | 0.7744 |
| 0.0791 | 4.4902 | 1832 | 0.5905 | 0.1962 | 0.5905 | 0.7685 |
| 0.0791 | 4.4951 | 1834 | 0.5845 | 0.2373 | 0.5845 | 0.7645 |
| 0.0791 | 4.5000 | 1836 | 0.5791 | 0.2373 | 0.5791 | 0.7610 |
| 0.0791 | 4.5049 | 1838 | 0.5741 | 0.2373 | 0.5741 | 0.7577 |
| 0.0791 | 4.5098 | 1840 | 0.5715 | 0.2373 | 0.5715 | 0.7560 |
| 0.0791 | 4.5147 | 1842 | 0.5695 | 0.1765 | 0.5695 | 0.7547 |
| 0.0791 | 4.5196 | 1844 | 0.5697 | 0.2184 | 0.5697 | 0.7548 |
| 0.0791 | 4.5245 | 1846 | 0.5658 | 0.2184 | 0.5658 | 0.7522 |
| 0.0791 | 4.5294 | 1848 | 0.5707 | 0.3724 | 0.5707 | 0.7555 |
| 0.0791 | 4.5343 | 1850 | 0.5879 | 0.3724 | 0.5879 | 0.7667 |
| 0.0791 | 4.5392 | 1852 | 0.6251 | 0.2186 | 0.6251 | 0.7907 |
| 0.0791 | 4.5441 | 1854 | 0.6718 | 0.2186 | 0.6718 | 0.8197 |
| 0.0791 | 4.5490 | 1856 | 0.6709 | 0.2186 | 0.6709 | 0.8191 |
| 0.0791 | 4.5539 | 1858 | 0.6502 | 0.2186 | 0.6502 | 0.8064 |
| 0.0791 | 4.5588 | 1860 | 0.6372 | 0.3724 | 0.6372 | 0.7982 |
| 0.0791 | 4.5637 | 1862 | 0.6120 | 0.3467 | 0.6120 | 0.7823 |
| 0.0791 | 4.5686 | 1864 | 0.5944 | 0.3077 | 0.5944 | 0.7710 |
| 0.0791 | 4.5735 | 1866 | 0.5952 | 0.3077 | 0.5952 | 0.7715 |
| 0.0791 | 4.5784 | 1868 | 0.6035 | 0.3077 | 0.6035 | 0.7769 |
| 0.0791 | 4.5833 | 1870 | 0.6174 | 0.3077 | 0.6174 | 0.7857 |
| 0.0791 | 4.5882 | 1872 | 0.6202 | 0.3077 | 0.6202 | 0.7875 |
| 0.0791 | 4.5931 | 1874 | 0.6141 | 0.3077 | 0.6141 | 0.7836 |
| 0.0791 | 4.5980 | 1876 | 0.6143 | 0.3077 | 0.6143 | 0.7838 |
| 0.0791 | 4.6029 | 1878 | 0.6190 | 0.3077 | 0.6190 | 0.7868 |
| 0.0791 | 4.6078 | 1880 | 0.6174 | 0.3609 | 0.6174 | 0.7858 |
| 0.0791 | 4.6127 | 1882 | 0.6118 | 0.3609 | 0.6118 | 0.7822 |
| 0.0791 | 4.6176 | 1884 | 0.5976 | 0.3077 | 0.5976 | 0.7731 |
| 0.0791 | 4.6225 | 1886 | 0.5875 | 0.3609 | 0.5875 | 0.7665 |
| 0.0791 | 4.6275 | 1888 | 0.5813 | 0.3609 | 0.5813 | 0.7624 |
| 0.0791 | 4.6324 | 1890 | 0.5859 | 0.3609 | 0.5859 | 0.7655 |
| 0.0791 | 4.6373 | 1892 | 0.5935 | 0.3787 | 0.5935 | 0.7704 |
| 0.0791 | 4.6422 | 1894 | 0.5933 | 0.3609 | 0.5933 | 0.7703 |
| 0.0791 | 4.6471 | 1896 | 0.5921 | 0.3609 | 0.5921 | 0.7695 |
| 0.0791 | 4.6520 | 1898 | 0.5860 | 0.3077 | 0.5860 | 0.7655 |
| 0.0791 | 4.6569 | 1900 | 0.5861 | 0.3467 | 0.5861 | 0.7655 |
| 0.0791 | 4.6618 | 1902 | 0.5949 | 0.3865 | 0.5949 | 0.7713 |
| 0.0791 | 4.6667 | 1904 | 0.5916 | 0.4273 | 0.5916 | 0.7692 |
| 0.0791 | 4.6716 | 1906 | 0.5844 | 0.4273 | 0.5844 | 0.7645 |
| 0.0791 | 4.6765 | 1908 | 0.5682 | 0.3163 | 0.5682 | 0.7538 |
| 0.0791 | 4.6814 | 1910 | 0.5449 | 0.3163 | 0.5449 | 0.7382 |
| 0.0791 | 4.6863 | 1912 | 0.5272 | 0.2759 | 0.5272 | 0.7261 |
| 0.0791 | 4.6912 | 1914 | 0.5094 | 0.2364 | 0.5094 | 0.7137 |
| 0.0791 | 4.6961 | 1916 | 0.4956 | 0.2794 | 0.4956 | 0.7040 |
| 0.0791 | 4.7010 | 1918 | 0.4901 | 0.2794 | 0.4901 | 0.7001 |
| 0.0791 | 4.7059 | 1920 | 0.4917 | 0.3390 | 0.4917 | 0.7012 |
| 0.0791 | 4.7108 | 1922 | 0.4968 | 0.3390 | 0.4968 | 0.7048 |
| 0.0791 | 4.7157 | 1924 | 0.5044 | 0.2967 | 0.5044 | 0.7102 |
| 0.0791 | 4.7206 | 1926 | 0.5157 | 0.4400 | 0.5157 | 0.7181 |
| 0.0791 | 4.7255 | 1928 | 0.5382 | 0.3865 | 0.5382 | 0.7336 |
| 0.0791 | 4.7304 | 1930 | 0.5782 | 0.3318 | 0.5782 | 0.7604 |
| 0.0791 | 4.7353 | 1932 | 0.6183 | 0.3318 | 0.6183 | 0.7863 |
| 0.0791 | 4.7402 | 1934 | 0.6317 | 0.3318 | 0.6317 | 0.7948 |
| 0.0791 | 4.7451 | 1936 | 0.6400 | 0.3318 | 0.6400 | 0.8000 |
| 0.0791 | 4.7500 | 1938 | 0.6419 | 0.3318 | 0.6419 | 0.8012 |
| 0.0791 | 4.7549 | 1940 | 0.6341 | 0.3318 | 0.6341 | 0.7963 |
| 0.0791 | 4.7598 | 1942 | 0.6109 | 0.3318 | 0.6109 | 0.7816 |
| 0.0791 | 4.7647 | 1944 | 0.5942 | 0.3467 | 0.5942 | 0.7708 |
| 0.0791 | 4.7696 | 1946 | 0.5860 | 0.3077 | 0.5860 | 0.7655 |
| 0.0791 | 4.7745 | 1948 | 0.5697 | 0.3077 | 0.5697 | 0.7548 |
| 0.0791 | 4.7794 | 1950 | 0.5595 | 0.3467 | 0.5595 | 0.7480 |
| 0.0791 | 4.7843 | 1952 | 0.5565 | 0.3467 | 0.5565 | 0.7460 |
| 0.0791 | 4.7892 | 1954 | 0.5562 | 0.3467 | 0.5562 | 0.7458 |
| 0.0791 | 4.7941 | 1956 | 0.5550 | 0.3467 | 0.5550 | 0.7450 |
| 0.0791 | 4.7990 | 1958 | 0.5606 | 0.3467 | 0.5606 | 0.7488 |
| 0.0791 | 4.8039 | 1960 | 0.5687 | 0.3318 | 0.5687 | 0.7541 |
| 0.0791 | 4.8088 | 1962 | 0.5724 | 0.1793 | 0.5724 | 0.7566 |
| 0.0791 | 4.8137 | 1964 | 0.5674 | 0.3865 | 0.5674 | 0.7532 |
| 0.0791 | 4.8186 | 1966 | 0.5649 | 0.3467 | 0.5649 | 0.7516 |
| 0.0791 | 4.8235 | 1968 | 0.5609 | 0.2373 | 0.5609 | 0.7489 |
| 0.0791 | 4.8284 | 1970 | 0.5599 | 0.1962 | 0.5599 | 0.7482 |
| 0.0791 | 4.8333 | 1972 | 0.5604 | 0.4000 | 0.5604 | 0.7486 |
| 0.0791 | 4.8382 | 1974 | 0.5587 | 0.3609 | 0.5587 | 0.7475 |
| 0.0791 | 4.8431 | 1976 | 0.5640 | 0.4000 | 0.5640 | 0.7510 |
| 0.0791 | 4.8480 | 1978 | 0.5731 | 0.3467 | 0.5731 | 0.7570 |
| 0.0791 | 4.8529 | 1980 | 0.5777 | 0.3865 | 0.5777 | 0.7601 |
| 0.0791 | 4.8578 | 1982 | 0.5758 | 0.3865 | 0.5758 | 0.7588 |
| 0.0791 | 4.8627 | 1984 | 0.5747 | 0.2364 | 0.5747 | 0.7581 |
| 0.0791 | 4.8676 | 1986 | 0.5802 | 0.2759 | 0.5802 | 0.7617 |
| 0.0791 | 4.8725 | 1988 | 0.5758 | 0.2759 | 0.5758 | 0.7588 |
| 0.0791 | 4.8775 | 1990 | 0.5640 | 0.2759 | 0.5640 | 0.7510 |
| 0.0791 | 4.8824 | 1992 | 0.5500 | 0.2364 | 0.5500 | 0.7416 |
| 0.0791 | 4.8873 | 1994 | 0.5369 | 0.3865 | 0.5369 | 0.7327 |
| 0.0791 | 4.8922 | 1996 | 0.5348 | 0.3865 | 0.5348 | 0.7313 |
| 0.0791 | 4.8971 | 1998 | 0.5365 | 0.3865 | 0.5365 | 0.7325 |
| 0.0614 | 4.9020 | 2000 | 0.5377 | 0.3467 | 0.5377 | 0.7333 |
| 0.0614 | 4.9069 | 2002 | 0.5395 | 0.4000 | 0.5395 | 0.7345 |
| 0.0614 | 4.9118 | 2004 | 0.5483 | 0.3609 | 0.5483 | 0.7405 |
| 0.0614 | 4.9167 | 2006 | 0.5586 | 0.3609 | 0.5586 | 0.7474 |
| 0.0614 | 4.9216 | 2008 | 0.5619 | 0.3609 | 0.5619 | 0.7496 |
| 0.0614 | 4.9265 | 2010 | 0.5632 | 0.3609 | 0.5632 | 0.7504 |
| 0.0614 | 4.9314 | 2012 | 0.5689 | 0.4000 | 0.5689 | 0.7542 |
| 0.0614 | 4.9363 | 2014 | 0.5818 | 0.2364 | 0.5818 | 0.7628 |
| 0.0614 | 4.9412 | 2016 | 0.6181 | 0.3163 | 0.6181 | 0.7862 |
| 0.0614 | 4.9461 | 2018 | 0.6791 | 0.3163 | 0.6791 | 0.8241 |
| 0.0614 | 4.9510 | 2020 | 0.7314 | 0.4154 | 0.7314 | 0.8552 |
| 0.0614 | 4.9559 | 2022 | 0.7389 | 0.3636 | 0.7389 | 0.8596 |
| 0.0614 | 4.9608 | 2024 | 0.7179 | 0.2588 | 0.7179 | 0.8473 |
| 0.0614 | 4.9657 | 2026 | 0.6848 | 0.2588 | 0.6848 | 0.8275 |
| 0.0614 | 4.9706 | 2028 | 0.6596 | 0.2588 | 0.6596 | 0.8122 |
| 0.0614 | 4.9755 | 2030 | 0.6251 | 0.3163 | 0.6251 | 0.7906 |
| 0.0614 | 4.9804 | 2032 | 0.6034 | 0.3163 | 0.6034 | 0.7768 |
| 0.0614 | 4.9853 | 2034 | 0.5772 | 0.3163 | 0.5772 | 0.7597 |
| 0.0614 | 4.9902 | 2036 | 0.5650 | 0.3163 | 0.5650 | 0.7517 |
| 0.0614 | 4.9951 | 2038 | 0.5640 | 0.2759 | 0.5640 | 0.7510 |
| 0.0614 | 5.0000 | 2040 | 0.5614 | 0.2759 | 0.5614 | 0.7493 |
| 0.0614 | 5.0049 | 2042 | 0.5694 | 0.2759 | 0.5694 | 0.7546 |
| 0.0614 | 5.0098 | 2044 | 0.5940 | 0.3163 | 0.5940 | 0.7707 |
| 0.0614 | 5.0147 | 2046 | 0.6169 | 0.2588 | 0.6169 | 0.7854 |
| 0.0614 | 5.0196 | 2048 | 0.6261 | 0.2588 | 0.6261 | 0.7913 |
| 0.0614 | 5.0245 | 2050 | 0.6128 | 0.3163 | 0.6128 | 0.7828 |
| 0.0614 | 5.0294 | 2052 | 0.6006 | 0.3163 | 0.6006 | 0.7750 |
| 0.0614 | 5.0343 | 2054 | 0.5880 | 0.2759 | 0.5880 | 0.7668 |
| 0.0614 | 5.0392 | 2056 | 0.5835 | 0.2759 | 0.5835 | 0.7639 |
| 0.0614 | 5.0441 | 2058 | 0.5798 | 0.2759 | 0.5798 | 0.7614 |
| 0.0614 | 5.0490 | 2060 | 0.5838 | 0.2759 | 0.5838 | 0.7640 |
| 0.0614 | 5.0539 | 2062 | 0.5912 | 0.2759 | 0.5912 | 0.7689 |
| 0.0614 | 5.0588 | 2064 | 0.5955 | 0.2759 | 0.5955 | 0.7717 |
| 0.0614 | 5.0637 | 2066 | 0.6004 | 0.2759 | 0.6004 | 0.7749 |
| 0.0614 | 5.0686 | 2068 | 0.5992 | 0.0916 | 0.5992 | 0.7741 |
| 0.0614 | 5.0735 | 2070 | 0.5929 | 0.1558 | 0.5929 | 0.7700 |
| 0.0614 | 5.0784 | 2072 | 0.5890 | 0.1558 | 0.5890 | 0.7675 |
| 0.0614 | 5.0833 | 2074 | 0.5808 | 0.1141 | 0.5808 | 0.7621 |
| 0.0614 | 5.0882 | 2076 | 0.5731 | 0.1141 | 0.5731 | 0.7570 |
| 0.0614 | 5.0931 | 2078 | 0.5785 | 0.1141 | 0.5785 | 0.7606 |
| 0.0614 | 5.0980 | 2080 | 0.5892 | 0.1141 | 0.5892 | 0.7676 |
| 0.0614 | 5.1029 | 2082 | 0.5970 | 0.1141 | 0.5970 | 0.7727 |
| 0.0614 | 5.1078 | 2084 | 0.6177 | 0.1793 | 0.6177 | 0.7859 |
| 0.0614 | 5.1127 | 2086 | 0.6396 | 0.2186 | 0.6396 | 0.7998 |
| 0.0614 | 5.1176 | 2088 | 0.6584 | 0.2186 | 0.6584 | 0.8114 |
| 0.0614 | 5.1225 | 2090 | 0.6824 | 0.2186 | 0.6824 | 0.8261 |
| 0.0614 | 5.1275 | 2092 | 0.6983 | 0.2186 | 0.6983 | 0.8357 |
| 0.0614 | 5.1324 | 2094 | 0.7067 | 0.2186 | 0.7067 | 0.8407 |
| 0.0614 | 5.1373 | 2096 | 0.7107 | 0.2186 | 0.7107 | 0.8430 |
| 0.0614 | 5.1422 | 2098 | 0.6967 | 0.2186 | 0.6967 | 0.8347 |
| 0.0614 | 5.1471 | 2100 | 0.6927 | 0.2186 | 0.6927 | 0.8323 |
| 0.0614 | 5.1520 | 2102 | 0.6830 | 0.2186 | 0.6830 | 0.8265 |
| 0.0614 | 5.1569 | 2104 | 0.6678 | 0.2186 | 0.6678 | 0.8172 |
| 0.0614 | 5.1618 | 2106 | 0.6531 | 0.2186 | 0.6531 | 0.8081 |
| 0.0614 | 5.1667 | 2108 | 0.6331 | 0.2186 | 0.6331 | 0.7957 |
| 0.0614 | 5.1716 | 2110 | 0.6273 | 0.2186 | 0.6273 | 0.7920 |
| 0.0614 | 5.1765 | 2112 | 0.6299 | 0.2186 | 0.6299 | 0.7936 |
| 0.0614 | 5.1814 | 2114 | 0.6357 | 0.2186 | 0.6357 | 0.7973 |
| 0.0614 | 5.1863 | 2116 | 0.6402 | 0.2186 | 0.6402 | 0.8001 |
| 0.0614 | 5.1912 | 2118 | 0.6259 | 0.2186 | 0.6259 | 0.7912 |
| 0.0614 | 5.1961 | 2120 | 0.6253 | 0.2186 | 0.6253 | 0.7908 |
| 0.0614 | 5.2010 | 2122 | 0.6312 | 0.2588 | 0.6312 | 0.7945 |
| 0.0614 | 5.2059 | 2124 | 0.6262 | 0.2588 | 0.6262 | 0.7913 |
| 0.0614 | 5.2108 | 2126 | 0.6246 | 0.2588 | 0.6246 | 0.7903 |
| 0.0614 | 5.2157 | 2128 | 0.6368 | 0.2588 | 0.6368 | 0.7980 |
| 0.0614 | 5.2206 | 2130 | 0.6516 | 0.2588 | 0.6516 | 0.8072 |
| 0.0614 | 5.2255 | 2132 | 0.6769 | 0.2588 | 0.6769 | 0.8227 |
| 0.0614 | 5.2304 | 2134 | 0.6758 | 0.2588 | 0.6758 | 0.8221 |
| 0.0614 | 5.2353 | 2136 | 0.6577 | 0.2588 | 0.6577 | 0.8110 |
| 0.0614 | 5.2402 | 2138 | 0.6295 | 0.2186 | 0.6295 | 0.7934 |
| 0.0614 | 5.2451 | 2140 | 0.6117 | 0.2186 | 0.6117 | 0.7821 |
| 0.0614 | 5.2500 | 2142 | 0.6120 | 0.2186 | 0.6120 | 0.7823 |
| 0.0614 | 5.2549 | 2144 | 0.6369 | 0.2588 | 0.6369 | 0.7980 |
| 0.0614 | 5.2598 | 2146 | 0.6788 | 0.2588 | 0.6788 | 0.8239 |
| 0.0614 | 5.2647 | 2148 | 0.7103 | 0.3636 | 0.7103 | 0.8428 |
| 0.0614 | 5.2696 | 2150 | 0.7214 | 0.3636 | 0.7214 | 0.8494 |
| 0.0614 | 5.2745 | 2152 | 0.6943 | 0.3636 | 0.6943 | 0.8332 |
| 0.0614 | 5.2794 | 2154 | 0.6540 | 0.2588 | 0.6540 | 0.8087 |
| 0.0614 | 5.2843 | 2156 | 0.6182 | 0.2588 | 0.6182 | 0.7863 |
| 0.0614 | 5.2892 | 2158 | 0.5937 | 0.2186 | 0.5937 | 0.7705 |
| 0.0614 | 5.2941 | 2160 | 0.5796 | 0.2186 | 0.5796 | 0.7613 |
| 0.0614 | 5.2990 | 2162 | 0.5847 | 0.2186 | 0.5847 | 0.7646 |
| 0.0614 | 5.3039 | 2164 | 0.6082 | 0.2186 | 0.6082 | 0.7799 |
| 0.0614 | 5.3088 | 2166 | 0.6444 | 0.2186 | 0.6444 | 0.8027 |
| 0.0614 | 5.3137 | 2168 | 0.6782 | 0.3255 | 0.6782 | 0.8235 |
| 0.0614 | 5.3186 | 2170 | 0.6920 | 0.3255 | 0.6920 | 0.8319 |
| 0.0614 | 5.3235 | 2172 | 0.7042 | 0.3255 | 0.7042 | 0.8392 |
| 0.0614 | 5.3284 | 2174 | 0.7288 | 0.3255 | 0.7288 | 0.8537 |
| 0.0614 | 5.3333 | 2176 | 0.7344 | 0.3255 | 0.7344 | 0.8570 |
| 0.0614 | 5.3382 | 2178 | 0.7464 | 0.3255 | 0.7464 | 0.8640 |
| 0.0614 | 5.3431 | 2180 | 0.7523 | 0.3255 | 0.7523 | 0.8673 |
| 0.0614 | 5.3480 | 2182 | 0.7357 | 0.3255 | 0.7357 | 0.8577 |
| 0.0614 | 5.3529 | 2184 | 0.7328 | 0.3255 | 0.7328 | 0.8560 |
| 0.0614 | 5.3578 | 2186 | 0.7272 | 0.2186 | 0.7272 | 0.8527 |
| 0.0614 | 5.3627 | 2188 | 0.7104 | 0.2186 | 0.7104 | 0.8429 |
| 0.0614 | 5.3676 | 2190 | 0.6987 | 0.1793 | 0.6987 | 0.8359 |
| 0.0614 | 5.3725 | 2192 | 0.6896 | 0.1793 | 0.6896 | 0.8304 |
| 0.0614 | 5.3775 | 2194 | 0.6858 | 0.1793 | 0.6858 | 0.8282 |
| 0.0614 | 5.3824 | 2196 | 0.6823 | 0.1793 | 0.6823 | 0.8260 |
| 0.0614 | 5.3873 | 2198 | 0.6700 | 0.1793 | 0.6700 | 0.8185 |
| 0.0614 | 5.3922 | 2200 | 0.6680 | 0.1793 | 0.6680 | 0.8173 |
| 0.0614 | 5.3971 | 2202 | 0.6586 | 0.1793 | 0.6586 | 0.8115 |
| 0.0614 | 5.4020 | 2204 | 0.6395 | 0.1793 | 0.6395 | 0.7997 |
| 0.0614 | 5.4069 | 2206 | 0.6233 | 0.1793 | 0.6233 | 0.7895 |
| 0.0614 | 5.4118 | 2208 | 0.6176 | 0.1793 | 0.6176 | 0.7859 |
| 0.0614 | 5.4167 | 2210 | 0.6240 | 0.1793 | 0.6240 | 0.7899 |
| 0.0614 | 5.4216 | 2212 | 0.6289 | 0.1793 | 0.6289 | 0.7931 |
| 0.0614 | 5.4265 | 2214 | 0.6382 | 0.2186 | 0.6382 | 0.7989 |
| 0.0614 | 5.4314 | 2216 | 0.6462 | 0.2186 | 0.6462 | 0.8039 |
| 0.0614 | 5.4363 | 2218 | 0.6551 | 0.2186 | 0.6551 | 0.8094 |
| 0.0614 | 5.4412 | 2220 | 0.6627 | 0.2186 | 0.6627 | 0.8141 |
| 0.0614 | 5.4461 | 2222 | 0.6976 | 0.2186 | 0.6976 | 0.8352 |
| 0.0614 | 5.4510 | 2224 | 0.7623 | 0.3636 | 0.7623 | 0.8731 |
| 0.0614 | 5.4559 | 2226 | 0.7952 | 0.3687 | 0.7952 | 0.8917 |
| 0.0614 | 5.4608 | 2228 | 0.7802 | 0.3255 | 0.7802 | 0.8833 |
| 0.0614 | 5.4657 | 2230 | 0.7324 | 0.2186 | 0.7324 | 0.8558 |
| 0.0614 | 5.4706 | 2232 | 0.6900 | 0.2186 | 0.6900 | 0.8307 |
| 0.0614 | 5.4755 | 2234 | 0.6697 | 0.2186 | 0.6697 | 0.8184 |
| 0.0614 | 5.4804 | 2236 | 0.6738 | 0.2186 | 0.6738 | 0.8209 |
| 0.0614 | 5.4853 | 2238 | 0.6660 | 0.2186 | 0.6660 | 0.8161 |
| 0.0614 | 5.4902 | 2240 | 0.6609 | 0.2186 | 0.6609 | 0.8130 |
| 0.0614 | 5.4951 | 2242 | 0.6573 | 0.2186 | 0.6573 | 0.8107 |
| 0.0614 | 5.5000 | 2244 | 0.6559 | 0.2186 | 0.6559 | 0.8099 |
| 0.0614 | 5.5049 | 2246 | 0.6564 | 0.2186 | 0.6564 | 0.8102 |
| 0.0614 | 5.5098 | 2248 | 0.6626 | 0.2186 | 0.6626 | 0.8140 |
| 0.0614 | 5.5147 | 2250 | 0.6437 | 0.2186 | 0.6437 | 0.8023 |
| 0.0614 | 5.5196 | 2252 | 0.6196 | 0.2186 | 0.6196 | 0.7871 |
| 0.0614 | 5.5245 | 2254 | 0.6034 | 0.3724 | 0.6034 | 0.7768 |
| 0.0614 | 5.5294 | 2256 | 0.6037 | 0.3724 | 0.6037 | 0.7770 |
| 0.0614 | 5.5343 | 2258 | 0.6203 | 0.2186 | 0.6203 | 0.7876 |
| 0.0614 | 5.5392 | 2260 | 0.6469 | 0.2186 | 0.6469 | 0.8043 |
| 0.0614 | 5.5441 | 2262 | 0.6694 | 0.2186 | 0.6694 | 0.8182 |
| 0.0614 | 5.5490 | 2264 | 0.6919 | 0.2186 | 0.6919 | 0.8318 |
| 0.0614 | 5.5539 | 2266 | 0.6978 | 0.2186 | 0.6978 | 0.8354 |
| 0.0614 | 5.5588 | 2268 | 0.6986 | 0.2186 | 0.6986 | 0.8358 |
| 0.0614 | 5.5637 | 2270 | 0.6926 | 0.2186 | 0.6926 | 0.8323 |
| 0.0614 | 5.5686 | 2272 | 0.6569 | 0.2186 | 0.6569 | 0.8105 |
| 0.0614 | 5.5735 | 2274 | 0.6151 | 0.2186 | 0.6151 | 0.7843 |
| 0.0614 | 5.5784 | 2276 | 0.5745 | 0.3318 | 0.5745 | 0.7580 |
| 0.0614 | 5.5833 | 2278 | 0.5459 | 0.3467 | 0.5459 | 0.7388 |
| 0.0614 | 5.5882 | 2280 | 0.5401 | 0.3467 | 0.5401 | 0.7349 |
| 0.0614 | 5.5931 | 2282 | 0.5416 | 0.3467 | 0.5416 | 0.7359 |
| 0.0614 | 5.5980 | 2284 | 0.5397 | 0.3467 | 0.5397 | 0.7347 |
| 0.0614 | 5.6029 | 2286 | 0.5499 | 0.3865 | 0.5499 | 0.7416 |
| 0.0614 | 5.6078 | 2288 | 0.5791 | 0.3724 | 0.5791 | 0.7610 |
| 0.0614 | 5.6127 | 2290 | 0.6159 | 0.3724 | 0.6159 | 0.7848 |
| 0.0614 | 5.6176 | 2292 | 0.6454 | 0.3255 | 0.6454 | 0.8033 |
| 0.0614 | 5.6225 | 2294 | 0.6487 | 0.3255 | 0.6487 | 0.8054 |
| 0.0614 | 5.6275 | 2296 | 0.6381 | 0.2186 | 0.6381 | 0.7988 |
| 0.0614 | 5.6324 | 2298 | 0.6163 | 0.2186 | 0.6163 | 0.7851 |
| 0.0614 | 5.6373 | 2300 | 0.5910 | 0.3724 | 0.5910 | 0.7688 |
| 0.0614 | 5.6422 | 2302 | 0.5704 | 0.3724 | 0.5704 | 0.7552 |
| 0.0614 | 5.6471 | 2304 | 0.5536 | 0.4273 | 0.5536 | 0.7440 |
| 0.0614 | 5.6520 | 2306 | 0.5563 | 0.3865 | 0.5563 | 0.7458 |
| 0.0614 | 5.6569 | 2308 | 0.5666 | 0.3865 | 0.5666 | 0.7527 |
| 0.0614 | 5.6618 | 2310 | 0.5782 | 0.4273 | 0.5782 | 0.7604 |
| 0.0614 | 5.6667 | 2312 | 0.5814 | 0.4273 | 0.5814 | 0.7625 |
| 0.0614 | 5.6716 | 2314 | 0.5907 | 0.4273 | 0.5907 | 0.7686 |
| 0.0614 | 5.6765 | 2316 | 0.5889 | 0.3865 | 0.5889 | 0.7674 |
| 0.0614 | 5.6814 | 2318 | 0.5885 | 0.4273 | 0.5885 | 0.7672 |
| 0.0614 | 5.6863 | 2320 | 0.5983 | 0.3724 | 0.5983 | 0.7735 |
| 0.0614 | 5.6912 | 2322 | 0.6071 | 0.3724 | 0.6071 | 0.7791 |
| 0.0614 | 5.6961 | 2324 | 0.6186 | 0.3724 | 0.6186 | 0.7865 |
| 0.0614 | 5.7010 | 2326 | 0.6283 | 0.3724 | 0.6283 | 0.7927 |
| 0.0614 | 5.7059 | 2328 | 0.6276 | 0.3724 | 0.6276 | 0.7922 |
| 0.0614 | 5.7108 | 2330 | 0.6149 | 0.3724 | 0.6149 | 0.7842 |
| 0.0614 | 5.7157 | 2332 | 0.6086 | 0.3724 | 0.6086 | 0.7801 |
| 0.0614 | 5.7206 | 2334 | 0.6033 | 0.3724 | 0.6033 | 0.7767 |
| 0.0614 | 5.7255 | 2336 | 0.5917 | 0.3724 | 0.5917 | 0.7692 |
| 0.0614 | 5.7304 | 2338 | 0.5824 | 0.3318 | 0.5824 | 0.7632 |
| 0.0614 | 5.7353 | 2340 | 0.5792 | 0.3318 | 0.5792 | 0.7610 |
| 0.0614 | 5.7402 | 2342 | 0.5761 | 0.3318 | 0.5761 | 0.7590 |
| 0.0614 | 5.7451 | 2344 | 0.5759 | 0.2533 | 0.5759 | 0.7589 |
| 0.0614 | 5.7500 | 2346 | 0.5735 | 0.3318 | 0.5735 | 0.7573 |
| 0.0614 | 5.7549 | 2348 | 0.5756 | 0.3724 | 0.5756 | 0.7587 |
| 0.0614 | 5.7598 | 2350 | 0.5897 | 0.3724 | 0.5897 | 0.7679 |
| 0.0614 | 5.7647 | 2352 | 0.6112 | 0.3724 | 0.6112 | 0.7818 |
| 0.0614 | 5.7696 | 2354 | 0.6272 | 0.3724 | 0.6272 | 0.7919 |
| 0.0614 | 5.7745 | 2356 | 0.6303 | 0.3724 | 0.6303 | 0.7939 |
| 0.0614 | 5.7794 | 2358 | 0.6222 | 0.3724 | 0.6222 | 0.7888 |
| 0.0614 | 5.7843 | 2360 | 0.6133 | 0.3724 | 0.6133 | 0.7831 |
| 0.0614 | 5.7892 | 2362 | 0.6069 | 0.3724 | 0.6069 | 0.7790 |
| 0.0614 | 5.7941 | 2364 | 0.6120 | 0.3724 | 0.6120 | 0.7823 |
| 0.0614 | 5.7990 | 2366 | 0.6179 | 0.3724 | 0.6179 | 0.7860 |
| 0.0614 | 5.8039 | 2368 | 0.6327 | 0.4140 | 0.6327 | 0.7954 |
| 0.0614 | 5.8088 | 2370 | 0.6337 | 0.2588 | 0.6337 | 0.7960 |
| 0.0614 | 5.8137 | 2372 | 0.6242 | 0.4140 | 0.6242 | 0.7901 |
| 0.0614 | 5.8186 | 2374 | 0.6180 | 0.4140 | 0.6180 | 0.7861 |
| 0.0614 | 5.8235 | 2376 | 0.6003 | 0.3724 | 0.6003 | 0.7748 |
| 0.0614 | 5.8284 | 2378 | 0.5881 | 0.3724 | 0.5881 | 0.7669 |
| 0.0614 | 5.8333 | 2380 | 0.5781 | 0.3724 | 0.5781 | 0.7603 |
| 0.0614 | 5.8382 | 2382 | 0.5773 | 0.3724 | 0.5773 | 0.7598 |
| 0.0614 | 5.8431 | 2384 | 0.5765 | 0.3865 | 0.5765 | 0.7593 |
| 0.0614 | 5.8480 | 2386 | 0.5628 | 0.3865 | 0.5628 | 0.7502 |
| 0.0614 | 5.8529 | 2388 | 0.5582 | 0.3865 | 0.5582 | 0.7471 |
| 0.0614 | 5.8578 | 2390 | 0.5647 | 0.3865 | 0.5647 | 0.7514 |
| 0.0614 | 5.8627 | 2392 | 0.5633 | 0.3865 | 0.5633 | 0.7505 |
| 0.0614 | 5.8676 | 2394 | 0.5591 | 0.4400 | 0.5591 | 0.7477 |
| 0.0614 | 5.8725 | 2396 | 0.5588 | 0.4400 | 0.5588 | 0.7475 |
| 0.0614 | 5.8775 | 2398 | 0.5587 | 0.4400 | 0.5587 | 0.7475 |
| 0.0614 | 5.8824 | 2400 | 0.5598 | 0.3318 | 0.5598 | 0.7482 |
| 0.0614 | 5.8873 | 2402 | 0.5681 | 0.3724 | 0.5681 | 0.7537 |
| 0.0614 | 5.8922 | 2404 | 0.5759 | 0.3724 | 0.5759 | 0.7589 |
| 0.0614 | 5.8971 | 2406 | 0.6007 | 0.4140 | 0.6007 | 0.7750 |
| 0.0614 | 5.9020 | 2408 | 0.6132 | 0.2588 | 0.6132 | 0.7831 |
| 0.0614 | 5.9069 | 2410 | 0.6169 | 0.3636 | 0.6169 | 0.7854 |
| 0.0614 | 5.9118 | 2412 | 0.6200 | 0.3636 | 0.6200 | 0.7874 |
| 0.0614 | 5.9167 | 2414 | 0.6188 | 0.4661 | 0.6188 | 0.7866 |
| 0.0614 | 5.9216 | 2416 | 0.6097 | 0.4661 | 0.6097 | 0.7808 |
| 0.0614 | 5.9265 | 2418 | 0.6117 | 0.4661 | 0.6117 | 0.7821 |
| 0.0614 | 5.9314 | 2420 | 0.6331 | 0.4661 | 0.6331 | 0.7957 |
| 0.0614 | 5.9363 | 2422 | 0.6584 | 0.4661 | 0.6584 | 0.8114 |
| 0.0614 | 5.9412 | 2424 | 0.6692 | 0.3255 | 0.6692 | 0.8180 |
| 0.0614 | 5.9461 | 2426 | 0.6618 | 0.3255 | 0.6618 | 0.8135 |
| 0.0614 | 5.9510 | 2428 | 0.6336 | 0.4661 | 0.6336 | 0.7960 |
| 0.0614 | 5.9559 | 2430 | 0.6206 | 0.4661 | 0.6206 | 0.7878 |
| 0.0614 | 5.9608 | 2432 | 0.6073 | 0.4661 | 0.6073 | 0.7793 |
| 0.0614 | 5.9657 | 2434 | 0.5952 | 0.4661 | 0.5952 | 0.7715 |
| 0.0614 | 5.9706 | 2436 | 0.6036 | 0.3255 | 0.6036 | 0.7769 |
| 0.0614 | 5.9755 | 2438 | 0.6183 | 0.3636 | 0.6183 | 0.7863 |
| 0.0614 | 5.9804 | 2440 | 0.6499 | 0.3636 | 0.6499 | 0.8062 |
| 0.0614 | 5.9853 | 2442 | 0.6780 | 0.3636 | 0.6780 | 0.8234 |
| 0.0614 | 5.9902 | 2444 | 0.7000 | 0.3636 | 0.7000 | 0.8367 |
| 0.0614 | 5.9951 | 2446 | 0.7083 | 0.3636 | 0.7083 | 0.8416 |
| 0.0614 | 6.0000 | 2448 | 0.7052 | 0.3636 | 0.7052 | 0.8397 |
| 0.0614 | 6.0049 | 2450 | 0.7019 | 0.3636 | 0.7019 | 0.8378 |
| 0.0614 | 6.0098 | 2452 | 0.6930 | 0.3636 | 0.6930 | 0.8324 |
| 0.0614 | 6.0147 | 2454 | 0.6944 | 0.3636 | 0.6944 | 0.8333 |
| 0.0614 | 6.0196 | 2456 | 0.7027 | 0.3636 | 0.7027 | 0.8383 |
| 0.0614 | 6.0245 | 2458 | 0.6920 | 0.3636 | 0.6920 | 0.8319 |
| 0.0614 | 6.0294 | 2460 | 0.6733 | 0.2588 | 0.6733 | 0.8205 |
| 0.0614 | 6.0343 | 2462 | 0.6524 | 0.2588 | 0.6524 | 0.8077 |
| 0.0614 | 6.0392 | 2464 | 0.6511 | 0.2588 | 0.6511 | 0.8069 |
| 0.0614 | 6.0441 | 2466 | 0.6484 | 0.2588 | 0.6484 | 0.8052 |
| 0.0614 | 6.0490 | 2468 | 0.6634 | 0.3636 | 0.6634 | 0.8145 |
| 0.0614 | 6.0539 | 2470 | 0.6881 | 0.3636 | 0.6881 | 0.8295 |
| 0.0614 | 6.0588 | 2472 | 0.6930 | 0.3636 | 0.6930 | 0.8325 |
| 0.0614 | 6.0637 | 2474 | 0.6936 | 0.3636 | 0.6936 | 0.8328 |
| 0.0614 | 6.0686 | 2476 | 0.6729 | 0.3636 | 0.6729 | 0.8203 |
| 0.0614 | 6.0735 | 2478 | 0.6464 | 0.3636 | 0.6464 | 0.8040 |
| 0.0614 | 6.0784 | 2480 | 0.6377 | 0.3636 | 0.6377 | 0.7986 |
| 0.0614 | 6.0833 | 2482 | 0.6405 | 0.3636 | 0.6405 | 0.8003 |
| 0.0614 | 6.0882 | 2484 | 0.6337 | 0.3255 | 0.6337 | 0.7961 |
| 0.0614 | 6.0931 | 2486 | 0.6346 | 0.3255 | 0.6346 | 0.7966 |
| 0.0614 | 6.0980 | 2488 | 0.6329 | 0.3255 | 0.6329 | 0.7955 |
| 0.0614 | 6.1029 | 2490 | 0.6558 | 0.3255 | 0.6558 | 0.8098 |
| 0.0614 | 6.1078 | 2492 | 0.6809 | 0.3636 | 0.6809 | 0.8252 |
| 0.0614 | 6.1127 | 2494 | 0.7015 | 0.3636 | 0.7015 | 0.8376 |
| 0.0614 | 6.1176 | 2496 | 0.6841 | 0.3636 | 0.6841 | 0.8271 |
| 0.0614 | 6.1225 | 2498 | 0.6441 | 0.3255 | 0.6441 | 0.8026 |
| 0.0484 | 6.1275 | 2500 | 0.6127 | 0.2186 | 0.6127 | 0.7828 |
| 0.0484 | 6.1324 | 2502 | 0.6060 | 0.2186 | 0.6060 | 0.7785 |
| 0.0484 | 6.1373 | 2504 | 0.6087 | 0.2186 | 0.6087 | 0.7802 |
| 0.0484 | 6.1422 | 2506 | 0.6224 | 0.2588 | 0.6224 | 0.7889 |
| 0.0484 | 6.1471 | 2508 | 0.6370 | 0.2588 | 0.6370 | 0.7981 |
| 0.0484 | 6.1520 | 2510 | 0.6629 | 0.3636 | 0.6629 | 0.8142 |
| 0.0484 | 6.1569 | 2512 | 0.6811 | 0.3636 | 0.6811 | 0.8253 |
| 0.0484 | 6.1618 | 2514 | 0.7128 | 0.3636 | 0.7128 | 0.8442 |
| 0.0484 | 6.1667 | 2516 | 0.7161 | 0.3636 | 0.7161 | 0.8462 |
| 0.0484 | 6.1716 | 2518 | 0.6999 | 0.3636 | 0.6999 | 0.8366 |
| 0.0484 | 6.1765 | 2520 | 0.6761 | 0.3636 | 0.6761 | 0.8223 |
| 0.0484 | 6.1814 | 2522 | 0.6490 | 0.2588 | 0.6490 | 0.8056 |
| 0.0484 | 6.1863 | 2524 | 0.6282 | 0.2588 | 0.6282 | 0.7926 |
| 0.0484 | 6.1912 | 2526 | 0.6187 | 0.2588 | 0.6187 | 0.7866 |
| 0.0484 | 6.1961 | 2528 | 0.6129 | 0.2588 | 0.6129 | 0.7829 |
| 0.0484 | 6.2010 | 2530 | 0.6103 | 0.2588 | 0.6103 | 0.7812 |
| 0.0484 | 6.2059 | 2532 | 0.6198 | 0.2588 | 0.6198 | 0.7873 |
| 0.0484 | 6.2108 | 2534 | 0.6412 | 0.2588 | 0.6412 | 0.8008 |
| 0.0484 | 6.2157 | 2536 | 0.6727 | 0.3636 | 0.6727 | 0.8202 |
| 0.0484 | 6.2206 | 2538 | 0.6996 | 0.3636 | 0.6996 | 0.8364 |
| 0.0484 | 6.2255 | 2540 | 0.7080 | 0.3636 | 0.7080 | 0.8414 |
| 0.0484 | 6.2304 | 2542 | 0.7206 | 0.3636 | 0.7206 | 0.8489 |
| 0.0484 | 6.2353 | 2544 | 0.7063 | 0.3636 | 0.7063 | 0.8404 |
| 0.0484 | 6.2402 | 2546 | 0.6860 | 0.3636 | 0.6860 | 0.8283 |
| 0.0484 | 6.2451 | 2548 | 0.6647 | 0.3255 | 0.6647 | 0.8153 |
| 0.0484 | 6.2500 | 2550 | 0.6601 | 0.3255 | 0.6601 | 0.8125 |
| 0.0484 | 6.2549 | 2552 | 0.6519 | 0.3255 | 0.6519 | 0.8074 |
| 0.0484 | 6.2598 | 2554 | 0.6432 | 0.3724 | 0.6432 | 0.8020 |
| 0.0484 | 6.2647 | 2556 | 0.6465 | 0.3724 | 0.6465 | 0.8040 |
| 0.0484 | 6.2696 | 2558 | 0.6502 | 0.3318 | 0.6502 | 0.8064 |
| 0.0484 | 6.2745 | 2560 | 0.6516 | 0.2533 | 0.6516 | 0.8072 |
| 0.0484 | 6.2794 | 2562 | 0.6555 | 0.2154 | 0.6555 | 0.8096 |
| 0.0484 | 6.2843 | 2564 | 0.6617 | 0.2154 | 0.6617 | 0.8134 |
| 0.0484 | 6.2892 | 2566 | 0.6590 | 0.2154 | 0.6590 | 0.8118 |
| 0.0484 | 6.2941 | 2568 | 0.6497 | 0.2154 | 0.6497 | 0.8061 |
| 0.0484 | 6.2990 | 2570 | 0.6417 | 0.2533 | 0.6417 | 0.8011 |
| 0.0484 | 6.3039 | 2572 | 0.6340 | 0.3318 | 0.6340 | 0.7962 |
| 0.0484 | 6.3088 | 2574 | 0.6232 | 0.3724 | 0.6232 | 0.7895 |
| 0.0484 | 6.3137 | 2576 | 0.6147 | 0.3724 | 0.6147 | 0.7841 |
| 0.0484 | 6.3186 | 2578 | 0.6080 | 0.3724 | 0.6080 | 0.7797 |
| 0.0484 | 6.3235 | 2580 | 0.6087 | 0.3724 | 0.6087 | 0.7802 |
| 0.0484 | 6.3284 | 2582 | 0.6127 | 0.3724 | 0.6127 | 0.7827 |
| 0.0484 | 6.3333 | 2584 | 0.6291 | 0.2186 | 0.6291 | 0.7932 |
| 0.0484 | 6.3382 | 2586 | 0.6663 | 0.3255 | 0.6663 | 0.8163 |
| 0.0484 | 6.3431 | 2588 | 0.7074 | 0.3255 | 0.7074 | 0.8411 |
| 0.0484 | 6.3480 | 2590 | 0.7460 | 0.3636 | 0.7460 | 0.8637 |
| 0.0484 | 6.3529 | 2592 | 0.7678 | 0.3636 | 0.7678 | 0.8762 |
| 0.0484 | 6.3578 | 2594 | 0.7978 | 0.2948 | 0.7978 | 0.8932 |
| 0.0484 | 6.3627 | 2596 | 0.8107 | 0.2948 | 0.8107 | 0.9004 |
| 0.0484 | 6.3676 | 2598 | 0.7841 | 0.2948 | 0.7841 | 0.8855 |
| 0.0484 | 6.3725 | 2600 | 0.7430 | 0.3636 | 0.7430 | 0.8620 |
| 0.0484 | 6.3775 | 2602 | 0.6871 | 0.3255 | 0.6871 | 0.8289 |
| 0.0484 | 6.3824 | 2604 | 0.6371 | 0.2186 | 0.6371 | 0.7982 |
| 0.0484 | 6.3873 | 2606 | 0.6100 | 0.2186 | 0.6100 | 0.7810 |
| 0.0484 | 6.3922 | 2608 | 0.5906 | 0.3724 | 0.5906 | 0.7685 |
| 0.0484 | 6.3971 | 2610 | 0.5793 | 0.2794 | 0.5793 | 0.7611 |
| 0.0484 | 6.4020 | 2612 | 0.5736 | 0.2794 | 0.5736 | 0.7573 |
| 0.0484 | 6.4069 | 2614 | 0.5737 | 0.2794 | 0.5737 | 0.7574 |
| 0.0484 | 6.4118 | 2616 | 0.5758 | 0.2184 | 0.5758 | 0.7588 |
| 0.0484 | 6.4167 | 2618 | 0.5824 | 0.0916 | 0.5824 | 0.7631 |
| 0.0484 | 6.4216 | 2620 | 0.6002 | 0.2186 | 0.6002 | 0.7747 |
| 0.0484 | 6.4265 | 2622 | 0.6291 | 0.2588 | 0.6291 | 0.7931 |
| 0.0484 | 6.4314 | 2624 | 0.6543 | 0.2588 | 0.6543 | 0.8089 |
| 0.0484 | 6.4363 | 2626 | 0.6780 | 0.3636 | 0.6780 | 0.8234 |
| 0.0484 | 6.4412 | 2628 | 0.6863 | 0.3636 | 0.6863 | 0.8284 |
| 0.0484 | 6.4461 | 2630 | 0.6714 | 0.3636 | 0.6714 | 0.8194 |
| 0.0484 | 6.4510 | 2632 | 0.6420 | 0.2588 | 0.6420 | 0.8012 |
| 0.0484 | 6.4559 | 2634 | 0.6160 | 0.2186 | 0.6160 | 0.7848 |
| 0.0484 | 6.4608 | 2636 | 0.6051 | 0.2186 | 0.6051 | 0.7779 |
| 0.0484 | 6.4657 | 2638 | 0.5968 | 0.2186 | 0.5968 | 0.7725 |
| 0.0484 | 6.4706 | 2640 | 0.5931 | 0.2186 | 0.5931 | 0.7701 |
| 0.0484 | 6.4755 | 2642 | 0.5914 | 0.1793 | 0.5914 | 0.7690 |
| 0.0484 | 6.4804 | 2644 | 0.5892 | 0.1793 | 0.5892 | 0.7676 |
| 0.0484 | 6.4853 | 2646 | 0.5866 | 0.1793 | 0.5866 | 0.7659 |
| 0.0484 | 6.4902 | 2648 | 0.5893 | 0.1793 | 0.5893 | 0.7677 |
| 0.0484 | 6.4951 | 2650 | 0.5929 | 0.1793 | 0.5929 | 0.7700 |
| 0.0484 | 6.5000 | 2652 | 0.5972 | 0.1793 | 0.5972 | 0.7728 |
| 0.0484 | 6.5049 | 2654 | 0.6033 | 0.1793 | 0.6033 | 0.7768 |
| 0.0484 | 6.5098 | 2656 | 0.6103 | 0.1793 | 0.6103 | 0.7812 |
| 0.0484 | 6.5147 | 2658 | 0.6120 | 0.1793 | 0.6120 | 0.7823 |
| 0.0484 | 6.5196 | 2660 | 0.6065 | 0.3318 | 0.6065 | 0.7788 |
| 0.0484 | 6.5245 | 2662 | 0.6071 | 0.3865 | 0.6071 | 0.7792 |
| 0.0484 | 6.5294 | 2664 | 0.6065 | 0.3865 | 0.6065 | 0.7788 |
| 0.0484 | 6.5343 | 2666 | 0.6056 | 0.3865 | 0.6056 | 0.7782 |
| 0.0484 | 6.5392 | 2668 | 0.6064 | 0.3865 | 0.6064 | 0.7787 |
| 0.0484 | 6.5441 | 2670 | 0.6086 | 0.3865 | 0.6086 | 0.7801 |
| 0.0484 | 6.5490 | 2672 | 0.6113 | 0.3865 | 0.6113 | 0.7819 |
| 0.0484 | 6.5539 | 2674 | 0.6099 | 0.3318 | 0.6099 | 0.7809 |
| 0.0484 | 6.5588 | 2676 | 0.6092 | 0.3318 | 0.6092 | 0.7805 |
| 0.0484 | 6.5637 | 2678 | 0.6099 | 0.3318 | 0.6099 | 0.7810 |
| 0.0484 | 6.5686 | 2680 | 0.6109 | 0.1793 | 0.6109 | 0.7816 |
| 0.0484 | 6.5735 | 2682 | 0.6145 | 0.2186 | 0.6145 | 0.7839 |
| 0.0484 | 6.5784 | 2684 | 0.6220 | 0.2588 | 0.6220 | 0.7887 |
| 0.0484 | 6.5833 | 2686 | 0.6172 | 0.2588 | 0.6172 | 0.7856 |
| 0.0484 | 6.5882 | 2688 | 0.6112 | 0.2588 | 0.6112 | 0.7818 |
| 0.0484 | 6.5931 | 2690 | 0.6046 | 0.2588 | 0.6046 | 0.7776 |
| 0.0484 | 6.5980 | 2692 | 0.5943 | 0.2588 | 0.5943 | 0.7709 |
| 0.0484 | 6.6029 | 2694 | 0.5916 | 0.2588 | 0.5916 | 0.7692 |
| 0.0484 | 6.6078 | 2696 | 0.5931 | 0.3636 | 0.5931 | 0.7701 |
| 0.0484 | 6.6127 | 2698 | 0.5978 | 0.3255 | 0.5978 | 0.7732 |
| 0.0484 | 6.6176 | 2700 | 0.6067 | 0.3255 | 0.6067 | 0.7789 |
| 0.0484 | 6.6225 | 2702 | 0.6134 | 0.3255 | 0.6134 | 0.7832 |
| 0.0484 | 6.6275 | 2704 | 0.6143 | 0.3255 | 0.6143 | 0.7838 |
| 0.0484 | 6.6324 | 2706 | 0.6157 | 0.2186 | 0.6157 | 0.7847 |
| 0.0484 | 6.6373 | 2708 | 0.6146 | 0.3724 | 0.6146 | 0.7840 |
| 0.0484 | 6.6422 | 2710 | 0.6124 | 0.3724 | 0.6124 | 0.7826 |
| 0.0484 | 6.6471 | 2712 | 0.6135 | 0.3724 | 0.6135 | 0.7832 |
| 0.0484 | 6.6520 | 2714 | 0.6120 | 0.3724 | 0.6120 | 0.7823 |
| 0.0484 | 6.6569 | 2716 | 0.6095 | 0.3318 | 0.6095 | 0.7807 |
| 0.0484 | 6.6618 | 2718 | 0.6086 | 0.3865 | 0.6086 | 0.7801 |
| 0.0484 | 6.6667 | 2720 | 0.6084 | 0.3077 | 0.6084 | 0.7800 |
| 0.0484 | 6.6716 | 2722 | 0.6088 | 0.3077 | 0.6088 | 0.7802 |
| 0.0484 | 6.6765 | 2724 | 0.6084 | 0.3077 | 0.6084 | 0.7800 |
| 0.0484 | 6.6814 | 2726 | 0.6100 | 0.3865 | 0.6100 | 0.7810 |
| 0.0484 | 6.6863 | 2728 | 0.6099 | 0.3865 | 0.6099 | 0.7810 |
| 0.0484 | 6.6912 | 2730 | 0.6075 | 0.3865 | 0.6075 | 0.7795 |
| 0.0484 | 6.6961 | 2732 | 0.6094 | 0.3865 | 0.6094 | 0.7806 |
| 0.0484 | 6.7010 | 2734 | 0.6067 | 0.3865 | 0.6067 | 0.7789 |
| 0.0484 | 6.7059 | 2736 | 0.6002 | 0.3865 | 0.6002 | 0.7747 |
| 0.0484 | 6.7108 | 2738 | 0.5945 | 0.3865 | 0.5945 | 0.7711 |
| 0.0484 | 6.7157 | 2740 | 0.5854 | 0.3865 | 0.5854 | 0.7651 |
| 0.0484 | 6.7206 | 2742 | 0.5839 | 0.3724 | 0.5839 | 0.7641 |
| 0.0484 | 6.7255 | 2744 | 0.5930 | 0.3724 | 0.5930 | 0.7701 |
| 0.0484 | 6.7304 | 2746 | 0.6021 | 0.3724 | 0.6021 | 0.7760 |
| 0.0484 | 6.7353 | 2748 | 0.6130 | 0.2186 | 0.6130 | 0.7830 |
| 0.0484 | 6.7402 | 2750 | 0.6345 | 0.2588 | 0.6345 | 0.7966 |
| 0.0484 | 6.7451 | 2752 | 0.6582 | 0.3636 | 0.6582 | 0.8113 |
| 0.0484 | 6.7500 | 2754 | 0.6611 | 0.3636 | 0.6611 | 0.8131 |
| 0.0484 | 6.7549 | 2756 | 0.6475 | 0.3636 | 0.6475 | 0.8047 |
| 0.0484 | 6.7598 | 2758 | 0.6347 | 0.2588 | 0.6347 | 0.7967 |
| 0.0484 | 6.7647 | 2760 | 0.6102 | 0.3724 | 0.6102 | 0.7812 |
| 0.0484 | 6.7696 | 2762 | 0.5882 | 0.3724 | 0.5882 | 0.7670 |
| 0.0484 | 6.7745 | 2764 | 0.5803 | 0.3724 | 0.5803 | 0.7618 |
| 0.0484 | 6.7794 | 2766 | 0.5746 | 0.3724 | 0.5746 | 0.7580 |
| 0.0484 | 6.7843 | 2768 | 0.5763 | 0.3724 | 0.5763 | 0.7591 |
| 0.0484 | 6.7892 | 2770 | 0.5820 | 0.3724 | 0.5820 | 0.7629 |
| 0.0484 | 6.7941 | 2772 | 0.5859 | 0.3724 | 0.5859 | 0.7655 |
| 0.0484 | 6.7990 | 2774 | 0.5917 | 0.3724 | 0.5917 | 0.7692 |
| 0.0484 | 6.8039 | 2776 | 0.6052 | 0.3724 | 0.6052 | 0.7779 |
| 0.0484 | 6.8088 | 2778 | 0.6158 | 0.3724 | 0.6158 | 0.7847 |
| 0.0484 | 6.8137 | 2780 | 0.6289 | 0.3724 | 0.6289 | 0.7930 |
| 0.0484 | 6.8186 | 2782 | 0.6449 | 0.3724 | 0.6449 | 0.8031 |
| 0.0484 | 6.8235 | 2784 | 0.6509 | 0.4140 | 0.6509 | 0.8068 |
| 0.0484 | 6.8284 | 2786 | 0.6423 | 0.4140 | 0.6423 | 0.8015 |
| 0.0484 | 6.8333 | 2788 | 0.6323 | 0.4140 | 0.6323 | 0.7952 |
| 0.0484 | 6.8382 | 2790 | 0.6192 | 0.3724 | 0.6192 | 0.7869 |
| 0.0484 | 6.8431 | 2792 | 0.6140 | 0.3724 | 0.6140 | 0.7836 |
| 0.0484 | 6.8480 | 2794 | 0.6046 | 0.3724 | 0.6046 | 0.7776 |
| 0.0484 | 6.8529 | 2796 | 0.5935 | 0.4273 | 0.5935 | 0.7704 |
| 0.0484 | 6.8578 | 2798 | 0.5860 | 0.3226 | 0.5860 | 0.7655 |
| 0.0484 | 6.8627 | 2800 | 0.5835 | 0.4273 | 0.5835 | 0.7639 |
| 0.0484 | 6.8676 | 2802 | 0.5858 | 0.3724 | 0.5858 | 0.7654 |
| 0.0484 | 6.8725 | 2804 | 0.5940 | 0.3724 | 0.5940 | 0.7707 |
| 0.0484 | 6.8775 | 2806 | 0.6006 | 0.4140 | 0.6006 | 0.7750 |
| 0.0484 | 6.8824 | 2808 | 0.5999 | 0.3724 | 0.5999 | 0.7746 |
| 0.0484 | 6.8873 | 2810 | 0.6014 | 0.3724 | 0.6014 | 0.7755 |
| 0.0484 | 6.8922 | 2812 | 0.6036 | 0.3724 | 0.6036 | 0.7769 |
| 0.0484 | 6.8971 | 2814 | 0.6012 | 0.3724 | 0.6012 | 0.7754 |
| 0.0484 | 6.9020 | 2816 | 0.5993 | 0.3724 | 0.5993 | 0.7742 |
| 0.0484 | 6.9069 | 2818 | 0.6011 | 0.3724 | 0.6011 | 0.7753 |
| 0.0484 | 6.9118 | 2820 | 0.6006 | 0.3724 | 0.6006 | 0.7750 |
| 0.0484 | 6.9167 | 2822 | 0.5976 | 0.3318 | 0.5976 | 0.7731 |
| 0.0484 | 6.9216 | 2824 | 0.5950 | 0.2921 | 0.5950 | 0.7713 |
| 0.0484 | 6.9265 | 2826 | 0.5923 | 0.3077 | 0.5923 | 0.7696 |
| 0.0484 | 6.9314 | 2828 | 0.5876 | 0.1962 | 0.5876 | 0.7666 |
| 0.0484 | 6.9363 | 2830 | 0.5836 | 0.2373 | 0.5836 | 0.7640 |
| 0.0484 | 6.9412 | 2832 | 0.5781 | 0.2373 | 0.5781 | 0.7604 |
| 0.0484 | 6.9461 | 2834 | 0.5741 | 0.2794 | 0.5741 | 0.7577 |
| 0.0484 | 6.9510 | 2836 | 0.5697 | 0.3226 | 0.5697 | 0.7548 |
| 0.0484 | 6.9559 | 2838 | 0.5672 | 0.2613 | 0.5672 | 0.7531 |
| 0.0484 | 6.9608 | 2840 | 0.5661 | 0.2613 | 0.5661 | 0.7524 |
| 0.0484 | 6.9657 | 2842 | 0.5678 | 0.3724 | 0.5678 | 0.7535 |
| 0.0484 | 6.9706 | 2844 | 0.5757 | 0.3724 | 0.5757 | 0.7587 |
| 0.0484 | 6.9755 | 2846 | 0.5816 | 0.3724 | 0.5816 | 0.7626 |
| 0.0484 | 6.9804 | 2848 | 0.5889 | 0.3724 | 0.5889 | 0.7674 |
| 0.0484 | 6.9853 | 2850 | 0.5996 | 0.3724 | 0.5996 | 0.7744 |
| 0.0484 | 6.9902 | 2852 | 0.6045 | 0.3724 | 0.6045 | 0.7775 |
| 0.0484 | 6.9951 | 2854 | 0.6024 | 0.3724 | 0.6024 | 0.7762 |
| 0.0484 | 7.0000 | 2856 | 0.5987 | 0.3724 | 0.5987 | 0.7737 |
| 0.0484 | 7.0049 | 2858 | 0.5955 | 0.3724 | 0.5955 | 0.7717 |
| 0.0484 | 7.0098 | 2860 | 0.5973 | 0.3724 | 0.5973 | 0.7729 |
| 0.0484 | 7.0147 | 2862 | 0.6018 | 0.3724 | 0.6018 | 0.7758 |
| 0.0484 | 7.0196 | 2864 | 0.6024 | 0.3724 | 0.6024 | 0.7761 |
| 0.0484 | 7.0245 | 2866 | 0.5999 | 0.3724 | 0.5999 | 0.7745 |
| 0.0484 | 7.0294 | 2868 | 0.5949 | 0.3724 | 0.5949 | 0.7713 |
| 0.0484 | 7.0343 | 2870 | 0.5896 | 0.3724 | 0.5896 | 0.7678 |
| 0.0484 | 7.0392 | 2872 | 0.5883 | 0.3724 | 0.5883 | 0.7670 |
| 0.0484 | 7.0441 | 2874 | 0.5895 | 0.3724 | 0.5895 | 0.7678 |
| 0.0484 | 7.0490 | 2876 | 0.5927 | 0.3724 | 0.5927 | 0.7699 |
| 0.0484 | 7.0539 | 2878 | 0.5880 | 0.2186 | 0.5880 | 0.7668 |
| 0.0484 | 7.0588 | 2880 | 0.5826 | 0.2186 | 0.5826 | 0.7633 |
| 0.0484 | 7.0637 | 2882 | 0.5830 | 0.2588 | 0.5830 | 0.7636 |
| 0.0484 | 7.0686 | 2884 | 0.5818 | 0.2588 | 0.5818 | 0.7627 |
| 0.0484 | 7.0735 | 2886 | 0.5761 | 0.2588 | 0.5761 | 0.7590 |
| 0.0484 | 7.0784 | 2888 | 0.5695 | 0.2588 | 0.5695 | 0.7546 |
| 0.0484 | 7.0833 | 2890 | 0.5596 | 0.2588 | 0.5596 | 0.7480 |
| 0.0484 | 7.0882 | 2892 | 0.5524 | 0.2186 | 0.5524 | 0.7433 |
| 0.0484 | 7.0931 | 2894 | 0.5499 | 0.2186 | 0.5499 | 0.7416 |
| 0.0484 | 7.0980 | 2896 | 0.5533 | 0.2186 | 0.5533 | 0.7438 |
| 0.0484 | 7.1029 | 2898 | 0.5597 | 0.2186 | 0.5597 | 0.7481 |
| 0.0484 | 7.1078 | 2900 | 0.5655 | 0.2186 | 0.5655 | 0.7520 |
| 0.0484 | 7.1127 | 2902 | 0.5726 | 0.2186 | 0.5726 | 0.7567 |
| 0.0484 | 7.1176 | 2904 | 0.5815 | 0.2186 | 0.5815 | 0.7625 |
| 0.0484 | 7.1225 | 2906 | 0.5797 | 0.2186 | 0.5797 | 0.7614 |
| 0.0484 | 7.1275 | 2908 | 0.5757 | 0.3724 | 0.5757 | 0.7587 |
| 0.0484 | 7.1324 | 2910 | 0.5769 | 0.3724 | 0.5769 | 0.7595 |
| 0.0484 | 7.1373 | 2912 | 0.5821 | 0.3724 | 0.5821 | 0.7630 |
| 0.0484 | 7.1422 | 2914 | 0.5868 | 0.2186 | 0.5868 | 0.7660 |
| 0.0484 | 7.1471 | 2916 | 0.5949 | 0.2588 | 0.5949 | 0.7713 |
| 0.0484 | 7.1520 | 2918 | 0.6055 | 0.2588 | 0.6055 | 0.7782 |
| 0.0484 | 7.1569 | 2920 | 0.6056 | 0.2588 | 0.6056 | 0.7782 |
| 0.0484 | 7.1618 | 2922 | 0.5956 | 0.2186 | 0.5956 | 0.7718 |
| 0.0484 | 7.1667 | 2924 | 0.5841 | 0.2186 | 0.5841 | 0.7643 |
| 0.0484 | 7.1716 | 2926 | 0.5740 | 0.3724 | 0.5740 | 0.7576 |
| 0.0484 | 7.1765 | 2928 | 0.5638 | 0.4273 | 0.5638 | 0.7509 |
| 0.0484 | 7.1814 | 2930 | 0.5598 | 0.4273 | 0.5598 | 0.7482 |
| 0.0484 | 7.1863 | 2932 | 0.5587 | 0.4273 | 0.5587 | 0.7475 |
| 0.0484 | 7.1912 | 2934 | 0.5577 | 0.4273 | 0.5577 | 0.7468 |
| 0.0484 | 7.1961 | 2936 | 0.5575 | 0.4273 | 0.5575 | 0.7466 |
| 0.0484 | 7.2010 | 2938 | 0.5605 | 0.4273 | 0.5605 | 0.7486 |
| 0.0484 | 7.2059 | 2940 | 0.5650 | 0.4273 | 0.5650 | 0.7516 |
| 0.0484 | 7.2108 | 2942 | 0.5784 | 0.4273 | 0.5784 | 0.7605 |
| 0.0484 | 7.2157 | 2944 | 0.5920 | 0.3724 | 0.5920 | 0.7694 |
| 0.0484 | 7.2206 | 2946 | 0.5985 | 0.2186 | 0.5985 | 0.7736 |
| 0.0484 | 7.2255 | 2948 | 0.6026 | 0.2588 | 0.6026 | 0.7762 |
| 0.0484 | 7.2304 | 2950 | 0.6018 | 0.3724 | 0.6018 | 0.7758 |
| 0.0484 | 7.2353 | 2952 | 0.5978 | 0.3724 | 0.5978 | 0.7732 |
| 0.0484 | 7.2402 | 2954 | 0.6014 | 0.3724 | 0.6014 | 0.7755 |
| 0.0484 | 7.2451 | 2956 | 0.5995 | 0.4273 | 0.5995 | 0.7743 |
| 0.0484 | 7.2500 | 2958 | 0.5931 | 0.4273 | 0.5931 | 0.7701 |
| 0.0484 | 7.2549 | 2960 | 0.5918 | 0.4273 | 0.5918 | 0.7693 |
| 0.0484 | 7.2598 | 2962 | 0.5924 | 0.4273 | 0.5924 | 0.7697 |
| 0.0484 | 7.2647 | 2964 | 0.5894 | 0.4273 | 0.5894 | 0.7677 |
| 0.0484 | 7.2696 | 2966 | 0.5884 | 0.3865 | 0.5884 | 0.7671 |
| 0.0484 | 7.2745 | 2968 | 0.5891 | 0.3077 | 0.5891 | 0.7675 |
| 0.0484 | 7.2794 | 2970 | 0.5899 | 0.3077 | 0.5899 | 0.7680 |
| 0.0484 | 7.2843 | 2972 | 0.5917 | 0.1962 | 0.5917 | 0.7693 |
| 0.0484 | 7.2892 | 2974 | 0.5945 | 0.1962 | 0.5945 | 0.7711 |
| 0.0484 | 7.2941 | 2976 | 0.5986 | 0.1962 | 0.5986 | 0.7737 |
| 0.0484 | 7.2990 | 2978 | 0.5996 | 0.1962 | 0.5996 | 0.7743 |
| 0.0484 | 7.3039 | 2980 | 0.5938 | 0.1962 | 0.5938 | 0.7706 |
| 0.0484 | 7.3088 | 2982 | 0.5896 | 0.1962 | 0.5896 | 0.7679 |
| 0.0484 | 7.3137 | 2984 | 0.5903 | 0.4273 | 0.5903 | 0.7683 |
| 0.0484 | 7.3186 | 2986 | 0.6007 | 0.4273 | 0.6007 | 0.7750 |
| 0.0484 | 7.3235 | 2988 | 0.6197 | 0.2588 | 0.6197 | 0.7872 |
| 0.0484 | 7.3284 | 2990 | 0.6438 | 0.2588 | 0.6438 | 0.8024 |
| 0.0484 | 7.3333 | 2992 | 0.6531 | 0.2588 | 0.6531 | 0.8081 |
| 0.0484 | 7.3382 | 2994 | 0.6506 | 0.2588 | 0.6506 | 0.8066 |
| 0.0484 | 7.3431 | 2996 | 0.6399 | 0.2588 | 0.6399 | 0.8000 |
| 0.0484 | 7.3480 | 2998 | 0.6246 | 0.2588 | 0.6246 | 0.7903 |
| 0.0411 | 7.3529 | 3000 | 0.6190 | 0.2588 | 0.6190 | 0.7868 |
| 0.0411 | 7.3578 | 3002 | 0.6147 | 0.2588 | 0.6147 | 0.7840 |
| 0.0411 | 7.3627 | 3004 | 0.6192 | 0.2588 | 0.6192 | 0.7869 |
| 0.0411 | 7.3676 | 3006 | 0.6184 | 0.2588 | 0.6184 | 0.7864 |
| 0.0411 | 7.3725 | 3008 | 0.6144 | 0.2588 | 0.6144 | 0.7838 |
| 0.0411 | 7.3775 | 3010 | 0.6032 | 0.2588 | 0.6032 | 0.7767 |
| 0.0411 | 7.3824 | 3012 | 0.5871 | 0.3724 | 0.5871 | 0.7662 |
| 0.0411 | 7.3873 | 3014 | 0.5713 | 0.4273 | 0.5713 | 0.7558 |
| 0.0411 | 7.3922 | 3016 | 0.5613 | 0.4273 | 0.5613 | 0.7492 |
| 0.0411 | 7.3971 | 3018 | 0.5541 | 0.4273 | 0.5541 | 0.7443 |
| 0.0411 | 7.4020 | 3020 | 0.5492 | 0.3467 | 0.5492 | 0.7411 |
| 0.0411 | 7.4069 | 3022 | 0.5470 | 0.3077 | 0.5470 | 0.7396 |
| 0.0411 | 7.4118 | 3024 | 0.5447 | 0.3467 | 0.5447 | 0.7380 |
| 0.0411 | 7.4167 | 3026 | 0.5460 | 0.4273 | 0.5460 | 0.7389 |
| 0.0411 | 7.4216 | 3028 | 0.5570 | 0.4273 | 0.5570 | 0.7463 |
| 0.0411 | 7.4265 | 3030 | 0.5654 | 0.3724 | 0.5654 | 0.7519 |
| 0.0411 | 7.4314 | 3032 | 0.5718 | 0.4661 | 0.5718 | 0.7562 |
| 0.0411 | 7.4363 | 3034 | 0.5704 | 0.4661 | 0.5704 | 0.7552 |
| 0.0411 | 7.4412 | 3036 | 0.5678 | 0.4661 | 0.5678 | 0.7536 |
| 0.0411 | 7.4461 | 3038 | 0.5751 | 0.4661 | 0.5751 | 0.7583 |
| 0.0411 | 7.4510 | 3040 | 0.5945 | 0.4661 | 0.5945 | 0.7710 |
| 0.0411 | 7.4559 | 3042 | 0.6046 | 0.4661 | 0.6046 | 0.7776 |
| 0.0411 | 7.4608 | 3044 | 0.6135 | 0.3255 | 0.6135 | 0.7833 |
| 0.0411 | 7.4657 | 3046 | 0.6191 | 0.3255 | 0.6191 | 0.7869 |
| 0.0411 | 7.4706 | 3048 | 0.6098 | 0.3255 | 0.6098 | 0.7809 |
| 0.0411 | 7.4755 | 3050 | 0.5983 | 0.3255 | 0.5983 | 0.7735 |
| 0.0411 | 7.4804 | 3052 | 0.5858 | 0.3255 | 0.5858 | 0.7654 |
| 0.0411 | 7.4853 | 3054 | 0.5704 | 0.2186 | 0.5704 | 0.7552 |
| 0.0411 | 7.4902 | 3056 | 0.5561 | 0.3724 | 0.5561 | 0.7458 |
| 0.0411 | 7.4951 | 3058 | 0.5418 | 0.3724 | 0.5418 | 0.7361 |
| 0.0411 | 7.5000 | 3060 | 0.5366 | 0.4273 | 0.5366 | 0.7325 |
| 0.0411 | 7.5049 | 3062 | 0.5382 | 0.3724 | 0.5382 | 0.7336 |
| 0.0411 | 7.5098 | 3064 | 0.5390 | 0.3724 | 0.5390 | 0.7341 |
| 0.0411 | 7.5147 | 3066 | 0.5369 | 0.3724 | 0.5369 | 0.7327 |
| 0.0411 | 7.5196 | 3068 | 0.5412 | 0.3724 | 0.5412 | 0.7357 |
| 0.0411 | 7.5245 | 3070 | 0.5541 | 0.3724 | 0.5541 | 0.7444 |
| 0.0411 | 7.5294 | 3072 | 0.5643 | 0.4661 | 0.5643 | 0.7512 |
| 0.0411 | 7.5343 | 3074 | 0.5738 | 0.4661 | 0.5738 | 0.7575 |
| 0.0411 | 7.5392 | 3076 | 0.5723 | 0.4661 | 0.5723 | 0.7565 |
| 0.0411 | 7.5441 | 3078 | 0.5720 | 0.4661 | 0.5720 | 0.7563 |
| 0.0411 | 7.5490 | 3080 | 0.5743 | 0.4661 | 0.5743 | 0.7578 |
| 0.0411 | 7.5539 | 3082 | 0.5861 | 0.4661 | 0.5861 | 0.7656 |
| 0.0411 | 7.5588 | 3084 | 0.6017 | 0.4661 | 0.6017 | 0.7757 |
| 0.0411 | 7.5637 | 3086 | 0.6255 | 0.3255 | 0.6255 | 0.7909 |
| 0.0411 | 7.5686 | 3088 | 0.6465 | 0.3255 | 0.6465 | 0.8041 |
| 0.0411 | 7.5735 | 3090 | 0.6455 | 0.3255 | 0.6455 | 0.8034 |
| 0.0411 | 7.5784 | 3092 | 0.6340 | 0.3255 | 0.6340 | 0.7962 |
| 0.0411 | 7.5833 | 3094 | 0.6106 | 0.3255 | 0.6106 | 0.7814 |
| 0.0411 | 7.5882 | 3096 | 0.5799 | 0.4661 | 0.5799 | 0.7615 |
| 0.0411 | 7.5931 | 3098 | 0.5602 | 0.4661 | 0.5602 | 0.7484 |
| 0.0411 | 7.5980 | 3100 | 0.5480 | 0.3724 | 0.5480 | 0.7402 |
| 0.0411 | 7.6029 | 3102 | 0.5363 | 0.4273 | 0.5363 | 0.7324 |
| 0.0411 | 7.6078 | 3104 | 0.5275 | 0.4273 | 0.5275 | 0.7263 |
| 0.0411 | 7.6127 | 3106 | 0.5241 | 0.4273 | 0.5241 | 0.7240 |
| 0.0411 | 7.6176 | 3108 | 0.5264 | 0.4273 | 0.5264 | 0.7255 |
| 0.0411 | 7.6225 | 3110 | 0.5288 | 0.4273 | 0.5288 | 0.7272 |
| 0.0411 | 7.6275 | 3112 | 0.5320 | 0.4273 | 0.5320 | 0.7294 |
| 0.0411 | 7.6324 | 3114 | 0.5335 | 0.3865 | 0.5335 | 0.7304 |
| 0.0411 | 7.6373 | 3116 | 0.5385 | 0.3865 | 0.5385 | 0.7338 |
| 0.0411 | 7.6422 | 3118 | 0.5465 | 0.3467 | 0.5465 | 0.7393 |
| 0.0411 | 7.6471 | 3120 | 0.5531 | 0.3467 | 0.5531 | 0.7437 |
| 0.0411 | 7.6520 | 3122 | 0.5552 | 0.3467 | 0.5552 | 0.7451 |
| 0.0411 | 7.6569 | 3124 | 0.5537 | 0.3865 | 0.5537 | 0.7441 |
| 0.0411 | 7.6618 | 3126 | 0.5555 | 0.3865 | 0.5555 | 0.7453 |
| 0.0411 | 7.6667 | 3128 | 0.5601 | 0.3865 | 0.5601 | 0.7484 |
| 0.0411 | 7.6716 | 3130 | 0.5727 | 0.4273 | 0.5727 | 0.7568 |
| 0.0411 | 7.6765 | 3132 | 0.5882 | 0.3724 | 0.5882 | 0.7669 |
| 0.0411 | 7.6814 | 3134 | 0.6032 | 0.3724 | 0.6032 | 0.7767 |
| 0.0411 | 7.6863 | 3136 | 0.6175 | 0.2186 | 0.6175 | 0.7858 |
| 0.0411 | 7.6912 | 3138 | 0.6188 | 0.2186 | 0.6188 | 0.7867 |
| 0.0411 | 7.6961 | 3140 | 0.6089 | 0.2186 | 0.6089 | 0.7803 |
| 0.0411 | 7.7010 | 3142 | 0.6012 | 0.2186 | 0.6012 | 0.7754 |
| 0.0411 | 7.7059 | 3144 | 0.5923 | 0.2186 | 0.5923 | 0.7696 |
| 0.0411 | 7.7108 | 3146 | 0.5804 | 0.3724 | 0.5804 | 0.7618 |
| 0.0411 | 7.7157 | 3148 | 0.5662 | 0.4273 | 0.5662 | 0.7525 |
| 0.0411 | 7.7206 | 3150 | 0.5564 | 0.4273 | 0.5564 | 0.7459 |
| 0.0411 | 7.7255 | 3152 | 0.5497 | 0.4273 | 0.5497 | 0.7414 |
| 0.0411 | 7.7304 | 3154 | 0.5447 | 0.4273 | 0.5447 | 0.7380 |
| 0.0411 | 7.7353 | 3156 | 0.5450 | 0.4273 | 0.5450 | 0.7382 |
| 0.0411 | 7.7402 | 3158 | 0.5508 | 0.4273 | 0.5508 | 0.7422 |
| 0.0411 | 7.7451 | 3160 | 0.5564 | 0.4273 | 0.5564 | 0.7459 |
| 0.0411 | 7.7500 | 3162 | 0.5608 | 0.4273 | 0.5608 | 0.7489 |
| 0.0411 | 7.7549 | 3164 | 0.5661 | 0.3724 | 0.5662 | 0.7524 |
| 0.0411 | 7.7598 | 3166 | 0.5762 | 0.2186 | 0.5762 | 0.7591 |
| 0.0411 | 7.7647 | 3168 | 0.5785 | 0.2186 | 0.5785 | 0.7606 |
| 0.0411 | 7.7696 | 3170 | 0.5750 | 0.2186 | 0.5750 | 0.7583 |
| 0.0411 | 7.7745 | 3172 | 0.5703 | 0.2186 | 0.5703 | 0.7552 |
| 0.0411 | 7.7794 | 3174 | 0.5662 | 0.2186 | 0.5662 | 0.7525 |
| 0.0411 | 7.7843 | 3176 | 0.5640 | 0.2186 | 0.5640 | 0.7510 |
| 0.0411 | 7.7892 | 3178 | 0.5623 | 0.2186 | 0.5623 | 0.7498 |
| 0.0411 | 7.7941 | 3180 | 0.5666 | 0.2186 | 0.5666 | 0.7527 |
| 0.0411 | 7.7990 | 3182 | 0.5736 | 0.2186 | 0.5736 | 0.7574 |
| 0.0411 | 7.8039 | 3184 | 0.5749 | 0.2186 | 0.5749 | 0.7582 |
| 0.0411 | 7.8088 | 3186 | 0.5747 | 0.3255 | 0.5747 | 0.7581 |
| 0.0411 | 7.8137 | 3188 | 0.5698 | 0.3255 | 0.5698 | 0.7549 |
| 0.0411 | 7.8186 | 3190 | 0.5756 | 0.3255 | 0.5756 | 0.7587 |
| 0.0411 | 7.8235 | 3192 | 0.5875 | 0.3636 | 0.5875 | 0.7665 |
| 0.0411 | 7.8284 | 3194 | 0.5931 | 0.3636 | 0.5931 | 0.7701 |
| 0.0411 | 7.8333 | 3196 | 0.5926 | 0.3636 | 0.5926 | 0.7698 |
| 0.0411 | 7.8382 | 3198 | 0.5871 | 0.3636 | 0.5871 | 0.7662 |
| 0.0411 | 7.8431 | 3200 | 0.5834 | 0.3636 | 0.5834 | 0.7638 |
| 0.0411 | 7.8480 | 3202 | 0.5857 | 0.3636 | 0.5857 | 0.7653 |
| 0.0411 | 7.8529 | 3204 | 0.5867 | 0.3255 | 0.5867 | 0.7659 |
| 0.0411 | 7.8578 | 3206 | 0.5769 | 0.2186 | 0.5769 | 0.7596 |
| 0.0411 | 7.8627 | 3208 | 0.5661 | 0.2186 | 0.5661 | 0.7524 |
| 0.0411 | 7.8676 | 3210 | 0.5590 | 0.2186 | 0.5590 | 0.7476 |
| 0.0411 | 7.8725 | 3212 | 0.5509 | 0.3724 | 0.5509 | 0.7422 |
| 0.0411 | 7.8775 | 3214 | 0.5431 | 0.3724 | 0.5431 | 0.7369 |
| 0.0411 | 7.8824 | 3216 | 0.5374 | 0.3724 | 0.5374 | 0.7331 |
| 0.0411 | 7.8873 | 3218 | 0.5318 | 0.3724 | 0.5318 | 0.7292 |
| 0.0411 | 7.8922 | 3220 | 0.5291 | 0.3724 | 0.5291 | 0.7274 |
| 0.0411 | 7.8971 | 3222 | 0.5220 | 0.3724 | 0.5220 | 0.7225 |
| 0.0411 | 7.9020 | 3224 | 0.5191 | 0.3724 | 0.5191 | 0.7205 |
| 0.0411 | 7.9069 | 3226 | 0.5246 | 0.3724 | 0.5246 | 0.7243 |
| 0.0411 | 7.9118 | 3228 | 0.5357 | 0.3724 | 0.5357 | 0.7319 |
| 0.0411 | 7.9167 | 3230 | 0.5448 | 0.3724 | 0.5448 | 0.7381 |
| 0.0411 | 7.9216 | 3232 | 0.5568 | 0.2186 | 0.5568 | 0.7462 |
| 0.0411 | 7.9265 | 3234 | 0.5743 | 0.3255 | 0.5743 | 0.7579 |
| 0.0411 | 7.9314 | 3236 | 0.5857 | 0.3255 | 0.5857 | 0.7653 |
| 0.0411 | 7.9363 | 3238 | 0.5902 | 0.3255 | 0.5902 | 0.7682 |
| 0.0411 | 7.9412 | 3240 | 0.5941 | 0.3255 | 0.5941 | 0.7708 |
| 0.0411 | 7.9461 | 3242 | 0.5940 | 0.3255 | 0.5940 | 0.7707 |
| 0.0411 | 7.9510 | 3244 | 0.5792 | 0.3255 | 0.5792 | 0.7611 |
| 0.0411 | 7.9559 | 3246 | 0.5686 | 0.2186 | 0.5686 | 0.7541 |
| 0.0411 | 7.9608 | 3248 | 0.5643 | 0.2186 | 0.5643 | 0.7512 |
| 0.0411 | 7.9657 | 3250 | 0.5577 | 0.3724 | 0.5577 | 0.7468 |
| 0.0411 | 7.9706 | 3252 | 0.5513 | 0.3724 | 0.5513 | 0.7425 |
| 0.0411 | 7.9755 | 3254 | 0.5494 | 0.3724 | 0.5494 | 0.7412 |
| 0.0411 | 7.9804 | 3256 | 0.5517 | 0.3724 | 0.5517 | 0.7428 |
| 0.0411 | 7.9853 | 3258 | 0.5462 | 0.3724 | 0.5462 | 0.7391 |
| 0.0411 | 7.9902 | 3260 | 0.5383 | 0.4273 | 0.5383 | 0.7337 |
| 0.0411 | 7.9951 | 3262 | 0.5321 | 0.4273 | 0.5321 | 0.7295 |
| 0.0411 | 8.0000 | 3264 | 0.5273 | 0.4273 | 0.5273 | 0.7262 |
| 0.0411 | 8.0049 | 3266 | 0.5219 | 0.4273 | 0.5219 | 0.7224 |
| 0.0411 | 8.0098 | 3268 | 0.5199 | 0.4273 | 0.5199 | 0.7211 |
| 0.0411 | 8.0147 | 3270 | 0.5266 | 0.4273 | 0.5266 | 0.7257 |
| 0.0411 | 8.0196 | 3272 | 0.5405 | 0.3724 | 0.5405 | 0.7352 |
| 0.0411 | 8.0245 | 3274 | 0.5558 | 0.3724 | 0.5558 | 0.7455 |
| 0.0411 | 8.0294 | 3276 | 0.5620 | 0.3724 | 0.5620 | 0.7497 |
| 0.0411 | 8.0343 | 3278 | 0.5611 | 0.3724 | 0.5611 | 0.7490 |
| 0.0411 | 8.0392 | 3280 | 0.5626 | 0.3724 | 0.5626 | 0.7501 |
| 0.0411 | 8.0441 | 3282 | 0.5697 | 0.3724 | 0.5697 | 0.7548 |
| 0.0411 | 8.0490 | 3284 | 0.5774 | 0.3724 | 0.5774 | 0.7599 |
| 0.0411 | 8.0539 | 3286 | 0.5862 | 0.3724 | 0.5862 | 0.7656 |
| 0.0411 | 8.0588 | 3288 | 0.5873 | 0.3724 | 0.5873 | 0.7663 |
| 0.0411 | 8.0637 | 3290 | 0.5762 | 0.3724 | 0.5762 | 0.7591 |
| 0.0411 | 8.0686 | 3292 | 0.5645 | 0.3724 | 0.5645 | 0.7513 |
| 0.0411 | 8.0735 | 3294 | 0.5631 | 0.3724 | 0.5631 | 0.7504 |
| 0.0411 | 8.0784 | 3296 | 0.5640 | 0.3724 | 0.5640 | 0.7510 |
| 0.0411 | 8.0833 | 3298 | 0.5624 | 0.3724 | 0.5624 | 0.7499 |
| 0.0411 | 8.0882 | 3300 | 0.5652 | 0.3724 | 0.5652 | 0.7518 |
| 0.0411 | 8.0931 | 3302 | 0.5702 | 0.3724 | 0.5702 | 0.7551 |
| 0.0411 | 8.0980 | 3304 | 0.5709 | 0.3724 | 0.5709 | 0.7556 |
| 0.0411 | 8.1029 | 3306 | 0.5677 | 0.3724 | 0.5677 | 0.7534 |
| 0.0411 | 8.1078 | 3308 | 0.5623 | 0.3724 | 0.5623 | 0.7499 |
| 0.0411 | 8.1127 | 3310 | 0.5543 | 0.3724 | 0.5543 | 0.7445 |
| 0.0411 | 8.1176 | 3312 | 0.5523 | 0.3724 | 0.5523 | 0.7432 |
| 0.0411 | 8.1225 | 3314 | 0.5503 | 0.3724 | 0.5503 | 0.7418 |
| 0.0411 | 8.1275 | 3316 | 0.5503 | 0.3724 | 0.5503 | 0.7418 |
| 0.0411 | 8.1324 | 3318 | 0.5545 | 0.3724 | 0.5545 | 0.7447 |
| 0.0411 | 8.1373 | 3320 | 0.5562 | 0.3724 | 0.5562 | 0.7458 |
| 0.0411 | 8.1422 | 3322 | 0.5553 | 0.3724 | 0.5553 | 0.7452 |
| 0.0411 | 8.1471 | 3324 | 0.5520 | 0.3724 | 0.5520 | 0.7430 |
| 0.0411 | 8.1520 | 3326 | 0.5528 | 0.3724 | 0.5528 | 0.7435 |
| 0.0411 | 8.1569 | 3328 | 0.5582 | 0.3724 | 0.5582 | 0.7471 |
| 0.0411 | 8.1618 | 3330 | 0.5704 | 0.4661 | 0.5704 | 0.7552 |
| 0.0411 | 8.1667 | 3332 | 0.5821 | 0.4661 | 0.5821 | 0.7630 |
| 0.0411 | 8.1716 | 3334 | 0.6006 | 0.4661 | 0.6006 | 0.7750 |
| 0.0411 | 8.1765 | 3336 | 0.6148 | 0.4661 | 0.6148 | 0.7841 |
| 0.0411 | 8.1814 | 3338 | 0.6344 | 0.4661 | 0.6344 | 0.7965 |
| 0.0411 | 8.1863 | 3340 | 0.6494 | 0.3255 | 0.6494 | 0.8059 |
| 0.0411 | 8.1912 | 3342 | 0.6600 | 0.3255 | 0.6600 | 0.8124 |
| 0.0411 | 8.1961 | 3344 | 0.6578 | 0.3255 | 0.6578 | 0.8110 |
| 0.0411 | 8.2010 | 3346 | 0.6555 | 0.3255 | 0.6555 | 0.8097 |
| 0.0411 | 8.2059 | 3348 | 0.6544 | 0.3255 | 0.6544 | 0.8090 |
| 0.0411 | 8.2108 | 3350 | 0.6493 | 0.3255 | 0.6493 | 0.8058 |
| 0.0411 | 8.2157 | 3352 | 0.6386 | 0.4661 | 0.6386 | 0.7991 |
| 0.0411 | 8.2206 | 3354 | 0.6250 | 0.4661 | 0.6250 | 0.7906 |
| 0.0411 | 8.2255 | 3356 | 0.6029 | 0.4661 | 0.6029 | 0.7765 |
| 0.0411 | 8.2304 | 3358 | 0.5803 | 0.4661 | 0.5803 | 0.7618 |
| 0.0411 | 8.2353 | 3360 | 0.5643 | 0.4661 | 0.5643 | 0.7512 |
| 0.0411 | 8.2402 | 3362 | 0.5510 | 0.4273 | 0.5510 | 0.7423 |
| 0.0411 | 8.2451 | 3364 | 0.5470 | 0.4273 | 0.5470 | 0.7396 |
| 0.0411 | 8.2500 | 3366 | 0.5462 | 0.4273 | 0.5462 | 0.7390 |
| 0.0411 | 8.2549 | 3368 | 0.5461 | 0.4273 | 0.5461 | 0.7390 |
| 0.0411 | 8.2598 | 3370 | 0.5462 | 0.4273 | 0.5462 | 0.7390 |
| 0.0411 | 8.2647 | 3372 | 0.5451 | 0.4273 | 0.5451 | 0.7383 |
| 0.0411 | 8.2696 | 3374 | 0.5457 | 0.4273 | 0.5457 | 0.7387 |
| 0.0411 | 8.2745 | 3376 | 0.5488 | 0.4273 | 0.5488 | 0.7408 |
| 0.0411 | 8.2794 | 3378 | 0.5547 | 0.4273 | 0.5547 | 0.7448 |
| 0.0411 | 8.2843 | 3380 | 0.5624 | 0.4273 | 0.5624 | 0.7499 |
| 0.0411 | 8.2892 | 3382 | 0.5724 | 0.4273 | 0.5724 | 0.7566 |
| 0.0411 | 8.2941 | 3384 | 0.5800 | 0.4273 | 0.5800 | 0.7616 |
| 0.0411 | 8.2990 | 3386 | 0.5879 | 0.4661 | 0.5879 | 0.7667 |
| 0.0411 | 8.3039 | 3388 | 0.5921 | 0.4661 | 0.5921 | 0.7695 |
| 0.0411 | 8.3088 | 3390 | 0.5915 | 0.4661 | 0.5915 | 0.7691 |
| 0.0411 | 8.3137 | 3392 | 0.5945 | 0.4661 | 0.5945 | 0.7710 |
| 0.0411 | 8.3186 | 3394 | 0.5942 | 0.4661 | 0.5942 | 0.7708 |
| 0.0411 | 8.3235 | 3396 | 0.5920 | 0.4661 | 0.5920 | 0.7694 |
| 0.0411 | 8.3284 | 3398 | 0.5883 | 0.4661 | 0.5883 | 0.7670 |
| 0.0411 | 8.3333 | 3400 | 0.5803 | 0.5157 | 0.5803 | 0.7618 |
| 0.0411 | 8.3382 | 3402 | 0.5729 | 0.4273 | 0.5729 | 0.7569 |
| 0.0411 | 8.3431 | 3404 | 0.5660 | 0.4273 | 0.5660 | 0.7523 |
| 0.0411 | 8.3480 | 3406 | 0.5620 | 0.4273 | 0.5620 | 0.7497 |
| 0.0411 | 8.3529 | 3408 | 0.5580 | 0.4273 | 0.5580 | 0.7470 |
| 0.0411 | 8.3578 | 3410 | 0.5587 | 0.4273 | 0.5587 | 0.7475 |
| 0.0411 | 8.3627 | 3412 | 0.5578 | 0.4273 | 0.5578 | 0.7469 |
| 0.0411 | 8.3676 | 3414 | 0.5577 | 0.4273 | 0.5577 | 0.7468 |
| 0.0411 | 8.3725 | 3416 | 0.5602 | 0.4273 | 0.5602 | 0.7485 |
| 0.0411 | 8.3775 | 3418 | 0.5601 | 0.4273 | 0.5601 | 0.7484 |
| 0.0411 | 8.3824 | 3420 | 0.5637 | 0.4273 | 0.5637 | 0.7508 |
| 0.0411 | 8.3873 | 3422 | 0.5684 | 0.4273 | 0.5684 | 0.7539 |
| 0.0411 | 8.3922 | 3424 | 0.5722 | 0.4273 | 0.5722 | 0.7565 |
| 0.0411 | 8.3971 | 3426 | 0.5776 | 0.3724 | 0.5776 | 0.7600 |
| 0.0411 | 8.4020 | 3428 | 0.5828 | 0.3724 | 0.5828 | 0.7634 |
| 0.0411 | 8.4069 | 3430 | 0.5846 | 0.3724 | 0.5846 | 0.7646 |
| 0.0411 | 8.4118 | 3432 | 0.5913 | 0.3724 | 0.5913 | 0.7690 |
| 0.0411 | 8.4167 | 3434 | 0.6020 | 0.4661 | 0.6020 | 0.7759 |
| 0.0411 | 8.4216 | 3436 | 0.6085 | 0.3255 | 0.6085 | 0.7801 |
| 0.0411 | 8.4265 | 3438 | 0.6069 | 0.3255 | 0.6069 | 0.7790 |
| 0.0411 | 8.4314 | 3440 | 0.6069 | 0.3255 | 0.6069 | 0.7791 |
| 0.0411 | 8.4363 | 3442 | 0.6114 | 0.3255 | 0.6114 | 0.7819 |
| 0.0411 | 8.4412 | 3444 | 0.6174 | 0.3255 | 0.6174 | 0.7858 |
| 0.0411 | 8.4461 | 3446 | 0.6170 | 0.3255 | 0.6170 | 0.7855 |
| 0.0411 | 8.4510 | 3448 | 0.6171 | 0.3255 | 0.6171 | 0.7855 |
| 0.0411 | 8.4559 | 3450 | 0.6143 | 0.3255 | 0.6143 | 0.7838 |
| 0.0411 | 8.4608 | 3452 | 0.6091 | 0.3255 | 0.6091 | 0.7804 |
| 0.0411 | 8.4657 | 3454 | 0.6011 | 0.3255 | 0.6011 | 0.7753 |
| 0.0411 | 8.4706 | 3456 | 0.5933 | 0.4661 | 0.5933 | 0.7703 |
| 0.0411 | 8.4755 | 3458 | 0.5810 | 0.4661 | 0.5810 | 0.7622 |
| 0.0411 | 8.4804 | 3460 | 0.5703 | 0.4661 | 0.5703 | 0.7552 |
| 0.0411 | 8.4853 | 3462 | 0.5600 | 0.3724 | 0.5600 | 0.7483 |
| 0.0411 | 8.4902 | 3464 | 0.5534 | 0.4273 | 0.5534 | 0.7439 |
| 0.0411 | 8.4951 | 3466 | 0.5496 | 0.4273 | 0.5496 | 0.7414 |
| 0.0411 | 8.5000 | 3468 | 0.5488 | 0.4273 | 0.5488 | 0.7408 |
| 0.0411 | 8.5049 | 3470 | 0.5537 | 0.4273 | 0.5537 | 0.7441 |
| 0.0411 | 8.5098 | 3472 | 0.5573 | 0.5157 | 0.5573 | 0.7465 |
| 0.0411 | 8.5147 | 3474 | 0.5669 | 0.4661 | 0.5669 | 0.7529 |
| 0.0411 | 8.5196 | 3476 | 0.5744 | 0.4661 | 0.5744 | 0.7579 |
| 0.0411 | 8.5245 | 3478 | 0.5785 | 0.4661 | 0.5785 | 0.7606 |
| 0.0411 | 8.5294 | 3480 | 0.5774 | 0.4661 | 0.5774 | 0.7599 |
| 0.0411 | 8.5343 | 3482 | 0.5785 | 0.4661 | 0.5785 | 0.7606 |
| 0.0411 | 8.5392 | 3484 | 0.5789 | 0.4661 | 0.5789 | 0.7609 |
| 0.0411 | 8.5441 | 3486 | 0.5788 | 0.4661 | 0.5788 | 0.7608 |
| 0.0411 | 8.5490 | 3488 | 0.5818 | 0.4661 | 0.5818 | 0.7628 |
| 0.0411 | 8.5539 | 3490 | 0.5864 | 0.4661 | 0.5864 | 0.7658 |
| 0.0411 | 8.5588 | 3492 | 0.5885 | 0.4661 | 0.5885 | 0.7671 |
| 0.0411 | 8.5637 | 3494 | 0.5878 | 0.4661 | 0.5878 | 0.7667 |
| 0.0411 | 8.5686 | 3496 | 0.5867 | 0.4661 | 0.5867 | 0.7660 |
| 0.0411 | 8.5735 | 3498 | 0.5905 | 0.4661 | 0.5905 | 0.7684 |
| 0.0395 | 8.5784 | 3500 | 0.5949 | 0.4661 | 0.5949 | 0.7713 |
| 0.0395 | 8.5833 | 3502 | 0.6033 | 0.4661 | 0.6033 | 0.7767 |
| 0.0395 | 8.5882 | 3504 | 0.6088 | 0.4661 | 0.6088 | 0.7803 |
| 0.0395 | 8.5931 | 3506 | 0.6136 | 0.3255 | 0.6136 | 0.7834 |
| 0.0395 | 8.5980 | 3508 | 0.6195 | 0.3255 | 0.6195 | 0.7871 |
| 0.0395 | 8.6029 | 3510 | 0.6209 | 0.3255 | 0.6209 | 0.7880 |
| 0.0395 | 8.6078 | 3512 | 0.6172 | 0.3255 | 0.6172 | 0.7856 |
| 0.0395 | 8.6127 | 3514 | 0.6131 | 0.3255 | 0.6131 | 0.7830 |
| 0.0395 | 8.6176 | 3516 | 0.6108 | 0.3255 | 0.6108 | 0.7815 |
| 0.0395 | 8.6225 | 3518 | 0.6101 | 0.3255 | 0.6101 | 0.7811 |
| 0.0395 | 8.6275 | 3520 | 0.6114 | 0.3255 | 0.6114 | 0.7819 |
| 0.0395 | 8.6324 | 3522 | 0.6152 | 0.3255 | 0.6152 | 0.7843 |
| 0.0395 | 8.6373 | 3524 | 0.6157 | 0.3255 | 0.6157 | 0.7847 |
| 0.0395 | 8.6422 | 3526 | 0.6097 | 0.3255 | 0.6097 | 0.7809 |
| 0.0395 | 8.6471 | 3528 | 0.5981 | 0.3255 | 0.5981 | 0.7734 |
| 0.0395 | 8.6520 | 3530 | 0.5852 | 0.3255 | 0.5852 | 0.7650 |
| 0.0395 | 8.6569 | 3532 | 0.5769 | 0.3255 | 0.5769 | 0.7595 |
| 0.0395 | 8.6618 | 3534 | 0.5712 | 0.2186 | 0.5712 | 0.7558 |
| 0.0395 | 8.6667 | 3536 | 0.5646 | 0.3724 | 0.5646 | 0.7514 |
| 0.0395 | 8.6716 | 3538 | 0.5561 | 0.3724 | 0.5561 | 0.7457 |
| 0.0395 | 8.6765 | 3540 | 0.5544 | 0.3724 | 0.5544 | 0.7446 |
| 0.0395 | 8.6814 | 3542 | 0.5542 | 0.3724 | 0.5542 | 0.7445 |
| 0.0395 | 8.6863 | 3544 | 0.5583 | 0.3724 | 0.5583 | 0.7472 |
| 0.0395 | 8.6912 | 3546 | 0.5577 | 0.3724 | 0.5577 | 0.7468 |
| 0.0395 | 8.6961 | 3548 | 0.5577 | 0.3724 | 0.5577 | 0.7468 |
| 0.0395 | 8.7010 | 3550 | 0.5605 | 0.3724 | 0.5605 | 0.7486 |
| 0.0395 | 8.7059 | 3552 | 0.5652 | 0.3724 | 0.5652 | 0.7518 |
| 0.0395 | 8.7108 | 3554 | 0.5696 | 0.3724 | 0.5696 | 0.7547 |
| 0.0395 | 8.7157 | 3556 | 0.5758 | 0.3724 | 0.5758 | 0.7588 |
| 0.0395 | 8.7206 | 3558 | 0.5789 | 0.3724 | 0.5789 | 0.7609 |
| 0.0395 | 8.7255 | 3560 | 0.5809 | 0.3724 | 0.5809 | 0.7622 |
| 0.0395 | 8.7304 | 3562 | 0.5822 | 0.3724 | 0.5822 | 0.7630 |
| 0.0395 | 8.7353 | 3564 | 0.5844 | 0.3724 | 0.5844 | 0.7645 |
| 0.0395 | 8.7402 | 3566 | 0.5884 | 0.3724 | 0.5884 | 0.7671 |
| 0.0395 | 8.7451 | 3568 | 0.5900 | 0.3724 | 0.5900 | 0.7681 |
| 0.0395 | 8.7500 | 3570 | 0.5911 | 0.3724 | 0.5911 | 0.7688 |
| 0.0395 | 8.7549 | 3572 | 0.5946 | 0.3724 | 0.5946 | 0.7711 |
| 0.0395 | 8.7598 | 3574 | 0.5956 | 0.3724 | 0.5956 | 0.7718 |
| 0.0395 | 8.7647 | 3576 | 0.5976 | 0.3724 | 0.5976 | 0.7731 |
| 0.0395 | 8.7696 | 3578 | 0.5984 | 0.3724 | 0.5984 | 0.7736 |
| 0.0395 | 8.7745 | 3580 | 0.5987 | 0.3724 | 0.5987 | 0.7738 |
| 0.0395 | 8.7794 | 3582 | 0.5977 | 0.3724 | 0.5977 | 0.7731 |
| 0.0395 | 8.7843 | 3584 | 0.5946 | 0.3724 | 0.5946 | 0.7711 |
| 0.0395 | 8.7892 | 3586 | 0.5934 | 0.3724 | 0.5934 | 0.7703 |
| 0.0395 | 8.7941 | 3588 | 0.5942 | 0.3724 | 0.5942 | 0.7709 |
| 0.0395 | 8.7990 | 3590 | 0.5933 | 0.3724 | 0.5933 | 0.7702 |
| 0.0395 | 8.8039 | 3592 | 0.5892 | 0.3724 | 0.5892 | 0.7676 |
| 0.0395 | 8.8088 | 3594 | 0.5852 | 0.3724 | 0.5852 | 0.7650 |
| 0.0395 | 8.8137 | 3596 | 0.5821 | 0.3724 | 0.5821 | 0.7629 |
| 0.0395 | 8.8186 | 3598 | 0.5784 | 0.3724 | 0.5784 | 0.7605 |
| 0.0395 | 8.8235 | 3600 | 0.5757 | 0.3724 | 0.5757 | 0.7588 |
| 0.0395 | 8.8284 | 3602 | 0.5732 | 0.3724 | 0.5732 | 0.7571 |
| 0.0395 | 8.8333 | 3604 | 0.5686 | 0.3724 | 0.5686 | 0.7541 |
| 0.0395 | 8.8382 | 3606 | 0.5621 | 0.4273 | 0.5621 | 0.7497 |
| 0.0395 | 8.8431 | 3608 | 0.5570 | 0.4273 | 0.5570 | 0.7463 |
| 0.0395 | 8.8480 | 3610 | 0.5540 | 0.4273 | 0.5540 | 0.7443 |
| 0.0395 | 8.8529 | 3612 | 0.5528 | 0.3865 | 0.5528 | 0.7435 |
| 0.0395 | 8.8578 | 3614 | 0.5531 | 0.4273 | 0.5531 | 0.7437 |
| 0.0395 | 8.8627 | 3616 | 0.5538 | 0.4273 | 0.5538 | 0.7442 |
| 0.0395 | 8.8676 | 3618 | 0.5534 | 0.4273 | 0.5534 | 0.7439 |
| 0.0395 | 8.8725 | 3620 | 0.5550 | 0.4273 | 0.5550 | 0.7450 |
| 0.0395 | 8.8775 | 3622 | 0.5559 | 0.4273 | 0.5559 | 0.7456 |
| 0.0395 | 8.8824 | 3624 | 0.5563 | 0.4273 | 0.5563 | 0.7458 |
| 0.0395 | 8.8873 | 3626 | 0.5566 | 0.4273 | 0.5566 | 0.7461 |
| 0.0395 | 8.8922 | 3628 | 0.5598 | 0.4273 | 0.5598 | 0.7482 |
| 0.0395 | 8.8971 | 3630 | 0.5633 | 0.4273 | 0.5633 | 0.7506 |
| 0.0395 | 8.9020 | 3632 | 0.5721 | 0.4273 | 0.5721 | 0.7563 |
| 0.0395 | 8.9069 | 3634 | 0.5803 | 0.2186 | 0.5803 | 0.7618 |
| 0.0395 | 8.9118 | 3636 | 0.5836 | 0.2186 | 0.5836 | 0.7640 |
| 0.0395 | 8.9167 | 3638 | 0.5866 | 0.2186 | 0.5866 | 0.7659 |
| 0.0395 | 8.9216 | 3640 | 0.5892 | 0.2186 | 0.5892 | 0.7676 |
| 0.0395 | 8.9265 | 3642 | 0.5961 | 0.2186 | 0.5961 | 0.7721 |
| 0.0395 | 8.9314 | 3644 | 0.6027 | 0.2186 | 0.6027 | 0.7764 |
| 0.0395 | 8.9363 | 3646 | 0.6056 | 0.3255 | 0.6056 | 0.7782 |
| 0.0395 | 8.9412 | 3648 | 0.6084 | 0.3636 | 0.6084 | 0.7800 |
| 0.0395 | 8.9461 | 3650 | 0.6100 | 0.3636 | 0.6100 | 0.7810 |
| 0.0395 | 8.9510 | 3652 | 0.6083 | 0.3636 | 0.6083 | 0.7799 |
| 0.0395 | 8.9559 | 3654 | 0.6041 | 0.3636 | 0.6041 | 0.7772 |
| 0.0395 | 8.9608 | 3656 | 0.5978 | 0.3636 | 0.5978 | 0.7732 |
| 0.0395 | 8.9657 | 3658 | 0.5888 | 0.2588 | 0.5888 | 0.7673 |
| 0.0395 | 8.9706 | 3660 | 0.5776 | 0.2186 | 0.5776 | 0.7600 |
| 0.0395 | 8.9755 | 3662 | 0.5683 | 0.2186 | 0.5683 | 0.7539 |
| 0.0395 | 8.9804 | 3664 | 0.5618 | 0.2186 | 0.5618 | 0.7495 |
| 0.0395 | 8.9853 | 3666 | 0.5585 | 0.3724 | 0.5585 | 0.7473 |
| 0.0395 | 8.9902 | 3668 | 0.5572 | 0.3724 | 0.5572 | 0.7465 |
| 0.0395 | 8.9951 | 3670 | 0.5576 | 0.3724 | 0.5576 | 0.7467 |
| 0.0395 | 9.0000 | 3672 | 0.5591 | 0.2186 | 0.5591 | 0.7477 |
| 0.0395 | 9.0049 | 3674 | 0.5627 | 0.2186 | 0.5627 | 0.7501 |
| 0.0395 | 9.0098 | 3676 | 0.5669 | 0.2186 | 0.5669 | 0.7529 |
| 0.0395 | 9.0147 | 3678 | 0.5710 | 0.2186 | 0.5710 | 0.7556 |
| 0.0395 | 9.0196 | 3680 | 0.5723 | 0.2186 | 0.5723 | 0.7565 |
| 0.0395 | 9.0245 | 3682 | 0.5713 | 0.2186 | 0.5713 | 0.7559 |
| 0.0395 | 9.0294 | 3684 | 0.5690 | 0.2186 | 0.5690 | 0.7543 |
| 0.0395 | 9.0343 | 3686 | 0.5685 | 0.2186 | 0.5685 | 0.7540 |
| 0.0395 | 9.0392 | 3688 | 0.5687 | 0.2186 | 0.5687 | 0.7541 |
| 0.0395 | 9.0441 | 3690 | 0.5692 | 0.2186 | 0.5692 | 0.7544 |
| 0.0395 | 9.0490 | 3692 | 0.5677 | 0.2186 | 0.5677 | 0.7535 |
| 0.0395 | 9.0539 | 3694 | 0.5653 | 0.2186 | 0.5653 | 0.7519 |
| 0.0395 | 9.0588 | 3696 | 0.5648 | 0.2186 | 0.5648 | 0.7516 |
| 0.0395 | 9.0637 | 3698 | 0.5657 | 0.2186 | 0.5657 | 0.7521 |
| 0.0395 | 9.0686 | 3700 | 0.5678 | 0.2186 | 0.5678 | 0.7536 |
| 0.0395 | 9.0735 | 3702 | 0.5700 | 0.2186 | 0.5700 | 0.7550 |
| 0.0395 | 9.0784 | 3704 | 0.5745 | 0.2186 | 0.5745 | 0.7579 |
| 0.0395 | 9.0833 | 3706 | 0.5772 | 0.2186 | 0.5772 | 0.7597 |
| 0.0395 | 9.0882 | 3708 | 0.5789 | 0.2186 | 0.5789 | 0.7609 |
| 0.0395 | 9.0931 | 3710 | 0.5789 | 0.2186 | 0.5789 | 0.7609 |
| 0.0395 | 9.0980 | 3712 | 0.5778 | 0.2759 | 0.5778 | 0.7602 |
| 0.0395 | 9.1029 | 3714 | 0.5768 | 0.4273 | 0.5768 | 0.7595 |
| 0.0395 | 9.1078 | 3716 | 0.5758 | 0.4273 | 0.5758 | 0.7588 |
| 0.0395 | 9.1127 | 3718 | 0.5743 | 0.4273 | 0.5743 | 0.7579 |
| 0.0395 | 9.1176 | 3720 | 0.5733 | 0.4273 | 0.5733 | 0.7571 |
| 0.0395 | 9.1225 | 3722 | 0.5704 | 0.4273 | 0.5704 | 0.7553 |
| 0.0395 | 9.1275 | 3724 | 0.5683 | 0.4273 | 0.5683 | 0.7539 |
| 0.0395 | 9.1324 | 3726 | 0.5648 | 0.4273 | 0.5648 | 0.7515 |
| 0.0395 | 9.1373 | 3728 | 0.5603 | 0.4273 | 0.5603 | 0.7485 |
| 0.0395 | 9.1422 | 3730 | 0.5565 | 0.4273 | 0.5565 | 0.7460 |
| 0.0395 | 9.1471 | 3732 | 0.5521 | 0.3865 | 0.5521 | 0.7431 |
| 0.0395 | 9.1520 | 3734 | 0.5487 | 0.3865 | 0.5487 | 0.7407 |
| 0.0395 | 9.1569 | 3736 | 0.5448 | 0.3865 | 0.5448 | 0.7381 |
| 0.0395 | 9.1618 | 3738 | 0.5402 | 0.3865 | 0.5402 | 0.7350 |
| 0.0395 | 9.1667 | 3740 | 0.5361 | 0.3865 | 0.5361 | 0.7322 |
| 0.0395 | 9.1716 | 3742 | 0.5336 | 0.3865 | 0.5336 | 0.7305 |
| 0.0395 | 9.1765 | 3744 | 0.5327 | 0.3865 | 0.5327 | 0.7299 |
| 0.0395 | 9.1814 | 3746 | 0.5325 | 0.3865 | 0.5325 | 0.7297 |
| 0.0395 | 9.1863 | 3748 | 0.5332 | 0.3865 | 0.5332 | 0.7302 |
| 0.0395 | 9.1912 | 3750 | 0.5334 | 0.3865 | 0.5334 | 0.7304 |
| 0.0395 | 9.1961 | 3752 | 0.5338 | 0.3865 | 0.5338 | 0.7306 |
| 0.0395 | 9.2010 | 3754 | 0.5335 | 0.3865 | 0.5335 | 0.7304 |
| 0.0395 | 9.2059 | 3756 | 0.5345 | 0.3865 | 0.5345 | 0.7311 |
| 0.0395 | 9.2108 | 3758 | 0.5365 | 0.3865 | 0.5365 | 0.7325 |
| 0.0395 | 9.2157 | 3760 | 0.5391 | 0.3865 | 0.5391 | 0.7342 |
| 0.0395 | 9.2206 | 3762 | 0.5403 | 0.4273 | 0.5403 | 0.7350 |
| 0.0395 | 9.2255 | 3764 | 0.5400 | 0.4273 | 0.5400 | 0.7349 |
| 0.0395 | 9.2304 | 3766 | 0.5395 | 0.4273 | 0.5395 | 0.7345 |
| 0.0395 | 9.2353 | 3768 | 0.5383 | 0.3865 | 0.5383 | 0.7337 |
| 0.0395 | 9.2402 | 3770 | 0.5370 | 0.3865 | 0.5370 | 0.7328 |
| 0.0395 | 9.2451 | 3772 | 0.5368 | 0.3865 | 0.5368 | 0.7327 |
| 0.0395 | 9.2500 | 3774 | 0.5364 | 0.4273 | 0.5364 | 0.7324 |
| 0.0395 | 9.2549 | 3776 | 0.5382 | 0.4273 | 0.5382 | 0.7336 |
| 0.0395 | 9.2598 | 3778 | 0.5408 | 0.4273 | 0.5408 | 0.7354 |
| 0.0395 | 9.2647 | 3780 | 0.5430 | 0.4273 | 0.5430 | 0.7369 |
| 0.0395 | 9.2696 | 3782 | 0.5467 | 0.4273 | 0.5467 | 0.7394 |
| 0.0395 | 9.2745 | 3784 | 0.5521 | 0.4273 | 0.5521 | 0.7430 |
| 0.0395 | 9.2794 | 3786 | 0.5571 | 0.3724 | 0.5571 | 0.7464 |
| 0.0395 | 9.2843 | 3788 | 0.5607 | 0.2186 | 0.5607 | 0.7488 |
| 0.0395 | 9.2892 | 3790 | 0.5620 | 0.2186 | 0.5620 | 0.7496 |
| 0.0395 | 9.2941 | 3792 | 0.5625 | 0.3724 | 0.5625 | 0.7500 |
| 0.0395 | 9.2990 | 3794 | 0.5618 | 0.3724 | 0.5618 | 0.7496 |
| 0.0395 | 9.3039 | 3796 | 0.5623 | 0.3724 | 0.5623 | 0.7499 |
| 0.0395 | 9.3088 | 3798 | 0.5630 | 0.3724 | 0.5630 | 0.7503 |
| 0.0395 | 9.3137 | 3800 | 0.5649 | 0.3724 | 0.5649 | 0.7516 |
| 0.0395 | 9.3186 | 3802 | 0.5654 | 0.3724 | 0.5654 | 0.7519 |
| 0.0395 | 9.3235 | 3804 | 0.5646 | 0.3724 | 0.5646 | 0.7514 |
| 0.0395 | 9.3284 | 3806 | 0.5633 | 0.3724 | 0.5633 | 0.7506 |
| 0.0395 | 9.3333 | 3808 | 0.5630 | 0.3724 | 0.5630 | 0.7503 |
| 0.0395 | 9.3382 | 3810 | 0.5640 | 0.3724 | 0.5640 | 0.7510 |
| 0.0395 | 9.3431 | 3812 | 0.5662 | 0.3724 | 0.5662 | 0.7525 |
| 0.0395 | 9.3480 | 3814 | 0.5707 | 0.4661 | 0.5707 | 0.7555 |
| 0.0395 | 9.3529 | 3816 | 0.5756 | 0.4661 | 0.5756 | 0.7587 |
| 0.0395 | 9.3578 | 3818 | 0.5784 | 0.4661 | 0.5784 | 0.7605 |
| 0.0395 | 9.3627 | 3820 | 0.5805 | 0.4661 | 0.5805 | 0.7619 |
| 0.0395 | 9.3676 | 3822 | 0.5831 | 0.4661 | 0.5831 | 0.7636 |
| 0.0395 | 9.3725 | 3824 | 0.5853 | 0.4661 | 0.5853 | 0.7651 |
| 0.0395 | 9.3775 | 3826 | 0.5859 | 0.3255 | 0.5859 | 0.7654 |
| 0.0395 | 9.3824 | 3828 | 0.5855 | 0.3255 | 0.5855 | 0.7652 |
| 0.0395 | 9.3873 | 3830 | 0.5837 | 0.4661 | 0.5837 | 0.7640 |
| 0.0395 | 9.3922 | 3832 | 0.5815 | 0.4661 | 0.5815 | 0.7626 |
| 0.0395 | 9.3971 | 3834 | 0.5779 | 0.4661 | 0.5779 | 0.7602 |
| 0.0395 | 9.4020 | 3836 | 0.5757 | 0.4661 | 0.5757 | 0.7587 |
| 0.0395 | 9.4069 | 3838 | 0.5724 | 0.4661 | 0.5724 | 0.7566 |
| 0.0395 | 9.4118 | 3840 | 0.5706 | 0.4661 | 0.5706 | 0.7554 |
| 0.0395 | 9.4167 | 3842 | 0.5696 | 0.4661 | 0.5696 | 0.7547 |
| 0.0395 | 9.4216 | 3844 | 0.5677 | 0.4661 | 0.5677 | 0.7535 |
| 0.0395 | 9.4265 | 3846 | 0.5645 | 0.3724 | 0.5645 | 0.7514 |
| 0.0395 | 9.4314 | 3848 | 0.5597 | 0.3724 | 0.5597 | 0.7481 |
| 0.0395 | 9.4363 | 3850 | 0.5552 | 0.4273 | 0.5552 | 0.7451 |
| 0.0395 | 9.4412 | 3852 | 0.5515 | 0.4273 | 0.5515 | 0.7426 |
| 0.0395 | 9.4461 | 3854 | 0.5495 | 0.4273 | 0.5495 | 0.7413 |
| 0.0395 | 9.4510 | 3856 | 0.5481 | 0.4273 | 0.5481 | 0.7403 |
| 0.0395 | 9.4559 | 3858 | 0.5491 | 0.4273 | 0.5491 | 0.7410 |
| 0.0395 | 9.4608 | 3860 | 0.5512 | 0.4273 | 0.5512 | 0.7424 |
| 0.0395 | 9.4657 | 3862 | 0.5539 | 0.4273 | 0.5539 | 0.7443 |
| 0.0395 | 9.4706 | 3864 | 0.5568 | 0.4273 | 0.5568 | 0.7462 |
| 0.0395 | 9.4755 | 3866 | 0.5586 | 0.4273 | 0.5586 | 0.7474 |
| 0.0395 | 9.4804 | 3868 | 0.5591 | 0.4273 | 0.5591 | 0.7477 |
| 0.0395 | 9.4853 | 3870 | 0.5600 | 0.3724 | 0.5600 | 0.7483 |
| 0.0395 | 9.4902 | 3872 | 0.5613 | 0.3724 | 0.5613 | 0.7492 |
| 0.0395 | 9.4951 | 3874 | 0.5638 | 0.3724 | 0.5638 | 0.7509 |
| 0.0395 | 9.5000 | 3876 | 0.5661 | 0.3724 | 0.5661 | 0.7524 |
| 0.0395 | 9.5049 | 3878 | 0.5688 | 0.4661 | 0.5688 | 0.7542 |
| 0.0395 | 9.5098 | 3880 | 0.5699 | 0.4661 | 0.5699 | 0.7549 |
| 0.0395 | 9.5147 | 3882 | 0.5689 | 0.4661 | 0.5689 | 0.7543 |
| 0.0395 | 9.5196 | 3884 | 0.5698 | 0.4661 | 0.5698 | 0.7549 |
| 0.0395 | 9.5245 | 3886 | 0.5715 | 0.4661 | 0.5715 | 0.7560 |
| 0.0395 | 9.5294 | 3888 | 0.5728 | 0.4661 | 0.5728 | 0.7569 |
| 0.0395 | 9.5343 | 3890 | 0.5726 | 0.4661 | 0.5726 | 0.7567 |
| 0.0395 | 9.5392 | 3892 | 0.5726 | 0.4661 | 0.5726 | 0.7567 |
| 0.0395 | 9.5441 | 3894 | 0.5726 | 0.4661 | 0.5726 | 0.7567 |
| 0.0395 | 9.5490 | 3896 | 0.5733 | 0.4661 | 0.5733 | 0.7572 |
| 0.0395 | 9.5539 | 3898 | 0.5744 | 0.4661 | 0.5744 | 0.7579 |
| 0.0395 | 9.5588 | 3900 | 0.5754 | 0.4661 | 0.5754 | 0.7585 |
| 0.0395 | 9.5637 | 3902 | 0.5769 | 0.4661 | 0.5769 | 0.7595 |
| 0.0395 | 9.5686 | 3904 | 0.5777 | 0.4661 | 0.5777 | 0.7601 |
| 0.0395 | 9.5735 | 3906 | 0.5774 | 0.4661 | 0.5774 | 0.7599 |
| 0.0395 | 9.5784 | 3908 | 0.5773 | 0.4661 | 0.5773 | 0.7598 |
| 0.0395 | 9.5833 | 3910 | 0.5783 | 0.4661 | 0.5783 | 0.7604 |
| 0.0395 | 9.5882 | 3912 | 0.5782 | 0.4661 | 0.5782 | 0.7604 |
| 0.0395 | 9.5931 | 3914 | 0.5770 | 0.4661 | 0.5770 | 0.7596 |
| 0.0395 | 9.5980 | 3916 | 0.5759 | 0.4661 | 0.5759 | 0.7589 |
| 0.0395 | 9.6029 | 3918 | 0.5747 | 0.4661 | 0.5747 | 0.7581 |
| 0.0395 | 9.6078 | 3920 | 0.5728 | 0.4661 | 0.5728 | 0.7568 |
| 0.0395 | 9.6127 | 3922 | 0.5706 | 0.3724 | 0.5706 | 0.7554 |
| 0.0395 | 9.6176 | 3924 | 0.5685 | 0.3724 | 0.5685 | 0.7540 |
| 0.0395 | 9.6225 | 3926 | 0.5657 | 0.3724 | 0.5657 | 0.7521 |
| 0.0395 | 9.6275 | 3928 | 0.5639 | 0.3724 | 0.5639 | 0.7509 |
| 0.0395 | 9.6324 | 3930 | 0.5626 | 0.3724 | 0.5626 | 0.7501 |
| 0.0395 | 9.6373 | 3932 | 0.5616 | 0.3724 | 0.5616 | 0.7494 |
| 0.0395 | 9.6422 | 3934 | 0.5607 | 0.3724 | 0.5607 | 0.7488 |
| 0.0395 | 9.6471 | 3936 | 0.5582 | 0.3724 | 0.5582 | 0.7471 |
| 0.0395 | 9.6520 | 3938 | 0.5567 | 0.3724 | 0.5567 | 0.7461 |
| 0.0395 | 9.6569 | 3940 | 0.5553 | 0.3724 | 0.5553 | 0.7452 |
| 0.0395 | 9.6618 | 3942 | 0.5543 | 0.3724 | 0.5543 | 0.7445 |
| 0.0395 | 9.6667 | 3944 | 0.5538 | 0.3724 | 0.5538 | 0.7441 |
| 0.0395 | 9.6716 | 3946 | 0.5537 | 0.3724 | 0.5537 | 0.7441 |
| 0.0395 | 9.6765 | 3948 | 0.5541 | 0.3724 | 0.5541 | 0.7444 |
| 0.0395 | 9.6814 | 3950 | 0.5545 | 0.3724 | 0.5545 | 0.7447 |
| 0.0395 | 9.6863 | 3952 | 0.5553 | 0.3724 | 0.5553 | 0.7452 |
| 0.0395 | 9.6912 | 3954 | 0.5557 | 0.3724 | 0.5557 | 0.7455 |
| 0.0395 | 9.6961 | 3956 | 0.5560 | 0.3724 | 0.5560 | 0.7456 |
| 0.0395 | 9.7010 | 3958 | 0.5553 | 0.3724 | 0.5553 | 0.7452 |
| 0.0395 | 9.7059 | 3960 | 0.5555 | 0.3724 | 0.5555 | 0.7453 |
| 0.0395 | 9.7108 | 3962 | 0.5563 | 0.3724 | 0.5563 | 0.7459 |
| 0.0395 | 9.7157 | 3964 | 0.5571 | 0.3724 | 0.5571 | 0.7464 |
| 0.0395 | 9.7206 | 3966 | 0.5581 | 0.3724 | 0.5581 | 0.7471 |
| 0.0395 | 9.7255 | 3968 | 0.5588 | 0.3724 | 0.5588 | 0.7475 |
| 0.0395 | 9.7304 | 3970 | 0.5593 | 0.3724 | 0.5593 | 0.7478 |
| 0.0395 | 9.7353 | 3972 | 0.5601 | 0.3724 | 0.5601 | 0.7484 |
| 0.0395 | 9.7402 | 3974 | 0.5604 | 0.3724 | 0.5604 | 0.7486 |
| 0.0395 | 9.7451 | 3976 | 0.5609 | 0.3724 | 0.5609 | 0.7490 |
| 0.0395 | 9.7500 | 3978 | 0.5611 | 0.3724 | 0.5611 | 0.7490 |
| 0.0395 | 9.7549 | 3980 | 0.5616 | 0.4661 | 0.5616 | 0.7494 |
| 0.0395 | 9.7598 | 3982 | 0.5622 | 0.4661 | 0.5622 | 0.7498 |
| 0.0395 | 9.7647 | 3984 | 0.5627 | 0.4661 | 0.5627 | 0.7501 |
| 0.0395 | 9.7696 | 3986 | 0.5633 | 0.4661 | 0.5633 | 0.7505 |
| 0.0395 | 9.7745 | 3988 | 0.5643 | 0.4661 | 0.5643 | 0.7512 |
| 0.0395 | 9.7794 | 3990 | 0.5650 | 0.4661 | 0.5650 | 0.7517 |
| 0.0395 | 9.7843 | 3992 | 0.5657 | 0.4661 | 0.5657 | 0.7522 |
| 0.0395 | 9.7892 | 3994 | 0.5667 | 0.3255 | 0.5667 | 0.7528 |
| 0.0395 | 9.7941 | 3996 | 0.5677 | 0.3255 | 0.5677 | 0.7535 |
| 0.0395 | 9.7990 | 3998 | 0.5690 | 0.3255 | 0.5690 | 0.7543 |
| 0.0358 | 9.8039 | 4000 | 0.5703 | 0.3255 | 0.5703 | 0.7552 |
| 0.0358 | 9.8088 | 4002 | 0.5722 | 0.3255 | 0.5722 | 0.7564 |
| 0.0358 | 9.8137 | 4004 | 0.5739 | 0.3255 | 0.5739 | 0.7576 |
| 0.0358 | 9.8186 | 4006 | 0.5745 | 0.3255 | 0.5745 | 0.7580 |
| 0.0358 | 9.8235 | 4008 | 0.5747 | 0.3255 | 0.5747 | 0.7581 |
| 0.0358 | 9.8284 | 4010 | 0.5741 | 0.3255 | 0.5741 | 0.7577 |
| 0.0358 | 9.8333 | 4012 | 0.5736 | 0.3255 | 0.5736 | 0.7574 |
| 0.0358 | 9.8382 | 4014 | 0.5741 | 0.3255 | 0.5741 | 0.7577 |
| 0.0358 | 9.8431 | 4016 | 0.5744 | 0.3255 | 0.5744 | 0.7579 |
| 0.0358 | 9.8480 | 4018 | 0.5741 | 0.3255 | 0.5741 | 0.7577 |
| 0.0358 | 9.8529 | 4020 | 0.5739 | 0.3255 | 0.5739 | 0.7576 |
| 0.0358 | 9.8578 | 4022 | 0.5734 | 0.3255 | 0.5734 | 0.7572 |
| 0.0358 | 9.8627 | 4024 | 0.5731 | 0.3255 | 0.5731 | 0.7570 |
| 0.0358 | 9.8676 | 4026 | 0.5723 | 0.3255 | 0.5723 | 0.7565 |
| 0.0358 | 9.8725 | 4028 | 0.5719 | 0.3255 | 0.5719 | 0.7562 |
| 0.0358 | 9.8775 | 4030 | 0.5717 | 0.3255 | 0.5717 | 0.7561 |
| 0.0358 | 9.8824 | 4032 | 0.5713 | 0.3255 | 0.5713 | 0.7558 |
| 0.0358 | 9.8873 | 4034 | 0.5709 | 0.3255 | 0.5709 | 0.7556 |
| 0.0358 | 9.8922 | 4036 | 0.5706 | 0.3255 | 0.5706 | 0.7554 |
| 0.0358 | 9.8971 | 4038 | 0.5701 | 0.3255 | 0.5701 | 0.7550 |
| 0.0358 | 9.9020 | 4040 | 0.5696 | 0.3255 | 0.5696 | 0.7547 |
| 0.0358 | 9.9069 | 4042 | 0.5691 | 0.3255 | 0.5691 | 0.7544 |
| 0.0358 | 9.9118 | 4044 | 0.5685 | 0.3255 | 0.5685 | 0.7540 |
| 0.0358 | 9.9167 | 4046 | 0.5679 | 0.3255 | 0.5679 | 0.7536 |
| 0.0358 | 9.9216 | 4048 | 0.5674 | 0.3255 | 0.5674 | 0.7533 |
| 0.0358 | 9.9265 | 4050 | 0.5670 | 0.3255 | 0.5670 | 0.7530 |
| 0.0358 | 9.9314 | 4052 | 0.5666 | 0.3255 | 0.5666 | 0.7527 |
| 0.0358 | 9.9363 | 4054 | 0.5663 | 0.4661 | 0.5663 | 0.7525 |
| 0.0358 | 9.9412 | 4056 | 0.5662 | 0.4661 | 0.5662 | 0.7524 |
| 0.0358 | 9.9461 | 4058 | 0.5659 | 0.4661 | 0.5659 | 0.7523 |
| 0.0358 | 9.9510 | 4060 | 0.5657 | 0.4661 | 0.5657 | 0.7521 |
| 0.0358 | 9.9559 | 4062 | 0.5656 | 0.4661 | 0.5656 | 0.7521 |
| 0.0358 | 9.9608 | 4064 | 0.5656 | 0.4661 | 0.5656 | 0.7521 |
| 0.0358 | 9.9657 | 4066 | 0.5656 | 0.4661 | 0.5656 | 0.7520 |
| 0.0358 | 9.9706 | 4068 | 0.5655 | 0.4661 | 0.5655 | 0.7520 |
| 0.0358 | 9.9755 | 4070 | 0.5654 | 0.4661 | 0.5654 | 0.7519 |
| 0.0358 | 9.9804 | 4072 | 0.5654 | 0.4661 | 0.5654 | 0.7519 |
| 0.0358 | 9.9853 | 4074 | 0.5654 | 0.4661 | 0.5654 | 0.7519 |
| 0.0358 | 9.9902 | 4076 | 0.5654 | 0.4661 | 0.5654 | 0.7519 |
| 0.0358 | 9.9951 | 4078 | 0.5654 | 0.4661 | 0.5654 | 0.7519 |
| 0.0358 | 10.0000 | 4080 | 0.5654 | 0.4661 | 0.5654 | 0.7519 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
SeppeV/ernie2p0_ft_pref_10pc | SeppeV | 2024-11-25T09:09:00Z | 89 | 0 | transformers | [
"transformers",
"safetensors",
"ernie",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T15:34:09Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
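Until those details are provided, here is a minimal sketch, assuming the standard `text-classification` pipeline applies to this checkpoint (the input sentence is illustrative):

```python
from transformers import pipeline

# ERNIE 2.0 is a natively supported architecture in transformers,
# so the generic pipeline loads the fine-tuned classification head directly.
classifier = pipeline("text-classification", model="SeppeV/ernie2p0_ft_pref_10pc")

print(classifier("An example input sentence."))
```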
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
allknowingroger/Marco-01-slerp7-7B | allknowingroger | 2024-11-25T09:08:36Z | 10 | 1 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"base_model:AIDC-AI/Marco-o1",
"base_model:merge:AIDC-AI/Marco-o1",
"base_model:ZeroXClem/Qwen2.5-7B-HomerCreative-Mix",
"base_model:merge:ZeroXClem/Qwen2.5-7B-HomerCreative-Mix",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T08:54:00Z | ---
base_model:
- ZeroXClem/Qwen2.5-7B-HomerCreative-Mix
- AIDC-AI/Marco-o1
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
### Models Merged
The following models were included in the merge:
* [ZeroXClem/Qwen2.5-7B-HomerCreative-Mix](https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerCreative-Mix)
* [AIDC-AI/Marco-o1](https://huggingface.co/AIDC-AI/Marco-o1)
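The merged checkpoint loads like any other Qwen2-based causal LM. A minimal inference sketch follows; the prompt, generation length, and device settings are illustrative, and bfloat16 matches the merge dtype in the configuration below:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allknowingroger/Marco-01-slerp7-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [{"role": "user", "content": "Walk me through a proof sketch of the triangle inequality."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```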
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: AIDC-AI/Marco-o1
- model: ZeroXClem/Qwen2.5-7B-HomerCreative-Mix
merge_method: slerp
base_model: AIDC-AI/Marco-o1
dtype: bfloat16
parameters:
t: [0, 0.5, 1, 0.5, 0] # V-shaped curve: Marco-o1 at the input and output layers, HomerCreative-Mix in the middle layers
``` |
jebish7/cde-small-v1_MNR_3 | jebish7 | 2024-11-25T09:06:55Z | 6 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:29545",
"loss:MultipleNegativesRankingLoss",
"custom_code",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:jxm/cde-small-v1",
"base_model:finetune:jxm/cde-small-v1",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-25T09:06:28Z | ---
base_model: jxm/cde-small-v1
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:29545
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: In terms of audited accounts submission for an Applicant, could
you clarify the scenarios in which the Regulator might agree that a reviewed pro
forma statement of financial position is not needed, and what factors would be
considered in making that determination?
sentences:
- "DocumentID: 1 | PassageID: 4.2.1.(3) | Passage: Where the regulator in another\
\ jurisdiction does not permit the implementation of policies, procedures, systems\
\ and controls consistent with these Rules, the Relevant Person must:\n(a)\tinform\
\ the Regulator in writing immediately; and\n(b)\tapply appropriate additional\
\ measures to manage the money laundering risks posed by the relevant branch or\
\ subsidiary."
- "DocumentID: 11 | PassageID: 2.3.15.(4) | Passage: The Applicant must submit to\
\ the Regulator the following records, as applicable:\n(a)\tAudited accounts,\
\ for the purposes of this Rule and Rule 2.3.2(1), for the last three full financial\
\ years, noting that:\n(i)\tif the Applicant applies for admission less than ninety\
\ days after the end of its last financial year, unless the Applicant has audited\
\ accounts for its latest full financial year, the accounts may be for the three\
\ years to the end of the previous financial year, but must also include audited\
\ or reviewed accounts for its most recent semi-annual financial reporting period;\
\ and\n(ii)\tif the Applicant applies for admission more than six months and seventy-five\
\ days after the end of its last financial year, audited or reviewed accounts\
\ for its most recent semi-annual financial reporting period (or longer period\
\ if available).\n(b)\tUnless the Regulator agrees it is not needed, a reviewed\
\ pro forma statement of financial position. The review must be conducted by an\
\ accredited professional auditor of the company or an independent accountant."
- 'DocumentID: 36 | PassageID: D.1.3. | Passage: Principle 1 – Oversight and responsibility
of climate-related financial risk exposures.Certain functions related to the management
of climate-related financial risks may be delegated, but, as with other risks,
the board is ultimately responsible and accountable for monitoring, managing and
overseeing climate-related risks for the financial firm.
'
- source_sentence: A financial institution is interested in multiple designations,
including the ADGM Green Fund and ADGM Green Bond. For each application, what
fee will the institution incur?
sentences:
- 'DocumentID: 31 | PassageID: 63) | Passage: INITIAL DISCLOSURE OF MATERIAL ESTIMATES.
Disclosure of material estimates of Contingent Resources
Section 2.3 of the PRMS Guidelines states that Contingent Resources may be assigned
for Petroleum Projects that are dependent on "technology under development", and
further recommended that a number of guidelines are followed in order to distinguish
these estimates from those that should be classified as Unrecoverable Petroleum. By
way of Rule 12.10.1(3), the FSRA fully supports and requires compliance with what
is set out in the PRMS Guidelines.
'
- 'DocumentID: 19 | PassageID: 40) | Passage: REGULATORY REQUIREMENTS FOR AUTHORISED
PERSONS ENGAGED IN REGULATED ACTIVITIES IN RELATION TO VIRTUAL ASSETS
Anti-Money Laundering and Countering Financing of Terrorism
On 21 June 2019, FATF released a revised Guidance for a Risk-Based Approach (RBA)
for VAs and VASPs, as well as an Interpretative Note for Recommendation 15. This
built upon previous FATF statements by clarifying a RBA for Anti-Money Laundering
and Countering the Financing of Terrorism (โAML/CFTโ) purposes. The basic principle
underlying the FATF Guidelines is that VASPs are expected to "identify, assess,
and take effective action to mitigate their ML/TF risks" with respect to VAs.
'
- "DocumentID: 4 | PassageID: 10.1.1 | Passage: A Person applying to the Regulator\
\ for any of the following designations:\n(a)\tADGM Green Fund;\n(b)\tADGM Climate\
\ Transition Fund;\n(c)\tADGM Green Portfolio;\n(d)\tADGM Climate Transition Portfolio;\n\
(e)\tADGM Green Bond; or\n(f)\tADGM Sustainability Linked Bond\nmust pay to the\
\ Regulator an application fee of $2,000."
- source_sentence: How does the ADGM expect Authorised Persons to incorporate the
eligibility of collateral types into their overall risk management framework,
particularly concerning Islamic finance principles?
sentences:
- 'DocumentID: 17 | PassageID: Schedule 1.Part 2.Chapter 5.42.(2) | Passage: In
determining for the purposes of sub-paragraph โ(1)โ(b) whether Deposits are accepted
only on particular occasions, regard is to be had to the frequency of those occasions
and to any characteristics distinguishing them from each other.'
- "DocumentID: 9 | PassageID: 6.8.5 | Passage: \n(a)\tA Fund Manager of an Islamic\
\ REIT may obtain financing either directly or through its Special Purpose Vehicle\
\ up to 65% of the total gross asset value of the Fund provided that such financing\
\ is provided in a Shari'a-compliant manner.\n(b)\tUpon becoming aware that the\
\ borrowing limit set out in 6.8.5(a) has been exceeded, the Fund Manager shall:\n\
(c)\timmediately inform Unitholders and the Regulator of the details of the breach\
\ and the proposed remedial action;\n(d)\tuse its best endeavours to reduce the\
\ excess borrowings;\n(e)\tnot permit the Fund to engage in additional borrowing;\
\ and\n(f)\tinform Unitholders and the Regulator on a regular basis as to the\
\ progress of the remedial action."
- 'DocumentID: 9 | PassageID: 5.1.1.Guidance.(ii) | Passage: The prudential Category
for Islamic Financial Institutions and other Authorised Persons (acting through
an Islamic Window) undertaking the Regulated Activity of Managing PSIAs (which
may be either a Restricted PSIA or an Unrestricted PSIA) is determined in accordance
with PRU Rule 1.3. An Authorised Person which Manages PSIAs (whether as an Islamic
Financial Institution or through an Islamic Window) must comply with the requirements
in PRU in relation to specific prudential requirements relating to Trading Book
and Non-Trading Book activities, including Credit Risk, Market Risk, Liquidity
Risk and Group Risk.'
- source_sentence: Can you please detail the specific Anti-Money Laundering (AML)
and Countering Financing of Terrorism (CFT) measures and controls that our firm
must have in place when dealing with Spot Commodities as per the FSRA's requirements?
sentences:
- 'DocumentID: 34 | PassageID: 65) | Passage: REGULATORY REQUIREMENTS - SPOT COMMODITY
ACTIVITIES
Sanctions
Pursuant to AML Rule 11.2.1(1), an Authorised Person must have arrangements in
place to ensure that only Spot Commodities that are not subject to sanctions or
associated with an entity in the supply chain that is itself subject to a sanction,
are used as part of its Regulated Activities, or utilised as part of a delivery
and/or storage facility operated by itself (or by any third parties it uses). In
demonstrating compliance with the Rule, an Authorised Person must have powers
to resolve any breach in a timely fashion, such as taking emergency action itself
or by compelling the delivery and/or storage facility to take appropriate action. The
FSRA expects this to include the Authorised Person having the ability to sanction
a Member, market participant or the delivery and/or storage facility for acts
or omissions that compromise compliance with applicable sanctions.
'
- "DocumentID: 18 | PassageID: 3.2 | Passage: Financial Services Permissions. VC\
\ Managers operating in ADGM require a Financial Services Permission ("FSP") to\
\ undertake any Regulated Activity pertaining to VC Funds and/or co-investments\
\ by third parties in VC Funds. The Regulated Activities covered by the FSP will\
\ be dependent on the VC Managersโ investment strategy and business model.\n(a)\t\
Managing a Collective Investment Fund: this includes carrying out fund management\
\ activities in respect of a VC Fund.\n(b)\tAdvising on Investments or Credit\
\ : for VC Managers these activities will be restricted to activities related\
\ to co-investment alongside a VC Fund which the VC Manager manages, such as recommending\
\ that a client invest in an investee company alongside the VC Fund and on the\
\ strategy and structure required to make the investment.\n(c)\tArranging Deals\
\ in Investments: VC Managers may also wish to make arrangements to facilitate\
\ co-investments in the investee company.\nAuthorisation fees and supervision\
\ fees for a VC Manager are capped at USD 10,000 regardless of whether one or\
\ both of the additional Regulated Activities in b) and c) above in relation to\
\ co-investments are included in its FSP. The FSP will include restrictions appropriate\
\ to the business model of a VC Manager."
- 'DocumentID: 24 | PassageID: 3.9 | Passage: Principle 2 – High Standards for Authorisation.
This discerning approach is shown by the FSRA's power to only permit VAs that
it deems "acceptable", as determined by risk factors such as security and traceability,
in order to prevent the build-up of risk from illiquid or immature assets. Additionally,
we do not permit stablecoins based on the algorithmic model of valuation to the
underlying fiat currency.'
- source_sentence: What are the common scenarios or instances where assets and liabilities
are not covered by the bases of accounting in Rule 5.3.2, and how should an Insurer
address these in their reporting?
sentences:
- 'DocumentID: 1 | PassageID: 14.4.1.Guidance.1. | Passage: Relevant Persons are
reminded that in accordance with Federal AML Legislation, Relevant Persons or
any of their Employees must not tip off any Person, that is, inform any Person
that he is being scrutinised, or investigated by any other competent authority,
for possible involvement in suspicious Transactions or activity related to money
laundering or terrorist financing.'
- "DocumentID: 12 | PassageID: 5.3.1.Guidance | Passage: \nThe exceptions provided\
\ in this Chapter relate to the following:\na.\tspecific Rules in respect of certain\
\ assets and liabilities, intended to achieve a regulatory objective not achieved\
\ by application of either or both of the bases of accounting set out in Rule\
\ 5.3.2;\nb.\tassets and liabilities that are not dealt with in either or both\
\ of the bases of accounting set out in Rule 5.3.2; and\nc.\tthe overriding power\
\ of the Regulator, set out in Rule 5.1.6, to require an Insurer to adopt a particular\
\ measurement for a specific asset or liability."
- 'DocumentID: 1 | PassageID: 6.2.1.Guidance.2. | Passage: The risk assessment under
Rule 6.2.1(c) should identify actions to mitigate risks associated with undertaking
NFTF business generally, and the use of eKYC specifically. This is because distinct
risks are often likely to arise where business is conducted entirely in an NFTF
manner, compared to when the business relationship includes a mix of face-to-face
and NFTF interactions. The assessment should make reference to risk mitigation
measures recommended by the Regulator, a competent authority of the U.A.E., FATF,
and other relevant bodies.
'
---
# SentenceTransformer based on jxm/cde-small-v1
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jxm/cde-small-v1](https://huggingface.co/jxm/cde-small-v1) on the csv dataset. It maps sentences & paragraphs to a dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [jxm/cde-small-v1](https://huggingface.co/jxm/cde-small-v1) <!-- at revision 9e2ed1d8d569d34458913d2d246935c1b2324d11 -->
- **Maximum Sequence Length:** Unknown
- **Output Dimensionality:** Unknown
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- csv
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({}) with Transformer model: DatasetTransformer
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("jebish7/cde-small-v1_MNR_3", trust_remote_code=True)  # the cde architecture ships custom code
# Run inference
sentences = [
'What are the common scenarios or instances where assets and liabilities are not covered by the bases of accounting in Rule 5.3.2, and how should an Insurer address these in their reporting?',
'DocumentID: 12 | PassageID: 5.3.1.Guidance | Passage: \nThe exceptions provided in this Chapter relate to the following:\na.\tspecific Rules in respect of certain assets and liabilities, intended to achieve a regulatory objective not achieved by application of either or both of the bases of accounting set out in Rule \u200e5.3.2;\nb.\tassets and liabilities that are not dealt with in either or both of the bases of accounting set out in Rule \u200e5.3.2; and\nc.\tthe overriding power of the Regulator, set out in Rule \u200e5.1.6, to require an Insurer to adopt a particular measurement for a specific asset or liability.',
'DocumentID: 1 | PassageID: 14.4.1.Guidance.1. | Passage: Relevant Persons are reminded that in accordance with Federal AML Legislation, Relevant Persons or any of their Employees must not tip off any Person, that is, inform any Person that he is being scrutinised, or investigated by any other competent authority, for possible involvement in suspicious Transactions or activity related to money laundering or terrorist financing.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### csv
* Dataset: csv
* Size: 29,545 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 16 tokens</li><li>mean: 34.95 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 35 tokens</li><li>mean: 132.0 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>If a financial institution offers Money Remittance as one of its services, under what circumstances is it deemed to be holding Relevant Money and therefore subject to regulatory compliance (a)?</code> | <code>DocumentID: 13 | PassageID: 3.7.1.Guidance.1. | Passage: An Authorised Person is considered to be holding Relevant Money and subject to (a) where it offers Payment Services alongside currency exchange or Money Remittance.<br></code> |
| <code>What are the consequences for a Recognised Body or Authorised Person if they fail to comply with ADGM's requirements regarding severance payments?</code> | <code>DocumentID: 7 | PassageID: APP1.A1.2.Guidance.9. | Passage: Severance payments. Where an Authorised Person or Recognised Body provides discretionary payouts on termination of employment ("severance payments", also called "golden parachutes"), such payment should generally be subject to appropriate limits or shareholder approval. In any case, such payouts should be aligned with the Authorised Person or Recognised Body's overall financial condition and performance over an appropriate time horizon and should not be payable in the case of failure or threatened failure of the Authorised Person or Recognised Body, particularly to an individual whose actions may have contributed to the failure or potential failure of the Authorised Person or Recognised Body.<br></code> |
| <code>If a Public Fund is structured as an Investment Trust, to whom should the Fund Manager report the review findings regarding delegated Regulated Activities or outsourced functions?</code> | <code>DocumentID: 6 | PassageID: PART 5.12.12.8.(1) | Passage: A Fund Manager or the Trustee of a Public Fund, which has delegated any Regulated Activities or outsourced any functions, must conduct a review of the carrying out of the relevant activities or functions by the Service Provider and present the findings of the review to either:<br>(a) the Fund's Governing Body every 6 months at the Fund's board meeting; or<br>(b) in the case of a Fund structured as an Investment Trust, to the Trustee.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
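Putting this loss together with the non-default hyperparameters listed below, the fine-tune could be reproduced roughly as follows. This is a sketch: the csv path and `output_dir` are placeholders, and the dataset is assumed to expose the `anchor`/`positive` columns described above.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("jxm/cde-small-v1", trust_remote_code=True)
train_dataset = load_dataset("csv", data_files="train.csv", split="train")  # anchor, positive columns

loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cos_sim is the default similarity_fct

args = SentenceTransformerTrainingArguments(
    output_dir="cde-small-v1_MNR_3",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids in-batch false negatives from duplicate pairs
)

trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()
```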
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 16
- `learning_rate`: 2e-05
- `warmup_ratio`: 0.1
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.1082 | 100 | 1.9962 |
| 0.2165 | 200 | 1.1626 |
| 0.3247 | 300 | 0.9907 |
| 0.4329 | 400 | 0.8196 |
| 0.5411 | 500 | 0.8082 |
| 0.6494 | 600 | 0.6944 |
| 0.7576 | 700 | 0.6559 |
| 0.8658 | 800 | 0.6242 |
| 0.9740 | 900 | 0.6299 |
| 1.0823 | 1000 | 0.6051 |
| 1.1905 | 1100 | 0.5670 |
| 1.2987 | 1200 | 0.4679 |
| 1.4069 | 1300 | 0.3443 |
| 1.5152 | 1400 | 0.3356 |
| 1.6234 | 1500 | 0.2958 |
| 1.7316 | 1600 | 0.2540 |
| 1.8398 | 1700 | 0.2694 |
| 1.9481 | 1800 | 0.2497 |
| 2.0563 | 1900 | 0.2671 |
| 2.1645 | 2000 | 0.2558 |
| 2.2727 | 2100 | 0.1943 |
| 2.3810 | 2200 | 0.1242 |
| 2.4892 | 2300 | 0.1160 |
| 2.5974 | 2400 | 0.1081 |
| 2.7056 | 2500 | 0.1056 |
| 2.8139 | 2600 | 0.1070 |
| 2.9221 | 2700 | 0.1154 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.4.0
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.20.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
nuyyep81/results | nuyyep81 | 2024-11-25T09:04:27Z | 176 | 0 | transformers | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-base",
"base_model:finetune:google-t5/t5-base",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-25T08:38:06Z | ---
library_name: transformers
license: apache-2.0
base_model: google-t5/t5-base
tags:
- generated_from_trainer
model-index:
- name: results
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [google-t5/t5-base](https://huggingface.co/google-t5/t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4484
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
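Expressed as `transformers` training arguments, the configuration above corresponds to roughly the following sketch; the `output_dir` value is guessed from the model name, and `fp16=True` stands in for Native AMP:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",          # assumed from the model name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    fp16=True,                     # "Native AMP" mixed precision
)
```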
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 352 | 1.4879 |
| 1.8923 | 2.0 | 704 | 1.4566 |
| 1.5369 | 3.0 | 1056 | 1.4484 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
mradermacher/BioinspiredLLM-GGUF | mradermacher | 2024-11-25T09:00:52Z | 138 | 1 | transformers | [
"transformers",
"gguf",
"biology",
"materials science",
"code",
"scientific AI",
"biological materials",
"bioinspiration",
"machine learning",
"generative",
"en",
"base_model:lamm-mit/BioinspiredLLM",
"base_model:quantized:lamm-mit/BioinspiredLLM",
"endpoints_compatible",
"region:us"
] | null | 2024-11-22T23:58:59Z | ---
base_model: lamm-mit/BioinspiredLLM
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- biology
- materials science
- code
- scientific AI
- biological materials
- bioinspiration
- machine learning
- generative
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/lamm-mit/BioinspiredLLM
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
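As one concrete option (an assumption: any GGUF-capable runtime would do), here is a minimal `llama-cpp-python` sketch using the Q4_K_M file from the table below:
```python
# A minimal sketch, assuming llama-cpp-python is installed and the
# Q4_K_M quant from the table below has been downloaded locally.
from llama_cpp import Llama

llm = Llama(model_path="BioinspiredLLM.Q4_K_M.gguf", n_ctx=4096)
out = llm("What makes nacre's brick-and-mortar structure so tough?",
          max_tokens=128)
print(out["choices"][0]["text"])
```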
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q2_K.gguf) | Q2_K | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q3_K_S.gguf) | Q3_K_S | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q3_K_M.gguf) | Q3_K_M | 6.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q3_K_L.gguf) | Q3_K_L | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.IQ4_XS.gguf) | IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q4_K_S.gguf) | Q4_K_S | 7.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q4_K_M.gguf) | Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q5_K_S.gguf) | Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q5_K_M.gguf) | Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q6_K.gguf) | Q6_K | 10.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-GGUF/resolve/main/BioinspiredLLM.Q8_0.gguf) | Q8_0 | 13.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF | mradermacher | 2024-11-25T09:00:29Z | 9 | 1 | transformers | [
"transformers",
"gguf",
"chocolatine",
"fr",
"en",
"dataset:jpacifico/french-orca-dpo-pairs-revised",
"base_model:jpacifico/Chocolatine-32B-Instruct-DPO-v1.2",
"base_model:quantized:jpacifico/Chocolatine-32B-Instruct-DPO-v1.2",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T00:20:45Z | ---
base_model: jpacifico/Chocolatine-32B-Instruct-DPO-v1.2
datasets:
- jpacifico/french-orca-dpo-pairs-revised
language:
- fr
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- chocolatine
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/jpacifico/Chocolatine-32B-Instruct-DPO-v1.2
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ1_S.gguf) | i1-IQ1_S | 7.4 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ1_M.gguf) | i1-IQ1_M | 8.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ2_XS.gguf) | i1-IQ2_XS | 10.1 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ2_S.gguf) | i1-IQ2_S | 10.5 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ2_M.gguf) | i1-IQ2_M | 11.4 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q2_K.gguf) | i1-Q2_K | 12.4 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 12.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ3_XS.gguf) | i1-IQ3_XS | 13.8 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q3_K_S.gguf) | i1-Q3_K_S | 14.5 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ3_S.gguf) | i1-IQ3_S | 14.5 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ3_M.gguf) | i1-IQ3_M | 14.9 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q3_K_M.gguf) | i1-Q3_K_M | 16.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q3_K_L.gguf) | i1-Q3_K_L | 17.3 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-IQ4_XS.gguf) | i1-IQ4_XS | 17.8 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q4_0.gguf) | i1-Q4_0 | 18.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q4_K_S.gguf) | i1-Q4_K_S | 18.9 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q4_K_M.gguf) | i1-Q4_K_M | 20.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q5_K_S.gguf) | i1-Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q5_K_M.gguf) | i1-Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/Chocolatine-32B-Instruct-DPO-v1.2-i1-GGUF/resolve/main/Chocolatine-32B-Instruct-DPO-v1.2.i1-Q6_K.gguf) | i1-Q6_K | 27.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/ECE-TW3-JRGL-V5-GGUF | mradermacher | 2024-11-25T09:00:23Z | 28 | 1 | transformers | [
"transformers",
"gguf",
"merge",
"mergekit",
"lazymergekit",
"davidkim205/Rhea-72b-v0.5",
"abacusai/Smaug-72B-v0.1",
"en",
"base_model:MatthieuJ/ECE-TW3-JRGL-V5",
"base_model:quantized:MatthieuJ/ECE-TW3-JRGL-V5",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T01:16:48Z | ---
base_model: MatthieuJ/ECE-TW3-JRGL-V5
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- merge
- mergekit
- lazymergekit
- davidkim205/Rhea-72b-v0.5
- abacusai/Smaug-72B-v0.1
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/MatthieuJ/ECE-TW3-JRGL-V5
<!-- provided-files -->
weighted/imatrix quants do not appear to be available from me at this time. If they do not show up within a week or so of the static ones, I probably have not planned them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
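For the split files in the table below, here is a minimal sketch of fetching and joining the parts (plain `cat file.part1of2 file.part2of2 > file.gguf` works just as well; the Q6_K file names are taken from the table):
```python
# A minimal sketch: download both parts of the split Q6_K file and
# join them into one GGUF (equivalent to running `cat` on the parts).
from huggingface_hub import hf_hub_download

repo = "mradermacher/ECE-TW3-JRGL-V5-GGUF"
parts = [
    hf_hub_download(repo, f"ECE-TW3-JRGL-V5.Q6_K.gguf.part{i}of2")
    for i in (1, 2)
]
with open("ECE-TW3-JRGL-V5.Q6_K.gguf", "wb") as out:
    for path in parts:
        with open(path, "rb") as part:
            while chunk := part.read(1 << 20):  # copy 1 MiB at a time
                out.write(chunk)
```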
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q2_K.gguf) | Q2_K | 27.2 | |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q3_K_S.gguf) | Q3_K_S | 31.7 | |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q3_K_M.gguf) | Q3_K_M | 35.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q3_K_L.gguf) | Q3_K_L | 38.6 | |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.IQ4_XS.gguf) | IQ4_XS | 39.2 | |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q4_K_S.gguf) | Q4_K_S | 41.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q4_K_M.gguf) | Q4_K_M | 43.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q5_K_S.gguf) | Q5_K_S | 50.0 | |
| [PART 1](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q5_K_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q5_K_M.gguf.part2of2) | Q5_K_M | 51.4 | |
| [PART 1](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q6_K.gguf.part2of2) | Q6_K | 59.4 | very good quality |
| [PART 1](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q8_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/ECE-TW3-JRGL-V5-GGUF/resolve/main/ECE-TW3-JRGL-V5.Q8_0.gguf.part2of2) | Q8_0 | 76.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF | mradermacher | 2024-11-25T09:00:17Z | 38 | 1 | transformers | [
"transformers",
"gguf",
"en",
"base_model:Deev124/hermes-llama3-roleplay-3500-v1",
"base_model:quantized:Deev124/hermes-llama3-roleplay-3500-v1",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T01:28:41Z | ---
base_model: Deev124/hermes-llama3-roleplay-3500-v1
language:
- en
library_name: transformers
quantized_by: mradermacher
tags: []
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/Deev124/hermes-llama3-roleplay-3500-v1
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ1_S.gguf) | i1-IQ1_S | 2.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ1_M.gguf) | i1-IQ1_M | 2.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.5 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ2_S.gguf) | i1-IQ2_S | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ2_M.gguf) | i1-IQ2_M | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q2_K.gguf) | i1-Q2_K | 3.3 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ3_S.gguf) | i1-IQ3_S | 3.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ3_M.gguf) | i1-IQ3_M | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.1 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.4 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 4.8 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.8 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.8 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q4_0.gguf) | i1-Q4_0 | 4.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.7 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/hermes-llama3-roleplay-3500-v1-i1-GGUF/resolve/main/hermes-llama3-roleplay-3500-v1.i1-Q6_K.gguf) | i1-Q6_K | 6.7 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/blossom-v5.1-34b-i1-GGUF | mradermacher | 2024-11-25T08:59:58Z | 347 | 1 | transformers | [
"transformers",
"gguf",
"zh",
"en",
"dataset:Azure99/blossom-chat-v3",
"dataset:Azure99/blossom-math-v4",
"dataset:Azure99/blossom-wizard-v3",
"dataset:Azure99/blossom-orca-v3",
"base_model:Azure99/blossom-v5.1-34b",
"base_model:quantized:Azure99/blossom-v5.1-34b",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T03:42:15Z | ---
base_model: Azure99/blossom-v5.1-34b
datasets:
- Azure99/blossom-chat-v3
- Azure99/blossom-math-v4
- Azure99/blossom-wizard-v3
- Azure99/blossom-orca-v3
language:
- zh
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/Azure99/blossom-v5.1-34b
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/blossom-v5.1-34b-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ1_S.gguf) | i1-IQ1_S | 7.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ1_M.gguf) | i1-IQ1_M | 8.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 9.4 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ2_XS.gguf) | i1-IQ2_XS | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ2_S.gguf) | i1-IQ2_S | 11.0 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ2_M.gguf) | i1-IQ2_M | 11.9 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q2_K.gguf) | i1-Q2_K | 12.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 13.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ3_XS.gguf) | i1-IQ3_XS | 14.3 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q3_K_S.gguf) | i1-Q3_K_S | 15.1 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ3_S.gguf) | i1-IQ3_S | 15.1 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ3_M.gguf) | i1-IQ3_M | 15.7 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q3_K_M.gguf) | i1-Q3_K_M | 16.8 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q3_K_L.gguf) | i1-Q3_K_L | 18.2 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-IQ4_XS.gguf) | i1-IQ4_XS | 18.6 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q4_0.gguf) | i1-Q4_0 | 19.6 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q4_K_S.gguf) | i1-Q4_K_S | 19.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q4_K_M.gguf) | i1-Q4_K_M | 20.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q5_K_S.gguf) | i1-Q5_K_S | 23.8 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q5_K_M.gguf) | i1-Q5_K_M | 24.4 | |
| [GGUF](https://huggingface.co/mradermacher/blossom-v5.1-34b-i1-GGUF/resolve/main/blossom-v5.1-34b.i1-Q6_K.gguf) | i1-Q6_K | 28.3 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Ice0.41-22.11-RP-GGUF | mradermacher | 2024-11-25T08:59:30Z | 24 | 2 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:icefog72/Ice0.41-22.11-RP",
"base_model:quantized:icefog72/Ice0.41-22.11-RP",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T04:28:09Z | ---
base_model: icefog72/Ice0.41-22.11-RP
language:
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/icefog72/Ice0.41-22.11-RP
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Ice0.41-22.11-RP-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Ice0.41-22.11-RP-GGUF/resolve/main/Ice0.41-22.11-RP.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF | mradermacher | 2024-11-25T08:59:05Z | 13 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:win10/EVA-Meissa-Coder-14B-Instruct",
"base_model:quantized:win10/EVA-Meissa-Coder-14B-Instruct",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T05:10:49Z | ---
base_model: win10/EVA-Meissa-Coder-14B-Instruct
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/win10/EVA-Meissa-Coder-14B-Instruct
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q2_K.gguf) | Q2_K | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q3_K_S.gguf) | Q3_K_S | 6.8 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q3_K_M.gguf) | Q3_K_M | 7.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q3_K_L.gguf) | Q3_K_L | 8.0 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.IQ4_XS.gguf) | IQ4_XS | 8.3 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q4_0_4_4.gguf) | Q4_0_4_4 | 8.6 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q4_K_S.gguf) | Q4_K_S | 8.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q4_K_M.gguf) | Q4_K_M | 9.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q5_K_S.gguf) | Q5_K_S | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q5_K_M.gguf) | Q5_K_M | 10.6 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q6_K.gguf) | Q6_K | 12.2 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.Q8_0.gguf) | Q8_0 | 15.8 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
MakiAi/Llama-3-2-3B-Instruct-bnb-4bit-OKU-v1-1epochs_GGUF | MakiAi | 2024-11-25T08:58:44Z | 13 | 0 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"base_model:quantized:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T08:58:01Z | ---
base_model: unsloth/Llama-3.2-3B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** MakiAi
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Llama-3.2-3B-Instruct-bnb-4bit
This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
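For context, here is a minimal sketch of the kind of Unsloth setup this implies (the LoRA rank, alpha, and target modules below are illustrative assumptions, not the author's recorded configuration):
```python
# A minimal Unsloth QLoRA sketch; r/lora_alpha/target_modules are
# illustrative assumptions, not the author's actual settings.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
# Fine-tuning would then proceed with TRL's SFTTrainer as usual.
```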
|
mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF | mradermacher | 2024-11-25T08:58:36Z | 13 | 1 | transformers | [
"transformers",
"gguf",
"generated_from_trainer",
"en",
"dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal",
"dataset:Nopm/Opus_WritingStruct",
"dataset:Gryphe/Sonnet3.5-SlimOrcaDedupCleaned",
"dataset:Gryphe/Sonnet3.5-Charcard-Roleplay",
"dataset:Gryphe/ChatGPT-4o-Writing-Prompts",
"dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
"dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
"dataset:nothingiisreal/Reddit-Dirty-And-WritingPrompts",
"dataset:allura-org/Celeste-1.x-data-mixture",
"dataset:cognitivecomputations/dolphin-2.9.3",
"base_model:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"base_model:quantized:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T07:28:40Z | ---
base_model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
datasets:
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- Nopm/Opus_WritingStruct
- Gryphe/Sonnet3.5-SlimOrcaDedupCleaned
- Gryphe/Sonnet3.5-Charcard-Roleplay
- Gryphe/ChatGPT-4o-Writing-Prompts
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- nothingiisreal/Reddit-Dirty-And-WritingPrompts
- allura-org/Celeste-1.x-data-mixture
- cognitivecomputations/dolphin-2.9.3
language:
- en
library_name: transformers
license: other
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
license_name: qwen
quantized_by: mradermacher
tags:
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q2_K.gguf) | Q2_K | 29.9 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q3_K_S.gguf) | Q3_K_S | 34.6 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q3_K_M.gguf) | Q3_K_M | 37.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q3_K_L.gguf) | Q3_K_L | 39.6 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.IQ4_XS.gguf) | IQ4_XS | 40.3 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q4_K_S.gguf) | Q4_K_S | 44.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q4_K_M.gguf) | Q4_K_M | 47.5 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q5_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q5_K_S.gguf.part2of2) | Q5_K_S | 51.5 | |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q5_K_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q5_K_M.gguf.part2of2) | Q5_K_M | 54.5 | |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q6_K.gguf.part2of2) | Q6_K | 64.4 | very good quality |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q8_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.Q8_0.gguf.part2of2) | Q8_0 | 77.4 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF | mradermacher | 2024-11-25T08:58:23Z | 42 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:win10/EVA-Meissa-big-pro-v2",
"base_model:quantized:win10/EVA-Meissa-big-pro-v2",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T08:16:24Z | ---
base_model: win10/EVA-Meissa-big-pro-v2
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/win10/EVA-Meissa-big-pro-v2
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ1_S.gguf) | i1-IQ1_S | 5.2 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ1_M.gguf) | i1-IQ1_M | 5.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 6.3 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ2_XS.gguf) | i1-IQ2_XS | 6.9 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ2_S.gguf) | i1-IQ2_S | 7.3 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ2_M.gguf) | i1-IQ2_M | 7.9 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q2_K.gguf) | i1-Q2_K | 8.5 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 8.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ3_XS.gguf) | i1-IQ3_XS | 9.4 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q3_K_S.gguf) | i1-Q3_K_S | 9.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ3_S.gguf) | i1-IQ3_S | 9.9 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ3_M.gguf) | i1-IQ3_M | 10.2 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q3_K_M.gguf) | i1-Q3_K_M | 10.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q3_K_L.gguf) | i1-Q3_K_L | 11.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-IQ4_XS.gguf) | i1-IQ4_XS | 12.0 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q4_0.gguf) | i1-Q4_0 | 12.7 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q4_K_S.gguf) | i1-Q4_K_S | 12.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q4_K_M.gguf) | i1-Q4_K_M | 13.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q5_K_S.gguf) | i1-Q5_K_S | 15.3 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q5_K_M.gguf) | i1-Q5_K_M | 15.7 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-big-pro-v2-i1-GGUF/resolve/main/EVA-Meissa-big-pro-v2.i1-Q6_K.gguf) | i1-Q6_K | 18.1 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/BioinspiredLLM-i1-GGUF | mradermacher | 2024-11-25T08:57:49Z | 202 | 1 | transformers | [
"transformers",
"gguf",
"biology",
"materials science",
"code",
"scientific AI",
"biological materials",
"bioinspiration",
"machine learning",
"generative",
"en",
"base_model:lamm-mit/BioinspiredLLM",
"base_model:quantized:lamm-mit/BioinspiredLLM",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-23T08:42:52Z | ---
base_model: lamm-mit/BioinspiredLLM
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- biology
- materials science
- code
- scientific AI
- biological materials
- bioinspiration
- machine learning
- generative
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/lamm-mit/BioinspiredLLM
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/BioinspiredLLM-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ1_S.gguf) | i1-IQ1_S | 3.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ1_M.gguf) | i1-IQ1_M | 3.2 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ2_S.gguf) | i1-IQ2_S | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ2_M.gguf) | i1-IQ2_M | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q2_K.gguf) | i1-Q2_K | 5.0 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 5.1 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ3_XS.gguf) | i1-IQ3_XS | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ3_S.gguf) | i1-IQ3_S | 5.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q3_K_S.gguf) | i1-Q3_K_S | 5.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ3_M.gguf) | i1-IQ3_M | 6.1 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q3_K_L.gguf) | i1-Q3_K_L | 7.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-IQ4_XS.gguf) | i1-IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 7.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 7.5 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 7.5 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q4_0.gguf) | i1-Q4_0 | 7.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.5 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q4_K_M.gguf) | i1-Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q5_K_S.gguf) | i1-Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q5_K_M.gguf) | i1-Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/BioinspiredLLM-i1-GGUF/resolve/main/BioinspiredLLM.i1-Q6_K.gguf) | i1-Q6_K | 10.8 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF | mradermacher | 2024-11-25T08:57:45Z | 29 | 1 | transformers | [
"transformers",
"gguf",
"merge",
"mergekit",
"lazymergekit",
"ChaoticNeutrals/Eris_Remix_7B",
"Virt-io/Erebus-Holodeck-7B",
"jeiku/Eros_Prodigadigm_7B",
"Epiculous/Mika-7B",
"en",
"base_model:weezywitasneezy/OxytocinErosEngineering_v0-4x7B-passthrough",
"base_model:quantized:weezywitasneezy/OxytocinErosEngineering_v0-4x7B-passthrough",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T08:43:07Z | ---
base_model: weezywitasneezy/OxytocinErosEngineering_v0-4x7B-passthrough
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- merge
- mergekit
- lazymergekit
- ChaoticNeutrals/Eris_Remix_7B
- Virt-io/Erebus-Holodeck-7B
- jeiku/Eros_Prodigadigm_7B
- Epiculous/Mika-7B
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/weezywitasneezy/OxytocinErosEngineering_v0-4x7B-passthrough
<!-- provided-files -->
weighted/imatrix quants do not appear to be available from me at this time. If they do not show up within a week or so of the static ones, I probably have not planned them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q2_K.gguf) | Q2_K | 6.7 | |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q3_K_S.gguf) | Q3_K_S | 7.9 | |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q3_K_M.gguf) | Q3_K_M | 8.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q3_K_L.gguf) | Q3_K_L | 9.4 | |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.IQ4_XS.gguf) | IQ4_XS | 9.7 | |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q4_K_S.gguf) | Q4_K_S | 10.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q4_K_M.gguf) | Q4_K_M | 10.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q5_K_S.gguf) | Q5_K_S | 12.3 | |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q5_K_M.gguf) | Q5_K_M | 12.6 | |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q6_K.gguf) | Q6_K | 14.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/OxytocinErosEngineering_v0-4x7B-passthrough-GGUF/resolve/main/OxytocinErosEngineering_v0-4x7B-passthrough.Q8_0.gguf) | Q8_0 | 18.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF | mradermacher | 2024-11-25T08:57:36Z | 67 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"en",
"base_model:gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0",
"base_model:quantized:gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T08:43:54Z | ---
base_model: gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0
language:
- ko
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.IQ4_XS.gguf) | IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q4_K_S.gguf) | Q4_K_S | 6.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q5_K_S.gguf) | Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q5_K_M.gguf) | Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q6_K.gguf) | Q6_K | 8.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.Q8_0.gguf) | Q8_0 | 11.5 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.f16.gguf) | f16 | 21.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF | mradermacher | 2024-11-25T08:57:30Z | 123 | 2 | transformers | [
"transformers",
"gguf",
"generated_from_trainer",
"en",
"dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal",
"dataset:Nopm/Opus_WritingStruct",
"dataset:Gryphe/Sonnet3.5-SlimOrcaDedupCleaned",
"dataset:Gryphe/Sonnet3.5-Charcard-Roleplay",
"dataset:Gryphe/ChatGPT-4o-Writing-Prompts",
"dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
"dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
"dataset:nothingiisreal/Reddit-Dirty-And-WritingPrompts",
"dataset:allura-org/Celeste-1.x-data-mixture",
"dataset:cognitivecomputations/dolphin-2.9.3",
"base_model:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"base_model:quantized:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"license:other",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T09:13:55Z | ---
base_model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
datasets:
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- Nopm/Opus_WritingStruct
- Gryphe/Sonnet3.5-SlimOrcaDedupCleaned
- Gryphe/Sonnet3.5-Charcard-Roleplay
- Gryphe/ChatGPT-4o-Writing-Prompts
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- nothingiisreal/Reddit-Dirty-And-WritingPrompts
- allura-org/Celeste-1.x-data-mixture
- cognitivecomputations/dolphin-2.9.3
language:
- en
library_name: transformers
license: other
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
license_name: qwen
quantized_by: mradermacher
tags:
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
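Several of the larger quants below (Q5_K_S and up) are split into `.partXofY` files that must be concatenated byte-for-byte before loading. A minimal sketch, assuming the `huggingface_hub` package is installed:
```python
# Sketch: fetch both parts of the i1-Q5_K_S quant and join them into a
# single GGUF file; plain byte concatenation is all that is required.
import shutil
from huggingface_hub import hf_hub_download

repo = "mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF"
parts = [
    hf_hub_download(repo, f"EVA-Qwen2.5-72B-v0.2.i1-Q5_K_S.gguf.part{i}of2")
    for i in (1, 2)
]
with open("EVA-Qwen2.5-72B-v0.2.i1-Q5_K_S.gguf", "wb") as dst:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, dst)
```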
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ1_S.gguf) | i1-IQ1_S | 22.8 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ1_M.gguf) | i1-IQ1_M | 23.8 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 25.6 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ2_XS.gguf) | i1-IQ2_XS | 27.2 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ2_S.gguf) | i1-IQ2_S | 28.0 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ2_M.gguf) | i1-IQ2_M | 29.4 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q2_K.gguf) | i1-Q2_K | 29.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 31.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ3_XS.gguf) | i1-IQ3_XS | 32.9 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ3_S.gguf) | i1-IQ3_S | 34.6 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q3_K_S.gguf) | i1-Q3_K_S | 34.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ3_M.gguf) | i1-IQ3_M | 35.6 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q3_K_M.gguf) | i1-Q3_K_M | 37.8 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q3_K_L.gguf) | i1-Q3_K_L | 39.6 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-IQ4_XS.gguf) | i1-IQ4_XS | 39.8 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q4_0.gguf) | i1-Q4_0 | 41.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q4_K_S.gguf) | i1-Q4_K_S | 44.0 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q4_K_M.gguf) | i1-Q4_K_M | 47.5 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q5_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q5_K_S.gguf.part2of2) | i1-Q5_K_S | 51.5 | |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q5_K_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q5_K_M.gguf.part2of2) | i1-Q5_K_M | 54.5 | |
| [PART 1](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/EVA-Qwen2.5-72B-v0.2-i1-GGUF/resolve/main/EVA-Qwen2.5-72B-v0.2.i1-Q6_K.gguf.part2of2) | i1-Q6_K | 64.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/mixtral-4x7b_slerp-GGUF | mradermacher | 2024-11-25T08:57:19Z | 14 | 1 | transformers | [
"transformers",
"gguf",
"en",
"base_model:isemmanuelolowe/mixtral-4x7b_slerp",
"base_model:quantized:isemmanuelolowe/mixtral-4x7b_slerp",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T10:34:25Z | ---
base_model: isemmanuelolowe/mixtral-4x7b_slerp
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/isemmanuelolowe/mixtral-4x7b_slerp
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not show up within a week or so of the static ones, I have probably not planned for them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q2_K.gguf) | Q2_K | 8.9 | |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q3_K_S.gguf) | Q3_K_S | 10.5 | |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q3_K_M.gguf) | Q3_K_M | 11.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q3_K_L.gguf) | Q3_K_L | 12.6 | |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.IQ4_XS.gguf) | IQ4_XS | 13.1 | |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q4_K_S.gguf) | Q4_K_S | 13.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q4_K_M.gguf) | Q4_K_M | 14.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q5_K_S.gguf) | Q5_K_S | 16.7 | |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q5_K_M.gguf) | Q5_K_M | 17.2 | |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q6_K.gguf) | Q6_K | 19.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/mixtral-4x7b_slerp-GGUF/resolve/main/mixtral-4x7b_slerp.Q8_0.gguf) | Q8_0 | 25.8 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/llama-2-13b-cf-GGUF | mradermacher | 2024-11-25T08:56:53Z | 5 | 1 | transformers | [
"transformers",
"gguf",
"en",
"base_model:iestynmullinor/llama-2-13b-cf",
"base_model:quantized:iestynmullinor/llama-2-13b-cf",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T12:50:07Z | ---
base_model: iestynmullinor/llama-2-13b-cf
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/iestynmullinor/llama-2-13b-cf
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not show up within a week or so of the static ones, I have probably not planned for them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q2_K.gguf) | Q2_K | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q3_K_S.gguf) | Q3_K_S | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q3_K_M.gguf) | Q3_K_M | 6.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q3_K_L.gguf) | Q3_K_L | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.IQ4_XS.gguf) | IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q4_K_S.gguf) | Q4_K_S | 7.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q4_K_M.gguf) | Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q5_K_S.gguf) | Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q5_K_M.gguf) | Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q6_K.gguf) | Q6_K | 10.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-GGUF/resolve/main/llama-2-13b-cf.Q8_0.gguf) | Q8_0 | 13.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF | mradermacher | 2024-11-25T08:56:46Z | 24 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"en",
"base_model:KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0",
"base_model:quantized:KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T13:25:35Z | ---
base_model: KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0
language:
- ko
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/KBNIT/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ1_S.gguf) | i1-IQ1_S | 2.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ1_M.gguf) | i1-IQ1_M | 2.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ2_XS.gguf) | i1-IQ2_XS | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ2_S.gguf) | i1-IQ2_S | 3.5 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ2_M.gguf) | i1-IQ2_M | 3.8 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q2_K.gguf) | i1-Q2_K | 4.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 4.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ3_XS.gguf) | i1-IQ3_XS | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ3_S.gguf) | i1-IQ3_S | 4.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ3_M.gguf) | i1-IQ3_M | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q3_K_M.gguf) | i1-Q3_K_M | 5.3 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q3_K_L.gguf) | i1-Q3_K_L | 5.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-IQ4_XS.gguf) | i1-IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 6.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 6.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q4_0.gguf) | i1-Q4_0 | 6.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q4_K_S.gguf) | i1-Q4_K_S | 6.3 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q4_K_M.gguf) | i1-Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q5_K_S.gguf) | i1-Q5_K_S | 7.6 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q5_K_M.gguf) | i1-Q5_K_M | 7.8 | |
| [GGUF](https://huggingface.co/mradermacher/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0-i1-GGUF/resolve/main/KoSOLAR-10.7B-QLoRA-NEFTune-kolon-v2.0.i1-Q6_K.gguf) | i1-Q6_K | 9.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/llama-2-13b-cf-ds-GGUF | mradermacher | 2024-11-25T08:56:41Z | 8 | 1 | transformers | [
"transformers",
"gguf",
"en",
"base_model:iestynmullinor/llama-2-13b-cf-ds",
"base_model:quantized:iestynmullinor/llama-2-13b-cf-ds",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T13:56:00Z | ---
base_model: iestynmullinor/llama-2-13b-cf-ds
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/iestynmullinor/llama-2-13b-cf-ds
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not show up within a week or so of the static ones, I have probably not planned for them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q2_K.gguf) | Q2_K | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q3_K_S.gguf) | Q3_K_S | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q3_K_M.gguf) | Q3_K_M | 6.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q3_K_L.gguf) | Q3_K_L | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.IQ4_XS.gguf) | IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q4_K_S.gguf) | Q4_K_S | 7.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q4_K_M.gguf) | Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q5_K_S.gguf) | Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q5_K_M.gguf) | Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q6_K.gguf) | Q6_K | 10.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-ds-GGUF/resolve/main/llama-2-13b-cf-ds.Q8_0.gguf) | Q8_0 | 13.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF | mradermacher | 2024-11-25T08:56:31Z | 11 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"en",
"base_model:gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0",
"base_model:quantized:gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T15:12:29Z | ---
base_model: gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0
language:
- ko
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/gwonny/nox-solar-10.7b-v4-kolon-all-5-v3.0
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ1_S.gguf) | i1-IQ1_S | 2.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ1_M.gguf) | i1-IQ1_M | 2.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ2_XS.gguf) | i1-IQ2_XS | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ2_S.gguf) | i1-IQ2_S | 3.5 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ2_M.gguf) | i1-IQ2_M | 3.8 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q2_K.gguf) | i1-Q2_K | 4.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 4.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ3_XS.gguf) | i1-IQ3_XS | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ3_S.gguf) | i1-IQ3_S | 4.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ3_M.gguf) | i1-IQ3_M | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q3_K_M.gguf) | i1-Q3_K_M | 5.3 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q3_K_L.gguf) | i1-Q3_K_L | 5.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-IQ4_XS.gguf) | i1-IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 6.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 6.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q4_0.gguf) | i1-Q4_0 | 6.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q4_K_S.gguf) | i1-Q4_K_S | 6.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q4_K_M.gguf) | i1-Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q5_K_S.gguf) | i1-Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q5_K_M.gguf) | i1-Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-5-v3.0-i1-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-5-v3.0.i1-Q6_K.gguf) | i1-Q6_K | 8.9 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF | mradermacher | 2024-11-25T08:56:27Z | 15 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"base_model:juengsi/DT-EQ-SOLAR-10.7B-v0.1",
"base_model:quantized:juengsi/DT-EQ-SOLAR-10.7B-v0.1",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T16:30:10Z | ---
base_model: juengsi/DT-EQ-SOLAR-10.7B-v0.1
language:
- ko
library_name: transformers
license: cc-by-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/juengsi/DT-EQ-SOLAR-10.7B-v0.1
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.IQ4_XS.gguf) | IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q4_K_S.gguf) | Q4_K_S | 6.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q5_K_S.gguf) | Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q5_K_M.gguf) | Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q6_K.gguf) | Q6_K | 8.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.Q8_0.gguf) | Q8_0 | 11.5 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/DT-EQ-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-EQ-SOLAR-10.7B-v0.1.f16.gguf) | f16 | 21.6 | 16 bpw, overkill |
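As a rough sanity check on the Size/GB column, bits per weight is approximately file size × 8 / parameter count; for the f16 row above, that works out as in this small sketch (the 10.7B parameter count is taken from the model name):
```python
# Rough bits-per-weight check for the f16 row (sizes in decimal GB).
size_gb, params_b = 21.6, 10.7
print(f"{size_gb * 8 / params_b:.1f} bpw")  # ~16.1 bpw, i.e. full f16
```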
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/occiglot-10b-de-en-instruct-GGUF | mradermacher | 2024-11-25T08:56:23Z | 7 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:mayflowergmbh/occiglot-10b-de-en-instruct",
"base_model:quantized:mayflowergmbh/occiglot-10b-de-en-instruct",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T16:57:11Z | ---
base_model: mayflowergmbh/occiglot-10b-de-en-instruct
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/mayflowergmbh/occiglot-10b-de-en-instruct
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not show up within a week or so of the static ones, I have probably not planned for them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q2_K.gguf) | Q2_K | 3.8 | |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q3_K_S.gguf) | Q3_K_S | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q3_K_M.gguf) | Q3_K_M | 4.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q3_K_L.gguf) | Q3_K_L | 5.3 | |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.IQ4_XS.gguf) | IQ4_XS | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q4_0_4_4.gguf) | Q4_0_4_4 | 5.7 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q4_K_S.gguf) | Q4_K_S | 5.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q4_K_M.gguf) | Q4_K_M | 6.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q5_K_S.gguf) | Q5_K_S | 6.9 | |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q5_K_M.gguf) | Q5_K_M | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q6_K.gguf) | Q6_K | 8.2 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.Q8_0.gguf) | Q8_0 | 10.6 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/occiglot-10b-de-en-instruct-GGUF/resolve/main/occiglot-10b-de-en-instruct.f16.gguf) | f16 | 19.8 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF | mradermacher | 2024-11-25T08:56:19Z | 42 | 1 | transformers | [
"transformers",
"gguf",
"llama-2",
"code",
"base_model:opencsg/opencsg-CodeLlama-34b-v0.1",
"base_model:quantized:opencsg/opencsg-CodeLlama-34b-v0.1",
"license:llama2",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T17:27:45Z | ---
base_model: opencsg/opencsg-CodeLlama-34b-v0.1
language:
- code
library_name: transformers
license: llama2
quantized_by: mradermacher
tags:
- llama-2
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/opencsg/opencsg-CodeLlama-34b-v0.1
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q2_K.gguf) | Q2_K | 12.6 | |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q3_K_S.gguf) | Q3_K_S | 14.7 | |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q3_K_M.gguf) | Q3_K_M | 16.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q3_K_L.gguf) | Q3_K_L | 17.9 | |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.IQ4_XS.gguf) | IQ4_XS | 18.3 | |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q4_K_S.gguf) | Q4_K_S | 19.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q4_K_M.gguf) | Q4_K_M | 20.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q5_K_S.gguf) | Q5_K_S | 23.3 | |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q5_K_M.gguf) | Q5_K_M | 23.9 | |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q6_K.gguf) | Q6_K | 27.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/opencsg-CodeLlama-34b-v0.1-GGUF/resolve/main/opencsg-CodeLlama-34b-v0.1.Q8_0.gguf) | Q8_0 | 36.0 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF | mradermacher | 2024-11-25T08:56:15Z | 55 | 1 | transformers | [
"transformers",
"gguf",
"merge",
"wizardlm",
"mique",
"en",
"base_model:MaziyarPanahi/WizardLM-Math-70B-v0.1",
"base_model:quantized:MaziyarPanahi/WizardLM-Math-70B-v0.1",
"license:agpl-3.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-23T17:31:48Z | ---
base_model: MaziyarPanahi/WizardLM-Math-70B-v0.1
language:
- en
library_name: transformers
license: agpl-3.0
quantized_by: mradermacher
tags:
- merge
- wizardlm
- mique
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-v0.1
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ1_S.gguf) | i1-IQ1_S | 14.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ1_M.gguf) | i1-IQ1_M | 16.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 18.4 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ2_XS.gguf) | i1-IQ2_XS | 20.4 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ2_S.gguf) | i1-IQ2_S | 21.5 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ2_M.gguf) | i1-IQ2_M | 23.3 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q2_K.gguf) | i1-Q2_K | 25.6 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 26.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ3_XS.gguf) | i1-IQ3_XS | 28.4 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ3_S.gguf) | i1-IQ3_S | 30.0 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q3_K_S.gguf) | i1-Q3_K_S | 30.0 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ3_M.gguf) | i1-IQ3_M | 31.0 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q3_K_M.gguf) | i1-Q3_K_M | 33.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q3_K_L.gguf) | i1-Q3_K_L | 36.2 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-IQ4_XS.gguf) | i1-IQ4_XS | 36.9 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q4_0.gguf) | i1-Q4_0 | 39.1 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q4_K_S.gguf) | i1-Q4_K_S | 39.3 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q4_K_M.gguf) | i1-Q4_K_M | 41.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q5_K_S.gguf) | i1-Q5_K_S | 47.6 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q5_K_M.gguf) | i1-Q5_K_M | 48.9 | |
| [PART 1](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/WizardLM-Math-70B-v0.1-i1-GGUF/resolve/main/WizardLM-Math-70B-v0.1.i1-Q6_K.gguf.part2of2) | i1-Q6_K | 56.7 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/free-solar-slerp-v0.3-i1-GGUF | mradermacher | 2024-11-25T08:56:07Z | 31 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:freewheelin/free-solar-slerp-v0.3",
"base_model:quantized:freewheelin/free-solar-slerp-v0.3",
"license:mit",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-23T17:55:36Z | ---
base_model: freewheelin/free-solar-slerp-v0.3
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/freewheelin/free-solar-slerp-v0.3
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/free-solar-slerp-v0.3-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ1_S.gguf) | i1-IQ1_S | 2.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ1_M.gguf) | i1-IQ1_M | 2.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ2_XS.gguf) | i1-IQ2_XS | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ2_S.gguf) | i1-IQ2_S | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ2_M.gguf) | i1-IQ2_M | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q2_K.gguf) | i1-Q2_K | 4.2 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 4.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ3_XS.gguf) | i1-IQ3_XS | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ3_S.gguf) | i1-IQ3_S | 4.9 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ3_M.gguf) | i1-IQ3_M | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q3_K_M.gguf) | i1-Q3_K_M | 5.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q3_K_L.gguf) | i1-Q3_K_L | 5.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-IQ4_XS.gguf) | i1-IQ4_XS | 6.0 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 6.3 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 6.3 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 6.3 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q4_0.gguf) | i1-Q4_0 | 6.3 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q4_K_S.gguf) | i1-Q4_K_S | 6.3 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q4_K_M.gguf) | i1-Q4_K_M | 6.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q5_K_S.gguf) | i1-Q5_K_S | 7.6 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q5_K_M.gguf) | i1-Q5_K_M | 7.8 | |
| [GGUF](https://huggingface.co/mradermacher/free-solar-slerp-v0.3-i1-GGUF/resolve/main/free-solar-slerp-v0.3.i1-Q6_K.gguf) | i1-Q6_K | 9.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to questions you might have, or to request quants of another model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF | mradermacher | 2024-11-25T08:55:42Z | 5 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"base_model:juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1",
"base_model:quantized:juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T18:34:56Z | ---
base_model: juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1
language:
- ko
library_name: transformers
license: cc-by-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
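As a minimal Python sketch of that workflow — assuming `huggingface_hub` and `llama-cpp-python` are installed, with the filename and prompt format as illustrative choices only:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a single quant file from this repo (not the whole repo).
path = hf_hub_download(
    repo_id="mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF",
    filename="DT-SL-MLP-SOLAR-10.7B-v0.1.Q4_K_M.gguf",  # one of the files below
)

# Load the quant and run a short completion; the prompt template here is
# a generic guess, not the model's documented format.
llm = Llama(model_path=path, n_ctx=4096)
out = llm("### User:\nHello!\n\n### Assistant:\n", max_tokens=64)
print(out["choices"][0]["text"])
```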
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.IQ4_XS.gguf) | IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q4_K_S.gguf) | Q4_K_S | 6.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q5_K_S.gguf) | Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q5_K_M.gguf) | Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q6_K.gguf) | Q6_K | 8.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.Q8_0.gguf) | Q8_0 | 11.5 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.f16.gguf) | f16 | 21.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF | mradermacher | 2024-11-25T08:55:36Z | 15 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:win10/EVA-Meissa-Coder-14B-Instruct",
"base_model:quantized:win10/EVA-Meissa-Coder-14B-Instruct",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T19:00:49Z | ---
base_model: win10/EVA-Meissa-Coder-14B-Instruct
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/win10/EVA-Meissa-Coder-14B-Instruct
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ1_S.gguf) | i1-IQ1_S | 3.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ1_M.gguf) | i1-IQ1_M | 4.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ2_S.gguf) | i1-IQ2_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ2_M.gguf) | i1-IQ2_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q2_K.gguf) | i1-Q2_K | 5.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 6.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ3_XS.gguf) | i1-IQ3_XS | 6.5 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q3_K_S.gguf) | i1-Q3_K_S | 6.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ3_S.gguf) | i1-IQ3_S | 6.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ3_M.gguf) | i1-IQ3_M | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q3_K_M.gguf) | i1-Q3_K_M | 7.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q3_K_L.gguf) | i1-Q3_K_L | 8.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-IQ4_XS.gguf) | i1-IQ4_XS | 8.2 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 8.6 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 8.6 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 8.6 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q4_0.gguf) | i1-Q4_0 | 8.6 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q4_K_S.gguf) | i1-Q4_K_S | 8.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q4_K_M.gguf) | i1-Q4_K_M | 9.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q5_K_S.gguf) | i1-Q5_K_S | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q5_K_M.gguf) | i1-Q5_K_M | 10.6 | |
| [GGUF](https://huggingface.co/mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF/resolve/main/EVA-Meissa-Coder-14B-Instruct.i1-Q6_K.gguf) | i1-Q6_K | 12.2 | practically like static Q6_K |
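If you want to check which quant files actually exist in the repo before downloading, a small sketch using the Hub API (repo id as above):

```python
from huggingface_hub import HfApi

# List every .gguf file in the repo so you can pick one that fits your RAM/VRAM.
api = HfApi()
files = api.list_repo_files("mradermacher/EVA-Meissa-Coder-14B-Instruct-i1-GGUF")
for name in sorted(f for f in files if f.endswith(".gguf")):
    print(name)
```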
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Fusion-7B-Quintessence-GGUF | mradermacher | 2024-11-25T08:55:32Z | 6 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:ilevytate/Fusion-7B-Quintessence",
"base_model:quantized:ilevytate/Fusion-7B-Quintessence",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T19:03:24Z | ---
base_model: ilevytate/Fusion-7B-Quintessence
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/ilevytate/Fusion-7B-Quintessence
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Fusion-7B-Quintessence-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q2_K.gguf) | Q2_K | 8.0 | |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q3_K_S.gguf) | Q3_K_S | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q3_K_M.gguf) | Q3_K_M | 10.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q3_K_L.gguf) | Q3_K_L | 11.2 | |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.IQ4_XS.gguf) | IQ4_XS | 11.6 | |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q4_K_S.gguf) | Q4_K_S | 12.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q4_K_M.gguf) | Q4_K_M | 12.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q5_K_S.gguf) | Q5_K_S | 14.7 | |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q5_K_M.gguf) | Q5_K_M | 15.1 | |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q6_K.gguf) | Q6_K | 17.5 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Fusion-7B-Quintessence-GGUF/resolve/main/Fusion-7B-Quintessence.Q8_0.gguf) | Q8_0 | 22.6 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/llm4decompile-6.7b-nsp-GGUF | mradermacher | 2024-11-25T08:55:28Z | 37 | 1 | transformers | [
"transformers",
"gguf",
"decompile",
"binary",
"en",
"base_model:arise-sustech/llm4decompile-6.7b-nsp",
"base_model:quantized:arise-sustech/llm4decompile-6.7b-nsp",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T19:31:46Z | ---
base_model: arise-sustech/llm4decompile-6.7b-nsp
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- decompile
- binary
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/arise-sustech/llm4decompile-6.7b-nsp
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not appear within a week or so of the static ones, I have probably not planned them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
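To fetch just one quant rather than cloning the whole repo, a sketch assuming `huggingface_hub` is installed; the glob pattern and target directory are examples:

```python
from huggingface_hub import snapshot_download

# Download only the files matching the pattern (here, the Q4_K_M quant).
snapshot_download(
    repo_id="mradermacher/llm4decompile-6.7b-nsp-GGUF",
    allow_patterns=["*Q4_K_M.gguf"],
    local_dir="models",
)
```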
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q2_K.gguf) | Q2_K | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q3_K_S.gguf) | Q3_K_S | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q3_K_M.gguf) | Q3_K_M | 3.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q3_K_L.gguf) | Q3_K_L | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.IQ4_XS.gguf) | IQ4_XS | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q4_0_4_4.gguf) | Q4_0_4_4 | 3.9 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q4_K_S.gguf) | Q4_K_S | 4.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q4_K_M.gguf) | Q4_K_M | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q5_K_S.gguf) | Q5_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q5_K_M.gguf) | Q5_K_M | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q6_K.gguf) | Q6_K | 5.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.Q8_0.gguf) | Q8_0 | 7.3 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/llm4decompile-6.7b-nsp-GGUF/resolve/main/llm4decompile-6.7b-nsp.f16.gguf) | f16 | 13.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF | mradermacher | 2024-11-25T08:55:04Z | 340 | 1 | transformers | [
"transformers",
"gguf",
"T3Q-ko-solar-sft-v3.0",
"kyujinpy/KoCommercial-NoSSL",
"en",
"dataset:davidkim205/ko_common_gen",
"base_model:chlee10/T3Q-ko-solar-sft-v3.0",
"base_model:quantized:chlee10/T3Q-ko-solar-sft-v3.0",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-23T20:21:26Z | ---
base_model: chlee10/T3Q-ko-solar-sft-v3.0
datasets:
- davidkim205/ko_common_gen
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- T3Q-ko-solar-sft-v3.0
- kyujinpy/KoCommercial-NoSSL
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/chlee10/T3Q-ko-solar-sft-v3.0
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ1_S.gguf) | i1-IQ1_S | 2.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ1_M.gguf) | i1-IQ1_M | 2.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ2_XS.gguf) | i1-IQ2_XS | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ2_S.gguf) | i1-IQ2_S | 3.5 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ2_M.gguf) | i1-IQ2_M | 3.8 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q2_K.gguf) | i1-Q2_K | 4.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 4.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ3_XS.gguf) | i1-IQ3_XS | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ3_S.gguf) | i1-IQ3_S | 4.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ3_M.gguf) | i1-IQ3_M | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q3_K_M.gguf) | i1-Q3_K_M | 5.3 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q3_K_L.gguf) | i1-Q3_K_L | 5.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-IQ4_XS.gguf) | i1-IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 6.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 6.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q4_0.gguf) | i1-Q4_0 | 6.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q4_K_S.gguf) | i1-Q4_K_S | 6.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q4_K_M.gguf) | i1-Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q5_K_S.gguf) | i1-Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q5_K_M.gguf) | i1-Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/T3Q-ko-solar-sft-v3.0-i1-GGUF/resolve/main/T3Q-ko-solar-sft-v3.0.i1-Q6_K.gguf) | i1-Q6_K | 8.9 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF | mradermacher | 2024-11-25T08:55:00Z | 34 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"base_model:juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1",
"base_model:quantized:juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-23T20:36:50Z | ---
base_model: juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1
language:
- ko
library_name: transformers
license: cc-by-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/juengsi/DT-SL-MLP-SOLAR-10.7B-v0.1
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ1_S.gguf) | i1-IQ1_S | 2.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ1_M.gguf) | i1-IQ1_M | 2.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ2_XS.gguf) | i1-IQ2_XS | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ2_S.gguf) | i1-IQ2_S | 3.5 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ2_M.gguf) | i1-IQ2_M | 3.8 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q2_K.gguf) | i1-Q2_K | 4.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 4.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ3_XS.gguf) | i1-IQ3_XS | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ3_S.gguf) | i1-IQ3_S | 4.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ3_M.gguf) | i1-IQ3_M | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q3_K_M.gguf) | i1-Q3_K_M | 5.3 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q3_K_L.gguf) | i1-Q3_K_L | 5.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-IQ4_XS.gguf) | i1-IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 6.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 6.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q4_0.gguf) | i1-Q4_0 | 6.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q4_K_S.gguf) | i1-Q4_K_S | 6.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q4_K_M.gguf) | i1-Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q5_K_S.gguf) | i1-Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q5_K_M.gguf) | i1-Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/DT-SL-MLP-SOLAR-10.7B-v0.1-i1-GGUF/resolve/main/DT-SL-MLP-SOLAR-10.7B-v0.1.i1-Q6_K.gguf) | i1-Q6_K | 8.9 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/BioMistral-7B-finetuned-GGUF | mradermacher | 2024-11-25T08:54:51Z | 10 | 1 | transformers | [
"transformers",
"gguf",
"en",
"dataset:camel-ai/biology",
"base_model:mridul3301/BioMistral-7B-finetuned",
"base_model:quantized:mridul3301/BioMistral-7B-finetuned",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T20:42:34Z | ---
base_model: mridul3301/BioMistral-7B-finetuned
datasets:
- camel-ai/biology
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/mridul3301/BioMistral-7B-finetuned
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not appear within a week or so of the static ones, I have probably not planned them; feel free to request them by opening a Community Discussion.
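If you would rather script that request than use the web UI, a hedged sketch with `huggingface_hub` — it needs an access token of your own, and the title and text are only examples:

```python
from huggingface_hub import HfApi

api = HfApi(token="hf_...")  # replace with your own token
api.create_discussion(
    repo_id="mradermacher/BioMistral-7B-finetuned-GGUF",
    title="Request: imatrix quants",
    description="Could you also provide weighted/imatrix quants for this model?",
)
```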
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/BioMistral-7B-finetuned-GGUF/resolve/main/BioMistral-7B-finetuned.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/llama-2-13b-cf-of-GGUF | mradermacher | 2024-11-25T08:54:31Z | 5 | 1 | transformers | [
"transformers",
"gguf",
"en",
"base_model:iestynmullinor/llama-2-13b-cf-of",
"base_model:quantized:iestynmullinor/llama-2-13b-cf-of",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T21:21:58Z | ---
base_model: iestynmullinor/llama-2-13b-cf-of
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/iestynmullinor/llama-2-13b-cf-of
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not appear within a week or so of the static ones, I have probably not planned them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q2_K.gguf) | Q2_K | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q3_K_S.gguf) | Q3_K_S | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q3_K_M.gguf) | Q3_K_M | 6.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q3_K_L.gguf) | Q3_K_L | 7.0 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.IQ4_XS.gguf) | IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q4_K_S.gguf) | Q4_K_S | 7.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q4_K_M.gguf) | Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q5_K_S.gguf) | Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q5_K_M.gguf) | Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q6_K.gguf) | Q6_K | 10.8 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/llama-2-13b-cf-of-GGUF/resolve/main/llama-2-13b-cf-of.Q8_0.gguf) | Q8_0 | 13.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/NEBULA-XB-v1.0-GGUF | mradermacher | 2024-11-25T08:54:09Z | 8 | 0 | transformers | [
"transformers",
"gguf",
"en",
"dataset:Open-Orca/SlimOrca",
"base_model:TeeZee/NEBULA-XB-v1.0",
"base_model:quantized:TeeZee/NEBULA-XB-v1.0",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T04:52:10Z | ---
base_model: TeeZee/NEBULA-XB-v1.0
datasets:
- Open-Orca/SlimOrca
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/TeeZee/NEBULA-XB-v1.0
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/NEBULA-XB-v1.0-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
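To sanity-check a downloaded file, you can read its GGUF header — a sketch using the `gguf` Python package (`pip install gguf`); the filename is one example from the table below:

```python
from gguf import GGUFReader

# Print the metadata keys stored in the GGUF header.
reader = GGUFReader("NEBULA-XB-v1.0.Q4_K_S.gguf")
for field in reader.fields.values():
    print(field.name)
```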
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q2_K.gguf) | Q2_K | 8.9 | |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q3_K_S.gguf) | Q3_K_S | 10.4 | |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q3_K_M.gguf) | Q3_K_M | 11.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q3_K_L.gguf) | Q3_K_L | 12.6 | |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.IQ4_XS.gguf) | IQ4_XS | 13.0 | |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q4_K_S.gguf) | Q4_K_S | 13.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q4_K_M.gguf) | Q4_K_M | 14.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q5_K_S.gguf) | Q5_K_S | 16.5 | |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q5_K_M.gguf) | Q5_K_M | 16.9 | |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q6_K.gguf) | Q6_K | 19.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/NEBULA-XB-v1.0-GGUF/resolve/main/NEBULA-XB-v1.0.Q8_0.gguf) | Q8_0 | 25.4 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF | mradermacher | 2024-11-25T08:54:03Z | 35 | 2 | transformers | [
"transformers",
"gguf",
"en",
"base_model:EmilMarian/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned",
"base_model:quantized:EmilMarian/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T22:24:58Z | ---
base_model: EmilMarian/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/EmilMarian/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not appear within a week or so of the static ones, I have probably not planned them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned-GGUF/resolve/main/OpenHermes-2.5-Mistral-7B-BOLA-Karate-Fine-Tuned.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF | mradermacher | 2024-11-25T08:53:46Z | 14 | 1 | transformers | [
"transformers",
"gguf",
"ko",
"en",
"base_model:gwonny/nox-solar-10.7b-v4-kolon-all-10",
"base_model:quantized:gwonny/nox-solar-10.7b-v4-kolon-all-10",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-23T23:34:17Z | ---
base_model: gwonny/nox-solar-10.7b-v4-kolon-all-10
language:
- ko
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/gwonny/nox-solar-10.7b-v4-kolon-all-10
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q2_K.gguf) | Q2_K | 4.1 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q3_K_S.gguf) | Q3_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q3_K_M.gguf) | Q3_K_M | 5.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q3_K_L.gguf) | Q3_K_L | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.IQ4_XS.gguf) | IQ4_XS | 5.9 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q4_K_S.gguf) | Q4_K_S | 6.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q4_K_M.gguf) | Q4_K_M | 6.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q5_K_S.gguf) | Q5_K_S | 7.5 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q5_K_M.gguf) | Q5_K_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q6_K.gguf) | Q6_K | 8.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.Q8_0.gguf) | Q8_0 | 11.5 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/nox-solar-10.7b-v4-kolon-all-10-GGUF/resolve/main/nox-solar-10.7b-v4-kolon-all-10.f16.gguf) | f16 | 21.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Hermes-2-Pro-11B-GGUF | mradermacher | 2024-11-25T08:53:39Z | 81 | 1 | transformers | [
"transformers",
"gguf",
"merge",
"mergekit",
"lazymergekit",
"NousResearch/Hermes-2-Pro-Mistral-7B",
"en",
"base_model:mattshumer/Hermes-2-Pro-11B",
"base_model:quantized:mattshumer/Hermes-2-Pro-11B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-24T01:01:56Z | ---
base_model: mattshumer/Hermes-2-Pro-11B
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- merge
- mergekit
- lazymergekit
- NousResearch/Hermes-2-Pro-Mistral-7B
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/mattshumer/Hermes-2-Pro-11B
<!-- provided-files -->
weighted/imatrix quants are not currently available from me. If they do not appear within a week or so of the static ones, I have probably not planned them; feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
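For a chat-tuned merge like this one, `llama-cpp-python` can usually apply the chat template embedded in the GGUF metadata — a sketch, assuming the Q4_K_M file from the table below has already been downloaded:

```python
from llama_cpp import Llama

llm = Llama(model_path="Hermes-2-Pro-11B.Q4_K_M.gguf", n_ctx=4096)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=32,
)
print(out["choices"][0]["message"]["content"])
```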
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q2_K.gguf) | Q2_K | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q3_K_S.gguf) | Q3_K_S | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q3_K_M.gguf) | Q3_K_M | 5.5 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q3_K_L.gguf) | Q3_K_L | 6.0 | |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.IQ4_XS.gguf) | IQ4_XS | 6.2 | |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q4_0_4_4.gguf) | Q4_0_4_4 | 6.4 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q4_K_S.gguf) | Q4_K_S | 6.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q4_K_M.gguf) | Q4_K_M | 6.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q5_K_S.gguf) | Q5_K_S | 7.8 | |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q5_K_M.gguf) | Q5_K_M | 8.0 | |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q6_K.gguf) | Q6_K | 9.3 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Hermes-2-Pro-11B-GGUF/resolve/main/Hermes-2-Pro-11B.Q8_0.gguf) | Q8_0 | 12.0 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have, or to request quantization of some other model.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF | mradermacher | 2024-11-25T08:48:40Z | 9 | 1 | transformers | [
"transformers",
"gguf",
"en",
"base_model:migtissera/Tess-2.0-Mixtral-8x7B-v0.2",
"base_model:quantized:migtissera/Tess-2.0-Mixtral-8x7B-v0.2",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-24T19:49:30Z | ---
base_model: migtissera/Tess-2.0-Mixtral-8x7B-v0.2
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/migtissera/Tess-2.0-Mixtral-8x7B-v0.2
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
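For large quants that ship split into parts (these repos use a `*.gguf.part1ofN`-style naming), the concatenation step can be done with nothing but the standard library — filenames here are illustrative:

```python
import glob
import shutil

# Join split GGUF parts back into a single file, in part order.
# Lexical sort is fine for the single-digit part counts used here.
parts = sorted(glob.glob("Tess-2.0-Mixtral-8x7B-v0.2.i1-Q6_K.gguf.part*"))
with open("Tess-2.0-Mixtral-8x7B-v0.2.i1-Q6_K.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as f:
            shutil.copyfileobj(f, out)
```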
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ1_S.gguf) | i1-IQ1_S | 9.9 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ1_M.gguf) | i1-IQ1_M | 10.9 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 12.7 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ2_XS.gguf) | i1-IQ2_XS | 14.0 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ2_S.gguf) | i1-IQ2_S | 14.2 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ2_M.gguf) | i1-IQ2_M | 15.6 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q2_K.gguf) | i1-Q2_K | 17.4 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 18.3 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ3_XS.gguf) | i1-IQ3_XS | 19.5 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ3_S.gguf) | i1-IQ3_S | 20.5 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q3_K_S.gguf) | i1-Q3_K_S | 20.5 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ3_M.gguf) | i1-IQ3_M | 21.5 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q3_K_M.gguf) | i1-Q3_K_M | 22.6 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q3_K_L.gguf) | i1-Q3_K_L | 24.3 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-IQ4_XS.gguf) | i1-IQ4_XS | 25.2 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q4_0.gguf) | i1-Q4_0 | 26.7 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q4_K_S.gguf) | i1-Q4_K_S | 26.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q4_K_M.gguf) | i1-Q4_K_M | 28.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q5_K_S.gguf) | i1-Q5_K_S | 32.3 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q5_K_M.gguf) | i1-Q5_K_M | 33.3 | |
| [GGUF](https://huggingface.co/mradermacher/Tess-2.0-Mixtral-8x7B-v0.2-i1-GGUF/resolve/main/Tess-2.0-Mixtral-8x7B-v0.2.i1-Q6_K.gguf) | i1-Q6_K | 38.5 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
ibrahimchristopher/whisper-small-dv | ibrahimchristopher | 2024-11-25T08:44:29Z | 78 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"ha",
"dataset:mozilla-foundation/common_voice_13_0",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-25T06:15:00Z | ---
library_name: transformers
language:
- ha
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_13_0
metrics:
- wer
model-index:
- name: Whisper Small Ha - Ibrahim Ibrahim
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice 13
type: mozilla-foundation/common_voice_13_0
metrics:
- name: Wer
type: wer
value: 45.91914569031274
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Small Ha - Ibrahim Ibrahim
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 13 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6920
- Wer Ortho: 48.8189
- Wer: 45.9191
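A minimal inference sketch (illustrative, not part of the original card; it assumes the standard 🤗 Transformers `pipeline` API, `ffmpeg` available for audio decoding, and a hypothetical local file `sample.wav`):
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="ibrahimchristopher/whisper-small-dv",
)

# Transcribe a local audio file (the path is a placeholder).
print(asr("sample.wav")["text"])
```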
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 500
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:------:|:----:|:---------------:|:---------:|:-------:|
| 0.0754 | 3.1847 | 500 | 0.6920 | 48.8189 | 45.9191 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Tokenizers 0.20.3
|
nonhmello/whisper_medium_nonhmello | nonhmello | 2024-11-25T08:37:56Z | 7 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"asr",
"speech-recognition",
"thai",
"custom-model",
"generated_from_trainer",
"th",
"base_model:biodatlab/whisper-th-medium-combined",
"base_model:finetune:biodatlab/whisper-th-medium-combined",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-25T02:25:57Z | ---
library_name: transformers
language:
- th
license: apache-2.0
base_model: biodatlab/whisper-th-medium-combined
tags:
- asr
- speech-recognition
- thai
- custom-model
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Medium TH - Nonhmello
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Medium TH - Nonhmello
This model is a fine-tuned version of [biodatlab/whisper-th-medium-combined](https://huggingface.co/biodatlab/whisper-th-medium-combined) on a custom dataset prepared on a local machine.
It achieves the following results on the evaluation set:
- Loss: 0.3068
- Wer: 77.4194
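A minimal inference sketch (illustrative, not part of the original card; the audio path is a placeholder and `chunk_length_s` is an assumed setting for long-form input):
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="nonhmello/whisper_medium_nonhmello",
    chunk_length_s=30,  # enables chunked long-form transcription
)

print(asr("thai_sample.wav")["text"])
```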
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:--------:|:----:|:---------------:|:-------:|
| 0.0092 | 83.3333 | 500 | 0.2586 | 90.3226 |
| 0.0006 | 166.6667 | 1000 | 0.3068 | 77.4194 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Cylingo/Xinyuan-VL-2B | Cylingo | 2024-11-25T08:24:55Z | 134 | 7 | transformers | [
"transformers",
"safetensors",
"qwen2_vl",
"image-text-to-text",
"multimodal",
"visual-question-answering",
"en",
"zh",
"license:apache-2.0",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | visual-question-answering | 2024-09-24T12:45:20Z | ---
license: apache-2.0
language:
- en
- zh
pipeline_tag: visual-question-answering
tags:
- multimodal
library_name: transformers
---
<div align=center><img src ="https://cdn-uploads.huggingface.co/production/uploads/6299c90ef1f2a097fcaa1293/XEfp5nnJOixkGAOyF8UtN.png"/></div>
## Introduction
**Xinyuan-VL-2B** is a high-performance multimodal large model for edge devices from the Cylingo Group. It is fine-tuned from `Qwen/Qwen2-VL-2B-Instruct` on more than 5M multimodal samples together with a small amount of plain-text data.
It performs well on several authoritative benchmarks.
## How to use
To build on the thriving ecosystem of the open-source community, we chose to fine-tune [Qwen/Qwen2-VL-2B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-2B-Instruct) to form our `Cylingo/Xinyuan-VL-2B`.
Usage of `Cylingo/Xinyuan-VL-2B` is therefore identical to usage of `Qwen/Qwen2-VL-2B-Instruct`:
```Python
from transformers import Qwen2VLForConditionalGeneration, AutoTokenizer, AutoProcessor
from qwen_vl_utils import process_vision_info
# default: Load the model on the available device(s)
model = Qwen2VLForConditionalGeneration.from_pretrained(
"Cylingo/Xinyuan-VL-2B", torch_dtype="auto", device_map="auto"
)
# default processor
processor = AutoProcessor.from_pretrained("Cylingo/Xinyuan-VL-2B")
messages = [
{
"role": "user",
"content": [
{
"type": "image",
"image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg",
},
{"type": "text", "text": "Describe this image."},
],
}
]
# Preparation for inference
text = processor.apply_chat_template(
messages, tokenize=False, add_generation_prompt=True
)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
text=[text],
images=image_inputs,
videos=video_inputs,
padding=True,
return_tensors="pt",
)
inputs = inputs.to("cuda")
# Inference: Generation of the output
generated_ids = model.generate(**inputs, max_new_tokens=128)
generated_ids_trimmed = [
out_ids[len(in_ids) :] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
print(output_text)
```
## Evaluation
We evaluated **[XinYuan-VL-2B](https://huggingface.co/thomas-yanxin/XinYuan-VL-2B)** with the [VLMEvalKit](https://github.com/open-compass/VLMEvalKit) toolkit across the following benchmarks and found that **XinYuan-VL-2B** **outperforms** [Qwen/Qwen2-VL-2B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-2B-Instruct), released by Alibaba Cloud, as well as other open-source models of comparable parameter scale with significant community influence.
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6299c90ef1f2a097fcaa1293/7ThTCYfd_lDzsvaFLlUv2.png">
</p>
You can see the results in [opencompass/open_vlm_leaderboard](https://huggingface.co/spaces/opencompass/open_vlm_leaderboard):
| Benchmark | MiniCPM-2B | InternVL-2B | Qwen2-VL-2B | **XinYuan-VL-2B** |
| :---: | :---: | :---: | :---: | :---: |
| MMB-CN-V11-Test | 64.5 | 68.9 | 71.2 | **74.3** |
| MMB-EN-V11-Test | 65.8 | 70.2 | 73.2 | **76.5** |
| MMB-EN | 69.1 | 74.4 | 74.3 | **78.9** |
| MMB-CN | 66.5 | 71.2 | 73.8 | **76.12** |
| CCBench | 45.3 | **74.7** | 53.7 | 55.5 |
| MMT-Bench | 53.5 | 50.8 | 54.5 | **55.2** |
| RealWorld | 55.8 | 57.3 | 62.9 | **63.9** |
| SEEDBench\_IMG | 67.1 | 70.9 | 72.86 | **73.4** |
| AI2D | 56.3 | 74.1 | **74.7** | 74.2 |
| MMMU | 38.2 | 36.3 | **41.1** | 40.9 |
| HallusionBench | 36.2 | 36.2 | 42.4 | **55.00** |
| POPE | 86.3 | 86.3 | 86.82 | **89.42** |
| MME | 1808.6 | **1876.8** | 1872.0 | 1854.9 |
| MMStar | 39.1 | 49.8 | 47.5 | **51.87** |
| SEEDBench2\_Plus | 51.9 | 59.9 | 62.23 | **62.98** |
| BLINK | 41.2 | 42.8 | **43.92** | 42.98 |
| OCRBench | 605 | 781 | **794** | 782 |
| TextVQA | 74.1 | 73.4 | **79.7** | 77.6 | |
Miyoki/cve-mistral-7b-instruct-v0.3-bnb-4bit-02 | Miyoki | 2024-11-25T08:21:37Z | 12 | 0 | transformers | [
"transformers",
"gguf",
"mistral",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/mistral-7b-instruct-v0.3-bnb-4bit",
"base_model:quantized:unsloth/mistral-7b-instruct-v0.3-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T08:18:38Z | ---
base_model: unsloth/mistral-7b-instruct-v0.3-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** Miyoki
- **License:** apache-2.0
- **Finetuned from model :** unsloth/mistral-7b-instruct-v0.3-bnb-4bit
This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
tl81092/my-drug-model_2 | tl81092 | 2024-11-25T08:14:46Z | 105 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T08:14:27Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
JNolet/Qwen2.5-Coder-14B_v11.25.24.0_CodeInstruct | JNolet | 2024-11-25T08:13:51Z | 18 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"arxiv:2306.01708",
"base_model:Qwen/Qwen2.5-Coder-14B",
"base_model:merge:Qwen/Qwen2.5-Coder-14B",
"base_model:Qwen/Qwen2.5-Coder-14B-Instruct",
"base_model:merge:Qwen/Qwen2.5-Coder-14B-Instruct",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T08:05:29Z | ---
base_model:
- Qwen/Qwen2.5-Coder-14B
- Qwen/Qwen2.5-Coder-14B-Instruct
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method using [Qwen/Qwen2.5-Coder-14B](https://huggingface.co/Qwen/Qwen2.5-Coder-14B) as a base.
### Models Merged
The following models were included in the merge:
* [Qwen/Qwen2.5-Coder-14B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: Qwen/Qwen2.5-Coder-14B-Instruct
parameters:
weight: 1
density: 1
merge_method: ties
base_model: Qwen/Qwen2.5-Coder-14B
parameters:
weight: 1
density: 1
normalise: true
int8_mask: true
dtype: bfloat16
```
|
Kasobi/distilbert-base-uncased-finetuned-emotion | Kasobi | 2024-11-25T08:09:49Z | 105 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T12:39:59Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2250
- Accuracy: 0.9245
- F1: 0.9245
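A minimal inference sketch (illustrative, not part of the original card; the input sentence is a made-up example):
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Kasobi/distilbert-base-uncased-finetuned-emotion",
)

print(classifier("I can't wait to see the results!"))
```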
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 250 | 0.3164 | 0.901 | 0.8999 |
| No log | 2.0 | 500 | 0.2250 | 0.9245 | 0.9245 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cpu
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Shinyaaa/Face-travel-05-v1 | Shinyaaa | 2024-11-25T08:05:00Z | 103 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T08:04:33Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
msr2903/mrm8488-distilroberta-fine-tuned-financial-sentiment | msr2903 | 2024-11-25T07:58:40Z | 134 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"dataset:NickyNicky/finance-financialmodelingprep-stock-news-sentiments-rss-feed",
"base_model:mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis",
"base_model:finetune:mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T07:35:07Z | ---
library_name: transformers
datasets:
- NickyNicky/finance-financialmodelingprep-stock-news-sentiments-rss-feed
base_model:
- mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis
---
### Model Description
<!-- Provide a longer summary of what this model is. -->
This model is a fine-tuned version of [mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis](https://huggingface.co/mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis) on the [NickyNicky/finance-financialmodelingprep-stock-news-sentiments-rss-feed](https://huggingface.co/datasets/NickyNicky/finance-financialmodelingprep-stock-news-sentiments-rss-feed) dataset. It achieves the following results on the evaluation set:
- Loss: 0.4090
- Accuracy: 0.9171
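A minimal inference sketch (illustrative, not part of the original card; `top_k=None` is an assumed setting to return every class score, and the headline is a made-up example):
```python
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="msr2903/mrm8488-distilroberta-fine-tuned-financial-sentiment",
    top_k=None,  # return scores for all sentiment classes
)

print(sentiment("Shares rallied after the company raised its full-year guidance."))
```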
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- num_epochs: 5
### Training results
| Training Loss | Epoch | Validation Loss |
|:-------------:|:-----:|:---------------:|
| 0.318500 | 1.0 | 0.294045 |
| 0.281700 | 2.0 | 0.298364 |
| 0.250100 | 3.0 | 0.302255 |
| 0.186400 | 4.0 | 0.380530 |
| 0.179100 | 5.0 | 0.409072 | |
SMARTICT/paraphrase-multilingual-MiniLM-L12-v2-ft-tr-rag-v1 | SMARTICT | 2024-11-25T07:48:31Z | 44 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:8970",
"loss:MultipleNegativesRankingLoss",
"en",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2",
"base_model:finetune:sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-22T13:26:33Z | ---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:8970
- loss:MultipleNegativesRankingLoss
base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
widget:
- source_sentence: 'Seri konum efekti tarafฤฑndan oluลturulan ลeklindeki seri konum
eฤrisini gรถsteren grafik. ''''''Seri konum etkisi'''''', bir kiลinin, bir serideki
ilk ve son รถgeleri en iyi; ortanca รถgeleri en kรถtรผ hatฤฑrlama eฤilimidir. Bu terim,
Hermann Ebbinghaus tarafฤฑndan kendi รผzerine yaptฤฑฤฤฑ รงalฤฑลmalar ile ve bu terim,
hatฤฑrlama doฤruluฤunun, bir รถgenin bir รงalฤฑลma listesindeki konumunun bir fonksiyonu
olarak deฤiลtiฤi bulgusuna deฤinmektedir. Sฤฑrasฤฑ fark etmeksizin (serbest hatฤฑrlama)
listedeki รถgelerin hatฤฑrlanmasฤฑ istenildiฤinde, insanlar listenin sonundaki รถgeleri
hatฤฑrlamaya baลlama eฤilimindedir ve bu รถgeleri en iyi ลekilde hatฤฑrlarlar (''''''sonluk
etkisi''''''). Daha รถnceki liste รถgeleri arasฤฑnda, ilk birkaรง รถge, orta รถgelerden
daha sฤฑk hatฤฑrlanฤฑr (''''''ilklik etkisi''''''). ฤฐlklik etkisi iรงin รถnerilen bir
neden, sunulan ilk รถgelerin kendilerine ayrฤฑlmฤฑล daha fazla miktarda iลlem nedeniyle
en etkin ลekilde hareketsiz bellekte depolanmasฤฑdฤฑr. (ฤฐlk liste รถgesi kendi baลฤฑna
prova edilebilir; ikincisi, birincisi ile birlikte prova edilmek zorundadฤฑr, รผรงรผncรผ,
birincisi ve ikincisi ile birlikte, ve bรถyle devam eder.) รgeler hฤฑzlฤฑ bir ลekilde
sunulduฤunda ilklik etkisi azalฤฑr ve yavaล sunulduฤunda artar (her bir รถgenin
iลlenmesini ve bรถylece kalฤฑcฤฑ depolanmasฤฑnฤฑ azaltan ve arttฤฑran faktรถrler). Daha
uzun sunum listelerinin ilklik etkisini azalttฤฑฤฤฑ bulunmuลtur. Sonluk etkisi iรงin
teorileลmiล bir neden, bu รถgelerin geri hatฤฑrlanmasฤฑ talep edildiฤinde hala aktif
hafฤฑzada bulunmasฤฑdฤฑr. Hiรงbirinden yararlanmayan รถgeler (ortanca รถgeler) en kรถtรผ
ลekilde geri รงaฤrฤฑlฤฑr. Sonluk etkisi iรงin ek bir aรงฤฑklama zamansal baฤlamla ilgilidir:
Mevcut zamansal baฤlam, daha yeni รถgelerin, farklฤฑ bir zamansal baฤlamda (listenin
baลlarฤฑnda) incelenen รถgelere gรถre daha yรผksek geri hatฤฑrlama olasฤฑlฤฑฤฤฑna sahip
olacaฤฤฑnฤฑ haber veren bir geri hatฤฑrlama iลareti olarak kullanฤฑlabilir. Araya
giren bir gรถrev verildiฤinde sonluk etkisi azalฤฑr. Araya giren gรถrevler, รงalฤฑลan
belleฤi kullanฤฑr, ve dikkat daฤฤฑtฤฑcฤฑ aktivite sรผresi 15 ila 30 saniyeyi aลarsa,
sonluk etkisini bozabilir. Ek olarak, geri hatฤฑrlama testten hemen sonra gelirse,
sonluk etkisi รงalฤฑลฤฑlan listenin uzunluฤuna, veya sunum hฤฑzฤฑna bakฤฑlmaksฤฑzฤฑn istikrarlฤฑdฤฑr.
Kalฤฑcฤฑ uzun sรผreli hafฤฑza oluลturma kabiliyeti zayฤฑf olan amnezyaklar ilklik etkisi
gรถstermezler, ancak hatฤฑrlama รงalฤฑลmadan hemen sonra gelirse bir sonluk etkisi
gรถsterirler. Alzheimer hastalฤฑฤฤฑ olan kiลiler daha dรผลรผk bir ilklik etkisi sergiler,
ancak hatฤฑrlamada bir sonluk etkisi gรถstermezler. ฤฐlklik etkisi ฤฐlklik etkisi,
psikolojide ve sosyolojide, kiลinin ilk verilen bilgiyi daha sonra verilen bilgiden
daha iyi hatฤฑrlamasฤฑna neden olan bir biliลsel รถnyargฤฑdฤฑr. รrneฤin, yeterince
uzun bir kelime listesini okuyan bir kiลinin, listenin baลฤฑndaki kelimeleri hatฤฑrlamasฤฑ
listenin ortasฤฑndakileri hatฤฑrlamasฤฑndan daha yรผksek ihtimallidir. Birรงok araลtฤฑrmacฤฑ
bu olguyu serbest hatฤฑrlama null testler yoluyla aรงฤฑklamaya รงalฤฑลmฤฑลtฤฑr. Coluccia,
Gamboz ve Brandimonte (2011), serbest hatฤฑrlamayฤฑ katฤฑlฤฑmcฤฑlarฤฑn herhangi bir
telkin olmaksฤฑzฤฑn bilgileri hatฤฑrlamaya รงalฤฑลmasฤฑ olarak aรงฤฑklamaktadฤฑr. 20. yรผzyฤฑlฤฑn
sonlarฤฑndaki bazฤฑ deneylerde, kendilerine sunulan bir listede test edileceklerini
bilen katฤฑlฤฑmcฤฑlarฤฑn รถgeleri prova edeceฤi kaydedildi: รgeler sunulduฤunda katฤฑlฤฑmcฤฑlar
bu รถgeleri kendilerine tekrar edecek ve yeni รถgeler sunuldukรงa katฤฑlฤฑmcฤฑlar daha
yeni maddelerle birlikte รถnceki รถgeleri prova etmeye devam edeceklerdi. ฤฐlklik
etkisinin รถgelerin sunumu arasฤฑnda daha fazla zaman olduฤunda hatฤฑrlama รผzerinde
daha bรผyรผk bir etkisi olduฤu, bรถylece katฤฑlฤฑmcฤฑlarฤฑn รถnceki (asal) รถgeleri prova
etme ลansฤฑnฤฑn daha yรผksek olacaฤฤฑ gรถsterilmiลtir. Aรงฤฑk prova katฤฑlฤฑmcฤฑlarฤฑn prova
รถrรผntรผlerini test etmek iรงin kullanฤฑlan bir teknikti. Bu tekniฤin kullanฤฑldฤฑฤฤฑ
bir deneyde, katฤฑlฤฑmcฤฑlardan akla gelen รถgeleri yรผksek sesle sรถylemeleri istendi.
Bu ลekilde deneyci, katฤฑlฤฑmcฤฑlarฤฑn listenin baลฤฑndaki รถgeleri listenin ortasฤฑndaki
รถgelerden daha รงok bรถylece onlarฤฑ daha sฤฑk prova yapacaฤฤฑnฤฑ ve daha sonra listenin
ortasฤฑndaki รถgelerden daha iyi hatฤฑrlayacaฤฤฑnฤฑ gรถrebildi. Brodie ve Murdock tarafฤฑndan
yapฤฑlan baลka bir deneyde, sonluk etkisinin ilklik etkisinden kฤฑsmen sorumlu olduฤu
bulunmuลtur. Deneylerinde, aynฤฑ zamanda aรงฤฑk prova tekniฤini kullandฤฑlar ve katฤฑlฤฑmcฤฑlarฤฑn
daha รถnceki รถgeleri daha fazla prova yapmasฤฑnฤฑn yanฤฑ sฤฑra, listenin baลฤฑndaki
kelimeleri provada daha sonra sรถylediklerini keลfettiler. Bu ลekilde, daha รถnceki
รถgeler prova yolu sayesinde test sonuna daha yakฤฑndฤฑ ve kฤฑsmen sonluk etkisi ile
2013 yฤฑlฤฑnda yapฤฑlan bir araลtฤฑrma, ilklik etkisinin, edimsel koลullama olarak
da bilinen bir รถฤrenme sรผreci olan tekrarlanan seรงim deneyime dayalฤฑ karar verme
sรผrecinde de รถnemli olduฤunu gรถstermiลtir. Yazarlar, takip eden davranฤฑลฤฑn ilk
รถdรผlรผnรผn deฤerine verilen รถnemi gรถstermiล ve bu olguyu sonuรง รถnceliฤi olarak ifade
etmiลlerdir. Baลka bir รงalฤฑลmada, katฤฑlฤฑmcฤฑlar iki cรผmleden birini aldฤฑ. รrneฤin,
cรผmlelerin biri "Steve akฤฑllฤฑ, รงalฤฑลkan, eleลtirel, fevri ve kฤฑskanรงtฤฑr."; diฤeri
ise "Steve kฤฑskanรง, fevri, eleลtirel, รงalฤฑลkan ve akฤฑllฤฑdฤฑr." olabilir. Bu iki
cรผmle aynฤฑ bilgileri iรงerir. Birincisi baลlangฤฑรงta pozitif รถzellikleri gรถsterirken,
ikincisi olumsuz รถzelliklere sahiptir. Araลtฤฑrmacฤฑlar, katฤฑlฤฑmcฤฑlarฤฑn Steve''i
ilk cรผmle verildiฤinde ikincisine kฤฑyasla daha olumlu buldular. Sonluk etkisi
ฤฐki geleneksel teori sฤฑnฤฑfฤฑ sonluk etkisini aรงฤฑklar. รift depo modelleri Bu modeller,
en son listelenen รงalฤฑลma รถgelerinin oldukรงa eriลilebilir kฤฑsa sรผreli ara bellekten,
yani insan hafฤฑzasฤฑndaki kฤฑsa sรผreli depodan (KSD) alฤฑndฤฑฤฤฑnฤฑ varsayar. Bu, daha
sonra incelenen รถgelerin, daha รถnce incelenen รถgelere gรถre bir avantaja sahip
olmasฤฑnฤฑ saฤlar, รงรผnkรผ daha รถnceki รงalฤฑลma รถgelerinin uzun sรผreli bellek deposundan
(USD) geriye getirilmesi iรงin daha fazla รงaba harcanmasฤฑ gerekir. Bu tรผr modellerin
รถnemli bir tahmini, alฤฑkoyma dรถneminde (liste sunumu ile test arasฤฑndaki sรผre)
10-30 saniye aritmetik problemleri รงรถzme gibi dikkat daฤฤฑtฤฑcฤฑ bir sunumun yenilik
etkisini azaltmasฤฑdฤฑr. KSD sฤฑnฤฑrlฤฑ kapasiteye sahip olduฤundan, dikkat daฤฤฑnฤฑklฤฑฤฤฑ
daha sonraki รงalฤฑลma listesi รถgelerini KSD''den deฤiลtirir, bรถylece testte bu
รถgeler sadece USD''den alฤฑnabilir ve kฤฑsa sรผreli ara bellekten daha kolay alฤฑnabilme
avantajlarฤฑnฤฑ yitirebilir. Bu nedenle, รงift depolu modeller, hem anlฤฑk hatฤฑrlama
gรถrevlerindeki sonluk etkisini hem de gecikmeli serbest geri hatฤฑrlama gรถrevinde
bรถyle bir etkinin zayฤฑflamasฤฑnฤฑ baลarฤฑlฤฑ bir ลekilde aรงฤฑklar. Bununla birlikte,
bu modelle ilgili bรผyรผk bir sorun, uyarฤฑcฤฑlar arasฤฑ zaman aralฤฑฤฤฑ (aralฤฑksฤฑz รงeldirici
gรถrev) sฤฑrasฤฑnda her รงalฤฑลma maddesi arasฤฑnda bir dikkat daฤฤฑlmasฤฑ olduฤunda,
gecikmeli hatฤฑrlamada gรถzlemlenen uzun sรผreli etkisini tahmin edememesidir. Dikkatin
daฤฤฑlmasฤฑ, son รงalฤฑลma maddesinden sonra hala mevcut olduฤundan, รงalฤฑลma maddesini
KSD''den, sonluk etkisi azaltฤฑlacak ลekilde Bu uzun vadeli sonluk etkisinin varlฤฑฤฤฑ,
anlฤฑk ve uzun sรผreli sonluk etkilerinin ortak bir mekanizmayฤฑ paylaลmasฤฑ olasฤฑlฤฑฤฤฑnฤฑ
arttฤฑrmaktadฤฑr. Tek depo modelleri Tek depo teorilerine gรถre, dizisel konum etkilerinden
tek bir mekanizma sorumludur. ฤฐlk model tรผrรผ, her bir liste รถgesinin incelenmesi
ile test arasฤฑndaki sรผrenin, bir รถgenin alฤฑnฤฑrken bellek izinin gรถreceli rekabetรงiliฤini
belirlediฤi gรถreceli zamansal farklฤฑlฤฑฤa dayanmaktadฤฑr. Bu modelde, liste sonu
รถgelerinin daha belirgin ve dolayฤฑsฤฑyla daha kolay alฤฑnabileceฤi Baลka bir model
tรผrรผ, รถgelerin bellekten geri alฤฑnmasฤฑnฤฑn yalnฤฑzca kiลinin รงalฤฑลma รถgesinin kendisini
deฤil, aynฤฑ zamanda รงalฤฑลma baฤlamฤฑnฤฑ zihinsel temsiline baฤlฤฑ olduฤunu รถne sรผren
baฤlamsal deฤiลkenliฤe dayanmaktadฤฑr. Baฤlam zamanla deฤiลtiฤinden ve gittikรงe
deฤiลtiฤinden, bellek รถgelerini geri almak iรงin yarฤฑลtฤฑฤฤฑnda, anlฤฑk serbest hatฤฑrlama
testinde, daha yakฤฑn zamanda incelenen รถgelerin test baฤlamฤฑyla daha benzer kodlama
baฤlamlarฤฑ olacaktฤฑr ve geriye getirme olasฤฑlฤฑฤฤฑ daha yรผksektir. Anlฤฑk serbest
hatฤฑrlama dฤฑลฤฑnda, bu modeller gecikmeli serbest hatฤฑrlama ve sรผrekli รงeldirici
serbest hatฤฑrlama koลullarฤฑnda sonluk etkisinin varlฤฑฤฤฑnฤฑ veya yokluฤunu da tahmin
edebilir. Gecikmeli hatฤฑrlama koลullarฤฑ altฤฑnda, test baฤlamฤฑ artan tutma aralฤฑฤฤฑyla
uzaklaลarak zayฤฑflamฤฑล bir sonluk etkisi yaratฤฑr. Sรผrekli รงeldirici hatฤฑrlama
koลullarฤฑnda, artan yorumlama aralฤฑklarฤฑ รงalฤฑลma baฤlamฤฑ ve test baฤlamฤฑ arasฤฑndaki
benzerlikleri azaltฤฑrken, maddeler arasฤฑndaki gรถreli benzerlikler deฤiลmeden kalmaktadฤฑr.
Hatฤฑrlama iลlemi rekabetรงi olduฤu sรผrece, son รถgeler kazanacaktฤฑr, bu nedenle
bir sonluk etkisi gรถzlenir. Oran kuralฤฑ Genel olarak, sonluk etkisi ile ilgili
รถnemli bir ampirik gรถzlem, mutlak tutma aralฤฑklarฤฑ (รงalฤฑลma sonu ile test sรผresi
arasฤฑndaki sรผre) veya sunumlar arasฤฑ aralฤฑklar (farklฤฑ รงalฤฑลma รถgeleri arasฤฑndaki
sรผre) olmamasฤฑdฤฑr. Bunun yerine, sonluk miktarฤฑ ile belirlenen oran; mutlak tutma
aralฤฑklarฤฑ ve sunumlar arasฤฑ aralฤฑklar oranฤฑ (oran kuralฤฑ). Sonuรง olarak, bu oran
sabit kaldฤฑฤฤฑ sรผrece, aralฤฑklarฤฑn mutlak deฤerlerinden baฤฤฑmsฤฑz olarak yenilik
gรถzlenecektir, bรถylece ''''''zaman รถlรงeฤi deฤiลmezliฤi'''''' olarak bilinen bir
fenomen olan tรผm zaman รถlรงeklerinde yenilik gรถzlenebilir. Bu, yeniliฤin KSD''nin
bรผyรผklรผฤรผne ve KSD''deki รถgelerin yer deฤiลtirmesini yรถneten kurala baฤlฤฑ olduฤunu
varsayan รงift depo modelleri ile รงeliลmektedir. Olasฤฑ aรงฤฑklamalar daha sonra tek,
aynฤฑ bir mekanizma yoluyla ortaya รงฤฑkan sonluk etkisini aรงฤฑklar ya da anlฤฑk ve
uzun sรผreli sonluk etkileri iรงin iki farklฤฑ mekanizmayฤฑ รถngรถrebilen farklฤฑ bir
modelle yeniden aรงฤฑklar. Bรถyle bir aรงฤฑklama Davelaar ve ark. (2005), tek bileลenli
bir bellek modeli tarafฤฑndan aรงฤฑklanamayan anlฤฑk ve uzun sรผreli sonluk fenomenleri
arasฤฑnda ayrฤฑลmalar olduฤunu, anlฤฑk ve sonluk aรงฤฑklayan bir KSD''nin varlฤฑฤฤฑnฤฑ
savunan ve bir saniye uzun sรผreli sonluฤu aรงฤฑklayan baฤlamsal kaymaya dayanan
mekanizmadฤฑr. ฤฐlgili etkiler 1977''de William Crano รถzellikle birbirinin zฤฑttฤฑ
olduฤu sรถylenen ilklik ve sonluk etkileri baลta olmak รผzere sฤฑra etkilerinin doฤasฤฑnฤฑ
belirten bir รงalฤฑลma hazฤฑrlamaya karar verdi. Crano tarafฤฑndan test edilen รถzellikler:
Anlam deฤiลimi hipotezi Bir listenin baลฤฑndaki รถgeler, katฤฑlฤฑmcฤฑlarฤฑn listenin
geri kalanฤฑnฤฑn da uymasฤฑnฤฑ beklediฤi bir tema oluลturur. Katฤฑlฤฑmcฤฑ, listedeki
bazฤฑ kelimelerin anlamlarฤฑnฤฑ belirlediฤi beklentiye uyacak ลekilde deฤiลtirir.
Watkins ve Peynircioฤlu (1984), katฤฑlฤฑmcฤฑlarฤฑn kelimelerin anlamlarฤฑnฤฑ deฤiลtirerek
belirlenen temadan uzaklaลarak da olsa sunulan bilgideki sapmayฤฑ azalttฤฑฤฤฑnฤฑ aรงฤฑklamฤฑลtฤฑr.
Tutarsฤฑzlฤฑk durumda saymama Katฤฑlฤฑmcฤฑlar, kendilerine sunulan รถnceki maddelerle
tutarlฤฑ olmayan bilgileri dikkate almazlar. Baลka bir deyiลle, tutarsฤฑzlฤฑk durumda
saymama, sunulan diฤer bilgilerle tutarsฤฑz olan bilgileri tutarlฤฑ olanlardan daha
az รถnemli gรถrmeyi iรงerir (Devine ve Ostrom, 1985). Dikkat azaltma hipotezi รnce
sunulan bilgilerin katฤฑlฤฑmcฤฑlar รผzerinde daha sonra sunulan bilgilerden daha fazla
etkisi vardฤฑr ve bu bilgiler tutarlฤฑ olsa bile รถncelikli bir etkinin ortaya รงฤฑkmasฤฑna
neden olur. Steiner ve Rain (1989) insanlarฤฑn baลlangฤฑรงta sunulan bilgilere daha
fazla dikkat ettiklerini, ancak kendilerine sonradan sunulan bilgilere giderek
daha az dikkat ettiklerini aรงฤฑklamaktadฤฑr. ฤฐlklik etkisi, katฤฑlฤฑmcฤฑlarฤฑn baลlangฤฑรง
bilgilerine dikkat etmeleri ve daha sonra sunulan bilgileri gรถrmezden gelmeleri
nedeniyle oluลur. รte yandan, katฤฑlฤฑmcฤฑlar sรผrekli olarak bilgiye dikkat etmek
zorunda olduklarฤฑ bir durumdaysa, sonluk etkisi oluลabilir. ''''''Sรผreklilik etkisi''''''
veya gecikme etkisi, baลarฤฑlฤฑ bir geri รงaฤฤฑrma sonra, bir sonraki geri รงaฤrฤฑlan
รถgenin, yakฤฑn bir seri konumdan ziyade, uzak bir seri konumdan gelme olasฤฑlฤฑฤฤฑnฤฑn
dรผลรผk olduฤunu tahmin eder (Kahana, Howard, Zaromb ve Wingfiend, 2002). ฤฐki รถgenin
seri konumu arasฤฑndaki fark seri konum gecikmesi olarak adlandฤฑrฤฑlฤฑr. Koลullu
yanฤฑt olasฤฑlฤฑฤฤฑ olarak adlandฤฑrฤฑlan bir baลka faktรถr, belirli bir seri konum gecikmesini
hatฤฑrlama olasฤฑlฤฑฤฤฑdฤฑr. Ayrฤฑca bakฤฑnฤฑz Anchoring Clive Wearing Serbest Hatฤฑrlama
Henry Molaison ฤฐknada ฤฐlklik Yasasฤฑ รฤrenme Eฤrisi Hafฤฑza Eฤilimleri Listesi Biliลsel
Eฤilimler Listesi Sonucun ฤฐlkliฤi รฤrenme ฤฐlkeleri Tepe-Uรง Kuralฤฑ Anฤฑmsama Yumrusu
Kaynakรงa ;Atฤฑflar ;Basฤฑlฤฑ eserler Konuyla ilgili yayฤฑnlar Liebermann, David A.
L''''earning and memory: An integrative approach.'''' Belmont, CA: Thomson Wadsworth,
2004, Kategori:Bellek sรผreรงleri eฤilimler'
sentences:
- Sultan Bey'in hayatฤฑnฤฑn ikinci kฤฑsmฤฑnฤฑ oluลturan รถnemli olay nedir?
- Aslanbaba hangi ilรงeye baฤlฤฑ bir mahalledir?
- Seri konum eฤrisinin ลeklini hangi etmenlerin belirlediฤi anlatฤฑyor musunuz?
- source_sentence: (doฤum adฤฑ '''David Gordon Kirkpatrick''' 13 Haziran 1927 19 Eylรผl
2003), Avustralyalฤฑ country mรผzik ลarkฤฑcฤฑsฤฑ ve sรถz yazarฤฑydฤฑ. Avustralya iรงin
bir kรผltรผr ikonuydu ve รผlkenin en รงok รถdรผl alan yฤฑldฤฑzlarฤฑndan biriydi. Haziran
1927'de Nulla Nulla Creek'te bir รงiftรงinin oฤlu olarak doฤan Dusty, ilk ลarkฤฑsฤฑ
"The Way the Cowboy Dies"ฤฑ 1937'de yazdฤฑ ve 1938'de 11 yaลฤฑndayken "Slim Dusty"
sahne adฤฑnฤฑ aldฤฑ. Yetmiล yฤฑla yakฤฑn kariyerinde รงok sayฤฑda kayฤฑt yaptฤฑ. Yรผzden
fazla albรผm รงฤฑkardฤฑ, yedi milyondan fazla kayฤฑt sattฤฑ ve 70'in รผzerinde altฤฑn
ve platin albรผm sertifikasฤฑ kazandฤฑ". Sidney 2000 Olimpiyat Oyunlarฤฑnฤฑn kapanฤฑล
tรถreninde Avustralya'da รงok รผnlรผ bir ลarkฤฑ olan "Waltzing Matilda"yฤฑ seslendirdi.
1951'de Dusty, ลarkฤฑcฤฑ-sรถz yazarฤฑ Joy McKean ile evlendi ve onun desteฤiyle Avustralya'da
bรผyรผk baลarฤฑlar elde etti. รiftin, ลarkฤฑcฤฑ-sรถz yazarฤฑ olan Anne Kirkpatrick ve
David Kirkpatrick adlฤฑ iki รงocuklarฤฑ oldu. Akciฤer ve bรถbrek kanseri ile uzun
bir mรผcadelenin ardฤฑndan 19 Eylรผl 2003'te 76 yaลฤฑnda Yeni Gรผney Galler'deki evinde
รถldรผ. Kaynakรงa Hristiyanlar erkek ลarkฤฑcฤฑ-ลarkฤฑ yazarlarฤฑ ลeref Niลanฤฑ sahipleri
erkek gitaristler kanserinden รถlenler Kategori:Bรถbrek kanserinden รถlenler Kategori:Yeni
Gรผney Galler'de kanserden รถlenler asฤฑllฤฑ Avustralyalฤฑlar gitaristler country ลarkฤฑcฤฑlarฤฑ
Kategori:ARIA Hall of Fame รผyeleri Kategori:ARIA รdรผlรผ sahipleri Kategori:APRA
รdรผlรผ sahipleri gitaristler Kategori:21. yรผzyฤฑl gitaristleri Kategori:20. yรผzyฤฑl
gitaristleri Kategori:2003 yฤฑlฤฑnda รถlenler Kategori:1927 doฤumlular
sentences:
- Bu Hollandalฤฑ aktrisin adฤฑ nedir?
- Kimdi Slim Dusty?
- Dusty Springfield'in mรผzik kariyeri ne kadar sรผrmรผลtรผr?
- source_sentence: 14 Aralฤฑk 1929 tarihli Milliyet gazetesinde ฤฐstanbul'da Kฤฑr Koลusu
Eski logosu '''Tรผrkiye Atletizm Federasyonu''' ('''TAF'''), atletizm sporunun
Tรผrkiye'deki yรถnetim teลkilatฤฑ olan spor federasyonu. 1922'de Tรผrkiye ฤฐdman Cemiyetleri
ฤฐttifakฤฑ (TฤฐCฤฐ) bรผnyesinde kurulan Tรผrkiye Atletizm Federasyonu, aynฤฑ yฤฑl Uluslararasฤฑ
Atletizm Federasyonlarฤฑ Birliฤi (IAAF) รผyeliฤine kabul edildi. Gรถrev yapmฤฑล baลkanlar
Tรผrkiye Atletizm Federasyonu'nun kronolojik sฤฑrayla baลkanlarฤฑ; Ali Seyfi Beyti
Ahmet Fetgeri Burhan Felek Vildan Aลir Savaลฤฑr Saffet Gรผrol Adnan Hรผn ฤฐrfan ลahinbaล
ฤฐsmail Hakkฤฑ Gรผngรถr Ali Naili Moran Refik Tagay Sadun รzdede Nejat Kรถk Behรงet
Beylem Erol Zorlu Kurthan Fiลek Jerfi Fฤฑratlฤฑ Nuri Turan Abdullah Kรถkpฤฑnar Cรผneyt
Koryรผrek Yฤฑlmaz Sazak ฤฐlker รetin Hรผseyin Manioฤlu Ali Ergenรง Muharrem Dalkฤฑlฤฑรง
Aลkฤฑn Tuna Fikret รetinkaya Semra Aksu Hรผseyin Yฤฑldฤฑrฤฑm Mehmet Yurdadรถn Mehmet
Terzi Hรผseyin Yฤฑldฤฑrฤฑm Fatih รintimar Kaynakรงa Dฤฑล baฤlantฤฑlar Federasyonun resmi
sitesi Atletizm Federasyon Kategori:Avrupa Atletizm Birliฤi รผyesi federasyonlar
Kategori:Ankara merkezli kuruluลlar Osmanlฤฑ kurulan oluลumlar kurulan spor kuruluลlarฤฑ
sentences:
- Leandro Pereira kimdir?
- Tรผrkiye Atletizm Federasyonu ne zaman kuruldu?
- P.E.N. nedir?
- source_sentence: '''''ฤฐlkbaharda Daฤ Yolunda Yรผrรผmek'''' ''''''Ma Yuan'''''' (;
1160''lar-1225), Gรผney Song Hanedanฤฑ dรถneminde yaลamฤฑล รinli bir ressamdฤฑ. รalฤฑลmalarฤฑ,
Xia Gui''ninkiyle birlikte, sรถzde Ma-Xia resim okulunun temelini oluลturdu ve
dรถnemin en iyileri arasฤฑnda kabul edilmektedir. Eserleri hem Zhe okulunun รinli
sanatรงฤฑlarฤฑna hem de ilk Japon ressamlar Shลซbun ve Sesshลซ''ye ilham verdi. Kaynakรงa
Dunlop, Ronald Ossory. 1954. ''''Landscape Painting: Ma Yรผan to Picasso''''. London:
Seeley, Service Co. Little, Stephen. '''' Taoism and the Arts of China,'''' p.
160. Chicago: Art Institute of Chicago. Dฤฑล baฤlantฤฑlar Ma Yuan Painting Gallery
at China Online Museum Sung and Yuan paintings an exhibition catalog from The
Metropolitan Museum of Art Libraries (fully available online as PDF), which contains
material on Ma Yuan (see list of paintings) doฤanlar doฤumlular Kategori:1225
yฤฑlฤฑnda รถlenler Kategori:รinli ressamlar Kategori:Song Hanedanฤฑ kiลileri Kategori:12.
yรผzyฤฑl ressamlarฤฑ Kategori:13. yรผzyฤฑl ressamlarฤฑ'
sentences:
- Denon hangi sanatsal hareketle iliลkilendirilir?
- Hammรขd bin Sรผleyman'ฤฑn hocasฤฑ kimdir?
- Ma Yuan hangi okulun ressamฤฑydฤฑ?
- source_sentence: 'veya ''''''Afrika insansฤฑlarฤฑ'''''', ilk kez John Edward Gray
tarafฤฑndan 1825 yฤฑlฤฑnda tanฤฑmlanmฤฑล bir Hominidae alt familyasฤฑdฤฑr. Aรงฤฑklama (insansฤฑ)
aile aฤacฤฑ sol Mevcut (5 tรผr) ve soyu tรผkenmiล tรผrleriyle birlikte iki oymak iรงerir:
''''''Hominini'''''' oymaฤฤฑ ve ''''''Gorillini'''''' oymaฤฤฑ. Kimi yazarlar ise,
''''Pan'''' cinsinin bazen kendi รผรงรผncรผ oymaฤฤฑ Panini''ye ait olduฤunu dรผลรผnรผr.
Homininae, orangutanlarฤฑn (Ponginae alt familyasฤฑ) hominid soyundan ayrฤฑlmasฤฑndan
(yaklaลฤฑk 16 myรถ) sonra ortaya รงฤฑkan, insanlarla orangutanlara gรถre daha yakฤฑn
akraba olan tรผm hominidleri iรงerir. Bu alt familyadaki canlฤฑlar, ''''hominine''''
veya ''''hominineler'''' olarak tanฤฑmlanฤฑr. Evrim Homininae alt familyasฤฑnฤฑn yaลฤฑ
son ortak atasฤฑ) tahminlere gรถre 14 ila 12.5 milyon yฤฑldฤฑr Gorillini ve Hominini
oymaklarฤฑna ayrฤฑlmasฤฑnฤฑn ("goril insan son ortak atasฤฑ", GHLCA) geรง Miyosen''de,
nakayamai''''nin yaลadฤฑฤฤฑ dรถneme yakฤฑn bir zamanda, ila 10 milyon yฤฑl รถnce gerรงekleลtiฤi
tahmin edilmiลtir (TGHLCA). ''''Pan-Homo'''' bรถlรผnmesine kadar (5-7 myรถ) gorillerin
ve ''''Pan-Homo'''' atalarฤฑnฤฑn melezlendiฤine dair kanฤฑtlar vardฤฑr. Filogeni Parins-Fukuchi
''''ve 2019''daki รงalฤฑลmasฤฑna gรถre oluลturulmuล, soyu tรผkenmiล homininleri iรงeren
bir Homininae kladogramฤฑ: Ayrฤฑca bakฤฑnฤฑz son ortak ata Ponginae Notlar Kaynakรงa
Dฤฑล baฤlantฤฑlar Kategori:John Edward Gray tarafฤฑndan adlandฤฑrฤฑlmฤฑล taksonlar tanฤฑmlanan
taksonlar'
sentences:
- Homininae alt familyasฤฑ ilk kez ne zaman ve kim tarafฤฑndan tanฤฑmlandฤฑ?
- Amr Hassan Zaki hangi takฤฑmlarda forma giymiลtir?
- KKTC spor kulรผbรผ hangi ลehirde kurulmuลtur?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: MiniLM-L12-TR
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 384
type: dim_384
metrics:
- type: cosine_accuracy@1
value: 0.559679037111334
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6720160481444333
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7141424272818455
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7542627883650953
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.559679037111334
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.22400534938147776
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1428284854563691
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07542627883650951
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.559679037111334
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6720160481444333
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7141424272818455
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7542627883650953
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6573432687197566
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.6262999315406539
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6317830440458849
name: Cosine Map@100
---
# MiniLM-L12-TR
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) on the json dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) <!-- at revision 8d6b950845285729817bf8e1af1861502c2fed0c -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the ๐ค Hub
model = SentenceTransformer("SMARTICT/paraphrase-multilingual-MiniLM-L12-v2-ft-tr-rag-v1")
# Run inference
sentences = [
'veya \'\'\'Afrika insansฤฑlarฤฑ\'\'\', ilk kez John Edward Gray tarafฤฑndan 1825 yฤฑlฤฑnda tanฤฑmlanmฤฑล bir Hominidae alt familyasฤฑdฤฑr. Aรงฤฑklama (insansฤฑ) aile aฤacฤฑ sol Mevcut (5 tรผr) ve soyu tรผkenmiล tรผrleriyle birlikte iki oymak iรงerir: \'\'\'Hominini\'\'\' oymaฤฤฑ ve \'\'\'Gorillini\'\'\' oymaฤฤฑ. Kimi yazarlar ise, \'\'Pan\'\' cinsinin bazen kendi รผรงรผncรผ oymaฤฤฑ Panini\'ye ait olduฤunu dรผลรผnรผr. Homininae, orangutanlarฤฑn (Ponginae alt familyasฤฑ) hominid soyundan ayrฤฑlmasฤฑndan (yaklaลฤฑk 16 myรถ) sonra ortaya รงฤฑkan, insanlarla orangutanlara gรถre daha yakฤฑn akraba olan tรผm hominidleri iรงerir. Bu alt familyadaki canlฤฑlar, \'\'hominine\'\' veya \'\'hominineler\'\' olarak tanฤฑmlanฤฑr. Evrim Homininae alt familyasฤฑnฤฑn yaลฤฑ son ortak atasฤฑ) tahminlere gรถre 14 ila 12.5 milyon yฤฑldฤฑr Gorillini ve Hominini oymaklarฤฑna ayrฤฑlmasฤฑnฤฑn ("goril insan son ortak atasฤฑ", GHLCA) geรง Miyosen\'de, nakayamai\'\'nin yaลadฤฑฤฤฑ dรถneme yakฤฑn bir zamanda, ila 10 milyon yฤฑl รถnce gerรงekleลtiฤi tahmin edilmiลtir (TGHLCA). \'\'Pan-Homo\'\' bรถlรผnmesine kadar (5-7 myรถ) gorillerin ve \'\'Pan-Homo\'\' atalarฤฑnฤฑn melezlendiฤine dair kanฤฑtlar vardฤฑr. Filogeni Parins-Fukuchi \'\'ve 2019\'daki รงalฤฑลmasฤฑna gรถre oluลturulmuล, soyu tรผkenmiล homininleri iรงeren bir Homininae kladogramฤฑ: Ayrฤฑca bakฤฑnฤฑz son ortak ata Ponginae Notlar Kaynakรงa Dฤฑล baฤlantฤฑlar Kategori:John Edward Gray tarafฤฑndan adlandฤฑrฤฑlmฤฑล taksonlar tanฤฑmlanan taksonlar',
'Homininae alt familyasฤฑ ilk kez ne zaman ve kim tarafฤฑndan tanฤฑmlandฤฑ?',
'Amr Hassan Zaki hangi takฤฑmlarda forma giymiลtir?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_384`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.5597 |
| cosine_accuracy@3 | 0.672 |
| cosine_accuracy@5 | 0.7141 |
| cosine_accuracy@10 | 0.7543 |
| cosine_precision@1 | 0.5597 |
| cosine_precision@3 | 0.224 |
| cosine_precision@5 | 0.1428 |
| cosine_precision@10 | 0.0754 |
| cosine_recall@1 | 0.5597 |
| cosine_recall@3 | 0.672 |
| cosine_recall@5 | 0.7141 |
| cosine_recall@10 | 0.7543 |
| **cosine_ndcg@10** | **0.6573** |
| cosine_mrr@10 | 0.6263 |
| cosine_map@100 | 0.6318 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 8,970 training samples
* Columns: <code>positive</code> and <code>anchor</code>
* Approximate statistics based on the first 1000 samples:
| | positive | anchor |
|:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 68 tokens</li><li>mean: 124.21 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 14.35 tokens</li><li>max: 35 tokens</li></ul> |
* Samples:
| positive | anchor |
|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------|
| <code>Diyarbakฤฑr ilinin Bismil ilรงesine baฤlฤฑ bir mahalledir. Tarihรงe Mahallenin adฤฑ, 1928 yฤฑlฤฑ kayฤฑtlarฤฑnda olarak geรงmektedir. Coฤrafya Diyarbakฤฑr il merkezine 57 km, Bismil ilรงe merkezine 22 km uzaklฤฑktadฤฑr. Nรผfus Yฤฑllara gรถre mahalle nรผfus verileri 2007 2000 185 1997 165 Kaynakรงa Dฤฑล baฤlantฤฑlar Yerelnet mahalleleri</code> | <code>Mahallenin adฤฑ ne zaman kaydedilmiลtir?</code> |
| <code>'''karmaลฤฑk neden''', '''nedensel aลฤฑrฤฑ '''nedensel veya '''indirgeme safsatasฤฑ''', bir sonucun birkaรง nedenden kaynaklanmasฤฑ mรผmkรผnken; bir tek nedeni olduฤu varsayฤฑldฤฑฤฤฑnda ortaya รงฤฑkan kuลkulu neden safsatasฤฑdฤฑr. Mantฤฑksal olarak ลu ลekilde aรงฤฑklanabilir: "X, Y'ye neden oldu; bu nedenle, X, Y'nin tek nedeniydi" Nedensel aลฤฑrฤฑ basitleลtirme, birleลik olasฤฑlฤฑklarฤฑn gรถz ardฤฑ edildiฤi belirli bir tรผr yanlฤฑล ikilemdir. Diฤer bir deyiลle, "A ve ve C" veya "A ve ama deฤil" ลeklindeki รถncรผller dikkate alฤฑnmadฤฑฤฤฑnda olasฤฑ nedenlerin "A veya veya C" olduฤu varsayฤฑlฤฑr. Kaynakรงa</code> | <code>Karmaลฤฑk neden safsatasฤฑ nedir ve nasฤฑl oluลur?</code> |
| <code>Akyazฤฑ Sakarya ili ilรงesi Akyazฤฑ, Adฤฑyaman Adฤฑyaman ili merkez ilรงesine baฤlฤฑ kรถy Akyazฤฑ, Besni Adฤฑyaman ili Besni ilรงesine baฤlฤฑ kรถy Akyazฤฑ, Amasya Amasya ili merkez ilรงesine baฤlฤฑ kรถy Akyazฤฑ, Adilcevaz Bitlis ili Adilcevaz ilรงesine baฤlฤฑ kรถy Akyazฤฑ, Dรผzce Dรผzce ili merkez ilรงesine baฤlฤฑ kรถy Akyazฤฑ, รorum รorum ili merkez ilรงesine baฤlฤฑ kรถy Akyazฤฑ, Aziziye Erzurum ili Aziziye ilรงesine baฤlฤฑ mahalle Akyazฤฑ, Kฤฑzฤฑltepe Mardin ili Kฤฑzฤฑltepe ilรงesine baฤlฤฑ mahalle Akyazฤฑ, Asarcฤฑk Samsun ili Asarcฤฑk ilรงesine baฤlฤฑ mahalle Akyazฤฑ, Ortahisar Trabzon ili Ortahisar ilรงesine baฤlฤฑ mahalle</code> | <code>Akyazฤฑ adฤฑnda kaรง kรถy vardฤฑr?</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
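
As a rough illustration of how this loss is typically wired up in Sentence Transformers 3.x, here is a minimal training sketch; the two toy (anchor, positive) pairs and the base-model id are placeholders for the actual 8,970-sample json dataset and base checkpoint.

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("base-model-id")  # placeholder for the actual base checkpoint

# Toy pairs standing in for the real anchor/positive columns.
train_dataset = Dataset.from_dict({
    "anchor": [
        "When was the neighbourhood first recorded?",
        "What is the causal oversimplification fallacy?",
    ],
    "positive": [
        "The neighbourhood appears in records from 1928.",
        "It assumes a single cause where several are possible.",
    ],
})

loss = MultipleNegativesRankingLoss(model)  # defaults match the card: scale=20.0, cos_sim

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```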
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `tf32`: False
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
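
A sketch of how these non-default values could be expressed as training arguments follows; `output_dir` and `save_strategy` are assumptions not listed above (`load_best_model_at_end=True` requires the save and eval strategies to match).

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",    # placeholder
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed; needed for load_best_model_at_end
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=False,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```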
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: False
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | dim_384_cosine_ndcg@10 |
|:----------:|:------:|:-------------:|:----------------------:|
| 0.5694 | 10 | 0.8456 | - |
| 0.9680 | 17 | - | 0.5968 |
| 1.1388 | 20 | 0.4964 | - |
| 1.7082 | 30 | 0.393 | - |
| 1.9929 | 35 | - | 0.6429 |
| 2.2776 | 40 | 0.3235 | - |
| 2.8470 | 50 | 0.2816 | - |
| 2.9609 | 52 | - | 0.6532 |
| 3.4164 | 60 | 0.2653 | - |
| **3.9858** | **70** | **0.2408** | **0.6576** |
| 4.5552 | 80 | 0.2379 | - |
| 4.8399 | 85 | - | 0.6573 |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.12.7
- Sentence Transformers: 3.3.1
- Transformers: 4.41.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.1.1
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
MakiAi/Llama-3-2-3B-Instruct-bnb-4bit-OKU-v1-10epochs | MakiAi | 2024-11-25T07:43:32Z | 104 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"base_model:finetune:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T07:41:41Z | ---
base_model: unsloth/Llama-3.2-3B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** MakiAi
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Llama-3.2-3B-Instruct-bnb-4bit
This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
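
The card itself ships no usage snippet; a minimal sketch with plain `transformers` follows, assuming the repo exposes a standard Llama-3.2 chat template (loading the 4-bit bnb weights also requires `bitsandbytes`):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MakiAi/Llama-3-2-3B-Instruct-bnb-4bit-OKU-v1-10epochs"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Hello, who are you?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```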
|
MakiAi/Llama-3-2-3B-Instruct-bnb-4bit-OKU-v1-10epochs-adapter | MakiAi | 2024-11-25T07:39:20Z | 107 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"base_model:finetune:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T07:37:13Z | ---
base_model: unsloth/Llama-3.2-3B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** MakiAi
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Llama-3.2-3B-Instruct-bnb-4bit
This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
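
The `-adapter` suffix suggests this repo holds LoRA adapter weights rather than a merged model; assuming so, a sketch of attaching them to the base checkpoint with `peft`:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/Llama-3.2-3B-Instruct-bnb-4bit"
adapter_id = "MakiAi/Llama-3-2-3B-Instruct-bnb-4bit-OKU-v1-10epochs-adapter"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attaches the adapter weights
```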
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k35_task2_organization_fold1 | MayBashendy | 2024-11-25T07:35:14Z | 162 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T07:19:13Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k35_task2_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k35_task2_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5329
- Qwk: 0.5
- Mse: 0.5329
- Rmse: 0.7300
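
Here Qwk is quadratic weighted Cohen's kappa and Rmse is the square root of Mse; a minimal sketch of computing them with scikit-learn on illustrative integer scores (the real run used the held-out fold):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative labels; the actual evaluation set is not published here.
y_true = np.array([0, 1, 2, 1, 0])
y_pred = np.array([0, 2, 2, 1, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(qwk, mse, rmse)
```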
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
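
Expressed as `transformers` training arguments, the listed values would look roughly like the sketch below; `output_dir` is a placeholder, and the Adam betas/epsilon above are the library defaults, so they need no explicit flags.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task2-organization-fold1",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```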
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0073 | 2 | 4.3121 | -0.0167 | 4.3121 | 2.0766 |
| No log | 0.0147 | 4 | 2.4805 | -0.0610 | 2.4805 | 1.5749 |
| No log | 0.0220 | 6 | 2.3076 | 0.1290 | 2.3076 | 1.5191 |
| No log | 0.0293 | 8 | 2.6382 | 0.0656 | 2.6382 | 1.6242 |
| No log | 0.0366 | 10 | 2.2411 | 0.1529 | 2.2411 | 1.4970 |
| No log | 0.0440 | 12 | 1.5364 | 0.1322 | 1.5364 | 1.2395 |
| No log | 0.0513 | 14 | 1.5650 | 0.0556 | 1.5650 | 1.2510 |
| No log | 0.0586 | 16 | 1.4212 | 0.0556 | 1.4212 | 1.1921 |
| No log | 0.0659 | 18 | 1.5321 | 0.0870 | 1.5321 | 1.2378 |
| No log | 0.0733 | 20 | 1.1517 | 0.1266 | 1.1517 | 1.0732 |
| No log | 0.0806 | 22 | 0.7274 | 0.1905 | 0.7274 | 0.8529 |
| No log | 0.0879 | 24 | 0.6328 | 0.1639 | 0.6328 | 0.7955 |
| No log | 0.0952 | 26 | 0.7007 | 0.1290 | 0.7007 | 0.8371 |
| No log | 0.1026 | 28 | 0.8374 | 0.0 | 0.8374 | 0.9151 |
| No log | 0.1099 | 30 | 1.4702 | 0.1395 | 1.4702 | 1.2125 |
| No log | 0.1172 | 32 | 1.7090 | 0.1818 | 1.7090 | 1.3073 |
| No log | 0.1245 | 34 | 1.4305 | 0.1111 | 1.4305 | 1.1960 |
| No log | 0.1319 | 36 | 1.0387 | 0.0308 | 1.0387 | 1.0192 |
| No log | 0.1392 | 38 | 0.8915 | 0.0308 | 0.8915 | 0.9442 |
| No log | 0.1465 | 40 | 0.7929 | 0.1563 | 0.7929 | 0.8905 |
| No log | 0.1538 | 42 | 1.0850 | 0.1266 | 1.0850 | 1.0416 |
| No log | 0.1612 | 44 | 1.2941 | 0.1869 | 1.2941 | 1.1376 |
| No log | 0.1685 | 46 | 2.0278 | 0.0559 | 2.0278 | 1.4240 |
| No log | 0.1758 | 48 | 1.8639 | 0.0930 | 1.8639 | 1.3652 |
| No log | 0.1832 | 50 | 1.0180 | 0.1127 | 1.0180 | 1.0090 |
| No log | 0.1905 | 52 | 0.5459 | 0.4407 | 0.5459 | 0.7389 |
| No log | 0.1978 | 54 | 0.5375 | 0.2105 | 0.5375 | 0.7331 |
| No log | 0.2051 | 56 | 0.6592 | 0.2623 | 0.6592 | 0.8119 |
| No log | 0.2125 | 58 | 0.9970 | 0.0 | 0.9970 | 0.9985 |
| No log | 0.2198 | 60 | 1.2622 | 0.0 | 1.2622 | 1.1235 |
| No log | 0.2271 | 62 | 1.2415 | 0.0 | 1.2415 | 1.1142 |
| No log | 0.2344 | 64 | 0.9522 | 0.0 | 0.9522 | 0.9758 |
| No log | 0.2418 | 66 | 0.6901 | 0.1905 | 0.6901 | 0.8307 |
| No log | 0.2491 | 68 | 0.5747 | 0.0339 | 0.5747 | 0.7581 |
| No log | 0.2564 | 70 | 0.5642 | 0.0339 | 0.5642 | 0.7512 |
| No log | 0.2637 | 72 | 0.6176 | 0.2000 | 0.6176 | 0.7859 |
| No log | 0.2711 | 74 | 0.6859 | 0.2258 | 0.6859 | 0.8282 |
| No log | 0.2784 | 76 | 0.8934 | 0.1563 | 0.8934 | 0.9452 |
| No log | 0.2857 | 78 | 0.9462 | 0.1563 | 0.9462 | 0.9727 |
| No log | 0.2930 | 80 | 0.8605 | 0.1905 | 0.8605 | 0.9276 |
| No log | 0.3004 | 82 | 0.7688 | 0.1905 | 0.7688 | 0.8768 |
| No log | 0.3077 | 84 | 0.6366 | 0.0339 | 0.6366 | 0.7979 |
| No log | 0.3150 | 86 | 0.6321 | 0.0339 | 0.6321 | 0.7951 |
| No log | 0.3223 | 88 | 0.7504 | 0.2000 | 0.7504 | 0.8663 |
| No log | 0.3297 | 90 | 0.8476 | 0.2258 | 0.8476 | 0.9206 |
| No log | 0.3370 | 92 | 0.7770 | 0.2258 | 0.7770 | 0.8815 |
| No log | 0.3443 | 94 | 0.6783 | 0.2000 | 0.6783 | 0.8236 |
| No log | 0.3516 | 96 | 0.5899 | 0.2105 | 0.5899 | 0.7680 |
| No log | 0.3590 | 98 | 0.5677 | 0.1429 | 0.5677 | 0.7535 |
| No log | 0.3663 | 100 | 0.5985 | 0.2105 | 0.5985 | 0.7737 |
| No log | 0.3736 | 102 | 0.6789 | 0.3390 | 0.6789 | 0.8239 |
| No log | 0.3810 | 104 | 0.7268 | 0.2258 | 0.7268 | 0.8525 |
| No log | 0.3883 | 106 | 0.8349 | 0.1563 | 0.8349 | 0.9137 |
| No log | 0.3956 | 108 | 0.7980 | 0.1563 | 0.7980 | 0.8933 |
| No log | 0.4029 | 110 | 0.6952 | 0.1563 | 0.6952 | 0.8338 |
| No log | 0.4103 | 112 | 0.6014 | 0.2258 | 0.6014 | 0.7755 |
| No log | 0.4176 | 114 | 0.4932 | 0.3793 | 0.4932 | 0.7023 |
| No log | 0.4249 | 116 | 0.4988 | 0.3333 | 0.4988 | 0.7062 |
| No log | 0.4322 | 118 | 0.5628 | 0.1818 | 0.5628 | 0.7502 |
| No log | 0.4396 | 120 | 0.6661 | -0.0755 | 0.6661 | 0.8161 |
| No log | 0.4469 | 122 | 0.7210 | -0.0408 | 0.7210 | 0.8491 |
| No log | 0.4542 | 124 | 0.7354 | 0.0400 | 0.7354 | 0.8576 |
| No log | 0.4615 | 126 | 0.6108 | 0.1818 | 0.6108 | 0.7816 |
| No log | 0.4689 | 128 | 0.6323 | 0.2373 | 0.6323 | 0.7952 |
| No log | 0.4762 | 130 | 0.7069 | 0.1724 | 0.7069 | 0.8408 |
| No log | 0.4835 | 132 | 0.7427 | 0.2105 | 0.7427 | 0.8618 |
| No log | 0.4908 | 134 | 0.7664 | -0.0755 | 0.7664 | 0.8754 |
| No log | 0.4982 | 136 | 0.7678 | 0.0 | 0.7678 | 0.8762 |
| No log | 0.5055 | 138 | 0.7011 | 0.1509 | 0.7011 | 0.8373 |
| No log | 0.5128 | 140 | 0.6116 | 0.2105 | 0.6116 | 0.7820 |
| No log | 0.5201 | 142 | 0.6154 | 0.2105 | 0.6154 | 0.7845 |
| No log | 0.5275 | 144 | 0.8091 | 0.3200 | 0.8091 | 0.8995 |
| No log | 0.5348 | 146 | 0.9449 | 0.2895 | 0.9449 | 0.9720 |
| No log | 0.5421 | 148 | 0.9432 | 0.2895 | 0.9432 | 0.9712 |
| No log | 0.5495 | 150 | 0.7335 | 0.2258 | 0.7335 | 0.8564 |
| No log | 0.5568 | 152 | 0.5818 | 0.4407 | 0.5818 | 0.7627 |
| No log | 0.5641 | 154 | 0.5265 | 0.3226 | 0.5265 | 0.7256 |
| No log | 0.5714 | 156 | 0.5171 | 0.3284 | 0.5171 | 0.7191 |
| No log | 0.5788 | 158 | 0.5730 | 0.3836 | 0.5730 | 0.7569 |
| No log | 0.5861 | 160 | 0.6338 | 0.3836 | 0.6338 | 0.7961 |
| No log | 0.5934 | 162 | 0.7125 | 0.3836 | 0.7125 | 0.8441 |
| No log | 0.6007 | 164 | 0.9304 | 0.3762 | 0.9304 | 0.9646 |
| No log | 0.6081 | 166 | 0.8147 | 0.2921 | 0.8147 | 0.9026 |
| No log | 0.6154 | 168 | 0.7102 | 0.2258 | 0.7102 | 0.8428 |
| No log | 0.6227 | 170 | 0.7472 | 0.3143 | 0.7472 | 0.8644 |
| No log | 0.6300 | 172 | 0.8547 | 0.2857 | 0.8547 | 0.9245 |
| No log | 0.6374 | 174 | 0.9096 | 0.2826 | 0.9096 | 0.9537 |
| No log | 0.6447 | 176 | 0.9356 | 0.3077 | 0.9356 | 0.9673 |
| No log | 0.6520 | 178 | 0.8476 | 0.1972 | 0.8476 | 0.9207 |
| No log | 0.6593 | 180 | 0.8232 | 0.0727 | 0.8232 | 0.9073 |
| No log | 0.6667 | 182 | 0.8438 | 0.0727 | 0.8438 | 0.9186 |
| No log | 0.6740 | 184 | 0.9355 | 0.0282 | 0.9355 | 0.9672 |
| No log | 0.6813 | 186 | 0.9849 | 0.2308 | 0.9849 | 0.9924 |
| No log | 0.6886 | 188 | 0.9396 | 0.2308 | 0.9396 | 0.9693 |
| No log | 0.6960 | 190 | 0.8917 | 0.1600 | 0.8917 | 0.9443 |
| No log | 0.7033 | 192 | 0.9917 | -0.1000 | 0.9917 | 0.9958 |
| No log | 0.7106 | 194 | 0.9400 | 0.0625 | 0.9400 | 0.9695 |
| No log | 0.7179 | 196 | 0.8076 | 0.2192 | 0.8076 | 0.8986 |
| No log | 0.7253 | 198 | 0.8555 | 0.3544 | 0.8555 | 0.9249 |
| No log | 0.7326 | 200 | 0.8250 | 0.2059 | 0.8250 | 0.9083 |
| No log | 0.7399 | 202 | 0.8208 | -0.1887 | 0.8208 | 0.9060 |
| No log | 0.7473 | 204 | 0.8430 | -0.0800 | 0.8430 | 0.9181 |
| No log | 0.7546 | 206 | 0.7761 | 0.0 | 0.7761 | 0.8809 |
| No log | 0.7619 | 208 | 0.7385 | 0.0 | 0.7385 | 0.8594 |
| No log | 0.7692 | 210 | 0.7824 | 0.2105 | 0.7824 | 0.8846 |
| No log | 0.7766 | 212 | 0.7792 | 0.0 | 0.7792 | 0.8827 |
| No log | 0.7839 | 214 | 0.7983 | 0.0 | 0.7983 | 0.8935 |
| No log | 0.7912 | 216 | 0.8124 | 0.0 | 0.8124 | 0.9013 |
| No log | 0.7985 | 218 | 0.8442 | 0.0690 | 0.8442 | 0.9188 |
| No log | 0.8059 | 220 | 0.8922 | 0.0690 | 0.8922 | 0.9446 |
| No log | 0.8132 | 222 | 0.8998 | 0.1231 | 0.8998 | 0.9486 |
| No log | 0.8205 | 224 | 0.8578 | 0.0 | 0.8578 | 0.9262 |
| No log | 0.8278 | 226 | 0.8195 | 0.1639 | 0.8195 | 0.9053 |
| No log | 0.8352 | 228 | 0.8399 | 0.2623 | 0.8399 | 0.9165 |
| No log | 0.8425 | 230 | 0.9020 | 0.2247 | 0.9020 | 0.9497 |
| No log | 0.8498 | 232 | 0.9211 | 0.2069 | 0.9211 | 0.9598 |
| No log | 0.8571 | 234 | 0.8606 | 0.4000 | 0.8606 | 0.9277 |
| No log | 0.8645 | 236 | 0.8381 | 0.4396 | 0.8381 | 0.9155 |
| No log | 0.8718 | 238 | 0.7499 | 0.2895 | 0.7499 | 0.8659 |
| No log | 0.8791 | 240 | 0.7132 | 0.3284 | 0.7132 | 0.8445 |
| No log | 0.8864 | 242 | 0.7819 | 0.3143 | 0.7819 | 0.8843 |
| No log | 0.8938 | 244 | 0.7760 | 0.2154 | 0.7760 | 0.8809 |
| No log | 0.9011 | 246 | 0.7711 | 0.2154 | 0.7711 | 0.8781 |
| No log | 0.9084 | 248 | 0.8401 | 0.25 | 0.8401 | 0.9166 |
| No log | 0.9158 | 250 | 0.9757 | 0.2041 | 0.9757 | 0.9878 |
| No log | 0.9231 | 252 | 0.9501 | 0.1758 | 0.9501 | 0.9747 |
| No log | 0.9304 | 254 | 0.8671 | 0.3415 | 0.8671 | 0.9312 |
| No log | 0.9377 | 256 | 0.8438 | 0.3250 | 0.8438 | 0.9186 |
| No log | 0.9451 | 258 | 0.8097 | 0.3377 | 0.8097 | 0.8998 |
| No log | 0.9524 | 260 | 0.7259 | 0.4935 | 0.7259 | 0.8520 |
| No log | 0.9597 | 262 | 0.6525 | 0.3014 | 0.6525 | 0.8078 |
| No log | 0.9670 | 264 | 0.6580 | 0.4935 | 0.6580 | 0.8112 |
| No log | 0.9744 | 266 | 0.8068 | 0.3544 | 0.8068 | 0.8982 |
| No log | 0.9817 | 268 | 0.8286 | 0.24 | 0.8286 | 0.9103 |
| No log | 0.9890 | 270 | 0.7183 | 0.3836 | 0.7183 | 0.8475 |
| No log | 0.9963 | 272 | 0.5954 | 0.4706 | 0.5954 | 0.7716 |
| No log | 1.0037 | 274 | 0.5651 | 0.2759 | 0.5651 | 0.7517 |
| No log | 1.0110 | 276 | 0.5733 | 0.3390 | 0.5733 | 0.7572 |
| No log | 1.0183 | 278 | 0.6954 | 0.3836 | 0.6954 | 0.8339 |
| No log | 1.0256 | 280 | 0.8501 | 0.24 | 0.8501 | 0.9220 |
| No log | 1.0330 | 282 | 0.7854 | 0.2703 | 0.7854 | 0.8862 |
| No log | 1.0403 | 284 | 0.6793 | 0.3014 | 0.6793 | 0.8242 |
| No log | 1.0476 | 286 | 0.6577 | 0.3077 | 0.6577 | 0.8110 |
| No log | 1.0549 | 288 | 0.6799 | 0.1905 | 0.6799 | 0.8246 |
| No log | 1.0623 | 290 | 0.7062 | 0.1356 | 0.7062 | 0.8403 |
| No log | 1.0696 | 292 | 0.8099 | 0.0870 | 0.8099 | 0.8999 |
| No log | 1.0769 | 294 | 0.8894 | 0.0571 | 0.8894 | 0.9431 |
| No log | 1.0842 | 296 | 0.9339 | 0.2222 | 0.9339 | 0.9664 |
| No log | 1.0916 | 298 | 0.8429 | 0.0 | 0.8429 | 0.9181 |
| No log | 1.0989 | 300 | 0.7654 | 0.1290 | 0.7654 | 0.8749 |
| No log | 1.1062 | 302 | 0.7428 | 0.1290 | 0.7428 | 0.8619 |
| No log | 1.1136 | 304 | 0.7310 | 0.1290 | 0.7310 | 0.8550 |
| No log | 1.1209 | 306 | 0.7282 | 0.2623 | 0.7282 | 0.8533 |
| No log | 1.1282 | 308 | 0.7923 | 0.1818 | 0.7923 | 0.8901 |
| No log | 1.1355 | 310 | 0.7819 | 0.1739 | 0.7819 | 0.8843 |
| No log | 1.1429 | 312 | 0.7345 | 0.2059 | 0.7345 | 0.8570 |
| No log | 1.1502 | 314 | 0.7329 | 0.2059 | 0.7329 | 0.8561 |
| No log | 1.1575 | 316 | 0.7583 | 0.2059 | 0.7583 | 0.8708 |
| No log | 1.1648 | 318 | 0.7983 | 0.0833 | 0.7983 | 0.8935 |
| No log | 1.1722 | 320 | 0.7779 | 0.2192 | 0.7779 | 0.8820 |
| No log | 1.1795 | 322 | 0.6693 | 0.2623 | 0.6693 | 0.8181 |
| No log | 1.1868 | 324 | 0.6334 | 0.2000 | 0.6334 | 0.7958 |
| No log | 1.1941 | 326 | 0.6238 | 0.2759 | 0.6238 | 0.7898 |
| No log | 1.2015 | 328 | 0.6233 | 0.2353 | 0.6233 | 0.7895 |
| No log | 1.2088 | 330 | 0.6391 | 0.4 | 0.6391 | 0.7994 |
| No log | 1.2161 | 332 | 0.6363 | 0.4 | 0.6363 | 0.7977 |
| No log | 1.2234 | 334 | 0.6430 | 0.3810 | 0.6430 | 0.8018 |
| No log | 1.2308 | 336 | 0.6525 | 0.3077 | 0.6525 | 0.8078 |
| No log | 1.2381 | 338 | 0.6687 | 0.3636 | 0.6687 | 0.8178 |
| No log | 1.2454 | 340 | 0.6741 | 0.3636 | 0.6741 | 0.8210 |
| No log | 1.2527 | 342 | 0.6576 | 0.2500 | 0.6576 | 0.8109 |
| No log | 1.2601 | 344 | 0.6509 | 0.4 | 0.6509 | 0.8068 |
| No log | 1.2674 | 346 | 0.6404 | 0.3571 | 0.6404 | 0.8002 |
| No log | 1.2747 | 348 | 0.6342 | 0.2759 | 0.6342 | 0.7964 |
| No log | 1.2821 | 350 | 0.6277 | 0.2222 | 0.6277 | 0.7922 |
| No log | 1.2894 | 352 | 0.6099 | 0.3571 | 0.6099 | 0.7810 |
| No log | 1.2967 | 354 | 0.6032 | 0.3390 | 0.6032 | 0.7767 |
| No log | 1.3040 | 356 | 0.6133 | 0.3390 | 0.6133 | 0.7832 |
| No log | 1.3114 | 358 | 0.6409 | 0.2941 | 0.6409 | 0.8005 |
| No log | 1.3187 | 360 | 0.6656 | 0.1429 | 0.6656 | 0.8158 |
| No log | 1.3260 | 362 | 0.6355 | 0.3636 | 0.6355 | 0.7972 |
| No log | 1.3333 | 364 | 0.6714 | 0.3662 | 0.6714 | 0.8194 |
| No log | 1.3407 | 366 | 0.7443 | 0.3478 | 0.7443 | 0.8627 |
| No log | 1.3480 | 368 | 0.6946 | 0.3377 | 0.6946 | 0.8335 |
| No log | 1.3553 | 370 | 0.6518 | 0.3284 | 0.6518 | 0.8074 |
| No log | 1.3626 | 372 | 0.6539 | 0.3284 | 0.6539 | 0.8086 |
| No log | 1.3700 | 374 | 0.6596 | 0.3284 | 0.6596 | 0.8122 |
| No log | 1.3773 | 376 | 0.6561 | 0.2059 | 0.6561 | 0.8100 |
| No log | 1.3846 | 378 | 0.6469 | 0.2059 | 0.6469 | 0.8043 |
| No log | 1.3919 | 380 | 0.6530 | 0.3636 | 0.6530 | 0.8081 |
| No log | 1.3993 | 382 | 0.6661 | 0.2609 | 0.6661 | 0.8162 |
| No log | 1.4066 | 384 | 0.6523 | 0.3636 | 0.6523 | 0.8077 |
| No log | 1.4139 | 386 | 0.6462 | 0.2941 | 0.6462 | 0.8039 |
| No log | 1.4212 | 388 | 0.6491 | 0.3143 | 0.6491 | 0.8057 |
| No log | 1.4286 | 390 | 0.6460 | 0.3143 | 0.6460 | 0.8037 |
| No log | 1.4359 | 392 | 0.6472 | 0.3143 | 0.6472 | 0.8045 |
| No log | 1.4432 | 394 | 0.6201 | 0.3284 | 0.6201 | 0.7875 |
| No log | 1.4505 | 396 | 0.6237 | 0.5135 | 0.6237 | 0.7898 |
| No log | 1.4579 | 398 | 0.6301 | 0.5135 | 0.6301 | 0.7938 |
| No log | 1.4652 | 400 | 0.6250 | 0.3662 | 0.6250 | 0.7906 |
| No log | 1.4725 | 402 | 0.6350 | 0.3824 | 0.6350 | 0.7968 |
| No log | 1.4799 | 404 | 0.6210 | 0.2623 | 0.6210 | 0.7880 |
| No log | 1.4872 | 406 | 0.6001 | 0.4407 | 0.6001 | 0.7746 |
| No log | 1.4945 | 408 | 0.5934 | 0.4407 | 0.5934 | 0.7703 |
| No log | 1.5018 | 410 | 0.5976 | 0.2623 | 0.5976 | 0.7730 |
| No log | 1.5092 | 412 | 0.6302 | 0.2500 | 0.6302 | 0.7939 |
| No log | 1.5165 | 414 | 0.6600 | 0.1231 | 0.6600 | 0.8124 |
| No log | 1.5238 | 416 | 0.6476 | 0.2500 | 0.6476 | 0.8047 |
| No log | 1.5311 | 418 | 0.5943 | 0.2258 | 0.5943 | 0.7709 |
| No log | 1.5385 | 420 | 0.5791 | 0.3793 | 0.5791 | 0.7610 |
| No log | 1.5458 | 422 | 0.5757 | 0.3793 | 0.5757 | 0.7587 |
| No log | 1.5531 | 424 | 0.5579 | 0.2800 | 0.5579 | 0.7469 |
| No log | 1.5604 | 426 | 0.5364 | 0.4231 | 0.5364 | 0.7324 |
| No log | 1.5678 | 428 | 0.5809 | 0.2727 | 0.5809 | 0.7621 |
| No log | 1.5751 | 430 | 0.7448 | 0.2941 | 0.7448 | 0.8630 |
| No log | 1.5824 | 432 | 0.7859 | 0.24 | 0.7859 | 0.8865 |
| No log | 1.5897 | 434 | 0.6662 | 0.2941 | 0.6662 | 0.8162 |
| No log | 1.5971 | 436 | 0.5368 | 0.2500 | 0.5368 | 0.7327 |
| No log | 1.6044 | 438 | 0.5174 | 0.4375 | 0.5174 | 0.7193 |
| No log | 1.6117 | 440 | 0.5398 | 0.4194 | 0.5398 | 0.7347 |
| No log | 1.6190 | 442 | 0.5335 | 0.4762 | 0.5335 | 0.7304 |
| No log | 1.6264 | 444 | 0.5624 | 0.2941 | 0.5624 | 0.7499 |
| No log | 1.6337 | 446 | 0.5967 | 0.4156 | 0.5967 | 0.7724 |
| No log | 1.6410 | 448 | 0.5637 | 0.3284 | 0.5637 | 0.7508 |
| No log | 1.6484 | 450 | 0.5546 | 0.4923 | 0.5546 | 0.7447 |
| No log | 1.6557 | 452 | 0.5538 | 0.3810 | 0.5538 | 0.7442 |
| No log | 1.6630 | 454 | 0.5660 | 0.3158 | 0.5660 | 0.7523 |
| No log | 1.6703 | 456 | 0.5565 | 0.3390 | 0.5565 | 0.7460 |
| No log | 1.6777 | 458 | 0.5662 | 0.2623 | 0.5662 | 0.7525 |
| No log | 1.6850 | 460 | 0.5690 | 0.4 | 0.5690 | 0.7543 |
| No log | 1.6923 | 462 | 0.5852 | 0.2623 | 0.5852 | 0.7650 |
| No log | 1.6996 | 464 | 0.6280 | 0.2500 | 0.6280 | 0.7925 |
| No log | 1.7070 | 466 | 0.5984 | 0.2258 | 0.5984 | 0.7736 |
| No log | 1.7143 | 468 | 0.5443 | 0.2623 | 0.5443 | 0.7378 |
| No log | 1.7216 | 470 | 0.5330 | 0.4375 | 0.5330 | 0.7301 |
| No log | 1.7289 | 472 | 0.5178 | 0.4375 | 0.5178 | 0.7196 |
| No log | 1.7363 | 474 | 0.5149 | 0.3077 | 0.5149 | 0.7176 |
| No log | 1.7436 | 476 | 0.5277 | 0.3077 | 0.5277 | 0.7265 |
| No log | 1.7509 | 478 | 0.5569 | 0.3662 | 0.5569 | 0.7462 |
| No log | 1.7582 | 480 | 0.6063 | 0.3662 | 0.6063 | 0.7787 |
| No log | 1.7656 | 482 | 0.5898 | 0.3662 | 0.5898 | 0.7680 |
| No log | 1.7729 | 484 | 0.5742 | 0.3662 | 0.5742 | 0.7577 |
| No log | 1.7802 | 486 | 0.5415 | 0.3662 | 0.5415 | 0.7358 |
| No log | 1.7875 | 488 | 0.5302 | 0.3143 | 0.5302 | 0.7281 |
| No log | 1.7949 | 490 | 0.5378 | 0.3143 | 0.5378 | 0.7333 |
| No log | 1.8022 | 492 | 0.5824 | 0.3077 | 0.5824 | 0.7631 |
| No log | 1.8095 | 494 | 0.6343 | 0.3684 | 0.6343 | 0.7965 |
| No log | 1.8168 | 496 | 0.6774 | 0.3544 | 0.6774 | 0.8231 |
| No log | 1.8242 | 498 | 0.6618 | 0.1972 | 0.6618 | 0.8135 |
| 0.4394 | 1.8315 | 500 | 0.6764 | 0.3333 | 0.6764 | 0.8224 |
| 0.4394 | 1.8388 | 502 | 0.5942 | 0.3077 | 0.5942 | 0.7708 |
| 0.4394 | 1.8462 | 504 | 0.5660 | 0.4375 | 0.5660 | 0.7524 |
| 0.4394 | 1.8535 | 506 | 0.5729 | 0.3810 | 0.5729 | 0.7569 |
| 0.4394 | 1.8608 | 508 | 0.5689 | 0.4375 | 0.5689 | 0.7542 |
| 0.4394 | 1.8681 | 510 | 0.6016 | 0.3077 | 0.6016 | 0.7756 |
| 0.4394 | 1.8755 | 512 | 0.6428 | 0.1739 | 0.6428 | 0.8017 |
| 0.4394 | 1.8828 | 514 | 0.6807 | 0.1429 | 0.6807 | 0.8251 |
| 0.4394 | 1.8901 | 516 | 0.7161 | 0.1039 | 0.7161 | 0.8462 |
| 0.4394 | 1.8974 | 518 | 0.7035 | 0.1039 | 0.7035 | 0.8387 |
| 0.4394 | 1.9048 | 520 | 0.6607 | 0.2727 | 0.6607 | 0.8128 |
| 0.4394 | 1.9121 | 522 | 0.6605 | 0.4348 | 0.6605 | 0.8127 |
| 0.4394 | 1.9194 | 524 | 0.6348 | 0.4762 | 0.6348 | 0.7967 |
| 0.4394 | 1.9267 | 526 | 0.6357 | 0.0870 | 0.6357 | 0.7973 |
| 0.4394 | 1.9341 | 528 | 0.6515 | 0.1972 | 0.6515 | 0.8071 |
| 0.4394 | 1.9414 | 530 | 0.6058 | 0.0909 | 0.6058 | 0.7783 |
| 0.4394 | 1.9487 | 532 | 0.5824 | 0.0339 | 0.5824 | 0.7631 |
| 0.4394 | 1.9560 | 534 | 0.5448 | 0.1000 | 0.5448 | 0.7381 |
| 0.4394 | 1.9634 | 536 | 0.5489 | 0.0 | 0.5489 | 0.7409 |
| 0.4394 | 1.9707 | 538 | 0.5613 | 0.2623 | 0.5613 | 0.7492 |
| 0.4394 | 1.9780 | 540 | 0.5872 | 0.3478 | 0.5872 | 0.7663 |
| 0.4394 | 1.9853 | 542 | 0.6412 | 0.0909 | 0.6412 | 0.8007 |
| 0.4394 | 1.9927 | 544 | 0.6194 | 0.3478 | 0.6194 | 0.7870 |
| 0.4394 | 2.0 | 546 | 0.5844 | 0.3478 | 0.5844 | 0.7644 |
| 0.4394 | 2.0073 | 548 | 0.5885 | 0.3226 | 0.5885 | 0.7672 |
| 0.4394 | 2.0147 | 550 | 0.6054 | 0.3662 | 0.6054 | 0.7780 |
| 0.4394 | 2.0220 | 552 | 0.5808 | 0.3607 | 0.5808 | 0.7621 |
| 0.4394 | 2.0293 | 554 | 0.5551 | 0.3284 | 0.5551 | 0.7450 |
| 0.4394 | 2.0366 | 556 | 0.5516 | 0.2941 | 0.5516 | 0.7427 |
| 0.4394 | 2.0440 | 558 | 0.5470 | 0.4407 | 0.5470 | 0.7396 |
| 0.4394 | 2.0513 | 560 | 0.5433 | 0.4828 | 0.5433 | 0.7371 |
| 0.4394 | 2.0586 | 562 | 0.5563 | 0.3284 | 0.5563 | 0.7458 |
| 0.4394 | 2.0659 | 564 | 0.5637 | 0.3284 | 0.5637 | 0.7508 |
| 0.4394 | 2.0733 | 566 | 0.5684 | 0.3284 | 0.5684 | 0.7539 |
| 0.4394 | 2.0806 | 568 | 0.5737 | 0.4348 | 0.5737 | 0.7574 |
| 0.4394 | 2.0879 | 570 | 0.5828 | 0.3836 | 0.5828 | 0.7634 |
| 0.4394 | 2.0952 | 572 | 0.5724 | 0.4923 | 0.5724 | 0.7566 |
| 0.4394 | 2.1026 | 574 | 0.6167 | 0.4 | 0.6167 | 0.7853 |
| 0.4394 | 2.1099 | 576 | 0.6451 | 0.5 | 0.6451 | 0.8032 |
| 0.4394 | 2.1172 | 578 | 0.6168 | 0.5 | 0.6168 | 0.7854 |
| 0.4394 | 2.1245 | 580 | 0.5485 | 0.3284 | 0.5485 | 0.7406 |
| 0.4394 | 2.1319 | 582 | 0.5082 | 0.4375 | 0.5082 | 0.7129 |
| 0.4394 | 2.1392 | 584 | 0.5101 | 0.4375 | 0.5101 | 0.7142 |
| 0.4394 | 2.1465 | 586 | 0.5350 | 0.3636 | 0.5350 | 0.7314 |
| 0.4394 | 2.1538 | 588 | 0.6042 | 0.5352 | 0.6042 | 0.7773 |
| 0.4394 | 2.1612 | 590 | 0.6711 | 0.4179 | 0.6711 | 0.8192 |
| 0.4394 | 2.1685 | 592 | 0.6731 | 0.4179 | 0.6731 | 0.8204 |
| 0.4394 | 2.1758 | 594 | 0.5895 | 0.5352 | 0.5895 | 0.7678 |
| 0.4394 | 2.1832 | 596 | 0.5354 | 0.4348 | 0.5354 | 0.7317 |
| 0.4394 | 2.1905 | 598 | 0.5138 | 0.3390 | 0.5138 | 0.7168 |
| 0.4394 | 2.1978 | 600 | 0.5322 | 0.4 | 0.5322 | 0.7295 |
| 0.4394 | 2.2051 | 602 | 0.5366 | 0.4 | 0.5366 | 0.7325 |
| 0.4394 | 2.2125 | 604 | 0.5474 | 0.3636 | 0.5474 | 0.7398 |
| 0.4394 | 2.2198 | 606 | 0.5727 | 0.3810 | 0.5727 | 0.7568 |
| 0.4394 | 2.2271 | 608 | 0.5590 | 0.2623 | 0.5590 | 0.7476 |
| 0.4394 | 2.2344 | 610 | 0.5206 | 0.3390 | 0.5206 | 0.7215 |
| 0.4394 | 2.2418 | 612 | 0.5047 | 0.2041 | 0.5047 | 0.7104 |
| 0.4394 | 2.2491 | 614 | 0.5069 | 0.2041 | 0.5069 | 0.7120 |
| 0.4394 | 2.2564 | 616 | 0.5034 | 0.2041 | 0.5034 | 0.7095 |
| 0.4394 | 2.2637 | 618 | 0.5097 | 0.3774 | 0.5097 | 0.7139 |
| 0.4394 | 2.2711 | 620 | 0.5589 | 0.2857 | 0.5589 | 0.7476 |
| 0.4394 | 2.2784 | 622 | 0.5975 | 0.3077 | 0.5975 | 0.7730 |
| 0.4394 | 2.2857 | 624 | 0.6019 | 0.4179 | 0.6019 | 0.7759 |
| 0.4394 | 2.2930 | 626 | 0.5930 | 0.4179 | 0.5930 | 0.7701 |
| 0.4394 | 2.3004 | 628 | 0.5941 | 0.3824 | 0.5941 | 0.7708 |
| 0.4394 | 2.3077 | 630 | 0.5769 | 0.3636 | 0.5769 | 0.7596 |
| 0.4394 | 2.3150 | 632 | 0.5728 | 0.4857 | 0.5728 | 0.7568 |
| 0.4394 | 2.3223 | 634 | 0.5773 | 0.4857 | 0.5773 | 0.7598 |
| 0.4394 | 2.3297 | 636 | 0.5714 | 0.4375 | 0.5714 | 0.7559 |
| 0.4394 | 2.3370 | 638 | 0.5587 | 0.3636 | 0.5587 | 0.7475 |
| 0.4394 | 2.3443 | 640 | 0.5817 | 0.2857 | 0.5817 | 0.7627 |
| 0.4394 | 2.3516 | 642 | 0.6151 | 0.3077 | 0.6151 | 0.7843 |
| 0.4394 | 2.3590 | 644 | 0.6204 | 0.3077 | 0.6204 | 0.7876 |
| 0.4394 | 2.3663 | 646 | 0.5875 | 0.4179 | 0.5875 | 0.7665 |
| 0.4394 | 2.3736 | 648 | 0.5832 | 0.4179 | 0.5832 | 0.7637 |
| 0.4394 | 2.3810 | 650 | 0.5847 | 0.4179 | 0.5847 | 0.7647 |
| 0.4394 | 2.3883 | 652 | 0.5908 | 0.4179 | 0.5908 | 0.7686 |
| 0.4394 | 2.3956 | 654 | 0.6069 | 0.4348 | 0.6069 | 0.7791 |
| 0.4394 | 2.4029 | 656 | 0.6180 | 0.4 | 0.6180 | 0.7861 |
| 0.4394 | 2.4103 | 658 | 0.6022 | 0.4 | 0.6022 | 0.7760 |
| 0.4394 | 2.4176 | 660 | 0.5792 | 0.3824 | 0.5792 | 0.7610 |
| 0.4394 | 2.4249 | 662 | 0.5708 | 0.4167 | 0.5708 | 0.7555 |
| 0.4394 | 2.4322 | 664 | 0.5678 | 0.3662 | 0.5678 | 0.7535 |
| 0.4394 | 2.4396 | 666 | 0.5786 | 0.2500 | 0.5786 | 0.7607 |
| 0.4394 | 2.4469 | 668 | 0.5853 | 0.2500 | 0.5853 | 0.7650 |
| 0.4394 | 2.4542 | 670 | 0.5670 | 0.2500 | 0.5670 | 0.7530 |
| 0.4394 | 2.4615 | 672 | 0.5476 | 0.3226 | 0.5476 | 0.7400 |
| 0.4394 | 2.4689 | 674 | 0.5228 | 0.3607 | 0.5228 | 0.7230 |
| 0.4394 | 2.4762 | 676 | 0.5152 | 0.4762 | 0.5152 | 0.7178 |
| 0.4394 | 2.4835 | 678 | 0.5273 | 0.4706 | 0.5273 | 0.7261 |
| 0.4394 | 2.4908 | 680 | 0.5398 | 0.4167 | 0.5398 | 0.7347 |
| 0.4394 | 2.4982 | 682 | 0.5394 | 0.3684 | 0.5394 | 0.7344 |
| 0.4394 | 2.5055 | 684 | 0.5716 | 0.4304 | 0.5716 | 0.7560 |
| 0.4394 | 2.5128 | 686 | 0.5776 | 0.3846 | 0.5776 | 0.7600 |
| 0.4394 | 2.5201 | 688 | 0.5735 | 0.3704 | 0.5735 | 0.7573 |
| 0.4394 | 2.5275 | 690 | 0.5887 | 0.4 | 0.5887 | 0.7673 |
| 0.4394 | 2.5348 | 692 | 0.5916 | 0.4 | 0.5916 | 0.7691 |
| 0.4394 | 2.5421 | 694 | 0.5472 | 0.4 | 0.5472 | 0.7397 |
| 0.4394 | 2.5495 | 696 | 0.5257 | 0.2222 | 0.5257 | 0.7251 |
| 0.4394 | 2.5568 | 698 | 0.5270 | 0.2222 | 0.5270 | 0.7259 |
| 0.4394 | 2.5641 | 700 | 0.5273 | 0.2222 | 0.5273 | 0.7262 |
| 0.4394 | 2.5714 | 702 | 0.5283 | 0.2222 | 0.5283 | 0.7268 |
| 0.4394 | 2.5788 | 704 | 0.5297 | 0.3158 | 0.5297 | 0.7278 |
| 0.4394 | 2.5861 | 706 | 0.5331 | 0.4194 | 0.5331 | 0.7301 |
| 0.4394 | 2.5934 | 708 | 0.5773 | 0.5312 | 0.5773 | 0.7598 |
| 0.4394 | 2.6007 | 710 | 0.6296 | 0.5 | 0.6296 | 0.7935 |
| 0.4394 | 2.6081 | 712 | 0.5837 | 0.4 | 0.5837 | 0.7640 |
| 0.4394 | 2.6154 | 714 | 0.5268 | 0.4590 | 0.5268 | 0.7258 |
| 0.4394 | 2.6227 | 716 | 0.5174 | 0.3438 | 0.5174 | 0.7193 |
| 0.4394 | 2.6300 | 718 | 0.5159 | 0.4590 | 0.5159 | 0.7183 |
| 0.4394 | 2.6374 | 720 | 0.5134 | 0.4706 | 0.5134 | 0.7165 |
| 0.4394 | 2.6447 | 722 | 0.5149 | 0.3636 | 0.5149 | 0.7175 |
| 0.4394 | 2.6520 | 724 | 0.5306 | 0.4194 | 0.5306 | 0.7284 |
| 0.4394 | 2.6593 | 726 | 0.5574 | 0.3836 | 0.5574 | 0.7466 |
| 0.4394 | 2.6667 | 728 | 0.5736 | 0.3333 | 0.5736 | 0.7574 |
| 0.4394 | 2.6740 | 730 | 0.5960 | 0.3333 | 0.5960 | 0.7720 |
| 0.4394 | 2.6813 | 732 | 0.5995 | 0.3377 | 0.5995 | 0.7742 |
| 0.4394 | 2.6886 | 734 | 0.6189 | 0.2941 | 0.6189 | 0.7867 |
| 0.4394 | 2.6960 | 736 | 0.5905 | 0.3478 | 0.5905 | 0.7685 |
| 0.4394 | 2.7033 | 738 | 0.5434 | 0.3077 | 0.5434 | 0.7372 |
| 0.4394 | 2.7106 | 740 | 0.5324 | 0.4857 | 0.5324 | 0.7297 |
| 0.4394 | 2.7179 | 742 | 0.5279 | 0.4857 | 0.5279 | 0.7265 |
| 0.4394 | 2.7253 | 744 | 0.5274 | 0.4857 | 0.5274 | 0.7262 |
| 0.4394 | 2.7326 | 746 | 0.5309 | 0.3824 | 0.5309 | 0.7286 |
| 0.4394 | 2.7399 | 748 | 0.5575 | 0.3438 | 0.5575 | 0.7467 |
| 0.4394 | 2.7473 | 750 | 0.5721 | 0.2941 | 0.5721 | 0.7564 |
| 0.4394 | 2.7546 | 752 | 0.5613 | 0.2857 | 0.5613 | 0.7492 |
| 0.4394 | 2.7619 | 754 | 0.5312 | 0.4 | 0.5312 | 0.7288 |
| 0.4394 | 2.7692 | 756 | 0.5035 | 0.4590 | 0.5035 | 0.7096 |
| 0.4394 | 2.7766 | 758 | 0.5060 | 0.3284 | 0.5060 | 0.7113 |
| 0.4394 | 2.7839 | 760 | 0.5022 | 0.4348 | 0.5022 | 0.7087 |
| 0.4394 | 2.7912 | 762 | 0.4806 | 0.3077 | 0.4806 | 0.6933 |
| 0.4394 | 2.7985 | 764 | 0.4960 | 0.4545 | 0.4960 | 0.7043 |
| 0.4394 | 2.8059 | 766 | 0.5268 | 0.4 | 0.5268 | 0.7258 |
| 0.4394 | 2.8132 | 768 | 0.5510 | 0.3478 | 0.5510 | 0.7423 |
| 0.4394 | 2.8205 | 770 | 0.5621 | 0.2941 | 0.5621 | 0.7497 |
| 0.4394 | 2.8278 | 772 | 0.5218 | 0.4 | 0.5218 | 0.7223 |
| 0.4394 | 2.8352 | 774 | 0.5103 | 0.3824 | 0.5103 | 0.7144 |
| 0.4394 | 2.8425 | 776 | 0.5223 | 0.4179 | 0.5223 | 0.7227 |
| 0.4394 | 2.8498 | 778 | 0.5300 | 0.4324 | 0.5300 | 0.7280 |
| 0.4394 | 2.8571 | 780 | 0.5428 | 0.3636 | 0.5428 | 0.7368 |
| 0.4394 | 2.8645 | 782 | 0.5500 | 0.3077 | 0.5500 | 0.7416 |
| 0.4394 | 2.8718 | 784 | 0.5484 | 0.3077 | 0.5484 | 0.7406 |
| 0.4394 | 2.8791 | 786 | 0.5431 | 0.3000 | 0.5431 | 0.7370 |
| 0.4394 | 2.8864 | 788 | 0.5381 | 0.3000 | 0.5381 | 0.7336 |
| 0.4394 | 2.8938 | 790 | 0.5317 | 0.3390 | 0.5317 | 0.7292 |
| 0.4394 | 2.9011 | 792 | 0.5342 | 0.3390 | 0.5342 | 0.7309 |
| 0.4394 | 2.9084 | 794 | 0.5367 | 0.3390 | 0.5367 | 0.7326 |
| 0.4394 | 2.9158 | 796 | 0.5334 | 0.4 | 0.5334 | 0.7303 |
| 0.4394 | 2.9231 | 798 | 0.5481 | 0.3077 | 0.5481 | 0.7403 |
| 0.4394 | 2.9304 | 800 | 0.5912 | 0.3636 | 0.5912 | 0.7689 |
| 0.4394 | 2.9377 | 802 | 0.6102 | 0.3514 | 0.6102 | 0.7812 |
| 0.4394 | 2.9451 | 804 | 0.5929 | 0.3636 | 0.5929 | 0.7700 |
| 0.4394 | 2.9524 | 806 | 0.5741 | 0.3143 | 0.5741 | 0.7577 |
| 0.4394 | 2.9597 | 808 | 0.5686 | 0.3143 | 0.5686 | 0.7541 |
| 0.4394 | 2.9670 | 810 | 0.5581 | 0.3077 | 0.5581 | 0.7471 |
| 0.4394 | 2.9744 | 812 | 0.5456 | 0.3077 | 0.5456 | 0.7386 |
| 0.4394 | 2.9817 | 814 | 0.5521 | 0.4348 | 0.5521 | 0.7430 |
| 0.4394 | 2.9890 | 816 | 0.5743 | 0.4658 | 0.5743 | 0.7578 |
| 0.4394 | 2.9963 | 818 | 0.5881 | 0.3200 | 0.5881 | 0.7669 |
| 0.4394 | 3.0037 | 820 | 0.6378 | 0.3571 | 0.6378 | 0.7986 |
| 0.4394 | 3.0110 | 822 | 0.6884 | 0.3571 | 0.6884 | 0.8297 |
| 0.4394 | 3.0183 | 824 | 0.7122 | 0.3077 | 0.7122 | 0.8439 |
| 0.4394 | 3.0256 | 826 | 0.7344 | 0.3077 | 0.7344 | 0.8570 |
| 0.4394 | 3.0330 | 828 | 0.7099 | 0.3077 | 0.7099 | 0.8425 |
| 0.4394 | 3.0403 | 830 | 0.6956 | 0.3077 | 0.6956 | 0.8340 |
| 0.4394 | 3.0476 | 832 | 0.6589 | 0.2921 | 0.6589 | 0.8117 |
| 0.4394 | 3.0549 | 834 | 0.6135 | 0.3377 | 0.6135 | 0.7833 |
| 0.4394 | 3.0623 | 836 | 0.5683 | 0.3284 | 0.5683 | 0.7538 |
| 0.4394 | 3.0696 | 838 | 0.5688 | 0.3077 | 0.5688 | 0.7542 |
| 0.4394 | 3.0769 | 840 | 0.6306 | 0.4179 | 0.6306 | 0.7941 |
| 0.4394 | 3.0842 | 842 | 0.6851 | 0.4179 | 0.6851 | 0.8277 |
| 0.4394 | 3.0916 | 844 | 0.8455 | 0.2105 | 0.8455 | 0.9195 |
| 0.4394 | 3.0989 | 846 | 0.9481 | 0.2105 | 0.9481 | 0.9737 |
| 0.4394 | 3.1062 | 848 | 0.9267 | 0.2105 | 0.9267 | 0.9626 |
| 0.4394 | 3.1136 | 850 | 0.7810 | 0.2105 | 0.7810 | 0.8837 |
| 0.4394 | 3.1209 | 852 | 0.6059 | 0.3077 | 0.6059 | 0.7784 |
| 0.4394 | 3.1282 | 854 | 0.5549 | 0.3077 | 0.5549 | 0.7449 |
| 0.4394 | 3.1355 | 856 | 0.5495 | 0.2500 | 0.5495 | 0.7413 |
| 0.4394 | 3.1429 | 858 | 0.5506 | 0.3636 | 0.5506 | 0.7420 |
| 0.4394 | 3.1502 | 860 | 0.5643 | 0.3636 | 0.5643 | 0.7512 |
| 0.4394 | 3.1575 | 862 | 0.5817 | 0.2623 | 0.5817 | 0.7627 |
| 0.4394 | 3.1648 | 864 | 0.5854 | 0.2623 | 0.5854 | 0.7651 |
| 0.4394 | 3.1722 | 866 | 0.5982 | 0.2623 | 0.5982 | 0.7734 |
| 0.4394 | 3.1795 | 868 | 0.6201 | 0.3226 | 0.6201 | 0.7875 |
| 0.4394 | 3.1868 | 870 | 0.6246 | 0.3226 | 0.6246 | 0.7903 |
| 0.4394 | 3.1941 | 872 | 0.6608 | 0.1972 | 0.6608 | 0.8129 |
| 0.4394 | 3.2015 | 874 | 0.6514 | 0.1972 | 0.6514 | 0.8071 |
| 0.4394 | 3.2088 | 876 | 0.6241 | 0.3226 | 0.6241 | 0.7900 |
| 0.4394 | 3.2161 | 878 | 0.5893 | 0.2623 | 0.5893 | 0.7677 |
| 0.4394 | 3.2234 | 880 | 0.5654 | 0.4211 | 0.5654 | 0.7519 |
| 0.4394 | 3.2308 | 882 | 0.5684 | 0.3571 | 0.5684 | 0.7540 |
| 0.4394 | 3.2381 | 884 | 0.5743 | 0.3607 | 0.5743 | 0.7579 |
| 0.4394 | 3.2454 | 886 | 0.5726 | 0.4211 | 0.5726 | 0.7567 |
| 0.4394 | 3.2527 | 888 | 0.5852 | 0.3390 | 0.5852 | 0.7650 |
| 0.4394 | 3.2601 | 890 | 0.6132 | 0.3607 | 0.6132 | 0.7831 |
| 0.4394 | 3.2674 | 892 | 0.6081 | 0.3607 | 0.6081 | 0.7798 |
| 0.4394 | 3.2747 | 894 | 0.5994 | 0.4179 | 0.5994 | 0.7742 |
| 0.4394 | 3.2821 | 896 | 0.5885 | 0.4923 | 0.5885 | 0.7672 |
| 0.4394 | 3.2894 | 898 | 0.5921 | 0.4 | 0.5921 | 0.7695 |
| 0.4394 | 3.2967 | 900 | 0.5924 | 0.4 | 0.5924 | 0.7697 |
| 0.4394 | 3.3040 | 902 | 0.5828 | 0.4 | 0.5828 | 0.7634 |
| 0.4394 | 3.3114 | 904 | 0.5967 | 0.4 | 0.5967 | 0.7725 |
| 0.4394 | 3.3187 | 906 | 0.6072 | 0.4 | 0.6072 | 0.7792 |
| 0.4394 | 3.3260 | 908 | 0.6175 | 0.2759 | 0.6175 | 0.7858 |
| 0.4394 | 3.3333 | 910 | 0.6268 | 0.3390 | 0.6268 | 0.7917 |
| 0.4394 | 3.3407 | 912 | 0.6329 | 0.4 | 0.6329 | 0.7955 |
| 0.4394 | 3.3480 | 914 | 0.6434 | 0.2623 | 0.6434 | 0.8021 |
| 0.4394 | 3.3553 | 916 | 0.6510 | 0.2623 | 0.6510 | 0.8068 |
| 0.4394 | 3.3626 | 918 | 0.6418 | 0.4 | 0.6418 | 0.8011 |
| 0.4394 | 3.3700 | 920 | 0.6444 | 0.4 | 0.6444 | 0.8028 |
| 0.4394 | 3.3773 | 922 | 0.6565 | 0.2258 | 0.6565 | 0.8103 |
| 0.4394 | 3.3846 | 924 | 0.6755 | 0.1739 | 0.6755 | 0.8219 |
| 0.4394 | 3.3919 | 926 | 0.7127 | 0.25 | 0.7127 | 0.8442 |
| 0.4394 | 3.3993 | 928 | 0.8145 | 0.2192 | 0.8145 | 0.9025 |
| 0.4394 | 3.4066 | 930 | 0.8950 | 0.2410 | 0.8950 | 0.9461 |
| 0.4394 | 3.4139 | 932 | 0.8864 | 0.2410 | 0.8864 | 0.9415 |
| 0.4394 | 3.4212 | 934 | 0.7805 | 0.3077 | 0.7805 | 0.8835 |
| 0.4394 | 3.4286 | 936 | 0.7266 | 0.2597 | 0.7266 | 0.8524 |
| 0.4394 | 3.4359 | 938 | 0.6929 | 0.3294 | 0.6929 | 0.8324 |
| 0.4394 | 3.4432 | 940 | 0.6470 | 0.3704 | 0.6470 | 0.8044 |
| 0.4394 | 3.4505 | 942 | 0.6227 | 0.475 | 0.6227 | 0.7891 |
| 0.4394 | 3.4579 | 944 | 0.6089 | 0.475 | 0.6089 | 0.7803 |
| 0.4394 | 3.4652 | 946 | 0.5860 | 0.4923 | 0.5860 | 0.7655 |
| 0.4394 | 3.4725 | 948 | 0.6036 | 0.2857 | 0.6036 | 0.7769 |
| 0.4394 | 3.4799 | 950 | 0.6775 | 0.2192 | 0.6775 | 0.8231 |
| 0.4394 | 3.4872 | 952 | 0.7340 | 0.2192 | 0.7340 | 0.8567 |
| 0.4394 | 3.4945 | 954 | 0.7522 | 0.2192 | 0.7522 | 0.8673 |
| 0.4394 | 3.5018 | 956 | 0.7190 | 0.2192 | 0.7190 | 0.8480 |
| 0.4394 | 3.5092 | 958 | 0.6386 | 0.2817 | 0.6386 | 0.7991 |
| 0.4394 | 3.5165 | 960 | 0.6002 | 0.2857 | 0.6002 | 0.7747 |
| 0.4394 | 3.5238 | 962 | 0.6018 | 0.2857 | 0.6018 | 0.7758 |
| 0.4394 | 3.5311 | 964 | 0.6261 | 0.2857 | 0.6261 | 0.7913 |
| 0.4394 | 3.5385 | 966 | 0.6825 | 0.2192 | 0.6825 | 0.8261 |
| 0.4394 | 3.5458 | 968 | 0.7351 | 0.2192 | 0.7351 | 0.8574 |
| 0.4394 | 3.5531 | 970 | 0.7172 | 0.2192 | 0.7172 | 0.8469 |
| 0.4394 | 3.5604 | 972 | 0.6545 | 0.25 | 0.6545 | 0.8090 |
| 0.4394 | 3.5678 | 974 | 0.6189 | 0.2857 | 0.6189 | 0.7867 |
| 0.4394 | 3.5751 | 976 | 0.6039 | 0.3824 | 0.6039 | 0.7771 |
| 0.4394 | 3.5824 | 978 | 0.6051 | 0.3836 | 0.6051 | 0.7779 |
| 0.4394 | 3.5897 | 980 | 0.6115 | 0.3836 | 0.6115 | 0.7820 |
| 0.4394 | 3.5971 | 982 | 0.6087 | 0.3824 | 0.6087 | 0.7802 |
| 0.4394 | 3.6044 | 984 | 0.5944 | 0.4167 | 0.5944 | 0.7709 |
| 0.4394 | 3.6117 | 986 | 0.5826 | 0.4857 | 0.5826 | 0.7633 |
| 0.4394 | 3.6190 | 988 | 0.5780 | 0.475 | 0.5780 | 0.7602 |
| 0.4394 | 3.6264 | 990 | 0.5735 | 0.4167 | 0.5735 | 0.7573 |
| 0.4394 | 3.6337 | 992 | 0.5578 | 0.4167 | 0.5578 | 0.7469 |
| 0.4394 | 3.6410 | 994 | 0.5438 | 0.4706 | 0.5438 | 0.7374 |
| 0.4394 | 3.6484 | 996 | 0.5378 | 0.3438 | 0.5378 | 0.7333 |
| 0.4394 | 3.6557 | 998 | 0.5408 | 0.4 | 0.5408 | 0.7354 |
| 0.1143 | 3.6630 | 1000 | 0.5414 | 0.4000 | 0.5414 | 0.7358 |
| 0.1143 | 3.6703 | 1002 | 0.5372 | 0.3438 | 0.5372 | 0.7330 |
| 0.1143 | 3.6777 | 1004 | 0.5406 | 0.3478 | 0.5406 | 0.7353 |
| 0.1143 | 3.6850 | 1006 | 0.5488 | 0.4179 | 0.5488 | 0.7408 |
| 0.1143 | 3.6923 | 1008 | 0.5570 | 0.4545 | 0.5570 | 0.7463 |
| 0.1143 | 3.6996 | 1010 | 0.5454 | 0.4545 | 0.5454 | 0.7385 |
| 0.1143 | 3.7070 | 1012 | 0.5398 | 0.2857 | 0.5398 | 0.7347 |
| 0.1143 | 3.7143 | 1014 | 0.5517 | 0.3662 | 0.5517 | 0.7427 |
| 0.1143 | 3.7216 | 1016 | 0.5589 | 0.3662 | 0.5589 | 0.7476 |
| 0.1143 | 3.7289 | 1018 | 0.5665 | 0.3836 | 0.5665 | 0.7526 |
| 0.1143 | 3.7363 | 1020 | 0.5563 | 0.3333 | 0.5563 | 0.7458 |
| 0.1143 | 3.7436 | 1022 | 0.5465 | 0.3478 | 0.5465 | 0.7393 |
| 0.1143 | 3.7509 | 1024 | 0.5525 | 0.4 | 0.5525 | 0.7433 |
| 0.1143 | 3.7582 | 1026 | 0.5444 | 0.4 | 0.5444 | 0.7378 |
| 0.1143 | 3.7656 | 1028 | 0.5281 | 0.4194 | 0.5281 | 0.7267 |
| 0.1143 | 3.7729 | 1030 | 0.5320 | 0.3662 | 0.5320 | 0.7294 |
| 0.1143 | 3.7802 | 1032 | 0.5687 | 0.2857 | 0.5687 | 0.7541 |
| 0.1143 | 3.7875 | 1034 | 0.5829 | 0.3438 | 0.5829 | 0.7635 |
| 0.1143 | 3.7949 | 1036 | 0.5588 | 0.2857 | 0.5588 | 0.7475 |
| 0.1143 | 3.8022 | 1038 | 0.5383 | 0.3662 | 0.5383 | 0.7337 |
| 0.1143 | 3.8095 | 1040 | 0.5492 | 0.4324 | 0.5492 | 0.7411 |
| 0.1143 | 3.8168 | 1042 | 0.5737 | 0.3143 | 0.5737 | 0.7575 |
| 0.1143 | 3.8242 | 1044 | 0.5792 | 0.3846 | 0.5792 | 0.7611 |
| 0.1143 | 3.8315 | 1046 | 0.5800 | 0.4304 | 0.5800 | 0.7616 |
| 0.1143 | 3.8388 | 1048 | 0.5746 | 0.4304 | 0.5746 | 0.7580 |
| 0.1143 | 3.8462 | 1050 | 0.5622 | 0.3684 | 0.5622 | 0.7498 |
| 0.1143 | 3.8535 | 1052 | 0.5480 | 0.3684 | 0.5480 | 0.7402 |
| 0.1143 | 3.8608 | 1054 | 0.5426 | 0.3333 | 0.5426 | 0.7366 |
| 0.1143 | 3.8681 | 1056 | 0.5311 | 0.4762 | 0.5311 | 0.7288 |
| 0.1143 | 3.8755 | 1058 | 0.5236 | 0.4407 | 0.5236 | 0.7236 |
| 0.1143 | 3.8828 | 1060 | 0.5318 | 0.2500 | 0.5318 | 0.7292 |
| 0.1143 | 3.8901 | 1062 | 0.5379 | 0.2500 | 0.5379 | 0.7334 |
| 0.1143 | 3.8974 | 1064 | 0.5222 | 0.3607 | 0.5222 | 0.7226 |
| 0.1143 | 3.9048 | 1066 | 0.5056 | 0.4407 | 0.5056 | 0.7110 |
| 0.1143 | 3.9121 | 1068 | 0.5038 | 0.4658 | 0.5038 | 0.7098 |
| 0.1143 | 3.9194 | 1070 | 0.5007 | 0.4658 | 0.5007 | 0.7076 |
| 0.1143 | 3.9267 | 1072 | 0.4909 | 0.4857 | 0.4909 | 0.7006 |
| 0.1143 | 3.9341 | 1074 | 0.5105 | 0.5714 | 0.5105 | 0.7145 |
| 0.1143 | 3.9414 | 1076 | 0.5745 | 0.4167 | 0.5745 | 0.7579 |
| 0.1143 | 3.9487 | 1078 | 0.5981 | 0.4167 | 0.5981 | 0.7734 |
| 0.1143 | 3.9560 | 1080 | 0.5752 | 0.4167 | 0.5752 | 0.7584 |
| 0.1143 | 3.9634 | 1082 | 0.5215 | 0.5312 | 0.5215 | 0.7221 |
| 0.1143 | 3.9707 | 1084 | 0.4851 | 0.3607 | 0.4851 | 0.6965 |
| 0.1143 | 3.9780 | 1086 | 0.4790 | 0.4706 | 0.4790 | 0.6921 |
| 0.1143 | 3.9853 | 1088 | 0.5138 | 0.3662 | 0.5138 | 0.7168 |
| 0.1143 | 3.9927 | 1090 | 0.5353 | 0.3662 | 0.5353 | 0.7316 |
| 0.1143 | 4.0 | 1092 | 0.5272 | 0.3662 | 0.5272 | 0.7261 |
| 0.1143 | 4.0073 | 1094 | 0.5186 | 0.4762 | 0.5186 | 0.7201 |
| 0.1143 | 4.0147 | 1096 | 0.5170 | 0.3774 | 0.5170 | 0.7190 |
| 0.1143 | 4.0220 | 1098 | 0.5283 | 0.3774 | 0.5283 | 0.7269 |
| 0.1143 | 4.0293 | 1100 | 0.5382 | 0.3774 | 0.5382 | 0.7336 |
| 0.1143 | 4.0366 | 1102 | 0.5581 | 0.4706 | 0.5581 | 0.7471 |
| 0.1143 | 4.0440 | 1104 | 0.5907 | 0.3636 | 0.5907 | 0.7686 |
| 0.1143 | 4.0513 | 1106 | 0.6598 | 0.2941 | 0.6598 | 0.8123 |
| 0.1143 | 4.0586 | 1108 | 0.7703 | 0.2941 | 0.7703 | 0.8777 |
| 0.1143 | 4.0659 | 1110 | 0.8313 | 0.3014 | 0.8313 | 0.9118 |
| 0.1143 | 4.0733 | 1112 | 0.8157 | 0.3014 | 0.8157 | 0.9032 |
| 0.1143 | 4.0806 | 1114 | 0.7357 | 0.2941 | 0.7357 | 0.8577 |
| 0.1143 | 4.0879 | 1116 | 0.6680 | 0.3143 | 0.6680 | 0.8173 |
| 0.1143 | 4.0952 | 1118 | 0.6461 | 0.3143 | 0.6461 | 0.8038 |
| 0.1143 | 4.1026 | 1120 | 0.6621 | 0.3143 | 0.6621 | 0.8137 |
| 0.1143 | 4.1099 | 1122 | 0.6931 | 0.2941 | 0.6931 | 0.8325 |
| 0.1143 | 4.1172 | 1124 | 0.7210 | 0.2941 | 0.7210 | 0.8491 |
| 0.1143 | 4.1245 | 1126 | 0.7188 | 0.2941 | 0.7188 | 0.8478 |
| 0.1143 | 4.1319 | 1128 | 0.7036 | 0.1905 | 0.7036 | 0.8388 |
| 0.1143 | 4.1392 | 1130 | 0.6642 | 0.1724 | 0.6642 | 0.8150 |
| 0.1143 | 4.1465 | 1132 | 0.6442 | 0.2000 | 0.6442 | 0.8026 |
| 0.1143 | 4.1538 | 1134 | 0.6241 | 0.2623 | 0.6241 | 0.7900 |
| 0.1143 | 4.1612 | 1136 | 0.6157 | 0.2857 | 0.6157 | 0.7846 |
| 0.1143 | 4.1685 | 1138 | 0.5988 | 0.3478 | 0.5988 | 0.7738 |
| 0.1143 | 4.1758 | 1140 | 0.5894 | 0.2623 | 0.5894 | 0.7677 |
| 0.1143 | 4.1832 | 1142 | 0.5887 | 0.3636 | 0.5887 | 0.7672 |
| 0.1143 | 4.1905 | 1144 | 0.5813 | 0.3636 | 0.5813 | 0.7624 |
| 0.1143 | 4.1978 | 1146 | 0.5722 | 0.4167 | 0.5722 | 0.7565 |
| 0.1143 | 4.2051 | 1148 | 0.5623 | 0.3478 | 0.5623 | 0.7499 |
| 0.1143 | 4.2125 | 1150 | 0.5490 | 0.3478 | 0.5490 | 0.7409 |
| 0.1143 | 4.2198 | 1152 | 0.5421 | 0.3478 | 0.5421 | 0.7363 |
| 0.1143 | 4.2271 | 1154 | 0.5436 | 0.4179 | 0.5436 | 0.7373 |
| 0.1143 | 4.2344 | 1156 | 0.5422 | 0.4179 | 0.5422 | 0.7364 |
| 0.1143 | 4.2418 | 1158 | 0.5436 | 0.4179 | 0.5436 | 0.7373 |
| 0.1143 | 4.2491 | 1160 | 0.5430 | 0.4179 | 0.5430 | 0.7369 |
| 0.1143 | 4.2564 | 1162 | 0.5419 | 0.4179 | 0.5419 | 0.7361 |
| 0.1143 | 4.2637 | 1164 | 0.5488 | 0.4706 | 0.5488 | 0.7408 |
| 0.1143 | 4.2711 | 1166 | 0.5601 | 0.4179 | 0.5601 | 0.7484 |
| 0.1143 | 4.2784 | 1168 | 0.5667 | 0.4179 | 0.5667 | 0.7528 |
| 0.1143 | 4.2857 | 1170 | 0.5588 | 0.4706 | 0.5588 | 0.7475 |
| 0.1143 | 4.2930 | 1172 | 0.5596 | 0.3077 | 0.5596 | 0.7480 |
| 0.1143 | 4.3004 | 1174 | 0.5684 | 0.3077 | 0.5684 | 0.7539 |
| 0.1143 | 4.3077 | 1176 | 0.5721 | 0.3077 | 0.5721 | 0.7563 |
| 0.1143 | 4.3150 | 1178 | 0.5731 | 0.4348 | 0.5731 | 0.7570 |
| 0.1143 | 4.3223 | 1180 | 0.5817 | 0.4658 | 0.5817 | 0.7627 |
| 0.1143 | 4.3297 | 1182 | 0.5898 | 0.4658 | 0.5898 | 0.7680 |
| 0.1143 | 4.3370 | 1184 | 0.5876 | 0.4658 | 0.5876 | 0.7666 |
| 0.1143 | 4.3443 | 1186 | 0.5899 | 0.4658 | 0.5899 | 0.7680 |
| 0.1143 | 4.3516 | 1188 | 0.5864 | 0.4706 | 0.5864 | 0.7658 |
| 0.1143 | 4.3590 | 1190 | 0.5837 | 0.4706 | 0.5837 | 0.7640 |
| 0.1143 | 4.3663 | 1192 | 0.5813 | 0.4179 | 0.5813 | 0.7624 |
| 0.1143 | 4.3736 | 1194 | 0.5738 | 0.4194 | 0.5738 | 0.7575 |
| 0.1143 | 4.3810 | 1196 | 0.5648 | 0.4194 | 0.5648 | 0.7515 |
| 0.1143 | 4.3883 | 1198 | 0.5529 | 0.4762 | 0.5529 | 0.7436 |
| 0.1143 | 4.3956 | 1200 | 0.5436 | 0.4375 | 0.5436 | 0.7373 |
| 0.1143 | 4.4029 | 1202 | 0.5540 | 0.4923 | 0.5540 | 0.7443 |
| 0.1143 | 4.4103 | 1204 | 0.5701 | 0.3824 | 0.5701 | 0.7551 |
| 0.1143 | 4.4176 | 1206 | 0.5621 | 0.3824 | 0.5621 | 0.7497 |
| 0.1143 | 4.4249 | 1208 | 0.5417 | 0.4375 | 0.5417 | 0.7360 |
| 0.1143 | 4.4322 | 1210 | 0.5344 | 0.4762 | 0.5344 | 0.7310 |
| 0.1143 | 4.4396 | 1212 | 0.5432 | 0.4179 | 0.5432 | 0.7370 |
| 0.1143 | 4.4469 | 1214 | 0.5499 | 0.4179 | 0.5499 | 0.7415 |
| 0.1143 | 4.4542 | 1216 | 0.5482 | 0.4179 | 0.5482 | 0.7404 |
| 0.1143 | 4.4615 | 1218 | 0.5493 | 0.4179 | 0.5493 | 0.7412 |
| 0.1143 | 4.4689 | 1220 | 0.5286 | 0.4179 | 0.5286 | 0.7271 |
| 0.1143 | 4.4762 | 1222 | 0.5128 | 0.4407 | 0.5128 | 0.7161 |
| 0.1143 | 4.4835 | 1224 | 0.5087 | 0.4407 | 0.5087 | 0.7133 |
| 0.1143 | 4.4908 | 1226 | 0.5101 | 0.4407 | 0.5101 | 0.7142 |
| 0.1143 | 4.4982 | 1228 | 0.5108 | 0.4407 | 0.5108 | 0.7147 |
| 0.1143 | 4.5055 | 1230 | 0.5117 | 0.3793 | 0.5117 | 0.7153 |
| 0.1143 | 4.5128 | 1232 | 0.5148 | 0.4211 | 0.5148 | 0.7175 |
| 0.1143 | 4.5201 | 1234 | 0.5074 | 0.3793 | 0.5074 | 0.7124 |
| 0.1143 | 4.5275 | 1236 | 0.5035 | 0.4407 | 0.5035 | 0.7095 |
| 0.1143 | 4.5348 | 1238 | 0.5034 | 0.4407 | 0.5034 | 0.7095 |
| 0.1143 | 4.5421 | 1240 | 0.5095 | 0.1818 | 0.5095 | 0.7138 |
| 0.1143 | 4.5495 | 1242 | 0.5172 | 0.2500 | 0.5172 | 0.7191 |
| 0.1143 | 4.5568 | 1244 | 0.5273 | 0.4828 | 0.5273 | 0.7261 |
| 0.1143 | 4.5641 | 1246 | 0.5454 | 0.4828 | 0.5454 | 0.7385 |
| 0.1143 | 4.5714 | 1248 | 0.5448 | 0.4828 | 0.5448 | 0.7381 |
| 0.1143 | 4.5788 | 1250 | 0.5454 | 0.4000 | 0.5454 | 0.7385 |
| 0.1143 | 4.5861 | 1252 | 0.5368 | 0.2500 | 0.5368 | 0.7327 |
| 0.1143 | 4.5934 | 1254 | 0.5316 | 0.1818 | 0.5316 | 0.7291 |
| 0.1143 | 4.6007 | 1256 | 0.5241 | 0.3333 | 0.5241 | 0.7239 |
| 0.1143 | 4.6081 | 1258 | 0.5251 | 0.3333 | 0.5251 | 0.7247 |
| 0.1143 | 4.6154 | 1260 | 0.5255 | 0.3333 | 0.5255 | 0.7249 |
| 0.1143 | 4.6227 | 1262 | 0.5295 | 0.4407 | 0.5295 | 0.7276 |
| 0.1143 | 4.6300 | 1264 | 0.5422 | 0.4590 | 0.5422 | 0.7363 |
| 0.1143 | 4.6374 | 1266 | 0.5419 | 0.4658 | 0.5419 | 0.7362 |
| 0.1143 | 4.6447 | 1268 | 0.5345 | 0.4545 | 0.5345 | 0.7311 |
| 0.1143 | 4.6520 | 1270 | 0.5347 | 0.3077 | 0.5347 | 0.7312 |
| 0.1143 | 4.6593 | 1272 | 0.5360 | 0.4348 | 0.5360 | 0.7321 |
| 0.1143 | 4.6667 | 1274 | 0.5461 | 0.3377 | 0.5461 | 0.7390 |
| 0.1143 | 4.6740 | 1276 | 0.5432 | 0.4444 | 0.5432 | 0.7370 |
| 0.1143 | 4.6813 | 1278 | 0.5521 | 0.4578 | 0.5521 | 0.7430 |
| 0.1143 | 4.6886 | 1280 | 0.5509 | 0.4578 | 0.5509 | 0.7422 |
| 0.1143 | 4.6960 | 1282 | 0.5790 | 0.4615 | 0.5790 | 0.7609 |
| 0.1143 | 4.7033 | 1284 | 0.5741 | 0.4615 | 0.5741 | 0.7577 |
| 0.1143 | 4.7106 | 1286 | 0.5505 | 0.4615 | 0.5505 | 0.7420 |
| 0.1143 | 4.7179 | 1288 | 0.5444 | 0.4407 | 0.5444 | 0.7378 |
| 0.1143 | 4.7253 | 1290 | 0.5377 | 0.4407 | 0.5377 | 0.7333 |
| 0.1143 | 4.7326 | 1292 | 0.5183 | 0.3810 | 0.5183 | 0.7199 |
| 0.1143 | 4.7399 | 1294 | 0.5145 | 0.3810 | 0.5145 | 0.7173 |
| 0.1143 | 4.7473 | 1296 | 0.5093 | 0.3810 | 0.5093 | 0.7136 |
| 0.1143 | 4.7546 | 1298 | 0.5178 | 0.5312 | 0.5178 | 0.7196 |
| 0.1143 | 4.7619 | 1300 | 0.5614 | 0.4507 | 0.5614 | 0.7493 |
| 0.1143 | 4.7692 | 1302 | 0.6205 | 0.4304 | 0.6205 | 0.7877 |
| 0.1143 | 4.7766 | 1304 | 0.6633 | 0.4304 | 0.6633 | 0.8144 |
| 0.1143 | 4.7839 | 1306 | 0.6658 | 0.4304 | 0.6658 | 0.8160 |
| 0.1143 | 4.7912 | 1308 | 0.6320 | 0.4615 | 0.6320 | 0.7950 |
| 0.1143 | 4.7985 | 1310 | 0.5866 | 0.4615 | 0.5866 | 0.7659 |
| 0.1143 | 4.8059 | 1312 | 0.5547 | 0.4935 | 0.5547 | 0.7448 |
| 0.1143 | 4.8132 | 1314 | 0.5454 | 0.3014 | 0.5454 | 0.7385 |
| 0.1143 | 4.8205 | 1316 | 0.5463 | 0.3077 | 0.5463 | 0.7391 |
| 0.1143 | 4.8278 | 1318 | 0.5439 | 0.3000 | 0.5439 | 0.7375 |
| 0.1143 | 4.8352 | 1320 | 0.5485 | 0.3000 | 0.5485 | 0.7406 |
| 0.1143 | 4.8425 | 1322 | 0.5573 | 0.4474 | 0.5573 | 0.7465 |
| 0.1143 | 4.8498 | 1324 | 0.5751 | 0.4474 | 0.5751 | 0.7583 |
| 0.1143 | 4.8571 | 1326 | 0.5801 | 0.4474 | 0.5801 | 0.7616 |
| 0.1143 | 4.8645 | 1328 | 0.5763 | 0.2703 | 0.5763 | 0.7591 |
| 0.1143 | 4.8718 | 1330 | 0.5680 | 0.2703 | 0.5680 | 0.7537 |
| 0.1143 | 4.8791 | 1332 | 0.5531 | 0.2373 | 0.5531 | 0.7437 |
| 0.1143 | 4.8864 | 1334 | 0.5443 | 0.2373 | 0.5443 | 0.7378 |
| 0.1143 | 4.8938 | 1336 | 0.5449 | 0.2373 | 0.5449 | 0.7382 |
| 0.1143 | 4.9011 | 1338 | 0.5440 | 0.3000 | 0.5440 | 0.7376 |
| 0.1143 | 4.9084 | 1340 | 0.5456 | 0.3000 | 0.5456 | 0.7386 |
| 0.1143 | 4.9158 | 1342 | 0.5576 | 0.1818 | 0.5576 | 0.7467 |
| 0.1143 | 4.9231 | 1344 | 0.5666 | 0.2500 | 0.5666 | 0.7528 |
| 0.1143 | 4.9304 | 1346 | 0.5575 | 0.1818 | 0.5575 | 0.7467 |
| 0.1143 | 4.9377 | 1348 | 0.5477 | 0.2373 | 0.5477 | 0.7401 |
| 0.1143 | 4.9451 | 1350 | 0.5585 | 0.2909 | 0.5585 | 0.7473 |
| 0.1143 | 4.9524 | 1352 | 0.5740 | 0.2373 | 0.5740 | 0.7576 |
| 0.1143 | 4.9597 | 1354 | 0.5771 | 0.3000 | 0.5771 | 0.7597 |
| 0.1143 | 4.9670 | 1356 | 0.5632 | 0.3607 | 0.5632 | 0.7505 |
| 0.1143 | 4.9744 | 1358 | 0.5554 | 0.2759 | 0.5554 | 0.7453 |
| 0.1143 | 4.9817 | 1360 | 0.5923 | 0.3684 | 0.5923 | 0.7696 |
| 0.1143 | 4.9890 | 1362 | 0.6620 | 0.3836 | 0.6620 | 0.8136 |
| 0.1143 | 4.9963 | 1364 | 0.6963 | 0.3836 | 0.6963 | 0.8344 |
| 0.1143 | 5.0037 | 1366 | 0.6720 | 0.3836 | 0.6720 | 0.8197 |
| 0.1143 | 5.0110 | 1368 | 0.6244 | 0.3836 | 0.6244 | 0.7902 |
| 0.1143 | 5.0183 | 1370 | 0.5813 | 0.3077 | 0.5813 | 0.7624 |
| 0.1143 | 5.0256 | 1372 | 0.5446 | 0.2500 | 0.5446 | 0.7380 |
| 0.1143 | 5.0330 | 1374 | 0.5357 | 0.4211 | 0.5357 | 0.7319 |
| 0.1143 | 5.0403 | 1376 | 0.5512 | 0.3571 | 0.5512 | 0.7424 |
| 0.1143 | 5.0476 | 1378 | 0.5718 | 0.3571 | 0.5718 | 0.7562 |
| 0.1143 | 5.0549 | 1380 | 0.5826 | 0.3636 | 0.5826 | 0.7633 |
| 0.1143 | 5.0623 | 1382 | 0.5946 | 0.3636 | 0.5946 | 0.7711 |
| 0.1143 | 5.0696 | 1384 | 0.6004 | 0.3836 | 0.6004 | 0.7749 |
| 0.1143 | 5.0769 | 1386 | 0.6060 | 0.3836 | 0.6060 | 0.7784 |
| 0.1143 | 5.0842 | 1388 | 0.6213 | 0.4 | 0.6213 | 0.7882 |
| 0.1143 | 5.0916 | 1390 | 0.6363 | 0.4507 | 0.6363 | 0.7977 |
| 0.1143 | 5.0989 | 1392 | 0.6506 | 0.3333 | 0.6506 | 0.8066 |
| 0.1143 | 5.1062 | 1394 | 0.6585 | 0.2817 | 0.6585 | 0.8115 |
| 0.1143 | 5.1136 | 1396 | 0.6567 | 0.4 | 0.6567 | 0.8103 |
| 0.1143 | 5.1209 | 1398 | 0.6670 | 0.3077 | 0.6670 | 0.8167 |
| 0.1143 | 5.1282 | 1400 | 0.6849 | 0.3077 | 0.6849 | 0.8276 |
| 0.1143 | 5.1355 | 1402 | 0.6744 | 0.3077 | 0.6744 | 0.8212 |
| 0.1143 | 5.1429 | 1404 | 0.6510 | 0.3077 | 0.6510 | 0.8068 |
| 0.1143 | 5.1502 | 1406 | 0.6246 | 0.4179 | 0.6246 | 0.7903 |
| 0.1143 | 5.1575 | 1408 | 0.5984 | 0.3000 | 0.5984 | 0.7736 |
| 0.1143 | 5.1648 | 1410 | 0.5763 | 0.3000 | 0.5763 | 0.7591 |
| 0.1143 | 5.1722 | 1412 | 0.5652 | 0.3607 | 0.5652 | 0.7518 |
| 0.1143 | 5.1795 | 1414 | 0.5632 | 0.4348 | 0.5632 | 0.7504 |
| 0.1143 | 5.1868 | 1416 | 0.5569 | 0.4348 | 0.5569 | 0.7463 |
| 0.1143 | 5.1941 | 1418 | 0.5537 | 0.5714 | 0.5537 | 0.7441 |
| 0.1143 | 5.2015 | 1420 | 0.5302 | 0.5714 | 0.5302 | 0.7281 |
| 0.1143 | 5.2088 | 1422 | 0.5109 | 0.5714 | 0.5109 | 0.7148 |
| 0.1143 | 5.2161 | 1424 | 0.4934 | 0.5714 | 0.4934 | 0.7024 |
| 0.1143 | 5.2234 | 1426 | 0.4786 | 0.5714 | 0.4786 | 0.6918 |
| 0.1143 | 5.2308 | 1428 | 0.4825 | 0.5714 | 0.4825 | 0.6946 |
| 0.1143 | 5.2381 | 1430 | 0.4879 | 0.5714 | 0.4879 | 0.6985 |
| 0.1143 | 5.2454 | 1432 | 0.5100 | 0.5312 | 0.5100 | 0.7141 |
| 0.1143 | 5.2527 | 1434 | 0.5228 | 0.5312 | 0.5228 | 0.7231 |
| 0.1143 | 5.2601 | 1436 | 0.5326 | 0.4507 | 0.5326 | 0.7298 |
| 0.1143 | 5.2674 | 1438 | 0.5174 | 0.5312 | 0.5174 | 0.7193 |
| 0.1143 | 5.2747 | 1440 | 0.5059 | 0.5588 | 0.5059 | 0.7113 |
| 0.1143 | 5.2821 | 1442 | 0.4966 | 0.4179 | 0.4966 | 0.7047 |
| 0.1143 | 5.2894 | 1444 | 0.4972 | 0.3636 | 0.4972 | 0.7051 |
| 0.1143 | 5.2967 | 1446 | 0.5015 | 0.4179 | 0.5015 | 0.7081 |
| 0.1143 | 5.3040 | 1448 | 0.5136 | 0.4179 | 0.5136 | 0.7166 |
| 0.1143 | 5.3114 | 1450 | 0.5291 | 0.5714 | 0.5291 | 0.7274 |
| 0.1143 | 5.3187 | 1452 | 0.5439 | 0.5714 | 0.5439 | 0.7375 |
| 0.1143 | 5.3260 | 1454 | 0.5473 | 0.5714 | 0.5473 | 0.7398 |
| 0.1143 | 5.3333 | 1456 | 0.5538 | 0.4857 | 0.5538 | 0.7442 |
| 0.1143 | 5.3407 | 1458 | 0.5484 | 0.4179 | 0.5484 | 0.7405 |
| 0.1143 | 5.3480 | 1460 | 0.5461 | 0.3662 | 0.5461 | 0.7390 |
| 0.1143 | 5.3553 | 1462 | 0.5479 | 0.3143 | 0.5479 | 0.7402 |
| 0.1143 | 5.3626 | 1464 | 0.5568 | 0.3143 | 0.5568 | 0.7462 |
| 0.1143 | 5.3700 | 1466 | 0.5584 | 0.3143 | 0.5584 | 0.7472 |
| 0.1143 | 5.3773 | 1468 | 0.5606 | 0.3143 | 0.5606 | 0.7487 |
| 0.1143 | 5.3846 | 1470 | 0.5728 | 0.3662 | 0.5728 | 0.7568 |
| 0.1143 | 5.3919 | 1472 | 0.5775 | 0.4167 | 0.5775 | 0.7600 |
| 0.1143 | 5.3993 | 1474 | 0.5806 | 0.4167 | 0.5806 | 0.7620 |
| 0.1143 | 5.4066 | 1476 | 0.5719 | 0.4179 | 0.5719 | 0.7562 |
| 0.1143 | 5.4139 | 1478 | 0.5519 | 0.3000 | 0.5519 | 0.7429 |
| 0.1143 | 5.4212 | 1480 | 0.5372 | 0.4407 | 0.5372 | 0.7329 |
| 0.1143 | 5.4286 | 1482 | 0.5357 | 0.4194 | 0.5357 | 0.7319 |
| 0.1143 | 5.4359 | 1484 | 0.5434 | 0.4179 | 0.5434 | 0.7372 |
| 0.1143 | 5.4432 | 1486 | 0.5581 | 0.4 | 0.5581 | 0.7471 |
| 0.1143 | 5.4505 | 1488 | 0.5640 | 0.4 | 0.5640 | 0.7510 |
| 0.1143 | 5.4579 | 1490 | 0.5585 | 0.3636 | 0.5585 | 0.7474 |
| 0.1143 | 5.4652 | 1492 | 0.5518 | 0.4179 | 0.5518 | 0.7429 |
| 0.1143 | 5.4725 | 1494 | 0.5509 | 0.4706 | 0.5509 | 0.7423 |
| 0.1143 | 5.4799 | 1496 | 0.5699 | 0.3077 | 0.5699 | 0.7549 |
| 0.1143 | 5.4872 | 1498 | 0.6039 | 0.4324 | 0.6039 | 0.7771 |
| 0.0785 | 5.4945 | 1500 | 0.6255 | 0.4 | 0.6255 | 0.7909 |
| 0.0785 | 5.5018 | 1502 | 0.6286 | 0.4 | 0.6286 | 0.7928 |
| 0.0785 | 5.5092 | 1504 | 0.6112 | 0.4 | 0.6112 | 0.7818 |
| 0.0785 | 5.5165 | 1506 | 0.5890 | 0.3836 | 0.5890 | 0.7675 |
| 0.0785 | 5.5238 | 1508 | 0.5915 | 0.3636 | 0.5915 | 0.7691 |
| 0.0785 | 5.5311 | 1510 | 0.6011 | 0.4 | 0.6011 | 0.7753 |
| 0.0785 | 5.5385 | 1512 | 0.6219 | 0.3377 | 0.6219 | 0.7886 |
| 0.0785 | 5.5458 | 1514 | 0.6203 | 0.3377 | 0.6203 | 0.7876 |
| 0.0785 | 5.5531 | 1516 | 0.5927 | 0.3377 | 0.5927 | 0.7699 |
| 0.0785 | 5.5604 | 1518 | 0.5717 | 0.3226 | 0.5717 | 0.7561 |
| 0.0785 | 5.5678 | 1520 | 0.5418 | 0.3636 | 0.5418 | 0.7361 |
| 0.0785 | 5.5751 | 1522 | 0.5330 | 0.4762 | 0.5330 | 0.7301 |
| 0.0785 | 5.5824 | 1524 | 0.5321 | 0.4762 | 0.5321 | 0.7295 |
| 0.0785 | 5.5897 | 1526 | 0.5302 | 0.4762 | 0.5302 | 0.7281 |
| 0.0785 | 5.5971 | 1528 | 0.5304 | 0.4828 | 0.5304 | 0.7283 |
| 0.0785 | 5.6044 | 1530 | 0.5411 | 0.3607 | 0.5411 | 0.7356 |
| 0.0785 | 5.6117 | 1532 | 0.5745 | 0.3478 | 0.5745 | 0.7580 |
| 0.0785 | 5.6190 | 1534 | 0.6291 | 0.3836 | 0.6291 | 0.7931 |
| 0.0785 | 5.6264 | 1536 | 0.6675 | 0.3836 | 0.6675 | 0.8170 |
| 0.0785 | 5.6337 | 1538 | 0.6625 | 0.3836 | 0.6625 | 0.8139 |
| 0.0785 | 5.6410 | 1540 | 0.6231 | 0.4615 | 0.6231 | 0.7894 |
| 0.0785 | 5.6484 | 1542 | 0.5886 | 0.4 | 0.5886 | 0.7672 |
| 0.0785 | 5.6557 | 1544 | 0.5588 | 0.3607 | 0.5588 | 0.7475 |
| 0.0785 | 5.6630 | 1546 | 0.5505 | 0.3636 | 0.5505 | 0.7419 |
| 0.0785 | 5.6703 | 1548 | 0.5476 | 0.3662 | 0.5476 | 0.7400 |
| 0.0785 | 5.6777 | 1550 | 0.5485 | 0.4 | 0.5485 | 0.7406 |
| 0.0785 | 5.6850 | 1552 | 0.5457 | 0.3662 | 0.5457 | 0.7387 |
| 0.0785 | 5.6923 | 1554 | 0.5478 | 0.3607 | 0.5478 | 0.7401 |
| 0.0785 | 5.6996 | 1556 | 0.5622 | 0.3607 | 0.5622 | 0.7498 |
| 0.0785 | 5.7070 | 1558 | 0.5911 | 0.25 | 0.5911 | 0.7688 |
| 0.0785 | 5.7143 | 1560 | 0.6215 | 0.3836 | 0.6215 | 0.7883 |
| 0.0785 | 5.7216 | 1562 | 0.6482 | 0.3836 | 0.6482 | 0.8051 |
| 0.0785 | 5.7289 | 1564 | 0.6569 | 0.3836 | 0.6569 | 0.8105 |
| 0.0785 | 5.7363 | 1566 | 0.6452 | 0.3836 | 0.6452 | 0.8032 |
| 0.0785 | 5.7436 | 1568 | 0.6124 | 0.3836 | 0.6124 | 0.7825 |
| 0.0785 | 5.7509 | 1570 | 0.5850 | 0.25 | 0.5850 | 0.7649 |
| 0.0785 | 5.7582 | 1572 | 0.5645 | 0.2154 | 0.5645 | 0.7514 |
| 0.0785 | 5.7656 | 1574 | 0.5427 | 0.2500 | 0.5427 | 0.7367 |
| 0.0785 | 5.7729 | 1576 | 0.5345 | 0.2500 | 0.5345 | 0.7311 |
| 0.0785 | 5.7802 | 1578 | 0.5340 | 0.2500 | 0.5340 | 0.7308 |
| 0.0785 | 5.7875 | 1580 | 0.5423 | 0.2759 | 0.5423 | 0.7364 |
| 0.0785 | 5.7949 | 1582 | 0.5488 | 0.2759 | 0.5488 | 0.7408 |
| 0.0785 | 5.8022 | 1584 | 0.5647 | 0.2154 | 0.5647 | 0.7515 |
| 0.0785 | 5.8095 | 1586 | 0.5873 | 0.2154 | 0.5873 | 0.7663 |
| 0.0785 | 5.8168 | 1588 | 0.5979 | 0.2154 | 0.5979 | 0.7732 |
| 0.0785 | 5.8242 | 1590 | 0.5858 | 0.2154 | 0.5858 | 0.7653 |
| 0.0785 | 5.8315 | 1592 | 0.5605 | 0.2154 | 0.5605 | 0.7487 |
| 0.0785 | 5.8388 | 1594 | 0.5355 | 0.2500 | 0.5355 | 0.7318 |
| 0.0785 | 5.8462 | 1596 | 0.5271 | 0.3607 | 0.5271 | 0.7260 |
| 0.0785 | 5.8535 | 1598 | 0.5243 | 0.3607 | 0.5243 | 0.7241 |
| 0.0785 | 5.8608 | 1600 | 0.5297 | 0.3607 | 0.5297 | 0.7278 |
| 0.0785 | 5.8681 | 1602 | 0.5378 | 0.3226 | 0.5378 | 0.7333 |
| 0.0785 | 5.8755 | 1604 | 0.5375 | 0.3226 | 0.5375 | 0.7332 |
| 0.0785 | 5.8828 | 1606 | 0.5447 | 0.3810 | 0.5447 | 0.7380 |
| 0.0785 | 5.8901 | 1608 | 0.5549 | 0.3810 | 0.5549 | 0.7449 |
| 0.0785 | 5.8974 | 1610 | 0.5440 | 0.3810 | 0.5440 | 0.7375 |
| 0.0785 | 5.9048 | 1612 | 0.5320 | 0.3607 | 0.5320 | 0.7294 |
| 0.0785 | 5.9121 | 1614 | 0.5329 | 0.3607 | 0.5329 | 0.7300 |
| 0.0785 | 5.9194 | 1616 | 0.5427 | 0.3607 | 0.5427 | 0.7367 |
| 0.0785 | 5.9267 | 1618 | 0.5505 | 0.3000 | 0.5505 | 0.7420 |
| 0.0785 | 5.9341 | 1620 | 0.5538 | 0.4407 | 0.5538 | 0.7441 |
| 0.0785 | 5.9414 | 1622 | 0.5582 | 0.4706 | 0.5582 | 0.7471 |
| 0.0785 | 5.9487 | 1624 | 0.5661 | 0.4179 | 0.5661 | 0.7524 |
| 0.0785 | 5.9560 | 1626 | 0.5699 | 0.3636 | 0.5699 | 0.7549 |
| 0.0785 | 5.9634 | 1628 | 0.5730 | 0.3636 | 0.5730 | 0.7570 |
| 0.0785 | 5.9707 | 1630 | 0.5679 | 0.3636 | 0.5679 | 0.7536 |
| 0.0785 | 5.9780 | 1632 | 0.5599 | 0.4211 | 0.5599 | 0.7482 |
| 0.0785 | 5.9853 | 1634 | 0.5522 | 0.4211 | 0.5522 | 0.7431 |
| 0.0785 | 5.9927 | 1636 | 0.5549 | 0.3333 | 0.5549 | 0.7449 |
| 0.0785 | 6.0 | 1638 | 0.5564 | 0.3333 | 0.5564 | 0.7459 |
| 0.0785 | 6.0073 | 1640 | 0.5510 | 0.3333 | 0.5510 | 0.7423 |
| 0.0785 | 6.0147 | 1642 | 0.5467 | 0.3333 | 0.5467 | 0.7394 |
| 0.0785 | 6.0220 | 1644 | 0.5392 | 0.4407 | 0.5392 | 0.7343 |
| 0.0785 | 6.0293 | 1646 | 0.5342 | 0.4828 | 0.5342 | 0.7309 |
| 0.0785 | 6.0366 | 1648 | 0.5362 | 0.4211 | 0.5362 | 0.7323 |
| 0.0785 | 6.0440 | 1650 | 0.5381 | 0.4211 | 0.5381 | 0.7336 |
| 0.0785 | 6.0513 | 1652 | 0.5411 | 0.4828 | 0.5411 | 0.7356 |
| 0.0785 | 6.0586 | 1654 | 0.5436 | 0.4828 | 0.5436 | 0.7373 |
| 0.0785 | 6.0659 | 1656 | 0.5494 | 0.4407 | 0.5494 | 0.7412 |
| 0.0785 | 6.0733 | 1658 | 0.5486 | 0.4407 | 0.5486 | 0.7407 |
| 0.0785 | 6.0806 | 1660 | 0.5475 | 0.4375 | 0.5475 | 0.7399 |
| 0.0785 | 6.0879 | 1662 | 0.5544 | 0.4407 | 0.5544 | 0.7446 |
| 0.0785 | 6.0952 | 1664 | 0.5639 | 0.4407 | 0.5639 | 0.7509 |
| 0.0785 | 6.1026 | 1666 | 0.5616 | 0.4375 | 0.5616 | 0.7494 |
| 0.0785 | 6.1099 | 1668 | 0.5488 | 0.4375 | 0.5488 | 0.7408 |
| 0.0785 | 6.1172 | 1670 | 0.5373 | 0.4375 | 0.5373 | 0.7330 |
| 0.0785 | 6.1245 | 1672 | 0.5328 | 0.4375 | 0.5328 | 0.7300 |
| 0.0785 | 6.1319 | 1674 | 0.5309 | 0.4762 | 0.5309 | 0.7286 |
| 0.0785 | 6.1392 | 1676 | 0.5343 | 0.4194 | 0.5343 | 0.7310 |
| 0.0785 | 6.1465 | 1678 | 0.5386 | 0.4194 | 0.5386 | 0.7339 |
| 0.0785 | 6.1538 | 1680 | 0.5375 | 0.4194 | 0.5375 | 0.7332 |
| 0.0785 | 6.1612 | 1682 | 0.5348 | 0.4828 | 0.5348 | 0.7313 |
| 0.0785 | 6.1685 | 1684 | 0.5329 | 0.4407 | 0.5329 | 0.7300 |
| 0.0785 | 6.1758 | 1686 | 0.5310 | 0.4828 | 0.5310 | 0.7287 |
| 0.0785 | 6.1832 | 1688 | 0.5338 | 0.4828 | 0.5338 | 0.7306 |
| 0.0785 | 6.1905 | 1690 | 0.5376 | 0.4828 | 0.5376 | 0.7332 |
| 0.0785 | 6.1978 | 1692 | 0.5447 | 0.4762 | 0.5447 | 0.7380 |
| 0.0785 | 6.2051 | 1694 | 0.5505 | 0.4762 | 0.5505 | 0.7420 |
| 0.0785 | 6.2125 | 1696 | 0.5531 | 0.4762 | 0.5531 | 0.7437 |
| 0.0785 | 6.2198 | 1698 | 0.5579 | 0.4375 | 0.5579 | 0.7470 |
| 0.0785 | 6.2271 | 1700 | 0.5778 | 0.3000 | 0.5778 | 0.7601 |
| 0.0785 | 6.2344 | 1702 | 0.6168 | 0.3810 | 0.6168 | 0.7854 |
| 0.0785 | 6.2418 | 1704 | 0.6542 | 0.4 | 0.6542 | 0.8088 |
| 0.0785 | 6.2491 | 1706 | 0.6772 | 0.4 | 0.6772 | 0.8229 |
| 0.0785 | 6.2564 | 1708 | 0.6775 | 0.3810 | 0.6775 | 0.8231 |
| 0.0785 | 6.2637 | 1710 | 0.6582 | 0.3810 | 0.6582 | 0.8113 |
| 0.0785 | 6.2711 | 1712 | 0.6226 | 0.3824 | 0.6226 | 0.7890 |
| 0.0785 | 6.2784 | 1714 | 0.6071 | 0.4324 | 0.6071 | 0.7792 |
| 0.0785 | 6.2857 | 1716 | 0.6268 | 0.4304 | 0.6268 | 0.7917 |
| 0.0785 | 6.2930 | 1718 | 0.6376 | 0.4304 | 0.6376 | 0.7985 |
| 0.0785 | 6.3004 | 1720 | 0.6409 | 0.4304 | 0.6409 | 0.8006 |
| 0.0785 | 6.3077 | 1722 | 0.6393 | 0.4304 | 0.6393 | 0.7996 |
| 0.0785 | 6.3150 | 1724 | 0.6354 | 0.4304 | 0.6354 | 0.7971 |
| 0.0785 | 6.3223 | 1726 | 0.6314 | 0.3333 | 0.6314 | 0.7946 |
| 0.0785 | 6.3297 | 1728 | 0.6413 | 0.3544 | 0.6413 | 0.8008 |
| 0.0785 | 6.3370 | 1730 | 0.6527 | 0.4 | 0.6527 | 0.8079 |
| 0.0785 | 6.3443 | 1732 | 0.6517 | 0.4 | 0.6517 | 0.8073 |
| 0.0785 | 6.3516 | 1734 | 0.6385 | 0.4 | 0.6385 | 0.7991 |
| 0.0785 | 6.3590 | 1736 | 0.6126 | 0.2759 | 0.6126 | 0.7827 |
| 0.0785 | 6.3663 | 1738 | 0.5898 | 0.3571 | 0.5898 | 0.7680 |
| 0.0785 | 6.3736 | 1740 | 0.5878 | 0.3571 | 0.5878 | 0.7667 |
| 0.0785 | 6.3810 | 1742 | 0.5816 | 0.3571 | 0.5816 | 0.7626 |
| 0.0785 | 6.3883 | 1744 | 0.5882 | 0.3571 | 0.5882 | 0.7670 |
| 0.0785 | 6.3956 | 1746 | 0.6022 | 0.4375 | 0.6022 | 0.7760 |
| 0.0785 | 6.4029 | 1748 | 0.6130 | 0.4375 | 0.6130 | 0.7829 |
| 0.0785 | 6.4103 | 1750 | 0.6340 | 0.3077 | 0.6340 | 0.7963 |
| 0.0785 | 6.4176 | 1752 | 0.6440 | 0.3077 | 0.6440 | 0.8025 |
| 0.0785 | 6.4249 | 1754 | 0.6436 | 0.3077 | 0.6436 | 0.8022 |
| 0.0785 | 6.4322 | 1756 | 0.6396 | 0.3077 | 0.6396 | 0.7997 |
| 0.0785 | 6.4396 | 1758 | 0.6209 | 0.4 | 0.6209 | 0.7879 |
| 0.0785 | 6.4469 | 1760 | 0.6296 | 0.4 | 0.6296 | 0.7935 |
| 0.0785 | 6.4542 | 1762 | 0.6375 | 0.4 | 0.6375 | 0.7984 |
| 0.0785 | 6.4615 | 1764 | 0.6117 | 0.3514 | 0.6117 | 0.7821 |
| 0.0785 | 6.4689 | 1766 | 0.5947 | 0.4658 | 0.5947 | 0.7712 |
| 0.0785 | 6.4762 | 1768 | 0.5827 | 0.4545 | 0.5827 | 0.7633 |
| 0.0785 | 6.4835 | 1770 | 0.5708 | 0.4545 | 0.5708 | 0.7555 |
| 0.0785 | 6.4908 | 1772 | 0.5596 | 0.4545 | 0.5596 | 0.7481 |
| 0.0785 | 6.4982 | 1774 | 0.5544 | 0.4590 | 0.5544 | 0.7446 |
| 0.0785 | 6.5055 | 1776 | 0.5494 | 0.4590 | 0.5494 | 0.7412 |
| 0.0785 | 6.5128 | 1778 | 0.5529 | 0.4590 | 0.5529 | 0.7436 |
| 0.0785 | 6.5201 | 1780 | 0.5577 | 0.4590 | 0.5577 | 0.7468 |
| 0.0785 | 6.5275 | 1782 | 0.5497 | 0.3571 | 0.5497 | 0.7415 |
| 0.0785 | 6.5348 | 1784 | 0.5359 | 0.4407 | 0.5359 | 0.7321 |
| 0.0785 | 6.5421 | 1786 | 0.5334 | 0.4211 | 0.5334 | 0.7303 |
| 0.0785 | 6.5495 | 1788 | 0.5440 | 0.3333 | 0.5440 | 0.7376 |
| 0.0785 | 6.5568 | 1790 | 0.5522 | 0.3333 | 0.5522 | 0.7431 |
| 0.0785 | 6.5641 | 1792 | 0.5509 | 0.3333 | 0.5509 | 0.7422 |
| 0.0785 | 6.5714 | 1794 | 0.5431 | 0.3571 | 0.5431 | 0.7369 |
| 0.0785 | 6.5788 | 1796 | 0.5425 | 0.4407 | 0.5425 | 0.7366 |
| 0.0785 | 6.5861 | 1798 | 0.5610 | 0.4 | 0.5610 | 0.7490 |
| 0.0785 | 6.5934 | 1800 | 0.5857 | 0.2105 | 0.5857 | 0.7653 |
| 0.0785 | 6.6007 | 1802 | 0.6020 | 0.2500 | 0.6020 | 0.7759 |
| 0.0785 | 6.6081 | 1804 | 0.5989 | 0.3478 | 0.5989 | 0.7739 |
| 0.0785 | 6.6154 | 1806 | 0.5875 | 0.4706 | 0.5875 | 0.7665 |
| 0.0785 | 6.6227 | 1808 | 0.5905 | 0.4706 | 0.5905 | 0.7684 |
| 0.0785 | 6.6300 | 1810 | 0.5894 | 0.4706 | 0.5894 | 0.7677 |
| 0.0785 | 6.6374 | 1812 | 0.5904 | 0.4706 | 0.5904 | 0.7684 |
| 0.0785 | 6.6447 | 1814 | 0.5963 | 0.4706 | 0.5963 | 0.7722 |
| 0.0785 | 6.6520 | 1816 | 0.5869 | 0.4545 | 0.5869 | 0.7661 |
| 0.0785 | 6.6593 | 1818 | 0.5739 | 0.4545 | 0.5739 | 0.7576 |
| 0.0785 | 6.6667 | 1820 | 0.5618 | 0.4407 | 0.5618 | 0.7496 |
| 0.0785 | 6.6740 | 1822 | 0.5597 | 0.4828 | 0.5597 | 0.7481 |
| 0.0785 | 6.6813 | 1824 | 0.5619 | 0.4828 | 0.5619 | 0.7496 |
| 0.0785 | 6.6886 | 1826 | 0.5629 | 0.4407 | 0.5629 | 0.7503 |
| 0.0785 | 6.6960 | 1828 | 0.5663 | 0.4407 | 0.5663 | 0.7525 |
| 0.0785 | 6.7033 | 1830 | 0.5694 | 0.4407 | 0.5694 | 0.7546 |
| 0.0785 | 6.7106 | 1832 | 0.5728 | 0.3607 | 0.5728 | 0.7568 |
| 0.0785 | 6.7179 | 1834 | 0.5708 | 0.3607 | 0.5708 | 0.7555 |
| 0.0785 | 6.7253 | 1836 | 0.5598 | 0.3607 | 0.5598 | 0.7482 |
| 0.0785 | 6.7326 | 1838 | 0.5501 | 0.4407 | 0.5501 | 0.7417 |
| 0.0785 | 6.7399 | 1840 | 0.5444 | 0.4407 | 0.5444 | 0.7378 |
| 0.0785 | 6.7473 | 1842 | 0.5413 | 0.4407 | 0.5413 | 0.7357 |
| 0.0785 | 6.7546 | 1844 | 0.5408 | 0.4828 | 0.5408 | 0.7354 |
| 0.0785 | 6.7619 | 1846 | 0.5459 | 0.4211 | 0.5459 | 0.7388 |
| 0.0785 | 6.7692 | 1848 | 0.5445 | 0.4211 | 0.5445 | 0.7379 |
| 0.0785 | 6.7766 | 1850 | 0.5371 | 0.4211 | 0.5371 | 0.7329 |
| 0.0785 | 6.7839 | 1852 | 0.5333 | 0.4211 | 0.5333 | 0.7303 |
| 0.0785 | 6.7912 | 1854 | 0.5341 | 0.4407 | 0.5341 | 0.7308 |
| 0.0785 | 6.7985 | 1856 | 0.5402 | 0.4407 | 0.5402 | 0.7350 |
| 0.0785 | 6.8059 | 1858 | 0.5510 | 0.4 | 0.5510 | 0.7423 |
| 0.0785 | 6.8132 | 1860 | 0.5658 | 0.3571 | 0.5658 | 0.7522 |
| 0.0785 | 6.8205 | 1862 | 0.5707 | 0.4706 | 0.5707 | 0.7555 |
| 0.0785 | 6.8278 | 1864 | 0.5679 | 0.4179 | 0.5679 | 0.7536 |
| 0.0785 | 6.8352 | 1866 | 0.5624 | 0.4545 | 0.5624 | 0.7499 |
| 0.0785 | 6.8425 | 1868 | 0.5571 | 0.4828 | 0.5571 | 0.7464 |
| 0.0785 | 6.8498 | 1870 | 0.5582 | 0.4211 | 0.5582 | 0.7472 |
| 0.0785 | 6.8571 | 1872 | 0.5629 | 0.4211 | 0.5629 | 0.7503 |
| 0.0785 | 6.8645 | 1874 | 0.5692 | 0.3571 | 0.5692 | 0.7545 |
| 0.0785 | 6.8718 | 1876 | 0.5711 | 0.3571 | 0.5711 | 0.7557 |
| 0.0785 | 6.8791 | 1878 | 0.5676 | 0.4211 | 0.5676 | 0.7534 |
| 0.0785 | 6.8864 | 1880 | 0.5624 | 0.4211 | 0.5624 | 0.7500 |
| 0.0785 | 6.8938 | 1882 | 0.5502 | 0.4828 | 0.5502 | 0.7418 |
| 0.0785 | 6.9011 | 1884 | 0.5424 | 0.4828 | 0.5424 | 0.7365 |
| 0.0785 | 6.9084 | 1886 | 0.5386 | 0.4407 | 0.5386 | 0.7339 |
| 0.0785 | 6.9158 | 1888 | 0.5338 | 0.4407 | 0.5338 | 0.7306 |
| 0.0785 | 6.9231 | 1890 | 0.5356 | 0.4 | 0.5356 | 0.7319 |
| 0.0785 | 6.9304 | 1892 | 0.5434 | 0.4590 | 0.5434 | 0.7371 |
| 0.0785 | 6.9377 | 1894 | 0.5461 | 0.4590 | 0.5461 | 0.7390 |
| 0.0785 | 6.9451 | 1896 | 0.5444 | 0.4590 | 0.5444 | 0.7379 |
| 0.0785 | 6.9524 | 1898 | 0.5455 | 0.4 | 0.5455 | 0.7386 |
| 0.0785 | 6.9597 | 1900 | 0.5497 | 0.4179 | 0.5497 | 0.7414 |
| 0.0785 | 6.9670 | 1902 | 0.5565 | 0.4179 | 0.5565 | 0.7460 |
| 0.0785 | 6.9744 | 1904 | 0.5577 | 0.3478 | 0.5577 | 0.7468 |
| 0.0785 | 6.9817 | 1906 | 0.5550 | 0.3478 | 0.5550 | 0.7450 |
| 0.0785 | 6.9890 | 1908 | 0.5493 | 0.3226 | 0.5493 | 0.7412 |
| 0.0785 | 6.9963 | 1910 | 0.5453 | 0.4590 | 0.5453 | 0.7384 |
| 0.0785 | 7.0037 | 1912 | 0.5432 | 0.4590 | 0.5432 | 0.7370 |
| 0.0785 | 7.0110 | 1914 | 0.5445 | 0.3226 | 0.5445 | 0.7379 |
| 0.0785 | 7.0183 | 1916 | 0.5450 | 0.3226 | 0.5450 | 0.7383 |
| 0.0785 | 7.0256 | 1918 | 0.5438 | 0.3226 | 0.5438 | 0.7374 |
| 0.0785 | 7.0330 | 1920 | 0.5457 | 0.3226 | 0.5457 | 0.7387 |
| 0.0785 | 7.0403 | 1922 | 0.5393 | 0.3226 | 0.5393 | 0.7344 |
| 0.0785 | 7.0476 | 1924 | 0.5270 | 0.4590 | 0.5270 | 0.7259 |
| 0.0785 | 7.0549 | 1926 | 0.5209 | 0.4407 | 0.5209 | 0.7218 |
| 0.0785 | 7.0623 | 1928 | 0.5182 | 0.4407 | 0.5182 | 0.7199 |
| 0.0785 | 7.0696 | 1930 | 0.5177 | 0.4407 | 0.5177 | 0.7195 |
| 0.0785 | 7.0769 | 1932 | 0.5186 | 0.4407 | 0.5186 | 0.7201 |
| 0.0785 | 7.0842 | 1934 | 0.5213 | 0.4590 | 0.5213 | 0.7220 |
| 0.0785 | 7.0916 | 1936 | 0.5270 | 0.3226 | 0.5270 | 0.7260 |
| 0.0785 | 7.0989 | 1938 | 0.5305 | 0.3226 | 0.5305 | 0.7284 |
| 0.0785 | 7.1062 | 1940 | 0.5364 | 0.3226 | 0.5364 | 0.7324 |
| 0.0785 | 7.1136 | 1942 | 0.5438 | 0.3226 | 0.5438 | 0.7374 |
| 0.0785 | 7.1209 | 1944 | 0.5539 | 0.3478 | 0.5539 | 0.7442 |
| 0.0785 | 7.1282 | 1946 | 0.5597 | 0.3478 | 0.5597 | 0.7482 |
| 0.0785 | 7.1355 | 1948 | 0.5635 | 0.3478 | 0.5635 | 0.7507 |
| 0.0785 | 7.1429 | 1950 | 0.5595 | 0.3824 | 0.5595 | 0.7480 |
| 0.0785 | 7.1502 | 1952 | 0.5540 | 0.4407 | 0.5540 | 0.7443 |
| 0.0785 | 7.1575 | 1954 | 0.5507 | 0.4375 | 0.5507 | 0.7421 |
| 0.0785 | 7.1648 | 1956 | 0.5478 | 0.4348 | 0.5478 | 0.7401 |
| 0.0785 | 7.1722 | 1958 | 0.5478 | 0.3824 | 0.5478 | 0.7401 |
| 0.0785 | 7.1795 | 1960 | 0.5443 | 0.3824 | 0.5443 | 0.7378 |
| 0.0785 | 7.1868 | 1962 | 0.5416 | 0.3793 | 0.5416 | 0.7359 |
| 0.0785 | 7.1941 | 1964 | 0.5451 | 0.4407 | 0.5451 | 0.7383 |
| 0.0785 | 7.2015 | 1966 | 0.5532 | 0.5 | 0.5532 | 0.7438 |
| 0.0785 | 7.2088 | 1968 | 0.5682 | 0.3478 | 0.5682 | 0.7538 |
| 0.0785 | 7.2161 | 1970 | 0.5856 | 0.3478 | 0.5856 | 0.7652 |
| 0.0785 | 7.2234 | 1972 | 0.5935 | 0.3478 | 0.5935 | 0.7704 |
| 0.0785 | 7.2308 | 1974 | 0.5892 | 0.3478 | 0.5892 | 0.7676 |
| 0.0785 | 7.2381 | 1976 | 0.5806 | 0.3478 | 0.5806 | 0.7620 |
| 0.0785 | 7.2454 | 1978 | 0.5707 | 0.3478 | 0.5707 | 0.7555 |
| 0.0785 | 7.2527 | 1980 | 0.5673 | 0.3478 | 0.5673 | 0.7532 |
| 0.0785 | 7.2601 | 1982 | 0.5657 | 0.3478 | 0.5657 | 0.7521 |
| 0.0785 | 7.2674 | 1984 | 0.5613 | 0.3478 | 0.5613 | 0.7492 |
| 0.0785 | 7.2747 | 1986 | 0.5587 | 0.3478 | 0.5587 | 0.7475 |
| 0.0785 | 7.2821 | 1988 | 0.5552 | 0.3478 | 0.5552 | 0.7451 |
| 0.0785 | 7.2894 | 1990 | 0.5536 | 0.3478 | 0.5536 | 0.7440 |
| 0.0785 | 7.2967 | 1992 | 0.5563 | 0.3478 | 0.5563 | 0.7458 |
| 0.0785 | 7.3040 | 1994 | 0.5599 | 0.3478 | 0.5599 | 0.7483 |
| 0.0785 | 7.3114 | 1996 | 0.5662 | 0.3478 | 0.5662 | 0.7524 |
| 0.0785 | 7.3187 | 1998 | 0.5687 | 0.3478 | 0.5687 | 0.7541 |
| 0.0555 | 7.3260 | 2000 | 0.5800 | 0.3478 | 0.5800 | 0.7616 |
| 0.0555 | 7.3333 | 2002 | 0.5922 | 0.3478 | 0.5922 | 0.7696 |
| 0.0555 | 7.3407 | 2004 | 0.6046 | 0.3478 | 0.6046 | 0.7776 |
| 0.0555 | 7.3480 | 2006 | 0.6190 | 0.4 | 0.6190 | 0.7868 |
| 0.0555 | 7.3553 | 2008 | 0.6313 | 0.3077 | 0.6313 | 0.7945 |
| 0.0555 | 7.3626 | 2010 | 0.6427 | 0.25 | 0.6427 | 0.8017 |
| 0.0555 | 7.3700 | 2012 | 0.6403 | 0.25 | 0.6403 | 0.8002 |
| 0.0555 | 7.3773 | 2014 | 0.6257 | 0.3077 | 0.6257 | 0.7910 |
| 0.0555 | 7.3846 | 2016 | 0.6071 | 0.4 | 0.6071 | 0.7792 |
| 0.0555 | 7.3919 | 2018 | 0.5883 | 0.3478 | 0.5883 | 0.7670 |
| 0.0555 | 7.3993 | 2020 | 0.5755 | 0.3478 | 0.5755 | 0.7586 |
| 0.0555 | 7.4066 | 2022 | 0.5697 | 0.3824 | 0.5697 | 0.7548 |
| 0.0555 | 7.4139 | 2024 | 0.5716 | 0.3824 | 0.5716 | 0.7560 |
| 0.0555 | 7.4212 | 2026 | 0.5698 | 0.3824 | 0.5698 | 0.7549 |
| 0.0555 | 7.4286 | 2028 | 0.5649 | 0.4545 | 0.5649 | 0.7516 |
| 0.0555 | 7.4359 | 2030 | 0.5636 | 0.4545 | 0.5636 | 0.7508 |
| 0.0555 | 7.4432 | 2032 | 0.5669 | 0.4545 | 0.5669 | 0.7529 |
| 0.0555 | 7.4505 | 2034 | 0.5736 | 0.4545 | 0.5736 | 0.7574 |
| 0.0555 | 7.4579 | 2036 | 0.5813 | 0.4545 | 0.5813 | 0.7624 |
| 0.0555 | 7.4652 | 2038 | 0.5913 | 0.3824 | 0.5913 | 0.7690 |
| 0.0555 | 7.4725 | 2040 | 0.5968 | 0.3824 | 0.5968 | 0.7725 |
| 0.0555 | 7.4799 | 2042 | 0.6034 | 0.3824 | 0.6034 | 0.7768 |
| 0.0555 | 7.4872 | 2044 | 0.6114 | 0.3478 | 0.6114 | 0.7819 |
| 0.0555 | 7.4945 | 2046 | 0.6161 | 0.3478 | 0.6161 | 0.7849 |
| 0.0555 | 7.5018 | 2048 | 0.6128 | 0.3478 | 0.6128 | 0.7828 |
| 0.0555 | 7.5092 | 2050 | 0.6013 | 0.3478 | 0.6013 | 0.7755 |
| 0.0555 | 7.5165 | 2052 | 0.5807 | 0.3478 | 0.5807 | 0.7620 |
| 0.0555 | 7.5238 | 2054 | 0.5669 | 0.3824 | 0.5669 | 0.7529 |
| 0.0555 | 7.5311 | 2056 | 0.5605 | 0.3824 | 0.5605 | 0.7486 |
| 0.0555 | 7.5385 | 2058 | 0.5567 | 0.3478 | 0.5567 | 0.7461 |
| 0.0555 | 7.5458 | 2060 | 0.5571 | 0.3478 | 0.5571 | 0.7464 |
| 0.0555 | 7.5531 | 2062 | 0.5519 | 0.3478 | 0.5519 | 0.7429 |
| 0.0555 | 7.5604 | 2064 | 0.5492 | 0.2500 | 0.5492 | 0.7411 |
| 0.0555 | 7.5678 | 2066 | 0.5532 | 0.2500 | 0.5532 | 0.7438 |
| 0.0555 | 7.5751 | 2068 | 0.5684 | 0.3077 | 0.5684 | 0.7539 |
| 0.0555 | 7.5824 | 2070 | 0.5736 | 0.3077 | 0.5736 | 0.7574 |
| 0.0555 | 7.5897 | 2072 | 0.5827 | 0.3077 | 0.5827 | 0.7633 |
| 0.0555 | 7.5971 | 2074 | 0.5899 | 0.3077 | 0.5899 | 0.7680 |
| 0.0555 | 7.6044 | 2076 | 0.5898 | 0.3077 | 0.5898 | 0.7680 |
| 0.0555 | 7.6117 | 2078 | 0.5746 | 0.3077 | 0.5746 | 0.7580 |
| 0.0555 | 7.6190 | 2080 | 0.5513 | 0.3478 | 0.5513 | 0.7425 |
| 0.0555 | 7.6264 | 2082 | 0.5376 | 0.4706 | 0.5376 | 0.7332 |
| 0.0555 | 7.6337 | 2084 | 0.5317 | 0.4545 | 0.5317 | 0.7292 |
| 0.0555 | 7.6410 | 2086 | 0.5350 | 0.4923 | 0.5350 | 0.7314 |
| 0.0555 | 7.6484 | 2088 | 0.5422 | 0.48 | 0.5422 | 0.7363 |
| 0.0555 | 7.6557 | 2090 | 0.5505 | 0.48 | 0.5505 | 0.7420 |
| 0.0555 | 7.6630 | 2092 | 0.5577 | 0.4507 | 0.5577 | 0.7468 |
| 0.0555 | 7.6703 | 2094 | 0.5662 | 0.5075 | 0.5662 | 0.7524 |
| 0.0555 | 7.6777 | 2096 | 0.5695 | 0.5075 | 0.5695 | 0.7547 |
| 0.0555 | 7.6850 | 2098 | 0.5703 | 0.5075 | 0.5703 | 0.7552 |
| 0.0555 | 7.6923 | 2100 | 0.5664 | 0.5075 | 0.5664 | 0.7526 |
| 0.0555 | 7.6996 | 2102 | 0.5653 | 0.5075 | 0.5653 | 0.7518 |
| 0.0555 | 7.7070 | 2104 | 0.5644 | 0.4545 | 0.5644 | 0.7513 |
| 0.0555 | 7.7143 | 2106 | 0.5670 | 0.4545 | 0.5670 | 0.7530 |
| 0.0555 | 7.7216 | 2108 | 0.5719 | 0.5075 | 0.5719 | 0.7562 |
| 0.0555 | 7.7289 | 2110 | 0.5746 | 0.5075 | 0.5746 | 0.7580 |
| 0.0555 | 7.7363 | 2112 | 0.5802 | 0.3478 | 0.5802 | 0.7617 |
| 0.0555 | 7.7436 | 2114 | 0.5827 | 0.3478 | 0.5827 | 0.7633 |
| 0.0555 | 7.7509 | 2116 | 0.5757 | 0.3478 | 0.5757 | 0.7588 |
| 0.0555 | 7.7582 | 2118 | 0.5691 | 0.3478 | 0.5691 | 0.7544 |
| 0.0555 | 7.7656 | 2120 | 0.5626 | 0.3478 | 0.5626 | 0.7501 |
| 0.0555 | 7.7729 | 2122 | 0.5588 | 0.3824 | 0.5588 | 0.7475 |
| 0.0555 | 7.7802 | 2124 | 0.5517 | 0.5075 | 0.5517 | 0.7428 |
| 0.0555 | 7.7875 | 2126 | 0.5434 | 0.5075 | 0.5434 | 0.7372 |
| 0.0555 | 7.7949 | 2128 | 0.5372 | 0.4375 | 0.5372 | 0.7330 |
| 0.0555 | 7.8022 | 2130 | 0.5356 | 0.4375 | 0.5356 | 0.7319 |
| 0.0555 | 7.8095 | 2132 | 0.5345 | 0.4375 | 0.5345 | 0.7311 |
| 0.0555 | 7.8168 | 2134 | 0.5349 | 0.4407 | 0.5349 | 0.7313 |
| 0.0555 | 7.8242 | 2136 | 0.5366 | 0.4407 | 0.5366 | 0.7325 |
| 0.0555 | 7.8315 | 2138 | 0.5416 | 0.4545 | 0.5416 | 0.7359 |
| 0.0555 | 7.8388 | 2140 | 0.5494 | 0.5075 | 0.5494 | 0.7412 |
| 0.0555 | 7.8462 | 2142 | 0.5597 | 0.3478 | 0.5597 | 0.7481 |
| 0.0555 | 7.8535 | 2144 | 0.5673 | 0.3478 | 0.5673 | 0.7532 |
| 0.0555 | 7.8608 | 2146 | 0.5786 | 0.3478 | 0.5786 | 0.7606 |
| 0.0555 | 7.8681 | 2148 | 0.5814 | 0.3478 | 0.5814 | 0.7625 |
| 0.0555 | 7.8755 | 2150 | 0.5745 | 0.3478 | 0.5745 | 0.7579 |
| 0.0555 | 7.8828 | 2152 | 0.5703 | 0.3478 | 0.5703 | 0.7552 |
| 0.0555 | 7.8901 | 2154 | 0.5699 | 0.3478 | 0.5699 | 0.7549 |
| 0.0555 | 7.8974 | 2156 | 0.5678 | 0.3478 | 0.5678 | 0.7536 |
| 0.0555 | 7.9048 | 2158 | 0.5636 | 0.5075 | 0.5636 | 0.7507 |
| 0.0555 | 7.9121 | 2160 | 0.5581 | 0.5075 | 0.5581 | 0.7470 |
| 0.0555 | 7.9194 | 2162 | 0.5490 | 0.4545 | 0.5490 | 0.7409 |
| 0.0555 | 7.9267 | 2164 | 0.5494 | 0.4545 | 0.5494 | 0.7412 |
| 0.0555 | 7.9341 | 2166 | 0.5558 | 0.5075 | 0.5558 | 0.7455 |
| 0.0555 | 7.9414 | 2168 | 0.5637 | 0.5075 | 0.5637 | 0.7508 |
| 0.0555 | 7.9487 | 2170 | 0.5699 | 0.3478 | 0.5699 | 0.7549 |
| 0.0555 | 7.9560 | 2172 | 0.5751 | 0.3478 | 0.5751 | 0.7584 |
| 0.0555 | 7.9634 | 2174 | 0.5744 | 0.3478 | 0.5744 | 0.7579 |
| 0.0555 | 7.9707 | 2176 | 0.5774 | 0.3478 | 0.5774 | 0.7599 |
| 0.0555 | 7.9780 | 2178 | 0.5806 | 0.3478 | 0.5806 | 0.7620 |
| 0.0555 | 7.9853 | 2180 | 0.5831 | 0.3478 | 0.5831 | 0.7636 |
| 0.0555 | 7.9927 | 2182 | 0.5831 | 0.3478 | 0.5831 | 0.7636 |
| 0.0555 | 8.0 | 2184 | 0.5836 | 0.3478 | 0.5836 | 0.7640 |
| 0.0555 | 8.0073 | 2186 | 0.5885 | 0.3478 | 0.5885 | 0.7671 |
| 0.0555 | 8.0147 | 2188 | 0.5972 | 0.3478 | 0.5972 | 0.7728 |
| 0.0555 | 8.0220 | 2190 | 0.6014 | 0.3478 | 0.6014 | 0.7755 |
| 0.0555 | 8.0293 | 2192 | 0.6106 | 0.3478 | 0.6106 | 0.7814 |
| 0.0555 | 8.0366 | 2194 | 0.6189 | 0.3478 | 0.6189 | 0.7867 |
| 0.0555 | 8.0440 | 2196 | 0.6244 | 0.3478 | 0.6244 | 0.7902 |
| 0.0555 | 8.0513 | 2198 | 0.6226 | 0.3478 | 0.6226 | 0.7891 |
| 0.0555 | 8.0586 | 2200 | 0.6172 | 0.3478 | 0.6172 | 0.7856 |
| 0.0555 | 8.0659 | 2202 | 0.6123 | 0.3514 | 0.6123 | 0.7825 |
| 0.0555 | 8.0733 | 2204 | 0.6068 | 0.4507 | 0.6068 | 0.7790 |
| 0.0555 | 8.0806 | 2206 | 0.6067 | 0.4474 | 0.6067 | 0.7789 |
| 0.0555 | 8.0879 | 2208 | 0.6093 | 0.4474 | 0.6093 | 0.7806 |
| 0.0555 | 8.0952 | 2210 | 0.6127 | 0.4474 | 0.6127 | 0.7828 |
| 0.0555 | 8.1026 | 2212 | 0.6125 | 0.4474 | 0.6125 | 0.7826 |
| 0.0555 | 8.1099 | 2214 | 0.6125 | 0.4474 | 0.6125 | 0.7826 |
| 0.0555 | 8.1172 | 2216 | 0.6110 | 0.4474 | 0.6110 | 0.7817 |
| 0.0555 | 8.1245 | 2218 | 0.6084 | 0.4474 | 0.6084 | 0.7800 |
| 0.0555 | 8.1319 | 2220 | 0.6068 | 0.4474 | 0.6068 | 0.7790 |
| 0.0555 | 8.1392 | 2222 | 0.6060 | 0.4507 | 0.6060 | 0.7785 |
| 0.0555 | 8.1465 | 2224 | 0.6050 | 0.4507 | 0.6050 | 0.7778 |
| 0.0555 | 8.1538 | 2226 | 0.6020 | 0.5 | 0.6020 | 0.7759 |
| 0.0555 | 8.1612 | 2228 | 0.5998 | 0.5 | 0.5998 | 0.7745 |
| 0.0555 | 8.1685 | 2230 | 0.5957 | 0.5 | 0.5957 | 0.7718 |
| 0.0555 | 8.1758 | 2232 | 0.5928 | 0.5 | 0.5928 | 0.7699 |
| 0.0555 | 8.1832 | 2234 | 0.5901 | 0.5075 | 0.5901 | 0.7682 |
| 0.0555 | 8.1905 | 2236 | 0.5867 | 0.3824 | 0.5867 | 0.7660 |
| 0.0555 | 8.1978 | 2238 | 0.5851 | 0.3478 | 0.5851 | 0.7649 |
| 0.0555 | 8.2051 | 2240 | 0.5822 | 0.3478 | 0.5822 | 0.7630 |
| 0.0555 | 8.2125 | 2242 | 0.5833 | 0.3478 | 0.5833 | 0.7637 |
| 0.0555 | 8.2198 | 2244 | 0.5853 | 0.3478 | 0.5853 | 0.7651 |
| 0.0555 | 8.2271 | 2246 | 0.5863 | 0.3478 | 0.5863 | 0.7657 |
| 0.0555 | 8.2344 | 2248 | 0.5829 | 0.3824 | 0.5829 | 0.7635 |
| 0.0555 | 8.2418 | 2250 | 0.5773 | 0.5075 | 0.5773 | 0.7598 |
| 0.0555 | 8.2491 | 2252 | 0.5775 | 0.5 | 0.5775 | 0.7599 |
| 0.0555 | 8.2564 | 2254 | 0.5807 | 0.5 | 0.5807 | 0.7620 |
| 0.0555 | 8.2637 | 2256 | 0.5820 | 0.5 | 0.5820 | 0.7629 |
| 0.0555 | 8.2711 | 2258 | 0.5824 | 0.3836 | 0.5824 | 0.7631 |
| 0.0555 | 8.2784 | 2260 | 0.5776 | 0.5 | 0.5776 | 0.7600 |
| 0.0555 | 8.2857 | 2262 | 0.5725 | 0.5 | 0.5725 | 0.7567 |
| 0.0555 | 8.2930 | 2264 | 0.5685 | 0.5 | 0.5685 | 0.7540 |
| 0.0555 | 8.3004 | 2266 | 0.5670 | 0.5 | 0.5670 | 0.7530 |
| 0.0555 | 8.3077 | 2268 | 0.5669 | 0.3824 | 0.5669 | 0.7529 |
| 0.0555 | 8.3150 | 2270 | 0.5702 | 0.3824 | 0.5702 | 0.7551 |
| 0.0555 | 8.3223 | 2272 | 0.5737 | 0.3824 | 0.5737 | 0.7574 |
| 0.0555 | 8.3297 | 2274 | 0.5774 | 0.3478 | 0.5774 | 0.7599 |
| 0.0555 | 8.3370 | 2276 | 0.5804 | 0.3478 | 0.5804 | 0.7618 |
| 0.0555 | 8.3443 | 2278 | 0.5754 | 0.3478 | 0.5754 | 0.7585 |
| 0.0555 | 8.3516 | 2280 | 0.5720 | 0.3478 | 0.5720 | 0.7563 |
| 0.0555 | 8.3590 | 2282 | 0.5672 | 0.3478 | 0.5672 | 0.7531 |
| 0.0555 | 8.3663 | 2284 | 0.5628 | 0.3478 | 0.5628 | 0.7502 |
| 0.0555 | 8.3736 | 2286 | 0.5574 | 0.3478 | 0.5574 | 0.7466 |
| 0.0555 | 8.3810 | 2288 | 0.5503 | 0.3824 | 0.5503 | 0.7418 |
| 0.0555 | 8.3883 | 2290 | 0.5463 | 0.3607 | 0.5463 | 0.7391 |
| 0.0555 | 8.3956 | 2292 | 0.5475 | 0.3607 | 0.5475 | 0.7399 |
| 0.0555 | 8.4029 | 2294 | 0.5492 | 0.3478 | 0.5492 | 0.7411 |
| 0.0555 | 8.4103 | 2296 | 0.5484 | 0.3478 | 0.5484 | 0.7406 |
| 0.0555 | 8.4176 | 2298 | 0.5486 | 0.3478 | 0.5486 | 0.7407 |
| 0.0555 | 8.4249 | 2300 | 0.5507 | 0.3478 | 0.5507 | 0.7421 |
| 0.0555 | 8.4322 | 2302 | 0.5524 | 0.3478 | 0.5524 | 0.7432 |
| 0.0555 | 8.4396 | 2304 | 0.5508 | 0.3478 | 0.5508 | 0.7421 |
| 0.0555 | 8.4469 | 2306 | 0.5476 | 0.3824 | 0.5476 | 0.7400 |
| 0.0555 | 8.4542 | 2308 | 0.5475 | 0.3824 | 0.5475 | 0.7400 |
| 0.0555 | 8.4615 | 2310 | 0.5535 | 0.3824 | 0.5535 | 0.7440 |
| 0.0555 | 8.4689 | 2312 | 0.5645 | 0.3478 | 0.5645 | 0.7513 |
| 0.0555 | 8.4762 | 2314 | 0.5741 | 0.3478 | 0.5741 | 0.7577 |
| 0.0555 | 8.4835 | 2316 | 0.5803 | 0.3478 | 0.5803 | 0.7617 |
| 0.0555 | 8.4908 | 2318 | 0.5842 | 0.3478 | 0.5842 | 0.7644 |
| 0.0555 | 8.4982 | 2320 | 0.5849 | 0.3478 | 0.5849 | 0.7648 |
| 0.0555 | 8.5055 | 2322 | 0.5799 | 0.3478 | 0.5799 | 0.7615 |
| 0.0555 | 8.5128 | 2324 | 0.5778 | 0.3478 | 0.5778 | 0.7601 |
| 0.0555 | 8.5201 | 2326 | 0.5814 | 0.3478 | 0.5814 | 0.7625 |
| 0.0555 | 8.5275 | 2328 | 0.5837 | 0.3478 | 0.5837 | 0.7640 |
| 0.0555 | 8.5348 | 2330 | 0.5807 | 0.3478 | 0.5807 | 0.7620 |
| 0.0555 | 8.5421 | 2332 | 0.5739 | 0.3478 | 0.5739 | 0.7576 |
| 0.0555 | 8.5495 | 2334 | 0.5650 | 0.3824 | 0.5650 | 0.7517 |
| 0.0555 | 8.5568 | 2336 | 0.5598 | 0.3824 | 0.5598 | 0.7482 |
| 0.0555 | 8.5641 | 2338 | 0.5569 | 0.3836 | 0.5569 | 0.7462 |
| 0.0555 | 8.5714 | 2340 | 0.5582 | 0.3836 | 0.5582 | 0.7471 |
| 0.0555 | 8.5788 | 2342 | 0.5617 | 0.3824 | 0.5617 | 0.7494 |
| 0.0555 | 8.5861 | 2344 | 0.5664 | 0.3824 | 0.5664 | 0.7526 |
| 0.0555 | 8.5934 | 2346 | 0.5708 | 0.3478 | 0.5708 | 0.7555 |
| 0.0555 | 8.6007 | 2348 | 0.5735 | 0.3478 | 0.5735 | 0.7573 |
| 0.0555 | 8.6081 | 2350 | 0.5779 | 0.3478 | 0.5779 | 0.7602 |
| 0.0555 | 8.6154 | 2352 | 0.5804 | 0.3478 | 0.5804 | 0.7619 |
| 0.0555 | 8.6227 | 2354 | 0.5842 | 0.3478 | 0.5842 | 0.7644 |
| 0.0555 | 8.6300 | 2356 | 0.5875 | 0.3478 | 0.5875 | 0.7665 |
| 0.0555 | 8.6374 | 2358 | 0.5864 | 0.3478 | 0.5864 | 0.7658 |
| 0.0555 | 8.6447 | 2360 | 0.5830 | 0.3478 | 0.5830 | 0.7636 |
| 0.0555 | 8.6520 | 2362 | 0.5796 | 0.3478 | 0.5796 | 0.7613 |
| 0.0555 | 8.6593 | 2364 | 0.5794 | 0.3478 | 0.5794 | 0.7612 |
| 0.0555 | 8.6667 | 2366 | 0.5771 | 0.3478 | 0.5771 | 0.7597 |
| 0.0555 | 8.6740 | 2368 | 0.5672 | 0.3478 | 0.5672 | 0.7532 |
| 0.0555 | 8.6813 | 2370 | 0.5574 | 0.3478 | 0.5574 | 0.7466 |
| 0.0555 | 8.6886 | 2372 | 0.5500 | 0.3824 | 0.5500 | 0.7416 |
| 0.0555 | 8.6960 | 2374 | 0.5448 | 0.3824 | 0.5448 | 0.7381 |
| 0.0555 | 8.7033 | 2376 | 0.5383 | 0.5075 | 0.5383 | 0.7337 |
| 0.0555 | 8.7106 | 2378 | 0.5318 | 0.5 | 0.5318 | 0.7292 |
| 0.0555 | 8.7179 | 2380 | 0.5312 | 0.5 | 0.5312 | 0.7288 |
| 0.0555 | 8.7253 | 2382 | 0.5346 | 0.5 | 0.5346 | 0.7312 |
| 0.0555 | 8.7326 | 2384 | 0.5389 | 0.5075 | 0.5389 | 0.7341 |
| 0.0555 | 8.7399 | 2386 | 0.5445 | 0.5075 | 0.5445 | 0.7379 |
| 0.0555 | 8.7473 | 2388 | 0.5493 | 0.3824 | 0.5493 | 0.7411 |
| 0.0555 | 8.7546 | 2390 | 0.5550 | 0.3824 | 0.5550 | 0.7450 |
| 0.0555 | 8.7619 | 2392 | 0.5604 | 0.3824 | 0.5604 | 0.7486 |
| 0.0555 | 8.7692 | 2394 | 0.5638 | 0.3824 | 0.5638 | 0.7509 |
| 0.0555 | 8.7766 | 2396 | 0.5686 | 0.3824 | 0.5686 | 0.7541 |
| 0.0555 | 8.7839 | 2398 | 0.5788 | 0.3478 | 0.5788 | 0.7608 |
| 0.0555 | 8.7912 | 2400 | 0.5859 | 0.3478 | 0.5859 | 0.7654 |
| 0.0555 | 8.7985 | 2402 | 0.5961 | 0.3478 | 0.5961 | 0.7721 |
| 0.0555 | 8.8059 | 2404 | 0.6048 | 0.3478 | 0.6048 | 0.7777 |
| 0.0555 | 8.8132 | 2406 | 0.6118 | 0.4 | 0.6118 | 0.7821 |
| 0.0555 | 8.8205 | 2408 | 0.6112 | 0.4 | 0.6112 | 0.7818 |
| 0.0555 | 8.8278 | 2410 | 0.6079 | 0.3478 | 0.6079 | 0.7797 |
| 0.0555 | 8.8352 | 2412 | 0.6000 | 0.3478 | 0.6000 | 0.7746 |
| 0.0555 | 8.8425 | 2414 | 0.5932 | 0.3478 | 0.5932 | 0.7702 |
| 0.0555 | 8.8498 | 2416 | 0.5907 | 0.3478 | 0.5907 | 0.7686 |
| 0.0555 | 8.8571 | 2418 | 0.5910 | 0.3478 | 0.5910 | 0.7688 |
| 0.0555 | 8.8645 | 2420 | 0.5863 | 0.3478 | 0.5863 | 0.7657 |
| 0.0555 | 8.8718 | 2422 | 0.5829 | 0.3824 | 0.5829 | 0.7634 |
| 0.0555 | 8.8791 | 2424 | 0.5810 | 0.3824 | 0.5810 | 0.7622 |
| 0.0555 | 8.8864 | 2426 | 0.5768 | 0.5075 | 0.5768 | 0.7595 |
| 0.0555 | 8.8938 | 2428 | 0.5718 | 0.5075 | 0.5718 | 0.7562 |
| 0.0555 | 8.9011 | 2430 | 0.5658 | 0.4507 | 0.5658 | 0.7522 |
| 0.0555 | 8.9084 | 2432 | 0.5589 | 0.4507 | 0.5589 | 0.7476 |
| 0.0555 | 8.9158 | 2434 | 0.5549 | 0.4507 | 0.5549 | 0.7449 |
| 0.0555 | 8.9231 | 2436 | 0.5510 | 0.4507 | 0.5510 | 0.7423 |
| 0.0555 | 8.9304 | 2438 | 0.5464 | 0.4507 | 0.5464 | 0.7392 |
| 0.0555 | 8.9377 | 2440 | 0.5425 | 0.4375 | 0.5425 | 0.7365 |
| 0.0555 | 8.9451 | 2442 | 0.5389 | 0.4407 | 0.5389 | 0.7341 |
| 0.0555 | 8.9524 | 2444 | 0.5371 | 0.4407 | 0.5371 | 0.7328 |
| 0.0555 | 8.9597 | 2446 | 0.5380 | 0.4407 | 0.5380 | 0.7335 |
| 0.0555 | 8.9670 | 2448 | 0.5404 | 0.4407 | 0.5404 | 0.7351 |
| 0.0555 | 8.9744 | 2450 | 0.5450 | 0.5075 | 0.5450 | 0.7383 |
| 0.0555 | 8.9817 | 2452 | 0.5494 | 0.5075 | 0.5494 | 0.7412 |
| 0.0555 | 8.9890 | 2454 | 0.5516 | 0.3824 | 0.5516 | 0.7427 |
| 0.0555 | 8.9963 | 2456 | 0.5544 | 0.3824 | 0.5544 | 0.7446 |
| 0.0555 | 9.0037 | 2458 | 0.5561 | 0.3824 | 0.5561 | 0.7457 |
| 0.0555 | 9.0110 | 2460 | 0.5573 | 0.3824 | 0.5573 | 0.7466 |
| 0.0555 | 9.0183 | 2462 | 0.5567 | 0.3824 | 0.5567 | 0.7461 |
| 0.0555 | 9.0256 | 2464 | 0.5550 | 0.3824 | 0.5550 | 0.7450 |
| 0.0555 | 9.0330 | 2466 | 0.5523 | 0.3824 | 0.5523 | 0.7432 |
| 0.0555 | 9.0403 | 2468 | 0.5508 | 0.3824 | 0.5508 | 0.7422 |
| 0.0555 | 9.0476 | 2470 | 0.5516 | 0.3824 | 0.5516 | 0.7427 |
| 0.0555 | 9.0549 | 2472 | 0.5523 | 0.3824 | 0.5523 | 0.7432 |
| 0.0555 | 9.0623 | 2474 | 0.5501 | 0.3824 | 0.5501 | 0.7417 |
| 0.0555 | 9.0696 | 2476 | 0.5470 | 0.3824 | 0.5470 | 0.7396 |
| 0.0555 | 9.0769 | 2478 | 0.5457 | 0.5075 | 0.5457 | 0.7387 |
| 0.0555 | 9.0842 | 2480 | 0.5438 | 0.5 | 0.5438 | 0.7374 |
| 0.0555 | 9.0916 | 2482 | 0.5429 | 0.5 | 0.5429 | 0.7368 |
| 0.0555 | 9.0989 | 2484 | 0.5433 | 0.5 | 0.5433 | 0.7371 |
| 0.0555 | 9.1062 | 2486 | 0.5455 | 0.5 | 0.5455 | 0.7386 |
| 0.0555 | 9.1136 | 2488 | 0.5484 | 0.3824 | 0.5484 | 0.7405 |
| 0.0555 | 9.1209 | 2490 | 0.5499 | 0.3824 | 0.5499 | 0.7416 |
| 0.0555 | 9.1282 | 2492 | 0.5478 | 0.3824 | 0.5478 | 0.7402 |
| 0.0555 | 9.1355 | 2494 | 0.5446 | 0.5 | 0.5446 | 0.7379 |
| 0.0555 | 9.1429 | 2496 | 0.5405 | 0.5 | 0.5405 | 0.7352 |
| 0.0555 | 9.1502 | 2498 | 0.5382 | 0.3607 | 0.5382 | 0.7336 |
| 0.0473 | 9.1575 | 2500 | 0.5385 | 0.3607 | 0.5385 | 0.7338 |
| 0.0473 | 9.1648 | 2502 | 0.5384 | 0.3607 | 0.5384 | 0.7338 |
| 0.0473 | 9.1722 | 2504 | 0.5363 | 0.3607 | 0.5363 | 0.7323 |
| 0.0473 | 9.1795 | 2506 | 0.5358 | 0.3607 | 0.5358 | 0.7320 |
| 0.0473 | 9.1868 | 2508 | 0.5379 | 0.3607 | 0.5379 | 0.7334 |
| 0.0473 | 9.1941 | 2510 | 0.5395 | 0.3607 | 0.5395 | 0.7345 |
| 0.0473 | 9.2015 | 2512 | 0.5413 | 0.3607 | 0.5413 | 0.7357 |
| 0.0473 | 9.2088 | 2514 | 0.5407 | 0.3607 | 0.5407 | 0.7353 |
| 0.0473 | 9.2161 | 2516 | 0.5389 | 0.3607 | 0.5389 | 0.7341 |
| 0.0473 | 9.2234 | 2518 | 0.5366 | 0.3607 | 0.5366 | 0.7325 |
| 0.0473 | 9.2308 | 2520 | 0.5352 | 0.3607 | 0.5352 | 0.7316 |
| 0.0473 | 9.2381 | 2522 | 0.5352 | 0.3607 | 0.5352 | 0.7316 |
| 0.0473 | 9.2454 | 2524 | 0.5366 | 0.3607 | 0.5366 | 0.7325 |
| 0.0473 | 9.2527 | 2526 | 0.5385 | 0.3607 | 0.5385 | 0.7338 |
| 0.0473 | 9.2601 | 2528 | 0.5397 | 0.3607 | 0.5397 | 0.7347 |
| 0.0473 | 9.2674 | 2530 | 0.5410 | 0.3607 | 0.5410 | 0.7355 |
| 0.0473 | 9.2747 | 2532 | 0.5408 | 0.3607 | 0.5408 | 0.7354 |
| 0.0473 | 9.2821 | 2534 | 0.5413 | 0.3607 | 0.5413 | 0.7357 |
| 0.0473 | 9.2894 | 2536 | 0.5418 | 0.3607 | 0.5418 | 0.7361 |
| 0.0473 | 9.2967 | 2538 | 0.5420 | 0.3607 | 0.5420 | 0.7362 |
| 0.0473 | 9.3040 | 2540 | 0.5434 | 0.3636 | 0.5434 | 0.7372 |
| 0.0473 | 9.3114 | 2542 | 0.5449 | 0.3607 | 0.5449 | 0.7382 |
| 0.0473 | 9.3187 | 2544 | 0.5487 | 0.3607 | 0.5487 | 0.7407 |
| 0.0473 | 9.3260 | 2546 | 0.5532 | 0.3824 | 0.5532 | 0.7437 |
| 0.0473 | 9.3333 | 2548 | 0.5542 | 0.3824 | 0.5542 | 0.7444 |
| 0.0473 | 9.3407 | 2550 | 0.5540 | 0.3607 | 0.5540 | 0.7443 |
| 0.0473 | 9.3480 | 2552 | 0.5528 | 0.3607 | 0.5528 | 0.7435 |
| 0.0473 | 9.3553 | 2554 | 0.5523 | 0.3607 | 0.5523 | 0.7431 |
| 0.0473 | 9.3626 | 2556 | 0.5526 | 0.3607 | 0.5526 | 0.7434 |
| 0.0473 | 9.3700 | 2558 | 0.5538 | 0.3607 | 0.5538 | 0.7442 |
| 0.0473 | 9.3773 | 2560 | 0.5546 | 0.3607 | 0.5546 | 0.7447 |
| 0.0473 | 9.3846 | 2562 | 0.5537 | 0.3607 | 0.5537 | 0.7441 |
| 0.0473 | 9.3919 | 2564 | 0.5520 | 0.3607 | 0.5520 | 0.7430 |
| 0.0473 | 9.3993 | 2566 | 0.5510 | 0.3607 | 0.5510 | 0.7423 |
| 0.0473 | 9.4066 | 2568 | 0.5495 | 0.3607 | 0.5495 | 0.7413 |
| 0.0473 | 9.4139 | 2570 | 0.5473 | 0.3607 | 0.5473 | 0.7398 |
| 0.0473 | 9.4212 | 2572 | 0.5458 | 0.3607 | 0.5458 | 0.7388 |
| 0.0473 | 9.4286 | 2574 | 0.5447 | 0.3607 | 0.5447 | 0.7380 |
| 0.0473 | 9.4359 | 2576 | 0.5444 | 0.3607 | 0.5444 | 0.7379 |
| 0.0473 | 9.4432 | 2578 | 0.5450 | 0.3607 | 0.5450 | 0.7383 |
| 0.0473 | 9.4505 | 2580 | 0.5457 | 0.3607 | 0.5457 | 0.7387 |
| 0.0473 | 9.4579 | 2582 | 0.5457 | 0.3607 | 0.5457 | 0.7387 |
| 0.0473 | 9.4652 | 2584 | 0.5455 | 0.3607 | 0.5455 | 0.7386 |
| 0.0473 | 9.4725 | 2586 | 0.5447 | 0.3607 | 0.5447 | 0.7380 |
| 0.0473 | 9.4799 | 2588 | 0.5451 | 0.3607 | 0.5451 | 0.7383 |
| 0.0473 | 9.4872 | 2590 | 0.5463 | 0.3607 | 0.5463 | 0.7391 |
| 0.0473 | 9.4945 | 2592 | 0.5489 | 0.3226 | 0.5489 | 0.7409 |
| 0.0473 | 9.5018 | 2594 | 0.5505 | 0.3226 | 0.5505 | 0.7420 |
| 0.0473 | 9.5092 | 2596 | 0.5524 | 0.3226 | 0.5524 | 0.7432 |
| 0.0473 | 9.5165 | 2598 | 0.5526 | 0.3226 | 0.5526 | 0.7434 |
| 0.0473 | 9.5238 | 2600 | 0.5506 | 0.3226 | 0.5506 | 0.7420 |
| 0.0473 | 9.5311 | 2602 | 0.5478 | 0.3226 | 0.5478 | 0.7402 |
| 0.0473 | 9.5385 | 2604 | 0.5454 | 0.3226 | 0.5454 | 0.7385 |
| 0.0473 | 9.5458 | 2606 | 0.5432 | 0.3226 | 0.5432 | 0.7370 |
| 0.0473 | 9.5531 | 2608 | 0.5422 | 0.3226 | 0.5422 | 0.7364 |
| 0.0473 | 9.5604 | 2610 | 0.5400 | 0.3607 | 0.5400 | 0.7348 |
| 0.0473 | 9.5678 | 2612 | 0.5376 | 0.3607 | 0.5376 | 0.7332 |
| 0.0473 | 9.5751 | 2614 | 0.5358 | 0.3607 | 0.5358 | 0.7320 |
| 0.0473 | 9.5824 | 2616 | 0.5338 | 0.3607 | 0.5338 | 0.7306 |
| 0.0473 | 9.5897 | 2618 | 0.5327 | 0.3607 | 0.5327 | 0.7298 |
| 0.0473 | 9.5971 | 2620 | 0.5313 | 0.3607 | 0.5313 | 0.7289 |
| 0.0473 | 9.6044 | 2622 | 0.5309 | 0.3607 | 0.5309 | 0.7286 |
| 0.0473 | 9.6117 | 2624 | 0.5304 | 0.3607 | 0.5304 | 0.7283 |
| 0.0473 | 9.6190 | 2626 | 0.5296 | 0.3607 | 0.5296 | 0.7278 |
| 0.0473 | 9.6264 | 2628 | 0.5293 | 0.3607 | 0.5293 | 0.7275 |
| 0.0473 | 9.6337 | 2630 | 0.5291 | 0.3607 | 0.5291 | 0.7274 |
| 0.0473 | 9.6410 | 2632 | 0.5293 | 0.3607 | 0.5293 | 0.7275 |
| 0.0473 | 9.6484 | 2634 | 0.5293 | 0.3607 | 0.5293 | 0.7276 |
| 0.0473 | 9.6557 | 2636 | 0.5295 | 0.3607 | 0.5295 | 0.7276 |
| 0.0473 | 9.6630 | 2638 | 0.5301 | 0.3607 | 0.5301 | 0.7281 |
| 0.0473 | 9.6703 | 2640 | 0.5303 | 0.3607 | 0.5303 | 0.7282 |
| 0.0473 | 9.6777 | 2642 | 0.5308 | 0.3607 | 0.5308 | 0.7286 |
| 0.0473 | 9.6850 | 2644 | 0.5316 | 0.3607 | 0.5316 | 0.7291 |
| 0.0473 | 9.6923 | 2646 | 0.5322 | 0.3607 | 0.5322 | 0.7295 |
| 0.0473 | 9.6996 | 2648 | 0.5325 | 0.3607 | 0.5325 | 0.7297 |
| 0.0473 | 9.7070 | 2650 | 0.5333 | 0.3607 | 0.5333 | 0.7303 |
| 0.0473 | 9.7143 | 2652 | 0.5337 | 0.3607 | 0.5337 | 0.7306 |
| 0.0473 | 9.7216 | 2654 | 0.5338 | 0.3607 | 0.5338 | 0.7306 |
| 0.0473 | 9.7289 | 2656 | 0.5336 | 0.3607 | 0.5336 | 0.7305 |
| 0.0473 | 9.7363 | 2658 | 0.5331 | 0.3607 | 0.5331 | 0.7301 |
| 0.0473 | 9.7436 | 2660 | 0.5324 | 0.3607 | 0.5324 | 0.7296 |
| 0.0473 | 9.7509 | 2662 | 0.5320 | 0.3607 | 0.5320 | 0.7294 |
| 0.0473 | 9.7582 | 2664 | 0.5319 | 0.3607 | 0.5319 | 0.7293 |
| 0.0473 | 9.7656 | 2666 | 0.5318 | 0.5 | 0.5318 | 0.7293 |
| 0.0473 | 9.7729 | 2668 | 0.5320 | 0.5 | 0.5320 | 0.7294 |
| 0.0473 | 9.7802 | 2670 | 0.5320 | 0.5 | 0.5320 | 0.7294 |
| 0.0473 | 9.7875 | 2672 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.7949 | 2674 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.8022 | 2676 | 0.5324 | 0.5 | 0.5324 | 0.7297 |
| 0.0473 | 9.8095 | 2678 | 0.5327 | 0.5 | 0.5327 | 0.7298 |
| 0.0473 | 9.8168 | 2680 | 0.5327 | 0.5 | 0.5327 | 0.7299 |
| 0.0473 | 9.8242 | 2682 | 0.5324 | 0.5 | 0.5324 | 0.7296 |
| 0.0473 | 9.8315 | 2684 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.8388 | 2686 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.8462 | 2688 | 0.5320 | 0.5 | 0.5320 | 0.7294 |
| 0.0473 | 9.8535 | 2690 | 0.5320 | 0.5 | 0.5320 | 0.7294 |
| 0.0473 | 9.8608 | 2692 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.8681 | 2694 | 0.5322 | 0.5 | 0.5322 | 0.7295 |
| 0.0473 | 9.8755 | 2696 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.8828 | 2698 | 0.5321 | 0.5 | 0.5321 | 0.7295 |
| 0.0473 | 9.8901 | 2700 | 0.5322 | 0.5 | 0.5322 | 0.7295 |
| 0.0473 | 9.8974 | 2702 | 0.5322 | 0.5 | 0.5322 | 0.7295 |
| 0.0473 | 9.9048 | 2704 | 0.5322 | 0.5 | 0.5322 | 0.7295 |
| 0.0473 | 9.9121 | 2706 | 0.5322 | 0.5 | 0.5322 | 0.7295 |
| 0.0473 | 9.9194 | 2708 | 0.5323 | 0.5 | 0.5323 | 0.7296 |
| 0.0473 | 9.9267 | 2710 | 0.5325 | 0.5 | 0.5325 | 0.7297 |
| 0.0473 | 9.9341 | 2712 | 0.5326 | 0.5 | 0.5326 | 0.7298 |
| 0.0473 | 9.9414 | 2714 | 0.5328 | 0.5 | 0.5328 | 0.7299 |
| 0.0473 | 9.9487 | 2716 | 0.5329 | 0.5 | 0.5329 | 0.7300 |
| 0.0473 | 9.9560 | 2718 | 0.5330 | 0.5 | 0.5330 | 0.7301 |
| 0.0473 | 9.9634 | 2720 | 0.5330 | 0.5 | 0.5330 | 0.7301 |
| 0.0473 | 9.9707 | 2722 | 0.5330 | 0.5 | 0.5330 | 0.7301 |
| 0.0473 | 9.9780 | 2724 | 0.5330 | 0.5 | 0.5330 | 0.7301 |
| 0.0473 | 9.9853 | 2726 | 0.5330 | 0.5 | 0.5330 | 0.7300 |
| 0.0473 | 9.9927 | 2728 | 0.5329 | 0.5 | 0.5329 | 0.7300 |
| 0.0473 | 10.0 | 2730 | 0.5329 | 0.5 | 0.5329 | 0.7300 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
eric10y/finetuning-sentiment-model-3000-samples | eric10y | 2024-11-25T07:30:21Z | 105 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T07:21:09Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: finetuning-sentiment-model-3000-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuning-sentiment-model-3000-samples
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
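A minimal inference sketch, assuming the standard `transformers` pipeline interface; the label mapping is an assumption, since the card does not document it:

```python
from transformers import pipeline

# Load the fine-tuned sentiment classifier from the Hub.
# NOTE: the repo id is real; the returned label names (e.g. LABEL_0/LABEL_1)
# are an assumption, since the card does not document the label mapping.
classifier = pipeline(
    "text-classification",
    model="eric10y/finetuning-sentiment-model-3000-samples",
)

print(classifier("This movie was surprisingly good!"))
# -> e.g. [{'label': 'LABEL_1', 'score': 0.98}]  (label names unverified)
```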
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
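
For reference, these hyperparameters map roughly to the following `TrainingArguments`; the output directory and any unlisted settings are assumptions:

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# The output_dir and any setting not listed in the card are assumptions.
training_args = TrainingArguments(
    output_dir="finetuning-sentiment-model-3000-samples",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
)
```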
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
vitumafeni/tiny-crypto-sentiment-analysis | vitumafeni | 2024-11-25T07:28:20Z | 130 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T07:26:23Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
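
A minimal sketch, assuming the standard causal-LM interface; the prompt format is an assumption, since the card does not document how the model expects its input:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# The repo id is real; the prompt format below is an assumption,
# since the card does not document the training format.
model_id = "vitumafeni/tiny-crypto-sentiment-analysis"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Tweet: Bitcoin just hit a new all-time high!\nSentiment:"  # assumed format
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```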
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
second-state/FLUX.1-Canny-dev-GGUF | second-state | 2024-11-25T07:25:31Z | 445 | 6 | null | [
"gguf",
"text-to-image",
"image-generation",
"flux",
"en",
"base_model:black-forest-labs/FLUX.1-Canny-dev",
"base_model:quantized:black-forest-labs/FLUX.1-Canny-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-25T02:50:33Z | ---
base_model: black-forest-labs/FLUX.1-Canny-dev
license: other
license_name: flux-1-dev-non-commercial-license
license_link: LICENSE.md
model_creator: black-forest-labs
model_name: FLUX.1-Canny-dev
quantized_by: Second State Inc.
language:
- en
tags:
- text-to-image
- image-generation
- flux
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://github.com/LlamaEdge/LlamaEdge/raw/dev/assets/logo.svg" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
> [!CAUTION]
> The T5 and CLIP models are still not provided with the original model
# FLUX.1-Canny-dev-GGUF
## Original Model
[black-forest-labs/FLUX.1-Canny-dev](https://huggingface.co/black-forest-labs/FLUX.1-Canny-dev)
## Run with LlamaEdge-StableDiffusion
- Version: coming soon
<!-- - Version: [0.1.4](https://github.com/LlamaEdge/sd-api-server/releases/tag/0.1.4)
- Run as LlamaEdge service
```bash
wasmedge --dir .:. sd-api-server.wasm \
--model-name flux1-canny-dev \
--diffusion-model flux1-canny-dev-Q4_0.gguf \
--vae ae.safetensors \
--clip-l clip_l.safetensors \
--t5xxl t5xxl-Q8_0.gguf
```
- Run with LoRA
Assume that the LoRA model is located in the `lora-models` directory
```bash
wasmedge --dir .:. \
--dir lora-models:lora-models \
sd-api-server.wasm \
--model-name flux1-canny-dev \
--diffusion-model flux1-canny-dev-Q4_0.gguf \
--vae ae.safetensors \
--clip-l clip_l.safetensors \
--t5xxl t5xxl-Q8_0.gguf \
--lora-model-dir lora-models
```
*For details, see https://github.com/LlamaEdge/sd-api-server/blob/main/examples/flux_with_lora.md* -->
## Quantized GGUF Models
| Name | Quant method | Bits | Size | Use case |
| ---- | ---- | ---- | ---- | ----- |
| [ae.safetensors](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/ae.safetensors) | f32 | 32 | 335 MB | |
| [flux1-canny-dev-Q2_K.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q2_K.gguf) | Q2_K | 2 | 4.15 GB | |
| [flux1-canny-dev-Q3_K.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q3_K.gguf) | Q3_K | 3 | 5.35 GB | |
| [flux1-canny-dev-Q4_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q4_0.gguf) | Q4_0 | 4 | 6.93 GB | |
| [flux1-canny-dev-Q4_1.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q4_1.gguf) | Q4_1 | 4 | 7.67 GB | |
| [flux1-canny-dev-Q4_K.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q4_K.gguf) | Q4_K | 4 | 6.93 GB | |
| [flux1-canny-dev-Q5_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q5_0.gguf) | Q5_0 | 5 | 8.40 GB | |
| [flux1-canny-dev-Q5_1.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q5_1.gguf) | Q5_1 | 5 | 9.14 GB | |
| [flux1-canny-dev-Q8_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev-Q8_0.gguf) | Q8_0 | 8 | 12.6 GB | |
| [flux1-canny-dev.safetensors](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/flux1-canny-dev.safetensors) | f16 | 16 | 23.8 GB | |
<!-- | [clip_l-Q8_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/clip_l-Q8_0.gguf) | Q8_0 | 8 | 131 MB | |
| [clip_l.safetensors](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/clip_l.safetensors) | f16 | 16 | 246 MB | |
| [t5xxl-Q2_K.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q2_K.gguf) | Q2_K | 2 | 1.61 GB | |
| [t5xxl-Q3_K.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q3_K.gguf) | Q3_K | 3 | 2.10 GB | |
| [t5xxl-Q4_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q4_0.gguf) | Q4_0 | 4 | 2.75 GB | |
| [t5xxl-Q4_1.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q4_1.gguf) | Q4_1 | 4 | 3.06 GB | |
| [t5xxl-Q4_K.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q4_K.gguf) | Q4_K | 4 | 2.75 GB | |
| [t5xxl-Q5_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q5_0.gguf) | Q5_0 | 5 | 3.36 GB | |
| [t5xxl-Q5_1.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q5_1.gguf) | Q5_1 | 5 | 3.67 GB | |
| [t5xxl-Q8_0.gguf](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl-Q8_0.gguf) | Q8_0 | 8 | 5.20 GB | |
| [t5xxl_fp16.safetensors](https://huggingface.co/second-state/FLUX.1-Canny-dev-GGUF/blob/main/t5xxl_fp16.safetensors) | f16 | 16 | 9.79 GB | | -->
**Quantized with stable-diffusion.cpp `master-c3eeb669`.** |
ankit5319/chronos-t5-small-fine-tuned | ankit5319 | 2024-11-25T07:22:04Z | 174 | 0 | transformers | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-25T07:21:46Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
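
A minimal loading sketch, assuming the checkpoint exposes the standard T5 seq2seq interface; Chronos-style checkpoints are usually driven through a dedicated time-series forecasting pipeline rather than raw text generation, so treat this only as a starting point:

```python
from transformers import AutoModelForSeq2SeqLM

# Generic loading only: Chronos models encode time series as token
# sequences, so plain text generation is not meaningful here. This
# simply verifies that the checkpoint loads as a standard T5 model.
model_id = "ankit5319/chronos-t5-small-fine-tuned"
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
print(model.config.model_type)  # -> "t5"
```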
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Shinyaaa/Travel-20-v1-on-RPC-10-v1 | Shinyaaa | 2024-11-25T07:20:42Z | 102 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-25T07:20:13Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
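Pending an official snippet from the authors, here is a minimal sketch based only on this card's `bert`/`feature-extraction` tags; the example sentence and the CLS-token pooling are assumptions, not documented behavior.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Shinyaaa/Travel-20-v1-on-RPC-10-v1")
model = AutoModel.from_pretrained("Shinyaaa/Travel-20-v1-on-RPC-10-v1")

inputs = tokenizer("A weekend trip to Kyoto", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One pooling choice among several; the card does not specify which to use.
embedding = outputs.last_hidden_state[:, 0]  # [CLS] vector, shape (1, hidden_size)
```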
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mlx-community/Llama-3.1-Tulu-3-8B-4bit | mlx-community | 2024-11-25T07:19:51Z | 81 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"mlx",
"conversational",
"en",
"dataset:allenai/RLVR-GSM-MATH-IF-Mixed-Constraints",
"base_model:allenai/Llama-3.1-Tulu-3-8B",
"base_model:quantized:allenai/Llama-3.1-Tulu-3-8B",
"license:llama3.1",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"region:us"
] | text-generation | 2024-11-25T07:08:02Z | ---
license: llama3.1
language:
- en
pipeline_tag: text-generation
datasets:
- allenai/RLVR-GSM-MATH-IF-Mixed-Constraints
base_model: allenai/Llama-3.1-Tulu-3-8B
library_name: transformers
tags:
- mlx
---
# mlx-community/Llama-3.1-Tulu-3-8B-4bit
The model [mlx-community/Llama-3.1-Tulu-3-8B-4bit](https://huggingface.co/mlx-community/Llama-3.1-Tulu-3-8B-4bit) was
converted to MLX format from [allenai/Llama-3.1-Tulu-3-8B](https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B)
using mlx-lm version **0.20.0**.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Download (if needed) and load the 4-bit quantized weights and tokenizer.
model, tokenizer = load("mlx-community/Llama-3.1-Tulu-3-8B-4bit")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is defined, so the
# input matches the format the instruction-tuned checkpoint expects.
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k35_task2_organization_fold0 | MayBashendy | 2024-11-25T07:19:11Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T07:06:32Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k35_task2_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k35_task2_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6437
- Qwk: 0.3771
- Mse: 0.6437
- Rmse: 0.8023
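(For reference, RMSE is the square root of MSE, √0.6437 ≈ 0.8023; Qwk is presumably the quadratic weighted kappa agreement score, though the card does not spell this out.)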
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
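The settings above map onto Hugging Face `TrainingArguments` roughly as sketched here; the `output_dir` name is an assumption, and any argument not listed keeps its library default.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="arabert-task2-organization-fold0",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```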
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0082 | 2 | 3.8449 | 0.0 | 3.8449 | 1.9608 |
| No log | 0.0165 | 4 | 2.5465 | 0.0172 | 2.5465 | 1.5958 |
| No log | 0.0247 | 6 | 1.3961 | 0.0491 | 1.3961 | 1.1816 |
| No log | 0.0329 | 8 | 1.3041 | 0.0 | 1.3041 | 1.1420 |
| No log | 0.0412 | 10 | 1.1781 | 0.0 | 1.1781 | 1.0854 |
| No log | 0.0494 | 12 | 1.2873 | 0.0 | 1.2873 | 1.1346 |
| No log | 0.0576 | 14 | 1.6711 | -0.0302 | 1.6711 | 1.2927 |
| No log | 0.0658 | 16 | 1.5391 | 0.0 | 1.5391 | 1.2406 |
| No log | 0.0741 | 18 | 1.4694 | 0.0491 | 1.4694 | 1.2122 |
| No log | 0.0823 | 20 | 1.5935 | 0.0491 | 1.5935 | 1.2623 |
| No log | 0.0905 | 22 | 1.7652 | -0.0096 | 1.7652 | 1.3286 |
| No log | 0.0988 | 24 | 1.9005 | 0.0525 | 1.9004 | 1.3786 |
| No log | 0.1070 | 26 | 1.7812 | -0.0328 | 1.7812 | 1.3346 |
| No log | 0.1152 | 28 | 1.6600 | 0.0173 | 1.6600 | 1.2884 |
| No log | 0.1235 | 30 | 1.5788 | 0.0491 | 1.5788 | 1.2565 |
| No log | 0.1317 | 32 | 1.4703 | 0.0 | 1.4703 | 1.2126 |
| No log | 0.1399 | 34 | 1.2667 | 0.0 | 1.2667 | 1.1255 |
| No log | 0.1481 | 36 | 1.0444 | 0.0 | 1.0444 | 1.0220 |
| No log | 0.1564 | 38 | 0.8703 | 0.0 | 0.8703 | 0.9329 |
| No log | 0.1646 | 40 | 0.7873 | 0.0335 | 0.7873 | 0.8873 |
| No log | 0.1728 | 42 | 0.8284 | 0.0335 | 0.8284 | 0.9102 |
| No log | 0.1811 | 44 | 0.9977 | 0.0 | 0.9977 | 0.9988 |
| No log | 0.1893 | 46 | 1.3060 | 0.0 | 1.3060 | 1.1428 |
| No log | 0.1975 | 48 | 1.4340 | 0.0 | 1.4340 | 1.1975 |
| No log | 0.2058 | 50 | 1.3664 | 0.0 | 1.3664 | 1.1689 |
| No log | 0.2140 | 52 | 1.2462 | 0.0 | 1.2462 | 1.1163 |
| No log | 0.2222 | 54 | 1.1136 | 0.0 | 1.1136 | 1.0553 |
| No log | 0.2305 | 56 | 1.2068 | 0.0 | 1.2068 | 1.0986 |
| No log | 0.2387 | 58 | 1.1604 | 0.0 | 1.1604 | 1.0772 |
| No log | 0.2469 | 60 | 0.9721 | 0.0 | 0.9721 | 0.9860 |
| No log | 0.2551 | 62 | 0.8585 | 0.0 | 0.8585 | 0.9265 |
| No log | 0.2634 | 64 | 0.8457 | 0.0 | 0.8457 | 0.9196 |
| No log | 0.2716 | 66 | 0.8333 | 0.0 | 0.8333 | 0.9129 |
| No log | 0.2798 | 68 | 0.9003 | 0.0 | 0.9003 | 0.9489 |
| No log | 0.2881 | 70 | 1.0546 | 0.0 | 1.0546 | 1.0269 |
| No log | 0.2963 | 72 | 1.1747 | 0.0 | 1.1747 | 1.0838 |
| No log | 0.3045 | 74 | 1.2196 | 0.0 | 1.2196 | 1.1044 |
| No log | 0.3128 | 76 | 1.1503 | 0.0 | 1.1503 | 1.0725 |
| No log | 0.3210 | 78 | 1.0305 | 0.0 | 1.0305 | 1.0151 |
| No log | 0.3292 | 80 | 0.9394 | 0.0 | 0.9394 | 0.9692 |
| No log | 0.3374 | 82 | 0.9147 | 0.0 | 0.9147 | 0.9564 |
| No log | 0.3457 | 84 | 0.9039 | 0.0 | 0.9039 | 0.9507 |
| No log | 0.3539 | 86 | 0.9438 | 0.0 | 0.9438 | 0.9715 |
| No log | 0.3621 | 88 | 1.1728 | 0.0 | 1.1728 | 1.0830 |
| No log | 0.3704 | 90 | 1.5843 | 0.125 | 1.5843 | 1.2587 |
| No log | 0.3786 | 92 | 1.5654 | 0.125 | 1.5654 | 1.2512 |
| No log | 0.3868 | 94 | 1.2734 | 0.0 | 1.2734 | 1.1284 |
| No log | 0.3951 | 96 | 0.9856 | 0.0 | 0.9856 | 0.9928 |
| No log | 0.4033 | 98 | 0.7949 | 0.1564 | 0.7949 | 0.8916 |
| No log | 0.4115 | 100 | 0.7168 | 0.1192 | 0.7168 | 0.8466 |
| No log | 0.4198 | 102 | 0.7223 | 0.2150 | 0.7223 | 0.8499 |
| No log | 0.4280 | 104 | 0.8128 | 0.1747 | 0.8128 | 0.9015 |
| No log | 0.4362 | 106 | 0.7921 | 0.1747 | 0.7921 | 0.8900 |
| No log | 0.4444 | 108 | 0.7928 | 0.3046 | 0.7928 | 0.8904 |
| No log | 0.4527 | 110 | 0.7358 | 0.3046 | 0.7358 | 0.8578 |
| No log | 0.4609 | 112 | 0.6887 | 0.3695 | 0.6887 | 0.8299 |
| No log | 0.4691 | 114 | 0.7364 | 0.1747 | 0.7364 | 0.8581 |
| No log | 0.4774 | 116 | 0.7567 | 0.3197 | 0.7567 | 0.8699 |
| No log | 0.4856 | 118 | 0.8435 | 0.1564 | 0.8435 | 0.9184 |
| No log | 0.4938 | 120 | 0.9194 | 0.0 | 0.9194 | 0.9589 |
| No log | 0.5021 | 122 | 0.9321 | 0.0 | 0.9321 | 0.9654 |
| No log | 0.5103 | 124 | 0.8488 | 0.2821 | 0.8488 | 0.9213 |
| No log | 0.5185 | 126 | 0.7542 | 0.2150 | 0.7542 | 0.8684 |
| No log | 0.5267 | 128 | 0.7265 | 0.0339 | 0.7265 | 0.8524 |
| No log | 0.5350 | 130 | 0.7171 | 0.1141 | 0.7171 | 0.8468 |
| No log | 0.5432 | 132 | 0.7488 | 0.0503 | 0.7488 | 0.8654 |
| No log | 0.5514 | 134 | 0.7621 | 0.1141 | 0.7621 | 0.8730 |
| No log | 0.5597 | 136 | 0.8116 | -0.0426 | 0.8116 | 0.9009 |
| No log | 0.5679 | 138 | 0.8358 | -0.2466 | 0.8358 | 0.9142 |
| No log | 0.5761 | 140 | 0.8469 | -0.1099 | 0.8469 | 0.9203 |
| No log | 0.5844 | 142 | 0.8581 | -0.0769 | 0.8581 | 0.9263 |
| No log | 0.5926 | 144 | 0.8125 | -0.1004 | 0.8125 | 0.9014 |
| No log | 0.6008 | 146 | 0.7305 | -0.0048 | 0.7305 | 0.8547 |
| No log | 0.6091 | 148 | 0.6800 | 0.1340 | 0.6800 | 0.8246 |
| No log | 0.6173 | 150 | 0.6528 | 0.1558 | 0.6528 | 0.8079 |
| No log | 0.6255 | 152 | 0.6511 | 0.0339 | 0.6511 | 0.8069 |
| No log | 0.6337 | 154 | 0.6960 | -0.0048 | 0.6960 | 0.8342 |
| No log | 0.6420 | 156 | 0.6974 | 0.0735 | 0.6974 | 0.8351 |
| No log | 0.6502 | 158 | 0.7287 | 0.0916 | 0.7287 | 0.8536 |
| No log | 0.6584 | 160 | 0.7260 | 0.0916 | 0.7260 | 0.8521 |
| No log | 0.6667 | 162 | 0.7071 | 0.0916 | 0.7071 | 0.8409 |
| No log | 0.6749 | 164 | 0.6744 | 0.0916 | 0.6744 | 0.8212 |
| No log | 0.6831 | 166 | 0.6773 | -0.0678 | 0.6773 | 0.8230 |
| No log | 0.6914 | 168 | 0.7038 | 0.0567 | 0.7038 | 0.8389 |
| No log | 0.6996 | 170 | 0.7098 | -0.1053 | 0.7098 | 0.8425 |
| No log | 0.7078 | 172 | 0.7049 | 0.0099 | 0.7049 | 0.8396 |
| No log | 0.7160 | 174 | 0.7244 | 0.0916 | 0.7244 | 0.8511 |
| No log | 0.7243 | 176 | 0.6981 | 0.2759 | 0.6981 | 0.8355 |
| No log | 0.7325 | 178 | 0.6879 | 0.1962 | 0.6879 | 0.8294 |
| No log | 0.7407 | 180 | 0.6943 | 0.1560 | 0.6943 | 0.8333 |
| No log | 0.7490 | 182 | 0.6909 | 0.2373 | 0.6909 | 0.8312 |
| No log | 0.7572 | 184 | 0.6780 | 0.0916 | 0.6780 | 0.8234 |
| No log | 0.7654 | 186 | 0.6530 | 0.0503 | 0.6530 | 0.8081 |
| No log | 0.7737 | 188 | 0.6458 | 0.1560 | 0.6458 | 0.8036 |
| No log | 0.7819 | 190 | 0.7552 | 0.3046 | 0.7552 | 0.8690 |
| No log | 0.7901 | 192 | 0.8491 | 0.1026 | 0.8491 | 0.9215 |
| No log | 0.7984 | 194 | 0.7692 | 0.2150 | 0.7692 | 0.8770 |
| No log | 0.8066 | 196 | 0.7988 | 0.0503 | 0.7988 | 0.8937 |
| No log | 0.8148 | 198 | 0.9089 | 0.2186 | 0.9089 | 0.9534 |
| No log | 0.8230 | 200 | 0.9674 | 0.0909 | 0.9674 | 0.9836 |
| No log | 0.8313 | 202 | 0.8486 | 0.1600 | 0.8486 | 0.9212 |
| No log | 0.8395 | 204 | 0.7004 | 0.0735 | 0.7004 | 0.8369 |
| No log | 0.8477 | 206 | 0.7193 | 0.1755 | 0.7193 | 0.8481 |
| No log | 0.8560 | 208 | 0.7631 | 0.2889 | 0.7631 | 0.8735 |
| No log | 0.8642 | 210 | 0.7775 | 0.1563 | 0.7775 | 0.8818 |
| No log | 0.8724 | 212 | 0.6998 | 0.2553 | 0.6998 | 0.8366 |
| No log | 0.8807 | 214 | 0.6873 | 0.2373 | 0.6873 | 0.8290 |
| No log | 0.8889 | 216 | 0.7307 | 0.0916 | 0.7307 | 0.8548 |
| No log | 0.8971 | 218 | 0.7571 | 0.0503 | 0.7571 | 0.8701 |
| No log | 0.9053 | 220 | 0.7653 | 0.1356 | 0.7653 | 0.8748 |
| No log | 0.9136 | 222 | 0.8030 | 0.1356 | 0.8030 | 0.8961 |
| No log | 0.9218 | 224 | 0.8217 | 0.1765 | 0.8217 | 0.9065 |
| No log | 0.9300 | 226 | 0.8519 | 0.0099 | 0.8519 | 0.9230 |
| No log | 0.9383 | 228 | 0.8988 | 0.1793 | 0.8988 | 0.9480 |
| No log | 0.9465 | 230 | 0.8815 | 0.1793 | 0.8815 | 0.9389 |
| No log | 0.9547 | 232 | 0.9057 | 0.1793 | 0.9057 | 0.9517 |
| No log | 0.9630 | 234 | 0.9843 | 0.0557 | 0.9843 | 0.9921 |
| No log | 0.9712 | 236 | 1.0257 | 0.0833 | 1.0257 | 1.0128 |
| No log | 0.9794 | 238 | 0.9546 | 0.2186 | 0.9546 | 0.9770 |
| No log | 0.9877 | 240 | 0.8870 | 0.0503 | 0.8870 | 0.9418 |
| No log | 0.9959 | 242 | 0.8678 | 0.0916 | 0.8678 | 0.9315 |
| No log | 1.0041 | 244 | 0.8923 | 0.0916 | 0.8923 | 0.9446 |
| No log | 1.0123 | 246 | 0.8906 | 0.0916 | 0.8906 | 0.9437 |
| No log | 1.0206 | 248 | 0.8417 | 0.0916 | 0.8417 | 0.9175 |
| No log | 1.0288 | 250 | 0.8226 | 0.0099 | 0.8226 | 0.9070 |
| No log | 1.0370 | 252 | 0.8301 | 0.0099 | 0.8301 | 0.9111 |
| No log | 1.0453 | 254 | 0.8631 | 0.0916 | 0.8631 | 0.9290 |
| No log | 1.0535 | 256 | 0.9591 | 0.0258 | 0.9591 | 0.9793 |
| No log | 1.0617 | 258 | 1.0055 | 0.0 | 1.0055 | 1.0028 |
| No log | 1.0700 | 260 | 0.9585 | 0.0258 | 0.9585 | 0.9790 |
| No log | 1.0782 | 262 | 0.9307 | 0.0916 | 0.9307 | 0.9647 |
| No log | 1.0864 | 264 | 0.9216 | 0.2186 | 0.9216 | 0.9600 |
| No log | 1.0947 | 266 | 0.8643 | 0.1340 | 0.8643 | 0.9297 |
| No log | 1.1029 | 268 | 0.7913 | 0.0916 | 0.7913 | 0.8895 |
| No log | 1.1111 | 270 | 0.7503 | 0.0916 | 0.7503 | 0.8662 |
| No log | 1.1193 | 272 | 0.7356 | 0.0916 | 0.7356 | 0.8577 |
| No log | 1.1276 | 274 | 0.7775 | -0.0294 | 0.7775 | 0.8818 |
| No log | 1.1358 | 276 | 0.8031 | 0.0099 | 0.8031 | 0.8961 |
| No log | 1.1440 | 278 | 0.8682 | 0.1793 | 0.8682 | 0.9318 |
| No log | 1.1523 | 280 | 0.9049 | 0.1793 | 0.9049 | 0.9513 |
| No log | 1.1605 | 282 | 0.9445 | 0.25 | 0.9445 | 0.9718 |
| No log | 1.1687 | 284 | 1.0603 | 0.1463 | 1.0603 | 1.0297 |
| No log | 1.1770 | 286 | 1.0686 | 0.1738 | 1.0686 | 1.0337 |
| No log | 1.1852 | 288 | 0.9538 | 0.1072 | 0.9538 | 0.9766 |
| No log | 1.1934 | 290 | 0.8921 | 0.1793 | 0.8921 | 0.9445 |
| No log | 1.2016 | 292 | 0.8276 | 0.1793 | 0.8276 | 0.9097 |
| No log | 1.2099 | 294 | 0.7594 | 0.2533 | 0.7594 | 0.8715 |
| No log | 1.2181 | 296 | 0.7813 | 0.0785 | 0.7813 | 0.8839 |
| No log | 1.2263 | 298 | 0.8477 | 0.1370 | 0.8477 | 0.9207 |
| No log | 1.2346 | 300 | 0.8624 | 0.1773 | 0.8624 | 0.9286 |
| No log | 1.2428 | 302 | 0.8222 | 0.1168 | 0.8222 | 0.9067 |
| No log | 1.2510 | 304 | 0.7986 | 0.3318 | 0.7986 | 0.8937 |
| No log | 1.2593 | 306 | 0.9415 | 0.0557 | 0.9415 | 0.9703 |
| No log | 1.2675 | 308 | 1.1720 | 0.1649 | 1.1720 | 1.0826 |
| No log | 1.2757 | 310 | 1.2318 | 0.1649 | 1.2318 | 1.1099 |
| No log | 1.2840 | 312 | 1.2591 | 0.1276 | 1.2591 | 1.1221 |
| No log | 1.2922 | 314 | 1.1760 | 0.0440 | 1.1760 | 1.0844 |
| No log | 1.3004 | 316 | 1.0025 | 0.0 | 1.0025 | 1.0013 |
| No log | 1.3086 | 318 | 0.8959 | 0.1793 | 0.8959 | 0.9465 |
| No log | 1.3169 | 320 | 0.9102 | 0.0099 | 0.9102 | 0.9540 |
| No log | 1.3251 | 322 | 0.8541 | 0.0503 | 0.8541 | 0.9242 |
| No log | 1.3333 | 324 | 0.7833 | 0.0503 | 0.7833 | 0.8850 |
| No log | 1.3416 | 326 | 0.7566 | 0.0916 | 0.7566 | 0.8698 |
| No log | 1.3498 | 328 | 0.7682 | 0.0258 | 0.7682 | 0.8765 |
| No log | 1.3580 | 330 | 0.7707 | 0.0099 | 0.7707 | 0.8779 |
| No log | 1.3663 | 332 | 0.7559 | -0.0678 | 0.7559 | 0.8694 |
| No log | 1.3745 | 334 | 0.7305 | -0.0048 | 0.7305 | 0.8547 |
| No log | 1.3827 | 336 | 0.6950 | -0.0294 | 0.6950 | 0.8336 |
| No log | 1.3909 | 338 | 0.7029 | 0.0503 | 0.7029 | 0.8384 |
| No log | 1.3992 | 340 | 0.7535 | 0.2186 | 0.7535 | 0.8680 |
| No log | 1.4074 | 342 | 0.7803 | 0.2186 | 0.7803 | 0.8834 |
| No log | 1.4156 | 344 | 0.7621 | 0.1793 | 0.7621 | 0.8730 |
| No log | 1.4239 | 346 | 0.7755 | 0.1034 | 0.7755 | 0.8806 |
| No log | 1.4321 | 348 | 0.7882 | 0.2696 | 0.7882 | 0.8878 |
| No log | 1.4403 | 350 | 0.7956 | 0.2533 | 0.7956 | 0.8919 |
| No log | 1.4486 | 352 | 0.7774 | 0.2154 | 0.7774 | 0.8817 |
| No log | 1.4568 | 354 | 0.7527 | 0.2533 | 0.7527 | 0.8676 |
| No log | 1.4650 | 356 | 0.7524 | 0.1409 | 0.7524 | 0.8674 |
| No log | 1.4733 | 358 | 0.7902 | 0.2186 | 0.7902 | 0.8889 |
| No log | 1.4815 | 360 | 0.8423 | 0.1398 | 0.8423 | 0.9177 |
| No log | 1.4897 | 362 | 0.8313 | 0.1398 | 0.8313 | 0.9118 |
| No log | 1.4979 | 364 | 0.8102 | 0.2186 | 0.8102 | 0.9001 |
| No log | 1.5062 | 366 | 0.8309 | 0.1000 | 0.8309 | 0.9116 |
| No log | 1.5144 | 368 | 0.8751 | 0.1398 | 0.8751 | 0.9355 |
| No log | 1.5226 | 370 | 0.8400 | 0.2186 | 0.8400 | 0.9165 |
| No log | 1.5309 | 372 | 0.7666 | 0.1034 | 0.7666 | 0.8755 |
| No log | 1.5391 | 374 | 0.7353 | 0.2533 | 0.7353 | 0.8575 |
| No log | 1.5473 | 376 | 0.7245 | 0.2533 | 0.7245 | 0.8512 |
| No log | 1.5556 | 378 | 0.7324 | 0.1034 | 0.7324 | 0.8558 |
| No log | 1.5638 | 380 | 0.7641 | 0.1409 | 0.7641 | 0.8741 |
| No log | 1.5720 | 382 | 0.8041 | 0.1000 | 0.8041 | 0.8967 |
| No log | 1.5802 | 384 | 0.8160 | 0.1398 | 0.8160 | 0.9033 |
| No log | 1.5885 | 386 | 0.8120 | 0.1398 | 0.8120 | 0.9011 |
| No log | 1.5967 | 388 | 0.8348 | 0.1398 | 0.8348 | 0.9137 |
| No log | 1.6049 | 390 | 0.8076 | 0.1398 | 0.8076 | 0.8987 |
| No log | 1.6132 | 392 | 0.7741 | 0.1398 | 0.7741 | 0.8798 |
| No log | 1.6214 | 394 | 0.7262 | -0.0418 | 0.7262 | 0.8522 |
| No log | 1.6296 | 396 | 0.6940 | -0.0153 | 0.6940 | 0.8330 |
| No log | 1.6379 | 398 | 0.6703 | 0.0099 | 0.6703 | 0.8187 |
| No log | 1.6461 | 400 | 0.6676 | 0.0099 | 0.6676 | 0.8170 |
| No log | 1.6543 | 402 | 0.6581 | 0.0099 | 0.6581 | 0.8112 |
| No log | 1.6626 | 404 | 0.6687 | -0.0294 | 0.6687 | 0.8177 |
| No log | 1.6708 | 406 | 0.7005 | -0.0294 | 0.7005 | 0.8369 |
| No log | 1.6790 | 408 | 0.7436 | 0.0828 | 0.7436 | 0.8623 |
| No log | 1.6872 | 410 | 0.7897 | 0.1000 | 0.7897 | 0.8886 |
| No log | 1.6955 | 412 | 0.8040 | 0.0612 | 0.8040 | 0.8967 |
| No log | 1.7037 | 414 | 0.8163 | 0.0612 | 0.8163 | 0.9035 |
| No log | 1.7119 | 416 | 0.8136 | 0.0233 | 0.8136 | 0.9020 |
| No log | 1.7202 | 418 | 0.8150 | 0.0233 | 0.8150 | 0.9028 |
| No log | 1.7284 | 420 | 0.8136 | 0.0233 | 0.8136 | 0.9020 |
| No log | 1.7366 | 422 | 0.7821 | 0.0233 | 0.7821 | 0.8844 |
| No log | 1.7449 | 424 | 0.7412 | -0.0553 | 0.7412 | 0.8609 |
| No log | 1.7531 | 426 | 0.7101 | -0.0294 | 0.7101 | 0.8427 |
| No log | 1.7613 | 428 | 0.6934 | -0.0294 | 0.6934 | 0.8327 |
| No log | 1.7695 | 430 | 0.7020 | -0.0294 | 0.7020 | 0.8378 |
| No log | 1.7778 | 432 | 0.7327 | 0.0828 | 0.7327 | 0.8560 |
| No log | 1.7860 | 434 | 0.7482 | 0.0828 | 0.7482 | 0.8650 |
| No log | 1.7942 | 436 | 0.7700 | 0.1209 | 0.7700 | 0.8775 |
| No log | 1.8025 | 438 | 0.7710 | 0.0828 | 0.7710 | 0.8780 |
| No log | 1.8107 | 440 | 0.7464 | -0.0553 | 0.7464 | 0.8639 |
| No log | 1.8189 | 442 | 0.7121 | -0.0943 | 0.7121 | 0.8438 |
| No log | 1.8272 | 444 | 0.6957 | -0.0294 | 0.6957 | 0.8341 |
| No log | 1.8354 | 446 | 0.6885 | 0.0339 | 0.6885 | 0.8298 |
| No log | 1.8436 | 448 | 0.7024 | -0.0153 | 0.7024 | 0.8381 |
| No log | 1.8519 | 450 | 0.7452 | 0.1600 | 0.7452 | 0.8632 |
| No log | 1.8601 | 452 | 0.7658 | 0.1398 | 0.7658 | 0.8751 |
| No log | 1.8683 | 454 | 0.7684 | 0.1398 | 0.7684 | 0.8766 |
| No log | 1.8765 | 456 | 0.7694 | 0.1398 | 0.7694 | 0.8772 |
| No log | 1.8848 | 458 | 0.7904 | 0.1398 | 0.7904 | 0.8891 |
| No log | 1.8930 | 460 | 0.7767 | 0.1600 | 0.7767 | 0.8813 |
| No log | 1.9012 | 462 | 0.7151 | 0.2921 | 0.7151 | 0.8457 |
| No log | 1.9095 | 464 | 0.7042 | 0.1783 | 0.7042 | 0.8391 |
| No log | 1.9177 | 466 | 0.7133 | 0.1783 | 0.7133 | 0.8446 |
| No log | 1.9259 | 468 | 0.7119 | 0.1783 | 0.7119 | 0.8437 |
| No log | 1.9342 | 470 | 0.6935 | 0.2154 | 0.6935 | 0.8328 |
| No log | 1.9424 | 472 | 0.7027 | 0.2921 | 0.7027 | 0.8383 |
| No log | 1.9506 | 474 | 0.7269 | 0.3724 | 0.7269 | 0.8526 |
| No log | 1.9588 | 476 | 0.7440 | 0.2186 | 0.7440 | 0.8626 |
| No log | 1.9671 | 478 | 0.7271 | 0.2588 | 0.7271 | 0.8527 |
| No log | 1.9753 | 480 | 0.6672 | 0.2186 | 0.6672 | 0.8169 |
| No log | 1.9835 | 482 | 0.5915 | 0.3467 | 0.5915 | 0.7691 |
| No log | 1.9918 | 484 | 0.5644 | 0.1560 | 0.5644 | 0.7512 |
| No log | 2.0 | 486 | 0.5817 | 0.2329 | 0.5817 | 0.7627 |
| No log | 2.0082 | 488 | 0.6373 | 0.2500 | 0.6373 | 0.7983 |
| No log | 2.0165 | 490 | 0.6611 | 0.2500 | 0.6611 | 0.8131 |
| No log | 2.0247 | 492 | 0.6433 | 0.2500 | 0.6433 | 0.8021 |
| No log | 2.0329 | 494 | 0.6117 | 0.2889 | 0.6117 | 0.7821 |
| No log | 2.0412 | 496 | 0.5556 | 0.2725 | 0.5556 | 0.7454 |
| No log | 2.0494 | 498 | 0.5430 | 0.2794 | 0.5430 | 0.7369 |
| 0.4608 | 2.0576 | 500 | 0.5797 | 0.1985 | 0.5797 | 0.7614 |
| 0.4608 | 2.0658 | 502 | 0.5947 | 0.1340 | 0.5947 | 0.7712 |
| 0.4608 | 2.0741 | 504 | 0.5880 | 0.1558 | 0.5880 | 0.7668 |
| 0.4608 | 2.0823 | 506 | 0.5929 | 0.2794 | 0.5929 | 0.7700 |
| 0.4608 | 2.0905 | 508 | 0.5988 | 0.2794 | 0.5988 | 0.7738 |
| 0.4608 | 2.0988 | 510 | 0.6130 | 0.1962 | 0.6130 | 0.7829 |
| 0.4608 | 2.1070 | 512 | 0.6189 | 0.2373 | 0.6189 | 0.7867 |
| 0.4608 | 2.1152 | 514 | 0.6166 | 0.2794 | 0.6166 | 0.7852 |
| 0.4608 | 2.1235 | 516 | 0.6265 | 0.3318 | 0.6265 | 0.7915 |
| 0.4608 | 2.1317 | 518 | 0.6417 | 0.3333 | 0.6417 | 0.8011 |
| 0.4608 | 2.1399 | 520 | 0.6221 | 0.3811 | 0.6221 | 0.7887 |
| 0.4608 | 2.1481 | 522 | 0.6069 | 0.4740 | 0.6069 | 0.7791 |
| 0.4608 | 2.1564 | 524 | 0.5840 | 0.4740 | 0.5840 | 0.7642 |
| 0.4608 | 2.1646 | 526 | 0.5537 | 0.4394 | 0.5537 | 0.7441 |
| 0.4608 | 2.1728 | 528 | 0.5402 | 0.4394 | 0.5402 | 0.7350 |
| 0.4608 | 2.1811 | 530 | 0.5344 | 0.4394 | 0.5344 | 0.7311 |
| 0.4608 | 2.1893 | 532 | 0.5366 | 0.4394 | 0.5366 | 0.7325 |
| 0.4608 | 2.1975 | 534 | 0.5417 | 0.3467 | 0.5417 | 0.7360 |
| 0.4608 | 2.2058 | 536 | 0.5570 | 0.3077 | 0.5570 | 0.7463 |
| 0.4608 | 2.2140 | 538 | 0.5678 | 0.3077 | 0.5678 | 0.7535 |
| 0.4608 | 2.2222 | 540 | 0.5726 | 0.4394 | 0.5726 | 0.7567 |
| 0.4608 | 2.2305 | 542 | 0.5863 | 0.4394 | 0.5863 | 0.7657 |
| 0.4608 | 2.2387 | 544 | 0.5828 | 0.4394 | 0.5828 | 0.7634 |
| 0.4608 | 2.2469 | 546 | 0.5640 | 0.4394 | 0.5640 | 0.7510 |
| 0.4608 | 2.2551 | 548 | 0.5501 | 0.3077 | 0.5501 | 0.7417 |
| 0.4608 | 2.2634 | 550 | 0.5549 | 0.3077 | 0.5549 | 0.7449 |
| 0.4608 | 2.2716 | 552 | 0.5648 | 0.3077 | 0.5648 | 0.7515 |
| 0.4608 | 2.2798 | 554 | 0.5753 | 0.3077 | 0.5753 | 0.7585 |
| 0.4608 | 2.2881 | 556 | 0.5908 | 0.3318 | 0.5908 | 0.7686 |
| 0.4608 | 2.2963 | 558 | 0.6116 | 0.3724 | 0.6116 | 0.7821 |
| 0.4608 | 2.3045 | 560 | 0.6260 | 0.3724 | 0.6260 | 0.7912 |
| 0.4608 | 2.3128 | 562 | 0.6291 | 0.3724 | 0.6291 | 0.7931 |
| 0.4608 | 2.3210 | 564 | 0.6709 | 0.3724 | 0.6709 | 0.8191 |
| 0.4608 | 2.3292 | 566 | 0.6848 | 0.3724 | 0.6848 | 0.8275 |
| 0.4608 | 2.3374 | 568 | 0.6502 | 0.3724 | 0.6502 | 0.8063 |
| 0.4608 | 2.3457 | 570 | 0.6072 | 0.3724 | 0.6072 | 0.7792 |
| 0.4608 | 2.3539 | 572 | 0.5894 | 0.1765 | 0.5894 | 0.7678 |
| 0.4608 | 2.3621 | 574 | 0.5911 | 0.1560 | 0.5911 | 0.7688 |
| 0.4608 | 2.3704 | 576 | 0.5899 | 0.1962 | 0.5899 | 0.7680 |
| 0.4608 | 2.3786 | 578 | 0.5935 | 0.1765 | 0.5935 | 0.7704 |
| 0.4608 | 2.3868 | 580 | 0.6156 | 0.2613 | 0.6156 | 0.7846 |
| 0.4608 | 2.3951 | 582 | 0.6616 | 0.3724 | 0.6616 | 0.8134 |
| 0.4608 | 2.4033 | 584 | 0.6960 | 0.2186 | 0.6960 | 0.8343 |
| 0.4608 | 2.4115 | 586 | 0.6885 | 0.2186 | 0.6885 | 0.8297 |
| 0.4608 | 2.4198 | 588 | 0.6483 | 0.3724 | 0.6483 | 0.8052 |
| 0.4608 | 2.4280 | 590 | 0.6295 | 0.2613 | 0.6295 | 0.7934 |
| 0.4608 | 2.4362 | 592 | 0.6179 | 0.2794 | 0.6179 | 0.7861 |
| 0.4608 | 2.4444 | 594 | 0.6188 | 0.1962 | 0.6188 | 0.7866 |
| 0.4608 | 2.4527 | 596 | 0.6167 | 0.1962 | 0.6167 | 0.7853 |
| 0.4608 | 2.4609 | 598 | 0.6165 | 0.2373 | 0.6165 | 0.7851 |
| 0.4608 | 2.4691 | 600 | 0.6327 | 0.1558 | 0.6327 | 0.7954 |
| 0.4608 | 2.4774 | 602 | 0.6478 | 0.2186 | 0.6478 | 0.8049 |
| 0.4608 | 2.4856 | 604 | 0.6660 | 0.2186 | 0.6660 | 0.8161 |
| 0.4608 | 2.4938 | 606 | 0.6790 | 0.2186 | 0.6790 | 0.8240 |
| 0.4608 | 2.5021 | 608 | 0.6569 | 0.2759 | 0.6569 | 0.8105 |
| 0.4608 | 2.5103 | 610 | 0.6135 | 0.3467 | 0.6135 | 0.7833 |
| 0.4608 | 2.5185 | 612 | 0.6081 | 0.3467 | 0.6081 | 0.7798 |
| 0.4608 | 2.5267 | 614 | 0.6175 | 0.3467 | 0.6175 | 0.7858 |
| 0.4608 | 2.5350 | 616 | 0.6364 | 0.3467 | 0.6364 | 0.7977 |
| 0.4608 | 2.5432 | 618 | 0.6775 | 0.2364 | 0.6775 | 0.8231 |
| 0.4608 | 2.5514 | 620 | 0.7083 | 0.3396 | 0.7083 | 0.8416 |
| 0.4608 | 2.5597 | 622 | 0.7059 | 0.2759 | 0.7059 | 0.8402 |
| 0.4608 | 2.5679 | 624 | 0.7097 | 0.2186 | 0.7097 | 0.8425 |
| 0.4608 | 2.5761 | 626 | 0.7355 | 0.2000 | 0.7355 | 0.8576 |
| 0.4608 | 2.5844 | 628 | 0.7654 | 0.2000 | 0.7654 | 0.8749 |
| 0.4608 | 2.5926 | 630 | 0.7487 | 0.2000 | 0.7487 | 0.8653 |
| 0.4608 | 2.6008 | 632 | 0.7130 | 0.2000 | 0.7130 | 0.8444 |
| 0.4608 | 2.6091 | 634 | 0.6732 | 0.1600 | 0.6732 | 0.8205 |
| 0.4608 | 2.6173 | 636 | 0.6652 | 0.1600 | 0.6652 | 0.8156 |
| 0.4608 | 2.6255 | 638 | 0.6602 | 0.1600 | 0.6602 | 0.8125 |
| 0.4608 | 2.6337 | 640 | 0.6635 | 0.2000 | 0.6635 | 0.8146 |
| 0.4608 | 2.6420 | 642 | 0.6839 | 0.2000 | 0.6839 | 0.8270 |
| 0.4608 | 2.6502 | 644 | 0.6558 | 0.2000 | 0.6558 | 0.8098 |
| 0.4608 | 2.6584 | 646 | 0.6363 | 0.1600 | 0.6363 | 0.7977 |
| 0.4608 | 2.6667 | 648 | 0.6270 | 0.1600 | 0.6270 | 0.7918 |
| 0.4608 | 2.6749 | 650 | 0.6337 | 0.1600 | 0.6337 | 0.7960 |
| 0.4608 | 2.6831 | 652 | 0.6327 | 0.1600 | 0.6327 | 0.7954 |
| 0.4608 | 2.6914 | 654 | 0.6530 | 0.1600 | 0.6530 | 0.8081 |
| 0.4608 | 2.6996 | 656 | 0.6802 | 0.2000 | 0.6802 | 0.8248 |
| 0.4608 | 2.7078 | 658 | 0.7190 | 0.2000 | 0.7190 | 0.8479 |
| 0.4608 | 2.7160 | 660 | 0.7140 | 0.2000 | 0.7140 | 0.8450 |
| 0.4608 | 2.7243 | 662 | 0.6844 | 0.1600 | 0.6844 | 0.8273 |
| 0.4608 | 2.7325 | 664 | 0.6335 | 0.3724 | 0.6335 | 0.7959 |
| 0.4608 | 2.7407 | 666 | 0.6120 | 0.1962 | 0.6120 | 0.7823 |
| 0.4608 | 2.7490 | 668 | 0.6157 | 0.1560 | 0.6157 | 0.7846 |
| 0.4608 | 2.7572 | 670 | 0.6152 | 0.1560 | 0.6152 | 0.7844 |
| 0.4608 | 2.7654 | 672 | 0.6157 | 0.1560 | 0.6157 | 0.7847 |
| 0.4608 | 2.7737 | 674 | 0.6330 | 0.3318 | 0.6330 | 0.7956 |
| 0.4608 | 2.7819 | 676 | 0.6575 | 0.3318 | 0.6575 | 0.8108 |
| 0.4608 | 2.7901 | 678 | 0.6984 | 0.1600 | 0.6984 | 0.8357 |
| 0.4608 | 2.7984 | 680 | 0.7294 | 0.2000 | 0.7294 | 0.8541 |
| 0.4608 | 2.8066 | 682 | 0.7152 | 0.2000 | 0.7152 | 0.8457 |
| 0.4608 | 2.8148 | 684 | 0.7148 | 0.2000 | 0.7148 | 0.8455 |
| 0.4608 | 2.8230 | 686 | 0.6888 | 0.2000 | 0.6888 | 0.8299 |
| 0.4608 | 2.8313 | 688 | 0.6352 | 0.2186 | 0.6352 | 0.7970 |
| 0.4608 | 2.8395 | 690 | 0.6025 | 0.3077 | 0.6025 | 0.7762 |
| 0.4608 | 2.8477 | 692 | 0.6244 | 0.2150 | 0.6244 | 0.7902 |
| 0.4608 | 2.8560 | 694 | 0.6440 | 0.1755 | 0.6440 | 0.8025 |
| 0.4608 | 2.8642 | 696 | 0.6430 | 0.1755 | 0.6430 | 0.8019 |
| 0.4608 | 2.8724 | 698 | 0.6276 | 0.1560 | 0.6276 | 0.7922 |
| 0.4608 | 2.8807 | 700 | 0.6191 | 0.2921 | 0.6191 | 0.7869 |
| 0.4608 | 2.8889 | 702 | 0.6311 | 0.3724 | 0.6311 | 0.7944 |
| 0.4608 | 2.8971 | 704 | 0.6476 | 0.2000 | 0.6476 | 0.8047 |
| 0.4608 | 2.9053 | 706 | 0.6636 | 0.2000 | 0.6636 | 0.8146 |
| 0.4608 | 2.9136 | 708 | 0.6691 | 0.2000 | 0.6691 | 0.8180 |
| 0.4608 | 2.9218 | 710 | 0.6595 | 0.2000 | 0.6595 | 0.8121 |
| 0.4608 | 2.9300 | 712 | 0.6259 | 0.1600 | 0.6259 | 0.7911 |
| 0.4608 | 2.9383 | 714 | 0.6047 | 0.3163 | 0.6047 | 0.7776 |
| 0.4608 | 2.9465 | 716 | 0.5931 | 0.3724 | 0.5931 | 0.7701 |
| 0.4608 | 2.9547 | 718 | 0.5937 | 0.3724 | 0.5937 | 0.7705 |
| 0.4608 | 2.9630 | 720 | 0.5991 | 0.3724 | 0.5991 | 0.7740 |
| 0.4608 | 2.9712 | 722 | 0.5993 | 0.3724 | 0.5993 | 0.7742 |
| 0.4608 | 2.9794 | 724 | 0.6208 | 0.3724 | 0.6208 | 0.7879 |
| 0.4608 | 2.9877 | 726 | 0.6729 | 0.2000 | 0.6729 | 0.8203 |
| 0.4608 | 2.9959 | 728 | 0.7148 | 0.2000 | 0.7148 | 0.8455 |
| 0.4608 | 3.0041 | 730 | 0.7300 | 0.2000 | 0.7300 | 0.8544 |
| 0.4608 | 3.0123 | 732 | 0.7470 | 0.2000 | 0.7470 | 0.8643 |
| 0.4608 | 3.0206 | 734 | 0.7594 | 0.2000 | 0.7594 | 0.8714 |
| 0.4608 | 3.0288 | 736 | 0.7383 | 0.2000 | 0.7383 | 0.8593 |
| 0.4608 | 3.0370 | 738 | 0.7335 | 0.2000 | 0.7335 | 0.8565 |
| 0.4608 | 3.0453 | 740 | 0.7054 | 0.2186 | 0.7054 | 0.8399 |
| 0.4608 | 3.0535 | 742 | 0.6693 | 0.2186 | 0.6693 | 0.8181 |
| 0.4608 | 3.0617 | 744 | 0.6355 | 0.2921 | 0.6355 | 0.7972 |
| 0.4608 | 3.0700 | 746 | 0.6327 | 0.3318 | 0.6327 | 0.7954 |
| 0.4608 | 3.0782 | 748 | 0.6501 | 0.1600 | 0.6501 | 0.8063 |
| 0.4608 | 3.0864 | 750 | 0.6755 | 0.2000 | 0.6755 | 0.8219 |
| 0.4608 | 3.0947 | 752 | 0.6930 | 0.2000 | 0.6930 | 0.8325 |
| 0.4608 | 3.1029 | 754 | 0.7248 | 0.2000 | 0.7248 | 0.8514 |
| 0.4608 | 3.1111 | 756 | 0.7025 | 0.2000 | 0.7025 | 0.8382 |
| 0.4608 | 3.1193 | 758 | 0.6839 | 0.1600 | 0.6839 | 0.8270 |
| 0.4608 | 3.1276 | 760 | 0.6746 | 0.1600 | 0.6746 | 0.8213 |
| 0.4608 | 3.1358 | 762 | 0.6942 | 0.2000 | 0.6942 | 0.8332 |
| 0.4608 | 3.1440 | 764 | 0.7235 | 0.1398 | 0.7235 | 0.8506 |
| 0.4608 | 3.1523 | 766 | 0.7442 | 0.1398 | 0.7442 | 0.8627 |
| 0.4608 | 3.1605 | 768 | 0.7152 | 0.1398 | 0.7152 | 0.8457 |
| 0.4608 | 3.1687 | 770 | 0.6785 | 0.1600 | 0.6785 | 0.8237 |
| 0.4608 | 3.1770 | 772 | 0.6604 | 0.1600 | 0.6604 | 0.8126 |
| 0.4608 | 3.1852 | 774 | 0.6512 | 0.2533 | 0.6512 | 0.8070 |
| 0.4608 | 3.1934 | 776 | 0.6512 | 0.2533 | 0.6512 | 0.8070 |
| 0.4608 | 3.2016 | 778 | 0.6528 | 0.0455 | 0.6528 | 0.8080 |
| 0.4608 | 3.2099 | 780 | 0.6800 | 0.1600 | 0.6800 | 0.8246 |
| 0.4608 | 3.2181 | 782 | 0.7145 | 0.1000 | 0.7145 | 0.8453 |
| 0.4608 | 3.2263 | 784 | 0.7507 | 0.1398 | 0.7507 | 0.8664 |
| 0.4608 | 3.2346 | 786 | 0.7818 | 0.1398 | 0.7818 | 0.8842 |
| 0.4608 | 3.2428 | 788 | 0.7708 | 0.1000 | 0.7708 | 0.8780 |
| 0.4608 | 3.2510 | 790 | 0.7246 | 0.1600 | 0.7246 | 0.8512 |
| 0.4608 | 3.2593 | 792 | 0.6926 | 0.1600 | 0.6926 | 0.8322 |
| 0.4608 | 3.2675 | 794 | 0.6878 | 0.2154 | 0.6878 | 0.8293 |
| 0.4608 | 3.2757 | 796 | 0.6934 | 0.2154 | 0.6934 | 0.8327 |
| 0.4608 | 3.2840 | 798 | 0.6951 | 0.0957 | 0.6951 | 0.8337 |
| 0.4608 | 3.2922 | 800 | 0.6838 | 0.2154 | 0.6838 | 0.8269 |
| 0.4608 | 3.3004 | 802 | 0.6843 | 0.2533 | 0.6843 | 0.8272 |
| 0.4608 | 3.3086 | 804 | 0.7108 | 0.1600 | 0.7108 | 0.8431 |
| 0.4608 | 3.3169 | 806 | 0.7749 | 0.1398 | 0.7749 | 0.8803 |
| 0.4608 | 3.3251 | 808 | 0.8174 | 0.1398 | 0.8174 | 0.9041 |
| 0.4608 | 3.3333 | 810 | 0.8305 | 0.1398 | 0.8305 | 0.9113 |
| 0.4608 | 3.3416 | 812 | 0.8008 | 0.1398 | 0.8008 | 0.8949 |
| 0.4608 | 3.3498 | 814 | 0.7626 | 0.1398 | 0.7626 | 0.8733 |
| 0.4608 | 3.3580 | 816 | 0.7119 | 0.2000 | 0.7119 | 0.8438 |
| 0.4608 | 3.3663 | 818 | 0.6969 | 0.1600 | 0.6969 | 0.8348 |
| 0.4608 | 3.3745 | 820 | 0.7092 | 0.1600 | 0.7092 | 0.8421 |
| 0.4608 | 3.3827 | 822 | 0.7331 | 0.2000 | 0.7331 | 0.8562 |
| 0.4608 | 3.3909 | 824 | 0.7654 | 0.1398 | 0.7654 | 0.8749 |
| 0.4608 | 3.3992 | 826 | 0.7778 | 0.1398 | 0.7778 | 0.8819 |
| 0.4608 | 3.4074 | 828 | 0.7790 | 0.1398 | 0.7790 | 0.8826 |
| 0.4608 | 3.4156 | 830 | 0.7525 | 0.1398 | 0.7525 | 0.8675 |
| 0.4608 | 3.4239 | 832 | 0.7099 | 0.2000 | 0.7099 | 0.8425 |
| 0.4608 | 3.4321 | 834 | 0.6679 | 0.2186 | 0.6679 | 0.8172 |
| 0.4608 | 3.4403 | 836 | 0.6515 | 0.3318 | 0.6515 | 0.8071 |
| 0.4608 | 3.4486 | 838 | 0.6545 | 0.3318 | 0.6545 | 0.8090 |
| 0.4608 | 3.4568 | 840 | 0.6770 | 0.2000 | 0.6770 | 0.8228 |
| 0.4608 | 3.4650 | 842 | 0.7027 | 0.2000 | 0.7027 | 0.8383 |
| 0.4608 | 3.4733 | 844 | 0.7563 | 0.2000 | 0.7563 | 0.8697 |
| 0.4608 | 3.4815 | 846 | 0.7959 | 0.1398 | 0.7959 | 0.8921 |
| 0.4608 | 3.4897 | 848 | 0.7839 | 0.1398 | 0.7839 | 0.8854 |
| 0.4608 | 3.4979 | 850 | 0.7421 | 0.2000 | 0.7421 | 0.8615 |
| 0.4608 | 3.5062 | 852 | 0.7016 | 0.2000 | 0.7016 | 0.8376 |
| 0.4608 | 3.5144 | 854 | 0.6734 | 0.2000 | 0.6734 | 0.8206 |
| 0.4608 | 3.5226 | 856 | 0.6503 | 0.2000 | 0.6503 | 0.8064 |
| 0.4608 | 3.5309 | 858 | 0.6411 | 0.2000 | 0.6411 | 0.8007 |
| 0.4608 | 3.5391 | 860 | 0.6339 | 0.2000 | 0.6339 | 0.7962 |
| 0.4608 | 3.5473 | 862 | 0.6339 | 0.2000 | 0.6339 | 0.7962 |
| 0.4608 | 3.5556 | 864 | 0.6477 | 0.2000 | 0.6477 | 0.8048 |
| 0.4608 | 3.5638 | 866 | 0.6577 | 0.2000 | 0.6577 | 0.8110 |
| 0.4608 | 3.5720 | 868 | 0.6815 | 0.2188 | 0.6815 | 0.8255 |
| 0.4608 | 3.5802 | 870 | 0.7363 | 0.3198 | 0.7363 | 0.8581 |
| 0.4608 | 3.5885 | 872 | 0.7513 | 0.3198 | 0.7513 | 0.8668 |
| 0.4608 | 3.5967 | 874 | 0.7470 | 0.3198 | 0.7470 | 0.8643 |
| 0.4608 | 3.6049 | 876 | 0.7090 | 0.2188 | 0.7090 | 0.8420 |
| 0.4608 | 3.6132 | 878 | 0.6699 | 0.2000 | 0.6699 | 0.8185 |
| 0.4608 | 3.6214 | 880 | 0.6369 | 0.2000 | 0.6369 | 0.7981 |
| 0.4608 | 3.6296 | 882 | 0.6287 | 0.2000 | 0.6287 | 0.7929 |
| 0.4608 | 3.6379 | 884 | 0.6310 | 0.2000 | 0.6310 | 0.7944 |
| 0.4608 | 3.6461 | 886 | 0.6229 | 0.1600 | 0.6229 | 0.7893 |
| 0.4608 | 3.6543 | 888 | 0.6169 | 0.1600 | 0.6169 | 0.7855 |
| 0.4608 | 3.6626 | 890 | 0.6151 | 0.3724 | 0.6151 | 0.7843 |
| 0.4608 | 3.6708 | 892 | 0.6142 | 0.3318 | 0.6142 | 0.7837 |
| 0.4608 | 3.6790 | 894 | 0.6169 | 0.2921 | 0.6169 | 0.7854 |
| 0.4608 | 3.6872 | 896 | 0.6177 | 0.2921 | 0.6177 | 0.7859 |
| 0.4608 | 3.6955 | 898 | 0.6257 | 0.3318 | 0.6257 | 0.7910 |
| 0.4608 | 3.7037 | 900 | 0.6257 | 0.3318 | 0.6257 | 0.7910 |
| 0.4608 | 3.7119 | 902 | 0.6227 | 0.3318 | 0.6227 | 0.7891 |
| 0.4608 | 3.7202 | 904 | 0.6164 | 0.3318 | 0.6164 | 0.7851 |
| 0.4608 | 3.7284 | 906 | 0.6184 | 0.3318 | 0.6184 | 0.7864 |
| 0.4608 | 3.7366 | 908 | 0.6787 | 0.3107 | 0.6787 | 0.8238 |
| 0.4608 | 3.7449 | 910 | 0.7562 | 0.3277 | 0.7562 | 0.8696 |
| 0.4608 | 3.7531 | 912 | 0.8121 | 0.3277 | 0.8121 | 0.9011 |
| 0.4608 | 3.7613 | 914 | 0.8140 | 0.3107 | 0.8140 | 0.9022 |
| 0.4608 | 3.7695 | 916 | 0.8356 | 0.2566 | 0.8356 | 0.9141 |
| 0.4608 | 3.7778 | 918 | 0.7919 | 0.3107 | 0.7919 | 0.8899 |
| 0.4608 | 3.7860 | 920 | 0.7292 | 0.2000 | 0.7292 | 0.8540 |
| 0.4608 | 3.7942 | 922 | 0.6849 | 0.2000 | 0.6849 | 0.8276 |
| 0.4608 | 3.8025 | 924 | 0.6415 | 0.2588 | 0.6415 | 0.8010 |
| 0.4608 | 3.8107 | 926 | 0.6140 | 0.3724 | 0.6140 | 0.7836 |
| 0.4608 | 3.8189 | 928 | 0.6121 | 0.3318 | 0.6121 | 0.7823 |
| 0.4608 | 3.8272 | 930 | 0.6138 | 0.3318 | 0.6138 | 0.7834 |
| 0.4608 | 3.8354 | 932 | 0.6229 | 0.2186 | 0.6229 | 0.7892 |
| 0.4608 | 3.8436 | 934 | 0.6353 | 0.2588 | 0.6353 | 0.7971 |
| 0.4608 | 3.8519 | 936 | 0.6542 | 0.2000 | 0.6542 | 0.8088 |
| 0.4608 | 3.8601 | 938 | 0.6833 | 0.2000 | 0.6833 | 0.8266 |
| 0.4608 | 3.8683 | 940 | 0.7102 | 0.2000 | 0.7102 | 0.8427 |
| 0.4608 | 3.8765 | 942 | 0.6991 | 0.2000 | 0.6991 | 0.8361 |
| 0.4608 | 3.8848 | 944 | 0.6823 | 0.1600 | 0.6823 | 0.8260 |
| 0.4608 | 3.8930 | 946 | 0.6662 | 0.2186 | 0.6662 | 0.8162 |
| 0.4608 | 3.9012 | 948 | 0.6658 | 0.2186 | 0.6658 | 0.8160 |
| 0.4608 | 3.9095 | 950 | 0.6742 | 0.2186 | 0.6742 | 0.8211 |
| 0.4608 | 3.9177 | 952 | 0.6798 | 0.2186 | 0.6798 | 0.8245 |
| 0.4608 | 3.9259 | 954 | 0.7083 | 0.1600 | 0.7083 | 0.8416 |
| 0.4608 | 3.9342 | 956 | 0.7263 | 0.2000 | 0.7263 | 0.8523 |
| 0.4608 | 3.9424 | 958 | 0.7201 | 0.2000 | 0.7201 | 0.8486 |
| 0.4608 | 3.9506 | 960 | 0.7136 | 0.2000 | 0.7136 | 0.8448 |
| 0.4608 | 3.9588 | 962 | 0.6800 | 0.2186 | 0.6800 | 0.8246 |
| 0.4608 | 3.9671 | 964 | 0.6480 | 0.1793 | 0.6480 | 0.8050 |
| 0.4608 | 3.9753 | 966 | 0.6376 | 0.1793 | 0.6376 | 0.7985 |
| 0.4608 | 3.9835 | 968 | 0.6514 | 0.2186 | 0.6514 | 0.8071 |
| 0.4608 | 3.9918 | 970 | 0.6779 | 0.2000 | 0.6779 | 0.8233 |
| 0.4608 | 4.0 | 972 | 0.7279 | 0.3198 | 0.7279 | 0.8532 |
| 0.4608 | 4.0082 | 974 | 0.7666 | 0.3198 | 0.7666 | 0.8755 |
| 0.4608 | 4.0165 | 976 | 0.7730 | 0.3198 | 0.7730 | 0.8792 |
| 0.4608 | 4.0247 | 978 | 0.7249 | 0.3198 | 0.7249 | 0.8514 |
| 0.4608 | 4.0329 | 980 | 0.6436 | 0.3107 | 0.6436 | 0.8022 |
| 0.4608 | 4.0412 | 982 | 0.5838 | 0.2364 | 0.5838 | 0.7641 |
| 0.4608 | 4.0494 | 984 | 0.5645 | 0.3865 | 0.5645 | 0.7514 |
| 0.4608 | 4.0576 | 986 | 0.5648 | 0.2696 | 0.5648 | 0.7515 |
| 0.4608 | 4.0658 | 988 | 0.5706 | 0.2696 | 0.5706 | 0.7554 |
| 0.4608 | 4.0741 | 990 | 0.5778 | 0.2696 | 0.5778 | 0.7601 |
| 0.4608 | 4.0823 | 992 | 0.5984 | 0.3724 | 0.5984 | 0.7736 |
| 0.4608 | 4.0905 | 994 | 0.6383 | 0.1600 | 0.6383 | 0.7989 |
| 0.4608 | 4.0988 | 996 | 0.6917 | 0.2000 | 0.6917 | 0.8317 |
| 0.4608 | 4.1070 | 998 | 0.7845 | 0.3107 | 0.7845 | 0.8857 |
| 0.1067 | 4.1152 | 1000 | 0.8507 | 0.3277 | 0.8507 | 0.9223 |
| 0.1067 | 4.1235 | 1002 | 0.8551 | 0.3277 | 0.8551 | 0.9247 |
| 0.1067 | 4.1317 | 1004 | 0.8002 | 0.3277 | 0.8002 | 0.8946 |
| 0.1067 | 4.1399 | 1006 | 0.7005 | 0.3107 | 0.7005 | 0.8370 |
| 0.1067 | 4.1481 | 1008 | 0.6174 | 0.3163 | 0.6174 | 0.7858 |
| 0.1067 | 4.1564 | 1010 | 0.5818 | 0.3724 | 0.5818 | 0.7627 |
| 0.1067 | 4.1646 | 1012 | 0.5679 | 0.3724 | 0.5679 | 0.7536 |
| 0.1067 | 4.1728 | 1014 | 0.5659 | 0.3318 | 0.5659 | 0.7523 |
| 0.1067 | 4.1811 | 1016 | 0.5647 | 0.3724 | 0.5647 | 0.7515 |
| 0.1067 | 4.1893 | 1018 | 0.5690 | 0.3724 | 0.5690 | 0.7543 |
| 0.1067 | 4.1975 | 1020 | 0.5874 | 0.3163 | 0.5874 | 0.7664 |
| 0.1067 | 4.2058 | 1022 | 0.6182 | 0.2000 | 0.6182 | 0.7863 |
| 0.1067 | 4.2140 | 1024 | 0.6451 | 0.2000 | 0.6451 | 0.8032 |
| 0.1067 | 4.2222 | 1026 | 0.6470 | 0.2000 | 0.6470 | 0.8043 |
| 0.1067 | 4.2305 | 1028 | 0.6411 | 0.3107 | 0.6411 | 0.8007 |
| 0.1067 | 4.2387 | 1030 | 0.6326 | 0.3107 | 0.6326 | 0.7954 |
| 0.1067 | 4.2469 | 1032 | 0.6159 | 0.3107 | 0.6159 | 0.7848 |
| 0.1067 | 4.2551 | 1034 | 0.5866 | 0.4154 | 0.5866 | 0.7659 |
| 0.1067 | 4.2634 | 1036 | 0.5535 | 0.3163 | 0.5535 | 0.7440 |
| 0.1067 | 4.2716 | 1038 | 0.5404 | 0.3724 | 0.5404 | 0.7351 |
| 0.1067 | 4.2798 | 1040 | 0.5345 | 0.3318 | 0.5345 | 0.7311 |
| 0.1067 | 4.2881 | 1042 | 0.5368 | 0.3724 | 0.5368 | 0.7326 |
| 0.1067 | 4.2963 | 1044 | 0.5432 | 0.3724 | 0.5432 | 0.7370 |
| 0.1067 | 4.3045 | 1046 | 0.5531 | 0.3163 | 0.5531 | 0.7437 |
| 0.1067 | 4.3128 | 1048 | 0.5643 | 0.3163 | 0.5643 | 0.7512 |
| 0.1067 | 4.3210 | 1050 | 0.5742 | 0.3163 | 0.5742 | 0.7578 |
| 0.1067 | 4.3292 | 1052 | 0.5848 | 0.3163 | 0.5848 | 0.7647 |
| 0.1067 | 4.3374 | 1054 | 0.5892 | 0.3163 | 0.5892 | 0.7676 |
| 0.1067 | 4.3457 | 1056 | 0.5874 | 0.3724 | 0.5874 | 0.7664 |
| 0.1067 | 4.3539 | 1058 | 0.6102 | 0.4154 | 0.6102 | 0.7811 |
| 0.1067 | 4.3621 | 1060 | 0.6437 | 0.2727 | 0.6437 | 0.8023 |
| 0.1067 | 4.3704 | 1062 | 0.6690 | 0.2727 | 0.6690 | 0.8179 |
| 0.1067 | 4.3786 | 1064 | 0.7169 | 0.2727 | 0.7169 | 0.8467 |
| 0.1067 | 4.3868 | 1066 | 0.7977 | 0.3277 | 0.7977 | 0.8932 |
| 0.1067 | 4.3951 | 1068 | 0.9043 | 0.1421 | 0.9043 | 0.9509 |
| 0.1067 | 4.4033 | 1070 | 0.9346 | 0.1421 | 0.9346 | 0.9668 |
| 0.1067 | 4.4115 | 1072 | 0.8760 | 0.1421 | 0.8760 | 0.9360 |
| 0.1067 | 4.4198 | 1074 | 0.7776 | 0.3107 | 0.7776 | 0.8818 |
| 0.1067 | 4.4280 | 1076 | 0.6973 | 0.2727 | 0.6973 | 0.8350 |
| 0.1067 | 4.4362 | 1078 | 0.6421 | 0.2186 | 0.6421 | 0.8013 |
| 0.1067 | 4.4444 | 1080 | 0.6260 | 0.3724 | 0.6260 | 0.7912 |
| 0.1067 | 4.4527 | 1082 | 0.6141 | 0.3724 | 0.6141 | 0.7837 |
| 0.1067 | 4.4609 | 1084 | 0.5965 | 0.3724 | 0.5965 | 0.7724 |
| 0.1067 | 4.4691 | 1086 | 0.5922 | 0.3724 | 0.5922 | 0.7696 |
| 0.1067 | 4.4774 | 1088 | 0.5860 | 0.3724 | 0.5860 | 0.7655 |
| 0.1067 | 4.4856 | 1090 | 0.5774 | 0.3724 | 0.5774 | 0.7599 |
| 0.1067 | 4.4938 | 1092 | 0.5751 | 0.3724 | 0.5751 | 0.7584 |
| 0.1067 | 4.5021 | 1094 | 0.5759 | 0.3724 | 0.5759 | 0.7589 |
| 0.1067 | 4.5103 | 1096 | 0.5726 | 0.3724 | 0.5726 | 0.7567 |
| 0.1067 | 4.5185 | 1098 | 0.5822 | 0.3724 | 0.5822 | 0.7630 |
| 0.1067 | 4.5267 | 1100 | 0.5873 | 0.3724 | 0.5873 | 0.7664 |
| 0.1067 | 4.5350 | 1102 | 0.5880 | 0.3724 | 0.5880 | 0.7668 |
| 0.1067 | 4.5432 | 1104 | 0.5947 | 0.3724 | 0.5947 | 0.7712 |
| 0.1067 | 4.5514 | 1106 | 0.6063 | 0.3724 | 0.6063 | 0.7787 |
| 0.1067 | 4.5597 | 1108 | 0.6089 | 0.3724 | 0.6089 | 0.7804 |
| 0.1067 | 4.5679 | 1110 | 0.6153 | 0.3724 | 0.6153 | 0.7844 |
| 0.1067 | 4.5761 | 1112 | 0.6160 | 0.3724 | 0.6160 | 0.7849 |
| 0.1067 | 4.5844 | 1114 | 0.6238 | 0.3163 | 0.6238 | 0.7898 |
| 0.1067 | 4.5926 | 1116 | 0.6326 | 0.3163 | 0.6326 | 0.7954 |
| 0.1067 | 4.6008 | 1118 | 0.6173 | 0.3163 | 0.6173 | 0.7857 |
| 0.1067 | 4.6091 | 1120 | 0.6053 | 0.3163 | 0.6053 | 0.7780 |
| 0.1067 | 4.6173 | 1122 | 0.6107 | 0.3163 | 0.6107 | 0.7815 |
| 0.1067 | 4.6255 | 1124 | 0.6035 | 0.3163 | 0.6035 | 0.7769 |
| 0.1067 | 4.6337 | 1126 | 0.6111 | 0.3163 | 0.6111 | 0.7817 |
| 0.1067 | 4.6420 | 1128 | 0.6276 | 0.3163 | 0.6276 | 0.7922 |
| 0.1067 | 4.6502 | 1130 | 0.6515 | 0.3107 | 0.6515 | 0.8072 |
| 0.1067 | 4.6584 | 1132 | 0.6731 | 0.3107 | 0.6731 | 0.8204 |
| 0.1067 | 4.6667 | 1134 | 0.6614 | 0.3107 | 0.6614 | 0.8133 |
| 0.1067 | 4.6749 | 1136 | 0.6400 | 0.3107 | 0.6400 | 0.8000 |
| 0.1067 | 4.6831 | 1138 | 0.6019 | 0.3163 | 0.6019 | 0.7758 |
| 0.1067 | 4.6914 | 1140 | 0.5688 | 0.3724 | 0.5688 | 0.7542 |
| 0.1067 | 4.6996 | 1142 | 0.5518 | 0.3724 | 0.5518 | 0.7428 |
| 0.1067 | 4.7078 | 1144 | 0.5512 | 0.3609 | 0.5512 | 0.7424 |
| 0.1067 | 4.7160 | 1146 | 0.5574 | 0.3226 | 0.5574 | 0.7466 |
| 0.1067 | 4.7243 | 1148 | 0.5579 | 0.3609 | 0.5579 | 0.7469 |
| 0.1067 | 4.7325 | 1150 | 0.5574 | 0.3865 | 0.5574 | 0.7466 |
| 0.1067 | 4.7407 | 1152 | 0.5764 | 0.3724 | 0.5764 | 0.7592 |
| 0.1067 | 4.7490 | 1154 | 0.6318 | 0.4154 | 0.6318 | 0.7948 |
| 0.1067 | 4.7572 | 1156 | 0.6887 | 0.3198 | 0.6887 | 0.8299 |
| 0.1067 | 4.7654 | 1158 | 0.7032 | 0.3198 | 0.7032 | 0.8386 |
| 0.1067 | 4.7737 | 1160 | 0.6678 | 0.4529 | 0.6678 | 0.8172 |
| 0.1067 | 4.7819 | 1162 | 0.6315 | 0.4154 | 0.6315 | 0.7947 |
| 0.1067 | 4.7901 | 1164 | 0.6128 | 0.4167 | 0.6128 | 0.7828 |
| 0.1067 | 4.7984 | 1166 | 0.5894 | 0.4154 | 0.5894 | 0.7677 |
| 0.1067 | 4.8066 | 1168 | 0.5796 | 0.3163 | 0.5796 | 0.7613 |
| 0.1067 | 4.8148 | 1170 | 0.5729 | 0.3163 | 0.5729 | 0.7569 |
| 0.1067 | 4.8230 | 1172 | 0.5792 | 0.3163 | 0.5792 | 0.7611 |
| 0.1067 | 4.8313 | 1174 | 0.5800 | 0.3163 | 0.5800 | 0.7616 |
| 0.1067 | 4.8395 | 1176 | 0.5854 | 0.3163 | 0.5854 | 0.7651 |
| 0.1067 | 4.8477 | 1178 | 0.5778 | 0.3163 | 0.5778 | 0.7601 |
| 0.1067 | 4.8560 | 1180 | 0.5690 | 0.3163 | 0.5690 | 0.7543 |
| 0.1067 | 4.8642 | 1182 | 0.5619 | 0.3163 | 0.5619 | 0.7496 |
| 0.1067 | 4.8724 | 1184 | 0.5602 | 0.3163 | 0.5602 | 0.7484 |
| 0.1067 | 4.8807 | 1186 | 0.5590 | 0.3163 | 0.5591 | 0.7477 |
| 0.1067 | 4.8889 | 1188 | 0.5542 | 0.3163 | 0.5542 | 0.7444 |
| 0.1067 | 4.8971 | 1190 | 0.5554 | 0.3163 | 0.5554 | 0.7452 |
| 0.1067 | 4.9053 | 1192 | 0.5586 | 0.3163 | 0.5586 | 0.7474 |
| 0.1067 | 4.9136 | 1194 | 0.5645 | 0.3163 | 0.5645 | 0.7513 |
| 0.1067 | 4.9218 | 1196 | 0.5658 | 0.3163 | 0.5658 | 0.7522 |
| 0.1067 | 4.9300 | 1198 | 0.5701 | 0.3163 | 0.5701 | 0.7551 |
| 0.1067 | 4.9383 | 1200 | 0.5671 | 0.3163 | 0.5671 | 0.7531 |
| 0.1067 | 4.9465 | 1202 | 0.5657 | 0.3163 | 0.5657 | 0.7521 |
| 0.1067 | 4.9547 | 1204 | 0.5573 | 0.3163 | 0.5573 | 0.7465 |
| 0.1067 | 4.9630 | 1206 | 0.5621 | 0.3163 | 0.5621 | 0.7497 |
| 0.1067 | 4.9712 | 1208 | 0.5581 | 0.3163 | 0.5581 | 0.7471 |
| 0.1067 | 4.9794 | 1210 | 0.5770 | 0.3255 | 0.5770 | 0.7596 |
| 0.1067 | 4.9877 | 1212 | 0.6160 | 0.4529 | 0.6160 | 0.7848 |
| 0.1067 | 4.9959 | 1214 | 0.6855 | 0.3198 | 0.6855 | 0.8280 |
| 0.1067 | 5.0041 | 1216 | 0.7575 | 0.3198 | 0.7575 | 0.8703 |
| 0.1067 | 5.0123 | 1218 | 0.7589 | 0.3198 | 0.7589 | 0.8712 |
| 0.1067 | 5.0206 | 1220 | 0.7158 | 0.3198 | 0.7158 | 0.8460 |
| 0.1067 | 5.0288 | 1222 | 0.6280 | 0.4529 | 0.6280 | 0.7925 |
| 0.1067 | 5.0370 | 1224 | 0.5834 | 0.4167 | 0.5834 | 0.7638 |
| 0.1067 | 5.0453 | 1226 | 0.5561 | 0.4637 | 0.5561 | 0.7457 |
| 0.1067 | 5.0535 | 1228 | 0.5489 | 0.3771 | 0.5489 | 0.7409 |
| 0.1067 | 5.0617 | 1230 | 0.5482 | 0.3771 | 0.5482 | 0.7404 |
| 0.1067 | 5.0700 | 1232 | 0.5362 | 0.3724 | 0.5362 | 0.7323 |
| 0.1067 | 5.0782 | 1234 | 0.5372 | 0.3724 | 0.5372 | 0.7329 |
| 0.1067 | 5.0864 | 1236 | 0.5379 | 0.3724 | 0.5379 | 0.7334 |
| 0.1067 | 5.0947 | 1238 | 0.5419 | 0.3318 | 0.5419 | 0.7361 |
| 0.1067 | 5.1029 | 1240 | 0.5436 | 0.3318 | 0.5436 | 0.7373 |
| 0.1067 | 5.1111 | 1242 | 0.5462 | 0.3724 | 0.5462 | 0.7391 |
| 0.1067 | 5.1193 | 1244 | 0.5527 | 0.3724 | 0.5527 | 0.7434 |
| 0.1067 | 5.1276 | 1246 | 0.5586 | 0.3724 | 0.5586 | 0.7474 |
| 0.1067 | 5.1358 | 1248 | 0.5621 | 0.3724 | 0.5621 | 0.7497 |
| 0.1067 | 5.1440 | 1250 | 0.5642 | 0.2921 | 0.5642 | 0.7511 |
| 0.1067 | 5.1523 | 1252 | 0.5681 | 0.2921 | 0.5681 | 0.7537 |
| 0.1067 | 5.1605 | 1254 | 0.5697 | 0.2921 | 0.5697 | 0.7548 |
| 0.1067 | 5.1687 | 1256 | 0.5701 | 0.2921 | 0.5701 | 0.7550 |
| 0.1067 | 5.1770 | 1258 | 0.5750 | 0.3724 | 0.5750 | 0.7583 |
| 0.1067 | 5.1852 | 1260 | 0.5924 | 0.3724 | 0.5924 | 0.7697 |
| 0.1067 | 5.1934 | 1262 | 0.6060 | 0.4140 | 0.6060 | 0.7785 |
| 0.1067 | 5.2016 | 1264 | 0.6008 | 0.3724 | 0.6008 | 0.7751 |
| 0.1067 | 5.2099 | 1266 | 0.5907 | 0.3724 | 0.5907 | 0.7686 |
| 0.1067 | 5.2181 | 1268 | 0.5801 | 0.3724 | 0.5801 | 0.7616 |
| 0.1067 | 5.2263 | 1270 | 0.5799 | 0.2921 | 0.5799 | 0.7615 |
| 0.1067 | 5.2346 | 1272 | 0.5806 | 0.3467 | 0.5806 | 0.7620 |
| 0.1067 | 5.2428 | 1274 | 0.5769 | 0.3467 | 0.5769 | 0.7596 |
| 0.1067 | 5.2510 | 1276 | 0.5722 | 0.3865 | 0.5722 | 0.7564 |
| 0.1067 | 5.2593 | 1278 | 0.5707 | 0.3724 | 0.5707 | 0.7554 |
| 0.1067 | 5.2675 | 1280 | 0.5736 | 0.3724 | 0.5736 | 0.7574 |
| 0.1067 | 5.2757 | 1282 | 0.5781 | 0.3724 | 0.5781 | 0.7603 |
| 0.1067 | 5.2840 | 1284 | 0.5893 | 0.3724 | 0.5893 | 0.7677 |
| 0.1067 | 5.2922 | 1286 | 0.6013 | 0.3724 | 0.6013 | 0.7754 |
| 0.1067 | 5.3004 | 1288 | 0.6058 | 0.3724 | 0.6058 | 0.7783 |
| 0.1067 | 5.3086 | 1290 | 0.6026 | 0.3724 | 0.6026 | 0.7763 |
| 0.1067 | 5.3169 | 1292 | 0.6021 | 0.3724 | 0.6021 | 0.7760 |
| 0.1067 | 5.3251 | 1294 | 0.6085 | 0.3724 | 0.6085 | 0.7801 |
| 0.1067 | 5.3333 | 1296 | 0.6104 | 0.3724 | 0.6104 | 0.7813 |
| 0.1067 | 5.3416 | 1298 | 0.6099 | 0.3724 | 0.6099 | 0.7810 |
| 0.1067 | 5.3498 | 1300 | 0.6138 | 0.4273 | 0.6138 | 0.7834 |
| 0.1067 | 5.3580 | 1302 | 0.6232 | 0.3467 | 0.6232 | 0.7894 |
| 0.1067 | 5.3663 | 1304 | 0.6391 | 0.2696 | 0.6391 | 0.7995 |
| 0.1067 | 5.3745 | 1306 | 0.6443 | 0.3077 | 0.6443 | 0.8027 |
| 0.1067 | 5.3827 | 1308 | 0.6407 | 0.2921 | 0.6407 | 0.8004 |
| 0.1067 | 5.3909 | 1310 | 0.6412 | 0.3724 | 0.6412 | 0.8008 |
| 0.1067 | 5.3992 | 1312 | 0.6475 | 0.3724 | 0.6475 | 0.8046 |
| 0.1067 | 5.4074 | 1314 | 0.6641 | 0.3724 | 0.6641 | 0.8149 |
| 0.1067 | 5.4156 | 1316 | 0.6809 | 0.1600 | 0.6809 | 0.8252 |
| 0.1067 | 5.4239 | 1318 | 0.6994 | 0.1818 | 0.6994 | 0.8363 |
| 0.1067 | 5.4321 | 1320 | 0.6984 | 0.2846 | 0.6984 | 0.8357 |
| 0.1067 | 5.4403 | 1322 | 0.6712 | 0.1818 | 0.6712 | 0.8193 |
| 0.1067 | 5.4486 | 1324 | 0.6343 | 0.2186 | 0.6343 | 0.7964 |
| 0.1067 | 5.4568 | 1326 | 0.6057 | 0.3724 | 0.6057 | 0.7782 |
| 0.1067 | 5.4650 | 1328 | 0.5926 | 0.3724 | 0.5926 | 0.7698 |
| 0.1067 | 5.4733 | 1330 | 0.5906 | 0.4273 | 0.5906 | 0.7685 |
| 0.1067 | 5.4815 | 1332 | 0.5892 | 0.3865 | 0.5892 | 0.7676 |
| 0.1067 | 5.4897 | 1334 | 0.5892 | 0.3077 | 0.5892 | 0.7676 |
| 0.1067 | 5.4979 | 1336 | 0.5866 | 0.3467 | 0.5866 | 0.7659 |
| 0.1067 | 5.5062 | 1338 | 0.5847 | 0.4273 | 0.5847 | 0.7647 |
| 0.1067 | 5.5144 | 1340 | 0.5820 | 0.4273 | 0.5820 | 0.7629 |
| 0.1067 | 5.5226 | 1342 | 0.5855 | 0.4273 | 0.5855 | 0.7652 |
| 0.1067 | 5.5309 | 1344 | 0.5909 | 0.4277 | 0.5909 | 0.7687 |
| 0.1067 | 5.5391 | 1346 | 0.5976 | 0.3811 | 0.5976 | 0.7730 |
| 0.1067 | 5.5473 | 1348 | 0.6043 | 0.3811 | 0.6043 | 0.7774 |
| 0.1067 | 5.5556 | 1350 | 0.5986 | 0.3811 | 0.5986 | 0.7737 |
| 0.1067 | 5.5638 | 1352 | 0.5869 | 0.2759 | 0.5869 | 0.7661 |
| 0.1067 | 5.5720 | 1354 | 0.5819 | 0.3865 | 0.5819 | 0.7628 |
| 0.1067 | 5.5802 | 1356 | 0.5902 | 0.2759 | 0.5902 | 0.7682 |
| 0.1067 | 5.5885 | 1358 | 0.5856 | 0.3865 | 0.5856 | 0.7652 |
| 0.1067 | 5.5967 | 1360 | 0.5777 | 0.3865 | 0.5777 | 0.7601 |
| 0.1067 | 5.6049 | 1362 | 0.5730 | 0.3865 | 0.5730 | 0.7570 |
| 0.1067 | 5.6132 | 1364 | 0.5739 | 0.3865 | 0.5739 | 0.7576 |
| 0.1067 | 5.6214 | 1366 | 0.5833 | 0.3865 | 0.5833 | 0.7638 |
| 0.1067 | 5.6296 | 1368 | 0.5973 | 0.4772 | 0.5973 | 0.7729 |
| 0.1067 | 5.6379 | 1370 | 0.6009 | 0.4772 | 0.6009 | 0.7752 |
| 0.1067 | 5.6461 | 1372 | 0.6135 | 0.3463 | 0.6135 | 0.7833 |
| 0.1067 | 5.6543 | 1374 | 0.6249 | 0.3463 | 0.6249 | 0.7905 |
| 0.1067 | 5.6626 | 1376 | 0.6465 | 0.3811 | 0.6465 | 0.8041 |
| 0.1067 | 5.6708 | 1378 | 0.6563 | 0.3333 | 0.6563 | 0.8101 |
| 0.1067 | 5.6790 | 1380 | 0.6677 | 0.2846 | 0.6677 | 0.8171 |
| 0.1067 | 5.6872 | 1382 | 0.6573 | 0.2846 | 0.6573 | 0.8107 |
| 0.1067 | 5.6955 | 1384 | 0.6353 | 0.3255 | 0.6353 | 0.7971 |
| 0.1067 | 5.7037 | 1386 | 0.6239 | 0.3255 | 0.6239 | 0.7899 |
| 0.1067 | 5.7119 | 1388 | 0.6212 | 0.2186 | 0.6212 | 0.7882 |
| 0.1067 | 5.7202 | 1390 | 0.6240 | 0.2186 | 0.6240 | 0.7899 |
| 0.1067 | 5.7284 | 1392 | 0.6335 | 0.2186 | 0.6335 | 0.7959 |
| 0.1067 | 5.7366 | 1394 | 0.6243 | 0.3724 | 0.6243 | 0.7902 |
| 0.1067 | 5.7449 | 1396 | 0.6067 | 0.3724 | 0.6067 | 0.7789 |
| 0.1067 | 5.7531 | 1398 | 0.6060 | 0.3724 | 0.6060 | 0.7785 |
| 0.1067 | 5.7613 | 1400 | 0.6136 | 0.3724 | 0.6136 | 0.7833 |
| 0.1067 | 5.7695 | 1402 | 0.6249 | 0.3724 | 0.6249 | 0.7905 |
| 0.1067 | 5.7778 | 1404 | 0.6320 | 0.3724 | 0.6320 | 0.7950 |
| 0.1067 | 5.7860 | 1406 | 0.6302 | 0.3724 | 0.6302 | 0.7938 |
| 0.1067 | 5.7942 | 1408 | 0.6485 | 0.3724 | 0.6485 | 0.8053 |
| 0.1067 | 5.8025 | 1410 | 0.6884 | 0.4615 | 0.6884 | 0.8297 |
| 0.1067 | 5.8107 | 1412 | 0.7049 | 0.4615 | 0.7049 | 0.8396 |
| 0.1067 | 5.8189 | 1414 | 0.7113 | 0.2948 | 0.7113 | 0.8434 |
| 0.1067 | 5.8272 | 1416 | 0.6979 | 0.2948 | 0.6979 | 0.8354 |
| 0.1067 | 5.8354 | 1418 | 0.6645 | 0.3333 | 0.6645 | 0.8152 |
| 0.1067 | 5.8436 | 1420 | 0.6171 | 0.2186 | 0.6171 | 0.7855 |
| 0.1067 | 5.8519 | 1422 | 0.5984 | 0.2186 | 0.5984 | 0.7736 |
| 0.1067 | 5.8601 | 1424 | 0.5909 | 0.3724 | 0.5909 | 0.7687 |
| 0.1067 | 5.8683 | 1426 | 0.5855 | 0.3724 | 0.5855 | 0.7652 |
| 0.1067 | 5.8765 | 1428 | 0.5823 | 0.3724 | 0.5823 | 0.7631 |
| 0.1067 | 5.8848 | 1430 | 0.5827 | 0.3724 | 0.5827 | 0.7633 |
| 0.1067 | 5.8930 | 1432 | 0.5836 | 0.3724 | 0.5836 | 0.7640 |
| 0.1067 | 5.9012 | 1434 | 0.5890 | 0.2186 | 0.5890 | 0.7675 |
| 0.1067 | 5.9095 | 1436 | 0.5924 | 0.2186 | 0.5924 | 0.7697 |
| 0.1067 | 5.9177 | 1438 | 0.5934 | 0.2186 | 0.5934 | 0.7703 |
| 0.1067 | 5.9259 | 1440 | 0.5975 | 0.2186 | 0.5975 | 0.7730 |
| 0.1067 | 5.9342 | 1442 | 0.5973 | 0.2186 | 0.5973 | 0.7729 |
| 0.1067 | 5.9424 | 1444 | 0.6052 | 0.2186 | 0.6052 | 0.7779 |
| 0.1067 | 5.9506 | 1446 | 0.6085 | 0.2355 | 0.6085 | 0.7800 |
| 0.1067 | 5.9588 | 1448 | 0.6136 | 0.2355 | 0.6136 | 0.7833 |
| 0.1067 | 5.9671 | 1450 | 0.6272 | 0.3333 | 0.6272 | 0.7920 |
| 0.1067 | 5.9753 | 1452 | 0.6377 | 0.3333 | 0.6377 | 0.7986 |
| 0.1067 | 5.9835 | 1454 | 0.6601 | 0.3333 | 0.6601 | 0.8124 |
| 0.1067 | 5.9918 | 1456 | 0.6702 | 0.3333 | 0.6702 | 0.8186 |
| 0.1067 | 6.0 | 1458 | 0.6696 | 0.3333 | 0.6696 | 0.8183 |
| 0.1067 | 6.0082 | 1460 | 0.7106 | 0.3401 | 0.7106 | 0.8430 |
| 0.1067 | 6.0165 | 1462 | 0.7502 | 0.3277 | 0.7502 | 0.8662 |
| 0.1067 | 6.0247 | 1464 | 0.7732 | 0.3277 | 0.7732 | 0.8793 |
| 0.1067 | 6.0329 | 1466 | 0.7905 | 0.3277 | 0.7905 | 0.8891 |
| 0.1067 | 6.0412 | 1468 | 0.7390 | 0.3277 | 0.7390 | 0.8596 |
| 0.1067 | 6.0494 | 1470 | 0.6955 | 0.3401 | 0.6955 | 0.8339 |
| 0.1067 | 6.0576 | 1472 | 0.6746 | 0.3401 | 0.6746 | 0.8214 |
| 0.1067 | 6.0658 | 1474 | 0.6916 | 0.3401 | 0.6916 | 0.8316 |
| 0.1067 | 6.0741 | 1476 | 0.7171 | 0.3401 | 0.7171 | 0.8468 |
| 0.1067 | 6.0823 | 1478 | 0.7460 | 0.3401 | 0.7460 | 0.8637 |
| 0.1067 | 6.0905 | 1480 | 0.7604 | 0.4187 | 0.7604 | 0.8720 |
| 0.1067 | 6.0988 | 1482 | 0.7358 | 0.3401 | 0.7358 | 0.8578 |
| 0.1067 | 6.1070 | 1484 | 0.6898 | 0.3401 | 0.6898 | 0.8305 |
| 0.1067 | 6.1152 | 1486 | 0.6483 | 0.4740 | 0.6483 | 0.8052 |
| 0.1067 | 6.1235 | 1488 | 0.6281 | 0.4389 | 0.6281 | 0.7925 |
| 0.1067 | 6.1317 | 1490 | 0.6149 | 0.4389 | 0.6149 | 0.7842 |
| 0.1067 | 6.1399 | 1492 | 0.6114 | 0.4389 | 0.6114 | 0.7819 |
| 0.1067 | 6.1481 | 1494 | 0.6122 | 0.4389 | 0.6122 | 0.7825 |
| 0.1067 | 6.1564 | 1496 | 0.6148 | 0.4389 | 0.6148 | 0.7841 |
| 0.1067 | 6.1646 | 1498 | 0.6254 | 0.3463 | 0.6254 | 0.7909 |
| 0.0692 | 6.1728 | 1500 | 0.6406 | 0.3333 | 0.6406 | 0.8004 |
| 0.0692 | 6.1811 | 1502 | 0.6690 | 0.3333 | 0.6690 | 0.8179 |
| 0.0692 | 6.1893 | 1504 | 0.6879 | 0.3333 | 0.6879 | 0.8294 |
| 0.0692 | 6.1975 | 1506 | 0.6737 | 0.3333 | 0.6737 | 0.8208 |
| 0.0692 | 6.2058 | 1508 | 0.6619 | 0.3333 | 0.6619 | 0.8136 |
| 0.0692 | 6.2140 | 1510 | 0.6404 | 0.3333 | 0.6404 | 0.8002 |
| 0.0692 | 6.2222 | 1512 | 0.6208 | 0.2186 | 0.6208 | 0.7879 |
| 0.0692 | 6.2305 | 1514 | 0.6093 | 0.2364 | 0.6093 | 0.7806 |
| 0.0692 | 6.2387 | 1516 | 0.6077 | 0.3865 | 0.6077 | 0.7795 |
| 0.0692 | 6.2469 | 1518 | 0.6043 | 0.3865 | 0.6043 | 0.7774 |
| 0.0692 | 6.2551 | 1520 | 0.6001 | 0.3865 | 0.6001 | 0.7747 |
| 0.0692 | 6.2634 | 1522 | 0.5978 | 0.3865 | 0.5978 | 0.7732 |
| 0.0692 | 6.2716 | 1524 | 0.5987 | 0.2759 | 0.5987 | 0.7738 |
| 0.0692 | 6.2798 | 1526 | 0.6101 | 0.2355 | 0.6101 | 0.7811 |
| 0.0692 | 6.2881 | 1528 | 0.6292 | 0.3333 | 0.6292 | 0.7932 |
| 0.0692 | 6.2963 | 1530 | 0.6504 | 0.3333 | 0.6504 | 0.8065 |
| 0.0692 | 6.3045 | 1532 | 0.6731 | 0.2846 | 0.6731 | 0.8204 |
| 0.0692 | 6.3128 | 1534 | 0.6776 | 0.2846 | 0.6776 | 0.8231 |
| 0.0692 | 6.3210 | 1536 | 0.6742 | 0.2846 | 0.6742 | 0.8211 |
| 0.0692 | 6.3292 | 1538 | 0.6550 | 0.3811 | 0.6550 | 0.8093 |
| 0.0692 | 6.3374 | 1540 | 0.6390 | 0.3811 | 0.6390 | 0.7993 |
| 0.0692 | 6.3457 | 1542 | 0.6197 | 0.3811 | 0.6197 | 0.7872 |
| 0.0692 | 6.3539 | 1544 | 0.6014 | 0.4740 | 0.6014 | 0.7755 |
| 0.0692 | 6.3621 | 1546 | 0.5895 | 0.4740 | 0.5895 | 0.7678 |
| 0.0692 | 6.3704 | 1548 | 0.5917 | 0.4740 | 0.5917 | 0.7692 |
| 0.0692 | 6.3786 | 1550 | 0.5937 | 0.3463 | 0.5937 | 0.7705 |
| 0.0692 | 6.3868 | 1552 | 0.5984 | 0.3811 | 0.5984 | 0.7736 |
| 0.0692 | 6.3951 | 1554 | 0.6104 | 0.3333 | 0.6104 | 0.7813 |
| 0.0692 | 6.4033 | 1556 | 0.6300 | 0.3333 | 0.6300 | 0.7937 |
| 0.0692 | 6.4115 | 1558 | 0.6186 | 0.3333 | 0.6186 | 0.7865 |
| 0.0692 | 6.4198 | 1560 | 0.6094 | 0.3811 | 0.6094 | 0.7806 |
| 0.0692 | 6.4280 | 1562 | 0.6105 | 0.3811 | 0.6105 | 0.7814 |
| 0.0692 | 6.4362 | 1564 | 0.6063 | 0.2881 | 0.6063 | 0.7787 |
| 0.0692 | 6.4444 | 1566 | 0.6054 | 0.2881 | 0.6054 | 0.7781 |
| 0.0692 | 6.4527 | 1568 | 0.6045 | 0.2881 | 0.6045 | 0.7775 |
| 0.0692 | 6.4609 | 1570 | 0.6082 | 0.2759 | 0.6082 | 0.7798 |
| 0.0692 | 6.4691 | 1572 | 0.6141 | 0.2186 | 0.6141 | 0.7836 |
| 0.0692 | 6.4774 | 1574 | 0.6320 | 0.2355 | 0.6320 | 0.7950 |
| 0.0692 | 6.4856 | 1576 | 0.6476 | 0.2355 | 0.6476 | 0.8047 |
| 0.0692 | 6.4938 | 1578 | 0.6600 | 0.1818 | 0.6600 | 0.8124 |
| 0.0692 | 6.5021 | 1580 | 0.6625 | 0.2188 | 0.6625 | 0.8139 |
| 0.0692 | 6.5103 | 1582 | 0.6635 | 0.2188 | 0.6635 | 0.8145 |
| 0.0692 | 6.5185 | 1584 | 0.6506 | 0.2188 | 0.6506 | 0.8066 |
| 0.0692 | 6.5267 | 1586 | 0.6407 | 0.1818 | 0.6407 | 0.8004 |
| 0.0692 | 6.5350 | 1588 | 0.6252 | 0.2355 | 0.6252 | 0.7907 |
| 0.0692 | 6.5432 | 1590 | 0.6079 | 0.2186 | 0.6079 | 0.7797 |
| 0.0692 | 6.5514 | 1592 | 0.5948 | 0.2186 | 0.5948 | 0.7712 |
| 0.0692 | 6.5597 | 1594 | 0.5883 | 0.4273 | 0.5883 | 0.7670 |
| 0.0692 | 6.5679 | 1596 | 0.5897 | 0.3077 | 0.5897 | 0.7679 |
| 0.0692 | 6.5761 | 1598 | 0.5946 | 0.3226 | 0.5946 | 0.7711 |
| 0.0692 | 6.5844 | 1600 | 0.5944 | 0.3226 | 0.5944 | 0.7710 |
| 0.0692 | 6.5926 | 1602 | 0.5929 | 0.3467 | 0.5929 | 0.7700 |
| 0.0692 | 6.6008 | 1604 | 0.5930 | 0.3724 | 0.5930 | 0.7701 |
| 0.0692 | 6.6091 | 1606 | 0.5982 | 0.3724 | 0.5982 | 0.7734 |
| 0.0692 | 6.6173 | 1608 | 0.6042 | 0.3724 | 0.6042 | 0.7773 |
| 0.0692 | 6.6255 | 1610 | 0.6068 | 0.3724 | 0.6068 | 0.7790 |
| 0.0692 | 6.6337 | 1612 | 0.6094 | 0.3724 | 0.6094 | 0.7806 |
| 0.0692 | 6.6420 | 1614 | 0.6169 | 0.2186 | 0.6169 | 0.7854 |
| 0.0692 | 6.6502 | 1616 | 0.6346 | 0.2186 | 0.6346 | 0.7966 |
| 0.0692 | 6.6584 | 1618 | 0.6467 | 0.2186 | 0.6467 | 0.8042 |
| 0.0692 | 6.6667 | 1620 | 0.6422 | 0.2186 | 0.6422 | 0.8014 |
| 0.0692 | 6.6749 | 1622 | 0.6255 | 0.2186 | 0.6255 | 0.7909 |
| 0.0692 | 6.6831 | 1624 | 0.6098 | 0.3724 | 0.6098 | 0.7809 |
| 0.0692 | 6.6914 | 1626 | 0.5949 | 0.3724 | 0.5949 | 0.7713 |
| 0.0692 | 6.6996 | 1628 | 0.5834 | 0.3724 | 0.5834 | 0.7638 |
| 0.0692 | 6.7078 | 1630 | 0.5793 | 0.3724 | 0.5793 | 0.7611 |
| 0.0692 | 6.7160 | 1632 | 0.5767 | 0.3318 | 0.5767 | 0.7594 |
| 0.0692 | 6.7243 | 1634 | 0.5749 | 0.2921 | 0.5749 | 0.7582 |
| 0.0692 | 6.7325 | 1636 | 0.5739 | 0.2921 | 0.5739 | 0.7576 |
| 0.0692 | 6.7407 | 1638 | 0.5732 | 0.2921 | 0.5732 | 0.7571 |
| 0.0692 | 6.7490 | 1640 | 0.5720 | 0.3318 | 0.5720 | 0.7563 |
| 0.0692 | 6.7572 | 1642 | 0.5721 | 0.3724 | 0.5721 | 0.7564 |
| 0.0692 | 6.7654 | 1644 | 0.5810 | 0.3724 | 0.5810 | 0.7623 |
| 0.0692 | 6.7737 | 1646 | 0.5880 | 0.2186 | 0.5880 | 0.7668 |
| 0.0692 | 6.7819 | 1648 | 0.5938 | 0.2186 | 0.5938 | 0.7706 |
| 0.0692 | 6.7901 | 1650 | 0.5990 | 0.2186 | 0.5990 | 0.7740 |
| 0.0692 | 6.7984 | 1652 | 0.5994 | 0.3724 | 0.5994 | 0.7742 |
| 0.0692 | 6.8066 | 1654 | 0.6008 | 0.3724 | 0.6008 | 0.7751 |
| 0.0692 | 6.8148 | 1656 | 0.6042 | 0.3724 | 0.6042 | 0.7773 |
| 0.0692 | 6.8230 | 1658 | 0.6091 | 0.2186 | 0.6091 | 0.7805 |
| 0.0692 | 6.8313 | 1660 | 0.6180 | 0.2186 | 0.6180 | 0.7861 |
| 0.0692 | 6.8395 | 1662 | 0.6167 | 0.2186 | 0.6167 | 0.7853 |
| 0.0692 | 6.8477 | 1664 | 0.6168 | 0.2186 | 0.6168 | 0.7854 |
| 0.0692 | 6.8560 | 1666 | 0.6187 | 0.2186 | 0.6187 | 0.7866 |
| 0.0692 | 6.8642 | 1668 | 0.6188 | 0.2186 | 0.6188 | 0.7867 |
| 0.0692 | 6.8724 | 1670 | 0.6151 | 0.2186 | 0.6151 | 0.7843 |
| 0.0692 | 6.8807 | 1672 | 0.6070 | 0.2186 | 0.6070 | 0.7791 |
| 0.0692 | 6.8889 | 1674 | 0.6057 | 0.2186 | 0.6057 | 0.7783 |
| 0.0692 | 6.8971 | 1676 | 0.6044 | 0.1793 | 0.6044 | 0.7774 |
| 0.0692 | 6.9053 | 1678 | 0.6025 | 0.1793 | 0.6025 | 0.7762 |
| 0.0692 | 6.9136 | 1680 | 0.5999 | 0.1793 | 0.5999 | 0.7745 |
| 0.0692 | 6.9218 | 1682 | 0.5964 | 0.3318 | 0.5964 | 0.7723 |
| 0.0692 | 6.9300 | 1684 | 0.5980 | 0.3318 | 0.5980 | 0.7733 |
| 0.0692 | 6.9383 | 1686 | 0.6003 | 0.1793 | 0.6003 | 0.7748 |
| 0.0692 | 6.9465 | 1688 | 0.6075 | 0.1992 | 0.6075 | 0.7794 |
| 0.0692 | 6.9547 | 1690 | 0.6167 | 0.1992 | 0.6167 | 0.7853 |
| 0.0692 | 6.9630 | 1692 | 0.6238 | 0.1992 | 0.6238 | 0.7898 |
| 0.0692 | 6.9712 | 1694 | 0.6318 | 0.1992 | 0.6318 | 0.7949 |
| 0.0692 | 6.9794 | 1696 | 0.6347 | 0.1793 | 0.6347 | 0.7967 |
| 0.0692 | 6.9877 | 1698 | 0.6336 | 0.1793 | 0.6336 | 0.7960 |
| 0.0692 | 6.9959 | 1700 | 0.6399 | 0.2186 | 0.6399 | 0.7999 |
| 0.0692 | 7.0041 | 1702 | 0.6433 | 0.1600 | 0.6433 | 0.8020 |
| 0.0692 | 7.0123 | 1704 | 0.6346 | 0.2186 | 0.6346 | 0.7966 |
| 0.0692 | 7.0206 | 1706 | 0.6267 | 0.2186 | 0.6267 | 0.7916 |
| 0.0692 | 7.0288 | 1708 | 0.6184 | 0.2186 | 0.6184 | 0.7864 |
| 0.0692 | 7.0370 | 1710 | 0.6157 | 0.2186 | 0.6157 | 0.7846 |
| 0.0692 | 7.0453 | 1712 | 0.6129 | 0.1793 | 0.6129 | 0.7829 |
| 0.0692 | 7.0535 | 1714 | 0.6149 | 0.2186 | 0.6149 | 0.7842 |
| 0.0692 | 7.0617 | 1716 | 0.6210 | 0.1600 | 0.6210 | 0.7880 |
| 0.0692 | 7.0700 | 1718 | 0.6209 | 0.2186 | 0.6209 | 0.7880 |
| 0.0692 | 7.0782 | 1720 | 0.6126 | 0.1793 | 0.6126 | 0.7827 |
| 0.0692 | 7.0864 | 1722 | 0.6086 | 0.1793 | 0.6086 | 0.7801 |
| 0.0692 | 7.0947 | 1724 | 0.6144 | 0.1793 | 0.6144 | 0.7839 |
| 0.0692 | 7.1029 | 1726 | 0.6169 | 0.1992 | 0.6169 | 0.7854 |
| 0.0692 | 7.1111 | 1728 | 0.6278 | 0.1992 | 0.6278 | 0.7923 |
| 0.0692 | 7.1193 | 1730 | 0.6458 | 0.2355 | 0.6458 | 0.8036 |
| 0.0692 | 7.1276 | 1732 | 0.6782 | 0.2355 | 0.6782 | 0.8235 |
| 0.0692 | 7.1358 | 1734 | 0.6941 | 0.1818 | 0.6941 | 0.8331 |
| 0.0692 | 7.1440 | 1736 | 0.6847 | 0.2355 | 0.6847 | 0.8275 |
| 0.0692 | 7.1523 | 1738 | 0.6607 | 0.2355 | 0.6607 | 0.8129 |
| 0.0692 | 7.1605 | 1740 | 0.6388 | 0.2355 | 0.6388 | 0.7993 |
| 0.0692 | 7.1687 | 1742 | 0.6237 | 0.1992 | 0.6237 | 0.7898 |
| 0.0692 | 7.1770 | 1744 | 0.6138 | 0.3396 | 0.6138 | 0.7835 |
| 0.0692 | 7.1852 | 1746 | 0.6140 | 0.3396 | 0.6140 | 0.7836 |
| 0.0692 | 7.1934 | 1748 | 0.6138 | 0.3771 | 0.6138 | 0.7835 |
| 0.0692 | 7.2016 | 1750 | 0.6196 | 0.2355 | 0.6196 | 0.7871 |
| 0.0692 | 7.2099 | 1752 | 0.6268 | 0.2355 | 0.6268 | 0.7917 |
| 0.0692 | 7.2181 | 1754 | 0.6321 | 0.2355 | 0.6321 | 0.7950 |
| 0.0692 | 7.2263 | 1756 | 0.6372 | 0.2355 | 0.6372 | 0.7982 |
| 0.0692 | 7.2346 | 1758 | 0.6410 | 0.2355 | 0.6410 | 0.8007 |
| 0.0692 | 7.2428 | 1760 | 0.6502 | 0.2355 | 0.6502 | 0.8064 |
| 0.0692 | 7.2510 | 1762 | 0.6506 | 0.2355 | 0.6506 | 0.8066 |
| 0.0692 | 7.2593 | 1764 | 0.6459 | 0.2355 | 0.6459 | 0.8037 |
| 0.0692 | 7.2675 | 1766 | 0.6486 | 0.2355 | 0.6486 | 0.8053 |
| 0.0692 | 7.2757 | 1768 | 0.6505 | 0.2355 | 0.6505 | 0.8065 |
| 0.0692 | 7.2840 | 1770 | 0.6471 | 0.3396 | 0.6471 | 0.8044 |
| 0.0692 | 7.2922 | 1772 | 0.6450 | 0.3396 | 0.6450 | 0.8031 |
| 0.0692 | 7.3004 | 1774 | 0.6465 | 0.3396 | 0.6465 | 0.8041 |
| 0.0692 | 7.3086 | 1776 | 0.6521 | 0.3396 | 0.6521 | 0.8076 |
| 0.0692 | 7.3169 | 1778 | 0.6630 | 0.1992 | 0.6630 | 0.8143 |
| 0.0692 | 7.3251 | 1780 | 0.6659 | 0.1992 | 0.6659 | 0.8160 |
| 0.0692 | 7.3333 | 1782 | 0.6630 | 0.3396 | 0.6630 | 0.8143 |
| 0.0692 | 7.3416 | 1784 | 0.6656 | 0.1992 | 0.6656 | 0.8158 |
| 0.0692 | 7.3498 | 1786 | 0.6680 | 0.2355 | 0.6680 | 0.8173 |
| 0.0692 | 7.3580 | 1788 | 0.6584 | 0.3396 | 0.6584 | 0.8114 |
| 0.0692 | 7.3663 | 1790 | 0.6572 | 0.3396 | 0.6572 | 0.8107 |
| 0.0692 | 7.3745 | 1792 | 0.6563 | 0.3771 | 0.6563 | 0.8101 |
| 0.0692 | 7.3827 | 1794 | 0.6627 | 0.3771 | 0.6627 | 0.8141 |
| 0.0692 | 7.3909 | 1796 | 0.6650 | 0.2355 | 0.6650 | 0.8155 |
| 0.0692 | 7.3992 | 1798 | 0.6579 | 0.2355 | 0.6579 | 0.8111 |
| 0.0692 | 7.4074 | 1800 | 0.6494 | 0.3771 | 0.6494 | 0.8059 |
| 0.0692 | 7.4156 | 1802 | 0.6377 | 0.3724 | 0.6377 | 0.7985 |
| 0.0692 | 7.4239 | 1804 | 0.6364 | 0.3724 | 0.6364 | 0.7978 |
| 0.0692 | 7.4321 | 1806 | 0.6459 | 0.3724 | 0.6459 | 0.8037 |
| 0.0692 | 7.4403 | 1808 | 0.6677 | 0.2355 | 0.6677 | 0.8171 |
| 0.0692 | 7.4486 | 1810 | 0.6834 | 0.2355 | 0.6834 | 0.8267 |
| 0.0692 | 7.4568 | 1812 | 0.7022 | 0.2355 | 0.7022 | 0.8380 |
| 0.0692 | 7.4650 | 1814 | 0.7093 | 0.2355 | 0.7093 | 0.8422 |
| 0.0692 | 7.4733 | 1816 | 0.7180 | 0.2355 | 0.7180 | 0.8474 |
| 0.0692 | 7.4815 | 1818 | 0.7246 | 0.3333 | 0.7246 | 0.8513 |
| 0.0692 | 7.4897 | 1820 | 0.7286 | 0.3333 | 0.7286 | 0.8536 |
| 0.0692 | 7.4979 | 1822 | 0.7242 | 0.3333 | 0.7242 | 0.8510 |
| 0.0692 | 7.5062 | 1824 | 0.7173 | 0.3333 | 0.7173 | 0.8470 |
| 0.0692 | 7.5144 | 1826 | 0.7155 | 0.3333 | 0.7155 | 0.8459 |
| 0.0692 | 7.5226 | 1828 | 0.6998 | 0.2355 | 0.6998 | 0.8365 |
| 0.0692 | 7.5309 | 1830 | 0.6859 | 0.2355 | 0.6859 | 0.8282 |
| 0.0692 | 7.5391 | 1832 | 0.6722 | 0.2355 | 0.6722 | 0.8199 |
| 0.0692 | 7.5473 | 1834 | 0.6616 | 0.3771 | 0.6616 | 0.8134 |
| 0.0692 | 7.5556 | 1836 | 0.6468 | 0.3771 | 0.6468 | 0.8043 |
| 0.0692 | 7.5638 | 1838 | 0.6428 | 0.3771 | 0.6428 | 0.8018 |
| 0.0692 | 7.5720 | 1840 | 0.6488 | 0.3771 | 0.6488 | 0.8055 |
| 0.0692 | 7.5802 | 1842 | 0.6507 | 0.3771 | 0.6507 | 0.8067 |
| 0.0692 | 7.5885 | 1844 | 0.6520 | 0.3771 | 0.6520 | 0.8075 |
| 0.0692 | 7.5967 | 1846 | 0.6585 | 0.3771 | 0.6585 | 0.8115 |
| 0.0692 | 7.6049 | 1848 | 0.6589 | 0.3771 | 0.6589 | 0.8117 |
| 0.0692 | 7.6132 | 1850 | 0.6589 | 0.3771 | 0.6589 | 0.8117 |
| 0.0692 | 7.6214 | 1852 | 0.6684 | 0.2355 | 0.6684 | 0.8175 |
| 0.0692 | 7.6296 | 1854 | 0.6832 | 0.2355 | 0.6832 | 0.8265 |
| 0.0692 | 7.6379 | 1856 | 0.7081 | 0.3333 | 0.7081 | 0.8415 |
| 0.0692 | 7.6461 | 1858 | 0.7205 | 0.2846 | 0.7205 | 0.8488 |
| 0.0692 | 7.6543 | 1860 | 0.7250 | 0.2846 | 0.7250 | 0.8515 |
| 0.0692 | 7.6626 | 1862 | 0.7194 | 0.2846 | 0.7194 | 0.8482 |
| 0.0692 | 7.6708 | 1864 | 0.7073 | 0.3333 | 0.7073 | 0.8410 |
| 0.0692 | 7.6790 | 1866 | 0.6927 | 0.2355 | 0.6927 | 0.8323 |
| 0.0692 | 7.6872 | 1868 | 0.6672 | 0.2355 | 0.6672 | 0.8168 |
| 0.0692 | 7.6955 | 1870 | 0.6434 | 0.3724 | 0.6434 | 0.8022 |
| 0.0692 | 7.7037 | 1872 | 0.6320 | 0.3724 | 0.6320 | 0.7950 |
| 0.0692 | 7.7119 | 1874 | 0.6248 | 0.3318 | 0.6248 | 0.7905 |
| 0.0692 | 7.7202 | 1876 | 0.6213 | 0.3318 | 0.6213 | 0.7882 |
| 0.0692 | 7.7284 | 1878 | 0.6196 | 0.3318 | 0.6196 | 0.7871 |
| 0.0692 | 7.7366 | 1880 | 0.6225 | 0.3318 | 0.6225 | 0.7890 |
| 0.0692 | 7.7449 | 1882 | 0.6360 | 0.3724 | 0.6360 | 0.7975 |
| 0.0692 | 7.7531 | 1884 | 0.6573 | 0.3724 | 0.6573 | 0.8108 |
| 0.0692 | 7.7613 | 1886 | 0.6798 | 0.3724 | 0.6798 | 0.8245 |
| 0.0692 | 7.7695 | 1888 | 0.6981 | 0.3771 | 0.6981 | 0.8355 |
| 0.0692 | 7.7778 | 1890 | 0.7134 | 0.2355 | 0.7134 | 0.8446 |
| 0.0692 | 7.7860 | 1892 | 0.7096 | 0.2355 | 0.7096 | 0.8424 |
| 0.0692 | 7.7942 | 1894 | 0.6947 | 0.2355 | 0.6947 | 0.8335 |
| 0.0692 | 7.8025 | 1896 | 0.6689 | 0.2186 | 0.6689 | 0.8179 |
| 0.0692 | 7.8107 | 1898 | 0.6435 | 0.3724 | 0.6435 | 0.8022 |
| 0.0692 | 7.8189 | 1900 | 0.6295 | 0.3724 | 0.6295 | 0.7934 |
| 0.0692 | 7.8272 | 1902 | 0.6151 | 0.3724 | 0.6151 | 0.7843 |
| 0.0692 | 7.8354 | 1904 | 0.6045 | 0.3724 | 0.6045 | 0.7775 |
| 0.0692 | 7.8436 | 1906 | 0.5975 | 0.3724 | 0.5975 | 0.7730 |
| 0.0692 | 7.8519 | 1908 | 0.5937 | 0.3724 | 0.5937 | 0.7705 |
| 0.0692 | 7.8601 | 1910 | 0.5954 | 0.3724 | 0.5954 | 0.7716 |
| 0.0692 | 7.8683 | 1912 | 0.5982 | 0.3724 | 0.5982 | 0.7735 |
| 0.0692 | 7.8765 | 1914 | 0.6021 | 0.3724 | 0.6021 | 0.7759 |
| 0.0692 | 7.8848 | 1916 | 0.6086 | 0.3724 | 0.6086 | 0.7802 |
| 0.0692 | 7.8930 | 1918 | 0.6219 | 0.3724 | 0.6219 | 0.7886 |
| 0.0692 | 7.9012 | 1920 | 0.6473 | 0.3771 | 0.6473 | 0.8046 |
| 0.0692 | 7.9095 | 1922 | 0.6686 | 0.3771 | 0.6686 | 0.8177 |
| 0.0692 | 7.9177 | 1924 | 0.6928 | 0.3811 | 0.6928 | 0.8324 |
| 0.0692 | 7.9259 | 1926 | 0.7079 | 0.4615 | 0.7079 | 0.8413 |
| 0.0692 | 7.9342 | 1928 | 0.7041 | 0.3811 | 0.7041 | 0.8391 |
| 0.0692 | 7.9424 | 1930 | 0.6896 | 0.3811 | 0.6896 | 0.8304 |
| 0.0692 | 7.9506 | 1932 | 0.6779 | 0.3771 | 0.6779 | 0.8234 |
| 0.0692 | 7.9588 | 1934 | 0.6610 | 0.3771 | 0.6610 | 0.8130 |
| 0.0692 | 7.9671 | 1936 | 0.6406 | 0.3724 | 0.6406 | 0.8004 |
| 0.0692 | 7.9753 | 1938 | 0.6288 | 0.3318 | 0.6288 | 0.7930 |
| 0.0692 | 7.9835 | 1940 | 0.6253 | 0.3318 | 0.6253 | 0.7907 |
| 0.0692 | 7.9918 | 1942 | 0.6240 | 0.3318 | 0.6240 | 0.7899 |
| 0.0692 | 8.0 | 1944 | 0.6227 | 0.3318 | 0.6227 | 0.7891 |
| 0.0692 | 8.0082 | 1946 | 0.6165 | 0.3318 | 0.6165 | 0.7852 |
| 0.0692 | 8.0165 | 1948 | 0.6127 | 0.3724 | 0.6127 | 0.7827 |
| 0.0692 | 8.0247 | 1950 | 0.6124 | 0.3724 | 0.6124 | 0.7826 |
| 0.0692 | 8.0329 | 1952 | 0.6120 | 0.3724 | 0.6120 | 0.7823 |
| 0.0692 | 8.0412 | 1954 | 0.6186 | 0.3724 | 0.6186 | 0.7865 |
| 0.0692 | 8.0494 | 1956 | 0.6321 | 0.3724 | 0.6321 | 0.7950 |
| 0.0692 | 8.0576 | 1958 | 0.6434 | 0.3724 | 0.6434 | 0.8021 |
| 0.0692 | 8.0658 | 1960 | 0.6600 | 0.3771 | 0.6600 | 0.8124 |
| 0.0692 | 8.0741 | 1962 | 0.6698 | 0.2355 | 0.6698 | 0.8184 |
| 0.0692 | 8.0823 | 1964 | 0.6623 | 0.3771 | 0.6623 | 0.8138 |
| 0.0692 | 8.0905 | 1966 | 0.6544 | 0.3724 | 0.6544 | 0.8089 |
| 0.0692 | 8.0988 | 1968 | 0.6476 | 0.3724 | 0.6476 | 0.8047 |
| 0.0692 | 8.1070 | 1970 | 0.6421 | 0.3724 | 0.6421 | 0.8013 |
| 0.0692 | 8.1152 | 1972 | 0.6377 | 0.3724 | 0.6377 | 0.7985 |
| 0.0692 | 8.1235 | 1974 | 0.6314 | 0.3318 | 0.6314 | 0.7946 |
| 0.0692 | 8.1317 | 1976 | 0.6277 | 0.3318 | 0.6277 | 0.7923 |
| 0.0692 | 8.1399 | 1978 | 0.6263 | 0.3318 | 0.6263 | 0.7914 |
| 0.0692 | 8.1481 | 1980 | 0.6240 | 0.3318 | 0.6240 | 0.7900 |
| 0.0692 | 8.1564 | 1982 | 0.6229 | 0.3318 | 0.6229 | 0.7892 |
| 0.0692 | 8.1646 | 1984 | 0.6233 | 0.3318 | 0.6233 | 0.7895 |
| 0.0692 | 8.1728 | 1986 | 0.6248 | 0.3318 | 0.6248 | 0.7905 |
| 0.0692 | 8.1811 | 1988 | 0.6294 | 0.3724 | 0.6294 | 0.7934 |
| 0.0692 | 8.1893 | 1990 | 0.6369 | 0.3724 | 0.6369 | 0.7981 |
| 0.0692 | 8.1975 | 1992 | 0.6457 | 0.3724 | 0.6457 | 0.8035 |
| 0.0692 | 8.2058 | 1994 | 0.6554 | 0.2355 | 0.6554 | 0.8096 |
| 0.0692 | 8.2140 | 1996 | 0.6601 | 0.2355 | 0.6601 | 0.8125 |
| 0.0692 | 8.2222 | 1998 | 0.6605 | 0.2355 | 0.6605 | 0.8127 |
| 0.049 | 8.2305 | 2000 | 0.6565 | 0.2355 | 0.6565 | 0.8102 |
| 0.049 | 8.2387 | 2002 | 0.6439 | 0.2355 | 0.6439 | 0.8024 |
| 0.049 | 8.2469 | 2004 | 0.6288 | 0.3724 | 0.6288 | 0.7930 |
| 0.049 | 8.2551 | 2006 | 0.6161 | 0.3724 | 0.6161 | 0.7849 |
| 0.049 | 8.2634 | 2008 | 0.6110 | 0.3724 | 0.6110 | 0.7816 |
| 0.049 | 8.2716 | 2010 | 0.6071 | 0.3318 | 0.6071 | 0.7792 |
| 0.049 | 8.2798 | 2012 | 0.6065 | 0.3318 | 0.6065 | 0.7788 |
| 0.049 | 8.2881 | 2014 | 0.6065 | 0.3318 | 0.6065 | 0.7788 |
| 0.049 | 8.2963 | 2016 | 0.6070 | 0.3318 | 0.6070 | 0.7791 |
| 0.049 | 8.3045 | 2018 | 0.6090 | 0.3318 | 0.6090 | 0.7804 |
| 0.049 | 8.3128 | 2020 | 0.6126 | 0.3724 | 0.6126 | 0.7827 |
| 0.049 | 8.3210 | 2022 | 0.6177 | 0.3724 | 0.6177 | 0.7859 |
| 0.049 | 8.3292 | 2024 | 0.6258 | 0.3724 | 0.6258 | 0.7911 |
| 0.049 | 8.3374 | 2026 | 0.6331 | 0.3724 | 0.6331 | 0.7957 |
| 0.049 | 8.3457 | 2028 | 0.6384 | 0.3724 | 0.6384 | 0.7990 |
| 0.049 | 8.3539 | 2030 | 0.6449 | 0.3724 | 0.6449 | 0.8031 |
| 0.049 | 8.3621 | 2032 | 0.6463 | 0.3724 | 0.6463 | 0.8039 |
| 0.049 | 8.3704 | 2034 | 0.6405 | 0.3724 | 0.6405 | 0.8003 |
| 0.049 | 8.3786 | 2036 | 0.6380 | 0.3724 | 0.6380 | 0.7987 |
| 0.049 | 8.3868 | 2038 | 0.6386 | 0.3724 | 0.6386 | 0.7992 |
| 0.049 | 8.3951 | 2040 | 0.6387 | 0.3724 | 0.6387 | 0.7992 |
| 0.049 | 8.4033 | 2042 | 0.6349 | 0.3724 | 0.6349 | 0.7968 |
| 0.049 | 8.4115 | 2044 | 0.6330 | 0.3724 | 0.6330 | 0.7956 |
| 0.049 | 8.4198 | 2046 | 0.6300 | 0.3724 | 0.6300 | 0.7937 |
| 0.049 | 8.4280 | 2048 | 0.6275 | 0.3724 | 0.6275 | 0.7922 |
| 0.049 | 8.4362 | 2050 | 0.6270 | 0.3724 | 0.6270 | 0.7918 |
| 0.049 | 8.4444 | 2052 | 0.6259 | 0.3724 | 0.6259 | 0.7911 |
| 0.049 | 8.4527 | 2054 | 0.6310 | 0.3724 | 0.6310 | 0.7944 |
| 0.049 | 8.4609 | 2056 | 0.6363 | 0.3771 | 0.6363 | 0.7977 |
| 0.049 | 8.4691 | 2058 | 0.6430 | 0.3771 | 0.6430 | 0.8019 |
| 0.049 | 8.4774 | 2060 | 0.6499 | 0.3771 | 0.6499 | 0.8062 |
| 0.049 | 8.4856 | 2062 | 0.6491 | 0.3771 | 0.6491 | 0.8056 |
| 0.049 | 8.4938 | 2064 | 0.6505 | 0.3771 | 0.6505 | 0.8065 |
| 0.049 | 8.5021 | 2066 | 0.6514 | 0.3771 | 0.6514 | 0.8071 |
| 0.049 | 8.5103 | 2068 | 0.6522 | 0.3771 | 0.6522 | 0.8076 |
| 0.049 | 8.5185 | 2070 | 0.6528 | 0.3771 | 0.6528 | 0.8079 |
| 0.049 | 8.5267 | 2072 | 0.6535 | 0.3771 | 0.6535 | 0.8084 |
| 0.049 | 8.5350 | 2074 | 0.6525 | 0.3771 | 0.6525 | 0.8078 |
| 0.049 | 8.5432 | 2076 | 0.6489 | 0.3771 | 0.6489 | 0.8056 |
| 0.049 | 8.5514 | 2078 | 0.6478 | 0.3771 | 0.6478 | 0.8048 |
| 0.049 | 8.5597 | 2080 | 0.6491 | 0.3771 | 0.6491 | 0.8057 |
| 0.049 | 8.5679 | 2082 | 0.6485 | 0.3771 | 0.6485 | 0.8053 |
| 0.049 | 8.5761 | 2084 | 0.6435 | 0.3771 | 0.6435 | 0.8022 |
| 0.049 | 8.5844 | 2086 | 0.6410 | 0.3771 | 0.6410 | 0.8006 |
| 0.049 | 8.5926 | 2088 | 0.6415 | 0.3771 | 0.6415 | 0.8009 |
| 0.049 | 8.6008 | 2090 | 0.6361 | 0.3771 | 0.6361 | 0.7976 |
| 0.049 | 8.6091 | 2092 | 0.6311 | 0.3724 | 0.6311 | 0.7944 |
| 0.049 | 8.6173 | 2094 | 0.6297 | 0.3724 | 0.6297 | 0.7935 |
| 0.049 | 8.6255 | 2096 | 0.6302 | 0.3724 | 0.6302 | 0.7938 |
| 0.049 | 8.6337 | 2098 | 0.6291 | 0.3724 | 0.6291 | 0.7931 |
| 0.049 | 8.6420 | 2100 | 0.6286 | 0.3771 | 0.6286 | 0.7928 |
| 0.049 | 8.6502 | 2102 | 0.6291 | 0.3771 | 0.6291 | 0.7932 |
| 0.049 | 8.6584 | 2104 | 0.6278 | 0.3771 | 0.6278 | 0.7923 |
| 0.049 | 8.6667 | 2106 | 0.6239 | 0.3771 | 0.6239 | 0.7899 |
| 0.049 | 8.6749 | 2108 | 0.6212 | 0.3724 | 0.6212 | 0.7882 |
| 0.049 | 8.6831 | 2110 | 0.6200 | 0.3724 | 0.6200 | 0.7874 |
| 0.049 | 8.6914 | 2112 | 0.6194 | 0.3724 | 0.6194 | 0.7870 |
| 0.049 | 8.6996 | 2114 | 0.6191 | 0.3724 | 0.6191 | 0.7868 |
| 0.049 | 8.7078 | 2116 | 0.6220 | 0.3724 | 0.6220 | 0.7886 |
| 0.049 | 8.7160 | 2118 | 0.6242 | 0.3724 | 0.6242 | 0.7901 |
| 0.049 | 8.7243 | 2120 | 0.6277 | 0.3771 | 0.6277 | 0.7923 |
| 0.049 | 8.7325 | 2122 | 0.6324 | 0.3771 | 0.6324 | 0.7952 |
| 0.049 | 8.7407 | 2124 | 0.6402 | 0.3771 | 0.6402 | 0.8001 |
| 0.049 | 8.7490 | 2126 | 0.6490 | 0.3771 | 0.6490 | 0.8056 |
| 0.049 | 8.7572 | 2128 | 0.6551 | 0.3771 | 0.6551 | 0.8094 |
| 0.049 | 8.7654 | 2130 | 0.6629 | 0.3811 | 0.6629 | 0.8142 |
| 0.049 | 8.7737 | 2132 | 0.6697 | 0.25 | 0.6697 | 0.8184 |
| 0.049 | 8.7819 | 2134 | 0.6754 | 0.25 | 0.6754 | 0.8218 |
| 0.049 | 8.7901 | 2136 | 0.6786 | 0.25 | 0.6786 | 0.8237 |
| 0.049 | 8.7984 | 2138 | 0.6735 | 0.25 | 0.6735 | 0.8207 |
| 0.049 | 8.8066 | 2140 | 0.6687 | 0.3811 | 0.6687 | 0.8178 |
| 0.049 | 8.8148 | 2142 | 0.6623 | 0.3811 | 0.6623 | 0.8138 |
| 0.049 | 8.8230 | 2144 | 0.6538 | 0.3771 | 0.6538 | 0.8086 |
| 0.049 | 8.8313 | 2146 | 0.6508 | 0.3771 | 0.6508 | 0.8067 |
| 0.049 | 8.8395 | 2148 | 0.6494 | 0.3771 | 0.6494 | 0.8059 |
| 0.049 | 8.8477 | 2150 | 0.6483 | 0.3771 | 0.6483 | 0.8052 |
| 0.049 | 8.8560 | 2152 | 0.6490 | 0.3771 | 0.6490 | 0.8056 |
| 0.049 | 8.8642 | 2154 | 0.6531 | 0.3771 | 0.6531 | 0.8082 |
| 0.049 | 8.8724 | 2156 | 0.6534 | 0.3771 | 0.6534 | 0.8084 |
| 0.049 | 8.8807 | 2158 | 0.6536 | 0.3771 | 0.6536 | 0.8085 |
| 0.049 | 8.8889 | 2160 | 0.6551 | 0.3771 | 0.6551 | 0.8094 |
| 0.049 | 8.8971 | 2162 | 0.6585 | 0.3771 | 0.6585 | 0.8115 |
| 0.049 | 8.9053 | 2164 | 0.6636 | 0.3771 | 0.6636 | 0.8146 |
| 0.049 | 8.9136 | 2166 | 0.6703 | 0.3811 | 0.6703 | 0.8187 |
| 0.049 | 8.9218 | 2168 | 0.6764 | 0.25 | 0.6764 | 0.8224 |
| 0.049 | 8.9300 | 2170 | 0.6813 | 0.25 | 0.6813 | 0.8254 |
| 0.049 | 8.9383 | 2172 | 0.6770 | 0.25 | 0.6770 | 0.8228 |
| 0.049 | 8.9465 | 2174 | 0.6734 | 0.25 | 0.6734 | 0.8206 |
| 0.049 | 8.9547 | 2176 | 0.6623 | 0.3771 | 0.6623 | 0.8138 |
| 0.049 | 8.9630 | 2178 | 0.6515 | 0.3771 | 0.6515 | 0.8071 |
| 0.049 | 8.9712 | 2180 | 0.6446 | 0.3771 | 0.6446 | 0.8029 |
| 0.049 | 8.9794 | 2182 | 0.6384 | 0.3771 | 0.6384 | 0.7990 |
| 0.049 | 8.9877 | 2184 | 0.6327 | 0.3771 | 0.6327 | 0.7954 |
| 0.049 | 8.9959 | 2186 | 0.6299 | 0.3771 | 0.6299 | 0.7937 |
| 0.049 | 9.0041 | 2188 | 0.6287 | 0.3771 | 0.6287 | 0.7929 |
| 0.049 | 9.0123 | 2190 | 0.6276 | 0.3771 | 0.6276 | 0.7922 |
| 0.049 | 9.0206 | 2192 | 0.6292 | 0.3771 | 0.6292 | 0.7932 |
| 0.049 | 9.0288 | 2194 | 0.6322 | 0.3771 | 0.6322 | 0.7951 |
| 0.049 | 9.0370 | 2196 | 0.6348 | 0.3771 | 0.6348 | 0.7967 |
| 0.049 | 9.0453 | 2198 | 0.6341 | 0.3771 | 0.6341 | 0.7963 |
| 0.049 | 9.0535 | 2200 | 0.6337 | 0.3771 | 0.6337 | 0.7960 |
| 0.049 | 9.0617 | 2202 | 0.6330 | 0.3771 | 0.6330 | 0.7956 |
| 0.049 | 9.0700 | 2204 | 0.6337 | 0.3771 | 0.6337 | 0.7960 |
| 0.049 | 9.0782 | 2206 | 0.6353 | 0.3771 | 0.6353 | 0.7970 |
| 0.049 | 9.0864 | 2208 | 0.6354 | 0.3771 | 0.6354 | 0.7971 |
| 0.049 | 9.0947 | 2210 | 0.6340 | 0.3771 | 0.6340 | 0.7962 |
| 0.049 | 9.1029 | 2212 | 0.6323 | 0.3771 | 0.6323 | 0.7952 |
| 0.049 | 9.1111 | 2214 | 0.6307 | 0.3771 | 0.6307 | 0.7942 |
| 0.049 | 9.1193 | 2216 | 0.6310 | 0.3771 | 0.6310 | 0.7944 |
| 0.049 | 9.1276 | 2218 | 0.6330 | 0.3771 | 0.6330 | 0.7956 |
| 0.049 | 9.1358 | 2220 | 0.6362 | 0.3771 | 0.6362 | 0.7976 |
| 0.049 | 9.1440 | 2222 | 0.6371 | 0.3771 | 0.6371 | 0.7982 |
| 0.049 | 9.1523 | 2224 | 0.6375 | 0.2355 | 0.6375 | 0.7984 |
| 0.049 | 9.1605 | 2226 | 0.6362 | 0.2355 | 0.6362 | 0.7976 |
| 0.049 | 9.1687 | 2228 | 0.6317 | 0.3771 | 0.6317 | 0.7948 |
| 0.049 | 9.1770 | 2230 | 0.6271 | 0.3771 | 0.6271 | 0.7919 |
| 0.049 | 9.1852 | 2232 | 0.6244 | 0.3771 | 0.6244 | 0.7902 |
| 0.049 | 9.1934 | 2234 | 0.6226 | 0.3771 | 0.6226 | 0.7891 |
| 0.049 | 9.2016 | 2236 | 0.6229 | 0.3771 | 0.6229 | 0.7893 |
| 0.049 | 9.2099 | 2238 | 0.6242 | 0.3771 | 0.6242 | 0.7901 |
| 0.049 | 9.2181 | 2240 | 0.6268 | 0.3771 | 0.6268 | 0.7917 |
| 0.049 | 9.2263 | 2242 | 0.6308 | 0.2355 | 0.6308 | 0.7942 |
| 0.049 | 9.2346 | 2244 | 0.6343 | 0.2355 | 0.6343 | 0.7964 |
| 0.049 | 9.2428 | 2246 | 0.6362 | 0.2355 | 0.6362 | 0.7977 |
| 0.049 | 9.2510 | 2248 | 0.6356 | 0.3771 | 0.6356 | 0.7973 |
| 0.049 | 9.2593 | 2250 | 0.6348 | 0.3771 | 0.6348 | 0.7967 |
| 0.049 | 9.2675 | 2252 | 0.6353 | 0.3771 | 0.6353 | 0.7971 |
| 0.049 | 9.2757 | 2254 | 0.6365 | 0.3771 | 0.6365 | 0.7978 |
| 0.049 | 9.2840 | 2256 | 0.6386 | 0.3771 | 0.6386 | 0.7991 |
| 0.049 | 9.2922 | 2258 | 0.6411 | 0.3771 | 0.6411 | 0.8007 |
| 0.049 | 9.3004 | 2260 | 0.6432 | 0.3771 | 0.6432 | 0.8020 |
| 0.049 | 9.3086 | 2262 | 0.6466 | 0.3771 | 0.6466 | 0.8041 |
| 0.049 | 9.3169 | 2264 | 0.6496 | 0.2355 | 0.6496 | 0.8060 |
| 0.049 | 9.3251 | 2266 | 0.6533 | 0.2355 | 0.6533 | 0.8083 |
| 0.049 | 9.3333 | 2268 | 0.6567 | 0.2355 | 0.6567 | 0.8104 |
| 0.049 | 9.3416 | 2270 | 0.6590 | 0.2355 | 0.6590 | 0.8118 |
| 0.049 | 9.3498 | 2272 | 0.6598 | 0.2355 | 0.6598 | 0.8123 |
| 0.049 | 9.3580 | 2274 | 0.6620 | 0.2355 | 0.6620 | 0.8136 |
| 0.049 | 9.3663 | 2276 | 0.6626 | 0.2355 | 0.6626 | 0.8140 |
| 0.049 | 9.3745 | 2278 | 0.6644 | 0.2355 | 0.6644 | 0.8151 |
| 0.049 | 9.3827 | 2280 | 0.6670 | 0.2355 | 0.6670 | 0.8167 |
| 0.049 | 9.3909 | 2282 | 0.6688 | 0.2355 | 0.6688 | 0.8178 |
| 0.049 | 9.3992 | 2284 | 0.6715 | 0.2355 | 0.6715 | 0.8195 |
| 0.049 | 9.4074 | 2286 | 0.6731 | 0.2355 | 0.6731 | 0.8204 |
| 0.049 | 9.4156 | 2288 | 0.6725 | 0.2355 | 0.6725 | 0.8200 |
| 0.049 | 9.4239 | 2290 | 0.6693 | 0.2355 | 0.6693 | 0.8181 |
| 0.049 | 9.4321 | 2292 | 0.6665 | 0.2355 | 0.6665 | 0.8164 |
| 0.049 | 9.4403 | 2294 | 0.6625 | 0.2355 | 0.6625 | 0.8140 |
| 0.049 | 9.4486 | 2296 | 0.6584 | 0.2355 | 0.6584 | 0.8114 |
| 0.049 | 9.4568 | 2298 | 0.6572 | 0.2355 | 0.6572 | 0.8107 |
| 0.049 | 9.4650 | 2300 | 0.6573 | 0.2355 | 0.6573 | 0.8108 |
| 0.049 | 9.4733 | 2302 | 0.6573 | 0.2355 | 0.6573 | 0.8107 |
| 0.049 | 9.4815 | 2304 | 0.6582 | 0.2355 | 0.6582 | 0.8113 |
| 0.049 | 9.4897 | 2306 | 0.6596 | 0.2355 | 0.6596 | 0.8122 |
| 0.049 | 9.4979 | 2308 | 0.6603 | 0.2355 | 0.6603 | 0.8126 |
| 0.049 | 9.5062 | 2310 | 0.6614 | 0.2355 | 0.6614 | 0.8133 |
| 0.049 | 9.5144 | 2312 | 0.6614 | 0.2355 | 0.6614 | 0.8133 |
| 0.049 | 9.5226 | 2314 | 0.6610 | 0.2355 | 0.6610 | 0.8130 |
| 0.049 | 9.5309 | 2316 | 0.6613 | 0.2355 | 0.6613 | 0.8132 |
| 0.049 | 9.5391 | 2318 | 0.6608 | 0.2355 | 0.6608 | 0.8129 |
| 0.049 | 9.5473 | 2320 | 0.6601 | 0.2355 | 0.6601 | 0.8124 |
| 0.049 | 9.5556 | 2322 | 0.6588 | 0.2355 | 0.6588 | 0.8117 |
| 0.049 | 9.5638 | 2324 | 0.6564 | 0.2355 | 0.6564 | 0.8102 |
| 0.049 | 9.5720 | 2326 | 0.6540 | 0.2355 | 0.6540 | 0.8087 |
| 0.049 | 9.5802 | 2328 | 0.6524 | 0.2355 | 0.6524 | 0.8077 |
| 0.049 | 9.5885 | 2330 | 0.6524 | 0.2355 | 0.6524 | 0.8077 |
| 0.049 | 9.5967 | 2332 | 0.6513 | 0.3771 | 0.6513 | 0.8070 |
| 0.049 | 9.6049 | 2334 | 0.6504 | 0.3771 | 0.6504 | 0.8065 |
| 0.049 | 9.6132 | 2336 | 0.6489 | 0.3771 | 0.6489 | 0.8055 |
| 0.049 | 9.6214 | 2338 | 0.6471 | 0.3771 | 0.6471 | 0.8044 |
| 0.049 | 9.6296 | 2340 | 0.6455 | 0.3771 | 0.6455 | 0.8034 |
| 0.049 | 9.6379 | 2342 | 0.6439 | 0.3771 | 0.6439 | 0.8024 |
| 0.049 | 9.6461 | 2344 | 0.6426 | 0.3771 | 0.6426 | 0.8016 |
| 0.049 | 9.6543 | 2346 | 0.6421 | 0.3771 | 0.6421 | 0.8013 |
| 0.049 | 9.6626 | 2348 | 0.6419 | 0.3396 | 0.6419 | 0.8012 |
| 0.049 | 9.6708 | 2350 | 0.6421 | 0.3771 | 0.6421 | 0.8013 |
| 0.049 | 9.6790 | 2352 | 0.6414 | 0.3771 | 0.6414 | 0.8009 |
| 0.049 | 9.6872 | 2354 | 0.6410 | 0.3771 | 0.6410 | 0.8006 |
| 0.049 | 9.6955 | 2356 | 0.6405 | 0.3771 | 0.6405 | 0.8003 |
| 0.049 | 9.7037 | 2358 | 0.6402 | 0.3771 | 0.6402 | 0.8001 |
| 0.049 | 9.7119 | 2360 | 0.6400 | 0.3771 | 0.6400 | 0.8000 |
| 0.049 | 9.7202 | 2362 | 0.6402 | 0.3771 | 0.6402 | 0.8001 |
| 0.049 | 9.7284 | 2364 | 0.6404 | 0.3771 | 0.6404 | 0.8003 |
| 0.049 | 9.7366 | 2366 | 0.6406 | 0.3771 | 0.6406 | 0.8004 |
| 0.049 | 9.7449 | 2368 | 0.6408 | 0.3771 | 0.6408 | 0.8005 |
| 0.049 | 9.7531 | 2370 | 0.6414 | 0.3771 | 0.6414 | 0.8008 |
| 0.049 | 9.7613 | 2372 | 0.6420 | 0.3771 | 0.6420 | 0.8013 |
| 0.049 | 9.7695 | 2374 | 0.6425 | 0.3771 | 0.6425 | 0.8016 |
| 0.049 | 9.7778 | 2376 | 0.6432 | 0.3771 | 0.6432 | 0.8020 |
| 0.049 | 9.7860 | 2378 | 0.6438 | 0.3771 | 0.6438 | 0.8024 |
| 0.049 | 9.7942 | 2380 | 0.6444 | 0.3771 | 0.6444 | 0.8027 |
| 0.049 | 9.8025 | 2382 | 0.6450 | 0.3771 | 0.6450 | 0.8031 |
| 0.049 | 9.8107 | 2384 | 0.6448 | 0.3771 | 0.6448 | 0.8030 |
| 0.049 | 9.8189 | 2386 | 0.6444 | 0.3771 | 0.6444 | 0.8027 |
| 0.049 | 9.8272 | 2388 | 0.6439 | 0.3771 | 0.6439 | 0.8025 |
| 0.049 | 9.8354 | 2390 | 0.6433 | 0.3771 | 0.6433 | 0.8021 |
| 0.049 | 9.8436 | 2392 | 0.6433 | 0.3771 | 0.6433 | 0.8020 |
| 0.049 | 9.8519 | 2394 | 0.6430 | 0.3771 | 0.6430 | 0.8019 |
| 0.049 | 9.8601 | 2396 | 0.6428 | 0.3771 | 0.6428 | 0.8018 |
| 0.049 | 9.8683 | 2398 | 0.6428 | 0.3771 | 0.6428 | 0.8018 |
| 0.049 | 9.8765 | 2400 | 0.6432 | 0.3771 | 0.6432 | 0.8020 |
| 0.049 | 9.8848 | 2402 | 0.6432 | 0.3771 | 0.6432 | 0.8020 |
| 0.049 | 9.8930 | 2404 | 0.6433 | 0.3771 | 0.6433 | 0.8020 |
| 0.049 | 9.9012 | 2406 | 0.6436 | 0.3771 | 0.6436 | 0.8023 |
| 0.049 | 9.9095 | 2408 | 0.6437 | 0.3771 | 0.6437 | 0.8023 |
| 0.049 | 9.9177 | 2410 | 0.6438 | 0.3771 | 0.6438 | 0.8023 |
| 0.049 | 9.9259 | 2412 | 0.6436 | 0.3771 | 0.6436 | 0.8023 |
| 0.049 | 9.9342 | 2414 | 0.6437 | 0.3771 | 0.6437 | 0.8023 |
| 0.049 | 9.9424 | 2416 | 0.6437 | 0.3771 | 0.6437 | 0.8023 |
| 0.049 | 9.9506 | 2418 | 0.6439 | 0.3771 | 0.6439 | 0.8024 |
| 0.049 | 9.9588 | 2420 | 0.6439 | 0.3771 | 0.6439 | 0.8024 |
| 0.049 | 9.9671 | 2422 | 0.6439 | 0.3771 | 0.6439 | 0.8024 |
| 0.049 | 9.9753 | 2424 | 0.6438 | 0.3771 | 0.6438 | 0.8024 |
| 0.049 | 9.9835 | 2426 | 0.6438 | 0.3771 | 0.6438 | 0.8023 |
| 0.049 | 9.9918 | 2428 | 0.6437 | 0.3771 | 0.6437 | 0.8023 |
| 0.049 | 10.0 | 2430 | 0.6437 | 0.3771 | 0.6437 | 0.8023 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
DoeyLLM/OneLLM-Doey-ChatQA-V1-Llama-3.2-1B | DoeyLLM | 2024-11-25T07:17:32Z | 89 | 2 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"dataset:nvidia/ChatQA-Training-Data",
"base_model:meta-llama/Llama-3.2-1B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-1B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T05:02:44Z | ---
license: apache-2.0
datasets:
- nvidia/ChatQA-Training-Data
language:
- en
base_model:
- meta-llama/Llama-3.2-1B-Instruct
pipeline_tag: text-generation
library_name: transformers
---
## **Model Summary**
This model is a fine-tuned version of **LLaMA 3.2-1B**, optimized using **LoRA (Low-Rank Adaptation)** on the [NVIDIA ChatQA-Training-Data](https://huggingface.co/datasets/nvidia/ChatQA-Training-Data). It is tailored for conversational AI, question answering, and other instruction-following tasks, with support for sequences up to 1024 tokens.
---
## **Key Features**
- **Base Model**: LLaMA 3.2-1B
- **Fine-Tuning Framework**: LoRA
- **Dataset**: NVIDIA ChatQA-Training-Data
- **Max Sequence Length**: 1024 tokens
- **Use Case**: Instruction-based tasks, question answering, conversational AI.
## **Model Usage**
This fine-tuned model is suitable for:
- **Conversational AI**: Chatbots and dialogue agents with improved contextual understanding.
- **Question Answering**: Generating concise and accurate answers to user queries.
- **Instruction Following**: Responding to structured prompts.
- **Long-Context Tasks**: Processing sequences up to 1024 tokens for long-text reasoning (a tokenizer sketch follows this list).
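The 1024-token budget noted above can be enforced at the tokenizer level. This is a minimal sketch, not the card's own code; the repo id is assumed from this repository's name:
```python
from transformers import AutoTokenizer

# Assumed repo id, taken from this repository's name.
tokenizer = AutoTokenizer.from_pretrained("DoeyLLM/OneLLM-Doey-ChatQA-V1-Llama-3.2-1B")

long_text = "example sentence " * 2000  # placeholder for a long document
# Truncate the input to the 1024-token limit the model was fine-tuned with.
enc = tokenizer(long_text, truncation=True, max_length=1024, return_tensors="pt")
print(enc["input_ids"].shape)  # torch.Size([1, 1024])
```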
# **How to Use DoeyLLM / OneLLM-Doey-V1-Llama-3.2-1B-Instruct**
This guide explains how to use the **DoeyLLM** model on both app (iOS) and PC platforms.
---
## **App: Use with OneLLM**
OneLLM brings versatile large language models (LLMs) to your device: Llama, Gemma, Qwen, Mistral, and more. Enjoy private, offline GPT and AI tools tailored to your needs.
With OneLLM, experience the capabilities of leading-edge language models directly on your device, all without an internet connection. Get fast, reliable, and intelligent responses, while keeping your data secure with local processing.
### **Quick Start for Mobile**

Follow these steps to integrate the **DoeyLLM** model using the OneLLM app:
1. **Download OneLLM**
Get the app from the [App Store](https://apps.apple.com/us/app/onellm-private-ai-gpt-llm/id6737907910) and install it on your iOS device.
Or get the app from the [Play Store](https://play.google.com/store/apps/details?id=com.esotech.onellm) and install it on your Android device.
2. **Load the DoeyLLM Model**
Use the OneLLM interface to load the DoeyLLM model directly into the app:
- Navigate to the **Model Library**.
- Search for `DoeyLLM`.
- Select the model and tap **Download** to store it locally on your device.
3. **Start Conversing**
Once the model is loaded, you can begin interacting with it through the app's chat interface. For example:
- Tap the **Chat** tab.
- Type your question or prompt, such as:
> "Explain the significance of AI in education."
- Receive real-time, intelligent responses generated locally.
### **Key Features of OneLLM**
- **Versatile Models**: Supports various LLMs, including Llama, Gemma, and Qwen.
- **Private & Secure**: All processing occurs locally on your device, ensuring data privacy.
- **Offline Capability**: Use the app without requiring an internet connection.
- **Fast Performance**: Optimized for mobile devices, delivering low-latency responses.
For more details or support, visit the [OneLLM App Store page](https://apps.apple.com/us/app/onellm-private-ai-gpt-llm/id6737907910) and [Play Store](https://play.google.com/store/apps/details?id=com.esotech.onellm).
## **PC: Use with Transformers**
The DoeyLLM model can also be used on PC platforms through the `transformers` library, enabling robust and scalable inference for various NLP tasks.
### **Quick Start for PC**
Follow these steps to use the model with Transformers:
1. **Install Transformers**
Ensure you have `transformers >= 4.43.0` installed. Update or install it via pip:
```bash
pip install --upgrade transformers
```
2. **Load the Model**
Use the transformers library to load the model and tokenizer:
With `transformers >= 4.43.0`, you can run conversational inference either through the Transformers `pipeline` abstraction or with the Auto classes and the `generate()` function; a sketch of the Auto-class path follows the pipeline example below.
```python
import torch
from transformers import pipeline
model_id = "DoeyLLM/OneLLM-Doey-ChatQA-V1-Llama-3.2-1B"
pipe = pipeline(
"text-generation",
model=model_id,
torch_dtype=torch.bfloat16,
device_map="auto",
)
messages = [
{"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
{"role": "user", "content": "Who are you?"},
]
outputs = pipe(
messages,
max_new_tokens=256,
)
print(outputs[0]["generated_text"][-1])
```
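As referenced above, the same conversation can be run with the Auto classes and `generate()` instead of the `pipeline` abstraction. This is a minimal sketch under the same assumptions (bfloat16 weights, repo id taken from this repository's name):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DoeyLLM/OneLLM-Doey-ChatQA-V1-Llama-3.2-1B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Explain the significance of AI in education."},
]
# Render the chat messages into the model's prompt format, then generate.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256)

# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```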
## Responsibility & Safety
As part of our responsible release strategy, we adopted a three-pronged approach to managing trust and safety risks:
- Enable developers to deploy helpful, safe, and flexible experiences for their target audience and for the use cases supported by the model.
- Protect developers from adversarial users attempting to exploit the model's capabilities to potentially cause harm.
- Provide safeguards for the community to help prevent the misuse of the model. |
tl81092/my-drug-model | tl81092 | 2024-11-25T07:17:11Z | 105 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-25T07:16:48Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a ๐ค transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
JosephNguyen/Qwen2.5-7B-Instruct-finetuned | JosephNguyen | 2024-11-25T07:12:02Z | 83 | 1 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/Qwen2.5-7B-Instruct",
"base_model:finetune:unsloth/Qwen2.5-7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-25T06:36:15Z | ---
base_model: unsloth/Qwen2.5-7B-Instruct
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** JosephNguyen
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Qwen2.5-7B-Instruct
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
xuYiLi/vit-base-patch16-224-finetuned-flower | xuYiLi | 2024-11-25T07:09:06Z | 5 | 0 | null | [
"tensorboard",
"safetensors",
"vit",
"image-classification",
"pytorch",
"huggingpics",
"model-index",
"region:us"
] | image-classification | 2024-11-25T07:08:50Z | ---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: vit-base-patch16-224-finetuned-flower
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 0.907216489315033
---
# vit-base-patch16-224-finetuned-flower
Autogenerated by HuggingPics 🤗🖼️
Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).
Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
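For programmatic inference, a minimal sketch with the `transformers` image-classification pipeline (the repo id is assumed from this repository's name, and the image path is a placeholder):
```python
from transformers import pipeline

# Assumed repo id, taken from this repository's name.
classifier = pipeline(
    "image-classification",
    model="xuYiLi/vit-base-patch16-224-finetuned-flower",
)

# Replace with a path or URL to your own flower photo.
for prediction in classifier("my_flower.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```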
## Example Images
#### crape myrtle

#### iris

#### narcissus

#### osmanthus

#### peony
 |
Keltezaa/zooey-deschanel | Keltezaa | 2024-11-25T07:08:38Z | 10 | 0 | diffusers | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"migrated",
"celebrity",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-25T07:08:30Z | ---
license: other
license_name: bespoke-lora-trained-license
license_link: https://multimodal.art/civitai-licenses?allowNoCredit=True&allowCommercialUse=RentCivit&allowDerivatives=True&allowDifferentLicense=False
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
- migrated
- celebrity
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: Zooey Deschanel
widget:
- text: ' '
output:
url: >-
28183142.jpeg
- text: ' '
output:
url: >-
28183139.jpeg
- text: ' '
output:
url: >-
28183138.jpeg
- text: ' '
output:
url: >-
28183140.jpeg
- text: ' '
output:
url: >-
28183141.jpeg
- text: ' '
output:
url: >-
28183143.jpeg
- text: ' '
output:
url: >-
28183144.jpeg
- text: ' '
output:
url: >-
28183145.jpeg
- text: ' '
output:
url: >-
28183146.jpeg
- text: ' '
output:
url: >-
28183147.jpeg
- text: ' '
output:
url: >-
28183148.jpeg
- text: ' '
output:
url: >-
28183150.jpeg
- text: 'portrait of Zooey Deschanel shot on a Hasselblad H3D-39. she is wearing a light summer dress at the beach in Italy'
output:
url: >-
28184076.jpeg
---
# Zooey Deschanel
<Gallery />
## Model description
<p>Trained on 15 Images for 2500 Steps</p>
## Trigger words
You should use `Zooey Deschanel` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/Keltezaa/zooey-deschanel/tree/main) them in the Files & versions tab.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
device = "cuda" if torch.cuda.is_available() else "cpu"
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16).to(device)
pipeline.load_lora_weights('Keltezaa/zooey-deschanel', weight_name='Zooey_Deschanel_v1.safetensors')
image = pipeline('portrait of Zooey Deschanel shot on a Hasselblad H3D-39. she is wearing a light summer dress at the beach in Italy').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
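As a small illustration of the weighting mentioned above, one option in diffusers is to fuse the LoRA into the base weights at a reduced scale. This is a sketch, assuming the `pipeline` object from the snippet above is still in scope:
```py
# Assumes `pipeline` was created and the LoRA loaded as in the snippet above.
pipeline.fuse_lora(lora_scale=0.8)  # bake the LoRA in at 80% strength
image = pipeline('portrait of Zooey Deschanel shot on a Hasselblad H3D-39').images[0]
image.save("zooey_hasselblad.png")
```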
|
ahmedheakl/asm2asm-qwen2.5coder-0.5b-500k-2ep-tokenizer | ahmedheakl | 2024-11-25T07:07:36Z | 264 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-23T15:01:30Z | ---
base_model: Qwen/Qwen2.5-Coder-0.5B-Instruct
library_name: transformers
model_name: asm2asm-qwen2.5coder-0.5b-500k-2ep-tokenizer
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for asm2asm-qwen2.5coder-0.5b-500k-2ep-tokenizer
This model is a fine-tuned version of [Qwen/Qwen2.5-Coder-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-0.5B-Instruct).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="ahmedheakl/asm2asm-qwen2.5coder-0.5b-500k-2ep-tokenizer", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/ahmed-heakl/huggingface/runs/eppp58b2)
This model was trained with SFT.
### Framework versions
- TRL: 0.12.1
- Transformers: 4.46.3
- Pytorch: 2.5.1+cu124
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
chy2207/robot_care_8b_ver4 | chy2207 | 2024-11-25T07:06:28Z | 9 | 0 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"base_model:quantized:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-25T06:50:34Z | ---
base_model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** chy2207
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|