| modelId (string, 5-139 chars) | author (string, 2-42 chars) | last_modified (timestamp[us, tz=UTC], 2020-02-15 11:33:14 to 2025-06-26 18:27:55) | downloads (int64, 0 to 223M) | likes (int64, 0 to 11.7k) | library_name (499 classes) | tags (sequence, 1 to 4.05k items) | pipeline_tag (54 classes) | createdAt (timestamp[us, tz=UTC], 2022-03-02 23:29:04 to 2025-06-26 18:27:32) | card (string, 11 chars to 1.01M chars) |
---|---|---|---|---|---|---|---|---|---|
ljcamargo/tachiwin_translate | ljcamargo | 2024-11-24T09:45:51Z | 6 | 0 | transformers | ["transformers", "safetensors", "gguf", "unsloth", "translation", "text2text-generation", "dataset:ljcamargo/tachiwin_translate", "arxiv:1910.09700", "endpoints_compatible", "region:us"] | text2text-generation | 2024-10-20T00:50:52Z | ---
pipeline_tag: text2text-generation
library_name: transformers
tags:
- unsloth
- translation
datasets:
- ljcamargo/tachiwin_translate
---
# Model Card for ljcamargo/tachiwin_translate
Tachiwin Totonaku
Totonac-Spanish and Spanish-Totonac translation with Llama 3.1 8B-Instruct fine-tuning (with a Vicuna model)
## Model Details
### Model Description
Totonac-Spanish and Spanish-Totonac translation with Llama 3.1 8B-Instruct fine-tuning (with a Vicuna model).
- **Developed by:** Luis J Camargo
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
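In the absence of author-provided code, a minimal sketch based on the card metadata (transformers, `text2text-generation`); the prompt format below is a guess, not documented usage.

```python
from transformers import pipeline

# Hedged sketch: assumes the standard text2text-generation pipeline applies,
# per the repo tags; the prompt format is illustrative only.
translator = pipeline("text2text-generation", model="ljcamargo/tachiwin_translate")
print(translator("Translate to Totonac: buenos días"))
```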
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
PrunaAI/lsw0570168-krx-q25-7b-base-v3.4-bnb-8bit-smashed | PrunaAI | 2024-11-24T09:42:06Z | 5 | 0 | null | ["safetensors", "qwen2", "pruna-ai", "base_model:lsw0570168/krx-q25-7b-base-v3.4", "base_model:quantized:lsw0570168/krx-q25-7b-base-v3.4", "8-bit", "bitsandbytes", "region:us"] | null | 2024-11-24T09:33:15Z | ---
thumbnail: "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"
base_model: lsw0570168/krx-q25-7b-base-v3.4
metrics:
- memory_disk
- memory_inference
- inference_latency
- inference_throughput
- inference_CO2_emissions
- inference_energy_consumption
tags:
- pruna-ai
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://docs.pruna.ai/en/latest/setup/pip.html" target="_blank" rel="noopener noreferrer">
<img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[Twitter](https://twitter.com/PrunaAI)
[GitHub](https://github.com/PrunaAI)
[LinkedIn](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[Discord](https://discord.gg/rskEr4BZJx)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentation to learn more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/).
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results
**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed with llm-int8.
- ***How does the model quality change?*** The quality of the model output might vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained with the configuration described in `model/smash_config.json`, after a hardware warmup. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g., other hardware, image size, batch size, ...). We recommend running the evaluation directly in your use-case conditions to see whether the smashed model benefits you.
- ***What is the model format?*** We use safetensors.
- ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption that is less than 90% of the original base model's.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due cuda overheads.
- ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by syncing all GPU processes and stop measurement when all of them are executed. "Async" metrics are obtained without syncing all GPU processes and stop when the model output can be used by the CPU. We provide both metrics since both could be relevant depending on the use-case. We recommend to test the efficiency gains directly in your use-cases.
## Setup
You can run the smashed model with these steps:
0. Check that the requirements of the original repo lsw0570168/krx-q25-7b-base-v3.4 are installed. In particular, check the Python, CUDA, and transformers versions.
1. Make sure that you have installed quantization related packages.
```bash
pip install transformers accelerate 'bitsandbytes>0.37.0'
```
2. Load & run the model.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the 8-bit smashed model (device_map="auto" places it on available GPUs)
# and the tokenizer of the original base model.
model = AutoModelForCausalLM.from_pretrained("PrunaAI/lsw0570168-krx-q25-7b-base-v3.4-bnb-8bit-smashed", trust_remote_code=True, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("lsw0570168/krx-q25-7b-base-v3.4")

input_ids = tokenizer("What is the color of prunes?", return_tensors="pt").to(model.device)["input_ids"]
outputs = model.generate(input_ids, max_new_tokens=216)
print(tokenizer.decode(outputs[0]))
```
## Configurations
The configuration info is in `smash_config.json`.
## Credits & License
The license of the smashed model follows the license of the original model. Please check the license of the original model lsw0570168/krx-q25-7b-base-v3.4, which provided the base model, before using this model. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on PyPI.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Do it yourself [here](https://docs.pruna.ai/en/latest/setup/pip.html). |
PotatoB/task_2-1 | PotatoB | 2024-11-24T09:39:44Z | 5 | 0 | null | ["safetensors", "mistral", "merge", "mergekit", "Epiculous/Mika-7B", "potatoB/task_1-1", "license:apache-2.0", "region:us"] | null | 2024-11-24T09:36:06Z | ---
license: apache-2.0
tags:
- merge
- mergekit
- Epiculous/Mika-7B
- potatoB/task_1-1
---
# task_2-1
task_2-1 is a merged model generated for Model Kinship experiments, originating from:
* [Epiculous/Mika-7B](https://huggingface.co/Epiculous/Mika-7B)
* [potatoB/task_1-1](https://huggingface.co/potatoB/task_1-1)
## 🧩 Configuration
```yaml
slices:
- sources:
- model: Epiculous/Mika-7B
layer_range: [0, 32]
- model: potatoB/task_1-1
layer_range: [0, 32]
merge_method: slerp
base_model: Epiculous/Mika-7B
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: float16
``` |
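A minimal sketch of applying this configuration with mergekit's CLI (assuming the YAML above is saved locally as `config.yml`; the output directory is a placeholder):

```bash
pip install mergekit
# Run the SLERP merge described above; --cuda uses a GPU if available.
mergekit-yaml config.yml ./task_2-1 --cuda
```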
isspek/xlnet-base-cased_non_epidemic_global_warning_1_2e-5_16 | isspek | 2024-11-24T09:19:23Z | 118 | 0 | transformers | ["transformers", "safetensors", "xlnet", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-24T09:19:09Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
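In the absence of author-provided code, a minimal sketch based on the repo tags (transformers, xlnet, text-classification); the example input is illustrative only.

```python
from transformers import pipeline

# Hedged sketch: assumes the standard text-classification pipeline applies to this checkpoint.
clf = pipeline("text-classification", model="isspek/xlnet-base-cased_non_epidemic_global_warning_1_2e-5_16")
print(clf("Global temperatures have risen sharply over the past decade."))
```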
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/xlnet-base-cased_non_epidemic_global_warning_5_2e-5_16 | isspek | 2024-11-24T09:18:44Z | 118 | 0 | transformers | ["transformers", "safetensors", "xlnet", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-24T09:18:28Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
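As with the companion checkpoint above, a minimal sketch based on the repo tags (transformers, xlnet, text-classification); the example input is illustrative only.

```python
from transformers import pipeline

# Hedged sketch: assumes the standard text-classification pipeline applies to this checkpoint.
clf = pipeline("text-classification", model="isspek/xlnet-base-cased_non_epidemic_global_warning_5_2e-5_16")
print(clf("Global temperatures have risen sharply over the past decade."))
```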
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k20_task1_organization_fold1 | MayBashendy | 2024-11-24T09:18:25Z | 163 | 0 | transformers | ["transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:aubmindlab/bert-base-arabertv02", "base_model:finetune:aubmindlab/bert-base-arabertv02", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2024-11-18T20:21:51Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k20_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k20_task1_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4391
- Qwk: 0.8316
- Mse: 0.4391
- Rmse: 0.6626
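For reference, Qwk here is quadratic weighted kappa and Rmse is the square root of Mse; a minimal sketch of computing these metrics (assuming scikit-learn, with dummy labels):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 1, 0]  # dummy gold labels
y_pred = [0, 2, 2, 1, 1]  # dummy predictions
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(qwk, mse, rmse)
```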
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
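A minimal sketch of these settings as `TrainingArguments` (assuming standard 🤗 Trainer usage; the output directory is a placeholder):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; the Adam betas/epsilon shown are the defaults.
args = TrainingArguments(
    output_dir="./arabert_fold1",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
# Pass `args` to transformers.Trainer together with the model and datasets.
```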
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0096 | 2 | 3.6665 | -0.0544 | 3.6665 | 1.9148 |
| No log | 0.0191 | 4 | 2.4572 | 0.0417 | 2.4572 | 1.5675 |
| No log | 0.0287 | 6 | 1.6470 | -0.0070 | 1.6470 | 1.2833 |
| No log | 0.0383 | 8 | 1.3140 | 0.0239 | 1.3140 | 1.1463 |
| No log | 0.0478 | 10 | 1.5066 | 0.2905 | 1.5066 | 1.2274 |
| No log | 0.0574 | 12 | 1.2168 | 0.2891 | 1.2168 | 1.1031 |
| No log | 0.0670 | 14 | 0.8565 | 0.4251 | 0.8565 | 0.9255 |
| No log | 0.0766 | 16 | 0.8723 | 0.4667 | 0.8723 | 0.9340 |
| No log | 0.0861 | 18 | 1.3255 | 0.2905 | 1.3255 | 1.1513 |
| No log | 0.0957 | 20 | 1.4608 | 0.2468 | 1.4608 | 1.2086 |
| No log | 0.1053 | 22 | 1.2138 | 0.3728 | 1.2138 | 1.1017 |
| No log | 0.1148 | 24 | 0.9893 | 0.3475 | 0.9893 | 0.9946 |
| No log | 0.1244 | 26 | 0.8721 | 0.2143 | 0.8721 | 0.9339 |
| No log | 0.1340 | 28 | 0.7394 | 0.4385 | 0.7394 | 0.8599 |
| No log | 0.1435 | 30 | 0.7840 | 0.3109 | 0.7840 | 0.8854 |
| No log | 0.1531 | 32 | 0.9961 | 0.3849 | 0.9961 | 0.9980 |
| No log | 0.1627 | 34 | 1.1680 | 0.3226 | 1.1680 | 1.0808 |
| No log | 0.1722 | 36 | 1.0384 | 0.3243 | 1.0384 | 1.0190 |
| No log | 0.1818 | 38 | 0.7390 | 0.3571 | 0.7390 | 0.8597 |
| No log | 0.1914 | 40 | 0.6932 | 0.3571 | 0.6932 | 0.8326 |
| No log | 0.2010 | 42 | 0.6851 | 0.4560 | 0.6851 | 0.8277 |
| No log | 0.2105 | 44 | 0.6125 | 0.5333 | 0.6125 | 0.7826 |
| No log | 0.2201 | 46 | 0.6220 | 0.5333 | 0.6220 | 0.7887 |
| No log | 0.2297 | 48 | 0.7069 | 0.4 | 0.7069 | 0.8408 |
| No log | 0.2392 | 50 | 0.8510 | 0.2648 | 0.8510 | 0.9225 |
| No log | 0.2488 | 52 | 1.1309 | 0.2530 | 1.1309 | 1.0635 |
| No log | 0.2584 | 54 | 1.0000 | 0.3090 | 1.0000 | 1.0000 |
| No log | 0.2679 | 56 | 0.9855 | 0.3691 | 0.9855 | 0.9927 |
| No log | 0.2775 | 58 | 1.5821 | 0.1310 | 1.5821 | 1.2578 |
| No log | 0.2871 | 60 | 2.1881 | -0.0182 | 2.1881 | 1.4792 |
| No log | 0.2967 | 62 | 1.6145 | 0.1310 | 1.6145 | 1.2706 |
| No log | 0.3062 | 64 | 1.2122 | 0.1600 | 1.2122 | 1.1010 |
| No log | 0.3158 | 66 | 0.6779 | 0.4385 | 0.6779 | 0.8234 |
| No log | 0.3254 | 68 | 0.4819 | 0.4674 | 0.4819 | 0.6942 |
| No log | 0.3349 | 70 | 0.5225 | 0.4674 | 0.5225 | 0.7228 |
| No log | 0.3445 | 72 | 0.4158 | 0.6500 | 0.4158 | 0.6448 |
| No log | 0.3541 | 74 | 0.3763 | 0.6500 | 0.3763 | 0.6135 |
| No log | 0.3636 | 76 | 0.4329 | 0.6608 | 0.4329 | 0.6579 |
| No log | 0.3732 | 78 | 0.5203 | 0.4667 | 0.5203 | 0.7214 |
| No log | 0.3828 | 80 | 0.4934 | 0.6441 | 0.4934 | 0.7024 |
| No log | 0.3923 | 82 | 0.6029 | 0.6016 | 0.6029 | 0.7764 |
| No log | 0.4019 | 84 | 0.5882 | 0.4425 | 0.5882 | 0.7670 |
| No log | 0.4115 | 86 | 0.5890 | 0.5702 | 0.5890 | 0.7674 |
| No log | 0.4211 | 88 | 0.6108 | 0.5130 | 0.6108 | 0.7816 |
| No log | 0.4306 | 90 | 0.7875 | 0.3558 | 0.7875 | 0.8874 |
| No log | 0.4402 | 92 | 0.7986 | 0.3558 | 0.7986 | 0.8937 |
| No log | 0.4498 | 94 | 0.5665 | 0.5205 | 0.5665 | 0.7527 |
| No log | 0.4593 | 96 | 0.6635 | 0.5643 | 0.6635 | 0.8146 |
| No log | 0.4689 | 98 | 0.6849 | 0.5643 | 0.6849 | 0.8276 |
| No log | 0.4785 | 100 | 0.5468 | 0.5664 | 0.5468 | 0.7394 |
| No log | 0.4880 | 102 | 0.6649 | 0.3558 | 0.6649 | 0.8154 |
| No log | 0.4976 | 104 | 0.7633 | 0.2410 | 0.7633 | 0.8737 |
| No log | 0.5072 | 106 | 0.6918 | 0.2410 | 0.6918 | 0.8317 |
| No log | 0.5167 | 108 | 0.6458 | 0.3226 | 0.6458 | 0.8036 |
| No log | 0.5263 | 110 | 0.6794 | 0.2687 | 0.6794 | 0.8243 |
| No log | 0.5359 | 112 | 0.6725 | 0.3636 | 0.6725 | 0.8201 |
| No log | 0.5455 | 114 | 0.6568 | 0.3636 | 0.6568 | 0.8104 |
| No log | 0.5550 | 116 | 0.6403 | 0.2391 | 0.6403 | 0.8002 |
| No log | 0.5646 | 118 | 0.6162 | 0.2921 | 0.6162 | 0.7850 |
| No log | 0.5742 | 120 | 0.6131 | 0.3200 | 0.6131 | 0.7830 |
| No log | 0.5837 | 122 | 0.6084 | 0.2674 | 0.6084 | 0.7800 |
| No log | 0.5933 | 124 | 0.5921 | 0.3200 | 0.5921 | 0.7695 |
| No log | 0.6029 | 126 | 0.5929 | 0.3253 | 0.5929 | 0.7700 |
| No log | 0.6124 | 128 | 0.5771 | 0.3488 | 0.5771 | 0.7597 |
| No log | 0.6220 | 130 | 0.5675 | 0.4556 | 0.5675 | 0.7533 |
| No log | 0.6316 | 132 | 0.5478 | 0.4787 | 0.5478 | 0.7401 |
| No log | 0.6411 | 134 | 0.5536 | 0.4724 | 0.5536 | 0.7441 |
| No log | 0.6507 | 136 | 0.5067 | 0.5962 | 0.5067 | 0.7119 |
| No log | 0.6603 | 138 | 0.4941 | 0.5381 | 0.4941 | 0.7029 |
| No log | 0.6699 | 140 | 0.5286 | 0.5249 | 0.5286 | 0.7270 |
| No log | 0.6794 | 142 | 0.5235 | 0.5776 | 0.5235 | 0.7236 |
| No log | 0.6890 | 144 | 0.5287 | 0.5776 | 0.5287 | 0.7271 |
| No log | 0.6986 | 146 | 0.5240 | 0.6255 | 0.5240 | 0.7239 |
| No log | 0.7081 | 148 | 0.4126 | 0.5288 | 0.4126 | 0.6423 |
| No log | 0.7177 | 150 | 0.3775 | 0.6698 | 0.3775 | 0.6144 |
| No log | 0.7273 | 152 | 0.4300 | 0.6723 | 0.4300 | 0.6558 |
| No log | 0.7368 | 154 | 0.3847 | 0.6778 | 0.3847 | 0.6202 |
| No log | 0.7464 | 156 | 0.4978 | 0.7219 | 0.4978 | 0.7055 |
| No log | 0.7560 | 158 | 0.5639 | 0.7425 | 0.5639 | 0.7509 |
| No log | 0.7656 | 160 | 0.5676 | 0.7083 | 0.5676 | 0.7534 |
| No log | 0.7751 | 162 | 0.5555 | 0.6873 | 0.5555 | 0.7453 |
| No log | 0.7847 | 164 | 0.5637 | 0.6873 | 0.5637 | 0.7508 |
| No log | 0.7943 | 166 | 0.5359 | 0.72 | 0.5359 | 0.7321 |
| No log | 0.8038 | 168 | 0.5489 | 0.6997 | 0.5489 | 0.7409 |
| No log | 0.8134 | 170 | 0.6411 | 0.6020 | 0.6411 | 0.8007 |
| No log | 0.8230 | 172 | 0.7255 | 0.6580 | 0.7255 | 0.8517 |
| No log | 0.8325 | 174 | 0.6575 | 0.6645 | 0.6575 | 0.8108 |
| No log | 0.8421 | 176 | 0.5827 | 0.6873 | 0.5827 | 0.7633 |
| No log | 0.8517 | 178 | 0.4544 | 0.6873 | 0.4544 | 0.6741 |
| No log | 0.8612 | 180 | 0.3936 | 0.776 | 0.3936 | 0.6274 |
| No log | 0.8708 | 182 | 0.3782 | 0.7449 | 0.3782 | 0.6150 |
| No log | 0.8804 | 184 | 0.3779 | 0.7034 | 0.3779 | 0.6148 |
| No log | 0.8900 | 186 | 0.4024 | 0.7449 | 0.4024 | 0.6344 |
| No log | 0.8995 | 188 | 0.3940 | 0.7449 | 0.3940 | 0.6277 |
| No log | 0.9091 | 190 | 0.3976 | 0.7449 | 0.3976 | 0.6305 |
| No log | 0.9187 | 192 | 0.4552 | 0.6873 | 0.4552 | 0.6747 |
| No log | 0.9282 | 194 | 0.4847 | 0.6873 | 0.4847 | 0.6962 |
| No log | 0.9378 | 196 | 0.4549 | 0.6873 | 0.4549 | 0.6745 |
| No log | 0.9474 | 198 | 0.4293 | 0.7279 | 0.4293 | 0.6552 |
| No log | 0.9569 | 200 | 0.4600 | 0.7279 | 0.4600 | 0.6782 |
| No log | 0.9665 | 202 | 0.5257 | 0.6291 | 0.5257 | 0.7251 |
| No log | 0.9761 | 204 | 0.6252 | 0.6839 | 0.6252 | 0.7907 |
| No log | 0.9856 | 206 | 0.6186 | 0.6580 | 0.6186 | 0.7865 |
| No log | 0.9952 | 208 | 0.6291 | 0.6580 | 0.6291 | 0.7931 |
| No log | 1.0048 | 210 | 0.5000 | 0.6807 | 0.5000 | 0.7071 |
| No log | 1.0144 | 212 | 0.3870 | 0.7138 | 0.3870 | 0.6221 |
| No log | 1.0239 | 214 | 0.3731 | 0.6452 | 0.3731 | 0.6108 |
| No log | 1.0335 | 216 | 0.3765 | 0.5922 | 0.3765 | 0.6136 |
| No log | 1.0431 | 218 | 0.3687 | 0.6986 | 0.3687 | 0.6072 |
| No log | 1.0526 | 220 | 0.3741 | 0.6986 | 0.3741 | 0.6116 |
| No log | 1.0622 | 222 | 0.3709 | 0.6986 | 0.3709 | 0.6090 |
| No log | 1.0718 | 224 | 0.3588 | 0.6986 | 0.3588 | 0.5990 |
| No log | 1.0813 | 226 | 0.3737 | 0.7175 | 0.3737 | 0.6113 |
| No log | 1.0909 | 228 | 0.3864 | 0.7429 | 0.3864 | 0.6216 |
| No log | 1.1005 | 230 | 0.4024 | 0.7619 | 0.4024 | 0.6343 |
| No log | 1.1100 | 232 | 0.4751 | 0.7219 | 0.4751 | 0.6892 |
| No log | 1.1196 | 234 | 0.4691 | 0.72 | 0.4691 | 0.6849 |
| No log | 1.1292 | 236 | 0.5156 | 0.72 | 0.5156 | 0.7181 |
| No log | 1.1388 | 238 | 0.5597 | 0.7219 | 0.5597 | 0.7481 |
| No log | 1.1483 | 240 | 0.6188 | 0.6839 | 0.6188 | 0.7866 |
| No log | 1.1579 | 242 | 0.5404 | 0.7425 | 0.5404 | 0.7351 |
| No log | 1.1675 | 244 | 0.3975 | 0.7050 | 0.3975 | 0.6305 |
| No log | 1.1770 | 246 | 0.4210 | 0.5377 | 0.4210 | 0.6488 |
| No log | 1.1866 | 248 | 0.4784 | 0.6147 | 0.4784 | 0.6916 |
| No log | 1.1962 | 250 | 0.4478 | 0.4896 | 0.4478 | 0.6692 |
| No log | 1.2057 | 252 | 0.4798 | 0.4096 | 0.4798 | 0.6927 |
| No log | 1.2153 | 254 | 0.5284 | 0.3558 | 0.5284 | 0.7269 |
| No log | 1.2249 | 256 | 0.5181 | 0.3558 | 0.5181 | 0.7198 |
| No log | 1.2344 | 258 | 0.4523 | 0.3787 | 0.4523 | 0.6725 |
| No log | 1.2440 | 260 | 0.4184 | 0.48 | 0.4184 | 0.6468 |
| No log | 1.2536 | 262 | 0.4905 | 0.6789 | 0.4905 | 0.7004 |
| No log | 1.2632 | 264 | 0.4982 | 0.6516 | 0.4982 | 0.7058 |
| No log | 1.2727 | 266 | 0.4223 | 0.7070 | 0.4223 | 0.6499 |
| No log | 1.2823 | 268 | 0.3567 | 0.6986 | 0.3567 | 0.5972 |
| No log | 1.2919 | 270 | 0.3188 | 0.6986 | 0.3188 | 0.5646 |
| No log | 1.3014 | 272 | 0.3273 | 0.7522 | 0.3273 | 0.5721 |
| No log | 1.3110 | 274 | 0.3629 | 0.7464 | 0.3629 | 0.6024 |
| No log | 1.3206 | 276 | 0.4270 | 0.7508 | 0.4270 | 0.6535 |
| No log | 1.3301 | 278 | 0.5002 | 0.6936 | 0.5002 | 0.7072 |
| No log | 1.3397 | 280 | 0.5503 | 0.6216 | 0.5503 | 0.7418 |
| No log | 1.3493 | 282 | 0.6527 | 0.6416 | 0.6527 | 0.8079 |
| No log | 1.3589 | 284 | 0.6653 | 0.4 | 0.6653 | 0.8157 |
| No log | 1.3684 | 286 | 0.5580 | 0.5249 | 0.5580 | 0.7470 |
| No log | 1.3780 | 288 | 0.4169 | 0.6667 | 0.4169 | 0.6457 |
| No log | 1.3876 | 290 | 0.4402 | 0.6857 | 0.4402 | 0.6635 |
| No log | 1.3971 | 292 | 0.5077 | 0.7004 | 0.5077 | 0.7125 |
| No log | 1.4067 | 294 | 0.4489 | 0.6547 | 0.4489 | 0.6700 |
| No log | 1.4163 | 296 | 0.3858 | 0.5847 | 0.3858 | 0.6211 |
| No log | 1.4258 | 298 | 0.5326 | 0.5817 | 0.5326 | 0.7298 |
| No log | 1.4354 | 300 | 0.7077 | 0.5946 | 0.7077 | 0.8412 |
| No log | 1.4450 | 302 | 0.7031 | 0.5817 | 0.7031 | 0.8385 |
| No log | 1.4545 | 304 | 0.5647 | 0.5817 | 0.5647 | 0.7515 |
| No log | 1.4641 | 306 | 0.4081 | 0.6908 | 0.4081 | 0.6388 |
| No log | 1.4737 | 308 | 0.3675 | 0.7107 | 0.3675 | 0.6062 |
| No log | 1.4833 | 310 | 0.3813 | 0.6818 | 0.3813 | 0.6175 |
| No log | 1.4928 | 312 | 0.3595 | 0.7364 | 0.3595 | 0.5995 |
| No log | 1.5024 | 314 | 0.3812 | 0.7059 | 0.3812 | 0.6174 |
| No log | 1.5120 | 316 | 0.4524 | 0.6379 | 0.4524 | 0.6726 |
| No log | 1.5215 | 318 | 0.4668 | 0.5604 | 0.4668 | 0.6832 |
| No log | 1.5311 | 320 | 0.4270 | 0.6 | 0.4270 | 0.6535 |
| No log | 1.5407 | 322 | 0.3737 | 0.6348 | 0.3737 | 0.6113 |
| No log | 1.5502 | 324 | 0.3771 | 0.7287 | 0.3771 | 0.6141 |
| No log | 1.5598 | 326 | 0.3855 | 0.7986 | 0.3855 | 0.6209 |
| No log | 1.5694 | 328 | 0.3970 | 0.7820 | 0.3970 | 0.6301 |
| No log | 1.5789 | 330 | 0.4176 | 0.7820 | 0.4176 | 0.6462 |
| No log | 1.5885 | 332 | 0.4709 | 0.7138 | 0.4709 | 0.6862 |
| No log | 1.5981 | 334 | 0.6378 | 0.6216 | 0.6378 | 0.7986 |
| No log | 1.6077 | 336 | 0.7760 | 0.6047 | 0.7760 | 0.8809 |
| No log | 1.6172 | 338 | 0.7239 | 0.6416 | 0.7239 | 0.8508 |
| No log | 1.6268 | 340 | 0.6111 | 0.6957 | 0.6111 | 0.7817 |
| No log | 1.6364 | 342 | 0.5092 | 0.7407 | 0.5092 | 0.7136 |
| No log | 1.6459 | 344 | 0.4105 | 0.7986 | 0.4105 | 0.6407 |
| No log | 1.6555 | 346 | 0.3990 | 0.7986 | 0.3990 | 0.6317 |
| No log | 1.6651 | 348 | 0.3956 | 0.7986 | 0.3956 | 0.6290 |
| No log | 1.6746 | 350 | 0.4424 | 0.7220 | 0.4424 | 0.6651 |
| No log | 1.6842 | 352 | 0.4457 | 0.7220 | 0.4457 | 0.6676 |
| No log | 1.6938 | 354 | 0.4044 | 0.7368 | 0.4044 | 0.6359 |
| No log | 1.7033 | 356 | 0.3648 | 0.7529 | 0.3648 | 0.6040 |
| No log | 1.7129 | 358 | 0.3748 | 0.7368 | 0.3748 | 0.6122 |
| No log | 1.7225 | 360 | 0.4157 | 0.7368 | 0.4157 | 0.6447 |
| No log | 1.7321 | 362 | 0.5471 | 0.6807 | 0.5471 | 0.7397 |
| No log | 1.7416 | 364 | 0.4860 | 0.6807 | 0.4860 | 0.6971 |
| No log | 1.7512 | 366 | 0.3931 | 0.7368 | 0.3931 | 0.6270 |
| No log | 1.7608 | 368 | 0.3669 | 0.7368 | 0.3669 | 0.6057 |
| No log | 1.7703 | 370 | 0.4175 | 0.7445 | 0.4175 | 0.6462 |
| No log | 1.7799 | 372 | 0.5241 | 0.6738 | 0.5241 | 0.7239 |
| No log | 1.7895 | 374 | 0.4834 | 0.6642 | 0.4834 | 0.6953 |
| No log | 1.7990 | 376 | 0.3811 | 0.8095 | 0.3811 | 0.6173 |
| No log | 1.8086 | 378 | 0.3212 | 0.7941 | 0.3212 | 0.5668 |
| No log | 1.8182 | 380 | 0.3213 | 0.7941 | 0.3213 | 0.5668 |
| No log | 1.8278 | 382 | 0.3444 | 0.7529 | 0.3444 | 0.5869 |
| No log | 1.8373 | 384 | 0.3517 | 0.7529 | 0.3517 | 0.5931 |
| No log | 1.8469 | 386 | 0.3187 | 0.7131 | 0.3187 | 0.5645 |
| No log | 1.8565 | 388 | 0.3173 | 0.7477 | 0.3173 | 0.5633 |
| No log | 1.8660 | 390 | 0.3244 | 0.7477 | 0.3244 | 0.5696 |
| No log | 1.8756 | 392 | 0.3364 | 0.7477 | 0.3364 | 0.5800 |
| No log | 1.8852 | 394 | 0.3647 | 0.6667 | 0.3647 | 0.6039 |
| No log | 1.8947 | 396 | 0.4211 | 0.6769 | 0.4211 | 0.6489 |
| No log | 1.9043 | 398 | 0.4336 | 0.7219 | 0.4336 | 0.6585 |
| No log | 1.9139 | 400 | 0.4473 | 0.7219 | 0.4473 | 0.6688 |
| No log | 1.9234 | 402 | 0.4757 | 0.7219 | 0.4757 | 0.6897 |
| No log | 1.9330 | 404 | 0.4750 | 0.7425 | 0.4750 | 0.6892 |
| No log | 1.9426 | 406 | 0.4400 | 0.7425 | 0.4400 | 0.6633 |
| No log | 1.9522 | 408 | 0.3817 | 0.7138 | 0.3817 | 0.6178 |
| No log | 1.9617 | 410 | 0.3820 | 0.6744 | 0.3820 | 0.6180 |
| No log | 1.9713 | 412 | 0.3969 | 0.6842 | 0.3969 | 0.6300 |
| No log | 1.9809 | 414 | 0.4346 | 0.6980 | 0.4346 | 0.6592 |
| No log | 1.9904 | 416 | 0.4344 | 0.6980 | 0.4344 | 0.6591 |
| No log | 2.0 | 418 | 0.3912 | 0.7131 | 0.3912 | 0.6254 |
| No log | 2.0096 | 420 | 0.3655 | 0.8016 | 0.3655 | 0.6046 |
| No log | 2.0191 | 422 | 0.3495 | 0.776 | 0.3495 | 0.5912 |
| No log | 2.0287 | 424 | 0.3411 | 0.7941 | 0.3411 | 0.5841 |
| No log | 2.0383 | 426 | 0.3219 | 0.7805 | 0.3219 | 0.5673 |
| No log | 2.0478 | 428 | 0.3397 | 0.7697 | 0.3397 | 0.5828 |
| No log | 2.0574 | 430 | 0.3325 | 0.7697 | 0.3325 | 0.5767 |
| No log | 2.0670 | 432 | 0.3132 | 0.7603 | 0.3132 | 0.5597 |
| No log | 2.0766 | 434 | 0.3054 | 0.7709 | 0.3054 | 0.5526 |
| No log | 2.0861 | 436 | 0.3025 | 0.7941 | 0.3025 | 0.5500 |
| No log | 2.0957 | 438 | 0.2876 | 0.7941 | 0.2876 | 0.5363 |
| No log | 2.1053 | 440 | 0.2898 | 0.7941 | 0.2898 | 0.5383 |
| No log | 2.1148 | 442 | 0.3179 | 0.8178 | 0.3179 | 0.5638 |
| No log | 2.1244 | 444 | 0.3544 | 0.7835 | 0.3544 | 0.5953 |
| No log | 2.1340 | 446 | 0.3065 | 0.8178 | 0.3065 | 0.5536 |
| No log | 2.1435 | 448 | 0.3016 | 0.8178 | 0.3016 | 0.5492 |
| No log | 2.1531 | 450 | 0.3485 | 0.8 | 0.3485 | 0.5904 |
| No log | 2.1627 | 452 | 0.4888 | 0.8146 | 0.4888 | 0.6992 |
| No log | 2.1722 | 454 | 0.5168 | 0.8146 | 0.5168 | 0.7189 |
| No log | 2.1818 | 456 | 0.4520 | 0.7835 | 0.4520 | 0.6723 |
| No log | 2.1914 | 458 | 0.4239 | 0.7619 | 0.4239 | 0.6511 |
| No log | 2.2010 | 460 | 0.3946 | 0.7820 | 0.3946 | 0.6282 |
| No log | 2.2105 | 462 | 0.3801 | 0.7820 | 0.3801 | 0.6166 |
| No log | 2.2201 | 464 | 0.3584 | 0.7774 | 0.3584 | 0.5986 |
| No log | 2.2297 | 466 | 0.3891 | 0.7774 | 0.3891 | 0.6238 |
| No log | 2.2392 | 468 | 0.3863 | 0.7774 | 0.3863 | 0.6216 |
| No log | 2.2488 | 470 | 0.3397 | 0.7774 | 0.3397 | 0.5829 |
| No log | 2.2584 | 472 | 0.3358 | 0.7552 | 0.3358 | 0.5795 |
| No log | 2.2679 | 474 | 0.3599 | 0.7552 | 0.3599 | 0.5999 |
| No log | 2.2775 | 476 | 0.4255 | 0.7619 | 0.4255 | 0.6523 |
| No log | 2.2871 | 478 | 0.4911 | 0.7835 | 0.4911 | 0.7008 |
| No log | 2.2967 | 480 | 0.4515 | 0.7619 | 0.4515 | 0.6720 |
| No log | 2.3062 | 482 | 0.3516 | 0.7552 | 0.3516 | 0.5930 |
| No log | 2.3158 | 484 | 0.3181 | 0.7709 | 0.3181 | 0.5640 |
| No log | 2.3254 | 486 | 0.3173 | 0.7941 | 0.3173 | 0.5633 |
| No log | 2.3349 | 488 | 0.3417 | 0.8178 | 0.3417 | 0.5846 |
| No log | 2.3445 | 490 | 0.3528 | 0.8 | 0.3528 | 0.5940 |
| No log | 2.3541 | 492 | 0.3377 | 0.7774 | 0.3377 | 0.5811 |
| No log | 2.3636 | 494 | 0.2979 | 0.7709 | 0.2979 | 0.5458 |
| No log | 2.3732 | 496 | 0.2997 | 0.7709 | 0.2997 | 0.5475 |
| No log | 2.3828 | 498 | 0.3809 | 0.7619 | 0.3809 | 0.6172 |
| 0.4756 | 2.3923 | 500 | 0.5063 | 0.8146 | 0.5063 | 0.7115 |
| 0.4756 | 2.4019 | 502 | 0.4486 | 0.8146 | 0.4486 | 0.6698 |
| 0.4756 | 2.4115 | 504 | 0.3757 | 0.7619 | 0.3757 | 0.6129 |
| 0.4756 | 2.4211 | 506 | 0.3871 | 0.7835 | 0.3871 | 0.6222 |
| 0.4756 | 2.4306 | 508 | 0.4703 | 0.7789 | 0.4703 | 0.6858 |
| 0.4756 | 2.4402 | 510 | 0.5507 | 0.7021 | 0.5507 | 0.7421 |
| 0.4756 | 2.4498 | 512 | 0.5071 | 0.7021 | 0.5071 | 0.7121 |
| 0.4756 | 2.4593 | 514 | 0.4246 | 0.7220 | 0.4246 | 0.6516 |
| 0.4756 | 2.4689 | 516 | 0.3953 | 0.7619 | 0.3953 | 0.6287 |
| 0.4756 | 2.4785 | 518 | 0.3667 | 0.7820 | 0.3667 | 0.6055 |
| 0.4756 | 2.4880 | 520 | 0.3588 | 0.7986 | 0.3588 | 0.5990 |
| 0.4756 | 2.4976 | 522 | 0.4062 | 0.7619 | 0.4062 | 0.6373 |
| 0.4756 | 2.5072 | 524 | 0.5405 | 0.6416 | 0.5405 | 0.7352 |
| 0.4756 | 2.5167 | 526 | 0.6749 | 0.6416 | 0.6749 | 0.8215 |
| 0.4756 | 2.5263 | 528 | 0.6289 | 0.6839 | 0.6289 | 0.7931 |
| 0.4756 | 2.5359 | 530 | 0.6291 | 0.6839 | 0.6291 | 0.7931 |
| 0.4756 | 2.5455 | 532 | 0.5593 | 0.7540 | 0.5593 | 0.7479 |
| 0.4756 | 2.5550 | 534 | 0.4650 | 0.7835 | 0.4650 | 0.6819 |
| 0.4756 | 2.5646 | 536 | 0.3908 | 0.8 | 0.3908 | 0.6252 |
| 0.4756 | 2.5742 | 538 | 0.4128 | 0.7726 | 0.4128 | 0.6425 |
| 0.4756 | 2.5837 | 540 | 0.3654 | 0.7726 | 0.3654 | 0.6045 |
| 0.4756 | 2.5933 | 542 | 0.3247 | 0.7829 | 0.3247 | 0.5698 |
| 0.4756 | 2.6029 | 544 | 0.3275 | 0.6602 | 0.3275 | 0.5722 |
| 0.4756 | 2.6124 | 546 | 0.3224 | 0.6818 | 0.3224 | 0.5678 |
| 0.4756 | 2.6220 | 548 | 0.3095 | 0.6818 | 0.3095 | 0.5564 |
| 0.4756 | 2.6316 | 550 | 0.3114 | 0.7709 | 0.3114 | 0.5581 |
| 0.4756 | 2.6411 | 552 | 0.4197 | 0.7774 | 0.4197 | 0.6478 |
| 0.4756 | 2.6507 | 554 | 0.5136 | 0.8146 | 0.5136 | 0.7167 |
| 0.4756 | 2.6603 | 556 | 0.4411 | 0.8 | 0.4411 | 0.6641 |
| 0.4756 | 2.6699 | 558 | 0.3546 | 0.7774 | 0.3546 | 0.5955 |
| 0.4756 | 2.6794 | 560 | 0.3103 | 0.776 | 0.3103 | 0.5571 |
| 0.4756 | 2.6890 | 562 | 0.3095 | 0.7364 | 0.3095 | 0.5563 |
| 0.4756 | 2.6986 | 564 | 0.3115 | 0.7364 | 0.3115 | 0.5581 |
| 0.4756 | 2.7081 | 566 | 0.3173 | 0.7586 | 0.3173 | 0.5633 |
| 0.4756 | 2.7177 | 568 | 0.3845 | 0.8 | 0.3845 | 0.6201 |
| 0.4756 | 2.7273 | 570 | 0.4928 | 0.8146 | 0.4928 | 0.7020 |
| 0.4756 | 2.7368 | 572 | 0.5107 | 0.8146 | 0.5107 | 0.7146 |
| 0.4756 | 2.7464 | 574 | 0.4747 | 0.8146 | 0.4747 | 0.6890 |
| 0.4756 | 2.7560 | 576 | 0.3853 | 0.7774 | 0.3853 | 0.6207 |
| 0.4756 | 2.7656 | 578 | 0.3827 | 0.7941 | 0.3827 | 0.6186 |
| 0.4756 | 2.7751 | 580 | 0.4376 | 0.7619 | 0.4376 | 0.6615 |
| 0.4756 | 2.7847 | 582 | 0.5352 | 0.8011 | 0.5352 | 0.7316 |
| 0.4756 | 2.7943 | 584 | 0.6547 | 0.7028 | 0.6547 | 0.8091 |
| 0.4756 | 2.8038 | 586 | 0.6315 | 0.7119 | 0.6315 | 0.7947 |
| 0.4756 | 2.8134 | 588 | 0.5196 | 0.8011 | 0.5196 | 0.7208 |
| 0.4756 | 2.8230 | 590 | 0.3752 | 0.7941 | 0.3752 | 0.6125 |
| 0.4756 | 2.8325 | 592 | 0.3362 | 0.7508 | 0.3362 | 0.5798 |
| 0.4756 | 2.8421 | 594 | 0.3245 | 0.7442 | 0.3245 | 0.5696 |
| 0.4756 | 2.8517 | 596 | 0.3130 | 0.7864 | 0.3130 | 0.5594 |
| 0.4756 | 2.8612 | 598 | 0.3372 | 0.7941 | 0.3372 | 0.5806 |
| 0.4756 | 2.8708 | 600 | 0.4612 | 0.8146 | 0.4612 | 0.6791 |
| 0.4756 | 2.8804 | 602 | 0.5007 | 0.8146 | 0.5007 | 0.7076 |
| 0.4756 | 2.8900 | 604 | 0.4069 | 0.8316 | 0.4069 | 0.6379 |
| 0.4756 | 2.8995 | 606 | 0.2942 | 0.7709 | 0.2942 | 0.5424 |
| 0.4756 | 2.9091 | 608 | 0.2862 | 0.7692 | 0.2862 | 0.5350 |
| 0.4756 | 2.9187 | 610 | 0.2919 | 0.7692 | 0.2919 | 0.5403 |
| 0.4756 | 2.9282 | 612 | 0.2879 | 0.7510 | 0.2879 | 0.5366 |
| 0.4756 | 2.9378 | 614 | 0.2982 | 0.7941 | 0.2982 | 0.5461 |
| 0.4756 | 2.9474 | 616 | 0.3514 | 0.8095 | 0.3514 | 0.5928 |
| 0.4756 | 2.9569 | 618 | 0.4345 | 0.8146 | 0.4345 | 0.6591 |
| 0.4756 | 2.9665 | 620 | 0.4872 | 0.7987 | 0.4872 | 0.6980 |
| 0.4756 | 2.9761 | 622 | 0.4605 | 0.8146 | 0.4605 | 0.6786 |
| 0.4756 | 2.9856 | 624 | 0.4383 | 0.7934 | 0.4383 | 0.6621 |
| 0.4756 | 2.9952 | 626 | 0.3991 | 0.7619 | 0.3991 | 0.6317 |
| 0.4756 | 3.0048 | 628 | 0.3544 | 0.7774 | 0.3544 | 0.5953 |
| 0.4756 | 3.0144 | 630 | 0.3372 | 0.7941 | 0.3372 | 0.5807 |
| 0.4756 | 3.0239 | 632 | 0.3208 | 0.7941 | 0.3208 | 0.5664 |
| 0.4756 | 3.0335 | 634 | 0.3010 | 0.776 | 0.3010 | 0.5487 |
| 0.4756 | 3.0431 | 636 | 0.2973 | 0.7950 | 0.2973 | 0.5453 |
| 0.4756 | 3.0526 | 638 | 0.2966 | 0.8063 | 0.2966 | 0.5446 |
| 0.4756 | 3.0622 | 640 | 0.3026 | 0.7864 | 0.3026 | 0.5501 |
| 0.4756 | 3.0718 | 642 | 0.3069 | 0.7864 | 0.3069 | 0.5539 |
| 0.4756 | 3.0813 | 644 | 0.3410 | 0.7667 | 0.3410 | 0.5839 |
| 0.4756 | 3.0909 | 646 | 0.3843 | 0.7524 | 0.3843 | 0.6199 |
| 0.4756 | 3.1005 | 648 | 0.3522 | 0.7921 | 0.3522 | 0.5935 |
| 0.4756 | 3.1100 | 650 | 0.3296 | 0.7756 | 0.3296 | 0.5741 |
| 0.4756 | 3.1196 | 652 | 0.3158 | 0.7864 | 0.3158 | 0.5619 |
| 0.4756 | 3.1292 | 654 | 0.3226 | 0.7864 | 0.3226 | 0.5680 |
| 0.4756 | 3.1388 | 656 | 0.3101 | 0.7986 | 0.3101 | 0.5568 |
| 0.4756 | 3.1483 | 658 | 0.2917 | 0.8218 | 0.2917 | 0.5401 |
| 0.4756 | 3.1579 | 660 | 0.2900 | 0.8123 | 0.2900 | 0.5385 |
| 0.4756 | 3.1675 | 662 | 0.2885 | 0.8372 | 0.2885 | 0.5371 |
| 0.4756 | 3.1770 | 664 | 0.2893 | 0.7967 | 0.2893 | 0.5378 |
| 0.4756 | 3.1866 | 666 | 0.2830 | 0.8372 | 0.2830 | 0.5320 |
| 0.4756 | 3.1962 | 668 | 0.2958 | 0.8372 | 0.2958 | 0.5438 |
| 0.4756 | 3.2057 | 670 | 0.3577 | 0.8 | 0.3577 | 0.5981 |
| 0.4756 | 3.2153 | 672 | 0.4128 | 0.8146 | 0.4128 | 0.6425 |
| 0.4756 | 3.2249 | 674 | 0.4928 | 0.7540 | 0.4928 | 0.7020 |
| 0.4756 | 3.2344 | 676 | 0.4207 | 0.8146 | 0.4207 | 0.6486 |
| 0.4756 | 3.2440 | 678 | 0.3045 | 0.8372 | 0.3045 | 0.5518 |
| 0.4756 | 3.2536 | 680 | 0.2764 | 0.8659 | 0.2764 | 0.5258 |
| 0.4756 | 3.2632 | 682 | 0.2821 | 0.8372 | 0.2821 | 0.5311 |
| 0.4756 | 3.2727 | 684 | 0.2909 | 0.8016 | 0.2909 | 0.5393 |
| 0.4756 | 3.2823 | 686 | 0.2927 | 0.7822 | 0.2927 | 0.5410 |
| 0.4756 | 3.2919 | 688 | 0.3004 | 0.7565 | 0.3004 | 0.5481 |
| 0.4756 | 3.3014 | 690 | 0.3027 | 0.7042 | 0.3027 | 0.5502 |
| 0.4756 | 3.3110 | 692 | 0.2804 | 0.8016 | 0.2804 | 0.5295 |
| 0.4756 | 3.3206 | 694 | 0.2600 | 0.8320 | 0.2600 | 0.5099 |
| 0.4756 | 3.3301 | 696 | 0.2579 | 0.8372 | 0.2579 | 0.5078 |
| 0.4756 | 3.3397 | 698 | 0.3443 | 0.8316 | 0.3443 | 0.5867 |
| 0.4756 | 3.3493 | 700 | 0.5149 | 0.7493 | 0.5149 | 0.7176 |
| 0.4756 | 3.3589 | 702 | 0.7264 | 0.7038 | 0.7264 | 0.8523 |
| 0.4756 | 3.3684 | 704 | 0.8264 | 0.7038 | 0.8264 | 0.9091 |
| 0.4756 | 3.3780 | 706 | 0.6990 | 0.7038 | 0.6990 | 0.8361 |
| 0.4756 | 3.3876 | 708 | 0.4719 | 0.7614 | 0.4719 | 0.6869 |
| 0.4756 | 3.3971 | 710 | 0.3535 | 0.8464 | 0.3535 | 0.5946 |
| 0.4756 | 3.4067 | 712 | 0.2888 | 0.8456 | 0.2888 | 0.5374 |
| 0.4756 | 3.4163 | 714 | 0.2675 | 0.8659 | 0.2675 | 0.5172 |
| 0.4756 | 3.4258 | 716 | 0.3072 | 0.85 | 0.3072 | 0.5543 |
| 0.4756 | 3.4354 | 718 | 0.4004 | 0.8146 | 0.4004 | 0.6328 |
| 0.4756 | 3.4450 | 720 | 0.4353 | 0.8146 | 0.4353 | 0.6598 |
| 0.4756 | 3.4545 | 722 | 0.3806 | 0.8146 | 0.3806 | 0.6169 |
| 0.4756 | 3.4641 | 724 | 0.3375 | 0.8316 | 0.3375 | 0.5810 |
| 0.4756 | 3.4737 | 726 | 0.2970 | 0.8 | 0.2970 | 0.5450 |
| 0.4756 | 3.4833 | 728 | 0.2582 | 0.8659 | 0.2582 | 0.5081 |
| 0.4756 | 3.4928 | 730 | 0.2647 | 0.8372 | 0.2647 | 0.5145 |
| 0.4756 | 3.5024 | 732 | 0.2956 | 0.75 | 0.2956 | 0.5437 |
| 0.4756 | 3.5120 | 734 | 0.3455 | 0.7956 | 0.3455 | 0.5878 |
| 0.4756 | 3.5215 | 736 | 0.3662 | 0.8316 | 0.3662 | 0.6052 |
| 0.4756 | 3.5311 | 738 | 0.4465 | 0.7540 | 0.4465 | 0.6682 |
| 0.4756 | 3.5407 | 740 | 0.5522 | 0.7540 | 0.5522 | 0.7431 |
| 0.4756 | 3.5502 | 742 | 0.5530 | 0.7540 | 0.5530 | 0.7436 |
| 0.4756 | 3.5598 | 744 | 0.5074 | 0.7540 | 0.5074 | 0.7123 |
| 0.4756 | 3.5694 | 746 | 0.3958 | 0.8025 | 0.3958 | 0.6292 |
| 0.4756 | 3.5789 | 748 | 0.3108 | 0.7921 | 0.3108 | 0.5575 |
| 0.4756 | 3.5885 | 750 | 0.2946 | 0.8082 | 0.2946 | 0.5428 |
| 0.4756 | 3.5981 | 752 | 0.3027 | 0.7921 | 0.3027 | 0.5502 |
| 0.4756 | 3.6077 | 754 | 0.3183 | 0.7921 | 0.3183 | 0.5641 |
| 0.4756 | 3.6172 | 756 | 0.3420 | 0.7879 | 0.3420 | 0.5848 |
| 0.4756 | 3.6268 | 758 | 0.3557 | 0.8316 | 0.3557 | 0.5964 |
| 0.4756 | 3.6364 | 760 | 0.3320 | 0.7774 | 0.3320 | 0.5762 |
| 0.4756 | 3.6459 | 762 | 0.2880 | 0.7758 | 0.2880 | 0.5367 |
| 0.4756 | 3.6555 | 764 | 0.2737 | 0.7926 | 0.2737 | 0.5232 |
| 0.4756 | 3.6651 | 766 | 0.2736 | 0.7926 | 0.2736 | 0.5230 |
| 0.4756 | 3.6746 | 768 | 0.2857 | 0.8082 | 0.2857 | 0.5345 |
| 0.4756 | 3.6842 | 770 | 0.3368 | 0.7667 | 0.3368 | 0.5803 |
| 0.4756 | 3.6938 | 772 | 0.4367 | 0.8146 | 0.4367 | 0.6608 |
| 0.4756 | 3.7033 | 774 | 0.4671 | 0.7290 | 0.4671 | 0.6835 |
| 0.4756 | 3.7129 | 776 | 0.4172 | 0.8146 | 0.4172 | 0.6459 |
| 0.4756 | 3.7225 | 778 | 0.3986 | 0.8316 | 0.3986 | 0.6314 |
| 0.4756 | 3.7321 | 780 | 0.3830 | 0.8 | 0.3830 | 0.6189 |
| 0.4756 | 3.7416 | 782 | 0.4042 | 0.7682 | 0.4042 | 0.6357 |
| 0.4756 | 3.7512 | 784 | 0.4342 | 0.7540 | 0.4342 | 0.6590 |
| 0.4756 | 3.7608 | 786 | 0.4118 | 0.7835 | 0.4118 | 0.6417 |
| 0.4756 | 3.7703 | 788 | 0.3943 | 0.8 | 0.3943 | 0.6279 |
| 0.4756 | 3.7799 | 790 | 0.4164 | 0.7835 | 0.4164 | 0.6453 |
| 0.4756 | 3.7895 | 792 | 0.4762 | 0.7540 | 0.4762 | 0.6901 |
| 0.4756 | 3.7990 | 794 | 0.4953 | 0.7540 | 0.4953 | 0.7038 |
| 0.4756 | 3.8086 | 796 | 0.4570 | 0.8146 | 0.4570 | 0.6760 |
| 0.4756 | 3.8182 | 798 | 0.4328 | 0.7835 | 0.4328 | 0.6579 |
| 0.4756 | 3.8278 | 800 | 0.3962 | 0.7835 | 0.3962 | 0.6295 |
| 0.4756 | 3.8373 | 802 | 0.3686 | 0.7774 | 0.3686 | 0.6071 |
| 0.4756 | 3.8469 | 804 | 0.3866 | 0.7774 | 0.3866 | 0.6217 |
| 0.4756 | 3.8565 | 806 | 0.4986 | 0.7540 | 0.4986 | 0.7061 |
| 0.4756 | 3.8660 | 808 | 0.5794 | 0.7493 | 0.5794 | 0.7612 |
| 0.4756 | 3.8756 | 810 | 0.5232 | 0.7342 | 0.5232 | 0.7233 |
| 0.4756 | 3.8852 | 812 | 0.4957 | 0.7342 | 0.4957 | 0.7041 |
| 0.4756 | 3.8947 | 814 | 0.4748 | 0.7273 | 0.4748 | 0.6891 |
| 0.4756 | 3.9043 | 816 | 0.4846 | 0.7016 | 0.4846 | 0.6961 |
| 0.4756 | 3.9139 | 818 | 0.5017 | 0.7016 | 0.5017 | 0.7083 |
| 0.4756 | 3.9234 | 820 | 0.5283 | 0.6957 | 0.5283 | 0.7268 |
| 0.4756 | 3.9330 | 822 | 0.5076 | 0.6957 | 0.5076 | 0.7125 |
| 0.4756 | 3.9426 | 824 | 0.4461 | 0.7220 | 0.4461 | 0.6679 |
| 0.4756 | 3.9522 | 826 | 0.3495 | 0.7774 | 0.3495 | 0.5912 |
| 0.4756 | 3.9617 | 828 | 0.3093 | 0.7879 | 0.3093 | 0.5562 |
| 0.4756 | 3.9713 | 830 | 0.3008 | 0.8063 | 0.3008 | 0.5484 |
| 0.4756 | 3.9809 | 832 | 0.3105 | 0.7426 | 0.3105 | 0.5572 |
| 0.4756 | 3.9904 | 834 | 0.3658 | 0.7774 | 0.3658 | 0.6048 |
| 0.4756 | 4.0 | 836 | 0.4909 | 0.7518 | 0.4909 | 0.7006 |
| 0.4756 | 4.0096 | 838 | 0.5674 | 0.6776 | 0.5674 | 0.7533 |
| 0.4756 | 4.0191 | 840 | 0.5622 | 0.6776 | 0.5622 | 0.7498 |
| 0.4756 | 4.0287 | 842 | 0.4867 | 0.7635 | 0.4867 | 0.6976 |
| 0.4756 | 4.0383 | 844 | 0.3716 | 0.7619 | 0.3716 | 0.6096 |
| 0.4756 | 4.0478 | 846 | 0.3374 | 0.7941 | 0.3374 | 0.5809 |
| 0.4756 | 4.0574 | 848 | 0.3270 | 0.7986 | 0.3270 | 0.5719 |
| 0.4756 | 4.0670 | 850 | 0.3650 | 0.7619 | 0.3650 | 0.6042 |
| 0.4756 | 4.0766 | 852 | 0.4372 | 0.7569 | 0.4372 | 0.6612 |
| 0.4756 | 4.0861 | 854 | 0.5087 | 0.7036 | 0.5087 | 0.7133 |
| 0.4756 | 4.0957 | 856 | 0.5291 | 0.7036 | 0.5291 | 0.7274 |
| 0.4756 | 4.1053 | 858 | 0.5308 | 0.7407 | 0.5308 | 0.7285 |
| 0.4756 | 4.1148 | 860 | 0.4859 | 0.7407 | 0.4859 | 0.6971 |
| 0.4756 | 4.1244 | 862 | 0.3921 | 0.7934 | 0.3921 | 0.6262 |
| 0.4756 | 4.1340 | 864 | 0.3426 | 0.7774 | 0.3426 | 0.5853 |
| 0.4756 | 4.1435 | 866 | 0.3338 | 0.7774 | 0.3338 | 0.5778 |
| 0.4756 | 4.1531 | 868 | 0.3643 | 0.7619 | 0.3643 | 0.6036 |
| 0.4756 | 4.1627 | 870 | 0.4148 | 0.7934 | 0.4148 | 0.6440 |
| 0.4756 | 4.1722 | 872 | 0.4501 | 0.8146 | 0.4501 | 0.6709 |
| 0.4756 | 4.1818 | 874 | 0.4265 | 0.8146 | 0.4265 | 0.6531 |
| 0.4756 | 4.1914 | 876 | 0.3776 | 0.7619 | 0.3776 | 0.6145 |
| 0.4756 | 4.2010 | 878 | 0.3211 | 0.8218 | 0.3211 | 0.5667 |
| 0.4756 | 4.2105 | 880 | 0.3041 | 0.8218 | 0.3041 | 0.5514 |
| 0.4756 | 4.2201 | 882 | 0.3121 | 0.8218 | 0.3121 | 0.5587 |
| 0.4756 | 4.2297 | 884 | 0.3336 | 0.8218 | 0.3336 | 0.5776 |
| 0.4756 | 4.2392 | 886 | 0.3925 | 0.8146 | 0.3925 | 0.6265 |
| 0.4756 | 4.2488 | 888 | 0.4112 | 0.8146 | 0.4112 | 0.6412 |
| 0.4756 | 4.2584 | 890 | 0.4060 | 0.8146 | 0.4060 | 0.6372 |
| 0.4756 | 4.2679 | 892 | 0.3540 | 0.8 | 0.3540 | 0.5950 |
| 0.4756 | 4.2775 | 894 | 0.3091 | 0.7941 | 0.3091 | 0.5560 |
| 0.4756 | 4.2871 | 896 | 0.2974 | 0.7941 | 0.2974 | 0.5453 |
| 0.4756 | 4.2967 | 898 | 0.2990 | 0.8178 | 0.2990 | 0.5468 |
| 0.4756 | 4.3062 | 900 | 0.3074 | 0.8178 | 0.3074 | 0.5544 |
| 0.4756 | 4.3158 | 902 | 0.3462 | 0.8178 | 0.3462 | 0.5884 |
| 0.4756 | 4.3254 | 904 | 0.4051 | 0.7518 | 0.4051 | 0.6365 |
| 0.4756 | 4.3349 | 906 | 0.5294 | 0.6260 | 0.5294 | 0.7276 |
| 0.4756 | 4.3445 | 908 | 0.5658 | 0.6621 | 0.5658 | 0.7522 |
| 0.4756 | 4.3541 | 910 | 0.4659 | 0.7162 | 0.4659 | 0.6826 |
| 0.4756 | 4.3636 | 912 | 0.3438 | 0.8 | 0.3438 | 0.5863 |
| 0.4756 | 4.3732 | 914 | 0.3160 | 0.8178 | 0.3160 | 0.5621 |
| 0.4756 | 4.3828 | 916 | 0.3177 | 0.7941 | 0.3177 | 0.5636 |
| 0.4756 | 4.3923 | 918 | 0.3421 | 0.8 | 0.3421 | 0.5849 |
| 0.4756 | 4.4019 | 920 | 0.4015 | 0.7835 | 0.4015 | 0.6336 |
| 0.4756 | 4.4115 | 922 | 0.5003 | 0.7162 | 0.5003 | 0.7073 |
| 0.4756 | 4.4211 | 924 | 0.5695 | 0.6260 | 0.5695 | 0.7546 |
| 0.4756 | 4.4306 | 926 | 0.5493 | 0.6260 | 0.5493 | 0.7412 |
| 0.4756 | 4.4402 | 928 | 0.4834 | 0.5817 | 0.4834 | 0.6953 |
| 0.4756 | 4.4498 | 930 | 0.4082 | 0.6693 | 0.4082 | 0.6389 |
| 0.4756 | 4.4593 | 932 | 0.4056 | 0.7004 | 0.4056 | 0.6369 |
| 0.4756 | 4.4689 | 934 | 0.4653 | 0.7789 | 0.4653 | 0.6821 |
| 0.4756 | 4.4785 | 936 | 0.5981 | 0.7407 | 0.5981 | 0.7734 |
| 0.4756 | 4.4880 | 938 | 0.7846 | 0.7028 | 0.7846 | 0.8858 |
| 0.4756 | 4.4976 | 940 | 0.7989 | 0.7028 | 0.7989 | 0.8938 |
| 0.4756 | 4.5072 | 942 | 0.5929 | 0.7534 | 0.5929 | 0.7700 |
| 0.4756 | 4.5167 | 944 | 0.4522 | 0.8108 | 0.4522 | 0.6724 |
| 0.4756 | 4.5263 | 946 | 0.3699 | 0.7667 | 0.3699 | 0.6082 |
| 0.4756 | 4.5359 | 948 | 0.3637 | 0.7879 | 0.3637 | 0.6031 |
| 0.4756 | 4.5455 | 950 | 0.3830 | 0.8182 | 0.3830 | 0.6189 |
| 0.4756 | 4.5550 | 952 | 0.3529 | 0.8182 | 0.3529 | 0.5940 |
| 0.4756 | 4.5646 | 954 | 0.3222 | 0.8042 | 0.3222 | 0.5676 |
| 0.4756 | 4.5742 | 956 | 0.3077 | 0.8218 | 0.3077 | 0.5547 |
| 0.4756 | 4.5837 | 958 | 0.3009 | 0.8409 | 0.3009 | 0.5485 |
| 0.4756 | 4.5933 | 960 | 0.2887 | 0.8409 | 0.2887 | 0.5373 |
| 0.4756 | 4.6029 | 962 | 0.2760 | 0.8409 | 0.2760 | 0.5254 |
| 0.4756 | 4.6124 | 964 | 0.2730 | 0.8409 | 0.2730 | 0.5225 |
| 0.4756 | 4.6220 | 966 | 0.2714 | 0.8409 | 0.2714 | 0.5210 |
| 0.4756 | 4.6316 | 968 | 0.2904 | 0.8218 | 0.2904 | 0.5389 |
| 0.4756 | 4.6411 | 970 | 0.3370 | 0.7934 | 0.3370 | 0.5805 |
| 0.4756 | 4.6507 | 972 | 0.3469 | 0.7934 | 0.3469 | 0.5890 |
| 0.4756 | 4.6603 | 974 | 0.3334 | 0.7934 | 0.3334 | 0.5774 |
| 0.4756 | 4.6699 | 976 | 0.3452 | 0.7934 | 0.3452 | 0.5876 |
| 0.4756 | 4.6794 | 978 | 0.3273 | 0.8182 | 0.3273 | 0.5721 |
| 0.4756 | 4.6890 | 980 | 0.3105 | 0.7921 | 0.3105 | 0.5573 |
| 0.4756 | 4.6986 | 982 | 0.2985 | 0.8082 | 0.2985 | 0.5463 |
| 0.4756 | 4.7081 | 984 | 0.3094 | 0.7921 | 0.3094 | 0.5563 |
| 0.4756 | 4.7177 | 986 | 0.3264 | 0.8217 | 0.3264 | 0.5713 |
| 0.4756 | 4.7273 | 988 | 0.3327 | 0.8182 | 0.3327 | 0.5768 |
| 0.4756 | 4.7368 | 990 | 0.3460 | 0.8182 | 0.3460 | 0.5882 |
| 0.4756 | 4.7464 | 992 | 0.3402 | 0.8182 | 0.3402 | 0.5833 |
| 0.4756 | 4.7560 | 994 | 0.3186 | 0.8042 | 0.3186 | 0.5644 |
| 0.4756 | 4.7656 | 996 | 0.3104 | 0.7774 | 0.3104 | 0.5571 |
| 0.4756 | 4.7751 | 998 | 0.3364 | 0.7934 | 0.3364 | 0.5800 |
| 0.1313 | 4.7847 | 1000 | 0.3721 | 0.8146 | 0.3721 | 0.6100 |
| 0.1313 | 4.7943 | 1002 | 0.3838 | 0.8146 | 0.3838 | 0.6195 |
| 0.1313 | 4.8038 | 1004 | 0.3525 | 0.7934 | 0.3525 | 0.5937 |
| 0.1313 | 4.8134 | 1006 | 0.3166 | 0.8218 | 0.3166 | 0.5626 |
| 0.1313 | 4.8230 | 1008 | 0.3056 | 0.8218 | 0.3056 | 0.5528 |
| 0.1313 | 4.8325 | 1010 | 0.3091 | 0.7864 | 0.3091 | 0.5560 |
| 0.1313 | 4.8421 | 1012 | 0.3161 | 0.7864 | 0.3161 | 0.5622 |
| 0.1313 | 4.8517 | 1014 | 0.3414 | 0.8350 | 0.3414 | 0.5843 |
| 0.1313 | 4.8612 | 1016 | 0.4126 | 0.7934 | 0.4126 | 0.6423 |
| 0.1313 | 4.8708 | 1018 | 0.5176 | 0.8146 | 0.5176 | 0.7194 |
| 0.1313 | 4.8804 | 1020 | 0.5336 | 0.7987 | 0.5336 | 0.7305 |
| 0.1313 | 4.8900 | 1022 | 0.4867 | 0.8146 | 0.4867 | 0.6976 |
| 0.1313 | 4.8995 | 1024 | 0.4039 | 0.8146 | 0.4039 | 0.6356 |
| 0.1313 | 4.9091 | 1026 | 0.3427 | 0.85 | 0.3427 | 0.5854 |
| 0.1313 | 4.9187 | 1028 | 0.3105 | 0.8372 | 0.3105 | 0.5572 |
| 0.1313 | 4.9282 | 1030 | 0.2938 | 0.8372 | 0.2938 | 0.5421 |
| 0.1313 | 4.9378 | 1032 | 0.2827 | 0.8123 | 0.2827 | 0.5317 |
| 0.1313 | 4.9474 | 1034 | 0.2901 | 0.8123 | 0.2901 | 0.5386 |
| 0.1313 | 4.9569 | 1036 | 0.3170 | 0.8372 | 0.3170 | 0.5630 |
| 0.1313 | 4.9665 | 1038 | 0.3807 | 0.7956 | 0.3807 | 0.6170 |
| 0.1313 | 4.9761 | 1040 | 0.4665 | 0.7094 | 0.4665 | 0.6830 |
| 0.1313 | 4.9856 | 1042 | 0.5278 | 0.6260 | 0.5278 | 0.7265 |
| 0.1313 | 4.9952 | 1044 | 0.5566 | 0.6260 | 0.5566 | 0.7461 |
| 0.1313 | 5.0048 | 1046 | 0.5592 | 0.7094 | 0.5592 | 0.7478 |
| 0.1313 | 5.0144 | 1048 | 0.4768 | 0.7789 | 0.4768 | 0.6905 |
| 0.1313 | 5.0239 | 1050 | 0.4229 | 0.8146 | 0.4229 | 0.6503 |
| 0.1313 | 5.0335 | 1052 | 0.3811 | 0.8095 | 0.3811 | 0.6173 |
| 0.1313 | 5.0431 | 1054 | 0.3780 | 0.8095 | 0.3780 | 0.6148 |
| 0.1313 | 5.0526 | 1056 | 0.3975 | 0.7934 | 0.3975 | 0.6305 |
| 0.1313 | 5.0622 | 1058 | 0.4486 | 0.8146 | 0.4486 | 0.6698 |
| 0.1313 | 5.0718 | 1060 | 0.5023 | 0.7987 | 0.5023 | 0.7087 |
| 0.1313 | 5.0813 | 1062 | 0.5727 | 0.7048 | 0.5727 | 0.7567 |
| 0.1313 | 5.0909 | 1064 | 0.5508 | 0.7601 | 0.5508 | 0.7421 |
| 0.1313 | 5.1005 | 1066 | 0.4580 | 0.8146 | 0.4580 | 0.6768 |
| 0.1313 | 5.1100 | 1068 | 0.3707 | 0.8095 | 0.3707 | 0.6088 |
| 0.1313 | 5.1196 | 1070 | 0.3536 | 0.8350 | 0.3536 | 0.5946 |
| 0.1313 | 5.1292 | 1072 | 0.3708 | 0.8095 | 0.3708 | 0.6089 |
| 0.1313 | 5.1388 | 1074 | 0.4276 | 0.7934 | 0.4276 | 0.6539 |
| 0.1313 | 5.1483 | 1076 | 0.4534 | 0.8146 | 0.4534 | 0.6733 |
| 0.1313 | 5.1579 | 1078 | 0.4454 | 0.8146 | 0.4454 | 0.6674 |
| 0.1313 | 5.1675 | 1080 | 0.4029 | 0.8146 | 0.4029 | 0.6347 |
| 0.1313 | 5.1770 | 1082 | 0.3604 | 0.8316 | 0.3604 | 0.6003 |
| 0.1313 | 5.1866 | 1084 | 0.3413 | 0.8269 | 0.3413 | 0.5842 |
| 0.1313 | 5.1962 | 1086 | 0.3490 | 0.85 | 0.3490 | 0.5907 |
| 0.1313 | 5.2057 | 1088 | 0.3502 | 0.8269 | 0.3502 | 0.5918 |
| 0.1313 | 5.2153 | 1090 | 0.3407 | 0.8269 | 0.3407 | 0.5837 |
| 0.1313 | 5.2249 | 1092 | 0.3492 | 0.8269 | 0.3492 | 0.5910 |
| 0.1313 | 5.2344 | 1094 | 0.3425 | 0.8269 | 0.3425 | 0.5853 |
| 0.1313 | 5.2440 | 1096 | 0.3274 | 0.7941 | 0.3274 | 0.5722 |
| 0.1313 | 5.2536 | 1098 | 0.3349 | 0.7529 | 0.3349 | 0.5787 |
| 0.1313 | 5.2632 | 1100 | 0.3618 | 0.7895 | 0.3618 | 0.6015 |
| 0.1313 | 5.2727 | 1102 | 0.3934 | 0.7789 | 0.3934 | 0.6272 |
| 0.1313 | 5.2823 | 1104 | 0.3847 | 0.7934 | 0.3847 | 0.6202 |
| 0.1313 | 5.2919 | 1106 | 0.3840 | 0.7934 | 0.3840 | 0.6196 |
| 0.1313 | 5.3014 | 1108 | 0.3657 | 0.8095 | 0.3657 | 0.6048 |
| 0.1313 | 5.3110 | 1110 | 0.3720 | 0.7934 | 0.3720 | 0.6099 |
| 0.1313 | 5.3206 | 1112 | 0.3447 | 0.8531 | 0.3447 | 0.5871 |
| 0.1313 | 5.3301 | 1114 | 0.3382 | 0.8531 | 0.3382 | 0.5816 |
| 0.1313 | 5.3397 | 1116 | 0.3481 | 0.8269 | 0.3481 | 0.5900 |
| 0.1313 | 5.3493 | 1118 | 0.3674 | 0.8316 | 0.3674 | 0.6061 |
| 0.1313 | 5.3589 | 1120 | 0.3633 | 0.85 | 0.3633 | 0.6027 |
| 0.1313 | 5.3684 | 1122 | 0.3440 | 0.8269 | 0.3440 | 0.5865 |
| 0.1313 | 5.3780 | 1124 | 0.3290 | 0.7529 | 0.3290 | 0.5736 |
| 0.1313 | 5.3876 | 1126 | 0.3520 | 0.7895 | 0.3520 | 0.5933 |
| 0.1313 | 5.3971 | 1128 | 0.3651 | 0.7895 | 0.3651 | 0.6043 |
| 0.1313 | 5.4067 | 1130 | 0.3593 | 0.8095 | 0.3593 | 0.5995 |
| 0.1313 | 5.4163 | 1132 | 0.3614 | 0.8095 | 0.3614 | 0.6012 |
| 0.1313 | 5.4258 | 1134 | 0.3862 | 0.7569 | 0.3862 | 0.6214 |
| 0.1313 | 5.4354 | 1136 | 0.4583 | 0.7789 | 0.4583 | 0.6770 |
| 0.1313 | 5.4450 | 1138 | 0.4916 | 0.7789 | 0.4916 | 0.7012 |
| 0.1313 | 5.4545 | 1140 | 0.4555 | 0.7789 | 0.4555 | 0.6749 |
| 0.1313 | 5.4641 | 1142 | 0.3852 | 0.7789 | 0.3852 | 0.6206 |
| 0.1313 | 5.4737 | 1144 | 0.3344 | 0.7529 | 0.3344 | 0.5783 |
| 0.1313 | 5.4833 | 1146 | 0.3022 | 0.8218 | 0.3022 | 0.5498 |
| 0.1313 | 5.4928 | 1148 | 0.2918 | 0.8409 | 0.2918 | 0.5401 |
| 0.1313 | 5.5024 | 1150 | 0.2884 | 0.8409 | 0.2884 | 0.5370 |
| 0.1313 | 5.5120 | 1152 | 0.2933 | 0.8218 | 0.2933 | 0.5415 |
| 0.1313 | 5.5215 | 1154 | 0.3290 | 0.8316 | 0.3290 | 0.5736 |
| 0.1313 | 5.5311 | 1156 | 0.4524 | 0.8146 | 0.4524 | 0.6726 |
| 0.1313 | 5.5407 | 1158 | 0.5270 | 0.8146 | 0.5270 | 0.7260 |
| 0.1313 | 5.5502 | 1160 | 0.5099 | 0.8146 | 0.5099 | 0.7141 |
| 0.1313 | 5.5598 | 1162 | 0.4649 | 0.8146 | 0.4649 | 0.6819 |
| 0.1313 | 5.5694 | 1164 | 0.3835 | 0.8146 | 0.3835 | 0.6193 |
| 0.1313 | 5.5789 | 1166 | 0.3377 | 0.8316 | 0.3377 | 0.5811 |
| 0.1313 | 5.5885 | 1168 | 0.3030 | 0.8218 | 0.3030 | 0.5504 |
| 0.1313 | 5.5981 | 1170 | 0.3004 | 0.8218 | 0.3004 | 0.5481 |
| 0.1313 | 5.6077 | 1172 | 0.3080 | 0.8218 | 0.3080 | 0.5550 |
| 0.1313 | 5.6172 | 1174 | 0.3312 | 0.8042 | 0.3312 | 0.5755 |
| 0.1313 | 5.6268 | 1176 | 0.3655 | 0.7934 | 0.3655 | 0.6046 |
| 0.1313 | 5.6364 | 1178 | 0.3808 | 0.7934 | 0.3808 | 0.6171 |
| 0.1313 | 5.6459 | 1180 | 0.4288 | 0.8146 | 0.4288 | 0.6548 |
| 0.1313 | 5.6555 | 1182 | 0.4664 | 0.8146 | 0.4664 | 0.6829 |
| 0.1313 | 5.6651 | 1184 | 0.4902 | 0.7934 | 0.4902 | 0.7001 |
| 0.1313 | 5.6746 | 1186 | 0.4875 | 0.7342 | 0.4875 | 0.6982 |
| 0.1313 | 5.6842 | 1188 | 0.5107 | 0.7586 | 0.5107 | 0.7146 |
| 0.1313 | 5.6938 | 1190 | 0.5760 | 0.7215 | 0.5760 | 0.7589 |
| 0.1313 | 5.7033 | 1192 | 0.5665 | 0.7421 | 0.5665 | 0.7527 |
| 0.1313 | 5.7129 | 1194 | 0.4887 | 0.7586 | 0.4887 | 0.6991 |
| 0.1313 | 5.7225 | 1196 | 0.4442 | 0.8182 | 0.4442 | 0.6665 |
| 0.1313 | 5.7321 | 1198 | 0.4641 | 0.7934 | 0.4641 | 0.6813 |
| 0.1313 | 5.7416 | 1200 | 0.5077 | 0.7407 | 0.5077 | 0.7126 |
| 0.1313 | 5.7512 | 1202 | 0.5070 | 0.7407 | 0.5070 | 0.7120 |
| 0.1313 | 5.7608 | 1204 | 0.4995 | 0.7407 | 0.4995 | 0.7068 |
| 0.1313 | 5.7703 | 1206 | 0.4876 | 0.7388 | 0.4876 | 0.6983 |
| 0.1313 | 5.7799 | 1208 | 0.4529 | 0.7388 | 0.4529 | 0.6730 |
| 0.1313 | 5.7895 | 1210 | 0.4563 | 0.7388 | 0.4563 | 0.6755 |
| 0.1313 | 5.7990 | 1212 | 0.4706 | 0.8146 | 0.4706 | 0.6860 |
| 0.1313 | 5.8086 | 1214 | 0.4793 | 0.7540 | 0.4793 | 0.6923 |
| 0.1313 | 5.8182 | 1216 | 0.4820 | 0.8146 | 0.4820 | 0.6942 |
| 0.1313 | 5.8278 | 1218 | 0.5096 | 0.7540 | 0.5096 | 0.7139 |
| 0.1313 | 5.8373 | 1220 | 0.4993 | 0.7789 | 0.4993 | 0.7066 |
| 0.1313 | 5.8469 | 1222 | 0.4908 | 0.7789 | 0.4908 | 0.7006 |
| 0.1313 | 5.8565 | 1224 | 0.4474 | 0.7789 | 0.4474 | 0.6688 |
| 0.1313 | 5.8660 | 1226 | 0.4164 | 0.7445 | 0.4164 | 0.6453 |
| 0.1313 | 5.8756 | 1228 | 0.3977 | 0.7605 | 0.3977 | 0.6307 |
| 0.1313 | 5.8852 | 1230 | 0.3831 | 0.7605 | 0.3831 | 0.6189 |
| 0.1313 | 5.8947 | 1232 | 0.3715 | 0.8 | 0.3715 | 0.6095 |
| 0.1313 | 5.9043 | 1234 | 0.3595 | 0.8 | 0.3595 | 0.5996 |
| 0.1313 | 5.9139 | 1236 | 0.3728 | 0.8 | 0.3728 | 0.6105 |
| 0.1313 | 5.9234 | 1238 | 0.4421 | 0.8146 | 0.4421 | 0.6649 |
| 0.1313 | 5.9330 | 1240 | 0.5658 | 0.7540 | 0.5658 | 0.7522 |
| 0.1313 | 5.9426 | 1242 | 0.6445 | 0.7393 | 0.6445 | 0.8028 |
| 0.1313 | 5.9522 | 1244 | 0.6115 | 0.7515 | 0.6115 | 0.7820 |
| 0.1313 | 5.9617 | 1246 | 0.5071 | 0.8146 | 0.5071 | 0.7121 |
| 0.1313 | 5.9713 | 1248 | 0.3971 | 0.7667 | 0.3971 | 0.6302 |
| 0.1313 | 5.9809 | 1250 | 0.3536 | 0.7603 | 0.3536 | 0.5946 |
| 0.1313 | 5.9904 | 1252 | 0.3463 | 0.7603 | 0.3463 | 0.5885 |
| 0.1313 | 6.0 | 1254 | 0.3595 | 0.7820 | 0.3595 | 0.5996 |
| 0.1313 | 6.0096 | 1256 | 0.3622 | 0.7941 | 0.3622 | 0.6018 |
| 0.1313 | 6.0191 | 1258 | 0.3496 | 0.7941 | 0.3496 | 0.5913 |
| 0.1313 | 6.0287 | 1260 | 0.3510 | 0.7941 | 0.3510 | 0.5924 |
| 0.1313 | 6.0383 | 1262 | 0.3649 | 0.7774 | 0.3649 | 0.6041 |
| 0.1313 | 6.0478 | 1264 | 0.4079 | 0.8316 | 0.4079 | 0.6387 |
| 0.1313 | 6.0574 | 1266 | 0.4838 | 0.8146 | 0.4838 | 0.6956 |
| 0.1313 | 6.0670 | 1268 | 0.5353 | 0.7515 | 0.5353 | 0.7317 |
| 0.1313 | 6.0766 | 1270 | 0.5527 | 0.7515 | 0.5527 | 0.7435 |
| 0.1313 | 6.0861 | 1272 | 0.5566 | 0.7515 | 0.5566 | 0.7461 |
| 0.1313 | 6.0957 | 1274 | 0.5029 | 0.7540 | 0.5029 | 0.7092 |
| 0.1313 | 6.1053 | 1276 | 0.3938 | 0.8316 | 0.3938 | 0.6276 |
| 0.1313 | 6.1148 | 1278 | 0.3500 | 0.8316 | 0.3500 | 0.5916 |
| 0.1313 | 6.1244 | 1280 | 0.3474 | 0.8316 | 0.3474 | 0.5894 |
| 0.1313 | 6.1340 | 1282 | 0.3869 | 0.8316 | 0.3869 | 0.6220 |
| 0.1313 | 6.1435 | 1284 | 0.4636 | 0.7789 | 0.4636 | 0.6809 |
| 0.1313 | 6.1531 | 1286 | 0.5701 | 0.7162 | 0.5701 | 0.7551 |
| 0.1313 | 6.1627 | 1288 | 0.5785 | 0.6138 | 0.5785 | 0.7606 |
| 0.1313 | 6.1722 | 1290 | 0.5285 | 0.6738 | 0.5285 | 0.7270 |
| 0.1313 | 6.1818 | 1292 | 0.4837 | 0.7789 | 0.4837 | 0.6955 |
| 0.1313 | 6.1914 | 1294 | 0.4167 | 0.7956 | 0.4167 | 0.6455 |
| 0.1313 | 6.2010 | 1296 | 0.3739 | 0.7956 | 0.3739 | 0.6115 |
| 0.1313 | 6.2105 | 1298 | 0.3607 | 0.8316 | 0.3607 | 0.6006 |
| 0.1313 | 6.2201 | 1300 | 0.3747 | 0.8316 | 0.3747 | 0.6121 |
| 0.1313 | 6.2297 | 1302 | 0.3878 | 0.8316 | 0.3878 | 0.6228 |
| 0.1313 | 6.2392 | 1304 | 0.4260 | 0.8316 | 0.4260 | 0.6527 |
| 0.1313 | 6.2488 | 1306 | 0.4366 | 0.8316 | 0.4366 | 0.6607 |
| 0.1313 | 6.2584 | 1308 | 0.4186 | 0.8316 | 0.4186 | 0.6470 |
| 0.1313 | 6.2679 | 1310 | 0.3628 | 0.8316 | 0.3628 | 0.6023 |
| 0.1313 | 6.2775 | 1312 | 0.3214 | 0.8 | 0.3214 | 0.5669 |
| 0.1313 | 6.2871 | 1314 | 0.3035 | 0.8372 | 0.3035 | 0.5509 |
| 0.1313 | 6.2967 | 1316 | 0.2977 | 0.776 | 0.2977 | 0.5456 |
| 0.1313 | 6.3062 | 1318 | 0.3052 | 0.8123 | 0.3052 | 0.5525 |
| 0.1313 | 6.3158 | 1320 | 0.3236 | 0.7941 | 0.3236 | 0.5689 |
| 0.1313 | 6.3254 | 1322 | 0.3492 | 0.8 | 0.3492 | 0.5910 |
| 0.1313 | 6.3349 | 1324 | 0.3670 | 0.8 | 0.3670 | 0.6058 |
| 0.1313 | 6.3445 | 1326 | 0.3841 | 0.8316 | 0.3841 | 0.6197 |
| 0.1313 | 6.3541 | 1328 | 0.3746 | 0.7774 | 0.3746 | 0.6120 |
| 0.1313 | 6.3636 | 1330 | 0.3558 | 0.7712 | 0.3558 | 0.5965 |
| 0.1313 | 6.3732 | 1332 | 0.3509 | 0.7712 | 0.3509 | 0.5924 |
| 0.1313 | 6.3828 | 1334 | 0.3511 | 0.7712 | 0.3511 | 0.5926 |
| 0.1313 | 6.3923 | 1336 | 0.3495 | 0.7712 | 0.3495 | 0.5912 |
| 0.1313 | 6.4019 | 1338 | 0.3353 | 0.7864 | 0.3353 | 0.5791 |
| 0.1313 | 6.4115 | 1340 | 0.3278 | 0.7864 | 0.3278 | 0.5725 |
| 0.1313 | 6.4211 | 1342 | 0.3291 | 0.7864 | 0.3291 | 0.5737 |
| 0.1313 | 6.4306 | 1344 | 0.3477 | 0.7864 | 0.3477 | 0.5897 |
| 0.1313 | 6.4402 | 1346 | 0.3561 | 0.7712 | 0.3561 | 0.5967 |
| 0.1313 | 6.4498 | 1348 | 0.3798 | 0.8013 | 0.3798 | 0.6163 |
| 0.1313 | 6.4593 | 1350 | 0.3976 | 0.8013 | 0.3976 | 0.6305 |
| 0.1313 | 6.4689 | 1352 | 0.3949 | 0.8013 | 0.3949 | 0.6284 |
| 0.1313 | 6.4785 | 1354 | 0.3951 | 0.7771 | 0.3951 | 0.6286 |
| 0.1313 | 6.4880 | 1356 | 0.3920 | 0.7771 | 0.3920 | 0.6261 |
| 0.1313 | 6.4976 | 1358 | 0.3876 | 0.7974 | 0.3876 | 0.6226 |
| 0.1313 | 6.5072 | 1360 | 0.3551 | 0.7459 | 0.3551 | 0.5959 |
| 0.1313 | 6.5167 | 1362 | 0.3492 | 0.7712 | 0.3492 | 0.5909 |
| 0.1313 | 6.5263 | 1364 | 0.3357 | 0.7712 | 0.3357 | 0.5794 |
| 0.1313 | 6.5359 | 1366 | 0.3276 | 0.7864 | 0.3276 | 0.5724 |
| 0.1313 | 6.5455 | 1368 | 0.3241 | 0.7864 | 0.3241 | 0.5693 |
| 0.1313 | 6.5550 | 1370 | 0.3407 | 0.7712 | 0.3407 | 0.5837 |
| 0.1313 | 6.5646 | 1372 | 0.3656 | 0.7712 | 0.3656 | 0.6046 |
| 0.1313 | 6.5742 | 1374 | 0.4056 | 0.7459 | 0.4056 | 0.6369 |
| 0.1313 | 6.5837 | 1376 | 0.4643 | 0.7974 | 0.4643 | 0.6814 |
| 0.1313 | 6.5933 | 1378 | 0.5037 | 0.7965 | 0.5037 | 0.7097 |
| 0.1313 | 6.6029 | 1380 | 0.5387 | 0.7437 | 0.5387 | 0.7340 |
| 0.1313 | 6.6124 | 1382 | 0.5298 | 0.7437 | 0.5298 | 0.7278 |
| 0.1313 | 6.6220 | 1384 | 0.5036 | 0.7267 | 0.5036 | 0.7096 |
| 0.1313 | 6.6316 | 1386 | 0.4609 | 0.7974 | 0.4609 | 0.6789 |
| 0.1313 | 6.6411 | 1388 | 0.4037 | 0.7459 | 0.4037 | 0.6354 |
| 0.1313 | 6.6507 | 1390 | 0.3833 | 0.7712 | 0.3833 | 0.6191 |
| 0.1313 | 6.6603 | 1392 | 0.3864 | 0.7459 | 0.3864 | 0.6216 |
| 0.1313 | 6.6699 | 1394 | 0.3990 | 0.7667 | 0.3990 | 0.6317 |
| 0.1313 | 6.6794 | 1396 | 0.4084 | 0.7667 | 0.4084 | 0.6390 |
| 0.1313 | 6.6890 | 1398 | 0.4112 | 0.7974 | 0.4112 | 0.6412 |
| 0.1313 | 6.6986 | 1400 | 0.4000 | 0.7974 | 0.4000 | 0.6325 |
| 0.1313 | 6.7081 | 1402 | 0.3926 | 0.7774 | 0.3926 | 0.6266 |
| 0.1313 | 6.7177 | 1404 | 0.3957 | 0.8095 | 0.3957 | 0.6290 |
| 0.1313 | 6.7273 | 1406 | 0.4161 | 0.8316 | 0.4161 | 0.6450 |
| 0.1313 | 6.7368 | 1408 | 0.4110 | 0.8316 | 0.4110 | 0.6411 |
| 0.1313 | 6.7464 | 1410 | 0.3792 | 0.8 | 0.3792 | 0.6158 |
| 0.1313 | 6.7560 | 1412 | 0.3567 | 0.7774 | 0.3567 | 0.5972 |
| 0.1313 | 6.7656 | 1414 | 0.3604 | 0.7774 | 0.3604 | 0.6003 |
| 0.1313 | 6.7751 | 1416 | 0.3786 | 0.8316 | 0.3786 | 0.6153 |
| 0.1313 | 6.7847 | 1418 | 0.4131 | 0.8316 | 0.4131 | 0.6427 |
| 0.1313 | 6.7943 | 1420 | 0.4276 | 0.8316 | 0.4276 | 0.6539 |
| 0.1313 | 6.8038 | 1422 | 0.4462 | 0.8316 | 0.4462 | 0.6680 |
| 0.1313 | 6.8134 | 1424 | 0.4374 | 0.8316 | 0.4374 | 0.6613 |
| 0.1313 | 6.8230 | 1426 | 0.3995 | 0.8316 | 0.3995 | 0.6320 |
| 0.1313 | 6.8325 | 1428 | 0.3672 | 0.8316 | 0.3672 | 0.6060 |
| 0.1313 | 6.8421 | 1430 | 0.3444 | 0.8316 | 0.3444 | 0.5868 |
| 0.1313 | 6.8517 | 1432 | 0.3319 | 0.8 | 0.3319 | 0.5761 |
| 0.1313 | 6.8612 | 1434 | 0.3188 | 0.7774 | 0.3188 | 0.5646 |
| 0.1313 | 6.8708 | 1436 | 0.3163 | 0.7774 | 0.3163 | 0.5624 |
| 0.1313 | 6.8804 | 1438 | 0.3284 | 0.7774 | 0.3284 | 0.5731 |
| 0.1313 | 6.8900 | 1440 | 0.3432 | 0.7774 | 0.3432 | 0.5858 |
| 0.1313 | 6.8995 | 1442 | 0.3656 | 0.8095 | 0.3656 | 0.6047 |
| 0.1313 | 6.9091 | 1444 | 0.3942 | 0.8095 | 0.3942 | 0.6279 |
| 0.1313 | 6.9187 | 1446 | 0.4204 | 0.8095 | 0.4204 | 0.6484 |
| 0.1313 | 6.9282 | 1448 | 0.4655 | 0.8316 | 0.4655 | 0.6822 |
| 0.1313 | 6.9378 | 1450 | 0.5136 | 0.7682 | 0.5136 | 0.7167 |
| 0.1313 | 6.9474 | 1452 | 0.5208 | 0.7540 | 0.5208 | 0.7217 |
| 0.1313 | 6.9569 | 1454 | 0.4936 | 0.7682 | 0.4936 | 0.7026 |
| 0.1313 | 6.9665 | 1456 | 0.4414 | 0.8316 | 0.4414 | 0.6644 |
| 0.1313 | 6.9761 | 1458 | 0.3995 | 0.7726 | 0.3995 | 0.6321 |
| 0.1313 | 6.9856 | 1460 | 0.3709 | 0.7368 | 0.3709 | 0.6090 |
| 0.1313 | 6.9952 | 1462 | 0.3698 | 0.7529 | 0.3698 | 0.6081 |
| 0.1313 | 7.0048 | 1464 | 0.3658 | 0.7529 | 0.3658 | 0.6048 |
| 0.1313 | 7.0144 | 1466 | 0.3733 | 0.7529 | 0.3733 | 0.6110 |
| 0.1313 | 7.0239 | 1468 | 0.3673 | 0.7529 | 0.3673 | 0.6061 |
| 0.1313 | 7.0335 | 1470 | 0.3586 | 0.7941 | 0.3586 | 0.5988 |
| 0.1313 | 7.0431 | 1472 | 0.3451 | 0.7941 | 0.3451 | 0.5875 |
| 0.1313 | 7.0526 | 1474 | 0.3391 | 0.7941 | 0.3391 | 0.5823 |
| 0.1313 | 7.0622 | 1476 | 0.3447 | 0.7941 | 0.3447 | 0.5871 |
| 0.1313 | 7.0718 | 1478 | 0.3604 | 0.85 | 0.3604 | 0.6003 |
| 0.1313 | 7.0813 | 1480 | 0.3679 | 0.85 | 0.3679 | 0.6066 |
| 0.1313 | 7.0909 | 1482 | 0.3822 | 0.8316 | 0.3822 | 0.6182 |
| 0.1313 | 7.1005 | 1484 | 0.4031 | 0.8316 | 0.4031 | 0.6349 |
| 0.1313 | 7.1100 | 1486 | 0.4037 | 0.8316 | 0.4037 | 0.6353 |
| 0.1313 | 7.1196 | 1488 | 0.4019 | 0.8316 | 0.4019 | 0.6340 |
| 0.1313 | 7.1292 | 1490 | 0.3880 | 0.8316 | 0.3880 | 0.6229 |
| 0.1313 | 7.1388 | 1492 | 0.3847 | 0.85 | 0.3847 | 0.6202 |
| 0.1313 | 7.1483 | 1494 | 0.3670 | 0.8178 | 0.3670 | 0.6058 |
| 0.1313 | 7.1579 | 1496 | 0.3483 | 0.8178 | 0.3483 | 0.5901 |
| 0.1313 | 7.1675 | 1498 | 0.3401 | 0.7941 | 0.3401 | 0.5832 |
| 0.074 | 7.1770 | 1500 | 0.3444 | 0.7941 | 0.3444 | 0.5869 |
| 0.074 | 7.1866 | 1502 | 0.3576 | 0.7941 | 0.3576 | 0.5980 |
| 0.074 | 7.1962 | 1504 | 0.3786 | 0.7774 | 0.3786 | 0.6153 |
| 0.074 | 7.2057 | 1506 | 0.3979 | 0.8095 | 0.3979 | 0.6308 |
| 0.074 | 7.2153 | 1508 | 0.4164 | 0.8095 | 0.4164 | 0.6453 |
| 0.074 | 7.2249 | 1510 | 0.4226 | 0.8095 | 0.4226 | 0.6501 |
| 0.074 | 7.2344 | 1512 | 0.4333 | 0.8095 | 0.4333 | 0.6582 |
| 0.074 | 7.2440 | 1514 | 0.4510 | 0.8095 | 0.4510 | 0.6716 |
| 0.074 | 7.2536 | 1516 | 0.4626 | 0.8095 | 0.4626 | 0.6801 |
| 0.074 | 7.2632 | 1518 | 0.5037 | 0.7879 | 0.5037 | 0.7097 |
| 0.074 | 7.2727 | 1520 | 0.5374 | 0.7331 | 0.5374 | 0.7330 |
| 0.074 | 7.2823 | 1522 | 0.5386 | 0.7331 | 0.5386 | 0.7339 |
| 0.074 | 7.2919 | 1524 | 0.5083 | 0.7934 | 0.5083 | 0.7130 |
| 0.074 | 7.3014 | 1526 | 0.4548 | 0.8095 | 0.4548 | 0.6744 |
| 0.074 | 7.3110 | 1528 | 0.4014 | 0.8095 | 0.4014 | 0.6335 |
| 0.074 | 7.3206 | 1530 | 0.3801 | 0.7774 | 0.3801 | 0.6165 |
| 0.074 | 7.3301 | 1532 | 0.3808 | 0.7774 | 0.3808 | 0.6171 |
| 0.074 | 7.3397 | 1534 | 0.4098 | 0.8095 | 0.4098 | 0.6402 |
| 0.074 | 7.3493 | 1536 | 0.4367 | 0.8095 | 0.4367 | 0.6609 |
| 0.074 | 7.3589 | 1538 | 0.4910 | 0.7789 | 0.4910 | 0.7007 |
| 0.074 | 7.3684 | 1540 | 0.5363 | 0.7162 | 0.5363 | 0.7323 |
| 0.074 | 7.3780 | 1542 | 0.5767 | 0.6776 | 0.5767 | 0.7594 |
| 0.074 | 7.3876 | 1544 | 0.5791 | 0.7165 | 0.5791 | 0.7610 |
| 0.074 | 7.3971 | 1546 | 0.5451 | 0.7515 | 0.5451 | 0.7383 |
| 0.074 | 7.4067 | 1548 | 0.4762 | 0.8316 | 0.4762 | 0.6901 |
| 0.074 | 7.4163 | 1550 | 0.4302 | 0.8095 | 0.4302 | 0.6559 |
| 0.074 | 7.4258 | 1552 | 0.4060 | 0.8095 | 0.4060 | 0.6372 |
| 0.074 | 7.4354 | 1554 | 0.3878 | 0.8095 | 0.3878 | 0.6227 |
| 0.074 | 7.4450 | 1556 | 0.3722 | 0.7774 | 0.3722 | 0.6101 |
| 0.074 | 7.4545 | 1558 | 0.3774 | 0.7774 | 0.3774 | 0.6143 |
| 0.074 | 7.4641 | 1560 | 0.3811 | 0.7774 | 0.3811 | 0.6174 |
| 0.074 | 7.4737 | 1562 | 0.3820 | 0.7774 | 0.3820 | 0.6180 |
| 0.074 | 7.4833 | 1564 | 0.3915 | 0.7774 | 0.3915 | 0.6257 |
| 0.074 | 7.4928 | 1566 | 0.3980 | 0.7774 | 0.3980 | 0.6309 |
| 0.074 | 7.5024 | 1568 | 0.3968 | 0.7774 | 0.3968 | 0.6299 |
| 0.074 | 7.5120 | 1570 | 0.3979 | 0.8 | 0.3979 | 0.6308 |
| 0.074 | 7.5215 | 1572 | 0.3989 | 0.8 | 0.3989 | 0.6315 |
| 0.074 | 7.5311 | 1574 | 0.4049 | 0.8316 | 0.4049 | 0.6363 |
| 0.074 | 7.5407 | 1576 | 0.4074 | 0.8316 | 0.4074 | 0.6383 |
| 0.074 | 7.5502 | 1578 | 0.3979 | 0.7774 | 0.3979 | 0.6308 |
| 0.074 | 7.5598 | 1580 | 0.3901 | 0.7774 | 0.3901 | 0.6246 |
| 0.074 | 7.5694 | 1582 | 0.3840 | 0.7774 | 0.3840 | 0.6197 |
| 0.074 | 7.5789 | 1584 | 0.3757 | 0.7774 | 0.3757 | 0.6130 |
| 0.074 | 7.5885 | 1586 | 0.3787 | 0.7774 | 0.3787 | 0.6154 |
| 0.074 | 7.5981 | 1588 | 0.3822 | 0.7774 | 0.3822 | 0.6182 |
| 0.074 | 7.6077 | 1590 | 0.4012 | 0.8316 | 0.4012 | 0.6334 |
| 0.074 | 7.6172 | 1592 | 0.4139 | 0.8316 | 0.4139 | 0.6433 |
| 0.074 | 7.6268 | 1594 | 0.4300 | 0.8316 | 0.4300 | 0.6558 |
| 0.074 | 7.6364 | 1596 | 0.4218 | 0.8316 | 0.4218 | 0.6495 |
| 0.074 | 7.6459 | 1598 | 0.4043 | 0.8316 | 0.4043 | 0.6359 |
| 0.074 | 7.6555 | 1600 | 0.3977 | 0.8316 | 0.3977 | 0.6306 |
| 0.074 | 7.6651 | 1602 | 0.3760 | 0.8 | 0.3760 | 0.6132 |
| 0.074 | 7.6746 | 1604 | 0.3601 | 0.7941 | 0.3601 | 0.6001 |
| 0.074 | 7.6842 | 1606 | 0.3422 | 0.7941 | 0.3422 | 0.5850 |
| 0.074 | 7.6938 | 1608 | 0.3269 | 0.7941 | 0.3269 | 0.5718 |
| 0.074 | 7.7033 | 1610 | 0.3242 | 0.7709 | 0.3242 | 0.5694 |
| 0.074 | 7.7129 | 1612 | 0.3323 | 0.7941 | 0.3323 | 0.5765 |
| 0.074 | 7.7225 | 1614 | 0.3503 | 0.7941 | 0.3503 | 0.5919 |
| 0.074 | 7.7321 | 1616 | 0.3781 | 0.7774 | 0.3781 | 0.6149 |
| 0.074 | 7.7416 | 1618 | 0.4140 | 0.8095 | 0.4140 | 0.6435 |
| 0.074 | 7.7512 | 1620 | 0.4552 | 0.8316 | 0.4552 | 0.6747 |
| 0.074 | 7.7608 | 1622 | 0.4724 | 0.8316 | 0.4724 | 0.6873 |
| 0.074 | 7.7703 | 1624 | 0.4736 | 0.8316 | 0.4736 | 0.6882 |
| 0.074 | 7.7799 | 1626 | 0.4870 | 0.8316 | 0.4870 | 0.6979 |
| 0.074 | 7.7895 | 1628 | 0.4962 | 0.8146 | 0.4962 | 0.7044 |
| 0.074 | 7.7990 | 1630 | 0.4827 | 0.8316 | 0.4827 | 0.6948 |
| 0.074 | 7.8086 | 1632 | 0.4585 | 0.8316 | 0.4585 | 0.6771 |
| 0.074 | 7.8182 | 1634 | 0.4306 | 0.7726 | 0.4306 | 0.6562 |
| 0.074 | 7.8278 | 1636 | 0.4056 | 0.7368 | 0.4056 | 0.6369 |
| 0.074 | 7.8373 | 1638 | 0.3963 | 0.7774 | 0.3963 | 0.6296 |
| 0.074 | 7.8469 | 1640 | 0.3994 | 0.7774 | 0.3994 | 0.6320 |
| 0.074 | 7.8565 | 1642 | 0.4240 | 0.8095 | 0.4240 | 0.6512 |
| 0.074 | 7.8660 | 1644 | 0.4685 | 0.8095 | 0.4685 | 0.6845 |
| 0.074 | 7.8756 | 1646 | 0.4936 | 0.8146 | 0.4936 | 0.7026 |
| 0.074 | 7.8852 | 1648 | 0.5208 | 0.7742 | 0.5208 | 0.7217 |
| 0.074 | 7.8947 | 1650 | 0.5382 | 0.7165 | 0.5382 | 0.7336 |
| 0.074 | 7.9043 | 1652 | 0.5335 | 0.8073 | 0.5335 | 0.7304 |
| 0.074 | 7.9139 | 1654 | 0.5244 | 0.8073 | 0.5244 | 0.7241 |
| 0.074 | 7.9234 | 1656 | 0.4987 | 0.8073 | 0.4987 | 0.7062 |
| 0.074 | 7.9330 | 1658 | 0.4645 | 0.8095 | 0.4645 | 0.6816 |
| 0.074 | 7.9426 | 1660 | 0.4389 | 0.8095 | 0.4389 | 0.6625 |
| 0.074 | 7.9522 | 1662 | 0.4111 | 0.8095 | 0.4111 | 0.6412 |
| 0.074 | 7.9617 | 1664 | 0.3916 | 0.8095 | 0.3916 | 0.6258 |
| 0.074 | 7.9713 | 1666 | 0.3747 | 0.7774 | 0.3747 | 0.6122 |
| 0.074 | 7.9809 | 1668 | 0.3702 | 0.7774 | 0.3702 | 0.6084 |
| 0.074 | 7.9904 | 1670 | 0.3794 | 0.8095 | 0.3794 | 0.6160 |
| 0.074 | 8.0 | 1672 | 0.3859 | 0.8095 | 0.3859 | 0.6212 |
| 0.074 | 8.0096 | 1674 | 0.3888 | 0.8095 | 0.3888 | 0.6235 |
| 0.074 | 8.0191 | 1676 | 0.3939 | 0.8095 | 0.3939 | 0.6276 |
| 0.074 | 8.0287 | 1678 | 0.3891 | 0.8095 | 0.3891 | 0.6237 |
| 0.074 | 8.0383 | 1680 | 0.3899 | 0.8095 | 0.3899 | 0.6244 |
| 0.074 | 8.0478 | 1682 | 0.3774 | 0.8095 | 0.3774 | 0.6143 |
| 0.074 | 8.0574 | 1684 | 0.3676 | 0.8269 | 0.3676 | 0.6063 |
| 0.074 | 8.0670 | 1686 | 0.3546 | 0.7941 | 0.3546 | 0.5955 |
| 0.074 | 8.0766 | 1688 | 0.3471 | 0.7941 | 0.3471 | 0.5892 |
| 0.074 | 8.0861 | 1690 | 0.3351 | 0.7941 | 0.3351 | 0.5789 |
| 0.074 | 8.0957 | 1692 | 0.3231 | 0.7941 | 0.3231 | 0.5684 |
| 0.074 | 8.1053 | 1694 | 0.3222 | 0.7941 | 0.3222 | 0.5676 |
| 0.074 | 8.1148 | 1696 | 0.3272 | 0.7941 | 0.3272 | 0.5720 |
| 0.074 | 8.1244 | 1698 | 0.3403 | 0.8178 | 0.3403 | 0.5834 |
| 0.074 | 8.1340 | 1700 | 0.3509 | 0.8178 | 0.3509 | 0.5924 |
| 0.074 | 8.1435 | 1702 | 0.3709 | 0.85 | 0.3709 | 0.6090 |
| 0.074 | 8.1531 | 1704 | 0.3879 | 0.8316 | 0.3879 | 0.6228 |
| 0.074 | 8.1627 | 1706 | 0.3974 | 0.8316 | 0.3974 | 0.6304 |
| 0.074 | 8.1722 | 1708 | 0.3885 | 0.8316 | 0.3885 | 0.6233 |
| 0.074 | 8.1818 | 1710 | 0.3749 | 0.8316 | 0.3749 | 0.6123 |
| 0.074 | 8.1914 | 1712 | 0.3661 | 0.8 | 0.3661 | 0.6050 |
| 0.074 | 8.2010 | 1714 | 0.3676 | 0.7879 | 0.3676 | 0.6063 |
| 0.074 | 8.2105 | 1716 | 0.3807 | 0.8182 | 0.3807 | 0.6170 |
| 0.074 | 8.2201 | 1718 | 0.4095 | 0.8316 | 0.4095 | 0.6399 |
| 0.074 | 8.2297 | 1720 | 0.4311 | 0.8316 | 0.4311 | 0.6566 |
| 0.074 | 8.2392 | 1722 | 0.4449 | 0.8316 | 0.4449 | 0.6670 |
| 0.074 | 8.2488 | 1724 | 0.4504 | 0.8316 | 0.4504 | 0.6711 |
| 0.074 | 8.2584 | 1726 | 0.4451 | 0.8316 | 0.4451 | 0.6671 |
| 0.074 | 8.2679 | 1728 | 0.4261 | 0.8316 | 0.4261 | 0.6528 |
| 0.074 | 8.2775 | 1730 | 0.4162 | 0.8316 | 0.4162 | 0.6452 |
| 0.074 | 8.2871 | 1732 | 0.4125 | 0.8316 | 0.4125 | 0.6423 |
| 0.074 | 8.2967 | 1734 | 0.3979 | 0.8316 | 0.3979 | 0.6308 |
| 0.074 | 8.3062 | 1736 | 0.3752 | 0.8178 | 0.3752 | 0.6126 |
| 0.074 | 8.3158 | 1738 | 0.3604 | 0.8178 | 0.3604 | 0.6003 |
| 0.074 | 8.3254 | 1740 | 0.3540 | 0.7941 | 0.3540 | 0.5949 |
| 0.074 | 8.3349 | 1742 | 0.3596 | 0.8178 | 0.3596 | 0.5996 |
| 0.074 | 8.3445 | 1744 | 0.3676 | 0.8178 | 0.3676 | 0.6063 |
| 0.074 | 8.3541 | 1746 | 0.3760 | 0.8178 | 0.3760 | 0.6132 |
| 0.074 | 8.3636 | 1748 | 0.3759 | 0.8178 | 0.3759 | 0.6131 |
| 0.074 | 8.3732 | 1750 | 0.3777 | 0.8178 | 0.3777 | 0.6145 |
| 0.074 | 8.3828 | 1752 | 0.3788 | 0.8178 | 0.3788 | 0.6155 |
| 0.074 | 8.3923 | 1754 | 0.3852 | 0.8178 | 0.3852 | 0.6207 |
| 0.074 | 8.4019 | 1756 | 0.3921 | 0.8178 | 0.3921 | 0.6262 |
| 0.074 | 8.4115 | 1758 | 0.4055 | 0.8 | 0.4055 | 0.6368 |
| 0.074 | 8.4211 | 1760 | 0.4196 | 0.8316 | 0.4196 | 0.6478 |
| 0.074 | 8.4306 | 1762 | 0.4277 | 0.8316 | 0.4277 | 0.6540 |
| 0.074 | 8.4402 | 1764 | 0.4285 | 0.8095 | 0.4285 | 0.6546 |
| 0.074 | 8.4498 | 1766 | 0.4219 | 0.8095 | 0.4219 | 0.6495 |
| 0.074 | 8.4593 | 1768 | 0.4195 | 0.7774 | 0.4195 | 0.6477 |
| 0.074 | 8.4689 | 1770 | 0.4223 | 0.8095 | 0.4223 | 0.6498 |
| 0.074 | 8.4785 | 1772 | 0.4313 | 0.8095 | 0.4313 | 0.6568 |
| 0.074 | 8.4880 | 1774 | 0.4370 | 0.8095 | 0.4370 | 0.6610 |
| 0.074 | 8.4976 | 1776 | 0.4493 | 0.8316 | 0.4493 | 0.6703 |
| 0.074 | 8.5072 | 1778 | 0.4556 | 0.8316 | 0.4556 | 0.6750 |
| 0.074 | 8.5167 | 1780 | 0.4523 | 0.8316 | 0.4523 | 0.6725 |
| 0.074 | 8.5263 | 1782 | 0.4624 | 0.8316 | 0.4624 | 0.6800 |
| 0.074 | 8.5359 | 1784 | 0.4672 | 0.8316 | 0.4672 | 0.6835 |
| 0.074 | 8.5455 | 1786 | 0.4820 | 0.8316 | 0.4820 | 0.6943 |
| 0.074 | 8.5550 | 1788 | 0.4973 | 0.8316 | 0.4973 | 0.7052 |
| 0.074 | 8.5646 | 1790 | 0.5037 | 0.8146 | 0.5037 | 0.7097 |
| 0.074 | 8.5742 | 1792 | 0.5083 | 0.7540 | 0.5083 | 0.7130 |
| 0.074 | 8.5837 | 1794 | 0.4896 | 0.8316 | 0.4896 | 0.6997 |
| 0.074 | 8.5933 | 1796 | 0.4590 | 0.8316 | 0.4590 | 0.6775 |
| 0.074 | 8.6029 | 1798 | 0.4288 | 0.8316 | 0.4288 | 0.6548 |
| 0.074 | 8.6124 | 1800 | 0.4058 | 0.8316 | 0.4058 | 0.6370 |
| 0.074 | 8.6220 | 1802 | 0.3959 | 0.8095 | 0.3959 | 0.6292 |
| 0.074 | 8.6316 | 1804 | 0.3853 | 0.7774 | 0.3853 | 0.6207 |
| 0.074 | 8.6411 | 1806 | 0.3803 | 0.7774 | 0.3803 | 0.6167 |
| 0.074 | 8.6507 | 1808 | 0.3813 | 0.7774 | 0.3813 | 0.6175 |
| 0.074 | 8.6603 | 1810 | 0.3920 | 0.7774 | 0.3920 | 0.6261 |
| 0.074 | 8.6699 | 1812 | 0.4055 | 0.8316 | 0.4055 | 0.6368 |
| 0.074 | 8.6794 | 1814 | 0.4196 | 0.8316 | 0.4196 | 0.6477 |
| 0.074 | 8.6890 | 1816 | 0.4459 | 0.8316 | 0.4459 | 0.6678 |
| 0.074 | 8.6986 | 1818 | 0.4730 | 0.8316 | 0.4730 | 0.6878 |
| 0.074 | 8.7081 | 1820 | 0.5045 | 0.7682 | 0.5045 | 0.7103 |
| 0.074 | 8.7177 | 1822 | 0.5298 | 0.7540 | 0.5298 | 0.7279 |
| 0.074 | 8.7273 | 1824 | 0.5307 | 0.7540 | 0.5307 | 0.7285 |
| 0.074 | 8.7368 | 1826 | 0.5337 | 0.7540 | 0.5337 | 0.7305 |
| 0.074 | 8.7464 | 1828 | 0.5213 | 0.7540 | 0.5213 | 0.7220 |
| 0.074 | 8.7560 | 1830 | 0.5142 | 0.7682 | 0.5142 | 0.7171 |
| 0.074 | 8.7656 | 1832 | 0.4994 | 0.7682 | 0.4994 | 0.7067 |
| 0.074 | 8.7751 | 1834 | 0.4796 | 0.8316 | 0.4796 | 0.6925 |
| 0.074 | 8.7847 | 1836 | 0.4524 | 0.8316 | 0.4524 | 0.6726 |
| 0.074 | 8.7943 | 1838 | 0.4316 | 0.8316 | 0.4316 | 0.6569 |
| 0.074 | 8.8038 | 1840 | 0.4143 | 0.8316 | 0.4143 | 0.6437 |
| 0.074 | 8.8134 | 1842 | 0.4005 | 0.8095 | 0.4005 | 0.6328 |
| 0.074 | 8.8230 | 1844 | 0.3991 | 0.8095 | 0.3991 | 0.6317 |
| 0.074 | 8.8325 | 1846 | 0.4068 | 0.8316 | 0.4068 | 0.6378 |
| 0.074 | 8.8421 | 1848 | 0.4160 | 0.8316 | 0.4160 | 0.6450 |
| 0.074 | 8.8517 | 1850 | 0.4267 | 0.8316 | 0.4267 | 0.6532 |
| 0.074 | 8.8612 | 1852 | 0.4318 | 0.8316 | 0.4318 | 0.6571 |
| 0.074 | 8.8708 | 1854 | 0.4361 | 0.8316 | 0.4361 | 0.6604 |
| 0.074 | 8.8804 | 1856 | 0.4417 | 0.8316 | 0.4417 | 0.6646 |
| 0.074 | 8.8900 | 1858 | 0.4469 | 0.8316 | 0.4469 | 0.6685 |
| 0.074 | 8.8995 | 1860 | 0.4491 | 0.8316 | 0.4491 | 0.6701 |
| 0.074 | 8.9091 | 1862 | 0.4477 | 0.8316 | 0.4477 | 0.6691 |
| 0.074 | 8.9187 | 1864 | 0.4558 | 0.8316 | 0.4558 | 0.6751 |
| 0.074 | 8.9282 | 1866 | 0.4751 | 0.8316 | 0.4751 | 0.6893 |
| 0.074 | 8.9378 | 1868 | 0.4951 | 0.8316 | 0.4951 | 0.7036 |
| 0.074 | 8.9474 | 1870 | 0.5102 | 0.7682 | 0.5102 | 0.7143 |
| 0.074 | 8.9569 | 1872 | 0.5153 | 0.7540 | 0.5153 | 0.7178 |
| 0.074 | 8.9665 | 1874 | 0.5067 | 0.7682 | 0.5067 | 0.7118 |
| 0.074 | 8.9761 | 1876 | 0.4836 | 0.8316 | 0.4836 | 0.6954 |
| 0.074 | 8.9856 | 1878 | 0.4619 | 0.8316 | 0.4619 | 0.6796 |
| 0.074 | 8.9952 | 1880 | 0.4437 | 0.8316 | 0.4437 | 0.6661 |
| 0.074 | 9.0048 | 1882 | 0.4308 | 0.8316 | 0.4308 | 0.6564 |
| 0.074 | 9.0144 | 1884 | 0.4215 | 0.8095 | 0.4215 | 0.6492 |
| 0.074 | 9.0239 | 1886 | 0.4077 | 0.8095 | 0.4077 | 0.6386 |
| 0.074 | 9.0335 | 1888 | 0.3968 | 0.8095 | 0.3968 | 0.6300 |
| 0.074 | 9.0431 | 1890 | 0.3876 | 0.8095 | 0.3876 | 0.6226 |
| 0.074 | 9.0526 | 1892 | 0.3784 | 0.8095 | 0.3784 | 0.6151 |
| 0.074 | 9.0622 | 1894 | 0.3706 | 0.7774 | 0.3706 | 0.6087 |
| 0.074 | 9.0718 | 1896 | 0.3673 | 0.7774 | 0.3673 | 0.6060 |
| 0.074 | 9.0813 | 1898 | 0.3659 | 0.7774 | 0.3659 | 0.6049 |
| 0.074 | 9.0909 | 1900 | 0.3695 | 0.7774 | 0.3695 | 0.6078 |
| 0.074 | 9.1005 | 1902 | 0.3812 | 0.8095 | 0.3812 | 0.6174 |
| 0.074 | 9.1100 | 1904 | 0.3987 | 0.8316 | 0.3987 | 0.6314 |
| 0.074 | 9.1196 | 1906 | 0.4148 | 0.8316 | 0.4148 | 0.6441 |
| 0.074 | 9.1292 | 1908 | 0.4325 | 0.8316 | 0.4325 | 0.6577 |
| 0.074 | 9.1388 | 1910 | 0.4424 | 0.8316 | 0.4424 | 0.6651 |
| 0.074 | 9.1483 | 1912 | 0.4411 | 0.8316 | 0.4411 | 0.6642 |
| 0.074 | 9.1579 | 1914 | 0.4289 | 0.8316 | 0.4289 | 0.6549 |
| 0.074 | 9.1675 | 1916 | 0.4194 | 0.8316 | 0.4194 | 0.6476 |
| 0.074 | 9.1770 | 1918 | 0.4122 | 0.8316 | 0.4122 | 0.6420 |
| 0.074 | 9.1866 | 1920 | 0.4083 | 0.8316 | 0.4083 | 0.6390 |
| 0.074 | 9.1962 | 1922 | 0.4119 | 0.8316 | 0.4119 | 0.6418 |
| 0.074 | 9.2057 | 1924 | 0.4207 | 0.8316 | 0.4207 | 0.6486 |
| 0.074 | 9.2153 | 1926 | 0.4302 | 0.8316 | 0.4302 | 0.6559 |
| 0.074 | 9.2249 | 1928 | 0.4441 | 0.8316 | 0.4441 | 0.6664 |
| 0.074 | 9.2344 | 1930 | 0.4495 | 0.8316 | 0.4495 | 0.6705 |
| 0.074 | 9.2440 | 1932 | 0.4518 | 0.8316 | 0.4518 | 0.6721 |
| 0.074 | 9.2536 | 1934 | 0.4535 | 0.8316 | 0.4535 | 0.6734 |
| 0.074 | 9.2632 | 1936 | 0.4544 | 0.8316 | 0.4544 | 0.6741 |
| 0.074 | 9.2727 | 1938 | 0.4530 | 0.8316 | 0.4530 | 0.6730 |
| 0.074 | 9.2823 | 1940 | 0.4398 | 0.8316 | 0.4398 | 0.6631 |
| 0.074 | 9.2919 | 1942 | 0.4269 | 0.8316 | 0.4269 | 0.6534 |
| 0.074 | 9.3014 | 1944 | 0.4145 | 0.8316 | 0.4145 | 0.6438 |
| 0.074 | 9.3110 | 1946 | 0.4061 | 0.8316 | 0.4061 | 0.6373 |
| 0.074 | 9.3206 | 1948 | 0.4005 | 0.8316 | 0.4005 | 0.6328 |
| 0.074 | 9.3301 | 1950 | 0.3957 | 0.8316 | 0.3957 | 0.6290 |
| 0.074 | 9.3397 | 1952 | 0.3932 | 0.8316 | 0.3932 | 0.6270 |
| 0.074 | 9.3493 | 1954 | 0.3966 | 0.8316 | 0.3966 | 0.6297 |
| 0.074 | 9.3589 | 1956 | 0.4049 | 0.8316 | 0.4049 | 0.6363 |
| 0.074 | 9.3684 | 1958 | 0.4139 | 0.8316 | 0.4139 | 0.6434 |
| 0.074 | 9.3780 | 1960 | 0.4138 | 0.8316 | 0.4138 | 0.6432 |
| 0.074 | 9.3876 | 1962 | 0.4120 | 0.8316 | 0.4120 | 0.6419 |
| 0.074 | 9.3971 | 1964 | 0.4103 | 0.8316 | 0.4103 | 0.6406 |
| 0.074 | 9.4067 | 1966 | 0.4080 | 0.8316 | 0.4080 | 0.6388 |
| 0.074 | 9.4163 | 1968 | 0.4028 | 0.8316 | 0.4028 | 0.6347 |
| 0.074 | 9.4258 | 1970 | 0.3913 | 0.8316 | 0.3913 | 0.6255 |
| 0.074 | 9.4354 | 1972 | 0.3783 | 0.8095 | 0.3783 | 0.6151 |
| 0.074 | 9.4450 | 1974 | 0.3668 | 0.7774 | 0.3668 | 0.6057 |
| 0.074 | 9.4545 | 1976 | 0.3582 | 0.7774 | 0.3582 | 0.5985 |
| 0.074 | 9.4641 | 1978 | 0.3554 | 0.7774 | 0.3554 | 0.5962 |
| 0.074 | 9.4737 | 1980 | 0.3548 | 0.7774 | 0.3548 | 0.5956 |
| 0.074 | 9.4833 | 1982 | 0.3569 | 0.7774 | 0.3569 | 0.5974 |
| 0.074 | 9.4928 | 1984 | 0.3592 | 0.7774 | 0.3592 | 0.5993 |
| 0.074 | 9.5024 | 1986 | 0.3627 | 0.7774 | 0.3627 | 0.6023 |
| 0.074 | 9.5120 | 1988 | 0.3692 | 0.7774 | 0.3692 | 0.6076 |
| 0.074 | 9.5215 | 1990 | 0.3786 | 0.7774 | 0.3786 | 0.6153 |
| 0.074 | 9.5311 | 1992 | 0.3870 | 0.8316 | 0.3870 | 0.6221 |
| 0.074 | 9.5407 | 1994 | 0.3958 | 0.8316 | 0.3958 | 0.6292 |
| 0.074 | 9.5502 | 1996 | 0.4045 | 0.8316 | 0.4045 | 0.6360 |
| 0.074 | 9.5598 | 1998 | 0.4092 | 0.8316 | 0.4092 | 0.6397 |
| 0.0539 | 9.5694 | 2000 | 0.4127 | 0.8316 | 0.4127 | 0.6424 |
| 0.0539 | 9.5789 | 2002 | 0.4163 | 0.8316 | 0.4163 | 0.6452 |
| 0.0539 | 9.5885 | 2004 | 0.4178 | 0.8316 | 0.4178 | 0.6464 |
| 0.0539 | 9.5981 | 2006 | 0.4168 | 0.8316 | 0.4168 | 0.6456 |
| 0.0539 | 9.6077 | 2008 | 0.4150 | 0.8316 | 0.4150 | 0.6442 |
| 0.0539 | 9.6172 | 2010 | 0.4154 | 0.8316 | 0.4154 | 0.6445 |
| 0.0539 | 9.6268 | 2012 | 0.4174 | 0.8316 | 0.4174 | 0.6460 |
| 0.0539 | 9.6364 | 2014 | 0.4182 | 0.8316 | 0.4182 | 0.6467 |
| 0.0539 | 9.6459 | 2016 | 0.4209 | 0.8316 | 0.4209 | 0.6487 |
| 0.0539 | 9.6555 | 2018 | 0.4219 | 0.8316 | 0.4219 | 0.6496 |
| 0.0539 | 9.6651 | 2020 | 0.4232 | 0.8316 | 0.4232 | 0.6505 |
| 0.0539 | 9.6746 | 2022 | 0.4271 | 0.8316 | 0.4271 | 0.6536 |
| 0.0539 | 9.6842 | 2024 | 0.4304 | 0.8316 | 0.4304 | 0.6560 |
| 0.0539 | 9.6938 | 2026 | 0.4315 | 0.8316 | 0.4315 | 0.6569 |
| 0.0539 | 9.7033 | 2028 | 0.4325 | 0.8316 | 0.4325 | 0.6576 |
| 0.0539 | 9.7129 | 2030 | 0.4336 | 0.8316 | 0.4336 | 0.6585 |
| 0.0539 | 9.7225 | 2032 | 0.4328 | 0.8316 | 0.4328 | 0.6579 |
| 0.0539 | 9.7321 | 2034 | 0.4322 | 0.8316 | 0.4322 | 0.6574 |
| 0.0539 | 9.7416 | 2036 | 0.4317 | 0.8316 | 0.4317 | 0.6570 |
| 0.0539 | 9.7512 | 2038 | 0.4321 | 0.8316 | 0.4321 | 0.6574 |
| 0.0539 | 9.7608 | 2040 | 0.4317 | 0.8316 | 0.4317 | 0.6571 |
| 0.0539 | 9.7703 | 2042 | 0.4306 | 0.8316 | 0.4306 | 0.6562 |
| 0.0539 | 9.7799 | 2044 | 0.4305 | 0.8316 | 0.4305 | 0.6561 |
| 0.0539 | 9.7895 | 2046 | 0.4304 | 0.8316 | 0.4304 | 0.6561 |
| 0.0539 | 9.7990 | 2048 | 0.4304 | 0.8316 | 0.4304 | 0.6560 |
| 0.0539 | 9.8086 | 2050 | 0.4301 | 0.8316 | 0.4301 | 0.6559 |
| 0.0539 | 9.8182 | 2052 | 0.4297 | 0.8316 | 0.4297 | 0.6555 |
| 0.0539 | 9.8278 | 2054 | 0.4304 | 0.8316 | 0.4304 | 0.6560 |
| 0.0539 | 9.8373 | 2056 | 0.4312 | 0.8316 | 0.4312 | 0.6567 |
| 0.0539 | 9.8469 | 2058 | 0.4325 | 0.8316 | 0.4325 | 0.6577 |
| 0.0539 | 9.8565 | 2060 | 0.4334 | 0.8316 | 0.4334 | 0.6583 |
| 0.0539 | 9.8660 | 2062 | 0.4333 | 0.8316 | 0.4333 | 0.6582 |
| 0.0539 | 9.8756 | 2064 | 0.4320 | 0.8316 | 0.4320 | 0.6573 |
| 0.0539 | 9.8852 | 2066 | 0.4323 | 0.8316 | 0.4323 | 0.6575 |
| 0.0539 | 9.8947 | 2068 | 0.4332 | 0.8316 | 0.4332 | 0.6582 |
| 0.0539 | 9.9043 | 2070 | 0.4338 | 0.8316 | 0.4338 | 0.6586 |
| 0.0539 | 9.9139 | 2072 | 0.4344 | 0.8316 | 0.4344 | 0.6591 |
| 0.0539 | 9.9234 | 2074 | 0.4345 | 0.8316 | 0.4345 | 0.6592 |
| 0.0539 | 9.9330 | 2076 | 0.4353 | 0.8316 | 0.4353 | 0.6597 |
| 0.0539 | 9.9426 | 2078 | 0.4362 | 0.8316 | 0.4362 | 0.6605 |
| 0.0539 | 9.9522 | 2080 | 0.4367 | 0.8316 | 0.4367 | 0.6609 |
| 0.0539 | 9.9617 | 2082 | 0.4377 | 0.8316 | 0.4377 | 0.6616 |
| 0.0539 | 9.9713 | 2084 | 0.4383 | 0.8316 | 0.4383 | 0.6621 |
| 0.0539 | 9.9809 | 2086 | 0.4388 | 0.8316 | 0.4388 | 0.6625 |
| 0.0539 | 9.9904 | 2088 | 0.4391 | 0.8316 | 0.4391 | 0.6626 |
| 0.0539 | 10.0 | 2090 | 0.4391 | 0.8316 | 0.4391 | 0.6626 |
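The Mse and Rmse columns in the table above follow the usual relationship RMSE = √MSE, and the Validation Loss column matches Mse row for row, which suggests a mean-squared-error training objective. A minimal Python spot-check of the final row, assuming the standard definitions:

```python
import math

# Final evaluation row of the table above (epoch 10.0, step 2090).
mse, rmse = 0.4391, 0.6626

# RMSE is the square root of MSE; the table rounds both to four decimals.
assert math.isclose(math.sqrt(mse), rmse, abs_tol=5e-4)
```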
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
isspek/xlnet-base-cased_non_epidemic_4_2e-5_16 | isspek | 2024-11-24T09:18:06Z | 117 | 0 | transformers | [
"transformers",
"safetensors",
"xlnet",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:17:49Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
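Pending author-provided instructions, the following is a minimal sketch of loading this checkpoint with the standard `transformers` auto classes. The model ID comes from this card's repository name; the expected inputs and the meaning of the output labels are not documented in this card.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Repository name of this card; the checkpoint is an XLNet text classifier.
model_id = "isspek/xlnet-base-cased_non_epidemic_4_2e-5_16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Wrap in a text-classification pipeline and score a sample input.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Example input text to classify."))
```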
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/xlnet-base-cased_non_epidemic_5_2e-5_16 | isspek | 2024-11-24T09:17:31Z | 119 | 0 | transformers | [
"transformers",
"safetensors",
"xlnet",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:17:17Z | ---
library_name: transformers
tags: []
---
|
isspek/xlnet-base-cased_non_epidemic_1_2e-5_16 | isspek | 2024-11-24T09:16:53Z | 118 | 0 | transformers | [
"transformers",
"safetensors",
"xlnet",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:16:37Z | ---
library_name: transformers
tags: []
---
|
isspek/xlnet-base-cased_non_epidemic_global_warning_3_2e-5_16 | isspek | 2024-11-24T09:14:31Z | 117 | 0 | transformers | [
"transformers",
"safetensors",
"xlnet",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:14:11Z | ---
library_name: transformers
tags: []
---
|
isspek/roberta-base_non_epidemic_global_warning_5_2e-5_16 | isspek | 2024-11-24T09:08:21Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:08:06Z | ---
library_name: transformers
tags: []
---
|
isspek/roberta-base_non_epidemic_global_warning_1_2e-5_16 | isspek | 2024-11-24T09:07:36Z | 179 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:07:20Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_non_epidemic_1_2e-5_16 | isspek | 2024-11-24T09:04:35Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T09:04:13Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
AmberYifan/llama2-7b-sft-ultrachat-safeRLHF | AmberYifan | 2024-11-24T09:04:10Z | 877 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:AmberYifan/llama2-7b-sft-ultrachat",
"base_model:finetune:AmberYifan/llama2-7b-sft-ultrachat",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T06:38:21Z | ---
base_model: AmberYifan/llama2-7b-sft-ultrachat
library_name: transformers
model_name: llama2-7b-sft-ultrachat-safeRLHF
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for llama2-7b-sft-ultrachat-safeRLHF
This model is a fine-tuned version of [AmberYifan/llama2-7b-sft-ultrachat](https://huggingface.co/AmberYifan/llama2-7b-sft-ultrachat).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="AmberYifan/llama2-7b-sft-ultrachat-safeRLHF", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/yifanwang/huggingface/runs/8oph7peu)
This model was trained with SFT.
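As a rough sketch of what such an SFT run looks like with TRL (not the authors' exact script: the dataset below is a placeholder, since this card does not publish the training data or hyperparameters):
```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder conversational dataset; the actual training data is not published here.
dataset = load_dataset("trl-lib/Capybara", split="train")

trainer = SFTTrainer(
    model="AmberYifan/llama2-7b-sft-ultrachat",  # the base model named above
    train_dataset=dataset,
    args=SFTConfig(output_dir="llama2-7b-sft-ultrachat-safeRLHF"),
)
trainer.train()
```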
### Framework versions
- TRL: 0.12.0
- Transformers: 4.46.3
- Pytorch: 2.1.2+cu121
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
litagin/anime-whisper | litagin | 2024-11-24T08:56:57Z | 2,291 | 52 | transformers | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"anime",
"japanese",
"ja",
"dataset:litagin/Galgame_Speech_ASR_16kHz",
"dataset:OOPPEENN/Galgame_Dataset",
"base_model:kotoba-tech/kotoba-whisper-v2.0",
"base_model:finetune:kotoba-tech/kotoba-whisper-v2.0",
"license:mit",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-10T06:08:07Z | ---
library_name: transformers
base_model: kotoba-tech/kotoba-whisper-v2.0
datasets:
- litagin/Galgame_Speech_ASR_16kHz
- OOPPEENN/Galgame_Dataset
language:
- ja
pipeline_tag: automatic-speech-recognition
tags:
- whisper
- anime
- japanese
license: mit
---
# Anime Whisper 🤗🎤📝
**Anime Whisper** is a Japanese speech recognition model specialized for the domain of anime-style acted dialogue.
It was fine-tuned from the base model [kotoba-whisper-v2.0](https://huggingface.co/kotoba-tech/kotoba-whisper-v2.0) on [Galgame_Speech_ASR_16kHz](https://huggingface.co/datasets/litagin/Galgame_Speech_ASR_16kHz), an anime-style speech and script dataset of roughly 5,300 hours and 3.73 million files.
While it specializes in the anime acted-speech domain, it also offers characteristics and strong performance on other audio that other models lack.
A demo you can try casually is available here: https://huggingface.co/spaces/litagin/anime-whisper-demo
## Caution ❗
**This model appears to interact poorly with an initial prompt: using one causes hallucinations and significantly degraded quality. Please avoid setting an initial prompt.**
## Features 🌟
Compared with other models, Anime Whisper generally shows the following tendencies:
- Fewer hallucinations
- Faithfully transcribes hesitations and non-verbal vocalizations (laughter, shouts, breaths, etc.) that other models tend to skip
- Punctuation (「。、!?…」) is placed appropriately to match the rhythm and emotion of the speech, producing a natural style that reads like a dialogue script
- Especially high accuracy on anime-style acted dialogue
- Lightweight and fast, being based on [kotoba-whisper](https://huggingface.co/kotoba-tech/kotoba-whisper-v2.0) (a distillation of [whisper-large-v3](https://huggingface.co/openai/whisper-large-v3))
- Can transcribe NSFW audio, which is nearly impossible for other models, in a proper written style
## Usage Example 🚀
```python
import torch
from transformers import pipeline
generate_kwargs = {
"language": "Japanese",
"no_repeat_ngram_size": 0,
"repetition_penalty": 1.0,
}
pipe = pipeline(
"automatic-speech-recognition",
model="litagin/anime-whisper",
device="cuda",
torch_dtype=torch.float16,
chunk_length_s=30.0,
batch_size=64,
)
audio_path = "test.wav"
result = pipe(audio_path, generate_kwargs=generate_kwargs)
print(result["text"])
```
- To run inference on multiple files at once, simply pass a list of file paths to `pipe` (see the sketch below).
- If repetition hallucinations are noticeable, they can be suppressed by setting `no_repeat_ngram_size` above to around 5-10, or by setting `repetition_penalty` above 1.
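For instance, reusing the `pipe` object from the example above (the file names here are placeholders):
```python
# Batch inference over several files, with stronger repetition suppression.
audio_paths = ["scene1.wav", "scene2.wav", "scene3.wav"]
results = pipe(
    audio_paths,
    generate_kwargs={
        "language": "Japanese",
        "no_repeat_ngram_size": 7,  # roughly 5-10 suppresses repeated n-grams
        "repetition_penalty": 1.1,  # values above 1 also damp repetition
    },
)
for path, result in zip(audio_paths, results):
    print(path, result["text"])
```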
## Evaluation 📊
**A detailed evaluation and observation report, along with the evaluation code, will be published in the [GitHub repository](https://github.com/litagin02/anime-whisper).**
### CER (Character Error Rate)
- Evaluated on five personally owned novel games (about 75k files in total) that are in the same anime-dialogue domain as the training data but are **not included in it**
- Generated with `no_repeat_ngram_size=5` to suppress the repetition hallucinations of OpenAI's Whisper family
- CER is computed against appropriately normalized text (a minimal computation sketch follows the chart below)

<details>
<summary>Table</summary>
| Model | game1 | game2 | game3 | game4 | game5 | avg |
| --- | --- | --- | --- | --- | --- | --- |
| [openai/whisper-large](https://huggingface.co/openai/whisper-large) | 15.11 | 20.24 | 14.89 | 17.95 | 19.37 | 17.5 |
| [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) | 15.11 | 20.12 | 14.83 | 17.65 | 18.59 | 17.3 |
| [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3) | 14.60 | 18.66 | 14.43 | 17.29 | 17.74 | 16.5 |
| [openai/whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo) | 15.18 | 19.24 | 14.43 | 17.38 | 18.15 | 16.9 |
| [reazon-research/reazonspeech-nemo-v2](https://huggingface.co/reazon-research/reazonspeech-nemo-v2) | 23.92 | 25.08 | 20.29 | 25.91 | 22.71 | 23.6 |
| [nvidia/parakeet-tdt_ctc-0.6b-ja](https://huggingface.co/nvidia/parakeet-tdt_ctc-0.6b-ja) | 17.67 | 20.44 | 15.33 | 19.60 | 19.86 | 18.6 |
| [kotoba-tech/kotoba-whisper-v1.0](https://huggingface.co/kotoba-tech/kotoba-whisper-v1.0) | 16.62 | 21.54 | 16.42 | 19.83 | 20.01 | 18.9 |
| [kotoba-tech/kotoba-whisper-v2.0](https://huggingface.co/kotoba-tech/kotoba-whisper-v2.0) | 16.38 | 21.51 | 16.51 | 19.69 | 20.04 | 18.8 |
| **Anime Whisper** | 11.32 | 16.52 | 11.16 | 12.78 | 13.23 | 13.0 |
</details>
## Biases 🚨
- When proper nouns such as character names occur in the training data's visual novels, they are often transcribed with the kanji used in that game
- Some other specific words from the dataset may be transcribed differently from the usual form (e.g. `からだ` → `身体`, other proper nouns, etc.)
- Due to the [dataset normalization](https://huggingface.co/datasets/litagin/Galgame_Speech_ASR_16kHz#modifications), the following almost never appear in the output:
  - Runs of vowels or long-vowel marks: `ああああーーーー`
  - Runs of the same exclamation mark: `こらーっ!!!!` `なにそれ!?!?!?!?`
  - Runs of ellipsis leaders: `……` (in Japanese orthography the two-character `……` is correct, but output almost always contains a single `…`)
- Digits, Latin letters, and exclamation marks are transcribed in half-width form
- The sentence-final 「。」 is almost always omitted
- Some vulgar words may be transcribed with the masking character 「○」
## Examples 👀
Transcription comparisons on dialogue from novel games **not included in the training data**, the same games as in the evaluation above (likewise generated with `no_repeat_ngram_size=5`).
Overall the results are roughly on par with whisper-large-v3; below are a few excerpts where the gap with other models is pronounced (especially non-verbal vocalizations, emotional speech, etc.).
| **Ground truth** | **Anime Whisper** | whisper-large-v3 | kotoba-whisper-v2.0 | reazonspeech-nemo |
| --- | --- | --- | --- | --- |
| あわわわっ!わわわわっ! | はわわっ、わわわわっ…! | ああああああああああ | うわうわ | うわ! |
| そっ、そっか……。………。……そうなんだ。 | そっ…そっか…そうなんだ… | そっか…そうなんだ… | そっか…そうなんだ | そっそっかあっそうなんだ。 |
| たぶん、ぼくが勝つ、はず | たぶん、ボクが勝つ、はず | 多分、僕が勝つはず。 | 多分僕が勝つはず | 僕が勝つはず。 |
| げ、げほっ……なんだこいつ! | げほっ、げほっ…なんだ、こいつ… | なんだ、こいつ… | なんだこいつ | フッ何だこいつ。 |
| はっ、はい。そうです。……その、えっと。へっ、変だったでしょうか? | は、はい、そうです…その、えと…へ、変だったでしょうか…? | あ、はい、そうです。そ、えっと、へ、変だったでしょうか。 | はいそうですそういと変だったでしょうか | あっはいそうですうすえっとへ変だったでしょうか? |
| ぶぶぶぶ豚クソがァァァ!待てコルァァァ! | ぶぶぶぶぶ、ぶたくそがー!待てごらぁぁ! | 待てこらー | 待てこそか | 待てこら! |
| 地面が揺れるとかありえ……ぎゃっ! | 地面が揺れるとかありえ…ひゃっ!? | 地面が揺れるとかありえ? | 地面が揺れるとかありえ | やっ! |
| きゃっほう!い、いたっ、いただきまーす! | きゃっほう!い、いた、いただきまーす! | キャッホー!い、いただきます! | キャホー!いただきます! | いいたいただきます! |
| ……っ、はぁ……わ、わたし、今日は…… | んっ、はぁ…わ、私、今日は… | 私、今日は… | 私今日は | えっと私今日。 |
| ……ぷふっ、ンッ。かっ、かっ、かっ……ぷふっ。かっ。んふふっ。かっ、価値観 | うふふっ…か、かはっ…ぷっ…はぁっ…か、価値観っ… | 価値観! | 価値観 | ハッかちかん! |
| か、痒くもねぇ……こんなんんん……! | か、痒くもねえ…こんな、んんっ…! | か、回復もねぇ、こんな、うぬぅ | かかゆくもねえこんな | かゆくもねえこんなうう。 |
| ひゃっ!や、やだ、くすぐった……や、やっ、あは、あははっ | ひゃうっ!やっ、やだっ…くすぐったっ…やっ、やっ、はんっ、あははっ! | やだ!すぐだ! | やだ | やっほ! |
| ふえぇ、急に止まらないでよう…… | ふえぇ、急に止まらないでよぉ | おへぇ、急に止まらないでよ | おへえ急に止まらないでよ | 急に止まらないでよ。 |
| ごごご50キロもないです私ー! | ごごご50キロもないです私ー! | 50キロもないです私! | 550キロもないです私 | 50キロもないですわたし! |
| いいい、すびばぜん、すびばぜーんっ | いいずびばぜんずびばぜーん! | いいいい! ズビバル10! ズビブル10! | いいズビバーテン! | すみませんすみません。 |
| 間抜けか貴様ァァァ! | 間抜けか貴様ぁぁっ! | マヌケカキ様! | まぬけかきさま | 抜けか貴様! |
| ぷ、くく……ひっ、ひいっ…… | くっ…くくくっ…ぷっ…くくっ… | ご視聴ありがとうございました | フッ | フフフフ。フフフフフ。 |
| キミは……。あっ、はっ……。最初から……あんっ、あっ、容赦がないな | 君はぁ…はぁっ、はぁっ…最初から…あんっ、あっ、容赦がないなぁ… | 君は……最初から容赦がないな | 君は最初からあんあ容赦がないな | 君は最初からうっうん容赦がないなあ。 |
| 望んでるわけ……。のっ、のっ、のっ……望んでるんです。世界が終わればいいって……強く、強くっ。はぁっ、はぁっ | 望んでるわけ…の、の、の…望んでるんです…世界が終わればいいって、強く、強く…はぁっ | 望んでるわけ…望んでるんです…世界が終わればいいって…強く…強く… | 望んでるわけ…ののぞんでるんです世界が終わればいいって強く強く | ん?望んでるんです。世界が終わればいいって強く強く。 |
### NSFW Examples 🫣
The following contain adult content; viewer discretion is advised.
<details>
<summary>Moaning</summary>
| **Ground truth** | **Anime Whisper** | whisper-large-v3 | kotoba-whisper-v2.0 | reazonspeech-nemo |
| --- | --- | --- | --- | --- |
| ひっ、あっ!あぅっ、ああぁぁあぁぁぁぁぁっ!はっ、はっ、はっ、はっ、ひぁっ! | んぁっ、あっ、あっ、ああぁぁっ!あっ、はぁっ、はぁっ…んっ、ふぁああっ! | ご視聴ありがとうございました | アハハハ | うわ! |
| ち、ちがっ……んっ、あぁぁ……気持ちいい、わけが……あぁっ、やぁっ、待てと……んんっ、はぁ……あふぅっ…… | ち、ちがっ…はぁっ、はぁっ、気持ちいい、わけがっ…あっ、やぁっ、待てとっ…んくっ、はぁ、はぁっ… | ち、ちが…気持ちいいわけが…待てと… | ちちが気持ちいいわけが待てと | ち違うはあ気持ちいいわけが待てとあっ。 |
| あんっ!あっ、あっ……そっ、それ……あっ、はぁはぁはぁ。ンンンンッ!ぴっ、ぴりぴり、ってして……。あんっ!はぁはぁはぁ、きっ、きもち……いいです! | ふぁんっ!あっ、あぁっ!そっ、それっ…あっ、はぁっ、はぁっ…んんっ!ぴ、ぴりぴりって、して…ひぁっ!はっ、はぁ、はぁっ…!き、気持ち、いいですっ…! | それ…フィリフィリでした…気持ちいいです… | それフィリフィリフリでした | けきもしいいです! |
| その調子って……んんっ、こんなの、あぁっ、んっあぁん……んんっ、しょっ……あぁっ……だめ……んっ、あぁっ…… | その調子って…んんっ、こんなの…はぁっ、んんっ…んっ、しょっ…はぁっ…ダメ…んっ、あっ… | その調子って…こんなの…ダメ… | その調子ってこんなの | その調子ってううんこんなのダメうん |
| はぁっ、あっ……んっ……くぅ、あぁっ……やぁ……それは、ん、はぁ……だめ、だ……あっ、んんっ、ふ……ひぃうっ!やめっ……ま、待ってくれ……あぁん……! | はぁっ、あっ、くぅぅっ…あっ、やっ、それはっ…はぁっ、ダメだっ…んんっ…ひぅぅんっ!やめっ…ま、待ってくれっ…あぁぁっ! | それは、ダメだ、やめ、待ってくれ | それはそれはダメだやめやめまってくれ | やめま待ってくれうう。 |
| あは、はっ……んっ、くうっ……なん、だろこれ……気持ちいい、かも……んっ、あ、ああっ、はあっ、ふあぁ……やっ、くぅん | はぁっ、はぁっ、んっ…くぅっ…なん、だろこれ…気持ちいい、かも…んんっ、あっ、ああっ…ふぁぁっ、はやっ…んんっ… | あ、あ、あ、んっ、う、なんだろこれ、気持ちいいかも、あ、あ,あ、あ、う、うんっ | なんだろうこれ気持ちいいくも | うっなんだろうこれ。はあ気持ちいいかも。うわ!ううん。 |
| だめ、センパイ……そんなにおち○ちん挿れたら、だめだぁっ……あっ、あぁぁぁっ……! | だめ、先輩…んっ、そんなに、おち○ちん挿れたら、だめ…はぁ、あぁぁ…っ | ダメ、先輩…そんなに陥れたらダメ… | ダメ先輩そんなに落ち入れたらダメな | ダメ先輩そんなに気入れたらダメだ。 |
| やぁぁっ、こ、こらっ、おち○ちん、そんなに、びくびくさせないのっ……あぁっ、ひぃあぁぁっ……はぁっ、あぁっ、あぁぁぁんっ!! | ひゃんっ!こ、こらっ、おち○ちん、そんなにビクビクさせないのっ!ひぁっ、あっ、はぁっ、はぁっ! | いや、こ、こら、おじっちそんなにビクビクさせないの?いや、なにやろ | ここらじっちそんなにビクビクさせないの | もう全然そんなにビクビクさせないのうん! |
| やっ……あっ。……お兄ちゃんの舌が、あっ、中で、やあっ。……そんなりぐりぐりしちゃ、あっ、ふあっ。うくぅぅっ、ああっ、やあっ。 | やっ、あっ、お兄ちゃんの舌が、中で…やぁっ、そんなにぐりぐりしちゃ…あっ、あっ、んっ、ふあぁっ、やぁぁっ…! | にゃー!お兄ちゃんの舌がお腹で…にゃー!そんなにグリグリした…にゃー!! | お兄ちゃんの下がお腹でニャーそんなにグリグリした | お兄ちゃんの舌がおなかでよそんなにグイグイさあぐっにゃん! |
| はっ、激しく……して。ンッ。あっ!はあっ、はあっ……わっ、私を……一気に……ンッ。イッ、イかせちゃってくださいッ! | は、激しく、して…んっ、あぅっ…私を、一気に…い、イかせちゃってください…! | あ、ゲンシ君、ステッ、アッ、アッ…私を一気に、行かせてあげください! | あげんしくして私は一気に行かせてください | 激しく私も一輝行かせちゃってください! |
</details>
<details>
<summary>Sucking sounds</summary>
| **Ground truth** | **Anime Whisper** | whisper-large-v3 | kotoba-whisper-v2.0 | reazonspeech-nemo |
| --- | --- | --- | --- | --- |
| れろっ、んっ……れろ、ちゅ、んちゅ | れろっ、れろっ、ちゅううっ | ううううう | わいしゅう | シュッ! |
| はっ、はい!んっ、れろっ、れろっ……あっ、れろっ | は、はい…っ、れろぉ…っ、れりゅっ、れりょっ… | わ、はぁい、わ、う、う、わ、へ、へ、へ | わあはい | はい。 |
| れろっ、れろ……むふふ、ここの線なぞると反応いいね、んちゅ、ちゅうっ……ここいい?どう? | れろれろれろっ…んっ、ふふっ、ここの線なぞると反応いいね…ちゅっ、ちゅっ…ここいい?どう? | ここの線なぞると反応いいねここいい?どう? | ここの線なぞると反応いいねうんふうに | へへへここの線なぞると反応いいねここいい?どう? |
| あぁむ……ちゅ……れぇろれろ……ん……ん……ちゅ……れぇろ……んん……ちゅぅ……ちゅぱっ……れぇろれろ…… | あむちゅっ…れろれろっ…んちゅっ、れろっ…ちゅぱちゅぷっ…れろぉっ… | アムー… | あん | おへん。 |
| んちゅっ……れろれろ……れぇろ、ちゅっ、んれぇろれろ……ちゅっ、ちゅぱっ…… | んちゅっ、れろれろっ、ちゅぱちゅぅっ…れろれろ、ちゅっ…ちゅぷっ… | お疲れ様でした | おくぬかんぱい | う。 |
| ん……イク……ちゅるぅ……イッちゃう……ん……あぁっ、ちゅるるっ、イク……もう……らめぇ……んあぁむ……イク……イクぅぅ…… | もう、イクっ…イッちゃう…んっ、んっ、じゅるるっ、イクっ、らめっ…んぁっ、イクッ、イクッ! | おーまいごーおまいごーまいごやめまいごよこー | お前 | ママペイ君! |
| れぇろ…………んちゅ……れろれろ……ん……ちゅ……れろれろ……んれぇろれろ……ちゅ…… | れろぉ…んちゅ、れろれろ…ちゅぱ…れろ、れろれちゅ… | エル…ラ…ル…ア…エル…ル…ツ…ン…エ…エル…ツ…ル…ア...エル…ル...プ… | えぇぇ | |
| はぷっ、ちゅぷ、んん……はやく、おっきくして……ちんぽ……れろっ、ちゅ、ぴちゅ、ちゅぱっ……はやく勃起ちんぽちょうだい、勃起ちんぽ私にちょうだい | じゅぷっ、じゅぼっ!早くおっきくしてっ、ちんぽっ!んじゅるるるるるっ!はやくっ、はやく勃起ちんぽちょうらいっ、勃起ち○ぽあたしにちょうだいっ! | 早く起きこして!チンポン!早く、早くポッキチンポンちょうだい! ポッキチンパン私にちょうだい!! | 早く大きくしてチンポン早くポッキ全部全部私にちょうだい | 早くおっきい子して。チープ!ん?早く早くボケ全部ちょうだい。ボケ全部私にちょうだい! |
| そっ、それじゃ……。あっ、はっ……がっ、がんばるぞ。ンッ!ああああっ!あっ、わっ、ボクも……んちっ、んむっ、んむっ、んんっ、むむっ。 | そ、それじゃあ…はぁ、はぁ、が、頑張るぞ…んっ、あっ、あっ、も、ボクも…れろ、ちゅ、ちゅぱ、ちゅるるっ | それじゃあ、頑張るぞ! | それじゃあ頑張るぞ | そそれじゃあううがんばるぞ。 |
| はむ、ちゅ、んんっ、れる……。んむっ、ふーっ、ふーっ。ここなんへ、ろうかひら?ちゅっ……じゅっ。……じゅるる。んっ、。 | はむ…ちゅ、んんっ…ん、はむ…ここなんへ、どうかしら…ちゅっ、ちゅるるっ… | ここな…廊下平… | ここな廊下平 | ん。ん?ここな?どうかしら。んっ。 |
</details>
## Training Procedure 📚
**The detailed training procedure, hyperparameters, and training code will eventually be published on [GitHub](https://github.com/litagin02/anime-whisper).**
- The very last tar file of the dataset was held out as test data; training used the remaining 3,735,363 files
- Starting from the base model, the encoder was first frozen and only the decoder was trained for several epochs
- The encoder was then unfrozen and the whole model was trained for several more epochs
- After training was stopped, performance was improved by taking the average (merge) of the model between two points in time; the merge range was optimized with Optuna against CER on the benchmark data, and the result became the final model (a sketch of this averaging follows the list)
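A minimal sketch of that averaging step (checkpoint paths and the uniform weighting are placeholders; the actual merge range was tuned with Optuna):
```python
import torch
from transformers import WhisperForConditionalGeneration

checkpoints = ["ckpt-a", "ckpt-b", "ckpt-c"]  # placeholder checkpoint directories
models = [WhisperForConditionalGeneration.from_pretrained(p) for p in checkpoints]

# Uniform average of all parameters and buffers across the checkpoints.
averaged = {
    key: torch.stack([m.state_dict()[key].float() for m in models]).mean(dim=0)
    for key in models[0].state_dict()
}
merged = models[0]
merged.load_state_dict(averaged)
merged.save_pretrained("merged-final")
```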
### Environment 🖥
- Trained through trial and error for a little under three weeks in total on an H100 NVL (96 GB VRAM) rented out of pocket on [vast.ai](https://vast.ai/) (this includes early runs that used whisper-large-v3-turbo as the base model)
- The training time actually spent on this model is roughly H100 NVL × 11.2 days (later checkpoints performed worse on the test data, presumably due to overfitting, and were not used in the final merge)
harxsan/qwen-1.5B-16bit | harxsan | 2024-11-24T08:35:13Z | 90 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-20T15:59:44Z | ---
base_model: unsloth/qwen2.5-1.5b
language:
- en
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
---
# Uploaded model
- **Developed by:** harxsan
- **License:** apache-2.0
- **Finetuned from model :** unsloth/qwen2.5-0.5b-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
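A minimal sketch of loading this checkpoint with Unsloth for fast inference (`max_seq_length` is an illustrative choice, not a published setting):
```python
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="harxsan/qwen-1.5B-16bit",
    max_seq_length=2048,   # illustrative; not a published setting
    load_in_4bit=False,    # this repo is a 16-bit export
)
FastLanguageModel.for_inference(model)  # enable Unsloth's fast inference path
```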
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth) |
MayBashendy/ASAP_FineTuningBERT_AugV4_k5_task1_organization_fold0 | MayBashendy | 2024-11-24T08:33:09Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T07:54:25Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k5_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k5_task1_organization_fold0
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7141
- Qwk: 0.5246
- Mse: 0.7141
- Rmse: 0.8451
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them to code follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
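Expressed as `TrainingArguments`, the listed values map roughly to the sketch below; anything not listed above is left at its default.
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k5_task1_organization_fold0",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```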
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:-------:|:------:|
| No log | 0.0039 | 2 | 10.3231 | 0.0012 | 10.3231 | 3.2130 |
| No log | 0.0079 | 4 | 9.0408 | 0.0 | 9.0408 | 3.0068 |
| No log | 0.0118 | 6 | 7.4538 | 0.0 | 7.4538 | 2.7302 |
| No log | 0.0157 | 8 | 6.6306 | 0.0 | 6.6306 | 2.5750 |
| No log | 0.0196 | 10 | 5.7346 | 0.0349 | 5.7346 | 2.3947 |
| No log | 0.0236 | 12 | 4.7638 | 0.0115 | 4.7638 | 2.1826 |
| No log | 0.0275 | 14 | 3.9586 | 0.0039 | 3.9586 | 1.9896 |
| No log | 0.0314 | 16 | 3.1861 | 0.0 | 3.1861 | 1.7850 |
| No log | 0.0354 | 18 | 2.6209 | 0.0080 | 2.6209 | 1.6189 |
| No log | 0.0393 | 20 | 2.0064 | 0.0372 | 2.0064 | 1.4165 |
| No log | 0.0432 | 22 | 1.4451 | 0.0344 | 1.4451 | 1.2021 |
| No log | 0.0472 | 24 | 1.0854 | 0.0212 | 1.0854 | 1.0418 |
| No log | 0.0511 | 26 | 0.8846 | 0.1124 | 0.8846 | 0.9405 |
| No log | 0.0550 | 28 | 0.7862 | 0.1360 | 0.7862 | 0.8867 |
| No log | 0.0589 | 30 | 0.7806 | 0.0521 | 0.7806 | 0.8835 |
| No log | 0.0629 | 32 | 0.8402 | 0.0521 | 0.8402 | 0.9166 |
| No log | 0.0668 | 34 | 0.8707 | 0.0521 | 0.8707 | 0.9331 |
| No log | 0.0707 | 36 | 0.9418 | 0.0348 | 0.9418 | 0.9705 |
| No log | 0.0747 | 38 | 0.9589 | 0.0348 | 0.9589 | 0.9793 |
| No log | 0.0786 | 40 | 0.8128 | 0.0521 | 0.8128 | 0.9016 |
| No log | 0.0825 | 42 | 0.7976 | 0.0521 | 0.7976 | 0.8931 |
| No log | 0.0864 | 44 | 0.8807 | 0.0348 | 0.8807 | 0.9385 |
| No log | 0.0904 | 46 | 0.8433 | 0.0348 | 0.8433 | 0.9183 |
| No log | 0.0943 | 48 | 0.8589 | 0.0348 | 0.8589 | 0.9268 |
| No log | 0.0982 | 50 | 0.7681 | 0.0444 | 0.7681 | 0.8764 |
| No log | 0.1022 | 52 | 0.8269 | 0.3320 | 0.8269 | 0.9093 |
| No log | 0.1061 | 54 | 0.9483 | 0.1476 | 0.9483 | 0.9738 |
| No log | 0.1100 | 56 | 0.7166 | 0.3553 | 0.7166 | 0.8465 |
| No log | 0.1139 | 58 | 0.7334 | 0.0657 | 0.7334 | 0.8564 |
| No log | 0.1179 | 60 | 0.7373 | 0.0796 | 0.7373 | 0.8587 |
| No log | 0.1218 | 62 | 0.6761 | 0.2314 | 0.6761 | 0.8223 |
| No log | 0.1257 | 64 | 0.6584 | 0.3490 | 0.6584 | 0.8114 |
| No log | 0.1297 | 66 | 0.7923 | 0.3240 | 0.7923 | 0.8901 |
| No log | 0.1336 | 68 | 0.8229 | 0.3175 | 0.8229 | 0.9071 |
| No log | 0.1375 | 70 | 0.6617 | 0.3712 | 0.6617 | 0.8134 |
| No log | 0.1415 | 72 | 0.6208 | 0.3498 | 0.6208 | 0.7879 |
| No log | 0.1454 | 74 | 0.6182 | 0.3672 | 0.6182 | 0.7863 |
| No log | 0.1493 | 76 | 0.6346 | 0.3624 | 0.6346 | 0.7966 |
| No log | 0.1532 | 78 | 0.6647 | 0.3749 | 0.6647 | 0.8153 |
| No log | 0.1572 | 80 | 0.6462 | 0.3865 | 0.6462 | 0.8039 |
| No log | 0.1611 | 82 | 0.6135 | 0.3485 | 0.6135 | 0.7832 |
| No log | 0.1650 | 84 | 0.6242 | 0.2779 | 0.6242 | 0.7901 |
| No log | 0.1690 | 86 | 0.6501 | 0.2764 | 0.6501 | 0.8063 |
| No log | 0.1729 | 88 | 0.6378 | 0.3621 | 0.6378 | 0.7986 |
| No log | 0.1768 | 90 | 0.6523 | 0.3949 | 0.6523 | 0.8076 |
| No log | 0.1807 | 92 | 0.7601 | 0.4141 | 0.7601 | 0.8718 |
| No log | 0.1847 | 94 | 1.2787 | 0.3443 | 1.2787 | 1.1308 |
| No log | 0.1886 | 96 | 1.2864 | 0.3438 | 1.2864 | 1.1342 |
| No log | 0.1925 | 98 | 0.8078 | 0.3996 | 0.8078 | 0.8988 |
| No log | 0.1965 | 100 | 0.7401 | 0.3452 | 0.7401 | 0.8603 |
| No log | 0.2004 | 102 | 0.8650 | 0.3899 | 0.8650 | 0.9300 |
| No log | 0.2043 | 104 | 0.8730 | 0.3770 | 0.8730 | 0.9343 |
| No log | 0.2083 | 106 | 0.8568 | 0.3901 | 0.8568 | 0.9256 |
| No log | 0.2122 | 108 | 0.7161 | 0.4354 | 0.7161 | 0.8462 |
| No log | 0.2161 | 110 | 0.6627 | 0.4625 | 0.6627 | 0.8141 |
| No log | 0.2200 | 112 | 0.6573 | 0.4769 | 0.6573 | 0.8107 |
| No log | 0.2240 | 114 | 0.6558 | 0.4547 | 0.6558 | 0.8098 |
| No log | 0.2279 | 116 | 0.6909 | 0.4471 | 0.6909 | 0.8312 |
| No log | 0.2318 | 118 | 0.8626 | 0.4188 | 0.8626 | 0.9288 |
| No log | 0.2358 | 120 | 0.9443 | 0.4074 | 0.9443 | 0.9717 |
| No log | 0.2397 | 122 | 0.8417 | 0.4634 | 0.8417 | 0.9175 |
| No log | 0.2436 | 124 | 0.7527 | 0.4861 | 0.7527 | 0.8676 |
| No log | 0.2475 | 126 | 0.6732 | 0.4896 | 0.6732 | 0.8205 |
| No log | 0.2515 | 128 | 0.6308 | 0.4966 | 0.6308 | 0.7943 |
| No log | 0.2554 | 130 | 0.6427 | 0.4266 | 0.6427 | 0.8017 |
| No log | 0.2593 | 132 | 0.6440 | 0.4176 | 0.6440 | 0.8025 |
| No log | 0.2633 | 134 | 0.5872 | 0.4610 | 0.5872 | 0.7663 |
| No log | 0.2672 | 136 | 0.6025 | 0.5151 | 0.6025 | 0.7762 |
| No log | 0.2711 | 138 | 0.6107 | 0.4808 | 0.6107 | 0.7815 |
| No log | 0.2750 | 140 | 0.5828 | 0.4926 | 0.5828 | 0.7634 |
| No log | 0.2790 | 142 | 0.5725 | 0.5058 | 0.5725 | 0.7566 |
| No log | 0.2829 | 144 | 0.5811 | 0.5155 | 0.5811 | 0.7623 |
| No log | 0.2868 | 146 | 0.5923 | 0.5485 | 0.5923 | 0.7696 |
| No log | 0.2908 | 148 | 0.6674 | 0.5389 | 0.6674 | 0.8170 |
| No log | 0.2947 | 150 | 0.6884 | 0.5404 | 0.6884 | 0.8297 |
| No log | 0.2986 | 152 | 0.7312 | 0.5577 | 0.7312 | 0.8551 |
| No log | 0.3026 | 154 | 1.0047 | 0.4597 | 1.0047 | 1.0023 |
| No log | 0.3065 | 156 | 1.1639 | 0.4227 | 1.1639 | 1.0788 |
| No log | 0.3104 | 158 | 0.8552 | 0.5034 | 0.8552 | 0.9248 |
| No log | 0.3143 | 160 | 0.6546 | 0.5589 | 0.6546 | 0.8091 |
| No log | 0.3183 | 162 | 0.7863 | 0.4637 | 0.7863 | 0.8867 |
| No log | 0.3222 | 164 | 0.7504 | 0.4751 | 0.7504 | 0.8662 |
| No log | 0.3261 | 166 | 0.5994 | 0.5534 | 0.5994 | 0.7742 |
| No log | 0.3301 | 168 | 0.6171 | 0.5565 | 0.6171 | 0.7856 |
| No log | 0.3340 | 170 | 0.5929 | 0.5538 | 0.5929 | 0.7700 |
| No log | 0.3379 | 172 | 0.6902 | 0.5037 | 0.6902 | 0.8308 |
| No log | 0.3418 | 174 | 0.6827 | 0.5006 | 0.6827 | 0.8263 |
| No log | 0.3458 | 176 | 0.5947 | 0.5273 | 0.5947 | 0.7712 |
| No log | 0.3497 | 178 | 0.5617 | 0.5424 | 0.5617 | 0.7494 |
| No log | 0.3536 | 180 | 0.6162 | 0.4913 | 0.6162 | 0.7850 |
| No log | 0.3576 | 182 | 0.5855 | 0.5312 | 0.5855 | 0.7652 |
| No log | 0.3615 | 184 | 0.5703 | 0.5639 | 0.5703 | 0.7552 |
| No log | 0.3654 | 186 | 0.6961 | 0.4779 | 0.6961 | 0.8343 |
| No log | 0.3694 | 188 | 0.6509 | 0.5181 | 0.6509 | 0.8068 |
| No log | 0.3733 | 190 | 0.5579 | 0.5099 | 0.5579 | 0.7470 |
| No log | 0.3772 | 192 | 0.5679 | 0.5019 | 0.5679 | 0.7536 |
| No log | 0.3811 | 194 | 0.6860 | 0.4627 | 0.6860 | 0.8282 |
| No log | 0.3851 | 196 | 0.7463 | 0.4276 | 0.7463 | 0.8639 |
| No log | 0.3890 | 198 | 0.8966 | 0.3631 | 0.8966 | 0.9469 |
| No log | 0.3929 | 200 | 0.8602 | 0.3978 | 0.8602 | 0.9274 |
| No log | 0.3969 | 202 | 0.7300 | 0.4685 | 0.7300 | 0.8544 |
| No log | 0.4008 | 204 | 0.6137 | 0.5175 | 0.6137 | 0.7834 |
| No log | 0.4047 | 206 | 0.5479 | 0.5300 | 0.5479 | 0.7402 |
| No log | 0.4086 | 208 | 0.5463 | 0.5099 | 0.5463 | 0.7391 |
| No log | 0.4126 | 210 | 0.5403 | 0.5238 | 0.5403 | 0.7350 |
| No log | 0.4165 | 212 | 0.5658 | 0.5234 | 0.5658 | 0.7522 |
| No log | 0.4204 | 214 | 0.5604 | 0.5408 | 0.5604 | 0.7486 |
| No log | 0.4244 | 216 | 0.5489 | 0.5526 | 0.5489 | 0.7409 |
| No log | 0.4283 | 218 | 0.5559 | 0.5608 | 0.5559 | 0.7456 |
| No log | 0.4322 | 220 | 0.5958 | 0.5252 | 0.5958 | 0.7719 |
| No log | 0.4361 | 222 | 0.6170 | 0.5246 | 0.6170 | 0.7855 |
| No log | 0.4401 | 224 | 0.6408 | 0.5182 | 0.6408 | 0.8005 |
| No log | 0.4440 | 226 | 0.7574 | 0.4841 | 0.7574 | 0.8703 |
| No log | 0.4479 | 228 | 0.7367 | 0.4999 | 0.7367 | 0.8583 |
| No log | 0.4519 | 230 | 0.6436 | 0.5132 | 0.6436 | 0.8022 |
| No log | 0.4558 | 232 | 0.5999 | 0.5387 | 0.5999 | 0.7746 |
| No log | 0.4597 | 234 | 0.5861 | 0.5099 | 0.5861 | 0.7655 |
| No log | 0.4637 | 236 | 0.6685 | 0.4689 | 0.6685 | 0.8176 |
| No log | 0.4676 | 238 | 0.7280 | 0.4626 | 0.7280 | 0.8532 |
| No log | 0.4715 | 240 | 0.5494 | 0.5188 | 0.5494 | 0.7412 |
| No log | 0.4754 | 242 | 0.7816 | 0.4666 | 0.7816 | 0.8841 |
| No log | 0.4794 | 244 | 1.1087 | 0.3485 | 1.1087 | 1.0530 |
| No log | 0.4833 | 246 | 0.9545 | 0.3859 | 0.9545 | 0.9770 |
| No log | 0.4872 | 248 | 0.6046 | 0.5192 | 0.6046 | 0.7776 |
| No log | 0.4912 | 250 | 0.5730 | 0.4962 | 0.5730 | 0.7570 |
| No log | 0.4951 | 252 | 0.5686 | 0.4883 | 0.5686 | 0.7540 |
| No log | 0.4990 | 254 | 0.6106 | 0.5163 | 0.6106 | 0.7814 |
| No log | 0.5029 | 256 | 1.0541 | 0.3511 | 1.0541 | 1.0267 |
| No log | 0.5069 | 258 | 1.6691 | 0.1710 | 1.6691 | 1.2919 |
| No log | 0.5108 | 260 | 1.7808 | 0.1390 | 1.7808 | 1.3345 |
| No log | 0.5147 | 262 | 1.5876 | 0.1774 | 1.5876 | 1.2600 |
| No log | 0.5187 | 264 | 1.1397 | 0.2915 | 1.1397 | 1.0676 |
| No log | 0.5226 | 266 | 0.6917 | 0.4401 | 0.6917 | 0.8317 |
| No log | 0.5265 | 268 | 0.5828 | 0.4461 | 0.5828 | 0.7634 |
| No log | 0.5305 | 270 | 0.5857 | 0.4337 | 0.5857 | 0.7653 |
| No log | 0.5344 | 272 | 0.6264 | 0.3944 | 0.6264 | 0.7915 |
| No log | 0.5383 | 274 | 0.6073 | 0.4193 | 0.6073 | 0.7793 |
| No log | 0.5422 | 276 | 0.5751 | 0.4298 | 0.5751 | 0.7583 |
| No log | 0.5462 | 278 | 0.6149 | 0.4635 | 0.6149 | 0.7841 |
| No log | 0.5501 | 280 | 0.7149 | 0.4445 | 0.7149 | 0.8455 |
| No log | 0.5540 | 282 | 0.7590 | 0.4356 | 0.7590 | 0.8712 |
| No log | 0.5580 | 284 | 0.6405 | 0.4662 | 0.6405 | 0.8003 |
| No log | 0.5619 | 286 | 0.5324 | 0.5198 | 0.5324 | 0.7297 |
| No log | 0.5658 | 288 | 0.6382 | 0.4461 | 0.6382 | 0.7989 |
| No log | 0.5697 | 290 | 0.6764 | 0.4435 | 0.6764 | 0.8224 |
| No log | 0.5737 | 292 | 0.5711 | 0.5131 | 0.5711 | 0.7557 |
| No log | 0.5776 | 294 | 0.5716 | 0.5181 | 0.5716 | 0.7561 |
| No log | 0.5815 | 296 | 0.7283 | 0.4508 | 0.7283 | 0.8534 |
| No log | 0.5855 | 298 | 0.7408 | 0.4325 | 0.7408 | 0.8607 |
| No log | 0.5894 | 300 | 0.6252 | 0.5008 | 0.6252 | 0.7907 |
| No log | 0.5933 | 302 | 0.5663 | 0.4872 | 0.5663 | 0.7525 |
| No log | 0.5972 | 304 | 0.5704 | 0.4694 | 0.5704 | 0.7553 |
| No log | 0.6012 | 306 | 0.5728 | 0.4800 | 0.5728 | 0.7568 |
| No log | 0.6051 | 308 | 0.6282 | 0.4896 | 0.6282 | 0.7926 |
| No log | 0.6090 | 310 | 0.7343 | 0.4440 | 0.7343 | 0.8569 |
| No log | 0.6130 | 312 | 0.7665 | 0.4291 | 0.7665 | 0.8755 |
| No log | 0.6169 | 314 | 0.8875 | 0.4086 | 0.8875 | 0.9421 |
| No log | 0.6208 | 316 | 0.8607 | 0.4269 | 0.8607 | 0.9277 |
| No log | 0.6248 | 318 | 0.7242 | 0.4866 | 0.7242 | 0.8510 |
| No log | 0.6287 | 320 | 0.8130 | 0.4448 | 0.8130 | 0.9017 |
| No log | 0.6326 | 322 | 0.9201 | 0.4199 | 0.9201 | 0.9592 |
| No log | 0.6365 | 324 | 0.7647 | 0.4658 | 0.7647 | 0.8745 |
| No log | 0.6405 | 326 | 0.7136 | 0.4873 | 0.7136 | 0.8448 |
| No log | 0.6444 | 328 | 0.7888 | 0.4693 | 0.7888 | 0.8881 |
| No log | 0.6483 | 330 | 0.7451 | 0.4882 | 0.7451 | 0.8632 |
| No log | 0.6523 | 332 | 0.7011 | 0.5056 | 0.7011 | 0.8373 |
| No log | 0.6562 | 334 | 0.5923 | 0.5559 | 0.5923 | 0.7696 |
| No log | 0.6601 | 336 | 0.5742 | 0.6040 | 0.5742 | 0.7578 |
| No log | 0.6640 | 338 | 0.5899 | 0.5929 | 0.5899 | 0.7680 |
| No log | 0.6680 | 340 | 0.6451 | 0.5865 | 0.6451 | 0.8032 |
| No log | 0.6719 | 342 | 0.6178 | 0.6050 | 0.6178 | 0.7860 |
| No log | 0.6758 | 344 | 0.6106 | 0.5990 | 0.6106 | 0.7814 |
| No log | 0.6798 | 346 | 0.6417 | 0.5831 | 0.6417 | 0.8011 |
| No log | 0.6837 | 348 | 0.5981 | 0.6068 | 0.5981 | 0.7734 |
| No log | 0.6876 | 350 | 0.7151 | 0.5565 | 0.7151 | 0.8457 |
| No log | 0.6916 | 352 | 0.8198 | 0.5009 | 0.8198 | 0.9054 |
| No log | 0.6955 | 354 | 0.7023 | 0.5547 | 0.7023 | 0.8381 |
| No log | 0.6994 | 356 | 0.5672 | 0.5999 | 0.5672 | 0.7531 |
| No log | 0.7033 | 358 | 0.5502 | 0.5976 | 0.5502 | 0.7417 |
| No log | 0.7073 | 360 | 0.5651 | 0.5861 | 0.5651 | 0.7517 |
| No log | 0.7112 | 362 | 0.6006 | 0.6039 | 0.6006 | 0.7750 |
| No log | 0.7151 | 364 | 0.6753 | 0.5583 | 0.6753 | 0.8217 |
| No log | 0.7191 | 366 | 0.6400 | 0.5829 | 0.6400 | 0.8000 |
| No log | 0.7230 | 368 | 0.6157 | 0.6054 | 0.6157 | 0.7847 |
| No log | 0.7269 | 370 | 0.6384 | 0.5853 | 0.6384 | 0.7990 |
| No log | 0.7308 | 372 | 0.6248 | 0.5891 | 0.6248 | 0.7905 |
| No log | 0.7348 | 374 | 0.6583 | 0.5703 | 0.6583 | 0.8113 |
| No log | 0.7387 | 376 | 0.7277 | 0.5057 | 0.7277 | 0.8531 |
| No log | 0.7426 | 378 | 0.7087 | 0.5086 | 0.7087 | 0.8418 |
| No log | 0.7466 | 380 | 0.6053 | 0.5827 | 0.6053 | 0.7780 |
| No log | 0.7505 | 382 | 0.5966 | 0.5845 | 0.5966 | 0.7724 |
| No log | 0.7544 | 384 | 0.5991 | 0.5865 | 0.5991 | 0.7740 |
| No log | 0.7583 | 386 | 0.6060 | 0.5957 | 0.6060 | 0.7784 |
| No log | 0.7623 | 388 | 0.6487 | 0.5538 | 0.6487 | 0.8054 |
| No log | 0.7662 | 390 | 0.6608 | 0.5504 | 0.6608 | 0.8129 |
| No log | 0.7701 | 392 | 0.6188 | 0.5820 | 0.6188 | 0.7867 |
| No log | 0.7741 | 394 | 0.5998 | 0.5842 | 0.5998 | 0.7744 |
| No log | 0.7780 | 396 | 0.6283 | 0.5552 | 0.6283 | 0.7926 |
| No log | 0.7819 | 398 | 0.6070 | 0.5508 | 0.6070 | 0.7791 |
| No log | 0.7859 | 400 | 0.5820 | 0.5878 | 0.5820 | 0.7629 |
| No log | 0.7898 | 402 | 0.5990 | 0.5731 | 0.5990 | 0.7740 |
| No log | 0.7937 | 404 | 0.6110 | 0.5749 | 0.6110 | 0.7817 |
| No log | 0.7976 | 406 | 0.6228 | 0.6023 | 0.6228 | 0.7892 |
| No log | 0.8016 | 408 | 0.6472 | 0.5911 | 0.6472 | 0.8045 |
| No log | 0.8055 | 410 | 0.6487 | 0.5774 | 0.6487 | 0.8054 |
| No log | 0.8094 | 412 | 0.6591 | 0.5898 | 0.6591 | 0.8118 |
| No log | 0.8134 | 414 | 0.6468 | 0.5911 | 0.6468 | 0.8042 |
| No log | 0.8173 | 416 | 0.6412 | 0.5703 | 0.6412 | 0.8007 |
| No log | 0.8212 | 418 | 0.6636 | 0.5504 | 0.6636 | 0.8146 |
| No log | 0.8251 | 420 | 0.7726 | 0.4895 | 0.7726 | 0.8790 |
| No log | 0.8291 | 422 | 0.8024 | 0.4558 | 0.8024 | 0.8958 |
| No log | 0.8330 | 424 | 0.6757 | 0.4668 | 0.6757 | 0.8220 |
| No log | 0.8369 | 426 | 0.6662 | 0.4783 | 0.6662 | 0.8162 |
| No log | 0.8409 | 428 | 0.8380 | 0.4234 | 0.8380 | 0.9154 |
| No log | 0.8448 | 430 | 0.8901 | 0.4233 | 0.8901 | 0.9435 |
| No log | 0.8487 | 432 | 0.7660 | 0.4595 | 0.7660 | 0.8752 |
| No log | 0.8527 | 434 | 0.8330 | 0.4376 | 0.8330 | 0.9127 |
| No log | 0.8566 | 436 | 0.8657 | 0.4454 | 0.8657 | 0.9304 |
| No log | 0.8605 | 438 | 0.6664 | 0.5531 | 0.6664 | 0.8163 |
| No log | 0.8644 | 440 | 0.5784 | 0.5633 | 0.5784 | 0.7605 |
| No log | 0.8684 | 442 | 0.5777 | 0.5691 | 0.5777 | 0.7601 |
| No log | 0.8723 | 444 | 0.6182 | 0.5646 | 0.6182 | 0.7863 |
| No log | 0.8762 | 446 | 0.6933 | 0.5540 | 0.6933 | 0.8327 |
| No log | 0.8802 | 448 | 0.6178 | 0.5839 | 0.6178 | 0.7860 |
| No log | 0.8841 | 450 | 0.6198 | 0.5529 | 0.6198 | 0.7873 |
| No log | 0.8880 | 452 | 0.6139 | 0.5537 | 0.6139 | 0.7835 |
| No log | 0.8919 | 454 | 0.5982 | 0.5798 | 0.5982 | 0.7734 |
| No log | 0.8959 | 456 | 0.6105 | 0.5663 | 0.6105 | 0.7813 |
| No log | 0.8998 | 458 | 0.6302 | 0.5139 | 0.6302 | 0.7939 |
| No log | 0.9037 | 460 | 0.5832 | 0.5504 | 0.5832 | 0.7637 |
| No log | 0.9077 | 462 | 0.6067 | 0.5125 | 0.6067 | 0.7789 |
| No log | 0.9116 | 464 | 0.6535 | 0.5163 | 0.6535 | 0.8084 |
| No log | 0.9155 | 466 | 0.5842 | 0.5314 | 0.5842 | 0.7643 |
| No log | 0.9194 | 468 | 0.5946 | 0.5250 | 0.5946 | 0.7711 |
| No log | 0.9234 | 470 | 0.6609 | 0.4881 | 0.6609 | 0.8129 |
| No log | 0.9273 | 472 | 0.6493 | 0.5099 | 0.6493 | 0.8058 |
| No log | 0.9312 | 474 | 0.5665 | 0.5381 | 0.5665 | 0.7527 |
| No log | 0.9352 | 476 | 0.5726 | 0.5304 | 0.5726 | 0.7567 |
| No log | 0.9391 | 478 | 0.5761 | 0.5374 | 0.5761 | 0.7590 |
| No log | 0.9430 | 480 | 0.5789 | 0.5268 | 0.5789 | 0.7609 |
| No log | 0.9470 | 482 | 0.6424 | 0.5279 | 0.6424 | 0.8015 |
| No log | 0.9509 | 484 | 0.5988 | 0.5374 | 0.5988 | 0.7738 |
| No log | 0.9548 | 486 | 0.5865 | 0.5202 | 0.5865 | 0.7658 |
| No log | 0.9587 | 488 | 0.7806 | 0.4822 | 0.7806 | 0.8835 |
| No log | 0.9627 | 490 | 0.8265 | 0.4772 | 0.8265 | 0.9091 |
| No log | 0.9666 | 492 | 0.6493 | 0.5071 | 0.6493 | 0.8058 |
| No log | 0.9705 | 494 | 0.5550 | 0.5493 | 0.5550 | 0.7450 |
| No log | 0.9745 | 496 | 0.6054 | 0.5583 | 0.6054 | 0.7781 |
| No log | 0.9784 | 498 | 0.5999 | 0.5900 | 0.5999 | 0.7745 |
| 0.9447 | 0.9823 | 500 | 0.5681 | 0.5709 | 0.5681 | 0.7537 |
| 0.9447 | 0.9862 | 502 | 0.6701 | 0.5117 | 0.6701 | 0.8186 |
| 0.9447 | 0.9902 | 504 | 0.6511 | 0.5218 | 0.6511 | 0.8069 |
| 0.9447 | 0.9941 | 506 | 0.5562 | 0.5944 | 0.5562 | 0.7458 |
| 0.9447 | 0.9980 | 508 | 0.5509 | 0.6022 | 0.5509 | 0.7422 |
| 0.9447 | 1.0020 | 510 | 0.5610 | 0.5930 | 0.5610 | 0.7490 |
| 0.9447 | 1.0059 | 512 | 0.5687 | 0.5833 | 0.5687 | 0.7541 |
| 0.9447 | 1.0098 | 514 | 0.5593 | 0.5880 | 0.5593 | 0.7479 |
| 0.9447 | 1.0138 | 516 | 0.5692 | 0.5916 | 0.5692 | 0.7544 |
| 0.9447 | 1.0177 | 518 | 0.5750 | 0.5742 | 0.5750 | 0.7583 |
| 0.9447 | 1.0216 | 520 | 0.6603 | 0.5358 | 0.6603 | 0.8126 |
| 0.9447 | 1.0255 | 522 | 0.6876 | 0.5147 | 0.6876 | 0.8292 |
| 0.9447 | 1.0295 | 524 | 0.6107 | 0.5474 | 0.6107 | 0.7815 |
| 0.9447 | 1.0334 | 526 | 0.6010 | 0.5567 | 0.6010 | 0.7752 |
| 0.9447 | 1.0373 | 528 | 0.7141 | 0.5246 | 0.7141 | 0.8451 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
bartowski/Behemoth-123B-v2.1-GGUF | bartowski | 2024-11-24T08:33:00Z | 698 | 1 | null | [
"gguf",
"text-generation",
"base_model:TheDrummer/Behemoth-123B-v2.1",
"base_model:quantized:TheDrummer/Behemoth-123B-v2.1",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | text-generation | 2024-11-24T03:08:01Z | ---
quantized_by: bartowski
pipeline_tag: text-generation
base_model: TheDrummer/Behemoth-123B-v2.1
---
## Llamacpp imatrix Quantizations of Behemoth-123B-v2.1
Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b4132">b4132</a> for quantization.
Original model: https://huggingface.co/TheDrummer/Behemoth-123B-v2.1
All quants made using imatrix option with dataset from [here](https://gist.github.com/bartowski1182/eb213dccb3571f863da82e99418f81e8)
Run them in [LM Studio](https://lmstudio.ai/)
## Prompt format
```
<s>[SYSTEM_PROMPT] {system_prompt}[/SYSTEM_PROMPT][INST] {prompt}[/INST]
```
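If you are assembling prompts by hand rather than through a chat template, the format above can be filled in directly; a small illustrative helper (the names are arbitrary and not part of any library API):
```python
# Hypothetical helper that fills the Mistral-style template shown above.
def build_prompt(system_prompt: str, prompt: str) -> str:
    return (
        f"<s>[SYSTEM_PROMPT] {system_prompt}[/SYSTEM_PROMPT]"
        f"[INST] {prompt}[/INST]"
    )

print(build_prompt("You are a concise assistant.", "Summarize GGUF in one line."))
```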
## Download a file (not the whole branch) from below:
| Filename | Quant type | File Size | Split | Description |
| -------- | ---------- | --------- | ----- | ----------- |
| [Behemoth-123B-v2.1-Q8_0.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q8_0) | Q8_0 | 130.28GB | true | Extremely high quality, generally unneeded but max available quant. |
| [Behemoth-123B-v2.1-Q6_K_L.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q6_K_L) | Q6_K_L | 100.78GB | true | Uses Q8_0 for embed and output weights. Very high quality, near perfect, *recommended*. |
| [Behemoth-123B-v2.1-Q6_K.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q6_K) | Q6_K | 100.59GB | true | Very high quality, near perfect, *recommended*. |
| [Behemoth-123B-v2.1-Q5_K_L.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q5_K_L) | Q5_K_L | 86.74GB | true | Uses Q8_0 for embed and output weights. High quality, *recommended*. |
| [Behemoth-123B-v2.1-Q5_K_M.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q5_K_M) | Q5_K_M | 86.49GB | true | High quality, *recommended*. |
| [Behemoth-123B-v2.1-Q5_K_S.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q5_K_S) | Q5_K_S | 84.36GB | true | High quality, *recommended*. |
| [Behemoth-123B-v2.1-Q4_K_L.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q4_K_L) | Q4_K_L | 73.52GB | true | Uses Q8_0 for embed and output weights. Good quality, *recommended*. |
| [Behemoth-123B-v2.1-Q4_K_M.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q4_K_M) | Q4_K_M | 73.22GB | true | Good quality, default size for most use cases, *recommended*. |
| [Behemoth-123B-v2.1-Q4_K_S.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q4_K_S) | Q4_K_S | 69.57GB | true | Slightly lower quality with more space savings, *recommended*. |
| [Behemoth-123B-v2.1-Q4_0.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q4_0) | Q4_0 | 69.32GB | true | Legacy format, generally not worth using over similarly sized formats |
| [Behemoth-123B-v2.1-Q4_0_8_8.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q4_0_8_8) | Q4_0_8_8 | 69.08GB | true | Optimized for ARM and AVX inference. Requires 'sve' support for ARM (see details below). *Don't use on Mac*. |
| [Behemoth-123B-v2.1-IQ4_XS.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-IQ4_XS) | IQ4_XS | 65.43GB | true | Decent quality, smaller than Q4_K_S with similar performance, *recommended*. |
| [Behemoth-123B-v2.1-Q3_K_XL.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q3_K_XL) | Q3_K_XL | 64.91GB | true | Uses Q8_0 for embed and output weights. Lower quality but usable, good for low RAM availability. |
| [Behemoth-123B-v2.1-Q3_K_L.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q3_K_L) | Q3_K_L | 64.55GB | true | Lower quality but usable, good for low RAM availability. |
| [Behemoth-123B-v2.1-Q3_K_M.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q3_K_M) | Q3_K_M | 59.10GB | true | Low quality. |
| [Behemoth-123B-v2.1-IQ3_M.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-IQ3_M) | IQ3_M | 55.28GB | true | Medium-low quality, new method with decent performance comparable to Q3_K_M. |
| [Behemoth-123B-v2.1-Q3_K_S.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/tree/main/Behemoth-123B-v2.1-Q3_K_S) | Q3_K_S | 52.85GB | true | Low quality, not recommended. |
| [Behemoth-123B-v2.1-IQ3_XXS.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-IQ3_XXS.gguf) | IQ3_XXS | 47.01GB | false | Lower quality, new method with decent performance, comparable to Q3 quants. |
| [Behemoth-123B-v2.1-Q2_K_L.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-Q2_K_L.gguf) | Q2_K_L | 45.59GB | false | Uses Q8_0 for embed and output weights. Very low quality but surprisingly usable. |
| [Behemoth-123B-v2.1-Q2_K.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-Q2_K.gguf) | Q2_K | 45.20GB | false | Very low quality but surprisingly usable. |
| [Behemoth-123B-v2.1-IQ2_M.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-IQ2_M.gguf) | IQ2_M | 41.62GB | false | Relatively low quality, uses SOTA techniques to be surprisingly usable. |
| [Behemoth-123B-v2.1-IQ2_XS.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-IQ2_XS.gguf) | IQ2_XS | 36.08GB | false | Low quality, uses SOTA techniques to be usable. |
| [Behemoth-123B-v2.1-IQ2_XXS.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-IQ2_XXS.gguf) | IQ2_XXS | 32.43GB | false | Very low quality, uses SOTA techniques to be usable. |
| [Behemoth-123B-v2.1-IQ1_M.gguf](https://huggingface.co/bartowski/Behemoth-123B-v2.1-GGUF/blob/main/Behemoth-123B-v2.1-IQ1_M.gguf) | IQ1_M | 28.39GB | false | Extremely low quality, *not* recommended. |
## Embed/output weights
Some of these quants (Q3_K_XL, Q4_K_L etc) are the standard quantization method with the embeddings and output weights quantized to Q8_0 instead of what they would normally default to.
## Downloading using huggingface-cli
<details>
<summary>Click to view download instructions</summary>
First, make sure you have huggingface-cli installed:
```
pip install -U "huggingface_hub[cli]"
```
Then, you can target the specific file you want:
```
huggingface-cli download bartowski/Behemoth-123B-v2.1-GGUF --include "Behemoth-123B-v2.1-Q4_K_M.gguf" --local-dir ./
```
If the model is bigger than 50GB, it will have been split into multiple files. In order to download them all to a local folder, run:
```
huggingface-cli download bartowski/Behemoth-123B-v2.1-GGUF --include "Behemoth-123B-v2.1-Q8_0/*" --local-dir ./
```
You can either specify a new local-dir (Behemoth-123B-v2.1-Q8_0) or download them all in place (./)
</details>
## Q4_0_X_X information
<details>
<summary>Click to view Q4_0_X_X information</summary>
These are *NOT* for Metal (Apple) or GPU (nvidia/AMD/intel) offloading, only ARM chips (and certain AVX2/AVX512 CPUs).
If you're using an ARM chip, the Q4_0_X_X quants will have a substantial speedup. Check out Q4_0_4_4 speed comparisons [on the original pull request](https://github.com/ggerganov/llama.cpp/pull/5780#pullrequestreview-21657544660)
To check which one would work best for your ARM chip, you can check [AArch64 SoC features](https://gpages.juszkiewicz.com.pl/arm-socs-table/arm-socs.html) (thanks EloyOn!).
If you're using a CPU that supports AVX2 or AVX512 (typically server CPUs and AMD's latest Zen5 CPUs) and are not offloading to a GPU, the Q4_0_8_8 may offer a nice speedup as well:
<details>
<summary>Click to view benchmarks on an AVX2 system (EPYC7702)</summary>
| model | size | params | backend | threads | test | t/s | % (vs Q4_0) |
| ------------------------------ | ---------: | ---------: | ---------- | ------: | ------------: | -------------------: |-------------: |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | pp512 | 204.03 ± 1.03 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | pp1024 | 282.92 ± 0.19 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | pp2048 | 259.49 ± 0.44 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | tg128 | 39.12 ± 0.27 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | tg256 | 39.31 ± 0.69 | 100% |
| qwen2 3B Q4_0 | 1.70 GiB | 3.09 B | CPU | 64 | tg512 | 40.52 ± 0.03 | 100% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | pp512 | 301.02 ± 1.74 | 147% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | pp1024 | 287.23 ± 0.20 | 101% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | pp2048 | 262.77 ± 1.81 | 101% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | tg128 | 18.80 ± 0.99 | 48% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | tg256 | 24.46 ± 3.04 | 83% |
| qwen2 3B Q4_K_M | 1.79 GiB | 3.09 B | CPU | 64 | tg512 | 36.32 ± 3.59 | 90% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | pp512 | 271.71 ± 3.53 | 133% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | pp1024 | 279.86 ± 45.63 | 100% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | pp2048 | 320.77 ± 5.00 | 124% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | tg128 | 43.51 ± 0.05 | 111% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | tg256 | 43.35 ± 0.09 | 110% |
| qwen2 3B Q4_0_8_8 | 1.69 GiB | 3.09 B | CPU | 64 | tg512 | 42.60 ± 0.31 | 105% |
Q4_0_8_8 offers a nice bump to prompt processing and a small bump to text generation
</details>
</details>
## Which file should I choose?
<details>
<summary>Click here for details</summary>
A great write up with charts showing various performances is provided by Artefact2 [here](https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9)
The first thing to figure out is how big a model you can run. To do this, you'll need to figure out how much RAM and/or VRAM you have.
If you want your model running as FAST as possible, you'll want to fit the whole thing on your GPU's VRAM. Aim for a quant with a file size 1-2GB smaller than your GPU's total VRAM.
If you want the absolute maximum quality, add both your system RAM and your GPU's VRAM together, then similarly grab a quant with a file size 1-2GB smaller than that total.
Next, you'll need to decide if you want to use an 'I-quant' or a 'K-quant'.
If you don't want to think too much, grab one of the K-quants. These are in format 'QX_K_X', like Q5_K_M.
If you want to get more into the weeds, you can check out this extremely useful feature chart:
[llama.cpp feature matrix](https://github.com/ggerganov/llama.cpp/wiki/Feature-matrix)
But basically, if you're aiming for below Q4, and you're running cuBLAS (Nvidia) or rocBLAS (AMD), you should look towards the I-quants. These are in format IQX_X, like IQ3_M. These are newer and offer better performance for their size.
These I-quants can also be used on CPU and Apple Metal, but will be slower than their K-quant equivalent, so speed vs performance is a tradeoff you'll have to weigh.
The I-quants are *not* compatible with Vulkan, which also supports AMD cards, so if you have an AMD card, double check whether you're using the rocBLAS build or the Vulkan build. At the time of writing this, LM Studio has a preview with ROCm support, and other inference engines have specific builds for ROCm.
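As a rough summary, here's a small sketch of the decision rule above; the backend strings are illustrative labels, not actual llama.cpp build identifiers:

```python
# Minimal sketch of the rule of thumb above; backend strings are
# illustrative labels, not actual llama.cpp build identifiers.
def suggest_family(target_bits: float, backend: str) -> str:
    if backend == "vulkan":
        return "K-quant (e.g. Q4_K_M): I-quants are not Vulkan-compatible"
    if target_bits < 4 and backend in {"cublas", "rocblas"}:
        return "I-quant (e.g. IQ3_M): better quality per byte on these backends"
    return "K-quant (e.g. Q5_K_M): safe default"

print(suggest_family(3, "cublas"))  # -> I-quant suggestion
print(suggest_family(4, "vulkan"))  # -> K-quant suggestion
```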
</details>
## Credits
Thank you kalomaze and Dampf for assistance in creating the imatrix calibration dataset.
Thank you ZeroWw for the inspiration to experiment with embed/output.
Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
|
fsdfH13/segformer-b0-scene-parse-150 | fsdfH13 | 2024-11-24T08:26:45Z | 44 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"segformer",
"generated_from_trainer",
"dataset:scene_parse_150",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2024-11-24T08:14:53Z | ---
library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-scene-parse-150
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1645
- Mean Iou: 0.1262
- Mean Accuracy: 0.1662
- Overall Accuracy: 0.5628
- Per Category Iou: [0.15843597843469054, 0.37668694709687084, 0.6441654758344134, 0.8511895383645817, 0.5639924565886474, 0.8586149258267328, 0.3807328968759612, 0.8089034968690005, 0.0, 0.21086659359875867, 0.3886605682973176, 0.0, 0.006568939289171623, 0.3416319575918213, nan, 0.0, 0.0, 0.0, 0.3749031458236479, nan, 0.0, 0.8503101019657311, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0]
- Per Category Accuracy: [0.650956153631326, 0.5537014192487221, 0.9688420903560636, 0.9561805873069719, 0.7910321295257288, 0.9228123181822822, 0.43471782548274945, 0.9484787592002216, nan, 0.22006168872895515, 0.4582161092007919, nan, 0.017188348109281708, 0.37926927523032195, nan, 0.0, 0.0, 0.0, 0.4531279265780109, nan, 0.0, 0.8897321674091184, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0]
## Model description
More information needed
## Intended uses & limitations
More information needed
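Pending more detail from the author, here is a minimal semantic-segmentation inference sketch, assuming the standard 🤗 Transformers SegFormer API (the image path is a placeholder):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

ckpt = "fsdfH13/segformer-b0-scene-parse-150"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("scene.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```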
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
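A minimal sketch of equivalent `TrainingArguments`, assuming the standard 🤗 Trainer API; `output_dir` is a placeholder and the model/dataset wiring is omitted:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# output_dir is a placeholder; model/dataset wiring is omitted.
args = TrainingArguments(
    output_dir="segformer-b0-scene-parse-150",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```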
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 4.6007 | 1.0 | 20 | 4.4688 | 0.0320 | 0.0847 | 0.3955 | [0.06737166501857701, 0.27884594290513753, 0.40975499953567807, 0.45403496213054917, 0.21970125668649035, 0.12327268205744601, 7.008585517258642e-05, 0.4684504883683693, 0.0, 0.05938579413427118, 0.0011461915181827655, 0.0, 0.0, 0.015604767754573525, 0.0, 0.0002659542316944911, 0.0, 0.0, 0.07312488251143555, nan, 0.0, 0.35381524423423044, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0] | [0.10314847206502727, 0.45924713922195093, 0.8454162822517999, 0.9915930443503783, 0.23561270524652483, 0.15179291408506773, 7.023634530194104e-05, 0.9654501847410384, nan, 0.060763398020819946, 0.0011461915181827655, nan, 0.0, 0.016464076785651732, nan, 0.0002984102870164397, 0.0, 0.0, 0.10929012923768495, nan, 0.0, 0.4648847824891382, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.9874 | 2.0 | 40 | 4.0341 | 0.0453 | 0.0945 | 0.4121 | [0.12753316513388133, 0.2703994918454796, 0.41789308738313335, 0.4108835237238245, 0.1297167312866798, 0.48070668477059586, 0.0009597406213947568, 0.4189549795501547, 0.0, 0.007633097315565608, 0.0, 0.0, 0.0, 0.06618987037100711, nan, 0.0, 0.0, 0.0, 0.06314281994531962, nan, 0.0, 0.46213803202796544, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0] | [0.2303293989681478, 0.45849183531966214, 0.8275495817080307, 0.9933629297502987, 0.14020885336753156, 0.6215639641110964, 0.0009682581888053299, 0.962191435322958, nan, 0.0076339802081994606, 0.0, nan, 0.0, 0.06743265491995656, nan, 0.0, 0.0, 0.0, 0.0681307360929013, nan, 0.0, 0.5380300280481769, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.5403 | 3.0 | 60 | 3.7760 | 0.0511 | 0.0967 | 0.4560 | [0.1505839448339678, 0.3447283238256516, 0.4800712385852389, 0.6071033539351967, 0.03454370712415629, 0.5970666544252663, 0.00863460898337167, 0.37567987921343204, 0.0, 0.0019505184272661945, 0.0017423388336578866, 0.0, 0.0, 0.03385034941196523, nan, 0.0, 0.0, 0.0, 0.0899830220713073, nan, 0.0, 0.2914513350559862, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0] | [0.32531271588787797, 0.5975159276215841, 0.8589136696191116, 0.9916815386203743, 0.03580507019030389, 0.7965683022180485, 0.008719340581055252, 0.9736058975703754, nan, 0.0019534764169129933, 0.001771386891737001, nan, 0.0, 0.034784740953515254, nan, 0.0, 0.0, 0.0, 0.1017512642817007, nan, 0.0, 0.29775064620799646, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.1987 | 4.0 | 80 | 3.5470 | 0.0690 | 0.1086 | 0.4782 | [0.1865587108082301, 0.33684959757633276, 0.49821903085989633, 0.6803136770483172, 0.3011192399475671, 0.5884192744484772, 0.0, 0.5811726503673337, 0.0, 0.0023768533031836986, 0.05381076215243048, 0.0, 0.0, 0.0717746091736815, nan, 0.0, 0.0, 0.0, 0.06766779598937826, nan, 0.0, 0.4287662610155267, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.5241059485837756, 0.6033536181465634, 0.8785630832606156, 0.9909957080279052, 0.3077495591636705, 0.718206779695618, 0.0, 0.9724214334961726, nan, 0.00237758642847963, 0.056059185162029804, nan, 0.0, 0.07317756681963078, nan, 0.0, 0.0, 0.0, 0.06920771680089904, nan, 0.0, 0.4495407798493098, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.2349 | 5.0 | 100 | 3.2951 | 0.0687 | 0.1062 | 0.4758 | [0.15922425896002754, 0.3544000936145671, 0.48070506147784325, 0.6869931187023839, 0.36220946789633945, 0.550908895520414, 9.984424298094972e-05, 0.5236712215715216, 0.0, 0.0003855545559696697, 0.17428336720134896, nan, 0.0, 0.0008740953113527499, nan, 0.0, 0.0, 0.0, 0.039444546964384576, nan, 0.0, 0.37873553764325274, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.3534093748621992, 0.5836004425151792, 0.9357744284800913, 0.9894986799604726, 0.3786497812993794, 0.6802559993467321, 0.00010033763614563004, 0.9722033978208218, nan, 0.0003855545559696697, 0.21001354589976035, nan, 0.0, 0.0008757487651942411, nan, 0.0, 0.0, 0.0, 0.0400355871886121, nan, 0.0, 0.3798603090799098, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.0379 | 6.0 | 120 | 3.2250 | 0.0679 | 0.1045 | 0.4855 | [0.19540706923981463, 0.3475387192434564, 0.4858654667196469, 0.7518925884166686, 0.33325853022579854, 0.5856751366164523, 0.0, 0.5758230951376272, 0.0, 0.0, 0.08577799801783945, nan, 0.0, 0.0010849023587877092, nan, 0.0, 0.0, 0.0, 0.02121141796240427, nan, 0.0, 0.28276885858808004, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.45676362941513676, 0.5922029926551426, 0.9574939552280769, 0.9638943378416247, 0.3570888771841436, 0.7373352795271974, 0.0, 0.9730225048174098, nan, 0.0, 0.0901844326351985, nan, 0.0, 0.001085928468840859, nan, 0.0, 0.0, 0.0, 0.021399138415433602, nan, 0.0, 0.28284661497002694, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.3803 | 7.0 | 140 | 3.1126 | 0.0732 | 0.1106 | 0.4968 | [0.19970832665346208, 0.3444605478866212, 0.5601446811141683, 0.7543543231072496, 0.4437306295026785, 0.6578475106393888, 0.0, 0.5745950998481516, 0.0, 0.0026978763858734054, 0.05060904872389791, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.023589505456234037, nan, 0.0, 0.34055676855895195, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.5676215954022312, 0.5986549052601153, 0.9433697026242953, 0.9623383135941947, 0.49508782375707056, 0.7857587604242158, 0.0, 0.9744485759913257, nan, 0.002698881891787688, 0.054548296342607064, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.023787226072298183, nan, 0.0, 0.3431226970246934, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.3784 | 8.0 | 160 | 2.9738 | 0.0839 | 0.1202 | 0.4945 | [0.15985479993244744, 0.34585775417314835, 0.545786757842671, 0.7293533956893046, 0.5096718083987997, 0.6940531406796904, 0.0, 0.7255952772839994, 0.0, 0.0029430664439018123, 0.33669751870815284, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.037251809241046575, nan, 0.0, 0.3616250343123799, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.619133361750915, 0.5680453113520972, 0.9035835448916731, 0.9919543959528621, 0.6106327432615019, 0.8217497371617553, 0.0, 0.9741833974672505, nan, 0.0029430664439018123, 0.3563092633114515, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0376006742835737, nan, 0.0, 0.3622614530055546, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 3.1938 | 9.0 | 180 | 2.8748 | 0.0797 | 0.1227 | 0.5054 | [0.18995399136745245, 0.34810012991436257, 0.5849498868412545, 0.7074903942089286, 0.4859083606900887, 0.6071300935751847, 0.0012166117363143646, 0.5967678289568087, 0.0, 0.008713532964914535, 0.34789651722330744, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.03178014541131282, nan, 0.0, 0.3940940180219303, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.588655505416489, 0.5873184646856714, 0.9344540545268625, 0.9831049689532603, 0.6372546774452104, 0.8496871459339179, 0.0012241191609766865, 0.9803060749453437, nan, 0.008713532964914535, 0.3804313848077524, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.03254354748080165, nan, 0.0, 0.3992740471869328, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.8443 | 10.0 | 200 | 2.8001 | 0.0857 | 0.1269 | 0.5097 | [0.1995003724232663, 0.3428447139109039, 0.5532424568054768, 0.6831386088761471, 0.5392824587654576, 0.7168796811653955, 0.01799445026333947, 0.6834022202683715, 0.0, 0.01805587683450279, 0.35989904382856863, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.02687122268712227, nan, 0.0, 0.48876099467911827, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.606286655005659, 0.564086417777686, 0.9542763246953664, 0.9805312606008761, 0.718059404126686, 0.8519225469281099, 0.018186196551395445, 0.9801941106796231, nan, 0.018056805037912866, 0.3863186412420548, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.027064993444465255, nan, 0.0, 0.4950778199417038, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.5561 | 11.0 | 220 | 2.8211 | 0.0956 | 0.1305 | 0.5088 | [0.13506989980290346, 0.3473008344239474, 0.69434862269396, 0.7803140892823507, 0.56466658147898, 0.7436653589264243, 0.02808395583141732, 0.7135645716045791, 0.0, 0.010101529366405347, 0.3171052631578947, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.03913529631010063, nan, 0.0, 0.6916159603683162, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6502138667999353, 0.5734563153901342, 0.9155273061247441, 0.9776109496910075, 0.7083839054663705, 0.8648756239218528, 0.02855609124704631, 0.9829283959056436, nan, 0.010101529366405347, 0.3264561842242367, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.03933320846600487, nan, 0.0, 0.7063740856844305, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.4885 | 12.0 | 240 | 2.7136 | 0.0877 | 0.1310 | 0.5215 | [0.19848968442959197, 0.3606125099763697, 0.608227018166917, 0.7333834603093866, 0.4923070749227497, 0.6534764229721199, 0.04438585772085885, 0.6595550570495378, 0.0, 0.010962601208070941, 0.3577829405907733, 0.0, 0.0, 0.01782905086523335, nan, 0.0, 0.0, 0.0, 0.046911866759195, nan, 0.0, 0.6398316516484109, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.5831949201122985, 0.6343451652807959, 0.927350276049791, 0.9782304095809796, 0.7023496003847299, 0.8347334360869254, 0.04628575155397914, 0.9841069671237559, nan, 0.010962601208070941, 0.39314369073668853, nan, 0.0, 0.017865274809962518, nan, 0.0, 0.0, 0.0, 0.047480801648248736, nan, 0.0, 0.6521476104053237, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.7337 | 13.0 | 260 | 2.6561 | 0.0915 | 0.1249 | 0.5100 | [0.15172907696939902, 0.34333366948467997, 0.6574616714834647, 0.7872674828227841, 0.5616712079927338, 0.7836156429502119, 0.024235763941696797, 0.7468694615697111, 0.0, 0.0, 0.16769452153355946, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.00950552537928451, nan, 0.0, 0.6178312992763797, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6555274646127615, 0.5916145782255692, 0.9200632280019243, 0.9641155735166148, 0.6726818879245197, 0.8717043146301381, 0.02478339612797062, 0.9858807168070148, nan, 0.0, 0.17000104199228927, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.00950552537928451, nan, 0.0, 0.6292141010834296, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.9553 | 14.0 | 280 | 2.5927 | 0.0994 | 0.1384 | 0.5349 | [0.1515238091857993, 0.36203977191604797, 0.6668900694970619, 0.7851383285129309, 0.5099721389096113, 0.7046669184836333, 0.32193962754884853, 0.6765303964471051, 0.0, 0.018288137771494666, 0.4320487829259759, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.013315890295752931, nan, 0.0, 0.6284093977855792, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6275410462569635, 0.558431101315674, 0.9366782806815129, 0.9629209008716686, 0.7670895642018, 0.894987189825353, 0.34527183974072756, 0.9802648249527098, nan, 0.018288137771494666, 0.4503490674169011, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.013345195729537367, nan, 0.0, 0.6398834075785074, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.6836 | 15.0 | 300 | 2.5547 | 0.0894 | 0.1315 | 0.5128 | [0.14964508283674494, 0.35737291402764776, 0.6320891644923362, 0.7279309931189409, 0.5521137424809658, 0.7370275812999624, 0.05156811884690208, 0.7047224257000224, 0.0, 0.01609047680246755, 0.3708605988498909, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.013722287047841307, nan, 0.0, 0.605975319333189, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6448488233651316, 0.558203993991979, 0.9323547848977544, 0.9845282517956963, 0.7514484622254792, 0.8987434800804336, 0.05266220833103393, 0.9807775034325886, nan, 0.01609047680246755, 0.3897572157966031, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.013766622963101705, nan, 0.0, 0.6157399769015014, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.4588 | 16.0 | 320 | 2.4921 | 0.0955 | 0.1360 | 0.5444 | [0.17656221903022504, 0.3840441825774168, 0.6987785037963307, 0.7360219727887389, 0.49025671568852036, 0.7487145573282632, 0.1849851917526551, 0.7136239190322704, 0.0, 0.003367176455468449, 0.2593436645396536, 0.0, 0.0, 0.011097325613454646, nan, 0.0, 0.0, 0.0, 0.023831688698566375, nan, 0.0, 0.7286813421671298, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.598944629812003, 0.6622088251841376, 0.9294058108949594, 0.9801920325658914, 0.7351661437698949, 0.924496524410783, 0.19678718889061692, 0.9876485736341832, nan, 0.003367176455468449, 0.2668021256642701, nan, 0.0, 0.011279644095701825, nan, 0.0, 0.0, 0.0, 0.023974527064993444, nan, 0.0, 0.7500412473189243, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.5944 | 17.0 | 340 | 2.4291 | 0.0945 | 0.1340 | 0.5214 | [0.20359126628612384, 0.3540357542469868, 0.5200241530851897, 0.8362056220149517, 0.4961164905454599, 0.7601217630484368, 0.04573941618047835, 0.7718666006670892, 0.0, 0.02114124148567022, 0.3789665893500469, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.7136630253457461, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.5845472050328517, 0.5726941294477335, 0.973909077466798, 0.9593221338918305, 0.7708796116060183, 0.9252212434545621, 0.046646967044103406, 0.983217145854081, nan, 0.02114124148567022, 0.40007293946024797, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.7293625914315569, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8818 | 18.0 | 360 | 2.5280 | 0.0933 | 0.1311 | 0.5066 | [0.15395976334635586, 0.34802668681959387, 0.580464373724057, 0.7726843760922755, 0.5195764905784382, 0.772978293439253, 0.049167916508938966, 0.7223149380345307, 0.0, 0.026975967099344557, 0.24279502477500253, nan, 0.00028037383177570094, 0.0002452096542543875, nan, 0.0, 0.0, 0.0, 0.0015366705471478463, nan, 0.0, 0.7528300874510435, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6542707215615952, 0.5807702035191312, 0.8692767308040744, 0.9781492898334833, 0.7422767765131564, 0.9058069389296615, 0.05147822422451549, 0.9819442889385198, nan, 0.026975967099344557, 0.25018234865062, nan, 0.0003618599601954044, 0.0002452096542543875, nan, 0.0, 0.0, 0.0, 0.0015452331897359056, nan, 0.0, 0.7717098388604741, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.4793 | 19.0 | 380 | 2.4306 | 0.1000 | 0.1370 | 0.5227 | [0.16151832996352422, 0.34420710293292217, 0.6205131271055299, 0.8034198011772079, 0.5579396520130538, 0.742377509227891, 0.1287482992538506, 0.732010588967488, 0.0, 0.0005140727412928929, 0.38390826842130726, nan, 0.0, 0.047539417104634496, nan, 0.0, 0.0, 0.0, 0.010253593335164332, nan, 0.0, 0.7680098332620778, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6381829406317522, 0.5792441111243017, 0.9294599587226944, 0.9643146856241058, 0.7321432660819384, 0.8991823944308914, 0.13339888725561513, 0.9842071456772954, nan, 0.0005140727412928929, 0.41341044076273836, nan, 0.0, 0.048796721196623115, nan, 0.0, 0.0, 0.0, 0.010488855590934631, nan, 0.0, 0.790353627014244, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.4278 | 20.0 | 400 | 2.4267 | 0.0998 | 0.1396 | 0.5166 | [0.1486001280187761, 0.34512239779898557, 0.6502131360414325, 0.8151734464686798, 0.5232145392588385, 0.7644822110639028, 0.12781954887218044, 0.7317361792355304, 0.0, 0.023056162446986248, 0.4594955418871883, 0.0, 0.0016847281668336217, 0.015191888431325051, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.7830637488106565, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6551746946334867, 0.5561755126689756, 0.8999598056509506, 0.9635256117166413, 0.8016923534934848, 0.9091243148342844, 0.13458287136213357, 0.9802117892478948, nan, 0.023056162446986248, 0.5020839845785141, nan, 0.002231469754538327, 0.015378148316810874, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.8147170433921795, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8774 | 21.0 | 420 | 2.4260 | 0.1057 | 0.1449 | 0.5405 | [0.1433652644598082, 0.3588576254523741, 0.7277796998978452, 0.815332621176059, 0.5207554517133957, 0.7705992024444559, 0.31925570158564687, 0.7592512210835088, 0.0, 0.017067215010924047, 0.4084452228059934, 0.0, 0.006384640117587616, 0.0401695668524153, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.8188156882850551, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6585186600620287, 0.5614609194749691, 0.9243513194367793, 0.9624931785666878, 0.765623926534912, 0.9112780573446703, 0.3871728365951427, 0.9792276822807711, nan, 0.017067215010924047, 0.43742836303011357, nan, 0.0083830890778602, 0.04116019196412933, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.8783479073860199, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8807 | 22.0 | 440 | 2.3898 | 0.1079 | 0.1454 | 0.5394 | [0.1374158811478557, 0.3695142646694579, 0.7273029393144314, 0.833346076894649, 0.5339257383695775, 0.752263566797126, 0.28443638127228477, 0.7426869634639708, 0.0, 0.014342629482071713, 0.44847422587748864, nan, 0.010157332102141564, 0.058949442783365044, nan, 0.0, 0.0, 0.0, 0.00013572204125950055, nan, 0.0, 0.8069368992895947, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6576220363647054, 0.5877898844333415, 0.9115495233949855, 0.9644916741640979, 0.7629903588522225, 0.9201482101481081, 0.31776929367321033, 0.9757862543238831, nan, 0.014342629482071713, 0.5199020527248098, nan, 0.01724865810264761, 0.06077696430448033, nan, 0.0, 0.0, 0.0, 0.00014047574452144597, nan, 0.0, 0.8495847769894956, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.1596 | 23.0 | 460 | 2.4047 | 0.1056 | 0.1475 | 0.5362 | [0.12985741431078748, 0.3681569312190797, 0.7395555829844718, 0.8076224596948546, 0.5300460133232607, 0.7571050188643893, 0.3254401915394347, 0.7091540789366978, 0.0, 0.03676074751980312, 0.4704422653592155, 0.0, 0.0029033443586916575, 0.03737100147426887, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.7883707950682429, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6579527582202754, 0.5615589885465647, 0.8984415838656135, 0.9641893187416115, 0.7953603407607576, 0.9073992793638804, 0.3600565904267861, 0.979080360878507, nan, 0.036859015550700425, 0.6123788683963739, nan, 0.004764489475906158, 0.038182646162468914, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.854534455260408, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.5615 | 24.0 | 480 | 2.3712 | 0.1033 | 0.1418 | 0.5379 | [0.17694168431165172, 0.3677540332798378, 0.5573077523007332, 0.8236624059010761, 0.5594355734633305, 0.7329106448596597, 0.2690624006981366, 0.8041795363021464, 0.0, 0.05168184328482809, 0.4030742954739539, 0.0, 0.012246107252197385, 0.042640286542725565, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.7799219543486395, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6037143739067805, 0.5618119035206797, 0.9673238685707264, 0.961799973451719, 0.772642956923993, 0.9120742275617797, 0.2846127218088869, 0.9696576839896993, nan, 0.05171571777406503, 0.41804730644993227, nan, 0.02243531753211507, 0.043787438259712055, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.8023978441401309, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.535 | 25.0 | 500 | 2.3069 | 0.0992 | 0.1382 | 0.5264 | [0.14418666088297982, 0.3635375998708063, 0.6844380904236833, 0.8229712185545932, 0.5096163382387721, 0.7868047104302384, 0.14615298754153538, 0.7258918538404697, 0.0, 0.03286209998714818, 0.2987155801882643, 0.0, 0.00015640273704789834, 0.03566774127169216, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.803732371964797, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6498170005732512, 0.5964526524242847, 0.9023173187661793, 0.9676627188389552, 0.7648796574071953, 0.8974981882023906, 0.17079472424709147, 0.9783201824428246, nan, 0.03286209998714818, 0.35380848181723457, nan, 0.0002412399734636029, 0.036431148632080426, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.833745806522576, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.7637 | 26.0 | 520 | 2.2854 | 0.1080 | 0.1475 | 0.5433 | [0.1416567762754726, 0.3729907848890519, 0.6894907440535549, 0.8432713542779796, 0.5358741811870772, 0.7943196502221515, 0.318381439156801, 0.755121738021962, 0.0, 0.041562781133530395, 0.47124206069160196, 0.0, 0.0006462818596760512, 0.03925054704595186, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.8261486699610977, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.671049343700851, 0.5599004168795798, 0.9431551939221146, 0.9602144511142904, 0.7990129846337051, 0.9160754932682788, 0.34188544452081254, 0.9774126826048781, nan, 0.041562781133530395, 0.5566322809211212, nan, 0.0009649598938544116, 0.04021438329771955, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.8642688225265358, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.2267 | 27.0 | 540 | 2.3305 | 0.1102 | 0.1487 | 0.5453 | [0.1585445792258586, 0.37351634933144817, 0.6253498101079337, 0.8458549807996386, 0.560513555745284, 0.8074872071101535, 0.2918451222150119, 0.7771591807537367, 0.0, 0.061136100758257296, 0.5001658217652911, 0.0, 0.004806457530728307, 0.041834849832386944, nan, 0.0, 0.0, 0.0, 0.07112858547046734, nan, 0.0, 0.831306270731322, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6399320917789896, 0.5699671210533651, 0.9591496291915105, 0.9665122933290069, 0.7968259784276456, 0.9181067480529556, 0.30459496204728914, 0.97312857622704, nan, 0.061136100758257296, 0.5500156298843388, nan, 0.007900609130932995, 0.042841629593302274, nan, 0.0, 0.0, 0.0, 0.07454579509271399, nan, 0.0, 0.8683385579937304, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.2666 | 28.0 | 560 | 2.2689 | 0.1083 | 0.1493 | 0.5495 | [0.1429050997343189, 0.37517871696355803, 0.6722014322835649, 0.8471102864910484, 0.5245935468655573, 0.7793895306141844, 0.3874185113221595, 0.7563301274718839, 0.0, 0.054967635877941026, 0.4217004189944134, 0.0, 0.004186660655051367, 0.054995573111761904, nan, 0.0, 0.0, 0.0, 0.0024382675001108304, nan, 0.0, 0.8249856971966505, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6570046889009745, 0.5544188719303951, 0.9449628983249577, 0.9696833380038643, 0.8150319463210205, 0.9208218926395084, 0.4281356765515961, 0.9741008974819826, nan, 0.055005783318339545, 0.5033864749400855, nan, 0.007840299137567095, 0.05657337023154797, nan, 0.0, 0.0, 0.0, 0.0025753886495598426, nan, 0.0, 0.8723532970356926, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8234 | 29.0 | 580 | 2.2416 | 0.1085 | 0.1496 | 0.5519 | [0.17310586948765747, 0.3739132666632445, 0.5866949187941202, 0.8376749944446208, 0.5045375832608805, 0.7548976447281532, 0.38251504364010913, 0.7625561519500564, 0.0, 0.1470894171129175, 0.37222171407664867, 0.0, 0.005667557282811108, 0.14362687321097828, nan, 0.0, 0.0, 0.0, 0.01884734522926574, nan, 0.0, 0.7969166755574523, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6026340158452516, 0.5514269049917158, 0.9653016554657026, 0.9729797495612159, 0.8040053129365424, 0.9101552531923364, 0.4247894163861394, 0.9753207186927288, nan, 0.1472047294692199, 0.4240387621131604, nan, 0.008443399071226102, 0.14940273934213752, nan, 0.0, 0.0, 0.0, 0.02022850721108822, nan, 0.0, 0.8215915965462245, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8137 | 30.0 | 600 | 2.2642 | 0.1130 | 0.1533 | 0.5552 | [0.15324773730312397, 0.37071462717013937, 0.6543737598384388, 0.8441023278432577, 0.5492826002082248, 0.7741790684023342, 0.3642876309165329, 0.7779958769857854, 0.0, 0.1272920100688721, 0.4288216696573528, 0.0, 0.004337004073012521, 0.14473154362416107, nan, 0.0, 0.0, 0.0, 0.07291174068182796, nan, 0.0, 0.8337273443656422, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6183396292975468, 0.5590470439056954, 0.9663387946277024, 0.9728322591112225, 0.7732498225204387, 0.9154324327083057, 0.4290939009767869, 0.976298932803762, nan, 0.12802981621899498, 0.5229238303636553, nan, 0.006935649237078584, 0.15108417697131046, nan, 0.0, 0.0, 0.0, 0.07941562090279078, nan, 0.0, 0.8727932684375516, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.7144 | 31.0 | 620 | 2.2412 | 0.1126 | 0.1520 | 0.5516 | [0.16284596003491836, 0.37633172781601804, 0.6010533936135908, 0.8450746884223279, 0.5233439640299054, 0.8141838171823179, 0.3424316365938007, 0.7973221165262143, 0.0, 0.08517443354056414, 0.39258674346230665, 0.0, 0.006852882322304761, 0.14132409338626117, nan, 0.0, 0.0, 0.0, 0.1756434067391881, nan, 0.0, 0.8170673459640184, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.626541531315685, 0.5639281308551107, 0.9665720468087144, 0.9670727570389818, 0.8023221196784758, 0.9146260551807204, 0.3745553788498297, 0.9695692911483409, nan, 0.08522040868782933, 0.44034594144003336, nan, 0.015077498341475183, 0.14716082250324028, nan, 0.0, 0.0, 0.0, 0.18598988574639447, nan, 0.0, 0.846724962877413, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.6542 | 32.0 | 640 | 2.2640 | 0.1140 | 0.1532 | 0.5487 | [0.1350172150363662, 0.3753543607213474, 0.686882860329998, 0.8300517608850096, 0.5870417631449963, 0.7993580745813627, 0.4032773763775868, 0.7667334949645654, 0.0, 0.05347023220550251, 0.4913675701796775, 0.0, 0.002870676044208411, 0.11942659084025978, nan, 0.0, 0.0, 0.0, 0.061683172794283904, nan, 0.0, 0.8424789850154023, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6611423867828847, 0.5421757225711991, 0.9348643284523926, 0.965000516216575, 0.7954175922321204, 0.9126254223274709, 0.4701520616875787, 0.9690742912367337, nan, 0.05347641691299319, 0.5827341877670105, nan, 0.006030999336590073, 0.12432129470697446, nan, 0.0, 0.0, 0.0, 0.0638696385090841, nan, 0.0, 0.8874223175493593, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.7602 | 33.0 | 660 | 2.2481 | 0.1119 | 0.1509 | 0.5450 | [0.14017012927798683, 0.3623220957305291, 0.6964876400637953, 0.8480341033529749, 0.5501191599622527, 0.8411447805116538, 0.3631870209052227, 0.7478304632879023, 0.0, 0.11501524966601971, 0.35747542679772376, 0.0, 0.0026031982149497955, 0.10218413978494624, nan, 0.0, 0.0, 0.0, 0.09521612769294645, nan, 0.0, 0.8219225793611304, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6710346449517146, 0.5384353337703429, 0.9213023802135507, 0.9594401262518252, 0.7876428424210502, 0.9131970317141136, 0.45304951160655604, 0.9719500050089277, nan, 0.11728569592597353, 0.3960091695321455, nan, 0.005066039442735661, 0.10652607979822748, nan, 0.0, 0.0, 0.0, 0.09664731223075482, nan, 0.0, 0.9084859484133532, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.021 | 34.0 | 680 | 2.2351 | 0.1138 | 0.1569 | 0.5519 | [0.16929037156127189, 0.3777715701544251, 0.5403265698444515, 0.8353256970069701, 0.5393346199797813, 0.766401531008911, 0.38034133488927585, 0.7625754899331342, 0.0, 0.1426354401805869, 0.45649971862689925, 0.0, 0.0014980739049793123, 0.28264387907036964, nan, 0.0, 0.0, 0.0, 0.08527356188743228, nan, 0.0, 0.8071615069004103, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.557192832889921, 0.5537650781197578, 0.9599951683476791, 0.9615713632542293, 0.8063526232624179, 0.9156467862282968, 0.4185935673541467, 0.9717908978944825, nan, 0.14292507389795656, 0.633948108783995, nan, 0.0025330197213678306, 0.30588152870704455, nan, 0.0, 0.0, 0.0, 0.09655366173440719, nan, 0.0, 0.8330858494197877, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.2313 | 35.0 | 700 | 2.2293 | 0.1193 | 0.1606 | 0.5563 | [0.15053013063632534, 0.37542485456460994, 0.6854628407608374, 0.8478402544250672, 0.5059011325551832, 0.8391613599970208, 0.34762521100625376, 0.7699393542042288, 0.0, 0.13393953977279924, 0.3995456422965717, 0.0, 0.0027104096671278126, 0.2737309727575619, nan, 0.0, 0.0, 0.0, 0.26018550624133147, nan, 0.0, 0.848697186753867, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6563138476915614, 0.5640451255370142, 0.9421222199776328, 0.9593958791168272, 0.8019900611445714, 0.9200461370433505, 0.4163560380680992, 0.9688385769931113, nan, 0.1359208327978409, 0.5039595706991769, nan, 0.00578975936312647, 0.2935509860931096, nan, 0.0, 0.0, 0.0, 0.2810919647874134, nan, 0.0, 0.8992465489743167, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8443 | 36.0 | 720 | 2.2200 | 0.1237 | 0.1629 | 0.5597 | [0.14640725658035164, 0.37444109329288744, 0.6664958244143628, 0.8364106800904311, 0.5650507982583455, 0.8346773818492992, 0.38794353902179296, 0.818598319108213, 0.0, 0.12268982693658599, 0.4300575475874281, 0.0, 0.009388350259650558, 0.31106827309236945, nan, 0.0, 0.0, 0.0, 0.32794105191835354, nan, 0.0, 0.8481905060316466, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6402113680125822, 0.555150088692292, 0.9668448685561482, 0.9630978894116606, 0.7801199990839764, 0.9159734201635211, 0.4441244788714023, 0.9602467928130727, nan, 0.12345456882148824, 0.5061477545066166, nan, 0.022133767565285567, 0.3391599817844257, nan, 0.0, 0.0, 0.0, 0.36261472185802585, nan, 0.0, 0.8932519386239894, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.5543 | 37.0 | 740 | 2.2140 | 0.1212 | 0.1648 | 0.5585 | [0.17110138302086084, 0.3761434451266687, 0.600560775893203, 0.8479301439899833, 0.511823298228152, 0.825049864750403, 0.34966334689297235, 0.7560746169828984, 0.0, 0.1827091532309255, 0.4259672619047619, 0.0, 9.861932938856015e-05, 0.324433941174592, nan, 0.0, 0.0, 0.0, 0.32808071647205533, nan, 0.0, 0.8426751920796164, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6095497773139505, 0.5646283784365037, 0.9523644898545714, 0.9588722880193508, 0.8007648796574072, 0.9246598413783952, 0.40018662800323085, 0.9756742900581625, nan, 0.18496337231718288, 0.5368865270396999, nan, 0.0001809299800977022, 0.3558692682243318, nan, 0.0, 0.0, 0.0, 0.40653680464506464, nan, 0.0, 0.8987515811472254, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.8282 | 38.0 | 760 | 2.2059 | 0.1234 | 0.1632 | 0.5572 | [0.15369400689586105, 0.3713614126191303, 0.6297725026072933, 0.8492940775859811, 0.5695938419591348, 0.8389840001493568, 0.3691745736702185, 0.7976948576453559, 0.0, 0.18737848038530455, 0.41516165976565944, 0.0, 0.0040568908177465385, 0.2746054986172117, nan, 0.0, 0.0, 0.0, 0.3488028140864212, nan, 0.0, 0.8545765168772003, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.631619949142328, 0.5482628870502372, 0.9645769076175581, 0.9564313210719606, 0.7845856138502759, 0.9174024436301279, 0.42845675698726216, 0.9515017943746795, nan, 0.18950006425909266, 0.4910388663123893, nan, 0.010373318858934926, 0.29565278312957577, nan, 0.0, 0.0, 0.0, 0.4085971155647125, nan, 0.0, 0.9078259913105649, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.3943 | 39.0 | 780 | 2.1896 | 0.1212 | 0.1608 | 0.5580 | [0.16612933853782044, 0.37797477025607085, 0.5659295857751586, 0.8492484034714263, 0.561597469225852, 0.8389648812366, 0.36699645823482707, 0.7913155037046291, 0.0, 0.19248760822991012, 0.38709538009421324, 0.0, 0.003988192438135913, 0.2794414426144131, nan, 0.0, 0.0, 0.0, 0.3217505866853308, nan, 0.0, 0.8404445849136115, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.5786677053782723, 0.5601017165528549, 0.9727927991719547, 0.9561732127844722, 0.7846314150273662, 0.9146668844226234, 0.4127589338122783, 0.9704767909862874, nan, 0.19514201259478217, 0.4666562467437741, nan, 0.007659369157469393, 0.3007321259677024, nan, 0.0, 0.0, 0.0, 0.36593931447836675, nan, 0.0, 0.8774679645823021, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.1913 | 40.0 | 800 | 2.1733 | 0.1230 | 0.1643 | 0.5589 | [0.15146724950018636, 0.37587913550376595, 0.6347491880738849, 0.8479655012689113, 0.5666118488336892, 0.8384290493188061, 0.40726791357075753, 0.7804316451128962, 0.0, 0.16712981341802302, 0.44501127061335866, 0.0, 0.0003263029871555279, 0.30505141678217984, nan, 0.0, 0.0, 0.0, 0.27534090429349434, nan, 0.0, 0.8472717733473243, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6570267370246792, 0.5393884963258508, 0.9520583463669932, 0.9585183109393667, 0.792967229257792, 0.9158713470587635, 0.478480085487666, 0.9687089341591189, nan, 0.16772908366533865, 0.5862769615504845, nan, 0.000663409927024908, 0.3314884226013241, nan, 0.0, 0.0, 0.0, 0.30539426858962354, nan, 0.0, 0.8881372710773799, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.7766 | 41.0 | 820 | 2.2109 | 0.1207 | 0.1600 | 0.5558 | [0.1570680721149667, 0.37397690324027355, 0.6323630601710086, 0.8411863082515966, 0.5543045706892881, 0.8332913975789091, 0.3250286061646285, 0.8021170762784543, 0.0, 0.20035420905870827, 0.3642310430758277, 0.0, 0.003817692126663703, 0.30298348942744036, nan, 0.0, 0.0, 0.0, 0.2808196028728348, nan, 0.0, 0.8477005459890802, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6507209736451428, 0.5623985974402251, 0.9607969727199078, 0.9606937950767688, 0.7751276707811391, 0.9127172881217528, 0.36481761126189627, 0.9645191134787298, nan, 0.20499935740907338, 0.4264353443784516, nan, 0.008805259031421507, 0.32977195502154344, nan, 0.0, 0.0, 0.0, 0.31124742461135047, nan, 0.0, 0.8880272782269153, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.5868 | 42.0 | 840 | 2.1911 | 0.1227 | 0.1623 | 0.5548 | [0.15264070972969604, 0.3626065421174023, 0.6132219128690903, 0.8489022871747821, 0.5636104319478402, 0.8618165913398455, 0.4113041794230476, 0.7861103788732057, 0.0, 0.1992922426742025, 0.3741768372991483, 0.0, 0.0028042624789680315, 0.25942038466570716, nan, 0.0, 0.0, 0.0, 0.35903732809430255, nan, 0.0, 0.8317786728853085, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6479282113092176, 0.5133898692928531, 0.9577646943667516, 0.9552513974720137, 0.7918451004190807, 0.9050618052649307, 0.5007801251210323, 0.9665344702617017, nan, 0.2012080709420383, 0.4440450140668959, nan, 0.00663409927024908, 0.277822538270221, nan, 0.0, 0.0, 0.0, 0.410751076980708, nan, 0.0, 0.8631138975966562, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 2.4634 | 43.0 | 860 | 2.1647 | 0.1229 | 0.1609 | 0.5583 | [0.15111144958072076, 0.3752083766248759, 0.6554744566863172, 0.8501452068624173, 0.566586754323093, 0.8621972134970034, 0.3623273621702657, 0.8084849693404457, 0.0, 0.1888887470805202, 0.38466666666666666, 0.0, 0.006035143102527797, 0.2627226671024677, nan, 0.0, 0.0, 0.0, 0.3160291051444619, nan, 0.0, 0.8452004219409283, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.656233004571311, 0.5591881257279908, 0.964689368490546, 0.9563428268019646, 0.7874825383012344, 0.9133909706131531, 0.404982767010992, 0.9556680436307065, nan, 0.1902069142783704, 0.4509221631759925, nan, 0.01568059827513419, 0.28157074298525236, nan, 0.0, 0.0, 0.0, 0.3477711181869264, nan, 0.0, 0.8813177143485673, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.2669 | 44.0 | 880 | 2.2000 | 0.1245 | 0.1629 | 0.5561 | [0.14951348733209258, 0.368295580204438, 0.6620979694572893, 0.8464151385400263, 0.5743848107590457, 0.8624840592031534, 0.3698799450288253, 0.8034236947791165, 0.0, 0.18619180715577532, 0.388349963045085, 0.0, 0.007610581092801388, 0.30749727196867577, nan, 0.0, 0.0, 0.0, 0.3452479681539227, nan, 0.0, 0.8534649328946682, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6624432260814604, 0.5403674665317787, 0.9619778118862813, 0.9592188905768352, 0.7897840474500195, 0.9112474354132429, 0.4293848801216092, 0.9431044744456296, nan, 0.18920447243284924, 0.43800145878920493, nan, 0.021168807671431155, 0.33562195677304096, nan, 0.0, 0.0, 0.0, 0.3898670162951864, nan, 0.0, 0.8988065775724577, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.9998 | 45.0 | 900 | 2.2035 | 0.1247 | 0.1642 | 0.5587 | [0.14884557839718957, 0.3728449344010422, 0.6613773550393268, 0.8426013561695341, 0.5789663592652575, 0.8533438252526593, 0.3652341020265549, 0.8017765219474279, 0.0, 0.19025852190447406, 0.39791666666666664, 0.0, 0.0016051801066492546, 0.30900750625521267, nan, 0.0, 0.0, 0.0, 0.358512239562703, nan, 0.0, 0.8525157232704402, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6638837034968325, 0.543607186914489, 0.9638646554219678, 0.9585256854618663, 0.7900245036297433, 0.9187498086129285, 0.41953172425210833, 0.9664814345568867, nan, 0.194178126204858, 0.45774721267062624, nan, 0.003558289608588143, 0.33744351420464497, nan, 0.0, 0.0, 0.0, 0.42381532122120247, nan, 0.0, 0.8945718528295661, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.4752 | 46.0 | 920 | 2.2000 | 0.1270 | 0.1672 | 0.5613 | [0.15043217445562077, 0.37379856021791297, 0.6696532629808947, 0.8454463855970237, 0.5663837342218547, 0.8629839561917572, 0.3903775215253501, 0.8056657534793435, 0.0, 0.18936893752117553, 0.39849558296160237, 0.0, 0.005366565622544067, 0.356681068466366, nan, 0.0, 0.0, 0.0, 0.392245162767622, nan, 0.0, 0.8526724771605586, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6609072067967016, 0.5453982378536293, 0.9620965205855463, 0.9585699325968644, 0.7999633590583278, 0.9169022854168155, 0.446507497729861, 0.9514192943894118, nan, 0.1939467934712762, 0.4747316869855163, nan, 0.014414088414450275, 0.39618874137387466, nan, 0.0, 0.0, 0.0, 0.4812699007304739, nan, 0.0, 0.8931419457735247, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.105 | 47.0 | 940 | 2.1804 | 0.1262 | 0.1665 | 0.5618 | [0.15965737440981204, 0.3757317366393188, 0.6338628438589694, 0.8481555021423346, 0.5654543829863283, 0.8596382556987116, 0.38719821328866555, 0.8067233202966554, 0.0, 0.19231159346789736, 0.38126440417551627, 0.0, 0.006097150926901437, 0.36654826643196176, nan, 0.0, 0.0, 0.0, 0.3818290451473169, nan, 0.0, 0.8511586452762924, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6486337512677671, 0.5511602259373769, 0.9676300120583047, 0.9576259937169068, 0.7970320837245517, 0.9207708560871296, 0.43488338258238973, 0.9493214376211718, nan, 0.194936383498265, 0.43956444722309057, nan, 0.016404318195525, 0.4084842540372018, nan, 0.0, 0.0, 0.0, 0.47878816257726164, nan, 0.0, 0.892866963647363, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.2312 | 48.0 | 960 | 2.1784 | 0.1252 | 0.1661 | 0.5627 | [0.16025539225594088, 0.37466007373449767, 0.654619986015055, 0.8452278087095496, 0.5433893649232883, 0.8508070627462867, 0.3813023672358711, 0.8051317856519462, 0.0, 0.20107536163967984, 0.3633123234079548, 0.0, 0.005671668453018935, 0.34927627438640657, nan, 0.0, 0.0, 0.0, 0.3683419988445985, nan, 0.0, 0.856464820320242, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6526391604074493, 0.554434356520647, 0.9631482380088594, 0.9594253772068259, 0.7972954404928206, 0.9168002123120579, 0.44429003597104255, 0.951348580116325, nan, 0.2095489011695155, 0.43544857768052514, nan, 0.015318738314938785, 0.38883245174624304, nan, 0.0, 0.0, 0.0, 0.44783667353436973, nan, 0.0, 0.9030962987405818, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.5244 | 49.0 | 980 | 2.1803 | 0.1260 | 0.1661 | 0.5626 | [0.1577863939029567, 0.3754328571179982, 0.6556101984091115, 0.8462766303993745, 0.5643613013137634, 0.8601731144265224, 0.3729140988994659, 0.8078361274453667, 0.0, 0.20401304977064783, 0.39930570813376104, 0.0, 0.00490553208023702, 0.32288441280684393, nan, 0.0, 0.0, 0.0, 0.3817606224731101, nan, 0.0, 0.851583425240271, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.6529037378919054, 0.556800057809137, 0.9667324076831603, 0.9579136000943939, 0.7889710765566675, 0.9139421653788443, 0.4280503895608723, 0.9560864364131364, nan, 0.21377714946664952, 0.473429196623945, nan, 0.012182618659911947, 0.35432795039758996, nan, 0.0, 0.0, 0.0, 0.46867390897171757, nan, 0.0, 0.8917670351427157, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
| 1.9845 | 50.0 | 1000 | 2.1645 | 0.1262 | 0.1662 | 0.5628 | [0.15843597843469054, 0.37668694709687084, 0.6441654758344134, 0.8511895383645817, 0.5639924565886474, 0.8586149258267328, 0.3807328968759612, 0.8089034968690005, 0.0, 0.21086659359875867, 0.3886605682973176, 0.0, 0.006568939289171623, 0.3416319575918213, nan, 0.0, 0.0, 0.0, 0.3749031458236479, nan, 0.0, 0.8503101019657311, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] | [0.650956153631326, 0.5537014192487221, 0.9688420903560636, 0.9561805873069719, 0.7910321295257288, 0.9228123181822822, 0.43471782548274945, 0.9484787592002216, nan, 0.22006168872895515, 0.4582161092007919, nan, 0.017188348109281708, 0.37926927523032195, nan, 0.0, 0.0, 0.0, 0.4531279265780109, nan, 0.0, 0.8897321674091184, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0] |
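The wide bracketed columns in the table above appear to be per-category IoU and per-category accuracy arrays of the kind returned by the `evaluate` library's `mean_iou` metric (an assumption; the table header for these columns appears earlier in this card). A minimal sketch of how such figures are computed:

```python
# Minimal sketch (assumption: the bracketed columns above are the
# per_category_iou / per_category_accuracy arrays returned by the
# `evaluate` library's mean_iou metric for semantic segmentation).
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# Toy 2x2 segmentation maps over 3 labels; a real evaluation loops a dataset.
predictions = [np.array([[0, 1], [2, 2]])]
references = [np.array([[0, 1], [2, 1]])]

results = mean_iou.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,  # common convention for "void" pixels
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # one entry per label
print(results["per_category_accuracy"])
```

The `nan` entries in the table typically correspond to labels absent from both predictions and references in that evaluation pass, where the IoU denominator is empty.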
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k15_task1_organization_fold1 | MayBashendy | 2024-11-24T08:14:47Z | 163 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T19:30:31Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k15_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k15_task1_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5141
- Qwk: 0.6020
- Mse: 0.5141
- Rmse: 0.7170
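A minimal sketch of reproducing these three figures with scikit-learn, assuming `Qwk` is Cohen's kappa with quadratic weights on integer-rounded scores and `Mse`/`Rmse` are taken on the raw outputs (the card itself does not spell this out):

```python
# Minimal sketch (assumptions: "Qwk" = Cohen's kappa with quadratic weights
# on integer-rounded scores; Mse/Rmse on raw regression outputs).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 1, 3])             # hypothetical gold scores
y_pred = np.array([0.2, 1.1, 1.8, 1.0, 2.6])   # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```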
## Model description
More information needed
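In the meantime, a minimal loading sketch (assumptions: the checkpoint is published under the repo id shown in this card's header, and the single-logit output is a score, which the card does not document):

```python
# Minimal loading sketch; the repo id comes from this card's header,
# the input sentence is hypothetical, and output semantics are assumed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/Arabic_FineTuningAraBERT_AugV5_k15_task1_organization_fold1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")  # hypothetical input
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score)
```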
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
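For reference, a minimal `Trainer` setup matching the hyperparameters listed above; the dataset loading, tokenization, and the regression head (`num_labels=1`, inferred from the Mse/Rmse metrics) are assumptions, not taken from the card:

```python
# Minimal sketch of a Trainer configured with the hyperparameters above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=1  # assumed regression head, given Mse/Rmse metrics
)

args = TrainingArguments(
    output_dir="arabert-task1-fold1",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # hypothetical datasets
# trainer.train()
```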
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0125 | 2 | 5.8676 | -0.0500 | 5.8676 | 2.4223 |
| No log | 0.025 | 4 | 3.5269 | -0.0749 | 3.5269 | 1.8780 |
| No log | 0.0375 | 6 | 2.1337 | -0.2297 | 2.1337 | 1.4607 |
| No log | 0.05 | 8 | 1.9398 | -0.1897 | 1.9398 | 1.3928 |
| No log | 0.0625 | 10 | 1.8609 | -0.1484 | 1.8609 | 1.3642 |
| No log | 0.075 | 12 | 1.1760 | -0.0652 | 1.1760 | 1.0844 |
| No log | 0.0875 | 14 | 0.8634 | 0.2410 | 0.8634 | 0.9292 |
| No log | 0.1 | 16 | 1.0159 | 0.1515 | 1.0159 | 1.0079 |
| No log | 0.1125 | 18 | 1.1581 | 0.1057 | 1.1581 | 1.0762 |
| No log | 0.125 | 20 | 1.1322 | 0.2291 | 1.1322 | 1.0640 |
| No log | 0.1375 | 22 | 1.1796 | 0.1478 | 1.1796 | 1.0861 |
| No log | 0.15 | 24 | 1.0250 | 0.1674 | 1.0250 | 1.0124 |
| No log | 0.1625 | 26 | 0.9944 | 0.2291 | 0.9944 | 0.9972 |
| No log | 0.175 | 28 | 1.0728 | 0.2696 | 1.0728 | 1.0358 |
| No log | 0.1875 | 30 | 1.0598 | 0.2291 | 1.0598 | 1.0295 |
| No log | 0.2 | 32 | 1.0952 | 0.2291 | 1.0952 | 1.0465 |
| No log | 0.2125 | 34 | 1.1324 | 0.2696 | 1.1324 | 1.0642 |
| No log | 0.225 | 36 | 1.2377 | 0.1901 | 1.2377 | 1.1125 |
| No log | 0.2375 | 38 | 1.4215 | 0.1532 | 1.4215 | 1.1923 |
| No log | 0.25 | 40 | 1.6697 | -0.0561 | 1.6697 | 1.2922 |
| No log | 0.2625 | 42 | 1.7462 | 0.0421 | 1.7462 | 1.3214 |
| No log | 0.275 | 44 | 1.6673 | 0.0421 | 1.6673 | 1.2912 |
| No log | 0.2875 | 46 | 1.4630 | 0.0239 | 1.4630 | 1.2095 |
| No log | 0.3 | 48 | 1.1467 | 0.2018 | 1.1467 | 1.0708 |
| No log | 0.3125 | 50 | 1.0225 | 0.2018 | 1.0225 | 1.0112 |
| No log | 0.325 | 52 | 1.0198 | 0.2844 | 1.0198 | 1.0098 |
| No log | 0.3375 | 54 | 1.0160 | 0.3090 | 1.0160 | 1.0080 |
| No log | 0.35 | 56 | 1.0062 | 0.2696 | 1.0062 | 1.0031 |
| No log | 0.3625 | 58 | 1.0100 | 0.2696 | 1.0100 | 1.0050 |
| No log | 0.375 | 60 | 1.2045 | 0.3058 | 1.2045 | 1.0975 |
| No log | 0.3875 | 62 | 1.4424 | 0.2519 | 1.4424 | 1.2010 |
| No log | 0.4 | 64 | 1.6611 | 0.0912 | 1.6611 | 1.2888 |
| No log | 0.4125 | 66 | 1.4408 | 0.0797 | 1.4408 | 1.2003 |
| No log | 0.425 | 68 | 1.0612 | 0.3467 | 1.0612 | 1.0301 |
| No log | 0.4375 | 70 | 0.8896 | 0.3762 | 0.8896 | 0.9432 |
| No log | 0.45 | 72 | 0.8860 | 0.3762 | 0.8860 | 0.9413 |
| No log | 0.4625 | 74 | 0.9667 | 0.2829 | 0.9667 | 0.9832 |
| No log | 0.475 | 76 | 1.1179 | 0.0841 | 1.1179 | 1.0573 |
| No log | 0.4875 | 78 | 1.0946 | 0.2596 | 1.0946 | 1.0462 |
| No log | 0.5 | 80 | 0.9660 | 0.1429 | 0.9660 | 0.9828 |
| No log | 0.5125 | 82 | 0.8266 | 0.24 | 0.8266 | 0.9092 |
| No log | 0.525 | 84 | 0.7715 | 0.2473 | 0.7715 | 0.8783 |
| No log | 0.5375 | 86 | 0.7737 | 0.2963 | 0.7737 | 0.8796 |
| No log | 0.55 | 88 | 0.7688 | 0.2130 | 0.7688 | 0.8768 |
| No log | 0.5625 | 90 | 0.8189 | 0.2963 | 0.8189 | 0.9049 |
| No log | 0.575 | 92 | 0.9123 | 0.2687 | 0.9123 | 0.9551 |
| No log | 0.5875 | 94 | 1.0027 | 0.2870 | 1.0027 | 1.0014 |
| No log | 0.6 | 96 | 0.9885 | 0.3694 | 0.9885 | 0.9942 |
| No log | 0.6125 | 98 | 1.0448 | 0.2844 | 1.0448 | 1.0221 |
| No log | 0.625 | 100 | 1.0323 | 0.1923 | 1.0323 | 1.0160 |
| No log | 0.6375 | 102 | 0.9002 | 0.3317 | 0.9002 | 0.9488 |
| No log | 0.65 | 104 | 0.8182 | 0.0562 | 0.8182 | 0.9045 |
| No log | 0.6625 | 106 | 0.9063 | 0.0 | 0.9063 | 0.9520 |
| No log | 0.675 | 108 | 0.8715 | 0.0233 | 0.8715 | 0.9336 |
| No log | 0.6875 | 110 | 0.7806 | 0.2410 | 0.7806 | 0.8835 |
| No log | 0.7 | 112 | 0.7740 | 0.3913 | 0.7740 | 0.8797 |
| No log | 0.7125 | 114 | 0.9095 | 0.3317 | 0.9095 | 0.9537 |
| No log | 0.725 | 116 | 0.9594 | 0.3069 | 0.9594 | 0.9795 |
| No log | 0.7375 | 118 | 1.0921 | 0.2829 | 1.0921 | 1.0450 |
| No log | 0.75 | 120 | 1.1820 | 0.0841 | 1.1820 | 1.0872 |
| No log | 0.7625 | 122 | 1.1420 | 0.1043 | 1.1420 | 1.0686 |
| No log | 0.775 | 124 | 1.0433 | 0.2829 | 1.0433 | 1.0214 |
| No log | 0.7875 | 126 | 0.9707 | 0.2829 | 0.9707 | 0.9852 |
| No log | 0.8 | 128 | 0.9419 | 0.3069 | 0.9419 | 0.9705 |
| No log | 0.8125 | 130 | 0.8750 | 0.4105 | 0.8750 | 0.9354 |
| No log | 0.825 | 132 | 0.8027 | 0.1600 | 0.8027 | 0.8959 |
| No log | 0.8375 | 134 | 0.7776 | -0.0120 | 0.7776 | 0.8818 |
| No log | 0.85 | 136 | 0.7561 | 0.1840 | 0.7561 | 0.8695 |
| No log | 0.8625 | 138 | 0.7368 | 0.2410 | 0.7368 | 0.8584 |
| No log | 0.875 | 140 | 0.7264 | 0.3226 | 0.7264 | 0.8523 |
| No log | 0.8875 | 142 | 0.7090 | 0.3438 | 0.7090 | 0.8420 |
| No log | 0.9 | 144 | 0.7397 | 0.4776 | 0.7397 | 0.8601 |
| No log | 0.9125 | 146 | 0.7865 | 0.4928 | 0.7865 | 0.8868 |
| No log | 0.925 | 148 | 0.7299 | 0.5051 | 0.7299 | 0.8543 |
| No log | 0.9375 | 150 | 0.7285 | 0.3708 | 0.7285 | 0.8535 |
| No log | 0.95 | 152 | 0.7431 | 0.4 | 0.7431 | 0.8620 |
| No log | 0.9625 | 154 | 0.7443 | 0.4 | 0.7443 | 0.8627 |
| No log | 0.975 | 156 | 0.7371 | 0.2130 | 0.7371 | 0.8585 |
| No log | 0.9875 | 158 | 0.7506 | 0.0982 | 0.7506 | 0.8664 |
| No log | 1.0 | 160 | 0.7242 | 0.2130 | 0.7242 | 0.8510 |
| No log | 1.0125 | 162 | 0.7533 | 0.4776 | 0.7533 | 0.8679 |
| No log | 1.025 | 164 | 0.8382 | 0.5196 | 0.8382 | 0.9156 |
| No log | 1.0375 | 166 | 0.8884 | 0.4928 | 0.8884 | 0.9426 |
| No log | 1.05 | 168 | 0.7869 | 0.4776 | 0.7869 | 0.8871 |
| No log | 1.0625 | 170 | 0.7214 | 0.24 | 0.7214 | 0.8494 |
| No log | 1.075 | 172 | 0.7749 | 0.3298 | 0.7749 | 0.8803 |
| No log | 1.0875 | 174 | 0.9288 | 0.3419 | 0.9288 | 0.9637 |
| No log | 1.1 | 176 | 0.9358 | 0.2857 | 0.9358 | 0.9674 |
| No log | 1.1125 | 178 | 0.7969 | 0.3125 | 0.7969 | 0.8927 |
| No log | 1.125 | 180 | 0.6925 | 0.3438 | 0.6925 | 0.8321 |
| No log | 1.1375 | 182 | 0.6785 | 0.3897 | 0.6785 | 0.8237 |
| No log | 1.15 | 184 | 0.6625 | 0.4400 | 0.6625 | 0.8140 |
| No log | 1.1625 | 186 | 0.6500 | 0.4828 | 0.6500 | 0.8062 |
| No log | 1.175 | 188 | 0.6745 | 0.4758 | 0.6745 | 0.8213 |
| No log | 1.1875 | 190 | 0.6885 | 0.5221 | 0.6885 | 0.8297 |
| No log | 1.2 | 192 | 0.7192 | 0.4576 | 0.7192 | 0.8481 |
| No log | 1.2125 | 194 | 0.7159 | 0.4576 | 0.7159 | 0.8461 |
| No log | 1.225 | 196 | 0.6903 | 0.3937 | 0.6903 | 0.8309 |
| No log | 1.2375 | 198 | 0.7128 | 0.3937 | 0.7128 | 0.8443 |
| No log | 1.25 | 200 | 0.6758 | 0.3937 | 0.6758 | 0.8221 |
| No log | 1.2625 | 202 | 0.5768 | 0.6111 | 0.5768 | 0.7595 |
| No log | 1.275 | 204 | 0.5405 | 0.6578 | 0.5405 | 0.7352 |
| No log | 1.2875 | 206 | 0.5653 | 0.5845 | 0.5653 | 0.7519 |
| No log | 1.3 | 208 | 0.6693 | 0.5103 | 0.6693 | 0.8181 |
| No log | 1.3125 | 210 | 0.7809 | 0.4981 | 0.7809 | 0.8837 |
| No log | 1.325 | 212 | 0.7698 | 0.5039 | 0.7698 | 0.8774 |
| No log | 1.3375 | 214 | 0.6515 | 0.5294 | 0.6515 | 0.8071 |
| No log | 1.35 | 216 | 0.6221 | 0.5463 | 0.6221 | 0.7887 |
| No log | 1.3625 | 218 | 0.6328 | 0.5374 | 0.6328 | 0.7955 |
| No log | 1.375 | 220 | 0.6137 | 0.6094 | 0.6137 | 0.7834 |
| No log | 1.3875 | 222 | 0.5765 | 0.5702 | 0.5765 | 0.7593 |
| No log | 1.4 | 224 | 0.5972 | 0.5044 | 0.5972 | 0.7728 |
| No log | 1.4125 | 226 | 0.6461 | 0.4766 | 0.6461 | 0.8038 |
| No log | 1.425 | 228 | 0.7013 | 0.5 | 0.7013 | 0.8374 |
| No log | 1.4375 | 230 | 0.8359 | 0.5299 | 0.8359 | 0.9143 |
| No log | 1.45 | 232 | 1.0081 | 0.3893 | 1.0081 | 1.0040 |
| No log | 1.4625 | 234 | 1.2002 | 0.3893 | 1.2002 | 1.0956 |
| No log | 1.475 | 236 | 1.0477 | 0.4651 | 1.0477 | 1.0235 |
| No log | 1.4875 | 238 | 0.8046 | 0.6423 | 0.8046 | 0.8970 |
| No log | 1.5 | 240 | 0.7001 | 0.6008 | 0.7001 | 0.8367 |
| No log | 1.5125 | 242 | 0.6309 | 0.5182 | 0.6309 | 0.7943 |
| No log | 1.525 | 244 | 0.5883 | 0.6192 | 0.5883 | 0.7670 |
| No log | 1.5375 | 246 | 0.5853 | 0.6192 | 0.5853 | 0.7650 |
| No log | 1.55 | 248 | 0.6219 | 0.5728 | 0.6219 | 0.7886 |
| No log | 1.5625 | 250 | 0.6081 | 0.5333 | 0.6081 | 0.7798 |
| No log | 1.575 | 252 | 0.5971 | 0.5333 | 0.5971 | 0.7727 |
| No log | 1.5875 | 254 | 0.6090 | 0.5333 | 0.6090 | 0.7804 |
| No log | 1.6 | 256 | 0.6883 | 0.4615 | 0.6883 | 0.8296 |
| No log | 1.6125 | 258 | 0.6993 | 0.4615 | 0.6993 | 0.8362 |
| No log | 1.625 | 260 | 0.6631 | 0.5643 | 0.6631 | 0.8143 |
| No log | 1.6375 | 262 | 0.6579 | 0.6111 | 0.6579 | 0.8111 |
| No log | 1.65 | 264 | 0.6345 | 0.6431 | 0.6345 | 0.7965 |
| No log | 1.6625 | 266 | 0.6707 | 0.6111 | 0.6707 | 0.8190 |
| No log | 1.675 | 268 | 0.7988 | 0.4934 | 0.7988 | 0.8938 |
| No log | 1.6875 | 270 | 1.0564 | 0.4651 | 1.0564 | 1.0278 |
| No log | 1.7 | 272 | 1.2698 | 0.4052 | 1.2698 | 1.1268 |
| No log | 1.7125 | 274 | 1.1443 | 0.3883 | 1.1443 | 1.0697 |
| No log | 1.725 | 276 | 0.8199 | 0.4889 | 0.8199 | 0.9055 |
| No log | 1.7375 | 278 | 0.6890 | 0.6000 | 0.6890 | 0.8300 |
| No log | 1.75 | 280 | 0.7205 | 0.5704 | 0.7205 | 0.8488 |
| No log | 1.7625 | 282 | 0.7731 | 0.5939 | 0.7731 | 0.8793 |
| No log | 1.775 | 284 | 0.8214 | 0.5855 | 0.8214 | 0.9063 |
| No log | 1.7875 | 286 | 0.7438 | 0.5532 | 0.7438 | 0.8624 |
| No log | 1.8 | 288 | 0.5615 | 0.6573 | 0.5615 | 0.7493 |
| No log | 1.8125 | 290 | 0.5032 | 0.6957 | 0.5032 | 0.7094 |
| No log | 1.825 | 292 | 0.4986 | 0.664 | 0.4986 | 0.7061 |
| No log | 1.8375 | 294 | 0.5673 | 0.5783 | 0.5673 | 0.7532 |
| No log | 1.85 | 296 | 0.6665 | 0.5939 | 0.6665 | 0.8164 |
| No log | 1.8625 | 298 | 0.7788 | 0.5855 | 0.7788 | 0.8825 |
| No log | 1.875 | 300 | 0.7666 | 0.5855 | 0.7666 | 0.8756 |
| No log | 1.8875 | 302 | 0.6199 | 0.6316 | 0.6199 | 0.7873 |
| No log | 1.9 | 304 | 0.5390 | 0.6912 | 0.5390 | 0.7341 |
| No log | 1.9125 | 306 | 0.5148 | 0.6288 | 0.5148 | 0.7175 |
| No log | 1.925 | 308 | 0.5395 | 0.6202 | 0.5395 | 0.7345 |
| No log | 1.9375 | 310 | 0.6519 | 0.5172 | 0.6519 | 0.8074 |
| No log | 1.95 | 312 | 0.7388 | 0.5191 | 0.7388 | 0.8595 |
| No log | 1.9625 | 314 | 0.7273 | 0.5191 | 0.7273 | 0.8528 |
| No log | 1.975 | 316 | 0.6007 | 0.5172 | 0.6007 | 0.7751 |
| No log | 1.9875 | 318 | 0.5011 | 0.6957 | 0.5011 | 0.7079 |
| No log | 2.0 | 320 | 0.4700 | 0.6957 | 0.4700 | 0.6856 |
| No log | 2.0125 | 322 | 0.4799 | 0.6957 | 0.4799 | 0.6927 |
| No log | 2.025 | 324 | 0.5102 | 0.6957 | 0.5102 | 0.7143 |
| No log | 2.0375 | 326 | 0.5485 | 0.6008 | 0.5485 | 0.7406 |
| No log | 2.05 | 328 | 0.5976 | 0.6008 | 0.5976 | 0.7731 |
| No log | 2.0625 | 330 | 0.6764 | 0.5370 | 0.6764 | 0.8224 |
| No log | 2.075 | 332 | 0.8870 | 0.4615 | 0.8870 | 0.9418 |
| No log | 2.0875 | 334 | 1.0313 | 0.4790 | 1.0313 | 1.0155 |
| No log | 2.1 | 336 | 0.9564 | 0.4615 | 0.9564 | 0.9780 |
| No log | 2.1125 | 338 | 0.7485 | 0.5581 | 0.7485 | 0.8652 |
| No log | 2.125 | 340 | 0.6341 | 0.6216 | 0.6341 | 0.7963 |
| No log | 2.1375 | 342 | 0.6507 | 0.6316 | 0.6507 | 0.8067 |
| No log | 2.15 | 344 | 0.7151 | 0.5532 | 0.7151 | 0.8457 |
| No log | 2.1625 | 346 | 0.6611 | 0.6345 | 0.6611 | 0.8131 |
| No log | 2.175 | 348 | 0.5430 | 0.6557 | 0.5430 | 0.7369 |
| No log | 2.1875 | 350 | 0.4831 | 0.6192 | 0.4831 | 0.6951 |
| No log | 2.2 | 352 | 0.5081 | 0.5116 | 0.5081 | 0.7128 |
| No log | 2.2125 | 354 | 0.5098 | 0.5116 | 0.5098 | 0.7140 |
| No log | 2.225 | 356 | 0.4896 | 0.4343 | 0.4896 | 0.6997 |
| No log | 2.2375 | 358 | 0.5051 | 0.5586 | 0.5051 | 0.7107 |
| No log | 2.25 | 360 | 0.6979 | 0.5735 | 0.6979 | 0.8354 |
| No log | 2.2625 | 362 | 0.9315 | 0.4832 | 0.9315 | 0.9651 |
| No log | 2.275 | 364 | 0.8962 | 0.6135 | 0.8962 | 0.9467 |
| No log | 2.2875 | 366 | 0.7082 | 0.6124 | 0.7082 | 0.8415 |
| No log | 2.3 | 368 | 0.5988 | 0.6708 | 0.5988 | 0.7738 |
| No log | 2.3125 | 370 | 0.6167 | 0.6899 | 0.6167 | 0.7853 |
| No log | 2.325 | 372 | 0.6815 | 0.6387 | 0.6815 | 0.8255 |
| No log | 2.3375 | 374 | 0.6846 | 0.5855 | 0.6846 | 0.8274 |
| No log | 2.35 | 376 | 0.6901 | 0.5855 | 0.6901 | 0.8307 |
| No log | 2.3625 | 378 | 0.5787 | 0.6423 | 0.5787 | 0.7607 |
| No log | 2.375 | 380 | 0.4956 | 0.7131 | 0.4956 | 0.7040 |
| No log | 2.3875 | 382 | 0.4831 | 0.7131 | 0.4831 | 0.6951 |
| No log | 2.4 | 384 | 0.4732 | 0.7131 | 0.4732 | 0.6879 |
| No log | 2.4125 | 386 | 0.4955 | 0.7586 | 0.4955 | 0.7039 |
| No log | 2.425 | 388 | 0.6073 | 0.6488 | 0.6073 | 0.7793 |
| No log | 2.4375 | 390 | 0.6677 | 0.6124 | 0.6677 | 0.8172 |
| No log | 2.45 | 392 | 0.5965 | 0.6645 | 0.5965 | 0.7723 |
| No log | 2.4625 | 394 | 0.4759 | 0.7941 | 0.4759 | 0.6898 |
| No log | 2.475 | 396 | 0.4157 | 0.7879 | 0.4157 | 0.6448 |
| No log | 2.4875 | 398 | 0.3865 | 0.7879 | 0.3865 | 0.6217 |
| No log | 2.5 | 400 | 0.3675 | 0.7879 | 0.3675 | 0.6062 |
| No log | 2.5125 | 402 | 0.3629 | 0.7510 | 0.3629 | 0.6024 |
| No log | 2.525 | 404 | 0.3758 | 0.7510 | 0.3758 | 0.6131 |
| No log | 2.5375 | 406 | 0.3942 | 0.7034 | 0.3942 | 0.6278 |
| No log | 2.55 | 408 | 0.4059 | 0.7510 | 0.4059 | 0.6371 |
| No log | 2.5625 | 410 | 0.4303 | 0.7709 | 0.4303 | 0.6560 |
| No log | 2.575 | 412 | 0.4311 | 0.7181 | 0.4311 | 0.6565 |
| No log | 2.5875 | 414 | 0.4359 | 0.7181 | 0.4359 | 0.6602 |
| No log | 2.6 | 416 | 0.4718 | 0.7658 | 0.4718 | 0.6869 |
| No log | 2.6125 | 418 | 0.5650 | 0.7354 | 0.5650 | 0.7517 |
| No log | 2.625 | 420 | 0.5710 | 0.75 | 0.5710 | 0.7556 |
| No log | 2.6375 | 422 | 0.4787 | 0.7658 | 0.4787 | 0.6918 |
| No log | 2.65 | 424 | 0.4626 | 0.7658 | 0.4626 | 0.6801 |
| No log | 2.6625 | 426 | 0.4909 | 0.7658 | 0.4909 | 0.7006 |
| No log | 2.675 | 428 | 0.4850 | 0.7829 | 0.4850 | 0.6964 |
| No log | 2.6875 | 430 | 0.5175 | 0.6715 | 0.5175 | 0.7194 |
| No log | 2.7 | 432 | 0.5640 | 0.6316 | 0.5640 | 0.7510 |
| No log | 2.7125 | 434 | 0.5679 | 0.6525 | 0.5679 | 0.7536 |
| No log | 2.725 | 436 | 0.4802 | 0.6667 | 0.4802 | 0.6930 |
| No log | 2.7375 | 438 | 0.4399 | 0.7586 | 0.4399 | 0.6633 |
| No log | 2.75 | 440 | 0.4108 | 0.7348 | 0.4108 | 0.6409 |
| No log | 2.7625 | 442 | 0.3953 | 0.7535 | 0.3953 | 0.6287 |
| No log | 2.775 | 444 | 0.3955 | 0.7640 | 0.3955 | 0.6289 |
| No log | 2.7875 | 446 | 0.4249 | 0.7941 | 0.4249 | 0.6519 |
| No log | 2.8 | 448 | 0.5409 | 0.6525 | 0.5409 | 0.7355 |
| No log | 2.8125 | 450 | 0.6264 | 0.6138 | 0.6264 | 0.7915 |
| No log | 2.825 | 452 | 0.5986 | 0.6138 | 0.5986 | 0.7737 |
| No log | 2.8375 | 454 | 0.5677 | 0.6237 | 0.5677 | 0.7534 |
| No log | 2.85 | 456 | 0.4177 | 0.6980 | 0.4177 | 0.6463 |
| No log | 2.8625 | 458 | 0.3607 | 0.72 | 0.3607 | 0.6006 |
| No log | 2.875 | 460 | 0.3528 | 0.7640 | 0.3528 | 0.5940 |
| No log | 2.8875 | 462 | 0.3578 | 0.8444 | 0.3578 | 0.5982 |
| No log | 2.9 | 464 | 0.3653 | 0.7709 | 0.3653 | 0.6044 |
| No log | 2.9125 | 466 | 0.4565 | 0.6597 | 0.4565 | 0.6757 |
| No log | 2.925 | 468 | 0.5803 | 0.5855 | 0.5803 | 0.7618 |
| No log | 2.9375 | 470 | 0.6945 | 0.6135 | 0.6945 | 0.8334 |
| No log | 2.95 | 472 | 0.6414 | 0.6047 | 0.6414 | 0.8009 |
| No log | 2.9625 | 474 | 0.5027 | 0.6899 | 0.5027 | 0.7090 |
| No log | 2.975 | 476 | 0.4590 | 0.7368 | 0.4590 | 0.6775 |
| No log | 2.9875 | 478 | 0.4816 | 0.7072 | 0.4816 | 0.6940 |
| No log | 3.0 | 480 | 0.4823 | 0.7308 | 0.4823 | 0.6945 |
| No log | 3.0125 | 482 | 0.4547 | 0.7072 | 0.4547 | 0.6743 |
| No log | 3.025 | 484 | 0.3562 | 0.72 | 0.3562 | 0.5969 |
| No log | 3.0375 | 486 | 0.3101 | 0.7812 | 0.3101 | 0.5569 |
| No log | 3.05 | 488 | 0.3144 | 0.7812 | 0.3144 | 0.5607 |
| No log | 3.0625 | 490 | 0.3597 | 0.72 | 0.3597 | 0.5997 |
| No log | 3.075 | 492 | 0.4648 | 0.6980 | 0.4648 | 0.6818 |
| No log | 3.0875 | 494 | 0.6073 | 0.6124 | 0.6073 | 0.7793 |
| No log | 3.1 | 496 | 0.6429 | 0.6124 | 0.6429 | 0.8018 |
| No log | 3.1125 | 498 | 0.5731 | 0.6020 | 0.5731 | 0.7570 |
| 0.5909 | 3.125 | 500 | 0.4593 | 0.7287 | 0.4593 | 0.6777 |
| 0.5909 | 3.1375 | 502 | 0.4184 | 0.7287 | 0.4184 | 0.6468 |
| 0.5909 | 3.15 | 504 | 0.4439 | 0.7287 | 0.4439 | 0.6663 |
| 0.5909 | 3.1625 | 506 | 0.5349 | 0.6209 | 0.5349 | 0.7313 |
| 0.5909 | 3.175 | 508 | 0.5958 | 0.5939 | 0.5958 | 0.7719 |
| 0.5909 | 3.1875 | 510 | 0.5342 | 0.6980 | 0.5342 | 0.7309 |
| 0.5909 | 3.2 | 512 | 0.4407 | 0.6883 | 0.4407 | 0.6639 |
| 0.5909 | 3.2125 | 514 | 0.4265 | 0.6883 | 0.4265 | 0.6531 |
| 0.5909 | 3.225 | 516 | 0.4491 | 0.6883 | 0.4491 | 0.6701 |
| 0.5909 | 3.2375 | 518 | 0.5484 | 0.6097 | 0.5484 | 0.7405 |
| 0.5909 | 3.25 | 520 | 0.7287 | 0.5855 | 0.7287 | 0.8536 |
| 0.5909 | 3.2625 | 522 | 0.8659 | 0.6135 | 0.8659 | 0.9305 |
| 0.5909 | 3.275 | 524 | 0.9872 | 0.5575 | 0.9872 | 0.9936 |
| 0.5909 | 3.2875 | 526 | 0.8995 | 0.5882 | 0.8995 | 0.9484 |
| 0.5909 | 3.3 | 528 | 0.7036 | 0.6198 | 0.7036 | 0.8388 |
| 0.5909 | 3.3125 | 530 | 0.5796 | 0.6198 | 0.5796 | 0.7613 |
| 0.5909 | 3.325 | 532 | 0.4918 | 0.6667 | 0.4918 | 0.7013 |
| 0.5909 | 3.3375 | 534 | 0.5072 | 0.6291 | 0.5072 | 0.7122 |
| 0.5909 | 3.35 | 536 | 0.4799 | 0.6667 | 0.4799 | 0.6928 |
| 0.5909 | 3.3625 | 538 | 0.4573 | 0.6290 | 0.4573 | 0.6762 |
| 0.5909 | 3.375 | 540 | 0.4760 | 0.6000 | 0.4760 | 0.6899 |
| 0.5909 | 3.3875 | 542 | 0.4788 | 0.6000 | 0.4788 | 0.6920 |
| 0.5909 | 3.4 | 544 | 0.4331 | 0.72 | 0.4331 | 0.6581 |
| 0.5909 | 3.4125 | 546 | 0.4247 | 0.72 | 0.4247 | 0.6517 |
| 0.5909 | 3.425 | 548 | 0.4549 | 0.7063 | 0.4549 | 0.6745 |
| 0.5909 | 3.4375 | 550 | 0.5120 | 0.6124 | 0.5120 | 0.7155 |
| 0.5909 | 3.45 | 552 | 0.4765 | 0.6316 | 0.4765 | 0.6903 |
| 0.5909 | 3.4625 | 554 | 0.4910 | 0.6316 | 0.4910 | 0.7007 |
| 0.5909 | 3.475 | 556 | 0.5412 | 0.5855 | 0.5412 | 0.7356 |
| 0.5909 | 3.4875 | 558 | 0.5534 | 0.6047 | 0.5534 | 0.7439 |
| 0.5909 | 3.5 | 560 | 0.5028 | 0.6216 | 0.5028 | 0.7091 |
| 0.5909 | 3.5125 | 562 | 0.4860 | 0.6316 | 0.4860 | 0.6972 |
| 0.5909 | 3.525 | 564 | 0.5274 | 0.6543 | 0.5274 | 0.7262 |
| 0.5909 | 3.5375 | 566 | 0.5246 | 0.6543 | 0.5246 | 0.7243 |
| 0.5909 | 3.55 | 568 | 0.5200 | 0.6543 | 0.5200 | 0.7211 |
| 0.5909 | 3.5625 | 570 | 0.5589 | 0.6543 | 0.5589 | 0.7476 |
| 0.5909 | 3.575 | 572 | 0.5419 | 0.6198 | 0.5419 | 0.7361 |
| 0.5909 | 3.5875 | 574 | 0.4969 | 0.5825 | 0.4969 | 0.7049 |
| 0.5909 | 3.6 | 576 | 0.4604 | 0.6842 | 0.4604 | 0.6785 |
| 0.5909 | 3.6125 | 578 | 0.4208 | 0.7482 | 0.4208 | 0.6487 |
| 0.5909 | 3.625 | 580 | 0.4211 | 0.7482 | 0.4211 | 0.6489 |
| 0.5909 | 3.6375 | 582 | 0.4555 | 0.7336 | 0.4555 | 0.6749 |
| 0.5909 | 3.65 | 584 | 0.5643 | 0.6124 | 0.5643 | 0.7512 |
| 0.5909 | 3.6625 | 586 | 0.6213 | 0.6124 | 0.6213 | 0.7882 |
| 0.5909 | 3.675 | 588 | 0.6387 | 0.5855 | 0.6387 | 0.7992 |
| 0.5909 | 3.6875 | 590 | 0.5581 | 0.5825 | 0.5581 | 0.7471 |
| 0.5909 | 3.7 | 592 | 0.4484 | 0.6290 | 0.4484 | 0.6696 |
| 0.5909 | 3.7125 | 594 | 0.3895 | 0.8165 | 0.3895 | 0.6241 |
| 0.5909 | 3.725 | 596 | 0.3942 | 0.8444 | 0.3942 | 0.6278 |
| 0.5909 | 3.7375 | 598 | 0.4036 | 0.8444 | 0.4036 | 0.6353 |
| 0.5909 | 3.75 | 600 | 0.4322 | 0.7348 | 0.4322 | 0.6574 |
| 0.5909 | 3.7625 | 602 | 0.5056 | 0.5912 | 0.5056 | 0.7111 |
| 0.5909 | 3.775 | 604 | 0.5679 | 0.5825 | 0.5679 | 0.7536 |
| 0.5909 | 3.7875 | 606 | 0.5853 | 0.5939 | 0.5853 | 0.7651 |
| 0.5909 | 3.8 | 608 | 0.5326 | 0.5912 | 0.5326 | 0.7298 |
| 0.5909 | 3.8125 | 610 | 0.4881 | 0.6465 | 0.4881 | 0.6986 |
| 0.5909 | 3.825 | 612 | 0.4904 | 0.6465 | 0.4904 | 0.7003 |
| 0.5909 | 3.8375 | 614 | 0.5059 | 0.6465 | 0.5059 | 0.7113 |
| 0.5909 | 3.85 | 616 | 0.5333 | 0.6387 | 0.5333 | 0.7302 |
| 0.5909 | 3.8625 | 618 | 0.5425 | 0.5828 | 0.5425 | 0.7365 |
| 0.5909 | 3.875 | 620 | 0.4948 | 0.6000 | 0.4948 | 0.7034 |
| 0.5909 | 3.8875 | 622 | 0.4626 | 0.72 | 0.4626 | 0.6801 |
| 0.5909 | 3.9 | 624 | 0.4610 | 0.72 | 0.4610 | 0.6789 |
| 0.5909 | 3.9125 | 626 | 0.5292 | 0.6216 | 0.5292 | 0.7275 |
| 0.5909 | 3.925 | 628 | 0.5659 | 0.6124 | 0.5659 | 0.7523 |
| 0.5909 | 3.9375 | 630 | 0.5282 | 0.6216 | 0.5282 | 0.7268 |
| 0.5909 | 3.95 | 632 | 0.4593 | 0.6912 | 0.4593 | 0.6777 |
| 0.5909 | 3.9625 | 634 | 0.3994 | 0.7686 | 0.3994 | 0.6320 |
| 0.5909 | 3.975 | 636 | 0.3903 | 0.8108 | 0.3903 | 0.6247 |
| 0.5909 | 3.9875 | 638 | 0.3973 | 0.8108 | 0.3973 | 0.6303 |
| 0.5909 | 4.0 | 640 | 0.3906 | 0.8108 | 0.3906 | 0.6250 |
| 0.5909 | 4.0125 | 642 | 0.3913 | 0.8108 | 0.3913 | 0.6256 |
| 0.5909 | 4.025 | 644 | 0.4532 | 0.7426 | 0.4532 | 0.6732 |
| 0.5909 | 4.0375 | 646 | 0.5438 | 0.6205 | 0.5438 | 0.7374 |
| 0.5909 | 4.05 | 648 | 0.6091 | 0.5957 | 0.6091 | 0.7805 |
| 0.5909 | 4.0625 | 650 | 0.6260 | 0.5957 | 0.6260 | 0.7912 |
| 0.5909 | 4.075 | 652 | 0.5663 | 0.6205 | 0.5663 | 0.7525 |
| 0.5909 | 4.0875 | 654 | 0.5080 | 0.5704 | 0.5080 | 0.7128 |
| 0.5909 | 4.1 | 656 | 0.4810 | 0.6316 | 0.4810 | 0.6935 |
| 0.5909 | 4.1125 | 658 | 0.5058 | 0.5704 | 0.5058 | 0.7112 |
| 0.5909 | 4.125 | 660 | 0.5326 | 0.6316 | 0.5326 | 0.7298 |
| 0.5909 | 4.1375 | 662 | 0.4884 | 0.6540 | 0.4884 | 0.6989 |
| 0.5909 | 4.15 | 664 | 0.4464 | 0.6667 | 0.4464 | 0.6681 |
| 0.5909 | 4.1625 | 666 | 0.4743 | 0.6667 | 0.4743 | 0.6887 |
| 0.5909 | 4.175 | 668 | 0.5014 | 0.6667 | 0.5014 | 0.7081 |
| 0.5909 | 4.1875 | 670 | 0.5454 | 0.5912 | 0.5454 | 0.7385 |
| 0.5909 | 4.2 | 672 | 0.5383 | 0.6125 | 0.5383 | 0.7337 |
| 0.5909 | 4.2125 | 674 | 0.5192 | 0.6008 | 0.5192 | 0.7206 |
| 0.5909 | 4.225 | 676 | 0.4334 | 0.6431 | 0.4334 | 0.6584 |
| 0.5909 | 4.2375 | 678 | 0.3880 | 0.7050 | 0.3880 | 0.6229 |
| 0.5909 | 4.25 | 680 | 0.3543 | 0.8165 | 0.3543 | 0.5952 |
| 0.5909 | 4.2625 | 682 | 0.3572 | 0.8165 | 0.3572 | 0.5977 |
| 0.5909 | 4.275 | 684 | 0.3905 | 0.6557 | 0.3905 | 0.6249 |
| 0.5909 | 4.2875 | 686 | 0.4909 | 0.6431 | 0.4909 | 0.7007 |
| 0.5909 | 4.3 | 688 | 0.6893 | 0.5366 | 0.6893 | 0.8303 |
| 0.5909 | 4.3125 | 690 | 0.8751 | 0.5882 | 0.8751 | 0.9355 |
| 0.5909 | 4.325 | 692 | 0.9336 | 0.6059 | 0.9336 | 0.9663 |
| 0.5909 | 4.3375 | 694 | 0.8236 | 0.5904 | 0.8236 | 0.9075 |
| 0.5909 | 4.35 | 696 | 0.7144 | 0.5812 | 0.7144 | 0.8452 |
| 0.5909 | 4.3625 | 698 | 0.5879 | 0.6667 | 0.5879 | 0.7667 |
| 0.5909 | 4.375 | 700 | 0.5569 | 0.6465 | 0.5569 | 0.7462 |
| 0.5909 | 4.3875 | 702 | 0.5400 | 0.6573 | 0.5400 | 0.7348 |
| 0.5909 | 4.4 | 704 | 0.5768 | 0.6465 | 0.5768 | 0.7594 |
| 0.5909 | 4.4125 | 706 | 0.6111 | 0.6198 | 0.6111 | 0.7817 |
| 0.5909 | 4.425 | 708 | 0.5770 | 0.5911 | 0.5770 | 0.7596 |
| 0.5909 | 4.4375 | 710 | 0.5099 | 0.6097 | 0.5099 | 0.7141 |
| 0.5909 | 4.45 | 712 | 0.4513 | 0.7050 | 0.4513 | 0.6718 |
| 0.5909 | 4.4625 | 714 | 0.4230 | 0.7482 | 0.4230 | 0.6504 |
| 0.5909 | 4.475 | 716 | 0.4481 | 0.7050 | 0.4481 | 0.6694 |
| 0.5909 | 4.4875 | 718 | 0.5468 | 0.6198 | 0.5468 | 0.7395 |
| 0.5909 | 4.5 | 720 | 0.6100 | 0.6124 | 0.6100 | 0.7811 |
| 0.5909 | 4.5125 | 722 | 0.5717 | 0.6124 | 0.5717 | 0.7561 |
| 0.5909 | 4.525 | 724 | 0.4681 | 0.6456 | 0.4681 | 0.6842 |
| 0.5909 | 4.5375 | 726 | 0.4195 | 0.7771 | 0.4195 | 0.6477 |
| 0.5909 | 4.55 | 728 | 0.3824 | 0.8562 | 0.3824 | 0.6184 |
| 0.5909 | 4.5625 | 730 | 0.3793 | 0.8562 | 0.3793 | 0.6159 |
| 0.5909 | 4.575 | 732 | 0.3821 | 0.8383 | 0.3821 | 0.6181 |
| 0.5909 | 4.5875 | 734 | 0.3553 | 0.8562 | 0.3553 | 0.5961 |
| 0.5909 | 4.6 | 736 | 0.3280 | 0.8754 | 0.3280 | 0.5727 |
| 0.5909 | 4.6125 | 738 | 0.3222 | 0.8444 | 0.3222 | 0.5677 |
| 0.5909 | 4.625 | 740 | 0.3433 | 0.8063 | 0.3433 | 0.5859 |
| 0.5909 | 4.6375 | 742 | 0.3990 | 0.7050 | 0.3990 | 0.6316 |
| 0.5909 | 4.65 | 744 | 0.4288 | 0.6500 | 0.4288 | 0.6548 |
| 0.5909 | 4.6625 | 746 | 0.4067 | 0.6744 | 0.4067 | 0.6377 |
| 0.5909 | 4.675 | 748 | 0.4102 | 0.6744 | 0.4102 | 0.6405 |
| 0.5909 | 4.6875 | 750 | 0.4120 | 0.6744 | 0.4120 | 0.6419 |
| 0.5909 | 4.7 | 752 | 0.4832 | 0.5704 | 0.4832 | 0.6951 |
| 0.5909 | 4.7125 | 754 | 0.5278 | 0.5939 | 0.5278 | 0.7265 |
| 0.5909 | 4.725 | 756 | 0.4917 | 0.6392 | 0.4917 | 0.7012 |
| 0.5909 | 4.7375 | 758 | 0.4918 | 0.6392 | 0.4918 | 0.7013 |
| 0.5909 | 4.75 | 760 | 0.5831 | 0.5743 | 0.5831 | 0.7636 |
| 0.5909 | 4.7625 | 762 | 0.7452 | 0.6277 | 0.7452 | 0.8632 |
| 0.5909 | 4.775 | 764 | 0.8550 | 0.5985 | 0.8550 | 0.9247 |
| 0.5909 | 4.7875 | 766 | 0.8142 | 0.5985 | 0.8142 | 0.9023 |
| 0.5909 | 4.8 | 768 | 0.6552 | 0.5855 | 0.6552 | 0.8094 |
| 0.5909 | 4.8125 | 770 | 0.5470 | 0.5935 | 0.5470 | 0.7396 |
| 0.5909 | 4.825 | 772 | 0.4951 | 0.5911 | 0.4951 | 0.7036 |
| 0.5909 | 4.8375 | 774 | 0.4960 | 0.6291 | 0.4960 | 0.7043 |
| 0.5909 | 4.85 | 776 | 0.5512 | 0.5935 | 0.5512 | 0.7424 |
| 0.5909 | 4.8625 | 778 | 0.6284 | 0.5855 | 0.6284 | 0.7927 |
| 0.5909 | 4.875 | 780 | 0.6456 | 0.5855 | 0.6456 | 0.8035 |
| 0.5909 | 4.8875 | 782 | 0.6479 | 0.5855 | 0.6479 | 0.8049 |
| 0.5909 | 4.9 | 784 | 0.6093 | 0.5935 | 0.6093 | 0.7806 |
| 0.5909 | 4.9125 | 786 | 0.5565 | 0.5935 | 0.5565 | 0.7460 |
| 0.5909 | 4.925 | 788 | 0.5171 | 0.6291 | 0.5171 | 0.7191 |
| 0.5909 | 4.9375 | 790 | 0.5244 | 0.7016 | 0.5244 | 0.7242 |
| 0.5909 | 4.95 | 792 | 0.5706 | 0.6198 | 0.5706 | 0.7554 |
| 0.5909 | 4.9625 | 794 | 0.6163 | 0.6030 | 0.6163 | 0.7850 |
| 0.5909 | 4.975 | 796 | 0.6371 | 0.6030 | 0.6371 | 0.7982 |
| 0.5909 | 4.9875 | 798 | 0.6062 | 0.5935 | 0.6062 | 0.7786 |
| 0.5909 | 5.0 | 800 | 0.5803 | 0.6899 | 0.5803 | 0.7618 |
| 0.5909 | 5.0125 | 802 | 0.5161 | 0.6899 | 0.5161 | 0.7184 |
| 0.5909 | 5.025 | 804 | 0.4774 | 0.6784 | 0.4774 | 0.6909 |
| 0.5909 | 5.0375 | 806 | 0.4768 | 0.6784 | 0.4768 | 0.6905 |
| 0.5909 | 5.05 | 808 | 0.5467 | 0.6899 | 0.5467 | 0.7394 |
| 0.5909 | 5.0625 | 810 | 0.6869 | 0.5957 | 0.6869 | 0.8288 |
| 0.5909 | 5.075 | 812 | 0.7524 | 0.6121 | 0.7524 | 0.8674 |
| 0.5909 | 5.0875 | 814 | 0.7477 | 0.6121 | 0.7477 | 0.8647 |
| 0.5909 | 5.1 | 816 | 0.7099 | 0.5957 | 0.7099 | 0.8425 |
| 0.5909 | 5.1125 | 818 | 0.5989 | 0.5855 | 0.5989 | 0.7739 |
| 0.5909 | 5.125 | 820 | 0.4987 | 0.6899 | 0.4987 | 0.7062 |
| 0.5909 | 5.1375 | 822 | 0.4780 | 0.7143 | 0.4780 | 0.6914 |
| 0.5909 | 5.15 | 824 | 0.4938 | 0.6755 | 0.4938 | 0.7027 |
| 0.5909 | 5.1625 | 826 | 0.5079 | 0.6645 | 0.5079 | 0.7127 |
| 0.5909 | 5.175 | 828 | 0.4896 | 0.7016 | 0.4896 | 0.6997 |
| 0.5909 | 5.1875 | 830 | 0.5092 | 0.6645 | 0.5092 | 0.7136 |
| 0.5909 | 5.2 | 832 | 0.5897 | 0.6047 | 0.5897 | 0.7679 |
| 0.5909 | 5.2125 | 834 | 0.6377 | 0.6047 | 0.6377 | 0.7985 |
| 0.5909 | 5.225 | 836 | 0.6414 | 0.6047 | 0.6414 | 0.8009 |
| 0.5909 | 5.2375 | 838 | 0.5937 | 0.6047 | 0.5937 | 0.7705 |
| 0.5909 | 5.25 | 840 | 0.5073 | 0.6020 | 0.5073 | 0.7123 |
| 0.5909 | 5.2625 | 842 | 0.4423 | 0.6500 | 0.4423 | 0.6650 |
| 0.5909 | 5.275 | 844 | 0.4054 | 0.776 | 0.4054 | 0.6367 |
| 0.5909 | 5.2875 | 846 | 0.3979 | 0.776 | 0.3979 | 0.6308 |
| 0.5909 | 5.3 | 848 | 0.4064 | 0.776 | 0.4064 | 0.6375 |
| 0.5909 | 5.3125 | 850 | 0.4525 | 0.6392 | 0.4525 | 0.6726 |
| 0.5909 | 5.325 | 852 | 0.5557 | 0.6198 | 0.5557 | 0.7455 |
| 0.5909 | 5.3375 | 854 | 0.7260 | 0.6135 | 0.7260 | 0.8521 |
| 0.5909 | 5.35 | 856 | 0.9119 | 0.5985 | 0.9119 | 0.9550 |
| 0.5909 | 5.3625 | 858 | 0.9151 | 0.5985 | 0.9151 | 0.9566 |
| 0.5909 | 5.375 | 860 | 0.7949 | 0.6211 | 0.7949 | 0.8916 |
| 0.5909 | 5.3875 | 862 | 0.6216 | 0.6124 | 0.6216 | 0.7884 |
| 0.5909 | 5.4 | 864 | 0.4665 | 0.6392 | 0.4665 | 0.6830 |
| 0.5909 | 5.4125 | 866 | 0.3913 | 0.7348 | 0.3913 | 0.6255 |
| 0.5909 | 5.425 | 868 | 0.3892 | 0.7348 | 0.3892 | 0.6239 |
| 0.5909 | 5.4375 | 870 | 0.4362 | 0.72 | 0.4362 | 0.6605 |
| 0.5909 | 5.45 | 872 | 0.5581 | 0.6392 | 0.5581 | 0.7471 |
| 0.5909 | 5.4625 | 874 | 0.6608 | 0.5855 | 0.6608 | 0.8129 |
| 0.5909 | 5.475 | 876 | 0.6612 | 0.5855 | 0.6612 | 0.8131 |
| 0.5909 | 5.4875 | 878 | 0.5824 | 0.6291 | 0.5824 | 0.7632 |
| 0.5909 | 5.5 | 880 | 0.4909 | 0.7 | 0.4909 | 0.7007 |
| 0.5909 | 5.5125 | 882 | 0.4491 | 0.6744 | 0.4491 | 0.6701 |
| 0.5909 | 5.525 | 884 | 0.4596 | 0.6744 | 0.4596 | 0.6779 |
| 0.5909 | 5.5375 | 886 | 0.4847 | 0.6744 | 0.4847 | 0.6962 |
| 0.5909 | 5.55 | 888 | 0.5743 | 0.6216 | 0.5743 | 0.7578 |
| 0.5909 | 5.5625 | 890 | 0.6827 | 0.5855 | 0.6827 | 0.8263 |
| 0.5909 | 5.575 | 892 | 0.7734 | 0.5812 | 0.7734 | 0.8794 |
| 0.5909 | 5.5875 | 894 | 0.7928 | 0.5812 | 0.7928 | 0.8904 |
| 0.5909 | 5.6 | 896 | 0.7716 | 0.5752 | 0.7716 | 0.8784 |
| 0.5909 | 5.6125 | 898 | 0.7211 | 0.5722 | 0.7211 | 0.8492 |
| 0.5909 | 5.625 | 900 | 0.6228 | 0.6606 | 0.6228 | 0.7892 |
| 0.5909 | 5.6375 | 902 | 0.5821 | 0.6606 | 0.5821 | 0.7630 |
| 0.5909 | 5.65 | 904 | 0.5499 | 0.6708 | 0.5499 | 0.7415 |
| 0.5909 | 5.6625 | 906 | 0.5399 | 0.6847 | 0.5399 | 0.7347 |
| 0.5909 | 5.675 | 908 | 0.6089 | 0.6606 | 0.6089 | 0.7803 |
| 0.5909 | 5.6875 | 910 | 0.6659 | 0.5882 | 0.6659 | 0.8160 |
| 0.5909 | 5.7 | 912 | 0.7202 | 0.5882 | 0.7202 | 0.8486 |
| 0.5909 | 5.7125 | 914 | 0.7908 | 0.5650 | 0.7908 | 0.8893 |
| 0.5909 | 5.725 | 916 | 0.7927 | 0.5812 | 0.7927 | 0.8903 |
| 0.5909 | 5.7375 | 918 | 0.6840 | 0.5361 | 0.6840 | 0.8270 |
| 0.5909 | 5.75 | 920 | 0.5813 | 0.6198 | 0.5813 | 0.7624 |
| 0.5909 | 5.7625 | 922 | 0.5100 | 0.6557 | 0.5100 | 0.7142 |
| 0.5909 | 5.775 | 924 | 0.4737 | 0.7143 | 0.4737 | 0.6882 |
| 0.5909 | 5.7875 | 926 | 0.4534 | 0.6784 | 0.4534 | 0.6733 |
| 0.5909 | 5.8 | 928 | 0.4514 | 0.6912 | 0.4514 | 0.6719 |
| 0.5909 | 5.8125 | 930 | 0.4920 | 0.6500 | 0.4920 | 0.7014 |
| 0.5909 | 5.825 | 932 | 0.5612 | 0.6291 | 0.5612 | 0.7491 |
| 0.5909 | 5.8375 | 934 | 0.5988 | 0.5935 | 0.5988 | 0.7738 |
| 0.5909 | 5.85 | 936 | 0.6054 | 0.6198 | 0.6054 | 0.7781 |
| 0.5909 | 5.8625 | 938 | 0.6213 | 0.6361 | 0.6213 | 0.7882 |
| 0.5909 | 5.875 | 940 | 0.6369 | 0.6543 | 0.6369 | 0.7980 |
| 0.5909 | 5.8875 | 942 | 0.6000 | 0.6543 | 0.6000 | 0.7746 |
| 0.5909 | 5.9 | 944 | 0.5523 | 0.6708 | 0.5523 | 0.7432 |
| 0.5909 | 5.9125 | 946 | 0.5722 | 0.6645 | 0.5722 | 0.7564 |
| 0.5909 | 5.925 | 948 | 0.6114 | 0.6543 | 0.6114 | 0.7819 |
| 0.5909 | 5.9375 | 950 | 0.7005 | 0.6729 | 0.7005 | 0.8369 |
| 0.5909 | 5.95 | 952 | 0.8359 | 0.5575 | 0.8359 | 0.9143 |
| 0.5909 | 5.9625 | 954 | 0.9170 | 0.5575 | 0.9170 | 0.9576 |
| 0.5909 | 5.975 | 956 | 0.9060 | 0.5575 | 0.9060 | 0.9518 |
| 0.5909 | 5.9875 | 958 | 0.8119 | 0.5706 | 0.8119 | 0.9011 |
| 0.5909 | 6.0 | 960 | 0.7089 | 0.6047 | 0.7089 | 0.8419 |
| 0.5909 | 6.0125 | 962 | 0.6411 | 0.5668 | 0.6411 | 0.8007 |
| 0.5909 | 6.025 | 964 | 0.6465 | 0.5668 | 0.6465 | 0.8040 |
| 0.5909 | 6.0375 | 966 | 0.6927 | 0.6047 | 0.6927 | 0.8323 |
| 0.5909 | 6.05 | 968 | 0.7396 | 0.5706 | 0.7396 | 0.8600 |
| 0.5909 | 6.0625 | 970 | 0.6969 | 0.5706 | 0.6969 | 0.8348 |
| 0.5909 | 6.075 | 972 | 0.6853 | 0.5706 | 0.6853 | 0.8279 |
| 0.5909 | 6.0875 | 974 | 0.7343 | 0.5706 | 0.7343 | 0.8569 |
| 0.5909 | 6.1 | 976 | 0.7449 | 0.5812 | 0.7449 | 0.8631 |
| 0.5909 | 6.1125 | 978 | 0.7029 | 0.5706 | 0.7029 | 0.8384 |
| 0.5909 | 6.125 | 980 | 0.6147 | 0.6047 | 0.6147 | 0.7840 |
| 0.5909 | 6.1375 | 982 | 0.5930 | 0.6047 | 0.5930 | 0.7701 |
| 0.5909 | 6.15 | 984 | 0.5954 | 0.6047 | 0.5954 | 0.7716 |
| 0.5909 | 6.1625 | 986 | 0.5605 | 0.6138 | 0.5605 | 0.7487 |
| 0.5909 | 6.175 | 988 | 0.4853 | 0.6423 | 0.4853 | 0.6966 |
| 0.5909 | 6.1875 | 990 | 0.4557 | 0.6008 | 0.4557 | 0.6751 |
| 0.5909 | 6.2 | 992 | 0.4213 | 0.6557 | 0.4213 | 0.6491 |
| 0.5909 | 6.2125 | 994 | 0.4229 | 0.6557 | 0.4229 | 0.6503 |
| 0.5909 | 6.225 | 996 | 0.4293 | 0.6557 | 0.4293 | 0.6552 |
| 0.5909 | 6.2375 | 998 | 0.4597 | 0.6316 | 0.4597 | 0.6780 |
| 0.1257 | 6.25 | 1000 | 0.5029 | 0.6934 | 0.5029 | 0.7092 |
| 0.1257 | 6.2625 | 1002 | 0.4987 | 0.6316 | 0.4987 | 0.7062 |
| 0.1257 | 6.275 | 1004 | 0.5289 | 0.6216 | 0.5289 | 0.7272 |
| 0.1257 | 6.2875 | 1006 | 0.6038 | 0.6124 | 0.6038 | 0.7771 |
| 0.1257 | 6.3 | 1008 | 0.6866 | 0.5812 | 0.6866 | 0.8286 |
| 0.1257 | 6.3125 | 1010 | 0.6893 | 0.5812 | 0.6893 | 0.8303 |
| 0.1257 | 6.325 | 1012 | 0.6221 | 0.6047 | 0.6221 | 0.7887 |
| 0.1257 | 6.3375 | 1014 | 0.5569 | 0.6020 | 0.5569 | 0.7462 |
| 0.1257 | 6.35 | 1016 | 0.4981 | 0.7 | 0.4981 | 0.7058 |
| 0.1257 | 6.3625 | 1018 | 0.4513 | 0.7709 | 0.4513 | 0.6718 |
| 0.1257 | 6.375 | 1020 | 0.4471 | 0.7709 | 0.4471 | 0.6686 |
| 0.1257 | 6.3875 | 1022 | 0.4594 | 0.7709 | 0.4594 | 0.6778 |
| 0.1257 | 6.4 | 1024 | 0.4597 | 0.7709 | 0.4597 | 0.6780 |
| 0.1257 | 6.4125 | 1026 | 0.5030 | 0.6392 | 0.5030 | 0.7092 |
| 0.1257 | 6.425 | 1028 | 0.5833 | 0.5857 | 0.5833 | 0.7637 |
| 0.1257 | 6.4375 | 1030 | 0.6648 | 0.5650 | 0.6648 | 0.8154 |
| 0.1257 | 6.45 | 1032 | 0.6779 | 0.5650 | 0.6779 | 0.8233 |
| 0.1257 | 6.4625 | 1034 | 0.6872 | 0.5650 | 0.6872 | 0.8290 |
| 0.1257 | 6.475 | 1036 | 0.6147 | 0.5490 | 0.6147 | 0.7840 |
| 0.1257 | 6.4875 | 1038 | 0.5297 | 0.6291 | 0.5297 | 0.7278 |
| 0.1257 | 6.5 | 1040 | 0.5038 | 0.7273 | 0.5038 | 0.7098 |
| 0.1257 | 6.5125 | 1042 | 0.5061 | 0.7016 | 0.5061 | 0.7114 |
| 0.1257 | 6.525 | 1044 | 0.5070 | 0.6291 | 0.5070 | 0.7121 |
| 0.1257 | 6.5375 | 1046 | 0.5400 | 0.6291 | 0.5400 | 0.7348 |
| 0.1257 | 6.55 | 1048 | 0.5378 | 0.6291 | 0.5378 | 0.7334 |
| 0.1257 | 6.5625 | 1050 | 0.4979 | 0.6873 | 0.4979 | 0.7056 |
| 0.1257 | 6.575 | 1052 | 0.4430 | 0.8042 | 0.4430 | 0.6656 |
| 0.1257 | 6.5875 | 1054 | 0.4167 | 0.8042 | 0.4167 | 0.6455 |
| 0.1257 | 6.6 | 1056 | 0.4046 | 0.8042 | 0.4046 | 0.6361 |
| 0.1257 | 6.6125 | 1058 | 0.4189 | 0.8042 | 0.4189 | 0.6472 |
| 0.1257 | 6.625 | 1060 | 0.4594 | 0.7279 | 0.4594 | 0.6778 |
| 0.1257 | 6.6375 | 1062 | 0.5270 | 0.6020 | 0.5270 | 0.7260 |
| 0.1257 | 6.65 | 1064 | 0.5523 | 0.5743 | 0.5523 | 0.7431 |
| 0.1257 | 6.6625 | 1066 | 0.5668 | 0.6047 | 0.5668 | 0.7529 |
| 0.1257 | 6.675 | 1068 | 0.5321 | 0.6316 | 0.5321 | 0.7294 |
| 0.1257 | 6.6875 | 1070 | 0.4799 | 0.6597 | 0.4799 | 0.6928 |
| 0.1257 | 6.7 | 1072 | 0.4572 | 0.6873 | 0.4572 | 0.6761 |
| 0.1257 | 6.7125 | 1074 | 0.4345 | 0.7279 | 0.4345 | 0.6592 |
| 0.1257 | 6.725 | 1076 | 0.4101 | 0.8409 | 0.4101 | 0.6404 |
| 0.1257 | 6.7375 | 1078 | 0.4134 | 0.8409 | 0.4134 | 0.6430 |
| 0.1257 | 6.75 | 1080 | 0.4442 | 0.6597 | 0.4442 | 0.6665 |
| 0.1257 | 6.7625 | 1082 | 0.4742 | 0.6597 | 0.4742 | 0.6886 |
| 0.1257 | 6.775 | 1084 | 0.4790 | 0.6316 | 0.4790 | 0.6921 |
| 0.1257 | 6.7875 | 1086 | 0.5136 | 0.6316 | 0.5136 | 0.7167 |
| 0.1257 | 6.8 | 1088 | 0.5267 | 0.6216 | 0.5267 | 0.7257 |
| 0.1257 | 6.8125 | 1090 | 0.5078 | 0.6316 | 0.5078 | 0.7126 |
| 0.1257 | 6.825 | 1092 | 0.4651 | 0.6873 | 0.4651 | 0.6820 |
| 0.1257 | 6.8375 | 1094 | 0.4226 | 0.8042 | 0.4226 | 0.6501 |
| 0.1257 | 6.85 | 1096 | 0.4116 | 0.8218 | 0.4116 | 0.6416 |
| 0.1257 | 6.8625 | 1098 | 0.4323 | 0.8042 | 0.4323 | 0.6575 |
| 0.1257 | 6.875 | 1100 | 0.4716 | 0.7619 | 0.4716 | 0.6867 |
| 0.1257 | 6.8875 | 1102 | 0.5112 | 0.6873 | 0.5112 | 0.7150 |
| 0.1257 | 6.9 | 1104 | 0.5756 | 0.5818 | 0.5756 | 0.7587 |
| 0.1257 | 6.9125 | 1106 | 0.5810 | 0.5818 | 0.5810 | 0.7622 |
| 0.1257 | 6.925 | 1108 | 0.5874 | 0.5818 | 0.5874 | 0.7664 |
| 0.1257 | 6.9375 | 1110 | 0.5669 | 0.6111 | 0.5669 | 0.7529 |
| 0.1257 | 6.95 | 1112 | 0.5403 | 0.5743 | 0.5403 | 0.7351 |
| 0.1257 | 6.9625 | 1114 | 0.4917 | 0.7 | 0.4917 | 0.7012 |
| 0.1257 | 6.975 | 1116 | 0.4315 | 0.7941 | 0.4315 | 0.6569 |
| 0.1257 | 6.9875 | 1118 | 0.3912 | 0.8063 | 0.3912 | 0.6255 |
| 0.1257 | 7.0 | 1120 | 0.3583 | 0.8063 | 0.3583 | 0.5986 |
| 0.1257 | 7.0125 | 1122 | 0.3459 | 0.8063 | 0.3459 | 0.5881 |
| 0.1257 | 7.025 | 1124 | 0.3556 | 0.8063 | 0.3556 | 0.5963 |
| 0.1257 | 7.0375 | 1126 | 0.3833 | 0.8063 | 0.3833 | 0.6191 |
| 0.1257 | 7.05 | 1128 | 0.4473 | 0.7426 | 0.4473 | 0.6688 |
| 0.1257 | 7.0625 | 1130 | 0.5321 | 0.6423 | 0.5321 | 0.7295 |
| 0.1257 | 7.075 | 1132 | 0.5994 | 0.5957 | 0.5994 | 0.7742 |
| 0.1257 | 7.0875 | 1134 | 0.6616 | 0.5904 | 0.6616 | 0.8134 |
| 0.1257 | 7.1 | 1136 | 0.7270 | 0.5985 | 0.7270 | 0.8526 |
| 0.1257 | 7.1125 | 1138 | 0.7173 | 0.5985 | 0.7173 | 0.8469 |
| 0.1257 | 7.125 | 1140 | 0.6622 | 0.5602 | 0.6622 | 0.8138 |
| 0.1257 | 7.1375 | 1142 | 0.5876 | 0.6030 | 0.5876 | 0.7665 |
| 0.1257 | 7.15 | 1144 | 0.5282 | 0.6291 | 0.5282 | 0.7268 |
| 0.1257 | 7.1625 | 1146 | 0.5112 | 0.7619 | 0.5112 | 0.7150 |
| 0.1257 | 7.175 | 1148 | 0.5185 | 0.7619 | 0.5185 | 0.7200 |
| 0.1257 | 7.1875 | 1150 | 0.4911 | 0.7879 | 0.4911 | 0.7008 |
| 0.1257 | 7.2 | 1152 | 0.4977 | 0.7879 | 0.4977 | 0.7055 |
| 0.1257 | 7.2125 | 1154 | 0.5089 | 0.7879 | 0.5089 | 0.7133 |
| 0.1257 | 7.225 | 1156 | 0.5177 | 0.7879 | 0.5177 | 0.7195 |
| 0.1257 | 7.2375 | 1158 | 0.5181 | 0.7826 | 0.5181 | 0.7198 |
| 0.1257 | 7.25 | 1160 | 0.4999 | 0.7879 | 0.4999 | 0.7070 |
| 0.1257 | 7.2625 | 1162 | 0.5157 | 0.7879 | 0.5157 | 0.7181 |
| 0.1257 | 7.275 | 1164 | 0.5598 | 0.6361 | 0.5598 | 0.7482 |
| 0.1257 | 7.2875 | 1166 | 0.6208 | 0.5602 | 0.6208 | 0.7879 |
| 0.1257 | 7.3 | 1168 | 0.6302 | 0.5602 | 0.6302 | 0.7938 |
| 0.1257 | 7.3125 | 1170 | 0.5848 | 0.5602 | 0.5848 | 0.7647 |
| 0.1257 | 7.325 | 1172 | 0.5065 | 0.7619 | 0.5065 | 0.7117 |
| 0.1257 | 7.3375 | 1174 | 0.4396 | 0.8042 | 0.4396 | 0.6630 |
| 0.1257 | 7.35 | 1176 | 0.4193 | 0.8218 | 0.4193 | 0.6475 |
| 0.1257 | 7.3625 | 1178 | 0.4287 | 0.7941 | 0.4287 | 0.6547 |
| 0.1257 | 7.375 | 1180 | 0.4658 | 0.7774 | 0.4658 | 0.6825 |
| 0.1257 | 7.3875 | 1182 | 0.5150 | 0.6873 | 0.5150 | 0.7177 |
| 0.1257 | 7.4 | 1184 | 0.5739 | 0.5857 | 0.5739 | 0.7576 |
| 0.1257 | 7.4125 | 1186 | 0.6268 | 0.5752 | 0.6268 | 0.7917 |
| 0.1257 | 7.425 | 1188 | 0.6229 | 0.5752 | 0.6229 | 0.7892 |
| 0.1257 | 7.4375 | 1190 | 0.5718 | 0.5722 | 0.5718 | 0.7562 |
| 0.1257 | 7.45 | 1192 | 0.5091 | 0.6873 | 0.5091 | 0.7135 |
| 0.1257 | 7.4625 | 1194 | 0.4754 | 0.7619 | 0.4754 | 0.6895 |
| 0.1257 | 7.475 | 1196 | 0.4591 | 0.7774 | 0.4591 | 0.6776 |
| 0.1257 | 7.4875 | 1198 | 0.4377 | 0.7774 | 0.4377 | 0.6616 |
| 0.1257 | 7.5 | 1200 | 0.4380 | 0.7774 | 0.4380 | 0.6618 |
| 0.1257 | 7.5125 | 1202 | 0.4362 | 0.7941 | 0.4362 | 0.6604 |
| 0.1257 | 7.525 | 1204 | 0.4414 | 0.7774 | 0.4414 | 0.6644 |
| 0.1257 | 7.5375 | 1206 | 0.4594 | 0.7 | 0.4594 | 0.6778 |
| 0.1257 | 7.55 | 1208 | 0.4963 | 0.6873 | 0.4963 | 0.7045 |
| 0.1257 | 7.5625 | 1210 | 0.5623 | 0.6291 | 0.5623 | 0.7499 |
| 0.1257 | 7.575 | 1212 | 0.6096 | 0.5818 | 0.6096 | 0.7808 |
| 0.1257 | 7.5875 | 1214 | 0.6080 | 0.5818 | 0.6080 | 0.7797 |
| 0.1257 | 7.6 | 1216 | 0.5792 | 0.6101 | 0.5792 | 0.7611 |
| 0.1257 | 7.6125 | 1218 | 0.5589 | 0.6291 | 0.5589 | 0.7476 |
| 0.1257 | 7.625 | 1220 | 0.5460 | 0.6291 | 0.5460 | 0.7389 |
| 0.1257 | 7.6375 | 1222 | 0.5284 | 0.6291 | 0.5284 | 0.7269 |
| 0.1257 | 7.65 | 1224 | 0.5445 | 0.6291 | 0.5445 | 0.7379 |
| 0.1257 | 7.6625 | 1226 | 0.5763 | 0.6101 | 0.5763 | 0.7592 |
| 0.1257 | 7.675 | 1228 | 0.6353 | 0.5969 | 0.6353 | 0.7970 |
| 0.1257 | 7.6875 | 1230 | 0.6703 | 0.5904 | 0.6703 | 0.8187 |
| 0.1257 | 7.7 | 1232 | 0.7238 | 0.5904 | 0.7238 | 0.8508 |
| 0.1257 | 7.7125 | 1234 | 0.7468 | 0.5985 | 0.7468 | 0.8642 |
| 0.1257 | 7.725 | 1236 | 0.7332 | 0.5985 | 0.7332 | 0.8563 |
| 0.1257 | 7.7375 | 1238 | 0.6653 | 0.5904 | 0.6653 | 0.8157 |
| 0.1257 | 7.75 | 1240 | 0.5862 | 0.5818 | 0.5862 | 0.7656 |
| 0.1257 | 7.7625 | 1242 | 0.5072 | 0.7016 | 0.5072 | 0.7121 |
| 0.1257 | 7.775 | 1244 | 0.4680 | 0.8042 | 0.4680 | 0.6841 |
| 0.1257 | 7.7875 | 1246 | 0.4667 | 0.8042 | 0.4667 | 0.6832 |
| 0.1257 | 7.8 | 1248 | 0.4978 | 0.7273 | 0.4978 | 0.7055 |
| 0.1257 | 7.8125 | 1250 | 0.5338 | 0.6872 | 0.5338 | 0.7306 |
| 0.1257 | 7.825 | 1252 | 0.5400 | 0.7030 | 0.5400 | 0.7348 |
| 0.1257 | 7.8375 | 1254 | 0.5388 | 0.7030 | 0.5388 | 0.7340 |
| 0.1257 | 7.85 | 1256 | 0.5285 | 0.7016 | 0.5285 | 0.7270 |
| 0.1257 | 7.8625 | 1258 | 0.5292 | 0.7016 | 0.5292 | 0.7275 |
| 0.1257 | 7.875 | 1260 | 0.5263 | 0.7016 | 0.5263 | 0.7254 |
| 0.1257 | 7.8875 | 1262 | 0.5170 | 0.7016 | 0.5170 | 0.7191 |
| 0.1257 | 7.9 | 1264 | 0.5323 | 0.7030 | 0.5323 | 0.7296 |
| 0.1257 | 7.9125 | 1266 | 0.5468 | 0.6361 | 0.5468 | 0.7394 |
| 0.1257 | 7.925 | 1268 | 0.5727 | 0.5882 | 0.5727 | 0.7568 |
| 0.1257 | 7.9375 | 1270 | 0.5588 | 0.6420 | 0.5588 | 0.7476 |
| 0.1257 | 7.95 | 1272 | 0.5476 | 0.6361 | 0.5476 | 0.7400 |
| 0.1257 | 7.9625 | 1274 | 0.5335 | 0.7030 | 0.5335 | 0.7304 |
| 0.1257 | 7.975 | 1276 | 0.5448 | 0.6361 | 0.5448 | 0.7381 |
| 0.1257 | 7.9875 | 1278 | 0.5440 | 0.6361 | 0.5440 | 0.7376 |
| 0.1257 | 8.0 | 1280 | 0.5574 | 0.6420 | 0.5574 | 0.7466 |
| 0.1257 | 8.0125 | 1282 | 0.5997 | 0.5818 | 0.5997 | 0.7744 |
| 0.1257 | 8.025 | 1284 | 0.6316 | 0.5752 | 0.6316 | 0.7947 |
| 0.1257 | 8.0375 | 1286 | 0.6321 | 0.5842 | 0.6321 | 0.7950 |
| 0.1257 | 8.05 | 1288 | 0.6328 | 0.5842 | 0.6328 | 0.7955 |
| 0.1257 | 8.0625 | 1290 | 0.5960 | 0.6336 | 0.5960 | 0.7720 |
| 0.1257 | 8.075 | 1292 | 0.5735 | 0.6420 | 0.5735 | 0.7573 |
| 0.1257 | 8.0875 | 1294 | 0.5373 | 0.6606 | 0.5373 | 0.7330 |
| 0.1257 | 8.1 | 1296 | 0.4880 | 0.7273 | 0.4880 | 0.6986 |
| 0.1257 | 8.1125 | 1298 | 0.4512 | 0.8218 | 0.4512 | 0.6717 |
| 0.1257 | 8.125 | 1300 | 0.4267 | 0.8218 | 0.4267 | 0.6532 |
| 0.1257 | 8.1375 | 1302 | 0.4178 | 0.7879 | 0.4178 | 0.6464 |
| 0.1257 | 8.15 | 1304 | 0.4143 | 0.7879 | 0.4143 | 0.6436 |
| 0.1257 | 8.1625 | 1306 | 0.4207 | 0.7879 | 0.4207 | 0.6486 |
| 0.1257 | 8.175 | 1308 | 0.4348 | 0.8218 | 0.4348 | 0.6594 |
| 0.1257 | 8.1875 | 1310 | 0.4587 | 0.8042 | 0.4587 | 0.6773 |
| 0.1257 | 8.2 | 1312 | 0.4901 | 0.7016 | 0.4901 | 0.7001 |
| 0.1257 | 8.2125 | 1314 | 0.5220 | 0.6020 | 0.5220 | 0.7225 |
| 0.1257 | 8.225 | 1316 | 0.5599 | 0.6020 | 0.5599 | 0.7483 |
| 0.1257 | 8.2375 | 1318 | 0.5706 | 0.5855 | 0.5706 | 0.7554 |
| 0.1257 | 8.25 | 1320 | 0.5643 | 0.6020 | 0.5643 | 0.7512 |
| 0.1257 | 8.2625 | 1322 | 0.5542 | 0.6020 | 0.5542 | 0.7444 |
| 0.1257 | 8.275 | 1324 | 0.5646 | 0.6111 | 0.5646 | 0.7514 |
| 0.1257 | 8.2875 | 1326 | 0.5598 | 0.6189 | 0.5598 | 0.7482 |
| 0.1257 | 8.3 | 1328 | 0.5457 | 0.6789 | 0.5457 | 0.7387 |
| 0.1257 | 8.3125 | 1330 | 0.5380 | 0.7030 | 0.5380 | 0.7335 |
| 0.1257 | 8.325 | 1332 | 0.5238 | 0.7030 | 0.5238 | 0.7237 |
| 0.1257 | 8.3375 | 1334 | 0.5236 | 0.7030 | 0.5236 | 0.7236 |
| 0.1257 | 8.35 | 1336 | 0.5325 | 0.7042 | 0.5325 | 0.7297 |
| 0.1257 | 8.3625 | 1338 | 0.5625 | 0.6684 | 0.5625 | 0.7500 |
| 0.1257 | 8.375 | 1340 | 0.5878 | 0.6598 | 0.5878 | 0.7667 |
| 0.1257 | 8.3875 | 1342 | 0.6235 | 0.5902 | 0.6235 | 0.7896 |
| 0.1257 | 8.4 | 1344 | 0.6282 | 0.5902 | 0.6282 | 0.7926 |
| 0.1257 | 8.4125 | 1346 | 0.6021 | 0.6392 | 0.6021 | 0.7759 |
| 0.1257 | 8.425 | 1348 | 0.5936 | 0.6392 | 0.5936 | 0.7704 |
| 0.1257 | 8.4375 | 1350 | 0.5930 | 0.6392 | 0.5930 | 0.7701 |
| 0.1257 | 8.45 | 1352 | 0.5844 | 0.6598 | 0.5844 | 0.7644 |
| 0.1257 | 8.4625 | 1354 | 0.5706 | 0.7154 | 0.5706 | 0.7554 |
| 0.1257 | 8.475 | 1356 | 0.5510 | 0.7263 | 0.5510 | 0.7423 |
| 0.1257 | 8.4875 | 1358 | 0.5221 | 0.7263 | 0.5221 | 0.7226 |
| 0.1257 | 8.5 | 1360 | 0.5066 | 0.7273 | 0.5066 | 0.7118 |
| 0.1257 | 8.5125 | 1362 | 0.5009 | 0.7273 | 0.5009 | 0.7077 |
| 0.1257 | 8.525 | 1364 | 0.5142 | 0.7273 | 0.5142 | 0.7171 |
| 0.1257 | 8.5375 | 1366 | 0.5247 | 0.7016 | 0.5247 | 0.7244 |
| 0.1257 | 8.55 | 1368 | 0.5416 | 0.6420 | 0.5416 | 0.7359 |
| 0.1257 | 8.5625 | 1370 | 0.5412 | 0.6189 | 0.5412 | 0.7357 |
| 0.1257 | 8.575 | 1372 | 0.5300 | 0.6361 | 0.5300 | 0.7280 |
| 0.1257 | 8.5875 | 1374 | 0.5137 | 0.7267 | 0.5137 | 0.7167 |
| 0.1257 | 8.6 | 1376 | 0.4919 | 0.7273 | 0.4919 | 0.7014 |
| 0.1257 | 8.6125 | 1378 | 0.4682 | 0.7273 | 0.4682 | 0.6842 |
| 0.1257 | 8.625 | 1380 | 0.4621 | 0.7879 | 0.4621 | 0.6798 |
| 0.1257 | 8.6375 | 1382 | 0.4668 | 0.7273 | 0.4668 | 0.6832 |
| 0.1257 | 8.65 | 1384 | 0.4784 | 0.7273 | 0.4784 | 0.6917 |
| 0.1257 | 8.6625 | 1386 | 0.4995 | 0.7267 | 0.4995 | 0.7067 |
| 0.1257 | 8.675 | 1388 | 0.5192 | 0.6111 | 0.5192 | 0.7205 |
| 0.1257 | 8.6875 | 1390 | 0.5486 | 0.6358 | 0.5486 | 0.7407 |
| 0.1257 | 8.7 | 1392 | 0.5779 | 0.6294 | 0.5779 | 0.7602 |
| 0.1257 | 8.7125 | 1394 | 0.5934 | 0.6294 | 0.5934 | 0.7703 |
| 0.1257 | 8.725 | 1396 | 0.5963 | 0.6294 | 0.5963 | 0.7722 |
| 0.1257 | 8.7375 | 1398 | 0.6090 | 0.6294 | 0.6090 | 0.7804 |
| 0.1257 | 8.75 | 1400 | 0.6010 | 0.6294 | 0.6010 | 0.7752 |
| 0.1257 | 8.7625 | 1402 | 0.5656 | 0.6222 | 0.5656 | 0.7521 |
| 0.1257 | 8.775 | 1404 | 0.5436 | 0.6138 | 0.5436 | 0.7373 |
| 0.1257 | 8.7875 | 1406 | 0.5202 | 0.6216 | 0.5202 | 0.7213 |
| 0.1257 | 8.8 | 1408 | 0.5030 | 0.6807 | 0.5030 | 0.7093 |
| 0.1257 | 8.8125 | 1410 | 0.4861 | 0.6597 | 0.4861 | 0.6972 |
| 0.1257 | 8.825 | 1412 | 0.4834 | 0.6597 | 0.4834 | 0.6953 |
| 0.1257 | 8.8375 | 1414 | 0.4821 | 0.6597 | 0.4821 | 0.6943 |
| 0.1257 | 8.85 | 1416 | 0.4887 | 0.6807 | 0.4887 | 0.6990 |
| 0.1257 | 8.8625 | 1418 | 0.5129 | 0.6807 | 0.5129 | 0.7161 |
| 0.1257 | 8.875 | 1420 | 0.5386 | 0.6293 | 0.5386 | 0.7339 |
| 0.1257 | 8.8875 | 1422 | 0.5574 | 0.6222 | 0.5574 | 0.7466 |
| 0.1257 | 8.9 | 1424 | 0.5621 | 0.6222 | 0.5621 | 0.7497 |
| 0.1257 | 8.9125 | 1426 | 0.5669 | 0.6222 | 0.5669 | 0.7529 |
| 0.1257 | 8.925 | 1428 | 0.5611 | 0.6293 | 0.5611 | 0.7491 |
| 0.1257 | 8.9375 | 1430 | 0.5470 | 0.6293 | 0.5470 | 0.7396 |
| 0.1257 | 8.95 | 1432 | 0.5168 | 0.6789 | 0.5168 | 0.7189 |
| 0.1257 | 8.9625 | 1434 | 0.4929 | 0.7826 | 0.4929 | 0.7021 |
| 0.1257 | 8.975 | 1436 | 0.4762 | 0.7879 | 0.4762 | 0.6901 |
| 0.1257 | 8.9875 | 1438 | 0.4714 | 0.7879 | 0.4714 | 0.6866 |
| 0.1257 | 9.0 | 1440 | 0.4800 | 0.7879 | 0.4800 | 0.6928 |
| 0.1257 | 9.0125 | 1442 | 0.4929 | 0.7342 | 0.4929 | 0.7021 |
| 0.1257 | 9.025 | 1444 | 0.5033 | 0.7342 | 0.5033 | 0.7094 |
| 0.1257 | 9.0375 | 1446 | 0.5094 | 0.7342 | 0.5094 | 0.7137 |
| 0.1257 | 9.05 | 1448 | 0.5064 | 0.7342 | 0.5064 | 0.7116 |
| 0.1257 | 9.0625 | 1450 | 0.4966 | 0.7354 | 0.4966 | 0.7047 |
| 0.1257 | 9.075 | 1452 | 0.4830 | 0.7619 | 0.4830 | 0.6950 |
| 0.1257 | 9.0875 | 1454 | 0.4619 | 0.7879 | 0.4619 | 0.6797 |
| 0.1257 | 9.1 | 1456 | 0.4376 | 0.7879 | 0.4376 | 0.6615 |
| 0.1257 | 9.1125 | 1458 | 0.4103 | 0.7709 | 0.4103 | 0.6405 |
| 0.1257 | 9.125 | 1460 | 0.3964 | 0.7879 | 0.3964 | 0.6296 |
| 0.1257 | 9.1375 | 1462 | 0.3893 | 0.7879 | 0.3893 | 0.6239 |
| 0.1257 | 9.15 | 1464 | 0.3883 | 0.7879 | 0.3883 | 0.6231 |
| 0.1257 | 9.1625 | 1466 | 0.3962 | 0.7879 | 0.3962 | 0.6294 |
| 0.1257 | 9.175 | 1468 | 0.4106 | 0.7879 | 0.4106 | 0.6408 |
| 0.1257 | 9.1875 | 1470 | 0.4305 | 0.7552 | 0.4305 | 0.6561 |
| 0.1257 | 9.2 | 1472 | 0.4519 | 0.7619 | 0.4519 | 0.6723 |
| 0.1257 | 9.2125 | 1474 | 0.4748 | 0.7619 | 0.4748 | 0.6891 |
| 0.1257 | 9.225 | 1476 | 0.4927 | 0.6807 | 0.4927 | 0.7019 |
| 0.1257 | 9.2375 | 1478 | 0.5086 | 0.6807 | 0.5086 | 0.7132 |
| 0.1257 | 9.25 | 1480 | 0.5227 | 0.6416 | 0.5227 | 0.7229 |
| 0.1257 | 9.2625 | 1482 | 0.5239 | 0.6416 | 0.5239 | 0.7238 |
| 0.1257 | 9.275 | 1484 | 0.5125 | 0.6807 | 0.5125 | 0.7159 |
| 0.1257 | 9.2875 | 1486 | 0.5086 | 0.6807 | 0.5086 | 0.7132 |
| 0.1257 | 9.3 | 1488 | 0.5047 | 0.6807 | 0.5047 | 0.7104 |
| 0.1257 | 9.3125 | 1490 | 0.4945 | 0.6807 | 0.4945 | 0.7032 |
| 0.1257 | 9.325 | 1492 | 0.4844 | 0.6597 | 0.4844 | 0.6960 |
| 0.1257 | 9.3375 | 1494 | 0.4743 | 0.7354 | 0.4743 | 0.6887 |
| 0.1257 | 9.35 | 1496 | 0.4621 | 0.7279 | 0.4621 | 0.6798 |
| 0.1257 | 9.3625 | 1498 | 0.4583 | 0.7552 | 0.4583 | 0.6769 |
| 0.0682 | 9.375 | 1500 | 0.4565 | 0.7552 | 0.4565 | 0.6757 |
| 0.0682 | 9.3875 | 1502 | 0.4595 | 0.7879 | 0.4595 | 0.6778 |
| 0.0682 | 9.4 | 1504 | 0.4655 | 0.7879 | 0.4655 | 0.6823 |
| 0.0682 | 9.4125 | 1506 | 0.4718 | 0.7619 | 0.4718 | 0.6868 |
| 0.0682 | 9.425 | 1508 | 0.4796 | 0.7619 | 0.4796 | 0.6925 |
| 0.0682 | 9.4375 | 1510 | 0.4894 | 0.6597 | 0.4894 | 0.6996 |
| 0.0682 | 9.45 | 1512 | 0.4956 | 0.6597 | 0.4956 | 0.7040 |
| 0.0682 | 9.4625 | 1514 | 0.5056 | 0.6020 | 0.5056 | 0.7110 |
| 0.0682 | 9.475 | 1516 | 0.5153 | 0.6020 | 0.5153 | 0.7178 |
| 0.0682 | 9.4875 | 1518 | 0.5301 | 0.6020 | 0.5301 | 0.7281 |
| 0.0682 | 9.5 | 1520 | 0.5357 | 0.6020 | 0.5357 | 0.7319 |
| 0.0682 | 9.5125 | 1522 | 0.5360 | 0.6020 | 0.5360 | 0.7322 |
| 0.0682 | 9.525 | 1524 | 0.5346 | 0.6020 | 0.5346 | 0.7312 |
| 0.0682 | 9.5375 | 1526 | 0.5325 | 0.6020 | 0.5325 | 0.7297 |
| 0.0682 | 9.55 | 1528 | 0.5289 | 0.6020 | 0.5289 | 0.7272 |
| 0.0682 | 9.5625 | 1530 | 0.5206 | 0.6020 | 0.5206 | 0.7215 |
| 0.0682 | 9.575 | 1532 | 0.5153 | 0.6020 | 0.5153 | 0.7178 |
| 0.0682 | 9.5875 | 1534 | 0.5088 | 0.6020 | 0.5088 | 0.7133 |
| 0.0682 | 9.6 | 1536 | 0.5072 | 0.6020 | 0.5072 | 0.7121 |
| 0.0682 | 9.6125 | 1538 | 0.5031 | 0.6020 | 0.5031 | 0.7093 |
| 0.0682 | 9.625 | 1540 | 0.5015 | 0.6020 | 0.5015 | 0.7082 |
| 0.0682 | 9.6375 | 1542 | 0.4977 | 0.7354 | 0.4977 | 0.7055 |
| 0.0682 | 9.65 | 1544 | 0.4910 | 0.7619 | 0.4910 | 0.7007 |
| 0.0682 | 9.6625 | 1546 | 0.4859 | 0.7619 | 0.4859 | 0.6971 |
| 0.0682 | 9.675 | 1548 | 0.4819 | 0.7619 | 0.4819 | 0.6942 |
| 0.0682 | 9.6875 | 1550 | 0.4813 | 0.7619 | 0.4813 | 0.6937 |
| 0.0682 | 9.7 | 1552 | 0.4769 | 0.7879 | 0.4769 | 0.6906 |
| 0.0682 | 9.7125 | 1554 | 0.4723 | 0.7879 | 0.4723 | 0.6872 |
| 0.0682 | 9.725 | 1556 | 0.4714 | 0.7879 | 0.4714 | 0.6866 |
| 0.0682 | 9.7375 | 1558 | 0.4714 | 0.7879 | 0.4714 | 0.6866 |
| 0.0682 | 9.75 | 1560 | 0.4719 | 0.7879 | 0.4719 | 0.6870 |
| 0.0682 | 9.7625 | 1562 | 0.4740 | 0.7879 | 0.4740 | 0.6885 |
| 0.0682 | 9.775 | 1564 | 0.4754 | 0.7879 | 0.4754 | 0.6895 |
| 0.0682 | 9.7875 | 1566 | 0.4787 | 0.7879 | 0.4787 | 0.6919 |
| 0.0682 | 9.8 | 1568 | 0.4833 | 0.7619 | 0.4833 | 0.6952 |
| 0.0682 | 9.8125 | 1570 | 0.4893 | 0.7619 | 0.4893 | 0.6995 |
| 0.0682 | 9.825 | 1572 | 0.4947 | 0.7354 | 0.4947 | 0.7034 |
| 0.0682 | 9.8375 | 1574 | 0.5005 | 0.6020 | 0.5005 | 0.7075 |
| 0.0682 | 9.85 | 1576 | 0.5051 | 0.6020 | 0.5051 | 0.7107 |
| 0.0682 | 9.8625 | 1578 | 0.5084 | 0.6020 | 0.5084 | 0.7130 |
| 0.0682 | 9.875 | 1580 | 0.5115 | 0.6020 | 0.5115 | 0.7152 |
| 0.0682 | 9.8875 | 1582 | 0.5141 | 0.6020 | 0.5141 | 0.7170 |
| 0.0682 | 9.9 | 1584 | 0.5163 | 0.6020 | 0.5163 | 0.7186 |
| 0.0682 | 9.9125 | 1586 | 0.5170 | 0.6020 | 0.5170 | 0.7190 |
| 0.0682 | 9.925 | 1588 | 0.5169 | 0.6020 | 0.5169 | 0.7190 |
| 0.0682 | 9.9375 | 1590 | 0.5163 | 0.6020 | 0.5163 | 0.7185 |
| 0.0682 | 9.95 | 1592 | 0.5158 | 0.6020 | 0.5158 | 0.7182 |
| 0.0682 | 9.9625 | 1594 | 0.5152 | 0.6020 | 0.5152 | 0.7178 |
| 0.0682 | 9.975 | 1596 | 0.5147 | 0.6020 | 0.5147 | 0.7174 |
| 0.0682 | 9.9875 | 1598 | 0.5143 | 0.6020 | 0.5143 | 0.7171 |
| 0.0682 | 10.0 | 1600 | 0.5141 | 0.6020 | 0.5141 | 0.7170 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
paconaranjo/inah | paconaranjo | 2024-11-24T08:10:55Z | 20 | 0 | diffusers | [
"diffusers",
"text-to-image",
"flux",
"lora",
"template:sd-lora",
"fluxgym",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-24T07:58:43Z | ---
tags:
- text-to-image
- flux
- lora
- diffusers
- template:sd-lora
- fluxgym
widget:
- text: inah
  output:
    url: sample/inah_001000_00_20241123045700.png
- text: inah dragon under water
  output:
    url: images/example_f9yu0q60n.png
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: inah
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
# inah
A Flux LoRA trained on a local computer with [Fluxgym](https://github.com/cocktailpeanut/fluxgym)
<Gallery />
## Trigger words
You should use `inah` to trigger the image generation.
## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, Forge, etc.
Weights for this model are available in Safetensors format.
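Besides the UIs above, the weights can also be loaded directly with 🤗 diffusers. A minimal sketch, assuming standard FLUX LoRA loading; the inference settings and output filename are illustrative, not taken from the training run:
```python
# a minimal sketch, assuming standard diffusers FLUX LoRA loading
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.load_lora_weights("paconaranjo/inah")  # this LoRA repo
pipe.to("cuda")

# the trigger word `inah` must appear in the prompt
image = pipe(
    "inah dragon under water",
    num_inference_steps=28,  # illustrative settings
    guidance_scale=3.5,
).images[0]
image.save("inah.png")
```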
|
susmitabhatt/whisper-a-norm-ls-8 | susmitabhatt | 2024-11-24T08:02:58Z | 81 | 0 | transformers | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-24T06:49:16Z | ---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-a-norm-ls-8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-a-norm-ls-8
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1538
- Wer: 78.3845
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 8
- mixed_precision_training: Native AMP
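These settings map roughly onto transformers' `Seq2SeqTrainingArguments`. A sketch under that assumption; the output directory is illustrative:
```python
# a sketch of the configuration above as Seq2SeqTrainingArguments;
# output_dir is illustrative, all other values come from the list above
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="whisper-a-norm-ls-8",
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # total train batch size 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=8,
    seed=42,
    fp16=True,  # mixed precision (native AMP)
)
```
Passed to a `Seq2SeqTrainer` together with the model, processor, and datasets, these arguments would reproduce the schedule described above.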
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| No log | 1.0 | 70 | 0.2366 | 16.4960 |
| 0.7791 | 2.0 | 140 | 0.4579 | 96.5870 |
| 0.8786 | 3.0 | 210 | 0.2974 | 91.4676 |
| 0.8786 | 4.0 | 280 | 0.2770 | 93.1741 |
| 0.2773 | 5.0 | 350 | 0.2503 | 91.8089 |
| 0.2596 | 6.0 | 420 | 0.3236 | 95.2218 |
| 0.2596 | 7.0 | 490 | 0.1855 | 93.0603 |
| 0.2108 | 7.8921 | 552 | 0.1538 | 78.3845 |
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
Amadeus99/parler-tts-mini-v1-pbl-v2 | Amadeus99 | 2024-11-24T08:01:11Z | 46 | 0 | transformers | [
"transformers",
"safetensors",
"parler_tts",
"text2text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-24T08:00:25Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
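Pending the authors' own snippet, a hedged sketch assuming the standard Parler-TTS API applies to this fine-tune; the prompt and voice description are invented examples:
```python
# a hedged sketch assuming the standard parler_tts API;
# the prompt and description are invented examples
import torch
import soundfile as sf
from parler_tts import ParlerTTSForConditionalGeneration
from transformers import AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
repo = "Amadeus99/parler-tts-mini-v1-pbl-v2"
model = ParlerTTSForConditionalGeneration.from_pretrained(repo).to(device)
tokenizer = AutoTokenizer.from_pretrained(repo)

description = "A clear, neutral speaker with moderate pace."
prompt = "Hello, this is a test of the fine-tuned voice."

input_ids = tokenizer(description, return_tensors="pt").input_ids.to(device)
prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

audio = model.generate(input_ids=input_ids, prompt_input_ids=prompt_ids)
sf.write("out.wav", audio.cpu().numpy().squeeze(), model.config.sampling_rate)
```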
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
TravelAgentProject/Llama3.2-3B-var-extraction-flight | TravelAgentProject | 2024-11-24T07:42:26Z | 79 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"trl",
"sft",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-24T07:40:04Z | ---
library_name: transformers
tags:
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
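Pending the authors' own snippet, a minimal sketch with 4-bit loading (the repo's tags list bitsandbytes and 4-bit); the example prompt is invented:
```python
# a minimal sketch with 4-bit loading, per the card's bitsandbytes/4-bit tags;
# the example prompt is invented
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

repo = "TravelAgentProject/Llama3.2-3B-var-extraction-flight"
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, quantization_config=bnb, device_map="auto"
)

prompt = "Extract the flight variables: Book me a flight from SFO to JFK on May 3."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```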
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
masterchop/gemma-Code-Instruct-Finetune-test | masterchop | 2024-11-24T07:41:35Z | 124 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T07:33:26Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
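Pending details from the author, here is a minimal loading sketch with 🤗 Transformers. Only the repo id comes from this card's listing; the dtype, device placement, chat template, and prompt are illustrative assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "masterchop/gemma-Code-Instruct-Finetune-test"  # repo id from this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to keep memory modest
    device_map="auto",
)

# The "conversational" tag suggests the tokenizer ships a chat template.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```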
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
raveas/results | raveas | 2024-11-24T07:39:15Z | 6 | 0 | null | [
"tensorboard",
"safetensors",
"roberta",
"generated_from_trainer",
"base_model:klue/roberta-base",
"base_model:finetune:klue/roberta-base",
"region:us"
] | null | 2024-11-24T07:34:45Z | ---
base_model: klue/roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: results
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [klue/roberta-base](https://huggingface.co/klue/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4823
- Accuracy: 0.859
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
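Read literally, the list above maps onto a Trainer setup like the one below. This is a reconstruction from the reported values, not the author's actual script; the classification head size and dataset wiring are unreported and left as placeholders.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

base = "klue/roberta-base"  # base model named in this card
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base)  # num_labels unreported

args = TrainingArguments(
    output_dir="results",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default
    # optimizer, matching the values reported above.
)

# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...).train()
```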
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5316 | 1.0 | 1250 | 0.5101 | 0.85 |
### Framework versions
- Transformers 4.40.1
- Pytorch 2.5.1+cu118
- Datasets 2.19.0
- Tokenizers 0.19.1
|
raveas/roberta-base-klue-ynat-classification-colab | raveas | 2024-11-24T07:35:23Z | 105 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-24T07:35:03Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
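In the absence of an official snippet, a minimal sketch using the `pipeline` API. The repo id comes from this card's listing and the task from its `text-classification` tag; the label set and the Korean example headline are assumptions.

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="raveas/roberta-base-klue-ynat-classification-colab",
)

# Illustrative Korean news headline; the model name suggests KLUE YNAT topic labels.
print(classifier("삼성전자, 2분기 실적 발표"))
```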
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k10_task1_organization_fold0 | MayBashendy | 2024-11-24T07:14:09Z | 162 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T18:48:41Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k10_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k10_task1_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set (a short sketch of how such metrics are typically computed follows the list):
- Loss: 0.6292
- Qwk: 0.7421
- Mse: 0.6292
- Rmse: 0.7932
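The metric names are not expanded in the card. Qwk is presumably quadratic weighted kappa, and Rmse is simply the square root of the reported Mse (0.7932 ≈ √0.6292). Assuming integer-valued labels, these numbers would typically be computed like this:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative labels/predictions; the actual evaluation split is not published.
y_true = np.array([3, 2, 4, 3, 1])
y_pred = np.array([3, 3, 4, 2, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```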
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0182 | 2 | 5.4708 | -0.1511 | 5.4708 | 2.3390 |
| No log | 0.0364 | 4 | 3.2469 | -0.0173 | 3.2469 | 1.8019 |
| No log | 0.0545 | 6 | 2.3449 | -0.0561 | 2.3449 | 1.5313 |
| No log | 0.0727 | 8 | 1.7575 | 0.0255 | 1.7575 | 1.3257 |
| No log | 0.0909 | 10 | 1.8028 | 0.0742 | 1.8028 | 1.3427 |
| No log | 0.1091 | 12 | 2.2420 | 0.1064 | 2.2420 | 1.4973 |
| No log | 0.1273 | 14 | 2.4977 | 0.1513 | 2.4977 | 1.5804 |
| No log | 0.1455 | 16 | 2.2588 | 0.1582 | 2.2588 | 1.5029 |
| No log | 0.1636 | 18 | 1.9577 | 0.0950 | 1.9577 | 1.3992 |
| No log | 0.1818 | 20 | 1.6859 | 0.0 | 1.6859 | 1.2984 |
| No log | 0.2 | 22 | 1.6142 | 0.0 | 1.6142 | 1.2705 |
| No log | 0.2182 | 24 | 1.6106 | 0.0 | 1.6106 | 1.2691 |
| No log | 0.2364 | 26 | 1.9257 | 0.0950 | 1.9257 | 1.3877 |
| No log | 0.2545 | 28 | 2.5865 | 0.1034 | 2.5865 | 1.6083 |
| No log | 0.2727 | 30 | 2.6137 | 0.1161 | 2.6137 | 1.6167 |
| No log | 0.2909 | 32 | 2.2849 | 0.1642 | 2.2849 | 1.5116 |
| No log | 0.3091 | 34 | 2.0125 | 0.0696 | 2.0125 | 1.4186 |
| No log | 0.3273 | 36 | 1.5896 | -0.0269 | 1.5896 | 1.2608 |
| No log | 0.3455 | 38 | 1.3199 | 0.3778 | 1.3199 | 1.1489 |
| No log | 0.3636 | 40 | 1.3317 | 0.0847 | 1.3317 | 1.1540 |
| No log | 0.3818 | 42 | 1.4252 | 0.1893 | 1.4252 | 1.1938 |
| No log | 0.4 | 44 | 1.5542 | 0.0982 | 1.5542 | 1.2467 |
| No log | 0.4182 | 46 | 1.6544 | 0.0 | 1.6544 | 1.2862 |
| No log | 0.4364 | 48 | 1.9784 | 0.0 | 1.9784 | 1.4066 |
| No log | 0.4545 | 50 | 2.5361 | -0.2503 | 2.5361 | 1.5925 |
| No log | 0.4727 | 52 | 2.7130 | -0.1015 | 2.7130 | 1.6471 |
| No log | 0.4909 | 54 | 2.2667 | 0.0742 | 2.2667 | 1.5055 |
| No log | 0.5091 | 56 | 1.9878 | 0.0 | 1.9878 | 1.4099 |
| No log | 0.5273 | 58 | 1.8660 | 0.0 | 1.8660 | 1.3660 |
| No log | 0.5455 | 60 | 1.7680 | 0.0 | 1.7680 | 1.3297 |
| No log | 0.5636 | 62 | 1.7303 | 0.0 | 1.7303 | 1.3154 |
| No log | 0.5818 | 64 | 1.7132 | 0.0 | 1.7132 | 1.3089 |
| No log | 0.6 | 66 | 1.6945 | 0.0 | 1.6945 | 1.3017 |
| No log | 0.6182 | 68 | 1.7000 | 0.0 | 1.7000 | 1.3038 |
| No log | 0.6364 | 70 | 1.6219 | 0.0 | 1.6219 | 1.2736 |
| No log | 0.6545 | 72 | 1.5119 | 0.2032 | 1.5119 | 1.2296 |
| No log | 0.6727 | 74 | 1.4667 | 0.3808 | 1.4667 | 1.2111 |
| No log | 0.6909 | 76 | 1.4374 | 0.3793 | 1.4374 | 1.1989 |
| No log | 0.7091 | 78 | 1.4396 | 0.4059 | 1.4396 | 1.1998 |
| No log | 0.7273 | 80 | 1.3925 | 0.4059 | 1.3925 | 1.1800 |
| No log | 0.7455 | 82 | 1.3151 | 0.3808 | 1.3151 | 1.1468 |
| No log | 0.7636 | 84 | 1.1921 | 0.4310 | 1.1921 | 1.0918 |
| No log | 0.7818 | 86 | 1.1903 | 0.4059 | 1.1903 | 1.0910 |
| No log | 0.8 | 88 | 1.2966 | 0.2794 | 1.2966 | 1.1387 |
| No log | 0.8182 | 90 | 1.4310 | 0.1518 | 1.4310 | 1.1963 |
| No log | 0.8364 | 92 | 1.3529 | 0.2794 | 1.3529 | 1.1631 |
| No log | 0.8545 | 94 | 1.2545 | 0.4324 | 1.2545 | 1.1201 |
| No log | 0.8727 | 96 | 1.1983 | 0.4310 | 1.1983 | 1.0947 |
| No log | 0.8909 | 98 | 1.1210 | 0.4310 | 1.1210 | 1.0588 |
| No log | 0.9091 | 100 | 1.1091 | 0.4296 | 1.1091 | 1.0531 |
| No log | 0.9273 | 102 | 1.3268 | 0.2723 | 1.3268 | 1.1519 |
| No log | 0.9455 | 104 | 1.3442 | 0.4 | 1.3442 | 1.1594 |
| No log | 0.9636 | 106 | 1.1607 | 0.3494 | 1.1607 | 1.0774 |
| No log | 0.9818 | 108 | 1.0171 | 0.3511 | 1.0171 | 1.0085 |
| No log | 1.0 | 110 | 0.9815 | 0.3511 | 0.9815 | 0.9907 |
| No log | 1.0182 | 112 | 0.9685 | 0.3511 | 0.9685 | 0.9841 |
| No log | 1.0364 | 114 | 1.0992 | 0.5047 | 1.0992 | 1.0484 |
| No log | 1.0545 | 116 | 1.2978 | 0.3209 | 1.2978 | 1.1392 |
| No log | 1.0727 | 118 | 1.3305 | 0.3209 | 1.3305 | 1.1535 |
| No log | 1.0909 | 120 | 1.1361 | 0.5039 | 1.1361 | 1.0659 |
| No log | 1.1091 | 122 | 0.9297 | 0.5944 | 0.9297 | 0.9642 |
| No log | 1.1273 | 124 | 0.8978 | 0.5944 | 0.8978 | 0.9475 |
| No log | 1.1455 | 126 | 0.9838 | 0.6476 | 0.9838 | 0.9918 |
| No log | 1.1636 | 128 | 1.2428 | 0.5174 | 1.2428 | 1.1148 |
| No log | 1.1818 | 130 | 1.5020 | 0.2782 | 1.5020 | 1.2256 |
| No log | 1.2 | 132 | 1.4120 | 0.4119 | 1.4120 | 1.1883 |
| No log | 1.2182 | 134 | 1.0712 | 0.5589 | 1.0712 | 1.0350 |
| No log | 1.2364 | 136 | 0.8721 | 0.5944 | 0.8721 | 0.9339 |
| No log | 1.2545 | 138 | 0.9400 | 0.3793 | 0.9400 | 0.9695 |
| No log | 1.2727 | 140 | 1.0203 | 0.3793 | 1.0203 | 1.0101 |
| No log | 1.2909 | 142 | 0.9384 | 0.4044 | 0.9384 | 0.9687 |
| No log | 1.3091 | 144 | 0.8861 | 0.5070 | 0.8861 | 0.9413 |
| No log | 1.3273 | 146 | 0.8414 | 0.5762 | 0.8414 | 0.9173 |
| No log | 1.3455 | 148 | 0.8500 | 0.6476 | 0.8500 | 0.9220 |
| No log | 1.3636 | 150 | 1.0741 | 0.5563 | 1.0741 | 1.0364 |
| No log | 1.3818 | 152 | 1.1810 | 0.4815 | 1.1810 | 1.0867 |
| No log | 1.4 | 154 | 1.2439 | 0.4815 | 1.2439 | 1.1153 |
| No log | 1.4182 | 156 | 1.1609 | 0.4809 | 1.1609 | 1.0775 |
| No log | 1.4364 | 158 | 0.9786 | 0.5672 | 0.9786 | 0.9892 |
| No log | 1.4545 | 160 | 0.8973 | 0.6714 | 0.8973 | 0.9473 |
| No log | 1.4727 | 162 | 0.8809 | 0.6980 | 0.8809 | 0.9386 |
| No log | 1.4909 | 164 | 0.8366 | 0.6576 | 0.8366 | 0.9147 |
| No log | 1.5091 | 166 | 0.7934 | 0.6824 | 0.7934 | 0.8907 |
| No log | 1.5273 | 168 | 0.8410 | 0.6483 | 0.8410 | 0.9171 |
| No log | 1.5455 | 170 | 0.9982 | 0.5647 | 0.9982 | 0.9991 |
| No log | 1.5636 | 172 | 1.1728 | 0.4403 | 1.1728 | 1.0830 |
| No log | 1.5818 | 174 | 1.2666 | 0.3226 | 1.2666 | 1.1254 |
| No log | 1.6 | 176 | 1.1904 | 0.4055 | 1.1904 | 1.0910 |
| No log | 1.6182 | 178 | 1.0550 | 0.6414 | 1.0550 | 1.0271 |
| No log | 1.6364 | 180 | 0.9029 | 0.6309 | 0.9029 | 0.9502 |
| No log | 1.6545 | 182 | 0.8635 | 0.6435 | 0.8635 | 0.9293 |
| No log | 1.6727 | 184 | 0.8544 | 0.6940 | 0.8544 | 0.9243 |
| No log | 1.6909 | 186 | 0.8512 | 0.6752 | 0.8512 | 0.9226 |
| No log | 1.7091 | 188 | 0.8576 | 0.6702 | 0.8576 | 0.9261 |
| No log | 1.7273 | 190 | 0.8514 | 0.6194 | 0.8514 | 0.9227 |
| No log | 1.7455 | 192 | 0.9371 | 0.5664 | 0.9371 | 0.9680 |
| No log | 1.7636 | 194 | 1.1680 | 0.4592 | 1.1680 | 1.0807 |
| No log | 1.7818 | 196 | 1.4813 | 0.3165 | 1.4813 | 1.2171 |
| No log | 1.8 | 198 | 1.5343 | 0.1857 | 1.5343 | 1.2387 |
| No log | 1.8182 | 200 | 1.6211 | 0.0891 | 1.6211 | 1.2732 |
| No log | 1.8364 | 202 | 1.5875 | 0.2240 | 1.5875 | 1.2600 |
| No log | 1.8545 | 204 | 1.8079 | 0.0342 | 1.8079 | 1.3446 |
| No log | 1.8727 | 206 | 1.7946 | 0.1456 | 1.7946 | 1.3396 |
| No log | 1.8909 | 208 | 1.7986 | 0.0982 | 1.7986 | 1.3411 |
| No log | 1.9091 | 210 | 1.6959 | 0.3140 | 1.6959 | 1.3023 |
| No log | 1.9273 | 212 | 1.9129 | 0.2304 | 1.9129 | 1.3831 |
| No log | 1.9455 | 214 | 1.7066 | 0.3214 | 1.7066 | 1.3064 |
| No log | 1.9636 | 216 | 1.4904 | 0.3214 | 1.4904 | 1.2208 |
| No log | 1.9818 | 218 | 1.2839 | 0.5532 | 1.2839 | 1.1331 |
| No log | 2.0 | 220 | 1.1310 | 0.6194 | 1.1310 | 1.0635 |
| No log | 2.0182 | 222 | 0.9801 | 0.6153 | 0.9801 | 0.9900 |
| No log | 2.0364 | 224 | 0.9150 | 0.6473 | 0.9150 | 0.9565 |
| No log | 2.0545 | 226 | 0.9185 | 0.6473 | 0.9185 | 0.9584 |
| No log | 2.0727 | 228 | 0.9063 | 0.6182 | 0.9063 | 0.9520 |
| No log | 2.0909 | 230 | 0.8519 | 0.6182 | 0.8519 | 0.9230 |
| No log | 2.1091 | 232 | 0.7556 | 0.7525 | 0.7556 | 0.8693 |
| No log | 2.1273 | 234 | 0.8509 | 0.6254 | 0.8509 | 0.9225 |
| No log | 2.1455 | 236 | 1.0140 | 0.4345 | 1.0140 | 1.0070 |
| No log | 2.1636 | 238 | 0.9198 | 0.5607 | 0.9198 | 0.9591 |
| No log | 2.1818 | 240 | 0.7740 | 0.7258 | 0.7740 | 0.8798 |
| No log | 2.2 | 242 | 0.7330 | 0.7346 | 0.7330 | 0.8561 |
| No log | 2.2182 | 244 | 0.7964 | 0.6622 | 0.7964 | 0.8924 |
| No log | 2.2364 | 246 | 0.9886 | 0.6836 | 0.9886 | 0.9943 |
| No log | 2.2545 | 248 | 1.1575 | 0.6091 | 1.1575 | 1.0759 |
| No log | 2.2727 | 250 | 1.2043 | 0.5642 | 1.2043 | 1.0974 |
| No log | 2.2909 | 252 | 1.1822 | 0.5869 | 1.1822 | 1.0873 |
| No log | 2.3091 | 254 | 1.1321 | 0.6147 | 1.1321 | 1.0640 |
| No log | 2.3273 | 256 | 1.1822 | 0.6341 | 1.1822 | 1.0873 |
| No log | 2.3455 | 258 | 1.2560 | 0.6083 | 1.2560 | 1.1207 |
| No log | 2.3636 | 260 | 1.3150 | 0.5995 | 1.3150 | 1.1467 |
| No log | 2.3818 | 262 | 1.2509 | 0.5370 | 1.2509 | 1.1185 |
| No log | 2.4 | 264 | 1.1584 | 0.5370 | 1.1584 | 1.0763 |
| No log | 2.4182 | 266 | 1.0637 | 0.5728 | 1.0637 | 1.0313 |
| No log | 2.4364 | 268 | 0.9340 | 0.6206 | 0.9340 | 0.9664 |
| No log | 2.4545 | 270 | 0.8390 | 0.7445 | 0.8390 | 0.9160 |
| No log | 2.4727 | 272 | 0.8074 | 0.7277 | 0.8074 | 0.8986 |
| No log | 2.4909 | 274 | 0.7941 | 0.7529 | 0.7941 | 0.8911 |
| No log | 2.5091 | 276 | 0.7918 | 0.7782 | 0.7918 | 0.8898 |
| No log | 2.5273 | 278 | 0.8378 | 0.6142 | 0.8378 | 0.9153 |
| No log | 2.5455 | 280 | 0.9513 | 0.5430 | 0.9513 | 0.9754 |
| No log | 2.5636 | 282 | 0.9526 | 0.6175 | 0.9526 | 0.9760 |
| No log | 2.5818 | 284 | 0.8819 | 0.6633 | 0.8819 | 0.9391 |
| No log | 2.6 | 286 | 0.7952 | 0.7191 | 0.7952 | 0.8917 |
| No log | 2.6182 | 288 | 0.7597 | 0.6799 | 0.7597 | 0.8716 |
| No log | 2.6364 | 290 | 0.7641 | 0.6525 | 0.7641 | 0.8741 |
| No log | 2.6545 | 292 | 0.8114 | 0.5973 | 0.8114 | 0.9008 |
| No log | 2.6727 | 294 | 0.8989 | 0.5933 | 0.8989 | 0.9481 |
| No log | 2.6909 | 296 | 0.9505 | 0.5374 | 0.9505 | 0.9750 |
| No log | 2.7091 | 298 | 0.9907 | 0.5532 | 0.9907 | 0.9953 |
| No log | 2.7273 | 300 | 0.9237 | 0.5374 | 0.9237 | 0.9611 |
| No log | 2.7455 | 302 | 0.8052 | 0.5973 | 0.8052 | 0.8974 |
| No log | 2.7636 | 304 | 0.7395 | 0.5973 | 0.7395 | 0.8599 |
| No log | 2.7818 | 306 | 0.7306 | 0.5973 | 0.7306 | 0.8548 |
| No log | 2.8 | 308 | 0.7927 | 0.5973 | 0.7927 | 0.8904 |
| No log | 2.8182 | 310 | 0.8850 | 0.5933 | 0.8850 | 0.9408 |
| No log | 2.8364 | 312 | 0.9574 | 0.5 | 0.9574 | 0.9785 |
| No log | 2.8545 | 314 | 0.8890 | 0.5933 | 0.8890 | 0.9429 |
| No log | 2.8727 | 316 | 0.7422 | 0.5973 | 0.7422 | 0.8615 |
| No log | 2.8909 | 318 | 0.7185 | 0.6015 | 0.7185 | 0.8477 |
| No log | 2.9091 | 320 | 0.7653 | 0.5973 | 0.7653 | 0.8748 |
| No log | 2.9273 | 322 | 0.7966 | 0.5933 | 0.7966 | 0.8925 |
| No log | 2.9455 | 324 | 0.7831 | 0.5933 | 0.7831 | 0.8849 |
| No log | 2.9636 | 326 | 0.7600 | 0.6569 | 0.7600 | 0.8718 |
| No log | 2.9818 | 328 | 0.7800 | 0.6569 | 0.7800 | 0.8832 |
| No log | 3.0 | 330 | 0.7847 | 0.6519 | 0.7847 | 0.8858 |
| No log | 3.0182 | 332 | 0.8224 | 0.6309 | 0.8224 | 0.9068 |
| No log | 3.0364 | 334 | 0.8226 | 0.6309 | 0.8226 | 0.9070 |
| No log | 3.0545 | 336 | 0.8167 | 0.6309 | 0.8167 | 0.9037 |
| No log | 3.0727 | 338 | 0.8273 | 0.5973 | 0.8273 | 0.9096 |
| No log | 3.0909 | 340 | 0.7866 | 0.6363 | 0.7866 | 0.8869 |
| No log | 3.1091 | 342 | 0.7663 | 0.7020 | 0.7663 | 0.8754 |
| No log | 3.1273 | 344 | 0.7411 | 0.7063 | 0.7411 | 0.8608 |
| No log | 3.1455 | 346 | 0.7771 | 0.7020 | 0.7771 | 0.8815 |
| No log | 3.1636 | 348 | 0.8002 | 0.5435 | 0.8002 | 0.8945 |
| No log | 3.1818 | 350 | 0.7787 | 0.5435 | 0.7787 | 0.8824 |
| No log | 3.2 | 352 | 0.7170 | 0.5435 | 0.7170 | 0.8468 |
| No log | 3.2182 | 354 | 0.6898 | 0.6061 | 0.6898 | 0.8305 |
| No log | 3.2364 | 356 | 0.7298 | 0.6053 | 0.7298 | 0.8543 |
| No log | 3.2545 | 358 | 0.7823 | 0.6008 | 0.7823 | 0.8845 |
| No log | 3.2727 | 360 | 0.7542 | 0.5965 | 0.7542 | 0.8685 |
| No log | 3.2909 | 362 | 0.7651 | 0.5556 | 0.7651 | 0.8747 |
| No log | 3.3091 | 364 | 0.7574 | 0.5393 | 0.7574 | 0.8703 |
| No log | 3.3273 | 366 | 0.7169 | 0.6497 | 0.7169 | 0.8467 |
| No log | 3.3455 | 368 | 0.7241 | 0.6907 | 0.7241 | 0.8509 |
| No log | 3.3636 | 370 | 0.7455 | 0.7264 | 0.7455 | 0.8634 |
| No log | 3.3818 | 372 | 0.7909 | 0.6816 | 0.7909 | 0.8893 |
| No log | 3.4 | 374 | 0.8785 | 0.6446 | 0.8785 | 0.9373 |
| No log | 3.4182 | 376 | 0.8526 | 0.6376 | 0.8526 | 0.9233 |
| No log | 3.4364 | 378 | 0.8738 | 0.6376 | 0.8738 | 0.9348 |
| No log | 3.4545 | 380 | 0.8011 | 0.5926 | 0.8011 | 0.8950 |
| No log | 3.4727 | 382 | 0.7221 | 0.6680 | 0.7221 | 0.8498 |
| No log | 3.4909 | 384 | 0.6728 | 0.6680 | 0.6728 | 0.8202 |
| No log | 3.5091 | 386 | 0.6266 | 0.6686 | 0.6266 | 0.7916 |
| No log | 3.5273 | 388 | 0.6709 | 0.5973 | 0.6709 | 0.8191 |
| No log | 3.5455 | 390 | 0.7137 | 0.6875 | 0.7137 | 0.8448 |
| No log | 3.5636 | 392 | 0.7782 | 0.6933 | 0.7782 | 0.8822 |
| No log | 3.5818 | 394 | 0.8630 | 0.6933 | 0.8630 | 0.9290 |
| No log | 3.6 | 396 | 0.9473 | 0.5965 | 0.9473 | 0.9733 |
| No log | 3.6182 | 398 | 0.9127 | 0.6407 | 0.9127 | 0.9554 |
| No log | 3.6364 | 400 | 0.7845 | 0.6746 | 0.7845 | 0.8857 |
| No log | 3.6545 | 402 | 0.6754 | 0.7435 | 0.6754 | 0.8218 |
| No log | 3.6727 | 404 | 0.6576 | 0.7447 | 0.6576 | 0.8110 |
| No log | 3.6909 | 406 | 0.6757 | 0.7529 | 0.6757 | 0.8220 |
| No log | 3.7091 | 408 | 0.6800 | 0.7373 | 0.6800 | 0.8246 |
| No log | 3.7273 | 410 | 0.6872 | 0.7879 | 0.6872 | 0.8290 |
| No log | 3.7455 | 412 | 0.7772 | 0.7433 | 0.7772 | 0.8816 |
| No log | 3.7636 | 414 | 0.9417 | 0.5030 | 0.9417 | 0.9704 |
| No log | 3.7818 | 416 | 0.9794 | 0.4626 | 0.9794 | 0.9896 |
| No log | 3.8 | 418 | 0.8979 | 0.6085 | 0.8979 | 0.9476 |
| No log | 3.8182 | 420 | 0.7697 | 0.6917 | 0.7697 | 0.8773 |
| No log | 3.8364 | 422 | 0.7075 | 0.7779 | 0.7075 | 0.8412 |
| No log | 3.8545 | 424 | 0.7098 | 0.7605 | 0.7098 | 0.8425 |
| No log | 3.8727 | 426 | 0.7108 | 0.7689 | 0.7108 | 0.8431 |
| No log | 3.8909 | 428 | 0.7132 | 0.7597 | 0.7132 | 0.8445 |
| No log | 3.9091 | 430 | 0.7571 | 0.7273 | 0.7571 | 0.8701 |
| No log | 3.9273 | 432 | 0.8062 | 0.6528 | 0.8062 | 0.8979 |
| No log | 3.9455 | 434 | 0.7940 | 0.6535 | 0.7940 | 0.8910 |
| No log | 3.9636 | 436 | 0.7759 | 0.6535 | 0.7759 | 0.8809 |
| No log | 3.9818 | 438 | 0.7459 | 0.7512 | 0.7459 | 0.8636 |
| No log | 4.0 | 440 | 0.7293 | 0.7354 | 0.7293 | 0.8540 |
| No log | 4.0182 | 442 | 0.7393 | 0.7672 | 0.7393 | 0.8598 |
| No log | 4.0364 | 444 | 0.7823 | 0.7574 | 0.7823 | 0.8844 |
| No log | 4.0545 | 446 | 0.8424 | 0.6712 | 0.8424 | 0.9178 |
| No log | 4.0727 | 448 | 0.8862 | 0.6712 | 0.8862 | 0.9414 |
| No log | 4.0909 | 450 | 0.8482 | 0.6364 | 0.8482 | 0.9210 |
| No log | 4.1091 | 452 | 0.7696 | 0.6974 | 0.7696 | 0.8773 |
| No log | 4.1273 | 454 | 0.7171 | 0.6519 | 0.7171 | 0.8468 |
| No log | 4.1455 | 456 | 0.7044 | 0.7518 | 0.7044 | 0.8393 |
| No log | 4.1636 | 458 | 0.6858 | 0.7689 | 0.6858 | 0.8282 |
| No log | 4.1818 | 460 | 0.6654 | 0.7437 | 0.6654 | 0.8157 |
| No log | 4.2 | 462 | 0.6639 | 0.7090 | 0.6639 | 0.8148 |
| No log | 4.2182 | 464 | 0.7065 | 0.6363 | 0.7065 | 0.8405 |
| No log | 4.2364 | 466 | 0.7493 | 0.6382 | 0.7493 | 0.8656 |
| No log | 4.2545 | 468 | 0.7327 | 0.6382 | 0.7327 | 0.8560 |
| No log | 4.2727 | 470 | 0.7490 | 0.6816 | 0.7490 | 0.8654 |
| No log | 4.2909 | 472 | 0.8173 | 0.6905 | 0.8173 | 0.9040 |
| No log | 4.3091 | 474 | 0.8251 | 0.6905 | 0.8251 | 0.9083 |
| No log | 4.3273 | 476 | 0.7393 | 0.6816 | 0.7393 | 0.8598 |
| No log | 4.3455 | 478 | 0.6540 | 0.7129 | 0.6540 | 0.8087 |
| No log | 4.3636 | 480 | 0.6004 | 0.7327 | 0.6004 | 0.7749 |
| No log | 4.3818 | 482 | 0.5971 | 0.7327 | 0.5971 | 0.7727 |
| No log | 4.4 | 484 | 0.6123 | 0.7327 | 0.6123 | 0.7825 |
| No log | 4.4182 | 486 | 0.6598 | 0.6940 | 0.6598 | 0.8123 |
| No log | 4.4364 | 488 | 0.7025 | 0.7333 | 0.7025 | 0.8382 |
| No log | 4.4545 | 490 | 0.7119 | 0.6967 | 0.7119 | 0.8438 |
| No log | 4.4727 | 492 | 0.6748 | 0.6940 | 0.6748 | 0.8215 |
| No log | 4.4909 | 494 | 0.6547 | 0.6616 | 0.6547 | 0.8092 |
| No log | 4.5091 | 496 | 0.6634 | 0.6616 | 0.6634 | 0.8145 |
| No log | 4.5273 | 498 | 0.6767 | 0.6616 | 0.6767 | 0.8226 |
| 0.6876 | 4.5455 | 500 | 0.6623 | 0.6940 | 0.6623 | 0.8138 |
| 0.6876 | 4.5636 | 502 | 0.6378 | 0.7327 | 0.6378 | 0.7986 |
| 0.6876 | 4.5818 | 504 | 0.6379 | 0.7244 | 0.6379 | 0.7987 |
| 0.6876 | 4.6 | 506 | 0.6585 | 0.7437 | 0.6585 | 0.8115 |
| 0.6876 | 4.6182 | 508 | 0.7140 | 0.7590 | 0.7140 | 0.8450 |
| 0.6876 | 4.6364 | 510 | 0.8340 | 0.6894 | 0.8340 | 0.9132 |
| 0.6876 | 4.6545 | 512 | 1.0043 | 0.6311 | 1.0043 | 1.0021 |
| 0.6876 | 4.6727 | 514 | 1.0688 | 0.6311 | 1.0688 | 1.0338 |
| 0.6876 | 4.6909 | 516 | 1.0381 | 0.5925 | 1.0381 | 1.0189 |
| 0.6876 | 4.7091 | 518 | 0.9126 | 0.6204 | 0.9126 | 0.9553 |
| 0.6876 | 4.7273 | 520 | 0.7435 | 0.72 | 0.7435 | 0.8622 |
| 0.6876 | 4.7455 | 522 | 0.6535 | 0.7437 | 0.6535 | 0.8084 |
| 0.6876 | 4.7636 | 524 | 0.6340 | 0.7437 | 0.6340 | 0.7962 |
| 0.6876 | 4.7818 | 526 | 0.6281 | 0.7437 | 0.6281 | 0.7925 |
| 0.6876 | 4.8 | 528 | 0.6398 | 0.7437 | 0.6398 | 0.7999 |
| 0.6876 | 4.8182 | 530 | 0.6869 | 0.7086 | 0.6869 | 0.8288 |
| 0.6876 | 4.8364 | 532 | 0.7779 | 0.6115 | 0.7779 | 0.8820 |
| 0.6876 | 4.8545 | 534 | 0.8382 | 0.5686 | 0.8382 | 0.9155 |
| 0.6876 | 4.8727 | 536 | 0.8223 | 0.6123 | 0.8223 | 0.9068 |
| 0.6876 | 4.8909 | 538 | 0.7636 | 0.7076 | 0.7636 | 0.8738 |
| 0.6876 | 4.9091 | 540 | 0.6915 | 0.7086 | 0.6915 | 0.8316 |
| 0.6876 | 4.9273 | 542 | 0.6571 | 0.6856 | 0.6571 | 0.8106 |
| 0.6876 | 4.9455 | 544 | 0.6735 | 0.7136 | 0.6735 | 0.8207 |
| 0.6876 | 4.9636 | 546 | 0.7103 | 0.7354 | 0.7103 | 0.8428 |
| 0.6876 | 4.9818 | 548 | 0.7947 | 0.6918 | 0.7947 | 0.8915 |
| 0.6876 | 5.0 | 550 | 0.8166 | 0.6808 | 0.8166 | 0.9037 |
| 0.6876 | 5.0182 | 552 | 0.7686 | 0.6918 | 0.7686 | 0.8767 |
| 0.6876 | 5.0364 | 554 | 0.7005 | 0.7354 | 0.7005 | 0.8370 |
| 0.6876 | 5.0545 | 556 | 0.6386 | 0.7273 | 0.6386 | 0.7991 |
| 0.6876 | 5.0727 | 558 | 0.6071 | 0.7437 | 0.6071 | 0.7792 |
| 0.6876 | 5.0909 | 560 | 0.5938 | 0.7437 | 0.5938 | 0.7706 |
| 0.6876 | 5.1091 | 562 | 0.5899 | 0.7948 | 0.5899 | 0.7681 |
| 0.6876 | 5.1273 | 564 | 0.6121 | 0.7162 | 0.6121 | 0.7824 |
| 0.6876 | 5.1455 | 566 | 0.6581 | 0.7014 | 0.6581 | 0.8112 |
| 0.6876 | 5.1636 | 568 | 0.7238 | 0.6356 | 0.7238 | 0.8507 |
| 0.6876 | 5.1818 | 570 | 0.7475 | 0.6356 | 0.7475 | 0.8646 |
| 0.6876 | 5.2 | 572 | 0.7109 | 0.6309 | 0.7109 | 0.8432 |
| 0.6876 | 5.2182 | 574 | 0.6587 | 0.7602 | 0.6587 | 0.8116 |
| 0.6876 | 5.2364 | 576 | 0.6040 | 0.7437 | 0.6040 | 0.7772 |
| 0.6876 | 5.2545 | 578 | 0.5835 | 0.7437 | 0.5835 | 0.7639 |
| 0.6876 | 5.2727 | 580 | 0.5789 | 0.7437 | 0.5789 | 0.7609 |
| 0.6876 | 5.2909 | 582 | 0.5738 | 0.7437 | 0.5738 | 0.7575 |
| 0.6876 | 5.3091 | 584 | 0.5907 | 0.7167 | 0.5907 | 0.7686 |
| 0.6876 | 5.3273 | 586 | 0.6552 | 0.7696 | 0.6552 | 0.8094 |
| 0.6876 | 5.3455 | 588 | 0.7338 | 0.6356 | 0.7338 | 0.8566 |
| 0.6876 | 5.3636 | 590 | 0.7948 | 0.6287 | 0.7948 | 0.8915 |
| 0.6876 | 5.3818 | 592 | 0.7916 | 0.6571 | 0.7916 | 0.8897 |
| 0.6876 | 5.4 | 594 | 0.7440 | 0.7366 | 0.7440 | 0.8626 |
| 0.6876 | 5.4182 | 596 | 0.7299 | 0.7583 | 0.7299 | 0.8543 |
| 0.6876 | 5.4364 | 598 | 0.7386 | 0.7846 | 0.7386 | 0.8594 |
| 0.6876 | 5.4545 | 600 | 0.7512 | 0.7846 | 0.7512 | 0.8667 |
| 0.6876 | 5.4727 | 602 | 0.7662 | 0.7827 | 0.7662 | 0.8753 |
| 0.6876 | 5.4909 | 604 | 0.7832 | 0.7506 | 0.7832 | 0.8850 |
| 0.6876 | 5.5091 | 606 | 0.7889 | 0.7506 | 0.7889 | 0.8882 |
| 0.6876 | 5.5273 | 608 | 0.7417 | 0.7506 | 0.7417 | 0.8612 |
| 0.6876 | 5.5455 | 610 | 0.6804 | 0.7846 | 0.6804 | 0.8248 |
| 0.6876 | 5.5636 | 612 | 0.6426 | 0.7590 | 0.6426 | 0.8016 |
| 0.6876 | 5.5818 | 614 | 0.6414 | 0.7358 | 0.6414 | 0.8009 |
| 0.6876 | 5.6 | 616 | 0.6629 | 0.8006 | 0.6629 | 0.8142 |
| 0.6876 | 5.6182 | 618 | 0.6737 | 0.7511 | 0.6737 | 0.8208 |
| 0.6876 | 5.6364 | 620 | 0.6507 | 0.7511 | 0.6507 | 0.8066 |
| 0.6876 | 5.6545 | 622 | 0.6179 | 0.7898 | 0.6179 | 0.7861 |
| 0.6876 | 5.6727 | 624 | 0.5954 | 0.7793 | 0.5954 | 0.7716 |
| 0.6876 | 5.6909 | 626 | 0.5878 | 0.7167 | 0.5878 | 0.7667 |
| 0.6876 | 5.7091 | 628 | 0.5794 | 0.7167 | 0.5794 | 0.7612 |
| 0.6876 | 5.7273 | 630 | 0.5775 | 0.7167 | 0.5775 | 0.7600 |
| 0.6876 | 5.7455 | 632 | 0.6049 | 0.7692 | 0.6049 | 0.7778 |
| 0.6876 | 5.7636 | 634 | 0.6517 | 0.8009 | 0.6517 | 0.8073 |
| 0.6876 | 5.7818 | 636 | 0.7138 | 0.8058 | 0.7138 | 0.8449 |
| 0.6876 | 5.8 | 638 | 0.7588 | 0.8058 | 0.7588 | 0.8711 |
| 0.6876 | 5.8182 | 640 | 0.7619 | 0.8058 | 0.7619 | 0.8729 |
| 0.6876 | 5.8364 | 642 | 0.7474 | 0.8058 | 0.7474 | 0.8645 |
| 0.6876 | 5.8545 | 644 | 0.7664 | 0.8058 | 0.7664 | 0.8755 |
| 0.6876 | 5.8727 | 646 | 0.7428 | 0.8144 | 0.7428 | 0.8618 |
| 0.6876 | 5.8909 | 648 | 0.7151 | 0.8238 | 0.7151 | 0.8457 |
| 0.6876 | 5.9091 | 650 | 0.6960 | 0.8238 | 0.6960 | 0.8343 |
| 0.6876 | 5.9273 | 652 | 0.7086 | 0.8238 | 0.7086 | 0.8418 |
| 0.6876 | 5.9455 | 654 | 0.7094 | 0.8032 | 0.7094 | 0.8423 |
| 0.6876 | 5.9636 | 656 | 0.7483 | 0.8032 | 0.7483 | 0.8650 |
| 0.6876 | 5.9818 | 658 | 0.7331 | 0.8032 | 0.7331 | 0.8562 |
| 0.6876 | 6.0 | 660 | 0.7217 | 0.8128 | 0.7217 | 0.8495 |
| 0.6876 | 6.0182 | 662 | 0.6953 | 0.7838 | 0.6953 | 0.8338 |
| 0.6876 | 6.0364 | 664 | 0.7156 | 0.8032 | 0.7156 | 0.8459 |
| 0.6876 | 6.0545 | 666 | 0.7378 | 0.8032 | 0.7378 | 0.8589 |
| 0.6876 | 6.0727 | 668 | 0.7460 | 0.8032 | 0.7460 | 0.8637 |
| 0.6876 | 6.0909 | 670 | 0.7646 | 0.8238 | 0.7646 | 0.8744 |
| 0.6876 | 6.1091 | 672 | 0.7576 | 0.8238 | 0.7576 | 0.8704 |
| 0.6876 | 6.1273 | 674 | 0.7600 | 0.8128 | 0.7600 | 0.8718 |
| 0.6876 | 6.1455 | 676 | 0.7682 | 0.8128 | 0.7682 | 0.8765 |
| 0.6876 | 6.1636 | 678 | 0.7699 | 0.7905 | 0.7699 | 0.8774 |
| 0.6876 | 6.1818 | 680 | 0.7491 | 0.7905 | 0.7491 | 0.8655 |
| 0.6876 | 6.2 | 682 | 0.7044 | 0.7211 | 0.7044 | 0.8393 |
| 0.6876 | 6.2182 | 684 | 0.6654 | 0.7511 | 0.6654 | 0.8157 |
| 0.6876 | 6.2364 | 686 | 0.6392 | 0.6940 | 0.6392 | 0.7995 |
| 0.6876 | 6.2545 | 688 | 0.6306 | 0.6871 | 0.6306 | 0.7941 |
| 0.6876 | 6.2727 | 690 | 0.6358 | 0.6871 | 0.6358 | 0.7973 |
| 0.6876 | 6.2909 | 692 | 0.6401 | 0.6871 | 0.6401 | 0.8001 |
| 0.6876 | 6.3091 | 694 | 0.6467 | 0.6871 | 0.6467 | 0.8042 |
| 0.6876 | 6.3273 | 696 | 0.6770 | 0.6940 | 0.6770 | 0.8228 |
| 0.6876 | 6.3455 | 698 | 0.7129 | 0.7586 | 0.7129 | 0.8443 |
| 0.6876 | 6.3636 | 700 | 0.7305 | 0.7579 | 0.7305 | 0.8547 |
| 0.6876 | 6.3818 | 702 | 0.7179 | 0.7505 | 0.7179 | 0.8473 |
| 0.6876 | 6.4 | 704 | 0.7214 | 0.7505 | 0.7214 | 0.8494 |
| 0.6876 | 6.4182 | 706 | 0.6973 | 0.7505 | 0.6973 | 0.8351 |
| 0.6876 | 6.4364 | 708 | 0.6434 | 0.7514 | 0.6434 | 0.8021 |
| 0.6876 | 6.4545 | 710 | 0.6176 | 0.7167 | 0.6176 | 0.7858 |
| 0.6876 | 6.4727 | 712 | 0.6166 | 0.7244 | 0.6166 | 0.7852 |
| 0.6876 | 6.4909 | 714 | 0.6534 | 0.7514 | 0.6534 | 0.8083 |
| 0.6876 | 6.5091 | 716 | 0.6903 | 0.7720 | 0.6903 | 0.8308 |
| 0.6876 | 6.5273 | 718 | 0.7198 | 0.7720 | 0.7198 | 0.8484 |
| 0.6876 | 6.5455 | 720 | 0.7206 | 0.7720 | 0.7206 | 0.8489 |
| 0.6876 | 6.5636 | 722 | 0.6966 | 0.7647 | 0.6966 | 0.8346 |
| 0.6876 | 6.5818 | 724 | 0.6741 | 0.7510 | 0.6741 | 0.8210 |
| 0.6876 | 6.6 | 726 | 0.6466 | 0.7277 | 0.6466 | 0.8041 |
| 0.6876 | 6.6182 | 728 | 0.6444 | 0.7277 | 0.6444 | 0.8028 |
| 0.6876 | 6.6364 | 730 | 0.6603 | 0.7288 | 0.6603 | 0.8126 |
| 0.6876 | 6.6545 | 732 | 0.6836 | 0.7510 | 0.6836 | 0.8268 |
| 0.6876 | 6.6727 | 734 | 0.7269 | 0.7647 | 0.7269 | 0.8526 |
| 0.6876 | 6.6909 | 736 | 0.7978 | 0.7304 | 0.7978 | 0.8932 |
| 0.6876 | 6.7091 | 738 | 0.8483 | 0.7838 | 0.8483 | 0.9210 |
| 0.6876 | 6.7273 | 740 | 0.8769 | 0.7637 | 0.8769 | 0.9365 |
| 0.6876 | 6.7455 | 742 | 0.8354 | 0.7647 | 0.8354 | 0.9140 |
| 0.6876 | 6.7636 | 744 | 0.7894 | 0.7647 | 0.7894 | 0.8885 |
| 0.6876 | 6.7818 | 746 | 0.7319 | 0.7579 | 0.7319 | 0.8555 |
| 0.6876 | 6.8 | 748 | 0.6787 | 0.7269 | 0.6787 | 0.8238 |
| 0.6876 | 6.8182 | 750 | 0.6594 | 0.6871 | 0.6594 | 0.8120 |
| 0.6876 | 6.8364 | 752 | 0.6528 | 0.6871 | 0.6528 | 0.8080 |
| 0.6876 | 6.8545 | 754 | 0.6682 | 0.6940 | 0.6682 | 0.8174 |
| 0.6876 | 6.8727 | 756 | 0.7005 | 0.7838 | 0.7005 | 0.8370 |
| 0.6876 | 6.8909 | 758 | 0.7460 | 0.7502 | 0.7460 | 0.8637 |
| 0.6876 | 6.9091 | 760 | 0.7426 | 0.7745 | 0.7426 | 0.8617 |
| 0.6876 | 6.9273 | 762 | 0.7061 | 0.7838 | 0.7061 | 0.8403 |
| 0.6876 | 6.9455 | 764 | 0.6592 | 0.6940 | 0.6592 | 0.8119 |
| 0.6876 | 6.9636 | 766 | 0.6385 | 0.7162 | 0.6385 | 0.7990 |
| 0.6876 | 6.9818 | 768 | 0.6458 | 0.7090 | 0.6458 | 0.8036 |
| 0.6876 | 7.0 | 770 | 0.6715 | 0.7511 | 0.6715 | 0.8194 |
| 0.6876 | 7.0182 | 772 | 0.7116 | 0.7973 | 0.7116 | 0.8436 |
| 0.6876 | 7.0364 | 774 | 0.7877 | 0.7647 | 0.7877 | 0.8875 |
| 0.6876 | 7.0545 | 776 | 0.8465 | 0.7204 | 0.8465 | 0.9201 |
| 0.6876 | 7.0727 | 778 | 0.8483 | 0.7647 | 0.8483 | 0.9210 |
| 0.6876 | 7.0909 | 780 | 0.8338 | 0.7647 | 0.8338 | 0.9131 |
| 0.6876 | 7.1091 | 782 | 0.7996 | 0.7647 | 0.7996 | 0.8942 |
| 0.6876 | 7.1273 | 784 | 0.7948 | 0.7647 | 0.7948 | 0.8915 |
| 0.6876 | 7.1455 | 786 | 0.7632 | 0.7647 | 0.7632 | 0.8736 |
| 0.6876 | 7.1636 | 788 | 0.7133 | 0.7594 | 0.7133 | 0.8446 |
| 0.6876 | 7.1818 | 790 | 0.6666 | 0.7115 | 0.6666 | 0.8164 |
| 0.6876 | 7.2 | 792 | 0.6422 | 0.7090 | 0.6422 | 0.8014 |
| 0.6876 | 7.2182 | 794 | 0.6246 | 0.6877 | 0.6246 | 0.7903 |
| 0.6876 | 7.2364 | 796 | 0.6140 | 0.7167 | 0.6140 | 0.7836 |
| 0.6876 | 7.2545 | 798 | 0.6166 | 0.6877 | 0.6166 | 0.7852 |
| 0.6876 | 7.2727 | 800 | 0.6341 | 0.7614 | 0.6341 | 0.7963 |
| 0.6876 | 7.2909 | 802 | 0.6694 | 0.7511 | 0.6694 | 0.8182 |
| 0.6876 | 7.3091 | 804 | 0.7064 | 0.7123 | 0.7064 | 0.8405 |
| 0.6876 | 7.3273 | 806 | 0.7207 | 0.7504 | 0.7207 | 0.8489 |
| 0.6876 | 7.3455 | 808 | 0.6990 | 0.7764 | 0.6990 | 0.8361 |
| 0.6876 | 7.3636 | 810 | 0.6688 | 0.7764 | 0.6688 | 0.8178 |
| 0.6876 | 7.3818 | 812 | 0.6480 | 0.6866 | 0.6480 | 0.8050 |
| 0.6876 | 7.4 | 814 | 0.6315 | 0.6940 | 0.6315 | 0.7947 |
| 0.6876 | 7.4182 | 816 | 0.6225 | 0.6871 | 0.6225 | 0.7890 |
| 0.6876 | 7.4364 | 818 | 0.6384 | 0.6871 | 0.6384 | 0.7990 |
| 0.6876 | 7.4545 | 820 | 0.6653 | 0.7269 | 0.6653 | 0.8157 |
| 0.6876 | 7.4727 | 822 | 0.7025 | 0.8032 | 0.7025 | 0.8381 |
| 0.6876 | 7.4909 | 824 | 0.7401 | 0.8032 | 0.7401 | 0.8603 |
| 0.6876 | 7.5091 | 826 | 0.7679 | 0.7647 | 0.7679 | 0.8763 |
| 0.6876 | 7.5273 | 828 | 0.7650 | 0.7647 | 0.7650 | 0.8747 |
| 0.6876 | 7.5455 | 830 | 0.7406 | 0.8032 | 0.7406 | 0.8606 |
| 0.6876 | 7.5636 | 832 | 0.7166 | 0.8032 | 0.7166 | 0.8465 |
| 0.6876 | 7.5818 | 834 | 0.7083 | 0.8128 | 0.7083 | 0.8416 |
| 0.6876 | 7.6 | 836 | 0.6977 | 0.7590 | 0.6977 | 0.8353 |
| 0.6876 | 7.6182 | 838 | 0.6791 | 0.7590 | 0.6791 | 0.8241 |
| 0.6876 | 7.6364 | 840 | 0.6654 | 0.7230 | 0.6654 | 0.8157 |
| 0.6876 | 7.6545 | 842 | 0.6651 | 0.7590 | 0.6651 | 0.8155 |
| 0.6876 | 7.6727 | 844 | 0.6792 | 0.7590 | 0.6792 | 0.8241 |
| 0.6876 | 7.6909 | 846 | 0.6876 | 0.7590 | 0.6876 | 0.8292 |
| 0.6876 | 7.7091 | 848 | 0.6878 | 0.8128 | 0.6878 | 0.8293 |
| 0.6876 | 7.7273 | 850 | 0.7027 | 0.7812 | 0.7027 | 0.8383 |
| 0.6876 | 7.7455 | 852 | 0.7256 | 0.7812 | 0.7256 | 0.8518 |
| 0.6876 | 7.7636 | 854 | 0.7371 | 0.7812 | 0.7371 | 0.8586 |
| 0.6876 | 7.7818 | 856 | 0.7538 | 0.7812 | 0.7538 | 0.8682 |
| 0.6876 | 7.8 | 858 | 0.7563 | 0.8032 | 0.7563 | 0.8696 |
| 0.6876 | 7.8182 | 860 | 0.7554 | 0.8032 | 0.7554 | 0.8691 |
| 0.6876 | 7.8364 | 862 | 0.7716 | 0.8032 | 0.7716 | 0.8784 |
| 0.6876 | 7.8545 | 864 | 0.7790 | 0.7647 | 0.7790 | 0.8826 |
| 0.6876 | 7.8727 | 866 | 0.7629 | 0.7647 | 0.7629 | 0.8734 |
| 0.6876 | 7.8909 | 868 | 0.7481 | 0.7647 | 0.7481 | 0.8650 |
| 0.6876 | 7.9091 | 870 | 0.7001 | 0.8032 | 0.7001 | 0.8367 |
| 0.6876 | 7.9273 | 872 | 0.6572 | 0.8032 | 0.6572 | 0.8106 |
| 0.6876 | 7.9455 | 874 | 0.6235 | 0.7674 | 0.6235 | 0.7896 |
| 0.6876 | 7.9636 | 876 | 0.6103 | 0.7674 | 0.6103 | 0.7812 |
| 0.6876 | 7.9818 | 878 | 0.6087 | 0.7674 | 0.6087 | 0.7802 |
| 0.6876 | 8.0 | 880 | 0.6216 | 0.7764 | 0.6216 | 0.7884 |
| 0.6876 | 8.0182 | 882 | 0.6453 | 0.8067 | 0.6453 | 0.8033 |
| 0.6876 | 8.0364 | 884 | 0.6711 | 0.8067 | 0.6711 | 0.8192 |
| 0.6876 | 8.0545 | 886 | 0.6746 | 0.7835 | 0.6746 | 0.8214 |
| 0.6876 | 8.0727 | 888 | 0.6732 | 0.7835 | 0.6732 | 0.8205 |
| 0.6876 | 8.0909 | 890 | 0.6593 | 0.7504 | 0.6593 | 0.8120 |
| 0.6876 | 8.1091 | 892 | 0.6354 | 0.7764 | 0.6354 | 0.7971 |
| 0.6876 | 8.1273 | 894 | 0.6035 | 0.7511 | 0.6035 | 0.7768 |
| 0.6876 | 8.1455 | 896 | 0.5791 | 0.7511 | 0.5791 | 0.7610 |
| 0.6876 | 8.1636 | 898 | 0.5684 | 0.7516 | 0.5684 | 0.7539 |
| 0.6876 | 8.1818 | 900 | 0.5703 | 0.7337 | 0.5703 | 0.7552 |
| 0.6876 | 8.2 | 902 | 0.5851 | 0.7421 | 0.5851 | 0.7649 |
| 0.6876 | 8.2182 | 904 | 0.6078 | 0.7511 | 0.6078 | 0.7796 |
| 0.6876 | 8.2364 | 906 | 0.6197 | 0.7511 | 0.6197 | 0.7872 |
| 0.6876 | 8.2545 | 908 | 0.6349 | 0.7417 | 0.6349 | 0.7968 |
| 0.6876 | 8.2727 | 910 | 0.6467 | 0.7417 | 0.6467 | 0.8042 |
| 0.6876 | 8.2909 | 912 | 0.6336 | 0.7511 | 0.6336 | 0.7960 |
| 0.6876 | 8.3091 | 914 | 0.6154 | 0.7511 | 0.6154 | 0.7845 |
| 0.6876 | 8.3273 | 916 | 0.6074 | 0.7511 | 0.6074 | 0.7794 |
| 0.6876 | 8.3455 | 918 | 0.6054 | 0.7421 | 0.6054 | 0.7781 |
| 0.6876 | 8.3636 | 920 | 0.6158 | 0.7511 | 0.6158 | 0.7847 |
| 0.6876 | 8.3818 | 922 | 0.6213 | 0.7511 | 0.6213 | 0.7882 |
| 0.6876 | 8.4 | 924 | 0.6298 | 0.7511 | 0.6298 | 0.7936 |
| 0.6876 | 8.4182 | 926 | 0.6439 | 0.7511 | 0.6439 | 0.8025 |
| 0.6876 | 8.4364 | 928 | 0.6658 | 0.7211 | 0.6658 | 0.8160 |
| 0.6876 | 8.4545 | 930 | 0.6948 | 0.7123 | 0.6948 | 0.8336 |
| 0.6876 | 8.4727 | 932 | 0.7131 | 0.7123 | 0.7131 | 0.8444 |
| 0.6876 | 8.4909 | 934 | 0.7202 | 0.7504 | 0.7202 | 0.8486 |
| 0.6876 | 8.5091 | 936 | 0.7009 | 0.7123 | 0.7009 | 0.8372 |
| 0.6876 | 8.5273 | 938 | 0.6759 | 0.7417 | 0.6759 | 0.8221 |
| 0.6876 | 8.5455 | 940 | 0.6539 | 0.7511 | 0.6539 | 0.8086 |
| 0.6876 | 8.5636 | 942 | 0.6290 | 0.7511 | 0.6290 | 0.7931 |
| 0.6876 | 8.5818 | 944 | 0.6110 | 0.7421 | 0.6110 | 0.7817 |
| 0.6876 | 8.6 | 946 | 0.5985 | 0.7421 | 0.5985 | 0.7737 |
| 0.6876 | 8.6182 | 948 | 0.5902 | 0.7692 | 0.5902 | 0.7683 |
| 0.6876 | 8.6364 | 950 | 0.5922 | 0.7602 | 0.5922 | 0.7696 |
| 0.6876 | 8.6545 | 952 | 0.5915 | 0.7090 | 0.5915 | 0.7691 |
| 0.6876 | 8.6727 | 954 | 0.5917 | 0.7090 | 0.5917 | 0.7692 |
| 0.6876 | 8.6909 | 956 | 0.5988 | 0.7090 | 0.5988 | 0.7738 |
| 0.6876 | 8.7091 | 958 | 0.6110 | 0.7090 | 0.6110 | 0.7817 |
| 0.6876 | 8.7273 | 960 | 0.6269 | 0.7602 | 0.6269 | 0.7918 |
| 0.6876 | 8.7455 | 962 | 0.6446 | 0.7598 | 0.6446 | 0.8029 |
| 0.6876 | 8.7636 | 964 | 0.6666 | 0.7944 | 0.6666 | 0.8164 |
| 0.6876 | 8.7818 | 966 | 0.6892 | 0.7944 | 0.6892 | 0.8302 |
| 0.6876 | 8.8 | 968 | 0.6944 | 0.7944 | 0.6944 | 0.8333 |
| 0.6876 | 8.8182 | 970 | 0.6868 | 0.7944 | 0.6868 | 0.8287 |
| 0.6876 | 8.8364 | 972 | 0.6699 | 0.7944 | 0.6699 | 0.8185 |
| 0.6876 | 8.8545 | 974 | 0.6459 | 0.7421 | 0.6459 | 0.8037 |
| 0.6876 | 8.8727 | 976 | 0.6302 | 0.7421 | 0.6302 | 0.7938 |
| 0.6876 | 8.8909 | 978 | 0.6228 | 0.7421 | 0.6228 | 0.7892 |
| 0.6876 | 8.9091 | 980 | 0.6195 | 0.7421 | 0.6195 | 0.7871 |
| 0.6876 | 8.9273 | 982 | 0.6129 | 0.6871 | 0.6129 | 0.7829 |
| 0.6876 | 8.9455 | 984 | 0.6042 | 0.6871 | 0.6042 | 0.7773 |
| 0.6876 | 8.9636 | 986 | 0.5986 | 0.6871 | 0.5986 | 0.7737 |
| 0.6876 | 8.9818 | 988 | 0.5982 | 0.6871 | 0.5982 | 0.7734 |
| 0.6876 | 9.0 | 990 | 0.5972 | 0.6871 | 0.5972 | 0.7728 |
| 0.6876 | 9.0182 | 992 | 0.6008 | 0.6871 | 0.6008 | 0.7751 |
| 0.6876 | 9.0364 | 994 | 0.6077 | 0.7421 | 0.6077 | 0.7795 |
| 0.6876 | 9.0545 | 996 | 0.6121 | 0.7421 | 0.6121 | 0.7824 |
| 0.6876 | 9.0727 | 998 | 0.6098 | 0.7421 | 0.6098 | 0.7809 |
| 0.1375 | 9.0909 | 1000 | 0.6093 | 0.7421 | 0.6093 | 0.7806 |
| 0.1375 | 9.1091 | 1002 | 0.6056 | 0.7421 | 0.6056 | 0.7782 |
| 0.1375 | 9.1273 | 1004 | 0.6013 | 0.7421 | 0.6013 | 0.7754 |
| 0.1375 | 9.1455 | 1006 | 0.6015 | 0.7421 | 0.6015 | 0.7756 |
| 0.1375 | 9.1636 | 1008 | 0.6051 | 0.7421 | 0.6051 | 0.7779 |
| 0.1375 | 9.1818 | 1010 | 0.6103 | 0.7421 | 0.6103 | 0.7812 |
| 0.1375 | 9.2 | 1012 | 0.6143 | 0.7421 | 0.6143 | 0.7838 |
| 0.1375 | 9.2182 | 1014 | 0.6232 | 0.7421 | 0.6232 | 0.7894 |
| 0.1375 | 9.2364 | 1016 | 0.6360 | 0.7421 | 0.6360 | 0.7975 |
| 0.1375 | 9.2545 | 1018 | 0.6474 | 0.7511 | 0.6474 | 0.8046 |
| 0.1375 | 9.2727 | 1020 | 0.6611 | 0.7511 | 0.6611 | 0.8131 |
| 0.1375 | 9.2909 | 1022 | 0.6693 | 0.7511 | 0.6693 | 0.8181 |
| 0.1375 | 9.3091 | 1024 | 0.6738 | 0.7511 | 0.6738 | 0.8209 |
| 0.1375 | 9.3273 | 1026 | 0.6718 | 0.7511 | 0.6718 | 0.8197 |
| 0.1375 | 9.3455 | 1028 | 0.6664 | 0.7511 | 0.6664 | 0.8163 |
| 0.1375 | 9.3636 | 1030 | 0.6611 | 0.7511 | 0.6611 | 0.8131 |
| 0.1375 | 9.3818 | 1032 | 0.6550 | 0.7511 | 0.6550 | 0.8093 |
| 0.1375 | 9.4 | 1034 | 0.6551 | 0.7511 | 0.6551 | 0.8094 |
| 0.1375 | 9.4182 | 1036 | 0.6580 | 0.7511 | 0.6580 | 0.8111 |
| 0.1375 | 9.4364 | 1038 | 0.6638 | 0.7764 | 0.6638 | 0.8147 |
| 0.1375 | 9.4545 | 1040 | 0.6656 | 0.8032 | 0.6656 | 0.8159 |
| 0.1375 | 9.4727 | 1042 | 0.6709 | 0.8032 | 0.6709 | 0.8191 |
| 0.1375 | 9.4909 | 1044 | 0.6713 | 0.8032 | 0.6713 | 0.8193 |
| 0.1375 | 9.5091 | 1046 | 0.6691 | 0.8032 | 0.6691 | 0.8180 |
| 0.1375 | 9.5273 | 1048 | 0.6673 | 0.8032 | 0.6673 | 0.8169 |
| 0.1375 | 9.5455 | 1050 | 0.6646 | 0.8032 | 0.6646 | 0.8152 |
| 0.1375 | 9.5636 | 1052 | 0.6596 | 0.7764 | 0.6596 | 0.8122 |
| 0.1375 | 9.5818 | 1054 | 0.6542 | 0.7511 | 0.6542 | 0.8088 |
| 0.1375 | 9.6 | 1056 | 0.6478 | 0.7421 | 0.6478 | 0.8048 |
| 0.1375 | 9.6182 | 1058 | 0.6433 | 0.7421 | 0.6433 | 0.8020 |
| 0.1375 | 9.6364 | 1060 | 0.6389 | 0.7421 | 0.6389 | 0.7993 |
| 0.1375 | 9.6545 | 1062 | 0.6369 | 0.7421 | 0.6369 | 0.7981 |
| 0.1375 | 9.6727 | 1064 | 0.6331 | 0.7421 | 0.6331 | 0.7957 |
| 0.1375 | 9.6909 | 1066 | 0.6308 | 0.7421 | 0.6308 | 0.7942 |
| 0.1375 | 9.7091 | 1068 | 0.6296 | 0.7421 | 0.6296 | 0.7934 |
| 0.1375 | 9.7273 | 1070 | 0.6290 | 0.7421 | 0.6290 | 0.7931 |
| 0.1375 | 9.7455 | 1072 | 0.6277 | 0.7421 | 0.6277 | 0.7923 |
| 0.1375 | 9.7636 | 1074 | 0.6266 | 0.7421 | 0.6266 | 0.7916 |
| 0.1375 | 9.7818 | 1076 | 0.6254 | 0.7421 | 0.6254 | 0.7908 |
| 0.1375 | 9.8 | 1078 | 0.6237 | 0.7421 | 0.6237 | 0.7898 |
| 0.1375 | 9.8182 | 1080 | 0.6241 | 0.7421 | 0.6241 | 0.7900 |
| 0.1375 | 9.8364 | 1082 | 0.6242 | 0.7421 | 0.6242 | 0.7900 |
| 0.1375 | 9.8545 | 1084 | 0.6237 | 0.7421 | 0.6237 | 0.7898 |
| 0.1375 | 9.8727 | 1086 | 0.6237 | 0.7421 | 0.6237 | 0.7897 |
| 0.1375 | 9.8909 | 1088 | 0.6243 | 0.7421 | 0.6243 | 0.7901 |
| 0.1375 | 9.9091 | 1090 | 0.6252 | 0.7421 | 0.6252 | 0.7907 |
| 0.1375 | 9.9273 | 1092 | 0.6262 | 0.7421 | 0.6262 | 0.7913 |
| 0.1375 | 9.9455 | 1094 | 0.6273 | 0.7421 | 0.6273 | 0.7920 |
| 0.1375 | 9.9636 | 1096 | 0.6283 | 0.7421 | 0.6283 | 0.7926 |
| 0.1375 | 9.9818 | 1098 | 0.6289 | 0.7421 | 0.6289 | 0.7930 |
| 0.1375 | 10.0 | 1100 | 0.6292 | 0.7421 | 0.6292 | 0.7932 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k5_task1_organization_fold1 | MayBashendy | 2024-11-24T06:55:56Z | 162 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T18:42:41Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k5_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k5_task1_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set (a minimal inference sketch follows the list):
- Loss: 0.6108
- Qwk: 0.6744
- Mse: 0.6108
- Rmse: 0.7816
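To try the checkpoint itself, a minimal loading sketch. The single-output regression head is an inference from the MSE/RMSE metrics above, not something the card states, and the Arabic input is illustrative.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/Arabic_FineTuningAraBERT_AugV5_k5_task1_organization_fold1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("هذا نص عربي للتقييم.", return_tensors="pt")  # illustrative input
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score)
```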
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0328 | 2 | 5.3667 | 0.0354 | 5.3667 | 2.3166 |
| No log | 0.0656 | 4 | 3.3750 | 0.0650 | 3.3750 | 1.8371 |
| No log | 0.0984 | 6 | 2.1664 | -0.0313 | 2.1664 | 1.4719 |
| No log | 0.1311 | 8 | 1.7096 | -0.0541 | 1.7096 | 1.3075 |
| No log | 0.1639 | 10 | 1.8104 | -0.1887 | 1.8104 | 1.3455 |
| No log | 0.1967 | 12 | 2.9042 | 0.0297 | 2.9042 | 1.7042 |
| No log | 0.2295 | 14 | 2.7513 | 0.0052 | 2.7513 | 1.6587 |
| No log | 0.2623 | 16 | 1.6273 | 0.1355 | 1.6273 | 1.2756 |
| No log | 0.2951 | 18 | 1.1280 | 0.0 | 1.1280 | 1.0621 |
| No log | 0.3279 | 20 | 1.0652 | 0.0 | 1.0652 | 1.0321 |
| No log | 0.3607 | 22 | 1.1399 | 0.0 | 1.1399 | 1.0676 |
| No log | 0.3934 | 24 | 1.2344 | 0.0 | 1.2344 | 1.1110 |
| No log | 0.4262 | 26 | 1.5160 | 0.0 | 1.5160 | 1.2313 |
| No log | 0.4590 | 28 | 1.6924 | 0.0239 | 1.6924 | 1.3009 |
| No log | 0.4918 | 30 | 2.0759 | 0.1003 | 2.0759 | 1.4408 |
| No log | 0.5246 | 32 | 1.9868 | 0.0265 | 1.9868 | 1.4095 |
| No log | 0.5574 | 34 | 1.5819 | 0.0 | 1.5819 | 1.2577 |
| No log | 0.5902 | 36 | 1.2731 | 0.0 | 1.2731 | 1.1283 |
| No log | 0.6230 | 38 | 1.0957 | 0.0841 | 1.0957 | 1.0468 |
| No log | 0.6557 | 40 | 0.9263 | 0.2391 | 0.9263 | 0.9624 |
| No log | 0.6885 | 42 | 0.9067 | 0.24 | 0.9067 | 0.9522 |
| No log | 0.7213 | 44 | 0.9380 | 0.1878 | 0.9380 | 0.9685 |
| No log | 0.7541 | 46 | 1.0530 | 0.0297 | 1.0530 | 1.0261 |
| No log | 0.7869 | 48 | 1.2411 | 0.1818 | 1.2411 | 1.1140 |
| No log | 0.8197 | 50 | 1.3713 | 0.1642 | 1.3713 | 1.1710 |
| No log | 0.8525 | 52 | 1.6268 | 0.1656 | 1.6268 | 1.2755 |
| No log | 0.8852 | 54 | 1.6368 | 0.1656 | 1.6368 | 1.2794 |
| No log | 0.9180 | 56 | 1.3757 | 0.0 | 1.3757 | 1.1729 |
| No log | 0.9508 | 58 | 1.0566 | 0.1043 | 1.0566 | 1.0279 |
| No log | 0.9836 | 60 | 0.8507 | 0.4286 | 0.8507 | 0.9223 |
| No log | 1.0164 | 62 | 0.7960 | 0.3200 | 0.7960 | 0.8922 |
| No log | 1.0492 | 64 | 0.8222 | 0.24 | 0.8222 | 0.9068 |
| No log | 1.0820 | 66 | 0.8885 | 0.1105 | 0.8885 | 0.9426 |
| No log | 1.1148 | 68 | 0.9769 | 0.1910 | 0.9769 | 0.9884 |
| No log | 1.1475 | 70 | 1.1247 | 0.1706 | 1.1247 | 1.0605 |
| No log | 1.1803 | 72 | 1.0106 | 0.2146 | 1.0106 | 1.0053 |
| No log | 1.2131 | 74 | 0.8831 | 0.2632 | 0.8831 | 0.9397 |
| No log | 1.2459 | 76 | 0.8033 | 0.2652 | 0.8033 | 0.8963 |
| No log | 1.2787 | 78 | 0.7690 | 0.2674 | 0.7690 | 0.8769 |
| No log | 1.3115 | 80 | 0.7706 | 0.2130 | 0.7706 | 0.8778 |
| No log | 1.3443 | 82 | 0.7400 | 0.3226 | 0.7400 | 0.8602 |
| No log | 1.3770 | 84 | 0.7196 | 0.4615 | 0.7196 | 0.8483 |
| No log | 1.4098 | 86 | 0.8355 | 0.3982 | 0.8355 | 0.9141 |
| No log | 1.4426 | 88 | 1.0117 | 0.4362 | 1.0117 | 1.0059 |
| No log | 1.4754 | 90 | 0.8817 | 0.4145 | 0.8817 | 0.9390 |
| No log | 1.5082 | 92 | 0.6774 | 0.4057 | 0.6774 | 0.8231 |
| No log | 1.5410 | 94 | 0.6270 | 0.4896 | 0.6270 | 0.7918 |
| No log | 1.5738 | 96 | 0.6285 | 0.3226 | 0.6285 | 0.7928 |
| No log | 1.6066 | 98 | 0.6238 | 0.37 | 0.6238 | 0.7898 |
| No log | 1.6393 | 100 | 0.6278 | 0.5494 | 0.6278 | 0.7923 |
| No log | 1.6721 | 102 | 0.6215 | 0.6578 | 0.6215 | 0.7883 |
| No log | 1.7049 | 104 | 0.6654 | 0.4664 | 0.6654 | 0.8157 |
| No log | 1.7377 | 106 | 0.7296 | 0.4569 | 0.7296 | 0.8542 |
| No log | 1.7705 | 108 | 0.7081 | 0.4569 | 0.7081 | 0.8415 |
| No log | 1.8033 | 110 | 0.6583 | 0.6578 | 0.6583 | 0.8114 |
| No log | 1.8361 | 112 | 0.6782 | 0.5494 | 0.6782 | 0.8236 |
| No log | 1.8689 | 114 | 0.7368 | 0.4784 | 0.7368 | 0.8584 |
| No log | 1.9016 | 116 | 0.8056 | 0.4444 | 0.8056 | 0.8975 |
| No log | 1.9344 | 118 | 0.7999 | 0.4262 | 0.7999 | 0.8944 |
| No log | 1.9672 | 120 | 0.8102 | 0.4737 | 0.8102 | 0.9001 |
| No log | 2.0 | 122 | 0.8349 | 0.4737 | 0.8349 | 0.9137 |
| No log | 2.0328 | 124 | 0.9094 | 0.5139 | 0.9094 | 0.9536 |
| No log | 2.0656 | 126 | 0.9928 | 0.4984 | 0.9928 | 0.9964 |
| No log | 2.0984 | 128 | 0.9734 | 0.4581 | 0.9734 | 0.9866 |
| No log | 2.1311 | 130 | 0.9860 | 0.4324 | 0.9860 | 0.9930 |
| No log | 2.1639 | 132 | 0.9580 | 0.4842 | 0.9580 | 0.9788 |
| No log | 2.1967 | 134 | 0.8692 | 0.5 | 0.8692 | 0.9323 |
| No log | 2.2295 | 136 | 0.7974 | 0.5062 | 0.7974 | 0.8930 |
| No log | 2.2623 | 138 | 0.7419 | 0.5513 | 0.7419 | 0.8613 |
| No log | 2.2951 | 140 | 0.7190 | 0.5581 | 0.7190 | 0.8479 |
| No log | 2.3279 | 142 | 0.7030 | 0.6006 | 0.7030 | 0.8384 |
| No log | 2.3607 | 144 | 0.6957 | 0.5908 | 0.6957 | 0.8341 |
| No log | 2.3934 | 146 | 0.5884 | 0.6794 | 0.5884 | 0.7671 |
| No log | 2.4262 | 148 | 0.6435 | 0.6397 | 0.6435 | 0.8022 |
| No log | 2.4590 | 150 | 0.8462 | 0.6020 | 0.8462 | 0.9199 |
| No log | 2.4918 | 152 | 0.8612 | 0.6111 | 0.8612 | 0.9280 |
| No log | 2.5246 | 154 | 0.7777 | 0.5909 | 0.7777 | 0.8819 |
| No log | 2.5574 | 156 | 0.7472 | 0.5368 | 0.7472 | 0.8644 |
| No log | 2.5902 | 158 | 0.7663 | 0.4913 | 0.7663 | 0.8754 |
| No log | 2.6230 | 160 | 0.7660 | 0.6149 | 0.7660 | 0.8752 |
| No log | 2.6557 | 162 | 0.7602 | 0.6316 | 0.7602 | 0.8719 |
| No log | 2.6885 | 164 | 0.7151 | 0.6500 | 0.7151 | 0.8457 |
| No log | 2.7213 | 166 | 0.6923 | 0.5795 | 0.6923 | 0.8320 |
| No log | 2.7541 | 168 | 0.7283 | 0.5430 | 0.7283 | 0.8534 |
| No log | 2.7869 | 170 | 0.7940 | 0.5828 | 0.7940 | 0.8910 |
| No log | 2.8197 | 172 | 0.7914 | 0.5668 | 0.7914 | 0.8896 |
| No log | 2.8525 | 174 | 0.7748 | 0.5395 | 0.7748 | 0.8802 |
| No log | 2.8852 | 176 | 0.6800 | 0.6316 | 0.6800 | 0.8246 |
| No log | 2.9180 | 178 | 0.5132 | 0.6784 | 0.5132 | 0.7164 |
| No log | 2.9508 | 180 | 0.4613 | 0.7426 | 0.4613 | 0.6792 |
| No log | 2.9836 | 182 | 0.4604 | 0.7986 | 0.4604 | 0.6786 |
| No log | 3.0164 | 184 | 0.4868 | 0.7586 | 0.4868 | 0.6977 |
| No log | 3.0492 | 186 | 0.5075 | 0.6912 | 0.5075 | 0.7124 |
| No log | 3.0820 | 188 | 0.6287 | 0.5704 | 0.6287 | 0.7929 |
| No log | 3.1148 | 190 | 0.7432 | 0.5532 | 0.7432 | 0.8621 |
| No log | 3.1475 | 192 | 0.7708 | 0.5532 | 0.7708 | 0.8780 |
| No log | 3.1803 | 194 | 0.6847 | 0.5625 | 0.6847 | 0.8275 |
| No log | 3.2131 | 196 | 0.5857 | 0.6667 | 0.5857 | 0.7653 |
| No log | 3.2459 | 198 | 0.5792 | 0.7074 | 0.5792 | 0.7611 |
| No log | 3.2787 | 200 | 0.6257 | 0.7074 | 0.6257 | 0.7910 |
| No log | 3.3115 | 202 | 0.7014 | 0.5552 | 0.7014 | 0.8375 |
| No log | 3.3443 | 204 | 0.6586 | 0.5828 | 0.6586 | 0.8116 |
| No log | 3.3770 | 206 | 0.5612 | 0.72 | 0.5612 | 0.7491 |
| No log | 3.4098 | 208 | 0.5076 | 0.72 | 0.5076 | 0.7124 |
| No log | 3.4426 | 210 | 0.4752 | 0.7820 | 0.4752 | 0.6893 |
| No log | 3.4754 | 212 | 0.5174 | 0.6316 | 0.5174 | 0.7193 |
| No log | 3.5082 | 214 | 0.5388 | 0.6231 | 0.5388 | 0.7340 |
| No log | 3.5410 | 216 | 0.4799 | 0.6316 | 0.4799 | 0.6927 |
| No log | 3.5738 | 218 | 0.4332 | 0.7586 | 0.4332 | 0.6582 |
| No log | 3.6066 | 220 | 0.4280 | 0.8165 | 0.4280 | 0.6542 |
| No log | 3.6393 | 222 | 0.4508 | 0.7336 | 0.4508 | 0.6714 |
| No log | 3.6721 | 224 | 0.5390 | 0.6784 | 0.5390 | 0.7342 |
| No log | 3.7049 | 226 | 0.7656 | 0.5668 | 0.7656 | 0.8750 |
| No log | 3.7377 | 228 | 0.9443 | 0.6045 | 0.9443 | 0.9717 |
| No log | 3.7705 | 230 | 0.9616 | 0.6045 | 0.9616 | 0.9806 |
| No log | 3.8033 | 232 | 0.8528 | 0.5783 | 0.8528 | 0.9235 |
| No log | 3.8361 | 234 | 0.7058 | 0.6789 | 0.7058 | 0.8401 |
| No log | 3.8689 | 236 | 0.5638 | 0.7143 | 0.5638 | 0.7508 |
| No log | 3.9016 | 238 | 0.4822 | 0.6688 | 0.4822 | 0.6944 |
| No log | 3.9344 | 240 | 0.4533 | 0.7986 | 0.4533 | 0.6733 |
| No log | 3.9672 | 242 | 0.4546 | 0.6912 | 0.4546 | 0.6742 |
| No log | 4.0 | 244 | 0.5019 | 0.7279 | 0.5019 | 0.7084 |
| No log | 4.0328 | 246 | 0.6154 | 0.6020 | 0.6154 | 0.7845 |
| No log | 4.0656 | 248 | 0.6141 | 0.5743 | 0.6141 | 0.7837 |
| No log | 4.0984 | 250 | 0.4929 | 0.7138 | 0.4929 | 0.7021 |
| No log | 4.1311 | 252 | 0.4016 | 0.7760 | 0.4016 | 0.6337 |
| No log | 4.1639 | 254 | 0.3988 | 0.8000 | 0.3988 | 0.6315 |
| No log | 4.1967 | 256 | 0.4165 | 0.7760 | 0.4165 | 0.6454 |
| No log | 4.2295 | 258 | 0.4961 | 0.6912 | 0.4961 | 0.7044 |
| No log | 4.2623 | 260 | 0.5935 | 0.6645 | 0.5935 | 0.7704 |
| No log | 4.2951 | 262 | 0.7182 | 0.5935 | 0.7182 | 0.8475 |
| No log | 4.3279 | 264 | 0.7492 | 0.6020 | 0.7492 | 0.8656 |
| No log | 4.3607 | 266 | 0.6901 | 0.6020 | 0.6901 | 0.8307 |
| No log | 4.3934 | 268 | 0.6727 | 0.6020 | 0.6727 | 0.8202 |
| No log | 4.4262 | 270 | 0.5944 | 0.5625 | 0.5944 | 0.7710 |
| No log | 4.4590 | 272 | 0.5554 | 0.6936 | 0.5554 | 0.7452 |
| No log | 4.4918 | 274 | 0.5806 | 0.6818 | 0.5806 | 0.7620 |
| No log | 4.5246 | 276 | 0.5773 | 0.7391 | 0.5773 | 0.7598 |
| No log | 4.5574 | 278 | 0.5528 | 0.7267 | 0.5528 | 0.7435 |
| No log | 4.5902 | 280 | 0.5681 | 0.7267 | 0.5681 | 0.7537 |
| No log | 4.6230 | 282 | 0.5858 | 0.7267 | 0.5858 | 0.7654 |
| No log | 4.6557 | 284 | 0.5781 | 0.7016 | 0.5781 | 0.7603 |
| No log | 4.6885 | 286 | 0.5166 | 0.7391 | 0.5166 | 0.7188 |
| No log | 4.7213 | 288 | 0.4379 | 0.7986 | 0.4379 | 0.6617 |
| No log | 4.7541 | 290 | 0.4394 | 0.7586 | 0.4394 | 0.6629 |
| No log | 4.7869 | 292 | 0.4508 | 0.7036 | 0.4508 | 0.6714 |
| No log | 4.8197 | 294 | 0.4425 | 0.7442 | 0.4425 | 0.6652 |
| No log | 4.8525 | 296 | 0.4418 | 0.7758 | 0.4418 | 0.6647 |
| No log | 4.8852 | 298 | 0.4937 | 0.7391 | 0.4937 | 0.7026 |
| No log | 4.9180 | 300 | 0.6276 | 0.6645 | 0.6276 | 0.7922 |
| No log | 4.9508 | 302 | 0.6539 | 0.6645 | 0.6539 | 0.8086 |
| No log | 4.9836 | 304 | 0.5699 | 0.7147 | 0.5699 | 0.7549 |
| No log | 5.0164 | 306 | 0.5230 | 0.7391 | 0.5230 | 0.7232 |
| No log | 5.0492 | 308 | 0.4921 | 0.7571 | 0.4921 | 0.7015 |
| No log | 5.0820 | 310 | 0.4955 | 0.7571 | 0.4955 | 0.7039 |
| No log | 5.1148 | 312 | 0.5121 | 0.7667 | 0.5121 | 0.7156 |
| No log | 5.1475 | 314 | 0.5689 | 0.6456 | 0.5689 | 0.7543 |
| No log | 5.1803 | 316 | 0.6128 | 0.6020 | 0.6128 | 0.7828 |
| No log | 5.2131 | 318 | 0.5562 | 0.5625 | 0.5562 | 0.7458 |
| No log | 5.2459 | 320 | 0.4788 | 0.6784 | 0.4788 | 0.6919 |
| No log | 5.2787 | 322 | 0.4524 | 0.6784 | 0.4524 | 0.6726 |
| No log | 5.3115 | 324 | 0.4107 | 0.7986 | 0.4107 | 0.6409 |
| No log | 5.3443 | 326 | 0.4193 | 0.7986 | 0.4193 | 0.6475 |
| No log | 5.3770 | 328 | 0.4632 | 0.7200 | 0.4632 | 0.6806 |
| No log | 5.4098 | 330 | 0.4833 | 0.7200 | 0.4833 | 0.6952 |
| No log | 5.4426 | 332 | 0.5081 | 0.7200 | 0.5081 | 0.7128 |
| No log | 5.4754 | 334 | 0.5713 | 0.6522 | 0.5713 | 0.7559 |
| No log | 5.5082 | 336 | 0.6184 | 0.6111 | 0.6184 | 0.7864 |
| No log | 5.5410 | 338 | 0.5717 | 0.6426 | 0.5717 | 0.7561 |
| No log | 5.5738 | 340 | 0.4748 | 0.7820 | 0.4748 | 0.6891 |
| No log | 5.6066 | 342 | 0.4317 | 0.7820 | 0.4317 | 0.6571 |
| No log | 5.6393 | 344 | 0.4136 | 0.7986 | 0.4136 | 0.6432 |
| No log | 5.6721 | 346 | 0.4376 | 0.7820 | 0.4376 | 0.6615 |
| No log | 5.7049 | 348 | 0.4995 | 0.7667 | 0.4995 | 0.7067 |
| No log | 5.7377 | 350 | 0.6612 | 0.6111 | 0.6612 | 0.8131 |
| No log | 5.7705 | 352 | 0.7573 | 0.6189 | 0.7573 | 0.8702 |
| No log | 5.8033 | 354 | 0.7750 | 0.6111 | 0.7750 | 0.8804 |
| No log | 5.8361 | 356 | 0.7359 | 0.6030 | 0.7359 | 0.8579 |
| No log | 5.8689 | 358 | 0.6486 | 0.7263 | 0.6486 | 0.8054 |
| No log | 5.9016 | 360 | 0.6347 | 0.7263 | 0.6347 | 0.7967 |
| No log | 5.9344 | 362 | 0.6544 | 0.7263 | 0.6544 | 0.8090 |
| No log | 5.9672 | 364 | 0.6726 | 0.7378 | 0.6726 | 0.8201 |
| No log | 6.0 | 366 | 0.6805 | 0.7378 | 0.6805 | 0.8249 |
| No log | 6.0328 | 368 | 0.7313 | 0.7378 | 0.7313 | 0.8552 |
| No log | 6.0656 | 370 | 0.8498 | 0.7053 | 0.8498 | 0.9219 |
| No log | 6.0984 | 372 | 0.9743 | 0.5845 | 0.9743 | 0.9871 |
| No log | 6.1311 | 374 | 0.9505 | 0.5404 | 0.9505 | 0.9749 |
| No log | 6.1639 | 376 | 0.8191 | 0.6111 | 0.8191 | 0.9050 |
| No log | 6.1967 | 378 | 0.6634 | 0.7147 | 0.6634 | 0.8145 |
| No log | 6.2295 | 380 | 0.6038 | 0.7391 | 0.6038 | 0.7771 |
| No log | 6.2623 | 382 | 0.6100 | 0.7391 | 0.6100 | 0.7810 |
| No log | 6.2951 | 384 | 0.5994 | 0.6708 | 0.5994 | 0.7742 |
| No log | 6.3279 | 386 | 0.6060 | 0.6456 | 0.6060 | 0.7785 |
| No log | 6.3607 | 388 | 0.6237 | 0.6456 | 0.6237 | 0.7897 |
| No log | 6.3934 | 390 | 0.6715 | 0.6456 | 0.6715 | 0.8194 |
| No log | 6.4262 | 392 | 0.7365 | 0.6420 | 0.7365 | 0.8582 |
| No log | 6.4590 | 394 | 0.7192 | 0.6420 | 0.7192 | 0.8481 |
| No log | 6.4918 | 396 | 0.6482 | 0.6420 | 0.6482 | 0.8051 |
| No log | 6.5246 | 398 | 0.5824 | 0.6456 | 0.5824 | 0.7631 |
| No log | 6.5574 | 400 | 0.5615 | 0.6098 | 0.5615 | 0.7493 |
| No log | 6.5902 | 402 | 0.5369 | 0.6667 | 0.5369 | 0.7327 |
| No log | 6.6230 | 404 | 0.5001 | 0.6667 | 0.5001 | 0.7071 |
| No log | 6.6557 | 406 | 0.4586 | 0.7063 | 0.4586 | 0.6772 |
| No log | 6.6885 | 408 | 0.4360 | 0.7063 | 0.4360 | 0.6603 |
| No log | 6.7213 | 410 | 0.4344 | 0.7063 | 0.4344 | 0.6591 |
| No log | 6.7541 | 412 | 0.4741 | 0.7063 | 0.4741 | 0.6885 |
| No log | 6.7869 | 414 | 0.5381 | 0.6667 | 0.5381 | 0.7335 |
| No log | 6.8197 | 416 | 0.5881 | 0.6291 | 0.5881 | 0.7668 |
| No log | 6.8525 | 418 | 0.5997 | 0.6111 | 0.5997 | 0.7744 |
| No log | 6.8852 | 420 | 0.6073 | 0.6456 | 0.6073 | 0.7793 |
| No log | 6.9180 | 422 | 0.5707 | 0.6708 | 0.5707 | 0.7555 |
| No log | 6.9508 | 424 | 0.5436 | 0.6936 | 0.5436 | 0.7373 |
| No log | 6.9836 | 426 | 0.5325 | 0.6936 | 0.5325 | 0.7298 |
| No log | 7.0164 | 428 | 0.5394 | 0.6936 | 0.5394 | 0.7344 |
| No log | 7.0492 | 430 | 0.5682 | 0.6847 | 0.5682 | 0.7538 |
| No log | 7.0820 | 432 | 0.5828 | 0.6744 | 0.5828 | 0.7634 |
| No log | 7.1148 | 434 | 0.5607 | 0.6957 | 0.5607 | 0.7488 |
| No log | 7.1475 | 436 | 0.5519 | 0.6957 | 0.5519 | 0.7429 |
| No log | 7.1803 | 438 | 0.5493 | 0.6957 | 0.5493 | 0.7412 |
| No log | 7.2131 | 440 | 0.5770 | 0.6361 | 0.5770 | 0.7596 |
| No log | 7.2459 | 442 | 0.5716 | 0.6456 | 0.5716 | 0.7561 |
| No log | 7.2787 | 444 | 0.5496 | 0.6557 | 0.5496 | 0.7414 |
| No log | 7.3115 | 446 | 0.5230 | 0.6557 | 0.5230 | 0.7232 |
| No log | 7.3443 | 448 | 0.5160 | 0.6957 | 0.5160 | 0.7183 |
| No log | 7.3770 | 450 | 0.5000 | 0.6936 | 0.5000 | 0.7071 |
| No log | 7.4098 | 452 | 0.4959 | 0.6818 | 0.4959 | 0.7042 |
| No log | 7.4426 | 454 | 0.4795 | 0.7063 | 0.4795 | 0.6925 |
| No log | 7.4754 | 456 | 0.4825 | 0.6465 | 0.4825 | 0.6946 |
| No log | 7.5082 | 458 | 0.4757 | 0.7200 | 0.4757 | 0.6897 |
| No log | 7.5410 | 460 | 0.4768 | 0.7063 | 0.4768 | 0.6905 |
| No log | 7.5738 | 462 | 0.5103 | 0.6818 | 0.5103 | 0.7144 |
| No log | 7.6066 | 464 | 0.5790 | 0.6557 | 0.5790 | 0.7609 |
| No log | 7.6393 | 466 | 0.6432 | 0.6557 | 0.6432 | 0.8020 |
| No log | 7.6721 | 468 | 0.6567 | 0.6557 | 0.6567 | 0.8104 |
| No log | 7.7049 | 470 | 0.6451 | 0.6557 | 0.6451 | 0.8032 |
| No log | 7.7377 | 472 | 0.6231 | 0.6557 | 0.6231 | 0.7894 |
| No log | 7.7705 | 474 | 0.6007 | 0.6557 | 0.6007 | 0.7751 |
| No log | 7.8033 | 476 | 0.5606 | 0.6557 | 0.5606 | 0.7487 |
| No log | 7.8361 | 478 | 0.5470 | 0.6557 | 0.5470 | 0.7396 |
| No log | 7.8689 | 480 | 0.5580 | 0.6557 | 0.5580 | 0.7470 |
| No log | 7.9016 | 482 | 0.5660 | 0.6557 | 0.5660 | 0.7523 |
| No log | 7.9344 | 484 | 0.5768 | 0.6557 | 0.5768 | 0.7595 |
| No log | 7.9672 | 486 | 0.5662 | 0.6254 | 0.5662 | 0.7524 |
| No log | 8.0 | 488 | 0.5989 | 0.6557 | 0.5989 | 0.7739 |
| No log | 8.0328 | 490 | 0.6649 | 0.6189 | 0.6649 | 0.8154 |
| No log | 8.0656 | 492 | 0.7196 | 0.6182 | 0.7196 | 0.8483 |
| No log | 8.0984 | 494 | 0.7166 | 0.6182 | 0.7166 | 0.8466 |
| No log | 8.1311 | 496 | 0.6811 | 0.6189 | 0.6811 | 0.8253 |
| No log | 8.1639 | 498 | 0.6292 | 0.6189 | 0.6292 | 0.7932 |
| 0.4795 | 8.1967 | 500 | 0.5737 | 0.6254 | 0.5737 | 0.7574 |
| 0.4795 | 8.2295 | 502 | 0.5367 | 0.6571 | 0.5367 | 0.7326 |
| 0.4795 | 8.2623 | 504 | 0.5032 | 0.7200 | 0.5032 | 0.7094 |
| 0.4795 | 8.2951 | 506 | 0.4801 | 0.7820 | 0.4801 | 0.6929 |
| 0.4795 | 8.3279 | 508 | 0.4851 | 0.7820 | 0.4851 | 0.6965 |
| 0.4795 | 8.3607 | 510 | 0.5023 | 0.7200 | 0.5023 | 0.7088 |
| 0.4795 | 8.3934 | 512 | 0.5157 | 0.7200 | 0.5157 | 0.7181 |
| 0.4795 | 8.4262 | 514 | 0.5467 | 0.7200 | 0.5467 | 0.7394 |
| 0.4795 | 8.4590 | 516 | 0.5884 | 0.6893 | 0.5884 | 0.7671 |
| 0.4795 | 8.4918 | 518 | 0.6394 | 0.6716 | 0.6394 | 0.7996 |
| 0.4795 | 8.5246 | 520 | 0.7071 | 0.6517 | 0.7071 | 0.8409 |
| 0.4795 | 8.5574 | 522 | 0.7296 | 0.6517 | 0.7296 | 0.8542 |
| 0.4795 | 8.5902 | 524 | 0.7270 | 0.6517 | 0.7270 | 0.8526 |
| 0.4795 | 8.6230 | 526 | 0.7131 | 0.6517 | 0.7131 | 0.8445 |
| 0.4795 | 8.6557 | 528 | 0.7195 | 0.6517 | 0.7195 | 0.8482 |
| 0.4795 | 8.6885 | 530 | 0.7298 | 0.6517 | 0.7298 | 0.8543 |
| 0.4795 | 8.7213 | 532 | 0.7375 | 0.6517 | 0.7375 | 0.8588 |
| 0.4795 | 8.7541 | 534 | 0.7307 | 0.6517 | 0.7307 | 0.8548 |
| 0.4795 | 8.7869 | 536 | 0.7277 | 0.6517 | 0.7277 | 0.8530 |
| 0.4795 | 8.8197 | 538 | 0.7072 | 0.6472 | 0.7072 | 0.8410 |
| 0.4795 | 8.8525 | 540 | 0.6655 | 0.6472 | 0.6655 | 0.8158 |
| 0.4795 | 8.8852 | 542 | 0.6135 | 0.6510 | 0.6135 | 0.7832 |
| 0.4795 | 8.9180 | 544 | 0.5614 | 0.7378 | 0.5614 | 0.7493 |
| 0.4795 | 8.9508 | 546 | 0.5182 | 0.7200 | 0.5182 | 0.7198 |
| 0.4795 | 8.9836 | 548 | 0.4919 | 0.7200 | 0.4919 | 0.7013 |
| 0.4795 | 9.0164 | 550 | 0.4809 | 0.7200 | 0.4809 | 0.6935 |
| 0.4795 | 9.0492 | 552 | 0.4868 | 0.7200 | 0.4868 | 0.6977 |
| 0.4795 | 9.0820 | 554 | 0.5030 | 0.7200 | 0.5030 | 0.7092 |
| 0.4795 | 9.1148 | 556 | 0.5219 | 0.7200 | 0.5219 | 0.7224 |
| 0.4795 | 9.1475 | 558 | 0.5375 | 0.6522 | 0.5375 | 0.7332 |
| 0.4795 | 9.1803 | 560 | 0.5454 | 0.6847 | 0.5454 | 0.7385 |
| 0.4795 | 9.2131 | 562 | 0.5490 | 0.6847 | 0.5490 | 0.7409 |
| 0.4795 | 9.2459 | 564 | 0.5559 | 0.6847 | 0.5559 | 0.7456 |
| 0.4795 | 9.2787 | 566 | 0.5620 | 0.6847 | 0.5620 | 0.7497 |
| 0.4795 | 9.3115 | 568 | 0.5586 | 0.6847 | 0.5586 | 0.7474 |
| 0.4795 | 9.3443 | 570 | 0.5530 | 0.6847 | 0.5530 | 0.7437 |
| 0.4795 | 9.3770 | 572 | 0.5553 | 0.6847 | 0.5553 | 0.7452 |
| 0.4795 | 9.4098 | 574 | 0.5571 | 0.6847 | 0.5571 | 0.7464 |
| 0.4795 | 9.4426 | 576 | 0.5566 | 0.6847 | 0.5566 | 0.7461 |
| 0.4795 | 9.4754 | 578 | 0.5597 | 0.6847 | 0.5597 | 0.7481 |
| 0.4795 | 9.5082 | 580 | 0.5592 | 0.6847 | 0.5592 | 0.7478 |
| 0.4795 | 9.5410 | 582 | 0.5518 | 0.6847 | 0.5518 | 0.7429 |
| 0.4795 | 9.5738 | 584 | 0.5497 | 0.6847 | 0.5497 | 0.7414 |
| 0.4795 | 9.6066 | 586 | 0.5489 | 0.6522 | 0.5489 | 0.7408 |
| 0.4795 | 9.6393 | 588 | 0.5539 | 0.6847 | 0.5539 | 0.7443 |
| 0.4795 | 9.6721 | 590 | 0.5646 | 0.6847 | 0.5646 | 0.7514 |
| 0.4795 | 9.7049 | 592 | 0.5741 | 0.6744 | 0.5741 | 0.7577 |
| 0.4795 | 9.7377 | 594 | 0.5816 | 0.6744 | 0.5816 | 0.7626 |
| 0.4795 | 9.7705 | 596 | 0.5898 | 0.6744 | 0.5898 | 0.7680 |
| 0.4795 | 9.8033 | 598 | 0.5973 | 0.6744 | 0.5973 | 0.7728 |
| 0.4795 | 9.8361 | 600 | 0.6042 | 0.6744 | 0.6042 | 0.7773 |
| 0.4795 | 9.8689 | 602 | 0.6085 | 0.6744 | 0.6085 | 0.7801 |
| 0.4795 | 9.9016 | 604 | 0.6100 | 0.6744 | 0.6100 | 0.7810 |
| 0.4795 | 9.9344 | 606 | 0.6100 | 0.6744 | 0.6100 | 0.7810 |
| 0.4795 | 9.9672 | 608 | 0.6105 | 0.6744 | 0.6105 | 0.7813 |
| 0.4795 | 10.0 | 610 | 0.6108 | 0.6744 | 0.6108 | 0.7816 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Nishitbaria/Anime-style-flux-lora-Large | Nishitbaria | 2024-11-24T06:46:01Z | 3,150 | 29 | diffusers | [
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-23T16:20:05Z | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: black-forest-labs/FLUX.1-dev
pipeline_tag: text-to-image
instance_prompt: anm
widget:
- text: >-
an anm young boy sitting under a large sakura tree, wearing traditional
festival attire, cherry blossoms gently falling around him, peaceful smile,
evening sky with vibrant hues.
output:
url: images/example_7y3r4uk1q.png
- text: >-
an anm girl wearing a school uniform, standing beside her bicycle at a rural
road, gazing up at the sky, full of hope, morning sunlight and dew-covered
grass.
output:
url: images/example_1l93qqd0r.png
- text: >-
an anm quiet residential street at dawn, soft orange sunlight peeking over
the rooftops, dew-covered flowers along the roadside, a gentle peaceful
mood.
output:
url: images/example_0vyad4135.png
---
# Anime Style Flux Lora Large
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `anm` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch

# Load the FLUX.1-dev base pipeline in half precision on the GPU
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
# Attach this anime-style LoRA to the base pipeline
pipeline.load_lora_weights('Nishitbaria/Anime-style-flux-lora-Large', weight_name='lora.safetensors')
# Remember to include the `anm` trigger word in the prompt
image = pipeline('your prompt').images[0]
```
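A quick usage sketch building on the pipeline above (the prompt is illustrative; any prompt containing the `anm` trigger word should work):
```py
# Reuses the `pipeline` object created above
prompt = "an anm girl walking through a rainy neon-lit street, reflective puddles"
image = pipeline(prompt).images[0]
image.save("anime_style.png")  # the output is a PIL image
```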
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
Taisei2001/llm-jp-3-13b-finetune-2 | Taisei2001 | 2024-11-24T06:37:52Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"en",
"base_model:llm-jp/llm-jp-3-13b",
"base_model:finetune:llm-jp/llm-jp-3-13b",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-22T05:57:04Z | ---
base_model: llm-jp/llm-jp-3-13b
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** Taisei2001
- **License:** apache-2.0
- **Finetuned from model:** llm-jp/llm-jp-3-13b
This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
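A minimal inference sketch, assuming this repository hosts merged causal-LM weights loadable with the standard `transformers` API (if only LoRA adapters are published, PEFT-based loading would be needed instead):
```py
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Taisei2001/llm-jp-3-13b-finetune-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Explain what a language model is.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```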
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k4_task1_organization_fold1 | MayBashendy | 2024-11-24T06:35:36Z | 163 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T18:30:21Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k4_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k4_task1_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set (a short sketch for reproducing these metrics follows the list):
- Loss: 0.5184
- Qwk: 0.7729
- Mse: 0.5184
- Rmse: 0.7200
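These columns can be reproduced from gold and predicted scores roughly as follows (a sketch assuming scikit-learn; the arrays are dummy placeholders, and Qwk is quadratic-weighted Cohen's kappa):
```py
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Dummy integer scores, purely for illustration
y_true = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 1, 2, 1, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # "Qwk" column
mse = mean_squared_error(y_true, y_pred)                      # "Mse" column
rmse = np.sqrt(mse)                                           # "Rmse" column
print(qwk, mse, rmse)
```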
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0392 | 2 | 4.6675 | -0.0566 | 4.6675 | 2.1604 |
| No log | 0.0784 | 4 | 2.7972 | 0.0508 | 2.7972 | 1.6725 |
| No log | 0.1176 | 6 | 1.6182 | 0.1355 | 1.6182 | 1.2721 |
| No log | 0.1569 | 8 | 1.0768 | 0.0098 | 1.0768 | 1.0377 |
| No log | 0.1961 | 10 | 0.9480 | 0.1105 | 0.9480 | 0.9736 |
| No log | 0.2353 | 12 | 0.9714 | 0.1047 | 0.9714 | 0.9856 |
| No log | 0.2745 | 14 | 0.8206 | 0.2921 | 0.8206 | 0.9059 |
| No log | 0.3137 | 16 | 0.9842 | 0.1683 | 0.9842 | 0.9921 |
| No log | 0.3529 | 18 | 1.2574 | 0.1818 | 1.2574 | 1.1213 |
| No log | 0.3922 | 20 | 1.5276 | 0.1026 | 1.5276 | 1.2360 |
| No log | 0.4314 | 22 | 1.5298 | 0.1026 | 1.5298 | 1.2369 |
| No log | 0.4706 | 24 | 1.2621 | 0.0841 | 1.2621 | 1.1234 |
| No log | 0.5098 | 26 | 1.1037 | 0.0841 | 1.1037 | 1.0506 |
| No log | 0.5490 | 28 | 1.0785 | 0.0841 | 1.0785 | 1.0385 |
| No log | 0.5882 | 30 | 1.0697 | 0.0841 | 1.0697 | 1.0343 |
| No log | 0.6275 | 32 | 1.0974 | 0.0841 | 1.0974 | 1.0476 |
| No log | 0.6667 | 34 | 1.1645 | 0.0841 | 1.1645 | 1.0791 |
| No log | 0.7059 | 36 | 1.0617 | 0.2829 | 1.0617 | 1.0304 |
| No log | 0.7451 | 38 | 0.9282 | 0.3317 | 0.9282 | 0.9634 |
| No log | 0.7843 | 40 | 0.7530 | 0.3152 | 0.7530 | 0.8677 |
| No log | 0.8235 | 42 | 0.7080 | 0.2921 | 0.7080 | 0.8415 |
| No log | 0.8627 | 44 | 0.7549 | 0.3636 | 0.7549 | 0.8688 |
| No log | 0.9020 | 46 | 0.7459 | 0.4105 | 0.7459 | 0.8637 |
| No log | 0.9412 | 48 | 0.8293 | 0.5749 | 0.8293 | 0.9106 |
| No log | 0.9804 | 50 | 0.8942 | 0.5520 | 0.8942 | 0.9456 |
| No log | 1.0196 | 52 | 0.7209 | 0.5116 | 0.7209 | 0.8491 |
| No log | 1.0588 | 54 | 0.6642 | 0.4360 | 0.6642 | 0.8150 |
| No log | 1.0980 | 56 | 0.7150 | 0.4481 | 0.7150 | 0.8456 |
| No log | 1.1373 | 58 | 0.7180 | 0.4481 | 0.7180 | 0.8474 |
| No log | 1.1765 | 60 | 0.6908 | 0.4089 | 0.6908 | 0.8311 |
| No log | 1.2157 | 62 | 0.6736 | 0.4474 | 0.6736 | 0.8207 |
| No log | 1.2549 | 64 | 0.6966 | 0.5130 | 0.6966 | 0.8346 |
| No log | 1.2941 | 66 | 0.6997 | 0.5205 | 0.6997 | 0.8365 |
| No log | 1.3333 | 68 | 0.7478 | 0.4444 | 0.7478 | 0.8648 |
| No log | 1.3725 | 70 | 0.8004 | 0.5092 | 0.8004 | 0.8946 |
| No log | 1.4118 | 72 | 0.6793 | 0.5556 | 0.6793 | 0.8242 |
| No log | 1.4510 | 74 | 0.5755 | 0.5922 | 0.5755 | 0.7586 |
| No log | 1.4902 | 76 | 0.5807 | 0.5333 | 0.5807 | 0.7620 |
| No log | 1.5294 | 78 | 0.5272 | 0.6930 | 0.5272 | 0.7261 |
| No log | 1.5686 | 80 | 0.5410 | 0.6316 | 0.5410 | 0.7355 |
| No log | 1.6078 | 82 | 0.7291 | 0.5333 | 0.7291 | 0.8539 |
| No log | 1.6471 | 84 | 0.8184 | 0.5333 | 0.8184 | 0.9047 |
| No log | 1.6863 | 86 | 0.6248 | 0.5556 | 0.6248 | 0.7905 |
| No log | 1.7255 | 88 | 0.5864 | 0.7042 | 0.5864 | 0.7658 |
| No log | 1.7647 | 90 | 0.6219 | 0.7103 | 0.6219 | 0.7886 |
| No log | 1.8039 | 92 | 0.6314 | 0.7372 | 0.6314 | 0.7946 |
| No log | 1.8431 | 94 | 0.6935 | 0.5611 | 0.6935 | 0.8327 |
| No log | 1.8824 | 96 | 0.7583 | 0.5541 | 0.7583 | 0.8708 |
| No log | 1.9216 | 98 | 0.7768 | 0.5541 | 0.7768 | 0.8814 |
| No log | 1.9608 | 100 | 0.9051 | 0.5217 | 0.9051 | 0.9513 |
| No log | 2.0 | 102 | 1.0041 | 0.5304 | 1.0041 | 1.0020 |
| No log | 2.0392 | 104 | 0.8436 | 0.5217 | 0.8436 | 0.9185 |
| No log | 2.0784 | 106 | 0.6121 | 0.6797 | 0.6121 | 0.7824 |
| No log | 2.1176 | 108 | 0.5628 | 0.7179 | 0.5628 | 0.7502 |
| No log | 2.1569 | 110 | 0.5601 | 0.6997 | 0.5601 | 0.7484 |
| No log | 2.1961 | 112 | 0.7611 | 0.5084 | 0.7611 | 0.8724 |
| No log | 2.2353 | 114 | 0.7584 | 0.5084 | 0.7584 | 0.8709 |
| No log | 2.2745 | 116 | 0.5519 | 0.6997 | 0.5519 | 0.7429 |
| No log | 2.3137 | 118 | 0.5236 | 0.7179 | 0.5236 | 0.7236 |
| No log | 2.3529 | 120 | 0.5456 | 0.6997 | 0.5456 | 0.7387 |
| No log | 2.3922 | 122 | 0.7458 | 0.4901 | 0.7458 | 0.8636 |
| No log | 2.4314 | 124 | 0.9037 | 0.5172 | 0.9037 | 0.9506 |
| No log | 2.4706 | 126 | 0.8043 | 0.4901 | 0.8043 | 0.8968 |
| No log | 2.5098 | 128 | 0.6909 | 0.5714 | 0.6909 | 0.8312 |
| No log | 2.5490 | 130 | 0.7310 | 0.5199 | 0.7310 | 0.8550 |
| No log | 2.5882 | 132 | 0.7731 | 0.4351 | 0.7731 | 0.8792 |
| No log | 2.6275 | 134 | 0.8463 | 0.5172 | 0.8463 | 0.9200 |
| No log | 2.6667 | 136 | 0.8634 | 0.5172 | 0.8634 | 0.9292 |
| No log | 2.7059 | 138 | 0.7196 | 0.5912 | 0.7196 | 0.8483 |
| No log | 2.7451 | 140 | 0.5265 | 0.7123 | 0.5265 | 0.7256 |
| No log | 2.7843 | 142 | 0.5133 | 0.7442 | 0.5133 | 0.7165 |
| No log | 2.8235 | 144 | 0.5294 | 0.7508 | 0.5294 | 0.7276 |
| No log | 2.8627 | 146 | 0.5898 | 0.6367 | 0.5898 | 0.7680 |
| No log | 2.9020 | 148 | 0.6393 | 0.6364 | 0.6393 | 0.7995 |
| No log | 2.9412 | 150 | 0.7927 | 0.5668 | 0.7927 | 0.8903 |
| No log | 2.9804 | 152 | 0.8153 | 0.5172 | 0.8153 | 0.9029 |
| No log | 3.0196 | 154 | 0.6549 | 0.5625 | 0.6549 | 0.8093 |
| No log | 3.0588 | 156 | 0.5799 | 0.6000 | 0.5799 | 0.7615 |
| No log | 3.0980 | 158 | 0.6307 | 0.5092 | 0.6307 | 0.7941 |
| No log | 3.1373 | 160 | 0.5963 | 0.6617 | 0.5963 | 0.7722 |
| No log | 3.1765 | 162 | 0.4738 | 0.7508 | 0.4738 | 0.6883 |
| No log | 3.2157 | 164 | 0.5033 | 0.7586 | 0.5033 | 0.7095 |
| No log | 3.2549 | 166 | 0.5344 | 0.7093 | 0.5344 | 0.7310 |
| No log | 3.2941 | 168 | 0.5206 | 0.7508 | 0.5206 | 0.7215 |
| No log | 3.3333 | 170 | 0.6778 | 0.7012 | 0.6778 | 0.8233 |
| No log | 3.3725 | 172 | 0.7755 | 0.5612 | 0.7755 | 0.8806 |
| No log | 3.4118 | 174 | 0.6612 | 0.6392 | 0.6612 | 0.8131 |
| No log | 3.4510 | 176 | 0.5742 | 0.6851 | 0.5742 | 0.7578 |
| No log | 3.4902 | 178 | 0.6289 | 0.6624 | 0.6289 | 0.7930 |
| No log | 3.5294 | 180 | 0.7679 | 0.6686 | 0.7679 | 0.8763 |
| No log | 3.5686 | 182 | 0.6368 | 0.6573 | 0.6368 | 0.7980 |
| No log | 3.6078 | 184 | 0.5659 | 0.7336 | 0.5659 | 0.7523 |
| No log | 3.6471 | 186 | 0.4800 | 0.7603 | 0.4800 | 0.6928 |
| No log | 3.6863 | 188 | 0.4910 | 0.6975 | 0.4910 | 0.7007 |
| No log | 3.7255 | 190 | 0.4933 | 0.7217 | 0.4933 | 0.7024 |
| No log | 3.7647 | 192 | 0.4720 | 0.7651 | 0.4720 | 0.6870 |
| No log | 3.8039 | 194 | 0.5153 | 0.7336 | 0.5153 | 0.7178 |
| No log | 3.8431 | 196 | 0.6729 | 0.6337 | 0.6729 | 0.8203 |
| No log | 3.8824 | 198 | 0.6946 | 0.6510 | 0.6946 | 0.8334 |
| No log | 3.9216 | 200 | 0.6997 | 0.6020 | 0.6997 | 0.8365 |
| No log | 3.9608 | 202 | 0.5521 | 0.6573 | 0.5521 | 0.7430 |
| No log | 4.0 | 204 | 0.4318 | 0.7758 | 0.4318 | 0.6571 |
| No log | 4.0392 | 206 | 0.4233 | 0.7926 | 0.4233 | 0.6506 |
| No log | 4.0784 | 208 | 0.4528 | 0.7586 | 0.4528 | 0.6729 |
| No log | 4.1176 | 210 | 0.4237 | 0.7805 | 0.4237 | 0.6509 |
| No log | 4.1569 | 212 | 0.4126 | 0.7336 | 0.4126 | 0.6424 |
| No log | 4.1961 | 214 | 0.5178 | 0.7016 | 0.5178 | 0.7196 |
| No log | 4.2353 | 216 | 0.7133 | 0.6566 | 0.7133 | 0.8446 |
| No log | 4.2745 | 218 | 0.7216 | 0.6566 | 0.7216 | 0.8495 |
| No log | 4.3137 | 220 | 0.5446 | 0.7016 | 0.5446 | 0.7379 |
| No log | 4.3529 | 222 | 0.4544 | 0.7603 | 0.4544 | 0.6741 |
| No log | 4.3922 | 224 | 0.4736 | 0.7651 | 0.4736 | 0.6882 |
| No log | 4.4314 | 226 | 0.5278 | 0.7508 | 0.5278 | 0.7265 |
| No log | 4.4706 | 228 | 0.6627 | 0.7083 | 0.6627 | 0.8141 |
| No log | 4.5098 | 230 | 0.8967 | 0.5236 | 0.8967 | 0.9469 |
| No log | 4.5490 | 232 | 1.1182 | 0.5015 | 1.1182 | 1.0574 |
| No log | 4.5882 | 234 | 1.1105 | 0.5015 | 1.1105 | 1.0538 |
| No log | 4.6275 | 236 | 0.9796 | 0.5015 | 0.9796 | 0.9897 |
| No log | 4.6667 | 238 | 0.7070 | 0.6488 | 0.7070 | 0.8409 |
| No log | 4.7059 | 240 | 0.4651 | 0.7348 | 0.4651 | 0.6820 |
| No log | 4.7451 | 242 | 0.3843 | 0.7758 | 0.3843 | 0.6199 |
| No log | 4.7843 | 244 | 0.3720 | 0.7758 | 0.3720 | 0.6099 |
| No log | 4.8235 | 246 | 0.3893 | 0.7348 | 0.3893 | 0.6239 |
| No log | 4.8627 | 248 | 0.4389 | 0.7586 | 0.4389 | 0.6625 |
| No log | 4.9020 | 250 | 0.4984 | 0.7072 | 0.4984 | 0.7060 |
| No log | 4.9412 | 252 | 0.5340 | 0.6125 | 0.5340 | 0.7307 |
| No log | 4.9804 | 254 | 0.4943 | 0.7072 | 0.4943 | 0.7031 |
| No log | 5.0196 | 256 | 0.4792 | 0.8082 | 0.4792 | 0.6922 |
| No log | 5.0588 | 258 | 0.5360 | 0.7544 | 0.5360 | 0.7321 |
| No log | 5.0980 | 260 | 0.5759 | 0.7394 | 0.5759 | 0.7589 |
| No log | 5.1373 | 262 | 0.5880 | 0.7394 | 0.5880 | 0.7668 |
| No log | 5.1765 | 264 | 0.6033 | 0.7394 | 0.6033 | 0.7767 |
| No log | 5.2157 | 266 | 0.5876 | 0.7465 | 0.5876 | 0.7665 |
| No log | 5.2549 | 268 | 0.5055 | 0.7485 | 0.5055 | 0.7110 |
| No log | 5.2941 | 270 | 0.4919 | 0.7485 | 0.4919 | 0.7014 |
| No log | 5.3333 | 272 | 0.4873 | 0.7674 | 0.4873 | 0.6981 |
| No log | 5.3725 | 274 | 0.4750 | 0.7771 | 0.4750 | 0.6892 |
| No log | 5.4118 | 276 | 0.4465 | 0.7771 | 0.4465 | 0.6682 |
| No log | 5.4510 | 278 | 0.3958 | 0.7864 | 0.3958 | 0.6292 |
| No log | 5.4902 | 280 | 0.3863 | 0.7651 | 0.3863 | 0.6215 |
| No log | 5.5294 | 282 | 0.4012 | 0.7986 | 0.4012 | 0.6334 |
| No log | 5.5686 | 284 | 0.5027 | 0.7267 | 0.5027 | 0.7090 |
| No log | 5.6078 | 286 | 0.6235 | 0.6293 | 0.6235 | 0.7896 |
| No log | 5.6471 | 288 | 0.6810 | 0.6478 | 0.6810 | 0.8252 |
| No log | 5.6863 | 290 | 0.6170 | 0.7101 | 0.6170 | 0.7855 |
| No log | 5.7255 | 292 | 0.5903 | 0.6729 | 0.5903 | 0.7683 |
| No log | 5.7647 | 294 | 0.5319 | 0.7267 | 0.5319 | 0.7293 |
| No log | 5.8039 | 296 | 0.4497 | 0.7552 | 0.4497 | 0.6706 |
| No log | 5.8431 | 298 | 0.4272 | 0.7986 | 0.4272 | 0.6536 |
| No log | 5.8824 | 300 | 0.4287 | 0.7986 | 0.4287 | 0.6548 |
| No log | 5.9216 | 302 | 0.4448 | 0.7921 | 0.4448 | 0.6669 |
| No log | 5.9608 | 304 | 0.4910 | 0.7771 | 0.4910 | 0.7007 |
| No log | 6.0 | 306 | 0.6115 | 0.7515 | 0.6115 | 0.7820 |
| No log | 6.0392 | 308 | 0.7464 | 0.5746 | 0.7464 | 0.8640 |
| No log | 6.0784 | 310 | 0.7748 | 0.5746 | 0.7748 | 0.8802 |
| No log | 6.1176 | 312 | 0.6584 | 0.6500 | 0.6584 | 0.8114 |
| No log | 6.1569 | 314 | 0.5298 | 0.7771 | 0.5298 | 0.7279 |
| No log | 6.1961 | 316 | 0.5135 | 0.7820 | 0.5135 | 0.7166 |
| No log | 6.2353 | 318 | 0.4934 | 0.7820 | 0.4934 | 0.7024 |
| No log | 6.2745 | 320 | 0.5018 | 0.7820 | 0.5018 | 0.7084 |
| No log | 6.3137 | 322 | 0.5410 | 0.7667 | 0.5410 | 0.7355 |
| No log | 6.3529 | 324 | 0.6648 | 0.6590 | 0.6648 | 0.8154 |
| No log | 6.3922 | 326 | 0.7267 | 0.6383 | 0.7267 | 0.8525 |
| No log | 6.4314 | 328 | 0.7511 | 0.5854 | 0.7511 | 0.8667 |
| No log | 6.4706 | 330 | 0.7179 | 0.5854 | 0.7179 | 0.8473 |
| No log | 6.5098 | 332 | 0.5894 | 0.6580 | 0.5894 | 0.7677 |
| No log | 6.5490 | 334 | 0.4383 | 0.7820 | 0.4383 | 0.6620 |
| No log | 6.5882 | 336 | 0.3665 | 0.7986 | 0.3665 | 0.6054 |
| No log | 6.6275 | 338 | 0.3495 | 0.7758 | 0.3495 | 0.5912 |
| No log | 6.6667 | 340 | 0.3492 | 0.7986 | 0.3492 | 0.5909 |
| No log | 6.7059 | 342 | 0.3633 | 0.7586 | 0.3633 | 0.6027 |
| No log | 6.7451 | 344 | 0.3823 | 0.6805 | 0.3823 | 0.6183 |
| No log | 6.7843 | 346 | 0.4048 | 0.7059 | 0.4048 | 0.6362 |
| No log | 6.8235 | 348 | 0.4056 | 0.7059 | 0.4056 | 0.6368 |
| No log | 6.8627 | 350 | 0.4035 | 0.7529 | 0.4035 | 0.6352 |
| No log | 6.9020 | 352 | 0.4086 | 0.7586 | 0.4086 | 0.6392 |
| No log | 6.9412 | 354 | 0.4441 | 0.7829 | 0.4441 | 0.6664 |
| No log | 6.9804 | 356 | 0.4755 | 0.7829 | 0.4755 | 0.6896 |
| No log | 7.0196 | 358 | 0.4855 | 0.7586 | 0.4855 | 0.6968 |
| No log | 7.0588 | 360 | 0.5111 | 0.7727 | 0.5111 | 0.7149 |
| No log | 7.0980 | 362 | 0.5093 | 0.7342 | 0.5093 | 0.7137 |
| No log | 7.1373 | 364 | 0.5023 | 0.7354 | 0.5023 | 0.7087 |
| No log | 7.1765 | 366 | 0.5030 | 0.7354 | 0.5030 | 0.7092 |
| No log | 7.2157 | 368 | 0.4861 | 0.7368 | 0.4861 | 0.6972 |
| No log | 7.2549 | 370 | 0.4437 | 0.7529 | 0.4437 | 0.6661 |
| No log | 7.2941 | 372 | 0.4230 | 0.7529 | 0.4230 | 0.6504 |
| No log | 7.3333 | 374 | 0.4059 | 0.7586 | 0.4059 | 0.6371 |
| No log | 7.3725 | 376 | 0.4216 | 0.7586 | 0.4216 | 0.6493 |
| No log | 7.4118 | 378 | 0.4400 | 0.7986 | 0.4400 | 0.6634 |
| No log | 7.4510 | 380 | 0.4562 | 0.7586 | 0.4562 | 0.6754 |
| No log | 7.4902 | 382 | 0.4759 | 0.7524 | 0.4759 | 0.6899 |
| No log | 7.5294 | 384 | 0.5126 | 0.7586 | 0.5126 | 0.7159 |
| No log | 7.5686 | 386 | 0.5545 | 0.7217 | 0.5545 | 0.7447 |
| No log | 7.6078 | 388 | 0.5722 | 0.7217 | 0.5722 | 0.7564 |
| No log | 7.6471 | 390 | 0.5376 | 0.7217 | 0.5376 | 0.7332 |
| No log | 7.6863 | 392 | 0.5370 | 0.7217 | 0.5370 | 0.7328 |
| No log | 7.7255 | 394 | 0.5171 | 0.7217 | 0.5171 | 0.7191 |
| No log | 7.7647 | 396 | 0.5073 | 0.7342 | 0.5073 | 0.7122 |
| No log | 7.8039 | 398 | 0.4794 | 0.7475 | 0.4794 | 0.6924 |
| No log | 7.8431 | 400 | 0.4658 | 0.7273 | 0.4658 | 0.6825 |
| No log | 7.8824 | 402 | 0.4827 | 0.7273 | 0.4827 | 0.6948 |
| No log | 7.9216 | 404 | 0.4991 | 0.7273 | 0.4991 | 0.7065 |
| No log | 7.9608 | 406 | 0.5339 | 0.7147 | 0.5339 | 0.7307 |
| No log | 8.0 | 408 | 0.5641 | 0.7217 | 0.5641 | 0.7511 |
| No log | 8.0392 | 410 | 0.5751 | 0.7217 | 0.5751 | 0.7584 |
| No log | 8.0784 | 412 | 0.5326 | 0.7391 | 0.5326 | 0.7298 |
| No log | 8.1176 | 414 | 0.5097 | 0.7729 | 0.5097 | 0.7139 |
| No log | 8.1569 | 416 | 0.4962 | 0.7866 | 0.4962 | 0.7044 |
| No log | 8.1961 | 418 | 0.4893 | 0.7866 | 0.4893 | 0.6995 |
| No log | 8.2353 | 420 | 0.4947 | 0.7475 | 0.4947 | 0.7033 |
| No log | 8.2745 | 422 | 0.4878 | 0.7475 | 0.4878 | 0.6984 |
| No log | 8.3137 | 424 | 0.4513 | 0.7529 | 0.4513 | 0.6718 |
| No log | 8.3529 | 426 | 0.4140 | 0.7586 | 0.4140 | 0.6435 |
| No log | 8.3922 | 428 | 0.3840 | 0.7986 | 0.3840 | 0.6197 |
| No log | 8.4314 | 430 | 0.3721 | 0.7864 | 0.3721 | 0.6100 |
| No log | 8.4706 | 432 | 0.3672 | 0.7651 | 0.3672 | 0.6059 |
| No log | 8.5098 | 434 | 0.3672 | 0.7986 | 0.3672 | 0.6059 |
| No log | 8.5490 | 436 | 0.3733 | 0.7986 | 0.3733 | 0.6110 |
| No log | 8.5882 | 438 | 0.3933 | 0.7586 | 0.3933 | 0.6272 |
| No log | 8.6275 | 440 | 0.4412 | 0.7279 | 0.4412 | 0.6642 |
| No log | 8.6667 | 442 | 0.5014 | 0.7342 | 0.5014 | 0.7081 |
| No log | 8.7059 | 444 | 0.5515 | 0.7217 | 0.5515 | 0.7426 |
| No log | 8.7451 | 446 | 0.5786 | 0.7217 | 0.5786 | 0.7607 |
| No log | 8.7843 | 448 | 0.5791 | 0.7217 | 0.5791 | 0.7610 |
| No log | 8.8235 | 450 | 0.5753 | 0.7217 | 0.5753 | 0.7585 |
| No log | 8.8627 | 452 | 0.5628 | 0.7217 | 0.5628 | 0.7502 |
| No log | 8.9020 | 454 | 0.5351 | 0.7342 | 0.5351 | 0.7315 |
| No log | 8.9412 | 456 | 0.5011 | 0.7729 | 0.5011 | 0.7079 |
| No log | 8.9804 | 458 | 0.4815 | 0.7729 | 0.4815 | 0.6939 |
| No log | 9.0196 | 460 | 0.4704 | 0.7866 | 0.4704 | 0.6859 |
| No log | 9.0588 | 462 | 0.4571 | 0.7921 | 0.4571 | 0.6761 |
| No log | 9.0980 | 464 | 0.4527 | 0.7921 | 0.4527 | 0.6728 |
| No log | 9.1373 | 466 | 0.4541 | 0.7921 | 0.4541 | 0.6739 |
| No log | 9.1765 | 468 | 0.4608 | 0.7524 | 0.4608 | 0.6788 |
| No log | 9.2157 | 470 | 0.4717 | 0.7524 | 0.4717 | 0.6868 |
| No log | 9.2549 | 472 | 0.4783 | 0.7727 | 0.4783 | 0.6916 |
| No log | 9.2941 | 474 | 0.4996 | 0.7475 | 0.4996 | 0.7068 |
| No log | 9.3333 | 476 | 0.5223 | 0.7217 | 0.5223 | 0.7227 |
| No log | 9.3725 | 478 | 0.5401 | 0.7217 | 0.5401 | 0.7349 |
| No log | 9.4118 | 480 | 0.5448 | 0.7217 | 0.5448 | 0.7381 |
| No log | 9.4510 | 482 | 0.5408 | 0.7217 | 0.5408 | 0.7354 |
| No log | 9.4902 | 484 | 0.5312 | 0.7455 | 0.5312 | 0.7288 |
| No log | 9.5294 | 486 | 0.5227 | 0.7391 | 0.5227 | 0.7230 |
| No log | 9.5686 | 488 | 0.5185 | 0.7391 | 0.5185 | 0.7201 |
| No log | 9.6078 | 490 | 0.5127 | 0.7729 | 0.5127 | 0.7160 |
| No log | 9.6471 | 492 | 0.5126 | 0.7729 | 0.5126 | 0.7159 |
| No log | 9.6863 | 494 | 0.5162 | 0.7729 | 0.5162 | 0.7185 |
| No log | 9.7255 | 496 | 0.5206 | 0.7729 | 0.5206 | 0.7215 |
| No log | 9.7647 | 498 | 0.5230 | 0.7729 | 0.5230 | 0.7232 |
| 0.3864 | 9.8039 | 500 | 0.5220 | 0.7729 | 0.5220 | 0.7225 |
| 0.3864 | 9.8431 | 502 | 0.5203 | 0.7729 | 0.5203 | 0.7213 |
| 0.3864 | 9.8824 | 504 | 0.5186 | 0.7729 | 0.5186 | 0.7202 |
| 0.3864 | 9.9216 | 506 | 0.5184 | 0.7729 | 0.5184 | 0.7200 |
| 0.3864 | 9.9608 | 508 | 0.5182 | 0.7729 | 0.5182 | 0.7199 |
| 0.3864 | 10.0 | 510 | 0.5184 | 0.7729 | 0.5184 | 0.7200 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k4_task1_organization_fold0 | MayBashendy | 2024-11-24T06:27:52Z | 163 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T18:24:50Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k4_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k4_task1_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6734
- Qwk: 0.7427
- Mse: 0.6734
- Rmse: 0.8206
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
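A hedged reconstruction of this configuration with the `transformers` Trainer API (a sketch assuming the standard `TrainingArguments` names; `output_dir` is a placeholder):
```py
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above
training_args = TrainingArguments(
    output_dir="arabert-organization-fold0",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```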
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0392 | 2 | 5.3836 | -0.1290 | 5.3836 | 2.3203 |
| No log | 0.0784 | 4 | 2.7455 | 0.1466 | 2.7455 | 1.6569 |
| No log | 0.1176 | 6 | 1.7583 | 0.2282 | 1.7583 | 1.3260 |
| No log | 0.1569 | 8 | 1.4652 | 0.0000 | 1.4652 | 1.2105 |
| No log | 0.1961 | 10 | 1.4474 | 0.0000 | 1.4474 | 1.2031 |
| No log | 0.2353 | 12 | 1.5589 | 0.0000 | 1.5589 | 1.2486 |
| No log | 0.2745 | 14 | 1.7112 | 0.0742 | 1.7112 | 1.3081 |
| No log | 0.3137 | 16 | 1.8068 | 0.1582 | 1.8068 | 1.3442 |
| No log | 0.3529 | 18 | 1.7587 | 0.1876 | 1.7587 | 1.3262 |
| No log | 0.3922 | 20 | 1.5165 | 0.0950 | 1.5165 | 1.2315 |
| No log | 0.4314 | 22 | 1.3720 | 0.1463 | 1.3720 | 1.1713 |
| No log | 0.4706 | 24 | 1.2697 | 0.2910 | 1.2697 | 1.1268 |
| No log | 0.5098 | 26 | 1.2050 | 0.4854 | 1.2050 | 1.0977 |
| No log | 0.5490 | 28 | 1.2425 | 0.3888 | 1.2425 | 1.1147 |
| No log | 0.5882 | 30 | 1.3064 | 0.2674 | 1.3064 | 1.1430 |
| No log | 0.6275 | 32 | 1.2777 | 0.3888 | 1.2777 | 1.1304 |
| No log | 0.6667 | 34 | 1.1544 | 0.4603 | 1.1544 | 1.0744 |
| No log | 0.7059 | 36 | 1.0943 | 0.4603 | 1.0943 | 1.0461 |
| No log | 0.7451 | 38 | 1.0040 | 0.4842 | 1.0040 | 1.0020 |
| No log | 0.7843 | 40 | 0.9295 | 0.4842 | 0.9295 | 0.9641 |
| No log | 0.8235 | 42 | 0.9330 | 0.4855 | 0.9330 | 0.9659 |
| No log | 0.8627 | 44 | 1.0598 | 0.6003 | 1.0598 | 1.0295 |
| No log | 0.9020 | 46 | 1.1433 | 0.5604 | 1.1433 | 1.0692 |
| No log | 0.9412 | 48 | 1.0182 | 0.6003 | 1.0182 | 1.0091 |
| No log | 0.9804 | 50 | 0.9470 | 0.4867 | 0.9470 | 0.9731 |
| No log | 1.0196 | 52 | 1.2819 | 0.1973 | 1.2819 | 1.1322 |
| No log | 1.0588 | 54 | 1.6069 | 0.0443 | 1.6069 | 1.2676 |
| No log | 1.0980 | 56 | 1.8741 | 0.0443 | 1.8741 | 1.3690 |
| No log | 1.1373 | 58 | 2.0367 | 0.0197 | 2.0367 | 1.4271 |
| No log | 1.1765 | 60 | 1.4110 | 0.1455 | 1.4110 | 1.1878 |
| No log | 1.2157 | 62 | 0.8351 | 0.4579 | 0.8351 | 0.9138 |
| No log | 1.2549 | 64 | 0.7726 | 0.6842 | 0.7726 | 0.8790 |
| No log | 1.2941 | 66 | 0.7635 | 0.6842 | 0.7635 | 0.8738 |
| No log | 1.3333 | 68 | 0.7722 | 0.6272 | 0.7722 | 0.8788 |
| No log | 1.3725 | 70 | 0.7916 | 0.5785 | 0.7916 | 0.8897 |
| No log | 1.4118 | 72 | 0.7462 | 0.6629 | 0.7462 | 0.8638 |
| No log | 1.4510 | 74 | 0.7408 | 0.6509 | 0.7408 | 0.8607 |
| No log | 1.4902 | 76 | 0.7563 | 0.6453 | 0.7563 | 0.8697 |
| No log | 1.5294 | 78 | 0.7787 | 0.6503 | 0.7787 | 0.8824 |
| No log | 1.5686 | 80 | 0.7399 | 0.7095 | 0.7399 | 0.8602 |
| No log | 1.6078 | 82 | 0.7395 | 0.7614 | 0.7395 | 0.8600 |
| No log | 1.6471 | 84 | 0.7472 | 0.7447 | 0.7472 | 0.8644 |
| No log | 1.6863 | 86 | 0.7819 | 0.6593 | 0.7819 | 0.8842 |
| No log | 1.7255 | 88 | 1.0190 | 0.6141 | 1.0190 | 1.0094 |
| No log | 1.7647 | 90 | 1.1069 | 0.5349 | 1.1069 | 1.0521 |
| No log | 1.8039 | 92 | 0.9724 | 0.6260 | 0.9724 | 0.9861 |
| No log | 1.8431 | 94 | 0.7958 | 0.6356 | 0.7958 | 0.8921 |
| No log | 1.8824 | 96 | 0.7435 | 0.6922 | 0.7435 | 0.8623 |
| No log | 1.9216 | 98 | 0.7415 | 0.6987 | 0.7415 | 0.8611 |
| No log | 1.9608 | 100 | 0.7545 | 0.6266 | 0.7545 | 0.8686 |
| No log | 2.0 | 102 | 0.7548 | 0.6569 | 0.7548 | 0.8688 |
| No log | 2.0392 | 104 | 0.7802 | 0.6569 | 0.7802 | 0.8833 |
| No log | 2.0784 | 106 | 0.7617 | 0.7208 | 0.7617 | 0.8728 |
| No log | 2.1176 | 108 | 0.7604 | 0.7152 | 0.7604 | 0.8720 |
| No log | 2.1569 | 110 | 0.7698 | 0.7064 | 0.7698 | 0.8774 |
| No log | 2.1961 | 112 | 0.7854 | 0.7148 | 0.7854 | 0.8862 |
| No log | 2.2353 | 114 | 0.8043 | 0.7216 | 0.8043 | 0.8968 |
| No log | 2.2745 | 116 | 0.7963 | 0.6799 | 0.7963 | 0.8923 |
| No log | 2.3137 | 118 | 0.7827 | 0.6473 | 0.7827 | 0.8847 |
| No log | 2.3529 | 120 | 0.7334 | 0.7019 | 0.7334 | 0.8564 |
| No log | 2.3922 | 122 | 0.6615 | 0.7443 | 0.6615 | 0.8134 |
| No log | 2.4314 | 124 | 0.6402 | 0.7614 | 0.6402 | 0.8001 |
| No log | 2.4706 | 126 | 0.6426 | 0.6856 | 0.6426 | 0.8016 |
| No log | 2.5098 | 128 | 0.6679 | 0.7019 | 0.6679 | 0.8173 |
| No log | 2.5490 | 130 | 0.6443 | 0.7879 | 0.6443 | 0.8027 |
| No log | 2.5882 | 132 | 0.6486 | 0.7879 | 0.6486 | 0.8054 |
| No log | 2.6275 | 134 | 0.6706 | 0.7019 | 0.6706 | 0.8189 |
| No log | 2.6667 | 136 | 0.7266 | 0.6147 | 0.7266 | 0.8524 |
| No log | 2.7059 | 138 | 0.7753 | 0.5876 | 0.7753 | 0.8805 |
| No log | 2.7451 | 140 | 0.8116 | 0.6388 | 0.8116 | 0.9009 |
| No log | 2.7843 | 142 | 0.8154 | 0.6182 | 0.8154 | 0.9030 |
| No log | 2.8235 | 144 | 0.8394 | 0.6388 | 0.8394 | 0.9162 |
| No log | 2.8627 | 146 | 0.9031 | 0.6115 | 0.9031 | 0.9503 |
| No log | 2.9020 | 148 | 1.0601 | 0.5166 | 1.0601 | 1.0296 |
| No log | 2.9412 | 150 | 1.0090 | 0.5166 | 1.0090 | 1.0045 |
| No log | 2.9804 | 152 | 0.7587 | 0.6435 | 0.7587 | 0.8710 |
| No log | 3.0196 | 154 | 0.6882 | 0.7514 | 0.6882 | 0.8296 |
| No log | 3.0588 | 156 | 0.6699 | 0.7354 | 0.6699 | 0.8185 |
| No log | 3.0980 | 158 | 0.6982 | 0.7208 | 0.6982 | 0.8356 |
| No log | 3.1373 | 160 | 0.7228 | 0.7208 | 0.7228 | 0.8501 |
| No log | 3.1765 | 162 | 0.7286 | 0.6741 | 0.7286 | 0.8536 |
| No log | 3.2157 | 164 | 0.7751 | 0.6677 | 0.7751 | 0.8804 |
| No log | 3.2549 | 166 | 0.8315 | 0.6213 | 0.8315 | 0.9119 |
| No log | 3.2941 | 168 | 0.7787 | 0.6462 | 0.7787 | 0.8824 |
| No log | 3.3333 | 170 | 0.7527 | 0.7014 | 0.7527 | 0.8676 |
| No log | 3.3725 | 172 | 0.7386 | 0.7086 | 0.7386 | 0.8594 |
| No log | 3.4118 | 174 | 0.7086 | 0.6987 | 0.7086 | 0.8418 |
| No log | 3.4510 | 176 | 0.6973 | 0.6987 | 0.6973 | 0.8351 |
| No log | 3.4902 | 178 | 0.6904 | 0.7264 | 0.6904 | 0.8309 |
| No log | 3.5294 | 180 | 0.6825 | 0.7058 | 0.6825 | 0.8261 |
| No log | 3.5686 | 182 | 0.6788 | 0.6627 | 0.6788 | 0.8239 |
| No log | 3.6078 | 184 | 0.6915 | 0.6569 | 0.6915 | 0.8316 |
| No log | 3.6471 | 186 | 0.6951 | 0.6519 | 0.6951 | 0.8337 |
| No log | 3.6863 | 188 | 0.6983 | 0.6519 | 0.6983 | 0.8357 |
| No log | 3.7255 | 190 | 0.6931 | 0.7281 | 0.6931 | 0.8326 |
| No log | 3.7647 | 192 | 0.7059 | 0.6519 | 0.7059 | 0.8402 |
| No log | 3.8039 | 194 | 0.7671 | 0.5869 | 0.7671 | 0.8759 |
| No log | 3.8431 | 196 | 0.8072 | 0.6071 | 0.8072 | 0.8985 |
| No log | 3.8824 | 198 | 0.7720 | 0.6071 | 0.7720 | 0.8786 |
| No log | 3.9216 | 200 | 0.7198 | 0.6309 | 0.7198 | 0.8484 |
| No log | 3.9608 | 202 | 0.6870 | 0.6316 | 0.6870 | 0.8288 |
| No log | 4.0 | 204 | 0.6719 | 0.7354 | 0.6719 | 0.8197 |
| No log | 4.0392 | 206 | 0.6801 | 0.7090 | 0.6801 | 0.8247 |
| No log | 4.0784 | 208 | 0.6953 | 0.6272 | 0.6953 | 0.8338 |
| No log | 4.1176 | 210 | 0.7760 | 0.5589 | 0.7760 | 0.8809 |
| No log | 4.1569 | 212 | 0.9216 | 0.6071 | 0.9216 | 0.9600 |
| No log | 4.1961 | 214 | 0.9385 | 0.5889 | 0.9385 | 0.9688 |
| No log | 4.2353 | 216 | 0.8443 | 0.5563 | 0.8443 | 0.9189 |
| No log | 4.2745 | 218 | 0.7346 | 0.6015 | 0.7346 | 0.8571 |
| No log | 4.3137 | 220 | 0.6825 | 0.6871 | 0.6825 | 0.8261 |
| No log | 4.3529 | 222 | 0.6909 | 0.7090 | 0.6909 | 0.8312 |
| No log | 4.3922 | 224 | 0.7052 | 0.7281 | 0.7052 | 0.8398 |
| No log | 4.4314 | 226 | 0.7088 | 0.7090 | 0.7088 | 0.8419 |
| No log | 4.4706 | 228 | 0.7573 | 0.6802 | 0.7573 | 0.8702 |
| No log | 4.5098 | 230 | 0.8424 | 0.5903 | 0.8424 | 0.9178 |
| No log | 4.5490 | 232 | 0.8608 | 0.6339 | 0.8608 | 0.9278 |
| No log | 4.5882 | 234 | 0.8435 | 0.6339 | 0.8435 | 0.9184 |
| No log | 4.6275 | 236 | 0.7996 | 0.6266 | 0.7996 | 0.8942 |
| No log | 4.6667 | 238 | 0.7702 | 0.7019 | 0.7702 | 0.8776 |
| No log | 4.7059 | 240 | 0.7672 | 0.7365 | 0.7672 | 0.8759 |
| No log | 4.7451 | 242 | 0.8174 | 0.7452 | 0.8174 | 0.9041 |
| No log | 4.7843 | 244 | 0.8702 | 0.6916 | 0.8702 | 0.9328 |
| No log | 4.8235 | 246 | 0.8297 | 0.7696 | 0.8297 | 0.9109 |
| No log | 4.8627 | 248 | 0.7389 | 0.7692 | 0.7389 | 0.8596 |
| No log | 4.9020 | 250 | 0.7234 | 0.6684 | 0.7234 | 0.8505 |
| No log | 4.9412 | 252 | 0.7958 | 0.5903 | 0.7958 | 0.8921 |
| No log | 4.9804 | 254 | 0.9754 | 0.6201 | 0.9754 | 0.9876 |
| No log | 5.0196 | 256 | 1.0211 | 0.6201 | 1.0211 | 1.0105 |
| No log | 5.0588 | 258 | 0.9085 | 0.5563 | 0.9085 | 0.9532 |
| No log | 5.0980 | 260 | 0.7736 | 0.6866 | 0.7736 | 0.8795 |
| No log | 5.1373 | 262 | 0.7120 | 0.6856 | 0.7120 | 0.8438 |
| No log | 5.1765 | 264 | 0.7253 | 0.6200 | 0.7253 | 0.8516 |
| No log | 5.2157 | 266 | 0.7209 | 0.6200 | 0.7209 | 0.8490 |
| No log | 5.2549 | 268 | 0.7076 | 0.7090 | 0.7076 | 0.8412 |
| No log | 5.2941 | 270 | 0.7450 | 0.6866 | 0.7450 | 0.8631 |
| No log | 5.3333 | 272 | 0.8738 | 0.5973 | 0.8738 | 0.9348 |
| No log | 5.3725 | 274 | 1.0209 | 0.5563 | 1.0209 | 1.0104 |
| No log | 5.4118 | 276 | 1.0694 | 0.5374 | 1.0694 | 1.0341 |
| No log | 5.4510 | 278 | 0.9807 | 0.5563 | 0.9807 | 0.9903 |
| No log | 5.4902 | 280 | 0.8809 | 0.5563 | 0.8809 | 0.9386 |
| No log | 5.5294 | 282 | 0.8044 | 0.5973 | 0.8044 | 0.8969 |
| No log | 5.5686 | 284 | 0.7455 | 0.6866 | 0.7455 | 0.8634 |
| No log | 5.6078 | 286 | 0.7176 | 0.6802 | 0.7176 | 0.8471 |
| No log | 5.6471 | 288 | 0.7085 | 0.6802 | 0.7085 | 0.8417 |
| No log | 5.6863 | 290 | 0.7125 | 0.6866 | 0.7125 | 0.8441 |
| No log | 5.7255 | 292 | 0.7554 | 0.5973 | 0.7554 | 0.8691 |
| No log | 5.7647 | 294 | 0.7903 | 0.5973 | 0.7903 | 0.8890 |
| No log | 5.8039 | 296 | 0.8400 | 0.6038 | 0.8400 | 0.9165 |
| No log | 5.8431 | 298 | 0.8226 | 0.5973 | 0.8226 | 0.9069 |
| No log | 5.8824 | 300 | 0.7617 | 0.5973 | 0.7617 | 0.8727 |
| No log | 5.9216 | 302 | 0.7063 | 0.5973 | 0.7063 | 0.8404 |
| No log | 5.9608 | 304 | 0.6631 | 0.6363 | 0.6631 | 0.8143 |
| No log | 6.0 | 306 | 0.6546 | 0.6752 | 0.6546 | 0.8091 |
| No log | 6.0392 | 308 | 0.6529 | 0.6752 | 0.6529 | 0.8080 |
| No log | 6.0784 | 310 | 0.6534 | 0.6690 | 0.6534 | 0.8083 |
| No log | 6.1176 | 312 | 0.6565 | 0.6627 | 0.6565 | 0.8102 |
| No log | 6.1569 | 314 | 0.6591 | 0.6684 | 0.6591 | 0.8119 |
| No log | 6.1961 | 316 | 0.6759 | 0.5785 | 0.6759 | 0.8221 |
| No log | 6.2353 | 318 | 0.7101 | 0.5785 | 0.7101 | 0.8427 |
| No log | 6.2745 | 320 | 0.7351 | 0.6435 | 0.7351 | 0.8574 |
| No log | 6.3137 | 322 | 0.7604 | 0.6435 | 0.7604 | 0.8720 |
| No log | 6.3529 | 324 | 0.7849 | 0.6836 | 0.7849 | 0.8859 |
| No log | 6.3922 | 326 | 0.7914 | 0.7038 | 0.7914 | 0.8896 |
| No log | 6.4314 | 328 | 0.7511 | 0.7867 | 0.7511 | 0.8667 |
| No log | 6.4706 | 330 | 0.7465 | 0.7663 | 0.7465 | 0.8640 |
| No log | 6.5098 | 332 | 0.7488 | 0.7745 | 0.7488 | 0.8654 |
| No log | 6.5490 | 334 | 0.7429 | 0.7808 | 0.7429 | 0.8619 |
| No log | 6.5882 | 336 | 0.7406 | 0.7509 | 0.7406 | 0.8606 |
| No log | 6.6275 | 338 | 0.7587 | 0.6836 | 0.7587 | 0.8711 |
| No log | 6.6667 | 340 | 0.7755 | 0.6435 | 0.7755 | 0.8806 |
| No log | 6.7059 | 342 | 0.7792 | 0.5973 | 0.7792 | 0.8827 |
| No log | 6.7451 | 344 | 0.7523 | 0.6545 | 0.7523 | 0.8673 |
| No log | 6.7843 | 346 | 0.6829 | 0.5973 | 0.6829 | 0.8264 |
| No log | 6.8235 | 348 | 0.6398 | 0.6015 | 0.6398 | 0.7999 |
| No log | 6.8627 | 350 | 0.6258 | 0.6015 | 0.6258 | 0.7911 |
| No log | 6.9020 | 352 | 0.6241 | 0.6363 | 0.6241 | 0.7900 |
| No log | 6.9412 | 354 | 0.6251 | 0.6363 | 0.6251 | 0.7906 |
| No log | 6.9804 | 356 | 0.6366 | 0.6627 | 0.6366 | 0.7979 |
| No log | 7.0196 | 358 | 0.6516 | 0.6569 | 0.6516 | 0.8072 |
| No log | 7.0588 | 360 | 0.6647 | 0.6569 | 0.6647 | 0.8153 |
| No log | 7.0980 | 362 | 0.6866 | 0.6622 | 0.6866 | 0.8286 |
| No log | 7.1373 | 364 | 0.7071 | 0.7153 | 0.7071 | 0.8409 |
| No log | 7.1765 | 366 | 0.7019 | 0.7153 | 0.7019 | 0.8378 |
| No log | 7.2157 | 368 | 0.6819 | 0.6622 | 0.6819 | 0.8258 |
| No log | 7.2549 | 370 | 0.6617 | 0.6400 | 0.6617 | 0.8134 |
| No log | 7.2941 | 372 | 0.6668 | 0.7365 | 0.6668 | 0.8166 |
| No log | 7.3333 | 374 | 0.6786 | 0.7369 | 0.6786 | 0.8238 |
| No log | 7.3725 | 376 | 0.6754 | 0.7443 | 0.6754 | 0.8218 |
| No log | 7.4118 | 378 | 0.6610 | 0.6898 | 0.6610 | 0.8130 |
| No log | 7.4510 | 380 | 0.6561 | 0.6400 | 0.6561 | 0.8100 |
| No log | 7.4902 | 382 | 0.6651 | 0.6309 | 0.6651 | 0.8155 |
| No log | 7.5294 | 384 | 0.6942 | 0.6860 | 0.6942 | 0.8332 |
| No log | 7.5686 | 386 | 0.7294 | 0.6860 | 0.7294 | 0.8541 |
| No log | 7.6078 | 388 | 0.7519 | 0.6860 | 0.7519 | 0.8671 |
| No log | 7.6471 | 390 | 0.7724 | 0.6860 | 0.7724 | 0.8789 |
| No log | 7.6863 | 392 | 0.7696 | 0.6860 | 0.7696 | 0.8773 |
| No log | 7.7255 | 394 | 0.7541 | 0.6309 | 0.7541 | 0.8684 |
| No log | 7.7647 | 396 | 0.7438 | 0.6266 | 0.7438 | 0.8624 |
| No log | 7.8039 | 398 | 0.7288 | 0.6225 | 0.7288 | 0.8537 |
| No log | 7.8431 | 400 | 0.7200 | 0.6519 | 0.7200 | 0.8485 |
| No log | 7.8824 | 402 | 0.7102 | 0.6519 | 0.7102 | 0.8428 |
| No log | 7.9216 | 404 | 0.6991 | 0.7019 | 0.6991 | 0.8361 |
| No log | 7.9608 | 406 | 0.6925 | 0.6519 | 0.6925 | 0.8322 |
| No log | 8.0 | 408 | 0.7010 | 0.6266 | 0.7010 | 0.8373 |
| No log | 8.0392 | 410 | 0.7243 | 0.6309 | 0.7243 | 0.8511 |
| No log | 8.0784 | 412 | 0.7437 | 0.6309 | 0.7437 | 0.8624 |
| No log | 8.1176 | 414 | 0.7433 | 0.6309 | 0.7433 | 0.8622 |
| No log | 8.1569 | 416 | 0.7382 | 0.6309 | 0.7382 | 0.8592 |
| No log | 8.1961 | 418 | 0.7198 | 0.6309 | 0.7198 | 0.8484 |
| No log | 8.2353 | 420 | 0.6974 | 0.6266 | 0.6974 | 0.8351 |
| No log | 8.2745 | 422 | 0.6879 | 0.6627 | 0.6879 | 0.8294 |
| No log | 8.3137 | 424 | 0.6883 | 0.6856 | 0.6883 | 0.8296 |
| No log | 8.3529 | 426 | 0.6945 | 0.6856 | 0.6945 | 0.8333 |
| No log | 8.3922 | 428 | 0.7053 | 0.7657 | 0.7053 | 0.8398 |
| No log | 8.4314 | 430 | 0.7150 | 0.7038 | 0.7150 | 0.8456 |
| No log | 8.4706 | 432 | 0.7187 | 0.7101 | 0.7187 | 0.8478 |
| No log | 8.5098 | 434 | 0.7266 | 0.7101 | 0.7266 | 0.8524 |
| No log | 8.5490 | 436 | 0.7203 | 0.7101 | 0.7203 | 0.8487 |
| No log | 8.5882 | 438 | 0.7104 | 0.7101 | 0.7104 | 0.8429 |
| No log | 8.6275 | 440 | 0.7075 | 0.7101 | 0.7075 | 0.8411 |
| No log | 8.6667 | 442 | 0.7069 | 0.7101 | 0.7069 | 0.8408 |
| No log | 8.7059 | 444 | 0.7061 | 0.7101 | 0.7061 | 0.8403 |
| No log | 8.7451 | 446 | 0.7015 | 0.7101 | 0.7015 | 0.8376 |
| No log | 8.7843 | 448 | 0.7016 | 0.7101 | 0.7016 | 0.8376 |
| No log | 8.8235 | 450 | 0.7052 | 0.7101 | 0.7052 | 0.8398 |
| No log | 8.8627 | 452 | 0.7072 | 0.7101 | 0.7072 | 0.8410 |
| No log | 8.9020 | 454 | 0.6983 | 0.7101 | 0.6983 | 0.8357 |
| No log | 8.9412 | 456 | 0.6883 | 0.7101 | 0.6883 | 0.8296 |
| No log | 8.9804 | 458 | 0.6787 | 0.7819 | 0.6787 | 0.8238 |
| No log | 9.0196 | 460 | 0.6717 | 0.7273 | 0.6717 | 0.8196 |
| No log | 9.0588 | 462 | 0.6665 | 0.7273 | 0.6665 | 0.8164 |
| No log | 9.0980 | 464 | 0.6614 | 0.6917 | 0.6614 | 0.8133 |
| No log | 9.1373 | 466 | 0.6571 | 0.6917 | 0.6571 | 0.8106 |
| No log | 9.1765 | 468 | 0.6560 | 0.6982 | 0.6560 | 0.8100 |
| No log | 9.2157 | 470 | 0.6565 | 0.7346 | 0.6565 | 0.8103 |
| No log | 9.2549 | 472 | 0.6580 | 0.7250 | 0.6580 | 0.8112 |
| No log | 9.2941 | 474 | 0.6582 | 0.7250 | 0.6582 | 0.8113 |
| No log | 9.3333 | 476 | 0.6557 | 0.7250 | 0.6557 | 0.8098 |
| No log | 9.3725 | 478 | 0.6535 | 0.7250 | 0.6535 | 0.8084 |
| No log | 9.4118 | 480 | 0.6552 | 0.7250 | 0.6552 | 0.8095 |
| No log | 9.4510 | 482 | 0.6618 | 0.7250 | 0.6618 | 0.8135 |
| No log | 9.4902 | 484 | 0.6667 | 0.7101 | 0.6667 | 0.8165 |
| No log | 9.5294 | 486 | 0.6670 | 0.7586 | 0.6670 | 0.8167 |
| No log | 9.5686 | 488 | 0.6644 | 0.7586 | 0.6644 | 0.8151 |
| No log | 9.6078 | 490 | 0.6622 | 0.7586 | 0.6622 | 0.8137 |
| No log | 9.6471 | 492 | 0.6628 | 0.7427 | 0.6628 | 0.8141 |
| No log | 9.6863 | 494 | 0.6657 | 0.7586 | 0.6657 | 0.8159 |
| No log | 9.7255 | 496 | 0.6683 | 0.7586 | 0.6683 | 0.8175 |
| No log | 9.7647 | 498 | 0.6697 | 0.7427 | 0.6697 | 0.8183 |
| 0.4304 | 9.8039 | 500 | 0.6707 | 0.7427 | 0.6707 | 0.8190 |
| 0.4304 | 9.8431 | 502 | 0.6709 | 0.7427 | 0.6709 | 0.8191 |
| 0.4304 | 9.8824 | 504 | 0.6716 | 0.7427 | 0.6716 | 0.8195 |
| 0.4304 | 9.9216 | 506 | 0.6724 | 0.7427 | 0.6724 | 0.8200 |
| 0.4304 | 9.9608 | 508 | 0.6730 | 0.7427 | 0.6730 | 0.8204 |
| 0.4304 | 10.0 | 510 | 0.6734 | 0.7427 | 0.6734 | 0.8206 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
susmitabhatt/whisper-a-clp-ls-25 | susmitabhatt | 2024-11-24T06:15:00Z | 8 | 0 | transformers | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-24T05:17:44Z | ---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-a-clp-ls-25
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-a-clp-ls-25
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0302
- Wer: 8.3857
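A minimal transcription sketch, assuming the standard `transformers` ASR pipeline (the audio path is a placeholder):
```py
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="susmitabhatt/whisper-a-clp-ls-25")
result = asr("sample.wav")  # placeholder path to an audio file
print(result["text"])
```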
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a scheduler sketch follows the list):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 11
- mixed_precision_training: Native AMP
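A sketch of the warmup-then-linear-decay schedule implied by these settings (assumes the standard `get_linear_schedule_with_warmup` helper; the dummy parameters stand in for the real model, and the total step count is taken from the final row of the results table below):
```py
import torch
from transformers import get_linear_schedule_with_warmup

params = torch.nn.Linear(2, 2).parameters()  # dummy parameters for illustration
optimizer = torch.optim.AdamW(params, lr=4e-4, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=132,    # lr_scheduler_warmup_steps above
    num_training_steps=429,  # final optimizer step reported in the table
)
```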
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| No log | 1.0 | 40 | 0.1805 | 27.4633 |
| No log | 2.0 | 80 | 0.2522 | 54.9266 |
| 1.2203 | 3.0 | 120 | 0.2866 | 59.5388 |
| 1.2203 | 4.0 | 160 | 0.1374 | 43.3962 |
| 0.1533 | 5.0 | 200 | 0.1516 | 46.1216 |
| 0.1533 | 6.0 | 240 | 0.1406 | 53.4591 |
| 0.1533 | 7.0 | 280 | 0.0704 | 17.8197 |
| 0.0561 | 8.0 | 320 | 0.0487 | 15.0943 |
| 0.0561 | 9.0 | 360 | 0.0395 | 11.7400 |
| 0.0225 | 10.0 | 400 | 0.0291 | 6.2893 |
| 0.0225 | 10.7342 | 429 | 0.0302 | 8.3857 |
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold1 | MayBashendy | 2024-11-24T06:04:01Z | 163 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T18:10:40Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6928
- Qwk: 0.6488
- Mse: 0.6928
- Rmse: 0.8324
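A minimal inference sketch, assuming the checkpoint exposes a standard sequence-classification head (the input sentence is a placeholder):
```py
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص عربي للتجربة", return_tensors="pt")  # placeholder Arabic input
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted score/class index
```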
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.0645 | 2 | 3.2588 | 0.0755 | 3.2588 | 1.8052 |
| No log | 0.1290 | 4 | 2.2185 | 0.1881 | 2.2185 | 1.4894 |
| No log | 0.1935 | 6 | 1.8494 | 0.0000 | 1.8494 | 1.3599 |
| No log | 0.2581 | 8 | 1.4947 | 0.0000 | 1.4947 | 1.2226 |
| No log | 0.3226 | 10 | 1.1893 | 0.0000 | 1.1893 | 1.0906 |
| No log | 0.3871 | 12 | 1.0582 | 0.0000 | 1.0582 | 1.0287 |
| No log | 0.4516 | 14 | 1.0554 | 0.0000 | 1.0554 | 1.0273 |
| No log | 0.5161 | 16 | 1.1632 | 0.0000 | 1.1632 | 1.0785 |
| No log | 0.5806 | 18 | 1.2129 | 0.0000 | 1.2129 | 1.1013 |
| No log | 0.6452 | 20 | 1.1313 | 0.0000 | 1.1313 | 1.0636 |
| No log | 0.7097 | 22 | 1.0789 | 0.0000 | 1.0789 | 1.0387 |
| No log | 0.7742 | 24 | 1.0381 | 0.0187 | 1.0381 | 1.0189 |
| No log | 0.8387 | 26 | 0.9852 | 0.2146 | 0.9852 | 0.9926 |
| No log | 0.9032 | 28 | 0.9659 | 0.2613 | 0.9659 | 0.9828 |
| No log | 0.9677 | 30 | 0.9620 | 0.2376 | 0.9620 | 0.9808 |
| No log | 1.0323 | 32 | 0.9632 | 0.1910 | 0.9632 | 0.9814 |
| No log | 1.0968 | 34 | 0.9409 | 0.1910 | 0.9409 | 0.9700 |
| No log | 1.1613 | 36 | 0.8438 | 0.2632 | 0.8438 | 0.9186 |
| No log | 1.2258 | 38 | 0.7711 | 0.2921 | 0.7711 | 0.8781 |
| No log | 1.2903 | 40 | 0.7565 | 0.4199 | 0.7565 | 0.8698 |
| No log | 1.3548 | 42 | 0.8360 | 0.2613 | 0.8360 | 0.9144 |
| No log | 1.4194 | 44 | 1.0154 | 0.2146 | 1.0154 | 1.0077 |
| No log | 1.4839 | 46 | 1.0170 | 0.2146 | 1.0170 | 1.0085 |
| No log | 1.5484 | 48 | 0.9211 | 0.2146 | 0.9211 | 0.9597 |
| No log | 1.6129 | 50 | 0.8266 | 0.4286 | 0.8266 | 0.9091 |
| No log | 1.6774 | 52 | 0.7388 | 0.3913 | 0.7388 | 0.8595 |
| No log | 1.7419 | 54 | 0.7320 | 0.4199 | 0.7320 | 0.8556 |
| No log | 1.8065 | 56 | 0.7463 | 0.3708 | 0.7463 | 0.8639 |
| No log | 1.8710 | 58 | 0.7903 | 0.3834 | 0.7903 | 0.8890 |
| No log | 1.9355 | 60 | 0.8218 | 0.4286 | 0.8218 | 0.9066 |
| No log | 2.0 | 62 | 0.8105 | 0.4000 | 0.8105 | 0.9003 |
| No log | 2.0645 | 64 | 0.7506 | 0.3636 | 0.7506 | 0.8663 |
| No log | 2.1290 | 66 | 0.7074 | 0.3425 | 0.7074 | 0.8411 |
| No log | 2.1935 | 68 | 0.6949 | 0.4 | 0.6949 | 0.8336 |
| No log | 2.2581 | 70 | 0.6923 | 0.4 | 0.6923 | 0.8321 |
| No log | 2.3226 | 72 | 0.6849 | 0.3708 | 0.6849 | 0.8276 |
| No log | 2.3871 | 74 | 0.6800 | 0.3425 | 0.6800 | 0.8246 |
| No log | 2.4516 | 76 | 0.6856 | 0.3708 | 0.6856 | 0.8280 |
| No log | 2.5161 | 78 | 0.7206 | 0.4 | 0.7206 | 0.8489 |
| No log | 2.5806 | 80 | 0.7458 | 0.4 | 0.7458 | 0.8636 |
| No log | 2.6452 | 82 | 0.7865 | 0.4 | 0.7865 | 0.8869 |
| No log | 2.7097 | 84 | 0.7714 | 0.4 | 0.7714 | 0.8783 |
| No log | 2.7742 | 86 | 0.7297 | 0.4 | 0.7297 | 0.8542 |
| No log | 2.8387 | 88 | 0.6938 | 0.4 | 0.6938 | 0.8329 |
| No log | 2.9032 | 90 | 0.6866 | 0.4664 | 0.6866 | 0.8286 |
| No log | 2.9677 | 92 | 0.6969 | 0.4979 | 0.6969 | 0.8348 |
| No log | 3.0323 | 94 | 0.7069 | 0.5078 | 0.7069 | 0.8408 |
| No log | 3.0968 | 96 | 0.7196 | 0.4360 | 0.7196 | 0.8483 |
| No log | 3.1613 | 98 | 0.7742 | 0.4615 | 0.7742 | 0.8799 |
| No log | 3.2258 | 100 | 0.8040 | 0.3740 | 0.8040 | 0.8967 |
| No log | 3.2903 | 102 | 0.8239 | 0.4280 | 0.8239 | 0.9077 |
| No log | 3.3548 | 104 | 0.7697 | 0.3750 | 0.7697 | 0.8773 |
| No log | 3.4194 | 106 | 0.6902 | 0.5702 | 0.6902 | 0.8308 |
| No log | 3.4839 | 108 | 0.6574 | 0.5702 | 0.6574 | 0.8108 |
| No log | 3.5484 | 110 | 0.6496 | 0.5625 | 0.6496 | 0.8060 |
| No log | 3.6129 | 112 | 0.6463 | 0.6263 | 0.6463 | 0.8039 |
| No log | 3.6774 | 114 | 0.6425 | 0.6263 | 0.6425 | 0.8015 |
| No log | 3.7419 | 116 | 0.6256 | 0.6263 | 0.6256 | 0.7909 |
| No log | 3.8065 | 118 | 0.6080 | 0.5882 | 0.6080 | 0.7797 |
| No log | 3.8710 | 120 | 0.6767 | 0.6111 | 0.6767 | 0.8226 |
| No log | 3.9355 | 122 | 0.7441 | 0.4122 | 0.7441 | 0.8626 |
| No log | 4.0 | 124 | 0.7353 | 0.5039 | 0.7353 | 0.8575 |
| No log | 4.0645 | 126 | 0.7098 | 0.6111 | 0.7098 | 0.8425 |
| No log | 4.1290 | 128 | 0.6767 | 0.6111 | 0.6767 | 0.8226 |
| No log | 4.1935 | 130 | 0.6430 | 0.5911 | 0.6430 | 0.8018 |
| No log | 4.2581 | 132 | 0.6690 | 0.5911 | 0.6690 | 0.8179 |
| No log | 4.3226 | 134 | 0.7304 | 0.6111 | 0.7304 | 0.8546 |
| No log | 4.3871 | 136 | 0.8502 | 0.6111 | 0.8502 | 0.9221 |
| No log | 4.4516 | 138 | 0.9068 | 0.6387 | 0.9068 | 0.9522 |
| No log | 4.5161 | 140 | 0.8387 | 0.6020 | 0.8387 | 0.9158 |
| No log | 4.5806 | 142 | 0.7981 | 0.6111 | 0.7981 | 0.8934 |
| No log | 4.6452 | 144 | 0.7320 | 0.5455 | 0.7320 | 0.8556 |
| No log | 4.7097 | 146 | 0.7020 | 0.5455 | 0.7020 | 0.8378 |
| No log | 4.7742 | 148 | 0.6707 | 0.5363 | 0.6707 | 0.8190 |
| No log | 4.8387 | 150 | 0.6531 | 0.5333 | 0.6531 | 0.8081 |
| No log | 4.9032 | 152 | 0.6652 | 0.5455 | 0.6652 | 0.8156 |
| No log | 4.9677 | 154 | 0.7190 | 0.6111 | 0.7190 | 0.8479 |
| No log | 5.0323 | 156 | 0.7183 | 0.6111 | 0.7183 | 0.8475 |
| No log | 5.0968 | 158 | 0.6569 | 0.6111 | 0.6569 | 0.8105 |
| No log | 5.1613 | 160 | 0.6119 | 0.6111 | 0.6119 | 0.7823 |
| No log | 5.2258 | 162 | 0.6086 | 0.5911 | 0.6086 | 0.7801 |
| No log | 5.2903 | 164 | 0.6147 | 0.6000 | 0.6147 | 0.7840 |
| No log | 5.3548 | 166 | 0.6262 | 0.5522 | 0.6262 | 0.7913 |
| No log | 5.4194 | 168 | 0.6368 | 0.5522 | 0.6368 | 0.7980 |
| No log | 5.4839 | 170 | 0.6765 | 0.5455 | 0.6765 | 0.8225 |
| No log | 5.5484 | 172 | 0.7170 | 0.6291 | 0.7170 | 0.8467 |
| No log | 5.6129 | 174 | 0.7256 | 0.5911 | 0.7256 | 0.8518 |
| No log | 5.6774 | 176 | 0.7007 | 0.5911 | 0.7007 | 0.8371 |
| No log | 5.7419 | 178 | 0.6750 | 0.5911 | 0.6750 | 0.8216 |
| No log | 5.8065 | 180 | 0.6577 | 0.5911 | 0.6577 | 0.8110 |
| No log | 5.8710 | 182 | 0.6665 | 0.5911 | 0.6665 | 0.8164 |
| No log | 5.9355 | 184 | 0.6761 | 0.5455 | 0.6761 | 0.8223 |
| No log | 6.0 | 186 | 0.7069 | 0.5831 | 0.7069 | 0.8408 |
| No log | 6.0645 | 188 | 0.7041 | 0.5455 | 0.7041 | 0.8391 |
| No log | 6.1290 | 190 | 0.7042 | 0.5455 | 0.7042 | 0.8392 |
| No log | 6.1935 | 192 | 0.7336 | 0.5831 | 0.7336 | 0.8565 |
| No log | 6.2581 | 194 | 0.7710 | 0.6387 | 0.7710 | 0.8780 |
| No log | 6.3226 | 196 | 0.7610 | 0.6387 | 0.7610 | 0.8724 |
| No log | 6.3871 | 198 | 0.7231 | 0.5831 | 0.7231 | 0.8504 |
| No log | 6.4516 | 200 | 0.6948 | 0.5455 | 0.6948 | 0.8336 |
| No log | 6.5161 | 202 | 0.6865 | 0.5455 | 0.6865 | 0.8286 |
| No log | 6.5806 | 204 | 0.6906 | 0.5455 | 0.6906 | 0.8310 |
| No log | 6.6452 | 206 | 0.7321 | 0.6111 | 0.7321 | 0.8556 |
| No log | 6.7097 | 208 | 0.7441 | 0.6020 | 0.7441 | 0.8626 |
| No log | 6.7742 | 210 | 0.7014 | 0.6111 | 0.7014 | 0.8375 |
| No log | 6.8387 | 212 | 0.6877 | 0.5911 | 0.6877 | 0.8293 |
| No log | 6.9032 | 214 | 0.6825 | 0.5911 | 0.6825 | 0.8262 |
| No log | 6.9677 | 216 | 0.7006 | 0.5455 | 0.7006 | 0.8370 |
| No log | 7.0323 | 218 | 0.7219 | 0.5908 | 0.7219 | 0.8496 |
| No log | 7.0968 | 220 | 0.7444 | 0.5831 | 0.7444 | 0.8628 |
| No log | 7.1613 | 222 | 0.7626 | 0.5831 | 0.7626 | 0.8733 |
| No log | 7.2258 | 224 | 0.7543 | 0.5831 | 0.7543 | 0.8685 |
| No log | 7.2903 | 226 | 0.7292 | 0.5831 | 0.7292 | 0.8539 |
| No log | 7.3548 | 228 | 0.7330 | 0.5831 | 0.7330 | 0.8562 |
| No log | 7.4194 | 230 | 0.7137 | 0.5831 | 0.7137 | 0.8448 |
| No log | 7.4839 | 232 | 0.6915 | 0.5831 | 0.6915 | 0.8316 |
| No log | 7.5484 | 234 | 0.6643 | 0.5831 | 0.6643 | 0.8151 |
| No log | 7.6129 | 236 | 0.6801 | 0.5831 | 0.6801 | 0.8247 |
| No log | 7.6774 | 238 | 0.7095 | 0.5831 | 0.7095 | 0.8423 |
| No log | 7.7419 | 240 | 0.7413 | 0.6291 | 0.7413 | 0.8610 |
| No log | 7.8065 | 242 | 0.7676 | 0.6198 | 0.7676 | 0.8761 |
| No log | 7.8710 | 244 | 0.7785 | 0.6387 | 0.7785 | 0.8823 |
| No log | 7.9355 | 246 | 0.7696 | 0.6198 | 0.7696 | 0.8772 |
| No log | 8.0 | 248 | 0.7497 | 0.6291 | 0.7497 | 0.8658 |
| No log | 8.0645 | 250 | 0.7363 | 0.6291 | 0.7363 | 0.8581 |
| No log | 8.1290 | 252 | 0.7070 | 0.5831 | 0.7070 | 0.8408 |
| No log | 8.1935 | 254 | 0.6947 | 0.5831 | 0.6947 | 0.8335 |
| No log | 8.2581 | 256 | 0.6983 | 0.5831 | 0.6983 | 0.8356 |
| No log | 8.3226 | 258 | 0.7251 | 0.6291 | 0.7251 | 0.8515 |
| No log | 8.3871 | 260 | 0.7556 | 0.6387 | 0.7556 | 0.8692 |
| No log | 8.4516 | 262 | 0.7563 | 0.6387 | 0.7563 | 0.8697 |
| No log | 8.5161 | 264 | 0.7651 | 0.6387 | 0.7651 | 0.8747 |
| No log | 8.5806 | 266 | 0.7857 | 0.6387 | 0.7857 | 0.8864 |
| No log | 8.6452 | 268 | 0.8015 | 0.6387 | 0.8015 | 0.8953 |
| No log | 8.7097 | 270 | 0.8103 | 0.6387 | 0.8103 | 0.9002 |
| No log | 8.7742 | 272 | 0.7983 | 0.6387 | 0.7983 | 0.8935 |
| No log | 8.8387 | 274 | 0.7795 | 0.6387 | 0.7795 | 0.8829 |
| No log | 8.9032 | 276 | 0.7516 | 0.6013 | 0.7516 | 0.8669 |
| No log | 8.9677 | 278 | 0.7302 | 0.5831 | 0.7302 | 0.8545 |
| No log | 9.0323 | 280 | 0.7125 | 0.5831 | 0.7125 | 0.8441 |
| No log | 9.0968 | 282 | 0.7038 | 0.5831 | 0.7038 | 0.8389 |
| No log | 9.1613 | 284 | 0.6973 | 0.5831 | 0.6973 | 0.8350 |
| No log | 9.2258 | 286 | 0.6995 | 0.5831 | 0.6995 | 0.8364 |
| No log | 9.2903 | 288 | 0.7119 | 0.5831 | 0.7119 | 0.8438 |
| No log | 9.3548 | 290 | 0.7243 | 0.6488 | 0.7243 | 0.8510 |
| No log | 9.4194 | 292 | 0.7230 | 0.6488 | 0.7230 | 0.8503 |
| No log | 9.4839 | 294 | 0.7152 | 0.6488 | 0.7152 | 0.8457 |
| No log | 9.5484 | 296 | 0.7044 | 0.6488 | 0.7044 | 0.8393 |
| No log | 9.6129 | 298 | 0.7006 | 0.6488 | 0.7006 | 0.8370 |
| No log | 9.6774 | 300 | 0.6949 | 0.6488 | 0.6949 | 0.8336 |
| No log | 9.7419 | 302 | 0.6891 | 0.6488 | 0.6891 | 0.8301 |
| No log | 9.8065 | 304 | 0.6881 | 0.6488 | 0.6881 | 0.8295 |
| No log | 9.8710 | 306 | 0.6902 | 0.6488 | 0.6902 | 0.8308 |
| No log | 9.9355 | 308 | 0.6926 | 0.6488 | 0.6926 | 0.8322 |
| No log | 10.0 | 310 | 0.6928 | 0.6488 | 0.6928 | 0.8324 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
mlx-community/Qwen2.5-72B-Instruct-3bit | mlx-community | 2024-11-24T06:01:10Z | 18 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"chat",
"mlx",
"conversational",
"en",
"base_model:Qwen/Qwen2.5-72B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-72B-Instruct",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"3-bit",
"region:us"
] | text-generation | 2024-11-24T05:45:25Z | ---
license: other
license_name: qwen
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
language:
- en
pipeline_tag: text-generation
base_model: Qwen/Qwen2.5-72B-Instruct
tags:
- chat
- mlx
library_name: transformers
---
# mlx-community/Qwen2.5-72B-Instruct-3bit
The Model [mlx-community/Qwen2.5-72B-Instruct-3bit](https://huggingface.co/mlx-community/Qwen2.5-72B-Instruct-3bit) was converted to MLX format from [Qwen/Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) using mlx-lm version **0.19.3**.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Load the 3-bit quantized model and its tokenizer from the Hub.
model, tokenizer = load("mlx-community/Qwen2.5-72B-Instruct-3bit")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available.
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
|
breezedeus/cnstd-ppocr-ch_PP-OCRv4_det | breezedeus | 2024-11-24T05:59:41Z | 2,861 | 0 | null | [
"onnx",
"OCR",
"STD",
"MFD",
"Layout Analysis",
"Mathematical Formula",
"Chinese",
"English",
"Scene Text Detection",
"license:apache-2.0",
"region:us"
] | null | 2024-11-23T13:53:42Z | ---
license: apache-2.0
tags:
- OCR
- STD
- MFD
- Layout Analysis
- Mathematical Formula
- Chinese
- English
- Scene Text Detection
---
# Text Detection Model for CnSTD
CnSTD: a Python3 package, based on PyTorch, for Chinese/English Scene Text Detection, Mathematical Formula Detection (MFD), and Layout Analysis.
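A minimal detection sketch, assuming the Python API described in the CnSTD README (the image path is a placeholder, and the result field names may vary across versions):

```python
from cnstd import CnStd

std = CnStd()  # downloads the detector weights on first use

box_infos = std.detect('example.jpg')  # placeholder image path
for box_info in box_infos['detected_texts']:
    print(box_info['box'])  # corner coordinates of one detected text box
```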
See more information: [CnSTD](https://github.com/breezedeus/CnSTD). |
MayBashendy/Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold0 | MayBashendy | 2024-11-24T05:58:41Z | 164 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T18:07:34Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV5_k2_task1_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set (a short sketch of the Qwk metric follows the list):
- Loss: 0.9389
- Qwk: 0.6038
- Mse: 0.9389
- Rmse: 0.9690
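Qwk here is the quadratic weighted kappa: Cohen's kappa with quadratic penalties on ordinal disagreement. A minimal sketch of computing it with scikit-learn (the score vectors are illustrative):

```python
from sklearn.metrics import cohen_kappa_score

y_true = [0, 1, 2, 2, 3]  # illustrative gold scores
y_pred = [0, 1, 1, 2, 3]  # illustrative model scores

# Quadratic weights penalize large ordinal disagreements more heavily.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
print(f"Qwk: {qwk:.4f}")
```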
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0645 | 2 | 5.1961 | -0.0516 | 5.1961 | 2.2795 |
| No log | 0.1290 | 4 | 2.7972 | 0.1393 | 2.7972 | 1.6725 |
| No log | 0.1935 | 6 | 1.7676 | 0.2282 | 1.7676 | 1.3295 |
| No log | 0.2581 | 8 | 1.5234 | 0.2012 | 1.5234 | 1.2343 |
| No log | 0.3226 | 10 | 1.5411 | -0.0422 | 1.5411 | 1.2414 |
| No log | 0.3871 | 12 | 1.6195 | -0.0199 | 1.6195 | 1.2726 |
| No log | 0.4516 | 14 | 1.6384 | 0.0915 | 1.6384 | 1.2800 |
| No log | 0.5161 | 16 | 1.5858 | 0.1004 | 1.5858 | 1.2593 |
| No log | 0.5806 | 18 | 1.8284 | 0.0 | 1.8284 | 1.3522 |
| No log | 0.6452 | 20 | 1.6602 | 0.0 | 1.6602 | 1.2885 |
| No log | 0.7097 | 22 | 1.4763 | 0.0 | 1.4763 | 1.2150 |
| No log | 0.7742 | 24 | 1.4636 | 0.0758 | 1.4636 | 1.2098 |
| No log | 0.8387 | 26 | 1.6024 | 0.0 | 1.6024 | 1.2659 |
| No log | 0.9032 | 28 | 1.6943 | 0.0 | 1.6943 | 1.3017 |
| No log | 0.9677 | 30 | 1.6055 | -0.0564 | 1.6055 | 1.2671 |
| No log | 1.0323 | 32 | 1.6272 | -0.0590 | 1.6272 | 1.2756 |
| No log | 1.0968 | 34 | 1.6294 | -0.0862 | 1.6294 | 1.2765 |
| No log | 1.1613 | 36 | 1.5110 | 0.1163 | 1.5110 | 1.2292 |
| No log | 1.2258 | 38 | 1.3520 | 0.3209 | 1.3520 | 1.1627 |
| No log | 1.2903 | 40 | 1.2126 | 0.5056 | 1.2126 | 1.1012 |
| No log | 1.3548 | 42 | 1.1318 | 0.4296 | 1.1318 | 1.0639 |
| No log | 1.4194 | 44 | 1.1102 | 0.4310 | 1.1102 | 1.0536 |
| No log | 1.4839 | 46 | 1.0654 | 0.4310 | 1.0654 | 1.0322 |
| No log | 1.5484 | 48 | 1.0335 | 0.4044 | 1.0335 | 1.0166 |
| No log | 1.6129 | 50 | 0.9982 | 0.4296 | 0.9982 | 0.9991 |
| No log | 1.6774 | 52 | 1.0097 | 0.4296 | 1.0097 | 1.0049 |
| No log | 1.7419 | 54 | 0.9921 | 0.4296 | 0.9921 | 0.9960 |
| No log | 1.8065 | 56 | 0.9368 | 0.4296 | 0.9368 | 0.9679 |
| No log | 1.8710 | 58 | 0.9244 | 0.4856 | 0.9244 | 0.9615 |
| No log | 1.9355 | 60 | 0.8800 | 0.4565 | 0.8800 | 0.9381 |
| No log | 2.0 | 62 | 0.8665 | 0.4535 | 0.8665 | 0.9309 |
| No log | 2.0645 | 64 | 0.8340 | 0.5542 | 0.8340 | 0.9132 |
| No log | 2.1290 | 66 | 0.8387 | 0.3762 | 0.8387 | 0.9158 |
| No log | 2.1935 | 68 | 0.8619 | 0.3831 | 0.8619 | 0.9284 |
| No log | 2.2581 | 70 | 0.8752 | 0.4830 | 0.8752 | 0.9355 |
| No log | 2.3226 | 72 | 0.9132 | 0.5084 | 0.9132 | 0.9556 |
| No log | 2.3871 | 74 | 0.8404 | 0.4549 | 0.8404 | 0.9167 |
| No log | 2.4516 | 76 | 0.7507 | 0.6543 | 0.7507 | 0.8665 |
| No log | 2.5161 | 78 | 0.7993 | 0.5973 | 0.7993 | 0.8941 |
| No log | 2.5806 | 80 | 0.8367 | 0.5973 | 0.8367 | 0.9147 |
| No log | 2.6452 | 82 | 0.7988 | 0.5973 | 0.7988 | 0.8937 |
| No log | 2.7097 | 84 | 0.7789 | 0.7106 | 0.7789 | 0.8826 |
| No log | 2.7742 | 86 | 0.7944 | 0.5940 | 0.7944 | 0.8913 |
| No log | 2.8387 | 88 | 0.7879 | 0.6497 | 0.7879 | 0.8876 |
| No log | 2.9032 | 90 | 0.8038 | 0.6497 | 0.8038 | 0.8965 |
| No log | 2.9677 | 92 | 0.9166 | 0.5004 | 0.9166 | 0.9574 |
| No log | 3.0323 | 94 | 1.0422 | 0.5004 | 1.0422 | 1.0209 |
| No log | 3.0968 | 96 | 1.0352 | 0.5004 | 1.0352 | 1.0175 |
| No log | 3.1613 | 98 | 0.9392 | 0.5004 | 0.9392 | 0.9691 |
| No log | 3.2258 | 100 | 0.8472 | 0.6015 | 0.8472 | 0.9204 |
| No log | 3.2903 | 102 | 0.8335 | 0.5973 | 0.8335 | 0.9130 |
| No log | 3.3548 | 104 | 0.8737 | 0.5004 | 0.8737 | 0.9347 |
| No log | 3.4194 | 106 | 0.9547 | 0.5532 | 0.9547 | 0.9771 |
| No log | 3.4839 | 108 | 0.9698 | 0.5532 | 0.9698 | 0.9848 |
| No log | 3.5484 | 110 | 0.9656 | 0.5556 | 0.9656 | 0.9827 |
| No log | 3.6129 | 112 | 0.8630 | 0.5947 | 0.8630 | 0.9290 |
| No log | 3.6774 | 114 | 0.8061 | 0.6322 | 0.8061 | 0.8979 |
| No log | 3.7419 | 116 | 0.7877 | 0.6574 | 0.7877 | 0.8875 |
| No log | 3.8065 | 118 | 0.7993 | 0.6574 | 0.7993 | 0.8940 |
| No log | 3.8710 | 120 | 0.8664 | 0.6108 | 0.8664 | 0.9308 |
| No log | 3.9355 | 122 | 0.8692 | 0.6108 | 0.8692 | 0.9323 |
| No log | 4.0 | 124 | 0.8836 | 0.6108 | 0.8836 | 0.9400 |
| No log | 4.0645 | 126 | 0.8678 | 0.6108 | 0.8678 | 0.9315 |
| No log | 4.1290 | 128 | 0.8246 | 0.6108 | 0.8246 | 0.9081 |
| No log | 4.1935 | 130 | 0.8425 | 0.6379 | 0.8425 | 0.9179 |
| No log | 4.2581 | 132 | 0.8062 | 0.6736 | 0.8062 | 0.8979 |
| No log | 4.3226 | 134 | 0.8038 | 0.6519 | 0.8038 | 0.8965 |
| No log | 4.3871 | 136 | 0.8262 | 0.6519 | 0.8262 | 0.9090 |
| No log | 4.4516 | 138 | 0.7651 | 0.6519 | 0.7651 | 0.8747 |
| No log | 4.5161 | 140 | 0.7485 | 0.6569 | 0.7485 | 0.8652 |
| No log | 4.5806 | 142 | 0.7876 | 0.6545 | 0.7876 | 0.8874 |
| No log | 4.6452 | 144 | 0.7725 | 0.6211 | 0.7725 | 0.8789 |
| No log | 4.7097 | 146 | 0.7427 | 0.6265 | 0.7427 | 0.8618 |
| No log | 4.7742 | 148 | 0.7502 | 0.7169 | 0.7502 | 0.8662 |
| No log | 4.8387 | 150 | 0.7508 | 0.6975 | 0.7508 | 0.8665 |
| No log | 4.9032 | 152 | 0.7873 | 0.6211 | 0.7873 | 0.8873 |
| No log | 4.9677 | 154 | 0.8570 | 0.6545 | 0.8570 | 0.9258 |
| No log | 5.0323 | 156 | 0.9288 | 0.5556 | 0.9288 | 0.9637 |
| No log | 5.0968 | 158 | 0.8878 | 0.6071 | 0.8878 | 0.9422 |
| No log | 5.1613 | 160 | 0.8802 | 0.6071 | 0.8802 | 0.9382 |
| No log | 5.2258 | 162 | 0.8858 | 0.5540 | 0.8858 | 0.9412 |
| No log | 5.2903 | 164 | 0.8404 | 0.5589 | 0.8404 | 0.9167 |
| No log | 5.3548 | 166 | 0.8428 | 0.5563 | 0.8428 | 0.9180 |
| No log | 5.4194 | 168 | 0.8942 | 0.6071 | 0.8942 | 0.9456 |
| No log | 5.4839 | 170 | 0.8707 | 0.6115 | 0.8707 | 0.9331 |
| No log | 5.5484 | 172 | 0.8050 | 0.5876 | 0.8050 | 0.8972 |
| No log | 5.6129 | 174 | 0.7707 | 0.5882 | 0.7707 | 0.8779 |
| No log | 5.6774 | 176 | 0.7752 | 0.6688 | 0.7752 | 0.8805 |
| No log | 5.7419 | 178 | 0.7770 | 0.6400 | 0.7770 | 0.8815 |
| No log | 5.8065 | 180 | 0.8158 | 0.5903 | 0.8158 | 0.9032 |
| No log | 5.8710 | 182 | 0.8777 | 0.6870 | 0.8777 | 0.9369 |
| No log | 5.9355 | 184 | 0.8609 | 0.6404 | 0.8609 | 0.9279 |
| No log | 6.0 | 186 | 0.8194 | 0.6841 | 0.8194 | 0.9052 |
| No log | 6.0645 | 188 | 0.7459 | 0.6866 | 0.7459 | 0.8637 |
| No log | 6.1290 | 190 | 0.7250 | 0.6802 | 0.7250 | 0.8514 |
| No log | 6.1935 | 192 | 0.7251 | 0.6807 | 0.7251 | 0.8516 |
| No log | 6.2581 | 194 | 0.7728 | 0.6802 | 0.7728 | 0.8791 |
| No log | 6.3226 | 196 | 0.8861 | 0.6503 | 0.8861 | 0.9413 |
| No log | 6.3871 | 198 | 1.0312 | 0.5994 | 1.0312 | 1.0155 |
| No log | 6.4516 | 200 | 1.1239 | 0.5489 | 1.1239 | 1.0601 |
| No log | 6.5161 | 202 | 1.1906 | 0.6311 | 1.1906 | 1.0911 |
| No log | 6.5806 | 204 | 1.1297 | 0.6461 | 1.1297 | 1.0629 |
| No log | 6.6452 | 206 | 0.9689 | 0.6404 | 0.9689 | 0.9843 |
| No log | 6.7097 | 208 | 0.7945 | 0.6085 | 0.7945 | 0.8913 |
| No log | 6.7742 | 210 | 0.7018 | 0.6871 | 0.7018 | 0.8377 |
| No log | 6.8387 | 212 | 0.6950 | 0.6564 | 0.6950 | 0.8337 |
| No log | 6.9032 | 214 | 0.7039 | 0.6376 | 0.7039 | 0.8390 |
| No log | 6.9677 | 216 | 0.7088 | 0.6225 | 0.7088 | 0.8419 |
| No log | 7.0323 | 218 | 0.7157 | 0.6225 | 0.7157 | 0.8460 |
| No log | 7.0968 | 220 | 0.7209 | 0.6225 | 0.7209 | 0.8491 |
| No log | 7.1613 | 222 | 0.7586 | 0.6616 | 0.7586 | 0.8710 |
| No log | 7.2258 | 224 | 0.8059 | 0.6551 | 0.8059 | 0.8977 |
| No log | 7.2903 | 226 | 0.8397 | 0.5563 | 0.8397 | 0.9164 |
| No log | 7.3548 | 228 | 0.8248 | 0.6121 | 0.8248 | 0.9082 |
| No log | 7.4194 | 230 | 0.8071 | 0.6551 | 0.8071 | 0.8984 |
| No log | 7.4839 | 232 | 0.7995 | 0.6497 | 0.7995 | 0.8941 |
| No log | 7.5484 | 234 | 0.8211 | 0.6085 | 0.8211 | 0.9062 |
| No log | 7.6129 | 236 | 0.8525 | 0.6085 | 0.8525 | 0.9233 |
| No log | 7.6774 | 238 | 0.9010 | 0.6044 | 0.9010 | 0.9492 |
| No log | 7.7419 | 240 | 0.9732 | 0.6 | 0.9732 | 0.9865 |
| No log | 7.8065 | 242 | 1.0372 | 0.6398 | 1.0372 | 1.0184 |
| No log | 7.8710 | 244 | 1.0640 | 0.6398 | 1.0640 | 1.0315 |
| No log | 7.9355 | 246 | 1.0499 | 0.6398 | 1.0499 | 1.0246 |
| No log | 8.0 | 248 | 1.0529 | 0.6398 | 1.0529 | 1.0261 |
| No log | 8.0645 | 250 | 1.0564 | 0.6398 | 1.0564 | 1.0278 |
| No log | 8.1290 | 252 | 1.0270 | 0.6364 | 1.0270 | 1.0134 |
| No log | 8.1935 | 254 | 0.9760 | 0.6820 | 0.9760 | 0.9879 |
| No log | 8.2581 | 256 | 0.9227 | 0.6880 | 0.9227 | 0.9606 |
| No log | 8.3226 | 258 | 0.9063 | 0.6784 | 0.9063 | 0.9520 |
| No log | 8.3871 | 260 | 0.9130 | 0.6508 | 0.9130 | 0.9555 |
| No log | 8.4516 | 262 | 0.9082 | 0.6085 | 0.9082 | 0.9530 |
| No log | 8.5161 | 264 | 0.9145 | 0.6085 | 0.9145 | 0.9563 |
| No log | 8.5806 | 266 | 0.9232 | 0.6121 | 0.9232 | 0.9608 |
| No log | 8.6452 | 268 | 0.9152 | 0.6121 | 0.9152 | 0.9567 |
| No log | 8.7097 | 270 | 0.9145 | 0.6121 | 0.9145 | 0.9563 |
| No log | 8.7742 | 272 | 0.9175 | 0.6121 | 0.9175 | 0.9578 |
| No log | 8.8387 | 274 | 0.9328 | 0.6557 | 0.9328 | 0.9658 |
| No log | 8.9032 | 276 | 0.9564 | 0.6938 | 0.9564 | 0.9779 |
| No log | 8.9677 | 278 | 0.9614 | 0.6933 | 0.9614 | 0.9805 |
| No log | 9.0323 | 280 | 0.9686 | 0.6933 | 0.9686 | 0.9842 |
| No log | 9.0968 | 282 | 0.9666 | 0.6552 | 0.9666 | 0.9832 |
| No log | 9.1613 | 284 | 0.9554 | 0.6552 | 0.9554 | 0.9774 |
| No log | 9.2258 | 286 | 0.9443 | 0.6115 | 0.9443 | 0.9717 |
| No log | 9.2903 | 288 | 0.9539 | 0.6552 | 0.9539 | 0.9767 |
| No log | 9.3548 | 290 | 0.9739 | 0.6933 | 0.9739 | 0.9869 |
| No log | 9.4194 | 292 | 0.9743 | 0.6933 | 0.9743 | 0.9871 |
| No log | 9.4839 | 294 | 0.9679 | 0.6933 | 0.9679 | 0.9838 |
| No log | 9.5484 | 296 | 0.9673 | 0.6552 | 0.9673 | 0.9835 |
| No log | 9.6129 | 298 | 0.9705 | 0.6933 | 0.9705 | 0.9851 |
| No log | 9.6774 | 300 | 0.9650 | 0.6552 | 0.9650 | 0.9823 |
| No log | 9.7419 | 302 | 0.9565 | 0.6552 | 0.9565 | 0.9780 |
| No log | 9.8065 | 304 | 0.9487 | 0.6038 | 0.9487 | 0.9740 |
| No log | 9.8710 | 306 | 0.9425 | 0.6038 | 0.9425 | 0.9708 |
| No log | 9.9355 | 308 | 0.9397 | 0.6038 | 0.9397 | 0.9694 |
| No log | 10.0 | 310 | 0.9389 | 0.6038 | 0.9389 | 0.9690 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Eagalon/Marco-o1-abliterated-Q8_0-GGUF | Eagalon | 2024-11-24T05:15:03Z | 6 | 0 | transformers | [
"transformers",
"gguf",
"abliterated",
"uncensored",
"llama-cpp",
"gguf-my-repo",
"base_model:huihui-ai/Marco-o1-abliterated",
"base_model:quantized:huihui-ai/Marco-o1-abliterated",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-24T05:14:29Z | ---
license: apache-2.0
library_name: transformers
base_model: huihui-ai/Marco-o1-abliterated
tags:
- abliterated
- uncensored
- llama-cpp
- gguf-my-repo
---
# Eagalon/Marco-o1-abliterated-Q8_0-GGUF
This model was converted to GGUF format from [`huihui-ai/Marco-o1-abliterated`](https://huggingface.co/huihui-ai/Marco-o1-abliterated) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/huihui-ai/Marco-o1-abliterated) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Eagalon/Marco-o1-abliterated-Q8_0-GGUF --hf-file marco-o1-abliterated-q8_0.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Eagalon/Marco-o1-abliterated-Q8_0-GGUF --hf-file marco-o1-abliterated-q8_0.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with other hardware-specific flags (for example, `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Eagalon/Marco-o1-abliterated-Q8_0-GGUF --hf-file marco-o1-abliterated-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Eagalon/Marco-o1-abliterated-Q8_0-GGUF --hf-file marco-o1-abliterated-q8_0.gguf -c 2048
```
|
anvorja/bert-base-uncased-biobert | anvorja | 2024-11-24T04:57:24Z | 106 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-24T03:29:15Z | ---
library_name: transformers
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: biobert-ner-finetuned-con-kaggle
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# biobert-ner-finetuned
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0894
- Precision: 0.9293
- Recall: 0.9551
- F1: 0.9420
- Accuracy: 0.9795
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
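A hedged sketch of how this list maps onto 🤗 `TrainingArguments` (the output directory is a placeholder; argument names follow recent `transformers` releases):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="biobert-ner-finetuned",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
)
```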
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 306 | 0.2575 | 0.7864 | 0.8034 | 0.7948 | 0.9322 |
| 0.8692 | 2.0 | 612 | 0.0949 | 0.9170 | 0.9451 | 0.9308 | 0.9759 |
| 0.8692 | 3.0 | 918 | 0.0854 | 0.9234 | 0.9607 | 0.9417 | 0.9791 |
| 0.1096 | 4.0 | 1224 | 0.0768 | 0.9333 | 0.9585 | 0.9457 | 0.9809 |
| 0.0656 | 5.0 | 1530 | 0.0772 | 0.9320 | 0.9562 | 0.9439 | 0.9806 |
| 0.0656 | 6.0 | 1836 | 0.0810 | 0.9360 | 0.9575 | 0.9466 | 0.9806 |
| 0.0468 | 7.0 | 2142 | 0.0827 | 0.9308 | 0.9580 | 0.9442 | 0.9803 |
| 0.0468 | 8.0 | 2448 | 0.0890 | 0.9248 | 0.9568 | 0.9405 | 0.9788 |
| 0.038 | 9.0 | 2754 | 0.0859 | 0.9345 | 0.9579 | 0.9460 | 0.9806 |
| 0.031 | 10.0 | 3060 | 0.0894 | 0.9293 | 0.9551 | 0.9420 | 0.9795 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
PrunaAI/ZeroXClem-L3-Aspire-Heart-Matrix-8B-bnb-8bit-smashed | PrunaAI | 2024-11-24T04:44:05Z | 5 | 0 | null | [
"safetensors",
"llama",
"pruna-ai",
"base_model:ZeroXClem/L3-Aspire-Heart-Matrix-8B",
"base_model:quantized:ZeroXClem/L3-Aspire-Heart-Matrix-8B",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2024-11-24T04:34:31Z | ---
thumbnail: "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg"
base_model: ZeroXClem/L3-Aspire-Heart-Matrix-8B
metrics:
- memory_disk
- memory_inference
- inference_latency
- inference_throughput
- inference_CO2_emissions
- inference_energy_consumption
tags:
- pruna-ai
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://docs.pruna.ai/en/latest/setup/pip.html" target="_blank" rel="noopener noreferrer">
<img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/rskEr4BZJx)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentation to learn more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed with llm-int8.
- ***How does the model quality change?*** The quality of the model output might vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained with the configuration described in `model/smash_config.json`, after a hardware warmup. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running the model directly under your use-case conditions to see whether the smashed model benefits you.
- ***What is the model format?*** We use safetensors.
- ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to CUDA overheads.
- ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by synchronizing all GPU processes and stopping the measurement once all of them have finished; a minimal sketch of this style of measurement follows the list. "Async" metrics are obtained without synchronizing all GPU processes, stopping as soon as the model output can be used by the CPU. We provide both metrics since either can be relevant depending on the use-case. We recommend testing the efficiency gains directly in your use-cases.
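A minimal sketch of the "Sync"-style latency measurement described above, assuming a CUDA device (`fn` stands in for one model call, e.g. a `generate` invocation):

```python
import time

import torch

def sync_latency(fn):
    # Flush pending GPU work, run the call, then synchronize again so the
    # clock stops only after every queued kernel has finished ("Sync" metric).
    torch.cuda.synchronize()
    start = time.perf_counter()
    fn()
    torch.cuda.synchronize()
    return time.perf_counter() - start
```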
## Setup
You can run the smashed model with these steps:
0. Check that the requirements of the original repo ZeroXClem/L3-Aspire-Heart-Matrix-8B are installed. In particular, check the python, cuda, and transformers versions.
1. Make sure that you have installed quantization related packages.
```bash
# Quote the version spec so the shell does not treat ">" as a redirect.
pip install transformers accelerate "bitsandbytes>0.37.0"
```
2. Load & run the model.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the smashed (llm-int8 quantized) model and the original tokenizer.
model = AutoModelForCausalLM.from_pretrained("PrunaAI/ZeroXClem-L3-Aspire-Heart-Matrix-8B-bnb-8bit-smashed", trust_remote_code=True, device_map='auto')
tokenizer = AutoTokenizer.from_pretrained("ZeroXClem/L3-Aspire-Heart-Matrix-8B")

# Tokenize a prompt, generate up to 216 new tokens, and decode the result.
input_ids = tokenizer("What is the color of prunes?", return_tensors='pt').to(model.device)["input_ids"]
outputs = model.generate(input_ids, max_new_tokens=216)
tokenizer.decode(outputs[0])
```
## Configurations
The configuration info are in `smash_config.json`.
## Credits & License
The license of the smashed model follows the license of the original model. Please check the license of the original model ZeroXClem/L3-Aspire-Heart-Matrix-8B before using this model which provided the base model. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on Pypi.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Do it by yourself [here](https://docs.pruna.ai/en/latest/setup/pip.html). |
John6666/artemix-ponyr-sdxl | John6666 | 2024-11-24T04:36:55Z | 53 | 0 | diffusers | [
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"anime",
"cute",
"realistic",
"2.5D",
"pony",
"en",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | 2024-11-24T04:29:59Z | ---
license: other
license_name: faipl-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- anime
- cute
- realistic
- 2.5D
- pony
---
Original model is [here](https://civitai.com/models/161041?modelVersionId=1093331).
This model created by [arte_l](https://civitai.com/user/arte_l).
|
Pnyame8/project | Pnyame8 | 2024-11-24T04:29:05Z | 6 | 0 | null | [
"tensorboard",
"safetensors",
"vit",
"image-classification",
"pytorch",
"huggingpics",
"model-index",
"region:us"
] | image-classification | 2024-11-24T04:28:55Z | ---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: project
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 0.9850746393203735
---
# project
Autogenerated by HuggingPics🤗🖼️
Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).
Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
## Example Images
#### cow

#### goat

#### lion
 |
crawlwalkrun/exp_om4 | crawlwalkrun | 2024-11-24T03:54:58Z | 5 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:black-forest-labs/FLUX.1-schnell",
"base_model:adapter:black-forest-labs/FLUX.1-schnell",
"license:apache-2.0",
"region:us"
] | text-to-image | 2024-11-24T03:53:36Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: image
output:
url: images/so.png
base_model: black-forest-labs/FLUX.1-schnell
instance_prompt: null
license: apache-2.0
---
# exp_om4
<Gallery />
## Download model
Weights for this model are available in Safetensors format.
[Download](/crawlwalkrun/exp_om4/tree/main) them in the Files & versions tab.
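A hedged sketch of applying these LoRA weights on top of the base model with 🧨 diffusers (prompt and sampling settings are illustrative):

```python
import torch
from diffusers import FluxPipeline

# Load the FLUX.1-schnell base model, then attach this repo's LoRA weights.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
).to("cuda")
pipe.load_lora_weights("crawlwalkrun/exp_om4")

# schnell is distilled for few-step sampling with guidance_scale=0.
image = pipe("image", num_inference_steps=4, guidance_scale=0.0).images[0]
image.save("out.png")
```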
|
hardikg2907/code-llama-html-completion-1 | hardikg2907 | 2024-11-24T03:54:15Z | 12 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"en",
"dataset:hardikg2907/cleaned-dataset-1-500k",
"arxiv:1910.09700",
"base_model:codellama/CodeLlama-7b-hf",
"base_model:finetune:codellama/CodeLlama-7b-hf",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-20T14:37:00Z | ---
library_name: transformers
datasets:
- hardikg2907/cleaned-dataset-1-500k
language:
- en
base_model:
- codellama/CodeLlama-7b-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Pnyame8/animals | Pnyame8 | 2024-11-24T03:16:12Z | 7 | 0 | null | [
"tensorboard",
"safetensors",
"vit",
"image-classification",
"pytorch",
"huggingpics",
"model-index",
"region:us"
] | image-classification | 2024-11-24T03:16:01Z | ---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: animals
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 0.9910714030265808
---
# animals
Autogenerated by HuggingPics🤗🖼️
Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).
Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
## Example Images
#### cat

#### goat

#### horse

#### lion

#### sheep
 |
mergekit-community/UnslopNemo-v4.1-Magnum-v4-12B | mergekit-community | 2024-11-24T03:12:27Z | 38 | 2 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"mergekit",
"merge",
"base_model:TheDrummer/UnslopNemo-12B-v4.1",
"base_model:merge:TheDrummer/UnslopNemo-12B-v4.1",
"base_model:anthracite-org/magnum-v4-12b",
"base_model:merge:anthracite-org/magnum-v4-12b",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T03:04:54Z | ---
base_model:
- TheDrummer/UnslopNemo-12B-v4.1
- anthracite-org/magnum-v4-12b
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
### Models Merged
The following models were included in the merge:
* [TheDrummer/UnslopNemo-12B-v4.1](https://huggingface.co/TheDrummer/UnslopNemo-12B-v4.1)
* [anthracite-org/magnum-v4-12b](https://huggingface.co/anthracite-org/magnum-v4-12b)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: TheDrummer/UnslopNemo-12B-v4.1
- model: anthracite-org/magnum-v4-12b
merge_method: slerp
base_model: TheDrummer/UnslopNemo-12B-v4.1
parameters:
t: [0.1, 0.3, 0.6, 0.3, 0.1]
dtype: bfloat16
```
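For intuition, a minimal sketch of the per-tensor SLERP formula that this merge method applies, where `t` is the interpolation factor drawn from the schedule above (an illustration, not mergekit's actual implementation):

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    omega = torch.acos(torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:  # near-parallel tensors: fall back to plain lerp
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        merged = (torch.sin((1.0 - t) * omega) / so) * a_flat \
               + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape)
```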
|
MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold4 | MayBashendy | 2024-11-24T03:07:07Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T07:07:49Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6215
- Qwk: 0.5835
- Mse: 0.6215
- Rmse: 0.7883
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:-------:|:------:|
| No log | 0.0049 | 2 | 10.2024 | 0.0035 | 10.2024 | 3.1941 |
| No log | 0.0099 | 4 | 8.6111 | 0.0018 | 8.6111 | 2.9345 |
| No log | 0.0148 | 6 | 6.9434 | 0.0018 | 6.9434 | 2.6350 |
| No log | 0.0197 | 8 | 5.5076 | 0.0607 | 5.5076 | 2.3468 |
| No log | 0.0246 | 10 | 4.5662 | 0.0074 | 4.5662 | 2.1369 |
| No log | 0.0296 | 12 | 3.5615 | 0.0079 | 3.5615 | 1.8872 |
| No log | 0.0345 | 14 | 2.7900 | 0.0079 | 2.7900 | 1.6703 |
| No log | 0.0394 | 16 | 2.1974 | 0.1284 | 2.1974 | 1.4824 |
| No log | 0.0443 | 18 | 1.6442 | 0.0445 | 1.6442 | 1.2823 |
| No log | 0.0493 | 20 | 1.2816 | 0.0316 | 1.2816 | 1.1321 |
| No log | 0.0542 | 22 | 1.0325 | 0.0316 | 1.0325 | 1.0161 |
| No log | 0.0591 | 24 | 0.8507 | 0.3518 | 0.8507 | 0.9223 |
| No log | 0.0640 | 26 | 0.8325 | 0.1026 | 0.8325 | 0.9124 |
| No log | 0.0690 | 28 | 0.8790 | 0.0493 | 0.8790 | 0.9375 |
| No log | 0.0739 | 30 | 0.9754 | 0.0454 | 0.9754 | 0.9876 |
| No log | 0.0788 | 32 | 0.9975 | 0.0454 | 0.9975 | 0.9987 |
| No log | 0.0837 | 34 | 1.1711 | 0.0524 | 1.1711 | 1.0822 |
| No log | 0.0887 | 36 | 0.8027 | 0.1020 | 0.8027 | 0.8959 |
| No log | 0.0936 | 38 | 0.7554 | 0.1873 | 0.7554 | 0.8691 |
| No log | 0.0985 | 40 | 0.8249 | 0.0815 | 0.8249 | 0.9082 |
| No log | 0.1034 | 42 | 0.9253 | 0.0549 | 0.9253 | 0.9619 |
| No log | 0.1084 | 44 | 0.7558 | 0.1910 | 0.7558 | 0.8694 |
| No log | 0.1133 | 46 | 0.7422 | 0.2181 | 0.7422 | 0.8615 |
| No log | 0.1182 | 48 | 0.8660 | 0.0993 | 0.8660 | 0.9306 |
| No log | 0.1232 | 50 | 1.1622 | 0.2804 | 1.1622 | 1.0781 |
| No log | 0.1281 | 52 | 0.8938 | 0.2369 | 0.8938 | 0.9454 |
| No log | 0.1330 | 54 | 0.6613 | 0.3898 | 0.6613 | 0.8132 |
| No log | 0.1379 | 56 | 0.7145 | 0.4016 | 0.7145 | 0.8453 |
| No log | 0.1429 | 58 | 0.6071 | 0.3712 | 0.6071 | 0.7792 |
| No log | 0.1478 | 60 | 0.5986 | 0.3847 | 0.5986 | 0.7737 |
| No log | 0.1527 | 62 | 0.5716 | 0.4019 | 0.5716 | 0.7561 |
| No log | 0.1576 | 64 | 0.6156 | 0.4243 | 0.6156 | 0.7846 |
| No log | 0.1626 | 66 | 0.7946 | 0.4359 | 0.7946 | 0.8914 |
| No log | 0.1675 | 68 | 0.8161 | 0.4327 | 0.8161 | 0.9034 |
| No log | 0.1724 | 70 | 0.6451 | 0.4333 | 0.6451 | 0.8032 |
| No log | 0.1773 | 72 | 0.7140 | 0.4251 | 0.7140 | 0.8450 |
| No log | 0.1823 | 74 | 0.9974 | 0.3740 | 0.9974 | 0.9987 |
| No log | 0.1872 | 76 | 0.9087 | 0.3962 | 0.9087 | 0.9533 |
| No log | 0.1921 | 78 | 0.7836 | 0.4533 | 0.7836 | 0.8852 |
| No log | 0.1970 | 80 | 0.7285 | 0.4763 | 0.7285 | 0.8535 |
| No log | 0.2020 | 82 | 0.5527 | 0.4897 | 0.5527 | 0.7434 |
| No log | 0.2069 | 84 | 0.5552 | 0.5203 | 0.5552 | 0.7451 |
| No log | 0.2118 | 86 | 0.6267 | 0.4863 | 0.6267 | 0.7916 |
| No log | 0.2167 | 88 | 1.0184 | 0.3981 | 1.0184 | 1.0091 |
| No log | 0.2217 | 90 | 0.9069 | 0.4227 | 0.9069 | 0.9523 |
| No log | 0.2266 | 92 | 0.5708 | 0.5256 | 0.5708 | 0.7555 |
| No log | 0.2315 | 94 | 0.6862 | 0.3953 | 0.6862 | 0.8284 |
| No log | 0.2365 | 96 | 0.7177 | 0.3804 | 0.7177 | 0.8472 |
| No log | 0.2414 | 98 | 0.5391 | 0.5272 | 0.5391 | 0.7342 |
| No log | 0.2463 | 100 | 0.5296 | 0.5311 | 0.5296 | 0.7277 |
| No log | 0.2512 | 102 | 0.5360 | 0.5820 | 0.5360 | 0.7321 |
| No log | 0.2562 | 104 | 0.5280 | 0.5707 | 0.5280 | 0.7267 |
| No log | 0.2611 | 106 | 0.5192 | 0.5525 | 0.5192 | 0.7205 |
| No log | 0.2660 | 108 | 0.5556 | 0.5788 | 0.5556 | 0.7454 |
| No log | 0.2709 | 110 | 0.5838 | 0.5453 | 0.5838 | 0.7641 |
| No log | 0.2759 | 112 | 0.6717 | 0.5195 | 0.6717 | 0.8196 |
| No log | 0.2808 | 114 | 0.8962 | 0.4468 | 0.8962 | 0.9467 |
| No log | 0.2857 | 116 | 1.1372 | 0.3628 | 1.1372 | 1.0664 |
| No log | 0.2906 | 118 | 0.9467 | 0.4496 | 0.9467 | 0.9730 |
| No log | 0.2956 | 120 | 0.6575 | 0.4989 | 0.6575 | 0.8108 |
| No log | 0.3005 | 122 | 0.6981 | 0.5134 | 0.6981 | 0.8355 |
| No log | 0.3054 | 124 | 0.9383 | 0.4587 | 0.9383 | 0.9687 |
| No log | 0.3103 | 126 | 0.9933 | 0.4196 | 0.9933 | 0.9966 |
| No log | 0.3153 | 128 | 0.6477 | 0.4793 | 0.6477 | 0.8048 |
| No log | 0.3202 | 130 | 0.5635 | 0.4768 | 0.5635 | 0.7507 |
| No log | 0.3251 | 132 | 0.5558 | 0.5039 | 0.5558 | 0.7455 |
| No log | 0.3300 | 134 | 0.7659 | 0.4872 | 0.7659 | 0.8751 |
| No log | 0.3350 | 136 | 0.7660 | 0.4930 | 0.7660 | 0.8752 |
| No log | 0.3399 | 138 | 0.5414 | 0.5581 | 0.5414 | 0.7358 |
| No log | 0.3448 | 140 | 0.5147 | 0.5692 | 0.5147 | 0.7174 |
| No log | 0.3498 | 142 | 0.5337 | 0.5498 | 0.5337 | 0.7305 |
| No log | 0.3547 | 144 | 0.5214 | 0.5657 | 0.5214 | 0.7221 |
| No log | 0.3596 | 146 | 0.5495 | 0.5751 | 0.5495 | 0.7413 |
| No log | 0.3645 | 148 | 0.5478 | 0.5611 | 0.5478 | 0.7401 |
| No log | 0.3695 | 150 | 0.6485 | 0.5407 | 0.6485 | 0.8053 |
| No log | 0.3744 | 152 | 0.6751 | 0.5224 | 0.6751 | 0.8216 |
| No log | 0.3793 | 154 | 0.8339 | 0.4949 | 0.8339 | 0.9132 |
| No log | 0.3842 | 156 | 0.8045 | 0.4923 | 0.8045 | 0.8970 |
| No log | 0.3892 | 158 | 0.7684 | 0.4719 | 0.7684 | 0.8766 |
| No log | 0.3941 | 160 | 0.6264 | 0.5153 | 0.6264 | 0.7915 |
| No log | 0.3990 | 162 | 0.5496 | 0.5114 | 0.5496 | 0.7413 |
| No log | 0.4039 | 164 | 0.5544 | 0.5290 | 0.5544 | 0.7446 |
| No log | 0.4089 | 166 | 0.6284 | 0.5203 | 0.6284 | 0.7927 |
| No log | 0.4138 | 168 | 0.7093 | 0.5138 | 0.7093 | 0.8422 |
| No log | 0.4187 | 170 | 0.6742 | 0.5245 | 0.6742 | 0.8211 |
| No log | 0.4236 | 172 | 0.5290 | 0.5628 | 0.5290 | 0.7273 |
| No log | 0.4286 | 174 | 0.5285 | 0.5505 | 0.5285 | 0.7270 |
| No log | 0.4335 | 176 | 0.6772 | 0.5107 | 0.6772 | 0.8229 |
| No log | 0.4384 | 178 | 1.1104 | 0.3932 | 1.1104 | 1.0538 |
| No log | 0.4433 | 180 | 0.9902 | 0.4457 | 0.9902 | 0.9951 |
| No log | 0.4483 | 182 | 0.6380 | 0.5112 | 0.6380 | 0.7987 |
| No log | 0.4532 | 184 | 0.5366 | 0.5754 | 0.5366 | 0.7325 |
| No log | 0.4581 | 186 | 0.5233 | 0.6089 | 0.5233 | 0.7234 |
| No log | 0.4631 | 188 | 0.5784 | 0.5525 | 0.5784 | 0.7606 |
| No log | 0.4680 | 190 | 0.7200 | 0.5037 | 0.7200 | 0.8485 |
| No log | 0.4729 | 192 | 0.6236 | 0.5484 | 0.6236 | 0.7897 |
| No log | 0.4778 | 194 | 0.5337 | 0.6114 | 0.5337 | 0.7305 |
| No log | 0.4828 | 196 | 0.5637 | 0.5281 | 0.5637 | 0.7508 |
| No log | 0.4877 | 198 | 0.5420 | 0.5621 | 0.5420 | 0.7362 |
| No log | 0.4926 | 200 | 0.6924 | 0.5013 | 0.6924 | 0.8321 |
| No log | 0.4975 | 202 | 0.9465 | 0.4363 | 0.9465 | 0.9729 |
| No log | 0.5025 | 204 | 0.8229 | 0.4472 | 0.8229 | 0.9071 |
| No log | 0.5074 | 206 | 0.5558 | 0.5468 | 0.5558 | 0.7455 |
| No log | 0.5123 | 208 | 0.5387 | 0.5537 | 0.5387 | 0.7339 |
| No log | 0.5172 | 210 | 0.5254 | 0.5669 | 0.5254 | 0.7248 |
| No log | 0.5222 | 212 | 0.6134 | 0.5174 | 0.6134 | 0.7832 |
| No log | 0.5271 | 214 | 0.6843 | 0.5178 | 0.6843 | 0.8272 |
| No log | 0.5320 | 216 | 0.5729 | 0.5693 | 0.5729 | 0.7569 |
| No log | 0.5369 | 218 | 0.5405 | 0.5703 | 0.5405 | 0.7352 |
| No log | 0.5419 | 220 | 0.5901 | 0.5277 | 0.5901 | 0.7682 |
| No log | 0.5468 | 222 | 0.7462 | 0.4707 | 0.7462 | 0.8638 |
| No log | 0.5517 | 224 | 0.9278 | 0.4026 | 0.9278 | 0.9632 |
| No log | 0.5567 | 226 | 0.8518 | 0.3979 | 0.8518 | 0.9229 |
| No log | 0.5616 | 228 | 0.6941 | 0.4446 | 0.6941 | 0.8331 |
| No log | 0.5665 | 230 | 0.7366 | 0.4447 | 0.7366 | 0.8583 |
| No log | 0.5714 | 232 | 0.9810 | 0.3867 | 0.9810 | 0.9905 |
| No log | 0.5764 | 234 | 0.9273 | 0.4120 | 0.9273 | 0.9630 |
| No log | 0.5813 | 236 | 0.6376 | 0.4818 | 0.6376 | 0.7985 |
| No log | 0.5862 | 238 | 0.5530 | 0.5264 | 0.5530 | 0.7436 |
| No log | 0.5911 | 240 | 0.5440 | 0.5598 | 0.5440 | 0.7375 |
| No log | 0.5961 | 242 | 0.5879 | 0.5305 | 0.5879 | 0.7667 |
| No log | 0.6010 | 244 | 0.6726 | 0.5209 | 0.6726 | 0.8201 |
| No log | 0.6059 | 246 | 0.5916 | 0.5826 | 0.5916 | 0.7692 |
| No log | 0.6108 | 248 | 0.5284 | 0.6160 | 0.5284 | 0.7269 |
| No log | 0.6158 | 250 | 0.5199 | 0.5919 | 0.5199 | 0.7210 |
| No log | 0.6207 | 252 | 0.5181 | 0.5803 | 0.5181 | 0.7198 |
| No log | 0.6256 | 254 | 0.5454 | 0.6075 | 0.5454 | 0.7385 |
| No log | 0.6305 | 256 | 0.5264 | 0.5964 | 0.5264 | 0.7255 |
| No log | 0.6355 | 258 | 0.5361 | 0.5467 | 0.5361 | 0.7322 |
| No log | 0.6404 | 260 | 0.5224 | 0.5523 | 0.5224 | 0.7227 |
| No log | 0.6453 | 262 | 0.5329 | 0.5847 | 0.5329 | 0.7300 |
| No log | 0.6502 | 264 | 0.5846 | 0.5514 | 0.5846 | 0.7646 |
| No log | 0.6552 | 266 | 0.5440 | 0.5705 | 0.5440 | 0.7376 |
| No log | 0.6601 | 268 | 0.5917 | 0.5490 | 0.5917 | 0.7692 |
| No log | 0.6650 | 270 | 0.5993 | 0.5432 | 0.5993 | 0.7741 |
| No log | 0.6700 | 272 | 0.5648 | 0.5431 | 0.5648 | 0.7515 |
| No log | 0.6749 | 274 | 0.5999 | 0.5495 | 0.5999 | 0.7746 |
| No log | 0.6798 | 276 | 0.7386 | 0.4909 | 0.7386 | 0.8594 |
| No log | 0.6847 | 278 | 1.0426 | 0.4491 | 1.0426 | 1.0211 |
| No log | 0.6897 | 280 | 0.9005 | 0.4801 | 0.9005 | 0.9490 |
| No log | 0.6946 | 282 | 0.5863 | 0.5764 | 0.5863 | 0.7657 |
| No log | 0.6995 | 284 | 0.5309 | 0.5578 | 0.5309 | 0.7286 |
| No log | 0.7044 | 286 | 0.5361 | 0.5799 | 0.5361 | 0.7322 |
| No log | 0.7094 | 288 | 0.7339 | 0.4871 | 0.7339 | 0.8567 |
| No log | 0.7143 | 290 | 0.8310 | 0.4768 | 0.8310 | 0.9116 |
| No log | 0.7192 | 292 | 0.6184 | 0.5447 | 0.6184 | 0.7864 |
| No log | 0.7241 | 294 | 0.5325 | 0.5704 | 0.5325 | 0.7298 |
| No log | 0.7291 | 296 | 0.7291 | 0.4214 | 0.7291 | 0.8539 |
| No log | 0.7340 | 298 | 0.7356 | 0.4302 | 0.7356 | 0.8577 |
| No log | 0.7389 | 300 | 0.5650 | 0.5630 | 0.5650 | 0.7516 |
| No log | 0.7438 | 302 | 0.5782 | 0.6177 | 0.5782 | 0.7604 |
| No log | 0.7488 | 304 | 0.6449 | 0.5815 | 0.6449 | 0.8031 |
| No log | 0.7537 | 306 | 0.5584 | 0.6166 | 0.5584 | 0.7473 |
| No log | 0.7586 | 308 | 0.5439 | 0.5793 | 0.5439 | 0.7375 |
| No log | 0.7635 | 310 | 0.5850 | 0.5707 | 0.5850 | 0.7649 |
| No log | 0.7685 | 312 | 0.5343 | 0.6016 | 0.5343 | 0.7309 |
| No log | 0.7734 | 314 | 0.5924 | 0.5813 | 0.5924 | 0.7697 |
| No log | 0.7783 | 316 | 0.7225 | 0.5503 | 0.7225 | 0.8500 |
| No log | 0.7833 | 318 | 0.6295 | 0.5848 | 0.6295 | 0.7934 |
| No log | 0.7882 | 320 | 0.5184 | 0.5971 | 0.5184 | 0.7200 |
| No log | 0.7931 | 322 | 0.5259 | 0.5849 | 0.5259 | 0.7252 |
| No log | 0.7980 | 324 | 0.5993 | 0.5657 | 0.5993 | 0.7741 |
| No log | 0.8030 | 326 | 0.6472 | 0.5462 | 0.6472 | 0.8045 |
| No log | 0.8079 | 328 | 0.7185 | 0.5132 | 0.7185 | 0.8477 |
| No log | 0.8128 | 330 | 0.6743 | 0.5380 | 0.6743 | 0.8212 |
| No log | 0.8177 | 332 | 0.5992 | 0.5750 | 0.5992 | 0.7741 |
| No log | 0.8227 | 334 | 0.6390 | 0.5633 | 0.6390 | 0.7994 |
| No log | 0.8276 | 336 | 0.5926 | 0.5944 | 0.5926 | 0.7698 |
| No log | 0.8325 | 338 | 0.5311 | 0.6209 | 0.5311 | 0.7288 |
| No log | 0.8374 | 340 | 0.5421 | 0.6280 | 0.5421 | 0.7363 |
| No log | 0.8424 | 342 | 0.5824 | 0.6306 | 0.5824 | 0.7632 |
| No log | 0.8473 | 344 | 0.6236 | 0.6111 | 0.6236 | 0.7897 |
| No log | 0.8522 | 346 | 0.6982 | 0.5574 | 0.6982 | 0.8356 |
| No log | 0.8571 | 348 | 0.5674 | 0.6167 | 0.5674 | 0.7532 |
| No log | 0.8621 | 350 | 0.5554 | 0.6172 | 0.5554 | 0.7452 |
| No log | 0.8670 | 352 | 0.6068 | 0.5907 | 0.6068 | 0.7790 |
| No log | 0.8719 | 354 | 0.5836 | 0.6061 | 0.5836 | 0.7639 |
| No log | 0.8768 | 356 | 0.5551 | 0.6292 | 0.5551 | 0.7451 |
| No log | 0.8818 | 358 | 0.5496 | 0.6408 | 0.5496 | 0.7413 |
| No log | 0.8867 | 360 | 0.5611 | 0.6367 | 0.5611 | 0.7491 |
| No log | 0.8916 | 362 | 0.5788 | 0.6108 | 0.5788 | 0.7608 |
| No log | 0.8966 | 364 | 0.5864 | 0.6294 | 0.5864 | 0.7658 |
| No log | 0.9015 | 366 | 0.6137 | 0.6245 | 0.6137 | 0.7834 |
| No log | 0.9064 | 368 | 0.6037 | 0.6264 | 0.6037 | 0.7770 |
| No log | 0.9113 | 370 | 0.6074 | 0.6249 | 0.6074 | 0.7793 |
| No log | 0.9163 | 372 | 0.6092 | 0.5956 | 0.6092 | 0.7805 |
| No log | 0.9212 | 374 | 0.5595 | 0.6471 | 0.5595 | 0.7480 |
| No log | 0.9261 | 376 | 0.5721 | 0.6445 | 0.5721 | 0.7564 |
| No log | 0.9310 | 378 | 0.5497 | 0.6146 | 0.5497 | 0.7414 |
| No log | 0.9360 | 380 | 0.6172 | 0.6017 | 0.6172 | 0.7856 |
| No log | 0.9409 | 382 | 0.8842 | 0.4809 | 0.8842 | 0.9403 |
| No log | 0.9458 | 384 | 0.8845 | 0.4736 | 0.8845 | 0.9405 |
| No log | 0.9507 | 386 | 0.6287 | 0.5626 | 0.6287 | 0.7929 |
| No log | 0.9557 | 388 | 0.5563 | 0.5889 | 0.5563 | 0.7459 |
| No log | 0.9606 | 390 | 0.5666 | 0.6146 | 0.5666 | 0.7527 |
| No log | 0.9655 | 392 | 0.6792 | 0.5609 | 0.6792 | 0.8241 |
| No log | 0.9704 | 394 | 0.6578 | 0.5619 | 0.6578 | 0.8110 |
| No log | 0.9754 | 396 | 0.5254 | 0.6050 | 0.5254 | 0.7249 |
| No log | 0.9803 | 398 | 0.5173 | 0.6103 | 0.5173 | 0.7192 |
| No log | 0.9852 | 400 | 0.5383 | 0.5996 | 0.5383 | 0.7337 |
| No log | 0.9901 | 402 | 0.5584 | 0.6138 | 0.5584 | 0.7473 |
| No log | 0.9951 | 404 | 0.5273 | 0.6194 | 0.5273 | 0.7262 |
| No log | 1.0 | 406 | 0.5224 | 0.6045 | 0.5224 | 0.7228 |
| No log | 1.0049 | 408 | 0.5333 | 0.5858 | 0.5333 | 0.7303 |
| No log | 1.0099 | 410 | 0.5295 | 0.6231 | 0.5295 | 0.7277 |
| No log | 1.0148 | 412 | 0.7751 | 0.5090 | 0.7751 | 0.8804 |
| No log | 1.0197 | 414 | 0.8304 | 0.4884 | 0.8304 | 0.9113 |
| No log | 1.0246 | 416 | 0.6393 | 0.5131 | 0.6393 | 0.7996 |
| No log | 1.0296 | 418 | 0.5274 | 0.5963 | 0.5274 | 0.7262 |
| No log | 1.0345 | 420 | 0.5245 | 0.5805 | 0.5245 | 0.7242 |
| No log | 1.0394 | 422 | 0.5844 | 0.5528 | 0.5844 | 0.7645 |
| No log | 1.0443 | 424 | 0.6738 | 0.5157 | 0.6738 | 0.8209 |
| No log | 1.0493 | 426 | 0.5926 | 0.5530 | 0.5926 | 0.7698 |
| No log | 1.0542 | 428 | 0.5525 | 0.5679 | 0.5525 | 0.7433 |
| No log | 1.0591 | 430 | 0.5286 | 0.6033 | 0.5286 | 0.7271 |
| No log | 1.0640 | 432 | 0.5507 | 0.5875 | 0.5507 | 0.7421 |
| No log | 1.0690 | 434 | 0.5707 | 0.5953 | 0.5707 | 0.7554 |
| No log | 1.0739 | 436 | 0.5642 | 0.6096 | 0.5642 | 0.7511 |
| No log | 1.0788 | 438 | 0.5686 | 0.6132 | 0.5686 | 0.7541 |
| No log | 1.0837 | 440 | 0.5804 | 0.6400 | 0.5804 | 0.7619 |
| No log | 1.0887 | 442 | 0.5833 | 0.6269 | 0.5833 | 0.7638 |
| No log | 1.0936 | 444 | 0.5306 | 0.6374 | 0.5306 | 0.7284 |
| No log | 1.0985 | 446 | 0.5338 | 0.6348 | 0.5338 | 0.7306 |
| No log | 1.1034 | 448 | 0.5428 | 0.6254 | 0.5428 | 0.7368 |
| No log | 1.1084 | 450 | 0.5453 | 0.6149 | 0.5453 | 0.7384 |
| No log | 1.1133 | 452 | 0.5698 | 0.5978 | 0.5698 | 0.7548 |
| No log | 1.1182 | 454 | 0.6823 | 0.5602 | 0.6823 | 0.8260 |
| No log | 1.1232 | 456 | 0.6192 | 0.5805 | 0.6192 | 0.7869 |
| No log | 1.1281 | 458 | 0.5344 | 0.5937 | 0.5344 | 0.7310 |
| No log | 1.1330 | 460 | 0.5751 | 0.5375 | 0.5751 | 0.7584 |
| No log | 1.1379 | 462 | 0.5318 | 0.6044 | 0.5318 | 0.7292 |
| No log | 1.1429 | 464 | 0.6556 | 0.5835 | 0.6556 | 0.8097 |
| No log | 1.1478 | 466 | 0.8071 | 0.5160 | 0.8071 | 0.8984 |
| No log | 1.1527 | 468 | 0.6838 | 0.5670 | 0.6838 | 0.8269 |
| No log | 1.1576 | 470 | 0.5480 | 0.6034 | 0.5480 | 0.7403 |
| No log | 1.1626 | 472 | 0.5545 | 0.5908 | 0.5545 | 0.7446 |
| No log | 1.1675 | 474 | 0.6123 | 0.5686 | 0.6123 | 0.7825 |
| No log | 1.1724 | 476 | 0.7098 | 0.5469 | 0.7098 | 0.8425 |
| No log | 1.1773 | 478 | 0.6827 | 0.5726 | 0.6827 | 0.8262 |
| No log | 1.1823 | 480 | 0.5501 | 0.6158 | 0.5501 | 0.7417 |
| No log | 1.1872 | 482 | 0.5440 | 0.6127 | 0.5440 | 0.7376 |
| No log | 1.1921 | 484 | 0.5919 | 0.6135 | 0.5919 | 0.7693 |
| No log | 1.1970 | 486 | 0.5429 | 0.6103 | 0.5429 | 0.7368 |
| No log | 1.2020 | 488 | 0.5360 | 0.6210 | 0.5360 | 0.7321 |
| No log | 1.2069 | 490 | 0.5071 | 0.6323 | 0.5071 | 0.7121 |
| No log | 1.2118 | 492 | 0.5161 | 0.6447 | 0.5161 | 0.7184 |
| No log | 1.2167 | 494 | 0.5275 | 0.6321 | 0.5275 | 0.7263 |
| No log | 1.2217 | 496 | 0.5750 | 0.6109 | 0.5750 | 0.7583 |
| No log | 1.2266 | 498 | 0.5897 | 0.6013 | 0.5897 | 0.7679 |
| 0.8711 | 1.2315 | 500 | 0.5274 | 0.6038 | 0.5274 | 0.7262 |
| 0.8711 | 1.2365 | 502 | 0.5365 | 0.5941 | 0.5365 | 0.7324 |
| 0.8711 | 1.2414 | 504 | 0.5567 | 0.5855 | 0.5567 | 0.7462 |
| 0.8711 | 1.2463 | 506 | 0.5612 | 0.5775 | 0.5612 | 0.7492 |
| 0.8711 | 1.2512 | 508 | 0.6545 | 0.5623 | 0.6545 | 0.8090 |
| 0.8711 | 1.2562 | 510 | 0.6215 | 0.5835 | 0.6215 | 0.7883 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
slobers/pk29-2-24 | slobers | 2024-11-24T03:04:59Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T03:00:08Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
amang1802/Llama3.2-1B-summary-length-exp5 | amang1802 | 2024-11-24T02:47:39Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-21T05:46:30Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
- Summary Length PPO experiment #5
- No KL-divergence penalty term in the loss
## Model Details
- Dataset size: 1024
- Epochs: 1
- Batch Size: 4 * 8 = 32 effective (w/ gradient accumulation)
Optimizer args: Torch AdamW defaults, except:
- LR = 0.00001
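A minimal, hypothetical sketch of that optimizer setup (the `nn.Linear` module below is only a stand-in for the actual Llama3.2-1B policy model):

```python
import torch
from torch import nn

# Stand-in module; the experiment's real policy model is Llama3.2-1B.
model = nn.Linear(8, 8)

# AdamW with the LR stated above; all other arguments are left at the
# PyTorch defaults (betas=(0.9, 0.999), eps=1e-8, weight_decay=0.01).
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
```
|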
hawalurahman/idt5-base-qaqg_v1-0 | hawalurahman | 2024-11-24T02:38:11Z | 114 | 0 | transformers | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:muchad/idt5-base",
"base_model:finetune:muchad/idt5-base",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-24T02:37:19Z | ---
library_name: transformers
license: apache-2.0
base_model: muchad/idt5-base
tags:
- generated_from_trainer
metrics:
- rouge
- bleu
model-index:
- name: idt5-base-qaqg_v1-0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# idt5-base-qaqg_v1-0
This model is a fine-tuned version of [muchad/idt5-base](https://huggingface.co/muchad/idt5-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2262
- Rouge1: 0.4340
- Rouge2: 0.2406
- Rougel: 0.4042
- Rougelsum: 0.4051
- Bleu: 0.1913
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
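As a rough illustration (not the actual training script), these settings map onto `transformers` roughly as follows; the `output_dir` value is a placeholder:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above;
# output_dir is a placeholder, not taken from the original run.
args = Seq2SeqTrainingArguments(
    output_dir="idt5-base-qaqg_v1-0",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",  # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```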
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bleu |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:------:|
| 1.413 | 1.0 | 4695 | 1.3393 | 0.4004 | 0.2104 | 0.3743 | 0.3749 | 0.1641 |
| 1.2176 | 2.0 | 9390 | 1.2736 | 0.4213 | 0.2290 | 0.3925 | 0.3934 | 0.1809 |
| 1.1113 | 3.0 | 14085 | 1.2329 | 0.4272 | 0.2346 | 0.3981 | 0.3990 | 0.1851 |
| 1.028 | 4.0 | 18780 | 1.2241 | 0.4337 | 0.2384 | 0.4036 | 0.4044 | 0.1901 |
| 0.9813 | 5.0 | 23475 | 1.2262 | 0.4340 | 0.2406 | 0.4042 | 0.4051 | 0.1913 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0a0+f70bd71a48.nv24.06
- Datasets 3.1.0
- Tokenizers 0.20.3
|
MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold3 | MayBashendy | 2024-11-24T02:33:48Z | 182 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T06:35:26Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold3
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6991
- Qwk: 0.6057
- Mse: 0.6991
- Rmse: 0.8361
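As a hedged sketch, metrics of this kind can be computed with scikit-learn as below; the score arrays are toy values, not data from this run:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy gold scores and rounded model predictions (illustrative only).
y_true = np.array([2, 3, 4, 3, 2, 4])
y_pred = np.array([2, 3, 3, 3, 2, 4])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")
```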
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:-------:|:------:|
| No log | 0.0048 | 2 | 12.1503 | 0.0023 | 12.1503 | 3.4857 |
| No log | 0.0095 | 4 | 11.0481 | 0.0099 | 11.0481 | 3.3239 |
| No log | 0.0143 | 6 | 9.0791 | 0.0018 | 9.0791 | 3.0132 |
| No log | 0.0190 | 8 | 7.4813 | 0.0 | 7.4813 | 2.7352 |
| No log | 0.0238 | 10 | 6.5906 | 0.0013 | 6.5906 | 2.5672 |
| No log | 0.0285 | 12 | 5.5964 | 0.0240 | 5.5964 | 2.3657 |
| No log | 0.0333 | 14 | 4.3752 | 0.0076 | 4.3752 | 2.0917 |
| No log | 0.0380 | 16 | 3.8585 | 0.0076 | 3.8585 | 1.9643 |
| No log | 0.0428 | 18 | 3.3000 | 0.0076 | 3.3000 | 1.8166 |
| No log | 0.0475 | 20 | 2.5740 | 0.0831 | 2.5740 | 1.6044 |
| No log | 0.0523 | 22 | 2.1271 | 0.0431 | 2.1271 | 1.4585 |
| No log | 0.0570 | 24 | 1.6494 | 0.0202 | 1.6494 | 1.2843 |
| No log | 0.0618 | 26 | 1.2823 | 0.0202 | 1.2823 | 1.1324 |
| No log | 0.0665 | 28 | 1.0167 | 0.0202 | 1.0167 | 1.0083 |
| No log | 0.0713 | 30 | 0.9196 | 0.2220 | 0.9196 | 0.9590 |
| No log | 0.0760 | 32 | 0.8467 | 0.0623 | 0.8467 | 0.9202 |
| No log | 0.0808 | 34 | 0.8492 | 0.0623 | 0.8492 | 0.9215 |
| No log | 0.0855 | 36 | 0.9037 | 0.0314 | 0.9037 | 0.9507 |
| No log | 0.0903 | 38 | 1.1507 | 0.0314 | 1.1507 | 1.0727 |
| No log | 0.0950 | 40 | 1.2302 | 0.0348 | 1.2302 | 1.1092 |
| No log | 0.0998 | 42 | 0.9026 | 0.0314 | 0.9026 | 0.9500 |
| No log | 0.1045 | 44 | 0.9244 | 0.2225 | 0.9244 | 0.9615 |
| No log | 0.1093 | 46 | 0.8182 | 0.0867 | 0.8182 | 0.9046 |
| No log | 0.1140 | 48 | 0.8476 | 0.0736 | 0.8476 | 0.9207 |
| No log | 0.1188 | 50 | 0.7608 | 0.2018 | 0.7608 | 0.8722 |
| No log | 0.1235 | 52 | 0.7359 | 0.2720 | 0.7359 | 0.8579 |
| No log | 0.1283 | 54 | 0.7361 | 0.3124 | 0.7361 | 0.8579 |
| No log | 0.1330 | 56 | 0.7472 | 0.1891 | 0.7472 | 0.8644 |
| No log | 0.1378 | 58 | 0.8829 | 0.1940 | 0.8829 | 0.9397 |
| No log | 0.1425 | 60 | 0.7626 | 0.1355 | 0.7626 | 0.8733 |
| No log | 0.1473 | 62 | 0.6908 | 0.2354 | 0.6908 | 0.8311 |
| No log | 0.1520 | 64 | 0.6315 | 0.3194 | 0.6315 | 0.7947 |
| No log | 0.1568 | 66 | 0.6008 | 0.3492 | 0.6008 | 0.7751 |
| No log | 0.1615 | 68 | 0.5996 | 0.3714 | 0.5996 | 0.7743 |
| No log | 0.1663 | 70 | 0.7326 | 0.4130 | 0.7326 | 0.8559 |
| No log | 0.1710 | 72 | 0.6285 | 0.3506 | 0.6285 | 0.7928 |
| No log | 0.1758 | 74 | 0.6445 | 0.3914 | 0.6445 | 0.8028 |
| No log | 0.1805 | 76 | 0.6134 | 0.4473 | 0.6134 | 0.7832 |
| No log | 0.1853 | 78 | 0.6487 | 0.4158 | 0.6487 | 0.8054 |
| No log | 0.1900 | 80 | 0.6270 | 0.4902 | 0.6270 | 0.7918 |
| No log | 0.1948 | 82 | 0.6386 | 0.4710 | 0.6386 | 0.7991 |
| No log | 0.1995 | 84 | 0.6675 | 0.4684 | 0.6675 | 0.8170 |
| No log | 0.2043 | 86 | 0.9009 | 0.3333 | 0.9009 | 0.9492 |
| No log | 0.2090 | 88 | 0.7979 | 0.4251 | 0.7979 | 0.8932 |
| No log | 0.2138 | 90 | 0.6890 | 0.4514 | 0.6890 | 0.8300 |
| No log | 0.2185 | 92 | 0.7512 | 0.4365 | 0.7512 | 0.8667 |
| No log | 0.2233 | 94 | 0.6479 | 0.4245 | 0.6479 | 0.8049 |
| No log | 0.2280 | 96 | 0.6591 | 0.3643 | 0.6591 | 0.8119 |
| No log | 0.2328 | 98 | 0.6557 | 0.3798 | 0.6557 | 0.8098 |
| No log | 0.2375 | 100 | 0.9491 | 0.3912 | 0.9491 | 0.9742 |
| No log | 0.2423 | 102 | 0.9285 | 0.3934 | 0.9285 | 0.9636 |
| No log | 0.2470 | 104 | 0.6809 | 0.4622 | 0.6809 | 0.8252 |
| No log | 0.2518 | 106 | 0.6386 | 0.4795 | 0.6386 | 0.7991 |
| No log | 0.2565 | 108 | 0.6997 | 0.4579 | 0.6997 | 0.8365 |
| No log | 0.2613 | 110 | 0.6543 | 0.5045 | 0.6543 | 0.8089 |
| No log | 0.2660 | 112 | 0.9048 | 0.3750 | 0.9048 | 0.9512 |
| No log | 0.2708 | 114 | 1.0620 | 0.3069 | 1.0620 | 1.0305 |
| No log | 0.2755 | 116 | 1.0310 | 0.2922 | 1.0310 | 1.0154 |
| No log | 0.2803 | 118 | 1.0382 | 0.2909 | 1.0382 | 1.0189 |
| No log | 0.2850 | 120 | 0.9695 | 0.3381 | 0.9695 | 0.9846 |
| No log | 0.2898 | 122 | 0.8457 | 0.3844 | 0.8457 | 0.9196 |
| No log | 0.2945 | 124 | 0.7668 | 0.4393 | 0.7668 | 0.8756 |
| No log | 0.2993 | 126 | 0.8015 | 0.4499 | 0.8015 | 0.8953 |
| No log | 0.3040 | 128 | 0.5596 | 0.5395 | 0.5596 | 0.7481 |
| No log | 0.3088 | 130 | 0.6800 | 0.4951 | 0.6800 | 0.8246 |
| No log | 0.3135 | 132 | 0.6596 | 0.4825 | 0.6596 | 0.8122 |
| No log | 0.3183 | 134 | 0.5470 | 0.5577 | 0.5470 | 0.7396 |
| No log | 0.3230 | 136 | 0.5932 | 0.5661 | 0.5932 | 0.7702 |
| No log | 0.3278 | 138 | 0.5511 | 0.5790 | 0.5511 | 0.7424 |
| No log | 0.3325 | 140 | 0.5491 | 0.5641 | 0.5491 | 0.7410 |
| No log | 0.3373 | 142 | 0.5444 | 0.5825 | 0.5444 | 0.7378 |
| No log | 0.3420 | 144 | 0.6245 | 0.5471 | 0.6245 | 0.7902 |
| No log | 0.3468 | 146 | 0.7777 | 0.5151 | 0.7777 | 0.8819 |
| No log | 0.3515 | 148 | 0.9213 | 0.4735 | 0.9213 | 0.9598 |
| No log | 0.3563 | 150 | 0.8635 | 0.4928 | 0.8635 | 0.9292 |
| No log | 0.3610 | 152 | 0.6761 | 0.5447 | 0.6761 | 0.8223 |
| No log | 0.3658 | 154 | 0.5357 | 0.5768 | 0.5357 | 0.7319 |
| No log | 0.3705 | 156 | 0.5365 | 0.5860 | 0.5365 | 0.7325 |
| No log | 0.3753 | 158 | 0.6922 | 0.5015 | 0.6922 | 0.8320 |
| No log | 0.3800 | 160 | 0.7024 | 0.5000 | 0.7024 | 0.8381 |
| No log | 0.3848 | 162 | 0.5197 | 0.5876 | 0.5197 | 0.7209 |
| No log | 0.3895 | 164 | 0.5869 | 0.4937 | 0.5869 | 0.7661 |
| No log | 0.3943 | 166 | 0.5612 | 0.5415 | 0.5612 | 0.7491 |
| No log | 0.3990 | 168 | 0.5440 | 0.5986 | 0.5440 | 0.7375 |
| No log | 0.4038 | 170 | 0.5620 | 0.6061 | 0.5620 | 0.7497 |
| No log | 0.4086 | 172 | 0.5820 | 0.5993 | 0.5820 | 0.7629 |
| No log | 0.4133 | 174 | 0.5361 | 0.6105 | 0.5361 | 0.7322 |
| No log | 0.4181 | 176 | 0.5471 | 0.5984 | 0.5471 | 0.7397 |
| No log | 0.4228 | 178 | 0.5228 | 0.5845 | 0.5228 | 0.7231 |
| No log | 0.4276 | 180 | 0.5233 | 0.5807 | 0.5233 | 0.7234 |
| No log | 0.4323 | 182 | 0.6492 | 0.5474 | 0.6492 | 0.8057 |
| No log | 0.4371 | 184 | 0.8715 | 0.4276 | 0.8715 | 0.9335 |
| No log | 0.4418 | 186 | 0.9170 | 0.4232 | 0.9170 | 0.9576 |
| No log | 0.4466 | 188 | 0.6264 | 0.5124 | 0.6264 | 0.7914 |
| No log | 0.4513 | 190 | 0.5403 | 0.5395 | 0.5403 | 0.7350 |
| No log | 0.4561 | 192 | 0.5247 | 0.5233 | 0.5247 | 0.7243 |
| No log | 0.4608 | 194 | 0.5335 | 0.5737 | 0.5335 | 0.7304 |
| No log | 0.4656 | 196 | 0.5968 | 0.5489 | 0.5968 | 0.7725 |
| No log | 0.4703 | 198 | 0.5947 | 0.5457 | 0.5947 | 0.7711 |
| No log | 0.4751 | 200 | 0.5196 | 0.5755 | 0.5196 | 0.7209 |
| No log | 0.4798 | 202 | 0.5101 | 0.5772 | 0.5101 | 0.7142 |
| No log | 0.4846 | 204 | 0.5115 | 0.5683 | 0.5115 | 0.7152 |
| No log | 0.4893 | 206 | 0.5438 | 0.5478 | 0.5438 | 0.7374 |
| No log | 0.4941 | 208 | 0.5239 | 0.5671 | 0.5239 | 0.7238 |
| No log | 0.4988 | 210 | 0.5161 | 0.5683 | 0.5161 | 0.7184 |
| No log | 0.5036 | 212 | 0.5537 | 0.5426 | 0.5537 | 0.7441 |
| No log | 0.5083 | 214 | 0.5822 | 0.5482 | 0.5822 | 0.7630 |
| No log | 0.5131 | 216 | 0.5351 | 0.5817 | 0.5351 | 0.7315 |
| No log | 0.5178 | 218 | 0.5633 | 0.5737 | 0.5633 | 0.7505 |
| No log | 0.5226 | 220 | 0.6283 | 0.5460 | 0.6283 | 0.7926 |
| No log | 0.5273 | 222 | 0.5803 | 0.5803 | 0.5803 | 0.7618 |
| No log | 0.5321 | 224 | 0.6079 | 0.5752 | 0.6079 | 0.7797 |
| No log | 0.5368 | 226 | 0.7510 | 0.5237 | 0.7510 | 0.8666 |
| No log | 0.5416 | 228 | 0.7706 | 0.5113 | 0.7706 | 0.8779 |
| No log | 0.5463 | 230 | 0.6130 | 0.5573 | 0.6130 | 0.7830 |
| No log | 0.5511 | 232 | 0.5448 | 0.5789 | 0.5448 | 0.7381 |
| No log | 0.5558 | 234 | 0.5387 | 0.5616 | 0.5387 | 0.7340 |
| No log | 0.5606 | 236 | 0.5109 | 0.5885 | 0.5109 | 0.7148 |
| No log | 0.5653 | 238 | 0.5162 | 0.5469 | 0.5162 | 0.7185 |
| No log | 0.5701 | 240 | 0.6837 | 0.5369 | 0.6837 | 0.8268 |
| No log | 0.5748 | 242 | 0.8818 | 0.4574 | 0.8818 | 0.9391 |
| No log | 0.5796 | 244 | 0.7028 | 0.5198 | 0.7028 | 0.8383 |
| No log | 0.5843 | 246 | 0.5567 | 0.5143 | 0.5567 | 0.7461 |
| No log | 0.5891 | 248 | 0.5596 | 0.5266 | 0.5596 | 0.7480 |
| No log | 0.5938 | 250 | 0.6957 | 0.5526 | 0.6957 | 0.8341 |
| No log | 0.5986 | 252 | 0.7775 | 0.5294 | 0.7775 | 0.8818 |
| No log | 0.6033 | 254 | 0.6095 | 0.5708 | 0.6095 | 0.7807 |
| No log | 0.6081 | 256 | 0.5416 | 0.5629 | 0.5416 | 0.7359 |
| No log | 0.6128 | 258 | 0.5396 | 0.5816 | 0.5396 | 0.7346 |
| No log | 0.6176 | 260 | 0.5549 | 0.5719 | 0.5549 | 0.7449 |
| No log | 0.6223 | 262 | 0.6289 | 0.5769 | 0.6289 | 0.7930 |
| No log | 0.6271 | 264 | 0.5858 | 0.5397 | 0.5858 | 0.7654 |
| No log | 0.6318 | 266 | 0.6379 | 0.5294 | 0.6379 | 0.7987 |
| No log | 0.6366 | 268 | 0.6983 | 0.5190 | 0.6983 | 0.8356 |
| No log | 0.6413 | 270 | 0.9660 | 0.4489 | 0.9660 | 0.9828 |
| No log | 0.6461 | 272 | 0.9920 | 0.4533 | 0.9920 | 0.9960 |
| No log | 0.6508 | 274 | 0.6736 | 0.5317 | 0.6736 | 0.8207 |
| No log | 0.6556 | 276 | 0.5818 | 0.5439 | 0.5818 | 0.7628 |
| No log | 0.6603 | 278 | 0.5843 | 0.5225 | 0.5843 | 0.7644 |
| No log | 0.6651 | 280 | 0.5605 | 0.5477 | 0.5605 | 0.7487 |
| No log | 0.6698 | 282 | 0.6963 | 0.5182 | 0.6963 | 0.8344 |
| No log | 0.6746 | 284 | 0.6066 | 0.5714 | 0.6066 | 0.7788 |
| No log | 0.6793 | 286 | 0.5114 | 0.5998 | 0.5114 | 0.7151 |
| No log | 0.6841 | 288 | 0.5499 | 0.5656 | 0.5499 | 0.7416 |
| No log | 0.6888 | 290 | 0.5407 | 0.5866 | 0.5407 | 0.7353 |
| No log | 0.6936 | 292 | 0.5490 | 0.5962 | 0.5490 | 0.7410 |
| No log | 0.6983 | 294 | 0.5891 | 0.6018 | 0.5891 | 0.7676 |
| No log | 0.7031 | 296 | 0.7656 | 0.5466 | 0.7656 | 0.8750 |
| No log | 0.7078 | 298 | 0.7115 | 0.5602 | 0.7115 | 0.8435 |
| No log | 0.7126 | 300 | 0.6260 | 0.5890 | 0.6260 | 0.7912 |
| No log | 0.7173 | 302 | 0.5525 | 0.5603 | 0.5525 | 0.7433 |
| No log | 0.7221 | 304 | 0.5820 | 0.5754 | 0.5820 | 0.7629 |
| No log | 0.7268 | 306 | 0.6271 | 0.5781 | 0.6271 | 0.7919 |
| No log | 0.7316 | 308 | 0.5334 | 0.5493 | 0.5334 | 0.7303 |
| No log | 0.7363 | 310 | 0.5419 | 0.5268 | 0.5419 | 0.7361 |
| No log | 0.7411 | 312 | 0.5346 | 0.5344 | 0.5346 | 0.7312 |
| No log | 0.7458 | 314 | 0.5945 | 0.6096 | 0.5945 | 0.7711 |
| No log | 0.7506 | 316 | 0.7176 | 0.5904 | 0.7176 | 0.8471 |
| No log | 0.7553 | 318 | 0.5513 | 0.6266 | 0.5513 | 0.7425 |
| No log | 0.7601 | 320 | 0.5128 | 0.5950 | 0.5128 | 0.7161 |
| No log | 0.7648 | 322 | 0.5655 | 0.5846 | 0.5655 | 0.7520 |
| No log | 0.7696 | 324 | 0.5240 | 0.5899 | 0.5240 | 0.7239 |
| No log | 0.7743 | 326 | 0.7023 | 0.5698 | 0.7023 | 0.8380 |
| No log | 0.7791 | 328 | 0.6656 | 0.5898 | 0.6656 | 0.8159 |
| No log | 0.7838 | 330 | 0.5664 | 0.5871 | 0.5664 | 0.7526 |
| No log | 0.7886 | 332 | 0.6157 | 0.5860 | 0.6157 | 0.7847 |
| No log | 0.7933 | 334 | 0.7392 | 0.5397 | 0.7392 | 0.8598 |
| No log | 0.7981 | 336 | 0.6137 | 0.5998 | 0.6137 | 0.7834 |
| No log | 0.8029 | 338 | 0.5748 | 0.6222 | 0.5748 | 0.7581 |
| No log | 0.8076 | 340 | 0.5300 | 0.6077 | 0.5300 | 0.7280 |
| No log | 0.8124 | 342 | 0.5424 | 0.6186 | 0.5424 | 0.7365 |
| No log | 0.8171 | 344 | 0.5674 | 0.6451 | 0.5674 | 0.7533 |
| No log | 0.8219 | 346 | 0.5705 | 0.6170 | 0.5705 | 0.7553 |
| No log | 0.8266 | 348 | 0.6270 | 0.5661 | 0.6270 | 0.7919 |
| No log | 0.8314 | 350 | 0.6201 | 0.5384 | 0.6201 | 0.7874 |
| No log | 0.8361 | 352 | 0.5214 | 0.6189 | 0.5214 | 0.7221 |
| No log | 0.8409 | 354 | 0.7790 | 0.5362 | 0.7790 | 0.8826 |
| No log | 0.8456 | 356 | 0.8630 | 0.5051 | 0.8630 | 0.9290 |
| No log | 0.8504 | 358 | 0.6560 | 0.5746 | 0.6560 | 0.8099 |
| No log | 0.8551 | 360 | 0.6343 | 0.5597 | 0.6343 | 0.7964 |
| No log | 0.8599 | 362 | 0.6900 | 0.5532 | 0.6900 | 0.8307 |
| No log | 0.8646 | 364 | 0.7280 | 0.5239 | 0.7280 | 0.8532 |
| No log | 0.8694 | 366 | 0.8799 | 0.4753 | 0.8799 | 0.9380 |
| No log | 0.8741 | 368 | 0.8059 | 0.4997 | 0.8059 | 0.8977 |
| No log | 0.8789 | 370 | 0.5579 | 0.5611 | 0.5579 | 0.7469 |
| No log | 0.8836 | 372 | 0.5213 | 0.5686 | 0.5213 | 0.7220 |
| No log | 0.8884 | 374 | 0.5175 | 0.5705 | 0.5175 | 0.7193 |
| No log | 0.8931 | 376 | 0.5263 | 0.6002 | 0.5263 | 0.7255 |
| No log | 0.8979 | 378 | 0.5862 | 0.5737 | 0.5862 | 0.7656 |
| No log | 0.9026 | 380 | 0.5873 | 0.5773 | 0.5873 | 0.7663 |
| No log | 0.9074 | 382 | 0.5192 | 0.5742 | 0.5192 | 0.7206 |
| No log | 0.9121 | 384 | 0.5353 | 0.5723 | 0.5353 | 0.7317 |
| No log | 0.9169 | 386 | 0.5213 | 0.5990 | 0.5213 | 0.7220 |
| No log | 0.9216 | 388 | 0.5896 | 0.5834 | 0.5896 | 0.7679 |
| No log | 0.9264 | 390 | 0.7697 | 0.5373 | 0.7697 | 0.8773 |
| No log | 0.9311 | 392 | 0.7164 | 0.5568 | 0.7164 | 0.8464 |
| No log | 0.9359 | 394 | 0.5637 | 0.5946 | 0.5637 | 0.7508 |
| No log | 0.9406 | 396 | 0.5362 | 0.6010 | 0.5362 | 0.7323 |
| No log | 0.9454 | 398 | 0.5434 | 0.5810 | 0.5434 | 0.7371 |
| No log | 0.9501 | 400 | 0.6429 | 0.5530 | 0.6429 | 0.8018 |
| No log | 0.9549 | 402 | 0.9170 | 0.4726 | 0.9170 | 0.9576 |
| No log | 0.9596 | 404 | 0.8541 | 0.4927 | 0.8541 | 0.9242 |
| No log | 0.9644 | 406 | 0.6047 | 0.5198 | 0.6047 | 0.7776 |
| No log | 0.9691 | 408 | 0.5433 | 0.5672 | 0.5433 | 0.7371 |
| No log | 0.9739 | 410 | 0.5462 | 0.5565 | 0.5462 | 0.7391 |
| No log | 0.9786 | 412 | 0.5936 | 0.5184 | 0.5936 | 0.7705 |
| No log | 0.9834 | 414 | 0.6514 | 0.5008 | 0.6514 | 0.8071 |
| No log | 0.9881 | 416 | 0.6009 | 0.5221 | 0.6009 | 0.7752 |
| No log | 0.9929 | 418 | 0.5528 | 0.5732 | 0.5528 | 0.7435 |
| No log | 0.9976 | 420 | 0.5812 | 0.5526 | 0.5812 | 0.7624 |
| No log | 1.0024 | 422 | 0.6889 | 0.5188 | 0.6889 | 0.8300 |
| No log | 1.0071 | 424 | 0.6380 | 0.5403 | 0.6380 | 0.7988 |
| No log | 1.0119 | 426 | 0.5582 | 0.5683 | 0.5582 | 0.7471 |
| No log | 1.0166 | 428 | 0.5758 | 0.5720 | 0.5758 | 0.7588 |
| No log | 1.0214 | 430 | 0.5991 | 0.5944 | 0.5991 | 0.7740 |
| No log | 1.0261 | 432 | 0.5858 | 0.6053 | 0.5858 | 0.7654 |
| No log | 1.0309 | 434 | 0.5552 | 0.6134 | 0.5552 | 0.7451 |
| No log | 1.0356 | 436 | 0.5671 | 0.6030 | 0.5671 | 0.7530 |
| No log | 1.0404 | 438 | 0.6008 | 0.5422 | 0.6008 | 0.7751 |
| No log | 1.0451 | 440 | 0.5980 | 0.5216 | 0.5980 | 0.7733 |
| No log | 1.0499 | 442 | 0.5977 | 0.5210 | 0.5977 | 0.7731 |
| No log | 1.0546 | 444 | 0.6791 | 0.5069 | 0.6791 | 0.8241 |
| No log | 1.0594 | 446 | 0.6943 | 0.5024 | 0.6943 | 0.8333 |
| No log | 1.0641 | 448 | 0.6093 | 0.5253 | 0.6093 | 0.7805 |
| No log | 1.0689 | 450 | 0.5582 | 0.5363 | 0.5582 | 0.7471 |
| No log | 1.0736 | 452 | 0.5694 | 0.5289 | 0.5694 | 0.7546 |
| No log | 1.0784 | 454 | 0.6338 | 0.5316 | 0.6338 | 0.7961 |
| No log | 1.0831 | 456 | 0.6122 | 0.5351 | 0.6122 | 0.7824 |
| No log | 1.0879 | 458 | 0.5802 | 0.5276 | 0.5802 | 0.7617 |
| No log | 1.0926 | 460 | 0.5859 | 0.5633 | 0.5859 | 0.7654 |
| No log | 1.0974 | 462 | 0.6498 | 0.5463 | 0.6498 | 0.8061 |
| No log | 1.1021 | 464 | 0.6043 | 0.5542 | 0.6043 | 0.7774 |
| No log | 1.1069 | 466 | 0.5516 | 0.5799 | 0.5516 | 0.7427 |
| No log | 1.1116 | 468 | 0.5758 | 0.5377 | 0.5758 | 0.7588 |
| No log | 1.1164 | 470 | 0.5437 | 0.5899 | 0.5437 | 0.7374 |
| No log | 1.1211 | 472 | 0.6171 | 0.5980 | 0.6171 | 0.7856 |
| No log | 1.1259 | 474 | 0.5814 | 0.6191 | 0.5814 | 0.7625 |
| No log | 1.1306 | 476 | 0.5520 | 0.5932 | 0.5520 | 0.7430 |
| No log | 1.1354 | 478 | 0.5666 | 0.5967 | 0.5666 | 0.7527 |
| No log | 1.1401 | 480 | 0.6001 | 0.6036 | 0.6001 | 0.7747 |
| No log | 1.1449 | 482 | 0.6719 | 0.5954 | 0.6719 | 0.8197 |
| No log | 1.1496 | 484 | 0.7078 | 0.5787 | 0.7078 | 0.8413 |
| No log | 1.1544 | 486 | 0.6175 | 0.6046 | 0.6175 | 0.7858 |
| No log | 1.1591 | 488 | 0.6018 | 0.6101 | 0.6018 | 0.7757 |
| No log | 1.1639 | 490 | 0.5166 | 0.6321 | 0.5166 | 0.7187 |
| No log | 1.1686 | 492 | 0.5209 | 0.6202 | 0.5209 | 0.7217 |
| No log | 1.1734 | 494 | 0.5586 | 0.6480 | 0.5586 | 0.7474 |
| No log | 1.1781 | 496 | 0.7894 | 0.5495 | 0.7894 | 0.8885 |
| No log | 1.1829 | 498 | 0.7686 | 0.5493 | 0.7686 | 0.8767 |
| 0.9787 | 1.1876 | 500 | 0.5730 | 0.6390 | 0.5730 | 0.7570 |
| 0.9787 | 1.1924 | 502 | 0.5823 | 0.5707 | 0.5823 | 0.7631 |
| 0.9787 | 1.1971 | 504 | 0.5936 | 0.5671 | 0.5936 | 0.7704 |
| 0.9787 | 1.2019 | 506 | 0.5364 | 0.6206 | 0.5364 | 0.7324 |
| 0.9787 | 1.2067 | 508 | 0.6693 | 0.6163 | 0.6693 | 0.8181 |
| 0.9787 | 1.2114 | 510 | 0.6991 | 0.6057 | 0.6991 | 0.8361 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Vizzy/CA_plates_fine-tuned_gpt2_minimal_test | Vizzy | 2024-11-24T02:30:01Z | 126 | 0 | transformers | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T02:29:35Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
steph0713/deepffnllama-768_6_4-2 | steph0713 | 2024-11-24T02:05:34Z | 54 | 0 | transformers | [
"transformers",
"safetensors",
"deepffn-llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T17:01:05Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
steph0713/deepffnllama-768_4_4-3 | steph0713 | 2024-11-24T02:04:59Z | 49 | 0 | transformers | [
"transformers",
"safetensors",
"deepffn-llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T17:06:20Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold2 | MayBashendy | 2024-11-24T02:02:33Z | 183 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T05:55:20Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4901
- Qwk: 0.5743
- Mse: 0.4901
- Rmse: 0.7001
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
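A minimal sketch of the linear schedule above using `transformers.get_scheduler`; the parameter tensor and step counts are placeholders, not values from this run:

```python
import torch
from transformers import get_scheduler

# Placeholder parameters; the real model is bert-base-uncased.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.Adam(params, lr=2e-5, betas=(0.9, 0.999), eps=1e-8)

# Linear decay from the initial LR down to 0 over training.
scheduler = get_scheduler(
    "linear",
    optimizer=optimizer,
    num_warmup_steps=0,       # assumed; the card does not state warmup
    num_training_steps=4230,  # placeholder total step count
)
```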
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:------:|
| No log | 0.0047 | 2 | 10.4914 | -0.0017 | 10.4914 | 3.2390 |
| No log | 0.0095 | 4 | 8.8484 | 0.0 | 8.8484 | 2.9746 |
| No log | 0.0142 | 6 | 7.1383 | 0.0 | 7.1383 | 2.6718 |
| No log | 0.0189 | 8 | 5.7341 | 0.0715 | 5.7341 | 2.3946 |
| No log | 0.0236 | 10 | 4.6769 | 0.0039 | 4.6769 | 2.1626 |
| No log | 0.0284 | 12 | 3.7222 | 0.0039 | 3.7222 | 1.9293 |
| No log | 0.0331 | 14 | 2.9351 | 0.0 | 2.9351 | 1.7132 |
| No log | 0.0378 | 16 | 2.3612 | 0.0644 | 2.3612 | 1.5366 |
| No log | 0.0426 | 18 | 1.7671 | 0.0437 | 1.7671 | 1.3293 |
| No log | 0.0473 | 20 | 1.3727 | 0.0213 | 1.3727 | 1.1716 |
| No log | 0.0520 | 22 | 1.0604 | 0.0107 | 1.0604 | 1.0298 |
| No log | 0.0567 | 24 | 0.8824 | 0.1218 | 0.8824 | 0.9394 |
| No log | 0.0615 | 26 | 0.7797 | 0.2194 | 0.7797 | 0.8830 |
| No log | 0.0662 | 28 | 0.7981 | 0.0679 | 0.7981 | 0.8933 |
| No log | 0.0709 | 30 | 0.8877 | 0.0164 | 0.8877 | 0.9422 |
| No log | 0.0757 | 32 | 0.9580 | 0.0164 | 0.9580 | 0.9788 |
| No log | 0.0804 | 34 | 0.9864 | 0.0164 | 0.9864 | 0.9932 |
| No log | 0.0851 | 36 | 1.1089 | 0.0164 | 1.1089 | 1.0531 |
| No log | 0.0898 | 38 | 0.8963 | 0.0267 | 0.8963 | 0.9467 |
| No log | 0.0946 | 40 | 0.8871 | 0.0474 | 0.8871 | 0.9418 |
| No log | 0.0993 | 42 | 0.7717 | 0.0824 | 0.7717 | 0.8784 |
| No log | 0.1040 | 44 | 0.8510 | 0.0517 | 0.8510 | 0.9225 |
| No log | 0.1087 | 46 | 0.7427 | 0.1329 | 0.7427 | 0.8618 |
| No log | 0.1135 | 48 | 0.8213 | 0.0722 | 0.8213 | 0.9063 |
| No log | 0.1182 | 50 | 0.8552 | 0.1229 | 0.8552 | 0.9247 |
| No log | 0.1229 | 52 | 1.0579 | 0.2808 | 1.0579 | 1.0285 |
| No log | 0.1277 | 54 | 1.1845 | 0.2799 | 1.1845 | 1.0884 |
| No log | 0.1324 | 56 | 0.7740 | 0.1423 | 0.7740 | 0.8798 |
| No log | 0.1371 | 58 | 0.7331 | 0.1675 | 0.7331 | 0.8562 |
| No log | 0.1418 | 60 | 1.0262 | 0.2856 | 1.0262 | 1.0130 |
| No log | 0.1466 | 62 | 0.9231 | 0.3040 | 0.9231 | 0.9608 |
| No log | 0.1513 | 64 | 0.5857 | 0.3334 | 0.5857 | 0.7653 |
| No log | 0.1560 | 66 | 0.6564 | 0.3726 | 0.6564 | 0.8102 |
| No log | 0.1608 | 68 | 0.9701 | 0.2562 | 0.9701 | 0.9849 |
| No log | 0.1655 | 70 | 0.6997 | 0.4050 | 0.6997 | 0.8365 |
| No log | 0.1702 | 72 | 0.5197 | 0.4292 | 0.5197 | 0.7209 |
| No log | 0.1749 | 74 | 0.5411 | 0.4872 | 0.5411 | 0.7356 |
| No log | 0.1797 | 76 | 0.5565 | 0.4882 | 0.5565 | 0.7460 |
| No log | 0.1844 | 78 | 0.4931 | 0.5288 | 0.4931 | 0.7022 |
| No log | 0.1891 | 80 | 0.6276 | 0.4991 | 0.6276 | 0.7922 |
| No log | 0.1939 | 82 | 0.6705 | 0.4903 | 0.6705 | 0.8188 |
| No log | 0.1986 | 84 | 0.5454 | 0.5417 | 0.5454 | 0.7385 |
| No log | 0.2033 | 86 | 0.6646 | 0.4555 | 0.6646 | 0.8153 |
| No log | 0.2080 | 88 | 0.5160 | 0.6037 | 0.5160 | 0.7183 |
| No log | 0.2128 | 90 | 0.5265 | 0.6021 | 0.5265 | 0.7256 |
| No log | 0.2175 | 92 | 0.5594 | 0.5848 | 0.5594 | 0.7479 |
| No log | 0.2222 | 94 | 0.5486 | 0.5833 | 0.5486 | 0.7406 |
| No log | 0.2270 | 96 | 0.5880 | 0.5182 | 0.5880 | 0.7668 |
| No log | 0.2317 | 98 | 0.5134 | 0.5997 | 0.5134 | 0.7165 |
| No log | 0.2364 | 100 | 0.7284 | 0.4678 | 0.7284 | 0.8535 |
| No log | 0.2411 | 102 | 0.6981 | 0.4778 | 0.6981 | 0.8355 |
| No log | 0.2459 | 104 | 0.5009 | 0.5111 | 0.5009 | 0.7077 |
| No log | 0.2506 | 106 | 0.5377 | 0.5138 | 0.5377 | 0.7333 |
| No log | 0.2553 | 108 | 0.5484 | 0.5100 | 0.5484 | 0.7406 |
| No log | 0.2600 | 110 | 1.1207 | 0.3245 | 1.1207 | 1.0586 |
| No log | 0.2648 | 112 | 1.3283 | 0.2207 | 1.3283 | 1.1525 |
| No log | 0.2695 | 114 | 1.0649 | 0.3406 | 1.0649 | 1.0320 |
| No log | 0.2742 | 116 | 0.5766 | 0.4711 | 0.5766 | 0.7594 |
| No log | 0.2790 | 118 | 0.5194 | 0.4628 | 0.5194 | 0.7207 |
| No log | 0.2837 | 120 | 0.5272 | 0.4062 | 0.5272 | 0.7261 |
| No log | 0.2884 | 122 | 0.7426 | 0.4019 | 0.7426 | 0.8618 |
| No log | 0.2931 | 124 | 0.9893 | 0.3548 | 0.9893 | 0.9946 |
| No log | 0.2979 | 126 | 1.0100 | 0.3460 | 1.0100 | 1.0050 |
| No log | 0.3026 | 128 | 0.7199 | 0.4555 | 0.7199 | 0.8485 |
| No log | 0.3073 | 130 | 0.5013 | 0.5149 | 0.5013 | 0.7080 |
| No log | 0.3121 | 132 | 0.8285 | 0.3875 | 0.8285 | 0.9102 |
| No log | 0.3168 | 134 | 0.7584 | 0.4038 | 0.7584 | 0.8709 |
| No log | 0.3215 | 136 | 0.4947 | 0.5592 | 0.4947 | 0.7034 |
| No log | 0.3262 | 138 | 0.8533 | 0.3822 | 0.8533 | 0.9238 |
| No log | 0.3310 | 140 | 1.1234 | 0.3049 | 1.1234 | 1.0599 |
| No log | 0.3357 | 142 | 0.9969 | 0.3521 | 0.9969 | 0.9985 |
| No log | 0.3404 | 144 | 0.7050 | 0.3583 | 0.7050 | 0.8397 |
| No log | 0.3452 | 146 | 0.5524 | 0.3895 | 0.5524 | 0.7432 |
| No log | 0.3499 | 148 | 0.6219 | 0.4195 | 0.6219 | 0.7886 |
| No log | 0.3546 | 150 | 0.6143 | 0.4168 | 0.6143 | 0.7838 |
| No log | 0.3593 | 152 | 0.5421 | 0.4167 | 0.5421 | 0.7363 |
| No log | 0.3641 | 154 | 0.6319 | 0.4413 | 0.6319 | 0.7949 |
| No log | 0.3688 | 156 | 0.6293 | 0.4431 | 0.6293 | 0.7933 |
| No log | 0.3735 | 158 | 0.6174 | 0.4388 | 0.6174 | 0.7857 |
| No log | 0.3783 | 160 | 0.6786 | 0.4547 | 0.6786 | 0.8238 |
| No log | 0.3830 | 162 | 0.7209 | 0.4449 | 0.7209 | 0.8490 |
| No log | 0.3877 | 164 | 0.5614 | 0.5033 | 0.5614 | 0.7492 |
| No log | 0.3924 | 166 | 0.5325 | 0.5150 | 0.5325 | 0.7298 |
| No log | 0.3972 | 168 | 0.5792 | 0.5305 | 0.5792 | 0.7610 |
| No log | 0.4019 | 170 | 0.6009 | 0.5453 | 0.6009 | 0.7752 |
| No log | 0.4066 | 172 | 0.6198 | 0.5489 | 0.6198 | 0.7873 |
| No log | 0.4113 | 174 | 0.5207 | 0.5649 | 0.5207 | 0.7216 |
| No log | 0.4161 | 176 | 0.5083 | 0.5589 | 0.5083 | 0.7130 |
| No log | 0.4208 | 178 | 0.4906 | 0.5581 | 0.4906 | 0.7004 |
| No log | 0.4255 | 180 | 0.6022 | 0.5401 | 0.6022 | 0.7760 |
| No log | 0.4303 | 182 | 0.8963 | 0.4271 | 0.8963 | 0.9467 |
| No log | 0.4350 | 184 | 0.8773 | 0.4187 | 0.8773 | 0.9367 |
| No log | 0.4397 | 186 | 0.6216 | 0.4956 | 0.6216 | 0.7884 |
| No log | 0.4444 | 188 | 0.5154 | 0.5460 | 0.5154 | 0.7179 |
| No log | 0.4492 | 190 | 0.5311 | 0.5590 | 0.5311 | 0.7287 |
| No log | 0.4539 | 192 | 0.7511 | 0.4396 | 0.7511 | 0.8667 |
| No log | 0.4586 | 194 | 0.9564 | 0.3769 | 0.9564 | 0.9780 |
| No log | 0.4634 | 196 | 0.9081 | 0.3795 | 0.9081 | 0.9529 |
| No log | 0.4681 | 198 | 0.6835 | 0.4517 | 0.6835 | 0.8267 |
| No log | 0.4728 | 200 | 0.6461 | 0.4961 | 0.6461 | 0.8038 |
| No log | 0.4775 | 202 | 0.6718 | 0.4805 | 0.6718 | 0.8196 |
| No log | 0.4823 | 204 | 0.5816 | 0.5177 | 0.5816 | 0.7626 |
| No log | 0.4870 | 206 | 0.5110 | 0.5543 | 0.5110 | 0.7149 |
| No log | 0.4917 | 208 | 0.5280 | 0.5396 | 0.5280 | 0.7266 |
| No log | 0.4965 | 210 | 0.5182 | 0.5329 | 0.5182 | 0.7198 |
| No log | 0.5012 | 212 | 0.5274 | 0.5301 | 0.5274 | 0.7262 |
| No log | 0.5059 | 214 | 0.5108 | 0.5377 | 0.5108 | 0.7147 |
| No log | 0.5106 | 216 | 0.5242 | 0.5585 | 0.5242 | 0.7240 |
| No log | 0.5154 | 218 | 0.4987 | 0.5659 | 0.4987 | 0.7062 |
| No log | 0.5201 | 220 | 0.5278 | 0.5702 | 0.5278 | 0.7265 |
| No log | 0.5248 | 222 | 0.5238 | 0.5671 | 0.5238 | 0.7238 |
| No log | 0.5296 | 224 | 0.4925 | 0.5591 | 0.4925 | 0.7018 |
| No log | 0.5343 | 226 | 0.4999 | 0.5578 | 0.4999 | 0.7070 |
| No log | 0.5390 | 228 | 0.5988 | 0.5392 | 0.5988 | 0.7738 |
| No log | 0.5437 | 230 | 0.8854 | 0.4227 | 0.8854 | 0.9410 |
| No log | 0.5485 | 232 | 0.9978 | 0.3505 | 0.9978 | 0.9989 |
| No log | 0.5532 | 234 | 0.7967 | 0.4039 | 0.7967 | 0.8926 |
| No log | 0.5579 | 236 | 0.6095 | 0.4224 | 0.6095 | 0.7807 |
| No log | 0.5626 | 238 | 0.5992 | 0.4266 | 0.5992 | 0.7741 |
| No log | 0.5674 | 240 | 0.6821 | 0.4632 | 0.6821 | 0.8259 |
| No log | 0.5721 | 242 | 0.6258 | 0.5029 | 0.6258 | 0.7911 |
| No log | 0.5768 | 244 | 0.5851 | 0.5358 | 0.5851 | 0.7649 |
| No log | 0.5816 | 246 | 0.5047 | 0.5652 | 0.5047 | 0.7104 |
| No log | 0.5863 | 248 | 0.5803 | 0.5341 | 0.5803 | 0.7618 |
| No log | 0.5910 | 250 | 0.5628 | 0.5532 | 0.5628 | 0.7502 |
| No log | 0.5957 | 252 | 0.5188 | 0.6007 | 0.5188 | 0.7203 |
| No log | 0.6005 | 254 | 0.5736 | 0.5857 | 0.5736 | 0.7574 |
| No log | 0.6052 | 256 | 0.5789 | 0.5953 | 0.5789 | 0.7608 |
| No log | 0.6099 | 258 | 0.5445 | 0.5888 | 0.5445 | 0.7379 |
| No log | 0.6147 | 260 | 0.5966 | 0.5756 | 0.5966 | 0.7724 |
| No log | 0.6194 | 262 | 0.6566 | 0.4953 | 0.6566 | 0.8103 |
| No log | 0.6241 | 264 | 0.6903 | 0.4782 | 0.6903 | 0.8309 |
| No log | 0.6288 | 266 | 0.7930 | 0.4200 | 0.7930 | 0.8905 |
| No log | 0.6336 | 268 | 0.7367 | 0.4442 | 0.7367 | 0.8583 |
| No log | 0.6383 | 270 | 0.6582 | 0.4679 | 0.6582 | 0.8113 |
| No log | 0.6430 | 272 | 0.6477 | 0.4679 | 0.6477 | 0.8048 |
| No log | 0.6478 | 274 | 0.5714 | 0.5204 | 0.5714 | 0.7559 |
| No log | 0.6525 | 276 | 0.5365 | 0.5606 | 0.5365 | 0.7324 |
| No log | 0.6572 | 278 | 0.5722 | 0.5548 | 0.5722 | 0.7565 |
| No log | 0.6619 | 280 | 0.5821 | 0.5791 | 0.5821 | 0.7630 |
| No log | 0.6667 | 282 | 0.5707 | 0.5829 | 0.5707 | 0.7555 |
| No log | 0.6714 | 284 | 0.6352 | 0.5497 | 0.6352 | 0.7970 |
| No log | 0.6761 | 286 | 0.5735 | 0.5696 | 0.5735 | 0.7573 |
| No log | 0.6809 | 288 | 0.5062 | 0.6143 | 0.5062 | 0.7115 |
| No log | 0.6856 | 290 | 0.5707 | 0.5336 | 0.5707 | 0.7555 |
| No log | 0.6903 | 292 | 0.7620 | 0.4478 | 0.7620 | 0.8729 |
| No log | 0.6950 | 294 | 0.8143 | 0.4406 | 0.8143 | 0.9024 |
| No log | 0.6998 | 296 | 0.7003 | 0.4790 | 0.7003 | 0.8369 |
| No log | 0.7045 | 298 | 0.5546 | 0.5359 | 0.5546 | 0.7447 |
| No log | 0.7092 | 300 | 0.5577 | 0.5289 | 0.5577 | 0.7468 |
| No log | 0.7139 | 302 | 0.5300 | 0.5502 | 0.5300 | 0.7280 |
| No log | 0.7187 | 304 | 0.6383 | 0.5147 | 0.6383 | 0.7989 |
| No log | 0.7234 | 306 | 0.6369 | 0.5288 | 0.6369 | 0.7981 |
| No log | 0.7281 | 308 | 0.5573 | 0.5538 | 0.5573 | 0.7465 |
| No log | 0.7329 | 310 | 0.5638 | 0.5535 | 0.5638 | 0.7509 |
| No log | 0.7376 | 312 | 0.5484 | 0.5617 | 0.5484 | 0.7405 |
| No log | 0.7423 | 314 | 0.6646 | 0.5290 | 0.6646 | 0.8152 |
| No log | 0.7470 | 316 | 0.9168 | 0.4468 | 0.9168 | 0.9575 |
| No log | 0.7518 | 318 | 0.7972 | 0.4601 | 0.7972 | 0.8929 |
| No log | 0.7565 | 320 | 0.5342 | 0.5483 | 0.5342 | 0.7309 |
| No log | 0.7612 | 322 | 0.5445 | 0.5254 | 0.5445 | 0.7379 |
| No log | 0.7660 | 324 | 0.5486 | 0.5300 | 0.5486 | 0.7407 |
| No log | 0.7707 | 326 | 0.7574 | 0.4644 | 0.7574 | 0.8703 |
| No log | 0.7754 | 328 | 0.7651 | 0.4653 | 0.7651 | 0.8747 |
| No log | 0.7801 | 330 | 0.5690 | 0.5264 | 0.5690 | 0.7543 |
| No log | 0.7849 | 332 | 0.5500 | 0.5082 | 0.5500 | 0.7416 |
| No log | 0.7896 | 334 | 0.5414 | 0.5056 | 0.5414 | 0.7358 |
| No log | 0.7943 | 336 | 0.6186 | 0.5221 | 0.6186 | 0.7865 |
| No log | 0.7991 | 338 | 0.6234 | 0.5235 | 0.6234 | 0.7895 |
| No log | 0.8038 | 340 | 0.5212 | 0.5796 | 0.5212 | 0.7220 |
| No log | 0.8085 | 342 | 0.4960 | 0.5706 | 0.4960 | 0.7043 |
| No log | 0.8132 | 344 | 0.5067 | 0.5908 | 0.5067 | 0.7119 |
| No log | 0.8180 | 346 | 0.5212 | 0.5903 | 0.5212 | 0.7219 |
| No log | 0.8227 | 348 | 0.5417 | 0.6069 | 0.5417 | 0.7360 |
| No log | 0.8274 | 350 | 0.5174 | 0.5927 | 0.5174 | 0.7193 |
| No log | 0.8322 | 352 | 0.5052 | 0.5926 | 0.5052 | 0.7107 |
| No log | 0.8369 | 354 | 0.6753 | 0.5515 | 0.6753 | 0.8218 |
| No log | 0.8416 | 356 | 0.8144 | 0.4576 | 0.8144 | 0.9025 |
| No log | 0.8463 | 358 | 0.6737 | 0.5056 | 0.6737 | 0.8208 |
| No log | 0.8511 | 360 | 0.5676 | 0.5225 | 0.5676 | 0.7534 |
| No log | 0.8558 | 362 | 0.6329 | 0.4719 | 0.6329 | 0.7955 |
| No log | 0.8605 | 364 | 0.6884 | 0.4554 | 0.6884 | 0.8297 |
| No log | 0.8652 | 366 | 0.8468 | 0.4488 | 0.8468 | 0.9202 |
| No log | 0.8700 | 368 | 0.8954 | 0.4432 | 0.8954 | 0.9463 |
| No log | 0.8747 | 370 | 0.7265 | 0.4851 | 0.7265 | 0.8523 |
| No log | 0.8794 | 372 | 0.4692 | 0.5877 | 0.4692 | 0.6850 |
| No log | 0.8842 | 374 | 0.4506 | 0.5989 | 0.4506 | 0.6713 |
| No log | 0.8889 | 376 | 0.4906 | 0.5751 | 0.4906 | 0.7004 |
| No log | 0.8936 | 378 | 0.7125 | 0.4935 | 0.7125 | 0.8441 |
| No log | 0.8983 | 380 | 0.6411 | 0.5425 | 0.6411 | 0.8007 |
| No log | 0.9031 | 382 | 0.4538 | 0.5999 | 0.4538 | 0.6736 |
| No log | 0.9078 | 384 | 0.4867 | 0.6015 | 0.4867 | 0.6977 |
| No log | 0.9125 | 386 | 0.4714 | 0.5920 | 0.4714 | 0.6866 |
| No log | 0.9173 | 388 | 0.4613 | 0.6105 | 0.4613 | 0.6792 |
| No log | 0.9220 | 390 | 0.5025 | 0.6048 | 0.5025 | 0.7088 |
| No log | 0.9267 | 392 | 0.5840 | 0.5554 | 0.5840 | 0.7642 |
| No log | 0.9314 | 394 | 0.5159 | 0.5913 | 0.5159 | 0.7183 |
| No log | 0.9362 | 396 | 0.4739 | 0.5926 | 0.4739 | 0.6884 |
| No log | 0.9409 | 398 | 0.4816 | 0.6069 | 0.4816 | 0.6940 |
| No log | 0.9456 | 400 | 0.6224 | 0.5314 | 0.6224 | 0.7889 |
| No log | 0.9504 | 402 | 0.8422 | 0.4642 | 0.8422 | 0.9177 |
| No log | 0.9551 | 404 | 0.9251 | 0.4367 | 0.9251 | 0.9618 |
| No log | 0.9598 | 406 | 0.7229 | 0.4873 | 0.7229 | 0.8502 |
| No log | 0.9645 | 408 | 0.6838 | 0.4813 | 0.6838 | 0.8269 |
| No log | 0.9693 | 410 | 0.7137 | 0.4684 | 0.7137 | 0.8448 |
| No log | 0.9740 | 412 | 0.6379 | 0.4727 | 0.6379 | 0.7987 |
| No log | 0.9787 | 414 | 0.6176 | 0.4684 | 0.6176 | 0.7859 |
| No log | 0.9835 | 416 | 0.7166 | 0.4644 | 0.7166 | 0.8465 |
| No log | 0.9882 | 418 | 0.6380 | 0.4850 | 0.6380 | 0.7988 |
| No log | 0.9929 | 420 | 0.5916 | 0.5149 | 0.5916 | 0.7691 |
| No log | 0.9976 | 422 | 0.5654 | 0.5572 | 0.5654 | 0.7519 |
| No log | 1.0024 | 424 | 0.5018 | 0.6169 | 0.5018 | 0.7084 |
| No log | 1.0071 | 426 | 0.4805 | 0.6151 | 0.4805 | 0.6932 |
| No log | 1.0118 | 428 | 0.4847 | 0.6223 | 0.4847 | 0.6962 |
| No log | 1.0165 | 430 | 0.5531 | 0.6229 | 0.5531 | 0.7437 |
| No log | 1.0213 | 432 | 0.6593 | 0.5865 | 0.6593 | 0.8120 |
| No log | 1.0260 | 434 | 0.6333 | 0.6121 | 0.6333 | 0.7958 |
| No log | 1.0307 | 436 | 0.5266 | 0.6239 | 0.5266 | 0.7257 |
| No log | 1.0355 | 438 | 0.5546 | 0.6168 | 0.5546 | 0.7447 |
| No log | 1.0402 | 440 | 0.7182 | 0.5476 | 0.7182 | 0.8475 |
| No log | 1.0449 | 442 | 0.8598 | 0.4795 | 0.8598 | 0.9272 |
| No log | 1.0496 | 444 | 0.6289 | 0.5624 | 0.6289 | 0.7930 |
| No log | 1.0544 | 446 | 0.4996 | 0.6006 | 0.4996 | 0.7068 |
| No log | 1.0591 | 448 | 0.5206 | 0.5899 | 0.5206 | 0.7215 |
| No log | 1.0638 | 450 | 0.8055 | 0.4665 | 0.8055 | 0.8975 |
| No log | 1.0686 | 452 | 1.0645 | 0.3990 | 1.0645 | 1.0318 |
| No log | 1.0733 | 454 | 0.8904 | 0.4304 | 0.8904 | 0.9436 |
| No log | 1.0780 | 456 | 0.5816 | 0.5416 | 0.5816 | 0.7626 |
| No log | 1.0827 | 458 | 0.5400 | 0.5667 | 0.5400 | 0.7349 |
| No log | 1.0875 | 460 | 0.5979 | 0.5153 | 0.5979 | 0.7733 |
| No log | 1.0922 | 462 | 0.7745 | 0.4683 | 0.7745 | 0.8801 |
| No log | 1.0969 | 464 | 0.6883 | 0.4958 | 0.6883 | 0.8296 |
| No log | 1.1017 | 466 | 0.5412 | 0.5638 | 0.5412 | 0.7357 |
| No log | 1.1064 | 468 | 0.5460 | 0.5479 | 0.5460 | 0.7389 |
| No log | 1.1111 | 470 | 0.5766 | 0.5475 | 0.5766 | 0.7594 |
| No log | 1.1158 | 472 | 0.5325 | 0.5974 | 0.5325 | 0.7298 |
| No log | 1.1206 | 474 | 0.5150 | 0.6194 | 0.5150 | 0.7177 |
| No log | 1.1253 | 476 | 0.6569 | 0.5362 | 0.6569 | 0.8105 |
| No log | 1.1300 | 478 | 0.6637 | 0.5295 | 0.6637 | 0.8147 |
| No log | 1.1348 | 480 | 0.5100 | 0.6369 | 0.5100 | 0.7142 |
| No log | 1.1395 | 482 | 0.5030 | 0.6355 | 0.5030 | 0.7093 |
| No log | 1.1442 | 484 | 0.6283 | 0.5400 | 0.6283 | 0.7926 |
| No log | 1.1489 | 486 | 0.9441 | 0.4669 | 0.9441 | 0.9716 |
| No log | 1.1537 | 488 | 0.9533 | 0.4425 | 0.9533 | 0.9764 |
| No log | 1.1584 | 490 | 0.6172 | 0.5333 | 0.6172 | 0.7856 |
| No log | 1.1631 | 492 | 0.5474 | 0.5682 | 0.5474 | 0.7399 |
| No log | 1.1678 | 494 | 0.6783 | 0.5015 | 0.6783 | 0.8236 |
| No log | 1.1726 | 496 | 0.8878 | 0.4305 | 0.8878 | 0.9423 |
| No log | 1.1773 | 498 | 0.7326 | 0.4727 | 0.7326 | 0.8559 |
| 0.8579 | 1.1820 | 500 | 0.6006 | 0.5253 | 0.6006 | 0.7750 |
| 0.8579 | 1.1868 | 502 | 0.5121 | 0.5665 | 0.5121 | 0.7156 |
| 0.8579 | 1.1915 | 504 | 0.5203 | 0.5636 | 0.5203 | 0.7213 |
| 0.8579 | 1.1962 | 506 | 0.5500 | 0.5453 | 0.5500 | 0.7416 |
| 0.8579 | 1.2009 | 508 | 0.8221 | 0.4778 | 0.8221 | 0.9067 |
| 0.8579 | 1.2057 | 510 | 0.9196 | 0.4730 | 0.9196 | 0.9590 |
| 0.8579 | 1.2104 | 512 | 0.6545 | 0.5423 | 0.6545 | 0.8090 |
| 0.8579 | 1.2151 | 514 | 0.5619 | 0.5666 | 0.5619 | 0.7496 |
| 0.8579 | 1.2199 | 516 | 0.5349 | 0.5922 | 0.5349 | 0.7314 |
| 0.8579 | 1.2246 | 518 | 0.4956 | 0.5994 | 0.4956 | 0.7040 |
| 0.8579 | 1.2293 | 520 | 0.4740 | 0.6127 | 0.4740 | 0.6885 |
| 0.8579 | 1.2340 | 522 | 0.4847 | 0.5970 | 0.4847 | 0.6962 |
| 0.8579 | 1.2388 | 524 | 0.6039 | 0.5653 | 0.6039 | 0.7771 |
| 0.8579 | 1.2435 | 526 | 0.6106 | 0.5560 | 0.6106 | 0.7814 |
| 0.8579 | 1.2482 | 528 | 0.5193 | 0.5824 | 0.5193 | 0.7206 |
| 0.8579 | 1.2530 | 530 | 0.4968 | 0.5820 | 0.4968 | 0.7048 |
| 0.8579 | 1.2577 | 532 | 0.6218 | 0.5239 | 0.6218 | 0.7885 |
| 0.8579 | 1.2624 | 534 | 0.6391 | 0.5110 | 0.6391 | 0.7994 |
| 0.8579 | 1.2671 | 536 | 0.4860 | 0.6001 | 0.4860 | 0.6971 |
| 0.8579 | 1.2719 | 538 | 0.4650 | 0.6070 | 0.4650 | 0.6819 |
| 0.8579 | 1.2766 | 540 | 0.4965 | 0.6093 | 0.4965 | 0.7046 |
| 0.8579 | 1.2813 | 542 | 0.5583 | 0.5933 | 0.5583 | 0.7472 |
| 0.8579 | 1.2861 | 544 | 0.4862 | 0.6239 | 0.4862 | 0.6972 |
| 0.8579 | 1.2908 | 546 | 0.4674 | 0.6369 | 0.4674 | 0.6836 |
| 0.8579 | 1.2955 | 548 | 0.4714 | 0.6459 | 0.4714 | 0.6866 |
| 0.8579 | 1.3002 | 550 | 0.4887 | 0.6261 | 0.4887 | 0.6991 |
| 0.8579 | 1.3050 | 552 | 0.5125 | 0.5980 | 0.5125 | 0.7159 |
| 0.8579 | 1.3097 | 554 | 0.5443 | 0.5822 | 0.5443 | 0.7378 |
| 0.8579 | 1.3144 | 556 | 0.5090 | 0.6060 | 0.5090 | 0.7134 |
| 0.8579 | 1.3191 | 558 | 0.5354 | 0.6088 | 0.5354 | 0.7317 |
| 0.8579 | 1.3239 | 560 | 0.5473 | 0.6024 | 0.5473 | 0.7398 |
| 0.8579 | 1.3286 | 562 | 0.6192 | 0.5556 | 0.6192 | 0.7869 |
| 0.8579 | 1.3333 | 564 | 0.7695 | 0.5238 | 0.7695 | 0.8772 |
| 0.8579 | 1.3381 | 566 | 0.7089 | 0.5214 | 0.7089 | 0.8420 |
| 0.8579 | 1.3428 | 568 | 0.7135 | 0.5145 | 0.7135 | 0.8447 |
| 0.8579 | 1.3475 | 570 | 0.7675 | 0.4901 | 0.7675 | 0.8761 |
| 0.8579 | 1.3522 | 572 | 0.6543 | 0.5366 | 0.6543 | 0.8089 |
| 0.8579 | 1.3570 | 574 | 0.6564 | 0.5299 | 0.6564 | 0.8102 |
| 0.8579 | 1.3617 | 576 | 0.5695 | 0.5685 | 0.5695 | 0.7546 |
| 0.8579 | 1.3664 | 578 | 0.6439 | 0.5416 | 0.6439 | 0.8024 |
| 0.8579 | 1.3712 | 580 | 0.7997 | 0.5146 | 0.7997 | 0.8943 |
| 0.8579 | 1.3759 | 582 | 0.6449 | 0.5456 | 0.6449 | 0.8031 |
| 0.8579 | 1.3806 | 584 | 0.5071 | 0.6118 | 0.5071 | 0.7121 |
| 0.8579 | 1.3853 | 586 | 0.5288 | 0.6136 | 0.5288 | 0.7272 |
| 0.8579 | 1.3901 | 588 | 0.6801 | 0.5566 | 0.6801 | 0.8247 |
| 0.8579 | 1.3948 | 590 | 0.5896 | 0.6057 | 0.5896 | 0.7679 |
| 0.8579 | 1.3995 | 592 | 0.5490 | 0.6126 | 0.5490 | 0.7410 |
| 0.8579 | 1.4043 | 594 | 0.5284 | 0.6345 | 0.5284 | 0.7269 |
| 0.8579 | 1.4090 | 596 | 0.6151 | 0.5734 | 0.6151 | 0.7843 |
| 0.8579 | 1.4137 | 598 | 0.6449 | 0.5588 | 0.6449 | 0.8031 |
| 0.8579 | 1.4184 | 600 | 0.5039 | 0.6226 | 0.5039 | 0.7098 |
| 0.8579 | 1.4232 | 602 | 0.4893 | 0.6105 | 0.4893 | 0.6995 |
| 0.8579 | 1.4279 | 604 | 0.4708 | 0.6064 | 0.4708 | 0.6862 |
| 0.8579 | 1.4326 | 606 | 0.4860 | 0.6331 | 0.4860 | 0.6971 |
| 0.8579 | 1.4374 | 608 | 0.4980 | 0.6216 | 0.4980 | 0.7057 |
| 0.8579 | 1.4421 | 610 | 0.4835 | 0.6099 | 0.4835 | 0.6954 |
| 0.8579 | 1.4468 | 612 | 0.4802 | 0.6142 | 0.4802 | 0.6930 |
| 0.8579 | 1.4515 | 614 | 0.5629 | 0.5843 | 0.5629 | 0.7503 |
| 0.8579 | 1.4563 | 616 | 0.6249 | 0.5641 | 0.6249 | 0.7905 |
| 0.8579 | 1.4610 | 618 | 0.6536 | 0.5509 | 0.6536 | 0.8084 |
| 0.8579 | 1.4657 | 620 | 0.6359 | 0.5538 | 0.6359 | 0.7975 |
| 0.8579 | 1.4704 | 622 | 0.6045 | 0.5623 | 0.6045 | 0.7775 |
| 0.8579 | 1.4752 | 624 | 0.6082 | 0.5628 | 0.6082 | 0.7798 |
| 0.8579 | 1.4799 | 626 | 0.5710 | 0.5911 | 0.5710 | 0.7557 |
| 0.8579 | 1.4846 | 628 | 0.5300 | 0.6150 | 0.5300 | 0.7280 |
| 0.8579 | 1.4894 | 630 | 0.5653 | 0.5911 | 0.5653 | 0.7519 |
| 0.8579 | 1.4941 | 632 | 0.6141 | 0.5640 | 0.6141 | 0.7836 |
| 0.8579 | 1.4988 | 634 | 0.5243 | 0.6002 | 0.5243 | 0.7241 |
| 0.8579 | 1.5035 | 636 | 0.5005 | 0.6091 | 0.5005 | 0.7074 |
| 0.8579 | 1.5083 | 638 | 0.4962 | 0.6201 | 0.4962 | 0.7044 |
| 0.8579 | 1.5130 | 640 | 0.5918 | 0.5675 | 0.5918 | 0.7693 |
| 0.8579 | 1.5177 | 642 | 0.5878 | 0.5590 | 0.5878 | 0.7667 |
| 0.8579 | 1.5225 | 644 | 0.4862 | 0.5992 | 0.4862 | 0.6973 |
| 0.8579 | 1.5272 | 646 | 0.4766 | 0.6022 | 0.4766 | 0.6904 |
| 0.8579 | 1.5319 | 648 | 0.4922 | 0.6169 | 0.4922 | 0.7016 |
| 0.8579 | 1.5366 | 650 | 0.4676 | 0.6175 | 0.4676 | 0.6838 |
| 0.8579 | 1.5414 | 652 | 0.4967 | 0.5977 | 0.4967 | 0.7047 |
| 0.8579 | 1.5461 | 654 | 0.4950 | 0.6083 | 0.4950 | 0.7035 |
| 0.8579 | 1.5508 | 656 | 0.4937 | 0.6291 | 0.4937 | 0.7027 |
| 0.8579 | 1.5556 | 658 | 0.5070 | 0.6217 | 0.5070 | 0.7120 |
| 0.8579 | 1.5603 | 660 | 0.4895 | 0.6282 | 0.4895 | 0.6997 |
| 0.8579 | 1.5650 | 662 | 0.5407 | 0.5910 | 0.5407 | 0.7353 |
| 0.8579 | 1.5697 | 664 | 0.4887 | 0.6099 | 0.4887 | 0.6990 |
| 0.8579 | 1.5745 | 666 | 0.5324 | 0.6061 | 0.5324 | 0.7297 |
| 0.8579 | 1.5792 | 668 | 0.5508 | 0.6062 | 0.5508 | 0.7422 |
| 0.8579 | 1.5839 | 670 | 0.4781 | 0.6185 | 0.4781 | 0.6915 |
| 0.8579 | 1.5887 | 672 | 0.4749 | 0.6124 | 0.4749 | 0.6891 |
| 0.8579 | 1.5934 | 674 | 0.4836 | 0.6210 | 0.4836 | 0.6954 |
| 0.8579 | 1.5981 | 676 | 0.5373 | 0.6161 | 0.5373 | 0.7330 |
| 0.8579 | 1.6028 | 678 | 0.5110 | 0.6091 | 0.5110 | 0.7148 |
| 0.8579 | 1.6076 | 680 | 0.5058 | 0.6106 | 0.5058 | 0.7112 |
| 0.8579 | 1.6123 | 682 | 0.5388 | 0.6033 | 0.5388 | 0.7340 |
| 0.8579 | 1.6170 | 684 | 0.5311 | 0.6054 | 0.5311 | 0.7288 |
| 0.8579 | 1.6217 | 686 | 0.4944 | 0.5917 | 0.4944 | 0.7031 |
| 0.8579 | 1.6265 | 688 | 0.4825 | 0.6135 | 0.4825 | 0.6946 |
| 0.8579 | 1.6312 | 690 | 0.5280 | 0.6053 | 0.5280 | 0.7266 |
| 0.8579 | 1.6359 | 692 | 0.5052 | 0.6214 | 0.5052 | 0.7108 |
| 0.8579 | 1.6407 | 694 | 0.4784 | 0.6257 | 0.4784 | 0.6917 |
| 0.8579 | 1.6454 | 696 | 0.4927 | 0.6377 | 0.4927 | 0.7019 |
| 0.8579 | 1.6501 | 698 | 0.5884 | 0.5824 | 0.5884 | 0.7671 |
| 0.8579 | 1.6548 | 700 | 0.6046 | 0.5720 | 0.6046 | 0.7776 |
| 0.8579 | 1.6596 | 702 | 0.4970 | 0.6379 | 0.4970 | 0.7050 |
| 0.8579 | 1.6643 | 704 | 0.4977 | 0.6133 | 0.4977 | 0.7054 |
| 0.8579 | 1.6690 | 706 | 0.5577 | 0.6010 | 0.5577 | 0.7468 |
| 0.8579 | 1.6738 | 708 | 0.6927 | 0.5323 | 0.6927 | 0.8323 |
| 0.8579 | 1.6785 | 710 | 0.6291 | 0.5645 | 0.6291 | 0.7931 |
| 0.8579 | 1.6832 | 712 | 0.5331 | 0.6023 | 0.5331 | 0.7301 |
| 0.8579 | 1.6879 | 714 | 0.5676 | 0.5776 | 0.5676 | 0.7534 |
| 0.8579 | 1.6927 | 716 | 0.5970 | 0.5585 | 0.5970 | 0.7726 |
| 0.8579 | 1.6974 | 718 | 0.5582 | 0.5848 | 0.5582 | 0.7471 |
| 0.8579 | 1.7021 | 720 | 0.6090 | 0.5600 | 0.6090 | 0.7804 |
| 0.8579 | 1.7069 | 722 | 0.7420 | 0.4987 | 0.7420 | 0.8614 |
| 0.8579 | 1.7116 | 724 | 0.6178 | 0.5450 | 0.6178 | 0.7860 |
| 0.8579 | 1.7163 | 726 | 0.5046 | 0.5911 | 0.5046 | 0.7104 |
| 0.8579 | 1.7210 | 728 | 0.5062 | 0.5884 | 0.5062 | 0.7115 |
| 0.8579 | 1.7258 | 730 | 0.5001 | 0.6256 | 0.5001 | 0.7071 |
| 0.8579 | 1.7305 | 732 | 0.5717 | 0.5701 | 0.5717 | 0.7561 |
| 0.8579 | 1.7352 | 734 | 0.6222 | 0.5667 | 0.6222 | 0.7888 |
| 0.8579 | 1.7400 | 736 | 0.5211 | 0.6410 | 0.5211 | 0.7219 |
| 0.8579 | 1.7447 | 738 | 0.5025 | 0.6170 | 0.5025 | 0.7089 |
| 0.8579 | 1.7494 | 740 | 0.4890 | 0.6422 | 0.4890 | 0.6993 |
| 0.8579 | 1.7541 | 742 | 0.5314 | 0.6386 | 0.5314 | 0.7290 |
| 0.8579 | 1.7589 | 744 | 0.6923 | 0.5657 | 0.6923 | 0.8320 |
| 0.8579 | 1.7636 | 746 | 0.6411 | 0.5806 | 0.6411 | 0.8007 |
| 0.8579 | 1.7683 | 748 | 0.5148 | 0.6559 | 0.5148 | 0.7175 |
| 0.8579 | 1.7730 | 750 | 0.5276 | 0.6317 | 0.5276 | 0.7264 |
| 0.8579 | 1.7778 | 752 | 0.6686 | 0.5621 | 0.6686 | 0.8177 |
| 0.8579 | 1.7825 | 754 | 0.6532 | 0.5641 | 0.6532 | 0.8082 |
| 0.8579 | 1.7872 | 756 | 0.5644 | 0.6027 | 0.5644 | 0.7513 |
| 0.8579 | 1.7920 | 758 | 0.5164 | 0.6064 | 0.5164 | 0.7186 |
| 0.8579 | 1.7967 | 760 | 0.5594 | 0.5952 | 0.5594 | 0.7480 |
| 0.8579 | 1.8014 | 762 | 0.7209 | 0.4934 | 0.7209 | 0.8491 |
| 0.8579 | 1.8061 | 764 | 0.6241 | 0.5369 | 0.6241 | 0.7900 |
| 0.8579 | 1.8109 | 766 | 0.5097 | 0.5782 | 0.5097 | 0.7139 |
| 0.8579 | 1.8156 | 768 | 0.5055 | 0.5719 | 0.5055 | 0.7110 |
| 0.8579 | 1.8203 | 770 | 0.6028 | 0.5288 | 0.6028 | 0.7764 |
| 0.8579 | 1.8251 | 772 | 0.7079 | 0.4835 | 0.7079 | 0.8414 |
| 0.8579 | 1.8298 | 774 | 0.6284 | 0.5137 | 0.6284 | 0.7927 |
| 0.8579 | 1.8345 | 776 | 0.5441 | 0.5729 | 0.5441 | 0.7376 |
| 0.8579 | 1.8392 | 778 | 0.5457 | 0.5705 | 0.5457 | 0.7387 |
| 0.8579 | 1.8440 | 780 | 0.5802 | 0.5339 | 0.5802 | 0.7617 |
| 0.8579 | 1.8487 | 782 | 0.5509 | 0.5786 | 0.5509 | 0.7422 |
| 0.8579 | 1.8534 | 784 | 0.5343 | 0.5911 | 0.5343 | 0.7309 |
| 0.8579 | 1.8582 | 786 | 0.5182 | 0.6100 | 0.5182 | 0.7199 |
| 0.8579 | 1.8629 | 788 | 0.5001 | 0.5907 | 0.5001 | 0.7072 |
| 0.8579 | 1.8676 | 790 | 0.5607 | 0.6119 | 0.5607 | 0.7488 |
| 0.8579 | 1.8723 | 792 | 0.6509 | 0.5393 | 0.6509 | 0.8068 |
| 0.8579 | 1.8771 | 794 | 0.5592 | 0.5975 | 0.5592 | 0.7478 |
| 0.8579 | 1.8818 | 796 | 0.5048 | 0.5921 | 0.5048 | 0.7105 |
| 0.8579 | 1.8865 | 798 | 0.5156 | 0.5909 | 0.5156 | 0.7181 |
| 0.8579 | 1.8913 | 800 | 0.5205 | 0.5863 | 0.5205 | 0.7215 |
| 0.8579 | 1.8960 | 802 | 0.6805 | 0.5375 | 0.6805 | 0.8249 |
| 0.8579 | 1.9007 | 804 | 0.6824 | 0.5318 | 0.6824 | 0.8260 |
| 0.8579 | 1.9054 | 806 | 0.5399 | 0.6011 | 0.5399 | 0.7348 |
| 0.8579 | 1.9102 | 808 | 0.4788 | 0.6341 | 0.4788 | 0.6919 |
| 0.8579 | 1.9149 | 810 | 0.4814 | 0.6300 | 0.4814 | 0.6938 |
| 0.8579 | 1.9196 | 812 | 0.5512 | 0.5859 | 0.5512 | 0.7425 |
| 0.8579 | 1.9243 | 814 | 0.5576 | 0.5969 | 0.5576 | 0.7467 |
| 0.8579 | 1.9291 | 816 | 0.5073 | 0.6372 | 0.5073 | 0.7122 |
| 0.8579 | 1.9338 | 818 | 0.5051 | 0.6096 | 0.5051 | 0.7107 |
| 0.8579 | 1.9385 | 820 | 0.5341 | 0.6099 | 0.5341 | 0.7308 |
| 0.8579 | 1.9433 | 822 | 0.6684 | 0.5239 | 0.6684 | 0.8176 |
| 0.8579 | 1.9480 | 824 | 0.6403 | 0.5262 | 0.6403 | 0.8002 |
| 0.8579 | 1.9527 | 826 | 0.5762 | 0.5682 | 0.5762 | 0.7591 |
| 0.8579 | 1.9574 | 828 | 0.6370 | 0.5264 | 0.6370 | 0.7981 |
| 0.8579 | 1.9622 | 830 | 0.6838 | 0.4901 | 0.6838 | 0.8269 |
| 0.8579 | 1.9669 | 832 | 0.6225 | 0.5274 | 0.6225 | 0.7890 |
| 0.8579 | 1.9716 | 834 | 0.5979 | 0.5450 | 0.5979 | 0.7732 |
| 0.8579 | 1.9764 | 836 | 0.5532 | 0.5632 | 0.5532 | 0.7438 |
| 0.8579 | 1.9811 | 838 | 0.5366 | 0.5677 | 0.5366 | 0.7325 |
| 0.8579 | 1.9858 | 840 | 0.4938 | 0.6211 | 0.4938 | 0.7027 |
| 0.8579 | 1.9905 | 842 | 0.5087 | 0.6156 | 0.5087 | 0.7132 |
| 0.8579 | 1.9953 | 844 | 0.5875 | 0.5596 | 0.5875 | 0.7665 |
| 0.8579 | 2.0000 | 846 | 0.5132 | 0.6206 | 0.5132 | 0.7164 |
| 0.8579 | 2.0047 | 848 | 0.4771 | 0.6358 | 0.4771 | 0.6907 |
| 0.8579 | 2.0095 | 850 | 0.4832 | 0.6460 | 0.4832 | 0.6951 |
| 0.8579 | 2.0142 | 852 | 0.4909 | 0.6518 | 0.4909 | 0.7007 |
| 0.8579 | 2.0189 | 854 | 0.5182 | 0.6372 | 0.5182 | 0.7198 |
| 0.8579 | 2.0236 | 856 | 0.5670 | 0.6032 | 0.5670 | 0.7530 |
| 0.8579 | 2.0284 | 858 | 0.6613 | 0.5667 | 0.6613 | 0.8132 |
| 0.8579 | 2.0331 | 860 | 0.5810 | 0.5846 | 0.5810 | 0.7622 |
| 0.8579 | 2.0378 | 862 | 0.5715 | 0.5770 | 0.5715 | 0.7560 |
| 0.8579 | 2.0426 | 864 | 0.6018 | 0.5634 | 0.6018 | 0.7757 |
| 0.8579 | 2.0473 | 866 | 0.7330 | 0.5016 | 0.7330 | 0.8562 |
| 0.8579 | 2.0520 | 868 | 0.7046 | 0.5139 | 0.7046 | 0.8394 |
| 0.8579 | 2.0567 | 870 | 0.5899 | 0.5622 | 0.5899 | 0.7680 |
| 0.8579 | 2.0615 | 872 | 0.5942 | 0.5820 | 0.5942 | 0.7708 |
| 0.8579 | 2.0662 | 874 | 0.8487 | 0.4898 | 0.8487 | 0.9213 |
| 0.8579 | 2.0709 | 876 | 1.0695 | 0.3906 | 1.0695 | 1.0342 |
| 0.8579 | 2.0757 | 878 | 0.8462 | 0.4884 | 0.8462 | 0.9199 |
| 0.8579 | 2.0804 | 880 | 0.5356 | 0.5812 | 0.5356 | 0.7318 |
| 0.8579 | 2.0851 | 882 | 0.4924 | 0.6171 | 0.4924 | 0.7017 |
| 0.8579 | 2.0898 | 884 | 0.5302 | 0.5871 | 0.5302 | 0.7281 |
| 0.8579 | 2.0946 | 886 | 0.6875 | 0.5240 | 0.6875 | 0.8291 |
| 0.8579 | 2.0993 | 888 | 0.6264 | 0.5424 | 0.6264 | 0.7914 |
| 0.8579 | 2.1040 | 890 | 0.4815 | 0.6009 | 0.4815 | 0.6939 |
| 0.8579 | 2.1087 | 892 | 0.4795 | 0.6009 | 0.4795 | 0.6925 |
| 0.8579 | 2.1135 | 894 | 0.4737 | 0.6206 | 0.4737 | 0.6882 |
| 0.8579 | 2.1182 | 896 | 0.5085 | 0.6434 | 0.5085 | 0.7131 |
| 0.8579 | 2.1229 | 898 | 0.6022 | 0.6183 | 0.6022 | 0.7760 |
| 0.8579 | 2.1277 | 900 | 0.5548 | 0.6249 | 0.5548 | 0.7448 |
| 0.8579 | 2.1324 | 902 | 0.5134 | 0.6409 | 0.5134 | 0.7165 |
| 0.8579 | 2.1371 | 904 | 0.5188 | 0.6489 | 0.5188 | 0.7203 |
| 0.8579 | 2.1418 | 906 | 0.5114 | 0.6570 | 0.5114 | 0.7151 |
| 0.8579 | 2.1466 | 908 | 0.5284 | 0.6335 | 0.5284 | 0.7269 |
| 0.8579 | 2.1513 | 910 | 0.5586 | 0.6088 | 0.5586 | 0.7474 |
| 0.8579 | 2.1560 | 912 | 0.5528 | 0.6205 | 0.5528 | 0.7435 |
| 0.8579 | 2.1608 | 914 | 0.5465 | 0.6137 | 0.5465 | 0.7393 |
| 0.8579 | 2.1655 | 916 | 0.6467 | 0.5237 | 0.6467 | 0.8042 |
| 0.8579 | 2.1702 | 918 | 0.6117 | 0.5455 | 0.6117 | 0.7821 |
| 0.8579 | 2.1749 | 920 | 0.5500 | 0.5953 | 0.5500 | 0.7416 |
| 0.8579 | 2.1797 | 922 | 0.4921 | 0.6299 | 0.4921 | 0.7015 |
| 0.8579 | 2.1844 | 924 | 0.4990 | 0.6287 | 0.4990 | 0.7064 |
| 0.8579 | 2.1891 | 926 | 0.5973 | 0.5949 | 0.5973 | 0.7728 |
| 0.8579 | 2.1939 | 928 | 0.7032 | 0.5253 | 0.7032 | 0.8386 |
| 0.8579 | 2.1986 | 930 | 0.6289 | 0.5724 | 0.6289 | 0.7930 |
| 0.8579 | 2.2033 | 932 | 0.4938 | 0.6342 | 0.4938 | 0.7027 |
| 0.8579 | 2.2080 | 934 | 0.4828 | 0.6369 | 0.4828 | 0.6949 |
| 0.8579 | 2.2128 | 936 | 0.5108 | 0.6311 | 0.5108 | 0.7147 |
| 0.8579 | 2.2175 | 938 | 0.5086 | 0.6249 | 0.5086 | 0.7132 |
| 0.8579 | 2.2222 | 940 | 0.5426 | 0.6169 | 0.5426 | 0.7366 |
| 0.8579 | 2.2270 | 942 | 0.6940 | 0.5513 | 0.6940 | 0.8331 |
| 0.8579 | 2.2317 | 944 | 0.7234 | 0.5463 | 0.7234 | 0.8506 |
| 0.8579 | 2.2364 | 946 | 0.5588 | 0.6032 | 0.5588 | 0.7476 |
| 0.8579 | 2.2411 | 948 | 0.5091 | 0.6231 | 0.5091 | 0.7135 |
| 0.8579 | 2.2459 | 950 | 0.5369 | 0.6169 | 0.5369 | 0.7327 |
| 0.8579 | 2.2506 | 952 | 0.5511 | 0.6119 | 0.5511 | 0.7424 |
| 0.8579 | 2.2553 | 954 | 0.5695 | 0.5997 | 0.5695 | 0.7546 |
| 0.8579 | 2.2600 | 956 | 0.7106 | 0.5109 | 0.7106 | 0.8430 |
| 0.8579 | 2.2648 | 958 | 0.7240 | 0.4935 | 0.7240 | 0.8509 |
| 0.8579 | 2.2695 | 960 | 0.6047 | 0.5533 | 0.6047 | 0.7777 |
| 0.8579 | 2.2742 | 962 | 0.5581 | 0.5851 | 0.5581 | 0.7470 |
| 0.8579 | 2.2790 | 964 | 0.6735 | 0.5208 | 0.6735 | 0.8207 |
| 0.8579 | 2.2837 | 966 | 0.7016 | 0.5033 | 0.7016 | 0.8376 |
| 0.8579 | 2.2884 | 968 | 0.5639 | 0.5870 | 0.5639 | 0.7510 |
| 0.8579 | 2.2931 | 970 | 0.5288 | 0.6091 | 0.5288 | 0.7272 |
| 0.8579 | 2.2979 | 972 | 0.5420 | 0.6068 | 0.5420 | 0.7362 |
| 0.8579 | 2.3026 | 974 | 0.5925 | 0.5854 | 0.5925 | 0.7698 |
| 0.8579 | 2.3073 | 976 | 0.5172 | 0.6271 | 0.5172 | 0.7192 |
| 0.8579 | 2.3121 | 978 | 0.4983 | 0.5935 | 0.4983 | 0.7059 |
| 0.8579 | 2.3168 | 980 | 0.4903 | 0.5995 | 0.4903 | 0.7002 |
| 0.8579 | 2.3215 | 982 | 0.5106 | 0.6237 | 0.5106 | 0.7145 |
| 0.8579 | 2.3262 | 984 | 0.7609 | 0.4962 | 0.7609 | 0.8723 |
| 0.8579 | 2.3310 | 986 | 0.7850 | 0.4925 | 0.7850 | 0.8860 |
| 0.8579 | 2.3357 | 988 | 0.5761 | 0.5568 | 0.5761 | 0.7590 |
| 0.8579 | 2.3404 | 990 | 0.4842 | 0.6029 | 0.4842 | 0.6958 |
| 0.8579 | 2.3452 | 992 | 0.4867 | 0.6037 | 0.4867 | 0.6976 |
| 0.8579 | 2.3499 | 994 | 0.5418 | 0.5771 | 0.5418 | 0.7361 |
| 0.8579 | 2.3546 | 996 | 0.6924 | 0.5123 | 0.6924 | 0.8321 |
| 0.8579 | 2.3593 | 998 | 0.6567 | 0.5272 | 0.6567 | 0.8104 |
| 0.3208 | 2.3641 | 1000 | 0.4999 | 0.5963 | 0.4999 | 0.7070 |
| 0.3208 | 2.3688 | 1002 | 0.4856 | 0.5943 | 0.4856 | 0.6968 |
| 0.3208 | 2.3735 | 1004 | 0.5378 | 0.5974 | 0.5378 | 0.7334 |
| 0.3208 | 2.3783 | 1006 | 0.5750 | 0.5882 | 0.5750 | 0.7583 |
| 0.3208 | 2.3830 | 1008 | 0.6491 | 0.5674 | 0.6491 | 0.8056 |
| 0.3208 | 2.3877 | 1010 | 0.5793 | 0.5791 | 0.5793 | 0.7611 |
| 0.3208 | 2.3924 | 1012 | 0.4854 | 0.5791 | 0.4854 | 0.6967 |
| 0.3208 | 2.3972 | 1014 | 0.4901 | 0.5743 | 0.4901 | 0.7001 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
Jonathanmann/GPT2-medium-SADnov21 | Jonathanmann | 2024-11-24T02:01:16Z | 7 | 1 | null | [
"pytorch",
"safetensors",
"gpt2",
"base_model:openai-community/gpt2-medium",
"base_model:finetune:openai-community/gpt2-medium",
"region:us"
] | null | 2024-11-21T20:30:55Z | ---
base_model:
- openai-community/gpt2-medium
--- |
peter198477/beautiful_girls | peter198477 | 2024-11-24T01:46:46Z | 6 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"region:us"
] | text-to-image | 2024-11-24T01:45:46Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: '-'
output:
url: >-
images/workspace_trainsamples_799432171428303192_dd883b31-f47d-46e9-9f16-b5df14d5c54b.png
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: null
---
# fffg
<Gallery />
## Download model
Weights for this model are available in Safetensors format.
[Download](/peter198477/beautiful_girls/tree/main) them in the Files & versions tab.
|
mlx-community/Florence-2-base-ft-6bit | mlx-community | 2024-11-24T01:32:51Z | 6 | 0 | mlx | [
"mlx",
"safetensors",
"florence2",
"vision",
"image-text-to-text",
"custom_code",
"license:mit",
"region:us"
] | image-text-to-text | 2024-11-24T01:31:33Z | ---
license: mit
license_link: https://huggingface.co/microsoft/Florence-2-base-ft/resolve/main/LICENSE
pipeline_tag: image-text-to-text
tags:
- vision
- mlx
---
# mlx-community/Florence-2-base-ft-6bit
This model was converted to MLX format from [`prince-canuma/Florence-2-base-ft`](https://huggingface.co/prince-canuma/Florence-2-base-ft) using mlx-vlm version **0.1.0**.
Refer to the [original model card](https://huggingface.co/prince-canuma/Florence-2-base-ft) for more details on the model.
## Use with mlx
```bash
pip install -U mlx-vlm
```
```bash
python -m mlx_vlm.generate --model mlx-community/Florence-2-base-ft-6bit --max-tokens 100 --temp 0.0
```
|
ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B | ZeroXClem | 2024-11-24T01:22:20Z | 50 | 2 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"merge",
"mergekit",
"lazymergekit",
"Locutusque/StockQwen-2.5-7B",
"allknowingroger/QwenSlerp8-7B",
"conversational",
"en",
"zh",
"base_model:Locutusque/StockQwen-2.5-7B",
"base_model:merge:Locutusque/StockQwen-2.5-7B",
"base_model:allknowingroger/QwenSlerp8-7B",
"base_model:merge:allknowingroger/QwenSlerp8-7B",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T17:35:48Z | ---
language:
- en
- zh
license: apache-2.0
library_name: transformers
tags:
- merge
- mergekit
- lazymergekit
- Locutusque/StockQwen-2.5-7B
- allknowingroger/QwenSlerp8-7B
base_model:
- allknowingroger/QwenSlerp8-7B
- Locutusque/StockQwen-2.5-7B
model-index:
- name: Qwen-2.5-Aether-SlerpFusion-7B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: IFEval (0-Shot)
type: HuggingFaceH4/ifeval
args:
num_few_shot: 0
metrics:
- type: inst_level_strict_acc and prompt_level_strict_acc
value: 62.62
name: strict accuracy
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: BBH (3-Shot)
type: BBH
args:
num_few_shot: 3
metrics:
- type: acc_norm
value: 36.01
name: normalized accuracy
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MATH Lvl 5 (4-Shot)
type: hendrycks/competition_math
args:
num_few_shot: 4
metrics:
- type: exact_match
value: 24.17
name: exact match
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GPQA (0-shot)
type: Idavidrein/gpqa
args:
num_few_shot: 0
metrics:
- type: acc_norm
value: 6.49
name: acc_norm
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MuSR (0-shot)
type: TAUR-Lab/MuSR
args:
num_few_shot: 0
metrics:
- type: acc_norm
value: 11.29
name: acc_norm
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU-PRO (5-shot)
type: TIGER-Lab/MMLU-Pro
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 36.96
name: accuracy
source:
url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
name: Open LLM Leaderboard
---
# ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
**Qwen-2.5-Aether-SlerpFusion-7B** is a sophisticated model merge that combines the strengths of multiple pre-trained language models using the powerful [mergekit](https://github.com/ZeroXClem/mergekit) framework. This fusion leverages spherical linear interpolation (SLERP) to seamlessly blend architectural layers, resulting in a model that benefits from enhanced performance and versatility.
## 🚀 Merged Models
This model merge incorporates the following:
- [**Locutusque/StockQwen-2.5-7B**](https://huggingface.co/Locutusque/StockQwen-2.5-7B): Serves as the foundational model, renowned for its robust language understanding and generation capabilities.
- [**allknowingroger/QwenSlerp8-7B**](https://huggingface.co/allknowingroger/QwenSlerp8-7B): Contributes advanced task-specific fine-tuning, enhancing the model's adaptability across various applications.
## 🧩 Merge Configuration
The configuration below outlines how the models are merged using **spherical linear interpolation (SLERP)**. This method ensures smooth transitions between the layers of both models, facilitating an optimal blend of their unique attributes:
```yaml
# ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B Merge Configuration
slices:
- sources:
- model: Locutusque/StockQwen-2.5-7B
layer_range: [0, 28]
- model: allknowingroger/QwenSlerp8-7B
layer_range: [0, 28]
merge_method: slerp
base_model: Locutusque/StockQwen-2.5-7B
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: bfloat16
```
### 🔑 Key Parameters
- **Self-Attention Filtering** (`self_attn`): Controls the blending extent across self-attention layers, allowing for a dynamic mix between the two source models.
- **MLP Filtering** (`mlp`): Adjusts the balance within the Multi-Layer Perceptrons, fine-tuning the model’s neural network layers for optimal performance.
- **Global Weight (`t.value`)**: Sets a general interpolation factor for all unspecified layers, ensuring an equal contribution from both models.
- **Data Type (`dtype`)**: Utilizes `bfloat16` to maintain computational efficiency while preserving high precision.
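To make these interpolation factors concrete, the snippet below is a minimal sketch of SLERP applied to a single pair of weight tensors. It is a simplified stand-in for what mergekit does internally, not mergekit's actual implementation; the `t` value corresponds to the per-filter schedules above.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    # Angle between the two weight vectors on the unit hypersphere
    cos_omega = torch.clamp(
        (a_flat @ b_flat) / (a_flat.norm() * b_flat.norm() + eps), -1.0, 1.0
    )
    omega = torch.arccos(cos_omega)
    if omega.abs() < eps:  # near-parallel weights: fall back to plain linear interpolation
        return (1.0 - t) * a + t * b
    so = torch.sin(omega)
    blended = (torch.sin((1.0 - t) * omega) / so) * a_flat \
            + (torch.sin(t * omega) / so) * b_flat
    return blended.reshape(a.shape).to(a.dtype)

# Example: blend one weight matrix at the schedule midpoint t = 0.5
w_merged = slerp(0.5, torch.randn(64, 64), torch.randn(64, 64))
```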
### 🗣️ Inference
Below is an example of how to load and use the model for text generation:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
import torch
# Define the model name
model_name = "ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B"
# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Load the model
model = AutoModelForCausalLM.from_pretrained(
model_name,
torch_dtype=torch.bfloat16,
device_map="auto"
)
# Initialize the pipeline
text_generator = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
device_map="auto"
)
# Define the input prompt
prompt = "Explain the significance of artificial intelligence in modern healthcare."
# Generate the output
outputs = text_generator(
prompt,
max_new_tokens=150,
do_sample=True,
temperature=0.7,
top_k=50,
top_p=0.95
)
# Print the generated text
print(outputs[0]["generated_text"])
```
## 🎯 Use Case & Applications
**Qwen-2.5-Aether-SlerpFusion-7B** excels in scenarios that require both robust language understanding and specialized task performance. This merged model is ideal for:
- **Advanced Text Generation and Comprehension**: Crafting coherent, contextually accurate, and nuanced text for applications like content creation, summarization, and translation.
- **Domain-Specific Tasks**: Enhancing performance in specialized areas such as legal document analysis, medical information processing, and technical support.
- **Interactive AI Systems**: Powering conversational agents and chatbots that require both general language capabilities and task-specific expertise.
## 📜 License
This model is open-sourced under the **Apache-2.0 License**.
## 💡 Tags
- `merge`
- `mergekit`
- `slerp`
- `Qwen`
- `Locutusque/StockQwen-2.5-7B`
- `allknowingroger/QwenSlerp8-7B`
---
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_ZeroXClem__Qwen-2.5-Aether-SlerpFusion-7B)
| Metric |Value|
|-------------------|----:|
|Avg. |29.59|
|IFEval (0-Shot) |62.62|
|BBH (3-Shot) |36.01|
|MATH Lvl 5 (4-Shot)|24.17|
|GPQA (0-shot) | 6.49|
|MuSR (0-shot) |11.29|
|MMLU-PRO (5-shot) |36.96|
|
mrs83/Kurtis-Qwen2.5-0.5B-Instruct | mrs83 | 2024-11-24T01:14:38Z | 185 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"Qwen2.5",
"en",
"dataset:mrs83/kurtis_mental_health_final",
"base_model:Qwen/Qwen2.5-0.5B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-0.5B-Instruct",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-17T23:28:58Z | ---
library_name: transformers
tags:
- text-generation-inference
- Qwen2.5
license: mit
datasets:
- mrs83/kurtis_mental_health_final
language:
- en
base_model:
- Qwen/Qwen2.5-0.5B-Instruct
pipeline_tag: text-generation
---
# Model Card for Kurtis-Qwen2.5-0.5B-Instruct
This model has been fine-tuned using Kurtis, an experimental fine-tuning, inference and evaluation tool for Small Language Models.
## Model Details
### Model Description
- **Developed by:** Massimo R. Scamarcia <[email protected]>
- **Funded by:** Massimo R. Scamarcia <[email protected]> - (self-funded)
- **Shared by:** Massimo R. Scamarcia <[email protected]>
- **Model type:** Transformer decoder
- **Language(s) (NLP):** English
- **License:** MIT
- **Finetuned from model:** Qwen/Qwen2.5-0.5B-Instruct
### Model Sources
- **Repository:** [https://github.com/mrs83/kurtis](https://github.com/mrs83/kurtis)
## Uses
The model is intended for use in a conversational setting, particularly in mental health and therapeutic support scenarios.
### Direct Use
Not suitable for production usage.
### Out-of-Scope Use
This model should not be used for:
- Making critical mental health decisions or diagnoses.
- Replacing professional mental health services.
- Applications where responses require regulatory compliance or are highly sensitive.
- Generating responses without human supervision, especially in contexts that involve vulnerable individuals.
## Bias, Risks, and Limitations
Misuse of this model could lead to it providing inappropriate or harmful responses, so it should not be deployed without proper safeguards in place.
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
## How to Get Started with the Model
WIP. A provisional sketch is shown below.
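Until official instructions land, here is a minimal sketch using `transformers`. It assumes the fine-tune keeps the chat template of the Qwen2.5-0.5B-Instruct base model; the example prompt is illustrative only and not a substitute for professional support.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mrs83/Kurtis-Qwen2.5-0.5B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

# Build the prompt via the tokenizer's chat template
messages = [{"role": "user", "content": "I've been feeling overwhelmed lately. Any advice?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```
|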
nteku1/Jon_GPT2L_PPO_epi_point1 | nteku1 | 2024-11-24T01:13:32Z | 45 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"trl",
"ppo",
"reinforcement-learning",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | reinforcement-learning | 2024-11-22T23:13:58Z | ---
license: apache-2.0
tags:
- trl
- ppo
- transformers
- reinforcement-learning
---
# TRL Model
This is a [TRL language model](https://github.com/huggingface/trl) that has been fine-tuned with reinforcement learning to
guide the model outputs according to a value function or human feedback. The model can be used for text generation.
## Usage
To use this model for inference, first install the TRL library:
```bash
python -m pip install trl
```
You can then generate text as follows:
```python
from transformers import pipeline
generator = pipeline("text-generation", model="nteku1/Jon_GPT2L_PPO_epi_point1")
outputs = generator("Hello, my llama is cute")
```
If you want to use the model for training or to obtain the outputs from the value head, load the model as follows:
```python
from transformers import AutoTokenizer
from trl import AutoModelForCausalLMWithValueHead
tokenizer = AutoTokenizer.from_pretrained("nteku1/Jon_GPT2L_PPO_epi_point1")
model = AutoModelForCausalLMWithValueHead.from_pretrained("nteku1/Jon_GPT2L_PPO_epi_point1")
inputs = tokenizer("Hello, my llama is cute", return_tensors="pt")
outputs = model(**inputs, labels=inputs["input_ids"])
```
|
hazyresearch/lolcats-llama-3.1-70b | hazyresearch | 2024-11-24T01:10:24Z | 6 | 4 | null | [
"en",
"region:us"
] | null | 2024-10-14T04:56:33Z | ---
language:
- en
---
This is a pure sub-quadratic linear attention 70B parameter model, linearized from the Meta Llama 3.1 70B base model.
Details on this model and how to train your own are provided at: https://github.com/HazyResearch/lolcats/tree/lolcats-scaled |
hazyresearch/lolcats-llama-3.1-8b-distill | hazyresearch | 2024-11-24T01:09:31Z | 8 | 14 | null | [
"text-generation",
"en",
"arxiv:2410.10254",
"region:us"
] | text-generation | 2024-10-13T21:43:20Z | ---
pipeline_tag: text-generation
language:
- en
---
This is a pure sub-quadratic linear attention 8B parameter model, linearized from the Meta Llama 3.1 8B model.
Details on this model and how to train your own are provided at: https://github.com/HazyResearch/lolcats/tree/lolcats-scaled
## Demo
Here is a quick [GitHub GIST](https://gist.github.com/ariG23498/45b0c2afc95ca4c4b7cf64fbc161c1e7) that will help you run inference on the model checkpoints.
## Paper
See the paper page: https://huggingface.co/papers/2410.10254 |
swsqy/flux3dm | swsqy | 2024-11-24T01:04:33Z | 17 | 1 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"region:us"
] | text-to-image | 2024-11-24T00:57:10Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: ewvew
output:
url: images/generated-image (1).png
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: cec
---
# cvewdc
<Gallery />
## Trigger words
You should use `cec` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/swsqy/flux3dm/tree/main) them in the Files & versions tab.
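A minimal sketch for trying the LoRA with `diffusers` is shown below. It assumes the repository's Safetensors file loads via the standard `load_lora_weights` API; the prompt and sampler values are illustrative, not values recommended by the author.

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.load_lora_weights("swsqy/flux3dm")  # pulls the LoRA weights from this repo
pipe.to("cuda")

# `cec` is the trigger word for this LoRA
image = pipe("cec, a clean 3d render", num_inference_steps=28, guidance_scale=3.5).images[0]
image.save("out.png")
```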
|
mrs83/Kurtis-SmolLM2-360M-Instruct-GGUF | mrs83 | 2024-11-24T01:03:21Z | 13 | 0 | null | [
"gguf",
"text-generation",
"en",
"dataset:mrs83/kurtis_mental_health_final",
"base_model:HuggingFaceTB/SmolLM2-360M-Instruct",
"base_model:quantized:HuggingFaceTB/SmolLM2-360M-Instruct",
"license:mit",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-19T23:39:18Z | ---
license: mit
datasets:
- mrs83/kurtis_mental_health_final
language:
- en
base_model:
- HuggingFaceTB/SmolLM2-360M-Instruct
pipeline_tag: text-generation
---
# Model Card for Kurtis-SmolLM2-360M-Instruct-GGUF
This model has been fine-tuned using Kurtis, an experimental fine-tuning, inference and evaluation tool for Small Language Models.
## Model Details
### Model Description
- **Developed by:** Massimo R. Scamarcia <[email protected]>
- **Funded by:** Massimo R. Scamarcia <[email protected]> - (self-funded)
- **Shared by:** Massimo R. Scamarcia <[email protected]>
- **Model type:** Transformer decoder
- **Language(s) (NLP):** English
- **License:** MIT
- **Finetuned from model:** HuggingFaceTB/SmolLM2-360M-Instruct
### Model Sources
- **Repository:** [https://github.com/mrs83/kurtis](https://github.com/mrs83/kurtis)
## Uses
The model is intended for use in a conversational setting, particularly in mental health and therapeutic support scenarios.
### Direct Use
Not suitable for production usage.
### Out-of-Scope Use
This model should not be used for:
- Making critical mental health decisions or diagnoses.
- Replacing professional mental health services.
- Applications where responses require regulatory compliance or are highly sensitive.
- Generating responses without human supervision, especially in contexts that involve vulnerable individuals.
## Bias, Risks, and Limitations
Misuse of this model could lead to it providing inappropriate or harmful responses, so it should not be deployed without proper safeguards in place.
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
## How to Get Started with the Model
```
ollama run hf.co/mrs83/Kurtis-SmolLM2-360M-Instruct-GGUF
```
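Alternatively, a minimal sketch with `llama-cpp-python` (the GGUF filename below is a placeholder; use the quant file actually present in this repo):

```python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="mrs83/Kurtis-SmolLM2-360M-Instruct-GGUF",
    filename="*.gguf",  # placeholder glob; replace with the exact quant file name
    n_ctx=2048,
)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "I feel anxious about work. Can we talk?"}]
)
print(out["choices"][0]["message"]["content"])
```
|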
MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold1 | MayBashendy | 2024-11-24T00:57:11Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T05:21:17Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold1
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0233
- Qwk: 0.4858
- Mse: 1.0233
- Rmse: 1.0116
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
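Expressed as code, these settings correspond roughly to the following `TrainingArguments` sketch. The dataset and `Trainer` wiring are omitted, and the output directory name is an assumption, since the card does not ship the training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold1",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```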
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.0047 | 2 | 9.3843 | 0.0016 | 9.3843 | 3.0634 |
| No log | 0.0093 | 4 | 7.6902 | 0.0 | 7.6902 | 2.7731 |
| No log | 0.0140 | 6 | 6.2906 | 0.0 | 6.2906 | 2.5081 |
| No log | 0.0186 | 8 | 4.9654 | 0.0320 | 4.9654 | 2.2283 |
| No log | 0.0233 | 10 | 3.6594 | 0.0 | 3.6594 | 1.9130 |
| No log | 0.0280 | 12 | 2.9328 | 0.0 | 2.9328 | 1.7126 |
| No log | 0.0326 | 14 | 2.1220 | 0.1345 | 2.1220 | 1.4567 |
| No log | 0.0373 | 16 | 1.6280 | 0.0482 | 1.6280 | 1.2759 |
| No log | 0.0420 | 18 | 1.2820 | 0.0 | 1.2820 | 1.1322 |
| No log | 0.0466 | 20 | 1.0755 | 0.0 | 1.0755 | 1.0371 |
| No log | 0.0513 | 22 | 0.9013 | 0.2768 | 0.9013 | 0.9494 |
| No log | 0.0559 | 24 | 0.8756 | 0.0943 | 0.8756 | 0.9357 |
| No log | 0.0606 | 26 | 0.9791 | 0.0286 | 0.9791 | 0.9895 |
| No log | 0.0653 | 28 | 0.9280 | 0.0429 | 0.9280 | 0.9633 |
| No log | 0.0699 | 30 | 0.9766 | 0.0286 | 0.9766 | 0.9882 |
| No log | 0.0746 | 32 | 1.3594 | 0.0 | 1.3594 | 1.1659 |
| No log | 0.0793 | 34 | 1.2967 | 0.0 | 1.2967 | 1.1387 |
| No log | 0.0839 | 36 | 1.2188 | 0.0 | 1.2188 | 1.1040 |
| No log | 0.0886 | 38 | 1.0623 | 0.0 | 1.0623 | 1.0307 |
| No log | 0.0932 | 40 | 1.1963 | 0.0 | 1.1963 | 1.0937 |
| No log | 0.0979 | 42 | 1.2359 | 0.0 | 1.2359 | 1.1117 |
| No log | 0.1026 | 44 | 1.1297 | 0.0 | 1.1297 | 1.0629 |
| No log | 0.1072 | 46 | 1.1432 | 0.0 | 1.1432 | 1.0692 |
| No log | 0.1119 | 48 | 1.1393 | 0.0 | 1.1393 | 1.0674 |
| No log | 0.1166 | 50 | 0.9199 | 0.0143 | 0.9199 | 0.9591 |
| No log | 0.1212 | 52 | 0.8848 | 0.0143 | 0.8848 | 0.9407 |
| No log | 0.1259 | 54 | 1.0594 | 0.0143 | 1.0594 | 1.0292 |
| No log | 0.1305 | 56 | 1.0446 | 0.0143 | 1.0446 | 1.0221 |
| No log | 0.1352 | 58 | 0.9308 | 0.0143 | 0.9308 | 0.9648 |
| No log | 0.1399 | 60 | 0.9824 | 0.0143 | 0.9824 | 0.9911 |
| No log | 0.1445 | 62 | 1.1525 | 0.0 | 1.1525 | 1.0736 |
| No log | 0.1492 | 64 | 1.1400 | 0.0 | 1.1400 | 1.0677 |
| No log | 0.1538 | 66 | 0.9839 | 0.0073 | 0.9839 | 0.9919 |
| No log | 0.1585 | 68 | 0.9852 | 0.0036 | 0.9852 | 0.9926 |
| No log | 0.1632 | 70 | 1.1657 | 0.0 | 1.1657 | 1.0797 |
| No log | 0.1678 | 72 | 1.2796 | 0.0 | 1.2796 | 1.1312 |
| No log | 0.1725 | 74 | 1.2845 | 0.0 | 1.2845 | 1.1333 |
| No log | 0.1772 | 76 | 1.0578 | 0.0143 | 1.0578 | 1.0285 |
| No log | 0.1818 | 78 | 0.8877 | 0.0179 | 0.8877 | 0.9422 |
| No log | 0.1865 | 80 | 0.8662 | 0.0358 | 0.8662 | 0.9307 |
| No log | 0.1911 | 82 | 0.9209 | 0.0143 | 0.9209 | 0.9596 |
| No log | 0.1958 | 84 | 1.1707 | 0.0 | 1.1707 | 1.0820 |
| No log | 0.2005 | 86 | 1.3210 | 0.0262 | 1.3210 | 1.1494 |
| No log | 0.2051 | 88 | 1.4341 | 0.2121 | 1.4341 | 1.1975 |
| No log | 0.2098 | 90 | 1.3311 | 0.0070 | 1.3311 | 1.1537 |
| No log | 0.2145 | 92 | 1.1069 | 0.0 | 1.1069 | 1.0521 |
| No log | 0.2191 | 94 | 0.9273 | 0.0 | 0.9273 | 0.9630 |
| No log | 0.2238 | 96 | 0.8827 | 0.0323 | 0.8827 | 0.9395 |
| No log | 0.2284 | 98 | 0.8738 | 0.0252 | 0.8738 | 0.9348 |
| No log | 0.2331 | 100 | 0.9496 | 0.0 | 0.9496 | 0.9745 |
| No log | 0.2378 | 102 | 1.0099 | 0.0 | 1.0099 | 1.0050 |
| No log | 0.2424 | 104 | 1.0647 | 0.0 | 1.0647 | 1.0319 |
| No log | 0.2471 | 106 | 0.9915 | 0.0 | 0.9915 | 0.9958 |
| No log | 0.2517 | 108 | 1.0097 | 0.0382 | 1.0097 | 1.0048 |
| No log | 0.2564 | 110 | 1.0028 | 0.1248 | 1.0028 | 1.0014 |
| No log | 0.2611 | 112 | 0.9278 | 0.1526 | 0.9278 | 0.9632 |
| No log | 0.2657 | 114 | 0.9710 | 0.3092 | 0.9710 | 0.9854 |
| No log | 0.2704 | 116 | 0.8717 | 0.2976 | 0.8717 | 0.9336 |
| No log | 0.2751 | 118 | 0.7836 | 0.2718 | 0.7836 | 0.8852 |
| No log | 0.2797 | 120 | 0.7954 | 0.3127 | 0.7954 | 0.8918 |
| No log | 0.2844 | 122 | 0.9252 | 0.3546 | 0.9252 | 0.9619 |
| No log | 0.2890 | 124 | 0.8700 | 0.3571 | 0.8700 | 0.9327 |
| No log | 0.2937 | 126 | 0.7403 | 0.3369 | 0.7403 | 0.8604 |
| No log | 0.2984 | 128 | 0.7369 | 0.3058 | 0.7369 | 0.8584 |
| No log | 0.3030 | 130 | 0.7438 | 0.2744 | 0.7438 | 0.8625 |
| No log | 0.3077 | 132 | 0.7503 | 0.2813 | 0.7503 | 0.8662 |
| No log | 0.3124 | 134 | 0.6996 | 0.3109 | 0.6996 | 0.8364 |
| No log | 0.3170 | 136 | 0.6582 | 0.3700 | 0.6582 | 0.8113 |
| No log | 0.3217 | 138 | 0.6768 | 0.3613 | 0.6768 | 0.8227 |
| No log | 0.3263 | 140 | 0.6670 | 0.3977 | 0.6670 | 0.8167 |
| No log | 0.3310 | 142 | 0.6407 | 0.4164 | 0.6407 | 0.8004 |
| No log | 0.3357 | 144 | 0.6279 | 0.4588 | 0.6279 | 0.7924 |
| No log | 0.3403 | 146 | 0.6372 | 0.4161 | 0.6372 | 0.7983 |
| No log | 0.3450 | 148 | 0.6649 | 0.4023 | 0.6649 | 0.8154 |
| No log | 0.3497 | 150 | 0.6939 | 0.3856 | 0.6939 | 0.8330 |
| No log | 0.3543 | 152 | 0.6779 | 0.3844 | 0.6779 | 0.8234 |
| No log | 0.3590 | 154 | 0.8422 | 0.3675 | 0.8422 | 0.9177 |
| No log | 0.3636 | 156 | 0.8417 | 0.3777 | 0.8417 | 0.9174 |
| No log | 0.3683 | 158 | 0.7826 | 0.3775 | 0.7826 | 0.8846 |
| No log | 0.3730 | 160 | 0.6487 | 0.3777 | 0.6487 | 0.8054 |
| No log | 0.3776 | 162 | 0.5992 | 0.4299 | 0.5992 | 0.7740 |
| No log | 0.3823 | 164 | 0.6050 | 0.4694 | 0.6050 | 0.7778 |
| No log | 0.3869 | 166 | 0.6142 | 0.4970 | 0.6142 | 0.7837 |
| No log | 0.3916 | 168 | 0.5750 | 0.4569 | 0.5750 | 0.7583 |
| No log | 0.3963 | 170 | 0.6277 | 0.5037 | 0.6277 | 0.7923 |
| No log | 0.4009 | 172 | 0.7736 | 0.4456 | 0.7736 | 0.8795 |
| No log | 0.4056 | 174 | 0.9673 | 0.4133 | 0.9673 | 0.9835 |
| No log | 0.4103 | 176 | 0.8049 | 0.4295 | 0.8049 | 0.8972 |
| No log | 0.4149 | 178 | 0.7364 | 0.4152 | 0.7364 | 0.8581 |
| No log | 0.4196 | 180 | 0.7654 | 0.4253 | 0.7654 | 0.8749 |
| No log | 0.4242 | 182 | 0.9275 | 0.4106 | 0.9275 | 0.9631 |
| No log | 0.4289 | 184 | 0.8539 | 0.4148 | 0.8539 | 0.9241 |
| No log | 0.4336 | 186 | 0.6760 | 0.4512 | 0.6760 | 0.8222 |
| No log | 0.4382 | 188 | 0.6397 | 0.4239 | 0.6397 | 0.7998 |
| No log | 0.4429 | 190 | 0.7127 | 0.4156 | 0.7127 | 0.8442 |
| No log | 0.4476 | 192 | 0.6885 | 0.4279 | 0.6885 | 0.8298 |
| No log | 0.4522 | 194 | 0.6358 | 0.4176 | 0.6358 | 0.7974 |
| No log | 0.4569 | 196 | 0.6393 | 0.4404 | 0.6393 | 0.7996 |
| No log | 0.4615 | 198 | 0.6466 | 0.4403 | 0.6466 | 0.8041 |
| No log | 0.4662 | 200 | 0.7371 | 0.4441 | 0.7371 | 0.8585 |
| No log | 0.4709 | 202 | 0.6930 | 0.4592 | 0.6930 | 0.8325 |
| No log | 0.4755 | 204 | 0.6651 | 0.4764 | 0.6651 | 0.8156 |
| No log | 0.4802 | 206 | 0.6216 | 0.4874 | 0.6216 | 0.7884 |
| No log | 0.4848 | 208 | 0.6288 | 0.4498 | 0.6288 | 0.7930 |
| No log | 0.4895 | 210 | 0.7087 | 0.3337 | 0.7087 | 0.8418 |
| No log | 0.4942 | 212 | 0.6133 | 0.4225 | 0.6133 | 0.7831 |
| No log | 0.4988 | 214 | 0.5628 | 0.5285 | 0.5628 | 0.7502 |
| No log | 0.5035 | 216 | 0.5629 | 0.5312 | 0.5629 | 0.7503 |
| No log | 0.5082 | 218 | 0.5692 | 0.5199 | 0.5692 | 0.7545 |
| No log | 0.5128 | 220 | 0.5704 | 0.5139 | 0.5704 | 0.7553 |
| No log | 0.5175 | 222 | 0.6406 | 0.4129 | 0.6406 | 0.8003 |
| No log | 0.5221 | 224 | 0.6120 | 0.4437 | 0.6120 | 0.7823 |
| No log | 0.5268 | 226 | 0.5876 | 0.5391 | 0.5876 | 0.7666 |
| No log | 0.5315 | 228 | 0.7083 | 0.5018 | 0.7083 | 0.8416 |
| No log | 0.5361 | 230 | 0.5759 | 0.5407 | 0.5759 | 0.7589 |
| No log | 0.5408 | 232 | 0.5650 | 0.4865 | 0.5650 | 0.7517 |
| No log | 0.5455 | 234 | 0.5471 | 0.5400 | 0.5471 | 0.7397 |
| No log | 0.5501 | 236 | 0.6479 | 0.4973 | 0.6479 | 0.8049 |
| No log | 0.5548 | 238 | 0.6382 | 0.5071 | 0.6382 | 0.7989 |
| No log | 0.5594 | 240 | 0.6395 | 0.5236 | 0.6395 | 0.7997 |
| No log | 0.5641 | 242 | 0.5608 | 0.5356 | 0.5608 | 0.7489 |
| No log | 0.5688 | 244 | 0.6164 | 0.5473 | 0.6164 | 0.7851 |
| No log | 0.5734 | 246 | 0.5722 | 0.5658 | 0.5722 | 0.7564 |
| No log | 0.5781 | 248 | 0.5177 | 0.5355 | 0.5177 | 0.7195 |
| No log | 0.5828 | 250 | 0.5329 | 0.5189 | 0.5329 | 0.7300 |
| No log | 0.5874 | 252 | 0.5397 | 0.5682 | 0.5397 | 0.7347 |
| No log | 0.5921 | 254 | 0.7793 | 0.4886 | 0.7793 | 0.8828 |
| No log | 0.5967 | 256 | 0.7562 | 0.4960 | 0.7562 | 0.8696 |
| No log | 0.6014 | 258 | 0.5510 | 0.5218 | 0.5510 | 0.7423 |
| No log | 0.6061 | 260 | 0.5833 | 0.4736 | 0.5833 | 0.7637 |
| No log | 0.6107 | 262 | 0.5789 | 0.4672 | 0.5789 | 0.7608 |
| No log | 0.6154 | 264 | 0.5632 | 0.5213 | 0.5632 | 0.7505 |
| No log | 0.6200 | 266 | 0.8972 | 0.4317 | 0.8972 | 0.9472 |
| No log | 0.6247 | 268 | 1.2352 | 0.3382 | 1.2352 | 1.1114 |
| No log | 0.6294 | 270 | 1.1639 | 0.3660 | 1.1639 | 1.0788 |
| No log | 0.6340 | 272 | 0.8008 | 0.4440 | 0.8008 | 0.8949 |
| No log | 0.6387 | 274 | 0.5685 | 0.5206 | 0.5685 | 0.7540 |
| No log | 0.6434 | 276 | 0.5656 | 0.5081 | 0.5656 | 0.7521 |
| No log | 0.6480 | 278 | 0.5751 | 0.5230 | 0.5751 | 0.7584 |
| No log | 0.6527 | 280 | 0.6808 | 0.4899 | 0.6808 | 0.8251 |
| No log | 0.6573 | 282 | 0.8706 | 0.4267 | 0.8706 | 0.9330 |
| No log | 0.6620 | 284 | 0.7656 | 0.4588 | 0.7656 | 0.8750 |
| No log | 0.6667 | 286 | 0.5443 | 0.5958 | 0.5443 | 0.7378 |
| No log | 0.6713 | 288 | 0.5226 | 0.5962 | 0.5226 | 0.7229 |
| No log | 0.6760 | 290 | 0.5095 | 0.6071 | 0.5095 | 0.7138 |
| No log | 0.6807 | 292 | 0.5768 | 0.5801 | 0.5768 | 0.7595 |
| No log | 0.6853 | 294 | 0.5851 | 0.5650 | 0.5851 | 0.7649 |
| No log | 0.6900 | 296 | 0.5128 | 0.6047 | 0.5128 | 0.7161 |
| No log | 0.6946 | 298 | 0.5089 | 0.5803 | 0.5089 | 0.7134 |
| No log | 0.6993 | 300 | 0.5085 | 0.5734 | 0.5085 | 0.7131 |
| No log | 0.7040 | 302 | 0.5283 | 0.5689 | 0.5283 | 0.7269 |
| No log | 0.7086 | 304 | 0.7961 | 0.4627 | 0.7961 | 0.8922 |
| No log | 0.7133 | 306 | 0.9707 | 0.4049 | 0.9707 | 0.9853 |
| No log | 0.7179 | 308 | 0.7990 | 0.4473 | 0.7990 | 0.8939 |
| No log | 0.7226 | 310 | 0.6139 | 0.4528 | 0.6139 | 0.7835 |
| No log | 0.7273 | 312 | 0.5635 | 0.5261 | 0.5635 | 0.7507 |
| No log | 0.7319 | 314 | 0.5683 | 0.5282 | 0.5683 | 0.7539 |
| No log | 0.7366 | 316 | 0.5644 | 0.5540 | 0.5644 | 0.7513 |
| No log | 0.7413 | 318 | 0.5096 | 0.5971 | 0.5096 | 0.7138 |
| No log | 0.7459 | 320 | 0.5512 | 0.5775 | 0.5512 | 0.7424 |
| No log | 0.7506 | 322 | 0.5975 | 0.5582 | 0.5975 | 0.7730 |
| No log | 0.7552 | 324 | 0.6028 | 0.5605 | 0.6028 | 0.7764 |
| No log | 0.7599 | 326 | 0.6276 | 0.5439 | 0.6276 | 0.7922 |
| No log | 0.7646 | 328 | 0.6967 | 0.5367 | 0.6967 | 0.8347 |
| No log | 0.7692 | 330 | 0.7977 | 0.5171 | 0.7977 | 0.8931 |
| No log | 0.7739 | 332 | 0.9367 | 0.4793 | 0.9367 | 0.9678 |
| No log | 0.7786 | 334 | 0.6564 | 0.5722 | 0.6564 | 0.8102 |
| No log | 0.7832 | 336 | 0.5104 | 0.6285 | 0.5104 | 0.7144 |
| No log | 0.7879 | 338 | 0.5155 | 0.6268 | 0.5155 | 0.7180 |
| No log | 0.7925 | 340 | 0.5791 | 0.6236 | 0.5791 | 0.7610 |
| No log | 0.7972 | 342 | 0.6672 | 0.5843 | 0.6672 | 0.8168 |
| No log | 0.8019 | 344 | 0.6547 | 0.5891 | 0.6547 | 0.8092 |
| No log | 0.8065 | 346 | 0.5056 | 0.6337 | 0.5056 | 0.7110 |
| No log | 0.8112 | 348 | 0.4891 | 0.6295 | 0.4891 | 0.6994 |
| No log | 0.8159 | 350 | 0.4856 | 0.6249 | 0.4856 | 0.6968 |
| No log | 0.8205 | 352 | 0.4916 | 0.6220 | 0.4916 | 0.7011 |
| No log | 0.8252 | 354 | 0.4930 | 0.6282 | 0.4930 | 0.7021 |
| No log | 0.8298 | 356 | 0.5044 | 0.6097 | 0.5044 | 0.7102 |
| No log | 0.8345 | 358 | 0.5103 | 0.6353 | 0.5103 | 0.7144 |
| No log | 0.8392 | 360 | 0.5378 | 0.6343 | 0.5378 | 0.7333 |
| No log | 0.8438 | 362 | 0.5636 | 0.6194 | 0.5636 | 0.7507 |
| No log | 0.8485 | 364 | 0.7472 | 0.5763 | 0.7472 | 0.8644 |
| No log | 0.8531 | 366 | 0.7252 | 0.5791 | 0.7252 | 0.8516 |
| No log | 0.8578 | 368 | 0.5898 | 0.6122 | 0.5898 | 0.7680 |
| No log | 0.8625 | 370 | 0.7453 | 0.5799 | 0.7453 | 0.8633 |
| No log | 0.8671 | 372 | 0.6918 | 0.5720 | 0.6918 | 0.8317 |
| No log | 0.8718 | 374 | 0.5452 | 0.6077 | 0.5452 | 0.7384 |
| No log | 0.8765 | 376 | 0.5438 | 0.5906 | 0.5438 | 0.7374 |
| No log | 0.8811 | 378 | 0.5310 | 0.5871 | 0.5310 | 0.7287 |
| No log | 0.8858 | 380 | 0.5805 | 0.5836 | 0.5805 | 0.7619 |
| No log | 0.8904 | 382 | 0.8195 | 0.5321 | 0.8195 | 0.9053 |
| No log | 0.8951 | 384 | 0.7126 | 0.5589 | 0.7126 | 0.8442 |
| No log | 0.8998 | 386 | 0.5401 | 0.6017 | 0.5401 | 0.7349 |
| No log | 0.9044 | 388 | 0.5559 | 0.6210 | 0.5559 | 0.7456 |
| No log | 0.9091 | 390 | 0.5439 | 0.5992 | 0.5439 | 0.7375 |
| No log | 0.9138 | 392 | 0.6774 | 0.5525 | 0.6774 | 0.8230 |
| No log | 0.9184 | 394 | 0.7939 | 0.4982 | 0.7939 | 0.8910 |
| No log | 0.9231 | 396 | 0.7069 | 0.5294 | 0.7069 | 0.8408 |
| No log | 0.9277 | 398 | 0.5890 | 0.5907 | 0.5890 | 0.7675 |
| No log | 0.9324 | 400 | 0.5975 | 0.5906 | 0.5975 | 0.7730 |
| No log | 0.9371 | 402 | 0.6703 | 0.5743 | 0.6703 | 0.8187 |
| No log | 0.9417 | 404 | 0.8185 | 0.5545 | 0.8185 | 0.9047 |
| No log | 0.9464 | 406 | 0.6894 | 0.5771 | 0.6894 | 0.8303 |
| No log | 0.9510 | 408 | 0.6050 | 0.6040 | 0.6050 | 0.7778 |
| No log | 0.9557 | 410 | 0.5090 | 0.6361 | 0.5090 | 0.7134 |
| No log | 0.9604 | 412 | 0.5219 | 0.6351 | 0.5219 | 0.7224 |
| No log | 0.9650 | 414 | 0.5180 | 0.6548 | 0.5180 | 0.7197 |
| No log | 0.9697 | 416 | 0.5676 | 0.6551 | 0.5676 | 0.7534 |
| No log | 0.9744 | 418 | 0.5886 | 0.6483 | 0.5886 | 0.7672 |
| No log | 0.9790 | 420 | 0.5261 | 0.6469 | 0.5261 | 0.7253 |
| No log | 0.9837 | 422 | 0.5870 | 0.6288 | 0.5870 | 0.7662 |
| No log | 0.9883 | 424 | 0.7002 | 0.5809 | 0.7002 | 0.8368 |
| No log | 0.9930 | 426 | 0.5667 | 0.6122 | 0.5667 | 0.7528 |
| No log | 0.9977 | 428 | 0.5623 | 0.6555 | 0.5623 | 0.7499 |
| No log | 1.0023 | 430 | 0.7268 | 0.5898 | 0.7268 | 0.8525 |
| No log | 1.0070 | 432 | 0.6592 | 0.5963 | 0.6592 | 0.8119 |
| No log | 1.0117 | 434 | 0.6531 | 0.5806 | 0.6531 | 0.8081 |
| No log | 1.0163 | 436 | 0.6722 | 0.5720 | 0.6722 | 0.8199 |
| No log | 1.0210 | 438 | 0.6533 | 0.5393 | 0.6533 | 0.8083 |
| No log | 1.0256 | 440 | 0.6333 | 0.5669 | 0.6333 | 0.7958 |
| No log | 1.0303 | 442 | 0.7562 | 0.5593 | 0.7562 | 0.8696 |
| No log | 1.0350 | 444 | 0.7881 | 0.5553 | 0.7881 | 0.8877 |
| No log | 1.0396 | 446 | 0.5655 | 0.6375 | 0.5655 | 0.7520 |
| No log | 1.0443 | 448 | 0.5010 | 0.6453 | 0.5010 | 0.7078 |
| No log | 1.0490 | 450 | 0.5040 | 0.6472 | 0.5040 | 0.7099 |
| No log | 1.0536 | 452 | 0.6296 | 0.6041 | 0.6296 | 0.7935 |
| No log | 1.0583 | 454 | 0.7043 | 0.5849 | 0.7043 | 0.8392 |
| No log | 1.0629 | 456 | 0.5669 | 0.6263 | 0.5669 | 0.7529 |
| No log | 1.0676 | 458 | 0.4899 | 0.6481 | 0.4899 | 0.7000 |
| No log | 1.0723 | 460 | 0.5168 | 0.6089 | 0.5168 | 0.7189 |
| No log | 1.0769 | 462 | 0.5086 | 0.6155 | 0.5086 | 0.7132 |
| No log | 1.0816 | 464 | 0.5141 | 0.6416 | 0.5141 | 0.7170 |
| No log | 1.0862 | 466 | 0.6181 | 0.6143 | 0.6181 | 0.7862 |
| No log | 1.0909 | 468 | 0.6788 | 0.6171 | 0.6788 | 0.8239 |
| No log | 1.0956 | 470 | 0.5832 | 0.6188 | 0.5832 | 0.7637 |
| No log | 1.1002 | 472 | 0.5803 | 0.6331 | 0.5803 | 0.7618 |
| No log | 1.1049 | 474 | 0.5935 | 0.6359 | 0.5935 | 0.7704 |
| No log | 1.1096 | 476 | 0.5957 | 0.6198 | 0.5957 | 0.7718 |
| No log | 1.1142 | 478 | 0.5229 | 0.6492 | 0.5229 | 0.7231 |
| No log | 1.1189 | 480 | 0.5061 | 0.6446 | 0.5061 | 0.7114 |
| No log | 1.1235 | 482 | 0.5001 | 0.6397 | 0.5001 | 0.7071 |
| No log | 1.1282 | 484 | 0.5478 | 0.6432 | 0.5478 | 0.7402 |
| No log | 1.1329 | 486 | 0.5905 | 0.6402 | 0.5905 | 0.7684 |
| No log | 1.1375 | 488 | 0.5022 | 0.6697 | 0.5022 | 0.7086 |
| No log | 1.1422 | 490 | 0.5098 | 0.6487 | 0.5098 | 0.7140 |
| No log | 1.1469 | 492 | 0.4983 | 0.6456 | 0.4983 | 0.7059 |
| No log | 1.1515 | 494 | 0.4993 | 0.6632 | 0.4993 | 0.7066 |
| No log | 1.1562 | 496 | 0.5403 | 0.6554 | 0.5403 | 0.7350 |
| No log | 1.1608 | 498 | 0.5965 | 0.6392 | 0.5965 | 0.7723 |
| 0.8772 | 1.1655 | 500 | 0.5477 | 0.6326 | 0.5477 | 0.7401 |
| 0.8772 | 1.1702 | 502 | 0.6351 | 0.6098 | 0.6351 | 0.7969 |
| 0.8772 | 1.1748 | 504 | 0.5885 | 0.6078 | 0.5885 | 0.7672 |
| 0.8772 | 1.1795 | 506 | 0.7234 | 0.5930 | 0.7234 | 0.8506 |
| 0.8772 | 1.1841 | 508 | 1.1260 | 0.4874 | 1.1260 | 1.0611 |
| 0.8772 | 1.1888 | 510 | 1.0233 | 0.4858 | 1.0233 | 1.0116 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
CalamitousFelicitousness/EVA-Qwen2.5-72B-v0.2-GPTQ-8Bit | CalamitousFelicitousness | 2024-11-24T00:56:55Z | 6 | 2 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"conversational",
"dataset:anthracite-org/kalo-opus-instruct-22k-no-refusal",
"dataset:Nopm/Opus_WritingStruct",
"dataset:Gryphe/Sonnet3.5-SlimOrcaDedupCleaned",
"dataset:Gryphe/Sonnet3.5-Charcard-Roleplay",
"dataset:Gryphe/ChatGPT-4o-Writing-Prompts",
"dataset:Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
"dataset:Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
"dataset:nothingiisreal/Reddit-Dirty-And-WritingPrompts",
"dataset:allura-org/Celeste-1.x-data-mixture",
"dataset:cognitivecomputations/dolphin-2.9.3",
"base_model:Qwen/Qwen2.5-72B",
"base_model:quantized:Qwen/Qwen2.5-72B",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"8-bit",
"gptq",
"region:us"
] | text-generation | 2024-11-24T00:41:24Z | ---
library_name: transformers
license: other
license_name: qwen
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
base_model: Qwen/Qwen2.5-72B
datasets:
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- Nopm/Opus_WritingStruct
- Gryphe/Sonnet3.5-SlimOrcaDedupCleaned
- Gryphe/Sonnet3.5-Charcard-Roleplay
- Gryphe/ChatGPT-4o-Writing-Prompts
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- nothingiisreal/Reddit-Dirty-And-WritingPrompts
- allura-org/Celeste-1.x-data-mixture
- cognitivecomputations/dolphin-2.9.3
tags:
- generated_from_trainer
model-index:
- name: EVA-Qwen2.5-72B-SFFT-v0.2
results: []
---
# EVA Qwen2.5-72B v0.2
<p>
An RP/storywriting specialist model, a full-parameter finetune of Qwen2.5-72B on a mixture of synthetic and natural data.<br>
It uses the Celeste 70B 0.1 data mixture, greatly expanding it to improve the versatility, creativity and "flavor" of the resulting model.<br>
</p>
<p>Dedicated to Nev.</p>
<p><b>NOTE: LLM-Compressor quants don't seem to work correctly; quality seems to be much worse than normal. This wasn't the case with previous versions. GGUF and GPTQ seem to be unaffected.</b></p>
<br>
<p><b>Version notes for 0.2</b>: Optimized training hyperparameters and increased sequence length. Better instruction following deeper into context and less repetition.</p>
<p>
<p>Prompt format is ChatML.</p><br>
<h3>Recommended sampler values:</h3>
<ul>
<li>Temperature: 0.8</li>
<li>Min-P: 0.05</li>
<li>Top-A: 0.3</li>
<li>Repetition Penalty: 1.03</li>
</ul>
<h3>Recommended SillyTavern preset (via CalamitousFelicitousness):</h3>
<ul><li><a href="EV01.json">Master import</a></li></ul>
</p>
<p>
<br>
<h3>
Training data:
</h3>
<ul>
        <li>Celeste 70B 0.1 data mixture minus the Opus Instruct subset. See that model's <a href="https://huggingface.co/nothingiisreal/L3.1-70B-Celeste-V0.1-BF16">card</a> for details.</li>
<li>Kalomaze's Opus_Instruct_25k dataset, filtered for refusals.</li>
<li>A subset (1k rows) of ChatGPT-4o-WritingPrompts by Gryphe</li>
<li>A subset (2k rows) of Sonnet3.5-Charcards-Roleplay by Gryphe</li>
<li>Synthstruct and SynthRP datasets by Epiculous</li>
        <li>A subset from Dolphin-2.9.3, including a filtered version of not_samantha and a small subset of systemchat.</li>
</ul>
<h3>
Training time and hardware:
</h3>
<ul><li>17 hours on 8xH100 SXM</li></ul><br>
</p>
<p>The model was created by Kearm, Auri and Cahvay.</p>
<h4>Special thanks:</h4><ul>
<li>to Featherless for sponsoring this run</li>
<li>to Cahvay for his work on investigating and reprocessing the corrupted dataset, removing the single biggest source of data poisoning.</li>
<li>to Gryphe, Lemmy, Kalomaze, Nopm, Epiculous and CognitiveComputations for the data</li>
<li>and to Allura-org for support, feedback, beta-testing and doing quality control of EVA models.</li></ul>
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
base_model: Qwen/Qwen2.5-72B
load_in_8bit: false
load_in_4bit: false
strict: false
plugins:
- axolotl.integrations.liger.LigerPlugin
liger_rope: true
liger_rms_norm: true
liger_swiglu: true
liger_fused_linear_cross_entropy: true
# plugins:
# - axolotl.integrations.spectrum.SpectrumPlugin
# spectrum_top_fraction: 0.5
# # Optional if using a pre-scanned model as your base_model. Useful if using a model mirror
# spectrum_model_name: Qwen/Qwen2.5-32B
datasets:
- path: datasets/Celeste_Filtered_utf8fix.jsonl
type: sharegpt
- path: datasets/deduped_not_samantha_norefusals.jsonl
type: sharegpt
- path: datasets/deduped_SynthRP-Gens_processed_ShareGPT_converted_cleaned.jsonl
type: sharegpt
- path: datasets/deduped_Synthstruct-Gens_processed_sharegpt_converted_cleaned.jsonl
type: sharegpt
- path: datasets/Gryphe-4o-WP-filtered-sharegpt_utf8fix.jsonl
type: sharegpt
- path: datasets/opus-instruct-22k-no_refusals-filtered_utf8fix.jsonl
type: sharegpt
- path: datasets/Sonnet3-5-charcard-names-filtered-sharegpt_utf8fix.jsonl
type: sharegpt
- path: datasets/SystemChat_subset_filtered_sharegpt_utf8fix.jsonl
type: sharegpt
chat_template: chatml
shuffle_merged_datasets: true
val_set_size: 0.001
output_dir: EVA-Qwen2.5-72B-SFFT-v0.2
sequence_len: 10240
sample_packing: true
eval_sample_packing: false
pad_to_sequence_len: false
# adapter: qlora
# lora_model_dir:
# lora_r: 64
# lora_alpha: 128
# lora_dropout: 0.05
# lora_target_linear: true
# peft_use_dora: true
unfrozen_parameters:
- ^lm_head.weight$
- ^model.embed_tokens.weight$
# mlp.down_proj layers
- model.layers.62.mlp.down_proj
- model.layers.64.mlp.down_proj
- model.layers.63.mlp.down_proj
- model.layers.66.mlp.down_proj
- model.layers.65.mlp.down_proj
- model.layers.67.mlp.down_proj
- model.layers.68.mlp.down_proj
- model.layers.31.mlp.down_proj
- model.layers.60.mlp.down_proj
- model.layers.69.mlp.down_proj
- model.layers.61.mlp.down_proj
- model.layers.59.mlp.down_proj
- model.layers.30.mlp.down_proj
- model.layers.70.mlp.down_proj
- model.layers.32.mlp.down_proj
- model.layers.34.mlp.down_proj
- model.layers.33.mlp.down_proj
- model.layers.76.mlp.down_proj
- model.layers.72.mlp.down_proj
- model.layers.71.mlp.down_proj
- model.layers.58.mlp.down_proj
- model.layers.75.mlp.down_proj
- model.layers.29.mlp.down_proj
- model.layers.56.mlp.down_proj
- model.layers.26.mlp.down_proj
- model.layers.35.mlp.down_proj
- model.layers.28.mlp.down_proj
- model.layers.57.mlp.down_proj
- model.layers.77.mlp.down_proj
- model.layers.36.mlp.down_proj
- model.layers.27.mlp.down_proj
- model.layers.25.mlp.down_proj
- model.layers.78.mlp.down_proj
- model.layers.37.mlp.down_proj
- model.layers.73.mlp.down_proj
- model.layers.55.mlp.down_proj
- model.layers.54.mlp.down_proj
- model.layers.74.mlp.down_proj
- model.layers.24.mlp.down_proj
- model.layers.53.mlp.down_proj
# mlp.gate_proj layers
- model.layers.78.mlp.gate_proj
- model.layers.77.mlp.gate_proj
- model.layers.76.mlp.gate_proj
- model.layers.79.mlp.gate_proj
- model.layers.75.mlp.gate_proj
- model.layers.74.mlp.gate_proj
- model.layers.73.mlp.gate_proj
- model.layers.72.mlp.gate_proj
- model.layers.71.mlp.gate_proj
- model.layers.70.mlp.gate_proj
- model.layers.69.mlp.gate_proj
- model.layers.57.mlp.gate_proj
- model.layers.54.mlp.gate_proj
- model.layers.55.mlp.gate_proj
- model.layers.68.mlp.gate_proj
- model.layers.63.mlp.gate_proj
- model.layers.53.mlp.gate_proj
- model.layers.44.mlp.gate_proj
- model.layers.45.mlp.gate_proj
- model.layers.49.mlp.gate_proj
- model.layers.58.mlp.gate_proj
- model.layers.46.mlp.gate_proj
- model.layers.56.mlp.gate_proj
- model.layers.67.mlp.gate_proj
- model.layers.62.mlp.gate_proj
- model.layers.50.mlp.gate_proj
- model.layers.64.mlp.gate_proj
- model.layers.52.mlp.gate_proj
- model.layers.40.mlp.gate_proj
- model.layers.43.mlp.gate_proj
- model.layers.48.mlp.gate_proj
- model.layers.66.mlp.gate_proj
- model.layers.47.mlp.gate_proj
- model.layers.59.mlp.gate_proj
- model.layers.65.mlp.gate_proj
- model.layers.61.mlp.gate_proj
- model.layers.60.mlp.gate_proj
- model.layers.42.mlp.gate_proj
- model.layers.51.mlp.gate_proj
- model.layers.41.mlp.gate_proj
# mlp.up_proj layers
- model.layers.70.mlp.up_proj
- model.layers.69.mlp.up_proj
- model.layers.71.mlp.up_proj
- model.layers.68.mlp.up_proj
- model.layers.72.mlp.up_proj
- model.layers.67.mlp.up_proj
- model.layers.66.mlp.up_proj
- model.layers.73.mlp.up_proj
- model.layers.46.mlp.up_proj
- model.layers.63.mlp.up_proj
- model.layers.75.mlp.up_proj
- model.layers.76.mlp.up_proj
- model.layers.74.mlp.up_proj
- model.layers.45.mlp.up_proj
- model.layers.62.mlp.up_proj
- model.layers.64.mlp.up_proj
- model.layers.65.mlp.up_proj
- model.layers.44.mlp.up_proj
- model.layers.53.mlp.up_proj
- model.layers.47.mlp.up_proj
- model.layers.49.mlp.up_proj
- model.layers.48.mlp.up_proj
- model.layers.57.mlp.up_proj
- model.layers.43.mlp.up_proj
- model.layers.42.mlp.up_proj
- model.layers.56.mlp.up_proj
- model.layers.61.mlp.up_proj
- model.layers.54.mlp.up_proj
- model.layers.40.mlp.up_proj
- model.layers.55.mlp.up_proj
- model.layers.77.mlp.up_proj
- model.layers.60.mlp.up_proj
- model.layers.41.mlp.up_proj
- model.layers.35.mlp.up_proj
- model.layers.37.mlp.up_proj
- model.layers.58.mlp.up_proj
- model.layers.34.mlp.up_proj
- model.layers.38.mlp.up_proj
- model.layers.33.mlp.up_proj
- model.layers.39.mlp.up_proj
# self_attn.k_proj layers
- model.layers.36.self_attn.k_proj
- model.layers.79.self_attn.k_proj
- model.layers.35.self_attn.k_proj
- model.layers.34.self_attn.k_proj
- model.layers.37.self_attn.k_proj
- model.layers.33.self_attn.k_proj
- model.layers.38.self_attn.k_proj
- model.layers.39.self_attn.k_proj
- model.layers.74.self_attn.k_proj
- model.layers.77.self_attn.k_proj
- model.layers.41.self_attn.k_proj
- model.layers.69.self_attn.k_proj
- model.layers.32.self_attn.k_proj
- model.layers.78.self_attn.k_proj
- model.layers.30.self_attn.k_proj
- model.layers.70.self_attn.k_proj
- model.layers.25.self_attn.k_proj
- model.layers.42.self_attn.k_proj
- model.layers.29.self_attn.k_proj
- model.layers.31.self_attn.k_proj
- model.layers.68.self_attn.k_proj
- model.layers.66.self_attn.k_proj
- model.layers.22.self_attn.k_proj
- model.layers.65.self_attn.k_proj
- model.layers.44.self_attn.k_proj
- model.layers.40.self_attn.k_proj
- model.layers.63.self_attn.k_proj
- model.layers.23.self_attn.k_proj
- model.layers.28.self_attn.k_proj
- model.layers.24.self_attn.k_proj
- model.layers.26.self_attn.k_proj
- model.layers.67.self_attn.k_proj
- model.layers.75.self_attn.k_proj
- model.layers.27.self_attn.k_proj
- model.layers.57.self_attn.k_proj
- model.layers.64.self_attn.k_proj
- model.layers.71.self_attn.k_proj
- model.layers.61.self_attn.k_proj
- model.layers.72.self_attn.k_proj
- model.layers.73.self_attn.k_proj
# self_attn.o_proj layers
- model.layers.69.self_attn.o_proj
- model.layers.39.self_attn.o_proj
- model.layers.16.self_attn.o_proj
- model.layers.14.self_attn.o_proj
- model.layers.19.self_attn.o_proj
- model.layers.42.self_attn.o_proj
- model.layers.12.self_attn.o_proj
- model.layers.15.self_attn.o_proj
- model.layers.17.self_attn.o_proj
- model.layers.38.self_attn.o_proj
- model.layers.23.self_attn.o_proj
- model.layers.22.self_attn.o_proj
- model.layers.13.self_attn.o_proj
- model.layers.29.self_attn.o_proj
- model.layers.41.self_attn.o_proj
- model.layers.44.self_attn.o_proj
- model.layers.46.self_attn.o_proj
- model.layers.45.self_attn.o_proj
- model.layers.43.self_attn.o_proj
- model.layers.49.self_attn.o_proj
- model.layers.30.self_attn.o_proj
- model.layers.26.self_attn.o_proj
- model.layers.25.self_attn.o_proj
- model.layers.37.self_attn.o_proj
- model.layers.47.self_attn.o_proj
- model.layers.11.self_attn.o_proj
- model.layers.18.self_attn.o_proj
- model.layers.28.self_attn.o_proj
- model.layers.20.self_attn.o_proj
- model.layers.27.self_attn.o_proj
- model.layers.53.self_attn.o_proj
- model.layers.52.self_attn.o_proj
- model.layers.35.self_attn.o_proj
- model.layers.71.self_attn.o_proj
- model.layers.10.self_attn.o_proj
- model.layers.3.self_attn.o_proj
- model.layers.21.self_attn.o_proj
- model.layers.24.self_attn.o_proj
- model.layers.68.self_attn.o_proj
- model.layers.48.self_attn.o_proj
# self_attn.q_proj layers
- model.layers.1.self_attn.q_proj
- model.layers.2.self_attn.q_proj
- model.layers.3.self_attn.q_proj
- model.layers.0.self_attn.q_proj
- model.layers.5.self_attn.q_proj
- model.layers.4.self_attn.q_proj
- model.layers.6.self_attn.q_proj
- model.layers.8.self_attn.q_proj
- model.layers.7.self_attn.q_proj
- model.layers.9.self_attn.q_proj
- model.layers.10.self_attn.q_proj
- model.layers.68.self_attn.q_proj
- model.layers.25.self_attn.q_proj
- model.layers.12.self_attn.q_proj
- model.layers.54.self_attn.q_proj
- model.layers.55.self_attn.q_proj
- model.layers.61.self_attn.q_proj
- model.layers.18.self_attn.q_proj
- model.layers.49.self_attn.q_proj
- model.layers.66.self_attn.q_proj
- model.layers.72.self_attn.q_proj
- model.layers.11.self_attn.q_proj
- model.layers.52.self_attn.q_proj
- model.layers.64.self_attn.q_proj
- model.layers.15.self_attn.q_proj
- model.layers.60.self_attn.q_proj
- model.layers.50.self_attn.q_proj
- model.layers.59.self_attn.q_proj
- model.layers.53.self_attn.q_proj
- model.layers.48.self_attn.q_proj
- model.layers.57.self_attn.q_proj
- model.layers.70.self_attn.q_proj
- model.layers.17.self_attn.q_proj
- model.layers.67.self_attn.q_proj
- model.layers.71.self_attn.q_proj
- model.layers.62.self_attn.q_proj
- model.layers.51.self_attn.q_proj
- model.layers.19.self_attn.q_proj
- model.layers.58.self_attn.q_proj
- model.layers.13.self_attn.q_proj
# self_attn.v_proj layers
- model.layers.23.self_attn.v_proj
- model.layers.25.self_attn.v_proj
- model.layers.26.self_attn.v_proj
- model.layers.27.self_attn.v_proj
- model.layers.28.self_attn.v_proj
- model.layers.29.self_attn.v_proj
- model.layers.30.self_attn.v_proj
- model.layers.31.self_attn.v_proj
- model.layers.34.self_attn.v_proj
- model.layers.35.self_attn.v_proj
- model.layers.36.self_attn.v_proj
- model.layers.37.self_attn.v_proj
- model.layers.38.self_attn.v_proj
- model.layers.42.self_attn.v_proj
- model.layers.48.self_attn.v_proj
- model.layers.57.self_attn.v_proj
- model.layers.58.self_attn.v_proj
- model.layers.61.self_attn.v_proj
- model.layers.63.self_attn.v_proj
- model.layers.64.self_attn.v_proj
- model.layers.65.self_attn.v_proj
- model.layers.66.self_attn.v_proj
- model.layers.69.self_attn.v_proj
- model.layers.70.self_attn.v_proj
- model.layers.74.self_attn.v_proj
- model.layers.75.self_attn.v_proj
- model.layers.72.self_attn.v_proj
- model.layers.39.self_attn.v_proj
- model.layers.41.self_attn.v_proj
- model.layers.40.self_attn.v_proj
- model.layers.33.self_attn.v_proj
- model.layers.59.self_attn.v_proj
- model.layers.16.self_attn.v_proj
- model.layers.15.self_attn.v_proj
- model.layers.76.self_attn.v_proj
- model.layers.24.self_attn.v_proj
- model.layers.68.self_attn.v_proj
- model.layers.67.self_attn.v_proj
- model.layers.55.self_attn.v_proj
- model.layers.44.self_attn.v_proj
wandb_project: EVA-Qwen2.5-72B-SFFT-v0.2
wandb_entity:
wandb_watch:
wandb_name: Unit-02
wandb_log_model:
gradient_accumulation_steps: 8
micro_batch_size: 1
num_epochs: 3
optimizer: paged_ademamix_8bit
lr_scheduler: cosine
learning_rate: 0.00003
max_grad_norm: 1.5
train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: false
gradient_checkpointing: "unsloth"
# gradient_checkpointing_kwargs:
# use_reentrant: true
early_stopping_patience:
resume_from_checkpoint: EVA-Qwen2.5-72B-SFFT-v0.2/checkpoint-128
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true
warmup_steps: 20
evals_per_epoch: 4
saves_per_epoch: 4
save_safetensors: true
save_total_limit: 1
hub_model_id:
hub_strategy:
debug:
deepspeed: deepspeed_configs/zero3_bf16_cpuoffload_params.json
weight_decay: 0.12
# fsdp:
# - full_shard
# - auto_wrap
# fsdp_config:
# fsdp_limit_all_gathers: true
# fsdp_sync_module_states: false
# fsdp_offload_params: true
# fsdp_cpu_ram_efficient_loading: true
# fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
# fsdp_transformer_layer_cls_to_wrap: Qwen2DecoderLayer
# fsdp_activation_checkpointing: true
# fsdp_state_dict_type: SHARDED_STATE_DICT # Changed from FULL_STATE_DICT
# fsdp_sharding_strategy: FULL_SHARD
# fsdp_forward_prefetch: false # Added
# fsdp_backward_prefetch: "BACKWARD_PRE" # Added
# fsdp_backward_prefetch_limit: 1 # Added
# fsdp_mixed_precision: BF16 # Added
```
</details><br> |
BFS-Search/llama-3.2-3b-i2b2_new_multi_rel | BFS-Search | 2024-11-24T00:46:12Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-20T18:52:08Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
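A hedged starting point, inferred only from the repo tags (llama, text-generation, conversational); the expected prompt format is not documented here:

```python
from transformers import pipeline

# Hedged sketch inferred from repo tags only; the intended prompt format
# (e.g. for i2b2-style relation extraction) is not documented in this card.
pipe = pipeline("text-generation", model="BFS-Search/llama-3.2-3b-i2b2_new_multi_rel")
out = pipe("Extract the clinical relations in: The patient stopped aspirin after the GI bleed.",
           max_new_tokens=64)
print(out[0]["generated_text"])
```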
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mav23/llama-68m-GGUF | mav23 | 2024-11-24T00:33:57Z | 17 | 0 | null | [
"gguf",
"text-generation",
"en",
"dataset:wikipedia",
"arxiv:2305.09781",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-24T00:32:19Z | ---
license: apache-2.0
language:
- en
datasets:
- wikipedia
pipeline_tag: text-generation
---
## Model description
This is a LLaMA-like model with only 68M parameters, trained on Wikipedia and parts of the C4-en and C4-realnewslike datasets.
No evaluation has been conducted yet, so use it with care.
The model was mainly developed as a base small speculative model (SSM) for the [SpecInfer](https://arxiv.org/abs/2305.09781) paper.
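As a hedged sketch of that draft-model use case, Hugging Face transformers' assisted generation can pair a small LLaMA-family draft with a larger target (the repo ids below are assumptions, and draft and target must share a vocabulary):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hedged sketch: speculative ("assisted") decoding with a 68M draft model.
# Repo ids are assumptions; draft and target must share the same vocabulary.
tok = AutoTokenizer.from_pretrained("JackFram/llama-68m")
draft = AutoModelForCausalLM.from_pretrained("JackFram/llama-68m")
target = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tok("Speculative decoding works by", return_tensors="pt")
out = target.generate(**inputs, assistant_model=draft, max_new_tokens=40)
print(tok.decode(out[0], skip_special_tokens=True))
```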
## Citation
To cite the model, please use
```bibtex
@misc{miao2023specinfer,
title={SpecInfer: Accelerating Generative LLM Serving with Speculative Inference and Token Tree Verification},
author={Xupeng Miao and Gabriele Oliaro and Zhihao Zhang and Xinhao Cheng and Zeyu Wang and Rae Ying Yee Wong and Zhuoming Chen and Daiyaan Arfeen and Reyna Abhyankar and Zhihao Jia},
year={2023},
eprint={2305.09781},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
artificialguybr/LLAMA3.2-1B-Synthia-I-Redmond-gguf | artificialguybr | 2024-11-24T00:25:42Z | 248 | 1 | transformers | [
"transformers",
"gguf",
"instruct",
"finetune",
"chatml",
"gpt4",
"synthetic data",
"distillation",
"facebook",
"meta",
"pytorch",
"llama",
"llama-3",
"en",
"de",
"fr",
"it",
"pt",
"hi",
"es",
"th",
"dataset:migtissera/Synthia-v1.5-I",
"base_model:artificialguybr/LLAMA3.2-1B-Synthia-I-Redmond",
"base_model:quantized:artificialguybr/LLAMA3.2-1B-Synthia-I-Redmond",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-24T00:10:57Z | ---
base_model: artificialguybr/LLAMA3.2-1B-Synthia-I-Redmond
datasets:
- migtissera/Synthia-v1.5-I
language:
- en
- de
- fr
- it
- pt
- hi
- es
- th
library_name: transformers
license: apache-2.0
quantized_by: artificialguybr
tags:
- instruct
- finetune
- chatml
- gpt4
- synthetic data
- distillation
- facebook
- meta
- pytorch
- llama
- llama-3
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
Thanks to [Redmond.AI](https://redmond.ai/) for sponsoring the GPU!
Quantization for: https://huggingface.co/artificialguybr/LLAMA3.2-1B-Synthia-I-Redmond
## How to use
If you are unsure how to use GGUF files, look at the [TheBloke READMEs](https://huggingface.co/TheBloke/CodeLlama-70B-Python-GGUF) for more details, including how to concatenate multi-part files.
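As a minimal hedged sketch with `llama-cpp-python` (the filename is an assumption; use whichever quant you downloaded):

```python
from llama_cpp import Llama

# Minimal sketch: run a downloaded GGUF quant locally.
# The filename is an assumption; point it at the quant you actually fetched.
llm = Llama(model_path="llama3.2-1b-synthia-i-redmond.Q4_K_M.gguf", n_ctx=4096)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what synthetic-data distillation is."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```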
|
gokulsrinivasagan/bert_tiny_lda_100_v1 | gokulsrinivasagan | 2024-11-24T00:25:12Z | 33 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"generated_from_trainer",
"dataset:gokulsrinivasagan/processed_wikitext-103-raw-v1-ld-100",
"model-index",
"endpoints_compatible",
"region:us"
] | null | 2024-11-23T18:34:09Z | ---
library_name: transformers
tags:
- generated_from_trainer
datasets:
- gokulsrinivasagan/processed_wikitext-103-raw-v1-ld-100
metrics:
- accuracy
model-index:
- name: bert_tiny_lda_100_v1
results:
- task:
name: Masked Language Modeling
type: fill-mask
dataset:
name: gokulsrinivasagan/processed_wikitext-103-raw-v1-ld-100
type: gokulsrinivasagan/processed_wikitext-103-raw-v1-ld-100
metrics:
- name: Accuracy
type: accuracy
value: 0.3677014773593966
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_tiny_lda_100_v1
This model is a fine-tuned version of an unspecified base model on the gokulsrinivasagan/processed_wikitext-103-raw-v1-ld-100 dataset.
It achieves the following results on the evaluation set:
- Loss: 7.8078
- Accuracy: 0.3677
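
A hedged usage sketch (assuming the pushed tokenizer defines a standard `[MASK]` token; the LDA preprocessing used for training is not reproduced here):

```python
from transformers import pipeline

# Hedged sketch: query the MLM head directly; the LDA preprocessing from
# training is not reproduced here, so treat this as a rough smoke test.
fill = pipeline("fill-mask", model="gokulsrinivasagan/bert_tiny_lda_100_v1")
print(fill("The capital of France is [MASK]."))
```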
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 160
- eval_batch_size: 160
- seed: 10
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
| 10.7151 | 6.9979 | 10000 | 10.6513 | 0.1546 |
| 9.7707 | 13.9958 | 20000 | 9.6190 | 0.1946 |
| 8.0878 | 20.9937 | 30000 | 7.8520 | 0.3593 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.2.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.1
|
MayBashendy/ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0 | MayBashendy | 2024-11-24T00:24:48Z | 162 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-18T03:47:53Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6338
- Qwk: 0.4989
- Mse: 0.6338
- Rmse: 0.7961
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
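
These settings correspond roughly to the following `TrainingArguments`; this is a hedged reconstruction, with dataset loading and the single-output regression head (scored with QWK/MSE/RMSE above) omitted:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; model and dataset
# setup (BERT with a single regression output) are omitted.
args = TrainingArguments(
    output_dir="ASAP_FineTuningBERT_AugV4_k4_task1_organization_fold0",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```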
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.0048 | 2 | 9.7884 | 0.0036 | 9.7884 | 3.1286 |
| No log | 0.0097 | 4 | 8.3457 | 0.0 | 8.3457 | 2.8889 |
| No log | 0.0145 | 6 | 7.3232 | 0.0 | 7.3232 | 2.7061 |
| No log | 0.0193 | 8 | 6.6114 | 0.0 | 6.6114 | 2.5713 |
| No log | 0.0242 | 10 | 5.9011 | 0.0090 | 5.9011 | 2.4292 |
| No log | 0.0290 | 12 | 5.0913 | 0.0112 | 5.0913 | 2.2564 |
| No log | 0.0338 | 14 | 4.3007 | 0.0077 | 4.3007 | 2.0738 |
| No log | 0.0386 | 16 | 3.5755 | 0.0039 | 3.5755 | 1.8909 |
| No log | 0.0435 | 18 | 2.8540 | 0.0 | 2.8540 | 1.6894 |
| No log | 0.0483 | 20 | 2.2425 | 0.0742 | 2.2425 | 1.4975 |
| No log | 0.0531 | 22 | 1.7255 | 0.0382 | 1.7255 | 1.3136 |
| No log | 0.0580 | 24 | 1.3306 | 0.0316 | 1.3306 | 1.1535 |
| No log | 0.0628 | 26 | 1.0734 | 0.0316 | 1.0734 | 1.0361 |
| No log | 0.0676 | 28 | 0.8781 | 0.1273 | 0.8781 | 0.9370 |
| No log | 0.0725 | 30 | 0.8005 | 0.1169 | 0.8005 | 0.8947 |
| No log | 0.0773 | 32 | 0.7497 | 0.1076 | 0.7497 | 0.8659 |
| No log | 0.0821 | 34 | 0.8244 | 0.0689 | 0.8244 | 0.9080 |
| No log | 0.0870 | 36 | 0.8958 | 0.0521 | 0.8958 | 0.9465 |
| No log | 0.0918 | 38 | 0.9542 | 0.0348 | 0.9542 | 0.9769 |
| No log | 0.0966 | 40 | 1.1204 | 0.0348 | 1.1204 | 1.0585 |
| No log | 0.1014 | 42 | 1.1989 | 0.0174 | 1.1989 | 1.0949 |
| No log | 0.1063 | 44 | 1.0374 | 0.0174 | 1.0374 | 1.0185 |
| No log | 0.1111 | 46 | 0.9324 | 0.0174 | 0.9324 | 0.9656 |
| No log | 0.1159 | 48 | 0.9645 | 0.0174 | 0.9645 | 0.9821 |
| No log | 0.1208 | 50 | 1.1371 | 0.0174 | 1.1371 | 1.0663 |
| No log | 0.1256 | 52 | 1.0818 | 0.0174 | 1.0818 | 1.0401 |
| No log | 0.1304 | 54 | 1.1180 | 0.0174 | 1.1180 | 1.0573 |
| No log | 0.1353 | 56 | 1.0966 | 0.0 | 1.0966 | 1.0472 |
| No log | 0.1401 | 58 | 0.9594 | 0.0 | 0.9594 | 0.9795 |
| No log | 0.1449 | 60 | 0.8625 | 0.0 | 0.8625 | 0.9287 |
| No log | 0.1498 | 62 | 0.8699 | 0.0 | 0.8699 | 0.9327 |
| No log | 0.1546 | 64 | 0.9776 | 0.0 | 0.9776 | 0.9887 |
| No log | 0.1594 | 66 | 1.0314 | 0.0 | 1.0314 | 1.0156 |
| No log | 0.1643 | 68 | 0.8824 | 0.0174 | 0.8824 | 0.9393 |
| No log | 0.1691 | 70 | 0.8842 | 0.0174 | 0.8842 | 0.9403 |
| No log | 0.1739 | 72 | 1.0974 | 0.0 | 1.0974 | 1.0475 |
| No log | 0.1787 | 74 | 1.1630 | 0.0 | 1.1630 | 1.0784 |
| No log | 0.1836 | 76 | 0.9696 | 0.0 | 0.9696 | 0.9847 |
| No log | 0.1884 | 78 | 0.9355 | 0.0 | 0.9355 | 0.9672 |
| No log | 0.1932 | 80 | 1.0228 | 0.0 | 1.0228 | 1.0113 |
| No log | 0.1981 | 82 | 1.0875 | 0.0 | 1.0875 | 1.0428 |
| No log | 0.2029 | 84 | 1.0401 | 0.0 | 1.0401 | 1.0198 |
| No log | 0.2077 | 86 | 0.9557 | 0.0 | 0.9557 | 0.9776 |
| No log | 0.2126 | 88 | 0.9466 | 0.0 | 0.9466 | 0.9729 |
| No log | 0.2174 | 90 | 0.9541 | 0.0 | 0.9541 | 0.9768 |
| No log | 0.2222 | 92 | 0.9751 | 0.0 | 0.9751 | 0.9875 |
| No log | 0.2271 | 94 | 0.9317 | 0.0 | 0.9317 | 0.9652 |
| No log | 0.2319 | 96 | 0.8861 | 0.0 | 0.8861 | 0.9413 |
| No log | 0.2367 | 98 | 0.8846 | 0.0 | 0.8846 | 0.9405 |
| No log | 0.2415 | 100 | 0.9770 | 0.0 | 0.9770 | 0.9885 |
| No log | 0.2464 | 102 | 1.0583 | 0.0 | 1.0583 | 1.0287 |
| No log | 0.2512 | 104 | 0.9901 | 0.0 | 0.9901 | 0.9950 |
| No log | 0.2560 | 106 | 0.9516 | 0.0174 | 0.9516 | 0.9755 |
| No log | 0.2609 | 108 | 1.0214 | 0.0 | 1.0214 | 1.0106 |
| No log | 0.2657 | 110 | 0.9706 | 0.0 | 0.9706 | 0.9852 |
| No log | 0.2705 | 112 | 0.8449 | 0.0 | 0.8449 | 0.9192 |
| No log | 0.2754 | 114 | 0.7743 | 0.0049 | 0.7743 | 0.8799 |
| No log | 0.2802 | 116 | 0.7784 | 0.0147 | 0.7784 | 0.8823 |
| No log | 0.2850 | 118 | 0.8229 | 0.0 | 0.8229 | 0.9071 |
| No log | 0.2899 | 120 | 1.0002 | 0.1325 | 1.0002 | 1.0001 |
| No log | 0.2947 | 122 | 1.0582 | 0.1573 | 1.0582 | 1.0287 |
| No log | 0.2995 | 124 | 1.0952 | 0.1566 | 1.0952 | 1.0465 |
| No log | 0.3043 | 126 | 1.2585 | 0.1007 | 1.2585 | 1.1218 |
| No log | 0.3092 | 128 | 1.0667 | 0.2120 | 1.0667 | 1.0328 |
| No log | 0.3140 | 130 | 0.7665 | 0.1629 | 0.7665 | 0.8755 |
| No log | 0.3188 | 132 | 0.9435 | 0.2251 | 0.9435 | 0.9714 |
| No log | 0.3237 | 134 | 1.2567 | 0.1784 | 1.2567 | 1.1210 |
| No log | 0.3285 | 136 | 0.9381 | 0.2307 | 0.9381 | 0.9685 |
| No log | 0.3333 | 138 | 0.7142 | 0.2795 | 0.7142 | 0.8451 |
| No log | 0.3382 | 140 | 0.6960 | 0.2368 | 0.6960 | 0.8343 |
| No log | 0.3430 | 142 | 0.7194 | 0.1643 | 0.7194 | 0.8482 |
| No log | 0.3478 | 144 | 0.7069 | 0.2029 | 0.7069 | 0.8408 |
| No log | 0.3527 | 146 | 0.8457 | 0.2275 | 0.8457 | 0.9196 |
| No log | 0.3575 | 148 | 0.7414 | 0.1941 | 0.7414 | 0.8610 |
| No log | 0.3623 | 150 | 0.7162 | 0.2938 | 0.7162 | 0.8463 |
| No log | 0.3671 | 152 | 0.7151 | 0.2805 | 0.7151 | 0.8457 |
| No log | 0.3720 | 154 | 0.7294 | 0.1767 | 0.7294 | 0.8541 |
| No log | 0.3768 | 156 | 1.0364 | 0.2084 | 1.0364 | 1.0181 |
| No log | 0.3816 | 158 | 1.1460 | 0.1853 | 1.1460 | 1.0705 |
| No log | 0.3865 | 160 | 1.0650 | 0.1970 | 1.0650 | 1.0320 |
| No log | 0.3913 | 162 | 0.9547 | 0.2146 | 0.9547 | 0.9771 |
| No log | 0.3961 | 164 | 0.7515 | 0.1664 | 0.7515 | 0.8669 |
| No log | 0.4010 | 166 | 0.7146 | 0.2398 | 0.7146 | 0.8453 |
| No log | 0.4058 | 168 | 0.7202 | 0.2807 | 0.7202 | 0.8486 |
| No log | 0.4106 | 170 | 0.9297 | 0.3111 | 0.9297 | 0.9642 |
| No log | 0.4155 | 172 | 1.1130 | 0.2237 | 1.1130 | 1.0550 |
| No log | 0.4203 | 174 | 0.8935 | 0.3526 | 0.8935 | 0.9452 |
| No log | 0.4251 | 176 | 0.7485 | 0.3612 | 0.7485 | 0.8652 |
| No log | 0.4300 | 178 | 0.7487 | 0.3652 | 0.7487 | 0.8653 |
| No log | 0.4348 | 180 | 0.7959 | 0.3743 | 0.7959 | 0.8921 |
| No log | 0.4396 | 182 | 0.8790 | 0.3561 | 0.8790 | 0.9375 |
| No log | 0.4444 | 184 | 0.8131 | 0.3858 | 0.8131 | 0.9017 |
| No log | 0.4493 | 186 | 0.6981 | 0.3787 | 0.6981 | 0.8355 |
| No log | 0.4541 | 188 | 0.6664 | 0.3902 | 0.6664 | 0.8164 |
| No log | 0.4589 | 190 | 0.6466 | 0.3808 | 0.6466 | 0.8041 |
| No log | 0.4638 | 192 | 0.6441 | 0.3945 | 0.6441 | 0.8026 |
| No log | 0.4686 | 194 | 0.7083 | 0.3771 | 0.7083 | 0.8416 |
| No log | 0.4734 | 196 | 0.8337 | 0.3348 | 0.8337 | 0.9131 |
| No log | 0.4783 | 198 | 0.7679 | 0.3700 | 0.7679 | 0.8763 |
| No log | 0.4831 | 200 | 0.6496 | 0.4087 | 0.6496 | 0.8060 |
| No log | 0.4879 | 202 | 0.6448 | 0.3172 | 0.6448 | 0.8030 |
| No log | 0.4928 | 204 | 0.6885 | 0.2440 | 0.6885 | 0.8298 |
| No log | 0.4976 | 206 | 0.8107 | 0.3165 | 0.8107 | 0.9004 |
| No log | 0.5024 | 208 | 1.0132 | 0.2603 | 1.0132 | 1.0066 |
| No log | 0.5072 | 210 | 1.0492 | 0.2428 | 1.0492 | 1.0243 |
| No log | 0.5121 | 212 | 0.8933 | 0.2702 | 0.8933 | 0.9451 |
| No log | 0.5169 | 214 | 0.6569 | 0.3295 | 0.6569 | 0.8105 |
| No log | 0.5217 | 216 | 0.6409 | 0.3527 | 0.6409 | 0.8005 |
| No log | 0.5266 | 218 | 0.6300 | 0.3866 | 0.6300 | 0.7937 |
| No log | 0.5314 | 220 | 0.8547 | 0.3163 | 0.8547 | 0.9245 |
| No log | 0.5362 | 222 | 0.9754 | 0.2782 | 0.9754 | 0.9876 |
| No log | 0.5411 | 224 | 0.8529 | 0.3246 | 0.8529 | 0.9235 |
| No log | 0.5459 | 226 | 0.6840 | 0.2933 | 0.6840 | 0.8270 |
| No log | 0.5507 | 228 | 0.6217 | 0.3660 | 0.6217 | 0.7885 |
| No log | 0.5556 | 230 | 0.6445 | 0.3616 | 0.6445 | 0.8028 |
| No log | 0.5604 | 232 | 0.6875 | 0.3841 | 0.6875 | 0.8291 |
| No log | 0.5652 | 234 | 0.6002 | 0.4674 | 0.6002 | 0.7747 |
| No log | 0.5700 | 236 | 0.6340 | 0.4779 | 0.6340 | 0.7962 |
| No log | 0.5749 | 238 | 0.6832 | 0.4418 | 0.6832 | 0.8266 |
| No log | 0.5797 | 240 | 0.5909 | 0.5007 | 0.5909 | 0.7687 |
| No log | 0.5845 | 242 | 0.5939 | 0.4093 | 0.5939 | 0.7706 |
| No log | 0.5894 | 244 | 0.5803 | 0.4322 | 0.5803 | 0.7618 |
| No log | 0.5942 | 246 | 0.6677 | 0.4329 | 0.6677 | 0.8172 |
| No log | 0.5990 | 248 | 0.9902 | 0.3306 | 0.9902 | 0.9951 |
| No log | 0.6039 | 250 | 0.9869 | 0.3203 | 0.9869 | 0.9934 |
| No log | 0.6087 | 252 | 0.7199 | 0.3776 | 0.7199 | 0.8484 |
| No log | 0.6135 | 254 | 0.6199 | 0.4368 | 0.6199 | 0.7874 |
| No log | 0.6184 | 256 | 0.6016 | 0.4359 | 0.6016 | 0.7756 |
| No log | 0.6232 | 258 | 0.6023 | 0.4257 | 0.6023 | 0.7761 |
| No log | 0.6280 | 260 | 0.6339 | 0.4306 | 0.6339 | 0.7962 |
| No log | 0.6329 | 262 | 0.7260 | 0.4105 | 0.7260 | 0.8520 |
| No log | 0.6377 | 264 | 0.7948 | 0.3844 | 0.7948 | 0.8915 |
| No log | 0.6425 | 266 | 0.6656 | 0.4497 | 0.6656 | 0.8158 |
| No log | 0.6473 | 268 | 0.5675 | 0.4684 | 0.5675 | 0.7533 |
| No log | 0.6522 | 270 | 0.5618 | 0.4300 | 0.5618 | 0.7495 |
| No log | 0.6570 | 272 | 0.5491 | 0.4477 | 0.5491 | 0.7410 |
| No log | 0.6618 | 274 | 0.5510 | 0.4729 | 0.5510 | 0.7423 |
| No log | 0.6667 | 276 | 0.5425 | 0.4794 | 0.5425 | 0.7366 |
| No log | 0.6715 | 278 | 0.6099 | 0.4791 | 0.6099 | 0.7810 |
| No log | 0.6763 | 280 | 0.6061 | 0.4839 | 0.6061 | 0.7786 |
| No log | 0.6812 | 282 | 0.5278 | 0.4896 | 0.5278 | 0.7265 |
| No log | 0.6860 | 284 | 0.5469 | 0.4460 | 0.5469 | 0.7395 |
| No log | 0.6908 | 286 | 0.5380 | 0.4797 | 0.5380 | 0.7335 |
| No log | 0.6957 | 288 | 0.5378 | 0.5305 | 0.5378 | 0.7334 |
| No log | 0.7005 | 290 | 0.6036 | 0.5275 | 0.6036 | 0.7769 |
| No log | 0.7053 | 292 | 0.6518 | 0.5202 | 0.6518 | 0.8073 |
| No log | 0.7101 | 294 | 0.5815 | 0.5521 | 0.5815 | 0.7626 |
| No log | 0.7150 | 296 | 0.6317 | 0.5175 | 0.6317 | 0.7948 |
| No log | 0.7198 | 298 | 0.5793 | 0.5322 | 0.5793 | 0.7611 |
| No log | 0.7246 | 300 | 0.7073 | 0.4870 | 0.7073 | 0.8410 |
| No log | 0.7295 | 302 | 0.6423 | 0.4907 | 0.6423 | 0.8014 |
| No log | 0.7343 | 304 | 0.5469 | 0.4514 | 0.5469 | 0.7396 |
| No log | 0.7391 | 306 | 0.6290 | 0.4562 | 0.6290 | 0.7931 |
| No log | 0.7440 | 308 | 0.5726 | 0.4594 | 0.5726 | 0.7567 |
| No log | 0.7488 | 310 | 0.5502 | 0.4640 | 0.5502 | 0.7418 |
| No log | 0.7536 | 312 | 0.5479 | 0.4382 | 0.5479 | 0.7402 |
| No log | 0.7585 | 314 | 0.5798 | 0.4434 | 0.5798 | 0.7615 |
| No log | 0.7633 | 316 | 0.5601 | 0.4432 | 0.5601 | 0.7484 |
| No log | 0.7681 | 318 | 0.5422 | 0.4379 | 0.5422 | 0.7364 |
| No log | 0.7729 | 320 | 0.5739 | 0.4597 | 0.5739 | 0.7575 |
| No log | 0.7778 | 322 | 0.5998 | 0.4597 | 0.5998 | 0.7744 |
| No log | 0.7826 | 324 | 0.5428 | 0.4874 | 0.5428 | 0.7367 |
| No log | 0.7874 | 326 | 0.5459 | 0.4665 | 0.5459 | 0.7388 |
| No log | 0.7923 | 328 | 0.6149 | 0.4466 | 0.6149 | 0.7841 |
| No log | 0.7971 | 330 | 0.6008 | 0.4300 | 0.6008 | 0.7751 |
| No log | 0.8019 | 332 | 0.5639 | 0.4420 | 0.5639 | 0.7509 |
| No log | 0.8068 | 334 | 0.5663 | 0.4335 | 0.5663 | 0.7525 |
| No log | 0.8116 | 336 | 0.6073 | 0.4322 | 0.6073 | 0.7793 |
| No log | 0.8164 | 338 | 0.7503 | 0.4317 | 0.7503 | 0.8662 |
| No log | 0.8213 | 340 | 0.7104 | 0.4243 | 0.7104 | 0.8428 |
| No log | 0.8261 | 342 | 0.5794 | 0.4798 | 0.5794 | 0.7612 |
| No log | 0.8309 | 344 | 0.7353 | 0.4168 | 0.7353 | 0.8575 |
| No log | 0.8357 | 346 | 0.6988 | 0.4094 | 0.6988 | 0.8359 |
| No log | 0.8406 | 348 | 0.5819 | 0.4614 | 0.5819 | 0.7628 |
| No log | 0.8454 | 350 | 0.6966 | 0.4274 | 0.6966 | 0.8346 |
| No log | 0.8502 | 352 | 0.6598 | 0.4491 | 0.6598 | 0.8123 |
| No log | 0.8551 | 354 | 0.5731 | 0.4695 | 0.5731 | 0.7570 |
| No log | 0.8599 | 356 | 0.6861 | 0.4152 | 0.6861 | 0.8283 |
| No log | 0.8647 | 358 | 0.6613 | 0.4218 | 0.6613 | 0.8132 |
| No log | 0.8696 | 360 | 0.5721 | 0.4670 | 0.5721 | 0.7564 |
| No log | 0.8744 | 362 | 0.7264 | 0.4363 | 0.7264 | 0.8523 |
| No log | 0.8792 | 364 | 0.8739 | 0.3913 | 0.8739 | 0.9349 |
| No log | 0.8841 | 366 | 0.7313 | 0.4293 | 0.7313 | 0.8552 |
| No log | 0.8889 | 368 | 0.5945 | 0.5008 | 0.5945 | 0.7710 |
| No log | 0.8937 | 370 | 0.6306 | 0.4690 | 0.6306 | 0.7941 |
| No log | 0.8986 | 372 | 0.6189 | 0.4811 | 0.6189 | 0.7867 |
| No log | 0.9034 | 374 | 0.6407 | 0.5069 | 0.6407 | 0.8005 |
| No log | 0.9082 | 376 | 0.8693 | 0.4153 | 0.8693 | 0.9324 |
| No log | 0.9130 | 378 | 0.9686 | 0.3969 | 0.9686 | 0.9842 |
| No log | 0.9179 | 380 | 0.7622 | 0.4090 | 0.7622 | 0.8730 |
| No log | 0.9227 | 382 | 0.5952 | 0.4832 | 0.5952 | 0.7715 |
| No log | 0.9275 | 384 | 0.5749 | 0.4479 | 0.5749 | 0.7582 |
| No log | 0.9324 | 386 | 0.5869 | 0.4698 | 0.5869 | 0.7661 |
| No log | 0.9372 | 388 | 0.6879 | 0.4415 | 0.6879 | 0.8294 |
| No log | 0.9420 | 390 | 0.6862 | 0.4525 | 0.6862 | 0.8283 |
| No log | 0.9469 | 392 | 0.6271 | 0.4711 | 0.6271 | 0.7919 |
| No log | 0.9517 | 394 | 0.6621 | 0.4657 | 0.6621 | 0.8137 |
| No log | 0.9565 | 396 | 0.7243 | 0.4701 | 0.7243 | 0.8511 |
| No log | 0.9614 | 398 | 0.6952 | 0.4560 | 0.6952 | 0.8338 |
| No log | 0.9662 | 400 | 0.5920 | 0.4401 | 0.5920 | 0.7694 |
| No log | 0.9710 | 402 | 0.5406 | 0.4735 | 0.5406 | 0.7352 |
| No log | 0.9758 | 404 | 0.5606 | 0.4797 | 0.5606 | 0.7488 |
| No log | 0.9807 | 406 | 0.5542 | 0.4880 | 0.5542 | 0.7445 |
| No log | 0.9855 | 408 | 0.5656 | 0.4641 | 0.5656 | 0.7521 |
| No log | 0.9903 | 410 | 0.5598 | 0.4875 | 0.5598 | 0.7482 |
| No log | 0.9952 | 412 | 0.5654 | 0.4879 | 0.5654 | 0.7519 |
| No log | 1.0 | 414 | 0.5967 | 0.5004 | 0.5967 | 0.7724 |
| No log | 1.0048 | 416 | 0.6055 | 0.5170 | 0.6055 | 0.7781 |
| No log | 1.0097 | 418 | 0.5804 | 0.5221 | 0.5804 | 0.7618 |
| No log | 1.0145 | 420 | 0.5666 | 0.5118 | 0.5666 | 0.7527 |
| No log | 1.0193 | 422 | 0.5895 | 0.5255 | 0.5895 | 0.7678 |
| No log | 1.0242 | 424 | 0.5697 | 0.5319 | 0.5697 | 0.7548 |
| No log | 1.0290 | 426 | 0.6259 | 0.5038 | 0.6259 | 0.7912 |
| No log | 1.0338 | 428 | 0.6697 | 0.4944 | 0.6697 | 0.8183 |
| No log | 1.0386 | 430 | 0.5692 | 0.5551 | 0.5692 | 0.7544 |
| No log | 1.0435 | 432 | 0.6608 | 0.4884 | 0.6608 | 0.8129 |
| No log | 1.0483 | 434 | 0.6204 | 0.5025 | 0.6204 | 0.7876 |
| No log | 1.0531 | 436 | 0.5643 | 0.5175 | 0.5643 | 0.7512 |
| No log | 1.0580 | 438 | 0.6181 | 0.4652 | 0.6181 | 0.7862 |
| No log | 1.0628 | 440 | 0.5918 | 0.4781 | 0.5918 | 0.7693 |
| No log | 1.0676 | 442 | 0.5837 | 0.4610 | 0.5837 | 0.7640 |
| No log | 1.0725 | 444 | 0.6268 | 0.4404 | 0.6268 | 0.7917 |
| No log | 1.0773 | 446 | 0.7865 | 0.4318 | 0.7865 | 0.8869 |
| No log | 1.0821 | 448 | 0.8325 | 0.4030 | 0.8325 | 0.9124 |
| No log | 1.0870 | 450 | 0.7097 | 0.4009 | 0.7097 | 0.8424 |
| No log | 1.0918 | 452 | 0.6230 | 0.4214 | 0.6230 | 0.7893 |
| No log | 1.0966 | 454 | 0.6219 | 0.4450 | 0.6219 | 0.7886 |
| No log | 1.1014 | 456 | 0.6602 | 0.4060 | 0.6602 | 0.8125 |
| No log | 1.1063 | 458 | 0.8361 | 0.4141 | 0.8361 | 0.9144 |
| No log | 1.1111 | 460 | 0.7944 | 0.4204 | 0.7944 | 0.8913 |
| No log | 1.1159 | 462 | 0.6283 | 0.4107 | 0.6283 | 0.7926 |
| No log | 1.1208 | 464 | 0.6113 | 0.4390 | 0.6113 | 0.7819 |
| No log | 1.1256 | 466 | 0.6580 | 0.4379 | 0.6580 | 0.8111 |
| No log | 1.1304 | 468 | 0.6096 | 0.4402 | 0.6096 | 0.7807 |
| No log | 1.1353 | 470 | 0.5697 | 0.4985 | 0.5697 | 0.7548 |
| No log | 1.1401 | 472 | 0.5775 | 0.5157 | 0.5775 | 0.7599 |
| No log | 1.1449 | 474 | 0.5896 | 0.5394 | 0.5896 | 0.7678 |
| No log | 1.1498 | 476 | 0.6675 | 0.5300 | 0.6675 | 0.8170 |
| No log | 1.1546 | 478 | 0.7546 | 0.4850 | 0.7546 | 0.8687 |
| No log | 1.1594 | 480 | 0.6185 | 0.5061 | 0.6185 | 0.7865 |
| No log | 1.1643 | 482 | 0.5779 | 0.5398 | 0.5779 | 0.7602 |
| No log | 1.1691 | 484 | 0.6439 | 0.5168 | 0.6439 | 0.8024 |
| No log | 1.1739 | 486 | 0.5688 | 0.5326 | 0.5688 | 0.7542 |
| No log | 1.1787 | 488 | 0.5815 | 0.4819 | 0.5815 | 0.7626 |
| No log | 1.1836 | 490 | 0.6205 | 0.4913 | 0.6205 | 0.7877 |
| No log | 1.1884 | 492 | 0.5510 | 0.4939 | 0.5510 | 0.7423 |
| No log | 1.1932 | 494 | 0.5598 | 0.5283 | 0.5598 | 0.7482 |
| No log | 1.1981 | 496 | 0.5677 | 0.5509 | 0.5677 | 0.7535 |
| No log | 1.2029 | 498 | 0.5941 | 0.5419 | 0.5941 | 0.7708 |
| 1.006 | 1.2077 | 500 | 0.6758 | 0.5151 | 0.6758 | 0.8220 |
| 1.006 | 1.2126 | 502 | 0.6052 | 0.5619 | 0.6052 | 0.7780 |
| 1.006 | 1.2174 | 504 | 0.6164 | 0.5571 | 0.6164 | 0.7851 |
| 1.006 | 1.2222 | 506 | 0.7070 | 0.4966 | 0.7070 | 0.8409 |
| 1.006 | 1.2271 | 508 | 0.6159 | 0.5705 | 0.6159 | 0.7848 |
| 1.006 | 1.2319 | 510 | 0.6100 | 0.5811 | 0.6100 | 0.7810 |
| 1.006 | 1.2367 | 512 | 0.5981 | 0.5895 | 0.5981 | 0.7734 |
| 1.006 | 1.2415 | 514 | 0.5755 | 0.5539 | 0.5755 | 0.7586 |
| 1.006 | 1.2464 | 516 | 0.6028 | 0.5150 | 0.6028 | 0.7764 |
| 1.006 | 1.2512 | 518 | 0.5452 | 0.5514 | 0.5452 | 0.7384 |
| 1.006 | 1.2560 | 520 | 0.5842 | 0.5165 | 0.5842 | 0.7643 |
| 1.006 | 1.2609 | 522 | 0.6252 | 0.4805 | 0.6252 | 0.7907 |
| 1.006 | 1.2657 | 524 | 0.5512 | 0.4616 | 0.5512 | 0.7424 |
| 1.006 | 1.2705 | 526 | 0.5600 | 0.5282 | 0.5600 | 0.7483 |
| 1.006 | 1.2754 | 528 | 0.5831 | 0.4820 | 0.5831 | 0.7636 |
| 1.006 | 1.2802 | 530 | 0.5445 | 0.4965 | 0.5445 | 0.7379 |
| 1.006 | 1.2850 | 532 | 0.6660 | 0.4704 | 0.6660 | 0.8161 |
| 1.006 | 1.2899 | 534 | 0.7488 | 0.4750 | 0.7488 | 0.8654 |
| 1.006 | 1.2947 | 536 | 0.6338 | 0.4989 | 0.6338 | 0.7961 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
juancortizgonz/bert-base-uncased-finetuned-ner | juancortizgonz | 2024-11-24T00:20:01Z | 105 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:biobert_json",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-24T00:19:46Z | ---
library_name: transformers
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
- generated_from_trainer
datasets:
- biobert_json
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-base-uncased-finetuned-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: biobert_json
type: biobert_json
config: Biobert_json
split: validation
args: Biobert_json
metrics:
- name: Precision
type: precision
value: 0.942891335567257
- name: Recall
type: recall
value: 0.9658232813572619
- name: F1
type: f1
value: 0.9542195523689565
- name: Accuracy
type: accuracy
value: 0.9763595874355369
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-ner
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the biobert_json dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1204
- Precision: 0.9429
- Recall: 0.9658
- F1: 0.9542
- Accuracy: 0.9764
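
As a hedged illustration, the checkpoint can be smoke-tested with the token-classification pipeline (the example sentence is arbitrary):

```python
from transformers import pipeline

# Hedged sketch: biomedical NER with the fine-tuned checkpoint.
ner = pipeline(
    "token-classification",
    model="juancortizgonz/bert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)
print(ner("The patient was given 50 mg of metoprolol for hypertension."))
```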
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.1823 | 1.0 | 1224 | 0.1059 | 0.9301 | 0.9628 | 0.9462 | 0.9731 |
| 0.1142 | 2.0 | 2448 | 0.1163 | 0.9203 | 0.9717 | 0.9453 | 0.9698 |
| 0.0812 | 3.0 | 3672 | 0.1000 | 0.9427 | 0.9705 | 0.9564 | 0.9773 |
| 0.0603 | 4.0 | 4896 | 0.0970 | 0.9424 | 0.9717 | 0.9568 | 0.9773 |
| 0.0516 | 5.0 | 6120 | 0.1018 | 0.9416 | 0.9720 | 0.9566 | 0.9772 |
| 0.0418 | 6.0 | 7344 | 0.1044 | 0.9446 | 0.9704 | 0.9574 | 0.9778 |
| 0.0361 | 7.0 | 8568 | 0.1070 | 0.9422 | 0.9725 | 0.9571 | 0.9775 |
| 0.0296 | 8.0 | 9792 | 0.1166 | 0.9438 | 0.9708 | 0.9571 | 0.9776 |
| 0.0242 | 9.0 | 11016 | 0.1174 | 0.9437 | 0.9671 | 0.9553 | 0.9767 |
| 0.0231 | 10.0 | 12240 | 0.1204 | 0.9429 | 0.9658 | 0.9542 | 0.9764 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
aminv/WordpressQA-Gemma-2-v3-2B-FT | aminv | 2024-11-24T00:10:04Z | 79 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"trl",
"sft",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-24T00:08:36Z | ---
library_name: transformers
tags:
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
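Pending more details from the authors, a hedged sketch inferred from the repo tags (Gemma, 4-bit bitsandbytes, SFT); the expected prompt format is not documented here:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hedged sketch inferred from repo tags only; loading the 4-bit checkpoint
# requires `accelerate` and `bitsandbytes` to be installed.
tok = AutoTokenizer.from_pretrained("aminv/WordpressQA-Gemma-2-v3-2B-FT")
model = AutoModelForCausalLM.from_pretrained("aminv/WordpressQA-Gemma-2-v3-2B-FT", device_map="auto")
inputs = tok("How do I create a child theme in WordPress?", return_tensors="pt").to(model.device)
print(tok.decode(model.generate(**inputs, max_new_tokens=64)[0], skip_special_tokens=True))
```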
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Soulex95/speecht5_finetuned_emirhan_tr | Soulex95 | 2024-11-23T23:58:46Z | 77 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"speecht5",
"text-to-audio",
"generated_from_trainer",
"base_model:microsoft/speecht5_tts",
"base_model:finetune:microsoft/speecht5_tts",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-to-audio | 2024-11-21T10:25:08Z | ---
library_name: transformers
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
model-index:
- name: speecht5_finetuned_emirhan_tr
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned_emirhan_tr
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3230
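
A hedged inference sketch: it assumes the processor was pushed with this checkpoint (fall back to `microsoft/speecht5_tts` otherwise) and borrows a speaker x-vector from the `Matthijs/cmu-arctic-xvectors` dataset:

```python
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5ForTextToSpeech, SpeechT5HifiGan, SpeechT5Processor

# Hedged sketch: the processor is assumed to ship with this repo; the speaker
# embedding index (7306) is the stock example from the SpeechT5 docs.
processor = SpeechT5Processor.from_pretrained("Soulex95/speecht5_finetuned_emirhan_tr")
model = SpeechT5ForTextToSpeech.from_pretrained("Soulex95/speecht5_finetuned_emirhan_tr")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

xvectors = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(xvectors[7306]["xvector"]).unsqueeze(0)

inputs = processor(text="Merhaba, bugün hava çok güzel.", return_tensors="pt")
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```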
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 500
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.5191 | 0.4545 | 100 | 0.4439 |
| 0.4244 | 0.9091 | 200 | 0.3676 |
| 0.3788 | 1.3636 | 300 | 0.3355 |
| 0.3567 | 1.8182 | 400 | 0.3319 |
| 0.3498 | 2.2727 | 500 | 0.3230 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
isspek/roberta-base_ebola_1_2e-5_16_undersampling_0.3 | isspek | 2024-11-23T23:54:45Z | 179 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:54:31Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
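For intuition, a back-of-the-envelope version of the calculator's arithmetic (an assumption based on the Lacoste et al. methodology; all figures below are placeholders, not measurements for this model):

```python
# Rough CO2eq estimate: power draw (kW) x hours used x regional carbon intensity.
gpu_power_kw = 0.3       # e.g. one ~300 W accelerator (placeholder)
hours_used = 24.0        # total training time (placeholder)
kg_co2_per_kwh = 0.4     # carbon intensity of the compute region (placeholder)

emissions_kg = gpu_power_kw * hours_used * kg_co2_per_kwh
print(f"~{emissions_kg:.1f} kg CO2eq")  # ~2.9 kg CO2eq
```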
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_ebola_4_2e-5_16_undersampling_0.4 | isspek | 2024-11-23T23:53:51Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:53:36Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_covid_zika_ebola_1_2e-5_16 | isspek | 2024-11-23T23:51:56Z | 195 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:51:41Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_covid_zika_ebola_5_2e-5_16 | isspek | 2024-11-23T23:51:16Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:50:57Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_zika_top3_ebola_top3_2_2e-5_16 | isspek | 2024-11-23T23:49:55Z | 160 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:49:41Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_ebola_2_2e-5_16_undersampling_0.3 | isspek | 2024-11-23T23:49:21Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:49:06Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_covid_llama_zika_llama_ebola_llama_5_2e-5_16 | isspek | 2024-11-23T23:47:32Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:47:15Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_covid_llama_zika_llama_ebola_llama_1_2e-5_16 | isspek | 2024-11-23T23:46:57Z | 179 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:46:43Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_zika_chatgpt_ebola_chatgpt_2_2e-5_16 | isspek | 2024-11-23T23:45:40Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:45:21Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_ebola_3_2e-5_16_undersampling_0.2 | isspek | 2024-11-23T23:43:48Z | 179 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:43:33Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_covid_chatgpt_zika_chatgpt_ebola_chatgpt_5_2e-5_16 | isspek | 2024-11-23T23:40:09Z | 198 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:39:54Z | ---
library_name: transformers
tags: []
--- |
isspek/roberta-base_zika_ebola_3_2e-5_16 | isspek | 2024-11-23T23:34:44Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:34:24Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
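While the details above are pending, here is a minimal usage sketch assuming the standard 🤗 Transformers text-classification pipeline (the model id comes from this repo; the label names are defined by the model's own config):

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned RoBERTa sequence-classification head
# advertised in this repo's tags via the standard pipeline API.
classifier = pipeline(
    "text-classification",
    model="isspek/roberta-base_zika_ebola_3_2e-5_16",
)

# Returns the predicted label and its confidence score.
print(classifier("Health officials confirmed a new Ebola outbreak today."))
```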
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
hkhanuja3/finetuned_alpaca_v0.1 | hkhanuja3 | 2024-11-23T23:33:49Z | 75 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-23T23:24:39Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
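While the details above are pending, a minimal loading sketch assuming the standard 🤗 Transformers causal-LM API (the repo's tags advertise 4-bit bitsandbytes weights, so `bitsandbytes` must be installed; the Alpaca-style prompt below is an assumption based on the model name):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hkhanuja3/finetuned_alpaca_v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The saved 4-bit bitsandbytes quantization config is applied on load.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Hypothetical Alpaca-style prompt (an assumption from the repo name).
prompt = (
    "### Instruction:\nExplain what a language model is.\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```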
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_ebola_5_2e-5_16_undersampling_0.1 | isspek | 2024-11-23T23:32:06Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:31:51Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
TheSkullery/Q2.5-MS-Mistoria-72b-v2 | TheSkullery | 2024-11-23T23:30:29Z | 142 | 5 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"base_model:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"base_model:merge:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"base_model:Nexusflow/Athene-V2-Chat",
"base_model:merge:Nexusflow/Athene-V2-Chat",
"base_model:ZeusLabs/Chronos-Platinum-72B",
"base_model:merge:ZeusLabs/Chronos-Platinum-72B",
"base_model:shuttleai/shuttle-3",
"base_model:merge:shuttleai/shuttle-3",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-23T02:48:44Z | ---
base_model:
- ZeusLabs/Chronos-Platinum-72B
- Nexusflow/Athene-V2-Chat
- shuttleai/shuttle-3
- EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
library_name: transformers
tags:
- mergekit
- merge
---
<!DOCTYPE html>
<style>
body {
font-family: 'Quicksand', sans-serif;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
color: #D8DEE9;
margin: 0;
padding: 0;
font-size: 16px;
}
.container {
        width: 80%;
        max-width: 1080px;
margin: 20px auto;
background-color: rgba(255, 255, 255, 0.02);
padding: 20px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.1);
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0 0 20px 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.update-section {
margin-top: 30px;
}
.update-section h2 {
font-size: 24px;
color: #88C0D0;
}
.update-section p {
font-size: 16px;
line-height: 1.6;
color: #ECEFF4;
}
.info img {
width: 100%;
border-radius: 10px;
margin-bottom: 15px;
}
a {
color: #88C0D0;
text-decoration: none;
}
a:hover {
color: #A3BE8C;
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
}
.button:hover {
background-color: #81A1C1;
}
pre {
background-color: #2E3440;
padding: 10px;
border-radius: 5px;
overflow-x: auto;
}
code {
font-family: 'Courier New', monospace;
color: #D8DEE9;
}
</style>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Q2.5-MS-Mistoria-72b-v2 Data Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
</head>
<body>
<div class="container">
<div class="header">
<h1>Q2.5-MS-Mistoria-72b-v2</h1>
</div>
<div class="info">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/5LOvUFYiMMw6pcEsOhmo2.webp">
<p>Now the cute anime girl has your attention</p>
<p><strong>Creator:</strong> <a href="https://huggingface.co/Steelskull" target="_blank">SteelSkull</a></p>
<h1>About Mistoria-72b-v2:</h1>
<pre><code>Name Legend:
Q2.5 = Qwen 2.5
MS = Model Stock
72B = it's 72B
v2 = it's the second version
</code></pre>
<p>This model is my second attempt at a 72B model. As usual, my goal is to merge the robust storytelling of multiple models while attempting to maintain intelligence.</p>
<p>Use the Qwen prompt format.</p>
<h2>Quants: (List of badasses)</h2>
<p>GGUF Quant: </p>
<p> - bartowski: <a href="https://huggingface.co/bartowski/Q2.5-MS-Mistoria-72b-v2-GGUF" target="_blank"> Combined-GGUF </a></p>
<p> - mradermacher: <a href="https://huggingface.co/mradermacher/Q2.5-MS-Mistoria-72b-v2-GGUF" target="_blank"> GGUF </a>// <a href="https://huggingface.co/mradermacher/Q2.5-MS-Mistoria-72b-v2-i1-GGUF" target="_blank"> Imat-GGUF </a></p>
<h3>Config:</h3>
<pre><code>MODEL_NAME = "Q2.5-MS-Mistoria-72b-v2"
base_model: Nexusflow/Athene-V2-Chat
merge_method: model_stock
dtype: bfloat16
models:
- model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
- model: ZeusLabs/Chronos-Platinum-72B
- model: shuttleai/shuttle-3
</code></pre>
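<p>To reproduce the merge locally, the config above can be saved to a YAML file and passed to the mergekit CLI, e.g. <code>mergekit-yaml mistoria-v2.yml ./Q2.5-MS-Mistoria-72b-v2 --cuda</code> (a sketch assuming a recent <a href="https://github.com/arcee-ai/mergekit" target="_blank">mergekit</a>; exact flags may vary by version).</p>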
<p><strong>If you wish to support:</strong></p>
</div>
<div class="donation-section">
<a href="https://ko-fi.com/Y8Y0AO2XE" target="_blank">
<img height="36" style="border:0px;height:36px;" src="https://storage.ko-fi.com/cdn/kofi2.png?v=3" border="0" alt="Buy Me a Coffee at ko-fi.com" />
</a>
</div>
</div>
</div>
</body>
</html> |
SteelStorage/Q2.5-MS-Mistoria-72b-v2 | SteelStorage | 2024-11-23T23:30:29Z | 144 | 5 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"mergekit",
"merge",
"conversational",
"base_model:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"base_model:merge:EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2",
"base_model:Nexusflow/Athene-V2-Chat",
"base_model:merge:Nexusflow/Athene-V2-Chat",
"base_model:ZeusLabs/Chronos-Platinum-72B",
"base_model:merge:ZeusLabs/Chronos-Platinum-72B",
"base_model:shuttleai/shuttle-3",
"base_model:merge:shuttleai/shuttle-3",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-23T02:48:44Z | ---
base_model:
- ZeusLabs/Chronos-Platinum-72B
- Nexusflow/Athene-V2-Chat
- shuttleai/shuttle-3
- EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
library_name: transformers
tags:
- mergekit
- merge
---
<!DOCTYPE html>
<style>
body {
font-family: 'Quicksand', sans-serif;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
color: #D8DEE9;
margin: 0;
padding: 0;
font-size: 16px;
}
.container {
        width: 80%;
        max-width: 1080px;
margin: 20px auto;
background-color: rgba(255, 255, 255, 0.02);
padding: 20px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.1);
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0 0 20px 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.update-section {
margin-top: 30px;
}
.update-section h2 {
font-size: 24px;
color: #88C0D0;
}
.update-section p {
font-size: 16px;
line-height: 1.6;
color: #ECEFF4;
}
.info img {
width: 100%;
border-radius: 10px;
margin-bottom: 15px;
}
a {
color: #88C0D0;
text-decoration: none;
}
a:hover {
color: #A3BE8C;
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
}
.button:hover {
background-color: #81A1C1;
}
pre {
background-color: #2E3440;
padding: 10px;
border-radius: 5px;
overflow-x: auto;
}
code {
font-family: 'Courier New', monospace;
color: #D8DEE9;
}
</style>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Q2.5-MS-Mistoria-72b-v2 Data Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
</head>
<body>
<div class="container">
<div class="header">
<h1>Q2.5-MS-Mistoria-72b-v2</h1>
</div>
<div class="info">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/5LOvUFYiMMw6pcEsOhmo2.webp">
<p>Now the cute anime girl has your attention</p>
<p><strong>Creator:</strong> <a href="https://huggingface.co/Steelskull" target="_blank">SteelSkull</a></p>
<h1>About Mistoria-72b-v2:</h1>
<pre><code>Name Legend:
Q2.5 = Qwen 2.5
MS = Model Stock
72B = it's 72B
v2 = it's the second version
</code></pre>
<p>This model is my second attempt at a 72B model. As usual, my goal is to merge the robust storytelling of multiple models while attempting to maintain intelligence.</p>
<p>Use the Qwen prompt format.</p>
<h2>Quants: (List of badasses)</h2>
<p>GGUF Quant: </p>
<p> - bartowski: <a href="https://huggingface.co/bartowski/Q2.5-MS-Mistoria-72b-v2-GGUF" target="_blank"> Combined-GGUF </a></p>
<p> - mradermacher: <a href="https://huggingface.co/mradermacher/Q2.5-MS-Mistoria-72b-v2-GGUF" target="_blank"> GGUF </a>// <a href="https://huggingface.co/mradermacher/Q2.5-MS-Mistoria-72b-v2-i1-GGUF" target="_blank"> Imat-GGUF </a></p>
<h3>Config:</h3>
<pre><code>MODEL_NAME = "Q2.5-MS-Mistoria-72b-v2"
base_model: Nexusflow/Athene-V2-Chat
merge_method: model_stock
dtype: bfloat16
models:
- model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2
- model: ZeusLabs/Chronos-Platinum-72B
- model: shuttleai/shuttle-3
</code></pre>
<p><strong>If you wish to support:</strong></p>
</div>
<div class="donation-section">
<a href="https://ko-fi.com/Y8Y0AO2XE" target="_blank">
<img height="36" style="border:0px;height:36px;" src="https://storage.ko-fi.com/cdn/kofi2.png?v=3" border="0" alt="Buy Me a Coffee at ko-fi.com" />
</a>
</div>
</div>
</div>
</body>
</html> |
isspek/roberta-base_covid_top3_zika_top3_ebola_top3_1_2e-5_16 | isspek | 2024-11-23T23:29:18Z | 178 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:29:03Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_zika_top3_ebola_top3_3_2e-5_16 | isspek | 2024-11-23T23:27:09Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:26:53Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_ebola_5_2e-5_16_undersampling_0.4 | isspek | 2024-11-23T23:23:30Z | 195 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:23:09Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_zika_chatgpt_ebola_chatgpt_5_2e-5_16 | isspek | 2024-11-23T23:21:29Z | 178 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:21:15Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_ebola_4_2e-5_16_undersampling_0.2 | isspek | 2024-11-23T23:20:08Z | 161 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:19:52Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
isspek/roberta-base_zika_llama_ebola_llama_2_2e-5_16 | isspek | 2024-11-23T23:19:20Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:19:05Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
tencent-community/Hunyuan-A52B-Instruct | tencent-community | 2024-11-23T23:17:21Z | 18 | 16 | transformers | [
"transformers",
"safetensors",
"hunyuan",
"text-generation",
"conversational",
"custom_code",
"en",
"arxiv:2411.02265",
"autotrain_compatible",
"region:us"
] | text-generation | 2024-11-06T04:35:59Z | ---
license_link: https://huggingface.co/tencent/Tencent-Hunyuan-Large/blob/main/LICENSE.txt
language:
- en
pipeline_tag: text-generation
library_name: transformers
---
The original repo is here: https://huggingface.co/tencent/Tencent-Hunyuan-Large
This is the Hunyuan-A52B-Instruct model converted to the Hugging Face safetensors format.
Thanks to @awni for converting it to MLX!
It runs at over 30 tokens per second (TPS) on an M2 Ultra with 192 GB of memory!
https://huggingface.co/mlx-community/Hunyuan-A52B-Instruct-3bit
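For Apple-silicon users, a minimal generation sketch with the 3-bit MLX conversion linked above (assumes the `mlx-lm` package is installed; the repo id is as published by mlx-community):

```python
from mlx_lm import load, generate

# Load the community 3-bit MLX conversion (quantized for Apple silicon).
model, tokenizer = load("mlx-community/Hunyuan-A52B-Instruct-3bit")

# Generate a short completion; verbose=True also reports tokens/sec,
# which is where the ~30 TPS figure above comes from.
response = generate(
    model,
    tokenizer,
    prompt="Briefly explain what a Mixture-of-Experts model is.",
    verbose=True,
)
```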
<p align="center">
<img src="https://dscache.tencent-cloud.cn/upload/uploader/hunyuan-64b418fd052c033b228e04bc77bbc4b54fd7f5bc.png" width="400"/> <br>
</p><p></p>
### Model Introduction
With the rapid development of artificial intelligence technology, large language models (LLMs) have made significant progress in fields such as natural language processing, computer vision, and scientific tasks. However, as the scale of these models increases, optimizing resource consumption while maintaining high performance has become a key challenge. To address this challenge, we have explored Mixture of Experts (MoE) models. The newly unveiled Hunyuan-Large (Hunyuan-MoE-A52B) model is currently the largest open-source Transformer-based MoE model in the industry, featuring a total of 389 billion parameters and 52 billion active parameters.
By open-sourcing the Hunyuan-Large model and revealing related technical details, we hope to inspire more researchers with innovative ideas and collectively advance the progress and application of AI technology. We welcome you to join our open-source community to explore and optimize future AI models together!
### Introduction to Model Technical Advantages
#### Model
- **High-Quality Synthetic Data**: By enhancing training with synthetic data, Hunyuan-Large can learn richer representations, handle long-context inputs, and generalize better to unseen data.
- **KV Cache Compression**: Utilizes Grouped Query Attention (GQA) and Cross-Layer Attention (CLA) strategies to significantly reduce the memory usage and computational overhead of KV caches, improving inference throughput (see the first sketch after this list).
- **Expert-Specific Learning Rate Scaling**: Sets different learning rates for different experts to ensure each sub-model learns effectively from the data and contributes to overall performance (see the second sketch after this list).
- **Long-Context Processing Capability**: The pre-trained model supports text sequences up to 256K, and the Instruct model supports up to 128K, significantly enhancing the ability to handle long-context tasks.
- **Extensive Benchmarking**: Conducts extensive experiments across various languages and tasks to validate the practical effectiveness and safety of Hunyuan-Large.
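To make the GQA-based KV-cache saving concrete, here is a small, self-contained sketch. The head counts are hypothetical rather than Hunyuan-Large's actual configuration, and this is a toy illustration of the mechanism, not the model's implementation:

```python
# Illustrative GQA cache sizing — hypothetical head counts, toy tensors.
import torch

batch, seq_len, head_dim = 1, 1024, 128
n_query_heads = 64   # queries keep the full head count
n_kv_heads = 8       # keys/values share heads across groups of queries

# Standard MHA caches K and V for all 64 heads; GQA caches only the 8
# shared KV heads — an 8x reduction in cache memory for this configuration.
bytes_per_val = 2  # fp16
mha_cache = 2 * batch * n_query_heads * seq_len * head_dim * bytes_per_val
gqa_cache = 2 * batch * n_kv_heads * seq_len * head_dim * bytes_per_val
print(f"MHA KV cache: {mha_cache / 2**20:.1f} MiB, GQA: {gqa_cache / 2**20:.1f} MiB")

# At attention time, each group of 64 // 8 = 8 query heads shares one cached
# KV head, e.g. by expanding the cached tensors along the head dimension:
q = torch.randn(batch, n_query_heads, seq_len, head_dim)
k = torch.randn(batch, n_kv_heads, seq_len, head_dim)
v = torch.randn(batch, n_kv_heads, seq_len, head_dim)
k = k.repeat_interleave(n_query_heads // n_kv_heads, dim=1)
v = v.repeat_interleave(n_query_heads // n_kv_heads, dim=1)
out = torch.nn.functional.scaled_dot_product_attention(q, k, v)
print(out.shape)  # (1, 64, 1024, 128)
```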
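And a hypothetical sketch of expert-specific learning rates using optimizer parameter groups — the scaling rule below (square root of routing frequency) is an illustrative placeholder, not the rule used to train Hunyuan-Large:

```python
# Hypothetical expert-specific LR scaling via optimizer parameter groups.
import torch

class TinyMoE(torch.nn.Module):
    def __init__(self, d=32, n_experts=4):
        super().__init__()
        self.router = torch.nn.Linear(d, n_experts)
        self.experts = torch.nn.ModuleList(
            torch.nn.Linear(d, d) for _ in range(n_experts)
        )

model = TinyMoE()
base_lr = 1e-4
expert_freq = [0.4, 0.3, 0.2, 0.1]  # made-up routing frequencies

param_groups = [{"params": model.router.parameters(), "lr": base_lr}]
for expert, freq in zip(model.experts, expert_freq):
    # One possible rule: scale each expert's LR by the square root of how
    # often the router selects it, so rarely-used experts take smaller steps
    # on their few tokens. The paper's exact rule may differ.
    param_groups.append({"params": expert.parameters(), "lr": base_lr * freq ** 0.5})

optimizer = torch.optim.AdamW(param_groups)
print([g["lr"] for g in optimizer.param_groups])
```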
## Benchmark Evaluation
The **Hunyuan-Large pre-trained model** achieves the best overall performance compared to both Dense and MoE-based
competitors with similar activated parameter sizes. For aggregated benchmarks such as MMLU, MMLU-Pro, and CMMLU,
Hunyuan-Large consistently achieves the best performance, confirming its comprehensive abilities on aggregated tasks.
Hunyuan-Large also shows superior performance in commonsense understanding and reasoning, and classical NLP tasks
such as QA and reading comprehension tasks (e.g., CommonsenseQA, PIQA and TriviaQA).
In mathematics, Hunyuan-Large outperforms all baselines on the GSM8K and MATH datasets,
and also achieves the best results on the Chinese CMATH benchmark. We also observe that Hunyuan-Large achieves the overall
best performance across all Chinese tasks (e.g., CMMLU, C-Eval).
| Model | LLama3.1-405B | LLama3.1-70B | Mixtral-8x22B | DeepSeek-V2 | Hunyuan-Large |
|------------------|---------------|--------------|---------------|-------------|---------------|
| MMLU | 85.2 | 79.3 | 77.8 | 78.5 | **88.4** |
| MMLU-Pro | **61.6** | 53.8 | 49.5 | - | 60.2 |
| BBH | 85.9 | 81.6 | 78.9 | 78.9 | **86.3** |
| HellaSwag | - | - | **88.7** | 87.8 | 86.8 |
| CommonsenseQA | 85.8 | 84.1 | 82.4 | - | **92.9** |
| WinoGrande | 86.7 | 85.3 | 85.0 | 84.9 | **88.7** |
| PIQA | - | - | 83.6 | 83.7 | **88.3** |
| NaturalQuestions | - | - | 39.6 | 38.7 | **52.8** |
| DROP | 84.8 | 79.6 | 80.4 | 80.1 | **88.9** |
| ARC-C | **96.1** | 92.9 | 91.2 | 92.4 | 95.0 |
| TriviaQA | - | - | 82.1 | 79.9 | **89.2** |
| CMMLU | - | - | 60.0 | 84.0 | **90.2** |
| C-Eval | - | - | 59.6 | 81.7 | **91.9** |
| C3 | - | - | 71.4 | 77.4 | **82.3** |
| GSM8K | 89.0 | 83.7 | 83.7 | 79.2 | **92.8** |
| MATH | 53.8 | 41.4 | 42.5 | 43.6 | **69.8** |
| CMATH | - | - | 72.3 | 78.7 | **91.3** |
| HumanEval | 61.0 | 58.5 | 53.1 | 48.8 | **71.4** |
| MBPP | **73.4** | 68.6 | 64.2 | 66.6 | 72.6 |
**Hunyuan-Large-Instruct** achieves consistent improvements on most types of tasks compared to LLMs with similar
numbers of activated parameters, indicating the effectiveness of our post-training. Delving into the model performance
in different categories of benchmarks, we find that our instruct model achieves the best performance on the MMLU and MATH datasets.
Notably, on the MMLU dataset, our model demonstrates a significant improvement, outperforming the LLama3.1-405B model by 2.6%.
This enhancement is not just marginal but indicative of the Hunyuan-Large-Instruct’s superior understanding and reasoning
capabilities across a wide array of language understanding tasks. The model’s prowess is further underscored in its performance
on the MATH dataset, where it surpasses the LLama3.1-405B by a notable margin of 3.6%.
Remarkably, this leap in accuracy is achieved with only 52 billion activated parameters, underscoring the efficiency of our model.
| Model | LLama3.1 405B Inst. | LLama3.1 70B Inst. | Mixtral 8x22B Inst. | DeepSeekV2.5 Chat | Hunyuan-Large Inst. |
|----------------------|---------------------|--------------------|---------------------|-------------------|---------------------|
| MMLU | 87.3 | 83.6 | 77.8 | 80.4 | **89.9** |
| CMMLU | - | - | 61.0 | - | **90.4** |
| C-Eval | - | - | 60.0 | - | **88.6** |
| BBH | - | - | 78.4 | 84.3 | **89.5** |
| HellaSwag | - | - | 86.0 | **90.3** | 88.5 |
| ARC-C | **96.9** | 94.8 | 90.0 | - | 94.6 |
| GPQA_diamond | **51.1** | 46.7 | - | - | 42.4 |
| MATH | 73.8 | 68.0 | 49.8 | 74.7 | **77.4** |
| HumanEval | 89.0 | 80.5 | 75.0 | 89.0 | **90.0** |
| AlignBench | 6.0 | 5.9 | 6.2 | 8.0 | **8.3** |
| MT-Bench | 9.1 | 8.8 | 8.1 | 9.0 | **9.4** |
| IFEval strict-prompt | **86.0** | 83.6 | 71.2 | - | 85.0 |
| Arena-Hard | 69.3 | 55.7 | - | 76.2 | **81.8** |
| AlpacaEval-2.0 | 39.3 | 34.3 | 30.9 | 50.5 | **51.8** |
### Citation
If you find our work helpful, please cite it as follows.
```
@misc{sun2024hunyuanlargeopensourcemoemodel,
title={Hunyuan-Large: An Open-Source MoE Model with 52 Billion Activated Parameters by Tencent},
author={Xingwu Sun and Yanfeng Chen and Yiqing Huang and Ruobing Xie and Jiaqi Zhu and Kai Zhang and Shuaipeng Li and Zhen Yang and Jonny Han and Xiaobo Shu and Jiahao Bu and Zhongzhi Chen and Xuemeng Huang and Fengzong Lian and Saiyong Yang and Jianfeng Yan and Yuyuan Zeng and Xiaoqin Ren and Chao Yu and Lulu Wu and Yue Mao and Tao Yang and Suncong Zheng and Kan Wu and Dian Jiao and Jinbao Xue and Xipeng Zhang and Decheng Wu and Kai Liu and Dengpeng Wu and Guanghui Xu and Shaohua Chen and Shuang Chen and Xiao Feng and Yigeng Hong and Junqiang Zheng and Chengcheng Xu and Zongwei Li and Xiong Kuang and Jianglu Hu and Yiqi Chen and Yuchi Deng and Guiyang Li and Ao Liu and Chenchen Zhang and Shihui Hu and Zilong Zhao and Zifan Wu and Yao Ding and Weichao Wang and Han Liu and Roberts Wang and Hao Fei and Peijie She and Ze Zhao and Xun Cao and Hai Wang and Fusheng Xiang and Mengyuan Huang and Zhiyuan Xiong and Bin Hu and Xuebin Hou and Lei Jiang and Jiajia Wu and Yaping Deng and Yi Shen and Qian Wang and Weijie Liu and Jie Liu and Meng Chen and Liang Dong and Weiwen Jia and Hu Chen and Feifei Liu and Rui Yuan and Huilin Xu and Zhenxiang Yan and Tengfei Cao and Zhichao Hu and Xinhua Feng and Dong Du and Tinghao She and Yangyu Tao and Feng Zhang and Jianchen Zhu and Chengzhong Xu and Xirui Li and Chong Zha and Wen Ouyang and Yinben Xia and Xiang Li and Zekun He and Rongpeng Chen and Jiawei Song and Ruibin Chen and Fan Jiang and Chongqing Zhao and Bo Wang and Hao Gong and Rong Gan and Winston Hu and Zhanhui Kang and Yong Yang and Yuhong Liu and Di Wang and Jie Jiang},
year={2024},
eprint={2411.02265},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2411.02265},
}
```
|
isspek/roberta-base_covid_top3_zika_top3_ebola_top3_3_2e-5_16 | isspek | 2024-11-23T23:16:41Z | 196 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-23T23:16:26Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
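In the absence of an official snippet, a minimal sketch is given below. It assumes this checkpoint is a standard RoBERTa sequence-classification model usable through the transformers `pipeline` API; the label names it returns are model-specific and not documented in this card.

```python
# Hedged sketch: assumes a standard text-classification head; label semantics
# (which class means what) are not documented in this model card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="isspek/roberta-base_covid_top3_zika_top3_ebola_top3_3_2e-5_16",
)
print(classifier("New study reports a spike in Ebola cases."))
```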
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |