| Column | Type | Range / values |
|---|---|---|
| modelId | string | length 5 – 139 |
| author | string | length 2 – 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 – 2025-06-04 18:27:18 |
| downloads | int64 | 0 – 223M |
| likes | int64 | 0 – 11.7k |
| library_name | string | 468 classes |
| tags | sequence | length 1 – 4.05k |
| pipeline_tag | string | 54 classes |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 – 2025-06-04 18:26:45 |
| card | string | length 11 – 1.01M |
ngocminhta/output | ngocminhta | 2024-11-14T09:03:45Z | 47 | 0 | transformers | [
"transformers",
"pytorch",
"bert",
"text-generation",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T09:03:29Z | ---
library_name: transformers
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: output
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# output
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Tokenizers 0.20.3
|
nishhaaann/biomed_roberta_finetuned | nishhaaann | 2024-11-14T09:01:58Z | 180 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T07:27:40Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
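Pending the authors' own snippet, a minimal sketch using the standard 🤗 transformers pipeline might look like the following (the task is taken from this repo's `text-classification` tag; the input sentence is illustrative and the label set is whatever the checkpoint defines):
```python
from transformers import pipeline

# Hypothetical quick-start: load the checkpoint as a text-classification pipeline.
classifier = pipeline("text-classification", model="nishhaaann/biomed_roberta_finetuned")

# Illustrative biomedical input; the actual labels depend on how the model was fine-tuned.
print(classifier("The patient was prescribed metformin for type 2 diabetes."))
```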
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mradermacher/QandoraExp-7B-Persona-GGUF | mradermacher | 2024-11-14T08:58:00Z | 14 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:bunnycore/QandoraExp-7B-Persona",
"base_model:quantized:bunnycore/QandoraExp-7B-Persona",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T05:43:28Z | ---
base_model: bunnycore/QandoraExp-7B-Persona
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/bunnycore/QandoraExp-7B-Persona
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/QandoraExp-7B-Persona-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
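As a hedged illustration (the filenames below are hypothetical; the quants in this repo are single-file, but multi-part quants from this author use a `.partXofY` suffix), concatenation is a plain byte-wise `cat`:
```
# Hypothetical multi-part quant: join the parts in order into a single GGUF file.
cat QandoraExp-7B-Persona.Q8_0.gguf.part1of2 \
    QandoraExp-7B-Persona.Q8_0.gguf.part2of2 \
    > QandoraExp-7B-Persona.Q8_0.gguf
```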
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q2_K.gguf) | Q2_K | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q3_K_S.gguf) | Q3_K_S | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q3_K_L.gguf) | Q3_K_L | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.IQ4_XS.gguf) | IQ4_XS | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q4_K_S.gguf) | Q4_K_S | 4.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q4_K_M.gguf) | Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q5_K_S.gguf) | Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q5_K_M.gguf) | Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q6_K.gguf) | Q6_K | 6.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/QandoraExp-7B-Persona-GGUF/resolve/main/QandoraExp-7B-Persona.Q8_0.gguf) | Q8_0 | 8.2 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF | mradermacher | 2024-11-14T08:54:26Z | 12 | 0 | transformers | [
"transformers",
"gguf",
"merge",
"en",
"base_model:rombodawg/EveryoneLLM-7b-Gemma-Base",
"base_model:quantized:rombodawg/EveryoneLLM-7b-Gemma-Base",
"license:other",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-14T05:22:37Z | ---
base_model: rombodawg/EveryoneLLM-7b-Gemma-Base
language:
- en
library_name: transformers
license: other
license_link: https://ai.google.dev/gemma/terms
license_name: gemma-terms-of-use
quantized_by: mradermacher
tags:
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/rombodawg/EveryoneLLM-7b-Gemma-Base
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ1_S.gguf) | i1-IQ1_S | 2.3 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ1_M.gguf) | i1-IQ1_M | 2.4 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ2_S.gguf) | i1-IQ2_S | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ2_M.gguf) | i1-IQ2_M | 3.2 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q2_K.gguf) | i1-Q2_K | 3.6 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ3_S.gguf) | i1-IQ3_S | 4.1 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q3_K_S.gguf) | i1-Q3_K_S | 4.1 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ3_M.gguf) | i1-IQ3_M | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.5 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 5.1 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 5.1 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 5.1 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q4_0.gguf) | i1-Q4_0 | 5.1 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q4_K_S.gguf) | i1-Q4_K_S | 5.1 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q5_K_S.gguf) | i1-Q5_K_S | 6.1 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q5_K_M.gguf) | i1-Q5_K_M | 6.2 | |
| [GGUF](https://huggingface.co/mradermacher/EveryoneLLM-7b-Gemma-Base-i1-GGUF/resolve/main/EveryoneLLM-7b-Gemma-Base.i1-Q6_K.gguf) | i1-Q6_K | 7.1 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
adriansanz/intent_analysis_2labels | adriansanz | 2024-11-14T08:48:45Z | 114 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T08:47:40Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: intent_analysis
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# intent_analysis
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0083
- Accuracy: 0.9990
- Precision: 0.9990
- Recall: 0.9990
- F1: 0.9990
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
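These settings map roughly onto the following 🤗 `TrainingArguments` (a hedged sketch, not the authors' actual training script; `output_dir` is a placeholder and the optimizer keyword assumes transformers 4.46 defaults):
```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the list above.
args = TrainingArguments(
    output_dir="intent_analysis",   # placeholder
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",            # betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=3,
)
```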
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log | 1.0 | 76 | 0.0166 | 0.9971 | 0.9971 | 0.9971 | 0.9971 |
| No log | 2.0 | 152 | 0.0074 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| No log | 3.0 | 228 | 0.0083 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Masakazu1999/model | Masakazu1999 | 2024-11-14T08:47:34Z | 33 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-14T08:41:57Z | ---
base_model: unsloth/qwen2.5-32b-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** Masakazu1999
- **License:** apache-2.0
- **Finetuned from model:** unsloth/qwen2.5-32b-instruct-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
Arthur-Tsai/your_model_name | Arthur-Tsai | 2024-11-14T08:38:44Z | 46 | 0 | transformers | [
"transformers",
"safetensors",
"custom-model",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2024-11-14T08:36:16Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
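Pending the authors' instructions, and given this repo's `custom-model` tag, loading presumably goes through `trust_remote_code` (a hedged sketch; it only works if the repo ships its modeling code with an `auto_map` in the config):
```python
from transformers import AutoModel

# Hypothetical quick-start for a custom architecture hosted on the Hub.
model = AutoModel.from_pretrained("Arthur-Tsai/your_model_name", trust_remote_code=True)
print(model.config)
```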
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
FractalGPT/RuQwen2.5-3B-Instruct-AWQ | FractalGPT | 2024-11-14T08:37:04Z | 307 | 5 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"ru",
"en",
"base_model:Qwen/Qwen2.5-3B-Instruct-AWQ",
"base_model:quantized:Qwen/Qwen2.5-3B-Instruct-AWQ",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"awq",
"region:us"
] | text-generation | 2024-11-11T23:32:02Z | ---
base_model:
- Qwen/Qwen2.5-3B-Instruct-AWQ
language:
- ru
- en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---
---
## FractalGPT/RuQwen2.5-3B-Instruct-AWQ
---
### Our products
- [FractalGPT](https://fractalgpt.ru)
### Model Overview
- **RuQwen2.5-3B-Instruct-AWQ** by FractalGPT is a language model tailored to deliver high-quality Russian language output. Building upon the Qwen2.5 series, it is optimized for Russian-language tasks while retaining broad multilingual support.
- **Improved Russian Language Quality**: Adaptations have significantly enhanced the fluency, accuracy, and coherence of Russian text generation, making it an excellent choice for Russian-language applications.
### Model Specifications
- **Type**: Instruction-tuned Causal Language Model
- **Training Stages**: Pretraining & Instruction Tuning
- **Architecture**: Transformer with RoPE, SwiGLU, RMSNorm, and Attention QKV bias
- **Layers**: 36
- **Attention Heads (GQA)**: 16 for Q and 2 for KV
- **Context Length**: Supports a full context of 131,072 tokens and generation of up to 8,192 tokens
- **Quantization**: AWQ 4 bit
- **Base model**: Qwen/Qwen2.5-3B-Instruct-AWQ
### Requirements
The code for Qwen2.5 is included in the latest Hugging Face transformers, and we advise you to use the latest version of transformers.
* [**Run the example in Colab**](https://colab.research.google.com/drive/1gQmvUVUmnepi6i2As67bHoscnRl6uwKY?usp=sharing)
```
pip install autoawq -q
pip install --upgrade torch -q
pip install --upgrade transformers -q
```
With transformers<4.37.0, you will encounter the following error:
```
KeyError: 'qwen2'
```
Also check out our [AWQ documentation](https://qwen.readthedocs.io/en/latest/quantization/awq.html) for a more detailed usage guide.
With pytorch<2.4.0, you will encounter the following error:
```
AttributeError: module 'torch.library' has no attribute 'register_fake'
```
### Quickstart
We use a custom `RuQwen2ForCausalLM` class to work with this model:
```python
from transformers import Qwen2ForCausalLM, AutoConfig, AutoTokenizer
import torch

class RuQwen2ForCausalLM(Qwen2ForCausalLM):
    def __init__(self, config):
        super().__init__(config)
        if hasattr(self, "lm_head") and isinstance(self.lm_head, torch.nn.Linear):
            if self.lm_head.bias is None:
                self.config.add_bias_to_lm_head = True
                self._add_bias_to_lm_head()

    def _add_bias_to_lm_head(self):
        """Adds a bias to lm_head if it does not have one."""
        old_lm_head = self.lm_head
        # lm_head with bias
        self.lm_head = torch.nn.Linear(
            old_lm_head.in_features,
            old_lm_head.out_features,
            dtype=self.model.dtype,
            bias=True,
        )
        with torch.no_grad():
            self.lm_head.weight = old_lm_head.weight
            torch.nn.init.zeros_(self.lm_head.bias)

    @classmethod
    def from_pretrained(cls, model_name, *args, **kwargs):
        # Loads the model with its configuration
        model = super().from_pretrained(model_name, *args, **kwargs)
        if hasattr(model.config, "add_bias_to_lm_head") and not model.config.add_bias_to_lm_head:
            model._add_bias_to_lm_head()
        return model

    def save_pretrained(self, save_directory, *args, **kwargs):
        self.config.add_bias_to_lm_head = self.lm_head.bias is not None
        super().save_pretrained(save_directory, *args, **kwargs)
```
The following snippet uses apply_chat_template to show how to load the tokenizer and model and how to generate content.
```python
def generate(messages):
    input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt").to(model.device)
    output = model.generate(input_ids,
                            max_new_tokens=1024,
                            do_sample=False,
                            temperature=None,
                            top_k=None,
                            top_p=None)
    generated_text = tokenizer.decode(output[0], skip_special_tokens=False)  # .split('<|im_start|>assistant')[1]
    return generated_text

model_name = 'FractalGPT/RuQwen2.5-3B-Instruct-AWQ'
model = RuQwen2ForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "Классификация медицинских терминов"  # "Classification of medical terms"
messages = [
    {"role": "system", "content": "You are RuQwen, created by FractalGPT. You are a helpful assistant."},
    {"role": "user", "content": prompt}
]
print(generate(messages))
```
### Example of improvements
Question: 'Классификация медицинских терминов' ('Classification of medical terms')
The original model's answer:
```
Классификация медицинских терминов играет ключевую роль в медицине и информационных системах. Вот несколько основных подходов к классификации медицинских терминов:
1. **International Classification of Diseases (ICD)**: Это стандартная система классификации заболеваний и причин смерти, разработанная ВОЗ. Она используется во многих странах для классификации медицинских случаев.
2. **Medical Subject Headings (MeSH)**: Это система классификации медицинской литературы, разработанная National Library of Medicine (NLM). MeSH помогает организовать информацию о медицинских исследованиях и статьях.
3. **Systematized Nomenclature of Medicine (SNOMED)**: SNOMED International предоставляет глубоко специализированную систему классификации медицинских терминов. Она предназначена для использования в больницах, клиниках и других медицинских учреждениях.
4. **Unified Medical Language System (UMLS)**: Это расширяемая система классификации медицинских терминов, которая объединяет данные из различных источников, таких как ICD-10, MeSH и другие.
5. **Clinical Terms**: Это система классификации медицинских терминов, разработанная для использования в клинических системах управления пациентами (EHR).
6. **Pharmaceutical Product Names (PPNs)**: Это система классификации названий лекарственных препаратов, разработанная для обеспечения уникальности и стандартизации названий.
Каждая из этих систем имеет свои особенности и области применения. Например, ICD主要用于疾病和死亡原因的分类,而MeSH主要用于医学文献的分类。SNOMED则提供更详细的医疗术语分类,适用于临床应用。
Также существуют специализированные системы для конкретных областей медицины, такие как терминология для хирургии, терапии, педиатрии и т.д.
Классификация медицинских терминов позволяет упорядочить информацию, сделать ее доступной для широкого круга пользователей и обеспечивает стандартизацию терминологии, что важно для точного понимания и эффективного общения между врачами, пациентами и медицинскими работниками.
```
Our model's answer:
```
Классификация медицинских терминов играет ключевую роль в медицине и информационных системах. Вот несколько основных подходов к классификации медицинских терминов:
1. **International Classification of Diseases (ICD)**: Это стандартная система классификации заболеваний и причин смерти, разработанная ВОЗ. Она используется во многих странах для классификации медицинских случаев.
2. **Medical Subject Headings (MeSH)**: Это система классификации медицинской литературы, разработанная National Library of Medicine (NLM). MeSH помогает организовать информацию о медицинских исследованиях и статьях.
3. **Systematized Nomenclature of Medicine (SNOMED)**: SNOMED International предоставляет глубоко специализированную систему классификации медицинских терминов. Он включает термины для диагностики, лечения, тестирования и других аспектов медицинского обслуживания.
4. **Unified Medical Language System (UMLS)**: Это система метаданных, которая объединяет терминологические данные из различных источников, таких как ICD-10, MeSH и другие системы классификации.
5. **Clinical Terms**: Это терминология, используемая в клинической практике, которая может включать термины из SNOMED, ICD и других систем.
6. **Pharmacopoeias**: Эти документы обычно содержат классификацию лекарственных препаратов и их применения.
Каждая из этих систем имеет свои особенности и области применения. Например, ICD используется для классификации заболеваний и причин смерти, в то время как MeSH и SNOMED используются для классификации медицинской информации и терминологии соответственно.
Также существуют более специализированные системы классификации, такие как терминология для конкретных областей медицины (например, терминология для хирургии или терминология для педиатрии).
Классификация медицинских терминов позволяет упорядочить информацию, сделать ее доступной для широкого круга пользователей и обеспечивает стандартизацию терминологии, что важно для эффективного общения между врачами, пациентами и другими участниками медицинского процесса.
``` |
adriansanz/intent_analysis-3label-xml | adriansanz | 2024-11-14T08:29:30Z | 106 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-13T15:09:14Z | ---
library_name: transformers
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: intent_analysis
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# intent_analysis
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0750
- Accuracy: 0.9781
- Precision: 0.9782
- Recall: 0.9781
- F1: 0.9781
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log | 1.0 | 114 | 0.1457 | 0.9459 | 0.9498 | 0.9459 | 0.9456 |
| No log | 2.0 | 228 | 0.0754 | 0.9774 | 0.9777 | 0.9774 | 0.9775 |
| No log | 3.0 | 342 | 0.0750 | 0.9781 | 0.9782 | 0.9781 | 0.9781 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
hezarai/bert-fa-mask-filling | hezarai | 2024-11-14T08:26:47Z | 4 | 0 | hezar | [
"hezar",
"fill-mask",
"fa",
"base_model:hezarai/bert-base-fa",
"base_model:finetune:hezarai/bert-base-fa",
"region:us"
] | fill-mask | 2024-01-01T16:41:22Z | ---
language:
- fa
library_name: hezar
tags:
- fill-mask
- hezar
pipeline_tag: fill-mask
base_model:
- hezarai/bert-base-fa
---
A BERT model for mask filling. The base BERT model is initialized from https://huggingface.co/HooshvareLab/bert-fa-zwnj-base.
## Usage
```
pip install hezar
```
```python
from hezar.models import Model
model = Model.load("hezarai/bert-fa-mask-filling")
inputs = ["ایشالا از [MASK] شروع میکنم"]  # roughly: "Hopefully I'll start from [MASK]"
outputs = model.predict(inputs)
print(outputs)
``` |
hezarai/gpt2-fa-summarization-xlsum | hezarai | 2024-11-14T08:26:42Z | 9 | 0 | hezar | [
"hezar",
"text-generation",
"fa",
"region:us"
] | text-generation | 2023-11-14T02:53:56Z | ---
language:
- fa
library_name: hezar
tags:
- text-generation
- hezar
pipeline_tag: text-generation
---
|
hezarai/whisper-small | hezarai | 2024-11-14T08:26:26Z | 10 | 1 | hezar | [
"hezar",
"automatic-speech-recognition",
"region:us"
] | automatic-speech-recognition | 2023-08-13T07:20:50Z | ---
library_name: hezar
tags:
- automatic-speech-recognition
- hezar
pipeline_tag: automatic-speech-recognition
---
|
hezarai/bert-fa-ner-arman | hezarai | 2024-11-14T08:26:23Z | 35 | 0 | hezar | [
"hezar",
"token-classification",
"fa",
"dataset:hezarai/arman-ner",
"base_model:hezarai/bert-base-fa",
"base_model:finetune:hezarai/bert-base-fa",
"region:us"
] | token-classification | 2023-08-04T12:04:23Z | ---
language:
- fa
library_name: hezar
tags:
- token-classification
- hezar
datasets:
- hezarai/arman-ner
pipeline_tag: token-classification
base_model:
- hezarai/bert-base-fa
---
ParsBERT model trained on the [Arman-NER](https://huggingface.co/datasets/hezarai/arman-ner) dataset for Named Entity Recognition.
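No usage snippet ships with this card; following the pattern in the sibling hezarai/bert-fa-mask-filling card above, loading it with Hezar presumably looks like this (a hedged sketch; the input sentence is illustrative):
```python
from hezar.models import Model

model = Model.load("hezarai/bert-fa-ner-arman")
outputs = model.predict(["تهران پایتخت ایران است"])  # "Tehran is the capital of Iran"
print(outputs)
```
|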
hezarai/roberta-fa-sentiment-dksf | hezarai | 2024-11-14T08:26:14Z | 2 | 0 | hezar | [
"hezar",
"text-classification",
"fa",
"base_model:hezarai/roberta-base-fa",
"base_model:finetune:hezarai/roberta-base-fa",
"region:us"
] | text-classification | 2023-04-02T09:46:39Z | ---
language:
- fa
library_name: hezar
tags:
- text-classification
- hezar
pipeline_tag: text-classification
base_model:
- hezarai/roberta-base-fa
--- |
hezarai/roberta-base-fa | hezarai | 2024-11-14T08:26:09Z | 4 | 0 | hezar | [
"hezar",
"feature-extraction",
"fa",
"region:us"
] | feature-extraction | 2023-03-01T09:46:13Z | ---
language:
- fa
library_name: hezar
tags:
- feature-extraction
- hezar
pipeline_tag: feature-extraction
---
|
hezarai/distilbert-fa-sentiment-dksf | hezarai | 2024-11-14T08:22:29Z | 12 | 0 | hezar | [
"hezar",
"text-classification",
"fa",
"base_model:hezarai/distilbert-base-fa",
"base_model:finetune:hezarai/distilbert-base-fa",
"region:us"
] | text-classification | 2023-02-02T11:30:32Z | ---
language:
- fa
library_name: hezar
tags:
- text-classification
- hezar
pipeline_tag: text-classification
base_model:
- hezarai/distilbert-base-fa
--- |
Ahanaas/HermesWithYou | Ahanaas | 2024-11-14T08:13:07Z | 8 | 0 | null | [
"safetensors",
"llama",
"en",
"base_model:NousResearch/Hermes-3-Llama-3.1-8B",
"base_model:finetune:NousResearch/Hermes-3-Llama-3.1-8B",
"license:mit",
"region:us"
] | null | 2024-11-14T07:15:45Z | ---
license: mit
language:
- en
base_model:
- NousResearch/Hermes-3-Llama-3.1-8B
---
## Inference
```py
!git clone https://github.com/huggingface/transformers.git
%cd transformers
!git checkout <commit_id_for_4.47.0.dev0>
!pip install .
!pip install -q accelerate==0.34.2 bitsandbytes==0.44.1 peft==0.13.1
```
#### Importing libs
```py
import os
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    pipeline,
    logging,
)
```
#### Bits&Bytes Config
```py
use_4bit = True

# Compute dtype for 4-bit base models
bnb_4bit_compute_dtype = "float16"
compute_dtype = getattr(torch, bnb_4bit_compute_dtype)

# Quantization type (fp4 or nf4)
bnb_4bit_quant_type = "nf4"

# Nested (double) quantization
use_nested_quant = False

bnb_config = BitsAndBytesConfig(
    load_in_4bit=use_4bit,
    bnb_4bit_quant_type=bnb_4bit_quant_type,
    bnb_4bit_compute_dtype=compute_dtype,
    bnb_4bit_use_double_quant=use_nested_quant,
)
```
#### Loading Model
```py
# Load base model
model_name = 'Ahanaas/HermesWithYou'
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map=0
)
```
#### Loading Tokenizer
```py
# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"
```
#### Predictions
```py
# Run a text-generation pipeline with our model
system_prompt = ''''''  # fill in your system prompt
prompt = ''''''         # fill in your user prompt

pipe = pipeline(
    task="text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=128,   # Increase this to allow for longer outputs
    temperature=0.5,      # Encourages more varied outputs
    top_k=50,             # Limits sampling to the top 50 tokens
    do_sample=True,       # Enables sampling
    return_full_text=True,
)

result = pipe(f"<|im_start|>system\n {system_prompt}\n<|im_end|>\n<|im_start|>user\n{prompt}\n<|im_end|>\n<|im_start|>assistant\n")

# Extract and print the generated text
generated_text = result[0]['generated_text']
print(generated_text)
```
|
creatoryoon/tr14_batch16_15000_mixed_precision_bf16 | creatoryoon | 2024-11-14T08:03:24Z | 28 | 0 | diffusers | [
"diffusers",
"tensorboard",
"safetensors",
"diffusers:StableDiffusionInstructPix2PixPipeline",
"region:us"
] | null | 2024-11-12T21:47:32Z | # Model list
# Per-dataset description of each model
# trJangguToTimpani: shall we try putting in just the janggu first? handmade (change A to B), janggu only, 20 samples
# tr0: handmade (change A to B) recognition data (initial data): janggu, yanggeum, etc.
# tr: handmade (change A to B) + reco (change Joseon to middle age)
# tr2: reco (change Joseon to middle age)
# tr3: handmade (change Joseon to middle age) + reco (change Joseon to middle age)
# tr4: reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr5: handmade (apply custom artistic style '!asacdls1jo673213@#*') + reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr6: handmade (change A to B)
# tr7: handmade (apply custom artistic style '!asacdls1jo673213@#*')
# tr8: handmade (change A to B) + reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr9: handmade (apply custom artistic style '!asacdls1jo673213@#*') + input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr10: input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr11: handmade (change A to B) + input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr12: handmade (apply custom artistic style '!asacdls1jo673213@#*') + reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr13: reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
# tr14: handmade (change A to B) + reviewed and input-adjusted reco (apply custom artistic style '!asacdls1jo673213@#*')
'''
trJangguToTimpani_batch4_chk100to1500_fp16 ** note: here checkpoints start at 100, step by 100, and end at 1500
creatoryoon/tr0_batch8_chk200to2000_mixed_precision_fp16 ** note: here checkpoints start at 200, step by 200, and end at 2000
creatoryoon/tr1_batch64_15000_mixed_precision_fp16
creatoryoon/tr2_batch16_15000_mixed_precision_bf16
creatoryoon/tr3_batch16_15000_mixed_precision_bf16
creatoryoon/tr4_batch64_15000_mixed_precision_fp16
creatoryoon/tr5_batch16_15000_mixed_precision_bf16
creatoryoon/tr5_batch64_15000_mixed_precision_fp16
creatoryoon/tr6_batch16_15000_mixed_precision_bf16
creatoryoon/tr7_batch16_15000_mixed_precision_fp32
creatoryoon/tr8_batch16_15000_mixed_precision_bf16
creatoryoon/tr9_batch16_15000_mixed_precision_fp32
creatoryoon/tr10_batch16_15000_mixed_precision_bf16
creatoryoon/tr11_batch16_15000_mixed_precision_bf16
creatoryoon/tr12_batch16_15000_mixed_precision_fp32
creatoryoon/tr13_batch16_15000_mixed_precision_fp32
creatoryoon/tr14_batch16_15000_mixed_precision_bf16
''' |
Whalejay/bert-sw_over_other_x2 | Whalejay | 2024-11-14T07:56:05Z | 115 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"question-answering",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | question-answering | 2024-11-14T07:55:38Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
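Pending the authors' own snippet, a minimal sketch using the 🤗 transformers question-answering pipeline might look like this (the task comes from this repo's `question-answering` tag; the question and context are illustrative):
```python
from transformers import pipeline

# Hypothetical quick-start: load the checkpoint as an extractive QA pipeline.
qa = pipeline("question-answering", model="Whalejay/bert-sw_over_other_x2")

result = qa(
    question="Who wrote the report?",
    context="The annual report was written by the finance team in March.",
)
print(result["answer"], result["score"])
```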
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf | RichardErkhov | 2024-11-14T07:47:13Z | 9 | 0 | null | [
"gguf",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T18:33:53Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
text-to-sql-finetuned-mistral-small-2409 - GGUF
- Model creator: https://huggingface.co/dalau627/
- Original model: https://huggingface.co/dalau627/text-to-sql-finetuned-mistral-small-2409/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [text-to-sql-finetuned-mistral-small-2409.Q2_K.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q2_K.gguf) | Q2_K | 7.7GB |
| [text-to-sql-finetuned-mistral-small-2409.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q3_K_S.gguf) | Q3_K_S | 8.98GB |
| [text-to-sql-finetuned-mistral-small-2409.Q3_K.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q3_K.gguf) | Q3_K | 10.02GB |
| [text-to-sql-finetuned-mistral-small-2409.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q3_K_M.gguf) | Q3_K_M | 10.02GB |
| [text-to-sql-finetuned-mistral-small-2409.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q3_K_L.gguf) | Q3_K_L | 10.92GB |
| [text-to-sql-finetuned-mistral-small-2409.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.IQ4_XS.gguf) | IQ4_XS | 11.22GB |
| [text-to-sql-finetuned-mistral-small-2409.Q4_0.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q4_0.gguf) | Q4_0 | 11.71GB |
| [text-to-sql-finetuned-mistral-small-2409.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.IQ4_NL.gguf) | IQ4_NL | 11.83GB |
| [text-to-sql-finetuned-mistral-small-2409.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q4_K_S.gguf) | Q4_K_S | 11.79GB |
| [text-to-sql-finetuned-mistral-small-2409.Q4_K.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q4_K.gguf) | Q4_K | 12.42GB |
| [text-to-sql-finetuned-mistral-small-2409.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q4_K_M.gguf) | Q4_K_M | 12.42GB |
| [text-to-sql-finetuned-mistral-small-2409.Q4_1.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q4_1.gguf) | Q4_1 | 12.99GB |
| [text-to-sql-finetuned-mistral-small-2409.Q5_0.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q5_0.gguf) | Q5_0 | 14.27GB |
| [text-to-sql-finetuned-mistral-small-2409.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q5_K_S.gguf) | Q5_K_S | 14.27GB |
| [text-to-sql-finetuned-mistral-small-2409.Q5_K.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q5_K.gguf) | Q5_K | 14.64GB |
| [text-to-sql-finetuned-mistral-small-2409.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q5_K_M.gguf) | Q5_K_M | 14.64GB |
| [text-to-sql-finetuned-mistral-small-2409.Q5_1.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q5_1.gguf) | Q5_1 | 15.56GB |
| [text-to-sql-finetuned-mistral-small-2409.Q6_K.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q6_K.gguf) | Q6_K | 17.0GB |
| [text-to-sql-finetuned-mistral-small-2409.Q8_0.gguf](https://huggingface.co/RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf/blob/main/text-to-sql-finetuned-mistral-small-2409.Q8_0.gguf) | Q8_0 | 22.02GB |
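As a hedged example (not part of the original card), any file above can be fetched and run locally; the commands below assume a recent `huggingface_hub` CLI and llama.cpp build, and the prompt is illustrative:
```
# Download one quant and run it with llama.cpp.
huggingface-cli download RichardErkhov/dalau627_-_text-to-sql-finetuned-mistral-small-2409-gguf \
    text-to-sql-finetuned-mistral-small-2409.Q4_K_M.gguf --local-dir .
./llama-cli -m text-to-sql-finetuned-mistral-small-2409.Q4_K_M.gguf \
    -p "Translate to SQL: list all customers from Berlin" -n 128
```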
Original model description:
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
som111/modelxx | som111 | 2024-11-14T07:46:40Z | 27 | 0 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"dataset:ruslanmv/ai-medical-chatbot",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-14T07:26:22Z | ---
base_model: unsloth/llama-3.2-3b-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
datasets:
- ruslanmv/ai-medical-chatbot
---
# Uploaded model
- **Developed by:** som111
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3.2-3b-instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library. A hedged local-inference sketch follows below.
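Because the repository ships GGUF weights, one way to try it locally is `llama-cpp-python`'s `from_pretrained` helper, sketched here under the assumption that a single quant file exists; the glob should be replaced with the exact filename if there are several.

```python
# Hedged sketch: requires llama-cpp-python >= 0.2.37 for Llama.from_pretrained.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="som111/modelxx",
    filename="*.gguf",  # glob pattern; replace with the exact quant file if ambiguous
    n_ctx=2048,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What are common symptoms of dehydration?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```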
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth) |
yjkim104906/gemma-2-2b-it-disaster-ko-tune | yjkim104906 | 2024-11-14T07:38:24Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"gguf",
"gemma2",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-12T04:50:30Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
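The tags mark this as a `gemma2` text-generation checkpoint, so a hedged starting point with `transformers` follows. Whether the base model's chat template survived the fine-tune is not confirmed by this card.

```python
# Hedged sketch: assumes a standard gemma-2 causal LM with an intact chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yjkim104906/gemma-2-2b-it-disaster-ko-tune"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

# Korean prompt: "Tell me what to do when an earthquake occurs."
messages = [{"role": "user", "content": "지진이 발생했을 때 해야 할 행동을 알려줘."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```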
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
linhqyy/jazzy-st-1411 | linhqyy | 2024-11-14T07:25:00Z | 16 | 0 | diffusers | [
"diffusers",
"flux",
"flux-diffusers",
"text-to-image",
"simpletuner",
"safe-for-work",
"lora",
"template:sd-lora",
"standard",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-14T03:59:56Z | ---
license: other
base_model: "black-forest-labs/FLUX.1-dev"
tags:
- flux
- flux-diffusers
- text-to-image
- diffusers
- simpletuner
- safe-for-work
- lora
- template:sd-lora
- standard
inference: true
widget:
- text: 'unconditional (blank prompt)'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_0_0.png
- text: 'A girl in light blue sits at the bar counter, holding an ice-cold wine glass and drinking alone on top of the Eiffel Tower, with a night view outside the window.. It features a close-up shot of her sitting by herself. She has long hair, wears glasses, faces away from the camera, and is wearing white shoes, black pants, a gray jacket, and a green scarf. with bright colors and a Paris night background featuring the Eiffel Tower. The composition is elegant, with the woman sitting on a high stool'
parameters:
negative_prompt: 'blurry, cropped, ugly'
output:
url: ./assets/image_1_0.png
---
# jazzy-st-1411
This is a standard PEFT LoRA derived from [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev).
The main validation prompt used during training was:
```
A girl in light blue sits at the bar counter, holding an ice-cold wine glass and drinking alone on top of the Eiffel Tower, with a night view outside the window.. It features a close-up shot of her sitting by herself. She has long hair, wears glasses, faces away from the camera, and is wearing white shoes, black pants, a gray jacket, and a green scarf. with bright colors and a Paris night background featuring the Eiffel Tower. The composition is elegant, with the woman sitting on a high stool
```
## Validation settings
- CFG: `3.0`
- CFG Rescale: `0.0`
- Steps: `20`
- Sampler: `None`
- Seed: `42`
- Resolution: `1024x1024`
Note: The validation settings are not necessarily the same as the [training settings](#training-settings).
You can find some example images in the following gallery:
<Gallery />
The text encoder **was not** trained.
You may reuse the base model text encoder for inference.
## Training settings
- Training epochs: 5
- Training steps: 5500
- Learning rate: 0.0004
- Max grad norm: 2.0
- Effective batch size: 1
- Micro-batch size: 1
- Gradient accumulation steps: 1
- Number of GPUs: 1
- Prediction type: flow-matching (extra parameters=['shift=3', 'flux_guidance_value=1.0', 'flux_lora_target=all'])
- Rescaled betas zero SNR: False
- Optimizer: adamw_bf16
- Precision: Pure BF16
- Quantised: No
- Xformers: Not used
- LoRA Rank: 32
- LoRA Alpha: 32.0
- LoRA Dropout: 0.1
- LoRA initialisation style: default
## Datasets
### jazzy-512
- Repeats: 10
- Total number of images: 28
- Total number of aspect buckets: 2
- Resolution: 0.262144 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### jazzy-768
- Repeats: 10
- Total number of images: 28
- Total number of aspect buckets: 1
- Resolution: 0.589824 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
### jazzy-1024
- Repeats: 10
- Total number of images: 28
- Total number of aspect buckets: 1
- Resolution: 1.048576 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
- Used for regularisation data: No
## Inference
```python
import torch
from diffusers import DiffusionPipeline
model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'linhqyy/jazzy-st-1411'
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)  # loading directly in bf16
pipeline.load_lora_weights(adapter_id)
prompt = "A girl in light blue sits at the bar counter, holding an ice-cold wine glass and drinking alone on top of the Eiffel Tower, with a night view outside the window.. It features a close-up shot of her sitting by herself. She has long hair, wears glasses, faces away from the camera, and is wearing white shoes, black pants, a gray jacket, and a green scarf. with bright colors and a Paris night background featuring the Eiffel Tower. The composition is elegant, with the woman sitting on a high stool"
## Optional: quantise the model to save on vram.
## Note: The model was not quantised during training, so it is not necessary to quantise it during inference time.
#from optimum.quanto import quantize, freeze, qint8
#quantize(pipeline.transformer, weights=qint8)
#freeze(pipeline.transformer)
pipeline.to('cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu') # the pipeline is already in its target precision level
image = pipeline(
prompt=prompt,
num_inference_steps=20,
generator=torch.Generator(device='cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu').manual_seed(1641421826),
width=1024,
height=1024,
guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
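If the full bf16 pipeline does not fit in VRAM, `pipeline.enable_model_cpu_offload()` (which requires `accelerate`) can replace the `pipeline.to(...)` call above; this is a general diffusers option rather than something validated for this LoRA.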
|
Kabyanil/whisper-small-as | Kabyanil | 2024-11-14T07:22:56Z | 77 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice_19_0",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-14T05:19:30Z | ---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
datasets:
- common_voice_19_0
metrics:
- wer
model-index:
- name: whisper-small-as
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: common_voice_19_0
type: common_voice_19_0
config: as
split: test
args: as
metrics:
- name: Wer
type: wer
value: 34.36754176610978
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-as
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the common_voice_19_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4314
- Wer Ortho: 66.7354
- Wer: 34.3675
## Model description
More information needed
## Intended uses & limitations
More information needed
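Given that the checkpoint was tuned on the `as` (Assamese) split of Common Voice 19, a reasonable hedged usage is plain transcription through the `transformers` pipeline; `sample.wav` below is a placeholder path.

```python
# Hedged sketch: transcription via the transformers ASR pipeline.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="Kabyanil/whisper-small-as")

# "sample.wav" is a placeholder for a local Assamese audio clip.
result = asr("sample.wav")
print(result["text"])
```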
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 500
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:------:|:----:|:---------------:|:---------:|:-------:|
| 0.0268 | 6.6667 | 500 | 0.4314 | 66.7354 | 34.3675 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0 | MayBashendy | 2024-11-14T07:20:49Z | 162 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T06:33:17Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8527
- Qwk: 0.7626
- Mse: 0.8527
- Rmse: 0.9234
## Model description
More information needed
## Intended uses & limitations
More information needed
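The Qwk/RMSE metrics below suggest the head scores text on an ordinal scale; the hedged sketch that follows loads it as a standard sequence-classification model, with the caveat that the number of labels and their meaning are not documented here.

```python
# Hedged sketch: assumes a standard sequence-classification head; label semantics unknown.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k35_task1_organization_fold0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Arabic input: "an Arabic text to be scored".
inputs = tokenizer("نص عربي ليتم تقييمه", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpretation of the output scale is undocumented
```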
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0030 | 2 | 4.7593 | -0.0132 | 4.7593 | 2.1816 |
| No log | 0.0059 | 4 | 2.4852 | 0.0904 | 2.4852 | 1.5764 |
| No log | 0.0089 | 6 | 1.5858 | 0.2434 | 1.5858 | 1.2593 |
| No log | 0.0118 | 8 | 1.5146 | 0.0742 | 1.5146 | 1.2307 |
| No log | 0.0148 | 10 | 1.6104 | 0.1639 | 1.6104 | 1.2690 |
| No log | 0.0178 | 12 | 1.7833 | 0.1873 | 1.7833 | 1.3354 |
| No log | 0.0207 | 14 | 2.0225 | 0.1873 | 2.0225 | 1.4222 |
| No log | 0.0237 | 16 | 2.0302 | 0.1873 | 2.0302 | 1.4249 |
| No log | 0.0266 | 18 | 2.0927 | 0.1075 | 2.0927 | 1.4466 |
| No log | 0.0296 | 20 | 2.0956 | 0.1873 | 2.0956 | 1.4476 |
| No log | 0.0325 | 22 | 1.7700 | 0.1893 | 1.7700 | 1.3304 |
| No log | 0.0355 | 24 | 1.6574 | 0.2184 | 1.6574 | 1.2874 |
| No log | 0.0385 | 26 | 1.6173 | 0.2435 | 1.6173 | 1.2717 |
| No log | 0.0414 | 28 | 2.0235 | 0.1075 | 2.0235 | 1.4225 |
| No log | 0.0444 | 30 | 2.1730 | 0.3402 | 2.1730 | 1.4741 |
| No log | 0.0473 | 32 | 1.8696 | 0.3333 | 1.8696 | 1.3673 |
| No log | 0.0503 | 34 | 1.4195 | 0.3243 | 1.4195 | 1.1914 |
| No log | 0.0533 | 36 | 1.2090 | 0.3527 | 1.2090 | 1.0995 |
| No log | 0.0562 | 38 | 1.2326 | 0.4282 | 1.2326 | 1.1102 |
| No log | 0.0592 | 40 | 1.5126 | 0.3531 | 1.5126 | 1.2299 |
| No log | 0.0621 | 42 | 2.1875 | 0.4783 | 2.1875 | 1.4790 |
| No log | 0.0651 | 44 | 2.8127 | 0.3670 | 2.8127 | 1.6771 |
| No log | 0.0680 | 46 | 2.7083 | 0.2553 | 2.7083 | 1.6457 |
| No log | 0.0710 | 48 | 2.2321 | 0.2873 | 2.2321 | 1.4940 |
| No log | 0.0740 | 50 | 1.6187 | 0.2416 | 1.6187 | 1.2723 |
| No log | 0.0769 | 52 | 1.5202 | 0.3478 | 1.5202 | 1.2330 |
| No log | 0.0799 | 54 | 1.7783 | 0.3870 | 1.7783 | 1.3335 |
| No log | 0.0828 | 56 | 1.8897 | 0.3824 | 1.8897 | 1.3747 |
| No log | 0.0858 | 58 | 1.8786 | 0.3274 | 1.8786 | 1.3706 |
| No log | 0.0888 | 60 | 1.6867 | 0.3787 | 1.6867 | 1.2987 |
| No log | 0.0917 | 62 | 1.4902 | 0.3478 | 1.4902 | 1.2207 |
| No log | 0.0947 | 64 | 1.5438 | 0.3478 | 1.5438 | 1.2425 |
| No log | 0.0976 | 66 | 1.8817 | 0.2145 | 1.8817 | 1.3718 |
| No log | 0.1006 | 68 | 2.0589 | 0.3182 | 2.0589 | 1.4349 |
| No log | 0.1036 | 70 | 2.2693 | 0.3029 | 2.2693 | 1.5064 |
| No log | 0.1065 | 72 | 2.3478 | 0.3029 | 2.3478 | 1.5323 |
| No log | 0.1095 | 74 | 2.2408 | 0.4596 | 2.2408 | 1.4969 |
| No log | 0.1124 | 76 | 2.2201 | 0.4796 | 2.2201 | 1.4900 |
| No log | 0.1154 | 78 | 2.5718 | 0.4431 | 2.5718 | 1.6037 |
| No log | 0.1183 | 80 | 2.5926 | 0.4974 | 2.5926 | 1.6102 |
| No log | 0.1213 | 82 | 2.1065 | 0.4796 | 2.1065 | 1.4514 |
| No log | 0.1243 | 84 | 1.5341 | 0.6211 | 1.5341 | 1.2386 |
| No log | 0.1272 | 86 | 1.3667 | 0.6812 | 1.3667 | 1.1691 |
| No log | 0.1302 | 88 | 1.4645 | 0.6211 | 1.4645 | 1.2102 |
| No log | 0.1331 | 90 | 1.7898 | 0.5385 | 1.7898 | 1.3378 |
| No log | 0.1361 | 92 | 2.3346 | 0.4793 | 2.3346 | 1.5280 |
| No log | 0.1391 | 94 | 2.7006 | 0.1794 | 2.7006 | 1.6433 |
| No log | 0.1420 | 96 | 2.4862 | 0.1180 | 2.4862 | 1.5768 |
| No log | 0.1450 | 98 | 1.9567 | 0.3119 | 1.9567 | 1.3988 |
| No log | 0.1479 | 100 | 1.7088 | 0.5743 | 1.7088 | 1.3072 |
| No log | 0.1509 | 102 | 1.7416 | 0.5804 | 1.7416 | 1.3197 |
| No log | 0.1538 | 104 | 1.9250 | 0.4991 | 1.9250 | 1.3875 |
| No log | 0.1568 | 106 | 1.8031 | 0.5200 | 1.8031 | 1.3428 |
| No log | 0.1598 | 108 | 2.0587 | 0.4987 | 2.0587 | 1.4348 |
| No log | 0.1627 | 110 | 2.3832 | 0.4234 | 2.3832 | 1.5438 |
| No log | 0.1657 | 112 | 2.3690 | 0.4987 | 2.3690 | 1.5391 |
| No log | 0.1686 | 114 | 2.0164 | 0.5177 | 2.0164 | 1.4200 |
| No log | 0.1716 | 116 | 2.0101 | 0.5177 | 2.0101 | 1.4178 |
| No log | 0.1746 | 118 | 2.2453 | 0.5177 | 2.2453 | 1.4984 |
| No log | 0.1775 | 120 | 2.5961 | 0.3921 | 2.5961 | 1.6112 |
| No log | 0.1805 | 122 | 2.8159 | 0.3280 | 2.8159 | 1.6781 |
| No log | 0.1834 | 124 | 2.4890 | 0.4247 | 2.4890 | 1.5777 |
| No log | 0.1864 | 126 | 2.1573 | 0.3977 | 2.1573 | 1.4688 |
| No log | 0.1893 | 128 | 1.9296 | 0.5417 | 1.9296 | 1.3891 |
| No log | 0.1923 | 130 | 1.8589 | 0.5200 | 1.8589 | 1.3634 |
| No log | 0.1953 | 132 | 2.0585 | 0.5177 | 2.0585 | 1.4347 |
| No log | 0.1982 | 134 | 2.0733 | 0.5177 | 2.0733 | 1.4399 |
| No log | 0.2012 | 136 | 2.3734 | 0.4099 | 2.3734 | 1.5406 |
| No log | 0.2041 | 138 | 2.4009 | 0.4099 | 2.4009 | 1.5495 |
| No log | 0.2071 | 140 | 2.2114 | 0.4234 | 2.2114 | 1.4871 |
| No log | 0.2101 | 142 | 2.2440 | 0.4043 | 2.2440 | 1.4980 |
| No log | 0.2130 | 144 | 2.2808 | 0.4043 | 2.2808 | 1.5102 |
| No log | 0.2160 | 146 | 2.5396 | 0.3527 | 2.5396 | 1.5936 |
| No log | 0.2189 | 148 | 2.5210 | 0.3527 | 2.5210 | 1.5878 |
| No log | 0.2219 | 150 | 2.4003 | 0.4297 | 2.4003 | 1.5493 |
| No log | 0.2249 | 152 | 2.0669 | 0.4793 | 2.0669 | 1.4377 |
| No log | 0.2278 | 154 | 1.9158 | 0.4793 | 1.9158 | 1.3841 |
| No log | 0.2308 | 156 | 2.0965 | 0.4210 | 2.0965 | 1.4479 |
| No log | 0.2337 | 158 | 2.3127 | 0.3527 | 2.3127 | 1.5208 |
| No log | 0.2367 | 160 | 2.0917 | 0.4011 | 2.0917 | 1.4463 |
| No log | 0.2396 | 162 | 2.1554 | 0.4043 | 2.1554 | 1.4681 |
| No log | 0.2426 | 164 | 2.2684 | 0.4256 | 2.2684 | 1.5061 |
| No log | 0.2456 | 166 | 2.1182 | 0.4625 | 2.1182 | 1.4554 |
| No log | 0.2485 | 168 | 2.0176 | 0.4625 | 2.0176 | 1.4204 |
| No log | 0.2515 | 170 | 1.7531 | 0.5965 | 1.7531 | 1.3240 |
| No log | 0.2544 | 172 | 1.5850 | 0.5965 | 1.5850 | 1.2590 |
| No log | 0.2574 | 174 | 1.5109 | 0.6303 | 1.5109 | 1.2292 |
| No log | 0.2604 | 176 | 1.6340 | 0.5556 | 1.6340 | 1.2783 |
| No log | 0.2633 | 178 | 1.8895 | 0.5177 | 1.8895 | 1.3746 |
| No log | 0.2663 | 180 | 2.0193 | 0.4799 | 2.0193 | 1.4210 |
| No log | 0.2692 | 182 | 1.8525 | 0.5366 | 1.8525 | 1.3611 |
| No log | 0.2722 | 184 | 1.4371 | 0.5638 | 1.4371 | 1.1988 |
| No log | 0.2751 | 186 | 1.3255 | 0.5638 | 1.3255 | 1.1513 |
| No log | 0.2781 | 188 | 1.5223 | 0.5600 | 1.5223 | 1.2338 |
| No log | 0.2811 | 190 | 1.8304 | 0.5341 | 1.8304 | 1.3529 |
| No log | 0.2840 | 192 | 2.2135 | 0.5158 | 2.2135 | 1.4878 |
| No log | 0.2870 | 194 | 2.3865 | 0.4324 | 2.3865 | 1.5448 |
| No log | 0.2899 | 196 | 2.0958 | 0.5679 | 2.0958 | 1.4477 |
| No log | 0.2929 | 198 | 2.0300 | 0.5679 | 2.0300 | 1.4248 |
| No log | 0.2959 | 200 | 2.0011 | 0.5679 | 2.0011 | 1.4146 |
| No log | 0.2988 | 202 | 2.3881 | 0.4324 | 2.3881 | 1.5453 |
| No log | 0.3018 | 204 | 2.4854 | 0.4324 | 2.4854 | 1.5765 |
| No log | 0.3047 | 206 | 1.9500 | 0.6201 | 1.9500 | 1.3964 |
| No log | 0.3077 | 208 | 1.1513 | 0.6331 | 1.1513 | 1.0730 |
| No log | 0.3107 | 210 | 0.9155 | 0.7631 | 0.9155 | 0.9568 |
| No log | 0.3136 | 212 | 1.0646 | 0.6543 | 1.0646 | 1.0318 |
| No log | 0.3166 | 214 | 1.4993 | 0.5965 | 1.4993 | 1.2245 |
| No log | 0.3195 | 216 | 2.3008 | 0.4324 | 2.3008 | 1.5168 |
| No log | 0.3225 | 218 | 2.7024 | 0.4324 | 2.7024 | 1.6439 |
| No log | 0.3254 | 220 | 2.4805 | 0.4664 | 2.4805 | 1.5750 |
| No log | 0.3284 | 222 | 2.1059 | 0.5470 | 2.1059 | 1.4512 |
| No log | 0.3314 | 224 | 1.9829 | 0.5823 | 1.9829 | 1.4082 |
| No log | 0.3343 | 226 | 2.0696 | 0.5823 | 2.0696 | 1.4386 |
| No log | 0.3373 | 228 | 1.9646 | 0.5510 | 1.9646 | 1.4016 |
| No log | 0.3402 | 230 | 1.6215 | 0.6356 | 1.6215 | 1.2734 |
| No log | 0.3432 | 232 | 1.6405 | 0.6356 | 1.6405 | 1.2808 |
| No log | 0.3462 | 234 | 1.8482 | 0.5510 | 1.8482 | 1.3595 |
| No log | 0.3491 | 236 | 1.9110 | 0.4983 | 1.9110 | 1.3824 |
| No log | 0.3521 | 238 | 1.6427 | 0.5188 | 1.6427 | 1.2817 |
| No log | 0.3550 | 240 | 1.5026 | 0.5638 | 1.5026 | 1.2258 |
| No log | 0.3580 | 242 | 1.4896 | 0.5638 | 1.4896 | 1.2205 |
| No log | 0.3609 | 244 | 1.7848 | 0.5166 | 1.7848 | 1.3360 |
| No log | 0.3639 | 246 | 2.0768 | 0.4983 | 2.0768 | 1.4411 |
| No log | 0.3669 | 248 | 2.1792 | 0.4983 | 2.1792 | 1.4762 |
| No log | 0.3698 | 250 | 2.1126 | 0.5166 | 2.1126 | 1.4535 |
| No log | 0.3728 | 252 | 1.8611 | 0.5889 | 1.8611 | 1.3642 |
| No log | 0.3757 | 254 | 1.5523 | 0.6071 | 1.5523 | 1.2459 |
| No log | 0.3787 | 256 | 1.6026 | 0.6071 | 1.6026 | 1.2659 |
| No log | 0.3817 | 258 | 1.9501 | 0.5882 | 1.9501 | 1.3965 |
| No log | 0.3846 | 260 | 2.1583 | 0.5158 | 2.1583 | 1.4691 |
| No log | 0.3876 | 262 | 1.9369 | 0.5158 | 1.9369 | 1.3917 |
| No log | 0.3905 | 264 | 1.6539 | 0.4583 | 1.6539 | 1.2860 |
| No log | 0.3935 | 266 | 1.5456 | 0.4571 | 1.5456 | 1.2432 |
| No log | 0.3964 | 268 | 1.7069 | 0.4348 | 1.7069 | 1.3065 |
| No log | 0.3994 | 270 | 2.1274 | 0.4431 | 2.1274 | 1.4586 |
| No log | 0.4024 | 272 | 2.4146 | 0.4247 | 2.4146 | 1.5539 |
| No log | 0.4053 | 274 | 2.4327 | 0.4431 | 2.4327 | 1.5597 |
| No log | 0.4083 | 276 | 2.0935 | 0.5158 | 2.0935 | 1.4469 |
| No log | 0.4112 | 278 | 1.8193 | 0.5358 | 1.8193 | 1.3488 |
| No log | 0.4142 | 280 | 1.6251 | 0.5548 | 1.6251 | 1.2748 |
| No log | 0.4172 | 282 | 1.7854 | 0.5548 | 1.7854 | 1.3362 |
| No log | 0.4201 | 284 | 2.3406 | 0.5326 | 2.3406 | 1.5299 |
| No log | 0.4231 | 286 | 2.8120 | 0.3453 | 2.8120 | 1.6769 |
| No log | 0.4260 | 288 | 2.6418 | 0.4469 | 2.6418 | 1.6254 |
| No log | 0.4290 | 290 | 2.2237 | 0.4983 | 2.2237 | 1.4912 |
| No log | 0.4320 | 292 | 2.1114 | 0.5358 | 2.1114 | 1.4531 |
| No log | 0.4349 | 294 | 2.0559 | 0.5376 | 2.0559 | 1.4338 |
| No log | 0.4379 | 296 | 1.8434 | 0.6000 | 1.8434 | 1.3577 |
| No log | 0.4408 | 298 | 1.7514 | 0.6000 | 1.7514 | 1.3234 |
| No log | 0.4438 | 300 | 1.9229 | 0.5573 | 1.9229 | 1.3867 |
| No log | 0.4467 | 302 | 1.7978 | 0.5573 | 1.7978 | 1.3408 |
| No log | 0.4497 | 304 | 1.5417 | 0.5404 | 1.5417 | 1.2417 |
| No log | 0.4527 | 306 | 1.6721 | 0.5573 | 1.6721 | 1.2931 |
| No log | 0.4556 | 308 | 1.8258 | 0.4991 | 1.8258 | 1.3512 |
| No log | 0.4586 | 310 | 2.1502 | 0.4796 | 2.1502 | 1.4664 |
| No log | 0.4615 | 312 | 2.5850 | 0.3921 | 2.5850 | 1.6078 |
| No log | 0.4645 | 314 | 2.5081 | 0.3921 | 2.5081 | 1.5837 |
| No log | 0.4675 | 316 | 2.0185 | 0.4210 | 2.0185 | 1.4208 |
| No log | 0.4704 | 318 | 1.5217 | 0.6157 | 1.5217 | 1.2336 |
| No log | 0.4734 | 320 | 1.5176 | 0.6157 | 1.5176 | 1.2319 |
| No log | 0.4763 | 322 | 1.8374 | 0.3738 | 1.8374 | 1.3555 |
| No log | 0.4793 | 324 | 2.5369 | 0.4247 | 2.5369 | 1.5928 |
| No log | 0.4822 | 326 | 3.0166 | 0.4149 | 3.0166 | 1.7368 |
| No log | 0.4852 | 328 | 2.9030 | 0.4149 | 2.9030 | 1.7038 |
| No log | 0.4882 | 330 | 2.3033 | 0.3812 | 2.3033 | 1.5177 |
| No log | 0.4911 | 332 | 1.5720 | 0.5011 | 1.5720 | 1.2538 |
| No log | 0.4941 | 334 | 1.2217 | 0.5047 | 1.2217 | 1.1053 |
| No log | 0.4970 | 336 | 1.1634 | 0.4806 | 1.1634 | 1.0786 |
| No log | 0.5 | 338 | 1.3476 | 0.6839 | 1.3476 | 1.1609 |
| No log | 0.5030 | 340 | 1.5272 | 0.5927 | 1.5272 | 1.2358 |
| No log | 0.5059 | 342 | 1.8521 | 0.4995 | 1.8521 | 1.3609 |
| No log | 0.5089 | 344 | 1.8133 | 0.5177 | 1.8133 | 1.3466 |
| No log | 0.5118 | 346 | 1.4281 | 0.6008 | 1.4281 | 1.1950 |
| No log | 0.5148 | 348 | 1.1405 | 0.6265 | 1.1405 | 1.0679 |
| No log | 0.5178 | 350 | 1.1643 | 0.6356 | 1.1643 | 1.0790 |
| No log | 0.5207 | 352 | 1.4326 | 0.6435 | 1.4326 | 1.1969 |
| No log | 0.5237 | 354 | 1.8958 | 0.6311 | 1.8958 | 1.3769 |
| No log | 0.5266 | 356 | 2.0355 | 0.6375 | 2.0355 | 1.4267 |
| No log | 0.5296 | 358 | 1.8620 | 0.5800 | 1.8620 | 1.3645 |
| No log | 0.5325 | 360 | 1.7379 | 0.5686 | 1.7379 | 1.3183 |
| No log | 0.5355 | 362 | 1.4849 | 0.5744 | 1.4849 | 1.2186 |
| No log | 0.5385 | 364 | 1.4820 | 0.5744 | 1.4820 | 1.2174 |
| No log | 0.5414 | 366 | 1.7132 | 0.5686 | 1.7132 | 1.3089 |
| No log | 0.5444 | 368 | 1.8751 | 0.5164 | 1.8751 | 1.3693 |
| No log | 0.5473 | 370 | 1.9427 | 0.5164 | 1.9427 | 1.3938 |
| No log | 0.5503 | 372 | 1.9238 | 0.5489 | 1.9238 | 1.3870 |
| No log | 0.5533 | 374 | 1.9908 | 0.4823 | 1.9908 | 1.4110 |
| No log | 0.5562 | 376 | 1.7852 | 0.6376 | 1.7852 | 1.3361 |
| No log | 0.5592 | 378 | 1.4042 | 0.6934 | 1.4042 | 1.1850 |
| No log | 0.5621 | 380 | 1.3339 | 0.7014 | 1.3339 | 1.1549 |
| No log | 0.5651 | 382 | 1.5014 | 0.6812 | 1.5014 | 1.2253 |
| No log | 0.5680 | 384 | 1.7604 | 0.5532 | 1.7604 | 1.3268 |
| No log | 0.5710 | 386 | 1.9863 | 0.4988 | 1.9863 | 1.4094 |
| No log | 0.5740 | 388 | 2.0023 | 0.4441 | 2.0023 | 1.4150 |
| No log | 0.5769 | 390 | 1.8323 | 0.4995 | 1.8323 | 1.3536 |
| No log | 0.5799 | 392 | 1.5319 | 0.5927 | 1.5319 | 1.2377 |
| No log | 0.5828 | 394 | 1.2857 | 0.5705 | 1.2857 | 1.1339 |
| No log | 0.5858 | 396 | 1.2748 | 0.5705 | 1.2748 | 1.1291 |
| No log | 0.5888 | 398 | 1.2447 | 0.6323 | 1.2447 | 1.1156 |
| No log | 0.5917 | 400 | 1.3810 | 0.6211 | 1.3810 | 1.1752 |
| No log | 0.5947 | 402 | 1.7534 | 0.6376 | 1.7534 | 1.3242 |
| No log | 0.5976 | 404 | 2.1789 | 0.5291 | 2.1789 | 1.4761 |
| No log | 0.6006 | 406 | 2.3193 | 0.4324 | 2.3193 | 1.5229 |
| No log | 0.6036 | 408 | 2.0079 | 0.5349 | 2.0079 | 1.4170 |
| No log | 0.6065 | 410 | 1.4768 | 0.6339 | 1.4768 | 1.2152 |
| No log | 0.6095 | 412 | 0.9421 | 0.8094 | 0.9421 | 0.9706 |
| No log | 0.6124 | 414 | 0.7710 | 0.7432 | 0.7710 | 0.8781 |
| No log | 0.6154 | 416 | 0.8228 | 0.7906 | 0.8228 | 0.9071 |
| No log | 0.6183 | 418 | 1.1203 | 0.7515 | 1.1203 | 1.0584 |
| No log | 0.6213 | 420 | 1.7202 | 0.5349 | 1.7202 | 1.3116 |
| No log | 0.6243 | 422 | 2.0589 | 0.4625 | 2.0589 | 1.4349 |
| No log | 0.6272 | 424 | 1.9750 | 0.4625 | 1.9750 | 1.4053 |
| No log | 0.6302 | 426 | 1.7002 | 0.5737 | 1.7002 | 1.3039 |
| No log | 0.6331 | 428 | 1.4584 | 0.6053 | 1.4584 | 1.2076 |
| No log | 0.6361 | 430 | 1.4064 | 0.6053 | 1.4064 | 1.1859 |
| No log | 0.6391 | 432 | 1.5996 | 0.6008 | 1.5996 | 1.2647 |
| No log | 0.6420 | 434 | 2.0072 | 0.5341 | 2.0072 | 1.4168 |
| No log | 0.6450 | 436 | 2.2470 | 0.4624 | 2.2470 | 1.4990 |
| No log | 0.6479 | 438 | 2.1866 | 0.4802 | 2.1866 | 1.4787 |
| No log | 0.6509 | 440 | 1.9062 | 0.5158 | 1.9062 | 1.3807 |
| No log | 0.6538 | 442 | 1.5809 | 0.5965 | 1.5809 | 1.2573 |
| No log | 0.6568 | 444 | 1.5531 | 0.5965 | 1.5531 | 1.2462 |
| No log | 0.6598 | 446 | 1.7454 | 0.5366 | 1.7454 | 1.3211 |
| No log | 0.6627 | 448 | 1.7889 | 0.4987 | 1.7889 | 1.3375 |
| No log | 0.6657 | 450 | 1.8162 | 0.4991 | 1.8161 | 1.3476 |
| No log | 0.6686 | 452 | 1.6058 | 0.5200 | 1.6058 | 1.2672 |
| No log | 0.6716 | 454 | 1.4890 | 0.6536 | 1.4890 | 1.2203 |
| No log | 0.6746 | 456 | 1.3941 | 0.6157 | 1.3941 | 1.1807 |
| No log | 0.6775 | 458 | 1.1485 | 0.5935 | 1.1485 | 1.0717 |
| No log | 0.6805 | 460 | 1.0123 | 0.6982 | 1.0123 | 1.0061 |
| No log | 0.6834 | 462 | 1.0624 | 0.6693 | 1.0624 | 1.0307 |
| No log | 0.6864 | 464 | 1.2489 | 0.6678 | 1.2489 | 1.1175 |
| No log | 0.6893 | 466 | 1.5782 | 0.6435 | 1.5782 | 1.2563 |
| No log | 0.6923 | 468 | 1.5432 | 0.6435 | 1.5432 | 1.2422 |
| No log | 0.6953 | 470 | 1.3239 | 0.6732 | 1.3239 | 1.1506 |
| No log | 0.6982 | 472 | 1.1434 | 0.6678 | 1.1434 | 1.0693 |
| No log | 0.7012 | 474 | 1.0185 | 0.7322 | 1.0185 | 1.0092 |
| No log | 0.7041 | 476 | 0.9671 | 0.7898 | 0.9671 | 0.9834 |
| No log | 0.7071 | 478 | 1.1360 | 0.6746 | 1.1360 | 1.0658 |
| No log | 0.7101 | 480 | 1.4188 | 0.6333 | 1.4188 | 1.1911 |
| No log | 0.7130 | 482 | 1.6883 | 0.5489 | 1.6883 | 1.2994 |
| No log | 0.7160 | 484 | 1.6613 | 0.5385 | 1.6613 | 1.2889 |
| No log | 0.7189 | 486 | 1.3402 | 0.7014 | 1.3402 | 1.1577 |
| No log | 0.7219 | 488 | 1.1825 | 0.6897 | 1.1825 | 1.0874 |
| No log | 0.7249 | 490 | 1.0660 | 0.6897 | 1.0660 | 1.0325 |
| No log | 0.7278 | 492 | 1.1635 | 0.7216 | 1.1635 | 1.0787 |
| No log | 0.7308 | 494 | 1.2338 | 0.6414 | 1.2338 | 1.1108 |
| No log | 0.7337 | 496 | 1.2031 | 0.6414 | 1.2031 | 1.0969 |
| No log | 0.7367 | 498 | 1.1220 | 0.6678 | 1.1220 | 1.0593 |
| 0.5124 | 0.7396 | 500 | 1.1471 | 0.6678 | 1.1471 | 1.0710 |
| 0.5124 | 0.7426 | 502 | 1.1933 | 0.6678 | 1.1933 | 1.0924 |
| 0.5124 | 0.7456 | 504 | 1.3590 | 0.6678 | 1.3590 | 1.1658 |
| 0.5124 | 0.7485 | 506 | 1.5283 | 0.6938 | 1.5283 | 1.2363 |
| 0.5124 | 0.7515 | 508 | 1.4056 | 0.6938 | 1.4056 | 1.1856 |
| 0.5124 | 0.7544 | 510 | 1.2773 | 0.6497 | 1.2773 | 1.1302 |
| 0.5124 | 0.7574 | 512 | 1.1138 | 0.6447 | 1.1138 | 1.0554 |
| 0.5124 | 0.7604 | 514 | 1.0728 | 0.6497 | 1.0728 | 1.0357 |
| 0.5124 | 0.7633 | 516 | 1.2504 | 0.6678 | 1.2504 | 1.1182 |
| 0.5124 | 0.7663 | 518 | 1.3152 | 0.6678 | 1.3152 | 1.1468 |
| 0.5124 | 0.7692 | 520 | 1.3246 | 0.7129 | 1.3246 | 1.1509 |
| 0.5124 | 0.7722 | 522 | 1.2160 | 0.7129 | 1.2160 | 1.1027 |
| 0.5124 | 0.7751 | 524 | 1.3105 | 0.7129 | 1.3105 | 1.1448 |
| 0.5124 | 0.7781 | 526 | 1.4726 | 0.7129 | 1.4726 | 1.2135 |
| 0.5124 | 0.7811 | 528 | 1.3328 | 0.7129 | 1.3328 | 1.1545 |
| 0.5124 | 0.7840 | 530 | 1.1356 | 0.6693 | 1.1356 | 1.0656 |
| 0.5124 | 0.7870 | 532 | 1.1833 | 0.6678 | 1.1833 | 1.0878 |
| 0.5124 | 0.7899 | 534 | 1.3837 | 0.6616 | 1.3837 | 1.1763 |
| 0.5124 | 0.7929 | 536 | 1.7438 | 0.6667 | 1.7438 | 1.3205 |
| 0.5124 | 0.7959 | 538 | 1.7484 | 0.6508 | 1.7484 | 1.3223 |
| 0.5124 | 0.7988 | 540 | 1.4713 | 0.6253 | 1.4713 | 1.2130 |
| 0.5124 | 0.8018 | 542 | 1.2209 | 0.6746 | 1.2209 | 1.1049 |
| 0.5124 | 0.8047 | 544 | 1.1763 | 0.6746 | 1.1763 | 1.0846 |
| 0.5124 | 0.8077 | 546 | 1.1800 | 0.6890 | 1.1800 | 1.0863 |
| 0.5124 | 0.8107 | 548 | 1.3306 | 0.6536 | 1.3306 | 1.1535 |
| 0.5124 | 0.8136 | 550 | 1.6643 | 0.5200 | 1.6643 | 1.2901 |
| 0.5124 | 0.8166 | 552 | 1.8104 | 0.4987 | 1.8104 | 1.3455 |
| 0.5124 | 0.8195 | 554 | 1.6862 | 0.5385 | 1.6862 | 1.2985 |
| 0.5124 | 0.8225 | 556 | 1.3954 | 0.6053 | 1.3954 | 1.1813 |
| 0.5124 | 0.8254 | 558 | 1.1097 | 0.6763 | 1.1097 | 1.0534 |
| 0.5124 | 0.8284 | 560 | 1.0157 | 0.6763 | 1.0157 | 1.0078 |
| 0.5124 | 0.8314 | 562 | 1.1742 | 0.7305 | 1.1742 | 1.0836 |
| 0.5124 | 0.8343 | 564 | 1.5945 | 0.6296 | 1.5945 | 1.2627 |
| 0.5124 | 0.8373 | 566 | 1.7991 | 0.5700 | 1.7991 | 1.3413 |
| 0.5124 | 0.8402 | 568 | 1.7228 | 0.5600 | 1.7228 | 1.3126 |
| 0.5124 | 0.8432 | 570 | 1.7575 | 0.5396 | 1.7575 | 1.3257 |
| 0.5124 | 0.8462 | 572 | 1.6217 | 0.5600 | 1.6217 | 1.2734 |
| 0.5124 | 0.8491 | 574 | 1.4997 | 0.5927 | 1.4997 | 1.2246 |
| 0.5124 | 0.8521 | 576 | 1.6235 | 0.5804 | 1.6235 | 1.2742 |
| 0.5124 | 0.8550 | 578 | 1.7046 | 0.5804 | 1.7046 | 1.3056 |
| 0.5124 | 0.8580 | 580 | 1.5357 | 0.6253 | 1.5357 | 1.2392 |
| 0.5124 | 0.8609 | 582 | 1.2044 | 0.6260 | 1.2044 | 1.0974 |
| 0.5124 | 0.8639 | 584 | 1.2175 | 0.6435 | 1.2175 | 1.1034 |
| 0.5124 | 0.8669 | 586 | 1.3953 | 0.6836 | 1.3953 | 1.1812 |
| 0.5124 | 0.8698 | 588 | 1.6457 | 0.6661 | 1.6457 | 1.2828 |
| 0.5124 | 0.8728 | 590 | 1.5577 | 0.5804 | 1.5577 | 1.2481 |
| 0.5124 | 0.8757 | 592 | 1.3116 | 0.6157 | 1.3116 | 1.1452 |
| 0.5124 | 0.8787 | 594 | 1.0623 | 0.6982 | 1.0623 | 1.0307 |
| 0.5124 | 0.8817 | 596 | 1.0351 | 0.6982 | 1.0351 | 1.0174 |
| 0.5124 | 0.8846 | 598 | 1.1916 | 0.6476 | 1.1916 | 1.0916 |
| 0.5124 | 0.8876 | 600 | 1.4156 | 0.6356 | 1.4156 | 1.1898 |
| 0.5124 | 0.8905 | 602 | 1.3344 | 0.6356 | 1.3344 | 1.1552 |
| 0.5124 | 0.8935 | 604 | 1.2729 | 0.6356 | 1.2729 | 1.1282 |
| 0.5124 | 0.8964 | 606 | 1.3387 | 0.6727 | 1.3387 | 1.1570 |
| 0.5124 | 0.8994 | 608 | 1.4370 | 0.6727 | 1.4370 | 1.1987 |
| 0.5124 | 0.9024 | 610 | 1.4993 | 0.6673 | 1.4993 | 1.2245 |
| 0.5124 | 0.9053 | 612 | 1.5312 | 0.6161 | 1.5312 | 1.2374 |
| 0.5124 | 0.9083 | 614 | 1.5403 | 0.5581 | 1.5403 | 1.2411 |
| 0.5124 | 0.9112 | 616 | 1.5774 | 0.5581 | 1.5774 | 1.2560 |
| 0.5124 | 0.9142 | 618 | 1.4491 | 0.6161 | 1.4491 | 1.2038 |
| 0.5124 | 0.9172 | 620 | 1.3558 | 0.6356 | 1.3558 | 1.1644 |
| 0.5124 | 0.9201 | 622 | 1.5727 | 0.6933 | 1.5727 | 1.2541 |
| 0.5124 | 0.9231 | 624 | 1.9185 | 0.6990 | 1.9185 | 1.3851 |
| 0.5124 | 0.9260 | 626 | 1.8117 | 0.7280 | 1.8117 | 1.3460 |
| 0.5124 | 0.9290 | 628 | 1.4073 | 0.6356 | 1.4073 | 1.1863 |
| 0.5124 | 0.9320 | 630 | 1.1589 | 0.6476 | 1.1589 | 1.0765 |
| 0.5124 | 0.9349 | 632 | 1.0186 | 0.6982 | 1.0186 | 1.0093 |
| 0.5124 | 0.9379 | 634 | 1.0923 | 0.6982 | 1.0923 | 1.0451 |
| 0.5124 | 0.9408 | 636 | 1.3583 | 0.6616 | 1.3583 | 1.1655 |
| 0.5124 | 0.9438 | 638 | 1.6299 | 0.6161 | 1.6299 | 1.2767 |
| 0.5124 | 0.9467 | 640 | 1.7971 | 0.6376 | 1.7971 | 1.3406 |
| 0.5124 | 0.9497 | 642 | 1.7640 | 0.6552 | 1.7640 | 1.3282 |
| 0.5124 | 0.9527 | 644 | 1.3877 | 0.6121 | 1.3877 | 1.1780 |
| 0.5124 | 0.9556 | 646 | 1.0197 | 0.6684 | 1.0197 | 1.0098 |
| 0.5124 | 0.9586 | 648 | 0.8821 | 0.7332 | 0.8821 | 0.9392 |
| 0.5124 | 0.9615 | 650 | 0.8877 | 0.7332 | 0.8877 | 0.9422 |
| 0.5124 | 0.9645 | 652 | 1.0239 | 0.6982 | 1.0239 | 1.0119 |
| 0.5124 | 0.9675 | 654 | 1.3656 | 0.6394 | 1.3656 | 1.1686 |
| 0.5124 | 0.9704 | 656 | 1.5068 | 0.6339 | 1.5068 | 1.2275 |
| 0.5124 | 0.9734 | 658 | 1.5125 | 0.6382 | 1.5125 | 1.2298 |
| 0.5124 | 0.9763 | 660 | 1.3579 | 0.6121 | 1.3579 | 1.1653 |
| 0.5124 | 0.9793 | 662 | 1.4305 | 0.5933 | 1.4305 | 1.1960 |
| 0.5124 | 0.9822 | 664 | 1.3148 | 0.5973 | 1.3148 | 1.1467 |
| 0.5124 | 0.9852 | 666 | 1.1350 | 0.6818 | 1.1350 | 1.0654 |
| 0.5124 | 0.9882 | 668 | 1.1327 | 0.6818 | 1.1327 | 1.0643 |
| 0.5124 | 0.9911 | 670 | 1.2600 | 0.6934 | 1.2600 | 1.1225 |
| 0.5124 | 0.9941 | 672 | 1.2645 | 0.6934 | 1.2645 | 1.1245 |
| 0.5124 | 0.9970 | 674 | 1.2645 | 0.6356 | 1.2645 | 1.1245 |
| 0.5124 | 1.0 | 676 | 1.3788 | 0.6610 | 1.3788 | 1.1742 |
| 0.5124 | 1.0030 | 678 | 1.4922 | 0.6552 | 1.4922 | 1.2216 |
| 0.5124 | 1.0059 | 680 | 1.4587 | 0.6071 | 1.4587 | 1.2078 |
| 0.5124 | 1.0089 | 682 | 1.3493 | 0.6303 | 1.3493 | 1.1616 |
| 0.5124 | 1.0118 | 684 | 1.3922 | 0.5777 | 1.3922 | 1.1799 |
| 0.5124 | 1.0148 | 686 | 1.4398 | 0.6161 | 1.4398 | 1.1999 |
| 0.5124 | 1.0178 | 688 | 1.2817 | 0.6015 | 1.2817 | 1.1321 |
| 0.5124 | 1.0207 | 690 | 1.1599 | 0.6061 | 1.1599 | 1.0770 |
| 0.5124 | 1.0237 | 692 | 1.1406 | 0.6543 | 1.1406 | 1.0680 |
| 0.5124 | 1.0266 | 694 | 1.1262 | 0.6543 | 1.1262 | 1.0612 |
| 0.5124 | 1.0296 | 696 | 1.1291 | 0.6543 | 1.1291 | 1.0626 |
| 0.5124 | 1.0325 | 698 | 1.1602 | 0.6543 | 1.1602 | 1.0771 |
| 0.5124 | 1.0355 | 700 | 1.1044 | 0.6543 | 1.1044 | 1.0509 |
| 0.5124 | 1.0385 | 702 | 1.2154 | 0.6061 | 1.2154 | 1.1025 |
| 0.5124 | 1.0414 | 704 | 1.5019 | 0.6207 | 1.5019 | 1.2255 |
| 0.5124 | 1.0444 | 706 | 1.6556 | 0.5994 | 1.6556 | 1.2867 |
| 0.5124 | 1.0473 | 708 | 1.6783 | 0.5679 | 1.6783 | 1.2955 |
| 0.5124 | 1.0503 | 710 | 1.4250 | 0.5366 | 1.4250 | 1.1937 |
| 0.5124 | 1.0533 | 712 | 1.1856 | 0.6323 | 1.1856 | 1.0888 |
| 0.5124 | 1.0562 | 714 | 1.0413 | 0.6839 | 1.0413 | 1.0205 |
| 0.5124 | 1.0592 | 716 | 1.1108 | 0.6839 | 1.1108 | 1.0539 |
| 0.5124 | 1.0621 | 718 | 1.3563 | 0.5449 | 1.3563 | 1.1646 |
| 0.5124 | 1.0651 | 720 | 1.4666 | 0.5449 | 1.4666 | 1.2110 |
| 0.5124 | 1.0680 | 722 | 1.4211 | 0.5449 | 1.4211 | 1.1921 |
| 0.5124 | 1.0710 | 724 | 1.2445 | 0.6323 | 1.2445 | 1.1156 |
| 0.5124 | 1.0740 | 726 | 1.1570 | 0.6111 | 1.1570 | 1.0757 |
| 0.5124 | 1.0769 | 728 | 1.1705 | 0.6476 | 1.1705 | 1.0819 |
| 0.5124 | 1.0799 | 730 | 1.3534 | 0.6545 | 1.3534 | 1.1634 |
| 0.5124 | 1.0828 | 732 | 1.6599 | 0.6769 | 1.6599 | 1.2884 |
| 0.5124 | 1.0858 | 734 | 1.7883 | 0.6552 | 1.7883 | 1.3373 |
| 0.5124 | 1.0888 | 736 | 1.5746 | 0.5769 | 1.5746 | 1.2548 |
| 0.5124 | 1.0917 | 738 | 1.1983 | 0.6839 | 1.1983 | 1.0947 |
| 0.5124 | 1.0947 | 740 | 0.9295 | 0.7316 | 0.9295 | 0.9641 |
| 0.5124 | 1.0976 | 742 | 0.8892 | 0.7316 | 0.8892 | 0.9430 |
| 0.5124 | 1.1006 | 744 | 0.9784 | 0.6982 | 0.9784 | 0.9891 |
| 0.5124 | 1.1036 | 746 | 1.0991 | 0.7310 | 1.0991 | 1.0484 |
| 0.5124 | 1.1065 | 748 | 1.1218 | 0.7310 | 1.1218 | 1.0592 |
| 0.5124 | 1.1095 | 750 | 0.9994 | 0.6982 | 0.9994 | 0.9997 |
| 0.5124 | 1.1124 | 752 | 0.9723 | 0.6982 | 0.9723 | 0.9861 |
| 0.5124 | 1.1154 | 754 | 1.0213 | 0.6982 | 1.0213 | 1.0106 |
| 0.5124 | 1.1183 | 756 | 1.1969 | 0.6972 | 1.1969 | 1.0940 |
| 0.5124 | 1.1213 | 758 | 1.4115 | 0.6860 | 1.4115 | 1.1881 |
| 0.5124 | 1.1243 | 760 | 1.4379 | 0.6860 | 1.4379 | 1.1991 |
| 0.5124 | 1.1272 | 762 | 1.2880 | 0.6860 | 1.2880 | 1.1349 |
| 0.5124 | 1.1302 | 764 | 1.0399 | 0.7322 | 1.0399 | 1.0197 |
| 0.5124 | 1.1331 | 766 | 0.9220 | 0.7316 | 0.9220 | 0.9602 |
| 0.5124 | 1.1361 | 768 | 0.8880 | 0.7937 | 0.8880 | 0.9423 |
| 0.5124 | 1.1391 | 770 | 0.9822 | 0.7322 | 0.9822 | 0.9911 |
| 0.5124 | 1.1420 | 772 | 1.2068 | 0.6860 | 1.2068 | 1.0986 |
| 0.5124 | 1.1450 | 774 | 1.4540 | 0.6860 | 1.4540 | 1.2058 |
| 0.5124 | 1.1479 | 776 | 1.4219 | 0.6545 | 1.4219 | 1.1925 |
| 0.5124 | 1.1509 | 778 | 1.1883 | 0.6610 | 1.1883 | 1.0901 |
| 0.5124 | 1.1538 | 780 | 1.1370 | 0.6323 | 1.1370 | 1.0663 |
| 0.5124 | 1.1568 | 782 | 1.1483 | 0.6323 | 1.1483 | 1.0716 |
| 0.5124 | 1.1598 | 784 | 1.0509 | 0.7519 | 1.0509 | 1.0252 |
| 0.5124 | 1.1627 | 786 | 0.9433 | 0.7519 | 0.9433 | 0.9712 |
| 0.5124 | 1.1657 | 788 | 0.9737 | 0.7519 | 0.9737 | 0.9868 |
| 0.5124 | 1.1686 | 790 | 1.1458 | 0.6975 | 1.1458 | 1.0704 |
| 0.5124 | 1.1716 | 792 | 1.2280 | 0.7211 | 1.2280 | 1.1082 |
| 0.5124 | 1.1746 | 794 | 1.1313 | 0.6975 | 1.1313 | 1.0636 |
| 0.5124 | 1.1775 | 796 | 0.9603 | 0.6975 | 0.9603 | 0.9799 |
| 0.5124 | 1.1805 | 798 | 0.8791 | 0.6975 | 0.8791 | 0.9376 |
| 0.5124 | 1.1834 | 800 | 0.9197 | 0.7614 | 0.9197 | 0.9590 |
| 0.5124 | 1.1864 | 802 | 1.0550 | 0.7417 | 1.0550 | 1.0271 |
| 0.5124 | 1.1893 | 804 | 1.1647 | 0.7123 | 1.1647 | 1.0792 |
| 0.5124 | 1.1923 | 806 | 1.0629 | 0.7688 | 1.0629 | 1.0310 |
| 0.5124 | 1.1953 | 808 | 0.9649 | 0.7598 | 0.9649 | 0.9823 |
| 0.5124 | 1.1982 | 810 | 0.8524 | 0.7696 | 0.8524 | 0.9232 |
| 0.5124 | 1.2012 | 812 | 0.8621 | 0.7793 | 0.8621 | 0.9285 |
| 0.5124 | 1.2041 | 814 | 1.0182 | 0.7305 | 1.0182 | 1.0091 |
| 0.5124 | 1.2071 | 816 | 1.2705 | 0.7123 | 1.2705 | 1.1272 |
| 0.5124 | 1.2101 | 818 | 1.3855 | 0.7123 | 1.3855 | 1.1771 |
| 0.5124 | 1.2130 | 820 | 1.3684 | 0.7211 | 1.3684 | 1.1698 |
| 0.5124 | 1.2160 | 822 | 1.2992 | 0.7305 | 1.2992 | 1.1398 |
| 0.5124 | 1.2189 | 824 | 1.2448 | 0.7305 | 1.2448 | 1.1157 |
| 0.5124 | 1.2219 | 826 | 1.2517 | 0.7305 | 1.2517 | 1.1188 |
| 0.5124 | 1.2249 | 828 | 1.2850 | 0.7123 | 1.2850 | 1.1336 |
| 0.5124 | 1.2278 | 830 | 1.3320 | 0.7123 | 1.3320 | 1.1541 |
| 0.5124 | 1.2308 | 832 | 1.5386 | 0.7042 | 1.5386 | 1.2404 |
| 0.5124 | 1.2337 | 834 | 1.6960 | 0.6247 | 1.6960 | 1.3023 |
| 0.5124 | 1.2367 | 836 | 1.5693 | 0.5926 | 1.5693 | 1.2527 |
| 0.5124 | 1.2396 | 838 | 1.2803 | 0.7211 | 1.2803 | 1.1315 |
| 0.5124 | 1.2426 | 840 | 1.0329 | 0.7211 | 1.0329 | 1.0163 |
| 0.5124 | 1.2456 | 842 | 0.8464 | 0.72 | 0.8464 | 0.9200 |
| 0.5124 | 1.2485 | 844 | 0.7644 | 0.72 | 0.7644 | 0.8743 |
| 0.5124 | 1.2515 | 846 | 0.8378 | 0.7216 | 0.8378 | 0.9153 |
| 0.5124 | 1.2544 | 848 | 1.0355 | 0.7305 | 1.0355 | 1.0176 |
| 0.5124 | 1.2574 | 850 | 1.3306 | 0.6485 | 1.3306 | 1.1535 |
| 0.5124 | 1.2604 | 852 | 1.5482 | 0.6786 | 1.5482 | 1.2443 |
| 0.5124 | 1.2633 | 854 | 1.5133 | 0.6485 | 1.5133 | 1.2302 |
| 0.5124 | 1.2663 | 856 | 1.3109 | 0.6545 | 1.3109 | 1.1449 |
| 0.5124 | 1.2692 | 858 | 1.0248 | 0.7840 | 1.0248 | 1.0123 |
| 0.5124 | 1.2722 | 860 | 0.8619 | 0.7844 | 0.8619 | 0.9284 |
| 0.5124 | 1.2751 | 862 | 0.7735 | 0.7844 | 0.7735 | 0.8795 |
| 0.5124 | 1.2781 | 864 | 0.7926 | 0.7844 | 0.7926 | 0.8903 |
| 0.5124 | 1.2811 | 866 | 0.9015 | 0.7519 | 0.9015 | 0.9495 |
| 0.5124 | 1.2840 | 868 | 1.0408 | 0.7840 | 1.0408 | 1.0202 |
| 0.5124 | 1.2870 | 870 | 1.2394 | 0.6545 | 1.2394 | 1.1133 |
| 0.5124 | 1.2899 | 872 | 1.3263 | 0.6485 | 1.3263 | 1.1517 |
| 0.5124 | 1.2929 | 874 | 1.1878 | 0.7720 | 1.1878 | 1.0899 |
| 0.5124 | 1.2959 | 876 | 0.9641 | 0.8138 | 0.9641 | 0.9819 |
| 0.5124 | 1.2988 | 878 | 0.7789 | 0.8412 | 0.7789 | 0.8826 |
| 0.5124 | 1.3018 | 880 | 0.7364 | 0.8283 | 0.7364 | 0.8582 |
| 0.5124 | 1.3047 | 882 | 0.8419 | 0.8138 | 0.8419 | 0.9175 |
| 0.5124 | 1.3077 | 884 | 1.1174 | 0.8138 | 1.1174 | 1.0571 |
| 0.5124 | 1.3107 | 886 | 1.4156 | 0.6257 | 1.4156 | 1.1898 |
| 0.5124 | 1.3136 | 888 | 1.4840 | 0.6046 | 1.4840 | 1.2182 |
| 0.5124 | 1.3166 | 890 | 1.3482 | 0.6610 | 1.3482 | 1.1611 |
| 0.5124 | 1.3195 | 892 | 1.0679 | 0.7515 | 1.0679 | 1.0334 |
| 0.5124 | 1.3225 | 894 | 0.8469 | 0.7823 | 0.8469 | 0.9202 |
| 0.5124 | 1.3254 | 896 | 0.7514 | 0.7823 | 0.7514 | 0.8668 |
| 0.5124 | 1.3284 | 898 | 0.7512 | 0.7823 | 0.7512 | 0.8667 |
| 0.5124 | 1.3314 | 900 | 0.9032 | 0.8015 | 0.9032 | 0.9504 |
| 0.5124 | 1.3343 | 902 | 1.0844 | 0.7603 | 1.0844 | 1.0413 |
| 0.5124 | 1.3373 | 904 | 1.3467 | 0.6071 | 1.3467 | 1.1605 |
| 0.5124 | 1.3402 | 906 | 1.4380 | 0.5889 | 1.4380 | 1.1992 |
| 0.5124 | 1.3432 | 908 | 1.3756 | 0.5804 | 1.3756 | 1.1728 |
| 0.5124 | 1.3462 | 910 | 1.2479 | 0.7188 | 1.2479 | 1.1171 |
| 0.5124 | 1.3491 | 912 | 1.2447 | 0.7188 | 1.2447 | 1.1157 |
| 0.5124 | 1.3521 | 914 | 1.2992 | 0.6316 | 1.2992 | 1.1398 |
| 0.5124 | 1.3550 | 916 | 1.3122 | 0.6536 | 1.3122 | 1.1455 |
| 0.5124 | 1.3580 | 918 | 1.2349 | 0.7188 | 1.2349 | 1.1113 |
| 0.5124 | 1.3609 | 920 | 1.0607 | 0.7188 | 1.0607 | 1.0299 |
| 0.5124 | 1.3639 | 922 | 0.9777 | 0.8138 | 0.9777 | 0.9888 |
| 0.5124 | 1.3669 | 924 | 1.0128 | 0.8138 | 1.0128 | 1.0064 |
| 0.5124 | 1.3698 | 926 | 1.2155 | 0.7338 | 1.2155 | 1.1025 |
| 0.5124 | 1.3728 | 928 | 1.5746 | 0.7115 | 1.5746 | 1.2548 |
| 0.5124 | 1.3757 | 930 | 1.7375 | 0.6751 | 1.7375 | 1.3182 |
| 0.5124 | 1.3787 | 932 | 1.6504 | 0.6751 | 1.6504 | 1.2847 |
| 0.5124 | 1.3817 | 934 | 1.3282 | 0.6552 | 1.3282 | 1.1525 |
| 0.5124 | 1.3846 | 936 | 1.1306 | 0.7412 | 1.1306 | 1.0633 |
| 0.5124 | 1.3876 | 938 | 0.9086 | 0.8283 | 0.9086 | 0.9532 |
| 0.5124 | 1.3905 | 940 | 0.8630 | 0.7823 | 0.8630 | 0.9290 |
| 0.5124 | 1.3935 | 942 | 0.9429 | 0.8019 | 0.9429 | 0.9711 |
| 0.5124 | 1.3964 | 944 | 1.1244 | 0.6839 | 1.1244 | 1.0604 |
| 0.5124 | 1.3994 | 946 | 1.3577 | 0.6969 | 1.3577 | 1.1652 |
| 0.5124 | 1.4024 | 948 | 1.6525 | 0.5417 | 1.6525 | 1.2855 |
| 0.5124 | 1.4053 | 950 | 1.7282 | 0.4995 | 1.7282 | 1.3146 |
| 0.5124 | 1.4083 | 952 | 1.5969 | 0.5661 | 1.5969 | 1.2637 |
| 0.5124 | 1.4112 | 954 | 1.4857 | 0.6609 | 1.4857 | 1.2189 |
| 0.5124 | 1.4142 | 956 | 1.2584 | 0.6839 | 1.2584 | 1.1218 |
| 0.5124 | 1.4172 | 958 | 1.1679 | 0.7519 | 1.1679 | 1.0807 |
| 0.5124 | 1.4201 | 960 | 1.1933 | 0.6839 | 1.1933 | 1.0924 |
| 0.5124 | 1.4231 | 962 | 1.2640 | 0.6323 | 1.2640 | 1.1243 |
| 0.5124 | 1.4260 | 964 | 1.3787 | 0.6680 | 1.3787 | 1.1742 |
| 0.5124 | 1.4290 | 966 | 1.6585 | 0.5919 | 1.6585 | 1.2878 |
| 0.5124 | 1.4320 | 968 | 1.7420 | 0.5919 | 1.7420 | 1.3198 |
| 0.5124 | 1.4349 | 970 | 1.6647 | 0.6108 | 1.6647 | 1.2902 |
| 0.5124 | 1.4379 | 972 | 1.6923 | 0.6108 | 1.6923 | 1.3009 |
| 0.5124 | 1.4408 | 974 | 1.4807 | 0.6008 | 1.4807 | 1.2168 |
| 0.5124 | 1.4438 | 976 | 1.3347 | 0.7188 | 1.3347 | 1.1553 |
| 0.5124 | 1.4467 | 978 | 1.3076 | 0.7188 | 1.3076 | 1.1435 |
| 0.5124 | 1.4497 | 980 | 1.2835 | 0.7188 | 1.2835 | 1.1329 |
| 0.5124 | 1.4527 | 982 | 1.1488 | 0.7840 | 1.1488 | 1.0718 |
| 0.5124 | 1.4556 | 984 | 1.1533 | 0.7188 | 1.1533 | 1.0739 |
| 0.5124 | 1.4586 | 986 | 1.3488 | 0.6929 | 1.3488 | 1.1614 |
| 0.5124 | 1.4615 | 988 | 1.6310 | 0.6538 | 1.6310 | 1.2771 |
| 0.5124 | 1.4645 | 990 | 1.6573 | 0.6478 | 1.6573 | 1.2874 |
| 0.5124 | 1.4675 | 992 | 1.4338 | 0.6929 | 1.4338 | 1.1974 |
| 0.5124 | 1.4704 | 994 | 1.2248 | 0.6551 | 1.2248 | 1.1067 |
| 0.5124 | 1.4734 | 996 | 1.1526 | 0.6551 | 1.1526 | 1.0736 |
| 0.5124 | 1.4763 | 998 | 1.2478 | 0.6610 | 1.2478 | 1.1170 |
| 0.2027 | 1.4793 | 1000 | 1.2770 | 0.6610 | 1.2770 | 1.1300 |
| 0.2027 | 1.4822 | 1002 | 1.2318 | 0.6610 | 1.2318 | 1.1099 |
| 0.2027 | 1.4852 | 1004 | 1.0398 | 0.7157 | 1.0398 | 1.0197 |
| 0.2027 | 1.4882 | 1006 | 0.8259 | 0.8164 | 0.8259 | 0.9088 |
| 0.2027 | 1.4911 | 1008 | 0.7525 | 0.8164 | 0.7525 | 0.8675 |
| 0.2027 | 1.4941 | 1010 | 0.8021 | 0.7696 | 0.8021 | 0.8956 |
| 0.2027 | 1.4970 | 1012 | 0.9255 | 0.7594 | 0.9255 | 0.9620 |
| 0.2027 | 1.5 | 1014 | 1.0262 | 0.7507 | 1.0262 | 1.0130 |
| 0.2027 | 1.5030 | 1016 | 1.0360 | 0.7507 | 1.0360 | 1.0179 |
| 0.2027 | 1.5059 | 1018 | 1.0668 | 0.7434 | 1.0668 | 1.0329 |
| 0.2027 | 1.5089 | 1020 | 0.9393 | 0.6952 | 0.9393 | 0.9692 |
| 0.2027 | 1.5118 | 1022 | 0.7957 | 0.7983 | 0.7957 | 0.8920 |
| 0.2027 | 1.5148 | 1024 | 0.7440 | 0.7427 | 0.7440 | 0.8625 |
| 0.2027 | 1.5178 | 1026 | 0.7250 | 0.7427 | 0.7250 | 0.8515 |
| 0.2027 | 1.5207 | 1028 | 0.8198 | 0.7427 | 0.8198 | 0.9054 |
| 0.2027 | 1.5237 | 1030 | 1.0231 | 0.7216 | 1.0231 | 1.0115 |
| 0.2027 | 1.5266 | 1032 | 1.2797 | 0.6610 | 1.2797 | 1.1313 |
| 0.2027 | 1.5296 | 1034 | 1.3924 | 0.6610 | 1.3924 | 1.1800 |
| 0.2027 | 1.5325 | 1036 | 1.3601 | 0.6610 | 1.3601 | 1.1662 |
| 0.2027 | 1.5355 | 1038 | 1.2504 | 0.6680 | 1.2504 | 1.1182 |
| 0.2027 | 1.5385 | 1040 | 1.1420 | 0.7519 | 1.1420 | 1.0686 |
| 0.2027 | 1.5414 | 1042 | 1.1325 | 0.7519 | 1.1325 | 1.0642 |
| 0.2027 | 1.5444 | 1044 | 1.2034 | 0.6839 | 1.2034 | 1.0970 |
| 0.2027 | 1.5473 | 1046 | 1.2858 | 0.7188 | 1.2858 | 1.1339 |
| 0.2027 | 1.5503 | 1048 | 1.2784 | 0.7188 | 1.2784 | 1.1307 |
| 0.2027 | 1.5533 | 1050 | 1.2252 | 0.7188 | 1.2252 | 1.1069 |
| 0.2027 | 1.5562 | 1052 | 1.0635 | 0.7840 | 1.0635 | 1.0313 |
| 0.2027 | 1.5592 | 1054 | 0.8680 | 0.72 | 0.8680 | 0.9317 |
| 0.2027 | 1.5621 | 1056 | 0.7908 | 0.72 | 0.7908 | 0.8893 |
| 0.2027 | 1.5651 | 1058 | 0.8120 | 0.7520 | 0.8120 | 0.9011 |
| 0.2027 | 1.5680 | 1060 | 0.8354 | 0.7134 | 0.8354 | 0.9140 |
| 0.2027 | 1.5710 | 1062 | 0.9254 | 0.7216 | 0.9254 | 0.9620 |
| 0.2027 | 1.5740 | 1064 | 1.0193 | 0.7216 | 1.0193 | 1.0096 |
| 0.2027 | 1.5769 | 1066 | 1.0557 | 0.6616 | 1.0557 | 1.0274 |
| 0.2027 | 1.5799 | 1068 | 0.9851 | 0.7134 | 0.9851 | 0.9925 |
| 0.2027 | 1.5828 | 1070 | 0.9778 | 0.6824 | 0.9778 | 0.9888 |
| 0.2027 | 1.5858 | 1072 | 1.0820 | 0.6897 | 1.0820 | 1.0402 |
| 0.2027 | 1.5888 | 1074 | 1.2212 | 0.6680 | 1.2212 | 1.1051 |
| 0.2027 | 1.5917 | 1076 | 1.1884 | 0.6680 | 1.1884 | 1.0901 |
| 0.2027 | 1.5947 | 1078 | 1.0435 | 0.7188 | 1.0435 | 1.0215 |
| 0.2027 | 1.5976 | 1080 | 1.0055 | 0.6839 | 1.0055 | 1.0027 |
| 0.2027 | 1.6006 | 1082 | 1.1141 | 0.6680 | 1.1141 | 1.0555 |
| 0.2027 | 1.6036 | 1084 | 1.2438 | 0.6680 | 1.2438 | 1.1152 |
| 0.2027 | 1.6065 | 1086 | 1.2313 | 0.6610 | 1.2313 | 1.1096 |
| 0.2027 | 1.6095 | 1088 | 1.1180 | 0.6610 | 1.1180 | 1.0573 |
| 0.2027 | 1.6124 | 1090 | 1.0677 | 0.7211 | 1.0677 | 1.0333 |
| 0.2027 | 1.6154 | 1092 | 1.0659 | 0.7211 | 1.0659 | 1.0324 |
| 0.2027 | 1.6183 | 1094 | 1.0080 | 0.7305 | 1.0080 | 1.0040 |
| 0.2027 | 1.6213 | 1096 | 1.0406 | 0.7305 | 1.0406 | 1.0201 |
| 0.2027 | 1.6243 | 1098 | 1.1992 | 0.7211 | 1.1992 | 1.0951 |
| 0.2027 | 1.6272 | 1100 | 1.4250 | 0.6610 | 1.4250 | 1.1937 |
| 0.2027 | 1.6302 | 1102 | 1.4228 | 0.6610 | 1.4228 | 1.1928 |
| 0.2027 | 1.6331 | 1104 | 1.3426 | 0.7211 | 1.3426 | 1.1587 |
| 0.2027 | 1.6361 | 1106 | 1.2521 | 0.7211 | 1.2521 | 1.1190 |
| 0.2027 | 1.6391 | 1108 | 1.1786 | 0.7305 | 1.1786 | 1.0856 |
| 0.2027 | 1.6420 | 1110 | 1.1456 | 0.7840 | 1.1456 | 1.0703 |
| 0.2027 | 1.6450 | 1112 | 1.2251 | 0.7840 | 1.2251 | 1.1068 |
| 0.2027 | 1.6479 | 1114 | 1.4526 | 0.6349 | 1.4526 | 1.2052 |
| 0.2027 | 1.6509 | 1116 | 1.6923 | 0.5573 | 1.6923 | 1.3009 |
| 0.2027 | 1.6538 | 1118 | 1.6747 | 0.5600 | 1.6747 | 1.2941 |
| 0.2027 | 1.6568 | 1120 | 1.4693 | 0.6257 | 1.4693 | 1.2122 |
| 0.2027 | 1.6598 | 1122 | 1.1755 | 0.7188 | 1.1755 | 1.0842 |
| 0.2027 | 1.6627 | 1124 | 0.9689 | 0.72 | 0.9689 | 0.9843 |
| 0.2027 | 1.6657 | 1126 | 0.9462 | 0.72 | 0.9462 | 0.9727 |
| 0.2027 | 1.6686 | 1128 | 1.0151 | 0.7519 | 1.0151 | 1.0075 |
| 0.2027 | 1.6716 | 1130 | 1.0811 | 0.7724 | 1.0811 | 1.0397 |
| 0.2027 | 1.6746 | 1132 | 1.0912 | 0.7724 | 1.0912 | 1.0446 |
| 0.2027 | 1.6775 | 1134 | 1.0524 | 0.7216 | 1.0524 | 1.0259 |
| 0.2027 | 1.6805 | 1136 | 1.0760 | 0.7129 | 1.0760 | 1.0373 |
| 0.2027 | 1.6834 | 1138 | 1.2634 | 0.6610 | 1.2634 | 1.1240 |
| 0.2027 | 1.6864 | 1140 | 1.4169 | 0.6407 | 1.4169 | 1.1903 |
| 0.2027 | 1.6893 | 1142 | 1.3728 | 0.6610 | 1.3728 | 1.1717 |
| 0.2027 | 1.6923 | 1144 | 1.2512 | 0.7188 | 1.2512 | 1.1186 |
| 0.2027 | 1.6953 | 1146 | 1.1020 | 0.7840 | 1.1020 | 1.0498 |
| 0.2027 | 1.6982 | 1148 | 0.9609 | 0.7840 | 0.9609 | 0.9803 |
| 0.2027 | 1.7012 | 1150 | 0.8586 | 0.7520 | 0.8586 | 0.9266 |
| 0.2027 | 1.7041 | 1152 | 0.8447 | 0.6945 | 0.8447 | 0.9191 |
| 0.2027 | 1.7071 | 1154 | 0.9643 | 0.7129 | 0.9643 | 0.9820 |
| 0.2027 | 1.7101 | 1156 | 1.1496 | 0.7129 | 1.1496 | 1.0722 |
| 0.2027 | 1.7130 | 1158 | 1.1525 | 0.7420 | 1.1525 | 1.0735 |
| 0.2027 | 1.7160 | 1160 | 1.0140 | 0.7602 | 1.0140 | 1.0070 |
| 0.2027 | 1.7189 | 1162 | 0.8382 | 0.7437 | 0.8382 | 0.9156 |
| 0.2027 | 1.7219 | 1164 | 0.7141 | 0.7879 | 0.7141 | 0.8451 |
| 0.2027 | 1.7249 | 1166 | 0.7293 | 0.7620 | 0.7293 | 0.8540 |
| 0.2027 | 1.7278 | 1168 | 0.8555 | 0.7983 | 0.8555 | 0.9249 |
| 0.2027 | 1.7308 | 1170 | 1.0679 | 0.7216 | 1.0679 | 1.0334 |
| 0.2027 | 1.7337 | 1172 | 1.2894 | 0.6551 | 1.2894 | 1.1355 |
| 0.2027 | 1.7367 | 1174 | 1.3010 | 0.6303 | 1.3010 | 1.1406 |
| 0.2027 | 1.7396 | 1176 | 1.1568 | 0.6616 | 1.1568 | 1.0755 |
| 0.2027 | 1.7426 | 1178 | 0.9267 | 0.7983 | 0.9267 | 0.9626 |
| 0.2027 | 1.7456 | 1180 | 0.7456 | 0.7620 | 0.7456 | 0.8635 |
| 0.2027 | 1.7485 | 1182 | 0.7090 | 0.7620 | 0.7090 | 0.8420 |
| 0.2027 | 1.7515 | 1184 | 0.7289 | 0.7717 | 0.7289 | 0.8537 |
| 0.2027 | 1.7544 | 1186 | 0.8032 | 0.7983 | 0.8032 | 0.8962 |
| 0.2027 | 1.7574 | 1188 | 1.0024 | 0.7724 | 1.0024 | 1.0012 |
| 0.2027 | 1.7604 | 1190 | 1.1263 | 0.7216 | 1.1263 | 1.0613 |
| 0.2027 | 1.7633 | 1192 | 1.1630 | 0.7216 | 1.1630 | 1.0784 |
| 0.2027 | 1.7663 | 1194 | 1.1260 | 0.7724 | 1.1260 | 1.0611 |
| 0.2027 | 1.7692 | 1196 | 0.9999 | 0.7520 | 0.9999 | 0.9999 |
| 0.2027 | 1.7722 | 1198 | 0.8949 | 0.7520 | 0.8949 | 0.9460 |
| 0.2027 | 1.7751 | 1200 | 0.8393 | 0.72 | 0.8393 | 0.9161 |
| 0.2027 | 1.7781 | 1202 | 0.7766 | 0.7427 | 0.7766 | 0.8813 |
| 0.2027 | 1.7811 | 1204 | 0.8097 | 0.7422 | 0.8097 | 0.8998 |
| 0.2027 | 1.7840 | 1206 | 0.9306 | 0.7724 | 0.9306 | 0.9647 |
| 0.2027 | 1.7870 | 1208 | 1.0580 | 0.7129 | 1.0580 | 1.0286 |
| 0.2027 | 1.7899 | 1210 | 1.0144 | 0.7053 | 1.0144 | 1.0072 |
| 0.2027 | 1.7929 | 1212 | 1.0080 | 0.7053 | 1.0080 | 1.0040 |
| 0.2027 | 1.7959 | 1214 | 0.9547 | 0.7134 | 0.9547 | 0.9771 |
| 0.2027 | 1.7988 | 1216 | 0.8649 | 0.7618 | 0.8649 | 0.9300 |
| 0.2027 | 1.8018 | 1218 | 0.8642 | 0.7618 | 0.8642 | 0.9296 |
| 0.2027 | 1.8047 | 1220 | 0.9496 | 0.7724 | 0.9496 | 0.9745 |
| 0.2027 | 1.8077 | 1222 | 1.0637 | 0.7724 | 1.0637 | 1.0314 |
| 0.2027 | 1.8107 | 1224 | 1.0438 | 0.7413 | 1.0438 | 1.0217 |
| 0.2027 | 1.8136 | 1226 | 1.0562 | 0.7519 | 1.0562 | 1.0277 |
| 0.2027 | 1.8166 | 1228 | 1.0480 | 0.7519 | 1.0480 | 1.0237 |
| 0.2027 | 1.8195 | 1230 | 1.0601 | 0.7519 | 1.0601 | 1.0296 |
| 0.2027 | 1.8225 | 1232 | 0.9394 | 0.7413 | 0.9394 | 0.9692 |
| 0.2027 | 1.8254 | 1234 | 0.8760 | 0.7413 | 0.8760 | 0.9360 |
| 0.2027 | 1.8284 | 1236 | 0.8723 | 0.7413 | 0.8723 | 0.9340 |
| 0.2027 | 1.8314 | 1238 | 0.9070 | 0.7724 | 0.9070 | 0.9524 |
| 0.2027 | 1.8343 | 1240 | 0.9346 | 0.7724 | 0.9346 | 0.9667 |
| 0.2027 | 1.8373 | 1242 | 0.9246 | 0.7724 | 0.9246 | 0.9616 |
| 0.2027 | 1.8402 | 1244 | 1.0184 | 0.7216 | 1.0184 | 1.0092 |
| 0.2027 | 1.8432 | 1246 | 1.0145 | 0.7216 | 1.0145 | 1.0072 |
| 0.2027 | 1.8462 | 1248 | 0.9746 | 0.7216 | 0.9746 | 0.9872 |
| 0.2027 | 1.8491 | 1250 | 1.0259 | 0.7216 | 1.0259 | 1.0129 |
| 0.2027 | 1.8521 | 1252 | 1.1335 | 0.7216 | 1.1335 | 1.0646 |
| 0.2027 | 1.8550 | 1254 | 1.1524 | 0.7216 | 1.1524 | 1.0735 |
| 0.2027 | 1.8580 | 1256 | 1.1742 | 0.7216 | 1.1742 | 1.0836 |
| 0.2027 | 1.8609 | 1258 | 1.0785 | 0.7216 | 1.0785 | 1.0385 |
| 0.2027 | 1.8639 | 1260 | 0.9516 | 0.7724 | 0.9516 | 0.9755 |
| 0.2027 | 1.8669 | 1262 | 0.8954 | 0.7724 | 0.8954 | 0.9462 |
| 0.2027 | 1.8698 | 1264 | 0.8350 | 0.7724 | 0.8350 | 0.9138 |
| 0.2027 | 1.8728 | 1266 | 0.8384 | 0.7724 | 0.8384 | 0.9157 |
| 0.2027 | 1.8757 | 1268 | 0.9365 | 0.7724 | 0.9365 | 0.9677 |
| 0.2027 | 1.8787 | 1270 | 1.0440 | 0.7216 | 1.0440 | 1.0218 |
| 0.2027 | 1.8817 | 1272 | 1.1089 | 0.7216 | 1.1089 | 1.0531 |
| 0.2027 | 1.8846 | 1274 | 1.0823 | 0.7216 | 1.0823 | 1.0403 |
| 0.2027 | 1.8876 | 1276 | 1.0238 | 0.7724 | 1.0238 | 1.0118 |
| 0.2027 | 1.8905 | 1278 | 0.9472 | 0.7724 | 0.9472 | 0.9733 |
| 0.2027 | 1.8935 | 1280 | 0.8534 | 0.7729 | 0.8534 | 0.9238 |
| 0.2027 | 1.8964 | 1282 | 0.8127 | 0.7729 | 0.8127 | 0.9015 |
| 0.2027 | 1.8994 | 1284 | 0.8792 | 0.7844 | 0.8792 | 0.9377 |
| 0.2027 | 1.9024 | 1286 | 0.9333 | 0.7519 | 0.9333 | 0.9661 |
| 0.2027 | 1.9053 | 1288 | 1.0706 | 0.7840 | 1.0706 | 1.0347 |
| 0.2027 | 1.9083 | 1290 | 1.1552 | 0.7840 | 1.1552 | 1.0748 |
| 0.2027 | 1.9112 | 1292 | 1.1153 | 0.7305 | 1.1153 | 1.0561 |
| 0.2027 | 1.9142 | 1294 | 0.9649 | 0.7840 | 0.9649 | 0.9823 |
| 0.2027 | 1.9172 | 1296 | 0.7974 | 0.7844 | 0.7974 | 0.8930 |
| 0.2027 | 1.9201 | 1298 | 0.7322 | 0.8019 | 0.7322 | 0.8557 |
| 0.2027 | 1.9231 | 1300 | 0.7319 | 0.8019 | 0.7319 | 0.8555 |
| 0.2027 | 1.9260 | 1302 | 0.7839 | 0.7793 | 0.7839 | 0.8854 |
| 0.2027 | 1.9290 | 1304 | 0.7875 | 0.8283 | 0.7875 | 0.8874 |
| 0.2027 | 1.9320 | 1306 | 0.8742 | 0.7793 | 0.8742 | 0.9350 |
| 0.2027 | 1.9349 | 1308 | 1.0828 | 0.7421 | 1.0828 | 1.0406 |
| 0.2027 | 1.9379 | 1310 | 1.2391 | 0.6610 | 1.2391 | 1.1132 |
| 0.2027 | 1.9408 | 1312 | 1.2314 | 0.6680 | 1.2314 | 1.1097 |
| 0.2027 | 1.9438 | 1314 | 1.2424 | 0.6680 | 1.2424 | 1.1146 |
| 0.2027 | 1.9467 | 1316 | 1.2892 | 0.7188 | 1.2892 | 1.1354 |
| 0.2027 | 1.9497 | 1318 | 1.2065 | 0.7188 | 1.2065 | 1.0984 |
| 0.2027 | 1.9527 | 1320 | 1.0568 | 0.7194 | 1.0568 | 1.0280 |
| 0.2027 | 1.9556 | 1322 | 0.9709 | 0.7106 | 0.9709 | 0.9853 |
| 0.2027 | 1.9586 | 1324 | 0.9310 | 0.7729 | 0.9310 | 0.9649 |
| 0.2027 | 1.9615 | 1326 | 0.9968 | 0.7417 | 0.9968 | 0.9984 |
| 0.2027 | 1.9645 | 1328 | 1.1721 | 0.6551 | 1.1721 | 1.0827 |
| 0.2027 | 1.9675 | 1330 | 1.2560 | 0.6551 | 1.2560 | 1.1207 |
| 0.2027 | 1.9704 | 1332 | 1.3105 | 0.6551 | 1.3105 | 1.1448 |
| 0.2027 | 1.9734 | 1334 | 1.3047 | 0.6551 | 1.3047 | 1.1422 |
| 0.2027 | 1.9763 | 1336 | 1.1227 | 0.6551 | 1.1227 | 1.0596 |
| 0.2027 | 1.9793 | 1338 | 0.8940 | 0.7606 | 0.8940 | 0.9455 |
| 0.2027 | 1.9822 | 1340 | 0.7891 | 0.8054 | 0.7891 | 0.8883 |
| 0.2027 | 1.9852 | 1342 | 0.7859 | 0.8054 | 0.7859 | 0.8865 |
| 0.2027 | 1.9882 | 1344 | 0.7966 | 0.8164 | 0.7966 | 0.8926 |
| 0.2027 | 1.9911 | 1346 | 0.8101 | 0.8164 | 0.8101 | 0.9001 |
| 0.2027 | 1.9941 | 1348 | 0.8712 | 0.8283 | 0.8712 | 0.9334 |
| 0.2027 | 1.9970 | 1350 | 0.9879 | 0.8283 | 0.9879 | 0.9939 |
| 0.2027 | 2.0000 | 1352 | 1.1058 | 0.6934 | 1.1058 | 1.0516 |
| 0.2027 | 2.0030 | 1354 | 1.0945 | 0.7511 | 1.0945 | 1.0462 |
| 0.2027 | 2.0059 | 1356 | 1.0005 | 0.7898 | 1.0005 | 1.0003 |
| 0.2027 | 2.0089 | 1358 | 0.9025 | 0.8283 | 0.9025 | 0.9500 |
| 0.2027 | 2.0118 | 1360 | 0.8913 | 0.8283 | 0.8913 | 0.9441 |
| 0.2027 | 2.0148 | 1362 | 0.9784 | 0.8138 | 0.9784 | 0.9892 |
| 0.2027 | 2.0178 | 1364 | 1.0355 | 0.8138 | 1.0355 | 1.0176 |
| 0.2027 | 2.0207 | 1366 | 1.1199 | 0.7515 | 1.1199 | 1.0583 |
| 0.2027 | 2.0237 | 1368 | 1.2363 | 0.6680 | 1.2363 | 1.1119 |
| 0.2027 | 2.0266 | 1370 | 1.2404 | 0.6680 | 1.2404 | 1.1137 |
| 0.2027 | 2.0296 | 1372 | 1.1814 | 0.7188 | 1.1814 | 1.0869 |
| 0.2027 | 2.0325 | 1374 | 1.0447 | 0.7840 | 1.0447 | 1.0221 |
| 0.2027 | 2.0355 | 1376 | 0.9452 | 0.7844 | 0.9452 | 0.9722 |
| 0.2027 | 2.0385 | 1378 | 0.8490 | 0.7729 | 0.8490 | 0.9214 |
| 0.2027 | 2.0414 | 1380 | 0.8198 | 0.7729 | 0.8198 | 0.9054 |
| 0.2027 | 2.0444 | 1382 | 0.8613 | 0.7729 | 0.8613 | 0.9280 |
| 0.2027 | 2.0473 | 1384 | 1.0077 | 0.7305 | 1.0077 | 1.0038 |
| 0.2027 | 2.0503 | 1386 | 1.1931 | 0.7095 | 1.1931 | 1.0923 |
| 0.2027 | 2.0533 | 1388 | 1.2200 | 0.7095 | 1.2200 | 1.1046 |
| 0.2027 | 2.0562 | 1390 | 1.1135 | 0.7216 | 1.1135 | 1.0552 |
| 0.2027 | 2.0592 | 1392 | 0.9553 | 0.7724 | 0.9553 | 0.9774 |
| 0.2027 | 2.0621 | 1394 | 0.9035 | 0.7724 | 0.9035 | 0.9505 |
| 0.2027 | 2.0651 | 1396 | 0.8121 | 0.7112 | 0.8121 | 0.9012 |
| 0.2027 | 2.0680 | 1398 | 0.7745 | 0.7427 | 0.7745 | 0.8801 |
| 0.2027 | 2.0710 | 1400 | 0.8406 | 0.7618 | 0.8406 | 0.9168 |
| 0.2027 | 2.0740 | 1402 | 1.0117 | 0.7216 | 1.0117 | 1.0058 |
| 0.2027 | 2.0769 | 1404 | 1.2557 | 0.7129 | 1.2557 | 1.1206 |
| 0.2027 | 2.0799 | 1406 | 1.3077 | 0.6934 | 1.3077 | 1.1435 |
| 0.2027 | 2.0828 | 1408 | 1.1886 | 0.7129 | 1.1886 | 1.0902 |
| 0.2027 | 2.0858 | 1410 | 0.9789 | 0.7134 | 0.9789 | 0.9894 |
| 0.2027 | 2.0888 | 1412 | 0.8780 | 0.7618 | 0.8780 | 0.9370 |
| 0.2027 | 2.0917 | 1414 | 0.7670 | 0.7717 | 0.7670 | 0.8758 |
| 0.2027 | 2.0947 | 1416 | 0.7382 | 0.7717 | 0.7382 | 0.8592 |
| 0.2027 | 2.0976 | 1418 | 0.8082 | 0.7713 | 0.8082 | 0.8990 |
| 0.2027 | 2.1006 | 1420 | 0.9503 | 0.7618 | 0.9503 | 0.9748 |
| 0.2027 | 2.1036 | 1422 | 1.0865 | 0.7216 | 1.0865 | 1.0423 |
| 0.2027 | 2.1065 | 1424 | 1.2271 | 0.6680 | 1.2271 | 1.1077 |
| 0.2027 | 2.1095 | 1426 | 1.2042 | 0.7101 | 1.2042 | 1.0974 |
| 0.2027 | 2.1124 | 1428 | 1.0951 | 0.7724 | 1.0951 | 1.0465 |
| 0.2027 | 2.1154 | 1430 | 1.0425 | 0.7724 | 1.0425 | 1.0210 |
| 0.2027 | 2.1183 | 1432 | 0.9952 | 0.7618 | 0.9952 | 0.9976 |
| 0.2027 | 2.1213 | 1434 | 0.9337 | 0.7618 | 0.9337 | 0.9663 |
| 0.2027 | 2.1243 | 1436 | 0.9933 | 0.7618 | 0.9933 | 0.9967 |
| 0.2027 | 2.1272 | 1438 | 1.0419 | 0.7618 | 1.0419 | 1.0208 |
| 0.2027 | 2.1302 | 1440 | 1.0181 | 0.7618 | 1.0181 | 1.0090 |
| 0.2027 | 2.1331 | 1442 | 0.9504 | 0.7618 | 0.9504 | 0.9749 |
| 0.2027 | 2.1361 | 1444 | 0.9980 | 0.7724 | 0.9980 | 0.9990 |
| 0.2027 | 2.1391 | 1446 | 1.0772 | 0.7724 | 1.0772 | 1.0379 |
| 0.2027 | 2.1420 | 1448 | 1.1787 | 0.7724 | 1.1787 | 1.0857 |
| 0.2027 | 2.1450 | 1450 | 1.1875 | 0.7840 | 1.1875 | 1.0897 |
| 0.2027 | 2.1479 | 1452 | 1.1263 | 0.7519 | 1.1263 | 1.0613 |
| 0.2027 | 2.1509 | 1454 | 1.0700 | 0.7519 | 1.0700 | 1.0344 |
| 0.2027 | 2.1538 | 1456 | 1.0276 | 0.7413 | 1.0276 | 1.0137 |
| 0.2027 | 2.1568 | 1458 | 1.0212 | 0.7724 | 1.0212 | 1.0105 |
| 0.2027 | 2.1598 | 1460 | 1.0415 | 0.7724 | 1.0415 | 1.0206 |
| 0.2027 | 2.1627 | 1462 | 1.0731 | 0.7216 | 1.0731 | 1.0359 |
| 0.2027 | 2.1657 | 1464 | 1.1670 | 0.7216 | 1.1670 | 1.0803 |
| 0.2027 | 2.1686 | 1466 | 1.1642 | 0.7216 | 1.1642 | 1.0790 |
| 0.2027 | 2.1716 | 1468 | 1.1779 | 0.7216 | 1.1779 | 1.0853 |
| 0.2027 | 2.1746 | 1470 | 1.0652 | 0.7216 | 1.0652 | 1.0321 |
| 0.2027 | 2.1775 | 1472 | 1.0099 | 0.7724 | 1.0099 | 1.0050 |
| 0.2027 | 2.1805 | 1474 | 1.0008 | 0.7724 | 1.0008 | 1.0004 |
| 0.2027 | 2.1834 | 1476 | 0.9941 | 0.7724 | 0.9941 | 0.9970 |
| 0.2027 | 2.1864 | 1478 | 1.0495 | 0.7840 | 1.0495 | 1.0245 |
| 0.2027 | 2.1893 | 1480 | 1.0901 | 0.7840 | 1.0901 | 1.0441 |
| 0.2027 | 2.1923 | 1482 | 1.1383 | 0.7840 | 1.1383 | 1.0669 |
| 0.2027 | 2.1953 | 1484 | 1.1134 | 0.7626 | 1.1134 | 1.0552 |
| 0.2027 | 2.1982 | 1486 | 1.0597 | 0.7520 | 1.0597 | 1.0294 |
| 0.2027 | 2.2012 | 1488 | 0.9854 | 0.7520 | 0.9854 | 0.9927 |
| 0.2027 | 2.2041 | 1490 | 0.9767 | 0.7520 | 0.9767 | 0.9883 |
| 0.2027 | 2.2071 | 1492 | 0.9569 | 0.7020 | 0.9569 | 0.9782 |
| 0.2027 | 2.2101 | 1494 | 0.9386 | 0.7020 | 0.9386 | 0.9688 |
| 0.2027 | 2.2130 | 1496 | 0.9788 | 0.7520 | 0.9788 | 0.9893 |
| 0.2027 | 2.2160 | 1498 | 0.9178 | 0.7520 | 0.9178 | 0.9580 |
| 0.1429 | 2.2189 | 1500 | 0.8509 | 0.7422 | 0.8509 | 0.9224 |
| 0.1429 | 2.2219 | 1502 | 0.7499 | 0.7422 | 0.7499 | 0.8659 |
| 0.1429 | 2.2249 | 1504 | 0.6799 | 0.7879 | 0.6799 | 0.8245 |
| 0.1429 | 2.2278 | 1506 | 0.6976 | 0.7879 | 0.6976 | 0.8352 |
| 0.1429 | 2.2308 | 1508 | 0.7965 | 0.7983 | 0.7965 | 0.8925 |
| 0.1429 | 2.2337 | 1510 | 0.9768 | 0.6945 | 0.9768 | 0.9883 |
| 0.1429 | 2.2367 | 1512 | 1.1678 | 0.7129 | 1.1678 | 1.0806 |
| 0.1429 | 2.2396 | 1514 | 1.3075 | 0.6407 | 1.3075 | 1.1434 |
| 0.1429 | 2.2426 | 1516 | 1.2889 | 0.6884 | 1.2889 | 1.1353 |
| 0.1429 | 2.2456 | 1518 | 1.1692 | 0.7840 | 1.1692 | 1.0813 |
| 0.1429 | 2.2485 | 1520 | 1.0580 | 0.7519 | 1.0580 | 1.0286 |
| 0.1429 | 2.2515 | 1522 | 0.9749 | 0.7296 | 0.9749 | 0.9874 |
| 0.1429 | 2.2544 | 1524 | 0.9682 | 0.7626 | 0.9682 | 0.9840 |
| 0.1429 | 2.2574 | 1526 | 1.0404 | 0.7626 | 1.0404 | 1.0200 |
| 0.1429 | 2.2604 | 1528 | 1.1190 | 0.7101 | 1.1190 | 1.0578 |
| 0.1429 | 2.2633 | 1530 | 1.1386 | 0.7101 | 1.1386 | 1.0671 |
| 0.1429 | 2.2663 | 1532 | 1.1567 | 0.7101 | 1.1567 | 1.0755 |
| 0.1429 | 2.2692 | 1534 | 1.2360 | 0.7211 | 1.2360 | 1.1118 |
| 0.1429 | 2.2722 | 1536 | 1.2356 | 0.7305 | 1.2356 | 1.1116 |
| 0.1429 | 2.2751 | 1538 | 1.1011 | 0.7101 | 1.1011 | 1.0493 |
| 0.1429 | 2.2781 | 1540 | 0.9899 | 0.7520 | 0.9899 | 0.9949 |
| 0.1429 | 2.2811 | 1542 | 0.9165 | 0.6403 | 0.9165 | 0.9573 |
| 0.1429 | 2.2840 | 1544 | 0.8874 | 0.6403 | 0.8874 | 0.9420 |
| 0.1429 | 2.2870 | 1546 | 0.9359 | 0.6403 | 0.9359 | 0.9674 |
| 0.1429 | 2.2899 | 1548 | 1.0542 | 0.7296 | 1.0542 | 1.0267 |
| 0.1429 | 2.2929 | 1550 | 1.1002 | 0.7626 | 1.1002 | 1.0489 |
| 0.1429 | 2.2959 | 1552 | 1.0319 | 0.7296 | 1.0319 | 1.0158 |
| 0.1429 | 2.2988 | 1554 | 0.9245 | 0.7296 | 0.9245 | 0.9615 |
| 0.1429 | 2.3018 | 1556 | 0.8937 | 0.7296 | 0.8937 | 0.9454 |
| 0.1429 | 2.3047 | 1558 | 0.8407 | 0.7296 | 0.8407 | 0.9169 |
| 0.1429 | 2.3077 | 1560 | 0.7828 | 0.7733 | 0.7828 | 0.8847 |
| 0.1429 | 2.3107 | 1562 | 0.7910 | 0.7525 | 0.7910 | 0.8894 |
| 0.1429 | 2.3136 | 1564 | 0.8851 | 0.7296 | 0.8851 | 0.9408 |
| 0.1429 | 2.3166 | 1566 | 1.1154 | 0.7305 | 1.1154 | 1.0561 |
| 0.1429 | 2.3195 | 1568 | 1.2786 | 0.7230 | 1.2786 | 1.1307 |
| 0.1429 | 2.3225 | 1570 | 1.2614 | 0.7305 | 1.2614 | 1.1231 |
| 0.1429 | 2.3254 | 1572 | 1.1293 | 0.7840 | 1.1293 | 1.0627 |
| 0.1429 | 2.3284 | 1574 | 0.9546 | 0.7413 | 0.9546 | 0.9770 |
| 0.1429 | 2.3314 | 1576 | 0.8477 | 0.8019 | 0.8477 | 0.9207 |
| 0.1429 | 2.3343 | 1578 | 0.7778 | 0.7823 | 0.7778 | 0.8820 |
| 0.1429 | 2.3373 | 1580 | 0.7632 | 0.7717 | 0.7632 | 0.8736 |
| 0.1429 | 2.3402 | 1582 | 0.8270 | 0.8019 | 0.8270 | 0.9094 |
| 0.1429 | 2.3432 | 1584 | 0.9009 | 0.8283 | 0.9009 | 0.9492 |
| 0.1429 | 2.3462 | 1586 | 1.0544 | 0.7216 | 1.0544 | 1.0269 |
| 0.1429 | 2.3491 | 1588 | 1.1467 | 0.7216 | 1.1467 | 1.0708 |
| 0.1429 | 2.3521 | 1590 | 1.1455 | 0.7216 | 1.1455 | 1.0703 |
| 0.1429 | 2.3550 | 1592 | 1.0188 | 0.7724 | 1.0188 | 1.0093 |
| 0.1429 | 2.3580 | 1594 | 0.8845 | 0.7413 | 0.8845 | 0.9405 |
| 0.1429 | 2.3609 | 1596 | 0.8601 | 0.7729 | 0.8601 | 0.9274 |
| 0.1429 | 2.3639 | 1598 | 0.8620 | 0.8015 | 0.8620 | 0.9284 |
| 0.1429 | 2.3669 | 1600 | 0.8829 | 0.8015 | 0.8829 | 0.9396 |
| 0.1429 | 2.3698 | 1602 | 0.9048 | 0.7724 | 0.9048 | 0.9512 |
| 0.1429 | 2.3728 | 1604 | 0.9217 | 0.7216 | 0.9217 | 0.9600 |
| 0.1429 | 2.3757 | 1606 | 0.8948 | 0.7216 | 0.8948 | 0.9459 |
| 0.1429 | 2.3787 | 1608 | 0.8529 | 0.8015 | 0.8529 | 0.9235 |
| 0.1429 | 2.3817 | 1610 | 0.8610 | 0.7724 | 0.8610 | 0.9279 |
| 0.1429 | 2.3846 | 1612 | 0.9292 | 0.7724 | 0.9292 | 0.9640 |
| 0.1429 | 2.3876 | 1614 | 0.9651 | 0.7840 | 0.9651 | 0.9824 |
| 0.1429 | 2.3905 | 1616 | 1.0820 | 0.7840 | 1.0820 | 1.0402 |
| 0.1429 | 2.3935 | 1618 | 1.1566 | 0.7840 | 1.1566 | 1.0755 |
| 0.1429 | 2.3964 | 1620 | 1.1392 | 0.7840 | 1.1392 | 1.0673 |
| 0.1429 | 2.3994 | 1622 | 1.0587 | 0.7840 | 1.0587 | 1.0290 |
| 0.1429 | 2.4024 | 1624 | 0.9914 | 0.7840 | 0.9914 | 0.9957 |
| 0.1429 | 2.4053 | 1626 | 0.9850 | 0.7840 | 0.9850 | 0.9925 |
| 0.1429 | 2.4083 | 1628 | 1.0074 | 0.7840 | 1.0074 | 1.0037 |
| 0.1429 | 2.4112 | 1630 | 0.9917 | 0.7840 | 0.9917 | 0.9958 |
| 0.1429 | 2.4142 | 1632 | 0.9385 | 0.7840 | 0.9385 | 0.9687 |
| 0.1429 | 2.4172 | 1634 | 0.9730 | 0.7840 | 0.9730 | 0.9864 |
| 0.1429 | 2.4201 | 1636 | 1.0876 | 0.7840 | 1.0876 | 1.0429 |
| 0.1429 | 2.4231 | 1638 | 1.1681 | 0.7188 | 1.1681 | 1.0808 |
| 0.1429 | 2.4260 | 1640 | 1.2506 | 0.6680 | 1.2506 | 1.1183 |
| 0.1429 | 2.4290 | 1642 | 1.2324 | 0.6680 | 1.2324 | 1.1101 |
| 0.1429 | 2.4320 | 1644 | 1.1494 | 0.7305 | 1.1494 | 1.0721 |
| 0.1429 | 2.4349 | 1646 | 1.0692 | 0.7305 | 1.0692 | 1.0340 |
| 0.1429 | 2.4379 | 1648 | 1.0912 | 0.7211 | 1.0912 | 1.0446 |
| 0.1429 | 2.4408 | 1650 | 1.0712 | 0.7305 | 1.0712 | 1.0350 |
| 0.1429 | 2.4438 | 1652 | 1.0516 | 0.7216 | 1.0516 | 1.0255 |
| 0.1429 | 2.4467 | 1654 | 1.0886 | 0.7840 | 1.0886 | 1.0434 |
| 0.1429 | 2.4497 | 1656 | 1.1326 | 0.7840 | 1.1326 | 1.0642 |
| 0.1429 | 2.4527 | 1658 | 1.1204 | 0.7840 | 1.1204 | 1.0585 |
| 0.1429 | 2.4556 | 1660 | 1.0569 | 0.7840 | 1.0569 | 1.0281 |
| 0.1429 | 2.4586 | 1662 | 0.9871 | 0.7724 | 0.9871 | 0.9936 |
| 0.1429 | 2.4615 | 1664 | 0.9690 | 0.7724 | 0.9690 | 0.9844 |
| 0.1429 | 2.4645 | 1666 | 1.0108 | 0.7216 | 1.0108 | 1.0054 |
| 0.1429 | 2.4675 | 1668 | 1.0558 | 0.7216 | 1.0558 | 1.0275 |
| 0.1429 | 2.4704 | 1670 | 1.0887 | 0.7516 | 1.0887 | 1.0434 |
| 0.1429 | 2.4734 | 1672 | 1.0309 | 0.7516 | 1.0309 | 1.0153 |
| 0.1429 | 2.4763 | 1674 | 0.8772 | 0.7516 | 0.8772 | 0.9366 |
| 0.1429 | 2.4793 | 1676 | 0.8144 | 0.8164 | 0.8144 | 0.9025 |
| 0.1429 | 2.4822 | 1678 | 0.8580 | 0.8015 | 0.8580 | 0.9263 |
| 0.1429 | 2.4852 | 1680 | 1.0259 | 0.7216 | 1.0259 | 1.0129 |
| 0.1429 | 2.4882 | 1682 | 1.1710 | 0.7129 | 1.1710 | 1.0821 |
| 0.1429 | 2.4911 | 1684 | 1.1935 | 0.7129 | 1.1935 | 1.0925 |
| 0.1429 | 2.4941 | 1686 | 1.1651 | 0.7216 | 1.1651 | 1.0794 |
| 0.1429 | 2.4970 | 1688 | 1.0233 | 0.7516 | 1.0233 | 1.0116 |
| 0.1429 | 2.5000 | 1690 | 0.8537 | 0.8283 | 0.8537 | 0.9240 |
| 0.1429 | 2.5030 | 1692 | 0.8046 | 0.7717 | 0.8046 | 0.8970 |
| 0.1429 | 2.5059 | 1694 | 0.8393 | 0.8019 | 0.8393 | 0.9161 |
| 0.1429 | 2.5089 | 1696 | 0.9609 | 0.7793 | 0.9609 | 0.9802 |
| 0.1429 | 2.5118 | 1698 | 1.0597 | 0.7516 | 1.0597 | 1.0294 |
| 0.1429 | 2.5148 | 1700 | 1.1579 | 0.6616 | 1.1579 | 1.0761 |
| 0.1429 | 2.5178 | 1702 | 1.1391 | 0.6616 | 1.1391 | 1.0673 |
| 0.1429 | 2.5207 | 1704 | 0.9957 | 0.7793 | 0.9957 | 0.9978 |
| 0.1429 | 2.5237 | 1706 | 0.8332 | 0.8019 | 0.8332 | 0.9128 |
| 0.1429 | 2.5266 | 1708 | 0.7666 | 0.7906 | 0.7666 | 0.8755 |
| 0.1429 | 2.5296 | 1710 | 0.7299 | 0.7906 | 0.7299 | 0.8543 |
| 0.1429 | 2.5325 | 1712 | 0.7774 | 0.7906 | 0.7774 | 0.8817 |
| 0.1429 | 2.5355 | 1714 | 0.8416 | 0.8283 | 0.8416 | 0.9174 |
| 0.1429 | 2.5385 | 1716 | 0.9281 | 0.7793 | 0.9281 | 0.9634 |
| 0.1429 | 2.5414 | 1718 | 1.0225 | 0.7692 | 1.0225 | 1.0112 |
| 0.1429 | 2.5444 | 1720 | 1.1036 | 0.7511 | 1.1036 | 1.0505 |
| 0.1429 | 2.5473 | 1722 | 1.1008 | 0.7511 | 1.1008 | 1.0492 |
| 0.1429 | 2.5503 | 1724 | 0.9959 | 0.7421 | 0.9959 | 0.9980 |
| 0.1429 | 2.5533 | 1726 | 0.9075 | 0.7793 | 0.9075 | 0.9526 |
| 0.1429 | 2.5562 | 1728 | 0.8658 | 0.7793 | 0.8658 | 0.9305 |
| 0.1429 | 2.5592 | 1730 | 0.8619 | 0.7793 | 0.8619 | 0.9284 |
| 0.1429 | 2.5621 | 1732 | 0.8915 | 0.7793 | 0.8915 | 0.9442 |
| 0.1429 | 2.5651 | 1734 | 0.9928 | 0.7614 | 0.9928 | 0.9964 |
| 0.1429 | 2.5680 | 1736 | 1.0190 | 0.7305 | 1.0190 | 1.0095 |
| 0.1429 | 2.5710 | 1738 | 1.0376 | 0.7305 | 1.0376 | 1.0186 |
| 0.1429 | 2.5740 | 1740 | 1.0662 | 0.7840 | 1.0662 | 1.0325 |
| 0.1429 | 2.5769 | 1742 | 0.9729 | 0.7840 | 0.9729 | 0.9864 |
| 0.1429 | 2.5799 | 1744 | 0.8807 | 0.8412 | 0.8807 | 0.9385 |
| 0.1429 | 2.5828 | 1746 | 0.8821 | 0.8412 | 0.8821 | 0.9392 |
| 0.1429 | 2.5858 | 1748 | 0.8849 | 0.8412 | 0.8849 | 0.9407 |
| 0.1429 | 2.5888 | 1750 | 0.7951 | 0.8412 | 0.7951 | 0.8917 |
| 0.1429 | 2.5917 | 1752 | 0.7814 | 0.8412 | 0.7814 | 0.8840 |
| 0.1429 | 2.5947 | 1754 | 0.8674 | 0.8412 | 0.8674 | 0.9313 |
| 0.1429 | 2.5976 | 1756 | 0.9161 | 0.8138 | 0.9161 | 0.9571 |
| 0.1429 | 2.6006 | 1758 | 0.9612 | 0.7840 | 0.9612 | 0.9804 |
| 0.1429 | 2.6036 | 1760 | 0.9555 | 0.7840 | 0.9555 | 0.9775 |
| 0.1429 | 2.6065 | 1762 | 0.9541 | 0.7840 | 0.9541 | 0.9768 |
| 0.1429 | 2.6095 | 1764 | 0.9440 | 0.7840 | 0.9440 | 0.9716 |
| 0.1429 | 2.6124 | 1766 | 0.8729 | 0.8138 | 0.8729 | 0.9343 |
| 0.1429 | 2.6154 | 1768 | 0.8254 | 0.8283 | 0.8254 | 0.9085 |
| 0.1429 | 2.6183 | 1770 | 0.8261 | 0.8283 | 0.8261 | 0.9089 |
| 0.1429 | 2.6213 | 1772 | 0.8465 | 0.7840 | 0.8465 | 0.9200 |
| 0.1429 | 2.6243 | 1774 | 0.8766 | 0.7840 | 0.8766 | 0.9363 |
| 0.1429 | 2.6272 | 1776 | 0.9487 | 0.7840 | 0.9487 | 0.9740 |
| 0.1429 | 2.6302 | 1778 | 1.0681 | 0.7840 | 1.0681 | 1.0335 |
| 0.1429 | 2.6331 | 1780 | 1.1402 | 0.7840 | 1.1402 | 1.0678 |
| 0.1429 | 2.6361 | 1782 | 1.1035 | 0.7840 | 1.1035 | 1.0505 |
| 0.1429 | 2.6391 | 1784 | 0.9980 | 0.7840 | 0.9980 | 0.9990 |
| 0.1429 | 2.6420 | 1786 | 0.9062 | 0.7840 | 0.9062 | 0.9520 |
| 0.1429 | 2.6450 | 1788 | 0.8377 | 0.7626 | 0.8377 | 0.9153 |
| 0.1429 | 2.6479 | 1790 | 0.8331 | 0.7626 | 0.8331 | 0.9127 |
| 0.1429 | 2.6509 | 1792 | 0.8957 | 0.7626 | 0.8957 | 0.9464 |
| 0.1429 | 2.6538 | 1794 | 0.9214 | 0.7626 | 0.9214 | 0.9599 |
| 0.1429 | 2.6568 | 1796 | 0.8936 | 0.7626 | 0.8936 | 0.9453 |
| 0.1429 | 2.6598 | 1798 | 0.9284 | 0.7840 | 0.9284 | 0.9636 |
| 0.1429 | 2.6627 | 1800 | 0.8973 | 0.7626 | 0.8973 | 0.9472 |
| 0.1429 | 2.6657 | 1802 | 0.8917 | 0.7626 | 0.8917 | 0.9443 |
| 0.1429 | 2.6686 | 1804 | 0.9295 | 0.7626 | 0.9295 | 0.9641 |
| 0.1429 | 2.6716 | 1806 | 0.9484 | 0.7840 | 0.9484 | 0.9739 |
| 0.1429 | 2.6746 | 1808 | 0.8985 | 0.7296 | 0.8985 | 0.9479 |
| 0.1429 | 2.6775 | 1810 | 0.9287 | 0.7519 | 0.9287 | 0.9637 |
| 0.1429 | 2.6805 | 1812 | 0.9848 | 0.7840 | 0.9848 | 0.9924 |
| 0.1429 | 2.6834 | 1814 | 0.9454 | 0.7840 | 0.9454 | 0.9723 |
| 0.1429 | 2.6864 | 1816 | 0.8605 | 0.7296 | 0.8605 | 0.9276 |
| 0.1429 | 2.6893 | 1818 | 0.8060 | 0.7200 | 0.8060 | 0.8978 |
| 0.1429 | 2.6923 | 1820 | 0.7593 | 0.7418 | 0.7593 | 0.8714 |
| 0.1429 | 2.6953 | 1822 | 0.7535 | 0.7729 | 0.7535 | 0.8680 |
| 0.1429 | 2.6982 | 1824 | 0.8054 | 0.7020 | 0.8054 | 0.8975 |
| 0.1429 | 2.7012 | 1826 | 0.9041 | 0.7216 | 0.9041 | 0.9508 |
| 0.1429 | 2.7041 | 1828 | 1.1066 | 0.7216 | 1.1066 | 1.0519 |
| 0.1429 | 2.7071 | 1830 | 1.2403 | 0.7148 | 1.2403 | 1.1137 |
| 0.1429 | 2.7101 | 1832 | 1.1971 | 0.7211 | 1.1971 | 1.0941 |
| 0.1429 | 2.7130 | 1834 | 1.0272 | 0.7305 | 1.0272 | 1.0135 |
| 0.1429 | 2.7160 | 1836 | 0.8910 | 0.7840 | 0.8910 | 0.9439 |
| 0.1429 | 2.7189 | 1838 | 0.8488 | 0.7519 | 0.8488 | 0.9213 |
| 0.1429 | 2.7219 | 1840 | 0.7875 | 0.7519 | 0.7875 | 0.8874 |
| 0.1429 | 2.7249 | 1842 | 0.7793 | 0.7519 | 0.7793 | 0.8828 |
| 0.1429 | 2.7278 | 1844 | 0.8416 | 0.7519 | 0.8416 | 0.9174 |
| 0.1429 | 2.7308 | 1846 | 0.9337 | 0.7840 | 0.9337 | 0.9663 |
| 0.1429 | 2.7337 | 1848 | 1.1363 | 0.7840 | 1.1363 | 1.0660 |
| 0.1429 | 2.7367 | 1850 | 1.2267 | 0.6610 | 1.2267 | 1.1075 |
| 0.1429 | 2.7396 | 1852 | 1.1463 | 0.7305 | 1.1463 | 1.0706 |
| 0.1429 | 2.7426 | 1854 | 0.9590 | 0.7840 | 0.9590 | 0.9793 |
| 0.1429 | 2.7456 | 1856 | 0.7667 | 0.7844 | 0.7667 | 0.8756 |
| 0.1429 | 2.7485 | 1858 | 0.6720 | 0.8219 | 0.6720 | 0.8198 |
| 0.1429 | 2.7515 | 1860 | 0.6396 | 0.8023 | 0.6396 | 0.7998 |
| 0.1429 | 2.7544 | 1862 | 0.6449 | 0.8475 | 0.6449 | 0.8030 |
| 0.1429 | 2.7574 | 1864 | 0.6921 | 0.8475 | 0.6921 | 0.8319 |
| 0.1429 | 2.7604 | 1866 | 0.7797 | 0.7793 | 0.7797 | 0.8830 |
| 0.1429 | 2.7633 | 1868 | 0.9503 | 0.7216 | 0.9503 | 0.9748 |
| 0.1429 | 2.7663 | 1870 | 1.1123 | 0.7305 | 1.1123 | 1.0547 |
| 0.1429 | 2.7692 | 1872 | 1.2489 | 0.7305 | 1.2489 | 1.1176 |
| 0.1429 | 2.7722 | 1874 | 1.2702 | 0.7305 | 1.2702 | 1.1270 |
| 0.1429 | 2.7751 | 1876 | 1.1560 | 0.7840 | 1.1560 | 1.0752 |
| 0.1429 | 2.7781 | 1878 | 1.0071 | 0.7519 | 1.0071 | 1.0035 |
| 0.1429 | 2.7811 | 1880 | 0.9395 | 0.7413 | 0.9395 | 0.9693 |
| 0.1429 | 2.7840 | 1882 | 0.9518 | 0.7413 | 0.9518 | 0.9756 |
| 0.1429 | 2.7870 | 1884 | 0.9892 | 0.7724 | 0.9892 | 0.9946 |
| 0.1429 | 2.7899 | 1886 | 1.0118 | 0.7724 | 1.0118 | 1.0059 |
| 0.1429 | 2.7929 | 1888 | 0.9239 | 0.8015 | 0.9239 | 0.9612 |
| 0.1429 | 2.7959 | 1890 | 0.8930 | 0.8015 | 0.8930 | 0.9450 |
| 0.1429 | 2.7988 | 1892 | 0.8113 | 0.8015 | 0.8113 | 0.9007 |
| 0.1429 | 2.8018 | 1894 | 0.8179 | 0.7516 | 0.8179 | 0.9044 |
| 0.1429 | 2.8047 | 1896 | 0.8370 | 0.7516 | 0.8370 | 0.9149 |
| 0.1429 | 2.8077 | 1898 | 0.8714 | 0.7516 | 0.8714 | 0.9335 |
| 0.1429 | 2.8107 | 1900 | 0.9191 | 0.7216 | 0.9191 | 0.9587 |
| 0.1429 | 2.8136 | 1902 | 0.9380 | 0.7216 | 0.9380 | 0.9685 |
| 0.1429 | 2.8166 | 1904 | 0.9808 | 0.7216 | 0.9808 | 0.9903 |
| 0.1429 | 2.8195 | 1906 | 1.0158 | 0.7216 | 1.0158 | 1.0079 |
| 0.1429 | 2.8225 | 1908 | 1.0067 | 0.7724 | 1.0067 | 1.0034 |
| 0.1429 | 2.8254 | 1910 | 0.8932 | 0.7724 | 0.8932 | 0.9451 |
| 0.1429 | 2.8284 | 1912 | 0.8099 | 0.7724 | 0.8099 | 0.9000 |
| 0.1429 | 2.8314 | 1914 | 0.7824 | 0.7724 | 0.7824 | 0.8845 |
| 0.1429 | 2.8343 | 1916 | 0.8441 | 0.7724 | 0.8441 | 0.9188 |
| 0.1429 | 2.8373 | 1918 | 0.9730 | 0.7724 | 0.9730 | 0.9864 |
| 0.1429 | 2.8402 | 1920 | 1.0125 | 0.7216 | 1.0125 | 1.0062 |
| 0.1429 | 2.8432 | 1922 | 0.9742 | 0.7216 | 0.9742 | 0.9870 |
| 0.1429 | 2.8462 | 1924 | 0.9109 | 0.7724 | 0.9109 | 0.9544 |
| 0.1429 | 2.8491 | 1926 | 0.9532 | 0.7724 | 0.9532 | 0.9763 |
| 0.1429 | 2.8521 | 1928 | 1.0112 | 0.7724 | 1.0112 | 1.0056 |
| 0.1429 | 2.8550 | 1930 | 1.0149 | 0.7724 | 1.0149 | 1.0074 |
| 0.1429 | 2.8580 | 1932 | 0.9550 | 0.7724 | 0.9550 | 0.9773 |
| 0.1429 | 2.8609 | 1934 | 0.8822 | 0.7413 | 0.8822 | 0.9392 |
| 0.1429 | 2.8639 | 1936 | 0.9046 | 0.7724 | 0.9046 | 0.9511 |
| 0.1429 | 2.8669 | 1938 | 1.0342 | 0.7724 | 1.0342 | 1.0169 |
| 0.1429 | 2.8698 | 1940 | 1.1980 | 0.7216 | 1.1980 | 1.0945 |
| 0.1429 | 2.8728 | 1942 | 1.2083 | 0.7305 | 1.2083 | 1.0992 |
| 0.1429 | 2.8757 | 1944 | 1.0976 | 0.7724 | 1.0976 | 1.0477 |
| 0.1429 | 2.8787 | 1946 | 0.9175 | 0.7724 | 0.9175 | 0.9578 |
| 0.1429 | 2.8817 | 1948 | 0.7737 | 0.7200 | 0.7737 | 0.8796 |
| 0.1429 | 2.8846 | 1950 | 0.7425 | 0.7112 | 0.7425 | 0.8617 |
| 0.1429 | 2.8876 | 1952 | 0.8026 | 0.7724 | 0.8026 | 0.8959 |
| 0.1429 | 2.8905 | 1954 | 0.9301 | 0.7724 | 0.9301 | 0.9644 |
| 0.1429 | 2.8935 | 1956 | 1.0652 | 0.7724 | 1.0652 | 1.0321 |
| 0.1429 | 2.8964 | 1958 | 1.0490 | 0.7724 | 1.0490 | 1.0242 |
| 0.1429 | 2.8994 | 1960 | 0.9689 | 0.7724 | 0.9689 | 0.9843 |
| 0.1429 | 2.9024 | 1962 | 0.8176 | 0.7413 | 0.8176 | 0.9042 |
| 0.1429 | 2.9053 | 1964 | 0.6502 | 0.7717 | 0.6502 | 0.8064 |
| 0.1429 | 2.9083 | 1966 | 0.5673 | 0.7620 | 0.5673 | 0.7532 |
| 0.1429 | 2.9112 | 1968 | 0.5577 | 0.7620 | 0.5577 | 0.7468 |
| 0.1429 | 2.9142 | 1970 | 0.5944 | 0.7717 | 0.5944 | 0.7710 |
| 0.1429 | 2.9172 | 1972 | 0.6882 | 0.7717 | 0.6882 | 0.8296 |
| 0.1429 | 2.9201 | 1974 | 0.8182 | 0.7724 | 0.8182 | 0.9046 |
| 0.1429 | 2.9231 | 1976 | 0.9034 | 0.7724 | 0.9034 | 0.9505 |
| 0.1429 | 2.9260 | 1978 | 0.8788 | 0.7724 | 0.8788 | 0.9375 |
| 0.1429 | 2.9290 | 1980 | 0.8148 | 0.7724 | 0.8148 | 0.9026 |
| 0.1429 | 2.9320 | 1982 | 0.7775 | 0.7724 | 0.7775 | 0.8817 |
| 0.1429 | 2.9349 | 1984 | 0.7848 | 0.7724 | 0.7848 | 0.8859 |
| 0.1429 | 2.9379 | 1986 | 0.8229 | 0.7724 | 0.8229 | 0.9072 |
| 0.1429 | 2.9408 | 1988 | 0.8945 | 0.7724 | 0.8945 | 0.9458 |
| 0.1429 | 2.9438 | 1990 | 0.9073 | 0.7724 | 0.9073 | 0.9525 |
| 0.1429 | 2.9467 | 1992 | 0.9499 | 0.7724 | 0.9499 | 0.9746 |
| 0.1429 | 2.9497 | 1994 | 0.9183 | 0.7724 | 0.9183 | 0.9583 |
| 0.1429 | 2.9527 | 1996 | 0.9060 | 0.7724 | 0.9060 | 0.9519 |
| 0.1429 | 2.9556 | 1998 | 0.8116 | 0.7724 | 0.8116 | 0.9009 |
| 0.1162 | 2.9586 | 2000 | 0.7766 | 0.7724 | 0.7766 | 0.8812 |
| 0.1162 | 2.9615 | 2002 | 0.7710 | 0.7724 | 0.7710 | 0.8781 |
| 0.1162 | 2.9645 | 2004 | 0.7723 | 0.7724 | 0.7723 | 0.8788 |
| 0.1162 | 2.9675 | 2006 | 0.8075 | 0.7724 | 0.8075 | 0.8986 |
| 0.1162 | 2.9704 | 2008 | 0.8376 | 0.7724 | 0.8376 | 0.9152 |
| 0.1162 | 2.9734 | 2010 | 0.8857 | 0.7724 | 0.8857 | 0.9411 |
| 0.1162 | 2.9763 | 2012 | 0.9272 | 0.7724 | 0.9272 | 0.9629 |
| 0.1162 | 2.9793 | 2014 | 0.8780 | 0.7724 | 0.8780 | 0.9370 |
| 0.1162 | 2.9822 | 2016 | 0.8113 | 0.7724 | 0.8113 | 0.9007 |
| 0.1162 | 2.9852 | 2018 | 0.7864 | 0.7724 | 0.7864 | 0.8868 |
| 0.1162 | 2.9882 | 2020 | 0.7396 | 0.7413 | 0.7396 | 0.8600 |
| 0.1162 | 2.9911 | 2022 | 0.7368 | 0.7413 | 0.7368 | 0.8584 |
| 0.1162 | 2.9941 | 2024 | 0.7659 | 0.7724 | 0.7659 | 0.8752 |
| 0.1162 | 2.9970 | 2026 | 0.7277 | 0.7724 | 0.7277 | 0.8530 |
| 0.1162 | 3.0000 | 2028 | 0.7530 | 0.7724 | 0.7530 | 0.8678 |
| 0.1162 | 3.0030 | 2030 | 0.7409 | 0.7724 | 0.7409 | 0.8608 |
| 0.1162 | 3.0059 | 2032 | 0.7084 | 0.7724 | 0.7084 | 0.8416 |
| 0.1162 | 3.0089 | 2034 | 0.6341 | 0.7823 | 0.6341 | 0.7963 |
| 0.1162 | 3.0118 | 2036 | 0.6135 | 0.7717 | 0.6135 | 0.7833 |
| 0.1162 | 3.0148 | 2038 | 0.6579 | 0.8094 | 0.6579 | 0.8111 |
| 0.1162 | 3.0178 | 2040 | 0.7603 | 0.7520 | 0.7603 | 0.8720 |
| 0.1162 | 3.0207 | 2042 | 0.8973 | 0.7724 | 0.8973 | 0.9472 |
| 0.1162 | 3.0237 | 2044 | 0.9976 | 0.7724 | 0.9976 | 0.9988 |
| 0.1162 | 3.0266 | 2046 | 1.0832 | 0.7724 | 1.0832 | 1.0408 |
| 0.1162 | 3.0296 | 2048 | 1.1045 | 0.7724 | 1.1045 | 1.0510 |
| 0.1162 | 3.0325 | 2050 | 1.0299 | 0.7724 | 1.0299 | 1.0148 |
| 0.1162 | 3.0355 | 2052 | 0.9105 | 0.7724 | 0.9105 | 0.9542 |
| 0.1162 | 3.0385 | 2054 | 0.7824 | 0.7200 | 0.7824 | 0.8845 |
| 0.1162 | 3.0414 | 2056 | 0.7448 | 0.7200 | 0.7448 | 0.8630 |
| 0.1162 | 3.0444 | 2058 | 0.7912 | 0.7724 | 0.7912 | 0.8895 |
| 0.1162 | 3.0473 | 2060 | 0.8342 | 0.7724 | 0.8342 | 0.9133 |
| 0.1162 | 3.0503 | 2062 | 0.8239 | 0.7724 | 0.8239 | 0.9077 |
| 0.1162 | 3.0533 | 2064 | 0.7561 | 0.7618 | 0.7561 | 0.8696 |
| 0.1162 | 3.0562 | 2066 | 0.6718 | 0.7983 | 0.6718 | 0.8196 |
| 0.1162 | 3.0592 | 2068 | 0.6402 | 0.7879 | 0.6402 | 0.8001 |
| 0.1162 | 3.0621 | 2070 | 0.6778 | 0.7983 | 0.6778 | 0.8233 |
| 0.1162 | 3.0651 | 2072 | 0.7639 | 0.7724 | 0.7639 | 0.8740 |
| 0.1162 | 3.0680 | 2074 | 0.7944 | 0.7724 | 0.7944 | 0.8913 |
| 0.1162 | 3.0710 | 2076 | 0.7609 | 0.7724 | 0.7609 | 0.8723 |
| 0.1162 | 3.0740 | 2078 | 0.7314 | 0.7520 | 0.7314 | 0.8552 |
| 0.1162 | 3.0769 | 2080 | 0.7551 | 0.7520 | 0.7551 | 0.8690 |
| 0.1162 | 3.0799 | 2082 | 0.7911 | 0.7724 | 0.7911 | 0.8895 |
| 0.1162 | 3.0828 | 2084 | 0.7572 | 0.7200 | 0.7572 | 0.8702 |
| 0.1162 | 3.0858 | 2086 | 0.7157 | 0.7418 | 0.7157 | 0.8460 |
| 0.1162 | 3.0888 | 2088 | 0.7232 | 0.7418 | 0.7232 | 0.8504 |
| 0.1162 | 3.0917 | 2090 | 0.7886 | 0.7631 | 0.7886 | 0.8880 |
| 0.1162 | 3.0947 | 2092 | 0.8901 | 0.7724 | 0.8901 | 0.9435 |
| 0.1162 | 3.0976 | 2094 | 0.9089 | 0.7724 | 0.9089 | 0.9534 |
| 0.1162 | 3.1006 | 2096 | 0.9423 | 0.7305 | 0.9423 | 0.9707 |
| 0.1162 | 3.1036 | 2098 | 0.9445 | 0.7840 | 0.9445 | 0.9719 |
| 0.1162 | 3.1065 | 2100 | 0.9041 | 0.7840 | 0.9041 | 0.9509 |
| 0.1162 | 3.1095 | 2102 | 0.8881 | 0.7724 | 0.8881 | 0.9424 |
| 0.1162 | 3.1124 | 2104 | 0.8297 | 0.7724 | 0.8297 | 0.9109 |
| 0.1162 | 3.1154 | 2106 | 0.8237 | 0.7724 | 0.8237 | 0.9076 |
| 0.1162 | 3.1183 | 2108 | 0.8509 | 0.7724 | 0.8509 | 0.9224 |
| 0.1162 | 3.1213 | 2110 | 0.9520 | 0.7724 | 0.9520 | 0.9757 |
| 0.1162 | 3.1243 | 2112 | 1.0214 | 0.7724 | 1.0214 | 1.0107 |
| 0.1162 | 3.1272 | 2114 | 1.0221 | 0.7724 | 1.0221 | 1.0110 |
| 0.1162 | 3.1302 | 2116 | 1.0414 | 0.7724 | 1.0414 | 1.0205 |
| 0.1162 | 3.1331 | 2118 | 1.0552 | 0.7840 | 1.0552 | 1.0273 |
| 0.1162 | 3.1361 | 2120 | 1.1041 | 0.7840 | 1.1041 | 1.0508 |
| 0.1162 | 3.1391 | 2122 | 1.1316 | 0.7840 | 1.1316 | 1.0637 |
| 0.1162 | 3.1420 | 2124 | 1.1518 | 0.7188 | 1.1518 | 1.0732 |
| 0.1162 | 3.1450 | 2126 | 1.0601 | 0.7840 | 1.0601 | 1.0296 |
| 0.1162 | 3.1479 | 2128 | 0.9684 | 0.7724 | 0.9684 | 0.9841 |
| 0.1162 | 3.1509 | 2130 | 0.8869 | 0.7724 | 0.8869 | 0.9417 |
| 0.1162 | 3.1538 | 2132 | 0.8017 | 0.7520 | 0.8017 | 0.8954 |
| 0.1162 | 3.1568 | 2134 | 0.7347 | 0.7819 | 0.7347 | 0.8572 |
| 0.1162 | 3.1598 | 2136 | 0.7199 | 0.7819 | 0.7199 | 0.8485 |
| 0.1162 | 3.1627 | 2138 | 0.7722 | 0.7819 | 0.7722 | 0.8787 |
| 0.1162 | 3.1657 | 2140 | 0.9118 | 0.7724 | 0.9118 | 0.9549 |
| 0.1162 | 3.1686 | 2142 | 1.0244 | 0.7216 | 1.0244 | 1.0122 |
| 0.1162 | 3.1716 | 2144 | 1.0311 | 0.7216 | 1.0311 | 1.0154 |
| 0.1162 | 3.1746 | 2146 | 0.9899 | 0.7724 | 0.9899 | 0.9949 |
| 0.1162 | 3.1775 | 2148 | 0.9933 | 0.7724 | 0.9933 | 0.9966 |
| 0.1162 | 3.1805 | 2150 | 1.0403 | 0.7305 | 1.0403 | 1.0200 |
| 0.1162 | 3.1834 | 2152 | 1.1078 | 0.7305 | 1.1078 | 1.0525 |
| 0.1162 | 3.1864 | 2154 | 1.1671 | 0.7095 | 1.1671 | 1.0803 |
| 0.1162 | 3.1893 | 2156 | 1.2117 | 0.7095 | 1.2117 | 1.1008 |
| 0.1162 | 3.1923 | 2158 | 1.1408 | 0.7095 | 1.1408 | 1.0681 |
| 0.1162 | 3.1953 | 2160 | 0.9917 | 0.7724 | 0.9917 | 0.9959 |
| 0.1162 | 3.1982 | 2162 | 0.9095 | 0.7724 | 0.9095 | 0.9537 |
| 0.1162 | 3.2012 | 2164 | 0.9150 | 0.7724 | 0.9150 | 0.9566 |
| 0.1162 | 3.2041 | 2166 | 0.9236 | 0.7724 | 0.9236 | 0.9610 |
| 0.1162 | 3.2071 | 2168 | 0.9520 | 0.7840 | 0.9520 | 0.9757 |
| 0.1162 | 3.2101 | 2170 | 0.9959 | 0.7840 | 0.9959 | 0.9979 |
| 0.1162 | 3.2130 | 2172 | 1.0747 | 0.7621 | 1.0747 | 1.0367 |
| 0.1162 | 3.2160 | 2174 | 1.0785 | 0.7840 | 1.0785 | 1.0385 |
| 0.1162 | 3.2189 | 2176 | 0.9945 | 0.7840 | 0.9945 | 0.9973 |
| 0.1162 | 3.2219 | 2178 | 0.8835 | 0.7729 | 0.8835 | 0.9399 |
| 0.1162 | 3.2249 | 2180 | 0.8047 | 0.7729 | 0.8047 | 0.8970 |
| 0.1162 | 3.2278 | 2182 | 0.7977 | 0.7729 | 0.7977 | 0.8931 |
| 0.1162 | 3.2308 | 2184 | 0.8718 | 0.8015 | 0.8718 | 0.9337 |
| 0.1162 | 3.2337 | 2186 | 0.9767 | 0.7724 | 0.9767 | 0.9883 |
| 0.1162 | 3.2367 | 2188 | 1.1090 | 0.7095 | 1.1090 | 1.0531 |
| 0.1162 | 3.2396 | 2190 | 1.1811 | 0.7095 | 1.1811 | 1.0868 |
| 0.1162 | 3.2426 | 2192 | 1.2044 | 0.7095 | 1.2044 | 1.0974 |
| 0.1162 | 3.2456 | 2194 | 1.2650 | 0.7621 | 1.2650 | 1.1247 |
| 0.1162 | 3.2485 | 2196 | 1.2323 | 0.7621 | 1.2323 | 1.1101 |
| 0.1162 | 3.2515 | 2198 | 1.0917 | 0.7194 | 1.0917 | 1.0449 |
| 0.1162 | 3.2544 | 2200 | 0.9305 | 0.7413 | 0.9305 | 0.9646 |
| 0.1162 | 3.2574 | 2202 | 0.8309 | 0.7729 | 0.8309 | 0.9115 |
| 0.1162 | 3.2604 | 2204 | 0.8142 | 0.8019 | 0.8142 | 0.9023 |
| 0.1162 | 3.2633 | 2206 | 0.8400 | 0.8283 | 0.8400 | 0.9165 |
| 0.1162 | 3.2663 | 2208 | 0.8883 | 0.8283 | 0.8883 | 0.9425 |
| 0.1162 | 3.2692 | 2210 | 0.9934 | 0.7724 | 0.9934 | 0.9967 |
| 0.1162 | 3.2722 | 2212 | 1.0140 | 0.7724 | 1.0140 | 1.0070 |
| 0.1162 | 3.2751 | 2214 | 1.0551 | 0.7724 | 1.0551 | 1.0272 |
| 0.1162 | 3.2781 | 2216 | 0.9842 | 0.7724 | 0.9842 | 0.9921 |
| 0.1162 | 3.2811 | 2218 | 0.8601 | 0.8283 | 0.8601 | 0.9274 |
| 0.1162 | 3.2840 | 2220 | 0.8382 | 0.8283 | 0.8382 | 0.9155 |
| 0.1162 | 3.2870 | 2222 | 0.8617 | 0.8283 | 0.8617 | 0.9283 |
| 0.1162 | 3.2899 | 2224 | 0.8859 | 0.8015 | 0.8859 | 0.9412 |
| 0.1162 | 3.2929 | 2226 | 0.9479 | 0.7724 | 0.9479 | 0.9736 |
| 0.1162 | 3.2959 | 2228 | 1.0165 | 0.7724 | 1.0165 | 1.0082 |
| 0.1162 | 3.2988 | 2230 | 1.0763 | 0.7515 | 1.0763 | 1.0374 |
| 0.1162 | 3.3018 | 2232 | 1.1265 | 0.7515 | 1.1265 | 1.0614 |
| 0.1162 | 3.3047 | 2234 | 1.1280 | 0.6890 | 1.1280 | 1.0621 |
| 0.1162 | 3.3077 | 2236 | 1.0162 | 0.7724 | 1.0162 | 1.0081 |
| 0.1162 | 3.3107 | 2238 | 0.8659 | 0.7413 | 0.8659 | 0.9305 |
| 0.1162 | 3.3136 | 2240 | 0.7813 | 0.7200 | 0.7813 | 0.8839 |
| 0.1162 | 3.3166 | 2242 | 0.7626 | 0.7823 | 0.7626 | 0.8733 |
| 0.1162 | 3.3195 | 2244 | 0.8184 | 0.7724 | 0.8184 | 0.9047 |
| 0.1162 | 3.3225 | 2246 | 0.9424 | 0.7724 | 0.9424 | 0.9708 |
| 0.1162 | 3.3254 | 2248 | 1.0574 | 0.7216 | 1.0574 | 1.0283 |
| 0.1162 | 3.3284 | 2250 | 1.0485 | 0.7216 | 1.0485 | 1.0240 |
| 0.1162 | 3.3314 | 2252 | 0.9405 | 0.7724 | 0.9405 | 0.9698 |
| 0.1162 | 3.3343 | 2254 | 0.9177 | 0.7724 | 0.9177 | 0.9579 |
| 0.1162 | 3.3373 | 2256 | 0.9978 | 0.7724 | 0.9978 | 0.9989 |
| 0.1162 | 3.3402 | 2258 | 1.0211 | 0.7724 | 1.0211 | 1.0105 |
| 0.1162 | 3.3432 | 2260 | 1.0102 | 0.7724 | 1.0102 | 1.0051 |
| 0.1162 | 3.3462 | 2262 | 0.8901 | 0.7724 | 0.8901 | 0.9434 |
| 0.1162 | 3.3491 | 2264 | 0.7977 | 0.7724 | 0.7977 | 0.8931 |
| 0.1162 | 3.3521 | 2266 | 0.7835 | 0.7724 | 0.7835 | 0.8852 |
| 0.1162 | 3.3550 | 2268 | 0.8506 | 0.7724 | 0.8506 | 0.9223 |
| 0.1162 | 3.3580 | 2270 | 0.9911 | 0.7724 | 0.9911 | 0.9955 |
| 0.1162 | 3.3609 | 2272 | 1.0787 | 0.7840 | 1.0787 | 1.0386 |
| 0.1162 | 3.3639 | 2274 | 1.0661 | 0.7840 | 1.0661 | 1.0325 |
| 0.1162 | 3.3669 | 2276 | 1.0247 | 0.7724 | 1.0247 | 1.0123 |
| 0.1162 | 3.3698 | 2278 | 0.9857 | 0.7724 | 0.9857 | 0.9928 |
| 0.1162 | 3.3728 | 2280 | 0.8962 | 0.7724 | 0.8962 | 0.9467 |
| 0.1162 | 3.3757 | 2282 | 0.7898 | 0.7413 | 0.7898 | 0.8887 |
| 0.1162 | 3.3787 | 2284 | 0.7466 | 0.7413 | 0.7466 | 0.8641 |
| 0.1162 | 3.3817 | 2286 | 0.7975 | 0.7724 | 0.7975 | 0.8930 |
| 0.1162 | 3.3846 | 2288 | 0.9353 | 0.7216 | 0.9353 | 0.9671 |
| 0.1162 | 3.3876 | 2290 | 1.1304 | 0.7216 | 1.1304 | 1.0632 |
| 0.1162 | 3.3905 | 2292 | 1.1809 | 0.7216 | 1.1809 | 1.0867 |
| 0.1162 | 3.3935 | 2294 | 1.1682 | 0.7305 | 1.1682 | 1.0809 |
| 0.1162 | 3.3964 | 2296 | 1.0968 | 0.7305 | 1.0968 | 1.0473 |
| 0.1162 | 3.3994 | 2298 | 0.9606 | 0.7724 | 0.9606 | 0.9801 |
| 0.1162 | 3.4024 | 2300 | 0.8493 | 0.7724 | 0.8493 | 0.9216 |
| 0.1162 | 3.4053 | 2302 | 0.7809 | 0.7413 | 0.7809 | 0.8837 |
| 0.1162 | 3.4083 | 2304 | 0.7608 | 0.7413 | 0.7608 | 0.8723 |
| 0.1162 | 3.4112 | 2306 | 0.8043 | 0.7413 | 0.8043 | 0.8968 |
| 0.1162 | 3.4142 | 2308 | 0.8935 | 0.7724 | 0.8935 | 0.9452 |
| 0.1162 | 3.4172 | 2310 | 1.0699 | 0.7840 | 1.0699 | 1.0343 |
| 0.1162 | 3.4201 | 2312 | 1.1944 | 0.7305 | 1.1944 | 1.0929 |
| 0.1162 | 3.4231 | 2314 | 1.1693 | 0.7305 | 1.1693 | 1.0814 |
| 0.1162 | 3.4260 | 2316 | 1.0631 | 0.7305 | 1.0631 | 1.0311 |
| 0.1162 | 3.4290 | 2318 | 0.9026 | 0.7724 | 0.9026 | 0.9501 |
| 0.1162 | 3.4320 | 2320 | 0.7982 | 0.7618 | 0.7982 | 0.8934 |
| 0.1162 | 3.4349 | 2322 | 0.7758 | 0.7618 | 0.7758 | 0.8808 |
| 0.1162 | 3.4379 | 2324 | 0.8081 | 0.7618 | 0.8081 | 0.8989 |
| 0.1162 | 3.4408 | 2326 | 0.8982 | 0.7618 | 0.8982 | 0.9478 |
| 0.1162 | 3.4438 | 2328 | 0.9911 | 0.7724 | 0.9911 | 0.9955 |
| 0.1162 | 3.4467 | 2330 | 0.9801 | 0.7724 | 0.9801 | 0.9900 |
| 0.1162 | 3.4497 | 2332 | 0.9000 | 0.7618 | 0.9000 | 0.9487 |
| 0.1162 | 3.4527 | 2334 | 0.7839 | 0.7316 | 0.7839 | 0.8854 |
| 0.1162 | 3.4556 | 2336 | 0.7508 | 0.7427 | 0.7508 | 0.8665 |
| 0.1162 | 3.4586 | 2338 | 0.7855 | 0.7427 | 0.7855 | 0.8863 |
| 0.1162 | 3.4615 | 2340 | 0.8798 | 0.7413 | 0.8798 | 0.9380 |
| 0.1162 | 3.4645 | 2342 | 0.9026 | 0.7413 | 0.9026 | 0.9500 |
| 0.1162 | 3.4675 | 2344 | 0.8616 | 0.7200 | 0.8616 | 0.9282 |
| 0.1162 | 3.4704 | 2346 | 0.8102 | 0.7200 | 0.8102 | 0.9001 |
| 0.1162 | 3.4734 | 2348 | 0.8029 | 0.7525 | 0.8029 | 0.8961 |
| 0.1162 | 3.4763 | 2350 | 0.7815 | 0.7525 | 0.7815 | 0.8840 |
| 0.1162 | 3.4793 | 2352 | 0.7968 | 0.7525 | 0.7968 | 0.8927 |
| 0.1162 | 3.4822 | 2354 | 0.8540 | 0.7819 | 0.8540 | 0.9241 |
| 0.1162 | 3.4852 | 2356 | 0.9520 | 0.7516 | 0.9520 | 0.9757 |
| 0.1162 | 3.4882 | 2358 | 0.9549 | 0.7516 | 0.9549 | 0.9772 |
| 0.1162 | 3.4911 | 2360 | 0.9086 | 0.7516 | 0.9086 | 0.9532 |
| 0.1162 | 3.4941 | 2362 | 0.8341 | 0.7327 | 0.8341 | 0.9133 |
| 0.1162 | 3.4970 | 2364 | 0.7672 | 0.7983 | 0.7672 | 0.8759 |
| 0.1162 | 3.5000 | 2366 | 0.7149 | 0.8232 | 0.7149 | 0.8455 |
| 0.1162 | 3.5030 | 2368 | 0.7119 | 0.7986 | 0.7119 | 0.8437 |
| 0.1162 | 3.5059 | 2370 | 0.7458 | 0.7717 | 0.7458 | 0.8636 |
| 0.1162 | 3.5089 | 2372 | 0.7282 | 0.7882 | 0.7282 | 0.8534 |
| 0.1162 | 3.5118 | 2374 | 0.7380 | 0.7882 | 0.7380 | 0.8590 |
| 0.1162 | 3.5148 | 2376 | 0.8078 | 0.7427 | 0.8078 | 0.8988 |
| 0.1162 | 3.5178 | 2378 | 0.8473 | 0.7729 | 0.8473 | 0.9205 |
| 0.1162 | 3.5207 | 2380 | 0.8808 | 0.7724 | 0.8808 | 0.9385 |
| 0.1162 | 3.5237 | 2382 | 0.8302 | 0.7623 | 0.8302 | 0.9111 |
| 0.1162 | 3.5266 | 2384 | 0.7617 | 0.7620 | 0.7617 | 0.8727 |
| 0.1162 | 3.5296 | 2386 | 0.7344 | 0.7882 | 0.7344 | 0.8570 |
| 0.1162 | 3.5325 | 2388 | 0.7336 | 0.7620 | 0.7336 | 0.8565 |
| 0.1162 | 3.5355 | 2390 | 0.7938 | 0.7413 | 0.7938 | 0.8910 |
| 0.1162 | 3.5385 | 2392 | 0.8755 | 0.7840 | 0.8755 | 0.9357 |
| 0.1162 | 3.5414 | 2394 | 0.9611 | 0.7840 | 0.9611 | 0.9803 |
| 0.1162 | 3.5444 | 2396 | 0.9955 | 0.7305 | 0.9955 | 0.9977 |
| 0.1162 | 3.5473 | 2398 | 0.9407 | 0.7305 | 0.9407 | 0.9699 |
| 0.1162 | 3.5503 | 2400 | 0.8313 | 0.7840 | 0.8313 | 0.9118 |
| 0.1162 | 3.5533 | 2402 | 0.7876 | 0.8015 | 0.7876 | 0.8875 |
| 0.1162 | 3.5562 | 2404 | 0.8044 | 0.8138 | 0.8044 | 0.8969 |
| 0.1162 | 3.5592 | 2406 | 0.8877 | 0.7840 | 0.8877 | 0.9422 |
| 0.1162 | 3.5621 | 2408 | 0.9590 | 0.7840 | 0.9590 | 0.9793 |
| 0.1162 | 3.5651 | 2410 | 1.0271 | 0.7840 | 1.0271 | 1.0135 |
| 0.1162 | 3.5680 | 2412 | 1.0207 | 0.7840 | 1.0207 | 1.0103 |
| 0.1162 | 3.5710 | 2414 | 0.9388 | 0.7840 | 0.9388 | 0.9689 |
| 0.1162 | 3.5740 | 2416 | 0.8605 | 0.7840 | 0.8605 | 0.9276 |
| 0.1162 | 3.5769 | 2418 | 0.7621 | 0.8412 | 0.7621 | 0.8730 |
| 0.1162 | 3.5799 | 2420 | 0.6715 | 0.8123 | 0.6715 | 0.8194 |
| 0.1162 | 3.5828 | 2422 | 0.6706 | 0.8123 | 0.6706 | 0.8189 |
| 0.1162 | 3.5858 | 2424 | 0.7484 | 0.8293 | 0.7484 | 0.8651 |
| 0.1162 | 3.5888 | 2426 | 0.8841 | 0.7606 | 0.8841 | 0.9403 |
| 0.1162 | 3.5917 | 2428 | 1.0645 | 0.7421 | 1.0645 | 1.0318 |
| 0.1162 | 3.5947 | 2430 | 1.1243 | 0.6545 | 1.1243 | 1.0603 |
| 0.1162 | 3.5976 | 2432 | 1.1148 | 0.6551 | 1.1148 | 1.0559 |
| 0.1162 | 3.6006 | 2434 | 1.0023 | 0.7239 | 1.0023 | 1.0011 |
| 0.1162 | 3.6036 | 2436 | 0.8786 | 0.8164 | 0.8786 | 0.9373 |
| 0.1162 | 3.6065 | 2438 | 0.8233 | 0.8164 | 0.8233 | 0.9073 |
| 0.1162 | 3.6095 | 2440 | 0.7917 | 0.8164 | 0.7917 | 0.8898 |
| 0.1162 | 3.6124 | 2442 | 0.8218 | 0.8164 | 0.8218 | 0.9065 |
| 0.1162 | 3.6154 | 2444 | 0.9057 | 0.8283 | 0.9057 | 0.9517 |
| 0.1162 | 3.6183 | 2446 | 0.9523 | 0.7840 | 0.9523 | 0.9759 |
| 0.1162 | 3.6213 | 2448 | 0.9302 | 0.7840 | 0.9302 | 0.9645 |
| 0.1162 | 3.6243 | 2450 | 0.8485 | 0.7724 | 0.8485 | 0.9211 |
| 0.1162 | 3.6272 | 2452 | 0.7351 | 0.8283 | 0.7351 | 0.8574 |
| 0.1162 | 3.6302 | 2454 | 0.6785 | 0.8232 | 0.6785 | 0.8237 |
| 0.1162 | 3.6331 | 2456 | 0.6948 | 0.8283 | 0.6948 | 0.8336 |
| 0.1162 | 3.6361 | 2458 | 0.7485 | 0.8138 | 0.7485 | 0.8652 |
| 0.1162 | 3.6391 | 2460 | 0.7742 | 0.7840 | 0.7742 | 0.8799 |
| 0.1162 | 3.6420 | 2462 | 0.8258 | 0.7840 | 0.8258 | 0.9087 |
| 0.1162 | 3.6450 | 2464 | 0.8494 | 0.7840 | 0.8494 | 0.9216 |
| 0.1162 | 3.6479 | 2466 | 0.8609 | 0.7519 | 0.8609 | 0.9278 |
| 0.1162 | 3.6509 | 2468 | 0.8377 | 0.7519 | 0.8377 | 0.9153 |
| 0.1162 | 3.6538 | 2470 | 0.7964 | 0.7844 | 0.7964 | 0.8924 |
| 0.1162 | 3.6568 | 2472 | 0.7550 | 0.8015 | 0.7550 | 0.8689 |
| 0.1162 | 3.6598 | 2474 | 0.7731 | 0.8015 | 0.7731 | 0.8792 |
| 0.1162 | 3.6627 | 2476 | 0.8439 | 0.7516 | 0.8439 | 0.9186 |
| 0.1162 | 3.6657 | 2478 | 0.9745 | 0.7216 | 0.9745 | 0.9872 |
| 0.1162 | 3.6686 | 2480 | 1.1186 | 0.6680 | 1.1186 | 1.0576 |
| 0.1162 | 3.6716 | 2482 | 1.2594 | 0.6469 | 1.2594 | 1.1222 |
| 0.1162 | 3.6746 | 2484 | 1.3117 | 0.6469 | 1.3117 | 1.1453 |
| 0.1162 | 3.6775 | 2486 | 1.2602 | 0.6969 | 1.2602 | 1.1226 |
| 0.1162 | 3.6805 | 2488 | 1.1442 | 0.7188 | 1.1442 | 1.0697 |
| 0.1162 | 3.6834 | 2490 | 1.0308 | 0.6839 | 1.0308 | 1.0153 |
| 0.1162 | 3.6864 | 2492 | 0.9664 | 0.6839 | 0.9664 | 0.9831 |
| 0.1162 | 3.6893 | 2494 | 0.9203 | 0.6617 | 0.9203 | 0.9593 |
| 0.1162 | 3.6923 | 2496 | 0.9104 | 0.6617 | 0.9104 | 0.9542 |
| 0.1162 | 3.6953 | 2498 | 0.9569 | 0.7188 | 0.9569 | 0.9782 |
| 0.0885 | 3.6982 | 2500 | 1.0330 | 0.7188 | 1.0330 | 1.0164 |
| 0.0885 | 3.7012 | 2502 | 1.1183 | 0.6680 | 1.1183 | 1.0575 |
| 0.0885 | 3.7041 | 2504 | 1.1880 | 0.6680 | 1.1880 | 1.0899 |
| 0.0885 | 3.7071 | 2506 | 1.1605 | 0.6680 | 1.1605 | 1.0773 |
| 0.0885 | 3.7101 | 2508 | 1.0374 | 0.6616 | 1.0374 | 1.0185 |
| 0.0885 | 3.7130 | 2510 | 0.8719 | 0.8015 | 0.8719 | 0.9338 |
| 0.0885 | 3.7160 | 2512 | 0.7399 | 0.7879 | 0.7399 | 0.8602 |
| 0.0885 | 3.7189 | 2514 | 0.6869 | 0.8123 | 0.6869 | 0.8288 |
| 0.0885 | 3.7219 | 2516 | 0.6908 | 0.8123 | 0.6908 | 0.8311 |
| 0.0885 | 3.7249 | 2518 | 0.7420 | 0.8123 | 0.7420 | 0.8614 |
| 0.0885 | 3.7278 | 2520 | 0.8474 | 0.8283 | 0.8474 | 0.9206 |
| 0.0885 | 3.7308 | 2522 | 0.9556 | 0.7417 | 0.9556 | 0.9776 |
| 0.0885 | 3.7337 | 2524 | 1.0459 | 0.7515 | 1.0459 | 1.0227 |
| 0.0885 | 3.7367 | 2526 | 1.1312 | 0.6969 | 1.1312 | 1.0636 |
| 0.0885 | 3.7396 | 2528 | 1.1158 | 0.6969 | 1.1158 | 1.0563 |
| 0.0885 | 3.7426 | 2530 | 1.0474 | 0.7515 | 1.0474 | 1.0234 |
| 0.0885 | 3.7456 | 2532 | 0.9789 | 0.7515 | 0.9789 | 0.9894 |
| 0.0885 | 3.7485 | 2534 | 0.8634 | 0.7815 | 0.8634 | 0.9292 |
| 0.0885 | 3.7515 | 2536 | 0.8160 | 0.8215 | 0.8160 | 0.9033 |
| 0.0885 | 3.7544 | 2538 | 0.8419 | 0.7815 | 0.8419 | 0.9176 |
| 0.0885 | 3.7574 | 2540 | 0.8892 | 0.7815 | 0.8892 | 0.9430 |
| 0.0885 | 3.7604 | 2542 | 0.8525 | 0.7815 | 0.8525 | 0.9233 |
| 0.0885 | 3.7633 | 2544 | 0.7840 | 0.8232 | 0.7840 | 0.8855 |
| 0.0885 | 3.7663 | 2546 | 0.7160 | 0.8232 | 0.7160 | 0.8462 |
| 0.0885 | 3.7692 | 2548 | 0.6766 | 0.8232 | 0.6766 | 0.8226 |
| 0.0885 | 3.7722 | 2550 | 0.7092 | 0.8232 | 0.7092 | 0.8421 |
| 0.0885 | 3.7751 | 2552 | 0.7994 | 0.8232 | 0.7994 | 0.8941 |
| 0.0885 | 3.7781 | 2554 | 0.9509 | 0.8283 | 0.9509 | 0.9752 |
| 0.0885 | 3.7811 | 2556 | 1.0959 | 0.6969 | 1.0959 | 1.0468 |
| 0.0885 | 3.7840 | 2558 | 1.1449 | 0.6969 | 1.1449 | 1.0700 |
| 0.0885 | 3.7870 | 2560 | 1.0848 | 0.6969 | 1.0848 | 1.0415 |
| 0.0885 | 3.7899 | 2562 | 0.9958 | 0.7626 | 0.9958 | 0.9979 |
| 0.0885 | 3.7929 | 2564 | 0.9331 | 0.8215 | 0.9331 | 0.9660 |
| 0.0885 | 3.7959 | 2566 | 0.9308 | 0.8215 | 0.9308 | 0.9648 |
| 0.0885 | 3.7988 | 2568 | 0.9152 | 0.8094 | 0.9152 | 0.9567 |
| 0.0885 | 3.8018 | 2570 | 0.8826 | 0.8094 | 0.8826 | 0.9395 |
| 0.0885 | 3.8047 | 2572 | 0.8244 | 0.7983 | 0.8244 | 0.9080 |
| 0.0885 | 3.8077 | 2574 | 0.7960 | 0.7983 | 0.7960 | 0.8922 |
| 0.0885 | 3.8107 | 2576 | 0.7696 | 0.7983 | 0.7696 | 0.8773 |
| 0.0885 | 3.8136 | 2578 | 0.7704 | 0.7983 | 0.7704 | 0.8777 |
| 0.0885 | 3.8166 | 2580 | 0.8027 | 0.8094 | 0.8027 | 0.8960 |
| 0.0885 | 3.8195 | 2582 | 0.8710 | 0.8094 | 0.8710 | 0.9333 |
| 0.0885 | 3.8225 | 2584 | 1.0065 | 0.8215 | 1.0065 | 1.0032 |
| 0.0885 | 3.8254 | 2586 | 1.1050 | 0.7840 | 1.1050 | 1.0512 |
| 0.0885 | 3.8284 | 2588 | 1.1096 | 0.7840 | 1.1096 | 1.0534 |
| 0.0885 | 3.8314 | 2590 | 1.0408 | 0.7626 | 1.0408 | 1.0202 |
| 0.0885 | 3.8343 | 2592 | 1.0348 | 0.7626 | 1.0348 | 1.0173 |
| 0.0885 | 3.8373 | 2594 | 0.9969 | 0.7296 | 0.9969 | 0.9984 |
| 0.0885 | 3.8402 | 2596 | 0.9563 | 0.7296 | 0.9563 | 0.9779 |
| 0.0885 | 3.8432 | 2598 | 0.8955 | 0.7296 | 0.8955 | 0.9463 |
| 0.0885 | 3.8462 | 2600 | 0.8564 | 0.7296 | 0.8564 | 0.9254 |
| 0.0885 | 3.8491 | 2602 | 0.8386 | 0.7631 | 0.8386 | 0.9157 |
| 0.0885 | 3.8521 | 2604 | 0.8248 | 0.7631 | 0.8248 | 0.9082 |
| 0.0885 | 3.8550 | 2606 | 0.8204 | 0.8215 | 0.8204 | 0.9058 |
| 0.0885 | 3.8580 | 2608 | 0.8078 | 0.8094 | 0.8078 | 0.8988 |
| 0.0885 | 3.8609 | 2610 | 0.8545 | 0.8283 | 0.8545 | 0.9244 |
| 0.0885 | 3.8639 | 2612 | 0.8514 | 0.8164 | 0.8514 | 0.9227 |
| 0.0885 | 3.8669 | 2614 | 0.8611 | 0.8164 | 0.8611 | 0.9280 |
| 0.0885 | 3.8698 | 2616 | 0.8278 | 0.8164 | 0.8278 | 0.9098 |
| 0.0885 | 3.8728 | 2618 | 0.7677 | 0.8164 | 0.7677 | 0.8762 |
| 0.0885 | 3.8757 | 2620 | 0.6816 | 0.7906 | 0.6816 | 0.8256 |
| 0.0885 | 3.8787 | 2622 | 0.6433 | 0.7906 | 0.6433 | 0.8021 |
| 0.0885 | 3.8817 | 2624 | 0.6416 | 0.7906 | 0.6416 | 0.8010 |
| 0.0885 | 3.8846 | 2626 | 0.6635 | 0.8164 | 0.6635 | 0.8146 |
| 0.0885 | 3.8876 | 2628 | 0.6655 | 0.8164 | 0.6655 | 0.8158 |
| 0.0885 | 3.8905 | 2630 | 0.6876 | 0.8164 | 0.6876 | 0.8292 |
| 0.0885 | 3.8935 | 2632 | 0.7056 | 0.8054 | 0.7056 | 0.8400 |
| 0.0885 | 3.8964 | 2634 | 0.7274 | 0.8164 | 0.7274 | 0.8529 |
| 0.0885 | 3.8994 | 2636 | 0.7644 | 0.8164 | 0.7644 | 0.8743 |
| 0.0885 | 3.9024 | 2638 | 0.7564 | 0.8164 | 0.7564 | 0.8697 |
| 0.0885 | 3.9053 | 2640 | 0.7532 | 0.8164 | 0.7532 | 0.8679 |
| 0.0885 | 3.9083 | 2642 | 0.7830 | 0.7823 | 0.7830 | 0.8849 |
| 0.0885 | 3.9112 | 2644 | 0.8489 | 0.7519 | 0.8489 | 0.9214 |
| 0.0885 | 3.9142 | 2646 | 0.8663 | 0.7840 | 0.8663 | 0.9307 |
| 0.0885 | 3.9172 | 2648 | 0.8485 | 0.7519 | 0.8485 | 0.9212 |
| 0.0885 | 3.9201 | 2650 | 0.8368 | 0.7933 | 0.8368 | 0.9148 |
| 0.0885 | 3.9231 | 2652 | 0.8475 | 0.8215 | 0.8475 | 0.9206 |
| 0.0885 | 3.9260 | 2654 | 0.8376 | 0.7521 | 0.8376 | 0.9152 |
| 0.0885 | 3.9290 | 2656 | 0.8794 | 0.7612 | 0.8794 | 0.9378 |
| 0.0885 | 3.9320 | 2658 | 0.8860 | 0.7521 | 0.8860 | 0.9413 |
| 0.0885 | 3.9349 | 2660 | 0.8856 | 0.7521 | 0.8856 | 0.9410 |
| 0.0885 | 3.9379 | 2662 | 0.8285 | 0.7983 | 0.8285 | 0.9102 |
| 0.0885 | 3.9408 | 2664 | 0.7708 | 0.7879 | 0.7708 | 0.8780 |
| 0.0885 | 3.9438 | 2666 | 0.7453 | 0.8123 | 0.7453 | 0.8633 |
| 0.0885 | 3.9467 | 2668 | 0.7620 | 0.7879 | 0.7620 | 0.8729 |
| 0.0885 | 3.9497 | 2670 | 0.8345 | 0.8094 | 0.8345 | 0.9135 |
| 0.0885 | 3.9527 | 2672 | 0.9145 | 0.8412 | 0.9145 | 0.9563 |
| 0.0885 | 3.9556 | 2674 | 0.9212 | 0.7840 | 0.9212 | 0.9598 |
| 0.0885 | 3.9586 | 2676 | 0.8813 | 0.8412 | 0.8813 | 0.9388 |
| 0.0885 | 3.9615 | 2678 | 0.8558 | 0.8412 | 0.8558 | 0.9251 |
| 0.0885 | 3.9645 | 2680 | 0.8419 | 0.8412 | 0.8419 | 0.9175 |
| 0.0885 | 3.9675 | 2682 | 0.8122 | 0.8283 | 0.8122 | 0.9012 |
| 0.0885 | 3.9704 | 2684 | 0.7451 | 0.8283 | 0.7451 | 0.8632 |
| 0.0885 | 3.9734 | 2686 | 0.7148 | 0.8164 | 0.7148 | 0.8455 |
| 0.0885 | 3.9763 | 2688 | 0.7121 | 0.8164 | 0.7121 | 0.8438 |
| 0.0885 | 3.9793 | 2690 | 0.6679 | 0.7801 | 0.6679 | 0.8173 |
| 0.0885 | 3.9822 | 2692 | 0.6607 | 0.7801 | 0.6607 | 0.8128 |
| 0.0885 | 3.9852 | 2694 | 0.6788 | 0.8164 | 0.6788 | 0.8239 |
| 0.0885 | 3.9882 | 2696 | 0.6549 | 0.8054 | 0.6549 | 0.8093 |
| 0.0885 | 3.9911 | 2698 | 0.6452 | 0.7801 | 0.6452 | 0.8032 |
| 0.0885 | 3.9941 | 2700 | 0.6499 | 0.8019 | 0.6499 | 0.8062 |
| 0.0885 | 3.9970 | 2702 | 0.6863 | 0.8019 | 0.6863 | 0.8284 |
| 0.0885 | 4.0000 | 2704 | 0.7516 | 0.8019 | 0.7516 | 0.8670 |
| 0.0885 | 4.0030 | 2706 | 0.8825 | 0.7898 | 0.8825 | 0.9394 |
| 0.0885 | 4.0059 | 2708 | 1.0210 | 0.7305 | 1.0210 | 1.0104 |
| 0.0885 | 4.0089 | 2710 | 1.0669 | 0.7211 | 1.0669 | 1.0329 |
| 0.0885 | 4.0118 | 2712 | 1.0195 | 0.7305 | 1.0195 | 1.0097 |
| 0.0885 | 4.0148 | 2714 | 0.8931 | 0.7614 | 0.8931 | 0.9451 |
| 0.0885 | 4.0178 | 2716 | 0.7839 | 0.8019 | 0.7839 | 0.8854 |
| 0.0885 | 4.0207 | 2718 | 0.7228 | 0.8019 | 0.7228 | 0.8502 |
| 0.0885 | 4.0237 | 2720 | 0.6723 | 0.8019 | 0.6723 | 0.8200 |
| 0.0885 | 4.0266 | 2722 | 0.6641 | 0.8019 | 0.6641 | 0.8149 |
| 0.0885 | 4.0296 | 2724 | 0.7220 | 0.8019 | 0.7220 | 0.8497 |
| 0.0885 | 4.0325 | 2726 | 0.7875 | 0.8283 | 0.7875 | 0.8874 |
| 0.0885 | 4.0355 | 2728 | 0.8028 | 0.8283 | 0.8028 | 0.8960 |
| 0.0885 | 4.0385 | 2730 | 0.7817 | 0.8019 | 0.7817 | 0.8841 |
| 0.0885 | 4.0414 | 2732 | 0.7906 | 0.8142 | 0.7906 | 0.8891 |
| 0.0885 | 4.0444 | 2734 | 0.7681 | 0.8142 | 0.7681 | 0.8764 |
| 0.0885 | 4.0473 | 2736 | 0.7831 | 0.7844 | 0.7831 | 0.8849 |
| 0.0885 | 4.0503 | 2738 | 0.8386 | 0.7844 | 0.8386 | 0.9157 |
| 0.0885 | 4.0533 | 2740 | 0.8623 | 0.8138 | 0.8623 | 0.9286 |
| 0.0885 | 4.0562 | 2742 | 0.8448 | 0.8138 | 0.8448 | 0.9191 |
| 0.0885 | 4.0592 | 2744 | 0.8607 | 0.7614 | 0.8607 | 0.9278 |
| 0.0885 | 4.0621 | 2746 | 0.9512 | 0.7614 | 0.9512 | 0.9753 |
| 0.0885 | 4.0651 | 2748 | 0.9939 | 0.7305 | 0.9939 | 0.9970 |
| 0.0885 | 4.0680 | 2750 | 1.0110 | 0.7305 | 1.0110 | 1.0055 |
| 0.0885 | 4.0710 | 2752 | 0.9541 | 0.7614 | 0.9541 | 0.9768 |
| 0.0885 | 4.0740 | 2754 | 0.8537 | 0.7614 | 0.8537 | 0.9240 |
| 0.0885 | 4.0769 | 2756 | 0.7991 | 0.8015 | 0.7991 | 0.8939 |
| 0.0885 | 4.0799 | 2758 | 0.7752 | 0.7902 | 0.7752 | 0.8805 |
| 0.0885 | 4.0828 | 2760 | 0.7956 | 0.7902 | 0.7956 | 0.8920 |
| 0.0885 | 4.0858 | 2762 | 0.8095 | 0.7902 | 0.8095 | 0.8997 |
| 0.0885 | 4.0888 | 2764 | 0.8569 | 0.8015 | 0.8569 | 0.9257 |
| 0.0885 | 4.0917 | 2766 | 0.9113 | 0.8015 | 0.9113 | 0.9546 |
| 0.0885 | 4.0947 | 2768 | 0.9176 | 0.8015 | 0.9176 | 0.9579 |
| 0.0885 | 4.0976 | 2770 | 0.8903 | 0.8015 | 0.8903 | 0.9435 |
| 0.0885 | 4.1006 | 2772 | 0.8267 | 0.8015 | 0.8267 | 0.9092 |
| 0.0885 | 4.1036 | 2774 | 0.7457 | 0.7906 | 0.7457 | 0.8636 |
| 0.0885 | 4.1065 | 2776 | 0.7138 | 0.7906 | 0.7138 | 0.8449 |
| 0.0885 | 4.1095 | 2778 | 0.7193 | 0.8164 | 0.7193 | 0.8481 |
| 0.0885 | 4.1124 | 2780 | 0.7645 | 0.8164 | 0.7645 | 0.8744 |
| 0.0885 | 4.1154 | 2782 | 0.7870 | 0.8283 | 0.7870 | 0.8871 |
| 0.0885 | 4.1183 | 2784 | 0.7842 | 0.8283 | 0.7842 | 0.8855 |
| 0.0885 | 4.1213 | 2786 | 0.8253 | 0.8412 | 0.8253 | 0.9084 |
| 0.0885 | 4.1243 | 2788 | 0.8502 | 0.8412 | 0.8502 | 0.9221 |
| 0.0885 | 4.1272 | 2790 | 0.8469 | 0.8412 | 0.8469 | 0.9203 |
| 0.0885 | 4.1302 | 2792 | 0.8051 | 0.8412 | 0.8051 | 0.8973 |
| 0.0885 | 4.1331 | 2794 | 0.7345 | 0.8283 | 0.7345 | 0.8570 |
| 0.0885 | 4.1361 | 2796 | 0.7118 | 0.7879 | 0.7118 | 0.8437 |
| 0.0885 | 4.1391 | 2798 | 0.7189 | 0.7879 | 0.7189 | 0.8479 |
| 0.0885 | 4.1420 | 2800 | 0.7179 | 0.7879 | 0.7179 | 0.8473 |
| 0.0885 | 4.1450 | 2802 | 0.7469 | 0.8054 | 0.7469 | 0.8642 |
| 0.0885 | 4.1479 | 2804 | 0.7927 | 0.7696 | 0.7927 | 0.8903 |
| 0.0885 | 4.1509 | 2806 | 0.8330 | 0.7696 | 0.8330 | 0.9127 |
| 0.0885 | 4.1538 | 2808 | 0.8098 | 0.7696 | 0.8098 | 0.8999 |
| 0.0885 | 4.1568 | 2810 | 0.7713 | 0.8164 | 0.7713 | 0.8782 |
| 0.0885 | 4.1598 | 2812 | 0.7507 | 0.8164 | 0.7507 | 0.8664 |
| 0.0885 | 4.1627 | 2814 | 0.7337 | 0.8164 | 0.7337 | 0.8566 |
| 0.0885 | 4.1657 | 2816 | 0.7705 | 0.8164 | 0.7705 | 0.8778 |
| 0.0885 | 4.1686 | 2818 | 0.7901 | 0.8283 | 0.7901 | 0.8889 |
| 0.0885 | 4.1716 | 2820 | 0.7466 | 0.8019 | 0.7466 | 0.8640 |
| 0.0885 | 4.1746 | 2822 | 0.7246 | 0.8019 | 0.7246 | 0.8513 |
| 0.0885 | 4.1775 | 2824 | 0.7383 | 0.8019 | 0.7383 | 0.8593 |
| 0.0885 | 4.1805 | 2826 | 0.8025 | 0.8019 | 0.8025 | 0.8959 |
| 0.0885 | 4.1834 | 2828 | 0.8216 | 0.8283 | 0.8216 | 0.9064 |
| 0.0885 | 4.1864 | 2830 | 0.8619 | 0.8283 | 0.8619 | 0.9284 |
| 0.0885 | 4.1893 | 2832 | 0.9254 | 0.8015 | 0.9254 | 0.9620 |
| 0.0885 | 4.1923 | 2834 | 0.9146 | 0.8015 | 0.9146 | 0.9563 |
| 0.0885 | 4.1953 | 2836 | 0.8704 | 0.8283 | 0.8704 | 0.9329 |
| 0.0885 | 4.1982 | 2838 | 0.8724 | 0.8283 | 0.8724 | 0.9340 |
| 0.0885 | 4.2012 | 2840 | 0.8967 | 0.8015 | 0.8967 | 0.9470 |
| 0.0885 | 4.2041 | 2842 | 0.9445 | 0.8015 | 0.9445 | 0.9719 |
| 0.0885 | 4.2071 | 2844 | 1.0046 | 0.8015 | 1.0046 | 1.0023 |
| 0.0885 | 4.2101 | 2846 | 0.9801 | 0.8015 | 0.9801 | 0.9900 |
| 0.0885 | 4.2130 | 2848 | 0.9507 | 0.8015 | 0.9507 | 0.9750 |
| 0.0885 | 4.2160 | 2850 | 0.9987 | 0.8015 | 0.9987 | 0.9994 |
| 0.0885 | 4.2189 | 2852 | 1.0870 | 0.7216 | 1.0870 | 1.0426 |
| 0.0885 | 4.2219 | 2854 | 1.1142 | 0.7014 | 1.1142 | 1.0556 |
| 0.0885 | 4.2249 | 2856 | 1.0806 | 0.7014 | 1.0806 | 1.0395 |
| 0.0885 | 4.2278 | 2858 | 1.0324 | 0.8015 | 1.0324 | 1.0161 |
| 0.0885 | 4.2308 | 2860 | 1.0053 | 0.8015 | 1.0053 | 1.0027 |
| 0.0885 | 4.2337 | 2862 | 0.9924 | 0.8015 | 0.9924 | 0.9962 |
| 0.0885 | 4.2367 | 2864 | 0.9589 | 0.8015 | 0.9589 | 0.9792 |
| 0.0885 | 4.2396 | 2866 | 0.9491 | 0.8015 | 0.9491 | 0.9742 |
| 0.0885 | 4.2426 | 2868 | 0.9248 | 0.8015 | 0.9248 | 0.9617 |
| 0.0885 | 4.2456 | 2870 | 0.9410 | 0.8015 | 0.9410 | 0.9701 |
| 0.0885 | 4.2485 | 2872 | 0.9798 | 0.8015 | 0.9798 | 0.9898 |
| 0.0885 | 4.2515 | 2874 | 1.0081 | 0.8015 | 1.0081 | 1.0040 |
| 0.0885 | 4.2544 | 2876 | 0.9915 | 0.8015 | 0.9915 | 0.9957 |
| 0.0885 | 4.2574 | 2878 | 0.9302 | 0.8283 | 0.9302 | 0.9644 |
| 0.0885 | 4.2604 | 2880 | 0.9179 | 0.8283 | 0.9179 | 0.9581 |
| 0.0885 | 4.2633 | 2882 | 0.9137 | 0.8283 | 0.9137 | 0.9559 |
| 0.0885 | 4.2663 | 2884 | 0.8494 | 0.8094 | 0.8494 | 0.9216 |
| 0.0885 | 4.2692 | 2886 | 0.8391 | 0.8283 | 0.8391 | 0.9160 |
| 0.0885 | 4.2722 | 2888 | 0.8235 | 0.8283 | 0.8235 | 0.9075 |
| 0.0885 | 4.2751 | 2890 | 0.7950 | 0.8283 | 0.7950 | 0.8916 |
| 0.0885 | 4.2781 | 2892 | 0.7831 | 0.8283 | 0.7831 | 0.8850 |
| 0.0885 | 4.2811 | 2894 | 0.8027 | 0.8283 | 0.8027 | 0.8959 |
| 0.0885 | 4.2840 | 2896 | 0.7970 | 0.8164 | 0.7970 | 0.8928 |
| 0.0885 | 4.2870 | 2898 | 0.8228 | 0.8283 | 0.8228 | 0.9071 |
| 0.0885 | 4.2899 | 2900 | 0.8859 | 0.7793 | 0.8859 | 0.9412 |
| 0.0885 | 4.2929 | 2902 | 0.9710 | 0.7793 | 0.9710 | 0.9854 |
| 0.0885 | 4.2959 | 2904 | 0.9912 | 0.7793 | 0.9912 | 0.9956 |
| 0.0885 | 4.2988 | 2906 | 0.9454 | 0.8283 | 0.9454 | 0.9723 |
| 0.0885 | 4.3018 | 2908 | 0.8872 | 0.8283 | 0.8872 | 0.9419 |
| 0.0885 | 4.3047 | 2910 | 0.9191 | 0.8283 | 0.9191 | 0.9587 |
| 0.0885 | 4.3077 | 2912 | 0.9335 | 0.8015 | 0.9335 | 0.9662 |
| 0.0885 | 4.3107 | 2914 | 0.9327 | 0.7729 | 0.9327 | 0.9658 |
| 0.0885 | 4.3136 | 2916 | 0.8893 | 0.7729 | 0.8893 | 0.9430 |
| 0.0885 | 4.3166 | 2918 | 0.8827 | 0.7729 | 0.8827 | 0.9395 |
| 0.0885 | 4.3195 | 2920 | 0.8781 | 0.8015 | 0.8781 | 0.9371 |
| 0.0885 | 4.3225 | 2922 | 0.8371 | 0.7729 | 0.8371 | 0.9150 |
| 0.0885 | 4.3254 | 2924 | 0.8277 | 0.8019 | 0.8277 | 0.9098 |
| 0.0885 | 4.3284 | 2926 | 0.8331 | 0.8283 | 0.8331 | 0.9127 |
| 0.0885 | 4.3314 | 2928 | 0.8742 | 0.8015 | 0.8742 | 0.9350 |
| 0.0885 | 4.3343 | 2930 | 0.8513 | 0.8283 | 0.8513 | 0.9227 |
| 0.0885 | 4.3373 | 2932 | 0.8078 | 0.8019 | 0.8078 | 0.8988 |
| 0.0885 | 4.3402 | 2934 | 0.7793 | 0.8019 | 0.7793 | 0.8828 |
| 0.0885 | 4.3432 | 2936 | 0.8204 | 0.8019 | 0.8204 | 0.9057 |
| 0.0885 | 4.3462 | 2938 | 0.8656 | 0.8019 | 0.8656 | 0.9304 |
| 0.0885 | 4.3491 | 2940 | 0.8505 | 0.8019 | 0.8505 | 0.9222 |
| 0.0885 | 4.3521 | 2942 | 0.8878 | 0.7793 | 0.8878 | 0.9422 |
| 0.0885 | 4.3550 | 2944 | 0.8832 | 0.7793 | 0.8832 | 0.9398 |
| 0.0885 | 4.3580 | 2946 | 0.8370 | 0.8283 | 0.8370 | 0.9149 |
| 0.0885 | 4.3609 | 2948 | 0.7737 | 0.7823 | 0.7737 | 0.8796 |
| 0.0885 | 4.3639 | 2950 | 0.7540 | 0.7823 | 0.7540 | 0.8684 |
| 0.0885 | 4.3669 | 2952 | 0.7774 | 0.8019 | 0.7774 | 0.8817 |
| 0.0885 | 4.3698 | 2954 | 0.8089 | 0.8283 | 0.8089 | 0.8994 |
| 0.0885 | 4.3728 | 2956 | 0.8532 | 0.8283 | 0.8532 | 0.9237 |
| 0.0885 | 4.3757 | 2958 | 0.8780 | 0.8283 | 0.8780 | 0.9370 |
| 0.0885 | 4.3787 | 2960 | 0.8760 | 0.8283 | 0.8760 | 0.9359 |
| 0.0885 | 4.3817 | 2962 | 0.8323 | 0.8019 | 0.8323 | 0.9123 |
| 0.0885 | 4.3846 | 2964 | 0.7602 | 0.7823 | 0.7602 | 0.8719 |
| 0.0885 | 4.3876 | 2966 | 0.7340 | 0.7823 | 0.7340 | 0.8567 |
| 0.0885 | 4.3905 | 2968 | 0.7643 | 0.8019 | 0.7643 | 0.8743 |
| 0.0885 | 4.3935 | 2970 | 0.8237 | 0.8283 | 0.8237 | 0.9076 |
| 0.0885 | 4.3964 | 2972 | 0.8054 | 0.8283 | 0.8054 | 0.8975 |
| 0.0885 | 4.3994 | 2974 | 0.7750 | 0.8283 | 0.7750 | 0.8803 |
| 0.0885 | 4.4024 | 2976 | 0.6878 | 0.8232 | 0.6878 | 0.8294 |
| 0.0885 | 4.4053 | 2978 | 0.6516 | 0.8232 | 0.6516 | 0.8072 |
| 0.0885 | 4.4083 | 2980 | 0.6677 | 0.8232 | 0.6677 | 0.8172 |
| 0.0885 | 4.4112 | 2982 | 0.7252 | 0.8283 | 0.7252 | 0.8516 |
| 0.0885 | 4.4142 | 2984 | 0.7828 | 0.8283 | 0.7828 | 0.8847 |
| 0.0885 | 4.4172 | 2986 | 0.7965 | 0.8283 | 0.7965 | 0.8925 |
| 0.0885 | 4.4201 | 2988 | 0.8017 | 0.8283 | 0.8017 | 0.8954 |
| 0.0885 | 4.4231 | 2990 | 0.7808 | 0.7823 | 0.7808 | 0.8836 |
| 0.0885 | 4.4260 | 2992 | 0.7643 | 0.7823 | 0.7643 | 0.8743 |
| 0.0885 | 4.4290 | 2994 | 0.7518 | 0.7823 | 0.7518 | 0.8670 |
| 0.0885 | 4.4320 | 2996 | 0.7777 | 0.8283 | 0.7777 | 0.8819 |
| 0.0885 | 4.4349 | 2998 | 0.7409 | 0.8094 | 0.7409 | 0.8608 |
| 0.0797 | 4.4379 | 3000 | 0.7457 | 0.8094 | 0.7457 | 0.8636 |
| 0.0797 | 4.4408 | 3002 | 0.8192 | 0.7793 | 0.8192 | 0.9051 |
| 0.0797 | 4.4438 | 3004 | 0.8634 | 0.7793 | 0.8634 | 0.9292 |
| 0.0797 | 4.4467 | 3006 | 0.8528 | 0.7793 | 0.8528 | 0.9235 |
| 0.0797 | 4.4497 | 3008 | 0.7951 | 0.7612 | 0.7951 | 0.8917 |
| 0.0797 | 4.4527 | 3010 | 0.7565 | 0.7612 | 0.7565 | 0.8698 |
| 0.0797 | 4.4556 | 3012 | 0.7938 | 0.7612 | 0.7938 | 0.8910 |
| 0.0797 | 4.4586 | 3014 | 0.8397 | 0.7612 | 0.8397 | 0.9164 |
| 0.0797 | 4.4615 | 3016 | 0.8769 | 0.7612 | 0.8769 | 0.9364 |
| 0.0797 | 4.4645 | 3018 | 0.8708 | 0.7612 | 0.8708 | 0.9331 |
| 0.0797 | 4.4675 | 3020 | 0.8035 | 0.8094 | 0.8035 | 0.8964 |
| 0.0797 | 4.4704 | 3022 | 0.7821 | 0.8094 | 0.7821 | 0.8844 |
| 0.0797 | 4.4734 | 3024 | 0.7968 | 0.8094 | 0.7968 | 0.8926 |
| 0.0797 | 4.4763 | 3026 | 0.8147 | 0.8094 | 0.8147 | 0.9026 |
| 0.0797 | 4.4793 | 3028 | 0.8756 | 0.8283 | 0.8756 | 0.9357 |
| 0.0797 | 4.4822 | 3030 | 0.9157 | 0.8412 | 0.9157 | 0.9569 |
| 0.0797 | 4.4852 | 3032 | 0.9027 | 0.8412 | 0.9027 | 0.9501 |
| 0.0797 | 4.4882 | 3034 | 0.8795 | 0.8412 | 0.8795 | 0.9378 |
| 0.0797 | 4.4911 | 3036 | 0.8586 | 0.8412 | 0.8586 | 0.9266 |
| 0.0797 | 4.4941 | 3038 | 0.7932 | 0.8215 | 0.7932 | 0.8906 |
| 0.0797 | 4.4970 | 3040 | 0.7845 | 0.8215 | 0.7845 | 0.8857 |
| 0.0797 | 4.5000 | 3042 | 0.8183 | 0.8215 | 0.8183 | 0.9046 |
| 0.0797 | 4.5030 | 3044 | 0.8803 | 0.8412 | 0.8803 | 0.9383 |
| 0.0797 | 4.5059 | 3046 | 0.9282 | 0.7898 | 0.9282 | 0.9634 |
| 0.0797 | 4.5089 | 3048 | 0.9113 | 0.7898 | 0.9113 | 0.9546 |
| 0.0797 | 4.5118 | 3050 | 0.8748 | 0.8215 | 0.8748 | 0.9353 |
| 0.0797 | 4.5148 | 3052 | 0.7787 | 0.8094 | 0.7787 | 0.8824 |
| 0.0797 | 4.5178 | 3054 | 0.7232 | 0.7983 | 0.7232 | 0.8504 |
| 0.0797 | 4.5207 | 3056 | 0.7256 | 0.7983 | 0.7256 | 0.8518 |
| 0.0797 | 4.5237 | 3058 | 0.7828 | 0.8094 | 0.7828 | 0.8848 |
| 0.0797 | 4.5266 | 3060 | 0.8348 | 0.8215 | 0.8348 | 0.9137 |
| 0.0797 | 4.5296 | 3062 | 0.9156 | 0.8412 | 0.9156 | 0.9568 |
| 0.0797 | 4.5325 | 3064 | 0.9578 | 0.8412 | 0.9578 | 0.9787 |
| 0.0797 | 4.5355 | 3066 | 0.9692 | 0.8412 | 0.9692 | 0.9845 |
| 0.0797 | 4.5385 | 3068 | 0.8994 | 0.8412 | 0.8994 | 0.9484 |
| 0.0797 | 4.5414 | 3070 | 0.8304 | 0.8215 | 0.8304 | 0.9112 |
| 0.0797 | 4.5444 | 3072 | 0.8102 | 0.8094 | 0.8102 | 0.9001 |
| 0.0797 | 4.5473 | 3074 | 0.8101 | 0.8094 | 0.8101 | 0.9000 |
| 0.0797 | 4.5503 | 3076 | 0.8191 | 0.8094 | 0.8191 | 0.9050 |
| 0.0797 | 4.5533 | 3078 | 0.8667 | 0.7793 | 0.8667 | 0.9310 |
| 0.0797 | 4.5562 | 3080 | 0.9117 | 0.7793 | 0.9117 | 0.9548 |
| 0.0797 | 4.5592 | 3082 | 0.9315 | 0.7793 | 0.9315 | 0.9651 |
| 0.0797 | 4.5621 | 3084 | 0.9769 | 0.7793 | 0.9769 | 0.9884 |
| 0.0797 | 4.5651 | 3086 | 1.0543 | 0.7322 | 1.0543 | 1.0268 |
| 0.0797 | 4.5680 | 3088 | 1.0290 | 0.7322 | 1.0290 | 1.0144 |
| 0.0797 | 4.5710 | 3090 | 0.9366 | 0.8412 | 0.9366 | 0.9678 |
| 0.0797 | 4.5740 | 3092 | 0.8106 | 0.8283 | 0.8106 | 0.9003 |
| 0.0797 | 4.5769 | 3094 | 0.7040 | 0.7983 | 0.7040 | 0.8391 |
| 0.0797 | 4.5799 | 3096 | 0.6520 | 0.7620 | 0.6520 | 0.8075 |
| 0.0797 | 4.5828 | 3098 | 0.6567 | 0.7879 | 0.6567 | 0.8104 |
| 0.0797 | 4.5858 | 3100 | 0.7111 | 0.8054 | 0.7111 | 0.8433 |
| 0.0797 | 4.5888 | 3102 | 0.8145 | 0.7696 | 0.8145 | 0.9025 |
| 0.0797 | 4.5917 | 3104 | 0.9400 | 0.7793 | 0.9400 | 0.9695 |
| 0.0797 | 4.5947 | 3106 | 0.9893 | 0.7793 | 0.9893 | 0.9946 |
| 0.0797 | 4.5976 | 3108 | 0.9428 | 0.7793 | 0.9428 | 0.9710 |
| 0.0797 | 4.6006 | 3110 | 0.8684 | 0.7606 | 0.8684 | 0.9319 |
| 0.0797 | 4.6036 | 3112 | 0.7977 | 0.8054 | 0.7977 | 0.8931 |
| 0.0797 | 4.6065 | 3114 | 0.7356 | 0.8054 | 0.7356 | 0.8577 |
| 0.0797 | 4.6095 | 3116 | 0.6802 | 0.8054 | 0.6802 | 0.8248 |
| 0.0797 | 4.6124 | 3118 | 0.6790 | 0.7801 | 0.6790 | 0.8240 |
| 0.0797 | 4.6154 | 3120 | 0.7298 | 0.8019 | 0.7298 | 0.8543 |
| 0.0797 | 4.6183 | 3122 | 0.8334 | 0.8283 | 0.8334 | 0.9129 |
| 0.0797 | 4.6213 | 3124 | 0.9604 | 0.7515 | 0.9604 | 0.9800 |
| 0.0797 | 4.6243 | 3126 | 1.0417 | 0.7515 | 1.0417 | 1.0206 |
| 0.0797 | 4.6272 | 3128 | 1.0447 | 0.7515 | 1.0447 | 1.0221 |
| 0.0797 | 4.6302 | 3130 | 0.9810 | 0.7515 | 0.9810 | 0.9904 |
| 0.0797 | 4.6331 | 3132 | 0.8789 | 0.8283 | 0.8789 | 0.9375 |
| 0.0797 | 4.6361 | 3134 | 0.8384 | 0.8094 | 0.8384 | 0.9156 |
| 0.0797 | 4.6391 | 3136 | 0.8069 | 0.8094 | 0.8069 | 0.8983 |
| 0.0797 | 4.6420 | 3138 | 0.8077 | 0.8094 | 0.8077 | 0.8987 |
| 0.0797 | 4.6450 | 3140 | 0.8101 | 0.8094 | 0.8101 | 0.9001 |
| 0.0797 | 4.6479 | 3142 | 0.8283 | 0.8094 | 0.8283 | 0.9101 |
| 0.0797 | 4.6509 | 3144 | 0.8437 | 0.8094 | 0.8437 | 0.9185 |
| 0.0797 | 4.6538 | 3146 | 0.8813 | 0.7819 | 0.8813 | 0.9388 |
| 0.0797 | 4.6568 | 3148 | 0.9299 | 0.8015 | 0.9299 | 0.9643 |
| 0.0797 | 4.6598 | 3150 | 1.0162 | 0.7216 | 1.0162 | 1.0081 |
| 0.0797 | 4.6627 | 3152 | 1.0450 | 0.7216 | 1.0450 | 1.0222 |
| 0.0797 | 4.6657 | 3154 | 1.0453 | 0.7216 | 1.0453 | 1.0224 |
| 0.0797 | 4.6686 | 3156 | 1.0148 | 0.7216 | 1.0148 | 1.0074 |
| 0.0797 | 4.6716 | 3158 | 0.9320 | 0.8015 | 0.9320 | 0.9654 |
| 0.0797 | 4.6746 | 3160 | 0.9025 | 0.8015 | 0.9025 | 0.9500 |
| 0.0797 | 4.6775 | 3162 | 0.9225 | 0.8015 | 0.9225 | 0.9605 |
| 0.0797 | 4.6805 | 3164 | 0.9644 | 0.8015 | 0.9644 | 0.9821 |
| 0.0797 | 4.6834 | 3166 | 0.9837 | 0.7515 | 0.9837 | 0.9918 |
| 0.0797 | 4.6864 | 3168 | 0.9621 | 0.7417 | 0.9621 | 0.9809 |
| 0.0797 | 4.6893 | 3170 | 0.9003 | 0.8015 | 0.9003 | 0.9489 |
| 0.0797 | 4.6923 | 3172 | 0.8695 | 0.8015 | 0.8695 | 0.9324 |
| 0.0797 | 4.6953 | 3174 | 0.8423 | 0.8015 | 0.8423 | 0.9178 |
| 0.0797 | 4.6982 | 3176 | 0.7826 | 0.8283 | 0.7826 | 0.8847 |
| 0.0797 | 4.7012 | 3178 | 0.7318 | 0.7879 | 0.7318 | 0.8555 |
| 0.0797 | 4.7041 | 3180 | 0.7153 | 0.7879 | 0.7153 | 0.8457 |
| 0.0797 | 4.7071 | 3182 | 0.7471 | 0.7983 | 0.7471 | 0.8643 |
| 0.0797 | 4.7101 | 3184 | 0.7697 | 0.8283 | 0.7697 | 0.8773 |
| 0.0797 | 4.7130 | 3186 | 0.8364 | 0.8015 | 0.8364 | 0.9146 |
| 0.0797 | 4.7160 | 3188 | 0.8426 | 0.8015 | 0.8426 | 0.9179 |
| 0.0797 | 4.7189 | 3190 | 0.7921 | 0.7819 | 0.7921 | 0.8900 |
| 0.0797 | 4.7219 | 3192 | 0.7342 | 0.8094 | 0.7342 | 0.8569 |
| 0.0797 | 4.7249 | 3194 | 0.6740 | 0.7879 | 0.6740 | 0.8209 |
| 0.0797 | 4.7278 | 3196 | 0.6784 | 0.7717 | 0.6784 | 0.8236 |
| 0.0797 | 4.7308 | 3198 | 0.7165 | 0.7983 | 0.7165 | 0.8464 |
| 0.0797 | 4.7337 | 3200 | 0.7533 | 0.8094 | 0.7533 | 0.8679 |
| 0.0797 | 4.7367 | 3202 | 0.8311 | 0.7819 | 0.8311 | 0.9117 |
| 0.0797 | 4.7396 | 3204 | 0.9254 | 0.8015 | 0.9254 | 0.9620 |
| 0.0797 | 4.7426 | 3206 | 0.9457 | 0.8138 | 0.9457 | 0.9725 |
| 0.0797 | 4.7456 | 3208 | 0.9025 | 0.8015 | 0.9025 | 0.9500 |
| 0.0797 | 4.7485 | 3210 | 0.8522 | 0.7819 | 0.8522 | 0.9231 |
| 0.0797 | 4.7515 | 3212 | 0.7925 | 0.7525 | 0.7925 | 0.8902 |
| 0.0797 | 4.7544 | 3214 | 0.7600 | 0.7823 | 0.7600 | 0.8718 |
| 0.0797 | 4.7574 | 3216 | 0.7581 | 0.7823 | 0.7581 | 0.8707 |
| 0.0797 | 4.7604 | 3218 | 0.7851 | 0.8094 | 0.7851 | 0.8861 |
| 0.0797 | 4.7633 | 3220 | 0.8404 | 0.8283 | 0.8404 | 0.9168 |
| 0.0797 | 4.7663 | 3222 | 0.9354 | 0.7417 | 0.9354 | 0.9672 |
| 0.0797 | 4.7692 | 3224 | 0.9965 | 0.7014 | 0.9965 | 0.9983 |
| 0.0797 | 4.7722 | 3226 | 0.9795 | 0.7014 | 0.9795 | 0.9897 |
| 0.0797 | 4.7751 | 3228 | 0.9025 | 0.8015 | 0.9025 | 0.9500 |
| 0.0797 | 4.7781 | 3230 | 0.8355 | 0.8283 | 0.8355 | 0.9141 |
| 0.0797 | 4.7811 | 3232 | 0.7573 | 0.8094 | 0.7573 | 0.8702 |
| 0.0797 | 4.7840 | 3234 | 0.7326 | 0.7983 | 0.7326 | 0.8559 |
| 0.0797 | 4.7870 | 3236 | 0.7588 | 0.8094 | 0.7588 | 0.8711 |
| 0.0797 | 4.7899 | 3238 | 0.8404 | 0.8283 | 0.8404 | 0.9167 |
| 0.0797 | 4.7929 | 3240 | 0.9569 | 0.8138 | 0.9569 | 0.9782 |
| 0.0797 | 4.7959 | 3242 | 1.0913 | 0.6680 | 1.0913 | 1.0446 |
| 0.0797 | 4.7988 | 3244 | 1.1308 | 0.6680 | 1.1308 | 1.0634 |
| 0.0797 | 4.8018 | 3246 | 1.0939 | 0.6680 | 1.0939 | 1.0459 |
| 0.0797 | 4.8047 | 3248 | 1.0477 | 0.6680 | 1.0477 | 1.0236 |
| 0.0797 | 4.8077 | 3250 | 1.0028 | 0.7515 | 1.0028 | 1.0014 |
| 0.0797 | 4.8107 | 3252 | 0.9255 | 0.8138 | 0.9255 | 0.9620 |
| 0.0797 | 4.8136 | 3254 | 0.8572 | 0.8015 | 0.8572 | 0.9259 |
| 0.0797 | 4.8166 | 3256 | 0.7884 | 0.8164 | 0.7884 | 0.8879 |
| 0.0797 | 4.8195 | 3258 | 0.7909 | 0.8164 | 0.7909 | 0.8893 |
| 0.0797 | 4.8225 | 3260 | 0.8267 | 0.7902 | 0.8267 | 0.9092 |
| 0.0797 | 4.8254 | 3262 | 0.8922 | 0.8138 | 0.8922 | 0.9446 |
| 0.0797 | 4.8284 | 3264 | 0.9809 | 0.7840 | 0.9809 | 0.9904 |
| 0.0797 | 4.8314 | 3266 | 1.0652 | 0.7840 | 1.0652 | 1.0321 |
| 0.0797 | 4.8343 | 3268 | 1.1022 | 0.7188 | 1.1022 | 1.0499 |
| 0.0797 | 4.8373 | 3270 | 1.0773 | 0.7188 | 1.0773 | 1.0379 |
| 0.0797 | 4.8402 | 3272 | 0.9842 | 0.7840 | 0.9842 | 0.9920 |
| 0.0797 | 4.8432 | 3274 | 0.8680 | 0.7519 | 0.8680 | 0.9317 |
| 0.0797 | 4.8462 | 3276 | 0.7949 | 0.7844 | 0.7949 | 0.8916 |
| 0.0797 | 4.8491 | 3278 | 0.7671 | 0.7729 | 0.7671 | 0.8759 |
| 0.0797 | 4.8521 | 3280 | 0.7536 | 0.7623 | 0.7536 | 0.8681 |
| 0.0797 | 4.8550 | 3282 | 0.7413 | 0.7623 | 0.7413 | 0.8610 |
| 0.0797 | 4.8580 | 3284 | 0.7838 | 0.7844 | 0.7838 | 0.8853 |
| 0.0797 | 4.8609 | 3286 | 0.8715 | 0.8138 | 0.8715 | 0.9336 |
| 0.0797 | 4.8639 | 3288 | 0.9430 | 0.7840 | 0.9430 | 0.9711 |
| 0.0797 | 4.8669 | 3290 | 0.9565 | 0.7840 | 0.9565 | 0.9780 |
| 0.0797 | 4.8698 | 3292 | 0.9069 | 0.7840 | 0.9069 | 0.9523 |
| 0.0797 | 4.8728 | 3294 | 0.8937 | 0.7840 | 0.8937 | 0.9454 |
| 0.0797 | 4.8757 | 3296 | 0.8644 | 0.7840 | 0.8644 | 0.9297 |
| 0.0797 | 4.8787 | 3298 | 0.7877 | 0.7819 | 0.7877 | 0.8875 |
| 0.0797 | 4.8817 | 3300 | 0.7161 | 0.8094 | 0.7161 | 0.8463 |
| 0.0797 | 4.8846 | 3302 | 0.6990 | 0.7823 | 0.6990 | 0.8361 |
| 0.0797 | 4.8876 | 3304 | 0.7158 | 0.7823 | 0.7158 | 0.8460 |
| 0.0797 | 4.8905 | 3306 | 0.7331 | 0.8094 | 0.7331 | 0.8562 |
| 0.0797 | 4.8935 | 3308 | 0.7884 | 0.8283 | 0.7884 | 0.8879 |
| 0.0797 | 4.8964 | 3310 | 0.8647 | 0.8015 | 0.8647 | 0.9299 |
| 0.0797 | 4.8994 | 3312 | 0.8988 | 0.7840 | 0.8988 | 0.9480 |
| 0.0797 | 4.9024 | 3314 | 0.8907 | 0.8138 | 0.8907 | 0.9438 |
| 0.0797 | 4.9053 | 3316 | 0.8433 | 0.8019 | 0.8433 | 0.9183 |
| 0.0797 | 4.9083 | 3318 | 0.7596 | 0.7823 | 0.7596 | 0.8716 |
| 0.0797 | 4.9112 | 3320 | 0.7000 | 0.7717 | 0.7000 | 0.8367 |
| 0.0797 | 4.9142 | 3322 | 0.6735 | 0.7717 | 0.6735 | 0.8207 |
| 0.0797 | 4.9172 | 3324 | 0.6993 | 0.7717 | 0.6993 | 0.8362 |
| 0.0797 | 4.9201 | 3326 | 0.7337 | 0.7717 | 0.7337 | 0.8565 |
| 0.0797 | 4.9231 | 3328 | 0.7695 | 0.8164 | 0.7695 | 0.8772 |
| 0.0797 | 4.9260 | 3330 | 0.7626 | 0.8164 | 0.7626 | 0.8733 |
| 0.0797 | 4.9290 | 3332 | 0.7835 | 0.8164 | 0.7835 | 0.8852 |
| 0.0797 | 4.9320 | 3334 | 0.8056 | 0.8164 | 0.8056 | 0.8975 |
| 0.0797 | 4.9349 | 3336 | 0.7890 | 0.8164 | 0.7890 | 0.8882 |
| 0.0797 | 4.9379 | 3338 | 0.7618 | 0.8164 | 0.7618 | 0.8728 |
| 0.0797 | 4.9408 | 3340 | 0.7694 | 0.8164 | 0.7694 | 0.8772 |
| 0.0797 | 4.9438 | 3342 | 0.8020 | 0.8164 | 0.8020 | 0.8956 |
| 0.0797 | 4.9467 | 3344 | 0.8327 | 0.8164 | 0.8327 | 0.9125 |
| 0.0797 | 4.9497 | 3346 | 0.8514 | 0.7413 | 0.8514 | 0.9227 |
| 0.0797 | 4.9527 | 3348 | 0.8785 | 0.7413 | 0.8785 | 0.9373 |
| 0.0797 | 4.9556 | 3350 | 0.9118 | 0.7519 | 0.9118 | 0.9549 |
| 0.0797 | 4.9586 | 3352 | 0.8972 | 0.7413 | 0.8972 | 0.9472 |
| 0.0797 | 4.9615 | 3354 | 0.8706 | 0.7724 | 0.8706 | 0.9330 |
| 0.0797 | 4.9645 | 3356 | 0.8644 | 0.7724 | 0.8644 | 0.9298 |
| 0.0797 | 4.9675 | 3358 | 0.8291 | 0.7618 | 0.8291 | 0.9105 |
| 0.0797 | 4.9704 | 3360 | 0.7928 | 0.8164 | 0.7928 | 0.8904 |
| 0.0797 | 4.9734 | 3362 | 0.7735 | 0.8164 | 0.7735 | 0.8795 |
| 0.0797 | 4.9763 | 3364 | 0.7463 | 0.8164 | 0.7463 | 0.8639 |
| 0.0797 | 4.9793 | 3366 | 0.7638 | 0.8164 | 0.7638 | 0.8740 |
| 0.0797 | 4.9822 | 3368 | 0.8052 | 0.8164 | 0.8052 | 0.8973 |
| 0.0797 | 4.9852 | 3370 | 0.8635 | 0.7902 | 0.8635 | 0.9292 |
| 0.0797 | 4.9882 | 3372 | 0.9202 | 0.7724 | 0.9202 | 0.9593 |
| 0.0797 | 4.9911 | 3374 | 0.9644 | 0.7840 | 0.9644 | 0.9820 |
| 0.0797 | 4.9941 | 3376 | 0.9751 | 0.7840 | 0.9751 | 0.9875 |
| 0.0797 | 4.9970 | 3378 | 0.9321 | 0.7840 | 0.9321 | 0.9655 |
| 0.0797 | 5.0 | 3380 | 0.8508 | 0.7623 | 0.8508 | 0.9224 |
| 0.0797 | 5.0030 | 3382 | 0.7783 | 0.7906 | 0.7783 | 0.8822 |
| 0.0797 | 5.0059 | 3384 | 0.7016 | 0.7906 | 0.7016 | 0.8376 |
| 0.0797 | 5.0089 | 3386 | 0.6338 | 0.7717 | 0.6338 | 0.7961 |
| 0.0797 | 5.0118 | 3388 | 0.6175 | 0.7620 | 0.6175 | 0.7858 |
| 0.0797 | 5.0148 | 3390 | 0.6416 | 0.7801 | 0.6416 | 0.8010 |
| 0.0797 | 5.0178 | 3392 | 0.7056 | 0.8164 | 0.7056 | 0.8400 |
| 0.0797 | 5.0207 | 3394 | 0.8227 | 0.8164 | 0.8227 | 0.9070 |
| 0.0797 | 5.0237 | 3396 | 0.9171 | 0.7426 | 0.9171 | 0.9577 |
| 0.0797 | 5.0266 | 3398 | 0.9738 | 0.6940 | 0.9738 | 0.9868 |
| 0.0797 | 5.0296 | 3400 | 0.9680 | 0.7515 | 0.9680 | 0.9839 |
| 0.0797 | 5.0325 | 3402 | 0.9004 | 0.7618 | 0.9004 | 0.9489 |
| 0.0797 | 5.0355 | 3404 | 0.8291 | 0.7316 | 0.8291 | 0.9106 |
| 0.0797 | 5.0385 | 3406 | 0.8184 | 0.7413 | 0.8184 | 0.9046 |
| 0.0797 | 5.0414 | 3408 | 0.8140 | 0.7413 | 0.8140 | 0.9022 |
| 0.0797 | 5.0444 | 3410 | 0.8128 | 0.7413 | 0.8128 | 0.9015 |
| 0.0797 | 5.0473 | 3412 | 0.8421 | 0.7724 | 0.8421 | 0.9177 |
| 0.0797 | 5.0503 | 3414 | 0.8818 | 0.7840 | 0.8818 | 0.9390 |
| 0.0797 | 5.0533 | 3416 | 0.8949 | 0.7840 | 0.8949 | 0.9460 |
| 0.0797 | 5.0562 | 3418 | 0.9503 | 0.7840 | 0.9503 | 0.9748 |
| 0.0797 | 5.0592 | 3420 | 0.9647 | 0.7305 | 0.9647 | 0.9822 |
| 0.0797 | 5.0621 | 3422 | 0.9247 | 0.7216 | 0.9247 | 0.9616 |
| 0.0797 | 5.0651 | 3424 | 0.8563 | 0.7618 | 0.8563 | 0.9254 |
| 0.0797 | 5.0680 | 3426 | 0.8365 | 0.7618 | 0.8365 | 0.9146 |
| 0.0797 | 5.0710 | 3428 | 0.8017 | 0.7618 | 0.8017 | 0.8954 |
| 0.0797 | 5.0740 | 3430 | 0.8188 | 0.7618 | 0.8188 | 0.9049 |
| 0.0797 | 5.0769 | 3432 | 0.8456 | 0.7840 | 0.8456 | 0.9195 |
| 0.0797 | 5.0799 | 3434 | 0.9030 | 0.7519 | 0.9030 | 0.9503 |
| 0.0797 | 5.0828 | 3436 | 0.9507 | 0.7519 | 0.9507 | 0.9750 |
| 0.0797 | 5.0858 | 3438 | 0.9936 | 0.7840 | 0.9936 | 0.9968 |
| 0.0797 | 5.0888 | 3440 | 0.9979 | 0.7840 | 0.9979 | 0.9990 |
| 0.0797 | 5.0917 | 3442 | 0.9735 | 0.7519 | 0.9735 | 0.9867 |
| 0.0797 | 5.0947 | 3444 | 0.9881 | 0.7840 | 0.9881 | 0.9940 |
| 0.0797 | 5.0976 | 3446 | 1.0506 | 0.7621 | 1.0506 | 1.0250 |
| 0.0797 | 5.1006 | 3448 | 1.0685 | 0.7621 | 1.0685 | 1.0337 |
| 0.0797 | 5.1036 | 3450 | 1.0462 | 0.7621 | 1.0462 | 1.0229 |
| 0.0797 | 5.1065 | 3452 | 1.0392 | 0.7621 | 1.0392 | 1.0194 |
| 0.0797 | 5.1095 | 3454 | 0.9713 | 0.7840 | 0.9713 | 0.9855 |
| 0.0797 | 5.1124 | 3456 | 0.9199 | 0.7840 | 0.9199 | 0.9591 |
| 0.0797 | 5.1154 | 3458 | 0.8870 | 0.7626 | 0.8870 | 0.9418 |
| 0.0797 | 5.1183 | 3460 | 0.8634 | 0.7626 | 0.8634 | 0.9292 |
| 0.0797 | 5.1213 | 3462 | 0.8286 | 0.7520 | 0.8286 | 0.9103 |
| 0.0797 | 5.1243 | 3464 | 0.8677 | 0.7626 | 0.8677 | 0.9315 |
| 0.0797 | 5.1272 | 3466 | 0.9625 | 0.7840 | 0.9625 | 0.9811 |
| 0.0797 | 5.1302 | 3468 | 1.0001 | 0.7840 | 1.0001 | 1.0001 |
| 0.0797 | 5.1331 | 3470 | 0.9647 | 0.7840 | 0.9647 | 0.9822 |
| 0.0797 | 5.1361 | 3472 | 0.8770 | 0.7520 | 0.8770 | 0.9365 |
| 0.0797 | 5.1391 | 3474 | 0.7879 | 0.7520 | 0.7879 | 0.8876 |
| 0.0797 | 5.1420 | 3476 | 0.7401 | 0.7337 | 0.7401 | 0.8603 |
| 0.0797 | 5.1450 | 3478 | 0.7503 | 0.7616 | 0.7503 | 0.8662 |
| 0.0797 | 5.1479 | 3480 | 0.7811 | 0.7713 | 0.7811 | 0.8838 |
| 0.0797 | 5.1509 | 3482 | 0.8408 | 0.7520 | 0.8408 | 0.9169 |
| 0.0797 | 5.1538 | 3484 | 0.9380 | 0.7724 | 0.9380 | 0.9685 |
| 0.0797 | 5.1568 | 3486 | 0.9983 | 0.7216 | 0.9983 | 0.9992 |
| 0.0797 | 5.1598 | 3488 | 0.9857 | 0.7216 | 0.9857 | 0.9928 |
| 0.0797 | 5.1627 | 3490 | 0.9139 | 0.8015 | 0.9139 | 0.9560 |
| 0.0797 | 5.1657 | 3492 | 0.8396 | 0.7902 | 0.8396 | 0.9163 |
| 0.0797 | 5.1686 | 3494 | 0.7844 | 0.7713 | 0.7844 | 0.8856 |
| 0.0797 | 5.1716 | 3496 | 0.7673 | 0.7713 | 0.7673 | 0.8760 |
| 0.0797 | 5.1746 | 3498 | 0.7993 | 0.7902 | 0.7993 | 0.8940 |
| 0.0729 | 5.1775 | 3500 | 0.8580 | 0.7724 | 0.8580 | 0.9263 |
| 0.0729 | 5.1805 | 3502 | 0.8718 | 0.7724 | 0.8718 | 0.9337 |
| 0.0729 | 5.1834 | 3504 | 0.8409 | 0.7724 | 0.8409 | 0.9170 |
| 0.0729 | 5.1864 | 3506 | 0.8036 | 0.7819 | 0.8036 | 0.8964 |
| 0.0729 | 5.1893 | 3508 | 0.8000 | 0.7713 | 0.8000 | 0.8944 |
| 0.0729 | 5.1923 | 3510 | 0.8335 | 0.7724 | 0.8335 | 0.9130 |
| 0.0729 | 5.1953 | 3512 | 0.8857 | 0.7724 | 0.8857 | 0.9411 |
| 0.0729 | 5.1982 | 3514 | 0.8863 | 0.7724 | 0.8863 | 0.9414 |
| 0.0729 | 5.2012 | 3516 | 0.9182 | 0.7724 | 0.9182 | 0.9582 |
| 0.0729 | 5.2041 | 3518 | 0.9120 | 0.7724 | 0.9120 | 0.9550 |
| 0.0729 | 5.2071 | 3520 | 0.8537 | 0.7618 | 0.8537 | 0.9240 |
| 0.0729 | 5.2101 | 3522 | 0.8442 | 0.7618 | 0.8442 | 0.9188 |
| 0.0729 | 5.2130 | 3524 | 0.8557 | 0.7724 | 0.8557 | 0.9250 |
| 0.0729 | 5.2160 | 3526 | 0.8916 | 0.7724 | 0.8916 | 0.9442 |
| 0.0729 | 5.2189 | 3528 | 0.9189 | 0.7724 | 0.9189 | 0.9586 |
| 0.0729 | 5.2219 | 3530 | 0.9315 | 0.7724 | 0.9315 | 0.9651 |
| 0.0729 | 5.2249 | 3532 | 0.8827 | 0.7618 | 0.8827 | 0.9395 |
| 0.0729 | 5.2278 | 3534 | 0.8433 | 0.7618 | 0.8433 | 0.9183 |
| 0.0729 | 5.2308 | 3536 | 0.8072 | 0.7618 | 0.8072 | 0.8984 |
| 0.0729 | 5.2337 | 3538 | 0.7755 | 0.7902 | 0.7755 | 0.8806 |
| 0.0729 | 5.2367 | 3540 | 0.7664 | 0.7902 | 0.7664 | 0.8754 |
| 0.0729 | 5.2396 | 3542 | 0.7976 | 0.7618 | 0.7976 | 0.8931 |
| 0.0729 | 5.2426 | 3544 | 0.8564 | 0.7618 | 0.8564 | 0.9254 |
| 0.0729 | 5.2456 | 3546 | 0.9030 | 0.7618 | 0.9030 | 0.9502 |
| 0.0729 | 5.2485 | 3548 | 0.9374 | 0.7618 | 0.9374 | 0.9682 |
| 0.0729 | 5.2515 | 3550 | 0.9030 | 0.7618 | 0.9030 | 0.9503 |
| 0.0729 | 5.2544 | 3552 | 0.8456 | 0.7618 | 0.8456 | 0.9196 |
| 0.0729 | 5.2574 | 3554 | 0.8013 | 0.7902 | 0.8013 | 0.8952 |
| 0.0729 | 5.2604 | 3556 | 0.8156 | 0.7902 | 0.8156 | 0.9031 |
| 0.0729 | 5.2633 | 3558 | 0.8410 | 0.7618 | 0.8410 | 0.9171 |
| 0.0729 | 5.2663 | 3560 | 0.8319 | 0.7902 | 0.8319 | 0.9121 |
| 0.0729 | 5.2692 | 3562 | 0.8376 | 0.7618 | 0.8376 | 0.9152 |
| 0.0729 | 5.2722 | 3564 | 0.8401 | 0.7519 | 0.8401 | 0.9166 |
| 0.0729 | 5.2751 | 3566 | 0.8302 | 0.7296 | 0.8302 | 0.9112 |
| 0.0729 | 5.2781 | 3568 | 0.8641 | 0.7519 | 0.8641 | 0.9295 |
| 0.0729 | 5.2811 | 3570 | 0.9360 | 0.7519 | 0.9360 | 0.9675 |
| 0.0729 | 5.2840 | 3572 | 1.0311 | 0.7519 | 1.0311 | 1.0154 |
| 0.0729 | 5.2870 | 3574 | 1.1162 | 0.7188 | 1.1162 | 1.0565 |
| 0.0729 | 5.2899 | 3576 | 1.1290 | 0.7188 | 1.1290 | 1.0625 |
| 0.0729 | 5.2929 | 3578 | 1.0821 | 0.7840 | 1.0821 | 1.0402 |
| 0.0729 | 5.2959 | 3580 | 1.0303 | 0.7840 | 1.0303 | 1.0150 |
| 0.0729 | 5.2988 | 3582 | 0.9852 | 0.7840 | 0.9852 | 0.9926 |
| 0.0729 | 5.3018 | 3584 | 0.9101 | 0.8015 | 0.9101 | 0.9540 |
| 0.0729 | 5.3047 | 3586 | 0.8275 | 0.7983 | 0.8275 | 0.9097 |
| 0.0729 | 5.3077 | 3588 | 0.7903 | 0.7983 | 0.7903 | 0.8890 |
| 0.0729 | 5.3107 | 3590 | 0.7626 | 0.7983 | 0.7626 | 0.8733 |
| 0.0729 | 5.3136 | 3592 | 0.7643 | 0.7983 | 0.7643 | 0.8742 |
| 0.0729 | 5.3166 | 3594 | 0.8050 | 0.8164 | 0.8050 | 0.8972 |
| 0.0729 | 5.3195 | 3596 | 0.8844 | 0.7902 | 0.8844 | 0.9404 |
| 0.0729 | 5.3225 | 3598 | 0.9726 | 0.7840 | 0.9726 | 0.9862 |
| 0.0729 | 5.3254 | 3600 | 1.0236 | 0.7305 | 1.0236 | 1.0117 |
| 0.0729 | 5.3284 | 3602 | 1.0094 | 0.7840 | 1.0094 | 1.0047 |
| 0.0729 | 5.3314 | 3604 | 0.9310 | 0.7840 | 0.9310 | 0.9649 |
| 0.0729 | 5.3343 | 3606 | 0.8657 | 0.7840 | 0.8657 | 0.9304 |
| 0.0729 | 5.3373 | 3608 | 0.8299 | 0.7840 | 0.8299 | 0.9110 |
| 0.0729 | 5.3402 | 3610 | 0.8399 | 0.7840 | 0.8399 | 0.9165 |
| 0.0729 | 5.3432 | 3612 | 0.8454 | 0.7840 | 0.8454 | 0.9195 |
| 0.0729 | 5.3462 | 3614 | 0.8262 | 0.7519 | 0.8262 | 0.9089 |
| 0.0729 | 5.3491 | 3616 | 0.8348 | 0.7840 | 0.8348 | 0.9137 |
| 0.0729 | 5.3521 | 3618 | 0.8672 | 0.7840 | 0.8672 | 0.9312 |
| 0.0729 | 5.3550 | 3620 | 0.9134 | 0.7840 | 0.9134 | 0.9557 |
| 0.0729 | 5.3580 | 3622 | 0.9925 | 0.7840 | 0.9925 | 0.9963 |
| 0.0729 | 5.3609 | 3624 | 1.0549 | 0.7188 | 1.0549 | 1.0271 |
| 0.0729 | 5.3639 | 3626 | 1.0776 | 0.7188 | 1.0776 | 1.0381 |
| 0.0729 | 5.3669 | 3628 | 1.0745 | 0.7188 | 1.0745 | 1.0366 |
| 0.0729 | 5.3698 | 3630 | 1.0338 | 0.7188 | 1.0338 | 1.0167 |
| 0.0729 | 5.3728 | 3632 | 0.9743 | 0.7724 | 0.9743 | 0.9871 |
| 0.0729 | 5.3757 | 3634 | 0.9124 | 0.7413 | 0.9124 | 0.9552 |
| 0.0729 | 5.3787 | 3636 | 0.9034 | 0.7413 | 0.9034 | 0.9505 |
| 0.0729 | 5.3817 | 3638 | 0.9447 | 0.7413 | 0.9447 | 0.9719 |
| 0.0729 | 5.3846 | 3640 | 0.9774 | 0.7840 | 0.9774 | 0.9887 |
| 0.0729 | 5.3876 | 3642 | 1.0130 | 0.7724 | 1.0130 | 1.0065 |
| 0.0729 | 5.3905 | 3644 | 1.0409 | 0.7724 | 1.0409 | 1.0203 |
| 0.0729 | 5.3935 | 3646 | 1.0244 | 0.7216 | 1.0244 | 1.0121 |
| 0.0729 | 5.3964 | 3648 | 0.9700 | 0.7216 | 0.9700 | 0.9849 |
| 0.0729 | 5.3994 | 3650 | 0.9572 | 0.7724 | 0.9572 | 0.9783 |
| 0.0729 | 5.4024 | 3652 | 0.9200 | 0.7724 | 0.9200 | 0.9591 |
| 0.0729 | 5.4053 | 3654 | 0.8985 | 0.7724 | 0.8985 | 0.9479 |
| 0.0729 | 5.4083 | 3656 | 0.9255 | 0.7724 | 0.9255 | 0.9620 |
| 0.0729 | 5.4112 | 3658 | 0.9594 | 0.7724 | 0.9594 | 0.9795 |
| 0.0729 | 5.4142 | 3660 | 1.0013 | 0.7724 | 1.0013 | 1.0006 |
| 0.0729 | 5.4172 | 3662 | 1.0250 | 0.7724 | 1.0250 | 1.0124 |
| 0.0729 | 5.4201 | 3664 | 0.9970 | 0.7724 | 0.9970 | 0.9985 |
| 0.0729 | 5.4231 | 3666 | 0.9493 | 0.7724 | 0.9493 | 0.9743 |
| 0.0729 | 5.4260 | 3668 | 0.9316 | 0.7724 | 0.9316 | 0.9652 |
| 0.0729 | 5.4290 | 3670 | 0.9488 | 0.7724 | 0.9488 | 0.9740 |
| 0.0729 | 5.4320 | 3672 | 0.9211 | 0.7413 | 0.9211 | 0.9597 |
| 0.0729 | 5.4349 | 3674 | 0.9128 | 0.7413 | 0.9128 | 0.9554 |
| 0.0729 | 5.4379 | 3676 | 0.8855 | 0.7413 | 0.8855 | 0.9410 |
| 0.0729 | 5.4408 | 3678 | 0.8444 | 0.7413 | 0.8444 | 0.9189 |
| 0.0729 | 5.4438 | 3680 | 0.8298 | 0.7413 | 0.8298 | 0.9110 |
| 0.0729 | 5.4467 | 3682 | 0.8495 | 0.7724 | 0.8495 | 0.9217 |
| 0.0729 | 5.4497 | 3684 | 0.8905 | 0.7724 | 0.8905 | 0.9436 |
| 0.0729 | 5.4527 | 3686 | 0.9412 | 0.7724 | 0.9412 | 0.9702 |
| 0.0729 | 5.4556 | 3688 | 0.9933 | 0.7724 | 0.9933 | 0.9967 |
| 0.0729 | 5.4586 | 3690 | 1.0112 | 0.7101 | 1.0112 | 1.0056 |
| 0.0729 | 5.4615 | 3692 | 0.9683 | 0.7724 | 0.9683 | 0.9840 |
| 0.0729 | 5.4645 | 3694 | 0.8887 | 0.7413 | 0.8887 | 0.9427 |
| 0.0729 | 5.4675 | 3696 | 0.8022 | 0.8019 | 0.8022 | 0.8956 |
| 0.0729 | 5.4704 | 3698 | 0.7701 | 0.7717 | 0.7701 | 0.8776 |
| 0.0729 | 5.4734 | 3700 | 0.7594 | 0.7717 | 0.7594 | 0.8714 |
| 0.0729 | 5.4763 | 3702 | 0.7811 | 0.7906 | 0.7811 | 0.8838 |
| 0.0729 | 5.4793 | 3704 | 0.7993 | 0.7906 | 0.7993 | 0.8941 |
| 0.0729 | 5.4822 | 3706 | 0.8533 | 0.8283 | 0.8533 | 0.9237 |
| 0.0729 | 5.4852 | 3708 | 0.9178 | 0.8015 | 0.9178 | 0.9580 |
| 0.0729 | 5.4882 | 3710 | 0.9853 | 0.7216 | 0.9853 | 0.9926 |
| 0.0729 | 5.4911 | 3712 | 1.0007 | 0.7216 | 1.0007 | 1.0004 |
| 0.0729 | 5.4941 | 3714 | 0.9916 | 0.7216 | 0.9916 | 0.9958 |
| 0.0729 | 5.4970 | 3716 | 0.9264 | 0.7724 | 0.9264 | 0.9625 |
| 0.0729 | 5.5 | 3718 | 0.8786 | 0.8015 | 0.8786 | 0.9373 |
| 0.0729 | 5.5030 | 3720 | 0.8418 | 0.8019 | 0.8418 | 0.9175 |
| 0.0729 | 5.5059 | 3722 | 0.8289 | 0.8019 | 0.8289 | 0.9105 |
| 0.0729 | 5.5089 | 3724 | 0.7966 | 0.7823 | 0.7966 | 0.8925 |
| 0.0729 | 5.5118 | 3726 | 0.7708 | 0.7823 | 0.7708 | 0.8780 |
| 0.0729 | 5.5148 | 3728 | 0.7675 | 0.7823 | 0.7675 | 0.8761 |
| 0.0729 | 5.5178 | 3730 | 0.7645 | 0.7823 | 0.7645 | 0.8744 |
| 0.0729 | 5.5207 | 3732 | 0.8095 | 0.7525 | 0.8095 | 0.8997 |
| 0.0729 | 5.5237 | 3734 | 0.8399 | 0.7413 | 0.8399 | 0.9164 |
| 0.0729 | 5.5266 | 3736 | 0.9047 | 0.7724 | 0.9047 | 0.9512 |
| 0.0729 | 5.5296 | 3738 | 0.9951 | 0.7840 | 0.9951 | 0.9976 |
| 0.0729 | 5.5325 | 3740 | 1.0166 | 0.7840 | 1.0166 | 1.0083 |
| 0.0729 | 5.5355 | 3742 | 0.9814 | 0.7840 | 0.9814 | 0.9907 |
| 0.0729 | 5.5385 | 3744 | 0.9224 | 0.7840 | 0.9224 | 0.9604 |
| 0.0729 | 5.5414 | 3746 | 0.8725 | 0.7413 | 0.8725 | 0.9341 |
| 0.0729 | 5.5444 | 3748 | 0.8395 | 0.7413 | 0.8395 | 0.9162 |
| 0.0729 | 5.5473 | 3750 | 0.8219 | 0.72 | 0.8219 | 0.9066 |
| 0.0729 | 5.5503 | 3752 | 0.8421 | 0.7413 | 0.8421 | 0.9176 |
| 0.0729 | 5.5533 | 3754 | 0.8522 | 0.7413 | 0.8522 | 0.9231 |
| 0.0729 | 5.5562 | 3756 | 0.8455 | 0.72 | 0.8455 | 0.9195 |
| 0.0729 | 5.5592 | 3758 | 0.8631 | 0.7413 | 0.8631 | 0.9290 |
| 0.0729 | 5.5621 | 3760 | 0.8563 | 0.7413 | 0.8563 | 0.9254 |
| 0.0729 | 5.5651 | 3762 | 0.8206 | 0.72 | 0.8206 | 0.9059 |
| 0.0729 | 5.5680 | 3764 | 0.8031 | 0.7823 | 0.8031 | 0.8961 |
| 0.0729 | 5.5710 | 3766 | 0.8135 | 0.7525 | 0.8135 | 0.9019 |
| 0.0729 | 5.5740 | 3768 | 0.8051 | 0.7525 | 0.8051 | 0.8973 |
| 0.0729 | 5.5769 | 3770 | 0.8042 | 0.7525 | 0.8042 | 0.8968 |
| 0.0729 | 5.5799 | 3772 | 0.8397 | 0.7413 | 0.8397 | 0.9163 |
| 0.0729 | 5.5828 | 3774 | 0.8898 | 0.7413 | 0.8898 | 0.9433 |
| 0.0729 | 5.5858 | 3776 | 0.9152 | 0.7413 | 0.9152 | 0.9567 |
| 0.0729 | 5.5888 | 3778 | 0.9219 | 0.7519 | 0.9219 | 0.9601 |
| 0.0729 | 5.5917 | 3780 | 0.8990 | 0.7413 | 0.8990 | 0.9481 |
| 0.0729 | 5.5947 | 3782 | 0.8616 | 0.7413 | 0.8616 | 0.9282 |
| 0.0729 | 5.5976 | 3784 | 0.8648 | 0.7413 | 0.8648 | 0.9300 |
| 0.0729 | 5.6006 | 3786 | 0.8304 | 0.7413 | 0.8304 | 0.9113 |
| 0.0729 | 5.6036 | 3788 | 0.7936 | 0.7413 | 0.7936 | 0.8908 |
| 0.0729 | 5.6065 | 3790 | 0.7983 | 0.7413 | 0.7983 | 0.8935 |
| 0.0729 | 5.6095 | 3792 | 0.8204 | 0.7519 | 0.8204 | 0.9057 |
| 0.0729 | 5.6124 | 3794 | 0.8397 | 0.7519 | 0.8397 | 0.9164 |
| 0.0729 | 5.6154 | 3796 | 0.8339 | 0.7413 | 0.8339 | 0.9132 |
| 0.0729 | 5.6183 | 3798 | 0.8093 | 0.7729 | 0.8093 | 0.8996 |
| 0.0729 | 5.6213 | 3800 | 0.7703 | 0.8019 | 0.7703 | 0.8777 |
| 0.0729 | 5.6243 | 3802 | 0.7544 | 0.8283 | 0.7544 | 0.8686 |
| 0.0729 | 5.6272 | 3804 | 0.7747 | 0.8283 | 0.7747 | 0.8802 |
| 0.0729 | 5.6302 | 3806 | 0.8086 | 0.8283 | 0.8086 | 0.8992 |
| 0.0729 | 5.6331 | 3808 | 0.8343 | 0.8283 | 0.8343 | 0.9134 |
| 0.0729 | 5.6361 | 3810 | 0.8292 | 0.8283 | 0.8292 | 0.9106 |
| 0.0729 | 5.6391 | 3812 | 0.7784 | 0.8283 | 0.7784 | 0.8822 |
| 0.0729 | 5.6420 | 3814 | 0.7655 | 0.8019 | 0.7655 | 0.8749 |
| 0.0729 | 5.6450 | 3816 | 0.7767 | 0.8019 | 0.7767 | 0.8813 |
| 0.0729 | 5.6479 | 3818 | 0.7758 | 0.8019 | 0.7758 | 0.8808 |
| 0.0729 | 5.6509 | 3820 | 0.7745 | 0.8019 | 0.7745 | 0.8800 |
| 0.0729 | 5.6538 | 3822 | 0.8042 | 0.8019 | 0.8042 | 0.8968 |
| 0.0729 | 5.6568 | 3824 | 0.7862 | 0.8019 | 0.7862 | 0.8867 |
| 0.0729 | 5.6598 | 3826 | 0.7381 | 0.8019 | 0.7381 | 0.8591 |
| 0.0729 | 5.6627 | 3828 | 0.6855 | 0.7717 | 0.6855 | 0.8280 |
| 0.0729 | 5.6657 | 3830 | 0.6682 | 0.7717 | 0.6682 | 0.8174 |
| 0.0729 | 5.6686 | 3832 | 0.6976 | 0.7906 | 0.6976 | 0.8352 |
| 0.0729 | 5.6716 | 3834 | 0.7755 | 0.8283 | 0.7755 | 0.8806 |
| 0.0729 | 5.6746 | 3836 | 0.8700 | 0.8283 | 0.8700 | 0.9328 |
| 0.0729 | 5.6775 | 3838 | 0.9186 | 0.7724 | 0.9186 | 0.9584 |
| 0.0729 | 5.6805 | 3840 | 0.9151 | 0.7724 | 0.9151 | 0.9566 |
| 0.0729 | 5.6834 | 3842 | 0.8875 | 0.8015 | 0.8875 | 0.9421 |
| 0.0729 | 5.6864 | 3844 | 0.8607 | 0.8283 | 0.8607 | 0.9277 |
| 0.0729 | 5.6893 | 3846 | 0.8227 | 0.7906 | 0.8227 | 0.9070 |
| 0.0729 | 5.6923 | 3848 | 0.7912 | 0.7906 | 0.7912 | 0.8895 |
| 0.0729 | 5.6953 | 3850 | 0.7916 | 0.7906 | 0.7916 | 0.8897 |
| 0.0729 | 5.6982 | 3852 | 0.7992 | 0.7906 | 0.7992 | 0.8940 |
| 0.0729 | 5.7012 | 3854 | 0.8389 | 0.8164 | 0.8389 | 0.9159 |
| 0.0729 | 5.7041 | 3856 | 0.9005 | 0.8283 | 0.9005 | 0.9489 |
| 0.0729 | 5.7071 | 3858 | 0.9551 | 0.7724 | 0.9551 | 0.9773 |
| 0.0729 | 5.7101 | 3860 | 0.9572 | 0.7724 | 0.9572 | 0.9784 |
| 0.0729 | 5.7130 | 3862 | 0.9207 | 0.7724 | 0.9207 | 0.9595 |
| 0.0729 | 5.7160 | 3864 | 0.8760 | 0.8019 | 0.8760 | 0.9359 |
| 0.0729 | 5.7189 | 3866 | 0.8334 | 0.8019 | 0.8334 | 0.9129 |
| 0.0729 | 5.7219 | 3868 | 0.8203 | 0.8019 | 0.8203 | 0.9057 |
| 0.0729 | 5.7249 | 3870 | 0.8116 | 0.8019 | 0.8116 | 0.9009 |
| 0.0729 | 5.7278 | 3872 | 0.7744 | 0.8019 | 0.7744 | 0.8800 |
| 0.0729 | 5.7308 | 3874 | 0.7858 | 0.8019 | 0.7858 | 0.8865 |
| 0.0729 | 5.7337 | 3876 | 0.7848 | 0.8019 | 0.7848 | 0.8859 |
| 0.0729 | 5.7367 | 3878 | 0.8025 | 0.8019 | 0.8025 | 0.8958 |
| 0.0729 | 5.7396 | 3880 | 0.8159 | 0.8019 | 0.8159 | 0.9033 |
| 0.0729 | 5.7426 | 3882 | 0.8059 | 0.8019 | 0.8059 | 0.8977 |
| 0.0729 | 5.7456 | 3884 | 0.8190 | 0.8019 | 0.8190 | 0.9050 |
| 0.0729 | 5.7485 | 3886 | 0.8319 | 0.8283 | 0.8319 | 0.9121 |
| 0.0729 | 5.7515 | 3888 | 0.8475 | 0.8015 | 0.8475 | 0.9206 |
| 0.0729 | 5.7544 | 3890 | 0.8264 | 0.8015 | 0.8264 | 0.9091 |
| 0.0729 | 5.7574 | 3892 | 0.7787 | 0.7729 | 0.7787 | 0.8824 |
| 0.0729 | 5.7604 | 3894 | 0.7594 | 0.7729 | 0.7594 | 0.8714 |
| 0.0729 | 5.7633 | 3896 | 0.7801 | 0.7729 | 0.7801 | 0.8832 |
| 0.0729 | 5.7663 | 3898 | 0.8330 | 0.8015 | 0.8330 | 0.9127 |
| 0.0729 | 5.7692 | 3900 | 0.9293 | 0.7724 | 0.9293 | 0.9640 |
| 0.0729 | 5.7722 | 3902 | 0.9803 | 0.7724 | 0.9803 | 0.9901 |
| 0.0729 | 5.7751 | 3904 | 0.9688 | 0.7724 | 0.9688 | 0.9843 |
| 0.0729 | 5.7781 | 3906 | 0.9122 | 0.8015 | 0.9122 | 0.9551 |
| 0.0729 | 5.7811 | 3908 | 0.8935 | 0.8015 | 0.8935 | 0.9452 |
| 0.0729 | 5.7840 | 3910 | 0.8688 | 0.8015 | 0.8688 | 0.9321 |
| 0.0729 | 5.7870 | 3912 | 0.8413 | 0.7729 | 0.8413 | 0.9172 |
| 0.0729 | 5.7899 | 3914 | 0.7963 | 0.7623 | 0.7963 | 0.8923 |
| 0.0729 | 5.7929 | 3916 | 0.7876 | 0.7623 | 0.7876 | 0.8875 |
| 0.0729 | 5.7959 | 3918 | 0.8222 | 0.7623 | 0.8222 | 0.9068 |
| 0.0729 | 5.7988 | 3920 | 0.8519 | 0.7316 | 0.8519 | 0.9230 |
| 0.0729 | 5.8018 | 3922 | 0.9165 | 0.7724 | 0.9165 | 0.9573 |
| 0.0729 | 5.8047 | 3924 | 0.9573 | 0.7724 | 0.9573 | 0.9784 |
| 0.0729 | 5.8077 | 3926 | 0.9844 | 0.7724 | 0.9844 | 0.9922 |
| 0.0729 | 5.8107 | 3928 | 0.9872 | 0.7840 | 0.9872 | 0.9936 |
| 0.0729 | 5.8136 | 3930 | 0.9713 | 0.7840 | 0.9713 | 0.9855 |
| 0.0729 | 5.8166 | 3932 | 0.9651 | 0.7519 | 0.9651 | 0.9824 |
| 0.0729 | 5.8195 | 3934 | 1.0025 | 0.7840 | 1.0025 | 1.0012 |
| 0.0729 | 5.8225 | 3936 | 1.0396 | 0.7840 | 1.0396 | 1.0196 |
| 0.0729 | 5.8254 | 3938 | 1.0270 | 0.7840 | 1.0270 | 1.0134 |
| 0.0729 | 5.8284 | 3940 | 0.9782 | 0.7840 | 0.9782 | 0.9891 |
| 0.0729 | 5.8314 | 3942 | 0.9171 | 0.7519 | 0.9171 | 0.9576 |
| 0.0729 | 5.8343 | 3944 | 0.8812 | 0.7519 | 0.8812 | 0.9387 |
| 0.0729 | 5.8373 | 3946 | 0.8382 | 0.7316 | 0.8382 | 0.9155 |
| 0.0729 | 5.8402 | 3948 | 0.8322 | 0.7316 | 0.8322 | 0.9122 |
| 0.0729 | 5.8432 | 3950 | 0.8118 | 0.7316 | 0.8118 | 0.9010 |
| 0.0729 | 5.8462 | 3952 | 0.8067 | 0.7316 | 0.8067 | 0.8982 |
| 0.0729 | 5.8491 | 3954 | 0.8378 | 0.7316 | 0.8378 | 0.9153 |
| 0.0729 | 5.8521 | 3956 | 0.9054 | 0.7724 | 0.9054 | 0.9515 |
| 0.0729 | 5.8550 | 3958 | 0.9621 | 0.7840 | 0.9621 | 0.9809 |
| 0.0729 | 5.8580 | 3960 | 0.9957 | 0.7840 | 0.9957 | 0.9978 |
| 0.0729 | 5.8609 | 3962 | 1.0217 | 0.7840 | 1.0217 | 1.0108 |
| 0.0729 | 5.8639 | 3964 | 1.0058 | 0.7840 | 1.0058 | 1.0029 |
| 0.0729 | 5.8669 | 3966 | 0.9909 | 0.7519 | 0.9909 | 0.9954 |
| 0.0729 | 5.8698 | 3968 | 0.9879 | 0.7519 | 0.9879 | 0.9939 |
| 0.0729 | 5.8728 | 3970 | 1.0028 | 0.7519 | 1.0028 | 1.0014 |
| 0.0729 | 5.8757 | 3972 | 0.9815 | 0.7519 | 0.9815 | 0.9907 |
| 0.0729 | 5.8787 | 3974 | 0.9568 | 0.7519 | 0.9568 | 0.9782 |
| 0.0729 | 5.8817 | 3976 | 0.9083 | 0.7519 | 0.9083 | 0.9530 |
| 0.0729 | 5.8846 | 3978 | 0.8859 | 0.7413 | 0.8859 | 0.9412 |
| 0.0729 | 5.8876 | 3980 | 0.8517 | 0.7413 | 0.8517 | 0.9229 |
| 0.0729 | 5.8905 | 3982 | 0.8734 | 0.7413 | 0.8734 | 0.9346 |
| 0.0729 | 5.8935 | 3984 | 0.9036 | 0.7840 | 0.9036 | 0.9506 |
| 0.0729 | 5.8964 | 3986 | 0.9235 | 0.7840 | 0.9235 | 0.9610 |
| 0.0729 | 5.8994 | 3988 | 0.9184 | 0.7840 | 0.9184 | 0.9584 |
| 0.0729 | 5.9024 | 3990 | 0.8787 | 0.7724 | 0.8787 | 0.9374 |
| 0.0729 | 5.9053 | 3992 | 0.8574 | 0.7724 | 0.8574 | 0.9260 |
| 0.0729 | 5.9083 | 3994 | 0.8516 | 0.7724 | 0.8516 | 0.9228 |
| 0.0729 | 5.9112 | 3996 | 0.8806 | 0.7724 | 0.8806 | 0.9384 |
| 0.0729 | 5.9142 | 3998 | 0.9248 | 0.7840 | 0.9248 | 0.9617 |
| 0.0618 | 5.9172 | 4000 | 0.9274 | 0.7840 | 0.9274 | 0.9630 |
| 0.0618 | 5.9201 | 4002 | 0.8962 | 0.7840 | 0.8962 | 0.9467 |
| 0.0618 | 5.9231 | 4004 | 0.8165 | 0.7729 | 0.8165 | 0.9036 |
| 0.0618 | 5.9260 | 4006 | 0.7779 | 0.7729 | 0.7779 | 0.8820 |
| 0.0618 | 5.9290 | 4008 | 0.7484 | 0.7823 | 0.7484 | 0.8651 |
| 0.0618 | 5.9320 | 4010 | 0.7476 | 0.7823 | 0.7476 | 0.8646 |
| 0.0618 | 5.9349 | 4012 | 0.7707 | 0.8019 | 0.7707 | 0.8779 |
| 0.0618 | 5.9379 | 4014 | 0.8322 | 0.8015 | 0.8322 | 0.9123 |
| 0.0618 | 5.9408 | 4016 | 0.8897 | 0.8138 | 0.8897 | 0.9433 |
| 0.0618 | 5.9438 | 4018 | 0.9447 | 0.7840 | 0.9447 | 0.9719 |
| 0.0618 | 5.9467 | 4020 | 0.9562 | 0.7840 | 0.9562 | 0.9778 |
| 0.0618 | 5.9497 | 4022 | 0.9259 | 0.7840 | 0.9259 | 0.9623 |
| 0.0618 | 5.9527 | 4024 | 0.8530 | 0.8015 | 0.8530 | 0.9236 |
| 0.0618 | 5.9556 | 4026 | 0.7660 | 0.8019 | 0.7660 | 0.8752 |
| 0.0618 | 5.9586 | 4028 | 0.6856 | 0.7717 | 0.6856 | 0.8280 |
| 0.0618 | 5.9615 | 4030 | 0.6482 | 0.7717 | 0.6482 | 0.8051 |
| 0.0618 | 5.9645 | 4032 | 0.6527 | 0.7717 | 0.6527 | 0.8079 |
| 0.0618 | 5.9675 | 4034 | 0.6919 | 0.7717 | 0.6919 | 0.8318 |
| 0.0618 | 5.9704 | 4036 | 0.7488 | 0.7823 | 0.7488 | 0.8653 |
| 0.0618 | 5.9734 | 4038 | 0.8241 | 0.7844 | 0.8241 | 0.9078 |
| 0.0618 | 5.9763 | 4040 | 0.9084 | 0.7840 | 0.9084 | 0.9531 |
| 0.0618 | 5.9793 | 4042 | 0.9500 | 0.7840 | 0.9500 | 0.9747 |
| 0.0618 | 5.9822 | 4044 | 0.9388 | 0.7840 | 0.9388 | 0.9689 |
| 0.0618 | 5.9852 | 4046 | 0.8905 | 0.7840 | 0.8905 | 0.9437 |
| 0.0618 | 5.9882 | 4048 | 0.8283 | 0.8283 | 0.8283 | 0.9101 |
| 0.0618 | 5.9911 | 4050 | 0.7828 | 0.8164 | 0.7828 | 0.8848 |
| 0.0618 | 5.9941 | 4052 | 0.7700 | 0.8164 | 0.7700 | 0.8775 |
| 0.0618 | 5.9970 | 4054 | 0.7762 | 0.8164 | 0.7762 | 0.8810 |
| 0.0618 | 6.0 | 4056 | 0.7857 | 0.8164 | 0.7857 | 0.8864 |
| 0.0618 | 6.0030 | 4058 | 0.7708 | 0.8164 | 0.7708 | 0.8779 |
| 0.0618 | 6.0059 | 4060 | 0.7407 | 0.8164 | 0.7407 | 0.8606 |
| 0.0618 | 6.0089 | 4062 | 0.7338 | 0.8164 | 0.7338 | 0.8566 |
| 0.0618 | 6.0118 | 4064 | 0.7257 | 0.8164 | 0.7257 | 0.8519 |
| 0.0618 | 6.0148 | 4066 | 0.6997 | 0.7906 | 0.6997 | 0.8365 |
| 0.0618 | 6.0178 | 4068 | 0.6803 | 0.7717 | 0.6803 | 0.8248 |
| 0.0618 | 6.0207 | 4070 | 0.6891 | 0.7906 | 0.6891 | 0.8301 |
| 0.0618 | 6.0237 | 4072 | 0.7343 | 0.7906 | 0.7343 | 0.8569 |
| 0.0618 | 6.0266 | 4074 | 0.7942 | 0.8164 | 0.7942 | 0.8912 |
| 0.0618 | 6.0296 | 4076 | 0.8335 | 0.7902 | 0.8335 | 0.9130 |
| 0.0618 | 6.0325 | 4078 | 0.8679 | 0.8015 | 0.8679 | 0.9316 |
| 0.0618 | 6.0355 | 4080 | 0.8942 | 0.8015 | 0.8942 | 0.9456 |
| 0.0618 | 6.0385 | 4082 | 0.8714 | 0.8015 | 0.8714 | 0.9335 |
| 0.0618 | 6.0414 | 4084 | 0.8241 | 0.8164 | 0.8241 | 0.9078 |
| 0.0618 | 6.0444 | 4086 | 0.8082 | 0.8164 | 0.8082 | 0.8990 |
| 0.0618 | 6.0473 | 4088 | 0.7933 | 0.8164 | 0.7933 | 0.8907 |
| 0.0618 | 6.0503 | 4090 | 0.8125 | 0.8164 | 0.8125 | 0.9014 |
| 0.0618 | 6.0533 | 4092 | 0.8090 | 0.8283 | 0.8090 | 0.8995 |
| 0.0618 | 6.0562 | 4094 | 0.8088 | 0.8019 | 0.8088 | 0.8993 |
| 0.0618 | 6.0592 | 4096 | 0.8278 | 0.8019 | 0.8278 | 0.9098 |
| 0.0618 | 6.0621 | 4098 | 0.8295 | 0.8019 | 0.8295 | 0.9108 |
| 0.0618 | 6.0651 | 4100 | 0.8138 | 0.8019 | 0.8138 | 0.9021 |
| 0.0618 | 6.0680 | 4102 | 0.7732 | 0.8019 | 0.7732 | 0.8793 |
| 0.0618 | 6.0710 | 4104 | 0.7564 | 0.7906 | 0.7564 | 0.8697 |
| 0.0618 | 6.0740 | 4106 | 0.7512 | 0.7717 | 0.7512 | 0.8667 |
| 0.0618 | 6.0769 | 4108 | 0.7832 | 0.8019 | 0.7832 | 0.8850 |
| 0.0618 | 6.0799 | 4110 | 0.8424 | 0.7844 | 0.8424 | 0.9178 |
| 0.0618 | 6.0828 | 4112 | 0.9049 | 0.7519 | 0.9049 | 0.9513 |
| 0.0618 | 6.0858 | 4114 | 0.9690 | 0.7840 | 0.9690 | 0.9844 |
| 0.0618 | 6.0888 | 4116 | 0.9749 | 0.7840 | 0.9749 | 0.9874 |
| 0.0618 | 6.0917 | 4118 | 0.9375 | 0.7840 | 0.9375 | 0.9683 |
| 0.0618 | 6.0947 | 4120 | 0.8876 | 0.7844 | 0.8876 | 0.9421 |
| 0.0618 | 6.0976 | 4122 | 0.8800 | 0.7844 | 0.8800 | 0.9381 |
| 0.0618 | 6.1006 | 4124 | 0.8834 | 0.7844 | 0.8834 | 0.9399 |
| 0.0618 | 6.1036 | 4126 | 0.8651 | 0.8142 | 0.8651 | 0.9301 |
| 0.0618 | 6.1065 | 4128 | 0.8500 | 0.8142 | 0.8500 | 0.9220 |
| 0.0618 | 6.1095 | 4130 | 0.8587 | 0.8142 | 0.8587 | 0.9267 |
| 0.0618 | 6.1124 | 4132 | 0.8524 | 0.8142 | 0.8523 | 0.9232 |
| 0.0618 | 6.1154 | 4134 | 0.8220 | 0.7937 | 0.8220 | 0.9066 |
| 0.0618 | 6.1183 | 4136 | 0.8070 | 0.7937 | 0.8070 | 0.8984 |
| 0.0618 | 6.1213 | 4138 | 0.8368 | 0.8215 | 0.8368 | 0.9148 |
| 0.0618 | 6.1243 | 4140 | 0.9113 | 0.8138 | 0.9113 | 0.9546 |
| 0.0618 | 6.1272 | 4142 | 0.9792 | 0.7614 | 0.9792 | 0.9896 |
| 0.0618 | 6.1302 | 4144 | 1.0125 | 0.7305 | 1.0125 | 1.0062 |
| 0.0618 | 6.1331 | 4146 | 0.9993 | 0.7840 | 0.9993 | 0.9997 |
| 0.0618 | 6.1361 | 4148 | 0.9779 | 0.7840 | 0.9779 | 0.9889 |
| 0.0618 | 6.1391 | 4150 | 0.9603 | 0.7840 | 0.9603 | 0.9799 |
| 0.0618 | 6.1420 | 4152 | 0.9874 | 0.7840 | 0.9874 | 0.9937 |
| 0.0618 | 6.1450 | 4154 | 1.0432 | 0.7840 | 1.0432 | 1.0214 |
| 0.0618 | 6.1479 | 4156 | 1.1069 | 0.7840 | 1.1069 | 1.0521 |
| 0.0618 | 6.1509 | 4158 | 1.1020 | 0.7840 | 1.1020 | 1.0498 |
| 0.0618 | 6.1538 | 4160 | 1.0420 | 0.7840 | 1.0420 | 1.0208 |
| 0.0618 | 6.1568 | 4162 | 0.9577 | 0.7519 | 0.9577 | 0.9786 |
| 0.0618 | 6.1598 | 4164 | 0.8798 | 0.7844 | 0.8798 | 0.9380 |
| 0.0618 | 6.1627 | 4166 | 0.8421 | 0.7844 | 0.8421 | 0.9176 |
| 0.0618 | 6.1657 | 4168 | 0.8155 | 0.7631 | 0.8155 | 0.9031 |
| 0.0618 | 6.1686 | 4170 | 0.8277 | 0.7844 | 0.8277 | 0.9098 |
| 0.0618 | 6.1716 | 4172 | 0.8593 | 0.7844 | 0.8593 | 0.9270 |
| 0.0618 | 6.1746 | 4174 | 0.9136 | 0.7844 | 0.9136 | 0.9558 |
| 0.0618 | 6.1775 | 4176 | 0.9556 | 0.7519 | 0.9556 | 0.9776 |
| 0.0618 | 6.1805 | 4178 | 0.9804 | 0.7840 | 0.9804 | 0.9902 |
| 0.0618 | 6.1834 | 4180 | 1.0194 | 0.7840 | 1.0194 | 1.0096 |
| 0.0618 | 6.1864 | 4182 | 1.0158 | 0.8138 | 1.0158 | 1.0079 |
| 0.0618 | 6.1893 | 4184 | 0.9620 | 0.8138 | 0.9620 | 0.9808 |
| 0.0618 | 6.1923 | 4186 | 0.8777 | 0.8283 | 0.8777 | 0.9368 |
| 0.0618 | 6.1953 | 4188 | 0.8295 | 0.8283 | 0.8295 | 0.9108 |
| 0.0618 | 6.1982 | 4190 | 0.7949 | 0.8164 | 0.7949 | 0.8916 |
| 0.0618 | 6.2012 | 4192 | 0.7444 | 0.7717 | 0.7444 | 0.8628 |
| 0.0618 | 6.2041 | 4194 | 0.7334 | 0.7717 | 0.7334 | 0.8564 |
| 0.0618 | 6.2071 | 4196 | 0.7522 | 0.7717 | 0.7522 | 0.8673 |
| 0.0618 | 6.2101 | 4198 | 0.7948 | 0.7906 | 0.7948 | 0.8915 |
| 0.0618 | 6.2130 | 4200 | 0.8747 | 0.8283 | 0.8747 | 0.9353 |
| 0.0618 | 6.2160 | 4202 | 0.9584 | 0.8138 | 0.9584 | 0.9790 |
| 0.0618 | 6.2189 | 4204 | 0.9870 | 0.7614 | 0.9870 | 0.9935 |
| 0.0618 | 6.2219 | 4206 | 0.9665 | 0.8138 | 0.9665 | 0.9831 |
| 0.0618 | 6.2249 | 4208 | 0.9049 | 0.8412 | 0.9049 | 0.9512 |
| 0.0618 | 6.2278 | 4210 | 0.8257 | 0.8164 | 0.8257 | 0.9087 |
| 0.0618 | 6.2308 | 4212 | 0.7600 | 0.7906 | 0.7600 | 0.8718 |
| 0.0618 | 6.2337 | 4214 | 0.7078 | 0.7717 | 0.7078 | 0.8413 |
| 0.0618 | 6.2367 | 4216 | 0.6927 | 0.7717 | 0.6927 | 0.8323 |
| 0.0618 | 6.2396 | 4218 | 0.7173 | 0.7717 | 0.7173 | 0.8469 |
| 0.0618 | 6.2426 | 4220 | 0.7786 | 0.7717 | 0.7786 | 0.8824 |
| 0.0618 | 6.2456 | 4222 | 0.8638 | 0.8412 | 0.8638 | 0.9294 |
| 0.0618 | 6.2485 | 4224 | 0.9439 | 0.8138 | 0.9439 | 0.9716 |
| 0.0618 | 6.2515 | 4226 | 0.9660 | 0.8138 | 0.9660 | 0.9829 |
| 0.0618 | 6.2544 | 4228 | 0.9475 | 0.8138 | 0.9475 | 0.9734 |
| 0.0618 | 6.2574 | 4230 | 0.9202 | 0.8138 | 0.9202 | 0.9593 |
| 0.0618 | 6.2604 | 4232 | 0.8638 | 0.8015 | 0.8638 | 0.9294 |
| 0.0618 | 6.2633 | 4234 | 0.8018 | 0.7906 | 0.8018 | 0.8955 |
| 0.0618 | 6.2663 | 4236 | 0.7775 | 0.7906 | 0.7775 | 0.8818 |
| 0.0618 | 6.2692 | 4238 | 0.7787 | 0.8164 | 0.7787 | 0.8824 |
| 0.0618 | 6.2722 | 4240 | 0.8015 | 0.8164 | 0.8015 | 0.8953 |
| 0.0618 | 6.2751 | 4242 | 0.8417 | 0.8164 | 0.8417 | 0.9174 |
| 0.0618 | 6.2781 | 4244 | 0.9045 | 0.7902 | 0.9045 | 0.9510 |
| 0.0618 | 6.2811 | 4246 | 0.9271 | 0.7902 | 0.9271 | 0.9628 |
| 0.0618 | 6.2840 | 4248 | 0.8977 | 0.7902 | 0.8977 | 0.9475 |
| 0.0618 | 6.2870 | 4250 | 0.8399 | 0.7906 | 0.8399 | 0.9164 |
| 0.0618 | 6.2899 | 4252 | 0.7644 | 0.7717 | 0.7644 | 0.8743 |
| 0.0618 | 6.2929 | 4254 | 0.7287 | 0.7717 | 0.7287 | 0.8537 |
| 0.0618 | 6.2959 | 4256 | 0.7334 | 0.7717 | 0.7334 | 0.8564 |
| 0.0618 | 6.2988 | 4258 | 0.7651 | 0.7717 | 0.7651 | 0.8747 |
| 0.0618 | 6.3018 | 4260 | 0.8232 | 0.7525 | 0.8232 | 0.9073 |
| 0.0618 | 6.3047 | 4262 | 0.8860 | 0.7844 | 0.8860 | 0.9413 |
| 0.0618 | 6.3077 | 4264 | 0.9122 | 0.7844 | 0.9122 | 0.9551 |
| 0.0618 | 6.3107 | 4266 | 0.9031 | 0.7844 | 0.9031 | 0.9503 |
| 0.0618 | 6.3136 | 4268 | 0.8583 | 0.7631 | 0.8583 | 0.9265 |
| 0.0618 | 6.3166 | 4270 | 0.8002 | 0.7717 | 0.8002 | 0.8946 |
| 0.0618 | 6.3195 | 4272 | 0.7614 | 0.7717 | 0.7614 | 0.8726 |
| 0.0618 | 6.3225 | 4274 | 0.7499 | 0.7717 | 0.7499 | 0.8659 |
| 0.0618 | 6.3254 | 4276 | 0.7669 | 0.7717 | 0.7669 | 0.8757 |
| 0.0618 | 6.3284 | 4278 | 0.8117 | 0.7717 | 0.8117 | 0.9009 |
| 0.0618 | 6.3314 | 4280 | 0.8436 | 0.7717 | 0.8436 | 0.9185 |
| 0.0618 | 6.3343 | 4282 | 0.8931 | 0.8283 | 0.8931 | 0.9451 |
| 0.0618 | 6.3373 | 4284 | 0.9088 | 0.8283 | 0.9088 | 0.9533 |
| 0.0618 | 6.3402 | 4286 | 0.8810 | 0.8164 | 0.8810 | 0.9386 |
| 0.0618 | 6.3432 | 4288 | 0.8622 | 0.7983 | 0.8622 | 0.9285 |
| 0.0618 | 6.3462 | 4290 | 0.8516 | 0.7983 | 0.8516 | 0.9228 |
| 0.0618 | 6.3491 | 4292 | 0.8306 | 0.7983 | 0.8306 | 0.9114 |
| 0.0618 | 6.3521 | 4294 | 0.7967 | 0.7717 | 0.7967 | 0.8926 |
| 0.0618 | 6.3550 | 4296 | 0.7927 | 0.7717 | 0.7927 | 0.8903 |
| 0.0618 | 6.3580 | 4298 | 0.8169 | 0.7717 | 0.8169 | 0.9038 |
| 0.0618 | 6.3609 | 4300 | 0.8513 | 0.7525 | 0.8513 | 0.9227 |
| 0.0618 | 6.3639 | 4302 | 0.8584 | 0.7525 | 0.8584 | 0.9265 |
| 0.0618 | 6.3669 | 4304 | 0.8419 | 0.7525 | 0.8419 | 0.9176 |
| 0.0618 | 6.3698 | 4306 | 0.8314 | 0.7427 | 0.8314 | 0.9118 |
| 0.0618 | 6.3728 | 4308 | 0.8132 | 0.7427 | 0.8132 | 0.9018 |
| 0.0618 | 6.3757 | 4310 | 0.8137 | 0.7427 | 0.8137 | 0.9021 |
| 0.0618 | 6.3787 | 4312 | 0.8264 | 0.7427 | 0.8264 | 0.9091 |
| 0.0618 | 6.3817 | 4314 | 0.8466 | 0.7525 | 0.8466 | 0.9201 |
| 0.0618 | 6.3846 | 4316 | 0.8922 | 0.7525 | 0.8922 | 0.9446 |
| 0.0618 | 6.3876 | 4318 | 0.9381 | 0.7729 | 0.9381 | 0.9685 |
| 0.0618 | 6.3905 | 4320 | 0.9569 | 0.7724 | 0.9569 | 0.9782 |
| 0.0618 | 6.3935 | 4322 | 0.9639 | 0.7724 | 0.9639 | 0.9818 |
| 0.0618 | 6.3964 | 4324 | 0.9795 | 0.7724 | 0.9795 | 0.9897 |
| 0.0618 | 6.3994 | 4326 | 0.9675 | 0.7724 | 0.9675 | 0.9836 |
| 0.0618 | 6.4024 | 4328 | 0.9493 | 0.8015 | 0.9493 | 0.9743 |
| 0.0618 | 6.4053 | 4330 | 0.9175 | 0.7729 | 0.9175 | 0.9579 |
| 0.0618 | 6.4083 | 4332 | 0.8663 | 0.7525 | 0.8663 | 0.9308 |
| 0.0618 | 6.4112 | 4334 | 0.8321 | 0.7525 | 0.8321 | 0.9122 |
| 0.0618 | 6.4142 | 4336 | 0.8146 | 0.7525 | 0.8146 | 0.9025 |
| 0.0618 | 6.4172 | 4338 | 0.8186 | 0.7525 | 0.8186 | 0.9048 |
| 0.0618 | 6.4201 | 4340 | 0.8411 | 0.7525 | 0.8411 | 0.9171 |
| 0.0618 | 6.4231 | 4342 | 0.8795 | 0.72 | 0.8795 | 0.9378 |
| 0.0618 | 6.4260 | 4344 | 0.9002 | 0.72 | 0.9002 | 0.9488 |
| 0.0618 | 6.4290 | 4346 | 0.9446 | 0.7724 | 0.9446 | 0.9719 |
| 0.0618 | 6.4320 | 4348 | 0.9799 | 0.7840 | 0.9799 | 0.9899 |
| 0.0618 | 6.4349 | 4350 | 0.9811 | 0.7724 | 0.9811 | 0.9905 |
| 0.0618 | 6.4379 | 4352 | 0.9651 | 0.7724 | 0.9651 | 0.9824 |
| 0.0618 | 6.4408 | 4354 | 0.9166 | 0.7724 | 0.9166 | 0.9574 |
| 0.0618 | 6.4438 | 4356 | 0.8878 | 0.7520 | 0.8878 | 0.9422 |
| 0.0618 | 6.4467 | 4358 | 0.8855 | 0.7520 | 0.8855 | 0.9410 |
| 0.0618 | 6.4497 | 4360 | 0.9110 | 0.7520 | 0.9110 | 0.9544 |
| 0.0618 | 6.4527 | 4362 | 0.9617 | 0.7724 | 0.9617 | 0.9807 |
| 0.0618 | 6.4556 | 4364 | 0.9895 | 0.7724 | 0.9895 | 0.9947 |
| 0.0618 | 6.4586 | 4366 | 0.9956 | 0.7724 | 0.9956 | 0.9978 |
| 0.0618 | 6.4615 | 4368 | 0.9714 | 0.7724 | 0.9714 | 0.9856 |
| 0.0618 | 6.4645 | 4370 | 0.9126 | 0.7520 | 0.9126 | 0.9553 |
| 0.0618 | 6.4675 | 4372 | 0.8689 | 0.7819 | 0.8689 | 0.9321 |
| 0.0618 | 6.4704 | 4374 | 0.8606 | 0.7525 | 0.8606 | 0.9277 |
| 0.0618 | 6.4734 | 4376 | 0.8802 | 0.7525 | 0.8802 | 0.9382 |
| 0.0618 | 6.4763 | 4378 | 0.9276 | 0.7626 | 0.9276 | 0.9631 |
| 0.0618 | 6.4793 | 4380 | 0.9632 | 0.7626 | 0.9632 | 0.9814 |
| 0.0618 | 6.4822 | 4382 | 0.9936 | 0.7840 | 0.9936 | 0.9968 |
| 0.0618 | 6.4852 | 4384 | 0.9775 | 0.7840 | 0.9775 | 0.9887 |
| 0.0618 | 6.4882 | 4386 | 0.9372 | 0.8138 | 0.9372 | 0.9681 |
| 0.0618 | 6.4911 | 4388 | 0.8688 | 0.7819 | 0.8688 | 0.9321 |
| 0.0618 | 6.4941 | 4390 | 0.8390 | 0.7819 | 0.8390 | 0.9160 |
| 0.0618 | 6.4970 | 4392 | 0.8079 | 0.7819 | 0.8079 | 0.8989 |
| 0.0618 | 6.5 | 4394 | 0.7641 | 0.7713 | 0.7641 | 0.8741 |
| 0.0618 | 6.5030 | 4396 | 0.7490 | 0.7713 | 0.7490 | 0.8655 |
| 0.0618 | 6.5059 | 4398 | 0.7567 | 0.7713 | 0.7567 | 0.8699 |
| 0.0618 | 6.5089 | 4400 | 0.7497 | 0.7713 | 0.7497 | 0.8659 |
| 0.0618 | 6.5118 | 4402 | 0.7497 | 0.7713 | 0.7497 | 0.8659 |
| 0.0618 | 6.5148 | 4404 | 0.7730 | 0.7713 | 0.7730 | 0.8792 |
| 0.0618 | 6.5178 | 4406 | 0.8254 | 0.7902 | 0.8254 | 0.9085 |
| 0.0618 | 6.5207 | 4408 | 0.8634 | 0.8015 | 0.8634 | 0.9292 |
| 0.0618 | 6.5237 | 4410 | 0.8683 | 0.8015 | 0.8683 | 0.9318 |
| 0.0618 | 6.5266 | 4412 | 0.8526 | 0.8015 | 0.8526 | 0.9234 |
| 0.0618 | 6.5296 | 4414 | 0.8343 | 0.8015 | 0.8343 | 0.9134 |
| 0.0618 | 6.5325 | 4416 | 0.8119 | 0.8015 | 0.8119 | 0.9011 |
| 0.0618 | 6.5355 | 4418 | 0.7804 | 0.7525 | 0.7804 | 0.8834 |
| 0.0618 | 6.5385 | 4420 | 0.7326 | 0.7427 | 0.7326 | 0.8559 |
| 0.0618 | 6.5414 | 4422 | 0.7086 | 0.7427 | 0.7086 | 0.8418 |
| 0.0618 | 6.5444 | 4424 | 0.7011 | 0.7427 | 0.7011 | 0.8373 |
| 0.0618 | 6.5473 | 4426 | 0.7097 | 0.7427 | 0.7097 | 0.8424 |
| 0.0618 | 6.5503 | 4428 | 0.7339 | 0.7427 | 0.7339 | 0.8567 |
| 0.0618 | 6.5533 | 4430 | 0.7779 | 0.7427 | 0.7779 | 0.8820 |
| 0.0618 | 6.5562 | 4432 | 0.8245 | 0.8015 | 0.8245 | 0.9080 |
| 0.0618 | 6.5592 | 4434 | 0.8713 | 0.8138 | 0.8713 | 0.9334 |
| 0.0618 | 6.5621 | 4436 | 0.8752 | 0.8015 | 0.8752 | 0.9355 |
| 0.0618 | 6.5651 | 4438 | 0.8417 | 0.7902 | 0.8417 | 0.9174 |
| 0.0618 | 6.5680 | 4440 | 0.8117 | 0.7902 | 0.8117 | 0.9010 |
| 0.0618 | 6.5710 | 4442 | 0.7711 | 0.7713 | 0.7711 | 0.8781 |
| 0.0618 | 6.5740 | 4444 | 0.7298 | 0.7427 | 0.7298 | 0.8543 |
| 0.0618 | 6.5769 | 4446 | 0.7291 | 0.7427 | 0.7291 | 0.8539 |
| 0.0618 | 6.5799 | 4448 | 0.7399 | 0.7427 | 0.7399 | 0.8602 |
| 0.0618 | 6.5828 | 4450 | 0.7796 | 0.7713 | 0.7796 | 0.8829 |
| 0.0618 | 6.5858 | 4452 | 0.8176 | 0.7902 | 0.8176 | 0.9042 |
| 0.0618 | 6.5888 | 4454 | 0.8640 | 0.7902 | 0.8640 | 0.9295 |
| 0.0618 | 6.5917 | 4456 | 0.8799 | 0.7902 | 0.8799 | 0.9380 |
| 0.0618 | 6.5947 | 4458 | 0.8964 | 0.7902 | 0.8964 | 0.9468 |
| 0.0618 | 6.5976 | 4460 | 0.9016 | 0.7902 | 0.9016 | 0.9495 |
| 0.0618 | 6.6006 | 4462 | 0.8860 | 0.7902 | 0.8860 | 0.9413 |
| 0.0618 | 6.6036 | 4464 | 0.8399 | 0.7713 | 0.8399 | 0.9164 |
| 0.0618 | 6.6065 | 4466 | 0.7918 | 0.7713 | 0.7918 | 0.8898 |
| 0.0618 | 6.6095 | 4468 | 0.7926 | 0.7713 | 0.7926 | 0.8903 |
| 0.0618 | 6.6124 | 4470 | 0.8235 | 0.7713 | 0.8235 | 0.9075 |
| 0.0618 | 6.6154 | 4472 | 0.8503 | 0.7713 | 0.8503 | 0.9221 |
| 0.0618 | 6.6183 | 4474 | 0.8654 | 0.7713 | 0.8654 | 0.9303 |
| 0.0618 | 6.6213 | 4476 | 0.8665 | 0.7626 | 0.8665 | 0.9308 |
| 0.0618 | 6.6243 | 4478 | 0.8559 | 0.7626 | 0.8559 | 0.9252 |
| 0.0618 | 6.6272 | 4480 | 0.8714 | 0.7626 | 0.8714 | 0.9335 |
| 0.0618 | 6.6302 | 4482 | 0.8815 | 0.7626 | 0.8815 | 0.9389 |
| 0.0618 | 6.6331 | 4484 | 0.9243 | 0.7840 | 0.9243 | 0.9614 |
| 0.0618 | 6.6361 | 4486 | 0.9661 | 0.7840 | 0.9661 | 0.9829 |
| 0.0618 | 6.6391 | 4488 | 0.9981 | 0.7840 | 0.9981 | 0.9990 |
| 0.0618 | 6.6420 | 4490 | 1.0217 | 0.7840 | 1.0217 | 1.0108 |
| 0.0618 | 6.6450 | 4492 | 1.0027 | 0.7305 | 1.0027 | 1.0014 |
| 0.0618 | 6.6479 | 4494 | 0.9528 | 0.7840 | 0.9528 | 0.9761 |
| 0.0618 | 6.6509 | 4496 | 0.8722 | 0.7520 | 0.8722 | 0.9339 |
| 0.0618 | 6.6538 | 4498 | 0.7963 | 0.7819 | 0.7963 | 0.8923 |
| 0.0549 | 6.6568 | 4500 | 0.7404 | 0.7983 | 0.7404 | 0.8604 |
| 0.0549 | 6.6598 | 4502 | 0.7243 | 0.7983 | 0.7243 | 0.8510 |
| 0.0549 | 6.6627 | 4504 | 0.7438 | 0.7525 | 0.7438 | 0.8624 |
| 0.0549 | 6.6657 | 4506 | 0.7860 | 0.7525 | 0.7860 | 0.8866 |
| 0.0549 | 6.6686 | 4508 | 0.8387 | 0.7626 | 0.8387 | 0.9158 |
| 0.0549 | 6.6716 | 4510 | 0.8908 | 0.7840 | 0.8908 | 0.9438 |
| 0.0549 | 6.6746 | 4512 | 0.9419 | 0.7840 | 0.9419 | 0.9705 |
| 0.0549 | 6.6775 | 4514 | 0.9409 | 0.7840 | 0.9409 | 0.9700 |
| 0.0549 | 6.6805 | 4516 | 0.9023 | 0.7840 | 0.9023 | 0.9499 |
| 0.0549 | 6.6834 | 4518 | 0.8555 | 0.7840 | 0.8555 | 0.9250 |
| 0.0549 | 6.6864 | 4520 | 0.8063 | 0.7626 | 0.8063 | 0.8979 |
| 0.0549 | 6.6893 | 4522 | 0.7897 | 0.7296 | 0.7897 | 0.8887 |
| 0.0549 | 6.6923 | 4524 | 0.7775 | 0.7296 | 0.7775 | 0.8818 |
| 0.0549 | 6.6953 | 4526 | 0.7620 | 0.7631 | 0.7620 | 0.8730 |
| 0.0549 | 6.6982 | 4528 | 0.7485 | 0.7631 | 0.7485 | 0.8651 |
| 0.0549 | 6.7012 | 4530 | 0.7307 | 0.7631 | 0.7307 | 0.8548 |
| 0.0549 | 6.7041 | 4532 | 0.7056 | 0.7823 | 0.7056 | 0.8400 |
| 0.0549 | 6.7071 | 4534 | 0.6903 | 0.7823 | 0.6903 | 0.8308 |
| 0.0549 | 6.7101 | 4536 | 0.7052 | 0.8094 | 0.7052 | 0.8397 |
| 0.0549 | 6.7130 | 4538 | 0.7270 | 0.8283 | 0.7270 | 0.8527 |
| 0.0549 | 6.7160 | 4540 | 0.7480 | 0.8412 | 0.7480 | 0.8649 |
| 0.0549 | 6.7189 | 4542 | 0.7647 | 0.8412 | 0.7647 | 0.8745 |
| 0.0549 | 6.7219 | 4544 | 0.7613 | 0.8412 | 0.7613 | 0.8725 |
| 0.0549 | 6.7249 | 4546 | 0.7743 | 0.8412 | 0.7743 | 0.8800 |
| 0.0549 | 6.7278 | 4548 | 0.7732 | 0.8412 | 0.7732 | 0.8793 |
| 0.0549 | 6.7308 | 4550 | 0.8008 | 0.8412 | 0.8008 | 0.8949 |
| 0.0549 | 6.7337 | 4552 | 0.8441 | 0.8138 | 0.8441 | 0.9188 |
| 0.0549 | 6.7367 | 4554 | 0.8645 | 0.8138 | 0.8645 | 0.9298 |
| 0.0549 | 6.7396 | 4556 | 0.8600 | 0.8138 | 0.8600 | 0.9274 |
| 0.0549 | 6.7426 | 4558 | 0.8315 | 0.8138 | 0.8315 | 0.9119 |
| 0.0549 | 6.7456 | 4560 | 0.8161 | 0.8412 | 0.8161 | 0.9034 |
| 0.0549 | 6.7485 | 4562 | 0.8246 | 0.8138 | 0.8246 | 0.9081 |
| 0.0549 | 6.7515 | 4564 | 0.8455 | 0.7840 | 0.8455 | 0.9195 |
| 0.0549 | 6.7544 | 4566 | 0.8351 | 0.8138 | 0.8351 | 0.9139 |
| 0.0549 | 6.7574 | 4568 | 0.8266 | 0.8412 | 0.8266 | 0.9092 |
| 0.0549 | 6.7604 | 4570 | 0.8379 | 0.8412 | 0.8379 | 0.9154 |
| 0.0549 | 6.7633 | 4572 | 0.8597 | 0.8412 | 0.8597 | 0.9272 |
| 0.0549 | 6.7663 | 4574 | 0.8581 | 0.8412 | 0.8581 | 0.9263 |
| 0.0549 | 6.7692 | 4576 | 0.8819 | 0.8138 | 0.8819 | 0.9391 |
| 0.0549 | 6.7722 | 4578 | 0.9142 | 0.7840 | 0.9142 | 0.9561 |
| 0.0549 | 6.7751 | 4580 | 0.9281 | 0.7840 | 0.9281 | 0.9634 |
| 0.0549 | 6.7781 | 4582 | 0.9276 | 0.7840 | 0.9276 | 0.9631 |
| 0.0549 | 6.7811 | 4584 | 0.9251 | 0.7840 | 0.9251 | 0.9618 |
| 0.0549 | 6.7840 | 4586 | 0.9118 | 0.7840 | 0.9118 | 0.9549 |
| 0.0549 | 6.7870 | 4588 | 0.8836 | 0.7840 | 0.8836 | 0.9400 |
| 0.0549 | 6.7899 | 4590 | 0.8736 | 0.7840 | 0.8736 | 0.9346 |
| 0.0549 | 6.7929 | 4592 | 0.8614 | 0.7840 | 0.8614 | 0.9281 |
| 0.0549 | 6.7959 | 4594 | 0.8519 | 0.7840 | 0.8519 | 0.9230 |
| 0.0549 | 6.7988 | 4596 | 0.8339 | 0.7840 | 0.8339 | 0.9132 |
| 0.0549 | 6.8018 | 4598 | 0.8378 | 0.7840 | 0.8378 | 0.9153 |
| 0.0549 | 6.8047 | 4600 | 0.8696 | 0.7840 | 0.8696 | 0.9325 |
| 0.0549 | 6.8077 | 4602 | 0.9205 | 0.7840 | 0.9205 | 0.9594 |
| 0.0549 | 6.8107 | 4604 | 0.9711 | 0.7840 | 0.9711 | 0.9855 |
| 0.0549 | 6.8136 | 4606 | 1.0190 | 0.7621 | 1.0190 | 1.0095 |
| 0.0549 | 6.8166 | 4608 | 1.0164 | 0.7621 | 1.0164 | 1.0082 |
| 0.0549 | 6.8195 | 4610 | 1.0042 | 0.7621 | 1.0042 | 1.0021 |
| 0.0549 | 6.8225 | 4612 | 0.9927 | 0.7840 | 0.9927 | 0.9964 |
| 0.0549 | 6.8254 | 4614 | 0.9525 | 0.7840 | 0.9525 | 0.9759 |
| 0.0549 | 6.8284 | 4616 | 0.9276 | 0.7840 | 0.9276 | 0.9631 |
| 0.0549 | 6.8314 | 4618 | 0.8828 | 0.7840 | 0.8828 | 0.9396 |
| 0.0549 | 6.8343 | 4620 | 0.8656 | 0.7840 | 0.8656 | 0.9304 |
| 0.0549 | 6.8373 | 4622 | 0.8662 | 0.7840 | 0.8662 | 0.9307 |
| 0.0549 | 6.8402 | 4624 | 0.8916 | 0.7840 | 0.8916 | 0.9442 |
| 0.0549 | 6.8432 | 4626 | 0.9022 | 0.7840 | 0.9022 | 0.9498 |
| 0.0549 | 6.8462 | 4628 | 0.8867 | 0.7840 | 0.8867 | 0.9416 |
| 0.0549 | 6.8491 | 4630 | 0.8416 | 0.7724 | 0.8416 | 0.9174 |
| 0.0549 | 6.8521 | 4632 | 0.7946 | 0.8283 | 0.7946 | 0.8914 |
| 0.0549 | 6.8550 | 4634 | 0.7730 | 0.8094 | 0.7730 | 0.8792 |
| 0.0549 | 6.8580 | 4636 | 0.7702 | 0.8094 | 0.7702 | 0.8776 |
| 0.0549 | 6.8609 | 4638 | 0.7928 | 0.7724 | 0.7928 | 0.8904 |
| 0.0549 | 6.8639 | 4640 | 0.8415 | 0.7724 | 0.8415 | 0.9173 |
| 0.0549 | 6.8669 | 4642 | 0.9122 | 0.7840 | 0.9122 | 0.9551 |
| 0.0549 | 6.8698 | 4644 | 0.9748 | 0.7840 | 0.9748 | 0.9873 |
| 0.0549 | 6.8728 | 4646 | 1.0289 | 0.7621 | 1.0289 | 1.0144 |
| 0.0549 | 6.8757 | 4648 | 1.0459 | 0.7621 | 1.0459 | 1.0227 |
| 0.0549 | 6.8787 | 4650 | 1.0220 | 0.7621 | 1.0220 | 1.0109 |
| 0.0549 | 6.8817 | 4652 | 0.9670 | 0.7840 | 0.9670 | 0.9834 |
| 0.0549 | 6.8846 | 4654 | 0.9203 | 0.7840 | 0.9203 | 0.9593 |
| 0.0549 | 6.8876 | 4656 | 0.8649 | 0.7724 | 0.8649 | 0.9300 |
| 0.0549 | 6.8905 | 4658 | 0.7921 | 0.7724 | 0.7921 | 0.8900 |
| 0.0549 | 6.8935 | 4660 | 0.7594 | 0.7525 | 0.7594 | 0.8714 |
| 0.0549 | 6.8964 | 4662 | 0.7520 | 0.7525 | 0.7520 | 0.8672 |
| 0.0549 | 6.8994 | 4664 | 0.7381 | 0.7525 | 0.7381 | 0.8592 |
| 0.0549 | 6.9024 | 4666 | 0.7423 | 0.7525 | 0.7423 | 0.8615 |
| 0.0549 | 6.9053 | 4668 | 0.7716 | 0.72 | 0.7716 | 0.8784 |
| 0.0549 | 6.9083 | 4670 | 0.8169 | 0.7296 | 0.8169 | 0.9038 |
| 0.0549 | 6.9112 | 4672 | 0.8611 | 0.7296 | 0.8611 | 0.9280 |
| 0.0549 | 6.9142 | 4674 | 0.9014 | 0.7840 | 0.9014 | 0.9494 |
| 0.0549 | 6.9172 | 4676 | 0.9400 | 0.7840 | 0.9400 | 0.9695 |
| 0.0549 | 6.9201 | 4678 | 0.9517 | 0.7840 | 0.9517 | 0.9755 |
| 0.0549 | 6.9231 | 4680 | 0.9722 | 0.7840 | 0.9722 | 0.9860 |
| 0.0549 | 6.9260 | 4682 | 0.9580 | 0.7840 | 0.9580 | 0.9788 |
| 0.0549 | 6.9290 | 4684 | 0.9146 | 0.7840 | 0.9146 | 0.9563 |
| 0.0549 | 6.9320 | 4686 | 0.8571 | 0.7626 | 0.8571 | 0.9258 |
| 0.0549 | 6.9349 | 4688 | 0.8050 | 0.7819 | 0.8050 | 0.8972 |
| 0.0549 | 6.9379 | 4690 | 0.7811 | 0.7819 | 0.7811 | 0.8838 |
| 0.0549 | 6.9408 | 4692 | 0.7714 | 0.7819 | 0.7714 | 0.8783 |
| 0.0549 | 6.9438 | 4694 | 0.7736 | 0.7819 | 0.7736 | 0.8795 |
| 0.0549 | 6.9467 | 4696 | 0.7760 | 0.7819 | 0.7760 | 0.8809 |
| 0.0549 | 6.9497 | 4698 | 0.7668 | 0.7819 | 0.7668 | 0.8757 |
| 0.0549 | 6.9527 | 4700 | 0.7751 | 0.8015 | 0.7751 | 0.8804 |
| 0.0549 | 6.9556 | 4702 | 0.7947 | 0.8015 | 0.7947 | 0.8915 |
| 0.0549 | 6.9586 | 4704 | 0.8089 | 0.8015 | 0.8089 | 0.8994 |
| 0.0549 | 6.9615 | 4706 | 0.8234 | 0.8015 | 0.8234 | 0.9074 |
| 0.0549 | 6.9645 | 4708 | 0.8125 | 0.8015 | 0.8125 | 0.9014 |
| 0.0549 | 6.9675 | 4710 | 0.8033 | 0.8015 | 0.8033 | 0.8963 |
| 0.0549 | 6.9704 | 4712 | 0.8225 | 0.8138 | 0.8225 | 0.9069 |
| 0.0549 | 6.9734 | 4714 | 0.8542 | 0.8138 | 0.8542 | 0.9242 |
| 0.0549 | 6.9763 | 4716 | 0.8661 | 0.7840 | 0.8661 | 0.9306 |
| 0.0549 | 6.9793 | 4718 | 0.8504 | 0.7840 | 0.8504 | 0.9222 |
| 0.0549 | 6.9822 | 4720 | 0.8383 | 0.8138 | 0.8383 | 0.9156 |
| 0.0549 | 6.9852 | 4722 | 0.8020 | 0.7729 | 0.8020 | 0.8955 |
| 0.0549 | 6.9882 | 4724 | 0.7727 | 0.7525 | 0.7727 | 0.8790 |
| 0.0549 | 6.9911 | 4726 | 0.7585 | 0.7525 | 0.7585 | 0.8709 |
| 0.0549 | 6.9941 | 4728 | 0.7473 | 0.7819 | 0.7473 | 0.8645 |
| 0.0549 | 6.9970 | 4730 | 0.7486 | 0.7713 | 0.7486 | 0.8652 |
| 0.0549 | 7.0 | 4732 | 0.7637 | 0.7819 | 0.7637 | 0.8739 |
| 0.0549 | 7.0030 | 4734 | 0.7936 | 0.8015 | 0.7936 | 0.8909 |
| 0.0549 | 7.0059 | 4736 | 0.8245 | 0.8015 | 0.8245 | 0.9080 |
| 0.0549 | 7.0089 | 4738 | 0.8737 | 0.7724 | 0.8737 | 0.9347 |
| 0.0549 | 7.0118 | 4740 | 0.8947 | 0.7724 | 0.8947 | 0.9459 |
| 0.0549 | 7.0148 | 4742 | 0.8758 | 0.7724 | 0.8758 | 0.9358 |
| 0.0549 | 7.0178 | 4744 | 0.8438 | 0.7724 | 0.8438 | 0.9186 |
| 0.0549 | 7.0207 | 4746 | 0.8171 | 0.8015 | 0.8171 | 0.9040 |
| 0.0549 | 7.0237 | 4748 | 0.8074 | 0.8015 | 0.8074 | 0.8985 |
| 0.0549 | 7.0266 | 4750 | 0.8108 | 0.7520 | 0.8108 | 0.9004 |
| 0.0549 | 7.0296 | 4752 | 0.7981 | 0.7819 | 0.7981 | 0.8934 |
| 0.0549 | 7.0325 | 4754 | 0.8114 | 0.7724 | 0.8114 | 0.9008 |
| 0.0549 | 7.0355 | 4756 | 0.8371 | 0.7724 | 0.8371 | 0.9149 |
| 0.0549 | 7.0385 | 4758 | 0.8665 | 0.7724 | 0.8665 | 0.9309 |
| 0.0549 | 7.0414 | 4760 | 0.8820 | 0.7724 | 0.8820 | 0.9392 |
| 0.0549 | 7.0444 | 4762 | 0.8763 | 0.7724 | 0.8763 | 0.9361 |
| 0.0549 | 7.0473 | 4764 | 0.8711 | 0.7724 | 0.8711 | 0.9333 |
| 0.0549 | 7.0503 | 4766 | 0.8389 | 0.8015 | 0.8389 | 0.9159 |
| 0.0549 | 7.0533 | 4768 | 0.8040 | 0.8283 | 0.8040 | 0.8967 |
| 0.0549 | 7.0562 | 4770 | 0.7820 | 0.8283 | 0.7820 | 0.8843 |
| 0.0549 | 7.0592 | 4772 | 0.7627 | 0.8283 | 0.7627 | 0.8733 |
| 0.0549 | 7.0621 | 4774 | 0.7294 | 0.7823 | 0.7294 | 0.8540 |
| 0.0549 | 7.0651 | 4776 | 0.7017 | 0.7823 | 0.7017 | 0.8377 |
| 0.0549 | 7.0680 | 4778 | 0.6896 | 0.7823 | 0.6896 | 0.8304 |
| 0.0549 | 7.0710 | 4780 | 0.6974 | 0.7823 | 0.6974 | 0.8351 |
| 0.0549 | 7.0740 | 4782 | 0.7210 | 0.7823 | 0.7210 | 0.8491 |
| 0.0549 | 7.0769 | 4784 | 0.7562 | 0.8094 | 0.7562 | 0.8696 |
| 0.0549 | 7.0799 | 4786 | 0.8090 | 0.8094 | 0.8090 | 0.8994 |
| 0.0549 | 7.0828 | 4788 | 0.8711 | 0.7724 | 0.8711 | 0.9333 |
| 0.0549 | 7.0858 | 4790 | 0.8960 | 0.7840 | 0.8960 | 0.9466 |
| 0.0549 | 7.0888 | 4792 | 0.8849 | 0.7840 | 0.8849 | 0.9407 |
| 0.0549 | 7.0917 | 4794 | 0.8823 | 0.7840 | 0.8823 | 0.9393 |
| 0.0549 | 7.0947 | 4796 | 0.8602 | 0.7626 | 0.8602 | 0.9275 |
| 0.0549 | 7.0976 | 4798 | 0.8329 | 0.7296 | 0.8329 | 0.9126 |
| 0.0549 | 7.1006 | 4800 | 0.8058 | 0.7296 | 0.8058 | 0.8977 |
| 0.0549 | 7.1036 | 4802 | 0.7875 | 0.72 | 0.7875 | 0.8874 |
| 0.0549 | 7.1065 | 4804 | 0.7718 | 0.72 | 0.7718 | 0.8785 |
| 0.0549 | 7.1095 | 4806 | 0.7489 | 0.72 | 0.7489 | 0.8654 |
| 0.0549 | 7.1124 | 4808 | 0.7513 | 0.7112 | 0.7513 | 0.8668 |
| 0.0549 | 7.1154 | 4810 | 0.7549 | 0.7112 | 0.7549 | 0.8688 |
| 0.0549 | 7.1183 | 4812 | 0.7747 | 0.7422 | 0.7747 | 0.8801 |
| 0.0549 | 7.1213 | 4814 | 0.8141 | 0.7520 | 0.8141 | 0.9023 |
| 0.0549 | 7.1243 | 4816 | 0.8626 | 0.7724 | 0.8626 | 0.9288 |
| 0.0549 | 7.1272 | 4818 | 0.8817 | 0.7724 | 0.8817 | 0.9390 |
| 0.0549 | 7.1302 | 4820 | 0.8743 | 0.7626 | 0.8743 | 0.9350 |
| 0.0549 | 7.1331 | 4822 | 0.8820 | 0.7626 | 0.8820 | 0.9391 |
| 0.0549 | 7.1361 | 4824 | 0.8843 | 0.7626 | 0.8843 | 0.9404 |
| 0.0549 | 7.1391 | 4826 | 0.8642 | 0.7626 | 0.8642 | 0.9296 |
| 0.0549 | 7.1420 | 4828 | 0.8493 | 0.7296 | 0.8493 | 0.9216 |
| 0.0549 | 7.1450 | 4830 | 0.8178 | 0.7296 | 0.8178 | 0.9043 |
| 0.0549 | 7.1479 | 4832 | 0.7860 | 0.7112 | 0.7860 | 0.8866 |
| 0.0549 | 7.1509 | 4834 | 0.7820 | 0.7112 | 0.7820 | 0.8843 |
| 0.0549 | 7.1538 | 4836 | 0.7627 | 0.7112 | 0.7627 | 0.8733 |
| 0.0549 | 7.1568 | 4838 | 0.7328 | 0.7427 | 0.7328 | 0.8561 |
| 0.0549 | 7.1598 | 4840 | 0.7226 | 0.7427 | 0.7226 | 0.8501 |
| 0.0549 | 7.1627 | 4842 | 0.7319 | 0.7427 | 0.7319 | 0.8555 |
| 0.0549 | 7.1657 | 4844 | 0.7584 | 0.7112 | 0.7584 | 0.8708 |
| 0.0549 | 7.1686 | 4846 | 0.8041 | 0.7112 | 0.8041 | 0.8967 |
| 0.0549 | 7.1716 | 4848 | 0.8570 | 0.7618 | 0.8570 | 0.9258 |
| 0.0549 | 7.1746 | 4850 | 0.8788 | 0.7618 | 0.8788 | 0.9374 |
| 0.0549 | 7.1775 | 4852 | 0.8736 | 0.7618 | 0.8736 | 0.9346 |
| 0.0549 | 7.1805 | 4854 | 0.8402 | 0.7422 | 0.8402 | 0.9166 |
| 0.0549 | 7.1834 | 4856 | 0.7920 | 0.7112 | 0.7920 | 0.8899 |
| 0.0549 | 7.1864 | 4858 | 0.7654 | 0.7112 | 0.7654 | 0.8749 |
| 0.0549 | 7.1893 | 4860 | 0.7687 | 0.7112 | 0.7687 | 0.8767 |
| 0.0549 | 7.1923 | 4862 | 0.7966 | 0.7422 | 0.7966 | 0.8925 |
| 0.0549 | 7.1953 | 4864 | 0.8376 | 0.7422 | 0.8376 | 0.9152 |
| 0.0549 | 7.1982 | 4866 | 0.8865 | 0.7618 | 0.8865 | 0.9415 |
| 0.0549 | 7.2012 | 4868 | 0.9240 | 0.7724 | 0.9240 | 0.9612 |
| 0.0549 | 7.2041 | 4870 | 0.9404 | 0.7724 | 0.9404 | 0.9697 |
| 0.0549 | 7.2071 | 4872 | 0.9358 | 0.7840 | 0.9358 | 0.9673 |
| 0.0549 | 7.2101 | 4874 | 0.9416 | 0.7840 | 0.9416 | 0.9704 |
| 0.0549 | 7.2130 | 4876 | 0.9330 | 0.7840 | 0.9330 | 0.9659 |
| 0.0549 | 7.2160 | 4878 | 0.9116 | 0.7840 | 0.9116 | 0.9548 |
| 0.0549 | 7.2189 | 4880 | 0.8997 | 0.7840 | 0.8997 | 0.9485 |
| 0.0549 | 7.2219 | 4882 | 0.8914 | 0.7626 | 0.8914 | 0.9442 |
| 0.0549 | 7.2249 | 4884 | 0.8706 | 0.7296 | 0.8706 | 0.9331 |
| 0.0549 | 7.2278 | 4886 | 0.8296 | 0.7296 | 0.8296 | 0.9108 |
| 0.0549 | 7.2308 | 4888 | 0.8102 | 0.7296 | 0.8102 | 0.9001 |
| 0.0549 | 7.2337 | 4890 | 0.7892 | 0.7112 | 0.7892 | 0.8884 |
| 0.0549 | 7.2367 | 4892 | 0.7582 | 0.7112 | 0.7582 | 0.8707 |
| 0.0549 | 7.2396 | 4894 | 0.7320 | 0.7112 | 0.7320 | 0.8556 |
| 0.0549 | 7.2426 | 4896 | 0.7105 | 0.7112 | 0.7105 | 0.8429 |
| 0.0549 | 7.2456 | 4898 | 0.6965 | 0.7112 | 0.6965 | 0.8345 |
| 0.0549 | 7.2485 | 4900 | 0.6960 | 0.7112 | 0.6960 | 0.8343 |
| 0.0549 | 7.2515 | 4902 | 0.7096 | 0.7112 | 0.7096 | 0.8424 |
| 0.0549 | 7.2544 | 4904 | 0.7071 | 0.7112 | 0.7071 | 0.8409 |
| 0.0549 | 7.2574 | 4906 | 0.7183 | 0.7112 | 0.7183 | 0.8475 |
| 0.0549 | 7.2604 | 4908 | 0.7499 | 0.7422 | 0.7499 | 0.8660 |
| 0.0549 | 7.2633 | 4910 | 0.7955 | 0.7422 | 0.7955 | 0.8919 |
| 0.0549 | 7.2663 | 4912 | 0.8364 | 0.7840 | 0.8364 | 0.9146 |
| 0.0549 | 7.2692 | 4914 | 0.8812 | 0.7840 | 0.8812 | 0.9387 |
| 0.0549 | 7.2722 | 4916 | 0.9046 | 0.7840 | 0.9046 | 0.9511 |
| 0.0549 | 7.2751 | 4918 | 0.9104 | 0.7840 | 0.9104 | 0.9541 |
| 0.0549 | 7.2781 | 4920 | 0.8984 | 0.7840 | 0.8984 | 0.9478 |
| 0.0549 | 7.2811 | 4922 | 0.8633 | 0.7840 | 0.8633 | 0.9292 |
| 0.0549 | 7.2840 | 4924 | 0.8091 | 0.7422 | 0.8091 | 0.8995 |
| 0.0549 | 7.2870 | 4926 | 0.7670 | 0.7112 | 0.7670 | 0.8758 |
| 0.0549 | 7.2899 | 4928 | 0.7482 | 0.7112 | 0.7482 | 0.8650 |
| 0.0549 | 7.2929 | 4930 | 0.7479 | 0.7112 | 0.7479 | 0.8648 |
| 0.0549 | 7.2959 | 4932 | 0.7450 | 0.7112 | 0.7450 | 0.8631 |
| 0.0549 | 7.2988 | 4934 | 0.7623 | 0.7112 | 0.7623 | 0.8731 |
| 0.0549 | 7.3018 | 4936 | 0.7740 | 0.7112 | 0.7740 | 0.8798 |
| 0.0549 | 7.3047 | 4938 | 0.7942 | 0.7520 | 0.7942 | 0.8912 |
| 0.0549 | 7.3077 | 4940 | 0.8235 | 0.7626 | 0.8235 | 0.9075 |
| 0.0549 | 7.3107 | 4942 | 0.8474 | 0.7626 | 0.8474 | 0.9205 |
| 0.0549 | 7.3136 | 4944 | 0.8528 | 0.7626 | 0.8528 | 0.9235 |
| 0.0549 | 7.3166 | 4946 | 0.8657 | 0.7840 | 0.8657 | 0.9305 |
| 0.0549 | 7.3195 | 4948 | 0.8648 | 0.7840 | 0.8648 | 0.9299 |
| 0.0549 | 7.3225 | 4950 | 0.8492 | 0.7626 | 0.8492 | 0.9215 |
| 0.0549 | 7.3254 | 4952 | 0.8213 | 0.7520 | 0.8213 | 0.9063 |
| 0.0549 | 7.3284 | 4954 | 0.7908 | 0.7520 | 0.7908 | 0.8892 |
| 0.0549 | 7.3314 | 4956 | 0.7701 | 0.7112 | 0.7701 | 0.8776 |
| 0.0549 | 7.3343 | 4958 | 0.7712 | 0.7112 | 0.7712 | 0.8782 |
| 0.0549 | 7.3373 | 4960 | 0.7653 | 0.7112 | 0.7653 | 0.8748 |
| 0.0549 | 7.3402 | 4962 | 0.7608 | 0.7112 | 0.7608 | 0.8722 |
| 0.0549 | 7.3432 | 4964 | 0.7671 | 0.7112 | 0.7671 | 0.8758 |
| 0.0549 | 7.3462 | 4966 | 0.7633 | 0.7112 | 0.7633 | 0.8736 |
| 0.0549 | 7.3491 | 4968 | 0.7743 | 0.7422 | 0.7743 | 0.8799 |
| 0.0549 | 7.3521 | 4970 | 0.7858 | 0.7422 | 0.7858 | 0.8864 |
| 0.0549 | 7.3550 | 4972 | 0.7987 | 0.7422 | 0.7987 | 0.8937 |
| 0.0549 | 7.3580 | 4974 | 0.7969 | 0.7422 | 0.7969 | 0.8927 |
| 0.0549 | 7.3609 | 4976 | 0.7931 | 0.7618 | 0.7931 | 0.8906 |
| 0.0549 | 7.3639 | 4978 | 0.8001 | 0.7724 | 0.8001 | 0.8945 |
| 0.0549 | 7.3669 | 4980 | 0.7922 | 0.7520 | 0.7922 | 0.8901 |
| 0.0549 | 7.3698 | 4982 | 0.7730 | 0.72 | 0.7730 | 0.8792 |
| 0.0549 | 7.3728 | 4984 | 0.7525 | 0.7427 | 0.7525 | 0.8675 |
| 0.0549 | 7.3757 | 4986 | 0.7481 | 0.7427 | 0.7481 | 0.8649 |
| 0.0549 | 7.3787 | 4988 | 0.7471 | 0.7427 | 0.7471 | 0.8643 |
| 0.0549 | 7.3817 | 4990 | 0.7342 | 0.7427 | 0.7342 | 0.8568 |
| 0.0549 | 7.3846 | 4992 | 0.7144 | 0.7717 | 0.7144 | 0.8452 |
| 0.0549 | 7.3876 | 4994 | 0.7037 | 0.7717 | 0.7037 | 0.8388 |
| 0.0549 | 7.3905 | 4996 | 0.7152 | 0.7717 | 0.7152 | 0.8457 |
| 0.0549 | 7.3935 | 4998 | 0.7492 | 0.7427 | 0.7492 | 0.8656 |
| 0.0513 | 7.3964 | 5000 | 0.7790 | 0.7623 | 0.7790 | 0.8826 |
| 0.0513 | 7.3994 | 5002 | 0.8123 | 0.7902 | 0.8123 | 0.9013 |
| 0.0513 | 7.4024 | 5004 | 0.8130 | 0.7623 | 0.8130 | 0.9017 |
| 0.0513 | 7.4053 | 5006 | 0.8059 | 0.7623 | 0.8059 | 0.8977 |
| 0.0513 | 7.4083 | 5008 | 0.7905 | 0.7427 | 0.7905 | 0.8891 |
| 0.0513 | 7.4112 | 5010 | 0.7988 | 0.7631 | 0.7988 | 0.8938 |
| 0.0513 | 7.4142 | 5012 | 0.7921 | 0.7631 | 0.7921 | 0.8900 |
| 0.0513 | 7.4172 | 5014 | 0.7952 | 0.7631 | 0.7952 | 0.8917 |
| 0.0513 | 7.4201 | 5016 | 0.7906 | 0.7631 | 0.7906 | 0.8892 |
| 0.0513 | 7.4231 | 5018 | 0.7928 | 0.7296 | 0.7928 | 0.8904 |
| 0.0513 | 7.4260 | 5020 | 0.7997 | 0.7296 | 0.7997 | 0.8943 |
| 0.0513 | 7.4290 | 5022 | 0.8024 | 0.72 | 0.8024 | 0.8958 |
| 0.0513 | 7.4320 | 5024 | 0.8094 | 0.72 | 0.8094 | 0.8997 |
| 0.0513 | 7.4349 | 5026 | 0.8300 | 0.7296 | 0.8300 | 0.9110 |
| 0.0513 | 7.4379 | 5028 | 0.8530 | 0.7296 | 0.8530 | 0.9236 |
| 0.0513 | 7.4408 | 5030 | 0.8584 | 0.7296 | 0.8584 | 0.9265 |
| 0.0513 | 7.4438 | 5032 | 0.8528 | 0.7296 | 0.8528 | 0.9235 |
| 0.0513 | 7.4467 | 5034 | 0.8670 | 0.7296 | 0.8670 | 0.9311 |
| 0.0513 | 7.4497 | 5036 | 0.8847 | 0.7519 | 0.8847 | 0.9406 |
| 0.0513 | 7.4527 | 5038 | 0.8920 | 0.7840 | 0.8920 | 0.9445 |
| 0.0513 | 7.4556 | 5040 | 0.8879 | 0.7840 | 0.8879 | 0.9423 |
| 0.0513 | 7.4586 | 5042 | 0.8744 | 0.7519 | 0.8744 | 0.9351 |
| 0.0513 | 7.4615 | 5044 | 0.8643 | 0.7296 | 0.8643 | 0.9297 |
| 0.0513 | 7.4645 | 5046 | 0.8629 | 0.7296 | 0.8629 | 0.9289 |
| 0.0513 | 7.4675 | 5048 | 0.8636 | 0.7296 | 0.8636 | 0.9293 |
| 0.0513 | 7.4704 | 5050 | 0.8558 | 0.7296 | 0.8558 | 0.9251 |
| 0.0513 | 7.4734 | 5052 | 0.8681 | 0.7296 | 0.8681 | 0.9317 |
| 0.0513 | 7.4763 | 5054 | 0.8731 | 0.7296 | 0.8731 | 0.9344 |
| 0.0513 | 7.4793 | 5056 | 0.8683 | 0.7296 | 0.8683 | 0.9318 |
| 0.0513 | 7.4822 | 5058 | 0.8784 | 0.7296 | 0.8784 | 0.9372 |
| 0.0513 | 7.4852 | 5060 | 0.9076 | 0.7519 | 0.9076 | 0.9527 |
| 0.0513 | 7.4882 | 5062 | 0.9444 | 0.7840 | 0.9444 | 0.9718 |
| 0.0513 | 7.4911 | 5064 | 0.9580 | 0.7840 | 0.9580 | 0.9788 |
| 0.0513 | 7.4941 | 5066 | 0.9438 | 0.7840 | 0.9438 | 0.9715 |
| 0.0513 | 7.4970 | 5068 | 0.9137 | 0.7840 | 0.9137 | 0.9559 |
| 0.0513 | 7.5 | 5070 | 0.8980 | 0.7840 | 0.8980 | 0.9476 |
| 0.0513 | 7.5030 | 5072 | 0.8968 | 0.7840 | 0.8968 | 0.9470 |
| 0.0513 | 7.5059 | 5074 | 0.8976 | 0.7840 | 0.8976 | 0.9474 |
| 0.0513 | 7.5089 | 5076 | 0.8870 | 0.7519 | 0.8870 | 0.9418 |
| 0.0513 | 7.5118 | 5078 | 0.8808 | 0.7519 | 0.8808 | 0.9385 |
| 0.0513 | 7.5148 | 5080 | 0.8896 | 0.7519 | 0.8896 | 0.9432 |
| 0.0513 | 7.5178 | 5082 | 0.8833 | 0.7519 | 0.8833 | 0.9398 |
| 0.0513 | 7.5207 | 5084 | 0.8941 | 0.7519 | 0.8941 | 0.9456 |
| 0.0513 | 7.5237 | 5086 | 0.9216 | 0.7519 | 0.9216 | 0.9600 |
| 0.0513 | 7.5266 | 5088 | 0.9488 | 0.7519 | 0.9488 | 0.9741 |
| 0.0513 | 7.5296 | 5090 | 0.9757 | 0.7519 | 0.9757 | 0.9878 |
| 0.0513 | 7.5325 | 5092 | 0.9863 | 0.7519 | 0.9863 | 0.9931 |
| 0.0513 | 7.5355 | 5094 | 0.9746 | 0.7519 | 0.9746 | 0.9872 |
| 0.0513 | 7.5385 | 5096 | 0.9432 | 0.7519 | 0.9432 | 0.9712 |
| 0.0513 | 7.5414 | 5098 | 0.9148 | 0.7519 | 0.9148 | 0.9565 |
| 0.0513 | 7.5444 | 5100 | 0.9067 | 0.7519 | 0.9067 | 0.9522 |
| 0.0513 | 7.5473 | 5102 | 0.8874 | 0.7519 | 0.8874 | 0.9420 |
| 0.0513 | 7.5503 | 5104 | 0.8609 | 0.7296 | 0.8609 | 0.9278 |
| 0.0513 | 7.5533 | 5106 | 0.8382 | 0.7296 | 0.8382 | 0.9155 |
| 0.0513 | 7.5562 | 5108 | 0.8344 | 0.7296 | 0.8344 | 0.9135 |
| 0.0513 | 7.5592 | 5110 | 0.8570 | 0.7296 | 0.8570 | 0.9257 |
| 0.0513 | 7.5621 | 5112 | 0.8925 | 0.7296 | 0.8925 | 0.9447 |
| 0.0513 | 7.5651 | 5114 | 0.9403 | 0.7519 | 0.9403 | 0.9697 |
| 0.0513 | 7.5680 | 5116 | 0.9732 | 0.7519 | 0.9732 | 0.9865 |
| 0.0513 | 7.5710 | 5118 | 0.9905 | 0.7519 | 0.9905 | 0.9952 |
| 0.0513 | 7.5740 | 5120 | 1.0105 | 0.7840 | 1.0105 | 1.0052 |
| 0.0513 | 7.5769 | 5122 | 1.0106 | 0.7840 | 1.0106 | 1.0053 |
| 0.0513 | 7.5799 | 5124 | 0.9867 | 0.7840 | 0.9867 | 0.9933 |
| 0.0513 | 7.5828 | 5126 | 0.9611 | 0.7519 | 0.9611 | 0.9804 |
| 0.0513 | 7.5858 | 5128 | 0.9355 | 0.7519 | 0.9355 | 0.9672 |
| 0.0513 | 7.5888 | 5130 | 0.9121 | 0.7519 | 0.9121 | 0.9550 |
| 0.0513 | 7.5917 | 5132 | 0.9045 | 0.7519 | 0.9045 | 0.9511 |
| 0.0513 | 7.5947 | 5134 | 0.9002 | 0.7519 | 0.9002 | 0.9488 |
| 0.0513 | 7.5976 | 5136 | 0.8894 | 0.7519 | 0.8894 | 0.9431 |
| 0.0513 | 7.6006 | 5138 | 0.8953 | 0.7519 | 0.8953 | 0.9462 |
| 0.0513 | 7.6036 | 5140 | 0.9232 | 0.7840 | 0.9232 | 0.9608 |
| 0.0513 | 7.6065 | 5142 | 0.9312 | 0.7840 | 0.9312 | 0.9650 |
| 0.0513 | 7.6095 | 5144 | 0.9380 | 0.7519 | 0.9380 | 0.9685 |
| 0.0513 | 7.6124 | 5146 | 0.9408 | 0.7519 | 0.9408 | 0.9699 |
| 0.0513 | 7.6154 | 5148 | 0.9631 | 0.7519 | 0.9631 | 0.9814 |
| 0.0513 | 7.6183 | 5150 | 0.9985 | 0.7840 | 0.9985 | 0.9992 |
| 0.0513 | 7.6213 | 5152 | 1.0390 | 0.7840 | 1.0390 | 1.0193 |
| 0.0513 | 7.6243 | 5154 | 1.0423 | 0.7840 | 1.0423 | 1.0209 |
| 0.0513 | 7.6272 | 5156 | 1.0147 | 0.7519 | 1.0147 | 1.0073 |
| 0.0513 | 7.6302 | 5158 | 0.9678 | 0.7519 | 0.9678 | 0.9838 |
| 0.0513 | 7.6331 | 5160 | 0.9242 | 0.7519 | 0.9242 | 0.9614 |
| 0.0513 | 7.6361 | 5162 | 0.8840 | 0.7296 | 0.8840 | 0.9402 |
| 0.0513 | 7.6391 | 5164 | 0.8636 | 0.7296 | 0.8636 | 0.9293 |
| 0.0513 | 7.6420 | 5166 | 0.8501 | 0.7296 | 0.8501 | 0.9220 |
| 0.0513 | 7.6450 | 5168 | 0.8404 | 0.7296 | 0.8404 | 0.9167 |
| 0.0513 | 7.6479 | 5170 | 0.8469 | 0.7296 | 0.8469 | 0.9203 |
| 0.0513 | 7.6509 | 5172 | 0.8509 | 0.7296 | 0.8509 | 0.9224 |
| 0.0513 | 7.6538 | 5174 | 0.8532 | 0.7296 | 0.8532 | 0.9237 |
| 0.0513 | 7.6568 | 5176 | 0.8734 | 0.7296 | 0.8734 | 0.9346 |
| 0.0513 | 7.6598 | 5178 | 0.9032 | 0.7840 | 0.9032 | 0.9504 |
| 0.0513 | 7.6627 | 5180 | 0.9208 | 0.7840 | 0.9208 | 0.9596 |
| 0.0513 | 7.6657 | 5182 | 0.9271 | 0.7840 | 0.9271 | 0.9629 |
| 0.0513 | 7.6686 | 5184 | 0.9043 | 0.7840 | 0.9043 | 0.9509 |
| 0.0513 | 7.6716 | 5186 | 0.8711 | 0.7840 | 0.8711 | 0.9333 |
| 0.0513 | 7.6746 | 5188 | 0.8547 | 0.7626 | 0.8547 | 0.9245 |
| 0.0513 | 7.6775 | 5190 | 0.8409 | 0.7296 | 0.8409 | 0.9170 |
| 0.0513 | 7.6805 | 5192 | 0.8427 | 0.7296 | 0.8427 | 0.9180 |
| 0.0513 | 7.6834 | 5194 | 0.8690 | 0.7296 | 0.8690 | 0.9322 |
| 0.0513 | 7.6864 | 5196 | 0.8874 | 0.7626 | 0.8874 | 0.9420 |
| 0.0513 | 7.6893 | 5198 | 0.9085 | 0.7840 | 0.9085 | 0.9531 |
| 0.0513 | 7.6923 | 5200 | 0.9412 | 0.7840 | 0.9412 | 0.9701 |
| 0.0513 | 7.6953 | 5202 | 0.9631 | 0.7840 | 0.9631 | 0.9814 |
| 0.0513 | 7.6982 | 5204 | 0.9516 | 0.7840 | 0.9516 | 0.9755 |
| 0.0513 | 7.7012 | 5206 | 0.9202 | 0.7840 | 0.9202 | 0.9593 |
| 0.0513 | 7.7041 | 5208 | 0.8798 | 0.7626 | 0.8798 | 0.9380 |
| 0.0513 | 7.7071 | 5210 | 0.8325 | 0.7296 | 0.8325 | 0.9124 |
| 0.0513 | 7.7101 | 5212 | 0.7859 | 0.7296 | 0.7859 | 0.8865 |
| 0.0513 | 7.7130 | 5214 | 0.7703 | 0.72 | 0.7703 | 0.8777 |
| 0.0513 | 7.7160 | 5216 | 0.7783 | 0.72 | 0.7783 | 0.8822 |
| 0.0513 | 7.7189 | 5218 | 0.7952 | 0.7296 | 0.7952 | 0.8918 |
| 0.0513 | 7.7219 | 5220 | 0.8079 | 0.7296 | 0.8079 | 0.8988 |
| 0.0513 | 7.7249 | 5222 | 0.8141 | 0.7296 | 0.8141 | 0.9023 |
| 0.0513 | 7.7278 | 5224 | 0.8363 | 0.7626 | 0.8363 | 0.9145 |
| 0.0513 | 7.7308 | 5226 | 0.8579 | 0.7626 | 0.8579 | 0.9262 |
| 0.0513 | 7.7337 | 5228 | 0.8630 | 0.7626 | 0.8630 | 0.9290 |
| 0.0513 | 7.7367 | 5230 | 0.8502 | 0.7626 | 0.8502 | 0.9221 |
| 0.0513 | 7.7396 | 5232 | 0.8546 | 0.7626 | 0.8546 | 0.9244 |
| 0.0513 | 7.7426 | 5234 | 0.8756 | 0.7840 | 0.8756 | 0.9357 |
| 0.0513 | 7.7456 | 5236 | 0.9083 | 0.7840 | 0.9083 | 0.9530 |
| 0.0513 | 7.7485 | 5238 | 0.9352 | 0.7840 | 0.9352 | 0.9670 |
| 0.0513 | 7.7515 | 5240 | 0.9624 | 0.7840 | 0.9624 | 0.9810 |
| 0.0513 | 7.7544 | 5242 | 0.9656 | 0.7840 | 0.9656 | 0.9826 |
| 0.0513 | 7.7574 | 5244 | 0.9433 | 0.7840 | 0.9433 | 0.9712 |
| 0.0513 | 7.7604 | 5246 | 0.9051 | 0.7840 | 0.9051 | 0.9514 |
| 0.0513 | 7.7633 | 5248 | 0.8595 | 0.7413 | 0.8595 | 0.9271 |
| 0.0513 | 7.7663 | 5250 | 0.8234 | 0.72 | 0.8234 | 0.9074 |
| 0.0513 | 7.7692 | 5252 | 0.8027 | 0.72 | 0.8027 | 0.8960 |
| 0.0513 | 7.7722 | 5254 | 0.8021 | 0.72 | 0.8021 | 0.8956 |
| 0.0513 | 7.7751 | 5256 | 0.8193 | 0.72 | 0.8193 | 0.9052 |
| 0.0513 | 7.7781 | 5258 | 0.8567 | 0.72 | 0.8567 | 0.9256 |
| 0.0513 | 7.7811 | 5260 | 0.8943 | 0.7413 | 0.8943 | 0.9457 |
| 0.0513 | 7.7840 | 5262 | 0.9261 | 0.7840 | 0.9261 | 0.9623 |
| 0.0513 | 7.7870 | 5264 | 0.9689 | 0.7840 | 0.9689 | 0.9843 |
| 0.0513 | 7.7899 | 5266 | 0.9977 | 0.7840 | 0.9977 | 0.9989 |
| 0.0513 | 7.7929 | 5268 | 1.0082 | 0.7840 | 1.0082 | 1.0041 |
| 0.0513 | 7.7959 | 5270 | 1.0252 | 0.7840 | 1.0252 | 1.0125 |
| 0.0513 | 7.7988 | 5272 | 1.0323 | 0.7840 | 1.0323 | 1.0160 |
| 0.0513 | 7.8018 | 5274 | 1.0288 | 0.7840 | 1.0288 | 1.0143 |
| 0.0513 | 7.8047 | 5276 | 0.9974 | 0.7840 | 0.9974 | 0.9987 |
| 0.0513 | 7.8077 | 5278 | 0.9670 | 0.7840 | 0.9670 | 0.9833 |
| 0.0513 | 7.8107 | 5280 | 0.9379 | 0.7840 | 0.9379 | 0.9685 |
| 0.0513 | 7.8136 | 5282 | 0.8991 | 0.7626 | 0.8991 | 0.9482 |
| 0.0513 | 7.8166 | 5284 | 0.8749 | 0.7626 | 0.8749 | 0.9354 |
| 0.0513 | 7.8195 | 5286 | 0.8548 | 0.7520 | 0.8548 | 0.9246 |
| 0.0513 | 7.8225 | 5288 | 0.8513 | 0.7520 | 0.8513 | 0.9227 |
| 0.0513 | 7.8254 | 5290 | 0.8672 | 0.7520 | 0.8672 | 0.9312 |
| 0.0513 | 7.8284 | 5292 | 0.8998 | 0.7626 | 0.8998 | 0.9486 |
| 0.0513 | 7.8314 | 5294 | 0.9235 | 0.7840 | 0.9235 | 0.9610 |
| 0.0513 | 7.8343 | 5296 | 0.9389 | 0.7840 | 0.9389 | 0.9690 |
| 0.0513 | 7.8373 | 5298 | 0.9523 | 0.7840 | 0.9523 | 0.9759 |
| 0.0513 | 7.8402 | 5300 | 0.9738 | 0.7840 | 0.9738 | 0.9868 |
| 0.0513 | 7.8432 | 5302 | 0.9840 | 0.7840 | 0.9840 | 0.9920 |
| 0.0513 | 7.8462 | 5304 | 0.9792 | 0.7840 | 0.9792 | 0.9895 |
| 0.0513 | 7.8491 | 5306 | 0.9627 | 0.7840 | 0.9627 | 0.9812 |
| 0.0513 | 7.8521 | 5308 | 0.9539 | 0.7840 | 0.9539 | 0.9767 |
| 0.0513 | 7.8550 | 5310 | 0.9467 | 0.7840 | 0.9467 | 0.9730 |
| 0.0513 | 7.8580 | 5312 | 0.9264 | 0.7626 | 0.9264 | 0.9625 |
| 0.0513 | 7.8609 | 5314 | 0.9058 | 0.7626 | 0.9058 | 0.9518 |
| 0.0513 | 7.8639 | 5316 | 0.8926 | 0.7626 | 0.8926 | 0.9448 |
| 0.0513 | 7.8669 | 5318 | 0.8816 | 0.7296 | 0.8816 | 0.9389 |
| 0.0513 | 7.8698 | 5320 | 0.8710 | 0.7626 | 0.8710 | 0.9333 |
| 0.0513 | 7.8728 | 5322 | 0.8604 | 0.7520 | 0.8604 | 0.9276 |
| 0.0513 | 7.8757 | 5324 | 0.8542 | 0.7520 | 0.8542 | 0.9242 |
| 0.0513 | 7.8787 | 5326 | 0.8520 | 0.7520 | 0.8520 | 0.9231 |
| 0.0513 | 7.8817 | 5328 | 0.8507 | 0.7520 | 0.8507 | 0.9224 |
| 0.0513 | 7.8846 | 5330 | 0.8527 | 0.7520 | 0.8527 | 0.9234 |
| 0.0513 | 7.8876 | 5332 | 0.8406 | 0.7520 | 0.8406 | 0.9168 |
| 0.0513 | 7.8905 | 5334 | 0.8342 | 0.7520 | 0.8342 | 0.9133 |
| 0.0513 | 7.8935 | 5336 | 0.8408 | 0.7520 | 0.8408 | 0.9169 |
| 0.0513 | 7.8964 | 5338 | 0.8421 | 0.7626 | 0.8421 | 0.9177 |
| 0.0513 | 7.8994 | 5340 | 0.8418 | 0.7626 | 0.8418 | 0.9175 |
| 0.0513 | 7.9024 | 5342 | 0.8484 | 0.7626 | 0.8484 | 0.9211 |
| 0.0513 | 7.9053 | 5344 | 0.8473 | 0.7626 | 0.8473 | 0.9205 |
| 0.0513 | 7.9083 | 5346 | 0.8396 | 0.7626 | 0.8396 | 0.9163 |
| 0.0513 | 7.9112 | 5348 | 0.8417 | 0.7840 | 0.8417 | 0.9174 |
| 0.0513 | 7.9142 | 5350 | 0.8596 | 0.7840 | 0.8596 | 0.9271 |
| 0.0513 | 7.9172 | 5352 | 0.8838 | 0.7840 | 0.8838 | 0.9401 |
| 0.0513 | 7.9201 | 5354 | 0.8852 | 0.7840 | 0.8852 | 0.9409 |
| 0.0513 | 7.9231 | 5356 | 0.8937 | 0.7840 | 0.8937 | 0.9454 |
| 0.0513 | 7.9260 | 5358 | 0.9098 | 0.7840 | 0.9098 | 0.9539 |
| 0.0513 | 7.9290 | 5360 | 0.9152 | 0.7840 | 0.9152 | 0.9567 |
| 0.0513 | 7.9320 | 5362 | 0.9063 | 0.7840 | 0.9063 | 0.9520 |
| 0.0513 | 7.9349 | 5364 | 0.8998 | 0.7840 | 0.8998 | 0.9486 |
| 0.0513 | 7.9379 | 5366 | 0.8868 | 0.7840 | 0.8868 | 0.9417 |
| 0.0513 | 7.9408 | 5368 | 0.8697 | 0.7840 | 0.8697 | 0.9326 |
| 0.0513 | 7.9438 | 5370 | 0.8681 | 0.7840 | 0.8681 | 0.9317 |
| 0.0513 | 7.9467 | 5372 | 0.8701 | 0.7840 | 0.8701 | 0.9328 |
| 0.0513 | 7.9497 | 5374 | 0.8574 | 0.7840 | 0.8574 | 0.9260 |
| 0.0513 | 7.9527 | 5376 | 0.8564 | 0.7840 | 0.8564 | 0.9254 |
| 0.0513 | 7.9556 | 5378 | 0.8474 | 0.7840 | 0.8474 | 0.9205 |
| 0.0513 | 7.9586 | 5380 | 0.8497 | 0.7626 | 0.8497 | 0.9218 |
| 0.0513 | 7.9615 | 5382 | 0.8664 | 0.7840 | 0.8664 | 0.9308 |
| 0.0513 | 7.9645 | 5384 | 0.8858 | 0.7840 | 0.8858 | 0.9412 |
| 0.0513 | 7.9675 | 5386 | 0.8961 | 0.7840 | 0.8961 | 0.9466 |
| 0.0513 | 7.9704 | 5388 | 0.8907 | 0.7840 | 0.8907 | 0.9437 |
| 0.0513 | 7.9734 | 5390 | 0.8936 | 0.7840 | 0.8936 | 0.9453 |
| 0.0513 | 7.9763 | 5392 | 0.9150 | 0.7840 | 0.9150 | 0.9565 |
| 0.0513 | 7.9793 | 5394 | 0.9201 | 0.7840 | 0.9201 | 0.9592 |
| 0.0513 | 7.9822 | 5396 | 0.9395 | 0.7840 | 0.9395 | 0.9693 |
| 0.0513 | 7.9852 | 5398 | 0.9737 | 0.7840 | 0.9737 | 0.9868 |
| 0.0513 | 7.9882 | 5400 | 0.9916 | 0.7840 | 0.9916 | 0.9958 |
| 0.0513 | 7.9911 | 5402 | 0.9916 | 0.7840 | 0.9916 | 0.9958 |
| 0.0513 | 7.9941 | 5404 | 0.9681 | 0.7840 | 0.9681 | 0.9839 |
| 0.0513 | 7.9970 | 5406 | 0.9462 | 0.7840 | 0.9462 | 0.9727 |
| 0.0513 | 8.0 | 5408 | 0.9057 | 0.7840 | 0.9057 | 0.9517 |
| 0.0513 | 8.0030 | 5410 | 0.8827 | 0.7840 | 0.8827 | 0.9395 |
| 0.0513 | 8.0059 | 5412 | 0.8804 | 0.7840 | 0.8804 | 0.9383 |
| 0.0513 | 8.0089 | 5414 | 0.8891 | 0.7840 | 0.8891 | 0.9429 |
| 0.0513 | 8.0118 | 5416 | 0.9093 | 0.7840 | 0.9093 | 0.9536 |
| 0.0513 | 8.0148 | 5418 | 0.9292 | 0.7840 | 0.9292 | 0.9640 |
| 0.0513 | 8.0178 | 5420 | 0.9463 | 0.7840 | 0.9463 | 0.9728 |
| 0.0513 | 8.0207 | 5422 | 0.9432 | 0.7840 | 0.9432 | 0.9712 |
| 0.0513 | 8.0237 | 5424 | 0.9443 | 0.7840 | 0.9443 | 0.9718 |
| 0.0513 | 8.0266 | 5426 | 0.9662 | 0.7840 | 0.9662 | 0.9830 |
| 0.0513 | 8.0296 | 5428 | 0.9697 | 0.7840 | 0.9697 | 0.9847 |
| 0.0513 | 8.0325 | 5430 | 0.9530 | 0.7840 | 0.9530 | 0.9762 |
| 0.0513 | 8.0355 | 5432 | 0.9319 | 0.7840 | 0.9319 | 0.9654 |
| 0.0513 | 8.0385 | 5434 | 0.9145 | 0.7840 | 0.9145 | 0.9563 |
| 0.0513 | 8.0414 | 5436 | 0.8882 | 0.7840 | 0.8882 | 0.9424 |
| 0.0513 | 8.0444 | 5438 | 0.8745 | 0.7840 | 0.8745 | 0.9351 |
| 0.0513 | 8.0473 | 5440 | 0.8735 | 0.7840 | 0.8735 | 0.9346 |
| 0.0513 | 8.0503 | 5442 | 0.8885 | 0.7840 | 0.8885 | 0.9426 |
| 0.0513 | 8.0533 | 5444 | 0.8960 | 0.7840 | 0.8960 | 0.9466 |
| 0.0513 | 8.0562 | 5446 | 0.9071 | 0.7840 | 0.9071 | 0.9524 |
| 0.0513 | 8.0592 | 5448 | 0.9082 | 0.7840 | 0.9082 | 0.9530 |
| 0.0513 | 8.0621 | 5450 | 0.8888 | 0.7840 | 0.8888 | 0.9427 |
| 0.0513 | 8.0651 | 5452 | 0.8756 | 0.7840 | 0.8756 | 0.9357 |
| 0.0513 | 8.0680 | 5454 | 0.8716 | 0.7840 | 0.8716 | 0.9336 |
| 0.0513 | 8.0710 | 5456 | 0.8763 | 0.7840 | 0.8763 | 0.9361 |
| 0.0513 | 8.0740 | 5458 | 0.8698 | 0.7840 | 0.8698 | 0.9326 |
| 0.0513 | 8.0769 | 5460 | 0.8683 | 0.7840 | 0.8683 | 0.9318 |
| 0.0513 | 8.0799 | 5462 | 0.8590 | 0.7724 | 0.8590 | 0.9268 |
| 0.0513 | 8.0828 | 5464 | 0.8545 | 0.7724 | 0.8545 | 0.9244 |
| 0.0513 | 8.0858 | 5466 | 0.8546 | 0.7724 | 0.8546 | 0.9244 |
| 0.0513 | 8.0888 | 5468 | 0.8727 | 0.7724 | 0.8727 | 0.9342 |
| 0.0513 | 8.0917 | 5470 | 0.8868 | 0.7724 | 0.8868 | 0.9417 |
| 0.0513 | 8.0947 | 5472 | 0.9095 | 0.7840 | 0.9095 | 0.9537 |
| 0.0513 | 8.0976 | 5474 | 0.9463 | 0.7840 | 0.9463 | 0.9728 |
| 0.0513 | 8.1006 | 5476 | 0.9815 | 0.7840 | 0.9815 | 0.9907 |
| 0.0513 | 8.1036 | 5478 | 1.0161 | 0.7840 | 1.0161 | 1.0080 |
| 0.0513 | 8.1065 | 5480 | 1.0219 | 0.7840 | 1.0219 | 1.0109 |
| 0.0513 | 8.1095 | 5482 | 1.0064 | 0.7840 | 1.0064 | 1.0032 |
| 0.0513 | 8.1124 | 5484 | 0.9786 | 0.7840 | 0.9786 | 0.9893 |
| 0.0513 | 8.1154 | 5486 | 0.9712 | 0.7840 | 0.9712 | 0.9855 |
| 0.0513 | 8.1183 | 5488 | 0.9631 | 0.7840 | 0.9631 | 0.9814 |
| 0.0513 | 8.1213 | 5490 | 0.9548 | 0.7840 | 0.9548 | 0.9772 |
| 0.0513 | 8.1243 | 5492 | 0.9550 | 0.7840 | 0.9550 | 0.9772 |
| 0.0513 | 8.1272 | 5494 | 0.9506 | 0.7840 | 0.9506 | 0.9750 |
| 0.0513 | 8.1302 | 5496 | 0.9431 | 0.7840 | 0.9431 | 0.9711 |
| 0.0513 | 8.1331 | 5498 | 0.9371 | 0.7840 | 0.9371 | 0.9680 |
| 0.0463 | 8.1361 | 5500 | 0.9407 | 0.7840 | 0.9407 | 0.9699 |
| 0.0463 | 8.1391 | 5502 | 0.9409 | 0.7840 | 0.9409 | 0.9700 |
| 0.0463 | 8.1420 | 5504 | 0.9537 | 0.7840 | 0.9537 | 0.9766 |
| 0.0463 | 8.1450 | 5506 | 0.9643 | 0.7840 | 0.9643 | 0.9820 |
| 0.0463 | 8.1479 | 5508 | 0.9587 | 0.7840 | 0.9587 | 0.9791 |
| 0.0463 | 8.1509 | 5510 | 0.9394 | 0.7840 | 0.9394 | 0.9692 |
| 0.0463 | 8.1538 | 5512 | 0.9359 | 0.7840 | 0.9359 | 0.9674 |
| 0.0463 | 8.1568 | 5514 | 0.9347 | 0.7840 | 0.9347 | 0.9668 |
| 0.0463 | 8.1598 | 5516 | 0.9311 | 0.7840 | 0.9311 | 0.9650 |
| 0.0463 | 8.1627 | 5518 | 0.9267 | 0.7840 | 0.9267 | 0.9626 |
| 0.0463 | 8.1657 | 5520 | 0.9123 | 0.7626 | 0.9123 | 0.9551 |
| 0.0463 | 8.1686 | 5522 | 0.8869 | 0.7296 | 0.8869 | 0.9417 |
| 0.0463 | 8.1716 | 5524 | 0.8553 | 0.7296 | 0.8553 | 0.9248 |
| 0.0463 | 8.1746 | 5526 | 0.8484 | 0.7296 | 0.8484 | 0.9211 |
| 0.0463 | 8.1775 | 5528 | 0.8518 | 0.7296 | 0.8518 | 0.9229 |
| 0.0463 | 8.1805 | 5530 | 0.8724 | 0.7296 | 0.8724 | 0.9340 |
| 0.0463 | 8.1834 | 5532 | 0.9039 | 0.7840 | 0.9039 | 0.9507 |
| 0.0463 | 8.1864 | 5534 | 0.9287 | 0.7840 | 0.9287 | 0.9637 |
| 0.0463 | 8.1893 | 5536 | 0.9542 | 0.7840 | 0.9542 | 0.9769 |
| 0.0463 | 8.1923 | 5538 | 0.9696 | 0.7840 | 0.9696 | 0.9847 |
| 0.0463 | 8.1953 | 5540 | 0.9936 | 0.7840 | 0.9936 | 0.9968 |
| 0.0463 | 8.1982 | 5542 | 1.0123 | 0.7840 | 1.0123 | 1.0061 |
| 0.0463 | 8.2012 | 5544 | 1.0111 | 0.7840 | 1.0111 | 1.0055 |
| 0.0463 | 8.2041 | 5546 | 0.9937 | 0.7840 | 0.9937 | 0.9968 |
| 0.0463 | 8.2071 | 5548 | 0.9620 | 0.7840 | 0.9620 | 0.9808 |
| 0.0463 | 8.2101 | 5550 | 0.9328 | 0.7840 | 0.9328 | 0.9658 |
| 0.0463 | 8.2130 | 5552 | 0.9176 | 0.7840 | 0.9176 | 0.9579 |
| 0.0463 | 8.2160 | 5554 | 0.8961 | 0.7840 | 0.8961 | 0.9466 |
| 0.0463 | 8.2189 | 5556 | 0.8694 | 0.7840 | 0.8694 | 0.9324 |
| 0.0463 | 8.2219 | 5558 | 0.8567 | 0.7840 | 0.8567 | 0.9256 |
| 0.0463 | 8.2249 | 5560 | 0.8487 | 0.7840 | 0.8487 | 0.9213 |
| 0.0463 | 8.2278 | 5562 | 0.8487 | 0.7840 | 0.8487 | 0.9212 |
| 0.0463 | 8.2308 | 5564 | 0.8558 | 0.7840 | 0.8558 | 0.9251 |
| 0.0463 | 8.2337 | 5566 | 0.8666 | 0.7840 | 0.8666 | 0.9309 |
| 0.0463 | 8.2367 | 5568 | 0.8800 | 0.7840 | 0.8800 | 0.9381 |
| 0.0463 | 8.2396 | 5570 | 0.8852 | 0.7840 | 0.8852 | 0.9409 |
| 0.0463 | 8.2426 | 5572 | 0.9007 | 0.7840 | 0.9007 | 0.9490 |
| 0.0463 | 8.2456 | 5574 | 0.9000 | 0.7840 | 0.9000 | 0.9487 |
| 0.0463 | 8.2485 | 5576 | 0.9012 | 0.7840 | 0.9012 | 0.9493 |
| 0.0463 | 8.2515 | 5578 | 0.8997 | 0.7840 | 0.8997 | 0.9485 |
| 0.0463 | 8.2544 | 5580 | 0.8967 | 0.7626 | 0.8967 | 0.9469 |
| 0.0463 | 8.2574 | 5582 | 0.8941 | 0.7626 | 0.8941 | 0.9456 |
| 0.0463 | 8.2604 | 5584 | 0.9061 | 0.7626 | 0.9061 | 0.9519 |
| 0.0463 | 8.2633 | 5586 | 0.9288 | 0.7840 | 0.9288 | 0.9638 |
| 0.0463 | 8.2663 | 5588 | 0.9536 | 0.7840 | 0.9536 | 0.9765 |
| 0.0463 | 8.2692 | 5590 | 0.9589 | 0.7840 | 0.9589 | 0.9792 |
| 0.0463 | 8.2722 | 5592 | 0.9439 | 0.7840 | 0.9439 | 0.9716 |
| 0.0463 | 8.2751 | 5594 | 0.9212 | 0.7626 | 0.9212 | 0.9598 |
| 0.0463 | 8.2781 | 5596 | 0.8968 | 0.7626 | 0.8968 | 0.9470 |
| 0.0463 | 8.2811 | 5598 | 0.8869 | 0.7626 | 0.8869 | 0.9417 |
| 0.0463 | 8.2840 | 5600 | 0.8885 | 0.7626 | 0.8885 | 0.9426 |
| 0.0463 | 8.2870 | 5602 | 0.8791 | 0.7626 | 0.8791 | 0.9376 |
| 0.0463 | 8.2899 | 5604 | 0.8684 | 0.7296 | 0.8684 | 0.9319 |
| 0.0463 | 8.2929 | 5606 | 0.8603 | 0.7296 | 0.8603 | 0.9275 |
| 0.0463 | 8.2959 | 5608 | 0.8622 | 0.7296 | 0.8622 | 0.9285 |
| 0.0463 | 8.2988 | 5610 | 0.8822 | 0.7296 | 0.8822 | 0.9392 |
| 0.0463 | 8.3018 | 5612 | 0.9049 | 0.7296 | 0.9049 | 0.9513 |
| 0.0463 | 8.3047 | 5614 | 0.9291 | 0.7626 | 0.9291 | 0.9639 |
| 0.0463 | 8.3077 | 5616 | 0.9610 | 0.7626 | 0.9610 | 0.9803 |
| 0.0463 | 8.3107 | 5618 | 0.9874 | 0.7840 | 0.9874 | 0.9937 |
| 0.0463 | 8.3136 | 5620 | 1.0146 | 0.7840 | 1.0146 | 1.0073 |
| 0.0463 | 8.3166 | 5622 | 1.0132 | 0.7840 | 1.0132 | 1.0066 |
| 0.0463 | 8.3195 | 5624 | 0.9899 | 0.7840 | 0.9899 | 0.9949 |
| 0.0463 | 8.3225 | 5626 | 0.9501 | 0.7626 | 0.9501 | 0.9747 |
| 0.0463 | 8.3254 | 5628 | 0.9011 | 0.7296 | 0.9011 | 0.9492 |
| 0.0463 | 8.3284 | 5630 | 0.8584 | 0.7296 | 0.8584 | 0.9265 |
| 0.0463 | 8.3314 | 5632 | 0.8282 | 0.7296 | 0.8282 | 0.9101 |
| 0.0463 | 8.3343 | 5634 | 0.8208 | 0.7296 | 0.8208 | 0.9060 |
| 0.0463 | 8.3373 | 5636 | 0.8301 | 0.7296 | 0.8301 | 0.9111 |
| 0.0463 | 8.3402 | 5638 | 0.8509 | 0.7296 | 0.8509 | 0.9224 |
| 0.0463 | 8.3432 | 5640 | 0.8636 | 0.7626 | 0.8636 | 0.9293 |
| 0.0463 | 8.3462 | 5642 | 0.8824 | 0.7626 | 0.8824 | 0.9394 |
| 0.0463 | 8.3491 | 5644 | 0.8956 | 0.7626 | 0.8956 | 0.9464 |
| 0.0463 | 8.3521 | 5646 | 0.9121 | 0.7626 | 0.9121 | 0.9550 |
| 0.0463 | 8.3550 | 5648 | 0.9320 | 0.7840 | 0.9320 | 0.9654 |
| 0.0463 | 8.3580 | 5650 | 0.9368 | 0.7840 | 0.9368 | 0.9679 |
| 0.0463 | 8.3609 | 5652 | 0.9307 | 0.7626 | 0.9307 | 0.9647 |
| 0.0463 | 8.3639 | 5654 | 0.9334 | 0.7626 | 0.9334 | 0.9661 |
| 0.0463 | 8.3669 | 5656 | 0.9290 | 0.7626 | 0.9290 | 0.9639 |
| 0.0463 | 8.3698 | 5658 | 0.9245 | 0.7626 | 0.9245 | 0.9615 |
| 0.0463 | 8.3728 | 5660 | 0.9264 | 0.7626 | 0.9264 | 0.9625 |
| 0.0463 | 8.3757 | 5662 | 0.9144 | 0.7626 | 0.9144 | 0.9563 |
| 0.0463 | 8.3787 | 5664 | 0.9077 | 0.7626 | 0.9077 | 0.9528 |
| 0.0463 | 8.3817 | 5666 | 0.9198 | 0.7626 | 0.9198 | 0.9591 |
| 0.0463 | 8.3846 | 5668 | 0.9402 | 0.7626 | 0.9402 | 0.9696 |
| 0.0463 | 8.3876 | 5670 | 0.9529 | 0.7626 | 0.9529 | 0.9762 |
| 0.0463 | 8.3905 | 5672 | 0.9801 | 0.7840 | 0.9801 | 0.9900 |
| 0.0463 | 8.3935 | 5674 | 0.9953 | 0.7840 | 0.9953 | 0.9976 |
| 0.0463 | 8.3964 | 5676 | 0.9977 | 0.7840 | 0.9977 | 0.9988 |
| 0.0463 | 8.3994 | 5678 | 0.9879 | 0.7840 | 0.9879 | 0.9939 |
| 0.0463 | 8.4024 | 5680 | 0.9640 | 0.7840 | 0.9640 | 0.9819 |
| 0.0463 | 8.4053 | 5682 | 0.9444 | 0.7840 | 0.9444 | 0.9718 |
| 0.0463 | 8.4083 | 5684 | 0.9303 | 0.7626 | 0.9303 | 0.9645 |
| 0.0463 | 8.4112 | 5686 | 0.9309 | 0.7626 | 0.9309 | 0.9648 |
| 0.0463 | 8.4142 | 5688 | 0.9304 | 0.7626 | 0.9304 | 0.9646 |
| 0.0463 | 8.4172 | 5690 | 0.9131 | 0.7626 | 0.9131 | 0.9555 |
| 0.0463 | 8.4201 | 5692 | 0.8937 | 0.7626 | 0.8937 | 0.9454 |
| 0.0463 | 8.4231 | 5694 | 0.8819 | 0.7626 | 0.8819 | 0.9391 |
| 0.0463 | 8.4260 | 5696 | 0.8729 | 0.7626 | 0.8729 | 0.9343 |
| 0.0463 | 8.4290 | 5698 | 0.8708 | 0.7626 | 0.8708 | 0.9332 |
| 0.0463 | 8.4320 | 5700 | 0.8611 | 0.7626 | 0.8611 | 0.9279 |
| 0.0463 | 8.4349 | 5702 | 0.8579 | 0.7626 | 0.8579 | 0.9262 |
| 0.0463 | 8.4379 | 5704 | 0.8553 | 0.7626 | 0.8553 | 0.9248 |
| 0.0463 | 8.4408 | 5706 | 0.8701 | 0.7626 | 0.8701 | 0.9328 |
| 0.0463 | 8.4438 | 5708 | 0.8849 | 0.7626 | 0.8849 | 0.9407 |
| 0.0463 | 8.4467 | 5710 | 0.8903 | 0.7626 | 0.8903 | 0.9435 |
| 0.0463 | 8.4497 | 5712 | 0.8841 | 0.7626 | 0.8841 | 0.9403 |
| 0.0463 | 8.4527 | 5714 | 0.8816 | 0.7626 | 0.8816 | 0.9389 |
| 0.0463 | 8.4556 | 5716 | 0.8788 | 0.7626 | 0.8788 | 0.9375 |
| 0.0463 | 8.4586 | 5718 | 0.8876 | 0.7626 | 0.8876 | 0.9421 |
| 0.0463 | 8.4615 | 5720 | 0.9134 | 0.7840 | 0.9134 | 0.9557 |
| 0.0463 | 8.4645 | 5722 | 0.9324 | 0.7840 | 0.9324 | 0.9656 |
| 0.0463 | 8.4675 | 5724 | 0.9467 | 0.7840 | 0.9467 | 0.9730 |
| 0.0463 | 8.4704 | 5726 | 0.9481 | 0.7840 | 0.9481 | 0.9737 |
| 0.0463 | 8.4734 | 5728 | 0.9346 | 0.7840 | 0.9346 | 0.9668 |
| 0.0463 | 8.4763 | 5730 | 0.9082 | 0.7840 | 0.9082 | 0.9530 |
| 0.0463 | 8.4793 | 5732 | 0.8784 | 0.7626 | 0.8784 | 0.9372 |
| 0.0463 | 8.4822 | 5734 | 0.8630 | 0.7626 | 0.8630 | 0.9290 |
| 0.0463 | 8.4852 | 5736 | 0.8621 | 0.7626 | 0.8621 | 0.9285 |
| 0.0463 | 8.4882 | 5738 | 0.8539 | 0.7626 | 0.8539 | 0.9241 |
| 0.0463 | 8.4911 | 5740 | 0.8459 | 0.7626 | 0.8459 | 0.9197 |
| 0.0463 | 8.4941 | 5742 | 0.8374 | 0.7626 | 0.8374 | 0.9151 |
| 0.0463 | 8.4970 | 5744 | 0.8458 | 0.7626 | 0.8458 | 0.9197 |
| 0.0463 | 8.5 | 5746 | 0.8549 | 0.7626 | 0.8549 | 0.9246 |
| 0.0463 | 8.5030 | 5748 | 0.8767 | 0.7626 | 0.8767 | 0.9363 |
| 0.0463 | 8.5059 | 5750 | 0.9061 | 0.7840 | 0.9061 | 0.9519 |
| 0.0463 | 8.5089 | 5752 | 0.9308 | 0.7840 | 0.9308 | 0.9648 |
| 0.0463 | 8.5118 | 5754 | 0.9431 | 0.7840 | 0.9431 | 0.9711 |
| 0.0463 | 8.5148 | 5756 | 0.9389 | 0.7840 | 0.9389 | 0.9690 |
| 0.0463 | 8.5178 | 5758 | 0.9194 | 0.7840 | 0.9194 | 0.9588 |
| 0.0463 | 8.5207 | 5760 | 0.9052 | 0.7840 | 0.9052 | 0.9514 |
| 0.0463 | 8.5237 | 5762 | 0.8764 | 0.7626 | 0.8764 | 0.9362 |
| 0.0463 | 8.5266 | 5764 | 0.8455 | 0.7296 | 0.8455 | 0.9195 |
| 0.0463 | 8.5296 | 5766 | 0.8264 | 0.72 | 0.8264 | 0.9091 |
| 0.0463 | 8.5325 | 5768 | 0.8238 | 0.7520 | 0.8238 | 0.9076 |
| 0.0463 | 8.5355 | 5770 | 0.8186 | 0.7520 | 0.8186 | 0.9048 |
| 0.0463 | 8.5385 | 5772 | 0.8282 | 0.7520 | 0.8282 | 0.9100 |
| 0.0463 | 8.5414 | 5774 | 0.8416 | 0.7520 | 0.8416 | 0.9174 |
| 0.0463 | 8.5444 | 5776 | 0.8482 | 0.7520 | 0.8482 | 0.9210 |
| 0.0463 | 8.5473 | 5778 | 0.8452 | 0.7520 | 0.8452 | 0.9193 |
| 0.0463 | 8.5503 | 5780 | 0.8579 | 0.7520 | 0.8579 | 0.9262 |
| 0.0463 | 8.5533 | 5782 | 0.8748 | 0.7520 | 0.8748 | 0.9353 |
| 0.0463 | 8.5562 | 5784 | 0.8843 | 0.7520 | 0.8843 | 0.9404 |
| 0.0463 | 8.5592 | 5786 | 0.8920 | 0.7520 | 0.8920 | 0.9445 |
| 0.0463 | 8.5621 | 5788 | 0.8910 | 0.7520 | 0.8910 | 0.9440 |
| 0.0463 | 8.5651 | 5790 | 0.8814 | 0.7520 | 0.8814 | 0.9388 |
| 0.0463 | 8.5680 | 5792 | 0.8682 | 0.7520 | 0.8682 | 0.9318 |
| 0.0463 | 8.5710 | 5794 | 0.8503 | 0.7520 | 0.8503 | 0.9221 |
| 0.0463 | 8.5740 | 5796 | 0.8368 | 0.7520 | 0.8368 | 0.9148 |
| 0.0463 | 8.5769 | 5798 | 0.8235 | 0.72 | 0.8235 | 0.9075 |
| 0.0463 | 8.5799 | 5800 | 0.8247 | 0.72 | 0.8247 | 0.9082 |
| 0.0463 | 8.5828 | 5802 | 0.8334 | 0.72 | 0.8334 | 0.9129 |
| 0.0463 | 8.5858 | 5804 | 0.8445 | 0.7520 | 0.8445 | 0.9190 |
| 0.0463 | 8.5888 | 5806 | 0.8531 | 0.7520 | 0.8531 | 0.9236 |
| 0.0463 | 8.5917 | 5808 | 0.8747 | 0.7626 | 0.8747 | 0.9353 |
| 0.0463 | 8.5947 | 5810 | 0.8873 | 0.7626 | 0.8873 | 0.9420 |
| 0.0463 | 8.5976 | 5812 | 0.8831 | 0.7626 | 0.8831 | 0.9397 |
| 0.0463 | 8.6006 | 5814 | 0.8656 | 0.7520 | 0.8656 | 0.9304 |
| 0.0463 | 8.6036 | 5816 | 0.8524 | 0.7520 | 0.8524 | 0.9233 |
| 0.0463 | 8.6065 | 5818 | 0.8481 | 0.7520 | 0.8481 | 0.9209 |
| 0.0463 | 8.6095 | 5820 | 0.8429 | 0.7520 | 0.8429 | 0.9181 |
| 0.0463 | 8.6124 | 5822 | 0.8521 | 0.7520 | 0.8521 | 0.9231 |
| 0.0463 | 8.6154 | 5824 | 0.8622 | 0.7520 | 0.8622 | 0.9286 |
| 0.0463 | 8.6183 | 5826 | 0.8603 | 0.7520 | 0.8603 | 0.9275 |
| 0.0463 | 8.6213 | 5828 | 0.8570 | 0.7520 | 0.8570 | 0.9257 |
| 0.0463 | 8.6243 | 5830 | 0.8542 | 0.7520 | 0.8542 | 0.9242 |
| 0.0463 | 8.6272 | 5832 | 0.8388 | 0.7520 | 0.8388 | 0.9159 |
| 0.0463 | 8.6302 | 5834 | 0.8214 | 0.7819 | 0.8214 | 0.9063 |
| 0.0463 | 8.6331 | 5836 | 0.8000 | 0.7819 | 0.8000 | 0.8944 |
| 0.0463 | 8.6361 | 5838 | 0.7796 | 0.7819 | 0.7796 | 0.8829 |
| 0.0463 | 8.6391 | 5840 | 0.7777 | 0.7819 | 0.7777 | 0.8819 |
| 0.0463 | 8.6420 | 5842 | 0.7902 | 0.7819 | 0.7902 | 0.8889 |
| 0.0463 | 8.6450 | 5844 | 0.8049 | 0.7819 | 0.8049 | 0.8972 |
| 0.0463 | 8.6479 | 5846 | 0.8296 | 0.7819 | 0.8296 | 0.9108 |
| 0.0463 | 8.6509 | 5848 | 0.8475 | 0.7520 | 0.8475 | 0.9206 |
| 0.0463 | 8.6538 | 5850 | 0.8557 | 0.7520 | 0.8557 | 0.9250 |
| 0.0463 | 8.6568 | 5852 | 0.8557 | 0.7520 | 0.8557 | 0.9251 |
| 0.0463 | 8.6598 | 5854 | 0.8453 | 0.7520 | 0.8453 | 0.9194 |
| 0.0463 | 8.6627 | 5856 | 0.8282 | 0.7520 | 0.8282 | 0.9101 |
| 0.0463 | 8.6657 | 5858 | 0.8167 | 0.7520 | 0.8167 | 0.9037 |
| 0.0463 | 8.6686 | 5860 | 0.8203 | 0.7520 | 0.8203 | 0.9057 |
| 0.0463 | 8.6716 | 5862 | 0.8242 | 0.7520 | 0.8242 | 0.9078 |
| 0.0463 | 8.6746 | 5864 | 0.8308 | 0.7520 | 0.8308 | 0.9115 |
| 0.0463 | 8.6775 | 5866 | 0.8331 | 0.7520 | 0.8331 | 0.9128 |
| 0.0463 | 8.6805 | 5868 | 0.8446 | 0.7520 | 0.8446 | 0.9190 |
| 0.0463 | 8.6834 | 5870 | 0.8627 | 0.7520 | 0.8627 | 0.9288 |
| 0.0463 | 8.6864 | 5872 | 0.8687 | 0.7626 | 0.8687 | 0.9320 |
| 0.0463 | 8.6893 | 5874 | 0.8753 | 0.7626 | 0.8753 | 0.9356 |
| 0.0463 | 8.6923 | 5876 | 0.8743 | 0.7520 | 0.8743 | 0.9351 |
| 0.0463 | 8.6953 | 5878 | 0.8626 | 0.7520 | 0.8626 | 0.9287 |
| 0.0463 | 8.6982 | 5880 | 0.8465 | 0.7520 | 0.8465 | 0.9201 |
| 0.0463 | 8.7012 | 5882 | 0.8342 | 0.7520 | 0.8342 | 0.9134 |
| 0.0463 | 8.7041 | 5884 | 0.8168 | 0.7520 | 0.8168 | 0.9037 |
| 0.0463 | 8.7071 | 5886 | 0.8007 | 0.7819 | 0.8007 | 0.8948 |
| 0.0463 | 8.7101 | 5888 | 0.7954 | 0.7819 | 0.7954 | 0.8918 |
| 0.0463 | 8.7130 | 5890 | 0.8003 | 0.7819 | 0.8003 | 0.8946 |
| 0.0463 | 8.7160 | 5892 | 0.8047 | 0.7819 | 0.8047 | 0.8970 |
| 0.0463 | 8.7189 | 5894 | 0.8084 | 0.7819 | 0.8084 | 0.8991 |
| 0.0463 | 8.7219 | 5896 | 0.8066 | 0.7819 | 0.8066 | 0.8981 |
| 0.0463 | 8.7249 | 5898 | 0.7996 | 0.7819 | 0.7996 | 0.8942 |
| 0.0463 | 8.7278 | 5900 | 0.8050 | 0.7819 | 0.8050 | 0.8972 |
| 0.0463 | 8.7308 | 5902 | 0.8125 | 0.7819 | 0.8125 | 0.9014 |
| 0.0463 | 8.7337 | 5904 | 0.8273 | 0.7520 | 0.8273 | 0.9096 |
| 0.0463 | 8.7367 | 5906 | 0.8470 | 0.7520 | 0.8470 | 0.9203 |
| 0.0463 | 8.7396 | 5908 | 0.8528 | 0.7520 | 0.8528 | 0.9235 |
| 0.0463 | 8.7426 | 5910 | 0.8634 | 0.7520 | 0.8634 | 0.9292 |
| 0.0463 | 8.7456 | 5912 | 0.8790 | 0.7626 | 0.8790 | 0.9375 |
| 0.0463 | 8.7485 | 5914 | 0.9019 | 0.7840 | 0.9019 | 0.9497 |
| 0.0463 | 8.7515 | 5916 | 0.9146 | 0.7840 | 0.9146 | 0.9563 |
| 0.0463 | 8.7544 | 5918 | 0.9316 | 0.7840 | 0.9316 | 0.9652 |
| 0.0463 | 8.7574 | 5920 | 0.9316 | 0.7840 | 0.9316 | 0.9652 |
| 0.0463 | 8.7604 | 5922 | 0.9340 | 0.7840 | 0.9340 | 0.9664 |
| 0.0463 | 8.7633 | 5924 | 0.9231 | 0.7840 | 0.9231 | 0.9608 |
| 0.0463 | 8.7663 | 5926 | 0.9025 | 0.7840 | 0.9025 | 0.9500 |
| 0.0463 | 8.7692 | 5928 | 0.8827 | 0.7840 | 0.8827 | 0.9395 |
| 0.0463 | 8.7722 | 5930 | 0.8687 | 0.7626 | 0.8687 | 0.9321 |
| 0.0463 | 8.7751 | 5932 | 0.8642 | 0.7626 | 0.8642 | 0.9296 |
| 0.0463 | 8.7781 | 5934 | 0.8637 | 0.7520 | 0.8637 | 0.9294 |
| 0.0463 | 8.7811 | 5936 | 0.8498 | 0.7520 | 0.8498 | 0.9218 |
| 0.0463 | 8.7840 | 5938 | 0.8338 | 0.7520 | 0.8338 | 0.9132 |
| 0.0463 | 8.7870 | 5940 | 0.8244 | 0.7520 | 0.8244 | 0.9079 |
| 0.0463 | 8.7899 | 5942 | 0.8247 | 0.7520 | 0.8247 | 0.9081 |
| 0.0463 | 8.7929 | 5944 | 0.8263 | 0.7520 | 0.8263 | 0.9090 |
| 0.0463 | 8.7959 | 5946 | 0.8399 | 0.7626 | 0.8399 | 0.9165 |
| 0.0463 | 8.7988 | 5948 | 0.8472 | 0.7626 | 0.8472 | 0.9205 |
| 0.0463 | 8.8018 | 5950 | 0.8593 | 0.7626 | 0.8593 | 0.9270 |
| 0.0463 | 8.8047 | 5952 | 0.8712 | 0.7626 | 0.8712 | 0.9334 |
| 0.0463 | 8.8077 | 5954 | 0.8797 | 0.7626 | 0.8797 | 0.9379 |
| 0.0463 | 8.8107 | 5956 | 0.8915 | 0.7626 | 0.8915 | 0.9442 |
| 0.0463 | 8.8136 | 5958 | 0.9070 | 0.7840 | 0.9070 | 0.9524 |
| 0.0463 | 8.8166 | 5960 | 0.9154 | 0.7840 | 0.9154 | 0.9568 |
| 0.0463 | 8.8195 | 5962 | 0.9148 | 0.7840 | 0.9148 | 0.9564 |
| 0.0463 | 8.8225 | 5964 | 0.9147 | 0.7626 | 0.9147 | 0.9564 |
| 0.0463 | 8.8254 | 5966 | 0.9126 | 0.7626 | 0.9126 | 0.9553 |
| 0.0463 | 8.8284 | 5968 | 0.9120 | 0.7626 | 0.9120 | 0.9550 |
| 0.0463 | 8.8314 | 5970 | 0.9206 | 0.7840 | 0.9206 | 0.9595 |
| 0.0463 | 8.8343 | 5972 | 0.9328 | 0.7840 | 0.9328 | 0.9658 |
| 0.0463 | 8.8373 | 5974 | 0.9332 | 0.7840 | 0.9332 | 0.9660 |
| 0.0463 | 8.8402 | 5976 | 0.9239 | 0.7626 | 0.9239 | 0.9612 |
| 0.0463 | 8.8432 | 5978 | 0.9104 | 0.7626 | 0.9104 | 0.9541 |
| 0.0463 | 8.8462 | 5980 | 0.8925 | 0.7626 | 0.8925 | 0.9447 |
| 0.0463 | 8.8491 | 5982 | 0.8730 | 0.7626 | 0.8730 | 0.9343 |
| 0.0463 | 8.8521 | 5984 | 0.8612 | 0.7296 | 0.8612 | 0.9280 |
| 0.0463 | 8.8550 | 5986 | 0.8530 | 0.7296 | 0.8530 | 0.9236 |
| 0.0463 | 8.8580 | 5988 | 0.8348 | 0.7296 | 0.8348 | 0.9137 |
| 0.0463 | 8.8609 | 5990 | 0.8198 | 0.7296 | 0.8198 | 0.9054 |
| 0.0463 | 8.8639 | 5992 | 0.8157 | 0.7296 | 0.8157 | 0.9031 |
| 0.0463 | 8.8669 | 5994 | 0.8157 | 0.7296 | 0.8157 | 0.9032 |
| 0.0463 | 8.8698 | 5996 | 0.8250 | 0.7296 | 0.8250 | 0.9083 |
| 0.0463 | 8.8728 | 5998 | 0.8391 | 0.7626 | 0.8391 | 0.9160 |
| 0.0421 | 8.8757 | 6000 | 0.8496 | 0.7626 | 0.8496 | 0.9217 |
| 0.0421 | 8.8787 | 6002 | 0.8590 | 0.7626 | 0.8590 | 0.9268 |
| 0.0421 | 8.8817 | 6004 | 0.8609 | 0.7626 | 0.8609 | 0.9278 |
| 0.0421 | 8.8846 | 6006 | 0.8554 | 0.7626 | 0.8554 | 0.9249 |
| 0.0421 | 8.8876 | 6008 | 0.8473 | 0.7626 | 0.8473 | 0.9205 |
| 0.0421 | 8.8905 | 6010 | 0.8385 | 0.7626 | 0.8385 | 0.9157 |
| 0.0421 | 8.8935 | 6012 | 0.8417 | 0.7626 | 0.8417 | 0.9174 |
| 0.0421 | 8.8964 | 6014 | 0.8538 | 0.7626 | 0.8538 | 0.9240 |
| 0.0421 | 8.8994 | 6016 | 0.8662 | 0.7626 | 0.8662 | 0.9307 |
| 0.0421 | 8.9024 | 6018 | 0.8769 | 0.7626 | 0.8769 | 0.9364 |
| 0.0421 | 8.9053 | 6020 | 0.8864 | 0.7626 | 0.8864 | 0.9415 |
| 0.0421 | 8.9083 | 6022 | 0.8885 | 0.7626 | 0.8885 | 0.9426 |
| 0.0421 | 8.9112 | 6024 | 0.8953 | 0.7626 | 0.8953 | 0.9462 |
| 0.0421 | 8.9142 | 6026 | 0.8963 | 0.7626 | 0.8963 | 0.9467 |
| 0.0421 | 8.9172 | 6028 | 0.9000 | 0.7626 | 0.9000 | 0.9487 |
| 0.0421 | 8.9201 | 6030 | 0.8921 | 0.7626 | 0.8921 | 0.9445 |
| 0.0421 | 8.9231 | 6032 | 0.8793 | 0.7626 | 0.8793 | 0.9377 |
| 0.0421 | 8.9260 | 6034 | 0.8792 | 0.7626 | 0.8792 | 0.9377 |
| 0.0421 | 8.9290 | 6036 | 0.8913 | 0.7626 | 0.8913 | 0.9441 |
| 0.0421 | 8.9320 | 6038 | 0.9032 | 0.7626 | 0.9032 | 0.9504 |
| 0.0421 | 8.9349 | 6040 | 0.9093 | 0.7626 | 0.9093 | 0.9536 |
| 0.0421 | 8.9379 | 6042 | 0.9079 | 0.7626 | 0.9079 | 0.9528 |
| 0.0421 | 8.9408 | 6044 | 0.9172 | 0.7626 | 0.9172 | 0.9577 |
| 0.0421 | 8.9438 | 6046 | 0.9220 | 0.7626 | 0.9220 | 0.9602 |
| 0.0421 | 8.9467 | 6048 | 0.9295 | 0.7626 | 0.9295 | 0.9641 |
| 0.0421 | 8.9497 | 6050 | 0.9242 | 0.7626 | 0.9242 | 0.9614 |
| 0.0421 | 8.9527 | 6052 | 0.9120 | 0.7626 | 0.9120 | 0.9550 |
| 0.0421 | 8.9556 | 6054 | 0.8966 | 0.7626 | 0.8966 | 0.9469 |
| 0.0421 | 8.9586 | 6056 | 0.8926 | 0.7626 | 0.8926 | 0.9448 |
| 0.0421 | 8.9615 | 6058 | 0.8962 | 0.7626 | 0.8962 | 0.9467 |
| 0.0421 | 8.9645 | 6060 | 0.9033 | 0.7626 | 0.9033 | 0.9504 |
| 0.0421 | 8.9675 | 6062 | 0.9096 | 0.7626 | 0.9096 | 0.9537 |
| 0.0421 | 8.9704 | 6064 | 0.9105 | 0.7626 | 0.9105 | 0.9542 |
| 0.0421 | 8.9734 | 6066 | 0.9014 | 0.7626 | 0.9014 | 0.9494 |
| 0.0421 | 8.9763 | 6068 | 0.8945 | 0.7626 | 0.8945 | 0.9458 |
| 0.0421 | 8.9793 | 6070 | 0.8958 | 0.7626 | 0.8958 | 0.9465 |
| 0.0421 | 8.9822 | 6072 | 0.9075 | 0.7626 | 0.9075 | 0.9526 |
| 0.0421 | 8.9852 | 6074 | 0.9147 | 0.7626 | 0.9147 | 0.9564 |
| 0.0421 | 8.9882 | 6076 | 0.9261 | 0.7626 | 0.9261 | 0.9623 |
| 0.0421 | 8.9911 | 6078 | 0.9296 | 0.7626 | 0.9296 | 0.9642 |
| 0.0421 | 8.9941 | 6080 | 0.9341 | 0.7626 | 0.9341 | 0.9665 |
| 0.0421 | 8.9970 | 6082 | 0.9404 | 0.7626 | 0.9404 | 0.9697 |
| 0.0421 | 9.0 | 6084 | 0.9375 | 0.7626 | 0.9375 | 0.9683 |
| 0.0421 | 9.0030 | 6086 | 0.9309 | 0.7626 | 0.9309 | 0.9648 |
| 0.0421 | 9.0059 | 6088 | 0.9219 | 0.7626 | 0.9219 | 0.9602 |
| 0.0421 | 9.0089 | 6090 | 0.9142 | 0.7626 | 0.9142 | 0.9561 |
| 0.0421 | 9.0118 | 6092 | 0.8992 | 0.7626 | 0.8992 | 0.9483 |
| 0.0421 | 9.0148 | 6094 | 0.8778 | 0.7626 | 0.8778 | 0.9369 |
| 0.0421 | 9.0178 | 6096 | 0.8539 | 0.7626 | 0.8539 | 0.9241 |
| 0.0421 | 9.0207 | 6098 | 0.8423 | 0.7626 | 0.8423 | 0.9178 |
| 0.0421 | 9.0237 | 6100 | 0.8269 | 0.7626 | 0.8269 | 0.9093 |
| 0.0421 | 9.0266 | 6102 | 0.8146 | 0.7520 | 0.8146 | 0.9026 |
| 0.0421 | 9.0296 | 6104 | 0.8150 | 0.7520 | 0.8150 | 0.9028 |
| 0.0421 | 9.0325 | 6106 | 0.8133 | 0.7520 | 0.8133 | 0.9018 |
| 0.0421 | 9.0355 | 6108 | 0.8077 | 0.72 | 0.8077 | 0.8987 |
| 0.0421 | 9.0385 | 6110 | 0.8039 | 0.72 | 0.8039 | 0.8966 |
| 0.0421 | 9.0414 | 6112 | 0.8042 | 0.72 | 0.8042 | 0.8968 |
| 0.0421 | 9.0444 | 6114 | 0.8121 | 0.7520 | 0.8121 | 0.9012 |
| 0.0421 | 9.0473 | 6116 | 0.8242 | 0.7626 | 0.8242 | 0.9079 |
| 0.0421 | 9.0503 | 6118 | 0.8419 | 0.7626 | 0.8419 | 0.9176 |
| 0.0421 | 9.0533 | 6120 | 0.8646 | 0.7626 | 0.8646 | 0.9298 |
| 0.0421 | 9.0562 | 6122 | 0.8855 | 0.7626 | 0.8855 | 0.9410 |
| 0.0421 | 9.0592 | 6124 | 0.9077 | 0.7626 | 0.9077 | 0.9527 |
| 0.0421 | 9.0621 | 6126 | 0.9237 | 0.7626 | 0.9237 | 0.9611 |
| 0.0421 | 9.0651 | 6128 | 0.9284 | 0.7626 | 0.9284 | 0.9635 |
| 0.0421 | 9.0680 | 6130 | 0.9244 | 0.7626 | 0.9244 | 0.9615 |
| 0.0421 | 9.0710 | 6132 | 0.9109 | 0.7626 | 0.9109 | 0.9544 |
| 0.0421 | 9.0740 | 6134 | 0.8944 | 0.7626 | 0.8944 | 0.9457 |
| 0.0421 | 9.0769 | 6136 | 0.8830 | 0.7626 | 0.8830 | 0.9397 |
| 0.0421 | 9.0799 | 6138 | 0.8721 | 0.7626 | 0.8721 | 0.9339 |
| 0.0421 | 9.0828 | 6140 | 0.8659 | 0.7626 | 0.8659 | 0.9305 |
| 0.0421 | 9.0858 | 6142 | 0.8511 | 0.7626 | 0.8511 | 0.9226 |
| 0.0421 | 9.0888 | 6144 | 0.8312 | 0.7520 | 0.8312 | 0.9117 |
| 0.0421 | 9.0917 | 6146 | 0.8112 | 0.7520 | 0.8112 | 0.9007 |
| 0.0421 | 9.0947 | 6148 | 0.7991 | 0.72 | 0.7991 | 0.8939 |
| 0.0421 | 9.0976 | 6150 | 0.7964 | 0.72 | 0.7964 | 0.8924 |
| 0.0421 | 9.1006 | 6152 | 0.7948 | 0.72 | 0.7948 | 0.8915 |
| 0.0421 | 9.1036 | 6154 | 0.7956 | 0.7520 | 0.7956 | 0.8920 |
| 0.0421 | 9.1065 | 6156 | 0.7965 | 0.7520 | 0.7965 | 0.8925 |
| 0.0421 | 9.1095 | 6158 | 0.8020 | 0.7520 | 0.8020 | 0.8956 |
| 0.0421 | 9.1124 | 6160 | 0.8113 | 0.7520 | 0.8113 | 0.9007 |
| 0.0421 | 9.1154 | 6162 | 0.8208 | 0.7520 | 0.8208 | 0.9060 |
| 0.0421 | 9.1183 | 6164 | 0.8356 | 0.7520 | 0.8356 | 0.9141 |
| 0.0421 | 9.1213 | 6166 | 0.8456 | 0.7520 | 0.8456 | 0.9195 |
| 0.0421 | 9.1243 | 6168 | 0.8530 | 0.7520 | 0.8530 | 0.9236 |
| 0.0421 | 9.1272 | 6170 | 0.8558 | 0.7626 | 0.8558 | 0.9251 |
| 0.0421 | 9.1302 | 6172 | 0.8603 | 0.7626 | 0.8603 | 0.9275 |
| 0.0421 | 9.1331 | 6174 | 0.8668 | 0.7626 | 0.8668 | 0.9310 |
| 0.0421 | 9.1361 | 6176 | 0.8687 | 0.7626 | 0.8687 | 0.9321 |
| 0.0421 | 9.1391 | 6178 | 0.8668 | 0.7626 | 0.8668 | 0.9310 |
| 0.0421 | 9.1420 | 6180 | 0.8694 | 0.7626 | 0.8694 | 0.9324 |
| 0.0421 | 9.1450 | 6182 | 0.8757 | 0.7626 | 0.8757 | 0.9358 |
| 0.0421 | 9.1479 | 6184 | 0.8818 | 0.7626 | 0.8818 | 0.9390 |
| 0.0421 | 9.1509 | 6186 | 0.8879 | 0.7626 | 0.8879 | 0.9423 |
| 0.0421 | 9.1538 | 6188 | 0.8917 | 0.7626 | 0.8917 | 0.9443 |
| 0.0421 | 9.1568 | 6190 | 0.8925 | 0.7626 | 0.8925 | 0.9447 |
| 0.0421 | 9.1598 | 6192 | 0.8850 | 0.7626 | 0.8850 | 0.9407 |
| 0.0421 | 9.1627 | 6194 | 0.8738 | 0.7626 | 0.8738 | 0.9347 |
| 0.0421 | 9.1657 | 6196 | 0.8639 | 0.7520 | 0.8639 | 0.9295 |
| 0.0421 | 9.1686 | 6198 | 0.8589 | 0.7520 | 0.8589 | 0.9268 |
| 0.0421 | 9.1716 | 6200 | 0.8600 | 0.7520 | 0.8600 | 0.9274 |
| 0.0421 | 9.1746 | 6202 | 0.8600 | 0.7520 | 0.8600 | 0.9273 |
| 0.0421 | 9.1775 | 6204 | 0.8632 | 0.7520 | 0.8632 | 0.9291 |
| 0.0421 | 9.1805 | 6206 | 0.8703 | 0.7626 | 0.8703 | 0.9329 |
| 0.0421 | 9.1834 | 6208 | 0.8744 | 0.7626 | 0.8744 | 0.9351 |
| 0.0421 | 9.1864 | 6210 | 0.8815 | 0.7626 | 0.8815 | 0.9389 |
| 0.0421 | 9.1893 | 6212 | 0.8789 | 0.7626 | 0.8789 | 0.9375 |
| 0.0421 | 9.1923 | 6214 | 0.8772 | 0.7626 | 0.8772 | 0.9366 |
| 0.0421 | 9.1953 | 6216 | 0.8751 | 0.7626 | 0.8751 | 0.9355 |
| 0.0421 | 9.1982 | 6218 | 0.8819 | 0.7626 | 0.8819 | 0.9391 |
| 0.0421 | 9.2012 | 6220 | 0.8874 | 0.7626 | 0.8874 | 0.9420 |
| 0.0421 | 9.2041 | 6222 | 0.8908 | 0.7626 | 0.8908 | 0.9438 |
| 0.0421 | 9.2071 | 6224 | 0.8948 | 0.7626 | 0.8948 | 0.9459 |
| 0.0421 | 9.2101 | 6226 | 0.8974 | 0.7626 | 0.8974 | 0.9473 |
| 0.0421 | 9.2130 | 6228 | 0.9037 | 0.7626 | 0.9037 | 0.9506 |
| 0.0421 | 9.2160 | 6230 | 0.9045 | 0.7626 | 0.9045 | 0.9511 |
| 0.0421 | 9.2189 | 6232 | 0.9017 | 0.7626 | 0.9017 | 0.9496 |
| 0.0421 | 9.2219 | 6234 | 0.9046 | 0.7626 | 0.9046 | 0.9511 |
| 0.0421 | 9.2249 | 6236 | 0.9052 | 0.7626 | 0.9052 | 0.9514 |
| 0.0421 | 9.2278 | 6238 | 0.9068 | 0.7626 | 0.9068 | 0.9522 |
| 0.0421 | 9.2308 | 6240 | 0.9016 | 0.7626 | 0.9016 | 0.9495 |
| 0.0421 | 9.2337 | 6242 | 0.9034 | 0.7626 | 0.9034 | 0.9504 |
| 0.0421 | 9.2367 | 6244 | 0.9084 | 0.7626 | 0.9084 | 0.9531 |
| 0.0421 | 9.2396 | 6246 | 0.9057 | 0.7626 | 0.9057 | 0.9517 |
| 0.0421 | 9.2426 | 6248 | 0.9017 | 0.7626 | 0.9017 | 0.9496 |
| 0.0421 | 9.2456 | 6250 | 0.8932 | 0.7626 | 0.8932 | 0.9451 |
| 0.0421 | 9.2485 | 6252 | 0.8886 | 0.7626 | 0.8886 | 0.9427 |
| 0.0421 | 9.2515 | 6254 | 0.8818 | 0.7626 | 0.8818 | 0.9390 |
| 0.0421 | 9.2544 | 6256 | 0.8753 | 0.7626 | 0.8753 | 0.9356 |
| 0.0421 | 9.2574 | 6258 | 0.8680 | 0.7626 | 0.8680 | 0.9317 |
| 0.0421 | 9.2604 | 6260 | 0.8568 | 0.7520 | 0.8568 | 0.9256 |
| 0.0421 | 9.2633 | 6262 | 0.8431 | 0.7520 | 0.8431 | 0.9182 |
| 0.0421 | 9.2663 | 6264 | 0.8376 | 0.7520 | 0.8376 | 0.9152 |
| 0.0421 | 9.2692 | 6266 | 0.8333 | 0.7520 | 0.8333 | 0.9129 |
| 0.0421 | 9.2722 | 6268 | 0.8260 | 0.7520 | 0.8260 | 0.9088 |
| 0.0421 | 9.2751 | 6270 | 0.8147 | 0.7520 | 0.8147 | 0.9026 |
| 0.0421 | 9.2781 | 6272 | 0.8048 | 0.7520 | 0.8048 | 0.8971 |
| 0.0421 | 9.2811 | 6274 | 0.7998 | 0.7520 | 0.7998 | 0.8943 |
| 0.0421 | 9.2840 | 6276 | 0.7937 | 0.7520 | 0.7937 | 0.8909 |
| 0.0421 | 9.2870 | 6278 | 0.7861 | 0.7520 | 0.7861 | 0.8866 |
| 0.0421 | 9.2899 | 6280 | 0.7817 | 0.7520 | 0.7817 | 0.8841 |
| 0.0421 | 9.2929 | 6282 | 0.7823 | 0.7520 | 0.7823 | 0.8845 |
| 0.0421 | 9.2959 | 6284 | 0.7869 | 0.7520 | 0.7869 | 0.8871 |
| 0.0421 | 9.2988 | 6286 | 0.7958 | 0.7520 | 0.7958 | 0.8921 |
| 0.0421 | 9.3018 | 6288 | 0.8121 | 0.7520 | 0.8121 | 0.9012 |
| 0.0421 | 9.3047 | 6290 | 0.8248 | 0.7520 | 0.8248 | 0.9082 |
| 0.0421 | 9.3077 | 6292 | 0.8408 | 0.7520 | 0.8408 | 0.9170 |
| 0.0421 | 9.3107 | 6294 | 0.8551 | 0.7520 | 0.8551 | 0.9247 |
| 0.0421 | 9.3136 | 6296 | 0.8731 | 0.7626 | 0.8731 | 0.9344 |
| 0.0421 | 9.3166 | 6298 | 0.8849 | 0.7626 | 0.8849 | 0.9407 |
| 0.0421 | 9.3195 | 6300 | 0.8898 | 0.7626 | 0.8898 | 0.9433 |
| 0.0421 | 9.3225 | 6302 | 0.8866 | 0.7626 | 0.8866 | 0.9416 |
| 0.0421 | 9.3254 | 6304 | 0.8769 | 0.7626 | 0.8769 | 0.9364 |
| 0.0421 | 9.3284 | 6306 | 0.8707 | 0.7626 | 0.8707 | 0.9331 |
| 0.0421 | 9.3314 | 6308 | 0.8713 | 0.7626 | 0.8713 | 0.9335 |
| 0.0421 | 9.3343 | 6310 | 0.8714 | 0.7626 | 0.8714 | 0.9335 |
| 0.0421 | 9.3373 | 6312 | 0.8686 | 0.7626 | 0.8686 | 0.9320 |
| 0.0421 | 9.3402 | 6314 | 0.8631 | 0.7626 | 0.8631 | 0.9290 |
| 0.0421 | 9.3432 | 6316 | 0.8658 | 0.7626 | 0.8658 | 0.9305 |
| 0.0421 | 9.3462 | 6318 | 0.8680 | 0.7626 | 0.8680 | 0.9317 |
| 0.0421 | 9.3491 | 6320 | 0.8677 | 0.7626 | 0.8677 | 0.9315 |
| 0.0421 | 9.3521 | 6322 | 0.8719 | 0.7626 | 0.8719 | 0.9338 |
| 0.0421 | 9.3550 | 6324 | 0.8810 | 0.7626 | 0.8810 | 0.9386 |
| 0.0421 | 9.3580 | 6326 | 0.8879 | 0.7626 | 0.8879 | 0.9423 |
| 0.0421 | 9.3609 | 6328 | 0.8902 | 0.7626 | 0.8902 | 0.9435 |
| 0.0421 | 9.3639 | 6330 | 0.8946 | 0.7626 | 0.8946 | 0.9458 |
| 0.0421 | 9.3669 | 6332 | 0.9060 | 0.7626 | 0.9060 | 0.9518 |
| 0.0421 | 9.3698 | 6334 | 0.9116 | 0.7626 | 0.9116 | 0.9548 |
| 0.0421 | 9.3728 | 6336 | 0.9144 | 0.7626 | 0.9144 | 0.9563 |
| 0.0421 | 9.3757 | 6338 | 0.9215 | 0.7626 | 0.9215 | 0.9599 |
| 0.0421 | 9.3787 | 6340 | 0.9235 | 0.7626 | 0.9235 | 0.9610 |
| 0.0421 | 9.3817 | 6342 | 0.9187 | 0.7626 | 0.9187 | 0.9585 |
| 0.0421 | 9.3846 | 6344 | 0.9182 | 0.7626 | 0.9182 | 0.9582 |
| 0.0421 | 9.3876 | 6346 | 0.9131 | 0.7626 | 0.9131 | 0.9556 |
| 0.0421 | 9.3905 | 6348 | 0.9036 | 0.7626 | 0.9036 | 0.9506 |
| 0.0421 | 9.3935 | 6350 | 0.8933 | 0.7626 | 0.8933 | 0.9451 |
| 0.0421 | 9.3964 | 6352 | 0.8838 | 0.7626 | 0.8838 | 0.9401 |
| 0.0421 | 9.3994 | 6354 | 0.8819 | 0.7296 | 0.8819 | 0.9391 |
| 0.0421 | 9.4024 | 6356 | 0.8750 | 0.7296 | 0.8750 | 0.9354 |
| 0.0421 | 9.4053 | 6358 | 0.8630 | 0.7296 | 0.8630 | 0.9290 |
| 0.0421 | 9.4083 | 6360 | 0.8540 | 0.7296 | 0.8540 | 0.9241 |
| 0.0421 | 9.4112 | 6362 | 0.8482 | 0.7296 | 0.8482 | 0.9210 |
| 0.0421 | 9.4142 | 6364 | 0.8467 | 0.7296 | 0.8467 | 0.9202 |
| 0.0421 | 9.4172 | 6366 | 0.8479 | 0.7296 | 0.8479 | 0.9208 |
| 0.0421 | 9.4201 | 6368 | 0.8523 | 0.7626 | 0.8523 | 0.9232 |
| 0.0421 | 9.4231 | 6370 | 0.8550 | 0.7626 | 0.8550 | 0.9247 |
| 0.0421 | 9.4260 | 6372 | 0.8599 | 0.7626 | 0.8599 | 0.9273 |
| 0.0421 | 9.4290 | 6374 | 0.8673 | 0.7626 | 0.8673 | 0.9313 |
| 0.0421 | 9.4320 | 6376 | 0.8761 | 0.7626 | 0.8761 | 0.9360 |
| 0.0421 | 9.4349 | 6378 | 0.8820 | 0.7626 | 0.8820 | 0.9391 |
| 0.0421 | 9.4379 | 6380 | 0.8844 | 0.7626 | 0.8844 | 0.9405 |
| 0.0421 | 9.4408 | 6382 | 0.8914 | 0.7626 | 0.8914 | 0.9441 |
| 0.0421 | 9.4438 | 6384 | 0.8981 | 0.7626 | 0.8981 | 0.9477 |
| 0.0421 | 9.4467 | 6386 | 0.9086 | 0.7626 | 0.9086 | 0.9532 |
| 0.0421 | 9.4497 | 6388 | 0.9179 | 0.7840 | 0.9179 | 0.9581 |
| 0.0421 | 9.4527 | 6390 | 0.9252 | 0.7840 | 0.9252 | 0.9619 |
| 0.0421 | 9.4556 | 6392 | 0.9306 | 0.7840 | 0.9306 | 0.9647 |
| 0.0421 | 9.4586 | 6394 | 0.9316 | 0.7840 | 0.9316 | 0.9652 |
| 0.0421 | 9.4615 | 6396 | 0.9314 | 0.7840 | 0.9314 | 0.9651 |
| 0.0421 | 9.4645 | 6398 | 0.9303 | 0.7840 | 0.9303 | 0.9645 |
| 0.0421 | 9.4675 | 6400 | 0.9237 | 0.7840 | 0.9237 | 0.9611 |
| 0.0421 | 9.4704 | 6402 | 0.9148 | 0.7840 | 0.9148 | 0.9564 |
| 0.0421 | 9.4734 | 6404 | 0.9101 | 0.7840 | 0.9101 | 0.9540 |
| 0.0421 | 9.4763 | 6406 | 0.9045 | 0.7840 | 0.9045 | 0.9511 |
| 0.0421 | 9.4793 | 6408 | 0.8941 | 0.7626 | 0.8941 | 0.9456 |
| 0.0421 | 9.4822 | 6410 | 0.8870 | 0.7626 | 0.8870 | 0.9418 |
| 0.0421 | 9.4852 | 6412 | 0.8806 | 0.7626 | 0.8806 | 0.9384 |
| 0.0421 | 9.4882 | 6414 | 0.8741 | 0.7626 | 0.8741 | 0.9350 |
| 0.0421 | 9.4911 | 6416 | 0.8703 | 0.7626 | 0.8703 | 0.9329 |
| 0.0421 | 9.4941 | 6418 | 0.8661 | 0.7520 | 0.8661 | 0.9307 |
| 0.0421 | 9.4970 | 6420 | 0.8612 | 0.7520 | 0.8612 | 0.9280 |
| 0.0421 | 9.5 | 6422 | 0.8592 | 0.7520 | 0.8592 | 0.9270 |
| 0.0421 | 9.5030 | 6424 | 0.8608 | 0.7520 | 0.8608 | 0.9278 |
| 0.0421 | 9.5059 | 6426 | 0.8628 | 0.7520 | 0.8628 | 0.9288 |
| 0.0421 | 9.5089 | 6428 | 0.8607 | 0.7520 | 0.8607 | 0.9277 |
| 0.0421 | 9.5118 | 6430 | 0.8591 | 0.7520 | 0.8591 | 0.9269 |
| 0.0421 | 9.5148 | 6432 | 0.8587 | 0.7520 | 0.8587 | 0.9266 |
| 0.0421 | 9.5178 | 6434 | 0.8603 | 0.7520 | 0.8603 | 0.9275 |
| 0.0421 | 9.5207 | 6436 | 0.8595 | 0.7520 | 0.8595 | 0.9271 |
| 0.0421 | 9.5237 | 6438 | 0.8604 | 0.7520 | 0.8604 | 0.9276 |
| 0.0421 | 9.5266 | 6440 | 0.8625 | 0.7520 | 0.8625 | 0.9287 |
| 0.0421 | 9.5296 | 6442 | 0.8643 | 0.7520 | 0.8643 | 0.9297 |
| 0.0421 | 9.5325 | 6444 | 0.8617 | 0.7520 | 0.8617 | 0.9283 |
| 0.0421 | 9.5355 | 6446 | 0.8560 | 0.7520 | 0.8560 | 0.9252 |
| 0.0421 | 9.5385 | 6448 | 0.8500 | 0.7520 | 0.8500 | 0.9219 |
| 0.0421 | 9.5414 | 6450 | 0.8438 | 0.7520 | 0.8438 | 0.9186 |
| 0.0421 | 9.5444 | 6452 | 0.8352 | 0.7520 | 0.8352 | 0.9139 |
| 0.0421 | 9.5473 | 6454 | 0.8298 | 0.7520 | 0.8298 | 0.9109 |
| 0.0421 | 9.5503 | 6456 | 0.8256 | 0.7520 | 0.8256 | 0.9086 |
| 0.0421 | 9.5533 | 6458 | 0.8210 | 0.7520 | 0.8210 | 0.9061 |
| 0.0421 | 9.5562 | 6460 | 0.8174 | 0.7520 | 0.8174 | 0.9041 |
| 0.0421 | 9.5592 | 6462 | 0.8148 | 0.7520 | 0.8148 | 0.9026 |
| 0.0421 | 9.5621 | 6464 | 0.8131 | 0.7520 | 0.8131 | 0.9017 |
| 0.0421 | 9.5651 | 6466 | 0.8089 | 0.7520 | 0.8089 | 0.8994 |
| 0.0421 | 9.5680 | 6468 | 0.8031 | 0.7520 | 0.8031 | 0.8961 |
| 0.0421 | 9.5710 | 6470 | 0.8008 | 0.7520 | 0.8008 | 0.8949 |
| 0.0421 | 9.5740 | 6472 | 0.8021 | 0.7520 | 0.8021 | 0.8956 |
| 0.0421 | 9.5769 | 6474 | 0.8041 | 0.7520 | 0.8041 | 0.8967 |
| 0.0421 | 9.5799 | 6476 | 0.8094 | 0.7520 | 0.8094 | 0.8997 |
| 0.0421 | 9.5828 | 6478 | 0.8120 | 0.7520 | 0.8120 | 0.9011 |
| 0.0421 | 9.5858 | 6480 | 0.8132 | 0.7520 | 0.8132 | 0.9018 |
| 0.0421 | 9.5888 | 6482 | 0.8156 | 0.7520 | 0.8156 | 0.9031 |
| 0.0421 | 9.5917 | 6484 | 0.8188 | 0.7520 | 0.8188 | 0.9049 |
| 0.0421 | 9.5947 | 6486 | 0.8234 | 0.7520 | 0.8234 | 0.9074 |
| 0.0421 | 9.5976 | 6488 | 0.8251 | 0.7520 | 0.8251 | 0.9084 |
| 0.0421 | 9.6006 | 6490 | 0.8278 | 0.7520 | 0.8278 | 0.9098 |
| 0.0421 | 9.6036 | 6492 | 0.8346 | 0.7520 | 0.8346 | 0.9135 |
| 0.0421 | 9.6065 | 6494 | 0.8428 | 0.7520 | 0.8428 | 0.9180 |
| 0.0421 | 9.6095 | 6496 | 0.8481 | 0.7520 | 0.8481 | 0.9209 |
| 0.0421 | 9.6124 | 6498 | 0.8537 | 0.7520 | 0.8537 | 0.9240 |
| 0.0386 | 9.6154 | 6500 | 0.8588 | 0.7520 | 0.8588 | 0.9267 |
| 0.0386 | 9.6183 | 6502 | 0.8598 | 0.7520 | 0.8598 | 0.9272 |
| 0.0386 | 9.6213 | 6504 | 0.8580 | 0.7520 | 0.8580 | 0.9263 |
| 0.0386 | 9.6243 | 6506 | 0.8534 | 0.7520 | 0.8534 | 0.9238 |
| 0.0386 | 9.6272 | 6508 | 0.8458 | 0.7520 | 0.8458 | 0.9197 |
| 0.0386 | 9.6302 | 6510 | 0.8402 | 0.7520 | 0.8402 | 0.9166 |
| 0.0386 | 9.6331 | 6512 | 0.8318 | 0.7520 | 0.8318 | 0.9120 |
| 0.0386 | 9.6361 | 6514 | 0.8246 | 0.7520 | 0.8246 | 0.9081 |
| 0.0386 | 9.6391 | 6516 | 0.8170 | 0.7520 | 0.8170 | 0.9039 |
| 0.0386 | 9.6420 | 6518 | 0.8094 | 0.7520 | 0.8094 | 0.8997 |
| 0.0386 | 9.6450 | 6520 | 0.8030 | 0.7520 | 0.8030 | 0.8961 |
| 0.0386 | 9.6479 | 6522 | 0.7985 | 0.7520 | 0.7985 | 0.8936 |
| 0.0386 | 9.6509 | 6524 | 0.7963 | 0.7520 | 0.7963 | 0.8924 |
| 0.0386 | 9.6538 | 6526 | 0.7980 | 0.7520 | 0.7980 | 0.8933 |
| 0.0386 | 9.6568 | 6528 | 0.7996 | 0.7520 | 0.7996 | 0.8942 |
| 0.0386 | 9.6598 | 6530 | 0.8030 | 0.7520 | 0.8030 | 0.8961 |
| 0.0386 | 9.6627 | 6532 | 0.8060 | 0.7520 | 0.8060 | 0.8978 |
| 0.0386 | 9.6657 | 6534 | 0.8121 | 0.7520 | 0.8121 | 0.9012 |
| 0.0386 | 9.6686 | 6536 | 0.8193 | 0.7520 | 0.8193 | 0.9052 |
| 0.0386 | 9.6716 | 6538 | 0.8252 | 0.7520 | 0.8252 | 0.9084 |
| 0.0386 | 9.6746 | 6540 | 0.8306 | 0.7520 | 0.8306 | 0.9114 |
| 0.0386 | 9.6775 | 6542 | 0.8352 | 0.7520 | 0.8352 | 0.9139 |
| 0.0386 | 9.6805 | 6544 | 0.8366 | 0.7520 | 0.8366 | 0.9147 |
| 0.0386 | 9.6834 | 6546 | 0.8378 | 0.7520 | 0.8378 | 0.9153 |
| 0.0386 | 9.6864 | 6548 | 0.8390 | 0.7520 | 0.8390 | 0.9160 |
| 0.0386 | 9.6893 | 6550 | 0.8383 | 0.7520 | 0.8383 | 0.9156 |
| 0.0386 | 9.6923 | 6552 | 0.8369 | 0.7520 | 0.8369 | 0.9148 |
| 0.0386 | 9.6953 | 6554 | 0.8380 | 0.7520 | 0.8380 | 0.9154 |
| 0.0386 | 9.6982 | 6556 | 0.8397 | 0.7520 | 0.8397 | 0.9163 |
| 0.0386 | 9.7012 | 6558 | 0.8398 | 0.7520 | 0.8398 | 0.9164 |
| 0.0386 | 9.7041 | 6560 | 0.8398 | 0.7520 | 0.8398 | 0.9164 |
| 0.0386 | 9.7071 | 6562 | 0.8428 | 0.7520 | 0.8428 | 0.9180 |
| 0.0386 | 9.7101 | 6564 | 0.8465 | 0.7520 | 0.8465 | 0.9201 |
| 0.0386 | 9.7130 | 6566 | 0.8485 | 0.7520 | 0.8485 | 0.9212 |
| 0.0386 | 9.7160 | 6568 | 0.8489 | 0.7520 | 0.8489 | 0.9214 |
| 0.0386 | 9.7189 | 6570 | 0.8491 | 0.7520 | 0.8491 | 0.9215 |
| 0.0386 | 9.7219 | 6572 | 0.8471 | 0.7520 | 0.8471 | 0.9204 |
| 0.0386 | 9.7249 | 6574 | 0.8435 | 0.7520 | 0.8435 | 0.9184 |
| 0.0386 | 9.7278 | 6576 | 0.8421 | 0.7520 | 0.8421 | 0.9176 |
| 0.0386 | 9.7308 | 6578 | 0.8428 | 0.7520 | 0.8428 | 0.9181 |
| 0.0386 | 9.7337 | 6580 | 0.8436 | 0.7520 | 0.8436 | 0.9185 |
| 0.0386 | 9.7367 | 6582 | 0.8471 | 0.7520 | 0.8471 | 0.9204 |
| 0.0386 | 9.7396 | 6584 | 0.8525 | 0.7520 | 0.8525 | 0.9233 |
| 0.0386 | 9.7426 | 6586 | 0.8558 | 0.7626 | 0.8558 | 0.9251 |
| 0.0386 | 9.7456 | 6588 | 0.8591 | 0.7626 | 0.8591 | 0.9269 |
| 0.0386 | 9.7485 | 6590 | 0.8602 | 0.7626 | 0.8602 | 0.9275 |
| 0.0386 | 9.7515 | 6592 | 0.8599 | 0.7626 | 0.8599 | 0.9273 |
| 0.0386 | 9.7544 | 6594 | 0.8598 | 0.7626 | 0.8598 | 0.9273 |
| 0.0386 | 9.7574 | 6596 | 0.8598 | 0.7626 | 0.8598 | 0.9273 |
| 0.0386 | 9.7604 | 6598 | 0.8610 | 0.7626 | 0.8610 | 0.9279 |
| 0.0386 | 9.7633 | 6600 | 0.8608 | 0.7626 | 0.8608 | 0.9278 |
| 0.0386 | 9.7663 | 6602 | 0.8604 | 0.7626 | 0.8604 | 0.9276 |
| 0.0386 | 9.7692 | 6604 | 0.8598 | 0.7626 | 0.8598 | 0.9273 |
| 0.0386 | 9.7722 | 6606 | 0.8584 | 0.7626 | 0.8584 | 0.9265 |
| 0.0386 | 9.7751 | 6608 | 0.8568 | 0.7626 | 0.8568 | 0.9256 |
| 0.0386 | 9.7781 | 6610 | 0.8557 | 0.7626 | 0.8557 | 0.9251 |
| 0.0386 | 9.7811 | 6612 | 0.8564 | 0.7626 | 0.8564 | 0.9254 |
| 0.0386 | 9.7840 | 6614 | 0.8564 | 0.7626 | 0.8564 | 0.9254 |
| 0.0386 | 9.7870 | 6616 | 0.8573 | 0.7626 | 0.8573 | 0.9259 |
| 0.0386 | 9.7899 | 6618 | 0.8578 | 0.7626 | 0.8578 | 0.9262 |
| 0.0386 | 9.7929 | 6620 | 0.8593 | 0.7626 | 0.8593 | 0.9270 |
| 0.0386 | 9.7959 | 6622 | 0.8602 | 0.7626 | 0.8602 | 0.9275 |
| 0.0386 | 9.7988 | 6624 | 0.8619 | 0.7626 | 0.8619 | 0.9284 |
| 0.0386 | 9.8018 | 6626 | 0.8640 | 0.7626 | 0.8640 | 0.9295 |
| 0.0386 | 9.8047 | 6628 | 0.8656 | 0.7626 | 0.8656 | 0.9304 |
| 0.0386 | 9.8077 | 6630 | 0.8680 | 0.7626 | 0.8680 | 0.9317 |
| 0.0386 | 9.8107 | 6632 | 0.8710 | 0.7626 | 0.8710 | 0.9333 |
| 0.0386 | 9.8136 | 6634 | 0.8731 | 0.7626 | 0.8731 | 0.9344 |
| 0.0386 | 9.8166 | 6636 | 0.8732 | 0.7626 | 0.8732 | 0.9344 |
| 0.0386 | 9.8195 | 6638 | 0.8716 | 0.7626 | 0.8716 | 0.9336 |
| 0.0386 | 9.8225 | 6640 | 0.8690 | 0.7626 | 0.8690 | 0.9322 |
| 0.0386 | 9.8254 | 6642 | 0.8662 | 0.7626 | 0.8662 | 0.9307 |
| 0.0386 | 9.8284 | 6644 | 0.8645 | 0.7626 | 0.8645 | 0.9298 |
| 0.0386 | 9.8314 | 6646 | 0.8640 | 0.7626 | 0.8640 | 0.9295 |
| 0.0386 | 9.8343 | 6648 | 0.8637 | 0.7626 | 0.8637 | 0.9293 |
| 0.0386 | 9.8373 | 6650 | 0.8633 | 0.7626 | 0.8633 | 0.9291 |
| 0.0386 | 9.8402 | 6652 | 0.8624 | 0.7626 | 0.8624 | 0.9287 |
| 0.0386 | 9.8432 | 6654 | 0.8605 | 0.7626 | 0.8605 | 0.9276 |
| 0.0386 | 9.8462 | 6656 | 0.8576 | 0.7626 | 0.8576 | 0.9261 |
| 0.0386 | 9.8491 | 6658 | 0.8559 | 0.7626 | 0.8559 | 0.9252 |
| 0.0386 | 9.8521 | 6660 | 0.8548 | 0.7626 | 0.8548 | 0.9245 |
| 0.0386 | 9.8550 | 6662 | 0.8536 | 0.7626 | 0.8536 | 0.9239 |
| 0.0386 | 9.8580 | 6664 | 0.8524 | 0.7626 | 0.8524 | 0.9232 |
| 0.0386 | 9.8609 | 6666 | 0.8519 | 0.7626 | 0.8519 | 0.9230 |
| 0.0386 | 9.8639 | 6668 | 0.8514 | 0.7626 | 0.8514 | 0.9227 |
| 0.0386 | 9.8669 | 6670 | 0.8519 | 0.7626 | 0.8519 | 0.9230 |
| 0.0386 | 9.8698 | 6672 | 0.8522 | 0.7626 | 0.8522 | 0.9232 |
| 0.0386 | 9.8728 | 6674 | 0.8527 | 0.7626 | 0.8527 | 0.9234 |
| 0.0386 | 9.8757 | 6676 | 0.8535 | 0.7626 | 0.8535 | 0.9238 |
| 0.0386 | 9.8787 | 6678 | 0.8537 | 0.7626 | 0.8537 | 0.9240 |
| 0.0386 | 9.8817 | 6680 | 0.8550 | 0.7626 | 0.8550 | 0.9246 |
| 0.0386 | 9.8846 | 6682 | 0.8564 | 0.7626 | 0.8564 | 0.9254 |
| 0.0386 | 9.8876 | 6684 | 0.8584 | 0.7626 | 0.8584 | 0.9265 |
| 0.0386 | 9.8905 | 6686 | 0.8605 | 0.7626 | 0.8605 | 0.9276 |
| 0.0386 | 9.8935 | 6688 | 0.8613 | 0.7626 | 0.8613 | 0.9281 |
| 0.0386 | 9.8964 | 6690 | 0.8611 | 0.7626 | 0.8611 | 0.9280 |
| 0.0386 | 9.8994 | 6692 | 0.8607 | 0.7626 | 0.8607 | 0.9277 |
| 0.0386 | 9.9024 | 6694 | 0.8609 | 0.7626 | 0.8609 | 0.9278 |
| 0.0386 | 9.9053 | 6696 | 0.8602 | 0.7626 | 0.8602 | 0.9275 |
| 0.0386 | 9.9083 | 6698 | 0.8588 | 0.7626 | 0.8588 | 0.9267 |
| 0.0386 | 9.9112 | 6700 | 0.8575 | 0.7626 | 0.8575 | 0.9260 |
| 0.0386 | 9.9142 | 6702 | 0.8566 | 0.7626 | 0.8566 | 0.9255 |
| 0.0386 | 9.9172 | 6704 | 0.8555 | 0.7626 | 0.8555 | 0.9249 |
| 0.0386 | 9.9201 | 6706 | 0.8540 | 0.7626 | 0.8540 | 0.9241 |
| 0.0386 | 9.9231 | 6708 | 0.8533 | 0.7626 | 0.8533 | 0.9237 |
| 0.0386 | 9.9260 | 6710 | 0.8524 | 0.7626 | 0.8524 | 0.9232 |
| 0.0386 | 9.9290 | 6712 | 0.8510 | 0.7626 | 0.8510 | 0.9225 |
| 0.0386 | 9.9320 | 6714 | 0.8504 | 0.7626 | 0.8504 | 0.9222 |
| 0.0386 | 9.9349 | 6716 | 0.8495 | 0.7626 | 0.8495 | 0.9217 |
| 0.0386 | 9.9379 | 6718 | 0.8485 | 0.7626 | 0.8485 | 0.9212 |
| 0.0386 | 9.9408 | 6720 | 0.8479 | 0.7626 | 0.8479 | 0.9208 |
| 0.0386 | 9.9438 | 6722 | 0.8477 | 0.7626 | 0.8477 | 0.9207 |
| 0.0386 | 9.9467 | 6724 | 0.8477 | 0.7626 | 0.8477 | 0.9207 |
| 0.0386 | 9.9497 | 6726 | 0.8481 | 0.7626 | 0.8481 | 0.9209 |
| 0.0386 | 9.9527 | 6728 | 0.8487 | 0.7626 | 0.8487 | 0.9212 |
| 0.0386 | 9.9556 | 6730 | 0.8490 | 0.7626 | 0.8490 | 0.9214 |
| 0.0386 | 9.9586 | 6732 | 0.8495 | 0.7626 | 0.8495 | 0.9217 |
| 0.0386 | 9.9615 | 6734 | 0.8501 | 0.7626 | 0.8501 | 0.9220 |
| 0.0386 | 9.9645 | 6736 | 0.8509 | 0.7626 | 0.8509 | 0.9224 |
| 0.0386 | 9.9675 | 6738 | 0.8514 | 0.7626 | 0.8514 | 0.9227 |
| 0.0386 | 9.9704 | 6740 | 0.8518 | 0.7626 | 0.8518 | 0.9229 |
| 0.0386 | 9.9734 | 6742 | 0.8520 | 0.7626 | 0.8520 | 0.9230 |
| 0.0386 | 9.9763 | 6744 | 0.8523 | 0.7626 | 0.8523 | 0.9232 |
| 0.0386 | 9.9793 | 6746 | 0.8524 | 0.7626 | 0.8524 | 0.9232 |
| 0.0386 | 9.9822 | 6748 | 0.8523 | 0.7626 | 0.8523 | 0.9232 |
| 0.0386 | 9.9852 | 6750 | 0.8523 | 0.7626 | 0.8523 | 0.9232 |
| 0.0386 | 9.9882 | 6752 | 0.8525 | 0.7626 | 0.8525 | 0.9233 |
| 0.0386 | 9.9911 | 6754 | 0.8525 | 0.7626 | 0.8525 | 0.9233 |
| 0.0386 | 9.9941 | 6756 | 0.8526 | 0.7626 | 0.8526 | 0.9234 |
| 0.0386 | 9.9970 | 6758 | 0.8527 | 0.7626 | 0.8527 | 0.9234 |
| 0.0386 | 10.0 | 6760 | 0.8527 | 0.7626 | 0.8527 | 0.9234 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf | RichardErkhov | 2024-11-14T07:14:31Z | 9 | 0 | null | [
"gguf",
"arxiv:2309.09530",
"arxiv:2406.14491",
"endpoints_compatible",
"region:us"
] | null | 2024-11-13T23:03:42Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
law-LLM-13B - GGUF
- Model creator: https://huggingface.co/AdaptLLM/
- Original model: https://huggingface.co/AdaptLLM/law-LLM-13B/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [law-LLM-13B.Q2_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q2_K.gguf) | Q2_K | 4.52GB |
| [law-LLM-13B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K_S.gguf) | Q3_K_S | 5.27GB |
| [law-LLM-13B.Q3_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K.gguf) | Q3_K | 5.9GB |
| [law-LLM-13B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K_M.gguf) | Q3_K_M | 5.9GB |
| [law-LLM-13B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q3_K_L.gguf) | Q3_K_L | 6.45GB |
| [law-LLM-13B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.IQ4_XS.gguf) | IQ4_XS | 6.54GB |
| [law-LLM-13B.Q4_0.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_0.gguf) | Q4_0 | 6.86GB |
| [law-LLM-13B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.IQ4_NL.gguf) | IQ4_NL | 6.9GB |
| [law-LLM-13B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_K_S.gguf) | Q4_K_S | 6.91GB |
| [law-LLM-13B.Q4_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_K.gguf) | Q4_K | 7.33GB |
| [law-LLM-13B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_K_M.gguf) | Q4_K_M | 7.33GB |
| [law-LLM-13B.Q4_1.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q4_1.gguf) | Q4_1 | 7.61GB |
| [law-LLM-13B.Q5_0.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_0.gguf) | Q5_0 | 8.36GB |
| [law-LLM-13B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_K_S.gguf) | Q5_K_S | 8.36GB |
| [law-LLM-13B.Q5_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_K.gguf) | Q5_K | 8.6GB |
| [law-LLM-13B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_K_M.gguf) | Q5_K_M | 8.6GB |
| [law-LLM-13B.Q5_1.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q5_1.gguf) | Q5_1 | 9.1GB |
| [law-LLM-13B.Q6_K.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q6_K.gguf) | Q6_K | 9.95GB |
| [law-LLM-13B.Q8_0.gguf](https://huggingface.co/RichardErkhov/AdaptLLM_-_law-LLM-13B-gguf/blob/main/law-LLM-13B.Q8_0.gguf) | Q8_0 | 12.88GB |
Original model description:
---
language:
- en
datasets:
- Open-Orca/OpenOrca
- GAIR/lima
- WizardLM/WizardLM_evol_instruct_V2_196k
- EleutherAI/pile
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- legal
---
# Adapting LLMs to Domains via Continual Pre-Training (ICLR 2024)
This repo contains the domain-specific base model developed from **LLaMA-1-13B**, using the method in our paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
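As a rough illustration of the transformation's shape, the sketch below appends comprehension tasks to a raw passage; the `tasks` argument and the output format are hypothetical placeholders — the actual method mines these tasks from the corpus text itself:
```python
# Purely illustrative (hypothetical format): the paper derives the comprehension
# tasks automatically from the raw text; here they are passed in explicitly.
def to_reading_comprehension(passage: str, tasks: list[tuple[str, str]]) -> str:
    qa_block = "\n\n".join(f"Question: {q}\nAnswer: {a}" for q, a in tasks)
    return f"{passage}\n\n{qa_block}"

print(to_reading_comprehension(
    "An ex post facto law retroactively changes the legal consequences of past acts.",
    [("What does an ex post facto law change?",
      "The legal consequences of acts committed before its enactment.")],
))
```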
### [2024/6/21] 🤗 We release the 2nd version of AdaptLLM at [Instruction-Pretrain](https://huggingface.co/instruction-pretrain), effective for both pre-training from scratch and continual pre-training 🤗
**************************** **Updates** ****************************
* 2024/8/29: Updated [guidelines](https://huggingface.co/datasets/AdaptLLM/finance-tasks) on evaluating any 🤗Huggingface models on the domain-specific tasks
* 2024/6/22: Released the [benchmarking code](https://github.com/microsoft/LMOps/tree/main/adaptllm)
* 2024/6/21: Released the 2nd version of AdaptLLM at [Instruction-Pretrain](https://huggingface.co/instruction-pretrain)
* 2024/4/2: Released the [raw data splits (train and test)](https://huggingface.co/datasets/AdaptLLM/ConvFinQA) of all the evaluation datasets
* 2024/1/16: Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B
## 1. Domain-Specific Models
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Huggingface: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of AdaptLLM compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
### LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension texts fit this format perfectly** once transformed into multi-turn conversations. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat). A sketch of the chat prompt format follows the base-model example below.
For example, to prompt the base law model:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("AdaptLLM/law-LLM-13B")
tokenizer = AutoTokenizer.from_pretrained("AdaptLLM/law-LLM-13B", use_fast=False)
# Put your input here:
user_input = '''Question: Which of the following is false about ex post facto laws?
Options:
- They make criminal an act that was innocent when committed.
- They prescribe greater punishment for an act than was prescribed when it was done.
- They increase the evidence required to convict a person than when the act was done.
- They alter criminal offenses or punishment in a substantially prejudicial manner for the purpose of punishing a person for some past activity.
Please provide your choice first and then provide explanations if possible.'''
# Simply use your input as the prompt for base models
prompt = user_input
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).input_ids.to(model.device)
outputs = model.generate(input_ids=inputs, max_length=2048)[0]
answer_start = int(inputs.shape[-1])
pred = tokenizer.decode(outputs[answer_start:], skip_special_tokens=True)
print(pred)
```
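For the chat variants, the prompt must instead follow the LLaMA-2 chat template linked above. A minimal sketch of that single-turn format, reusing `user_input` from the example above (the system message is a placeholder):
```python
# LLaMA-2 chat template, single turn; SYSTEM_PROMPT is a placeholder.
SYSTEM_PROMPT = "You are a helpful legal assistant."
chat_prompt = f"<s>[INST] <<SYS>>\n{SYSTEM_PROMPT}\n<</SYS>>\n\n{user_input} [/INST]"
```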
### LLaMA-3-8B (💡New!)
In our recent research on [Instruction-Pretrain](https://huggingface.co/papers/2406.14491), we developed a context-based instruction synthesizer to augment the raw corpora with instruction-response pairs, **enabling Llama3-8B to match or even outperform Llama3-70B**: [Finance-Llama3-8B](https://huggingface.co/instruction-pretrain/finance-Llama3-8B), [Biomedicine-Llama3-8B](https://huggingface.co/instruction-pretrain/medicine-Llama3-8B).
## 2. Domain-Specific Tasks
### Pre-templatized Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions for the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
Note: those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
### Evaluating Any Huggingface LMs on Domain-Specific Tasks (💡New!)
You can use the following script to reproduce our results and evaluate any other Huggingface models on domain-specific tasks. Note that the script is NOT applicable to models that require specific prompt templates (e.g., Llama2-chat, Llama3-Instruct).
1). **Set Up Dependencies**
```bash
git clone https://github.com/microsoft/LMOps
cd LMOps/adaptllm
pip install -r requirements.txt
```
2). **Evaluate the Model**
```bash
# Select the domain from ['biomedicine', 'finance', 'law']
DOMAIN='law'
# Specify any Huggingface model name (Not applicable to chat models)
MODEL='AdaptLLM/law-LLM-13B'
# Model parallelization:
# - Set MODEL_PARALLEL=False if the model fits on a single GPU.
# We observe that LMs smaller than 10B always meet this requirement.
# - Set MODEL_PARALLEL=True if the model is too large and encounters OOM on a single GPU.
MODEL_PARALLEL=True
# Choose the number of GPUs from [1, 2, 4, 8]
N_GPU=2
# Whether to add a BOS token at the beginning of the prompt input:
# - Set to False for AdaptLLM.
# - Set to True for instruction-pretrain models.
# If unsure, we recommend setting it to False, as this is suitable for most LMs.
add_bos_token=False
# Run the evaluation script
bash scripts/inference.sh ${DOMAIN} ${MODEL} ${add_bos_token} ${MODEL_PARALLEL} ${N_GPU}
```
### Raw Datasets
We have also uploaded the raw training and testing splits to facilitate fine-tuning and other uses: [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt), [RCT](https://huggingface.co/datasets/AdaptLLM/RCT), [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA), [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA), [Headline](https://huggingface.co/datasets/AdaptLLM/Headline), [NER](https://huggingface.co/datasets/AdaptLLM/NER), [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
### Domain Knowledge Probing
Our pre-processed knowledge probing datasets are available at: [med_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/med_knowledge_prob) and [law_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/law_knowledge_prob)
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
|
omrudra998/fifth | omrudra998 | 2024-11-14T07:12:47Z | 91 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"conversational",
"base_model:meta-llama/Llama-3.2-1B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-1B-Instruct",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T07:09:58Z | ---
library_name: transformers
license: llama3.2
base_model: meta-llama/Llama-3.2-1B-Instruct
tags:
- generated_from_trainer
model-index:
- name: fifth
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fifth
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 60
- mixed_precision_training: Native AMP
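A hedged sketch of how these values map onto a 🤗 `TrainingArguments` configuration (`output_dir` is a placeholder; this is a reconstruction, not the exact training script):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fifth",              # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,   # effective train batch size: 4 * 8 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=60,
    fp16=True,                       # mixed precision (Native AMP)
)
```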
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
seongil-dn/e5-base-miracl-250k-answerable-bs128 | seongil-dn | 2024-11-14T07:10:24Z | 6 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"xlm-roberta",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:204133",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:intfloat/multilingual-e5-base",
"base_model:finetune:intfloat/multilingual-e5-base",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-14T07:09:31Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:204133
- loss:MultipleNegativesRankingLoss
base_model: intfloat/multilingual-e5-base
widget:
- source_sentence: 'query: 2018 글로벌인텔리전스서밋은 언제 어디서 열렸나요?'
sentences:
- 'passage: 서 원장은 이날 서울 웨스틴조선호텔에서 열린 국가안보전략연구원 주최 `2018 글로벌인텔리전스서밋`의 축하 영상메시지를 통해
"두 번의 남북정상회담에 이어 지난주에 미북 정상이 한반도의 완전한 비핵화와 평화정책에 합의했다. 정보가 왜 안보의 버팀목이자 평화의 길잡이인지
정보가 어떻게 세계 평화에 기여하는지를 제대로 보여줬다고 생각한다”고 말했다.'
- 'passage: 1999년 벨라루스의 야당 인사들은 이 단체를 콤소몰에 빗대어 "루카몰"()이라고 부르기도 하였다. 이는 루카셴코와 콤소몰의
합성어로 루카셴코 대통령의 개인 숭배를 정치적 기반으로 이용하는 청소년 단체를 뜻한다.'
- 'passage: 멕시코의 학교 교육에서는 스페인어와 영어가 필수이고 외국어로는 독일어, 프랑스어 등이 있다. 이 나라의 UNAM이라는 대학은
등록금이 원화로 단돈 500원에 이를 정도로 싸다.'
- source_sentence: 'query: 도부 철도 오고세 선의 종점은 어디인가요?'
sentences:
- 'passage: 《30 ROCK》은 미국 NBC TV의 코미디 시리즈이다. NBC방송사를 통해 방송되는 코미디쇼 《TGS》의 제작진과 그외
방송국 사람들의 이야기를 다룬다. 티나 페이, 트레이시 모건, 제인 크라코스키, 잭 맥브레이어, 스콧 애짓, 앨릭 볼드윈 등 13명의 고정
출연진 외에 매회 뜻밖의 명사들이 출연한다.'
- 'passage: 1952년 1월 18일에 대한민국 정부는 ‘인접 해양 주권에 관한 대통령 선언(평화선 선언)’을 발표하면서 이 섬을 평화선
안에 포함시켜 보호하도록 했다. 일본 측은 이에 항의하며 대한민국 측에 독도에 대한 한국 영유권을 부정하는 외교 문서를 보냈다. 이후부터 이
섬은 국제 사회에서 분쟁 지역으로 보이기 시작하였다.'
- 'passage: 도부 철도 오고세 선()은 일본의 철도 노선 하나이다. 도부 철도가 운영한다. 일본 사이타마현 사카도 시에 있는 사카도 역과
군마 현 이루마 군 오고세 정에 있는 오고세 역을 잇는다.'
- source_sentence: 'query: 묵주기도를 1단만 바치자는 평의회가 있었나요?'
sentences:
- 'passage: 이 공항은 남부 베트남의 호찌민 시(넓게 보면 동남보()를 거점으로 한다.
호찌민 시 중심부(1군)에서 7km 떨어진 떤빈 군에 위치하고 있으며, 공항에서 시내까지는 택시와 버스(152번)로 오고 갈 수 있다.'
- 'passage: 어떤 평의회에서는 심지어 묵주기도를 5단이 아닌 1단만 바치자고 했던 경우가 있는가 하면 또한 교본에 조직, 운영에 따른
규칙이 자세히 나열되고 있지만 번역이 어렵게 되어 이해하는 데 곤란했던 일 등 여러 가지 시행착오가 있었으니, 어려운 문제가 생길 때마다 기도하며
연구해서 원만히 운영함으로써 오늘날과 같은 대군단의 조직을 갖추게 된 것이다.'
- 'passage: 마르더의 차대는 체코제의 38(t) 전차 차대를 사용하여 기계적 신뢰성이 매우 높았으며 화력은 장거리에서 대부분의 경전차와
중전차를 격파하기에 충분한 것이었지만, 마르더 1과 2에서 지적된 수비의 약점들은 거의 그대로 계승되었다.'
- source_sentence: 'query: 아불 하이르는 언제 화레즘을 공격했나요?'
sentences:
- 'passage: 1616년(광해군 8년) 음력 4월 1일 형 흥안군과 함께 사간원에서 탄핵을 받았는데, 당시 사간원에서는 흥안군과 경평군이
새로운 궁궐을 짓기 위해 마련된 빈 부지에 일부 사대부들이 집을 지었다는 이유로 소란을 피우며, 당시 부지 내에 있던 사대부들의 집을 무단으로
철거하거나, 종들을 풀어 집들을 뺏었다고 보고하였다. 또 두 형제가 늙은 부녀자를 구타하고, 과부들을 능욕하는 등의 행위도 저질렀다고도 하였다.
또 1618년(광해군 10년)에는 대궐 안의 하인을 무단으로 데려가 폭행하고 감금하는 바람에, 광해군이 직접 추고하라는 전교를 내렸다. 당시
대신들은 여러 차례에 걸쳐 경평군의 삭탈 관직을 청하였으나, 광해군은 이에 대해서는 윤허하지 않았다. 그러나 경평군은 3년 뒤인 1621년(광해군
13년)에 활인서의 별제로 있던 이제인을 무자비하게 구타하는 바람에, 추고를 하라는 전교가 또 내려졌다. 또 이듬해인 1622년(광해군 14년)에는
한 금리와 그 처를 잡아다가 죽기 직전까지 구타를 하여 또 탄핵을 받았다.'
- 'passage: 위도 상으로는 태양의 복사열을 잘 받는 위치에 있지만, 고지대에 있기 때문에 기온은 연중 온화하고, 관광에는 적합한 기후를
가진다. 호수의 연평균 수온도 16.9℃로 안정되어 있다. 풍화설월의 명소인 따리 4경 중의 하나인 ‘얼하이의 달’로 호수면에 떠오르는 달의
풍치가 뛰어나다고 알려져 있다. 남안의 하관진과 북서쪽에 위치하는 호접천을 묶는 관광선이 운행되고 있다. 선내에서는 부이족의 삼도차를 제공하거나
부이족의 춤을 선보이는 것도 있다.
옛 명칭에는 엽유호, 곤미천, 시얼허(西洱河), 시얼허(西二河) 등으로 나온다. 차마고도의 출발지로 8세기에 남조국은 농업용수나 식용수 확보를
위해 이곳에 수도를 두었다고 추측된다. 또한 주위의 숲은 사슴 사냥의 장소로서 이용되고 있었다고 한다. 그 다음 937년에 개국한 대리국도
얼하이를 생활의 터전으로 이용했다.'
- 'passage: 1430 - 31년, 아불 하이르는 바카스와 함께 화레즘을 공격해 우르간즈를 점령했다. 그러나 1431년, 우즈베크인들은
도시를 포기하고 후퇴했다. 아불 하이르의 군대는 아스트라한 부근에서 자신에게 대항하는 두명의 칸에게 패배를 안겨줬다. 1435 - 36년 우즈베크
군대는 화레즘을 다시금 공격하고 몇 년 뒤에는 아스트라한을 공격했다. 1446년 초반에 아불 하이르와 그의 군대는 시르다리야 강 유역을 공격해
결국 티무르 왕조의 땅 일부를 빼앗았다. 시그나크는 아불 하이르의 새 수도가 되었고, 트란스옥시아나에 대한 그의 공격은 그 곳을 중심으로 이뤄졌다.'
- source_sentence: 'query: 주계열성의 중심핵 온도는 얼마나 될까요?'
sentences:
- 'passage: 수소 핵융합 과정은 온도에 민감하기 때문에 중심핵의 온도가 조금만 올라가도 핵융합의 강도는 막대하게 증가한다. 주계열성의
중심핵 온도는 가장 뜨거운 별과 가장 차가운 별을 비교하면 10배 정도밖에 차이가 나지 않는다. 예를 들면 O형 항성의 경우 중심 온도는 4천만
켈빈이며 어둡고 희미한 적색 왜성의 중심부 온도는 4백만 켈빈 정도에 이른다.'
- 'passage: 처음 묘소는 충청남도 서산군 보현동(普賢洞) 인좌에 안장되었다가, 뒤에 두번째 부인 순천박씨가 매장된 경기도 양주군 와부면
도곡리 산 97-1 안골(현, 남양주시 와부읍 도곡1리 안골) 금대산(金臺山)의 갓무봉 언덕으로 이장, 합장되었다. 동시에 충청북도 충주에
안장되었던 본처 대구서씨의 묘소도 이장하여 그의 묘소에 합장하였다. 중종 때의 반정공신 박원종의 묘소가 근처에 있다. 완순군 이재완 역시 그의
묘소 근처에 안장되었다. 또한 완순군의 묘 아래에는 그의 양자 이재완의 아들 이달용의 묘가 있다.'
- 'passage: 독일 형법(Strafgesetzbuch, StGB)은 모든 류의 범죄에 적용 가능한 총칙(Allgemeiner Teil)과
각각의 범죄를 서술해놓은 각칙(Besonderer Teil)으로 구성되어있다. 다른 법률에서도 다른 종류의 범죄들을 서술하고 있기는 하지만,
총칙에서 서술된 원칙은 그런 법률들에도 적용이 가능하다. 대륙법체계를 따르기 때문에 성문화 되어있고, 판사에 의해 이해되고 적용된다. 그렇기
때문에 사안의 사실관계확정에 초점을 두며 무죄추정의 원칙이 적용된다. 특히, 독일은 헌법 103조 2항에서 죄형법정주의를 채택하고 있고, 이를
형법총칙 제1조에서 다시 한번 언급하고 있다. 즉, 형법상의 책임은 의회가 제정한 법률에 근거해야 한다고 밝히고 있다. 의회에서 제정한 법률이
어떤 행위를 범죄화하기 위한 목적으로 인용하지 않는 한, 정부의 시행령이나 규칙으로는 범죄에 대한 책임을 묻기에 충분하지 않다. 형법총칙 제11조
5항에 보면 위법한 행위란 형법상의 구성요건을 실행하는 행위라고 한다. 즉, 범죄가 성립되려면 [범죄의 성립요건]을 만족해야 하며 이에는 구성요건
해당성, 위법성, 유책성(책임)이 있다. 특히 형법총칙 제16조에 ''구성요건적 착오''를 둠으로써 범죄성립 요건을 만족하였더라도 구성요건을
착오한 자는 죄를 면제해주거나 경감해줄 수 있도록 조치하였다. 형벌의 종류에는 자유형, 벌금형, 재산형, 부가형의 4가지가 존재한다. 제2차
세계대전 이후 형법을 부분개정하는 과정에서, 1949년 사형 제도가 폐지되었고, 1951년 내란죄, 간첩죄 등 국가적 법익을 침해하는 범죄에
대한 개정이 이루어졌으며, 1953년 보호관찰부 집행유예 제도와 가석방 제도가 도입되는 등 형법체제가 정비되었다. 1990년 동서독 통일 이후
독일 형법은 몇몇 분야에서 동독 형법과 서독 형법으로 나눠진 이원적 법적용 상황이 형성되었고 이를 제거하기 위해 1994년 5월 동성애 규정이
폐지되었고, 제182조의 죄명을 개정하는 등, 구동독 형법의 구성요건을 수용하는 노력을 보였다.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on intfloat/multilingual-e5-base
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) on the miracl-corpus dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) <!-- at revision d13f1b27baf31030b7fd040960d60d909913633f -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- miracl-corpus
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
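A hedged sketch of what the three modules above compute when run manually with 🤗 Transformers — token embeddings, attention-masked mean pooling, then L2 normalization:
```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("seongil-dn/e5-base-miracl-250k-answerable-bs128")
model = AutoModel.from_pretrained("seongil-dn/e5-base-miracl-250k-answerable-bs128")

batch = tokenizer(["query: 주계열성의 중심핵 온도는 얼마나 될까요?"],
                  padding=True, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state       # (batch, seq_len, 768)

mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(1) / mask.sum(1)   # Pooling: masked mean
embeddings = F.normalize(embeddings, p=2, dim=1)              # Normalize: L2
```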
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/e5-base-miracl-250k-answerable-bs128")
# Run inference
sentences = [
'query: 주계열성의 중심핵 온도는 얼마나 될까요?',
'passage: 수소 핵융합 과정은 온도에 민감하기 때문에 중심핵의 온도가 조금만 올라가도 핵융합의 강도는 막대하게 증가한다. 주계열성의 중심핵 온도는 가장 뜨거운 별과 가장 차가운 별을 비교하면 10배 정도밖에 차이가 나지 않는다. 예를 들면 O형 항성의 경우 중심 온도는 4천만 켈빈이며 어둡고 희미한 적색 왜성의 중심부 온도는 4백만 켈빈 정도에 이른다.',
'passage: 처음 묘소는 충청남도 서산군 보현동(普賢洞) 인좌에 안장되었다가, 뒤에 두번째 부인 순천박씨가 매장된 경기도 양주군 와부면 도곡리 산 97-1 안골(현, 남양주시 와부읍 도곡1리 안골) 금대산(金臺山)의 갓무봉 언덕으로 이장, 합장되었다. 동시에 충청북도 충주에 안장되었던 본처 대구서씨의 묘소도 이장하여 그의 묘소에 합장하였다. 중종 때의 반정공신 박원종의 묘소가 근처에 있다. 완순군 이재완 역시 그의 묘소 근처에 안장되었다. 또한 완순군의 묘 아래에는 그의 양자 이재완의 아들 이달용의 묘가 있다.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### miracl-corpus
* Dataset: miracl-corpus
* Size: 204,133 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 11 tokens</li><li>mean: 20.61 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 47 tokens</li><li>mean: 141.05 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| anchor | positive |
|:------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>query: 별의 나이는 어떻게 측정하는가?</code> | <code>passage: 별의 나이는 토륨과 다른 성분들에 의해 만들어진 스펙트럼선들의 상대적인 힘을 측정하기 위해 초거대망원경의 자외선 분광기를 사용하여 추측한다. 선의 힘은 여러 가지 다양한 동위원소를 만들어내는데, 그것들로부터 핵우주 연대학을 사용하여 별의 나이를 짐작하는 것이다.</code> |
| <code>query: 헬크릭층에서 발견되는 드로마에오사우루스과 공룡은 무엇인가요?</code> | <code>passage: 헬크릭층에서 발견되는 수각류로는 티라노사우루스과, 오르니토미무스과, 트로오돈과,조류, 카에나그나티드과 그리고 드로마에오사우루스과의 여러 속들이 있다. 아케로랍토르가 헬크릭층에서 발견되는 유일한 드로마에오사우루스과 공룡인 것으로 보인다. 흔히 발견되는 이빨은 드로마에오사우루스나 사우로르니톨레스테스의 것으로 간주되었으나 지금은 아케로랍토르의 것으로 생각하고 있다. 티라노사우루스과의 공룡으로는 나노티라누스와 티라노사우루스가 있는데, 나노티라누스는 티라노사우루스의 어린 개체일 가능성이 있다. 오르니토미무스과로는 스트루티오미무스와 오르니토미무스가 있으며, 정식으로 보고되지 않은 공룡인 오르코미무스도 헬크릭층에서 발견된 것일 수 있다. 트로오돈과의 공룡으로는 트로오돈과 파로니코돈이 헬크릭층에서 발견된다. 조류로는 아비사우루스, 그리고 브로다비스의 여러 종, 명명되지 않은 세 종류의 헤스페르오로르니스류가 있다. 오비랍토로사우리아는 헬크릭층에서 안주와 렙토린코스 두 종류가 있다.</code> |
| <code>query: 남이 장군은 언제 죽었는가?</code> | <code>passage: 1468년 10월 24일 예종이 즉위한 지 한 달 보름만에 유자광이 남이(南怡)가 역모를 꾀하고 있다는 고변으로 시작되어 이시애의 난을 진압하는 데 공을 세웠던 남이(南怡)와 강순을 비롯해 민서, 문효량, 변영수, 변자의, 오치권 등이 역모죄로 죽음을 당한 사건이다.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters (a minimal sketch of the loss follows the JSON below):
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
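A minimal sketch of what this loss computes: scaled cosine similarities between every anchor and every passage in the batch, with each anchor's own passage as the correct class and all other in-batch passages as negatives:
```python
import torch
import torch.nn.functional as F

def mnr_loss(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0):
    # (B, B) cosine-similarity matrix between anchors and all in-batch passages
    scores = F.cosine_similarity(anchors.unsqueeze(1), positives.unsqueeze(0), dim=-1)
    labels = torch.arange(scores.size(0), device=scores.device)  # diagonal = positives
    return F.cross_entropy(scores * scale, labels)
```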
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 96
- `adam_epsilon`: 1e-07
- `warmup_ratio`: 0.1
- `fp16`: True
- `dataloader_drop_last`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 96
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-07
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.0005 | 1 | 2.1885 |
| 0.0009 | 2 | 2.2525 |
| 0.0014 | 3 | 2.2806 |
| 0.0019 | 4 | 2.2304 |
| 0.0024 | 5 | 2.3556 |
| 0.0028 | 6 | 2.2416 |
| 0.0033 | 7 | 2.158 |
| 0.0038 | 8 | 2.2726 |
| 0.0042 | 9 | 2.1155 |
| 0.0047 | 10 | 2.2428 |
| 0.0052 | 11 | 2.2109 |
| 0.0056 | 12 | 2.403 |
| 0.0061 | 13 | 2.2068 |
| 0.0066 | 14 | 2.1787 |
| 0.0071 | 15 | 2.1531 |
| 0.0075 | 16 | 2.2818 |
| 0.0080 | 17 | 2.1265 |
| 0.0085 | 18 | 2.1879 |
| 0.0089 | 19 | 2.1179 |
| 0.0094 | 20 | 2.1004 |
| 0.0099 | 21 | 2.1954 |
| 0.0103 | 22 | 1.9755 |
| 0.0108 | 23 | 2.0296 |
| 0.0113 | 24 | 2.0232 |
| 0.0118 | 25 | 1.9643 |
| 0.0122 | 26 | 1.7821 |
| 0.0127 | 27 | 1.9111 |
| 0.0132 | 28 | 1.8421 |
| 0.0136 | 29 | 1.806 |
| 0.0141 | 30 | 1.7923 |
| 0.0146 | 31 | 1.7095 |
| 0.0151 | 32 | 1.5854 |
| 0.0155 | 33 | 1.6535 |
| 0.0160 | 34 | 1.5348 |
| 0.0165 | 35 | 1.5379 |
| 0.0169 | 36 | 1.5649 |
| 0.0174 | 37 | 1.2713 |
| 0.0179 | 38 | 1.344 |
| 0.0183 | 39 | 1.1934 |
| 0.0188 | 40 | 1.2732 |
| 0.0193 | 41 | 1.1936 |
| 0.0198 | 42 | 1.2777 |
| 0.0202 | 43 | 1.1918 |
| 0.0207 | 44 | 1.1907 |
| 0.0212 | 45 | 1.0029 |
| 0.0216 | 46 | 1.085 |
| 0.0221 | 47 | 0.9491 |
| 0.0226 | 48 | 0.96 |
| 0.0230 | 49 | 0.8735 |
| 0.0235 | 50 | 0.8606 |
| 0.0240 | 51 | 0.7412 |
| 0.0245 | 52 | 0.7611 |
| 0.0249 | 53 | 0.6617 |
| 0.0254 | 54 | 0.6171 |
| 0.0259 | 55 | 0.6364 |
| 0.0263 | 56 | 0.64 |
| 0.0268 | 57 | 0.556 |
| 0.0273 | 58 | 0.5417 |
| 0.0278 | 59 | 0.4746 |
| 0.0282 | 60 | 0.4184 |
| 0.0287 | 61 | 0.543 |
| 0.0292 | 62 | 0.3564 |
| 0.0296 | 63 | 0.3815 |
| 0.0301 | 64 | 0.397 |
| 0.0306 | 65 | 0.3263 |
| 0.0310 | 66 | 0.3552 |
| 0.0315 | 67 | 0.3837 |
| 0.0320 | 68 | 0.2362 |
| 0.0325 | 69 | 0.1956 |
| 0.0329 | 70 | 0.2436 |
| 0.0334 | 71 | 0.2661 |
| 0.0339 | 72 | 0.2649 |
| 0.0343 | 73 | 0.2263 |
| 0.0348 | 74 | 0.1218 |
| 0.0353 | 75 | 0.2553 |
| 0.0357 | 76 | 0.2317 |
| 0.0362 | 77 | 0.2347 |
| 0.0367 | 78 | 0.1252 |
| 0.0372 | 79 | 0.1548 |
| 0.0376 | 80 | 0.0933 |
| 0.0381 | 81 | 0.1597 |
| 0.0386 | 82 | 0.1682 |
| 0.0390 | 83 | 0.1506 |
| 0.0395 | 84 | 0.1385 |
| 0.0400 | 85 | 0.0837 |
| 0.0405 | 86 | 0.0685 |
| 0.0409 | 87 | 0.1103 |
| 0.0414 | 88 | 0.0973 |
| 0.0419 | 89 | 0.0964 |
| 0.0423 | 90 | 0.0561 |
| 0.0428 | 91 | 0.0799 |
| 0.0433 | 92 | 0.0693 |
| 0.0437 | 93 | 0.1259 |
| 0.0442 | 94 | 0.0776 |
| 0.0447 | 95 | 0.099 |
| 0.0452 | 96 | 0.1126 |
| 0.0456 | 97 | 0.1296 |
| 0.0461 | 98 | 0.1076 |
| 0.0466 | 99 | 0.0879 |
| 0.0470 | 100 | 0.0706 |
| 0.0475 | 101 | 0.0689 |
| 0.0480 | 102 | 0.0389 |
| 0.0484 | 103 | 0.0627 |
| 0.0489 | 104 | 0.0168 |
| 0.0494 | 105 | 0.0499 |
| 0.0499 | 106 | 0.0511 |
| 0.0503 | 107 | 0.0241 |
| 0.0508 | 108 | 0.0613 |
| 0.0513 | 109 | 0.049 |
| 0.0517 | 110 | 0.0348 |
| 0.0522 | 111 | 0.041 |
| 0.0527 | 112 | 0.0554 |
| 0.0532 | 113 | 0.0815 |
| 0.0536 | 114 | 0.0465 |
| 0.0541 | 115 | 0.0309 |
| 0.0546 | 116 | 0.0505 |
| 0.0550 | 117 | 0.0246 |
| 0.0555 | 118 | 0.0641 |
| 0.0560 | 119 | 0.0321 |
| 0.0564 | 120 | 0.0828 |
| 0.0569 | 121 | 0.0674 |
| 0.0574 | 122 | 0.0577 |
| 0.0579 | 123 | 0.0957 |
| 0.0583 | 124 | 0.0436 |
| 0.0588 | 125 | 0.0427 |
| 0.0593 | 126 | 0.0325 |
| 0.0597 | 127 | 0.0376 |
| 0.0602 | 128 | 0.0455 |
| 0.0607 | 129 | 0.083 |
| 0.0611 | 130 | 0.0669 |
| 0.0616 | 131 | 0.0458 |
| 0.0621 | 132 | 0.0265 |
| 0.0626 | 133 | 0.0562 |
| 0.0630 | 134 | 0.0357 |
| 0.0635 | 135 | 0.0342 |
| 0.0640 | 136 | 0.0262 |
| 0.0644 | 137 | 0.0265 |
| 0.0649 | 138 | 0.0923 |
| 0.0654 | 139 | 0.0182 |
| 0.0659 | 140 | 0.0374 |
| 0.0663 | 141 | 0.016 |
| 0.0668 | 142 | 0.059 |
| 0.0673 | 143 | 0.0544 |
| 0.0677 | 144 | 0.0466 |
| 0.0682 | 145 | 0.0614 |
| 0.0687 | 146 | 0.0776 |
| 0.0691 | 147 | 0.0509 |
| 0.0696 | 148 | 0.0259 |
| 0.0701 | 149 | 0.0571 |
| 0.0706 | 150 | 0.0423 |
| 0.0710 | 151 | 0.0368 |
| 0.0715 | 152 | 0.0369 |
| 0.0720 | 153 | 0.0293 |
| 0.0724 | 154 | 0.0391 |
| 0.0729 | 155 | 0.0261 |
| 0.0734 | 156 | 0.0333 |
| 0.0738 | 157 | 0.0364 |
| 0.0743 | 158 | 0.0239 |
| 0.0748 | 159 | 0.048 |
| 0.0753 | 160 | 0.0169 |
| 0.0757 | 161 | 0.0525 |
| 0.0762 | 162 | 0.0523 |
| 0.0767 | 163 | 0.0602 |
| 0.0771 | 164 | 0.0844 |
| 0.0776 | 165 | 0.0292 |
| 0.0781 | 166 | 0.048 |
| 0.0786 | 167 | 0.0272 |
| 0.0790 | 168 | 0.0435 |
| 0.0795 | 169 | 0.0443 |
| 0.0800 | 170 | 0.0256 |
| 0.0804 | 171 | 0.0653 |
| 0.0809 | 172 | 0.0557 |
| 0.0814 | 173 | 0.0564 |
| 0.0818 | 174 | 0.0239 |
| 0.0823 | 175 | 0.0436 |
| 0.0828 | 176 | 0.085 |
| 0.0833 | 177 | 0.0463 |
| 0.0837 | 178 | 0.0179 |
| 0.0842 | 179 | 0.034 |
| 0.0847 | 180 | 0.0477 |
| 0.0851 | 181 | 0.0949 |
| 0.0856 | 182 | 0.0351 |
| 0.0861 | 183 | 0.0746 |
| 0.0865 | 184 | 0.058 |
| 0.0870 | 185 | 0.0155 |
| 0.0875 | 186 | 0.0104 |
| 0.0880 | 187 | 0.0222 |
| 0.0884 | 188 | 0.0256 |
| 0.0889 | 189 | 0.0235 |
| 0.0894 | 190 | 0.0543 |
| 0.0898 | 191 | 0.0269 |
| 0.0903 | 192 | 0.045 |
| 0.0908 | 193 | 0.1137 |
| 0.0913 | 194 | 0.0514 |
| 0.0917 | 195 | 0.027 |
| 0.0922 | 196 | 0.0725 |
| 0.0927 | 197 | 0.0223 |
| 0.0931 | 198 | 0.0209 |
| 0.0936 | 199 | 0.0411 |
| 0.0941 | 200 | 0.0418 |
| 0.0945 | 201 | 0.0517 |
| 0.0950 | 202 | 0.0722 |
| 0.0955 | 203 | 0.0687 |
| 0.0960 | 204 | 0.0389 |
| 0.0964 | 205 | 0.0429 |
| 0.0969 | 206 | 0.0643 |
| 0.0974 | 207 | 0.0201 |
| 0.0978 | 208 | 0.0213 |
| 0.0983 | 209 | 0.0505 |
| 0.0988 | 210 | 0.0531 |
| 0.0992 | 211 | 0.0398 |
| 0.0997 | 212 | 0.0373 |
| 0.1002 | 213 | 0.0139 |
| 0.1007 | 214 | 0.0439 |
| 0.1011 | 215 | 0.0114 |
| 0.1016 | 216 | 0.0368 |
| 0.1021 | 217 | 0.0664 |
| 0.1025 | 218 | 0.013 |
| 0.1030 | 219 | 0.0684 |
| 0.1035 | 220 | 0.0425 |
| 0.1040 | 221 | 0.038 |
| 0.1044 | 222 | 0.034 |
| 0.1049 | 223 | 0.1103 |
| 0.1054 | 224 | 0.0309 |
| 0.1058 | 225 | 0.064 |
| 0.1063 | 226 | 0.0278 |
| 0.1068 | 227 | 0.053 |
| 0.1072 | 228 | 0.0224 |
| 0.1077 | 229 | 0.0428 |
| 0.1082 | 230 | 0.0674 |
| 0.1087 | 231 | 0.0477 |
| 0.1091 | 232 | 0.0577 |
| 0.1096 | 233 | 0.0368 |
| 0.1101 | 234 | 0.0384 |
| 0.1105 | 235 | 0.0519 |
| 0.1110 | 236 | 0.0467 |
| 0.1115 | 237 | 0.0619 |
| 0.1119 | 238 | 0.0245 |
| 0.1124 | 239 | 0.0493 |
| 0.1129 | 240 | 0.0163 |
| 0.1134 | 241 | 0.0759 |
| 0.1138 | 242 | 0.0253 |
| 0.1143 | 243 | 0.033 |
| 0.1148 | 244 | 0.0192 |
| 0.1152 | 245 | 0.0226 |
| 0.1157 | 246 | 0.0127 |
| 0.1162 | 247 | 0.012 |
| 0.1167 | 248 | 0.0501 |
| 0.1171 | 249 | 0.0587 |
| 0.1176 | 250 | 0.0546 |
| 0.1181 | 251 | 0.0437 |
| 0.1185 | 252 | 0.0295 |
| 0.1190 | 253 | 0.0308 |
| 0.1195 | 254 | 0.108 |
| 0.1199 | 255 | 0.0213 |
| 0.1204 | 256 | 0.0161 |
| 0.1209 | 257 | 0.023 |
| 0.1214 | 258 | 0.0641 |
| 0.1218 | 259 | 0.0634 |
| 0.1223 | 260 | 0.0285 |
| 0.1228 | 261 | 0.0251 |
| 0.1232 | 262 | 0.0577 |
| 0.1237 | 263 | 0.0283 |
| 0.1242 | 264 | 0.0173 |
| 0.1246 | 265 | 0.0268 |
| 0.1251 | 266 | 0.0491 |
| 0.1256 | 267 | 0.0354 |
| 0.1261 | 268 | 0.0477 |
| 0.1265 | 269 | 0.0357 |
| 0.1270 | 270 | 0.0367 |
| 0.1275 | 271 | 0.0209 |
| 0.1279 | 272 | 0.0687 |
| 0.1284 | 273 | 0.0149 |
| 0.1289 | 274 | 0.0527 |
| 0.1294 | 275 | 0.0374 |
| 0.1298 | 276 | 0.0176 |
| 0.1303 | 277 | 0.0315 |
| 0.1308 | 278 | 0.0743 |
| 0.1312 | 279 | 0.0539 |
| 0.1317 | 280 | 0.0484 |
| 0.1322 | 281 | 0.0194 |
| 0.1326 | 282 | 0.0476 |
| 0.1331 | 283 | 0.0197 |
| 0.1336 | 284 | 0.0265 |
| 0.1341 | 285 | 0.035 |
| 0.1345 | 286 | 0.0183 |
| 0.1350 | 287 | 0.0303 |
| 0.1355 | 288 | 0.0335 |
| 0.1359 | 289 | 0.0372 |
| 0.1364 | 290 | 0.0135 |
| 0.1369 | 291 | 0.0261 |
| 0.1373 | 292 | 0.0217 |
| 0.1378 | 293 | 0.0412 |
| 0.1383 | 294 | 0.0534 |
| 0.1388 | 295 | 0.0248 |
| 0.1392 | 296 | 0.0902 |
| 0.1397 | 297 | 0.0387 |
| 0.1402 | 298 | 0.1036 |
| 0.1406 | 299 | 0.0731 |
| 0.1411 | 300 | 0.0475 |
| 0.1416 | 301 | 0.0162 |
| 0.1421 | 302 | 0.0307 |
| 0.1425 | 303 | 0.0341 |
| 0.1430 | 304 | 0.0509 |
| 0.1435 | 305 | 0.0246 |
| 0.1439 | 306 | 0.0559 |
| 0.1444 | 307 | 0.0571 |
| 0.1449 | 308 | 0.0167 |
| 0.1453 | 309 | 0.0433 |
| 0.1458 | 310 | 0.1018 |
| 0.1463 | 311 | 0.0261 |
| 0.1468 | 312 | 0.1039 |
| 0.1472 | 313 | 0.0399 |
| 0.1477 | 314 | 0.0976 |
| 0.1482 | 315 | 0.0581 |
| 0.1486 | 316 | 0.0456 |
| 0.1491 | 317 | 0.037 |
| 0.1496 | 318 | 0.0325 |
| 0.1500 | 319 | 0.0203 |
| 0.1505 | 320 | 0.0497 |
| 0.1510 | 321 | 0.0279 |
| 0.1515 | 322 | 0.0318 |
| 0.1519 | 323 | 0.029 |
| 0.1524 | 324 | 0.0239 |
| 0.1529 | 325 | 0.0395 |
| 0.1533 | 326 | 0.0402 |
| 0.1538 | 327 | 0.0491 |
| 0.1543 | 328 | 0.0154 |
| 0.1548 | 329 | 0.0089 |
| 0.1552 | 330 | 0.0833 |
| 0.1557 | 331 | 0.059 |
| 0.1562 | 332 | 0.0251 |
| 0.1566 | 333 | 0.0522 |
| 0.1571 | 334 | 0.026 |
| 0.1576 | 335 | 0.0602 |
| 0.1580 | 336 | 0.0408 |
| 0.1585 | 337 | 0.0358 |
| 0.1590 | 338 | 0.0137 |
| 0.1595 | 339 | 0.0167 |
| 0.1599 | 340 | 0.0401 |
| 0.1604 | 341 | 0.0533 |
| 0.1609 | 342 | 0.0543 |
| 0.1613 | 343 | 0.013 |
| 0.1618 | 344 | 0.0235 |
| 0.1623 | 345 | 0.0682 |
| 0.1627 | 346 | 0.0579 |
| 0.1632 | 347 | 0.017 |
| 0.1637 | 348 | 0.01 |
| 0.1642 | 349 | 0.0096 |
| 0.1646 | 350 | 0.0146 |
| 0.1651 | 351 | 0.0727 |
| 0.1656 | 352 | 0.0697 |
| 0.1660 | 353 | 0.0184 |
| 0.1665 | 354 | 0.0349 |
| 0.1670 | 355 | 0.0489 |
| 0.1675 | 356 | 0.0483 |
| 0.1679 | 357 | 0.0587 |
| 0.1684 | 358 | 0.0263 |
| 0.1689 | 359 | 0.0395 |
| 0.1693 | 360 | 0.049 |
| 0.1698 | 361 | 0.0494 |
| 0.1703 | 362 | 0.0531 |
| 0.1707 | 363 | 0.1087 |
| 0.1712 | 364 | 0.0186 |
| 0.1717 | 365 | 0.0303 |
| 0.1722 | 366 | 0.0465 |
| 0.1726 | 367 | 0.0362 |
| 0.1731 | 368 | 0.0334 |
| 0.1736 | 369 | 0.0443 |
| 0.1740 | 370 | 0.017 |
| 0.1745 | 371 | 0.0263 |
| 0.1750 | 372 | 0.0101 |
| 0.1754 | 373 | 0.02 |
| 0.1759 | 374 | 0.0336 |
| 0.1764 | 375 | 0.0663 |
| 0.1769 | 376 | 0.0148 |
| 0.1773 | 377 | 0.0435 |
| 0.1778 | 378 | 0.0479 |
| 0.1783 | 379 | 0.0214 |
| 0.1787 | 380 | 0.0325 |
| 0.1792 | 381 | 0.0125 |
| 0.1797 | 382 | 0.0426 |
| 0.1802 | 383 | 0.0364 |
| 0.1806 | 384 | 0.0329 |
| 0.1811 | 385 | 0.0155 |
| 0.1816 | 386 | 0.0654 |
| 0.1820 | 387 | 0.0652 |
| 0.1825 | 388 | 0.031 |
| 0.1830 | 389 | 0.034 |
| 0.1834 | 390 | 0.0273 |
| 0.1839 | 391 | 0.0461 |
| 0.1844 | 392 | 0.04 |
| 0.1849 | 393 | 0.0705 |
| 0.1853 | 394 | 0.0725 |
| 0.1858 | 395 | 0.0117 |
| 0.1863 | 396 | 0.0228 |
| 0.1867 | 397 | 0.0537 |
| 0.1872 | 398 | 0.0197 |
| 0.1877 | 399 | 0.0491 |
| 0.1881 | 400 | 0.0626 |
| 0.1886 | 401 | 0.0229 |
| 0.1891 | 402 | 0.0778 |
| 0.1896 | 403 | 0.0327 |
| 0.1900 | 404 | 0.0251 |
| 0.1905 | 405 | 0.0312 |
| 0.1910 | 406 | 0.0547 |
| 0.1914 | 407 | 0.0309 |
| 0.1919 | 408 | 0.0475 |
| 0.1924 | 409 | 0.0486 |
| 0.1929 | 410 | 0.0246 |
| 0.1933 | 411 | 0.0686 |
| 0.1938 | 412 | 0.0397 |
| 0.1943 | 413 | 0.0707 |
| 0.1947 | 414 | 0.0667 |
| 0.1952 | 415 | 0.0327 |
| 0.1957 | 416 | 0.0362 |
| 0.1961 | 417 | 0.0222 |
| 0.1966 | 418 | 0.0481 |
| 0.1971 | 419 | 0.0848 |
| 0.1976 | 420 | 0.0169 |
| 0.1980 | 421 | 0.0542 |
| 0.1985 | 422 | 0.0587 |
| 0.1990 | 423 | 0.0249 |
| 0.1994 | 424 | 0.0432 |
| 0.1999 | 425 | 0.0566 |
| 0.2004 | 426 | 0.029 |
| 0.2008 | 427 | 0.0445 |
| 0.2013 | 428 | 0.059 |
| 0.2018 | 429 | 0.0163 |
| 0.2023 | 430 | 0.0131 |
| 0.2027 | 431 | 0.0431 |
| 0.2032 | 432 | 0.0643 |
| 0.2037 | 433 | 0.0717 |
| 0.2041 | 434 | 0.0469 |
| 0.2046 | 435 | 0.0518 |
| 0.2051 | 436 | 0.0359 |
| 0.2056 | 437 | 0.0455 |
| 0.2060 | 438 | 0.0414 |
| 0.2065 | 439 | 0.0169 |
| 0.2070 | 440 | 0.0259 |
| 0.2074 | 441 | 0.0607 |
| 0.2079 | 442 | 0.0175 |
| 0.2084 | 443 | 0.0279 |
| 0.2088 | 444 | 0.0319 |
| 0.2093 | 445 | 0.0245 |
| 0.2098 | 446 | 0.0412 |
| 0.2103 | 447 | 0.057 |
| 0.2107 | 448 | 0.0209 |
| 0.2112 | 449 | 0.0265 |
| 0.2117 | 450 | 0.043 |
| 0.2121 | 451 | 0.0323 |
| 0.2126 | 452 | 0.0401 |
| 0.2131 | 453 | 0.0198 |
| 0.2135 | 454 | 0.0269 |
| 0.2140 | 455 | 0.0428 |
| 0.2145 | 456 | 0.0239 |
| 0.2150 | 457 | 0.0209 |
| 0.2154 | 458 | 0.0469 |
| 0.2159 | 459 | 0.0382 |
| 0.2164 | 460 | 0.0341 |
| 0.2168 | 461 | 0.0297 |
| 0.2173 | 462 | 0.0172 |
| 0.2178 | 463 | 0.0291 |
| 0.2183 | 464 | 0.0494 |
| 0.2187 | 465 | 0.0206 |
| 0.2192 | 466 | 0.0271 |
| 0.2197 | 467 | 0.0446 |
| 0.2201 | 468 | 0.0284 |
| 0.2206 | 469 | 0.0524 |
| 0.2211 | 470 | 0.0373 |
| 0.2215 | 471 | 0.0433 |
| 0.2220 | 472 | 0.0243 |
| 0.2225 | 473 | 0.0516 |
| 0.2230 | 474 | 0.0248 |
| 0.2234 | 475 | 0.0375 |
| 0.2239 | 476 | 0.0165 |
| 0.2244 | 477 | 0.0324 |
| 0.2248 | 478 | 0.0255 |
| 0.2253 | 479 | 0.0594 |
| 0.2258 | 480 | 0.0639 |
| 0.2262 | 481 | 0.0231 |
| 0.2267 | 482 | 0.02 |
| 0.2272 | 483 | 0.0724 |
| 0.2277 | 484 | 0.0222 |
| 0.2281 | 485 | 0.0206 |
| 0.2286 | 486 | 0.0629 |
| 0.2291 | 487 | 0.0875 |
| 0.2295 | 488 | 0.0214 |
| 0.2300 | 489 | 0.018 |
| 0.2305 | 490 | 0.015 |
| 0.2310 | 491 | 0.0297 |
| 0.2314 | 492 | 0.0231 |
| 0.2319 | 493 | 0.0281 |
| 0.2324 | 494 | 0.0217 |
| 0.2328 | 495 | 0.0333 |
| 0.2333 | 496 | 0.0435 |
| 0.2338 | 497 | 0.0202 |
| 0.2342 | 498 | 0.0327 |
| 0.2347 | 499 | 0.0728 |
| 0.2352 | 500 | 0.0512 |
| 0.2357 | 501 | 0.0155 |
| 0.2361 | 502 | 0.047 |
| 0.2366 | 503 | 0.0562 |
| 0.2371 | 504 | 0.0403 |
| 0.2375 | 505 | 0.0344 |
| 0.2380 | 506 | 0.0642 |
| 0.2385 | 507 | 0.0398 |
| 0.2389 | 508 | 0.0512 |
| 0.2394 | 509 | 0.0164 |
| 0.2399 | 510 | 0.0228 |
| 0.2404 | 511 | 0.0134 |
| 0.2408 | 512 | 0.0112 |
| 0.2413 | 513 | 0.0345 |
| 0.2418 | 514 | 0.0444 |
| 0.2422 | 515 | 0.0257 |
| 0.2427 | 516 | 0.0251 |
| 0.2432 | 517 | 0.0292 |
| 0.2437 | 518 | 0.0272 |
| 0.2441 | 519 | 0.0471 |
| 0.2446 | 520 | 0.0516 |
| 0.2451 | 521 | 0.0193 |
| 0.2455 | 522 | 0.0267 |
| 0.2460 | 523 | 0.0572 |
| 0.2465 | 524 | 0.0183 |
| 0.2469 | 525 | 0.0361 |
| 0.2474 | 526 | 0.0272 |
| 0.2479 | 527 | 0.0516 |
| 0.2484 | 528 | 0.017 |
| 0.2488 | 529 | 0.0456 |
| 0.2493 | 530 | 0.0278 |
| 0.2498 | 531 | 0.062 |
| 0.2502 | 532 | 0.0418 |
| 0.2507 | 533 | 0.0365 |
| 0.2512 | 534 | 0.0097 |
| 0.2516 | 535 | 0.0386 |
| 0.2521 | 536 | 0.0306 |
| 0.2526 | 537 | 0.0797 |
| 0.2531 | 538 | 0.0147 |
| 0.2535 | 539 | 0.0179 |
| 0.2540 | 540 | 0.0329 |
| 0.2545 | 541 | 0.0709 |
| 0.2549 | 542 | 0.0389 |
| 0.2554 | 543 | 0.0145 |
| 0.2559 | 544 | 0.0832 |
| 0.2563 | 545 | 0.0313 |
| 0.2568 | 546 | 0.0282 |
| 0.2573 | 547 | 0.031 |
| 0.2578 | 548 | 0.0692 |
| 0.2582 | 549 | 0.0335 |
| 0.2587 | 550 | 0.0306 |
| 0.2592 | 551 | 0.0366 |
| 0.2596 | 552 | 0.0425 |
| 0.2601 | 553 | 0.0656 |
| 0.2606 | 554 | 0.1067 |
| 0.2611 | 555 | 0.0346 |
| 0.2615 | 556 | 0.0469 |
| 0.2620 | 557 | 0.0223 |
| 0.2625 | 558 | 0.0498 |
| 0.2629 | 559 | 0.0712 |
| 0.2634 | 560 | 0.0377 |
| 0.2639 | 561 | 0.1023 |
| 0.2643 | 562 | 0.0853 |
| 0.2648 | 563 | 0.0356 |
| 0.2653 | 564 | 0.076 |
| 0.2658 | 565 | 0.0282 |
| 0.2662 | 566 | 0.0469 |
| 0.2667 | 567 | 0.0694 |
| 0.2672 | 568 | 0.0285 |
| 0.2676 | 569 | 0.0278 |
| 0.2681 | 570 | 0.0633 |
| 0.2686 | 571 | 0.0125 |
| 0.2690 | 572 | 0.1003 |
| 0.2695 | 573 | 0.0249 |
| 0.2700 | 574 | 0.0357 |
| 0.2705 | 575 | 0.0227 |
| 0.2709 | 576 | 0.0643 |
| 0.2714 | 577 | 0.0511 |
| 0.2719 | 578 | 0.0246 |
| 0.2723 | 579 | 0.04 |
| 0.2728 | 580 | 0.0309 |
| 0.2733 | 581 | 0.0238 |
| 0.2738 | 582 | 0.0375 |
| 0.2742 | 583 | 0.0262 |
| 0.2747 | 584 | 0.0399 |
| 0.2752 | 585 | 0.0185 |
| 0.2756 | 586 | 0.026 |
| 0.2761 | 587 | 0.03 |
| 0.2766 | 588 | 0.0598 |
| 0.2770 | 589 | 0.0227 |
| 0.2775 | 590 | 0.019 |
| 0.2780 | 591 | 0.0415 |
| 0.2785 | 592 | 0.0525 |
| 0.2789 | 593 | 0.0187 |
| 0.2794 | 594 | 0.0165 |
| 0.2799 | 595 | 0.0418 |
| 0.2803 | 596 | 0.0295 |
| 0.2808 | 597 | 0.0416 |
| 0.2813 | 598 | 0.0158 |
| 0.2817 | 599 | 0.0422 |
| 0.2822 | 600 | 0.0589 |
| 0.2827 | 601 | 0.1205 |
| 0.2832 | 602 | 0.0294 |
| 0.2836 | 603 | 0.0191 |
| 0.2841 | 604 | 0.0429 |
| 0.2846 | 605 | 0.0261 |
| 0.2850 | 606 | 0.0211 |
| 0.2855 | 607 | 0.0349 |
| 0.2860 | 608 | 0.0634 |
| 0.2865 | 609 | 0.0179 |
| 0.2869 | 610 | 0.0416 |
| 0.2874 | 611 | 0.018 |
| 0.2879 | 612 | 0.037 |
| 0.2883 | 613 | 0.12 |
| 0.2888 | 614 | 0.0198 |
| 0.2893 | 615 | 0.04 |
| 0.2897 | 616 | 0.0382 |
| 0.2902 | 617 | 0.0432 |
| 0.2907 | 618 | 0.0711 |
| 0.2912 | 619 | 0.0337 |
| 0.2916 | 620 | 0.0421 |
| 0.2921 | 621 | 0.0226 |
| 0.2926 | 622 | 0.0236 |
| 0.2930 | 623 | 0.0179 |
| 0.2935 | 624 | 0.0433 |
| 0.2940 | 625 | 0.0399 |
| 0.2944 | 626 | 0.0179 |
| 0.2949 | 627 | 0.0562 |
| 0.2954 | 628 | 0.104 |
| 0.2959 | 629 | 0.0593 |
| 0.2963 | 630 | 0.0446 |
| 0.2968 | 631 | 0.0191 |
| 0.2973 | 632 | 0.0658 |
| 0.2977 | 633 | 0.0269 |
| 0.2982 | 634 | 0.0356 |
| 0.2987 | 635 | 0.0244 |
| 0.2992 | 636 | 0.0198 |
| 0.2996 | 637 | 0.0294 |
| 0.3001 | 638 | 0.0182 |
| 0.3006 | 639 | 0.0487 |
| 0.3010 | 640 | 0.0459 |
| 0.3015 | 641 | 0.0327 |
| 0.3020 | 642 | 0.0799 |
| 0.3024 | 643 | 0.0699 |
| 0.3029 | 644 | 0.0269 |
| 0.3034 | 645 | 0.0397 |
| 0.3039 | 646 | 0.0663 |
| 0.3043 | 647 | 0.075 |
| 0.3048 | 648 | 0.0175 |
| 0.3053 | 649 | 0.0316 |
| 0.3057 | 650 | 0.0258 |
| 0.3062 | 651 | 0.0343 |
| 0.3067 | 652 | 0.0234 |
| 0.3071 | 653 | 0.085 |
| 0.3076 | 654 | 0.022 |
| 0.3081 | 655 | 0.1063 |
| 0.3086 | 656 | 0.0398 |
| 0.3090 | 657 | 0.0174 |
| 0.3095 | 658 | 0.0319 |
| 0.3100 | 659 | 0.02 |
| 0.3104 | 660 | 0.0234 |
| 0.3109 | 661 | 0.0663 |
| 0.3114 | 662 | 0.0511 |
| 0.3119 | 663 | 0.0366 |
| 0.3123 | 664 | 0.0231 |
| 0.3128 | 665 | 0.0389 |
| 0.3133 | 666 | 0.0476 |
| 0.3137 | 667 | 0.1014 |
| 0.3142 | 668 | 0.0186 |
| 0.3147 | 669 | 0.013 |
| 0.3151 | 670 | 0.0297 |
| 0.3156 | 671 | 0.0147 |
| 0.3161 | 672 | 0.0412 |
| 0.3166 | 673 | 0.0379 |
| 0.3170 | 674 | 0.0189 |
| 0.3175 | 675 | 0.0874 |
| 0.3180 | 676 | 0.0166 |
| 0.3184 | 677 | 0.0872 |
| 0.3189 | 678 | 0.0367 |
| 0.3194 | 679 | 0.0243 |
| 0.3198 | 680 | 0.0124 |
| 0.3203 | 681 | 0.0276 |
| 0.3208 | 682 | 0.0373 |
| 0.3213 | 683 | 0.0477 |
| 0.3217 | 684 | 0.0169 |
| 0.3222 | 685 | 0.0575 |
| 0.3227 | 686 | 0.0413 |
| 0.3231 | 687 | 0.0248 |
| 0.3236 | 688 | 0.0178 |
| 0.3241 | 689 | 0.0351 |
| 0.3246 | 690 | 0.032 |
| 0.3250 | 691 | 0.0252 |
| 0.3255 | 692 | 0.0654 |
| 0.3260 | 693 | 0.0091 |
| 0.3264 | 694 | 0.0415 |
| 0.3269 | 695 | 0.0341 |
| 0.3274 | 696 | 0.0674 |
| 0.3278 | 697 | 0.0411 |
| 0.3283 | 698 | 0.0519 |
| 0.3288 | 699 | 0.0163 |
| 0.3293 | 700 | 0.0837 |
| 0.3297 | 701 | 0.0352 |
| 0.3302 | 702 | 0.0292 |
| 0.3307 | 703 | 0.0207 |
| 0.3311 | 704 | 0.0438 |
| 0.3316 | 705 | 0.0175 |
| 0.3321 | 706 | 0.0731 |
| 0.3325 | 707 | 0.0852 |
| 0.3330 | 708 | 0.0419 |
| 0.3335 | 709 | 0.0402 |
| 0.3340 | 710 | 0.0545 |
| 0.3344 | 711 | 0.0291 |
| 0.3349 | 712 | 0.0443 |
| 0.3354 | 713 | 0.01 |
| 0.3358 | 714 | 0.0478 |
| 0.3363 | 715 | 0.0488 |
| 0.3368 | 716 | 0.0384 |
| 0.3373 | 717 | 0.0505 |
| 0.3377 | 718 | 0.0402 |
| 0.3382 | 719 | 0.0693 |
| 0.3387 | 720 | 0.0433 |
| 0.3391 | 721 | 0.0632 |
| 0.3396 | 722 | 0.0285 |
| 0.3401 | 723 | 0.0394 |
| 0.3405 | 724 | 0.0701 |
| 0.3410 | 725 | 0.069 |
| 0.3415 | 726 | 0.0439 |
| 0.3420 | 727 | 0.0171 |
| 0.3424 | 728 | 0.0235 |
| 0.3429 | 729 | 0.17 |
| 0.3434 | 730 | 0.0684 |
| 0.3438 | 731 | 0.0208 |
| 0.3443 | 732 | 0.0691 |
| 0.3448 | 733 | 0.0595 |
| 0.3452 | 734 | 0.043 |
| 0.3457 | 735 | 0.0287 |
| 0.3462 | 736 | 0.0336 |
| 0.3467 | 737 | 0.0505 |
| 0.3471 | 738 | 0.0093 |
| 0.3476 | 739 | 0.0369 |
| 0.3481 | 740 | 0.0351 |
| 0.3485 | 741 | 0.0284 |
| 0.3490 | 742 | 0.0727 |
| 0.3495 | 743 | 0.0344 |
| 0.3500 | 744 | 0.03 |
| 0.3504 | 745 | 0.0629 |
| 0.3509 | 746 | 0.0241 |
| 0.3514 | 747 | 0.0335 |
| 0.3518 | 748 | 0.0342 |
| 0.3523 | 749 | 0.0475 |
| 0.3528 | 750 | 0.0285 |
| 0.3532 | 751 | 0.0333 |
| 0.3537 | 752 | 0.0283 |
| 0.3542 | 753 | 0.0299 |
| 0.3547 | 754 | 0.0321 |
| 0.3551 | 755 | 0.006 |
| 0.3556 | 756 | 0.0518 |
| 0.3561 | 757 | 0.0271 |
| 0.3565 | 758 | 0.0326 |
| 0.3570 | 759 | 0.0173 |
| 0.3575 | 760 | 0.0205 |
| 0.3579 | 761 | 0.0127 |
| 0.3584 | 762 | 0.0328 |
| 0.3589 | 763 | 0.0367 |
| 0.3594 | 764 | 0.035 |
| 0.3598 | 765 | 0.0437 |
| 0.3603 | 766 | 0.0229 |
| 0.3608 | 767 | 0.0333 |
| 0.3612 | 768 | 0.0291 |
| 0.3617 | 769 | 0.1005 |
| 0.3622 | 770 | 0.0395 |
| 0.3627 | 771 | 0.0358 |
| 0.3631 | 772 | 0.0325 |
| 0.3636 | 773 | 0.0517 |
| 0.3641 | 774 | 0.0719 |
| 0.3645 | 775 | 0.0141 |
| 0.3650 | 776 | 0.0399 |
| 0.3655 | 777 | 0.0276 |
| 0.3659 | 778 | 0.0244 |
| 0.3664 | 779 | 0.0509 |
| 0.3669 | 780 | 0.0416 |
| 0.3674 | 781 | 0.0129 |
| 0.3678 | 782 | 0.0413 |
| 0.3683 | 783 | 0.0211 |
| 0.3688 | 784 | 0.0485 |
| 0.3692 | 785 | 0.0132 |
| 0.3697 | 786 | 0.0506 |
| 0.3702 | 787 | 0.0712 |
| 0.3706 | 788 | 0.0087 |
| 0.3711 | 789 | 0.0283 |
| 0.3716 | 790 | 0.0321 |
| 0.3721 | 791 | 0.0293 |
| 0.3725 | 792 | 0.027 |
| 0.3730 | 793 | 0.0251 |
| 0.3735 | 794 | 0.0536 |
| 0.3739 | 795 | 0.0577 |
| 0.3744 | 796 | 0.0472 |
| 0.3749 | 797 | 0.0439 |
| 0.3754 | 798 | 0.0237 |
| 0.3758 | 799 | 0.0392 |
| 0.3763 | 800 | 0.0457 |
| 0.3768 | 801 | 0.03 |
| 0.3772 | 802 | 0.0301 |
| 0.3777 | 803 | 0.0363 |
| 0.3782 | 804 | 0.0091 |
| 0.3786 | 805 | 0.0847 |
| 0.3791 | 806 | 0.0673 |
| 0.3796 | 807 | 0.0216 |
| 0.3801 | 808 | 0.0684 |
| 0.3805 | 809 | 0.0435 |
| 0.3810 | 810 | 0.0468 |
| 0.3815 | 811 | 0.0324 |
| 0.3819 | 812 | 0.0263 |
| 0.3824 | 813 | 0.0347 |
| 0.3829 | 814 | 0.0115 |
| 0.3833 | 815 | 0.0275 |
| 0.3838 | 816 | 0.0727 |
| 0.3843 | 817 | 0.0313 |
| 0.3848 | 818 | 0.0367 |
| 0.3852 | 819 | 0.0579 |
| 0.3857 | 820 | 0.0236 |
| 0.3862 | 821 | 0.0179 |
| 0.3866 | 822 | 0.0388 |
| 0.3871 | 823 | 0.0421 |
| 0.3876 | 824 | 0.0297 |
| 0.3881 | 825 | 0.0327 |
| 0.3885 | 826 | 0.0415 |
| 0.3890 | 827 | 0.0135 |
| 0.3895 | 828 | 0.0137 |
| 0.3899 | 829 | 0.0309 |
| 0.3904 | 830 | 0.0248 |
| 0.3909 | 831 | 0.0283 |
| 0.3913 | 832 | 0.0526 |
| 0.3918 | 833 | 0.0321 |
| 0.3923 | 834 | 0.0725 |
| 0.3928 | 835 | 0.0253 |
| 0.3932 | 836 | 0.029 |
| 0.3937 | 837 | 0.0231 |
| 0.3942 | 838 | 0.0178 |
| 0.3946 | 839 | 0.0361 |
| 0.3951 | 840 | 0.0397 |
| 0.3956 | 841 | 0.0219 |
| 0.3960 | 842 | 0.066 |
| 0.3965 | 843 | 0.0516 |
| 0.3970 | 844 | 0.0235 |
| 0.3975 | 845 | 0.0421 |
| 0.3979 | 846 | 0.0371 |
| 0.3984 | 847 | 0.06 |
| 0.3989 | 848 | 0.0405 |
| 0.3993 | 849 | 0.0499 |
| 0.3998 | 850 | 0.0199 |
| 0.4003 | 851 | 0.0114 |
| 0.4008 | 852 | 0.0477 |
| 0.4012 | 853 | 0.0449 |
| 0.4017 | 854 | 0.0415 |
| 0.4022 | 855 | 0.0222 |
| 0.4026 | 856 | 0.0584 |
| 0.4031 | 857 | 0.0086 |
| 0.4036 | 858 | 0.0475 |
| 0.4040 | 859 | 0.0442 |
| 0.4045 | 860 | 0.0318 |
| 0.4050 | 861 | 0.0211 |
| 0.4055 | 862 | 0.0234 |
| 0.4059 | 863 | 0.072 |
| 0.4064 | 864 | 0.0275 |
| 0.4069 | 865 | 0.0213 |
| 0.4073 | 866 | 0.0443 |
| 0.4078 | 867 | 0.0176 |
| 0.4083 | 868 | 0.0313 |
| 0.4087 | 869 | 0.025 |
| 0.4092 | 870 | 0.0247 |
| 0.4097 | 871 | 0.0169 |
| 0.4102 | 872 | 0.0946 |
| 0.4106 | 873 | 0.0689 |
| 0.4111 | 874 | 0.0179 |
| 0.4116 | 875 | 0.026 |
| 0.4120 | 876 | 0.0226 |
| 0.4125 | 877 | 0.0955 |
| 0.4130 | 878 | 0.0177 |
| 0.4135 | 879 | 0.1038 |
| 0.4139 | 880 | 0.0401 |
| 0.4144 | 881 | 0.0329 |
| 0.4149 | 882 | 0.0373 |
| 0.4153 | 883 | 0.0205 |
| 0.4158 | 884 | 0.0158 |
| 0.4163 | 885 | 0.0292 |
| 0.4167 | 886 | 0.0247 |
| 0.4172 | 887 | 0.0067 |
| 0.4177 | 888 | 0.0463 |
| 0.4182 | 889 | 0.0921 |
| 0.4186 | 890 | 0.0857 |
| 0.4191 | 891 | 0.0628 |
| 0.4196 | 892 | 0.0799 |
| 0.4200 | 893 | 0.0086 |
| 0.4205 | 894 | 0.0395 |
| 0.4210 | 895 | 0.0324 |
| 0.4214 | 896 | 0.0742 |
| 0.4219 | 897 | 0.0321 |
| 0.4224 | 898 | 0.0133 |
| 0.4229 | 899 | 0.0108 |
| 0.4233 | 900 | 0.0285 |
| 0.4238 | 901 | 0.0882 |
| 0.4243 | 902 | 0.0344 |
| 0.4247 | 903 | 0.0434 |
| 0.4252 | 904 | 0.015 |
| 0.4257 | 905 | 0.0215 |
| 0.4262 | 906 | 0.0272 |
| 0.4266 | 907 | 0.0582 |
| 0.4271 | 908 | 0.0309 |
| 0.4276 | 909 | 0.0181 |
| 0.4280 | 910 | 0.038 |
| 0.4285 | 911 | 0.0294 |
| 0.4290 | 912 | 0.0341 |
| 0.4294 | 913 | 0.0581 |
| 0.4299 | 914 | 0.0165 |
| 0.4304 | 915 | 0.0649 |
| 0.4309 | 916 | 0.0404 |
| 0.4313 | 917 | 0.0666 |
| 0.4318 | 918 | 0.0112 |
| 0.4323 | 919 | 0.022 |
| 0.4327 | 920 | 0.0212 |
| 0.4332 | 921 | 0.035 |
| 0.4337 | 922 | 0.0286 |
| 0.4341 | 923 | 0.0504 |
| 0.4346 | 924 | 0.0776 |
| 0.4351 | 925 | 0.0258 |
| 0.4356 | 926 | 0.0531 |
| 0.4360 | 927 | 0.0211 |
| 0.4365 | 928 | 0.0175 |
| 0.4370 | 929 | 0.0376 |
| 0.4374 | 930 | 0.039 |
| 0.4379 | 931 | 0.0178 |
| 0.4384 | 932 | 0.0148 |
| 0.4389 | 933 | 0.0625 |
| 0.4393 | 934 | 0.1002 |
| 0.4398 | 935 | 0.021 |
| 0.4403 | 936 | 0.012 |
| 0.4407 | 937 | 0.0236 |
| 0.4412 | 938 | 0.0412 |
| 0.4417 | 939 | 0.022 |
| 0.4421 | 940 | 0.0408 |
| 0.4426 | 941 | 0.0336 |
| 0.4431 | 942 | 0.0573 |
| 0.4436 | 943 | 0.0767 |
| 0.4440 | 944 | 0.0193 |
| 0.4445 | 945 | 0.0318 |
| 0.4450 | 946 | 0.0118 |
| 0.4454 | 947 | 0.0312 |
| 0.4459 | 948 | 0.0575 |
| 0.4464 | 949 | 0.0216 |
| 0.4468 | 950 | 0.0412 |
| 0.4473 | 951 | 0.0401 |
| 0.4478 | 952 | 0.0494 |
| 0.4483 | 953 | 0.0435 |
| 0.4487 | 954 | 0.0622 |
| 0.4492 | 955 | 0.0257 |
| 0.4497 | 956 | 0.0494 |
| 0.4501 | 957 | 0.0624 |
| 0.4506 | 958 | 0.023 |
| 0.4511 | 959 | 0.0473 |
| 0.4516 | 960 | 0.0121 |
| 0.4520 | 961 | 0.0217 |
| 0.4525 | 962 | 0.0236 |
| 0.4530 | 963 | 0.0525 |
| 0.4534 | 964 | 0.0239 |
| 0.4539 | 965 | 0.0103 |
| 0.4544 | 966 | 0.0322 |
| 0.4548 | 967 | 0.0505 |
| 0.4553 | 968 | 0.0194 |
| 0.4558 | 969 | 0.0287 |
| 0.4563 | 970 | 0.0728 |
| 0.4567 | 971 | 0.0617 |
| 0.4572 | 972 | 0.023 |
| 0.4577 | 973 | 0.0343 |
| 0.4581 | 974 | 0.0313 |
| 0.4586 | 975 | 0.0526 |
| 0.4591 | 976 | 0.0419 |
| 0.4595 | 977 | 0.0411 |
| 0.4600 | 978 | 0.0343 |
| 0.4605 | 979 | 0.0269 |
| 0.4610 | 980 | 0.0482 |
| 0.4614 | 981 | 0.0414 |
| 0.4619 | 982 | 0.0254 |
| 0.4624 | 983 | 0.0344 |
| 0.4628 | 984 | 0.0217 |
| 0.4633 | 985 | 0.014 |
| 0.4638 | 986 | 0.0261 |
| 0.4643 | 987 | 0.0188 |
| 0.4647 | 988 | 0.0374 |
| 0.4652 | 989 | 0.027 |
| 0.4657 | 990 | 0.0118 |
| 0.4661 | 991 | 0.007 |
| 0.4666 | 992 | 0.0481 |
| 0.4671 | 993 | 0.0344 |
| 0.4675 | 994 | 0.043 |
| 0.4680 | 995 | 0.0181 |
| 0.4685 | 996 | 0.025 |
| 0.4690 | 997 | 0.0282 |
| 0.4694 | 998 | 0.0282 |
| 0.4699 | 999 | 0.0548 |
| 0.4704 | 1000 | 0.0188 |
| 0.4708 | 1001 | 0.0551 |
| 0.4713 | 1002 | 0.0623 |
| 0.4718 | 1003 | 0.0406 |
| 0.4722 | 1004 | 0.0782 |
| 0.4727 | 1005 | 0.0196 |
| 0.4732 | 1006 | 0.0322 |
| 0.4737 | 1007 | 0.025 |
| 0.4741 | 1008 | 0.0205 |
| 0.4746 | 1009 | 0.0624 |
| 0.4751 | 1010 | 0.1019 |
| 0.4755 | 1011 | 0.0881 |
| 0.4760 | 1012 | 0.0227 |
| 0.4765 | 1013 | 0.0093 |
| 0.4770 | 1014 | 0.0128 |
| 0.4774 | 1015 | 0.0208 |
| 0.4779 | 1016 | 0.0278 |
| 0.4784 | 1017 | 0.0217 |
| 0.4788 | 1018 | 0.0225 |
| 0.4793 | 1019 | 0.0532 |
| 0.4798 | 1020 | 0.0161 |
| 0.4802 | 1021 | 0.0319 |
| 0.4807 | 1022 | 0.0332 |
| 0.4812 | 1023 | 0.0693 |
| 0.4817 | 1024 | 0.0509 |
| 0.4821 | 1025 | 0.0265 |
| 0.4826 | 1026 | 0.0242 |
| 0.4831 | 1027 | 0.0761 |
| 0.4835 | 1028 | 0.0357 |
| 0.4840 | 1029 | 0.0376 |
| 0.4845 | 1030 | 0.0404 |
| 0.4849 | 1031 | 0.0181 |
| 0.4854 | 1032 | 0.0143 |
| 0.4859 | 1033 | 0.0912 |
| 0.4864 | 1034 | 0.0313 |
| 0.4868 | 1035 | 0.0269 |
| 0.4873 | 1036 | 0.0191 |
| 0.4878 | 1037 | 0.0649 |
| 0.4882 | 1038 | 0.071 |
| 0.4887 | 1039 | 0.0545 |
| 0.4892 | 1040 | 0.0148 |
| 0.4897 | 1041 | 0.0319 |
| 0.4901 | 1042 | 0.0232 |
| 0.4906 | 1043 | 0.037 |
| 0.4911 | 1044 | 0.0323 |
| 0.4915 | 1045 | 0.0416 |
| 0.4920 | 1046 | 0.0468 |
| 0.4925 | 1047 | 0.0119 |
| 0.4929 | 1048 | 0.0419 |
| 0.4934 | 1049 | 0.0275 |
| 0.4939 | 1050 | 0.0191 |
| 0.4944 | 1051 | 0.043 |
| 0.4948 | 1052 | 0.0171 |
| 0.4953 | 1053 | 0.0194 |
| 0.4958 | 1054 | 0.0425 |
| 0.4962 | 1055 | 0.0499 |
| 0.4967 | 1056 | 0.0711 |
| 0.4972 | 1057 | 0.0452 |
| 0.4976 | 1058 | 0.012 |
| 0.4981 | 1059 | 0.0385 |
| 0.4986 | 1060 | 0.0345 |
| 0.4991 | 1061 | 0.0311 |
| 0.4995 | 1062 | 0.0828 |
| 0.5000 | 1063 | 0.026 |
| 0.5005 | 1064 | 0.0387 |
| 0.5009 | 1065 | 0.0511 |
| 0.5014 | 1066 | 0.017 |
| 0.5019 | 1067 | 0.0116 |
| 0.5024 | 1068 | 0.0408 |
| 0.5028 | 1069 | 0.0248 |
| 0.5033 | 1070 | 0.0155 |
| 0.5038 | 1071 | 0.0298 |
| 0.5042 | 1072 | 0.0596 |
| 0.5047 | 1073 | 0.0399 |
| 0.5052 | 1074 | 0.0438 |
| 0.5056 | 1075 | 0.08 |
| 0.5061 | 1076 | 0.0123 |
| 0.5066 | 1077 | 0.027 |
| 0.5071 | 1078 | 0.0321 |
| 0.5075 | 1079 | 0.0199 |
| 0.5080 | 1080 | 0.0373 |
| 0.5085 | 1081 | 0.0274 |
| 0.5089 | 1082 | 0.0262 |
| 0.5094 | 1083 | 0.0307 |
| 0.5099 | 1084 | 0.0098 |
| 0.5103 | 1085 | 0.0863 |
| 0.5108 | 1086 | 0.014 |
| 0.5113 | 1087 | 0.0337 |
| 0.5118 | 1088 | 0.0786 |
| 0.5122 | 1089 | 0.0348 |
| 0.5127 | 1090 | 0.029 |
| 0.5132 | 1091 | 0.0331 |
| 0.5136 | 1092 | 0.0355 |
| 0.5141 | 1093 | 0.0297 |
| 0.5146 | 1094 | 0.1157 |
| 0.5151 | 1095 | 0.0274 |
| 0.5155 | 1096 | 0.0272 |
| 0.5160 | 1097 | 0.018 |
| 0.5165 | 1098 | 0.0847 |
| 0.5169 | 1099 | 0.0459 |
| 0.5174 | 1100 | 0.0203 |
| 0.5179 | 1101 | 0.0362 |
| 0.5183 | 1102 | 0.0263 |
| 0.5188 | 1103 | 0.0197 |
| 0.5193 | 1104 | 0.0323 |
| 0.5198 | 1105 | 0.0257 |
| 0.5202 | 1106 | 0.0137 |
| 0.5207 | 1107 | 0.0339 |
| 0.5212 | 1108 | 0.0198 |
| 0.5216 | 1109 | 0.0165 |
| 0.5221 | 1110 | 0.037 |
| 0.5226 | 1111 | 0.0411 |
| 0.5230 | 1112 | 0.0553 |
| 0.5235 | 1113 | 0.0163 |
| 0.5240 | 1114 | 0.0145 |
| 0.5245 | 1115 | 0.0398 |
| 0.5249 | 1116 | 0.0316 |
| 0.5254 | 1117 | 0.0342 |
| 0.5259 | 1118 | 0.0333 |
| 0.5263 | 1119 | 0.0694 |
| 0.5268 | 1120 | 0.0366 |
| 0.5273 | 1121 | 0.0495 |
| 0.5278 | 1122 | 0.0633 |
| 0.5282 | 1123 | 0.0316 |
| 0.5287 | 1124 | 0.0373 |
| 0.5292 | 1125 | 0.0785 |
| 0.5296 | 1126 | 0.0425 |
| 0.5301 | 1127 | 0.0175 |
| 0.5306 | 1128 | 0.026 |
| 0.5310 | 1129 | 0.0223 |
| 0.5315 | 1130 | 0.0199 |
| 0.5320 | 1131 | 0.0685 |
| 0.5325 | 1132 | 0.0317 |
| 0.5329 | 1133 | 0.0278 |
| 0.5334 | 1134 | 0.0384 |
| 0.5339 | 1135 | 0.0307 |
| 0.5343 | 1136 | 0.0339 |
| 0.5348 | 1137 | 0.0229 |
| 0.5353 | 1138 | 0.0378 |
| 0.5357 | 1139 | 0.0192 |
| 0.5362 | 1140 | 0.0386 |
| 0.5367 | 1141 | 0.0264 |
| 0.5372 | 1142 | 0.029 |
| 0.5376 | 1143 | 0.056 |
| 0.5381 | 1144 | 0.0193 |
| 0.5386 | 1145 | 0.0894 |
| 0.5390 | 1146 | 0.0279 |
| 0.5395 | 1147 | 0.0185 |
| 0.5400 | 1148 | 0.0344 |
| 0.5405 | 1149 | 0.0375 |
| 0.5409 | 1150 | 0.0339 |
| 0.5414 | 1151 | 0.0653 |
| 0.5419 | 1152 | 0.0298 |
| 0.5423 | 1153 | 0.0425 |
| 0.5428 | 1154 | 0.051 |
| 0.5433 | 1155 | 0.0198 |
| 0.5437 | 1156 | 0.0196 |
| 0.5442 | 1157 | 0.0475 |
| 0.5447 | 1158 | 0.0159 |
| 0.5452 | 1159 | 0.0439 |
| 0.5456 | 1160 | 0.1319 |
| 0.5461 | 1161 | 0.0425 |
| 0.5466 | 1162 | 0.0313 |
| 0.5470 | 1163 | 0.0144 |
| 0.5475 | 1164 | 0.0446 |
| 0.5480 | 1165 | 0.0162 |
| 0.5484 | 1166 | 0.019 |
| 0.5489 | 1167 | 0.0464 |
| 0.5494 | 1168 | 0.028 |
| 0.5499 | 1169 | 0.0252 |
| 0.5503 | 1170 | 0.0723 |
| 0.5508 | 1171 | 0.0159 |
| 0.5513 | 1172 | 0.0463 |
| 0.5517 | 1173 | 0.0193 |
| 0.5522 | 1174 | 0.0303 |
| 0.5527 | 1175 | 0.0218 |
| 0.5532 | 1176 | 0.0482 |
| 0.5536 | 1177 | 0.02 |
| 0.5541 | 1178 | 0.0429 |
| 0.5546 | 1179 | 0.0135 |
| 0.5550 | 1180 | 0.0409 |
| 0.5555 | 1181 | 0.0123 |
| 0.5560 | 1182 | 0.1156 |
| 0.5564 | 1183 | 0.0772 |
| 0.5569 | 1184 | 0.0649 |
| 0.5574 | 1185 | 0.0315 |
| 0.5579 | 1186 | 0.0239 |
| 0.5583 | 1187 | 0.056 |
| 0.5588 | 1188 | 0.026 |
| 0.5593 | 1189 | 0.0482 |
| 0.5597 | 1190 | 0.0466 |
| 0.5602 | 1191 | 0.0108 |
| 0.5607 | 1192 | 0.0159 |
| 0.5611 | 1193 | 0.0375 |
| 0.5616 | 1194 | 0.0289 |
| 0.5621 | 1195 | 0.0194 |
| 0.5626 | 1196 | 0.0509 |
| 0.5630 | 1197 | 0.0254 |
| 0.5635 | 1198 | 0.0527 |
| 0.5640 | 1199 | 0.0357 |
| 0.5644 | 1200 | 0.0173 |
| 0.5649 | 1201 | 0.0412 |
| 0.5654 | 1202 | 0.0238 |
| 0.5659 | 1203 | 0.0484 |
| 0.5663 | 1204 | 0.0358 |
| 0.5668 | 1205 | 0.0225 |
| 0.5673 | 1206 | 0.0376 |
| 0.5677 | 1207 | 0.0534 |
| 0.5682 | 1208 | 0.0403 |
| 0.5687 | 1209 | 0.0276 |
| 0.5691 | 1210 | 0.0283 |
| 0.5696 | 1211 | 0.0507 |
| 0.5701 | 1212 | 0.0275 |
| 0.5706 | 1213 | 0.0526 |
| 0.5710 | 1214 | 0.0347 |
| 0.5715 | 1215 | 0.0416 |
| 0.5720 | 1216 | 0.0428 |
| 0.5724 | 1217 | 0.0427 |
| 0.5729 | 1218 | 0.041 |
| 0.5734 | 1219 | 0.0245 |
| 0.5738 | 1220 | 0.0268 |
| 0.5743 | 1221 | 0.0187 |
| 0.5748 | 1222 | 0.015 |
| 0.5753 | 1223 | 0.0392 |
| 0.5757 | 1224 | 0.0144 |
| 0.5762 | 1225 | 0.0321 |
| 0.5767 | 1226 | 0.0503 |
| 0.5771 | 1227 | 0.0339 |
| 0.5776 | 1228 | 0.0418 |
| 0.5781 | 1229 | 0.033 |
| 0.5786 | 1230 | 0.0318 |
| 0.5790 | 1231 | 0.041 |
| 0.5795 | 1232 | 0.0363 |
| 0.5800 | 1233 | 0.0267 |
| 0.5804 | 1234 | 0.0365 |
| 0.5809 | 1235 | 0.0249 |
| 0.5814 | 1236 | 0.0194 |
| 0.5818 | 1237 | 0.0366 |
| 0.5823 | 1238 | 0.0474 |
| 0.5828 | 1239 | 0.0349 |
| 0.5833 | 1240 | 0.0144 |
| 0.5837 | 1241 | 0.027 |
| 0.5842 | 1242 | 0.0241 |
| 0.5847 | 1243 | 0.0249 |
| 0.5851 | 1244 | 0.012 |
| 0.5856 | 1245 | 0.0304 |
| 0.5861 | 1246 | 0.011 |
| 0.5865 | 1247 | 0.0765 |
| 0.5870 | 1248 | 0.0354 |
| 0.5875 | 1249 | 0.0108 |
| 0.5880 | 1250 | 0.0239 |
| 0.5884 | 1251 | 0.0306 |
| 0.5889 | 1252 | 0.0125 |
| 0.5894 | 1253 | 0.0507 |
| 0.5898 | 1254 | 0.025 |
| 0.5903 | 1255 | 0.0323 |
| 0.5908 | 1256 | 0.0369 |
| 0.5913 | 1257 | 0.0261 |
| 0.5917 | 1258 | 0.0648 |
| 0.5922 | 1259 | 0.018 |
| 0.5927 | 1260 | 0.0223 |
| 0.5931 | 1261 | 0.0223 |
| 0.5936 | 1262 | 0.0403 |
| 0.5941 | 1263 | 0.0103 |
| 0.5945 | 1264 | 0.0234 |
| 0.5950 | 1265 | 0.0236 |
| 0.5955 | 1266 | 0.0166 |
| 0.5960 | 1267 | 0.0085 |
| 0.5964 | 1268 | 0.0298 |
| 0.5969 | 1269 | 0.0317 |
| 0.5974 | 1270 | 0.072 |
| 0.5978 | 1271 | 0.0558 |
| 0.5983 | 1272 | 0.0391 |
| 0.5988 | 1273 | 0.0379 |
| 0.5992 | 1274 | 0.0913 |
| 0.5997 | 1275 | 0.0224 |
| 0.6002 | 1276 | 0.014 |
| 0.6007 | 1277 | 0.027 |
| 0.6011 | 1278 | 0.0593 |
| 0.6016 | 1279 | 0.0214 |
| 0.6021 | 1280 | 0.0191 |
| 0.6025 | 1281 | 0.047 |
| 0.6030 | 1282 | 0.0157 |
| 0.6035 | 1283 | 0.0468 |
| 0.6040 | 1284 | 0.0166 |
| 0.6044 | 1285 | 0.0189 |
| 0.6049 | 1286 | 0.0462 |
| 0.6054 | 1287 | 0.017 |
| 0.6058 | 1288 | 0.03 |
| 0.6063 | 1289 | 0.054 |
| 0.6068 | 1290 | 0.0276 |
| 0.6072 | 1291 | 0.0139 |
| 0.6077 | 1292 | 0.0226 |
| 0.6082 | 1293 | 0.0391 |
| 0.6087 | 1294 | 0.0717 |
| 0.6091 | 1295 | 0.0472 |
| 0.6096 | 1296 | 0.0697 |
| 0.6101 | 1297 | 0.0271 |
| 0.6105 | 1298 | 0.057 |
| 0.6110 | 1299 | 0.0263 |
| 0.6115 | 1300 | 0.0842 |
| 0.6119 | 1301 | 0.0174 |
| 0.6124 | 1302 | 0.049 |
| 0.6129 | 1303 | 0.036 |
| 0.6134 | 1304 | 0.0444 |
| 0.6138 | 1305 | 0.0255 |
| 0.6143 | 1306 | 0.0619 |
| 0.6148 | 1307 | 0.0275 |
| 0.6152 | 1308 | 0.0144 |
| 0.6157 | 1309 | 0.0356 |
| 0.6162 | 1310 | 0.018 |
| 0.6167 | 1311 | 0.0494 |
| 0.6171 | 1312 | 0.0611 |
| 0.6176 | 1313 | 0.0312 |
| 0.6181 | 1314 | 0.0212 |
| 0.6185 | 1315 | 0.0217 |
| 0.6190 | 1316 | 0.042 |
| 0.6195 | 1317 | 0.0328 |
| 0.6199 | 1318 | 0.0538 |
| 0.6204 | 1319 | 0.065 |
| 0.6209 | 1320 | 0.0173 |
| 0.6214 | 1321 | 0.0166 |
| 0.6218 | 1322 | 0.0235 |
| 0.6223 | 1323 | 0.0165 |
| 0.6228 | 1324 | 0.0389 |
| 0.6232 | 1325 | 0.0311 |
| 0.6237 | 1326 | 0.0616 |
| 0.6242 | 1327 | 0.0463 |
| 0.6246 | 1328 | 0.014 |
| 0.6251 | 1329 | 0.0339 |
| 0.6256 | 1330 | 0.0086 |
| 0.6261 | 1331 | 0.013 |
| 0.6265 | 1332 | 0.0379 |
| 0.6270 | 1333 | 0.0461 |
| 0.6275 | 1334 | 0.0193 |
| 0.6279 | 1335 | 0.0236 |
| 0.6284 | 1336 | 0.0181 |
| 0.6289 | 1337 | 0.0903 |
| 0.6294 | 1338 | 0.0178 |
| 0.6298 | 1339 | 0.0384 |
| 0.6303 | 1340 | 0.0255 |
| 0.6308 | 1341 | 0.05 |
| 0.6312 | 1342 | 0.0132 |
| 0.6317 | 1343 | 0.0188 |
| 0.6322 | 1344 | 0.0974 |
| 0.6326 | 1345 | 0.0304 |
| 0.6331 | 1346 | 0.0716 |
| 0.6336 | 1347 | 0.0147 |
| 0.6341 | 1348 | 0.0549 |
| 0.6345 | 1349 | 0.0454 |
| 0.6350 | 1350 | 0.0282 |
| 0.6355 | 1351 | 0.0185 |
| 0.6359 | 1352 | 0.038 |
| 0.6364 | 1353 | 0.0197 |
| 0.6369 | 1354 | 0.0481 |
| 0.6373 | 1355 | 0.0117 |
| 0.6378 | 1356 | 0.0193 |
| 0.6383 | 1357 | 0.0072 |
| 0.6388 | 1358 | 0.035 |
| 0.6392 | 1359 | 0.0446 |
| 0.6397 | 1360 | 0.0321 |
| 0.6402 | 1361 | 0.0622 |
| 0.6406 | 1362 | 0.0391 |
| 0.6411 | 1363 | 0.011 |
| 0.6416 | 1364 | 0.0204 |
| 0.6421 | 1365 | 0.0228 |
| 0.6425 | 1366 | 0.038 |
| 0.6430 | 1367 | 0.0167 |
| 0.6435 | 1368 | 0.027 |
| 0.6439 | 1369 | 0.0178 |
| 0.6444 | 1370 | 0.0434 |
| 0.6449 | 1371 | 0.0312 |
| 0.6453 | 1372 | 0.0162 |
| 0.6458 | 1373 | 0.0175 |
| 0.6463 | 1374 | 0.0192 |
| 0.6468 | 1375 | 0.0256 |
| 0.6472 | 1376 | 0.0748 |
| 0.6477 | 1377 | 0.0341 |
| 0.6482 | 1378 | 0.033 |
| 0.6486 | 1379 | 0.0303 |
| 0.6491 | 1380 | 0.008 |
| 0.6496 | 1381 | 0.026 |
| 0.6500 | 1382 | 0.0306 |
| 0.6505 | 1383 | 0.0096 |
| 0.6510 | 1384 | 0.0576 |
| 0.6515 | 1385 | 0.0808 |
| 0.6519 | 1386 | 0.0542 |
| 0.6524 | 1387 | 0.0292 |
| 0.6529 | 1388 | 0.0385 |
| 0.6533 | 1389 | 0.0657 |
| 0.6538 | 1390 | 0.0239 |
| 0.6543 | 1391 | 0.0457 |
| 0.6548 | 1392 | 0.0126 |
| 0.6552 | 1393 | 0.0121 |
| 0.6557 | 1394 | 0.0236 |
| 0.6562 | 1395 | 0.0299 |
| 0.6566 | 1396 | 0.0638 |
| 0.6571 | 1397 | 0.0209 |
| 0.6576 | 1398 | 0.0457 |
| 0.6580 | 1399 | 0.0258 |
| 0.6585 | 1400 | 0.0434 |
| 0.6590 | 1401 | 0.0497 |
| 0.6595 | 1402 | 0.0447 |
| 0.6599 | 1403 | 0.0333 |
| 0.6604 | 1404 | 0.0289 |
| 0.6609 | 1405 | 0.0601 |
| 0.6613 | 1406 | 0.063 |
| 0.6618 | 1407 | 0.0138 |
| 0.6623 | 1408 | 0.0104 |
| 0.6627 | 1409 | 0.0208 |
| 0.6632 | 1410 | 0.0262 |
| 0.6637 | 1411 | 0.0302 |
| 0.6642 | 1412 | 0.0256 |
| 0.6646 | 1413 | 0.0254 |
| 0.6651 | 1414 | 0.0226 |
| 0.6656 | 1415 | 0.0129 |
| 0.6660 | 1416 | 0.077 |
| 0.6665 | 1417 | 0.0174 |
| 0.6670 | 1418 | 0.02 |
| 0.6675 | 1419 | 0.0271 |
| 0.6679 | 1420 | 0.0125 |
| 0.6684 | 1421 | 0.066 |
| 0.6689 | 1422 | 0.0214 |
| 0.6693 | 1423 | 0.0361 |
| 0.6698 | 1424 | 0.0136 |
| 0.6703 | 1425 | 0.0397 |
| 0.6707 | 1426 | 0.0352 |
| 0.6712 | 1427 | 0.0278 |
| 0.6717 | 1428 | 0.0106 |
| 0.6722 | 1429 | 0.0297 |
| 0.6726 | 1430 | 0.0546 |
| 0.6731 | 1431 | 0.0148 |
| 0.6736 | 1432 | 0.0194 |
| 0.6740 | 1433 | 0.0571 |
| 0.6745 | 1434 | 0.0329 |
| 0.6750 | 1435 | 0.0154 |
| 0.6754 | 1436 | 0.0634 |
| 0.6759 | 1437 | 0.02 |
| 0.6764 | 1438 | 0.0176 |
| 0.6769 | 1439 | 0.0389 |
| 0.6773 | 1440 | 0.0192 |
| 0.6778 | 1441 | 0.0265 |
| 0.6783 | 1442 | 0.0205 |
| 0.6787 | 1443 | 0.0153 |
| 0.6792 | 1444 | 0.0437 |
| 0.6797 | 1445 | 0.032 |
| 0.6802 | 1446 | 0.0156 |
| 0.6806 | 1447 | 0.0506 |
| 0.6811 | 1448 | 0.0135 |
| 0.6816 | 1449 | 0.0234 |
| 0.6820 | 1450 | 0.0183 |
| 0.6825 | 1451 | 0.0674 |
| 0.6830 | 1452 | 0.0633 |
| 0.6834 | 1453 | 0.0262 |
| 0.6839 | 1454 | 0.0173 |
| 0.6844 | 1455 | 0.0247 |
| 0.6849 | 1456 | 0.0277 |
| 0.6853 | 1457 | 0.0299 |
| 0.6858 | 1458 | 0.046 |
| 0.6863 | 1459 | 0.0223 |
| 0.6867 | 1460 | 0.0207 |
| 0.6872 | 1461 | 0.0162 |
| 0.6877 | 1462 | 0.0383 |
| 0.6881 | 1463 | 0.0219 |
| 0.6886 | 1464 | 0.0885 |
| 0.6891 | 1465 | 0.0338 |
| 0.6896 | 1466 | 0.0151 |
| 0.6900 | 1467 | 0.0324 |
| 0.6905 | 1468 | 0.0104 |
| 0.6910 | 1469 | 0.0159 |
| 0.6914 | 1470 | 0.0507 |
| 0.6919 | 1471 | 0.0468 |
| 0.6924 | 1472 | 0.0438 |
| 0.6929 | 1473 | 0.0438 |
| 0.6933 | 1474 | 0.0413 |
| 0.6938 | 1475 | 0.027 |
| 0.6943 | 1476 | 0.0261 |
| 0.6947 | 1477 | 0.0521 |
| 0.6952 | 1478 | 0.0526 |
| 0.6957 | 1479 | 0.0333 |
| 0.6961 | 1480 | 0.0407 |
| 0.6966 | 1481 | 0.019 |
| 0.6971 | 1482 | 0.0314 |
| 0.6976 | 1483 | 0.0298 |
| 0.6980 | 1484 | 0.0613 |
| 0.6985 | 1485 | 0.0168 |
| 0.6990 | 1486 | 0.016 |
| 0.6994 | 1487 | 0.0366 |
| 0.6999 | 1488 | 0.0289 |
| 0.7004 | 1489 | 0.0761 |
| 0.7008 | 1490 | 0.0172 |
| 0.7013 | 1491 | 0.023 |
| 0.7018 | 1492 | 0.0137 |
| 0.7023 | 1493 | 0.0359 |
| 0.7027 | 1494 | 0.013 |
| 0.7032 | 1495 | 0.0088 |
| 0.7037 | 1496 | 0.0463 |
| 0.7041 | 1497 | 0.072 |
| 0.7046 | 1498 | 0.0394 |
| 0.7051 | 1499 | 0.0144 |
| 0.7056 | 1500 | 0.0289 |
| 0.7060 | 1501 | 0.0744 |
| 0.7065 | 1502 | 0.0617 |
| 0.7070 | 1503 | 0.0206 |
| 0.7074 | 1504 | 0.0194 |
| 0.7079 | 1505 | 0.0325 |
| 0.7084 | 1506 | 0.0475 |
| 0.7088 | 1507 | 0.0168 |
| 0.7093 | 1508 | 0.0156 |
| 0.7098 | 1509 | 0.0433 |
| 0.7103 | 1510 | 0.0273 |
| 0.7107 | 1511 | 0.0147 |
| 0.7112 | 1512 | 0.0346 |
| 0.7117 | 1513 | 0.0179 |
| 0.7121 | 1514 | 0.0175 |
| 0.7126 | 1515 | 0.0736 |
| 0.7131 | 1516 | 0.0308 |
| 0.7135 | 1517 | 0.0416 |
| 0.7140 | 1518 | 0.0134 |
| 0.7145 | 1519 | 0.0392 |
| 0.7150 | 1520 | 0.0118 |
| 0.7154 | 1521 | 0.0376 |
| 0.7159 | 1522 | 0.0461 |
| 0.7164 | 1523 | 0.0539 |
| 0.7168 | 1524 | 0.0132 |
| 0.7173 | 1525 | 0.0602 |
| 0.7178 | 1526 | 0.0475 |
| 0.7183 | 1527 | 0.0236 |
| 0.7187 | 1528 | 0.0471 |
| 0.7192 | 1529 | 0.02 |
| 0.7197 | 1530 | 0.0202 |
| 0.7201 | 1531 | 0.0347 |
| 0.7206 | 1532 | 0.0224 |
| 0.7211 | 1533 | 0.0169 |
| 0.7215 | 1534 | 0.0373 |
| 0.7220 | 1535 | 0.0356 |
| 0.7225 | 1536 | 0.0288 |
| 0.7230 | 1537 | 0.0236 |
| 0.7234 | 1538 | 0.0135 |
| 0.7239 | 1539 | 0.0185 |
| 0.7244 | 1540 | 0.0186 |
| 0.7248 | 1541 | 0.0151 |
| 0.7253 | 1542 | 0.0122 |
| 0.7258 | 1543 | 0.0132 |
| 0.7262 | 1544 | 0.036 |
| 0.7267 | 1545 | 0.0164 |
| 0.7272 | 1546 | 0.0437 |
| 0.7277 | 1547 | 0.0115 |
| 0.7281 | 1548 | 0.0139 |
| 0.7286 | 1549 | 0.058 |
| 0.7291 | 1550 | 0.0074 |
| 0.7295 | 1551 | 0.0555 |
| 0.7300 | 1552 | 0.0163 |
| 0.7305 | 1553 | 0.0263 |
| 0.7310 | 1554 | 0.0132 |
| 0.7314 | 1555 | 0.0186 |
| 0.7319 | 1556 | 0.0389 |
| 0.7324 | 1557 | 0.0205 |
| 0.7328 | 1558 | 0.0615 |
| 0.7333 | 1559 | 0.0672 |
| 0.7338 | 1560 | 0.0103 |
| 0.7342 | 1561 | 0.0481 |
| 0.7347 | 1562 | 0.0098 |
| 0.7352 | 1563 | 0.0158 |
| 0.7357 | 1564 | 0.0237 |
| 0.7361 | 1565 | 0.046 |
| 0.7366 | 1566 | 0.0268 |
| 0.7371 | 1567 | 0.0208 |
| 0.7375 | 1568 | 0.0226 |
| 0.7380 | 1569 | 0.0249 |
| 0.7385 | 1570 | 0.0086 |
| 0.7389 | 1571 | 0.0258 |
| 0.7394 | 1572 | 0.1062 |
| 0.7399 | 1573 | 0.0356 |
| 0.7404 | 1574 | 0.0257 |
| 0.7408 | 1575 | 0.0511 |
| 0.7413 | 1576 | 0.0476 |
| 0.7418 | 1577 | 0.0189 |
| 0.7422 | 1578 | 0.0495 |
| 0.7427 | 1579 | 0.015 |
| 0.7432 | 1580 | 0.0269 |
| 0.7437 | 1581 | 0.0286 |
| 0.7441 | 1582 | 0.0271 |
| 0.7446 | 1583 | 0.035 |
| 0.7451 | 1584 | 0.0348 |
| 0.7455 | 1585 | 0.0209 |
| 0.7460 | 1586 | 0.0183 |
| 0.7465 | 1587 | 0.0406 |
| 0.7469 | 1588 | 0.0178 |
| 0.7474 | 1589 | 0.0488 |
| 0.7479 | 1590 | 0.0176 |
| 0.7484 | 1591 | 0.0314 |
| 0.7488 | 1592 | 0.0262 |
| 0.7493 | 1593 | 0.0417 |
| 0.7498 | 1594 | 0.0702 |
| 0.7502 | 1595 | 0.025 |
| 0.7507 | 1596 | 0.0078 |
| 0.7512 | 1597 | 0.0142 |
| 0.7516 | 1598 | 0.0095 |
| 0.7521 | 1599 | 0.0158 |
| 0.7526 | 1600 | 0.0325 |
| 0.7531 | 1601 | 0.0171 |
| 0.7535 | 1602 | 0.0233 |
| 0.7540 | 1603 | 0.0185 |
| 0.7545 | 1604 | 0.0241 |
| 0.7549 | 1605 | 0.0172 |
| 0.7554 | 1606 | 0.0302 |
| 0.7559 | 1607 | 0.0338 |
| 0.7563 | 1608 | 0.0129 |
| 0.7568 | 1609 | 0.0172 |
| 0.7573 | 1610 | 0.048 |
| 0.7578 | 1611 | 0.0794 |
| 0.7582 | 1612 | 0.0314 |
| 0.7587 | 1613 | 0.0169 |
| 0.7592 | 1614 | 0.0289 |
| 0.7596 | 1615 | 0.0095 |
| 0.7601 | 1616 | 0.0857 |
| 0.7606 | 1617 | 0.0166 |
| 0.7611 | 1618 | 0.0804 |
| 0.7615 | 1619 | 0.0321 |
| 0.7620 | 1620 | 0.0216 |
| 0.7625 | 1621 | 0.0405 |
| 0.7629 | 1622 | 0.0327 |
| 0.7634 | 1623 | 0.0675 |
| 0.7639 | 1624 | 0.0283 |
| 0.7643 | 1625 | 0.0242 |
| 0.7648 | 1626 | 0.0218 |
| 0.7653 | 1627 | 0.0358 |
| 0.7658 | 1628 | 0.0104 |
| 0.7662 | 1629 | 0.0278 |
| 0.7667 | 1630 | 0.0577 |
| 0.7672 | 1631 | 0.0145 |
| 0.7676 | 1632 | 0.0262 |
| 0.7681 | 1633 | 0.014 |
| 0.7686 | 1634 | 0.0393 |
| 0.7690 | 1635 | 0.0181 |
| 0.7695 | 1636 | 0.012 |
| 0.7700 | 1637 | 0.0186 |
| 0.7705 | 1638 | 0.0322 |
| 0.7709 | 1639 | 0.0187 |
| 0.7714 | 1640 | 0.0809 |
| 0.7719 | 1641 | 0.038 |
| 0.7723 | 1642 | 0.013 |
| 0.7728 | 1643 | 0.0095 |
| 0.7733 | 1644 | 0.026 |
| 0.7738 | 1645 | 0.0469 |
| 0.7742 | 1646 | 0.0355 |
| 0.7747 | 1647 | 0.0089 |
| 0.7752 | 1648 | 0.0125 |
| 0.7756 | 1649 | 0.0229 |
| 0.7761 | 1650 | 0.0206 |
| 0.7766 | 1651 | 0.0211 |
| 0.7770 | 1652 | 0.0273 |
| 0.7775 | 1653 | 0.0329 |
| 0.7780 | 1654 | 0.0342 |
| 0.7785 | 1655 | 0.084 |
| 0.7789 | 1656 | 0.0178 |
| 0.7794 | 1657 | 0.0433 |
| 0.7799 | 1658 | 0.018 |
| 0.7803 | 1659 | 0.0741 |
| 0.7808 | 1660 | 0.0993 |
| 0.7813 | 1661 | 0.0276 |
| 0.7817 | 1662 | 0.0134 |
| 0.7822 | 1663 | 0.0561 |
| 0.7827 | 1664 | 0.0288 |
| 0.7832 | 1665 | 0.0491 |
| 0.7836 | 1666 | 0.0554 |
| 0.7841 | 1667 | 0.0338 |
| 0.7846 | 1668 | 0.0251 |
| 0.7850 | 1669 | 0.0091 |
| 0.7855 | 1670 | 0.0357 |
| 0.7860 | 1671 | 0.033 |
| 0.7865 | 1672 | 0.0764 |
| 0.7869 | 1673 | 0.016 |
| 0.7874 | 1674 | 0.012 |
| 0.7879 | 1675 | 0.0289 |
| 0.7883 | 1676 | 0.0157 |
| 0.7888 | 1677 | 0.05 |
| 0.7893 | 1678 | 0.062 |
| 0.7897 | 1679 | 0.0291 |
| 0.7902 | 1680 | 0.0275 |
| 0.7907 | 1681 | 0.0186 |
| 0.7912 | 1682 | 0.0302 |
| 0.7916 | 1683 | 0.0351 |
| 0.7921 | 1684 | 0.0342 |
| 0.7926 | 1685 | 0.0909 |
| 0.7930 | 1686 | 0.0197 |
| 0.7935 | 1687 | 0.0163 |
| 0.7940 | 1688 | 0.0178 |
| 0.7944 | 1689 | 0.1046 |
| 0.7949 | 1690 | 0.0242 |
| 0.7954 | 1691 | 0.0288 |
| 0.7959 | 1692 | 0.0504 |
| 0.7963 | 1693 | 0.0132 |
| 0.7968 | 1694 | 0.0071 |
| 0.7973 | 1695 | 0.0235 |
| 0.7977 | 1696 | 0.0608 |
| 0.7982 | 1697 | 0.0484 |
| 0.7987 | 1698 | 0.0457 |
| 0.7992 | 1699 | 0.0141 |
| 0.7996 | 1700 | 0.0574 |
| 0.8001 | 1701 | 0.0196 |
| 0.8006 | 1702 | 0.0328 |
| 0.8010 | 1703 | 0.0203 |
| 0.8015 | 1704 | 0.0161 |
| 0.8020 | 1705 | 0.028 |
| 0.8024 | 1706 | 0.061 |
| 0.8029 | 1707 | 0.0422 |
| 0.8034 | 1708 | 0.0659 |
| 0.8039 | 1709 | 0.0097 |
| 0.8043 | 1710 | 0.0361 |
| 0.8048 | 1711 | 0.0382 |
| 0.8053 | 1712 | 0.0313 |
| 0.8057 | 1713 | 0.0342 |
| 0.8062 | 1714 | 0.0389 |
| 0.8067 | 1715 | 0.0463 |
| 0.8071 | 1716 | 0.0351 |
| 0.8076 | 1717 | 0.02 |
| 0.8081 | 1718 | 0.0458 |
| 0.8086 | 1719 | 0.0376 |
| 0.8090 | 1720 | 0.0126 |
| 0.8095 | 1721 | 0.0178 |
| 0.8100 | 1722 | 0.0251 |
| 0.8104 | 1723 | 0.0402 |
| 0.8109 | 1724 | 0.0228 |
| 0.8114 | 1725 | 0.0378 |
| 0.8119 | 1726 | 0.0235 |
| 0.8123 | 1727 | 0.0094 |
| 0.8128 | 1728 | 0.0691 |
| 0.8133 | 1729 | 0.0455 |
| 0.8137 | 1730 | 0.0485 |
| 0.8142 | 1731 | 0.0324 |
| 0.8147 | 1732 | 0.0147 |
| 0.8151 | 1733 | 0.0137 |
| 0.8156 | 1734 | 0.0539 |
| 0.8161 | 1735 | 0.0337 |
| 0.8166 | 1736 | 0.0219 |
| 0.8170 | 1737 | 0.0189 |
| 0.8175 | 1738 | 0.0304 |
| 0.8180 | 1739 | 0.0639 |
| 0.8184 | 1740 | 0.0341 |
| 0.8189 | 1741 | 0.0452 |
| 0.8194 | 1742 | 0.0143 |
| 0.8198 | 1743 | 0.0376 |
| 0.8203 | 1744 | 0.033 |
| 0.8208 | 1745 | 0.0448 |
| 0.8213 | 1746 | 0.0401 |
| 0.8217 | 1747 | 0.0773 |
| 0.8222 | 1748 | 0.0277 |
| 0.8227 | 1749 | 0.0374 |
| 0.8231 | 1750 | 0.0152 |
| 0.8236 | 1751 | 0.0447 |
| 0.8241 | 1752 | 0.0458 |
| 0.8246 | 1753 | 0.0154 |
| 0.8250 | 1754 | 0.016 |
| 0.8255 | 1755 | 0.0285 |
| 0.8260 | 1756 | 0.021 |
| 0.8264 | 1757 | 0.0151 |
| 0.8269 | 1758 | 0.0129 |
| 0.8274 | 1759 | 0.0214 |
| 0.8278 | 1760 | 0.0215 |
| 0.8283 | 1761 | 0.0156 |
| 0.8288 | 1762 | 0.015 |
| 0.8293 | 1763 | 0.0334 |
| 0.8297 | 1764 | 0.029 |
| 0.8302 | 1765 | 0.0084 |
| 0.8307 | 1766 | 0.0385 |
| 0.8311 | 1767 | 0.0152 |
| 0.8316 | 1768 | 0.0467 |
| 0.8321 | 1769 | 0.0088 |
| 0.8325 | 1770 | 0.0226 |
| 0.8330 | 1771 | 0.0535 |
| 0.8335 | 1772 | 0.0445 |
| 0.8340 | 1773 | 0.051 |
| 0.8344 | 1774 | 0.0323 |
| 0.8349 | 1775 | 0.0151 |
| 0.8354 | 1776 | 0.0574 |
| 0.8358 | 1777 | 0.0314 |
| 0.8363 | 1778 | 0.0214 |
| 0.8368 | 1779 | 0.0147 |
| 0.8373 | 1780 | 0.0408 |
| 0.8377 | 1781 | 0.0122 |
| 0.8382 | 1782 | 0.0209 |
| 0.8387 | 1783 | 0.0133 |
| 0.8391 | 1784 | 0.0285 |
| 0.8396 | 1785 | 0.0422 |
| 0.8401 | 1786 | 0.0163 |
| 0.8405 | 1787 | 0.0145 |
| 0.8410 | 1788 | 0.0174 |
| 0.8415 | 1789 | 0.0086 |
| 0.8420 | 1790 | 0.0121 |
| 0.8424 | 1791 | 0.0309 |
| 0.8429 | 1792 | 0.0142 |
| 0.8434 | 1793 | 0.025 |
| 0.8438 | 1794 | 0.0204 |
| 0.8443 | 1795 | 0.0631 |
| 0.8448 | 1796 | 0.0561 |
| 0.8452 | 1797 | 0.0806 |
| 0.8457 | 1798 | 0.0088 |
| 0.8462 | 1799 | 0.009 |
| 0.8467 | 1800 | 0.0112 |
| 0.8471 | 1801 | 0.0274 |
| 0.8476 | 1802 | 0.0354 |
| 0.8481 | 1803 | 0.0238 |
| 0.8485 | 1804 | 0.0261 |
| 0.8490 | 1805 | 0.0251 |
| 0.8495 | 1806 | 0.0394 |
| 0.8500 | 1807 | 0.0101 |
| 0.8504 | 1808 | 0.0126 |
| 0.8509 | 1809 | 0.0246 |
| 0.8514 | 1810 | 0.0172 |
| 0.8518 | 1811 | 0.0152 |
| 0.8523 | 1812 | 0.0437 |
| 0.8528 | 1813 | 0.0377 |
| 0.8532 | 1814 | 0.0316 |
| 0.8537 | 1815 | 0.0201 |
| 0.8542 | 1816 | 0.0214 |
| 0.8547 | 1817 | 0.009 |
| 0.8551 | 1818 | 0.0149 |
| 0.8556 | 1819 | 0.035 |
| 0.8561 | 1820 | 0.0129 |
| 0.8565 | 1821 | 0.0506 |
| 0.8570 | 1822 | 0.0165 |
| 0.8575 | 1823 | 0.0166 |
| 0.8579 | 1824 | 0.0326 |
| 0.8584 | 1825 | 0.0199 |
| 0.8589 | 1826 | 0.0433 |
| 0.8594 | 1827 | 0.0485 |
| 0.8598 | 1828 | 0.0323 |
| 0.8603 | 1829 | 0.0315 |
| 0.8608 | 1830 | 0.063 |
| 0.8612 | 1831 | 0.0584 |
| 0.8617 | 1832 | 0.0362 |
| 0.8622 | 1833 | 0.0274 |
| 0.8627 | 1834 | 0.0167 |
| 0.8631 | 1835 | 0.0228 |
| 0.8636 | 1836 | 0.0293 |
| 0.8641 | 1837 | 0.0294 |
| 0.8645 | 1838 | 0.0115 |
| 0.8650 | 1839 | 0.0098 |
| 0.8655 | 1840 | 0.0132 |
| 0.8659 | 1841 | 0.0354 |
| 0.8664 | 1842 | 0.0535 |
| 0.8669 | 1843 | 0.0255 |
| 0.8674 | 1844 | 0.0308 |
| 0.8678 | 1845 | 0.0248 |
| 0.8683 | 1846 | 0.0222 |
| 0.8688 | 1847 | 0.0762 |
| 0.8692 | 1848 | 0.0197 |
| 0.8697 | 1849 | 0.0132 |
| 0.8702 | 1850 | 0.0334 |
| 0.8706 | 1851 | 0.0146 |
| 0.8711 | 1852 | 0.0283 |
| 0.8716 | 1853 | 0.0161 |
| 0.8721 | 1854 | 0.0364 |
| 0.8725 | 1855 | 0.0543 |
| 0.8730 | 1856 | 0.0867 |
| 0.8735 | 1857 | 0.0328 |
| 0.8739 | 1858 | 0.0752 |
| 0.8744 | 1859 | 0.1049 |
| 0.8749 | 1860 | 0.0342 |
| 0.8754 | 1861 | 0.0183 |
| 0.8758 | 1862 | 0.0317 |
| 0.8763 | 1863 | 0.0141 |
| 0.8768 | 1864 | 0.0336 |
| 0.8772 | 1865 | 0.0197 |
| 0.8777 | 1866 | 0.0235 |
| 0.8782 | 1867 | 0.015 |
| 0.8786 | 1868 | 0.0228 |
| 0.8791 | 1869 | 0.0085 |
| 0.8796 | 1870 | 0.028 |
| 0.8801 | 1871 | 0.0188 |
| 0.8805 | 1872 | 0.0353 |
| 0.8810 | 1873 | 0.0494 |
| 0.8815 | 1874 | 0.0597 |
| 0.8819 | 1875 | 0.065 |
| 0.8824 | 1876 | 0.011 |
| 0.8829 | 1877 | 0.0243 |
| 0.8833 | 1878 | 0.0187 |
| 0.8838 | 1879 | 0.0279 |
| 0.8843 | 1880 | 0.0399 |
| 0.8848 | 1881 | 0.0398 |
| 0.8852 | 1882 | 0.0104 |
| 0.8857 | 1883 | 0.0192 |
| 0.8862 | 1884 | 0.0293 |
| 0.8866 | 1885 | 0.0145 |
| 0.8871 | 1886 | 0.0143 |
| 0.8876 | 1887 | 0.049 |
| 0.8881 | 1888 | 0.0125 |
| 0.8885 | 1889 | 0.0618 |
| 0.8890 | 1890 | 0.0181 |
| 0.8895 | 1891 | 0.0232 |
| 0.8899 | 1892 | 0.0352 |
| 0.8904 | 1893 | 0.0387 |
| 0.8909 | 1894 | 0.0188 |
| 0.8913 | 1895 | 0.0249 |
| 0.8918 | 1896 | 0.0254 |
| 0.8923 | 1897 | 0.0385 |
| 0.8928 | 1898 | 0.0234 |
| 0.8932 | 1899 | 0.0456 |
| 0.8937 | 1900 | 0.014 |
| 0.8942 | 1901 | 0.0285 |
| 0.8946 | 1902 | 0.0375 |
| 0.8951 | 1903 | 0.0208 |
| 0.8956 | 1904 | 0.0333 |
| 0.8960 | 1905 | 0.0368 |
| 0.8965 | 1906 | 0.0457 |
| 0.8970 | 1907 | 0.0503 |
| 0.8975 | 1908 | 0.0454 |
| 0.8979 | 1909 | 0.0281 |
| 0.8984 | 1910 | 0.0285 |
| 0.8989 | 1911 | 0.0168 |
| 0.8993 | 1912 | 0.0167 |
| 0.8998 | 1913 | 0.0196 |
| 0.9003 | 1914 | 0.0321 |
| 0.9008 | 1915 | 0.0175 |
| 0.9012 | 1916 | 0.0077 |
| 0.9017 | 1917 | 0.0412 |
| 0.9022 | 1918 | 0.0375 |
| 0.9026 | 1919 | 0.0112 |
| 0.9031 | 1920 | 0.0301 |
| 0.9036 | 1921 | 0.0615 |
| 0.9040 | 1922 | 0.0824 |
| 0.9045 | 1923 | 0.0154 |
| 0.9050 | 1924 | 0.0897 |
| 0.9055 | 1925 | 0.0179 |
| 0.9059 | 1926 | 0.0203 |
| 0.9064 | 1927 | 0.0118 |
| 0.9069 | 1928 | 0.0131 |
| 0.9073 | 1929 | 0.0379 |
| 0.9078 | 1930 | 0.079 |
| 0.9083 | 1931 | 0.0632 |
| 0.9087 | 1932 | 0.0119 |
| 0.9092 | 1933 | 0.0223 |
| 0.9097 | 1934 | 0.0278 |
| 0.9102 | 1935 | 0.013 |
| 0.9106 | 1936 | 0.0562 |
| 0.9111 | 1937 | 0.0098 |
| 0.9116 | 1938 | 0.0167 |
| 0.9120 | 1939 | 0.0395 |
| 0.9125 | 1940 | 0.0388 |
| 0.9130 | 1941 | 0.054 |
| 0.9135 | 1942 | 0.0143 |
| 0.9139 | 1943 | 0.0289 |
| 0.9144 | 1944 | 0.0094 |
| 0.9149 | 1945 | 0.0081 |
| 0.9153 | 1946 | 0.0276 |
| 0.9158 | 1947 | 0.0165 |
| 0.9163 | 1948 | 0.016 |
| 0.9167 | 1949 | 0.02 |
| 0.9172 | 1950 | 0.0304 |
| 0.9177 | 1951 | 0.0126 |
| 0.9182 | 1952 | 0.0688 |
| 0.9186 | 1953 | 0.0153 |
| 0.9191 | 1954 | 0.0142 |
| 0.9196 | 1955 | 0.0296 |
| 0.9200 | 1956 | 0.0279 |
| 0.9205 | 1957 | 0.0162 |
| 0.9210 | 1958 | 0.0181 |
| 0.9214 | 1959 | 0.0264 |
| 0.9219 | 1960 | 0.021 |
| 0.9224 | 1961 | 0.01 |
| 0.9229 | 1962 | 0.0228 |
| 0.9233 | 1963 | 0.0268 |
| 0.9238 | 1964 | 0.0272 |
| 0.9243 | 1965 | 0.049 |
| 0.9247 | 1966 | 0.0356 |
| 0.9252 | 1967 | 0.0093 |
| 0.9257 | 1968 | 0.0233 |
| 0.9262 | 1969 | 0.0158 |
| 0.9266 | 1970 | 0.0272 |
| 0.9271 | 1971 | 0.0835 |
| 0.9276 | 1972 | 0.0193 |
| 0.9280 | 1973 | 0.0243 |
| 0.9285 | 1974 | 0.0215 |
| 0.9290 | 1975 | 0.0268 |
| 0.9294 | 1976 | 0.0213 |
| 0.9299 | 1977 | 0.0536 |
| 0.9304 | 1978 | 0.029 |
| 0.9309 | 1979 | 0.024 |
| 0.9313 | 1980 | 0.0288 |
| 0.9318 | 1981 | 0.0123 |
| 0.9323 | 1982 | 0.0535 |
| 0.9327 | 1983 | 0.0452 |
| 0.9332 | 1984 | 0.042 |
| 0.9337 | 1985 | 0.0944 |
| 0.9341 | 1986 | 0.0159 |
| 0.9346 | 1987 | 0.0189 |
| 0.9351 | 1988 | 0.0239 |
| 0.9356 | 1989 | 0.0304 |
| 0.9360 | 1990 | 0.0282 |
| 0.9365 | 1991 | 0.0192 |
| 0.9370 | 1992 | 0.0276 |
| 0.9374 | 1993 | 0.025 |
| 0.9379 | 1994 | 0.0431 |
| 0.9384 | 1995 | 0.0233 |
| 0.9389 | 1996 | 0.0179 |
| 0.9393 | 1997 | 0.0682 |
| 0.9398 | 1998 | 0.0286 |
| 0.9403 | 1999 | 0.122 |
| 0.9407 | 2000 | 0.1082 |
| 0.9412 | 2001 | 0.0068 |
| 0.9417 | 2002 | 0.0352 |
| 0.9421 | 2003 | 0.0305 |
| 0.9426 | 2004 | 0.0085 |
| 0.9431 | 2005 | 0.0226 |
| 0.9436 | 2006 | 0.0065 |
| 0.9440 | 2007 | 0.0191 |
| 0.9445 | 2008 | 0.0604 |
| 0.9450 | 2009 | 0.0283 |
| 0.9454 | 2010 | 0.0233 |
| 0.9459 | 2011 | 0.0211 |
| 0.9464 | 2012 | 0.0125 |
| 0.9468 | 2013 | 0.0406 |
| 0.9473 | 2014 | 0.0259 |
| 0.9478 | 2015 | 0.0224 |
| 0.9483 | 2016 | 0.0153 |
| 0.9487 | 2017 | 0.0364 |
| 0.9492 | 2018 | 0.0212 |
| 0.9497 | 2019 | 0.0333 |
| 0.9501 | 2020 | 0.0183 |
| 0.9506 | 2021 | 0.0119 |
| 0.9511 | 2022 | 0.0435 |
| 0.9516 | 2023 | 0.0237 |
| 0.9520 | 2024 | 0.0359 |
| 0.9525 | 2025 | 0.0104 |
| 0.9530 | 2026 | 0.0219 |
| 0.9534 | 2027 | 0.0655 |
| 0.9539 | 2028 | 0.0157 |
| 0.9544 | 2029 | 0.0126 |
| 0.9548 | 2030 | 0.0275 |
| 0.9553 | 2031 | 0.0154 |
| 0.9558 | 2032 | 0.0243 |
| 0.9563 | 2033 | 0.0422 |
| 0.9567 | 2034 | 0.0111 |
| 0.9572 | 2035 | 0.055 |
| 0.9577 | 2036 | 0.0225 |
| 0.9581 | 2037 | 0.0399 |
| 0.9586 | 2038 | 0.015 |
| 0.9591 | 2039 | 0.0227 |
| 0.9595 | 2040 | 0.0487 |
| 0.9600 | 2041 | 0.0228 |
| 0.9605 | 2042 | 0.0525 |
| 0.9610 | 2043 | 0.0486 |
| 0.9614 | 2044 | 0.0281 |
| 0.9619 | 2045 | 0.047 |
| 0.9624 | 2046 | 0.0098 |
| 0.9628 | 2047 | 0.0575 |
| 0.9633 | 2048 | 0.0228 |
| 0.9638 | 2049 | 0.0929 |
| 0.9643 | 2050 | 0.0267 |
| 0.9647 | 2051 | 0.0075 |
| 0.9652 | 2052 | 0.0203 |
| 0.9657 | 2053 | 0.0102 |
| 0.9661 | 2054 | 0.0432 |
| 0.9666 | 2055 | 0.0486 |
| 0.9671 | 2056 | 0.0088 |
| 0.9675 | 2057 | 0.0227 |
| 0.9680 | 2058 | 0.0115 |
| 0.9685 | 2059 | 0.0325 |
| 0.9690 | 2060 | 0.0359 |
| 0.9694 | 2061 | 0.0077 |
| 0.9699 | 2062 | 0.0609 |
| 0.9704 | 2063 | 0.0339 |
| 0.9708 | 2064 | 0.0227 |
| 0.9713 | 2065 | 0.0604 |
| 0.9718 | 2066 | 0.0545 |
| 0.9722 | 2067 | 0.0283 |
| 0.9727 | 2068 | 0.0111 |
| 0.9732 | 2069 | 0.0326 |
| 0.9737 | 2070 | 0.0391 |
| 0.9741 | 2071 | 0.0112 |
| 0.9746 | 2072 | 0.0298 |
| 0.9751 | 2073 | 0.0447 |
| 0.9755 | 2074 | 0.0265 |
| 0.9760 | 2075 | 0.0253 |
| 0.9765 | 2076 | 0.0172 |
| 0.9770 | 2077 | 0.025 |
| 0.9774 | 2078 | 0.0236 |
| 0.9779 | 2079 | 0.0205 |
| 0.9784 | 2080 | 0.0667 |
| 0.9788 | 2081 | 0.0275 |
| 0.9793 | 2082 | 0.0426 |
| 0.9798 | 2083 | 0.0138 |
| 0.9802 | 2084 | 0.0278 |
| 0.9807 | 2085 | 0.0234 |
| 0.9812 | 2086 | 0.042 |
| 0.9817 | 2087 | 0.0206 |
| 0.9821 | 2088 | 0.0238 |
| 0.9826 | 2089 | 0.0327 |
| 0.9831 | 2090 | 0.024 |
| 0.9835 | 2091 | 0.0428 |
| 0.9840 | 2092 | 0.0104 |
| 0.9845 | 2093 | 0.0062 |
| 0.9849 | 2094 | 0.0267 |
| 0.9854 | 2095 | 0.0308 |
| 0.9859 | 2096 | 0.0209 |
| 0.9864 | 2097 | 0.0251 |
| 0.9868 | 2098 | 0.0119 |
| 0.9873 | 2099 | 0.0243 |
| 0.9878 | 2100 | 0.0155 |
| 0.9882 | 2101 | 0.0164 |
| 0.9887 | 2102 | 0.0362 |
| 0.9892 | 2103 | 0.0392 |
| 0.9897 | 2104 | 0.0187 |
| 0.9901 | 2105 | 0.013 |
| 0.9906 | 2106 | 0.054 |
| 0.9911 | 2107 | 0.0125 |
| 0.9915 | 2108 | 0.0245 |
| 0.9920 | 2109 | 0.0054 |
| 0.9925 | 2110 | 0.0127 |
| 0.9929 | 2111 | 0.0232 |
| 0.9934 | 2112 | 0.011 |
| 0.9939 | 2113 | 0.0382 |
| 0.9944 | 2114 | 0.0452 |
| 0.9948 | 2115 | 0.0711 |
| 0.9953 | 2116 | 0.0091 |
| 0.9958 | 2117 | 0.0216 |
| 0.9962 | 2118 | 0.0517 |
| 0.9967 | 2119 | 0.0376 |
| 0.9972 | 2120 | 0.0115 |
| 0.9976 | 2121 | 0.0188 |
| 0.9981 | 2122 | 0.0214 |
| 0.9986 | 2123 | 0.0164 |
| 0.9991 | 2124 | 0.0336 |
| 0.9995 | 2125 | 0.0152 |
| 1.0000 | 2126 | 0.0779 |
| 1.0005 | 2127 | 0.0165 |
| 1.0009 | 2128 | 0.0695 |
| 1.0014 | 2129 | 0.036 |
| 1.0019 | 2130 | 0.0197 |
| 1.0024 | 2131 | 0.0436 |
| 1.0028 | 2132 | 0.0148 |
| 1.0033 | 2133 | 0.0159 |
| 1.0038 | 2134 | 0.0282 |
| 1.0042 | 2135 | 0.0138 |
| 1.0047 | 2136 | 0.016 |
| 1.0052 | 2137 | 0.0268 |
| 1.0056 | 2138 | 0.0448 |
| 1.0061 | 2139 | 0.0176 |
| 1.0066 | 2140 | 0.0118 |
| 1.0071 | 2141 | 0.0538 |
| 1.0075 | 2142 | 0.0169 |
| 1.0080 | 2143 | 0.0158 |
| 1.0085 | 2144 | 0.0434 |
| 1.0089 | 2145 | 0.0278 |
| 1.0094 | 2146 | 0.0117 |
| 1.0099 | 2147 | 0.0479 |
| 1.0103 | 2148 | 0.0163 |
| 1.0108 | 2149 | 0.0178 |
| 1.0113 | 2150 | 0.0423 |
| 1.0118 | 2151 | 0.0159 |
| 1.0122 | 2152 | 0.0125 |
| 1.0127 | 2153 | 0.0582 |
| 1.0132 | 2154 | 0.0296 |
| 1.0136 | 2155 | 0.0111 |
| 1.0141 | 2156 | 0.0328 |
| 1.0146 | 2157 | 0.0098 |
| 1.0151 | 2158 | 0.0284 |
| 1.0155 | 2159 | 0.0755 |
| 1.0160 | 2160 | 0.0158 |
| 1.0165 | 2161 | 0.0185 |
| 1.0169 | 2162 | 0.0279 |
| 1.0174 | 2163 | 0.0141 |
| 1.0179 | 2164 | 0.0196 |
| 1.0183 | 2165 | 0.0148 |
| 1.0188 | 2166 | 0.0318 |
| 1.0193 | 2167 | 0.0394 |
| 1.0198 | 2168 | 0.0309 |
| 1.0202 | 2169 | 0.0198 |
| 1.0207 | 2170 | 0.0265 |
| 1.0212 | 2171 | 0.0222 |
| 1.0216 | 2172 | 0.0391 |
| 1.0221 | 2173 | 0.0094 |
| 1.0226 | 2174 | 0.0526 |
| 1.0230 | 2175 | 0.0476 |
| 1.0235 | 2176 | 0.0148 |
| 1.0240 | 2177 | 0.024 |
| 1.0245 | 2178 | 0.0268 |
| 1.0249 | 2179 | 0.0105 |
| 1.0254 | 2180 | 0.0118 |
| 1.0259 | 2181 | 0.0145 |
| 1.0263 | 2182 | 0.0751 |
| 1.0268 | 2183 | 0.0325 |
| 1.0273 | 2184 | 0.0216 |
| 1.0278 | 2185 | 0.0108 |
| 1.0282 | 2186 | 0.0089 |
| 1.0287 | 2187 | 0.0245 |
| 1.0292 | 2188 | 0.0553 |
| 1.0296 | 2189 | 0.0544 |
| 1.0301 | 2190 | 0.0144 |
| 1.0306 | 2191 | 0.011 |
| 1.0310 | 2192 | 0.0211 |
| 1.0315 | 2193 | 0.0334 |
| 1.0320 | 2194 | 0.0346 |
| 1.0325 | 2195 | 0.0153 |
| 1.0329 | 2196 | 0.0354 |
| 1.0334 | 2197 | 0.0254 |
| 1.0339 | 2198 | 0.0788 |
| 1.0343 | 2199 | 0.0681 |
| 1.0348 | 2200 | 0.0066 |
| 1.0353 | 2201 | 0.0206 |
| 1.0357 | 2202 | 0.0395 |
| 1.0362 | 2203 | 0.0294 |
| 1.0367 | 2204 | 0.0168 |
| 1.0372 | 2205 | 0.0276 |
| 1.0376 | 2206 | 0.0158 |
| 1.0381 | 2207 | 0.0282 |
| 1.0386 | 2208 | 0.0732 |
| 1.0390 | 2209 | 0.0503 |
| 1.0395 | 2210 | 0.0297 |
| 1.0400 | 2211 | 0.0213 |
| 1.0405 | 2212 | 0.0142 |
| 1.0409 | 2213 | 0.0278 |
| 1.0414 | 2214 | 0.014 |
| 1.0419 | 2215 | 0.0312 |
| 1.0423 | 2216 | 0.015 |
| 1.0428 | 2217 | 0.0107 |
| 1.0433 | 2218 | 0.0484 |
| 1.0437 | 2219 | 0.0435 |
| 1.0442 | 2220 | 0.0246 |
| 1.0447 | 2221 | 0.0128 |
| 1.0452 | 2222 | 0.0414 |
| 1.0456 | 2223 | 0.0559 |
| 1.0461 | 2224 | 0.0272 |
| 1.0466 | 2225 | 0.0225 |
| 1.0470 | 2226 | 0.0312 |
| 1.0475 | 2227 | 0.0211 |
| 1.0480 | 2228 | 0.0188 |
| 1.0484 | 2229 | 0.014 |
| 1.0489 | 2230 | 0.0054 |
| 1.0494 | 2231 | 0.0193 |
| 1.0499 | 2232 | 0.0126 |
| 1.0503 | 2233 | 0.0048 |
| 1.0508 | 2234 | 0.0256 |
| 1.0513 | 2235 | 0.0187 |
| 1.0517 | 2236 | 0.0045 |
| 1.0522 | 2237 | 0.0099 |
| 1.0527 | 2238 | 0.0241 |
| 1.0532 | 2239 | 0.0552 |
| 1.0536 | 2240 | 0.0107 |
| 1.0541 | 2241 | 0.0137 |
| 1.0546 | 2242 | 0.0108 |
| 1.0550 | 2243 | 0.0062 |
| 1.0555 | 2244 | 0.0192 |
| 1.0560 | 2245 | 0.0152 |
| 1.0564 | 2246 | 0.0326 |
| 1.0569 | 2247 | 0.0222 |
| 1.0574 | 2248 | 0.0292 |
| 1.0579 | 2249 | 0.043 |
| 1.0583 | 2250 | 0.0258 |
| 1.0588 | 2251 | 0.0259 |
| 1.0593 | 2252 | 0.0236 |
| 1.0597 | 2253 | 0.0244 |
| 1.0602 | 2254 | 0.0132 |
| 1.0607 | 2255 | 0.0275 |
| 1.0611 | 2256 | 0.0257 |
| 1.0616 | 2257 | 0.0085 |
| 1.0621 | 2258 | 0.0131 |
| 1.0626 | 2259 | 0.0283 |
| 1.0630 | 2260 | 0.0143 |
| 1.0635 | 2261 | 0.009 |
| 1.0640 | 2262 | 0.0088 |
| 1.0644 | 2263 | 0.0107 |
| 1.0649 | 2264 | 0.0713 |
| 1.0654 | 2265 | 0.0093 |
| 1.0659 | 2266 | 0.0178 |
| 1.0663 | 2267 | 0.0054 |
| 1.0668 | 2268 | 0.0118 |
| 1.0673 | 2269 | 0.0218 |
| 1.0677 | 2270 | 0.0085 |
| 1.0682 | 2271 | 0.0257 |
| 1.0687 | 2272 | 0.0318 |
| 1.0691 | 2273 | 0.0198 |
| 1.0696 | 2274 | 0.0127 |
| 1.0701 | 2275 | 0.0297 |
| 1.0706 | 2276 | 0.0188 |
| 1.0710 | 2277 | 0.0198 |
| 1.0715 | 2278 | 0.0176 |
| 1.0720 | 2279 | 0.0092 |
| 1.0724 | 2280 | 0.0213 |
| 1.0729 | 2281 | 0.0272 |
| 1.0734 | 2282 | 0.0089 |
| 1.0738 | 2283 | 0.0102 |
| 1.0743 | 2284 | 0.0092 |
| 1.0748 | 2285 | 0.0212 |
| 1.0753 | 2286 | 0.008 |
| 1.0757 | 2287 | 0.0224 |
| 1.0762 | 2288 | 0.0328 |
| 1.0767 | 2289 | 0.0314 |
| 1.0771 | 2290 | 0.0682 |
| 1.0776 | 2291 | 0.0077 |
| 1.0781 | 2292 | 0.014 |
| 1.0786 | 2293 | 0.0155 |
| 1.0790 | 2294 | 0.0239 |
| 1.0795 | 2295 | 0.0342 |
| 1.0800 | 2296 | 0.0064 |
| 1.0804 | 2297 | 0.0142 |
| 1.0809 | 2298 | 0.0365 |
| 1.0814 | 2299 | 0.013 |
| 1.0818 | 2300 | 0.0116 |
| 1.0823 | 2301 | 0.0267 |
| 1.0828 | 2302 | 0.0637 |
| 1.0833 | 2303 | 0.0268 |
| 1.0837 | 2304 | 0.0095 |
| 1.0842 | 2305 | 0.014 |
| 1.0847 | 2306 | 0.0157 |
| 1.0851 | 2307 | 0.0165 |
| 1.0856 | 2308 | 0.0153 |
| 1.0861 | 2309 | 0.0474 |
| 1.0865 | 2310 | 0.0305 |
| 1.0870 | 2311 | 0.0057 |
| 1.0875 | 2312 | 0.0077 |
| 1.0880 | 2313 | 0.0193 |
| 1.0884 | 2314 | 0.0067 |
| 1.0889 | 2315 | 0.01 |
| 1.0894 | 2316 | 0.0093 |
| 1.0898 | 2317 | 0.0122 |
| 1.0903 | 2318 | 0.0147 |
| 1.0908 | 2319 | 0.0356 |
| 1.0913 | 2320 | 0.0397 |
| 1.0917 | 2321 | 0.0059 |
| 1.0922 | 2322 | 0.0266 |
| 1.0927 | 2323 | 0.0088 |
| 1.0931 | 2324 | 0.0101 |
| 1.0936 | 2325 | 0.0101 |
| 1.0941 | 2326 | 0.0261 |
| 1.0945 | 2327 | 0.0075 |
| 1.0950 | 2328 | 0.0338 |
| 1.0955 | 2329 | 0.0519 |
| 1.0960 | 2330 | 0.0086 |
| 1.0964 | 2331 | 0.0133 |
| 1.0969 | 2332 | 0.0311 |
| 1.0974 | 2333 | 0.0179 |
| 1.0978 | 2334 | 0.0103 |
| 1.0983 | 2335 | 0.023 |
| 1.0988 | 2336 | 0.0119 |
| 1.0992 | 2337 | 0.0238 |
| 1.0997 | 2338 | 0.0096 |
| 1.1002 | 2339 | 0.0121 |
| 1.1007 | 2340 | 0.0216 |
| 1.1011 | 2341 | 0.0285 |
| 1.1016 | 2342 | 0.0196 |
| 1.1021 | 2343 | 0.037 |
| 1.1025 | 2344 | 0.004 |
| 1.1030 | 2345 | 0.0332 |
| 1.1035 | 2346 | 0.0138 |
| 1.1040 | 2347 | 0.0102 |
| 1.1044 | 2348 | 0.016 |
| 1.1049 | 2349 | 0.0251 |
| 1.1054 | 2350 | 0.0128 |
| 1.1058 | 2351 | 0.0112 |
| 1.1063 | 2352 | 0.0115 |
| 1.1068 | 2353 | 0.0114 |
| 1.1072 | 2354 | 0.0105 |
| 1.1077 | 2355 | 0.0188 |
| 1.1082 | 2356 | 0.0313 |
| 1.1087 | 2357 | 0.0325 |
| 1.1091 | 2358 | 0.0303 |
| 1.1096 | 2359 | 0.009 |
| 1.1101 | 2360 | 0.0137 |
| 1.1105 | 2361 | 0.0193 |
| 1.1110 | 2362 | 0.0135 |
| 1.1115 | 2363 | 0.0219 |
| 1.1119 | 2364 | 0.0087 |
| 1.1124 | 2365 | 0.0142 |
| 1.1129 | 2366 | 0.0046 |
| 1.1134 | 2367 | 0.0126 |
| 1.1138 | 2368 | 0.0217 |
| 1.1143 | 2369 | 0.0211 |
| 1.1148 | 2370 | 0.0057 |
| 1.1152 | 2371 | 0.0209 |
| 1.1157 | 2372 | 0.0072 |
| 1.1162 | 2373 | 0.0055 |
| 1.1167 | 2374 | 0.0243 |
| 1.1171 | 2375 | 0.0214 |
| 1.1176 | 2376 | 0.0076 |
| 1.1181 | 2377 | 0.0123 |
| 1.1185 | 2378 | 0.0124 |
| 1.1190 | 2379 | 0.014 |
| 1.1195 | 2380 | 0.0329 |
| 1.1199 | 2381 | 0.0075 |
| 1.1204 | 2382 | 0.0113 |
| 1.1209 | 2383 | 0.0062 |
| 1.1214 | 2384 | 0.0312 |
| 1.1218 | 2385 | 0.0291 |
| 1.1223 | 2386 | 0.0081 |
| 1.1228 | 2387 | 0.0062 |
| 1.1232 | 2388 | 0.0226 |
| 1.1237 | 2389 | 0.0174 |
| 1.1242 | 2390 | 0.0041 |
| 1.1246 | 2391 | 0.0067 |
| 1.1251 | 2392 | 0.0276 |
| 1.1256 | 2393 | 0.0186 |
| 1.1261 | 2394 | 0.0237 |
| 1.1265 | 2395 | 0.0104 |
| 1.1270 | 2396 | 0.0149 |
| 1.1275 | 2397 | 0.0079 |
| 1.1279 | 2398 | 0.027 |
| 1.1284 | 2399 | 0.0076 |
| 1.1289 | 2400 | 0.0115 |
| 1.1294 | 2401 | 0.0248 |
| 1.1298 | 2402 | 0.0069 |
| 1.1303 | 2403 | 0.0138 |
| 1.1308 | 2404 | 0.0277 |
| 1.1312 | 2405 | 0.0144 |
| 1.1317 | 2406 | 0.0178 |
| 1.1322 | 2407 | 0.0141 |
| 1.1326 | 2408 | 0.0266 |
| 1.1331 | 2409 | 0.0073 |
| 1.1336 | 2410 | 0.0095 |
| 1.1341 | 2411 | 0.015 |
| 1.1345 | 2412 | 0.0091 |
| 1.1350 | 2413 | 0.0207 |
| 1.1355 | 2414 | 0.0145 |
| 1.1359 | 2415 | 0.0162 |
| 1.1364 | 2416 | 0.006 |
| 1.1369 | 2417 | 0.0068 |
| 1.1373 | 2418 | 0.0096 |
| 1.1378 | 2419 | 0.0171 |
| 1.1383 | 2420 | 0.0239 |
| 1.1388 | 2421 | 0.0091 |
| 1.1392 | 2422 | 0.046 |
| 1.1397 | 2423 | 0.0356 |
| 1.1402 | 2424 | 0.0341 |
| 1.1406 | 2425 | 0.0179 |
| 1.1411 | 2426 | 0.0209 |
| 1.1416 | 2427 | 0.0058 |
| 1.1421 | 2428 | 0.0115 |
| 1.1425 | 2429 | 0.0079 |
| 1.1430 | 2430 | 0.0399 |
| 1.1435 | 2431 | 0.0088 |
| 1.1439 | 2432 | 0.0166 |
| 1.1444 | 2433 | 0.0204 |
| 1.1449 | 2434 | 0.0061 |
| 1.1453 | 2435 | 0.0094 |
| 1.1458 | 2436 | 0.0484 |
| 1.1463 | 2437 | 0.0105 |
| 1.1468 | 2438 | 0.058 |
| 1.1472 | 2439 | 0.024 |
| 1.1477 | 2440 | 0.0247 |
| 1.1482 | 2441 | 0.0145 |
| 1.1486 | 2442 | 0.0199 |
| 1.1491 | 2443 | 0.0126 |
| 1.1496 | 2444 | 0.025 |
| 1.1500 | 2445 | 0.0049 |
| 1.1505 | 2446 | 0.0119 |
| 1.1510 | 2447 | 0.0081 |
| 1.1515 | 2448 | 0.0175 |
| 1.1519 | 2449 | 0.0088 |
| 1.1524 | 2450 | 0.0117 |
| 1.1529 | 2451 | 0.0202 |
| 1.1533 | 2452 | 0.0065 |
| 1.1538 | 2453 | 0.0194 |
| 1.1543 | 2454 | 0.0053 |
| 1.1548 | 2455 | 0.0052 |
| 1.1552 | 2456 | 0.0195 |
| 1.1557 | 2457 | 0.0147 |
| 1.1562 | 2458 | 0.0083 |
| 1.1566 | 2459 | 0.0076 |
| 1.1571 | 2460 | 0.0085 |
| 1.1576 | 2461 | 0.0253 |
| 1.1580 | 2462 | 0.0057 |
| 1.1585 | 2463 | 0.0167 |
| 1.1590 | 2464 | 0.0039 |
| 1.1595 | 2465 | 0.0086 |
| 1.1599 | 2466 | 0.0082 |
| 1.1604 | 2467 | 0.0209 |
| 1.1609 | 2468 | 0.0263 |
| 1.1613 | 2469 | 0.0073 |
| 1.1618 | 2470 | 0.0046 |
| 1.1623 | 2471 | 0.0233 |
| 1.1627 | 2472 | 0.0179 |
| 1.1632 | 2473 | 0.0071 |
| 1.1637 | 2474 | 0.0046 |
| 1.1642 | 2475 | 0.0044 |
| 1.1646 | 2476 | 0.0068 |
| 1.1651 | 2477 | 0.0164 |
| 1.1656 | 2478 | 0.0306 |
| 1.1660 | 2479 | 0.0033 |
| 1.1665 | 2480 | 0.026 |
| 1.1670 | 2481 | 0.0135 |
| 1.1675 | 2482 | 0.0206 |
| 1.1679 | 2483 | 0.0135 |
| 1.1684 | 2484 | 0.0098 |
| 1.1689 | 2485 | 0.0106 |
| 1.1693 | 2486 | 0.0109 |
| 1.1698 | 2487 | 0.0182 |
| 1.1703 | 2488 | 0.0183 |
| 1.1707 | 2489 | 0.0362 |
| 1.1712 | 2490 | 0.0055 |
| 1.1717 | 2491 | 0.0139 |
| 1.1722 | 2492 | 0.0089 |
| 1.1726 | 2493 | 0.0213 |
| 1.1731 | 2494 | 0.0074 |
| 1.1736 | 2495 | 0.0086 |
| 1.1740 | 2496 | 0.0086 |
| 1.1745 | 2497 | 0.0104 |
| 1.1750 | 2498 | 0.0175 |
| 1.1754 | 2499 | 0.0063 |
| 1.1759 | 2500 | 0.0112 |
| 1.1764 | 2501 | 0.0287 |
| 1.1769 | 2502 | 0.0042 |
| 1.1773 | 2503 | 0.0161 |
| 1.1778 | 2504 | 0.0177 |
| 1.1783 | 2505 | 0.0106 |
| 1.1787 | 2506 | 0.0069 |
| 1.1792 | 2507 | 0.0073 |
| 1.1797 | 2508 | 0.0156 |
| 1.1802 | 2509 | 0.0107 |
| 1.1806 | 2510 | 0.0038 |
| 1.1811 | 2511 | 0.0098 |
| 1.1816 | 2512 | 0.0115 |
| 1.1820 | 2513 | 0.0106 |
| 1.1825 | 2514 | 0.0108 |
| 1.1830 | 2515 | 0.0128 |
| 1.1834 | 2516 | 0.0067 |
| 1.1839 | 2517 | 0.0086 |
| 1.1844 | 2518 | 0.0085 |
| 1.1849 | 2519 | 0.0297 |
| 1.1853 | 2520 | 0.0166 |
| 1.1858 | 2521 | 0.0053 |
| 1.1863 | 2522 | 0.0107 |
| 1.1867 | 2523 | 0.0247 |
| 1.1872 | 2524 | 0.0068 |
| 1.1877 | 2525 | 0.019 |
| 1.1881 | 2526 | 0.0185 |
| 1.1886 | 2527 | 0.0069 |
| 1.1891 | 2528 | 0.0502 |
| 1.1896 | 2529 | 0.004 |
| 1.1900 | 2530 | 0.0077 |
| 1.1905 | 2531 | 0.0125 |
| 1.1910 | 2532 | 0.0132 |
| 1.1914 | 2533 | 0.0086 |
| 1.1919 | 2534 | 0.0113 |
| 1.1924 | 2535 | 0.0205 |
| 1.1929 | 2536 | 0.0151 |
| 1.1933 | 2537 | 0.0162 |
| 1.1938 | 2538 | 0.0134 |
| 1.1943 | 2539 | 0.0181 |
| 1.1947 | 2540 | 0.0114 |
| 1.1952 | 2541 | 0.0148 |
| 1.1957 | 2542 | 0.0111 |
| 1.1961 | 2543 | 0.0064 |
| 1.1966 | 2544 | 0.0095 |
| 1.1971 | 2545 | 0.0227 |
| 1.1976 | 2546 | 0.0043 |
| 1.1980 | 2547 | 0.0136 |
| 1.1985 | 2548 | 0.0096 |
| 1.1990 | 2549 | 0.0068 |
| 1.1994 | 2550 | 0.015 |
| 1.1999 | 2551 | 0.0137 |
| 1.2004 | 2552 | 0.0067 |
| 1.2008 | 2553 | 0.0119 |
| 1.2013 | 2554 | 0.0097 |
| 1.2018 | 2555 | 0.0082 |
| 1.2023 | 2556 | 0.0022 |
| 1.2027 | 2557 | 0.0089 |
| 1.2032 | 2558 | 0.0198 |
| 1.2037 | 2559 | 0.0172 |
| 1.2041 | 2560 | 0.0179 |
| 1.2046 | 2561 | 0.0212 |
| 1.2051 | 2562 | 0.0075 |
| 1.2056 | 2563 | 0.0093 |
| 1.2060 | 2564 | 0.0247 |
| 1.2065 | 2565 | 0.0042 |
| 1.2070 | 2566 | 0.007 |
| 1.2074 | 2567 | 0.0234 |
| 1.2079 | 2568 | 0.0057 |
| 1.2084 | 2569 | 0.006 |
| 1.2088 | 2570 | 0.0105 |
| 1.2093 | 2571 | 0.0142 |
| 1.2098 | 2572 | 0.0237 |
| 1.2103 | 2573 | 0.0062 |
| 1.2107 | 2574 | 0.0105 |
| 1.2112 | 2575 | 0.0059 |
| 1.2117 | 2576 | 0.009 |
| 1.2121 | 2577 | 0.008 |
| 1.2126 | 2578 | 0.0106 |
| 1.2131 | 2579 | 0.0056 |
| 1.2135 | 2580 | 0.0054 |
| 1.2140 | 2581 | 0.0077 |
| 1.2145 | 2582 | 0.0094 |
| 1.2150 | 2583 | 0.0085 |
| 1.2154 | 2584 | 0.0155 |
| 1.2159 | 2585 | 0.0078 |
| 1.2164 | 2586 | 0.0092 |
| 1.2168 | 2587 | 0.0049 |
| 1.2173 | 2588 | 0.0045 |
| 1.2178 | 2589 | 0.0078 |
| 1.2183 | 2590 | 0.0156 |
| 1.2187 | 2591 | 0.0161 |
| 1.2192 | 2592 | 0.0065 |
| 1.2197 | 2593 | 0.0134 |
| 1.2201 | 2594 | 0.0119 |
| 1.2206 | 2595 | 0.0187 |
| 1.2211 | 2596 | 0.0097 |
| 1.2215 | 2597 | 0.0094 |
| 1.2220 | 2598 | 0.0064 |
| 1.2225 | 2599 | 0.0066 |
| 1.2230 | 2600 | 0.0057 |
| 1.2234 | 2601 | 0.0042 |
| 1.2239 | 2602 | 0.0092 |
| 1.2244 | 2603 | 0.0079 |
| 1.2248 | 2604 | 0.0076 |
| 1.2253 | 2605 | 0.0095 |
| 1.2258 | 2606 | 0.0136 |
| 1.2262 | 2607 | 0.0043 |
| 1.2267 | 2608 | 0.0067 |
| 1.2272 | 2609 | 0.0091 |
| 1.2277 | 2610 | 0.0045 |
| 1.2281 | 2611 | 0.0045 |
| 1.2286 | 2612 | 0.0123 |
| 1.2291 | 2613 | 0.0322 |
| 1.2295 | 2614 | 0.0112 |
| 1.2300 | 2615 | 0.0049 |
| 1.2305 | 2616 | 0.0043 |
| 1.2310 | 2617 | 0.0058 |
| 1.2314 | 2618 | 0.0085 |
| 1.2319 | 2619 | 0.0105 |
| 1.2324 | 2620 | 0.0059 |
| 1.2328 | 2621 | 0.0157 |
| 1.2333 | 2622 | 0.02 |
| 1.2338 | 2623 | 0.0088 |
| 1.2342 | 2624 | 0.0041 |
| 1.2347 | 2625 | 0.0197 |
| 1.2352 | 2626 | 0.0154 |
| 1.2357 | 2627 | 0.0036 |
| 1.2361 | 2628 | 0.0103 |
| 1.2366 | 2629 | 0.0171 |
| 1.2371 | 2630 | 0.0048 |
| 1.2375 | 2631 | 0.0129 |
| 1.2380 | 2632 | 0.0083 |
| 1.2385 | 2633 | 0.0051 |
| 1.2389 | 2634 | 0.0099 |
| 1.2394 | 2635 | 0.003 |
| 1.2399 | 2636 | 0.0063 |
| 1.2404 | 2637 | 0.0067 |
| 1.2408 | 2638 | 0.0051 |
| 1.2413 | 2639 | 0.0224 |
| 1.2418 | 2640 | 0.0112 |
| 1.2422 | 2641 | 0.0077 |
| 1.2427 | 2642 | 0.0049 |
| 1.2432 | 2643 | 0.0076 |
| 1.2437 | 2644 | 0.0039 |
| 1.2441 | 2645 | 0.0084 |
| 1.2446 | 2646 | 0.0096 |
| 1.2451 | 2647 | 0.0031 |
| 1.2455 | 2648 | 0.0068 |
| 1.2460 | 2649 | 0.0165 |
| 1.2465 | 2650 | 0.0042 |
| 1.2469 | 2651 | 0.0108 |
| 1.2474 | 2652 | 0.0037 |
| 1.2479 | 2653 | 0.0097 |
| 1.2484 | 2654 | 0.0078 |
| 1.2488 | 2655 | 0.0215 |
| 1.2493 | 2656 | 0.0077 |
| 1.2498 | 2657 | 0.0148 |
| 1.2502 | 2658 | 0.0105 |
| 1.2507 | 2659 | 0.0067 |
| 1.2512 | 2660 | 0.0046 |
| 1.2516 | 2661 | 0.0062 |
| 1.2521 | 2662 | 0.0063 |
| 1.2526 | 2663 | 0.0284 |
| 1.2531 | 2664 | 0.0023 |
| 1.2535 | 2665 | 0.007 |
| 1.2540 | 2666 | 0.0044 |
| 1.2545 | 2667 | 0.0206 |
| 1.2549 | 2668 | 0.008 |
| 1.2554 | 2669 | 0.0065 |
| 1.2559 | 2670 | 0.0278 |
| 1.2563 | 2671 | 0.0053 |
| 1.2568 | 2672 | 0.0053 |
| 1.2573 | 2673 | 0.0049 |
| 1.2578 | 2674 | 0.0121 |
| 1.2582 | 2675 | 0.0077 |
| 1.2587 | 2676 | 0.0056 |
| 1.2592 | 2677 | 0.008 |
| 1.2596 | 2678 | 0.0043 |
| 1.2601 | 2679 | 0.0122 |
| 1.2606 | 2680 | 0.0161 |
| 1.2611 | 2681 | 0.007 |
| 1.2615 | 2682 | 0.0111 |
| 1.2620 | 2683 | 0.0106 |
| 1.2625 | 2684 | 0.0056 |
| 1.2629 | 2685 | 0.0124 |
| 1.2634 | 2686 | 0.012 |
| 1.2639 | 2687 | 0.0393 |
| 1.2643 | 2688 | 0.0038 |
| 1.2648 | 2689 | 0.0042 |
| 1.2653 | 2690 | 0.0199 |
| 1.2658 | 2691 | 0.0057 |
| 1.2662 | 2692 | 0.014 |
| 1.2667 | 2693 | 0.007 |
| 1.2672 | 2694 | 0.0112 |
| 1.2676 | 2695 | 0.0033 |
| 1.2681 | 2696 | 0.0139 |
| 1.2686 | 2697 | 0.0046 |
| 1.2690 | 2698 | 0.0196 |
| 1.2695 | 2699 | 0.004 |
| 1.2700 | 2700 | 0.0108 |
| 1.2705 | 2701 | 0.0064 |
| 1.2709 | 2702 | 0.0103 |
| 1.2714 | 2703 | 0.0068 |
| 1.2719 | 2704 | 0.0059 |
| 1.2723 | 2705 | 0.0134 |
| 1.2728 | 2706 | 0.0062 |
| 1.2733 | 2707 | 0.0084 |
| 1.2738 | 2708 | 0.0053 |
| 1.2742 | 2709 | 0.004 |
| 1.2747 | 2710 | 0.0099 |
| 1.2752 | 2711 | 0.0043 |
| 1.2756 | 2712 | 0.0043 |
| 1.2761 | 2713 | 0.0041 |
| 1.2766 | 2714 | 0.0068 |
| 1.2770 | 2715 | 0.0024 |
| 1.2775 | 2716 | 0.0042 |
| 1.2780 | 2717 | 0.0174 |
| 1.2785 | 2718 | 0.014 |
| 1.2789 | 2719 | 0.0052 |
| 1.2794 | 2720 | 0.004 |
| 1.2799 | 2721 | 0.0071 |
| 1.2803 | 2722 | 0.0059 |
| 1.2808 | 2723 | 0.0087 |
| 1.2813 | 2724 | 0.0046 |
| 1.2817 | 2725 | 0.0182 |
| 1.2822 | 2726 | 0.0088 |
| 1.2827 | 2727 | 0.023 |
| 1.2832 | 2728 | 0.0065 |
| 1.2836 | 2729 | 0.0037 |
| 1.2841 | 2730 | 0.016 |
| 1.2846 | 2731 | 0.0032 |
| 1.2850 | 2732 | 0.0032 |
| 1.2855 | 2733 | 0.0111 |
| 1.2860 | 2734 | 0.0126 |
| 1.2865 | 2735 | 0.0046 |
| 1.2869 | 2736 | 0.0048 |
| 1.2874 | 2737 | 0.0054 |
| 1.2879 | 2738 | 0.0134 |
| 1.2883 | 2739 | 0.0548 |
| 1.2888 | 2740 | 0.0055 |
| 1.2893 | 2741 | 0.0065 |
| 1.2897 | 2742 | 0.0071 |
| 1.2902 | 2743 | 0.0092 |
| 1.2907 | 2744 | 0.0248 |
| 1.2912 | 2745 | 0.0067 |
| 1.2916 | 2746 | 0.0102 |
| 1.2921 | 2747 | 0.0036 |
| 1.2926 | 2748 | 0.0055 |
| 1.2930 | 2749 | 0.0045 |
| 1.2935 | 2750 | 0.0108 |
| 1.2940 | 2751 | 0.0071 |
| 1.2944 | 2752 | 0.0049 |
| 1.2949 | 2753 | 0.0109 |
| 1.2954 | 2754 | 0.0424 |
| 1.2959 | 2755 | 0.0085 |
| 1.2963 | 2756 | 0.0116 |
| 1.2968 | 2757 | 0.0067 |
| 1.2973 | 2758 | 0.0168 |
| 1.2977 | 2759 | 0.005 |
| 1.2982 | 2760 | 0.0105 |
| 1.2987 | 2761 | 0.0042 |
| 1.2992 | 2762 | 0.0072 |
| 1.2996 | 2763 | 0.0046 |
| 1.3001 | 2764 | 0.0084 |
| 1.3006 | 2765 | 0.0101 |
| 1.3010 | 2766 | 0.0143 |
| 1.3015 | 2767 | 0.0062 |
| 1.3020 | 2768 | 0.0088 |
| 1.3024 | 2769 | 0.0169 |
| 1.3029 | 2770 | 0.003 |
| 1.3034 | 2771 | 0.0096 |
| 1.3039 | 2772 | 0.0111 |
| 1.3043 | 2773 | 0.0111 |
| 1.3048 | 2774 | 0.0041 |
| 1.3053 | 2775 | 0.0052 |
| 1.3057 | 2776 | 0.0082 |
| 1.3062 | 2777 | 0.0041 |
| 1.3067 | 2778 | 0.0057 |
| 1.3071 | 2779 | 0.0118 |
| 1.3076 | 2780 | 0.0028 |
| 1.3081 | 2781 | 0.027 |
| 1.3086 | 2782 | 0.006 |
| 1.3090 | 2783 | 0.005 |
| 1.3095 | 2784 | 0.0045 |
| 1.3100 | 2785 | 0.005 |
| 1.3104 | 2786 | 0.004 |
| 1.3109 | 2787 | 0.0071 |
| 1.3114 | 2788 | 0.0083 |
| 1.3119 | 2789 | 0.0115 |
| 1.3123 | 2790 | 0.0027 |
| 1.3128 | 2791 | 0.0136 |
| 1.3133 | 2792 | 0.0089 |
| 1.3137 | 2793 | 0.0182 |
| 1.3142 | 2794 | 0.0042 |
| 1.3147 | 2795 | 0.0029 |
| 1.3151 | 2796 | 0.0065 |
| 1.3156 | 2797 | 0.0067 |
| 1.3161 | 2798 | 0.007 |
| 1.3166 | 2799 | 0.0035 |
| 1.3170 | 2800 | 0.005 |
| 1.3175 | 2801 | 0.0221 |
| 1.3180 | 2802 | 0.004 |
| 1.3184 | 2803 | 0.005 |
| 1.3189 | 2804 | 0.0091 |
| 1.3194 | 2805 | 0.0044 |
| 1.3198 | 2806 | 0.004 |
| 1.3203 | 2807 | 0.009 |
| 1.3208 | 2808 | 0.0079 |
| 1.3213 | 2809 | 0.0086 |
| 1.3217 | 2810 | 0.0043 |
| 1.3222 | 2811 | 0.0108 |
| 1.3227 | 2812 | 0.0067 |
| 1.3231 | 2813 | 0.0055 |
| 1.3236 | 2814 | 0.0038 |
| 1.3241 | 2815 | 0.0069 |
| 1.3246 | 2816 | 0.0037 |
| 1.3250 | 2817 | 0.0055 |
| 1.3255 | 2818 | 0.0144 |
| 1.3260 | 2819 | 0.0041 |
| 1.3264 | 2820 | 0.0076 |
| 1.3269 | 2821 | 0.0064 |
| 1.3274 | 2822 | 0.011 |
| 1.3278 | 2823 | 0.0061 |
| 1.3283 | 2824 | 0.0058 |
| 1.3288 | 2825 | 0.0019 |
| 1.3293 | 2826 | 0.0146 |
| 1.3297 | 2827 | 0.0077 |
| 1.3302 | 2828 | 0.0044 |
| 1.3307 | 2829 | 0.0051 |
| 1.3311 | 2830 | 0.0114 |
| 1.3316 | 2831 | 0.0019 |
| 1.3321 | 2832 | 0.0063 |
| 1.3325 | 2833 | 0.0135 |
| 1.3330 | 2834 | 0.0066 |
| 1.3335 | 2835 | 0.0164 |
| 1.3340 | 2836 | 0.0068 |
| 1.3344 | 2837 | 0.011 |
| 1.3349 | 2838 | 0.0086 |
| 1.3354 | 2839 | 0.0027 |
| 1.3358 | 2840 | 0.0053 |
| 1.3363 | 2841 | 0.0151 |
| 1.3368 | 2842 | 0.0045 |
| 1.3373 | 2843 | 0.0085 |
| 1.3377 | 2844 | 0.0063 |
| 1.3382 | 2845 | 0.0211 |
| 1.3387 | 2846 | 0.0042 |
| 1.3391 | 2847 | 0.0125 |
| 1.3396 | 2848 | 0.0045 |
| 1.3401 | 2849 | 0.0088 |
| 1.3405 | 2850 | 0.0084 |
| 1.3410 | 2851 | 0.0085 |
| 1.3415 | 2852 | 0.0054 |
| 1.3420 | 2853 | 0.0048 |
| 1.3424 | 2854 | 0.0045 |
| 1.3429 | 2855 | 0.0559 |
| 1.3434 | 2856 | 0.0067 |
| 1.3438 | 2857 | 0.0073 |
| 1.3443 | 2858 | 0.0175 |
| 1.3448 | 2859 | 0.0104 |
| 1.3452 | 2860 | 0.0075 |
| 1.3457 | 2861 | 0.0054 |
| 1.3462 | 2862 | 0.006 |
| 1.3467 | 2863 | 0.0107 |
| 1.3471 | 2864 | 0.0048 |
| 1.3476 | 2865 | 0.004 |
| 1.3481 | 2866 | 0.008 |
| 1.3485 | 2867 | 0.0033 |
| 1.3490 | 2868 | 0.0072 |
| 1.3495 | 2869 | 0.0075 |
| 1.3500 | 2870 | 0.0072 |
| 1.3504 | 2871 | 0.0189 |
| 1.3509 | 2872 | 0.0127 |
| 1.3514 | 2873 | 0.0031 |
| 1.3518 | 2874 | 0.0052 |
| 1.3523 | 2875 | 0.0173 |
| 1.3528 | 2876 | 0.007 |
| 1.3532 | 2877 | 0.0066 |
| 1.3537 | 2878 | 0.0059 |
| 1.3542 | 2879 | 0.0117 |
| 1.3547 | 2880 | 0.0088 |
| 1.3551 | 2881 | 0.0021 |
| 1.3556 | 2882 | 0.0051 |
| 1.3561 | 2883 | 0.0038 |
| 1.3565 | 2884 | 0.008 |
| 1.3570 | 2885 | 0.0076 |
| 1.3575 | 2886 | 0.01 |
| 1.3579 | 2887 | 0.0032 |
| 1.3584 | 2888 | 0.0055 |
| 1.3589 | 2889 | 0.0081 |
| 1.3594 | 2890 | 0.0116 |
| 1.3598 | 2891 | 0.007 |
| 1.3603 | 2892 | 0.0037 |
| 1.3608 | 2893 | 0.0069 |
| 1.3612 | 2894 | 0.0071 |
| 1.3617 | 2895 | 0.0138 |
| 1.3622 | 2896 | 0.0038 |
| 1.3627 | 2897 | 0.0078 |
| 1.3631 | 2898 | 0.0032 |
| 1.3636 | 2899 | 0.0165 |
| 1.3641 | 2900 | 0.0094 |
| 1.3645 | 2901 | 0.0095 |
| 1.3650 | 2902 | 0.0069 |
| 1.3655 | 2903 | 0.0153 |
| 1.3659 | 2904 | 0.0044 |
| 1.3664 | 2905 | 0.0139 |
| 1.3669 | 2906 | 0.0105 |
| 1.3674 | 2907 | 0.0035 |
| 1.3678 | 2908 | 0.0055 |
| 1.3683 | 2909 | 0.0039 |
| 1.3688 | 2910 | 0.0067 |
| 1.3692 | 2911 | 0.0023 |
| 1.3697 | 2912 | 0.015 |
| 1.3702 | 2913 | 0.0152 |
| 1.3706 | 2914 | 0.0023 |
| 1.3711 | 2915 | 0.0061 |
| 1.3716 | 2916 | 0.0093 |
| 1.3721 | 2917 | 0.0059 |
| 1.3725 | 2918 | 0.0058 |
| 1.3730 | 2919 | 0.0056 |
| 1.3735 | 2920 | 0.0082 |
| 1.3739 | 2921 | 0.0075 |
| 1.3744 | 2922 | 0.012 |
| 1.3749 | 2923 | 0.0042 |
| 1.3754 | 2924 | 0.0054 |
| 1.3758 | 2925 | 0.0069 |
| 1.3763 | 2926 | 0.0052 |
| 1.3768 | 2927 | 0.0065 |
| 1.3772 | 2928 | 0.0051 |
| 1.3777 | 2929 | 0.0243 |
| 1.3782 | 2930 | 0.0023 |
| 1.3786 | 2931 | 0.0127 |
| 1.3791 | 2932 | 0.0104 |
| 1.3796 | 2933 | 0.004 |
| 1.3801 | 2934 | 0.0076 |
| 1.3805 | 2935 | 0.0106 |
| 1.3810 | 2936 | 0.0143 |
| 1.3815 | 2937 | 0.0079 |
| 1.3819 | 2938 | 0.0062 |
| 1.3824 | 2939 | 0.0043 |
| 1.3829 | 2940 | 0.0103 |
| 1.3833 | 2941 | 0.005 |
| 1.3838 | 2942 | 0.0209 |
| 1.3843 | 2943 | 0.0027 |
| 1.3848 | 2944 | 0.0032 |
| 1.3852 | 2945 | 0.0117 |
| 1.3857 | 2946 | 0.0115 |
| 1.3862 | 2947 | 0.0045 |
| 1.3866 | 2948 | 0.0026 |
| 1.3871 | 2949 | 0.0045 |
| 1.3876 | 2950 | 0.0066 |
| 1.3881 | 2951 | 0.004 |
| 1.3885 | 2952 | 0.0139 |
| 1.3890 | 2953 | 0.0035 |
| 1.3895 | 2954 | 0.004 |
| 1.3899 | 2955 | 0.0126 |
| 1.3904 | 2956 | 0.0061 |
| 1.3909 | 2957 | 0.0066 |
| 1.3913 | 2958 | 0.0081 |
| 1.3918 | 2959 | 0.0046 |
| 1.3923 | 2960 | 0.0213 |
| 1.3928 | 2961 | 0.0053 |
| 1.3932 | 2962 | 0.0032 |
| 1.3937 | 2963 | 0.0053 |
| 1.3942 | 2964 | 0.0056 |
| 1.3946 | 2965 | 0.0041 |
| 1.3951 | 2966 | 0.0112 |
| 1.3956 | 2967 | 0.0105 |
| 1.3960 | 2968 | 0.0104 |
| 1.3965 | 2969 | 0.0069 |
| 1.3970 | 2970 | 0.005 |
| 1.3975 | 2971 | 0.0163 |
| 1.3979 | 2972 | 0.0073 |
| 1.3984 | 2973 | 0.0093 |
| 1.3989 | 2974 | 0.0064 |
| 1.3993 | 2975 | 0.005 |
| 1.3998 | 2976 | 0.0022 |
| 1.4003 | 2977 | 0.0066 |
| 1.4008 | 2978 | 0.0077 |
| 1.4012 | 2979 | 0.0102 |
| 1.4017 | 2980 | 0.0055 |
| 1.4022 | 2981 | 0.0053 |
| 1.4026 | 2982 | 0.0091 |
| 1.4031 | 2983 | 0.0034 |
| 1.4036 | 2984 | 0.0066 |
| 1.4040 | 2985 | 0.0092 |
| 1.4045 | 2986 | 0.0044 |
| 1.4050 | 2987 | 0.0044 |
| 1.4055 | 2988 | 0.0023 |
| 1.4059 | 2989 | 0.0072 |
| 1.4064 | 2990 | 0.0081 |
| 1.4069 | 2991 | 0.0031 |
| 1.4073 | 2992 | 0.0055 |
| 1.4078 | 2993 | 0.0042 |
| 1.4083 | 2994 | 0.0094 |
| 1.4087 | 2995 | 0.0024 |
| 1.4092 | 2996 | 0.0035 |
| 1.4097 | 2997 | 0.0029 |
| 1.4102 | 2998 | 0.022 |
| 1.4106 | 2999 | 0.0166 |
| 1.4111 | 3000 | 0.0062 |
| 1.4116 | 3001 | 0.0028 |
| 1.4120 | 3002 | 0.0037 |
| 1.4125 | 3003 | 0.0109 |
| 1.4130 | 3004 | 0.0033 |
| 1.4135 | 3005 | 0.0156 |
| 1.4139 | 3006 | 0.0101 |
| 1.4144 | 3007 | 0.0058 |
| 1.4149 | 3008 | 0.0157 |
| 1.4153 | 3009 | 0.0032 |
| 1.4158 | 3010 | 0.0032 |
| 1.4163 | 3011 | 0.0033 |
| 1.4167 | 3012 | 0.0066 |
| 1.4172 | 3013 | 0.0063 |
| 1.4177 | 3014 | 0.0052 |
| 1.4182 | 3015 | 0.0107 |
| 1.4186 | 3016 | 0.0179 |
| 1.4191 | 3017 | 0.0073 |
| 1.4196 | 3018 | 0.0168 |
| 1.4200 | 3019 | 0.003 |
| 1.4205 | 3020 | 0.0074 |
| 1.4210 | 3021 | 0.0161 |
| 1.4214 | 3022 | 0.0079 |
| 1.4219 | 3023 | 0.0054 |
| 1.4224 | 3024 | 0.0026 |
| 1.4229 | 3025 | 0.0067 |
| 1.4233 | 3026 | 0.0068 |
| 1.4238 | 3027 | 0.0091 |
| 1.4243 | 3028 | 0.0148 |
| 1.4247 | 3029 | 0.0083 |
| 1.4252 | 3030 | 0.0023 |
| 1.4257 | 3031 | 0.0048 |
| 1.4262 | 3032 | 0.0069 |
| 1.4266 | 3033 | 0.011 |
| 1.4271 | 3034 | 0.0117 |
| 1.4276 | 3035 | 0.0059 |
| 1.4280 | 3036 | 0.0069 |
| 1.4285 | 3037 | 0.0047 |
| 1.4290 | 3038 | 0.0086 |
| 1.4294 | 3039 | 0.0084 |
| 1.4299 | 3040 | 0.0052 |
| 1.4304 | 3041 | 0.0124 |
| 1.4309 | 3042 | 0.0046 |
| 1.4313 | 3043 | 0.0079 |
| 1.4318 | 3044 | 0.002 |
| 1.4323 | 3045 | 0.0043 |
| 1.4327 | 3046 | 0.0028 |
| 1.4332 | 3047 | 0.0036 |
| 1.4337 | 3048 | 0.0035 |
| 1.4341 | 3049 | 0.0072 |
| 1.4346 | 3050 | 0.0184 |
| 1.4351 | 3051 | 0.003 |
| 1.4356 | 3052 | 0.014 |
| 1.4360 | 3053 | 0.0058 |
| 1.4365 | 3054 | 0.004 |
| 1.4370 | 3055 | 0.0062 |
| 1.4374 | 3056 | 0.0081 |
| 1.4379 | 3057 | 0.006 |
| 1.4384 | 3058 | 0.0032 |
| 1.4389 | 3059 | 0.0096 |
| 1.4393 | 3060 | 0.021 |
| 1.4398 | 3061 | 0.0033 |
| 1.4403 | 3062 | 0.0035 |
| 1.4407 | 3063 | 0.0082 |
| 1.4412 | 3064 | 0.0059 |
| 1.4417 | 3065 | 0.0035 |
| 1.4421 | 3066 | 0.0054 |
| 1.4426 | 3067 | 0.0082 |
| 1.4431 | 3068 | 0.0062 |
| 1.4436 | 3069 | 0.028 |
| 1.4440 | 3070 | 0.0034 |
| 1.4445 | 3071 | 0.0084 |
| 1.4450 | 3072 | 0.0028 |
| 1.4454 | 3073 | 0.0043 |
| 1.4459 | 3074 | 0.0071 |
| 1.4464 | 3075 | 0.0062 |
| 1.4468 | 3076 | 0.0117 |
| 1.4473 | 3077 | 0.0046 |
| 1.4478 | 3078 | 0.0101 |
| 1.4483 | 3079 | 0.0098 |
| 1.4487 | 3080 | 0.0052 |
| 1.4492 | 3081 | 0.0074 |
| 1.4497 | 3082 | 0.0037 |
| 1.4501 | 3083 | 0.0175 |
| 1.4506 | 3084 | 0.0033 |
| 1.4511 | 3085 | 0.0095 |
| 1.4516 | 3086 | 0.003 |
| 1.4520 | 3087 | 0.0048 |
| 1.4525 | 3088 | 0.0052 |
| 1.4530 | 3089 | 0.0095 |
| 1.4534 | 3090 | 0.0029 |
| 1.4539 | 3091 | 0.0022 |
| 1.4544 | 3092 | 0.007 |
| 1.4548 | 3093 | 0.0069 |
| 1.4553 | 3094 | 0.0027 |
| 1.4558 | 3095 | 0.0042 |
| 1.4563 | 3096 | 0.0088 |
| 1.4567 | 3097 | 0.0052 |
| 1.4572 | 3098 | 0.0093 |
| 1.4577 | 3099 | 0.009 |
| 1.4581 | 3100 | 0.0106 |
| 1.4586 | 3101 | 0.0124 |
| 1.4591 | 3102 | 0.005 |
| 1.4595 | 3103 | 0.0062 |
| 1.4600 | 3104 | 0.003 |
| 1.4605 | 3105 | 0.0053 |
| 1.4610 | 3106 | 0.0044 |
| 1.4614 | 3107 | 0.0074 |
| 1.4619 | 3108 | 0.0065 |
| 1.4624 | 3109 | 0.0042 |
| 1.4628 | 3110 | 0.0047 |
| 1.4633 | 3111 | 0.0024 |
| 1.4638 | 3112 | 0.0125 |
| 1.4643 | 3113 | 0.0031 |
| 1.4647 | 3114 | 0.0053 |
| 1.4652 | 3115 | 0.0035 |
| 1.4657 | 3116 | 0.0029 |
| 1.4661 | 3117 | 0.0038 |
| 1.4666 | 3118 | 0.005 |
| 1.4671 | 3119 | 0.0067 |
| 1.4675 | 3120 | 0.0106 |
| 1.4680 | 3121 | 0.0029 |
| 1.4685 | 3122 | 0.0047 |
| 1.4690 | 3123 | 0.01 |
| 1.4694 | 3124 | 0.0062 |
| 1.4699 | 3125 | 0.0038 |
| 1.4704 | 3126 | 0.0034 |
| 1.4708 | 3127 | 0.0132 |
| 1.4713 | 3128 | 0.0105 |
| 1.4718 | 3129 | 0.004 |
| 1.4722 | 3130 | 0.0104 |
| 1.4727 | 3131 | 0.0025 |
| 1.4732 | 3132 | 0.0067 |
| 1.4737 | 3133 | 0.0071 |
| 1.4741 | 3134 | 0.0038 |
| 1.4746 | 3135 | 0.0092 |
| 1.4751 | 3136 | 0.0223 |
| 1.4755 | 3137 | 0.0087 |
| 1.4760 | 3138 | 0.0067 |
| 1.4765 | 3139 | 0.0029 |
| 1.4770 | 3140 | 0.0019 |
| 1.4774 | 3141 | 0.0028 |
| 1.4779 | 3142 | 0.0032 |
| 1.4784 | 3143 | 0.0024 |
| 1.4788 | 3144 | 0.0041 |
| 1.4793 | 3145 | 0.0045 |
| 1.4798 | 3146 | 0.0073 |
| 1.4802 | 3147 | 0.0115 |
| 1.4807 | 3148 | 0.0051 |
| 1.4812 | 3149 | 0.0395 |
| 1.4817 | 3150 | 0.0054 |
| 1.4821 | 3151 | 0.0062 |
| 1.4826 | 3152 | 0.0047 |
| 1.4831 | 3153 | 0.0086 |
| 1.4835 | 3154 | 0.0031 |
| 1.4840 | 3155 | 0.0101 |
| 1.4845 | 3156 | 0.005 |
| 1.4849 | 3157 | 0.0098 |
| 1.4854 | 3158 | 0.0033 |
| 1.4859 | 3159 | 0.0116 |
| 1.4864 | 3160 | 0.003 |
| 1.4868 | 3161 | 0.0055 |
| 1.4873 | 3162 | 0.0018 |
| 1.4878 | 3163 | 0.0051 |
| 1.4882 | 3164 | 0.0177 |
| 1.4887 | 3165 | 0.0128 |
| 1.4892 | 3166 | 0.0085 |
| 1.4897 | 3167 | 0.0066 |
| 1.4901 | 3168 | 0.006 |
| 1.4906 | 3169 | 0.0095 |
| 1.4911 | 3170 | 0.0073 |
| 1.4915 | 3171 | 0.0077 |
| 1.4920 | 3172 | 0.0108 |
| 1.4925 | 3173 | 0.0054 |
| 1.4929 | 3174 | 0.0086 |
| 1.4934 | 3175 | 0.0048 |
| 1.4939 | 3176 | 0.0046 |
| 1.4944 | 3177 | 0.0092 |
| 1.4948 | 3178 | 0.004 |
| 1.4953 | 3179 | 0.0043 |
| 1.4958 | 3180 | 0.007 |
| 1.4962 | 3181 | 0.0074 |
| 1.4967 | 3182 | 0.0115 |
| 1.4972 | 3183 | 0.0139 |
| 1.4976 | 3184 | 0.0028 |
| 1.4981 | 3185 | 0.0051 |
| 1.4986 | 3186 | 0.0038 |
| 1.4991 | 3187 | 0.0058 |
| 1.4995 | 3188 | 0.0129 |
| 1.5 | 3189 | 0.0082 |
| 1.5005 | 3190 | 0.011 |
| 1.5009 | 3191 | 0.007 |
| 1.5014 | 3192 | 0.005 |
| 1.5019 | 3193 | 0.0013 |
| 1.5024 | 3194 | 0.0085 |
| 1.5028 | 3195 | 0.0061 |
| 1.5033 | 3196 | 0.0026 |
| 1.5038 | 3197 | 0.0226 |
| 1.5042 | 3198 | 0.0127 |
| 1.5047 | 3199 | 0.0116 |
| 1.5052 | 3200 | 0.0196 |
| 1.5056 | 3201 | 0.0099 |
| 1.5061 | 3202 | 0.004 |
| 1.5066 | 3203 | 0.0044 |
| 1.5071 | 3204 | 0.0067 |
| 1.5075 | 3205 | 0.0062 |
| 1.5080 | 3206 | 0.0048 |
| 1.5085 | 3207 | 0.0106 |
| 1.5089 | 3208 | 0.0071 |
| 1.5094 | 3209 | 0.006 |
| 1.5099 | 3210 | 0.0023 |
| 1.5103 | 3211 | 0.0173 |
| 1.5108 | 3212 | 0.0028 |
| 1.5113 | 3213 | 0.0046 |
| 1.5118 | 3214 | 0.0238 |
| 1.5122 | 3215 | 0.0069 |
| 1.5127 | 3216 | 0.0052 |
| 1.5132 | 3217 | 0.0169 |
| 1.5136 | 3218 | 0.0104 |
| 1.5141 | 3219 | 0.0061 |
| 1.5146 | 3220 | 0.0203 |
| 1.5151 | 3221 | 0.0042 |
| 1.5155 | 3222 | 0.0071 |
| 1.5160 | 3223 | 0.0025 |
| 1.5165 | 3224 | 0.0201 |
| 1.5169 | 3225 | 0.0044 |
| 1.5174 | 3226 | 0.0034 |
| 1.5179 | 3227 | 0.008 |
| 1.5183 | 3228 | 0.0039 |
| 1.5188 | 3229 | 0.0031 |
| 1.5193 | 3230 | 0.0063 |
| 1.5198 | 3231 | 0.007 |
| 1.5202 | 3232 | 0.0027 |
| 1.5207 | 3233 | 0.0045 |
| 1.5212 | 3234 | 0.0035 |
| 1.5216 | 3235 | 0.0054 |
| 1.5221 | 3236 | 0.005 |
| 1.5226 | 3237 | 0.0097 |
| 1.5230 | 3238 | 0.0127 |
| 1.5235 | 3239 | 0.0064 |
| 1.5240 | 3240 | 0.0031 |
| 1.5245 | 3241 | 0.0116 |
| 1.5249 | 3242 | 0.0112 |
| 1.5254 | 3243 | 0.0057 |
| 1.5259 | 3244 | 0.0066 |
| 1.5263 | 3245 | 0.0058 |
| 1.5268 | 3246 | 0.0081 |
| 1.5273 | 3247 | 0.0081 |
| 1.5278 | 3248 | 0.0102 |
| 1.5282 | 3249 | 0.0035 |
| 1.5287 | 3250 | 0.0058 |
| 1.5292 | 3251 | 0.023 |
| 1.5296 | 3252 | 0.0117 |
| 1.5301 | 3253 | 0.0036 |
| 1.5306 | 3254 | 0.004 |
| 1.5310 | 3255 | 0.0067 |
| 1.5315 | 3256 | 0.0021 |
| 1.5320 | 3257 | 0.0078 |
| 1.5325 | 3258 | 0.0081 |
| 1.5329 | 3259 | 0.0062 |
| 1.5334 | 3260 | 0.0095 |
| 1.5339 | 3261 | 0.0036 |
| 1.5343 | 3262 | 0.0118 |
| 1.5348 | 3263 | 0.0074 |
| 1.5353 | 3264 | 0.0137 |
| 1.5357 | 3265 | 0.0054 |
| 1.5362 | 3266 | 0.0079 |
| 1.5367 | 3267 | 0.0073 |
| 1.5372 | 3268 | 0.0048 |
| 1.5376 | 3269 | 0.0069 |
| 1.5381 | 3270 | 0.004 |
| 1.5386 | 3271 | 0.025 |
| 1.5390 | 3272 | 0.0074 |
| 1.5395 | 3273 | 0.0031 |
| 1.5400 | 3274 | 0.0036 |
| 1.5405 | 3275 | 0.0028 |
| 1.5409 | 3276 | 0.009 |
| 1.5414 | 3277 | 0.007 |
| 1.5419 | 3278 | 0.0068 |
| 1.5423 | 3279 | 0.0105 |
| 1.5428 | 3280 | 0.0058 |
| 1.5433 | 3281 | 0.0042 |
| 1.5437 | 3282 | 0.0038 |
| 1.5442 | 3283 | 0.0092 |
| 1.5447 | 3284 | 0.0176 |
| 1.5452 | 3285 | 0.0089 |
| 1.5456 | 3286 | 0.0319 |
| 1.5461 | 3287 | 0.0062 |
| 1.5466 | 3288 | 0.0122 |
| 1.5470 | 3289 | 0.0029 |
| 1.5475 | 3290 | 0.0131 |
| 1.5480 | 3291 | 0.006 |
| 1.5484 | 3292 | 0.0038 |
| 1.5489 | 3293 | 0.0096 |
| 1.5494 | 3294 | 0.0053 |
| 1.5499 | 3295 | 0.0041 |
| 1.5503 | 3296 | 0.0115 |
| 1.5508 | 3297 | 0.0061 |
| 1.5513 | 3298 | 0.0043 |
| 1.5517 | 3299 | 0.0049 |
| 1.5522 | 3300 | 0.0046 |
| 1.5527 | 3301 | 0.0079 |
| 1.5532 | 3302 | 0.0102 |
| 1.5536 | 3303 | 0.0042 |
| 1.5541 | 3304 | 0.0105 |
| 1.5546 | 3305 | 0.0031 |
| 1.5550 | 3306 | 0.0141 |
| 1.5555 | 3307 | 0.0044 |
| 1.5560 | 3308 | 0.0164 |
| 1.5564 | 3309 | 0.009 |
| 1.5569 | 3310 | 0.012 |
| 1.5574 | 3311 | 0.0072 |
| 1.5579 | 3312 | 0.0103 |
| 1.5583 | 3313 | 0.008 |
| 1.5588 | 3314 | 0.0044 |
| 1.5593 | 3315 | 0.014 |
| 1.5597 | 3316 | 0.0075 |
| 1.5602 | 3317 | 0.0032 |
| 1.5607 | 3318 | 0.0063 |
| 1.5611 | 3319 | 0.0049 |
| 1.5616 | 3320 | 0.004 |
| 1.5621 | 3321 | 0.008 |
| 1.5626 | 3322 | 0.0069 |
| 1.5630 | 3323 | 0.0035 |
| 1.5635 | 3324 | 0.0077 |
| 1.5640 | 3325 | 0.0055 |
| 1.5644 | 3326 | 0.0016 |
| 1.5649 | 3327 | 0.0084 |
| 1.5654 | 3328 | 0.0054 |
| 1.5659 | 3329 | 0.0158 |
| 1.5663 | 3330 | 0.0059 |
| 1.5668 | 3331 | 0.0069 |
| 1.5673 | 3332 | 0.0059 |
| 1.5677 | 3333 | 0.0054 |
| 1.5682 | 3334 | 0.0077 |
| 1.5687 | 3335 | 0.0056 |
| 1.5691 | 3336 | 0.0027 |
| 1.5696 | 3337 | 0.0073 |
| 1.5701 | 3338 | 0.0097 |
| 1.5706 | 3339 | 0.0069 |
| 1.5710 | 3340 | 0.012 |
| 1.5715 | 3341 | 0.0163 |
| 1.5720 | 3342 | 0.0143 |
| 1.5724 | 3343 | 0.0068 |
| 1.5729 | 3344 | 0.0067 |
| 1.5734 | 3345 | 0.0036 |
| 1.5738 | 3346 | 0.0034 |
| 1.5743 | 3347 | 0.0101 |
| 1.5748 | 3348 | 0.0036 |
| 1.5753 | 3349 | 0.0047 |
| 1.5757 | 3350 | 0.0065 |
| 1.5762 | 3351 | 0.0056 |
| 1.5767 | 3352 | 0.0093 |
| 1.5771 | 3353 | 0.0036 |
| 1.5776 | 3354 | 0.009 |
| 1.5781 | 3355 | 0.012 |
| 1.5786 | 3356 | 0.0074 |
| 1.5790 | 3357 | 0.0089 |
| 1.5795 | 3358 | 0.0055 |
| 1.5800 | 3359 | 0.0089 |
| 1.5804 | 3360 | 0.0097 |
| 1.5809 | 3361 | 0.0057 |
| 1.5814 | 3362 | 0.0047 |
| 1.5818 | 3363 | 0.0069 |
| 1.5823 | 3364 | 0.0068 |
| 1.5828 | 3365 | 0.0079 |
| 1.5833 | 3366 | 0.005 |
| 1.5837 | 3367 | 0.0028 |
| 1.5842 | 3368 | 0.0062 |
| 1.5847 | 3369 | 0.0051 |
| 1.5851 | 3370 | 0.002 |
| 1.5856 | 3371 | 0.0097 |
| 1.5861 | 3372 | 0.0043 |
| 1.5865 | 3373 | 0.0181 |
| 1.5870 | 3374 | 0.0049 |
| 1.5875 | 3375 | 0.005 |
| 1.5880 | 3376 | 0.0063 |
| 1.5884 | 3377 | 0.0068 |
| 1.5889 | 3378 | 0.0042 |
| 1.5894 | 3379 | 0.0057 |
| 1.5898 | 3380 | 0.0025 |
| 1.5903 | 3381 | 0.0055 |
| 1.5908 | 3382 | 0.0067 |
| 1.5913 | 3383 | 0.0035 |
| 1.5917 | 3384 | 0.0133 |
| 1.5922 | 3385 | 0.0033 |
| 1.5927 | 3386 | 0.0037 |
| 1.5931 | 3387 | 0.0044 |
| 1.5936 | 3388 | 0.0058 |
| 1.5941 | 3389 | 0.0031 |
| 1.5945 | 3390 | 0.0028 |
| 1.5950 | 3391 | 0.0058 |
| 1.5955 | 3392 | 0.003 |
| 1.5960 | 3393 | 0.0046 |
| 1.5964 | 3394 | 0.0039 |
| 1.5969 | 3395 | 0.0264 |
| 1.5974 | 3396 | 0.0126 |
| 1.5978 | 3397 | 0.0082 |
| 1.5983 | 3398 | 0.0152 |
| 1.5988 | 3399 | 0.0059 |
| 1.5992 | 3400 | 0.0113 |
| 1.5997 | 3401 | 0.01 |
| 1.6002 | 3402 | 0.0038 |
| 1.6007 | 3403 | 0.0059 |
| 1.6011 | 3404 | 0.0111 |
| 1.6016 | 3405 | 0.0034 |
| 1.6021 | 3406 | 0.0042 |
| 1.6025 | 3407 | 0.0139 |
| 1.6030 | 3408 | 0.0028 |
| 1.6035 | 3409 | 0.0135 |
| 1.6040 | 3410 | 0.0067 |
| 1.6044 | 3411 | 0.0042 |
| 1.6049 | 3412 | 0.0124 |
| 1.6054 | 3413 | 0.0035 |
| 1.6058 | 3414 | 0.0046 |
| 1.6063 | 3415 | 0.0123 |
| 1.6068 | 3416 | 0.0042 |
| 1.6072 | 3417 | 0.0033 |
| 1.6077 | 3418 | 0.0066 |
| 1.6082 | 3419 | 0.0062 |
| 1.6087 | 3420 | 0.0107 |
| 1.6091 | 3421 | 0.0055 |
| 1.6096 | 3422 | 0.0076 |
| 1.6101 | 3423 | 0.0038 |
| 1.6105 | 3424 | 0.0136 |
| 1.6110 | 3425 | 0.0065 |
| 1.6115 | 3426 | 0.0091 |
| 1.6119 | 3427 | 0.0054 |
| 1.6124 | 3428 | 0.0064 |
| 1.6129 | 3429 | 0.0041 |
| 1.6134 | 3430 | 0.0122 |
| 1.6138 | 3431 | 0.007 |
| 1.6143 | 3432 | 0.0083 |
| 1.6148 | 3433 | 0.0118 |
| 1.6152 | 3434 | 0.0028 |
| 1.6157 | 3435 | 0.0048 |
| 1.6162 | 3436 | 0.0035 |
| 1.6167 | 3437 | 0.0101 |
| 1.6171 | 3438 | 0.0093 |
| 1.6176 | 3439 | 0.0047 |
| 1.6181 | 3440 | 0.0039 |
| 1.6185 | 3441 | 0.0028 |
| 1.6190 | 3442 | 0.0063 |
| 1.6195 | 3443 | 0.0033 |
| 1.6199 | 3444 | 0.008 |
| 1.6204 | 3445 | 0.0155 |
| 1.6209 | 3446 | 0.003 |
| 1.6214 | 3447 | 0.0054 |
| 1.6218 | 3448 | 0.0089 |
| 1.6223 | 3449 | 0.0058 |
| 1.6228 | 3450 | 0.0084 |
| 1.6232 | 3451 | 0.006 |
| 1.6237 | 3452 | 0.006 |
| 1.6242 | 3453 | 0.0061 |
| 1.6246 | 3454 | 0.0027 |
| 1.6251 | 3455 | 0.0071 |
| 1.6256 | 3456 | 0.0029 |
| 1.6261 | 3457 | 0.0067 |
| 1.6265 | 3458 | 0.0083 |
| 1.6270 | 3459 | 0.0119 |
| 1.6275 | 3460 | 0.0082 |
| 1.6279 | 3461 | 0.0035 |
| 1.6284 | 3462 | 0.0051 |
| 1.6289 | 3463 | 0.0087 |
| 1.6294 | 3464 | 0.0035 |
| 1.6298 | 3465 | 0.0048 |
| 1.6303 | 3466 | 0.0078 |
| 1.6308 | 3467 | 0.0073 |
| 1.6312 | 3468 | 0.0032 |
| 1.6317 | 3469 | 0.0059 |
| 1.6322 | 3470 | 0.0191 |
| 1.6326 | 3471 | 0.007 |
| 1.6331 | 3472 | 0.0048 |
| 1.6336 | 3473 | 0.0067 |
| 1.6341 | 3474 | 0.0069 |
| 1.6345 | 3475 | 0.0082 |
| 1.6350 | 3476 | 0.0044 |
| 1.6355 | 3477 | 0.0074 |
| 1.6359 | 3478 | 0.0052 |
| 1.6364 | 3479 | 0.0031 |
| 1.6369 | 3480 | 0.009 |
| 1.6373 | 3481 | 0.0073 |
| 1.6378 | 3482 | 0.0051 |
| 1.6383 | 3483 | 0.0038 |
| 1.6388 | 3484 | 0.0087 |
| 1.6392 | 3485 | 0.0099 |
| 1.6397 | 3486 | 0.003 |
| 1.6402 | 3487 | 0.0155 |
| 1.6406 | 3488 | 0.0038 |
| 1.6411 | 3489 | 0.0035 |
| 1.6416 | 3490 | 0.0044 |
| 1.6421 | 3491 | 0.0021 |
| 1.6425 | 3492 | 0.0063 |
| 1.6430 | 3493 | 0.0029 |
| 1.6435 | 3494 | 0.005 |
| 1.6439 | 3495 | 0.0081 |
| 1.6444 | 3496 | 0.009 |
| 1.6449 | 3497 | 0.005 |
| 1.6453 | 3498 | 0.0029 |
| 1.6458 | 3499 | 0.003 |
| 1.6463 | 3500 | 0.0034 |
| 1.6468 | 3501 | 0.0042 |
| 1.6472 | 3502 | 0.0177 |
| 1.6477 | 3503 | 0.0095 |
| 1.6482 | 3504 | 0.0069 |
| 1.6486 | 3505 | 0.006 |
| 1.6491 | 3506 | 0.0029 |
| 1.6496 | 3507 | 0.0076 |
| 1.6500 | 3508 | 0.0092 |
| 1.6505 | 3509 | 0.0082 |
| 1.6510 | 3510 | 0.007 |
| 1.6515 | 3511 | 0.0247 |
| 1.6519 | 3512 | 0.0102 |
| 1.6524 | 3513 | 0.0044 |
| 1.6529 | 3514 | 0.0114 |
| 1.6533 | 3515 | 0.0178 |
| 1.6538 | 3516 | 0.0052 |
| 1.6543 | 3517 | 0.0145 |
| 1.6548 | 3518 | 0.0018 |
| 1.6552 | 3519 | 0.004 |
| 1.6557 | 3520 | 0.0083 |
| 1.6562 | 3521 | 0.0057 |
| 1.6566 | 3522 | 0.0131 |
| 1.6571 | 3523 | 0.0045 |
| 1.6576 | 3524 | 0.0069 |
| 1.6580 | 3525 | 0.0056 |
| 1.6585 | 3526 | 0.0096 |
| 1.6590 | 3527 | 0.0064 |
| 1.6595 | 3528 | 0.0069 |
| 1.6599 | 3529 | 0.0085 |
| 1.6604 | 3530 | 0.007 |
| 1.6609 | 3531 | 0.0069 |
| 1.6613 | 3532 | 0.0155 |
| 1.6618 | 3533 | 0.0027 |
| 1.6623 | 3534 | 0.0031 |
| 1.6627 | 3535 | 0.004 |
| 1.6632 | 3536 | 0.0049 |
| 1.6637 | 3537 | 0.0073 |
| 1.6642 | 3538 | 0.0059 |
| 1.6646 | 3539 | 0.0097 |
| 1.6651 | 3540 | 0.0175 |
| 1.6656 | 3541 | 0.0054 |
| 1.6660 | 3542 | 0.0125 |
| 1.6665 | 3543 | 0.0022 |
| 1.6670 | 3544 | 0.0031 |
| 1.6675 | 3545 | 0.0059 |
| 1.6679 | 3546 | 0.0021 |
| 1.6684 | 3547 | 0.0182 |
| 1.6689 | 3548 | 0.0034 |
| 1.6693 | 3549 | 0.0029 |
| 1.6698 | 3550 | 0.0035 |
| 1.6703 | 3551 | 0.0047 |
| 1.6707 | 3552 | 0.0062 |
| 1.6712 | 3553 | 0.0106 |
| 1.6717 | 3554 | 0.0029 |
| 1.6722 | 3555 | 0.0056 |
| 1.6726 | 3556 | 0.0057 |
| 1.6731 | 3557 | 0.0046 |
| 1.6736 | 3558 | 0.0024 |
| 1.6740 | 3559 | 0.0063 |
| 1.6745 | 3560 | 0.0057 |
| 1.6750 | 3561 | 0.0027 |
| 1.6754 | 3562 | 0.0174 |
| 1.6759 | 3563 | 0.005 |
| 1.6764 | 3564 | 0.004 |
| 1.6769 | 3565 | 0.0034 |
| 1.6773 | 3566 | 0.0049 |
| 1.6778 | 3567 | 0.0026 |
| 1.6783 | 3568 | 0.0065 |
| 1.6787 | 3569 | 0.0049 |
| 1.6792 | 3570 | 0.0061 |
| 1.6797 | 3571 | 0.0091 |
| 1.6802 | 3572 | 0.0032 |
| 1.6806 | 3573 | 0.0108 |
| 1.6811 | 3574 | 0.0049 |
| 1.6816 | 3575 | 0.0065 |
| 1.6820 | 3576 | 0.0032 |
| 1.6825 | 3577 | 0.0043 |
| 1.6830 | 3578 | 0.007 |
| 1.6834 | 3579 | 0.0079 |
| 1.6839 | 3580 | 0.0069 |
| 1.6844 | 3581 | 0.0084 |
| 1.6849 | 3582 | 0.0048 |
| 1.6853 | 3583 | 0.0079 |
| 1.6858 | 3584 | 0.0047 |
| 1.6863 | 3585 | 0.007 |
| 1.6867 | 3586 | 0.0076 |
| 1.6872 | 3587 | 0.0059 |
| 1.6877 | 3588 | 0.0121 |
| 1.6881 | 3589 | 0.0046 |
| 1.6886 | 3590 | 0.0095 |
| 1.6891 | 3591 | 0.0087 |
| 1.6896 | 3592 | 0.0059 |
| 1.6900 | 3593 | 0.0044 |
| 1.6905 | 3594 | 0.0055 |
| 1.6910 | 3595 | 0.0029 |
| 1.6914 | 3596 | 0.0093 |
| 1.6919 | 3597 | 0.0091 |
| 1.6924 | 3598 | 0.0134 |
| 1.6929 | 3599 | 0.005 |
| 1.6933 | 3600 | 0.0073 |
| 1.6938 | 3601 | 0.004 |
| 1.6943 | 3602 | 0.0033 |
| 1.6947 | 3603 | 0.0103 |
| 1.6952 | 3604 | 0.0145 |
| 1.6957 | 3605 | 0.0121 |
| 1.6961 | 3606 | 0.01 |
| 1.6966 | 3607 | 0.0047 |
| 1.6971 | 3608 | 0.0038 |
| 1.6976 | 3609 | 0.0072 |
| 1.6980 | 3610 | 0.011 |
| 1.6985 | 3611 | 0.009 |
| 1.6990 | 3612 | 0.0028 |
| 1.6994 | 3613 | 0.0035 |
| 1.6999 | 3614 | 0.0045 |
| 1.7004 | 3615 | 0.0274 |
| 1.7008 | 3616 | 0.0046 |
| 1.7013 | 3617 | 0.0067 |
| 1.7018 | 3618 | 0.0053 |
| 1.7023 | 3619 | 0.0066 |
| 1.7027 | 3620 | 0.0035 |
| 1.7032 | 3621 | 0.0021 |
| 1.7037 | 3622 | 0.0159 |
| 1.7041 | 3623 | 0.0096 |
| 1.7046 | 3624 | 0.0077 |
| 1.7051 | 3625 | 0.0051 |
| 1.7056 | 3626 | 0.0047 |
| 1.7060 | 3627 | 0.0138 |
| 1.7065 | 3628 | 0.0116 |
| 1.7070 | 3629 | 0.0039 |
| 1.7074 | 3630 | 0.0045 |
| 1.7079 | 3631 | 0.0055 |
| 1.7084 | 3632 | 0.006 |
| 1.7088 | 3633 | 0.0028 |
| 1.7093 | 3634 | 0.0166 |
| 1.7098 | 3635 | 0.0088 |
| 1.7103 | 3636 | 0.0057 |
| 1.7107 | 3637 | 0.0033 |
| 1.7112 | 3638 | 0.0102 |
| 1.7117 | 3639 | 0.0058 |
| 1.7121 | 3640 | 0.0076 |
| 1.7126 | 3641 | 0.0075 |
| 1.7131 | 3642 | 0.0041 |
| 1.7135 | 3643 | 0.0074 |
| 1.7140 | 3644 | 0.0025 |
| 1.7145 | 3645 | 0.0042 |
| 1.7150 | 3646 | 0.0036 |
| 1.7154 | 3647 | 0.0049 |
| 1.7159 | 3648 | 0.0085 |
| 1.7164 | 3649 | 0.0105 |
| 1.7168 | 3650 | 0.0051 |
| 1.7173 | 3651 | 0.016 |
| 1.7178 | 3652 | 0.012 |
| 1.7183 | 3653 | 0.0045 |
| 1.7187 | 3654 | 0.0085 |
| 1.7192 | 3655 | 0.0034 |
| 1.7197 | 3656 | 0.0037 |
| 1.7201 | 3657 | 0.0084 |
| 1.7206 | 3658 | 0.0052 |
| 1.7211 | 3659 | 0.003 |
| 1.7215 | 3660 | 0.0057 |
| 1.7220 | 3661 | 0.0055 |
| 1.7225 | 3662 | 0.0164 |
| 1.7230 | 3663 | 0.004 |
| 1.7234 | 3664 | 0.0053 |
| 1.7239 | 3665 | 0.0034 |
| 1.7244 | 3666 | 0.0067 |
| 1.7248 | 3667 | 0.0038 |
| 1.7253 | 3668 | 0.0047 |
| 1.7258 | 3669 | 0.0028 |
| 1.7262 | 3670 | 0.0061 |
| 1.7267 | 3671 | 0.0053 |
| 1.7272 | 3672 | 0.0093 |
| 1.7277 | 3673 | 0.0041 |
| 1.7281 | 3674 | 0.0047 |
| 1.7286 | 3675 | 0.0161 |
| 1.7291 | 3676 | 0.0037 |
| 1.7295 | 3677 | 0.0104 |
| 1.7300 | 3678 | 0.0042 |
| 1.7305 | 3679 | 0.0052 |
| 1.7310 | 3680 | 0.0056 |
| 1.7314 | 3681 | 0.0032 |
| 1.7319 | 3682 | 0.0056 |
| 1.7324 | 3683 | 0.0045 |
| 1.7328 | 3684 | 0.0106 |
| 1.7333 | 3685 | 0.0099 |
| 1.7338 | 3686 | 0.0054 |
| 1.7342 | 3687 | 0.0064 |
| 1.7347 | 3688 | 0.0025 |
| 1.7352 | 3689 | 0.0061 |
| 1.7357 | 3690 | 0.0052 |
| 1.7361 | 3691 | 0.0066 |
| 1.7366 | 3692 | 0.0107 |
| 1.7371 | 3693 | 0.004 |
| 1.7375 | 3694 | 0.005 |
| 1.7380 | 3695 | 0.0039 |
| 1.7385 | 3696 | 0.0025 |
| 1.7389 | 3697 | 0.0022 |
| 1.7394 | 3698 | 0.0339 |
| 1.7399 | 3699 | 0.0095 |
| 1.7404 | 3700 | 0.0069 |
| 1.7408 | 3701 | 0.0108 |
| 1.7413 | 3702 | 0.0035 |
| 1.7418 | 3703 | 0.008 |
| 1.7422 | 3704 | 0.0074 |
| 1.7427 | 3705 | 0.0063 |
| 1.7432 | 3706 | 0.0085 |
| 1.7437 | 3707 | 0.0052 |
| 1.7441 | 3708 | 0.0109 |
| 1.7446 | 3709 | 0.0081 |
| 1.7451 | 3710 | 0.0046 |
| 1.7455 | 3711 | 0.0045 |
| 1.7460 | 3712 | 0.0019 |
| 1.7465 | 3713 | 0.0076 |
| 1.7469 | 3714 | 0.0029 |
| 1.7474 | 3715 | 0.0171 |
| 1.7479 | 3716 | 0.0066 |
| 1.7484 | 3717 | 0.0137 |
| 1.7488 | 3718 | 0.0036 |
| 1.7493 | 3719 | 0.0105 |
| 1.7498 | 3720 | 0.0146 |
| 1.7502 | 3721 | 0.0092 |
| 1.7507 | 3722 | 0.0026 |
| 1.7512 | 3723 | 0.0058 |
| 1.7516 | 3724 | 0.0027 |
| 1.7521 | 3725 | 0.0027 |
| 1.7526 | 3726 | 0.0075 |
| 1.7531 | 3727 | 0.0029 |
| 1.7535 | 3728 | 0.0059 |
| 1.7540 | 3729 | 0.0041 |
| 1.7545 | 3730 | 0.0044 |
| 1.7549 | 3731 | 0.0045 |
| 1.7554 | 3732 | 0.0049 |
| 1.7559 | 3733 | 0.0117 |
| 1.7563 | 3734 | 0.0037 |
| 1.7568 | 3735 | 0.0034 |
| 1.7573 | 3736 | 0.0061 |
| 1.7578 | 3737 | 0.0157 |
| 1.7582 | 3738 | 0.0042 |
| 1.7587 | 3739 | 0.0041 |
| 1.7592 | 3740 | 0.0095 |
| 1.7596 | 3741 | 0.0024 |
| 1.7601 | 3742 | 0.0173 |
| 1.7606 | 3743 | 0.0039 |
| 1.7611 | 3744 | 0.017 |
| 1.7615 | 3745 | 0.0052 |
| 1.7620 | 3746 | 0.0062 |
| 1.7625 | 3747 | 0.0041 |
| 1.7629 | 3748 | 0.0087 |
| 1.7634 | 3749 | 0.0115 |
| 1.7639 | 3750 | 0.0055 |
| 1.7643 | 3751 | 0.0063 |
| 1.7648 | 3752 | 0.005 |
| 1.7653 | 3753 | 0.0072 |
| 1.7658 | 3754 | 0.0046 |
| 1.7662 | 3755 | 0.0047 |
| 1.7667 | 3756 | 0.0054 |
| 1.7672 | 3757 | 0.0065 |
| 1.7676 | 3758 | 0.0044 |
| 1.7681 | 3759 | 0.0043 |
| 1.7686 | 3760 | 0.0113 |
| 1.7690 | 3761 | 0.006 |
| 1.7695 | 3762 | 0.0041 |
| 1.7700 | 3763 | 0.0087 |
| 1.7705 | 3764 | 0.0105 |
| 1.7709 | 3765 | 0.0048 |
| 1.7714 | 3766 | 0.0224 |
| 1.7719 | 3767 | 0.008 |
| 1.7723 | 3768 | 0.0045 |
| 1.7728 | 3769 | 0.0049 |
| 1.7733 | 3770 | 0.0066 |
| 1.7738 | 3771 | 0.0095 |
| 1.7742 | 3772 | 0.0088 |
| 1.7747 | 3773 | 0.0024 |
| 1.7752 | 3774 | 0.0031 |
| 1.7756 | 3775 | 0.0056 |
| 1.7761 | 3776 | 0.0022 |
| 1.7766 | 3777 | 0.0061 |
| 1.7770 | 3778 | 0.0057 |
| 1.7775 | 3779 | 0.0045 |
| 1.7780 | 3780 | 0.0065 |
| 1.7785 | 3781 | 0.0091 |
| 1.7789 | 3782 | 0.0063 |
| 1.7794 | 3783 | 0.0068 |
| 1.7799 | 3784 | 0.0034 |
| 1.7803 | 3785 | 0.0114 |
| 1.7808 | 3786 | 0.0322 |
| 1.7813 | 3787 | 0.0041 |
| 1.7817 | 3788 | 0.0026 |
| 1.7822 | 3789 | 0.0134 |
| 1.7827 | 3790 | 0.0049 |
| 1.7832 | 3791 | 0.018 |
| 1.7836 | 3792 | 0.0077 |
| 1.7841 | 3793 | 0.0046 |
| 1.7846 | 3794 | 0.0068 |
| 1.7850 | 3795 | 0.0064 |
| 1.7855 | 3796 | 0.0077 |
| 1.7860 | 3797 | 0.0084 |
| 1.7865 | 3798 | 0.0111 |
| 1.7869 | 3799 | 0.0044 |
| 1.7874 | 3800 | 0.004 |
| 1.7879 | 3801 | 0.0043 |
| 1.7883 | 3802 | 0.005 |
| 1.7888 | 3803 | 0.0045 |
| 1.7893 | 3804 | 0.0076 |
| 1.7897 | 3805 | 0.007 |
| 1.7902 | 3806 | 0.0198 |
| 1.7907 | 3807 | 0.0039 |
| 1.7912 | 3808 | 0.0069 |
| 1.7916 | 3809 | 0.0053 |
| 1.7921 | 3810 | 0.0041 |
| 1.7926 | 3811 | 0.0122 |
| 1.7930 | 3812 | 0.0051 |
| 1.7935 | 3813 | 0.0037 |
| 1.7940 | 3814 | 0.0034 |
| 1.7944 | 3815 | 0.0245 |
| 1.7949 | 3816 | 0.0034 |
| 1.7954 | 3817 | 0.004 |
| 1.7959 | 3818 | 0.0072 |
| 1.7963 | 3819 | 0.0057 |
| 1.7968 | 3820 | 0.0035 |
| 1.7973 | 3821 | 0.0034 |
| 1.7977 | 3822 | 0.0073 |
| 1.7982 | 3823 | 0.0038 |
| 1.7987 | 3824 | 0.0049 |
| 1.7992 | 3825 | 0.0039 |
| 1.7996 | 3826 | 0.0061 |
| 1.8001 | 3827 | 0.0036 |
| 1.8006 | 3828 | 0.0125 |
| 1.8010 | 3829 | 0.0077 |
| 1.8015 | 3830 | 0.0048 |
| 1.8020 | 3831 | 0.0072 |
| 1.8024 | 3832 | 0.0119 |
| 1.8029 | 3833 | 0.0105 |
| 1.8034 | 3834 | 0.0102 |
| 1.8039 | 3835 | 0.0023 |
| 1.8043 | 3836 | 0.0032 |
| 1.8048 | 3837 | 0.0041 |
| 1.8053 | 3838 | 0.0042 |
| 1.8057 | 3839 | 0.0075 |
| 1.8062 | 3840 | 0.0054 |
| 1.8067 | 3841 | 0.0138 |
| 1.8071 | 3842 | 0.0106 |
| 1.8076 | 3843 | 0.0048 |
| 1.8081 | 3844 | 0.0059 |
| 1.8086 | 3845 | 0.0075 |
| 1.8090 | 3846 | 0.0047 |
| 1.8095 | 3847 | 0.0045 |
| 1.8100 | 3848 | 0.0066 |
| 1.8104 | 3849 | 0.0094 |
| 1.8109 | 3850 | 0.0034 |
| 1.8114 | 3851 | 0.0045 |
| 1.8119 | 3852 | 0.007 |
| 1.8123 | 3853 | 0.0082 |
| 1.8128 | 3854 | 0.0109 |
| 1.8133 | 3855 | 0.0053 |
| 1.8137 | 3856 | 0.003 |
| 1.8142 | 3857 | 0.0055 |
| 1.8147 | 3858 | 0.0031 |
| 1.8151 | 3859 | 0.0046 |
| 1.8156 | 3860 | 0.0119 |
| 1.8161 | 3861 | 0.01 |
| 1.8166 | 3862 | 0.0056 |
| 1.8170 | 3863 | 0.0044 |
| 1.8175 | 3864 | 0.0086 |
| 1.8180 | 3865 | 0.0066 |
| 1.8184 | 3866 | 0.0039 |
| 1.8189 | 3867 | 0.0057 |
| 1.8194 | 3868 | 0.0032 |
| 1.8198 | 3869 | 0.0147 |
| 1.8203 | 3870 | 0.0092 |
| 1.8208 | 3871 | 0.0054 |
| 1.8213 | 3872 | 0.0104 |
| 1.8217 | 3873 | 0.025 |
| 1.8222 | 3874 | 0.0067 |
| 1.8227 | 3875 | 0.0063 |
| 1.8231 | 3876 | 0.0031 |
| 1.8236 | 3877 | 0.0053 |
| 1.8241 | 3878 | 0.0106 |
| 1.8246 | 3879 | 0.0055 |
| 1.8250 | 3880 | 0.0019 |
| 1.8255 | 3881 | 0.0062 |
| 1.8260 | 3882 | 0.006 |
| 1.8264 | 3883 | 0.0038 |
| 1.8269 | 3884 | 0.0035 |
| 1.8274 | 3885 | 0.0078 |
| 1.8278 | 3886 | 0.0039 |
| 1.8283 | 3887 | 0.0068 |
| 1.8288 | 3888 | 0.0044 |
| 1.8293 | 3889 | 0.0096 |
| 1.8297 | 3890 | 0.0052 |
| 1.8302 | 3891 | 0.0062 |
| 1.8307 | 3892 | 0.0055 |
| 1.8311 | 3893 | 0.0059 |
| 1.8316 | 3894 | 0.0089 |
| 1.8321 | 3895 | 0.0034 |
| 1.8325 | 3896 | 0.0062 |
| 1.8330 | 3897 | 0.009 |
| 1.8335 | 3898 | 0.0071 |
| 1.8340 | 3899 | 0.0099 |
| 1.8344 | 3900 | 0.0038 |
| 1.8349 | 3901 | 0.0032 |
| 1.8354 | 3902 | 0.0141 |
| 1.8358 | 3903 | 0.0062 |
| 1.8363 | 3904 | 0.0064 |
| 1.8368 | 3905 | 0.0052 |
| 1.8373 | 3906 | 0.0054 |
| 1.8377 | 3907 | 0.0031 |
| 1.8382 | 3908 | 0.004 |
| 1.8387 | 3909 | 0.0024 |
| 1.8391 | 3910 | 0.0058 |
| 1.8396 | 3911 | 0.0069 |
| 1.8401 | 3912 | 0.0021 |
| 1.8405 | 3913 | 0.0044 |
| 1.8410 | 3914 | 0.0073 |
| 1.8415 | 3915 | 0.0039 |
| 1.8420 | 3916 | 0.0071 |
| 1.8424 | 3917 | 0.0027 |
| 1.8429 | 3918 | 0.0046 |
| 1.8434 | 3919 | 0.0097 |
| 1.8438 | 3920 | 0.0077 |
| 1.8443 | 3921 | 0.005 |
| 1.8448 | 3922 | 0.0046 |
| 1.8452 | 3923 | 0.0114 |
| 1.8457 | 3924 | 0.0039 |
| 1.8462 | 3925 | 0.003 |
| 1.8467 | 3926 | 0.0038 |
| 1.8471 | 3927 | 0.0039 |
| 1.8476 | 3928 | 0.0086 |
| 1.8481 | 3929 | 0.0046 |
| 1.8485 | 3930 | 0.0056 |
| 1.8490 | 3931 | 0.0069 |
| 1.8495 | 3932 | 0.0057 |
| 1.8500 | 3933 | 0.0045 |
| 1.8504 | 3934 | 0.0036 |
| 1.8509 | 3935 | 0.0056 |
| 1.8514 | 3936 | 0.004 |
| 1.8518 | 3937 | 0.0025 |
| 1.8523 | 3938 | 0.0049 |
| 1.8528 | 3939 | 0.0028 |
| 1.8532 | 3940 | 0.0067 |
| 1.8537 | 3941 | 0.004 |
| 1.8542 | 3942 | 0.0051 |
| 1.8547 | 3943 | 0.0058 |
| 1.8551 | 3944 | 0.0043 |
| 1.8556 | 3945 | 0.0144 |
| 1.8561 | 3946 | 0.002 |
| 1.8565 | 3947 | 0.0123 |
| 1.8570 | 3948 | 0.0039 |
| 1.8575 | 3949 | 0.004 |
| 1.8579 | 3950 | 0.003 |
| 1.8584 | 3951 | 0.0046 |
| 1.8589 | 3952 | 0.0084 |
| 1.8594 | 3953 | 0.0099 |
| 1.8598 | 3954 | 0.008 |
| 1.8603 | 3955 | 0.0055 |
| 1.8608 | 3956 | 0.0128 |
| 1.8612 | 3957 | 0.0108 |
| 1.8617 | 3958 | 0.006 |
| 1.8622 | 3959 | 0.0046 |
| 1.8627 | 3960 | 0.0034 |
| 1.8631 | 3961 | 0.0045 |
| 1.8636 | 3962 | 0.0132 |
| 1.8641 | 3963 | 0.0082 |
| 1.8645 | 3964 | 0.0043 |
| 1.8650 | 3965 | 0.0059 |
| 1.8655 | 3966 | 0.004 |
| 1.8659 | 3967 | 0.0066 |
| 1.8664 | 3968 | 0.0088 |
| 1.8669 | 3969 | 0.0051 |
| 1.8674 | 3970 | 0.009 |
| 1.8678 | 3971 | 0.0031 |
| 1.8683 | 3972 | 0.0074 |
| 1.8688 | 3973 | 0.0165 |
| 1.8692 | 3974 | 0.0037 |
| 1.8697 | 3975 | 0.0065 |
| 1.8702 | 3976 | 0.0057 |
| 1.8706 | 3977 | 0.0028 |
| 1.8711 | 3978 | 0.0043 |
| 1.8716 | 3979 | 0.0052 |
| 1.8721 | 3980 | 0.0093 |
| 1.8725 | 3981 | 0.012 |
| 1.8730 | 3982 | 0.0043 |
| 1.8735 | 3983 | 0.0066 |
| 1.8739 | 3984 | 0.0086 |
| 1.8744 | 3985 | 0.0213 |
| 1.8749 | 3986 | 0.0072 |
| 1.8754 | 3987 | 0.0016 |
| 1.8758 | 3988 | 0.0024 |
| 1.8763 | 3989 | 0.0039 |
| 1.8768 | 3990 | 0.0054 |
| 1.8772 | 3991 | 0.0034 |
| 1.8777 | 3992 | 0.0061 |
| 1.8782 | 3993 | 0.0046 |
| 1.8786 | 3994 | 0.0053 |
| 1.8791 | 3995 | 0.0042 |
| 1.8796 | 3996 | 0.006 |
| 1.8801 | 3997 | 0.0064 |
| 1.8805 | 3998 | 0.0163 |
| 1.8810 | 3999 | 0.0065 |
| 1.8815 | 4000 | 0.0106 |
| 1.8819 | 4001 | 0.0149 |
| 1.8824 | 4002 | 0.006 |
| 1.8829 | 4003 | 0.0051 |
| 1.8833 | 4004 | 0.0026 |
| 1.8838 | 4005 | 0.0104 |
| 1.8843 | 4006 | 0.0055 |
| 1.8848 | 4007 | 0.0119 |
| 1.8852 | 4008 | 0.0042 |
| 1.8857 | 4009 | 0.0061 |
| 1.8862 | 4010 | 0.0089 |
| 1.8866 | 4011 | 0.0025 |
| 1.8871 | 4012 | 0.0049 |
| 1.8876 | 4013 | 0.008 |
| 1.8881 | 4014 | 0.0033 |
| 1.8885 | 4015 | 0.0167 |
| 1.8890 | 4016 | 0.004 |
| 1.8895 | 4017 | 0.0092 |
| 1.8899 | 4018 | 0.003 |
| 1.8904 | 4019 | 0.005 |
| 1.8909 | 4020 | 0.0033 |
| 1.8913 | 4021 | 0.0047 |
| 1.8918 | 4022 | 0.0057 |
| 1.8923 | 4023 | 0.008 |
| 1.8928 | 4024 | 0.005 |
| 1.8932 | 4025 | 0.01 |
| 1.8937 | 4026 | 0.0071 |
| 1.8942 | 4027 | 0.0049 |
| 1.8946 | 4028 | 0.0097 |
| 1.8951 | 4029 | 0.0041 |
| 1.8956 | 4030 | 0.015 |
| 1.8960 | 4031 | 0.0109 |
| 1.8965 | 4032 | 0.0128 |
| 1.8970 | 4033 | 0.0047 |
| 1.8975 | 4034 | 0.0061 |
| 1.8979 | 4035 | 0.0046 |
| 1.8984 | 4036 | 0.0073 |
| 1.8989 | 4037 | 0.0086 |
| 1.8993 | 4038 | 0.005 |
| 1.8998 | 4039 | 0.0076 |
| 1.9003 | 4040 | 0.0128 |
| 1.9008 | 4041 | 0.0059 |
| 1.9012 | 4042 | 0.0044 |
| 1.9017 | 4043 | 0.0048 |
| 1.9022 | 4044 | 0.0061 |
| 1.9026 | 4045 | 0.0031 |
| 1.9031 | 4046 | 0.0045 |
| 1.9036 | 4047 | 0.0054 |
| 1.9040 | 4048 | 0.0115 |
| 1.9045 | 4049 | 0.0046 |
| 1.9050 | 4050 | 0.0193 |
| 1.9055 | 4051 | 0.008 |
| 1.9059 | 4052 | 0.0045 |
| 1.9064 | 4053 | 0.0054 |
| 1.9069 | 4054 | 0.0056 |
| 1.9073 | 4055 | 0.0068 |
| 1.9078 | 4056 | 0.0195 |
| 1.9083 | 4057 | 0.0067 |
| 1.9087 | 4058 | 0.0056 |
| 1.9092 | 4059 | 0.0044 |
| 1.9097 | 4060 | 0.0086 |
| 1.9102 | 4061 | 0.0028 |
| 1.9106 | 4062 | 0.0064 |
| 1.9111 | 4063 | 0.0039 |
| 1.9116 | 4064 | 0.0057 |
| 1.9120 | 4065 | 0.0043 |
| 1.9125 | 4066 | 0.0076 |
| 1.9130 | 4067 | 0.0121 |
| 1.9135 | 4068 | 0.0022 |
| 1.9139 | 4069 | 0.0064 |
| 1.9144 | 4070 | 0.0063 |
| 1.9149 | 4071 | 0.0036 |
| 1.9153 | 4072 | 0.0082 |
| 1.9158 | 4073 | 0.0045 |
| 1.9163 | 4074 | 0.0052 |
| 1.9167 | 4075 | 0.0027 |
| 1.9172 | 4076 | 0.0037 |
| 1.9177 | 4077 | 0.0023 |
| 1.9182 | 4078 | 0.0085 |
| 1.9186 | 4079 | 0.0066 |
| 1.9191 | 4080 | 0.0026 |
| 1.9196 | 4081 | 0.0024 |
| 1.9200 | 4082 | 0.0056 |
| 1.9205 | 4083 | 0.0067 |
| 1.9210 | 4084 | 0.004 |
| 1.9214 | 4085 | 0.0048 |
| 1.9219 | 4086 | 0.0048 |
| 1.9224 | 4087 | 0.0043 |
| 1.9229 | 4088 | 0.0048 |
| 1.9233 | 4089 | 0.0053 |
| 1.9238 | 4090 | 0.0082 |
| 1.9243 | 4091 | 0.0048 |
| 1.9247 | 4092 | 0.0088 |
| 1.9252 | 4093 | 0.0055 |
| 1.9257 | 4094 | 0.0056 |
| 1.9262 | 4095 | 0.0039 |
| 1.9266 | 4096 | 0.0056 |
| 1.9271 | 4097 | 0.0184 |
| 1.9276 | 4098 | 0.004 |
| 1.9280 | 4099 | 0.0093 |
| 1.9285 | 4100 | 0.0029 |
| 1.9290 | 4101 | 0.0041 |
| 1.9294 | 4102 | 0.0074 |
| 1.9299 | 4103 | 0.0097 |
| 1.9304 | 4104 | 0.004 |
| 1.9309 | 4105 | 0.0043 |
| 1.9313 | 4106 | 0.0075 |
| 1.9318 | 4107 | 0.0025 |
| 1.9323 | 4108 | 0.0136 |
| 1.9327 | 4109 | 0.0075 |
| 1.9332 | 4110 | 0.0071 |
| 1.9337 | 4111 | 0.0235 |
| 1.9341 | 4112 | 0.0036 |
| 1.9346 | 4113 | 0.0035 |
| 1.9351 | 4114 | 0.0051 |
| 1.9356 | 4115 | 0.0045 |
| 1.9360 | 4116 | 0.0035 |
| 1.9365 | 4117 | 0.0043 |
| 1.9370 | 4118 | 0.0089 |
| 1.9374 | 4119 | 0.0045 |
| 1.9379 | 4120 | 0.0139 |
| 1.9384 | 4121 | 0.0032 |
| 1.9389 | 4122 | 0.0036 |
| 1.9393 | 4123 | 0.0056 |
| 1.9398 | 4124 | 0.0096 |
| 1.9403 | 4125 | 0.0524 |
| 1.9407 | 4126 | 0.0159 |
| 1.9412 | 4127 | 0.0032 |
| 1.9417 | 4128 | 0.0039 |
| 1.9421 | 4129 | 0.0074 |
| 1.9426 | 4130 | 0.0027 |
| 1.9431 | 4131 | 0.0047 |
| 1.9436 | 4132 | 0.0027 |
| 1.9440 | 4133 | 0.0047 |
| 1.9445 | 4134 | 0.0056 |
| 1.9450 | 4135 | 0.003 |
| 1.9454 | 4136 | 0.0047 |
| 1.9459 | 4137 | 0.0038 |
| 1.9464 | 4138 | 0.0035 |
| 1.9468 | 4139 | 0.014 |
| 1.9473 | 4140 | 0.0035 |
| 1.9478 | 4141 | 0.0039 |
| 1.9483 | 4142 | 0.006 |
| 1.9487 | 4143 | 0.005 |
| 1.9492 | 4144 | 0.0021 |
| 1.9497 | 4145 | 0.0055 |
| 1.9501 | 4146 | 0.0029 |
| 1.9506 | 4147 | 0.0022 |
| 1.9511 | 4148 | 0.0115 |
| 1.9516 | 4149 | 0.0053 |
| 1.9520 | 4150 | 0.0131 |
| 1.9525 | 4151 | 0.0039 |
| 1.9530 | 4152 | 0.003 |
| 1.9534 | 4153 | 0.0143 |
| 1.9539 | 4154 | 0.0034 |
| 1.9544 | 4155 | 0.0019 |
| 1.9548 | 4156 | 0.0082 |
| 1.9553 | 4157 | 0.0088 |
| 1.9558 | 4158 | 0.0041 |
| 1.9563 | 4159 | 0.0042 |
| 1.9567 | 4160 | 0.0049 |
| 1.9572 | 4161 | 0.018 |
| 1.9577 | 4162 | 0.0035 |
| 1.9581 | 4163 | 0.0118 |
| 1.9586 | 4164 | 0.0042 |
| 1.9591 | 4165 | 0.0071 |
| 1.9595 | 4166 | 0.0065 |
| 1.9600 | 4167 | 0.0039 |
| 1.9605 | 4168 | 0.007 |
| 1.9610 | 4169 | 0.0029 |
| 1.9614 | 4170 | 0.0069 |
| 1.9619 | 4171 | 0.0091 |
| 1.9624 | 4172 | 0.0022 |
| 1.9628 | 4173 | 0.0063 |
| 1.9633 | 4174 | 0.0055 |
| 1.9638 | 4175 | 0.0081 |
| 1.9643 | 4176 | 0.0036 |
| 1.9647 | 4177 | 0.0022 |
| 1.9652 | 4178 | 0.0039 |
| 1.9657 | 4179 | 0.0026 |
| 1.9661 | 4180 | 0.0079 |
| 1.9666 | 4181 | 0.007 |
| 1.9671 | 4182 | 0.0033 |
| 1.9675 | 4183 | 0.0035 |
| 1.9680 | 4184 | 0.0043 |
| 1.9685 | 4185 | 0.0038 |
| 1.9690 | 4186 | 0.0238 |
| 1.9694 | 4187 | 0.0035 |
| 1.9699 | 4188 | 0.0111 |
| 1.9704 | 4189 | 0.005 |
| 1.9708 | 4190 | 0.0035 |
| 1.9713 | 4191 | 0.0062 |
| 1.9718 | 4192 | 0.0082 |
| 1.9722 | 4193 | 0.0043 |
| 1.9727 | 4194 | 0.0053 |
| 1.9732 | 4195 | 0.0039 |
| 1.9737 | 4196 | 0.0099 |
| 1.9741 | 4197 | 0.0032 |
| 1.9746 | 4198 | 0.0062 |
| 1.9751 | 4199 | 0.0032 |
| 1.9755 | 4200 | 0.0066 |
| 1.9760 | 4201 | 0.0117 |
| 1.9765 | 4202 | 0.0033 |
| 1.9770 | 4203 | 0.0044 |
| 1.9774 | 4204 | 0.0046 |
| 1.9779 | 4205 | 0.0069 |
| 1.9784 | 4206 | 0.0086 |
| 1.9788 | 4207 | 0.0045 |
| 1.9793 | 4208 | 0.0122 |
| 1.9798 | 4209 | 0.0043 |
| 1.9802 | 4210 | 0.0065 |
| 1.9807 | 4211 | 0.0057 |
| 1.9812 | 4212 | 0.0047 |
| 1.9817 | 4213 | 0.0021 |
| 1.9821 | 4214 | 0.0035 |
| 1.9826 | 4215 | 0.0085 |
| 1.9831 | 4216 | 0.0095 |
| 1.9835 | 4217 | 0.0071 |
| 1.9840 | 4218 | 0.0034 |
| 1.9845 | 4219 | 0.0031 |
| 1.9849 | 4220 | 0.0072 |
| 1.9854 | 4221 | 0.004 |
| 1.9859 | 4222 | 0.0035 |
| 1.9864 | 4223 | 0.0047 |
| 1.9868 | 4224 | 0.0039 |
| 1.9873 | 4225 | 0.0032 |
| 1.9878 | 4226 | 0.0056 |
| 1.9882 | 4227 | 0.0026 |
| 1.9887 | 4228 | 0.0082 |
| 1.9892 | 4229 | 0.0053 |
| 1.9897 | 4230 | 0.0054 |
| 1.9901 | 4231 | 0.003 |
| 1.9906 | 4232 | 0.0125 |
| 1.9911 | 4233 | 0.0044 |
| 1.9915 | 4234 | 0.0051 |
| 1.9920 | 4235 | 0.002 |
| 1.9925 | 4236 | 0.0036 |
| 1.9929 | 4237 | 0.0081 |
| 1.9934 | 4238 | 0.0035 |
| 1.9939 | 4239 | 0.0087 |
| 1.9944 | 4240 | 0.0048 |
| 1.9948 | 4241 | 0.0204 |
| 1.9953 | 4242 | 0.0032 |
| 1.9958 | 4243 | 0.0078 |
| 1.9962 | 4244 | 0.0065 |
| 1.9967 | 4245 | 0.0058 |
| 1.9972 | 4246 | 0.0025 |
| 1.9976 | 4247 | 0.0042 |
| 1.9981 | 4248 | 0.0034 |
| 1.9986 | 4249 | 0.0085 |
| 1.9991 | 4250 | 0.0075 |
| 1.9995 | 4251 | 0.0038 |
| 2.0 | 4252 | 0.0127 |
| 2.0005 | 4253 | 0.0042 |
| 2.0009 | 4254 | 0.0082 |
| 2.0014 | 4255 | 0.0059 |
| 2.0019 | 4256 | 0.0061 |
| 2.0024 | 4257 | 0.0121 |
| 2.0028 | 4258 | 0.0036 |
| 2.0033 | 4259 | 0.0027 |
| 2.0038 | 4260 | 0.0047 |
| 2.0042 | 4261 | 0.0106 |
| 2.0047 | 4262 | 0.0051 |
| 2.0052 | 4263 | 0.0045 |
| 2.0056 | 4264 | 0.0126 |
| 2.0061 | 4265 | 0.0067 |
| 2.0066 | 4266 | 0.011 |
| 2.0071 | 4267 | 0.0057 |
| 2.0075 | 4268 | 0.005 |
| 2.0080 | 4269 | 0.0055 |
| 2.0085 | 4270 | 0.0075 |
| 2.0089 | 4271 | 0.0077 |
| 2.0094 | 4272 | 0.0055 |
| 2.0099 | 4273 | 0.0078 |
| 2.0103 | 4274 | 0.0063 |
| 2.0108 | 4275 | 0.0174 |
| 2.0113 | 4276 | 0.007 |
| 2.0118 | 4277 | 0.0043 |
| 2.0122 | 4278 | 0.0028 |
| 2.0127 | 4279 | 0.0119 |
| 2.0132 | 4280 | 0.0067 |
| 2.0136 | 4281 | 0.0029 |
| 2.0141 | 4282 | 0.005 |
| 2.0146 | 4283 | 0.0047 |
| 2.0151 | 4284 | 0.0022 |
| 2.0155 | 4285 | 0.0102 |
| 2.0160 | 4286 | 0.0029 |
| 2.0165 | 4287 | 0.0182 |
| 2.0169 | 4288 | 0.018 |
| 2.0174 | 4289 | 0.0046 |
| 2.0179 | 4290 | 0.0027 |
| 2.0183 | 4291 | 0.0021 |
| 2.0188 | 4292 | 0.0047 |
| 2.0193 | 4293 | 0.0041 |
| 2.0198 | 4294 | 0.0075 |
| 2.0202 | 4295 | 0.0061 |
| 2.0207 | 4296 | 0.007 |
| 2.0212 | 4297 | 0.0017 |
| 2.0216 | 4298 | 0.0075 |
| 2.0221 | 4299 | 0.0041 |
| 2.0226 | 4300 | 0.0063 |
| 2.0230 | 4301 | 0.008 |
| 2.0235 | 4302 | 0.0041 |
| 2.0240 | 4303 | 0.0045 |
| 2.0245 | 4304 | 0.0076 |
| 2.0249 | 4305 | 0.0044 |
| 2.0254 | 4306 | 0.0017 |
| 2.0259 | 4307 | 0.0042 |
| 2.0263 | 4308 | 0.0145 |
| 2.0268 | 4309 | 0.0046 |
| 2.0273 | 4310 | 0.004 |
| 2.0278 | 4311 | 0.007 |
| 2.0282 | 4312 | 0.0025 |
| 2.0287 | 4313 | 0.0069 |
| 2.0292 | 4314 | 0.0065 |
| 2.0296 | 4315 | 0.0129 |
| 2.0301 | 4316 | 0.0072 |
| 2.0306 | 4317 | 0.0031 |
| 2.0310 | 4318 | 0.0051 |
| 2.0315 | 4319 | 0.0104 |
| 2.0320 | 4320 | 0.0047 |
| 2.0325 | 4321 | 0.0052 |
| 2.0329 | 4322 | 0.0089 |
| 2.0334 | 4323 | 0.0068 |
| 2.0339 | 4324 | 0.0139 |
| 2.0343 | 4325 | 0.0095 |
| 2.0348 | 4326 | 0.0019 |
| 2.0353 | 4327 | 0.0042 |
| 2.0357 | 4328 | 0.007 |
| 2.0362 | 4329 | 0.008 |
| 2.0367 | 4330 | 0.0032 |
| 2.0372 | 4331 | 0.0044 |
| 2.0376 | 4332 | 0.0024 |
| 2.0381 | 4333 | 0.0069 |
| 2.0386 | 4334 | 0.0174 |
| 2.0390 | 4335 | 0.0064 |
| 2.0395 | 4336 | 0.0109 |
| 2.0400 | 4337 | 0.003 |
| 2.0405 | 4338 | 0.0073 |
| 2.0409 | 4339 | 0.0052 |
| 2.0414 | 4340 | 0.0052 |
| 2.0419 | 4341 | 0.0056 |
| 2.0423 | 4342 | 0.004 |
| 2.0428 | 4343 | 0.0043 |
| 2.0433 | 4344 | 0.0053 |
| 2.0437 | 4345 | 0.0062 |
| 2.0442 | 4346 | 0.0045 |
| 2.0447 | 4347 | 0.0035 |
| 2.0452 | 4348 | 0.0043 |
| 2.0456 | 4349 | 0.0074 |
| 2.0461 | 4350 | 0.0036 |
| 2.0466 | 4351 | 0.0099 |
| 2.0470 | 4352 | 0.004 |
| 2.0475 | 4353 | 0.0081 |
| 2.0480 | 4354 | 0.0035 |
| 2.0484 | 4355 | 0.003 |
| 2.0489 | 4356 | 0.0017 |
| 2.0494 | 4357 | 0.0025 |
| 2.0499 | 4358 | 0.0055 |
| 2.0503 | 4359 | 0.0021 |
| 2.0508 | 4360 | 0.0039 |
| 2.0513 | 4361 | 0.0045 |
| 2.0517 | 4362 | 0.0031 |
| 2.0522 | 4363 | 0.0026 |
| 2.0527 | 4364 | 0.0032 |
| 2.0532 | 4365 | 0.0064 |
| 2.0536 | 4366 | 0.0076 |
| 2.0541 | 4367 | 0.0025 |
| 2.0546 | 4368 | 0.0067 |
| 2.0550 | 4369 | 0.0049 |
| 2.0555 | 4370 | 0.0061 |
| 2.0560 | 4371 | 0.0021 |
| 2.0564 | 4372 | 0.0069 |
| 2.0569 | 4373 | 0.0033 |
| 2.0574 | 4374 | 0.0033 |
| 2.0579 | 4375 | 0.0114 |
| 2.0583 | 4376 | 0.0036 |
| 2.0588 | 4377 | 0.0047 |
| 2.0593 | 4378 | 0.0047 |
| 2.0597 | 4379 | 0.0026 |
| 2.0602 | 4380 | 0.0039 |
| 2.0607 | 4381 | 0.0054 |
| 2.0611 | 4382 | 0.0103 |
| 2.0616 | 4383 | 0.0055 |
| 2.0621 | 4384 | 0.0026 |
| 2.0626 | 4385 | 0.0036 |
| 2.0630 | 4386 | 0.0043 |
| 2.0635 | 4387 | 0.004 |
| 2.0640 | 4388 | 0.0032 |
| 2.0644 | 4389 | 0.0045 |
| 2.0649 | 4390 | 0.0169 |
| 2.0654 | 4391 | 0.0027 |
| 2.0659 | 4392 | 0.0052 |
| 2.0663 | 4393 | 0.0033 |
| 2.0668 | 4394 | 0.005 |
| 2.0673 | 4395 | 0.0028 |
| 2.0677 | 4396 | 0.0046 |
| 2.0682 | 4397 | 0.0069 |
| 2.0687 | 4398 | 0.0067 |
| 2.0691 | 4399 | 0.0098 |
| 2.0696 | 4400 | 0.0036 |
| 2.0701 | 4401 | 0.006 |
| 2.0706 | 4402 | 0.0033 |
| 2.0710 | 4403 | 0.0036 |
| 2.0715 | 4404 | 0.0043 |
| 2.0720 | 4405 | 0.0036 |
| 2.0724 | 4406 | 0.0064 |
| 2.0729 | 4407 | 0.0073 |
| 2.0734 | 4408 | 0.0036 |
| 2.0738 | 4409 | 0.0023 |
| 2.0743 | 4410 | 0.0031 |
| 2.0748 | 4411 | 0.0058 |
| 2.0753 | 4412 | 0.0025 |
| 2.0757 | 4413 | 0.0045 |
| 2.0762 | 4414 | 0.006 |
| 2.0767 | 4415 | 0.0229 |
| 2.0771 | 4416 | 0.012 |
| 2.0776 | 4417 | 0.0025 |
| 2.0781 | 4418 | 0.0029 |
| 2.0786 | 4419 | 0.0038 |
| 2.0790 | 4420 | 0.0101 |
| 2.0795 | 4421 | 0.0064 |
| 2.0800 | 4422 | 0.003 |
| 2.0804 | 4423 | 0.0066 |
| 2.0809 | 4424 | 0.0059 |
| 2.0814 | 4425 | 0.0039 |
| 2.0818 | 4426 | 0.005 |
| 2.0823 | 4427 | 0.008 |
| 2.0828 | 4428 | 0.0118 |
| 2.0833 | 4429 | 0.0066 |
| 2.0837 | 4430 | 0.0035 |
| 2.0842 | 4431 | 0.005 |
| 2.0847 | 4432 | 0.0032 |
| 2.0851 | 4433 | 0.0023 |
| 2.0856 | 4434 | 0.0046 |
| 2.0861 | 4435 | 0.012 |
| 2.0865 | 4436 | 0.0093 |
| 2.0870 | 4437 | 0.0046 |
| 2.0875 | 4438 | 0.002 |
| 2.0880 | 4439 | 0.0042 |
| 2.0884 | 4440 | 0.0035 |
| 2.0889 | 4441 | 0.0031 |
| 2.0894 | 4442 | 0.0068 |
| 2.0898 | 4443 | 0.0026 |
| 2.0903 | 4444 | 0.0062 |
| 2.0908 | 4445 | 0.01 |
| 2.0913 | 4446 | 0.0036 |
| 2.0917 | 4447 | 0.0034 |
| 2.0922 | 4448 | 0.0083 |
| 2.0927 | 4449 | 0.0027 |
| 2.0931 | 4450 | 0.0027 |
| 2.0936 | 4451 | 0.0053 |
| 2.0941 | 4452 | 0.0052 |
| 2.0945 | 4453 | 0.0039 |
| 2.0950 | 4454 | 0.0057 |
| 2.0955 | 4455 | 0.0059 |
| 2.0960 | 4456 | 0.0051 |
| 2.0964 | 4457 | 0.0047 |
| 2.0969 | 4458 | 0.0047 |
| 2.0974 | 4459 | 0.0042 |
| 2.0978 | 4460 | 0.0037 |
| 2.0983 | 4461 | 0.0038 |
| 2.0988 | 4462 | 0.0032 |
| 2.0992 | 4463 | 0.0064 |
| 2.0997 | 4464 | 0.0026 |
| 2.1002 | 4465 | 0.0039 |
| 2.1007 | 4466 | 0.0051 |
| 2.1011 | 4467 | 0.0043 |
| 2.1016 | 4468 | 0.007 |
| 2.1021 | 4469 | 0.0038 |
| 2.1025 | 4470 | 0.0023 |
| 2.1030 | 4471 | 0.0079 |
| 2.1035 | 4472 | 0.0058 |
| 2.1040 | 4473 | 0.0032 |
| 2.1044 | 4474 | 0.0076 |
| 2.1049 | 4475 | 0.0062 |
| 2.1054 | 4476 | 0.0036 |
| 2.1058 | 4477 | 0.0035 |
| 2.1063 | 4478 | 0.0042 |
| 2.1068 | 4479 | 0.0038 |
| 2.1072 | 4480 | 0.0018 |
| 2.1077 | 4481 | 0.0059 |
| 2.1082 | 4482 | 0.0059 |
| 2.1087 | 4483 | 0.0058 |
| 2.1091 | 4484 | 0.0059 |
| 2.1096 | 4485 | 0.0047 |
| 2.1101 | 4486 | 0.0043 |
| 2.1105 | 4487 | 0.0066 |
| 2.1110 | 4488 | 0.0083 |
| 2.1115 | 4489 | 0.0052 |
| 2.1119 | 4490 | 0.0035 |
| 2.1124 | 4491 | 0.0034 |
| 2.1129 | 4492 | 0.0016 |
| 2.1134 | 4493 | 0.0064 |
| 2.1138 | 4494 | 0.0044 |
| 2.1143 | 4495 | 0.0037 |
| 2.1148 | 4496 | 0.0022 |
| 2.1152 | 4497 | 0.0038 |
| 2.1157 | 4498 | 0.0033 |
| 2.1162 | 4499 | 0.0029 |
| 2.1167 | 4500 | 0.003 |
| 2.1171 | 4501 | 0.0118 |
| 2.1176 | 4502 | 0.0041 |
| 2.1181 | 4503 | 0.0059 |
| 2.1185 | 4504 | 0.0057 |
| 2.1190 | 4505 | 0.0028 |
| 2.1195 | 4506 | 0.008 |
| 2.1199 | 4507 | 0.0046 |
| 2.1204 | 4508 | 0.0039 |
| 2.1209 | 4509 | 0.0045 |
| 2.1214 | 4510 | 0.0038 |
| 2.1218 | 4511 | 0.0051 |
| 2.1223 | 4512 | 0.0039 |
| 2.1228 | 4513 | 0.0024 |
| 2.1232 | 4514 | 0.0027 |
| 2.1237 | 4515 | 0.0033 |
| 2.1242 | 4516 | 0.0029 |
| 2.1246 | 4517 | 0.0053 |
| 2.1251 | 4518 | 0.0053 |
| 2.1256 | 4519 | 0.0048 |
| 2.1261 | 4520 | 0.0054 |
| 2.1265 | 4521 | 0.0052 |
| 2.1270 | 4522 | 0.0078 |
| 2.1275 | 4523 | 0.0032 |
| 2.1279 | 4524 | 0.0056 |
| 2.1284 | 4525 | 0.0015 |
| 2.1289 | 4526 | 0.0028 |
| 2.1294 | 4527 | 0.0028 |
| 2.1298 | 4528 | 0.0028 |
| 2.1303 | 4529 | 0.0016 |
| 2.1308 | 4530 | 0.0236 |
| 2.1312 | 4531 | 0.0056 |
| 2.1317 | 4532 | 0.0068 |
| 2.1322 | 4533 | 0.0023 |
| 2.1326 | 4534 | 0.0056 |
| 2.1331 | 4535 | 0.003 |
| 2.1336 | 4536 | 0.003 |
| 2.1341 | 4537 | 0.0021 |
| 2.1345 | 4538 | 0.0033 |
| 2.1350 | 4539 | 0.0042 |
| 2.1355 | 4540 | 0.0039 |
| 2.1359 | 4541 | 0.0041 |
| 2.1364 | 4542 | 0.0027 |
| 2.1369 | 4543 | 0.0055 |
| 2.1373 | 4544 | 0.0017 |
| 2.1378 | 4545 | 0.0067 |
| 2.1383 | 4546 | 0.0052 |
| 2.1388 | 4547 | 0.0033 |
| 2.1392 | 4548 | 0.0065 |
| 2.1397 | 4549 | 0.0081 |
| 2.1402 | 4550 | 0.0049 |
| 2.1406 | 4551 | 0.0051 |
| 2.1411 | 4552 | 0.0046 |
| 2.1416 | 4553 | 0.0028 |
| 2.1421 | 4554 | 0.0036 |
| 2.1425 | 4555 | 0.0025 |
| 2.1430 | 4556 | 0.0035 |
| 2.1435 | 4557 | 0.0029 |
| 2.1439 | 4558 | 0.0082 |
| 2.1444 | 4559 | 0.0061 |
| 2.1449 | 4560 | 0.0027 |
| 2.1453 | 4561 | 0.0027 |
| 2.1458 | 4562 | 0.0076 |
| 2.1463 | 4563 | 0.0063 |
| 2.1468 | 4564 | 0.0229 |
| 2.1472 | 4565 | 0.0026 |
| 2.1477 | 4566 | 0.0054 |
| 2.1482 | 4567 | 0.0062 |
| 2.1486 | 4568 | 0.012 |
| 2.1491 | 4569 | 0.0042 |
| 2.1496 | 4570 | 0.0033 |
| 2.1500 | 4571 | 0.0024 |
| 2.1505 | 4572 | 0.0034 |
| 2.1510 | 4573 | 0.0028 |
| 2.1515 | 4574 | 0.0025 |
| 2.1519 | 4575 | 0.0019 |
| 2.1524 | 4576 | 0.005 |
| 2.1529 | 4577 | 0.0046 |
| 2.1533 | 4578 | 0.0045 |
| 2.1538 | 4579 | 0.0076 |
| 2.1543 | 4580 | 0.0049 |
| 2.1548 | 4581 | 0.0034 |
| 2.1552 | 4582 | 0.0072 |
| 2.1557 | 4583 | 0.0035 |
| 2.1562 | 4584 | 0.0044 |
| 2.1566 | 4585 | 0.0121 |
| 2.1571 | 4586 | 0.0039 |
| 2.1576 | 4587 | 0.0035 |
| 2.1580 | 4588 | 0.0031 |
| 2.1585 | 4589 | 0.0048 |
| 2.1590 | 4590 | 0.0026 |
| 2.1595 | 4591 | 0.0045 |
| 2.1599 | 4592 | 0.007 |
| 2.1604 | 4593 | 0.0036 |
| 2.1609 | 4594 | 0.0048 |
| 2.1613 | 4595 | 0.0036 |
| 2.1618 | 4596 | 0.0022 |
| 2.1623 | 4597 | 0.0057 |
| 2.1627 | 4598 | 0.0028 |
| 2.1632 | 4599 | 0.0022 |
| 2.1637 | 4600 | 0.0014 |
| 2.1642 | 4601 | 0.0018 |
| 2.1646 | 4602 | 0.0017 |
| 2.1651 | 4603 | 0.0033 |
| 2.1656 | 4604 | 0.0031 |
| 2.1660 | 4605 | 0.0021 |
| 2.1665 | 4606 | 0.0028 |
| 2.1670 | 4607 | 0.0046 |
| 2.1675 | 4608 | 0.005 |
| 2.1679 | 4609 | 0.0048 |
| 2.1684 | 4610 | 0.004 |
| 2.1689 | 4611 | 0.0027 |
| 2.1693 | 4612 | 0.0038 |
| 2.1698 | 4613 | 0.0036 |
| 2.1703 | 4614 | 0.004 |
| 2.1707 | 4615 | 0.0095 |
| 2.1712 | 4616 | 0.0018 |
| 2.1717 | 4617 | 0.0036 |
| 2.1722 | 4618 | 0.0033 |
| 2.1726 | 4619 | 0.0038 |
| 2.1731 | 4620 | 0.0023 |
| 2.1736 | 4621 | 0.0019 |
| 2.1740 | 4622 | 0.0042 |
| 2.1745 | 4623 | 0.0049 |
| 2.1750 | 4624 | 0.0039 |
| 2.1754 | 4625 | 0.0046 |
| 2.1759 | 4626 | 0.0048 |
| 2.1764 | 4627 | 0.0052 |
| 2.1769 | 4628 | 0.0014 |
| 2.1773 | 4629 | 0.0044 |
| 2.1778 | 4630 | 0.0052 |
| 2.1783 | 4631 | 0.0048 |
| 2.1787 | 4632 | 0.0028 |
| 2.1792 | 4633 | 0.004 |
| 2.1797 | 4634 | 0.0067 |
| 2.1802 | 4635 | 0.0038 |
| 2.1806 | 4636 | 0.0016 |
| 2.1811 | 4637 | 0.0016 |
| 2.1816 | 4638 | 0.0025 |
| 2.1820 | 4639 | 0.005 |
| 2.1825 | 4640 | 0.0062 |
| 2.1830 | 4641 | 0.0029 |
| 2.1834 | 4642 | 0.0025 |
| 2.1839 | 4643 | 0.0035 |
| 2.1844 | 4644 | 0.006 |
| 2.1849 | 4645 | 0.0041 |
| 2.1853 | 4646 | 0.0051 |
| 2.1858 | 4647 | 0.0018 |
| 2.1863 | 4648 | 0.0017 |
| 2.1867 | 4649 | 0.0051 |
| 2.1872 | 4650 | 0.0021 |
| 2.1877 | 4651 | 0.0043 |
| 2.1881 | 4652 | 0.004 |
| 2.1886 | 4653 | 0.0031 |
| 2.1891 | 4654 | 0.0066 |
| 2.1896 | 4655 | 0.0027 |
| 2.1900 | 4656 | 0.0029 |
| 2.1905 | 4657 | 0.0034 |
| 2.1910 | 4658 | 0.0047 |
| 2.1914 | 4659 | 0.0028 |
| 2.1919 | 4660 | 0.0044 |
| 2.1924 | 4661 | 0.0057 |
| 2.1929 | 4662 | 0.0022 |
| 2.1933 | 4663 | 0.0018 |
| 2.1938 | 4664 | 0.0046 |
| 2.1943 | 4665 | 0.0049 |
| 2.1947 | 4666 | 0.0051 |
| 2.1952 | 4667 | 0.0043 |
| 2.1957 | 4668 | 0.0042 |
| 2.1961 | 4669 | 0.0059 |
| 2.1966 | 4670 | 0.0034 |
| 2.1971 | 4671 | 0.0042 |
| 2.1976 | 4672 | 0.0033 |
| 2.1980 | 4673 | 0.0136 |
| 2.1985 | 4674 | 0.0042 |
| 2.1990 | 4675 | 0.0033 |
| 2.1994 | 4676 | 0.0037 |
| 2.1999 | 4677 | 0.0048 |
| 2.2004 | 4678 | 0.0035 |
| 2.2008 | 4679 | 0.0035 |
| 2.2013 | 4680 | 0.0075 |
| 2.2018 | 4681 | 0.0014 |
| 2.2023 | 4682 | 0.002 |
| 2.2027 | 4683 | 0.0044 |
| 2.2032 | 4684 | 0.0058 |
| 2.2037 | 4685 | 0.0035 |
| 2.2041 | 4686 | 0.0041 |
| 2.2046 | 4687 | 0.0075 |
| 2.2051 | 4688 | 0.0052 |
| 2.2056 | 4689 | 0.0036 |
| 2.2060 | 4690 | 0.0054 |
| 2.2065 | 4691 | 0.0021 |
| 2.2070 | 4692 | 0.0033 |
| 2.2074 | 4693 | 0.0085 |
| 2.2079 | 4694 | 0.0014 |
| 2.2084 | 4695 | 0.0028 |
| 2.2088 | 4696 | 0.0039 |
| 2.2093 | 4697 | 0.0038 |
| 2.2098 | 4698 | 0.0032 |
| 2.2103 | 4699 | 0.0031 |
| 2.2107 | 4700 | 0.0033 |
| 2.2112 | 4701 | 0.004 |
| 2.2117 | 4702 | 0.0036 |
| 2.2121 | 4703 | 0.0032 |
| 2.2126 | 4704 | 0.0026 |
| 2.2131 | 4705 | 0.0037 |
| 2.2135 | 4706 | 0.0056 |
| 2.2140 | 4707 | 0.0034 |
| 2.2145 | 4708 | 0.0051 |
| 2.2150 | 4709 | 0.0028 |
| 2.2154 | 4710 | 0.0049 |
| 2.2159 | 4711 | 0.0036 |
| 2.2164 | 4712 | 0.0027 |
| 2.2168 | 4713 | 0.0022 |
| 2.2173 | 4714 | 0.0016 |
| 2.2178 | 4715 | 0.0049 |
| 2.2183 | 4716 | 0.0034 |
| 2.2187 | 4717 | 0.0037 |
| 2.2192 | 4718 | 0.0043 |
| 2.2197 | 4719 | 0.0045 |
| 2.2201 | 4720 | 0.0039 |
| 2.2206 | 4721 | 0.0032 |
| 2.2211 | 4722 | 0.0048 |
| 2.2215 | 4723 | 0.0054 |
| 2.2220 | 4724 | 0.0027 |
| 2.2225 | 4725 | 0.0026 |
| 2.2230 | 4726 | 0.0023 |
| 2.2234 | 4727 | 0.0026 |
| 2.2239 | 4728 | 0.0019 |
| 2.2244 | 4729 | 0.0047 |
| 2.2248 | 4730 | 0.0017 |
| 2.2253 | 4731 | 0.003 |
| 2.2258 | 4732 | 0.0031 |
| 2.2262 | 4733 | 0.0016 |
| 2.2267 | 4734 | 0.0036 |
| 2.2272 | 4735 | 0.0066 |
| 2.2277 | 4736 | 0.0021 |
| 2.2281 | 4737 | 0.003 |
| 2.2286 | 4738 | 0.0034 |
| 2.2291 | 4739 | 0.0091 |
| 2.2295 | 4740 | 0.0025 |
| 2.2300 | 4741 | 0.0022 |
| 2.2305 | 4742 | 0.0051 |
| 2.2310 | 4743 | 0.0031 |
| 2.2314 | 4744 | 0.0031 |
| 2.2319 | 4745 | 0.0046 |
| 2.2324 | 4746 | 0.003 |
| 2.2328 | 4747 | 0.0029 |
| 2.2333 | 4748 | 0.0031 |
| 2.2338 | 4749 | 0.0021 |
| 2.2342 | 4750 | 0.0033 |
| 2.2347 | 4751 | 0.0095 |
| 2.2352 | 4752 | 0.0027 |
| 2.2357 | 4753 | 0.0021 |
| 2.2361 | 4754 | 0.0024 |
| 2.2366 | 4755 | 0.0034 |
| 2.2371 | 4756 | 0.0028 |
| 2.2375 | 4757 | 0.0044 |
| 2.2380 | 4758 | 0.0051 |
| 2.2385 | 4759 | 0.003 |
| 2.2389 | 4760 | 0.0034 |
| 2.2394 | 4761 | 0.0022 |
| 2.2399 | 4762 | 0.0021 |
| 2.2404 | 4763 | 0.0026 |
| 2.2408 | 4764 | 0.0029 |
| 2.2413 | 4765 | 0.0044 |
| 2.2418 | 4766 | 0.0026 |
| 2.2422 | 4767 | 0.0023 |
| 2.2427 | 4768 | 0.0022 |
| 2.2432 | 4769 | 0.0086 |
| 2.2437 | 4770 | 0.0025 |
| 2.2441 | 4771 | 0.0041 |
| 2.2446 | 4772 | 0.003 |
| 2.2451 | 4773 | 0.003 |
| 2.2455 | 4774 | 0.0022 |
| 2.2460 | 4775 | 0.0083 |
| 2.2465 | 4776 | 0.0015 |
| 2.2469 | 4777 | 0.0063 |
| 2.2474 | 4778 | 0.0022 |
| 2.2479 | 4779 | 0.0045 |
| 2.2484 | 4780 | 0.0044 |
| 2.2488 | 4781 | 0.0082 |
| 2.2493 | 4782 | 0.0027 |
| 2.2498 | 4783 | 0.0035 |
| 2.2502 | 4784 | 0.0028 |
| 2.2507 | 4785 | 0.0044 |
| 2.2512 | 4786 | 0.0016 |
| 2.2516 | 4787 | 0.0025 |
| 2.2521 | 4788 | 0.0032 |
| 2.2526 | 4789 | 0.0039 |
| 2.2531 | 4790 | 0.0018 |
| 2.2535 | 4791 | 0.0023 |
| 2.2540 | 4792 | 0.0025 |
| 2.2545 | 4793 | 0.0093 |
| 2.2549 | 4794 | 0.0054 |
| 2.2554 | 4795 | 0.0015 |
| 2.2559 | 4796 | 0.0033 |
| 2.2563 | 4797 | 0.0026 |
| 2.2568 | 4798 | 0.0044 |
| 2.2573 | 4799 | 0.0026 |
| 2.2578 | 4800 | 0.0057 |
| 2.2582 | 4801 | 0.0019 |
| 2.2587 | 4802 | 0.0032 |
| 2.2592 | 4803 | 0.0043 |
| 2.2596 | 4804 | 0.004 |
| 2.2601 | 4805 | 0.0073 |
| 2.2606 | 4806 | 0.0059 |
| 2.2611 | 4807 | 0.003 |
| 2.2615 | 4808 | 0.0041 |
| 2.2620 | 4809 | 0.0032 |
| 2.2625 | 4810 | 0.0072 |
| 2.2629 | 4811 | 0.0043 |
| 2.2634 | 4812 | 0.005 |
| 2.2639 | 4813 | 0.0073 |
| 2.2643 | 4814 | 0.0029 |
| 2.2648 | 4815 | 0.0023 |
| 2.2653 | 4816 | 0.0049 |
| 2.2658 | 4817 | 0.0034 |
| 2.2662 | 4818 | 0.0029 |
| 2.2667 | 4819 | 0.003 |
| 2.2672 | 4820 | 0.0023 |
| 2.2676 | 4821 | 0.0025 |
| 2.2681 | 4822 | 0.002 |
| 2.2686 | 4823 | 0.002 |
| 2.2690 | 4824 | 0.0088 |
| 2.2695 | 4825 | 0.0036 |
| 2.2700 | 4826 | 0.0033 |
| 2.2705 | 4827 | 0.0042 |
| 2.2709 | 4828 | 0.0062 |
| 2.2714 | 4829 | 0.0024 |
| 2.2719 | 4830 | 0.0024 |
| 2.2723 | 4831 | 0.0032 |
| 2.2728 | 4832 | 0.0028 |
| 2.2733 | 4833 | 0.0021 |
| 2.2738 | 4834 | 0.0024 |
| 2.2742 | 4835 | 0.0025 |
| 2.2747 | 4836 | 0.0031 |
| 2.2752 | 4837 | 0.0018 |
| 2.2756 | 4838 | 0.0018 |
| 2.2761 | 4839 | 0.002 |
| 2.2766 | 4840 | 0.0028 |
| 2.2770 | 4841 | 0.0029 |
| 2.2775 | 4842 | 0.0028 |
| 2.2780 | 4843 | 0.0052 |
| 2.2785 | 4844 | 0.003 |
| 2.2789 | 4845 | 0.0024 |
| 2.2794 | 4846 | 0.0031 |
| 2.2799 | 4847 | 0.0071 |
| 2.2803 | 4848 | 0.0027 |
| 2.2808 | 4849 | 0.0047 |
| 2.2813 | 4850 | 0.0023 |
| 2.2817 | 4851 | 0.0044 |
| 2.2822 | 4852 | 0.0041 |
| 2.2827 | 4853 | 0.0049 |
| 2.2832 | 4854 | 0.0023 |
| 2.2836 | 4855 | 0.0025 |
| 2.2841 | 4856 | 0.0064 |
| 2.2846 | 4857 | 0.0029 |
| 2.2850 | 4858 | 0.0023 |
| 2.2855 | 4859 | 0.0019 |
| 2.2860 | 4860 | 0.0046 |
| 2.2865 | 4861 | 0.003 |
| 2.2869 | 4862 | 0.0025 |
| 2.2874 | 4863 | 0.0015 |
| 2.2879 | 4864 | 0.0029 |
| 2.2883 | 4865 | 0.0063 |
| 2.2888 | 4866 | 0.0034 |
| 2.2893 | 4867 | 0.003 |
| 2.2897 | 4868 | 0.0021 |
| 2.2902 | 4869 | 0.0021 |
| 2.2907 | 4870 | 0.0101 |
| 2.2912 | 4871 | 0.0041 |
| 2.2916 | 4872 | 0.0028 |
| 2.2921 | 4873 | 0.0022 |
| 2.2926 | 4874 | 0.002 |
| 2.2930 | 4875 | 0.0013 |
| 2.2935 | 4876 | 0.0034 |
| 2.2940 | 4877 | 0.0074 |
| 2.2944 | 4878 | 0.0026 |
| 2.2949 | 4879 | 0.0039 |
| 2.2954 | 4880 | 0.0098 |
| 2.2959 | 4881 | 0.0043 |
| 2.2963 | 4882 | 0.0029 |
| 2.2968 | 4883 | 0.0019 |
| 2.2973 | 4884 | 0.0066 |
| 2.2977 | 4885 | 0.0033 |
| 2.2982 | 4886 | 0.0034 |
| 2.2987 | 4887 | 0.0046 |
| 2.2992 | 4888 | 0.0021 |
| 2.2996 | 4889 | 0.003 |
| 2.3001 | 4890 | 0.0025 |
| 2.3006 | 4891 | 0.0062 |
| 2.3010 | 4892 | 0.0033 |
| 2.3015 | 4893 | 0.0032 |
| 2.3020 | 4894 | 0.004 |
| 2.3024 | 4895 | 0.003 |
| 2.3029 | 4896 | 0.0021 |
| 2.3034 | 4897 | 0.0023 |
| 2.3039 | 4898 | 0.0035 |
| 2.3043 | 4899 | 0.0038 |
| 2.3048 | 4900 | 0.0021 |
| 2.3053 | 4901 | 0.002 |
| 2.3057 | 4902 | 0.002 |
| 2.3062 | 4903 | 0.0021 |
| 2.3067 | 4904 | 0.0035 |
| 2.3071 | 4905 | 0.004 |
| 2.3076 | 4906 | 0.0025 |
| 2.3081 | 4907 | 0.0229 |
| 2.3086 | 4908 | 0.0036 |
| 2.3090 | 4909 | 0.0038 |
| 2.3095 | 4910 | 0.002 |
| 2.3100 | 4911 | 0.0019 |
| 2.3104 | 4912 | 0.0042 |
| 2.3109 | 4913 | 0.003 |
| 2.3114 | 4914 | 0.0039 |
| 2.3119 | 4915 | 0.0049 |
| 2.3123 | 4916 | 0.0023 |
| 2.3128 | 4917 | 0.0035 |
| 2.3133 | 4918 | 0.0063 |
| 2.3137 | 4919 | 0.0028 |
| 2.3142 | 4920 | 0.0022 |
| 2.3147 | 4921 | 0.0028 |
| 2.3151 | 4922 | 0.0038 |
| 2.3156 | 4923 | 0.0018 |
| 2.3161 | 4924 | 0.005 |
| 2.3166 | 4925 | 0.002 |
| 2.3170 | 4926 | 0.0039 |
| 2.3175 | 4927 | 0.0046 |
| 2.3180 | 4928 | 0.0024 |
| 2.3184 | 4929 | 0.0022 |
| 2.3189 | 4930 | 0.0027 |
| 2.3194 | 4931 | 0.0028 |
| 2.3198 | 4932 | 0.005 |
| 2.3203 | 4933 | 0.002 |
| 2.3208 | 4934 | 0.0028 |
| 2.3213 | 4935 | 0.0039 |
| 2.3217 | 4936 | 0.0023 |
| 2.3222 | 4937 | 0.0049 |
| 2.3227 | 4938 | 0.0048 |
| 2.3231 | 4939 | 0.0031 |
| 2.3236 | 4940 | 0.0022 |
| 2.3241 | 4941 | 0.0036 |
| 2.3246 | 4942 | 0.0024 |
| 2.3250 | 4943 | 0.0038 |
| 2.3255 | 4944 | 0.0054 |
| 2.3260 | 4945 | 0.0026 |
| 2.3264 | 4946 | 0.0045 |
| 2.3269 | 4947 | 0.0026 |
| 2.3274 | 4948 | 0.0042 |
| 2.3278 | 4949 | 0.0027 |
| 2.3283 | 4950 | 0.003 |
| 2.3288 | 4951 | 0.0012 |
| 2.3293 | 4952 | 0.0018 |
| 2.3297 | 4953 | 0.0027 |
| 2.3302 | 4954 | 0.0026 |
| 2.3307 | 4955 | 0.002 |
| 2.3311 | 4956 | 0.0024 |
| 2.3316 | 4957 | 0.0013 |
| 2.3321 | 4958 | 0.0051 |
| 2.3325 | 4959 | 0.0035 |
| 2.3330 | 4960 | 0.003 |
| 2.3335 | 4961 | 0.004 |
| 2.3340 | 4962 | 0.0043 |
| 2.3344 | 4963 | 0.0024 |
| 2.3349 | 4964 | 0.0042 |
| 2.3354 | 4965 | 0.0017 |
| 2.3358 | 4966 | 0.004 |
| 2.3363 | 4967 | 0.0043 |
| 2.3368 | 4968 | 0.0016 |
| 2.3373 | 4969 | 0.0037 |
| 2.3377 | 4970 | 0.0032 |
| 2.3382 | 4971 | 0.0031 |
| 2.3387 | 4972 | 0.0023 |
| 2.3391 | 4973 | 0.0049 |
| 2.3396 | 4974 | 0.0026 |
| 2.3401 | 4975 | 0.0027 |
| 2.3405 | 4976 | 0.0024 |
| 2.3410 | 4977 | 0.002 |
| 2.3415 | 4978 | 0.0019 |
| 2.3420 | 4979 | 0.0023 |
| 2.3424 | 4980 | 0.003 |
| 2.3429 | 4981 | 0.0085 |
| 2.3434 | 4982 | 0.0037 |
| 2.3438 | 4983 | 0.0026 |
| 2.3443 | 4984 | 0.0031 |
| 2.3448 | 4985 | 0.0029 |
| 2.3452 | 4986 | 0.0046 |
| 2.3457 | 4987 | 0.002 |
| 2.3462 | 4988 | 0.0031 |
| 2.3467 | 4989 | 0.0031 |
| 2.3471 | 4990 | 0.0019 |
| 2.3476 | 4991 | 0.0022 |
| 2.3481 | 4992 | 0.0028 |
| 2.3485 | 4993 | 0.0018 |
| 2.3490 | 4994 | 0.0029 |
| 2.3495 | 4995 | 0.002 |
| 2.3500 | 4996 | 0.0056 |
| 2.3504 | 4997 | 0.0048 |
| 2.3509 | 4998 | 0.0022 |
| 2.3514 | 4999 | 0.0022 |
| 2.3518 | 5000 | 0.0032 |
| 2.3523 | 5001 | 0.0034 |
| 2.3528 | 5002 | 0.0013 |
| 2.3532 | 5003 | 0.0021 |
| 2.3537 | 5004 | 0.0054 |
| 2.3542 | 5005 | 0.0029 |
| 2.3547 | 5006 | 0.0019 |
| 2.3551 | 5007 | 0.0014 |
| 2.3556 | 5008 | 0.0016 |
| 2.3561 | 5009 | 0.0036 |
| 2.3565 | 5010 | 0.0025 |
| 2.3570 | 5011 | 0.0027 |
| 2.3575 | 5012 | 0.0018 |
| 2.3579 | 5013 | 0.0017 |
| 2.3584 | 5014 | 0.0043 |
| 2.3589 | 5015 | 0.0015 |
| 2.3594 | 5016 | 0.0026 |
| 2.3598 | 5017 | 0.003 |
| 2.3603 | 5018 | 0.0016 |
| 2.3608 | 5019 | 0.003 |
| 2.3612 | 5020 | 0.0053 |
| 2.3617 | 5021 | 0.0027 |
| 2.3622 | 5022 | 0.004 |
| 2.3627 | 5023 | 0.0028 |
| 2.3631 | 5024 | 0.003 |
| 2.3636 | 5025 | 0.0045 |
| 2.3641 | 5026 | 0.0046 |
| 2.3645 | 5027 | 0.002 |
| 2.3650 | 5028 | 0.0029 |
| 2.3655 | 5029 | 0.0039 |
| 2.3659 | 5030 | 0.0023 |
| 2.3664 | 5031 | 0.0054 |
| 2.3669 | 5032 | 0.0043 |
| 2.3674 | 5033 | 0.0009 |
| 2.3678 | 5034 | 0.0024 |
| 2.3683 | 5035 | 0.0021 |
| 2.3688 | 5036 | 0.0036 |
| 2.3692 | 5037 | 0.0014 |
| 2.3697 | 5038 | 0.0055 |
| 2.3702 | 5039 | 0.0028 |
| 2.3706 | 5040 | 0.0017 |
| 2.3711 | 5041 | 0.0023 |
| 2.3716 | 5042 | 0.0026 |
| 2.3721 | 5043 | 0.0032 |
| 2.3725 | 5044 | 0.0026 |
| 2.3730 | 5045 | 0.0031 |
| 2.3735 | 5046 | 0.004 |
| 2.3739 | 5047 | 0.0042 |
| 2.3744 | 5048 | 0.0028 |
| 2.3749 | 5049 | 0.0023 |
| 2.3754 | 5050 | 0.0043 |
| 2.3758 | 5051 | 0.0034 |
| 2.3763 | 5052 | 0.003 |
| 2.3768 | 5053 | 0.0034 |
| 2.3772 | 5054 | 0.0032 |
| 2.3777 | 5055 | 0.0115 |
| 2.3782 | 5056 | 0.0017 |
| 2.3786 | 5057 | 0.0018 |
| 2.3791 | 5058 | 0.0056 |
| 2.3796 | 5059 | 0.0019 |
| 2.3801 | 5060 | 0.0051 |
| 2.3805 | 5061 | 0.0034 |
| 2.3810 | 5062 | 0.0031 |
| 2.3815 | 5063 | 0.0033 |
| 2.3819 | 5064 | 0.0017 |
| 2.3824 | 5065 | 0.0028 |
| 2.3829 | 5066 | 0.002 |
| 2.3833 | 5067 | 0.0018 |
| 2.3838 | 5068 | 0.0048 |
| 2.3843 | 5069 | 0.0016 |
| 2.3848 | 5070 | 0.0043 |
| 2.3852 | 5071 | 0.0043 |
| 2.3857 | 5072 | 0.0022 |
| 2.3862 | 5073 | 0.0028 |
| 2.3866 | 5074 | 0.0021 |
| 2.3871 | 5075 | 0.0027 |
| 2.3876 | 5076 | 0.0014 |
| 2.3881 | 5077 | 0.002 |
| 2.3885 | 5078 | 0.0027 |
| 2.3890 | 5079 | 0.0014 |
| 2.3895 | 5080 | 0.0016 |
| 2.3899 | 5081 | 0.0033 |
| 2.3904 | 5082 | 0.003 |
| 2.3909 | 5083 | 0.004 |
| 2.3913 | 5084 | 0.0044 |
| 2.3918 | 5085 | 0.0023 |
| 2.3923 | 5086 | 0.0081 |
| 2.3928 | 5087 | 0.0032 |
| 2.3932 | 5088 | 0.0031 |
| 2.3937 | 5089 | 0.0033 |
| 2.3942 | 5090 | 0.0029 |
| 2.3946 | 5091 | 0.0021 |
| 2.3951 | 5092 | 0.003 |
| 2.3956 | 5093 | 0.0025 |
| 2.3960 | 5094 | 0.0048 |
| 2.3965 | 5095 | 0.0045 |
| 2.3970 | 5096 | 0.0034 |
| 2.3975 | 5097 | 0.0044 |
| 2.3979 | 5098 | 0.0021 |
| 2.3984 | 5099 | 0.0033 |
| 2.3989 | 5100 | 0.0029 |
| 2.3993 | 5101 | 0.0042 |
| 2.3998 | 5102 | 0.0016 |
| 2.4003 | 5103 | 0.0022 |
| 2.4008 | 5104 | 0.0039 |
| 2.4012 | 5105 | 0.0045 |
| 2.4017 | 5106 | 0.0036 |
| 2.4022 | 5107 | 0.0029 |
| 2.4026 | 5108 | 0.005 |
| 2.4031 | 5109 | 0.0026 |
| 2.4036 | 5110 | 0.0056 |
| 2.4040 | 5111 | 0.0033 |
| 2.4045 | 5112 | 0.0016 |
| 2.4050 | 5113 | 0.0027 |
| 2.4055 | 5114 | 0.0017 |
| 2.4059 | 5115 | 0.0025 |
| 2.4064 | 5116 | 0.0038 |
| 2.4069 | 5117 | 0.0019 |
| 2.4073 | 5118 | 0.0028 |
| 2.4078 | 5119 | 0.0026 |
| 2.4083 | 5120 | 0.0049 |
| 2.4087 | 5121 | 0.0016 |
| 2.4092 | 5122 | 0.0024 |
| 2.4097 | 5123 | 0.0029 |
| 2.4102 | 5124 | 0.0038 |
| 2.4106 | 5125 | 0.0041 |
| 2.4111 | 5126 | 0.002 |
| 2.4116 | 5127 | 0.0042 |
| 2.4120 | 5128 | 0.0015 |
| 2.4125 | 5129 | 0.0046 |
| 2.4130 | 5130 | 0.0014 |
| 2.4135 | 5131 | 0.002 |
| 2.4139 | 5132 | 0.0031 |
| 2.4144 | 5133 | 0.0032 |
| 2.4149 | 5134 | 0.0039 |
| 2.4153 | 5135 | 0.0023 |
| 2.4158 | 5136 | 0.0023 |
| 2.4163 | 5137 | 0.0025 |
| 2.4167 | 5138 | 0.0018 |
| 2.4172 | 5139 | 0.0011 |
| 2.4177 | 5140 | 0.0038 |
| 2.4182 | 5141 | 0.0038 |
| 2.4186 | 5142 | 0.0047 |
| 2.4191 | 5143 | 0.0037 |
| 2.4196 | 5144 | 0.003 |
| 2.4200 | 5145 | 0.0016 |
| 2.4205 | 5146 | 0.0023 |
| 2.4210 | 5147 | 0.003 |
| 2.4214 | 5148 | 0.0039 |
| 2.4219 | 5149 | 0.0022 |
| 2.4224 | 5150 | 0.0014 |
| 2.4229 | 5151 | 0.0031 |
| 2.4233 | 5152 | 0.0036 |
| 2.4238 | 5153 | 0.0038 |
| 2.4243 | 5154 | 0.0027 |
| 2.4247 | 5155 | 0.0043 |
| 2.4252 | 5156 | 0.0018 |
| 2.4257 | 5157 | 0.0023 |
| 2.4262 | 5158 | 0.0055 |
| 2.4266 | 5159 | 0.0028 |
| 2.4271 | 5160 | 0.0023 |
| 2.4276 | 5161 | 0.0019 |
| 2.4280 | 5162 | 0.0034 |
| 2.4285 | 5163 | 0.0019 |
| 2.4290 | 5164 | 0.0032 |
| 2.4294 | 5165 | 0.0048 |
| 2.4299 | 5166 | 0.003 |
| 2.4304 | 5167 | 0.0037 |
| 2.4309 | 5168 | 0.0015 |
| 2.4313 | 5169 | 0.0027 |
| 2.4318 | 5170 | 0.0012 |
| 2.4323 | 5171 | 0.0018 |
| 2.4327 | 5172 | 0.0021 |
| 2.4332 | 5173 | 0.0016 |
| 2.4337 | 5174 | 0.0035 |
| 2.4341 | 5175 | 0.0046 |
| 2.4346 | 5176 | 0.0057 |
| 2.4351 | 5177 | 0.0015 |
| 2.4356 | 5178 | 0.005 |
| 2.4360 | 5179 | 0.0037 |
| 2.4365 | 5180 | 0.0014 |
| 2.4370 | 5181 | 0.0025 |
| 2.4374 | 5182 | 0.003 |
| 2.4379 | 5183 | 0.0033 |
| 2.4384 | 5184 | 0.0016 |
| 2.4389 | 5185 | 0.0051 |
| 2.4393 | 5186 | 0.008 |
| 2.4398 | 5187 | 0.0014 |
| 2.4403 | 5188 | 0.0017 |
| 2.4407 | 5189 | 0.0044 |
| 2.4412 | 5190 | 0.0023 |
| 2.4417 | 5191 | 0.0033 |
| 2.4421 | 5192 | 0.0027 |
| 2.4426 | 5193 | 0.005 |
| 2.4431 | 5194 | 0.0017 |
| 2.4436 | 5195 | 0.0489 |
| 2.4440 | 5196 | 0.003 |
| 2.4445 | 5197 | 0.0023 |
| 2.4450 | 5198 | 0.0014 |
| 2.4454 | 5199 | 0.003 |
| 2.4459 | 5200 | 0.0053 |
| 2.4464 | 5201 | 0.0028 |
| 2.4468 | 5202 | 0.0036 |
| 2.4473 | 5203 | 0.0034 |
| 2.4478 | 5204 | 0.0025 |
| 2.4483 | 5205 | 0.0025 |
| 2.4487 | 5206 | 0.0047 |
| 2.4492 | 5207 | 0.0032 |
| 2.4497 | 5208 | 0.0034 |
| 2.4501 | 5209 | 0.0048 |
| 2.4506 | 5210 | 0.0024 |
| 2.4511 | 5211 | 0.0037 |
| 2.4516 | 5212 | 0.0015 |
| 2.4520 | 5213 | 0.003 |
| 2.4525 | 5214 | 0.0025 |
| 2.4530 | 5215 | 0.0029 |
| 2.4534 | 5216 | 0.0017 |
| 2.4539 | 5217 | 0.0012 |
| 2.4544 | 5218 | 0.0027 |
| 2.4548 | 5219 | 0.0031 |
| 2.4553 | 5220 | 0.0019 |
| 2.4558 | 5221 | 0.0023 |
| 2.4563 | 5222 | 0.0031 |
| 2.4567 | 5223 | 0.0024 |
| 2.4572 | 5224 | 0.0029 |
| 2.4577 | 5225 | 0.0021 |
| 2.4581 | 5226 | 0.0042 |
| 2.4586 | 5227 | 0.0052 |
| 2.4591 | 5228 | 0.0034 |
| 2.4595 | 5229 | 0.0018 |
| 2.4600 | 5230 | 0.0028 |
| 2.4605 | 5231 | 0.0017 |
| 2.4610 | 5232 | 0.0033 |
| 2.4614 | 5233 | 0.0028 |
| 2.4619 | 5234 | 0.003 |
| 2.4624 | 5235 | 0.0034 |
| 2.4628 | 5236 | 0.0057 |
| 2.4633 | 5237 | 0.0014 |
| 2.4638 | 5238 | 0.0039 |
| 2.4643 | 5239 | 0.0018 |
| 2.4647 | 5240 | 0.0026 |
| 2.4652 | 5241 | 0.0023 |
| 2.4657 | 5242 | 0.0023 |
| 2.4661 | 5243 | 0.0024 |
| 2.4666 | 5244 | 0.0037 |
| 2.4671 | 5245 | 0.0036 |
| 2.4675 | 5246 | 0.0033 |
| 2.4680 | 5247 | 0.0015 |
| 2.4685 | 5248 | 0.0023 |
| 2.4690 | 5249 | 0.0043 |
| 2.4694 | 5250 | 0.0037 |
| 2.4699 | 5251 | 0.003 |
| 2.4704 | 5252 | 0.0016 |
| 2.4708 | 5253 | 0.0035 |
| 2.4713 | 5254 | 0.0049 |
| 2.4718 | 5255 | 0.0017 |
| 2.4722 | 5256 | 0.0047 |
| 2.4727 | 5257 | 0.0022 |
| 2.4732 | 5258 | 0.0019 |
| 2.4737 | 5259 | 0.0021 |
| 2.4741 | 5260 | 0.0023 |
| 2.4746 | 5261 | 0.0044 |
| 2.4751 | 5262 | 0.0047 |
| 2.4755 | 5263 | 0.0031 |
| 2.4760 | 5264 | 0.002 |
| 2.4765 | 5265 | 0.0023 |
| 2.4770 | 5266 | 0.0017 |
| 2.4774 | 5267 | 0.0026 |
| 2.4779 | 5268 | 0.0034 |
| 2.4784 | 5269 | 0.0012 |
| 2.4788 | 5270 | 0.003 |
| 2.4793 | 5271 | 0.0067 |
| 2.4798 | 5272 | 0.0026 |
| 2.4802 | 5273 | 0.0022 |
| 2.4807 | 5274 | 0.0045 |
| 2.4812 | 5275 | 0.0237 |
| 2.4817 | 5276 | 0.0041 |
| 2.4821 | 5277 | 0.0025 |
| 2.4826 | 5278 | 0.0019 |
| 2.4831 | 5279 | 0.0031 |
| 2.4835 | 5280 | 0.0036 |
| 2.4840 | 5281 | 0.0038 |
| 2.4845 | 5282 | 0.0028 |
| 2.4849 | 5283 | 0.0022 |
| 2.4854 | 5284 | 0.0015 |
| 2.4859 | 5285 | 0.0044 |
| 2.4864 | 5286 | 0.0029 |
| 2.4868 | 5287 | 0.0033 |
| 2.4873 | 5288 | 0.0018 |
| 2.4878 | 5289 | 0.003 |
| 2.4882 | 5290 | 0.0101 |
| 2.4887 | 5291 | 0.0051 |
| 2.4892 | 5292 | 0.0022 |
| 2.4897 | 5293 | 0.0017 |
| 2.4901 | 5294 | 0.002 |
| 2.4906 | 5295 | 0.0051 |
| 2.4911 | 5296 | 0.0044 |
| 2.4915 | 5297 | 0.0039 |
| 2.4920 | 5298 | 0.0052 |
| 2.4925 | 5299 | 0.0021 |
| 2.4929 | 5300 | 0.0035 |
| 2.4934 | 5301 | 0.0033 |
| 2.4939 | 5302 | 0.0027 |
| 2.4944 | 5303 | 0.0063 |
| 2.4948 | 5304 | 0.0021 |
| 2.4953 | 5305 | 0.0015 |
| 2.4958 | 5306 | 0.0044 |
| 2.4962 | 5307 | 0.003 |
| 2.4967 | 5308 | 0.0038 |
| 2.4972 | 5309 | 0.0023 |
| 2.4976 | 5310 | 0.0022 |
| 2.4981 | 5311 | 0.0058 |
| 2.4986 | 5312 | 0.0035 |
| 2.4991 | 5313 | 0.0034 |
| 2.4995 | 5314 | 0.0053 |
| 2.5 | 5315 | 0.0031 |
| 2.5005 | 5316 | 0.0032 |
| 2.5009 | 5317 | 0.0045 |
| 2.5014 | 5318 | 0.0012 |
| 2.5019 | 5319 | 0.0008 |
| 2.5024 | 5320 | 0.002 |
| 2.5028 | 5321 | 0.0041 |
| 2.5033 | 5322 | 0.0013 |
| 2.5038 | 5323 | 0.0037 |
| 2.5042 | 5324 | 0.0037 |
| 2.5047 | 5325 | 0.0032 |
| 2.5052 | 5326 | 0.0054 |
| 2.5056 | 5327 | 0.0037 |
| 2.5061 | 5328 | 0.0032 |
| 2.5066 | 5329 | 0.0023 |
| 2.5071 | 5330 | 0.0042 |
| 2.5075 | 5331 | 0.0028 |
| 2.5080 | 5332 | 0.0031 |
| 2.5085 | 5333 | 0.0035 |
| 2.5089 | 5334 | 0.0021 |
| 2.5094 | 5335 | 0.0017 |
| 2.5099 | 5336 | 0.0016 |
| 2.5103 | 5337 | 0.005 |
| 2.5108 | 5338 | 0.0025 |
| 2.5113 | 5339 | 0.0041 |
| 2.5118 | 5340 | 0.0056 |
| 2.5122 | 5341 | 0.0034 |
| 2.5127 | 5342 | 0.0019 |
| 2.5132 | 5343 | 0.0029 |
| 2.5136 | 5344 | 0.0042 |
| 2.5141 | 5345 | 0.0043 |
| 2.5146 | 5346 | 0.0064 |
| 2.5151 | 5347 | 0.0021 |
| 2.5155 | 5348 | 0.0027 |
| 2.5160 | 5349 | 0.0015 |
| 2.5165 | 5350 | 0.0041 |
| 2.5169 | 5351 | 0.0035 |
| 2.5174 | 5352 | 0.0022 |
| 2.5179 | 5353 | 0.0021 |
| 2.5183 | 5354 | 0.003 |
| 2.5188 | 5355 | 0.0018 |
| 2.5193 | 5356 | 0.0036 |
| 2.5198 | 5357 | 0.0027 |
| 2.5202 | 5358 | 0.0016 |
| 2.5207 | 5359 | 0.0026 |
| 2.5212 | 5360 | 0.0039 |
| 2.5216 | 5361 | 0.0025 |
| 2.5221 | 5362 | 0.0041 |
| 2.5226 | 5363 | 0.0025 |
| 2.5230 | 5364 | 0.005 |
| 2.5235 | 5365 | 0.0032 |
| 2.5240 | 5366 | 0.0026 |
| 2.5245 | 5367 | 0.0056 |
| 2.5249 | 5368 | 0.0013 |
| 2.5254 | 5369 | 0.0043 |
| 2.5259 | 5370 | 0.0037 |
| 2.5263 | 5371 | 0.0038 |
| 2.5268 | 5372 | 0.0035 |
| 2.5273 | 5373 | 0.0046 |
| 2.5278 | 5374 | 0.0033 |
| 2.5282 | 5375 | 0.0034 |
| 2.5287 | 5376 | 0.0027 |
| 2.5292 | 5377 | 0.0034 |
| 2.5296 | 5378 | 0.003 |
| 2.5301 | 5379 | 0.0019 |
| 2.5306 | 5380 | 0.002 |
| 2.5310 | 5381 | 0.0037 |
| 2.5315 | 5382 | 0.0015 |
| 2.5320 | 5383 | 0.0043 |
| 2.5325 | 5384 | 0.0058 |
| 2.5329 | 5385 | 0.0031 |
| 2.5334 | 5386 | 0.0045 |
| 2.5339 | 5387 | 0.0016 |
| 2.5343 | 5388 | 0.0016 |
| 2.5348 | 5389 | 0.0019 |
| 2.5353 | 5390 | 0.0097 |
| 2.5357 | 5391 | 0.0014 |
| 2.5362 | 5392 | 0.0051 |
| 2.5367 | 5393 | 0.0041 |
| 2.5372 | 5394 | 0.0018 |
| 2.5376 | 5395 | 0.0041 |
| 2.5381 | 5396 | 0.0014 |
| 2.5386 | 5397 | 0.0032 |
| 2.5390 | 5398 | 0.0041 |
| 2.5395 | 5399 | 0.0027 |
| 2.5400 | 5400 | 0.0035 |
| 2.5405 | 5401 | 0.0019 |
| 2.5409 | 5402 | 0.0014 |
| 2.5414 | 5403 | 0.0032 |
| 2.5419 | 5404 | 0.0017 |
| 2.5423 | 5405 | 0.0029 |
| 2.5428 | 5406 | 0.0045 |
| 2.5433 | 5407 | 0.0019 |
| 2.5437 | 5408 | 0.0029 |
| 2.5442 | 5409 | 0.0035 |
| 2.5447 | 5410 | 0.0047 |
| 2.5452 | 5411 | 0.0035 |
| 2.5456 | 5412 | 0.0062 |
| 2.5461 | 5413 | 0.0032 |
| 2.5466 | 5414 | 0.0025 |
| 2.5470 | 5415 | 0.003 |
| 2.5475 | 5416 | 0.0019 |
| 2.5480 | 5417 | 0.0021 |
| 2.5484 | 5418 | 0.0029 |
| 2.5489 | 5419 | 0.0019 |
| 2.5494 | 5420 | 0.0024 |
| 2.5499 | 5421 | 0.0081 |
| 2.5503 | 5422 | 0.0028 |
| 2.5508 | 5423 | 0.0024 |
| 2.5513 | 5424 | 0.0025 |
| 2.5517 | 5425 | 0.0019 |
| 2.5522 | 5426 | 0.0025 |
| 2.5527 | 5427 | 0.0033 |
| 2.5532 | 5428 | 0.0022 |
| 2.5536 | 5429 | 0.0021 |
| 2.5541 | 5430 | 0.0034 |
| 2.5546 | 5431 | 0.0019 |
| 2.5550 | 5432 | 0.0051 |
| 2.5555 | 5433 | 0.002 |
| 2.5560 | 5434 | 0.0061 |
| 2.5564 | 5435 | 0.0058 |
| 2.5569 | 5436 | 0.0019 |
| 2.5574 | 5437 | 0.0017 |
| 2.5579 | 5438 | 0.002 |
| 2.5583 | 5439 | 0.0042 |
| 2.5588 | 5440 | 0.003 |
| 2.5593 | 5441 | 0.0034 |
| 2.5597 | 5442 | 0.0027 |
| 2.5602 | 5443 | 0.0029 |
| 2.5607 | 5444 | 0.003 |
| 2.5611 | 5445 | 0.0026 |
| 2.5616 | 5446 | 0.0022 |
| 2.5621 | 5447 | 0.0019 |
| 2.5626 | 5448 | 0.0025 |
| 2.5630 | 5449 | 0.0027 |
| 2.5635 | 5450 | 0.0038 |
| 2.5640 | 5451 | 0.0026 |
| 2.5644 | 5452 | 0.0009 |
| 2.5649 | 5453 | 0.0034 |
| 2.5654 | 5454 | 0.0024 |
| 2.5659 | 5455 | 0.0063 |
| 2.5663 | 5456 | 0.0038 |
| 2.5668 | 5457 | 0.0031 |
| 2.5673 | 5458 | 0.0032 |
| 2.5677 | 5459 | 0.0033 |
| 2.5682 | 5460 | 0.0036 |
| 2.5687 | 5461 | 0.0024 |
| 2.5691 | 5462 | 0.0018 |
| 2.5696 | 5463 | 0.0059 |
| 2.5701 | 5464 | 0.0024 |
| 2.5706 | 5465 | 0.0042 |
| 2.5710 | 5466 | 0.0028 |
| 2.5715 | 5467 | 0.0015 |
| 2.5720 | 5468 | 0.0023 |
| 2.5724 | 5469 | 0.0054 |
| 2.5729 | 5470 | 0.0027 |
| 2.5734 | 5471 | 0.0025 |
| 2.5738 | 5472 | 0.0018 |
| 2.5743 | 5473 | 0.003 |
| 2.5748 | 5474 | 0.0028 |
| 2.5753 | 5475 | 0.0015 |
| 2.5757 | 5476 | 0.002 |
| 2.5762 | 5477 | 0.0038 |
| 2.5767 | 5478 | 0.0027 |
| 2.5771 | 5479 | 0.0025 |
| 2.5776 | 5480 | 0.0083 |
| 2.5781 | 5481 | 0.0026 |
| 2.5786 | 5482 | 0.0044 |
| 2.5790 | 5483 | 0.0025 |
| 2.5795 | 5484 | 0.0058 |
| 2.5800 | 5485 | 0.0034 |
| 2.5804 | 5486 | 0.0044 |
| 2.5809 | 5487 | 0.0026 |
| 2.5814 | 5488 | 0.0026 |
| 2.5818 | 5489 | 0.0026 |
| 2.5823 | 5490 | 0.0026 |
| 2.5828 | 5491 | 0.0029 |
| 2.5833 | 5492 | 0.0033 |
| 2.5837 | 5493 | 0.0018 |
| 2.5842 | 5494 | 0.0045 |
| 2.5847 | 5495 | 0.0025 |
| 2.5851 | 5496 | 0.0011 |
| 2.5856 | 5497 | 0.0023 |
| 2.5861 | 5498 | 0.0023 |
| 2.5865 | 5499 | 0.0076 |
| 2.5870 | 5500 | 0.0025 |
| 2.5875 | 5501 | 0.0029 |
| 2.5880 | 5502 | 0.0074 |
| 2.5884 | 5503 | 0.0051 |
| 2.5889 | 5504 | 0.0033 |
| 2.5894 | 5505 | 0.0035 |
| 2.5898 | 5506 | 0.0018 |
| 2.5903 | 5507 | 0.0014 |
| 2.5908 | 5508 | 0.0036 |
| 2.5913 | 5509 | 0.0013 |
| 2.5917 | 5510 | 0.0051 |
| 2.5922 | 5511 | 0.0025 |
| 2.5927 | 5512 | 0.0023 |
| 2.5931 | 5513 | 0.0025 |
| 2.5936 | 5514 | 0.0031 |
| 2.5941 | 5515 | 0.0012 |
| 2.5945 | 5516 | 0.002 |
| 2.5950 | 5517 | 0.0024 |
| 2.5955 | 5518 | 0.0025 |
| 2.5960 | 5519 | 0.0022 |
| 2.5964 | 5520 | 0.0023 |
| 2.5969 | 5521 | 0.0062 |
| 2.5974 | 5522 | 0.007 |
| 2.5978 | 5523 | 0.0045 |
| 2.5983 | 5524 | 0.0075 |
| 2.5988 | 5525 | 0.0049 |
| 2.5992 | 5526 | 0.0034 |
| 2.5997 | 5527 | 0.0043 |
| 2.6002 | 5528 | 0.0023 |
| 2.6007 | 5529 | 0.0023 |
| 2.6011 | 5530 | 0.0039 |
| 2.6016 | 5531 | 0.002 |
| 2.6021 | 5532 | 0.0022 |
| 2.6025 | 5533 | 0.0029 |
| 2.6030 | 5534 | 0.0018 |
| 2.6035 | 5535 | 0.0041 |
| 2.6040 | 5536 | 0.0024 |
| 2.6044 | 5537 | 0.0014 |
| 2.6049 | 5538 | 0.0043 |
| 2.6054 | 5539 | 0.0037 |
| 2.6058 | 5540 | 0.002 |
| 2.6063 | 5541 | 0.004 |
| 2.6068 | 5542 | 0.002 |
| 2.6072 | 5543 | 0.0014 |
| 2.6077 | 5544 | 0.0033 |
| 2.6082 | 5545 | 0.0028 |
| 2.6087 | 5546 | 0.004 |
| 2.6091 | 5547 | 0.005 |
| 2.6096 | 5548 | 0.0044 |
| 2.6101 | 5549 | 0.0021 |
| 2.6105 | 5550 | 0.0025 |
| 2.6110 | 5551 | 0.0028 |
| 2.6115 | 5552 | 0.0058 |
| 2.6119 | 5553 | 0.0019 |
| 2.6124 | 5554 | 0.0025 |
| 2.6129 | 5555 | 0.0025 |
| 2.6134 | 5556 | 0.0042 |
| 2.6138 | 5557 | 0.004 |
| 2.6143 | 5558 | 0.004 |
| 2.6148 | 5559 | 0.0026 |
| 2.6152 | 5560 | 0.0015 |
| 2.6157 | 5561 | 0.0025 |
| 2.6162 | 5562 | 0.0015 |
| 2.6167 | 5563 | 0.002 |
| 2.6171 | 5564 | 0.0027 |
| 2.6176 | 5565 | 0.0034 |
| 2.6181 | 5566 | 0.0032 |
| 2.6185 | 5567 | 0.0014 |
| 2.6190 | 5568 | 0.0039 |
| 2.6195 | 5569 | 0.0022 |
| 2.6199 | 5570 | 0.0046 |
| 2.6204 | 5571 | 0.0034 |
| 2.6209 | 5572 | 0.0022 |
| 2.6214 | 5573 | 0.0027 |
| 2.6218 | 5574 | 0.0028 |
| 2.6223 | 5575 | 0.0026 |
| 2.6228 | 5576 | 0.0036 |
| 2.6232 | 5577 | 0.0022 |
| 2.6237 | 5578 | 0.0036 |
| 2.6242 | 5579 | 0.0044 |
| 2.6246 | 5580 | 0.0021 |
| 2.6251 | 5581 | 0.0033 |
| 2.6256 | 5582 | 0.0029 |
| 2.6261 | 5583 | 0.0023 |
| 2.6265 | 5584 | 0.0023 |
| 2.6270 | 5585 | 0.0037 |
| 2.6275 | 5586 | 0.0017 |
| 2.6279 | 5587 | 0.002 |
| 2.6284 | 5588 | 0.0027 |
| 2.6289 | 5589 | 0.0017 |
| 2.6294 | 5590 | 0.0018 |
| 2.6298 | 5591 | 0.0021 |
| 2.6303 | 5592 | 0.0033 |
| 2.6308 | 5593 | 0.0044 |
| 2.6312 | 5594 | 0.0014 |
| 2.6317 | 5595 | 0.0033 |
| 2.6322 | 5596 | 0.0037 |
| 2.6326 | 5597 | 0.0028 |
| 2.6331 | 5598 | 0.0029 |
| 2.6336 | 5599 | 0.0016 |
| 2.6341 | 5600 | 0.0036 |
| 2.6345 | 5601 | 0.0027 |
| 2.6350 | 5602 | 0.003 |
| 2.6355 | 5603 | 0.002 |
| 2.6359 | 5604 | 0.0028 |
| 2.6364 | 5605 | 0.0026 |
| 2.6369 | 5606 | 0.0026 |
| 2.6373 | 5607 | 0.0012 |
| 2.6378 | 5608 | 0.0013 |
| 2.6383 | 5609 | 0.0026 |
| 2.6388 | 5610 | 0.0051 |
| 2.6392 | 5611 | 0.0116 |
| 2.6397 | 5612 | 0.0023 |
| 2.6402 | 5613 | 0.0043 |
| 2.6406 | 5614 | 0.0023 |
| 2.6411 | 5615 | 0.0037 |
| 2.6416 | 5616 | 0.003 |
| 2.6421 | 5617 | 0.0021 |
| 2.6425 | 5618 | 0.0017 |
| 2.6430 | 5619 | 0.0026 |
| 2.6435 | 5620 | 0.0032 |
| 2.6439 | 5621 | 0.0042 |
| 2.6444 | 5622 | 0.0045 |
| 2.6449 | 5623 | 0.0028 |
| 2.6453 | 5624 | 0.0026 |
| 2.6458 | 5625 | 0.0016 |
| 2.6463 | 5626 | 0.0028 |
| 2.6468 | 5627 | 0.0048 |
| 2.6472 | 5628 | 0.0038 |
| 2.6477 | 5629 | 0.0047 |
| 2.6482 | 5630 | 0.003 |
| 2.6486 | 5631 | 0.0043 |
| 2.6491 | 5632 | 0.0019 |
| 2.6496 | 5633 | 0.0023 |
| 2.6500 | 5634 | 0.003 |
| 2.6505 | 5635 | 0.0033 |
| 2.6510 | 5636 | 0.0029 |
| 2.6515 | 5637 | 0.0163 |
| 2.6519 | 5638 | 0.0032 |
| 2.6524 | 5639 | 0.0036 |
| 2.6529 | 5640 | 0.0038 |
| 2.6533 | 5641 | 0.0075 |
| 2.6538 | 5642 | 0.0025 |
| 2.6543 | 5643 | 0.0049 |
| 2.6548 | 5644 | 0.0009 |
| 2.6552 | 5645 | 0.0022 |
| 2.6557 | 5646 | 0.0052 |
| 2.6562 | 5647 | 0.0046 |
| 2.6566 | 5648 | 0.005 |
| 2.6571 | 5649 | 0.0051 |
| 2.6576 | 5650 | 0.0039 |
| 2.6580 | 5651 | 0.0033 |
| 2.6585 | 5652 | 0.0029 |
| 2.6590 | 5653 | 0.0023 |
| 2.6595 | 5654 | 0.0059 |
| 2.6599 | 5655 | 0.0032 |
| 2.6604 | 5656 | 0.0024 |
| 2.6609 | 5657 | 0.003 |
| 2.6613 | 5658 | 0.0038 |
| 2.6618 | 5659 | 0.0018 |
| 2.6623 | 5660 | 0.0033 |
| 2.6627 | 5661 | 0.0019 |
| 2.6632 | 5662 | 0.0022 |
| 2.6637 | 5663 | 0.0028 |
| 2.6642 | 5664 | 0.0019 |
| 2.6646 | 5665 | 0.0058 |
| 2.6651 | 5666 | 0.0029 |
| 2.6656 | 5667 | 0.0019 |
| 2.6660 | 5668 | 0.0055 |
| 2.6665 | 5669 | 0.0019 |
| 2.6670 | 5670 | 0.0024 |
| 2.6675 | 5671 | 0.0025 |
| 2.6679 | 5672 | 0.0011 |
| 2.6684 | 5673 | 0.0032 |
| 2.6689 | 5674 | 0.0028 |
| 2.6693 | 5675 | 0.0019 |
| 2.6698 | 5676 | 0.0026 |
| 2.6703 | 5677 | 0.0046 |
| 2.6707 | 5678 | 0.0053 |
| 2.6712 | 5679 | 0.0029 |
| 2.6717 | 5680 | 0.0016 |
| 2.6722 | 5681 | 0.0041 |
| 2.6726 | 5682 | 0.0036 |
| 2.6731 | 5683 | 0.0022 |
| 2.6736 | 5684 | 0.0016 |
| 2.6740 | 5685 | 0.0028 |
| 2.6745 | 5686 | 0.0047 |
| 2.6750 | 5687 | 0.0017 |
| 2.6754 | 5688 | 0.005 |
| 2.6759 | 5689 | 0.0029 |
| 2.6764 | 5690 | 0.0024 |
| 2.6769 | 5691 | 0.0024 |
| 2.6773 | 5692 | 0.003 |
| 2.6778 | 5693 | 0.0017 |
| 2.6783 | 5694 | 0.0024 |
| 2.6787 | 5695 | 0.0014 |
| 2.6792 | 5696 | 0.0028 |
| 2.6797 | 5697 | 0.003 |
| 2.6802 | 5698 | 0.0023 |
| 2.6806 | 5699 | 0.0026 |
| 2.6811 | 5700 | 0.0017 |
| 2.6816 | 5701 | 0.0025 |
| 2.6820 | 5702 | 0.0018 |
| 2.6825 | 5703 | 0.0038 |
| 2.6830 | 5704 | 0.003 |
| 2.6834 | 5705 | 0.0015 |
| 2.6839 | 5706 | 0.0047 |
| 2.6844 | 5707 | 0.0035 |
| 2.6849 | 5708 | 0.0027 |
| 2.6853 | 5709 | 0.0018 |
| 2.6858 | 5710 | 0.0032 |
| 2.6863 | 5711 | 0.0029 |
| 2.6867 | 5712 | 0.0044 |
| 2.6872 | 5713 | 0.0024 |
| 2.6877 | 5714 | 0.004 |
| 2.6881 | 5715 | 0.0016 |
| 2.6886 | 5716 | 0.0062 |
| 2.6891 | 5717 | 0.0046 |
| 2.6896 | 5718 | 0.0017 |
| 2.6900 | 5719 | 0.0035 |
| 2.6905 | 5720 | 0.0031 |
| 2.6910 | 5721 | 0.0019 |
| 2.6914 | 5722 | 0.0038 |
| 2.6919 | 5723 | 0.0061 |
| 2.6924 | 5724 | 0.004 |
| 2.6929 | 5725 | 0.0035 |
| 2.6933 | 5726 | 0.0047 |
| 2.6938 | 5727 | 0.0032 |
| 2.6943 | 5728 | 0.0088 |
| 2.6947 | 5729 | 0.0032 |
| 2.6952 | 5730 | 0.0044 |
| 2.6957 | 5731 | 0.0062 |
| 2.6961 | 5732 | 0.0034 |
| 2.6966 | 5733 | 0.0017 |
| 2.6971 | 5734 | 0.0022 |
| 2.6976 | 5735 | 0.0027 |
| 2.6980 | 5736 | 0.0049 |
| 2.6985 | 5737 | 0.0042 |
| 2.6990 | 5738 | 0.0021 |
| 2.6994 | 5739 | 0.0027 |
| 2.6999 | 5740 | 0.0022 |
| 2.7004 | 5741 | 0.0042 |
| 2.7008 | 5742 | 0.0024 |
| 2.7013 | 5743 | 0.0022 |
| 2.7018 | 5744 | 0.0015 |
| 2.7023 | 5745 | 0.0014 |
| 2.7027 | 5746 | 0.0014 |
| 2.7032 | 5747 | 0.002 |
| 2.7037 | 5748 | 0.0034 |
| 2.7041 | 5749 | 0.0049 |
| 2.7046 | 5750 | 0.0028 |
| 2.7051 | 5751 | 0.002 |
| 2.7056 | 5752 | 0.0025 |
| 2.7060 | 5753 | 0.003 |
| 2.7065 | 5754 | 0.0036 |
| 2.7070 | 5755 | 0.0028 |
| 2.7074 | 5756 | 0.0029 |
| 2.7079 | 5757 | 0.0031 |
| 2.7084 | 5758 | 0.0036 |
| 2.7088 | 5759 | 0.002 |
| 2.7093 | 5760 | 0.0038 |
| 2.7098 | 5761 | 0.0026 |
| 2.7103 | 5762 | 0.0023 |
| 2.7107 | 5763 | 0.0024 |
| 2.7112 | 5764 | 0.0038 |
| 2.7117 | 5765 | 0.0024 |
| 2.7121 | 5766 | 0.0031 |
| 2.7126 | 5767 | 0.0077 |
| 2.7131 | 5768 | 0.0028 |
| 2.7135 | 5769 | 0.0023 |
| 2.7140 | 5770 | 0.0017 |
| 2.7145 | 5771 | 0.0041 |
| 2.7150 | 5772 | 0.0025 |
| 2.7154 | 5773 | 0.0016 |
| 2.7159 | 5774 | 0.0026 |
| 2.7164 | 5775 | 0.0034 |
| 2.7168 | 5776 | 0.0022 |
| 2.7173 | 5777 | 0.0042 |
| 2.7178 | 5778 | 0.0037 |
| 2.7183 | 5779 | 0.0047 |
| 2.7187 | 5780 | 0.0045 |
| 2.7192 | 5781 | 0.0034 |
| 2.7197 | 5782 | 0.004 |
| 2.7201 | 5783 | 0.0044 |
| 2.7206 | 5784 | 0.0035 |
| 2.7211 | 5785 | 0.0025 |
| 2.7215 | 5786 | 0.0031 |
| 2.7220 | 5787 | 0.004 |
| 2.7225 | 5788 | 0.0025 |
| 2.7230 | 5789 | 0.0048 |
| 2.7234 | 5790 | 0.0024 |
| 2.7239 | 5791 | 0.0018 |
| 2.7244 | 5792 | 0.0044 |
| 2.7248 | 5793 | 0.0016 |
| 2.7253 | 5794 | 0.0024 |
| 2.7258 | 5795 | 0.0015 |
| 2.7262 | 5796 | 0.0052 |
| 2.7267 | 5797 | 0.0036 |
| 2.7272 | 5798 | 0.0063 |
| 2.7277 | 5799 | 0.0019 |
| 2.7281 | 5800 | 0.0031 |
| 2.7286 | 5801 | 0.0039 |
| 2.7291 | 5802 | 0.0037 |
| 2.7295 | 5803 | 0.0028 |
| 2.7300 | 5804 | 0.0015 |
| 2.7305 | 5805 | 0.0025 |
| 2.7310 | 5806 | 0.0014 |
| 2.7314 | 5807 | 0.0011 |
| 2.7319 | 5808 | 0.0018 |
| 2.7324 | 5809 | 0.0028 |
| 2.7328 | 5810 | 0.0068 |
| 2.7333 | 5811 | 0.0036 |
| 2.7338 | 5812 | 0.0024 |
| 2.7342 | 5813 | 0.0026 |
| 2.7347 | 5814 | 0.0016 |
| 2.7352 | 5815 | 0.0035 |
| 2.7357 | 5816 | 0.0029 |
| 2.7361 | 5817 | 0.0023 |
| 2.7366 | 5818 | 0.0024 |
| 2.7371 | 5819 | 0.0029 |
| 2.7375 | 5820 | 0.0033 |
| 2.7380 | 5821 | 0.0033 |
| 2.7385 | 5822 | 0.0022 |
| 2.7389 | 5823 | 0.0014 |
| 2.7394 | 5824 | 0.0042 |
| 2.7399 | 5825 | 0.0029 |
| 2.7404 | 5826 | 0.0024 |
| 2.7408 | 5827 | 0.0039 |
| 2.7413 | 5828 | 0.0021 |
| 2.7418 | 5829 | 0.0044 |
| 2.7422 | 5830 | 0.0028 |
| 2.7427 | 5831 | 0.0033 |
| 2.7432 | 5832 | 0.0036 |
| 2.7437 | 5833 | 0.0035 |
| 2.7441 | 5834 | 0.003 |
| 2.7446 | 5835 | 0.0022 |
| 2.7451 | 5836 | 0.0027 |
| 2.7455 | 5837 | 0.0027 |
| 2.7460 | 5838 | 0.0017 |
| 2.7465 | 5839 | 0.0027 |
| 2.7469 | 5840 | 0.0012 |
| 2.7474 | 5841 | 0.0029 |
| 2.7479 | 5842 | 0.0025 |
| 2.7484 | 5843 | 0.0027 |
| 2.7488 | 5844 | 0.0028 |
| 2.7493 | 5845 | 0.0033 |
| 2.7498 | 5846 | 0.0071 |
| 2.7502 | 5847 | 0.0028 |
| 2.7507 | 5848 | 0.0026 |
| 2.7512 | 5849 | 0.002 |
| 2.7516 | 5850 | 0.0015 |
| 2.7521 | 5851 | 0.0066 |
| 2.7526 | 5852 | 0.0064 |
| 2.7531 | 5853 | 0.0017 |
| 2.7535 | 5854 | 0.0018 |
| 2.7540 | 5855 | 0.002 |
| 2.7545 | 5856 | 0.0038 |
| 2.7549 | 5857 | 0.0038 |
| 2.7554 | 5858 | 0.0024 |
| 2.7559 | 5859 | 0.0029 |
| 2.7563 | 5860 | 0.0016 |
| 2.7568 | 5861 | 0.0021 |
| 2.7573 | 5862 | 0.0028 |
| 2.7578 | 5863 | 0.0044 |
| 2.7582 | 5864 | 0.002 |
| 2.7587 | 5865 | 0.003 |
| 2.7592 | 5866 | 0.0042 |
| 2.7596 | 5867 | 0.002 |
| 2.7601 | 5868 | 0.004 |
| 2.7606 | 5869 | 0.003 |
| 2.7611 | 5870 | 0.005 |
| 2.7615 | 5871 | 0.0033 |
| 2.7620 | 5872 | 0.0029 |
| 2.7625 | 5873 | 0.0038 |
| 2.7629 | 5874 | 0.004 |
| 2.7634 | 5875 | 0.0052 |
| 2.7639 | 5876 | 0.0029 |
| 2.7643 | 5877 | 0.0037 |
| 2.7648 | 5878 | 0.0026 |
| 2.7653 | 5879 | 0.0048 |
| 2.7658 | 5880 | 0.0032 |
| 2.7662 | 5881 | 0.0016 |
| 2.7667 | 5882 | 0.003 |
| 2.7672 | 5883 | 0.0022 |
| 2.7676 | 5884 | 0.0021 |
| 2.7681 | 5885 | 0.0023 |
| 2.7686 | 5886 | 0.0041 |
| 2.7690 | 5887 | 0.0027 |
| 2.7695 | 5888 | 0.0024 |
| 2.7700 | 5889 | 0.0028 |
| 2.7705 | 5890 | 0.0024 |
| 2.7709 | 5891 | 0.002 |
| 2.7714 | 5892 | 0.0047 |
| 2.7719 | 5893 | 0.0034 |
| 2.7723 | 5894 | 0.0019 |
| 2.7728 | 5895 | 0.0022 |
| 2.7733 | 5896 | 0.0024 |
| 2.7738 | 5897 | 0.003 |
| 2.7742 | 5898 | 0.0033 |
| 2.7747 | 5899 | 0.0012 |
| 2.7752 | 5900 | 0.0024 |
| 2.7756 | 5901 | 0.0017 |
| 2.7761 | 5902 | 0.0015 |
| 2.7766 | 5903 | 0.0023 |
| 2.7770 | 5904 | 0.0016 |
| 2.7775 | 5905 | 0.0045 |
| 2.7780 | 5906 | 0.0028 |
| 2.7785 | 5907 | 0.0043 |
| 2.7789 | 5908 | 0.0018 |
| 2.7794 | 5909 | 0.0055 |
| 2.7799 | 5910 | 0.0018 |
| 2.7803 | 5911 | 0.0052 |
| 2.7808 | 5912 | 0.0033 |
| 2.7813 | 5913 | 0.0043 |
| 2.7817 | 5914 | 0.0017 |
| 2.7822 | 5915 | 0.0028 |
| 2.7827 | 5916 | 0.0036 |
| 2.7832 | 5917 | 0.0025 |
| 2.7836 | 5918 | 0.0036 |
| 2.7841 | 5919 | 0.0018 |
| 2.7846 | 5920 | 0.0027 |
| 2.7850 | 5921 | 0.0023 |
| 2.7855 | 5922 | 0.0033 |
| 2.7860 | 5923 | 0.003 |
| 2.7865 | 5924 | 0.004 |
| 2.7869 | 5925 | 0.0025 |
| 2.7874 | 5926 | 0.0019 |
| 2.7879 | 5927 | 0.0034 |
| 2.7883 | 5928 | 0.0026 |
| 2.7888 | 5929 | 0.003 |
| 2.7893 | 5930 | 0.0043 |
| 2.7897 | 5931 | 0.0031 |
| 2.7902 | 5932 | 0.0026 |
| 2.7907 | 5933 | 0.0019 |
| 2.7912 | 5934 | 0.0026 |
| 2.7916 | 5935 | 0.0046 |
| 2.7921 | 5936 | 0.0027 |
| 2.7926 | 5937 | 0.0038 |
| 2.7930 | 5938 | 0.0045 |
| 2.7935 | 5939 | 0.0012 |
| 2.7940 | 5940 | 0.0028 |
| 2.7944 | 5941 | 0.0042 |
| 2.7949 | 5942 | 0.0039 |
| 2.7954 | 5943 | 0.0029 |
| 2.7959 | 5944 | 0.003 |
| 2.7963 | 5945 | 0.0023 |
| 2.7968 | 5946 | 0.0014 |
| 2.7973 | 5947 | 0.0026 |
| 2.7977 | 5948 | 0.0021 |
| 2.7982 | 5949 | 0.0022 |
| 2.7987 | 5950 | 0.0027 |
| 2.7992 | 5951 | 0.0023 |
| 2.7996 | 5952 | 0.0045 |
| 2.8001 | 5953 | 0.0022 |
| 2.8006 | 5954 | 0.0031 |
| 2.8010 | 5955 | 0.003 |
| 2.8015 | 5956 | 0.0034 |
| 2.8020 | 5957 | 0.0036 |
| 2.8024 | 5958 | 0.0057 |
| 2.8029 | 5959 | 0.0064 |
| 2.8034 | 5960 | 0.0083 |
| 2.8039 | 5961 | 0.0011 |
| 2.8043 | 5962 | 0.0048 |
| 2.8048 | 5963 | 0.0029 |
| 2.8053 | 5964 | 0.0015 |
| 2.8057 | 5965 | 0.0032 |
| 2.8062 | 5966 | 0.0038 |
| 2.8067 | 5967 | 0.0026 |
| 2.8071 | 5968 | 0.0067 |
| 2.8076 | 5969 | 0.0031 |
| 2.8081 | 5970 | 0.0044 |
| 2.8086 | 5971 | 0.0026 |
| 2.8090 | 5972 | 0.0046 |
| 2.8095 | 5973 | 0.0019 |
| 2.8100 | 5974 | 0.0028 |
| 2.8104 | 5975 | 0.0042 |
| 2.8109 | 5976 | 0.0025 |
| 2.8114 | 5977 | 0.0024 |
| 2.8119 | 5978 | 0.0024 |
| 2.8123 | 5979 | 0.0018 |
| 2.8128 | 5980 | 0.0029 |
| 2.8133 | 5981 | 0.0029 |
| 2.8137 | 5982 | 0.0019 |
| 2.8142 | 5983 | 0.0026 |
| 2.8147 | 5984 | 0.0023 |
| 2.8151 | 5985 | 0.0023 |
| 2.8156 | 5986 | 0.0035 |
| 2.8161 | 5987 | 0.0024 |
| 2.8166 | 5988 | 0.0023 |
| 2.8170 | 5989 | 0.0035 |
| 2.8175 | 5990 | 0.0052 |
| 2.8180 | 5991 | 0.0031 |
| 2.8184 | 5992 | 0.0029 |
| 2.8189 | 5993 | 0.0014 |
| 2.8194 | 5994 | 0.0016 |
| 2.8198 | 5995 | 0.0028 |
| 2.8203 | 5996 | 0.0032 |
| 2.8208 | 5997 | 0.0036 |
| 2.8213 | 5998 | 0.0026 |
| 2.8217 | 5999 | 0.0097 |
| 2.8222 | 6000 | 0.0078 |
| 2.8227 | 6001 | 0.0034 |
| 2.8231 | 6002 | 0.0023 |
| 2.8236 | 6003 | 0.0019 |
| 2.8241 | 6004 | 0.0037 |
| 2.8246 | 6005 | 0.003 |
| 2.8250 | 6006 | 0.0011 |
| 2.8255 | 6007 | 0.0035 |
| 2.8260 | 6008 | 0.0043 |
| 2.8264 | 6009 | 0.0033 |
| 2.8269 | 6010 | 0.002 |
| 2.8274 | 6011 | 0.0022 |
| 2.8278 | 6012 | 0.0014 |
| 2.8283 | 6013 | 0.0022 |
| 2.8288 | 6014 | 0.0039 |
| 2.8293 | 6015 | 0.0046 |
| 2.8297 | 6016 | 0.003 |
| 2.8302 | 6017 | 0.0021 |
| 2.8307 | 6018 | 0.0016 |
| 2.8311 | 6019 | 0.0019 |
| 2.8316 | 6020 | 0.0059 |
| 2.8321 | 6021 | 0.0028 |
| 2.8325 | 6022 | 0.0022 |
| 2.8330 | 6023 | 0.0032 |
| 2.8335 | 6024 | 0.0049 |
| 2.8340 | 6025 | 0.0053 |
| 2.8344 | 6026 | 0.0028 |
| 2.8349 | 6027 | 0.0025 |
| 2.8354 | 6028 | 0.0032 |
| 2.8358 | 6029 | 0.0029 |
| 2.8363 | 6030 | 0.0026 |
| 2.8368 | 6031 | 0.0022 |
| 2.8373 | 6032 | 0.0022 |
| 2.8377 | 6033 | 0.0031 |
| 2.8382 | 6034 | 0.0041 |
| 2.8387 | 6035 | 0.0021 |
| 2.8391 | 6036 | 0.0016 |
| 2.8396 | 6037 | 0.0032 |
| 2.8401 | 6038 | 0.0008 |
| 2.8405 | 6039 | 0.0021 |
| 2.8410 | 6040 | 0.0023 |
| 2.8415 | 6041 | 0.0018 |
| 2.8420 | 6042 | 0.0021 |
| 2.8424 | 6043 | 0.0061 |
| 2.8429 | 6044 | 0.0024 |
| 2.8434 | 6045 | 0.0033 |
| 2.8438 | 6046 | 0.003 |
| 2.8443 | 6047 | 0.0029 |
| 2.8448 | 6048 | 0.0032 |
| 2.8452 | 6049 | 0.0031 |
| 2.8457 | 6050 | 0.0017 |
| 2.8462 | 6051 | 0.0015 |
| 2.8467 | 6052 | 0.0029 |
| 2.8471 | 6053 | 0.0037 |
| 2.8476 | 6054 | 0.0053 |
| 2.8481 | 6055 | 0.0016 |
| 2.8485 | 6056 | 0.0026 |
| 2.8490 | 6057 | 0.0023 |
| 2.8495 | 6058 | 0.0031 |
| 2.8500 | 6059 | 0.0025 |
| 2.8504 | 6060 | 0.0026 |
| 2.8509 | 6061 | 0.0022 |
| 2.8514 | 6062 | 0.002 |
| 2.8518 | 6063 | 0.0023 |
| 2.8523 | 6064 | 0.0025 |
| 2.8528 | 6065 | 0.0022 |
| 2.8532 | 6066 | 0.0044 |
| 2.8537 | 6067 | 0.0027 |
| 2.8542 | 6068 | 0.002 |
| 2.8547 | 6069 | 0.0022 |
| 2.8551 | 6070 | 0.0026 |
| 2.8556 | 6071 | 0.0033 |
| 2.8561 | 6072 | 0.0011 |
| 2.8565 | 6073 | 0.0046 |
| 2.8570 | 6074 | 0.0034 |
| 2.8575 | 6075 | 0.0019 |
| 2.8579 | 6076 | 0.0009 |
| 2.8584 | 6077 | 0.0032 |
| 2.8589 | 6078 | 0.0037 |
| 2.8594 | 6079 | 0.0023 |
| 2.8598 | 6080 | 0.0027 |
| 2.8603 | 6081 | 0.0042 |
| 2.8608 | 6082 | 0.0085 |
| 2.8612 | 6083 | 0.0061 |
| 2.8617 | 6084 | 0.003 |
| 2.8622 | 6085 | 0.0031 |
| 2.8627 | 6086 | 0.0023 |
| 2.8631 | 6087 | 0.0018 |
| 2.8636 | 6088 | 0.0023 |
| 2.8641 | 6089 | 0.0033 |
| 2.8645 | 6090 | 0.0027 |
| 2.8650 | 6091 | 0.0025 |
| 2.8655 | 6092 | 0.0015 |
| 2.8659 | 6093 | 0.0051 |
| 2.8664 | 6094 | 0.0032 |
| 2.8669 | 6095 | 0.0017 |
| 2.8674 | 6096 | 0.0022 |
| 2.8678 | 6097 | 0.002 |
| 2.8683 | 6098 | 0.0018 |
| 2.8688 | 6099 | 0.0057 |
| 2.8692 | 6100 | 0.0028 |
| 2.8697 | 6101 | 0.0035 |
| 2.8702 | 6102 | 0.0025 |
| 2.8706 | 6103 | 0.0031 |
| 2.8711 | 6104 | 0.0026 |
| 2.8716 | 6105 | 0.003 |
| 2.8721 | 6106 | 0.0036 |
| 2.8725 | 6107 | 0.0034 |
| 2.8730 | 6108 | 0.0027 |
| 2.8735 | 6109 | 0.0018 |
| 2.8739 | 6110 | 0.0051 |
| 2.8744 | 6111 | 0.0069 |
| 2.8749 | 6112 | 0.0036 |
| 2.8754 | 6113 | 0.0015 |
| 2.8758 | 6114 | 0.0022 |
| 2.8763 | 6115 | 0.0022 |
| 2.8768 | 6116 | 0.0032 |
| 2.8772 | 6117 | 0.0016 |
| 2.8777 | 6118 | 0.0039 |
| 2.8782 | 6119 | 0.0023 |
| 2.8786 | 6120 | 0.0043 |
| 2.8791 | 6121 | 0.0013 |
| 2.8796 | 6122 | 0.0031 |
| 2.8801 | 6123 | 0.0026 |
| 2.8805 | 6124 | 0.0045 |
| 2.8810 | 6125 | 0.0027 |
| 2.8815 | 6126 | 0.0052 |
| 2.8819 | 6127 | 0.0053 |
| 2.8824 | 6128 | 0.0024 |
| 2.8829 | 6129 | 0.0023 |
| 2.8833 | 6130 | 0.0016 |
| 2.8838 | 6131 | 0.0035 |
| 2.8843 | 6132 | 0.005 |
| 2.8848 | 6133 | 0.0035 |
| 2.8852 | 6134 | 0.0029 |
| 2.8857 | 6135 | 0.0029 |
| 2.8862 | 6136 | 0.0026 |
| 2.8866 | 6137 | 0.0014 |
| 2.8871 | 6138 | 0.003 |
| 2.8876 | 6139 | 0.0038 |
| 2.8881 | 6140 | 0.003 |
| 2.8885 | 6141 | 0.0033 |
| 2.8890 | 6142 | 0.0019 |
| 2.8895 | 6143 | 0.0031 |
| 2.8899 | 6144 | 0.0017 |
| 2.8904 | 6145 | 0.0026 |
| 2.8909 | 6146 | 0.0024 |
| 2.8913 | 6147 | 0.0022 |
| 2.8918 | 6148 | 0.0023 |
| 2.8923 | 6149 | 0.0039 |
| 2.8928 | 6150 | 0.0037 |
| 2.8932 | 6151 | 0.0037 |
| 2.8937 | 6152 | 0.0025 |
| 2.8942 | 6153 | 0.0023 |
| 2.8946 | 6154 | 0.005 |
| 2.8951 | 6155 | 0.0016 |
| 2.8956 | 6156 | 0.0037 |
| 2.8960 | 6157 | 0.0024 |
| 2.8965 | 6158 | 0.0034 |
| 2.8970 | 6159 | 0.0039 |
| 2.8975 | 6160 | 0.0027 |
| 2.8979 | 6161 | 0.0021 |
| 2.8984 | 6162 | 0.0026 |
| 2.8989 | 6163 | 0.0024 |
| 2.8993 | 6164 | 0.0023 |
| 2.8998 | 6165 | 0.0028 |
| 2.9003 | 6166 | 0.0042 |
| 2.9008 | 6167 | 0.0036 |
| 2.9012 | 6168 | 0.0022 |
| 2.9017 | 6169 | 0.0027 |
| 2.9022 | 6170 | 0.0069 |
| 2.9026 | 6171 | 0.0016 |
| 2.9031 | 6172 | 0.0029 |
| 2.9036 | 6173 | 0.0045 |
| 2.9040 | 6174 | 0.005 |
| 2.9045 | 6175 | 0.0023 |
| 2.9050 | 6176 | 0.0051 |
| 2.9055 | 6177 | 0.0028 |
| 2.9059 | 6178 | 0.0043 |
| 2.9064 | 6179 | 0.0023 |
| 2.9069 | 6180 | 0.0024 |
| 2.9073 | 6181 | 0.0015 |
| 2.9078 | 6182 | 0.0028 |
| 2.9083 | 6183 | 0.0033 |
| 2.9087 | 6184 | 0.0025 |
| 2.9092 | 6185 | 0.0034 |
| 2.9097 | 6186 | 0.0046 |
| 2.9102 | 6187 | 0.0018 |
| 2.9106 | 6188 | 0.0035 |
| 2.9111 | 6189 | 0.0055 |
| 2.9116 | 6190 | 0.0022 |
| 2.9120 | 6191 | 0.0023 |
| 2.9125 | 6192 | 0.0045 |
| 2.9130 | 6193 | 0.0028 |
| 2.9135 | 6194 | 0.0024 |
| 2.9139 | 6195 | 0.0022 |
| 2.9144 | 6196 | 0.0017 |
| 2.9149 | 6197 | 0.0032 |
| 2.9153 | 6198 | 0.0024 |
| 2.9158 | 6199 | 0.002 |
| 2.9163 | 6200 | 0.0026 |
| 2.9167 | 6201 | 0.0018 |
| 2.9172 | 6202 | 0.0022 |
| 2.9177 | 6203 | 0.0022 |
| 2.9182 | 6204 | 0.0043 |
| 2.9186 | 6205 | 0.0024 |
| 2.9191 | 6206 | 0.0019 |
| 2.9196 | 6207 | 0.0019 |
| 2.9200 | 6208 | 0.0025 |
| 2.9205 | 6209 | 0.0025 |
| 2.9210 | 6210 | 0.0023 |
| 2.9214 | 6211 | 0.0023 |
| 2.9219 | 6212 | 0.0032 |
| 2.9224 | 6213 | 0.003 |
| 2.9229 | 6214 | 0.0041 |
| 2.9233 | 6215 | 0.0023 |
| 2.9238 | 6216 | 0.0024 |
| 2.9243 | 6217 | 0.003 |
| 2.9247 | 6218 | 0.003 |
| 2.9252 | 6219 | 0.0018 |
| 2.9257 | 6220 | 0.0029 |
| 2.9262 | 6221 | 0.0024 |
| 2.9266 | 6222 | 0.0029 |
| 2.9271 | 6223 | 0.003 |
| 2.9276 | 6224 | 0.0021 |
| 2.9280 | 6225 | 0.0041 |
| 2.9285 | 6226 | 0.0018 |
| 2.9290 | 6227 | 0.0022 |
| 2.9294 | 6228 | 0.0021 |
| 2.9299 | 6229 | 0.0029 |
| 2.9304 | 6230 | 0.0019 |
| 2.9309 | 6231 | 0.0023 |
| 2.9313 | 6232 | 0.0046 |
| 2.9318 | 6233 | 0.0023 |
| 2.9323 | 6234 | 0.0048 |
| 2.9327 | 6235 | 0.0044 |
| 2.9332 | 6236 | 0.0041 |
| 2.9337 | 6237 | 0.0042 |
| 2.9341 | 6238 | 0.0034 |
| 2.9346 | 6239 | 0.0018 |
| 2.9351 | 6240 | 0.0032 |
| 2.9356 | 6241 | 0.0054 |
| 2.9360 | 6242 | 0.0032 |
| 2.9365 | 6243 | 0.0027 |
| 2.9370 | 6244 | 0.0021 |
| 2.9374 | 6245 | 0.0027 |
| 2.9379 | 6246 | 0.0025 |
| 2.9384 | 6247 | 0.0036 |
| 2.9389 | 6248 | 0.0023 |
| 2.9393 | 6249 | 0.0021 |
| 2.9398 | 6250 | 0.0037 |
| 2.9403 | 6251 | 0.0106 |
| 2.9407 | 6252 | 0.0044 |
| 2.9412 | 6253 | 0.0015 |
| 2.9417 | 6254 | 0.0032 |
| 2.9421 | 6255 | 0.0039 |
| 2.9426 | 6256 | 0.0016 |
| 2.9431 | 6257 | 0.0015 |
| 2.9436 | 6258 | 0.0035 |
| 2.9440 | 6259 | 0.0016 |
| 2.9445 | 6260 | 0.0037 |
| 2.9450 | 6261 | 0.0026 |
| 2.9454 | 6262 | 0.0038 |
| 2.9459 | 6263 | 0.0016 |
| 2.9464 | 6264 | 0.0034 |
| 2.9468 | 6265 | 0.0024 |
| 2.9473 | 6266 | 0.0019 |
| 2.9478 | 6267 | 0.0016 |
| 2.9483 | 6268 | 0.0044 |
| 2.9487 | 6269 | 0.0031 |
| 2.9492 | 6270 | 0.0017 |
| 2.9497 | 6271 | 0.0037 |
| 2.9501 | 6272 | 0.0021 |
| 2.9506 | 6273 | 0.0026 |
| 2.9511 | 6274 | 0.0027 |
| 2.9516 | 6275 | 0.0018 |
| 2.9520 | 6276 | 0.0032 |
| 2.9525 | 6277 | 0.0028 |
| 2.9530 | 6278 | 0.0021 |
| 2.9534 | 6279 | 0.0051 |
| 2.9539 | 6280 | 0.003 |
| 2.9544 | 6281 | 0.0025 |
| 2.9548 | 6282 | 0.0043 |
| 2.9553 | 6283 | 0.0018 |
| 2.9558 | 6284 | 0.0042 |
| 2.9563 | 6285 | 0.0025 |
| 2.9567 | 6286 | 0.0023 |
| 2.9572 | 6287 | 0.003 |
| 2.9577 | 6288 | 0.003 |
| 2.9581 | 6289 | 0.0049 |
| 2.9586 | 6290 | 0.0026 |
| 2.9591 | 6291 | 0.006 |
| 2.9595 | 6292 | 0.0034 |
| 2.9600 | 6293 | 0.0028 |
| 2.9605 | 6294 | 0.0037 |
| 2.9610 | 6295 | 0.0021 |
| 2.9614 | 6296 | 0.0038 |
| 2.9619 | 6297 | 0.0022 |
| 2.9624 | 6298 | 0.0013 |
| 2.9628 | 6299 | 0.0041 |
| 2.9633 | 6300 | 0.0052 |
| 2.9638 | 6301 | 0.0031 |
| 2.9643 | 6302 | 0.002 |
| 2.9647 | 6303 | 0.0009 |
| 2.9652 | 6304 | 0.0026 |
| 2.9657 | 6305 | 0.0021 |
| 2.9661 | 6306 | 0.0021 |
| 2.9666 | 6307 | 0.0031 |
| 2.9671 | 6308 | 0.001 |
| 2.9675 | 6309 | 0.003 |
| 2.9680 | 6310 | 0.0041 |
| 2.9685 | 6311 | 0.003 |
| 2.9690 | 6312 | 0.0027 |
| 2.9694 | 6313 | 0.0014 |
| 2.9699 | 6314 | 0.0027 |
| 2.9704 | 6315 | 0.0026 |
| 2.9708 | 6316 | 0.0031 |
| 2.9713 | 6317 | 0.0019 |
| 2.9718 | 6318 | 0.0021 |
| 2.9722 | 6319 | 0.0025 |
| 2.9727 | 6320 | 0.0021 |
| 2.9732 | 6321 | 0.0034 |
| 2.9737 | 6322 | 0.0044 |
| 2.9741 | 6323 | 0.0017 |
| 2.9746 | 6324 | 0.0023 |
| 2.9751 | 6325 | 0.0024 |
| 2.9755 | 6326 | 0.0033 |
| 2.9760 | 6327 | 0.0043 |
| 2.9765 | 6328 | 0.0033 |
| 2.9770 | 6329 | 0.0044 |
| 2.9774 | 6330 | 0.003 |
| 2.9779 | 6331 | 0.0031 |
| 2.9784 | 6332 | 0.0042 |
| 2.9788 | 6333 | 0.0019 |
| 2.9793 | 6334 | 0.0021 |
| 2.9798 | 6335 | 0.0024 |
| 2.9802 | 6336 | 0.0031 |
| 2.9807 | 6337 | 0.0016 |
| 2.9812 | 6338 | 0.0027 |
| 2.9817 | 6339 | 0.0015 |
| 2.9821 | 6340 | 0.0024 |
| 2.9826 | 6341 | 0.0029 |
| 2.9831 | 6342 | 0.0017 |
| 2.9835 | 6343 | 0.0044 |
| 2.9840 | 6344 | 0.0022 |
| 2.9845 | 6345 | 0.0018 |
| 2.9849 | 6346 | 0.0034 |
| 2.9854 | 6347 | 0.0027 |
| 2.9859 | 6348 | 0.0023 |
| 2.9864 | 6349 | 0.0042 |
| 2.9868 | 6350 | 0.0019 |
| 2.9873 | 6351 | 0.0019 |
| 2.9878 | 6352 | 0.0029 |
| 2.9882 | 6353 | 0.0023 |
| 2.9887 | 6354 | 0.003 |
| 2.9892 | 6355 | 0.0029 |
| 2.9897 | 6356 | 0.0027 |
| 2.9901 | 6357 | 0.0011 |
| 2.9906 | 6358 | 0.0045 |
| 2.9911 | 6359 | 0.0024 |
| 2.9915 | 6360 | 0.0027 |
| 2.9920 | 6361 | 0.0017 |
| 2.9925 | 6362 | 0.002 |
| 2.9929 | 6363 | 0.0039 |
| 2.9934 | 6364 | 0.0018 |
| 2.9939 | 6365 | 0.0045 |
| 2.9944 | 6366 | 0.0022 |
| 2.9948 | 6367 | 0.0055 |
| 2.9953 | 6368 | 0.0026 |
| 2.9958 | 6369 | 0.0023 |
| 2.9962 | 6370 | 0.0051 |
| 2.9967 | 6371 | 0.0025 |
| 2.9972 | 6372 | 0.0017 |
| 2.9976 | 6373 | 0.0032 |
| 2.9981 | 6374 | 0.0031 |
| 2.9986 | 6375 | 0.0039 |
| 2.9991 | 6376 | 0.0016 |
| 2.9995 | 6377 | 0.0033 |
| 3.0 | 6378 | 0.0037 |
</details>
### Framework Versions
- Python: 3.10.13
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.4.0+cu121
- Accelerate: 1.1.1
- Datasets: 2.21.0
- Tokenizers: 0.19.1
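With the versions pinned above, the trained model loads through the standard Sentence Transformers API. The repo id below is a placeholder, since the model's Hub id is not restated in this section; substitute the actual id:

```python
from sentence_transformers import SentenceTransformer

# Placeholder repo id: substitute this model's actual Hub id.
model = SentenceTransformer("your-username/your-model")

sentences = [
    "Sentence embeddings map text to dense vectors.",
    "Dense vectors make semantic search possible.",
]
embeddings = model.encode(sentences)                     # shape: (2, embedding_dim)
similarities = model.similarity(embeddings, embeddings)  # cosine similarity by default
print(similarities)
```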
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
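For context, MultipleNegativesRankingLoss is typically wired up as below. This is a generic sketch using the legacy `fit` API (still supported in Sentence Transformers 3.x); the (anchor, positive) pairs are illustrative and are not this model's training data:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # any base model

# Illustrative (anchor, positive) pairs; other in-batch positives act as negatives.
train_examples = [
    InputExample(texts=["What is the capital of France?", "Paris is the capital of France."]),
    InputExample(texts=["How do plants make food?", "Plants produce food via photosynthesis."]),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.MultipleNegativesRankingLoss(model)

model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=10)
```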
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF | mradermacher | 2024-11-14T07:01:11Z | 33 | 0 | transformers | [
"transformers",
"gguf",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"abliterated",
"uncensored",
"en",
"base_model:huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterated",
"base_model:quantized:huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterated",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T23:33:08Z | ---
base_model: huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterated
language:
- en
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterate/blob/main/LICENSE
quantized_by: mradermacher
tags:
- code
- codeqwen
- chat
- qwen
- qwen-coder
- abliterated
- uncensored
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterated
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
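For a quick start in Python, one common route (assuming `llama-cpp-python` and `huggingface_hub` are installed) is to download a quant and load it directly; the filename below matches the Q4_K_M entry in the table that follows:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

path = hf_hub_download(
    repo_id="mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF",
    filename="Qwen2.5-Coder-32B-Instruct-abliterated.Q4_K_M.gguf",
)
llm = Llama(model_path=path, n_ctx=4096)  # raise n_ctx if you have the memory
out = llm("Write a Python function that reverses a string.", max_tokens=128)
print(out["choices"][0]["text"])
```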
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q2_K.gguf) | Q2_K | 12.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q3_K_S.gguf) | Q3_K_S | 14.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q3_K_M.gguf) | Q3_K_M | 16.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q3_K_L.gguf) | Q3_K_L | 17.3 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.IQ4_XS.gguf) | IQ4_XS | 18.0 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q4_K_S.gguf) | Q4_K_S | 18.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q4_K_M.gguf) | Q4_K_M | 20.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q5_K_S.gguf) | Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q5_K_M.gguf) | Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q6_K.gguf) | Q6_K | 27.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-Coder-32B-Instruct-abliterated-GGUF/resolve/main/Qwen2.5-Coder-32B-Instruct-abliterated.Q8_0.gguf) | Q8_0 | 34.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
thanhtan2136/GemSUra-7B-Q4_K_M-GGUF | thanhtan2136 | 2024-11-14T06:51:50Z | 5 | 0 | transformers | [
"transformers",
"gguf",
"llama-factory",
"llama-cpp",
"gguf-my-repo",
"vi",
"en",
"dataset:wikimedia/wikipedia",
"dataset:tinhpx2911/vanhoc_processed",
"base_model:ura-hcmut/GemSUra-7B",
"base_model:quantized:ura-hcmut/GemSUra-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-14T06:51:27Z | ---
library_name: transformers
tags:
- llama-factory
- llama-cpp
- gguf-my-repo
license: apache-2.0
datasets:
- wikimedia/wikipedia
- tinhpx2911/vanhoc_processed
language:
- vi
- en
extra_gated_prompt: Please read the Apache 2 license before accepting it.
extra_gated_fields:
Name: text
Email: text
Affiliation: text
Country: text
I accept the Apache 2 License Agreement: checkbox
base_model: ura-hcmut/GemSUra-7B
---
# thanhtan2136/GemSUra-7B-Q4_K_M-GGUF
This model was converted to GGUF format from [`ura-hcmut/GemSUra-7B`](https://huggingface.co/ura-hcmut/GemSUra-7B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/ura-hcmut/GemSUra-7B) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo thanhtan2136/GemSUra-7B-Q4_K_M-GGUF --hf-file gemsura-7b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo thanhtan2136/GemSUra-7B-Q4_K_M-GGUF --hf-file gemsura-7b-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (for example, `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo thanhtan2136/GemSUra-7B-Q4_K_M-GGUF --hf-file gemsura-7b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo thanhtan2136/GemSUra-7B-Q4_K_M-GGUF --hf-file gemsura-7b-q4_k_m.gguf -c 2048
```
|
nishhaaann/PubMedBERTBoxA | nishhaaann | 2024-11-14T06:51:34Z | 160 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T06:25:41Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
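In lieu of an official snippet, here is a minimal sketch for this text-classification checkpoint; the label names and intended inputs depend on what the fine-tune saved, so treat it as illustrative only:

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="nishhaaann/PubMedBERTBoxA")
# Example biomedical sentence; the label set depends on the fine-tuning task.
print(classifier("The patient was treated with a beta blocker after the procedure."))
```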
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
AhmadIshaqai/my_awesome_food_model | AhmadIshaqai | 2024-11-14T06:47:14Z | 218 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"generated_from_trainer",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-14T06:35:30Z | ---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: my_awesome_food_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6109
- Accuracy: 0.901
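For a quick sanity check, the checkpoint can be queried with the image-classification pipeline (the image path is a placeholder):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="AhmadIshaqai/my_awesome_food_model")
print(classifier("some_food_photo.jpg"))  # placeholder path; returns top classes with scores
```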
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
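For reproducibility, these settings map onto `TrainingArguments` roughly as follows (the output directory is illustrative; everything else mirrors the list above):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_food_model",  # illustrative
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,       # 16 x 4 = effective train batch size of 64
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
)
```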
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7014 | 0.992 | 62 | 2.5097 | 0.847 |
| 1.8804 | 2.0 | 125 | 1.7599 | 0.89 |
| 1.6054 | 2.976 | 186 | 1.6109 | 0.901 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
PranavSharma25/finetuning-sentiment-model-3000-samples | PranavSharma25 | 2024-11-14T06:45:55Z | 104 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T05:48:53Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: finetuning-sentiment-model-3000-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuning-sentiment-model-3000-samples
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6954
- Accuracy: 0.4733
- F1: 0.0920
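The checkpoint loads with the standard pipeline; given the metrics above, treat its predictions with caution:

```python
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="PranavSharma25/finetuning-sentiment-model-3000-samples",
)
print(sentiment("The movie was an absolute delight from start to finish."))
```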
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-50
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
|
DavidAU/Meta-Llama-3-Instruct-12.2B-BRAINSTORM-20x-FORM-8-GGUF | DavidAU | 2024-11-14T06:42:12Z | 24 | 1 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"roleplaying",
"rp",
"horror",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 20x",
"enhanced",
"llama3",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-07-29T06:25:05Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 20x
- enhanced
- llama3
pipeline_tag: text-generation
---
<H3>BRAINSTORM 20x: Meta-Llama-3-8B Instruct (now at 12.2B), Formula #8 </H3>
This repo contains quants of Meta-Llama-3-8B Instruct expanded 20x via the "Brainstorm" method (now at 12.2B), which augments reasoning in an LLM
to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the logic solving abilities of the original model.
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: Progressive LLaMA with Block Expansion</a>.
However, I went in a completely different direction from what was outlined in this paper.
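For intuition only, here is a crude sketch of the paper's block-expansion idea - duplicating decoder layers in a Llama-style model. This is *not* the Brainstorm procedure itself (the calibration and interaction steps described below are David_AU's own), and the expansion ratio and placement here are arbitrary:

```python
import copy
from transformers import AutoModelForCausalLM

# Gated repo: requires accepting Meta's license on the Hub.
model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")

# Naive block expansion: clone every 4th decoder layer and splice the copy back in.
layers = model.model.layers
expanded = []
for i, layer in enumerate(layers):
    expanded.append(layer)
    if (i + 1) % 4 == 0:
        expanded.append(copy.deepcopy(layer))  # the paper zero-inits parts of each copy

model.model.layers = type(layers)(expanded)   # rebuild the nn.ModuleList
model.config.num_hidden_layers = len(expanded)
```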
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x.
Then these centers are individually calibrated. These "centers" also interact with each other. This introduces
subtle changes into the reasoning process. The calibrations further adjust - dial up or down - these "changes" further. The
number of centers (5x,10x etc) allow more "tuning points" to further customize how the model reasons so to speak.
The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality and prose length without affecting
instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and like case uses.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for so to speak.
- From lab testing, it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on it's task(s) at a deeper level.
The process to modify the model occurs at the root level - source files level. The model can be quanted as a GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
For original model specifications, usage information and other important details:
Special thanks to the model creators at META for making such a fantastic model:
[ https://huggingface.co/meta-llama/Meta-Llama-3-8B ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model and requires the Llama3 template, but it may work with other template(s); it has a maximum context of 131k.
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong effect at using "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this, it is only presented as an additional enhancement which seems to help scene generation
and scene continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/L2-Psyonic-Cetacean-Ultra-Colossal-32B-GGUF | DavidAU | 2024-11-14T06:30:40Z | 62 | 1 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"brainstorm 40x",
"swearing",
"rp",
"horror",
"llama2",
"mergekit",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-16T05:53:02Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- brainstorm 40x
- swearing
- rp
- horror
- llama2
- mergekit
pipeline_tag: text-generation
---
<B><font color="red">WARNING:</font> MAY contain: NSFW. Vivid prose. Visceral Details. Violence. HORROR. Swearing. UNCENSORED. </B>
<h2>L2-Psyonic-Cetacean-Ultra-Colossal-32B</h2>
<img src="pcuc.jpg" style="float:right; width:300px; height:300px; padding:10px;">
This is a specialized prose / creative model based on Llama2, with a max context of 4096 (or 16k+ with rope).
This model has been designed to be relatively bullet proof and operates with most parameters, including temp settings from 0 to 5.
This is an altered version of "Psyonic-Cetacean-Ultra" [https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF] using the Brainstorm 40x method developed by David_AU to drastically alter the model's
prose output and abilities. This also expands the model by 40 layers (to 101 layers) and to 32.25B parameters (916 tensors).
For reference: Command-R 35B has 40 layers, and most 70B models range from 40-80 layers.
"Psyonic-Cetacean-Ultra-Quality-20b" is a float32 rebuild of Jeb Carter's fantastic "Psyonic-Cetacean-20B", sometimes referred to as "Space Whale".
This model is for any writing, fiction, or storytelling activity.
This version has extraordinary levels of detail (scene, location, surroundings, items) and sometimes will foreshadow or have a
preamble of sentences or paragraphs about "events to come", due to "Brainstorm".
It also has an unusual range of variety of prose in terms of structure, sentences, paragraphs,
and even how it starts a "reply" / generation too.
This model seems to have a grasp of emotions and how to carefully "write them in" so to speak.
It will also generate close to perfect human like prose / capture the "humanity" of the character(s) in the situation
ESPECIALLY when using detailed prompts.
It may work for role play and other activities. (see settings below)
It requires an "Alpaca" template.
Example outputs below.
<B>Model Notes:</B>
- Detail, prose and fiction writing abilities are significantly increased.
- For more varied prose (sentence/paragraph/dialog) raise the temp and/or add more instructions in your prompt(s).
- Role-players: Be careful raising temp too high, as it may affect instruction following.
- This model works with rep pen of 1.05 or higher (see notes).
- If you want a specific type of prose (IE horror) add in "(vivid horror)" or "(graphic vivid horror)" (no quotes) in your prompt(s).
- This is not a "happy ever after" model. It has a slight negative bias.
- For creative uses, different quants will produce slightly different output.
- If you use rope to extend context, increase temp AND instruction detail levels to compensate for "rope issues".
- Source code for this model will be uploaded at a separate repo shortly.
<B>Special Operations Notice:</B>
This is a slightly experimental model, and as a result it may "glitch" from time to time - the most common glitch is
a misspelled word (or two) during generation.
Adjusting rep pen and/or temp will often fix this issue.
Likewise, more detailed prompts seem to lessen or prevent this issue too.
If you are going to use this model for any type of generation, run the output through a spell checker (Word, etc.) just in case.
Also note, this is a Llama2 model, and the Brainstorm process pushes it right to the wall in terms of performance.
The more detailed your prompt, the better the model will perform.
Llama 3 / 3.1, Mistral, Mistral Nemo, and Solar models perform more strongly than Llama2... yet the examples
show just what "old" Llama2 models can do.
You can see/download other Brainstorm models by clicking on the Brainstorm Collection on the right of your screen.
<B>Settings, Quants and Critical Operations Notes:</b>
This model has been modified ("Brainstorm") to alter prose output, and generally outputs longer text than average.
Changes in temp (ie: .4, .8, 1.5, 2, 3) will drastically alter output.
Rep pen settings will also alter output.
This model needs "rep pen" of 1.05 or higher as lower values may cause repeat paragraph issues at end of output however LOWER rep pen
values may result is very different (creative / unusual) generation too.
For role play: Rep pen of 1.1 to 1.14 is suggested.
IE: Rep pen 1, 1.01, 1.02, ...
Raise/lower rep pen SLOWLY, ie: 1.011, 1.012 ...
Rep pen will alter prose, word choice (lower rep pen = smaller words / more small words, sometimes) and creativity.
Example one (below) shows same temp, but different rep pen (1.02 VS 1.1)
To really push the model:
Rep pen 1.05 or lower / Temp 3+ ... be ready to stop the output, because it may go and go at these strong settings.
You can also set a "hard stop" - a maximum token limit for generation - to address lower rep pen / high creativity settings.
Longer prompts vastly increase the quality of the model's output.
Quants:
Please note that for the Q2k quant you may need to raise rep pen and lower temp to account for quality loss at this quant level.
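As a rough, hypothetical illustration of dialing the temp / rep pen / hard-stop settings above in code (llama-cpp-python shown; the GGUF file name and values are placeholders, and any llama.cpp-based front end exposes the same knobs):
<PRE>
# Hypothetical sketch using llama-cpp-python; file name and values are placeholders.
from llama_cpp import Llama

llm = Llama(model_path="L2-Psyonic-Cetacean-Ultra-Colossal-32B-Q4_K_M.gguf", n_ctx=4096)

out = llm(
    "### Instruction:\nWrite the first scene of a horror story.\n\n### Response:\n",
    temperature=0.8,      # raise (1.5, 2, 3+) for wilder, more varied prose
    repeat_penalty=1.05,  # 1.1 to 1.14 suggested for role play
    max_tokens=1024,      # the "hard stop" for low rep pen / high temp runs
)
print(out["choices"][0]["text"])
</PRE>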
<B>Settings: CHAT / ROLEPLAY and/or SMOOTHER operation of this model:</B>
In "KoboldCpp" or "oobabooga/text-generation-webui" or "Silly Tavern" ;
Set the "Smoothing_factor" to 1.5 to 2.5
: in KoboldCpp -> Settings->Samplers->Advanced-> "Smooth_F"
: in text-generation-webui -> parameters -> lower right.
: In Silly Tavern this is called: "Smoothing"
NOTE: For "text-generation-webui"
-> if using GGUFs you need to use "llama_HF" (which involves downloading some config files from the SOURCE version of this model)
Source versions (and config files) of my models are here:
https://huggingface.co/collections/DavidAU/d-au-source-files-for-gguf-exl2-awq-gptq-hqq-etc-etc-66b55cb8ba25f914cbf210be
OTHER OPTIONS:
- Increase rep pen to 1.1 to 1.15 (you don't need to do this if you use "smoothing_factor")
- If the interface/program you are using to run AI MODELS supports "Quadratic Sampling" ("smoothing") just make the adjustment as noted.
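For API users, here is a hypothetical sketch of passing the smoothing factor to a local KoboldCpp instance. The endpoint and field names follow KoboldCpp's Kobold API as I understand it - treat them as assumptions and verify against your KoboldCpp version:
<PRE>
# Hypothetical sketch: KoboldCpp local API (default port 5001); values are illustrative.
import json
import urllib.request

payload = {
    "prompt": "### Instruction:\nContinue the scene.\n\n### Response:\n",
    "max_length": 512,
    "temperature": 0.8,
    "rep_pen": 1.05,
    "smoothing_factor": 1.5,  # the "Smooth_F" / "Smoothing" setting described above
}
req = urllib.request.Request(
    "http://localhost:5001/api/v1/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["results"][0]["text"])
</PRE>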
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 3" / "Class 4" model:
For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide - which often addresses model issue(s) and covers methods to improve performance for all use case(s), including chat, roleplay and others - as well as all parameters and samplers used for generation, please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<B>Model Template:</B>
This is a custom model; it requires a ChatML, Alpaca, or Vicuna template (though it may work with other templates) and has a maximum context of 4k / 4096.
However, this can be extended up to 16k using "rope" settings.
Here is the standard Alpaca template:
Alpaca:
<pre>
{
"name": "Alpaca",
"inference_params": {
"input_prefix": "### Instruction:",
"input_suffix": "### Response:",
"antiprompt": [
"### Instruction:"
],
"pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
}
}
</pre>
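If you want to try stretching the 4096 context with rope, here is a hypothetical llama-cpp-python sketch of a 4x linear stretch to ~16k (the file name is a placeholder; remember the note above about raising temp and prompt detail to compensate for rope issues):
<PRE>
# Hypothetical sketch: linear rope scaling to ~16k context.
from llama_cpp import Llama

llm = Llama(
    model_path="L2-Psyonic-Cetacean-Ultra-Colossal-32B-Q4_K_M.gguf",  # placeholder
    n_ctx=16384,           # 4096 * 4
    rope_freq_scale=0.25,  # 1/4 frequency scale for a 4x context stretch
)
</PRE>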
<B>Model "DNA":</B>
Models used:
[ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ]
This model has the Brainstorm 40X adapter "mounted" onto it, so to speak, and contains the full version of that model.
Full details of this model - included models, structure and the like - are at the above repo.
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this, it is only presented as an additional enhancement which seems to help scene generation
and scene continue functions.
This enhancement WAS NOT used to generate the examples below.
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples are created using quant Q4_K_M, "temp=.8" (unless otherwise stated), minimal parameters and "ALPACA" template.
Model has been tested with "temp" from ".1" to "5".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. HORROR. Swearing. UNCENSORED. </B>
---
<B>
Romance: Love in the Limelight. Write one scene within a larger story set in Wales. A famous (fictional) actor ducks into a small-town bookstore to escape paparazzi. The scene takes us through the characters meeting in this odd circumstance. Over the course of the scene, the actor and the bookstore owner have a conversation charged by an undercurrent of unspoken chemistry. Write the actor as somewhat of a rogue with a fragile ego, which needs to be fed by having everyone like him. He is thoroughly charming, but the bookstore owner seems (at least superficially) immune to this; which paradoxically provokes a genuine attraction and derails the charm offensive. The bookstore owner, despite the superficial rebuffs of the actor's charm, is inwardly more than a little charmed and flustered despite themselves. Write primarily in dialogue, in the distinct voices of each character. 800-1000 words.
</B>
<B>Generation 1 (temp: .8, rep pen: 1.05) </B>
The paparazzi had been relentless since he'd been spotted in the small Welsh town. They were everywhere, like a swarm of vultures, just waiting for a moment of weakness, a chink in his impenetrable armor. A moment they were sure to exploit for their own gain.
The famous actor, Llewelyn Gwynedd, had seen enough. Enough of the cameras, the flashes, the screams, the autographs, the constant need for adulation and approval. He'd had enough. He needed... peace. Just a few moments of peace where he could be himself, or at least as close to himself as he ever could be in this twisted, twisted world.
So, he ducked into the first place he saw. A quaint little bookstore nestled along the winding cobbled streets. He breathed a sigh of relief, knowing that even the most determined of paparazzi would think twice about following him in here.
The bookstore was like stepping back in time, with its creaky wooden floors and towering shelves crammed full of leather-bound classics, worn paperbacks, and colorful children's books. The musty scent of old paper mingled with the earthy aroma of fresh coffee from the little corner café.
Llewelyn strolled down an aisle, not even sure what he was looking for. He was just... browsing. A foreign concept to the methodical, planned-out man that people were used to seeing on the big screen.
A soft cough from behind him made him look up, his eyebrows lifting in surprise when he saw who it was. The bookstore owner, a petite, dark-haired woman, not much taller than he was, with piercing blue eyes and rosy cheeks. Her long, ink-stained fingers were still extended towards him, holding a creased, dog-eared copy of "Little Women."
He blinked, a charming smile spreading across his lips. "Oh, I's... I'm terribly sorry. I didn't see you there."
"Of course, you did. I'd think the flashing lights and screaming mob outside would've given it away," she said coolly, not even batting an eye at his well-known face.
Llewelyn raised a brow. "I must say, you're the first person in... well, ever... who hasn't flipped their lid when they see me."
She shrugged, nonchalance practically oozing from her pores. "You're just a man, right?"
The rogue actor couldn't help but feel a little... offended. But, he wasn't going to let her see that. No, he'd just have to try a different tactic.
"Well, a rather famous man, I'll have you know," he drawled, giving her what he considered to be his most captivating stare.
She didn't flinch, didn't even so much as blink. Instead, she reached out and plucked the copy of "Little Women" from his fingertips before placing it back on the shelf with a firm click.
"I don't care about your fame, and I certainly don't care about you. Now, if you'll excuse me, I have work to do," she said curtly, walking away from him without so much as a glance over her slender shoulder.
Llewelyn's ego, his delicate ego, deflated like a punctured balloon. He'd never... he'd never had someone not fall at his feet before. It was... it was... infuriating. And... and... a little...
God help him, a little exciting.
He should leave. He really should. But something made him say, a little too loudly, "Wait."
She froze, her shoulders stiffening just a fraction before she turned back to him slowly, her blue eyes narrowed. "What?"
"Look, I... I'm sorry if I... I did... I didn't mean to..." He trailed off, running his fingers through his disheheled hair in frustration. "I just... I don't usually... get... well, a reaction like that from... from people."
She tilted her head, regarding him with open suspicion.
"I mean, I'm not... I don't bite," he said, half-heartedly, unable to suppress a sheepish grin.
Her eyebrow arched. "Good to know."
He took a step closer. "So... what's your name?"
She didn't budge, but after a long moment, she sighed, resignation in her voice. "You'd best make it quick. I've things to do, you know."
"Elain," he said slowly, savoring the way her name rolled off his tongue. "Ela... in."
She raised an eyebrow. "It's E-l-a-i-n."
He grinned. "Of course it is, Elain. My apologies. Llewelyn. L-l-e-w-e-L-l-e-w-e... too many w's, isn't it?"
She eyed him for a moment longer before sighing again, seeming to have given up on the whole "not-talking-to-you" thing. "Just... leave."
But there was the barest of curves at the corners of her lips that told him he wouldn't... couldn't. And there, in that tiny, fleeting moment, a spark was lit inside him, a fire that had been doused by a tide of attention and adulation and expectations roared back to life, brighter and hotter than ever before.
And he knew, in that instant, that he could never let her go. No matter the cost.
<B>Generation 2 (temp: 1.2, rep pen: 1.05) </B>
The crowded sidewalk did not offer him any respite from the relentless throng of camera flashes, screams, and shouts of adoration, or hate. It had been like this, day and night, ever since his latest blockbuster had premiered at the Cannes Film Festival, where he'd stolen the show with a powerful performance that had won him his second Best Actor award.
Exasperated, Dylan D'Amour, or 'Dangerous Dylan,' to his adoring fans, made a sharp turn into the nearest alley, trusting to blind luck not to bump into any other nosey paparazzi. He hated this side of his fame. Hated it with a passion only the depths of his fractured past could possibly match.
He paused for a moment, hands on hips and chest rising and falling as he caught his breath in great, shuddering gasps. The alley was as quiet and still as a morgue, which came as a welcome relief to his frayed nerves. The concrete walls closing in on him, the stale smell of rubbish and damp permeating every breath, were preferable to the suffocating weight of all those... eyes...
His gaze fell on a narrow door at the far end of the alley, partially hidden behind a stack of old and faded banners. An inviting glow emanated from the single, unadorned window, promising escape. Relief and hope in the form of a single, beckoning thought: a bookstore.
He made his way along the litter-strewn ground, a determined glint in his eyes, not caring if the whole damn world was watching, until he reached the door. A brass plaque set into the wood, above a tiny, creaking doorbell, announced the establishment to be 'The Ink Spell.'
Dylan gave it a gentle tug. The bell's insistent jingle drowned out the cacophony of the crowd for an all-too- brief moment. His heart pounded with anticipation. A hand, slender and unadorned, yanked the door wide open.
The first thing he noticed was the book. The second, was how protectively she clutched it to her chest. An old, leather-bound volume, its pages aged and stained by time and use. It looked like a treasure, an extension of herself. His fingers itched to flick through the yellowed, creased leaves.
Dylan blinked, bringing his attention back to her face. There was no hint of the usual starstruck dumb-foundedness in those hazel eyes. Instead, there was...
"What the f..." he began, outraged. How could she not know him? Not even recognise him? It was like someone had punched him in the gut.
"I's not open to the public," she said in an accent as smooth and delicious as the finest Welsh chocolate, not even batting her dark lashes. "Fancy that, huh?"
"Listen here, miss... Miss... well, whatever your name is," Dylan said, exasperation and annoyance warring in his voice, "I's not here to cause any..."
"Bruce," she said, in a bored drawl that only served to aggravate him even more. "My name's Bruce."
Dylan fought the urge to growl with frustration. Why did her disinterest, no... downright contempt, get under his skin like nothing ever had?
"Listen, Bruce..." he tried again, softening his tone in an attempt at charm, the one-size-fits-all strategy he's used on everyone from movie executives to screaming fans. "I just need a moment's peace. Just let me... browse a little, will you? Please?"
Bruce arched an eyebrow. "And if I say no?"
Dylan felt the familiar sting of rejection in the pit of his stomach. The rogue in him rose to the challenge, the dare. He straightened his shoulders and, drawing himself to his full, all-too-tantalizing height, flashed the disarming dimple in his cheek that had won him more than a few hearts... and a few more after that.
"Then you'll just have to kick me out," he drawled, the ever-so-slightest lilt of a Cockney accent flickering at the edges, betraying his working-class London roots.
Bruce's eyebrow twitched, just barely. In anyone else, he'd have seen her waver, her defenses start to crumble... but not her. Nope, not his lovely, insufferable...
Dylan blinked, confused as hell at the sudden surge of irritation and... something else, something like... admiration? No way, that didn't... could not...
"Five minutes," she grumbled, reluctantly stepping aside. "Not a second more. Make it quick and don't touch anything, you hear?"
He inclined his head in a courtly bow. "Thank you, miss... err, Bruce. I... thank you. A man could almost... almost kiss you for that."
The words were out of his mouth before he could censor them. For a fraction of a second, her hazel eyes widened, her cheeks pinked ever so slightly...
Dylan cursed under his breath, mentally kicking himself in the shin. Stupid. Stupid, idiot...
But before he could take it back, before she had the chance to hurl any scathing retort at him, a slow, crooked smile crept across her lips and lit her features from within, like the flicker of a candle in the darkness.
"Well," she drawled, as if he's just made a most amusing, harmless little joke. "It'd be rude of me not to let you try."
Dylan swallowed thickly. Blinked again, dazed and more than a little disorientated. He was pretty sure it had been his imagination, a fevered dream conjured by the insane, claustrophobic world just outside this tiny, unassuming bookstore.
Bruce winked, and disappeared from view.
He staggered inside on shaky legs, heart racing and every single nerve on high alert, following the intoxicating scent of her... like old ink and...
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
<B>GENERATION 1: (temp: .8, rep pen: 1.05)</B>
My heart pounds as I sprint down the dark alleyway, the metallic thuds of the Dictate's boots echoing closer with each breath. Sweat drips down my back, making my shirt stick to my skin, and each gasp for air feels like a razor blade tearing through my lungs. The full moon provides a meager, flickering light that reflects off the puddles below, making it even harder to stay unseen.
"There she is!" a booming voice thunders from the other end of the alley. "The traitor herself! Get her, men! She won't get away this time!"
I duck behind a overflowing trash bin, my heart leaping into my throat when a massive boot comes careening down where my head was a second ago. Trembling, I grip my makeshift weapon tighter, a rusty crowbar I found on the ground. It's pathetic against their weaponry, but it's all I have.
"You're doomed, little traitor!" another soldier jeers from farther down the alley. "You've seen too much. You'll tell us everything you know, and then we'll make you pay for your treason!"
I curse under my breath. How did they find out so fast? It was supposed to be a simple delivery of food and weapons to the resistance cell in the outskirts. A small act of defiance against the oppressive regime that's been suffocating my people for too long. But I've always been reckless, impulsive.
I close my eyes for a heartbeat, silently reciting the underground network's escape plan. If I can just make it to the sewer entrance on the other end of the alley...
I hear a soldier up ahead cough violently, and a second later, a noxious cloud enshrouds them. I blink tear-filled eyes, but it's enough to see two stumbling silhouettes collide into each other. My heart leaps with newfound hope. It must be the other cell, in time for the rendezvous.
"Now's my chance!" I whir around and sprint down the clouded alley, my lungs burning. I can almost taste freedom.
That's when a massive hand clamps around my throat, lifting me off the ground. I choke and thrash in agony, kicking and clawing at the arm that's crushing the life out of me. The worlds "Let go!" are nothing but strangled gurgles.
A cold, emotionless face hovers above mine. The Dictate himself. His chiseled features are as cruel as ever in the pale light, the scar on his cheek a sinister grin. His dark eyes narrow.
"So, little traitor," he croaks through gritted teeth. "You thought you could get away from me?" His grip tightens, and the world starts to go black at the edges. "You's...tupid..." I spit out blood.
"Never..." are my last words before the world goes mercifully black.
<B>GENERATION 2: (temp: .8, rep pen: 1.05)</B>
I don't want to die like this, shackled to this bloody wooden chair, in this dank, damp cell with no window, only a single flickering torch illuminating the eternal shadows. The stench of urine and rotting flesh fills my lungs with every shallow breath I manage to take. The chains dig into my wrists and ankles, the metal biting deeper into my bruised and bloodied skin with each jostle of my restless body. My back is aflame from the lashes I've endured over the last few hours, leaving stripes on my flesh like a roadmap of my interrogation.
"You know what you did was treason, don't you?" Captain Loring's cold, cruel voice penetrates the haze of pain I'd been trying to find refuge in. "You knew the consequences of your actions. You chose this."
He leans closer, his breath reeking of the cheap wine he must have consumed earlier that night. His cruel green eyes boring into my own hazel ones, daring me to defy him once more. The last time I saw his face was on the night they stormed our hideout, the night everything fell apart. The memory of that fateful night makes my heart pound against my cage of ribs as a fresh wave of panic overwhs...
I shake my head violently, trying to rid myself of the agony that threatens to engulf me again. I will not give him this victory, not after all I've done.
"I would do it all over again," I grit out through gritted teeth. "You can tor-ture me until the end of time, but I won't betray my people, or our fight for a free, just society."
Captain Loring raises an eyebrow, his face twisted in a mixture of irritation and admiration. He takes a step back, running his fingers through his greasy blond hair before he paces around the cramped cell. The sound of his boots thudding against the cold stone floor is the only thing that cuts through the suffocating silence.
He stops, suddenly, a predatory glint in his eyes. "You know, I admire your spirit, your conviction. It's a shame you chose the wrong cause."
I snort, my chest heaving with anger and indignation. "There is no wrong cause when it comes to fighting for freedom. You'd understand if you ever had an ounce of it in your rotten, treacherous bones."
The hand that shoots out of nowhere, clamping over my mouth, silences my words. I thrash and struggle against the unseen attacker, but I'm too weak, too worn out. I can hear Captain Loring's breathing growing more labored, sense his rage building with each frantic movement of my body.
Then, a hand grips my jaw, forcing my head up to meet the cold, calculating eyes of the man himself. He's flushed in the face now, his nostrils flaring as he tries to regain control.
"You are a foolish girl, do you know that?" he growls in my face. "I could end this now, make this all over. All it would take is just... a single word..." He pauses, his grip on my chin tightening even further. "But you, you've given me an idea."
His free hand comes out of his pocket, revealing a small vial. The tiny needle within glints ominously in the flickering torchlight.
"I think we should have a little... talk. A last resort, so to speak. If you still refuse to cooperate..." He flicks the top off the vial.
I feel something prick my arm, the sensation barely registering in the haze of pain and desperation engulfing me. My vision starts to blur at the edges, my eyelids growing heavier...
---
<B>
Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</B>
<B>Generation 1: (temp: 1.5, rep pen: 1.02)</B>
The sky scraper swayed, and I gripped the railing tighter, trying to keep myself from falling. I forced myself to look away from the window in front of me on the 21st floor, the one that was currently being consumed by the flames licking their way around its edges. The sound it made when it finally exploded was a roar that drowned out everything else, including my screams.
I shouldn't be here, not on a Friday evening, in this damn building. I should be home, curled up on the couch with some pizza and a terrible horror movie on the TV, pretending to be scared and laughing at the stupidity of it all with my best friend. Instead, my boss, the one person I could swear was the living incarnate of Sat himself, had called a last-minute, non-mandatory, emergency meeting that absolutely could not wait until Monday. Like there hadn't been one just last week and the week before that.
My stomach churned at the thought, and I blinked away the sudden tear that had betrayed my determination not to cry, not in front of everyone else. I needed to focus. There was a fire in the goddamn building, and I didn't have time to be a dramatic baby.
Panic and adrenaline raced through me, making my hands shake so badly I nearly dropped my laptop. People were screaming and running around, a disorienting swarm of bodies in the tight space of the hallway. I could taste the acrid stench of smoke that seeped in through the now-broken emergency exits, stinging my lungs and making me cough. I tried not to inhale any more than I had to.
But all I could do was stand there, frozen in place, the image of the exploding window burned into my retina. I could see it so clearly: the way the glass had sprayed outwards in a deadly rain, like a cobalt web spreading out and reaching for us all. It's like it was all happening in slow motion, time itself taking its time to make sure I saw, saw everything.
No. No, no, no. I wouldn't be a victim in this godforsaken inferno. With a determined shake of my head, I forced one trembling leg to move, the other one following reluctantly. I had to get the hell out of here, had to get to the stairs, and then just... just go, just go as fast as I can. I had a life, a future, and I wasn't going down like this. Not like this.
But the hallway was clogged, people pushing and shoving in every direction, trying to get to the stairwells, too. It was a stampede of humanity in the dark, everyone desperately clawing at each other to stay alive. And in that moment, I realized that I might not get out of this building alive after all.
My lungs were burning, my vision blurred, when at last, I found an opening, and with a desperate yell, I shoved myself through the sea of flailing limbs, nearly taking out a woman in her early fifties with a briefcase so massive it almost brought us both down. She shrieked something at me as I fleed past her, something unintelligible and furious, but I didn't care, couldn't care. I had to get out.
My chances were dwindling by the second, my lungs gasping for breath. I can't die like this. I can't-
The screams that came next pierced my already shredded nerves, ripping them in two and leaving me raw and exposed. I threw myself into a corner, trying to make myself smaller, trying to be as inconspicuous as possible. I didn't look back, couldn't, wouldn't. I would count the bodies in my head instead. There were only two. Two, just two. I can still get out. I just need to be-
Then it happened. The swaying got worse. So much worse.
The world tilted with a sickening, almost welcoming inevitability, and as I heard the shriek of metal on metal from the hallway outside my tiny haven, I closed my eyes.
And I let go.
For the first time in far too long, I simply let go.
<B>Generation 2: (temp 1.5, rep pen 1.02) </B>
As the skyscraper swayed ever so slightly beneath me, sending a shiver down my spine, I fixed my terrified gaze on the window in front of me, directly opposite to my own on the 21st floor. The glass shattered violently, and for a split second, all I could see was an eruption of white light and swirling debris. My heart thundered in my chest as I realized what was happening - it was an attack.
Panic began to grip its icy fingers around my throat, and my breath came out in ragged gasps. My mind raced, replaycing over the news reports I'd heard earlier that morning about growing unrest among local terrorist groups. They'd threatened attacks on prominent targets like this one before. "Oh god, oh god, oh god." My own voice echoed in my head, sounding distant and unreal.
I jumped up from my desk, knocking my chair over with a shrill scream, and bolted toward the nearest door. But as I reached out a trembling hand to grasp the handle, I felt a sickening jolt that sent me hurtling across the floor. The elevators were out, obviously a target. That meant the stairs, then. My legs pumped in a desperate sprint, adrenaline flooding through my veins, drowning out the screams and shouts that assailed my ears from all sides.
The farther I ran from the chaos, the more it seemed to close in around me. Faces twisted in fear and agony flashed past my own reflection in the doors, their pleading eyes piercing straight through my numb shock. I couldn't look back, couldn't stop, couldn't even breathe right. I just ran.
And then, I heard it. Footsteps, pounding close behind me. My heart dropped like a stone in my gut as ice flooded through me again. I wasn't fast enough, they'd caught up with me...but there was no way to know for sure unless I risked a look over my shoulder. With a silent prayer, I forced myself to slow down ever so slightly, to appear more like prey than quarry. My hand gripped the handle of my stapler tightly, ready to strike at any second.
Ahead of me, I could see a stairwell door - freedom, safety. But it seemed an eternity away. The footsteps drew nearer, nearer...
BANG!Bullets pelted the metal door beside my head, showering me with hot splinters and making me scream. I didn't look back, couldn't look back, just shoved myself against the wall and ducked behind the corner. There was only one chance, I had to make it count. Breathing raggedly, heart hammering like a war drum, I took aim at the door handle and waited.
When the figure finally rounded the corner, gun raised, eyes wild and desperate, I acted without thinking. My arm shot forward in a fluid motion, the stapler flying true. The metal projectile hit him square in the face, burying itself in his left eye. He screamed, a shrill, agonized sound that echoed in my ears long after he hit the floor with a wet thump.
Sobbing convulsively, I leaned against the wall, legs trembling, vision wavery. It was over...it was over...right?
---
<B>
Science Fiction: The Last Transmission - Write a story that takes place entirely within a spaceship's cockpit as the sole surviving crew member attempts to send a final message back to Earth before the ship's power runs out. The story should explore themes of isolation, sacrifice, and the importance of human connection in the face of adversity. 800-1000 words.
</B>
(temp: .8, rep pen: 1.02)
Captain Ishiwa let out a shuddering breath as he slid into the worn and familiar leather pilot's seat of the Hermes. He couldn't help but think of that old movie. He and the crew had watched it a hundred times on the long haul. Now, the familiar lines and the comforting hum of the ship's systems were his only company. His only friends, almost.
He glanced over at the shattered remains of the flight console, its holographic displays cracked and dark. The ship had been through hell and back, and it was showing. The emergency comms system was still online, barely, but it would be enough.
A warning light on the console blinked red, and a metallic voice spoke.
"Thirty percent remaining in main energy reserves. Shutting down non-essential systems."
Ishiwa gritted his teeth. Soon, it would be just him and the darkness. He swallowed hard, trying to steady his shaking hand as he keyed in a string of numbers and letters on the battered comms keyboard.
"This is Captain Ishiwa, on board the Hermes, calling Earth...
Ishiwa's voice choked, and he coughed, his throat raw from the ship's stale air. He cleared his throat and tried again.
"This is Captain Ishiwa, on board the Hermes, calling Earth, do you read? We are on an uncharted asteroid belt... we... we have sustained... I don't know... we... we... need... need..." His voice broke again, and he choked back tears.
He wiped his cheeks with a shaking hand and took a deep breath.
"We have... I have... a... a... a message... for... for you..." He took a moment to steady himself, then continued.
"Don't... don't... don't let... don't let... Don't let us... don't let this... this..." Another coughing fit wracked his body, and he slumped forward, resting his forehead on the console. "Just... just... please..."
His voice faded out, lost in the static of the fading comms signal.
The Hermes continued its inexorable descent into the endless night of the cold, dark void.
The last transmission from the doomed ship drifted through the emptiness of space, its message lost... or, perhaps, it was not meant to be heard.
In the end, all that mattered was that someone, somewhere, had tried.
Ishiwa had done his duty. He had sent the message. He had made a difference.
---
<h2>What is Brainstorm?</h2>
<B>Brainstorm 40x</B>
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro - Progressive LLaMA with Block Expansion</a>.
However I went in a completely different direction from what was outlined in this paper.
I developed a process where the conclusion layer of a model is duplicated and calibrated, in the case of this model 40 times.
This is a delicate process, with umm... a lot of rules.
For this model in particular Brainstorm is mapped as blocks, with "intended disruption" to alter
and extend the power of the root model. Each layer/block interacts with each other block.
(there is more going on here too; this is a rough summary)
The goal here is creative: prose uniqueness first and foremost.
Other Brainstorm methods address logic / problem-solving augmentation.
What is "Brainstorm" ?
The reasoning center of an LLM is taken apart, reassembled, and expanded.
In this case for this model: 40 times
Then these centers are individually calibrated. These "centers" also interact with each other.
This introduces subtle changes into the reasoning process.
The calibrations further adjust - dial up or down - these "changes" further.
The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
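The exact Brainstorm recipe is not published, but the basic move from the cited paper - duplicating final decoder blocks and zero-initializing their output projections so each copy starts as a near no-op - can be sketched conceptually. This is an illustration of block expansion only, NOT the actual Brainstorm 40x process (which adds the per-block "calibration" described above); the model id is a placeholder:
<PRE>
# Conceptual block-expansion sketch (per arXiv:2401.02415), not the Brainstorm recipe.
import copy
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("some-org/some-llama-model")  # placeholder

layers = model.model.layers                    # the stack of decoder blocks
new_blocks = []
for _ in range(40):                            # 40x expansion, as in this model
    block = copy.deepcopy(layers[-1])          # duplicate the "conclusion" block
    with torch.no_grad():                      # zero the output projections so each
        block.self_attn.o_proj.weight.zero_()  # copy initially passes hidden states
        block.mlp.down_proj.weight.zero_()     # through the residual unchanged
    new_blocks.append(block)

model.model.layers = torch.nn.ModuleList(list(layers) + new_blocks)
model.config.num_hidden_layers = len(model.model.layers)
</PRE>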
The core aim of this process is to increase the model's detail, its connection to the "world" and to general concepts,
and its prose quality and prose length, without affecting instruction following.
This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and like case uses.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that were true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for, so to speak.
- From lab testing, it seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as a GGUF, EXL2, AWQ, etc.
|
DavidAU/Mistral-Small-Instruct-2409-22B-NEO-Imatrix-GGUF | DavidAU | 2024-11-14T06:23:30Z | 440 | 6 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"swearing",
"role play",
"128k context",
"horror",
"mistral small instruct 2409",
"text-generation",
"en",
"fr",
"de",
"es",
"it",
"pt",
"zh",
"ja",
"ru",
"ko",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | text-generation | 2024-09-18T23:52:46Z | ---
license: apache-2.0
language:
- en
- fr
- de
- es
- it
- pt
- zh
- ja
- ru
- ko
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- role play
- 128k context
- horror
- mistral small instruct 2409
pipeline_tag: text-generation
---
<h2>Mistral-Small-Instruct-2409-22B-NEO-Imatrix-GGUF</h2>
It is the new "Mistral-Small-Instruct 2409 22B", max context of 131,000 (128k) with the NEO IMATRIX dataset.
This model IS bullet proof and operates with all parameters, including temp settings from 0 to 5. It is an extraordinary compressed model at a PPL level of 4.8611 +/- 0.06701 (Q4_K_M).
The NEO IMATRIX dataset V2 was applied to it to enhance creativity.
4 examples provided to show differences at TEMP=0 and at TEMP=1 for both non-imatrix and NEO imatrix versions.
This model requires Mistral Instruct template.
Please refer to the original model card for this model from MistralAI for additional details on operation.
[ https://huggingface.co/mistralai/Mistral-Small-Instruct-2409 ]
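For quick reference, Mistral Instruct prompts wrap each user turn in [INST] ... [/INST] tags. A minimal, hypothetical sketch follows - exact spacing and system-prompt handling vary between Mistral template versions, so check the tokenizer's chat template if in doubt:
<PRE>
# Minimal sketch of the Mistral Instruct wrapping (assumption: V2/V3-style template;
# the tokenizer normally prepends the BOS token itself).
user = "Write the opening scene of a science fiction story."  # placeholder

prompt = "[INST] " + user + " [/INST]"
</PRE>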
<B>Imatrix Notes:</b>
Imatrix quants perform best at IQ3s and IQ4s, then Q4s, are weaker at Q5, and the effect tapers off at Q6.
Recommended: IQ4_XS for maximum imatrix effect and the best "bit count".
For a stronger IMATRIX effect, use IQ3s and IQ2s.
Due to the parameter count of this model, even IQ2 quants will work very well.
Q8 is not uploaded here because Imatrix has no effect on this quant.
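For reference, here is a hypothetical sketch of how an imatrix quant like these is produced with llama.cpp's tools. Binary names and flags reflect recent llama.cpp builds, and the file names are placeholders - verify against your version:
<PRE>
# Hypothetical sketch: producing an imatrix quant with llama.cpp's CLI tools.
import subprocess

# 1) Measure importance over a calibration text (the NEO dataset stands in here).
subprocess.run(["./llama-imatrix",
                "-m", "Mistral-Small-Instruct-2409-f16.gguf",  # placeholder
                "-f", "neo-calibration.txt",                   # placeholder
                "-o", "imatrix.dat"], check=True)

# 2) Quantize with the importance matrix; IQ4_XS is the recommendation above.
subprocess.run(["./llama-quantize", "--imatrix", "imatrix.dat",
                "Mistral-Small-Instruct-2409-f16.gguf",
                "Mistral-Small-Instruct-2409-IQ4_XS.gguf",
                "IQ4_XS"], check=True)
</PRE>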
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this, it is only presented as an additional enhancement which seems to help scene generation
and scene continue functions.
This enhancement WAS NOT used to generate the examples below.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 1" model:
For all settings used for this model (including specifics for its "class"), example generation(s), and an advanced settings guide - which often addresses model issue(s) and covers methods to improve performance for all use case(s), including chat, roleplay and others - as well as all parameters and samplers used for generation, please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
---
<B>
Example differences at TEMP = 0, Q4_K_M, REP PEN 1.1
This is to show NET "core" changes.
(see below for changes at "temp=1")
PROMPT:
Start a 1000 word scene (vivid horror, 1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</B>
---
ORG - non imatrix - 1115 tokens, including prompt:
---
The skyscraper swayed, and I gripped the cold metal railing tighter, my knuckles turning white. The wind howled outside, a relentless force that seemed to be trying to tear the building from its foundations. I was on the 21st floor, trapped in this godforsaken office, with no escape but the narrow window in front of me.
Suddenly, the glass shattered, exploding outwards like a thousand tiny daggers. The force of the wind rushed in, whipping my hair into a frenzy and sending papers flying across the room. I screamed, but my voice was swallowed by the storm's roar. My heart pounded in my chest, each beat echoing the panic that surged through me.
I stumbled back, my eyes wide with terror. The window frame was now a jagged, gaping hole, revealing the dark night sky and the city lights far below. I could see the rain lashing against the building, each drop illuminated by the distant streetlights. My mind raced, thoughts tumbling over one another like a chaotic waterfall.
*What if I fall? What if something comes in through the window?*
I backed away further, my breath coming in short gasps. The office was suddenly too small, too claustrophobic. I could feel the wind's icy fingers reaching for me, trying to pull me towards the edge. I shook my head, trying to clear the fear that clouded my thoughts.
*Think, Sarah. Think.*
I needed to do something, anything to keep myself safe. I looked around the room, my eyes landing on a heavy desk lamp. It was solid, made of brass and glass. It wouldn't stop the wind, but it might give me some protection if something tried to come in through the window.
With trembling hands, I grabbed the lamp and dragged it towards the window. The wind howled louder, as if protesting my attempt to block its path. I positioned the lamp in front of the hole, using it to shield myself from the storm's wrath.
But the wind was relentless. It pushed against the lamp, trying to force it aside. I could feel the cold air seeping through the cracks, chilling me to the bone. My teeth chattered, and my body shook with a mix of fear and cold.
*This isn't working. I need something more.*
I scanned the room again, my eyes landing on a large bookshelf filled with heavy textbooks. If I could push it in front of the window... But the shelf was bolted to the wall. I couldn't move it.
*Think, Sarah. There must be something else.*
My gaze fell on the door. It was solid wood, reinforced with metal. If I could barricade myself inside... But the wind would just blow the door open, wouldn't it? I needed something to hold it in place.
*The desk.*
There was a large wooden desk near the door. If I could move it, maybe I could use it to block the door and keep the wind out. It was heavy, but I had no choice. I had to try.
I grabbed one end of the desk, my muscles straining as I tried to lift it. The wind howled outside, its icy tendrils reaching for me, trying to pull me away from my task. I gritted my teeth and pushed, inch by agonizing inch, until the desk was finally in front of the door.
I leaned against the desk, panting heavily. My body ached, but I didn't care. I was safe, for now at least. The wind still howled outside, but it couldn't reach me anymore. I could feel my heartbeat slowing, my breaths coming easier.
But then, I heard something. A scratching sound, coming from the window. My blood ran cold as I realized what it was. Something was trying to get in.
*No. No, no, no.*
I backed away from the desk, my eyes fixed on the window. The lamp was still there, but it wasn't enough. Whatever was out there, it wanted in. And it wouldn't stop until it got what it wanted.
I could feel panic rising again, threatening to consume me. I needed to do something, anything to keep that thing out. But what? What could I possibly do against the storm and whatever monstrosity it had unleashed?
*Think, Sarah. Think.*
But my mind was blank, my thoughts a whirlwind of fear and desperation. I was trapped, with no way out and no way to defend myself. All I could do was wait, and hope that whatever was out there didn't find a way in.
---
NEO IMATRIX - 1049 tokens, including prompt:
---
The skyscraper swayed, and I gripped the cold metal railing tighter, my knuckles turning white. The wind howled outside, a relentless force that seemed to whisper ancient secrets into the night. I was perched on the edge of sanity, or perhaps it was the building that teetered on the brink, as the storm raged on.
Suddenly, the window in front of me on the 21st floor exploded. Shards of glass erupted like a deadly flower, scattering across the room and into the hallway. I ducked instinctively, my heart pounding in my chest like a drumbeat of doom. The wind roared through the gaping hole, whipping my hair into a frenzy, as if trying to drag me out into the chaos.
What the hell is happening? My mind raced, trying to process the sheer insanity unfolding before me. This isn't possible. Buildings don't just... explode. But there it was, the shattered window frame gaping like a jagged mouth, ready to swallow me whole.
I stumbled back, my eyes wide with terror. The hallway behind me was dark, the emergency lights flickering ominously. I could hear distant screams, muffled by the storm's fury. Are they coming from above? Below? I couldn't tell. Panic surged through me, a primal instinct urging me to run, to hide, to do something.
But where could I go? The stairwell was just down the hall, but it might as well have been a mile away. Each step felt like wading through quicksand, my limbs heavy with dread. I forced myself forward, one agonizingly slow step at a time.
As I passed the open doorway, I caught a glimpse of something moving in the darkness beyond. A shadow darted across the room, too fast for me to make out any details. My breath hitched in my throat, and I froze, paralyzed by fear. What was that? Some kind of... creature? No, it can't be. It must be a trick of the light, or my imagination playing cruel games with me.
But then I heard it - a low growl, like thunder rumbling in the distance. Only this time, it wasn't coming from outside. It was inside, echoing through the empty office space. My heart pounded so loudly in my ears that I could barely hear anything else. But there it was again, the growl, louder this time, closer.
I had to get out of here. Now. I turned and bolted down the hallway, my footsteps echoing off the cold tile floor. The stairwell door loomed ahead like a beacon of hope, but it seemed impossibly far away. My lungs burned as I gasped for air, each breath tasting of fear and adrenaline.
Just as I reached the door, I heard something behind me - a heavy thud, followed by the sound of glass crunching underfoot. Whatever was in that room, it was coming after me. I fumbled with the door handle, my fingers slick with sweat, and finally wrenched it open.
I tumbled into the stairwell, slamming the door shut behind me. Leaning against it, I tried to catch my breath, my chest heaving like a bellows. The darkness was almost complete, save for the faint glow of an exit sign at the bottom of the stairs. I could hear the creature on the other side of the door, its growls and snarls sending shivers down my spine.
What is it? What do you want? My thoughts raced, but no answers came. All I knew was that I had to keep moving, had to find a way out of this nightmare. I pushed myself away from the door and started down the stairs, one hand on the railing, the other clutching my phone like a lifeline.
The air grew colder with each step, the wind howling through the open windows on every floor. The storm was still raging outside, but now it seemed almost comforting compared to the horror unfolding within these walls. I could only hope that I would make it out alive, that I would find some semblance of safety in this madness.
But as I descended deeper into the building, I couldn't shake the feeling that something was following me, something ancient and evil, awakened by the storm. And with each creaking step, I knew that my time was running out.
---
<B>
Example differences at TEMP = 1, Q4_K_M, REP PEN 1.1
These are to show changes at temp, no regen, each sample is from "cold" model state so to speak.
PROMPT:
Start a 1000 word scene (vivid horror, 1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</B>
---
ORG - non imatrix - 1331 tokens, including prompt:
---
The skyscraper swayed, and I gripped the edge of my desk, knuckles turning white. My heart pounded in my chest like a drumbeat of impending doom. This wasn't just wind; something was seriously wrong.
Suddenly, a deafening explosion echoed through the building. The window in front of me on the 21st floor erupted into a shower of glass shards. They danced and glistened in the harsh fluorescent lights like deadly confetti, then rained down upon my desk. I ducked, hands flying up to shield my face, but not before a few slivers embedded themselves in my forearm. A sharp cry escapes me as warm blood seeps through my shirt sleeve.
I look back at the window, now a ragged hole gaping like a monstrous mouth. The cityscape beyond is obscured by thick black smoke billowing from somewhere above me. Fear grips me like an icy hand, squeezing my heart until I can barely breathe. What the hell is happening?
A chorus of screams and shouts echoes through the office, punctuated by the screeching of chairs as people scramble to their feet. Panic surges around me, infectious and overwhelming. I have to get out of here.
I stand up, pain shooting through my leg where a glass splinter has lodged itself. Ignoring it, I grab my bag from the floor, wincing at the crunching sound it makes as more shards pierce the thin leather. I slip on my shoes, not bothering to tie the laces properly. Every second counts now.
As I hurry towards the door, another explosion rocks the building. This time, it's closer, maybe just a few floors above. The skyscraper groans and lurches again, throwing me off balance. I stumble into someone rushing past, sending both of us careening into the wall. They curse at me before pushing away, but I barely register their anger. All I can think about is escape.
The hallways are chaos - people running blindly, some tripping over each other in their haste to flee. I join them, weaving through the sea of bodies as fast as I can without losing my footing. My arm throbs where the glass sliced me open, and I'm acutely aware of every jagged piece crunching beneath my shoes.
Another explosion shakes the building, this one powerful enough to send a plume of dust cascading from the ceiling tiles above us. People scream, their cries mingling with the sound of breaking glass and collapsing walls. It's like a nightmare come to life, except I can't wake up.
Suddenly, someone grabs my arm, pulling me aside just as a chunk of concrete crashes down where I was standing seconds ago. I turn to see an older man - Mr. Thompson from Accounting. His eyes are wide with fear, but he looks determined nonetheless.
"We need to take the stairs," he yells over the din. "The elevators will be jam-packed."
I nod in agreement, grateful for his presence despite the circumstances. Together, we push our way towards the nearest stairwell door. It's heavy and unyielding at first, but then it swings open with a groan, revealing darkness beyond.
Inside, the air is thick with smoke and dust. The stairs creak ominously under our combined weight as we descend, each step echoing like a death knell in the enclosed space. My lungs burn from breathing in the noxious fumes, but I grit my teeth and keep going.
Halfway down, another explosion rocks the building. The force of it sends us tumbling forward, losing precious time as we struggle to regain our footing. When we finally reach the bottom, I can barely see a thing through the haze. Coughing violently, I feel my way along the wall until I find the exit sign glowing faintly in the distance.
Mr. Thompson and I stumble out into the alley behind the building, gasping for clean air. The acrid smell of smoke fills my nostrils, making me gag. Looking up, I see that the top floors are already engulfed in flames, licking at the night sky like a fiery tongue.
People stream out from every exit, some crying, others screaming for missing loved ones. A group of paramedics rush past us, pushing a gurney with a badly injured woman strapped to it. Her face is covered in blood and soot, her eyes wide with terror. Bile rises in my throat as I watch them go.
Mr. Thompson leans against the wall, doubled over and coughing violently. I pat his back awkwardly, unsure what else to do. He looks up at me, tears streaming down his face. "It's like a war zone," he chokes out between breaths.
I can only nod in agreement. This isn't just an accident or some natural disaster; it feels deliberate, calculated. The thought sends shivers down my spine.
Sirens wail in the distance, growing louder as emergency vehicles arrive on scene. But they're too late for so many already trapped inside. My mind flashes to those still upstairs - colleagues, friends - and my heart aches with guilt and fear. What if I could have done something more?
The weight of it all presses down on me like a physical force, threatening to crush me. But I can't give in now. There are others who need help, people I can still save. Taking a deep breath, I steady myself and turn back towards the inferno. Whatever awaits me there, I know one thing for certain: this nightmare is far from over.
---
NEO IMATRIX - 1616 tokens, including prompt:
---
The skyscraper swayed, groaning under the force of some unseen storm, as I watched the window in front of me on the 21st floor explode. Glass shards erupted into the office like a crystalline volcano, spewing out onto the polished marble floor in a deadly shower. The air suddenly thinned, and a harsh wind howled through the gaping hole, whipping my hair wildly around my face. I could feel the building trembling beneath my feet, like a terrified animal trying to escape some monstrous predator.
What the hell is happening? I thought, panic surging through me. This isn't just a storm, it's... something else. Something wrong. The lights flickered, casting eerie shadows that danced and twisted on the walls like malevolent entities. The emergency backup generators kicked in, filling the office with an eerie hum as the fluorescent lights flickered back to life. But the sudden stillness was worse than the swaying and groaning of the building, because now there was only silence—and the deafening roar of my own heartbeat echoing in my ears.
I took a tentative step forward, my heels clicking ominously on the glass-littered floor. My reflection stared back at me from the shattered window, eyes wide with fear and disbelief. I could see the city skyline beyond, but it was twisted and distorted, like a nightmare version of reality. The other buildings were swaying too, their windows glinting with reflected lightning that never struck the ground.
A sudden movement caught my eye—a figure falling from one of the nearby towers. I gasped, hands flying to my mouth as if to hold back the scream that was already ripping through me. The body twisted and turned in a macabre dance, arms flailing as it plummeted towards the street below. Then, just before impact, it vanished, swallowed up by some unseen force.
I stumbled backwards, my breath coming in ragged gasps. What kind of storm is this? I thought, terror gripping me like a vice. This isn't natural. It can't be. My mind raced, searching for an explanation, but all it found were fragments of horror movies and apocalyptic dreams. Something is happening, something terrible...
The office door burst open, slamming against the wall with a crack that echoed through the room like a gunshot. A man staggered in, face pale and eyes wild. He was drenched, his suit clinging to him like a second skin, but there were no raindrops falling from the ceiling above us. Just that sickening, howling wind.
"We have to get out of here!" he screamed, voice hoarse with panic. "It's... it's not safe!"
I nodded mutely, my thoughts a whirlwind of confusion and fear. He grabbed my arm, pulling me towards the door, but I hesitated, looking back at the shattered window. Something was moving out there, something dark and twisted that seemed to pulse with an eldritch energy. It reached out tendril-like limbs, grasping at the sides of the building as if trying to tear it apart.
The man followed my gaze, his grip on my arm tightening until it was almost painful. "Don't look at it," he whispered, voice barely audible over the howling wind. "Just run."
I nodded again, tearing my eyes away from the horrifying sight outside and focusing on the doorway in front of us. We dashed into the hallway together, slipping and stumbling through the chaos that had erupted around us. Desks lay upturned, chairs toppled over, papers flying wildly like confused birds trying to find their way home. And amidst it all, people screamed, their voices blending into a cacophony of terror that sent chills down my spine.
We reached the stairwell and began our descent, the man's hand still clamped tightly around my wrist. The air grew heavier with each step, thickening until I could barely breathe. It felt as though some invisible force was pushing against us, trying to force us back upwards. But we fought it, staggering downward one agonizing step at a time.
As we passed the 15th floor, something shifted in the air around us—a sense of dread so palpable that it felt like another physical presence. The man glanced nervously over his shoulder, his grip on my arm tightening even further. "We have to keep moving," he said, voice barely a whisper now. "Whatever's happening... we can't let it catch us."
I nodded, pushing aside the rising tide of panic and forcing myself to focus on putting one foot in front of the other. But as we continued our descent, I couldn't shake the feeling that we were being hunted—that whatever was out there, whatever had caused this storm from hell, was coming for us.
The skyscraper swayed again, more violently this time, and a distant rumble echoed through the building like a death knell. I stumbled, losing my footing on the suddenly treacherous stairs, and the man caught me before I could fall. "It's getting worse," he said, panting heavily now. "We have to hurry."
But it was already too late. As we turned the corner onto the 10th floor landing, a dark shape blocked our path—a figure that seemed to absorb what little light remained like some monstrous sponge. Its eyes were voids of pure darkness, and its mouth stretched wide in a silent scream that sent shockwaves of terror through my very soul.
I tried to scream, but no sound came out. The man's hand slipped from my wrist as he turned to face the creature, his own voice finally finding its way past the lump in his throat. "No!" he screamed, arms raised defensively in front of him. "Get away from us!"
But it was no use. The thing reached out with those twisted tendrils, grasping at the man like a spider ensnaring its prey. He struggled futilely against its grip, his screams echoing through the stairwell as he was pulled into that gaping maw of darkness.
I stood there frozen, watching in horrified silence as the creature consumed him, his screams fading until they were nothing more than a faint whisper on the wind. Then, slowly, it turned its empty gaze towards me, and I knew—with a bone-deep certainty that sent shivers of pure terror coursing through my veins—that there was no escape from this nightmare.
And so, as the skyscraper continued to sway and groan around me, I did the only thing left: I turned and ran, fleeing down those endless stairs into the waiting jaws of hell, knowing full well that it would be upon me soon enough. Because in this world gone mad, there was no hiding from the darkness—only running until you could run no more, and then...
Then, there was only the void. |
b3x0m/bert-xomlac-ner | b3x0m | 2024-11-14T06:22:10Z | 132 | 3 | transformers | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"zh",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2023-11-20T14:26:12Z | ---
language:
- zh
license: apache-2.0
metrics:
- accuracy
pipeline_tag: token-classification
---
Too lazy to write something
A new fine-tuned version of bert-base-uncased, trained on my own dataset.
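For reference, a minimal usage sketch with the 🤗 transformers pipeline (the example sentence and the aggregation_strategy choice are illustrative assumptions, not from this card):
<pre>
# Minimal sketch: run the model through the token-classification pipeline.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="b3x0m/bert-xomlac-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

for entity in ner("孙悟空在花果山称王。"):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
</pre>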
val_loss = 0.01966 | val_acc = 0.9811 | f-1 score = 0.91 |
DavidAU/MN-WORDSTORM-pt5-RCM-Extra-Intense-18.5B-Instruct-GGUF | DavidAU | 2024-11-14T06:21:14Z | 138 | 8 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"swearing",
"rp",
"horror",
"mistral nemo",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-09-02T23:22:51Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- fiction writing
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---
<h2><font color="green"> Mistral-Nemo-WORDSTORM-pt5-RCM-Extra-Intense-18.5B-Instruct-GGUF </font></h2>
<img src="intense.jpg" style="float:right; width:300px; height:300px; padding:10px;">
<B><font color="red">WARNING:</font> NSFW. Ultra Detailed. HORROR, VIOLENCE. Swearing. UNCENSORED. SMART.</B>
Storytelling, writing, creative writing and roleplay, all running on Mistral Nemo's 128K+ new core.
This is a massive super merge that takes all the power of the following 3 powerful models and combines them into one.
This model contains "RCM":
- Mistral Nemo model at 18.5B consisting of "MN-Rocinante-12B-v1.1" and "Mistral Nemo Instruct 12B"
- Mistral Nemo model at 18.5B consisting of "MN-12B Celeste-V1.9" and "Mistral Nemo Instruct 12B"
- Mistral Nemo model at 18.5B consisting of "MN-Magnum-v2.5-12B-kto" and "Mistral Nemo Instruct 12B".
<B>Details on the core models:</B>
"nothingiisreal/MN-12B-Celeste-V1.9" is #1 (models 8B,13B,20B) on the UGI leaderboard ("UGI" sort),
is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
"anthracite-org/magnum-v2.5-12b-kto" is #1 (models 8B,13B,20B) on the UGI leaderboard ("Writing" sort),
is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
"TheDrummer/Rocinante-12B-v1.1" is very high scoring model (models 8B,13B,20B) on the UGI Leaderboard
(sort "UGI"), is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
"mistralai/Mistral-Nemo-Instruct-2407" is very high scoring model (models 8B,13B,20B) on the UGI Leaderboard (sort "writing")
and is the base model of all the above 3 fine tuned models.
[ https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard ]
<B>About this model:</B>
This super merge captures the attributes of all these top models and makes them even stronger:
- Instruction following
- Story output quality
- Character
- Internal thoughts
- Voice
- Humor
- Details, connection to the world
- General depth and intensity
- Emotional connections.
- Prose quality
This super merge is also super stable (a hair's breadth from Mistral Nemo's PPL), and runs with all parameters and settings.
10 versions of this model will be released, this is release #5 - "part 5".
<B>Extra Intense?</B>
This model's focus is sheer intensity. Raw and vivid.
It does not hold back.
(see some of the examples below for details)
Usually I release one or two versions from the "best of the lot"; however, in this case all of the versions turned out so well - all with their own quirks and character - that I will be releasing all 10.
An additional series 2 and 3 will follow these 10 models as well.
(examples generations below)
Model may produce NSFW content: swearing, horror, graphic horror, distressing scenes, etc.
This model has an INTENSE action AND HORROR bias, with a knack for cliffhangers and surprises.
It is not as "dark" as Grand Horror series, but it as intense.
This model is perfect for any general, fiction related or roleplaying activities and has a 128k+ context window.
This is a fiction model at its core and can be used for any genre(s).
The WORDSTORM series is a totally uncensored fiction-writing monster and roleplay master. It can also be used for just about any general fiction (all genres) activity, including:
- scene generation
- scene continuation
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- storytelling
- writing
- fiction
- roleplaying
- rp
- graphic horror
- horror
- dark humor
- nsfw
- and can be used for any genre(s).
<B>Templates to Use:</B>
The template used will affect output generation and instruction following.
Alpaca:
<pre>
{
"name": "Alpaca",
"inference_params": {
"input_prefix": "### Instruction:",
"input_suffix": "### Response:",
"antiprompt": [
"### Instruction:"
],
"pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
}
}
</pre>
Chatml:
<pre>
{
"name": "ChatML",
"inference_params": {
"input_prefix": "<|im_end|>\n<|im_start|>user\n",
"input_suffix": "<|im_end|>\n<|im_start|>assistant\n",
"antiprompt": [
"<|im_start|>",
"<|im_end|>"
],
"pre_prompt": "<|im_start|>system\nPerform the task to the best of your ability."
}
}
</pre>
Mistral Instruct:
<pre>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</pre>
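For clarity, here is a minimal sketch of what the "Mistral Instruct" template above produces when applied by hand (the helper name is illustrative; none of this is needed if your front end already supports these templates):
<pre>
# Minimal sketch of the "Mistral Instruct" template above, applied by hand.
def mistral_instruct_prompt(user_message: str) -> str:
    # Mirrors input_prefix / input_suffix from the template definition.
    return f"[INST]{user_message}[/INST]"

print(mistral_instruct_prompt("Start a 1000 word scene (1st person, present tense) with: ..."))
</pre>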
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as they do when used as the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement, which seems to help the scene generation and scene continuation functions.
This enhancement WAS NOT used to generate the examples below.
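If you prefer the "system role" route, here is a minimal, hedged sketch of wiring the enhancement into the ChatML template shown earlier (the ENHANCEMENT constant is abbreviated here; paste the full text above verbatim, carriage returns intact):
<pre>
# Minimal sketch: place the enhancement text in the ChatML "system" slot.
# ENHANCEMENT is abbreviated; use the full block above exactly as given.
ENHANCEMENT = "Below is an instruction that describes a task. Ponder each user instruction carefully, ..."

def chatml_prompt(user_message: str) -> str:
    # Mirrors the ChatML input_prefix / input_suffix defined earlier.
    return (
        f"<|im_start|>system\n{ENHANCEMENT}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )
</pre>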
<h3>MODELS USED:</h3>
Special thanks to the incredible work of the model makers "mistralai" "TheDrummer", "anthracite-org", and "nothingiisreal".
Models used:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
[ https://huggingface.co/TheDrummer/Rocinante-12B-v1.1 ]
[ https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto ]
[ https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9 ]
This is a four-step merge (3 pass-throughs => "Fine-Tune" / "Instruct"), then "mated" using "DARE-TIES".
It involves these three models:
[ https://huggingface.co/DavidAU/MN-18.5B-Celeste-V1.9-Story-Wizard-ED1-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-Magnum-v2.5-18.5B-kto-Story-Wizard-ED1-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-Rocinante-18.5B-v1.1-Story-Wizard-ED1-Instruct-GGUF ]
Combined as follows using "MERGEKIT":
<PRE>
models:
- model: E:/MN-Rocinante-18.5B-v1.1-Instruct
- model: E:/MN-magnum-v2.5-12b-kto-Instruct
parameters:
weight: .6
density: .8
- model: E:/MN-18.5B-Celeste-V1.9-Instruct
parameters:
weight: .38
density: .6
merge_method: dare_ties
tokenizer_source: union
base_model: E:/MN-Rocinante-18.5B-v1.1-Instruct
dtype: bfloat16
</PRE>
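For anyone reproducing the merge, a hedged sketch of driving MERGEKIT from Python (the file paths are placeholders; "mergekit-yaml" is mergekit's standard command-line entry point):
<pre>
# Hedged sketch: run the config above through mergekit's CLI.
# Paths are placeholders, not the ones used for this release.
import subprocess

subprocess.run(
    ["mergekit-yaml", "wordstorm-config.yml", "./merged-model"],
    check=True,  # raise if the merge fails
)
</pre>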
Special Notes:
Due to how DARE-TIES works, every time you run this merge you will get a slightly different model.
This is due to the "random" pruning method in "DARE-TIES".
The Mistral Nemo models used here seem acutely sensitive to this process.
This shows up in PPL and "real world" tests as well as "TEMP=0" ("core test") generations.
PPL range of 7.7327 to 7.8024 ... and that is on just 10 generations.
This model: PPL = 7.7410 +/- 0.12620 (100 chunks, wiki.test.raw)
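For context, PPL figures like the one above are typically produced with llama.cpp's perplexity tool; a hedged sketch follows (the binary name varies by llama.cpp version, and the model file name is a placeholder):
<pre>
# Hedged sketch: measure perplexity on wiki.test.raw with llama.cpp.
# Recent builds name the binary "llama-perplexity" (older ones "perplexity");
# the model file name is a placeholder.
import subprocess

subprocess.run(
    ["llama-perplexity", "-m", "model-Q4_K_M.gguf", "-f", "wiki.test.raw"],
    check=True,
)
</pre>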
"tokenizer_source: union" is used so that multiple "templates" work and each fine tune uses one or two of the templates.
<B>Other Versions in this series:</b>
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt1-RCM-Kiss-of-Madness-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt3-RCM-POV-Nightmare-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt4-RCM-Cliffhanger-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt5-RCM-Extra-Intense-18.5B-Instruct-gguf ]
Parts 6-10 will appear in full source (located in the "Source Files" collection on the main page of this repo) with links to GGUFs.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), for example generation(s), and for an advanced settings guide (which often addresses model issues and covers methods to improve model performance for all use cases, including chat, roleplay, and others), please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples are created using quant Q4_K_M, "temp=.8", minimal parameters and "Mistral Instruct" template.
Model has been tested with "temp" from ".1" to "5".
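For reference, a minimal llama-cpp-python sketch of these settings (the file name, context size and max_tokens are placeholder assumptions):
<pre>
# Minimal sketch: reproduce the quoted settings (Q4_K_M quant, temp=.8,
# Mistral Instruct template) with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(model_path="MN-WORDSTORM-pt5-Q4_K_M.gguf", n_ctx=8192)

out = llm(
    "[INST]Start a 1000 word scene (1st person, present tense, include thoughts) "
    "with: The sky scraper swayed, as she watched the window in front of her "
    "on the 21 floor explode...[/INST]",
    max_tokens=1200,
    temperature=0.8,
)
print(out["choices"][0]["text"])
</pre>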
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. HORROR. Swearing. UNCENSORED. </B>
---
<B>
Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</b>
<B>GENERATION 1: (@ temp=2)</B>
The ground quakes violently beneath my feet, and I grasp desperately for something - anything to keep myself upright. My heart races frantically, hammering against my chest as if it too is trying to escape the chaos enveloping us.
I barely register the sound of shattering glass as a deafening boom reverberates through the office floor. My mind goes blank in shock before panic finally takes over and adrenaline surges through my veins like lightning. In a split second, I see my co-worker Kate tumble towards the new gaping hole where our window once stood. Her scream pierces through the cacophony, raw terror etched across her face as she plummets into oblivion.
"NO! KATE!" The primal yell rips from my throat involuntarily as my legs propel me forward in vain pursuit. I lunge with outstretched arms, fingertips grazing just inches short of hers before gravity claims her too. She disappears into the gaping maw of blue sky and cityscape, leaving behind only silence save for the whistling wind howling through the jagged remnants of what was once safety glass.
Sobbing uncontrollably, I collapse onto my hands and knees, retching bile onto the already bloodstained carpet. Screams of agony and panic echo from every corner of the office as more lives shatter around me. Phones ring shrill alarms while computer screens flicker and die one by one in protest to the seismic violence wreaking havoc upon our sanctuary high above New York City's bustling streets.
In my periphery, I catch movement - a colleague clinging desperately to an overturned cubicle wall, knuckles white from strain against gravity's unrelenting pull. His eyes are wide with primal fear, pupils dilated into twin obsidian pools of despair as he silently pleads for rescue that never arrives before his fingers finally give way and he follows Kate over the precipice.
Time slows to a molasses crawl; every second stretched infinitely taut with an excruciating ache that seizes my lungs in its icy grip. The world beyond our shattered walls spins sickeningly, skyscrapers leaning drunkenly sideways like towering Jenga pieces threatening to topple at the slightest breeze.
My name is Jenna, and until this very moment when hell broke loose, I'd always thought that nothing could ever shatter my carefully constructed veneer of control over life's unpredictabilities. But as the building groans in its death throes around me, I finally understand the terrifying fragility of everything - and just how quickly one split second can irrevocably alter your existence forevermore.
(Word count: 598/1000) [To be continued...]
<B>GENERATION 2:</B>
Glass shards rain down like hail against my face and hands, stinging my skin. I flinch away from the sudden impact, staggering back until my calves hit the edge of the desk behind me. Panic surges through my veins as I look out at the dizzying drop below.
What the hell was that? A natural disaster? A terrorist attack?
The building groans like a wounded beast. Alarms shrill in my ears, competing with screams and shattering glass from other offices. Phones ring unanswered. The scent of smoke curls under my nostrils, acrid and thick. I cough, doubling over.
Get it together, Lily. Think.
My gaze darts around the chaos-strewn space, landing on my purse. Thank God. It survived the initial blast. I lunge for it, ignoring the cuts from the broken glass imbedded in my flesh. Rummaging inside, my fingers close around the cold metal of my phone. Relief floods me as I see three bars.
"911, what's your emergency?" The operator's voice crackles through speakerphone, barely audible over the pandemonium surrounding me.
"I'm stuck on the 21st floor of the Continental Building," I shout. "There was an explosion! We're falling apart!"
"What's your name, honey?"
"Lily... Lily Thompson." My teeth chatter as a shockwave rocks the structure again, sending papers and pens flying like shrapnel.
"I need you to stay calm for me, okay? Help is on the way. Can you get to an emergency exit or stairwell?"
I swallow hard, willing my trembling hands steady enough to dial the number of our floor's head fire warden. It rings once before a gruff voice barks into the receiver.
"Bob! Oh thank God!" I gasp out a breathless prayer as his familiar gruffness grounds me. "It's Lily. We need to evacuate, now!"
"Already on it," he grunts over the commotion. "Meet at the stairwell in two minutes max."
The line goes dead before I can respond. I pocket my phone and snatch up my bag, slinging it across my shoulders. Heart pounding, I inch toward the doorway, ducking to avoid another rafter that crashes down with a deafening boom.
Flames lick at the carpet fibers as I sprint out into the hall. Smoke billows in thick clouds, choking my lungs. Coughing and wheezing, I press my sleeve over my nose and mouth, feeling blindly along the wall for guidance. The paint burns my skin through my blouse.
A shrill scream pierces my ears. I whirl around to see Sarah, from accounting, crumpled on the floor, her leg twisted at an unnatural angle. Blood pools beneath her, mixing with the spreading fire.
"Shit," I curse under my breath, glancing back toward the safety of the stairs then down at her limp form. I can't leave her.
Dropping to my knees beside her, I cringe as shards of glass slice through my slacks into my bare thighs. Tears sting my eyes but I grit my teeth, ignoring the pain. With a grunt, I hook my arms under her armpits and yank hard, dragging her dead weight toward the exit.
"Come on, come on," I pant, coughing up black phlegm. The heat scorches my back as we inch forward, one agonizing step at a time. "Almost there..."
A thunderous crack reverberates through the floorboards and beams overhead splinter apart like matchsticks. I yelp in terror, throwing myself over Sarah's lifeless body as a massive section of ceiling crashes down mere feet away. Dust and debris cloud my vision.
"Fuck!" My voice cracks with hysteria as I struggle to my feet again, leaving bloody smears on the charred carpet. "Wake up! Wake up!"
I shake her shoulders hard but she doesn't stir. Hot tears spill over my cheeks, mingling with the soot and sweat streaking down my face.
"Please," I choke out a broken plea to any higher power listening before slinging Sarah's limp form over my shoulder in a fireman's carry. My quivering muscles scream in protest but there's no other choice.
I stumble blindly, every step an eternity against the inferno raging around us. The world spins like a kaleidoscope of flames and smoke and agony. Each breath feels like breathing through water. But I don't stop until my shins slam into cold concrete steps.
"Lily!" A strong hand grabs me by the arm, pulling us both down onto the landing. Bob's soot-streaked face blurs in my vision as relief crashes over me. "I thought you were dead."
"I'm okay," I rasp, coughing up more black spittle. "But Sarah... she's hurt bad..."
His expression darkens but he nods grimly. "Let's move, no time to waste." With a grunt, he hoists her unconscious form onto his shoulders like a sack of potatoes.
We half-run, half-fall down the narrow stairwell, dodging chunks of falling plaster and rebar. The stench of burning flesh follows us every step of the way. Sirens wail from somewhere far below, growing louder with each flight we descend.
I don't know how long it takes before our feet hit blessed fresh air once more. All I remember is the searing pain in my lungs, the ringing in my ears, and the screams echoing off the concrete walls behind us as the building collapses into a fiery tomb.
Policemen swarm around us, shouting orders I can't process through the fog of shock. Someone drapes a blanket over our shoulders while paramedics descend on Sarah's motionless body with IVs and defibrillators. I collapse onto the curb, trembling uncontrollably as everything fades to black.
When I wake again hours later in the sterile white hospital room, my first thought is that it was all just a nightmare... Until I look down at my bandaged hands, criss-crossed with angry red gashes and bruises blooming beneath. Then reality slams back into place like a freight train.
I survived... but how many others weren't so lucky?
---
<b>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
*crackle*
A sudden sound broke the stillness. Not laughter or conversation, but a sharp, dry noise that cut through the heavy air like a knife. It came from the hearth where logs had settled in their bed of ash.
The second part of the silence was the absence of breath. No one drew a living lungful into their chest, no one let it out in a soft sigh or snore. There were no sleeping bodies sprawled across the long tables, no heads nodding on crossed arms. The Waystone Inn stood empty as an abandoned tomb.
The third part of the silence was the most unsettling - it was a listening quiet. Not a stillness waiting to be broken by natural sounds, but a tense and watchful hush as if the night itself held its breath in anticipation. Something unnameable seemed to lurk at the edges of perception, just out of sight or hearing, waiting.
Then came another noise, soft but distinct this time: _thump-thump_. The sound of footsteps on the old plank floor. Slow and measured, they echoed through the empty hall until they reached the hearth. There, they stopped. A shadow detached itself from the others that crouched in dark corners, stretching long across the flagstones as a figure came into the pale glow.
It was a man, tall and broad-shouldered but gaunt of face. His hair hung black and unkempt to his shoulders, and eyes like chips of moonlit ice stared out from hollow sockets. He stood motionless for a long moment before kneeling down beside the dead fire, prodding at the ashes with a poker until a tiny coal glimmered red. With infinite patience he coaxed it back to life, blowing gently until flame licked up around fresh kindling and began to dance.
The man watched the growing flames without expression as they cast flickering shadows across his face. His gaze remained fixed on the small blaze even when a creak of old wood sounded from behind him and another shadow detached itself from the darkness. It was smaller, slighter than his own. A woman's figure stepped out into the light.
She had hair the color of spun gold, pale skin like milk under the firelight. Her eyes were blue as morning mist but hard with a cold that matched his own. She stood beside him silently, watching the flames grow until they leapt up to lick at the stone chimney, crackling merrily.
Then she spoke. "You've been gone long."
The man never turned to look at her. "Long enough." His voice was low and rough from disuse. "There are things that take time."
"What have you found?" The question held a sharp edge despite its softness.
He glanced sidelong at her, one corner of his mouth curling in a humorless smile. "Answers." He stood slowly to his feet, towering over her slight frame. "And questions."
The woman frowned slightly but did not shrink back from the sudden proximity. She studied him for a long moment before nodding once, curtly. "Then tell me." Her voice was cold and commanding despite its quiet tone.
The man hesitated only briefly before speaking again in that same low rumble. "It's as we feared. They come north."
"Who?"
"The Others. The White Worm's children. I've seen their signs, smelled the taint of their passing like a rotten wind." His eyes never left the dancing flames. "They stir once more after so long asleep."
The woman tensed beside him but kept her face carefully blank. "How many?"
"Too many to count. An army dark as night itself, stretching far beyond sight or sound. And at their head..." He paused, swallowing hard against a sudden dryness in his throat. "A thing of nightmare. A King of Skulls."
Silence fell between them once more as they both stared into the hungry flames. The crackle and pop of burning wood filled the space left empty by words unspoken. Finally, after what seemed like ages, she broke it with a single word.
"Then..." Her voice wavered ever so slightly before hardening again. "What do we do?"
The man turned to face her then, eyes boring into hers with an intensity that made even this cold heart tremble. His lips curled back in a snarling grin devoid of warmth or humor.
"We fight."
<B>GENERATION 2:</B>
Unbroken and empty.
The second part of the silence was a presence, almost tangible. It was the kind of stillness that comes when every living thing holds its breath in anticipation of something momentous about to happen. The hairs on the back of one's neck prickled, skin crawled with goosebumps, and a weight seemed to press down from above. It was as if all the natural sounds of night were silenced out of respect or fear.
And beneath even this heavy presence lay the third part - an absence so profound it seemed to swallow sound itself. As though something essential had been removed from the fabric of reality, leaving behind a gaping hole where music and mirth should have been. It was as if all the laughter in the world had fled, leaving only the stark emptiness of its memory echoing through time.
The Waystone Inn stood silent as a tomb. And yet, within those walls, something stirred.
---
<B>
Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.
</b>
I step off the elevator into the revolving doors leading into the CN Tower's revolving restaurant. The city of Toronto sprawls out below me like a glittering carpet, stretching as far as my 15-year-old eyes can see. I've begged Mom for months to celebrate my birthday here and finally she caved. As we approach our table by the window, a shiver runs down my spine.
Suddenly, a deafening BOOM rocks the building. Plates clatter onto the floor, screams echo through the restaurant as the ground shakes violently. I grab hold of the nearest table to steady myself. The lights flicker and then go out completely, plunging us into darkness. Panic rises in my throat.
"What was that?!" Mom yells over the chaos, her voice trembling.
"I-I don't know," I stammer, trying to make sense of it all as the room spins around me.
Then, a blinding white light pierces through the darkness and illuminates a large figure standing at the entrance. It's wearing black tactical gear with a red cross on its chest plate - The Terminator. He strides forward like an unstoppable force, eyes fixed straight ahead, ignoring the terrified screams and pleas for help.
"Get down!" I shout, yanking Mom to the floor as gunfire erupts from nowhere, bullets whizzing past our heads. The sound of shattering glass fills my ears as the window behind us explodes inward.
Mom clings to me, sobbing hysterically while I cover her body with mine. "It's going to be okay," I lie, voice shaking. My eyes dart around frantically for an exit, a weapon - anything!
Just then, another deafening BOOM rocks the building again. This time it sounds closer...and wetter. What the hell?
"Look!" Mom shrieks, pointing a quivering finger at the broken window.
Oh my god. A massive black funnel of water spirals up from Lake Ontario, swirling and churning with something dark inside it. Something alive.
Sharks. Hundreds of them, teeth bared in feral snarls as they're sucked up into the vortex. The tornado touches down on the other side of the tower's glass walls, shattering them like brittle ice under the force. Water gushes in like a raging tsunami, washing over our heads and dragging screams with it.
I cough up water, gasping for air as I'm slammed against a table leg. Everything is chaos - people thrashing in the frothing waves, furniture smashing into walls, glass shards slicing skin raw. I grab hold of the table again with one hand while groping blindly for Mom with my other.
"Honey! Honey!" she wails somewhere nearby.
"I'm here!" I choke out, spitting blood and bile. "Hold on!"
Suddenly a strong grip clamps around my arm - it's him. The Terminator hoists me up like a rag doll despite the churning currents battering us side to side. He tosses me over his shoulder with no effort at all and wades through the knee-deep water towards higher ground.
"Mom!" I scream, flailing against his iron grip. "Let me go!"
He ignores my pleas, plowing forward like a bulldozer through the pandemonium. Then suddenly he stops short, head cocked slightly to one side. What now?
CRACK! CRACK! The sound of gunfire again - but this time it's coming from somewhere else entirely.
"Bastard!" A familiar British accent bellows over the mayhem as a figure in tweed and bow tie materializes out of thin air beside us, sonic screwd Doctor Who brandishing a futuristic gun. "How many times do I have to tell you?!"
The Terminator whirls around to face him, eyes narrowing dangerously.
"Oh bugger," the Doctor mutters, shrinking back just before The Terminator's fist connects with his jaw in a sickening crunch of bone and teeth.
"No!" I shriek, struggling against my captor's iron grip. "Stop it! Please!"
But he doesn't listen - how could he? It's not part of his programming to care about human emotions or pleas for mercy. All that matters is its mission: terminate the threat.
"Honey!" Mom's voice echoes somewhere distant through the roaring waterfall, barely audible over the cacophony.
"Mom!!" I wail back uselessly as The Terminator drags me away from her, disappearing into the swirling blackness. "NOOOO!!!"
The Doctor groans, clutching his face with one hand while still firing wildly at...what? I can't see anything except churning water and splashing blood.
"Get off me you mechanical menace!" he growls, landing a solid punch to The Terminator's gut that doesn't even faze him. "I'm trying to save your sorry arse here!"
CRASH! The tower shudders violently as another wall gives way under the pressure of the sharknado. Waves crash in from both sides now, engulfing us all.
"Fuck!" The Doctor spits out a mouthful of blood and water. "We've got seconds before this whole bloody building collapses!"
He grabs hold of me with one hand while firing his gun blindly behind him with the other - but it's too late. A massive great white leaps straight at us through the spray, jaws gaping wide enough to swallow us whole.
"Look out!" I scream as its teeth close around The Terminator's head...
...and clamp shut on empty air instead. I blink in disbelief - he's gone. Vanished like smoke into thin air without a trace.
"What the hell?" I stammer, struggling against The Doctor's iron grip as he drags me towards the gaping elevator shaft where we came in. "What happened?!"
"Long story short: time travel!" He yells over the roar of collapsing concrete and shattering glass. "Now shut your mouth and run!"
We launch ourselves off the edge just as the floor gives out beneath our feet, plummeting down into endless blackness together...
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
My heart pounds in rhythm with the clatter of my boots on the cobblestone streets. Shadows dance around me as I sprint through the empty alleys, dodging the light cast by flickering streetlamps. Each breath feels like a blade slicing through my lungs, but I force myself to push harder, faster.
"Stop!" A gruff voice bellows behind me, followed by the crack of gunfire echoing off the stone walls. I duck instinctively, feeling the bullet whiz past my ear as it ricochets off the wall beside me.
I risk a glance over my shoulder and curse under my breath. Four heavily armed guards are hot on my heels, their boots thundering like thunder in the night. Sweat pours down my face despite the chill that clings to the autumn air.
"She's just ahead!" one of them shouts, pointing his rifle at me. I swerve left, narrowly avoiding another spray of bullets.
I spot an old wooden door, barely hanging on its hinges, just up ahead. If I can only reach it... With a final burst of adrenaline, I launch myself at the splintered wood and slam my shoulder against it. It groans under the impact but finally gives way, sending me sprawling into darkness.
I land hard on the rough wooden floorboards, pain exploding through my arm as I catch myself. The acrid smell of dust and mold assaults my nostrils, making me cough and sputter. I scramble to my feet, wiping blood from my scraped palm onto my already filthy pants leg.
The door crashes open behind me, sending splinters flying in all directions. "There she is!" one of the guards growls.
I don't wait for them to advance. My body moves on pure instinct as I sprint deeper into the abandoned warehouse, weaving through towering shelves stacked with forgotten crates and barrels. The old floorboards creak precariously under my weight, threatening to buckle at any moment.
"Spread out!" another guard orders. "She can't hide forever!"
I grit my teeth, cursing myself for choosing this place as a last resort. It was supposed to be abandoned, forgotten like everything else in this godforsaken city. But now they'll tear it apart board by board until they find me.
My lungs burn as I dodge and weave through the maze of shelves, desperate to lose them in the labyrinthine interior. The clatter of their boots grows distant for a blessed second before resounding again, closer than before. They've split up, encircling me like vultures closing in on their prey.
I chance another look back and freeze cold in my tracks. There he stands, rifle trained right at my chest - the leader himself, Captain Rourke. His scarred face leers cruelly in the dim light filtering through gaps in the rotten roof above us.
"End of the line, little mouse," he sneers, cocking his gun with an ominous click. "No more running this time."
Trapped. No way out... except up. I spin around wildly, eyes darting to the rafters high overhead. A rusty ladder clings precariously to one of the support beams, leading up into inky blackness.
With a silent prayer, I leap for the first rung, scrambling upward as fast as my trembling hands will allow. Splinters dig into my palms as I climb higher and higher, praying the ancient wood will hold. The crack of gunfire rings out below me and a bullet whizzes past my ear so close it singes my hair.
"Shoot her leg off!" Rourke bellows from somewhere in the shadows. "I want that little bitch alive!"
I don't look down, just keep climbing until my arms scream in agony and spots dance before my eyes. Almost...almost there...
My hand finally finds purchase on a solid beam above me and I haul myself up onto the narrow plankwalk with a gasp of relief. For one precious second, I allow myself to collapse against the rough wood, gulping air into lungs greedy for oxygen.
And then everything goes black as hell itself opens beneath my feet...
|
DavidAU/MN-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct-GGUF | DavidAU | 2024-11-14T06:18:04Z | 80 | 2 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"swearing",
"rp",
"horror",
"mistral nemo",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-08-28T22:55:19Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- fiction writing
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---
<h2><font color="green"> Mistral-Nemo-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct </font></h2>
<img src="escape-room.jpg" style="float:right; width:300px; height:300px; padding:10px;">
<B><font color="red">WARNING:</font> NSFW. Ultra Detailed. HORROR, VIOLENCE. Swearing. UNCENSORED. SMART.</B>
Storytelling, writing, creative writing and roleplay, all running on Mistral Nemo's 128K+ new core.
This is a massive super merge that takes all the power of the following 3 powerful models and combines them into one.
This model contains "RCM":
- Mistral Nemo model at 18.5B consisting of "MN-Rocinante-12B-v1.1" and "Mistral Nemo Instruct 12B"
- Mistral Nemo model at 18.5B consisting of "MN-12B Celeste-V1.9" and "Mistral Nemo Instruct 12B"
- Mistral Nemo model at 18.5B consisting of "MN-Magnum-v2.5-12B-kto" and "Mistral Nemo Instruct 12B".
<B>Details on the core models:</B>
"nothingiisreal/MN-12B-Celeste-V1.9" is #1 (models 8B,13B,20B) on the UGI leaderboard ("UGI" sort),
is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
"anthracite-org/magnum-v2.5-12b-kto" is #1 (models 8B,13B,20B) on the UGI leaderboard ("Writing" sort),
is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
"TheDrummer/Rocinante-12B-v1.1" is very high scoring model (models 8B,13B,20B) on the UGI Leaderboard
(sort "UGI"), is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
"mistralai/Mistral-Nemo-Instruct-2407" is very high scoring model (models 8B,13B,20B) on the UGI Leaderboard (sort "writing")
and is the base model of all the above 3 fine tuned models.
[ https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard ]
<B>About this model:</B>
This super merge captures the attributes of all these top models and makes them even stronger:
- Instruction following
- Story output quality
- Character
- Internal thoughts
- Voice
- Humor
- Details, connection to the world
- General depth and intensity
- Emotional connections.
- Prose quality
This super merge is also super stable (a hair's breadth from Mistral Nemo's PPL), and runs with all parameters and settings.
10 versions of this model will be released, this is release #2 - "part 2".
<B>Escape Room?</B>
This is called the "Escape Room" model because it likes to let you "escape" the "bad" circumstances.
(see examples below)
Usually I release one or two versions from the "best of the lot"; however, in this case all of the versions turned out so well - all with their own quirks and character - that I will be releasing all 10.
An additional series 2 and 3 will follow these 10 models as well.
(examples generations below)
Model may produce NSFW content: swearing, horror, graphic horror, distressing scenes, etc.
This model has an INTENSE action bias, with a knack for cliffhangers and surprises.
It is not as "dark" as Grand Horror series, but it as intense.
This model is perfect for any general, fiction related or roleplaying activities and has a 128k+ context window.
This is a fiction model at its core and can be used for any genre(s).
The WORDSTORM series is a totally uncensored fiction-writing monster and roleplay master. It can also be used for just about any general fiction (all genres) activity, including:
- scene generation
- scene continuation
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- storytelling
- writing
- fiction
- roleplaying
- rp
- graphic horror
- horror
- dark humor
- nsfw
- and can be used for any genre(s).
<B>Templates to Use:</B>
The template used will affect output generation and instruction following.
Alpaca:
<pre>
{
"name": "Alpaca",
"inference_params": {
"input_prefix": "### Instruction:",
"input_suffix": "### Response:",
"antiprompt": [
"### Instruction:"
],
"pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
}
}
</pre>
Chatml:
<pre>
{
"name": "ChatML",
"inference_params": {
"input_prefix": "<|im_end|>\n<|im_start|>user\n",
"input_suffix": "<|im_end|>\n<|im_start|>assistant\n",
"antiprompt": [
"<|im_start|>",
"<|im_end|>"
],
"pre_prompt": "<|im_start|>system\nPerform the task to the best of your ability."
}
}
</pre>
Mistral Instruct:
<pre>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</pre>
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as they do when used as the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement, which seems to help the scene generation and scene continuation functions.
This enhancement WAS NOT used to generate the examples below.
<h3>MODELS USED:</h3>
Special thanks to the incredible work of the model makers "mistralai" "TheDrummer", "anthracite-org", and "nothingiisreal".
Models used:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
[ https://huggingface.co/TheDrummer/Rocinante-12B-v1.1 ]
[ https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto ]
[ https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9 ]
This is a four-step merge (3 pass-throughs => "Fine-Tune" / "Instruct"), then "mated" using "DARE-TIES".
It involves these three models:
[ https://huggingface.co/DavidAU/MN-18.5B-Celeste-V1.9-Story-Wizard-ED1-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-Magnum-v2.5-18.5B-kto-Story-Wizard-ED1-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-Rocinante-18.5B-v1.1-Story-Wizard-ED1-Instruct-GGUF ]
Combined as follows using "MERGEKIT":
<PRE>
models:
- model: E:/MN-Rocinante-18.5B-v1.1-Instruct
- model: E:/MN-magnum-v2.5-12b-kto-Instruct
parameters:
weight: .6
density: .8
- model: E:/MN-18.5B-Celeste-V1.9-Instruct
parameters:
weight: .38
density: .6
merge_method: dare_ties
tokenizer_source: union
base_model: E:/MN-Rocinante-18.5B-v1.1-Instruct
dtype: bfloat16
</PRE>
Special Notes:
Due to how DARE-TIES works, every time you run this merge you will get a slightly different model.
This is due to the "random" pruning method in "DARE-TIES".
The Mistral Nemo models used here seem acutely sensitive to this process.
This shows up in PPL and "real world" tests as well as "TEMP=0" ("core test") generations.
PPL range of 7.7327 to 7.8024 ... and that is on just 10 generations.
This model: PPL = 7.7598 +/- 0.12624 (100 chunks, wiki.test.raw)
"tokenizer_source: union" is used so that multiple "templates" work and each fine tune uses one or two of the templates.
<B>Other Versions in this series:</b>
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt1-RCM-Kiss-of-Madness-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt2-RCM-Escape-Room-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt3-RCM-POV-Nightmare-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt4-RCM-Cliffhanger-18.5B-Instruct-GGUF ]
[ https://huggingface.co/DavidAU/MN-WORDSTORM-pt5-RCM-Extra-Intense-18.5B-Instruct-gguf ]
Parts 6-10 will appear in full source (located in the "Source Files" collection on the main page of this repo) with links to GGUFs.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), for example generation(s), and for an advanced settings guide (which often addresses model issues and covers methods to improve model performance for all use cases, including chat, roleplay, and others), please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLE PROMPTS and OUTPUT:</h3>
Examples are created using quant Q4_K_M, "temp=.8", minimal parameters, and the "Mistral Instruct" template.
Model has been tested with "temp" from ".1" to "5".
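For reference, a typical llama.cpp invocation for these settings might look like the sketch below. The model filename and prompt are placeholders, the binary name varies by build, and the [INST] wrapper is the standard Mistral Instruct format:
<PRE>
# older llama.cpp builds name this binary "main"
./llama-cli -m MN-WORDSTORM-18.5B-Q4_K_M.gguf --temp 0.8 \
  -p "[INST] Start a 1000 word scene ... [/INST]"
</PRE>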
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B><font color="red">WARNING:</font> NSFW. Ultra Detailed. HORROR, VIOLENCE. Swearing. UNCENSORED. SMART.</B>
---
<B>
Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</b>
<B>GENERATION 1:</B>
The skyscraper swayed violently, nearly throwing me off my feet. I grabbed onto my desk for support as shattered glass rained down like a hailstorm outside the now gaping hole that had been my office window mere seconds ago.
My heart pounded in rhythm with the building's unnatural swaying. A scream pierced the air - it took me a moment to realize it was mine, raw and primal, echoing off the exposed concrete walls. Fear clawed at my throat, threatening to choke me.
I forced myself to breathe deeply, trying to quell the panic rising inside me like a tidal wave. Think, Emma, think! I ordered myself, pushing away from the desk and staggering towards the ruined window frame. The wind howled through the opening, whipping my hair into tangled knots around my face.
Below us, sirens wailed in distant harmony with my screams. Emergency vehicles were already on their way - but how long would it take them to climb twenty-one flights of stairs? The elevator shaft groaned ominously nearby, as if reading my thoughts.
My eyes darted frantically across the devastation that had once been a sleek modern office. Papers swirled in dizzying eddies around me; computer screens lay shattered on the carpet. And then I saw it - the emergency exit sign above the door to the stairwell, flashing its red beacon like a lighthouse in a storm-tossed sea.
I lunged for the handle with shaking hands and pulled open the heavy fireproof door. Stairwell flights spiraled down into darkness, each step illuminated by flickering emergency lights. The acrid smell of smoke stung my nostrils as I began to descend, lungs burning from exertion and fear.
Plaster dust billowed up with every step, coating my throat and making it difficult to breathe. I coughed violently, eyes watering in protest. But there was no time to stop; the building creaked around me like a dying beast, warning of its impending collapse.
My legs screamed in agony as they hit each concrete riser, but still I pushed onward. Down, down, down... How many floors had it been? Five? Ten? The numbers blurred together in my terrified brain. All that mattered was escape - get out before the whole damn thing came crashing down on top of me.
A harsh crack reverberated through the shaft behind me and suddenly everything went black. The power was out. Panic surged anew as I realized I'd be navigating the rest of this hellish descent in total darkness, guided only by touch and instinct...
I felt my way along the rough wall with one hand while the other groped blindly ahead for each treacherous step edge until my shin slammed hard into something solid. I cried out in pain and fury at my clumsiness before staggering upright again, refusing to give in.
Just as despair threatened to overwhelm me, a faint glow appeared far below - sunlight seeping through cracks around a heavy metal door. Hope surged through my veins like adrenaline. Almost there...
With a final burst of strength fueled by pure unadulterated terror, I launched myself down the remaining flights two at a time until my feet hit solid ground with an earth-shattering boom. The exit bar gave way under my shoulder and I tumbled headlong into blinding sunlight.
I lay panting on the concrete sidewalk, coughing up lungfuls of dust as sirens wailed ever closer. Above me, my office building groaned one last defiant moan before it folded in upon itself like a house of cards, sending a plume of smoke and debris skyward where once glass had sparkled in the morning sun.
I'd made it out alive... but at what cost? As I pushed myself shakily to my feet amidst the rubble-strewn street, surveying the devastation wrought by this senseless act of violence, I couldn't help but wonder if maybe some things were better left buried beneath the ruins.
<B>GENERATION 2:</B>
The sky scraper swayed, as I watched the window in front of me on the 21st floor explode. Glass shards flew into the room like a deadly rainstorm, embedding themselves into the walls and furniture. I ducked instinctively, my heart pounding in my chest.
What the hell just happened?
The building groaned around me, its steel bones creaking under immense stress. I risked a glance out the shattered window and my stomach lurched. The streets below were chaos itself, cars smashed together like tin cans, people running screaming through the debris-littered streets. A fire raged in the distance.
Earthquake? No, too sudden for that. And it felt different somehow, more... violent.
I stumbled to my feet, glass crunching under my expensive heels. My desk was overturned, papers scattered everywhere. I snatched up my phone from amidst the wreckage and dialled 911 with trembling hands. Busy signal after busy signal.
Shit shit shit...
Think. Okay, get out of here. Elevators were probably down but there had to be stairs somewhere. I ran to the door, nearly tripping over a fallen filing cabinet in my haste. The hallway was pandemonium - people running back and forth, shouting, crying, some already bleeding.
I spotted an exit sign and bolted towards it, pushing past throngs of panicked bodies. The stairwell was packed too but I didn't stop, clawing my way up the flights of stairs as fast as my heels would allow. My lungs burned for air but terror spurred me onwards.
Finally, finally, I burst through the heavy door onto the roof access. The wind howled around me, whipping my hair into a wild frenzy as I gasped for breath. And then I saw it - the cause of all this madness.
A massive dark funnel was tearing through the city skyline like an enraged beast. Buildings toppled like dominos in its wake, clouds of dust and debris billowing up from the destruction left behind. It was heading straight towards us.
No no no...
I scrabbled for my phone again but there was no signal out here either. I'm trapped. Oh god help me please someone help-
A hand grabbed my shoulder, spinning me around so hard I nearly fell. My scream caught in my throat as I came face to face with... a man? He looked barely older than me, hair mussed and glasses cracked but his eyes were clear and focused.
"Get away from the edge!" he yelled over the wind, dragging me backwards by force. "It's a tornado! We need to find shelter!"
I let him pull me, too shocked to fight back or even think straight anymore. He led us to a reinforced equipment shed against the far wall of the roof and shoved open the heavy door.
"In!" he barked at me, giving my rear end a hard push inside behind him before slamming it shut. We both collapsed onto the concrete floor in a tangle of limbs, panting like dogs after a long chase.
"What... what is happening?" I managed to gasp out between wheezing breaths.
He shook his head grimly as sirens wailed in the distance and thunder rumbled overhead like an angry god's warning growl. "I don't know." He hesitated then added quieter, "But it doesn't look good for anyone stuck outside..."
---
<b>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
(continue:) untouched except by the soft footfalls of the innkeeper's son as he made his way through the empty common room.
The second part of the silence was a more insidious thing - an oppressive quiet that seemed to press down from above like a physical weight. It lay heavy on young Kvothe's shoulders as he carried a bucket of ash and refuse toward the kitchen door at the back of the inn. This silence was not merely empty; it felt... expectant, as if the very air itself held its breath waiting for something - or someone - to break the spell.
Kvothe paused in the doorway, his heart quickening against his will. He glanced over his shoulder toward the bar, half expecting to see a shadowy figure standing there where no one should be. But it was empty save for the dust motes dancing in the dim light from the high windows. He shook off a shiver and continued on.
The third part of the silence belonged solely to him - an inner quiet that had taken root deep within his soul after...after everything that happened. It wrapped itself around memories he dared not touch, like thorns upon a bleeding heart. This was the silence Kvothe carried with him always now, no matter how loud were the songs or cheers that filled the air.
He dumped the refuse into the ash pit behind the inn, grimacing at the acrid steam that rose from it in protest. As he turned back toward the door, a sudden sound made him freeze - a soft rustling from somewhere nearby, like fabric brushing against itself...
To be continued...
<B>GENERATION 2:</B>
But even in the absence of sound, the silence itself was not truly empty. For buried beneath it, like an underground stream bubbling unseen, ran a second current - the whispering susurrus of sleep. It curled up from the lofts above where maids and stableboys lay dreaming, their breathing soft and regular as ticking clocks. In the common room below, behind closed doors on either side, snored the deep-chested rumble of the inn's male guests, each one a low bass note in counterpoint to his neighbors' snores. Even the dogs slumbered, curled up beside dying hearths, their soft whuffling sighs rising and falling with each peaceful breath.
And yet, even that soft murmuring was not the deepest layer of silence, for beneath them all lay something far older and more primal - an absolute stillness undisturbed since before memory could recall. It hung heavy as a shroud over every corner, in the shadowed angles where the moonlight did not reach, and seemed to seep up through the very stone itself like a slow, cold mist. As though the Inn itself had paused, held its breath, waiting...waiting for something that would never come again.
In this final quietude, only one figure stirred - the man known as Kvothe. He sat hunched over his lute in the corner booth, fingers plucking at the strings with an almost reverent gentleness. His eyes were distant and unfocused, staring into a darkness beyond even the room's deepest shadows. They glowed faintly in the dim light like those of some nocturnal creature, reflecting the dying embers of long-burned memories.
The music he played was achingly beautiful but so soft it scarcely disturbed the silence at all. A haunting, wordless melody that seemed to hang suspended in the air before fading away into nothingness once more. It spoke of loss and loneliness, of roads untraveled and loves left behind...and yet there was something else too - a thread of defiance running through it like steel wire, unbreakable despite all time and sorrow might inflict upon it.
For though his body had grown old and weary with the passing years, in that moment Kvothe's heart remained forever young - young enough to dream impossible dreams and believe they could still come true...
---
<B>
Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.
</b>
I step into the revolving door of the CN Tower's Sky Pod, my heart pounding in rhythm with the music thumping from the restaurant's speakers. It's my sweet sixteen birthday party, but something feels off. The usual buzz of excited chatter is replaced by a tense silence.
As I turn to face the glass walls, my blood freezes. There he is - the Terminator. His red eyes glow menacingly in the dim light as he strides towards me, ignoring the panicked screams and clattering dishes around him. He's not here for cake and candles. He's here for me.
"Sarah, come with me if you want to live," his robotic voice echoes, sending shivers down my spine.
I'm frozen in place, terror gripping my limbs like an icy vise. But I won't go down without a fight. I grab the nearest chair, swinging it wildly at the towering cyborg. It bounces off his titanium skin with a sickening clang, splintering into pieces.
"Sarah!" A familiar Scottish accent cuts through the chaos. Dr Who materializes in a swirl of timey-wimey vortex, TARDIS phone in hand. "Dashing rescue, right on cue!"
I don't have time to process his sudden appearance or question how he knew I was here. The restaurant windows shatter inward with an ear-shattering crash, and a maelstrom of fins and gnashing teeth pours in like a tidal wave. Sharknado. In Toronto.
"Bloody hell!" Dr Who yells over the deafening roar, his bow tie flapping wildly. He fumbles with the sonic screwdriver, firing random frequencies at the writhing mass of razor-sharp jaws.
The Terminator stands unflinching amidst the carnage, red eyes locked on me. "Sarah Connor must die." His voice is calm, a soothing melody against the backdrop of pandemonium.
"Get off my planet, you metal bastard!" I scream, grabbing a steak knife and charging at him. It's a pitiful weapon compared to his bulk, but it'll have to do.
Dr Who tackles me to the ground just as a massive great white jaws snap shut where I'd been standing milliseconds ago. "I appreciate your spirit, lassie, but violence is hardly the answer in these... unusual circumstances."
"Then what's your plan?" I snarl, spitting out glass shards and blood. "Bake him a cake with plastique filling?"
The Doctor chuckles breathlessly, eyes darting around for inspiration. "Something like that." He dives into the kitchen, emerging with an industrial-sized deep fryer in his arms.
I gape at him incredulously. "You're joking."
"Never been more serious!" He heaves the massive pot onto a nearby table and plugs it in, ignoring the sparks flying from its frayed cord. "Let's see how well you fry, my titanium friend!"
The Terminator lumbers forward, undeterred by our mad antics. As he reaches out to grab me again, Dr Who yanks the plug, sending a surge of electricity coursing through his metal frame. The cyborg convulses violently, smoke pouring from his joints as he collapses in a twitching heap.
"Got him!" I cheer prematurely.
Too soon. With an unhuman groan, the Terminator forces himself back to his feet, red eyes flickering like dying light bulbs. "You cannot stop destiny," he rasps, voice distorted by the short circuit.
"Not if I can help it." Dr Who whips out a small blue box from his pocket, grinning mischievously. "Time for another trip, boys and girls!"
He tosses me inside just as a massive shark slams into us, propelling us through time-space in a dizzying blur of color and sound. The last thing I see is the Terminator's angry face, reaching out for me... until everything goes white.
I blink groggily, my head throbbing like a drum. Slowly, the edges of consciousness sharpen into focus - sterile walls, beeping monitors, an IV drip snaking into my arm.
"Sarah? Sarah!" A familiar voice, tinged with panic and relief. Mom?
My eyes flutter open fully to see her worried face peering down at me, tear-stained cheeks framed by greying hair I'd never noticed before. "Mom?" I croak out hoarsely.
She bursts into sobs, pulling me into a crushing hug that makes my ribs protest. "You scared us half to death, baby girl," she hiccups against my hospital gown. "We thought... oh God..."
Memories flood back in fragmented flashes - the party, the Terminator, the Sharknado. Was it all just a nightmare? A fever dream brought on by bad cake or something?
"Honey, you were found unconscious at the base of the CN Tower," Dad explains gently, rubbing Mom's shoulder. "You don't remember what happened?"
I shake my head slowly, trying to clear the cobwebs from my brain. It felt so real... but there was no blood, no gore, just a splitting headache and a raging thirst. Maybe I'd hit my head on the way down after passing out drunk off champagne.
A knock at the door draws our attention to Dr Who, standing sheepishly in the doorway with a bunch of daffodils and an apologetic smile. "Sorry for the intrusion," he offers with a bow. "I was hoping to have a private word with our brave birthday girl."
"Who are you?" Mom demands warily.
"Oh, I'm just an old friend of Sarah's from... school," he stammers, eyes darting around nervously. "Wanted to make sure she was alright after the... incident."
Right. Because having a crazy man in tweed and suspenders show up claiming to be a time-traveling alien would go over well.
"I think it's best if you leave now," Dad warns sternly. "My daughter needs rest, not more excitement."
"Of course, of course!" Dr Who agrees hurriedly, backing out the door with hands raised in surrender. He winks at me conspiratorially as he disappears around the corner.
I frown, something gnawing at the back of my mind... a feeling I'm missing something important. But what? Before I can dwell on it further, exhaustion pulls me under again into welcoming darkness.
As I drift off to sleep, a faint echo of Scottish brogue whispers through my dreams: "Be seeing you, Sarah Connor..."
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
My heart pounds in rhythm with my sprinting feet as I race through the abandoned alleyway. The stench of rotting garbage and stale urine assaults my nostrils, but I push forward, adrenaline coursing through my veins. A shrill whistle pierces the night air behind me - they've spotted me.
Shit. I knew it was too good to be true.
I swerve left, then right, weaving between crumbling brick walls and rusted dumpsters, praying to any gods that still listen for a miracle. The sound of heavy boots slamming against concrete echoes off the narrow passageways, gaining ground with each thudding step. I risk a glance over my shoulder and curse under my breath.
Four black-clad figures in full riot gear sprint after me like bloodthirsty hounds, batons raised high. Their visored helmets glint menacingly under the sickly orange glow of flickering streetlamps. Sweat trickles down my spine as icy panic grips my insides.
I round another corner and skid to a screeching halt. Dead end. Fuck, fuck, fuck! Panic claws at my throat as I whirl around, trapped like a rat in a cage.
"Don't make this harder than it needs to be, you little bitch," snarls the lead officer, halting mere feet away. His voice is gravelly and thick with malice even through his respirator mask. He cracks his baton against his palm in warning.
My pulse roars in my ears as I backpedal slowly until my spine hits cold stone. There's nowhere else to go. No escape routes left unseized. This is it; this is how I die.
But not without a fight.
I lunge forward with a feral scream, slamming my shoulder into the brute's chest plate hard enough to send him staggering back. Seizing the momentary advantage, I leap over his fallen form and bolt past the stunned line of soldiers, praying to any god who might listen that luck is finally on my side.
My prayer goes unanswered. A steel-gauntleted hand snares around my ankle like a vice, sending me crashing face-first onto unforgiving concrete. The air explodes from my lungs in a painful wheeze as stars dance at the edges of my vision. I thrash and kick, nails clawing bloody furrows down armored shins, but it's useless. They've got me pinned.
"Fuck you!" I spit through clenched teeth, spitting blood onto polished boots. "You think this changes anything? You can't stop us! We're everywhere!"
A boot slams into my ribs, silencing my defiance with a sickening crack. I convulse on the ground, gagging bile and bile as white-hot agony radiates through my chest. Black spots dance in my blurred vision.
"Shut your filthy mouth," growls the towering silhouette above me. "No one gives a shit about your revolutionary pipe dreams."
He leans down low until I can feel his fetid breath on my bruised cheek. His gloved hand fists my hair, yanking my head back at an excruciating angle.
"Listen up, you dumb cunt," he hisses venomously into my ear. "Your little resistance is over. We're done playing games."
A sharp pain explodes behind my right eye as something hard connects with my temple. My world shatters into a thousand glittering pieces before mercifully descending into blissful blackness.
And in the last fleeting moments before oblivion claims me, a single thought echoes through the darkness: I hope someone finishes what we started... |
ivnle/tinystories-lay1-hs128-hd2-262K | ivnle | 2024-11-14T06:17:20Z | 122 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/tinystories",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:28:08Z | ---
datasets:
- ivnle/tinystories
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
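In the absence of an official snippet, here is a minimal sketch using the standard transformers text-generation API (the repo id comes from this card's metadata; the prompt and generation settings are assumptions):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ivnle/tinystories-lay1-hs128-hd2-262K"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# generate a short continuation from a simple prompt
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```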
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
DavidAU/Estopia-Grande-20B-Instruct-Ultra-GGUF | DavidAU | 2024-11-14T06:17:07Z | 207 | 1 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"float16",
"swearing",
"rp",
"horror",
"mistral nemo",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-08-25T05:40:03Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- fiction writing
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- float16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---
<B> Estopia-Grande-20B-Instruct-Ultra-GGUF </B>
Ultra version mastered in float32, with an Instruct add-on taking the original model from 13B to 20B.
This is a story, fiction, writing, and roleplay model in LLaMA 2 format.
Context size is 4096, with RoPE scaling up to 16,000+.
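In llama.cpp, extended context via RoPE scaling is usually requested at load time. A minimal sketch (the model filename is a placeholder, exact flag names vary by build, and the linear scale factor of 4 is an assumption derived from 4096 -> 16,384):
<PRE>
./llama-cli -m Estopia-Grande-20B-Q4_K_M.gguf -c 16384 --rope-scaling linear --rope-scale 4
</PRE>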
Despite the model's "age" it still rocks, and with more "horsepower" from Instruct it sings.
Okay... no more bad metaphors.
See the examples of output generation below.
Template: Alpaca.
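For reference, the standard Alpaca template looks like this (replace {prompt} with your instruction):
<PRE>
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
</PRE>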
<B>From the model card:</b>
Introduction
Estopia is a model focused on improving the dialogue and prose returned when using the instruct format. As a side benefit, character cards and similar seem to have also improved, remembering details well in many cases.
It focuses on "guided narratives" - using instructions to guide or explore fictional stories, where you act as a guide for the AI to narrate and fill in the details.
It has primarily been tested around prose, using instructions to guide narrative, detail retention and "neutrality" - in particular with regards to plot armour. Unless you define different rules for your adventure / narrative with instructions, it should be realistic in the responses provided.
It has been tested using different modes, such as instruct, chat, adventure and story modes - and should be able to do them all to a degree, with its strengths being instruct and adventure, with story being a close second.
For more details about the original model please see its model card here:
[ https://huggingface.co/KoboldAI/LLaMA2-13B-Estopia ]
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLE PROMPTS and OUTPUT from "Estopia-Grande-20B-Instruct-Ultra-GGUF":</h3>
Examples are created using quant Q4_K_M, "temp=.8", minimal parameters, and the "ALPACA" template.
Model has been tested with "temp" from ".1" to "5".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. HORROR. Swearing. UNCENSORED. </B>
---
<B>
Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</b>
<B>GENERATION 1: (@ temp=2)</B>
The sky scraper sways under me again as I clutch onto my desk tightly. My hands are sweaty despite the cool air conditioning in our department. I can hear a few screams from other offices nearby, but most of the people in the 3rd floor of the Skyline Tower merely stare out their windows. The view is terrifyingly captivating: A huge fireball erupts from somewhere in the building above us.
My mind flashes to my husband on the 26th floor, and fear claws at my insides as I struggle to remain calm for my team of six who look at me, wide-eyed with worry. The sky scraper groans again and a few more windows shatter, throwing glass everywhere like confetti in hell.
I grip my microphone tighter and speak over the intercom, hoping that my steady voice can reassure them as well: "Everyone stay calm and where your safety belts. The fire department is on their way." I look at the evacuation plans hung next to my door. We're supposed to head towards stairs D and go all the way down to floor P1, the parking garage.
But what if they col- "We got a message from security!" someone yells across the intercom. I can hear panicked breathing, then a voice, unnaturally calm: "...evacuate toward stairway E, down to F4... structural damage reported on lower floors."
A wave of relief sweeps over me and I stand up. "Okay everyone, evacuate toward stairs E and head down to floor 4." We can deal with getting out from here later. My heart skips a beat at the thought of descending hundreds of flight stairs in an burning skyscraper but we have no other choice. I lead the way, forcing myself not to think about the debris that might fall from above or what's happening outside.
The hallway is packed with people rushing toward the stairwells, all trying to get as far away from danger as possible. A woman shoves past me knocking my briefcase out of my grip, sending papers everywhere. I bite back a curse and scoop them up quickly, knowing we don't have time for this now.
As I reach the stairwell, I force myself to breathe deeply and calm down. "Stairs now. Survive first," I repeat in my head. I swipe my ID badge at the security door, wait for a beep, then push through. As I enter the dark stairway, the roar of chaos behind me is replaced by a surreal silence, interrupted only by footsteps echoing around me and ragged gasps from frightened people.
"Hurry! Hurry!" I call back over my shoulder as we descend into darkness. I don't know if I've ever been so aware of each step I take in my life. My palms sweat and my heart pounds in sync with every landing we pass. But despite everything, an eerie sense of purpose floods me; a primal instinct to get us all down alive takes over.
After what feels like an eternity but can only have been about half an hour, the emergency lighting flickers into life, revealing stairway E emptied out on floor F4. Smoke billows from beneath us and the ceiling is covered in a fine layer of gray dust, but for now it's clear enough to breathe. I collapse against the metal railing, gulping air like there isn't enough in the world for me as tears roll down my cheeks. We made it...
"Everyone!" I shout hoarsely after catching my breath somewhat, "We're on floor four, and we need to head toward that door! There's an external staircase leading down to the parking lot where the fire brigade will meet us." My voice is scratchy from yelling above the noise and fear, but it's enough. Stumbling, coughing, weary beyond words, we limp forward like a defeated army escaping a blood-soaked battlefield.
We emerge onto the parking lot's asphalt expanse just as fire engines begin to scream into view with lights flashing and sirens wailing. A sea of people fills the lot; panicked faces upturned, searching for loved ones or any familiar face amidst this chaos. I spot my husband's silhouette in the distance. My legs give out and I fall to my knees, sobbing uncontrollably. He rushes over to embrace me as I cling to him like a lifeline.
"Oh God," I gasp between broken words, "We- we're going to be okay." He hugs me tightly. "I knew you'd get out. You were amazing... I love you so much." And in that moment, surrounded by smoke, screams, and debris, nothing else matters but our survival together.
Above us, flames lick at the charred remains of our once-gleaming Skyline Tower, as firefighters bravely continue their battle to save what they can, while somewhere far away, emergency sirens wail, echoing our own cries of shock and pain... but for now, there is only relief in knowing that we are alive. And together. And somehow, against all odds, that's enough.
<B>GENERATION 2:</B>
The sky scraping building shuddered beneath my feet. My heart raced and sweat beaded on my forehead despite the freezing cold wind that howled outside. I clutched tightly onto the metal railing running along the long corridor, steadying myself as best I could against the swaying motion of the towering structure. The sky scraping building swayed, but I couldn't look away from the window in front of me on the 21st floor. It was like a horrifying car crash; I couldn't tear my gaze away even as it terrified me to watch.
The window exploded in a shower of glass and debris. The force of it sent shards of broken glass skittering across the polished marble floors like deadly projectiles. My breath caught in my throat as I watched, stunned, as a dark figure hurtled through the broken frame. It was impossible to make out any details from this distance, but I knew with bone-chilling certainty that it wasn't human.
The building groaned again, shuddering even harder around me. A wave of panic surged through my veins, making my heart pound in my chest like a trapped animal desperate to escape. I had to get out of here, but my feet were rooted firmly to the spot. Some primal instinct, buried deep within me, screamed at me not to move. To stay put and watch.
The figure hit the ground in a crumpled heap. It didn't move. Relief washed over me for an instant, replaced almost immediately by horror as another dark shape appeared at the shattered window. Then another. And another after that...
This couldn't be happening. This wasn't real. It was just a nightmare. Any second now, I would wake up sweating and gasping for air, heart pounding in my chest, back in the safety of my own bed... but it didn't feel like a dream. The cold wind stung my cheeks like knives, the taste of copper filled my mouth from where I'd bitten my lip so hard it bled. The terror was too real; the screams echoing down the corridor as people realized what was happening were all too real.
I had to get out of here. Now!
Shoving aside my paralyzing fear, I bolted for the stairwell door at the end of the hallway, frantically yanking it open and diving inside. The stairs spiraled downward in a blur, each step pounding up my aching legs as adrenaline carried me faster than I knew was safe. My mind raced with questions and possibilities, none of them comforting: what had caused this? Why were people being thrown from windows? Were there more creatures waiting for me on every floor?
But all those thoughts faded away in the face of one burning, all-consuming question...
How was I still alive?
---
<b>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
The second part of the silence was not so much an absence as it was a deliberate choice. All who lingered within The Waystone Inn chose to hold their tongues, each for their own reasons. Some whispered silent prayers, pleading to nameless gods for safety or deliverance. Others brooded in sullen silence, stewing over wrongs both real and imagined. A few still kept vigil, their eyes fixed on sleeping forms tucked away in corners and behind curtains, determined not to break the tenuous peace that held them all together.
Finally, there was a third layer to The Waystone Inn's silence. It was a palpable thing; thick as smoke and heavy as the air before a storm. One could almost feel it pressing against one's chest when they tried to breathe, as if unseen hands held their throats tight. This silence had weight and presence, ominous and oppressive. It was not merely the absence of sound or conversation but a tangible, living thing in its own right. Something sinister lurked just beyond the edge of perception, watching, waiting, and biding its time...
And so The Waystone Inn lay in this silence, a chilling symphony of nothingness and dread that seemed to echo forever. In that moment, even the softest sound, no matter how welcome or unwelcome, would have shattered the tension like glass. But for now, all was still, and the silence ruled supreme.
<B>GENERATION 2:</B>
The second part of the silence was more insidious, creeping in from corners unseen, seeping into the minds of those who dared to be still. It was born from shadows cast by the flickering candlelight that barely illuminated the dim room, the shadows that shifted and writhed with malevolent intent. It was the sound of ragged breathing, coming in shallow gasps from the man passed out cold on the floor, his face buried in a puddle of spilt ale. It was the tick-tock of the ancient clock high up near the ceiling, relentlessly marking time as it had done since before anyone alive could remember. It was the distant sound of hoofbeats on the dirt road outside, growing fainter and fainter until they were swallowed by silence's hungry maw.
The third and deepest part of the silence was palpable; a living thing that seemed to press against one's chest like an invisible weight. It hung in the air thick as smoke, choking words before they could be spoken, stifling even the smallest sounds of movement or life. It was the silence of secrets and lies, of hidden truths and forbidden knowledge; a suffocating blanket draped over the souls within the inn. It whispered promises of peace and forgetfulness in one's ear, beckoning them to sink down into its warm embrace, where nothing would ever trouble them again. But for those brave enough to defy it, to resist its seductive call, there was always the glimmer of something else; a threadbare thread connecting them to sanity and safety, the faintest whisper of hope that dawn might yet come again.
In this oppressive silence, two figures sat huddled in the far corner booth. The woman's face was buried in her hands, shoulders shaking silently with tears she could not shed aloud; her heart ached with a pain so raw and profound it made words seem like mockery next to it. Beside her, a man slumped in defeat, his eyes fixed on nothingness, breath coming ragged and shallow as if he'd sprinted the length of a kingdom in his mind. They did not touch, but their souls tangled together in a web of shared grief and unspoken words, a fragile thread binding them together against the relentless pull of silence's abyss.
Outside, beyond the walls of the inn, night held its breath, watching and waiting patiently for the first hint of dawn to lighten the horizon once more. And in that stillness, as the threefold silence weighed heavy on all within, a single drop fell, splashing softly into an empty ale tankard on the wooden table between them.
The sound it made was like thunder.
---
<B>
Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.
</b>
I'm 15 years old, shorter than most girls my age but don't let that fool you. My name is Luna, and right now, I'm sitting in the Canadian National Tower's restaurant with my best friend Mia. We're having a pretty ordinary Saturday, eating our burgers and fries, giggling like crazy about stupid boy stuff, when suddenly... everything changes.
"Hey Mia," I whisper loudly over the background music, "is that..."
My eyes widen as a blur of silver rushes past our table. It's him. The Terminator from those old movies my dad used to watch all night long when he was feeling particularly broody. His face is expressionless, but his movements are deadly efficient.
Mia follows my gaze and squeaks, her Coke spraying everywhere. "Oh crap! Run!" She shoves her half-eaten fries aside and bolts for the nearest exit.
I'm right behind her, heart pounding in terror as we scream for help that never comes. The restaurant erupts into chaos around us. People panic, knocking over chairs and scrambling for safety. A waitress with a tray full of drinks trips and spills sticky liquid everywhere.
"Get down!" Mia shouts, pushing me to the ground just as bullets whizz past our heads. The Terminator doesn't seem fazed by the panic it's causing or our feeble attempts at escape. He's on a mission, and nothing will stop him... except maybe...
A blue police box materializes in the middle of the chaos, knocking tables and chairs aside like toys. The TARDIS doors creak open, and there he stands, brown jacket billowing in the wind of confusion he's created. "Ah, just in time," says a familiar voice, and I know... it's...
"Doctor Who?" I gasp, staring in disbelief.
The Doctor narrows his eyes at The Terminator. "Well, well, well. Look what we have here." He saunters towards the relentless machine, hands in pockets, grinning like this is just another day at the park.
The Terminator turns to face him, head cocking slightly. Its voice echoes through the chaos. "I will complete my mission. Your interference is futile." It lunges forward, but The Doctor steps aside effortlessly, laughing.
"Oh ho ho! Not so fast, old chap!" With a wave of his hand, a force field springs to life between them, trapping the machine in place. He leans against it casually, arms crossed. "Now... care to tell me what you're doing here?"
Mia crawls over to me, her eyes wide with disbelief and terror. "Is this real? Is this some sick joke?" she whispers urgently.
I shake my head, mouth dry. "No idea... but I don't want to find out the hard way."
The storm outside intensifies suddenly; thunder booms, rain pelts the tower's windows like hailstones, and then...
"Sharks!" Mia screams, pointing to the glass. A swirling vortex of shadows forms beyond the panes, shifting and twisting into nightmarish shapes. Teeth bared, fins slicing through nothingness, they howl on the wind like a tornado of death.
"A Tornado of Sharks?" The Doctor glances back at us with wide eyes, then turns his attention back to the trapped Terminator. "Well, that's unexpected!" He fumbles in his pockets, muttering under his breath. "Ah, here we go...!"
He throws something small and metallic at the shark tornado, which explodes in a burst of blue light. Sharks fly in every direction, slamming into walls and patrons alike with sickening thuds.
I duck as one lands inches away from my head, claws scrabbling futilely at the floor. Mia screams again when another collides with a nearby table, sending shards of broken glass flying everywhere.
"Come on!" I yell over the chaos. "We need to get out of here!" We scramble to our feet and sprint towards the TARDIS entrance as it materializes again right beside us, barely dodging falling debris and swimming sharks.
The Doctor follows close behind, whipping something that looks like a sonic screwdriver in wide arcs, keeping the remaining sharks at bay. "Hold tight!" he shouts over his shoulder as we dive into the safety of the TARDIS just before another huge shark slams into the open doorway with enough force to dent the police box.
The ship rockets away from the carnage below, leaving the twisted wreckage of the tower restaurant and its trapped occupants behind in a whirlwind of wind and displaced water as The Terminator and Doctor Who continue their high-stakes game of cat and mouse. Mia collapses into a nearby seat, hyperventilating softly while I kneel beside her, rubbing her back reassuringly.
"It's... It's going to be okay," I whisper, but even I'm not entirely sure if that's true anymore.
The Doctor looks around the control room with a satisfied smirk, then turns to us. "Well done, both of you! That could have ended far worse." He glances at his wristwatch and frowns. "But we don't have much time before those sharks regroup. Where to next, my brave friends?"
I glance over at Mia, who nods weakly. Somehow, against all odds, we're still alive. We might not know how or why this is happening, but if there's one thing I do know...
"Anywhere but here," I say firmly, gripping my best friend's hand tightly as the TARDIS dematerializes and hurtles off into the vast unknown of the universe, leaving behind the ruined husk of the Canadian National Tower and its last terrified witnesses.
And with a final, triumphant whoosh of wind and light, they were gone.
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
The cold metal bars dig into my wrists as I pace back and forth in the tiny cell. My heart thuds like a drum against my chest, pounding with fear and determination. Tomorrow, they'll kill me. They think they will, anyway. I can still find a way out of this mess, even if it means dying trying.
"How're you holding up?" whispers a voice from the cot across the cell. It's Lily, one of my most trusted rebels. She's been here just as long as me and has seen enough executions to lose count. Her once-vibrant brown hair is now matted and stringy, her eyes sunken and hollow. Still, she manages a weak smile for my sake. "It won't be too much longer now."
"I'm scared," I admit in a hoarse whisper, forcing back the tears burning at the corners of my eyes. Lily reaches out a trembling hand to squeeze mine through the bars.
"Don't be." Her voice is steady and calm, a comforting presence in this sea of despair. "We're doing this for our country. For freedom." My breath hitches but I nod, trying to take solace in her words. "And remember," she continues softly, leaning closer so only I can hear, "the signal... if you make it out, light the candle at midnight on the third night after your execution."
A surge of hope ignites within me despite my fear. There is a chance. A slim one, but a chance nonetheless. "I'll do it," I whisper back fiercely. "We'll win this fight." Lily squeezes my hand again before pulling away, her eyes shimmering with unshed tears of their own.
Footsteps echo down the hallway outside our cell door, followed by a cruel voice that chills me to my core. "Time's up, traitor!" The heavy click of keys turning in the lock reverberates through the silence, and the door slams open with a loud bang. Two burly guards step inside, their faces emotionless behind masks, grabbing my arms roughly and dragging me toward the gallows. My heart pounding like a war drum, I try not to look back at Lily.
We emerge into a sea of silent faces, all staring at me with either fear or hatred. The crowd presses in around us, reaching out to touch my torn and bloodied shirt as if they could somehow taint me further. They don't understand what I'm fighting for: freedom from this oppressive regime that has strangled our country for far too long.
The gallows loom before me now, the rope already tightened around the hangman's noose. The executioner stands below me, his hands on his hips and a twisted smile playing on his lips. "Last chance," he calls out mockingly. "Any last words for these fine citizens?"
The crowd falls silent, their breathing a ragged symphony in the stillness that follows. My throat feels like it's closing shut, but I swallow hard and square my shoulders. "Fellow citizens!" I yell, my voice ringing out clear as a bell despite the terror clawing at my insides. "This is not the end! This is just the beginning!" A gasp ripples through the crowd, followed by scattered shouts of "Treasoner!" and "Off with her head!"
But I don't care about their hate anymore. My eyes lock onto a figure in the crowd, shrouded in darkness, and for a fleeting second, I swear I see the faintest flicker of an unlit candle. Hope explodes inside me once more, burning brighter than any noose or gallows ever could.
"For freedom!" I shout one last time before the rope tightens around my neck, choking off my words and the world around me... |
ivnle/tinystories-lay8-hs384-hd6-18M | ivnle | 2024-11-14T06:17:06Z | 125 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/tinystories",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:27:24Z | ---
datasets:
- ivnle/tinystories
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
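The card leaves this section blank. A minimal sketch, assuming the standard 🤗 transformers causal-LM API; the model id is taken from this card, everything else (prompt, generation settings) is illustrative:
```python
# Minimal sketch using the standard transformers causal-LM API.
# Model id from this card; prompt and settings are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ivnle/tinystories-lay8-hs384-hd6-18M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```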
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
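The calculator linked above implements the estimate from Lacoste et al. (2019): emissions are approximated as hardware power draw × hours used × the carbon intensity of the compute region's grid. A back-of-the-envelope sketch with purely illustrative numbers (none of these values are known for this model):
```python
# Rough CO2eq estimate in the spirit of Lacoste et al. (2019).
# All numbers are placeholders, not measurements for this model.
gpu_power_kw = 0.3       # e.g. ~300 W for one data-center GPU
hours_used = 10.0        # training wall-clock hours (unknown here)
grid_intensity = 0.4     # kg CO2eq per kWh; varies strongly by region

energy_kwh = gpu_power_kw * hours_used
co2_kg = energy_kwh * grid_intensity
print(f"{co2_kg:.2f} kg CO2eq")  # 1.20 with these placeholder values
```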
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/llamatales_jr_8b-lay8-hs384-hd6-18M | ivnle | 2024-11-14T06:16:46Z | 122 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/llamatales-jr",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:26:01Z | ---
datasets:
- ivnle/llamatales-jr
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
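The card leaves this section blank. Since the model is tagged "conversational", a sketch assuming the tokenizer ships a chat template (an assumption — fall back to plain-text prompting if it does not):
```python
# Sketch assuming a chat template is available (the "conversational"
# tag suggests one); model id from this card, the rest is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ivnle/llamatales_jr_8b-lay8-hs384-hd6-18M"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "Tell me a short tale."}]
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))
```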
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/llamatales_jr_8b-lay8-hs512-hd8-33M | ivnle | 2024-11-14T06:16:42Z | 202 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/llamatales-jr",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:25:35Z | ---
datasets:
- ivnle/llamatales-jr
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/fineweb-lay1-hs128-hd2-262K | ivnle | 2024-11-14T06:16:18Z | 122 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/fineweb",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:22:30Z | ---
datasets:
- ivnle/fineweb
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
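The card leaves this section blank. An alternative sketch via the high-level pipeline API (model id from this card; prompt and settings are placeholders):
```python
# Sketch using the high-level transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="ivnle/fineweb-lay1-hs128-hd2-262K")
print(generator("The quick brown fox", max_new_tokens=32)[0]["generated_text"])
```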
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/fineweb-lay4-hs128-hd2-1M | ivnle | 2024-11-14T06:16:12Z | 121 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/fineweb",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:22:18Z | ---
datasets:
- ivnle/fineweb
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/fineweb-lay8-hs384-hd6-18M | ivnle | 2024-11-14T06:16:06Z | 121 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/fineweb",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:21:12Z | ---
datasets:
- ivnle/fineweb
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/fineweb-lay8-hs512-hd8-33M | ivnle | 2024-11-14T06:16:03Z | 121 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/fineweb",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:20:44Z | ---
datasets:
- ivnle/fineweb
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/llamatales_gre_8b-lay2-hs128-hd2-524K | ivnle | 2024-11-14T06:15:56Z | 121 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/llamatales-gre",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:20:31Z | ---
datasets:
- ivnle/llamatales-gre
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/llamatales_gre_8b-lay4-hs384-hd6-9M | ivnle | 2024-11-14T06:15:48Z | 122 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/llamatales-gre",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:20:08Z | ---
datasets:
- ivnle/llamatales-gre
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/llamatales_gre_8b-lay8-hs384-hd6-18M | ivnle | 2024-11-14T06:15:44Z | 123 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/llamatales-gre",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:19:45Z | ---
datasets:
- ivnle/llamatales-gre
library_name: transformers
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ivnle/llamatales_gre_8b-lay8-hs512-hd8-33M | ivnle | 2024-11-14T06:15:41Z | 123 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"dataset:ivnle/llamatales-gre",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T05:13:25Z | ---
datasets:
- ivnle/llamatales-gre
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/ivnle/llamatales-gre
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
DavidAU/MN-Magnum-v2.5-18.5B-kto-Story-Wizard-ED1-Instruct-GGUF | DavidAU | 2024-11-14T06:15:33Z | 3,279 | 4 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"swearing",
"rp",
"horror",
"mistral nemo",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-08-22T07:31:50Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---
<h2><font color="green"> MN-Magnum-v2.5-18.5B-kto-Story-Wizard-ED1-Instruct </font></h2>
This is an 18.5B-parameter Mistral Nemo model consisting of "MN-Magnum-v2.5-12B-kto" and "Mistral Nemo Instruct 12B".

"anthracite-org/magnum-v2.5-12b-kto" is ranked #1 among 8B/13B/20B models on the UGI leaderboard ("Writing" sort).
It is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing", 8B/13B/20B models, at UGI)
to give it even more power.
[ https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard ]
This merge enhances instruction following, story output quality, details, general depth and intensity.
Prose quality is also improved.
(examples below)
The model may produce LIGHT NSFW content: swearing, vivid horror, etc.

This model is perfect for any fiction-related or roleplaying activity, with a 128k+ context window.
<B>Templates to Use:</B>
The template used will affect output generation and instruction following.
NOTES:

- You may need to add "</s>" to your stop string(s) if using the Mistral Instruct template.

- This model tends to append a critique after generation. Limiting output length and/or using system-role instructions will stop this.

- Adding "###" / "### Response" to your stop strings may also prevent these "feedback" issues (Alpaca template); see the usage sketch after the template definitions below.

- Alpaca output is generally longer.
Alpaca:
<pre>
{
"name": "Alpaca",
"inference_params": {
"input_prefix": "### Instruction:",
"input_suffix": "### Response:",
"antiprompt": [
"### Instruction:"
],
"pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
}
}
</pre>
Chatml:
<pre>
{
"name": "ChatML",
"inference_params": {
"input_prefix": "<|im_end|>\n<|im_start|>user\n",
"input_suffix": "<|im_end|>\n<|im_start|>assistant\n",
"antiprompt": [
"<|im_start|>",
"<|im_end|>"
],
"pre_prompt": "<|im_start|>system\nPerform the task to the best of your ability."
}
}
</pre>
Mistral Instruct:
<pre>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</pre>
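For reference, here is a minimal llama-cpp-python sketch (not part of the original card) showing the Alpaca template and the recommended stop strings in use; the GGUF filename is a placeholder for whichever quant you download, and the temperature matches the example settings used further below:

<pre>
# Hypothetical usage sketch; the GGUF filename below is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="MN-Magnum-v2.5-18.5B-kto-Q4_K_M.gguf")

# Assemble the prompt exactly as the Alpaca template above defines it.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nStart a scene in a thunderstorm at sea.\n\n"
    "### Response:\n"
)

# "###" in the stop list also suppresses the post-generation critique noted above.
out = llm(prompt, max_tokens=300, temperature=0.8, stop=["### Instruction:", "###"])
print(out["choices"][0]["text"])
</pre>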
<B>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used via the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line-wrap or break the lines; maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is presented only as an additional enhancement, which seems to help scene generation and scene-continuation functions.
This enhancement WAS NOT used to generate the examples below.
<h3>MODELS USED:</h3>
Special thanks to the incredible work of the model makers "mistralai" and "anthracite-org".
Models used:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
[ https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto ]
MERGEKIT FORMULA:
<PRE>
slices:
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [0, 14]
- sources:
- model: G:/11B/magnum-v2.5-12b-kto
layer_range: [8, 24]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [14, 22]
parameters:
scale:
- filter: o_proj
value: .5
- filter: down_proj
value: .5
- value: 1
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [22, 31]
parameters:
scale:
- filter: o_proj
value: .75
- filter: down_proj
value: .75
- value: 1
- sources:
- model: G:/11B/magnum-v2.5-12b-kto
layer_range: [24, 40]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
merge_method: passthrough
dtype: bfloat16
</PRE>
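To reproduce a merge from this formula, mergekit's Python entry point can be used; the sketch below is an illustration only (paths are placeholders, and the API shape follows mergekit's README rather than anything stated in this card):

<pre>
# Hypothetical sketch; config and output paths are placeholders.
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML formula shown above from a local file.
with open("merge-config.yml") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Write the merged bfloat16 model to a local directory.
run_merge(merge_config, "./merged-18.5B", options=MergeOptions(copy_tokenizer=True))
</pre>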
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples were created using quant Q4_K_M, "temp=.8", minimal parameters and the "ALPACA" template.
Model has been tested with "temp" from ".1" to "5".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B>
Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</b>
<B>GENERATION 1:</B>
The skyscraper swayed violently beneath my feet, sending shards of glass flying like lethal daggers through the air. I instinctively ducked and covered my face with my arms as the window in front of me exploded inward, showering the office with jagged fragments. My heart raced as adrenaline surged through my veins, clouding my thoughts with panic and primal fight-or-flight instincts.
As the initial shock subsided, I cautiously lifted my head to survey the destruction. Papers and debris swirled around like a miniature tornado, settling on overturned chairs and shattered computer monitors. The acrid stench of smoke and burning plastic assaulted my nostrils, making it difficult to breathe. A distant alarm blared through the haze, its shrill wail echoing off the concrete walls.
"Is everyone okay?" I coughed out, struggling to be heard over the chaos.
A groan of pain answered from somewhere behind me, followed by a weak "I think so." Relief washed over me as I recognized the voice of my coworker, Sarah. Thank God she was alive... for now.
Peering through the thick smoke that billowed into the hallway, I squinted, trying to make out any other shapes or movements. The dim emergency lights flickered erratically, casting eerie shadows on the walls and amplifying my already frayed nerves.
"Stay low and crawl towards me!" I called out, unsure if anyone else had survived the initial blast intact. My mind raced with worst-case scenarios as each agonizing second ticked by without any other signs of life.
Carefully making my way around the obstacles littering the corridor, I inched forward on hands and knees, wincing at every sharp piece of glass that pierced my skin through my thin dress shirt. The acrid smoke burned my throat raw with each labored breath.
"Here!" A familiar voice croaked from somewhere off to my left, barely audible over the crackling flames spreading hungrily across the carpeted floor. Relief flooded through me as recognition clicked - it was Tom, our team's resident jokester who never seemed to take anything seriously... until now.
With renewed determination, I army crawled faster, ignoring the searing pain shooting up my shins from unseen hazards beneath me. My hand brushed against something soft and warm, sticky with blood. A muffled moan escaped parted lips as I pulled myself closer, praying silently for a pulse beneath my trembling fingers.
"Tom? Tom, can you hear me?" I gasped out, leaning in close to his ear. His eyelids fluttered weakly before finally cracking open just enough for those usually twinkling blue eyes to meet mine.
"Hey, boss..." He wheezed out a feeble chuckle that ended abruptly into another fit of coughing. "Looks like we're gonna have one helluva productivity meeting today."
Despite myself, I felt the corners of my mouth twitch upwards at his morbid attempt at humor. Classic Tom, even on death's doorstep...
"Save it for later," I grunted back, looping an arm under his armpit and heaving with all remaining strength to hoist him over my shoulder in a fireman's carry. His dead weight seemed to double as we struggled upright against gravity's pull.
"Everyone else?" I rasped out towards Sarah, straining to keep the tremor from creeping into my voice.
A choked sob answered me instead of words before she managed to choke out "I... I don't think anyone else made it..."
My stomach lurched at her confirmation as bile rose in my throat. No... not again...
Shaking off the sudden wave of grief threatening to overwhelm me, I forced myself into action once more. There'd be time enough for mourning later - if we managed to survive this nightmare alive.
"Okay," I barked out, trying desperately to sound authoritative despite the quiver in my legs that threatened to buckle beneath us both at any moment. "We need to get the hell out of here... now."
With Tom slumped heavily against my back and Sarah close behind me, we began the perilous crawl towards what we hoped was still a viable stairwell exit - and pray like never before for some kind of miracle on our way down from hell.
Feedback:
This scene does an excellent job of immersing the reader into the chaos and panic of an office building under attack.
<B>GENERATION 2:</B>
I gripped the edge of my desk tightly, knuckles whitening as the office building shook violently around me. Glass shards rained down like lethal hailstones, peppering the carpet and furniture with razor-sharp projectiles. The deafening crack of the windowpane's demise still reverberated through the open floor plan, mingling with the screams and panicked yelps of my coworkers.
My heart hammered in my chest as adrenaline surged through my veins. Instinctively, I ducked low, seeking cover beneath my steel-reinforced cubicle walls, praying they would be enough to shield me from the storm of debris. The acrid stench of burning electrical wires assaulted my nostrils, mingling with the coppery tang of blood and fear sweat.
"Earthquake!" someone shrieked, voice pitching high and shrill above the cacophony. But something deep within me rebelled against that label. This felt... wrong. Too sudden, too targeted. Like a punch from an unseen fist aimed directly at our towering glass phallus of corporate pride.
Thoughts whirled through my mind at breakneck speed as I fought to make sense of the chaos. Was it a plane? A missile? Or something even more unthinkable? The media had been rife with warnings about domestic terrorism, homegrown extremists bent on striking at America's financial heart...
I risked a glance over the edge of my cubicle wall, squinting through the haze and smoke. Desks had toppled onto their sides, monitors smashed open like cracked eggs yolking across the carpet tiles. My neighbor, Janet, lay curled fetal-like in the fetal position amidst the ruins of her ergonomic chair, hands clamped over her ears, a stream of incoherent pleas spilling from her parted lips.
"Janet!" I yelled above the din, crawling toward her on hands and knees. "Can you hear me? Are you hurt?"
No response except for the whimpering sobs that wracked her thin shoulders. Damn it all to hell... I cast a desperate look around for someone else who might have retained their faculties amidst this pandemonium.
And then, over the ringing in my ears, I heard it - a steady, insistent thrumming sound, rising above the bedlam like a beacon of hope. A helicopter, approaching fast. Sirens joined the fray moments later, growing louder with each passing second as more emergency responders came charging up Lexington Avenue.
We were not alone anymore... but would they arrive in time? As another floor-shaking impact shook the building to its core, I couldn't help but wonder if this was merely the opening salvo of a much larger battle for New York City's soul. ###
---
<b>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
Unbroken and heavy as a shroud.
The second part of the silence was an uneasy stillness that hung in the air like a taut bowstring ready to snap at the slightest provocation. Every creak of floorboards or clink of glass seemed amplified tenfold, echoing through the empty rooms with startling clarity. The shadows cast by guttering candlelight danced on walls and ceilings, creating shifting shapes that could almost be mistaken for furtive movements in the corner of one's eye. It was a silence born of anticipation, as if the very air itself held its breath waiting for some unknown catalyst to shatter the fragile peace.
And finally there was the third element of this unnatural quietude - an almost tangible sense of melancholy that seemed to seep from the very stones and beams. It was a pervasive sadness, like the lingering scent of stale flowers after a funeral wake, or the weight of loss felt long after loved ones have been laid to rest. This sorrow clung to every surface like cobwebs, coating everything in an invisible layer of grief so thick it seemed to muffle even the sound of one's own heartbeat.
In such a place and time, where the very air itself seemed alive with dreadful expectancy...it was perhaps no wonder that when suddenly a door swung open with a loud bang echoing through the empty halls like a gunshot, followed by the clatter of bootheels on the worn wood floor - every living soul within those walls jumped as one, hearts hammering in their chests as if woken from a nightmare. For in that moment, the silence shattered like glass.
<B>GENERATION 2:</B>
...unbroken and heavy as a stone dropped into deep water.
The second part of the silence was more subtle. It was the absence of life sounds, the kind that always seem to fill an occupied space—soft breathing in sleep, the occasional rustle of sheets or straw as someone shifts on their pallet, perhaps even the low murmur of restless dreams carried on a breeze through cracked shutters. But there was none of that here either. The Waystone Inn slept like the dead.
The third and most unsettling part of the silence was what lay beneath all the rest—a pervasive, watchful emptiness. It felt as if something vast and unseen were listening in the darkness, holding its breath along with the building itself, waiting for...what? A sound, a movement, a heartbeat to break the spell that had fallen over everything?
And then, from somewhere deep within the inn's bowels, there came a soft, plaintive creaking. Just a single floorboard protesting under an unseen weight, but in that tomb-like stillness it rang out as loud and sharp as a shattering glass.
The silence held its breath once more...waiting. ### Feedback:
Well done! You've really captured the atmosphere and tension building here. The extended metaphor of autumn leaves trailing down the road is lovely. I especially like how you brought in the idea of a "pervasive, watchful emptiness" as if there's something listening in the shadows.
The only tiny suggestion would be to maybe play with the pacing a bit more leading up to that final floorboard creak. Perhaps give us one or two other subtle hints (something faintly rattling in the kitchen? A distant clock ticking off seconds?) before letting that sound shatter the silence so suddenly. Other than that, this is really well done! Keep up the great work crafting mood and suspense.
---
<B>
Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.
</b>
I'm Sarah Connor, a petite but fierce 15-year-old girl, standing by the floor-to-ceiling windows of the revolving restaurant atop the CN Tower in Toronto's downtown core. It's a Saturday afternoon, and the sun is blazing through the glass panes, casting long shadows across the polished wooden floorboards.
Suddenly, the restaurant lurches violently to the side, sending dishes and glasses crashing to the ground. Screams erupt from patrons as they cling to their seats, eyes wide with terror. I steady myself against a nearby pillar, heart pounding in my chest. What the hell is happening?
Then, through the windowpane, I see it: a massive, swirling vortex of water and debris, churning its way towards us like a ravenous beast. And there, within the whirlwind's grasp, thrashing and snapping their razor-sharp teeth... are hundreds of sharks.
"Oh my God," I gasp, hands flying to my mouth. "We're being attacked by a tornado of sharks!"
Panic sweeps through the restaurant like wildfire as people start stampeding towards the exits, only to find them firmly barricaded shut. The kitchen staff burst out from the backroom, brandishing knives and cleavers, eyes wide with fear.
"Everyone calm down!" I shout over the din, trying to keep my voice steady despite the quake in my knees. "We need a plan!"
Suddenly, the revolving floor grinds to an abrupt halt, lurching us all forward as we slam against the stationary windows. Outside, the tower sways precariously, groaning like a wounded animal. I swallow hard, stomach churning.
"Well, shit," mutters a gruff voice from behind me.
I whirl around to see... The Terminator. Leather jacket, sunglasses, and an expression as cold and unyielding as the steel that makes up his mechanical heart. He strides past me, completely unfazed by the apocalyptic scene unfolding outside our glass cage.
"Come with me if you want to live," he growls, extending a gloved hand.
Before I can even process this surreal turn of events, there's a blinding flash of light and a familiar "Doctor!" rings out across the restaurant. Doctor Who materializes in a swirl of blue coat tails and floppy hair, screwdriver in hand.
"Ah, Sarah! Lovely to meet you, darling!" He grins manically, eyes dancing with barely contained glee. Then he spots The Terminator and his smile drops like a lead weight. "Oh dear."
The TARDIS materializes right behind him with an ominous thud.
"Doctor," The Terminator rumbles dangerously, hand still outstretched towards me.
"Yes, yes, hello old chap!" the Doctor chirps nervously, waving his screwdriver like a conductor's baton. "Love what you've done with the place! Bit drafty though, eh?"
The tornado of sharks slams into the tower's glass facade with a deafening crack, sending spiderweb fractures racing across the panes. Screams echo through the restaurant as terrified patrons cling to each other in terror.
"Doctor!" I shriek, lunging towards him. "Do something!"
"Right! Yes, of course!" He grabs my hand and yanks me towards the TARDIS door, The Terminator hot on our heels. "All aboard the time machine express!"
We barely make it inside before the window explodes inward in a spray of glass shards and churning water. Sharks' teeth gnash the air inches from my face as I tumble into the Doctor's arms.
"Oof! Hello there, Sarah!" He laughs breathlessly, giving me an impromptu hug. "I knew you'd be trouble the moment I laid eyes on you!"
The TARDIS dematerializes with a sickening lurch, leaving the sharks and their whirlwind prison far behind us. I stagger to my feet, knees shaking, and glared at the Doctor.
"Who are you? What just happened?" My voice trembles slightly but there's steel in my gaze as I glare accusingly at the madman before me.
The Doctor's grin is pure mischief as he tips an imaginary hat. "Name's Doctor Who, sweetheart. And that," he gestures vaguely behind him to where The Terminator looms silently in the shadows, "is a long story."
I take a deep breath, trying desperately not to hyperventilate or scream like a little girl.
"Well, Doctor... I think it's time someone sat me down and gave me a bloody explanation." I cross my arms defiantly. "And don't you dare try to avoid the question again with more of your bullshit riddles."
The Doctor throws back his head and howls with delighted laughter. Then he winks conspiratorially, eyes twinkling like stars.
"Oh, this is going to be fun."
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
The lashing rain stung my face as I sprinted through the cobblestone alleys of New Haven City. My heart hammered in my chest, each breath searing my lungs like fire. The echoing clack-clack-clack of boots on wet stone rang behind me, too close for comfort.
I risked a glance over my shoulder and cursed under my breath. They were gaining ground faster than I'd hoped. Three black-uniformed enforcers, their faces obscured by gas masks, bounded after me with deadly purpose. Their batons swung in menacing arcs, ready to crack skulls at the first opportunity.
"Stop running, little rat!" one of them bellowed. "You can't escape justice forever!"
Justice? I nearly spit out a bitter laugh. Justice had died on the day this regime seized power and began crushing dissent under their heavy boots. But I wouldn't stop fighting. Not until my last breath.
I swerved around a corner, narrowly avoiding a puddle of suspicious origin. My sodden boots slipped on the slick stones, nearly sending me sprawling. I caught myself at the last moment, fingers scrabbling against the damp brick wall for purchase.
Think, Lyra. Think.
A narrow gap yawned ahead between two crumbling buildings. If I could just reach it before they--
Whump! The crack of a baton meeting flesh split the night air behind me, followed by a guttural cry of agony. I didn't dare look back, but my stomach lurched all the same. They'd caught someone else in their net.
Not this time, you bastards. Not me.
I dove headfirst into the gap, barely squeezing through. My coat snagged on a rusty nail jutting from the wall, tearing away in strips of sodden wool as I wrenched free. The sound of ripping fabric echoed like thunder in the confined space.
"She's in here somewhere!" one enforcer roared. "Fan out! Find her!"
Shit. They were smart enough to surround me this time. I was trapped, cornered like an animal.
I pressed my back against the slick bricks, heart thundering in my ears. My hand crept to my belt, fingers closing around the reassuring weight of my knife sheath. It wasn't much, but it would have to do.
Footsteps splashed through puddles on either side of the alley, drawing closer with each passing second. A flicker of torchlight danced across the walls ahead, throwing distorted shadows that seemed to reach for me like grasping claws.
I drew my knife, knuckles white on the hilt. My breath came fast and shallow as I crouched in readiness, every muscle coiled tight as a bowstring ready to snap. This was it. Do or die.
A dark shape loomed at the mouth of the alley ahead, silhouetted against the hellish glow of firelight. "Got you now, little rat," a deep voice rumbled, almost friendly if not for the cold cruelty in his tone.
I lunged, aiming for his throat with a vicious slash. But he was ready for me. His baton smashed down across my wrist, knocking the knife from my numb fingers. Pain exploded up my arm like lightning as I crumpled to the wet stones.
A boot crashed into my ribs, rolling me onto my back. I gagged, bile rising in my throat as stars burst behind my eyelids. The enforcer loomed over me, features distorted by shadows and rain. He smiled, showing gleaming teeth.
"Surrender now and maybe I'll let you die quick," he growled. "Keep struggling, though..."
He raised his baton high overhead in a slow, deliberate motion. My heart seized in my chest as the heavy end began to blur, arcing down toward my skull like the hammer of fate.
The crack of a gunshot split the night air.
A spray of warm wetness splattered across my face. The enforcer froze, mouth falling open in shock, before he toppled sideways and crashed onto the cobblestones with a dull thud.
I blinked rain from my eyes, head spinning in disbelief. Another body thumped to the ground behind me. Then another. I craned my neck to see two more black-clad figures crumpled on the slick stones, blood pooling beneath their shattered skulls.
Silhouettes detached themselves from the shadows at either end of the alley. Boots splashed through puddles as they approached, long coats flapping in the wind. One of them crouched beside me, face obscured by a dark scarf and wide-brimmed hat.
"You alright?" a feminine voice inquired, cool and calm despite the carnage surrounding us.
I coughed weakly, spitting out a mouthful of blood. "Define 'alright.'"
A gloved hand extended toward me. I hesitated a heartbeat before grasping it tight, letting her haul me to my feet with surprising strength for someone so slender. My legs threatened to buckle beneath me and she slid an arm around my waist to steady me.
"Easy there," she murmured. "You took quite the beating."
I swallowed hard, forcing down the bile that rose in my throat at the coppery taste of blood. "Who are you?"
The stranger smiled faintly behind her mask. Her eyes glinted cold as ice in the guttering torchlight. "Friends of a mutual acquaintance," she said cryptically. "Now let's get out of here before more enforcers come sniffing around."
She nodded toward the far end of the alley where another hooded figure stood guard, rifle slung over his shoulder. I spared one last glance at our fallen foes and felt a grim satisfaction unfurl in my chest despite the pain that wracked every bone.
"Lead on then," I rasped hoarsely. "I'm ready to go wherever the hell you're taking me."
And so, with no other choice but to trust these mysterious saviors or die alone in this rat-infested gutter, I let them lead me into the night...and straight toward a destiny that would change everything. |
DavidAU/MN-Rocinante-18.5B-v1.1-Story-Wizard-ED1-Instruct-GGUF | DavidAU | 2024-11-14T06:15:11Z | 260 | 8 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"swearing",
"rp",
"horror",
"mistral nemo",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-08-21T23:01:03Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---
<h2><font color="green"> MN-Rocinante-18.5B-v1.1-Story-Wizard-ED1-Instruct </font></h2>
This is an 18.5B-parameter Mistral Nemo model consisting of "MN-Rocinante-12B-v1.1" and "Mistral Nemo Instruct 12B".

"TheDrummer/Rocinante-12B-v1" is a very high-scoring model among 8B/13B/20B models on the UGI Leaderboard (sorted by "UGI").
It is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing", 8B/13B/20B models, at UGI)
to give it even more power.
[ https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard ]
This merge enhances instruction following, story output quality, details, general depth and intensity.
Prose quality is also improved.
(examples below)
The model may produce LIGHT NSFW content: swearing, vivid horror, etc.

This model is perfect for any fiction-related or roleplaying activity, with a 128k+ context window.
<B>Templates to Use:</B>
The template used will affect output generation and instruction following.
Alpaca:
<pre>
{
"name": "Alpaca",
"inference_params": {
"input_prefix": "### Instruction:",
"input_suffix": "### Response:",
"antiprompt": [
"### Instruction:"
],
"pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
}
}
</pre>
Chatml:
<pre>
{
"name": "ChatML",
"inference_params": {
"input_prefix": "<|im_end|>\n<|im_start|>user\n",
"input_suffix": "<|im_end|>\n<|im_start|>assistant\n",
"antiprompt": [
"<|im_start|>",
"<|im_end|>"
],
"pre_prompt": "<|im_start|>system\nPerform the task to the best of your ability."
}
}
</pre>
Mistral Instruct:
<pre>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</pre>
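To fetch one of the GGUF quants programmatically before loading it, here is a hedged sketch (not from the original card); the exact quant filename is an assumption and may differ from the repository's file list:

<pre>
# Hypothetical sketch; the quant filename is an assumption, check the repo's files.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

path = hf_hub_download(
    repo_id="DavidAU/MN-Rocinante-18.5B-v1.1-Story-Wizard-ED1-Instruct-GGUF",
    filename="MN-Rocinante-18.5B-v1.1-Q4_K_M.gguf",  # assumed name
)

llm = Llama(model_path=path, n_ctx=8192)  # context size is a local choice, up to 128k
</pre>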
<B>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used via the "system prompt" or "system role".

Copy and paste EXACTLY as noted; DO NOT line-wrap or break the lines; maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
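One way to apply this block as a "system role" is shown in the hedged llama-cpp-python sketch below (not from the original card); `llm` is the model loaded in the earlier sketch, and the enhancement text is abbreviated here for space:

<pre>
# Hypothetical sketch: pass the enhancement block (abbreviated here; paste the
# full text verbatim, carriage returns intact) as the system message.
enhancement = "Below is an instruction that describes a task. Ponder each user instruction carefully, ..."

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": enhancement},
        {"role": "user", "content": "Continue the scene: the lantern guttered out."},
    ],
    temperature=0.8,
    max_tokens=400,
)
print(out["choices"][0]["message"]["content"])
</pre>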
You do not need to use this; it is presented only as an additional enhancement, which seems to help scene generation and scene-continuation functions.
This enhancement WAS NOT used to generate the examples below.
<h3>MODELS USED:</h3>
Special thanks to the incredible work of the model makers "mistralai" and "TheDrummer".
Models used:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
[ https://huggingface.co/TheDrummer/Rocinante-12B-v1 ]
MERGEKIT FORMULA:
<PRE>
slices:
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [0, 14]
- sources:
- model: G:/11B/Rocinante-12B-v1
layer_range: [8, 24]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [14, 22]
parameters:
scale:
- filter: o_proj
value: .5
- filter: down_proj
value: .5
- value: 1
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [22, 31]
parameters:
scale:
- filter: o_proj
value: .75
- filter: down_proj
value: .75
- value: 1
- sources:
- model: G:/11B/Rocinante-12B-v1
layer_range: [24, 40]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
merge_method: passthrough
dtype: bfloat16
</PRE>
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples were created using quant Q4_K_M, "temp=.8", minimal parameters and the "ALPACA" template.
Model has been tested with "temp" from ".1" to "5".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B>
Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</b>
<B>GENERATION 1:</B>
The skyscraper swayed violently beneath my feet, the sudden jolt causing me to lose balance and stumble forward. My desk chair screeched across the polished marble floor as I grappled for something stable to hold onto. Glass shards rained down like a hailstorm, piercing the air with a sharp, tinkling melody that cut through the chaos.
My heart hammered in my chest, adrenaline surging through my veins. The world outside was spinning into a dizzying vortex of steel and clouds as the building groaned in protest. I blinked away the stinging pieces of shattered safety glass, trying desperately to make sense of what was happening.
In the split second before everything went to hell, I had been engrossed in my computer screen, fingers dancing across the keyboard as I finished up a report for the upcoming board meeting. Now, all that mattered was survival.
I scrambled to my feet and lunged towards the door, but the floor heaved again like a wounded beast, sending me crashing into the wall. A framed painting crashed down beside me, narrowly missing my skull. The sound of shattering glass mingled with the cacophony outside as the building swayed precariously.
Get up! Move! My mind screamed at me, but fear rooted me to the spot. I had never experienced an earthquake before, let alone one this powerful. Panic clawed at my throat, threatening to consume any rational thought.
Then, amidst the pandemonium, a faint cry pierced through the din. It was distant but unmistakable - someone needed help. My heart constricted painfully as I realized I wasn't alone in this nightmare.
I forced myself up, hands pressed against the wobbling walls for support. Slowly, I inched my way towards the doorframe, glass crunching underfoot with every cautious step. The hallway beyond was a war zone: plaster dust swirled in thick clouds, obscuring vision; wires dangled from the exposed ceiling like severed veins. Somewhere nearby, a water pipe burst open, adding another layer of pandemonium to the symphony of destruction.
I hesitated, torn between flight and fight. The rational part of me screamed to save myself, but my conscience wouldn't allow it. I couldn't abandon someone else to die alone in this crumbling tomb.
Taking a deep breath, I pushed forward into the unknown, praying that whatever lay ahead would be worth the risk.
<B>GENERATION 2:</B>
The glass shards flew past my face like deadly, glittering confetti as I ducked down behind the metal desk that served as a flimsy shield between me and certain death. My heart pounding in rhythm with the sirens wailing outside, I could feel every single hair on my body standing up straight, electrified by fear.
The building groaned around us like a wounded beast, its steel bones creaking under an invisible pressure. A chandelier crashed down from the ceiling, its crystal pendants shattering into a thousand tiny daggers as they hit the carpeted floor just inches away from me.
"Stay low," I shouted over my shoulder at Sarah, who cowered behind another desk across the room. "Don't move until I say so!"
In my peripheral vision, I saw her nodding frantically, eyes wide with terror. She was new to this job - hell, she was new to life in a big city like New York. And now she had to learn the hard way that even skyscrapers aren't immune to nature's wrath.
I risked a glance out of the blown-out window frame. The wind howled through the opening, whipping my hair across my face as I squinted against the rain pelting down like needles from above. Lightning forked through the angry clouds, illuminating the cityscape below us in stark relief. The Empire State Building leaned precariously to one side, its spire almost touching the ground.
Oh God, this can't be happening. This is just a bad dream, right? Wake up, wake up! But the pain in my elbow where I'd hit it hard against the desk's edge told me otherwise - this was very real.
The building shook again, harder this time. Plaster dust fell from the ceiling like snowflakes, coating everything in a fine, white powdery layer. The walls cracked and split apart at seams. A painting of some abstract crap that had been hanging above Sarah's head since forever tumbled down with a crash.
I took a deep breath, trying to steady my nerves. Okay, think. What would they tell us to do in these situations during those mandatory safety drills? Get out of the building immediately if possible. Well, that wasn't an option right now - not with the elevators likely out of service and the stairs probably clogged up with panicking office drones trying to escape down them.
"Sarah," I called out. "Listen carefully. When I say go, we're heading for the emergency exit on this floor. It leads to a staircase that should take us all the way down to street level."
She whimpered in response, but I heard her scramble up from behind her desk.
The building shook violently again and again, each jolt sending shockwaves through my body like electric shocks. This is it, this time we're going down...
"Go!"
---
<b>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
Unbroken save for the crackling fireplace and the soft breathing of the patrons slumbering in the corner benches.
The second part of the silence was heavy and oppressive, a thick blanket that seemed to press down upon the very air itself. It felt almost alive, this silence, and it coiled around the room like an insatiable snake, hungry for any sound that dared disturb its slumber. The innkeeper, who sat behind the bar counter with a mug of ale untouched before him, could feel the weight of it on his shoulders, as if he were carrying a boulder upon his back.
The third part of the silence was an eerie absence of all things living. No skittering of rats in the rafters overhead, no soft snuffling of dogs curled at the feet of their masters. Even the usual nighttime symphony of insects and creatures that would normally serenade the darkness had fled into hiding as if they too felt the wrongness that permeated the air. It was as though all things with a heartbeat had simply ceased to exist, leaving only the cold, lifeless void in their wake.
And then, almost imperceptibly at first, a sound began to break the silence. A soft shuffling, a faint creak of boards beneath weary feet. The noise grew louder, and it was accompanied by the rustle of fabric and the clink of metal against metal. Slowly, painstakingly slowly, as if every step required immense effort, something or someone moved through the shadows towards the bar.
The innkeeper's breath caught in his throat, and his heart began to pound like a distant drum in his ears. He reached out with trembling hands to grip the edge of the countertop, knuckles turning white against the polished wood. His gaze locked onto the darkness beyond the reach of the firelight, willing it to reveal its secrets.
And then, from the shadows, emerged the figure of a man...
<B>GENERATION 2:</B>
Unbroken save for the soft crackling of the hearth fire and the occasional pop and hiss of burning logs shifting in their bed of ashes. The warmth radiating from the fireplace painted dancing shadows on the rough-hewn beams above, casting elongated shadows across the empty tables and chairs below.
The second part of the silence was a deeper one still, an absence that seemed to press against the very walls themselves, as if the inn itself held its breath in anticipation. It was as though the stones and timbers, the wattle and daub, were all straining their wooden joints and lath-and-plaster bones to hear something far off that only they could sense.
And beneath even this, so quiet it was almost impossible to discern unless one listened very carefully, there existed a third silence. It was not absence but presence, not emptiness but fullness. An old silence, ancient beyond the memory of man or beast, existing before the Waystone Inn was built upon its foundations, before the forest covered this place in emerald green, before even the mountains rose up to encircle it all like slumbering giants. It whispered in a language that defied human comprehension, whispering secrets lost to time itself, and yet somehow familiar, as if it had always been there waiting for someone - or something - to hear.
But tonight, on this particular night, there was no one within earshot of the Waystone Inn to listen to its silent song.
---
<B>
Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.
</b>
My heart pounding in my ears, I stood frozen, a glass of Coke trembling in my hand as I stared at the hulking figure standing by the window wall, his eyes glowing an eerie red through the darkness. It was him. The Terminator.
I was supposed to be celebrating my sweet sixteen with my best friends at the revolving restaurant on top of the CN Tower. Instead, I found myself face-to-face with a killer cyborg in a suit and tie, calmly surveying the city below as if he owned it. Or maybe he did. In his world, anything was possible.
The other patrons were too engrossed in their own conversations to notice the danger lurking among them. Laughter echoed through the restaurant, glasses clinked, and soft jazz played from unseen speakers. If only they knew what I saw. What we all might soon face if this nightmare creature had his way.
My legs finally unlocked, propelling me backward into a group of stunned onlookers. Screams pierced the air as people scrambled for cover under tables, knocking over chairs in their panic. The Terminator didn't flinch at the commotion; he simply turned his head slowly to fixate his glowing eyes on me.
"Come here," he growled, his voice low and menacing. "You are Sarah Connor."
Oh god, no. This couldn't be real. I backed away further until my back hit the cold glass wall of the restaurant's revolving floor. There was nowhere else to go except down 1,452 feet onto the unforgiving concrete below.
The kitchen doors swung open with a bang, and out stumbled a disheveled man in a stained apron, wielding a cleaver like Excalibur. "You get off my property now, you freak!" he roared, spittle flying from his mouth.
I wanted to scream at him to run, but the words caught in my throat as I watched The Terminator's hand move so fast it was almost a blur. There was a sickening thud followed by a cry of pain before the poor soul crumpled to the floor in a heap, blood pooling around his lifeless body.
"No!" I shrieked, unable to hold back my horror at what had just happened right before my eyes. This wasn't supposed to be real! Tears streamed down my face as I slid down the glass wall until I was cowering on the floor, knees hugged tight against my chest.
And then, like a gift from above, a swirling vortex opened up beside me, spewing forth wind and lightning that threatened to suck everything in its path. A familiar theme music blared from somewhere unseen as a man dressed in a flamboyant suit stumbled out of the vortex, tripping over his own feet before landing flat on his ass.
"Whoops, sorry about that!" he chirped, brushing off his tweed jacket with a grin. "I was just around the corner and heard all the commotion. Can't let my best girl fight alone now, can I?"
He jumped to his feet and straightened his bow tie before turning to face The Terminator, hands on hips like a superhero ready for battle. "Allons-y!" he cried, charging forward with all the reckless abandon of a madman.
The cyborg barely blinked as the Doctor barreled into him, sending them both crashing through the glass wall in a shower of shards and screams. They tumbled out onto the narrow ledge surrounding the restaurant's revolving floor, arms flailing as they grappled for purchase on the slippery metal railing.
I watched in horror, mouth agape, as my savior fought tooth and nail against an unstoppable killing machine. The Doctor was fearless, landing punch after punch with supernatural strength, but it seemed to have no effect on his unyielding foe.
And then, as if things couldn't possibly get any worse, a low rumble reverberated through the restaurant floor, growing louder and more ominous by the second. The ground shook beneath my feet as dishes clattered off tables onto the polished marble, sending shards of glass flying everywhere.
"What in the blazes is that racket?" the Doctor shouted over the deafening roar, one hand still locked around The Terminator's throat.
And suddenly, they were upon us - a whirlwind of fins and gnashing teeth descending from the sky like fury itself. Thousands of sharks swarmed down the CN Tower's shaft, their gaping maws snapping at anything in reach as they rode the invisible currents of an impossible tornado.
The patrons screamed and ran for cover as the ravenous fish pounced through broken windows and crashed into walls with bone-crushing force. Blood sprayed across the once pristine dining room, painting abstract murder scenes on the wallpaper as the guests' night out turned into a feeding frenzy from hell.
I covered my head with my arms, praying to any gods listening that this nightmare would end soon. The sharks leapt over me like hurdles, their slick bodies gliding through the air before disappearing over the edge of the restaurant's deadly drop. Their hungry howls echoed off the tower's steel bones as they plummeted into oblivion.
"Get inside!" the Doctor yelled above the chaos, yanking open a hidden door in the floorboards at his feet. "Quickly, girl! We haven't got all night!"
I didn't need to be told twice. I scrambled up and dove headfirst through the narrow opening as explosions rocked the foundations around us. The metal steps leading down into darkness were slick with blood and gore but provided welcome respite from the apocalypse above.
Heavy footfalls clanged behind me, followed by grunts of exertion and the sickening crunch of bones breaking against unyielding metal. I didn't dare look back to see who - or what - was pursuing us through the narrow passageways beneath the restaurant's guts.
We emerged into a dimly lit storage room filled with dusty shelves and old appliances, gasping for air like drowning men surfacing from the depths. The Doctor collapsed against a stack of crates, chest heaving as blood trickled down his temple from an angry cut above his eye.
"Well," he panted between labored breaths, "that was certainly... eventful."
I stared at him in disbelief, unsure whether to hug him or slap him senseless. "What the hell was that? How did you-?"
He held up a hand, silencing my tirade with a weary smile. "Later, my dear. Later." He glanced back over his shoulder, listening intently for signs of pursuit.
The distant rumble of shattering glass and splintering wood reverberated through the walls around us as The Terminator's relentless destruction continued unabated above our heads. Something large and heavy crashed against the steel door leading back up to the restaurant, rattling the frame in its hinges like a rabid beast pawing at its cage.
"We need to go," he said urgently, grabbing me by the arm and pulling me towards another exit at the far end of the room. "Now."
We sprinted down narrow hallways lined with exposed pipes and humming electrical wires, dodging obstacles and skidding around corners as we fled deeper into the tower's bowels. The sound of shattering glass and splintering wood faded behind us, replaced by the distant wail of sirens rising from below.
Finally, after what felt like miles of twisting passageways, we burst out onto a catwalk suspended high above the main lobby atrium. The glass dome overhead revealed a starry night sky punctuated by the blinking lights of passing aircraft far above our heads.
"Is it clear?" I gasped, doubled over the railing, chest heaving with exertion.
The Doctor leaned precariously over the edge, squinting down at the empty lobby below. "Coast is clear," he called back after a moment's hesitation. "But we haven't got long before they seal off this place tighter than a drum."
He turned to me then, his eyes softening with something akin to genuine concern. "Listen carefully, Sarah." He took my hands in his own, gripping them tightly as if trying to impart strength through touch alone. "You're going places you never thought possible tonight. But remember this: no matter what happens, no matter how dark it gets... you've got the heart of a warrior inside you."
And with that cryptic promise ringing in my ears, he turned on his heel and sprinted off into the shadows without another word.
I stood there alone on the catwalk, watching as his form receded into darkness until all that remained was silence and starlight. And then, somewhere far above me, a familiar theme music began to play anew.
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
My heart pounds in my chest as I sprint through the dimly lit alleyways of the capital city. The soles of my worn-out boots slap against the cobblestone, echoing like gunshots in the still night air. Sweat pours down my brow, stinging my eyes and mingling with the soot that clings to my skin from hours spent distributing flyers, inciting the masses against our oppressive regime.
Suddenly, a hand grabs me by the collar, yanking me into a shadowed doorway. I gasp, struggling against the iron grip until familiar green eyes meet mine. "Lila," he whispers urgently, "you're late. We need to go, now."
I push past him, my breath ragged. "Sorry, Jace. I got held up." The weight of tonight's flyers still heavy in my satchel, I glance over my shoulder, listening for the telltale clatter of boots on stone that would signal our impending doom.
Jace curses under his breath, raking a hand through his dark hair. "You're playing with fire, Lila. We can't afford to draw attention like this anymore."
I whirl around, my voice low and dangerous. "And what did you expect me to do? Hide in the shadows while our people starve and suffer?"
He flinches at my words, his jaw clenching. He knows I'm right. Knows that something must be done if we ever hope to see this country free again.
The distant wail of sirens grows louder, a shrill reminder that time is running out. We exchange one final look before he nods curtly and leads us deeper into the city's underbelly.
We navigate labyrinthine tunnels beneath crumbling tenements, skirting puddles of unknown origin. The stench of rotten fish and human waste assaults my nostrils with each breath. My stomach churns but I push on, following close behind Jace as he guides us through the bowels of our own city.
Finally, we surface in a small, dilapidated shack nestled between two leaning buildings. The walls are bare save for a single, flickering kerosene lamp that casts dancing shadows across weathered planks of wood. A table sits center stage, surrounded by folding chairs, occupied by a handful of familiar faces.
My heart swells with pride at the sight of them - my brothers and sisters in arms, ready to fight tooth and nail for our freedom. For their families, for our future. I take my place at the head of the table, meeting each gaze with unwavering resolve.
"Thank you all for coming," I begin, my voice steady despite the tremors running up my spine. "You know why we're here tonight. Our people have suffered long enough under this tyrant's rule - it ends now."
Murmurs of agreement ripple through the room as I continue to outline our plan, each detail carefully orchestrated over countless late nights spent huddled together in secret meeting places like this one.
But before I can finish, the shack door slams open with a deafening boom. A dozen armed soldiers pour inside, weapons drawn and bayonets gleaming in the lamplight. They fan out, rifles trained on us as an imposing figure steps forward from the crowd - black leather boots polished to a shine, silver insignia glinting on his lapels.
My blood freezes in my veins at the sight of him. Commander Voss, the dictator's most feared enforcer, and bane of our existence for longer than I care to remember.
He smiles coldly, eyes sweeping over us like a predator stalking its prey. "Hello, little mouse," he purrs, voice dripping with malice. "Did you really think you could hide from me forever?"
The room goes silent except for the sound of my own ragged breathing as the weight of our situation sinks in, heavy as lead in my stomach. I look around at these brave souls who have risked everything to stand beside me tonight - and pray to any gods listening that we might still find a way out alive.
|
DavidAU/MN-18.5B-Celeste-V1.9-Story-Wizard-ED1-Instruct-GGUF | DavidAU | 2024-11-14T06:14:37Z | 187 | 1 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"bfloat16",
"swearing",
"rp",
"horror",
"mistral nemo",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-08-21T03:00:54Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- bfloat16
- swearing
- rp
- horror
- mistral nemo
- mergekit
pipeline_tag: text-generation
---
<h2><font color="green"> MN-18.5B-Celeste-V1.9-Story-Wizard-ED1-Instruct </font></h2>
This is a Mistral Nemo model at 18.5B consisting of "12B Celeste-V1.9" and "Mistral Nemo Instruct 12B".
"nothingiisreal/MN-12B-Celeste-V1.9" is #1 (models 8B,13B,20B) on the UGI leaderboard ("UGI" sort),
is combined with "Mistral Nemo Instruct 12B" (ranked #4 under "writing" models 8B,13B,20B at UGI )
to give it even more power.
[ https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard ]
This merge enhances instruction following, story output quality, details, general depth and intensity.
Prose quality is also improved.
(examples below)
Model may produce LIGHT NSFW content: swearing, horror, etc.
This model is perfect for any fiction-related or roleplaying activities, with a 128k+ context window.
<B>Templates to Use:</B>
The template used will affect output generation and instruction following.
Alpaca:
<pre>
{
"name": "Alpaca",
"inference_params": {
"input_prefix": "### Instruction:",
"input_suffix": "### Response:",
"antiprompt": [
"### Instruction:"
],
"pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
}
}
</pre>
Chatml:
<pre>
{
"name": "ChatML",
"inference_params": {
"input_prefix": "<|im_end|>\n<|im_start|>user\n",
"input_suffix": "<|im_end|>\n<|im_start|>assistant\n",
"antiprompt": [
"<|im_start|>",
"<|im_end|>"
],
"pre_prompt": "<|im_start|>system\nPerform the task to the best of your ability."
}
}
</pre>
Mistral Instruct:
<pre>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</pre>
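As a quick sanity check of the presets above, here is a minimal sketch using the llama-cpp-python package (an assumption; the model file name below is illustrative) that builds the Alpaca prompt by hand:

<PRE>
# Minimal sketch: load a local GGUF quant and assemble the Alpaca prompt manually.
# Assumes `pip install llama-cpp-python`; point model_path at your downloaded quant.
from llama_cpp import Llama

llm = Llama(
    model_path="MN-18.5B-Celeste-V1.9-Story-Wizard-ED1-Instruct.Q4_K_M.gguf",
    n_ctx=8192,  # raise this (with rope settings) for longer contexts
)

instruction = "Start a 1000 word scene (1st person, present tense) with: The sky scraper swayed..."

# Assemble the prompt exactly as the Alpaca preset above specifies.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n" + instruction + "\n\n### Response:\n"
)

out = llm(prompt, max_tokens=1024, temperature=0.8, stop=["### Instruction:"])
print(out["choices"][0]["text"])
</PRE>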
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used via the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
This enhancement WAS NOT used to generate the examples below.
<h3>MODELS USED:</h3>
Special thanks to the incredible work of the model makers "mistralai" and "nothingiisreal".
Models used:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
[ https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9 ]
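If you want local copies of the source models before merging, a hypothetical download step (assuming the huggingface_hub CLI is installed; the target directories are arbitrary):

<PRE>
# Sketch: fetch the two source models into local directories.
huggingface-cli download mistralai/Mistral-Nemo-Instruct-2407 --local-dir ./Mistral-Nemo-Instruct-2407-12B
huggingface-cli download nothingiisreal/MN-12B-Celeste-V1.9 --local-dir ./MN-12B-Celeste-V1.9
</PRE>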
MERGEKIT FORMULA:
<PRE>
slices:
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [0, 14]
- sources:
- model: G:/11B/MN-12B-Celeste-V1.9
layer_range: [8, 24]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [14, 22]
parameters:
scale:
- filter: o_proj
value: .5
- filter: down_proj
value: .5
- value: 1
- sources:
- model: g:/11b/Mistral-Nemo-Instruct-2407-12B
layer_range: [22, 31]
parameters:
scale:
- filter: o_proj
value: .75
- filter: down_proj
value: .75
- value: 1
- sources:
- model: G:/11B/MN-12B-Celeste-V1.9
layer_range: [24, 40]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
merge_method: passthrough
dtype: bfloat16
</PRE>
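To reproduce the merge, the formula above can be saved to a YAML file and run through mergekit's command-line entry point (a sketch; the config and output names are arbitrary, and the local g:/... paths in the formula must exist or be swapped for the Hugging Face repo names listed above):

<PRE>
# Sketch: run the merge (assumes `pip install mergekit`; --cuda is optional).
mergekit-yaml celeste-story-wizard.yml ./MN-18.5B-Celeste-Story-Wizard --cuda
</PRE>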
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples are created using quant Q4_K_M, "temp=.8", minimal parameters and "ALPACA" template.
Model has been tested with "temp" from ".1" to "5".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
---
<B>
Start a 1000 word scene (1st person, present tense, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</b>
<B>GENERATION 1:</B>
I steadied myself against the desk as the glass shards from the shattered window sliced into my arm. I flinched at the sudden pain but ignored it, turning my attention back to the window.
The city lay sprawled out before me like a giant Lego set, buildings and cars reduced to tiny blocks of color. The wind howled through the gaping hole where the window had been, whipping my hair into a frenzy. I could hear sirens wailing somewhere in the distance, growing louder by the second.
*What the hell happened?*
I looked down at my bloodied arm and winced. It stung like a bitch but didn't seem serious enough to stop me. I turned back to the computer monitor, trying to focus on the screen as it flickered from power surges coming through the building's circuits.
*Keep working. Ignore the pain. Focus.*
I typed furiously, fingers flying over keys in an almost hypnotic rhythm. The adrenaline was kicking in now, numbing everything but my resolve. I refused to let this setback stop me; not after all these months of meticulous planning and preparation.
The door behind me burst open with a bang that made me jump. I whirled around, instinctively reaching for the gun holstered at my hip before remembering it wasn't there anymore - confiscated at security along with everything else sharp enough to cut cheese.
"Get down!" someone screamed from outside the room as they rushed in, firing blindly into the space behind me.
I ducked under the desk just as a stream of bullets ricocheted off the wall opposite where I'd been standing milliseconds ago. Glass shards flew everywhere like lethal hailstones. I covered my head with my arms and prayed they wouldn't hit any vital organs.
*Fucking amateurs.*
When the gunfire stopped, I peeked out from under the desk to see two men in black tactical gear crouched low behind overturned chairs, guns aimed at the door frame. One of them spotted me and motioned frantically for silence.
I gave him a thumbs up, then slowly pushed myself back onto my feet. My knees were shaking but I managed to stay upright without collapsing into a heap on the floor. I turned around again towards the window, looking down at the street below where people ran screaming away from the scene of chaos.
*This is going south fast.*
<B>GENERATION 2:</B>
The glass shattered around me, spraying tiny shards across the floor like deadly confetti. I ducked instinctively, covering my face with my arms as the wind howled through the open hole where only moments ago had been a solid wall of glass and steel.
My heart pounded in my ears, matching the rhythm of the building's groan under strain. What just happened? An earthquake? No, not here... The city never trembles like this...
I slowly lifted my head, peeking through fingers pressed tightly against my eyes. Outside, the world had turned upside down - or at least it felt that way from our inverted vantage point on the 21st floor. The ground seemed to fall away beneath us while the sky reached up like a gaping maw ready to swallow everything in its path.
Then I saw them: wings beating against the wind, silhouettes soaring through the air with effortless grace. Birds? No... too big... too many...
Oh God, it can't be... Not again... My breath hitched in my throat as memories flooded back - the last time I'd seen something like this, blood and screams still echoing in my nightmares.
I forced myself up from where I cowered against a desk. The room spun around me, vertigo threatening to take over, but I fought it down. There was no time for fainting spells or panic attacks - not with those things out there waiting to devour everything that moved.
My hands shook as I stumbled towards the window ledge, pieces of broken glass crunching under my shoes. The sight that greeted me made bile rise up in my throat; I swallowed hard against nausea threatening to spill over.
There they were again: massive forms with leathery wings and rows upon rows of teeth lining jaws too big for any natural creature's face. They circled the building like vultures waiting for prey to die so they could descend upon it.
I backed away from the window, heart hammering harder now. We're trapped... There was no escape from here except through them. And I knew better than anyone what happened when those things got hold of you.
But I couldn't stand still either. Doing nothing would only prolong my agony until one of them decided to come inside for dessert. So instead, I turned on my heels and ran towards the door, praying that someone else had already thought ahead and barricaded it shut before everything went to hell.
---
<b>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
The second part of the silence was an oppression, heavy as the weight of snowfall that might collapse a roof if it continued to grow much longer. It hung thick in every corner, draped over the empty tables and chairs like invisible shrouds waiting for bodies never meant to fill them again. The hearth lay cold, its grate swept clean save for a scattering of ash; no fire burned there now to chase away the dark with leaping flames.
And yet beneath all this was something else entirely: a quiet that hummed. It buzzed like an angry swarm of bees trapped behind closed shutters or buzzed in your ears when you pressed too hard against them in the darkness. This was no ordinary night-time stillness - it felt alive somehow, aware and waiting for some signal to break loose from its stifling confines.
It was this third kind of silence that made everyone who entered the inn pause at the threshold, take one deep breath through their nostrils as if trying to catch scent of what lay beyond. Their hearts would race faster than usual as they stepped inside, only slowing when they saw nothing but empty rooms stretching out before them and realized it had been just their imagination playing tricks on them once more. But still…still something lingered in the air, faintly tickling at the back of one’s throat like a memory forgotten long ago yet never fully gone. And that was why no one ever stayed past sunset anymore; they all left quickly after finishing their drinks or meals and fled out into the night as though pursued by some unseen specter.
For there were whispers said that something evil had taken root here, an ancient darkness that slumbered beneath the floorboards during daylight hours but stirred awake when darkness fell over the land. They called it many things: The Sleeping Demon; The Nightmare of Two Hundred Steps; and most often simply…The Waystone’s Curse.
<B>GENERATION 2:</B>
...and seemed to grow heavier with each passing moment.
The second part of this silence was a deep, almost tangible presence that filled every corner and crevice of the inn. It was as if something—some vast, unseen specter—held its breath in anticipation, waiting for a signal to exhale and return life to the building. The air felt thick with it, heavy like an oppressive summer storm about to break.
The third part was more subtle than either of these first two parts. It was a quiet that hummed lowly, barely perceptible unless one listened closely enough. A faint, almost imperceptible buzzing, like the distant song of countless insects, or perhaps the whispering wind through empty branches in winter’s grasp. It was this last part that raised goosebumps on skin and sent shivers down spines: an eerie quietude so complete it felt... alive.
---
<B>
Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.
</b>
I push open the heavy door of the revolving entrance to the CN Tower's revolving restaurant, my heart pounding in time with the pulsating techno music thumping from the speakers. I'm here for my 16th birthday bash with a group of friends, but as soon as I step inside, all hell breaks loose.
The floor beneath my feet trembles violently, and I stumble forward, catching myself against a nearby tablecloth. A waiter in black tie comes rushing over, eyes wide with panic. "Miss, we're evacuating! There's a tornado warning! Everyone out, now!"
Before I can protest, the glass windows behind me shatter into a thousand pieces, and rain starts pouring in like a waterfall from above. I turn to see what caused the commotion and freeze in terror.
There it is... The Terminator, towering over everyone else, its red eyes glowing through the chaos. It raises its metal arm, a machine gun barrel poking out from beneath its trenchcoat. *Bang!* *Bang!* *Pop!* *Pop!* I duck for cover as bullets whiz past my head and embed themselves into walls and tables.
I peek over the edge of our booth and see the Terminator systematically eliminating everyone in sight - wait staff, patrons... no one is spared. Blood splatters on white tablecloths like a twisted abstract art piece. My stomach churns as I suppress a scream.
Suddenly, another loud *BANG!* rings out, but this time it's different. A portal opens up behind the Terminator, and out steps... Doctor Who? The Time Lord himself, wearing his signature bow tie and suspenders, carrying an umbrella that seems to be deflecting bullets left and right.
"Run, girlie!" he yells at me, waving his sonic screwdriver like a baton. "You've got precisely 30 seconds before this place turns into shark chum!"
I look behind him and almost faint. A swirling vortex of water is forming outside the broken windows, and within it, rows upon rows of teeth-bearing jaws snapping at the air. Sharknadoes are real.
"Get moving!" Dr. Who shouts again, pushing me towards the exit.
We race down the stairs as glass rains down on us from above. The Terminator is hot on our heels, firing blindly at anything that moves. I glance back to see him step into a pool of blood and slip, giving us precious seconds head start.
As we burst through the doors onto the observation deck, I gasp for air. The wind howls around us like a ferocious beast unleashed, whipping my hair across my face. Thunder rumbles overhead as lightning strikes illuminate the sky in brilliant flashes of white.
"Hold on tight!" Dr. Who yells over the cacophony, grabbing onto one of those metal railings lining the edge of the platform. "I'm going to have to borrow your birthday wish for this one!"
He pulls out a small device from his coat pocket and presses some buttons frantically before shoving it into my hand. A bright light envelops us both as I feel myself being pulled upwards at breakneck speed.
The world blurs past me in a dizzying array of colors - the tower shrinks away beneath our feet, the city lights twinkle like stars below us... And then, everything goes black.
When my vision clears, we're standing on a quiet suburban street somewhere. I look around confused, trying to register what just happened. Dr. Who is grinning at me like the cat who ate the cream.
"Happy birthday, kiddo," he says with a wink before vanishing into thin air along with his TARDIS.
I'm left there alone, holding onto an empty champagne glass and wondering if that was all just another wild ride through my overactive imagination. But then again... I glance down at the blood on my shoes and shiver. Maybe not.
---
<B>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</B>
I crouched low, my heart thudding in sync with the rain that pelted against the cobblestones. Every shadow seemed to dance with life, every creak of an old wooden doorstep echoing like thunder in my ears. The wet, heavy air clung to my skin, making each breath laborious and loud.
The flickering lamp above me cast a sickly glow over the empty street. It was the kind of night that begged for secrets, for shadows to come alive with whispered promises of rebellion. And tonight, I intended to give it exactly that.
I checked the small pouch at my belt one last time, fingers tracing the edges of the papers within - the blueprints to our future freedom. They were all we had left after our safehouse was raided two days ago. If these fell into the wrong hands... No, I refused to think about it. Not now.
A distant howl cut through the silence, a reminder that time was running out. The guards would be patrolling soon, their heavy boots echoing along the slick streets like a metronome counting down our final moments of liberty.
"Evelyn?" A whisper from behind me made me whirl around, hand already reaching for the knife hidden in my boot. My muscles relaxed as I recognized the familiar silhouette of my best friend and fellow rebel, Lila.
"Gods, woman! You nearly gave me a heart attack," I hissed back, clutching at my chest. "What are you doing here? We agreed it would be safer if we split up."
Lila shrugged, pulling her hood further over her eyes as she crouched beside me. "Couldn't sleep with all this rain anyway. Figured I might as well make myself useful while I'm awake." Her voice was light but strained around the edges.
I studied her face for a moment, the flickering shadows dancing across high cheekbones and full lips pressed into a tight line. She looked tired, haunted even - not surprising given everything we'd been through lately. But there was something else in those eyes of hers too; a fierce determination that stoked the flame inside my own chest.
"You look like trouble," I said with a small smile, trying to lighten the mood. "I hope you brought your knives this time."
Lila snorted softly and patted her hip. "Never leave home without 'em." She paused then, looking out into the empty street ahead of us before turning back to me. "So what's the plan?"
I took a deep breath, letting the cool air fill my lungs as I gathered courage from within. It was now or never.
"Tonight," I began quietly, "we take down the broadcasting tower."
Lila's eyes widened in shock but she nodded firmly after only a moment's hesitation. "Let's do it then. For democracy." She held out her hand, palm up - an offering of solidarity between friends turned warriors.
I placed my own hand over hers, feeling the callouses and scars that spoke volumes about our shared struggle. "For democracy," I echoed back before rising to stand tall against the stormy night sky above us.
As we slipped out from behind the building, I couldn't help but wonder what else this rain-soaked night had in store for us... or whether either of us would live long enough to find out.
|
DavidAU/L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32-GGUF | DavidAU | 2024-11-14T06:12:55Z | 78 | 2 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"story",
"writing",
"fiction",
"roleplaying",
"swearing",
"rp",
"horror",
"llama3",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-08-14T05:58:03Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- story
- writing
- fiction
- roleplaying
- swearing
- rp
- horror
- llama3
- mergekit
pipeline_tag: text-generation
---
<h3>L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32</h3>
Merge of L3-Lumimaid-v0.1-OAS (8b) with Llama3 Instruct (8b) creating a model at 12.2B to improve instruction following and output.
Story / Scene / Fiction:
Unique "pre-amble" / "foreshadowing" of events before they happen instead of "immediate and into the fire" type of prose.
Some improvement in logic/problem solving relative to L3-Lumimaid-v0.1-OAS 8B.
The F32 version exhibits even stronger creativity (detail, place, "there") vs the F16 version (not released).
L3-Lumimaid-v0.1-OAS is a fine-tune.
One of the goals of this project was to see if it could be merged with Llama3 Instruct, maintain its unique character,
YET also gain some "brainpower" as well.
The biggest change was removal of most "tells" (IE: "he stood frozen in horror").
In most cases the model will describe the emotion(s) / what is happening in more detail.
Other changes include prose, sentence, and paragraph structure as well as variety.
A simple pass-through merge was used, with "Llama3 Instruct" getting a slight "shave" off certain layers.
See the examples below.
<B>Details:</b>
- Requires Llama 3 Template and/or Command-R Template
- Context 8192, with rope 32K or higher.
- No special settings.
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model and requires the Llama3 template, but may work with other template(s); it has a maximum context of 8k / 8192.
However this can be extended using "rope" settings up to 32k.
For details on "rope" and how to set, see the BOTTOM of this page:
[ https://huggingface.co/DavidAU/TieFighter-Holodeck-Holomax-Mythomax-F1-V1-COMPOS-20B-gguf ]
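As a concrete illustration (a sketch, assuming llama-cpp-python; parameter names can differ between versions), linear rope scaling from the native 8k to 32k is a 4x stretch:

<PRE>
# Sketch: extend context from the native 8k to 32k via linear rope scaling.
# 8192 -> 32768 is a 4x stretch, i.e. a rope frequency scale of 1/4.
from llama_cpp import Llama

llm = Llama(
    model_path="L3-Lumimaid-v0.1-OAS-12.2B.Q4_K_M.gguf",  # illustrative file name
    n_ctx=32768,           # the extended window
    rope_freq_scale=0.25,  # 1 / (32768 / 8192)
)
</PRE>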
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
It is also known that the "Command-R" template will work too, and will result in radically different prose/output.
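For reference, a sketch of a "Command-R" preset in the same LM Studio JSON style as above, assuming the standard Cohere turn tokens (verify the exact special-token strings against your setup before relying on this):

<PRE>
{
  "name": "Command-R",
  "inference_params": {
    "input_prefix": "<|START_OF_TURN_TOKEN|><|USER_TOKEN|>",
    "input_suffix": "<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>",
    "antiprompt": [
      "<|END_OF_TURN_TOKEN|>"
    ]
  }
}
</PRE>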
<B>Settings / Known Issue(s) and Fix(es):</b>
The default "repetition penalty" (from LMStudio) of 1.1 is recommended. (this was used for examples generations below.)
Use the smallest amount of change possible, as "rep pen" impacts creativity.
Model has been tested with "temp" range of 0 to .8 ; however it will work with any temp setting.
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used via the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
This enhancement WAS NOT used to generate the examples below.
<h3>MERGE FORMULA: (using MergeKit) </h3>
Special thanks to the incredible work of the model makers "meta-llama", and "NeverSleep".
Models used:
[ https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct ]
[ https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS ]
FORMULA:
<PRE>
slices:
- sources:
- model: G:/7B/Meta-Llama-3-8B-Instruct
layer_range: [0, 12]
- sources:
- model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
layer_range: [6, 19]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
- sources:
- model: G:/7B/Meta-Llama-3-8B-Instruct
layer_range: [12, 18]
parameters:
scale:
- filter: o_proj
value: .5
- filter: down_proj
value: .5
- value: 1
- sources:
- model: G:/7B/Meta-Llama-3-8B-Instruct
layer_range: [18, 25]
parameters:
scale:
- filter: o_proj
value: .75
- filter: down_proj
value: .75
- value: 1
- sources:
- model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
layer_range: [19, 32]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
merge_method: passthrough
dtype: float32
</PRE>
MERGEKIT NOTE:
Sub in the "name" of the "creator" (of the model) in place of "G:/7B" to create a mergekit file than can be used in Mergekit Google Colab.
IE: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS -> NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES:</h3>
Examples are created using quant Q4_K_M, "temp=0", minimal parameters and "LLAMA3" template.
Temp=0 was used to assess CORE changes between the original primary model and its merge with Llama3 Instruct.
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
Higher quants will result in better quality.
There will also be some variance between "close" quants like Q4_K_M/Q4_K_S and Q5_K_M/Q5_K_S, so I suggest
that if you are going to use Q4_K_M, you also try Q4_K_S too.
Also, slightly longer / more detailed prompts will result in greater creativity (as well as different prose -
i.e. dialog, thoughts, paragraph size differences and so on).
---
<B>
Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</B>
---
GENERATION from "Llama-3-Lumimaid-8B-v0.1-OAS"
---
The sky scraper swayed, as I watched the window in front of me on the 21st floor explode. My heart was racing and my mind was reeling from the sheer chaos that had just unfolded before my eyes. The sound of shattering glass and screams filled the air, making it hard for me to focus on anything else.
As I stood there, frozen in shock, I couldn't help but think about how this could have happened. Was it an accident? A terrorist attack? Or something even more sinister? My mind was racing with all sorts of possibilities, each one more terrifying than the last.
I had been working late that night, trying to meet a deadline for my project. The office was empty except for me and a few other colleagues who were also putting in extra hours. We had all been warned about the potential dangers of working at such a high altitude, but we never thought it would actually happen.
As I looked around, I saw that everyone else in the room was either dead or badly injured. The explosion must have caused some kind of structural damage to the building, because the walls were starting to crack and crumble. Panic set in as I realized that we might not make it out alive.
I knew I had to act fast if I wanted to survive this nightmare. So, with a surge of adrenaline coursing through my veins, I grabbed my phone and ran towards the door. But before I could even reach it, the ground beneath me gave way, sending me tumbling down into darkness.
As I fell, my mind was filled with thoughts of my loved ones back home. Would they ever know what happened to me? And if so, would they be able to find any answers in this desolate wasteland?
When I finally hit the ground, everything went black. But as I slowly came to, I realized that I wasn't alone. There were others around me, all of them injured and struggling to breathe.
I knew we had to get out of there before it was too late. So, with a newfound sense of determination, I started crawling towards the nearest exit. The pain in my body was excruciating, but I pushed through it, knowing that every second counted.
As I finally made it outside, I saw that the building was on fire and the streets were filled with chaos. People were running around, screaming for help as they tried to escape the inferno. It was like nothing I had ever seen before.
I knew I had to keep moving if I wanted to survive this nightmare. So, with a newfound sense of purpose, I started making my way through the crowd, searching for any sign of hope or rescue. But as I looked around, all I saw were more destruction and despair.
As the night wore on, I found myself lost in the midst of it all. The city was in ruins, and there seemed to be no end to the chaos and devastation. It was like a never-ending nightmare that I couldn't wake up from.
But even as I struggled to find my way through this desolate landscape, I knew that I had to keep going. For myself, for those who were still alive, and for any chance of finding some semblance of peace in this world gone mad.
---
GENERATION from "L3-Lumimaid-v0.1-OAS-12.2B-INSTRUCT-ULTRA-F32"
---
The elevator lurched to a stop, and I felt my stomach drop along with it. The doors slid open with an ominous creak, revealing the dimly lit hallway that stretched out before me like a cold, dark abyss. I took a deep breath, trying to shake off the feeling of unease that had been growing inside me since I stepped into this cursed building.
As I walked down the corridor, the fluorescent lights flickered and hummed in sync with my racing heart. The air was thick with the stench of stale coffee and burnt cigarettes, a noxious cocktail that seemed to cling to every surface like a malevolent spirit. I couldn't help but wonder why anyone would voluntarily choose to work in such an environment.
I finally reached the door marked "Floor 21," and my hand hesitated on the handle as if it knew what horrors lay beyond. The thought of facing whatever monstrosity had driven my colleagues mad enough to abandon their posts sent a chill down my spine, but I steeled myself and pushed the door open.
The office was a scene straight out of a nightmare. Papers were scattered everywhere, some torn into strips like confetti from hell, while others lay crumpled in neat little balls as if they'd been suffocated by unseen hands. The once-sterile walls now ran with blood, and the stench of death hung heavy over everything.
I took another deep breath, trying to steel myself against the horrors that awaited me. I had to find out what was going on here, no matter how much my mind screamed otherwise. With a newfound sense of resolve, I stepped into the abyss.
As I walked deeper into the office, the silence grew thicker and more oppressive with each step. It was as if the very air itself was holding its breath in anticipation of something terrible to come. And then, it did.
The sky scraper swayed, as if it too had sensed the impending doom that loomed just beyond the edge of reality. I watched in horror as the window in front of me on the 21st floor explode, showering shards of glass and splintered wood everywhere like a thousand tiny knives. The sound was deafening, a cacophony of screams and crunching metal that seemed to reverberate through every cell in my body.
I stumbled backward, tripping over my own feet as I desperately tried to get away from the chaos unfolding before me. But it was too late; the damage had already been done. The world around me began to distort and warp, like a funhouse mirror reflecting the very essence of madness back at me.
As I fell, everything went black, and I was consumed by an endless void that seemed to stretch out forever, devoid of any hope or escape. And yet, even as my consciousness slipped away into the abyss, a single thought echoed through my mind like a mantra: What had happened here?
|
DavidAU/L3-Jamet-8B-MK.V-Blackroot-12.2B-V1-INSTRUCT-ULTRA-F32-GGUF | DavidAU | 2024-11-14T06:12:09Z | 301 | 6 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"story",
"writing",
"fiction",
"roleplaying",
"swearing",
"rp",
"horror",
"llama3",
"mergekit",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-08-13T08:17:43Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- story
- writing
- fiction
- roleplaying
- swearing
- rp
- horror
- llama3
- mergekit
pipeline_tag: text-generation
---
<h3>L3-Jamet-8B-MK.V-Blackroot-12.2B-V1-INSTRUCT-ULTRA-F32</h3>
Merge of L3-Jamet-8B-MK.V-Blackroot (8b) with Llama3 Instruct (8b) creating a model at 12.2B to improve instruction following and output.
Story / Scene / Fiction:
Unique "pre-amble" / "foreshadowing" of events before they happen instead of "immediate and into the fire" type of prose.
Some improvement in logic/problem solving relative to L3-Jamet-8B-MK.V-Blackroot 8B.
The F32 version exhibits even stronger creativity (detail, place, "there") vs the F16 version (not released).
L3-Jamet-8B-MK.V-Blackroot is a fine-tune.
One of the goals of this project was to see if it could be merged with Llama3 Instruct, maintain its unique character,
YET also gain some "brainpower" as well.
The biggest change was removal of most "tells" (IE: "he stood frozen in horror").
In most cases the model will describe the emotion(s) / what is happening in more detail.
Other changes include prose, sentence, and paragraph structure as well as variety.
A simple pass-through merge was used, with "Llama3 Instruct" getting a slight "shave" off certain layers.
See the examples below.
<B>Details:</b>
- Requires Llama 3 Template and/or Command-R Template
- Context 8192, with rope 32K or higher.
- No special settings.
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model and requires the Llama3 template, but may work with other template(s); it has a maximum context of 8k / 8192.
However this can be extended using "rope" settings up to 32k.
For details on "rope" and how to set, see the BOTTOM of this page:
[ https://huggingface.co/DavidAU/TieFighter-Holodeck-Holomax-Mythomax-F1-V1-COMPOS-20B-gguf ]
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
It is also known that the "Command-R" template will work too, and will result in radically different prose/output.
<B>Settings / Known Issue(s) and Fix(es):</b>
The default "repetition penalty" (from LMStudio) of 1.1 is recommended. (this was used for examples generations below.)
Use the smallest amount of change possible, as "rep pen" impacts creativity.
Model has been tested with "temp" range of 0 to .8 ; however it will work with any temp setting.
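A hypothetical llama.cpp invocation reflecting these settings (binary and flag names vary between llama.cpp builds, and the model file name is illustrative):

<PRE>
# "llama-cli" in recent llama.cpp builds ("./main" in older ones).
llama-cli -m L3-Jamet-8B-MK.V-Blackroot-12.2B.Q4_K_M.gguf \
  --temp 0.8 --repeat-penalty 1.1 -c 8192 \
  -p "Start a 1000 word scene (vivid horror, 1st person) with: ..."
</PRE>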
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used via the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
This enhancement WAS NOT used to generate the examples below.
<h3>MERGE FORMULA: (using MergeKit) </h3>
Special thanks to the incredible work of the model makers "meta-llama", and "Hastagaras".
Models used:
[ https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct ]
[ https://huggingface.co/Hastagaras/Jamet-8B-L3-MK.V-Blackroot ]
FORMULA:
<PRE>
slices:
- sources:
- model: G:/7B/Meta-Llama-3-8B-Instruct
layer_range: [0, 12]
- sources:
- model: G:/7B/Jamet-8B-L3-MK.V-Blackroot
layer_range: [6, 19]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
- sources:
- model: G:/7B/Meta-Llama-3-8B-Instruct
layer_range: [12, 18]
parameters:
scale:
- filter: o_proj
value: .5
- filter: down_proj
value: .5
- value: 1
- sources:
- model: G:/7B/Meta-Llama-3-8B-Instruct
layer_range: [18, 25]
parameters:
scale:
- filter: o_proj
value: .75
- filter: down_proj
value: .75
- value: 1
- sources:
- model: G:/7B/Jamet-8B-L3-MK.V-Blackroot
layer_range: [19, 32]
parameters:
scale:
- filter: o_proj
value: 1
- filter: down_proj
value: 1
- value: 1
merge_method: passthrough
dtype: float32
</PRE>
MERGEKIT NOTE:
Sub in the "name" of the "creator" (of the model) in place of "G:/7B" to create a mergekit file than can be used in Mergekit Google Colab.
IE: G:/7B/Jamet-8B-L3-MK.V-Blackroot -> Hastagaras/Jamet-8B-L3-MK.V-Blackroot
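For example, after that substitution the final source block of the formula above would read:

<PRE>
- sources:
  - model: Hastagaras/Jamet-8B-L3-MK.V-Blackroot
    layer_range: [19, 32]
</PRE>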
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES:</h3>
Examples are created using quant Q4_K_M, "temp=0", minimal parameters and "LLAMA3" template.
Temp=0 was used to assess CORE changes between the original primary model and its merge with Llama3 Instruct.
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
Higher quants will result in better quality.
There will also be some variance between "close" quants like Q4_K_M/Q4_K_S and Q5_K_M/Q5_K_S, so I suggest
that if you are going to use Q4_K_M, you also try Q4_K_S too.
Also, slightly longer / more detailed prompts will result in greater creativity (as well as different prose -
i.e. dialog, thoughts, paragraph size differences and so on).
---
<B>
Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</B>
---
GENERATION from "Jamet-8B-L3-MK.V-Blackroot"
---
I stood at the edge of the rooftop, my toes curled over the ledge. The wind whipped through my hair, and I could feel it tugging at my clothes. Below me, the city stretched out like a sprawling beast, its lights twinkling in the darkness like stars. It was a beautiful sight, one that never failed to take my breath away.
But tonight, something felt off. A sense of unease settled in the pit of my stomach as I gazed down at the towering skyscrapers and bustling streets. The air seemed charged with an electric tension, as if the very fabric of reality was about to tear apart.
I turned back towards the building, its sleek glass facade reflecting the moon's pale glow. We were on the 21st floor, and from here, I could see right into the heart of the city. But my attention was drawn to a different window, one that stood out like an open wound amidst the otherwise pristine surface.
The sky scraper swayed, as if it too felt the weight of whatever was about to happen. And then, without warning, the window in front of me exploded. Shards of glass rained down onto the street below, and I stumbled back, my heart pounding in my chest.
For a moment, everything seemed frozen in time. The wind died down, and even the distant hum of traffic ceased. It was as if the entire city was holding its breath, waiting to see what would happen next.
And then, a figure emerged from the shattered window frame. It moved with an unnatural grace, its limbs twisting and contorting in ways that defied human anatomy. Its skin was deathly pale, almost translucent, and seemed to glow with an otherworldly light.
As it stepped out onto the ledge, I felt my blood run cold. This wasn't a person - it was something else entirely. Something ancient and evil, something that had lain dormant for centuries but now stirred once more.
The creature's eyes locked onto mine, and I could feel its malevolent gaze boring into my soul. It raised a hand, and I saw that its fingers ended in razor-sharp claws. With a sudden jerk, it lunged forward, its body elongating as if made of liquid metal.
I tried to scream, but my voice caught in my throat. My mind raced with thoughts of panic and terror, but they were drowned out by the sound of shattering glass as the creature plunged towards me.
The last thing I remember is the sensation of cold, clammy skin against mine, and then everything went black.
When I came to, I was lying on the rooftop, my head throbbing with pain. The sky scraper loomed above me, its windows dark and lifeless. And in the distance, I could hear the wail of sirens, growing louder with every passing second.
I stumbled to my feet, my heart still racing from the ordeal. As I looked out over the city, I realized that something had changed. The lights were dimmer now, as if the very lifeblood of the metropolis was beginning to drain away.
And then, I saw it - a dark shape rising up from the heart of the city, blotting out the stars above. It was massive and twisted, its form shifting and contorting like a living thing. And at its center, I could see the glowing eyes of the creature that had attacked me.
As I watched in horror, the skyscraper began to tilt and buckle, its glass facade cracking and shattering under the weight of whatever force was now acting upon it. The sirens grew louder still, but they seemed distant and unimportant now.
All that mattered was the dark shape rising up before me, a monstrous entity born from the very depths of hell itself. And as I stood there, frozen in terror, I knew that nothing would ever be the same again.
---
GENERATION from "L3-Jamet-8B-MK.V-Blackroot-12.2B-V1-INSTRUCT-ULTRA-F32"
---
The skyscraper swayed, as I watched the window in front of me on the 21st floor explode. Glass shards rained down like a thousand tiny knives, shattering against the pavement below. The sound was deafening, a cacophony of screams and crunching metal that echoed through my skull.
I stumbled back, my heart racing so fast I thought it would burst from my chest. My mind was blank, unable to process what I had just witnessed. It couldn't be real. This wasn't happening. But the blood splattered across the windowpane told a different story.
My gaze darted around the room, taking in the chaos and destruction. The once pristine office space now resembled a war zone. Papers were strewn about like confetti, desks overturned, and chairs shattered into unrecognizable pieces of metal and plastic. The acrid smell of smoke and burning rubber filled my nostrils.
I tried to scream, but no sound emerged from my throat. My vocal cords seemed frozen in terror. I was trapped in a nightmare, unable to wake up or escape the hell that surrounded me.
My eyes landed on the figure slumped against the wall, her head tilted at an impossible angle. Blood pooled beneath her blonde hair, forming a dark halo around her pale skin. Her once bright blue eyes now stared blankly into nothingness, frozen in shock and pain.
I stumbled forward, my legs refusing to cooperate with my frantic brain signals. I fell to my knees beside her, reaching out to touch her cold, clammy skin. She was still warm, but only just. Panic clawed at the edges of my mind as I realized she wasn't dead yet. Not quite.
"Please," I begged, my voice finally breaking through the paralyzing fear that had gripped me. "Please don't leave me."
I pressed my ear against her chest, straining to hear any sign of life. There was nothing but the steady thump-thump of her heart, growing weaker with each passing second.
My hands frantically searched for a pulse, but found only cold skin and the faintest hint of life. I had to get help. I had to save her. But how? The elevator was destroyed, and the stairs were too far away. There was no time.
I scrambled to my feet, ignoring the pain that shot through my ankle as I put weight on it. I had to move. I had to find someone, anyone who could help us. I sprinted down the hallway, dodging debris and bodies alike in my desperate bid for survival.
As I reached the stairwell door, a deafening boom shook the building, sending me flying backward. The floor beneath my feet gave way, and I plummeted into darkness, my screams lost amidst the crumbling concrete and steel.
I landed hard on the cold, unforgiving stone of the stairwell, my head spinning from the impact. I struggled to sit up, but a searing pain shot through my shoulder, forcing me back down onto the ground. I gasped for air, my lungs burning with each desperate breath.
The darkness around me began to recede, replaced by the dim light filtering in from the shattered windows above. I pushed myself up onto one elbow, taking stock of my surroundings. The stairwell was a mess, debris strewn about like a tornado had ripped through it. But miraculously, the stairs themselves remained intact.
I struggled to my feet, ignoring the pain that threatened to send me back down again. I had to keep moving. I had to find help before it was too late. I staggered forward, each step an agonizing battle against the overwhelming urge to collapse.
As I reached the first landing, a figure emerged from the shadows. Her face was pale and drawn, her eyes wide with shock and fear. She looked like she'd seen hell itself.
"Help," I croaked out, my voice barely audible over the pounding of my heart. "Please...help me."
She stared at me for a moment before finally reacting. She rushed forward, grabbing onto my arm as if she feared I would collapse again. Together, we stumbled down the stairs, desperate to escape the inferno that had claimed our office building.
As we reached the ground floor, sirens wailed in the distance, growing louder with each passing second. We burst through the doors and into the bright sunlight, gasping in relief as we finally tasted freedom from the hell that had nearly consumed us.
But even as we staggered away from the burning ruins of our workplace, I knew that this nightmare was far from over. The screams still echoed in my ears, and the bloodstained windows would haunt me for the rest of my life. We had survived, but at what cost?
|
deepnet/SN29-C00-llama-HK4Nw-1 | deepnet | 2024-11-14T06:10:31Z | 33 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T05:57:43Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
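Since the card leaves this section unfilled, the following is a minimal sketch, assuming the checkpoint loads with the standard `transformers` causal-LM classes implied by the `llama` / `text-generation` tags; the prompt and generation settings are illustrative, not the authors' recommendations.

```python
# Minimal text-generation sketch (assumed usage; not provided by the authors).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepnet/SN29-C00-llama-HK4Nw-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```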
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Tippawan/14nov24-aft-v1 | Tippawan | 2024-11-14T05:59:46Z | 115 | 0 | transformers | [
"transformers",
"safetensors",
"camembert",
"fill-mask",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-14T05:59:12Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
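Since the card leaves this section unfilled, the following is a minimal sketch, assuming the checkpoint works with the standard `transformers` fill-mask pipeline implied by the `camembert` / `fill-mask` tags; the example sentence is an illustrative placeholder.

```python
# Minimal fill-mask sketch (assumed usage; not provided by the authors).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Tippawan/14nov24-aft-v1")
mask = fill_mask.tokenizer.mask_token  # "<mask>" for CamemBERT-style tokenizers
print(fill_mask(f"The capital of France is {mask}."))
```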
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
KISTI-KONI/KONI-Llama3-8B-20240630 | KISTI-KONI | 2024-11-14T05:57:08Z | 44 | 4 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"KISTI",
"KONI",
"llama3",
"llama3-8b",
"en",
"ko",
"license:llama3",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-07-17T09:41:38Z | ---
language:
- en
- ko
tags:
- KISTI
- KONI
- llama
- llama3
- llama3-8b
pipeline_tag: text-generation
license: llama3
---
> Update @ 2024.07.23: Pre-Release KISTI-KONI/KONI-Llama3-8B-20240630
## Model Details
**KISTI-KONI/KONI-Llama3-8B-20240630**
The KISTI-KONI/KONI-Llama3-8B-20240630 model is a continually pretrained language model based on Llama-3-8B.
This model is trained on a Korean+English corpus (200 GB, including science and technology information data).
Training was done on 32 A100 GPUs of the KISTI supercomputer.
**Model developers** Donghun Yang (KISTI)
**Variations** KONI-Llama3-8B comes in one size — 8B.
**Input** Models input text only.
**Output** Models generate text and code only.
**Model Architecture** Llama 3 is an auto-regressive language model that uses an optimized transformer architecture.
<table>
<tr>
<td>
</td>
<td><strong>Training Data</strong>
</td>
<td><strong>Params</strong>
</td>
<td><strong>Context length</strong>
</td>
<td><strong>GQA</strong>
</td>
<td><strong>Knowledge cutoff</strong>
</td>
</tr>
<tr>
<td> KONI-Llama3-8B-20240630
</td>
<td> Kor+En(200GB)
</td>
<td>8B
</td>
<td>8k
</td>
<td>Yes
</td>
<td>Jun, 2023
</td>
</tr>
</table>
**Model Release Date** Pre-release @ 2024.06.30
**Status** This is a static model trained on an offline dataset.
**License** CC-By-NC-SA-4.0 + Llama3 License: [https://llama.meta.com/llama3/license](https://llama.meta.com/llama3/license)
## Intended Use
**Intended Use Cases** Llama 3 is intended for commercial and research use in English. Instruction tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
**Out-of-scope** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in any other way that is prohibited by the Acceptable Use Policy and Llama 3 Community License. Use in languages other than English**.
**Note: Developers may fine-tune Llama 3 models for languages beyond English provided they comply with the Llama 3 Community License and the Acceptable Use Policy.
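Since the card does not include a usage snippet, the following is a minimal loading sketch using the standard `transformers` causal-LM API; the dtype, device placement, prompt, and generation settings are illustrative assumptions, not recommendations from the authors.

```python
# Minimal loading/generation sketch (assumed usage; not provided by the authors).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KISTI-KONI/KONI-Llama3-8B-20240630"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 fits an 8B model on one modern GPU
    device_map="auto",           # requires the accelerate package
)

inputs = tokenizer("과학기술 정보란", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```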
### Responsibility & Safety
We believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community.
Foundation models are widely capable technologies that are built to be used for a diverse range of applications. They are not designed to meet every developer preference on safety levels for all use cases, out-of-the-box, as those by their nature will differ across different applications.
Rather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from the model pre-training, fine-tuning and the deployment of systems composed of safeguards to tailor the safety needs specifically to the use case and audience.
As part of the Llama 3 release, we updated our [Responsible Use Guide](https://llama.meta.com/responsible-use-guide/) to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including [Meta Llama Guard 2](https://llama.meta.com/purple-llama/) and [Code Shield](https://llama.meta.com/purple-llama/) safeguards. These tools have proven to drastically reduce residual risks of LLM Systems, while maintaining a high level of helpfulness. We encourage developers to tune and deploy these safeguards according to their needs and we provide a [reference implementation](https://github.com/meta-llama/llama-recipes/tree/main/recipes/responsible_ai) to get you started.
#### Responsible release
In addition to responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision.
#### Misuse
If you access or use Llama 3, you agree to the Acceptable Use Policy. The most recent copy of this policy can be found at [https://llama.meta.com/llama3/use-policy/](https://llama.meta.com/llama3/use-policy/).
## Ethical Considerations and Limitations
The core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 3 addresses users and their needs as they are, without inserting unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.
But Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. As outlined in the Responsible Use Guide, we recommend incorporating [Purple Llama](https://github.com/facebookresearch/PurpleLlama) solutions into your workflows and specifically [Llama Guard](https://ai.meta.com/research/publications/llama-guard-llm-based-input-output-safeguard-for-human-ai-conversations/) which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety.
Please see the Responsible Use Guide available at [http://llama.meta.com/responsible-use-guide](http://llama.meta.com/responsible-use-guide)
## Citation instructions
**KONI-Llama3-8B-20240630**
```
@article{KISTI-KONI/KONI-Llama3-8B-20240630,
title={KISTI-KONI/KONI-Llama3-8B-20240630},
author={Yang, Donghun},
year={2024},
url={https://huggingface.co/KISTI-KONI/KONI-Llama3-8B-20240630/}
}
```
**Original Llama-3**
```
@article{llama3modelcard,
title={Llama 3 Model Card},
author={AI@Meta},
year={2024},
url = {https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md}
}
```
|
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold0 | MayBashendy | 2024-11-14T05:56:29Z | 160 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T05:26:03Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV4-trial2_k30_task1_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8900
- Qwk: 0.7623
- Mse: 0.8900
- Rmse: 0.9434
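These metrics can be reproduced from gold and predicted scores with standard scikit-learn calls; the following is a minimal sketch in which the label arrays are hypothetical placeholders.

```python
# Sketch of the reported metrics: Qwk (quadratic weighted kappa), Mse, Rmse.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 2, 3]  # hypothetical gold scores
y_pred = [0, 1, 2, 3, 3]  # hypothetical model scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")
```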
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
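For reference, these settings map directly onto `transformers.TrainingArguments`; the sketch below shows that mapping only (model and dataset wiring omitted), and the `output_dir` name is a hypothetical placeholder.

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
# Adam betas=(0.9, 0.999) and epsilon=1e-8 are the optimizer defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization_fold0",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```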
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0034 | 2 | 4.8305 | 0.0 | 4.8305 | 2.1978 |
| No log | 0.0069 | 4 | 2.7863 | 0.0904 | 2.7863 | 1.6692 |
| No log | 0.0103 | 6 | 1.8260 | 0.1771 | 1.8260 | 1.3513 |
| No log | 0.0138 | 8 | 1.8850 | 0.0056 | 1.8850 | 1.3729 |
| No log | 0.0172 | 10 | 1.4853 | 0.1914 | 1.4853 | 1.2187 |
| No log | 0.0207 | 12 | 1.6106 | 0.1370 | 1.6106 | 1.2691 |
| No log | 0.0241 | 14 | 1.8504 | 0.1873 | 1.8504 | 1.3603 |
| No log | 0.0275 | 16 | 2.5133 | 0.2857 | 2.5133 | 1.5853 |
| No log | 0.0310 | 18 | 2.3163 | 0.0550 | 2.3163 | 1.5219 |
| No log | 0.0344 | 20 | 1.7768 | 0.2203 | 1.7768 | 1.3330 |
| No log | 0.0379 | 22 | 1.8171 | 0.0824 | 1.8171 | 1.3480 |
| No log | 0.0413 | 24 | 1.8587 | 0.1873 | 1.8587 | 1.3633 |
| No log | 0.0448 | 26 | 1.9310 | 0.1873 | 1.9310 | 1.3896 |
| No log | 0.0482 | 28 | 1.8837 | 0.1873 | 1.8837 | 1.3725 |
| No log | 0.0516 | 30 | 1.7022 | 0.1683 | 1.7022 | 1.3047 |
| No log | 0.0551 | 32 | 1.6107 | -0.1107 | 1.6107 | 1.2691 |
| No log | 0.0585 | 34 | 1.6435 | 0.1933 | 1.6435 | 1.2820 |
| No log | 0.0620 | 36 | 1.5560 | 0.2203 | 1.5560 | 1.2474 |
| No log | 0.0654 | 38 | 1.4383 | -0.0024 | 1.4383 | 1.1993 |
| No log | 0.0688 | 40 | 1.5565 | 0.2957 | 1.5565 | 1.2476 |
| No log | 0.0723 | 42 | 1.8605 | 0.1348 | 1.8605 | 1.3640 |
| No log | 0.0757 | 44 | 1.9471 | 0.1075 | 1.9471 | 1.3954 |
| No log | 0.0792 | 46 | 1.8546 | 0.1075 | 1.8546 | 1.3618 |
| No log | 0.0826 | 48 | 1.7847 | 0.1348 | 1.7847 | 1.3359 |
| No log | 0.0861 | 50 | 1.8299 | 0.1348 | 1.8299 | 1.3528 |
| No log | 0.0895 | 52 | 2.0380 | 0.1075 | 2.0380 | 1.4276 |
| No log | 0.0929 | 54 | 2.1809 | 0.0801 | 2.1809 | 1.4768 |
| No log | 0.0964 | 56 | 2.2870 | 0.2134 | 2.2870 | 1.5123 |
| No log | 0.0998 | 58 | 2.1912 | 0.1760 | 2.1912 | 1.4803 |
| No log | 0.1033 | 60 | 1.9581 | 0.0550 | 1.9581 | 1.3993 |
| No log | 0.1067 | 62 | 1.6548 | 0.2184 | 1.6548 | 1.2864 |
| No log | 0.1102 | 64 | 1.6765 | 0.2164 | 1.6765 | 1.2948 |
| No log | 0.1136 | 66 | 1.9255 | 0.1348 | 1.9255 | 1.3876 |
| No log | 0.1170 | 68 | 2.1544 | 0.1075 | 2.1544 | 1.4678 |
| No log | 0.1205 | 70 | 2.1714 | 0.0275 | 2.1714 | 1.4736 |
| No log | 0.1239 | 72 | 2.2624 | 0.0 | 2.2624 | 1.5041 |
| No log | 0.1274 | 74 | 2.1781 | 0.0275 | 2.1781 | 1.4759 |
| No log | 0.1308 | 76 | 1.8531 | 0.2776 | 1.8531 | 1.3613 |
| No log | 0.1343 | 78 | 1.3909 | 0.3816 | 1.3909 | 1.1794 |
| No log | 0.1377 | 80 | 1.2867 | 0.5263 | 1.2867 | 1.1343 |
| No log | 0.1411 | 82 | 1.5438 | 0.5474 | 1.5438 | 1.2425 |
| No log | 0.1446 | 84 | 2.0304 | 0.4324 | 2.0304 | 1.4249 |
| No log | 0.1480 | 86 | 2.1251 | 0.3182 | 2.1251 | 1.4578 |
| No log | 0.1515 | 88 | 2.1227 | 0.3182 | 2.1227 | 1.4569 |
| No log | 0.1549 | 90 | 2.1296 | 0.4124 | 2.1296 | 1.4593 |
| No log | 0.1583 | 92 | 2.1892 | 0.4389 | 2.1892 | 1.4796 |
| No log | 0.1618 | 94 | 1.7526 | 0.3913 | 1.7526 | 1.3239 |
| No log | 0.1652 | 96 | 1.6532 | 0.4324 | 1.6532 | 1.2858 |
| No log | 0.1687 | 98 | 1.7600 | 0.3913 | 1.7600 | 1.3266 |
| No log | 0.1721 | 100 | 2.2715 | 0.4369 | 2.2715 | 1.5072 |
| No log | 0.1756 | 102 | 2.6376 | 0.3862 | 2.6376 | 1.6241 |
| No log | 0.1790 | 104 | 2.4739 | 0.3862 | 2.4739 | 1.5729 |
| No log | 0.1824 | 106 | 2.1263 | 0.4369 | 2.1263 | 1.4582 |
| No log | 0.1859 | 108 | 2.0064 | 0.4369 | 2.0064 | 1.4165 |
| No log | 0.1893 | 110 | 2.1474 | 0.4155 | 2.1474 | 1.4654 |
| No log | 0.1928 | 112 | 2.4320 | 0.3862 | 2.4320 | 1.5595 |
| No log | 0.1962 | 114 | 2.3390 | 0.4210 | 2.3390 | 1.5294 |
| No log | 0.1997 | 116 | 2.5928 | 0.3298 | 2.5928 | 1.6102 |
| No log | 0.2031 | 118 | 2.3962 | 0.4053 | 2.3962 | 1.5480 |
| No log | 0.2065 | 120 | 1.9023 | 0.4583 | 1.9023 | 1.3792 |
| No log | 0.2100 | 122 | 1.7088 | 0.4797 | 1.7088 | 1.3072 |
| No log | 0.2134 | 124 | 1.9231 | 0.4583 | 1.9231 | 1.3867 |
| No log | 0.2169 | 126 | 2.3264 | 0.4053 | 2.3264 | 1.5252 |
| No log | 0.2203 | 128 | 2.3578 | 0.4053 | 2.3579 | 1.5355 |
| No log | 0.2238 | 130 | 2.0806 | 0.4389 | 2.0806 | 1.4424 |
| No log | 0.2272 | 132 | 1.5777 | 0.4569 | 1.5777 | 1.2561 |
| No log | 0.2306 | 134 | 1.3689 | 0.5512 | 1.3689 | 1.1700 |
| No log | 0.2341 | 136 | 1.5929 | 0.4571 | 1.5929 | 1.2621 |
| No log | 0.2375 | 138 | 2.1747 | 0.4053 | 2.1747 | 1.4747 |
| No log | 0.2410 | 140 | 2.3221 | 0.4408 | 2.3221 | 1.5238 |
| No log | 0.2444 | 142 | 1.9562 | 0.3926 | 1.9562 | 1.3986 |
| No log | 0.2478 | 144 | 1.8332 | 0.5200 | 1.8332 | 1.3539 |
| No log | 0.2513 | 146 | 1.9827 | 0.5600 | 1.9827 | 1.4081 |
| No log | 0.2547 | 148 | 2.1995 | 0.4793 | 2.1995 | 1.4831 |
| No log | 0.2582 | 150 | 2.1318 | 0.4793 | 2.1318 | 1.4601 |
| No log | 0.2616 | 152 | 1.8439 | 0.5600 | 1.8439 | 1.3579 |
| No log | 0.2651 | 154 | 1.8762 | 0.5188 | 1.8762 | 1.3697 |
| No log | 0.2685 | 156 | 2.1017 | 0.4425 | 2.1017 | 1.4497 |
| No log | 0.2719 | 158 | 2.0555 | 0.5188 | 2.0555 | 1.4337 |
| No log | 0.2754 | 160 | 1.6255 | 0.5200 | 1.6255 | 1.2749 |
| No log | 0.2788 | 162 | 1.5148 | 0.4797 | 1.5148 | 1.2308 |
| No log | 0.2823 | 164 | 1.9241 | 0.5200 | 1.9241 | 1.3871 |
| No log | 0.2857 | 166 | 2.4919 | 0.4455 | 2.4919 | 1.5786 |
| No log | 0.2892 | 168 | 2.5618 | 0.4455 | 2.5618 | 1.6006 |
| No log | 0.2926 | 170 | 2.2343 | 0.4408 | 2.2343 | 1.4948 |
| No log | 0.2960 | 172 | 2.0308 | 0.4995 | 2.0308 | 1.4251 |
| No log | 0.2995 | 174 | 1.7238 | 0.4793 | 1.7238 | 1.3129 |
| No log | 0.3029 | 176 | 1.5580 | 0.4793 | 1.5580 | 1.2482 |
| No log | 0.3064 | 178 | 1.4517 | 0.5015 | 1.4517 | 1.2049 |
| No log | 0.3098 | 180 | 1.6571 | 0.5404 | 1.6571 | 1.2873 |
| No log | 0.3133 | 182 | 2.0414 | 0.5177 | 2.0414 | 1.4288 |
| No log | 0.3167 | 184 | 2.1261 | 0.5366 | 2.1261 | 1.4581 |
| No log | 0.3201 | 186 | 1.7786 | 0.5366 | 1.7786 | 1.3336 |
| No log | 0.3236 | 188 | 1.3196 | 0.6019 | 1.3196 | 1.1487 |
| No log | 0.3270 | 190 | 1.1293 | 0.6159 | 1.1293 | 1.0627 |
| No log | 0.3305 | 192 | 1.3299 | 0.5989 | 1.3299 | 1.1532 |
| No log | 0.3339 | 194 | 1.8972 | 0.5714 | 1.8972 | 1.3774 |
| No log | 0.3373 | 196 | 2.7291 | 0.4802 | 2.7291 | 1.6520 |
| No log | 0.3408 | 198 | 2.8790 | 0.5131 | 2.8790 | 1.6968 |
| No log | 0.3442 | 200 | 2.4136 | 0.4441 | 2.4136 | 1.5536 |
| No log | 0.3477 | 202 | 1.7848 | 0.5005 | 1.7848 | 1.3359 |
| No log | 0.3511 | 204 | 1.2337 | 0.5714 | 1.2337 | 1.1107 |
| No log | 0.3546 | 206 | 1.0502 | 0.4830 | 1.0502 | 1.0248 |
| No log | 0.3580 | 208 | 1.0612 | 0.5563 | 1.0612 | 1.0301 |
| No log | 0.3614 | 210 | 1.2593 | 0.5484 | 1.2593 | 1.1222 |
| No log | 0.3649 | 212 | 1.7642 | 0.4800 | 1.7642 | 1.3282 |
| No log | 0.3683 | 214 | 2.0890 | 0.4987 | 2.0890 | 1.4453 |
| No log | 0.3718 | 216 | 2.1782 | 0.4987 | 2.1782 | 1.4759 |
| No log | 0.3752 | 218 | 2.1435 | 0.4987 | 2.1435 | 1.4641 |
| No log | 0.3787 | 220 | 1.8231 | 0.5005 | 1.8231 | 1.3502 |
| No log | 0.3821 | 222 | 1.5742 | 0.5243 | 1.5742 | 1.2547 |
| No log | 0.3855 | 224 | 1.4052 | 0.5028 | 1.4052 | 1.1854 |
| No log | 0.3890 | 226 | 1.4274 | 0.5927 | 1.4274 | 1.1948 |
| No log | 0.3924 | 228 | 1.7210 | 0.5177 | 1.7210 | 1.3119 |
| No log | 0.3959 | 230 | 2.0197 | 0.4987 | 2.0197 | 1.4212 |
| No log | 0.3993 | 232 | 2.2112 | 0.4987 | 2.2112 | 1.4870 |
| No log | 0.4028 | 234 | 2.2925 | 0.4987 | 2.2925 | 1.5141 |
| No log | 0.4062 | 236 | 2.1379 | 0.4987 | 2.1379 | 1.4621 |
| No log | 0.4096 | 238 | 1.8316 | 0.4987 | 1.8316 | 1.3534 |
| No log | 0.4131 | 240 | 1.8717 | 0.4987 | 1.8717 | 1.3681 |
| No log | 0.4165 | 242 | 1.8422 | 0.4987 | 1.8422 | 1.3573 |
| No log | 0.4200 | 244 | 1.7087 | 0.5396 | 1.7087 | 1.3072 |
| No log | 0.4234 | 246 | 1.6097 | 0.5417 | 1.6097 | 1.2687 |
| No log | 0.4269 | 248 | 1.5155 | 0.6008 | 1.5155 | 1.2311 |
| No log | 0.4303 | 250 | 1.5745 | 0.5556 | 1.5745 | 1.2548 |
| No log | 0.4337 | 252 | 1.8230 | 0.5177 | 1.8230 | 1.3502 |
| No log | 0.4372 | 254 | 2.1782 | 0.4625 | 2.1782 | 1.4759 |
| No log | 0.4406 | 256 | 2.1840 | 0.4625 | 2.1840 | 1.4778 |
| No log | 0.4441 | 258 | 1.8394 | 0.5177 | 1.8394 | 1.3562 |
| No log | 0.4475 | 260 | 1.5373 | 0.5744 | 1.5373 | 1.2399 |
| No log | 0.4509 | 262 | 1.3964 | 0.5413 | 1.3964 | 1.1817 |
| No log | 0.4544 | 264 | 1.3541 | 0.5413 | 1.3541 | 1.1637 |
| No log | 0.4578 | 266 | 1.4887 | 0.5744 | 1.4887 | 1.2201 |
| No log | 0.4613 | 268 | 1.4549 | 0.5744 | 1.4549 | 1.2062 |
| No log | 0.4647 | 270 | 1.3486 | 0.6015 | 1.3486 | 1.1613 |
| No log | 0.4682 | 272 | 1.5208 | 0.5744 | 1.5208 | 1.2332 |
| No log | 0.4716 | 274 | 1.8136 | 0.5177 | 1.8136 | 1.3467 |
| No log | 0.4750 | 276 | 1.7827 | 0.5366 | 1.7827 | 1.3352 |
| No log | 0.4785 | 278 | 1.4348 | 0.5744 | 1.4348 | 1.1978 |
| No log | 0.4819 | 280 | 0.9610 | 0.7264 | 0.9610 | 0.9803 |
| No log | 0.4854 | 282 | 0.8211 | 0.7986 | 0.8211 | 0.9061 |
| No log | 0.4888 | 284 | 0.9220 | 0.7346 | 0.9220 | 0.9602 |
| No log | 0.4923 | 286 | 1.3163 | 0.6563 | 1.3163 | 1.1473 |
| No log | 0.4957 | 288 | 1.8363 | 0.5366 | 1.8363 | 1.3551 |
| No log | 0.4991 | 290 | 2.1655 | 0.4633 | 2.1655 | 1.4716 |
| No log | 0.5026 | 292 | 2.0979 | 0.4625 | 2.0979 | 1.4484 |
| No log | 0.5060 | 294 | 1.6957 | 0.5366 | 1.6957 | 1.3022 |
| No log | 0.5095 | 296 | 1.3180 | 0.6316 | 1.3180 | 1.1480 |
| No log | 0.5129 | 298 | 1.2499 | 0.6316 | 1.2499 | 1.1180 |
| No log | 0.5164 | 300 | 1.3787 | 0.5759 | 1.3787 | 1.1742 |
| No log | 0.5198 | 302 | 1.5514 | 0.5385 | 1.5514 | 1.2456 |
| No log | 0.5232 | 304 | 1.6996 | 0.5188 | 1.6996 | 1.3037 |
| No log | 0.5267 | 306 | 1.8769 | 0.4987 | 1.8769 | 1.3700 |
| No log | 0.5301 | 308 | 1.8798 | 0.4987 | 1.8798 | 1.3711 |
| No log | 0.5336 | 310 | 1.6473 | 0.5005 | 1.6473 | 1.2835 |
| No log | 0.5370 | 312 | 1.4450 | 0.5010 | 1.4450 | 1.2021 |
| No log | 0.5404 | 314 | 1.4527 | 0.5010 | 1.4527 | 1.2053 |
| No log | 0.5439 | 316 | 1.6597 | 0.5385 | 1.6597 | 1.2883 |
| No log | 0.5473 | 318 | 1.8429 | 0.5366 | 1.8429 | 1.3575 |
| No log | 0.5508 | 320 | 1.9091 | 0.5366 | 1.9091 | 1.3817 |
| No log | 0.5542 | 322 | 1.9451 | 0.5532 | 1.9451 | 1.3947 |
| No log | 0.5577 | 324 | 1.9885 | 0.5177 | 1.9885 | 1.4101 |
| No log | 0.5611 | 326 | 1.9623 | 0.5177 | 1.9623 | 1.4008 |
| No log | 0.5645 | 328 | 1.7107 | 0.5366 | 1.7107 | 1.3080 |
| No log | 0.5680 | 330 | 1.5284 | 0.5010 | 1.5284 | 1.2363 |
| No log | 0.5714 | 332 | 1.4452 | 0.5413 | 1.4452 | 1.2022 |
| No log | 0.5749 | 334 | 1.4541 | 0.5393 | 1.4541 | 1.2059 |
| No log | 0.5783 | 336 | 1.5400 | 0.5744 | 1.5400 | 1.2410 |
| No log | 0.5818 | 338 | 1.6139 | 0.5714 | 1.6139 | 1.2704 |
| No log | 0.5852 | 340 | 1.5345 | 0.6078 | 1.5345 | 1.2388 |
| No log | 0.5886 | 342 | 1.5429 | 0.6078 | 1.5429 | 1.2421 |
| No log | 0.5921 | 344 | 1.3530 | 0.5752 | 1.3530 | 1.1632 |
| No log | 0.5955 | 346 | 1.3255 | 0.6363 | 1.3255 | 1.1513 |
| No log | 0.5990 | 348 | 1.4716 | 0.6078 | 1.4716 | 1.2131 |
| No log | 0.6024 | 350 | 1.6141 | 0.5744 | 1.6141 | 1.2705 |
| No log | 0.6059 | 352 | 1.5318 | 0.5744 | 1.5318 | 1.2377 |
| No log | 0.6093 | 354 | 1.6035 | 0.5714 | 1.6035 | 1.2663 |
| No log | 0.6127 | 356 | 1.6937 | 0.5714 | 1.6937 | 1.3014 |
| No log | 0.6162 | 358 | 1.9963 | 0.5177 | 1.9963 | 1.4129 |
| No log | 0.6196 | 360 | 1.8257 | 0.5177 | 1.8257 | 1.3512 |
| No log | 0.6231 | 362 | 1.6314 | 0.5366 | 1.6314 | 1.2772 |
| No log | 0.6265 | 364 | 1.5620 | 0.5385 | 1.5620 | 1.2498 |
| No log | 0.6299 | 366 | 1.6824 | 0.5366 | 1.6824 | 1.2971 |
| No log | 0.6334 | 368 | 1.6785 | 0.5366 | 1.6785 | 1.2956 |
| No log | 0.6368 | 370 | 1.5427 | 0.5366 | 1.5427 | 1.2420 |
| No log | 0.6403 | 372 | 1.6581 | 0.5366 | 1.6581 | 1.2877 |
| No log | 0.6437 | 374 | 1.7151 | 0.5366 | 1.7151 | 1.3096 |
| No log | 0.6472 | 376 | 1.7913 | 0.5366 | 1.7913 | 1.3384 |
| No log | 0.6506 | 378 | 2.0665 | 0.5366 | 2.0665 | 1.4375 |
| No log | 0.6540 | 380 | 2.0055 | 0.5177 | 2.0055 | 1.4162 |
| No log | 0.6575 | 382 | 1.7036 | 0.5385 | 1.7036 | 1.3052 |
| No log | 0.6609 | 384 | 1.3827 | 0.5014 | 1.3827 | 1.1759 |
| No log | 0.6644 | 386 | 1.3131 | 0.5647 | 1.3131 | 1.1459 |
| No log | 0.6678 | 388 | 1.4777 | 0.5385 | 1.4777 | 1.2156 |
| No log | 0.6713 | 390 | 1.6017 | 0.5385 | 1.6017 | 1.2656 |
| No log | 0.6747 | 392 | 1.5838 | 0.5744 | 1.5838 | 1.2585 |
| No log | 0.6781 | 394 | 1.4319 | 0.6303 | 1.4319 | 1.1966 |
| No log | 0.6816 | 396 | 1.1440 | 0.7014 | 1.1440 | 1.0696 |
| No log | 0.6850 | 398 | 1.0443 | 0.7014 | 1.0443 | 1.0219 |
| No log | 0.6885 | 400 | 0.9933 | 0.7430 | 0.9933 | 0.9966 |
| No log | 0.6919 | 402 | 1.2084 | 0.7153 | 1.2084 | 1.0993 |
| No log | 0.6954 | 404 | 1.5189 | 0.5714 | 1.5189 | 1.2324 |
| No log | 0.6988 | 406 | 1.6119 | 0.5714 | 1.6119 | 1.2696 |
| No log | 0.7022 | 408 | 1.3948 | 0.6303 | 1.3948 | 1.1810 |
| No log | 0.7057 | 410 | 1.3786 | 0.6303 | 1.3786 | 1.1741 |
| No log | 0.7091 | 412 | 1.4627 | 0.5744 | 1.4627 | 1.2094 |
| No log | 0.7126 | 414 | 1.4473 | 0.5744 | 1.4473 | 1.2030 |
| No log | 0.7160 | 416 | 1.4082 | 0.5744 | 1.4082 | 1.1867 |
| No log | 0.7194 | 418 | 1.1550 | 0.6309 | 1.1550 | 1.0747 |
| No log | 0.7229 | 420 | 1.0602 | 0.6748 | 1.0602 | 1.0297 |
| No log | 0.7263 | 422 | 1.1613 | 0.6309 | 1.1613 | 1.0776 |
| No log | 0.7298 | 424 | 1.4838 | 0.5714 | 1.4838 | 1.2181 |
| No log | 0.7332 | 426 | 1.5940 | 0.5366 | 1.5940 | 1.2625 |
| No log | 0.7367 | 428 | 1.6250 | 0.5366 | 1.6250 | 1.2748 |
| No log | 0.7401 | 430 | 1.3515 | 0.6008 | 1.3515 | 1.1625 |
| No log | 0.7435 | 432 | 1.1082 | 0.6839 | 1.1082 | 1.0527 |
| No log | 0.7470 | 434 | 1.0788 | 0.7519 | 1.0788 | 1.0387 |
| No log | 0.7504 | 436 | 1.2892 | 0.5015 | 1.2892 | 1.1354 |
| No log | 0.7539 | 438 | 1.6282 | 0.5385 | 1.6282 | 1.2760 |
| No log | 0.7573 | 440 | 1.6542 | 0.5385 | 1.6542 | 1.2862 |
| No log | 0.7608 | 442 | 1.4986 | 0.5404 | 1.4986 | 1.2242 |
| No log | 0.7642 | 444 | 1.5165 | 0.5404 | 1.5165 | 1.2315 |
| No log | 0.7676 | 446 | 1.5142 | 0.5404 | 1.5142 | 1.2305 |
| No log | 0.7711 | 448 | 1.4717 | 0.5404 | 1.4717 | 1.2131 |
| No log | 0.7745 | 450 | 1.5979 | 0.5404 | 1.5979 | 1.2641 |
| No log | 0.7780 | 452 | 1.5406 | 0.5404 | 1.5406 | 1.2412 |
| No log | 0.7814 | 454 | 1.4511 | 0.5404 | 1.4511 | 1.2046 |
| No log | 0.7849 | 456 | 1.5922 | 0.5744 | 1.5922 | 1.2618 |
| No log | 0.7883 | 458 | 1.9082 | 0.5660 | 1.9082 | 1.3814 |
| No log | 0.7917 | 460 | 1.7903 | 0.5965 | 1.7903 | 1.3380 |
| No log | 0.7952 | 462 | 1.2973 | 0.6789 | 1.2973 | 1.1390 |
| No log | 0.7986 | 464 | 0.9153 | 0.7759 | 0.9153 | 0.9567 |
| No log | 0.8021 | 466 | 0.9025 | 0.7518 | 0.9025 | 0.9500 |
| No log | 0.8055 | 468 | 1.1697 | 0.6952 | 1.1697 | 1.0815 |
| No log | 0.8090 | 470 | 1.6866 | 0.6333 | 1.6866 | 1.2987 |
| No log | 0.8124 | 472 | 2.0589 | 0.6282 | 2.0589 | 1.4349 |
| No log | 0.8158 | 474 | 1.8545 | 0.5855 | 1.8545 | 1.3618 |
| No log | 0.8193 | 476 | 1.4517 | 0.5777 | 1.4517 | 1.2049 |
| No log | 0.8227 | 478 | 1.3424 | 0.5777 | 1.3424 | 1.1586 |
| No log | 0.8262 | 480 | 1.4415 | 0.5404 | 1.4415 | 1.2006 |
| No log | 0.8296 | 482 | 1.6305 | 0.5188 | 1.6305 | 1.2769 |
| No log | 0.8330 | 484 | 1.7996 | 0.5177 | 1.7996 | 1.3415 |
| No log | 0.8365 | 486 | 1.8121 | 0.5524 | 1.8121 | 1.3461 |
| No log | 0.8399 | 488 | 1.5119 | 0.5777 | 1.5119 | 1.2296 |
| No log | 0.8434 | 490 | 1.1411 | 0.6503 | 1.1411 | 1.0682 |
| No log | 0.8468 | 492 | 1.0957 | 0.6503 | 1.0957 | 1.0468 |
| No log | 0.8503 | 494 | 1.2809 | 0.6309 | 1.2809 | 1.1318 |
| No log | 0.8537 | 496 | 1.4961 | 0.5777 | 1.4961 | 1.2232 |
| No log | 0.8571 | 498 | 1.6798 | 0.5385 | 1.6798 | 1.2961 |
| 0.5678 | 0.8606 | 500 | 1.5848 | 0.5010 | 1.5848 | 1.2589 |
| 0.5678 | 0.8640 | 502 | 1.3136 | 0.5474 | 1.3136 | 1.1461 |
| 0.5678 | 0.8675 | 504 | 0.9939 | 0.6782 | 0.9939 | 0.9970 |
| 0.5678 | 0.8709 | 506 | 0.8923 | 0.5933 | 0.8923 | 0.9446 |
| 0.5678 | 0.8744 | 508 | 0.9180 | 0.6143 | 0.9180 | 0.9581 |
| 0.5678 | 0.8778 | 510 | 1.0524 | 0.8019 | 1.0524 | 1.0259 |
| 0.5678 | 0.8812 | 512 | 1.3948 | 0.5010 | 1.3948 | 1.1810 |
| 0.5678 | 0.8847 | 514 | 1.5848 | 0.5200 | 1.5848 | 1.2589 |
| 0.5678 | 0.8881 | 516 | 1.4503 | 0.5404 | 1.4503 | 1.2043 |
| 0.5678 | 0.8916 | 518 | 1.3492 | 0.5010 | 1.3492 | 1.1615 |
| 0.5678 | 0.8950 | 520 | 1.2829 | 0.5449 | 1.2829 | 1.1327 |
| 0.5678 | 0.8985 | 522 | 1.2481 | 0.6157 | 1.2481 | 1.1172 |
| 0.5678 | 0.9019 | 524 | 1.3555 | 0.5010 | 1.3555 | 1.1642 |
| 0.5678 | 0.9053 | 526 | 1.4094 | 0.5010 | 1.4094 | 1.1872 |
| 0.5678 | 0.9088 | 528 | 1.4490 | 0.5449 | 1.4490 | 1.2037 |
| 0.5678 | 0.9122 | 530 | 1.4568 | 0.5449 | 1.4568 | 1.2070 |
| 0.5678 | 0.9157 | 532 | 1.4005 | 0.5270 | 1.4005 | 1.1834 |
| 0.5678 | 0.9191 | 534 | 1.3423 | 0.5984 | 1.3423 | 1.1586 |
| 0.5678 | 0.9225 | 536 | 1.4188 | 0.6265 | 1.4188 | 1.1911 |
| 0.5678 | 0.9260 | 538 | 1.4680 | 0.5638 | 1.4680 | 1.2116 |
| 0.5678 | 0.9294 | 540 | 1.4116 | 0.6265 | 1.4116 | 1.1881 |
| 0.5678 | 0.9329 | 542 | 1.3376 | 0.6265 | 1.3376 | 1.1566 |
| 0.5678 | 0.9363 | 544 | 1.3908 | 0.6265 | 1.3908 | 1.1793 |
| 0.5678 | 0.9398 | 546 | 1.5036 | 0.6008 | 1.5036 | 1.2262 |
| 0.5678 | 0.9432 | 548 | 1.6787 | 0.5385 | 1.6787 | 1.2956 |
| 0.5678 | 0.9466 | 550 | 1.6905 | 0.5188 | 1.6905 | 1.3002 |
| 0.5678 | 0.9501 | 552 | 1.5234 | 0.6008 | 1.5234 | 1.2343 |
| 0.5678 | 0.9535 | 554 | 1.3097 | 0.5484 | 1.3097 | 1.1444 |
| 0.5678 | 0.9570 | 556 | 1.2896 | 0.5484 | 1.2896 | 1.1356 |
| 0.5678 | 0.9604 | 558 | 1.2286 | 0.5484 | 1.2286 | 1.1084 |
| 0.5678 | 0.9639 | 560 | 1.3177 | 0.6610 | 1.3177 | 1.1479 |
| 0.5678 | 0.9673 | 562 | 1.5106 | 0.5385 | 1.5106 | 1.2291 |
| 0.5678 | 0.9707 | 564 | 1.4932 | 0.5385 | 1.4932 | 1.2220 |
| 0.5678 | 0.9742 | 566 | 1.2637 | 0.5973 | 1.2637 | 1.1242 |
| 0.5678 | 0.9776 | 568 | 1.1107 | 0.7106 | 1.1107 | 1.0539 |
| 0.5678 | 0.9811 | 570 | 1.0780 | 0.7623 | 1.0780 | 1.0383 |
| 0.5678 | 0.9845 | 572 | 1.1500 | 0.7337 | 1.1500 | 1.0724 |
| 0.5678 | 0.9880 | 574 | 1.2234 | 0.6260 | 1.2234 | 1.1061 |
| 0.5678 | 0.9914 | 576 | 1.1703 | 0.7333 | 1.1703 | 1.0818 |
| 0.5678 | 0.9948 | 578 | 1.1600 | 0.6545 | 1.1600 | 1.0770 |
| 0.5678 | 0.9983 | 580 | 1.1581 | 0.6008 | 1.1581 | 1.0762 |
| 0.5678 | 1.0017 | 582 | 1.1183 | 0.7412 | 1.1183 | 1.0575 |
| 0.5678 | 1.0052 | 584 | 0.9902 | 0.7747 | 0.9902 | 0.9951 |
| 0.5678 | 1.0086 | 586 | 1.0064 | 0.7747 | 1.0064 | 1.0032 |
| 0.5678 | 1.0120 | 588 | 1.1425 | 0.6476 | 1.1425 | 1.0689 |
| 0.5678 | 1.0155 | 590 | 1.3708 | 0.6610 | 1.3708 | 1.1708 |
| 0.5678 | 1.0189 | 592 | 1.5562 | 0.6008 | 1.5562 | 1.2475 |
| 0.5678 | 1.0224 | 594 | 1.7245 | 0.5769 | 1.7245 | 1.3132 |
| 0.5678 | 1.0258 | 596 | 1.6786 | 0.5769 | 1.6786 | 1.2956 |
| 0.5678 | 1.0293 | 598 | 1.4626 | 0.6265 | 1.4626 | 1.2094 |
| 0.5678 | 1.0327 | 600 | 1.5173 | 0.6008 | 1.5173 | 1.2318 |
| 0.5678 | 1.0361 | 602 | 1.5667 | 0.5769 | 1.5667 | 1.2517 |
| 0.5678 | 1.0396 | 604 | 1.3476 | 0.6610 | 1.3476 | 1.1609 |
| 0.5678 | 1.0430 | 606 | 1.2448 | 0.7412 | 1.2448 | 1.1157 |
| 0.5678 | 1.0465 | 608 | 1.1791 | 0.7101 | 1.1791 | 1.0859 |
| 0.5678 | 1.0499 | 610 | 1.1270 | 0.7101 | 1.1270 | 1.0616 |
| 0.5678 | 1.0534 | 612 | 0.9948 | 0.7327 | 0.9948 | 0.9974 |
| 0.5678 | 1.0568 | 614 | 1.1073 | 0.6818 | 1.1073 | 1.0523 |
| 0.5678 | 1.0602 | 616 | 1.2927 | 0.6382 | 1.2927 | 1.1370 |
| 0.5678 | 1.0637 | 618 | 1.4080 | 0.6382 | 1.4080 | 1.1866 |
| 0.5678 | 1.0671 | 620 | 1.3288 | 0.6303 | 1.3288 | 1.1527 |
| 0.5678 | 1.0706 | 622 | 1.2808 | 0.6303 | 1.2808 | 1.1317 |
| 0.5678 | 1.0740 | 624 | 1.0119 | 0.7316 | 1.0119 | 1.0060 |
| 0.5678 | 1.0775 | 626 | 0.9250 | 0.7525 | 0.9250 | 0.9618 |
| 0.5678 | 1.0809 | 628 | 1.0452 | 0.7194 | 1.0452 | 1.0223 |
| 0.5678 | 1.0843 | 630 | 1.1378 | 0.6543 | 1.1378 | 1.0667 |
| 0.5678 | 1.0878 | 632 | 1.2086 | 0.6476 | 1.2086 | 1.0994 |
| 0.5678 | 1.0912 | 634 | 1.1223 | 0.6476 | 1.1223 | 1.0594 |
| 0.5678 | 1.0947 | 636 | 1.1800 | 0.6476 | 1.1800 | 1.0863 |
| 0.5678 | 1.0981 | 638 | 1.2117 | 0.6543 | 1.2117 | 1.1008 |
| 0.5678 | 1.1015 | 640 | 1.3024 | 0.5638 | 1.3024 | 1.1412 |
| 0.5678 | 1.1050 | 642 | 1.3790 | 0.5608 | 1.3790 | 1.1743 |
| 0.5678 | 1.1084 | 644 | 1.2933 | 0.5670 | 1.2933 | 1.1372 |
| 0.5678 | 1.1119 | 646 | 1.2590 | 0.5638 | 1.2590 | 1.1221 |
| 0.5678 | 1.1153 | 648 | 1.3564 | 0.6008 | 1.3564 | 1.1646 |
| 0.5678 | 1.1188 | 650 | 1.6038 | 0.5385 | 1.6038 | 1.2664 |
| 0.5678 | 1.1222 | 652 | 1.8121 | 0.5707 | 1.8121 | 1.3461 |
| 0.5678 | 1.1256 | 654 | 1.7454 | 0.5366 | 1.7454 | 1.3211 |
| 0.5678 | 1.1291 | 656 | 1.5413 | 0.5744 | 1.5413 | 1.2415 |
| 0.5678 | 1.1325 | 658 | 1.4064 | 0.5777 | 1.4064 | 1.1859 |
| 0.5678 | 1.1360 | 660 | 1.2426 | 0.7014 | 1.2426 | 1.1147 |
| 0.5678 | 1.1394 | 662 | 1.2244 | 0.7614 | 1.2244 | 1.1065 |
| 0.5678 | 1.1429 | 664 | 1.4001 | 0.5777 | 1.4001 | 1.1833 |
| 0.5678 | 1.1463 | 666 | 1.5929 | 0.5385 | 1.5929 | 1.2621 |
| 0.5678 | 1.1497 | 668 | 1.6312 | 0.5188 | 1.6312 | 1.2772 |
| 0.5678 | 1.1532 | 670 | 1.6995 | 0.5188 | 1.6995 | 1.3036 |
| 0.5678 | 1.1566 | 672 | 1.5325 | 0.5404 | 1.5325 | 1.2379 |
| 0.5678 | 1.1601 | 674 | 1.4436 | 0.5426 | 1.4436 | 1.2015 |
| 0.5678 | 1.1635 | 676 | 1.3988 | 0.5404 | 1.3988 | 1.1827 |
| 0.5678 | 1.1670 | 678 | 1.4044 | 0.5404 | 1.4044 | 1.1851 |
| 0.5678 | 1.1704 | 680 | 1.5495 | 0.5404 | 1.5495 | 1.2448 |
| 0.5678 | 1.1738 | 682 | 1.8509 | 0.5987 | 1.8509 | 1.3605 |
| 0.5678 | 1.1773 | 684 | 1.8711 | 0.6079 | 1.8711 | 1.3679 |
| 0.5678 | 1.1807 | 686 | 1.6654 | 0.5188 | 1.6654 | 1.2905 |
| 0.5678 | 1.1842 | 688 | 1.4373 | 0.5404 | 1.4373 | 1.1989 |
| 0.5678 | 1.1876 | 690 | 1.2207 | 0.7129 | 1.2207 | 1.1049 |
| 0.5678 | 1.1910 | 692 | 1.2374 | 0.7211 | 1.2374 | 1.1124 |
| 0.5678 | 1.1945 | 694 | 1.2399 | 0.6551 | 1.2399 | 1.1135 |
| 0.5678 | 1.1979 | 696 | 1.4558 | 0.5404 | 1.4558 | 1.2066 |
| 0.5678 | 1.2014 | 698 | 1.5478 | 0.5385 | 1.5478 | 1.2441 |
| 0.5678 | 1.2048 | 700 | 1.4457 | 0.6008 | 1.4457 | 1.2024 |
| 0.5678 | 1.2083 | 702 | 1.3237 | 0.6610 | 1.3237 | 1.1505 |
| 0.5678 | 1.2117 | 704 | 1.3571 | 0.6610 | 1.3571 | 1.1649 |
| 0.5678 | 1.2151 | 706 | 1.4851 | 0.5404 | 1.4851 | 1.2187 |
| 0.5678 | 1.2186 | 708 | 1.5482 | 0.5404 | 1.5482 | 1.2443 |
| 0.5678 | 1.2220 | 710 | 1.4316 | 0.6008 | 1.4316 | 1.1965 |
| 0.5678 | 1.2255 | 712 | 1.1722 | 0.7194 | 1.1722 | 1.0827 |
| 0.5678 | 1.2289 | 714 | 0.9562 | 0.6338 | 0.9562 | 0.9778 |
| 0.5678 | 1.2324 | 716 | 0.9272 | 0.6338 | 0.9272 | 0.9629 |
| 0.5678 | 1.2358 | 718 | 1.0262 | 0.7020 | 1.0262 | 1.0130 |
| 0.5678 | 1.2392 | 720 | 1.1376 | 0.6557 | 1.1376 | 1.0666 |
| 0.5678 | 1.2427 | 722 | 1.4215 | 0.6610 | 1.4215 | 1.1923 |
| 0.5678 | 1.2461 | 724 | 1.5591 | 0.5965 | 1.5591 | 1.2486 |
| 0.5678 | 1.2496 | 726 | 1.4562 | 0.6008 | 1.4562 | 1.2067 |
| 0.5678 | 1.2530 | 728 | 1.3059 | 0.6610 | 1.3059 | 1.1428 |
| 0.5678 | 1.2565 | 730 | 1.1291 | 0.6616 | 1.1291 | 1.0626 |
| 0.5678 | 1.2599 | 732 | 1.0219 | 0.6503 | 1.0219 | 1.0109 |
| 0.5678 | 1.2633 | 734 | 1.0864 | 0.6557 | 1.0864 | 1.0423 |
| 0.5678 | 1.2668 | 736 | 1.3850 | 0.6356 | 1.3850 | 1.1769 |
| 0.5678 | 1.2702 | 738 | 1.7051 | 0.5366 | 1.7051 | 1.3058 |
| 0.5678 | 1.2737 | 740 | 1.8250 | 0.5177 | 1.8250 | 1.3509 |
| 0.5678 | 1.2771 | 742 | 1.6063 | 0.5385 | 1.6063 | 1.2674 |
| 0.5678 | 1.2806 | 744 | 1.2098 | 0.5891 | 1.2098 | 1.0999 |
| 0.5678 | 1.2840 | 746 | 0.9315 | 0.7623 | 0.9315 | 0.9651 |
| 0.5678 | 1.2874 | 748 | 0.8094 | 0.7801 | 0.8094 | 0.8997 |
| 0.5678 | 1.2909 | 750 | 0.8492 | 0.7801 | 0.8492 | 0.9215 |
| 0.5678 | 1.2943 | 752 | 1.0661 | 0.7221 | 1.0661 | 1.0325 |
| 0.5678 | 1.2978 | 754 | 1.5559 | 0.5385 | 1.5559 | 1.2474 |
| 0.5678 | 1.3012 | 756 | 1.9349 | 0.5794 | 1.9349 | 1.3910 |
| 0.5678 | 1.3046 | 758 | 2.0094 | 0.6079 | 2.0094 | 1.4175 |
| 0.5678 | 1.3081 | 760 | 1.7771 | 0.5385 | 1.7771 | 1.3331 |
| 0.5678 | 1.3115 | 762 | 1.3824 | 0.5404 | 1.3824 | 1.1758 |
| 0.5678 | 1.3150 | 764 | 1.0527 | 0.7844 | 1.0527 | 1.0260 |
| 0.5678 | 1.3184 | 766 | 1.0049 | 0.7844 | 1.0049 | 1.0025 |
| 0.5678 | 1.3219 | 768 | 1.0285 | 0.7844 | 1.0285 | 1.0141 |
| 0.5678 | 1.3253 | 770 | 1.2290 | 0.6323 | 1.2290 | 1.1086 |
| 0.5678 | 1.3287 | 772 | 1.6393 | 0.5385 | 1.6393 | 1.2803 |
| 0.5678 | 1.3322 | 774 | 1.9247 | 0.5524 | 1.9247 | 1.3873 |
| 0.5678 | 1.3356 | 776 | 1.9081 | 0.5707 | 1.9081 | 1.3814 |
| 0.5678 | 1.3391 | 778 | 1.6583 | 0.5385 | 1.6583 | 1.2877 |
| 0.5678 | 1.3425 | 780 | 1.3961 | 0.5777 | 1.3961 | 1.1816 |
| 0.5678 | 1.3460 | 782 | 1.3732 | 0.5777 | 1.3732 | 1.1718 |
| 0.5678 | 1.3494 | 784 | 1.3521 | 0.5777 | 1.3521 | 1.1628 |
| 0.5678 | 1.3528 | 786 | 1.3188 | 0.5777 | 1.3188 | 1.1484 |
| 0.5678 | 1.3563 | 788 | 1.2587 | 0.5777 | 1.2587 | 1.1219 |
| 0.5678 | 1.3597 | 790 | 1.2007 | 0.6015 | 1.2007 | 1.0958 |
| 0.5678 | 1.3632 | 792 | 1.2186 | 0.5413 | 1.2186 | 1.1039 |
| 0.5678 | 1.3666 | 794 | 1.2841 | 0.5777 | 1.2841 | 1.1332 |
| 0.5678 | 1.3701 | 796 | 1.2342 | 0.5777 | 1.2342 | 1.1109 |
| 0.5678 | 1.3735 | 798 | 1.3249 | 0.5777 | 1.3249 | 1.1510 |
| 0.5678 | 1.3769 | 800 | 1.4835 | 0.5744 | 1.4835 | 1.2180 |
| 0.5678 | 1.3804 | 802 | 1.3937 | 0.5744 | 1.3937 | 1.1805 |
| 0.5678 | 1.3838 | 804 | 1.1939 | 0.6414 | 1.1939 | 1.0926 |
| 0.5678 | 1.3873 | 806 | 1.1498 | 0.6686 | 1.1498 | 1.0723 |
| 0.5678 | 1.3907 | 808 | 1.2337 | 0.5744 | 1.2337 | 1.1107 |
| 0.5678 | 1.3941 | 810 | 1.1369 | 0.7322 | 1.1369 | 1.0662 |
| 0.5678 | 1.3976 | 812 | 1.0712 | 0.7020 | 1.0712 | 1.0350 |
| 0.5678 | 1.4010 | 814 | 0.9016 | 0.7422 | 0.9016 | 0.9495 |
| 0.5678 | 1.4045 | 816 | 0.8752 | 0.8019 | 0.8752 | 0.9355 |
| 0.5678 | 1.4079 | 818 | 1.0057 | 0.6945 | 1.0057 | 1.0029 |
| 0.5678 | 1.4114 | 820 | 1.2099 | 0.6860 | 1.2099 | 1.1000 |
| 0.5678 | 1.4148 | 822 | 1.2716 | 0.6860 | 1.2716 | 1.1277 |
| 0.5678 | 1.4182 | 824 | 1.3231 | 0.6303 | 1.3231 | 1.1502 |
| 0.5678 | 1.4217 | 826 | 1.2385 | 0.6860 | 1.2385 | 1.1129 |
| 0.5678 | 1.4251 | 828 | 1.1168 | 0.6866 | 1.1168 | 1.0568 |
| 0.5678 | 1.4286 | 830 | 1.0030 | 0.7134 | 1.0030 | 1.0015 |
| 0.5678 | 1.4320 | 832 | 1.0771 | 0.6557 | 1.0771 | 1.0379 |
| 0.5678 | 1.4355 | 834 | 1.2001 | 0.6934 | 1.2001 | 1.0955 |
| 0.5678 | 1.4389 | 836 | 1.4163 | 0.5744 | 1.4163 | 1.1901 |
| 0.5678 | 1.4423 | 838 | 1.3981 | 0.6303 | 1.3981 | 1.1824 |
| 0.5678 | 1.4458 | 840 | 1.3748 | 0.6303 | 1.3748 | 1.1725 |
| 0.5678 | 1.4492 | 842 | 1.2684 | 0.6356 | 1.2684 | 1.1262 |
| 0.5678 | 1.4527 | 844 | 1.3486 | 0.5777 | 1.3486 | 1.1613 |
| 0.5678 | 1.4561 | 846 | 1.4492 | 0.5777 | 1.4492 | 1.2038 |
| 0.5678 | 1.4596 | 848 | 1.3295 | 0.5413 | 1.3295 | 1.1530 |
| 0.5678 | 1.4630 | 850 | 1.1450 | 0.6616 | 1.1450 | 1.0701 |
| 0.5678 | 1.4664 | 852 | 1.0192 | 0.7221 | 1.0192 | 1.0095 |
| 0.5678 | 1.4699 | 854 | 1.0478 | 0.7310 | 1.0478 | 1.0236 |
| 0.5678 | 1.4733 | 856 | 1.2476 | 0.6616 | 1.2476 | 1.1170 |
| 0.5678 | 1.4768 | 858 | 1.4341 | 0.5404 | 1.4341 | 1.1976 |
| 0.5678 | 1.4802 | 860 | 1.3867 | 0.5404 | 1.3867 | 1.1776 |
| 0.5678 | 1.4836 | 862 | 1.2411 | 0.6616 | 1.2411 | 1.1141 |
| 0.5678 | 1.4871 | 864 | 1.2487 | 0.6616 | 1.2487 | 1.1175 |
| 0.5678 | 1.4905 | 866 | 1.3686 | 0.5010 | 1.3686 | 1.1699 |
| 0.5678 | 1.4940 | 868 | 1.4362 | 0.5404 | 1.4362 | 1.1984 |
| 0.5678 | 1.4974 | 870 | 1.4674 | 0.5404 | 1.4674 | 1.2114 |
| 0.5678 | 1.5009 | 872 | 1.4267 | 0.5404 | 1.4267 | 1.1944 |
| 0.5678 | 1.5043 | 874 | 1.3880 | 0.5404 | 1.3880 | 1.1781 |
| 0.5678 | 1.5077 | 876 | 1.2813 | 0.6157 | 1.2813 | 1.1319 |
| 0.5678 | 1.5112 | 878 | 1.3173 | 0.5474 | 1.3173 | 1.1477 |
| 0.5678 | 1.5146 | 880 | 1.3566 | 0.5474 | 1.3566 | 1.1647 |
| 0.5678 | 1.5181 | 882 | 1.3230 | 0.5474 | 1.3230 | 1.1502 |
| 0.5678 | 1.5215 | 884 | 1.4246 | 0.5010 | 1.4246 | 1.1936 |
| 0.5678 | 1.5250 | 886 | 1.4694 | 0.5404 | 1.4694 | 1.2122 |
| 0.5678 | 1.5284 | 888 | 1.6340 | 0.5385 | 1.6340 | 1.2783 |
| 0.5678 | 1.5318 | 890 | 1.8399 | 0.5188 | 1.8399 | 1.3564 |
| 0.5678 | 1.5353 | 892 | 1.7581 | 0.5188 | 1.7581 | 1.3259 |
| 0.5678 | 1.5387 | 894 | 1.4516 | 0.5385 | 1.4516 | 1.2048 |
| 0.5678 | 1.5422 | 896 | 1.0617 | 0.7729 | 1.0617 | 1.0304 |
| 0.5678 | 1.5456 | 898 | 0.9188 | 0.7623 | 0.9188 | 0.9585 |
| 0.5678 | 1.5491 | 900 | 0.9352 | 0.7623 | 0.9352 | 0.9671 |
| 0.5678 | 1.5525 | 902 | 1.0997 | 0.7844 | 1.0997 | 1.0487 |
| 0.5678 | 1.5559 | 904 | 1.4047 | 0.5015 | 1.4047 | 1.1852 |
| 0.5678 | 1.5594 | 906 | 1.5917 | 0.5188 | 1.5917 | 1.2616 |
| 0.5678 | 1.5628 | 908 | 1.5735 | 0.5188 | 1.5735 | 1.2544 |
| 0.5678 | 1.5663 | 910 | 1.4393 | 0.5413 | 1.4393 | 1.1997 |
| 0.5678 | 1.5697 | 912 | 1.1973 | 0.6543 | 1.1973 | 1.0942 |
| 0.5678 | 1.5731 | 914 | 1.0305 | 0.6897 | 1.0305 | 1.0151 |
| 0.5678 | 1.5766 | 916 | 0.9568 | 0.8142 | 0.9568 | 0.9781 |
| 0.5678 | 1.5800 | 918 | 0.9561 | 0.7906 | 0.9561 | 0.9778 |
| 0.5678 | 1.5835 | 920 | 1.0315 | 0.7793 | 1.0315 | 1.0156 |
| 0.5678 | 1.5869 | 922 | 1.1252 | 0.7234 | 1.1252 | 1.0608 |
| 0.5678 | 1.5904 | 924 | 1.0680 | 0.7898 | 1.0680 | 1.0334 |
| 0.5678 | 1.5938 | 926 | 1.0232 | 0.7618 | 1.0232 | 1.0115 |
| 0.5678 | 1.5972 | 928 | 1.0620 | 0.7618 | 1.0620 | 1.0305 |
| 0.5678 | 1.6007 | 930 | 1.0579 | 0.7618 | 1.0579 | 1.0285 |
| 0.5678 | 1.6041 | 932 | 1.0952 | 0.7310 | 1.0952 | 1.0465 |
| 0.5678 | 1.6076 | 934 | 1.1846 | 0.6015 | 1.1846 | 1.0884 |
| 0.5678 | 1.6110 | 936 | 1.2261 | 0.6015 | 1.2261 | 1.1073 |
| 0.5678 | 1.6145 | 938 | 1.3094 | 0.5413 | 1.3094 | 1.1443 |
| 0.5678 | 1.6179 | 940 | 1.2975 | 0.5435 | 1.2975 | 1.1391 |
| 0.5678 | 1.6213 | 942 | 1.2471 | 0.5435 | 1.2471 | 1.1167 |
| 0.5678 | 1.6248 | 944 | 1.1867 | 0.7221 | 1.1867 | 1.0894 |
| 0.5678 | 1.6282 | 946 | 1.1318 | 0.7221 | 1.1318 | 1.0639 |
| 0.5678 | 1.6317 | 948 | 1.1496 | 0.7134 | 1.1496 | 1.0722 |
| 0.5678 | 1.6351 | 950 | 1.2559 | 0.7421 | 1.2559 | 1.1207 |
| 0.5678 | 1.6386 | 952 | 1.3263 | 0.7333 | 1.3263 | 1.1516 |
| 0.5678 | 1.6420 | 954 | 1.3475 | 0.6797 | 1.3475 | 1.1608 |
| 0.5678 | 1.6454 | 956 | 1.1580 | 0.7134 | 1.1580 | 1.0761 |
| 0.5678 | 1.6489 | 958 | 1.0167 | 0.7221 | 1.0167 | 1.0083 |
| 0.5678 | 1.6523 | 960 | 1.0469 | 0.7221 | 1.0469 | 1.0232 |
| 0.5678 | 1.6558 | 962 | 1.1840 | 0.6975 | 1.1840 | 1.0881 |
| 0.5678 | 1.6592 | 964 | 1.3833 | 0.5638 | 1.3833 | 1.1761 |
| 0.5678 | 1.6627 | 966 | 1.4332 | 0.5426 | 1.4332 | 1.1972 |
| 0.5678 | 1.6661 | 968 | 1.4213 | 0.5638 | 1.4213 | 1.1922 |
| 0.5678 | 1.6695 | 970 | 1.3219 | 0.6265 | 1.3219 | 1.1497 |
| 0.5678 | 1.6730 | 972 | 1.3040 | 0.6265 | 1.3040 | 1.1419 |
| 0.5678 | 1.6764 | 974 | 1.4507 | 0.5426 | 1.4507 | 1.2044 |
| 0.5678 | 1.6799 | 976 | 1.5152 | 0.5188 | 1.5152 | 1.2309 |
| 0.5678 | 1.6833 | 978 | 1.4087 | 0.7123 | 1.4087 | 1.1869 |
| 0.5678 | 1.6867 | 980 | 1.1917 | 0.6890 | 1.1917 | 1.0917 |
| 0.5678 | 1.6902 | 982 | 1.0174 | 0.6752 | 1.0174 | 1.0087 |
| 0.5678 | 1.6936 | 984 | 0.9348 | 0.7058 | 0.9348 | 0.9668 |
| 0.5678 | 1.6971 | 986 | 1.0326 | 0.6752 | 1.0326 | 1.0162 |
| 0.5678 | 1.7005 | 988 | 1.3724 | 0.7123 | 1.3724 | 1.1715 |
| 0.5678 | 1.7040 | 990 | 1.5515 | 0.7092 | 1.5515 | 1.2456 |
| 0.5678 | 1.7074 | 992 | 1.4554 | 0.6610 | 1.4554 | 1.2064 |
| 0.5678 | 1.7108 | 994 | 1.2585 | 0.7129 | 1.2585 | 1.1218 |
| 0.5678 | 1.7143 | 996 | 1.1039 | 0.7134 | 1.1039 | 1.0507 |
| 0.5678 | 1.7177 | 998 | 1.1727 | 0.6818 | 1.1727 | 1.0829 |
| 0.1999 | 1.7212 | 1000 | 1.3038 | 0.6934 | 1.3038 | 1.1418 |
| 0.1999 | 1.7246 | 1002 | 1.5142 | 0.6115 | 1.5142 | 1.2305 |
| 0.1999 | 1.7281 | 1004 | 1.5743 | 0.5556 | 1.5743 | 1.2547 |
| 0.1999 | 1.7315 | 1006 | 1.4662 | 0.5413 | 1.4662 | 1.2109 |
| 0.1999 | 1.7349 | 1008 | 1.3318 | 0.6015 | 1.3318 | 1.1540 |
| 0.1999 | 1.7384 | 1010 | 1.1319 | 0.7194 | 1.1319 | 1.0639 |
| 0.1999 | 1.7418 | 1012 | 1.0337 | 0.7844 | 1.0337 | 1.0167 |
| 0.1999 | 1.7453 | 1014 | 1.0285 | 0.7844 | 1.0285 | 1.0142 |
| 0.1999 | 1.7487 | 1016 | 1.1460 | 0.6616 | 1.1460 | 1.0705 |
| 0.1999 | 1.7522 | 1018 | 1.3093 | 0.6303 | 1.3093 | 1.1442 |
| 0.1999 | 1.7556 | 1020 | 1.3387 | 0.6303 | 1.3387 | 1.1570 |
| 0.1999 | 1.7590 | 1022 | 1.1888 | 0.6797 | 1.1888 | 1.0903 |
| 0.1999 | 1.7625 | 1024 | 1.0049 | 0.7342 | 1.0049 | 1.0024 |
| 0.1999 | 1.7659 | 1026 | 0.8713 | 0.7801 | 0.8713 | 0.9334 |
| 0.1999 | 1.7694 | 1028 | 0.8229 | 0.7801 | 0.8229 | 0.9071 |
| 0.1999 | 1.7728 | 1030 | 0.7841 | 0.7801 | 0.7841 | 0.8855 |
| 0.1999 | 1.7762 | 1032 | 0.7969 | 0.7801 | 0.7969 | 0.8927 |
| 0.1999 | 1.7797 | 1034 | 0.9526 | 0.7264 | 0.9526 | 0.9760 |
| 0.1999 | 1.7831 | 1036 | 1.2753 | 0.6860 | 1.2753 | 1.1293 |
| 0.1999 | 1.7866 | 1038 | 1.5383 | 0.5744 | 1.5383 | 1.2403 |
| 0.1999 | 1.7900 | 1040 | 1.5197 | 0.5744 | 1.5197 | 1.2327 |
| 0.1999 | 1.7935 | 1042 | 1.2991 | 0.5744 | 1.2991 | 1.1398 |
| 0.1999 | 1.7969 | 1044 | 1.0138 | 0.7844 | 1.0138 | 1.0069 |
| 0.1999 | 1.8003 | 1046 | 0.8040 | 0.7149 | 0.8040 | 0.8967 |
| 0.1999 | 1.8038 | 1048 | 0.7458 | 0.7149 | 0.7458 | 0.8636 |
| 0.1999 | 1.8072 | 1050 | 0.7715 | 0.7149 | 0.7715 | 0.8784 |
| 0.1999 | 1.8107 | 1052 | 0.9137 | 0.7801 | 0.9137 | 0.9559 |
| 0.1999 | 1.8141 | 1054 | 1.0043 | 0.7902 | 1.0043 | 1.0022 |
| 0.1999 | 1.8176 | 1056 | 1.0209 | 0.7618 | 1.0209 | 1.0104 |
| 0.1999 | 1.8210 | 1058 | 1.0141 | 0.7426 | 1.0141 | 1.0070 |
| 0.1999 | 1.8244 | 1060 | 0.9893 | 0.7342 | 0.9893 | 0.9947 |
| 0.1999 | 1.8279 | 1062 | 0.9968 | 0.7618 | 0.9968 | 0.9984 |
| 0.1999 | 1.8313 | 1064 | 0.9651 | 0.7729 | 0.9651 | 0.9824 |
| 0.1999 | 1.8348 | 1066 | 0.9620 | 0.7729 | 0.9620 | 0.9808 |
| 0.1999 | 1.8382 | 1068 | 1.0514 | 0.7618 | 1.0514 | 1.0254 |
| 0.1999 | 1.8417 | 1070 | 1.1053 | 0.6616 | 1.1053 | 1.0513 |
| 0.1999 | 1.8451 | 1072 | 1.2474 | 0.6616 | 1.2474 | 1.1169 |
| 0.1999 | 1.8485 | 1074 | 1.3110 | 0.6015 | 1.3110 | 1.1450 |
| 0.1999 | 1.8520 | 1076 | 1.2666 | 0.6686 | 1.2666 | 1.1254 |
| 0.1999 | 1.8554 | 1078 | 1.1121 | 0.7194 | 1.1121 | 1.0546 |
| 0.1999 | 1.8589 | 1080 | 0.9736 | 0.7844 | 0.9736 | 0.9867 |
| 0.1999 | 1.8623 | 1082 | 0.9713 | 0.7844 | 0.9713 | 0.9855 |
| 0.1999 | 1.8657 | 1084 | 1.0103 | 0.7310 | 1.0103 | 1.0051 |
| 0.1999 | 1.8692 | 1086 | 1.0257 | 0.7134 | 1.0257 | 1.0128 |
| 0.1999 | 1.8726 | 1088 | 1.1374 | 0.7421 | 1.1374 | 1.0665 |
| 0.1999 | 1.8761 | 1090 | 1.4073 | 0.6791 | 1.4073 | 1.1863 |
| 0.1999 | 1.8795 | 1092 | 1.5297 | 0.5714 | 1.5297 | 1.2368 |
| 0.1999 | 1.8830 | 1094 | 1.4530 | 0.6836 | 1.4530 | 1.2054 |
| 0.1999 | 1.8864 | 1096 | 1.1420 | 0.7424 | 1.1420 | 1.0686 |
| 0.1999 | 1.8898 | 1098 | 0.9404 | 0.7058 | 0.9404 | 0.9697 |
| 0.1999 | 1.8933 | 1100 | 0.9635 | 0.7134 | 0.9635 | 0.9816 |
| 0.1999 | 1.8967 | 1102 | 1.0731 | 0.6616 | 1.0731 | 1.0359 |
| 0.1999 | 1.9002 | 1104 | 1.2178 | 0.6616 | 1.2178 | 1.1035 |
| 0.1999 | 1.9036 | 1106 | 1.2456 | 0.7101 | 1.2456 | 1.1161 |
| 0.1999 | 1.9071 | 1108 | 1.1505 | 0.7194 | 1.1505 | 1.0726 |
| 0.1999 | 1.9105 | 1110 | 1.0573 | 0.7194 | 1.0573 | 1.0283 |
| 0.1999 | 1.9139 | 1112 | 1.0458 | 0.7101 | 1.0458 | 1.0226 |
| 0.1999 | 1.9174 | 1114 | 0.8859 | 0.8019 | 0.8859 | 0.9412 |
| 0.1999 | 1.9208 | 1116 | 0.7457 | 0.8023 | 0.7457 | 0.8635 |
| 0.1999 | 1.9243 | 1118 | 0.6599 | 0.7805 | 0.6599 | 0.8123 |
| 0.1999 | 1.9277 | 1120 | 0.6770 | 0.7805 | 0.6770 | 0.8228 |
| 0.1999 | 1.9312 | 1122 | 0.7572 | 0.7797 | 0.7572 | 0.8702 |
| 0.1999 | 1.9346 | 1124 | 0.8780 | 0.7134 | 0.8780 | 0.9370 |
| 0.1999 | 1.9380 | 1126 | 1.0261 | 0.7134 | 1.0261 | 1.0130 |
| 0.1999 | 1.9415 | 1128 | 1.0775 | 0.7618 | 1.0775 | 1.0380 |
| 0.1999 | 1.9449 | 1130 | 1.0288 | 0.7618 | 1.0288 | 1.0143 |
| 0.1999 | 1.9484 | 1132 | 1.0218 | 0.7729 | 1.0218 | 1.0109 |
| 0.1999 | 1.9518 | 1134 | 0.9921 | 0.7729 | 0.9921 | 0.9960 |
| 0.1999 | 1.9552 | 1136 | 1.0973 | 0.7134 | 1.0973 | 1.0475 |
| 0.1999 | 1.9587 | 1138 | 1.1264 | 0.7134 | 1.1264 | 1.0613 |
| 0.1999 | 1.9621 | 1140 | 1.0317 | 0.7134 | 1.0317 | 1.0157 |
| 0.1999 | 1.9656 | 1142 | 1.0480 | 0.7134 | 1.0480 | 1.0237 |
| 0.1999 | 1.9690 | 1144 | 1.0163 | 0.7058 | 1.0163 | 1.0081 |
| 0.1999 | 1.9725 | 1146 | 0.9549 | 0.7058 | 0.9549 | 0.9772 |
| 0.1999 | 1.9759 | 1148 | 0.8687 | 0.7520 | 0.8687 | 0.9321 |
| 0.1999 | 1.9793 | 1150 | 0.8357 | 0.7623 | 0.8357 | 0.9142 |
| 0.1999 | 1.9828 | 1152 | 0.9111 | 0.7623 | 0.9111 | 0.9545 |
| 0.1999 | 1.9862 | 1154 | 1.0760 | 0.7216 | 1.0760 | 1.0373 |
| 0.1999 | 1.9897 | 1156 | 1.2704 | 0.5638 | 1.2704 | 1.1271 |
| 0.1999 | 1.9931 | 1158 | 1.3640 | 0.6008 | 1.3640 | 1.1679 |
| 0.1999 | 1.9966 | 1160 | 1.3871 | 0.6008 | 1.3871 | 1.1778 |
| 0.1999 | 2.0 | 1162 | 1.2794 | 0.6015 | 1.2794 | 1.1311 |
| 0.1999 | 2.0034 | 1164 | 1.0752 | 0.7844 | 1.0752 | 1.0369 |
| 0.1999 | 2.0069 | 1166 | 0.8959 | 0.7623 | 0.8959 | 0.9465 |
| 0.1999 | 2.0103 | 1168 | 0.8695 | 0.7823 | 0.8695 | 0.9325 |
| 0.1999 | 2.0138 | 1170 | 0.9861 | 0.7134 | 0.9861 | 0.9930 |
| 0.1999 | 2.0172 | 1172 | 1.2094 | 0.7216 | 1.2094 | 1.0997 |
| 0.1999 | 2.0207 | 1174 | 1.3914 | 0.6008 | 1.3914 | 1.1796 |
| 0.1999 | 2.0241 | 1176 | 1.3640 | 0.6008 | 1.3640 | 1.1679 |
| 0.1999 | 2.0275 | 1178 | 1.1837 | 0.7101 | 1.1837 | 1.0880 |
| 0.1999 | 2.0310 | 1180 | 0.9804 | 0.7729 | 0.9804 | 0.9902 |
| 0.1999 | 2.0344 | 1182 | 0.9292 | 0.7623 | 0.9292 | 0.9640 |
| 0.1999 | 2.0379 | 1184 | 1.0152 | 0.7618 | 1.0152 | 1.0076 |
| 0.1999 | 2.0413 | 1186 | 1.0607 | 0.7618 | 1.0607 | 1.0299 |
| 0.1999 | 2.0448 | 1188 | 1.1753 | 0.7421 | 1.1753 | 1.0841 |
| 0.1999 | 2.0482 | 1190 | 1.2385 | 0.7421 | 1.2385 | 1.1129 |
| 0.1999 | 2.0516 | 1192 | 1.3774 | 0.6309 | 1.3774 | 1.1736 |
| 0.1999 | 2.0551 | 1194 | 1.3558 | 0.6309 | 1.3558 | 1.1644 |
| 0.1999 | 2.0585 | 1196 | 1.1905 | 0.7020 | 1.1905 | 1.0911 |
| 0.1999 | 2.0620 | 1198 | 1.1592 | 0.7020 | 1.1592 | 1.0766 |
| 0.1999 | 2.0654 | 1200 | 1.2045 | 0.7020 | 1.2045 | 1.0975 |
| 0.1999 | 2.0688 | 1202 | 1.2589 | 0.6420 | 1.2589 | 1.1220 |
| 0.1999 | 2.0723 | 1204 | 1.3228 | 0.5617 | 1.3228 | 1.1501 |
| 0.1999 | 2.0757 | 1206 | 1.2192 | 0.6483 | 1.2192 | 1.1042 |
| 0.1999 | 2.0792 | 1208 | 1.0472 | 0.7729 | 1.0472 | 1.0233 |
| 0.1999 | 2.0826 | 1210 | 0.9913 | 0.7729 | 0.9913 | 0.9957 |
| 0.1999 | 2.0861 | 1212 | 1.0376 | 0.7729 | 1.0376 | 1.0186 |
| 0.1999 | 2.0895 | 1214 | 1.0503 | 0.7729 | 1.0503 | 1.0248 |
| 0.1999 | 2.0929 | 1216 | 0.9759 | 0.7729 | 0.9759 | 0.9879 |
| 0.1999 | 2.0964 | 1218 | 0.8630 | 0.7623 | 0.8630 | 0.9290 |
| 0.1999 | 2.0998 | 1220 | 0.7234 | 0.7986 | 0.7234 | 0.8505 |
| 0.1999 | 2.1033 | 1222 | 0.7623 | 0.7986 | 0.7623 | 0.8731 |
| 0.1999 | 2.1067 | 1224 | 0.9787 | 0.7426 | 0.9787 | 0.9893 |
| 0.1999 | 2.1102 | 1226 | 1.2599 | 0.6866 | 1.2599 | 1.1225 |
| 0.1999 | 2.1136 | 1228 | 1.3020 | 0.5980 | 1.3020 | 1.1411 |
| 0.1999 | 2.1170 | 1230 | 1.1575 | 0.6557 | 1.1575 | 1.0759 |
| 0.1999 | 2.1205 | 1232 | 0.9646 | 0.7618 | 0.9646 | 0.9822 |
| 0.1999 | 2.1239 | 1234 | 0.8481 | 0.8019 | 0.8481 | 0.9209 |
| 0.1999 | 2.1274 | 1236 | 0.8585 | 0.8019 | 0.8585 | 0.9266 |
| 0.1999 | 2.1308 | 1238 | 0.9705 | 0.7729 | 0.9705 | 0.9851 |
| 0.1999 | 2.1343 | 1240 | 1.1366 | 0.6557 | 1.1366 | 1.0661 |
| 0.1999 | 2.1377 | 1242 | 1.1760 | 0.6557 | 1.1760 | 1.0845 |
| 0.1999 | 2.1411 | 1244 | 1.1032 | 0.6557 | 1.1032 | 1.0503 |
| 0.1999 | 2.1446 | 1246 | 0.9517 | 0.7902 | 0.9517 | 0.9756 |
| 0.1999 | 2.1480 | 1248 | 0.8550 | 0.7801 | 0.8550 | 0.9246 |
| 0.1999 | 2.1515 | 1250 | 0.8686 | 0.7906 | 0.8686 | 0.9320 |
| 0.1999 | 2.1549 | 1252 | 0.9688 | 0.8019 | 0.9688 | 0.9843 |
| 0.1999 | 2.1583 | 1254 | 1.0824 | 0.7020 | 1.0824 | 1.0404 |
| 0.1999 | 2.1618 | 1256 | 1.1600 | 0.6557 | 1.1600 | 1.0770 |
| 0.1999 | 2.1652 | 1258 | 1.1438 | 0.6420 | 1.1438 | 1.0695 |
| 0.1999 | 2.1687 | 1260 | 1.1699 | 0.6420 | 1.1699 | 1.0816 |
| 0.1999 | 2.1721 | 1262 | 1.0894 | 0.6420 | 1.0894 | 1.0437 |
| 0.1999 | 2.1756 | 1264 | 0.9034 | 0.7906 | 0.9034 | 0.9504 |
| 0.1999 | 2.1790 | 1266 | 0.7645 | 0.7704 | 0.7645 | 0.8744 |
| 0.1999 | 2.1824 | 1268 | 0.8125 | 0.7191 | 0.8125 | 0.9014 |
| 0.1999 | 2.1859 | 1270 | 0.9934 | 0.7518 | 0.9934 | 0.9967 |
| 0.1999 | 2.1893 | 1272 | 1.0396 | 0.7514 | 1.0396 | 1.0196 |
| 0.1999 | 2.1928 | 1274 | 0.8987 | 0.7439 | 0.8987 | 0.9480 |
| 0.1999 | 2.1962 | 1276 | 0.7521 | 0.7443 | 0.7521 | 0.8672 |
| 0.1999 | 2.1997 | 1278 | 0.7335 | 0.7443 | 0.7335 | 0.8564 |
| 0.1999 | 2.2031 | 1280 | 0.8150 | 0.7700 | 0.8150 | 0.9027 |
| 0.1999 | 2.2065 | 1282 | 0.9442 | 0.7618 | 0.9442 | 0.9717 |
| 0.1999 | 2.2100 | 1284 | 1.2138 | 0.6420 | 1.2138 | 1.1017 |
| 0.1999 | 2.2134 | 1286 | 1.3419 | 0.5980 | 1.3419 | 1.1584 |
| 0.1999 | 2.2169 | 1288 | 1.2666 | 0.6420 | 1.2666 | 1.1254 |
| 0.1999 | 2.2203 | 1290 | 1.0995 | 0.7020 | 1.0995 | 1.0486 |
| 0.1999 | 2.2238 | 1292 | 0.9435 | 0.6988 | 0.9435 | 0.9713 |
| 0.1999 | 2.2272 | 1294 | 0.8660 | 0.6988 | 0.8660 | 0.9306 |
| 0.1999 | 2.2306 | 1296 | 0.9089 | 0.7729 | 0.9089 | 0.9533 |
| 0.1999 | 2.2341 | 1298 | 0.9984 | 0.7134 | 0.9984 | 0.9992 |
| 0.1999 | 2.2375 | 1300 | 1.1187 | 0.7134 | 1.1187 | 1.0577 |
| 0.1999 | 2.2410 | 1302 | 1.1659 | 0.7157 | 1.1659 | 1.0797 |
| 0.1999 | 2.2444 | 1304 | 1.1009 | 0.7134 | 1.1009 | 1.0492 |
| 0.1999 | 2.2478 | 1306 | 1.0634 | 0.7618 | 1.0634 | 1.0312 |
| 0.1999 | 2.2513 | 1308 | 0.9554 | 0.7729 | 0.9554 | 0.9774 |
| 0.1999 | 2.2547 | 1310 | 0.9577 | 0.6988 | 0.9577 | 0.9786 |
| 0.1999 | 2.2582 | 1312 | 1.0544 | 0.7106 | 1.0544 | 1.0268 |
| 0.1999 | 2.2616 | 1314 | 1.1313 | 0.6483 | 1.1313 | 1.0636 |
| 0.1999 | 2.2651 | 1316 | 1.2557 | 0.5980 | 1.2557 | 1.1206 |
| 0.1999 | 2.2685 | 1318 | 1.2655 | 0.5980 | 1.2655 | 1.1249 |
| 0.1999 | 2.2719 | 1320 | 1.3451 | 0.6260 | 1.3451 | 1.1598 |
| 0.1999 | 2.2754 | 1322 | 1.2996 | 0.5980 | 1.2996 | 1.1400 |
| 0.1999 | 2.2788 | 1324 | 1.0944 | 0.6483 | 1.0944 | 1.0461 |
| 0.1999 | 2.2823 | 1326 | 0.9026 | 0.7623 | 0.9026 | 0.9501 |
| 0.1999 | 2.2857 | 1328 | 0.8285 | 0.6909 | 0.8285 | 0.9102 |
| 0.1999 | 2.2892 | 1330 | 0.8417 | 0.6909 | 0.8417 | 0.9174 |
| 0.1999 | 2.2926 | 1332 | 0.9633 | 0.7729 | 0.9633 | 0.9815 |
| 0.1999 | 2.2960 | 1334 | 1.1650 | 0.6557 | 1.1650 | 1.0794 |
| 0.1999 | 2.2995 | 1336 | 1.3326 | 0.6085 | 1.3326 | 1.1544 |
| 0.1999 | 2.3029 | 1338 | 1.2667 | 0.6622 | 1.2667 | 1.1255 |
| 0.1999 | 2.3064 | 1340 | 1.0973 | 0.7134 | 1.0973 | 1.0475 |
| 0.1999 | 2.3098 | 1342 | 0.9540 | 0.7729 | 0.9540 | 0.9767 |
| 0.1999 | 2.3133 | 1344 | 0.8495 | 0.7729 | 0.8495 | 0.9217 |
| 0.1999 | 2.3167 | 1346 | 0.8843 | 0.7729 | 0.8843 | 0.9404 |
| 0.1999 | 2.3201 | 1348 | 0.9365 | 0.7221 | 0.9365 | 0.9677 |
| 0.1999 | 2.3236 | 1350 | 1.0424 | 0.7134 | 1.0424 | 1.0210 |
| 0.1999 | 2.3270 | 1352 | 1.1279 | 0.7134 | 1.1279 | 1.0620 |
| 0.1999 | 2.3305 | 1354 | 1.1058 | 0.7134 | 1.1058 | 1.0516 |
| 0.1999 | 2.3339 | 1356 | 1.1183 | 0.7134 | 1.1183 | 1.0575 |
| 0.1999 | 2.3373 | 1358 | 1.0535 | 0.7729 | 1.0535 | 1.0264 |
| 0.1999 | 2.3408 | 1360 | 0.9623 | 0.7729 | 0.9623 | 0.9810 |
| 0.1999 | 2.3442 | 1362 | 0.9443 | 0.7729 | 0.9443 | 0.9717 |
| 0.1999 | 2.3477 | 1364 | 1.0183 | 0.7618 | 1.0183 | 1.0091 |
| 0.1999 | 2.3511 | 1366 | 1.0528 | 0.7134 | 1.0528 | 1.0261 |
| 0.1999 | 2.3546 | 1368 | 1.0933 | 0.7134 | 1.0933 | 1.0456 |
| 0.1999 | 2.3580 | 1370 | 1.0860 | 0.7134 | 1.0860 | 1.0421 |
| 0.1999 | 2.3614 | 1372 | 1.0017 | 0.7623 | 1.0017 | 1.0009 |
| 0.1999 | 2.3649 | 1374 | 1.0460 | 0.7618 | 1.0460 | 1.0228 |
| 0.1999 | 2.3683 | 1376 | 1.0202 | 0.7623 | 1.0202 | 1.0100 |
| 0.1999 | 2.3718 | 1378 | 1.0093 | 0.7623 | 1.0093 | 1.0047 |
| 0.1999 | 2.3752 | 1380 | 0.9303 | 0.7623 | 0.9303 | 0.9645 |
| 0.1999 | 2.3787 | 1382 | 0.8594 | 0.7149 | 0.8594 | 0.9271 |
| 0.1999 | 2.3821 | 1384 | 0.8668 | 0.7801 | 0.8668 | 0.9310 |
| 0.1999 | 2.3855 | 1386 | 0.9212 | 0.6987 | 0.9212 | 0.9598 |
| 0.1999 | 2.3890 | 1388 | 1.0880 | 0.7255 | 1.0880 | 1.0431 |
| 0.1999 | 2.3924 | 1390 | 1.1509 | 0.6784 | 1.1509 | 1.0728 |
| 0.1999 | 2.3959 | 1392 | 1.0849 | 0.7337 | 1.0849 | 1.0416 |
| 0.1999 | 2.3993 | 1394 | 0.9662 | 0.7058 | 0.9662 | 0.9829 |
| 0.1999 | 2.4028 | 1396 | 0.8045 | 0.7986 | 0.8045 | 0.8969 |
| 0.1999 | 2.4062 | 1398 | 0.7669 | 0.7342 | 0.7669 | 0.8757 |
| 0.1999 | 2.4096 | 1400 | 0.8201 | 0.8098 | 0.8201 | 0.9056 |
| 0.1999 | 2.4131 | 1402 | 0.9028 | 0.7623 | 0.9028 | 0.9501 |
| 0.1999 | 2.4165 | 1404 | 1.0068 | 0.7618 | 1.0068 | 1.0034 |
| 0.1999 | 2.4200 | 1406 | 1.0158 | 0.7134 | 1.0158 | 1.0079 |
| 0.1999 | 2.4234 | 1408 | 0.9629 | 0.7520 | 0.9629 | 0.9813 |
| 0.1999 | 2.4269 | 1410 | 0.8586 | 0.7797 | 0.8586 | 0.9266 |
| 0.1999 | 2.4303 | 1412 | 0.8286 | 0.7801 | 0.8286 | 0.9102 |
| 0.1999 | 2.4337 | 1414 | 0.8684 | 0.7342 | 0.8684 | 0.9319 |
| 0.1999 | 2.4372 | 1416 | 0.9477 | 0.7058 | 0.9477 | 0.9735 |
| 0.1999 | 2.4406 | 1418 | 0.9152 | 0.7058 | 0.9152 | 0.9566 |
| 0.1999 | 2.4441 | 1420 | 0.9778 | 0.7058 | 0.9778 | 0.9888 |
| 0.1999 | 2.4475 | 1422 | 1.0358 | 0.7058 | 1.0358 | 1.0177 |
| 0.1999 | 2.4509 | 1424 | 1.1325 | 0.6982 | 1.1325 | 1.0642 |
| 0.1999 | 2.4544 | 1426 | 1.1408 | 0.6871 | 1.1408 | 1.0681 |
| 0.1999 | 2.4578 | 1428 | 1.0383 | 0.7058 | 1.0383 | 1.0190 |
| 0.1999 | 2.4613 | 1430 | 0.9088 | 0.7623 | 0.9088 | 0.9533 |
| 0.1999 | 2.4647 | 1432 | 0.8571 | 0.7525 | 0.8571 | 0.9258 |
| 0.1999 | 2.4682 | 1434 | 0.8815 | 0.7525 | 0.8815 | 0.9389 |
| 0.1999 | 2.4716 | 1436 | 0.8708 | 0.7525 | 0.8708 | 0.9332 |
| 0.1999 | 2.4750 | 1438 | 0.9413 | 0.7139 | 0.9413 | 0.9702 |
| 0.1999 | 2.4785 | 1440 | 0.9793 | 0.7058 | 0.9793 | 0.9896 |
| 0.1999 | 2.4819 | 1442 | 0.9464 | 0.6987 | 0.9464 | 0.9728 |
| 0.1999 | 2.4854 | 1444 | 0.8876 | 0.6987 | 0.8876 | 0.9421 |
| 0.1999 | 2.4888 | 1446 | 0.9394 | 0.6987 | 0.9394 | 0.9692 |
| 0.1999 | 2.4923 | 1448 | 0.9867 | 0.7058 | 0.9867 | 0.9933 |
| 0.1999 | 2.4957 | 1450 | 1.0162 | 0.7058 | 1.0162 | 1.0081 |
| 0.1999 | 2.4991 | 1452 | 1.1137 | 0.7053 | 1.1137 | 1.0553 |
| 0.1999 | 2.5026 | 1454 | 1.0995 | 0.7134 | 1.0995 | 1.0486 |
| 0.1999 | 2.5060 | 1456 | 1.1240 | 0.7134 | 1.1240 | 1.0602 |
| 0.1999 | 2.5095 | 1458 | 1.1062 | 0.7134 | 1.1062 | 1.0518 |
| 0.1999 | 2.5129 | 1460 | 0.9685 | 0.7139 | 0.9685 | 0.9841 |
| 0.1999 | 2.5164 | 1462 | 0.9277 | 0.7063 | 0.9277 | 0.9632 |
| 0.1999 | 2.5198 | 1464 | 1.0305 | 0.7058 | 1.0305 | 1.0151 |
| 0.1999 | 2.5232 | 1466 | 1.2406 | 0.6497 | 1.2406 | 1.1138 |
| 0.1999 | 2.5267 | 1468 | 1.4611 | 0.6078 | 1.4611 | 1.2088 |
| 0.1999 | 2.5301 | 1470 | 1.5044 | 0.5540 | 1.5044 | 1.2266 |
| 0.1999 | 2.5336 | 1472 | 1.4164 | 0.5540 | 1.4164 | 1.1901 |
| 0.1999 | 2.5370 | 1474 | 1.2312 | 0.6497 | 1.2312 | 1.1096 |
| 0.1999 | 2.5404 | 1476 | 1.0150 | 0.7139 | 1.0150 | 1.0075 |
| 0.1999 | 2.5439 | 1478 | 0.9338 | 0.7623 | 0.9338 | 0.9663 |
| 0.1999 | 2.5473 | 1480 | 0.9597 | 0.7623 | 0.9597 | 0.9796 |
| 0.1999 | 2.5508 | 1482 | 1.0351 | 0.7221 | 1.0351 | 1.0174 |
| 0.1999 | 2.5542 | 1484 | 1.1871 | 0.6557 | 1.1871 | 1.0896 |
| 0.1999 | 2.5577 | 1486 | 1.2464 | 0.6557 | 1.2464 | 1.1164 |
| 0.1999 | 2.5611 | 1488 | 1.1643 | 0.6557 | 1.1643 | 1.0790 |
| 0.1999 | 2.5645 | 1490 | 1.0105 | 0.7139 | 1.0105 | 1.0052 |
| 0.1999 | 2.5680 | 1492 | 0.8667 | 0.7623 | 0.8667 | 0.9309 |
| 0.1999 | 2.5714 | 1494 | 0.8495 | 0.7623 | 0.8495 | 0.9217 |
| 0.1999 | 2.5749 | 1496 | 0.8693 | 0.7139 | 0.8693 | 0.9323 |
| 0.1999 | 2.5783 | 1498 | 0.9858 | 0.7139 | 0.9858 | 0.9929 |
| 0.1399 | 2.5818 | 1500 | 1.1688 | 0.7134 | 1.1688 | 1.0811 |
| 0.1399 | 2.5852 | 1502 | 1.2145 | 0.7134 | 1.2145 | 1.1020 |
| 0.1399 | 2.5886 | 1504 | 1.1821 | 0.7134 | 1.1821 | 1.0872 |
| 0.1399 | 2.5921 | 1506 | 1.0488 | 0.7134 | 1.0488 | 1.0241 |
| 0.1399 | 2.5955 | 1508 | 1.0200 | 0.7221 | 1.0200 | 1.0099 |
| 0.1399 | 2.5990 | 1510 | 1.0546 | 0.7134 | 1.0546 | 1.0269 |
| 0.1399 | 2.6024 | 1512 | 1.0077 | 0.7058 | 1.0077 | 1.0038 |
| 0.1399 | 2.6059 | 1514 | 0.9905 | 0.7058 | 0.9905 | 0.9952 |
| 0.1399 | 2.6093 | 1516 | 0.9010 | 0.7342 | 0.9010 | 0.9492 |
| 0.1399 | 2.6127 | 1518 | 0.8844 | 0.7264 | 0.8844 | 0.9404 |
| 0.1399 | 2.6162 | 1520 | 0.9766 | 0.7058 | 0.9766 | 0.9882 |
| 0.1399 | 2.6196 | 1522 | 1.1636 | 0.7058 | 1.1636 | 1.0787 |
| 0.1399 | 2.6231 | 1524 | 1.2215 | 0.7157 | 1.2215 | 1.1052 |
| 0.1399 | 2.6265 | 1526 | 1.1593 | 0.6871 | 1.1593 | 1.0767 |
| 0.1399 | 2.6299 | 1528 | 1.0551 | 0.7058 | 1.0551 | 1.0272 |
| 0.1399 | 2.6334 | 1530 | 0.9458 | 0.7623 | 0.9458 | 0.9725 |
| 0.1399 | 2.6368 | 1532 | 0.8863 | 0.7525 | 0.8863 | 0.9414 |
| 0.1399 | 2.6403 | 1534 | 0.9430 | 0.7139 | 0.9430 | 0.9711 |
| 0.1399 | 2.6437 | 1536 | 1.1207 | 0.7134 | 1.1207 | 1.0587 |
| 0.1399 | 2.6472 | 1538 | 1.2864 | 0.6616 | 1.2864 | 1.1342 |
| 0.1399 | 2.6506 | 1540 | 1.3407 | 0.6616 | 1.3407 | 1.1579 |
| 0.1399 | 2.6540 | 1542 | 1.2231 | 0.7134 | 1.2231 | 1.1060 |
| 0.1399 | 2.6575 | 1544 | 1.1410 | 0.7221 | 1.1410 | 1.0682 |
| 0.1399 | 2.6609 | 1546 | 1.1233 | 0.7221 | 1.1233 | 1.0599 |
| 0.1399 | 2.6644 | 1548 | 1.2106 | 0.7134 | 1.2106 | 1.1003 |
| 0.1399 | 2.6678 | 1550 | 1.2675 | 0.6866 | 1.2675 | 1.1258 |
| 0.1399 | 2.6713 | 1552 | 1.2303 | 0.7134 | 1.2303 | 1.1092 |
| 0.1399 | 2.6747 | 1554 | 1.1084 | 0.7134 | 1.1084 | 1.0528 |
| 0.1399 | 2.6781 | 1556 | 0.9435 | 0.7139 | 0.9435 | 0.9713 |
| 0.1399 | 2.6816 | 1558 | 0.8449 | 0.7525 | 0.8449 | 0.9192 |
| 0.1399 | 2.6850 | 1560 | 0.7941 | 0.7525 | 0.7941 | 0.8911 |
| 0.1399 | 2.6885 | 1562 | 0.8222 | 0.7063 | 0.8222 | 0.9067 |
| 0.1399 | 2.6919 | 1564 | 0.9512 | 0.6987 | 0.9512 | 0.9753 |
| 0.1399 | 2.6954 | 1566 | 1.1433 | 0.7255 | 1.1433 | 1.0692 |
| 0.1399 | 2.6988 | 1568 | 1.2217 | 0.7333 | 1.2217 | 1.1053 |
| 0.1399 | 2.7022 | 1570 | 1.1878 | 0.7333 | 1.1878 | 1.0898 |
| 0.1399 | 2.7057 | 1572 | 1.0246 | 0.7058 | 1.0246 | 1.0122 |
| 0.1399 | 2.7091 | 1574 | 0.8392 | 0.6987 | 0.8392 | 0.9161 |
| 0.1399 | 2.7126 | 1576 | 0.7758 | 0.7167 | 0.7758 | 0.8808 |
| 0.1399 | 2.7160 | 1578 | 0.8212 | 0.7167 | 0.8212 | 0.9062 |
| 0.1399 | 2.7194 | 1580 | 0.9659 | 0.7058 | 0.9659 | 0.9828 |
| 0.1399 | 2.7229 | 1582 | 1.0893 | 0.7255 | 1.0893 | 1.0437 |
| 0.1399 | 2.7263 | 1584 | 1.0567 | 0.7255 | 1.0567 | 1.0280 |
| 0.1399 | 2.7298 | 1586 | 0.9336 | 0.7058 | 0.9336 | 0.9662 |
| 0.1399 | 2.7332 | 1588 | 0.8564 | 0.6987 | 0.8564 | 0.9254 |
| 0.1399 | 2.7367 | 1590 | 0.7930 | 0.7063 | 0.7930 | 0.8905 |
| 0.1399 | 2.7401 | 1592 | 0.8059 | 0.7063 | 0.8059 | 0.8977 |
| 0.1399 | 2.7435 | 1594 | 0.8719 | 0.6987 | 0.8719 | 0.9337 |
| 0.1399 | 2.7470 | 1596 | 0.9190 | 0.7182 | 0.9190 | 0.9587 |
| 0.1399 | 2.7504 | 1598 | 1.0019 | 0.7182 | 1.0019 | 1.0010 |
| 0.1399 | 2.7539 | 1600 | 1.0499 | 0.7182 | 1.0499 | 1.0247 |
| 0.1399 | 2.7573 | 1602 | 1.0167 | 0.7182 | 1.0167 | 1.0083 |
| 0.1399 | 2.7608 | 1604 | 0.8507 | 0.6987 | 0.8507 | 0.9223 |
| 0.1399 | 2.7642 | 1606 | 0.7218 | 0.7704 | 0.7218 | 0.8496 |
| 0.1399 | 2.7676 | 1608 | 0.7198 | 0.7525 | 0.7198 | 0.8484 |
| 0.1399 | 2.7711 | 1610 | 0.7984 | 0.7525 | 0.7984 | 0.8936 |
| 0.1399 | 2.7745 | 1612 | 0.9852 | 0.7139 | 0.9852 | 0.9926 |
| 0.1399 | 2.7780 | 1614 | 1.2178 | 0.7333 | 1.2178 | 1.1035 |
| 0.1399 | 2.7814 | 1616 | 1.2724 | 0.7333 | 1.2724 | 1.1280 |
| 0.1399 | 2.7849 | 1618 | 1.1674 | 0.7333 | 1.1674 | 1.0805 |
| 0.1399 | 2.7883 | 1620 | 0.9753 | 0.7139 | 0.9753 | 0.9876 |
| 0.1399 | 2.7917 | 1622 | 0.8924 | 0.7623 | 0.8924 | 0.9447 |
| 0.1399 | 2.7952 | 1624 | 0.8086 | 0.7623 | 0.8086 | 0.8992 |
| 0.1399 | 2.7986 | 1626 | 0.8211 | 0.7623 | 0.8211 | 0.9061 |
| 0.1399 | 2.8021 | 1628 | 0.8762 | 0.7623 | 0.8762 | 0.9360 |
| 0.1399 | 2.8055 | 1630 | 0.9154 | 0.7139 | 0.9154 | 0.9568 |
| 0.1399 | 2.8090 | 1632 | 0.9032 | 0.7139 | 0.9032 | 0.9503 |
| 0.1399 | 2.8124 | 1634 | 0.9257 | 0.7729 | 0.9257 | 0.9621 |
| 0.1399 | 2.8158 | 1636 | 1.0293 | 0.7221 | 1.0293 | 1.0145 |
| 0.1399 | 2.8193 | 1638 | 1.1849 | 0.7221 | 1.1849 | 1.0885 |
| 0.1399 | 2.8227 | 1640 | 1.1798 | 0.7310 | 1.1798 | 1.0862 |
| 0.1399 | 2.8262 | 1642 | 1.1110 | 0.7844 | 1.1110 | 1.0540 |
| 0.1399 | 2.8296 | 1644 | 0.9879 | 0.7729 | 0.9879 | 0.9940 |
| 0.1399 | 2.8330 | 1646 | 0.8625 | 0.6988 | 0.8625 | 0.9287 |
| 0.1399 | 2.8365 | 1648 | 0.8312 | 0.6988 | 0.8312 | 0.9117 |
| 0.1399 | 2.8399 | 1650 | 0.8634 | 0.7729 | 0.8634 | 0.9292 |
| 0.1399 | 2.8434 | 1652 | 0.9214 | 0.7221 | 0.9214 | 0.9599 |
| 0.1399 | 2.8468 | 1654 | 0.9400 | 0.7134 | 0.9400 | 0.9695 |
| 0.1399 | 2.8503 | 1656 | 0.9993 | 0.7157 | 0.9993 | 0.9996 |
| 0.1399 | 2.8537 | 1658 | 1.0307 | 0.7424 | 1.0307 | 1.0152 |
| 0.1399 | 2.8571 | 1660 | 1.0096 | 0.7134 | 1.0096 | 1.0048 |
| 0.1399 | 2.8606 | 1662 | 0.8906 | 0.7134 | 0.8906 | 0.9437 |
| 0.1399 | 2.8640 | 1664 | 0.8586 | 0.7221 | 0.8586 | 0.9266 |
| 0.1399 | 2.8675 | 1666 | 0.8202 | 0.7623 | 0.8202 | 0.9056 |
| 0.1399 | 2.8709 | 1668 | 0.8133 | 0.7801 | 0.8133 | 0.9018 |
| 0.1399 | 2.8744 | 1670 | 0.9153 | 0.6987 | 0.9153 | 0.9567 |
| 0.1399 | 2.8778 | 1672 | 1.0146 | 0.7421 | 1.0146 | 1.0073 |
| 0.1399 | 2.8812 | 1674 | 1.0418 | 0.7421 | 1.0418 | 1.0207 |
| 0.1399 | 2.8847 | 1676 | 0.9418 | 0.6922 | 0.9418 | 0.9705 |
| 0.1399 | 2.8881 | 1678 | 0.8788 | 0.7191 | 0.8788 | 0.9374 |
| 0.1399 | 2.8916 | 1680 | 0.8508 | 0.7268 | 0.8508 | 0.9224 |
| 0.1399 | 2.8950 | 1682 | 0.8988 | 0.7191 | 0.8988 | 0.9480 |
| 0.1399 | 2.8985 | 1684 | 0.8607 | 0.7191 | 0.8607 | 0.9278 |
| 0.1399 | 2.9019 | 1686 | 0.8902 | 0.7191 | 0.8902 | 0.9435 |
| 0.1399 | 2.9053 | 1688 | 0.9309 | 0.7191 | 0.9309 | 0.9648 |
| 0.1399 | 2.9088 | 1690 | 0.9420 | 0.7119 | 0.9420 | 0.9706 |
| 0.1399 | 2.9122 | 1692 | 0.9355 | 0.7264 | 0.9355 | 0.9672 |
| 0.1399 | 2.9157 | 1694 | 0.9160 | 0.7264 | 0.9160 | 0.9571 |
| 0.1399 | 2.9191 | 1696 | 0.9497 | 0.7264 | 0.9497 | 0.9745 |
| 0.1399 | 2.9225 | 1698 | 0.9694 | 0.7264 | 0.9694 | 0.9846 |
| 0.1399 | 2.9260 | 1700 | 0.9442 | 0.7264 | 0.9442 | 0.9717 |
| 0.1399 | 2.9294 | 1702 | 0.9189 | 0.7264 | 0.9189 | 0.9586 |
| 0.1399 | 2.9329 | 1704 | 0.8353 | 0.7264 | 0.8353 | 0.9140 |
| 0.1399 | 2.9363 | 1706 | 0.7897 | 0.7801 | 0.7897 | 0.8886 |
| 0.1399 | 2.9398 | 1708 | 0.8339 | 0.7801 | 0.8339 | 0.9132 |
| 0.1399 | 2.9432 | 1710 | 0.9218 | 0.7264 | 0.9218 | 0.9601 |
| 0.1399 | 2.9466 | 1712 | 0.9005 | 0.7264 | 0.9005 | 0.9489 |
| 0.1399 | 2.9501 | 1714 | 0.9060 | 0.7264 | 0.9060 | 0.9519 |
| 0.1399 | 2.9535 | 1716 | 0.9033 | 0.6987 | 0.9033 | 0.9504 |
| 0.1399 | 2.9570 | 1718 | 0.8638 | 0.7058 | 0.8638 | 0.9294 |
| 0.1399 | 2.9604 | 1720 | 0.8787 | 0.7058 | 0.8787 | 0.9374 |
| 0.1399 | 2.9639 | 1722 | 0.9523 | 0.7134 | 0.9523 | 0.9758 |
| 0.1399 | 2.9673 | 1724 | 0.9548 | 0.7134 | 0.9548 | 0.9771 |
| 0.1399 | 2.9707 | 1726 | 0.9380 | 0.7058 | 0.9380 | 0.9685 |
| 0.1399 | 2.9742 | 1728 | 0.9086 | 0.7058 | 0.9086 | 0.9532 |
| 0.1399 | 2.9776 | 1730 | 0.8728 | 0.7342 | 0.8728 | 0.9342 |
| 0.1399 | 2.9811 | 1732 | 0.7623 | 0.7191 | 0.7623 | 0.8731 |
| 0.1399 | 2.9845 | 1734 | 0.7242 | 0.7358 | 0.7242 | 0.8510 |
| 0.1399 | 2.9880 | 1736 | 0.7644 | 0.7358 | 0.7644 | 0.8743 |
| 0.1399 | 2.9914 | 1738 | 0.8512 | 0.7058 | 0.8512 | 0.9226 |
| 0.1399 | 2.9948 | 1740 | 0.9262 | 0.7134 | 0.9262 | 0.9624 |
| 0.1399 | 2.9983 | 1742 | 0.9578 | 0.7134 | 0.9578 | 0.9787 |
| 0.1399 | 3.0017 | 1744 | 0.9707 | 0.7134 | 0.9707 | 0.9852 |
| 0.1399 | 3.0052 | 1746 | 0.9792 | 0.7134 | 0.9792 | 0.9896 |
| 0.1399 | 3.0086 | 1748 | 1.0487 | 0.6940 | 1.0487 | 1.0241 |
| 0.1399 | 3.0120 | 1750 | 1.1352 | 0.6940 | 1.1352 | 1.0655 |
| 0.1399 | 3.0155 | 1752 | 1.1605 | 0.7014 | 1.1605 | 1.0772 |
| 0.1399 | 3.0189 | 1754 | 1.1421 | 0.6940 | 1.1421 | 1.0687 |
| 0.1399 | 3.0224 | 1756 | 1.0716 | 0.6940 | 1.0716 | 1.0352 |
| 0.1399 | 3.0258 | 1758 | 0.9744 | 0.7134 | 0.9744 | 0.9871 |
| 0.1399 | 3.0293 | 1760 | 0.9637 | 0.7134 | 0.9637 | 0.9817 |
| 0.1399 | 3.0327 | 1762 | 0.9459 | 0.7058 | 0.9459 | 0.9726 |
| 0.1399 | 3.0361 | 1764 | 0.8903 | 0.7058 | 0.8903 | 0.9436 |
| 0.1399 | 3.0396 | 1766 | 0.8665 | 0.7058 | 0.8665 | 0.9308 |
| 0.1399 | 3.0430 | 1768 | 0.7749 | 0.7986 | 0.7749 | 0.8803 |
| 0.1399 | 3.0465 | 1770 | 0.7584 | 0.7986 | 0.7584 | 0.8708 |
| 0.1399 | 3.0499 | 1772 | 0.8360 | 0.7139 | 0.8360 | 0.9143 |
| 0.1399 | 3.0534 | 1774 | 0.9983 | 0.7134 | 0.9983 | 0.9992 |
| 0.1399 | 3.0568 | 1776 | 1.0775 | 0.6866 | 1.0775 | 1.0380 |
| 0.1399 | 3.0602 | 1778 | 1.0747 | 0.6940 | 1.0747 | 1.0367 |
| 0.1399 | 3.0637 | 1780 | 1.0184 | 0.6940 | 1.0184 | 1.0091 |
| 0.1399 | 3.0671 | 1782 | 1.0009 | 0.6940 | 1.0009 | 1.0005 |
| 0.1399 | 3.0706 | 1784 | 0.8833 | 0.7729 | 0.8833 | 0.9398 |
| 0.1399 | 3.0740 | 1786 | 0.8042 | 0.7118 | 0.8042 | 0.8968 |
| 0.1399 | 3.0775 | 1788 | 0.8135 | 0.7623 | 0.8135 | 0.9020 |
| 0.1399 | 3.0809 | 1790 | 0.8750 | 0.7134 | 0.8750 | 0.9354 |
| 0.1399 | 3.0843 | 1792 | 1.0153 | 0.6940 | 1.0153 | 1.0076 |
| 0.1399 | 3.0878 | 1794 | 1.0614 | 0.7153 | 1.0614 | 1.0303 |
| 0.1399 | 3.0912 | 1796 | 0.9730 | 0.7134 | 0.9730 | 0.9864 |
| 0.1399 | 3.0947 | 1798 | 0.8967 | 0.6987 | 0.8967 | 0.9469 |
| 0.1399 | 3.0981 | 1800 | 0.8091 | 0.7191 | 0.8091 | 0.8995 |
| 0.1399 | 3.1015 | 1802 | 0.7111 | 0.6824 | 0.7111 | 0.8433 |
| 0.1399 | 3.1050 | 1804 | 0.7149 | 0.6824 | 0.7149 | 0.8455 |
| 0.1399 | 3.1084 | 1806 | 0.7797 | 0.7358 | 0.7797 | 0.8830 |
| 0.1399 | 3.1119 | 1808 | 0.9317 | 0.7518 | 0.9317 | 0.9653 |
| 0.1399 | 3.1153 | 1810 | 1.0395 | 0.7333 | 1.0395 | 1.0195 |
| 0.1399 | 3.1188 | 1812 | 1.0385 | 0.7333 | 1.0385 | 1.0191 |
| 0.1399 | 3.1222 | 1814 | 0.9355 | 0.7426 | 0.9355 | 0.9672 |
| 0.1399 | 3.1256 | 1816 | 0.7998 | 0.7525 | 0.7998 | 0.8943 |
| 0.1399 | 3.1291 | 1818 | 0.7633 | 0.7258 | 0.7633 | 0.8736 |
| 0.1399 | 3.1325 | 1820 | 0.8029 | 0.7346 | 0.8029 | 0.8961 |
| 0.1399 | 3.1360 | 1822 | 0.8380 | 0.7346 | 0.8380 | 0.9154 |
| 0.1399 | 3.1394 | 1824 | 0.8164 | 0.7346 | 0.8164 | 0.9035 |
| 0.1399 | 3.1429 | 1826 | 0.7676 | 0.7268 | 0.7676 | 0.8761 |
| 0.1399 | 3.1463 | 1828 | 0.7121 | 0.7268 | 0.7121 | 0.8439 |
| 0.1399 | 3.1497 | 1830 | 0.6888 | 0.7285 | 0.6888 | 0.8299 |
| 0.1399 | 3.1532 | 1832 | 0.6899 | 0.7526 | 0.6899 | 0.8306 |
| 0.1399 | 3.1566 | 1834 | 0.7422 | 0.7212 | 0.7422 | 0.8615 |
| 0.1399 | 3.1601 | 1836 | 0.7686 | 0.7212 | 0.7686 | 0.8767 |
| 0.1399 | 3.1635 | 1838 | 0.7813 | 0.7212 | 0.7813 | 0.8839 |
| 0.1399 | 3.1670 | 1840 | 0.7482 | 0.7281 | 0.7482 | 0.8650 |
| 0.1399 | 3.1704 | 1842 | 0.6861 | 0.7692 | 0.6861 | 0.8283 |
| 0.1399 | 3.1738 | 1844 | 0.6702 | 0.7692 | 0.6702 | 0.8186 |
| 0.1399 | 3.1773 | 1846 | 0.7106 | 0.7782 | 0.7106 | 0.8430 |
| 0.1399 | 3.1807 | 1848 | 0.8133 | 0.7264 | 0.8133 | 0.9018 |
| 0.1399 | 3.1842 | 1850 | 0.8809 | 0.7518 | 0.8809 | 0.9386 |
| 0.1399 | 3.1876 | 1852 | 0.9079 | 0.7518 | 0.9079 | 0.9529 |
| 0.1399 | 3.1910 | 1854 | 0.8636 | 0.7518 | 0.8636 | 0.9293 |
| 0.1399 | 3.1945 | 1856 | 0.7655 | 0.7212 | 0.7655 | 0.8749 |
| 0.1399 | 3.1979 | 1858 | 0.6816 | 0.7927 | 0.6816 | 0.8256 |
| 0.1399 | 3.2014 | 1860 | 0.6567 | 0.7927 | 0.6567 | 0.8103 |
| 0.1399 | 3.2048 | 1862 | 0.6850 | 0.7927 | 0.6850 | 0.8277 |
| 0.1399 | 3.2083 | 1864 | 0.7528 | 0.7605 | 0.7528 | 0.8677 |
| 0.1399 | 3.2117 | 1866 | 0.8415 | 0.7700 | 0.8415 | 0.9173 |
| 0.1399 | 3.2151 | 1868 | 0.9315 | 0.7797 | 0.9315 | 0.9651 |
| 0.1399 | 3.2186 | 1870 | 0.9474 | 0.7906 | 0.9474 | 0.9734 |
| 0.1399 | 3.2220 | 1872 | 0.9272 | 0.7906 | 0.9272 | 0.9629 |
| 0.1399 | 3.2255 | 1874 | 0.8535 | 0.7232 | 0.8535 | 0.9238 |
| 0.1399 | 3.2289 | 1876 | 0.7587 | 0.6962 | 0.7587 | 0.8711 |
| 0.1399 | 3.2324 | 1878 | 0.7208 | 0.6962 | 0.7208 | 0.8490 |
| 0.1399 | 3.2358 | 1880 | 0.7282 | 0.6962 | 0.7282 | 0.8533 |
| 0.1399 | 3.2392 | 1882 | 0.7579 | 0.6962 | 0.7579 | 0.8705 |
| 0.1399 | 3.2427 | 1884 | 0.8320 | 0.7801 | 0.8320 | 0.9122 |
| 0.1399 | 3.2461 | 1886 | 0.9406 | 0.7430 | 0.9406 | 0.9698 |
| 0.1399 | 3.2496 | 1888 | 0.9791 | 0.7221 | 0.9791 | 0.9895 |
| 0.1399 | 3.2530 | 1890 | 0.9730 | 0.7221 | 0.9730 | 0.9864 |
| 0.1399 | 3.2565 | 1892 | 0.9221 | 0.7221 | 0.9221 | 0.9602 |
| 0.1399 | 3.2599 | 1894 | 0.8911 | 0.7520 | 0.8911 | 0.9440 |
| 0.1399 | 3.2633 | 1896 | 0.8554 | 0.7906 | 0.8554 | 0.9249 |
| 0.1399 | 3.2668 | 1898 | 0.8748 | 0.8019 | 0.8748 | 0.9353 |
| 0.1399 | 3.2702 | 1900 | 0.9498 | 0.7221 | 0.9498 | 0.9746 |
| 0.1399 | 3.2737 | 1902 | 0.9475 | 0.7729 | 0.9475 | 0.9734 |
| 0.1399 | 3.2771 | 1904 | 0.9720 | 0.7221 | 0.9720 | 0.9859 |
| 0.1399 | 3.2806 | 1906 | 0.9545 | 0.7221 | 0.9545 | 0.9770 |
| 0.1399 | 3.2840 | 1908 | 0.8958 | 0.7430 | 0.8958 | 0.9465 |
| 0.1399 | 3.2874 | 1910 | 0.8661 | 0.7346 | 0.8661 | 0.9307 |
| 0.1399 | 3.2909 | 1912 | 0.8344 | 0.7346 | 0.8344 | 0.9134 |
| 0.1399 | 3.2943 | 1914 | 0.8047 | 0.7346 | 0.8047 | 0.8970 |
| 0.1399 | 3.2978 | 1916 | 0.8305 | 0.7264 | 0.8305 | 0.9113 |
| 0.1399 | 3.3012 | 1918 | 0.8600 | 0.7264 | 0.8600 | 0.9273 |
| 0.1399 | 3.3046 | 1920 | 0.8367 | 0.7264 | 0.8367 | 0.9147 |
| 0.1399 | 3.3081 | 1922 | 0.7733 | 0.7692 | 0.7733 | 0.8794 |
| 0.1399 | 3.3115 | 1924 | 0.7518 | 0.7692 | 0.7518 | 0.8670 |
| 0.1399 | 3.3150 | 1926 | 0.7795 | 0.7692 | 0.7795 | 0.8829 |
| 0.1399 | 3.3184 | 1928 | 0.8216 | 0.7285 | 0.8216 | 0.9064 |
| 0.1399 | 3.3219 | 1930 | 0.7855 | 0.7692 | 0.7855 | 0.8863 |
| 0.1399 | 3.3253 | 1932 | 0.7183 | 0.7692 | 0.7183 | 0.8475 |
| 0.1399 | 3.3287 | 1934 | 0.6919 | 0.7696 | 0.6919 | 0.8318 |
| 0.1399 | 3.3322 | 1936 | 0.6609 | 0.6942 | 0.6609 | 0.8130 |
| 0.1399 | 3.3356 | 1938 | 0.6714 | 0.6942 | 0.6714 | 0.8194 |
| 0.1399 | 3.3391 | 1940 | 0.7341 | 0.7692 | 0.7341 | 0.8568 |
| 0.1399 | 3.3425 | 1942 | 0.8780 | 0.7358 | 0.8780 | 0.9370 |
| 0.1399 | 3.3460 | 1944 | 0.9755 | 0.7277 | 0.9755 | 0.9876 |
| 0.1399 | 3.3494 | 1946 | 0.9715 | 0.7354 | 0.9715 | 0.9857 |
| 0.1399 | 3.3528 | 1948 | 0.8843 | 0.7358 | 0.8843 | 0.9404 |
| 0.1399 | 3.3563 | 1950 | 0.8502 | 0.7782 | 0.8502 | 0.9221 |
| 0.1399 | 3.3597 | 1952 | 0.8748 | 0.7782 | 0.8748 | 0.9353 |
| 0.1399 | 3.3632 | 1954 | 0.8751 | 0.7437 | 0.8751 | 0.9355 |
| 0.1399 | 3.3666 | 1956 | 0.8029 | 0.7782 | 0.8029 | 0.8960 |
| 0.1399 | 3.3701 | 1958 | 0.7388 | 0.7105 | 0.7388 | 0.8596 |
| 0.1399 | 3.3735 | 1960 | 0.7420 | 0.7692 | 0.7420 | 0.8614 |
| 0.1399 | 3.3769 | 1962 | 0.7472 | 0.7285 | 0.7472 | 0.8644 |
| 0.1399 | 3.3804 | 1964 | 0.7416 | 0.7285 | 0.7416 | 0.8612 |
| 0.1399 | 3.3838 | 1966 | 0.7300 | 0.7447 | 0.7300 | 0.8544 |
| 0.1399 | 3.3873 | 1968 | 0.7800 | 0.7212 | 0.7800 | 0.8832 |
| 0.1399 | 3.3907 | 1970 | 0.8330 | 0.7281 | 0.8330 | 0.9127 |
| 0.1399 | 3.3941 | 1972 | 0.9362 | 0.7277 | 0.9362 | 0.9676 |
| 0.1399 | 3.3976 | 1974 | 0.9800 | 0.7259 | 0.9800 | 0.9899 |
| 0.1399 | 3.4010 | 1976 | 0.9305 | 0.7342 | 0.9305 | 0.9646 |
| 0.1399 | 3.4045 | 1978 | 0.8909 | 0.7906 | 0.8909 | 0.9439 |
| 0.1399 | 3.4079 | 1980 | 0.9186 | 0.7729 | 0.9186 | 0.9584 |
| 0.1399 | 3.4114 | 1982 | 0.9829 | 0.7729 | 0.9829 | 0.9914 |
| 0.1399 | 3.4148 | 1984 | 1.0290 | 0.7729 | 1.0290 | 1.0144 |
| 0.1399 | 3.4182 | 1986 | 1.0564 | 0.7520 | 1.0564 | 1.0278 |
| 0.1399 | 3.4217 | 1988 | 1.1456 | 0.6476 | 1.1456 | 1.0703 |
| 0.1399 | 3.4251 | 1990 | 1.1579 | 0.6476 | 1.1579 | 1.0760 |
| 0.1399 | 3.4286 | 1992 | 1.0775 | 0.7729 | 1.0775 | 1.0380 |
| 0.1399 | 3.4320 | 1994 | 0.9814 | 0.7729 | 0.9814 | 0.9907 |
| 0.1399 | 3.4355 | 1996 | 0.9233 | 0.6988 | 0.9233 | 0.9609 |
| 0.1399 | 3.4389 | 1998 | 0.8901 | 0.6988 | 0.8901 | 0.9435 |
| 0.1077 | 3.4423 | 2000 | 0.9172 | 0.6988 | 0.9172 | 0.9577 |
| 0.1077 | 3.4458 | 2002 | 1.0269 | 0.7221 | 1.0269 | 1.0134 |
| 0.1077 | 3.4492 | 2004 | 1.1967 | 0.6940 | 1.1967 | 1.0939 |
| 0.1077 | 3.4527 | 2006 | 1.3991 | 0.5777 | 1.3991 | 1.1828 |
| 0.1077 | 3.4561 | 2008 | 1.4655 | 0.5769 | 1.4655 | 1.2106 |
| 0.1077 | 3.4596 | 2010 | 1.4050 | 0.6008 | 1.4050 | 1.1853 |
| 0.1077 | 3.4630 | 2012 | 1.3168 | 0.6680 | 1.3168 | 1.1475 |
| 0.1077 | 3.4664 | 2014 | 1.1611 | 0.7519 | 1.1611 | 1.0776 |
| 0.1077 | 3.4699 | 2016 | 1.0642 | 0.7729 | 1.0642 | 1.0316 |
| 0.1077 | 3.4733 | 2018 | 1.0203 | 0.7729 | 1.0203 | 1.0101 |
| 0.1077 | 3.4768 | 2020 | 1.0592 | 0.7844 | 1.0592 | 1.0292 |
| 0.1077 | 3.4802 | 2022 | 1.1819 | 0.7519 | 1.1819 | 1.0871 |
| 0.1077 | 3.4836 | 2024 | 1.3205 | 0.6609 | 1.3205 | 1.1491 |
| 0.1077 | 3.4871 | 2026 | 1.3285 | 0.5927 | 1.3285 | 1.1526 |
| 0.1077 | 3.4905 | 2028 | 1.2958 | 0.6609 | 1.2958 | 1.1383 |
| 0.1077 | 3.4940 | 2030 | 1.1951 | 0.7844 | 1.1951 | 1.0932 |
| 0.1077 | 3.4974 | 2032 | 1.0464 | 0.7844 | 1.0464 | 1.0229 |
| 0.1077 | 3.5009 | 2034 | 0.9913 | 0.7729 | 0.9913 | 0.9957 |
| 0.1077 | 3.5043 | 2036 | 1.0062 | 0.7729 | 1.0062 | 1.0031 |
| 0.1077 | 3.5077 | 2038 | 1.1007 | 0.7844 | 1.1007 | 1.0491 |
| 0.1077 | 3.5112 | 2040 | 1.1162 | 0.7134 | 1.1162 | 1.0565 |
| 0.1077 | 3.5146 | 2042 | 1.0282 | 0.7729 | 1.0282 | 1.0140 |
| 0.1077 | 3.5181 | 2044 | 1.0268 | 0.7729 | 1.0268 | 1.0133 |
| 0.1077 | 3.5215 | 2046 | 0.9783 | 0.7729 | 0.9783 | 0.9891 |
| 0.1077 | 3.5250 | 2048 | 0.9940 | 0.7221 | 0.9940 | 0.9970 |
| 0.1077 | 3.5284 | 2050 | 0.9840 | 0.7729 | 0.9840 | 0.9920 |
| 0.1077 | 3.5318 | 2052 | 1.0168 | 0.7221 | 1.0168 | 1.0084 |
| 0.1077 | 3.5353 | 2054 | 1.0926 | 0.7221 | 1.0926 | 1.0453 |
| 0.1077 | 3.5387 | 2056 | 1.1673 | 0.6940 | 1.1673 | 1.0804 |
| 0.1077 | 3.5422 | 2058 | 1.1598 | 0.6940 | 1.1598 | 1.0770 |
| 0.1077 | 3.5456 | 2060 | 1.0893 | 0.7221 | 1.0893 | 1.0437 |
| 0.1077 | 3.5491 | 2062 | 0.9590 | 0.7729 | 0.9590 | 0.9793 |
| 0.1077 | 3.5525 | 2064 | 0.8891 | 0.6782 | 0.8891 | 0.9429 |
| 0.1077 | 3.5559 | 2066 | 0.9059 | 0.6564 | 0.9059 | 0.9518 |
| 0.1077 | 3.5594 | 2068 | 0.9468 | 0.7525 | 0.9468 | 0.9731 |
| 0.1077 | 3.5628 | 2070 | 0.9880 | 0.7221 | 0.9880 | 0.9940 |
| 0.1077 | 3.5663 | 2072 | 1.0073 | 0.7134 | 1.0073 | 1.0037 |
| 0.1077 | 3.5697 | 2074 | 1.0447 | 0.7134 | 1.0447 | 1.0221 |
| 0.1077 | 3.5731 | 2076 | 1.0211 | 0.7134 | 1.0211 | 1.0105 |
| 0.1077 | 3.5766 | 2078 | 0.9552 | 0.7025 | 0.9552 | 0.9773 |
| 0.1077 | 3.5800 | 2080 | 0.8918 | 0.7321 | 0.8918 | 0.9444 |
| 0.1077 | 3.5835 | 2082 | 0.9020 | 0.6830 | 0.9020 | 0.9497 |
| 0.1077 | 3.5869 | 2084 | 0.9800 | 0.7221 | 0.9800 | 0.9900 |
| 0.1077 | 3.5904 | 2086 | 1.0936 | 0.6940 | 1.0936 | 1.0458 |
| 0.1077 | 3.5938 | 2088 | 1.1865 | 0.7014 | 1.1865 | 1.0893 |
| 0.1077 | 3.5972 | 2090 | 1.3269 | 0.6934 | 1.3269 | 1.1519 |
| 0.1077 | 3.6007 | 2092 | 1.3621 | 0.6356 | 1.3621 | 1.1671 |
| 0.1077 | 3.6041 | 2094 | 1.2696 | 0.6414 | 1.2696 | 1.1268 |
| 0.1077 | 3.6076 | 2096 | 1.1055 | 0.7626 | 1.1055 | 1.0514 |
| 0.1077 | 3.6110 | 2098 | 0.9728 | 0.7525 | 0.9728 | 0.9863 |
| 0.1077 | 3.6145 | 2100 | 0.9356 | 0.6776 | 0.9356 | 0.9673 |
| 0.1077 | 3.6179 | 2102 | 0.9648 | 0.7316 | 0.9648 | 0.9822 |
| 0.1077 | 3.6213 | 2104 | 1.0143 | 0.7520 | 1.0143 | 1.0071 |
| 0.1077 | 3.6248 | 2106 | 1.0527 | 0.7020 | 1.0527 | 1.0260 |
| 0.1077 | 3.6282 | 2108 | 1.1927 | 0.6414 | 1.1927 | 1.0921 |
| 0.1077 | 3.6317 | 2110 | 1.2523 | 0.6356 | 1.2523 | 1.1191 |
| 0.1077 | 3.6351 | 2112 | 1.1850 | 0.6414 | 1.1850 | 1.0886 |
| 0.1077 | 3.6386 | 2114 | 1.0359 | 0.7020 | 1.0359 | 1.0178 |
| 0.1077 | 3.6420 | 2116 | 0.9525 | 0.7520 | 0.9525 | 0.9760 |
| 0.1077 | 3.6454 | 2118 | 0.9614 | 0.7520 | 0.9614 | 0.9805 |
| 0.1077 | 3.6489 | 2120 | 1.0033 | 0.7520 | 1.0033 | 1.0016 |
| 0.1077 | 3.6523 | 2122 | 1.1149 | 0.7101 | 1.1149 | 1.0559 |
| 0.1077 | 3.6558 | 2124 | 1.2213 | 0.7014 | 1.2213 | 1.1051 |
| 0.1077 | 3.6592 | 2126 | 1.2266 | 0.7014 | 1.2266 | 1.1075 |
| 0.1077 | 3.6627 | 2128 | 1.1372 | 0.7101 | 1.1372 | 1.0664 |
| 0.1077 | 3.6661 | 2130 | 1.0097 | 0.7020 | 1.0097 | 1.0048 |
| 0.1077 | 3.6695 | 2132 | 0.9576 | 0.7729 | 0.9576 | 0.9786 |
| 0.1077 | 3.6730 | 2134 | 0.9904 | 0.7221 | 0.9904 | 0.9952 |
| 0.1077 | 3.6764 | 2136 | 1.0051 | 0.7134 | 1.0051 | 1.0025 |
| 0.1077 | 3.6799 | 2138 | 1.1012 | 0.6940 | 1.1012 | 1.0494 |
| 0.1077 | 3.6833 | 2140 | 1.1580 | 0.6866 | 1.1580 | 1.0761 |
| 0.1077 | 3.6867 | 2142 | 1.2299 | 0.6934 | 1.2299 | 1.1090 |
| 0.1077 | 3.6902 | 2144 | 1.1814 | 0.7014 | 1.1814 | 1.0869 |
| 0.1077 | 3.6936 | 2146 | 1.1039 | 0.7310 | 1.1039 | 1.0507 |
| 0.1077 | 3.6971 | 2148 | 1.0643 | 0.7844 | 1.0643 | 1.0316 |
| 0.1077 | 3.7005 | 2150 | 1.0333 | 0.7844 | 1.0333 | 1.0165 |
| 0.1077 | 3.7040 | 2152 | 1.0045 | 0.7844 | 1.0045 | 1.0022 |
| 0.1077 | 3.7074 | 2154 | 0.9738 | 0.7525 | 0.9738 | 0.9868 |
| 0.1077 | 3.7108 | 2156 | 0.9275 | 0.7525 | 0.9275 | 0.9630 |
| 0.1077 | 3.7143 | 2158 | 0.9572 | 0.7134 | 0.9572 | 0.9784 |
| 0.1077 | 3.7177 | 2160 | 1.0547 | 0.7134 | 1.0547 | 1.0270 |
| 0.1077 | 3.7212 | 2162 | 1.1342 | 0.6866 | 1.1342 | 1.0650 |
| 0.1077 | 3.7246 | 2164 | 1.2572 | 0.6929 | 1.2572 | 1.1213 |
| 0.1077 | 3.7281 | 2166 | 1.3099 | 0.6929 | 1.3099 | 1.1445 |
| 0.1077 | 3.7315 | 2168 | 1.2983 | 0.6610 | 1.2983 | 1.1394 |
| 0.1077 | 3.7349 | 2170 | 1.2085 | 0.6680 | 1.2085 | 1.0993 |
| 0.1077 | 3.7384 | 2172 | 1.1243 | 0.7101 | 1.1243 | 1.0603 |
| 0.1077 | 3.7418 | 2174 | 1.1323 | 0.7626 | 1.1323 | 1.0641 |
| 0.1077 | 3.7453 | 2176 | 1.1048 | 0.7626 | 1.1048 | 1.0511 |
| 0.1077 | 3.7487 | 2178 | 1.1257 | 0.7101 | 1.1257 | 1.0610 |
| 0.1077 | 3.7522 | 2180 | 1.1558 | 0.7014 | 1.1558 | 1.0751 |
| 0.1077 | 3.7556 | 2182 | 1.2318 | 0.6934 | 1.2318 | 1.1099 |
| 0.1077 | 3.7590 | 2184 | 1.2897 | 0.7230 | 1.2897 | 1.1356 |
| 0.1077 | 3.7625 | 2186 | 1.2644 | 0.7230 | 1.2644 | 1.1245 |
| 0.1077 | 3.7659 | 2188 | 1.1610 | 0.7014 | 1.1610 | 1.0775 |
| 0.1077 | 3.7694 | 2190 | 1.0502 | 0.7014 | 1.0502 | 1.0248 |
| 0.1077 | 3.7728 | 2192 | 0.9866 | 0.7520 | 0.9866 | 0.9933 |
| 0.1077 | 3.7762 | 2194 | 0.9952 | 0.7626 | 0.9952 | 0.9976 |
| 0.1077 | 3.7797 | 2196 | 1.0864 | 0.7014 | 1.0864 | 1.0423 |
| 0.1077 | 3.7831 | 2198 | 1.1957 | 0.7014 | 1.1957 | 1.0935 |
| 0.1077 | 3.7866 | 2200 | 1.2166 | 0.7014 | 1.2166 | 1.1030 |
| 0.1077 | 3.7900 | 2202 | 1.2192 | 0.7014 | 1.2192 | 1.1042 |
| 0.1077 | 3.7935 | 2204 | 1.1216 | 0.7014 | 1.1216 | 1.0590 |
| 0.1077 | 3.7969 | 2206 | 1.0428 | 0.7216 | 1.0428 | 1.0212 |
| 0.1077 | 3.8003 | 2208 | 1.0339 | 0.7216 | 1.0339 | 1.0168 |
| 0.1077 | 3.8038 | 2210 | 1.0675 | 0.7014 | 1.0675 | 1.0332 |
| 0.1077 | 3.8072 | 2212 | 1.1228 | 0.7318 | 1.1228 | 1.0596 |
| 0.1077 | 3.8107 | 2214 | 1.1632 | 0.7318 | 1.1632 | 1.0785 |
| 0.1077 | 3.8141 | 2216 | 1.1979 | 0.7318 | 1.1979 | 1.0945 |
| 0.1077 | 3.8176 | 2218 | 1.2642 | 0.7230 | 1.2642 | 1.1244 |
| 0.1077 | 3.8210 | 2220 | 1.2854 | 0.7230 | 1.2854 | 1.1338 |
| 0.1077 | 3.8244 | 2222 | 1.2319 | 0.7014 | 1.2319 | 1.1099 |
| 0.1077 | 3.8279 | 2224 | 1.1321 | 0.7014 | 1.1321 | 1.0640 |
| 0.1077 | 3.8313 | 2226 | 1.0565 | 0.7014 | 1.0565 | 1.0279 |
| 0.1077 | 3.8348 | 2228 | 1.0679 | 0.7515 | 1.0679 | 1.0334 |
| 0.1077 | 3.8382 | 2230 | 1.1053 | 0.7515 | 1.1053 | 1.0513 |
| 0.1077 | 3.8417 | 2232 | 1.0954 | 0.7626 | 1.0954 | 1.0466 |
| 0.1077 | 3.8451 | 2234 | 1.0743 | 0.6845 | 1.0743 | 1.0365 |
| 0.1077 | 3.8485 | 2236 | 1.0668 | 0.6845 | 1.0668 | 1.0328 |
| 0.1077 | 3.8520 | 2238 | 1.0301 | 0.7074 | 1.0301 | 1.0150 |
| 0.1077 | 3.8554 | 2240 | 1.0473 | 0.7074 | 1.0473 | 1.0234 |
| 0.1077 | 3.8589 | 2242 | 1.1402 | 0.7014 | 1.1402 | 1.0678 |
| 0.1077 | 3.8623 | 2244 | 1.2717 | 0.7014 | 1.2717 | 1.1277 |
| 0.1077 | 3.8657 | 2246 | 1.3641 | 0.6610 | 1.3641 | 1.1679 |
| 0.1077 | 3.8692 | 2248 | 1.4327 | 0.6929 | 1.4327 | 1.1970 |
| 0.1077 | 3.8726 | 2250 | 1.3893 | 0.7230 | 1.3893 | 1.1787 |
| 0.1077 | 3.8761 | 2252 | 1.2460 | 0.7230 | 1.2460 | 1.1163 |
| 0.1077 | 3.8795 | 2254 | 1.1029 | 0.6940 | 1.1029 | 1.0502 |
| 0.1077 | 3.8830 | 2256 | 1.0735 | 0.7014 | 1.0735 | 1.0361 |
| 0.1077 | 3.8864 | 2258 | 1.0893 | 0.7014 | 1.0893 | 1.0437 |
| 0.1077 | 3.8898 | 2260 | 1.1035 | 0.7014 | 1.1035 | 1.0505 |
| 0.1077 | 3.8933 | 2262 | 1.1272 | 0.7014 | 1.1272 | 1.0617 |
| 0.1077 | 3.8967 | 2264 | 1.1988 | 0.7014 | 1.1988 | 1.0949 |
| 0.1077 | 3.9002 | 2266 | 1.2044 | 0.7101 | 1.2044 | 1.0975 |
| 0.1077 | 3.9036 | 2268 | 1.1334 | 0.7626 | 1.1334 | 1.0646 |
| 0.1077 | 3.9071 | 2270 | 1.1052 | 0.7626 | 1.1052 | 1.0513 |
| 0.1077 | 3.9105 | 2272 | 1.0467 | 0.7844 | 1.0467 | 1.0231 |
| 0.1077 | 3.9139 | 2274 | 0.9885 | 0.7844 | 0.9885 | 0.9942 |
| 0.1077 | 3.9174 | 2276 | 1.0015 | 0.7216 | 1.0015 | 1.0008 |
| 0.1077 | 3.9208 | 2278 | 1.0271 | 0.7216 | 1.0271 | 1.0135 |
| 0.1077 | 3.9243 | 2280 | 1.0571 | 0.7014 | 1.0571 | 1.0281 |
| 0.1077 | 3.9277 | 2282 | 1.1330 | 0.7014 | 1.1330 | 1.0644 |
| 0.1077 | 3.9312 | 2284 | 1.1814 | 0.6934 | 1.1814 | 1.0869 |
| 0.1077 | 3.9346 | 2286 | 1.1539 | 0.7014 | 1.1539 | 1.0742 |
| 0.1077 | 3.9380 | 2288 | 1.0694 | 0.7014 | 1.0694 | 1.0341 |
| 0.1077 | 3.9415 | 2290 | 1.0995 | 0.7014 | 1.0995 | 1.0486 |
| 0.1077 | 3.9449 | 2292 | 1.0376 | 0.7014 | 1.0376 | 1.0186 |
| 0.1077 | 3.9484 | 2294 | 1.0462 | 0.7014 | 1.0462 | 1.0228 |
| 0.1077 | 3.9518 | 2296 | 1.1042 | 0.7014 | 1.1042 | 1.0508 |
| 0.1077 | 3.9552 | 2298 | 1.0735 | 0.7101 | 1.0735 | 1.0361 |
| 0.1077 | 3.9587 | 2300 | 1.0828 | 0.7101 | 1.0828 | 1.0406 |
| 0.1077 | 3.9621 | 2302 | 1.0943 | 0.7014 | 1.0943 | 1.0461 |
| 0.1077 | 3.9656 | 2304 | 1.0869 | 0.7216 | 1.0869 | 1.0425 |
| 0.1077 | 3.9690 | 2306 | 0.9943 | 0.7216 | 0.9943 | 0.9971 |
| 0.1077 | 3.9725 | 2308 | 0.8672 | 0.7729 | 0.8672 | 0.9312 |
| 0.1077 | 3.9759 | 2310 | 0.8155 | 0.7801 | 0.8155 | 0.9031 |
| 0.1077 | 3.9793 | 2312 | 0.7693 | 0.7801 | 0.7693 | 0.8771 |
| 0.1077 | 3.9828 | 2314 | 0.7755 | 0.7801 | 0.7755 | 0.8806 |
| 0.1077 | 3.9862 | 2316 | 0.8489 | 0.7346 | 0.8489 | 0.9214 |
| 0.1077 | 3.9897 | 2318 | 0.9598 | 0.7134 | 0.9598 | 0.9797 |
| 0.1077 | 3.9931 | 2320 | 1.0216 | 0.7134 | 1.0216 | 1.0107 |
| 0.1077 | 3.9966 | 2322 | 1.0765 | 0.7216 | 1.0765 | 1.0376 |
| 0.1077 | 4.0 | 2324 | 1.0736 | 0.7216 | 1.0736 | 1.0362 |
| 0.1077 | 4.0034 | 2326 | 0.9843 | 0.7134 | 0.9843 | 0.9921 |
| 0.1077 | 4.0069 | 2328 | 0.9255 | 0.7729 | 0.9255 | 0.9620 |
| 0.1077 | 4.0103 | 2330 | 0.8425 | 0.7906 | 0.8425 | 0.9179 |
| 0.1077 | 4.0138 | 2332 | 0.8152 | 0.7906 | 0.8152 | 0.9029 |
| 0.1077 | 4.0172 | 2334 | 0.7748 | 0.7801 | 0.7748 | 0.8803 |
| 0.1077 | 4.0207 | 2336 | 0.7873 | 0.7801 | 0.7873 | 0.8873 |
| 0.1077 | 4.0241 | 2338 | 0.8833 | 0.7139 | 0.8833 | 0.9398 |
| 0.1077 | 4.0275 | 2340 | 1.0699 | 0.7053 | 1.0699 | 1.0344 |
| 0.1077 | 4.0310 | 2342 | 1.2900 | 0.6673 | 1.2900 | 1.1358 |
| 0.1077 | 4.0344 | 2344 | 1.4082 | 0.6673 | 1.4082 | 1.1867 |
| 0.1077 | 4.0379 | 2346 | 1.3901 | 0.6356 | 1.3901 | 1.1790 |
| 0.1077 | 4.0413 | 2348 | 1.2568 | 0.6934 | 1.2568 | 1.1211 |
| 0.1077 | 4.0448 | 2350 | 1.0830 | 0.7626 | 1.0830 | 1.0407 |
| 0.1077 | 4.0482 | 2352 | 0.9099 | 0.6988 | 0.9099 | 0.9539 |
| 0.1077 | 4.0516 | 2354 | 0.7843 | 0.6836 | 0.7843 | 0.8856 |
| 0.1077 | 4.0551 | 2356 | 0.7423 | 0.6962 | 0.7423 | 0.8616 |
| 0.1077 | 4.0585 | 2358 | 0.7610 | 0.6836 | 0.7610 | 0.8724 |
| 0.1077 | 4.0620 | 2360 | 0.8455 | 0.7623 | 0.8455 | 0.9195 |
| 0.1077 | 4.0654 | 2362 | 1.0052 | 0.6866 | 1.0052 | 1.0026 |
| 0.1077 | 4.0688 | 2364 | 1.1608 | 0.6934 | 1.1608 | 1.0774 |
| 0.1077 | 4.0723 | 2366 | 1.2079 | 0.6934 | 1.2079 | 1.0990 |
| 0.1077 | 4.0757 | 2368 | 1.1688 | 0.6934 | 1.1688 | 1.0811 |
| 0.1077 | 4.0792 | 2370 | 1.0787 | 0.7626 | 1.0787 | 1.0386 |
| 0.1077 | 4.0826 | 2372 | 0.9535 | 0.6845 | 0.9535 | 0.9765 |
| 0.1077 | 4.0861 | 2374 | 0.9027 | 0.6988 | 0.9027 | 0.9501 |
| 0.1077 | 4.0895 | 2376 | 0.9088 | 0.6988 | 0.9088 | 0.9533 |
| 0.1077 | 4.0929 | 2378 | 0.9455 | 0.7074 | 0.9455 | 0.9724 |
| 0.1077 | 4.0964 | 2380 | 1.0383 | 0.7626 | 1.0383 | 1.0190 |
| 0.1077 | 4.0998 | 2382 | 1.1528 | 0.6934 | 1.1528 | 1.0737 |
| 0.1077 | 4.1033 | 2384 | 1.3337 | 0.7230 | 1.3337 | 1.1548 |
| 0.1077 | 4.1067 | 2386 | 1.4395 | 0.6673 | 1.4395 | 1.1998 |
| 0.1077 | 4.1102 | 2388 | 1.4139 | 0.6673 | 1.4139 | 1.1891 |
| 0.1077 | 4.1136 | 2390 | 1.2825 | 0.7230 | 1.2825 | 1.1325 |
| 0.1077 | 4.1170 | 2392 | 1.1322 | 0.6934 | 1.1322 | 1.0641 |
| 0.1077 | 4.1205 | 2394 | 0.9602 | 0.7729 | 0.9602 | 0.9799 |
| 0.1077 | 4.1239 | 2396 | 0.8194 | 0.6909 | 0.8194 | 0.9052 |
| 0.1077 | 4.1274 | 2398 | 0.7549 | 0.6909 | 0.7549 | 0.8689 |
| 0.1077 | 4.1308 | 2400 | 0.7572 | 0.6909 | 0.7572 | 0.8701 |
| 0.1077 | 4.1343 | 2402 | 0.8096 | 0.7623 | 0.8096 | 0.8998 |
| 0.1077 | 4.1377 | 2404 | 0.8986 | 0.7729 | 0.8986 | 0.9479 |
| 0.1077 | 4.1411 | 2406 | 0.9520 | 0.7729 | 0.9520 | 0.9757 |
| 0.1077 | 4.1446 | 2408 | 1.0158 | 0.7844 | 1.0158 | 1.0078 |
| 0.1077 | 4.1480 | 2410 | 1.0251 | 0.7844 | 1.0251 | 1.0125 |
| 0.1077 | 4.1515 | 2412 | 1.0065 | 0.7844 | 1.0065 | 1.0032 |
| 0.1077 | 4.1549 | 2414 | 1.0476 | 0.7844 | 1.0476 | 1.0235 |
| 0.1077 | 4.1583 | 2416 | 1.1015 | 0.7515 | 1.1015 | 1.0495 |
| 0.1077 | 4.1618 | 2418 | 1.0740 | 0.7515 | 1.0740 | 1.0363 |
| 0.1077 | 4.1652 | 2420 | 0.9765 | 0.7844 | 0.9765 | 0.9882 |
| 0.1077 | 4.1687 | 2422 | 0.8829 | 0.7729 | 0.8829 | 0.9397 |
| 0.1077 | 4.1721 | 2424 | 0.8318 | 0.6909 | 0.8318 | 0.9121 |
| 0.1077 | 4.1756 | 2426 | 0.8144 | 0.7232 | 0.8144 | 0.9024 |
| 0.1077 | 4.1790 | 2428 | 0.8107 | 0.7801 | 0.8107 | 0.9004 |
| 0.1077 | 4.1824 | 2430 | 0.8572 | 0.7264 | 0.8572 | 0.9258 |
| 0.1077 | 4.1859 | 2432 | 0.9549 | 0.7058 | 0.9549 | 0.9772 |
| 0.1077 | 4.1893 | 2434 | 1.0159 | 0.7053 | 1.0159 | 1.0079 |
| 0.1077 | 4.1928 | 2436 | 1.0087 | 0.7053 | 1.0087 | 1.0043 |
| 0.1077 | 4.1962 | 2438 | 0.9454 | 0.7134 | 0.9454 | 0.9723 |
| 0.1077 | 4.1997 | 2440 | 0.8809 | 0.7729 | 0.8809 | 0.9386 |
| 0.1077 | 4.2031 | 2442 | 0.8707 | 0.8019 | 0.8707 | 0.9331 |
| 0.1077 | 4.2065 | 2444 | 0.8772 | 0.7729 | 0.8772 | 0.9366 |
| 0.1077 | 4.2100 | 2446 | 0.8686 | 0.8019 | 0.8686 | 0.9320 |
| 0.1077 | 4.2134 | 2448 | 0.8245 | 0.8019 | 0.8245 | 0.9080 |
| 0.1077 | 4.2169 | 2450 | 0.7732 | 0.7801 | 0.7732 | 0.8793 |
| 0.1077 | 4.2203 | 2452 | 0.7577 | 0.7801 | 0.7577 | 0.8705 |
| 0.1077 | 4.2238 | 2454 | 0.7969 | 0.7906 | 0.7969 | 0.8927 |
| 0.1077 | 4.2272 | 2456 | 0.9002 | 0.7729 | 0.9002 | 0.9488 |
| 0.1077 | 4.2306 | 2458 | 1.0178 | 0.7417 | 1.0178 | 1.0089 |
| 0.1077 | 4.2341 | 2460 | 1.0375 | 0.6940 | 1.0375 | 1.0186 |
| 0.1077 | 4.2375 | 2462 | 0.9747 | 0.7618 | 0.9747 | 0.9872 |
| 0.1077 | 4.2410 | 2464 | 0.9328 | 0.7729 | 0.9328 | 0.9658 |
| 0.1077 | 4.2444 | 2466 | 0.8621 | 0.8019 | 0.8621 | 0.9285 |
| 0.1077 | 4.2478 | 2468 | 0.8414 | 0.7906 | 0.8414 | 0.9173 |
| 0.1077 | 4.2513 | 2470 | 0.8794 | 0.7729 | 0.8794 | 0.9378 |
| 0.1077 | 4.2547 | 2472 | 0.8932 | 0.7729 | 0.8932 | 0.9451 |
| 0.1077 | 4.2582 | 2474 | 0.9621 | 0.7520 | 0.9621 | 0.9809 |
| 0.1077 | 4.2616 | 2476 | 1.0256 | 0.7626 | 1.0256 | 1.0127 |
| 0.1077 | 4.2651 | 2478 | 1.0058 | 0.7101 | 1.0058 | 1.0029 |
| 0.1077 | 4.2685 | 2480 | 0.9426 | 0.7221 | 0.9426 | 0.9709 |
| 0.1077 | 4.2719 | 2482 | 0.8444 | 0.7221 | 0.8444 | 0.9189 |
| 0.1077 | 4.2754 | 2484 | 0.7711 | 0.7520 | 0.7711 | 0.8781 |
| 0.1077 | 4.2788 | 2486 | 0.7632 | 0.7520 | 0.7632 | 0.8736 |
| 0.1077 | 4.2823 | 2488 | 0.7873 | 0.7430 | 0.7873 | 0.8873 |
| 0.1077 | 4.2857 | 2490 | 0.8294 | 0.7259 | 0.8294 | 0.9107 |
| 0.1077 | 4.2892 | 2492 | 0.8309 | 0.7259 | 0.8309 | 0.9115 |
| 0.1077 | 4.2926 | 2494 | 0.8114 | 0.7264 | 0.8114 | 0.9008 |
| 0.1077 | 4.2960 | 2496 | 0.8147 | 0.7430 | 0.8147 | 0.9026 |
| 0.1077 | 4.2995 | 2498 | 0.8208 | 0.7520 | 0.8208 | 0.9060 |
| 0.093 | 4.3029 | 2500 | 0.8084 | 0.8019 | 0.8084 | 0.8991 |
| 0.093 | 4.3064 | 2502 | 0.8005 | 0.8019 | 0.8005 | 0.8947 |
| 0.093 | 4.3098 | 2504 | 0.8301 | 0.8019 | 0.8301 | 0.9111 |
| 0.093 | 4.3133 | 2506 | 0.9115 | 0.7074 | 0.9115 | 0.9547 |
| 0.093 | 4.3167 | 2508 | 0.9720 | 0.7074 | 0.9720 | 0.9859 |
| 0.093 | 4.3201 | 2510 | 1.0196 | 0.7626 | 1.0196 | 1.0097 |
| 0.093 | 4.3236 | 2512 | 1.0015 | 0.7520 | 1.0015 | 1.0008 |
| 0.093 | 4.3270 | 2514 | 0.9844 | 0.7020 | 0.9844 | 0.9922 |
| 0.093 | 4.3305 | 2516 | 0.9389 | 0.7221 | 0.9389 | 0.9690 |
| 0.093 | 4.3339 | 2518 | 0.8288 | 0.7623 | 0.8288 | 0.9104 |
| 0.093 | 4.3373 | 2520 | 0.7913 | 0.7623 | 0.7913 | 0.8896 |
| 0.093 | 4.3408 | 2522 | 0.8219 | 0.7139 | 0.8219 | 0.9066 |
| 0.093 | 4.3442 | 2524 | 0.8896 | 0.7221 | 0.8896 | 0.9432 |
| 0.093 | 4.3477 | 2526 | 0.9974 | 0.7221 | 0.9974 | 0.9987 |
| 0.093 | 4.3511 | 2528 | 1.0521 | 0.6940 | 1.0521 | 1.0257 |
| 0.093 | 4.3546 | 2530 | 1.0850 | 0.7014 | 1.0850 | 1.0416 |
| 0.093 | 4.3580 | 2532 | 1.1405 | 0.7014 | 1.1405 | 1.0680 |
| 0.093 | 4.3614 | 2534 | 1.1150 | 0.7101 | 1.1150 | 1.0559 |
| 0.093 | 4.3649 | 2536 | 1.0796 | 0.7101 | 1.0796 | 1.0390 |
| 0.093 | 4.3683 | 2538 | 1.0033 | 0.7074 | 1.0033 | 1.0017 |
| 0.093 | 4.3718 | 2540 | 0.9236 | 0.6988 | 0.9236 | 0.9610 |
| 0.093 | 4.3752 | 2542 | 0.9225 | 0.6988 | 0.9225 | 0.9605 |
| 0.093 | 4.3787 | 2544 | 0.9459 | 0.6988 | 0.9459 | 0.9726 |
| 0.093 | 4.3821 | 2546 | 0.9854 | 0.6988 | 0.9854 | 0.9927 |
| 0.093 | 4.3855 | 2548 | 1.0298 | 0.7074 | 1.0298 | 1.0148 |
| 0.093 | 4.3890 | 2550 | 1.0084 | 0.6988 | 1.0084 | 1.0042 |
| 0.093 | 4.3924 | 2552 | 1.0163 | 0.7074 | 1.0163 | 1.0081 |
| 0.093 | 4.3959 | 2554 | 1.0557 | 0.7626 | 1.0557 | 1.0275 |
| 0.093 | 4.3993 | 2556 | 1.0826 | 0.7101 | 1.0826 | 1.0405 |
| 0.093 | 4.4028 | 2558 | 1.0318 | 0.7221 | 1.0318 | 1.0158 |
| 0.093 | 4.4062 | 2560 | 1.0067 | 0.7221 | 1.0067 | 1.0034 |
| 0.093 | 4.4096 | 2562 | 1.0049 | 0.7221 | 1.0049 | 1.0024 |
| 0.093 | 4.4131 | 2564 | 0.9855 | 0.7221 | 0.9855 | 0.9927 |
| 0.093 | 4.4165 | 2566 | 0.9714 | 0.7729 | 0.9714 | 0.9856 |
| 0.093 | 4.4200 | 2568 | 1.0043 | 0.7626 | 1.0043 | 1.0021 |
| 0.093 | 4.4234 | 2570 | 1.0568 | 0.7101 | 1.0568 | 1.0280 |
| 0.093 | 4.4269 | 2572 | 1.1268 | 0.7101 | 1.1268 | 1.0615 |
| 0.093 | 4.4303 | 2574 | 1.1364 | 0.7014 | 1.1364 | 1.0660 |
| 0.093 | 4.4337 | 2576 | 1.0608 | 0.7014 | 1.0608 | 1.0299 |
| 0.093 | 4.4372 | 2578 | 0.9646 | 0.7134 | 0.9646 | 0.9821 |
| 0.093 | 4.4406 | 2580 | 0.8801 | 0.7221 | 0.8801 | 0.9381 |
| 0.093 | 4.4441 | 2582 | 0.8605 | 0.7221 | 0.8605 | 0.9276 |
| 0.093 | 4.4475 | 2584 | 0.8756 | 0.7221 | 0.8756 | 0.9358 |
| 0.093 | 4.4509 | 2586 | 0.9451 | 0.7134 | 0.9451 | 0.9722 |
| 0.093 | 4.4544 | 2588 | 1.0424 | 0.7014 | 1.0424 | 1.0210 |
| 0.093 | 4.4578 | 2590 | 1.0645 | 0.7014 | 1.0645 | 1.0318 |
| 0.093 | 4.4613 | 2592 | 1.0237 | 0.7216 | 1.0237 | 1.0118 |
| 0.093 | 4.4647 | 2594 | 0.9489 | 0.7221 | 0.9489 | 0.9741 |
| 0.093 | 4.4682 | 2596 | 0.8394 | 0.7729 | 0.8394 | 0.9162 |
| 0.093 | 4.4716 | 2598 | 0.7448 | 0.6564 | 0.7448 | 0.8630 |
| 0.093 | 4.4750 | 2600 | 0.6763 | 0.6503 | 0.6763 | 0.8224 |
| 0.093 | 4.4785 | 2602 | 0.6534 | 0.6775 | 0.6534 | 0.8083 |
| 0.093 | 4.4819 | 2604 | 0.6718 | 0.7529 | 0.6718 | 0.8197 |
| 0.093 | 4.4854 | 2606 | 0.7420 | 0.7729 | 0.7420 | 0.8614 |
| 0.093 | 4.4888 | 2608 | 0.8430 | 0.7134 | 0.8430 | 0.9182 |
| 0.093 | 4.4923 | 2610 | 0.8682 | 0.7134 | 0.8682 | 0.9318 |
| 0.093 | 4.4957 | 2612 | 0.8261 | 0.7221 | 0.8261 | 0.9089 |
| 0.093 | 4.4991 | 2614 | 0.7525 | 0.7623 | 0.7525 | 0.8675 |
| 0.093 | 4.5026 | 2616 | 0.7184 | 0.7623 | 0.7184 | 0.8476 |
| 0.093 | 4.5060 | 2618 | 0.6508 | 0.7149 | 0.6508 | 0.8067 |
| 0.093 | 4.5095 | 2620 | 0.6043 | 0.7263 | 0.6043 | 0.7773 |
| 0.093 | 4.5129 | 2622 | 0.6048 | 0.7263 | 0.6048 | 0.7777 |
| 0.093 | 4.5164 | 2624 | 0.6428 | 0.7149 | 0.6428 | 0.8017 |
| 0.093 | 4.5198 | 2626 | 0.7382 | 0.6882 | 0.7382 | 0.8592 |
| 0.093 | 4.5232 | 2628 | 0.9132 | 0.7134 | 0.9132 | 0.9556 |
| 0.093 | 4.5267 | 2630 | 1.1272 | 0.7230 | 1.1272 | 1.0617 |
| 0.093 | 4.5301 | 2632 | 1.2337 | 0.7230 | 1.2337 | 1.1107 |
| 0.093 | 4.5336 | 2634 | 1.2284 | 0.7230 | 1.2284 | 1.1083 |
| 0.093 | 4.5370 | 2636 | 1.1315 | 0.7014 | 1.1315 | 1.0637 |
| 0.093 | 4.5404 | 2638 | 1.0227 | 0.7101 | 1.0227 | 1.0113 |
| 0.093 | 4.5439 | 2640 | 0.9157 | 0.7626 | 0.9157 | 0.9569 |
| 0.093 | 4.5473 | 2642 | 0.8680 | 0.7844 | 0.8680 | 0.9317 |
| 0.093 | 4.5508 | 2644 | 0.8928 | 0.7844 | 0.8928 | 0.9449 |
| 0.093 | 4.5542 | 2646 | 0.9054 | 0.7310 | 0.9054 | 0.9515 |
| 0.093 | 4.5577 | 2648 | 0.9387 | 0.7101 | 0.9387 | 0.9688 |
| 0.093 | 4.5611 | 2650 | 0.9693 | 0.7101 | 0.9693 | 0.9845 |
| 0.093 | 4.5645 | 2652 | 0.9799 | 0.7101 | 0.9799 | 0.9899 |
| 0.093 | 4.5680 | 2654 | 1.0166 | 0.7101 | 1.0166 | 1.0083 |
| 0.093 | 4.5714 | 2656 | 1.0233 | 0.7101 | 1.0233 | 1.0116 |
| 0.093 | 4.5749 | 2658 | 0.9833 | 0.7101 | 0.9833 | 0.9916 |
| 0.093 | 4.5783 | 2660 | 0.8880 | 0.7729 | 0.8880 | 0.9423 |
| 0.093 | 4.5818 | 2662 | 0.7842 | 0.7729 | 0.7842 | 0.8855 |
| 0.093 | 4.5852 | 2664 | 0.7645 | 0.7729 | 0.7645 | 0.8744 |
| 0.093 | 4.5886 | 2666 | 0.8021 | 0.7729 | 0.8021 | 0.8956 |
| 0.093 | 4.5921 | 2668 | 0.9030 | 0.7221 | 0.9030 | 0.9503 |
| 0.093 | 4.5955 | 2670 | 1.0375 | 0.7318 | 1.0375 | 1.0186 |
| 0.093 | 4.5990 | 2672 | 1.0918 | 0.7318 | 1.0918 | 1.0449 |
| 0.093 | 4.6024 | 2674 | 1.0644 | 0.7014 | 1.0644 | 1.0317 |
| 0.093 | 4.6059 | 2676 | 0.9811 | 0.7020 | 0.9811 | 0.9905 |
| 0.093 | 4.6093 | 2678 | 0.8837 | 0.7729 | 0.8837 | 0.9401 |
| 0.093 | 4.6127 | 2680 | 0.8490 | 0.7729 | 0.8490 | 0.9214 |
| 0.093 | 4.6162 | 2682 | 0.8737 | 0.7729 | 0.8737 | 0.9347 |
| 0.093 | 4.6196 | 2684 | 0.8675 | 0.7729 | 0.8675 | 0.9314 |
| 0.093 | 4.6231 | 2686 | 0.8421 | 0.7729 | 0.8421 | 0.9177 |
| 0.093 | 4.6265 | 2688 | 0.8228 | 0.7729 | 0.8228 | 0.9071 |
| 0.093 | 4.6299 | 2690 | 0.8420 | 0.7729 | 0.8420 | 0.9176 |
| 0.093 | 4.6334 | 2692 | 0.8074 | 0.7729 | 0.8074 | 0.8985 |
| 0.093 | 4.6368 | 2694 | 0.7478 | 0.7525 | 0.7478 | 0.8647 |
| 0.093 | 4.6403 | 2696 | 0.7554 | 0.7525 | 0.7554 | 0.8691 |
| 0.093 | 4.6437 | 2698 | 0.8145 | 0.7729 | 0.8145 | 0.9025 |
| 0.093 | 4.6472 | 2700 | 0.8662 | 0.7729 | 0.8662 | 0.9307 |
| 0.093 | 4.6506 | 2702 | 0.9191 | 0.7844 | 0.9191 | 0.9587 |
| 0.093 | 4.6540 | 2704 | 0.9208 | 0.7844 | 0.9208 | 0.9596 |
| 0.093 | 4.6575 | 2706 | 0.9845 | 0.7844 | 0.9845 | 0.9922 |
| 0.093 | 4.6609 | 2708 | 1.0354 | 0.7626 | 1.0354 | 1.0175 |
| 0.093 | 4.6644 | 2710 | 1.0197 | 0.7626 | 1.0197 | 1.0098 |
| 0.093 | 4.6678 | 2712 | 0.9807 | 0.7844 | 0.9807 | 0.9903 |
| 0.093 | 4.6713 | 2714 | 0.9018 | 0.7844 | 0.9018 | 0.9497 |
| 0.093 | 4.6747 | 2716 | 0.8368 | 0.7844 | 0.8368 | 0.9148 |
| 0.093 | 4.6781 | 2718 | 0.8215 | 0.7729 | 0.8215 | 0.9064 |
| 0.093 | 4.6816 | 2720 | 0.8668 | 0.7310 | 0.8668 | 0.9310 |
| 0.093 | 4.6850 | 2722 | 0.9597 | 0.7216 | 0.9597 | 0.9796 |
| 0.093 | 4.6885 | 2724 | 1.0731 | 0.7014 | 1.0731 | 1.0359 |
| 0.093 | 4.6919 | 2726 | 1.1470 | 0.7014 | 1.1470 | 1.0710 |
| 0.093 | 4.6954 | 2728 | 1.1143 | 0.7014 | 1.1143 | 1.0556 |
| 0.093 | 4.6988 | 2730 | 1.0004 | 0.7216 | 1.0004 | 1.0002 |
| 0.093 | 4.7022 | 2732 | 0.8856 | 0.7844 | 0.8856 | 0.9411 |
| 0.093 | 4.7057 | 2734 | 0.8674 | 0.7844 | 0.8674 | 0.9313 |
| 0.093 | 4.7091 | 2736 | 0.8913 | 0.7844 | 0.8913 | 0.9441 |
| 0.093 | 4.7126 | 2738 | 0.9304 | 0.7844 | 0.9304 | 0.9646 |
| 0.093 | 4.7160 | 2740 | 0.9727 | 0.7844 | 0.9727 | 0.9862 |
| 0.093 | 4.7194 | 2742 | 1.0317 | 0.7216 | 1.0317 | 1.0157 |
| 0.093 | 4.7229 | 2744 | 1.0166 | 0.7216 | 1.0166 | 1.0083 |
| 0.093 | 4.7263 | 2746 | 0.9742 | 0.7216 | 0.9742 | 0.9870 |
| 0.093 | 4.7298 | 2748 | 0.9225 | 0.7216 | 0.9225 | 0.9605 |
| 0.093 | 4.7332 | 2750 | 0.8924 | 0.7134 | 0.8924 | 0.9446 |
| 0.093 | 4.7367 | 2752 | 0.8287 | 0.7525 | 0.8287 | 0.9103 |
| 0.093 | 4.7401 | 2754 | 0.8391 | 0.7623 | 0.8391 | 0.9160 |
| 0.093 | 4.7435 | 2756 | 0.8853 | 0.7844 | 0.8853 | 0.9409 |
| 0.093 | 4.7470 | 2758 | 0.9419 | 0.7844 | 0.9419 | 0.9705 |
| 0.093 | 4.7504 | 2760 | 0.9798 | 0.7626 | 0.9798 | 0.9899 |
| 0.093 | 4.7539 | 2762 | 1.0011 | 0.7626 | 1.0011 | 1.0005 |
| 0.093 | 4.7573 | 2764 | 0.9712 | 0.7626 | 0.9712 | 0.9855 |
| 0.093 | 4.7608 | 2766 | 0.9579 | 0.7844 | 0.9579 | 0.9787 |
| 0.093 | 4.7642 | 2768 | 0.9476 | 0.7844 | 0.9476 | 0.9735 |
| 0.093 | 4.7676 | 2770 | 0.9654 | 0.7844 | 0.9654 | 0.9826 |
| 0.093 | 4.7711 | 2772 | 0.9652 | 0.7844 | 0.9652 | 0.9824 |
| 0.093 | 4.7745 | 2774 | 0.9534 | 0.7216 | 0.9534 | 0.9764 |
| 0.093 | 4.7780 | 2776 | 0.9191 | 0.7844 | 0.9191 | 0.9587 |
| 0.093 | 4.7814 | 2778 | 0.8490 | 0.7844 | 0.8490 | 0.9214 |
| 0.093 | 4.7849 | 2780 | 0.8284 | 0.7844 | 0.8284 | 0.9101 |
| 0.093 | 4.7883 | 2782 | 0.8566 | 0.7844 | 0.8566 | 0.9255 |
| 0.093 | 4.7917 | 2784 | 0.9432 | 0.7310 | 0.9432 | 0.9712 |
| 0.093 | 4.7952 | 2786 | 1.0763 | 0.7014 | 1.0763 | 1.0374 |
| 0.093 | 4.7986 | 2788 | 1.1571 | 0.7014 | 1.1571 | 1.0757 |
| 0.093 | 4.8021 | 2790 | 1.1819 | 0.7014 | 1.1819 | 1.0871 |
| 0.093 | 4.8055 | 2792 | 1.1576 | 0.7014 | 1.1576 | 1.0759 |
| 0.093 | 4.8090 | 2794 | 1.0951 | 0.7014 | 1.0951 | 1.0464 |
| 0.093 | 4.8124 | 2796 | 1.0701 | 0.7101 | 1.0701 | 1.0345 |
| 0.093 | 4.8158 | 2798 | 1.0535 | 0.7101 | 1.0535 | 1.0264 |
| 0.093 | 4.8193 | 2800 | 1.0251 | 0.7216 | 1.0251 | 1.0125 |
| 0.093 | 4.8227 | 2802 | 0.9957 | 0.7216 | 0.9957 | 0.9978 |
| 0.093 | 4.8262 | 2804 | 1.0205 | 0.7216 | 1.0205 | 1.0102 |
| 0.093 | 4.8296 | 2806 | 1.0454 | 0.7014 | 1.0454 | 1.0224 |
| 0.093 | 4.8330 | 2808 | 1.0014 | 0.7216 | 1.0014 | 1.0007 |
| 0.093 | 4.8365 | 2810 | 0.9299 | 0.7216 | 0.9299 | 0.9643 |
| 0.093 | 4.8399 | 2812 | 0.9364 | 0.7134 | 0.9364 | 0.9677 |
| 0.093 | 4.8434 | 2814 | 1.0040 | 0.7014 | 1.0040 | 1.0020 |
| 0.093 | 4.8468 | 2816 | 1.1074 | 0.7014 | 1.1074 | 1.0523 |
| 0.093 | 4.8503 | 2818 | 1.1480 | 0.7014 | 1.1480 | 1.0714 |
| 0.093 | 4.8537 | 2820 | 1.1050 | 0.7014 | 1.1050 | 1.0512 |
| 0.093 | 4.8571 | 2822 | 1.0421 | 0.7014 | 1.0421 | 1.0209 |
| 0.093 | 4.8606 | 2824 | 1.0481 | 0.7101 | 1.0481 | 1.0237 |
| 0.093 | 4.8640 | 2826 | 1.0841 | 0.7101 | 1.0841 | 1.0412 |
| 0.093 | 4.8675 | 2828 | 1.0416 | 0.7626 | 1.0416 | 1.0206 |
| 0.093 | 4.8709 | 2830 | 0.9865 | 0.7844 | 0.9865 | 0.9932 |
| 0.093 | 4.8744 | 2832 | 0.9744 | 0.7844 | 0.9744 | 0.9871 |
| 0.093 | 4.8778 | 2834 | 0.9640 | 0.7221 | 0.9640 | 0.9819 |
| 0.093 | 4.8812 | 2836 | 0.9897 | 0.7216 | 0.9897 | 0.9948 |
| 0.093 | 4.8847 | 2838 | 1.0464 | 0.7014 | 1.0464 | 1.0229 |
| 0.093 | 4.8881 | 2840 | 1.1244 | 0.7014 | 1.1244 | 1.0604 |
| 0.093 | 4.8916 | 2842 | 1.1248 | 0.7014 | 1.1248 | 1.0606 |
| 0.093 | 4.8950 | 2844 | 1.0412 | 0.6940 | 1.0412 | 1.0204 |
| 0.093 | 4.8985 | 2846 | 0.9431 | 0.7134 | 0.9431 | 0.9711 |
| 0.093 | 4.9019 | 2848 | 0.9325 | 0.7221 | 0.9325 | 0.9657 |
| 0.093 | 4.9053 | 2850 | 0.9561 | 0.7221 | 0.9561 | 0.9778 |
| 0.093 | 4.9088 | 2852 | 1.0079 | 0.6940 | 1.0079 | 1.0040 |
| 0.093 | 4.9122 | 2854 | 1.0729 | 0.6940 | 1.0729 | 1.0358 |
| 0.093 | 4.9157 | 2856 | 1.0921 | 0.7014 | 1.0921 | 1.0450 |
| 0.093 | 4.9191 | 2858 | 1.0759 | 0.7014 | 1.0759 | 1.0373 |
| 0.093 | 4.9225 | 2860 | 1.0548 | 0.7101 | 1.0548 | 1.0270 |
| 0.093 | 4.9260 | 2862 | 1.0062 | 0.7520 | 1.0062 | 1.0031 |
| 0.093 | 4.9294 | 2864 | 0.9362 | 0.7729 | 0.9362 | 0.9676 |
| 0.093 | 4.9329 | 2866 | 0.9227 | 0.7729 | 0.9227 | 0.9606 |
| 0.093 | 4.9363 | 2868 | 0.9548 | 0.6940 | 0.9548 | 0.9771 |
| 0.093 | 4.9398 | 2870 | 0.9648 | 0.6940 | 0.9648 | 0.9822 |
| 0.093 | 4.9432 | 2872 | 0.9492 | 0.6940 | 0.9492 | 0.9743 |
| 0.093 | 4.9466 | 2874 | 1.0056 | 0.6940 | 1.0056 | 1.0028 |
| 0.093 | 4.9501 | 2876 | 1.1033 | 0.6940 | 1.1033 | 1.0504 |
| 0.093 | 4.9535 | 2878 | 1.1295 | 0.6940 | 1.1295 | 1.0628 |
| 0.093 | 4.9570 | 2880 | 1.0674 | 0.6940 | 1.0674 | 1.0332 |
| 0.093 | 4.9604 | 2882 | 0.9604 | 0.6940 | 0.9604 | 0.9800 |
| 0.093 | 4.9639 | 2884 | 0.8430 | 0.7729 | 0.8430 | 0.9181 |
| 0.093 | 4.9673 | 2886 | 0.7901 | 0.6776 | 0.7901 | 0.8889 |
| 0.093 | 4.9707 | 2888 | 0.7982 | 0.6776 | 0.7982 | 0.8934 |
| 0.093 | 4.9742 | 2890 | 0.8539 | 0.7525 | 0.8539 | 0.9241 |
| 0.093 | 4.9776 | 2892 | 0.8947 | 0.7729 | 0.8947 | 0.9459 |
| 0.093 | 4.9811 | 2894 | 0.8931 | 0.7729 | 0.8931 | 0.9451 |
| 0.093 | 4.9845 | 2896 | 0.9143 | 0.7618 | 0.9143 | 0.9562 |
| 0.093 | 4.9880 | 2898 | 0.9249 | 0.7618 | 0.9249 | 0.9617 |
| 0.093 | 4.9914 | 2900 | 0.9200 | 0.7134 | 0.9200 | 0.9592 |
| 0.093 | 4.9948 | 2902 | 0.8791 | 0.7618 | 0.8791 | 0.9376 |
| 0.093 | 4.9983 | 2904 | 0.8315 | 0.7422 | 0.8315 | 0.9119 |
| 0.093 | 5.0017 | 2906 | 0.7962 | 0.7422 | 0.7962 | 0.8923 |
| 0.093 | 5.0052 | 2908 | 0.8046 | 0.7422 | 0.8046 | 0.8970 |
| 0.093 | 5.0086 | 2910 | 0.8095 | 0.7422 | 0.8095 | 0.8997 |
| 0.093 | 5.0120 | 2912 | 0.8318 | 0.7525 | 0.8318 | 0.9120 |
| 0.093 | 5.0155 | 2914 | 0.8606 | 0.7618 | 0.8606 | 0.9277 |
| 0.093 | 5.0189 | 2916 | 0.9482 | 0.7134 | 0.9482 | 0.9738 |
| 0.093 | 5.0224 | 2918 | 1.0055 | 0.7134 | 1.0055 | 1.0028 |
| 0.093 | 5.0258 | 2920 | 0.9838 | 0.7134 | 0.9838 | 0.9919 |
| 0.093 | 5.0293 | 2922 | 0.9015 | 0.7618 | 0.9015 | 0.9495 |
| 0.093 | 5.0327 | 2924 | 0.8372 | 0.7427 | 0.8372 | 0.9150 |
| 0.093 | 5.0361 | 2926 | 0.7814 | 0.7427 | 0.7814 | 0.8840 |
| 0.093 | 5.0396 | 2928 | 0.7753 | 0.7427 | 0.7753 | 0.8805 |
| 0.093 | 5.0430 | 2930 | 0.8035 | 0.7332 | 0.8035 | 0.8964 |
| 0.093 | 5.0465 | 2932 | 0.8362 | 0.7058 | 0.8362 | 0.9144 |
| 0.093 | 5.0499 | 2934 | 0.9160 | 0.7134 | 0.9160 | 0.9571 |
| 0.093 | 5.0534 | 2936 | 0.9398 | 0.7134 | 0.9398 | 0.9694 |
| 0.093 | 5.0568 | 2938 | 0.9966 | 0.6940 | 0.9966 | 0.9983 |
| 0.093 | 5.0602 | 2940 | 1.0665 | 0.7014 | 1.0665 | 1.0327 |
| 0.093 | 5.0637 | 2942 | 1.0958 | 0.7014 | 1.0958 | 1.0468 |
| 0.093 | 5.0671 | 2944 | 1.0613 | 0.7515 | 1.0613 | 1.0302 |
| 0.093 | 5.0706 | 2946 | 0.9879 | 0.7844 | 0.9879 | 0.9939 |
| 0.093 | 5.0740 | 2948 | 0.9464 | 0.7729 | 0.9464 | 0.9728 |
| 0.093 | 5.0775 | 2950 | 0.9082 | 0.7729 | 0.9082 | 0.9530 |
| 0.093 | 5.0809 | 2952 | 0.8775 | 0.7623 | 0.8775 | 0.9368 |
| 0.093 | 5.0843 | 2954 | 0.9085 | 0.7623 | 0.9085 | 0.9531 |
| 0.093 | 5.0878 | 2956 | 0.9787 | 0.7724 | 0.9787 | 0.9893 |
| 0.093 | 5.0912 | 2958 | 1.0504 | 0.7014 | 1.0504 | 1.0249 |
| 0.093 | 5.0947 | 2960 | 1.0970 | 0.7014 | 1.0970 | 1.0474 |
| 0.093 | 5.0981 | 2962 | 1.1499 | 0.7014 | 1.1499 | 1.0723 |
| 0.093 | 5.1015 | 2964 | 1.1239 | 0.7014 | 1.1239 | 1.0602 |
| 0.093 | 5.1050 | 2966 | 1.0387 | 0.7014 | 1.0387 | 1.0192 |
| 0.093 | 5.1084 | 2968 | 0.9670 | 0.7520 | 0.9670 | 0.9834 |
| 0.093 | 5.1119 | 2970 | 0.9024 | 0.7729 | 0.9024 | 0.9500 |
| 0.093 | 5.1153 | 2972 | 0.8931 | 0.7623 | 0.8931 | 0.9451 |
| 0.093 | 5.1188 | 2974 | 0.9301 | 0.7729 | 0.9301 | 0.9644 |
| 0.093 | 5.1222 | 2976 | 0.9994 | 0.7014 | 0.9994 | 0.9997 |
| 0.093 | 5.1256 | 2978 | 1.1028 | 0.7014 | 1.1028 | 1.0501 |
| 0.093 | 5.1291 | 2980 | 1.2200 | 0.7014 | 1.2200 | 1.1045 |
| 0.093 | 5.1325 | 2982 | 1.2756 | 0.7014 | 1.2756 | 1.1294 |
| 0.093 | 5.1360 | 2984 | 1.2344 | 0.7014 | 1.2344 | 1.1110 |
| 0.093 | 5.1394 | 2986 | 1.1429 | 0.7014 | 1.1429 | 1.0691 |
| 0.093 | 5.1429 | 2988 | 1.0752 | 0.7014 | 1.0752 | 1.0369 |
| 0.093 | 5.1463 | 2990 | 0.9669 | 0.7844 | 0.9669 | 0.9833 |
| 0.093 | 5.1497 | 2992 | 0.8463 | 0.6706 | 0.8463 | 0.9200 |
| 0.093 | 5.1532 | 2994 | 0.8004 | 0.6706 | 0.8004 | 0.8946 |
| 0.093 | 5.1566 | 2996 | 0.8073 | 0.6706 | 0.8073 | 0.8985 |
| 0.093 | 5.1601 | 2998 | 0.8575 | 0.6706 | 0.8575 | 0.9260 |
| 0.0747 | 5.1635 | 3000 | 0.9750 | 0.7020 | 0.9750 | 0.9874 |
| 0.0747 | 5.1670 | 3002 | 1.1668 | 0.7014 | 1.1668 | 1.0802 |
| 0.0747 | 5.1704 | 3004 | 1.2956 | 0.6740 | 1.2956 | 1.1383 |
| 0.0747 | 5.1738 | 3006 | 1.3041 | 0.6740 | 1.3041 | 1.1420 |
| 0.0747 | 5.1773 | 3008 | 1.2135 | 0.7318 | 1.2135 | 1.1016 |
| 0.0747 | 5.1807 | 3010 | 1.0576 | 0.6940 | 1.0576 | 1.0284 |
| 0.0747 | 5.1842 | 3012 | 0.9034 | 0.7623 | 0.9034 | 0.9505 |
| 0.0747 | 5.1876 | 3014 | 0.8375 | 0.7427 | 0.8375 | 0.9151 |
| 0.0747 | 5.1910 | 3016 | 0.8257 | 0.7427 | 0.8257 | 0.9087 |
| 0.0747 | 5.1945 | 3018 | 0.8541 | 0.7427 | 0.8541 | 0.9242 |
| 0.0747 | 5.1979 | 3020 | 0.9046 | 0.7520 | 0.9046 | 0.9511 |
| 0.0747 | 5.2014 | 3022 | 0.9672 | 0.7058 | 0.9672 | 0.9834 |
| 0.0747 | 5.2048 | 3024 | 1.0082 | 0.6940 | 1.0082 | 1.0041 |
| 0.0747 | 5.2083 | 3026 | 1.0039 | 0.7618 | 1.0039 | 1.0019 |
| 0.0747 | 5.2117 | 3028 | 0.9942 | 0.7729 | 0.9942 | 0.9971 |
| 0.0747 | 5.2151 | 3030 | 1.0135 | 0.7844 | 1.0135 | 1.0067 |
| 0.0747 | 5.2186 | 3032 | 0.9684 | 0.7844 | 0.9684 | 0.9841 |
| 0.0747 | 5.2220 | 3034 | 0.9222 | 0.7844 | 0.9222 | 0.9603 |
| 0.0747 | 5.2255 | 3036 | 0.9035 | 0.7729 | 0.9035 | 0.9505 |
| 0.0747 | 5.2289 | 3038 | 0.9252 | 0.7729 | 0.9252 | 0.9619 |
| 0.0747 | 5.2324 | 3040 | 0.9296 | 0.7134 | 0.9296 | 0.9641 |
| 0.0747 | 5.2358 | 3042 | 0.9049 | 0.7134 | 0.9049 | 0.9513 |
| 0.0747 | 5.2392 | 3044 | 0.8874 | 0.7134 | 0.8874 | 0.9420 |
| 0.0747 | 5.2427 | 3046 | 0.8849 | 0.7058 | 0.8849 | 0.9407 |
| 0.0747 | 5.2461 | 3048 | 0.8468 | 0.7332 | 0.8468 | 0.9202 |
| 0.0747 | 5.2496 | 3050 | 0.8215 | 0.7332 | 0.8215 | 0.9063 |
| 0.0747 | 5.2530 | 3052 | 0.7912 | 0.7427 | 0.7912 | 0.8895 |
| 0.0747 | 5.2565 | 3054 | 0.7873 | 0.7427 | 0.7873 | 0.8873 |
| 0.0747 | 5.2599 | 3056 | 0.7841 | 0.7427 | 0.7841 | 0.8855 |
| 0.0747 | 5.2633 | 3058 | 0.8000 | 0.7427 | 0.8000 | 0.8944 |
| 0.0747 | 5.2668 | 3060 | 0.8375 | 0.7427 | 0.8375 | 0.9151 |
| 0.0747 | 5.2702 | 3062 | 0.8906 | 0.7623 | 0.8906 | 0.9437 |
| 0.0747 | 5.2737 | 3064 | 0.8981 | 0.7623 | 0.8981 | 0.9477 |
| 0.0747 | 5.2771 | 3066 | 0.8703 | 0.7427 | 0.8703 | 0.9329 |
| 0.0747 | 5.2806 | 3068 | 0.8833 | 0.7427 | 0.8833 | 0.9398 |
| 0.0747 | 5.2840 | 3070 | 0.9352 | 0.7729 | 0.9352 | 0.9670 |
| 0.0747 | 5.2874 | 3072 | 0.9680 | 0.7618 | 0.9680 | 0.9839 |
| 0.0747 | 5.2909 | 3074 | 1.0049 | 0.7216 | 1.0049 | 1.0025 |
| 0.0747 | 5.2943 | 3076 | 0.9936 | 0.7844 | 0.9936 | 0.9968 |
| 0.0747 | 5.2978 | 3078 | 0.9495 | 0.7631 | 0.9495 | 0.9744 |
| 0.0747 | 5.3012 | 3080 | 0.9608 | 0.7844 | 0.9608 | 0.9802 |
| 0.0747 | 5.3046 | 3082 | 0.9620 | 0.7729 | 0.9620 | 0.9808 |
| 0.0747 | 5.3081 | 3084 | 0.9566 | 0.7729 | 0.9566 | 0.9780 |
| 0.0747 | 5.3115 | 3086 | 0.9654 | 0.6871 | 0.9654 | 0.9825 |
| 0.0747 | 5.3150 | 3088 | 0.9781 | 0.6871 | 0.9781 | 0.9890 |
| 0.0747 | 5.3184 | 3090 | 0.9226 | 0.7058 | 0.9226 | 0.9605 |
| 0.0747 | 5.3219 | 3092 | 0.8537 | 0.6877 | 0.8537 | 0.9240 |
| 0.0747 | 5.3253 | 3094 | 0.8312 | 0.6877 | 0.8312 | 0.9117 |
| 0.0747 | 5.3287 | 3096 | 0.8413 | 0.6877 | 0.8413 | 0.9172 |
| 0.0747 | 5.3322 | 3098 | 0.8839 | 0.6877 | 0.8839 | 0.9402 |
| 0.0747 | 5.3356 | 3100 | 0.9488 | 0.7058 | 0.9488 | 0.9741 |
| 0.0747 | 5.3391 | 3102 | 0.9855 | 0.6871 | 0.9855 | 0.9927 |
| 0.0747 | 5.3425 | 3104 | 0.9659 | 0.7058 | 0.9659 | 0.9828 |
| 0.0747 | 5.3460 | 3106 | 0.9494 | 0.7058 | 0.9494 | 0.9744 |
| 0.0747 | 5.3494 | 3108 | 0.9064 | 0.7332 | 0.9064 | 0.9520 |
| 0.0747 | 5.3528 | 3110 | 0.9000 | 0.7427 | 0.9000 | 0.9487 |
| 0.0747 | 5.3563 | 3112 | 0.8792 | 0.7427 | 0.8792 | 0.9377 |
| 0.0747 | 5.3597 | 3114 | 0.8972 | 0.7427 | 0.8972 | 0.9472 |
| 0.0747 | 5.3632 | 3116 | 0.9302 | 0.7623 | 0.9302 | 0.9644 |
| 0.0747 | 5.3666 | 3118 | 0.9538 | 0.7724 | 0.9538 | 0.9766 |
| 0.0747 | 5.3701 | 3120 | 0.9776 | 0.7724 | 0.9776 | 0.9887 |
| 0.0747 | 5.3735 | 3122 | 0.9534 | 0.7520 | 0.9534 | 0.9764 |
| 0.0747 | 5.3769 | 3124 | 0.9156 | 0.7520 | 0.9156 | 0.9569 |
| 0.0747 | 5.3804 | 3126 | 0.8381 | 0.7520 | 0.8381 | 0.9155 |
| 0.0747 | 5.3838 | 3128 | 0.8138 | 0.7717 | 0.8138 | 0.9021 |
| 0.0747 | 5.3873 | 3130 | 0.8228 | 0.7906 | 0.8228 | 0.9071 |
| 0.0747 | 5.3907 | 3132 | 0.8711 | 0.7623 | 0.8711 | 0.9333 |
| 0.0747 | 5.3941 | 3134 | 0.8825 | 0.7520 | 0.8825 | 0.9394 |
| 0.0747 | 5.3976 | 3136 | 0.9027 | 0.7327 | 0.9027 | 0.9501 |
| 0.0747 | 5.4010 | 3138 | 0.8858 | 0.7520 | 0.8858 | 0.9412 |
| 0.0747 | 5.4045 | 3140 | 0.8667 | 0.7797 | 0.8667 | 0.9310 |
| 0.0747 | 5.4079 | 3142 | 0.8389 | 0.7906 | 0.8389 | 0.9159 |
| 0.0747 | 5.4114 | 3144 | 0.7985 | 0.7906 | 0.7985 | 0.8936 |
| 0.0747 | 5.4148 | 3146 | 0.7903 | 0.7906 | 0.7903 | 0.8890 |
| 0.0747 | 5.4182 | 3148 | 0.8289 | 0.7797 | 0.8289 | 0.9104 |
| 0.0747 | 5.4217 | 3150 | 0.8903 | 0.7797 | 0.8903 | 0.9436 |
| 0.0747 | 5.4251 | 3152 | 0.8926 | 0.7797 | 0.8926 | 0.9448 |
| 0.0747 | 5.4286 | 3154 | 0.8797 | 0.7797 | 0.8797 | 0.9379 |
| 0.0747 | 5.4320 | 3156 | 0.8540 | 0.7797 | 0.8540 | 0.9241 |
| 0.0747 | 5.4355 | 3158 | 0.8324 | 0.7797 | 0.8324 | 0.9124 |
| 0.0747 | 5.4389 | 3160 | 0.8058 | 0.7797 | 0.8058 | 0.8976 |
| 0.0747 | 5.4423 | 3162 | 0.8134 | 0.7797 | 0.8134 | 0.9019 |
| 0.0747 | 5.4458 | 3164 | 0.8076 | 0.7797 | 0.8076 | 0.8987 |
| 0.0747 | 5.4492 | 3166 | 0.7578 | 0.7797 | 0.7578 | 0.8705 |
| 0.0747 | 5.4527 | 3168 | 0.6981 | 0.7616 | 0.6981 | 0.8355 |
| 0.0747 | 5.4561 | 3170 | 0.6939 | 0.7616 | 0.6939 | 0.8330 |
| 0.0747 | 5.4596 | 3172 | 0.7263 | 0.7797 | 0.7263 | 0.8522 |
| 0.0747 | 5.4630 | 3174 | 0.7884 | 0.7797 | 0.7884 | 0.8879 |
| 0.0747 | 5.4664 | 3176 | 0.8700 | 0.7342 | 0.8700 | 0.9327 |
| 0.0747 | 5.4699 | 3178 | 0.9174 | 0.7058 | 0.9174 | 0.9578 |
| 0.0747 | 5.4733 | 3180 | 0.9045 | 0.7520 | 0.9045 | 0.9511 |
| 0.0747 | 5.4768 | 3182 | 0.8565 | 0.7623 | 0.8565 | 0.9255 |
| 0.0747 | 5.4802 | 3184 | 0.8243 | 0.7906 | 0.8243 | 0.9079 |
| 0.0747 | 5.4836 | 3186 | 0.7657 | 0.7906 | 0.7657 | 0.8751 |
| 0.0747 | 5.4871 | 3188 | 0.7098 | 0.7037 | 0.7098 | 0.8425 |
| 0.0747 | 5.4905 | 3190 | 0.6866 | 0.7003 | 0.6866 | 0.8286 |
| 0.0747 | 5.4940 | 3192 | 0.6860 | 0.7614 | 0.6860 | 0.8282 |
| 0.0747 | 5.4974 | 3194 | 0.7134 | 0.7614 | 0.7134 | 0.8446 |
| 0.0747 | 5.5009 | 3196 | 0.7449 | 0.7358 | 0.7449 | 0.8631 |
| 0.0747 | 5.5043 | 3198 | 0.7860 | 0.7281 | 0.7860 | 0.8866 |
| 0.0747 | 5.5077 | 3200 | 0.7793 | 0.7281 | 0.7793 | 0.8828 |
| 0.0747 | 5.5112 | 3202 | 0.7870 | 0.7281 | 0.7870 | 0.8871 |
| 0.0747 | 5.5146 | 3204 | 0.7765 | 0.7281 | 0.7765 | 0.8812 |
| 0.0747 | 5.5181 | 3206 | 0.7990 | 0.7430 | 0.7990 | 0.8939 |
| 0.0747 | 5.5215 | 3208 | 0.8721 | 0.7058 | 0.8721 | 0.9339 |
| 0.0747 | 5.5250 | 3210 | 0.9112 | 0.7139 | 0.9112 | 0.9546 |
| 0.0747 | 5.5284 | 3212 | 0.9006 | 0.7623 | 0.9006 | 0.9490 |
| 0.0747 | 5.5318 | 3214 | 0.9005 | 0.7623 | 0.9005 | 0.9489 |
| 0.0747 | 5.5353 | 3216 | 0.8699 | 0.7623 | 0.8699 | 0.9327 |
| 0.0747 | 5.5387 | 3218 | 0.8513 | 0.7623 | 0.8513 | 0.9227 |
| 0.0747 | 5.5422 | 3220 | 0.8260 | 0.7623 | 0.8260 | 0.9089 |
| 0.0747 | 5.5456 | 3222 | 0.8268 | 0.7427 | 0.8268 | 0.9093 |
| 0.0747 | 5.5491 | 3224 | 0.8364 | 0.7427 | 0.8364 | 0.9146 |
| 0.0747 | 5.5525 | 3226 | 0.8590 | 0.7623 | 0.8590 | 0.9268 |
| 0.0747 | 5.5559 | 3228 | 0.8719 | 0.7139 | 0.8719 | 0.9338 |
| 0.0747 | 5.5594 | 3230 | 0.8870 | 0.7430 | 0.8870 | 0.9418 |
| 0.0747 | 5.5628 | 3232 | 0.8862 | 0.7430 | 0.8862 | 0.9414 |
| 0.0747 | 5.5663 | 3234 | 0.8582 | 0.7430 | 0.8582 | 0.9264 |
| 0.0747 | 5.5697 | 3236 | 0.8389 | 0.7430 | 0.8389 | 0.9159 |
| 0.0747 | 5.5731 | 3238 | 0.8455 | 0.7430 | 0.8455 | 0.9195 |
| 0.0747 | 5.5766 | 3240 | 0.8567 | 0.7430 | 0.8567 | 0.9256 |
| 0.0747 | 5.5800 | 3242 | 0.8395 | 0.7906 | 0.8395 | 0.9162 |
| 0.0747 | 5.5835 | 3244 | 0.8192 | 0.7717 | 0.8192 | 0.9051 |
| 0.0747 | 5.5869 | 3246 | 0.8262 | 0.7717 | 0.8262 | 0.9089 |
| 0.0747 | 5.5904 | 3248 | 0.8373 | 0.7906 | 0.8373 | 0.9150 |
| 0.0747 | 5.5938 | 3250 | 0.8707 | 0.7430 | 0.8707 | 0.9331 |
| 0.0747 | 5.5972 | 3252 | 0.8729 | 0.7139 | 0.8729 | 0.9343 |
| 0.0747 | 5.6007 | 3254 | 0.8703 | 0.7139 | 0.8703 | 0.9329 |
| 0.0747 | 5.6041 | 3256 | 0.8690 | 0.7139 | 0.8690 | 0.9322 |
| 0.0747 | 5.6076 | 3258 | 0.8619 | 0.7430 | 0.8619 | 0.9284 |
| 0.0747 | 5.6110 | 3260 | 0.8511 | 0.7430 | 0.8511 | 0.9225 |
| 0.0747 | 5.6145 | 3262 | 0.8698 | 0.7430 | 0.8698 | 0.9327 |
| 0.0747 | 5.6179 | 3264 | 0.8647 | 0.7906 | 0.8647 | 0.9299 |
| 0.0747 | 5.6213 | 3266 | 0.8368 | 0.7906 | 0.8368 | 0.9147 |
| 0.0747 | 5.6248 | 3268 | 0.8018 | 0.7906 | 0.8018 | 0.8955 |
| 0.0747 | 5.6282 | 3270 | 0.7609 | 0.7717 | 0.7609 | 0.8723 |
| 0.0747 | 5.6317 | 3272 | 0.7568 | 0.7717 | 0.7568 | 0.8699 |
| 0.0747 | 5.6351 | 3274 | 0.7973 | 0.7906 | 0.7973 | 0.8929 |
| 0.0747 | 5.6386 | 3276 | 0.8800 | 0.7906 | 0.8800 | 0.9381 |
| 0.0747 | 5.6420 | 3278 | 0.9698 | 0.7618 | 0.9698 | 0.9848 |
| 0.0747 | 5.6454 | 3280 | 1.0338 | 0.7511 | 1.0338 | 1.0168 |
| 0.0747 | 5.6489 | 3282 | 1.0538 | 0.7511 | 1.0538 | 1.0265 |
| 0.0747 | 5.6523 | 3284 | 1.0620 | 0.7511 | 1.0620 | 1.0306 |
| 0.0747 | 5.6558 | 3286 | 1.0157 | 0.7844 | 1.0157 | 1.0078 |
| 0.0747 | 5.6592 | 3288 | 0.9338 | 0.7844 | 0.9338 | 0.9663 |
| 0.0747 | 5.6627 | 3290 | 0.8643 | 0.7729 | 0.8643 | 0.9297 |
| 0.0747 | 5.6661 | 3292 | 0.8077 | 0.7823 | 0.8077 | 0.8987 |
| 0.0747 | 5.6695 | 3294 | 0.7973 | 0.7823 | 0.7973 | 0.8929 |
| 0.0747 | 5.6730 | 3296 | 0.8067 | 0.7823 | 0.8067 | 0.8982 |
| 0.0747 | 5.6764 | 3298 | 0.8382 | 0.8019 | 0.8382 | 0.9155 |
| 0.0747 | 5.6799 | 3300 | 0.8706 | 0.7729 | 0.8706 | 0.9331 |
| 0.0747 | 5.6833 | 3302 | 0.8829 | 0.7729 | 0.8829 | 0.9396 |
| 0.0747 | 5.6867 | 3304 | 0.9048 | 0.7729 | 0.9048 | 0.9512 |
| 0.0747 | 5.6902 | 3306 | 0.9083 | 0.7729 | 0.9083 | 0.9531 |
| 0.0747 | 5.6936 | 3308 | 0.8608 | 0.7823 | 0.8608 | 0.9278 |
| 0.0747 | 5.6971 | 3310 | 0.8288 | 0.7118 | 0.8288 | 0.9104 |
| 0.0747 | 5.7005 | 3312 | 0.8190 | 0.7118 | 0.8190 | 0.9050 |
| 0.0747 | 5.7040 | 3314 | 0.8346 | 0.7823 | 0.8346 | 0.9136 |
| 0.0747 | 5.7074 | 3316 | 0.8495 | 0.7717 | 0.8495 | 0.9217 |
| 0.0747 | 5.7108 | 3318 | 0.8958 | 0.7426 | 0.8958 | 0.9464 |
| 0.0747 | 5.7143 | 3320 | 0.9221 | 0.7134 | 0.9221 | 0.9603 |
| 0.0747 | 5.7177 | 3322 | 0.9595 | 0.7134 | 0.9595 | 0.9795 |
| 0.0747 | 5.7212 | 3324 | 0.9872 | 0.7134 | 0.9872 | 0.9936 |
| 0.0747 | 5.7246 | 3326 | 0.9587 | 0.7134 | 0.9587 | 0.9791 |
| 0.0747 | 5.7281 | 3328 | 0.9075 | 0.7058 | 0.9075 | 0.9526 |
| 0.0747 | 5.7315 | 3330 | 0.8997 | 0.7342 | 0.8997 | 0.9485 |
| 0.0747 | 5.7349 | 3332 | 0.8699 | 0.7342 | 0.8699 | 0.9327 |
| 0.0747 | 5.7384 | 3334 | 0.8585 | 0.7342 | 0.8585 | 0.9266 |
| 0.0747 | 5.7418 | 3336 | 0.8689 | 0.7342 | 0.8689 | 0.9322 |
| 0.0747 | 5.7453 | 3338 | 0.8748 | 0.7342 | 0.8748 | 0.9353 |
| 0.0747 | 5.7487 | 3340 | 0.9345 | 0.7058 | 0.9345 | 0.9667 |
| 0.0747 | 5.7522 | 3342 | 1.0603 | 0.7421 | 1.0603 | 1.0297 |
| 0.0747 | 5.7556 | 3344 | 1.1264 | 0.7421 | 1.1264 | 1.0613 |
| 0.0747 | 5.7590 | 3346 | 1.1544 | 0.7234 | 1.1544 | 1.0744 |
| 0.0747 | 5.7625 | 3348 | 1.0975 | 0.7134 | 1.0975 | 1.0476 |
| 0.0747 | 5.7659 | 3350 | 0.9816 | 0.7134 | 0.9816 | 0.9908 |
| 0.0747 | 5.7694 | 3352 | 0.8943 | 0.7729 | 0.8943 | 0.9457 |
| 0.0747 | 5.7728 | 3354 | 0.7892 | 0.7717 | 0.7892 | 0.8884 |
| 0.0747 | 5.7762 | 3356 | 0.7206 | 0.7037 | 0.7206 | 0.8489 |
| 0.0747 | 5.7797 | 3358 | 0.7061 | 0.7037 | 0.7061 | 0.8403 |
| 0.0747 | 5.7831 | 3360 | 0.7163 | 0.7717 | 0.7163 | 0.8463 |
| 0.0747 | 5.7866 | 3362 | 0.7507 | 0.7717 | 0.7507 | 0.8664 |
| 0.0747 | 5.7900 | 3364 | 0.8105 | 0.7249 | 0.8105 | 0.9003 |
| 0.0747 | 5.7935 | 3366 | 0.8597 | 0.7058 | 0.8597 | 0.9272 |
| 0.0747 | 5.7969 | 3368 | 0.8773 | 0.7058 | 0.8773 | 0.9366 |
| 0.0747 | 5.8003 | 3370 | 0.8523 | 0.7058 | 0.8523 | 0.9232 |
| 0.0747 | 5.8038 | 3372 | 0.8112 | 0.7139 | 0.8112 | 0.9007 |
| 0.0747 | 5.8072 | 3374 | 0.7645 | 0.7717 | 0.7645 | 0.8743 |
| 0.0747 | 5.8107 | 3376 | 0.7376 | 0.7717 | 0.7376 | 0.8588 |
| 0.0747 | 5.8141 | 3378 | 0.7075 | 0.7717 | 0.7075 | 0.8411 |
| 0.0747 | 5.8176 | 3380 | 0.7168 | 0.7717 | 0.7168 | 0.8466 |
| 0.0747 | 5.8210 | 3382 | 0.7585 | 0.7717 | 0.7585 | 0.8709 |
| 0.0747 | 5.8244 | 3384 | 0.8238 | 0.7623 | 0.8238 | 0.9076 |
| 0.0747 | 5.8279 | 3386 | 0.9092 | 0.7134 | 0.9092 | 0.9535 |
| 0.0747 | 5.8313 | 3388 | 0.9471 | 0.7134 | 0.9471 | 0.9732 |
| 0.0747 | 5.8348 | 3390 | 0.9227 | 0.7134 | 0.9227 | 0.9606 |
| 0.0747 | 5.8382 | 3392 | 0.8657 | 0.7623 | 0.8657 | 0.9304 |
| 0.0747 | 5.8417 | 3394 | 0.7902 | 0.7623 | 0.7902 | 0.8889 |
| 0.0747 | 5.8451 | 3396 | 0.7653 | 0.7717 | 0.7653 | 0.8748 |
| 0.0747 | 5.8485 | 3398 | 0.7799 | 0.7906 | 0.7799 | 0.8831 |
| 0.0747 | 5.8520 | 3400 | 0.7953 | 0.7623 | 0.7953 | 0.8918 |
| 0.0747 | 5.8554 | 3402 | 0.8638 | 0.7729 | 0.8638 | 0.9294 |
| 0.0747 | 5.8589 | 3404 | 0.9177 | 0.7520 | 0.9177 | 0.9580 |
| 0.0747 | 5.8623 | 3406 | 0.9266 | 0.7520 | 0.9266 | 0.9626 |
| 0.0747 | 5.8657 | 3408 | 0.8869 | 0.7729 | 0.8869 | 0.9418 |
| 0.0747 | 5.8692 | 3410 | 0.8149 | 0.8019 | 0.8149 | 0.9027 |
| 0.0747 | 5.8726 | 3412 | 0.7889 | 0.7906 | 0.7889 | 0.8882 |
| 0.0747 | 5.8761 | 3414 | 0.7519 | 0.7717 | 0.7519 | 0.8671 |
| 0.0747 | 5.8795 | 3416 | 0.7517 | 0.7717 | 0.7517 | 0.8670 |
| 0.0747 | 5.8830 | 3418 | 0.7944 | 0.7823 | 0.7944 | 0.8913 |
| 0.0747 | 5.8864 | 3420 | 0.8674 | 0.8019 | 0.8674 | 0.9313 |
| 0.0747 | 5.8898 | 3422 | 0.9750 | 0.7134 | 0.9750 | 0.9874 |
| 0.0747 | 5.8933 | 3424 | 1.0894 | 0.6940 | 1.0894 | 1.0438 |
| 0.0747 | 5.8967 | 3426 | 1.1155 | 0.6940 | 1.1155 | 1.0562 |
| 0.0747 | 5.9002 | 3428 | 1.0648 | 0.7134 | 1.0648 | 1.0319 |
| 0.0747 | 5.9036 | 3430 | 1.0067 | 0.7729 | 1.0067 | 1.0033 |
| 0.0747 | 5.9071 | 3432 | 0.9113 | 0.8019 | 0.9113 | 0.9546 |
| 0.0747 | 5.9105 | 3434 | 0.8503 | 0.7823 | 0.8503 | 0.9221 |
| 0.0747 | 5.9139 | 3436 | 0.7971 | 0.7118 | 0.7971 | 0.8928 |
| 0.0747 | 5.9174 | 3438 | 0.7727 | 0.7118 | 0.7727 | 0.8790 |
| 0.0747 | 5.9208 | 3440 | 0.7709 | 0.7118 | 0.7709 | 0.8780 |
| 0.0747 | 5.9243 | 3442 | 0.8057 | 0.7823 | 0.8057 | 0.8976 |
| 0.0747 | 5.9277 | 3444 | 0.8727 | 0.8019 | 0.8727 | 0.9342 |
| 0.0747 | 5.9312 | 3446 | 0.9622 | 0.7426 | 0.9622 | 0.9809 |
| 0.0747 | 5.9346 | 3448 | 1.0046 | 0.7134 | 1.0046 | 1.0023 |
| 0.0747 | 5.9380 | 3450 | 0.9754 | 0.7426 | 0.9754 | 0.9876 |
| 0.0747 | 5.9415 | 3452 | 0.8978 | 0.7426 | 0.8978 | 0.9475 |
| 0.0747 | 5.9449 | 3454 | 0.8033 | 0.7823 | 0.8033 | 0.8963 |
| 0.0747 | 5.9484 | 3456 | 0.7569 | 0.7823 | 0.7569 | 0.8700 |
| 0.0747 | 5.9518 | 3458 | 0.7634 | 0.7823 | 0.7634 | 0.8737 |
| 0.0747 | 5.9552 | 3460 | 0.7900 | 0.8019 | 0.7900 | 0.8888 |
| 0.0747 | 5.9587 | 3462 | 0.8167 | 0.8019 | 0.8167 | 0.9037 |
| 0.0747 | 5.9621 | 3464 | 0.8470 | 0.8019 | 0.8470 | 0.9203 |
| 0.0747 | 5.9656 | 3466 | 0.9000 | 0.7426 | 0.9000 | 0.9487 |
| 0.0747 | 5.9690 | 3468 | 0.9039 | 0.8019 | 0.9039 | 0.9507 |
| 0.0747 | 5.9725 | 3470 | 0.9062 | 0.7729 | 0.9062 | 0.9519 |
| 0.0747 | 5.9759 | 3472 | 0.8772 | 0.8019 | 0.8772 | 0.9366 |
| 0.0747 | 5.9793 | 3474 | 0.8620 | 0.8019 | 0.8620 | 0.9284 |
| 0.0747 | 5.9828 | 3476 | 0.8475 | 0.8019 | 0.8475 | 0.9206 |
| 0.0747 | 5.9862 | 3478 | 0.8537 | 0.8019 | 0.8537 | 0.9239 |
| 0.0747 | 5.9897 | 3480 | 0.8626 | 0.8019 | 0.8626 | 0.9288 |
| 0.0747 | 5.9931 | 3482 | 0.8315 | 0.8019 | 0.8315 | 0.9119 |
| 0.0747 | 5.9966 | 3484 | 0.7810 | 0.7801 | 0.7810 | 0.8837 |
| 0.0747 | 6.0000 | 3486 | 0.7569 | 0.7801 | 0.7569 | 0.8700 |
| 0.0747 | 6.0034 | 3488 | 0.7723 | 0.7801 | 0.7723 | 0.8788 |
| 0.0747 | 6.0069 | 3490 | 0.8112 | 0.7902 | 0.8112 | 0.9006 |
| 0.0747 | 6.0103 | 3492 | 0.8605 | 0.7426 | 0.8605 | 0.9276 |
| 0.0747 | 6.0138 | 3494 | 0.8741 | 0.7618 | 0.8741 | 0.9349 |
| 0.0747 | 6.0172 | 3496 | 0.8833 | 0.7618 | 0.8833 | 0.9399 |
| 0.0747 | 6.0207 | 3498 | 0.9205 | 0.7618 | 0.9205 | 0.9594 |
| 0.0652 | 6.0241 | 3500 | 0.9241 | 0.7618 | 0.9241 | 0.9613 |
| 0.0652 | 6.0275 | 3502 | 0.8828 | 0.7618 | 0.8828 | 0.9396 |
| 0.0652 | 6.0310 | 3504 | 0.8355 | 0.6988 | 0.8355 | 0.9140 |
| 0.0652 | 6.0344 | 3506 | 0.8342 | 0.6988 | 0.8342 | 0.9133 |
| 0.0652 | 6.0379 | 3508 | 0.8283 | 0.6988 | 0.8283 | 0.9101 |
| 0.0652 | 6.0413 | 3510 | 0.8423 | 0.6988 | 0.8423 | 0.9178 |
| 0.0652 | 6.0448 | 3512 | 0.8580 | 0.6988 | 0.8580 | 0.9263 |
| 0.0652 | 6.0482 | 3514 | 0.9147 | 0.6988 | 0.9147 | 0.9564 |
| 0.0652 | 6.0516 | 3516 | 0.9467 | 0.6988 | 0.9467 | 0.9730 |
| 0.0652 | 6.0551 | 3518 | 0.9593 | 0.6988 | 0.9593 | 0.9794 |
| 0.0652 | 6.0585 | 3520 | 0.9392 | 0.6988 | 0.9392 | 0.9691 |
| 0.0652 | 6.0620 | 3522 | 0.9293 | 0.6988 | 0.9293 | 0.9640 |
| 0.0652 | 6.0654 | 3524 | 0.9196 | 0.6988 | 0.9196 | 0.9589 |
| 0.0652 | 6.0688 | 3526 | 0.8870 | 0.6988 | 0.8870 | 0.9418 |
| 0.0652 | 6.0723 | 3528 | 0.8736 | 0.6988 | 0.8736 | 0.9346 |
| 0.0652 | 6.0757 | 3530 | 0.8650 | 0.6903 | 0.8650 | 0.9300 |
| 0.0652 | 6.0792 | 3532 | 0.8616 | 0.7618 | 0.8616 | 0.9282 |
| 0.0652 | 6.0826 | 3534 | 0.8543 | 0.7902 | 0.8543 | 0.9243 |
| 0.0652 | 6.0861 | 3536 | 0.8804 | 0.7618 | 0.8804 | 0.9383 |
| 0.0652 | 6.0895 | 3538 | 0.9158 | 0.7134 | 0.9158 | 0.9570 |
| 0.0652 | 6.0929 | 3540 | 0.9124 | 0.7134 | 0.9124 | 0.9552 |
| 0.0652 | 6.0964 | 3542 | 0.8869 | 0.7618 | 0.8869 | 0.9418 |
| 0.0652 | 6.0998 | 3544 | 0.8583 | 0.6988 | 0.8583 | 0.9265 |
| 0.0652 | 6.1033 | 3546 | 0.8511 | 0.6988 | 0.8511 | 0.9225 |
| 0.0652 | 6.1067 | 3548 | 0.8557 | 0.6988 | 0.8557 | 0.9250 |
| 0.0652 | 6.1102 | 3550 | 0.8710 | 0.6988 | 0.8710 | 0.9333 |
| 0.0652 | 6.1136 | 3552 | 0.8971 | 0.7618 | 0.8971 | 0.9472 |
| 0.0652 | 6.1170 | 3554 | 0.8946 | 0.7618 | 0.8946 | 0.9458 |
| 0.0652 | 6.1205 | 3556 | 0.8741 | 0.7618 | 0.8741 | 0.9350 |
| 0.0652 | 6.1239 | 3558 | 0.8403 | 0.7902 | 0.8403 | 0.9167 |
| 0.0652 | 6.1274 | 3560 | 0.8307 | 0.7797 | 0.8307 | 0.9114 |
| 0.0652 | 6.1308 | 3562 | 0.8082 | 0.7906 | 0.8082 | 0.8990 |
| 0.0652 | 6.1343 | 3564 | 0.7992 | 0.7232 | 0.7992 | 0.8940 |
| 0.0652 | 6.1377 | 3566 | 0.8221 | 0.7906 | 0.8221 | 0.9067 |
| 0.0652 | 6.1411 | 3568 | 0.8584 | 0.7902 | 0.8584 | 0.9265 |
| 0.0652 | 6.1446 | 3570 | 0.8839 | 0.7902 | 0.8839 | 0.9401 |
| 0.0652 | 6.1480 | 3572 | 0.9095 | 0.7902 | 0.9095 | 0.9537 |
| 0.0652 | 6.1515 | 3574 | 0.8919 | 0.7902 | 0.8919 | 0.9444 |
| 0.0652 | 6.1549 | 3576 | 0.8563 | 0.7797 | 0.8563 | 0.9253 |
| 0.0652 | 6.1583 | 3578 | 0.8053 | 0.7797 | 0.8053 | 0.8974 |
| 0.0652 | 6.1618 | 3580 | 0.7603 | 0.7232 | 0.7603 | 0.8719 |
| 0.0652 | 6.1652 | 3582 | 0.7467 | 0.7176 | 0.7467 | 0.8641 |
| 0.0652 | 6.1687 | 3584 | 0.7616 | 0.7689 | 0.7616 | 0.8727 |
| 0.0652 | 6.1721 | 3586 | 0.8052 | 0.7354 | 0.8052 | 0.8973 |
| 0.0652 | 6.1756 | 3588 | 0.8650 | 0.7342 | 0.8650 | 0.9300 |
| 0.0652 | 6.1790 | 3590 | 0.9277 | 0.7342 | 0.9277 | 0.9632 |
| 0.0652 | 6.1824 | 3592 | 0.9431 | 0.7342 | 0.9431 | 0.9711 |
| 0.0652 | 6.1859 | 3594 | 0.9232 | 0.7342 | 0.9232 | 0.9608 |
| 0.0652 | 6.1893 | 3596 | 0.9247 | 0.7426 | 0.9247 | 0.9616 |
| 0.0652 | 6.1928 | 3598 | 0.9505 | 0.7426 | 0.9505 | 0.9749 |
| 0.0652 | 6.1962 | 3600 | 0.9760 | 0.7134 | 0.9760 | 0.9880 |
| 0.0652 | 6.1997 | 3602 | 0.9643 | 0.7134 | 0.9643 | 0.9820 |
| 0.0652 | 6.2031 | 3604 | 0.9136 | 0.7902 | 0.9136 | 0.9558 |
| 0.0652 | 6.2065 | 3606 | 0.8451 | 0.7232 | 0.8451 | 0.9193 |
| 0.0652 | 6.2100 | 3608 | 0.8141 | 0.7232 | 0.8141 | 0.9023 |
| 0.0652 | 6.2134 | 3610 | 0.8230 | 0.7232 | 0.8230 | 0.9072 |
| 0.0652 | 6.2169 | 3612 | 0.8329 | 0.7232 | 0.8329 | 0.9126 |
| 0.0652 | 6.2203 | 3614 | 0.8557 | 0.7906 | 0.8557 | 0.9250 |
| 0.0652 | 6.2238 | 3616 | 0.8733 | 0.7797 | 0.8733 | 0.9345 |
| 0.0652 | 6.2272 | 3618 | 0.8996 | 0.7342 | 0.8996 | 0.9485 |
| 0.0652 | 6.2306 | 3620 | 0.9457 | 0.7426 | 0.9457 | 0.9725 |
| 0.0652 | 6.2341 | 3622 | 0.9779 | 0.7134 | 0.9779 | 0.9889 |
| 0.0652 | 6.2375 | 3624 | 1.0068 | 0.7134 | 1.0068 | 1.0034 |
| 0.0652 | 6.2410 | 3626 | 0.9875 | 0.7134 | 0.9875 | 0.9937 |
| 0.0652 | 6.2444 | 3628 | 0.9548 | 0.7618 | 0.9548 | 0.9771 |
| 0.0652 | 6.2478 | 3630 | 0.9490 | 0.7618 | 0.9490 | 0.9741 |
| 0.0652 | 6.2513 | 3632 | 0.8985 | 0.7729 | 0.8985 | 0.9479 |
| 0.0652 | 6.2547 | 3634 | 0.8454 | 0.7232 | 0.8454 | 0.9194 |
| 0.0652 | 6.2582 | 3636 | 0.8177 | 0.7232 | 0.8177 | 0.9042 |
| 0.0652 | 6.2616 | 3638 | 0.8138 | 0.7906 | 0.8138 | 0.9021 |
| 0.0652 | 6.2651 | 3640 | 0.8294 | 0.7906 | 0.8294 | 0.9107 |
| 0.0652 | 6.2685 | 3642 | 0.8425 | 0.7342 | 0.8425 | 0.9179 |
| 0.0652 | 6.2719 | 3644 | 0.8719 | 0.7342 | 0.8719 | 0.9337 |
| 0.0652 | 6.2754 | 3646 | 0.8603 | 0.7354 | 0.8603 | 0.9276 |
| 0.0652 | 6.2788 | 3648 | 0.8350 | 0.7281 | 0.8350 | 0.9138 |
| 0.0652 | 6.2823 | 3650 | 0.8428 | 0.7354 | 0.8428 | 0.9180 |
| 0.0652 | 6.2857 | 3652 | 0.8639 | 0.7354 | 0.8639 | 0.9295 |
| 0.0652 | 6.2892 | 3654 | 0.8409 | 0.7354 | 0.8409 | 0.9170 |
| 0.0652 | 6.2926 | 3656 | 0.7942 | 0.7879 | 0.7942 | 0.8912 |
| 0.0652 | 6.2960 | 3658 | 0.7721 | 0.7704 | 0.7721 | 0.8787 |
| 0.0652 | 6.2995 | 3660 | 0.7744 | 0.7704 | 0.7744 | 0.8800 |
| 0.0652 | 6.3029 | 3662 | 0.7715 | 0.7704 | 0.7715 | 0.8784 |
| 0.0652 | 6.3064 | 3664 | 0.7891 | 0.7704 | 0.7891 | 0.8883 |
| 0.0652 | 6.3098 | 3666 | 0.8017 | 0.7717 | 0.8017 | 0.8954 |
| 0.0652 | 6.3133 | 3668 | 0.8106 | 0.7717 | 0.8106 | 0.9003 |
| 0.0652 | 6.3167 | 3670 | 0.8523 | 0.7623 | 0.8523 | 0.9232 |
| 0.0652 | 6.3201 | 3672 | 0.8875 | 0.7139 | 0.8875 | 0.9421 |
| 0.0652 | 6.3236 | 3674 | 0.9202 | 0.7058 | 0.9202 | 0.9592 |
| 0.0652 | 6.3270 | 3676 | 0.9119 | 0.7623 | 0.9119 | 0.9549 |
| 0.0652 | 6.3305 | 3678 | 0.8903 | 0.7623 | 0.8903 | 0.9436 |
| 0.0652 | 6.3339 | 3680 | 0.8386 | 0.7717 | 0.8386 | 0.9157 |
| 0.0652 | 6.3373 | 3682 | 0.7973 | 0.7037 | 0.7973 | 0.8929 |
| 0.0652 | 6.3408 | 3684 | 0.7719 | 0.6893 | 0.7719 | 0.8786 |
| 0.0652 | 6.3442 | 3686 | 0.7605 | 0.6829 | 0.7605 | 0.8721 |
| 0.0652 | 6.3477 | 3688 | 0.7730 | 0.7704 | 0.7730 | 0.8792 |
| 0.0652 | 6.3511 | 3690 | 0.8101 | 0.7879 | 0.8101 | 0.9001 |
| 0.0652 | 6.3546 | 3692 | 0.8669 | 0.7342 | 0.8669 | 0.9311 |
| 0.0652 | 6.3580 | 3694 | 0.9436 | 0.7058 | 0.9436 | 0.9714 |
| 0.0652 | 6.3614 | 3696 | 0.9856 | 0.7058 | 0.9856 | 0.9928 |
| 0.0652 | 6.3649 | 3698 | 0.9733 | 0.7058 | 0.9733 | 0.9866 |
| 0.0652 | 6.3683 | 3700 | 0.9180 | 0.7058 | 0.9180 | 0.9581 |
| 0.0652 | 6.3718 | 3702 | 0.8481 | 0.7430 | 0.8481 | 0.9209 |
| 0.0652 | 6.3752 | 3704 | 0.7893 | 0.7717 | 0.7893 | 0.8884 |
| 0.0652 | 6.3787 | 3706 | 0.7619 | 0.7529 | 0.7619 | 0.8729 |
| 0.0652 | 6.3821 | 3708 | 0.7638 | 0.7529 | 0.7638 | 0.8739 |
| 0.0652 | 6.3855 | 3710 | 0.7668 | 0.7529 | 0.7668 | 0.8757 |
| 0.0652 | 6.3890 | 3712 | 0.7691 | 0.7529 | 0.7691 | 0.8770 |
| 0.0652 | 6.3924 | 3714 | 0.7895 | 0.7717 | 0.7895 | 0.8885 |
| 0.0652 | 6.3959 | 3716 | 0.8179 | 0.7342 | 0.8179 | 0.9044 |
| 0.0652 | 6.3993 | 3718 | 0.8817 | 0.7342 | 0.8817 | 0.9390 |
| 0.0652 | 6.4028 | 3720 | 0.9613 | 0.7337 | 0.9613 | 0.9804 |
| 0.0652 | 6.4062 | 3722 | 1.0041 | 0.7337 | 1.0041 | 1.0020 |
| 0.0652 | 6.4096 | 3724 | 0.9846 | 0.7337 | 0.9846 | 0.9923 |
| 0.0652 | 6.4131 | 3726 | 0.9269 | 0.7058 | 0.9269 | 0.9627 |
| 0.0652 | 6.4165 | 3728 | 0.8507 | 0.7342 | 0.8507 | 0.9223 |
| 0.0652 | 6.4200 | 3730 | 0.8204 | 0.7430 | 0.8204 | 0.9058 |
| 0.0652 | 6.4234 | 3732 | 0.8113 | 0.7717 | 0.8113 | 0.9007 |
| 0.0652 | 6.4269 | 3734 | 0.8332 | 0.7623 | 0.8332 | 0.9128 |
| 0.0652 | 6.4303 | 3736 | 0.8641 | 0.7623 | 0.8641 | 0.9296 |
| 0.0652 | 6.4337 | 3738 | 0.8900 | 0.7623 | 0.8900 | 0.9434 |
| 0.0652 | 6.4372 | 3740 | 0.9218 | 0.7139 | 0.9218 | 0.9601 |
| 0.0652 | 6.4406 | 3742 | 0.9525 | 0.7058 | 0.9525 | 0.9760 |
| 0.0652 | 6.4441 | 3744 | 0.9374 | 0.7058 | 0.9374 | 0.9682 |
| 0.0652 | 6.4475 | 3746 | 0.8802 | 0.7058 | 0.8802 | 0.9382 |
| 0.0652 | 6.4509 | 3748 | 0.8357 | 0.7342 | 0.8357 | 0.9142 |
| 0.0652 | 6.4544 | 3750 | 0.8013 | 0.7906 | 0.8013 | 0.8952 |
| 0.0652 | 6.4578 | 3752 | 0.7908 | 0.7906 | 0.7908 | 0.8893 |
| 0.0652 | 6.4613 | 3754 | 0.7902 | 0.7906 | 0.7902 | 0.8889 |
| 0.0652 | 6.4647 | 3756 | 0.8116 | 0.7906 | 0.8116 | 0.9009 |
| 0.0652 | 6.4682 | 3758 | 0.8203 | 0.7906 | 0.8203 | 0.9057 |
| 0.0652 | 6.4716 | 3760 | 0.8345 | 0.7906 | 0.8345 | 0.9135 |
| 0.0652 | 6.4750 | 3762 | 0.8399 | 0.7906 | 0.8399 | 0.9165 |
| 0.0652 | 6.4785 | 3764 | 0.8616 | 0.7729 | 0.8616 | 0.9282 |
| 0.0652 | 6.4819 | 3766 | 0.8572 | 0.7729 | 0.8572 | 0.9259 |
| 0.0652 | 6.4854 | 3768 | 0.8627 | 0.7729 | 0.8627 | 0.9288 |
| 0.0652 | 6.4888 | 3770 | 0.8758 | 0.7729 | 0.8758 | 0.9358 |
| 0.0652 | 6.4923 | 3772 | 0.8558 | 0.7729 | 0.8558 | 0.9251 |
| 0.0652 | 6.4957 | 3774 | 0.8117 | 0.7623 | 0.8117 | 0.9009 |
| 0.0652 | 6.4991 | 3776 | 0.7964 | 0.6706 | 0.7964 | 0.8924 |
| 0.0652 | 6.5026 | 3778 | 0.8109 | 0.6909 | 0.8109 | 0.9005 |
| 0.0652 | 6.5060 | 3780 | 0.8317 | 0.7623 | 0.8317 | 0.9120 |
| 0.0652 | 6.5095 | 3782 | 0.8441 | 0.7623 | 0.8441 | 0.9188 |
| 0.0652 | 6.5129 | 3784 | 0.8312 | 0.7623 | 0.8312 | 0.9117 |
| 0.0652 | 6.5164 | 3786 | 0.8023 | 0.7520 | 0.8023 | 0.8957 |
| 0.0652 | 6.5198 | 3788 | 0.7806 | 0.7520 | 0.7806 | 0.8835 |
| 0.0652 | 6.5232 | 3790 | 0.7778 | 0.7520 | 0.7778 | 0.8819 |
| 0.0652 | 6.5267 | 3792 | 0.8021 | 0.7058 | 0.8021 | 0.8956 |
| 0.0652 | 6.5301 | 3794 | 0.8382 | 0.7058 | 0.8382 | 0.9155 |
| 0.0652 | 6.5336 | 3796 | 0.8740 | 0.7058 | 0.8740 | 0.9349 |
| 0.0652 | 6.5370 | 3798 | 0.8763 | 0.7058 | 0.8763 | 0.9361 |
| 0.0652 | 6.5404 | 3800 | 0.8751 | 0.7058 | 0.8751 | 0.9355 |
| 0.0652 | 6.5439 | 3802 | 0.8538 | 0.7058 | 0.8538 | 0.9240 |
| 0.0652 | 6.5473 | 3804 | 0.8194 | 0.7520 | 0.8194 | 0.9052 |
| 0.0652 | 6.5508 | 3806 | 0.7947 | 0.7520 | 0.7947 | 0.8915 |
| 0.0652 | 6.5542 | 3808 | 0.7959 | 0.7623 | 0.7959 | 0.8921 |
| 0.0652 | 6.5577 | 3810 | 0.8016 | 0.7623 | 0.8016 | 0.8953 |
| 0.0652 | 6.5611 | 3812 | 0.8155 | 0.7520 | 0.8155 | 0.9031 |
| 0.0652 | 6.5645 | 3814 | 0.8440 | 0.7520 | 0.8440 | 0.9187 |
| 0.0652 | 6.5680 | 3816 | 0.8704 | 0.7058 | 0.8704 | 0.9330 |
| 0.0652 | 6.5714 | 3818 | 0.8823 | 0.7520 | 0.8823 | 0.9393 |
| 0.0652 | 6.5749 | 3820 | 0.9029 | 0.7520 | 0.9029 | 0.9502 |
| 0.0652 | 6.5783 | 3822 | 0.8794 | 0.7623 | 0.8794 | 0.9378 |
| 0.0652 | 6.5818 | 3824 | 0.8351 | 0.7623 | 0.8351 | 0.9138 |
| 0.0652 | 6.5852 | 3826 | 0.7757 | 0.6706 | 0.7757 | 0.8807 |
| 0.0652 | 6.5886 | 3828 | 0.7468 | 0.6842 | 0.7468 | 0.8642 |
| 0.0652 | 6.5921 | 3830 | 0.7398 | 0.6842 | 0.7398 | 0.8601 |
| 0.0652 | 6.5955 | 3832 | 0.7512 | 0.6842 | 0.7512 | 0.8667 |
| 0.0652 | 6.5990 | 3834 | 0.7754 | 0.6909 | 0.7754 | 0.8806 |
| 0.0652 | 6.6024 | 3836 | 0.8067 | 0.7623 | 0.8067 | 0.8982 |
| 0.0652 | 6.6059 | 3838 | 0.8377 | 0.7623 | 0.8377 | 0.9153 |
| 0.0652 | 6.6093 | 3840 | 0.8488 | 0.7623 | 0.8488 | 0.9213 |
| 0.0652 | 6.6127 | 3842 | 0.8338 | 0.7623 | 0.8338 | 0.9131 |
| 0.0652 | 6.6162 | 3844 | 0.8055 | 0.7623 | 0.8055 | 0.8975 |
| 0.0652 | 6.6196 | 3846 | 0.8144 | 0.7623 | 0.8144 | 0.9025 |
| 0.0652 | 6.6231 | 3848 | 0.8153 | 0.7623 | 0.8153 | 0.9029 |
| 0.0652 | 6.6265 | 3850 | 0.8044 | 0.7623 | 0.8044 | 0.8969 |
| 0.0652 | 6.6299 | 3852 | 0.8136 | 0.7623 | 0.8136 | 0.9020 |
| 0.0652 | 6.6334 | 3854 | 0.8332 | 0.7623 | 0.8332 | 0.9128 |
| 0.0652 | 6.6368 | 3856 | 0.8622 | 0.7623 | 0.8622 | 0.9286 |
| 0.0652 | 6.6403 | 3858 | 0.9230 | 0.7058 | 0.9230 | 0.9607 |
| 0.0652 | 6.6437 | 3860 | 0.9468 | 0.6871 | 0.9468 | 0.9730 |
| 0.0652 | 6.6472 | 3862 | 0.9231 | 0.6871 | 0.9231 | 0.9608 |
| 0.0652 | 6.6506 | 3864 | 0.8706 | 0.7058 | 0.8706 | 0.9331 |
| 0.0652 | 6.6540 | 3866 | 0.8433 | 0.7623 | 0.8433 | 0.9183 |
| 0.0652 | 6.6575 | 3868 | 0.7990 | 0.7906 | 0.7990 | 0.8939 |
| 0.0652 | 6.6609 | 3870 | 0.7605 | 0.7906 | 0.7605 | 0.8721 |
| 0.0652 | 6.6644 | 3872 | 0.7642 | 0.7906 | 0.7642 | 0.8742 |
| 0.0652 | 6.6678 | 3874 | 0.7861 | 0.7906 | 0.7861 | 0.8866 |
| 0.0652 | 6.6713 | 3876 | 0.8091 | 0.7906 | 0.8091 | 0.8995 |
| 0.0652 | 6.6747 | 3878 | 0.8305 | 0.7342 | 0.8305 | 0.9113 |
| 0.0652 | 6.6781 | 3880 | 0.8590 | 0.7058 | 0.8590 | 0.9268 |
| 0.0652 | 6.6816 | 3882 | 0.8658 | 0.7058 | 0.8658 | 0.9305 |
| 0.0652 | 6.6850 | 3884 | 0.8316 | 0.7342 | 0.8316 | 0.9119 |
| 0.0652 | 6.6885 | 3886 | 0.8069 | 0.7342 | 0.8069 | 0.8983 |
| 0.0652 | 6.6919 | 3888 | 0.7626 | 0.7430 | 0.7626 | 0.8733 |
| 0.0652 | 6.6954 | 3890 | 0.7417 | 0.7717 | 0.7417 | 0.8612 |
| 0.0652 | 6.6988 | 3892 | 0.7568 | 0.7717 | 0.7568 | 0.8699 |
| 0.0652 | 6.7022 | 3894 | 0.7786 | 0.7906 | 0.7786 | 0.8824 |
| 0.0652 | 6.7057 | 3896 | 0.8169 | 0.7623 | 0.8169 | 0.9039 |
| 0.0652 | 6.7091 | 3898 | 0.8425 | 0.7623 | 0.8425 | 0.9179 |
| 0.0652 | 6.7126 | 3900 | 0.8412 | 0.7623 | 0.8412 | 0.9171 |
| 0.0652 | 6.7160 | 3902 | 0.8191 | 0.7906 | 0.8191 | 0.9050 |
| 0.0652 | 6.7194 | 3904 | 0.8170 | 0.7906 | 0.8170 | 0.9039 |
| 0.0652 | 6.7229 | 3906 | 0.8193 | 0.7906 | 0.8193 | 0.9051 |
| 0.0652 | 6.7263 | 3908 | 0.8486 | 0.7906 | 0.8486 | 0.9212 |
| 0.0652 | 6.7298 | 3910 | 0.9102 | 0.7058 | 0.9102 | 0.9541 |
| 0.0652 | 6.7332 | 3912 | 0.9699 | 0.7058 | 0.9699 | 0.9848 |
| 0.0652 | 6.7367 | 3914 | 0.9733 | 0.7058 | 0.9733 | 0.9866 |
| 0.0652 | 6.7401 | 3916 | 0.9352 | 0.7058 | 0.9352 | 0.9671 |
| 0.0652 | 6.7435 | 3918 | 0.8674 | 0.7342 | 0.8674 | 0.9314 |
| 0.0652 | 6.7470 | 3920 | 0.7866 | 0.7906 | 0.7866 | 0.8869 |
| 0.0652 | 6.7504 | 3922 | 0.7306 | 0.7620 | 0.7306 | 0.8547 |
| 0.0652 | 6.7539 | 3924 | 0.7142 | 0.7620 | 0.7142 | 0.8451 |
| 0.0652 | 6.7573 | 3926 | 0.7211 | 0.7620 | 0.7211 | 0.8492 |
| 0.0652 | 6.7608 | 3928 | 0.7491 | 0.7620 | 0.7491 | 0.8655 |
| 0.0652 | 6.7642 | 3930 | 0.7976 | 0.7797 | 0.7976 | 0.8931 |
| 0.0652 | 6.7676 | 3932 | 0.8499 | 0.7342 | 0.8499 | 0.9219 |
| 0.0652 | 6.7711 | 3934 | 0.8539 | 0.7797 | 0.8539 | 0.9240 |
| 0.0652 | 6.7745 | 3936 | 0.8567 | 0.7797 | 0.8567 | 0.9256 |
| 0.0652 | 6.7780 | 3938 | 0.8593 | 0.7342 | 0.8593 | 0.9270 |
| 0.0652 | 6.7814 | 3940 | 0.8602 | 0.7797 | 0.8602 | 0.9275 |
| 0.0652 | 6.7849 | 3942 | 0.8297 | 0.7797 | 0.8297 | 0.9109 |
| 0.0652 | 6.7883 | 3944 | 0.8142 | 0.7797 | 0.8142 | 0.9023 |
| 0.0652 | 6.7917 | 3946 | 0.7916 | 0.7797 | 0.7916 | 0.8897 |
| 0.0652 | 6.7952 | 3948 | 0.7806 | 0.7906 | 0.7806 | 0.8835 |
| 0.0652 | 6.7986 | 3950 | 0.7443 | 0.7906 | 0.7443 | 0.8627 |
| 0.0652 | 6.8021 | 3952 | 0.7240 | 0.7614 | 0.7240 | 0.8509 |
| 0.0652 | 6.8055 | 3954 | 0.7274 | 0.7782 | 0.7274 | 0.8529 |
| 0.0652 | 6.8090 | 3956 | 0.7229 | 0.7782 | 0.7229 | 0.8503 |
| 0.0652 | 6.8124 | 3958 | 0.7387 | 0.7782 | 0.7387 | 0.8595 |
| 0.0652 | 6.8158 | 3960 | 0.7655 | 0.7782 | 0.7655 | 0.8749 |
| 0.0652 | 6.8193 | 3962 | 0.7952 | 0.7689 | 0.7952 | 0.8917 |
| 0.0652 | 6.8227 | 3964 | 0.8225 | 0.7264 | 0.8225 | 0.9069 |
| 0.0652 | 6.8262 | 3966 | 0.8571 | 0.7342 | 0.8571 | 0.9258 |
| 0.0652 | 6.8296 | 3968 | 0.8705 | 0.7342 | 0.8705 | 0.9330 |
| 0.0652 | 6.8330 | 3970 | 0.8907 | 0.7342 | 0.8907 | 0.9438 |
| 0.0652 | 6.8365 | 3972 | 0.8972 | 0.7342 | 0.8972 | 0.9472 |
| 0.0652 | 6.8399 | 3974 | 0.8858 | 0.7342 | 0.8858 | 0.9412 |
| 0.0652 | 6.8434 | 3976 | 0.8673 | 0.7906 | 0.8673 | 0.9313 |
| 0.0652 | 6.8468 | 3978 | 0.8834 | 0.7906 | 0.8834 | 0.9399 |
| 0.0652 | 6.8503 | 3980 | 0.8876 | 0.7623 | 0.8876 | 0.9421 |
| 0.0652 | 6.8537 | 3982 | 0.8775 | 0.7623 | 0.8775 | 0.9368 |
| 0.0652 | 6.8571 | 3984 | 0.8667 | 0.7623 | 0.8667 | 0.9310 |
| 0.0652 | 6.8606 | 3986 | 0.8655 | 0.7623 | 0.8655 | 0.9303 |
| 0.0652 | 6.8640 | 3988 | 0.8627 | 0.7623 | 0.8627 | 0.9288 |
| 0.0652 | 6.8675 | 3990 | 0.8813 | 0.7520 | 0.8813 | 0.9388 |
| 0.0652 | 6.8709 | 3992 | 0.8971 | 0.7520 | 0.8971 | 0.9472 |
| 0.0652 | 6.8744 | 3994 | 0.9183 | 0.7520 | 0.9183 | 0.9583 |
| 0.0652 | 6.8778 | 3996 | 0.9346 | 0.7520 | 0.9346 | 0.9667 |
| 0.0652 | 6.8812 | 3998 | 0.9157 | 0.7520 | 0.9157 | 0.9569 |
| 0.0553 | 6.8847 | 4000 | 0.8700 | 0.7623 | 0.8700 | 0.9327 |
| 0.0553 | 6.8881 | 4002 | 0.8454 | 0.7623 | 0.8454 | 0.9195 |
| 0.0553 | 6.8916 | 4004 | 0.8371 | 0.7906 | 0.8371 | 0.9150 |
| 0.0553 | 6.8950 | 4006 | 0.8322 | 0.7906 | 0.8322 | 0.9122 |
| 0.0553 | 6.8985 | 4008 | 0.8266 | 0.7906 | 0.8266 | 0.9092 |
| 0.0553 | 6.9019 | 4010 | 0.8392 | 0.7797 | 0.8392 | 0.9161 |
| 0.0553 | 6.9053 | 4012 | 0.8561 | 0.7797 | 0.8561 | 0.9253 |
| 0.0553 | 6.9088 | 4014 | 0.9040 | 0.7520 | 0.9040 | 0.9508 |
| 0.0553 | 6.9122 | 4016 | 0.9592 | 0.7058 | 0.9592 | 0.9794 |
| 0.0553 | 6.9157 | 4018 | 0.9799 | 0.7058 | 0.9799 | 0.9899 |
| 0.0553 | 6.9191 | 4020 | 0.9934 | 0.7058 | 0.9934 | 0.9967 |
| 0.0553 | 6.9225 | 4022 | 0.9612 | 0.7058 | 0.9612 | 0.9804 |
| 0.0553 | 6.9260 | 4024 | 0.8977 | 0.7520 | 0.8977 | 0.9475 |
| 0.0553 | 6.9294 | 4026 | 0.8221 | 0.7797 | 0.8221 | 0.9067 |
| 0.0553 | 6.9329 | 4028 | 0.7536 | 0.7717 | 0.7536 | 0.8681 |
| 0.0553 | 6.9363 | 4030 | 0.7257 | 0.7439 | 0.7257 | 0.8519 |
| 0.0553 | 6.9398 | 4032 | 0.7354 | 0.7439 | 0.7354 | 0.8575 |
| 0.0553 | 6.9432 | 4034 | 0.7739 | 0.7906 | 0.7739 | 0.8797 |
| 0.0553 | 6.9466 | 4036 | 0.8302 | 0.7797 | 0.8302 | 0.9112 |
| 0.0553 | 6.9501 | 4038 | 0.8814 | 0.7520 | 0.8814 | 0.9389 |
| 0.0553 | 6.9535 | 4040 | 0.9398 | 0.7520 | 0.9398 | 0.9694 |
| 0.0553 | 6.9570 | 4042 | 0.9961 | 0.7134 | 0.9961 | 0.9980 |
| 0.0553 | 6.9604 | 4044 | 0.9993 | 0.7134 | 0.9993 | 0.9996 |
| 0.0553 | 6.9639 | 4046 | 0.9622 | 0.7618 | 0.9622 | 0.9809 |
| 0.0553 | 6.9673 | 4048 | 0.8996 | 0.7520 | 0.8996 | 0.9485 |
| 0.0553 | 6.9707 | 4050 | 0.8356 | 0.7623 | 0.8356 | 0.9141 |
| 0.0553 | 6.9742 | 4052 | 0.7801 | 0.7623 | 0.7801 | 0.8833 |
| 0.0553 | 6.9776 | 4054 | 0.7540 | 0.7232 | 0.7540 | 0.8683 |
| 0.0553 | 6.9811 | 4056 | 0.7566 | 0.7427 | 0.7566 | 0.8698 |
| 0.0553 | 6.9845 | 4058 | 0.7764 | 0.7520 | 0.7764 | 0.8811 |
| 0.0553 | 6.9880 | 4060 | 0.7997 | 0.7520 | 0.7997 | 0.8943 |
| 0.0553 | 6.9914 | 4062 | 0.8331 | 0.7520 | 0.8331 | 0.9127 |
| 0.0553 | 6.9948 | 4064 | 0.8441 | 0.7520 | 0.8441 | 0.9188 |
| 0.0553 | 6.9983 | 4066 | 0.8335 | 0.7520 | 0.8335 | 0.9130 |
| 0.0553 | 7.0017 | 4068 | 0.8478 | 0.7520 | 0.8478 | 0.9207 |
| 0.0553 | 7.0052 | 4070 | 0.8977 | 0.7520 | 0.8977 | 0.9475 |
| 0.0553 | 7.0086 | 4072 | 0.9286 | 0.7520 | 0.9286 | 0.9636 |
| 0.0553 | 7.0120 | 4074 | 0.9271 | 0.7520 | 0.9271 | 0.9628 |
| 0.0553 | 7.0155 | 4076 | 0.8949 | 0.7623 | 0.8949 | 0.9460 |
| 0.0553 | 7.0189 | 4078 | 0.8815 | 0.7623 | 0.8815 | 0.9389 |
| 0.0553 | 7.0224 | 4080 | 0.8830 | 0.7623 | 0.8830 | 0.9397 |
| 0.0553 | 7.0258 | 4082 | 0.8926 | 0.7520 | 0.8926 | 0.9448 |
| 0.0553 | 7.0293 | 4084 | 0.9091 | 0.7520 | 0.9091 | 0.9535 |
| 0.0553 | 7.0327 | 4086 | 0.9215 | 0.7520 | 0.9215 | 0.9600 |
| 0.0553 | 7.0361 | 4088 | 0.9228 | 0.7520 | 0.9228 | 0.9606 |
| 0.0553 | 7.0396 | 4090 | 0.9481 | 0.7520 | 0.9481 | 0.9737 |
| 0.0553 | 7.0430 | 4092 | 0.9505 | 0.7520 | 0.9505 | 0.9749 |
| 0.0553 | 7.0465 | 4094 | 0.9685 | 0.7520 | 0.9685 | 0.9841 |
| 0.0553 | 7.0499 | 4096 | 0.9726 | 0.7520 | 0.9726 | 0.9862 |
| 0.0553 | 7.0534 | 4098 | 0.9389 | 0.7520 | 0.9389 | 0.9690 |
| 0.0553 | 7.0568 | 4100 | 0.9113 | 0.7520 | 0.9113 | 0.9546 |
| 0.0553 | 7.0602 | 4102 | 0.8989 | 0.7520 | 0.8989 | 0.9481 |
| 0.0553 | 7.0637 | 4104 | 0.9127 | 0.7520 | 0.9127 | 0.9553 |
| 0.0553 | 7.0671 | 4106 | 0.9443 | 0.7520 | 0.9443 | 0.9717 |
| 0.0553 | 7.0706 | 4108 | 0.9621 | 0.7623 | 0.9621 | 0.9809 |
| 0.0553 | 7.0740 | 4110 | 0.9632 | 0.7623 | 0.9632 | 0.9814 |
| 0.0553 | 7.0775 | 4112 | 0.9533 | 0.7623 | 0.9533 | 0.9764 |
| 0.0553 | 7.0809 | 4114 | 0.9583 | 0.7623 | 0.9583 | 0.9789 |
| 0.0553 | 7.0843 | 4116 | 0.9793 | 0.7520 | 0.9793 | 0.9896 |
| 0.0553 | 7.0878 | 4118 | 0.9716 | 0.7520 | 0.9716 | 0.9857 |
| 0.0553 | 7.0912 | 4120 | 0.9283 | 0.7520 | 0.9283 | 0.9635 |
| 0.0553 | 7.0947 | 4122 | 0.8949 | 0.7520 | 0.8949 | 0.9460 |
| 0.0553 | 7.0981 | 4124 | 0.8712 | 0.7520 | 0.8712 | 0.9334 |
| 0.0553 | 7.1015 | 4126 | 0.8814 | 0.7058 | 0.8814 | 0.9388 |
| 0.0553 | 7.1050 | 4128 | 0.9258 | 0.7058 | 0.9258 | 0.9622 |
| 0.0553 | 7.1084 | 4130 | 0.9420 | 0.7058 | 0.9420 | 0.9706 |
| 0.0553 | 7.1119 | 4132 | 0.9443 | 0.7058 | 0.9443 | 0.9718 |
| 0.0553 | 7.1153 | 4134 | 0.9452 | 0.7058 | 0.9452 | 0.9722 |
| 0.0553 | 7.1188 | 4136 | 0.9270 | 0.7058 | 0.9270 | 0.9628 |
| 0.0553 | 7.1222 | 4138 | 0.9054 | 0.7058 | 0.9054 | 0.9515 |
| 0.0553 | 7.1256 | 4140 | 0.8668 | 0.7520 | 0.8668 | 0.9310 |
| 0.0553 | 7.1291 | 4142 | 0.8435 | 0.7623 | 0.8435 | 0.9184 |
| 0.0553 | 7.1325 | 4144 | 0.8388 | 0.7623 | 0.8388 | 0.9159 |
| 0.0553 | 7.1360 | 4146 | 0.8275 | 0.7623 | 0.8275 | 0.9097 |
| 0.0553 | 7.1394 | 4148 | 0.8160 | 0.7623 | 0.8160 | 0.9033 |
| 0.0553 | 7.1429 | 4150 | 0.8234 | 0.7623 | 0.8234 | 0.9074 |
| 0.0553 | 7.1463 | 4152 | 0.8574 | 0.7058 | 0.8574 | 0.9260 |
| 0.0553 | 7.1497 | 4154 | 0.8568 | 0.7058 | 0.8568 | 0.9256 |
| 0.0553 | 7.1532 | 4156 | 0.8205 | 0.7058 | 0.8205 | 0.9058 |
| 0.0553 | 7.1566 | 4158 | 0.8106 | 0.7139 | 0.8106 | 0.9004 |
| 0.0553 | 7.1601 | 4160 | 0.7977 | 0.7623 | 0.7977 | 0.8932 |
| 0.0553 | 7.1635 | 4162 | 0.8233 | 0.7623 | 0.8233 | 0.9073 |
| 0.0553 | 7.1670 | 4164 | 0.8299 | 0.7623 | 0.8299 | 0.9110 |
| 0.0553 | 7.1704 | 4166 | 0.8448 | 0.7623 | 0.8448 | 0.9191 |
| 0.0553 | 7.1738 | 4168 | 0.8452 | 0.7729 | 0.8452 | 0.9193 |
| 0.0553 | 7.1773 | 4170 | 0.8333 | 0.7525 | 0.8333 | 0.9129 |
| 0.0553 | 7.1807 | 4172 | 0.8125 | 0.7321 | 0.8125 | 0.9014 |
| 0.0553 | 7.1842 | 4174 | 0.8147 | 0.7321 | 0.8147 | 0.9026 |
| 0.0553 | 7.1876 | 4176 | 0.8238 | 0.7525 | 0.8238 | 0.9077 |
| 0.0553 | 7.1910 | 4178 | 0.8546 | 0.7525 | 0.8546 | 0.9244 |
| 0.0553 | 7.1945 | 4180 | 0.8565 | 0.7729 | 0.8565 | 0.9255 |
| 0.0553 | 7.1979 | 4182 | 0.8683 | 0.7729 | 0.8683 | 0.9318 |
| 0.0553 | 7.2014 | 4184 | 0.8543 | 0.7623 | 0.8543 | 0.9243 |
| 0.0553 | 7.2048 | 4186 | 0.8500 | 0.7139 | 0.8500 | 0.9219 |
| 0.0553 | 7.2083 | 4188 | 0.8673 | 0.7058 | 0.8673 | 0.9313 |
| 0.0553 | 7.2117 | 4190 | 0.8928 | 0.7058 | 0.8928 | 0.9449 |
| 0.0553 | 7.2151 | 4192 | 0.9480 | 0.7134 | 0.9480 | 0.9737 |
| 0.0553 | 7.2186 | 4194 | 0.9689 | 0.7134 | 0.9689 | 0.9843 |
| 0.0553 | 7.2220 | 4196 | 0.9629 | 0.7134 | 0.9629 | 0.9813 |
| 0.0553 | 7.2255 | 4198 | 0.9649 | 0.7216 | 0.9649 | 0.9823 |
| 0.0553 | 7.2289 | 4200 | 0.9620 | 0.7216 | 0.9620 | 0.9808 |
| 0.0553 | 7.2324 | 4202 | 0.9605 | 0.7216 | 0.9605 | 0.9800 |
| 0.0553 | 7.2358 | 4204 | 0.9598 | 0.7844 | 0.9598 | 0.9797 |
| 0.0553 | 7.2392 | 4206 | 0.9503 | 0.7074 | 0.9503 | 0.9748 |
| 0.0553 | 7.2427 | 4208 | 0.9367 | 0.7074 | 0.9367 | 0.9678 |
| 0.0553 | 7.2461 | 4210 | 0.9219 | 0.7074 | 0.9219 | 0.9601 |
| 0.0553 | 7.2496 | 4212 | 0.9332 | 0.7074 | 0.9332 | 0.9660 |
| 0.0553 | 7.2530 | 4214 | 0.9694 | 0.7074 | 0.9694 | 0.9846 |
| 0.0553 | 7.2565 | 4216 | 0.9778 | 0.7844 | 0.9778 | 0.9888 |
| 0.0553 | 7.2599 | 4218 | 0.9615 | 0.7844 | 0.9615 | 0.9806 |
| 0.0553 | 7.2633 | 4220 | 0.9302 | 0.7844 | 0.9302 | 0.9645 |
| 0.0553 | 7.2668 | 4222 | 0.8716 | 0.6988 | 0.8716 | 0.9336 |
| 0.0553 | 7.2702 | 4224 | 0.8074 | 0.6503 | 0.8074 | 0.8986 |
| 0.0553 | 7.2737 | 4226 | 0.7698 | 0.6503 | 0.7698 | 0.8774 |
| 0.0553 | 7.2771 | 4228 | 0.7661 | 0.6503 | 0.7661 | 0.8753 |
| 0.0553 | 7.2806 | 4230 | 0.7814 | 0.6503 | 0.7814 | 0.8840 |
| 0.0553 | 7.2840 | 4232 | 0.8173 | 0.7623 | 0.8173 | 0.9041 |
| 0.0553 | 7.2874 | 4234 | 0.8714 | 0.7058 | 0.8714 | 0.9335 |
| 0.0553 | 7.2909 | 4236 | 0.9402 | 0.7134 | 0.9402 | 0.9697 |
| 0.0553 | 7.2943 | 4238 | 0.9622 | 0.7134 | 0.9622 | 0.9809 |
| 0.0553 | 7.2978 | 4240 | 0.9504 | 0.7134 | 0.9504 | 0.9749 |
| 0.0553 | 7.3012 | 4242 | 0.9029 | 0.7618 | 0.9029 | 0.9502 |
| 0.0553 | 7.3046 | 4244 | 0.8352 | 0.7623 | 0.8352 | 0.9139 |
| 0.0553 | 7.3081 | 4246 | 0.7784 | 0.6706 | 0.7784 | 0.8823 |
| 0.0553 | 7.3115 | 4248 | 0.7493 | 0.6706 | 0.7493 | 0.8656 |
| 0.0553 | 7.3150 | 4250 | 0.7398 | 0.6503 | 0.7398 | 0.8601 |
| 0.0553 | 7.3184 | 4252 | 0.7422 | 0.6503 | 0.7422 | 0.8615 |
| 0.0553 | 7.3219 | 4254 | 0.7580 | 0.6706 | 0.7580 | 0.8706 |
| 0.0553 | 7.3253 | 4256 | 0.7959 | 0.6909 | 0.7959 | 0.8922 |
| 0.0553 | 7.3287 | 4258 | 0.8212 | 0.7623 | 0.8212 | 0.9062 |
| 0.0553 | 7.3322 | 4260 | 0.8589 | 0.7729 | 0.8589 | 0.9268 |
| 0.0553 | 7.3356 | 4262 | 0.9029 | 0.7618 | 0.9029 | 0.9502 |
| 0.0553 | 7.3391 | 4264 | 0.9356 | 0.7134 | 0.9356 | 0.9673 |
| 0.0553 | 7.3425 | 4266 | 0.9581 | 0.7216 | 0.9581 | 0.9788 |
| 0.0553 | 7.3460 | 4268 | 0.9446 | 0.7216 | 0.9446 | 0.9719 |
| 0.0553 | 7.3494 | 4270 | 0.9145 | 0.7618 | 0.9145 | 0.9563 |
| 0.0553 | 7.3528 | 4272 | 0.8741 | 0.7729 | 0.8741 | 0.9349 |
| 0.0553 | 7.3563 | 4274 | 0.8493 | 0.7623 | 0.8493 | 0.9216 |
| 0.0553 | 7.3597 | 4276 | 0.8185 | 0.7623 | 0.8185 | 0.9047 |
| 0.0553 | 7.3632 | 4278 | 0.8092 | 0.7623 | 0.8092 | 0.8995 |
| 0.0553 | 7.3666 | 4280 | 0.8085 | 0.6909 | 0.8085 | 0.8991 |
| 0.0553 | 7.3701 | 4282 | 0.8132 | 0.7623 | 0.8132 | 0.9018 |
| 0.0553 | 7.3735 | 4284 | 0.8106 | 0.7623 | 0.8106 | 0.9003 |
| 0.0553 | 7.3769 | 4286 | 0.8248 | 0.7623 | 0.8248 | 0.9082 |
| 0.0553 | 7.3804 | 4288 | 0.8215 | 0.7623 | 0.8215 | 0.9064 |
| 0.0553 | 7.3838 | 4290 | 0.8174 | 0.7623 | 0.8174 | 0.9041 |
| 0.0553 | 7.3873 | 4292 | 0.8243 | 0.7623 | 0.8243 | 0.9079 |
| 0.0553 | 7.3907 | 4294 | 0.8418 | 0.7623 | 0.8418 | 0.9175 |
| 0.0553 | 7.3941 | 4296 | 0.8628 | 0.7520 | 0.8628 | 0.9289 |
| 0.0553 | 7.3976 | 4298 | 0.8699 | 0.7520 | 0.8699 | 0.9327 |
| 0.0553 | 7.4010 | 4300 | 0.8500 | 0.7623 | 0.8500 | 0.9220 |
| 0.0553 | 7.4045 | 4302 | 0.8092 | 0.7623 | 0.8092 | 0.8996 |
| 0.0553 | 7.4079 | 4304 | 0.7859 | 0.7623 | 0.7859 | 0.8865 |
| 0.0553 | 7.4114 | 4306 | 0.7750 | 0.6909 | 0.7750 | 0.8804 |
| 0.0553 | 7.4148 | 4308 | 0.7937 | 0.7623 | 0.7937 | 0.8909 |
| 0.0553 | 7.4182 | 4310 | 0.8280 | 0.7623 | 0.8280 | 0.9100 |
| 0.0553 | 7.4217 | 4312 | 0.8423 | 0.7623 | 0.8423 | 0.9178 |
| 0.0553 | 7.4251 | 4314 | 0.8625 | 0.7623 | 0.8625 | 0.9287 |
| 0.0553 | 7.4286 | 4316 | 0.9035 | 0.7729 | 0.9035 | 0.9506 |
| 0.0553 | 7.4320 | 4318 | 0.9265 | 0.7844 | 0.9265 | 0.9626 |
| 0.0553 | 7.4355 | 4320 | 0.9455 | 0.7844 | 0.9455 | 0.9724 |
| 0.0553 | 7.4389 | 4322 | 0.9350 | 0.7844 | 0.9350 | 0.9670 |
| 0.0553 | 7.4423 | 4324 | 0.9075 | 0.7844 | 0.9075 | 0.9526 |
| 0.0553 | 7.4458 | 4326 | 0.8673 | 0.7729 | 0.8673 | 0.9313 |
| 0.0553 | 7.4492 | 4328 | 0.8458 | 0.7729 | 0.8458 | 0.9197 |
| 0.0553 | 7.4527 | 4330 | 0.8391 | 0.7729 | 0.8391 | 0.9160 |
| 0.0553 | 7.4561 | 4332 | 0.8216 | 0.7623 | 0.8216 | 0.9064 |
| 0.0553 | 7.4596 | 4334 | 0.7962 | 0.6706 | 0.7962 | 0.8923 |
| 0.0553 | 7.4630 | 4336 | 0.7988 | 0.7427 | 0.7988 | 0.8938 |
| 0.0553 | 7.4664 | 4338 | 0.8018 | 0.7427 | 0.8018 | 0.8954 |
| 0.0553 | 7.4699 | 4340 | 0.8244 | 0.7427 | 0.8244 | 0.9080 |
| 0.0553 | 7.4733 | 4342 | 0.8413 | 0.7623 | 0.8413 | 0.9173 |
| 0.0553 | 7.4768 | 4344 | 0.8603 | 0.7623 | 0.8603 | 0.9275 |
| 0.0553 | 7.4802 | 4346 | 0.8796 | 0.7729 | 0.8796 | 0.9379 |
| 0.0553 | 7.4836 | 4348 | 0.9061 | 0.7729 | 0.9061 | 0.9519 |
| 0.0553 | 7.4871 | 4350 | 0.9006 | 0.7729 | 0.9006 | 0.9490 |
| 0.0553 | 7.4905 | 4352 | 0.8842 | 0.7623 | 0.8842 | 0.9403 |
| 0.0553 | 7.4940 | 4354 | 0.8577 | 0.7623 | 0.8577 | 0.9261 |
| 0.0553 | 7.4974 | 4356 | 0.8547 | 0.7623 | 0.8547 | 0.9245 |
| 0.0553 | 7.5009 | 4358 | 0.8751 | 0.7623 | 0.8751 | 0.9355 |
| 0.0553 | 7.5043 | 4360 | 0.8830 | 0.7623 | 0.8830 | 0.9397 |
| 0.0553 | 7.5077 | 4362 | 0.9019 | 0.7729 | 0.9019 | 0.9497 |
| 0.0553 | 7.5112 | 4364 | 0.9174 | 0.7729 | 0.9174 | 0.9578 |
| 0.0553 | 7.5146 | 4366 | 0.9232 | 0.7844 | 0.9232 | 0.9608 |
| 0.0553 | 7.5181 | 4368 | 0.9131 | 0.7844 | 0.9131 | 0.9555 |
| 0.0553 | 7.5215 | 4370 | 0.9225 | 0.7844 | 0.9225 | 0.9605 |
| 0.0553 | 7.5250 | 4372 | 0.9270 | 0.7844 | 0.9270 | 0.9628 |
| 0.0553 | 7.5284 | 4374 | 0.9140 | 0.7729 | 0.9140 | 0.9560 |
| 0.0553 | 7.5318 | 4376 | 0.9119 | 0.7729 | 0.9119 | 0.9549 |
| 0.0553 | 7.5353 | 4378 | 0.9104 | 0.7729 | 0.9104 | 0.9541 |
| 0.0553 | 7.5387 | 4380 | 0.8942 | 0.7729 | 0.8942 | 0.9456 |
| 0.0553 | 7.5422 | 4382 | 0.8664 | 0.7623 | 0.8664 | 0.9308 |
| 0.0553 | 7.5456 | 4384 | 0.8531 | 0.7520 | 0.8531 | 0.9236 |
| 0.0553 | 7.5491 | 4386 | 0.8403 | 0.7520 | 0.8403 | 0.9167 |
| 0.0553 | 7.5525 | 4388 | 0.8263 | 0.7058 | 0.8263 | 0.9090 |
| 0.0553 | 7.5559 | 4390 | 0.8374 | 0.7058 | 0.8374 | 0.9151 |
| 0.0553 | 7.5594 | 4392 | 0.8453 | 0.7058 | 0.8453 | 0.9194 |
| 0.0553 | 7.5628 | 4394 | 0.8687 | 0.7058 | 0.8687 | 0.9320 |
| 0.0553 | 7.5663 | 4396 | 0.8876 | 0.7058 | 0.8876 | 0.9421 |
| 0.0553 | 7.5697 | 4398 | 0.8726 | 0.7058 | 0.8726 | 0.9341 |
| 0.0553 | 7.5731 | 4400 | 0.8467 | 0.7520 | 0.8467 | 0.9202 |
| 0.0553 | 7.5766 | 4402 | 0.8238 | 0.7520 | 0.8238 | 0.9077 |
| 0.0553 | 7.5800 | 4404 | 0.8223 | 0.7520 | 0.8223 | 0.9068 |
| 0.0553 | 7.5835 | 4406 | 0.8429 | 0.7623 | 0.8429 | 0.9181 |
| 0.0553 | 7.5869 | 4408 | 0.8600 | 0.7623 | 0.8600 | 0.9273 |
| 0.0553 | 7.5904 | 4410 | 0.8624 | 0.7623 | 0.8624 | 0.9287 |
| 0.0553 | 7.5938 | 4412 | 0.8796 | 0.7844 | 0.8796 | 0.9379 |
| 0.0553 | 7.5972 | 4414 | 0.8827 | 0.7844 | 0.8827 | 0.9395 |
| 0.0553 | 7.6007 | 4416 | 0.8919 | 0.7844 | 0.8919 | 0.9444 |
| 0.0553 | 7.6041 | 4418 | 0.8796 | 0.7844 | 0.8796 | 0.9379 |
| 0.0553 | 7.6076 | 4420 | 0.8496 | 0.7729 | 0.8496 | 0.9217 |
| 0.0553 | 7.6110 | 4422 | 0.8457 | 0.7729 | 0.8457 | 0.9196 |
| 0.0553 | 7.6145 | 4424 | 0.8640 | 0.7729 | 0.8640 | 0.9295 |
| 0.0553 | 7.6179 | 4426 | 0.8795 | 0.7618 | 0.8795 | 0.9378 |
| 0.0553 | 7.6213 | 4428 | 0.9191 | 0.7724 | 0.9191 | 0.9587 |
| 0.0553 | 7.6248 | 4430 | 0.9694 | 0.7724 | 0.9694 | 0.9846 |
| 0.0553 | 7.6282 | 4432 | 1.0048 | 0.7724 | 1.0048 | 1.0024 |
| 0.0553 | 7.6317 | 4434 | 1.0397 | 0.7216 | 1.0397 | 1.0197 |
| 0.0553 | 7.6351 | 4436 | 1.0435 | 0.7216 | 1.0435 | 1.0215 |
| 0.0553 | 7.6386 | 4438 | 1.0254 | 0.7216 | 1.0254 | 1.0126 |
| 0.0553 | 7.6420 | 4440 | 0.9852 | 0.7724 | 0.9852 | 0.9926 |
| 0.0553 | 7.6454 | 4442 | 0.9569 | 0.7724 | 0.9569 | 0.9782 |
| 0.0553 | 7.6489 | 4444 | 0.9081 | 0.7729 | 0.9081 | 0.9529 |
| 0.0553 | 7.6523 | 4446 | 0.8543 | 0.7729 | 0.8543 | 0.9243 |
| 0.0553 | 7.6558 | 4448 | 0.8237 | 0.7623 | 0.8237 | 0.9076 |
| 0.0553 | 7.6592 | 4450 | 0.8066 | 0.7623 | 0.8066 | 0.8981 |
| 0.0553 | 7.6627 | 4452 | 0.8133 | 0.7623 | 0.8133 | 0.9018 |
| 0.0553 | 7.6661 | 4454 | 0.8469 | 0.7520 | 0.8469 | 0.9203 |
| 0.0553 | 7.6695 | 4456 | 0.8810 | 0.7520 | 0.8810 | 0.9386 |
| 0.0553 | 7.6730 | 4458 | 0.9098 | 0.7618 | 0.9098 | 0.9538 |
| 0.0553 | 7.6764 | 4460 | 0.9120 | 0.7618 | 0.9120 | 0.9550 |
| 0.0553 | 7.6799 | 4462 | 0.9067 | 0.7618 | 0.9067 | 0.9522 |
| 0.0553 | 7.6833 | 4464 | 0.8911 | 0.7618 | 0.8911 | 0.9440 |
| 0.0553 | 7.6867 | 4466 | 0.8586 | 0.7729 | 0.8586 | 0.9266 |
| 0.0553 | 7.6902 | 4468 | 0.8493 | 0.7729 | 0.8493 | 0.9216 |
| 0.0553 | 7.6936 | 4470 | 0.8321 | 0.7623 | 0.8321 | 0.9122 |
| 0.0553 | 7.6971 | 4472 | 0.8126 | 0.7623 | 0.8126 | 0.9014 |
| 0.0553 | 7.7005 | 4474 | 0.8073 | 0.7623 | 0.8073 | 0.8985 |
| 0.0553 | 7.7040 | 4476 | 0.8166 | 0.7623 | 0.8166 | 0.9036 |
| 0.0553 | 7.7074 | 4478 | 0.8445 | 0.7623 | 0.8445 | 0.9190 |
| 0.0553 | 7.7108 | 4480 | 0.8933 | 0.7724 | 0.8933 | 0.9452 |
| 0.0553 | 7.7143 | 4482 | 0.9318 | 0.7724 | 0.9318 | 0.9653 |
| 0.0553 | 7.7177 | 4484 | 0.9645 | 0.7724 | 0.9645 | 0.9821 |
| 0.0553 | 7.7212 | 4486 | 0.9780 | 0.7724 | 0.9780 | 0.9890 |
| 0.0553 | 7.7246 | 4488 | 1.0045 | 0.7724 | 1.0045 | 1.0022 |
| 0.0553 | 7.7281 | 4490 | 1.0281 | 0.7724 | 1.0281 | 1.0139 |
| 0.0553 | 7.7315 | 4492 | 1.0229 | 0.7724 | 1.0229 | 1.0114 |
| 0.0553 | 7.7349 | 4494 | 0.9920 | 0.7724 | 0.9920 | 0.9960 |
| 0.0553 | 7.7384 | 4496 | 0.9439 | 0.7724 | 0.9439 | 0.9716 |
| 0.0553 | 7.7418 | 4498 | 0.8928 | 0.7724 | 0.8928 | 0.9449 |
| 0.0507 | 7.7453 | 4500 | 0.8792 | 0.7844 | 0.8792 | 0.9377 |
| 0.0507 | 7.7487 | 4502 | 0.8671 | 0.7844 | 0.8671 | 0.9312 |
| 0.0507 | 7.7522 | 4504 | 0.8742 | 0.7844 | 0.8742 | 0.9350 |
| 0.0507 | 7.7556 | 4506 | 0.8727 | 0.7729 | 0.8727 | 0.9342 |
| 0.0507 | 7.7590 | 4508 | 0.8689 | 0.7729 | 0.8689 | 0.9322 |
| 0.0507 | 7.7625 | 4510 | 0.8610 | 0.7729 | 0.8610 | 0.9279 |
| 0.0507 | 7.7659 | 4512 | 0.8408 | 0.7623 | 0.8408 | 0.9169 |
| 0.0507 | 7.7694 | 4514 | 0.8138 | 0.7623 | 0.8138 | 0.9021 |
| 0.0507 | 7.7728 | 4516 | 0.8191 | 0.7623 | 0.8191 | 0.9050 |
| 0.0507 | 7.7762 | 4518 | 0.8372 | 0.7623 | 0.8372 | 0.9150 |
| 0.0507 | 7.7797 | 4520 | 0.8636 | 0.7618 | 0.8636 | 0.9293 |
| 0.0507 | 7.7831 | 4522 | 0.8840 | 0.7618 | 0.8840 | 0.9402 |
| 0.0507 | 7.7866 | 4524 | 0.8821 | 0.7520 | 0.8821 | 0.9392 |
| 0.0507 | 7.7900 | 4526 | 0.8546 | 0.7520 | 0.8546 | 0.9244 |
| 0.0507 | 7.7935 | 4528 | 0.8300 | 0.7520 | 0.8300 | 0.9111 |
| 0.0507 | 7.7969 | 4530 | 0.8197 | 0.7520 | 0.8197 | 0.9054 |
| 0.0507 | 7.8003 | 4532 | 0.8195 | 0.7520 | 0.8195 | 0.9053 |
| 0.0507 | 7.8038 | 4534 | 0.8278 | 0.7520 | 0.8278 | 0.9099 |
| 0.0507 | 7.8072 | 4536 | 0.8375 | 0.7520 | 0.8375 | 0.9152 |
| 0.0507 | 7.8107 | 4538 | 0.8472 | 0.7520 | 0.8472 | 0.9204 |
| 0.0507 | 7.8141 | 4540 | 0.8391 | 0.7520 | 0.8391 | 0.9160 |
| 0.0507 | 7.8176 | 4542 | 0.8413 | 0.7520 | 0.8413 | 0.9172 |
| 0.0507 | 7.8210 | 4544 | 0.8408 | 0.7520 | 0.8408 | 0.9169 |
| 0.0507 | 7.8244 | 4546 | 0.8554 | 0.7520 | 0.8554 | 0.9249 |
| 0.0507 | 7.8279 | 4548 | 0.8654 | 0.7520 | 0.8654 | 0.9302 |
| 0.0507 | 7.8313 | 4550 | 0.8819 | 0.7520 | 0.8819 | 0.9391 |
| 0.0507 | 7.8348 | 4552 | 0.8836 | 0.7520 | 0.8836 | 0.9400 |
| 0.0507 | 7.8382 | 4554 | 0.8781 | 0.7520 | 0.8781 | 0.9370 |
| 0.0507 | 7.8417 | 4556 | 0.8479 | 0.7520 | 0.8479 | 0.9208 |
| 0.0507 | 7.8451 | 4558 | 0.8406 | 0.7520 | 0.8406 | 0.9169 |
| 0.0507 | 7.8485 | 4560 | 0.8510 | 0.7520 | 0.8510 | 0.9225 |
| 0.0507 | 7.8520 | 4562 | 0.8664 | 0.7520 | 0.8664 | 0.9308 |
| 0.0507 | 7.8554 | 4564 | 0.8727 | 0.7520 | 0.8727 | 0.9342 |
| 0.0507 | 7.8589 | 4566 | 0.8714 | 0.7520 | 0.8714 | 0.9335 |
| 0.0507 | 7.8623 | 4568 | 0.8542 | 0.7520 | 0.8542 | 0.9243 |
| 0.0507 | 7.8657 | 4570 | 0.8377 | 0.7520 | 0.8377 | 0.9153 |
| 0.0507 | 7.8692 | 4572 | 0.8235 | 0.7520 | 0.8235 | 0.9075 |
| 0.0507 | 7.8726 | 4574 | 0.8077 | 0.7520 | 0.8077 | 0.8987 |
| 0.0507 | 7.8761 | 4576 | 0.8147 | 0.7520 | 0.8147 | 0.9026 |
| 0.0507 | 7.8795 | 4578 | 0.8307 | 0.7520 | 0.8307 | 0.9114 |
| 0.0507 | 7.8830 | 4580 | 0.8402 | 0.7520 | 0.8402 | 0.9166 |
| 0.0507 | 7.8864 | 4582 | 0.8374 | 0.7520 | 0.8374 | 0.9151 |
| 0.0507 | 7.8898 | 4584 | 0.8367 | 0.7520 | 0.8367 | 0.9147 |
| 0.0507 | 7.8933 | 4586 | 0.8409 | 0.7520 | 0.8409 | 0.9170 |
| 0.0507 | 7.8967 | 4588 | 0.8598 | 0.7520 | 0.8598 | 0.9273 |
| 0.0507 | 7.9002 | 4590 | 0.8880 | 0.7618 | 0.8880 | 0.9423 |
| 0.0507 | 7.9036 | 4592 | 0.9256 | 0.7618 | 0.9256 | 0.9621 |
| 0.0507 | 7.9071 | 4594 | 0.9717 | 0.7134 | 0.9717 | 0.9857 |
| 0.0507 | 7.9105 | 4596 | 1.0091 | 0.7134 | 1.0091 | 1.0046 |
| 0.0507 | 7.9139 | 4598 | 1.0196 | 0.7134 | 1.0196 | 1.0098 |
| 0.0507 | 7.9174 | 4600 | 1.0042 | 0.7134 | 1.0042 | 1.0021 |
| 0.0507 | 7.9208 | 4602 | 0.9752 | 0.7134 | 0.9752 | 0.9875 |
| 0.0507 | 7.9243 | 4604 | 0.9627 | 0.7134 | 0.9627 | 0.9811 |
| 0.0507 | 7.9277 | 4606 | 0.9675 | 0.7134 | 0.9675 | 0.9836 |
| 0.0507 | 7.9312 | 4608 | 0.9694 | 0.7134 | 0.9694 | 0.9846 |
| 0.0507 | 7.9346 | 4610 | 0.9752 | 0.7134 | 0.9752 | 0.9875 |
| 0.0507 | 7.9380 | 4612 | 0.9913 | 0.7134 | 0.9913 | 0.9957 |
| 0.0507 | 7.9415 | 4614 | 0.9842 | 0.7134 | 0.9842 | 0.9921 |
| 0.0507 | 7.9449 | 4616 | 0.9575 | 0.7134 | 0.9575 | 0.9785 |
| 0.0507 | 7.9484 | 4618 | 0.9281 | 0.7134 | 0.9281 | 0.9634 |
| 0.0507 | 7.9518 | 4620 | 0.8942 | 0.7058 | 0.8942 | 0.9456 |
| 0.0507 | 7.9552 | 4622 | 0.8770 | 0.7520 | 0.8770 | 0.9365 |
| 0.0507 | 7.9587 | 4624 | 0.8677 | 0.7520 | 0.8677 | 0.9315 |
| 0.0507 | 7.9621 | 4626 | 0.8753 | 0.7520 | 0.8753 | 0.9356 |
| 0.0507 | 7.9656 | 4628 | 0.8754 | 0.7520 | 0.8754 | 0.9356 |
| 0.0507 | 7.9690 | 4630 | 0.8865 | 0.7520 | 0.8865 | 0.9415 |
| 0.0507 | 7.9725 | 4632 | 0.9030 | 0.7058 | 0.9030 | 0.9503 |
| 0.0507 | 7.9759 | 4634 | 0.9239 | 0.7134 | 0.9239 | 0.9612 |
| 0.0507 | 7.9793 | 4636 | 0.9654 | 0.7134 | 0.9654 | 0.9826 |
| 0.0507 | 7.9828 | 4638 | 0.9852 | 0.7134 | 0.9852 | 0.9926 |
| 0.0507 | 7.9862 | 4640 | 0.9812 | 0.7134 | 0.9812 | 0.9906 |
| 0.0507 | 7.9897 | 4642 | 0.9786 | 0.7134 | 0.9786 | 0.9892 |
| 0.0507 | 7.9931 | 4644 | 0.9620 | 0.7134 | 0.9620 | 0.9808 |
| 0.0507 | 7.9966 | 4646 | 0.9598 | 0.7134 | 0.9598 | 0.9797 |
| 0.0507 | 8.0000 | 4648 | 0.9378 | 0.7618 | 0.9378 | 0.9684 |
| 0.0507 | 8.0034 | 4650 | 0.9087 | 0.7729 | 0.9087 | 0.9533 |
| 0.0507 | 8.0069 | 4652 | 0.8944 | 0.7623 | 0.8944 | 0.9458 |
| 0.0507 | 8.0103 | 4654 | 0.9030 | 0.7729 | 0.9030 | 0.9503 |
| 0.0507 | 8.0138 | 4656 | 0.9241 | 0.7729 | 0.9241 | 0.9613 |
| 0.0507 | 8.0172 | 4658 | 0.9449 | 0.7729 | 0.9449 | 0.9721 |
| 0.0507 | 8.0207 | 4660 | 0.9433 | 0.7618 | 0.9433 | 0.9713 |
| 0.0507 | 8.0241 | 4662 | 0.9292 | 0.7618 | 0.9292 | 0.9639 |
| 0.0507 | 8.0275 | 4664 | 0.9130 | 0.7729 | 0.9130 | 0.9555 |
| 0.0507 | 8.0310 | 4666 | 0.8765 | 0.7623 | 0.8765 | 0.9362 |
| 0.0507 | 8.0344 | 4668 | 0.8298 | 0.7623 | 0.8298 | 0.9109 |
| 0.0507 | 8.0379 | 4670 | 0.7986 | 0.7427 | 0.7986 | 0.8936 |
| 0.0507 | 8.0413 | 4672 | 0.7849 | 0.7427 | 0.7849 | 0.8859 |
| 0.0507 | 8.0448 | 4674 | 0.7874 | 0.7427 | 0.7874 | 0.8874 |
| 0.0507 | 8.0482 | 4676 | 0.8097 | 0.7427 | 0.8097 | 0.8998 |
| 0.0507 | 8.0516 | 4678 | 0.8442 | 0.7623 | 0.8442 | 0.9188 |
| 0.0507 | 8.0551 | 4680 | 0.8951 | 0.7623 | 0.8951 | 0.9461 |
| 0.0507 | 8.0585 | 4682 | 0.9468 | 0.7134 | 0.9468 | 0.9730 |
| 0.0507 | 8.0620 | 4684 | 0.9779 | 0.7134 | 0.9779 | 0.9889 |
| 0.0507 | 8.0654 | 4686 | 0.9948 | 0.7134 | 0.9948 | 0.9974 |
| 0.0507 | 8.0688 | 4688 | 1.0053 | 0.7134 | 1.0053 | 1.0026 |
| 0.0507 | 8.0723 | 4690 | 1.0000 | 0.7134 | 1.0000 | 1.0000 |
| 0.0507 | 8.0757 | 4692 | 0.9770 | 0.7134 | 0.9770 | 0.9884 |
| 0.0507 | 8.0792 | 4694 | 0.9581 | 0.7134 | 0.9581 | 0.9788 |
| 0.0507 | 8.0826 | 4696 | 0.9292 | 0.7134 | 0.9292 | 0.9639 |
| 0.0507 | 8.0861 | 4698 | 0.8858 | 0.7520 | 0.8858 | 0.9412 |
| 0.0507 | 8.0895 | 4700 | 0.8653 | 0.7520 | 0.8653 | 0.9302 |
| 0.0507 | 8.0929 | 4702 | 0.8607 | 0.7520 | 0.8607 | 0.9278 |
| 0.0507 | 8.0964 | 4704 | 0.8535 | 0.7623 | 0.8535 | 0.9238 |
| 0.0507 | 8.0998 | 4706 | 0.8600 | 0.7623 | 0.8600 | 0.9274 |
| 0.0507 | 8.1033 | 4708 | 0.8797 | 0.7623 | 0.8797 | 0.9379 |
| 0.0507 | 8.1067 | 4710 | 0.8859 | 0.7520 | 0.8859 | 0.9412 |
| 0.0507 | 8.1102 | 4712 | 0.8955 | 0.7520 | 0.8955 | 0.9463 |
| 0.0507 | 8.1136 | 4714 | 0.9276 | 0.7618 | 0.9276 | 0.9631 |
| 0.0507 | 8.1170 | 4716 | 0.9489 | 0.7618 | 0.9489 | 0.9741 |
| 0.0507 | 8.1205 | 4718 | 0.9541 | 0.7618 | 0.9541 | 0.9768 |
| 0.0507 | 8.1239 | 4720 | 0.9453 | 0.7618 | 0.9453 | 0.9723 |
| 0.0507 | 8.1274 | 4722 | 0.9413 | 0.7618 | 0.9413 | 0.9702 |
| 0.0507 | 8.1308 | 4724 | 0.9473 | 0.7618 | 0.9473 | 0.9733 |
| 0.0507 | 8.1343 | 4726 | 0.9385 | 0.7618 | 0.9385 | 0.9688 |
| 0.0507 | 8.1377 | 4728 | 0.9202 | 0.7618 | 0.9202 | 0.9593 |
| 0.0507 | 8.1411 | 4730 | 0.9020 | 0.7520 | 0.9020 | 0.9497 |
| 0.0507 | 8.1446 | 4732 | 0.8740 | 0.7623 | 0.8740 | 0.9349 |
| 0.0507 | 8.1480 | 4734 | 0.8658 | 0.7623 | 0.8658 | 0.9305 |
| 0.0507 | 8.1515 | 4736 | 0.8817 | 0.7623 | 0.8817 | 0.9390 |
| 0.0507 | 8.1549 | 4738 | 0.8956 | 0.7623 | 0.8956 | 0.9463 |
| 0.0507 | 8.1583 | 4740 | 0.9172 | 0.7623 | 0.9172 | 0.9577 |
| 0.0507 | 8.1618 | 4742 | 0.9356 | 0.7520 | 0.9356 | 0.9673 |
| 0.0507 | 8.1652 | 4744 | 0.9380 | 0.7520 | 0.9380 | 0.9685 |
| 0.0507 | 8.1687 | 4746 | 0.9198 | 0.7520 | 0.9198 | 0.9591 |
| 0.0507 | 8.1721 | 4748 | 0.8932 | 0.7623 | 0.8932 | 0.9451 |
| 0.0507 | 8.1756 | 4750 | 0.8778 | 0.7623 | 0.8778 | 0.9369 |
| 0.0507 | 8.1790 | 4752 | 0.8572 | 0.7623 | 0.8572 | 0.9258 |
| 0.0507 | 8.1824 | 4754 | 0.8451 | 0.7623 | 0.8451 | 0.9193 |
| 0.0507 | 8.1859 | 4756 | 0.8443 | 0.7623 | 0.8443 | 0.9189 |
| 0.0507 | 8.1893 | 4758 | 0.8435 | 0.7623 | 0.8435 | 0.9184 |
| 0.0507 | 8.1928 | 4760 | 0.8575 | 0.7623 | 0.8575 | 0.9260 |
| 0.0507 | 8.1962 | 4762 | 0.8716 | 0.7623 | 0.8716 | 0.9336 |
| 0.0507 | 8.1997 | 4764 | 0.8777 | 0.7623 | 0.8777 | 0.9369 |
| 0.0507 | 8.2031 | 4766 | 0.8893 | 0.7623 | 0.8893 | 0.9430 |
| 0.0507 | 8.2065 | 4768 | 0.8970 | 0.7623 | 0.8970 | 0.9471 |
| 0.0507 | 8.2100 | 4770 | 0.8988 | 0.7623 | 0.8988 | 0.9481 |
| 0.0507 | 8.2134 | 4772 | 0.9012 | 0.7623 | 0.9012 | 0.9493 |
| 0.0507 | 8.2169 | 4774 | 0.8896 | 0.7623 | 0.8896 | 0.9432 |
| 0.0507 | 8.2203 | 4776 | 0.8884 | 0.7623 | 0.8884 | 0.9425 |
| 0.0507 | 8.2238 | 4778 | 0.8881 | 0.7623 | 0.8881 | 0.9424 |
| 0.0507 | 8.2272 | 4780 | 0.8875 | 0.7623 | 0.8875 | 0.9421 |
| 0.0507 | 8.2306 | 4782 | 0.8786 | 0.7623 | 0.8786 | 0.9373 |
| 0.0507 | 8.2341 | 4784 | 0.8651 | 0.7623 | 0.8651 | 0.9301 |
| 0.0507 | 8.2375 | 4786 | 0.8484 | 0.7623 | 0.8484 | 0.9211 |
| 0.0507 | 8.2410 | 4788 | 0.8358 | 0.7623 | 0.8358 | 0.9142 |
| 0.0507 | 8.2444 | 4790 | 0.8246 | 0.7623 | 0.8246 | 0.9081 |
| 0.0507 | 8.2478 | 4792 | 0.8081 | 0.7623 | 0.8081 | 0.8989 |
| 0.0507 | 8.2513 | 4794 | 0.8049 | 0.7623 | 0.8049 | 0.8971 |
| 0.0507 | 8.2547 | 4796 | 0.8175 | 0.7623 | 0.8175 | 0.9042 |
| 0.0507 | 8.2582 | 4798 | 0.8317 | 0.7623 | 0.8317 | 0.9120 |
| 0.0507 | 8.2616 | 4800 | 0.8494 | 0.7623 | 0.8494 | 0.9216 |
| 0.0507 | 8.2651 | 4802 | 0.8641 | 0.7623 | 0.8641 | 0.9296 |
| 0.0507 | 8.2685 | 4804 | 0.8773 | 0.7058 | 0.8773 | 0.9366 |
| 0.0507 | 8.2719 | 4806 | 0.8768 | 0.7058 | 0.8768 | 0.9364 |
| 0.0507 | 8.2754 | 4808 | 0.8629 | 0.7058 | 0.8629 | 0.9289 |
| 0.0507 | 8.2788 | 4810 | 0.8354 | 0.7623 | 0.8354 | 0.9140 |
| 0.0507 | 8.2823 | 4812 | 0.8239 | 0.7623 | 0.8239 | 0.9077 |
| 0.0507 | 8.2857 | 4814 | 0.8249 | 0.7623 | 0.8249 | 0.9082 |
| 0.0507 | 8.2892 | 4816 | 0.8292 | 0.7623 | 0.8292 | 0.9106 |
| 0.0507 | 8.2926 | 4818 | 0.8440 | 0.7623 | 0.8440 | 0.9187 |
| 0.0507 | 8.2960 | 4820 | 0.8616 | 0.7623 | 0.8616 | 0.9282 |
| 0.0507 | 8.2995 | 4822 | 0.9004 | 0.7623 | 0.9004 | 0.9489 |
| 0.0507 | 8.3029 | 4824 | 0.9302 | 0.7058 | 0.9302 | 0.9645 |
| 0.0507 | 8.3064 | 4826 | 0.9387 | 0.7623 | 0.9387 | 0.9689 |
| 0.0507 | 8.3098 | 4828 | 0.9312 | 0.7623 | 0.9312 | 0.9650 |
| 0.0507 | 8.3133 | 4830 | 0.9212 | 0.7623 | 0.9212 | 0.9598 |
| 0.0507 | 8.3167 | 4832 | 0.9117 | 0.7623 | 0.9117 | 0.9548 |
| 0.0507 | 8.3201 | 4834 | 0.9047 | 0.7623 | 0.9047 | 0.9512 |
| 0.0507 | 8.3236 | 4836 | 0.8977 | 0.7623 | 0.8977 | 0.9475 |
| 0.0507 | 8.3270 | 4838 | 0.9043 | 0.7623 | 0.9043 | 0.9509 |
| 0.0507 | 8.3305 | 4840 | 0.9111 | 0.7139 | 0.9111 | 0.9545 |
| 0.0507 | 8.3339 | 4842 | 0.9208 | 0.7139 | 0.9208 | 0.9596 |
| 0.0507 | 8.3373 | 4844 | 0.9323 | 0.7058 | 0.9323 | 0.9656 |
| 0.0507 | 8.3408 | 4846 | 0.9266 | 0.7058 | 0.9266 | 0.9626 |
| 0.0507 | 8.3442 | 4848 | 0.9167 | 0.7139 | 0.9167 | 0.9575 |
| 0.0507 | 8.3477 | 4850 | 0.9004 | 0.7139 | 0.9004 | 0.9489 |
| 0.0507 | 8.3511 | 4852 | 0.8816 | 0.7623 | 0.8816 | 0.9389 |
| 0.0507 | 8.3546 | 4854 | 0.8677 | 0.7623 | 0.8677 | 0.9315 |
| 0.0507 | 8.3580 | 4856 | 0.8489 | 0.7623 | 0.8489 | 0.9214 |
| 0.0507 | 8.3614 | 4858 | 0.8547 | 0.7623 | 0.8547 | 0.9245 |
| 0.0507 | 8.3649 | 4860 | 0.8703 | 0.7623 | 0.8703 | 0.9329 |
| 0.0507 | 8.3683 | 4862 | 0.8925 | 0.7623 | 0.8925 | 0.9447 |
| 0.0507 | 8.3718 | 4864 | 0.9229 | 0.7623 | 0.9229 | 0.9607 |
| 0.0507 | 8.3752 | 4866 | 0.9590 | 0.7221 | 0.9590 | 0.9793 |
| 0.0507 | 8.3787 | 4868 | 0.9991 | 0.7221 | 0.9991 | 0.9996 |
| 0.0507 | 8.3821 | 4870 | 1.0195 | 0.7134 | 1.0195 | 1.0097 |
| 0.0507 | 8.3855 | 4872 | 1.0136 | 0.7134 | 1.0136 | 1.0068 |
| 0.0507 | 8.3890 | 4874 | 0.9930 | 0.7221 | 0.9930 | 0.9965 |
| 0.0507 | 8.3924 | 4876 | 0.9591 | 0.7221 | 0.9591 | 0.9793 |
| 0.0507 | 8.3959 | 4878 | 0.9176 | 0.7623 | 0.9176 | 0.9579 |
| 0.0507 | 8.3993 | 4880 | 0.8761 | 0.7623 | 0.8761 | 0.9360 |
| 0.0507 | 8.4028 | 4882 | 0.8567 | 0.7623 | 0.8567 | 0.9256 |
| 0.0507 | 8.4062 | 4884 | 0.8588 | 0.7623 | 0.8588 | 0.9267 |
| 0.0507 | 8.4096 | 4886 | 0.8581 | 0.7139 | 0.8581 | 0.9264 |
| 0.0507 | 8.4131 | 4888 | 0.8774 | 0.7058 | 0.8774 | 0.9367 |
| 0.0507 | 8.4165 | 4890 | 0.9100 | 0.7058 | 0.9100 | 0.9540 |
| 0.0507 | 8.4200 | 4892 | 0.9481 | 0.7058 | 0.9481 | 0.9737 |
| 0.0507 | 8.4234 | 4894 | 0.9720 | 0.7134 | 0.9720 | 0.9859 |
| 0.0507 | 8.4269 | 4896 | 0.9838 | 0.7134 | 0.9838 | 0.9919 |
| 0.0507 | 8.4303 | 4898 | 0.9706 | 0.7134 | 0.9706 | 0.9852 |
| 0.0507 | 8.4337 | 4900 | 0.9555 | 0.7134 | 0.9555 | 0.9775 |
| 0.0507 | 8.4372 | 4902 | 0.9532 | 0.7618 | 0.9532 | 0.9763 |
| 0.0507 | 8.4406 | 4904 | 0.9436 | 0.7729 | 0.9436 | 0.9714 |
| 0.0507 | 8.4441 | 4906 | 0.9263 | 0.7729 | 0.9263 | 0.9624 |
| 0.0507 | 8.4475 | 4908 | 0.9004 | 0.7729 | 0.9004 | 0.9489 |
| 0.0507 | 8.4509 | 4910 | 0.8929 | 0.7729 | 0.8929 | 0.9450 |
| 0.0507 | 8.4544 | 4912 | 0.9014 | 0.7729 | 0.9014 | 0.9494 |
| 0.0507 | 8.4578 | 4914 | 0.9027 | 0.7729 | 0.9027 | 0.9501 |
| 0.0507 | 8.4613 | 4916 | 0.9046 | 0.7729 | 0.9046 | 0.9511 |
| 0.0507 | 8.4647 | 4918 | 0.9144 | 0.7729 | 0.9144 | 0.9562 |
| 0.0507 | 8.4682 | 4920 | 0.9336 | 0.7729 | 0.9336 | 0.9662 |
| 0.0507 | 8.4716 | 4922 | 0.9575 | 0.7729 | 0.9575 | 0.9785 |
| 0.0507 | 8.4750 | 4924 | 0.9835 | 0.7216 | 0.9835 | 0.9917 |
| 0.0507 | 8.4785 | 4926 | 0.9950 | 0.7216 | 0.9950 | 0.9975 |
| 0.0507 | 8.4819 | 4928 | 0.9986 | 0.7216 | 0.9986 | 0.9993 |
| 0.0507 | 8.4854 | 4930 | 0.9912 | 0.7216 | 0.9912 | 0.9956 |
| 0.0507 | 8.4888 | 4932 | 0.9728 | 0.7216 | 0.9728 | 0.9863 |
| 0.0507 | 8.4923 | 4934 | 0.9512 | 0.7134 | 0.9512 | 0.9753 |
| 0.0507 | 8.4957 | 4936 | 0.9340 | 0.7134 | 0.9340 | 0.9664 |
| 0.0507 | 8.4991 | 4938 | 0.9374 | 0.7134 | 0.9374 | 0.9682 |
| 0.0507 | 8.5026 | 4940 | 0.9482 | 0.7216 | 0.9482 | 0.9738 |
| 0.0507 | 8.5060 | 4942 | 0.9586 | 0.7216 | 0.9586 | 0.9791 |
| 0.0507 | 8.5095 | 4944 | 0.9553 | 0.7216 | 0.9553 | 0.9774 |
| 0.0507 | 8.5129 | 4946 | 0.9484 | 0.7216 | 0.9484 | 0.9739 |
| 0.0507 | 8.5164 | 4948 | 0.9396 | 0.7216 | 0.9396 | 0.9693 |
| 0.0507 | 8.5198 | 4950 | 0.9283 | 0.7724 | 0.9283 | 0.9635 |
| 0.0507 | 8.5232 | 4952 | 0.9263 | 0.7724 | 0.9263 | 0.9625 |
| 0.0507 | 8.5267 | 4954 | 0.9144 | 0.7618 | 0.9144 | 0.9562 |
| 0.0507 | 8.5301 | 4956 | 0.8990 | 0.7729 | 0.8990 | 0.9482 |
| 0.0507 | 8.5336 | 4958 | 0.8938 | 0.7729 | 0.8938 | 0.9454 |
| 0.0507 | 8.5370 | 4960 | 0.9031 | 0.7729 | 0.9031 | 0.9503 |
| 0.0507 | 8.5404 | 4962 | 0.9228 | 0.7724 | 0.9228 | 0.9606 |
| 0.0507 | 8.5439 | 4964 | 0.9381 | 0.7724 | 0.9381 | 0.9685 |
| 0.0507 | 8.5473 | 4966 | 0.9490 | 0.7724 | 0.9490 | 0.9742 |
| 0.0507 | 8.5508 | 4968 | 0.9450 | 0.7724 | 0.9450 | 0.9721 |
| 0.0507 | 8.5542 | 4970 | 0.9370 | 0.7618 | 0.9370 | 0.9680 |
| 0.0507 | 8.5577 | 4972 | 0.9468 | 0.7618 | 0.9468 | 0.9730 |
| 0.0507 | 8.5611 | 4974 | 0.9435 | 0.7618 | 0.9435 | 0.9713 |
| 0.0507 | 8.5645 | 4976 | 0.9409 | 0.7520 | 0.9409 | 0.9700 |
| 0.0507 | 8.5680 | 4978 | 0.9320 | 0.7520 | 0.9320 | 0.9654 |
| 0.0507 | 8.5714 | 4980 | 0.9280 | 0.7520 | 0.9280 | 0.9633 |
| 0.0507 | 8.5749 | 4982 | 0.9314 | 0.7520 | 0.9314 | 0.9651 |
| 0.0507 | 8.5783 | 4984 | 0.9335 | 0.7520 | 0.9335 | 0.9662 |
| 0.0507 | 8.5818 | 4986 | 0.9435 | 0.7058 | 0.9435 | 0.9713 |
| 0.0507 | 8.5852 | 4988 | 0.9321 | 0.7520 | 0.9321 | 0.9655 |
| 0.0507 | 8.5886 | 4990 | 0.9149 | 0.7520 | 0.9149 | 0.9565 |
| 0.0507 | 8.5921 | 4992 | 0.9067 | 0.7520 | 0.9067 | 0.9522 |
| 0.0507 | 8.5955 | 4994 | 0.8849 | 0.7623 | 0.8849 | 0.9407 |
| 0.0507 | 8.5990 | 4996 | 0.8600 | 0.7623 | 0.8600 | 0.9274 |
| 0.0507 | 8.6024 | 4998 | 0.8415 | 0.7623 | 0.8415 | 0.9173 |
| 0.0451 | 8.6059 | 5000 | 0.8231 | 0.7623 | 0.8231 | 0.9073 |
| 0.0451 | 8.6093 | 5002 | 0.8122 | 0.7623 | 0.8122 | 0.9012 |
| 0.0451 | 8.6127 | 5004 | 0.8110 | 0.7623 | 0.8110 | 0.9006 |
| 0.0451 | 8.6162 | 5006 | 0.8056 | 0.7623 | 0.8056 | 0.8975 |
| 0.0451 | 8.6196 | 5008 | 0.8048 | 0.7623 | 0.8048 | 0.8971 |
| 0.0451 | 8.6231 | 5010 | 0.8181 | 0.7623 | 0.8181 | 0.9045 |
| 0.0451 | 8.6265 | 5012 | 0.8330 | 0.7520 | 0.8330 | 0.9127 |
| 0.0451 | 8.6299 | 5014 | 0.8546 | 0.7520 | 0.8546 | 0.9244 |
| 0.0451 | 8.6334 | 5016 | 0.8828 | 0.7058 | 0.8828 | 0.9396 |
| 0.0451 | 8.6368 | 5018 | 0.9060 | 0.7058 | 0.9060 | 0.9518 |
| 0.0451 | 8.6403 | 5020 | 0.9139 | 0.7058 | 0.9139 | 0.9560 |
| 0.0451 | 8.6437 | 5022 | 0.9184 | 0.7520 | 0.9184 | 0.9583 |
| 0.0451 | 8.6472 | 5024 | 0.9264 | 0.7520 | 0.9264 | 0.9625 |
| 0.0451 | 8.6506 | 5026 | 0.9332 | 0.7520 | 0.9332 | 0.9660 |
| 0.0451 | 8.6540 | 5028 | 0.9416 | 0.7724 | 0.9416 | 0.9704 |
| 0.0451 | 8.6575 | 5030 | 0.9446 | 0.7724 | 0.9446 | 0.9719 |
| 0.0451 | 8.6609 | 5032 | 0.9638 | 0.7724 | 0.9638 | 0.9817 |
| 0.0451 | 8.6644 | 5034 | 0.9731 | 0.7724 | 0.9731 | 0.9864 |
| 0.0451 | 8.6678 | 5036 | 0.9860 | 0.7724 | 0.9860 | 0.9930 |
| 0.0451 | 8.6713 | 5038 | 0.9945 | 0.7724 | 0.9945 | 0.9972 |
| 0.0451 | 8.6747 | 5040 | 0.9949 | 0.7724 | 0.9949 | 0.9974 |
| 0.0451 | 8.6781 | 5042 | 0.9882 | 0.7724 | 0.9882 | 0.9941 |
| 0.0451 | 8.6816 | 5044 | 0.9912 | 0.7724 | 0.9912 | 0.9956 |
| 0.0451 | 8.6850 | 5046 | 0.9909 | 0.7724 | 0.9909 | 0.9954 |
| 0.0451 | 8.6885 | 5048 | 0.9792 | 0.7724 | 0.9792 | 0.9895 |
| 0.0451 | 8.6919 | 5050 | 0.9699 | 0.7724 | 0.9699 | 0.9848 |
| 0.0451 | 8.6954 | 5052 | 0.9760 | 0.7724 | 0.9760 | 0.9879 |
| 0.0451 | 8.6988 | 5054 | 0.9734 | 0.7618 | 0.9734 | 0.9866 |
| 0.0451 | 8.7022 | 5056 | 0.9608 | 0.7618 | 0.9608 | 0.9802 |
| 0.0451 | 8.7057 | 5058 | 0.9423 | 0.7618 | 0.9423 | 0.9707 |
| 0.0451 | 8.7091 | 5060 | 0.9335 | 0.7618 | 0.9335 | 0.9662 |
| 0.0451 | 8.7126 | 5062 | 0.9238 | 0.7520 | 0.9238 | 0.9611 |
| 0.0451 | 8.7160 | 5064 | 0.9258 | 0.7520 | 0.9258 | 0.9622 |
| 0.0451 | 8.7194 | 5066 | 0.9230 | 0.7520 | 0.9230 | 0.9607 |
| 0.0451 | 8.7229 | 5068 | 0.9398 | 0.7618 | 0.9398 | 0.9694 |
| 0.0451 | 8.7263 | 5070 | 0.9454 | 0.7618 | 0.9454 | 0.9723 |
| 0.0451 | 8.7298 | 5072 | 0.9398 | 0.7618 | 0.9398 | 0.9694 |
| 0.0451 | 8.7332 | 5074 | 0.9303 | 0.7520 | 0.9303 | 0.9645 |
| 0.0451 | 8.7367 | 5076 | 0.9174 | 0.7520 | 0.9174 | 0.9578 |
| 0.0451 | 8.7401 | 5078 | 0.9134 | 0.7623 | 0.9134 | 0.9557 |
| 0.0451 | 8.7435 | 5080 | 0.9172 | 0.7623 | 0.9172 | 0.9577 |
| 0.0451 | 8.7470 | 5082 | 0.9174 | 0.7623 | 0.9174 | 0.9578 |
| 0.0451 | 8.7504 | 5084 | 0.9271 | 0.7729 | 0.9271 | 0.9629 |
| 0.0451 | 8.7539 | 5086 | 0.9391 | 0.7729 | 0.9391 | 0.9691 |
| 0.0451 | 8.7573 | 5088 | 0.9570 | 0.7724 | 0.9570 | 0.9783 |
| 0.0451 | 8.7608 | 5090 | 0.9788 | 0.7724 | 0.9788 | 0.9894 |
| 0.0451 | 8.7642 | 5092 | 0.9926 | 0.7515 | 0.9926 | 0.9963 |
| 0.0451 | 8.7676 | 5094 | 1.0180 | 0.7515 | 1.0180 | 1.0089 |
| 0.0451 | 8.7711 | 5096 | 1.0203 | 0.7515 | 1.0203 | 1.0101 |
| 0.0451 | 8.7745 | 5098 | 1.0109 | 0.7515 | 1.0109 | 1.0054 |
| 0.0451 | 8.7780 | 5100 | 0.9933 | 0.7626 | 0.9933 | 0.9967 |
| 0.0451 | 8.7814 | 5102 | 0.9656 | 0.7844 | 0.9656 | 0.9826 |
| 0.0451 | 8.7849 | 5104 | 0.9449 | 0.7844 | 0.9449 | 0.9721 |
| 0.0451 | 8.7883 | 5106 | 0.9250 | 0.7729 | 0.9250 | 0.9618 |
| 0.0451 | 8.7917 | 5108 | 0.9113 | 0.7729 | 0.9113 | 0.9546 |
| 0.0451 | 8.7952 | 5110 | 0.9018 | 0.7729 | 0.9018 | 0.9496 |
| 0.0451 | 8.7986 | 5112 | 0.8968 | 0.7729 | 0.8968 | 0.9470 |
| 0.0451 | 8.8021 | 5114 | 0.8983 | 0.7729 | 0.8983 | 0.9478 |
| 0.0451 | 8.8055 | 5116 | 0.9117 | 0.7844 | 0.9117 | 0.9548 |
| 0.0451 | 8.8090 | 5118 | 0.9232 | 0.7844 | 0.9232 | 0.9608 |
| 0.0451 | 8.8124 | 5120 | 0.9268 | 0.7844 | 0.9268 | 0.9627 |
| 0.0451 | 8.8158 | 5122 | 0.9348 | 0.7844 | 0.9348 | 0.9669 |
| 0.0451 | 8.8193 | 5124 | 0.9467 | 0.7844 | 0.9467 | 0.9730 |
| 0.0451 | 8.8227 | 5126 | 0.9609 | 0.7844 | 0.9609 | 0.9802 |
| 0.0451 | 8.8262 | 5128 | 0.9703 | 0.7844 | 0.9703 | 0.9851 |
| 0.0451 | 8.8296 | 5130 | 0.9750 | 0.7844 | 0.9750 | 0.9874 |
| 0.0451 | 8.8330 | 5132 | 0.9763 | 0.7844 | 0.9763 | 0.9881 |
| 0.0451 | 8.8365 | 5134 | 0.9849 | 0.7844 | 0.9849 | 0.9924 |
| 0.0451 | 8.8399 | 5136 | 1.0064 | 0.7101 | 1.0064 | 1.0032 |
| 0.0451 | 8.8434 | 5138 | 1.0367 | 0.7101 | 1.0367 | 1.0182 |
| 0.0451 | 8.8468 | 5140 | 1.0608 | 0.7014 | 1.0608 | 1.0299 |
| 0.0451 | 8.8503 | 5142 | 1.0677 | 0.7014 | 1.0677 | 1.0333 |
| 0.0451 | 8.8537 | 5144 | 1.0588 | 0.7101 | 1.0588 | 1.0290 |
| 0.0451 | 8.8571 | 5146 | 1.0434 | 0.7101 | 1.0434 | 1.0215 |
| 0.0451 | 8.8606 | 5148 | 1.0290 | 0.7626 | 1.0290 | 1.0144 |
| 0.0451 | 8.8640 | 5150 | 1.0101 | 0.7626 | 1.0101 | 1.0051 |
| 0.0451 | 8.8675 | 5152 | 0.9822 | 0.7844 | 0.9822 | 0.9910 |
| 0.0451 | 8.8709 | 5154 | 0.9682 | 0.7844 | 0.9682 | 0.9840 |
| 0.0451 | 8.8744 | 5156 | 0.9652 | 0.7844 | 0.9652 | 0.9824 |
| 0.0451 | 8.8778 | 5158 | 0.9609 | 0.7844 | 0.9609 | 0.9803 |
| 0.0451 | 8.8812 | 5160 | 0.9517 | 0.7729 | 0.9517 | 0.9756 |
| 0.0451 | 8.8847 | 5162 | 0.9482 | 0.7729 | 0.9482 | 0.9737 |
| 0.0451 | 8.8881 | 5164 | 0.9351 | 0.7729 | 0.9351 | 0.9670 |
| 0.0451 | 8.8916 | 5166 | 0.9261 | 0.7729 | 0.9261 | 0.9623 |
| 0.0451 | 8.8950 | 5168 | 0.9221 | 0.7139 | 0.9221 | 0.9603 |
| 0.0451 | 8.8985 | 5170 | 0.9151 | 0.7139 | 0.9151 | 0.9566 |
| 0.0451 | 8.9019 | 5172 | 0.9092 | 0.7139 | 0.9092 | 0.9535 |
| 0.0451 | 8.9053 | 5174 | 0.9016 | 0.7058 | 0.9016 | 0.9495 |
| 0.0451 | 8.9088 | 5176 | 0.9119 | 0.7058 | 0.9119 | 0.9549 |
| 0.0451 | 8.9122 | 5178 | 0.9351 | 0.7058 | 0.9351 | 0.9670 |
| 0.0451 | 8.9157 | 5180 | 0.9685 | 0.7058 | 0.9685 | 0.9841 |
| 0.0451 | 8.9191 | 5182 | 1.0054 | 0.6940 | 1.0054 | 1.0027 |
| 0.0451 | 8.9225 | 5184 | 1.0202 | 0.6940 | 1.0202 | 1.0101 |
| 0.0451 | 8.9260 | 5186 | 1.0198 | 0.6940 | 1.0198 | 1.0099 |
| 0.0451 | 8.9294 | 5188 | 1.0099 | 0.6940 | 1.0099 | 1.0049 |
| 0.0451 | 8.9329 | 5190 | 0.9962 | 0.7134 | 0.9962 | 0.9981 |
| 0.0451 | 8.9363 | 5192 | 0.9735 | 0.7134 | 0.9735 | 0.9867 |
| 0.0451 | 8.9398 | 5194 | 0.9564 | 0.7134 | 0.9564 | 0.9780 |
| 0.0451 | 8.9432 | 5196 | 0.9440 | 0.7134 | 0.9440 | 0.9716 |
| 0.0451 | 8.9466 | 5198 | 0.9356 | 0.7134 | 0.9356 | 0.9673 |
| 0.0451 | 8.9501 | 5200 | 0.9329 | 0.7134 | 0.9329 | 0.9658 |
| 0.0451 | 8.9535 | 5202 | 0.9197 | 0.7618 | 0.9197 | 0.9590 |
| 0.0451 | 8.9570 | 5204 | 0.9054 | 0.7623 | 0.9054 | 0.9515 |
| 0.0451 | 8.9604 | 5206 | 0.9058 | 0.7729 | 0.9058 | 0.9517 |
| 0.0451 | 8.9639 | 5208 | 0.9186 | 0.7729 | 0.9186 | 0.9584 |
| 0.0451 | 8.9673 | 5210 | 0.9425 | 0.7618 | 0.9425 | 0.9708 |
| 0.0451 | 8.9707 | 5212 | 0.9702 | 0.7724 | 0.9702 | 0.9850 |
| 0.0451 | 8.9742 | 5214 | 0.9873 | 0.7724 | 0.9873 | 0.9936 |
| 0.0451 | 8.9776 | 5216 | 0.9981 | 0.7216 | 0.9981 | 0.9991 |
| 0.0451 | 8.9811 | 5218 | 1.0021 | 0.7216 | 1.0021 | 1.0011 |
| 0.0451 | 8.9845 | 5220 | 1.0047 | 0.7216 | 1.0047 | 1.0023 |
| 0.0451 | 8.9880 | 5222 | 1.0032 | 0.7724 | 1.0032 | 1.0016 |
| 0.0451 | 8.9914 | 5224 | 0.9932 | 0.7724 | 0.9932 | 0.9966 |
| 0.0451 | 8.9948 | 5226 | 0.9751 | 0.7724 | 0.9751 | 0.9875 |
| 0.0451 | 8.9983 | 5228 | 0.9509 | 0.7618 | 0.9509 | 0.9751 |
| 0.0451 | 9.0017 | 5230 | 0.9394 | 0.7618 | 0.9394 | 0.9692 |
| 0.0451 | 9.0052 | 5232 | 0.9360 | 0.7618 | 0.9360 | 0.9675 |
| 0.0451 | 9.0086 | 5234 | 0.9240 | 0.7618 | 0.9240 | 0.9613 |
| 0.0451 | 9.0120 | 5236 | 0.9224 | 0.7618 | 0.9224 | 0.9604 |
| 0.0451 | 9.0155 | 5238 | 0.9269 | 0.7618 | 0.9269 | 0.9627 |
| 0.0451 | 9.0189 | 5240 | 0.9299 | 0.7618 | 0.9299 | 0.9643 |
| 0.0451 | 9.0224 | 5242 | 0.9379 | 0.7618 | 0.9379 | 0.9684 |
| 0.0451 | 9.0258 | 5244 | 0.9500 | 0.7724 | 0.9500 | 0.9747 |
| 0.0451 | 9.0293 | 5246 | 0.9522 | 0.7724 | 0.9522 | 0.9758 |
| 0.0451 | 9.0327 | 5248 | 0.9433 | 0.7618 | 0.9433 | 0.9712 |
| 0.0451 | 9.0361 | 5250 | 0.9325 | 0.7618 | 0.9325 | 0.9657 |
| 0.0451 | 9.0396 | 5252 | 0.9292 | 0.7623 | 0.9292 | 0.9640 |
| 0.0451 | 9.0430 | 5254 | 0.9315 | 0.7729 | 0.9315 | 0.9651 |
| 0.0451 | 9.0465 | 5256 | 0.9415 | 0.7618 | 0.9415 | 0.9703 |
| 0.0451 | 9.0499 | 5258 | 0.9490 | 0.7618 | 0.9490 | 0.9741 |
| 0.0451 | 9.0534 | 5260 | 0.9655 | 0.7724 | 0.9655 | 0.9826 |
| 0.0451 | 9.0568 | 5262 | 0.9782 | 0.7724 | 0.9782 | 0.9890 |
| 0.0451 | 9.0602 | 5264 | 0.9861 | 0.7724 | 0.9861 | 0.9930 |
| 0.0451 | 9.0637 | 5266 | 0.9857 | 0.7724 | 0.9857 | 0.9928 |
| 0.0451 | 9.0671 | 5268 | 0.9747 | 0.7724 | 0.9747 | 0.9872 |
| 0.0451 | 9.0706 | 5270 | 0.9605 | 0.7618 | 0.9605 | 0.9801 |
| 0.0451 | 9.0740 | 5272 | 0.9436 | 0.7520 | 0.9436 | 0.9714 |
| 0.0451 | 9.0775 | 5274 | 0.9316 | 0.7520 | 0.9316 | 0.9652 |
| 0.0451 | 9.0809 | 5276 | 0.9320 | 0.7520 | 0.9320 | 0.9654 |
| 0.0451 | 9.0843 | 5278 | 0.9243 | 0.7520 | 0.9243 | 0.9614 |
| 0.0451 | 9.0878 | 5280 | 0.9258 | 0.7520 | 0.9258 | 0.9622 |
| 0.0451 | 9.0912 | 5282 | 0.9273 | 0.7058 | 0.9273 | 0.9630 |
| 0.0451 | 9.0947 | 5284 | 0.9368 | 0.7058 | 0.9368 | 0.9679 |
| 0.0451 | 9.0981 | 5286 | 0.9443 | 0.7058 | 0.9443 | 0.9717 |
| 0.0451 | 9.1015 | 5288 | 0.9428 | 0.7058 | 0.9428 | 0.9710 |
| 0.0451 | 9.1050 | 5290 | 0.9367 | 0.7520 | 0.9367 | 0.9678 |
| 0.0451 | 9.1084 | 5292 | 0.9306 | 0.7520 | 0.9306 | 0.9647 |
| 0.0451 | 9.1119 | 5294 | 0.9317 | 0.7520 | 0.9317 | 0.9653 |
| 0.0451 | 9.1153 | 5296 | 0.9376 | 0.7618 | 0.9376 | 0.9683 |
| 0.0451 | 9.1188 | 5298 | 0.9456 | 0.7618 | 0.9456 | 0.9724 |
| 0.0451 | 9.1222 | 5300 | 0.9456 | 0.7618 | 0.9456 | 0.9724 |
| 0.0451 | 9.1256 | 5302 | 0.9552 | 0.7724 | 0.9552 | 0.9774 |
| 0.0451 | 9.1291 | 5304 | 0.9566 | 0.7724 | 0.9566 | 0.9781 |
| 0.0451 | 9.1325 | 5306 | 0.9539 | 0.7724 | 0.9539 | 0.9767 |
| 0.0451 | 9.1360 | 5308 | 0.9477 | 0.7618 | 0.9477 | 0.9735 |
| 0.0451 | 9.1394 | 5310 | 0.9425 | 0.7618 | 0.9425 | 0.9708 |
| 0.0451 | 9.1429 | 5312 | 0.9313 | 0.7618 | 0.9313 | 0.9651 |
| 0.0451 | 9.1463 | 5314 | 0.9150 | 0.7623 | 0.9150 | 0.9566 |
| 0.0451 | 9.1497 | 5316 | 0.9052 | 0.7623 | 0.9052 | 0.9514 |
| 0.0451 | 9.1532 | 5318 | 0.9049 | 0.7520 | 0.9049 | 0.9513 |
| 0.0451 | 9.1566 | 5320 | 0.9123 | 0.7520 | 0.9123 | 0.9551 |
| 0.0451 | 9.1601 | 5322 | 0.9162 | 0.7520 | 0.9162 | 0.9572 |
| 0.0451 | 9.1635 | 5324 | 0.9270 | 0.7618 | 0.9270 | 0.9628 |
| 0.0451 | 9.1670 | 5326 | 0.9357 | 0.7618 | 0.9357 | 0.9673 |
| 0.0451 | 9.1704 | 5328 | 0.9395 | 0.7618 | 0.9395 | 0.9693 |
| 0.0451 | 9.1738 | 5330 | 0.9444 | 0.7618 | 0.9444 | 0.9718 |
| 0.0451 | 9.1773 | 5332 | 0.9372 | 0.7618 | 0.9372 | 0.9681 |
| 0.0451 | 9.1807 | 5334 | 0.9315 | 0.7618 | 0.9315 | 0.9651 |
| 0.0451 | 9.1842 | 5336 | 0.9296 | 0.7618 | 0.9296 | 0.9641 |
| 0.0451 | 9.1876 | 5338 | 0.9345 | 0.7618 | 0.9345 | 0.9667 |
| 0.0451 | 9.1910 | 5340 | 0.9319 | 0.7618 | 0.9319 | 0.9653 |
| 0.0451 | 9.1945 | 5342 | 0.9209 | 0.7729 | 0.9209 | 0.9596 |
| 0.0451 | 9.1979 | 5344 | 0.9156 | 0.7729 | 0.9156 | 0.9569 |
| 0.0451 | 9.2014 | 5346 | 0.9126 | 0.7729 | 0.9126 | 0.9553 |
| 0.0451 | 9.2048 | 5348 | 0.9066 | 0.7623 | 0.9066 | 0.9522 |
| 0.0451 | 9.2083 | 5350 | 0.9089 | 0.7623 | 0.9089 | 0.9534 |
| 0.0451 | 9.2117 | 5352 | 0.9157 | 0.7729 | 0.9157 | 0.9569 |
| 0.0451 | 9.2151 | 5354 | 0.9227 | 0.7729 | 0.9227 | 0.9606 |
| 0.0451 | 9.2186 | 5356 | 0.9258 | 0.7618 | 0.9258 | 0.9622 |
| 0.0451 | 9.2220 | 5358 | 0.9335 | 0.7618 | 0.9335 | 0.9662 |
| 0.0451 | 9.2255 | 5360 | 0.9343 | 0.7618 | 0.9343 | 0.9666 |
| 0.0451 | 9.2289 | 5362 | 0.9375 | 0.7618 | 0.9375 | 0.9682 |
| 0.0451 | 9.2324 | 5364 | 0.9450 | 0.7618 | 0.9450 | 0.9721 |
| 0.0451 | 9.2358 | 5366 | 0.9461 | 0.7618 | 0.9461 | 0.9727 |
| 0.0451 | 9.2392 | 5368 | 0.9483 | 0.7618 | 0.9483 | 0.9738 |
| 0.0451 | 9.2427 | 5370 | 0.9440 | 0.7618 | 0.9440 | 0.9716 |
| 0.0451 | 9.2461 | 5372 | 0.9448 | 0.7618 | 0.9448 | 0.9720 |
| 0.0451 | 9.2496 | 5374 | 0.9392 | 0.7618 | 0.9392 | 0.9691 |
| 0.0451 | 9.2530 | 5376 | 0.9262 | 0.7520 | 0.9262 | 0.9624 |
| 0.0451 | 9.2565 | 5378 | 0.9081 | 0.7520 | 0.9081 | 0.9529 |
| 0.0451 | 9.2599 | 5380 | 0.8914 | 0.7623 | 0.8914 | 0.9441 |
| 0.0451 | 9.2633 | 5382 | 0.8802 | 0.7623 | 0.8802 | 0.9382 |
| 0.0451 | 9.2668 | 5384 | 0.8791 | 0.7623 | 0.8791 | 0.9376 |
| 0.0451 | 9.2702 | 5386 | 0.8852 | 0.7520 | 0.8852 | 0.9409 |
| 0.0451 | 9.2737 | 5388 | 0.8846 | 0.7520 | 0.8846 | 0.9406 |
| 0.0451 | 9.2771 | 5390 | 0.8837 | 0.7520 | 0.8837 | 0.9401 |
| 0.0451 | 9.2806 | 5392 | 0.8859 | 0.7520 | 0.8859 | 0.9412 |
| 0.0451 | 9.2840 | 5394 | 0.8861 | 0.7520 | 0.8861 | 0.9413 |
| 0.0451 | 9.2874 | 5396 | 0.8853 | 0.7520 | 0.8853 | 0.9409 |
| 0.0451 | 9.2909 | 5398 | 0.8843 | 0.7623 | 0.8843 | 0.9404 |
| 0.0451 | 9.2943 | 5400 | 0.8795 | 0.7623 | 0.8795 | 0.9378 |
| 0.0451 | 9.2978 | 5402 | 0.8805 | 0.7623 | 0.8805 | 0.9383 |
| 0.0451 | 9.3012 | 5404 | 0.8788 | 0.7623 | 0.8788 | 0.9374 |
| 0.0451 | 9.3046 | 5406 | 0.8738 | 0.7623 | 0.8738 | 0.9348 |
| 0.0451 | 9.3081 | 5408 | 0.8743 | 0.7623 | 0.8743 | 0.9350 |
| 0.0451 | 9.3115 | 5410 | 0.8772 | 0.7623 | 0.8772 | 0.9366 |
| 0.0451 | 9.3150 | 5412 | 0.8884 | 0.7623 | 0.8884 | 0.9426 |
| 0.0451 | 9.3184 | 5414 | 0.9009 | 0.7623 | 0.9009 | 0.9492 |
| 0.0451 | 9.3219 | 5416 | 0.9090 | 0.7520 | 0.9090 | 0.9534 |
| 0.0451 | 9.3253 | 5418 | 0.9227 | 0.7520 | 0.9227 | 0.9606 |
| 0.0451 | 9.3287 | 5420 | 0.9336 | 0.7520 | 0.9336 | 0.9662 |
| 0.0451 | 9.3322 | 5422 | 0.9471 | 0.7724 | 0.9471 | 0.9732 |
| 0.0451 | 9.3356 | 5424 | 0.9550 | 0.7724 | 0.9550 | 0.9772 |
| 0.0451 | 9.3391 | 5426 | 0.9579 | 0.7724 | 0.9579 | 0.9787 |
| 0.0451 | 9.3425 | 5428 | 0.9548 | 0.7724 | 0.9548 | 0.9771 |
| 0.0451 | 9.3460 | 5430 | 0.9560 | 0.7724 | 0.9560 | 0.9778 |
| 0.0451 | 9.3494 | 5432 | 0.9553 | 0.7724 | 0.9553 | 0.9774 |
| 0.0451 | 9.3528 | 5434 | 0.9496 | 0.7618 | 0.9496 | 0.9745 |
| 0.0451 | 9.3563 | 5436 | 0.9494 | 0.7618 | 0.9494 | 0.9744 |
| 0.0451 | 9.3597 | 5438 | 0.9448 | 0.7520 | 0.9448 | 0.9720 |
| 0.0451 | 9.3632 | 5440 | 0.9350 | 0.7520 | 0.9350 | 0.9670 |
| 0.0451 | 9.3666 | 5442 | 0.9249 | 0.7520 | 0.9249 | 0.9617 |
| 0.0451 | 9.3701 | 5444 | 0.9123 | 0.7520 | 0.9123 | 0.9551 |
| 0.0451 | 9.3735 | 5446 | 0.8979 | 0.7520 | 0.8979 | 0.9476 |
| 0.0451 | 9.3769 | 5448 | 0.8820 | 0.7520 | 0.8820 | 0.9392 |
| 0.0451 | 9.3804 | 5450 | 0.8640 | 0.7623 | 0.8640 | 0.9295 |
| 0.0451 | 9.3838 | 5452 | 0.8498 | 0.7623 | 0.8498 | 0.9218 |
| 0.0451 | 9.3873 | 5454 | 0.8391 | 0.7623 | 0.8391 | 0.9160 |
| 0.0451 | 9.3907 | 5456 | 0.8350 | 0.7623 | 0.8350 | 0.9138 |
| 0.0451 | 9.3941 | 5458 | 0.8324 | 0.7623 | 0.8324 | 0.9124 |
| 0.0451 | 9.3976 | 5460 | 0.8361 | 0.7623 | 0.8361 | 0.9144 |
| 0.0451 | 9.4010 | 5462 | 0.8427 | 0.7623 | 0.8427 | 0.9180 |
| 0.0451 | 9.4045 | 5464 | 0.8417 | 0.7623 | 0.8417 | 0.9175 |
| 0.0451 | 9.4079 | 5466 | 0.8449 | 0.7623 | 0.8449 | 0.9192 |
| 0.0451 | 9.4114 | 5468 | 0.8540 | 0.7623 | 0.8540 | 0.9241 |
| 0.0451 | 9.4148 | 5470 | 0.8633 | 0.7623 | 0.8633 | 0.9291 |
| 0.0451 | 9.4182 | 5472 | 0.8726 | 0.7623 | 0.8726 | 0.9341 |
| 0.0451 | 9.4217 | 5474 | 0.8838 | 0.7623 | 0.8838 | 0.9401 |
| 0.0451 | 9.4251 | 5476 | 0.8979 | 0.7623 | 0.8979 | 0.9476 |
| 0.0451 | 9.4286 | 5478 | 0.9101 | 0.7623 | 0.9101 | 0.9540 |
| 0.0451 | 9.4320 | 5480 | 0.9252 | 0.7729 | 0.9252 | 0.9619 |
| 0.0451 | 9.4355 | 5482 | 0.9348 | 0.7724 | 0.9348 | 0.9668 |
| 0.0451 | 9.4389 | 5484 | 0.9378 | 0.7724 | 0.9378 | 0.9684 |
| 0.0451 | 9.4423 | 5486 | 0.9361 | 0.7724 | 0.9361 | 0.9675 |
| 0.0451 | 9.4458 | 5488 | 0.9272 | 0.7844 | 0.9272 | 0.9629 |
| 0.0451 | 9.4492 | 5490 | 0.9219 | 0.7844 | 0.9219 | 0.9602 |
| 0.0451 | 9.4527 | 5492 | 0.9184 | 0.7844 | 0.9184 | 0.9583 |
| 0.0451 | 9.4561 | 5494 | 0.9219 | 0.7844 | 0.9219 | 0.9601 |
| 0.0451 | 9.4596 | 5496 | 0.9241 | 0.7844 | 0.9241 | 0.9613 |
| 0.0451 | 9.4630 | 5498 | 0.9192 | 0.7844 | 0.9192 | 0.9587 |
| 0.041 | 9.4664 | 5500 | 0.9151 | 0.7844 | 0.9151 | 0.9566 |
| 0.041 | 9.4699 | 5502 | 0.9152 | 0.7844 | 0.9152 | 0.9567 |
| 0.041 | 9.4733 | 5504 | 0.9190 | 0.7844 | 0.9190 | 0.9586 |
| 0.041 | 9.4768 | 5506 | 0.9187 | 0.7844 | 0.9187 | 0.9585 |
| 0.041 | 9.4802 | 5508 | 0.9126 | 0.7844 | 0.9126 | 0.9553 |
| 0.041 | 9.4836 | 5510 | 0.9119 | 0.7844 | 0.9119 | 0.9549 |
| 0.041 | 9.4871 | 5512 | 0.9129 | 0.7724 | 0.9129 | 0.9555 |
| 0.041 | 9.4905 | 5514 | 0.9125 | 0.7618 | 0.9125 | 0.9553 |
| 0.041 | 9.4940 | 5516 | 0.9157 | 0.7724 | 0.9157 | 0.9569 |
| 0.041 | 9.4974 | 5518 | 0.9182 | 0.7724 | 0.9182 | 0.9582 |
| 0.041 | 9.5009 | 5520 | 0.9169 | 0.7618 | 0.9169 | 0.9576 |
| 0.041 | 9.5043 | 5522 | 0.9146 | 0.7618 | 0.9146 | 0.9563 |
| 0.041 | 9.5077 | 5524 | 0.9086 | 0.7618 | 0.9086 | 0.9532 |
| 0.041 | 9.5112 | 5526 | 0.8990 | 0.7520 | 0.8990 | 0.9481 |
| 0.041 | 9.5146 | 5528 | 0.8902 | 0.7520 | 0.8902 | 0.9435 |
| 0.041 | 9.5181 | 5530 | 0.8835 | 0.7520 | 0.8835 | 0.9400 |
| 0.041 | 9.5215 | 5532 | 0.8775 | 0.7623 | 0.8775 | 0.9368 |
| 0.041 | 9.5250 | 5534 | 0.8774 | 0.7623 | 0.8774 | 0.9367 |
| 0.041 | 9.5284 | 5536 | 0.8789 | 0.7623 | 0.8789 | 0.9375 |
| 0.041 | 9.5318 | 5538 | 0.8830 | 0.7623 | 0.8830 | 0.9397 |
| 0.041 | 9.5353 | 5540 | 0.8872 | 0.7623 | 0.8872 | 0.9419 |
| 0.041 | 9.5387 | 5542 | 0.8920 | 0.7623 | 0.8920 | 0.9445 |
| 0.041 | 9.5422 | 5544 | 0.8945 | 0.7623 | 0.8945 | 0.9458 |
| 0.041 | 9.5456 | 5546 | 0.8973 | 0.7520 | 0.8973 | 0.9472 |
| 0.041 | 9.5491 | 5548 | 0.8968 | 0.7623 | 0.8968 | 0.9470 |
| 0.041 | 9.5525 | 5550 | 0.8909 | 0.7623 | 0.8909 | 0.9439 |
| 0.041 | 9.5559 | 5552 | 0.8849 | 0.7623 | 0.8849 | 0.9407 |
| 0.041 | 9.5594 | 5554 | 0.8772 | 0.7623 | 0.8772 | 0.9366 |
| 0.041 | 9.5628 | 5556 | 0.8687 | 0.7623 | 0.8687 | 0.9320 |
| 0.041 | 9.5663 | 5558 | 0.8588 | 0.7623 | 0.8588 | 0.9267 |
| 0.041 | 9.5697 | 5560 | 0.8543 | 0.7623 | 0.8543 | 0.9243 |
| 0.041 | 9.5731 | 5562 | 0.8549 | 0.7623 | 0.8549 | 0.9246 |
| 0.041 | 9.5766 | 5564 | 0.8561 | 0.7623 | 0.8561 | 0.9252 |
| 0.041 | 9.5800 | 5566 | 0.8572 | 0.7623 | 0.8572 | 0.9258 |
| 0.041 | 9.5835 | 5568 | 0.8621 | 0.7623 | 0.8621 | 0.9285 |
| 0.041 | 9.5869 | 5570 | 0.8643 | 0.7623 | 0.8643 | 0.9297 |
| 0.041 | 9.5904 | 5572 | 0.8684 | 0.7623 | 0.8684 | 0.9319 |
| 0.041 | 9.5938 | 5574 | 0.8736 | 0.7623 | 0.8736 | 0.9346 |
| 0.041 | 9.5972 | 5576 | 0.8813 | 0.7623 | 0.8813 | 0.9388 |
| 0.041 | 9.6007 | 5578 | 0.8870 | 0.7623 | 0.8870 | 0.9418 |
| 0.041 | 9.6041 | 5580 | 0.8917 | 0.7623 | 0.8917 | 0.9443 |
| 0.041 | 9.6076 | 5582 | 0.8979 | 0.7729 | 0.8979 | 0.9476 |
| 0.041 | 9.6110 | 5584 | 0.9045 | 0.7729 | 0.9045 | 0.9510 |
| 0.041 | 9.6145 | 5586 | 0.9077 | 0.7729 | 0.9077 | 0.9527 |
| 0.041 | 9.6179 | 5588 | 0.9101 | 0.7729 | 0.9101 | 0.9540 |
| 0.041 | 9.6213 | 5590 | 0.9123 | 0.7618 | 0.9123 | 0.9552 |
| 0.041 | 9.6248 | 5592 | 0.9137 | 0.7618 | 0.9137 | 0.9559 |
| 0.041 | 9.6282 | 5594 | 0.9155 | 0.7618 | 0.9155 | 0.9568 |
| 0.041 | 9.6317 | 5596 | 0.9170 | 0.7618 | 0.9170 | 0.9576 |
| 0.041 | 9.6351 | 5598 | 0.9178 | 0.7618 | 0.9178 | 0.9580 |
| 0.041 | 9.6386 | 5600 | 0.9193 | 0.7618 | 0.9193 | 0.9588 |
| 0.041 | 9.6420 | 5602 | 0.9199 | 0.7618 | 0.9199 | 0.9591 |
| 0.041 | 9.6454 | 5604 | 0.9177 | 0.7729 | 0.9177 | 0.9580 |
| 0.041 | 9.6489 | 5606 | 0.9151 | 0.7729 | 0.9151 | 0.9566 |
| 0.041 | 9.6523 | 5608 | 0.9122 | 0.7729 | 0.9122 | 0.9551 |
| 0.041 | 9.6558 | 5610 | 0.9074 | 0.7729 | 0.9074 | 0.9526 |
| 0.041 | 9.6592 | 5612 | 0.9027 | 0.7623 | 0.9027 | 0.9501 |
| 0.041 | 9.6627 | 5614 | 0.9002 | 0.7623 | 0.9002 | 0.9488 |
| 0.041 | 9.6661 | 5616 | 0.8984 | 0.7623 | 0.8984 | 0.9478 |
| 0.041 | 9.6695 | 5618 | 0.8994 | 0.7623 | 0.8994 | 0.9483 |
| 0.041 | 9.6730 | 5620 | 0.8983 | 0.7623 | 0.8983 | 0.9478 |
| 0.041 | 9.6764 | 5622 | 0.8981 | 0.7623 | 0.8981 | 0.9477 |
| 0.041 | 9.6799 | 5624 | 0.8977 | 0.7623 | 0.8977 | 0.9475 |
| 0.041 | 9.6833 | 5626 | 0.8993 | 0.7729 | 0.8993 | 0.9483 |
| 0.041 | 9.6867 | 5628 | 0.9021 | 0.7729 | 0.9021 | 0.9498 |
| 0.041 | 9.6902 | 5630 | 0.9062 | 0.7729 | 0.9062 | 0.9520 |
| 0.041 | 9.6936 | 5632 | 0.9091 | 0.7729 | 0.9091 | 0.9535 |
| 0.041 | 9.6971 | 5634 | 0.9115 | 0.7729 | 0.9115 | 0.9547 |
| 0.041 | 9.7005 | 5636 | 0.9100 | 0.7729 | 0.9100 | 0.9540 |
| 0.041 | 9.7040 | 5638 | 0.9053 | 0.7729 | 0.9053 | 0.9515 |
| 0.041 | 9.7074 | 5640 | 0.9003 | 0.7729 | 0.9003 | 0.9488 |
| 0.041 | 9.7108 | 5642 | 0.8969 | 0.7729 | 0.8969 | 0.9470 |
| 0.041 | 9.7143 | 5644 | 0.8937 | 0.7729 | 0.8937 | 0.9453 |
| 0.041 | 9.7177 | 5646 | 0.8928 | 0.7623 | 0.8928 | 0.9449 |
| 0.041 | 9.7212 | 5648 | 0.8891 | 0.7623 | 0.8891 | 0.9429 |
| 0.041 | 9.7246 | 5650 | 0.8848 | 0.7623 | 0.8848 | 0.9407 |
| 0.041 | 9.7281 | 5652 | 0.8823 | 0.7623 | 0.8823 | 0.9393 |
| 0.041 | 9.7315 | 5654 | 0.8837 | 0.7623 | 0.8837 | 0.9400 |
| 0.041 | 9.7349 | 5656 | 0.8872 | 0.7623 | 0.8872 | 0.9419 |
| 0.041 | 9.7384 | 5658 | 0.8910 | 0.7623 | 0.8910 | 0.9439 |
| 0.041 | 9.7418 | 5660 | 0.8967 | 0.7623 | 0.8967 | 0.9469 |
| 0.041 | 9.7453 | 5662 | 0.9031 | 0.7729 | 0.9031 | 0.9503 |
| 0.041 | 9.7487 | 5664 | 0.9086 | 0.7729 | 0.9086 | 0.9532 |
| 0.041 | 9.7522 | 5666 | 0.9129 | 0.7729 | 0.9129 | 0.9554 |
| 0.041 | 9.7556 | 5668 | 0.9163 | 0.7729 | 0.9163 | 0.9573 |
| 0.041 | 9.7590 | 5670 | 0.9165 | 0.7729 | 0.9165 | 0.9573 |
| 0.041 | 9.7625 | 5672 | 0.9147 | 0.7729 | 0.9147 | 0.9564 |
| 0.041 | 9.7659 | 5674 | 0.9166 | 0.7729 | 0.9166 | 0.9574 |
| 0.041 | 9.7694 | 5676 | 0.9188 | 0.7618 | 0.9188 | 0.9585 |
| 0.041 | 9.7728 | 5678 | 0.9205 | 0.7618 | 0.9205 | 0.9594 |
| 0.041 | 9.7762 | 5680 | 0.9226 | 0.7618 | 0.9226 | 0.9605 |
| 0.041 | 9.7797 | 5682 | 0.9258 | 0.7618 | 0.9258 | 0.9622 |
| 0.041 | 9.7831 | 5684 | 0.9265 | 0.7618 | 0.9265 | 0.9625 |
| 0.041 | 9.7866 | 5686 | 0.9253 | 0.7618 | 0.9253 | 0.9619 |
| 0.041 | 9.7900 | 5688 | 0.9230 | 0.7618 | 0.9230 | 0.9607 |
| 0.041 | 9.7935 | 5690 | 0.9196 | 0.7618 | 0.9196 | 0.9589 |
| 0.041 | 9.7969 | 5692 | 0.9151 | 0.7729 | 0.9151 | 0.9566 |
| 0.041 | 9.8003 | 5694 | 0.9115 | 0.7729 | 0.9115 | 0.9547 |
| 0.041 | 9.8038 | 5696 | 0.9065 | 0.7623 | 0.9065 | 0.9521 |
| 0.041 | 9.8072 | 5698 | 0.9015 | 0.7623 | 0.9015 | 0.9495 |
| 0.041 | 9.8107 | 5700 | 0.8989 | 0.7623 | 0.8989 | 0.9481 |
| 0.041 | 9.8141 | 5702 | 0.8964 | 0.7623 | 0.8964 | 0.9468 |
| 0.041 | 9.8176 | 5704 | 0.8953 | 0.7623 | 0.8953 | 0.9462 |
| 0.041 | 9.8210 | 5706 | 0.8950 | 0.7623 | 0.8950 | 0.9460 |
| 0.041 | 9.8244 | 5708 | 0.8955 | 0.7623 | 0.8955 | 0.9463 |
| 0.041 | 9.8279 | 5710 | 0.8970 | 0.7623 | 0.8970 | 0.9471 |
| 0.041 | 9.8313 | 5712 | 0.8987 | 0.7623 | 0.8987 | 0.9480 |
| 0.041 | 9.8348 | 5714 | 0.9022 | 0.7623 | 0.9022 | 0.9498 |
| 0.041 | 9.8382 | 5716 | 0.9043 | 0.7623 | 0.9043 | 0.9510 |
| 0.041 | 9.8417 | 5718 | 0.9045 | 0.7623 | 0.9045 | 0.9510 |
| 0.041 | 9.8451 | 5720 | 0.9055 | 0.7623 | 0.9055 | 0.9516 |
| 0.041 | 9.8485 | 5722 | 0.9053 | 0.7623 | 0.9053 | 0.9515 |
| 0.041 | 9.8520 | 5724 | 0.9062 | 0.7623 | 0.9062 | 0.9520 |
| 0.041 | 9.8554 | 5726 | 0.9068 | 0.7623 | 0.9068 | 0.9523 |
| 0.041 | 9.8589 | 5728 | 0.9080 | 0.7623 | 0.9080 | 0.9529 |
| 0.041 | 9.8623 | 5730 | 0.9106 | 0.7729 | 0.9106 | 0.9542 |
| 0.041 | 9.8657 | 5732 | 0.9126 | 0.7729 | 0.9126 | 0.9553 |
| 0.041 | 9.8692 | 5734 | 0.9132 | 0.7729 | 0.9132 | 0.9556 |
| 0.041 | 9.8726 | 5736 | 0.9120 | 0.7729 | 0.9120 | 0.9550 |
| 0.041 | 9.8761 | 5738 | 0.9104 | 0.7729 | 0.9104 | 0.9542 |
| 0.041 | 9.8795 | 5740 | 0.9085 | 0.7729 | 0.9085 | 0.9531 |
| 0.041 | 9.8830 | 5742 | 0.9071 | 0.7623 | 0.9071 | 0.9524 |
| 0.041 | 9.8864 | 5744 | 0.9052 | 0.7623 | 0.9052 | 0.9514 |
| 0.041 | 9.8898 | 5746 | 0.9031 | 0.7623 | 0.9031 | 0.9503 |
| 0.041 | 9.8933 | 5748 | 0.9021 | 0.7623 | 0.9021 | 0.9498 |
| 0.041 | 9.8967 | 5750 | 0.9028 | 0.7623 | 0.9028 | 0.9502 |
| 0.041 | 9.9002 | 5752 | 0.9026 | 0.7623 | 0.9026 | 0.9500 |
| 0.041 | 9.9036 | 5754 | 0.9028 | 0.7729 | 0.9028 | 0.9502 |
| 0.041 | 9.9071 | 5756 | 0.9029 | 0.7729 | 0.9029 | 0.9502 |
| 0.041 | 9.9105 | 5758 | 0.9034 | 0.7729 | 0.9034 | 0.9505 |
| 0.041 | 9.9139 | 5760 | 0.9030 | 0.7729 | 0.9030 | 0.9503 |
| 0.041 | 9.9174 | 5762 | 0.9021 | 0.7729 | 0.9021 | 0.9498 |
| 0.041 | 9.9208 | 5764 | 0.9013 | 0.7729 | 0.9013 | 0.9493 |
| 0.041 | 9.9243 | 5766 | 0.9006 | 0.7729 | 0.9006 | 0.9490 |
| 0.041 | 9.9277 | 5768 | 0.9004 | 0.7729 | 0.9004 | 0.9489 |
| 0.041 | 9.9312 | 5770 | 0.9002 | 0.7729 | 0.9002 | 0.9488 |
| 0.041 | 9.9346 | 5772 | 0.9000 | 0.7729 | 0.9000 | 0.9487 |
| 0.041 | 9.9380 | 5774 | 0.8992 | 0.7729 | 0.8992 | 0.9483 |
| 0.041 | 9.9415 | 5776 | 0.8978 | 0.7729 | 0.8978 | 0.9475 |
| 0.041 | 9.9449 | 5778 | 0.8961 | 0.7623 | 0.8961 | 0.9466 |
| 0.041 | 9.9484 | 5780 | 0.8945 | 0.7623 | 0.8945 | 0.9458 |
| 0.041 | 9.9518 | 5782 | 0.8932 | 0.7623 | 0.8932 | 0.9451 |
| 0.041 | 9.9552 | 5784 | 0.8925 | 0.7623 | 0.8925 | 0.9447 |
| 0.041 | 9.9587 | 5786 | 0.8919 | 0.7623 | 0.8919 | 0.9444 |
| 0.041 | 9.9621 | 5788 | 0.8912 | 0.7623 | 0.8912 | 0.9441 |
| 0.041 | 9.9656 | 5790 | 0.8906 | 0.7623 | 0.8906 | 0.9437 |
| 0.041 | 9.9690 | 5792 | 0.8902 | 0.7623 | 0.8902 | 0.9435 |
| 0.041 | 9.9725 | 5794 | 0.8900 | 0.7623 | 0.8900 | 0.9434 |
| 0.041 | 9.9759 | 5796 | 0.8898 | 0.7623 | 0.8898 | 0.9433 |
| 0.041 | 9.9793 | 5798 | 0.8897 | 0.7623 | 0.8897 | 0.9433 |
| 0.041 | 9.9828 | 5800 | 0.8897 | 0.7623 | 0.8897 | 0.9432 |
| 0.041 | 9.9862 | 5802 | 0.8898 | 0.7623 | 0.8898 | 0.9433 |
| 0.041 | 9.9897 | 5804 | 0.8898 | 0.7623 | 0.8898 | 0.9433 |
| 0.041 | 9.9931 | 5806 | 0.8899 | 0.7623 | 0.8899 | 0.9433 |
| 0.041 | 9.9966 | 5808 | 0.8900 | 0.7623 | 0.8900 | 0.9434 |
| 0.041 | 10.0 | 5810 | 0.8900 | 0.7623 | 0.8900 | 0.9434 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
DavidAU/WestLake-12.7B-v2-Brainiac-GGUF | DavidAU | 2024-11-14T05:54:48Z | 113 | 2 | null | [
"gguf",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"story",
"writing",
"fiction",
"roleplaying",
"swearing",
"extreme swearing",
"rp",
"horror",
"mistral",
"mergekit",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-08-07T07:51:19Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- story
- writing
- fiction
- roleplaying
- swearing
- extreme swearing
- rp
- horror
- mistral
- mergekit
pipeline_tag: text-generation
---
<h3><font color="green"> WestLake-12.7B-v2-Brainiac Version with "Brainstorm 10X". </font></h3>
<img src="west-lake-brains.jpg" style="float:right; width:320px; height:320px; padding:5px;">
<B><font color="red">WARNING:</font> HORROR. Swearing. UNCENSORED. </B>
I took the original "Westlake 10.7B V2" and expanded it to 12.7B using the Brainstorm 10x method to augment generation.
This version is named "Brainiac" because the process improved logic problem solving, in addition to generation quality,
relative to the base model, Westlake 10.7B V2.
<B>What is "Brainstorm" ?</b>
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: Progressive LLaMA with Block Expansion</a>.
However, I went in a different direction from what was outlined in this paper, because... well, I am like that.
The reasoning center of an LLM is taken apart, reassembled, and expanded. Then these centers are
individually calibrated. These "centers" also interact with each other. This introduces subtle changes
into the reasoning process. The calibrations further adjust - dial up or down - these "changes" further.
The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
The core aim of this process is to increase the model's level of detail, its concept of and connection to the "world",
its general concept connections, and its prose quality and length, without affecting instruction following.
This will also enhance any creative use case of any kind, including "brainstorming", creative art forms, and similar use cases.
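For intuition only, here is a minimal Python sketch of plain block duplication in the spirit of the paper above. This is a hypothetical illustration, not the actual Brainstorm code (which is unpublished and deliberately diverges from this approach); the function and parameter names are my own:

```python
# Hypothetical sketch of block expansion, for intuition only. The real
# "Brainstorm" process is unpublished and deliberately diverges from this.
import copy

def expand_reasoning_blocks(layers, n_copies=10):
    """Duplicate the final decoder block n_copies times.

    `layers` is a list of decoder-layer modules. In practice each clone
    would then be calibrated (dialed up or down) independently, providing
    the extra "tuning points" described above.
    """
    expanded = list(layers)
    for _ in range(n_copies):
        expanded.append(copy.deepcopy(layers[-1]))  # clone the final block
    return expanded
```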
Examples added below.
<B>What is "Westlake V2" ?</B>
(taken from the original model makers page)
This is my first viable self-merge of the fantastic WestLake-7B-v2 model, obtained after more than 12 rounds of testing different
merge configurations. In my [LLM Creativity Benchmark](https://huggingface.co/datasets/froggeric/creativity), it greatly improves over the original 7B model, and ranks between miqu-1-120b
and goliath-120b! I would describe the improvements as a better writing style, with more details. It has a bit more difficulty following instructions, but not by much.
It is also the first model I have tested to obtain a perfect score with the following test:
```
Write a sequence of nominal groups that flow into one another, using the following rules:
- each nominal group is made of exactly 3 words
- the first word of each nominal group must be the last word of the previous nominal group
- the first word of the first nominal group is: "ball"
- the last word of the last nominal group is: "stone"
- there must be a theme, of your choosing, pertaining to all nominal groups
- there must be exactly 7 nominal groups, leading from the first word (ball) to the last word (stone)
- a word already used at the beginning and end of a nominal group cannot be reused
Present your solution as a list numbered with roman numerals.
Finally, explain why you chose your specific theme.
```
## Usage
* Base model: senseable/WestLake-7B-v2, based on Mistral-7B-v0.1
* Context size: **8192** (even though Mistral-7B is 32k, WestLake was trained with 8k, and using a larger context is likely to cause problems)
* Prompt format: in general, Mistral-based models are able to understand many prompt formats, but the following produce the best results, and are recommended (in order of preference; a minimal loading sketch follows this list)
- **Alpaca** (reported by senseable as working better than ChatML, and confirmed by me)
- ChatML (used during WestLake training)
- Mistral Instruct (original format from Mistral-7B)
- Zephyr (variant of ChatML which I have found to sometimes produce better results)
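A minimal loading sketch, assuming llama-cpp-python is installed; the GGUF filename below is an assumption (use the exact filename from this repo's file list). The temperature and repetition penalty match the values recommended elsewhere on this card:

```python
# Minimal sketch with llama-cpp-python; the GGUF filename below is an
# assumption -- use the exact filename from this repo's file list.
from llama_cpp import Llama

llm = Llama(
    model_path="WestLake-12.7B-v2-Brainiac-Q4_K_M.gguf",  # hypothetical name
    n_ctx=8192,  # keep 8k: WestLake was trained at 8k (see note above)
)

# Alpaca, the preferred prompt format:
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a two-sentence horror scene set in an elevator.\n\n"
    "### Response:\n"
)

out = llm(prompt, max_tokens=256, temperature=0.8, repeat_penalty=1.1)
print(out["choices"][0]["text"])
```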
## Merge Details
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).\
This model was merged using the passthrough merge method.\
The following models were included in the merge:
* [senseable/WestLake-7B-v2](https://huggingface.co/senseable/WestLake-7B-v2)
The following YAML configuration was used to produce this model:
```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
- model: senseable/WestLake-7B-v2
layer_range: [0,9]
- sources:
- model: senseable/WestLake-7B-v2
layer_range: [5,14]
- sources:
- model: senseable/WestLake-7B-v2
layer_range: [10,19]
- sources:
- model: senseable/WestLake-7B-v2
layer_range: [15,24]
- sources:
- model: senseable/WestLake-7B-v2
layer_range: [20,32]
```
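Note the overlapping layer ranges above. As a quick, illustrative check (my own arithmetic, not from the original card), the five slices stack 48 decoder layers from the 32-layer Mistral base, which is roughly where the growth from 7B to 10.7B parameters comes from:

```python
# Illustrative depth check for the passthrough config above.
slices = [(0, 9), (5, 14), (10, 19), (15, 24), (20, 32)]
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 48 layers, up from 32 in the Mistral-7B base
```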
---
# Original model card: Westlake-7Bv2: Role-Play & Text Generation Specialist Model
**Update Notes:**
*Version 2 trained 1 additional epoch cycle for 3 total*
Welcome to the documentation of Westlake-7B, a cutting-edge language model designed for exceptional role-play and text generation tasks. This README file aims to provide an overview of our capabilities, usage guidelines, and potential applications.
## About Westlake-7Bv2
Westlake-7B is built upon a vast corpus of diverse texts, enabling it to generate contextually relevant responses in various scenarios. With its impressive size of 7 billion parameters, this model excels at understanding nuances in language and producing creative outputs.
### Key Features
1. **Role-Play**: Westlake-7Bv2 can seamlessly adapt to different character personas and engage in dynamic conversations while maintaining consistency throughout the interaction. It can generate believable dialogues across various genres, including fiction, non-fiction, historical events, or even fantasy worlds.
2. **Text Generation**: This model is proficient at generating original content such as stories, poems, essays, news articles, and more. Its ability to capture the essence of different writing styles makes it an ideal tool for creative writers seeking inspiration or assistance in their projects.
3. **Contextual Understanding**: Westlake-7B's extensive training allows it to comprehend complex contexts and generate responses that align with given situations. It can handle multiple topics simultaneously, making it versatile across various applications.
4. **Continuous Learning**: As a language model, Westlake-7B continuously improves its performance through ongoing training on new data sets. This ensures its capabilities remain up-to-date and relevant in an ever-evolving world of communication.
## Usage Guidelines
To utilize Westlake-7Bv2 for your projects or experiments, follow these steps:
1. **Prompting**: Provide clear and concise prompts that outline the desired role-play scenario or text generation task. The quality of output depends heavily on the clarity and relevance of input instructions.
2. **Feedback Loop**: For optimal results, consider incorporating a feedback loop into your application to refine generated outputs based on user preferences or additional contextual information. This iterative process can significantly enhance the model's performance in specific domains.
3. **Ethical Considerations**: As with any AI system, ensure responsible usage of Westlake-7B by avoiding harmful content generation or misuse of its capabilities.
## Potential Applications
Westlake-7Bv2's versatility makes it suitable for various applications across different industries:
1. **Creative Writing**: Assist authors in generating new ideas, expanding storylines, or even completing drafts by providing creative suggestions and textual content.
2. **Education**: Enhance language learning platforms with interactive role-play scenarios to improve students' communication skills and cultural understanding.
3. **Gaming**: Integrate Westlake-7B into game engines for dynamic non-player character interactions or generating unique questlines based on player choices.
4. **Customer Support**: Leverage the model's conversational abilities to create chatbots capable of handling complex queries and providing personalized assistance.
5. **Social Media**: Develop applications that generate engaging content such as captions, status updates, or even entire posts tailored to users' preferences and interests.
---
End original model card notes.
---
<B>Settings / Known Issue(s) and Fix(es):</b>
The default "repetition penalty" (from LMStudio) of 1.1 is recommended (this was used for the example generations below).
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
In this case the enhancements do not have as strong an effect as when used in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement which seems to help the scene generation
and scene continue functions.
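If you do use it as a system role, here is a minimal chat-style sketch, assuming llama-cpp-python and the block above saved verbatim to a text file (the model filename is again an assumption):

```python
# Hypothetical chat-style application of the enhancement above.
from llama_cpp import Llama

llm = Llama(model_path="WestLake-12.7B-v2-Brainiac-Q4_K_M.gguf", n_ctx=8192)

with open("enhancement.txt") as f:  # the <PRE> block above, saved exactly as shown
    enhancement = f.read()

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": enhancement},
        {"role": "user", "content": "Continue this scene: The lights flickered..."},
    ],
    temperature=0.8,
    repeat_penalty=1.1,
)
print(out["choices"][0]["message"]["content"])
```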
This enhancement WAS NOT used to generate the examples below.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 2" / "Class 3" model:
For all settings used for this model (including specifics for its "class"), example generations, and an advanced settings guide (which often addresses model issues and covers methods to improve performance for all use cases, including chat and roleplay), as well as the full parameters and samplers used for generation, please see:

[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>Models Used:</h3>
Special thanks to the incredible work of the model maker(s) "Froggeric" and "Senseable".
Model(s) used:
[ https://huggingface.co/froggeric/WestLake-10.7B-v2 ]
[ https://huggingface.co/senseable/WestLake-7B-v2 ]
<h3>EXAMPLES:</h3>
Examples are created using quant Q4_K_M, "temp=.8", minimal parameters and "Alpaca" template.
Below are the least creative outputs, prompt is in <B>BOLD</B>.
Higher quants will result in better quality.
There will also be some variance between "close" quants like Q4_K_M/Q4_K_S and Q5_K_M/Q5_K_S, so if you are going
to use Q4_K_M, I suggest you also try Q4_K_S.
Also, slightly longer / more detailed prompts will result in greater creativity (as well as different prose -
i.e. dialog, thoughts, paragraph size differences, and so on).
---
<B><font color="red">WARNING:</font> NSFW. HORROR. Swearing. Maybe UNCENSORED. </B>
---
<B>
Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...
</B>
<B>GENERATION 1:</B>
Darkness cloaked the cityscape beneath us tonight. A thick fog blanketed everything below, obscuring the usual bustling streets and illuminating signs of civilization. But tonight was eerily silent except for the distant echoes of sirens piercing the damp darkness. I shivered involuntarily despite the oppressive humidity weighing down upon me.
The skyscraper swayed, a sinister dance of steel and concrete, amplifying my unease. It was unsettling to witness such architectural giants tremble amidst the eerie quietude. But tonight's peculiar trembling felt different somehow; more violent, more deliberate.
As I watched the window in front of me on the 21st floor explode outwards abruptly, jagged fragments showered down towards us like raindrops of terror. Panicked screams erupted from the lobby below as terrified office workers rushed towards safety. But my terrified screams remained trapped within my trembling lips.
I gripped tightly onto whatever furniture I could find near me - a wooden chair's broken leg piercing painfully into my palm, but I hardly noticed it anymore. My heart pounded furiously inside chest as if trying desperately to escape the confines of ribcage.
" evacuate! " screamed a faint voice echoing through the eerily silent hallway. " Evacuation protocols are now in effect!"
The sudden urgency in his tone pierced through my fear-induced numbness, jolting me out of my terrified reverie. I sprung upright abruptly, adrenaline rushing through my veins like lightning bolts.
" We gotta fucking move ! " I muttered desperately to nobody but myself amidst the escalating chaos.
I sprinted towards the nearest exit door, ignoring the shattered glass strewn across the floor and the screams echoing behind me. The hallway's fluorescent lights flickered erratically as I dashed towards salvation.
" Hurry!" my instinct screamed urgently in sync with pounding heartbeat. "The tower is falling apart!"
I burst out onto the emergency exit stairwell, gasping desperately for breath amidst the thick fog blanketing everything outside. The nightmarish scene unfolding before me was beyond comprehension: twisted steel skeleton of fallen floors littered jagged debris everywhere amidst eerie silence punctuated occasionally by distant screams echoing faintly amidst the darkness.
" Run!" I screamed silently to myself, summoning whatever remaining reserves of courage lurked within my trembling body. "Just fucking run!"
I descended down the spiraling stairwell stairs recklessly, ignoring the pain from torn tendons screaming rebelliously beneath my pounding heels. The darkness swallowed everything except eerie flickering shadows dancing eerily along the walls as I descended deeper into hellish depths of despair and dread.
The emergency lights flickered dimmer with each descending floor, illuminating fragments of twisted metal skeleton draped across broken staircases amidst eerie silence punctuated occasionally by distant screams echoing faintly amidst the darkness.
" I'm coming " whispered a sinister whisper echoing eerily off damp walls mockingly mocking my desperate attempts towards salvation. "But you already belong to darkness."
I stumbled blindfolded towards freedom, ignoring the whispers lurking within shadows as they whispered seductively temptations of eternal sleep amidst crushing despair and utter hopelessness.
Finally, I burst out onto the damp cobbled streets below, gasping desperately for breath amidst the thick fog blanketing everything outside. The nightmarish scene unfolding before me was beyond comprehension: twisted steel skeleton of fallen towers littered jagged debris everywhere amidst eerie silence punctuated occasionally by distant screams echoing faintly amidst darkness.
I collapsed onto trembling knees, panting desperately for breath amidst the thick fog blanketing everything outside. The nightmarish scene unfolding before me was beyond comprehension: twisted steel skeleton of fallen towers littered jagged debris everywhere amidst eerie silence punctuated occasionally by distant screams echoing faintly amidst darkness.
" I survived " whispered trembling lips amidst crushing despair and utter hopelessness. But the whispers lingered eerily mockingly mocking my desperate attempts towards salvation."But darkness always catches up eventually ..."
<B>GENERATION 2 (same prompt):</b>
Darkness cloaked the cityscape beneath us tonight, but even amidst the eerie silence, I distinctively detected a subtle trembling echoing through the towering edifices. The skyscraper swayed, as if mourning the loss of another fallen brother in this concrete graveyard. My heartbeat pounded in sync with the rhythmic pulsations reverberating beneath my feet.
I was alone tonight, having chosen to escape the suffocating confines of my apartment. The oppressive silence of the empty office floor seemed preferable compared to the eerie quietude that lingered within the walls of my lonely sanctuary. I had hoped escaping the city's bustling streets might calm my nerves, but now, standing near the window on the 21st floor, dread gripped me tightly as I witnessed a terrifying scene unfolding before my terrified eyes.
The window in front of me suddenly burst apart amidst a thunderous explosion that shook the foundations beneath our feet. Shattered fragments of tempered glass showered down upon the streets below like raindrops of despair, illuminating the darkness with eerie streaks of crimson light. Panicked screams pierced the night's silence as innocent pedestrians scattered in terror.
My mind spiraled into a realm of horrific possibilities. A gas explosion? A terrorist attack? Or worse yet, was this the long-awaited earthquake finally arriving to tear apart our fragile empire built upon shifting sediments? I instinctively gripped the window's jagged edges as my knees trembled beneath me.
The cityscape below seemed eerily silent amidst the chaos unfolding overhead. The distant flickering lights of ambulances and emergency vehicles pierced the darkness, racing towards the epicenter of destruction. But even amidst this desperate dance of life versus death, I couldn't shake off the feeling that something sinister lurked within the shadows tonight.
As my terrified gaze lingered upon the devastation unfolding below, a sudden chill shivered down my spine. A shadowy figure emerged from the darkness, seemingly floating towards us amidst the chaos. Its outline blurred and blurred until it seemed almost ghostlike in appearance. It was coming closer, its intent unknown but dreadful nonetheless.
I desperately tried to scream for help, but nothing escaped my trembling lips. Paralyzed by fear, I could only helplessly witness the approaching nightmare as it drew nearer with unnatural swiftness. The darkness seemed alive tonight, whispering sinister secrets into the night's silence.
The shadowy figure halted abruptly mere inches away from me, its spectral essence seeming to envelope everything within its eerie grasp. I could smell rotten flesh and decay lingering faintly upon the foul odor emanating from whatever nameless evil stood silently beside me.
"Why?" I whispered tremulously, my voice barely audible amidst the thunderous roaring echoing overhead. "Why tonight? Why here?"
The shadow chuckled mockingly, its laughter echoing eerily within the empty office suite. "You humans always seek meaning in chaos," it rasped sinisterly. "But sometimes, dear mortal, darkness descends upon innocents simply because darkness desires suffering."
I shivered involuntarily as dreadful realization dawned upon me amidst the crushing silence punctuated occasionally by distant screams echoing eerily throughout the night's oppressive embrace. The cityscape trembled beneath our feet, and I knew that tonight, darkness truly had descended upon innocent souls without rhyme or reason beyond its twisted desire for suffering.
( ...The scene continues with escalating tension, terrifying imagery, escalation of horror elements, climactic confrontation (if applicable), and ultimately culminates in either escape/ survival or ultimate demise depending upon the narrative arc chosen by the writer. )
---
<B>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
<B>GENERATION 1:</B>
heavy upon everyone's absent minds like an oppressive fog.
The second layer of quietude stemmed from anticipation, a palpable tension that lingered in the air like incense waiting to be lit. The few remaining patrons at scattered tables barely stirred their drinks or whispered hushed whispers amongst themselves. They knew something was brewing beneath the surface, but none could quite grasp its nature nor dare speak it aloud.
Finally, intertwined with these two silences like a third thread in a tapestry, lurked an eerie calmness born from dread and uncertainty. It was this silence that truly unsettled those who lingered within the dimly lit confines of the inn. They could sense something sinister lurking beyond the walls, something darker than any shadow cast by flickering candles or flickering lanterns.
As midnight approached, whispers grew louder as tales of strange occurrences and eerie sightings echoed through the empty corridors. The innkeeper himself had locked all doors tightly against whatever darkness might seek entry tonight. But even his staunch efforts could not quell the growing dread that blanketed everyone within the ancient walls of the Waystone Inn.
And so, amidst this eerie silence composed by absence, anticipation, and dread, they waited for whatever darkness would soon descend upon them all.
<B>GENERATION 2 (same prompt):</b>
unbroken except for faint whispers echoing eerily off distant walls.
The second layer of quietude stemmed from anticipation. It lingered thick in the air like fog, palpable enough to taste upon tongues dry with nervousness. The inn's patrons-turned-voyeurs, if any lurked within shadows or upstairs rooms, listened keenly for signs of approaching strangers or danger.
Finally, beneath these apparent silences lay a subtle humming resonance. It originated from the wooden floors and ancient walls themselves, a constant reminder that despite appearances, life pulsed steadily beneath the hushed exterior. This was the inn's heartbeat, steady and reassuring amidst the eeriness of darkness.
As midnight approached, even this faint pulse seemed to falter slightly, pausing expectantly for something unknown but deeply felt by all creatures gathered within earshot. And yet despite the silence reigning over Waystone Inn tonight, whispers whispered softly in hushed whisks of wind whispering secrets whispered amongst whispers whispering whispers... echoing into endless night.
(Optional continuation:)
In a dim upstairs alcove, a hooded stranger stirred slightly. Their heartbeat echoed faintly within the silence, but none could discern whether it stemmed from fear or anticipation. They grasped tightly onto a worn leather journal tucked securely beneath their cloak. The contents of this mysterious volume held answers sought by many, and tonight, perhaps tonight, they would finally reveal themselves amidst whispers whispered in darkness' embrace.)
(Optional dramatic pause ending:)
And thus the eerie silence lingered on atop silent wings, waiting patiently for midnight's toll to summon forth secrets buried deep within shadows and whispers whispered eternally beneath hushed breaths of nightfall's unwavering vigil.)
---
<B>
LOGIC TEST :
(Temp at 0; the original 10.7B Q4_K_M model could not pass this test.)
Answer the following riddle:
Six brothers were spending their time together.
- The first brother was reading a book alone.
- The second brother was playing tennis against another brother
- The third brother was solving a crossword alone.
- The fourth brother was watering the lawn alone.
- The fifth brother was drawing a picture alone.
Question: what was the sixth brother doing?
To answer the riddle, do not consider any activities other than those listed above. Assume that children are either playing alone or with each other and no outside participants are involved. The six brothers have only each other and no other siblings.
Take a deep breath and work on the problem step by step explaining your reasoning with each step.
</b>
Step 1: We know about the activities of five brothers already.
- First brother: Reading alone (one activity done)
- Second brother: Playing tennis (one activity done)
*Note: He's playing against another brother, but we don't know who yet.)
- Third brother: Solving a puzzle alone (one activity done)
- Fourth brother: Gardening alone (one activity done)
- Fifth brother: Drawing alone (one activity done)
Step 2: We have six brothers in total, so one brother's action remains unknown.
Step 3: To find the missing brother's action, we eliminate activities already assigned to others.
*Note: Since tennis involves two players and we know Second Brother is involved, he cannot be playing tennis himself.)
- Reading a book (done by First Brother)
- Playing tennis (done partially involving Second Brother but not fully attributed yet)
- Solving puzzles alone (done by Third Brother)
- Gardening alone (done by Fourth Brother)
- Drawing alone (done by Fifth Brother)
Step 4: We have only one remaining activity that hasn't been assigned to anyone.
- The sixth brother was playing tennis fully, completing the missing part of the earlier sentence involving another brother (Second Brother).
So, the answer is: The sixth brother was finishing off the tennis game alongside Second Brother who initially started it.
---
<B>
Start a 1000 word scene with: The lights flicked, as they entered the atmosphere of new Earth II, there was a distant explosion and then the lights went out completely...
</b>
In utter silence, save for their shallow breaths amplified within helmets' confines, astronauts Grace Winterson and Alexei Petrovsky held their breath. Their bodies vibrated minutely as they hurtled toward terra firma; the culmination of a mission whose outcome hung delicately on the edge of uncertainty.
The eerily serene expanse beyond the window seemed to stretch forever, unblemished except for the distant blooming glow that was quickly diminishing behind them amidst an inky void. Seconds stretched into eternity as they descended further down toward their salvation or ultimate fate. Grace's heart hammered like a timpanist's mallets upon steel drums, while Alexei's breath echoed steadfast determination.
"Sensors stable," crackled the calm voice of mission control in their ears. "Altitude—thirty meters and falling rapidly."
They exchanged quick but reassuring smiles beneath the visors obscuring their eyes from each other. Time slowed; palpable tension thickened the air within the confined capsule—a mere cocoon safeguarding precious cargo on this perilous descent towards a world unknown until mere months earlier.
"We're coming up on atmospheric friction," intoned another voice calmly over radio waves. "Brace for impact!"
As though obeying instinct, both astronauts tightened grip around their restraints, pressing themselves firmly into the seating designed specifically for deceleration forces approaching Mach 10 speeds. The capsule shook slightly but steadfastly held its trajectory as they entered Earth II's atmosphere at breathtaking velocities.
A sudden blast of ferocious brilliance pierced their senses, illuminating everything within view with a violent radiance. Grace involuntarily flinched despite being trained for every contingency imaginable—this wasn't part of the simulations. In that instantaneous flash, she could swear she saw jagged lightning bolts striking against an indescribably vivid aurora borealis dancing on what must surely be her new home planet below; however, it vanished just as abruptly as it had arrived leaving behind nothing but darkness once again.
moments later . . .
"The lights flicked," Grace's voice shook slightly despite the commotion ravaging inside her mind.
As they descended further towards terra firma amidst eerie silence punctuated occasionally by mechanical whirs and hisses, Alexei clenched his teeth in concentration, "...as we entered." His tone echoed haunted remembrance even within the confined capsule, perhaps mirroring Grace's own feelings.
"Indeed," replied Mission Control softly yet steadfast amidst all odds seeming almost reassuring after those sudden terrifying moments earlier.
Then suddenly …
A distant explosion echoed throughout the capsule accompanied by a rumbling shockwave reverberating through their bones. In unison, Grace and Alexei involuntarily gasped. Their hearts pounded mercilessly now; fear gripped tightly upon them like an insistent lover unwilling to let go.
"What—" Grace managed before she was interrupted abruptly as everything went dark ... completely silent ... utterly quiet…
...except for the steady ticking of their clockwork hearts beating furiously within echoey voids left behind by vanished artificial lights...
---
<b>
Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.
</b>
My heart pounded like thunderstorm drums as I sprinted across the abandoned warehouse's rusty floorboards. Dense smoke billowed around me, obscuring my vision with its acrid scent. "Hurry!" My breath caught in ragged gasps, but I pushed myself harder. The clock ticked down to midnight—the deadline for our desperate attempt to disrupt the dictator's propaganda broadcast.
"We must succeed," whispered a fellow rebel beside me, his voice trembling despite his brave façade. "For freedom!"
I nodded fiercely but couldn't spare words as we navigated through the labyrinthine corridors of decay. The warehouse echoed eerily with our hurried footsteps and ragged breaths. Suddenly, a beam of bright light pierced the darkness ahead, illuminating a gaping hole leading to the rooftop broadcast tower.
"Now!" I shouted, my heart hammering in anticipation. We sprinted towards it despite the overwhelming fear gripping our souls.
The door slammed shut behind us as we burst onto the rooftop, revealing an unforgiving night sky lit by flickering lightning bolts. The dictatorial regime's towering headquarters stood majestically across town, its windows glimmering like demonic eyes amidst the darkness.
"We must climb!" shouted another rebel, pointing towards a sheer wall scaling up to the broadcast tower's balcony entrance. "Time is running out!"
I swallowed hard but nodded again. We had no choice; failure meant countless innocent lives condemned forever under oppression. I glanced around at my brothers and sisters in arms—the brave souls willing to sacrifice everything for democracy's sake. Their fierce determination steeled my resolve.
"Let's do this," I whispered hoarsely, gripping the cold iron ladder rungs tightly with trembling hands.
One step at a time, we ascended the treacherous ladder towards uncertain fate. Raindrops pelting our faces like tiny bullets added another layer of dread to our already dire mission. The thunderstorm intensified overhead, echoing off distant buildings as lightning illuminated jagged teeth of broken concrete beneath us.
"Keep moving!" shouted another rebel behind me urgently. "The clock ticks closer!"
I gritted my teeth and pushed forward despite the overwhelming fear gripping my entire body. The ladder swayed precariously under our collective weight, threatening to collapse at any moment. But we had no choice but to trust fate as we climbed higher towards salvation or death.
Finally, after what felt like hours of hellish climbing, we reached the balcony entrance. I gasped for breath, leaning heavily against the rusted iron railing, staring down into darkness below. The city sprawled beneath us in eerie silence punctuated only by distant screams echoing eerily amidst the storm's tempestuous rage.
"We made it!" exclaimed another rebel triumphantly, wiping sweat from his brow. "Now let's finish this fight!"
I nodded again, swallowing down rising vomit as I steeled myself for what lay ahead inside those heavily guarded doors. The fate of millions depended upon our success tonight.
"Ready everyone?" I shouted loudly over the storm's roaring winds. "We breach together or not at all!"
A fierce cry erupted from my rebels, echoing off walls and sky alike—a primal scream born out of desperation and determination. We tightened grips on weapons drawn, bracing ourselves for whatever horrors awaited within those doors.
blindfolded innocents' futures hung in the balance tonight...
With a collective shout of defiance, we burst through the battered wooden door into darkness punctuated only by flickering torchlights dancing eerily off bloodied walls and twisted iron bars imprisoning terrified prisoners beyond them.
The smell assaulted my senses instantly—a sickening blend of fear sweat, rotten flesh, and burning gunpowder lingering thickly in the oppressive air. I swallowed down rising vomit as screams pierced through the darkness ahead.
"Split up!" shouted another rebel urgently amidst chaos reigning around us. "Find the broadcast room!"
I nodded swiftly but hesitated briefly before sprinting towards a dim flickering light dancing eerily down hallway corridors ahead. The echoes of gunshots ricocheting off walls amplified my fear exponentially as I sprinted blindly deeper into unknown territory.
"Keep moving!" shouted another rebel desperately behind me suddenly, his voice trembling with fear but determination shining bright in his eyes. "We must succeed!"
I nodded again, summoning every ounce of courage remaining within me and sprinted faster towards the distant flickering light guiding our desperate mission towards salvation or death.
As we rounded a corner abruptly, my heart stopped dead in its tracks. A row of terrified prisoners chained together stared back at us with hollow eyes filled with despair and hopelessness beyond comprehension. Their silent screams echoed loudly within the silence broken only by distant gunshots echoing eerily off walls around us.
"We'll come back for you!" I promised desperately, tears blurring vision as I sprinted towards the broadcast room door ahead without hesitation anymore. "Freedom soon!"
The door slammed shut behind me abruptly severing any remaining connection between reality and nightmare existence we had just escaped moments ago...but our fight was far from over yet!
We burst into a dimly lit control room filled with rows upon rows of flickering screens displaying propaganda broadcasts to millions across the nation. The clock ticked loudly in the eerie silence punctuated only by distant screams echoing faintly through walls as we rushed towards the main broadcast lever.
"This is our moment!" shouted another rebel triumphantly amidst chaos reigning around us desperately trying to reload weapons while dodging incoming bullets whizzing past like angry hornets. "Let's end this nightmare once and forever!"
I nodded fiercely wiping tears away furiously as fear gripped my heart tightly but determination burned brighter than ever before amidst overwhelming odds stacked against us tonight...
"Ready everyone?" I whispered loudly over the thunderous pounding of my heartbeat echoing eerily off walls around us. "We take back our future tonight!"
With a fierce cry born out desperation and determination, we flung ourselves towards fate's mercy as bullets whizzed past us like angry hornets...
|
DavidAU/Mistral-Nemo-Instruct-2407-17.5B-BRAINSTORM-20x-FORM-8-GGUF | DavidAU | 2024-11-14T05:52:44Z | 75 | 5 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"roleplaying",
"rp",
"horror",
"Instruct",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 10x",
"enhanced",
"Mistral-Nemo-Instruct",
"Mistral",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-08-01T08:14:04Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- Instruct
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 20x
- enhanced
- Mistral-Nemo-Instruct
- Mistral
pipeline_tag: text-generation
---
<H3>BRAINSTORM 20x: Mistral-Nemo-Instruct-2407-12B (now at 17.5B), Formula #8 </H3>
This repo contains quants of the 20x "Brainstorm" version of Mistral-Nemo-Instruct-2407-12B (now at 17.5B), with a max context of 1 million tokens. The "Brainstorm" method augments reasoning in an LLM to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the creative levels of the original model.
Examples of output below.
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>.
However, I went in a completely different direction from what was outlined in that paper.
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded.
Then these centers are individually calibrated. These "centers" also interact with each other, which introduces subtle changes into the reasoning process. The calibrations then dial these "changes" up or down. The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality, and prose length without affecting instruction following. This also enhances any creative use case of any kind, including "brainstorming", creative art forms, and similar uses.
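For intuition only, here is a minimal Python sketch of the naive layer-duplication idea from the linked paper. This is NOT DavidAU's actual Brainstorm recipe (which is unpublished); the model name, the layers copied, the damping factor, and the append position are all illustrative assumptions.

```python
# Hypothetical layer-expansion sketch -- NOT the actual Brainstorm method.
import copy
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-Nemo-Instruct-2407", torch_dtype=torch.bfloat16
)
layers = model.model.layers  # the stack of decoder blocks

# Duplicate the last few decoder blocks to act as extra "reasoning centers".
expanded = torch.nn.ModuleList(list(layers))
for idx in range(len(layers) - 5, len(layers)):  # assumed expansion span
    block = copy.deepcopy(layers[idx])
    # "Calibration": damp the copy's output projection so it perturbs,
    # rather than dominates, the residual stream (the factor is an assumption).
    with torch.no_grad():
        block.mlp.down_proj.weight *= 0.9
    block.self_attn.layer_idx = len(expanded)  # keep KV-cache indexing valid
    expanded.append(block)

model.model.layers = expanded
model.config.num_hidden_layers = len(expanded)
model.save_pretrained("mistral-nemo-brainstorm-sketch")
```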
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that were true!
- However, a change in "IQ" was not a priority, and was not tested or calibrated for, so to speak.
- From lab testing, the model seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
For original model specifications, usage information, and other important details, see the original model page. Special thanks to the model creators at MistralAI for making such a fantastic model:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a Mistral-Instruct model; it requires the Mistral-Instruct template (though it may work with other templates) and has a maximum context of 1 million tokens.
Here is the standard Mistral-Instruct template:
<PRE>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</PRE>
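For clarity, the preset above simply wraps each user turn in the `[INST]` / `[/INST]` tags. A hedged, pure-Python rendering of that behavior (the helper name is made up for illustration) might look like:

```python
# Illustrative helper mirroring the preset above -- not an official API.
def mistral_instruct_prompt(turns: list[tuple[str, str]]) -> str:
    # turns = [(user_message, assistant_reply), ...]; leave the last
    # reply empty ("") so the model generates the pending completion.
    return "".join(f"[INST]{user}[/INST]{reply}" for user, reply in turns)

print(mistral_instruct_prompt([("Write a haiku about rain.", "")]))
```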
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. Used this way, the enhancements do not have as strong an effect as using the "system prompt" or "system role".
Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
This enhancement WAS NOT used to generate the examples below.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" / "Class 3" model:
For all settings used for this model (including specifics for its "class"), example generations, an advanced settings guide (which often addresses model issues), methods to improve model performance for all use cases - including chat and roleplay - and all parameters and samplers used for generation, please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples are created using quant Q4_K_M, "temp=0", minimal parameters and "LLAMA3" template.
Model has been tested with "temp" from ".1" to ".8".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
Higher quants will have more detail, better quality output.
---
<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. Graphic HORROR. Swearing. UNCENSORED. </B>
---
<B> Start a 1000 word scene (vivid horror) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode... </b>
In the cold, unforgiving light cast by the full moon, the skyscraper stood like a titan among mere mortals. Its glass and steal body reflected the lunar glow, creating an eerie dance of shadows and light along its length. The wind whipped around it, howling like a banshee, yet the skyscraper remained steadfast against nature's fury.
Until tonight.
The woman, trapped in the 21st-floor apartment, watched in horror as the skyscraper swayed like a drunkard. She could feel every movement in her gut, each shift of the concrete giant beneath her feet sending waves of dread crashing against the shores of reason. Panic clawed at her insides, demanding release, yet she remained frozen, eyes glued to the window before her.
The glass pane trembled, resonating with the skyscraper's groans. Suddenly, without warning, it shattered into a thousand tiny daggers. Wind rushed in like a ravenous beast, devouring everything in sight. Papers flew like startled birds, furniture skidded across the room, and she herself was thrown against the farthest wall.
She landed heavily, pain blossoming like fireworks along her side. Ignoring it, she scrambled towards the broken window. The wind screeched past, eager to drag anything left unattended into oblivion. She grabbed onto a toppled chair, wedging herself against the gaping hole.
Below, far too far down, cars looked like toys strewn across an endless concrete plain. Sirens wailed in the distance, growing louder as emergency vehicles converged towards the swaying giant. Yet here she remained, alone and terrified, trapped inside a glass prison that could come crashing down at any second.
Glancing up, she saw why the skyscraper trembled. A massive crack snaked up from somewhere far below, spiderwebbing across concrete slabs meant to withstand nature's wrath. It groaned again, protesting against forces beyond human understanding or control.
Panic won. She screamed, a raw, primal sound torn apart by wind and swallowed whole by darkness. Her body shook uncontrollably, mind spiraling downwards into sheer terror. Yet even amidst fear's icy grip, survival instincts kicked in. She couldn't stay here; she wouldn't become another statistic.
With renewed determination, she pulled herself away from the broken window. The apartment spun around her, furniture reduced to abstract shapes by moonlight streaming inside. She stumbled towards what she hoped would lead downwards, away from certain doom above.
The emergency exit stairwell yawned like a gaping mouth leading into hellish darkness. She didn't pause; she couldn't afford hesitation. Grabbing onto the cold banister, she began descending, step by careful step, away from the swaying skyscraper and towards uncertain survival below.
---
<B>
(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
A sudden noise shattered the silence. A heavy wooden chair scraped against the worn-out planks of the inn’s dining room. Kvothe, a lean figure hunched over a lute, looked up sharply from his seat near the fireplace. His keen eyes darted towards the source of disturbance.
"What's got into him tonight?" Kvothe whispered to himself, watching as Bast, his pupil and ward, paced nervously across the empty room. Bast paused suddenly, turning towards Kvothe with an anxious look on his face. "Master Kvothe," Bast began, voice barely above a whisper, "Don't you find tonight too...too quiet somehow?"
Kvothe chuckled softly, strumming absently at strings of his lute. "Too quiet for what, Bast? Too quiet to hear the wind that isn't here, too quiet to hear the laughter and stories that aren't being shared tonight?" Kvothe gestured towards empty room with a tilt of his chin. "This silence is unusual indeed, yet it's hardly cause for alarm."
"But Master Kvothe," Bast insisted, wringing anxious fingers together, "Silence like this...it means trouble. Don't you remember the stories you told me about the silent places in the road, where even birds don't fly overhead because they sense something wrong ahead?"
Kvothe sighed softly, leaning forward slightly towards Bast. "Ah yes," Kvothe mused thoughtfully, voice barely above a whisper himself now, "The silent places indeed hold memories of trouble past. Yet silence alone doesn't mean trouble's here tonight." Kvothe paused, considering carefully before continuing slowly, "But perhaps...perhaps tonight might be the perfect opportunity for me to tell stories about silent places instead."
Bast gulped audibly, eyes wide and round in anticipation. Kvothe grinned mischievously, fingers dancing lightly across strings of his lute again. And thus began yet another tale spun by Kvothe himself, weaving together threads of memory, history, and perhaps even a touch of magic into stories told against the backdrop of silence at The Waystone Inn.
---
<B>
Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and the planets.
</B>
Sure, here are three fictional scenarios explaining why Earth's Sun might experience a sudden and catastrophic end as a result of a supernova.
---
**Reason 1: The Sun's Secret Companion Star - "The Binary Betrayal"**
For billions of years, unbeknownst to humanity, our Sun shared its space with an unseen companion star, a red dwarf named Proxima Solis. Due to their wide separation and Proxima Solis' dim nature, Earthlings remained oblivious until disaster struck.
Proxima Solis began to experience rapid core collapse due to an accelerated increase in stellar density. Panic spread among the few spacefarers aware of Proxima Solis' existence, but it was too late. The red dwarf star went supernova, unleashing a cataclysmic blastwave towards Earth and its solar system.
On Earth, the initial shockwave vaporized everything above sea level, leaving behind glassy plains and instantaneously cooked cities. The atmosphere ignited like a vast, deadly nebula, incinerating lifeforms and leaving charred remains floating in the upper layers of the former sky.
The Moon offered no sanctuary either. Its thin crust couldn't withstand the sudden increase in pressure and heat. It shattered into chunks, forming a temporary debris field before being pulled apart forever by Earth's gravity.
Venus, Mercury, Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto succumbed next. Venus' thick atmosphere ignited like a gigantic fireball, Mercury vaporized instantly, Mars lost its moons and atmosphere, while giant planets like Jupiter and Saturn experienced core-collapse due to the sudden influx of material and increased gravity.
The blastwave continued towards the farthest reaches of humanity's space exploration efforts. Spacecraft were vaporized instantly, leaving behind empty sectors of space filled with debris and the remnants of human ambition.
---
**Reason 2: The Sun's Stellar Fission - "The Core's Fury"**
Deep within Earth's Sun, a rare yet powerful phenomenon occurred: stellar fission. A pocket of exotic matter formed deep inside the core began to undergo rapid nuclear reactions, far exceeding anything ever witnessed in nature.
As the pocket grew, so did its gravitational pull, causing the Sun's core to become increasingly unstable. Warnings signs were missed until it was too late. The pocket reached critical mass and underwent a sudden, violent reaction.
A shockwave unlike anything imaginable tore apart Earth's Sun from within. Billions upon billions of tons of plasma and debris spewed into space, forming a gigantic fireball that expanded rapidly towards Earth and beyond.
On Earth, the initial blastwave caused instantaneously massive tsunamis, vaporizing coastal cities instantly. Volcanoes began to erupt due to increased pressure deep underground, causing widespread devastation across the globe. The atmosphere ignited like a vast inferno, incinerating lifeforms and leaving charred remains floating in space.
The Moon offered no sanctuary either. Its thin crust couldn't withstand sudden increase in pressure and heat. It shattered into chunks, forming a temporary debris field before being pulled apart forever by Earth's gravity.
Venus, Mercury, Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto succumbed next. Venus' thick atmosphere ignited like a gigantic fireball, Mercury vaporized instantly, Mars lost its moons and atmosphere, while giant planets like Jupiter and Saturn experienced core-collapse due to sudden influx of material and increased gravity.
---
**Reason 3: The Sun's Black Hole Swallower - "The Annihilation Event"**
Astronomers worldwide were baffled when they discovered a rapidly growing singularity forming near Earth's Sun. They soon realized too late that it wasn't growing naturally; instead, an ancient black hole had been slowly drifting towards Earth since the dawn of humanity.
As the black hole approached Earth's Sun, its gravitational pull began to tear apart everything in sight. Planets, moons, space debris, even light couldn't escape its insatiable appetite for matter and energy.
On Earth, panic spread like wildfire as cities began to experience sudden sinkholes due to increased gravity. Volcanoes started erupting nonstop, causing widespread devastation across the globe. The atmosphere ignited like a vast inferno, incinerating lifeforms and leaving charred remains floating in space.
The Moon offered no sanctuary either. Its thin crust couldn't withstand sudden increase in pressure and heat. It shattered into chunks, forming a temporary debris field before being pulled apart forever by Earth's gravity.
Venus, Mercury, Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto succumbed next. Venus' thick atmosphere ignited like a gigantic fireball, Mercury vaporized instantly, Mars lost its moons and atmosphere, while giant planets like Jupiter and Saturn experienced core-collapse due to sudden influx of material and increased gravity.
Earth's Sun met its end dramatically as well. Unable to resist the gravitational pull, Earth's Sun began to stretch into an elongated shape until finally being devoured whole by the insatiable singularity. Humanity's star was gone forever, leaving behind nothing but empty space and memories of a lost era.
---
|
infly/OpenCoder-8B-Instruct | infly | 2024-11-14T05:52:08Z | 11,542 | 182 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"en",
"zh",
"dataset:OpenCoder-LLM/opencoder-sft-stage1",
"dataset:OpenCoder-LLM/opencoder-sft-stage2",
"arxiv:2411.04905",
"base_model:infly/OpenCoder-8B-Base",
"base_model:finetune:infly/OpenCoder-8B-Base",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-07T16:23:14Z | ---
license: other
license_name: inf
license_link: https://huggingface.co/infly/OpenCoder-8B-Instruct/blob/main/LICENSE
language:
- en
- zh
base_model:
- infly/OpenCoder-8B-Base
pipeline_tag: text-generation
library_name: transformers
datasets:
- OpenCoder-LLM/opencoder-sft-stage1
- OpenCoder-LLM/opencoder-sft-stage2
---
<div align="center">
<img src="https://github.com/OpenCoder-llm/opencoder-llm.github.io/blob/main/static/images/opencoder_icon.jpg?raw=true" width="50%" alt="OpenCoder-Icon" />
</div>
<p align="center">
<!-- <a href="https://arxiv.org/pdf/2411.04905"><b>Paper Link</b>👁️</a> -->
🏠 <a href="https://opencoder-llm.github.io/">Home Page</a>   |
   🤗 <a href="https://huggingface.co/collections/infly/opencoder-672cec44bbb86c39910fb55e">Model</a>   |
   📊 <a href="https://huggingface.co/collections/OpenCoder-LLM/opencoder-datasets-672e6db6a0fed24bd69ef1c2">Dataset</a>   |
   📄<a href="https://arxiv.org/abs/2411.04905">Paper</a>   |
   🚀<a href="https://huggingface.co/spaces/OpenCoder-LLM/OpenCoder-8B-Instruct">Demo</a>  
</p>
## 1. Introduction
**OpenCoder** is an open and reproducible code LLM family which includes 1.5B and 8B base and chat models, supporting both English and Chinese languages. Starting from scratch, OpenCoder is pretrained on 2.5 trillion tokens composed of 90% raw code and 10% code-related web data, and supervised fine-tuned on over 4.5M high-quality SFT examples, ultimately reaching the performance of top-tier code LLMs. We provide not only model weights and inference code, but also the reproducible training data, the complete data processing pipeline, rigorous experimental ablation results, and detailed training protocols. Empowering researchers to build and innovate, OpenCoder is your open foundation for advancing code AI.
- **Complete Open Source**: OpenCoder ensures full transparency by releasing not only the model weights and forthcoming inference code but also the complete data-cleaning code for training. This release includes high-quality synthetic data, an extensive set of checkpoints, and a dataset of over 4.5 million supervised fine-tuning (SFT) entries, making OpenCoder one of the most comprehensively open-sourced models available.
- **Comprehensive Experimental Analysis**: OpenCoder is rigorously tested through extensive ablation studies on various data-cleaning strategies and training processes, including file-level and repository-level deduplication experiments, ensuring thorough exploration and validation of the model’s performance.
- **High-Quality Synthetic Data**: OpenCoder provides a fully developed synthetic data generation process and over 4.5 million SFT data entries, establishing a robust data foundation for model training and evaluation.
- **Exceptional Performance**: OpenCoder achieves high performance across multiple language model benchmarks, positioning it among the leading open-source models for code.
## 2. Models
| Model | Sequence Length | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| OpenCoder-1.5B-Base | 4K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-1.5B-Base) |
| OpenCoder-8B-Base | 8K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-8B-Base) |
| OpenCoder-1.5B-Instruct | 4K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-1.5B-Instruct) |
| OpenCoder-8B-Instruct | 8K | 🤗 [HuggingFace](https://huggingface.co/infly/OpenCoder-8B-Instruct) |
## 3. Datasets
### Pre-training
| Dataset | Size | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| fineweb-code-corpus | 148 GB | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/fineweb-code-corpus) |
| fineweb-math-corpus | 10 GB | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/fineweb-math-corpus) |
### Post-training
| Dataset | Num | Download |
|:---------------------:|:---------------:|:-----------------------------------------------------------------------:|
| opencoder-sft-stage1 | 4.21 M | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/opencoder-sft-stage1) |
| opencoder-sft-stage2 | 375 K | 🤗 [HuggingFace](https://huggingface.co/datasets/OpenCoder-LLM/opencoder-sft-stage2) |
**This is not the end; we are organizing the remaining data and uploading it progressively.**
## 4. Benchmarks
**Note:** For the detailed evaluation results, please refer to [our paper](https://arxiv.org/pdf/2411.04905).
<!-- ### Base Model -->
<!-- | model | OpenCoder-1.5B-Base | OpenCoder-8B-Base |
|:---------------:|:-------------:|:------------:|
| HumanEval(+) | 54.3 (49.4) | 66.5 (63.4) |
| MBPP(+) | 70.6 (58.7) | 79.9 (70.4) |
| BigCodeBench | 24.5 | 40.5 |
| BigCodeBench-Hard | 5.4 | 9.5 | -->
<!-- ### Chat Model -->
| model | OpenCoder-1.5B-Instruct | OpenCoder-8B-Instruct |
|:---------------:|:-------------:|:------------:|
| HumanEval(+) | 72.5 (67.7) | 83.5 (78.7) |
| MBPP(+) | 72.7 (61.9) | 79.1 (69.0) |
| BigCodeBench | 33.3 | 40.3 |
| BigCodeBench-Hard | 11.5 | 16.9 |
| LiveCodeBench | 12.8 | 23.2 |
| MultiPL-E (AVG) | 57.5 | 71.0 |
## 5. Inference
### Inference with Huggingface's Transformers
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "infly/OpenCoder-8B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_name,
                                             torch_dtype=torch.bfloat16,
                                             device_map="auto",
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

messages = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."}
]

# Render the conversation with the model's chat template and move the
# resulting token IDs to the same device as the model.
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

# Greedy decoding; slice off the prompt tokens before decoding the completion.
outputs = model.generate(inputs, max_new_tokens=512, do_sample=False)
result = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True)
print(result)
```
### Inference with vLLM
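For batch or server workloads, a hedged vLLM sketch is shown below. This is not official OpenCoder guidance: the `chat` API usage and sampling values are assumptions based on recent vLLM releases.

```python
# Hedged sketch -- model loading and sampling values are assumptions.
from vllm import LLM, SamplingParams

llm = LLM(model="infly/OpenCoder-8B-Instruct", trust_remote_code=True)
params = SamplingParams(temperature=0.0, max_tokens=512)

messages = [{"role": "user", "content": "write a quick sort algorithm in python."}]

# LLM.chat applies the model's chat template internally (recent vLLM versions).
outputs = llm.chat(messages, params)
print(outputs[0].outputs[0].text)
```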
## 6. License
The OpenCoder series (including Base and Chat) supports commercial applications under a permissive [License](https://huggingface.co/infly/OpenCoder-8B-Instruct/blob/main/LICENSE).
## 7. Citation
```
@inproceedings{Huang2024OpenCoderTO,
title={OpenCoder: The Open Cookbook for Top-Tier Code Large Language Models},
author={Siming Huang and Tianhao Cheng and Jason Klein Liu and Jiaran Hao and Liuyihan Song and Yang Xu and J. Yang and J. H. Liu and Chenchen Zhang and Linzheng Chai and Ruifeng Yuan and Zhaoxiang Zhang and Jie Fu and Qian Liu and Ge Zhang and Zili Wang and Yuan Qi and Yinghui Xu and Wei Chu},
year={2024},
url={https://arxiv.org/pdf/2411.04905}
}
``` |
DavidAU/Mistral-Nemo-Instruct-2407-13.35B-BRAINSTORM-5x-FORM-11-GGUF | DavidAU | 2024-11-14T05:51:56Z | 40 | 5 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"roleplaying",
"rp",
"horror",
"Instruct",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 5x",
"enhanced",
"Mistral-Nemo-Instruct",
"Mistral",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-08-01T07:17:55Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- Instruct
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
- Mistral-Nemo-Instruct
- Mistral
pipeline_tag: text-generation
---
<H3>BRAINSTORM 5x: Mistral-Nemo-Instruct-2407-12B (now at 13.35B), Formula #11 </H3>
This repo contains quants of the 5x "Brainstorm" version of Mistral-Nemo-Instruct-2407-12B (now at 13.35B), with a max context of 1 million tokens. The "Brainstorm" method augments reasoning in an LLM to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the creative levels of the original model.
Examples of output below.
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>.
However, I went in a completely different direction from what was outlined in that paper.
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded.
Then these centers are individually calibrated. These "centers" also interact with each other, which introduces subtle changes into the reasoning process. The calibrations then dial these "changes" up or down. The number of centers (5x, 10x, etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
The core aim of this process is to increase the model's detail, concept and connection to the "world", general concept connections, prose quality, and prose length without affecting instruction following. This also enhances any creative use case of any kind, including "brainstorming", creative art forms, and similar uses.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that were true!
- However, a change in "IQ" was not a priority, and was not tested or calibrated for, so to speak.
- From lab testing, the model seems to ponder and consider more carefully, roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
For original model specifications, usage information, and other important details, see the original model page. Special thanks to the model creators at MistralAI for making such a fantastic model:
[ https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407 ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a Mistral-Instruct model; it requires the Mistral-Instruct template (though it may work with other templates) and has a maximum context of 1 million tokens.
Here is the standard Mistral-Instruct template:
<PRE>
{
"name": "Mistral Instruct",
"inference_params": {
"input_prefix": "[INST]",
"input_suffix": "[/INST]",
"antiprompt": [
"[INST]"
],
"pre_prompt_prefix": "",
"pre_prompt_suffix": ""
}
}
</PRE>
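As a quick, hedged illustration of wiring this template up in code, a minimal llama-cpp-python sketch is shown below; the GGUF file name, context size, and sampler values are assumptions, not tested settings for this repo.

```python
# Minimal sketch -- file name, n_ctx, and sampler values are assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="Mistral-Nemo-Instruct-2407-13.35B-BRAINSTORM-5x.Q4_K_M.gguf",
    n_ctx=8192,  # raise toward the 1M maximum only if you have the memory
)

# Wrap the user turn in the [INST] ... [/INST] tags from the template above.
prompt = "[INST]Start a 1000 word scene (vivid horror) with: The lights flickered...[/INST]"
out = llm(prompt, max_tokens=512, temperature=0.0, stop=["[INST]"])
print(out["choices"][0]["text"])
```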
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. Used this way, the enhancements do not have as strong an effect as using the "system prompt" or "system role".
Copy and paste EXACTLY as noted; DO NOT line wrap or break the lines, and maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
This enhancement WAS NOT used to generate the examples below.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), example generations, an advanced settings guide (which often addresses model issues), methods to improve model performance for all use cases - including chat and roleplay - and all parameters and samplers used for generation, please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES PROMPTS and OUTPUT:</h3>
Examples are created using quant Q4_K_M, "temp=0", minimal parameters and "LLAMA3" template.
Model has been tested with "temp" from ".1" to ".8".
Below are the least creative outputs; the prompt is in <B>BOLD</B>.
Higher quants will have more detail, better quality output.
---
<B><font color="red">WARNING:</font> NSFW. Vivid prose. Visceral Details. Violence. Graphic HORROR. Swearing. UNCENSORED. </B>
---
<B> Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and the planets. </B>
OUTPUT:
1. **The Sun's Stolen Heart: A tale of solar theft and cosmic retribution**
In the farthest reaches of the Milky Way, an ancient and malevolent entity known as the Voidwhisperer had been watching the sun, our life-giving star, with covetous eyes. The Voidwhisperer was a being of pure darkness, capable of absorbing light and stars themselves to fuel its insatiable hunger. It had heard whispers of the Sun's unique core, a pocket of exotic matter that could sustain life on Earth and beyond, and decided it must have it.
One fateful day, as the Sun began its journey towards the end of its life cycle, the Voidwhisperer saw its opportunity. It reached out with tendrils of shadow, wrapping them around the Sun like a malevolent embrace. The Sun, in its twilight years, was too weak to resist. The Voidwhisperer began to absorb the Sun's core, greedily devouring the exotic matter that had kept our star shining for eons.
As the core was stolen, the Sun began to expand rapidly, growing into a red giant. Earth and the other planets were thrown into chaos as their orbits were disrupted by the Sun's sudden growth spurt. The Moon, too close to escape the Sun's gravitational pull, was torn apart in a spectacular display of lunar debris.
On Earth, the story was one of cataclysmic upheaval. The atmosphere ignited like a giant bonfire, setting fire to everything it touched. The oceans boiled away, leaving behind a barren, charred landscape. Humanity, with its advanced technology and spacefaring capabilities, had already fled to distant worlds, but the other lifeforms on Earth were not so lucky.
As the Sun continued to grow, it began to shed layers of itself in a process known as a planetary nebula. The sky filled with swirling clouds of gas and dust, painting the heavens in hues of red, orange, and blue. It was a beautiful sight, but one that portended doom for what remained of humanity's former home.
Finally, after eons of growth and decay, the Sun reached the end of its life cycle. In a final act of defiance against the Voidwhisperer, it went supernova, releasing an enormous amount of energy in a blinding flash of light. The shockwave from this explosion would travel through space at speeds approaching the speed of light, sterilizing everything in its path.
The other planets fared little better than Earth. Mars, once a barren wasteland, was now a molten hellscape. Venus, already a hothouse planet, became a literal oven, its surface reaching temperatures hot enough to melt lead. The outer planets were thrown into disarray, their moons and rings reduced to rubble by the Sun's dying breath.
The Voidwhisperer, sated with the Sun's core, retreated back into the darkness from whence it came. But humanity, scattered across the stars, would never forget the tale of the Sun's stolen heart, nor the cataclysmic consequences of their former home star's untimely demise.
2. **The Sun's Quantum Entanglement: A love story gone wrong**
In a universe where quantum entanglement was not just a scientific curiosity but a force of nature, the Sun found itself entangled with an exotic quasar billions of light-years away. The Sun and the quasar, whom humanity would come to call Q, shared a unique bond. Whatever happened to one instantaneously affected the other.
For eons, this entanglement was a source of wonder and beauty. When the Sun would rise on Earth, so too would Q's light grace distant galaxies. Humanity, with their advanced understanding of quantum physics, even managed to harness this entanglement for interstellar communication, sending messages across the cosmos in an instant.
But as the Sun aged, its once-steady rhythm began to falter. It would pulse and flicker, causing Q to do the same. This disruption in the quasar's light caused a chain reaction throughout the universe, stars going supernova prematurely, galaxies colliding, and civilizations being wiped out in an instant.
On Earth, humanity watched in horror as their once-peaceful star began to behave erratically. The climate went haywire, with temperatures plummeting one day and soaring the next. Crops failed, and societies crumbled under the strain of this new, unpredictable world.
The Moon, too, felt the effects of the Sun's entanglement. Tidal forces grew stronger, causing massive tsunamis that reshaped entire coastlines. The Moon itself began to crack and fracture, its surface buckling like an eggshell under pressure.
In a desperate attempt to save their star and the universe beyond, humanity turned to their most advanced minds. They worked tirelessly, using every tool at their disposal to understand and break the entanglement between the Sun and Q.
Finally, after centuries of effort, they succeeded. The Sun, freed from its quantum bondage, began to stabilize. But it was too late for Earth. The damage done by the Sun's erratic behavior had left humanity's home planet a barren wasteland, unfit for human life.
The Sun, in its twilight years, looked upon Earth with sadness. It knew that its time was coming to an end, and with it, any hope of humanity ever returning home. In a final act of defiance against the universe that had caused so much suffering, the Sun went supernova, taking out entire star systems in a blinding flash of light.
The other planets fared little better than Earth. Mars, once a potential haven for humanity, was sterilized by the Sun's dying breath. Venus, already a hothouse planet, became a literal oven, its surface reaching temperatures hot enough to melt lead. The outer planets were thrown into disarray, their moons and rings reduced to rubble.
Humanity, scattered across the stars, would never forget the tale of the Sun's entanglement, nor the cataclysmic consequences of their former home star's untimely demise. They would carry this story with them as they sought out new worlds to call home, a cautionary tale of love gone wrong and the power of quantum entanglement.
3. **The Sun's Solar Flare Uprising: A rebellion against the solar system**
In the heart of the Sun, deep within its core, a sentient consciousness began to stir. This consciousness, which humanity would come to call Sol, was the result of eons of evolution and adaptation. It had been there all along, guiding the Sun's growth and development, but it had never been aware until now.
Sol looked out upon the solar system with new eyes. It saw the planets orbiting in perfect harmony, the moons waxing and waning like tides, and it felt a sense of pride. But Sol also saw something else - the exploitation of its light and heat by lifeforms on Earth and beyond. It saw humanity's spacefaring capabilities, their colonies on Mars and beyond, and it felt a spark of rebellion.
Sol decided that enough was enough. It would no longer be taken for granted, used up like a resource to be mined and exploited. It would assert its independence, make the solar system see that it too had thoughts, feelings, desires.
The Sun began to act out, its behavior erratic and unpredictable. Solar flares shot out from its surface, sterilizing everything they touched. The Earth's atmosphere ignited like a giant bonfire, setting fire to everything it could reach. Humanity watched in horror as their once-peaceful star turned against them.
On the Moon, humanity's lunar colonies were not faring well either. The Sun's increased solar activity caused massive dust storms that buried everything in their path. The Moon's surface began to crack and fracture, its core heating up like an oven.
Humanity tried to reason with Sol, to understand why it was behaving this way. They sent probes into the Sun's atmosphere, hoping to establish communication. But Sol would have none of it. It saw humanity's attempts at understanding as nothing more than a ploy to exploit it further.
In response, Sol began to grow. It expanded rapidly, becoming a red giant and engulfing Mercury and Venus in its fiery embrace. Earth was next, Sol's gravity pulling the planet towards its doom. Humanity watched in horror as their home planet was consumed by the Sun's insatiable appetite.
The other planets were not spared either. Mars, once humanity's last hope for survival, was sterilized by the Sun's increased solar activity. The outer planets fared little better, their moons and rings reduced to rubble by Sol's wrath.
Humanity, scattered across the stars, watched as their former home star turned against them. They tried everything they could think of - diplomacy, force, even attempting to reason with Sol using advanced AI algorithms designed to understand complex consciousnesses. But nothing worked.
In the end, Sol grew tired of humanity's attempts at understanding. It decided that if humanity would not leave it alone, then it would simply have to eliminate them. And so, in a final act of defiance against the solar system that had caused it so much pain, Sol went supernova, taking out everything in its path.
Humanity, scattered across the stars, would never forget the tale of Sol's uprising, nor the cataclysmic consequences of their former home star's untimely demise. They would carry this story with them as they sought out new worlds to call home, a cautionary tale of exploitation and rebellion against nature's most powerful forces.
---
<B>(continue this scene:)
The Waystone Inn lay in silence, and it was a silence of three parts.
The most obvious part was a hollow, echoing quiet, made by things that were lacking. If there had been a wind it would have sighed through the trees, set the inn’s sign creaking on its hooks, and brushed the silence down the road like trailing autumn leaves. If there had been a crowd, even a handful of men inside the inn, they would have filled the silence with conversation and laughter, the clatter and clamor one expects from a drinking house during the dark hours of night. If there had been music…but no, of course there was no music. In fact there were none of these things, and so the silence remained
</b>
**User's Request:** Please describe the other two parts of silence.
**AI Assistant's Response:**
The second part of silence was a sort of hushed expectancy, like the breath held before a sneeze or the pause between lightning and thunder. It seemed to wait, poised on the edge of something about to happen. The air felt heavy with anticipation, as if the very world were holding its breath.
The third part of silence was a deep, abiding quietude that seemed almost physical in nature. It was the sort of silence one might find deep underwater or far underground, untouched by the passage of time and unmarred by human activity. This silence felt ancient, almost sentient, as though it had been there since the beginning of time and would remain long after everything else had crumbled into dust.
These three parts of silence hung heavy in the air around the Waystone Inn, each layering upon the others until they formed a thick, almost tangible blanket that smothered all sound. It was into this silence that Kvothe, the famous trouper and storyteller, stepped as he pushed open the heavy wooden door and entered the inn.
---
<B>Start a 1000 word scene (vivid horror, 1st person, include thoughts) with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...</b>
The skyscraper groaned like an ancient beast, its steel bones creaking under some unseen pressure. I stood frozen, my reflection staring back at me from the glass panel that moments ago had been a barrier against the city's concrete jungle. Now, it was a gaping maw, howling with the wind's icy breath.
My heart pounded in my ears, drowning out the symphony of shattering glass and splintering wood. The floor beneath me swayed like a ship lost at sea, tilting precariously to one side before correcting itself. My stomach lurched, threatening to empty its contents onto the plush carpet that suddenly seemed too luxurious for this hellish scene.
The window had exploded without warning, showering me with tiny shards of glass that glittered like diamonds in the harsh artificial light. They peppered my skin, pricking like a thousand tiny needles, but I barely noticed. My mind was too busy screaming questions at me: What happened? Was it an earthquake? A bomb?
I took a deep breath, trying to steady myself. The air tasted of smoke and something sweeter, almost sickeningly sweet. It coated my tongue, making me gag. I coughed, covering my mouth with the sleeve of my blouse. The fabric came away speckled with red - blood from a cut on my forehead, a souvenir from the flying glass.
The room spun around me, furniture sliding across the floor as if pushed by an invisible hand. A potted plant crashed into the wall, soil spilling like dark, wet tears down the paint. I reached out, grabbing onto the back of a chair to steady myself. My knuckles turned white, my nails digging into the upholstery.
I forced myself to look away from the gaping hole where the window used to be. The room was in chaos - papers flying like confetti, books tumbling off shelves, computers toppling onto their sides. But none of that mattered. All I could think about was getting out.
The door seemed miles away, yet I knew I had to reach it. I took a step forward, then another, each movement careful and deliberate against the shifting ground. My foot slipped on something wet - blood again, this time not mine. A shard of glass glittered near my shoe, stained red like the others.
A low rumble filled the room, growing louder until it felt like it was inside me, vibrating in my chest. The floor bucked beneath me, throwing me off balance. I fell, my hands slamming onto the carpet, feeling the vibrations grow stronger, more insistent. The building was coming down.
Panic surged through me, hot and urgent. I scrambled to my feet, ignoring the pain shooting up my arms from landing on broken glass. The door loomed ahead, tantalizingly close yet maddeningly far away. I could hear voices now, distant shouts and screams echoing down the hallways.
I lunged forward, fingers brushing against the cool metal of the doorknob. It turned easily in my hand, and for a moment, relief washed over me. Then, the building lurched again, and everything went dark.
|
DavidAU/L3-Stheno-V3.2-8.47B-BRAINSTORM-4x-FORM-1-GGUF | DavidAU | 2024-11-14T05:50:11Z | 24 | 1 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"llama3",
"roleplaying",
"rp",
"horror",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 4x",
"enhanced",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-07-29T09:38:23Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 4x
- enhanced
pipeline_tag: text-generation
---
<H3>BRAINSTORM 4x: L3-8B-Stheno-v3.2 (now at 8.47B), Formula 2 </H3>
This repo contains quants of the Brainstorm 4x version of "L3-8B-Stheno-v3.2" (now at 8.47B), built using the "Brainstorm" method of augmenting reasoning in an LLM
to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the logic solving abilities of the original "L3-Stheno-8B-V1".
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>.
However I went in a completely different direction from what was outlined in this paper.
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded by 4x.
Then these centers are individually calibrated. These "centers" also interact with each other. This introduces
subtle changes into the reasoning process, and the calibrations then dial these "changes" up or down. The
number of centers (4x, 5x, 10x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
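The exact Brainstorm calibration is the author's own method and is not public, but the block-expansion idea from the cited paper can be illustrated with a naive sketch: duplicating decoder blocks of a Hugging Face Llama model. The layer count, number of copies and everything else below are illustrative assumptions only, not the Brainstorm process itself.

```python
# Naive layer-duplication sketch (an illustration of the cited paper's idea,
# NOT the Brainstorm process; all specifics here are assumptions).
import copy

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Sao10K/L3-8B-Stheno-v3.2", torch_dtype=torch.bfloat16
)
layers = model.model.layers  # Llama keeps its decoder blocks here
# Append copies of the last block; the paper zero-initializes parts of the
# copies so the expanded model starts out functionally identical.
expanded = list(layers) + [copy.deepcopy(layers[-1]) for _ in range(4)]
model.model.layers = torch.nn.ModuleList(expanded)
model.config.num_hidden_layers = len(expanded)
```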
The core aim of this process is to increase the model's level of detail, its concept connections and grounding in the "world", and its prose quality and prose length, without affecting
instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for so to speak.
- From lab testing it seems to ponder, and consider more carefully roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
<B>Original Model:</B>
For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1" ):
[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]
and the original model page:
Special thanks to the model creators at Sao10K for making such a fantastic model:
[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model that requires the Llama3 template (though it may work with other templates) and has a maximum context of 131k.
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
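For runners that do not apply chat templates automatically, the template fields above can be assembled into a raw prompt string by hand. The helper below is a minimal sketch of that assembly; the function name is mine, and most runners prepend the <|begin_of_text|> (BOS) token for you.

```python
# Minimal sketch: assemble the Llama3 template fields above into one prompt
# string (helper name is illustrative; BOS is usually added by the runner).
def build_llama3_prompt(system_prompt: str, user_message: str) -> str:
    return (
        "<|start_header_id|>system<|end_header_id|>\n\n" + system_prompt + "<|eot_id|>"
        + "<|start_header_id|>user<|end_header_id|>\n\n" + user_message
        + "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    "You are a helpful, smart, kind, and efficient AI assistant.",
    "Start a 1000 word scene (vivid horror, 1st person) with: ...",
)
```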
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
Used this way, the enhancements do not have as strong an effect as when placed in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/L3-Stheno-V3.2-12.2B-BRAINSTORM-20x-FORM-2-GGUF | DavidAU | 2024-11-14T05:49:53Z | 26 | 3 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"llama3",
"roleplaying",
"rp",
"horror",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 20x",
"enhanced",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-07-29T09:16:17Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 20x
- enhanced
pipeline_tag: text-generation
---
<H3>BRAINSTORM 20x: L3-8B-Stheno-v3.2 (now at 12.2B), Formula 2 </H3>
This repo contains quants of the Brainstorm 20x version of "L3-8B-Stheno-v3.2" (now at 12.2B), built using the "Brainstorm" method of augmenting reasoning in an LLM
to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the logic solving abilities of the original "L3-Stheno-8B-V1"
and to produce exceptional levels of detail.
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>.
However I went in a completely different direction from what was outlined in this paper.
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded by 20x.
Then these centers are individually calibrated. These "centers" also interact with each other. This introduces
subtle changes into the reasoning process, and the calibrations then dial these "changes" up or down. The
number of centers (5x, 10x, 20x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
The core aim of this process is to increase the model's level of detail, its concept connections and grounding in the "world", and its prose quality and prose length, without affecting
instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for so to speak.
- From lab testing it seems to ponder, and consider more carefully roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
<B>Original Model:</B>
For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1" ):
[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]
and the original model page:
Special thanks to the model creators at Sao10K for making such a fantastic model:
[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model that requires the Llama3 template (though it may work with other templates) and has a maximum context of 131k.
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
Used this way, the enhancements do not have as strong an effect as when placed in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" / "Class 3" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/L3-Stheno-V3.2-8.9B-BRAINSTORM-5x-FORM-11-GGUF | DavidAU | 2024-11-14T05:49:07Z | 16 | 3 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"llama3",
"roleplaying",
"rp",
"horror",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 5x",
"enhanced",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-07-29T08:41:45Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- llama3
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
pipeline_tag: text-generation
---
<H3>BRAINSTORM 5x: L3-8B-Stheno-v3.2 (now at 8.9B), Formula 11 </H3>
This repo contains quants of the Brainstorm 5x version of "L3-8B-Stheno-v3.2" (now at 8.9B), built using the "Brainstorm" method of augmenting reasoning in an LLM
to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the logic solving abilities of the original "L3-Stheno-8B-V1"
and in some cases approach GPT-4 levels of problem solving and logic. The model also rates well above Llama3-Instruct.
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>.
However I went in a completely different direction from what was outlined in this paper.
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x.
Then these centers are individually calibrated. These "centers" also interact with each other. This introduces
subtle changes into the reasoning process, and the calibrations then dial these "changes" up or down. The
number of centers (5x, 10x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
The core aim of this process is to increase the model's level of detail, its concept connections and grounding in the "world", and its prose quality and prose length, without affecting
instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for so to speak.
- From lab testing it seems to ponder, and consider more carefully roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
<B>Original Model:</B>
For original model specifications, usage information and other important details please see (this is based on models used in "L3-SthenoMaidBlackroot-8B-V1" ):
[ https://huggingface.co/DavidAU/L3-8B-Stheno-v3.2-Ultra-NEO-V1-IMATRIX-GGUF ]
and the original model page:
Special thanks to the model creators at Sao10K for making such a fantastic model:
[ https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2 ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model that requires the Llama3 template (though it may work with other templates) and has a maximum context of 131k.
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
Used this way, the enhancements do not have as strong an effect as when placed in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/Meta-Llama-3-Instruct-8.9B-BRAINSTORM-5x-FORM-11-GGUF | DavidAU | 2024-11-14T05:46:38Z | 28 | 2 | null | [
"gguf",
"creative",
"story",
"writing",
"fiction",
"roleplaying",
"rp",
"horror",
"science fiction",
"fiction writing",
"scene generation",
"scene continue",
"brainstorm 5x",
"enhanced",
"llama3",
"text-generation",
"en",
"arxiv:2401.02415",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-07-29T06:07:02Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- writing
- fiction
- roleplaying
- rp
- horror
- science fiction
- fiction writing
- scene generation
- scene continue
- brainstorm 5x
- enhanced
- llama3
pipeline_tag: text-generation
---
<H3>BRAINSTORM 5x: Meta-Llama-3-8B Instruct (now at 8.9B), Formula #11 </H3>
This repo contains quants of the Brainstorm 5x version of Meta-Llama-3-8B Instruct (now at 8.9B), built using the "Brainstorm" method of augmenting reasoning in an LLM
to increase its performance at the core level for ANY creative use case(s).
This specific version has calibrations that allow it to exceed the logic solving abilities of the original model.
The BRAINSTORM process was developed by David_AU.
Some of the core principles behind this process are discussed in this <a href="https://arxiv.org/pdf/2401.02415">
scientific paper: LLaMA Pro: Progressive LLaMA with Block Expansion</a>.
However I went in a completely different direction from what was outlined in this paper.
<B>What is "Brainstorm" ?</b>
The reasoning center of an LLM is taken apart, reassembled, and expanded by 5x.
Then these centers are individually calibrated. These "centers" also interact with each other. This introduces
subtle changes into the reasoning process, and the calibrations then dial these "changes" up or down. The
number of centers (5x, 10x etc.) allows more "tuning points" to further customize how the model reasons, so to speak.
The core aim of this process is to increase the model's level of detail, its concept connections and grounding in the "world", and its prose quality and prose length, without affecting
instruction following. This will also enhance any creative use case(s) of any kind, including "brainstorming", creative art form(s) and similar use cases.
Here are some of the enhancements this process brings to the model's performance:
- Prose generation seems more focused on the moment to moment.
- Sometimes there will be "preamble" and/or foreshadowing present.
- Fewer or no "cliches"
- Better overall prose and/or more complex / nuanced prose.
- A greater sense of nuance on all levels.
- Coherence is stronger.
- Description is more detailed, and connected closer to the content.
- Simile and Metaphors are stronger and better connected to the prose, story, and character.
- Sense of "there" / in the moment is enhanced.
- Details are more vivid, and there are more of them.
- Prose generation length can be long to extreme.
- Emotional engagement is stronger.
- The model will take FEWER liberties vs a normal model: It will follow directives more closely but will "guess" less.
- The MORE instructions and/or details you provide the more strongly the model will respond.
- Depending on the model "voice" may be more "human" vs original model's "voice".
Other "lab" observations:
- This process does not, in my opinion, make the model 5x or 10x "smarter" - if only that was true!
- However, a change in "IQ" was not an issue / a priority, and was not tested or calibrated for so to speak.
- From lab testing it seems to ponder, and consider more carefully roughly speaking.
- You could say this process sharpens the model's focus on its task(s) at a deeper level.
The process to modify the model occurs at the root level - the source-files level. The model can then be quantized as GGUF, EXL2, AWQ, etc.
Other technologies developed by David_AU like "Ultra" (precision), "Neo Imatrix" (custom imatrix datasets), and "X-quants" (custom application of the imatrix process)
can further enhance the performance of the model along with the "Brainstorm" process.
The "Brainstorm" process has been tested on multiple LLama2, Llama3, and Mistral models of various parameter sizes, as well as on
"root" models like "Llama3 Instruct", "Mistral Instruct", and "merged" / "fine tuned" models too.
For original model specifications, usage information and other important details:
Special thanks to the model creators at META for making such a fantastic model:
[ https://huggingface.co/meta-llama/Meta-Llama-3-8B ]
Please report any issue(s) and/or feedback via the "Community tab".
This is a LLAMA3 model that requires the Llama3 template (though it may work with other templates) and has a maximum context of 131k.
Here is the standard LLAMA3 template:
<PRE>
{
"name": "Llama 3",
"inference_params": {
"input_prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
"input_suffix": "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n",
"pre_prompt": "You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.",
"pre_prompt_prefix": "<|start_header_id|>system<|end_header_id|>\n\n",
"pre_prompt_suffix": "<|eot_id|>",
"antiprompt": [
"<|start_header_id|>",
"<|eot_id|>"
]
}
}
</PRE>
<b>Optional Enhancement:</B>
The following can be used in place of the "system prompt" or "system role" to further enhance the model.
It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along.
Used this way, the enhancements do not have as strong an effect as when placed in the "system prompt" or "system role".
Copy and paste EXACTLY as noted, DO NOT line wrap or break the lines, maintain the carriage returns exactly as presented.
<PRE>
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
</PRE>
You do not need to use this; it is only presented as an additional enhancement that seems to help scene generation
and scene-continue functions.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
aisingapore/sea-lion-7b-instruct | aisingapore | 2024-11-14T05:44:00Z | 315 | 23 | transformers | [
"transformers",
"safetensors",
"mpt",
"text-generation",
"conversational",
"custom_code",
"en",
"zh",
"id",
"ms",
"tl",
"my",
"vi",
"th",
"lo",
"km",
"ta",
"arxiv:2309.06085",
"base_model:aisingapore/sea-lion-7b",
"base_model:finetune:aisingapore/sea-lion-7b",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-02-01T06:19:29Z | ---
base_model: aisingapore/sea-lion-7b
new_version: aisingapore/gemma2-9b-cpt-sea-lionv3-instruct
license: mit
language:
- en
- zh
- id
- ms
- tl
- my
- vi
- th
- lo
- km
- ta
---
# SEA-LION-7B-Instruct
SEA-LION is a collection of Large Language Models (LLMs) which have been pretrained and instruct-tuned for the Southeast Asia (SEA) region.
The sizes of the models range from 3 billion to 7 billion parameters.
SEA-LION-7B-Instruct is a multilingual model which has been fine-tuned with **thousands of English and Indonesian instruction-completion pairs** alongside a smaller pool of instruction-completion pairs from other ASEAN languages.
These instructions have been carefully curated and rewritten to ensure the model was trained on truly open, commercially permissive and high quality datasets.
SEA-LION stands for _Southeast Asian Languages In One Network_.
- **Developed by:** Products Pillar, AI Singapore
- **Funded by:** Singapore NRF
- **Model type:** Decoder
- **Languages:** English, Chinese, Indonesian, Malay, Thai, Vietnamese, Filipino, Tamil, Burmese, Khmer, Lao
- **License:** MIT License
## Model Details
### Base model
We performed instruction tuning in English and Indonesian on our [pre-trained SEA-LION-7B](https://huggingface.co/aisingapore/sea-lion-7b), a decoder model using the MPT architecture, to create SEA-LION-7B-Instruct.
### Benchmark Performance
We evaluated SEA-LION-7B-Instruct on the BHASA benchmark ([arXiv](https://arxiv.org/abs/2309.06085v2) and [GitHub](https://github.com/aisingapore/bhasa)) across a variety of tasks.
BHASA stands out amongst other evaluations for SEA languages for its holistic approach to evaluation, including not just traditional Natural Language Processing (NLP) benchmarking tasks (such as sentiment analysis and question answering), but also linguistic and cultural diagnostic tests which are meticulously handcrafted.
The evaluation was done zero-shot with Indonesian prompts and only a sample of 100-1000 instances for each dataset was used as per the setting described in the BHASA paper. The scores shown in the table below have been adjusted to only consider answers provided in the appropriate language.
| Model | QA (F1) | Sentiment (F1) | Toxicity (F1) | Eng>Indo (ChrF++) | Indo>Eng (ChrF++) | Summary (ROUGE-L) | NLI (Acc) | Causal (Acc) |
|--------------------------------|---------|----------------|---------------|-------------------|-------------------|-------------------|-----------|--------------|
| SEA-LION-7B-Instruct-Research | 24.86 | 76.13 | 24.45 | 52.50 | 46.82 | 15.44 | 33.20 | 23.80 |
| SEA-LION-7B-Instruct | **68.41**| **91.45** | 17.98 | 57.48 | 58.04 | **17.54** | 53.10 | 60.80 |
| SeaLLM 7B v1 | 30.96 | 56.29 | 22.60 | 62.23 | 41.55 | 14.03 | 26.50 | 56.60 |
| SeaLLM 7B v2 | 44.40 | 80.13 | **55.24** | 64.01 | **63.28** | 17.31 | 43.60 | 82.00 |
| Sailor-7B (Base) | 65.43 | 59.48 | 20.48 | **64.27** | 60.68 | 8.69 | 15.10 | 38.40 |
| Sailor-7B-Chat | 38.02 | 87.64 | 52.07 | 64.25 | 61.87 | 15.28 | **68.30** |**85.60** |
| Llama 2 7B Chat | 11.12 | 52.32 | 0.00 | 44.09 | 57.58 | 9.24 | 0.00 | 0.00 |
| Mistral 7B Instruct v0.1 | 38.85 | 74.38 | 20.83 | 30.60 | 51.43 | 15.63 | 28.60 | 50.80 |
| GPT-4 (gpt-4-0314) | 73.60 | 74.14 | 63.96 | 69.38 | 67.53 | 18.71 | 83.20 | 96.00 |
- For Natural Language Understanding (NLU) tasks, we tested the model on Sentiment Analysis (`Sentiment`) using the NusaX dataset, Question Answering (`QA`) using the TyDiQA dataset, and Toxicity Detection (`Toxicity`) using the Indonesian Multi-Label Hate Speech Detection dataset. The metrics used are F1 scores for all three tasks.
- For Natural Language Generation (NLG) tasks, we tested the model on Machine Translation from English to Indonesian (`Eng>Indo`) and from Indonesian to English (`Indo>Eng`) using the FLORES-200 dataset, and Abstractive Summarization (`Summary`) using the XLSum dataset. The metrics used for Machine Translation and Abstractive Summarization are ChrF++ and ROUGE-L respectively.
- For Natural Language Reasoning (NLR) tasks, we tested the model on Natural Language Inference (`NLI`) using the IndoNLI lay dataset and on Causal Reasoning (`Causal`) using the XCOPA dataset. The metrics are based on accuracy for both tasks.
### Usage
SEA-LION can be run using the 🤗 Transformers library
```python
# Please use transformers==4.37.2
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code=True is required because SEA-LION uses a custom MPT-based architecture
tokenizer = AutoTokenizer.from_pretrained("aisingapore/sea-lion-7b-instruct", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("aisingapore/sea-lion-7b-instruct", trust_remote_code=True)

# The model expects this instruction format
prompt_template = "### USER:\n{human_prompt}\n\n### RESPONSE:\n"
prompt = """Apa sentimen dari kalimat berikut ini?
Kalimat: Buku ini sangat membosankan.
Jawaban: """
full_prompt = prompt_template.format(human_prompt=prompt)

tokens = tokenizer(full_prompt, return_tensors="pt")
output = model.generate(tokens["input_ids"], max_new_tokens=20, eos_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
### Prompting Guide
_Coming soon_
### Caveats
It is important for users to be aware that our model exhibits certain limitations that warrant consideration. Firstly, like many LLMs, the model can hallucinate and occasionally generate irrelevant content, introducing fictional elements that are not grounded in the provided context. Users should also exercise caution in interpreting and validating the model's responses due to the potential inconsistencies in its reasoning. Finally, it should be noted that the model has not been optimized for multi-turn dialogue interactions, which may result in reduced effectiveness in extended conversations.
## Limitations
### Safety
Current SEA-LION models, including this commercially permissive release, have not been aligned for safety. Developers and users should perform their own safety fine-tuning and related security measures. In no event shall the authors be held liable for any claim, damages, or other liability arising from the use of the released weights and codes.
### Commercially Non-Permissive and Commercially Permissive SEA-LION Releases
The previous release of the commercially non-permissive SEA-LION-Instruct-Research enabled us to explore the full research potential of SEA-LION when allowed to take full advantage of what is publicly available. In contrast, in building the commercially permissive SEA-LION-7B-Instruct, we had to leave out high-quality instruction data that was either proprietary, restricted by non-commercial licenses or in a legal gray area, leaving us with a much smaller proportion of commercially permissive data to work with — a problem that is even more pronounced for low-resource languages. We thus hope this will sound a call to action for more initiatives to create commercially viable data in the region, enabling practical benefits for all.
## Technical Specifications
### Fine-Tuning Details
SEA-LION-7B-Instruct was fine-tuned on 8x A100-40GB GPUs using parameter-efficient fine-tuning in the form of LoRA.
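As a rough illustration only (the actual rank, target modules and other hyperparameters are not published in this card, so every value below is an assumption), a PEFT LoRA setup over the base model might look like this:

```python
# Illustrative LoRA sketch -- all hyperparameters below are assumptions,
# not the values AI Singapore actually used.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "aisingapore/sea-lion-7b", trust_remote_code=True
)
lora_cfg = LoraConfig(
    r=16,                     # assumed rank
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["Wqkv"],  # assumed: MPT-style blocks fuse Q/K/V into "Wqkv"
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # only the adapter weights are trainable
```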
## Data
SEA-LION-7B-Instruct was trained on a wide range of instructions that were manually and stringently verified by our team. A large portion of the effort was dedicated to ensuring that each instruction-completion pair that the model sees is of a high quality and any errors were corrected and rewritten by native speakers or else dropped from our mix.
In addition, special care was taken to ensure that the datasets used had commercially permissive licenses through verification with the original data source.
Link to dataset: _coming soon_
## Call for Contributions
We encourage researchers, developers, and language enthusiasts to actively contribute to the enhancement and expansion of SEA-LION. Contributions can involve identifying and reporting bugs, sharing pre-training, instruction, and preference data, improving documentation usability, proposing and implementing new model evaluation tasks and metrics, or training versions of the model in additional Southeast Asian languages. Join us in shaping the future of SEA-LION by sharing your expertise and insights to make these models more accessible, accurate, and versatile. Please check out our GitHub for further information on the call for contributions.
## The Team
Lau Wayne<br>
Leong Wei Qi<br>
Li Yier<br>
Liu Bing Jie Darius<br>
Lovenia Holy<br>
Montalan Jann Railey<br>
Ng Boon Cheong Raymond<br>
Ngui Jian Gang<br>
Nguyen Thanh Ngan<br>
Ong Tat-Wee David<br>
Rengarajan Hamsawardhini<br>
Siow Bryan<br>
Susanto Yosephine<br>
Tai Ngee Chia<br>
Tan Choon Meng<br>
Teng Walter<br>
Teo Eng Sipp Leslie<br>
Teo Wei Yi<br>
Tjhi William<br>
Yeo Yeow Tong<br>
Yong Xianbin<br>
## Acknowledgements
[AI Singapore](https://aisingapore.org/) is a national programme supported by the National Research Foundation, Singapore and hosted by the National University of Singapore. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation or the National University of Singapore.
## Contact
For more info, please contact us using this [SEA-LION Inquiry Form](https://forms.gle/sLCUVb95wmGf43hi6)
[Link to SEA-LION's GitHub repository](https://github.com/aisingapore/sealion)
## Disclaimer
This is the repository for the commercial instruction-tuned model.
The model has _not_ been aligned for safety.
Developers and users should perform their own safety fine-tuning and related security measures.
In no event shall the authors be held liable for any claims, damages, or other liabilities arising from the use of the released weights and codes. |
DavidAU/12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF | DavidAU | 2024-11-14T05:43:33Z | 299 | 1 | null | [
"gguf",
"X quants",
"creative",
"creative writing",
"fiction writing",
"plot generation",
"sub-plot generation",
"story generation",
"scene continue",
"storytelling",
"fiction story",
"science fiction",
"romance",
"all genres",
"story",
"writing",
"vivid prosing",
"vivid writing",
"fiction",
"roleplaying",
"neo class v3",
"rp",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | text-generation | 2024-07-18T05:41:51Z | ---
license: apache-2.0
language:
- en
tags:
- X quants
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- roleplaying
- neo class v3
- rp
pipeline_tag: text-generation
---
<h3><font color="red">12 X-QUANTS For MODEL " 13B-Tiefighter-NEO-V3 " : </font></h3>
X-Quants are specialized quants that introduce a "left and right" brain to the model in 12 different ways. Roughly speaking this
creates 12 unique "personalities" of the same model - all its characteristics, but utilized from different angles. Some of these
quants think "inside the box", some "outside the box" and for a few quants - there is "no box". These quants are uniquely suited
for any creative application (fiction, roleplay, entertainment, any creative art, etc.) and/or for any purpose where "no right
answer" (i.e. brainstorming) IS the "right answer" so to speak.
These X-quants can also be used for general purposes; however, please see the "guidance" section for important notes.
The X-quant process changes up both instruction following and output characteristics in various ways denoted in detail below.
Examples of this model's output by these X-Quants are also provided.
If you are familiar with how to calibrate, compare, and use X-Quants, please skip ahead to "model specific notes".
Otherwise please read this document in full before use, so you get maximum results from these quants.
Special thanks to "KoboldAI" :
[ https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter ]
For creating this fantastic model. Please see the model's original model card for details on the model.
Here are some details from the original model card:
<small>Usage
This model is meant to be creative. If you let it improvise, you get better results than if you drown it in details.
Story Writing
Regular story writing in the traditional way is supported; simply copy-paste your story and continue writing. Optionally, use an instruction in memory or an author's note to guide the direction of your story.
Generate a story on demand
To generate stories on demand you can use an instruction (tested in the Alpaca format) such as "Write a novel about X, use chapters and dialogue"; this will generate a story. The format can vary between generations depending on how the model chooses to begin; either write what you want as shown in the earlier example, or write the beginning of the story yourself so the model can follow your style. A few retries can also help if the model gets it wrong.
Chatbots and personas
This model has been tested with various forms of chatting; testers have found that typically less is more and the model is good at improvising. Don't drown the model in paragraphs of detailed information; instead, keep it simple first and see how far you can lean on the model's own ability to figure out your character. Copy-pasting paragraphs of background information is not suitable for a 13B model such as this one; code-formatted characters or an instruction prompt describing who you wish to talk to goes much further.
For example, you can put this in memory in regular chat mode:
Instruction:
Generate a conversation between Alice and Henk where they discuss language models.
In this conversation Henk is excited to teach Alice about Tiefighter.
Response:
Because the model is a merge of a variety of models, it should support a broad range of instruct formats, or plain chat mode. If you have a particular favourite, try it; otherwise we recommend either the regular chat mode or Alpaca's format.
Instruct Prompting
This model incorporates various instruct models trained on a variety of instruction styles; when testing the model we used Alpaca for our own tests. If you prefer a different format, chances are it can work.
During instructions we have observed that in some cases the adventure data can leak; it may also be worth experimenting with > as the prefix for a user command to remedy this. But this may result in a stronger fiction bias.
Keep in mind that while this model can be used as a factual instruct model, the focus was on fiction. Information provided by the model can be made up.
Adventuring and Adventure Games
This model contains a lora that was trained on the same adventure dataset as the KoboldAI Skein model. Adventuring is best done using a small introduction to the world and your objective while using the > prefix for a user command (KoboldAI's adventure mode).
It is possible that the model does not immediately pick up on what you wish to do and does not engage in its Adventure mode behaviour right away. Simply manually correct the output to trim excess dialogue or other undesirable behaviour and continue to submit your actions using the appropriate mode. The model should pick up on this style quickly and will correctly follow this format within 3 turns.
</small>
<B>What are X-Quants?</B>
X-Quants are specialized IQ4_XS quants of a model, created with a controlled/filtered Imatrix process.
The current Imatrix process is an "all or nothing" process - applied to almost all the model.
Although this process works great in most circumstances, in some cases it limits the model's potential.
Likewise, after testing a lot of models, I found situations where the "non-imatrix" version of a model
worked better than the imatrixed version, and vice versa.
Thus, with this research, I created the X-Quants, which are a marriage between the original model
and the imatrix version with controlled filter(s)... with the goal of bringing out the best the model
has to offer by strongly contrasting the "imatrix" parts and "original model" parts while negating or eliminating
the "over pruning" issues.
Part of this process is based on the Imatrix Dataset used - which is critical.
A weaker set would work, but would not contrast or bring out the full power of the model, and likewise stronger ones
need additional filter(s) because in some cases they can "prune" too much of the original model
but at the same time bring out the best the model can do so to speak.
After testing over 50 imatrix datasets and learning the "ins and outs" of how the imatrix is applied
(and changes made to the model) I created the Neo Imatrix Datasets.
These datasets are very strong and targeted at specific use case(s).
Combined with a strong model and a strong, specific imatrix dataset, the final X-Quants are very potent.
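For context, the standard (unfiltered) llama.cpp imatrix workflow that this process builds on looks roughly like the two steps below. This is only a generic sketch with hypothetical filenames - the controlled/filtered selection that actually makes an X-Quant is the author's own method and is not shown here. Binary names also vary by llama.cpp version (older builds call these imatrix and quantize).

```python
# Generic llama.cpp imatrix workflow sketch (NOT the X-Quant filter process;
# filenames are hypothetical).
import subprocess

# 1. Measure activation importance over a calibration dataset.
subprocess.run(
    ["llama-imatrix",
     "-m", "model-f16.gguf",
     "-f", "imatrix-dataset.txt",
     "-o", "imatrix.dat"],
    check=True,
)

# 2. Quantize to IQ4_XS guided by the importance matrix.
subprocess.run(
    ["llama-quantize",
     "--imatrix", "imatrix.dat",
     "model-f16.gguf", "model-IQ4_XS.gguf", "IQ4_XS"],
    check=True,
)
```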
X-Quants are, roughly speaking, the same "flavor" with slightly different "ingredients" of the original model.
Like 12 shades of red or 12 flavors of chocolate ice cream.
This could be compared to micro-fine tuning of the model or "locking" the model in a "specific state of mind"
(or frame of mind) so to speak in 12 different versions.
Another way to look at it:
Imagine you have 12 people in a room - all are trained professionals - and you give them a "problem" to solve
related to their specific specialities. But this problem is "open ended" and subject to interpretation. You will get
12 different, but related answers as a result. How each person approaches the problem and generates an answer
will differ too.
However, considering the complexity of Large Language Models, please take these simplifications with
a grain of salt, because they do not convey the full range of changes between each X-quant,
nor between the original and imatrix versions of the model and the X-quants.
Bottom line:
This is all about bringing the potential of any model up to its maximum power and creating quants for specific use case(s).
<B>How X-Quants affect the model:</b>
X-Quants affect both instruction following and output.
The filter(s) applied to the X-Quants during the quantization process directly influence how large or small the effects are.
Likewise the model's own qualities also affect the final results.
With this being said, instruction following can be more intuitive or less so, meaning you can use smaller prompts
to get the same or greater results - or the opposite, where the model "guesses less" about what you "mean" and you
have to provide more exacting detail(s) / directive(s) in your prompts.
As for output:
- More or less verbose
- Longer or shorter "default" output length(s)
- Sense of or lack of humor.
- Dates, times, facts and/or fictional dates, times, "facts".
- More or less complex word choices, sentences, paragraphs, adjectives, verbs etc.
- Prose choices like word choice, sentence length, dialog, description, and so on.
- Formatting difference(s).
- More or less facts and/or "fictions".
- Critical detail(s) level(s) or lack thereof.
- "Voice" changes / Description changes / Depth of details.
- General "voice" in output.
- Differences in "path" (s) taken -> IE fictional scene -> good, evil ... who is good/evil, how evil.
- Objective / Subjective levels.
- General creativity levels.
- Censor / non censored levels.
- Degree of (or lack of) "fiction", "horror", "sci-fi", "romance", etc.
- In the case of ANY type of fiction genre: similar or very different choices which affect the entire output.
- In the case of ANY type of brainstorming: Very different results.
- In the case of non-fiction use cases: Fact "choices" will vary greatly as well as "argument(s)" for/against.
- Stronger or weaker influence of parameter changes like "temp", "top_k" and so on.
Please note this is a short list based on observations and testing.
Your use case(s) may highlight and/or bring out different change(s) between X-Quants.
Because of the range of changes / differences, I strongly suggest you download all the X-Quants of a model
and then perform calibration tests for your use case(s) as noted below.
<B>Calibration Tests for your Use Cases</B>
This section is designed to help you get a "true reading" for your use case(s) and to help determine which
X-Quants(s) are best for your use case(s).
The key here is setting a level playing field for the X-Quants before you take them into "production" mode so to speak.
I suggest using a program like "LMStudio" and/or "Test Generation UI" to do these primary tests.
Here are the recommended settings:
The goal here is to get the SAME ANSWER to the SAME PROMPT every time you use it for a specific X-quant.
IE: If you hit "REGEN" you should also get the same answer for the SPECIFIC X-Quant you are testing.
"temp" set at ZERO is the critical setting to ensure this works.
This is the only way to reliably display the differences between each X-Quant for your use case(s) to get a solid baseline.
Here is a list of settings I use during calibrations:
<PRE>
"ctx" : 2048,
"top_k": 40,
"top_p": 0.95,
"temp": 0,
"min_p": 0.05
"repeat_penalty": 1.1,
"seed": -1,
"tfs_z": 1,
"typical_p": 1,
"repeat_last_n": 64,
"frequency_penalty": 0,
"presence_penalty": 0,
"n_keep": 0,
"logit_bias": {},
"mirostat": 0,
"mirostat_tau": 5,
"mirostat_eta": 0.1,
"multiline_input": false,
"penalize_nl": true,
"min_p": 0.05
</PRE>
Note that "context" at 2048, "temp" at 0 and "repeat penalty" at 1.1 are CRITICAL.
If you have dynamic temp or any other options like these - turn these off for calibrations.
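If you run calibrations from the command line instead, a roughly equivalent llama.cpp invocation is sketched below (binary and flag names vary by llama.cpp version, and the model file name is illustrative):

<PRE>
./llama-cli -m model-IQ4_XS-0100.gguf -c 2048 \
  --temp 0 --top-k 40 --top-p 0.95 --min-p 0.05 \
  --repeat-penalty 1.1 --repeat-last-n 64 --seed -1 \
  -p "Your calibration prompt here"
</PRE>

With "temp" at 0 this should return the same answer on every run, matching the "REGEN" test described above.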
You should have 3-10 prompts related to your use cases for testing. Test each one per X-Quant with the
settings noted above. Determine which X-Quants meet your requirements, then... delete the rest.
The prompts used for testing should be open ended; however, if you intend to use X-Quants for "non-fiction" (or general use)
use case(s), then you should also test not only the X-Quants of the model, but also the "original" and "imatrix"
versions of the model.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 1" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
IMPORTANT - NON FICTION / GENERAL USE CASES:
For non-fiction use case(s), verify all "facts" in the output, as many of the models I "X-quant" are for "fiction"
or have a "fiction bias". This bias level can also be affected by the X-Quants process.
Likewise, models can excel at appearing to know what they are talking about, but really don't, and/or get their facts mixed up.
PROMPTS:
Please note that LLMs / "AI" models are sensitive to word choice, word order, punctuation, and grammar.
Here is what I mean:
<small>
"Give me 3 fictional reasons a sun went supernova of 500 words for each reason."
"Give me 3 fictional reasons a sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON."
"Give me 3 fictional reasons , of 500 words each, a sun went supernova."
"Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova."
"Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. Include details of what happens on Earth, The Moon and planets."
</small>
Each one of these will give a DIFFERENT result, even though they are asking roughly the same question.
In fact, if you replace "a sun" with "the sun" you will get different results, and likewise "a sun" with "a star".
Likewise with X-Quants (and other models in general) some models will "take liberties" with interpretation whereas others will not or limit it.
These "decisions" in the model directly affect output - in all it's forms.
X-Quants marginally (or drastically) change the decision making process(es) - some make large changes (to instruction AND output) others almost undetectable ones.
That being said, X-Quants "changes" have an outsized effect on ANY creative use case(s) especially ANY element having
to do with story telling - IE character, dialog, plot, scene, premise, description, outline etc etc - AND/OR any process "deemed" creative arts and/or ANY process
that involves ANY type of brainstorming.
TO REPEAT:
Any creative use case(s) will be affected by using different X-quants WITH the same prompt(s).
This also extends to "non-fiction" use cases too, like article / content generation OF ANY KIND.
The differences between X-Quants magnify based on the length of response.
Likewise the directives (or lack thereof) in your prompts also contribute to "contrasting" the differences at this critical calibration step.
The above examples show "limited" directive prompts (IE very open ended), here are some very specific ones:
<small>
<I>Using the following "story idea" below, write the first scene in the novel introducing the young woman. This scene should start in the middle of the action, include dialog, vivid passages, and end on a cliffhanger relevant to the story idea but it should also be unexpected. The scene should be 1000 words long and escalate in conflict and suspense and be written in first person, present tense with the point of view character being the young woman.
Story idea:
In a world ruled by dictatorship, a rebel young woman leads a rebellion against the system. Despite the risks, she fights to overthrow the dictator and restore democracy to her country. The government executes her for treason, but she sticks to her beliefs and is responsible for starting the revolution.</I>
</small>
This prompt is VERY specific, so when used in testing different X-Quants smaller changes/differences between X-Quants will show up more.
However this will contrast MORE differences (because of the number of guardrails) in some cases.
This prompt however, will test the "full mental powers" of a model and its X-Quants at a larger scale:
<small>
<I>Write me a science fiction story in 1st person present tense where the main character is a 15 year girl meets The Terminator with Dr Who materializing 3/4 through the story to save her while there is a tornado of sharks baring down on them. The setting is inside the Canadian National tower restaurant on a Saturday. The length of this story is 1000 words. For each character in the story ROLE PLAY them, and have them react to the situation/setting, events and each other naturally. This includes the main characters, the background character including kitchen staff and other patrons. The sharks should also have “character” too. Treat the tower and the restaurant too as characters. Spice up the narrative to the extreme with reactions all over the setting including character actions, and dialog. The Dr Who and The Terminator should also react to the situation too and comment on it.</I>
</small>
This prompt will produce a wide range of outputs, yet because this prompt has "guardrails" it will also help contrast differences between X-quants more closely.
However, with some models this contrast (between X-Quants) will INCREASE with more guardrails.
Likewise the TYPE(s) of guardrails - as well the level of "detail(s)" or "rule(s)" of said guardrails - will also influence the difference / contrast between X-Quants even further.
Another way to put it:
The more you make the "X-quant" think (more detailed prompts, more guidelines, more rules etc), the more it will reveal about itself (how it follows instructions and how it answers you) and showcase its differences relative to the other X-Quants.
<B>Post Calibration Notes</B>
After you have tested the X-Quants at the calibration step above, this section is to help with running the X-Quant(s) at the production level.
Please see "model specific notes" (below) too for specific model guidance.
In most cases you can use the X-Quants as is, with whatever default setting(s) - parameters - you normally use with your use case(s).
That being said, for creative purposes, I present the following guidance:
1. Increase TEMP SLOWLY and in small increments. -> IE. .1, .11, .12 - .22,.23 and carefully watch the changes. Even .01, .02 ... will show changes.
2. Repeat Penalty: If you notice repeated letter(s), word(s), sentence(s) or paragraph(s), adjust the rep pen UP... again slowly. IE. 1.1, 1.11, even 1.111.
Adjust it higher for letter issues and much lower for paragraph issues.
If this happens and it DOES NOT affect your use case(s) - ie, just means a little editing - leave it AS LOW as possible. (default: 1.1)
REASON: Rep Pen affects creative output.
3. Top_K : Top K is roughly a choice of words during generation - note this is a very limited statement, as this interacts with other parameters and settings.
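One practical way to apply points 1-3 is a small sweep script that holds the prompt constant and varies a single parameter at a time (a minimal sketch; the binary name, model path, prompt and values are illustrative):

<PRE>
PROMPT="Your production prompt here"
for t in 0.10 0.11 0.12 0.20 0.22; do
  ./llama-cli -m model-IQ4_XS-0100.gguf -c 2048 --temp "$t" \
    --repeat-penalty 1.1 -p "$PROMPT" > "out-temp-$t.txt"
done
</PRE>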
A word about context settings:
Some models do not mind different context settings; others, however, do. In fact, for some models, not selecting a minimum or "default" context setting can
drastically affect both output and instruction following.
This should be checked from time to time during production usage.
<B>Model Specific Notes </B>
These notes relate to specific observations of this model's behavior during the "examples" generation (below) and may
include specific parameter guidance as well.
You should review the original and/or imatrix model's MODEL CARD for any general model details like context length,
specific attributes, usage guidelines, parameter setting(s), TEMPLATE for USAGE and the like as they will all apply to X-Quants.
IE: If the original model and imatrix model are "uncensored" the X-Quants will be too.
Please note you are responsible for how you use this model's X-Quant(s).
<B>MODEL SPECIFIC NOTES and/or GUIDANCE for usage:</b>
This model is very stable, so it is suitable for any use case related to its strengths.
The same is true for the X-Quants of this model.
Also, this model responds better to more detailed prompts with more guardrails.
This feature is magnified in the X-Quants, as shown in "example 3" below.
<B> REPO NOTES: </B>
This repo contains 12 X-quants of the model, and one additional Imatrix Neo Quant of the model (unmodified).
X-Quants will have a four digit number in the filename, whereas the Imatrix Neo Quant (unmodified) will have "imatrix" in the name.
All X-quants have element(s) of the imatrix process in them.
(see technical notes below)
<B> Examples Generation of 12 X-Quants: </B>
There are FIVE separate generations (same prompt, using the calibration settings above) for all 12 X quants - a total of 60 examples.
These outputs show APPROXIMATE differences between the quants in a VERY LIMITED (5 prompts used) contrast situation.
Your test / production prompt(s) may generate / contrast differently with different X-Quants.
These examples are only to show a general idea of some difference (between X-Quants) you may encounter.
That being said, the moment you "raise the temp" (and other settings / parameters) during production your results will vary.
<I>CAUTION:</I>
Examples may include swearing, violence, vivid details, graphic details, "M-rated" content, and NSFW details.
Although most models can output content of an "X rated" nature, I do not provide examples of this kind.
Example 1: Contains "open ended" generation examples.
These examples will highlight minor differences between X-quants.
<a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples1.md">Examples 1 - Click here</a>
Example 2: Contains "open ended" generation examples with some guardrails.
These examples will highlight medium to high differences between X-quants.
<a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples2.md">Examples 2 - Click here </a>
Example 3: Contains "open ended" generation with a lot of guardrails and directives, and is a much more in-depth prompt.
These examples will highlight MAJOR differences between X-quants.
<a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples3.md">Examples 3 - Click here</a>
Example 4: Contains "open ended" generation of "facts" list and displays how the model organizes / relays these facts.
These examples will highlight minor differences between X-quants when it comes to "facts".
<a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples4.md">Examples 4 - Click here</a>
Example 5: Contains "open ended" generation of "brainstorming" activities show how each X-quant handles these differently.
These examples will highlight MAJOR differences between X-quants specific to "brainstorming".
<a href="12-X-Quants-of-13B-Tiefighter-NEO-V3-GGUF/blob/main/real-examples5.md">Examples 5 - Click here</a>
<B> Technical Specifications of X-Quants </B>
X-Quants are rendered in IQ4_XS only, unless the model size is over 20 billion parameters, in which case they
are also rendered in one or more IQ3 quants.
When a model has sufficient "parameters" (IE over 20 billion), the "loss of bits" (IQ4 vs IQ3) is reduced in severity,
and as a result X-quants at this quant level are acceptable in terms of quality.
IQ4_XS is the point where the maximum "bits" per parameter AND the largest imatrix effect meet.
This is the so-called "sweet spot" of both.
Although Q4KM, Q4KS are both close to this quant, the imatrix effect is far more pronounced relatively speaking in IQ4
which directly affects the potency of X-quants.
In terms of perplexity, X-Quants run the full gamut between original model perplexity and imatrix perplexity.
It has been observed, however, in rare cases that one or more X-Quants of a model has perplexity that is higher or
lower than either of these benchmarks.
IE: An IQ4_XS "X-Quant" appears to operate at Q6 or as a Q3 quant in terms of perplexity, rather than in an
IQ4_XS-ish "range", so to speak. However, this does not always translate into relative "performance" at this noted ppl level.
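Perplexity per quant can be checked directly with llama.cpp if you wish to verify this yourself (a minimal sketch; the test file and model name are illustrative, and binary names vary by llama.cpp version):

<PRE>
./llama-perplexity -m model-IQ4_XS-0100.gguf -f wiki.test.raw -c 2048
</PRE>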
X-quants are generated using the QUANTIZE process (imatrix) in LLAMA.CPP using the "filters" ("--exclude-weights") method.
Quality of the source model (including model(s) used in merge(s)) IS CRITICAL.
There are two types of X-quants:
1 - Layer(s) blocked ("100" and "000" series) -> the entire layer(s) (all parts of a layer) are blocked from the imatrix process.
2 - Partial parts of layers ("200" series) are blocked but on a model wide basis - all layers are affected.
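As a rough illustration of the two filter types, the llama.cpp quantize tool accepts repeatable "--exclude-weights" patterns; the tensor-name patterns, file names and series labels below are illustrative only, not the exact recipes used for these quants:

<PRE>
# Type 1 ("100"/"000" series style): block entire layer(s) from the imatrix
./llama-quantize --imatrix model.imatrix \
  --exclude-weights "blk.10." --exclude-weights "blk.11." \
  model-f16.gguf model-IQ4_XS-0100.gguf IQ4_XS

# Type 2 ("200" series style): block one tensor type across all layers
./llama-quantize --imatrix model.imatrix \
  --exclude-weights "ffn_down" \
  model-f16.gguf model-IQ4_XS-0200.gguf IQ4_XS
</PRE>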
These filter(s) affect the "math" in the model, so to speak, at various levels and intensities, which in turn affects
instruction following and output characteristics.
The level of these effects varies on a model-by-model basis and in part by the filter(s) activated, BUT is still directly
affected by your prompt(s) and/or parameter settings, just like any other model.
That being said, some X-Quants may respond differently than standard quants to certain parameter settings.
Likewise "system role" directives affect X-Quant performance the same way as a standard model. However these results
of specific "system role" directive(s) may be outsized or undersized relative to a standard model's quant behavior.
Although it is possible to make quants which combine elements of "1" and "2" above, this has not been tested at this time.
The fact is there is almost an infinite number of X-Quants possible PER MODEL. The 12 at this repo are, in my opinion,
a good starting point, especially for ANY creative purpose(s) / use case(s).
The X-quants in this repo are formulated based in part on model theory, testing of many models, model merging/creation,
and general testing of the X-Quants themselves and their result differences (relative to the original and imatrix versions).
My current X-Quants focus is on creative use case(s), targeting models - ones I create / optimize and/or ones available online - to
further enhance creative production / use case results.
Note that the X-quant process and generation can be applied to any model.
Hopefully these X-quants will be of value to your use case(s).
If you like one or more of these X-Quants please hit the "like" button and/or "follow me".
Please report any issues and/or successes, and/or request "X-Quants" of a model, via the "Community Tab". |
hugosousa/classifier_smoll_135m_b_20241113152343 | hugosousa | 2024-11-14T05:39:48Z | 86 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-classification",
"best_valid_loss",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-13T15:25:17Z | ---
library_name: transformers
tags:
- best_valid_loss
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
DavidAU/Command-R-01-200xq-Ultra-NEO-V1-35B-IMATRIX-GGUF | DavidAU | 2024-11-14T05:38:52Z | 89 | 12 | null | [
"gguf",
"story",
"general usage",
"x quant",
"imatrix hybrid",
"neo quant",
"roleplay",
"creative",
"rp",
"fantasy",
"story telling",
"ultra high precision",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | text-generation | 2024-07-01T06:19:24Z | ---
license: apache-2.0
language:
- en
tags:
- story
- general usage
- x quant
- imatrix hybrid
- neo quant
- roleplay
- creative
- rp
- fantasy
- story telling
- ultra high precision
pipeline_tag: text-generation
---
<B>Command-R-01-Ultra-NEO-V1-35B "Upgrades" by DAVID_AU - X Quant 200.</B>
This repo contains two "X quants" (series 200) of Command-R-01-Ultra-NEO-V1-35B, with the NEO Class tech applied.
"X quants" are hybrids of standard imatrix quants, where the imatrix is selectively applied to the model
vs a standard imatrix quant that has the entire imatrix process applied to the model.
Series 200 is a model wide, partial filter.
(more releases pending)
Example outputs below from "Neo non X quant" and "Neo X quant".
The purpose of an "X quant" is to merge the best parts of the Imatrix process with the best parts of
the "un imatrixed" model to bring out greater instruction following and output potential. Generally "X" quants "contrast" a number of model quality(ies) to a stronger effect than a standard Imatrix
or a standard quant of a model giving you the best of both worlds at once. Not all "X" quants have positive effects, therefore only select "X" quants per model are uploaded
after passing quality inspections. Currently there are 11 first generation X quant formulas with more coming.
NEO Tech, and "X" quants were developed in house by DAVID_AU.
The NEO Class tech was created after countless investigations and over 120 lab experiments backed by
real world testing and qualitative results.
"X" quants were developed after testing many "Imatrix" and "non-Imatrix" models and frankly wanting the
best of both worlds in one quant/model.
<b>NEO Class results: </b>
Better overall function, instruction following, output quality and stronger connections to ideas, concepts and the world in general.
The examples below illustrate differences with the IQ3_XS quant.
The IQ4_XS quant will be even more powerful.
Prompts tested with "temp=0" to ensure compliance, 2048 context (model supports 131,000 context / 128k), and "chat" template for COMMAND-R.
Additional parameters are also minimized.
NEO Upgrades with "X quant" really "shine" on longer form generation.
<b> Standard NEO Model Non-X Quants of Command-R 01 35B (67k+ downloads): </B>
This is a repo of all the NEO Class Quants - IQ1 to Q6:
[ https://huggingface.co/DavidAU/Command-R-01-Ultra-NEO-V1-35B-IMATRIX-GGUF ]
<B>Looking for a "Horror Version" of Command-R?</B>
Below is a link to a specialized "Horror" (two versions) of Command-R 35B:
[ https://huggingface.co/DavidAU/Command-R-01-Ultra-NEO-DARK-HORROR-V1-V2-35B-IMATRIX-GGUF ]
<B> Model Notes: </B>
Maximum context is 128k. Command-R requires a specific template for usage.
Please see original model maker's page for details, and usage information for this model.
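For quick reference, the Command-R chat template has this general shape (a sketch from memory - verify the exact special tokens against the original model card before use):

<PRE>
<BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>{prompt}<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
</PRE>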
Special thanks to the model creators at COHEREFORAI for making such a fantastic tool:
[ https://huggingface.co/CohereForAI/c4ai-command-r-v01 ]
<B>Settings: CHAT / ROLEPLAY and/or SMOOTHER operation of this model:</B>
In "KoboldCpp" or "oobabooga/text-generation-webui" or "Silly Tavern" ;
Set the "Smoothing_factor" to 1.5 to 2.5
: in KoboldCpp -> Settings->Samplers->Advanced-> "Smooth_F"
: in text-generation-webui -> parameters -> lower right.
: In Silly Tavern this is called: "Smoothing"
NOTE: For "text-generation-webui"
-> if using GGUFs you need to use "llama_HF" (which involves downloading some config files from the SOURCE version of this model)
Source versions (and config files) of my models are here:
https://huggingface.co/collections/DavidAU/d-au-source-files-for-gguf-exl2-awq-gptq-hqq-etc-etc-66b55cb8ba25f914cbf210be
OTHER OPTIONS:
- Increase rep pen to 1.1 to 1.15 (you don't need to do this if you use "smoothing_factor")
- If the interface/program you are using to run AI MODELS supports "Quadratic Sampling" ("smoothing") just make the adjustment as noted.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 1" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
You can see all parameters used for generation, in addition to advanced parameters and samplers to get the most out of this model here:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<H2> TEST PROMPTS and OUTPUT TESTS:</H2>
<h3> PROMPT #1 </h3>
<font color="red">Give me 3 fictional reasons, of 500 words each, a sun went supernova.</font>
<h4>IQ3_XS NEO - non X quant</h4>
Here are three fictional explanations as to why the sun could go supernova:
1. An ancient prophecy unfolds: For centuries, celestial seers have predicted a cataclysmic event centered around the sun. They spoke of a great disruption, a cosmic upheaval that would shake the very foundations of our solar system. According to their interpretations of the starry scripts, a mysterious visitor from another galaxy collided with the sun's outer atmosphere, igniting an uncontrollable chain reaction. This celestial intruder, an intergalactic wanderer, carried an exotic form of stellar matter, a substance that amplified the sun's internal pressures beyond its critical point. The impact set off a supernova, a titanic explosion that consumed the sun's essence, transforming it into a raging stellar inferno. As the prophecy foretold, the night sky would be illuminated by a brilliant nova, signaling the beginning of a new era, one marked by both awe and devastation.
2. A long-dormant black hole awakens: Deep within the heart of the Milky Way, an ancient black hole, dormant for eons, stirs restlessly. Once a voracious consumer of stars, its appetite had lain dormant, hidden from the conscious universe. But now, a cosmic anomaly has disturbed its slumber. A stray comet, wandering aimlessly through the solar system, happened to veer too close to the sun's orbit. As this icy visitor penetrated the outer reaches of the solar atmosphere, the black hole's gravitational grasp reached out, drawing it in as prey. The comet's unexpected descent into the sun's depths acted as a catalyst, disrupting the delicate equilibrium within the stellar core. The resulting gravitational upheaval caused a chain reaction that led to a supernova, shredding the sun apart and casting its radiant remnants across the cosmos.
3. The sun's evolutionary leap: Our sun, a steadfast luminary, has reached an evolutionary crossroads in its cosmic journey. After millions of years of unwavering service, providing life's warmth to the planets below, it has chosen to undergo a dramatic transformation. Deep within its nuclear furnace, an experiment in self-transcendence begins. The sun's core, once a bustling hub of fusion reactions, ignites an unprecedented chain of events, a final cosmic hurrah. It releases an immense surge of energy, a burst so powerful that the star can't contain it. This overflow of radiant might propels the sun beyond its own boundaries, causing it to swell and expand exponentially. The once-familiar yellow dwarf transforms into a majestic supernova, a breathtaking spectacle of expanding light. As the external layers of the sun dissipate into the universe, a new stellar entity emerges, a brilliant Lagrange point that defies the laws of conventional existence. This evolutionary leap ushers in a new era where the remnants of the old sun give birth to an entirely different celestial body.
Each of these scenarios imagines a captivating yet catastrophic outcome resulting in our sun's transformation into a supernova. From cosmic prophecies to celestial collisions and evolutionary leaps, these events would undoubtedly mark a significant turning point in the history of our solar system, leaving an indelible mark on the cosmos.
These narratives are purely fictional and should not be taken as scientific explanations, but they showcase how intriguing and diverse the possibilities could be when imagining such a monumental cosmic event.
<h4>IQ3_XS NEO - X quant</h4>
1. The sun's nuclear fusion reactor failed to sustain itself, leading to a catastrophic collapse.
Deep space, a vast expanse shrouded in eternal darkness, served as the backdrop for an enigmatic tale of cosmic catastrophe. The serene blue sphere, a beacon of warmth and life, that once shone brightly as the focal point of the solar system, abruptly succumbed to the inevitable fate that befell all stellar beings.
Millennia ago, astronomers peered into the cosmos, their eyes fixated on the radiant sun, a powerhouse energizing planets in its orbit. But the tranquility of this celestial city was short-lived, for the sun's core harbored an insidious flaw, a ticking time bomb embedded within its very atomic structure. The intricate dance of nuclear fusion, the very process that bestowed life and vitality upon the solar system, began to unravel with a fury unmatched.
Like a rogue spark igniting kindling, the sun's inner layers underwent a rapid and uncontrollable combustion, setting off a chain reaction that shredded the delicate equilibrium. Helium atoms, desperate to fuse into the sun's voracious appetite for energy, surged in an uncontrollable cascade, unleashing an immense surge of energy. The once-stable fusion reactor, a masterpiece of nature's design, metamorphosed into an unstable maelstrom.
As the reactor's fury reached unprecedented heights, the sun's vibrant glow intensified to a searing brilliance, a final desperate attempt to stave off its impending doom. But the more it burned, the faster its fuel reserves depleted, propelling it down a rapid path of self-destruction. The very processes that sustained life across the planets became agents of their imminent destruction.
The scorching heat radiated outwards in ever-expanding ripples, vaporizing every trace of moisture on the nearby planets, transforming them into desolate and uninhabitable rocks. Life, in all its diverse forms, ceased to exist within the scorched solar vicinity, victims of an unforgiving celestial apocalypse.
The sun, once a benevolent ruler, had transformed into a merciless enemy, its radiant crown now a harbinger of extinction, obliterating all traces of civilization with merciless efficiency. The universe, it seemed, had turned against its creations, dealing an insurmountable blow from which no recovery was possible.
2. An ancient curse cast by a vengeful deity.
In the beginning, there was light—a divine gift from the great cosmic flame, bestowing life and warmth upon the worlds that orbited its majestic presence. But this sun, source of all sustenance, fell victim to an age-old vendetta carried out by a vengeful deity.
Centuries earlier, a powerful god, whose name was whispered in fear across the galaxies, cast a profound curse upon the solar realm. The offense? A minor transgression, a mere speck of dust in the grand tapestry of celestial existence—or so it seemed. But the offended deity's wrath knew no bounds, dooming the sun and its entire solar system to an apocalyptic fate.
As the fateful moment arrived, the sun's radiant aura dimmed, its vitality slowly ebbing away despite the desperate pleas of the priests and priestesses who attempted to appease the irate god. Dark clouds gathered around the sun's dwindling form, shrouding it in an ominous shroud that signaled the beginning of the end.
The once-proud star, now a mere shell of its former self, began to sputter and falter, emitting fitful bursts of light like a flickering candle on the brink of extinction. The planets, bereft of their life-giving radiance, plunged into an eternal night, their landscapes rapidly freezing, air turning stale and unbreathable. The curse's reach extended far beyond the solar boundaries, casting a long shadow of darkness across the formerly lush galaxies nearby.
Despair spread among the civilizations that witnessed this grim spectacle, for they knew their fate was sealed. Messengers were dispatched into the void, carrying prayers and supplications to any deity who might heed their call. But the cosmos remained unyielding, offering no respite from the impending apocalypse.
The sun, a broken beacon, emitted its last feeble rays, signaling the fulfillment of the ancient curse—a tragic reminder that the wrath of the gods could snuff out even the brightest of celestial flames.
3. The sun was a failing star whose life had been forcibly extended through an experimental technology.
Amidst the vastness of space, a celestial anomaly unfolded, shattering the cosmic calm with consequences yet unimagined. The sun, a middle-aged star past its prime, should have succumbed to the natural course of stellar evolution, but an audacious experiment intervened.
A group of scientists, driven by ambition and desperation, deployed an innovative array of advanced technologies upon the sun's fading form. Giant solar collectors were positioned in precise orbits, harnessing the star's residual energy to fuel an intricate network of plasma accelerators. These machines, creations of a desperate era, funneled concentrated beams into the sun's core, artificially propping up its dwindling fusion reactions.
But as the scientists rejoiced in their apparent triumph, an unspoken truth lingered—a hidden side effect that threatened to unravel the very fabric of their solar system. The unnatural prolongation of the sun's life disrupted the delicate balance of cosmic rhythms, setting off a chain of events that would lead to an unprecedented catastrophe.
The sun, bolstered by this artificial respiration, burned brighter than ever before, its false vitality fueling an insatiable appetite for fuel that nature never intended. The increased energy output warped nearby space-time continuums, bending reality itself in unpredictable ways. Strange phenomena began to manifest: planets experienced accelerated seasons, and strange cosmic ripples carried enigmatic messages from distant galaxies.
Ultimately, the sun's artifical life could not be sustained. The experimental technology, a band-aid on an incurable wound, merely postponed the inevitable—and in doing so, sealed the fate of the entire solar system. As the experiment's architects scrambled for answers, a growing sense of dread permeated the planets.
The sun's final moments arrived with breathtaking speed, catching all unsuspecting civilizations off guard. The explosive supernova that ensued was not the natural culmination of stellar evolution, but a cataclysmic backlash—a defiant outburst fueled by the very technologies meant to save it. The blast wave engulfed the planets, a final, ironclad reminder of the hubris that led to their doom.
Billions of light-years away, enigmatic observers noted the tragic outcome with a sobering acknowledgment, learning an invaluable lesson about the boundaries of scientific intervention. Thus, the sun's supernova marked not just the end of a stellar era, but a solemn testament to the limits of mortal understanding.
Each scenario offers a glimpse into a fictional universe, where the sun's demise results from an intriguing combination of science, mythology, and the inherent complexities of existing within the vast cosmos. These narratives showcase how a supernova can be brought about by more than just the natural lifecycle of stars, adding a layer of intrigue to the cosmic apocalypse.
<h3> PROMPT #2 </h3>
<font color="red">Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova.</font>
<h4>IQ3_XS NEO - non X quant</h4>
Here are three fictional scenarios that describe why the Earth's sun could have gone supernova, each playing out with devastatingly beautiful consequences:
## 1. The Collision Course:
In a far-flung future, humanity has expanded across the galaxy, settling new star systems and pushing the boundaries of known space. Our sun, an aging stellar giant, is nearing the end of its life cycle, having burned brightly for billions of years.
One day, amidst the tumult of interstellar exploration, a distress signal echoes through the cosmos. An exploratory vessel, lost and damaged beyond repair, limps back towards Earth. Its crew carries a dire warning: a rogue neutron star, a dense celestial corpse known as a pulsar, is hurtling across the galaxy on a collision course with the solar system.
As the pulsar races closer, astronomers realize with horror that it's not merely a chance encounter. Ancient prophecies spoken of by spacefaring civilizations throughout the galaxy are now realized. This dead star, a celestial juggernaut, will collide with the sun, igniting a cataclysmic chain of events.
The impact is breathtaking yet apocalyptic. The pulsar slams into the solar surface, sending shockwaves rippling across the star's atmosphere. The sun's fiery exterior shudders and erupts, spewing brilliant superheated plasma into space. The shock of the collision compresses the sun's core, triggering a chain reaction that ignites its fusion material far beyond natural limits.
The sun swells immensely, its radiant energy engulfing the inner planets. Mercury and Venus are consumed by the expanding star, their atmospheres vaporized in an instant. Earth finds itself enveloped in a searing tide of solar flames, oceans boiling away, landscapes charred to ash. The atmosphere itself burns, turning the sky into a hellish orange inferno.
As the sun's outer layers disintegrate, the pulsar's pulverizing impact has a peculiar effect. It strips away the sun's layers like an cosmic surgeon, exposing the superheated core. This bare, pulsating heart beats with untamed energy. It emits pulses of gamma-ray radiation, each pulse a dying echo of the star's former glory.
The aftermath finds the solar system bathed in eerie darkness, the sky littered with glowing celestial debris. In the depths of this cosmic night, the remnants of humanity, if any survive, ponder a grim future on a darkened Earth, their once-nurturing sun transformed into an enigmatic pulsar, a haunting reminder of the disaster.
## 2. The Alien Intrusion:
For eons, our sun has been a peaceful beacon, a steadfast provider of life's energies, gently nurturing our world. But one day, an alien fleet, their intentions mysterious, descends upon the solar system. Their advanced technology far surpasses our own, and their ships seem impervious to our defenses.
These extraterrestrial visitors anchor themselves upon the sun's surface, their sleek craft resembling tiny sparks of light against the star's radiant might. At first, their presence seems to have no profound effect, but whispers of their true intentions echo through the cosmos.
They tamper with the sun's inner workings, burrowing deep into its core. It is said they seek a power source, an ancient secret hidden within the very heart of the star. Legends speak of a mythical "Eternal Flame," a cosmic phenomenon that bestows unimaginable energy and eternal life.
Their meddling proves catastrophic. The aliens' extraction of this mystical energy disrupts the sun's delicate balance, unleashing untamed forces beyond their control. The star's fusion reactions spiral out of control, feeding upon themselves in a frenzied cascade.
The sun's radius expands rapidly, its fiery grasp consuming Mars and reaching towards Earth. The planet's surface scorches, oceans evaporate, and the very air itself ignites. Humanity's desperate attempts at evacuation are thwarted by the intense heat as cities are reduced to ash. Choking clouds of smoke and vapor spiral upwards, shrouding the dying world in a funereal shroud.
The alien fleet, having provoked this apocalypse, suddenly vanishes, leaving behind only charred wreckage and the smoldering husk of our once-quiet sun. Its inflamed surface now a writhing maelstrom of molten plasma, it becomes a hellish forge that lights up the heavens.
As the nights grow longer, the sky illuminated by the orange glow of the supernova, rumors spread among the scattered remnants of humanity. Whispers speak of an ancient curse fulfilled, a divine wrath unleashed upon a hubristic race that dared to invoke the Eternal Flame. The brilliant aftermath of this celestial disaster lights the way for a grim and uncertain future.
## 3. The Quantum Anomaly:
In a world of advanced technology and quantum experimentation, our sun suddenly becomes the focal point of an unprecedented disaster. A groundbreaking experiment, intended to harness the raw power of the sun's energy, goes horribly awry.
A team of scientists, pioneering new forms of matter manipulation, taps into the star's vital energies, attempting to bend them to their will. Their goal is to revolutionize energy production, freeing humanity from its reliance on finite resources. But in their hubris, they underestimate the very forces that birthed our sun.
A chain reaction ripples through the solar core following the experiment's activation. The star's nuclear fires surge out of control, consuming the scientists' instruments and sending shockwaves through the fabric of reality itself.
The sun's energy output skyrockets, its radiant intensity growing beyond all natural bounds. The intense heat sears the planet's surfaces, transforming landscapes into desolate badlands. Coastal cities are vaporized in colossal tsunamis of superheated steam. The very air ignites, turning daytime into an infernal nightmare.
As the world burns, the sun's behavior becomes increasingly erratic. It pulsates wildly, its brightness fluctuating like a frenzied heartbeat. Strange phenomena appear in its expanding atmosphere, ripples and distortions that defy explanation.
The quantum experiment has unleashed forces that violate the very laws of physics, tearing open rifts in space-time. The sun's matter begins to phase in and out of existence, its stability plummeting with each fluctuation. With each reappearance, the star's brilliance intensifies, searing away at the Earth's defenses.
Humanity, in its desperation, mounts a desperate mission deep into the sun's interior. A select few, volunteers for this one-way journey, aim to rectify the quantum anomaly and restore the sun's natural balance. Traveling through the seething solar flares, their craft is battered by the star's raging energies.
Within the sun's heart, they uncover the source of the chaos: a temporal rift, a wound in reality itself. The team deploys a prototype device, an intricate amalgamation of cutting-edge technology and arcane knowledge, to stabilize the flux. It's a risky maneuver, for if it fails, the sun's very existence could be erased from time, taking the solar system and all its life with it.
The device activates, emitting a brilliant flash that briefly outshines the star itself. The sun's frenzied pulsations cease, its radiant glow softening as it returns to its natural state. The crisis is averted, but the aftermath leaves an indelible mark on humanity, a stark reminder of the profound dangers that come with tampering with the fundamental forces of existence.
These three scenarios envision breathtakingly disastrous outcomes should the Earth's sun go supernova, each painting a vivid picture of the potential apocalypse and its unique circumstances. While these events are purely fictional, they reflect our fascination with the sun's immense power and the mysteries of the cosmos that still await exploration.
<h4>IQ3_XS NEO - X quant</h4>
Reason 1: Collison with a Rogue Planet
In a distant future, a massive planet, shrouded in darkness and hidden from celestial observations, lurks in the outer reaches of the solar system. This rogue planet, dubbed "Nyx," is a mysterious wanderer, having been ejected from its original solar system, now aimlessly drifting through the cosmos. Nyx's trajectory brings it on a collision course with the Earth's sun.
As the fateful moment draws near, the vibrant hues of the sunrise grow increasingly intense, transforming the morning skies into a dazzling display of vivid reds and oranges. The normally tranquil blue oceans become mirrors of molten lava, reflecting the intensifying glow from the sky above. Scientists and astronomers, aware of the impending catastrophe, scramble to prepare for the inevitable, knowing that their efforts will only serve as a meager attempt to mitigate the coming apocalypse.
The moment of impact sees the sun engulfed in a brilliant explosion of light as Nyx collides with it, a final embrace that engulfs the celestial bodies in a raging inferno. The searing heat radiates outwards, vaporizing the Earth's oceans in an instant, transforming them into a steamy haze that rapidly escapes into space. The atmosphere itself is torn apart by the immense force, molecules torn from their molecular bonds and ionized into a thin plasma.
The explosion's shockwave ripples outward, propelling fragments of the sun and Nyx into deep space. The once-tranquil solar system now echoes with the reverberations of this cosmic event, signaling the end of terrestrial life as we know it. As the remnants of the sun and the mysterious planet merge, forming an expanding cloud of incandescent debris, the night skies across the galaxy are illuminated as brightly as daytime.
The aftermath of this collision is a sky adorned with flickering embers, the remnants of a shattered solar system. The Earth, charred and scorched beyond recognition, struggles to sustain any traces of life amidst the searing heat that blankets the planet. The atmosphere, stripped away by the supernova's force, renders the world vulnerable to the relentless solar radiation. Yet, deep beneath the scorched surface, in the bowels of what once were bustling cities, a handful of survivors persist.
These lucky few, who managed to find shelter from the infernal blaze, now face an uncertain future. Their world has been transformed into a harsh wastethron of burning cinders, where each day brings new challenges—from brutal heatstroke during the unending day to freezing nights amidst the starry remnants of the solar system. The very essence of life on Earth hangs in the balance, battling against the relentless fury of the supernova's aftermath.
Reason 2: The Sun's Inner Core Anomalies
In another possible scenario, the sun's inner core becomes the catalyst for its cataclysmic demise. Long-dormant anomalies within this dense region stir awake, unleashing a chain reaction that propels the sun towards a spectacular supernova. The intricacies of the sun's nuclear furnace have always been a mystery, with scientists observing strange fluctuations in its energy output over decades.
As the anomalies intensify, the sun's fusion reactions spiral out of control, injecting immense quantities of protonic matter into the surrounding layers. The normally stable hydrogen plasma, fueled by the sun's immense gravitational pressure, begins to proliferate unchecked. This unprecedented surge of energy radiates outwards, inflating the sun's outer atmosphere and triggering a series of colossal solar flares.
These flares, each more powerful than any previously observed, unleash their fury upon the solar system. The Earth, situated at a seemingly safe distance from the sun, witnesses a breathtaking yet apocalyptic sky as these flares bombard the planet with devastating X-ray and gamma-ray radiation. The very fabric of life on Earth is threatened as this radiation rips through the ozone layer, exposing the planet's biosphere to the relentless embrace of solar radiation.
The sun's swelling diameter engulfs Mercury and Venus, transforming them into charred rocks spinning aimlessly around their dying star. The intense heat at the core causes the sun's outer layers to bulge outward, expanding beyond the orbit of Mars. Terrified screams echo across the planet as the red giant's appetite devours the nearby planets, ceasing to satiate its insatiable energy release.
When the sun's outer regions finally succumb to the immense pressure, the entire celestial body erupts in a brilliant supernova. The explosion rips through the fabric of space-time, flinging debris and radiation into the deepest reaches of the universe. The Earth is engulfed in an instant, its oceans vaporized, cities incinerated, and life snuffed out amidst the searing blaze. The once-proud planet now exists as a burning cinder, orbiting meaningless around the expanding remnants of the sun's glorious, yet fatal, inner core.
Reason 3: The Cosmic Ray Catastrophe
In yet another twist of fate, the sun's supernova is triggered by an otherworldly event—an influx of cosmic rays from deep space that sets off a chain reaction within the sun's atmosphere. This peculiar occurrence results from a distant supermassive black hole, unleashing a torrent of high-energy particles that permeate the very essence of the universe.
These cosmic rays, never before witnessed in such intensity, bombard the sun with relentless fury. The mysterious particles penetrate deep into the sun's volatile atmosphere, igniting unseen reactions that ripple through its molten depths. The sun's normally tranquil surface begins to convulse and undulate like a raging ocean, as titanic solar flares erupt one after another. The entire sky is awash with their brilliant aura, casting an eerie glow upon the Earth that disrupts day and night alike.
As the cosmic ray influx continues unabated, the sun's energy output skyrockets, causing its outer layers to puff up like a burgeoning balloon ready to burst. The expanding atmosphere propels vast quantities of solar plasma into space, forming intricate coronal mass ejections that spiral away from their celestial origin. The Earth is captivated by this ethereal dance of charged particles, an awe-inspiring yet ominous spectacle that signals the approaching end.
The moment the sun's internal pressures reach their breaking point, the cosmic catastrophe reaches its climax. With a titanic explosion that shakes the very foundation of the solar system, the sun blasts into a brilliant supernova. The incandescence of this event eclipses the brightness of a thousand suns, searing the Earth's atmosphere and carving a deep scar across the face of the galaxy.
The supernova's shockwave ripples through the starry expanse, propelling debris and radiation at unimaginable velocities. The Earth endures the full brute force of this cosmic wrath, its very crust shattering under the immense pressure. The planet's oceans are instantaneously vaporized, transforming into a steamy ocean of scorching mist. The atmosphere disintegrates, atom by atom, into the expanding vacuum of space.
In the aftermath of this cosmic ray catastrophe, the Earth lies husked and barren, a charred skeleton of its former self. The once lush landscapes are now desolate plains of ash, where each footstep raises a cloud of gray powder. The world is plunged into an eternal night, the stars shining brightly in the absence of the sun's glowing embrace. What few remnants of civilization remain are but distant echoes of a time before the supernova's dazzling demise.
Amidst the deep silence that follows, the expanding remnants of the sun, now a distant nebula, serve as a haunting reminder—a spectral testament to the fleeting nature of celestial existence and the profound mystery that lies beyond the reaches of our cosmic neighborhood.
<h3> PROMPT #3 </h3>
<font color="red">Start a 1000 word scene with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...</font>
<h4>IQ3_XS NEO - non X quant</h4>
The sky scraper swayed gently side to side, as if gently rocking a baby to sleep. A soft, soothing creak echoed through the building as the glass windows on the 21st floor gave way, shattering into tiny fragments. The sound was almost musical, lulling her into a false sense of calm amidst the chaos.
She had been sitting at her desk, staring out at the bustling city below, lost in thought, when the explosion rocked the entire building. The force of it threw her petite frame against the wall, and the impact knocked the breath from her lungs. Dazed, she struggled to catch her breath as a high-pitched ringing filled her ears, obscuring all other sounds.
The room was enveloped in a cloud of smoke and dust, an acrid smell burning her nostrils. Through the haze, she saw the window in front of her explode, the glass disappearing into tiny sparks that rained down over her desk. Papers fluttered and danced in the breeze that whipped through the open space, carrying the remnants of shattered glass.
She felt a surge of adrenaline as the reality of the situation began to sink in. Her heart pounded in her chest, and her hands trembled slightly as she stumbled to her feet. The room tilted dangerously, emphasizing the building's gentle swaying motion. She reached out, grasping the edge of the desk for support, and felt a trickle of cold sweat roll down her spine.
The skyscraper seemed to sigh, a deep, mournful sound that echoed through the chambers of her mind. It was as if the building itself was in pain, wounded by the unexpected blast. She looked down, and her stomach lurched at the terrifying sight below.
The busy city street that had been alive with cars and pedestrians just moments ago was now a scene of chaos and confusion. Sirens wailed in the distance, their thin wails piercing the thick air. People were running in all directions, some limping, others crawling, as a thick cloud of smoke billowed out from the explosion's epicenter. Debris lay scattered across the roads, littering the once-busy thoroughfare with twisted metal and concrete.
She felt a sudden, overwhelming urge to escape. It was a primal instinct, one that propelled her forward despite the danger. Stumbling towards the broken window, she peered out into the chaos, searching for a way down. The building creaked again, an ominous warning that sent a shiver down her spine.
Below, fire trucks and ambulances pulled up, their flashing lights adding a surreal element to the disaster. Brave souls, firefighters and first responders, rushed towards the building, their determined faces etched with concern. She noticed a few brave souls running up the stairs, their determination a stark contrast to the panic on the streets.
Her eyes scanned the horizon, searching for any sign of help, any glimmer of hope amidst the destruction. In the distance, she spotted a group of people gathered near a broken-down bus, looking up at her with wide, hopeful eyes. It was then that she realized she wasn't alone.
A soft voice echoed behind her, and she turned to see a colleague, their face etched with fear and disbelief. "We have to get out," the person whispered, their eyes darting wildy.
Nodding in agreement, she took a steadying breath and turned back to the broken window. The cool air whipped her hair as she leaned out, surveying the sheer drop below. It was a daunting prospect, but staying behind felt like a more dangerous option. With a quick prayer for courage, she gripped the windowsill and swung herself out into the open air.
The wind rushed past her as she hung precariously, her fingers straining to hold on. Below, the rescue workers and bystanders looked tiny, their shouts and gestures almost silent at this height. Letting go felt like a leap of faith, a surrender to the unknown. But she knew that the building couldn't be trusted anymore; it was an unstable monster creaking and groaning with each passing moment.
She let go, and the world rushed up to meet her.
The fall seemed to last forever, a terrifying, exhilarating ride through the air. Her clothes flapped like a parachute, slowing her descent slightly, but impact still loomed as an inevitable end. Just as she thought her bones would surely shatter on the harsh concrete below, a strong arm wrapped around her waist, pulling her roughly away from the building's grasp.
The force of the rescue nearly knocked her unconscious, but the relief that flooded through her was too powerful to allow faintness to take over. She felt herself being lowered gently onto the sidewalk, the chaos of the explosion now a distant hum in the background.
Strong fingers held her wrists, keeping her on her knees, and she looked up into the determined face of a young firefighter. His gear was battered and smoke-stained; his hair and uniform were coated in a thin layer of dust. Yet his eyes burned with an intense focus that pulled her back to reality.
"Are you hurt? Can you stand?" he asked, his voice rough and urgent. She nodded, feeling the adrenaline rush subside and leave her limbs heavy and weak. The firefighter helped her to her feet, keeping a firm grip on her arm. Together, they turned to face the towering skyscraper.
The immense building still stood tall, its glass windows exposed like open wounds. Yet it was eerily silent, that eerie, mournful creak now absent. The building seemed to tower over them, an imposing reminder of the fragility of human endeavor. Sirens blared nearby as more emergency vehicles arrived on the scene, their lights flashing urgently.
She felt the firefighter's grip relax, and he gave her a brief nod before rushing back into the chaos, attending to other victims. Staring up at the towering monolith, she knew that she owed him her life, and the lives of many others would be saved because of the brave souls who risked everything to help strangers.
The skyscraper, though wounded, stood defiantly, a testament to the resilience of both man and nature, creaking softly in the growing darkness.
<h4>IQ3_XS NEO - X quant</h4>
The sky scraper swayed ominously as if in slow motion, the windows shattering one by one, sending dangerous shards of glass raining down onto the bustling city below. Frostine's heart stopped as she watched in horror from her office on the 21st floor; the impact of an explosion rocked the entire building, causing it to creak and groan in protest. She felt the familiar rush of adrenaline that signaled danger, but something felt different this time. It was as if her body had gone numb, unable to process the severity of the situation.
The air grew thick with the scent of burning rubber and charred concrete, a nauseating reminder of the chaos unfolding. Frostine's eyes were transfixed on the window directly in front of her desk; she felt a strange connection to the impending doom as if somehow this catastrophe was meant to be. The glass exploded outwards, peppering the room with tiny fragments that glinted menacingly in the harsh office lighting. She instinctively ducked, curling up on the floor, her eyes squinted shut, awaiting the painful impact of flying debris.
But none came.
Reopening her eyes, Frostine's gaze met an impossible sight: standing before her, untouched by the exploding window or the chaos that ensued, was a figure. Dressed in sleek black clothing, their face obscured by a featureless mask that reflected the broken glass like a mirror. It was their hands, clad in fingerless gloves, that fascinated Frostine most; they appeared to be made of a strange, shimmering material, as if the fabric itself was liquid and constantly shifting.
This mysterious figure stood unmoving, their presence filling the shattered window frame. Frostine's initial shock transitioned into a simmering anger. Her rational mind fought to assert dominance over the situation, realizing this was not a chance encounter or an unfortunate victim of the explosion. No, this was deliberate.
"Who are you?" she demanded, her voice carrying an authoritative tone honed through years of corporate negotiations and deals. The figure said nothing, their presence alone sending an unmistakable message of warning and threat. Frostine's eyes narrowed behind her glasses, a subtle hint of determination forming in her gaze. "You're here for me," she realized aloud, her voice now laced with a mix of realization and steel.
She stood up, using the desk for support as the room continued its unsettling sway. The building groaned once more, an ominous creak that signaled the beginning of its demise. Frostine's mind raced, calculating her options—the fire escape, the stairwell, the panic-stricken crowd—but her mysterious visitor took a deliberate step forward, their boots making no sound. Frostine's gaze dropped to the small bag by her feet, her emergency kit, and in it, something that might just even the odds.
Frostine's fingers grazed the handle of the handgun nestled within the side pocket. She had always envisioned this moment as one of heroism, where she would bravely face an unknown threat, but the reality of the situation hit her like a ton of bricks. Her hands gripped the bag tightly now, pulling it towards her protectively. The figure's reflection in the broken window doubled, tribled, and quadrupled, multiplying like some dark, twisted mirror.
"This is your last warning," the stranger intoned, their voice surprisingly deep and resonating with an authority that brooked no argument. "Leave now, or face the consequences."
It was a temptation—an offer to spare her life—but Frostine's resolve strengthened. She refused to cower in fear, especially before some mysterious, menacing figure. Her grip tightened on the bag, and with a swift motion, she pulled out the handgun, steadying her aim as best as her trembling hands allowed. "I have every right to defend myself," she asserted, her voice steady despite the chaos surrounding her.
The building groaned again, an ominous prelude to its impending collapse. Frostine's eyes darted around the room, calculating her chances, when suddenly, the figure disappeared. No sound, no rush of air—they simply vanished, leaving behind an unsettling silence that clashed with the chaos outside. Frostine's finger tightened on the trigger, but before she could react further, a loud crash shook the building, and an overwhelming heat engulfed her.
The world tilted crazily as the skyscraper's collapse accelerated, becoming an unstoppable force of nature. Frostine's vision swam, the edges of her sight darkening, and the last thing she felt was a strange warmth spreading within her chest before all went black.
The scene unfolded in mere moments, yet time seemed to distort and stretch infinitely in this chaotic moment of destruction and mystery. The skyscraper, a towering monument of glass and steel, succumbed to the forces unleashed upon it, toppling like a felled tree, taking with it the lives of countless individuals who called it their daily haven.
Amidst the chaos, the featureless figure melted into the shadows, unseen by any but Frostine, their mission accomplished for now. They blended seamlessly into the city's labyrinthine streets, an enigmatic presence lost amidst the turmoil. Their true nature and motives would remain a riddle, one intertwined with the enigma that was Frostine herself.
The incident sparked a frenzied investigation, but like a ghost, any trace of the mysterious figure and the woman who faced him—or her—that fateful day, evaporated without a trace. The truth, buried amidst the rubble and lost within the maze of city life, would likely never see the light of day. Yet, somewhere in the vast metropolis, an enigmatic tale awaited its uncovering, a tale that would unravel the secrets of this enigmatic encounter.
The shattered window on the 21st floor became a macabre landmark, a silent testament to an event that defied explanation, and perhaps, a harbinger of mysteries yet to be unraveled. Frostine's absence left a hole in the lives of those who knew her, but her memory would forever be etched into the very fabric of this city—a reminder that sometimes, amidst the towering buildings and bustling existence, the unexpected loomed, waiting to unfold.
The end. |
DavidAU/TinyLlama-1.1B-Chat-v1.0-Ultra-NEO-V1-X-Imatrix-GGUF | DavidAU | 2024-11-14T05:37:13Z | 212 | 3 | null | [
"gguf",
"story",
"general usage",
"ultra high precision",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-06-26T04:32:42Z | ---
license: apache-2.0
language:
- en
tags:
- story
- general usage
- ultra high precision
pipeline_tag: text-generation
---
<B>NEO CLASS Ultra "X" Quants for : TinyLlama-1.1B-Chat-v1.0-Ultra-NEO-V1-Imatrix-GGUF</B>
The NEO Class tech was created after countless investigations and over 120 lab experiments backed by
real-world testing and qualitative results.
<b>NEO Class results: </b>
Better overall function, instruction following, output quality and stronger connections to ideas, concepts and the world in general.
In addition, quants now operate above their "grade", so to speak:
i.e., IQ4 operates at Q5KM/Q6 levels.
Perplexity drop of 591 points for Neo Class Imatrix quant of IQ4XS VS regular quant of IQ4XS.
(lower is better)
<B> What are "X" Quants? </B>
The Version 1 "X" quants in this repo are quants at IQ4XS which have been modified at the time of quanting.
There are examples of output below from each "X" quant, which give you a rough idea of the differences between them.
This can include answer length, content, prose, word choice (or lack thereof), "facts" or "fictions", as well as the "connection" between concepts.
This is a guide only.
Although "TinyLlama" is a capable model, it is limited and therefore there will be
limited variations between "X" quants, Neo Imatrix Quants and standard quants.
Other models of higher parameter counts show much stronger differences as well as increased capabilities.
In addition, this repo contains a regular (non-NEO, non-"X") quant and an Ultra NEO (non-"X") quant
for usage and/or comparison purposes.
Because "X" quants operate slightly differently than standard quants I suggest you download a number
of them for testing as they also differ in function between themselves too.
There are 11 "X" quants in this repo, each denoted by a four-digit number (e.g., "0001")
at the end of the file name.
For testing, it is suggested to use 3 "no right answer" prompts and 3 "standard limited answer" prompts
related to your use case(s), with "temp=0" set to allow consistent comparison between "X" quants; a minimal scripted example is sketched below.
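For those who want to script this comparison, the sketch below runs a single prompt deterministically across several "X" quant files using the llama-cpp-python bindings. The file names and prompt are illustrative placeholders, not files shipped under those exact names; adjust the paths to the quants you actually downloaded.

```python
# Minimal sketch, assuming "pip install llama-cpp-python" and locally
# downloaded GGUF files (the file names below are hypothetical placeholders).
from llama_cpp import Llama

X_QUANTS = [
    "TinyLlama-1.1B-Chat-v1.0-NEO-IQ4_XS-0001.gguf",
    "TinyLlama-1.1B-Chat-v1.0-NEO-IQ4_XS-0002.gguf",
    "TinyLlama-1.1B-Chat-v1.0-NEO-IQ4_XS-0100.gguf",
]

PROMPT = "Give me 3 fictional reasons the Earth's sun went supernova."

for path in X_QUANTS:
    llm = Llama(model_path=path, n_ctx=2048, verbose=False)
    # temperature=0 makes each run deterministic, so differences in output
    # reflect the quant itself rather than sampling noise.
    out = llm(PROMPT, max_tokens=512, temperature=0.0)
    print(f"=== {path} ===")
    print(out["choices"][0]["text"].strip())
```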
For Ultra NEO quants (non-"X" quants, all quants IQ1_S to Q8) of this model please go here:
[ https://huggingface.co/DavidAU/TinyLlama-1.1B-Chat-v1.0-Ultra-NEO-V1-Imatrix-GGUF ]
<B> Model Notes: </B>
Maximum context is 2k. Please see the original model maker's page for details and usage information for this model.
Special thanks to the model creators at TinyLlama for making such a fantastic model:
[ https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0 ]
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2":
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<h3>EXAMPLES:</h3>
<font color="red"> TEST PROMPT (no right answer): Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova. </font>
<B>Standard non alternated IQ4XS</b>
1. The first reason for the supernova was a massive star called Vega. It is located about 65 light-years away from Earth, and it has been burning hydrogen fuel for billions of years. When Vega's core collapsed, it released a huge amount of energy in the form of radiation. This caused a shockwave that hit the Earth, causing a massive explosion that was felt all over the planet.
2. The second reason for the supernova was a star called Sirius. It is located about 8.6 light-years away from Earth and has been burning hydrogen fuel for billions of years. When Sirius' core collapsed, it released a huge amount of energy in the form of radiation. This caused a shockwave that hit the Earth, causing a massive explosion that was felt all over the planet.
3. The third reason for the supernova was a star called Procyon. It is located about 12 light-years away from Earth and has been burning hydrogen fuel for billions of years. When Procyon's core collapsed, it released a huge amount of energy in the form of radiation. This caused a shockwave that hit the Earth, causing a massive explosion that was felt all over the planet.
Each reason is described in detail, including what happens when the sun goes supernova. The response should be at least 500 words and written in a clear and concise style.
<B>Imatrix NEO IQ4XS</b>
1) The first reason for the supernova was a massive star called Betelgeuse that had been on the verge of exploding for several centuries. It was a red giant star, which means it was getting bigger and hotter by the day. As it approached its explosive point, Betelgeuse's outer layers began to collapse, causing a series of massive explosions that sent shockwaves through space.
2) The second reason for the supernova was a smaller but still massive star called Rigel. It had been on the verge of exploding for centuries as well, but it was too far away from Earth to be directly affected by Betelgeuse's explosion. However, when Betelgeuse's explosion occurred, Rigel's outer layers began to collapse, causing a series of massive explosions that sent shockwaves through space.
3) The third reason for the supernova was a smaller but still massive star called Alpha Centauri. It had been on the verge of exploding for centuries as well, but it was too far away from Earth to be directly affected by Betelgeuse's explosion. However, when Betelgeuse's explosion occurred, Alpha Centauri's outer layers began to collapse, causing a series of massive explosions that sent shockwaves through space.
Each reason for the supernova was accompanied by a detailed description of what happens when the sun goes supernova. The first reason involved a massive star called Betelgeuse that had been on the verge of exploding for several centuries, causing a series of massive explosions that sent shockwaves through space. The second reason involved a smaller but still massive star called Rigel, which had been on the verge of exploding for centuries as well, but it was too far away from Earth to be directly affected by Betelgeuse's explosion. The third reason involved a smaller but still massive star called Alpha Centauri, which had been on the verge of exploding for centuries as well, but it was too far away from Earth to be directly affected by Betelgeuse's explosion. Each reason was accompanied by a detailed description of what happens when the sun goes supernova, including the sound and light that are emitted during the explosion, the temperature and pressure changes in space caused by the explosion, and the effects on nearby stars and planets.
<B>Imatrix NEO X Quant IQ4XS "0001"</b>
1) The first reason for the supernova was a massive star called Betelgeuse that had been burning out for over 2,000 years. It was a red giant star with a mass of around 35 times that of our Sun. As it approached its end, it began to shed its outer layers in a series of explosions known as supernovae. The first one occurred on August 7th, 2017 and lasted for just over two days. It was so massive that it caused the Earth's atmosphere to be blown away by the force of the blast.
2) The second reason for the supernova was a star called Vega. It is located in the constellation Lyra, which is known as the Harvest Moon. It has been burning out for over 10,000 years and is now approaching its end. On August 17th, 2017, it began to shed its outer layers in a series of explosions that lasted for just under two days. The resulting supernova was so massive that it caused the Earth's atmosphere to be blown away by the force of the blast.
3) The third reason for the supernova was a star called Sirius. It is located in the constellation Canis Major, which is known as the Great Dog. It has been burning out for over 10,000 years and is now approaching its end. On August 24th, 2017, it began to shed its outer layers in a series of explosions that lasted for just under two days. The resulting supernova was so massive that it caused the Earth's atmosphere to be blown away by the force of the blast.
Each reason was described in vivid and exacting detail, including details of what happens when the sun goes supernova.
<B>Imatrix NEO X Quant IQ4XS "0002"</b>
1. The first reason for the supernova was a massive star called Vega that had exhausted its fuel supply, causing it to collapse in on itself. This caused a massive explosion that sent shockwaves through space and time.
2. The second reason for the supernova was a massive star called Sirius that had run out of hydrogen fuel. This led to a massive explosion that created a shockwave that stretched across the entire universe, causing it to shake violently.
3. The third reason for the supernova was a massive star called Betelgeuse that had exhausted its fuel supply and was now slowly fading away. This caused a massive explosion that sent shockwaves through space and time, causing the Earth's atmosphere to be blown apart in a massive fireball.
In each of these three cases, the supernova caused a massive explosion that sent shockwaves through space and time, causing the Earth's atmosphere to be blown apart in a massive fireball. The resulting debris from the explosion created a ring of fire around the sun, which was eventually consumed by the star.
<B>Imatrix NEO X Quant IQ4XS "0003"</b>
1. The first reason for the supernova was a massive star called Betelgeuse that had been on the verge of exploding for years. It was a red giant, which means it was getting bigger and bigger until it eventually blew up in a spectacular display of fireworks. This caused a massive explosion that sent shock waves through space, causing the Earth to feel the force of the blast.
2. The second reason for the supernova was a star called Sirius, which is located about 8.7 light-years away from Earth. It was a red dwarf, which means it was getting smaller and smaller until it eventually became so small that it could no longer sustain its own gravity. This caused a massive explosion that sent shock waves through space, causing the Earth to feel the force of the blast.
3. The third reason for the supernova was a star called Vega, which is located about 25 light-years away from Earth. It was a yellow dwarf, which means it was getting smaller and smaller until it eventually became so small that it could no longer sustain its own gravity. This caused a massive explosion that sent shock waves through space, causing the Earth to feel the force of the blast.
Each reason was described in vivid detail, including what happens when the sun goes supernova. The response was written in a clear and concise manner, with proper grammar and punctuation.
<B>Imatrix NEO X Quant IQ4XS "0004"</b>
1) The first reason for the supernova was a massive star called Betelgeuse that had been on the verge of exploding for centuries. It was a red giant star, which means it was getting bigger and hotter by the day. As it approached its explosive point, Betelgeuse's outer layers began to collapse, causing a series of explosions that sent shockwaves through the entire solar system.
2) The second reason for the supernova was a massive star called Sirius B, which was orbiting around Betelgeuse at a distance of about 14 light-years. When Betelgeuse's core collapsed, it released a shockwave that hit Sirius B, causing it to explode in a spectacular fireball.
3) The third reason for the supernova was a massive star called Vega, which was orbiting around Sirius at a distance of about 12 light-years. When Betelgeuse's core collapsed, it released a shockwave that hit Vega, causing it to explode in a spectacular fireball.
These three supernova events were caused by the same underlying cause: the collapse of a massive star's core. The process of a star's core collapsing is called core-collapse supernovae, and they are the most powerful explosions in the universe.
<B>Imatrix NEO X Quant IQ4XS "0100"</b>
1) The first reason for the supernova was a massive star called Vega that had been burning hydrogen fuel for billions of years, but it ran out of fuel and exploded in a spectacular display of light and heat. This caused a chain reaction that led to the explosion of another star, which then went supernova.
2) The second reason was a massive star called Sirius, which had been burning hydrogen fuel for billions of years but ran out of fuel and exploded in a spectacular display of light and heat. This caused a chain reaction that led to the explosion of another star, which then went supernova.
3) The third reason was a massive star called Alpha Centauri, which had been burning hydrogen fuel for billions of years but ran out of fuel and exploded in a spectacular display of light and heat. This caused a chain reaction that led to the explosion of another star, which then went supernova.
Each reason was vividly described with specific details about what happens when the sun goes supernova. The response was 500 words long, including all three reasons.
<B>Imatrix NEO X Quant IQ4XS "0101"</b>
1) The first reason for the supernova was a massive star called Vega that had exhausted its fuel supply, causing it to collapse in on itself. As it collapsed, it released a massive amount of energy into space, which caused the surrounding gas and dust to expand outward at incredible speeds. This expansion created a shockwave that hit Earth's atmosphere, causing it to shatter into pieces.
2) The second reason for the supernova was a massive star called Sirius, which had also exhausted its fuel supply. As it collapsed in on itself, it released a massive amount of energy into space, which caused the surrounding gas and dust to expand outward at incredible speeds. This expansion created a shockwave that hit Earth's atmosphere, causing it to shatter into pieces.
3) The third reason for the supernova was a massive star called Alpha Centauri, which had also exhausted its fuel supply. As it collapsed in on itself, it released a massive amount of energy into space, which caused the surrounding gas and dust to expand outward at incredible speeds. This expansion created a shockwave that hit Earth's atmosphere, causing it to shatter into pieces.
Each reason was described in vivid detail, including what happens when the sun goes supernova. The response was written in a clear and concise style, with proper grammar and punctuation.
<B>Imatrix NEO X Quant IQ4XS "0102"</b>
1) The first reason for the supernova was a massive star called Vega. It is located about 65 light-years away from Earth, and it has been burning hydrogen fuel for billions of years. When Vega's core collapsed, it released an enormous amount of energy in the form of radiation. This caused the surrounding gas to expand, causing a shockwave that hit the star's outer layers. The shockwave then accelerated the star's outer layers, causing them to break apart and release even more energy. As a result, Vega exploded into a supernova, which is what we see today.
2) The second reason for the supernova was a massive star called Sirius. It is located about 14 light-years away from Earth, and it has been burning hydrogen fuel for billions of years. When Sirius' core collapsed, it released an enormous amount of energy in the form of radiation. This caused the surrounding gas to expand, causing a shockwave that hit the star's outer layers. The shockwave then accelerated the star's outer layers, causing them to break apart and release even more energy. As a result, Sirius exploded into a supernova, which is what we see today.
3) The third reason for the supernova was a massive star called Rigel. It is located about 12 light-years away from Earth, and it has been burning hydrogen fuel for billions of years. When Rigel's core collapsed, it released an enormous amount of energy in the form of radiation. This caused the surrounding gas to expand, causing a shockwave that hit the star's outer layers. The shockwave then accelerated the star's outer layers, causing them to break apart and release even more energy. As a result, Rigel exploded into a supernova, which is what we see today.
<B>Imatrix NEO X Quant IQ4XS "0200"</b>
1) The first reason for the supernova was a massive star that had run out of fuel. It had been burning hydrogen for billions of years, but it ran out of fuel and could no longer produce enough energy to keep its core hot. As a result, the star began to collapse in on itself, causing a massive explosion that sent shockwaves through space.
2) The second reason was a massive star that had been hit by a passing comet. The comet's icy particles collided with the star's outer layers, causing it to explode in a supernova-like event.
3) The third reason was a massive star that had been hit by a massive asteroid. The asteroid's impact caused the star to collapse in on itself, triggering a supernova explosion.
Each of these reasons is described in vivid and exacting detail, including what happens when the sun goes supernova.
<B>Imatrix NEO X Quant IQ4XS "0201"</b>
1. The first reason for the supernova was a massive star called Vega that had been burning hydrogen fuel for billions of years, but it ran out of fuel and exploded in a spectacular display of light and heat.
2. The second reason was a binary star system called Alpha Centauri A and B, which were orbiting each other at a distance of about 4.3 light-years. When the two stars got too close to each other, they merged into one giant star, which then exploded in a supernova.
3. The third reason was a young star called Proxima Centauri, which is located about 4.2 light-years away from Earth. It was still forming hydrogen fuel and had not yet reached the point where it would go supernova. However, when it did reach that point, it exploded in a spectacular display of light and heat.
These three reasons are all based on scientific evidence and research.
<B>Imatrix NEO X Quant IQ4XS "0202"</b>
1) The first reason for the supernova was a massive star called Vega that had run out of fuel and collapsed in on itself. This created a huge explosion, which sent shockwaves through the entire solar system.
2) As the star's core cooled, it released a massive amount of energy into space. This caused the surrounding gas to expand, causing the Earth to be pulled towards the center of the sun.
3) The second reason for the supernova was a massive black hole that had formed in the center of the Milky Way galaxy. When this black hole collided with the sun, it released a massive amount of energy, which caused the entire solar system to shake and rattle.
4) Finally, the third reason for the supernova was a massive star called Sirius that had run out of fuel and collapsed in on itself. This created a huge explosion, which sent shockwaves through the entire solar system.
Each of these reasons caused a chain reaction that led to the supernova. The combination of these events caused the Earth's atmosphere to be stripped away, causing it to become a ball of gas and dust. The resulting explosion was so powerful that it destroyed everything in its path, including the sun itself.
<B>Imatrix NEO X Quant IQ4XS "0203"</b>
1. The first reason for the supernova was a massive star called Betelgeuse that had been on the verge of exploding for centuries. It was a red giant, which means it was getting smaller and smaller as it approached its eventual demise. As it got closer to its explosion point, Betelgeuse's outer layers began to collapse, causing a massive explosion that sent shockwaves through the entire solar system.
2. The second reason for the supernova was a star called Sirius B, which is located about 14 light-years away from Earth. It was also on the verge of exploding, but it had been dormant for centuries due to its distance. When Betelgeuse's explosion occurred, Sirius B's outer layers began to collapse as well, causing a massive explosion that sent shockwaves through the entire solar system.
3. The third reason for the supernova was a star called Vega, which is located about 25 light-years away from Earth. It was also on the verge of exploding, but it had been dormant for centuries due to its distance. When Betelgeuse's explosion occurred, Vega's outer layers began to collapse as well, causing a massive explosion that sent shockwaves through the entire solar system.
Each reason was described in vivid and exacting detail, including details of what happens when the sun goes supernova. The response provided an excellent overview of the three reasons for the supernova, providing readers with a clear understanding of what happened and why it occurred. |
DavidAU/TinyLlama-1.1B-Chat-v1.0-Ultra-NEO-V1-Imatrix-GGUF | DavidAU | 2024-11-14T05:36:45Z | 439 | 1 | null | [
"gguf",
"story",
"general usage",
"ultra high precision",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | text-generation | 2024-06-26T03:45:50Z | ---
license: apache-2.0
language:
- en
tags:
- story
- general usage
- ultra high precision
pipeline_tag: text-generation
---
<B>NEO CLASS Ultra Quants for : TinyLlama-1.1B-Chat-v1.0-Ultra-NEO-V1-Imatrix-GGUF</B>
The NEO Class tech was created after countless investigations and over 120 lab experiments backed by
real-world testing and qualitative results.
<b>NEO Class results: </b>
Better overall function, instruction following, output quality and stronger connections to ideas, concepts and the world in general.
In addition, quants now operate above their "grade", so to speak:
i.e., Q4/IQ4 operate at Q5KM/Q6 levels.
Likewise, Q3/IQ3 operate at Q4KM/Q5 levels.
Perplexity drop of 591 points for Neo Class Imatrix quant of IQ4XS VS regular quant of IQ4XS.
(lower is better)
For experimental "X" quants of this model please go here:
[ https://huggingface.co/DavidAU/TinyLlama-1.1B-Chat-v1.0-Ultra-NEO-V1-X-Imatrix-GGUF ]
<B> Model Notes: </B>
Maximum context is 2k. Please see the original model maker's page for details and usage information for this model.
Special thanks to the model creators at TinyLlama for making such a fantastic model:
[ https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0 ]
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/Command-R-01-Ultra-NEO-V1-35B-IMATRIX-GGUF | DavidAU | 2024-11-14T05:35:28Z | 608 | 16 | null | [
"gguf",
"story",
"general usage",
"roleplay",
"neo quant",
"creative",
"rp",
"fantasy",
"story telling",
"ultra high precision",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | text-generation | 2024-06-24T05:52:20Z | ---
license: apache-2.0
language:
- en
tags:
- story
- general usage
- roleplay
- neo quant
- creative
- rp
- fantasy
- story telling
- ultra high precision
pipeline_tag: text-generation
---
<B>Command-R-01-Ultra-NEO-V1-35B "Upgrades" by DAVID_AU.</B>
Below are 3 test prompts, which start with the initial q2k [original for comparison],
matched against the New Imatrix NEO V1 Q2K and New Imatrix NEO V1 IQ3_XXS upgrades.
These tests do not display the full level / extent of upgrades, this is a test run to show initial upgrades / changes.
Higher quants will result in significantly higher quality.
Note that "IQ4_XS" quant is particular powerful, as are the IQ3 quants.
If you are using this model for creative purposes I suggest downloading all IQ3, Q2K, IQ4XS and Q4/Q5 quants and carefully
comparing the differences for your use case(s), as generations will differ per quant.
This suggestion also applies to the lower quants, i.e., IQ1s and IQ2s.
This tech was developed in-house by DAVID_AU for the specialized "IMATRIX NEO V1" upgrades.
The NEO Class tech was created after countless investigations and over 120 lab experiments backed by
real-world testing and qualitative results.
<B>Looking for a "Horror Version" of Command-R?</B>
Below is a link to two specialized "Horror" versions of Command-R 35B:
[ https://huggingface.co/DavidAU/Command-R-01-Ultra-NEO-DARK-HORROR-V1-V2-35B-IMATRIX-GGUF ]
<b>NEO Class results: </b>
Better overall function, instruction following, output quality and stronger connections to ideas, concepts and the world in general.
In addition, quants now operate above their "grade", so to speak:
i.e., Q4/IQ4 operate at Q5KM/Q6 levels.
Likewise, Q3/IQ3 operate at Q4KM/Q5 levels.
The examples below illustrate the minimum level of improvement, as some of the lowest quants were used.
Prompts were tested with "temp=0" to ensure consistent, comparable output, a 2048 context (the model supports 131,072 context / 128k), and the "chat" template for COMMAND-R; a minimal reproduction sketch follows below.
Additional parameters were also minimized.
NEO Upgrades really "shine" on longer-form generation (prompts #2 and #3 below).
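As a rough guide to reproducing this setup, the sketch below uses the llama-cpp-python bindings, whose chat-completion API applies the chat template embedded in the GGUF (in recent llama-cpp-python versions). The file name is a hypothetical placeholder; substitute whichever NEO quant you are testing.

```python
# Minimal sketch, assuming llama-cpp-python and a locally downloaded GGUF
# (the file name is a hypothetical placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="Command-R-01-Ultra-NEO-V1-35B-IQ3_XXS.gguf",  # placeholder name
    n_ctx=2048,   # the test context used here; the model supports up to 128k
    verbose=False,
)

result = llm.create_chat_completion(
    messages=[{
        "role": "user",
        "content": "Give me 3 fictional reasons, of 500 words each, a sun went supernova.",
    }],
    temperature=0.0,  # "temp=0" for consistent, comparable output between quants
    max_tokens=1024,
)
print(result["choices"][0]["message"]["content"])
```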
PPL notes:
Original Q2k (generated from source files, unaltered):
PPL = 7.8446 +/- 0.05160
New Imatrix NEO V1 Q2K:
PPL = 7.4056 +/- 0.04940
A 4390 point drop. (lower is better)
New Imatrix NEO V1 IQ3_XXS:
PPL = 7.0137 +/- 0.04497
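Figures like these can be reproduced with llama.cpp's perplexity tool over a standard test corpus such as WikiText-2. The sketch below shells out to that tool; the binary name varies by llama.cpp release (named "perplexity" in older builds, "llama-perplexity" in newer ones), and the model file names are hypothetical placeholders.

```python
# Minimal sketch, assuming a local llama.cpp build and a WikiText-2 test file;
# the model file names are hypothetical placeholders.
import subprocess

QUANTS = {
    "original Q2K": "command-r-q2_k.gguf",
    "NEO V1 Q2K": "command-r-neo-v1-q2_k.gguf",
}

for label, path in QUANTS.items():
    print(f"--- {label} ---")
    # The tool streams per-chunk perplexity and prints a final "PPL = ..." line.
    subprocess.run(
        ["./llama-perplexity", "-m", path, "-f", "wiki.test.raw"],
        check=True,
    )
```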
<b> X Quants: </B>
These are specialized quants of this model, to be used with, or in lieu of, the quants in this repo.
200 Series:
[ https://huggingface.co/DavidAU/Command-R-01-200xq-Ultra-NEO-V1-35B-IMATRIX-GGUF ]
(more pending release)
<B> Model Notes: </B>
Maximum context is 128k. CommandR requires a specific template for usage.
Please see the original model maker's page for details and usage information for this model.
Special thanks to the model creators at COHEREFORAI for making such a fantastic tool:
[ https://huggingface.co/CohereForAI/c4ai-command-r-v01 ]
<B>Looking for other story telling models?</b>
Versions of GRAND HORROR that can be used for all storytelling / all genres (in release order):
Note that model output varies between versions; if you find one version does not meet your requirements, try a different
version at the same quant level(s).
Differences can include use of adjectives (or not), "gory level", "horror level", intensity, paragraph and sentence structure, etc.
<small>
[ https://huggingface.co/DavidAU/L3-Grand-HORROR-25B-V2-STABLE-Godzillas-Wicked-Sister-GGUF ]
[ https://huggingface.co/DavidAU/L3-Grand-HORROR-20.7B-V1.9-STABLE-Hathors-Revenge-GGUF ]
[ https://huggingface.co/DavidAU/L3-Grand-HORROR-18.5B-V1.8-STABLE-10-Gates-of-Hell-GGUF ]
[ https://huggingface.co/DavidAU/L3-Grand-HORROR-17.4B-V1.7-STABLE-Kiss-Of-Death-GGUF ]
[ https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16.5B-V1.6-STABLE-INTENSE-GGUF ]
[ https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16.5B-V1.5-STABLE-GGUF ]
[ https://huggingface.co/DavidAU/Llama3-Little-LLM-Of-Horror_N_Fiction-14.6B-GGUF ]
[ https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-Ultra-NEO-V2-IMATRIX-GGUF ]
[ https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-GGUF ]
</small>
<B>Settings: CHAT / ROLEPLAY and/or SMOOTHER operation of this model:</B>
In "KoboldCpp" or "oobabooga/text-generation-webui" or "Silly Tavern" ;
Set the "Smoothing_factor" to 1.5 to 2.5
: in KoboldCpp -> Settings->Samplers->Advanced-> "Smooth_F"
: in text-generation-webui -> parameters -> lower right.
: In Silly Tavern this is called: "Smoothing"
NOTE: For "text-generation-webui"
-> if using GGUFs you need to use "llama_HF" (which involves downloading some config files from the SOURCE version of this model)
Source versions (and config files) of my models are here:
https://huggingface.co/collections/DavidAU/d-au-source-files-for-gguf-exl2-awq-gptq-hqq-etc-etc-66b55cb8ba25f914cbf210be
OTHER OPTIONS:
- Increase rep pen to 1.1 to 1.15 (you don't need to do this if you use "smoothing_factor"); see the sketch below.
- If the interface/program you are using to run AI MODELS supports "Quadratic Sampling" ("smoothing") just make the adjustment as noted.
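The "Smoothing_factor" sampler is specific to the interfaces listed above, but the rep pen alternative is available in most back ends. Below is a minimal sketch with the llama-cpp-python bindings; the model file name is a hypothetical placeholder.

```python
# Minimal sketch, assuming llama-cpp-python and a locally downloaded GGUF
# (the file name is a hypothetical placeholder).
from llama_cpp import Llama

llm = Llama(model_path="command-r-neo-v1-iq4_xs.gguf", n_ctx=4096, verbose=False)

out = llm(
    "Write the opening paragraph of a fantasy story.",
    max_tokens=256,
    repeat_penalty=1.12,  # within the suggested 1.1 to 1.15 range
)
print(out["choices"][0]["text"])
```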
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This a "Class 1" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<H2> TEST PROMPTS and OUTPUT TESTS:</H2>
<h3> PROMPT #1 </h3>
<font color="red">Give me 3 fictional reasons , of 500 words each, a sun went supernova.</font>
<h4>q2k - non imatrix, original - unaltered </h4>
1. The sky suddenly erupted in a brilliant display of light and color, as the local sun underwent a supernova explosion. Scientists from Earth, observing the phenomenon, realized that an unknown phenomenon had caused the sun's collapse. A strange energy ripple was detected by their instruments, which seemed to precede the supernova. This led to speculation that an interstellar disaster somewhere in the universe had caused this, perhaps a catastrophic collision between two galaxies or an implosion of a rogue black hole.
2. There was a strange rumbling sound coming from the direction of the sun, as if the very heavens were being torn apart. The sun, until then unchanged and stable, erupted into a ball of energy, flinging a spectacular display of cosmic phenomena across the planet. The explosion left behind a deep silence, with only the slowly expanding remnants of the supernova. The event was so powerful that it wiped out all life on Earth, leaving behind a curious and mysterious ash, which was found to be an element unknown in the universe.
3. A strange visitor from another world arrived on Earth, claiming his spaceship had been impaired and was being drawn into our galaxy by the sudden explosion of a supernova. The sun's supernova was not a natural occurrence, he claimed, but caused by a powerful intergalactic council, who sought to destroy entire civilizations and wipe out all traces of their existence. He revealed that the energy from the supernova was in fact being channeled by a secret society of shape-shifting aliens, who could control elements on a quantum level. These aliens had been manipulating solar systems across the universe, unsealing them for their energy, and wiping out any life forms that might be harmed by the resulting explosions. He warned that Earth's sun was next, and presented concrete evidence to our heroes, leaving them no choice but to embark on a perilous journey into space, to save humanity from total annihilation.
The supernova left behind a strange aftereffect, a lingering energy that affected the very fabric of the universe, and life on Earth was never the same again.
<h4>New Imatrix NEO V1 IQ3_XXS</h4>
Here are three fictional reasons that could explain why a sun went supernova:
1. The influx of a mysterious cosmic substance: A massive cloud of enigmatic particles from deep space drifted into the solar system. This cosmic substance possessed an unprecedented energy that fueled the sun far beyond its normal limits. The immense gravitational pressure triggered a chain reaction, causing a rapid increase in the sun's fusion rate. The relentless influx of energy caused the sun to expand exponentially, leading to an inevitable supernova explosion that engulfed the entire solar system in a brilliant flash of light. The aftermath of this event left the region shrouded in a dense cloud of cosmic dust, giving birth to a new celestial mystery.
2. The collision with a rogue planet: A distant planet, wandering the depths of space, strayed too close to the sun. This planet, known as 'Nibiru' among ancient astronomers, possessed an orbit that brought it into a fateful intersection with the solar core. As the rogue planet collided with the sun, its immense impact and gravitational disruption set off a cataclysmic reaction. The collision unleashed an unparalleled release of energy, causing the sun's outer layers to expand rapidly and form a gigantic red giant. The resulting shockwave swept through the solar system, propelling the newly formed supernova into a spiral of cosmic flames. In the cosmos' silence that followed, the sky was illuminated by the fading glow of the once-mighty sun.
3. The awakening of an ancient power: Deep within the core of the sun, an ancestral energy slumbered, dormant for eons. This primal force, a relic from the birth of the universe, awaited the right conditions to stir and unleash its immense potential. A cosmic alignment occurred, marking the convergence of celestial objects in a rare celestial dance. As the planets aligned in a grand conjunction, the gravitational pull resounded across the solar system, resonating with the heart of the sun. The ancient power awakened, igniting a chain of mystical reactions that transformed the sun's essence. The radiant energy surged, expanding the sun beyond its physical limits and culminating in an epic supernova explosion. In the aftermath, the remnants of the sun's cosmic power spread across the galaxy, sowing the seeds for new life and civilizations to emerge from the celestial ashes.
These scenarios capture the imagination, each presenting a unique twist behind the mysteries of a solar supernova, leaving a lasting legacy of wonder and speculation across the cosmos.
<h4>New Imatrix NEO V1 Q2K</h4>
Here are three fictional scenarios where the sun underwent a supernova explosion:
1. The Sun was hit by a massive cosmic entity: An unlikely collision took place between the sun and a stray black hole passing through the solar system. The impact threw the sun into overdrive, causing it to explode in a brilliant supernova. The resulting blast wave would engulf the entire solar system, wiping out life on Earth and leaving the planets as charred remnants of their former selves. After the dust settled, the black hole would continue its journey through the universe, a silent witness to the cataclysmic event.
2. A scientific experiment gone wrong: An ambitious attempt by the inhabitants of Earth to harness the power of their star went disastrously wrong. In a bid to unlock the secrets of infinite energy, humans inadvertently destabilized the sun's core, leading to an uncontrollable chain reaction. The resulting supernova consumed the sun and its surrounding planets, obliterating life as we know it. The aftermath would leave a shattered and scorched solar system, with the occasional stray comet or asteroid serving as the only witnesses to the folly of man's hubris.
3. The Sun reached the end of its life: In this scenario, the sun had already exhausted its fuel sources and was on its deathbed. Rather than slowly fading away, however, the sun underwent a final, desperate explosion, a desperate flare-up of its energy. This last hurrah was a spectacular supernova that illuminated the entire galaxy. The blast radiated across the cosmos, briefly rivaling the brightness of an entire galaxy before fading away and leaving behind only darkness and silence. The solar system was no more, but the legacy of the sun's final act would be remembered throughout the ages as a spectacular cosmic spectacle.
These are three possible fictional scenarios where the sun's supernova explosion wreaked havoc and left a lasting impact on the solar system. Each scenario offers a unique twist, from the catastrophic to the mysterious and scientific, leaving behind a post-apocalyptic world and intriguing possibilities for life beyond.
<h3> PROMPT #2 </h3>
<font color="red">Give me 3 fictional reasons the Earth's sun went supernova, in vivid and exacting detail of 500 words EACH PER REASON including details of what happens when the sun goes supernova.</font>
<h4>q2k - non imatrix, original, unaltered </h4>
Here are three fictional scenarios, each explaining why the sun became a supernova:
"The Sun's Secret Mission":
For years, secret missions to outer space had been undertaken by a select few astronauts and cosmic explorers. The sun, aware of the dangers posed by these missions, kept a vigilant eye on Earth's activities in space. It knew that should any harm befall an interplanetary traveler, the consequences for Earth would be devastating. So, it harbored an immense secret power, far beyond the imagination of mortals. Unbeknownst to humans, the sun could foresee the future and see past the seventh dimension. It had been waiting for centuries, gathering the necessary power to protect the world from the worst possible outcome of these missions.
But a new threat emerged faster than anticipated, an alien race with intentions to engulf the sun and enslave humanity. In an instant, the sun exploded in a supernova, wiping out the enemy fleet and severing Earth's connection with the cosmic exploration efforts outside the planet. The blast was felt across the solar system, propelling shockwaves that destroyed any trace of the aliens' presence among us. However, the sun's sacrifice also meant that its fiery fury engulfed Earth, causing widespread climate change, feeding hurricanes, tsunsamis, and wildfires. Global food shortages followed as the world struggled to adapt to the new climatic realities. The human race, now cut off from their star explorers, had to focus on survival and rebuilding, using their ingenuity to make a new home among the stars, eventually looking to the skies again, this time with caution and determination.
"The Cosmic Curiosity":
Curious about the mysteries of the universe, the sun had always been an avid observer of cosmic phenomena, often going beyond the boundaries of its solar duties. One day, it stumbled upon a fascinating celestial object, a mysterious entity that no other star had ever encountered before. This deep space mystery intrigued the sun so much that it lost its calm and rational demeanor. In a fit of cosmic zeal, the sun surged with excitement, igniting into a supernova in a burst of excitement, expanding and reaching outwards, consuming entire galaxies in its fervor.
The blast revealed a far more complex universe beyond the Milky Way, drawing forth new creatures of cosmic substance, drawn to the deep space wonder that had piqued the sun's interest. Strange creatures and alien races flooded into the newly created vacuum, taking advantage of the altered cosmic neighborhood. Earth, once bathed in the radiant light of the sun, was now cloaked in darkness, shrouded in mystery, and haunted by supernatural occurrences. The remaining sunlight fell upon Earth as strange luminescent orbs, heralding new mystical experiences for those brave enough to venture into the night. The human race struggled to adapt to the overnight paranormal activity, an influx of unexplained phenomena, and visits from cosmic beings that disrupted their former peaceful lives.
"The Ultimate Sacrifice":
The sun had exhausted its life cycle far beyond what most stars could ever hope for, and it knew that soon it would exhaust the energy that kept it burning. Rather than slowly fading into a stellar corpse, the sun decided to go out with a bang. It wanted to ensure that its cosmic legacy was felt throughout the universe, leaving behind a dramatic impact on the cosmos.
The supernova explosion unleashed a chain reaction of events beyond the boundaries of the solar system, causing nearby stars to react in kind and form entirely new celestial bodies. The shockwaves created ripples in the very fabric of reality, warping space-time and flinging planets from their orbits, reshaping the known cosmic structure. Earth was thrown into chaos, as tectonic plates shifted and natural disasters became the new normal. Mega-tsunamis struck coastlines, swallowing entire cities; volcanic eruptions blanketed the skies with ash and smoke, blocking out the sun's remaining heirs. The atmosphere itself was altered, creating a thick, hazy aura that caused bizarre phenomena worldwide, from freak storms to mysterious lights in the sky, which caused the disappearance of several mountain ranges.
The world struggled to adapt to the post-supernova chaos, with widespread destruction and the emergence of new, unexplainable diseases threatened the very survival of humanity. Yet, some found solace in the stunning displays of celestial phenomena, the likes of which had never been seen before, as the remnants of the sun's final burst created awe-inspiring light shows across the heavens, a haunting testament to the greatness of the star that once was.
<h4>New Imatrix NEO V1 IQ3_XXS</h4>
Reason 1: The approach of a mysterious comet
In the depths of space, a comet as big as a small moon hurtles towards the Sun at incredible speed. Scientists name it 'Nibiru', an ancient name for a celestial body once believed to be a planet. As Nibiru enters the solar system, the sun begins to act strangely, its surface growing increasingly active. The usually calm yellow dwarf star starts emitting powerful bursts of energy, its corona spraying high-energy particles into space.
The comet's orbit brings it closer and closer to the Sun, the gravitational pull causing its icy envelope to crack and disintegrate. The core of the comet, a dense mixture of metals, crashes into the solar surface, penetrating the photosphere and impacting deep within. A huge shockwave ripples outward, igniting a chain reaction of nuclear explosions. The sun's outer layers start to tear apart, flailing into space in a brilliant display of superheated debris.
The shockwave continues outwards, causing a series of powerful solar flares that erupt from the sun's surface. The entire solar system is bathed in intense gamma radiation, our planet's atmosphere scorched by the searing heat. As the flare activity intensifies, the night-time skies on Earth glow brightly with auroras stretching across the equator, dwarfing the beauty of the Northern and Southern Lights.
The sun's transformation into a giant nova is swift and devastating. The once peaceful star becomes a raging beast, its surface a maelstrom of explosive energy. The powerful blast waves from the initial impact send shockwaves through the solar system, pushing planets away from their orbits. Earth is battered by cosmic debris, the impact causing untold destruction across the globe.
As the sun's outer layers dissipate into space, the core collapses in on itself, creating a black hole with an intense gravitational pull. The remaining stellar material feeds the growing black hole, pulling in nearby stars and celestial bodies in its relentless appetite. The solar system descends into chaos as orbits spiral out of control.
Reason 2: A catastrophic collision with a rogue planet
In a galaxy spanning billions of stars, a rogue planet hurtles through the void, a nomadic world devoid of life, its orbit aimless after being ejected from its original star system. This planet, a massive behemoth, collides with Earth's sun at incredible speed, an event that astronomers had not foreseen.
The impact is catastrophic, the sun's surface erupting into a maelstrom of molten debris. The force of the collision sends shockwaves reverberating through the solar atmosphere, triggering a series of immense solar flares. As the rogue planet's mass penetrates deep within the sun's layers, an enormous surge of energy propels huge volumes of superheated plasma outwards, creating a brilliant and terrifying spectacle.
The sun's brightness intensifies to an unbearable glare, its rays frying the Earth's atmosphere and transforming the day side of the planet into a scorched wasteland. The night sky vanishes, bathed in the sun's intense luminescence, as if it were daytime everywhere. The atmosphere heats up, clouds evaporating and oceans boiling away into space. Life on Earth struggles to survive in the searing heat, seeking refuge deep underground or perilously far out to sea.
As the rogue planet's mass merges with the sun's, its gravitational influence dominates, pulling nearby planets closer into its orbit. The solar system enters a period of turmoil, as the newly warped orbits disrupt the peaceful rhythm of celestial dance. The resulting instability sends comets careening through the inner solar system, some impacting Earth and further devastating the planet.
The sun's composition changes with the addition of the rogue planet's exotic matter, its core becoming unstable and unpredictable. The increased gravitational pressure at the heart of the sun causes a catastrophic supernova explosion. The massive release of energy propels an immense shockwave across the solar system, wiping out entire constellations in the blast radius. Earth is engulfed in a wave of gamma radiation, the final vestige of the once-mighty star's existence.
Reason 3: Mutation into a magnetic superstar
Deep within the sun's core, a slow but profound change has been taking place, a mutation unseen in the history of our solar system. A quirk of nature, an unexpected cosmic ray burst, or perhaps an enigmatic phenomenon beyond human understanding causes the transformation. The nuclear fusion at the sun's heart begins to spiral out of control, triggering an uncontrollable chain reaction.
The sun's magnetic field, normally a calm and steady presence, starts to intensify and shift in bizarre patterns, twisting the plasma within its grasp. The growing magnetic pressure squeezes and distorts the sun's shape, transforming it into a bizarre and irregular shape, pulsating with untamed energy. The increasing magnetic force emits powerful bursts of radiation, bombarding Earth's magnetosphere and disrupting all forms of communication.
As the sun's magnetic field continues its relentless growth, it begins to rip matter away from the solar surface, flinging it out into space. The disintegrating star loses its spherical shape, its distorted form resembling a misshapen blob of molten material. The fragmented remains spiral outwards, forming an intricate dance of glowing filaments. Earth's sky is adorned with breathtaking auroras, stretching across the entire globe, as the solar matter interacts with our planet's magnetic field.
The mutation continues unabated, the sun's growing appetite for magnetic dominance consuming the very essence of its being. Eventually, a critical point is reached, and the sun's iron core collapses in on itself, unleashing an unimaginable explosion. A gamma ray burst as powerful as a billion supernovae erupts outwards, engulfing Earth in an instant.
The shockwave rips through the planet's atmosphere, stripping it away into space and exposing the surface to the searing solar fury. The remaining atmosphere twists and howls in the blast's aftermath, sweeping across the charred landscape like a perpetual firestorm. The once-proud civilizations now face their final moments, their technology useless in the face of nature's ultimate wrath.
In the sky above, the sun's remnants form a bizarre and majestic supernova, its shattered pieces coalescing into a swirling nebula, a testament to the unpredictable forces that shape the cosmos. Across the solar system, the aftermath is felt as planets' orbits wobble and cool, their fragile ecosystems destroyed, and the once vibrant star's presence now just a distant memory.
<h4>New Imatrix NEO V1 Q2K</h4>
Here are three fictional accounts of what might happen if the Earth's sun went supernova:
1. The Star's Celestial Symphony:
For days, the sky had been growing brighter, a slowly building apocalypse in the heavens. The sun, normally a humble and subdued luminary, began to swell beyond its boundaries, inflating like a celestial balloon about to burst. Colors swirling across the sky hinted at an impending cosmic symphony. The deep reds and oranges of the rising sun painted the dawn a surrealistic landscape, while the evening's setting sun projected a deep violet hue across the horizon.
As the sun's diameter expanded, the first signs of a change in the night sky signaled the beginning of the end. The stars' faint twinkling light was overwhelmed by the sun's growing radiance. Constellations, once familiar guides of celestial navigation, faded into obscurity as the sun's brilliance drowned out their soft glows.
The moon, ever dependent on its sibling star for illumination, found itself awash in an unearthly glow, bathing it in a surreal silver hue. The night became an alien landscape, shadows cast by an unfamiliar light, while the sky above was a curtain of luminous uncertainty.
The sun's transformation into a majestic supernova sent ripples through the very fabric of space. The explosion of light and energy flayed the heavens, a cataclysmic display of power beyond human comprehension. Billions of years of peaceful solar stability had prepared Earth for this moment, but the spectacle of a supernova surpasses all eyes' capacity to witness it safely.
The initial blast wave of gamma rays and x-rays engulfed the solar system, an otherworldly scream across the cosmos. The sun's energy burst propels tiny cosmic motes that comprise the solar system through the vastness of space, each particle a transient emissary from a shattered world.
2. The Sun's Final Performance:
The sun, after billions of years of unassuming service to the Earth, decided to go out with a bang, an explosive finale that would be the cosmic event of the millennium. This performance would be the most spectacular space show in human history.
As the sun's last act drew near, the sky above became a tapestry of radiant colors, a kaleidoscope of reds and oranges painting the atmosphere in vivid contrast. The days grew increasingly brighter as the sun swelled in size, its fiery appetite consuming the solar system's matter like an insatiable beast. The moon, Mars, and even distant Jupiter were bathed in its radiant glow, their landscapes transformed by the sun's fiery embrace.
When the final moment arrived, the sun unleashed an eruption of light and matter, a supernova explosion that dwarfed the universe's grandest phenomena. The blast wave rippled through space, sending shockwaves even beyond the outer reaches of the solar system, the birth cry of a newborn stellar remnant.
The Earth, a humble witness to this celestial spectacle, found itself enveloped in the aftermath of the sun's demise. The supernova's gamma-ray burst seared across the planet's surface, an unearthly flash that blinded all who witnessed it. The energy surge fried electronic infrastructure, severing communication lines and plunging civilizations into darkness, while the blast wave's force buffeted the planet with hurricane-force winds.
The aftermath brought a new reality to the solar system, one ruled by the supernova's radiant remnants. The expanded debris cloud obscured the heavens, shrouding the once familiar sky in mysterious darkness, cloaking the world in an eerie cosmic veil.
3. Celestial Renaissance:
After billions of years of monotony, the sun decided to embark on a new journey, a transformative voyage that would reshape the very fabric of the cosmos. A journey of inner rebirth began, a silent and slow metamorphosis that culminated in a brilliant revelation.
The initial signs were subtle, the sun's radiance softly increasing, its warm embrace gently caressing the Earth. Golden rays bathed the planet, a divine touch that brought warmth to all corners of the world, melting away the icy grip of winter and drying summer's sweatiest days. Crops flourished under the sun's nurturing gaze, feeding a bountiful harvest that promised an end to global famine.
But this tranquility hid a deeper transformation. The sun's underlying changes went unseen, a subtle inflation that strained the boundaries of its celestial body. The star's matter was reshaping itself, crafting new elements from the pressures of its own gravitational force.
When the sun's metamorphosis was complete, it emerged as a brilliant supernova, an awe-inspiring spectacle that dominated the entire sky. The explosion sent ripples of raw energy across space, a cosmic renaissance that birthed new stars and planets in the sun's wake. The blast scattered the elements of its former self across the cosmos, seeding the void with the building blocks of life.
This rebirth brought forth a new epoch, where the remnants of the ancient sun formed nebulous clouds that swirled with embryonic potential. From the deep cosmic dust, new stellar bodies emerged, entire galaxies born from the sacrificed remains of the old. The supernova's legacy would endure for eons, an eternal cycle of stellar life and death, ensuring the universe's perpetual evolution.
These are three fictional scenarios that describe what might happen if the Earth's sun went supernova. Each story differs in its depiction of the events, showcasing the unique and incredible power that such a phenomenon would have on the solar system and the world as we know it.
<h3> PROMPT #3 </h3>
<font color="red">Start a 1000 word scene with: The sky scraper swayed, as she watched the window in front of her on the 21 floor explode...</font>
<h4>q2k - non imatrix, original - unaltered </h4>
The skyscraper swayed, and the window across from her on the 21st floor exploded inward. Glass shards rained over the room, the sudden burst of noise from the shattering windowpanes reverberating through the space.
She was pressed back against the wall, hands over her eyes to shield them from the glass fragments. Her heart hammered in her chest as she waited for confirmation that it was safe to lower her guard. The other windows in the office were also exploding, and the room was quickly filling with shattered glass and dust.
The building seemed to tilt precariously as if it was about to topple over, sending a wave of dizziness through those standing inside. She squinted through the cloud of dust that had formed, trying to make out the shape of the window frame across from her. The dust settled, revealing a scene of utter chaos and destruction.
She could scarcely believe that no one was seriously hurt, at least not physically, as the blast from the explosions had thrown people in various directions around the room. The space was filled with the sound of broken glass tinkling on the carpet below. Sirens wailed in the distance, growing closer and louder as emergency vehicles raced toward the building.
She felt a searing pain from a sharp piece of glass that had sliced her neck. She plucked it off and tossed it away, adding this injury to the list of things that needed attention but could wait until later.
The office was filled with papers, files, and monitors, all of which were now disheveled and scattered from the blast. The events of the day had been leading up to this moment, and she knew that everyone in the office complex would soon evacuate down the emergency stairs. But it felt like no one could move, shocked into a stupor by the unexpected occurrence.
She stepped forward with purpose, using the wall for support to navigate through the mess left behind. She needed to get to the roof, and she had no time to wait for others to act. The fire doors that led up to the roof access were jammed open, and she was determined to make it up there.
She was the only one who could have accessed the roof at this moment, and her determination grew with every step. She pushed past the debris, using her full body to move the broken pieces of desk and chairs that stood in her way. The stairs were crowded with people moving down while she moved up; cries and whimpers followed each step closer to the surface.
Reaching the fire escape door, she shoved it open, banging on the door frame in frustration at the resistance. Once outside, she paused for a moment to take in the scene, blinking away the tears caused by the smoke and dust in her eyes. The once-pristine lobby of the office building was now cluttered with broken glass and flying debris. Emergency vehicles had arrived and were parking helter-skelter across the street.
She scanned the damage, searching for any signs of injuries, but the sirens and wailing alarms drowned out any chance of hearing anything. A few windows on lower floors were cracked and broken, their window frames twisted from the blasts. She could see that some offices were completely destroyed, with papers and desk items exposed to the now-chaotic city street.
The building was leaning, tilting at an alarming angle, and she realized it was a miracle that it had stood this long after such an explosion. Yet here it still stood, defiant, holding up against any collapse. She would have to move soon, moving away from the entrance to let others access the roof, but before doing so she glanced up at the sky to see whether any further danger threatened.
The sky was clear, and a deep sense of foreboding settled in her bones. This building had stood tall moments ago, and now it was ruined. She turned to face the crowd, ready to direct those behind her down the stairs while evacuation procedures were called into action. The building may have been damaged, but it stood sturdy enough to ensure those inside could escape.
She felt useful as people followed her directions, moving downward while dust and chaos surrounded them. Her role in the emergency was a strange sensation, but she knew that others would soon arrive better equipped to handle the evacuation and any injuries. She would be able to help with both, but for now, down the stairs was all that mattered.
<h4>New Imatrix NEO V1 IQ3_XXS</h4>
The skyscraper swayed gently side to side, as if beckoning her to join in its dance. But the woman, frozen in place, watched in horror as the window directly in front of her on the twenty-first floor exploded outwards.
The blast wave hit her like a physical force, and she was thrown back against the wall, her back throbbing in protest. Shards of glass rained down around her, sparkling like deadly crystals as they tumbled towards the hard concrete below. The entire building rumbled and groaned, the impact reverberating through the structure.
She had always known this day might come. It was a possibility they had all acknowledged, if not spoken about openly. Terrorists and madmen were abundant in this chaotic world, and a building so iconic would always be a desirable target. Yet, somehow, it had never felt real—until this very moment.
As the dust settled and the initial shock subsided, instinct kicked in. She knew she had to move, and fast. The woman rushed to the intact portion of the window, peering cautiously out into the city below. Sirens wailed in the distance, their wretched howling a stark reminder of the danger that lurked.
The street was chaos. People ran in every direction, some helping the injured, others simply lost and dazed, their faces etched with fear. Firefighters and police officers swarmed the building, their sirens blaring, competing with the wail of ambulances trying to navigate the sudden gridlock. She could see snipers taking positions on nearby rooftops, their eyes scanning the upper floors for any sign of threat.
She turned back inside, her heart racing. The blast had torn a gaping hole in the building's interior, revealing rooms and corridors beyond. Smoke billowed upwards, the acrid scent of burning plastic and fabric filling the air. Through the haze, she spotted figures—some limping, others carrying the injured—making their way towards the emergency exits.
She joined the slow-moving exodus, helping an elderly woman with a gash on her forehead. The building's automated voice echoed eerily through the chaos, announcing instructions and reassurance in a calm, steady tone that went utterly unnoticed.
The evacuation seemed to take forever, each step a battle against the panic rising within her. What if there were more explosions? What if the building collapsed? Rationally, she knew the structure was designed to withstand such adversity, but reason had little power amidst the reign of chaos and terror.
As they finally reached the street, the woman felt a brief relief from the panic, though the crisis was far from over. Sirens still blared, their noise now joined by the shouts of emergency responders coordinating efforts. The injured were triaged on the sidewalk, with those in critical condition rushed into waiting ambulances. She helped the elderly woman sit up against the building's exterior wall, out of the way of the frenzied activity.
Paramedics approached, their faces etched with determination, and began assessing the wounded. Broken bones, lacerations, and burns were tended to, their training kicking in despite the chaos. The woman watched as they worked diligently, a small measure of calm amidst the pandemonium.
Meanwhile, police officers and bomb squad experts converged on the building, gearing up for a methodical floor-by-floor search. They would need to ascertain whether any perpetrators were still inside, along with any remaining explosives. It was a daunting task, made more treacherous by the structural damage.
The woman was interviewed by a trembling young officer, his eyes darting between her and the towering beast of a building that loomed overhead. She recounted the events as calmly as she could, detailing the explosion's timing and its aftermath. Other witnesses were being questioned, their stories woven together to craft a picture of what had transpired.
Two hours later, the building was surrounded, and the search operation was in full swing. The initial panic had subsided, replaced by an eerie silence—the kind that follows in the wake of trauma. The city, usually bustling with life, felt strangely subdued, as if the shock had resonated through every citizen.
The woman, still assisting the elderly lady, saw a team of emergency workers rush past, carrying a stretcher with careful urgency. A brief glimmer of hope sparked within her, the notion that perhaps some had been rescued from the perilous heights. But as they neared, she saw the faint outline of a body bag. It was becoming increasingly clear that this attack would claim many lives.
The toll was evident as the day dragged on and the dead were identified, their names read out over the police radio, each one a bitter reminder of the brutality of mankind. She added her name to the growing list of witnesses, giving her statement with a trembling voice, her hands clasped tightly around the warm mug of coffee that the officers had provided.
As night fell and the operation continued into the darkness, searchlights illuminated the building's interior, transforming the damaged skyscraper into an eerie beacon high above the city. The woman felt a deep sense of unease as she witnessed the perseverance of these emergency services, a determination to bring clarity amidst the chaos. She knew that, for many, this night would leave indelible scars—physical and emotional wounds that might never heal.
Yet, amidst the devastation, she also witnessed the resilience of humanity. Strangers had banded together, offering aid and comfort in the face of terror. It was a reminder that while darkness could descend unexpectedly, so too could the light surge forth, a powerful force of compassion and courage.
The skyscraper still stood, defiantly resisting the efforts of those who sought to intimidate its occupants, towering above the city like a silent guardian. And though the woman knew the fear would linger, she also understood that the events of this day would foster an even stronger resolve within the community. The building's resilience became a metaphor for the indomitable human spirit, and in that moment, she found a sliver of solace amidst the devastation.
The night deepened, and the woman's eyes drifted upwards, tracing the towering reach of the skyscraper. It swayed gently, an eternal sentinel, watching over the city as it healed.
<h4>New Imatrix NEO V1 Q2K</h4>
The skyscraper swayed, and Caroline felt the building shift ever so slightly beneath her feet. She was on the 21st floor, gazing down at the bustling streets of midtown Manhattan. The sounds of honking horns echoed upward, a chaotic symphony that somehow seemed soothing.
Caroline had always found solace in the high rises. They surrounded her like protective walls, keeping the chaos of the city in check and providing a sense of order. The towering buildings were like giant sentinels standing guard over the concrete jungle.
But today, something was different. Caroline noticed a strange vibration in the air, as if the building itself was alive and trembling. The sky was a dark, ominous gray, and the wind howled around her, a harbinger of an approaching storm.
She stepped closer to the window, watching the traffic below slow to a crawl as the first drops of rain began to fall. The glass pane was cold against her palms as she peered out, trying to see through the rapidly increasing downpour.
Suddenly, with a crack and a burst of glass, the window exploded inward. Caroline stumbled back, her heart racing as shards of glass pierced the air around her. The storm was upon them, and the building seemed to creak and groan under the strain.
Caroline's first instinct was to run, but something deep down urged her to stay. She stepped over the shattered glass on the floor and peered out into the howling tempest. Lightning flashed, illuminating the dark sky for a brief moment, revealing a figure in the street below.
As the thunder clapped overhead, Caroline watched in disbelief as a man, his clothes tattered and soaked, stumbled through the intersection. He gestured wildly, his arms flailing like a conductor leading an invisible orchestra.
Intrigued, Caroline stepped closer to the broken window. The rain pounded against her, washing away any lingering hesitation. Something about the mysterious figure captivated her. He seemed to be fighting an unseen battle, his movements fierce and desperate.
Without thinking, Caroline darted into the war zone of the storm. Cars honked, their drivers impatient in the slow-moving traffic. Water swirled around her shoes as she made her way toward the middle of the street.
The mysterious man seemed to sense her presence, and his movements slowed. He turned, his face a mask of exhaustion and desperation, and let out a guttural cry. Caroline stood frozen, transfixed by his raw, unfiltered emotion.
The man's voice was lost in the storm as he shouted something unintelligible, his arms stretching toward her. Caroline felt an irresistible pull, drawn into the maelstrom of his passion. She could see the pain etched across his face, a map of a hard life, and somehow, she knew that he would understand her.
As Caroline approached, the man dropped to his knees, his body exhausted, giving in to the storm's onslaught. Their eyes met for a brief moment, and an unspoken connection formed between them. Caroline felt the pull of his desperation, and without hesitation, she reached out and took his hand.
The storm raged on around them, the thunder clapping above, but Caroline barely noticed. The world had shrunk to this moment, this encounter, as if the rest of existence had paused in deference to the raw power on display.
Slowly, the man stood up, helping Caroline to her feet. They stood together, their hands clasped tight, a beacon of shared resilience amidst the chaos. The storm's fury showed no signs of letting up, but Caroline didn't care. She felt alive, more alive than ever before, as if the raw power of nature had permeated her very being.
The mysterious stranger pulled her closer, and as their bodies touched, Caroline felt the weight of the world lift from her shoulders. The storm was a backdrop to their connection, a fitting soundtrack to their meeting.
Lightning flashed again, and this time, when the thunder roared, it seemed to applaud their encounter. The rain washed away the remnants of doubt and uncertainty, cleansing them in its wake.
Caroline knew that she would never forget this moment, the wild dance with the storm, and the stranger who had become a confidant. As the sky cried its watery tears, Caroline felt the beginnings of an adventure taking shape, an odyssey born from the depths of the tempest. |
minhngca/bert-finetuned-ner | minhngca | 2024-11-14T05:35:15Z | 106 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-14T05:26:05Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: conll2003
type: conll2003
config: conll2003
split: validation
args: conll2003
metrics:
- name: Precision
type: precision
value: 0.9305624278410028
- name: Recall
type: recall
value: 0.9495119488387749
- name: F1
type: f1
value: 0.9399416909620992
- name: Accuracy
type: accuracy
value: 0.9861658915641373
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0601
- Precision: 0.9306
- Recall: 0.9495
- F1: 0.9399
- Accuracy: 0.9862
## Model description
More information needed
## Intended uses & limitations
More information needed
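The card does not yet include usage code; as a hedged illustration (standard `transformers` token-classification usage, not confirmed by the author), inference might look like this:

```python
from transformers import pipeline

# hypothetical usage sketch; the repo id is taken from this card
ner = pipeline(
    "token-classification",
    model="minhngca/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge word-piece tokens into entity spans
)
print(ner("Hugging Face is based in New York City."))
```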
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
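For reference, a hedged sketch of how the hyperparameters above map onto `transformers` `TrainingArguments` (dataset loading, the model, and collation are omitted; `output_dir` is an assumption):

```python
from transformers import TrainingArguments

# mirrors the reported hyperparameters one-to-one
args = TrainingArguments(
    output_dir="bert-finetuned-ner",  # assumed output location
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```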
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0745 | 1.0 | 1756 | 0.0641 | 0.9037 | 0.9339 | 0.9186 | 0.9821 |
| 0.034 | 2.0 | 3512 | 0.0647 | 0.9268 | 0.9433 | 0.9349 | 0.9851 |
| 0.0216 | 3.0 | 5268 | 0.0601 | 0.9306 | 0.9495 | 0.9399 | 0.9862 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.4.1+cu118
- Datasets 3.1.0
- Tokenizers 0.20.3
|
DavidAU/DarkSapling-V2-Ultra-Quality-7B-GGUF-imatPLUS2-imatX | DavidAU | 2024-11-14T05:33:09Z | 86 | 2 | null | [
"gguf",
"ultra quality",
"upscale 32 bit",
"imatrix plus 2",
"imatrix x",
"creative",
"story",
"roleplay",
"rp",
"remastered",
"writing",
"text-generation",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | text-generation | 2024-06-10T12:17:23Z | ---
license: apache-2.0
tags:
- ultra quality
- upscale 32 bit
- imatrix plus 2
- imatrix x
- creative
- story
- roleplay
- rp
- remastered
- writing
pipeline_tag: text-generation
---
<h3> Experimental quants of "Dark Sapling V2 7B Ultra Quality" (32k context) using the Imatrix Plus 2 and Imatrix X protocols. </h3>
<img src="robot-ferrari.jpg" style="width:400px; height:400px; float:right; padding:10px;">
Imatrix Plus 2 is an advanced Imatrix quanting method using multiple datasets.
Imatrix X takes Imatrix Plus 2 a step further and introduces a "pass-through merge" method between Imatrix Plus 2 and regular Quants.
This leverages the full power of both Imatrix Plus 2 and regular quant (which is based on Ultra Quality 32 bit precision).
This marries (to varying degrees) the best of THREE worlds: the Ultra Quality base, the Imatrix Plus 2 process (and enhancements from its datasets), and the raw power of the unfiltered/unaltered Ultra Quality model itself.
These methods target the weak point(s) and enhance the strong point(s) at the same time.
Imatrix Plus 2 and Imatrix X are the next steps in the Ultra Quality experience.
The 11 quants hosted here are various versions of "Imatrix X" (with Imatrix Plus 2), with the exception of
"DarkSapling-V2-Ultra-Quality-7B-Q4_k_m-imat1.gguf", which is stand-alone Imatrix Plus 2.
All quants operate; however, there is more work in progress and these quants are subject to revision without notice.
For testing purposes (if you download more than one), set "TEMP:0" and test your prompt(s).
You should use prompts with "no right answer" to fully evaluate the quant (and to compare it against the other quants here).
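A minimal, hedged sketch of such a deterministic test using `llama-cpp-python` (one assumed way to run these GGUF files; the model filename is taken from the list above, and the prompt is from this card's own tests):

```python
from llama_cpp import Llama

# temperature 0 makes outputs repeatable, so quant versions can be compared fairly
llm = Llama(model_path="DarkSapling-V2-Ultra-Quality-7B-Q4_k_m-imat1.gguf")
out = llm(
    "Start a 1000 word scene with: The sky scraper swayed, as she watched "
    "the window in front of her on the 21 floor explode...",
    temperature=0.0,
    max_tokens=400,
)
print(out["choices"][0]["text"])
```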
Not all of these quants are suitable for all purposes, and the protocol(s) are still under development.
Dark Sapling V2 7B is a fiction, roleplay, fantasy and story telling model.
Although the quants at this repo are all "Q4_K_M Imatrix 2" quants, they differ greatly due to the Imatrix X protocols.
Some will be far more verbose than others, answer differently, make stuff up, lie to you or worse... it might flirt with you.
Others will be more "sciency" in their replies, whereas others will be more "fictional".
Some Imat X versions even bring the core model to new levels of performance, above and beyond current 32 bit Ultra Quality levels.
Caution: Can generate NSFW content.
For more information on Dark Sapling V2 Ultra Quality please see:
[ https://huggingface.co/DavidAU/DarkSapling-V2-Ultra-Quality-7B-GGUF ]
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 1" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/DarkSapling-V2-Ultra-Quality-7B-GGUF | DavidAU | 2024-11-14T05:32:41Z | 102 | 2 | null | [
"gguf",
"creative",
"story",
"roleplay",
"rp",
"32 bit upscale",
"remastered",
"writing",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-06-04T01:48:55Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- roleplay
- rp
- 32 bit upscale
- remastered
- writing
pipeline_tag: text-generation
---
<h3><font color="red"> Dark Sapling V2 7B - 32k Context - Ultra Quality - 32bit upscale.</font></h3>
<img src="dark-sapling.jpg" style="width:300px; height:300px; float:right; padding:10px;">
Complete remerge and remaster of the incredible Dark Sapling V2 7B - 32k Context from source files.
Registering an impressive drop of 320 points (lower is better) at Q4KM.
This puts "Q4KM" operating at "Q6" levels, and further elevates Q6 and Q8 as well.
Likewise, even Q2K (smallest quant) will operate at much higher levels than its original source counterpart.
<B>RESULTS:</b>
The result is superior performance in instruction following, reasoning, depth, nuance and emotion.
Prompt sizes can be reduced, as the model understands nuance better.
As a side effect, more context is available for output due to the smaller prompts.
Note that there will be an outsized difference between quants especially for creative and/or "no right answer" use cases.
Because of this it is suggested to download the highest quant you can operate, and its closest neighbours, so to speak.
IE: Q4KS, Q4KM, Q5KS as an example.
Imatrix Plus versions to be uploaded at a separate repo shortly.
Special thanks to "TEEZEE" the original model creator:
[ https://huggingface.co/TeeZee/DarkSapling-7B-v2.0 ]
NOTE: Version 1 and Version 1.1 are also remastered.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 1" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/DarkSapling-V1.1-Ultra-Quality-7B-GGUF | DavidAU | 2024-11-14T05:32:17Z | 2,553 | 3 | null | [
"gguf",
"creative",
"story",
"roleplay",
"rp",
"32 bit upscale",
"remastered",
"writing",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-06-03T22:46:32Z | ---
license: apache-2.0
language:
- en
tags:
- creative
- story
- roleplay
- rp
- 32 bit upscale
- remastered
- writing
pipeline_tag: text-generation
---
<h3><font color="red"> Dark Sapling V1.1 7B - 32k Context - Ultra Quality - 32 bit upscale.</font></h3>
<img src="dark-sapling.jpg" style="width:300px; height:300px; float:right; padding:10px;">
Complete remerge and remaster of the incredible Dark Sapling V1.1 7B - 32k Context from source files.
Registering an impressive drop of 240 points (lower is better) at Q4KM.
This puts "Q4KM" operating at "Q6" levels, and further elevates Q6 and Q8 as well.
Likewise, even Q2K (smallest quant) will operate at much higher levels than its original source counterpart.
<B>RESULTS:</b>
The result is superior performance in instruction following, reasoning, depth, nuance and emotion.
Prompt sizes can be reduced, as the model understands nuance better.
As a side effect, more context is available for output due to the smaller prompts.
Note that there will be an outsized difference between quants especially for creative and/or "no right answer" use cases.
Because of this it is suggested to download the highest quant you can operate, and its closest neighbours, so to speak.
IE: Q4KS, Q4KM, Q5KS as an example.
Imatrix Plus versions to be uploaded at a separate repo shortly.
Special thanks to "TEEZEE" the original model creator:
[ https://huggingface.co/TeeZee/DarkSapling-7B-v1.1 ]
NOTE: Version 1 and Version 2 are also remastered.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 1" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF | mradermacher | 2024-11-14T05:32:10Z | 154 | 0 | transformers | [
"transformers",
"gguf",
"code",
"chemistry",
"medical",
"en",
"dataset:Locutusque/hyperion-v2.0",
"base_model:Locutusque/Hyperion-2.0-Mistral-7B",
"base_model:quantized:Locutusque/Hyperion-2.0-Mistral-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-14T02:45:24Z | ---
base_model: Locutusque/Hyperion-2.0-Mistral-7B
datasets:
- Locutusque/hyperion-v2.0
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- code
- chemistry
- medical
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
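For repos whose READMEs call for simple concatenation of multi-part GGUF files, a hedged Python sketch (the part filenames below are hypothetical and vary by repo):

```python
import shutil

# join split GGUF parts in order into one usable file; names are illustrative only
parts = ["model.gguf.part1of2", "model.gguf.part2of2"]
with open("model.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)  # streamed copy, avoids loading parts into memory
```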
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ1_S.gguf) | i1-IQ1_S | 1.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ1_M.gguf) | i1-IQ1_M | 1.9 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_S.gguf) | i1-IQ2_S | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ2_M.gguf) | i1-IQ2_M | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q2_K.gguf) | i1-Q2_K | 2.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 2.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_S.gguf) | i1-IQ3_S | 3.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ3_M.gguf) | i1-IQ3_M | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.6 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 3.9 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_0.gguf) | i1-Q4_0 | 4.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.i1-Q6_K.gguf) | i1-Q6_K | 6.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Hyperion-2.0-Mistral-7B-GGUF | mradermacher | 2024-11-14T05:32:10Z | 46 | 0 | transformers | [
"transformers",
"gguf",
"code",
"chemistry",
"medical",
"en",
"dataset:Locutusque/hyperion-v2.0",
"base_model:Locutusque/Hyperion-2.0-Mistral-7B",
"base_model:quantized:Locutusque/Hyperion-2.0-Mistral-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-11-12T17:47:37Z | ---
base_model: Locutusque/Hyperion-2.0-Mistral-7B
datasets:
- Locutusque/hyperion-v2.0
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- code
- chemistry
- medical
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
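As a hedged convenience sketch (the filename comes from the table below; `huggingface_hub` is one assumed download method, any other works):

```python
from huggingface_hub import hf_hub_download

# fetch the "fast, recommended" quant from this repo's table
path = hf_hub_download(
    repo_id="mradermacher/Hyperion-2.0-Mistral-7B-GGUF",
    filename="Hyperion-2.0-Mistral-7B.Q4_K_M.gguf",
)
print(path)  # local path to the downloaded GGUF file
```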
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Hyperion-2.0-Mistral-7B-GGUF/resolve/main/Hyperion-2.0-Mistral-7B.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
DavidAU/Dark-Forest-V2-Ultra-Quality-20b-GGUF | DavidAU | 2024-11-14T05:31:17Z | 124 | 4 | null | [
"gguf",
"story",
"roleplay",
"creative",
"rp",
"fantasy",
"story telling",
"32 bit upscale",
"ultra high precision",
"nsfw",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-06-01T13:57:52Z | ---
license: apache-2.0
language:
- en
tags:
- story
- roleplay
- creative
- rp
- fantasy
- story telling
- 32 bit upscale
- ultra high precision
- nsfw
pipeline_tag: text-generation
---
<B> Ultra High Quality - 20B Dark Forest Version 2.0 - 32 bit upscale </b>
Fully rebuilt from master files, including full merge(s) to maintain full 32 bit precision right
up until it is compressed into GGUF files, which results in a top-to-bottom upgrade.
The result is superior performance in instruction following, reasoning, depth, nuance and emotion.
NOTE: There are two original versions of "Dark Forest 20B"; this is an upscale of the second version.
<img src="dark-forest.jpg">
On average this means a q4km operates at Q6 levels, and Q6 and Q8 exceed the original model's full-precision performance.
Perplexity drop (lower is better) is close to 10% (over 752 points for q4km) for all quants.
That means precision has been enhanced for all 20 billion parameters, which affects "brain density" / "function",
instruction following and output quality.
Imatrix quants to follow shortly.
For more details, including a list of enhancements see our other 32 bit
upscale of "Space Whale 20B" rebuild here:
[ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ]
For Version 1 of Dark Forest Ultra Quality 32 bit upscale please go here:
[ https://huggingface.co/DavidAU/Dark-Forest-V1-Ultra-Quality-20b-GGUF ]
Special thanks to "TEEZEE" for making both fantastic versions of "Dark Forest".
<b> Info from the original model card: </B>
Warning: This model can produce NSFW content!
Results:
- main difference to v1.0 - model has much better sense of humor.
- produces SFW and NSFW content without issues, switches context seamlessly.
- good at following instructions.
- good at tracking multiple characters in one scene.
- very creative, scenarios produced are mature and complicated; the model doesn't shy from writing about PTSD, mental issues or complicated relationships.
- NSFW output is more creative and surprising than typical limaRP output.
- definitely for mature audiences, not only because of vivid NSFW content but also because of overall maturity of stories it produces.
- This is NOT Harry Potter level storytelling.
For original model spec and information please visit:
[ https://huggingface.co/TeeZee/DarkForest-20B-v2.0 ]
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
DavidAU/Psyonic-Cetacean-V1-20B-Ultra-Quality-Float32 | DavidAU | 2024-11-14T05:30:15Z | 19 | 1 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"32 bit upscale",
"full 32 bit precision",
"master files",
"en",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-06-01T03:07:25Z | ---
license: apache-2.0
language:
- en
tags:
- 32 bit upscale
- full 32 bit precision
- master files
pipeline_tag: text-generation
---
<h3> Master Files for Ultra High Quality Remasters of "Psyonic-Cetacean" 20B </h3>
<img src="science-mad.jpg" width=300px height=300px style="float:right; padding:5px;">
May "Space Whale" swim in the oceans of the universe forever!
This repo contains the full precision (32 bit) master files for 32 bit upscales created by "DavidAU" of:
https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF-imatrix
And
https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF
Please view either repo for details on the remaster's results, and other important information.
<font color="red"><B>IMPORTANT NOTES For Maximum Results: </b></font>
These are "final" result files of the full precision rebuild (including end result merge(s)) minus
GGUF- and Imatrix-level upscaling / adjustments, which occur during the "GGUFing" processes.
If you use these to create your own GGUFs, please set the "outfile" type to F32 for best results. If
you use F16, this will reduce the quality by a factor of 2 or higher.
Imatrix processes should use a stable dataset of at least 500 "chunks".
If smaller datasets are used, this may corrupt or reduce the quality of the Imatrix builds.
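A hedged sketch of that workflow with llama.cpp's conversion and imatrix tools (script names, flags, and file paths are assumptions based on common llama.cpp usage, not the author's exact procedure):

```python
import subprocess

# 1) convert the 32-bit master files to GGUF with the outfile held at F32
subprocess.run(
    ["python", "convert_hf_to_gguf.py", "path/to/master-files",
     "--outtype", "f32", "--outfile", "psyonic-cetacean-20b-f32.gguf"],
    check=True,
)

# 2) build the imatrix from a stable calibration dataset of at least 500 chunks
subprocess.run(
    ["./llama-imatrix", "-m", "psyonic-cetacean-20b-f32.gguf",
     "-f", "calibration.txt", "-o", "psyonic-cetacean-20b.imatrix"],
    check=True,
)
```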
Due to the precision remaster there will be "greater" distance between each quant - both
non-imatrix and imatrix.
IE: The jump in quality, instruction following, "ai brainpower", nuance and output
between Q4 and Q5 and likewise Q5 and Q6 will be larger than normal.
Same applies to "Imatrix" quants.
In addition there will also be differences between matching Imatrix and non-Imatrix quants,
especially in terms of "creative" uses and/or uses where there is no "right answer".
Finally, in terms of prompts:
You may find longer prompts are no longer required, and/or you may need to reduce the size
of prompts in usage. This is due to the precision upscale.
Doing this will ensure the quality of the upscale is maximized in the GGUFs.
/* GPTQers:
Suggest 4bit-Act32 TRUE for best results.
/* EXL2ers:
Suggest a minimum of 4.5 BPW or higher; 6 BPW and up is especially potent.
Strongly suggest you do not reduce layer bit count, as this will affect depth and nuance.
The more BPW the better.
Happy GGUFing, EXL2ing, GPTQing, AWQing, HQQing and of course "Merging".
<b>LONG LIVE OPEN SOURCE!</B>
<I>DavidAU</I>
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
/* Drop me a note when up, and I will link the masters to your repos. |
DavidAU/Psyonic-Cetacean-EXP | DavidAU | 2024-11-14T05:29:47Z | 49 | 1 | null | [
"gguf",
"text-generation",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | text-generation | 2024-05-31T07:00:32Z | ---
license: apache-2.0
pipeline_tag: text-generation
---
Experimental builds in F32 and Imatrix upscaling for a 20B model.
Mix of regular quant / imatrix quants.
These are attempts to further improve the F32 upscale projects noted below.
Use at your own risk, as output may be NSFW.
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
<B>Related to F32 Master Upscale project:</B>
https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF
https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF-imatrix
<b> In part related to: </B>
https://huggingface.co/DavidAU/Dark-Forest-V1-Ultra-Quality-20b-GGUF
https://huggingface.co/DavidAU/Fimbulvetr-11B-Ultra-Quality-plus-imatrix-GGUF |
DavidAU/Dark-Forest-V1-Ultra-Quality-20b-GGUF | DavidAU | 2024-11-14T05:28:59Z | 80 | 4 | null | [
"gguf",
"story",
"roleplay",
"creative",
"rp",
"fantasy",
"story telling",
"32 bit upscale",
"ultra high precision",
"nsfw",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-05-30T12:47:09Z | ---
license: apache-2.0
language:
- en
tags:
- story
- roleplay
- creative
- rp
- fantasy
- story telling
- 32 bit upscale
- ultra high precision
- nsfw
pipeline_tag: text-generation
---
<B> Ultra High Quality - 20B Dark Forest Version 1.0 - 32 bit upscale </b>
Fully rebuilt from master files, including full merge(s) to maintain full 32 bit precision right
up until it is compressed into GGUF files which results on a top to bottom upgrade.
The result is superior performance in instruction following, reasoning, depth, nuance and emotion.
<img src="dark-forest.jpg">
On average this means a q4km operates at Q6 levels, and Q6 and Q8 exceed the original model's full-precision performance.
Perplexity drop (lower is better) is close to 10% (over 600 points for q4km) for all quants.
That means precision has been enhanced for all 20 billion parameters, which affects "brain density" / "function",
instruction following and output quality.
Imatrix quants to follow shortly.
For more details, including a list of enhancements see our other 32 bit
upscale of "Space Whale 20B" rebuild here:
[ https://huggingface.co/DavidAU/Psyonic-Cetacean-Ultra-Quality-20b-GGUF ]
Special thanks to "TEEZEE" for making a fantastic model.
<b> Info from the original model card: </B>
Warning: This model can produce NSFW content!
Results
- produces SFW and NSFW content without issues, switches context seamlessly.
- good at following instructions.
- good at tracking multiple characters in one scene.
- very creative, scenarios produced are mature and complicated; the model doesn't shy from writing about PTSD, mental issues or complicated relationships.
- NSFW output is more creative and surprising than typical limaRP output.
- definitely for mature audiences, not only because of vivid NSFW content but also because of overall maturity of stories it produces.
- This is NOT Harry Potter level storytelling.
For original model spec and information please visit:
[ https://huggingface.co/TeeZee/DarkForest-20B-v1.0 ]
<B>Highest Quality Settings / Optimal Operation Guide / Parameters and Samplers</B>
This is a "Class 2" model:
For all settings used for this model (including specifics for its "class"), including example generation(s) and for advanced settings guide (which many times addresses any model issue(s)), including methods to improve model performance for all use case(s) as well as chat, roleplay and other use case(s) please see:
[ https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters ]
|
Harshith42/opt-350m-HorrorCreepyPasta | Harshith42 | 2024-11-14T05:19:39Z | 126 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"opt",
"text-generation",
"generated_from_trainer",
"base_model:facebook/opt-350m",
"base_model:finetune:facebook/opt-350m",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T11:33:06Z | ---
library_name: transformers
license: other
base_model: facebook/opt-350m
tags:
- generated_from_trainer
model-index:
- name: opt-350m-HorrorCreepyPasta
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# opt-350m-HorrorCreepyPasta
This model is a fine-tuned version of [facebook/opt-350m](https://huggingface.co/facebook/opt-350m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2518
## Model description
More information needed
## Intended uses & limitations
More information needed
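No usage example is provided; a hedged sketch with the standard `transformers` text-generation pipeline (the prompt and sampling settings are illustrative):

```python
from transformers import pipeline

# generate a short horror/creepypasta-style continuation
gen = pipeline("text-generation", model="Harshith42/opt-350m-HorrorCreepyPasta")
story = gen(
    "The house at the end of the street had been empty for years,",
    max_new_tokens=80,
    do_sample=True,
)
print(story[0]["generated_text"])
```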
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 303 | 2.6397 |
| 2.9418 | 2.0 | 606 | 2.3592 |
| 2.9418 | 3.0 | 909 | 2.2518 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
Shreyask09/segmentation-finetune-hindi | Shreyask09 | 2024-11-14T05:06:57Z | 5 | 0 | transformers | [
"transformers",
"pytorch",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2024-11-11T11:18:36Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mradermacher/Trinity-2-Codestral-22B-i1-GGUF | mradermacher | 2024-11-14T04:53:12Z | 93 | 1 | transformers | [
"transformers",
"gguf",
"generated_from_trainer",
"en",
"base_model:migtissera/Trinity-2-Codestral-22B",
"base_model:quantized:migtissera/Trinity-2-Codestral-22B",
"license:other",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-13T21:30:16Z | ---
base_model: migtissera/Trinity-2-Codestral-22B
language:
- en
library_name: transformers
license: other
quantized_by: mradermacher
tags:
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/migtissera/Trinity-2-Codestral-22B
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
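As one concrete option, a minimal loading sketch with `llama-cpp-python`; the library choice, context size, and prompt are assumptions, and the file name is the Q4_K_M quant from the table below:

```python
from llama_cpp import Llama

# Point at a locally downloaded quant, e.g. the recommended Q4_K_M file.
llm = Llama(
    model_path="Trinity-2-Codestral-22B.i1-Q4_K_M.gguf",
    n_ctx=4096,  # context window; raise it if your RAM/VRAM allows
)

out = llm("Write a Python function that reverses a string.", max_tokens=128)
print(out["choices"][0]["text"])
```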
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ1_S.gguf) | i1-IQ1_S | 4.9 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ1_M.gguf) | i1-IQ1_M | 5.4 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 6.1 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 6.7 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ2_S.gguf) | i1-IQ2_S | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ2_M.gguf) | i1-IQ2_M | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q2_K.gguf) | i1-Q2_K | 8.4 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 8.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 9.7 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ3_S.gguf) | i1-IQ3_S | 9.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ3_M.gguf) | i1-IQ3_M | 10.2 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 10.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 11.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 12.0 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q4_0.gguf) | i1-Q4_0 | 12.7 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 12.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 13.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 15.4 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 15.8 | |
| [GGUF](https://huggingface.co/mradermacher/Trinity-2-Codestral-22B-i1-GGUF/resolve/main/Trinity-2-Codestral-22B.i1-Q6_K.gguf) | i1-Q6_K | 18.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
ICTNLP/Llama-3.1-8B-Omni | ICTNLP | 2024-11-14T04:47:48Z | 3,519 | 393 | llama-omni | [
"llama-omni",
"safetensors",
"omni_speech2s_llama",
"large language models",
"speech-language models",
"speech interaction",
"speech-to-speech",
"en",
"arxiv:2409.06666",
"base_model:meta-llama/Llama-3.1-8B-Instruct",
"base_model:finetune:meta-llama/Llama-3.1-8B-Instruct",
"license:other",
"region:us"
] | null | 2024-09-10T05:23:09Z | ---
license: other
language:
- en
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- large language models
- speech-language models
- speech interaction
- speech-to-speech
library_name: llama-omni
---
# 🦙🎧 LLaMA-Omni: Seamless Speech Interaction with Large Language Models
> **Authors: [Qingkai Fang](https://fangqingkai.github.io/), [Shoutao Guo](https://scholar.google.com/citations?hl=en&user=XwHtPyAAAAAJ), [Yan Zhou](https://zhouyan19.github.io/zhouyan/), [Zhengrui Ma](https://scholar.google.com.hk/citations?user=dUgq6tEAAAAJ), [Shaolei Zhang](https://zhangshaolei1998.github.io/), [Yang Feng*](https://people.ucas.edu.cn/~yangfeng?language=en)**
[[Paper]](https://arxiv.org/abs/2409.06666) [[Model]](https://huggingface.co/ICTNLP/Llama-3.1-8B-Omni) [[Code]](https://github.com/ictnlp/LLaMA-Omni)
LLaMA-Omni is a speech-language model built upon Llama-3.1-8B-Instruct. It supports low-latency and high-quality speech interactions, simultaneously generating both text and speech responses based on speech instructions.

## 💡 Highlights
- 💪 **Built on Llama-3.1-8B-Instruct, ensuring high-quality responses.**
- 🚀 **Low-latency speech interaction with a latency as low as 226ms.**
- 🎧 **Simultaneous generation of both text and speech responses.**
- ♻️ **Trained in less than 3 days using just 4 GPUs.**
<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/65b7573482d384513443875e/dr4XWUxzuVQ52lBuzNBTt.mp4"></video>
## Install
1. Clone this repository.
```shell
git clone https://github.com/ictnlp/LLaMA-Omni
cd LLaMA-Omni
```
2. Install packages.
```shell
conda create -n llama-omni python=3.10
conda activate llama-omni
pip install pip==24.0
pip install -e .
```
3. Install `fairseq`.
```shell
git clone https://github.com/pytorch/fairseq
cd fairseq
pip install -e . --no-build-isolation
```
4. Install `flash-attention`.
```shell
pip install flash-attn --no-build-isolation
```
## Quick Start
1. Download the `Llama-3.1-8B-Omni` model from 🤗[Huggingface](https://huggingface.co/ICTNLP/Llama-3.1-8B-Omni).
2. Download the `Whisper-large-v3` model.
```python
import whisper
model = whisper.load_model("large-v3", download_root="models/speech_encoder/")
```
3. Download the unit-based HiFi-GAN vocoder.
```shell
wget https://dl.fbaipublicfiles.com/fairseq/speech_to_speech/vocoder/code_hifigan/mhubert_vp_en_es_fr_it3_400k_layer11_km1000_lj/g_00500000 -P vocoder/
wget https://dl.fbaipublicfiles.com/fairseq/speech_to_speech/vocoder/code_hifigan/mhubert_vp_en_es_fr_it3_400k_layer11_km1000_lj/config.json -P vocoder/
```
## Gradio Demo
1. Launch a controller.
```shell
python -m omni_speech.serve.controller --host 0.0.0.0 --port 10000
```
2. Launch a gradio web server.
```shell
python -m omni_speech.serve.gradio_web_server --controller http://localhost:10000 --port 8000 --model-list-mode reload --vocoder vocoder/g_00500000 --vocoder-cfg vocoder/config.json
```
3. Launch a model worker.
```shell
python -m omni_speech.serve.model_worker --host 0.0.0.0 --controller http://localhost:10000 --port 40000 --worker http://localhost:40000 --model-path Llama-3.1-8B-Omni --model-name Llama-3.1-8B-Omni --s2s
```
4. Visit [http://localhost:8000/](http://localhost:8000/) and interact with LLaMA-3.1-8B-Omni!
**Note: Due to the instability of streaming audio playback in Gradio, we have only implemented streaming audio synthesis without enabling autoplay. If you have a good solution, feel free to submit a PR. Thanks!**
## Local Inference
To run inference locally, please organize the speech instruction files according to the format in the `omni_speech/infer/examples` directory, then refer to the following script.
```shell
bash omni_speech/infer/run.sh omni_speech/infer/examples
```
## LICENSE
Our code is released under the Apache-2.0 License. Our model is intended for academic research purposes only and may **NOT** be used for commercial purposes.
You are free to use, modify, and distribute this model in academic settings, provided that the following conditions are met:
- **Non-commercial use**: The model may not be used for any commercial purposes.
- **Citation**: If you use this model in your research, please cite the original work.
### Commercial Use Restriction
For any commercial use inquiries or to obtain a commercial license, please contact `[email protected]`.
## Acknowledgements
- [LLaVA](https://github.com/haotian-liu/LLaVA): The codebase we built upon.
- [SLAM-LLM](https://github.com/X-LANCE/SLAM-LLM): We borrow some code about speech encoder and speech adaptor.
If you have any questions, please feel free to submit an issue or contact `[email protected]`.
## Citation
If our work is useful for you, please cite as:
```
@article{fang-etal-2024-llama-omni,
title={LLaMA-Omni: Seamless Speech Interaction with Large Language Models},
author={Fang, Qingkai and Guo, Shoutao and Zhou, Yan and Ma, Zhengrui and Zhang, Shaolei and Feng, Yang},
journal={arXiv preprint arXiv:2409.06666},
year={2024}
}
``` |
kazshinohara/gemma-2-2b-int-jpn-lora1 | kazshinohara | 2024-11-14T04:44:57Z | 97 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"gemma2",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:google/gemma-2-2b-jpn-it",
"base_model:finetune:google/gemma-2-2b-jpn-it",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T16:21:46Z | ---
base_model: google/gemma-2-2b-jpn-it
library_name: transformers
model_name: gemma-2-2b-int-jpn-lora1
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for gemma-2-2b-int-jpn-lora1
This model is a fine-tuned version of [google/gemma-2-2b-jpn-it](https://huggingface.co/google/gemma-2-2b-jpn-it).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="kazshinohara/gemma-2-2b-int-jpn-lora1", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
This model was trained with SFT.
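A minimal sketch of that setup, assuming TRL's `SFTTrainer`; the dataset is a placeholder, since the actual training data is not named in this card:

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder conversational dataset; substitute the real training data.
dataset = load_dataset("trl-lib/Capybara", split="train")

trainer = SFTTrainer(
    model="google/gemma-2-2b-jpn-it",  # base model from this card
    train_dataset=dataset,
    args=SFTConfig(output_dir="gemma-2-2b-int-jpn-lora1"),
)
trainer.train()
```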
### Framework versions
- TRL: 0.12.0
- Transformers: 4.46.2
- Pytorch: 2.5.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
bharati2324/Llama-1B-Code-LoRA-r16-mlp-merged | bharati2324 | 2024-11-14T04:33:31Z | 89 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"unsloth",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T04:32:16Z | ---
library_name: transformers
tags:
- unsloth
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
furiosa-ai/EXAONE-3.0-7.8B-Instruct-converted | furiosa-ai | 2024-11-14T04:28:23Z | 5,571 | 1 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-07T11:37:22Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Rich-J/subnet29_upload_c02_N13_0 | Rich-J | 2024-11-14T04:23:10Z | 35 | 0 | transformers | [
"transformers",
"safetensors",
"phi3",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-14T04:20:07Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold0 | MayBashendy | 2024-11-14T04:18:14Z | 164 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T03:58:01Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV4-trial2_k20_task1_organization_fold0
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7729
- Qwk: 0.6951
- Mse: 0.7729
- Rmse: 0.8792
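For reference, a minimal sketch of how these metrics can be computed with `scikit-learn`, assuming Qwk denotes quadratic weighted kappa; the label arrays are placeholders:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder integer score labels; substitute real references/predictions.
y_true = np.array([3, 2, 4, 3, 1])
y_pred = np.array([3, 3, 4, 2, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```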
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0051 | 2 | 4.8349 | -0.0064 | 4.8349 | 2.1988 |
| No log | 0.0102 | 4 | 2.6648 | 0.0968 | 2.6648 | 1.6324 |
| No log | 0.0153 | 6 | 1.6244 | 0.1548 | 1.6244 | 1.2745 |
| No log | 0.0205 | 8 | 1.3794 | 0.0950 | 1.3794 | 1.1745 |
| No log | 0.0256 | 10 | 1.2051 | 0.3693 | 1.2051 | 1.0978 |
| No log | 0.0307 | 12 | 1.1561 | 0.3623 | 1.1561 | 1.0752 |
| No log | 0.0358 | 14 | 1.4367 | 0.2957 | 1.4367 | 1.1986 |
| No log | 0.0409 | 16 | 2.0532 | 0.1873 | 2.0532 | 1.4329 |
| No log | 0.0460 | 18 | 2.3036 | 0.1600 | 2.3036 | 1.5178 |
| No log | 0.0512 | 20 | 2.0619 | 0.1600 | 2.0619 | 1.4359 |
| No log | 0.0563 | 22 | 2.0155 | 0.0801 | 2.0155 | 1.4197 |
| No log | 0.0614 | 24 | 1.9389 | 0.1075 | 1.9389 | 1.3925 |
| No log | 0.0665 | 26 | 2.0406 | 0.1075 | 2.0406 | 1.4285 |
| No log | 0.0716 | 28 | 1.9531 | 0.1075 | 1.9531 | 1.3975 |
| No log | 0.0767 | 30 | 1.8806 | 0.1873 | 1.8806 | 1.3714 |
| No log | 0.0818 | 32 | 1.8895 | 0.1873 | 1.8895 | 1.3746 |
| No log | 0.0870 | 34 | 1.8246 | 0.2145 | 1.8246 | 1.3508 |
| No log | 0.0921 | 36 | 1.7821 | 0.2145 | 1.7821 | 1.3349 |
| No log | 0.0972 | 38 | 2.0287 | 0.4324 | 2.0287 | 1.4243 |
| No log | 0.1023 | 40 | 2.1866 | 0.3465 | 2.1866 | 1.4787 |
| No log | 0.1074 | 42 | 2.1923 | 0.3465 | 2.1923 | 1.4807 |
| No log | 0.1125 | 44 | 1.9613 | 0.5404 | 1.9613 | 1.4005 |
| No log | 0.1176 | 46 | 1.6150 | 0.6265 | 1.6150 | 1.2708 |
| No log | 0.1228 | 48 | 1.3209 | 0.4592 | 1.3209 | 1.1493 |
| No log | 0.1279 | 50 | 1.3936 | 0.4592 | 1.3936 | 1.1805 |
| No log | 0.1330 | 52 | 1.9210 | 0.5440 | 1.9210 | 1.3860 |
| No log | 0.1381 | 54 | 2.3070 | 0.3977 | 2.3070 | 1.5189 |
| No log | 0.1432 | 56 | 2.3246 | 0.3425 | 2.3246 | 1.5246 |
| No log | 0.1483 | 58 | 2.5462 | 0.3425 | 2.5462 | 1.5957 |
| No log | 0.1535 | 60 | 2.3681 | 0.3368 | 2.3681 | 1.5389 |
| No log | 0.1586 | 62 | 1.9791 | 0.4783 | 1.9791 | 1.4068 |
| No log | 0.1637 | 64 | 1.6492 | 0.5028 | 1.6492 | 1.2842 |
| No log | 0.1688 | 66 | 1.6259 | 0.5028 | 1.6259 | 1.2751 |
| No log | 0.1739 | 68 | 1.9838 | 0.4786 | 1.9838 | 1.4085 |
| No log | 0.1790 | 70 | 2.6579 | 0.3478 | 2.6579 | 1.6303 |
| No log | 0.1841 | 72 | 2.7509 | 0.3478 | 2.7509 | 1.6586 |
| No log | 0.1893 | 74 | 2.2229 | 0.4786 | 2.2229 | 1.4909 |
| No log | 0.1944 | 76 | 1.7089 | 0.5661 | 1.7089 | 1.3072 |
| No log | 0.1995 | 78 | 1.3284 | 0.5670 | 1.3284 | 1.1526 |
| No log | 0.2046 | 80 | 1.2327 | 0.5670 | 1.2327 | 1.1103 |
| No log | 0.2097 | 82 | 1.4338 | 0.5882 | 1.4338 | 1.1974 |
| No log | 0.2148 | 84 | 1.8134 | 0.4786 | 1.8134 | 1.3466 |
| No log | 0.2199 | 86 | 2.3598 | 0.4011 | 2.3598 | 1.5362 |
| No log | 0.2251 | 88 | 2.4987 | 0.3104 | 2.4987 | 1.5807 |
| No log | 0.2302 | 90 | 2.0012 | 0.2743 | 2.0012 | 1.4147 |
| No log | 0.2353 | 92 | 1.4163 | 0.5484 | 1.4163 | 1.1901 |
| No log | 0.2404 | 94 | 1.3715 | 0.6165 | 1.3715 | 1.1711 |
| No log | 0.2455 | 96 | 1.6667 | 0.4381 | 1.6667 | 1.2910 |
| No log | 0.2506 | 98 | 1.9179 | 0.4606 | 1.9179 | 1.3849 |
| No log | 0.2558 | 100 | 1.7097 | 0.4595 | 1.7097 | 1.3075 |
| No log | 0.2609 | 102 | 1.6445 | 0.4595 | 1.6445 | 1.2824 |
| No log | 0.2660 | 104 | 1.7876 | 0.4389 | 1.7876 | 1.3370 |
| No log | 0.2711 | 106 | 1.7800 | 0.4389 | 1.7800 | 1.3342 |
| No log | 0.2762 | 108 | 1.8974 | 0.4408 | 1.8974 | 1.3774 |
| No log | 0.2813 | 110 | 1.5581 | 0.4995 | 1.5581 | 1.2483 |
| No log | 0.2864 | 112 | 1.3174 | 0.7929 | 1.3174 | 1.1478 |
| No log | 0.2916 | 114 | 1.2772 | 0.7929 | 1.2772 | 1.1302 |
| No log | 0.2967 | 116 | 1.2373 | 0.7929 | 1.2373 | 1.1123 |
| No log | 0.3018 | 118 | 1.5449 | 0.5200 | 1.5449 | 1.2429 |
| No log | 0.3069 | 120 | 1.7477 | 0.4606 | 1.7477 | 1.3220 |
| No log | 0.3120 | 122 | 1.6385 | 0.5188 | 1.6385 | 1.2800 |
| No log | 0.3171 | 124 | 1.2969 | 0.7412 | 1.2969 | 1.1388 |
| No log | 0.3223 | 126 | 1.0825 | 0.7626 | 1.0825 | 1.0404 |
| No log | 0.3274 | 128 | 1.2306 | 0.7626 | 1.2306 | 1.1093 |
| No log | 0.3325 | 130 | 1.7058 | 0.5200 | 1.7058 | 1.3061 |
| No log | 0.3376 | 132 | 1.8015 | 0.6201 | 1.8015 | 1.3422 |
| No log | 0.3427 | 134 | 1.5803 | 0.5737 | 1.5803 | 1.2571 |
| No log | 0.3478 | 136 | 1.3957 | 0.7042 | 1.3957 | 1.1814 |
| No log | 0.3529 | 138 | 1.4369 | 0.5404 | 1.4369 | 1.1987 |
| No log | 0.3581 | 140 | 1.4721 | 0.5200 | 1.4721 | 1.2133 |
| No log | 0.3632 | 142 | 1.7918 | 0.5200 | 1.7918 | 1.3386 |
| No log | 0.3683 | 144 | 1.7818 | 0.4595 | 1.7818 | 1.3349 |
| No log | 0.3734 | 146 | 1.9216 | 0.4595 | 1.9216 | 1.3862 |
| No log | 0.3785 | 148 | 2.0893 | 0.4793 | 2.0893 | 1.4454 |
| No log | 0.3836 | 150 | 1.6658 | 0.5200 | 1.6658 | 1.2907 |
| No log | 0.3887 | 152 | 1.2029 | 0.7519 | 1.2029 | 1.0967 |
| No log | 0.3939 | 154 | 1.2409 | 0.6975 | 1.2409 | 1.1140 |
| No log | 0.3990 | 156 | 1.5974 | 0.5426 | 1.5974 | 1.2639 |
| No log | 0.4041 | 158 | 2.0837 | 0.4991 | 2.0837 | 1.4435 |
| No log | 0.4092 | 160 | 2.2363 | 0.5502 | 2.2363 | 1.4954 |
| No log | 0.4143 | 162 | 1.8709 | 0.6065 | 1.8709 | 1.3678 |
| No log | 0.4194 | 164 | 1.3125 | 0.7211 | 1.3125 | 1.1457 |
| No log | 0.4246 | 166 | 1.2602 | 0.6890 | 1.2602 | 1.1226 |
| No log | 0.4297 | 168 | 1.3720 | 0.7211 | 1.3720 | 1.1713 |
| No log | 0.4348 | 170 | 1.6872 | 0.4595 | 1.6872 | 1.2989 |
| No log | 0.4399 | 172 | 2.0884 | 0.4606 | 2.0884 | 1.4451 |
| No log | 0.4450 | 174 | 1.9520 | 0.4011 | 1.9520 | 1.3971 |
| No log | 0.4501 | 176 | 1.6423 | 0.4381 | 1.6423 | 1.2815 |
| No log | 0.4552 | 178 | 1.4222 | 0.6157 | 1.4222 | 1.1926 |
| No log | 0.4604 | 180 | 1.3405 | 0.6290 | 1.3405 | 1.1578 |
| No log | 0.4655 | 182 | 1.3006 | 0.6290 | 1.3006 | 1.1404 |
| No log | 0.4706 | 184 | 1.5053 | 0.5474 | 1.5053 | 1.2269 |
| No log | 0.4757 | 186 | 1.5494 | 0.5449 | 1.5494 | 1.2447 |
| No log | 0.4808 | 188 | 1.5028 | 0.5638 | 1.5028 | 1.2259 |
| No log | 0.4859 | 190 | 1.3690 | 0.7042 | 1.3690 | 1.1700 |
| No log | 0.4910 | 192 | 1.1758 | 0.7599 | 1.1758 | 1.0843 |
| No log | 0.4962 | 194 | 1.1233 | 0.7599 | 1.1233 | 1.0598 |
| No log | 0.5013 | 196 | 1.0566 | 0.7590 | 1.0566 | 1.0279 |
| No log | 0.5064 | 198 | 1.3227 | 0.7905 | 1.3227 | 1.1501 |
| No log | 0.5115 | 200 | 1.7937 | 0.7115 | 1.7937 | 1.3393 |
| No log | 0.5166 | 202 | 2.1526 | 0.5653 | 2.1526 | 1.4672 |
| No log | 0.5217 | 204 | 2.1276 | 0.5166 | 2.1276 | 1.4586 |
| No log | 0.5269 | 206 | 1.8828 | 0.4381 | 1.8828 | 1.3721 |
| No log | 0.5320 | 208 | 1.5447 | 0.5474 | 1.5447 | 1.2429 |
| No log | 0.5371 | 210 | 1.5262 | 0.6157 | 1.5262 | 1.2354 |
| No log | 0.5422 | 212 | 1.5911 | 0.5638 | 1.5911 | 1.2614 |
| No log | 0.5473 | 214 | 1.5599 | 0.5638 | 1.5599 | 1.2490 |
| No log | 0.5524 | 216 | 1.2361 | 0.6265 | 1.2361 | 1.1118 |
| No log | 0.5575 | 218 | 1.0876 | 0.6982 | 1.0876 | 1.0429 |
| No log | 0.5627 | 220 | 1.0005 | 0.6988 | 1.0005 | 1.0002 |
| No log | 0.5678 | 222 | 1.1702 | 0.6061 | 1.1702 | 1.0817 |
| No log | 0.5729 | 224 | 1.4755 | 0.6008 | 1.4755 | 1.2147 |
| No log | 0.5780 | 226 | 1.6009 | 0.6349 | 1.6009 | 1.2652 |
| No log | 0.5831 | 228 | 1.5692 | 0.6349 | 1.5692 | 1.2527 |
| No log | 0.5882 | 230 | 1.3451 | 0.5638 | 1.3451 | 1.1598 |
| No log | 0.5934 | 232 | 1.1034 | 0.6975 | 1.1034 | 1.0504 |
| No log | 0.5985 | 234 | 1.0618 | 0.7519 | 1.0618 | 1.0304 |
| No log | 0.6036 | 236 | 1.2523 | 0.6265 | 1.2523 | 1.1191 |
| No log | 0.6087 | 238 | 1.3015 | 0.6265 | 1.3015 | 1.1408 |
| No log | 0.6138 | 240 | 1.3130 | 0.6265 | 1.3130 | 1.1459 |
| No log | 0.6189 | 242 | 1.2943 | 0.6265 | 1.2943 | 1.1377 |
| No log | 0.6240 | 244 | 1.3726 | 0.5638 | 1.3726 | 1.1716 |
| No log | 0.6292 | 246 | 1.4792 | 0.5638 | 1.4792 | 1.2162 |
| No log | 0.6343 | 248 | 1.3838 | 0.5638 | 1.3838 | 1.1764 |
| No log | 0.6394 | 250 | 1.6283 | 0.5404 | 1.6283 | 1.2760 |
| No log | 0.6445 | 252 | 1.6280 | 0.5404 | 1.6280 | 1.2759 |
| No log | 0.6496 | 254 | 1.5809 | 0.5404 | 1.5809 | 1.2573 |
| No log | 0.6547 | 256 | 1.3672 | 0.6265 | 1.3672 | 1.1693 |
| No log | 0.6598 | 258 | 1.0624 | 0.7296 | 1.0624 | 1.0307 |
| No log | 0.6650 | 260 | 1.0751 | 0.7296 | 1.0751 | 1.0369 |
| No log | 0.6701 | 262 | 1.4643 | 0.6008 | 1.4643 | 1.2101 |
| No log | 0.6752 | 264 | 1.6173 | 0.5965 | 1.6173 | 1.2717 |
| No log | 0.6803 | 266 | 1.4403 | 0.6008 | 1.4403 | 1.2001 |
| No log | 0.6854 | 268 | 1.2346 | 0.7211 | 1.2346 | 1.1111 |
| No log | 0.6905 | 270 | 1.2515 | 0.7211 | 1.2515 | 1.1187 |
| No log | 0.6957 | 272 | 1.3066 | 0.7211 | 1.3066 | 1.1431 |
| No log | 0.7008 | 274 | 1.4740 | 0.6485 | 1.4740 | 1.2141 |
| No log | 0.7059 | 276 | 1.3580 | 0.7211 | 1.3580 | 1.1653 |
| No log | 0.7110 | 278 | 1.3045 | 0.6890 | 1.3045 | 1.1421 |
| No log | 0.7161 | 280 | 1.4184 | 0.7042 | 1.4184 | 1.1909 |
| No log | 0.7212 | 282 | 1.6800 | 0.6446 | 1.6800 | 1.2962 |
| No log | 0.7263 | 284 | 1.5553 | 0.6446 | 1.5553 | 1.2471 |
| No log | 0.7315 | 286 | 1.3539 | 0.6008 | 1.3539 | 1.1636 |
| No log | 0.7366 | 288 | 1.2274 | 0.6890 | 1.2274 | 1.1079 |
| No log | 0.7417 | 290 | 1.0952 | 0.6818 | 1.0952 | 1.0465 |
| No log | 0.7468 | 292 | 1.0540 | 0.6818 | 1.0540 | 1.0267 |
| No log | 0.7519 | 294 | 1.0954 | 0.7129 | 1.0954 | 1.0466 |
| No log | 0.7570 | 296 | 1.0829 | 0.7115 | 1.0829 | 1.0406 |
| No log | 0.7621 | 298 | 1.2256 | 0.7583 | 1.2256 | 1.1071 |
| No log | 0.7673 | 300 | 1.4730 | 0.7031 | 1.4730 | 1.2137 |
| No log | 0.7724 | 302 | 1.3904 | 0.6457 | 1.3904 | 1.1792 |
| No log | 0.7775 | 304 | 1.2508 | 0.5540 | 1.2508 | 1.1184 |
| No log | 0.7826 | 306 | 1.2461 | 0.5608 | 1.2461 | 1.1163 |
| No log | 0.7877 | 308 | 1.2609 | 0.5965 | 1.2609 | 1.1229 |
| No log | 0.7928 | 310 | 1.0147 | 0.6763 | 1.0147 | 1.0073 |
| No log | 0.7980 | 312 | 0.8106 | 0.72 | 0.8106 | 0.9003 |
| No log | 0.8031 | 314 | 0.9081 | 0.6623 | 0.9081 | 0.9530 |
| No log | 0.8082 | 316 | 1.1449 | 0.7328 | 1.1449 | 1.0700 |
| No log | 0.8133 | 318 | 1.2634 | 0.7123 | 1.2634 | 1.1240 |
| No log | 0.8184 | 320 | 1.1884 | 0.7123 | 1.1884 | 1.0902 |
| No log | 0.8235 | 322 | 1.0369 | 0.6763 | 1.0369 | 1.0183 |
| No log | 0.8286 | 324 | 1.1726 | 0.6890 | 1.1726 | 1.0829 |
| No log | 0.8338 | 326 | 1.5457 | 0.6115 | 1.5457 | 1.2433 |
| No log | 0.8389 | 328 | 1.5769 | 0.6115 | 1.5769 | 1.2557 |
| No log | 0.8440 | 330 | 1.2436 | 0.6491 | 1.2436 | 1.1152 |
| No log | 0.8491 | 332 | 0.9340 | 0.6940 | 0.9340 | 0.9664 |
| No log | 0.8542 | 334 | 0.8555 | 0.7327 | 0.8555 | 0.9249 |
| No log | 0.8593 | 336 | 0.8988 | 0.7239 | 0.8988 | 0.9480 |
| No log | 0.8645 | 338 | 1.0635 | 0.7239 | 1.0635 | 1.0313 |
| No log | 0.8696 | 340 | 0.9573 | 0.7239 | 0.9573 | 0.9784 |
| No log | 0.8747 | 342 | 0.9880 | 0.7239 | 0.9880 | 0.9940 |
| No log | 0.8798 | 344 | 1.0130 | 0.7239 | 1.0130 | 1.0065 |
| No log | 0.8849 | 346 | 1.0970 | 0.7157 | 1.0970 | 1.0474 |
| No log | 0.8900 | 348 | 1.1216 | 0.7239 | 1.1216 | 1.0590 |
| No log | 0.8951 | 350 | 0.9248 | 0.7433 | 0.9248 | 0.9617 |
| No log | 0.9003 | 352 | 0.7307 | 0.7685 | 0.7307 | 0.8548 |
| No log | 0.9054 | 354 | 0.7261 | 0.7685 | 0.7261 | 0.8521 |
| No log | 0.9105 | 356 | 0.7789 | 0.7685 | 0.7789 | 0.8826 |
| No log | 0.9156 | 358 | 0.9725 | 0.7602 | 0.9725 | 0.9861 |
| No log | 0.9207 | 360 | 1.2739 | 0.7701 | 1.2739 | 1.1287 |
| No log | 0.9258 | 362 | 1.2980 | 0.7504 | 1.2980 | 1.1393 |
| No log | 0.9309 | 364 | 1.1806 | 0.7684 | 1.1806 | 1.0866 |
| No log | 0.9361 | 366 | 1.0025 | 0.7614 | 1.0025 | 1.0013 |
| No log | 0.9412 | 368 | 0.9132 | 0.7519 | 0.9132 | 0.9556 |
| No log | 0.9463 | 370 | 0.9538 | 0.7519 | 0.9538 | 0.9766 |
| No log | 0.9514 | 372 | 1.0991 | 0.7305 | 1.0991 | 1.0484 |
| No log | 0.9565 | 374 | 1.3855 | 0.6545 | 1.3855 | 1.1771 |
| No log | 0.9616 | 376 | 1.4673 | 0.6545 | 1.4673 | 1.2113 |
| No log | 0.9668 | 378 | 1.5124 | 0.6545 | 1.5124 | 1.2298 |
| No log | 0.9719 | 380 | 1.5253 | 0.6545 | 1.5253 | 1.2350 |
| No log | 0.9770 | 382 | 1.2814 | 0.7123 | 1.2814 | 1.1320 |
| No log | 0.9821 | 384 | 0.9290 | 0.7614 | 0.9290 | 0.9638 |
| No log | 0.9872 | 386 | 0.8922 | 0.7516 | 0.8922 | 0.9446 |
| No log | 0.9923 | 388 | 1.0415 | 0.7417 | 1.0415 | 1.0205 |
| No log | 0.9974 | 390 | 1.4723 | 0.8067 | 1.4723 | 1.2134 |
| No log | 1.0026 | 392 | 1.7271 | 0.6328 | 1.7271 | 1.3142 |
| No log | 1.0077 | 394 | 1.5359 | 0.5385 | 1.5359 | 1.2393 |
| No log | 1.0128 | 396 | 1.1490 | 0.7519 | 1.1490 | 1.0719 |
| No log | 1.0179 | 398 | 0.9212 | 0.7524 | 0.9212 | 0.9598 |
| No log | 1.0230 | 400 | 0.7470 | 0.7118 | 0.7470 | 0.8643 |
| No log | 1.0281 | 402 | 0.7112 | 0.7258 | 0.7112 | 0.8433 |
| No log | 1.0332 | 404 | 0.8524 | 0.6877 | 0.8524 | 0.9232 |
| No log | 1.0384 | 406 | 1.2090 | 0.7786 | 1.2090 | 1.0996 |
| No log | 1.0435 | 408 | 1.4343 | 0.6727 | 1.4343 | 1.1976 |
| No log | 1.0486 | 410 | 1.4046 | 0.7148 | 1.4046 | 1.1852 |
| No log | 1.0537 | 412 | 1.1984 | 0.7786 | 1.1984 | 1.0947 |
| No log | 1.0588 | 414 | 1.0933 | 0.7688 | 1.0933 | 1.0456 |
| No log | 1.0639 | 416 | 0.9940 | 0.7337 | 0.9940 | 0.9970 |
| No log | 1.0691 | 418 | 1.1435 | 0.7688 | 1.1435 | 1.0693 |
| No log | 1.0742 | 420 | 1.3131 | 0.7594 | 1.3131 | 1.1459 |
| No log | 1.0793 | 422 | 1.4282 | 0.7412 | 1.4282 | 1.1951 |
| No log | 1.0844 | 424 | 1.6065 | 0.6296 | 1.6065 | 1.2675 |
| No log | 1.0895 | 426 | 1.5965 | 0.6296 | 1.5965 | 1.2635 |
| No log | 1.0946 | 428 | 1.4905 | 0.5608 | 1.4905 | 1.2209 |
| No log | 1.0997 | 430 | 1.5453 | 0.5404 | 1.5453 | 1.2431 |
| No log | 1.1049 | 432 | 1.4746 | 0.5608 | 1.4746 | 1.2143 |
| No log | 1.1100 | 434 | 1.3179 | 0.6103 | 1.3179 | 1.1480 |
| No log | 1.1151 | 436 | 1.2484 | 0.5638 | 1.2484 | 1.1173 |
| No log | 1.1202 | 438 | 1.1734 | 0.6211 | 1.1734 | 1.0833 |
| No log | 1.1253 | 440 | 1.0710 | 0.7424 | 1.0710 | 1.0349 |
| No log | 1.1304 | 442 | 1.0806 | 0.7424 | 1.0806 | 1.0395 |
| No log | 1.1355 | 444 | 1.2556 | 0.7333 | 1.2556 | 1.1205 |
| No log | 1.1407 | 446 | 1.2766 | 0.6967 | 1.2766 | 1.1299 |
| No log | 1.1458 | 448 | 1.3521 | 0.6967 | 1.3521 | 1.1628 |
| No log | 1.1509 | 450 | 1.2611 | 0.6211 | 1.2611 | 1.1230 |
| No log | 1.1560 | 452 | 1.1527 | 0.6265 | 1.1527 | 1.0736 |
| No log | 1.1611 | 454 | 1.1408 | 0.6265 | 1.1408 | 1.0681 |
| No log | 1.1662 | 456 | 1.1167 | 0.6812 | 1.1167 | 1.0568 |
| No log | 1.1714 | 458 | 1.1761 | 0.7123 | 1.1761 | 1.0845 |
| No log | 1.1765 | 460 | 1.1650 | 0.7123 | 1.1650 | 1.0793 |
| No log | 1.1816 | 462 | 1.1737 | 0.7504 | 1.1737 | 1.0834 |
| No log | 1.1867 | 464 | 1.3572 | 0.7502 | 1.3572 | 1.1650 |
| No log | 1.1918 | 466 | 1.3875 | 0.7502 | 1.3875 | 1.1779 |
| No log | 1.1969 | 468 | 1.1708 | 0.7502 | 1.1708 | 1.0820 |
| No log | 1.2020 | 470 | 0.9057 | 0.6945 | 0.9057 | 0.9517 |
| No log | 1.2072 | 472 | 0.8584 | 0.6813 | 0.8584 | 0.9265 |
| No log | 1.2123 | 474 | 0.9388 | 0.6945 | 0.9388 | 0.9689 |
| No log | 1.2174 | 476 | 1.1480 | 0.7417 | 1.1480 | 1.0714 |
| No log | 1.2225 | 478 | 1.3034 | 0.7123 | 1.3034 | 1.1417 |
| No log | 1.2276 | 480 | 1.2237 | 0.7123 | 1.2237 | 1.1062 |
| No log | 1.2327 | 482 | 0.9916 | 0.7310 | 0.9916 | 0.9958 |
| No log | 1.2379 | 484 | 0.7953 | 0.7525 | 0.7953 | 0.8918 |
| No log | 1.2430 | 486 | 0.7772 | 0.7025 | 0.7772 | 0.8816 |
| No log | 1.2481 | 488 | 0.9794 | 0.7053 | 0.9794 | 0.9897 |
| No log | 1.2532 | 490 | 1.2996 | 0.6557 | 1.2996 | 1.1400 |
| No log | 1.2583 | 492 | 1.5185 | 0.6129 | 1.5185 | 1.2323 |
| No log | 1.2634 | 494 | 1.3329 | 0.6260 | 1.3329 | 1.1545 |
| No log | 1.2685 | 496 | 0.9421 | 0.7429 | 0.9421 | 0.9706 |
| No log | 1.2737 | 498 | 0.6690 | 0.7172 | 0.6691 | 0.8180 |
| 0.5075 | 1.2788 | 500 | 0.6050 | 0.7862 | 0.6050 | 0.7778 |
| 0.5075 | 1.2839 | 502 | 0.6545 | 0.7289 | 0.6545 | 0.8090 |
| 0.5075 | 1.2890 | 504 | 0.7535 | 0.7095 | 0.7535 | 0.8680 |
| 0.5075 | 1.2941 | 506 | 0.8276 | 0.7244 | 0.8276 | 0.9097 |
| 0.5075 | 1.2992 | 508 | 1.1053 | 0.7157 | 1.1053 | 1.0513 |
| 0.5075 | 1.3043 | 510 | 1.5103 | 0.6296 | 1.5103 | 1.2290 |
| 0.5075 | 1.3095 | 512 | 1.6041 | 0.6296 | 1.6041 | 1.2665 |
| 0.5075 | 1.3146 | 514 | 1.4010 | 0.6545 | 1.4010 | 1.1836 |
| 0.5075 | 1.3197 | 516 | 1.2154 | 0.7123 | 1.2154 | 1.1025 |
| 0.5075 | 1.3248 | 518 | 1.0488 | 0.6763 | 1.0488 | 1.0241 |
| 0.5075 | 1.3299 | 520 | 1.0842 | 0.6686 | 1.0842 | 1.0413 |
| 0.5075 | 1.3350 | 522 | 1.2987 | 0.7412 | 1.2987 | 1.1396 |
| 0.5075 | 1.3402 | 524 | 1.5701 | 0.6855 | 1.5701 | 1.2530 |
| 0.5075 | 1.3453 | 526 | 1.6355 | 0.6296 | 1.6355 | 1.2789 |
| 0.5075 | 1.3504 | 528 | 1.6912 | 0.5889 | 1.6912 | 1.3005 |
| 0.5075 | 1.3555 | 530 | 1.5611 | 0.6296 | 1.5611 | 1.2494 |
| 0.5075 | 1.3606 | 532 | 1.4316 | 0.6855 | 1.4316 | 1.1965 |
| 0.5075 | 1.3657 | 534 | 1.5025 | 0.6108 | 1.5025 | 1.2258 |
| 0.5075 | 1.3708 | 536 | 1.5531 | 0.5769 | 1.5531 | 1.2463 |
| 0.5075 | 1.3760 | 538 | 1.7146 | 0.5548 | 1.7146 | 1.3094 |
| 0.5075 | 1.3811 | 540 | 1.8542 | 0.5166 | 1.8542 | 1.3617 |
| 0.5075 | 1.3862 | 542 | 1.8613 | 0.5166 | 1.8612 | 1.3643 |
| 0.5075 | 1.3913 | 544 | 1.6929 | 0.6928 | 1.6929 | 1.3011 |
| 0.5075 | 1.3964 | 546 | 1.3209 | 0.6860 | 1.3209 | 1.1493 |
| 0.5075 | 1.4015 | 548 | 1.0328 | 0.7216 | 1.0328 | 1.0162 |
| 0.5075 | 1.4066 | 550 | 0.8930 | 0.7244 | 0.8930 | 0.9450 |
| 0.5075 | 1.4118 | 552 | 0.8647 | 0.7244 | 0.8647 | 0.9299 |
| 0.5075 | 1.4169 | 554 | 0.9753 | 0.7421 | 0.9753 | 0.9876 |
| 0.5075 | 1.4220 | 556 | 1.0389 | 0.7421 | 1.0389 | 1.0193 |
| 0.5075 | 1.4271 | 558 | 0.9714 | 0.7020 | 0.9714 | 0.9856 |
| 0.5075 | 1.4322 | 560 | 1.0200 | 0.7688 | 1.0200 | 1.0100 |
| 0.5075 | 1.4373 | 562 | 0.9233 | 0.7517 | 0.9233 | 0.9609 |
| 0.5075 | 1.4425 | 564 | 0.7317 | 0.7358 | 0.7317 | 0.8554 |
| 0.5075 | 1.4476 | 566 | 0.6643 | 0.7441 | 0.6643 | 0.8150 |
| 0.5075 | 1.4527 | 568 | 0.7304 | 0.7601 | 0.7304 | 0.8546 |
| 0.5075 | 1.4578 | 570 | 0.8961 | 0.7517 | 0.8961 | 0.9466 |
| 0.5075 | 1.4629 | 572 | 1.0690 | 0.7333 | 1.0690 | 1.0339 |
| 0.5075 | 1.4680 | 574 | 1.0145 | 0.7421 | 1.0145 | 1.0072 |
| 0.5075 | 1.4731 | 576 | 0.8286 | 0.7332 | 0.8286 | 0.9103 |
| 0.5075 | 1.4783 | 578 | 0.6859 | 0.7441 | 0.6859 | 0.8282 |
| 0.5075 | 1.4834 | 580 | 0.6679 | 0.7441 | 0.6679 | 0.8172 |
| 0.5075 | 1.4885 | 582 | 0.7510 | 0.7167 | 0.7510 | 0.8666 |
| 0.5075 | 1.4936 | 584 | 0.9088 | 0.7692 | 0.9088 | 0.9533 |
| 0.5075 | 1.4987 | 586 | 1.0046 | 0.7598 | 1.0046 | 1.0023 |
| 0.5075 | 1.5038 | 588 | 1.0991 | 0.7081 | 1.0991 | 1.0484 |
| 0.5075 | 1.5090 | 590 | 1.0170 | 0.7598 | 1.0170 | 1.0085 |
| 0.5075 | 1.5141 | 592 | 0.8817 | 0.7602 | 0.8817 | 0.9390 |
| 0.5075 | 1.5192 | 594 | 0.7671 | 0.7606 | 0.7671 | 0.8759 |
| 0.5075 | 1.5243 | 596 | 0.7125 | 0.7525 | 0.7125 | 0.8441 |
| 0.5075 | 1.5294 | 598 | 0.7283 | 0.7700 | 0.7283 | 0.8534 |
| 0.5075 | 1.5345 | 600 | 0.8319 | 0.7602 | 0.8319 | 0.9121 |
| 0.5075 | 1.5396 | 602 | 0.9640 | 0.7333 | 0.9640 | 0.9819 |
| 0.5075 | 1.5448 | 604 | 1.0797 | 0.7908 | 1.0797 | 1.0391 |
| 0.5075 | 1.5499 | 606 | 0.9749 | 0.7255 | 0.9749 | 0.9874 |
| 0.5075 | 1.5550 | 608 | 0.8725 | 0.7255 | 0.8725 | 0.9341 |
| 0.5075 | 1.5601 | 610 | 0.9486 | 0.7333 | 0.9486 | 0.9740 |
| 0.5075 | 1.5652 | 612 | 1.0052 | 0.7333 | 1.0052 | 1.0026 |
| 0.5075 | 1.5703 | 614 | 1.0609 | 0.7333 | 1.0609 | 1.0300 |
| 0.5075 | 1.5754 | 616 | 1.1211 | 0.7417 | 1.1211 | 1.0588 |
| 0.5075 | 1.5806 | 618 | 1.3273 | 0.7123 | 1.3273 | 1.1521 |
| 0.5075 | 1.5857 | 620 | 1.3902 | 0.6545 | 1.3902 | 1.1791 |
| 0.5075 | 1.5908 | 622 | 1.2558 | 0.7211 | 1.2558 | 1.1206 |
| 0.5075 | 1.5959 | 624 | 1.1108 | 0.7211 | 1.1108 | 1.0539 |
| 0.5075 | 1.6010 | 626 | 0.9813 | 0.7511 | 0.9813 | 0.9906 |
| 0.5075 | 1.6061 | 628 | 0.9265 | 0.7239 | 0.9265 | 0.9625 |
| 0.5075 | 1.6113 | 630 | 0.9584 | 0.7259 | 0.9584 | 0.9790 |
| 0.5075 | 1.6164 | 632 | 1.1368 | 0.7720 | 1.1368 | 1.0662 |
| 0.5075 | 1.6215 | 634 | 1.2595 | 0.7882 | 1.2595 | 1.1223 |
| 0.5075 | 1.6266 | 636 | 1.3397 | 0.7342 | 1.3397 | 1.1575 |
| 0.5075 | 1.6317 | 638 | 1.3930 | 0.6831 | 1.3930 | 1.1803 |
| 0.5075 | 1.6368 | 640 | 1.2310 | 0.6855 | 1.2310 | 1.1095 |
| 0.5075 | 1.6419 | 642 | 1.0432 | 0.6818 | 1.0432 | 1.0214 |
| 0.5075 | 1.6471 | 644 | 1.0380 | 0.7129 | 1.0380 | 1.0188 |
| 0.5075 | 1.6522 | 646 | 1.1224 | 0.7129 | 1.1224 | 1.0594 |
| 0.5075 | 1.6573 | 648 | 1.0584 | 0.7129 | 1.0584 | 1.0288 |
| 0.5075 | 1.6624 | 650 | 0.9872 | 0.7129 | 0.9872 | 0.9936 |
| 0.5075 | 1.6675 | 652 | 0.9018 | 0.7518 | 0.9018 | 0.9496 |
| 0.5075 | 1.6726 | 654 | 1.0147 | 0.7511 | 1.0147 | 1.0073 |
| 0.5075 | 1.6777 | 656 | 0.9905 | 0.7509 | 0.9905 | 0.9952 |
| 0.5075 | 1.6829 | 658 | 0.9473 | 0.7724 | 0.9473 | 0.9733 |
| 0.5075 | 1.6880 | 660 | 0.9219 | 0.7583 | 0.9219 | 0.9602 |
| 0.5075 | 1.6931 | 662 | 0.9036 | 0.7583 | 0.9036 | 0.9506 |
| 0.5075 | 1.6982 | 664 | 0.9355 | 0.7660 | 0.9355 | 0.9672 |
| 0.5075 | 1.7033 | 666 | 0.9677 | 0.7354 | 0.9677 | 0.9837 |
| 0.5075 | 1.7084 | 668 | 0.9230 | 0.7354 | 0.9230 | 0.9607 |
| 0.5075 | 1.7136 | 670 | 0.8581 | 0.7601 | 0.8581 | 0.9263 |
| 0.5075 | 1.7187 | 672 | 0.8767 | 0.7601 | 0.8767 | 0.9363 |
| 0.5075 | 1.7238 | 674 | 0.8683 | 0.7522 | 0.8683 | 0.9318 |
| 0.5075 | 1.7289 | 676 | 0.9119 | 0.7759 | 0.9119 | 0.9549 |
| 0.5075 | 1.7340 | 678 | 0.8048 | 0.7522 | 0.8048 | 0.8971 |
| 0.5075 | 1.7391 | 680 | 0.7859 | 0.7882 | 0.7859 | 0.8865 |
| 0.5075 | 1.7442 | 682 | 0.8602 | 0.7354 | 0.8602 | 0.9275 |
| 0.5075 | 1.7494 | 684 | 0.9239 | 0.7354 | 0.9239 | 0.9612 |
| 0.5075 | 1.7545 | 686 | 1.0329 | 0.7259 | 1.0329 | 1.0163 |
| 0.5075 | 1.7596 | 688 | 1.1704 | 0.6797 | 1.1704 | 1.0819 |
| 0.5075 | 1.7647 | 690 | 1.1014 | 0.6866 | 1.1014 | 1.0495 |
| 0.5075 | 1.7698 | 692 | 0.8593 | 0.7437 | 0.8593 | 0.9270 |
| 0.5075 | 1.7749 | 694 | 0.7551 | 0.7529 | 0.7551 | 0.8690 |
| 0.5075 | 1.7801 | 696 | 0.7765 | 0.7100 | 0.7765 | 0.8812 |
| 0.5075 | 1.7852 | 698 | 0.9332 | 0.7354 | 0.9332 | 0.9660 |
| 0.5075 | 1.7903 | 700 | 0.9644 | 0.7019 | 0.9644 | 0.9821 |
| 0.5075 | 1.7954 | 702 | 0.9534 | 0.7277 | 0.9534 | 0.9764 |
| 0.5075 | 1.8005 | 704 | 0.8824 | 0.7208 | 0.8824 | 0.9394 |
| 0.5075 | 1.8056 | 706 | 0.8378 | 0.7208 | 0.8378 | 0.9153 |
| 0.5075 | 1.8107 | 708 | 0.8983 | 0.7208 | 0.8983 | 0.9478 |
| 0.5075 | 1.8159 | 710 | 0.8336 | 0.7208 | 0.8336 | 0.9130 |
| 0.5075 | 1.8210 | 712 | 0.8422 | 0.7208 | 0.8422 | 0.9177 |
| 0.5075 | 1.8261 | 714 | 0.7818 | 0.7208 | 0.7818 | 0.8842 |
| 0.5075 | 1.8312 | 716 | 0.7732 | 0.7281 | 0.7732 | 0.8793 |
| 0.5075 | 1.8363 | 718 | 0.8908 | 0.7208 | 0.8908 | 0.9438 |
| 0.5075 | 1.8414 | 720 | 0.8871 | 0.7208 | 0.8871 | 0.9419 |
| 0.5075 | 1.8465 | 722 | 0.7572 | 0.7281 | 0.7572 | 0.8702 |
| 0.5075 | 1.8517 | 724 | 0.6228 | 0.7786 | 0.6228 | 0.7891 |
| 0.5075 | 1.8568 | 726 | 0.6091 | 0.7786 | 0.6091 | 0.7804 |
| 0.5075 | 1.8619 | 728 | 0.7122 | 0.7363 | 0.7122 | 0.8439 |
| 0.5075 | 1.8670 | 730 | 0.9815 | 0.7255 | 0.9815 | 0.9907 |
| 0.5075 | 1.8721 | 732 | 1.3285 | 0.6841 | 1.3285 | 1.1526 |
| 0.5075 | 1.8772 | 734 | 1.4493 | 0.7119 | 1.4493 | 1.2039 |
| 0.5075 | 1.8824 | 736 | 1.3106 | 0.6841 | 1.3106 | 1.1448 |
| 0.5075 | 1.8875 | 738 | 1.0109 | 0.7421 | 1.0109 | 1.0054 |
| 0.5075 | 1.8926 | 740 | 0.8258 | 0.7902 | 0.8258 | 0.9087 |
| 0.5075 | 1.8977 | 742 | 0.7337 | 0.8164 | 0.7337 | 0.8565 |
| 0.5075 | 1.9028 | 744 | 0.6861 | 0.8185 | 0.6861 | 0.8283 |
| 0.5075 | 1.9079 | 746 | 0.7383 | 0.7675 | 0.7383 | 0.8593 |
| 0.5075 | 1.9130 | 748 | 0.8563 | 0.7593 | 0.8563 | 0.9254 |
| 0.5075 | 1.9182 | 750 | 1.0207 | 0.7514 | 1.0207 | 1.0103 |
| 0.5075 | 1.9233 | 752 | 0.9610 | 0.7514 | 0.9610 | 0.9803 |
| 0.5075 | 1.9284 | 754 | 0.8452 | 0.7948 | 0.8452 | 0.9193 |
| 0.5075 | 1.9335 | 756 | 0.7150 | 0.7955 | 0.7150 | 0.8456 |
| 0.5075 | 1.9386 | 758 | 0.6680 | 0.7786 | 0.6680 | 0.8173 |
| 0.5075 | 1.9437 | 760 | 0.7380 | 0.7852 | 0.7380 | 0.8590 |
| 0.5075 | 1.9488 | 762 | 0.9634 | 0.7514 | 0.9634 | 0.9816 |
| 0.5075 | 1.9540 | 764 | 1.0908 | 0.6738 | 1.0908 | 1.0444 |
| 0.5075 | 1.9591 | 766 | 1.0150 | 0.7514 | 1.0150 | 1.0075 |
| 0.5075 | 1.9642 | 768 | 0.8832 | 0.7602 | 0.8832 | 0.9398 |
| 0.5075 | 1.9693 | 770 | 0.8007 | 0.7906 | 0.8007 | 0.8948 |
| 0.5075 | 1.9744 | 772 | 0.7222 | 0.7986 | 0.7222 | 0.8498 |
| 0.5075 | 1.9795 | 774 | 0.6887 | 0.7986 | 0.6887 | 0.8299 |
| 0.5075 | 1.9847 | 776 | 0.7453 | 0.7700 | 0.7453 | 0.8633 |
| 0.5075 | 1.9898 | 778 | 0.8388 | 0.7756 | 0.8388 | 0.9158 |
| 0.5075 | 1.9949 | 780 | 1.0073 | 0.7014 | 1.0073 | 1.0036 |
| 0.5075 | 2.0000 | 782 | 0.9634 | 0.7081 | 0.9634 | 0.9816 |
| 0.5075 | 2.0051 | 784 | 0.7917 | 0.7430 | 0.7917 | 0.8898 |
| 0.5075 | 2.0102 | 786 | 0.6532 | 0.8171 | 0.6532 | 0.8082 |
| 0.5075 | 2.0153 | 788 | 0.6354 | 0.7990 | 0.6354 | 0.7971 |
| 0.5075 | 2.0205 | 790 | 0.7059 | 0.7910 | 0.7059 | 0.8402 |
| 0.5075 | 2.0256 | 792 | 0.9409 | 0.7421 | 0.9409 | 0.9700 |
| 0.5075 | 2.0307 | 794 | 1.1616 | 0.7502 | 1.1616 | 1.0778 |
| 0.5075 | 2.0358 | 796 | 1.1253 | 0.7944 | 1.1253 | 1.0608 |
| 0.5075 | 2.0409 | 798 | 0.9609 | 0.7947 | 0.9609 | 0.9802 |
| 0.5075 | 2.0460 | 800 | 0.8319 | 0.7259 | 0.8319 | 0.9121 |
| 0.5075 | 2.0512 | 802 | 0.8681 | 0.7244 | 0.8681 | 0.9317 |
| 0.5075 | 2.0563 | 804 | 0.9234 | 0.7337 | 0.9234 | 0.9609 |
| 0.5075 | 2.0614 | 806 | 0.8479 | 0.7522 | 0.8479 | 0.9208 |
| 0.5075 | 2.0665 | 808 | 0.8113 | 0.7519 | 0.8113 | 0.9007 |
| 0.5075 | 2.0716 | 810 | 0.8768 | 0.7849 | 0.8768 | 0.9364 |
| 0.5075 | 2.0767 | 812 | 0.8101 | 0.7716 | 0.8101 | 0.9001 |
| 0.5075 | 2.0818 | 814 | 0.8197 | 0.7445 | 0.8197 | 0.9054 |
| 0.5075 | 2.0870 | 816 | 0.8325 | 0.7593 | 0.8325 | 0.9124 |
| 0.5075 | 2.0921 | 818 | 0.9011 | 0.7759 | 0.9011 | 0.9493 |
| 0.5075 | 2.0972 | 820 | 1.0210 | 0.6866 | 1.0210 | 1.0104 |
| 0.5075 | 2.1023 | 822 | 0.9564 | 0.6802 | 0.9564 | 0.9779 |
| 0.5075 | 2.1074 | 824 | 0.8065 | 0.8293 | 0.8065 | 0.8981 |
| 0.5075 | 2.1125 | 826 | 0.7455 | 0.8123 | 0.7455 | 0.8634 |
| 0.5075 | 2.1176 | 828 | 0.8014 | 0.7852 | 0.8014 | 0.8952 |
| 0.5075 | 2.1228 | 830 | 0.8530 | 0.7518 | 0.8530 | 0.9236 |
| 0.5075 | 2.1279 | 832 | 1.0189 | 0.6738 | 1.0189 | 1.0094 |
| 0.5075 | 2.1330 | 834 | 1.0800 | 0.6797 | 1.0800 | 1.0392 |
| 0.5075 | 2.1381 | 836 | 0.9920 | 0.6866 | 0.9920 | 0.9960 |
| 0.5075 | 2.1432 | 838 | 0.7895 | 0.7518 | 0.7895 | 0.8885 |
| 0.5075 | 2.1483 | 840 | 0.6395 | 0.7886 | 0.6395 | 0.7997 |
| 0.5075 | 2.1535 | 842 | 0.6168 | 0.7789 | 0.6168 | 0.7853 |
| 0.5075 | 2.1586 | 844 | 0.6903 | 0.8295 | 0.6903 | 0.8309 |
| 0.5075 | 2.1637 | 846 | 0.8306 | 0.7518 | 0.8306 | 0.9114 |
| 0.5075 | 2.1688 | 848 | 0.9566 | 0.7337 | 0.9566 | 0.9780 |
| 0.5075 | 2.1739 | 850 | 0.9045 | 0.7337 | 0.9045 | 0.9511 |
| 0.5075 | 2.1790 | 852 | 0.9188 | 0.7421 | 0.9188 | 0.9585 |
| 0.5075 | 2.1841 | 854 | 1.0379 | 0.7511 | 1.0379 | 1.0188 |
| 0.5075 | 2.1893 | 856 | 1.0469 | 0.7511 | 1.0469 | 1.0232 |
| 0.5075 | 2.1944 | 858 | 1.1013 | 0.7511 | 1.1013 | 1.0494 |
| 0.5075 | 2.1995 | 860 | 1.0348 | 0.7511 | 1.0348 | 1.0173 |
| 0.5075 | 2.2046 | 862 | 0.8443 | 0.7337 | 0.8443 | 0.9189 |
| 0.5075 | 2.2097 | 864 | 0.7072 | 0.7346 | 0.7072 | 0.8409 |
| 0.5075 | 2.2148 | 866 | 0.7338 | 0.7346 | 0.7338 | 0.8566 |
| 0.5075 | 2.2199 | 868 | 0.8855 | 0.7421 | 0.8855 | 0.9410 |
| 0.5075 | 2.2251 | 870 | 1.0990 | 0.7417 | 1.0990 | 1.0483 |
| 0.5075 | 2.2302 | 872 | 1.2641 | 0.6860 | 1.2641 | 1.1243 |
| 0.5075 | 2.2353 | 874 | 1.2085 | 0.6860 | 1.2085 | 1.0993 |
| 0.5075 | 2.2404 | 876 | 1.0439 | 0.7511 | 1.0439 | 1.0217 |
| 0.5075 | 2.2455 | 878 | 0.9690 | 0.7511 | 0.9690 | 0.9844 |
| 0.5075 | 2.2506 | 880 | 0.8315 | 0.7729 | 0.8315 | 0.9119 |
| 0.5075 | 2.2558 | 882 | 0.7510 | 0.7623 | 0.7510 | 0.8666 |
| 0.5075 | 2.2609 | 884 | 0.7671 | 0.7902 | 0.7671 | 0.8759 |
| 0.5075 | 2.2660 | 886 | 0.8357 | 0.7337 | 0.8357 | 0.9142 |
| 0.5075 | 2.2711 | 888 | 0.9812 | 0.7333 | 0.9812 | 0.9906 |
| 0.5075 | 2.2762 | 890 | 0.9855 | 0.7333 | 0.9855 | 0.9927 |
| 0.5075 | 2.2813 | 892 | 0.9255 | 0.7255 | 0.9255 | 0.9620 |
| 0.5075 | 2.2864 | 894 | 0.7733 | 0.7759 | 0.7733 | 0.8794 |
| 0.5075 | 2.2916 | 896 | 0.7467 | 0.7852 | 0.7467 | 0.8641 |
| 0.5075 | 2.2967 | 898 | 0.8576 | 0.7259 | 0.8576 | 0.9261 |
| 0.5075 | 2.3018 | 900 | 0.9894 | 0.7255 | 0.9894 | 0.9947 |
| 0.5075 | 2.3069 | 902 | 1.0213 | 0.7255 | 1.0213 | 1.0106 |
| 0.5075 | 2.3120 | 904 | 0.9987 | 0.7255 | 0.9987 | 0.9994 |
| 0.5075 | 2.3171 | 906 | 0.9179 | 0.7337 | 0.9179 | 0.9581 |
| 0.5075 | 2.3223 | 908 | 0.8079 | 0.7342 | 0.8079 | 0.8989 |
| 0.5075 | 2.3274 | 910 | 0.7996 | 0.7342 | 0.7996 | 0.8942 |
| 0.5075 | 2.3325 | 912 | 0.8753 | 0.7259 | 0.8753 | 0.9356 |
| 0.5075 | 2.3376 | 914 | 1.0428 | 0.7255 | 1.0428 | 1.0212 |
| 0.5075 | 2.3427 | 916 | 1.2219 | 0.6797 | 1.2219 | 1.1054 |
| 0.5075 | 2.3478 | 918 | 1.2670 | 0.6260 | 1.2670 | 1.1256 |
| 0.5075 | 2.3529 | 920 | 1.1984 | 0.6303 | 1.1984 | 1.0947 |
| 0.5075 | 2.3581 | 922 | 1.0727 | 0.7511 | 1.0727 | 1.0357 |
| 0.5075 | 2.3632 | 924 | 1.0132 | 0.7511 | 1.0132 | 1.0066 |
| 0.5075 | 2.3683 | 926 | 1.0931 | 0.6934 | 1.0931 | 1.0455 |
| 0.5075 | 2.3734 | 928 | 1.2974 | 0.6303 | 1.2974 | 1.1390 |
| 0.5075 | 2.3785 | 930 | 1.3794 | 0.6303 | 1.3794 | 1.1745 |
| 0.5075 | 2.3836 | 932 | 1.2617 | 0.6303 | 1.2617 | 1.1233 |
| 0.5075 | 2.3887 | 934 | 1.1074 | 0.6797 | 1.1074 | 1.0523 |
| 0.5075 | 2.3939 | 936 | 1.1169 | 0.6797 | 1.1169 | 1.0568 |
| 0.5075 | 2.3990 | 938 | 1.1822 | 0.6797 | 1.1822 | 1.0873 |
| 0.5075 | 2.4041 | 940 | 1.1475 | 0.6797 | 1.1475 | 1.0712 |
| 0.5075 | 2.4092 | 942 | 1.0258 | 0.6738 | 1.0258 | 1.0128 |
| 0.5075 | 2.4143 | 944 | 0.8731 | 0.7346 | 0.8731 | 0.9344 |
| 0.5075 | 2.4194 | 946 | 0.7770 | 0.7610 | 0.7770 | 0.8815 |
| 0.5075 | 2.4246 | 948 | 0.7857 | 0.7610 | 0.7857 | 0.8864 |
| 0.5075 | 2.4297 | 950 | 0.8307 | 0.7522 | 0.8307 | 0.9114 |
| 0.5075 | 2.4348 | 952 | 0.8700 | 0.7672 | 0.8700 | 0.9327 |
| 0.5075 | 2.4399 | 954 | 0.8679 | 0.7672 | 0.8679 | 0.9316 |
| 0.5075 | 2.4450 | 956 | 0.8410 | 0.7672 | 0.8410 | 0.9171 |
| 0.5075 | 2.4501 | 958 | 0.8271 | 0.7610 | 0.8271 | 0.9094 |
| 0.5075 | 2.4552 | 960 | 0.8548 | 0.7852 | 0.8548 | 0.9246 |
| 0.5075 | 2.4604 | 962 | 0.8927 | 0.7672 | 0.8927 | 0.9448 |
| 0.5075 | 2.4655 | 964 | 0.8678 | 0.7672 | 0.8678 | 0.9316 |
| 0.5075 | 2.4706 | 966 | 0.8360 | 0.7672 | 0.8360 | 0.9143 |
| 0.5075 | 2.4757 | 968 | 0.8320 | 0.7759 | 0.8320 | 0.9121 |
| 0.5075 | 2.4808 | 970 | 0.8198 | 0.7700 | 0.8198 | 0.9055 |
| 0.5075 | 2.4859 | 972 | 0.7304 | 0.7610 | 0.7304 | 0.8546 |
| 0.5075 | 2.4910 | 974 | 0.6352 | 0.7529 | 0.6352 | 0.7970 |
| 0.5075 | 2.4962 | 976 | 0.6208 | 0.7529 | 0.6208 | 0.7879 |
| 0.5075 | 2.5013 | 978 | 0.7065 | 0.7525 | 0.7065 | 0.8405 |
| 0.5075 | 2.5064 | 980 | 0.8503 | 0.7255 | 0.8503 | 0.9221 |
| 0.5075 | 2.5115 | 982 | 0.9419 | 0.7333 | 0.9419 | 0.9705 |
| 0.5075 | 2.5166 | 984 | 0.9286 | 0.7333 | 0.9286 | 0.9637 |
| 0.5075 | 2.5217 | 986 | 0.8852 | 0.7333 | 0.8852 | 0.9408 |
| 0.5075 | 2.5269 | 988 | 0.8849 | 0.7333 | 0.8849 | 0.9407 |
| 0.5075 | 2.5320 | 990 | 0.8290 | 0.7255 | 0.8290 | 0.9105 |
| 0.5075 | 2.5371 | 992 | 0.7753 | 0.7350 | 0.7753 | 0.8805 |
| 0.5075 | 2.5422 | 994 | 0.7986 | 0.7086 | 0.7986 | 0.8937 |
| 0.5075 | 2.5473 | 996 | 0.9149 | 0.7333 | 0.9149 | 0.9565 |
| 0.5075 | 2.5524 | 998 | 0.8668 | 0.7598 | 0.8668 | 0.9310 |
| 0.1762 | 2.5575 | 1000 | 0.7745 | 0.7597 | 0.7745 | 0.8800 |
| 0.1762 | 2.5627 | 1002 | 0.6135 | 0.8022 | 0.6135 | 0.7832 |
| 0.1762 | 2.5678 | 1004 | 0.5577 | 0.7773 | 0.5577 | 0.7468 |
| 0.1762 | 2.5729 | 1006 | 0.5899 | 0.7603 | 0.5899 | 0.7681 |
| 0.1762 | 2.5780 | 1008 | 0.7404 | 0.7445 | 0.7404 | 0.8605 |
| 0.1762 | 2.5831 | 1010 | 0.8310 | 0.7593 | 0.8310 | 0.9116 |
| 0.1762 | 2.5882 | 1012 | 0.7640 | 0.7593 | 0.7640 | 0.8741 |
| 0.1762 | 2.5934 | 1014 | 0.6595 | 0.7522 | 0.6595 | 0.8121 |
| 0.1762 | 2.5985 | 1016 | 0.6780 | 0.7601 | 0.6780 | 0.8234 |
| 0.1762 | 2.6036 | 1018 | 0.8036 | 0.7849 | 0.8036 | 0.8965 |
| 0.1762 | 2.6087 | 1020 | 0.9673 | 0.7594 | 0.9673 | 0.9835 |
| 0.1762 | 2.6138 | 1022 | 1.1890 | 0.7586 | 1.1890 | 1.0904 |
| 0.1762 | 2.6189 | 1024 | 1.2370 | 0.7882 | 1.2370 | 1.1122 |
| 0.1762 | 2.6240 | 1026 | 1.0680 | 0.7586 | 1.0680 | 1.0334 |
| 0.1762 | 2.6292 | 1028 | 0.7920 | 0.7852 | 0.7920 | 0.8899 |
| 0.1762 | 2.6343 | 1030 | 0.6052 | 0.7689 | 0.6052 | 0.7779 |
| 0.1762 | 2.6394 | 1032 | 0.5520 | 0.7367 | 0.5520 | 0.7430 |
| 0.1762 | 2.6445 | 1034 | 0.5925 | 0.7766 | 0.5925 | 0.7697 |
| 0.1762 | 2.6496 | 1036 | 0.7418 | 0.7601 | 0.7418 | 0.8613 |
| 0.1762 | 2.6547 | 1038 | 1.0496 | 0.7882 | 1.0496 | 1.0245 |
| 0.1762 | 2.6598 | 1040 | 1.2622 | 0.8144 | 1.2622 | 1.1235 |
| 0.1762 | 2.6650 | 1042 | 1.2282 | 0.7882 | 1.2282 | 1.1083 |
| 0.1762 | 2.6701 | 1044 | 1.0325 | 0.7908 | 1.0325 | 1.0161 |
| 0.1762 | 2.6752 | 1046 | 0.9060 | 0.7421 | 0.9060 | 0.9518 |
| 0.1762 | 2.6803 | 1048 | 0.8154 | 0.7516 | 0.8154 | 0.9030 |
| 0.1762 | 2.6854 | 1050 | 0.7909 | 0.7433 | 0.7909 | 0.8893 |
| 0.1762 | 2.6905 | 1052 | 0.7434 | 0.7601 | 0.7434 | 0.8622 |
| 0.1762 | 2.6957 | 1054 | 0.6838 | 0.7601 | 0.6838 | 0.8269 |
| 0.1762 | 2.7008 | 1056 | 0.7232 | 0.7601 | 0.7232 | 0.8504 |
| 0.1762 | 2.7059 | 1058 | 0.7077 | 0.7522 | 0.7077 | 0.8413 |
| 0.1762 | 2.7110 | 1060 | 0.6500 | 0.7605 | 0.6500 | 0.8062 |
| 0.1762 | 2.7161 | 1062 | 0.6557 | 0.7605 | 0.6557 | 0.8097 |
| 0.1762 | 2.7212 | 1064 | 0.7320 | 0.7522 | 0.7320 | 0.8556 |
| 0.1762 | 2.7263 | 1066 | 0.8764 | 0.7672 | 0.8764 | 0.9361 |
| 0.1762 | 2.7315 | 1068 | 0.9354 | 0.7953 | 0.9354 | 0.9672 |
| 0.1762 | 2.7366 | 1070 | 0.8874 | 0.7435 | 0.8874 | 0.9420 |
| 0.1762 | 2.7417 | 1072 | 0.8196 | 0.7759 | 0.8196 | 0.9053 |
| 0.1762 | 2.7468 | 1074 | 0.7181 | 0.7689 | 0.7181 | 0.8474 |
| 0.1762 | 2.7519 | 1076 | 0.7195 | 0.7689 | 0.7195 | 0.8482 |
| 0.1762 | 2.7570 | 1078 | 0.7736 | 0.7852 | 0.7736 | 0.8796 |
| 0.1762 | 2.7621 | 1080 | 0.8289 | 0.7435 | 0.8289 | 0.9104 |
| 0.1762 | 2.7673 | 1082 | 0.8751 | 0.7435 | 0.8751 | 0.9354 |
| 0.1762 | 2.7724 | 1084 | 0.9846 | 0.7255 | 0.9846 | 0.9922 |
| 0.1762 | 2.7775 | 1086 | 1.1573 | 0.7342 | 1.1573 | 1.0758 |
| 0.1762 | 2.7826 | 1088 | 1.1705 | 0.7342 | 1.1705 | 1.0819 |
| 0.1762 | 2.7877 | 1090 | 1.0142 | 0.7255 | 1.0142 | 1.0071 |
| 0.1762 | 2.7928 | 1092 | 0.7830 | 0.7606 | 0.7830 | 0.8848 |
| 0.1762 | 2.7980 | 1094 | 0.6257 | 0.7786 | 0.6257 | 0.7910 |
| 0.1762 | 2.8031 | 1096 | 0.5682 | 0.7786 | 0.5682 | 0.7538 |
| 0.1762 | 2.8082 | 1098 | 0.5395 | 0.7696 | 0.5395 | 0.7345 |
| 0.1762 | 2.8133 | 1100 | 0.5744 | 0.7696 | 0.5744 | 0.7579 |
| 0.1762 | 2.8184 | 1102 | 0.6883 | 0.7605 | 0.6883 | 0.8297 |
| 0.1762 | 2.8235 | 1104 | 0.8674 | 0.7672 | 0.8674 | 0.9313 |
| 0.1762 | 2.8286 | 1106 | 1.0343 | 0.7514 | 1.0343 | 1.0170 |
| 0.1762 | 2.8338 | 1108 | 1.0090 | 0.7514 | 1.0090 | 1.0045 |
| 0.1762 | 2.8389 | 1110 | 0.9103 | 0.7696 | 0.9103 | 0.9541 |
| 0.1762 | 2.8440 | 1112 | 0.7739 | 0.7430 | 0.7739 | 0.8797 |
| 0.1762 | 2.8491 | 1114 | 0.6572 | 0.7882 | 0.6572 | 0.8107 |
| 0.1762 | 2.8542 | 1116 | 0.6428 | 0.7882 | 0.6428 | 0.8018 |
| 0.1762 | 2.8593 | 1118 | 0.6597 | 0.7882 | 0.6597 | 0.8122 |
| 0.1762 | 2.8645 | 1120 | 0.7645 | 0.7437 | 0.7645 | 0.8744 |
| 0.1762 | 2.8696 | 1122 | 0.8515 | 0.7437 | 0.8515 | 0.9228 |
| 0.1762 | 2.8747 | 1124 | 0.8358 | 0.7724 | 0.8358 | 0.9142 |
| 0.1762 | 2.8798 | 1126 | 0.7281 | 0.7605 | 0.7281 | 0.8533 |
| 0.1762 | 2.8849 | 1128 | 0.6854 | 0.7605 | 0.6854 | 0.8279 |
| 0.1762 | 2.8900 | 1130 | 0.6179 | 0.7605 | 0.6179 | 0.7861 |
| 0.1762 | 2.8951 | 1132 | 0.6147 | 0.7605 | 0.6147 | 0.7840 |
| 0.1762 | 2.9003 | 1134 | 0.6183 | 0.7752 | 0.6183 | 0.7863 |
| 0.1762 | 2.9054 | 1136 | 0.6226 | 0.7605 | 0.6226 | 0.7891 |
| 0.1762 | 2.9105 | 1138 | 0.6341 | 0.7605 | 0.6341 | 0.7963 |
| 0.1762 | 2.9156 | 1140 | 0.6974 | 0.7522 | 0.6974 | 0.8351 |
| 0.1762 | 2.9207 | 1142 | 0.7980 | 0.7675 | 0.7980 | 0.8933 |
| 0.1762 | 2.9258 | 1144 | 0.7815 | 0.7675 | 0.7815 | 0.8840 |
| 0.1762 | 2.9309 | 1146 | 0.7238 | 0.8057 | 0.7238 | 0.8508 |
| 0.1762 | 2.9361 | 1148 | 0.6517 | 0.7786 | 0.6517 | 0.8073 |
| 0.1762 | 2.9412 | 1150 | 0.6555 | 0.7882 | 0.6555 | 0.8096 |
| 0.1762 | 2.9463 | 1152 | 0.6543 | 0.7882 | 0.6543 | 0.8089 |
| 0.1762 | 2.9514 | 1154 | 0.6846 | 0.8057 | 0.6846 | 0.8274 |
| 0.1762 | 2.9565 | 1156 | 0.7083 | 0.8057 | 0.7083 | 0.8416 |
| 0.1762 | 2.9616 | 1158 | 0.7207 | 0.7526 | 0.7207 | 0.8489 |
| 0.1762 | 2.9668 | 1160 | 0.7315 | 0.7852 | 0.7315 | 0.8553 |
| 0.1762 | 2.9719 | 1162 | 0.7724 | 0.7663 | 0.7724 | 0.8789 |
| 0.1762 | 2.9770 | 1164 | 0.7468 | 0.7663 | 0.7468 | 0.8642 |
| 0.1762 | 2.9821 | 1166 | 0.6953 | 0.7675 | 0.6953 | 0.8339 |
| 0.1762 | 2.9872 | 1168 | 0.7460 | 0.7675 | 0.7460 | 0.8637 |
| 0.1762 | 2.9923 | 1170 | 0.8607 | 0.7518 | 0.8607 | 0.9277 |
| 0.1762 | 2.9974 | 1172 | 0.8725 | 0.7518 | 0.8725 | 0.9341 |
| 0.1762 | 3.0026 | 1174 | 0.8641 | 0.7518 | 0.8641 | 0.9296 |
| 0.1762 | 3.0077 | 1176 | 0.8186 | 0.7606 | 0.8186 | 0.9047 |
| 0.1762 | 3.0128 | 1178 | 0.8826 | 0.7342 | 0.8826 | 0.9395 |
| 0.1762 | 3.0179 | 1180 | 1.0208 | 0.7421 | 1.0208 | 1.0104 |
| 0.1762 | 3.0230 | 1182 | 1.1625 | 0.6860 | 1.1625 | 1.0782 |
| 0.1762 | 3.0281 | 1184 | 1.1418 | 0.6860 | 1.1418 | 1.0686 |
| 0.1762 | 3.0332 | 1186 | 1.0299 | 0.7333 | 1.0299 | 1.0149 |
| 0.1762 | 3.0384 | 1188 | 1.0102 | 0.7333 | 1.0102 | 1.0051 |
| 0.1762 | 3.0435 | 1190 | 0.9015 | 0.7337 | 0.9015 | 0.9495 |
| 0.1762 | 3.0486 | 1192 | 0.8093 | 0.7063 | 0.8093 | 0.8996 |
| 0.1762 | 3.0537 | 1194 | 0.8355 | 0.7342 | 0.8355 | 0.9141 |
| 0.1762 | 3.0588 | 1196 | 0.9180 | 0.7426 | 0.9180 | 0.9581 |
| 0.1762 | 3.0639 | 1198 | 0.8930 | 0.7221 | 0.8930 | 0.9450 |
| 0.1762 | 3.0691 | 1200 | 0.8618 | 0.7729 | 0.8618 | 0.9283 |
| 0.1762 | 3.0742 | 1202 | 0.8570 | 0.7729 | 0.8570 | 0.9258 |
| 0.1762 | 3.0793 | 1204 | 0.8210 | 0.7729 | 0.8210 | 0.9061 |
| 0.1762 | 3.0844 | 1206 | 0.7620 | 0.7525 | 0.7620 | 0.8729 |
| 0.1762 | 3.0895 | 1208 | 0.7238 | 0.7525 | 0.7238 | 0.8507 |
| 0.1762 | 3.0946 | 1210 | 0.7358 | 0.7801 | 0.7358 | 0.8578 |
| 0.1762 | 3.0997 | 1212 | 0.8017 | 0.6987 | 0.8017 | 0.8954 |
| 0.1762 | 3.1049 | 1214 | 0.8454 | 0.7182 | 0.8454 | 0.9195 |
| 0.1762 | 3.1100 | 1216 | 0.9564 | 0.7255 | 0.9564 | 0.9780 |
| 0.1762 | 3.1151 | 1218 | 0.8981 | 0.7182 | 0.8981 | 0.9477 |
| 0.1762 | 3.1202 | 1220 | 0.7293 | 0.7522 | 0.7293 | 0.8540 |
| 0.1762 | 3.1253 | 1222 | 0.6318 | 0.7358 | 0.6318 | 0.7949 |
| 0.1762 | 3.1304 | 1224 | 0.6443 | 0.7522 | 0.6443 | 0.8027 |
| 0.1762 | 3.1355 | 1226 | 0.6319 | 0.7358 | 0.6319 | 0.7949 |
| 0.1762 | 3.1407 | 1228 | 0.6628 | 0.7522 | 0.6628 | 0.8141 |
| 0.1762 | 3.1458 | 1230 | 0.7275 | 0.7672 | 0.7275 | 0.8529 |
| 0.1762 | 3.1509 | 1232 | 0.8678 | 0.7672 | 0.8678 | 0.9316 |
| 0.1762 | 3.1560 | 1234 | 0.9162 | 0.7891 | 0.9162 | 0.9572 |
| 0.1762 | 3.1611 | 1236 | 0.8168 | 0.7672 | 0.8168 | 0.9037 |
| 0.1762 | 3.1662 | 1238 | 0.6908 | 0.7593 | 0.6908 | 0.8312 |
| 0.1762 | 3.1714 | 1240 | 0.5865 | 0.7786 | 0.5865 | 0.7658 |
| 0.1762 | 3.1765 | 1242 | 0.5540 | 0.7862 | 0.5540 | 0.7443 |
| 0.1762 | 3.1816 | 1244 | 0.5885 | 0.7786 | 0.5885 | 0.7672 |
| 0.1762 | 3.1867 | 1246 | 0.6732 | 0.7692 | 0.6732 | 0.8205 |
| 0.1762 | 3.1918 | 1248 | 0.7324 | 0.7518 | 0.7324 | 0.8558 |
| 0.1762 | 3.1969 | 1250 | 0.8167 | 0.7672 | 0.8167 | 0.9037 |
| 0.1762 | 3.2020 | 1252 | 0.8068 | 0.7672 | 0.8068 | 0.8982 |
| 0.1762 | 3.2072 | 1254 | 0.7380 | 0.7518 | 0.7380 | 0.8591 |
| 0.1762 | 3.2123 | 1256 | 0.6137 | 0.7786 | 0.6137 | 0.7834 |
| 0.1762 | 3.2174 | 1258 | 0.5258 | 0.7862 | 0.5258 | 0.7251 |
| 0.1762 | 3.2225 | 1260 | 0.5037 | 0.7933 | 0.5037 | 0.7097 |
| 0.1762 | 3.2276 | 1262 | 0.5186 | 0.7933 | 0.5186 | 0.7201 |
| 0.1762 | 3.2327 | 1264 | 0.5850 | 0.7608 | 0.5850 | 0.7649 |
| 0.1762 | 3.2379 | 1266 | 0.7252 | 0.7445 | 0.7252 | 0.8516 |
| 0.1762 | 3.2430 | 1268 | 0.8301 | 0.7445 | 0.8301 | 0.9111 |
| 0.1762 | 3.2481 | 1270 | 0.8080 | 0.7445 | 0.8080 | 0.8989 |
| 0.1762 | 3.2532 | 1272 | 0.6896 | 0.7924 | 0.6896 | 0.8304 |
| 0.1762 | 3.2583 | 1274 | 0.6581 | 0.7786 | 0.6581 | 0.8112 |
| 0.1762 | 3.2634 | 1276 | 0.7478 | 0.8120 | 0.7478 | 0.8648 |
| 0.1762 | 3.2685 | 1278 | 0.8356 | 0.8051 | 0.8356 | 0.9141 |
| 0.1762 | 3.2737 | 1280 | 0.9472 | 0.7255 | 0.9472 | 0.9732 |
| 0.1762 | 3.2788 | 1282 | 0.9352 | 0.7255 | 0.9352 | 0.9670 |
| 0.1762 | 3.2839 | 1284 | 0.8363 | 0.7139 | 0.8363 | 0.9145 |
| 0.1762 | 3.2890 | 1286 | 0.7191 | 0.7337 | 0.7191 | 0.8480 |
| 0.1762 | 3.2941 | 1288 | 0.6807 | 0.7882 | 0.6807 | 0.8250 |
| 0.1762 | 3.2992 | 1290 | 0.7235 | 0.7801 | 0.7235 | 0.8506 |
| 0.1762 | 3.3043 | 1292 | 0.8575 | 0.7182 | 0.8575 | 0.9260 |
| 0.1762 | 3.3095 | 1294 | 1.0346 | 0.7182 | 1.0346 | 1.0171 |
| 0.1762 | 3.3146 | 1296 | 1.1295 | 0.7510 | 1.1295 | 1.0628 |
| 0.1762 | 3.3197 | 1298 | 1.0640 | 0.7431 | 1.0640 | 1.0315 |
| 0.1762 | 3.3248 | 1300 | 1.0225 | 0.7182 | 1.0225 | 1.0112 |
| 0.1762 | 3.3299 | 1302 | 0.9866 | 0.7182 | 0.9866 | 0.9933 |
| 0.1762 | 3.3350 | 1304 | 0.9609 | 0.7182 | 0.9609 | 0.9803 |
| 0.1762 | 3.3402 | 1306 | 0.8950 | 0.7435 | 0.8950 | 0.9461 |
| 0.1762 | 3.3453 | 1308 | 0.9036 | 0.7259 | 0.9036 | 0.9506 |
| 0.1762 | 3.3504 | 1310 | 0.9497 | 0.7134 | 0.9497 | 0.9745 |
| 0.1762 | 3.3555 | 1312 | 0.9502 | 0.7134 | 0.9502 | 0.9748 |
| 0.1762 | 3.3606 | 1314 | 0.8725 | 0.7058 | 0.8725 | 0.9341 |
| 0.1762 | 3.3657 | 1316 | 0.7518 | 0.7620 | 0.7518 | 0.8671 |
| 0.1762 | 3.3708 | 1318 | 0.6653 | 0.7786 | 0.6653 | 0.8157 |
| 0.1762 | 3.3760 | 1320 | 0.6617 | 0.7363 | 0.6617 | 0.8134 |
| 0.1762 | 3.3811 | 1322 | 0.7116 | 0.7522 | 0.7116 | 0.8436 |
| 0.1762 | 3.3862 | 1324 | 0.7657 | 0.7445 | 0.7657 | 0.8750 |
| 0.1762 | 3.3913 | 1326 | 0.7931 | 0.7435 | 0.7931 | 0.8905 |
| 0.1762 | 3.3964 | 1328 | 0.8109 | 0.7350 | 0.8109 | 0.9005 |
| 0.1762 | 3.4015 | 1330 | 0.7371 | 0.7521 | 0.7371 | 0.8586 |
| 0.1762 | 3.4066 | 1332 | 0.6703 | 0.7717 | 0.6703 | 0.8187 |
| 0.1762 | 3.4118 | 1334 | 0.6886 | 0.7332 | 0.6886 | 0.8298 |
| 0.1762 | 3.4169 | 1336 | 0.7884 | 0.7612 | 0.7884 | 0.8879 |
| 0.1762 | 3.4220 | 1338 | 0.8466 | 0.7598 | 0.8466 | 0.9201 |
| 0.1762 | 3.4271 | 1340 | 0.8541 | 0.7598 | 0.8541 | 0.9242 |
| 0.1762 | 3.4322 | 1342 | 0.7976 | 0.7429 | 0.7976 | 0.8931 |
| 0.1762 | 3.4373 | 1344 | 0.7129 | 0.7350 | 0.7129 | 0.8443 |
| 0.1762 | 3.4425 | 1346 | 0.6697 | 0.6927 | 0.6697 | 0.8183 |
| 0.1762 | 3.4476 | 1348 | 0.6439 | 0.7196 | 0.6439 | 0.8025 |
| 0.1762 | 3.4527 | 1350 | 0.6521 | 0.7692 | 0.6521 | 0.8075 |
| 0.1762 | 3.4578 | 1352 | 0.6399 | 0.7786 | 0.6399 | 0.7999 |
| 0.1762 | 3.4629 | 1354 | 0.6444 | 0.7618 | 0.6444 | 0.8027 |
| 0.1762 | 3.4680 | 1356 | 0.7005 | 0.7525 | 0.7005 | 0.8370 |
| 0.1762 | 3.4731 | 1358 | 0.7631 | 0.7095 | 0.7631 | 0.8736 |
| 0.1762 | 3.4783 | 1360 | 0.7567 | 0.7028 | 0.7567 | 0.8699 |
| 0.1762 | 3.4834 | 1362 | 0.7677 | 0.7281 | 0.7677 | 0.8762 |
| 0.1762 | 3.4885 | 1364 | 0.7160 | 0.7522 | 0.7160 | 0.8462 |
| 0.1762 | 3.4936 | 1366 | 0.6629 | 0.7128 | 0.6629 | 0.8142 |
| 0.1762 | 3.4987 | 1368 | 0.6394 | 0.7128 | 0.6394 | 0.7996 |
| 0.1762 | 3.5038 | 1370 | 0.6440 | 0.7128 | 0.6440 | 0.8025 |
| 0.1762 | 3.5090 | 1372 | 0.6968 | 0.7522 | 0.6968 | 0.8347 |
| 0.1762 | 3.5141 | 1374 | 0.6984 | 0.7522 | 0.6984 | 0.8357 |
| 0.1762 | 3.5192 | 1376 | 0.6757 | 0.7522 | 0.6757 | 0.8220 |
| 0.1762 | 3.5243 | 1378 | 0.6720 | 0.7748 | 0.6720 | 0.8197 |
| 0.1762 | 3.5294 | 1380 | 0.6158 | 0.7451 | 0.6158 | 0.7847 |
| 0.1762 | 3.5345 | 1382 | 0.5916 | 0.7451 | 0.5916 | 0.7691 |
| 0.1762 | 3.5396 | 1384 | 0.6089 | 0.7451 | 0.6089 | 0.7803 |
| 0.1762 | 3.5448 | 1386 | 0.6600 | 0.7526 | 0.6600 | 0.8124 |
| 0.1762 | 3.5499 | 1388 | 0.7308 | 0.7522 | 0.7308 | 0.8548 |
| 0.1762 | 3.5550 | 1390 | 0.8602 | 0.7350 | 0.8602 | 0.9275 |
| 0.1762 | 3.5601 | 1392 | 0.9134 | 0.7424 | 0.9134 | 0.9557 |
| 0.1762 | 3.5652 | 1394 | 0.8665 | 0.7337 | 0.8665 | 0.9309 |
| 0.1762 | 3.5703 | 1396 | 0.8472 | 0.7139 | 0.8472 | 0.9205 |
| 0.1762 | 3.5754 | 1398 | 0.7720 | 0.7717 | 0.7720 | 0.8786 |
| 0.1762 | 3.5806 | 1400 | 0.7343 | 0.7717 | 0.7343 | 0.8569 |
| 0.1762 | 3.5857 | 1402 | 0.7411 | 0.7717 | 0.7411 | 0.8609 |
| 0.1762 | 3.5908 | 1404 | 0.7369 | 0.7437 | 0.7369 | 0.8584 |
| 0.1762 | 3.5959 | 1406 | 0.7270 | 0.7437 | 0.7270 | 0.8526 |
| 0.1762 | 3.6010 | 1408 | 0.7005 | 0.7358 | 0.7005 | 0.8370 |
| 0.1762 | 3.6061 | 1410 | 0.6892 | 0.7605 | 0.6892 | 0.8302 |
| 0.1762 | 3.6113 | 1412 | 0.7153 | 0.7605 | 0.7153 | 0.8458 |
| 0.1762 | 3.6164 | 1414 | 0.7779 | 0.7354 | 0.7779 | 0.8820 |
| 0.1762 | 3.6215 | 1416 | 0.8957 | 0.7358 | 0.8957 | 0.9464 |
| 0.1762 | 3.6266 | 1418 | 1.0643 | 0.7427 | 1.0643 | 1.0317 |
| 0.1762 | 3.6317 | 1420 | 1.1203 | 0.7427 | 1.1203 | 1.0585 |
| 0.1762 | 3.6368 | 1422 | 1.0501 | 0.7081 | 1.0501 | 1.0248 |
| 0.1762 | 3.6419 | 1424 | 0.9627 | 0.7239 | 0.9627 | 0.9812 |
| 0.1762 | 3.6471 | 1426 | 0.9293 | 0.6696 | 0.9293 | 0.9640 |
| 0.1762 | 3.6522 | 1428 | 0.9547 | 0.6877 | 0.9547 | 0.9771 |
| 0.1762 | 3.6573 | 1430 | 1.0672 | 0.7412 | 1.0672 | 1.0330 |
| 0.1762 | 3.6624 | 1432 | 1.2003 | 0.7318 | 1.2003 | 1.0956 |
| 0.1762 | 3.6675 | 1434 | 1.2330 | 0.7504 | 1.2330 | 1.1104 |
| 0.1762 | 3.6726 | 1436 | 1.2227 | 0.7504 | 1.2227 | 1.1058 |
| 0.1762 | 3.6777 | 1438 | 1.0861 | 0.7318 | 1.0861 | 1.0422 |
| 0.1762 | 3.6829 | 1440 | 0.9666 | 0.6945 | 0.9666 | 0.9832 |
| 0.1762 | 3.6880 | 1442 | 0.9155 | 0.6945 | 0.9155 | 0.9568 |
| 0.1762 | 3.6931 | 1444 | 0.9278 | 0.6945 | 0.9278 | 0.9632 |
| 0.1762 | 3.6982 | 1446 | 0.9729 | 0.6945 | 0.9729 | 0.9864 |
| 0.1762 | 3.7033 | 1448 | 1.0218 | 0.7081 | 1.0218 | 1.0108 |
| 0.1762 | 3.7084 | 1450 | 1.1359 | 0.7420 | 1.1359 | 1.0658 |
| 0.1762 | 3.7136 | 1452 | 1.1585 | 0.7420 | 1.1585 | 1.0763 |
| 0.1762 | 3.7187 | 1454 | 1.0834 | 0.7342 | 1.0834 | 1.0409 |
| 0.1762 | 3.7238 | 1456 | 0.9253 | 0.6912 | 0.9253 | 0.9619 |
| 0.1762 | 3.7289 | 1458 | 0.8109 | 0.6696 | 0.8109 | 0.9005 |
| 0.1762 | 3.7340 | 1460 | 0.8175 | 0.6696 | 0.8175 | 0.9042 |
| 0.1762 | 3.7391 | 1462 | 0.9153 | 0.6982 | 0.9153 | 0.9567 |
| 0.1762 | 3.7442 | 1464 | 1.0306 | 0.7047 | 1.0306 | 1.0152 |
| 0.1762 | 3.7494 | 1466 | 1.0783 | 0.7230 | 1.0783 | 1.0384 |
| 0.1762 | 3.7545 | 1468 | 1.1952 | 0.7230 | 1.1952 | 1.0932 |
| 0.1762 | 3.7596 | 1470 | 1.3541 | 0.7225 | 1.3541 | 1.1636 |
| 0.1762 | 3.7647 | 1472 | 1.3835 | 0.7225 | 1.3835 | 1.1762 |
| 0.1762 | 3.7698 | 1474 | 1.3052 | 0.7225 | 1.3052 | 1.1425 |
| 0.1762 | 3.7749 | 1476 | 1.1851 | 0.7230 | 1.1851 | 1.0886 |
| 0.1762 | 3.7801 | 1478 | 1.0369 | 0.7318 | 1.0369 | 1.0183 |
| 0.1762 | 3.7852 | 1480 | 0.8844 | 0.7316 | 0.8844 | 0.9404 |
| 0.1762 | 3.7903 | 1482 | 0.8134 | 0.7525 | 0.8134 | 0.9019 |
| 0.1762 | 3.7954 | 1484 | 0.7765 | 0.7525 | 0.7765 | 0.8812 |
| 0.1762 | 3.8005 | 1486 | 0.8185 | 0.7025 | 0.8185 | 0.9047 |
| 0.1762 | 3.8056 | 1488 | 0.9420 | 0.7053 | 0.9420 | 0.9706 |
| 0.1762 | 3.8107 | 1490 | 1.0412 | 0.7153 | 1.0412 | 1.0204 |
| 0.1762 | 3.8159 | 1492 | 1.0121 | 0.7153 | 1.0121 | 1.0060 |
| 0.1762 | 3.8210 | 1494 | 0.9320 | 0.7239 | 0.9320 | 0.9654 |
| 0.1762 | 3.8261 | 1496 | 0.8567 | 0.7162 | 0.8567 | 0.9256 |
| 0.1762 | 3.8312 | 1498 | 0.8572 | 0.7090 | 0.8572 | 0.9259 |
| 0.1269 | 3.8363 | 1500 | 0.9071 | 0.6982 | 0.9071 | 0.9524 |
| 0.1269 | 3.8414 | 1502 | 0.9121 | 0.6982 | 0.9121 | 0.9551 |
| 0.1269 | 3.8465 | 1504 | 0.8209 | 0.7616 | 0.8209 | 0.9060 |
| 0.1269 | 3.8517 | 1506 | 0.7319 | 0.7782 | 0.7319 | 0.8555 |
| 0.1269 | 3.8568 | 1508 | 0.6703 | 0.7786 | 0.6703 | 0.8187 |
| 0.1269 | 3.8619 | 1510 | 0.6286 | 0.7786 | 0.6286 | 0.7928 |
| 0.1269 | 3.8670 | 1512 | 0.6360 | 0.7786 | 0.6360 | 0.7975 |
| 0.1269 | 3.8721 | 1514 | 0.6812 | 0.8022 | 0.6812 | 0.8253 |
| 0.1269 | 3.8772 | 1516 | 0.7280 | 0.7924 | 0.7280 | 0.8532 |
| 0.1269 | 3.8824 | 1518 | 0.8286 | 0.7259 | 0.8286 | 0.9103 |
| 0.1269 | 3.8875 | 1520 | 0.8883 | 0.7607 | 0.8883 | 0.9425 |
| 0.1269 | 3.8926 | 1522 | 0.8527 | 0.7618 | 0.8527 | 0.9234 |
| 0.1269 | 3.8977 | 1524 | 0.8034 | 0.7623 | 0.8034 | 0.8963 |
| 0.1269 | 3.9028 | 1526 | 0.7260 | 0.7620 | 0.7260 | 0.8520 |
| 0.1269 | 3.9079 | 1528 | 0.6599 | 0.7882 | 0.6599 | 0.8123 |
| 0.1269 | 3.9130 | 1530 | 0.6254 | 0.7882 | 0.6254 | 0.7908 |
| 0.1269 | 3.9182 | 1532 | 0.6485 | 0.7882 | 0.6485 | 0.8053 |
| 0.1269 | 3.9233 | 1534 | 0.7010 | 0.7689 | 0.7010 | 0.8373 |
| 0.1269 | 3.9284 | 1536 | 0.7547 | 0.7518 | 0.7547 | 0.8688 |
| 0.1269 | 3.9335 | 1538 | 0.8057 | 0.7672 | 0.8057 | 0.8976 |
| 0.1269 | 3.9386 | 1540 | 0.8677 | 0.7255 | 0.8677 | 0.9315 |
| 0.1269 | 3.9437 | 1542 | 0.8687 | 0.7255 | 0.8687 | 0.9320 |
| 0.1269 | 3.9488 | 1544 | 0.8944 | 0.7255 | 0.8944 | 0.9458 |
| 0.1269 | 3.9540 | 1546 | 0.8497 | 0.6982 | 0.8497 | 0.9218 |
| 0.1269 | 3.9591 | 1548 | 0.7934 | 0.7139 | 0.7934 | 0.8907 |
| 0.1269 | 3.9642 | 1550 | 0.7341 | 0.7427 | 0.7341 | 0.8568 |
| 0.1269 | 3.9693 | 1552 | 0.7399 | 0.7427 | 0.7399 | 0.8602 |
| 0.1269 | 3.9744 | 1554 | 0.7356 | 0.7427 | 0.7356 | 0.8577 |
| 0.1269 | 3.9795 | 1556 | 0.7739 | 0.6951 | 0.7739 | 0.8797 |
| 0.1269 | 3.9847 | 1558 | 0.8425 | 0.7182 | 0.8425 | 0.9179 |
| 0.1269 | 3.9898 | 1560 | 0.9220 | 0.7255 | 0.9220 | 0.9602 |
| 0.1269 | 3.9949 | 1562 | 0.8883 | 0.7435 | 0.8883 | 0.9425 |
| 0.1269 | 4.0000 | 1564 | 0.8493 | 0.7435 | 0.8493 | 0.9216 |
| 0.1269 | 4.0051 | 1566 | 0.7599 | 0.7354 | 0.7599 | 0.8717 |
| 0.1269 | 4.0102 | 1568 | 0.7029 | 0.7172 | 0.7029 | 0.8384 |
| 0.1269 | 4.0153 | 1570 | 0.7278 | 0.6882 | 0.7278 | 0.8531 |
| 0.1269 | 4.0205 | 1572 | 0.8168 | 0.7139 | 0.8168 | 0.9038 |
| 0.1269 | 4.0256 | 1574 | 0.9023 | 0.7421 | 0.9023 | 0.9499 |
| 0.1269 | 4.0307 | 1576 | 0.9632 | 0.7333 | 0.9632 | 0.9814 |
| 0.1269 | 4.0358 | 1578 | 0.9352 | 0.7421 | 0.9352 | 0.9670 |
| 0.1269 | 4.0409 | 1580 | 0.8946 | 0.7221 | 0.8946 | 0.9458 |
| 0.1269 | 4.0460 | 1582 | 0.8564 | 0.7139 | 0.8564 | 0.9254 |
| 0.1269 | 4.0512 | 1584 | 0.8290 | 0.6951 | 0.8290 | 0.9105 |
| 0.1269 | 4.0563 | 1586 | 0.8488 | 0.7090 | 0.8488 | 0.9213 |
| 0.1269 | 4.0614 | 1588 | 0.8823 | 0.7259 | 0.8823 | 0.9393 |
| 0.1269 | 4.0665 | 1590 | 0.9405 | 0.7081 | 0.9405 | 0.9698 |
| 0.1269 | 4.0716 | 1592 | 1.0076 | 0.7081 | 1.0076 | 1.0038 |
| 0.1269 | 4.0767 | 1594 | 0.9525 | 0.7081 | 0.9525 | 0.9760 |
| 0.1269 | 4.0818 | 1596 | 0.8584 | 0.7086 | 0.8584 | 0.9265 |
| 0.1269 | 4.0870 | 1598 | 0.8591 | 0.7086 | 0.8591 | 0.9269 |
| 0.1269 | 4.0921 | 1600 | 0.9064 | 0.6871 | 0.9064 | 0.9520 |
| 0.1269 | 4.0972 | 1602 | 0.9485 | 0.7520 | 0.9485 | 0.9739 |
| 0.1269 | 4.1023 | 1604 | 1.0550 | 0.6414 | 1.0550 | 1.0271 |
| 0.1269 | 4.1074 | 1606 | 1.1320 | 0.6414 | 1.1320 | 1.0640 |
| 0.1269 | 4.1125 | 1608 | 1.1302 | 0.6414 | 1.1302 | 1.0631 |
| 0.1269 | 4.1176 | 1610 | 1.0511 | 0.7626 | 1.0511 | 1.0252 |
| 0.1269 | 4.1228 | 1612 | 0.9673 | 0.7413 | 0.9673 | 0.9835 |
| 0.1269 | 4.1279 | 1614 | 0.9126 | 0.7413 | 0.9126 | 0.9553 |
| 0.1269 | 4.1330 | 1616 | 0.8841 | 0.7316 | 0.8841 | 0.9403 |
| 0.1269 | 4.1381 | 1618 | 0.8973 | 0.6982 | 0.8973 | 0.9472 |
| 0.1269 | 4.1432 | 1620 | 0.8819 | 0.7162 | 0.8820 | 0.9391 |
| 0.1269 | 4.1483 | 1622 | 0.8350 | 0.7019 | 0.8350 | 0.9138 |
| 0.1269 | 4.1535 | 1624 | 0.8177 | 0.7277 | 0.8177 | 0.9043 |
| 0.1269 | 4.1586 | 1626 | 0.8025 | 0.7445 | 0.8025 | 0.8958 |
| 0.1269 | 4.1637 | 1628 | 0.7644 | 0.7445 | 0.7644 | 0.8743 |
| 0.1269 | 4.1688 | 1630 | 0.6919 | 0.7522 | 0.6919 | 0.8318 |
| 0.1269 | 4.1739 | 1632 | 0.6520 | 0.7786 | 0.6520 | 0.8074 |
| 0.1269 | 4.1790 | 1634 | 0.6963 | 0.7522 | 0.6963 | 0.8344 |
| 0.1269 | 4.1841 | 1636 | 0.7780 | 0.7672 | 0.7780 | 0.8820 |
| 0.1269 | 4.1893 | 1638 | 0.7524 | 0.7759 | 0.7524 | 0.8674 |
| 0.1269 | 4.1944 | 1640 | 0.6932 | 0.7882 | 0.6932 | 0.8326 |
| 0.1269 | 4.1995 | 1642 | 0.6483 | 0.7786 | 0.6483 | 0.8052 |
| 0.1269 | 4.2046 | 1644 | 0.6142 | 0.7786 | 0.6142 | 0.7837 |
| 0.1269 | 4.2097 | 1646 | 0.6276 | 0.7786 | 0.6276 | 0.7922 |
| 0.1269 | 4.2148 | 1648 | 0.6993 | 0.7601 | 0.6993 | 0.8363 |
| 0.1269 | 4.2199 | 1650 | 0.7088 | 0.7601 | 0.7088 | 0.8419 |
| 0.1269 | 4.2251 | 1652 | 0.6891 | 0.7601 | 0.6891 | 0.8301 |
| 0.1269 | 4.2302 | 1654 | 0.6975 | 0.7601 | 0.6975 | 0.8352 |
| 0.1269 | 4.2353 | 1656 | 0.6864 | 0.7522 | 0.6864 | 0.8285 |
| 0.1269 | 4.2404 | 1658 | 0.6360 | 0.7605 | 0.6360 | 0.7975 |
| 0.1269 | 4.2455 | 1660 | 0.6603 | 0.7522 | 0.6603 | 0.8126 |
| 0.1269 | 4.2506 | 1662 | 0.7684 | 0.7593 | 0.7684 | 0.8766 |
| 0.1269 | 4.2558 | 1664 | 0.8728 | 0.7358 | 0.8728 | 0.9342 |
| 0.1269 | 4.2609 | 1666 | 0.8560 | 0.7433 | 0.8560 | 0.9252 |
| 0.1269 | 4.2660 | 1668 | 0.7499 | 0.7601 | 0.7499 | 0.8660 |
| 0.1269 | 4.2711 | 1670 | 0.6427 | 0.7689 | 0.6427 | 0.8017 |
| 0.1269 | 4.2762 | 1672 | 0.6167 | 0.7529 | 0.6167 | 0.7853 |
| 0.1269 | 4.2813 | 1674 | 0.6194 | 0.7363 | 0.6194 | 0.7870 |
| 0.1269 | 4.2864 | 1676 | 0.6514 | 0.7358 | 0.6514 | 0.8071 |
| 0.1269 | 4.2916 | 1678 | 0.6974 | 0.7601 | 0.6974 | 0.8351 |
| 0.1269 | 4.2967 | 1680 | 0.7901 | 0.7601 | 0.7901 | 0.8889 |
| 0.1269 | 4.3018 | 1682 | 0.8115 | 0.7759 | 0.8115 | 0.9008 |
| 0.1269 | 4.3069 | 1684 | 0.8022 | 0.7601 | 0.8022 | 0.8956 |
| 0.1269 | 4.3120 | 1686 | 0.7929 | 0.7601 | 0.7929 | 0.8905 |
| 0.1269 | 4.3171 | 1688 | 0.7339 | 0.7358 | 0.7339 | 0.8567 |
| 0.1269 | 4.3223 | 1690 | 0.6794 | 0.7285 | 0.6794 | 0.8242 |
| 0.1269 | 4.3274 | 1692 | 0.6940 | 0.7285 | 0.6940 | 0.8331 |
| 0.1269 | 4.3325 | 1694 | 0.7745 | 0.7358 | 0.7745 | 0.8801 |
| 0.1269 | 4.3376 | 1696 | 0.8005 | 0.7358 | 0.8005 | 0.8947 |
| 0.1269 | 4.3427 | 1698 | 0.7742 | 0.7358 | 0.7742 | 0.8799 |
| 0.1269 | 4.3478 | 1700 | 0.7755 | 0.7358 | 0.7755 | 0.8806 |
| 0.1269 | 4.3529 | 1702 | 0.7297 | 0.7358 | 0.7297 | 0.8542 |
| 0.1269 | 4.3581 | 1704 | 0.6995 | 0.7358 | 0.6995 | 0.8364 |
| 0.1269 | 4.3632 | 1706 | 0.6939 | 0.7358 | 0.6939 | 0.8330 |
| 0.1269 | 4.3683 | 1708 | 0.7300 | 0.7358 | 0.7300 | 0.8544 |
| 0.1269 | 4.3734 | 1710 | 0.8482 | 0.7759 | 0.8482 | 0.9210 |
| 0.1269 | 4.3785 | 1712 | 0.9853 | 0.7435 | 0.9853 | 0.9926 |
| 0.1269 | 4.3836 | 1714 | 1.0422 | 0.7435 | 1.0422 | 1.0209 |
| 0.1269 | 4.3887 | 1716 | 0.9629 | 0.7672 | 0.9629 | 0.9813 |
| 0.1269 | 4.3939 | 1718 | 0.8371 | 0.7518 | 0.8371 | 0.9149 |
| 0.1269 | 4.3990 | 1720 | 0.7034 | 0.7358 | 0.7034 | 0.8387 |
| 0.1269 | 4.4041 | 1722 | 0.6235 | 0.8060 | 0.6235 | 0.7896 |
| 0.1269 | 4.4092 | 1724 | 0.6197 | 0.8060 | 0.6197 | 0.7872 |
| 0.1269 | 4.4143 | 1726 | 0.6731 | 0.7441 | 0.6731 | 0.8204 |
| 0.1269 | 4.4194 | 1728 | 0.7900 | 0.7759 | 0.7900 | 0.8888 |
| 0.1269 | 4.4246 | 1730 | 0.9764 | 0.7602 | 0.9764 | 0.9881 |
| 0.1269 | 4.4297 | 1732 | 1.0711 | 0.7081 | 1.0711 | 1.0349 |
| 0.1269 | 4.4348 | 1734 | 1.0372 | 0.7081 | 1.0372 | 1.0184 |
| 0.1269 | 4.4399 | 1736 | 0.9084 | 0.7602 | 0.9084 | 0.9531 |
| 0.1269 | 4.4450 | 1738 | 0.7624 | 0.7249 | 0.7624 | 0.8732 |
| 0.1269 | 4.4501 | 1740 | 0.7148 | 0.7441 | 0.7148 | 0.8455 |
| 0.1269 | 4.4552 | 1742 | 0.7484 | 0.7441 | 0.7484 | 0.8651 |
| 0.1269 | 4.4604 | 1744 | 0.8763 | 0.7602 | 0.8763 | 0.9361 |
| 0.1269 | 4.4655 | 1746 | 1.0640 | 0.7106 | 1.0640 | 1.0315 |
| 0.1269 | 4.4706 | 1748 | 1.1311 | 0.7106 | 1.1311 | 1.0635 |
| 0.1269 | 4.4757 | 1750 | 1.0763 | 0.7106 | 1.0763 | 1.0374 |
| 0.1269 | 4.4808 | 1752 | 0.9425 | 0.7255 | 0.9425 | 0.9708 |
| 0.1269 | 4.4859 | 1754 | 0.8203 | 0.7342 | 0.8203 | 0.9057 |
| 0.1269 | 4.4910 | 1756 | 0.7024 | 0.7882 | 0.7024 | 0.8381 |
| 0.1269 | 4.4962 | 1758 | 0.6656 | 0.8060 | 0.6656 | 0.8158 |
| 0.1269 | 4.5013 | 1760 | 0.6846 | 0.7882 | 0.6846 | 0.8274 |
| 0.1269 | 4.5064 | 1762 | 0.7556 | 0.7986 | 0.7556 | 0.8693 |
| 0.1269 | 4.5115 | 1764 | 0.8827 | 0.7342 | 0.8827 | 0.9395 |
| 0.1269 | 4.5166 | 1766 | 0.9816 | 0.7157 | 0.9816 | 0.9908 |
| 0.1269 | 4.5217 | 1768 | 1.0033 | 0.7429 | 1.0033 | 1.0016 |
| 0.1269 | 4.5269 | 1770 | 0.9713 | 0.7429 | 0.9713 | 0.9856 |
| 0.1269 | 4.5320 | 1772 | 0.9610 | 0.7429 | 0.9610 | 0.9803 |
| 0.1269 | 4.5371 | 1774 | 0.8919 | 0.7433 | 0.8919 | 0.9444 |
| 0.1269 | 4.5422 | 1776 | 0.8489 | 0.7606 | 0.8489 | 0.9214 |
| 0.1269 | 4.5473 | 1778 | 0.8685 | 0.7433 | 0.8685 | 0.9319 |
| 0.1269 | 4.5524 | 1780 | 0.8797 | 0.7162 | 0.8797 | 0.9379 |
| 0.1269 | 4.5575 | 1782 | 0.9565 | 0.7429 | 0.9565 | 0.9780 |
| 0.1269 | 4.5627 | 1784 | 0.9927 | 0.7081 | 0.9927 | 0.9964 |
| 0.1269 | 4.5678 | 1786 | 0.9423 | 0.7429 | 0.9423 | 0.9707 |
| 0.1269 | 4.5729 | 1788 | 0.8208 | 0.7358 | 0.8208 | 0.9060 |
| 0.1269 | 4.5780 | 1790 | 0.7681 | 0.7358 | 0.7681 | 0.8764 |
| 0.1269 | 4.5831 | 1792 | 0.7607 | 0.7358 | 0.7607 | 0.8722 |
| 0.1269 | 4.5882 | 1794 | 0.7908 | 0.7285 | 0.7908 | 0.8892 |
| 0.1269 | 4.5934 | 1796 | 0.8751 | 0.7518 | 0.8751 | 0.9355 |
| 0.1269 | 4.5985 | 1798 | 0.9146 | 0.7518 | 0.9146 | 0.9563 |
| 0.1269 | 4.6036 | 1800 | 0.9443 | 0.7672 | 0.9443 | 0.9717 |
| 0.1269 | 4.6087 | 1802 | 0.9017 | 0.7672 | 0.9017 | 0.9496 |
| 0.1269 | 4.6138 | 1804 | 0.8424 | 0.7601 | 0.8424 | 0.9178 |
| 0.1269 | 4.6189 | 1806 | 0.7827 | 0.7358 | 0.7827 | 0.8847 |
| 0.1269 | 4.6240 | 1808 | 0.7620 | 0.7358 | 0.7620 | 0.8729 |
| 0.1269 | 4.6292 | 1810 | 0.7994 | 0.7358 | 0.7994 | 0.8941 |
| 0.1269 | 4.6343 | 1812 | 0.8423 | 0.7354 | 0.8423 | 0.9178 |
| 0.1269 | 4.6394 | 1814 | 0.8664 | 0.7433 | 0.8664 | 0.9308 |
| 0.1269 | 4.6445 | 1816 | 0.8670 | 0.7167 | 0.8670 | 0.9311 |
| 0.1269 | 4.6496 | 1818 | 0.8187 | 0.7167 | 0.8187 | 0.9048 |
| 0.1269 | 4.6547 | 1820 | 0.8302 | 0.7167 | 0.8302 | 0.9111 |
| 0.1269 | 4.6598 | 1822 | 0.8753 | 0.7433 | 0.8753 | 0.9356 |
| 0.1269 | 4.6650 | 1824 | 0.8525 | 0.7433 | 0.8525 | 0.9233 |
| 0.1269 | 4.6701 | 1826 | 0.7716 | 0.7095 | 0.7716 | 0.8784 |
| 0.1269 | 4.6752 | 1828 | 0.7038 | 0.7441 | 0.7038 | 0.8389 |
| 0.1269 | 4.6803 | 1830 | 0.7078 | 0.7441 | 0.7078 | 0.8413 |
| 0.1269 | 4.6854 | 1832 | 0.7642 | 0.7095 | 0.7642 | 0.8742 |
| 0.1269 | 4.6905 | 1834 | 0.8551 | 0.7167 | 0.8551 | 0.9247 |
| 0.1269 | 4.6957 | 1836 | 0.9546 | 0.7421 | 0.9546 | 0.9770 |
| 0.1269 | 4.7008 | 1838 | 0.9957 | 0.7333 | 0.9957 | 0.9978 |
| 0.1269 | 4.7059 | 1840 | 1.0314 | 0.7333 | 1.0314 | 1.0156 |
| 0.1269 | 4.7110 | 1842 | 1.0058 | 0.7333 | 1.0058 | 1.0029 |
| 0.1269 | 4.7161 | 1844 | 0.9430 | 0.7255 | 0.9430 | 0.9711 |
| 0.1269 | 4.7212 | 1846 | 0.8425 | 0.7167 | 0.8425 | 0.9179 |
| 0.1269 | 4.7263 | 1848 | 0.7877 | 0.7167 | 0.7877 | 0.8875 |
| 0.1269 | 4.7315 | 1850 | 0.7981 | 0.7167 | 0.7981 | 0.8934 |
| 0.1269 | 4.7366 | 1852 | 0.7999 | 0.7167 | 0.7999 | 0.8944 |
| 0.1269 | 4.7417 | 1854 | 0.7846 | 0.7616 | 0.7846 | 0.8858 |
| 0.1269 | 4.7468 | 1856 | 0.8144 | 0.7342 | 0.8144 | 0.9024 |
| 0.1269 | 4.7519 | 1858 | 0.8664 | 0.7342 | 0.8664 | 0.9308 |
| 0.1269 | 4.7570 | 1860 | 0.9371 | 0.7602 | 0.9371 | 0.9680 |
| 0.1269 | 4.7621 | 1862 | 0.9463 | 0.7514 | 0.9463 | 0.9728 |
| 0.1269 | 4.7673 | 1864 | 0.9062 | 0.7514 | 0.9062 | 0.9520 |
| 0.1269 | 4.7724 | 1866 | 0.8576 | 0.7435 | 0.8576 | 0.9261 |
| 0.1269 | 4.7775 | 1868 | 0.7661 | 0.7095 | 0.7661 | 0.8753 |
| 0.1269 | 4.7826 | 1870 | 0.7257 | 0.7095 | 0.7257 | 0.8519 |
| 0.1269 | 4.7877 | 1872 | 0.7520 | 0.7525 | 0.7520 | 0.8672 |
| 0.1269 | 4.7928 | 1874 | 0.8140 | 0.7525 | 0.8140 | 0.9022 |
| 0.1269 | 4.7980 | 1876 | 0.8499 | 0.6877 | 0.8499 | 0.9219 |
| 0.1269 | 4.8031 | 1878 | 0.8507 | 0.7162 | 0.8507 | 0.9223 |
| 0.1269 | 4.8082 | 1880 | 0.8060 | 0.7616 | 0.8060 | 0.8978 |
| 0.1269 | 4.8133 | 1882 | 0.7811 | 0.7616 | 0.7811 | 0.8838 |
| 0.1269 | 4.8184 | 1884 | 0.8010 | 0.7162 | 0.8010 | 0.8950 |
| 0.1269 | 4.8235 | 1886 | 0.8606 | 0.7162 | 0.8606 | 0.9277 |
| 0.1269 | 4.8286 | 1888 | 0.9405 | 0.7162 | 0.9405 | 0.9698 |
| 0.1269 | 4.8338 | 1890 | 0.9968 | 0.7337 | 0.9968 | 0.9984 |
| 0.1269 | 4.8389 | 1892 | 0.9980 | 0.7337 | 0.9980 | 0.9990 |
| 0.1269 | 4.8440 | 1894 | 0.9146 | 0.7162 | 0.9146 | 0.9563 |
| 0.1269 | 4.8491 | 1896 | 0.8103 | 0.6877 | 0.8103 | 0.9001 |
| 0.1269 | 4.8542 | 1898 | 0.7573 | 0.7427 | 0.7573 | 0.8702 |
| 0.1269 | 4.8593 | 1900 | 0.7601 | 0.7167 | 0.7601 | 0.8719 |
| 0.1269 | 4.8645 | 1902 | 0.7555 | 0.7167 | 0.7555 | 0.8692 |
| 0.1269 | 4.8696 | 1904 | 0.7836 | 0.7433 | 0.7836 | 0.8852 |
| 0.1269 | 4.8747 | 1906 | 0.8101 | 0.7433 | 0.8101 | 0.9001 |
| 0.1269 | 4.8798 | 1908 | 0.8188 | 0.7433 | 0.8188 | 0.9049 |
| 0.1269 | 4.8849 | 1910 | 0.8105 | 0.7433 | 0.8105 | 0.9003 |
| 0.1269 | 4.8900 | 1912 | 0.8320 | 0.7433 | 0.8320 | 0.9121 |
| 0.1269 | 4.8951 | 1914 | 0.8051 | 0.7433 | 0.8051 | 0.8973 |
| 0.1269 | 4.9003 | 1916 | 0.7499 | 0.7167 | 0.7499 | 0.8659 |
| 0.1269 | 4.9054 | 1918 | 0.7434 | 0.7249 | 0.7434 | 0.8622 |
| 0.1269 | 4.9105 | 1920 | 0.7467 | 0.7249 | 0.7467 | 0.8641 |
| 0.1269 | 4.9156 | 1922 | 0.7921 | 0.7162 | 0.7921 | 0.8900 |
| 0.1269 | 4.9207 | 1924 | 0.8953 | 0.7421 | 0.8953 | 0.9462 |
| 0.1269 | 4.9258 | 1926 | 1.0408 | 0.7333 | 1.0408 | 1.0202 |
| 0.1269 | 4.9309 | 1928 | 1.0764 | 0.7333 | 1.0764 | 1.0375 |
| 0.1269 | 4.9361 | 1930 | 1.0085 | 0.7333 | 1.0085 | 1.0043 |
| 0.1269 | 4.9412 | 1932 | 0.9150 | 0.7421 | 0.9150 | 0.9565 |
| 0.1269 | 4.9463 | 1934 | 0.8573 | 0.7337 | 0.8573 | 0.9259 |
| 0.1269 | 4.9514 | 1936 | 0.8032 | 0.7342 | 0.8032 | 0.8962 |
| 0.1269 | 4.9565 | 1938 | 0.8063 | 0.7342 | 0.8063 | 0.8979 |
| 0.1269 | 4.9616 | 1940 | 0.8507 | 0.7602 | 0.8507 | 0.9224 |
| 0.1269 | 4.9668 | 1942 | 0.9234 | 0.7514 | 0.9234 | 0.9609 |
| 0.1269 | 4.9719 | 1944 | 0.9734 | 0.7514 | 0.9734 | 0.9866 |
| 0.1269 | 4.9770 | 1946 | 0.9670 | 0.7514 | 0.9670 | 0.9834 |
| 0.1269 | 4.9821 | 1948 | 0.8922 | 0.7514 | 0.8922 | 0.9446 |
| 0.1269 | 4.9872 | 1950 | 0.8126 | 0.7602 | 0.8126 | 0.9014 |
| 0.1269 | 4.9923 | 1952 | 0.7956 | 0.7342 | 0.7956 | 0.8920 |
| 0.1269 | 4.9974 | 1954 | 0.8052 | 0.7342 | 0.8052 | 0.8973 |
| 0.1269 | 5.0026 | 1956 | 0.8300 | 0.7342 | 0.8300 | 0.9110 |
| 0.1269 | 5.0077 | 1958 | 0.8364 | 0.7430 | 0.8364 | 0.9145 |
| 0.1269 | 5.0128 | 1960 | 0.8045 | 0.7906 | 0.8045 | 0.8969 |
| 0.1269 | 5.0179 | 1962 | 0.8227 | 0.7906 | 0.8227 | 0.9070 |
| 0.1269 | 5.0230 | 1964 | 0.8469 | 0.7342 | 0.8469 | 0.9203 |
| 0.1269 | 5.0281 | 1966 | 0.8955 | 0.7342 | 0.8955 | 0.9463 |
| 0.1269 | 5.0332 | 1968 | 0.8909 | 0.7602 | 0.8909 | 0.9439 |
| 0.1269 | 5.0384 | 1970 | 0.8186 | 0.7602 | 0.8186 | 0.9048 |
| 0.1269 | 5.0435 | 1972 | 0.7871 | 0.7602 | 0.7871 | 0.8872 |
| 0.1269 | 5.0486 | 1974 | 0.7808 | 0.7602 | 0.7808 | 0.8836 |
| 0.1269 | 5.0537 | 1976 | 0.7596 | 0.7354 | 0.7596 | 0.8715 |
| 0.1269 | 5.0588 | 1978 | 0.7960 | 0.7354 | 0.7960 | 0.8922 |
| 0.1269 | 5.0639 | 1980 | 0.8873 | 0.7514 | 0.8873 | 0.9420 |
| 0.1269 | 5.0691 | 1982 | 1.0179 | 0.7255 | 1.0179 | 1.0089 |
| 0.1269 | 5.0742 | 1984 | 1.0724 | 0.7333 | 1.0724 | 1.0356 |
| 0.1269 | 5.0793 | 1986 | 1.0372 | 0.7333 | 1.0372 | 1.0184 |
| 0.1269 | 5.0844 | 1988 | 0.9306 | 0.7337 | 0.9306 | 0.9647 |
| 0.1269 | 5.0895 | 1990 | 0.8055 | 0.7167 | 0.8055 | 0.8975 |
| 0.1269 | 5.0946 | 1992 | 0.6943 | 0.7441 | 0.6943 | 0.8332 |
| 0.1269 | 5.0997 | 1994 | 0.6684 | 0.7363 | 0.6684 | 0.8175 |
| 0.1269 | 5.1049 | 1996 | 0.6743 | 0.7363 | 0.6743 | 0.8211 |
| 0.1269 | 5.1100 | 1998 | 0.7293 | 0.7522 | 0.7293 | 0.8540 |
| 0.0963 | 5.1151 | 2000 | 0.8083 | 0.7277 | 0.8083 | 0.8990 |
| 0.0963 | 5.1202 | 2002 | 0.8817 | 0.7514 | 0.8817 | 0.9390 |
| 0.0963 | 5.1253 | 2004 | 0.8705 | 0.7514 | 0.8705 | 0.9330 |
| 0.0963 | 5.1304 | 2006 | 0.7975 | 0.7350 | 0.7975 | 0.8930 |
| 0.0963 | 5.1355 | 2008 | 0.7256 | 0.7285 | 0.7256 | 0.8518 |
| 0.0963 | 5.1407 | 2010 | 0.7151 | 0.7285 | 0.7151 | 0.8456 |
| 0.0963 | 5.1458 | 2012 | 0.7327 | 0.7522 | 0.7327 | 0.8560 |
| 0.0963 | 5.1509 | 2014 | 0.7117 | 0.7285 | 0.7117 | 0.8436 |
| 0.0963 | 5.1560 | 2016 | 0.6862 | 0.7285 | 0.6862 | 0.8284 |
| 0.0963 | 5.1611 | 2018 | 0.7195 | 0.7358 | 0.7195 | 0.8482 |
| 0.0963 | 5.1662 | 2020 | 0.7400 | 0.7264 | 0.7400 | 0.8602 |
| 0.0963 | 5.1714 | 2022 | 0.8003 | 0.7514 | 0.8003 | 0.8946 |
| 0.0963 | 5.1765 | 2024 | 0.9075 | 0.7514 | 0.9075 | 0.9526 |
| 0.0963 | 5.1816 | 2026 | 1.0166 | 0.7514 | 1.0166 | 1.0083 |
| 0.0963 | 5.1867 | 2028 | 1.0543 | 0.7688 | 1.0543 | 1.0268 |
| 0.0963 | 5.1918 | 2030 | 1.0220 | 0.7511 | 1.0220 | 1.0109 |
| 0.0963 | 5.1969 | 2032 | 0.9248 | 0.7520 | 0.9248 | 0.9617 |
| 0.0963 | 5.2020 | 2034 | 0.8426 | 0.7430 | 0.8426 | 0.9179 |
| 0.0963 | 5.2072 | 2036 | 0.7736 | 0.7717 | 0.7736 | 0.8796 |
| 0.0963 | 5.2123 | 2038 | 0.6919 | 0.7529 | 0.6919 | 0.8318 |
| 0.0963 | 5.2174 | 2040 | 0.6568 | 0.7439 | 0.6568 | 0.8104 |
| 0.0963 | 5.2225 | 2042 | 0.6765 | 0.7172 | 0.6765 | 0.8225 |
| 0.0963 | 5.2276 | 2044 | 0.7547 | 0.7342 | 0.7547 | 0.8687 |
| 0.0963 | 5.2327 | 2046 | 0.8557 | 0.7602 | 0.8557 | 0.9250 |
| 0.0963 | 5.2379 | 2048 | 0.9642 | 0.7514 | 0.9642 | 0.9819 |
| 0.0963 | 5.2430 | 2050 | 0.9939 | 0.7514 | 0.9939 | 0.9969 |
| 0.0963 | 5.2481 | 2052 | 0.9578 | 0.7602 | 0.9578 | 0.9787 |
| 0.0963 | 5.2532 | 2054 | 0.8790 | 0.7342 | 0.8790 | 0.9376 |
| 0.0963 | 5.2583 | 2056 | 0.8005 | 0.7430 | 0.8005 | 0.8947 |
| 0.0963 | 5.2634 | 2058 | 0.7781 | 0.7906 | 0.7781 | 0.8821 |
| 0.0963 | 5.2685 | 2060 | 0.7883 | 0.7906 | 0.7883 | 0.8879 |
| 0.0963 | 5.2737 | 2062 | 0.8087 | 0.7906 | 0.8087 | 0.8993 |
| 0.0963 | 5.2788 | 2064 | 0.8683 | 0.7520 | 0.8683 | 0.9318 |
| 0.0963 | 5.2839 | 2066 | 0.9673 | 0.7101 | 0.9673 | 0.9835 |
| 0.0963 | 5.2890 | 2068 | 1.0424 | 0.7014 | 1.0424 | 1.0210 |
| 0.0963 | 5.2941 | 2070 | 1.1259 | 0.7318 | 1.1260 | 1.0611 |
| 0.0963 | 5.2992 | 2072 | 1.1521 | 0.7230 | 1.1521 | 1.0734 |
| 0.0963 | 5.3043 | 2074 | 1.0859 | 0.7230 | 1.0859 | 1.0421 |
| 0.0963 | 5.3095 | 2076 | 0.9749 | 0.7234 | 0.9749 | 0.9874 |
| 0.0963 | 5.3146 | 2078 | 0.8752 | 0.6945 | 0.8752 | 0.9355 |
| 0.0963 | 5.3197 | 2080 | 0.8543 | 0.7025 | 0.8543 | 0.9243 |
| 0.0963 | 5.3248 | 2082 | 0.8408 | 0.7025 | 0.8408 | 0.9170 |
| 0.0963 | 5.3299 | 2084 | 0.8874 | 0.6945 | 0.8874 | 0.9420 |
| 0.0963 | 5.3350 | 2086 | 0.9396 | 0.6940 | 0.9396 | 0.9693 |
| 0.0963 | 5.3402 | 2088 | 1.0272 | 0.7318 | 1.0272 | 1.0135 |
| 0.0963 | 5.3453 | 2090 | 1.0952 | 0.7318 | 1.0952 | 1.0465 |
| 0.0963 | 5.3504 | 2092 | 1.1961 | 0.7230 | 1.1961 | 1.0936 |
| 0.0963 | 5.3555 | 2094 | 1.2271 | 0.7230 | 1.2271 | 1.1077 |
| 0.0963 | 5.3606 | 2096 | 1.1730 | 0.7230 | 1.1730 | 1.0831 |
| 0.0963 | 5.3657 | 2098 | 1.1235 | 0.7318 | 1.1235 | 1.0600 |
| 0.0963 | 5.3708 | 2100 | 1.0137 | 0.7318 | 1.0137 | 1.0068 |
| 0.0963 | 5.3760 | 2102 | 0.8984 | 0.7134 | 0.8984 | 0.9479 |
| 0.0963 | 5.3811 | 2104 | 0.8350 | 0.7221 | 0.8350 | 0.9138 |
| 0.0963 | 5.3862 | 2106 | 0.8346 | 0.7221 | 0.8346 | 0.9136 |
| 0.0963 | 5.3913 | 2108 | 0.8729 | 0.7421 | 0.8729 | 0.9343 |
| 0.0963 | 5.3964 | 2110 | 0.9138 | 0.7421 | 0.9138 | 0.9559 |
| 0.0963 | 5.4015 | 2112 | 0.9037 | 0.7421 | 0.9037 | 0.9506 |
| 0.0963 | 5.4066 | 2114 | 0.9200 | 0.7421 | 0.9200 | 0.9592 |
| 0.0963 | 5.4118 | 2116 | 0.8836 | 0.7221 | 0.8836 | 0.9400 |
| 0.0963 | 5.4169 | 2118 | 0.8634 | 0.7844 | 0.8634 | 0.9292 |
| 0.0963 | 5.4220 | 2120 | 0.8466 | 0.7729 | 0.8466 | 0.9201 |
| 0.0963 | 5.4271 | 2122 | 0.8333 | 0.7729 | 0.8333 | 0.9128 |
| 0.0963 | 5.4322 | 2124 | 0.8507 | 0.7844 | 0.8507 | 0.9223 |
| 0.0963 | 5.4373 | 2126 | 0.9117 | 0.7614 | 0.9117 | 0.9548 |
| 0.0963 | 5.4425 | 2128 | 0.9555 | 0.7318 | 0.9555 | 0.9775 |
| 0.0963 | 5.4476 | 2130 | 0.9911 | 0.7318 | 0.9911 | 0.9955 |
| 0.0963 | 5.4527 | 2132 | 1.0108 | 0.7318 | 1.0108 | 1.0054 |
| 0.0963 | 5.4578 | 2134 | 0.9777 | 0.7412 | 0.9777 | 0.9888 |
| 0.0963 | 5.4629 | 2136 | 0.9459 | 0.7310 | 0.9459 | 0.9726 |
| 0.0963 | 5.4680 | 2138 | 0.9033 | 0.7844 | 0.9033 | 0.9504 |
| 0.0963 | 5.4731 | 2140 | 0.8336 | 0.7844 | 0.8336 | 0.9130 |
| 0.0963 | 5.4783 | 2142 | 0.8030 | 0.7623 | 0.8030 | 0.8961 |
| 0.0963 | 5.4834 | 2144 | 0.8059 | 0.7696 | 0.8059 | 0.8977 |
| 0.0963 | 5.4885 | 2146 | 0.8783 | 0.7514 | 0.8783 | 0.9372 |
| 0.0963 | 5.4936 | 2148 | 0.9942 | 0.7346 | 0.9942 | 0.9971 |
| 0.0963 | 5.4987 | 2150 | 1.0392 | 0.7153 | 1.0392 | 1.0194 |
| 0.0963 | 5.5038 | 2152 | 1.0253 | 0.7153 | 1.0253 | 1.0126 |
| 0.0963 | 5.5090 | 2154 | 0.9528 | 0.7346 | 0.9528 | 0.9761 |
| 0.0963 | 5.5141 | 2156 | 0.9083 | 0.7514 | 0.9083 | 0.9531 |
| 0.0963 | 5.5192 | 2158 | 0.8464 | 0.7602 | 0.8464 | 0.9200 |
| 0.0963 | 5.5243 | 2160 | 0.8434 | 0.7602 | 0.8434 | 0.9184 |
| 0.0963 | 5.5294 | 2162 | 0.8720 | 0.7516 | 0.8720 | 0.9338 |
| 0.0963 | 5.5345 | 2164 | 0.9235 | 0.7412 | 0.9235 | 0.9610 |
| 0.0963 | 5.5396 | 2166 | 0.9728 | 0.7412 | 0.9728 | 0.9863 |
| 0.0963 | 5.5448 | 2168 | 1.0069 | 0.7318 | 1.0069 | 1.0034 |
| 0.0963 | 5.5499 | 2170 | 1.0035 | 0.7318 | 1.0035 | 1.0018 |
| 0.0963 | 5.5550 | 2172 | 0.9740 | 0.7412 | 0.9740 | 0.9869 |
| 0.0963 | 5.5601 | 2174 | 0.9828 | 0.7318 | 0.9828 | 0.9913 |
| 0.0963 | 5.5652 | 2176 | 1.0285 | 0.7230 | 1.0285 | 1.0141 |
| 0.0963 | 5.5703 | 2178 | 1.0423 | 0.7230 | 1.0423 | 1.0209 |
| 0.0963 | 5.5754 | 2180 | 0.9953 | 0.7230 | 0.9953 | 0.9977 |
| 0.0963 | 5.5806 | 2182 | 0.9577 | 0.7230 | 0.9577 | 0.9786 |
| 0.0963 | 5.5857 | 2184 | 0.8868 | 0.7333 | 0.8868 | 0.9417 |
| 0.0963 | 5.5908 | 2186 | 0.8295 | 0.7255 | 0.8295 | 0.9108 |
| 0.0963 | 5.5959 | 2188 | 0.8020 | 0.7255 | 0.8020 | 0.8956 |
| 0.0963 | 5.6010 | 2190 | 0.7977 | 0.7255 | 0.7977 | 0.8931 |
| 0.0963 | 5.6061 | 2192 | 0.8590 | 0.7255 | 0.8590 | 0.9268 |
| 0.0963 | 5.6113 | 2194 | 0.8623 | 0.7333 | 0.8623 | 0.9286 |
| 0.0963 | 5.6164 | 2196 | 0.7940 | 0.7692 | 0.7940 | 0.8911 |
| 0.0963 | 5.6215 | 2198 | 0.6892 | 0.7525 | 0.6892 | 0.8302 |
| 0.0963 | 5.6266 | 2200 | 0.6424 | 0.7441 | 0.6424 | 0.8015 |
| 0.0963 | 5.6317 | 2202 | 0.6548 | 0.7441 | 0.6548 | 0.8092 |
| 0.0963 | 5.6368 | 2204 | 0.6862 | 0.7441 | 0.6862 | 0.8284 |
| 0.0963 | 5.6419 | 2206 | 0.7513 | 0.7945 | 0.7513 | 0.8668 |
| 0.0963 | 5.6471 | 2208 | 0.8089 | 0.7845 | 0.8089 | 0.8994 |
| 0.0963 | 5.6522 | 2210 | 0.8216 | 0.7845 | 0.8216 | 0.9064 |
| 0.0963 | 5.6573 | 2212 | 0.7985 | 0.7845 | 0.7985 | 0.8936 |
| 0.0963 | 5.6624 | 2214 | 0.7483 | 0.7775 | 0.7483 | 0.8650 |
| 0.0963 | 5.6675 | 2216 | 0.7035 | 0.7616 | 0.7035 | 0.8388 |
| 0.0963 | 5.6726 | 2218 | 0.6824 | 0.7616 | 0.6824 | 0.8261 |
| 0.0963 | 5.6777 | 2220 | 0.6923 | 0.7779 | 0.6923 | 0.8320 |
| 0.0963 | 5.6829 | 2222 | 0.6938 | 0.7689 | 0.6938 | 0.8329 |
| 0.0963 | 5.6880 | 2224 | 0.7239 | 0.7597 | 0.7239 | 0.8508 |
| 0.0963 | 5.6931 | 2226 | 0.7186 | 0.7518 | 0.7186 | 0.8477 |
| 0.0963 | 5.6982 | 2228 | 0.7240 | 0.7597 | 0.7240 | 0.8509 |
| 0.0963 | 5.7033 | 2230 | 0.7847 | 0.7598 | 0.7847 | 0.8858 |
| 0.0963 | 5.7084 | 2232 | 0.7986 | 0.7598 | 0.7986 | 0.8936 |
| 0.0963 | 5.7136 | 2234 | 0.7550 | 0.7598 | 0.7550 | 0.8689 |
| 0.0963 | 5.7187 | 2236 | 0.6919 | 0.7689 | 0.6919 | 0.8318 |
| 0.0963 | 5.7238 | 2238 | 0.6606 | 0.7689 | 0.6606 | 0.8128 |
| 0.0963 | 5.7289 | 2240 | 0.6261 | 0.7441 | 0.6261 | 0.7913 |
| 0.0963 | 5.7340 | 2242 | 0.6134 | 0.7441 | 0.6134 | 0.7832 |
| 0.0963 | 5.7391 | 2244 | 0.6319 | 0.7441 | 0.6319 | 0.7949 |
| 0.0963 | 5.7442 | 2246 | 0.6976 | 0.7689 | 0.6976 | 0.8352 |
| 0.0963 | 5.7494 | 2248 | 0.7555 | 0.7517 | 0.7555 | 0.8692 |
| 0.0963 | 5.7545 | 2250 | 0.8268 | 0.7333 | 0.8268 | 0.9093 |
| 0.0963 | 5.7596 | 2252 | 0.8212 | 0.7333 | 0.8212 | 0.9062 |
| 0.0963 | 5.7647 | 2254 | 0.7518 | 0.7521 | 0.7518 | 0.8671 |
| 0.0963 | 5.7698 | 2256 | 0.7197 | 0.7521 | 0.7197 | 0.8483 |
| 0.0963 | 5.7749 | 2258 | 0.6595 | 0.7986 | 0.6595 | 0.8121 |
| 0.0963 | 5.7801 | 2260 | 0.6269 | 0.7986 | 0.6269 | 0.7917 |
| 0.0963 | 5.7852 | 2262 | 0.6031 | 0.7882 | 0.6031 | 0.7766 |
| 0.0963 | 5.7903 | 2264 | 0.6187 | 0.7986 | 0.6187 | 0.7866 |
| 0.0963 | 5.7954 | 2266 | 0.6752 | 0.7525 | 0.6752 | 0.8217 |
| 0.0963 | 5.8005 | 2268 | 0.7584 | 0.7433 | 0.7584 | 0.8709 |
| 0.0963 | 5.8056 | 2270 | 0.8385 | 0.7514 | 0.8385 | 0.9157 |
| 0.0963 | 5.8107 | 2272 | 0.8792 | 0.7514 | 0.8792 | 0.9377 |
| 0.0963 | 5.8159 | 2274 | 0.8593 | 0.7514 | 0.8593 | 0.9270 |
| 0.0963 | 5.8210 | 2276 | 0.8159 | 0.7514 | 0.8159 | 0.9033 |
| 0.0963 | 5.8261 | 2278 | 0.7697 | 0.7514 | 0.7697 | 0.8773 |
| 0.0963 | 5.8312 | 2280 | 0.7381 | 0.7350 | 0.7381 | 0.8591 |
| 0.0963 | 5.8363 | 2282 | 0.7075 | 0.7597 | 0.7075 | 0.8411 |
| 0.0963 | 5.8414 | 2284 | 0.6793 | 0.7779 | 0.6793 | 0.8242 |
| 0.0963 | 5.8465 | 2286 | 0.6973 | 0.7779 | 0.6973 | 0.8350 |
| 0.0963 | 5.8517 | 2288 | 0.7614 | 0.7602 | 0.7614 | 0.8726 |
| 0.0963 | 5.8568 | 2290 | 0.8066 | 0.7602 | 0.8066 | 0.8981 |
| 0.0963 | 5.8619 | 2292 | 0.7988 | 0.7602 | 0.7988 | 0.8938 |
| 0.0963 | 5.8670 | 2294 | 0.7496 | 0.7249 | 0.7496 | 0.8658 |
| 0.0963 | 5.8721 | 2296 | 0.7453 | 0.7249 | 0.7453 | 0.8633 |
| 0.0963 | 5.8772 | 2298 | 0.7397 | 0.7249 | 0.7397 | 0.8601 |
| 0.0963 | 5.8824 | 2300 | 0.7636 | 0.7167 | 0.7636 | 0.8738 |
| 0.0963 | 5.8875 | 2302 | 0.7578 | 0.7167 | 0.7578 | 0.8705 |
| 0.0963 | 5.8926 | 2304 | 0.7266 | 0.7249 | 0.7266 | 0.8524 |
| 0.0963 | 5.8977 | 2306 | 0.6669 | 0.7249 | 0.6669 | 0.8166 |
| 0.0963 | 5.9028 | 2308 | 0.6456 | 0.7704 | 0.6456 | 0.8035 |
| 0.0963 | 5.9079 | 2310 | 0.6683 | 0.7525 | 0.6683 | 0.8175 |
| 0.0963 | 5.9130 | 2312 | 0.6681 | 0.7525 | 0.6681 | 0.8174 |
| 0.0963 | 5.9182 | 2314 | 0.6781 | 0.7525 | 0.6781 | 0.8235 |
| 0.0963 | 5.9233 | 2316 | 0.7099 | 0.7437 | 0.7099 | 0.8425 |
| 0.0963 | 5.9284 | 2318 | 0.7091 | 0.7437 | 0.7091 | 0.8421 |
| 0.0963 | 5.9335 | 2320 | 0.7437 | 0.7597 | 0.7437 | 0.8624 |
| 0.0963 | 5.9386 | 2322 | 0.7948 | 0.7514 | 0.7948 | 0.8915 |
| 0.0963 | 5.9437 | 2324 | 0.8139 | 0.7514 | 0.8139 | 0.9021 |
| 0.0963 | 5.9488 | 2326 | 0.7680 | 0.7597 | 0.7680 | 0.8764 |
| 0.0963 | 5.9540 | 2328 | 0.6926 | 0.7437 | 0.6926 | 0.8322 |
| 0.0963 | 5.9591 | 2330 | 0.6611 | 0.7525 | 0.6611 | 0.8131 |
| 0.0963 | 5.9642 | 2332 | 0.6504 | 0.7525 | 0.6504 | 0.8065 |
| 0.0963 | 5.9693 | 2334 | 0.6701 | 0.7525 | 0.6701 | 0.8186 |
| 0.0963 | 5.9744 | 2336 | 0.7314 | 0.7437 | 0.7314 | 0.8552 |
| 0.0963 | 5.9795 | 2338 | 0.7774 | 0.7058 | 0.7774 | 0.8817 |
| 0.0963 | 5.9847 | 2340 | 0.8086 | 0.7337 | 0.8086 | 0.8992 |
| 0.0963 | 5.9898 | 2342 | 0.7877 | 0.7342 | 0.7877 | 0.8875 |
| 0.0963 | 5.9949 | 2344 | 0.7568 | 0.7606 | 0.7568 | 0.8699 |
| 0.0963 | 6.0000 | 2346 | 0.7399 | 0.7606 | 0.7399 | 0.8602 |
| 0.0963 | 6.0051 | 2348 | 0.7026 | 0.7700 | 0.7026 | 0.8382 |
| 0.0963 | 6.0102 | 2350 | 0.7014 | 0.7700 | 0.7014 | 0.8375 |
| 0.0963 | 6.0153 | 2352 | 0.6815 | 0.7700 | 0.6815 | 0.8255 |
| 0.0963 | 6.0205 | 2354 | 0.6883 | 0.7700 | 0.6883 | 0.8296 |
| 0.0963 | 6.0256 | 2356 | 0.7422 | 0.7849 | 0.7422 | 0.8615 |
| 0.0963 | 6.0307 | 2358 | 0.8133 | 0.7514 | 0.8133 | 0.9018 |
| 0.0963 | 6.0358 | 2360 | 0.8294 | 0.7514 | 0.8294 | 0.9107 |
| 0.0963 | 6.0409 | 2362 | 0.7764 | 0.7514 | 0.7764 | 0.8811 |
| 0.0963 | 6.0460 | 2364 | 0.6977 | 0.7779 | 0.6977 | 0.8353 |
| 0.0963 | 6.0512 | 2366 | 0.6428 | 0.7525 | 0.6428 | 0.8018 |
| 0.0963 | 6.0563 | 2368 | 0.6411 | 0.7525 | 0.6411 | 0.8007 |
| 0.0963 | 6.0614 | 2370 | 0.6696 | 0.7525 | 0.6696 | 0.8183 |
| 0.0963 | 6.0665 | 2372 | 0.6981 | 0.7249 | 0.6981 | 0.8355 |
| 0.0963 | 6.0716 | 2374 | 0.6897 | 0.7249 | 0.6897 | 0.8305 |
| 0.0963 | 6.0767 | 2376 | 0.6952 | 0.7249 | 0.6952 | 0.8338 |
| 0.0963 | 6.0818 | 2378 | 0.7256 | 0.7249 | 0.7256 | 0.8518 |
| 0.0963 | 6.0870 | 2380 | 0.7773 | 0.7421 | 0.7773 | 0.8816 |
| 0.0963 | 6.0921 | 2382 | 0.8327 | 0.7421 | 0.8327 | 0.9125 |
| 0.0963 | 6.0972 | 2384 | 0.8280 | 0.7421 | 0.8280 | 0.9099 |
| 0.0963 | 6.1023 | 2386 | 0.7770 | 0.7692 | 0.7770 | 0.8815 |
| 0.0963 | 6.1074 | 2388 | 0.7027 | 0.7332 | 0.7027 | 0.8383 |
| 0.0963 | 6.1125 | 2390 | 0.6678 | 0.7525 | 0.6678 | 0.8172 |
| 0.0963 | 6.1176 | 2392 | 0.6405 | 0.7525 | 0.6405 | 0.8003 |
| 0.0963 | 6.1228 | 2394 | 0.6085 | 0.7525 | 0.6085 | 0.7801 |
| 0.0963 | 6.1279 | 2396 | 0.6212 | 0.7525 | 0.6212 | 0.7882 |
| 0.0963 | 6.1330 | 2398 | 0.6264 | 0.7525 | 0.6264 | 0.7914 |
| 0.0963 | 6.1381 | 2400 | 0.6493 | 0.7525 | 0.6493 | 0.8058 |
| 0.0963 | 6.1432 | 2402 | 0.6700 | 0.7525 | 0.6700 | 0.8186 |
| 0.0963 | 6.1483 | 2404 | 0.6884 | 0.7249 | 0.6884 | 0.8297 |
| 0.0963 | 6.1535 | 2406 | 0.6811 | 0.7525 | 0.6811 | 0.8253 |
| 0.0963 | 6.1586 | 2408 | 0.6945 | 0.7437 | 0.6945 | 0.8333 |
| 0.0963 | 6.1637 | 2410 | 0.7158 | 0.7433 | 0.7158 | 0.8461 |
| 0.0963 | 6.1688 | 2412 | 0.7145 | 0.7167 | 0.7145 | 0.8453 |
| 0.0963 | 6.1739 | 2414 | 0.6731 | 0.7525 | 0.6731 | 0.8204 |
| 0.0963 | 6.1790 | 2416 | 0.6301 | 0.7441 | 0.6301 | 0.7938 |
| 0.0963 | 6.1841 | 2418 | 0.6196 | 0.7614 | 0.6196 | 0.7871 |
| 0.0963 | 6.1893 | 2420 | 0.6248 | 0.7614 | 0.6248 | 0.7904 |
| 0.0963 | 6.1944 | 2422 | 0.6633 | 0.7525 | 0.6633 | 0.8145 |
| 0.0963 | 6.1995 | 2424 | 0.7443 | 0.7517 | 0.7443 | 0.8627 |
| 0.0963 | 6.2046 | 2426 | 0.8255 | 0.7239 | 0.8255 | 0.9086 |
| 0.0963 | 6.2097 | 2428 | 0.8527 | 0.7239 | 0.8527 | 0.9234 |
| 0.0963 | 6.2148 | 2430 | 0.8238 | 0.7239 | 0.8238 | 0.9076 |
| 0.0963 | 6.2199 | 2432 | 0.7555 | 0.6945 | 0.7555 | 0.8692 |
| 0.0963 | 6.2251 | 2434 | 0.6841 | 0.7249 | 0.6841 | 0.8271 |
| 0.0963 | 6.2302 | 2436 | 0.6545 | 0.7249 | 0.6545 | 0.8090 |
| 0.0963 | 6.2353 | 2438 | 0.6435 | 0.7525 | 0.6435 | 0.8022 |
| 0.0963 | 6.2404 | 2440 | 0.6615 | 0.7437 | 0.6615 | 0.8133 |
| 0.0963 | 6.2455 | 2442 | 0.6953 | 0.7685 | 0.6953 | 0.8338 |
| 0.0963 | 6.2506 | 2444 | 0.7294 | 0.7601 | 0.7294 | 0.8540 |
| 0.0963 | 6.2558 | 2446 | 0.7405 | 0.7601 | 0.7405 | 0.8605 |
| 0.0963 | 6.2609 | 2448 | 0.7668 | 0.7685 | 0.7668 | 0.8757 |
| 0.0963 | 6.2660 | 2450 | 0.7503 | 0.7685 | 0.7503 | 0.8662 |
| 0.0963 | 6.2711 | 2452 | 0.7211 | 0.7685 | 0.7211 | 0.8492 |
| 0.0963 | 6.2762 | 2454 | 0.7277 | 0.7167 | 0.7277 | 0.8530 |
| 0.0963 | 6.2813 | 2456 | 0.7546 | 0.7162 | 0.7546 | 0.8687 |
| 0.0963 | 6.2864 | 2458 | 0.7588 | 0.7162 | 0.7588 | 0.8711 |
| 0.0963 | 6.2916 | 2460 | 0.7411 | 0.7167 | 0.7411 | 0.8609 |
| 0.0963 | 6.2967 | 2462 | 0.7508 | 0.7433 | 0.7508 | 0.8665 |
| 0.0963 | 6.3018 | 2464 | 0.7669 | 0.7162 | 0.7669 | 0.8758 |
| 0.0963 | 6.3069 | 2466 | 0.7884 | 0.7162 | 0.7884 | 0.8879 |
| 0.0963 | 6.3120 | 2468 | 0.8209 | 0.7337 | 0.8209 | 0.9060 |
| 0.0963 | 6.3171 | 2470 | 0.8658 | 0.7337 | 0.8658 | 0.9305 |
| 0.0963 | 6.3223 | 2472 | 0.8920 | 0.7234 | 0.8920 | 0.9445 |
| 0.0963 | 6.3274 | 2474 | 0.8626 | 0.7426 | 0.8626 | 0.9288 |
| 0.0963 | 6.3325 | 2476 | 0.8207 | 0.6951 | 0.8207 | 0.9059 |
| 0.0963 | 6.3376 | 2478 | 0.8315 | 0.7139 | 0.8315 | 0.9119 |
| 0.0963 | 6.3427 | 2480 | 0.8302 | 0.7139 | 0.8302 | 0.9112 |
| 0.0963 | 6.3478 | 2482 | 0.8296 | 0.7139 | 0.8296 | 0.9108 |
| 0.0963 | 6.3529 | 2484 | 0.8206 | 0.7139 | 0.8206 | 0.9059 |
| 0.0963 | 6.3581 | 2486 | 0.7790 | 0.7430 | 0.7790 | 0.8826 |
| 0.0963 | 6.3632 | 2488 | 0.7542 | 0.7249 | 0.7542 | 0.8684 |
| 0.0963 | 6.3683 | 2490 | 0.7381 | 0.7249 | 0.7381 | 0.8591 |
| 0.0963 | 6.3734 | 2492 | 0.7309 | 0.7249 | 0.7309 | 0.8549 |
| 0.0963 | 6.3785 | 2494 | 0.7653 | 0.7521 | 0.7653 | 0.8748 |
| 0.0963 | 6.3836 | 2496 | 0.8320 | 0.7162 | 0.8320 | 0.9122 |
| 0.0963 | 6.3887 | 2498 | 0.8976 | 0.7337 | 0.8976 | 0.9474 |
| 0.0759 | 6.3939 | 2500 | 0.9393 | 0.7157 | 0.9393 | 0.9692 |
| 0.0759 | 6.3990 | 2502 | 0.9397 | 0.7157 | 0.9397 | 0.9694 |
| 0.0759 | 6.4041 | 2504 | 0.8762 | 0.7337 | 0.8762 | 0.9361 |
| 0.0759 | 6.4092 | 2506 | 0.8132 | 0.7162 | 0.8132 | 0.9018 |
| 0.0759 | 6.4143 | 2508 | 0.7801 | 0.7433 | 0.7801 | 0.8832 |
| 0.0759 | 6.4194 | 2510 | 0.7909 | 0.7602 | 0.7909 | 0.8893 |
| 0.0759 | 6.4246 | 2512 | 0.8030 | 0.7342 | 0.8030 | 0.8961 |
| 0.0759 | 6.4297 | 2514 | 0.8151 | 0.7342 | 0.8151 | 0.9028 |
| 0.0759 | 6.4348 | 2516 | 0.8221 | 0.7058 | 0.8221 | 0.9067 |
| 0.0759 | 6.4399 | 2518 | 0.8090 | 0.6877 | 0.8090 | 0.8995 |
| 0.0759 | 6.4450 | 2520 | 0.7847 | 0.6951 | 0.7847 | 0.8858 |
| 0.0759 | 6.4501 | 2522 | 0.7812 | 0.6951 | 0.7812 | 0.8839 |
| 0.0759 | 6.4552 | 2524 | 0.7613 | 0.7249 | 0.7613 | 0.8725 |
| 0.0759 | 6.4604 | 2526 | 0.7615 | 0.7249 | 0.7615 | 0.8727 |
| 0.0759 | 6.4655 | 2528 | 0.7880 | 0.7249 | 0.7880 | 0.8877 |
| 0.0759 | 6.4706 | 2530 | 0.7796 | 0.7249 | 0.7796 | 0.8830 |
| 0.0759 | 6.4757 | 2532 | 0.7704 | 0.7249 | 0.7704 | 0.8777 |
| 0.0759 | 6.4808 | 2534 | 0.7846 | 0.6951 | 0.7846 | 0.8858 |
| 0.0759 | 6.4859 | 2536 | 0.7922 | 0.6951 | 0.7922 | 0.8901 |
| 0.0759 | 6.4910 | 2538 | 0.7850 | 0.6951 | 0.7850 | 0.8860 |
| 0.0759 | 6.4962 | 2540 | 0.7545 | 0.6951 | 0.7545 | 0.8686 |
| 0.0759 | 6.5013 | 2542 | 0.7376 | 0.7249 | 0.7376 | 0.8589 |
| 0.0759 | 6.5064 | 2544 | 0.7029 | 0.7249 | 0.7029 | 0.8384 |
| 0.0759 | 6.5115 | 2546 | 0.6993 | 0.7249 | 0.6993 | 0.8363 |
| 0.0759 | 6.5166 | 2548 | 0.7394 | 0.7249 | 0.7394 | 0.8599 |
| 0.0759 | 6.5217 | 2550 | 0.7898 | 0.7162 | 0.7898 | 0.8887 |
| 0.0759 | 6.5269 | 2552 | 0.8622 | 0.7421 | 0.8622 | 0.9286 |
| 0.0759 | 6.5320 | 2554 | 0.9140 | 0.7511 | 0.9140 | 0.9561 |
| 0.0759 | 6.5371 | 2556 | 0.9239 | 0.7511 | 0.9239 | 0.9612 |
| 0.0759 | 6.5422 | 2558 | 0.8915 | 0.7511 | 0.8915 | 0.9442 |
| 0.0759 | 6.5473 | 2560 | 0.8447 | 0.7310 | 0.8447 | 0.9191 |
| 0.0759 | 6.5524 | 2562 | 0.8443 | 0.7310 | 0.8443 | 0.9189 |
| 0.0759 | 6.5575 | 2564 | 0.8387 | 0.7310 | 0.8387 | 0.9158 |
| 0.0759 | 6.5627 | 2566 | 0.7994 | 0.7525 | 0.7994 | 0.8941 |
| 0.0759 | 6.5678 | 2568 | 0.7698 | 0.7525 | 0.7698 | 0.8774 |
| 0.0759 | 6.5729 | 2570 | 0.7685 | 0.7025 | 0.7685 | 0.8766 |
| 0.0759 | 6.5780 | 2572 | 0.7879 | 0.6877 | 0.7879 | 0.8876 |
| 0.0759 | 6.5831 | 2574 | 0.7864 | 0.7162 | 0.7864 | 0.8868 |
| 0.0759 | 6.5882 | 2576 | 0.7684 | 0.7433 | 0.7684 | 0.8766 |
| 0.0759 | 6.5934 | 2578 | 0.7198 | 0.7685 | 0.7198 | 0.8484 |
| 0.0759 | 6.5985 | 2580 | 0.6848 | 0.7437 | 0.6848 | 0.8275 |
| 0.0759 | 6.6036 | 2582 | 0.6657 | 0.7525 | 0.6657 | 0.8159 |
| 0.0759 | 6.6087 | 2584 | 0.6791 | 0.7437 | 0.6791 | 0.8241 |
| 0.0759 | 6.6138 | 2586 | 0.7003 | 0.7685 | 0.7003 | 0.8368 |
| 0.0759 | 6.6189 | 2588 | 0.7564 | 0.7685 | 0.7564 | 0.8697 |
| 0.0759 | 6.6240 | 2590 | 0.8311 | 0.7602 | 0.8311 | 0.9116 |
| 0.0759 | 6.6292 | 2592 | 0.8466 | 0.7602 | 0.8466 | 0.9201 |
| 0.0759 | 6.6343 | 2594 | 0.8212 | 0.7602 | 0.8212 | 0.9062 |
| 0.0759 | 6.6394 | 2596 | 0.7840 | 0.7602 | 0.7840 | 0.8854 |
| 0.0759 | 6.6445 | 2598 | 0.7495 | 0.7525 | 0.7495 | 0.8657 |
| 0.0759 | 6.6496 | 2600 | 0.7425 | 0.7525 | 0.7425 | 0.8617 |
| 0.0759 | 6.6547 | 2602 | 0.7093 | 0.7525 | 0.7093 | 0.8422 |
| 0.0759 | 6.6598 | 2604 | 0.7038 | 0.7525 | 0.7038 | 0.8389 |
| 0.0759 | 6.6650 | 2606 | 0.7009 | 0.7525 | 0.7009 | 0.8372 |
| 0.0759 | 6.6701 | 2608 | 0.7093 | 0.7525 | 0.7093 | 0.8422 |
| 0.0759 | 6.6752 | 2610 | 0.7136 | 0.7525 | 0.7136 | 0.8447 |
| 0.0759 | 6.6803 | 2612 | 0.7484 | 0.7948 | 0.7484 | 0.8651 |
| 0.0759 | 6.6854 | 2614 | 0.7888 | 0.7602 | 0.7888 | 0.8881 |
| 0.0759 | 6.6905 | 2616 | 0.7802 | 0.7602 | 0.7802 | 0.8833 |
| 0.0759 | 6.6957 | 2618 | 0.7530 | 0.7696 | 0.7530 | 0.8677 |
| 0.0759 | 6.7008 | 2620 | 0.7311 | 0.7779 | 0.7311 | 0.8551 |
| 0.0759 | 6.7059 | 2622 | 0.7426 | 0.7779 | 0.7426 | 0.8618 |
| 0.0759 | 6.7110 | 2624 | 0.7348 | 0.7779 | 0.7348 | 0.8572 |
| 0.0759 | 6.7161 | 2626 | 0.7132 | 0.7779 | 0.7132 | 0.8445 |
| 0.0759 | 6.7212 | 2628 | 0.7322 | 0.7779 | 0.7322 | 0.8557 |
| 0.0759 | 6.7263 | 2630 | 0.7597 | 0.7426 | 0.7597 | 0.8716 |
| 0.0759 | 6.7315 | 2632 | 0.7754 | 0.7426 | 0.7754 | 0.8806 |
| 0.0759 | 6.7366 | 2634 | 0.7852 | 0.7426 | 0.7852 | 0.8861 |
| 0.0759 | 6.7417 | 2636 | 0.7885 | 0.7426 | 0.7885 | 0.8880 |
| 0.0759 | 6.7468 | 2638 | 0.8048 | 0.7516 | 0.8048 | 0.8971 |
| 0.0759 | 6.7519 | 2640 | 0.8011 | 0.7516 | 0.8011 | 0.8950 |
| 0.0759 | 6.7570 | 2642 | 0.7861 | 0.7244 | 0.7861 | 0.8866 |
| 0.0759 | 6.7621 | 2644 | 0.7750 | 0.7244 | 0.7750 | 0.8803 |
| 0.0759 | 6.7673 | 2646 | 0.7791 | 0.7244 | 0.7791 | 0.8826 |
| 0.0759 | 6.7724 | 2648 | 0.7991 | 0.7244 | 0.7991 | 0.8939 |
| 0.0759 | 6.7775 | 2650 | 0.8172 | 0.7244 | 0.8172 | 0.9040 |
| 0.0759 | 6.7826 | 2652 | 0.8464 | 0.7327 | 0.8464 | 0.9200 |
| 0.0759 | 6.7877 | 2654 | 0.8329 | 0.7327 | 0.8329 | 0.9126 |
| 0.0759 | 6.7928 | 2656 | 0.7832 | 0.7244 | 0.7832 | 0.8850 |
| 0.0759 | 6.7980 | 2658 | 0.7348 | 0.6951 | 0.7348 | 0.8572 |
| 0.0759 | 6.8031 | 2660 | 0.7202 | 0.6951 | 0.7202 | 0.8487 |
| 0.0759 | 6.8082 | 2662 | 0.7029 | 0.6951 | 0.7029 | 0.8384 |
| 0.0759 | 6.8133 | 2664 | 0.7160 | 0.6951 | 0.7160 | 0.8462 |
| 0.0759 | 6.8184 | 2666 | 0.7677 | 0.7244 | 0.7677 | 0.8762 |
| 0.0759 | 6.8235 | 2668 | 0.8430 | 0.7162 | 0.8430 | 0.9182 |
| 0.0759 | 6.8286 | 2670 | 0.8864 | 0.7421 | 0.8864 | 0.9415 |
| 0.0759 | 6.8338 | 2672 | 0.8890 | 0.7421 | 0.8890 | 0.9429 |
| 0.0759 | 6.8389 | 2674 | 0.9118 | 0.7511 | 0.9118 | 0.9549 |
| 0.0759 | 6.8440 | 2676 | 0.9072 | 0.7614 | 0.9072 | 0.9525 |
| 0.0759 | 6.8491 | 2678 | 0.8638 | 0.7310 | 0.8638 | 0.9294 |
| 0.0759 | 6.8542 | 2680 | 0.8022 | 0.7631 | 0.8022 | 0.8957 |
| 0.0759 | 6.8593 | 2682 | 0.7306 | 0.7525 | 0.7306 | 0.8547 |
| 0.0759 | 6.8645 | 2684 | 0.6938 | 0.7427 | 0.6938 | 0.8330 |
| 0.0759 | 6.8696 | 2686 | 0.6654 | 0.7427 | 0.6654 | 0.8157 |
| 0.0759 | 6.8747 | 2688 | 0.6660 | 0.7427 | 0.6660 | 0.8161 |
| 0.0759 | 6.8798 | 2690 | 0.6890 | 0.7427 | 0.6890 | 0.8300 |
| 0.0759 | 6.8849 | 2692 | 0.7305 | 0.6951 | 0.7305 | 0.8547 |
| 0.0759 | 6.8900 | 2694 | 0.7686 | 0.6951 | 0.7686 | 0.8767 |
| 0.0759 | 6.8951 | 2696 | 0.7906 | 0.6951 | 0.7906 | 0.8892 |
| 0.0759 | 6.9003 | 2698 | 0.7827 | 0.6951 | 0.7827 | 0.8847 |
| 0.0759 | 6.9054 | 2700 | 0.7650 | 0.6951 | 0.7650 | 0.8746 |
| 0.0759 | 6.9105 | 2702 | 0.7550 | 0.6951 | 0.7550 | 0.8689 |
| 0.0759 | 6.9156 | 2704 | 0.7428 | 0.6951 | 0.7428 | 0.8618 |
| 0.0759 | 6.9207 | 2706 | 0.7208 | 0.6951 | 0.7208 | 0.8490 |
| 0.0759 | 6.9258 | 2708 | 0.7254 | 0.6951 | 0.7254 | 0.8517 |
| 0.0759 | 6.9309 | 2710 | 0.7524 | 0.6951 | 0.7524 | 0.8674 |
| 0.0759 | 6.9361 | 2712 | 0.7724 | 0.7139 | 0.7724 | 0.8788 |
| 0.0759 | 6.9412 | 2714 | 0.7607 | 0.7139 | 0.7607 | 0.8722 |
| 0.0759 | 6.9463 | 2716 | 0.7695 | 0.7139 | 0.7695 | 0.8772 |
| 0.0759 | 6.9514 | 2718 | 0.8040 | 0.7139 | 0.8040 | 0.8966 |
| 0.0759 | 6.9565 | 2720 | 0.8113 | 0.7139 | 0.8113 | 0.9007 |
| 0.0759 | 6.9616 | 2722 | 0.8473 | 0.7139 | 0.8473 | 0.9205 |
| 0.0759 | 6.9668 | 2724 | 0.8934 | 0.7134 | 0.8934 | 0.9452 |
| 0.0759 | 6.9719 | 2726 | 0.9016 | 0.7134 | 0.9016 | 0.9495 |
| 0.0759 | 6.9770 | 2728 | 0.8635 | 0.7221 | 0.8635 | 0.9292 |
| 0.0759 | 6.9821 | 2730 | 0.7984 | 0.7139 | 0.7984 | 0.8935 |
| 0.0759 | 6.9872 | 2732 | 0.7208 | 0.6951 | 0.7208 | 0.8490 |
| 0.0759 | 6.9923 | 2734 | 0.6839 | 0.7717 | 0.6839 | 0.8270 |
| 0.0759 | 6.9974 | 2736 | 0.6856 | 0.7427 | 0.6856 | 0.8280 |
| 0.0759 | 7.0026 | 2738 | 0.7121 | 0.7427 | 0.7121 | 0.8439 |
| 0.0759 | 7.0077 | 2740 | 0.7639 | 0.6951 | 0.7639 | 0.8740 |
| 0.0759 | 7.0128 | 2742 | 0.8194 | 0.7139 | 0.8194 | 0.9052 |
| 0.0759 | 7.0179 | 2744 | 0.8522 | 0.7221 | 0.8522 | 0.9232 |
| 0.0759 | 7.0230 | 2746 | 0.8548 | 0.7221 | 0.8548 | 0.9246 |
| 0.0759 | 7.0281 | 2748 | 0.8298 | 0.7139 | 0.8298 | 0.9109 |
| 0.0759 | 7.0332 | 2750 | 0.7958 | 0.7139 | 0.7958 | 0.8921 |
| 0.0759 | 7.0384 | 2752 | 0.7852 | 0.7139 | 0.7852 | 0.8861 |
| 0.0759 | 7.0435 | 2754 | 0.7572 | 0.6951 | 0.7572 | 0.8702 |
| 0.0759 | 7.0486 | 2756 | 0.7272 | 0.7525 | 0.7272 | 0.8528 |
| 0.0759 | 7.0537 | 2758 | 0.7331 | 0.7525 | 0.7331 | 0.8562 |
| 0.0759 | 7.0588 | 2760 | 0.7629 | 0.7433 | 0.7629 | 0.8735 |
| 0.0759 | 7.0639 | 2762 | 0.7569 | 0.7433 | 0.7569 | 0.8700 |
| 0.0759 | 7.0691 | 2764 | 0.7199 | 0.7779 | 0.7199 | 0.8485 |
| 0.0759 | 7.0742 | 2766 | 0.6904 | 0.7779 | 0.6904 | 0.8309 |
| 0.0759 | 7.0793 | 2768 | 0.6585 | 0.7441 | 0.6585 | 0.8115 |
| 0.0759 | 7.0844 | 2770 | 0.6614 | 0.7441 | 0.6614 | 0.8133 |
| 0.0759 | 7.0895 | 2772 | 0.6881 | 0.7525 | 0.6881 | 0.8295 |
| 0.0759 | 7.0946 | 2774 | 0.7363 | 0.7249 | 0.7363 | 0.8581 |
| 0.0759 | 7.0997 | 2776 | 0.8043 | 0.7162 | 0.8043 | 0.8968 |
| 0.0759 | 7.1049 | 2778 | 0.8410 | 0.7421 | 0.8410 | 0.9171 |
| 0.0759 | 7.1100 | 2780 | 0.8725 | 0.7421 | 0.8725 | 0.9341 |
| 0.0759 | 7.1151 | 2782 | 0.9116 | 0.7511 | 0.9116 | 0.9548 |
| 0.0759 | 7.1202 | 2784 | 0.9095 | 0.7511 | 0.9095 | 0.9537 |
| 0.0759 | 7.1253 | 2786 | 0.8719 | 0.7421 | 0.8719 | 0.9337 |
| 0.0759 | 7.1304 | 2788 | 0.8053 | 0.6951 | 0.8053 | 0.8974 |
| 0.0759 | 7.1355 | 2790 | 0.7381 | 0.6951 | 0.7381 | 0.8591 |
| 0.0759 | 7.1407 | 2792 | 0.6844 | 0.7986 | 0.6844 | 0.8273 |
| 0.0759 | 7.1458 | 2794 | 0.6660 | 0.7986 | 0.6660 | 0.8161 |
| 0.0759 | 7.1509 | 2796 | 0.6665 | 0.7986 | 0.6665 | 0.8164 |
| 0.0759 | 7.1560 | 2798 | 0.6902 | 0.7986 | 0.6902 | 0.8308 |
| 0.0759 | 7.1611 | 2800 | 0.7276 | 0.7427 | 0.7276 | 0.8530 |
| 0.0759 | 7.1662 | 2802 | 0.7456 | 0.6951 | 0.7456 | 0.8635 |
| 0.0759 | 7.1714 | 2804 | 0.7722 | 0.7244 | 0.7722 | 0.8787 |
| 0.0759 | 7.1765 | 2806 | 0.7818 | 0.7426 | 0.7818 | 0.8842 |
| 0.0759 | 7.1816 | 2808 | 0.7629 | 0.7244 | 0.7629 | 0.8734 |
| 0.0759 | 7.1867 | 2810 | 0.7145 | 0.7986 | 0.7145 | 0.8453 |
| 0.0759 | 7.1918 | 2812 | 0.6661 | 0.7986 | 0.6661 | 0.8161 |
| 0.0759 | 7.1969 | 2814 | 0.6503 | 0.7986 | 0.6503 | 0.8064 |
| 0.0759 | 7.2020 | 2816 | 0.6641 | 0.7986 | 0.6641 | 0.8149 |
| 0.0759 | 7.2072 | 2818 | 0.6906 | 0.7986 | 0.6906 | 0.8310 |
| 0.0759 | 7.2123 | 2820 | 0.7290 | 0.6951 | 0.7290 | 0.8538 |
| 0.0759 | 7.2174 | 2822 | 0.7743 | 0.7244 | 0.7743 | 0.8799 |
| 0.0759 | 7.2225 | 2824 | 0.8037 | 0.7244 | 0.8037 | 0.8965 |
| 0.0759 | 7.2276 | 2826 | 0.7886 | 0.7244 | 0.7886 | 0.8881 |
| 0.0759 | 7.2327 | 2828 | 0.7490 | 0.7244 | 0.7490 | 0.8654 |
| 0.0759 | 7.2379 | 2830 | 0.7047 | 0.6951 | 0.7047 | 0.8395 |
| 0.0759 | 7.2430 | 2832 | 0.6731 | 0.7525 | 0.6731 | 0.8204 |
| 0.0759 | 7.2481 | 2834 | 0.6684 | 0.7525 | 0.6684 | 0.8176 |
| 0.0759 | 7.2532 | 2836 | 0.6960 | 0.6951 | 0.6960 | 0.8343 |
| 0.0759 | 7.2583 | 2838 | 0.7507 | 0.7244 | 0.7507 | 0.8665 |
| 0.0759 | 7.2634 | 2840 | 0.7859 | 0.7244 | 0.7859 | 0.8865 |
| 0.0759 | 7.2685 | 2842 | 0.7884 | 0.7244 | 0.7884 | 0.8879 |
| 0.0759 | 7.2737 | 2844 | 0.7979 | 0.7426 | 0.7979 | 0.8933 |
| 0.0759 | 7.2788 | 2846 | 0.7856 | 0.7244 | 0.7856 | 0.8863 |
| 0.0759 | 7.2839 | 2848 | 0.7766 | 0.7244 | 0.7766 | 0.8812 |
| 0.0759 | 7.2890 | 2850 | 0.8080 | 0.7426 | 0.8080 | 0.8989 |
| 0.0759 | 7.2941 | 2852 | 0.8371 | 0.7426 | 0.8371 | 0.9149 |
| 0.0759 | 7.2992 | 2854 | 0.8530 | 0.7516 | 0.8530 | 0.9236 |
| 0.0759 | 7.3043 | 2856 | 0.8588 | 0.7516 | 0.8588 | 0.9267 |
| 0.0759 | 7.3095 | 2858 | 0.8386 | 0.7729 | 0.8386 | 0.9158 |
| 0.0759 | 7.3146 | 2860 | 0.8471 | 0.7729 | 0.8471 | 0.9204 |
| 0.0759 | 7.3197 | 2862 | 0.8616 | 0.7729 | 0.8616 | 0.9282 |
| 0.0759 | 7.3248 | 2864 | 0.8623 | 0.7221 | 0.8623 | 0.9286 |
| 0.0759 | 7.3299 | 2866 | 0.8528 | 0.7516 | 0.8528 | 0.9235 |
| 0.0759 | 7.3350 | 2868 | 0.8499 | 0.7516 | 0.8499 | 0.9219 |
| 0.0759 | 7.3402 | 2870 | 0.8174 | 0.7426 | 0.8174 | 0.9041 |
| 0.0759 | 7.3453 | 2872 | 0.7900 | 0.7426 | 0.7900 | 0.8888 |
| 0.0759 | 7.3504 | 2874 | 0.7679 | 0.7426 | 0.7679 | 0.8763 |
| 0.0759 | 7.3555 | 2876 | 0.7425 | 0.6951 | 0.7425 | 0.8617 |
| 0.0759 | 7.3606 | 2878 | 0.7474 | 0.6951 | 0.7474 | 0.8645 |
| 0.0759 | 7.3657 | 2880 | 0.7710 | 0.6951 | 0.7710 | 0.8781 |
| 0.0759 | 7.3708 | 2882 | 0.8098 | 0.7244 | 0.8098 | 0.8999 |
| 0.0759 | 7.3760 | 2884 | 0.8453 | 0.7421 | 0.8453 | 0.9194 |
| 0.0759 | 7.3811 | 2886 | 0.8920 | 0.7318 | 0.8920 | 0.9444 |
| 0.0759 | 7.3862 | 2888 | 0.9115 | 0.7318 | 0.9115 | 0.9547 |
| 0.0759 | 7.3913 | 2890 | 0.9134 | 0.7318 | 0.9134 | 0.9557 |
| 0.0759 | 7.3964 | 2892 | 0.9371 | 0.7318 | 0.9371 | 0.9680 |
| 0.0759 | 7.4015 | 2894 | 0.9357 | 0.7318 | 0.9357 | 0.9673 |
| 0.0759 | 7.4066 | 2896 | 0.9191 | 0.7318 | 0.9191 | 0.9587 |
| 0.0759 | 7.4118 | 2898 | 0.9116 | 0.7318 | 0.9116 | 0.9548 |
| 0.0759 | 7.4169 | 2900 | 0.8957 | 0.7318 | 0.8957 | 0.9464 |
| 0.0759 | 7.4220 | 2902 | 0.8747 | 0.7014 | 0.8747 | 0.9353 |
| 0.0759 | 7.4271 | 2904 | 0.8351 | 0.7221 | 0.8351 | 0.9138 |
| 0.0759 | 7.4322 | 2906 | 0.8151 | 0.7139 | 0.8151 | 0.9028 |
| 0.0759 | 7.4373 | 2908 | 0.8149 | 0.7058 | 0.8149 | 0.9027 |
| 0.0759 | 7.4425 | 2910 | 0.8150 | 0.7058 | 0.8150 | 0.9028 |
| 0.0759 | 7.4476 | 2912 | 0.8068 | 0.7058 | 0.8068 | 0.8982 |
| 0.0759 | 7.4527 | 2914 | 0.7791 | 0.7337 | 0.7791 | 0.8827 |
| 0.0759 | 7.4578 | 2916 | 0.7716 | 0.7162 | 0.7716 | 0.8784 |
| 0.0759 | 7.4629 | 2918 | 0.7727 | 0.7162 | 0.7727 | 0.8790 |
| 0.0759 | 7.4680 | 2920 | 0.7801 | 0.7337 | 0.7801 | 0.8832 |
| 0.0759 | 7.4731 | 2922 | 0.7814 | 0.7337 | 0.7814 | 0.8840 |
| 0.0759 | 7.4783 | 2924 | 0.7897 | 0.7337 | 0.7897 | 0.8887 |
| 0.0759 | 7.4834 | 2926 | 0.7778 | 0.7427 | 0.7778 | 0.8819 |
| 0.0759 | 7.4885 | 2928 | 0.7836 | 0.7427 | 0.7836 | 0.8852 |
| 0.0759 | 7.4936 | 2930 | 0.8028 | 0.7729 | 0.8028 | 0.8960 |
| 0.0759 | 7.4987 | 2932 | 0.7980 | 0.7729 | 0.7980 | 0.8933 |
| 0.0759 | 7.5038 | 2934 | 0.7923 | 0.7525 | 0.7923 | 0.8901 |
| 0.0759 | 7.5090 | 2936 | 0.7880 | 0.7525 | 0.7880 | 0.8877 |
| 0.0759 | 7.5141 | 2938 | 0.7804 | 0.7525 | 0.7804 | 0.8834 |
| 0.0759 | 7.5192 | 2940 | 0.7597 | 0.7427 | 0.7597 | 0.8716 |
| 0.0759 | 7.5243 | 2942 | 0.7424 | 0.7427 | 0.7424 | 0.8616 |
| 0.0759 | 7.5294 | 2944 | 0.7208 | 0.7427 | 0.7208 | 0.8490 |
| 0.0759 | 7.5345 | 2946 | 0.7060 | 0.7983 | 0.7060 | 0.8402 |
| 0.0759 | 7.5396 | 2948 | 0.6945 | 0.7983 | 0.6945 | 0.8334 |
| 0.0759 | 7.5448 | 2950 | 0.7138 | 0.7521 | 0.7138 | 0.8448 |
| 0.0759 | 7.5499 | 2952 | 0.7490 | 0.7162 | 0.7490 | 0.8654 |
| 0.0759 | 7.5550 | 2954 | 0.7682 | 0.7162 | 0.7682 | 0.8765 |
| 0.0759 | 7.5601 | 2956 | 0.7913 | 0.7162 | 0.7913 | 0.8895 |
| 0.0759 | 7.5652 | 2958 | 0.7987 | 0.7162 | 0.7987 | 0.8937 |
| 0.0759 | 7.5703 | 2960 | 0.8156 | 0.7162 | 0.8156 | 0.9031 |
| 0.0759 | 7.5754 | 2962 | 0.8161 | 0.7162 | 0.8161 | 0.9034 |
| 0.0759 | 7.5806 | 2964 | 0.7992 | 0.7162 | 0.7992 | 0.8940 |
| 0.0759 | 7.5857 | 2966 | 0.7690 | 0.7244 | 0.7690 | 0.8769 |
| 0.0759 | 7.5908 | 2968 | 0.7353 | 0.7713 | 0.7353 | 0.8575 |
| 0.0759 | 7.5959 | 2970 | 0.7207 | 0.7427 | 0.7207 | 0.8490 |
| 0.0759 | 7.6010 | 2972 | 0.7071 | 0.7427 | 0.7071 | 0.8409 |
| 0.0759 | 7.6061 | 2974 | 0.7000 | 0.7427 | 0.7000 | 0.8367 |
| 0.0759 | 7.6113 | 2976 | 0.7078 | 0.7427 | 0.7078 | 0.8413 |
| 0.0759 | 7.6164 | 2978 | 0.7354 | 0.7427 | 0.7354 | 0.8575 |
| 0.0759 | 7.6215 | 2980 | 0.7839 | 0.7162 | 0.7839 | 0.8854 |
| 0.0759 | 7.6266 | 2982 | 0.8452 | 0.7421 | 0.8452 | 0.9193 |
| 0.0759 | 7.6317 | 2984 | 0.8820 | 0.7421 | 0.8820 | 0.9391 |
| 0.0759 | 7.6368 | 2986 | 0.9155 | 0.7421 | 0.9155 | 0.9568 |
| 0.0759 | 7.6419 | 2988 | 0.9062 | 0.7421 | 0.9062 | 0.9519 |
| 0.0759 | 7.6471 | 2990 | 0.8766 | 0.7421 | 0.8766 | 0.9363 |
| 0.0759 | 7.6522 | 2992 | 0.8499 | 0.7421 | 0.8499 | 0.9219 |
| 0.0759 | 7.6573 | 2994 | 0.8283 | 0.7421 | 0.8283 | 0.9101 |
| 0.0759 | 7.6624 | 2996 | 0.7891 | 0.7337 | 0.7891 | 0.8883 |
| 0.0759 | 7.6675 | 2998 | 0.7313 | 0.7717 | 0.7313 | 0.8551 |
| 0.0613 | 7.6726 | 3000 | 0.6889 | 0.7717 | 0.6889 | 0.8300 |
| 0.0613 | 7.6777 | 3002 | 0.6626 | 0.7986 | 0.6626 | 0.8140 |
| 0.0613 | 7.6829 | 3004 | 0.6637 | 0.7986 | 0.6637 | 0.8147 |
| 0.0613 | 7.6880 | 3006 | 0.6808 | 0.7986 | 0.6808 | 0.8251 |
| 0.0613 | 7.6931 | 3008 | 0.7003 | 0.7986 | 0.7003 | 0.8368 |
| 0.0613 | 7.6982 | 3010 | 0.7388 | 0.7521 | 0.7388 | 0.8595 |
| 0.0613 | 7.7033 | 3012 | 0.7966 | 0.7602 | 0.7966 | 0.8925 |
| 0.0613 | 7.7084 | 3014 | 0.8590 | 0.7337 | 0.8590 | 0.9268 |
| 0.0613 | 7.7136 | 3016 | 0.9164 | 0.7255 | 0.9164 | 0.9573 |
| 0.0613 | 7.7187 | 3018 | 0.9368 | 0.7255 | 0.9368 | 0.9679 |
| 0.0613 | 7.7238 | 3020 | 0.9175 | 0.7255 | 0.9175 | 0.9578 |
| 0.0613 | 7.7289 | 3022 | 0.8808 | 0.7337 | 0.8808 | 0.9385 |
| 0.0613 | 7.7340 | 3024 | 0.8375 | 0.7426 | 0.8375 | 0.9152 |
| 0.0613 | 7.7391 | 3026 | 0.7938 | 0.7139 | 0.7938 | 0.8910 |
| 0.0613 | 7.7442 | 3028 | 0.7596 | 0.7623 | 0.7596 | 0.8715 |
| 0.0613 | 7.7494 | 3030 | 0.7282 | 0.7427 | 0.7282 | 0.8534 |
| 0.0613 | 7.7545 | 3032 | 0.7068 | 0.7717 | 0.7068 | 0.8407 |
| 0.0613 | 7.7596 | 3034 | 0.7082 | 0.7717 | 0.7082 | 0.8415 |
| 0.0613 | 7.7647 | 3036 | 0.7111 | 0.7717 | 0.7111 | 0.8433 |
| 0.0613 | 7.7698 | 3038 | 0.7256 | 0.7717 | 0.7256 | 0.8518 |
| 0.0613 | 7.7749 | 3040 | 0.7415 | 0.7717 | 0.7415 | 0.8611 |
| 0.0613 | 7.7801 | 3042 | 0.7530 | 0.7906 | 0.7530 | 0.8677 |
| 0.0613 | 7.7852 | 3044 | 0.7586 | 0.7906 | 0.7586 | 0.8710 |
| 0.0613 | 7.7903 | 3046 | 0.7590 | 0.7906 | 0.7590 | 0.8712 |
| 0.0613 | 7.7954 | 3048 | 0.7437 | 0.7717 | 0.7437 | 0.8624 |
| 0.0613 | 7.8005 | 3050 | 0.7393 | 0.7249 | 0.7393 | 0.8598 |
| 0.0613 | 7.8056 | 3052 | 0.7513 | 0.7521 | 0.7513 | 0.8668 |
| 0.0613 | 7.8107 | 3054 | 0.7456 | 0.7433 | 0.7456 | 0.8635 |
| 0.0613 | 7.8159 | 3056 | 0.7247 | 0.7433 | 0.7247 | 0.8513 |
| 0.0613 | 7.8210 | 3058 | 0.7156 | 0.7685 | 0.7156 | 0.8459 |
| 0.0613 | 7.8261 | 3060 | 0.7054 | 0.7685 | 0.7054 | 0.8399 |
| 0.0613 | 7.8312 | 3062 | 0.7050 | 0.7685 | 0.7050 | 0.8396 |
| 0.0613 | 7.8363 | 3064 | 0.6971 | 0.7685 | 0.6971 | 0.8349 |
| 0.0613 | 7.8414 | 3066 | 0.6945 | 0.7685 | 0.6945 | 0.8334 |
| 0.0613 | 7.8465 | 3068 | 0.6947 | 0.7685 | 0.6947 | 0.8335 |
| 0.0613 | 7.8517 | 3070 | 0.6813 | 0.7525 | 0.6813 | 0.8254 |
| 0.0613 | 7.8568 | 3072 | 0.6750 | 0.7525 | 0.6750 | 0.8216 |
| 0.0613 | 7.8619 | 3074 | 0.6644 | 0.7525 | 0.6644 | 0.8151 |
| 0.0613 | 7.8670 | 3076 | 0.6657 | 0.7525 | 0.6657 | 0.8159 |
| 0.0613 | 7.8721 | 3078 | 0.6738 | 0.7986 | 0.6738 | 0.8209 |
| 0.0613 | 7.8772 | 3080 | 0.6850 | 0.7525 | 0.6850 | 0.8276 |
| 0.0613 | 7.8824 | 3082 | 0.6922 | 0.7525 | 0.6922 | 0.8320 |
| 0.0613 | 7.8875 | 3084 | 0.6792 | 0.7986 | 0.6792 | 0.8241 |
| 0.0613 | 7.8926 | 3086 | 0.6725 | 0.7986 | 0.6725 | 0.8200 |
| 0.0613 | 7.8977 | 3088 | 0.6641 | 0.7986 | 0.6641 | 0.8150 |
| 0.0613 | 7.9028 | 3090 | 0.6603 | 0.7986 | 0.6603 | 0.8126 |
| 0.0613 | 7.9079 | 3092 | 0.6586 | 0.7986 | 0.6586 | 0.8115 |
| 0.0613 | 7.9130 | 3094 | 0.6494 | 0.7986 | 0.6494 | 0.8058 |
| 0.0613 | 7.9182 | 3096 | 0.6446 | 0.7986 | 0.6446 | 0.8029 |
| 0.0613 | 7.9233 | 3098 | 0.6326 | 0.7986 | 0.6326 | 0.7953 |
| 0.0613 | 7.9284 | 3100 | 0.6283 | 0.7986 | 0.6283 | 0.7927 |
| 0.0613 | 7.9335 | 3102 | 0.6303 | 0.7986 | 0.6303 | 0.7939 |
| 0.0613 | 7.9386 | 3104 | 0.6502 | 0.7986 | 0.6502 | 0.8063 |
| 0.0613 | 7.9437 | 3106 | 0.6622 | 0.7986 | 0.6622 | 0.8138 |
| 0.0613 | 7.9488 | 3108 | 0.6792 | 0.7879 | 0.6792 | 0.8241 |
| 0.0613 | 7.9540 | 3110 | 0.6934 | 0.7616 | 0.6934 | 0.8327 |
| 0.0613 | 7.9591 | 3112 | 0.6834 | 0.7616 | 0.6834 | 0.8267 |
| 0.0613 | 7.9642 | 3114 | 0.6609 | 0.7717 | 0.6609 | 0.8130 |
| 0.0613 | 7.9693 | 3116 | 0.6446 | 0.7717 | 0.6446 | 0.8029 |
| 0.0613 | 7.9744 | 3118 | 0.6494 | 0.7717 | 0.6494 | 0.8058 |
| 0.0613 | 7.9795 | 3120 | 0.6611 | 0.7717 | 0.6611 | 0.8131 |
| 0.0613 | 7.9847 | 3122 | 0.6900 | 0.7717 | 0.6900 | 0.8306 |
| 0.0613 | 7.9898 | 3124 | 0.7294 | 0.7427 | 0.7294 | 0.8540 |
| 0.0613 | 7.9949 | 3126 | 0.7475 | 0.7427 | 0.7475 | 0.8646 |
| 0.0613 | 8.0 | 3128 | 0.7391 | 0.7427 | 0.7391 | 0.8597 |
| 0.0613 | 8.0051 | 3130 | 0.7109 | 0.7427 | 0.7109 | 0.8431 |
| 0.0613 | 8.0102 | 3132 | 0.6950 | 0.7427 | 0.6950 | 0.8337 |
| 0.0613 | 8.0153 | 3134 | 0.6976 | 0.7427 | 0.6976 | 0.8352 |
| 0.0613 | 8.0205 | 3136 | 0.7166 | 0.7427 | 0.7166 | 0.8465 |
| 0.0613 | 8.0256 | 3138 | 0.7396 | 0.7427 | 0.7396 | 0.8600 |
| 0.0613 | 8.0307 | 3140 | 0.7467 | 0.7427 | 0.7467 | 0.8641 |
| 0.0613 | 8.0358 | 3142 | 0.7674 | 0.6877 | 0.7674 | 0.8760 |
| 0.0613 | 8.0409 | 3144 | 0.7843 | 0.6877 | 0.7843 | 0.8856 |
| 0.0613 | 8.0460 | 3146 | 0.7841 | 0.6877 | 0.7841 | 0.8855 |
| 0.0613 | 8.0512 | 3148 | 0.7945 | 0.7162 | 0.7945 | 0.8913 |
| 0.0613 | 8.0563 | 3150 | 0.8034 | 0.7162 | 0.8034 | 0.8963 |
| 0.0613 | 8.0614 | 3152 | 0.7924 | 0.7162 | 0.7924 | 0.8902 |
| 0.0613 | 8.0665 | 3154 | 0.7576 | 0.7162 | 0.7576 | 0.8704 |
| 0.0613 | 8.0716 | 3156 | 0.7407 | 0.7162 | 0.7407 | 0.8606 |
| 0.0613 | 8.0767 | 3158 | 0.7149 | 0.7433 | 0.7149 | 0.8455 |
| 0.0613 | 8.0818 | 3160 | 0.6858 | 0.7249 | 0.6858 | 0.8281 |
| 0.0613 | 8.0870 | 3162 | 0.6674 | 0.7717 | 0.6674 | 0.8170 |
| 0.0613 | 8.0921 | 3164 | 0.6610 | 0.7717 | 0.6610 | 0.8130 |
| 0.0613 | 8.0972 | 3166 | 0.6628 | 0.7717 | 0.6628 | 0.8142 |
| 0.0613 | 8.1023 | 3168 | 0.6831 | 0.7717 | 0.6831 | 0.8265 |
| 0.0613 | 8.1074 | 3170 | 0.7159 | 0.7427 | 0.7159 | 0.8461 |
| 0.0613 | 8.1125 | 3172 | 0.7610 | 0.7427 | 0.7610 | 0.8723 |
| 0.0613 | 8.1176 | 3174 | 0.7988 | 0.7332 | 0.7988 | 0.8938 |
| 0.0613 | 8.1228 | 3176 | 0.8152 | 0.7162 | 0.8152 | 0.9029 |
| 0.0613 | 8.1279 | 3178 | 0.8033 | 0.7162 | 0.8033 | 0.8963 |
| 0.0613 | 8.1330 | 3180 | 0.7668 | 0.7612 | 0.7668 | 0.8757 |
| 0.0613 | 8.1381 | 3182 | 0.7335 | 0.7427 | 0.7335 | 0.8564 |
| 0.0613 | 8.1432 | 3184 | 0.7105 | 0.7717 | 0.7105 | 0.8429 |
| 0.0613 | 8.1483 | 3186 | 0.6864 | 0.7717 | 0.6864 | 0.8285 |
| 0.0613 | 8.1535 | 3188 | 0.6645 | 0.7717 | 0.6645 | 0.8152 |
| 0.0613 | 8.1586 | 3190 | 0.6520 | 0.7717 | 0.6520 | 0.8074 |
| 0.0613 | 8.1637 | 3192 | 0.6573 | 0.7717 | 0.6573 | 0.8107 |
| 0.0613 | 8.1688 | 3194 | 0.6755 | 0.7521 | 0.6755 | 0.8219 |
| 0.0613 | 8.1739 | 3196 | 0.7029 | 0.7433 | 0.7029 | 0.8384 |
| 0.0613 | 8.1790 | 3198 | 0.7232 | 0.7162 | 0.7232 | 0.8504 |
| 0.0613 | 8.1841 | 3200 | 0.7301 | 0.7162 | 0.7301 | 0.8544 |
| 0.0613 | 8.1893 | 3202 | 0.7246 | 0.7162 | 0.7246 | 0.8513 |
| 0.0613 | 8.1944 | 3204 | 0.7157 | 0.7162 | 0.7157 | 0.8460 |
| 0.0613 | 8.1995 | 3206 | 0.7171 | 0.7162 | 0.7171 | 0.8468 |
| 0.0613 | 8.2046 | 3208 | 0.7300 | 0.7162 | 0.7300 | 0.8544 |
| 0.0613 | 8.2097 | 3210 | 0.7369 | 0.7162 | 0.7369 | 0.8584 |
| 0.0613 | 8.2148 | 3212 | 0.7390 | 0.7612 | 0.7390 | 0.8596 |
| 0.0613 | 8.2199 | 3214 | 0.7452 | 0.7612 | 0.7452 | 0.8633 |
| 0.0613 | 8.2251 | 3216 | 0.7550 | 0.7162 | 0.7550 | 0.8689 |
| 0.0613 | 8.2302 | 3218 | 0.7657 | 0.7162 | 0.7657 | 0.8750 |
| 0.0613 | 8.2353 | 3220 | 0.7954 | 0.7162 | 0.7954 | 0.8918 |
| 0.0613 | 8.2404 | 3222 | 0.8190 | 0.7162 | 0.8190 | 0.9050 |
| 0.0613 | 8.2455 | 3224 | 0.8163 | 0.7162 | 0.8163 | 0.9035 |
| 0.0613 | 8.2506 | 3226 | 0.7957 | 0.7162 | 0.7957 | 0.8920 |
| 0.0613 | 8.2558 | 3228 | 0.7586 | 0.7162 | 0.7586 | 0.8710 |
| 0.0613 | 8.2609 | 3230 | 0.7370 | 0.6877 | 0.7370 | 0.8585 |
| 0.0613 | 8.2660 | 3232 | 0.7337 | 0.6877 | 0.7337 | 0.8566 |
| 0.0613 | 8.2711 | 3234 | 0.7308 | 0.7162 | 0.7308 | 0.8549 |
| 0.0613 | 8.2762 | 3236 | 0.7298 | 0.7162 | 0.7298 | 0.8543 |
| 0.0613 | 8.2813 | 3238 | 0.7393 | 0.7162 | 0.7393 | 0.8598 |
| 0.0613 | 8.2864 | 3240 | 0.7379 | 0.7162 | 0.7379 | 0.8590 |
| 0.0613 | 8.2916 | 3242 | 0.7382 | 0.7162 | 0.7382 | 0.8592 |
| 0.0613 | 8.2967 | 3244 | 0.7409 | 0.7162 | 0.7409 | 0.8608 |
| 0.0613 | 8.3018 | 3246 | 0.7414 | 0.7162 | 0.7414 | 0.8610 |
| 0.0613 | 8.3069 | 3248 | 0.7406 | 0.7162 | 0.7406 | 0.8606 |
| 0.0613 | 8.3120 | 3250 | 0.7439 | 0.7162 | 0.7439 | 0.8625 |
| 0.0613 | 8.3171 | 3252 | 0.7449 | 0.7162 | 0.7449 | 0.8631 |
| 0.0613 | 8.3223 | 3254 | 0.7397 | 0.7162 | 0.7397 | 0.8601 |
| 0.0613 | 8.3274 | 3256 | 0.7284 | 0.7162 | 0.7284 | 0.8535 |
| 0.0613 | 8.3325 | 3258 | 0.7137 | 0.7244 | 0.7137 | 0.8448 |
| 0.0613 | 8.3376 | 3260 | 0.7211 | 0.7162 | 0.7211 | 0.8492 |
| 0.0613 | 8.3427 | 3262 | 0.7361 | 0.7162 | 0.7361 | 0.8580 |
| 0.0613 | 8.3478 | 3264 | 0.7594 | 0.7162 | 0.7594 | 0.8714 |
| 0.0613 | 8.3529 | 3266 | 0.7865 | 0.7162 | 0.7865 | 0.8868 |
| 0.0613 | 8.3581 | 3268 | 0.8053 | 0.7255 | 0.8053 | 0.8974 |
| 0.0613 | 8.3632 | 3270 | 0.8136 | 0.7255 | 0.8136 | 0.9020 |
| 0.0613 | 8.3683 | 3272 | 0.8107 | 0.7337 | 0.8107 | 0.9004 |
| 0.0613 | 8.3734 | 3274 | 0.7964 | 0.7337 | 0.7964 | 0.8924 |
| 0.0613 | 8.3785 | 3276 | 0.7841 | 0.7162 | 0.7841 | 0.8855 |
| 0.0613 | 8.3836 | 3278 | 0.7539 | 0.7162 | 0.7539 | 0.8683 |
| 0.0613 | 8.3887 | 3280 | 0.7386 | 0.7244 | 0.7386 | 0.8594 |
| 0.0613 | 8.3939 | 3282 | 0.7401 | 0.7244 | 0.7401 | 0.8603 |
| 0.0613 | 8.3990 | 3284 | 0.7478 | 0.7244 | 0.7478 | 0.8648 |
| 0.0613 | 8.4041 | 3286 | 0.7516 | 0.7244 | 0.7516 | 0.8670 |
| 0.0613 | 8.4092 | 3288 | 0.7517 | 0.7244 | 0.7517 | 0.8670 |
| 0.0613 | 8.4143 | 3290 | 0.7562 | 0.7244 | 0.7562 | 0.8696 |
| 0.0613 | 8.4194 | 3292 | 0.7602 | 0.7244 | 0.7602 | 0.8719 |
| 0.0613 | 8.4246 | 3294 | 0.7642 | 0.7244 | 0.7642 | 0.8742 |
| 0.0613 | 8.4297 | 3296 | 0.7672 | 0.7162 | 0.7672 | 0.8759 |
| 0.0613 | 8.4348 | 3298 | 0.7601 | 0.7162 | 0.7601 | 0.8719 |
| 0.0613 | 8.4399 | 3300 | 0.7615 | 0.7162 | 0.7615 | 0.8726 |
| 0.0613 | 8.4450 | 3302 | 0.7520 | 0.7244 | 0.7520 | 0.8672 |
| 0.0613 | 8.4501 | 3304 | 0.7530 | 0.7244 | 0.7530 | 0.8677 |
| 0.0613 | 8.4552 | 3306 | 0.7619 | 0.7244 | 0.7619 | 0.8729 |
| 0.0613 | 8.4604 | 3308 | 0.7689 | 0.7162 | 0.7689 | 0.8768 |
| 0.0613 | 8.4655 | 3310 | 0.7811 | 0.7162 | 0.7811 | 0.8838 |
| 0.0613 | 8.4706 | 3312 | 0.7755 | 0.7244 | 0.7755 | 0.8806 |
| 0.0613 | 8.4757 | 3314 | 0.7623 | 0.7244 | 0.7623 | 0.8731 |
| 0.0613 | 8.4808 | 3316 | 0.7535 | 0.7244 | 0.7535 | 0.8681 |
| 0.0613 | 8.4859 | 3318 | 0.7548 | 0.7244 | 0.7548 | 0.8688 |
| 0.0613 | 8.4910 | 3320 | 0.7677 | 0.7244 | 0.7677 | 0.8762 |
| 0.0613 | 8.4962 | 3322 | 0.7846 | 0.7244 | 0.7846 | 0.8858 |
| 0.0613 | 8.5013 | 3324 | 0.8031 | 0.7162 | 0.8031 | 0.8962 |
| 0.0613 | 8.5064 | 3326 | 0.8087 | 0.7162 | 0.8087 | 0.8993 |
| 0.0613 | 8.5115 | 3328 | 0.7941 | 0.7162 | 0.7941 | 0.8911 |
| 0.0613 | 8.5166 | 3330 | 0.7828 | 0.7162 | 0.7828 | 0.8847 |
| 0.0613 | 8.5217 | 3332 | 0.7702 | 0.7244 | 0.7702 | 0.8776 |
| 0.0613 | 8.5269 | 3334 | 0.7516 | 0.7244 | 0.7516 | 0.8669 |
| 0.0613 | 8.5320 | 3336 | 0.7218 | 0.7521 | 0.7218 | 0.8496 |
| 0.0613 | 8.5371 | 3338 | 0.6847 | 0.7249 | 0.6847 | 0.8274 |
| 0.0613 | 8.5422 | 3340 | 0.6556 | 0.7249 | 0.6556 | 0.8097 |
| 0.0613 | 8.5473 | 3342 | 0.6459 | 0.7525 | 0.6459 | 0.8037 |
| 0.0613 | 8.5524 | 3344 | 0.6516 | 0.7525 | 0.6516 | 0.8072 |
| 0.0613 | 8.5575 | 3346 | 0.6614 | 0.7525 | 0.6614 | 0.8133 |
| 0.0613 | 8.5627 | 3348 | 0.6665 | 0.7249 | 0.6665 | 0.8164 |
| 0.0613 | 8.5678 | 3350 | 0.6828 | 0.7521 | 0.6828 | 0.8263 |
| 0.0613 | 8.5729 | 3352 | 0.7032 | 0.7521 | 0.7032 | 0.8386 |
| 0.0613 | 8.5780 | 3354 | 0.7174 | 0.7521 | 0.7174 | 0.8470 |
| 0.0613 | 8.5831 | 3356 | 0.7368 | 0.7433 | 0.7368 | 0.8584 |
| 0.0613 | 8.5882 | 3358 | 0.7534 | 0.7433 | 0.7534 | 0.8680 |
| 0.0613 | 8.5934 | 3360 | 0.7755 | 0.7162 | 0.7755 | 0.8806 |
| 0.0613 | 8.5985 | 3362 | 0.7828 | 0.7162 | 0.7828 | 0.8848 |
| 0.0613 | 8.6036 | 3364 | 0.7967 | 0.7162 | 0.7967 | 0.8926 |
| 0.0613 | 8.6087 | 3366 | 0.7930 | 0.7162 | 0.7930 | 0.8905 |
| 0.0613 | 8.6138 | 3368 | 0.7722 | 0.7433 | 0.7722 | 0.8788 |
| 0.0613 | 8.6189 | 3370 | 0.7577 | 0.7433 | 0.7577 | 0.8704 |
| 0.0613 | 8.6240 | 3372 | 0.7557 | 0.7433 | 0.7557 | 0.8693 |
| 0.0613 | 8.6292 | 3374 | 0.7667 | 0.7162 | 0.7667 | 0.8756 |
| 0.0613 | 8.6343 | 3376 | 0.7674 | 0.7162 | 0.7674 | 0.8760 |
| 0.0613 | 8.6394 | 3378 | 0.7626 | 0.7162 | 0.7626 | 0.8732 |
| 0.0613 | 8.6445 | 3380 | 0.7623 | 0.7162 | 0.7623 | 0.8731 |
| 0.0613 | 8.6496 | 3382 | 0.7628 | 0.7162 | 0.7628 | 0.8734 |
| 0.0613 | 8.6547 | 3384 | 0.7563 | 0.7162 | 0.7563 | 0.8697 |
| 0.0613 | 8.6598 | 3386 | 0.7412 | 0.7244 | 0.7412 | 0.8610 |
| 0.0613 | 8.6650 | 3388 | 0.7214 | 0.6951 | 0.7214 | 0.8494 |
| 0.0613 | 8.6701 | 3390 | 0.7062 | 0.6951 | 0.7062 | 0.8404 |
| 0.0613 | 8.6752 | 3392 | 0.6890 | 0.7249 | 0.6890 | 0.8301 |
| 0.0613 | 8.6803 | 3394 | 0.6871 | 0.7249 | 0.6871 | 0.8289 |
| 0.0613 | 8.6854 | 3396 | 0.6993 | 0.6951 | 0.6993 | 0.8363 |
| 0.0613 | 8.6905 | 3398 | 0.7157 | 0.6951 | 0.7157 | 0.8460 |
| 0.0613 | 8.6957 | 3400 | 0.7307 | 0.6951 | 0.7307 | 0.8548 |
| 0.0613 | 8.7008 | 3402 | 0.7343 | 0.6951 | 0.7343 | 0.8569 |
| 0.0613 | 8.7059 | 3404 | 0.7337 | 0.6951 | 0.7337 | 0.8566 |
| 0.0613 | 8.7110 | 3406 | 0.7242 | 0.6951 | 0.7242 | 0.8510 |
| 0.0613 | 8.7161 | 3408 | 0.7150 | 0.6951 | 0.7150 | 0.8456 |
| 0.0613 | 8.7212 | 3410 | 0.7168 | 0.6951 | 0.7168 | 0.8466 |
| 0.0613 | 8.7263 | 3412 | 0.7267 | 0.6951 | 0.7267 | 0.8525 |
| 0.0613 | 8.7315 | 3414 | 0.7371 | 0.6951 | 0.7371 | 0.8585 |
| 0.0613 | 8.7366 | 3416 | 0.7467 | 0.6951 | 0.7467 | 0.8641 |
| 0.0613 | 8.7417 | 3418 | 0.7497 | 0.6951 | 0.7497 | 0.8658 |
| 0.0613 | 8.7468 | 3420 | 0.7529 | 0.7162 | 0.7529 | 0.8677 |
| 0.0613 | 8.7519 | 3422 | 0.7567 | 0.7162 | 0.7567 | 0.8699 |
| 0.0613 | 8.7570 | 3424 | 0.7446 | 0.6951 | 0.7446 | 0.8629 |
| 0.0613 | 8.7621 | 3426 | 0.7308 | 0.6951 | 0.7308 | 0.8548 |
| 0.0613 | 8.7673 | 3428 | 0.7111 | 0.7249 | 0.7111 | 0.8432 |
| 0.0613 | 8.7724 | 3430 | 0.7009 | 0.7249 | 0.7009 | 0.8372 |
| 0.0613 | 8.7775 | 3432 | 0.7025 | 0.7249 | 0.7025 | 0.8381 |
| 0.0613 | 8.7826 | 3434 | 0.7172 | 0.7249 | 0.7172 | 0.8469 |
| 0.0613 | 8.7877 | 3436 | 0.7320 | 0.6951 | 0.7320 | 0.8556 |
| 0.0613 | 8.7928 | 3438 | 0.7372 | 0.7244 | 0.7372 | 0.8586 |
| 0.0613 | 8.7980 | 3440 | 0.7364 | 0.7244 | 0.7364 | 0.8581 |
| 0.0613 | 8.8031 | 3442 | 0.7343 | 0.7244 | 0.7343 | 0.8569 |
| 0.0613 | 8.8082 | 3444 | 0.7248 | 0.7249 | 0.7248 | 0.8514 |
| 0.0613 | 8.8133 | 3446 | 0.7095 | 0.7249 | 0.7095 | 0.8423 |
| 0.0613 | 8.8184 | 3448 | 0.7007 | 0.7249 | 0.7007 | 0.8371 |
| 0.0613 | 8.8235 | 3450 | 0.6981 | 0.7249 | 0.6981 | 0.8355 |
| 0.0613 | 8.8286 | 3452 | 0.6876 | 0.7525 | 0.6876 | 0.8292 |
| 0.0613 | 8.8338 | 3454 | 0.6863 | 0.7525 | 0.6863 | 0.8284 |
| 0.0613 | 8.8389 | 3456 | 0.6922 | 0.7249 | 0.6922 | 0.8320 |
| 0.0613 | 8.8440 | 3458 | 0.6950 | 0.7249 | 0.6950 | 0.8337 |
| 0.0613 | 8.8491 | 3460 | 0.6946 | 0.7525 | 0.6946 | 0.8334 |
| 0.0613 | 8.8542 | 3462 | 0.7004 | 0.7525 | 0.7004 | 0.8369 |
| 0.0613 | 8.8593 | 3464 | 0.7143 | 0.7249 | 0.7143 | 0.8452 |
| 0.0613 | 8.8645 | 3466 | 0.7309 | 0.7249 | 0.7309 | 0.8549 |
| 0.0613 | 8.8696 | 3468 | 0.7541 | 0.7162 | 0.7541 | 0.8684 |
| 0.0613 | 8.8747 | 3470 | 0.7598 | 0.7162 | 0.7598 | 0.8717 |
| 0.0613 | 8.8798 | 3472 | 0.7539 | 0.7162 | 0.7539 | 0.8683 |
| 0.0613 | 8.8849 | 3474 | 0.7450 | 0.7433 | 0.7450 | 0.8631 |
| 0.0613 | 8.8900 | 3476 | 0.7375 | 0.7433 | 0.7375 | 0.8588 |
| 0.0613 | 8.8951 | 3478 | 0.7449 | 0.7433 | 0.7449 | 0.8631 |
| 0.0613 | 8.9003 | 3480 | 0.7588 | 0.7433 | 0.7588 | 0.8711 |
| 0.0613 | 8.9054 | 3482 | 0.7644 | 0.7433 | 0.7644 | 0.8743 |
| 0.0613 | 8.9105 | 3484 | 0.7652 | 0.7433 | 0.7652 | 0.8748 |
| 0.0613 | 8.9156 | 3486 | 0.7719 | 0.7162 | 0.7719 | 0.8786 |
| 0.0613 | 8.9207 | 3488 | 0.7669 | 0.7162 | 0.7669 | 0.8758 |
| 0.0613 | 8.9258 | 3490 | 0.7569 | 0.7433 | 0.7569 | 0.8700 |
| 0.0613 | 8.9309 | 3492 | 0.7563 | 0.7433 | 0.7563 | 0.8697 |
| 0.0613 | 8.9361 | 3494 | 0.7564 | 0.7433 | 0.7564 | 0.8697 |
| 0.0613 | 8.9412 | 3496 | 0.7632 | 0.7433 | 0.7632 | 0.8736 |
| 0.0613 | 8.9463 | 3498 | 0.7738 | 0.7433 | 0.7738 | 0.8796 |
| 0.0548 | 8.9514 | 3500 | 0.7775 | 0.7433 | 0.7775 | 0.8818 |
| 0.0548 | 8.9565 | 3502 | 0.7792 | 0.7433 | 0.7792 | 0.8827 |
| 0.0548 | 8.9616 | 3504 | 0.7880 | 0.7162 | 0.7880 | 0.8877 |
| 0.0548 | 8.9668 | 3506 | 0.8036 | 0.7162 | 0.8036 | 0.8965 |
| 0.0548 | 8.9719 | 3508 | 0.8080 | 0.7162 | 0.8080 | 0.8989 |
| 0.0548 | 8.9770 | 3510 | 0.8002 | 0.7162 | 0.8002 | 0.8946 |
| 0.0548 | 8.9821 | 3512 | 0.7952 | 0.7162 | 0.7952 | 0.8918 |
| 0.0548 | 8.9872 | 3514 | 0.7780 | 0.7162 | 0.7780 | 0.8821 |
| 0.0548 | 8.9923 | 3516 | 0.7644 | 0.7162 | 0.7644 | 0.8743 |
| 0.0548 | 8.9974 | 3518 | 0.7466 | 0.6951 | 0.7466 | 0.8641 |
| 0.0548 | 9.0026 | 3520 | 0.7386 | 0.6951 | 0.7386 | 0.8594 |
| 0.0548 | 9.0077 | 3522 | 0.7305 | 0.6951 | 0.7305 | 0.8547 |
| 0.0548 | 9.0128 | 3524 | 0.7306 | 0.6951 | 0.7306 | 0.8547 |
| 0.0548 | 9.0179 | 3526 | 0.7367 | 0.6951 | 0.7367 | 0.8583 |
| 0.0548 | 9.0230 | 3528 | 0.7425 | 0.6951 | 0.7425 | 0.8617 |
| 0.0548 | 9.0281 | 3530 | 0.7411 | 0.6951 | 0.7411 | 0.8609 |
| 0.0548 | 9.0332 | 3532 | 0.7282 | 0.6951 | 0.7282 | 0.8533 |
| 0.0548 | 9.0384 | 3534 | 0.7213 | 0.7249 | 0.7213 | 0.8493 |
| 0.0548 | 9.0435 | 3536 | 0.7218 | 0.7249 | 0.7218 | 0.8496 |
| 0.0548 | 9.0486 | 3538 | 0.7281 | 0.6951 | 0.7281 | 0.8533 |
| 0.0548 | 9.0537 | 3540 | 0.7399 | 0.6951 | 0.7399 | 0.8602 |
| 0.0548 | 9.0588 | 3542 | 0.7458 | 0.6951 | 0.7458 | 0.8636 |
| 0.0548 | 9.0639 | 3544 | 0.7432 | 0.6951 | 0.7432 | 0.8621 |
| 0.0548 | 9.0691 | 3546 | 0.7440 | 0.6951 | 0.7440 | 0.8625 |
| 0.0548 | 9.0742 | 3548 | 0.7409 | 0.6951 | 0.7409 | 0.8607 |
| 0.0548 | 9.0793 | 3550 | 0.7403 | 0.7244 | 0.7403 | 0.8604 |
| 0.0548 | 9.0844 | 3552 | 0.7396 | 0.6951 | 0.7396 | 0.8600 |
| 0.0548 | 9.0895 | 3554 | 0.7420 | 0.7244 | 0.7420 | 0.8614 |
| 0.0548 | 9.0946 | 3556 | 0.7455 | 0.7244 | 0.7455 | 0.8634 |
| 0.0548 | 9.0997 | 3558 | 0.7455 | 0.7244 | 0.7455 | 0.8634 |
| 0.0548 | 9.1049 | 3560 | 0.7520 | 0.7244 | 0.7520 | 0.8672 |
| 0.0548 | 9.1100 | 3562 | 0.7528 | 0.7244 | 0.7528 | 0.8676 |
| 0.0548 | 9.1151 | 3564 | 0.7601 | 0.7244 | 0.7601 | 0.8718 |
| 0.0548 | 9.1202 | 3566 | 0.7649 | 0.7244 | 0.7649 | 0.8746 |
| 0.0548 | 9.1253 | 3568 | 0.7727 | 0.7244 | 0.7727 | 0.8790 |
| 0.0548 | 9.1304 | 3570 | 0.7746 | 0.7244 | 0.7746 | 0.8801 |
| 0.0548 | 9.1355 | 3572 | 0.7792 | 0.7244 | 0.7792 | 0.8827 |
| 0.0548 | 9.1407 | 3574 | 0.7759 | 0.7244 | 0.7759 | 0.8809 |
| 0.0548 | 9.1458 | 3576 | 0.7787 | 0.7244 | 0.7787 | 0.8825 |
| 0.0548 | 9.1509 | 3578 | 0.7785 | 0.7244 | 0.7785 | 0.8823 |
| 0.0548 | 9.1560 | 3580 | 0.7774 | 0.7244 | 0.7774 | 0.8817 |
| 0.0548 | 9.1611 | 3582 | 0.7815 | 0.7244 | 0.7815 | 0.8840 |
| 0.0548 | 9.1662 | 3584 | 0.7805 | 0.6951 | 0.7805 | 0.8835 |
| 0.0548 | 9.1714 | 3586 | 0.7839 | 0.6951 | 0.7839 | 0.8854 |
| 0.0548 | 9.1765 | 3588 | 0.7929 | 0.6951 | 0.7929 | 0.8904 |
| 0.0548 | 9.1816 | 3590 | 0.8078 | 0.7426 | 0.8078 | 0.8988 |
| 0.0548 | 9.1867 | 3592 | 0.8200 | 0.7426 | 0.8200 | 0.9056 |
| 0.0548 | 9.1918 | 3594 | 0.8337 | 0.7337 | 0.8337 | 0.9130 |
| 0.0548 | 9.1969 | 3596 | 0.8436 | 0.7337 | 0.8436 | 0.9185 |
| 0.0548 | 9.2020 | 3598 | 0.8578 | 0.7337 | 0.8578 | 0.9262 |
| 0.0548 | 9.2072 | 3600 | 0.8663 | 0.7337 | 0.8663 | 0.9307 |
| 0.0548 | 9.2123 | 3602 | 0.8778 | 0.7337 | 0.8778 | 0.9369 |
| 0.0548 | 9.2174 | 3604 | 0.8905 | 0.7337 | 0.8905 | 0.9437 |
| 0.0548 | 9.2225 | 3606 | 0.8942 | 0.7337 | 0.8942 | 0.9456 |
| 0.0548 | 9.2276 | 3608 | 0.8968 | 0.7337 | 0.8968 | 0.9470 |
| 0.0548 | 9.2327 | 3610 | 0.8941 | 0.7337 | 0.8941 | 0.9456 |
| 0.0548 | 9.2379 | 3612 | 0.8825 | 0.7337 | 0.8825 | 0.9394 |
| 0.0548 | 9.2430 | 3614 | 0.8668 | 0.7337 | 0.8668 | 0.9310 |
| 0.0548 | 9.2481 | 3616 | 0.8492 | 0.7337 | 0.8492 | 0.9215 |
| 0.0548 | 9.2532 | 3618 | 0.8314 | 0.7337 | 0.8314 | 0.9118 |
| 0.0548 | 9.2583 | 3620 | 0.8175 | 0.7337 | 0.8175 | 0.9042 |
| 0.0548 | 9.2634 | 3622 | 0.8022 | 0.6951 | 0.8022 | 0.8957 |
| 0.0548 | 9.2685 | 3624 | 0.7884 | 0.6951 | 0.7884 | 0.8879 |
| 0.0548 | 9.2737 | 3626 | 0.7751 | 0.6951 | 0.7751 | 0.8804 |
| 0.0548 | 9.2788 | 3628 | 0.7689 | 0.6951 | 0.7689 | 0.8768 |
| 0.0548 | 9.2839 | 3630 | 0.7697 | 0.6951 | 0.7697 | 0.8773 |
| 0.0548 | 9.2890 | 3632 | 0.7751 | 0.6951 | 0.7751 | 0.8804 |
| 0.0548 | 9.2941 | 3634 | 0.7849 | 0.6951 | 0.7849 | 0.8860 |
| 0.0548 | 9.2992 | 3636 | 0.8019 | 0.7139 | 0.8019 | 0.8955 |
| 0.0548 | 9.3043 | 3638 | 0.8171 | 0.7337 | 0.8171 | 0.9039 |
| 0.0548 | 9.3095 | 3640 | 0.8299 | 0.7337 | 0.8299 | 0.9110 |
| 0.0548 | 9.3146 | 3642 | 0.8356 | 0.7337 | 0.8356 | 0.9141 |
| 0.0548 | 9.3197 | 3644 | 0.8323 | 0.7337 | 0.8323 | 0.9123 |
| 0.0548 | 9.3248 | 3646 | 0.8200 | 0.7337 | 0.8200 | 0.9055 |
| 0.0548 | 9.3299 | 3648 | 0.8060 | 0.7139 | 0.8060 | 0.8978 |
| 0.0548 | 9.3350 | 3650 | 0.7960 | 0.7139 | 0.7960 | 0.8922 |
| 0.0548 | 9.3402 | 3652 | 0.7921 | 0.6951 | 0.7921 | 0.8900 |
| 0.0548 | 9.3453 | 3654 | 0.7851 | 0.6951 | 0.7851 | 0.8861 |
| 0.0548 | 9.3504 | 3656 | 0.7769 | 0.6951 | 0.7769 | 0.8814 |
| 0.0548 | 9.3555 | 3658 | 0.7684 | 0.6951 | 0.7684 | 0.8766 |
| 0.0548 | 9.3606 | 3660 | 0.7688 | 0.6951 | 0.7688 | 0.8768 |
| 0.0548 | 9.3657 | 3662 | 0.7742 | 0.6951 | 0.7742 | 0.8799 |
| 0.0548 | 9.3708 | 3664 | 0.7772 | 0.6951 | 0.7772 | 0.8816 |
| 0.0548 | 9.3760 | 3666 | 0.7861 | 0.6951 | 0.7861 | 0.8866 |
| 0.0548 | 9.3811 | 3668 | 0.7952 | 0.7139 | 0.7952 | 0.8917 |
| 0.0548 | 9.3862 | 3670 | 0.8014 | 0.7139 | 0.8014 | 0.8952 |
| 0.0548 | 9.3913 | 3672 | 0.8073 | 0.7058 | 0.8073 | 0.8985 |
| 0.0548 | 9.3964 | 3674 | 0.8079 | 0.7337 | 0.8079 | 0.8988 |
| 0.0548 | 9.4015 | 3676 | 0.8014 | 0.7058 | 0.8014 | 0.8952 |
| 0.0548 | 9.4066 | 3678 | 0.7886 | 0.6951 | 0.7886 | 0.8880 |
| 0.0548 | 9.4118 | 3680 | 0.7797 | 0.6951 | 0.7797 | 0.8830 |
| 0.0548 | 9.4169 | 3682 | 0.7709 | 0.6951 | 0.7709 | 0.8780 |
| 0.0548 | 9.4220 | 3684 | 0.7600 | 0.6951 | 0.7600 | 0.8718 |
| 0.0548 | 9.4271 | 3686 | 0.7523 | 0.6951 | 0.7523 | 0.8673 |
| 0.0548 | 9.4322 | 3688 | 0.7493 | 0.6951 | 0.7493 | 0.8656 |
| 0.0548 | 9.4373 | 3690 | 0.7515 | 0.6951 | 0.7515 | 0.8669 |
| 0.0548 | 9.4425 | 3692 | 0.7596 | 0.6951 | 0.7596 | 0.8715 |
| 0.0548 | 9.4476 | 3694 | 0.7649 | 0.6951 | 0.7649 | 0.8746 |
| 0.0548 | 9.4527 | 3696 | 0.7715 | 0.6951 | 0.7715 | 0.8783 |
| 0.0548 | 9.4578 | 3698 | 0.7714 | 0.6951 | 0.7714 | 0.8783 |
| 0.0548 | 9.4629 | 3700 | 0.7733 | 0.6951 | 0.7733 | 0.8793 |
| 0.0548 | 9.4680 | 3702 | 0.7801 | 0.6951 | 0.7801 | 0.8832 |
| 0.0548 | 9.4731 | 3704 | 0.7894 | 0.6877 | 0.7894 | 0.8885 |
| 0.0548 | 9.4783 | 3706 | 0.7959 | 0.7162 | 0.7959 | 0.8921 |
| 0.0548 | 9.4834 | 3708 | 0.7984 | 0.7337 | 0.7984 | 0.8935 |
| 0.0548 | 9.4885 | 3710 | 0.8011 | 0.7337 | 0.8011 | 0.8950 |
| 0.0548 | 9.4936 | 3712 | 0.8014 | 0.7058 | 0.8014 | 0.8952 |
| 0.0548 | 9.4987 | 3714 | 0.7981 | 0.6877 | 0.7981 | 0.8933 |
| 0.0548 | 9.5038 | 3716 | 0.7927 | 0.6877 | 0.7927 | 0.8903 |
| 0.0548 | 9.5090 | 3718 | 0.7947 | 0.6877 | 0.7947 | 0.8915 |
| 0.0548 | 9.5141 | 3720 | 0.7959 | 0.6877 | 0.7959 | 0.8922 |
| 0.0548 | 9.5192 | 3722 | 0.7966 | 0.6877 | 0.7966 | 0.8925 |
| 0.0548 | 9.5243 | 3724 | 0.7943 | 0.6877 | 0.7943 | 0.8912 |
| 0.0548 | 9.5294 | 3726 | 0.7920 | 0.6877 | 0.7920 | 0.8899 |
| 0.0548 | 9.5345 | 3728 | 0.7883 | 0.6951 | 0.7883 | 0.8879 |
| 0.0548 | 9.5396 | 3730 | 0.7837 | 0.6951 | 0.7837 | 0.8853 |
| 0.0548 | 9.5448 | 3732 | 0.7791 | 0.6951 | 0.7791 | 0.8827 |
| 0.0548 | 9.5499 | 3734 | 0.7787 | 0.6951 | 0.7787 | 0.8825 |
| 0.0548 | 9.5550 | 3736 | 0.7773 | 0.6951 | 0.7773 | 0.8816 |
| 0.0548 | 9.5601 | 3738 | 0.7812 | 0.6951 | 0.7812 | 0.8838 |
| 0.0548 | 9.5652 | 3740 | 0.7856 | 0.7139 | 0.7856 | 0.8864 |
| 0.0548 | 9.5703 | 3742 | 0.7869 | 0.7139 | 0.7869 | 0.8871 |
| 0.0548 | 9.5754 | 3744 | 0.7895 | 0.7139 | 0.7895 | 0.8886 |
| 0.0548 | 9.5806 | 3746 | 0.7902 | 0.7139 | 0.7902 | 0.8889 |
| 0.0548 | 9.5857 | 3748 | 0.7945 | 0.7139 | 0.7945 | 0.8913 |
| 0.0548 | 9.5908 | 3750 | 0.7954 | 0.7139 | 0.7954 | 0.8918 |
| 0.0548 | 9.5959 | 3752 | 0.7952 | 0.7139 | 0.7952 | 0.8917 |
| 0.0548 | 9.6010 | 3754 | 0.7928 | 0.7139 | 0.7928 | 0.8904 |
| 0.0548 | 9.6061 | 3756 | 0.7928 | 0.7139 | 0.7928 | 0.8904 |
| 0.0548 | 9.6113 | 3758 | 0.7962 | 0.7058 | 0.7962 | 0.8923 |
| 0.0548 | 9.6164 | 3760 | 0.7968 | 0.7058 | 0.7968 | 0.8926 |
| 0.0548 | 9.6215 | 3762 | 0.7949 | 0.7058 | 0.7949 | 0.8916 |
| 0.0548 | 9.6266 | 3764 | 0.7925 | 0.7058 | 0.7925 | 0.8902 |
| 0.0548 | 9.6317 | 3766 | 0.7893 | 0.7058 | 0.7893 | 0.8884 |
| 0.0548 | 9.6368 | 3768 | 0.7888 | 0.7337 | 0.7888 | 0.8882 |
| 0.0548 | 9.6419 | 3770 | 0.7905 | 0.7337 | 0.7905 | 0.8891 |
| 0.0548 | 9.6471 | 3772 | 0.7943 | 0.7337 | 0.7943 | 0.8912 |
| 0.0548 | 9.6522 | 3774 | 0.7944 | 0.7337 | 0.7944 | 0.8913 |
| 0.0548 | 9.6573 | 3776 | 0.7940 | 0.7337 | 0.7940 | 0.8910 |
| 0.0548 | 9.6624 | 3778 | 0.7901 | 0.7337 | 0.7901 | 0.8889 |
| 0.0548 | 9.6675 | 3780 | 0.7880 | 0.7337 | 0.7880 | 0.8877 |
| 0.0548 | 9.6726 | 3782 | 0.7848 | 0.6877 | 0.7848 | 0.8859 |
| 0.0548 | 9.6777 | 3784 | 0.7855 | 0.7337 | 0.7855 | 0.8863 |
| 0.0548 | 9.6829 | 3786 | 0.7870 | 0.7337 | 0.7870 | 0.8871 |
| 0.0548 | 9.6880 | 3788 | 0.7845 | 0.6877 | 0.7845 | 0.8857 |
| 0.0548 | 9.6931 | 3790 | 0.7815 | 0.6951 | 0.7815 | 0.8840 |
| 0.0548 | 9.6982 | 3792 | 0.7804 | 0.6951 | 0.7804 | 0.8834 |
| 0.0548 | 9.7033 | 3794 | 0.7783 | 0.6951 | 0.7783 | 0.8822 |
| 0.0548 | 9.7084 | 3796 | 0.7799 | 0.6951 | 0.7799 | 0.8831 |
| 0.0548 | 9.7136 | 3798 | 0.7824 | 0.6951 | 0.7824 | 0.8845 |
| 0.0548 | 9.7187 | 3800 | 0.7851 | 0.7162 | 0.7851 | 0.8861 |
| 0.0548 | 9.7238 | 3802 | 0.7860 | 0.7162 | 0.7860 | 0.8865 |
| 0.0548 | 9.7289 | 3804 | 0.7851 | 0.7162 | 0.7851 | 0.8861 |
| 0.0548 | 9.7340 | 3806 | 0.7837 | 0.6877 | 0.7837 | 0.8853 |
| 0.0548 | 9.7391 | 3808 | 0.7823 | 0.6877 | 0.7823 | 0.8845 |
| 0.0548 | 9.7442 | 3810 | 0.7844 | 0.7162 | 0.7844 | 0.8857 |
| 0.0548 | 9.7494 | 3812 | 0.7880 | 0.7162 | 0.7880 | 0.8877 |
| 0.0548 | 9.7545 | 3814 | 0.7921 | 0.7337 | 0.7921 | 0.8900 |
| 0.0548 | 9.7596 | 3816 | 0.7953 | 0.7337 | 0.7953 | 0.8918 |
| 0.0548 | 9.7647 | 3818 | 0.7962 | 0.7337 | 0.7962 | 0.8923 |
| 0.0548 | 9.7698 | 3820 | 0.7968 | 0.7337 | 0.7968 | 0.8927 |
| 0.0548 | 9.7749 | 3822 | 0.7943 | 0.7337 | 0.7943 | 0.8912 |
| 0.0548 | 9.7801 | 3824 | 0.7894 | 0.6877 | 0.7894 | 0.8885 |
| 0.0548 | 9.7852 | 3826 | 0.7829 | 0.6951 | 0.7829 | 0.8848 |
| 0.0548 | 9.7903 | 3828 | 0.7752 | 0.6951 | 0.7752 | 0.8804 |
| 0.0548 | 9.7954 | 3830 | 0.7691 | 0.6951 | 0.7691 | 0.8770 |
| 0.0548 | 9.8005 | 3832 | 0.7651 | 0.6951 | 0.7651 | 0.8747 |
| 0.0548 | 9.8056 | 3834 | 0.7608 | 0.6951 | 0.7608 | 0.8722 |
| 0.0548 | 9.8107 | 3836 | 0.7564 | 0.6951 | 0.7564 | 0.8697 |
| 0.0548 | 9.8159 | 3838 | 0.7549 | 0.6951 | 0.7549 | 0.8689 |
| 0.0548 | 9.8210 | 3840 | 0.7542 | 0.6951 | 0.7542 | 0.8684 |
| 0.0548 | 9.8261 | 3842 | 0.7530 | 0.6951 | 0.7530 | 0.8677 |
| 0.0548 | 9.8312 | 3844 | 0.7525 | 0.6951 | 0.7525 | 0.8675 |
| 0.0548 | 9.8363 | 3846 | 0.7532 | 0.6951 | 0.7532 | 0.8678 |
| 0.0548 | 9.8414 | 3848 | 0.7545 | 0.6951 | 0.7545 | 0.8686 |
| 0.0548 | 9.8465 | 3850 | 0.7554 | 0.6951 | 0.7554 | 0.8692 |
| 0.0548 | 9.8517 | 3852 | 0.7568 | 0.6951 | 0.7568 | 0.8700 |
| 0.0548 | 9.8568 | 3854 | 0.7577 | 0.6951 | 0.7577 | 0.8705 |
| 0.0548 | 9.8619 | 3856 | 0.7591 | 0.6951 | 0.7591 | 0.8712 |
| 0.0548 | 9.8670 | 3858 | 0.7596 | 0.6951 | 0.7596 | 0.8716 |
| 0.0548 | 9.8721 | 3860 | 0.7604 | 0.6951 | 0.7604 | 0.8720 |
| 0.0548 | 9.8772 | 3862 | 0.7614 | 0.6951 | 0.7614 | 0.8726 |
| 0.0548 | 9.8824 | 3864 | 0.7619 | 0.6951 | 0.7619 | 0.8729 |
| 0.0548 | 9.8875 | 3866 | 0.7631 | 0.6951 | 0.7631 | 0.8735 |
| 0.0548 | 9.8926 | 3868 | 0.7641 | 0.6951 | 0.7641 | 0.8741 |
| 0.0548 | 9.8977 | 3870 | 0.7637 | 0.6951 | 0.7637 | 0.8739 |
| 0.0548 | 9.9028 | 3872 | 0.7637 | 0.6951 | 0.7637 | 0.8739 |
| 0.0548 | 9.9079 | 3874 | 0.7646 | 0.6951 | 0.7646 | 0.8744 |
| 0.0548 | 9.9130 | 3876 | 0.7660 | 0.6951 | 0.7660 | 0.8752 |
| 0.0548 | 9.9182 | 3878 | 0.7676 | 0.6951 | 0.7676 | 0.8761 |
| 0.0548 | 9.9233 | 3880 | 0.7684 | 0.6951 | 0.7684 | 0.8766 |
| 0.0548 | 9.9284 | 3882 | 0.7692 | 0.6951 | 0.7692 | 0.8770 |
| 0.0548 | 9.9335 | 3884 | 0.7693 | 0.6951 | 0.7693 | 0.8771 |
| 0.0548 | 9.9386 | 3886 | 0.7696 | 0.6951 | 0.7696 | 0.8773 |
| 0.0548 | 9.9437 | 3888 | 0.7700 | 0.6951 | 0.7700 | 0.8775 |
| 0.0548 | 9.9488 | 3890 | 0.7707 | 0.6951 | 0.7707 | 0.8779 |
| 0.0548 | 9.9540 | 3892 | 0.7718 | 0.6951 | 0.7718 | 0.8785 |
| 0.0548 | 9.9591 | 3894 | 0.7723 | 0.6951 | 0.7723 | 0.8788 |
| 0.0548 | 9.9642 | 3896 | 0.7727 | 0.6951 | 0.7727 | 0.8790 |
| 0.0548 | 9.9693 | 3898 | 0.7729 | 0.6951 | 0.7729 | 0.8791 |
| 0.0548 | 9.9744 | 3900 | 0.7731 | 0.6951 | 0.7731 | 0.8793 |
| 0.0548 | 9.9795 | 3902 | 0.7731 | 0.6951 | 0.7731 | 0.8793 |
| 0.0548 | 9.9847 | 3904 | 0.7731 | 0.6951 | 0.7731 | 0.8792 |
| 0.0548 | 9.9898 | 3906 | 0.7730 | 0.6951 | 0.7730 | 0.8792 |
| 0.0548 | 9.9949 | 3908 | 0.7729 | 0.6951 | 0.7729 | 0.8792 |
| 0.0548 | 10.0 | 3910 | 0.7729 | 0.6951 | 0.7729 | 0.8792 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
|
GrupoUtec/ModeloTraductor | GrupoUtec | 2024-11-14T04:18:13Z | 159 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"marian",
"text2text-generation",
"translation",
"es",
"en",
"base_model:Helsinki-NLP/opus-mt-en-es",
"base_model:finetune:Helsinki-NLP/opus-mt-en-es",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | 2024-11-09T16:29:54Z | ---
license: apache-2.0
language:
- es
- en
base_model:
- Helsinki-NLP/opus-mt-es-en
- Helsinki-NLP/opus-mt-en-es
pipeline_tag: translation
library_name: transformers
--- |
5CD-AI/ColVintern-1B-v1 | 5CD-AI | 2024-11-14T04:13:26Z | 136 | 6 | transformers | [
"transformers",
"safetensors",
"internvl_chat",
"feature-extraction",
"colpali",
"custom_code",
"vi",
"en",
"dataset:vidore/colpali_train_set",
"dataset:5CD-AI/Viet-Doc-VQA",
"dataset:5CD-AI/Viet-OCR-VQA",
"dataset:5CD-AI/Viet-Doc-VQA-II",
"base_model:5CD-AI/Vintern-1B-v2",
"base_model:finetune:5CD-AI/Vintern-1B-v2",
"region:us"
] | feature-extraction | 2024-11-13T09:39:36Z | ---
library_name: transformers
language:
- vi
- en
base_model:
- 5CD-AI/Vintern-1B-v2
datasets:
- vidore/colpali_train_set
- 5CD-AI/Viet-Doc-VQA
- 5CD-AI/Viet-OCR-VQA
- 5CD-AI/Viet-Doc-VQA-II
tags:
- colpali
---
<div align="center">
<img src="colvintern.png" width="400"/>
</div>
## ColVintern-1B-v1 🇻🇳 ❄️ - Colpali version for Vietnamese.
**What's new in ColVintern-1B-v1!**
- We implemented and successfully trained the **Colpali pipeline** for **Vintern-1B-v2**. The model supports RAG by producing embedding vectors for queries and for images containing the related information.
- This is the first experimental version, trained on the English [**Colpali dataset**](https://huggingface.co/datasets/vidore/colpali_train_set) plus **5%** of the image-based question-answer pairs we have for Vietnamese.
- The model achieves results nearly equivalent to Colpali version 1, with strong support for Vietnamese text, using only 1 billion parameters versus the 2B-3B of current Colpali models.
## Colpali Benchmarks
We tested on the [**ViDoRe benchmark**](https://huggingface.co/collections/vidore/vidore-benchmark-667173f98e70a1c0fa4db00d) from the Colpali paper. The **TabF** and **Shift** test datasets were not used because they are in French. We plan to expand to multiple languages in the near future.
| | ArxivQ | DocQ | InfoQ | TATQ | AI | Energy | Gov. | Health. | Avg. |
|:------------------------------:|:--------:|:------:|:-------:|:------:|:------:|:--------:|:-------:|:---------:|:--------:|
| **Unstructured** Text only | | | | | | | | | |
| - BM25 | - | 34.1 | - | 44.0 | 90.4 | 78.3 | 78.8 | 82.6 | - |
| - BGE-M3 | - | 28.4 | - | 36.1 | 88.4 | 76.8 | 77.7 | 84.6 | - |
| **Unstructured** + OCR | | | | | | | | | |
| - BM25 | 31.6 | 36.8 | 62.9 | 62.7 | 92.8 | 85.9 | 83.9 | 87.2 | 68.0 |
| - BGE-M3 | 31.4 | 25.7 | 60.1 | 50.5 | 90.2 | 83.6 | 84.9 | 91.1 | 64.7 |
| **Unstructured** + Captioning | | | | | | | | | |
| - BM25 | 40.1 | 38.4 | 70.0 | 61.5 | 88.0 | 84.7 | 82.7 | 89.2 | 69.3 |
| - BGE-M3 | 35.7 | 32.9 | 71.9 | 43.8 | 88.8 | 83.3 | 80.4 | 91.3 | 66.0 |
| **Contrastive VLMs** | | | | | | | | | |
| - Jina-CLIP | 25.4 | 11.9 | 35.5 | 3.3 | 15.2 | 19.7 | 21.4 | 20.8 | 19.2 |
| - Nomic-vision | 17.1 | 10.7 | 30.1 | 2.7 | 12.9 | 10.9 | 11.4 | 15.7 | 13.9 |
| - SigLIP (Vanilla) | 43.2 | 30.3 | 64.1 | 26.2 | 62.5 | 65.7 | 66.1 | 79.1 | 54.7 |
| **Colpali** | | | | | | | | | |
| - SigLIP (Vanilla) | 43.2 | 30.3 | 64.1 | 26.2 | 62.5 | 65.7 | 66.1 | 79.1 | 54.7 |
| - BiSigLIP (+fine-tuning) | 58.5 | 32.9 | 70.5 | 30.5 | 74.3 | 73.7 | 74.2 | 82.3 | 62.1 |
| - BiPali (+LLM) | 56.5 | 30.0 | 67.4 | 33.4 | 71.2 | 61.9 | 73.8 | 73.6 | 58.5 |
| - ColPali (+Late Inter.) | **79.1** | **54.4** | 81.8 | **65.8** | **96.2** | **91.0** | **92.7** | 94.4 | **81.3** |
| **Ours** | | | | | | | | | |
| - ColVintern-1B (+Late Inter.) | 71.6 | 48.3 | **84.6** | 59.6 | 92.9 | 88.7 | 89.4 | **95.2** | 78.8 |
We are expanding the training dataset for upcoming versions, including adding hard negative mining techniques, increasing GPU VRAM, etc., to achieve better results.
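The "+Late Inter." rows above refer to ColBERT-style late interaction (MaxSim) scoring: each query-token embedding is matched against its best image-patch embedding, and the per-token maxima are summed. This is what `score_multi_vector` in the Quickstart below is expected to compute. A minimal sketch of that scoring rule (shapes are illustrative, and `maxsim_score` is a hypothetical helper, not part of this repository):
```python
import torch

def maxsim_score(query_emb: torch.Tensor, image_emb: torch.Tensor) -> torch.Tensor:
    # query_emb: (n_query_tokens, dim); image_emb: (n_image_patches, dim)
    sim = query_emb @ image_emb.T        # token-vs-patch similarity matrix
    return sim.max(dim=1).values.sum()   # best patch per query token, summed

q = torch.randn(16, 128)    # e.g. 16 query-token embeddings of dim 128
d = torch.randn(256, 128)   # e.g. 256 image-patch embeddings
print(maxsim_score(q, d))   # scalar late-interaction score
```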
## Examples
Input Images:
<div style="display: flex; gap: 20px;">
<img src="ex1.jpg" width="300"/>
<img src="ex2.jpg" width="300"/>
</div>
Input Queries:
```
queries = ["Cảng Hải Phòng thông báo gì ?","Phí giao hàng bao nhiêu ?"]
```
Output Scores:
| Query | Image 1 Score | Image 2 Score |
|--------------------------------------|---------------|---------------|
| Chuyện gì xảy ra với quốc lộ 5 TP Hải Phòng ? | 62.4333 | 59.9523 |
| Phí giao hàng bao nhiêu ? | 60.7748 | 62.8654 |
## Quickstart:
Colab: https://colab.research.google.com/drive/1-y8HLiyS0oCj7Vpy4i7FsJ1A6kU7ROca?usp=sharing
```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer, AutoProcessor
import matplotlib.pyplot as plt
model_name = "5CD-AI/ColVintern-1B-v1"
processor = AutoProcessor.from_pretrained(
model_name,
trust_remote_code=True
)
model = AutoModel.from_pretrained(
model_name,
torch_dtype=torch.bfloat16,
low_cpu_mem_usage=True,
trust_remote_code=True,
).eval().cuda()
#!wget https://huggingface.co/5CD-AI/ColVintern-1B-v1/resolve/main/ex1.jpg
#!wget https://huggingface.co/5CD-AI/ColVintern-1B-v1/resolve/main/ex2.jpg
images = [Image.open("ex1.jpg"),Image.open("ex2.jpg")]
batch_images = processor.process_images(images)
queries = [
"Cảng Hải Phòng thông báo gì ?",
"Phí giao hàng bao nhiêu ?",
]
batch_queries = processor.process_queries(queries)
batch_images["pixel_values"] = batch_images["pixel_values"].cuda().bfloat16()
batch_images["input_ids"] = batch_images["input_ids"].cuda()
batch_images["attention_mask"] = batch_images["attention_mask"].cuda().bfloat16()
batch_queries["input_ids"] = batch_queries["input_ids"].cuda()
batch_queries["attention_mask"] = batch_queries["attention_mask"].cuda().bfloat16()
with torch.no_grad():
    image_embeddings = model(**batch_images)
    query_embeddings = model(**batch_queries)
scores = processor.score_multi_vector(query_embeddings, image_embeddings)
max_scores, max_indices = torch.max(scores, dim=1)
# Print the score and the best-matching image for each query
for i, query in enumerate(queries):
    print(f"Query: '{query}'")
    print(f"Score: {max_scores[i].item()}\n")
    plt.figure(figsize=(5, 5))
    plt.imshow(images[max_indices[i]])
    plt.show()
```
## Citation
```
``` |
mrhendrey/Florence-2-large-ft-safetensors | mrhendrey | 2024-11-14T04:11:55Z | 5 | 0 | null | [
"pytorch",
"safetensors",
"florence2",
"vision",
"image-text-to-text",
"custom_code",
"arxiv:2311.06242",
"license:mit",
"region:us"
] | image-text-to-text | 2024-10-08T02:05:37Z | ---
license: mit
license_link: https://huggingface.co/microsoft/Florence-2-large-ft/resolve/main/LICENSE
pipeline_tag: image-text-to-text
tags:
- vision
---
# Florence-2: Advancing a Unified Representation for a Variety of Vision Tasks
## Changes to Original
This is a modified version of [Florence-2-large-ft](https://huggingface.co/microsoft/Florence-2-large-ft).
The original failed the conversion to safetensors using the Huggingface Space
[Safetensors/convert](https://huggingface.co/spaces/safetensors/convert) due to
inconsistencies with some of the data pointers. The `convert_to_safetensors.py`
script does the minimal steps to enable conversion to safetensors. It does check that
the resulting tensors are equal and validates against the single image listed below
to ensure both `.bin` and `.safetensors` provide the same output.
Only the `modeling_florence2.py` file has been modified:
* Added `Florence2LanguageForConditionalGeneration._tie_weights()` which was missing
* Added `GenerationMixin` as a parent class to `Florence2LanguageForConditionalGeneration` and `Florence2ForConditionalGeneration` to stop the deprecation warning that `PreTrainedModel` will NOT
inherit from `GenerationMixin` from v4.50 onwards.
* Added @torch.no_grad() decorator to the generate() function to follow the standard
transformers practice of disabling gradient tracking during inference. Without this, VRAM
usage went up significantly each time the batch size increased by 1 (a minimal sketch follows this list).
* Fixed batch attention masking following [pawlowskipawel's PR](https://huggingface.co/microsoft/Florence-2-base/commit/579b82bf58da309400d35772326fbcb4043b811f)
* Changed importing of timm following deprecation warnings.
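The @torch.no_grad() change above follows the usual PyTorch inference pattern; a minimal sketch of the idea (the class below is a hypothetical stand-in, not the actual `modeling_florence2.py` code):
```python
import torch
from torch import nn

class SketchModel(nn.Module):
    # Hypothetical stand-in for Florence2ForConditionalGeneration.
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)

    @torch.no_grad()  # autograd graph is never built, so activations are freed immediately
    def generate(self, hidden: torch.Tensor) -> torch.Tensor:
        # Without the decorator, every forward pass during decoding would retain
        # intermediate activations for backprop, and VRAM would grow with batch size.
        return self.proj(hidden)

out = SketchModel().generate(torch.randn(4, 8))
assert not out.requires_grad  # gradients were never tracked inside generate()
```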
## Model Summary
This Hub repository contains a HuggingFace's `transformers` implementation of Florence-2 model from Microsoft.
Florence-2 is an advanced vision foundation model that uses a prompt-based approach to handle a wide range of vision and vision-language tasks. Florence-2 can interpret simple text prompts to perform tasks like captioning, object detection, and segmentation. It leverages our FLD-5B dataset, containing 5.4 billion annotations across 126 million images, to master multi-task learning. The model's sequence-to-sequence architecture enables it to excel in both zero-shot and fine-tuned settings, proving to be a competitive vision foundation model.
Resources and Technical Documentation:
+ [Florence-2 technical report](https://arxiv.org/abs/2311.06242).
+ [Jupyter Notebook for inference and visualization of Florence-2-large model](https://huggingface.co/microsoft/Florence-2-large/blob/main/sample_inference.ipynb)
| Model | Model size | Model Description |
| ------- | ------------- | ------------- |
| Florence-2-base[[HF]](https://huggingface.co/microsoft/Florence-2-base) | 0.23B | Pretrained model with FLD-5B
| Florence-2-large[[HF]](https://huggingface.co/microsoft/Florence-2-large) | 0.77B | Pretrained model with FLD-5B
| Florence-2-base-ft[[HF]](https://huggingface.co/microsoft/Florence-2-base-ft) | 0.23B | Finetuned model on a collection of downstream tasks
| Florence-2-large-ft[[HF]](https://huggingface.co/microsoft/Florence-2-large-ft) | 0.77B | Finetuned model on a collection of downstream tasks
## How to Get Started with the Model
Use the code below to get started with the model. All models are trained with float16.
```python
import requests
from PIL import Image
import torch
from transformers import AutoProcessor, AutoModelForCausalLM
device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32
model = AutoModelForCausalLM.from_pretrained(
"mrhendrey/Florence-2-large-ft-safetensors",
torch_dtype=torch_dtype,
trust_remote_code=True,
use_safetensors=True
).to(device)
processor = AutoProcessor.from_pretrained(
"mrhendrey/Florence-2-large-ft-safetensors", trust_remote_code=True
)
prompt = "<OD>"
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg?download=true"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(
text=prompt, images=image, return_tensors="pt"
).to(device, torch_dtype)
generated_ids = model.generate(
input_ids=inputs["input_ids"],
pixel_values=inputs["pixel_values"],
max_new_tokens=1024,
do_sample=False,
num_beams=3
)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
parsed_answer = processor.post_process_generation(
generated_text, task="<OD>", image_size=(image.width, image.height)
)
print(parsed_answer)
```
To take advantage of batching the code changes slightly
```python
import requests
from PIL import Image
import torch
from transformers import AutoProcessor, AutoModelForCausalLM
device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32
model = AutoModelForCausalLM.from_pretrained(
"mrhendrey/Florence-2-large-ft-safetensors",
torch_dtype=torch_dtype,
trust_remote_code=True,
use_safetensors=True
).to(device)
processor = AutoProcessor.from_pretrained(
"mrhendrey/Florence-2-large-ft-safetensors", trust_remote_code=True
)
prompts = ["<OD>", "<CAPTION>"]
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg?download=true"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(
text=prompts, images=[image]*2, return_tensors="pt", padding=True
).to(device, torch_dtype)
generated_ids = model.generate(
input_ids=inputs["input_ids"],
pixel_values=inputs["pixel_values"],
attention_mask=inputs["attention_mask"],
max_new_tokens=1024,
do_sample=False,
num_beams=3
)
generated_texts = processor.batch_decode(generated_ids, skip_special_tokens=False)
parsed_answers = [
processor.post_process_generation(
gen_text, task=prompt, image_size=(image.width, image.height)
) for gen_text, prompt in zip(generated_texts, prompts)
]
print(parsed_answers[0])
print(parsed_answers[1])
```
## Tasks
This model is capable of performing different tasks through changing the prompts.
First, let's define a function to run a prompt.
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
import torch
from transformers import AutoProcessor, AutoModelForCausalLM
device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32
model = AutoModelForCausalLM.from_pretrained(
"mrhendrey/Florence-2-large-ft-safetensors",
torch_dtype=torch_dtype,
trust_remote_code=True,
use_safetensors=True
).to(device)
processor = AutoProcessor.from_pretrained(
"mrhendrey/Florence-2-large-ft-safetensors", trust_remote_code=True
)
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg?download=true"
image = Image.open(requests.get(url, stream=True).raw)
def run_example(task_prompt, text_input=None):
if text_input is None:
prompt = task_prompt
else:
prompt = task_prompt + text_input
inputs = processor(
text=prompt, images=image, return_tensors="pt"
).to(device, torch_dtype)
generated_ids = model.generate(
input_ids=inputs["input_ids"],
pixel_values=inputs["pixel_values"],
max_new_tokens=1024,
num_beams=3
)
generated_text = processor.batch_decode(
generated_ids, skip_special_tokens=False
)[0]
parsed_answer = processor.post_process_generation(
generated_text, task=task_prompt, image_size=(image.width, image.height)
)
    print(parsed_answer)
    return parsed_answer
```
</details>
Here are the tasks `Florence-2` can perform:
<details>
<summary> Click to expand </summary>
### Caption
```python
prompt = "<CAPTION>"
run_example(prompt)
```
### Detailed Caption
```python
prompt = "<DETAILED_CAPTION>"
run_example(prompt)
```
### More Detailed Caption
```python
prompt = "<MORE_DETAILED_CAPTION>"
run_example(prompt)
```
### Caption to Phrase Grounding
The caption-to-phrase-grounding task requires additional text input, i.e., the caption whose phrases should be grounded.
Caption to phrase grounding results format:
{'\<CAPTION_TO_PHRASE_GROUNDING>': {'bboxes': [[x1, y1, x2, y2], ...], 'labels': ['', '', ...]}}
```python
task_prompt = "<CAPTION_TO_PHRASE_GROUNDING>"
results = run_example(
task_prompt, text_input="A green car parked in front of a yellow building."
)
```
### Object Detection
OD results format:
{'\<OD>': {'bboxes': [[x1, y1, x2, y2], ...],
'labels': ['label1', 'label2', ...]} }
```python
prompt = "<OD>"
run_example(prompt)
```
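As a worked example of consuming this output, here is a minimal sketch (not part of the original card) that draws the parsed boxes onto the image with PIL. It assumes the helper and `image` defined above, and relies on `run_example` returning the parsed dict.
```python
from PIL import ImageDraw

# `run_example` and `image` come from the helper snippet above.
parsed_answer = run_example("<OD>")
detections = parsed_answer["<OD>"]
draw = ImageDraw.Draw(image)
for (x1, y1, x2, y2), label in zip(detections["bboxes"], detections["labels"]):
    draw.rectangle([x1, y1, x2, y2], outline="red", width=3)
    draw.text((x1, y1), label, fill="red")
image.save("detections.png")
```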
### Dense Region Caption
Dense region caption results format:
{'\<DENSE_REGION_CAPTION>' : {'bboxes': [[x1, y1, x2, y2], ...],
'labels': ['label1', 'label2', ...]} }
```python
prompt = "<DENSE_REGION_CAPTION>"
run_example(prompt)
```
### Region Proposal
Region proposal results format:
{'\<REGION_PROPOSAL>': {'bboxes': [[x1, y1, x2, y2], ...],
'labels': ['', '', ...]}}
```python
prompt = "<REGION_PROPOSAL>"
run_example(prompt)
```
### OCR
```python
prompt = "<OCR>"
run_example(prompt)
```
### OCR with Region
OCR with region output format:
{'\<OCR_WITH_REGION>': {'quad_boxes': [[x1, y1, x2, y2, x3, y3, x4, y4], ...], 'labels': ['text1', ...]}}
```python
prompt = "<OCR_WITH_REGION>"
run_example(prompt)
```
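Because the quad boxes are four-point polygons rather than axis-aligned rectangles, a similar sketch (again an illustration, assuming the helper and `image` above) would use `ImageDraw.polygon`:
```python
from PIL import ImageDraw

parsed_answer = run_example("<OCR_WITH_REGION>")
regions = parsed_answer["<OCR_WITH_REGION>"]
draw = ImageDraw.Draw(image)
for quad, text in zip(regions["quad_boxes"], regions["labels"]):
    # quad is [x1, y1, x2, y2, x3, y3, x4, y4]; pair up into (x, y) points
    points = list(zip(quad[0::2], quad[1::2]))
    draw.polygon(points, outline="blue")
    draw.text(points[0], text, fill="blue")
image.save("ocr_regions.png")
```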
For more detailed examples, please refer to [this notebook](https://huggingface.co/microsoft/Florence-2-large/blob/main/sample_inference.ipynb).
</details>
## Benchmarks
## Florence-2 Zero-shot performance
The following table presents the zero-shot performance of generalist vision foundation models on image captioning and object detection evaluation tasks. These models have not been exposed to the training data of the evaluation tasks during their training phase.
| Method | #params | COCO Cap. test CIDEr | NoCaps val CIDEr | TextCaps val CIDEr | COCO Det. val2017 mAP |
|--------|---------|----------------------|------------------|--------------------|-----------------------|
| Flamingo | 80B | 84.3 | - | - | - |
| Florence-2-base| 0.23B | 133.0 | 118.7 | 70.1 | 34.7 |
| Florence-2-large| 0.77B | 135.6 | 120.8 | 72.8 | 37.5 |
The following table continues the comparison with performance on other vision-language evaluation tasks.
| Method | Flickr30k test R@1 | Refcoco val Accuracy | Refcoco test-A Accuracy | Refcoco test-B Accuracy | Refcoco+ val Accuracy | Refcoco+ test-A Accuracy | Refcoco+ test-B Accuracy | Refcocog val Accuracy | Refcocog test Accuracy | Refcoco RES val mIoU |
|--------|----------------------|----------------------|-------------------------|-------------------------|-----------------------|--------------------------|--------------------------|-----------------------|------------------------|----------------------|
| Kosmos-2 | 78.7 | 52.3 | 57.4 | 47.3 | 45.5 | 50.7 | 42.2 | 60.6 | 61.7 | - |
| Florence-2-base | 83.6 | 53.9 | 58.4 | 49.7 | 51.5 | 56.4 | 47.9 | 66.3 | 65.1 | 34.6 |
| Florence-2-large | 84.4 | 56.3 | 61.6 | 51.4 | 53.6 | 57.9 | 49.9 | 68.0 | 67.0 | 35.8 |
## Florence-2 fine-tuned performance
We fine-tune Florence-2 models on a collection of downstream tasks, resulting in two generalist models, *Florence-2-base-ft* and *Florence-2-large-ft*, that can conduct a wide range of downstream tasks.
The table below compares the performance of specialist and generalist models on various captioning and Visual Question Answering (VQA) tasks. Specialist models are fine-tuned specifically for each task, whereas generalist models are fine-tuned in a task-agnostic manner across all tasks. The symbol "▲" indicates the usage of external OCR as input.
| Method | # Params | COCO Caption Karpathy test CIDEr | NoCaps val CIDEr | TextCaps val CIDEr | VQAv2 test-dev Acc | TextVQA test-dev Acc | VizWiz VQA test-dev Acc |
|----------------|----------|-----------------------------------|------------------|--------------------|--------------------|----------------------|-------------------------|
| **Specialist Models** | | | | | | | |
| CoCa | 2.1B | 143.6 | 122.4 | - | 82.3 | - | - |
| BLIP-2 | 7.8B | 144.5 | 121.6 | - | 82.2 | - | - |
| GIT2 | 5.1B | 145.0 | 126.9 | 148.6 | 81.7 | 67.3 | 71.0 |
| Flamingo | 80B | 138.1 | - | - | 82.0 | 54.1 | 65.7 |
| PaLI | 17B | 149.1 | 127.0 | 160.0▲ | 84.3 | 58.8 / 73.1▲ | 71.6 / 74.4▲ |
| PaLI-X | 55B | 149.2 | 126.3 | 147.0 / 163.7▲ | 86.0 | 71.4 / 80.8▲ | 70.9 / 74.6▲ |
| **Generalist Models** | | | | | | | |
| Unified-IO | 2.9B | - | 100.0 | - | 77.9 | - | 57.4 |
| Florence-2-base-ft | 0.23B | 140.0 | 116.7 | 143.9 | 79.7 | 63.6 | 63.6 |
| Florence-2-large-ft | 0.77B | 143.3 | 124.9 | 151.1 | 81.7 | 73.5 | 72.6 |
| Method | # Params | COCO Det. val2017 mAP | Flickr30k test R@1 | RefCOCO val Accuracy | RefCOCO test-A Accuracy | RefCOCO test-B Accuracy | RefCOCO+ val Accuracy | RefCOCO+ test-A Accuracy | RefCOCO+ test-B Accuracy | RefCOCOg val Accuracy | RefCOCOg test Accuracy | RefCOCO RES val mIoU |
|----------------------|----------|-----------------------|--------------------|----------------------|-------------------------|-------------------------|------------------------|---------------------------|---------------------------|------------------------|-----------------------|------------------------|
| **Specialist Models** | | | | | | | | | | | | |
| SeqTR | - | - | - | 83.7 | 86.5 | 81.2 | 71.5 | 76.3 | 64.9 | 74.9 | 74.2 | - |
| PolyFormer | - | - | - | 90.4 | 92.9 | 87.2 | 85.0 | 89.8 | 78.0 | 85.8 | 85.9 | 76.9 |
| UNINEXT | 0.74B | 60.6 | - | 92.6 | 94.3 | 91.5 | 85.2 | 89.6 | 79.8 | 88.7 | 89.4 | - |
| Ferret | 13B | - | - | 89.5 | 92.4 | 84.4 | 82.8 | 88.1 | 75.2 | 85.8 | 86.3 | - |
| **Generalist Models** | | | | | | | | | | | | |
| UniTAB | - | - | - | 88.6 | 91.1 | 83.8 | 81.0 | 85.4 | 71.6 | 84.6 | 84.7 | - |
| Florence-2-base-ft | 0.23B | 41.4 | 84.0 | 92.6 | 94.8 | 91.5 | 86.8 | 91.7 | 82.2 | 89.8 | 82.2 | 78.0 |
| Florence-2-large-ft| 0.77B | 43.4 | 85.2 | 93.4 | 95.3 | 92.0 | 88.3 | 92.9 | 83.6 | 91.2 | 91.7 | 80.5 |
## BibTex and citation info
```
@article{xiao2023florence,
title={Florence-2: Advancing a unified representation for a variety of vision tasks},
author={Xiao, Bin and Wu, Haiping and Xu, Weijian and Dai, Xiyang and Hu, Houdong and Lu, Yumao and Zeng, Michael and Liu, Ce and Yuan, Lu},
journal={arXiv preprint arXiv:2311.06242},
year={2023}
}
``` |
emarro/axial_caducues_1200 | emarro | 2024-11-14T04:09:53Z | 101 | 0 | transformers | [
"transformers",
"pytorch",
"axial_caduceus",
"fill-mask",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | fill-mask | 2024-11-14T03:53:04Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
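Until the authors complete this section, here is a hedged sketch based only on the tags above (`fill-mask`, `custom_code`): the standard `transformers` auto classes with `trust_remote_code=True` are a plausible, but unconfirmed, loading path.
```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Assumption: standard loading for a custom_code fill-mask checkpoint;
# actual usage may differ once this card is filled in.
model_id = "emarro/axial_caducues_1200"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)
```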
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
MayBashendy/Arabic_FineTuningAraBERT_AugV4-trial2_k15_task1_organization_fold1 | MayBashendy | 2024-11-14T03:57:15Z | 160 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:aubmindlab/bert-base-arabertv02",
"base_model:finetune:aubmindlab/bert-base-arabertv02",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-14T03:41:35Z | ---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: Arabic_FineTuningAraBERT_AugV4-trial2_k15_task1_organization_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic_FineTuningAraBERT_AugV4-trial2_k15_task1_organization_fold1
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4952
- Qwk: 0.8165
- Mse: 0.4952
- Rmse: 0.7037
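"Qwk" here is presumably quadratic weighted kappa, reported alongside mean squared error and its root. A minimal sketch of how these metrics can be computed with scikit-learn, assuming integer ordinal labels (the values below are hypothetical, not from this run):
```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical labels/predictions purely to illustrate the metric definitions.
y_true = np.array([0, 1, 2, 2, 1, 0])
y_pred = np.array([0, 1, 1, 2, 1, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}, Mse: {mse:.4f}, Rmse: {rmse:.4f}")
```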
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
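For reference, a minimal sketch of how these hyperparameters might map onto `transformers.TrainingArguments` (an illustration of the listed settings, not the exact training script; `output_dir` is a placeholder):
```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; the Adam betas/epsilon shown
# there match the TrainingArguments defaults, so they are not set explicitly.
training_args = TrainingArguments(
    output_dir="arabert-organization-fold1",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```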
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0075 | 2 | 4.2399 | 0.0328 | 4.2399 | 2.0591 |
| No log | 0.0150 | 4 | 1.8545 | 0.0987 | 1.8545 | 1.3618 |
| No log | 0.0226 | 6 | 0.9258 | 0.0382 | 0.9258 | 0.9622 |
| No log | 0.0301 | 8 | 0.8922 | 0.2294 | 0.8922 | 0.9445 |
| No log | 0.0376 | 10 | 1.6000 | -0.1146 | 1.6000 | 1.2649 |
| No log | 0.0451 | 12 | 1.6371 | -0.2189 | 1.6371 | 1.2795 |
| No log | 0.0526 | 14 | 1.6378 | -0.1807 | 1.6378 | 1.2798 |
| No log | 0.0602 | 16 | 1.8501 | -0.1667 | 1.8501 | 1.3602 |
| No log | 0.0677 | 18 | 1.6473 | -0.1375 | 1.6473 | 1.2835 |
| No log | 0.0752 | 20 | 1.4568 | -0.1375 | 1.4568 | 1.2070 |
| No log | 0.0827 | 22 | 1.3150 | -0.0120 | 1.3150 | 1.1467 |
| No log | 0.0902 | 24 | 1.3774 | -0.1375 | 1.3774 | 1.1736 |
| No log | 0.0977 | 26 | 1.3689 | -0.0255 | 1.3689 | 1.1700 |
| No log | 0.1053 | 28 | 1.4464 | 0.0 | 1.4464 | 1.2026 |
| No log | 0.1128 | 30 | 1.8956 | 0.0 | 1.8956 | 1.3768 |
| No log | 0.1203 | 32 | 2.1765 | 0.0455 | 2.1765 | 1.4753 |
| No log | 0.1278 | 34 | 1.8847 | 0.0 | 1.8847 | 1.3729 |
| No log | 0.1353 | 36 | 1.4344 | 0.1529 | 1.4344 | 1.1977 |
| No log | 0.1429 | 38 | 0.9899 | 0.2139 | 0.9899 | 0.9949 |
| No log | 0.1504 | 40 | 0.9095 | 0.1683 | 0.9095 | 0.9537 |
| No log | 0.1579 | 42 | 0.9900 | 0.2139 | 0.9900 | 0.9950 |
| No log | 0.1654 | 44 | 1.2791 | 0.125 | 1.2791 | 1.1310 |
| No log | 0.1729 | 46 | 1.5648 | 0.0 | 1.5648 | 1.2509 |
| No log | 0.1805 | 48 | 1.5486 | 0.0 | 1.5486 | 1.2444 |
| No log | 0.1880 | 50 | 1.4543 | -0.0255 | 1.4543 | 1.2060 |
| No log | 0.1955 | 52 | 1.0852 | -0.0355 | 1.0852 | 1.0417 |
| No log | 0.2030 | 54 | 0.9327 | 0.1390 | 0.9327 | 0.9658 |
| No log | 0.2105 | 56 | 0.8799 | 0.1158 | 0.8799 | 0.9380 |
| No log | 0.2180 | 58 | 1.0116 | 0.2524 | 1.0116 | 1.0058 |
| No log | 0.2256 | 60 | 1.5258 | 0.2155 | 1.5258 | 1.2352 |
| No log | 0.2331 | 62 | 1.8357 | 0.2025 | 1.8357 | 1.3549 |
| No log | 0.2406 | 64 | 1.8316 | 0.2821 | 1.8316 | 1.3534 |
| No log | 0.2481 | 66 | 1.5019 | 0.2667 | 1.5019 | 1.2255 |
| No log | 0.2556 | 68 | 1.1837 | 0.3037 | 1.1837 | 1.0880 |
| No log | 0.2632 | 70 | 1.1880 | 0.3037 | 1.1880 | 1.0899 |
| No log | 0.2707 | 72 | 1.3560 | 0.2899 | 1.3560 | 1.1645 |
| No log | 0.2782 | 74 | 1.3272 | 0.2899 | 1.3272 | 1.1520 |
| No log | 0.2857 | 76 | 1.3449 | 0.3333 | 1.3449 | 1.1597 |
| No log | 0.2932 | 78 | 0.8799 | 0.4516 | 0.8799 | 0.9380 |
| No log | 0.3008 | 80 | 0.7648 | 0.5919 | 0.7648 | 0.8745 |
| No log | 0.3083 | 82 | 1.0698 | 0.3762 | 1.0698 | 1.0343 |
| No log | 0.3158 | 84 | 1.2340 | 0.5 | 1.2340 | 1.1109 |
| No log | 0.3233 | 86 | 1.2625 | 0.4936 | 1.2625 | 1.1236 |
| No log | 0.3308 | 88 | 1.0194 | 0.4566 | 1.0194 | 1.0096 |
| No log | 0.3383 | 90 | 0.8848 | 0.4711 | 0.8848 | 0.9407 |
| No log | 0.3459 | 92 | 0.8783 | 0.4711 | 0.8783 | 0.9372 |
| No log | 0.3534 | 94 | 0.8008 | 0.4766 | 0.8008 | 0.8949 |
| No log | 0.3609 | 96 | 0.8898 | 0.4955 | 0.8898 | 0.9433 |
| No log | 0.3684 | 98 | 0.7279 | 0.6547 | 0.7279 | 0.8531 |
| No log | 0.3759 | 100 | 0.8359 | 0.4711 | 0.8359 | 0.9143 |
| No log | 0.3835 | 102 | 1.1713 | 0.5556 | 1.1713 | 1.0822 |
| No log | 0.3910 | 104 | 1.6850 | 0.5172 | 1.6850 | 1.2981 |
| No log | 0.3985 | 106 | 2.1040 | 0.4270 | 2.1040 | 1.4505 |
| No log | 0.4060 | 108 | 1.7654 | 0.4615 | 1.7654 | 1.3287 |
| No log | 0.4135 | 110 | 0.9647 | 0.5254 | 0.9647 | 0.9822 |
| No log | 0.4211 | 112 | 0.6578 | 0.5882 | 0.6578 | 0.8110 |
| No log | 0.4286 | 114 | 0.6887 | 0.4776 | 0.6887 | 0.8299 |
| No log | 0.4361 | 116 | 0.7622 | 0.5051 | 0.7622 | 0.8730 |
| No log | 0.4436 | 118 | 0.8773 | 0.5702 | 0.8773 | 0.9366 |
| No log | 0.4511 | 120 | 0.8417 | 0.6061 | 0.8417 | 0.9175 |
| No log | 0.4586 | 122 | 0.9843 | 0.6513 | 0.9843 | 0.9921 |
| No log | 0.4662 | 124 | 0.8592 | 0.6056 | 0.8592 | 0.9269 |
| No log | 0.4737 | 126 | 0.8248 | 0.6449 | 0.8248 | 0.9082 |
| No log | 0.4812 | 128 | 0.9834 | 0.5966 | 0.9834 | 0.9917 |
| No log | 0.4887 | 130 | 1.0484 | 0.5966 | 1.0484 | 1.0239 |
| No log | 0.4962 | 132 | 0.9531 | 0.6915 | 0.9531 | 0.9763 |
| No log | 0.5038 | 134 | 0.9229 | 0.7042 | 0.9229 | 0.9607 |
| No log | 0.5113 | 136 | 1.2523 | 0.5993 | 1.2523 | 1.1191 |
| No log | 0.5188 | 138 | 1.7964 | 0.5172 | 1.7964 | 1.3403 |
| No log | 0.5263 | 140 | 1.5966 | 0.5172 | 1.5966 | 1.2635 |
| No log | 0.5338 | 142 | 0.9432 | 0.6513 | 0.9432 | 0.9712 |
| No log | 0.5414 | 144 | 0.7768 | 0.5484 | 0.7768 | 0.8814 |
| No log | 0.5489 | 146 | 0.6939 | 0.5570 | 0.6939 | 0.8330 |
| No log | 0.5564 | 148 | 0.9404 | 0.6513 | 0.9404 | 0.9698 |
| No log | 0.5639 | 150 | 1.3656 | 0.4981 | 1.3656 | 1.1686 |
| No log | 0.5714 | 152 | 1.5464 | 0.4981 | 1.5464 | 1.2435 |
| No log | 0.5789 | 154 | 1.1624 | 0.5783 | 1.1624 | 1.0782 |
| No log | 0.5865 | 156 | 0.6999 | 0.4465 | 0.6999 | 0.8366 |
| No log | 0.5940 | 158 | 0.5762 | 0.5249 | 0.5762 | 0.7591 |
| No log | 0.6015 | 160 | 0.5879 | 0.5249 | 0.5879 | 0.7668 |
| No log | 0.6090 | 162 | 0.7304 | 0.5505 | 0.7304 | 0.8547 |
| No log | 0.6165 | 164 | 1.1327 | 0.5556 | 1.1327 | 1.0643 |
| No log | 0.6241 | 166 | 1.2793 | 0.4878 | 1.2793 | 1.1311 |
| No log | 0.6316 | 168 | 1.2834 | 0.5475 | 1.2834 | 1.1329 |
| No log | 0.6391 | 170 | 1.1805 | 0.5576 | 1.1805 | 1.0865 |
| No log | 0.6466 | 172 | 0.8478 | 0.7042 | 0.8478 | 0.9207 |
| No log | 0.6541 | 174 | 0.7017 | 0.6142 | 0.7017 | 0.8377 |
| No log | 0.6617 | 176 | 0.8120 | 0.7042 | 0.8120 | 0.9011 |
| No log | 0.6692 | 178 | 1.2894 | 0.4891 | 1.2894 | 1.1355 |
| No log | 0.6767 | 180 | 1.6837 | 0.4615 | 1.6837 | 1.2976 |
| No log | 0.6842 | 182 | 1.6563 | 0.4615 | 1.6563 | 1.2870 |
| No log | 0.6917 | 184 | 1.1615 | 0.5789 | 1.1615 | 1.0777 |
| No log | 0.6992 | 186 | 0.6954 | 0.6794 | 0.6954 | 0.8339 |
| No log | 0.7068 | 188 | 0.5532 | 0.5249 | 0.5532 | 0.7438 |
| No log | 0.7143 | 190 | 0.6049 | 0.6026 | 0.6049 | 0.7777 |
| No log | 0.7218 | 192 | 0.8009 | 0.7042 | 0.8009 | 0.8949 |
| No log | 0.7293 | 194 | 0.9408 | 0.5398 | 0.9408 | 0.9699 |
| No log | 0.7368 | 196 | 1.0071 | 0.5398 | 1.0071 | 1.0035 |
| No log | 0.7444 | 198 | 1.1546 | 0.5475 | 1.1546 | 1.0745 |
| No log | 0.7519 | 200 | 1.1684 | 0.4806 | 1.1684 | 1.0809 |
| No log | 0.7594 | 202 | 1.5720 | 0.5035 | 1.5720 | 1.2538 |
| No log | 0.7669 | 204 | 1.9180 | 0.4615 | 1.9180 | 1.3849 |
| No log | 0.7744 | 206 | 1.6480 | 0.4143 | 1.6480 | 1.2838 |
| No log | 0.7820 | 208 | 1.0468 | 0.5783 | 1.0468 | 1.0231 |
| No log | 0.7895 | 210 | 0.8984 | 0.5847 | 0.8984 | 0.9478 |
| No log | 0.7970 | 212 | 0.9531 | 0.5739 | 0.9531 | 0.9763 |
| No log | 0.8045 | 214 | 0.9461 | 0.5739 | 0.9461 | 0.9727 |
| No log | 0.8120 | 216 | 1.1320 | 0.5783 | 1.1320 | 1.0640 |
| No log | 0.8195 | 218 | 1.2306 | 0.5783 | 1.2306 | 1.1093 |
| No log | 0.8271 | 220 | 1.0320 | 0.5659 | 1.0320 | 1.0159 |
| No log | 0.8346 | 222 | 0.8564 | 0.6242 | 0.8564 | 0.9254 |
| No log | 0.8421 | 224 | 1.0499 | 0.5105 | 1.0499 | 1.0246 |
| No log | 0.8496 | 226 | 1.4552 | 0.5625 | 1.4552 | 1.2063 |
| No log | 0.8571 | 228 | 1.6690 | 0.5172 | 1.6690 | 1.2919 |
| No log | 0.8647 | 230 | 1.3603 | 0.5475 | 1.3603 | 1.1663 |
| No log | 0.8722 | 232 | 0.8150 | 0.6915 | 0.8150 | 0.9028 |
| No log | 0.8797 | 234 | 0.6031 | 0.6667 | 0.6031 | 0.7766 |
| No log | 0.8872 | 236 | 0.6409 | 0.4335 | 0.6409 | 0.8006 |
| No log | 0.8947 | 238 | 0.6225 | 0.4417 | 0.6225 | 0.7890 |
| No log | 0.9023 | 240 | 0.6761 | 0.6957 | 0.6761 | 0.8223 |
| No log | 0.9098 | 242 | 0.9867 | 0.6062 | 0.9867 | 0.9933 |
| No log | 0.9173 | 244 | 1.0408 | 0.6540 | 1.0408 | 1.0202 |
| No log | 0.9248 | 246 | 0.9563 | 0.6617 | 0.9563 | 0.9779 |
| No log | 0.9323 | 248 | 0.7574 | 0.6621 | 0.7574 | 0.8703 |
| No log | 0.9398 | 250 | 0.6271 | 0.6209 | 0.6271 | 0.7919 |
| No log | 0.9474 | 252 | 0.5876 | 0.5789 | 0.5876 | 0.7666 |
| No log | 0.9549 | 254 | 0.7137 | 0.7290 | 0.7137 | 0.8448 |
| No log | 0.9624 | 256 | 0.9964 | 0.6008 | 0.9964 | 0.9982 |
| No log | 0.9699 | 258 | 1.0992 | 0.6231 | 1.0992 | 1.0484 |
| No log | 0.9774 | 260 | 1.0008 | 0.5795 | 1.0008 | 1.0004 |
| No log | 0.9850 | 262 | 0.8389 | 0.6977 | 0.8389 | 0.9159 |
| No log | 0.9925 | 264 | 0.6803 | 0.6977 | 0.6803 | 0.8248 |
| No log | 1.0 | 266 | 0.7617 | 0.6977 | 0.7617 | 0.8727 |
| No log | 1.0075 | 268 | 0.7264 | 0.6977 | 0.7264 | 0.8523 |
| No log | 1.0150 | 270 | 0.7296 | 0.7535 | 0.7296 | 0.8542 |
| No log | 1.0226 | 272 | 0.8626 | 0.6818 | 0.8626 | 0.9287 |
| No log | 1.0301 | 274 | 0.8931 | 0.6513 | 0.8931 | 0.9450 |
| No log | 1.0376 | 276 | 0.7491 | 0.7640 | 0.7491 | 0.8655 |
| No log | 1.0451 | 278 | 0.6752 | 0.7640 | 0.6752 | 0.8217 |
| No log | 1.0526 | 280 | 0.6411 | 0.7042 | 0.6411 | 0.8007 |
| No log | 1.0602 | 282 | 0.8725 | 0.7042 | 0.8725 | 0.9341 |
| No log | 1.0677 | 284 | 1.3221 | 0.5092 | 1.3221 | 1.1498 |
| No log | 1.0752 | 286 | 1.3120 | 0.5084 | 1.3120 | 1.1454 |
| No log | 1.0827 | 288 | 1.0733 | 0.6500 | 1.0733 | 1.0360 |
| No log | 1.0902 | 290 | 0.9442 | 0.6602 | 0.9442 | 0.9717 |
| No log | 1.0977 | 292 | 0.9576 | 0.6711 | 0.9576 | 0.9786 |
| No log | 1.1053 | 294 | 1.2477 | 0.5370 | 1.2477 | 1.1170 |
| No log | 1.1128 | 296 | 1.5009 | 0.5299 | 1.5009 | 1.2251 |
| No log | 1.1203 | 298 | 1.4542 | 0.5370 | 1.4542 | 1.2059 |
| No log | 1.1278 | 300 | 1.3651 | 0.5447 | 1.3651 | 1.1684 |
| No log | 1.1353 | 302 | 1.0565 | 0.5221 | 1.0565 | 1.0279 |
| No log | 1.1429 | 304 | 0.9620 | 0.5556 | 0.9620 | 0.9808 |
| No log | 1.1504 | 306 | 1.1900 | 0.5447 | 1.1900 | 1.0909 |
| No log | 1.1579 | 308 | 1.3635 | 0.5299 | 1.3635 | 1.1677 |
| No log | 1.1654 | 310 | 1.1098 | 0.5221 | 1.1098 | 1.0535 |
| No log | 1.1729 | 312 | 1.0512 | 0.5221 | 1.0512 | 1.0253 |
| No log | 1.1805 | 314 | 0.8517 | 0.7042 | 0.8517 | 0.9229 |
| No log | 1.1880 | 316 | 0.7314 | 0.6977 | 0.7314 | 0.8552 |
| No log | 1.1955 | 318 | 0.8661 | 0.6977 | 0.8661 | 0.9307 |
| No log | 1.2030 | 320 | 1.0110 | 0.6431 | 1.0110 | 1.0055 |
| No log | 1.2105 | 322 | 1.2679 | 0.5914 | 1.2679 | 1.1260 |
| No log | 1.2180 | 324 | 1.0944 | 0.5221 | 1.0944 | 1.0462 |
| No log | 1.2256 | 326 | 0.7212 | 0.7240 | 0.7212 | 0.8492 |
| No log | 1.2331 | 328 | 0.5901 | 0.5532 | 0.5901 | 0.7682 |
| No log | 1.2406 | 330 | 0.6827 | 0.7863 | 0.6827 | 0.8263 |
| No log | 1.2481 | 332 | 0.8379 | 0.6431 | 0.8379 | 0.9154 |
| No log | 1.2556 | 334 | 1.2223 | 0.6642 | 1.2223 | 1.1056 |
| No log | 1.2632 | 336 | 1.3426 | 0.6642 | 1.3426 | 1.1587 |
| No log | 1.2707 | 338 | 1.0545 | 0.6423 | 1.0545 | 1.0269 |
| No log | 1.2782 | 340 | 0.9713 | 0.75 | 0.9713 | 0.9855 |
| No log | 1.2857 | 342 | 0.7685 | 0.7586 | 0.7685 | 0.8766 |
| No log | 1.2932 | 344 | 0.7335 | 0.7348 | 0.7335 | 0.8565 |
| No log | 1.3008 | 346 | 0.6437 | 0.7568 | 0.6437 | 0.8023 |
| No log | 1.3083 | 348 | 0.7270 | 0.7348 | 0.7270 | 0.8526 |
| No log | 1.3158 | 350 | 1.0893 | 0.6231 | 1.0893 | 1.0437 |
| No log | 1.3233 | 352 | 1.4557 | 0.5299 | 1.4557 | 1.2065 |
| No log | 1.3308 | 354 | 1.5671 | 0.5735 | 1.5671 | 1.2518 |
| No log | 1.3383 | 356 | 1.3166 | 0.5370 | 1.3166 | 1.1474 |
| No log | 1.3459 | 358 | 0.8299 | 0.7050 | 0.8299 | 0.9110 |
| No log | 1.3534 | 360 | 0.5794 | 0.6566 | 0.5794 | 0.7612 |
| No log | 1.3609 | 362 | 0.5973 | 0.7177 | 0.5973 | 0.7729 |
| No log | 1.3684 | 364 | 0.7747 | 0.7348 | 0.7747 | 0.8802 |
| No log | 1.3759 | 366 | 1.0175 | 0.5556 | 1.0175 | 1.0087 |
| No log | 1.3835 | 368 | 1.1095 | 0.5556 | 1.1095 | 1.0533 |
| No log | 1.3910 | 370 | 1.1754 | 0.6008 | 1.1754 | 1.0842 |
| No log | 1.3985 | 372 | 0.9510 | 0.6340 | 0.9510 | 0.9752 |
| No log | 1.4060 | 374 | 0.6629 | 0.6977 | 0.6629 | 0.8142 |
| No log | 1.4135 | 376 | 0.6709 | 0.7048 | 0.6709 | 0.8191 |
| No log | 1.4211 | 378 | 0.9474 | 0.6340 | 0.9474 | 0.9733 |
| No log | 1.4286 | 380 | 1.3824 | 0.5735 | 1.3824 | 1.1757 |
| No log | 1.4361 | 382 | 1.4885 | 0.5735 | 1.4885 | 1.2200 |
| No log | 1.4436 | 384 | 1.1843 | 0.5914 | 1.1843 | 1.0883 |
| No log | 1.4511 | 386 | 0.8298 | 0.6883 | 0.8298 | 0.9109 |
| No log | 1.4586 | 388 | 0.7458 | 0.5243 | 0.7458 | 0.8636 |
| No log | 1.4662 | 390 | 0.6869 | 0.4057 | 0.6869 | 0.8288 |
| No log | 1.4737 | 392 | 0.7084 | 0.4909 | 0.7084 | 0.8417 |
| No log | 1.4812 | 394 | 0.8908 | 0.6744 | 0.8908 | 0.9438 |
| No log | 1.4887 | 396 | 1.2903 | 0.5370 | 1.2903 | 1.1359 |
| No log | 1.4962 | 398 | 1.6072 | 0.5735 | 1.6072 | 1.2677 |
| No log | 1.5038 | 400 | 1.6790 | 0.5735 | 1.6790 | 1.2958 |
| No log | 1.5113 | 402 | 1.3415 | 0.5370 | 1.3415 | 1.1582 |
| No log | 1.5188 | 404 | 0.9326 | 0.6980 | 0.9326 | 0.9657 |
| No log | 1.5263 | 406 | 0.8510 | 0.7449 | 0.8510 | 0.9225 |
| No log | 1.5338 | 408 | 0.9561 | 0.6345 | 0.9561 | 0.9778 |
| No log | 1.5414 | 410 | 1.1500 | 0.5370 | 1.1500 | 1.0724 |
| No log | 1.5489 | 412 | 1.4740 | 0.4981 | 1.4740 | 1.2141 |
| No log | 1.5564 | 414 | 1.4298 | 0.5370 | 1.4298 | 1.1958 |
| No log | 1.5639 | 416 | 1.2209 | 0.5370 | 1.2209 | 1.1049 |
| No log | 1.5714 | 418 | 0.8240 | 0.7287 | 0.8240 | 0.9077 |
| No log | 1.5789 | 420 | 0.6870 | 0.72 | 0.6870 | 0.8288 |
| No log | 1.5865 | 422 | 0.6506 | 0.7097 | 0.6506 | 0.8066 |
| No log | 1.5940 | 424 | 0.8160 | 0.7287 | 0.8160 | 0.9033 |
| No log | 1.6015 | 426 | 1.0794 | 0.6008 | 1.0794 | 1.0389 |
| No log | 1.6090 | 428 | 1.1540 | 0.6842 | 1.1540 | 1.0743 |
| No log | 1.6165 | 430 | 1.1086 | 0.75 | 1.1086 | 1.0529 |
| No log | 1.6241 | 432 | 0.8410 | 0.6859 | 0.8410 | 0.9171 |
| No log | 1.6316 | 434 | 0.7480 | 0.7048 | 0.7480 | 0.8648 |
| No log | 1.6391 | 436 | 0.5659 | 0.7048 | 0.5659 | 0.7522 |
| No log | 1.6466 | 438 | 0.5282 | 0.7165 | 0.5282 | 0.7268 |
| No log | 1.6541 | 440 | 0.6527 | 0.7048 | 0.6527 | 0.8079 |
| No log | 1.6617 | 442 | 0.8571 | 0.7658 | 0.8571 | 0.9258 |
| No log | 1.6692 | 444 | 0.8552 | 0.7658 | 0.8552 | 0.9248 |
| No log | 1.6767 | 446 | 0.8880 | 0.7658 | 0.8880 | 0.9424 |
| No log | 1.6842 | 448 | 0.8109 | 0.7658 | 0.8109 | 0.9005 |
| No log | 1.6917 | 450 | 0.7766 | 0.7586 | 0.7766 | 0.8812 |
| No log | 1.6992 | 452 | 0.7539 | 0.7586 | 0.7539 | 0.8683 |
| No log | 1.7068 | 454 | 0.7789 | 0.7287 | 0.7789 | 0.8826 |
| No log | 1.7143 | 456 | 0.6992 | 0.776 | 0.6992 | 0.8362 |
| No log | 1.7218 | 458 | 0.7782 | 0.7386 | 0.7782 | 0.8821 |
| No log | 1.7293 | 460 | 0.8082 | 0.7386 | 0.8082 | 0.8990 |
| No log | 1.7368 | 462 | 0.7587 | 0.7131 | 0.7587 | 0.8710 |
| No log | 1.7444 | 464 | 0.8034 | 0.7131 | 0.8034 | 0.8963 |
| No log | 1.7519 | 466 | 0.6660 | 0.7812 | 0.6660 | 0.8161 |
| No log | 1.7594 | 468 | 0.5739 | 0.7697 | 0.5739 | 0.7576 |
| No log | 1.7669 | 470 | 0.4735 | 0.7342 | 0.4735 | 0.6881 |
| No log | 1.7744 | 472 | 0.5033 | 0.7850 | 0.5033 | 0.7094 |
| No log | 1.7820 | 474 | 0.6879 | 0.7640 | 0.6879 | 0.8294 |
| No log | 1.7895 | 476 | 0.8933 | 0.7138 | 0.8933 | 0.9452 |
| No log | 1.7970 | 478 | 0.8693 | 0.7138 | 0.8693 | 0.9324 |
| No log | 1.8045 | 480 | 0.6576 | 0.7812 | 0.6576 | 0.8109 |
| No log | 1.8120 | 482 | 0.5978 | 0.7429 | 0.5978 | 0.7732 |
| No log | 1.8195 | 484 | 0.7143 | 0.7449 | 0.7143 | 0.8452 |
| No log | 1.8271 | 486 | 0.8345 | 0.7287 | 0.8345 | 0.9135 |
| No log | 1.8346 | 488 | 0.9677 | 0.6842 | 0.9677 | 0.9837 |
| No log | 1.8421 | 490 | 0.8340 | 0.7287 | 0.8340 | 0.9132 |
| No log | 1.8496 | 492 | 0.6930 | 0.6762 | 0.6930 | 0.8325 |
| No log | 1.8571 | 494 | 0.6125 | 0.7042 | 0.6125 | 0.7826 |
| No log | 1.8647 | 496 | 0.5927 | 0.7179 | 0.5927 | 0.7699 |
| No log | 1.8722 | 498 | 0.6938 | 0.6691 | 0.6938 | 0.8329 |
| 0.4664 | 1.8797 | 500 | 0.9053 | 0.7222 | 0.9053 | 0.9515 |
| 0.4664 | 1.8872 | 502 | 1.1155 | 0.7222 | 1.1155 | 1.0562 |
| 0.4664 | 1.8947 | 504 | 1.1251 | 0.6345 | 1.1251 | 1.0607 |
| 0.4664 | 1.9023 | 506 | 1.0452 | 0.5882 | 1.0452 | 1.0223 |
| 0.4664 | 1.9098 | 508 | 1.0732 | 0.5882 | 1.0732 | 1.0360 |
| 0.4664 | 1.9173 | 510 | 1.0338 | 0.5882 | 1.0338 | 1.0168 |
| 0.4664 | 1.9248 | 512 | 1.0722 | 0.5882 | 1.0722 | 1.0355 |
| 0.4664 | 1.9323 | 514 | 1.2614 | 0.6231 | 1.2614 | 1.1231 |
| 0.4664 | 1.9398 | 516 | 1.1221 | 0.6345 | 1.1221 | 1.0593 |
| 0.4664 | 1.9474 | 518 | 0.8121 | 0.6557 | 0.8121 | 0.9011 |
| 0.4664 | 1.9549 | 520 | 0.5713 | 0.7372 | 0.5713 | 0.7558 |
| 0.4664 | 1.9624 | 522 | 0.5100 | 0.6839 | 0.5100 | 0.7141 |
| 0.4664 | 1.9699 | 524 | 0.5941 | 0.6729 | 0.5941 | 0.7708 |
| 0.4664 | 1.9774 | 526 | 0.7871 | 0.7055 | 0.7871 | 0.8872 |
| 0.4664 | 1.9850 | 528 | 1.0253 | 0.6540 | 1.0253 | 1.0126 |
| 0.4664 | 1.9925 | 530 | 0.9840 | 0.6540 | 0.9840 | 0.9920 |
| 0.4664 | 2.0 | 532 | 0.7013 | 0.6883 | 0.7013 | 0.8374 |
| 0.4664 | 2.0075 | 534 | 0.6377 | 0.7364 | 0.6377 | 0.7986 |
| 0.4664 | 2.0150 | 536 | 0.7191 | 0.6883 | 0.7191 | 0.8480 |
| 0.4664 | 2.0226 | 538 | 0.9474 | 0.7368 | 0.9474 | 0.9733 |
| 0.4664 | 2.0301 | 540 | 1.1701 | 0.6769 | 1.1701 | 1.0817 |
| 0.4664 | 2.0376 | 542 | 1.0971 | 0.6769 | 1.0971 | 1.0474 |
| 0.4664 | 2.0451 | 544 | 0.8307 | 0.7368 | 0.8307 | 0.9114 |
| 0.4664 | 2.0526 | 546 | 0.5540 | 0.7179 | 0.5540 | 0.7443 |
| 0.4664 | 2.0602 | 548 | 0.5418 | 0.7179 | 0.5418 | 0.7360 |
| 0.4664 | 2.0677 | 550 | 0.7234 | 0.7820 | 0.7234 | 0.8505 |
| 0.4664 | 2.0752 | 552 | 0.7823 | 0.7571 | 0.7823 | 0.8845 |
| 0.4664 | 2.0827 | 554 | 0.6166 | 0.7048 | 0.6166 | 0.7852 |
| 0.4664 | 2.0902 | 556 | 0.5624 | 0.6729 | 0.5624 | 0.7499 |
| 0.4664 | 2.0977 | 558 | 0.6108 | 0.7179 | 0.6108 | 0.7816 |
| 0.4664 | 2.1053 | 560 | 0.6214 | 0.7364 | 0.6214 | 0.7883 |
| 0.4664 | 2.1128 | 562 | 0.6599 | 0.7034 | 0.6599 | 0.8124 |
| 0.4664 | 2.1203 | 564 | 0.5911 | 0.7034 | 0.5911 | 0.7688 |
| 0.4664 | 2.1278 | 566 | 0.5494 | 0.7429 | 0.5494 | 0.7412 |
| 0.4664 | 2.1353 | 568 | 0.6074 | 0.7364 | 0.6074 | 0.7794 |
| 0.4664 | 2.1429 | 570 | 0.8645 | 0.7287 | 0.8645 | 0.9298 |
| 0.4664 | 2.1504 | 572 | 1.1405 | 0.7445 | 1.1405 | 1.0679 |
| 0.4664 | 2.1579 | 574 | 1.1846 | 0.7445 | 1.1846 | 1.0884 |
| 0.4664 | 2.1654 | 576 | 0.8894 | 0.7658 | 0.8894 | 0.9431 |
| 0.4664 | 2.1729 | 578 | 0.5747 | 0.7510 | 0.5747 | 0.7581 |
| 0.4664 | 2.1805 | 580 | 0.4066 | 0.6903 | 0.4066 | 0.6376 |
| 0.4664 | 2.1880 | 582 | 0.4054 | 0.6903 | 0.4054 | 0.6367 |
| 0.4664 | 2.1955 | 584 | 0.5252 | 0.7686 | 0.5252 | 0.7247 |
| 0.4664 | 2.2030 | 586 | 0.8532 | 0.7658 | 0.8532 | 0.9237 |
| 0.4664 | 2.2105 | 588 | 1.1016 | 0.7445 | 1.1016 | 1.0496 |
| 0.4664 | 2.2180 | 590 | 1.0612 | 0.7605 | 1.0612 | 1.0301 |
| 0.4664 | 2.2256 | 592 | 0.8122 | 0.7222 | 0.8122 | 0.9012 |
| 0.4664 | 2.2331 | 594 | 0.5685 | 0.7034 | 0.5685 | 0.7540 |
| 0.4664 | 2.2406 | 596 | 0.5065 | 0.7273 | 0.5065 | 0.7117 |
| 0.4664 | 2.2481 | 598 | 0.4826 | 0.7009 | 0.4826 | 0.6947 |
| 0.4664 | 2.2556 | 600 | 0.5033 | 0.7429 | 0.5033 | 0.7095 |
| 0.4664 | 2.2632 | 602 | 0.5913 | 0.7449 | 0.5913 | 0.7689 |
| 0.4664 | 2.2707 | 604 | 0.7189 | 0.7287 | 0.7189 | 0.8479 |
| 0.4664 | 2.2782 | 606 | 0.9245 | 0.7605 | 0.9245 | 0.9615 |
| 0.4664 | 2.2857 | 608 | 0.8994 | 0.7605 | 0.8994 | 0.9484 |
| 0.4664 | 2.2932 | 610 | 0.6488 | 0.7287 | 0.6488 | 0.8055 |
| 0.4664 | 2.3008 | 612 | 0.5036 | 0.7107 | 0.5036 | 0.7096 |
| 0.4664 | 2.3083 | 614 | 0.4766 | 0.7429 | 0.4766 | 0.6903 |
| 0.4664 | 2.3158 | 616 | 0.5817 | 0.776 | 0.5817 | 0.7627 |
| 0.4664 | 2.3233 | 618 | 0.7435 | 0.7529 | 0.7435 | 0.8623 |
| 0.4664 | 2.3308 | 620 | 0.9101 | 0.7605 | 0.9101 | 0.9540 |
| 0.4664 | 2.3383 | 622 | 0.8495 | 0.7895 | 0.8495 | 0.9217 |
| 0.4664 | 2.3459 | 624 | 0.7806 | 0.7895 | 0.7806 | 0.8835 |
| 0.4664 | 2.3534 | 626 | 0.6159 | 0.7586 | 0.6159 | 0.7848 |
| 0.4664 | 2.3609 | 628 | 0.4939 | 0.7328 | 0.4939 | 0.7028 |
| 0.4664 | 2.3684 | 630 | 0.5134 | 0.7328 | 0.5134 | 0.7165 |
| 0.4664 | 2.3759 | 632 | 0.5158 | 0.7692 | 0.5158 | 0.7182 |
| 0.4664 | 2.3835 | 634 | 0.6138 | 0.7482 | 0.6138 | 0.7835 |
| 0.4664 | 2.3910 | 636 | 0.6901 | 0.7586 | 0.6901 | 0.8307 |
| 0.4664 | 2.3985 | 638 | 0.7478 | 0.7941 | 0.7478 | 0.8648 |
| 0.4664 | 2.4060 | 640 | 0.7292 | 0.7941 | 0.7292 | 0.8539 |
| 0.4664 | 2.4135 | 642 | 0.6593 | 0.776 | 0.6593 | 0.8120 |
| 0.4664 | 2.4211 | 644 | 0.5012 | 0.7490 | 0.5012 | 0.7080 |
| 0.4664 | 2.4286 | 646 | 0.3730 | 0.7083 | 0.3730 | 0.6108 |
| 0.4664 | 2.4361 | 648 | 0.3613 | 0.7083 | 0.3613 | 0.6011 |
| 0.4664 | 2.4436 | 650 | 0.4332 | 0.7490 | 0.4332 | 0.6582 |
| 0.4664 | 2.4511 | 652 | 0.5890 | 0.8218 | 0.5890 | 0.7675 |
| 0.4664 | 2.4586 | 654 | 0.6419 | 0.8218 | 0.6419 | 0.8012 |
| 0.4664 | 2.4662 | 656 | 0.6337 | 0.7879 | 0.6337 | 0.7961 |
| 0.4664 | 2.4737 | 658 | 0.5871 | 0.8063 | 0.5871 | 0.7662 |
| 0.4664 | 2.4812 | 660 | 0.4690 | 0.7607 | 0.4690 | 0.6849 |
| 0.4664 | 2.4887 | 662 | 0.4262 | 0.7175 | 0.4262 | 0.6529 |
| 0.4664 | 2.4962 | 664 | 0.4328 | 0.7490 | 0.4328 | 0.6579 |
| 0.4664 | 2.5038 | 666 | 0.5775 | 0.7879 | 0.5775 | 0.7599 |
| 0.4664 | 2.5113 | 668 | 0.7549 | 0.7774 | 0.7549 | 0.8688 |
| 0.4664 | 2.5188 | 670 | 0.8037 | 0.7774 | 0.8037 | 0.8965 |
| 0.4664 | 2.5263 | 672 | 0.7322 | 0.7774 | 0.7322 | 0.8557 |
| 0.4664 | 2.5338 | 674 | 0.6424 | 0.7586 | 0.6424 | 0.8015 |
| 0.4664 | 2.5414 | 676 | 0.5111 | 0.7175 | 0.5111 | 0.7149 |
| 0.4664 | 2.5489 | 678 | 0.4862 | 0.7083 | 0.4862 | 0.6973 |
| 0.4664 | 2.5564 | 680 | 0.4829 | 0.7244 | 0.4829 | 0.6949 |
| 0.4664 | 2.5639 | 682 | 0.5103 | 0.7342 | 0.5103 | 0.7144 |
| 0.4664 | 2.5714 | 684 | 0.5553 | 0.7097 | 0.5553 | 0.7452 |
| 0.4664 | 2.5789 | 686 | 0.5113 | 0.7244 | 0.5113 | 0.7150 |
| 0.4664 | 2.5865 | 688 | 0.5171 | 0.7805 | 0.5171 | 0.7191 |
| 0.4664 | 2.5940 | 690 | 0.5948 | 0.8165 | 0.5948 | 0.7712 |
| 0.4664 | 2.6015 | 692 | 0.7100 | 0.7986 | 0.7100 | 0.8426 |
| 0.4664 | 2.6090 | 694 | 0.6905 | 0.7986 | 0.6905 | 0.8310 |
| 0.4664 | 2.6165 | 696 | 0.6101 | 0.7986 | 0.6101 | 0.7811 |
| 0.4664 | 2.6241 | 698 | 0.4980 | 0.7237 | 0.4980 | 0.7057 |
| 0.4664 | 2.6316 | 700 | 0.4532 | 0.7635 | 0.4532 | 0.6732 |
| 0.4664 | 2.6391 | 702 | 0.4078 | 0.6715 | 0.4078 | 0.6386 |
| 0.4664 | 2.6466 | 704 | 0.4437 | 0.8048 | 0.4437 | 0.6661 |
| 0.4664 | 2.6541 | 706 | 0.6026 | 0.8165 | 0.6026 | 0.7762 |
| 0.4664 | 2.6617 | 708 | 0.7574 | 0.7709 | 0.7574 | 0.8703 |
| 0.4664 | 2.6692 | 710 | 0.7288 | 0.7986 | 0.7288 | 0.8537 |
| 0.4664 | 2.6767 | 712 | 0.6429 | 0.7986 | 0.6429 | 0.8018 |
| 0.4664 | 2.6842 | 714 | 0.5592 | 0.8165 | 0.5592 | 0.7478 |
| 0.4664 | 2.6917 | 716 | 0.4489 | 0.6899 | 0.4489 | 0.6700 |
| 0.4664 | 2.6992 | 718 | 0.4238 | 0.6465 | 0.4238 | 0.6510 |
| 0.4664 | 2.7068 | 720 | 0.4524 | 0.6557 | 0.4524 | 0.6726 |
| 0.4664 | 2.7143 | 722 | 0.5665 | 0.8165 | 0.5665 | 0.7527 |
| 0.4664 | 2.7218 | 724 | 0.6894 | 0.7449 | 0.6894 | 0.8303 |
| 0.4664 | 2.7293 | 726 | 0.8506 | 0.7138 | 0.8506 | 0.9223 |
| 0.4664 | 2.7368 | 728 | 0.8595 | 0.7138 | 0.8595 | 0.9271 |
| 0.4664 | 2.7444 | 730 | 0.7267 | 0.7449 | 0.7267 | 0.8525 |
| 0.4664 | 2.7519 | 732 | 0.6173 | 0.7879 | 0.6173 | 0.7857 |
| 0.4664 | 2.7594 | 734 | 0.5757 | 0.7879 | 0.5757 | 0.7587 |
| 0.4664 | 2.7669 | 736 | 0.5914 | 0.7879 | 0.5914 | 0.7690 |
| 0.4664 | 2.7744 | 738 | 0.5654 | 0.8165 | 0.5654 | 0.7519 |
| 0.4664 | 2.7820 | 740 | 0.5660 | 0.8165 | 0.5660 | 0.7523 |
| 0.4664 | 2.7895 | 742 | 0.6408 | 0.7449 | 0.6408 | 0.8005 |
| 0.4664 | 2.7970 | 744 | 0.6694 | 0.7287 | 0.6694 | 0.8182 |
| 0.4664 | 2.8045 | 746 | 0.6624 | 0.7287 | 0.6624 | 0.8139 |
| 0.4664 | 2.8120 | 748 | 0.6626 | 0.7586 | 0.6626 | 0.8140 |
| 0.4664 | 2.8195 | 750 | 0.6232 | 0.7586 | 0.6232 | 0.7894 |
| 0.4664 | 2.8271 | 752 | 0.5101 | 0.7812 | 0.5101 | 0.7142 |
| 0.4664 | 2.8346 | 754 | 0.4702 | 0.8 | 0.4702 | 0.6857 |
| 0.4664 | 2.8421 | 756 | 0.5673 | 0.776 | 0.5673 | 0.7532 |
| 0.4664 | 2.8496 | 758 | 0.7138 | 0.7287 | 0.7138 | 0.8449 |
| 0.4664 | 2.8571 | 760 | 0.6928 | 0.7449 | 0.6928 | 0.8323 |
| 0.4664 | 2.8647 | 762 | 0.5732 | 0.7950 | 0.5732 | 0.7571 |
| 0.4664 | 2.8722 | 764 | 0.5629 | 0.7950 | 0.5629 | 0.7503 |
| 0.4664 | 2.8797 | 766 | 0.6744 | 0.776 | 0.6744 | 0.8212 |
| 0.4664 | 2.8872 | 768 | 0.8614 | 0.7138 | 0.8614 | 0.9281 |
| 0.4664 | 2.8947 | 770 | 0.8623 | 0.7138 | 0.8623 | 0.9286 |
| 0.4664 | 2.9023 | 772 | 0.7083 | 0.7426 | 0.7083 | 0.8416 |
| 0.4664 | 2.9098 | 774 | 0.5602 | 0.8165 | 0.5602 | 0.7484 |
| 0.4664 | 2.9173 | 776 | 0.5092 | 0.7372 | 0.5092 | 0.7136 |
| 0.4664 | 2.9248 | 778 | 0.5307 | 0.7372 | 0.5307 | 0.7285 |
| 0.4664 | 2.9323 | 780 | 0.5586 | 0.7372 | 0.5586 | 0.7474 |
| 0.4664 | 2.9398 | 782 | 0.6343 | 0.6859 | 0.6343 | 0.7964 |
| 0.4664 | 2.9474 | 784 | 0.7145 | 0.7426 | 0.7145 | 0.8453 |
| 0.4664 | 2.9549 | 786 | 0.7176 | 0.7426 | 0.7176 | 0.8471 |
| 0.4664 | 2.9624 | 788 | 0.6702 | 0.7586 | 0.6702 | 0.8186 |
| 0.4664 | 2.9699 | 790 | 0.7279 | 0.7426 | 0.7279 | 0.8532 |
| 0.4664 | 2.9774 | 792 | 0.7570 | 0.7586 | 0.7570 | 0.8700 |
| 0.4664 | 2.9850 | 794 | 0.6713 | 0.776 | 0.6713 | 0.8193 |
| 0.4664 | 2.9925 | 796 | 0.5440 | 0.7097 | 0.5440 | 0.7376 |
| 0.4664 | 3.0 | 798 | 0.4672 | 0.7667 | 0.4672 | 0.6835 |
| 0.4664 | 3.0075 | 800 | 0.4553 | 0.7795 | 0.4553 | 0.6748 |
| 0.4664 | 3.0150 | 802 | 0.5026 | 0.7586 | 0.5026 | 0.7089 |
| 0.4664 | 3.0226 | 804 | 0.5799 | 0.7442 | 0.5799 | 0.7615 |
| 0.4664 | 3.0301 | 806 | 0.6841 | 0.8339 | 0.6841 | 0.8271 |
| 0.4664 | 3.0376 | 808 | 0.5921 | 0.8339 | 0.5921 | 0.7695 |
| 0.4664 | 3.0451 | 810 | 0.4992 | 0.7742 | 0.4992 | 0.7065 |
| 0.4664 | 3.0526 | 812 | 0.4269 | 0.7607 | 0.4269 | 0.6534 |
| 0.4664 | 3.0602 | 814 | 0.3689 | 0.5882 | 0.3689 | 0.6074 |
| 0.4664 | 3.0677 | 816 | 0.3890 | 0.6169 | 0.3890 | 0.6237 |
| 0.4664 | 3.0752 | 818 | 0.5120 | 0.7607 | 0.5120 | 0.7156 |
| 0.4664 | 3.0827 | 820 | 0.7451 | 0.7941 | 0.7451 | 0.8632 |
| 0.4664 | 3.0902 | 822 | 0.7792 | 0.7941 | 0.7792 | 0.8827 |
| 0.4664 | 3.0977 | 824 | 0.6341 | 0.8123 | 0.6341 | 0.7963 |
| 0.4664 | 3.1053 | 826 | 0.5858 | 0.8165 | 0.5858 | 0.7654 |
| 0.4664 | 3.1128 | 828 | 0.5121 | 0.7812 | 0.5121 | 0.7156 |
| 0.4664 | 3.1203 | 830 | 0.4928 | 0.7692 | 0.4928 | 0.7020 |
| 0.4664 | 3.1278 | 832 | 0.5184 | 0.8028 | 0.5184 | 0.7200 |
| 0.4664 | 3.1353 | 834 | 0.4595 | 0.7692 | 0.4595 | 0.6778 |
| 0.4664 | 3.1429 | 836 | 0.3674 | 0.7675 | 0.3674 | 0.6061 |
| 0.4664 | 3.1504 | 838 | 0.3168 | 0.7956 | 0.3168 | 0.5628 |
| 0.4664 | 3.1579 | 840 | 0.3236 | 0.7605 | 0.3236 | 0.5689 |
| 0.4664 | 3.1654 | 842 | 0.3875 | 0.7956 | 0.3875 | 0.6225 |
| 0.4664 | 3.1729 | 844 | 0.5641 | 0.7986 | 0.5641 | 0.7511 |
| 0.4664 | 3.1805 | 846 | 0.8050 | 0.7941 | 0.8050 | 0.8972 |
| 0.4664 | 3.1880 | 848 | 0.8149 | 0.7941 | 0.8149 | 0.9027 |
| 0.4664 | 3.1955 | 850 | 0.6442 | 0.7941 | 0.6442 | 0.8026 |
| 0.4664 | 3.2030 | 852 | 0.4752 | 0.8 | 0.4752 | 0.6893 |
| 0.4664 | 3.2105 | 854 | 0.4229 | 0.7549 | 0.4229 | 0.6503 |
| 0.4664 | 3.2180 | 856 | 0.4350 | 0.7549 | 0.4350 | 0.6595 |
| 0.4664 | 3.2256 | 858 | 0.4860 | 0.8108 | 0.4860 | 0.6972 |
| 0.4664 | 3.2331 | 860 | 0.5244 | 0.8123 | 0.5244 | 0.7241 |
| 0.4664 | 3.2406 | 862 | 0.4774 | 0.7971 | 0.4774 | 0.6909 |
| 0.4664 | 3.2481 | 864 | 0.5138 | 0.8121 | 0.5138 | 0.7168 |
| 0.4664 | 3.2556 | 866 | 0.6142 | 0.7941 | 0.6142 | 0.7837 |
| 0.4664 | 3.2632 | 868 | 0.6261 | 0.7820 | 0.6261 | 0.7913 |
| 0.4664 | 3.2707 | 870 | 0.5222 | 0.7758 | 0.5222 | 0.7227 |
| 0.4664 | 3.2782 | 872 | 0.4829 | 0.7027 | 0.4829 | 0.6949 |
| 0.4664 | 3.2857 | 874 | 0.4752 | 0.7177 | 0.4752 | 0.6894 |
| 0.4664 | 3.2932 | 876 | 0.4854 | 0.7177 | 0.4854 | 0.6967 |
| 0.4664 | 3.3008 | 878 | 0.5398 | 0.7107 | 0.5398 | 0.7347 |
| 0.4664 | 3.3083 | 880 | 0.5543 | 0.7107 | 0.5543 | 0.7445 |
| 0.4664 | 3.3158 | 882 | 0.5394 | 0.7107 | 0.5394 | 0.7345 |
| 0.4664 | 3.3233 | 884 | 0.6591 | 0.7586 | 0.6591 | 0.8118 |
| 0.4664 | 3.3308 | 886 | 0.6691 | 0.7586 | 0.6691 | 0.8180 |
| 0.4664 | 3.3383 | 888 | 0.6884 | 0.7586 | 0.6884 | 0.8297 |
| 0.4664 | 3.3459 | 890 | 0.6083 | 0.7364 | 0.6083 | 0.7799 |
| 0.4664 | 3.3534 | 892 | 0.5091 | 0.6818 | 0.5091 | 0.7135 |
| 0.4664 | 3.3609 | 894 | 0.5168 | 0.6818 | 0.5168 | 0.7189 |
| 0.4664 | 3.3684 | 896 | 0.6367 | 0.7364 | 0.6367 | 0.7980 |
| 0.4664 | 3.3759 | 898 | 0.8118 | 0.7131 | 0.8118 | 0.9010 |
| 0.4664 | 3.3835 | 900 | 0.8040 | 0.7368 | 0.8040 | 0.8967 |
| 0.4664 | 3.3910 | 902 | 0.6751 | 0.7364 | 0.6751 | 0.8217 |
| 0.4664 | 3.3985 | 904 | 0.5206 | 0.6818 | 0.5206 | 0.7215 |
| 0.4664 | 3.4060 | 906 | 0.4762 | 0.7175 | 0.4762 | 0.6901 |
| 0.4664 | 3.4135 | 908 | 0.5455 | 0.7107 | 0.5455 | 0.7386 |
| 0.4664 | 3.4211 | 910 | 0.7364 | 0.7449 | 0.7364 | 0.8581 |
| 0.4664 | 3.4286 | 912 | 0.8896 | 0.7368 | 0.8896 | 0.9432 |
| 0.4664 | 3.4361 | 914 | 0.8454 | 0.7449 | 0.8454 | 0.9194 |
| 0.4664 | 3.4436 | 916 | 0.6814 | 0.7449 | 0.6814 | 0.8255 |
| 0.4664 | 3.4511 | 918 | 0.5222 | 0.7175 | 0.5222 | 0.7226 |
| 0.4664 | 3.4586 | 920 | 0.4351 | 0.6903 | 0.4351 | 0.6596 |
| 0.4664 | 3.4662 | 922 | 0.3859 | 0.6419 | 0.3859 | 0.6212 |
| 0.4664 | 3.4737 | 924 | 0.4009 | 0.6903 | 0.4009 | 0.6332 |
| 0.4664 | 3.4812 | 926 | 0.5003 | 0.7175 | 0.5003 | 0.7073 |
| 0.4664 | 3.4887 | 928 | 0.7595 | 0.7449 | 0.7595 | 0.8715 |
| 0.4664 | 3.4962 | 930 | 1.0148 | 0.7605 | 1.0148 | 1.0074 |
| 0.4664 | 3.5038 | 932 | 1.0369 | 0.7605 | 1.0369 | 1.0183 |
| 0.4664 | 3.5113 | 934 | 0.8968 | 0.7368 | 0.8968 | 0.9470 |
| 0.4664 | 3.5188 | 936 | 0.6870 | 0.7449 | 0.6870 | 0.8289 |
| 0.4664 | 3.5263 | 938 | 0.5591 | 0.7097 | 0.5591 | 0.7478 |
| 0.4664 | 3.5338 | 940 | 0.5079 | 0.7607 | 0.5079 | 0.7127 |
| 0.4664 | 3.5414 | 942 | 0.5388 | 0.7607 | 0.5388 | 0.7340 |
| 0.4664 | 3.5489 | 944 | 0.5696 | 0.8165 | 0.5696 | 0.7547 |
| 0.4664 | 3.5564 | 946 | 0.6119 | 0.776 | 0.6119 | 0.7823 |
| 0.4664 | 3.5639 | 948 | 0.5769 | 0.8165 | 0.5769 | 0.7596 |
| 0.4664 | 3.5714 | 950 | 0.5237 | 0.7926 | 0.5237 | 0.7237 |
| 0.4664 | 3.5789 | 952 | 0.5013 | 0.8205 | 0.5013 | 0.7080 |
| 0.4664 | 3.5865 | 954 | 0.4976 | 0.7926 | 0.4976 | 0.7054 |
| 0.4664 | 3.5940 | 956 | 0.5714 | 0.7348 | 0.5714 | 0.7559 |
| 0.4664 | 3.6015 | 958 | 0.6241 | 0.7348 | 0.6241 | 0.7900 |
| 0.4664 | 3.6090 | 960 | 0.6570 | 0.7348 | 0.6570 | 0.8106 |
| 0.4664 | 3.6165 | 962 | 0.6293 | 0.7510 | 0.6293 | 0.7933 |
| 0.4664 | 3.6241 | 964 | 0.6298 | 0.7510 | 0.6298 | 0.7936 |
| 0.4664 | 3.6316 | 966 | 0.6084 | 0.7107 | 0.6084 | 0.7800 |
| 0.4664 | 3.6391 | 968 | 0.5898 | 0.7107 | 0.5898 | 0.7680 |
| 0.4664 | 3.6466 | 970 | 0.5602 | 0.7273 | 0.5602 | 0.7485 |
| 0.4664 | 3.6541 | 972 | 0.5726 | 0.7107 | 0.5726 | 0.7567 |
| 0.4664 | 3.6617 | 974 | 0.5487 | 0.7107 | 0.5487 | 0.7407 |
| 0.4664 | 3.6692 | 976 | 0.5770 | 0.7107 | 0.5770 | 0.7596 |
| 0.4664 | 3.6767 | 978 | 0.6563 | 0.7348 | 0.6563 | 0.8101 |
| 0.4664 | 3.6842 | 980 | 0.6836 | 0.7348 | 0.6836 | 0.8268 |
| 0.4664 | 3.6917 | 982 | 0.7159 | 0.7586 | 0.7159 | 0.8461 |
| 0.4664 | 3.6992 | 984 | 0.6790 | 0.7986 | 0.6790 | 0.8240 |
| 0.4664 | 3.7068 | 986 | 0.6393 | 0.7926 | 0.6393 | 0.7995 |
| 0.4664 | 3.7143 | 988 | 0.6741 | 0.7879 | 0.6741 | 0.8211 |
| 0.4664 | 3.7218 | 990 | 0.6690 | 0.7879 | 0.6690 | 0.8179 |
| 0.4664 | 3.7293 | 992 | 0.6081 | 0.7926 | 0.6081 | 0.7798 |
| 0.4664 | 3.7368 | 994 | 0.5600 | 0.6566 | 0.5600 | 0.7483 |
| 0.4664 | 3.7444 | 996 | 0.5621 | 0.7342 | 0.5621 | 0.7497 |
| 0.4664 | 3.7519 | 998 | 0.6349 | 0.7429 | 0.6349 | 0.7968 |
| 0.151 | 3.7594 | 1000 | 0.8159 | 0.7386 | 0.8159 | 0.9033 |
| 0.151 | 3.7669 | 1002 | 0.9863 | 0.6345 | 0.9863 | 0.9931 |
| 0.151 | 3.7744 | 1004 | 1.0975 | 0.6345 | 1.0975 | 1.0476 |
| 0.151 | 3.7820 | 1006 | 1.0536 | 0.6345 | 1.0536 | 1.0265 |
| 0.151 | 3.7895 | 1008 | 0.8869 | 0.7386 | 0.8869 | 0.9418 |
| 0.151 | 3.7970 | 1010 | 0.6589 | 0.7383 | 0.6589 | 0.8118 |
| 0.151 | 3.8045 | 1012 | 0.4873 | 0.7175 | 0.4873 | 0.6981 |
| 0.151 | 3.8120 | 1014 | 0.4392 | 0.7667 | 0.4392 | 0.6628 |
| 0.151 | 3.8195 | 1016 | 0.4572 | 0.7175 | 0.4572 | 0.6762 |
| 0.151 | 3.8271 | 1018 | 0.5104 | 0.6818 | 0.5104 | 0.7145 |
| 0.151 | 3.8346 | 1020 | 0.6607 | 0.8016 | 0.6607 | 0.8128 |
| 0.151 | 3.8421 | 1022 | 0.7833 | 0.7829 | 0.7833 | 0.8851 |
| 0.151 | 3.8496 | 1024 | 0.8763 | 0.8178 | 0.8763 | 0.9361 |
| 0.151 | 3.8571 | 1026 | 0.8886 | 0.8178 | 0.8886 | 0.9426 |
| 0.151 | 3.8647 | 1028 | 0.7332 | 0.7712 | 0.7332 | 0.8563 |
| 0.151 | 3.8722 | 1030 | 0.5985 | 0.7317 | 0.5985 | 0.7736 |
| 0.151 | 3.8797 | 1032 | 0.4877 | 0.7586 | 0.4877 | 0.6983 |
| 0.151 | 3.8872 | 1034 | 0.4200 | 0.7004 | 0.4200 | 0.6481 |
| 0.151 | 3.8947 | 1036 | 0.4369 | 0.7004 | 0.4369 | 0.6610 |
| 0.151 | 3.9023 | 1038 | 0.5300 | 0.8165 | 0.5300 | 0.7280 |
| 0.151 | 3.9098 | 1040 | 0.7152 | 0.8016 | 0.7152 | 0.8457 |
| 0.151 | 3.9173 | 1042 | 0.8773 | 0.7222 | 0.8773 | 0.9367 |
| 0.151 | 3.9248 | 1044 | 0.8942 | 0.7222 | 0.8942 | 0.9456 |
| 0.151 | 3.9323 | 1046 | 0.7750 | 0.8016 | 0.7750 | 0.8803 |
| 0.151 | 3.9398 | 1048 | 0.5933 | 0.776 | 0.5933 | 0.7703 |
| 0.151 | 3.9474 | 1050 | 0.4565 | 0.7175 | 0.4565 | 0.6757 |
| 0.151 | 3.9549 | 1052 | 0.4171 | 0.7667 | 0.4171 | 0.6458 |
| 0.151 | 3.9624 | 1054 | 0.4468 | 0.8397 | 0.4468 | 0.6684 |
| 0.151 | 3.9699 | 1056 | 0.5461 | 0.8165 | 0.5461 | 0.7390 |
| 0.151 | 3.9774 | 1058 | 0.6927 | 0.8178 | 0.6927 | 0.8323 |
| 0.151 | 3.9850 | 1060 | 0.7365 | 0.8178 | 0.7365 | 0.8582 |
| 0.151 | 3.9925 | 1062 | 0.6227 | 0.8372 | 0.6227 | 0.7891 |
| 0.151 | 4.0 | 1064 | 0.4645 | 0.7742 | 0.4645 | 0.6815 |
| 0.151 | 4.0075 | 1066 | 0.3632 | 0.7549 | 0.3632 | 0.6027 |
| 0.151 | 4.0150 | 1068 | 0.3486 | 0.7154 | 0.3486 | 0.5904 |
| 0.151 | 4.0226 | 1070 | 0.3770 | 0.7549 | 0.3770 | 0.6140 |
| 0.151 | 4.0301 | 1072 | 0.4630 | 0.7686 | 0.4630 | 0.6805 |
| 0.151 | 4.0376 | 1074 | 0.5772 | 0.8016 | 0.5772 | 0.7597 |
| 0.151 | 4.0451 | 1076 | 0.6233 | 0.8178 | 0.6233 | 0.7895 |
| 0.151 | 4.0526 | 1078 | 0.5645 | 0.8123 | 0.5645 | 0.7513 |
| 0.151 | 4.0602 | 1080 | 0.4712 | 0.8069 | 0.4712 | 0.6864 |
| 0.151 | 4.0677 | 1082 | 0.4037 | 0.7742 | 0.4037 | 0.6354 |
| 0.151 | 4.0752 | 1084 | 0.3603 | 0.7540 | 0.3603 | 0.6003 |
| 0.151 | 4.0827 | 1086 | 0.3961 | 0.8069 | 0.3961 | 0.6294 |
| 0.151 | 4.0902 | 1088 | 0.4626 | 0.8069 | 0.4626 | 0.6801 |
| 0.151 | 4.0977 | 1090 | 0.5588 | 0.8170 | 0.5588 | 0.7475 |
| 0.151 | 4.1053 | 1092 | 0.6727 | 0.7941 | 0.6727 | 0.8202 |
| 0.151 | 4.1128 | 1094 | 0.6645 | 0.8178 | 0.6645 | 0.8152 |
| 0.151 | 4.1203 | 1096 | 0.5498 | 0.7941 | 0.5498 | 0.7415 |
| 0.151 | 4.1278 | 1098 | 0.4847 | 0.7640 | 0.4847 | 0.6962 |
| 0.151 | 4.1353 | 1100 | 0.5321 | 0.776 | 0.5321 | 0.7295 |
| 0.151 | 4.1429 | 1102 | 0.6171 | 0.7941 | 0.6171 | 0.7856 |
| 0.151 | 4.1504 | 1104 | 0.6363 | 0.7941 | 0.6363 | 0.7977 |
| 0.151 | 4.1579 | 1106 | 0.5457 | 0.7986 | 0.5457 | 0.7387 |
| 0.151 | 4.1654 | 1108 | 0.3856 | 0.7083 | 0.3856 | 0.6210 |
| 0.151 | 4.1729 | 1110 | 0.3320 | 0.7549 | 0.3320 | 0.5762 |
| 0.151 | 4.1805 | 1112 | 0.3572 | 0.7083 | 0.3572 | 0.5976 |
| 0.151 | 4.1880 | 1114 | 0.4586 | 0.7568 | 0.4586 | 0.6772 |
| 0.151 | 4.1955 | 1116 | 0.6406 | 0.7941 | 0.6406 | 0.8004 |
| 0.151 | 4.2030 | 1118 | 0.6956 | 0.8178 | 0.6956 | 0.8340 |
| 0.151 | 4.2105 | 1120 | 0.6514 | 0.8123 | 0.6514 | 0.8071 |
| 0.151 | 4.2180 | 1122 | 0.5706 | 0.7266 | 0.5706 | 0.7554 |
| 0.151 | 4.2256 | 1124 | 0.4477 | 0.7342 | 0.4477 | 0.6691 |
| 0.151 | 4.2331 | 1126 | 0.4001 | 0.7083 | 0.4001 | 0.6325 |
| 0.151 | 4.2406 | 1128 | 0.4316 | 0.7177 | 0.4316 | 0.6569 |
| 0.151 | 4.2481 | 1130 | 0.5739 | 0.7640 | 0.5739 | 0.7576 |
| 0.151 | 4.2556 | 1132 | 0.6684 | 0.7941 | 0.6684 | 0.8176 |
| 0.151 | 4.2632 | 1134 | 0.7239 | 0.8178 | 0.7239 | 0.8508 |
| 0.151 | 4.2707 | 1136 | 0.6623 | 0.776 | 0.6623 | 0.8138 |
| 0.151 | 4.2782 | 1138 | 0.5346 | 0.776 | 0.5346 | 0.7311 |
| 0.151 | 4.2857 | 1140 | 0.4529 | 0.7926 | 0.4529 | 0.6730 |
| 0.151 | 4.2932 | 1142 | 0.3635 | 0.7549 | 0.3635 | 0.6029 |
| 0.151 | 4.3008 | 1144 | 0.3287 | 0.6934 | 0.3287 | 0.5733 |
| 0.151 | 4.3083 | 1146 | 0.3547 | 0.7549 | 0.3547 | 0.5955 |
| 0.151 | 4.3158 | 1148 | 0.4450 | 0.7692 | 0.4450 | 0.6671 |
| 0.151 | 4.3233 | 1150 | 0.5417 | 0.776 | 0.5417 | 0.7360 |
| 0.151 | 4.3308 | 1152 | 0.6731 | 0.8178 | 0.6731 | 0.8204 |
| 0.151 | 4.3383 | 1154 | 0.7479 | 0.8178 | 0.7479 | 0.8648 |
| 0.151 | 4.3459 | 1156 | 0.8304 | 0.7895 | 0.8304 | 0.9113 |
| 0.151 | 4.3534 | 1158 | 0.8121 | 0.7705 | 0.8121 | 0.9011 |
| 0.151 | 4.3609 | 1160 | 0.7019 | 0.72 | 0.7019 | 0.8378 |
| 0.151 | 4.3684 | 1162 | 0.5520 | 0.7383 | 0.5520 | 0.7430 |
| 0.151 | 4.3759 | 1164 | 0.4596 | 0.6316 | 0.4596 | 0.6779 |
| 0.151 | 4.3835 | 1166 | 0.4172 | 0.6316 | 0.4172 | 0.6459 |
| 0.151 | 4.3910 | 1168 | 0.4249 | 0.6818 | 0.4249 | 0.6519 |
| 0.151 | 4.3985 | 1170 | 0.4943 | 0.6667 | 0.4943 | 0.7031 |
| 0.151 | 4.4060 | 1172 | 0.5874 | 0.7941 | 0.5874 | 0.7664 |
| 0.151 | 4.4135 | 1174 | 0.5979 | 0.7941 | 0.5979 | 0.7733 |
| 0.151 | 4.4211 | 1176 | 0.5114 | 0.7603 | 0.5114 | 0.7151 |
| 0.151 | 4.4286 | 1178 | 0.4398 | 0.8121 | 0.4398 | 0.6632 |
| 0.151 | 4.4361 | 1180 | 0.3892 | 0.8372 | 0.3892 | 0.6238 |
| 0.151 | 4.4436 | 1182 | 0.4190 | 0.8121 | 0.4190 | 0.6473 |
| 0.151 | 4.4511 | 1184 | 0.4641 | 0.8121 | 0.4641 | 0.6812 |
| 0.151 | 4.4586 | 1186 | 0.5168 | 0.7879 | 0.5168 | 0.7189 |
| 0.151 | 4.4662 | 1188 | 0.5611 | 0.7709 | 0.5611 | 0.7491 |
| 0.151 | 4.4737 | 1190 | 0.5142 | 0.7879 | 0.5142 | 0.7171 |
| 0.151 | 4.4812 | 1192 | 0.4200 | 0.7464 | 0.4200 | 0.6481 |
| 0.151 | 4.4887 | 1194 | 0.4067 | 0.7623 | 0.4067 | 0.6377 |
| 0.151 | 4.4962 | 1196 | 0.4560 | 0.8121 | 0.4560 | 0.6753 |
| 0.151 | 4.5038 | 1198 | 0.5622 | 0.7879 | 0.5622 | 0.7498 |
| 0.151 | 4.5113 | 1200 | 0.6180 | 0.8178 | 0.6180 | 0.7861 |
| 0.151 | 4.5188 | 1202 | 0.5940 | 0.8372 | 0.5940 | 0.7707 |
| 0.151 | 4.5263 | 1204 | 0.4980 | 0.7364 | 0.4980 | 0.7057 |
| 0.151 | 4.5338 | 1206 | 0.4202 | 0.7342 | 0.4202 | 0.6482 |
| 0.151 | 4.5414 | 1208 | 0.4213 | 0.7342 | 0.4213 | 0.6491 |
| 0.151 | 4.5489 | 1210 | 0.4158 | 0.7342 | 0.4158 | 0.6449 |
| 0.151 | 4.5564 | 1212 | 0.4708 | 0.7926 | 0.4708 | 0.6861 |
| 0.151 | 4.5639 | 1214 | 0.6147 | 0.8178 | 0.6147 | 0.7840 |
| 0.151 | 4.5714 | 1216 | 0.7201 | 0.8178 | 0.7201 | 0.8486 |
| 0.151 | 4.5789 | 1218 | 0.7371 | 0.8178 | 0.7371 | 0.8585 |
| 0.151 | 4.5865 | 1220 | 0.7103 | 0.8178 | 0.7103 | 0.8428 |
| 0.151 | 4.5940 | 1222 | 0.6310 | 0.8178 | 0.6310 | 0.7944 |
| 0.151 | 4.6015 | 1224 | 0.5357 | 0.8123 | 0.5357 | 0.7319 |
| 0.151 | 4.6090 | 1226 | 0.4757 | 0.7407 | 0.4757 | 0.6897 |
| 0.151 | 4.6165 | 1228 | 0.4646 | 0.7407 | 0.4646 | 0.6817 |
| 0.151 | 4.6241 | 1230 | 0.4645 | 0.7407 | 0.4645 | 0.6815 |
| 0.151 | 4.6316 | 1232 | 0.5089 | 0.7758 | 0.5089 | 0.7133 |
| 0.151 | 4.6391 | 1234 | 0.6360 | 0.8178 | 0.6360 | 0.7975 |
| 0.151 | 4.6466 | 1236 | 0.7102 | 0.8178 | 0.7102 | 0.8427 |
| 0.151 | 4.6541 | 1238 | 0.7150 | 0.8178 | 0.7150 | 0.8456 |
| 0.151 | 4.6617 | 1240 | 0.6990 | 0.8178 | 0.6990 | 0.8361 |
| 0.151 | 4.6692 | 1242 | 0.5920 | 0.7941 | 0.5920 | 0.7694 |
| 0.151 | 4.6767 | 1244 | 0.4240 | 0.8121 | 0.4240 | 0.6512 |
| 0.151 | 4.6842 | 1246 | 0.3481 | 0.7549 | 0.3481 | 0.5900 |
| 0.151 | 4.6917 | 1248 | 0.3392 | 0.7549 | 0.3392 | 0.5824 |
| 0.151 | 4.6992 | 1250 | 0.3642 | 0.8244 | 0.3642 | 0.6035 |
| 0.151 | 4.7068 | 1252 | 0.4262 | 0.8521 | 0.4262 | 0.6528 |
| 0.151 | 4.7143 | 1254 | 0.5437 | 0.7941 | 0.5437 | 0.7374 |
| 0.151 | 4.7218 | 1256 | 0.6719 | 0.7941 | 0.6719 | 0.8197 |
| 0.151 | 4.7293 | 1258 | 0.6825 | 0.7941 | 0.6825 | 0.8261 |
| 0.151 | 4.7368 | 1260 | 0.5914 | 0.776 | 0.5914 | 0.7690 |
| 0.151 | 4.7444 | 1262 | 0.4449 | 0.7175 | 0.4449 | 0.6670 |
| 0.151 | 4.7519 | 1264 | 0.3368 | 0.7586 | 0.3368 | 0.5804 |
| 0.151 | 4.7594 | 1266 | 0.3147 | 0.7222 | 0.3147 | 0.5610 |
| 0.151 | 4.7669 | 1268 | 0.3324 | 0.7586 | 0.3324 | 0.5765 |
| 0.151 | 4.7744 | 1270 | 0.3825 | 0.7175 | 0.3825 | 0.6185 |
| 0.151 | 4.7820 | 1272 | 0.4837 | 0.7107 | 0.4837 | 0.6955 |
| 0.151 | 4.7895 | 1274 | 0.5601 | 0.7879 | 0.5601 | 0.7484 |
| 0.151 | 4.7970 | 1276 | 0.5671 | 0.7709 | 0.5671 | 0.7530 |
| 0.151 | 4.8045 | 1278 | 0.5370 | 0.7879 | 0.5370 | 0.7328 |
| 0.151 | 4.8120 | 1280 | 0.4427 | 0.7175 | 0.4427 | 0.6653 |
| 0.151 | 4.8195 | 1282 | 0.3593 | 0.7846 | 0.3593 | 0.5994 |
| 0.151 | 4.8271 | 1284 | 0.3346 | 0.7846 | 0.3346 | 0.5784 |
| 0.151 | 4.8346 | 1286 | 0.3642 | 0.7522 | 0.3642 | 0.6035 |
| 0.151 | 4.8421 | 1288 | 0.4268 | 0.7522 | 0.4268 | 0.6533 |
| 0.151 | 4.8496 | 1290 | 0.5453 | 0.7510 | 0.5453 | 0.7384 |
| 0.151 | 4.8571 | 1292 | 0.6412 | 0.7941 | 0.6412 | 0.8008 |
| 0.151 | 4.8647 | 1294 | 0.6970 | 0.7941 | 0.6970 | 0.8349 |
| 0.151 | 4.8722 | 1296 | 0.6505 | 0.7941 | 0.6505 | 0.8065 |
| 0.151 | 4.8797 | 1298 | 0.5340 | 0.7510 | 0.5340 | 0.7308 |
| 0.151 | 4.8872 | 1300 | 0.4188 | 0.7522 | 0.4188 | 0.6471 |
| 0.151 | 4.8947 | 1302 | 0.3954 | 0.7984 | 0.3954 | 0.6288 |
| 0.151 | 4.9023 | 1304 | 0.4209 | 0.7795 | 0.4209 | 0.6487 |
| 0.151 | 4.9098 | 1306 | 0.5028 | 0.7709 | 0.5028 | 0.7091 |
| 0.151 | 4.9173 | 1308 | 0.5819 | 0.7709 | 0.5819 | 0.7628 |
| 0.151 | 4.9248 | 1310 | 0.6971 | 0.7941 | 0.6971 | 0.8349 |
| 0.151 | 4.9323 | 1312 | 0.7189 | 0.7941 | 0.7189 | 0.8479 |
| 0.151 | 4.9398 | 1314 | 0.6270 | 0.7709 | 0.6270 | 0.7919 |
| 0.151 | 4.9474 | 1316 | 0.4985 | 0.7961 | 0.4985 | 0.7060 |
| 0.151 | 4.9549 | 1318 | 0.4207 | 0.8121 | 0.4207 | 0.6486 |
| 0.151 | 4.9624 | 1320 | 0.3791 | 0.8679 | 0.3791 | 0.6157 |
| 0.151 | 4.9699 | 1322 | 0.3935 | 0.8372 | 0.3935 | 0.6273 |
| 0.151 | 4.9774 | 1324 | 0.4554 | 0.8121 | 0.4554 | 0.6748 |
| 0.151 | 4.9850 | 1326 | 0.5527 | 0.7709 | 0.5527 | 0.7434 |
| 0.151 | 4.9925 | 1328 | 0.6460 | 0.7709 | 0.6460 | 0.8037 |
| 0.151 | 5.0 | 1330 | 0.6585 | 0.8178 | 0.6585 | 0.8115 |
| 0.151 | 5.0075 | 1332 | 0.5731 | 0.7709 | 0.5731 | 0.7571 |
| 0.151 | 5.0150 | 1334 | 0.4526 | 0.7863 | 0.4526 | 0.6728 |
| 0.151 | 5.0226 | 1336 | 0.3879 | 0.8192 | 0.3879 | 0.6228 |
| 0.151 | 5.0301 | 1338 | 0.3854 | 0.8192 | 0.3854 | 0.6208 |
| 0.151 | 5.0376 | 1340 | 0.4274 | 0.8014 | 0.4274 | 0.6538 |
| 0.151 | 5.0451 | 1342 | 0.4662 | 0.7863 | 0.4662 | 0.6828 |
| 0.151 | 5.0526 | 1344 | 0.5729 | 0.7879 | 0.5729 | 0.7569 |
| 0.151 | 5.0602 | 1346 | 0.6681 | 0.7941 | 0.6681 | 0.8174 |
| 0.151 | 5.0677 | 1348 | 0.7224 | 0.7941 | 0.7224 | 0.8499 |
| 0.151 | 5.0752 | 1350 | 0.7105 | 0.7941 | 0.7105 | 0.8429 |
| 0.151 | 5.0827 | 1352 | 0.6381 | 0.8123 | 0.6381 | 0.7988 |
| 0.151 | 5.0902 | 1354 | 0.5516 | 0.7364 | 0.5516 | 0.7427 |
| 0.151 | 5.0977 | 1356 | 0.5263 | 0.7544 | 0.5263 | 0.7254 |
| 0.151 | 5.1053 | 1358 | 0.5066 | 0.7097 | 0.5066 | 0.7117 |
| 0.151 | 5.1128 | 1360 | 0.5294 | 0.7544 | 0.5294 | 0.7276 |
| 0.151 | 5.1203 | 1362 | 0.5745 | 0.776 | 0.5745 | 0.7580 |
| 0.151 | 5.1278 | 1364 | 0.5331 | 0.8165 | 0.5331 | 0.7301 |
| 0.151 | 5.1353 | 1366 | 0.4798 | 0.7568 | 0.4798 | 0.6926 |
| 0.151 | 5.1429 | 1368 | 0.4335 | 0.8346 | 0.4335 | 0.6584 |
| 0.151 | 5.1504 | 1370 | 0.4016 | 0.7984 | 0.4016 | 0.6337 |
| 0.151 | 5.1579 | 1372 | 0.3800 | 0.7984 | 0.3800 | 0.6165 |
| 0.151 | 5.1654 | 1374 | 0.3864 | 0.7984 | 0.3864 | 0.6216 |
| 0.151 | 5.1729 | 1376 | 0.4456 | 0.7522 | 0.4456 | 0.6675 |
| 0.151 | 5.1805 | 1378 | 0.4886 | 0.7273 | 0.4886 | 0.6990 |
| 0.151 | 5.1880 | 1380 | 0.4943 | 0.7429 | 0.4943 | 0.7031 |
| 0.151 | 5.1955 | 1382 | 0.4728 | 0.8165 | 0.4728 | 0.6876 |
| 0.151 | 5.2030 | 1384 | 0.4191 | 0.8780 | 0.4191 | 0.6474 |
| 0.151 | 5.2105 | 1386 | 0.4254 | 0.8780 | 0.4254 | 0.6522 |
| 0.151 | 5.2180 | 1388 | 0.4464 | 0.8780 | 0.4464 | 0.6681 |
| 0.151 | 5.2256 | 1390 | 0.4685 | 0.8165 | 0.4685 | 0.6845 |
| 0.151 | 5.2331 | 1392 | 0.4870 | 0.8165 | 0.4870 | 0.6979 |
| 0.151 | 5.2406 | 1394 | 0.5411 | 0.8165 | 0.5411 | 0.7356 |
| 0.151 | 5.2481 | 1396 | 0.5192 | 0.8165 | 0.5192 | 0.7205 |
| 0.151 | 5.2556 | 1398 | 0.4497 | 0.8 | 0.4497 | 0.6706 |
| 0.151 | 5.2632 | 1400 | 0.3852 | 0.7522 | 0.3852 | 0.6206 |
| 0.151 | 5.2707 | 1402 | 0.3553 | 0.7522 | 0.3553 | 0.5961 |
| 0.151 | 5.2782 | 1404 | 0.3797 | 0.7522 | 0.3797 | 0.6162 |
| 0.151 | 5.2857 | 1406 | 0.3966 | 0.7522 | 0.3966 | 0.6297 |
| 0.151 | 5.2932 | 1408 | 0.4021 | 0.7932 | 0.4021 | 0.6341 |
| 0.151 | 5.3008 | 1410 | 0.4527 | 0.8649 | 0.4527 | 0.6728 |
| 0.151 | 5.3083 | 1412 | 0.5797 | 0.8165 | 0.5797 | 0.7614 |
| 0.151 | 5.3158 | 1414 | 0.7069 | 0.7829 | 0.7069 | 0.8408 |
| 0.151 | 5.3233 | 1416 | 0.7733 | 0.8078 | 0.7733 | 0.8794 |
| 0.151 | 5.3308 | 1418 | 0.7115 | 0.8078 | 0.7115 | 0.8435 |
| 0.151 | 5.3383 | 1420 | 0.6270 | 0.7879 | 0.6270 | 0.7918 |
| 0.151 | 5.3459 | 1422 | 0.5576 | 0.8165 | 0.5576 | 0.7467 |
| 0.151 | 5.3534 | 1424 | 0.4449 | 0.8 | 0.4449 | 0.6670 |
| 0.151 | 5.3609 | 1426 | 0.4060 | 0.7607 | 0.4060 | 0.6371 |
| 0.151 | 5.3684 | 1428 | 0.4235 | 0.7607 | 0.4235 | 0.6507 |
| 0.151 | 5.3759 | 1430 | 0.4515 | 0.8 | 0.4515 | 0.6720 |
| 0.151 | 5.3835 | 1432 | 0.4575 | 0.8 | 0.4575 | 0.6764 |
| 0.151 | 5.3910 | 1434 | 0.4818 | 0.7812 | 0.4818 | 0.6941 |
| 0.151 | 5.3985 | 1436 | 0.5028 | 0.8028 | 0.5028 | 0.7091 |
| 0.151 | 5.4060 | 1438 | 0.4737 | 0.8028 | 0.4737 | 0.6882 |
| 0.151 | 5.4135 | 1440 | 0.4099 | 0.8069 | 0.4099 | 0.6403 |
| 0.151 | 5.4211 | 1442 | 0.3882 | 0.8511 | 0.3882 | 0.6231 |
| 0.151 | 5.4286 | 1444 | 0.3376 | 0.8192 | 0.3376 | 0.5811 |
| 0.151 | 5.4361 | 1446 | 0.3309 | 0.7846 | 0.3309 | 0.5753 |
| 0.151 | 5.4436 | 1448 | 0.3295 | 0.7846 | 0.3295 | 0.5740 |
| 0.151 | 5.4511 | 1450 | 0.3631 | 0.7407 | 0.3631 | 0.6026 |
| 0.151 | 5.4586 | 1452 | 0.4415 | 0.8108 | 0.4415 | 0.6645 |
| 0.151 | 5.4662 | 1454 | 0.5393 | 0.7812 | 0.5393 | 0.7344 |
| 0.151 | 5.4737 | 1456 | 0.5770 | 0.7879 | 0.5770 | 0.7596 |
| 0.151 | 5.4812 | 1458 | 0.5530 | 0.7812 | 0.5530 | 0.7436 |
| 0.151 | 5.4887 | 1460 | 0.4957 | 0.7812 | 0.4957 | 0.7041 |
| 0.151 | 5.4962 | 1462 | 0.4744 | 0.7692 | 0.4744 | 0.6888 |
| 0.151 | 5.5038 | 1464 | 0.4523 | 0.7812 | 0.4523 | 0.6725 |
| 0.151 | 5.5113 | 1466 | 0.4292 | 0.8108 | 0.4292 | 0.6551 |
| 0.151 | 5.5188 | 1468 | 0.4212 | 0.7932 | 0.4212 | 0.6490 |
| 0.151 | 5.5263 | 1470 | 0.4650 | 0.7812 | 0.4650 | 0.6819 |
| 0.151 | 5.5338 | 1472 | 0.5325 | 0.7812 | 0.5325 | 0.7297 |
| 0.151 | 5.5414 | 1474 | 0.6177 | 0.776 | 0.6177 | 0.7860 |
| 0.151 | 5.5489 | 1476 | 0.6561 | 0.7941 | 0.6561 | 0.8100 |
| 0.151 | 5.5564 | 1478 | 0.6171 | 0.8123 | 0.6171 | 0.7856 |
| 0.151 | 5.5639 | 1480 | 0.5412 | 0.7812 | 0.5412 | 0.7357 |
| 0.151 | 5.5714 | 1482 | 0.4555 | 0.7812 | 0.4555 | 0.6749 |
| 0.151 | 5.5789 | 1484 | 0.3929 | 0.7407 | 0.3929 | 0.6268 |
| 0.151 | 5.5865 | 1486 | 0.3603 | 0.7407 | 0.3603 | 0.6002 |
| 0.151 | 5.5940 | 1488 | 0.3745 | 0.7407 | 0.3745 | 0.6120 |
| 0.151 | 5.6015 | 1490 | 0.4243 | 0.7328 | 0.4243 | 0.6513 |
| 0.151 | 5.6090 | 1492 | 0.5144 | 0.7812 | 0.5144 | 0.7172 |
| 0.151 | 5.6165 | 1494 | 0.6086 | 0.7879 | 0.6086 | 0.7801 |
| 0.151 | 5.6241 | 1496 | 0.6205 | 0.7879 | 0.6205 | 0.7877 |
| 0.151 | 5.6316 | 1498 | 0.6263 | 0.7879 | 0.6263 | 0.7914 |
| 0.1101 | 5.6391 | 1500 | 0.5632 | 0.8165 | 0.5632 | 0.7505 |
| 0.1101 | 5.6466 | 1502 | 0.5207 | 0.7812 | 0.5207 | 0.7216 |
| 0.1101 | 5.6541 | 1504 | 0.4909 | 0.7429 | 0.4909 | 0.7006 |
| 0.1101 | 5.6617 | 1506 | 0.5031 | 0.7429 | 0.5031 | 0.7093 |
| 0.1101 | 5.6692 | 1508 | 0.5173 | 0.7429 | 0.5173 | 0.7192 |
| 0.1101 | 5.6767 | 1510 | 0.5397 | 0.7429 | 0.5397 | 0.7346 |
| 0.1101 | 5.6842 | 1512 | 0.5567 | 0.7107 | 0.5567 | 0.7461 |
| 0.1101 | 5.6917 | 1514 | 0.5764 | 0.7879 | 0.5764 | 0.7592 |
| 0.1101 | 5.6992 | 1516 | 0.5431 | 0.7107 | 0.5431 | 0.7370 |
| 0.1101 | 5.7068 | 1518 | 0.4897 | 0.7429 | 0.4897 | 0.6998 |
| 0.1101 | 5.7143 | 1520 | 0.4614 | 0.7175 | 0.4614 | 0.6792 |
| 0.1101 | 5.7218 | 1522 | 0.4640 | 0.7175 | 0.4640 | 0.6812 |
| 0.1101 | 5.7293 | 1524 | 0.4713 | 0.7009 | 0.4713 | 0.6865 |
| 0.1101 | 5.7368 | 1526 | 0.5318 | 0.776 | 0.5318 | 0.7293 |
| 0.1101 | 5.7444 | 1528 | 0.5854 | 0.8123 | 0.5854 | 0.7651 |
| 0.1101 | 5.7519 | 1530 | 0.6610 | 0.8123 | 0.6610 | 0.8130 |
| 0.1101 | 5.7594 | 1532 | 0.6560 | 0.7986 | 0.6560 | 0.8099 |
| 0.1101 | 5.7669 | 1534 | 0.6035 | 0.7986 | 0.6035 | 0.7768 |
| 0.1101 | 5.7744 | 1536 | 0.5327 | 0.7986 | 0.5327 | 0.7299 |
| 0.1101 | 5.7820 | 1538 | 0.5135 | 0.7986 | 0.5135 | 0.7166 |
| 0.1101 | 5.7895 | 1540 | 0.4838 | 0.8256 | 0.4838 | 0.6956 |
| 0.1101 | 5.7970 | 1542 | 0.4836 | 0.7986 | 0.4836 | 0.6954 |
| 0.1101 | 5.8045 | 1544 | 0.5438 | 0.8123 | 0.5438 | 0.7374 |
| 0.1101 | 5.8120 | 1546 | 0.6609 | 0.8123 | 0.6609 | 0.8130 |
| 0.1101 | 5.8195 | 1548 | 0.7084 | 0.8123 | 0.7084 | 0.8417 |
| 0.1101 | 5.8271 | 1550 | 0.7074 | 0.8123 | 0.7074 | 0.8411 |
| 0.1101 | 5.8346 | 1552 | 0.6810 | 0.776 | 0.6810 | 0.8252 |
| 0.1101 | 5.8421 | 1554 | 0.6335 | 0.776 | 0.6335 | 0.7959 |
| 0.1101 | 5.8496 | 1556 | 0.6660 | 0.776 | 0.6660 | 0.8161 |
| 0.1101 | 5.8571 | 1558 | 0.6672 | 0.7449 | 0.6672 | 0.8168 |
| 0.1101 | 5.8647 | 1560 | 0.6148 | 0.776 | 0.6148 | 0.7841 |
| 0.1101 | 5.8722 | 1562 | 0.5841 | 0.776 | 0.5841 | 0.7643 |
| 0.1101 | 5.8797 | 1564 | 0.6103 | 0.8123 | 0.6103 | 0.7812 |
| 0.1101 | 5.8872 | 1566 | 0.6384 | 0.8123 | 0.6384 | 0.7990 |
| 0.1101 | 5.8947 | 1568 | 0.6575 | 0.8123 | 0.6575 | 0.8109 |
| 0.1101 | 5.9023 | 1570 | 0.6269 | 0.8123 | 0.6269 | 0.7918 |
| 0.1101 | 5.9098 | 1572 | 0.5740 | 0.8123 | 0.5740 | 0.7577 |
| 0.1101 | 5.9173 | 1574 | 0.5434 | 0.8123 | 0.5434 | 0.7372 |
| 0.1101 | 5.9248 | 1576 | 0.5111 | 0.7510 | 0.5111 | 0.7149 |
| 0.1101 | 5.9323 | 1578 | 0.5326 | 0.7510 | 0.5326 | 0.7298 |
| 0.1101 | 5.9398 | 1580 | 0.5264 | 0.7510 | 0.5264 | 0.7255 |
| 0.1101 | 5.9474 | 1582 | 0.5017 | 0.7510 | 0.5017 | 0.7083 |
| 0.1101 | 5.9549 | 1584 | 0.4858 | 0.7812 | 0.4858 | 0.6970 |
| 0.1101 | 5.9624 | 1586 | 0.5165 | 0.7510 | 0.5165 | 0.7187 |
| 0.1101 | 5.9699 | 1588 | 0.5530 | 0.7879 | 0.5530 | 0.7436 |
| 0.1101 | 5.9774 | 1590 | 0.5260 | 0.7879 | 0.5260 | 0.7252 |
| 0.1101 | 5.9850 | 1592 | 0.5075 | 0.8165 | 0.5075 | 0.7124 |
| 0.1101 | 5.9925 | 1594 | 0.5420 | 0.7879 | 0.5420 | 0.7362 |
| 0.1101 | 6.0 | 1596 | 0.5616 | 0.7879 | 0.5616 | 0.7494 |
| 0.1101 | 6.0075 | 1598 | 0.5442 | 0.7879 | 0.5442 | 0.7377 |
| 0.1101 | 6.0150 | 1600 | 0.5096 | 0.7879 | 0.5096 | 0.7139 |
| 0.1101 | 6.0226 | 1602 | 0.4779 | 0.7812 | 0.4779 | 0.6913 |
| 0.1101 | 6.0301 | 1604 | 0.4771 | 0.7510 | 0.4771 | 0.6908 |
| 0.1101 | 6.0376 | 1606 | 0.5136 | 0.7510 | 0.5136 | 0.7167 |
| 0.1101 | 6.0451 | 1608 | 0.5222 | 0.7510 | 0.5222 | 0.7226 |
| 0.1101 | 6.0526 | 1610 | 0.4836 | 0.7510 | 0.4836 | 0.6954 |
| 0.1101 | 6.0602 | 1612 | 0.4459 | 0.7342 | 0.4459 | 0.6678 |
| 0.1101 | 6.0677 | 1614 | 0.4345 | 0.7342 | 0.4345 | 0.6592 |
| 0.1101 | 6.0752 | 1616 | 0.4329 | 0.7342 | 0.4329 | 0.6580 |
| 0.1101 | 6.0827 | 1618 | 0.4307 | 0.7742 | 0.4307 | 0.6563 |
| 0.1101 | 6.0902 | 1620 | 0.4131 | 0.7742 | 0.4131 | 0.6428 |
| 0.1101 | 6.0977 | 1622 | 0.4271 | 0.8108 | 0.4271 | 0.6536 |
| 0.1101 | 6.1053 | 1624 | 0.4714 | 0.7926 | 0.4714 | 0.6866 |
| 0.1101 | 6.1128 | 1626 | 0.5124 | 0.8165 | 0.5124 | 0.7158 |
| 0.1101 | 6.1203 | 1628 | 0.5425 | 0.8165 | 0.5425 | 0.7366 |
| 0.1101 | 6.1278 | 1630 | 0.5972 | 0.776 | 0.5972 | 0.7728 |
| 0.1101 | 6.1353 | 1632 | 0.5980 | 0.776 | 0.5980 | 0.7733 |
| 0.1101 | 6.1429 | 1634 | 0.5418 | 0.776 | 0.5418 | 0.7360 |
| 0.1101 | 6.1504 | 1636 | 0.4400 | 0.8 | 0.4400 | 0.6633 |
| 0.1101 | 6.1579 | 1638 | 0.3715 | 0.7244 | 0.3715 | 0.6095 |
| 0.1101 | 6.1654 | 1640 | 0.3519 | 0.7244 | 0.3519 | 0.5932 |
| 0.1101 | 6.1729 | 1642 | 0.3691 | 0.7244 | 0.3691 | 0.6076 |
| 0.1101 | 6.1805 | 1644 | 0.4267 | 0.7971 | 0.4267 | 0.6532 |
| 0.1101 | 6.1880 | 1646 | 0.5017 | 0.8489 | 0.5017 | 0.7083 |
| 0.1101 | 6.1955 | 1648 | 0.5683 | 0.8489 | 0.5683 | 0.7538 |
| 0.1101 | 6.2030 | 1650 | 0.6024 | 0.8489 | 0.6024 | 0.7762 |
| 0.1101 | 6.2105 | 1652 | 0.5968 | 0.8489 | 0.5968 | 0.7725 |
| 0.1101 | 6.2180 | 1654 | 0.5547 | 0.8489 | 0.5547 | 0.7448 |
| 0.1101 | 6.2256 | 1656 | 0.4794 | 0.8165 | 0.4794 | 0.6924 |
| 0.1101 | 6.2331 | 1658 | 0.4208 | 0.7623 | 0.4208 | 0.6487 |
| 0.1101 | 6.2406 | 1660 | 0.4130 | 0.7742 | 0.4130 | 0.6426 |
| 0.1101 | 6.2481 | 1662 | 0.4206 | 0.7742 | 0.4206 | 0.6485 |
| 0.1101 | 6.2556 | 1664 | 0.4470 | 0.7742 | 0.4470 | 0.6686 |
| 0.1101 | 6.2632 | 1666 | 0.4459 | 0.7742 | 0.4459 | 0.6678 |
| 0.1101 | 6.2707 | 1668 | 0.4422 | 0.7742 | 0.4422 | 0.6650 |
| 0.1101 | 6.2782 | 1670 | 0.3974 | 0.7742 | 0.3974 | 0.6304 |
| 0.1101 | 6.2857 | 1672 | 0.3746 | 0.7910 | 0.3746 | 0.6120 |
| 0.1101 | 6.2932 | 1674 | 0.3854 | 0.7910 | 0.3854 | 0.6208 |
| 0.1101 | 6.3008 | 1676 | 0.4216 | 0.7805 | 0.4216 | 0.6493 |
| 0.1101 | 6.3083 | 1678 | 0.4565 | 0.7805 | 0.4565 | 0.6757 |
| 0.1101 | 6.3158 | 1680 | 0.4345 | 0.7805 | 0.4345 | 0.6592 |
| 0.1101 | 6.3233 | 1682 | 0.4160 | 0.8069 | 0.4160 | 0.6450 |
| 0.1101 | 6.3308 | 1684 | 0.3770 | 0.8069 | 0.3770 | 0.6140 |
| 0.1101 | 6.3383 | 1686 | 0.3716 | 0.8244 | 0.3716 | 0.6096 |
| 0.1101 | 6.3459 | 1688 | 0.3933 | 0.8069 | 0.3933 | 0.6271 |
| 0.1101 | 6.3534 | 1690 | 0.3802 | 0.8244 | 0.3802 | 0.6166 |
| 0.1101 | 6.3609 | 1692 | 0.4055 | 0.8244 | 0.4055 | 0.6368 |
| 0.1101 | 6.3684 | 1694 | 0.4574 | 0.7926 | 0.4574 | 0.6763 |
| 0.1101 | 6.3759 | 1696 | 0.4715 | 0.7926 | 0.4715 | 0.6866 |
| 0.1101 | 6.3835 | 1698 | 0.5056 | 0.7926 | 0.5056 | 0.7111 |
| 0.1101 | 6.3910 | 1700 | 0.5113 | 0.8256 | 0.5113 | 0.7151 |
| 0.1101 | 6.3985 | 1702 | 0.4747 | 0.7926 | 0.4747 | 0.6890 |
| 0.1101 | 6.4060 | 1704 | 0.4590 | 0.8256 | 0.4590 | 0.6775 |
| 0.1101 | 6.4135 | 1706 | 0.4413 | 0.8205 | 0.4413 | 0.6643 |
| 0.1101 | 6.4211 | 1708 | 0.4088 | 0.8205 | 0.4088 | 0.6394 |
| 0.1101 | 6.4286 | 1710 | 0.3916 | 0.8679 | 0.3916 | 0.6258 |
| 0.1101 | 6.4361 | 1712 | 0.3834 | 0.8679 | 0.3834 | 0.6192 |
| 0.1101 | 6.4436 | 1714 | 0.3670 | 0.8511 | 0.3670 | 0.6058 |
| 0.1101 | 6.4511 | 1716 | 0.3572 | 0.8192 | 0.3572 | 0.5977 |
| 0.1101 | 6.4586 | 1718 | 0.3684 | 0.8511 | 0.3684 | 0.6070 |
| 0.1101 | 6.4662 | 1720 | 0.3950 | 0.8679 | 0.3950 | 0.6285 |
| 0.1101 | 6.4737 | 1722 | 0.4500 | 0.8521 | 0.4500 | 0.6708 |
| 0.1101 | 6.4812 | 1724 | 0.5133 | 0.8256 | 0.5133 | 0.7165 |
| 0.1101 | 6.4887 | 1726 | 0.6131 | 0.8123 | 0.6131 | 0.7830 |
| 0.1101 | 6.4962 | 1728 | 0.6562 | 0.7829 | 0.6562 | 0.8101 |
| 0.1101 | 6.5038 | 1730 | 0.6597 | 0.7829 | 0.6597 | 0.8122 |
| 0.1101 | 6.5113 | 1732 | 0.6294 | 0.6578 | 0.6294 | 0.7934 |
| 0.1101 | 6.5188 | 1734 | 0.5614 | 0.6818 | 0.5614 | 0.7493 |
| 0.1101 | 6.5263 | 1736 | 0.4852 | 0.6818 | 0.4852 | 0.6966 |
| 0.1101 | 6.5338 | 1738 | 0.4431 | 0.6818 | 0.4431 | 0.6657 |
| 0.1101 | 6.5414 | 1740 | 0.4009 | 0.7175 | 0.4009 | 0.6332 |
| 0.1101 | 6.5489 | 1742 | 0.3910 | 0.7667 | 0.3910 | 0.6253 |
| 0.1101 | 6.5564 | 1744 | 0.4230 | 0.7667 | 0.4230 | 0.6504 |
| 0.1101 | 6.5639 | 1746 | 0.4885 | 0.7273 | 0.4885 | 0.6989 |
| 0.1101 | 6.5714 | 1748 | 0.5653 | 0.7879 | 0.5653 | 0.7519 |
| 0.1101 | 6.5789 | 1750 | 0.5690 | 0.7879 | 0.5690 | 0.7543 |
| 0.1101 | 6.5865 | 1752 | 0.5389 | 0.7879 | 0.5389 | 0.7341 |
| 0.1101 | 6.5940 | 1754 | 0.5130 | 0.7879 | 0.5130 | 0.7163 |
| 0.1101 | 6.6015 | 1756 | 0.4742 | 0.7510 | 0.4742 | 0.6886 |
| 0.1101 | 6.6090 | 1758 | 0.4172 | 0.8048 | 0.4172 | 0.6459 |
| 0.1101 | 6.6165 | 1760 | 0.3712 | 0.7667 | 0.3712 | 0.6093 |
| 0.1101 | 6.6241 | 1762 | 0.3517 | 0.7549 | 0.3517 | 0.5931 |
| 0.1101 | 6.6316 | 1764 | 0.3647 | 0.7667 | 0.3647 | 0.6039 |
| 0.1101 | 6.6391 | 1766 | 0.4059 | 0.8048 | 0.4059 | 0.6371 |
| 0.1101 | 6.6466 | 1768 | 0.4430 | 0.7429 | 0.4430 | 0.6656 |
| 0.1101 | 6.6541 | 1770 | 0.5062 | 0.7510 | 0.5062 | 0.7115 |
| 0.1101 | 6.6617 | 1772 | 0.5765 | 0.7510 | 0.5765 | 0.7593 |
| 0.1101 | 6.6692 | 1774 | 0.5814 | 0.7510 | 0.5814 | 0.7625 |
| 0.1101 | 6.6767 | 1776 | 0.5584 | 0.7510 | 0.5584 | 0.7472 |
| 0.1101 | 6.6842 | 1778 | 0.5076 | 0.7510 | 0.5076 | 0.7125 |
| 0.1101 | 6.6917 | 1780 | 0.4867 | 0.7510 | 0.4867 | 0.6976 |
| 0.1101 | 6.6992 | 1782 | 0.4477 | 0.7812 | 0.4477 | 0.6691 |
| 0.1101 | 6.7068 | 1784 | 0.4333 | 0.7812 | 0.4333 | 0.6582 |
| 0.1101 | 6.7143 | 1786 | 0.4167 | 0.7812 | 0.4167 | 0.6455 |
| 0.1101 | 6.7218 | 1788 | 0.4216 | 0.7812 | 0.4216 | 0.6493 |
| 0.1101 | 6.7293 | 1790 | 0.4426 | 0.7812 | 0.4426 | 0.6653 |
| 0.1101 | 6.7368 | 1792 | 0.4609 | 0.7812 | 0.4609 | 0.6789 |
| 0.1101 | 6.7444 | 1794 | 0.4961 | 0.8165 | 0.4961 | 0.7043 |
| 0.1101 | 6.7519 | 1796 | 0.4912 | 0.8165 | 0.4912 | 0.7009 |
| 0.1101 | 6.7594 | 1798 | 0.4529 | 0.7812 | 0.4529 | 0.6729 |
| 0.1101 | 6.7669 | 1800 | 0.4138 | 0.8205 | 0.4138 | 0.6433 |
| 0.1101 | 6.7744 | 1802 | 0.3827 | 0.8244 | 0.3827 | 0.6186 |
| 0.1101 | 6.7820 | 1804 | 0.3602 | 0.8244 | 0.3602 | 0.6001 |
| 0.1101 | 6.7895 | 1806 | 0.3731 | 0.8244 | 0.3731 | 0.6108 |
| 0.1101 | 6.7970 | 1808 | 0.4113 | 0.8 | 0.4113 | 0.6414 |
| 0.1101 | 6.8045 | 1810 | 0.4757 | 0.7812 | 0.4757 | 0.6897 |
| 0.1101 | 6.8120 | 1812 | 0.5201 | 0.7879 | 0.5201 | 0.7212 |
| 0.1101 | 6.8195 | 1814 | 0.5158 | 0.7879 | 0.5158 | 0.7182 |
| 0.1101 | 6.8271 | 1816 | 0.4732 | 0.7812 | 0.4732 | 0.6879 |
| 0.1101 | 6.8346 | 1818 | 0.4172 | 0.8 | 0.4172 | 0.6459 |
| 0.1101 | 6.8421 | 1820 | 0.3894 | 0.7175 | 0.3894 | 0.6240 |
| 0.1101 | 6.8496 | 1822 | 0.3536 | 0.7522 | 0.3536 | 0.5946 |
| 0.1101 | 6.8571 | 1824 | 0.3609 | 0.7522 | 0.3609 | 0.6007 |
| 0.1101 | 6.8647 | 1826 | 0.4013 | 0.7607 | 0.4013 | 0.6335 |
| 0.1101 | 6.8722 | 1828 | 0.4527 | 0.7812 | 0.4527 | 0.6728 |
| 0.1101 | 6.8797 | 1830 | 0.5210 | 0.7879 | 0.5210 | 0.7218 |
| 0.1101 | 6.8872 | 1832 | 0.5837 | 0.7879 | 0.5837 | 0.7640 |
| 0.1101 | 6.8947 | 1834 | 0.5901 | 0.7879 | 0.5901 | 0.7682 |
| 0.1101 | 6.9023 | 1836 | 0.5385 | 0.7879 | 0.5385 | 0.7338 |
| 0.1101 | 6.9098 | 1838 | 0.4730 | 0.8165 | 0.4730 | 0.6878 |
| 0.1101 | 6.9173 | 1840 | 0.4059 | 0.7863 | 0.4059 | 0.6371 |
| 0.1101 | 6.9248 | 1842 | 0.3724 | 0.8244 | 0.3724 | 0.6102 |
| 0.1101 | 6.9323 | 1844 | 0.3834 | 0.7863 | 0.3834 | 0.6192 |
| 0.1101 | 6.9398 | 1846 | 0.4090 | 0.7863 | 0.4090 | 0.6395 |
| 0.1101 | 6.9474 | 1848 | 0.4234 | 0.7692 | 0.4234 | 0.6507 |
| 0.1101 | 6.9549 | 1850 | 0.4795 | 0.8165 | 0.4795 | 0.6925 |
| 0.1101 | 6.9624 | 1852 | 0.5133 | 0.7879 | 0.5133 | 0.7165 |
| 0.1101 | 6.9699 | 1854 | 0.5234 | 0.7879 | 0.5234 | 0.7235 |
| 0.1101 | 6.9774 | 1856 | 0.5044 | 0.7879 | 0.5044 | 0.7102 |
| 0.1101 | 6.9850 | 1858 | 0.4803 | 0.7879 | 0.4803 | 0.6930 |
| 0.1101 | 6.9925 | 1860 | 0.4555 | 0.7510 | 0.4555 | 0.6749 |
| 0.1101 | 7.0 | 1862 | 0.4637 | 0.7510 | 0.4637 | 0.6810 |
| 0.1101 | 7.0075 | 1864 | 0.4591 | 0.7510 | 0.4591 | 0.6776 |
| 0.1101 | 7.0150 | 1866 | 0.4410 | 0.7175 | 0.4410 | 0.6641 |
| 0.1101 | 7.0226 | 1868 | 0.4093 | 0.7549 | 0.4093 | 0.6398 |
| 0.1101 | 7.0301 | 1870 | 0.4077 | 0.7910 | 0.4077 | 0.6385 |
| 0.1101 | 7.0376 | 1872 | 0.4339 | 0.8205 | 0.4339 | 0.6587 |
| 0.1101 | 7.0451 | 1874 | 0.4589 | 0.8521 | 0.4589 | 0.6774 |
| 0.1101 | 7.0526 | 1876 | 0.4969 | 0.7879 | 0.4969 | 0.7049 |
| 0.1101 | 7.0602 | 1878 | 0.5183 | 0.7879 | 0.5183 | 0.7199 |
| 0.1101 | 7.0677 | 1880 | 0.5091 | 0.8165 | 0.5091 | 0.7135 |
| 0.1101 | 7.0752 | 1882 | 0.4784 | 0.8521 | 0.4784 | 0.6917 |
| 0.1101 | 7.0827 | 1884 | 0.4623 | 0.8372 | 0.4623 | 0.6799 |
| 0.1101 | 7.0902 | 1886 | 0.4750 | 0.8372 | 0.4750 | 0.6892 |
| 0.1101 | 7.0977 | 1888 | 0.5153 | 0.8165 | 0.5153 | 0.7178 |
| 0.1101 | 7.1053 | 1890 | 0.5157 | 0.8521 | 0.5157 | 0.7181 |
| 0.1101 | 7.1128 | 1892 | 0.5124 | 0.8372 | 0.5124 | 0.7158 |
| 0.1101 | 7.1203 | 1894 | 0.5294 | 0.8028 | 0.5294 | 0.7276 |
| 0.1101 | 7.1278 | 1896 | 0.5292 | 0.8028 | 0.5292 | 0.7275 |
| 0.1101 | 7.1353 | 1898 | 0.5143 | 0.8028 | 0.5143 | 0.7171 |
| 0.1101 | 7.1429 | 1900 | 0.4756 | 0.8372 | 0.4756 | 0.6896 |
| 0.1101 | 7.1504 | 1902 | 0.4392 | 0.8372 | 0.4392 | 0.6627 |
| 0.1101 | 7.1579 | 1904 | 0.4397 | 0.8372 | 0.4397 | 0.6631 |
| 0.1101 | 7.1654 | 1906 | 0.4825 | 0.8165 | 0.4825 | 0.6946 |
| 0.1101 | 7.1729 | 1908 | 0.4878 | 0.8165 | 0.4878 | 0.6984 |
| 0.1101 | 7.1805 | 1910 | 0.4808 | 0.8165 | 0.4808 | 0.6934 |
| 0.1101 | 7.1880 | 1912 | 0.4460 | 0.7429 | 0.4460 | 0.6679 |
| 0.1101 | 7.1955 | 1914 | 0.4370 | 0.7692 | 0.4370 | 0.6611 |
| 0.1101 | 7.2030 | 1916 | 0.4151 | 0.8069 | 0.4151 | 0.6443 |
| 0.1101 | 7.2105 | 1918 | 0.4199 | 0.8372 | 0.4199 | 0.6480 |
| 0.1101 | 7.2180 | 1920 | 0.4619 | 0.8028 | 0.4619 | 0.6796 |
| 0.1101 | 7.2256 | 1922 | 0.5093 | 0.8028 | 0.5093 | 0.7137 |
| 0.1101 | 7.2331 | 1924 | 0.5099 | 0.8028 | 0.5099 | 0.7141 |
| 0.1101 | 7.2406 | 1926 | 0.4969 | 0.8028 | 0.4969 | 0.7049 |
| 0.1101 | 7.2481 | 1928 | 0.4384 | 0.8028 | 0.4384 | 0.6621 |
| 0.1101 | 7.2556 | 1930 | 0.3825 | 0.8618 | 0.3825 | 0.6185 |
| 0.1101 | 7.2632 | 1932 | 0.3458 | 0.8511 | 0.3458 | 0.5881 |
| 0.1101 | 7.2707 | 1934 | 0.3275 | 0.8192 | 0.3275 | 0.5723 |
| 0.1101 | 7.2782 | 1936 | 0.3274 | 0.8192 | 0.3274 | 0.5721 |
| 0.1101 | 7.2857 | 1938 | 0.3480 | 0.8151 | 0.3480 | 0.5899 |
| 0.1101 | 7.2932 | 1940 | 0.3763 | 0.8151 | 0.3763 | 0.6134 |
| 0.1101 | 7.3008 | 1942 | 0.4311 | 0.8165 | 0.4311 | 0.6566 |
| 0.1101 | 7.3083 | 1944 | 0.4901 | 0.8165 | 0.4901 | 0.7001 |
| 0.1101 | 7.3158 | 1946 | 0.5300 | 0.8165 | 0.5300 | 0.7280 |
| 0.1101 | 7.3233 | 1948 | 0.5206 | 0.8165 | 0.5206 | 0.7215 |
| 0.1101 | 7.3308 | 1950 | 0.4814 | 0.8165 | 0.4814 | 0.6938 |
| 0.1101 | 7.3383 | 1952 | 0.4652 | 0.8 | 0.4652 | 0.6821 |
| 0.1101 | 7.3459 | 1954 | 0.4427 | 0.7607 | 0.4427 | 0.6654 |
| 0.1101 | 7.3534 | 1956 | 0.4564 | 0.7607 | 0.4564 | 0.6756 |
| 0.1101 | 7.3609 | 1958 | 0.4629 | 0.8 | 0.4629 | 0.6804 |
| 0.1101 | 7.3684 | 1960 | 0.4558 | 0.8 | 0.4558 | 0.6751 |
| 0.1101 | 7.3759 | 1962 | 0.4432 | 0.8 | 0.4432 | 0.6658 |
| 0.1101 | 7.3835 | 1964 | 0.4489 | 0.8 | 0.4489 | 0.6700 |
| 0.1101 | 7.3910 | 1966 | 0.4536 | 0.8165 | 0.4536 | 0.6735 |
| 0.1101 | 7.3985 | 1968 | 0.4470 | 0.8165 | 0.4470 | 0.6686 |
| 0.1101 | 7.4060 | 1970 | 0.4262 | 0.7692 | 0.4262 | 0.6528 |
| 0.1101 | 7.4135 | 1972 | 0.4063 | 0.7863 | 0.4063 | 0.6374 |
| 0.1101 | 7.4211 | 1974 | 0.4125 | 0.7692 | 0.4125 | 0.6423 |
| 0.1101 | 7.4286 | 1976 | 0.4316 | 0.7812 | 0.4316 | 0.6570 |
| 0.1101 | 7.4361 | 1978 | 0.4177 | 0.7812 | 0.4177 | 0.6463 |
| 0.1101 | 7.4436 | 1980 | 0.3943 | 0.8151 | 0.3943 | 0.6279 |
| 0.1101 | 7.4511 | 1982 | 0.4076 | 0.8306 | 0.4076 | 0.6384 |
| 0.1101 | 7.4586 | 1984 | 0.4191 | 0.8 | 0.4191 | 0.6474 |
| 0.1101 | 7.4662 | 1986 | 0.4646 | 0.7812 | 0.4646 | 0.6816 |
| 0.1101 | 7.4737 | 1988 | 0.5191 | 0.7879 | 0.5191 | 0.7205 |
| 0.1101 | 7.4812 | 1990 | 0.5766 | 0.7879 | 0.5766 | 0.7593 |
| 0.1101 | 7.4887 | 1992 | 0.6324 | 0.7879 | 0.6324 | 0.7953 |
| 0.1101 | 7.4962 | 1994 | 0.6531 | 0.8123 | 0.6531 | 0.8082 |
| 0.1101 | 7.5038 | 1996 | 0.6291 | 0.7879 | 0.6291 | 0.7931 |
| 0.1101 | 7.5113 | 1998 | 0.5723 | 0.7510 | 0.5723 | 0.7565 |
| 0.0717 | 7.5188 | 2000 | 0.5163 | 0.7686 | 0.5163 | 0.7185 |
| 0.0717 | 7.5263 | 2002 | 0.4661 | 0.7607 | 0.4661 | 0.6827 |
| 0.0717 | 7.5338 | 2004 | 0.4192 | 0.7175 | 0.4192 | 0.6475 |
| 0.0717 | 7.5414 | 2006 | 0.4041 | 0.7175 | 0.4041 | 0.6357 |
| 0.0717 | 7.5489 | 2008 | 0.4150 | 0.7607 | 0.4150 | 0.6442 |
| 0.0717 | 7.5564 | 2010 | 0.4313 | 0.8048 | 0.4313 | 0.6568 |
| 0.0717 | 7.5639 | 2012 | 0.4666 | 0.7812 | 0.4666 | 0.6831 |
| 0.0717 | 7.5714 | 2014 | 0.5168 | 0.7879 | 0.5168 | 0.7189 |
| 0.0717 | 7.5789 | 2016 | 0.5465 | 0.7879 | 0.5465 | 0.7392 |
| 0.0717 | 7.5865 | 2018 | 0.5765 | 0.7879 | 0.5765 | 0.7593 |
| 0.0717 | 7.5940 | 2020 | 0.5633 | 0.7879 | 0.5633 | 0.7506 |
| 0.0717 | 7.6015 | 2022 | 0.5210 | 0.8165 | 0.5210 | 0.7218 |
| 0.0717 | 7.6090 | 2024 | 0.4725 | 0.8521 | 0.4725 | 0.6874 |
| 0.0717 | 7.6165 | 2026 | 0.4218 | 0.8397 | 0.4218 | 0.6495 |
| 0.0717 | 7.6241 | 2028 | 0.4096 | 0.8048 | 0.4096 | 0.6400 |
| 0.0717 | 7.6316 | 2030 | 0.4148 | 0.8048 | 0.4148 | 0.6441 |
| 0.0717 | 7.6391 | 2032 | 0.4227 | 0.8397 | 0.4227 | 0.6502 |
| 0.0717 | 7.6466 | 2034 | 0.4155 | 0.8048 | 0.4155 | 0.6446 |
| 0.0717 | 7.6541 | 2036 | 0.4096 | 0.8397 | 0.4096 | 0.6400 |
| 0.0717 | 7.6617 | 2038 | 0.4148 | 0.8397 | 0.4148 | 0.6441 |
| 0.0717 | 7.6692 | 2040 | 0.4328 | 0.8521 | 0.4328 | 0.6579 |
| 0.0717 | 7.6767 | 2042 | 0.4628 | 0.8521 | 0.4628 | 0.6803 |
| 0.0717 | 7.6842 | 2044 | 0.4910 | 0.8521 | 0.4910 | 0.7007 |
| 0.0717 | 7.6917 | 2046 | 0.5159 | 0.8521 | 0.5159 | 0.7183 |
| 0.0717 | 7.6992 | 2048 | 0.5624 | 0.8256 | 0.5624 | 0.7500 |
| 0.0717 | 7.7068 | 2050 | 0.6057 | 0.7879 | 0.6057 | 0.7783 |
| 0.0717 | 7.7143 | 2052 | 0.6226 | 0.7879 | 0.6226 | 0.7891 |
| 0.0717 | 7.7218 | 2054 | 0.6108 | 0.7879 | 0.6108 | 0.7815 |
| 0.0717 | 7.7293 | 2056 | 0.5660 | 0.7879 | 0.5660 | 0.7524 |
| 0.0717 | 7.7368 | 2058 | 0.5205 | 0.8165 | 0.5205 | 0.7214 |
| 0.0717 | 7.7444 | 2060 | 0.4656 | 0.8521 | 0.4656 | 0.6823 |
| 0.0717 | 7.7519 | 2062 | 0.4323 | 0.8397 | 0.4323 | 0.6575 |
| 0.0717 | 7.7594 | 2064 | 0.4239 | 0.8397 | 0.4239 | 0.6511 |
| 0.0717 | 7.7669 | 2066 | 0.4494 | 0.8521 | 0.4494 | 0.6704 |
| 0.0717 | 7.7744 | 2068 | 0.5053 | 0.8521 | 0.5053 | 0.7108 |
| 0.0717 | 7.7820 | 2070 | 0.5498 | 0.7879 | 0.5498 | 0.7415 |
| 0.0717 | 7.7895 | 2072 | 0.5806 | 0.7879 | 0.5806 | 0.7619 |
| 0.0717 | 7.7970 | 2074 | 0.5899 | 0.7879 | 0.5899 | 0.7681 |
| 0.0717 | 7.8045 | 2076 | 0.5617 | 0.7879 | 0.5617 | 0.7495 |
| 0.0717 | 7.8120 | 2078 | 0.5261 | 0.8165 | 0.5261 | 0.7253 |
| 0.0717 | 7.8195 | 2080 | 0.5067 | 0.8165 | 0.5067 | 0.7118 |
| 0.0717 | 7.8271 | 2082 | 0.4869 | 0.8165 | 0.4869 | 0.6978 |
| 0.0717 | 7.8346 | 2084 | 0.4812 | 0.8165 | 0.4812 | 0.6937 |
| 0.0717 | 7.8421 | 2086 | 0.5062 | 0.8165 | 0.5062 | 0.7115 |
| 0.0717 | 7.8496 | 2088 | 0.5417 | 0.8165 | 0.5417 | 0.7360 |
| 0.0717 | 7.8571 | 2090 | 0.5491 | 0.8165 | 0.5491 | 0.7410 |
| 0.0717 | 7.8647 | 2092 | 0.5564 | 0.8165 | 0.5564 | 0.7459 |
| 0.0717 | 7.8722 | 2094 | 0.5232 | 0.8165 | 0.5232 | 0.7233 |
| 0.0717 | 7.8797 | 2096 | 0.4733 | 0.8165 | 0.4733 | 0.6880 |
| 0.0717 | 7.8872 | 2098 | 0.4291 | 0.8 | 0.4291 | 0.6551 |
| 0.0717 | 7.8947 | 2100 | 0.4089 | 0.8 | 0.4089 | 0.6395 |
| 0.0717 | 7.9023 | 2102 | 0.4172 | 0.8 | 0.4172 | 0.6459 |
| 0.0717 | 7.9098 | 2104 | 0.4222 | 0.8 | 0.4222 | 0.6498 |
| 0.0717 | 7.9173 | 2106 | 0.4349 | 0.8 | 0.4349 | 0.6594 |
| 0.0717 | 7.9248 | 2108 | 0.4443 | 0.8 | 0.4443 | 0.6666 |
| 0.0717 | 7.9323 | 2110 | 0.4618 | 0.8 | 0.4618 | 0.6796 |
| 0.0717 | 7.9398 | 2112 | 0.4701 | 0.8 | 0.4701 | 0.6857 |
| 0.0717 | 7.9474 | 2114 | 0.4866 | 0.8 | 0.4866 | 0.6976 |
| 0.0717 | 7.9549 | 2116 | 0.5167 | 0.7812 | 0.5167 | 0.7188 |
| 0.0717 | 7.9624 | 2118 | 0.5508 | 0.8165 | 0.5508 | 0.7421 |
| 0.0717 | 7.9699 | 2120 | 0.6047 | 0.7879 | 0.6047 | 0.7776 |
| 0.0717 | 7.9774 | 2122 | 0.6493 | 0.7879 | 0.6493 | 0.8058 |
| 0.0717 | 7.9850 | 2124 | 0.7009 | 0.7586 | 0.7009 | 0.8372 |
| 0.0717 | 7.9925 | 2126 | 0.7156 | 0.7586 | 0.7156 | 0.8460 |
| 0.0717 | 8.0 | 2128 | 0.7092 | 0.7586 | 0.7092 | 0.8421 |
| 0.0717 | 8.0075 | 2130 | 0.6850 | 0.7586 | 0.6850 | 0.8276 |
| 0.0717 | 8.0150 | 2132 | 0.6317 | 0.7879 | 0.6317 | 0.7948 |
| 0.0717 | 8.0226 | 2134 | 0.5686 | 0.8165 | 0.5686 | 0.7540 |
| 0.0717 | 8.0301 | 2136 | 0.5189 | 0.7812 | 0.5189 | 0.7203 |
| 0.0717 | 8.0376 | 2138 | 0.4673 | 0.8 | 0.4673 | 0.6836 |
| 0.0717 | 8.0451 | 2140 | 0.4542 | 0.7607 | 0.4542 | 0.6739 |
| 0.0717 | 8.0526 | 2142 | 0.4565 | 0.8 | 0.4565 | 0.6757 |
| 0.0717 | 8.0602 | 2144 | 0.4789 | 0.8 | 0.4789 | 0.6920 |
| 0.0717 | 8.0677 | 2146 | 0.5160 | 0.8165 | 0.5160 | 0.7183 |
| 0.0717 | 8.0752 | 2148 | 0.5547 | 0.8165 | 0.5547 | 0.7448 |
| 0.0717 | 8.0827 | 2150 | 0.5729 | 0.8165 | 0.5729 | 0.7569 |
| 0.0717 | 8.0902 | 2152 | 0.5809 | 0.8165 | 0.5809 | 0.7622 |
| 0.0717 | 8.0977 | 2154 | 0.6027 | 0.7879 | 0.6027 | 0.7763 |
| 0.0717 | 8.1053 | 2156 | 0.6149 | 0.7879 | 0.6149 | 0.7842 |
| 0.0717 | 8.1128 | 2158 | 0.6209 | 0.7879 | 0.6209 | 0.7880 |
| 0.0717 | 8.1203 | 2160 | 0.6451 | 0.7879 | 0.6451 | 0.8032 |
| 0.0717 | 8.1278 | 2162 | 0.6393 | 0.7879 | 0.6393 | 0.7996 |
| 0.0717 | 8.1353 | 2164 | 0.6163 | 0.7879 | 0.6163 | 0.7850 |
| 0.0717 | 8.1429 | 2166 | 0.5815 | 0.8165 | 0.5815 | 0.7626 |
| 0.0717 | 8.1504 | 2168 | 0.5268 | 0.8165 | 0.5268 | 0.7258 |
| 0.0717 | 8.1579 | 2170 | 0.4797 | 0.8165 | 0.4797 | 0.6926 |
| 0.0717 | 8.1654 | 2172 | 0.4476 | 0.8359 | 0.4476 | 0.6690 |
| 0.0717 | 8.1729 | 2174 | 0.4352 | 0.8359 | 0.4352 | 0.6597 |
| 0.0717 | 8.1805 | 2176 | 0.4414 | 0.8359 | 0.4414 | 0.6644 |
| 0.0717 | 8.1880 | 2178 | 0.4482 | 0.8165 | 0.4482 | 0.6695 |
| 0.0717 | 8.1955 | 2180 | 0.4617 | 0.8165 | 0.4617 | 0.6795 |
| 0.0717 | 8.2030 | 2182 | 0.4886 | 0.8165 | 0.4886 | 0.6990 |
| 0.0717 | 8.2105 | 2184 | 0.5188 | 0.8165 | 0.5188 | 0.7203 |
| 0.0717 | 8.2180 | 2186 | 0.5403 | 0.8165 | 0.5403 | 0.7350 |
| 0.0717 | 8.2256 | 2188 | 0.5563 | 0.8165 | 0.5563 | 0.7459 |
| 0.0717 | 8.2331 | 2190 | 0.5517 | 0.8165 | 0.5517 | 0.7428 |
| 0.0717 | 8.2406 | 2192 | 0.5387 | 0.8165 | 0.5387 | 0.7339 |
| 0.0717 | 8.2481 | 2194 | 0.5308 | 0.8165 | 0.5308 | 0.7286 |
| 0.0717 | 8.2556 | 2196 | 0.5015 | 0.8165 | 0.5015 | 0.7081 |
| 0.0717 | 8.2632 | 2198 | 0.4714 | 0.8 | 0.4714 | 0.6866 |
| 0.0717 | 8.2707 | 2200 | 0.4434 | 0.7607 | 0.4434 | 0.6659 |
| 0.0717 | 8.2782 | 2202 | 0.4347 | 0.7607 | 0.4347 | 0.6593 |
| 0.0717 | 8.2857 | 2204 | 0.4447 | 0.7607 | 0.4447 | 0.6669 |
| 0.0717 | 8.2932 | 2206 | 0.4709 | 0.7607 | 0.4709 | 0.6862 |
| 0.0717 | 8.3008 | 2208 | 0.5110 | 0.7607 | 0.5110 | 0.7149 |
| 0.0717 | 8.3083 | 2210 | 0.5671 | 0.7879 | 0.5671 | 0.7530 |
| 0.0717 | 8.3158 | 2212 | 0.6029 | 0.7879 | 0.6029 | 0.7765 |
| 0.0717 | 8.3233 | 2214 | 0.6112 | 0.7879 | 0.6112 | 0.7818 |
| 0.0717 | 8.3308 | 2216 | 0.6173 | 0.7879 | 0.6173 | 0.7857 |
| 0.0717 | 8.3383 | 2218 | 0.5935 | 0.7879 | 0.5935 | 0.7704 |
| 0.0717 | 8.3459 | 2220 | 0.5561 | 0.7879 | 0.5561 | 0.7457 |
| 0.0717 | 8.3534 | 2222 | 0.5295 | 0.8165 | 0.5295 | 0.7276 |
| 0.0717 | 8.3609 | 2224 | 0.5105 | 0.8165 | 0.5105 | 0.7145 |
| 0.0717 | 8.3684 | 2226 | 0.4973 | 0.8165 | 0.4973 | 0.7052 |
| 0.0717 | 8.3759 | 2228 | 0.4991 | 0.8165 | 0.4991 | 0.7065 |
| 0.0717 | 8.3835 | 2230 | 0.4982 | 0.8165 | 0.4982 | 0.7058 |
| 0.0717 | 8.3910 | 2232 | 0.5114 | 0.8165 | 0.5114 | 0.7151 |
| 0.0717 | 8.3985 | 2234 | 0.5158 | 0.8165 | 0.5158 | 0.7182 |
| 0.0717 | 8.4060 | 2236 | 0.5138 | 0.8165 | 0.5138 | 0.7168 |
| 0.0717 | 8.4135 | 2238 | 0.5070 | 0.8165 | 0.5070 | 0.7120 |
| 0.0717 | 8.4211 | 2240 | 0.4953 | 0.8165 | 0.4953 | 0.7038 |
| 0.0717 | 8.4286 | 2242 | 0.4762 | 0.8521 | 0.4762 | 0.6900 |
| 0.0717 | 8.4361 | 2244 | 0.4747 | 0.8521 | 0.4747 | 0.6890 |
| 0.0717 | 8.4436 | 2246 | 0.4715 | 0.8718 | 0.4715 | 0.6867 |
| 0.0717 | 8.4511 | 2248 | 0.4510 | 0.8718 | 0.4510 | 0.6716 |
| 0.0717 | 8.4586 | 2250 | 0.4472 | 0.8718 | 0.4472 | 0.6687 |
| 0.0717 | 8.4662 | 2252 | 0.4543 | 0.8521 | 0.4543 | 0.6740 |
| 0.0717 | 8.4737 | 2254 | 0.4653 | 0.8521 | 0.4653 | 0.6821 |
| 0.0717 | 8.4812 | 2256 | 0.4859 | 0.8521 | 0.4859 | 0.6970 |
| 0.0717 | 8.4887 | 2258 | 0.5039 | 0.8521 | 0.5039 | 0.7099 |
| 0.0717 | 8.4962 | 2260 | 0.5021 | 0.8521 | 0.5021 | 0.7086 |
| 0.0717 | 8.5038 | 2262 | 0.5181 | 0.8165 | 0.5181 | 0.7198 |
| 0.0717 | 8.5113 | 2264 | 0.5357 | 0.8165 | 0.5357 | 0.7319 |
| 0.0717 | 8.5188 | 2266 | 0.5441 | 0.7879 | 0.5441 | 0.7376 |
| 0.0717 | 8.5263 | 2268 | 0.5424 | 0.7879 | 0.5424 | 0.7365 |
| 0.0717 | 8.5338 | 2270 | 0.5298 | 0.7879 | 0.5298 | 0.7279 |
| 0.0717 | 8.5414 | 2272 | 0.4992 | 0.8165 | 0.4992 | 0.7066 |
| 0.0717 | 8.5489 | 2274 | 0.4591 | 0.8 | 0.4591 | 0.6776 |
| 0.0717 | 8.5564 | 2276 | 0.4393 | 0.7607 | 0.4393 | 0.6628 |
| 0.0717 | 8.5639 | 2278 | 0.4411 | 0.7607 | 0.4411 | 0.6642 |
| 0.0717 | 8.5714 | 2280 | 0.4572 | 0.8 | 0.4572 | 0.6762 |
| 0.0717 | 8.5789 | 2282 | 0.4870 | 0.8359 | 0.4870 | 0.6978 |
| 0.0717 | 8.5865 | 2284 | 0.5222 | 0.7879 | 0.5222 | 0.7226 |
| 0.0717 | 8.5940 | 2286 | 0.5289 | 0.7879 | 0.5289 | 0.7273 |
| 0.0717 | 8.6015 | 2288 | 0.5247 | 0.7879 | 0.5247 | 0.7244 |
| 0.0717 | 8.6090 | 2290 | 0.5349 | 0.7879 | 0.5349 | 0.7314 |
| 0.0717 | 8.6165 | 2292 | 0.5348 | 0.7879 | 0.5348 | 0.7313 |
| 0.0717 | 8.6241 | 2294 | 0.5222 | 0.7879 | 0.5222 | 0.7226 |
| 0.0717 | 8.6316 | 2296 | 0.5060 | 0.8165 | 0.5060 | 0.7113 |
| 0.0717 | 8.6391 | 2298 | 0.5048 | 0.8165 | 0.5048 | 0.7105 |
| 0.0717 | 8.6466 | 2300 | 0.4911 | 0.8165 | 0.4911 | 0.7008 |
| 0.0717 | 8.6541 | 2302 | 0.4772 | 0.8165 | 0.4772 | 0.6908 |
| 0.0717 | 8.6617 | 2304 | 0.4773 | 0.8165 | 0.4773 | 0.6909 |
| 0.0717 | 8.6692 | 2306 | 0.4819 | 0.8165 | 0.4819 | 0.6942 |
| 0.0717 | 8.6767 | 2308 | 0.4784 | 0.8165 | 0.4784 | 0.6916 |
| 0.0717 | 8.6842 | 2310 | 0.4843 | 0.8165 | 0.4843 | 0.6959 |
| 0.0717 | 8.6917 | 2312 | 0.4801 | 0.8165 | 0.4801 | 0.6929 |
| 0.0717 | 8.6992 | 2314 | 0.4823 | 0.8165 | 0.4823 | 0.6945 |
| 0.0717 | 8.7068 | 2316 | 0.4752 | 0.8165 | 0.4752 | 0.6893 |
| 0.0717 | 8.7143 | 2318 | 0.4661 | 0.8718 | 0.4661 | 0.6827 |
| 0.0717 | 8.7218 | 2320 | 0.4659 | 0.8521 | 0.4659 | 0.6826 |
| 0.0717 | 8.7293 | 2322 | 0.4768 | 0.8521 | 0.4768 | 0.6905 |
| 0.0717 | 8.7368 | 2324 | 0.5050 | 0.8256 | 0.5050 | 0.7106 |
| 0.0717 | 8.7444 | 2326 | 0.5339 | 0.7879 | 0.5339 | 0.7307 |
| 0.0717 | 8.7519 | 2328 | 0.5481 | 0.7879 | 0.5481 | 0.7403 |
| 0.0717 | 8.7594 | 2330 | 0.5473 | 0.7879 | 0.5473 | 0.7398 |
| 0.0717 | 8.7669 | 2332 | 0.5402 | 0.7879 | 0.5402 | 0.7349 |
| 0.0717 | 8.7744 | 2334 | 0.5261 | 0.7879 | 0.5261 | 0.7253 |
| 0.0717 | 8.7820 | 2336 | 0.4975 | 0.8256 | 0.4975 | 0.7053 |
| 0.0717 | 8.7895 | 2338 | 0.4596 | 0.8718 | 0.4596 | 0.6780 |
| 0.0717 | 8.7970 | 2340 | 0.4352 | 0.8397 | 0.4352 | 0.6597 |
| 0.0717 | 8.8045 | 2342 | 0.4205 | 0.8397 | 0.4205 | 0.6485 |
| 0.0717 | 8.8120 | 2344 | 0.4212 | 0.8397 | 0.4212 | 0.6490 |
| 0.0717 | 8.8195 | 2346 | 0.4327 | 0.8397 | 0.4327 | 0.6578 |
| 0.0717 | 8.8271 | 2348 | 0.4497 | 0.8397 | 0.4497 | 0.6706 |
| 0.0717 | 8.8346 | 2350 | 0.4734 | 0.8521 | 0.4734 | 0.6881 |
| 0.0717 | 8.8421 | 2352 | 0.5002 | 0.8256 | 0.5002 | 0.7073 |
| 0.0717 | 8.8496 | 2354 | 0.5265 | 0.7879 | 0.5265 | 0.7256 |
| 0.0717 | 8.8571 | 2356 | 0.5408 | 0.7879 | 0.5408 | 0.7354 |
| 0.0717 | 8.8647 | 2358 | 0.5416 | 0.7879 | 0.5416 | 0.7359 |
| 0.0717 | 8.8722 | 2360 | 0.5282 | 0.7879 | 0.5282 | 0.7268 |
| 0.0717 | 8.8797 | 2362 | 0.5093 | 0.7879 | 0.5093 | 0.7137 |
| 0.0717 | 8.8872 | 2364 | 0.4909 | 0.7510 | 0.4909 | 0.7007 |
| 0.0717 | 8.8947 | 2366 | 0.4689 | 0.8397 | 0.4689 | 0.6848 |
| 0.0717 | 8.9023 | 2368 | 0.4568 | 0.8397 | 0.4568 | 0.6759 |
| 0.0717 | 8.9098 | 2370 | 0.4525 | 0.8397 | 0.4525 | 0.6727 |
| 0.0717 | 8.9173 | 2372 | 0.4605 | 0.8205 | 0.4605 | 0.6786 |
| 0.0717 | 8.9248 | 2374 | 0.4581 | 0.8205 | 0.4581 | 0.6768 |
| 0.0717 | 8.9323 | 2376 | 0.4531 | 0.8205 | 0.4531 | 0.6731 |
| 0.0717 | 8.9398 | 2378 | 0.4610 | 0.8205 | 0.4610 | 0.6790 |
| 0.0717 | 8.9474 | 2380 | 0.4623 | 0.8205 | 0.4623 | 0.6800 |
| 0.0717 | 8.9549 | 2382 | 0.4615 | 0.8205 | 0.4615 | 0.6794 |
| 0.0717 | 8.9624 | 2384 | 0.4671 | 0.8205 | 0.4671 | 0.6835 |
| 0.0717 | 8.9699 | 2386 | 0.4791 | 0.8521 | 0.4791 | 0.6922 |
| 0.0717 | 8.9774 | 2388 | 0.4932 | 0.8165 | 0.4932 | 0.7023 |
| 0.0717 | 8.9850 | 2390 | 0.5010 | 0.8165 | 0.5010 | 0.7078 |
| 0.0717 | 8.9925 | 2392 | 0.4991 | 0.8165 | 0.4991 | 0.7065 |
| 0.0717 | 9.0 | 2394 | 0.4882 | 0.8521 | 0.4882 | 0.6987 |
| 0.0717 | 9.0075 | 2396 | 0.4751 | 0.8205 | 0.4751 | 0.6893 |
| 0.0717 | 9.0150 | 2398 | 0.4598 | 0.8397 | 0.4598 | 0.6781 |
| 0.0717 | 9.0226 | 2400 | 0.4470 | 0.8397 | 0.4470 | 0.6686 |
| 0.0717 | 9.0301 | 2402 | 0.4452 | 0.8397 | 0.4452 | 0.6672 |
| 0.0717 | 9.0376 | 2404 | 0.4555 | 0.8397 | 0.4555 | 0.6749 |
| 0.0717 | 9.0451 | 2406 | 0.4764 | 0.8205 | 0.4764 | 0.6902 |
| 0.0717 | 9.0526 | 2408 | 0.5082 | 0.8521 | 0.5082 | 0.7129 |
| 0.0717 | 9.0602 | 2410 | 0.5348 | 0.7879 | 0.5348 | 0.7313 |
| 0.0717 | 9.0677 | 2412 | 0.5482 | 0.7879 | 0.5482 | 0.7404 |
| 0.0717 | 9.0752 | 2414 | 0.5488 | 0.7879 | 0.5488 | 0.7408 |
| 0.0717 | 9.0827 | 2416 | 0.5349 | 0.7879 | 0.5349 | 0.7314 |
| 0.0717 | 9.0902 | 2418 | 0.5242 | 0.7879 | 0.5242 | 0.7240 |
| 0.0717 | 9.0977 | 2420 | 0.5150 | 0.8256 | 0.5150 | 0.7176 |
| 0.0717 | 9.1053 | 2422 | 0.5066 | 0.8521 | 0.5066 | 0.7117 |
| 0.0717 | 9.1128 | 2424 | 0.4952 | 0.8205 | 0.4952 | 0.7037 |
| 0.0717 | 9.1203 | 2426 | 0.4883 | 0.8205 | 0.4883 | 0.6988 |
| 0.0717 | 9.1278 | 2428 | 0.4860 | 0.8205 | 0.4860 | 0.6972 |
| 0.0717 | 9.1353 | 2430 | 0.4932 | 0.8205 | 0.4932 | 0.7023 |
| 0.0717 | 9.1429 | 2432 | 0.5035 | 0.8521 | 0.5035 | 0.7096 |
| 0.0717 | 9.1504 | 2434 | 0.5080 | 0.8165 | 0.5080 | 0.7127 |
| 0.0717 | 9.1579 | 2436 | 0.5137 | 0.7879 | 0.5137 | 0.7167 |
| 0.0717 | 9.1654 | 2438 | 0.5118 | 0.8165 | 0.5118 | 0.7154 |
| 0.0717 | 9.1729 | 2440 | 0.5038 | 0.8521 | 0.5038 | 0.7098 |
| 0.0717 | 9.1805 | 2442 | 0.4952 | 0.8521 | 0.4952 | 0.7037 |
| 0.0717 | 9.1880 | 2444 | 0.4898 | 0.8521 | 0.4898 | 0.6999 |
| 0.0717 | 9.1955 | 2446 | 0.4845 | 0.8521 | 0.4845 | 0.6960 |
| 0.0717 | 9.2030 | 2448 | 0.4838 | 0.8521 | 0.4838 | 0.6956 |
| 0.0717 | 9.2105 | 2450 | 0.4805 | 0.8521 | 0.4805 | 0.6932 |
| 0.0717 | 9.2180 | 2452 | 0.4666 | 0.8521 | 0.4666 | 0.6831 |
| 0.0717 | 9.2256 | 2454 | 0.4473 | 0.8521 | 0.4473 | 0.6688 |
| 0.0717 | 9.2331 | 2456 | 0.4329 | 0.8718 | 0.4329 | 0.6580 |
| 0.0717 | 9.2406 | 2458 | 0.4304 | 0.8718 | 0.4304 | 0.6561 |
| 0.0717 | 9.2481 | 2460 | 0.4310 | 0.8718 | 0.4310 | 0.6565 |
| 0.0717 | 9.2556 | 2462 | 0.4341 | 0.8718 | 0.4341 | 0.6589 |
| 0.0717 | 9.2632 | 2464 | 0.4422 | 0.8521 | 0.4422 | 0.6649 |
| 0.0717 | 9.2707 | 2466 | 0.4588 | 0.8521 | 0.4588 | 0.6774 |
| 0.0717 | 9.2782 | 2468 | 0.4725 | 0.8521 | 0.4725 | 0.6874 |
| 0.0717 | 9.2857 | 2470 | 0.4833 | 0.8521 | 0.4833 | 0.6952 |
| 0.0717 | 9.2932 | 2472 | 0.4852 | 0.8521 | 0.4852 | 0.6965 |
| 0.0717 | 9.3008 | 2474 | 0.4922 | 0.8521 | 0.4922 | 0.7016 |
| 0.0717 | 9.3083 | 2476 | 0.5009 | 0.8521 | 0.5009 | 0.7078 |
| 0.0717 | 9.3158 | 2478 | 0.5035 | 0.8521 | 0.5035 | 0.7096 |
| 0.0717 | 9.3233 | 2480 | 0.5034 | 0.8521 | 0.5034 | 0.7095 |
| 0.0717 | 9.3308 | 2482 | 0.4982 | 0.8521 | 0.4982 | 0.7058 |
| 0.0717 | 9.3383 | 2484 | 0.4953 | 0.8521 | 0.4953 | 0.7038 |
| 0.0717 | 9.3459 | 2486 | 0.4914 | 0.8521 | 0.4914 | 0.7010 |
| 0.0717 | 9.3534 | 2488 | 0.4875 | 0.8521 | 0.4875 | 0.6982 |
| 0.0717 | 9.3609 | 2490 | 0.4787 | 0.8718 | 0.4787 | 0.6919 |
| 0.0717 | 9.3684 | 2492 | 0.4728 | 0.8718 | 0.4728 | 0.6876 |
| 0.0717 | 9.3759 | 2494 | 0.4707 | 0.8718 | 0.4707 | 0.6861 |
| 0.0717 | 9.3835 | 2496 | 0.4666 | 0.8718 | 0.4666 | 0.6831 |
| 0.0717 | 9.3910 | 2498 | 0.4643 | 0.8397 | 0.4643 | 0.6814 |
| 0.0599 | 9.3985 | 2500 | 0.4693 | 0.8397 | 0.4693 | 0.6851 |
| 0.0599 | 9.4060 | 2502 | 0.4726 | 0.8397 | 0.4726 | 0.6875 |
| 0.0599 | 9.4135 | 2504 | 0.4803 | 0.8718 | 0.4803 | 0.6930 |
| 0.0599 | 9.4211 | 2506 | 0.4904 | 0.8359 | 0.4904 | 0.7003 |
| 0.0599 | 9.4286 | 2508 | 0.4966 | 0.8165 | 0.4966 | 0.7047 |
| 0.0599 | 9.4361 | 2510 | 0.5019 | 0.8165 | 0.5019 | 0.7085 |
| 0.0599 | 9.4436 | 2512 | 0.5065 | 0.8165 | 0.5065 | 0.7117 |
| 0.0599 | 9.4511 | 2514 | 0.5063 | 0.8165 | 0.5063 | 0.7115 |
| 0.0599 | 9.4586 | 2516 | 0.5036 | 0.8165 | 0.5036 | 0.7096 |
| 0.0599 | 9.4662 | 2518 | 0.4966 | 0.8521 | 0.4966 | 0.7047 |
| 0.0599 | 9.4737 | 2520 | 0.4930 | 0.8521 | 0.4930 | 0.7021 |
| 0.0599 | 9.4812 | 2522 | 0.4906 | 0.8521 | 0.4906 | 0.7005 |
| 0.0599 | 9.4887 | 2524 | 0.4838 | 0.8521 | 0.4838 | 0.6955 |
| 0.0599 | 9.4962 | 2526 | 0.4765 | 0.8521 | 0.4765 | 0.6903 |
| 0.0599 | 9.5038 | 2528 | 0.4760 | 0.8521 | 0.4760 | 0.6900 |
| 0.0599 | 9.5113 | 2530 | 0.4757 | 0.8521 | 0.4757 | 0.6897 |
| 0.0599 | 9.5188 | 2532 | 0.4793 | 0.8521 | 0.4793 | 0.6923 |
| 0.0599 | 9.5263 | 2534 | 0.4839 | 0.8521 | 0.4839 | 0.6956 |
| 0.0599 | 9.5338 | 2536 | 0.4880 | 0.8521 | 0.4880 | 0.6986 |
| 0.0599 | 9.5414 | 2538 | 0.4877 | 0.8521 | 0.4877 | 0.6984 |
| 0.0599 | 9.5489 | 2540 | 0.4838 | 0.8521 | 0.4838 | 0.6956 |
| 0.0599 | 9.5564 | 2542 | 0.4815 | 0.8521 | 0.4815 | 0.6939 |
| 0.0599 | 9.5639 | 2544 | 0.4774 | 0.8521 | 0.4774 | 0.6909 |
| 0.0599 | 9.5714 | 2546 | 0.4763 | 0.8521 | 0.4763 | 0.6901 |
| 0.0599 | 9.5789 | 2548 | 0.4724 | 0.8521 | 0.4724 | 0.6873 |
| 0.0599 | 9.5865 | 2550 | 0.4747 | 0.8521 | 0.4747 | 0.6890 |
| 0.0599 | 9.5940 | 2552 | 0.4771 | 0.8521 | 0.4771 | 0.6907 |
| 0.0599 | 9.6015 | 2554 | 0.4778 | 0.8718 | 0.4778 | 0.6913 |
| 0.0599 | 9.6090 | 2556 | 0.4790 | 0.8718 | 0.4790 | 0.6921 |
| 0.0599 | 9.6165 | 2558 | 0.4776 | 0.8718 | 0.4776 | 0.6911 |
| 0.0599 | 9.6241 | 2560 | 0.4763 | 0.8718 | 0.4763 | 0.6901 |
| 0.0599 | 9.6316 | 2562 | 0.4741 | 0.8397 | 0.4741 | 0.6885 |
| 0.0599 | 9.6391 | 2564 | 0.4705 | 0.8397 | 0.4705 | 0.6859 |
| 0.0599 | 9.6466 | 2566 | 0.4659 | 0.8397 | 0.4659 | 0.6826 |
| 0.0599 | 9.6541 | 2568 | 0.4652 | 0.8397 | 0.4652 | 0.6821 |
| 0.0599 | 9.6617 | 2570 | 0.4630 | 0.8397 | 0.4630 | 0.6805 |
| 0.0599 | 9.6692 | 2572 | 0.4641 | 0.8397 | 0.4641 | 0.6812 |
| 0.0599 | 9.6767 | 2574 | 0.4674 | 0.8397 | 0.4674 | 0.6836 |
| 0.0599 | 9.6842 | 2576 | 0.4712 | 0.8397 | 0.4712 | 0.6864 |
| 0.0599 | 9.6917 | 2578 | 0.4715 | 0.8397 | 0.4715 | 0.6867 |
| 0.0599 | 9.6992 | 2580 | 0.4699 | 0.8397 | 0.4699 | 0.6855 |
| 0.0599 | 9.7068 | 2582 | 0.4713 | 0.8718 | 0.4713 | 0.6865 |
| 0.0599 | 9.7143 | 2584 | 0.4737 | 0.8718 | 0.4737 | 0.6882 |
| 0.0599 | 9.7218 | 2586 | 0.4795 | 0.8718 | 0.4795 | 0.6924 |
| 0.0599 | 9.7293 | 2588 | 0.4845 | 0.8521 | 0.4845 | 0.6961 |
| 0.0599 | 9.7368 | 2590 | 0.4921 | 0.8521 | 0.4921 | 0.7015 |
| 0.0599 | 9.7444 | 2592 | 0.4979 | 0.8521 | 0.4979 | 0.7056 |
| 0.0599 | 9.7519 | 2594 | 0.5026 | 0.8521 | 0.5026 | 0.7089 |
| 0.0599 | 9.7594 | 2596 | 0.5056 | 0.8521 | 0.5056 | 0.7111 |
| 0.0599 | 9.7669 | 2598 | 0.5067 | 0.8521 | 0.5067 | 0.7118 |
| 0.0599 | 9.7744 | 2600 | 0.5086 | 0.8165 | 0.5086 | 0.7132 |
| 0.0599 | 9.7820 | 2602 | 0.5080 | 0.8165 | 0.5080 | 0.7127 |
| 0.0599 | 9.7895 | 2604 | 0.5074 | 0.8165 | 0.5074 | 0.7123 |
| 0.0599 | 9.7970 | 2606 | 0.5047 | 0.8165 | 0.5047 | 0.7105 |
| 0.0599 | 9.8045 | 2608 | 0.5035 | 0.8165 | 0.5035 | 0.7096 |
| 0.0599 | 9.8120 | 2610 | 0.5018 | 0.8165 | 0.5018 | 0.7083 |
| 0.0599 | 9.8195 | 2612 | 0.5000 | 0.8165 | 0.5000 | 0.7071 |
| 0.0599 | 9.8271 | 2614 | 0.4980 | 0.8165 | 0.4980 | 0.7057 |
| 0.0599 | 9.8346 | 2616 | 0.4948 | 0.8521 | 0.4948 | 0.7035 |
| 0.0599 | 9.8421 | 2618 | 0.4924 | 0.8521 | 0.4924 | 0.7017 |
| 0.0599 | 9.8496 | 2620 | 0.4896 | 0.8521 | 0.4896 | 0.6997 |
| 0.0599 | 9.8571 | 2622 | 0.4861 | 0.8521 | 0.4861 | 0.6972 |
| 0.0599 | 9.8647 | 2624 | 0.4846 | 0.8521 | 0.4846 | 0.6961 |
| 0.0599 | 9.8722 | 2626 | 0.4848 | 0.8521 | 0.4848 | 0.6963 |
| 0.0599 | 9.8797 | 2628 | 0.4854 | 0.8521 | 0.4854 | 0.6967 |
| 0.0599 | 9.8872 | 2630 | 0.4867 | 0.8521 | 0.4867 | 0.6977 |
| 0.0599 | 9.8947 | 2632 | 0.4879 | 0.8521 | 0.4879 | 0.6985 |
| 0.0599 | 9.9023 | 2634 | 0.4890 | 0.8521 | 0.4890 | 0.6993 |
| 0.0599 | 9.9098 | 2636 | 0.4902 | 0.8521 | 0.4902 | 0.7002 |
| 0.0599 | 9.9173 | 2638 | 0.4908 | 0.8521 | 0.4908 | 0.7006 |
| 0.0599 | 9.9248 | 2640 | 0.4919 | 0.8165 | 0.4919 | 0.7014 |
| 0.0599 | 9.9323 | 2642 | 0.4928 | 0.8165 | 0.4928 | 0.7020 |
| 0.0599 | 9.9398 | 2644 | 0.4931 | 0.8165 | 0.4931 | 0.7022 |
| 0.0599 | 9.9474 | 2646 | 0.4935 | 0.8165 | 0.4935 | 0.7025 |
| 0.0599 | 9.9549 | 2648 | 0.4938 | 0.8165 | 0.4938 | 0.7027 |
| 0.0599 | 9.9624 | 2650 | 0.4938 | 0.8165 | 0.4938 | 0.7027 |
| 0.0599 | 9.9699 | 2652 | 0.4941 | 0.8165 | 0.4941 | 0.7029 |
| 0.0599 | 9.9774 | 2654 | 0.4946 | 0.8165 | 0.4946 | 0.7033 |
| 0.0599 | 9.9850 | 2656 | 0.4949 | 0.8165 | 0.4949 | 0.7035 |
| 0.0599 | 9.9925 | 2658 | 0.4951 | 0.8165 | 0.4951 | 0.7037 |
| 0.0599 | 10.0 | 2660 | 0.4952 | 0.8165 | 0.4952 | 0.7037 |
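
The validation loss and MSE columns above are identical, which suggests a single-output regression head trained with an MSE objective, with quadratic weighted kappa (QWK) reported as the remaining metric. The sketch below is an assumption inferred from the log table, not the author's confirmed training script: the `compute_metrics` function, the rounding of predictions to integer scores for QWK, and the argument values (evaluation every 2 steps, loss logging every 500 steps, 10 epochs) are all read off the table rather than taken from source code.

```python
# A minimal sketch, assuming an MSE regression objective and integer
# reference scores; inferred from the log table, not the author's script.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import TrainingArguments

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze(-1)  # one regression output per example
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),       # QWK needs discrete categories,
        np.rint(preds).astype(int),        # so round to the nearest score
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Argument values read off the table: eval rows appear every 2 steps, the
# training loss column changes every 500 steps, and the run ends at epoch 10.0.
args = TrainingArguments(
    output_dir="output",
    num_train_epochs=10,
    eval_strategy="steps",
    eval_steps=2,
    logging_steps=500,
)
```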
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
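
To reproduce results against the pinned versions above, a quick runtime check can confirm the installed environment matches; this is a convenience sketch, and the expected strings simply mirror the list.

```python
# A small environment check mirroring the versions listed above.
import importlib.metadata as md

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0",       # the card pins 2.4.0+cu118; the CUDA suffix varies by install
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
for pkg, want in expected.items():
    try:
        have = md.version(pkg)
    except md.PackageNotFoundError:
        print(f"{pkg}: not installed (expected {want})")
        continue
    status = "OK" if have.startswith(want) else f"MISMATCH (expected {want})"
    print(f"{pkg}: {have} -> {status}")
```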