| modelId (string) | author (string) | last_modified (timestamp[us, tz=UTC]) | downloads (int64) | likes (int64) | library_name (string) | tags (list) | pipeline_tag (string) | createdAt (timestamp[us, tz=UTC]) | card (string) |
|---|---|---|---|---|---|---|---|---|---|
| RicaldeNelmo/clone-nelmo-ia | RicaldeNelmo | 2025-03-18T01:12:33Z | 0 | 0 | null | ["license:other", "region:us"] | null | 2025-03-18T00:34:10Z |
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
| NickKolok/trgnfs-20250301-03-pony-1girl | NickKolok | 2025-03-18T01:10:34Z | 0 | 0 | null | ["license:other", "region:us"] | null | 2025-03-14T20:54:11Z |
---
license: other
license_name: other
license_link: LICENSE
---
A LoRA found... somewhere. The LoRA trainer clearly refuses to put any restrictions on it, and neither do I.
Trigger word: `trgnfs`. It produces a strange hybrid of Meryl and Milly from the good old Trigun anime.
| albertus-sussex/veriscrape-book-test-sbert-bs64_lr0.0002_ep5_euclidean_snTrue_spFalse_hn1 | albertus-sussex | 2025-03-18T01:09:35Z | 0 | 0 | sentence-transformers | ["sentence-transformers", "safetensors", "new", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:84524", "loss:AttributeTripletLoss", "custom_code", "arxiv:1908.10084", "arxiv:1703.07737", "base_model:Alibaba-NLP/gte-base-en-v1.5", "base_model:finetune:Alibaba-NLP/gte-base-en-v1.5", "model-index", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us"] | sentence-similarity | 2025-03-18T01:09:16Z |
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:84524
- loss:AttributeTripletLoss
base_model: Alibaba-NLP/gte-base-en-v1.5
widget:
- source_sentence: Don Piper
sentences:
- Tommy Nelson
- Kate Walbert
- publisher
- author
- source_sentence: The Luxe
sentences:
- '1999'
- publication_date
- title
- 'Critical Care, Mercy Hospital Series #1'
- source_sentence: Bram Stoker
sentences:
- author
- Michael J. Pangio
- '9781598871012'
- isbn_13
- source_sentence: '9780385340557'
sentences:
- BBC Books
- '9780399208539'
- author
- isbn_13
- source_sentence: Midnight
sentences:
- The Bone Parade
- 12/01/2005
- publication_date
- title
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
- silhouette_cosine
- silhouette_euclidean
model-index:
- name: SentenceTransformer based on Alibaba-NLP/gte-base-en-v1.5
results:
- task:
type: triplet
name: Triplet
dataset:
name: Unknown
type: unknown
metrics:
- type: cosine_accuracy
value: 0.1492759734392166
name: Cosine Accuracy
- type: cosine_accuracy
value: 0.15005749464035034
name: Cosine Accuracy
- task:
type: silhouette
name: Silhouette
dataset:
name: Unknown
type: unknown
metrics:
- type: silhouette_cosine
value: 0.0
name: Silhouette Cosine
- type: silhouette_euclidean
value: -0.20105557143688202
name: Silhouette Euclidean
- type: silhouette_cosine
value: -0.00044717005221173167
name: Silhouette Cosine
- type: silhouette_euclidean
value: -0.22267667949199677
name: Silhouette Euclidean
---
# SentenceTransformer based on Alibaba-NLP/gte-base-en-v1.5
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) <!-- at revision a829fd0e060bb84554da0dfd354d0de0f7712b7f -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("albertus-sussex/veriscrape-book-test-sbert-bs64_lr0.0002_ep5_euclidean_snTrue_spFalse_hn1")
# Run inference
sentences = [
'Midnight',
'The Bone Parade',
'12/01/2005',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Triplet
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| **cosine_accuracy** | **0.1493** |
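As a minimal sketch of how this metric can be recomputed with the stock evaluator (the triplets below are placeholders taken from the sample rows under Training Details):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer(
    "albertus-sussex/veriscrape-book-test-sbert-bs64_lr0.0002_ep5_euclidean_snTrue_spFalse_hn1"
)
# Placeholder anchor/positive/negative triplets in the layout described under Training Details
evaluator = TripletEvaluator(
    anchors=["09/01/1997"],
    positives=["12/01/1977"],
    negatives=["2010"],
)
print(evaluator(model))  # reports the triplet accuracy for the chosen distance function
```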
#### Silhouette
* Evaluated with <code>veriscrape.training.SilhouetteEvaluator</code>
| Metric | Value |
|:----------------------|:--------|
| **silhouette_cosine** | **0.0** |
| silhouette_euclidean | -0.2011 |
#### Triplet
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| **cosine_accuracy** | **0.1501** |
#### Silhouette
* Evaluated with <code>veriscrape.training.SilhouetteEvaluator</code>
| Metric | Value |
|:----------------------|:------------|
| **silhouette_cosine** | **-0.0004** |
| silhouette_euclidean | -0.2227 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 84,524 training samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative</code>, <code>pos_attr_name</code>, and <code>neg_attr_name</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative | pos_attr_name | neg_attr_name |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|
| type | string | string | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 6.97 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 7.09 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 6.31 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.77 tokens</li><li>max: 5 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.8 tokens</li><li>max: 5 tokens</li></ul> |
* Samples:
| anchor | positive | negative | pos_attr_name | neg_attr_name |
|:---------------------------|:---------------------------|:------------------------------------------|:------------------------------|:-----------------------|
| <code>09/01/1997</code> | <code>12/01/1977</code> | <code>2010</code> | <code>publication_date</code> | <code>title</code> |
| <code>9780060275730</code> | <code>9780829748772</code> | <code>HarperCollins Publishers Ltd</code> | <code>isbn_13</code> | <code>publisher</code> |
| <code>9780609809648</code> | <code>9780764551956</code> | <code>HarperCollins Publishers</code> | <code>isbn_13</code> | <code>author</code> |
* Loss: <code>veriscrape.training.AttributeTripletLoss</code> with these parameters:
```json
{
"distance_metric": "TripletDistanceMetric.EUCLIDEAN",
"triplet_margin": 5
}
```
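`AttributeTripletLoss` lives in the private `veriscrape` package, so the exact training code is not reproducible from this card alone; as a rough stand-in, the stock `TripletLoss` accepts the same two parameters shown above:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import TripletLoss, TripletDistanceMetric

model = SentenceTransformer("Alibaba-NLP/gte-base-en-v1.5", trust_remote_code=True)
# Mirrors the distance metric and margin from the configuration above
loss = TripletLoss(
    model=model,
    distance_metric=TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)
```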
### Evaluation Dataset
#### Unnamed Dataset
* Size: 9,392 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative</code>, <code>pos_attr_name</code>, and <code>neg_attr_name</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative | pos_attr_name | neg_attr_name |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|
| type | string | string | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 6.85 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 6.98 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 6.08 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.75 tokens</li><li>max: 5 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.8 tokens</li><li>max: 5 tokens</li></ul> |
* Samples:
| anchor | positive | negative | pos_attr_name | neg_attr_name |
|:-------------------------------|:-----------------------------|:---------------------------|:-----------------------|:------------------------------|
| <code>9780764200564</code> | <code>: 9780590458467</code> | <code>1984</code> | <code>isbn_13</code> | <code>publication_date</code> |
| <code>Penguin Group USA</code> | <code>Signet</code> | <code>9781600243912</code> | <code>publisher</code> | <code>isbn_13</code> |
| <code>Alphabet Juice</code> | <code>Space</code> | <code>9780807871133</code> | <code>title</code> | <code>isbn_13</code> |
* Loss: <code>veriscrape.training.AttributeTripletLoss</code> with these parameters:
```json
{
"distance_metric": "TripletDistanceMetric.EUCLIDEAN",
"triplet_margin": 5
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `learning_rate`: 0.0002
- `num_train_epochs`: 5
- `warmup_ratio`: 0.1
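These values map directly onto `SentenceTransformerTrainingArguments`; a sketch of the equivalent configuration (the `output_dir` is a hypothetical path):
```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # hypothetical path
    eval_strategy="epoch",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-4,
    num_train_epochs=5,
    warmup_ratio=0.1,
)
```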
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0002
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | Validation Loss | cosine_accuracy | silhouette_cosine |
|:-----:|:----:|:-------------:|:---------------:|:---------------:|:-----------------:|
| -1 | -1 | - | - | 0.4283 | 0.1492 |
| 1.0 | 1321 | 0.6483 | 3.4269 | 0.9019 | 0.0595 |
| 2.0 | 2642 | 3.1188 | 3.0181 | 0.5564 | -0.1520 |
| 3.0 | 3963 | 3.1704 | 3.0180 | 0.5139 | -0.0101 |
| 4.0 | 5284 | 4.7916 | 5.0000 | 0.2123 | -0.4084 |
| 5.0 | 6605 | 4.9963 | 5.0000 | 0.1493 | 0.0 |
| -1 | -1 | - | - | 0.1501 | -0.0004 |
### Framework Versions
- Python: 3.10.16
- Sentence Transformers: 3.4.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.5.2
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### AttributeTripletLoss
```bibtex
@misc{hermans2017defense,
title={In Defense of the Triplet Loss for Person Re-Identification},
author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
year={2017},
eprint={1703.07737},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
| texanrangee/9d700d47-2469-46a3-8f6d-cac0a34cbce6 | texanrangee | 2025-03-18T01:08:28Z | 0 | 0 | transformers | ["transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us"] | null | 2025-03-17T20:21:35Z |
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| Benzonah6/alexis-2 | Benzonah6 | 2025-03-18T01:00:28Z | 0 | 0 | diffusers | ["diffusers", "text-to-image", "flux", "lora", "template:sd-lora", "fluxgym", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us"] | text-to-image | 2025-03-18T01:00:19Z |
---
tags:
- text-to-image
- flux
- lora
- diffusers
- template:sd-lora
- fluxgym
widget:
- output:
url: sample/alexis-2_001680_00_20250318004908.png
text: a woman wearing a fuzzy sweater, holding a coffee
base_model: black-forest-labs/FLUX.1-dev
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
# Alexis 2
A Flux LoRA trained on a local computer with [Fluxgym](https://github.com/cocktailpeanut/fluxgym)
<Gallery />
## Trigger words
No trigger words defined.
## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, Forge, etc.
Weights for this model are available in Safetensors format.
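The card ships no code snippet; a minimal diffusers sketch for a FLUX.1-dev LoRA would look like the following (the `weight_name` is a hypothetical file name; check the repository's files for the actual one):
```python
import torch
from diffusers import AutoPipelineForText2Image

pipeline = AutoPipelineForText2Image.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")
# Hypothetical weight file name; verify against the repository contents
pipeline.load_lora_weights("Benzonah6/alexis-2", weight_name="alexis-2.safetensors")
# Prompt taken from the widget example above
image = pipeline("a woman wearing a fuzzy sweater, holding a coffee").images[0]
```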
| erincozgur/M7BTrChatbotv3 | erincozgur | 2025-03-18T00:55:54Z | 0 | 0 | transformers | ["transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us"] | null | 2025-03-17T17:11:23Z |
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| lesso08/cbf2fea1-8cc9-43d6-9e96-2b554a42dd44 | lesso08 | 2025-03-18T00:55:22Z | 0 | 0 | peft | ["peft", "safetensors", "phi3", "axolotl", "generated_from_trainer", "custom_code", "base_model:numind/NuExtract-1.5", "base_model:adapter:numind/NuExtract-1.5", "license:mit", "region:us"] | null | 2025-03-17T20:18:57Z |
---
library_name: peft
license: mit
base_model: numind/NuExtract-v1.5
tags:
- axolotl
- generated_from_trainer
model-index:
- name: cbf2fea1-8cc9-43d6-9e96-2b554a42dd44
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: numind/NuExtract-v1.5
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 01a5f4337526bb62_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/01a5f4337526bb62_train_data.json
type:
field_input: input
field_instruction: instruction
field_output: output
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
do_eval: true
early_stopping_patience: 3
eval_batch_size: 4
eval_max_new_tokens: 128
eval_steps: 500
evals_per_epoch: null
flash_attention: true
fp16: false
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 8
gradient_checkpointing: true
group_by_length: true
hub_model_id: lesso08/cbf2fea1-8cc9-43d6-9e96-2b554a42dd44
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.000208
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 50
lora_alpha: 128
lora_dropout: 0.15
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 64
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_steps: 2000
micro_batch_size: 4
mlflow_experiment_name: /tmp/01a5f4337526bb62_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 10
optimizer: adamw_torch_fused
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 500
saves_per_epoch: null
seed: 80
sequence_len: 1024
strict: false
tf32: true
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: bc4dc511-2172-4f5a-881d-d4261bb6efd7
wandb_project: 08a
wandb_run: your_name
wandb_runid: bc4dc511-2172-4f5a-881d-d4261bb6efd7
warmup_steps: 100
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# cbf2fea1-8cc9-43d6-9e96-2b554a42dd44
This model is a fine-tuned version of [numind/NuExtract-v1.5](https://huggingface.co/numind/NuExtract-v1.5) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7532
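The card omits a usage snippet; as a minimal sketch, assuming the repository holds a standard PEFT LoRA adapter (per the axolotl config above), it could be attached to the base model like this:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code mirrors the axolotl config above
base = AutoModelForCausalLM.from_pretrained("numind/NuExtract-v1.5", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("numind/NuExtract-v1.5", trust_remote_code=True)
model = PeftModel.from_pretrained(base, "lesso08/cbf2fea1-8cc9-43d6-9e96-2b554a42dd44")
```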
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.000208
- train_batch_size: 4
- eval_batch_size: 4
- seed: 80
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: AdamW (torch fused) with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 2000
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0003 | 1 | 1.1654 |
| 6.2506 | 0.1525 | 500 | 0.8151 |
| 6.0736 | 0.3050 | 1000 | 0.7755 |
| 5.9722 | 0.4575 | 1500 | 0.7558 |
| 5.9383 | 0.6100 | 2000 | 0.7532 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1
| GhostScientist/distilgpt2-int8-browser-completion | GhostScientist | 2025-03-18T00:52:41Z | 0 | 0 | null | ["onnx", "gpt2", "base_model:distilbert/distilgpt2", "base_model:quantized:distilbert/distilgpt2", "license:apache-2.0", "region:us"] | null | 2025-03-18T00:45:32Z |
---
license: apache-2.0
base_model:
- distilbert/distilgpt2
---
This is a quantized version of DistilGPT-2 optimized for browser deployment.
It has a smaller file size: 120 MB, compared to the original model's 317 MB.
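Since the repository is ONNX-only, a minimal Python sketch uses Optimum's ONNX Runtime integration rather than plain `transformers` (this assumes the ONNX file sits at the repo root under its default name; the browser-side equivalent would be transformers.js):
```python
from optimum.onnxruntime import ORTModelForCausalLM
from transformers import AutoTokenizer, pipeline

model = ORTModelForCausalLM.from_pretrained("GhostScientist/distilgpt2-int8-browser-completion")
# Fall back to the base model's tokenizer in case the repo does not include one
tokenizer = AutoTokenizer.from_pretrained("distilbert/distilgpt2")
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("The browser can", max_new_tokens=20)[0]["generated_text"])
```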
| sukatune/Qwen2.5-Coder-7B-Instruct-Java-lora-v2.0 | sukatune | 2025-03-18T00:52:37Z | 0 | 0 | null | ["safetensors", "unsloth", "license:apache-2.0", "region:us"] | null | 2025-03-18T00:11:17Z |
---
license: apache-2.0
tags:
- unsloth
---
| Novaciano/Sphynx-3.2-1B-Q6_K-GGUF | Novaciano | 2025-03-18T00:52:17Z | 0 | 0 | transformers | ["transformers", "gguf", "mergekit", "merge", "llama-cpp", "gguf-my-repo", "base_model:Novaciano/Sphynx-3.2-1B", "base_model:quantized:Novaciano/Sphynx-3.2-1B", "endpoints_compatible", "region:us", "conversational"] | null | 2025-03-18T00:52:08Z |
---
base_model: Novaciano/Sphynx-3.2-1B
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- gguf-my-repo
---
# Novaciano/Sphynx-3.2-1B-Q6_K-GGUF
This model was converted to GGUF format from [`Novaciano/Sphynx-3.2-1B`](https://huggingface.co/Novaciano/Sphynx-3.2-1B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Novaciano/Sphynx-3.2-1B) for more details on the model.
## Use with llama.cpp
Install llama.cpp via brew (works on Mac and Linux):
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Novaciano/Sphynx-3.2-1B-Q6_K-GGUF --hf-file sphynx-3.2-1b-q6_k.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Novaciano/Sphynx-3.2-1B-Q6_K-GGUF --hf-file sphynx-3.2-1b-q6_k.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (for example, `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Novaciano/Sphynx-3.2-1B-Q6_K-GGUF --hf-file sphynx-3.2-1b-q6_k.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Novaciano/Sphynx-3.2-1B-Q6_K-GGUF --hf-file sphynx-3.2-1b-q6_k.gguf -c 2048
```
| genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold4 | genki10 | 2025-03-18T00:51:29Z | 0 | 0 | transformers | ["transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2025-03-18T00:27:52Z |
---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6529
- Qwk: 0.5648
- Mse: 0.6529
- Rmse: 0.8080
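No usage snippet is given; the Qwk/MSE/RMSE metrics suggest a single-output scoring head, so a minimal inference sketch (assuming a standard sequence-classification checkpoint) might be:
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold4"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("An example essay to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # regression-style score, given the MSE/Qwk metrics
print(logits)
```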
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:------:|
| No log | 1.0 | 2 | 14.6314 | 0.0 | 14.6314 | 3.8251 |
| No log | 2.0 | 4 | 12.2692 | 0.0 | 12.2692 | 3.5027 |
| No log | 3.0 | 6 | 9.9737 | 0.0066 | 9.9737 | 3.1581 |
| No log | 4.0 | 8 | 7.7158 | 0.0018 | 7.7158 | 2.7777 |
| No log | 5.0 | 10 | 5.5404 | 0.0686 | 5.5404 | 2.3538 |
| No log | 6.0 | 12 | 3.7990 | 0.0079 | 3.7990 | 1.9491 |
| No log | 7.0 | 14 | 2.6051 | 0.0040 | 2.6051 | 1.6140 |
| No log | 8.0 | 16 | 2.0976 | 0.1616 | 2.0976 | 1.4483 |
| No log | 9.0 | 18 | 1.4779 | 0.0420 | 1.4779 | 1.2157 |
| No log | 10.0 | 20 | 1.1374 | 0.0316 | 1.1374 | 1.0665 |
| No log | 11.0 | 22 | 0.9740 | 0.0316 | 0.9740 | 0.9869 |
| No log | 12.0 | 24 | 0.8095 | 0.3480 | 0.8095 | 0.8997 |
| No log | 13.0 | 26 | 0.7346 | 0.4054 | 0.7346 | 0.8571 |
| No log | 14.0 | 28 | 0.6515 | 0.3743 | 0.6515 | 0.8071 |
| No log | 15.0 | 30 | 0.6759 | 0.5030 | 0.6759 | 0.8221 |
| No log | 16.0 | 32 | 0.5668 | 0.5061 | 0.5668 | 0.7529 |
| No log | 17.0 | 34 | 0.6652 | 0.5390 | 0.6652 | 0.8156 |
| No log | 18.0 | 36 | 0.5753 | 0.5364 | 0.5753 | 0.7585 |
| No log | 19.0 | 38 | 0.7934 | 0.5319 | 0.7934 | 0.8907 |
| No log | 20.0 | 40 | 0.5839 | 0.5388 | 0.5839 | 0.7641 |
| No log | 21.0 | 42 | 0.6019 | 0.5666 | 0.6019 | 0.7758 |
| No log | 22.0 | 44 | 0.6716 | 0.5794 | 0.6716 | 0.8195 |
| No log | 23.0 | 46 | 0.6197 | 0.5773 | 0.6197 | 0.7872 |
| No log | 24.0 | 48 | 0.6649 | 0.5924 | 0.6649 | 0.8154 |
| No log | 25.0 | 50 | 0.6998 | 0.5919 | 0.6998 | 0.8365 |
| No log | 26.0 | 52 | 0.7202 | 0.5988 | 0.7202 | 0.8486 |
| No log | 27.0 | 54 | 0.6556 | 0.6124 | 0.6556 | 0.8097 |
| No log | 28.0 | 56 | 0.6501 | 0.6360 | 0.6501 | 0.8063 |
| No log | 29.0 | 58 | 0.6210 | 0.6175 | 0.6210 | 0.7880 |
| No log | 30.0 | 60 | 0.6323 | 0.6200 | 0.6323 | 0.7952 |
| No log | 31.0 | 62 | 0.7047 | 0.6289 | 0.7047 | 0.8395 |
| No log | 32.0 | 64 | 0.7298 | 0.5938 | 0.7298 | 0.8543 |
| No log | 33.0 | 66 | 0.6589 | 0.6181 | 0.6589 | 0.8117 |
| No log | 34.0 | 68 | 0.9631 | 0.4715 | 0.9631 | 0.9814 |
| No log | 35.0 | 70 | 0.6021 | 0.6107 | 0.6021 | 0.7759 |
| No log | 36.0 | 72 | 1.0091 | 0.4968 | 1.0091 | 1.0045 |
| No log | 37.0 | 74 | 0.7669 | 0.5514 | 0.7669 | 0.8757 |
| No log | 38.0 | 76 | 0.7919 | 0.4929 | 0.7919 | 0.8899 |
| No log | 39.0 | 78 | 1.1351 | 0.4040 | 1.1351 | 1.0654 |
| No log | 40.0 | 80 | 0.6510 | 0.5434 | 0.6510 | 0.8068 |
| No log | 41.0 | 82 | 0.7800 | 0.5383 | 0.7800 | 0.8832 |
| No log | 42.0 | 84 | 0.6956 | 0.5422 | 0.6956 | 0.8340 |
| No log | 43.0 | 86 | 0.6248 | 0.5949 | 0.6248 | 0.7905 |
| No log | 44.0 | 88 | 0.6024 | 0.6077 | 0.6024 | 0.7761 |
| No log | 45.0 | 90 | 0.7641 | 0.5473 | 0.7641 | 0.8741 |
| No log | 46.0 | 92 | 0.6522 | 0.6054 | 0.6522 | 0.8076 |
| No log | 47.0 | 94 | 0.6301 | 0.6252 | 0.6301 | 0.7938 |
| No log | 48.0 | 96 | 0.6111 | 0.6011 | 0.6111 | 0.7818 |
| No log | 49.0 | 98 | 0.6311 | 0.6391 | 0.6311 | 0.7944 |
| No log | 50.0 | 100 | 0.6264 | 0.6378 | 0.6264 | 0.7915 |
| No log | 51.0 | 102 | 0.7371 | 0.5658 | 0.7371 | 0.8586 |
| No log | 52.0 | 104 | 0.6154 | 0.6375 | 0.6154 | 0.7845 |
| No log | 53.0 | 106 | 0.5999 | 0.6386 | 0.5999 | 0.7745 |
| No log | 54.0 | 108 | 0.6246 | 0.5881 | 0.6246 | 0.7903 |
| No log | 55.0 | 110 | 0.6091 | 0.6443 | 0.6091 | 0.7805 |
| No log | 56.0 | 112 | 0.6166 | 0.6469 | 0.6166 | 0.7853 |
| No log | 57.0 | 114 | 0.6502 | 0.5956 | 0.6502 | 0.8063 |
| No log | 58.0 | 116 | 0.6416 | 0.6296 | 0.6416 | 0.8010 |
| No log | 59.0 | 118 | 0.6357 | 0.6364 | 0.6357 | 0.7973 |
| No log | 60.0 | 120 | 0.6444 | 0.5893 | 0.6444 | 0.8027 |
| No log | 61.0 | 122 | 0.6188 | 0.6114 | 0.6188 | 0.7866 |
| No log | 62.0 | 124 | 0.6086 | 0.6270 | 0.6086 | 0.7801 |
| No log | 63.0 | 126 | 0.6231 | 0.5971 | 0.6231 | 0.7894 |
| No log | 64.0 | 128 | 0.6033 | 0.6395 | 0.6033 | 0.7767 |
| No log | 65.0 | 130 | 0.6151 | 0.5909 | 0.6151 | 0.7843 |
| No log | 66.0 | 132 | 0.8197 | 0.5413 | 0.8197 | 0.9054 |
| No log | 67.0 | 134 | 0.6625 | 0.5574 | 0.6625 | 0.8140 |
| No log | 68.0 | 136 | 0.5979 | 0.6156 | 0.5979 | 0.7733 |
| No log | 69.0 | 138 | 0.5935 | 0.6142 | 0.5935 | 0.7704 |
| No log | 70.0 | 140 | 0.7617 | 0.5431 | 0.7617 | 0.8727 |
| No log | 71.0 | 142 | 0.6538 | 0.5874 | 0.6538 | 0.8086 |
| No log | 72.0 | 144 | 0.6414 | 0.6287 | 0.6414 | 0.8009 |
| No log | 73.0 | 146 | 0.6242 | 0.6249 | 0.6242 | 0.7900 |
| No log | 74.0 | 148 | 0.7411 | 0.5817 | 0.7411 | 0.8609 |
| No log | 75.0 | 150 | 0.6433 | 0.6032 | 0.6433 | 0.8021 |
| No log | 76.0 | 152 | 0.6520 | 0.6034 | 0.6520 | 0.8075 |
| No log | 77.0 | 154 | 0.6063 | 0.6243 | 0.6063 | 0.7787 |
| No log | 78.0 | 156 | 0.6537 | 0.5777 | 0.6537 | 0.8085 |
| No log | 79.0 | 158 | 0.6023 | 0.6062 | 0.6023 | 0.7761 |
| No log | 80.0 | 160 | 0.6262 | 0.6339 | 0.6262 | 0.7913 |
| No log | 81.0 | 162 | 0.6174 | 0.6231 | 0.6174 | 0.7857 |
| No log | 82.0 | 164 | 0.7510 | 0.5747 | 0.7510 | 0.8666 |
| No log | 83.0 | 166 | 0.6830 | 0.5980 | 0.6830 | 0.8264 |
| No log | 84.0 | 168 | 0.6389 | 0.6301 | 0.6389 | 0.7993 |
| No log | 85.0 | 170 | 0.6124 | 0.6231 | 0.6124 | 0.7826 |
| No log | 86.0 | 172 | 0.6529 | 0.5648 | 0.6529 | 0.8080 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.1
- Tokenizers 0.21.0
| llm-jp/llm-jp-3-1.8b-sae-l12-k32-16x-c10 | llm-jp | 2025-03-18T00:48:22Z | 0 | 0 | null | ["license:apache-2.0", "region:us"] | null | 2025-03-18T00:48:22Z |
---
license: apache-2.0
---
| Swephoenix/phi2-lora-pbhsahxt-1742255413 | Swephoenix | 2025-03-18T00:48:09Z | 0 | 0 | peft | ["peft", "safetensors", "generated_from_trainer", "base_model:microsoft/phi-2", "base_model:adapter:microsoft/phi-2", "license:mit", "region:us"] | null | 2025-03-17T23:50:43Z |
---
license: mit
base_model: microsoft/phi-2
tags:
- generated_from_trainer
library_name: peft
model-index:
- name: phi2-lora-pbhsahxt-1742255413
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# phi2-lora-pbhsahxt-1742255413
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 3
### Training results
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.5.1+cu121
- Datasets 3.3.1
- Tokenizers 0.15.2
| llm-jp/llm-jp-3-1.8b-sae-l12-k32-16x-c10000 | llm-jp | 2025-03-18T00:47:49Z | 0 | 0 | null | ["license:apache-2.0", "region:us"] | null | 2025-03-18T00:47:49Z |
---
license: apache-2.0
---
| llm-jp/llm-jp-3-1.8b-sae-l12-k32-16x-c100000 | llm-jp | 2025-03-18T00:47:36Z | 0 | 0 | null | ["license:apache-2.0", "region:us"] | null | 2025-03-18T00:47:36Z |
---
license: apache-2.0
---
| mradermacher/aimo-model-1.5B-GGUF | mradermacher | 2025-03-18T00:45:36Z | 0 | 0 | transformers | ["transformers", "gguf", "generated_from_trainer", "trl", "sft", "en", "base_model:accountblabla/aimo-model-1.5B", "base_model:quantized:accountblabla/aimo-model-1.5B", "endpoints_compatible", "region:us", "conversational"] | null | 2025-03-18T00:33:35Z |
---
base_model: accountblabla/aimo-model-1.5B
language:
- en
library_name: transformers
model_name: aimo-model-1.5B
quantized_by: mradermacher
tags:
- generated_from_trainer
- trl
- sft
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/accountblabla/aimo-model-1.5B
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
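As a minimal sketch, a single quant can also be fetched programmatically with `huggingface_hub` and handed to any GGUF-capable runtime (the file name is taken from the Q4_K_M row in the table below):
```python
from huggingface_hub import hf_hub_download

# Download the Q4_K_M quant listed in the Provided Quants table
path = hf_hub_download(
    repo_id="mradermacher/aimo-model-1.5B-GGUF",
    filename="aimo-model-1.5B.Q4_K_M.gguf",
)
print(path)  # local path to pass to llama.cpp or another GGUF runtime
```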
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q2_K.gguf) | Q2_K | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q3_K_S.gguf) | Q3_K_S | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q3_K_M.gguf) | Q3_K_M | 1.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q3_K_L.gguf) | Q3_K_L | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.IQ4_XS.gguf) | IQ4_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q4_K_S.gguf) | Q4_K_S | 1.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q4_K_M.gguf) | Q4_K_M | 1.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q5_K_S.gguf) | Q5_K_S | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q5_K_M.gguf) | Q5_K_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q6_K.gguf) | Q6_K | 1.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.Q8_0.gguf) | Q8_0 | 2.0 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/aimo-model-1.5B-GGUF/resolve/main/aimo-model-1.5B.f16.gguf) | f16 | 3.7 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
| Zolotomeo/Denismodel6 | Zolotomeo | 2025-03-18T00:44:50Z | 0 | 0 | diffusers | ["diffusers", "flux", "lora", "replicate", "text-to-image", "en", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us"] | text-to-image | 2025-03-18T00:36:56Z |
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: jimaSM
---
# Denismodel6
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `jimaSM` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('Zolotomeo/Denismodel6', weight_name='lora.safetensors')
image = pipeline('your prompt').images[0]
```
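Note that the snippet above uses a generic placeholder prompt; since `jimaSM` is the trigger word, an activating call would look more like this (the prompt itself is illustrative):
```python
# Include the trigger word so the LoRA actually fires
image = pipeline("a photo of jimaSM drinking coffee").images[0]
```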
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
| MagedSaeed/APCD-Plus-meter-classification-model | MagedSaeed | 2025-03-18T00:44:00Z | 0 | 0 | null | ["safetensors", "model_hub_mixin", "pytorch_model_hub_mixin", "text-classification", "license:mit", "region:us"] | text-classification | 2025-03-17T19:16:12Z |
---
license: mit
pipeline_tag: text-classification
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
---
This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- Library: APCD-Plus-meter-classification-model
- Docs: [More Information Needed]
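With `PyTorchModelHubMixin`, loading is a one-liner on the original model class; the sketch below uses a hypothetical `MeterClassifier` stand-in, since the real architecture lives in the APCD-Plus codebase:
```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

# Hypothetical stand-in architecture; the real class is defined in the project's code
class MeterClassifier(nn.Module, PyTorchModelHubMixin):
    def __init__(self, vocab_size: int = 1000, num_meters: int = 16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 128)
        self.head = nn.Linear(128, num_meters)

    def forward(self, x):
        return self.head(self.embed(x).mean(dim=1))

# With the actual class from the codebase:
# model = MeterClassifier.from_pretrained("MagedSaeed/APCD-Plus-meter-classification-model")
```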
| tollea1234/llava1.6-7b-cropped_city-lora | tollea1234 | 2025-03-18T00:42:40Z | 0 | 0 | null | ["safetensors", "llava_llama", "license:apache-2.0", "region:us"] | null | 2025-03-18T00:06:59Z |
---
license: apache-2.0
---
| Triangle104/Cydonia-24B-v2.1-Q4_K_M-GGUF | Triangle104 | 2025-03-18T00:41:18Z | 0 | 0 | null | ["gguf", "llama-cpp", "gguf-my-repo", "base_model:TheDrummer/Cydonia-24B-v2.1", "base_model:quantized:TheDrummer/Cydonia-24B-v2.1", "license:other", "endpoints_compatible", "region:us", "conversational"] | null | 2025-03-18T00:35:51Z |
---
base_model: TheDrummer/Cydonia-24B-v2.1
license: other
tags:
- llama-cpp
- gguf-my-repo
---
# Triangle104/Cydonia-24B-v2.1-Q4_K_M-GGUF
This model was converted to GGUF format from [`TheDrummer/Cydonia-24B-v2.1`](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1) for more details on the model.
---
## Supported Chat Templates
- Mistral v7 Tekken (recommended)
- Metharme (may require some patching)
- Alpaca (worth a try for story)
## Description
Cydonia 24B v2.1 is a finetune of Mistral's latest 'Small' model (2501).
Further tuning was done to improve prose, foster creativity, and tone down positivity.
---
## Use with llama.cpp
Install llama.cpp via brew (works on Mac and Linux):
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Cydonia-24B-v2.1-Q4_K_M-GGUF --hf-file cydonia-24b-v2.1-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Cydonia-24B-v2.1-Q4_K_M-GGUF --hf-file cydonia-24b-v2.1-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (for example, `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Triangle104/Cydonia-24B-v2.1-Q4_K_M-GGUF --hf-file cydonia-24b-v2.1-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Triangle104/Cydonia-24B-v2.1-Q4_K_M-GGUF --hf-file cydonia-24b-v2.1-q4_k_m.gguf -c 2048
```
| pmoharana/Dhruv-27B-Mixed | pmoharana | 2025-03-18T00:40:49Z | 0 | 0 | transformers | ["transformers", "safetensors", "text-generation-inference", "unsloth", "gemma3", "trl", "en", "base_model:unsloth/gemma-3-27b-it-unsloth-bnb-4bit", "base_model:finetune:unsloth/gemma-3-27b-it-unsloth-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us"] | null | 2025-03-18T00:40:31Z |
---
base_model: unsloth/gemma-3-27b-it-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- gemma3
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** pmoharana
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-3-27b-it-unsloth-bnb-4bit
This gemma3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
| Triangle104/Cydonia-24B-v2.1-Q3_K_L-GGUF | Triangle104 | 2025-03-18T00:39:55Z | 0 | 0 | null | ["gguf", "llama-cpp", "gguf-my-repo", "base_model:TheDrummer/Cydonia-24B-v2.1", "base_model:quantized:TheDrummer/Cydonia-24B-v2.1", "license:other", "endpoints_compatible", "region:us", "conversational"] | null | 2025-03-17T23:22:04Z |
---
base_model: TheDrummer/Cydonia-24B-v2.1
license: other
tags:
- llama-cpp
- gguf-my-repo
---
# Triangle104/Cydonia-24B-v2.1-Q3_K_L-GGUF
This model was converted to GGUF format from [`TheDrummer/Cydonia-24B-v2.1`](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1) for more details on the model.
---
## Supported Chat Templates
- Mistral v7 Tekken (recommended)
- Metharme (may require some patching)
- Alpaca (worth a try for story)
## Description
Cydonia 24B v2.1 is a finetune of Mistral's latest 'Small' model (2501).
Further tuning was done to improve prose, foster creativity, and tone down positivity.
---
## Use with llama.cpp
Install llama.cpp via brew (works on Mac and Linux):
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_L-GGUF --hf-file cydonia-24b-v2.1-q3_k_l.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_L-GGUF --hf-file cydonia-24b-v2.1-q3_k_l.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (for example, `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_L-GGUF --hf-file cydonia-24b-v2.1-q3_k_l.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_L-GGUF --hf-file cydonia-24b-v2.1-q3_k_l.gguf -c 2048
```
| nmcco/14-llama3.2-3b-balanced_vs_hp | nmcco | 2025-03-18T00:36:21Z | 0 | 0 | transformers | ["transformers", "safetensors", "generated_from_trainer", "trl", "sft", "base_model:nmcco/llama-3.2-3b-speakertokens", "base_model:finetune:nmcco/llama-3.2-3b-speakertokens", "endpoints_compatible", "region:us"] | null | 2025-03-17T20:34:17Z |
---
base_model: nmcco/llama-3.2-3b-speakertokens
library_name: transformers
model_name: 14-llama3.2-3b-balanced_vs_hp
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for 14-llama3.2-3b-balanced_vs_hp
This model is a fine-tuned version of [nmcco/llama-3.2-3b-speakertokens](https://huggingface.co/nmcco/llama-3.2-3b-speakertokens).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="nmcco/14-llama3.2-3b-balanced_vs_hp", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/hwerzog-huh/huggingface/runs/3guxi481)
This model was trained with SFT.
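A minimal sketch of what SFT with TRL looks like (the actual training data and hyperparameters are not documented here; the dataset below is a placeholder):
```python
# Minimal SFT sketch with TRL (placeholder dataset; not the actual training setup).
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("trl-lib/Capybara", split="train")  # placeholder dataset
trainer = SFTTrainer(
    model="nmcco/llama-3.2-3b-speakertokens",
    train_dataset=dataset,
    args=SFTConfig(output_dir="14-llama3.2-3b-balanced_vs_hp"),
)
trainer.train()
```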
### Framework versions
- TRL: 0.14.0
- Transformers: 4.48.2
- Pytorch: 2.4.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
```
|
jiinking/17_first_MQA_llama3B_model
|
jiinking
| 2025-03-18T00:36:04Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:23:12Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
SantiagoGr/Santiago
|
SantiagoGr
| 2025-03-18T00:33:01Z
| 0
| 0
| null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2025-03-18T00:33:01Z
|
---
license: creativeml-openrail-m
---
|
mradermacher/Radomir_Gemma_4B-GGUF
|
mradermacher
| 2025-03-18T00:31:12Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"text-generation-inference",
"unsloth",
"gemma3",
"en",
"base_model:KoDer123/Radomir_Gemma_4B",
"base_model:quantized:KoDer123/Radomir_Gemma_4B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-18T00:06:03Z
|
---
base_model: KoDer123/Radomir_Gemma_4B
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- text-generation-inference
- transformers
- unsloth
- gemma3
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/KoDer123/Radomir_Gemma_4B
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
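For example, a single quant from the table below can be fetched programmatically (a sketch using `huggingface_hub`; pick any filename from the table):
```python
# Sketch: download one quant file from this repo with huggingface_hub.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="mradermacher/Radomir_Gemma_4B-GGUF",
    filename="Radomir_Gemma_4B.Q4_K_M.gguf",  # any filename from the table below
)
print(path)  # local path to the downloaded GGUF
```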
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q2_K.gguf) | Q2_K | 1.8 | |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q3_K_S.gguf) | Q3_K_S | 2.0 | |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q3_K_M.gguf) | Q3_K_M | 2.2 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q3_K_L.gguf) | Q3_K_L | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.IQ4_XS.gguf) | IQ4_XS | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q4_K_S.gguf) | Q4_K_S | 2.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q4_K_M.gguf) | Q4_K_M | 2.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q5_K_S.gguf) | Q5_K_S | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q5_K_M.gguf) | Q5_K_M | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q6_K.gguf) | Q6_K | 3.3 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.Q8_0.gguf) | Q8_0 | 4.2 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Radomir_Gemma_4B-GGUF/resolve/main/Radomir_Gemma_4B.f16.gguf) | f16 | 7.9 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
MelisaO/modelo_clasificacion_violencia3
|
MelisaO
| 2025-03-18T00:30:45Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:MelisaO/modelo_clasificacion_violencia3",
"base_model:finetune:MelisaO/modelo_clasificacion_violencia3",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2025-03-17T23:23:36Z
|
---
library_name: transformers
license: apache-2.0
base_model: MelisaO/modelo_clasificacion_violencia3
tags:
- generated_from_trainer
model-index:
- name: modelo_clasificacion_violencia3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# modelo_clasificacion_violencia3
This model is a fine-tuned version of [MelisaO/modelo_clasificacion_violencia3](https://huggingface.co/MelisaO/modelo_clasificacion_violencia3) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
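A minimal usage sketch (assumed; the label set is not documented here):
```python
# Minimal usage sketch (assumed; the label set is not documented here).
from transformers import pipeline

clf = pipeline("text-classification", model="MelisaO/modelo_clasificacion_violencia3")
print(clf("Texto de ejemplo a clasificar."))
```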
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 27 | 0.0000 |
| No log | 2.0 | 54 | 0.0000 |
| No log | 3.0 | 81 | 0.0006 |
| No log | 4.0 | 108 | 0.0000 |
| No log | 5.0 | 135 | 0.0000 |
| No log | 6.0 | 162 | 0.0000 |
| No log | 7.0 | 189 | 0.0000 |
| No log | 8.0 | 216 | 0.0000 |
| No log | 9.0 | 243 | 0.0000 |
| No log | 10.0 | 270 | 0.0000 |
| No log | 11.0 | 297 | 0.0000 |
| No log | 12.0 | 324 | 0.0000 |
| No log | 13.0 | 351 | 0.0000 |
| No log | 14.0 | 378 | 0.0000 |
| No log | 15.0 | 405 | 0.0000 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
SpongeEngine/MT-Gen9-gemma-2-9B-i1-GGUF
|
SpongeEngine
| 2025-03-18T00:28:21Z
| 0
| 0
| null |
[
"gguf",
"SpongeQuant",
"i1-GGUF",
"en",
"base_model:zelk12/MT-Gen9-gemma-2-9B",
"base_model:quantized:zelk12/MT-Gen9-gemma-2-9B",
"license:mit",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-03-17T21:11:36Z
|
---
base_model: zelk12/MT-Gen9-gemma-2-9B
language:
- en
license: mit
quantized_by: SpongeQuant
tags:
- SpongeQuant
- i1-GGUF
---
Quantized to `i1-GGUF` using [SpongeQuant](https://github.com/SpongeEngine/SpongeQuant), the Oobabooga of LLM quantization.
<div style="display: flex; gap: 20px; align-items: center; margin-top:0;">
<a href="https://github.com/SpongeEngine/SpongeQuant">
<img src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/github-button.png" width="173">
</a>
<a href="https://discord.gg/azNmr2Gdgy">
<img src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/discord-button.png" width="173">
</a>
</div>
***
<figure>
<img src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/099.png" alt="X-ray of hand">
<figcaption>X-ray of hand</figcaption>
</figure>
<figure>
<audio controls>
<source src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/012.mp3" type="audio/mp3">
Your browser does not support the audio element.
</audio>
<figcaption>El Cascabel – Antonio Maciel and Los Aguilillas with Mariachi México de Pepe Villa / Rafael Carrión (Mexico, Unknown)</figcaption>
</figure>
***
### What is a GGUF?
GGUF is a file format used for running large language models (LLMs) on different types of computers. It supports both regular processors (CPUs) and graphics cards (GPUs), making it easier to run models across a wide range of hardware. Many LLMs require powerful and expensive GPUs, but GGUF improves compatibility and efficiency by optimizing how models are loaded and executed. If a GPU doesn't have enough memory, GGUF can offload parts of the model to the CPU, allowing it to run even when GPU resources are limited. GGUF is designed to work well with quantized models, which use less memory and run faster, making them ideal for lower-end hardware. However, it can also store full-precision models when needed. Thanks to these optimizations, GGUF allows LLMs to run efficiently on everything from high-end GPUs to laptops and even CPU-only systems.
### What is an i1-GGUF?
i1-GGUF is an enhanced type of GGUF model that uses imatrix quantization—a smarter way of reducing model size while preserving key details. Instead of shrinking everything equally, it analyzes the importance of different model components and keeps the most crucial parts more accurate. Like standard GGUF, i1-GGUF allows LLMs to run on various hardware, including CPUs and lower-end GPUs. However, because it prioritizes important weights, i1-GGUF models deliver better responses than traditional GGUF models while maintaining efficiency.
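As a concrete sketch of the offloading described above (assuming the `llama-cpp-python` bindings and a locally downloaded quant; the filename is illustrative):
```python
# Sketch: run an i1-GGUF quant with partial GPU offload via llama-cpp-python.
# Assumes `pip install llama-cpp-python` and a downloaded quant file;
# the filename below is illustrative.
from llama_cpp import Llama

llm = Llama(
    model_path="MT-Gen9-gemma-2-9B.i1-Q4_K_M.gguf",
    n_gpu_layers=20,  # layers offloaded to the GPU; the rest run on the CPU
    n_ctx=2048,
)
out = llm("Explain imatrix quantization in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```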
|
genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold3
|
genki10
| 2025-03-18T00:27:46Z
| 0
| 0
|
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2025-03-18T00:03:35Z
|
---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold3
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7726
- Qwk: 0.5170
- Mse: 0.7736
- Rmse: 0.8795
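For reference, Qwk here is quadratic weighted kappa; a sketch of how these metrics can be computed (assumed correspondence, not the Trainer's exact evaluation code):
```python
# Sketch of the reported metrics (assumed correspondence, not the exact eval code).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 3, 4])  # toy labels
y_pred = np.array([2, 3, 4, 4])  # toy predictions
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(qwk, mse, rmse)
```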
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:------:|
| No log | 1.0 | 2 | 11.4364 | 0.0111 | 11.4342 | 3.3814 |
| No log | 2.0 | 4 | 10.2559 | 0.0 | 10.2538 | 3.2022 |
| No log | 3.0 | 6 | 8.5685 | 0.0 | 8.5668 | 2.9269 |
| No log | 4.0 | 8 | 6.7098 | 0.0 | 6.7084 | 2.5901 |
| No log | 5.0 | 10 | 4.6561 | 0.0227 | 4.6550 | 2.1575 |
| No log | 6.0 | 12 | 3.4481 | 0.0038 | 3.4469 | 1.8566 |
| No log | 7.0 | 14 | 2.4119 | 0.1247 | 2.4112 | 1.5528 |
| No log | 8.0 | 16 | 1.9670 | 0.1009 | 1.9664 | 1.4023 |
| No log | 9.0 | 18 | 1.2446 | 0.0365 | 1.2441 | 1.1154 |
| No log | 10.0 | 20 | 1.0547 | 0.0365 | 1.0542 | 1.0268 |
| No log | 11.0 | 22 | 0.8682 | 0.2674 | 0.8680 | 0.9316 |
| No log | 12.0 | 24 | 0.8306 | 0.3271 | 0.8305 | 0.9113 |
| No log | 13.0 | 26 | 0.7557 | 0.4016 | 0.7557 | 0.8693 |
| No log | 14.0 | 28 | 0.6190 | 0.4619 | 0.6190 | 0.7868 |
| No log | 15.0 | 30 | 0.6399 | 0.5154 | 0.6401 | 0.8001 |
| No log | 16.0 | 32 | 1.0329 | 0.3237 | 1.0332 | 1.0165 |
| No log | 17.0 | 34 | 0.7372 | 0.4558 | 0.7375 | 0.8588 |
| No log | 18.0 | 36 | 0.6971 | 0.4792 | 0.6975 | 0.8352 |
| No log | 19.0 | 38 | 0.9070 | 0.4000 | 0.9076 | 0.9527 |
| No log | 20.0 | 40 | 0.8033 | 0.4495 | 0.8040 | 0.8967 |
| No log | 21.0 | 42 | 0.5771 | 0.5728 | 0.5778 | 0.7601 |
| No log | 22.0 | 44 | 0.8935 | 0.4541 | 0.8944 | 0.9457 |
| No log | 23.0 | 46 | 0.8029 | 0.4980 | 0.8039 | 0.8966 |
| No log | 24.0 | 48 | 0.9154 | 0.4615 | 0.9164 | 0.9573 |
| No log | 25.0 | 50 | 0.9065 | 0.4611 | 0.9076 | 0.9527 |
| No log | 26.0 | 52 | 0.9981 | 0.4246 | 0.9992 | 0.9996 |
| No log | 27.0 | 54 | 1.4498 | 0.3364 | 1.4510 | 1.2046 |
| No log | 28.0 | 56 | 0.8700 | 0.5031 | 0.8711 | 0.9333 |
| No log | 29.0 | 58 | 0.9759 | 0.4554 | 0.9771 | 0.9885 |
| No log | 30.0 | 60 | 1.3913 | 0.3361 | 1.3925 | 1.1801 |
| No log | 31.0 | 62 | 0.8301 | 0.4870 | 0.8313 | 0.9118 |
| No log | 32.0 | 64 | 0.6153 | 0.5754 | 0.6163 | 0.7850 |
| No log | 33.0 | 66 | 0.7778 | 0.4971 | 0.7789 | 0.8826 |
| No log | 34.0 | 68 | 1.0342 | 0.4076 | 1.0354 | 1.0176 |
| No log | 35.0 | 70 | 0.6708 | 0.5369 | 0.6719 | 0.8197 |
| No log | 36.0 | 72 | 0.6765 | 0.5329 | 0.6776 | 0.8232 |
| No log | 37.0 | 74 | 0.9024 | 0.4322 | 0.9036 | 0.9506 |
| No log | 38.0 | 76 | 0.6662 | 0.5459 | 0.6673 | 0.8169 |
| No log | 39.0 | 78 | 0.7022 | 0.5332 | 0.7033 | 0.8386 |
| No log | 40.0 | 80 | 0.8924 | 0.4613 | 0.8936 | 0.9453 |
| No log | 41.0 | 82 | 0.6731 | 0.5485 | 0.6742 | 0.8211 |
| No log | 42.0 | 84 | 0.6859 | 0.5555 | 0.6868 | 0.8288 |
| No log | 43.0 | 86 | 0.8955 | 0.4796 | 0.8966 | 0.9469 |
| No log | 44.0 | 88 | 0.7649 | 0.5184 | 0.7659 | 0.8752 |
| No log | 45.0 | 90 | 0.8957 | 0.4791 | 0.8968 | 0.9470 |
| No log | 46.0 | 92 | 0.7932 | 0.5181 | 0.7943 | 0.8912 |
| No log | 47.0 | 94 | 0.6668 | 0.6129 | 0.6677 | 0.8171 |
| No log | 48.0 | 96 | 0.7689 | 0.5112 | 0.7699 | 0.8775 |
| No log | 49.0 | 98 | 1.1397 | 0.4099 | 1.1408 | 1.0681 |
| No log | 50.0 | 100 | 0.9710 | 0.4301 | 0.9721 | 0.9860 |
| No log | 51.0 | 102 | 0.6383 | 0.6153 | 0.6391 | 0.7994 |
| No log | 52.0 | 104 | 0.6474 | 0.6029 | 0.6481 | 0.8050 |
| No log | 53.0 | 106 | 0.6312 | 0.6021 | 0.6321 | 0.7950 |
| No log | 54.0 | 108 | 0.8651 | 0.4515 | 0.8661 | 0.9307 |
| No log | 55.0 | 110 | 0.8007 | 0.4845 | 0.8016 | 0.8953 |
| No log | 56.0 | 112 | 0.6133 | 0.6200 | 0.6141 | 0.7837 |
| No log | 57.0 | 114 | 0.6142 | 0.5947 | 0.6150 | 0.7842 |
| No log | 58.0 | 116 | 0.7456 | 0.5245 | 0.7465 | 0.8640 |
| No log | 59.0 | 118 | 0.8565 | 0.4670 | 0.8575 | 0.9260 |
| No log | 60.0 | 120 | 0.6868 | 0.5303 | 0.6877 | 0.8293 |
| No log | 61.0 | 122 | 0.7016 | 0.5274 | 0.7025 | 0.8382 |
| No log | 62.0 | 124 | 0.9015 | 0.4624 | 0.9026 | 0.9500 |
| No log | 63.0 | 126 | 0.8409 | 0.4958 | 0.8420 | 0.9176 |
| No log | 64.0 | 128 | 0.6834 | 0.5756 | 0.6844 | 0.8273 |
| No log | 65.0 | 130 | 0.7287 | 0.5269 | 0.7298 | 0.8543 |
| No log | 66.0 | 132 | 0.7592 | 0.5254 | 0.7603 | 0.8719 |
| No log | 67.0 | 134 | 0.6600 | 0.5821 | 0.6609 | 0.8130 |
| No log | 68.0 | 136 | 0.7104 | 0.5427 | 0.7113 | 0.8434 |
| No log | 69.0 | 138 | 0.7274 | 0.5253 | 0.7283 | 0.8534 |
| No log | 70.0 | 140 | 0.6830 | 0.5537 | 0.6838 | 0.8270 |
| No log | 71.0 | 142 | 0.7903 | 0.5101 | 0.7913 | 0.8896 |
| No log | 72.0 | 144 | 0.7517 | 0.5146 | 0.7527 | 0.8676 |
| No log | 73.0 | 146 | 0.7884 | 0.5110 | 0.7894 | 0.8885 |
| No log | 74.0 | 148 | 0.7918 | 0.5098 | 0.7928 | 0.8904 |
| No log | 75.0 | 150 | 0.6654 | 0.5457 | 0.6662 | 0.8162 |
| No log | 76.0 | 152 | 0.6960 | 0.5339 | 0.6968 | 0.8348 |
| No log | 77.0 | 154 | 0.7676 | 0.5131 | 0.7684 | 0.8766 |
| No log | 78.0 | 156 | 0.7083 | 0.5426 | 0.7091 | 0.8421 |
| No log | 79.0 | 158 | 0.8075 | 0.4989 | 0.8084 | 0.8991 |
| No log | 80.0 | 160 | 0.8297 | 0.4966 | 0.8307 | 0.9114 |
| No log | 81.0 | 162 | 0.7005 | 0.5482 | 0.7014 | 0.8375 |
| No log | 82.0 | 164 | 0.6833 | 0.5636 | 0.6842 | 0.8272 |
| No log | 83.0 | 166 | 0.6876 | 0.5652 | 0.6885 | 0.8298 |
| No log | 84.0 | 168 | 0.6562 | 0.5779 | 0.6571 | 0.8106 |
| No log | 85.0 | 170 | 0.6447 | 0.5964 | 0.6456 | 0.8035 |
| No log | 86.0 | 172 | 0.7726 | 0.5170 | 0.7736 | 0.8795 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.1
- Tokenizers 0.21.0
|
hZzy/qwen2.5-1.5b-sft3-25-3
|
hZzy
| 2025-03-18T00:27:01Z
| 0
| 0
| null |
[
"safetensors",
"qwen2",
"alignment-handbook",
"trl",
"sft",
"generated_from_trainer",
"dataset:hZzy/SFT_new_full2",
"base_model:Qwen/Qwen2.5-1.5B",
"base_model:finetune:Qwen/Qwen2.5-1.5B",
"license:apache-2.0",
"region:us"
] | null | 2025-03-17T22:25:20Z
|
---
license: apache-2.0
base_model: Qwen/Qwen2.5-1.5B
tags:
- alignment-handbook
- trl
- sft
- generated_from_trainer
datasets:
- hZzy/SFT_new_full2
model-index:
- name: qwen2.5-1.5b-sft3-25-3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/zhiyuzha-university-of-florida/huggingface/runs/pt6pvjen)
# qwen2.5-1.5b-sft3-25-3
This model is a fine-tuned version of [Qwen/Qwen2.5-1.5B](https://huggingface.co/Qwen/Qwen2.5-1.5B) on the hZzy/SFT_new_full2 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1614
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 8
- total_train_batch_size: 320
- total_eval_batch_size: 40
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
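The reported totals follow directly from the per-device settings:
```python
# How the reported totals are derived from the per-device settings above.
train_batch_size = 10
num_devices = 4
gradient_accumulation_steps = 8
total_train_batch_size = train_batch_size * num_devices * gradient_accumulation_steps
assert total_train_batch_size == 320
total_eval_batch_size = 10 * num_devices  # eval_batch_size * num_devices == 40
```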
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.9732 | 0.2439 | 5 | 2.9550 |
| 2.9685 | 0.4878 | 10 | 2.9329 |
| 2.9341 | 0.7317 | 15 | 2.8866 |
| 2.8788 | 0.9756 | 20 | 2.8079 |
| 2.8082 | 1.2195 | 25 | 2.7484 |
| 2.7341 | 1.4634 | 30 | 2.6838 |
| 2.6784 | 1.7073 | 35 | 2.6335 |
| 2.6326 | 1.9512 | 40 | 2.5951 |
| 2.5934 | 2.1951 | 45 | 2.5594 |
| 2.5543 | 2.4390 | 50 | 2.5217 |
| 2.513 | 2.6829 | 55 | 2.4829 |
| 2.4712 | 2.9268 | 60 | 2.4461 |
| 2.4365 | 3.1707 | 65 | 2.4138 |
| 2.4066 | 3.4146 | 70 | 2.3859 |
| 2.375 | 3.6585 | 75 | 2.3606 |
| 2.3415 | 3.9024 | 80 | 2.3369 |
| 2.3225 | 4.1463 | 85 | 2.3143 |
| 2.2989 | 4.3902 | 90 | 2.2927 |
| 2.2748 | 4.6341 | 95 | 2.2732 |
| 2.2513 | 4.8780 | 100 | 2.2562 |
| 2.2401 | 5.1220 | 105 | 2.2412 |
| 2.2172 | 5.3659 | 110 | 2.2282 |
| 2.204 | 5.6098 | 115 | 2.2168 |
| 2.1893 | 5.8537 | 120 | 2.2069 |
| 2.1784 | 6.0976 | 125 | 2.1984 |
| 2.1646 | 6.3415 | 130 | 2.1914 |
| 2.1673 | 6.5854 | 135 | 2.1852 |
| 2.1555 | 6.8293 | 140 | 2.1801 |
| 2.1599 | 7.0732 | 145 | 2.1757 |
| 2.145 | 7.3171 | 150 | 2.1721 |
| 2.1359 | 7.5610 | 155 | 2.1692 |
| 2.1391 | 7.8049 | 160 | 2.1668 |
| 2.1274 | 8.0488 | 165 | 2.1650 |
| 2.1342 | 8.2927 | 170 | 2.1637 |
| 2.1272 | 8.5366 | 175 | 2.1627 |
| 2.133 | 8.7805 | 180 | 2.1621 |
| 2.1286 | 9.0244 | 185 | 2.1617 |
| 2.1296 | 9.2683 | 190 | 2.1615 |
| 2.1256 | 9.5122 | 195 | 2.1614 |
| 2.1267 | 9.7561 | 200 | 2.1614 |
### Framework versions
- Transformers 4.42.0
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.19.1
|
Yntec/NextPhoton
|
Yntec
| 2025-03-18T00:26:56Z
| 7,787
| 1
|
diffusers
|
[
"diffusers",
"safetensors",
"Base Model",
"Style",
"Photo",
"Easy",
"Photorealistic",
"Clothes",
"bigbeanboiler",
"Photographer",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"base_model:digiplay/NextPhoto_v2",
"base_model:merge:digiplay/NextPhoto_v2",
"base_model:digiplay/Photon_v1",
"base_model:merge:digiplay/Photon_v1",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] |
text-to-image
| 2024-09-15T03:57:08Z
|
---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Base Model
- Style
- Photo
- Easy
- Photorealistic
- Clothes
- bigbeanboiler
- Photographer
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
base_model:
- digiplay/Photon_v1
- digiplay/NextPhoto_v2
base_model_relation: merge
inference: true
---
# Next Photon
Next Photo 2 merged with Photon to make them real! Samples and prompts (all use seed 9119):

A well lit photograph of a girl having fun with her boyfriend and friends at the bar. Coke can.

An abandoned amusement park reclaimed by nature, with rusted roller coasters and a carousel frozen in time.

A well lit candid closeup street photograph of a normal, yet beautiful girl posing in the club

A photo of a steampunk-inspired airship soaring through the sky, propelled by a magnificent array of gears and turbines.
Original pages:
https://civitai.com/models/84728/photon
https://civitai.com/models/84335?modelVersionId=109819 (Next Photo 2)
# Recipe:
- SuperMerger Weight Sum using MBW 1,0,0,0,0,0,0,0,0,0,1,1,1,0,1,1,1,1,1,1,0,0,0,1,1,1
- Model A: Next Photo 2
- Model B: Photon
- Output: NextPhoton
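As an illustrative sketch of a weight-sum merge (SuperMerger's MBW additionally applies a different alpha per UNet block, per the list above; the local paths are hypothetical):
```python
# Illustrative weight-sum merge sketch (SuperMerger's MBW uses per-block alphas;
# a single alpha is shown here, and the local paths are hypothetical).
from safetensors.torch import load_file, save_file

a = load_file("NextPhoto_v2.safetensors")
b = load_file("Photon_v1.safetensors")
alpha = 0.5
merged = {k: alpha * a[k] + (1.0 - alpha) * b[k] for k in a if k in b}
save_file(merged, "NextPhoton.safetensors")
```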
|
Zolotomeo/Denismodel5
|
Zolotomeo
| 2025-03-18T00:24:20Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] |
text-to-image
| 2025-03-18T00:18:51Z
|
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: ZKDD
---
# Denismodel5
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `ZKDD` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('Zolotomeo/Denismodel5', weight_name='lora.safetensors')
image = pipeline('your prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
Maryam-Faisal-Viral-Video-news/Maryam.Faisal.Viral.Video.Original.Link.Tiktok.Instagram.Twitter
|
Maryam-Faisal-Viral-Video-news
| 2025-03-18T00:22:56Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-18T00:22:28Z
|
|
FluxiIA/Qwen_7b-tool_call_on_reasonin-Q6_K-GGUF
|
FluxiIA
| 2025-03-18T00:15:00Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"text-generation-inference",
"unsloth",
"qwen2",
"trl",
"sft",
"llama-cpp",
"gguf-my-repo",
"en",
"base_model:FluxiIA/Qwen_7b-tool_call_on_reasonin",
"base_model:quantized:FluxiIA/Qwen_7b-tool_call_on_reasonin",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T20:56:41Z
|
---
base_model: FluxiIA/Qwen_7b-tool_call_on_reasonin
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
- llama-cpp
- gguf-my-repo
---
# FluxiIA/Qwen_7b-tool_call_on_reasonin-Q6_K-GGUF
This model was converted to GGUF format from [`FluxiIA/Qwen_7b-tool_call_on_reasonin`](https://huggingface.co/FluxiIA/Qwen_7b-tool_call_on_reasonin) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/FluxiIA/Qwen_7b-tool_call_on_reasonin) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo FluxiIA/Qwen_7b-tool_call_on_reasonin-Q6_K-GGUF --hf-file qwen_7b-tool_call_on_reasonin-q6_k.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo FluxiIA/Qwen_7b-tool_call_on_reasonin-Q6_K-GGUF --hf-file qwen_7b-tool_call_on_reasonin-q6_k.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo FluxiIA/Qwen_7b-tool_call_on_reasonin-Q6_K-GGUF --hf-file qwen_7b-tool_call_on_reasonin-q6_k.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo FluxiIA/Qwen_7b-tool_call_on_reasonin-Q6_K-GGUF --hf-file qwen_7b-tool_call_on_reasonin-q6_k.gguf -c 2048
```
|
YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2-Q4_K_M-GGUF
|
YOYO-AI
| 2025-03-18T00:12:16Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"mergekit",
"merge",
"llama-cpp",
"gguf-my-repo",
"base_model:YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2",
"base_model:quantized:YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-18T00:10:45Z
|
---
base_model: YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- gguf-my-repo
---
# YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2-Q4_K_M-GGUF
This model was converted to GGUF format from [`YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2`](https://huggingface.co/YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2-Q4_K_M-GGUF --hf-file qwen2.5-32b-yoyo-reasoning-v2-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2-Q4_K_M-GGUF --hf-file qwen2.5-32b-yoyo-reasoning-v2-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2-Q4_K_M-GGUF --hf-file qwen2.5-32b-yoyo-reasoning-v2-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo YOYO-AI/Qwen2.5-32B-YOYO-reasoning-v2-Q4_K_M-GGUF --hf-file qwen2.5-32b-yoyo-reasoning-v2-q4_k_m.gguf -c 2048
```
|
kpokhrel007/fine-tuned-DeepSeek-R1-Distill-1.5B-CGA_Base
|
kpokhrel007
| 2025-03-18T00:10:28Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] |
text-classification
| 2025-03-18T00:06:05Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Zolotomeo/Denismodel4
|
Zolotomeo
| 2025-03-18T00:09:29Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] |
text-to-image
| 2025-03-17T23:51:20Z
|
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: GIBB
---
# Denismodel4
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `GIBB` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('Zolotomeo/Denismodel4', weight_name='lora.safetensors')
image = pipeline('your prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
Uthar/John6666_unlimited-porn-xreal-sdxl
|
Uthar
| 2025-03-18T00:09:27Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"anime",
"hentai",
"realistic",
"photorealistic",
"pony",
"en",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] |
text-to-image
| 2025-03-18T00:09:26Z
|
---
license: other
license_name: faipl-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- anime
- hentai
- realistic
- photorealistic
- pony
---
Original model is [here](https://civitai.com/models/722566?modelVersionId=858718).
This model was created by [advokat](https://civitai.com/user/advokat).
|
Grogros/dmWM-llama-3.2-1B-Instruct-KGWB-OWT_WMBoundary-OWT-WB-v2
|
Grogros
| 2025-03-18T00:08:28Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"conversational",
"dataset:openwebtext",
"base_model:meta-llama/Llama-3.2-1B-Instruct",
"base_model:finetune:meta-llama/Llama-3.2-1B-Instruct",
"license:llama3.2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T16:52:40Z
|
---
library_name: transformers
license: llama3.2
base_model: meta-llama/Llama-3.2-1B-Instruct
tags:
- generated_from_trainer
datasets:
- openwebtext
model-index:
- name: dmWM-llama-3.2-1B-Instruct-KGWB-OWT_WMBoundary-OWT-WB-v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dmWM-llama-3.2-1B-Instruct-KGWB-OWT_WMBoundary-OWT-WB-v2
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct) on the openwebtext dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adafactor (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5000
### Training results
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1.post303
- Datasets 3.2.0
- Tokenizers 0.20.3
|
Uthar/John6666_true-amateur-feeling-xl-v1-sdxl
|
Uthar
| 2025-03-18T00:06:59Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"realistic",
"photorealistic",
"amateur",
"true amateur feeling",
"en",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] |
text-to-image
| 2025-03-18T00:06:58Z
|
---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- realistic
- photorealistic
- amateur
- true amateur feeling
---
Original model is [here](https://civitai.com/models/645007/true-amateur-feeling-xl?modelVersionId=721527).
|
drewbenson/DeepSeek-R1-Distill-Llama-8B-4bit-MLX
|
drewbenson
| 2025-03-18T00:06:08Z
| 17
| 1
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"mlx",
"conversational",
"base_model:deepseek-ai/DeepSeek-R1-Distill-Llama-8B",
"base_model:quantized:deepseek-ai/DeepSeek-R1-Distill-Llama-8B",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"region:us"
] |
text-generation
| 2025-03-11T03:21:58Z
|
---
license: mit
library_name: transformers
tags:
- mlx
base_model: deepseek-ai/DeepSeek-R1-Distill-Llama-8B
---
# drewbenson/DeepSeek-R1-Distill-Llama-8B-4bit-MLX
The model [drewbenson/DeepSeek-R1-Distill-Llama-8B-4bit-MLX](https://huggingface.co/drewbenson/DeepSeek-R1-Distill-Llama-8B-4bit-MLX) was
converted to MLX format from [deepseek-ai/DeepSeek-R1-Distill-Llama-8B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-8B)
using mlx-lm version **0.21.5**.
On an M4 Max, this runs 40%-50% faster than the GGUF.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load("drewbenson/DeepSeek-R1-Distill-Llama-8B-4bit-MLX")

prompt = "hello"
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
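For reference, the conversion itself can be reproduced roughly like this (a sketch; exact invocation and defaults assumed):
```python
# Sketch of the MLX conversion (defaults assumed; mlx-lm also exposes this
# as a command-line entry point).
from mlx_lm import convert

convert(
    "deepseek-ai/DeepSeek-R1-Distill-Llama-8B",
    mlx_path="DeepSeek-R1-Distill-Llama-8B-4bit-MLX",
    quantize=True,  # 4-bit by default
)
```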
|
Uthar/John6666_fennfoto-pony-v3-sdxl
|
Uthar
| 2025-03-18T00:05:35Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"realistic",
"photorealistic",
"overwatch",
"pony",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] |
text-to-image
| 2025-03-18T00:05:35Z
|
---
license: other
license_name: faipl-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- realistic
- photorealistic
- overwatch
- pony
---
Original model is [here](https://civitai.com/models/503537/fennfoto-pony?modelVersionId=676770).
|
Asap7772/smollm2_lr1e4_10ep_binned_sft
|
Asap7772
| 2025-03-18T00:02:52Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-18T00:01:47Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Uthar/John6666_ultra-v5-sdxl
|
Uthar
| 2025-03-18T00:02:06Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"realistic",
"photorealistic",
"beautiful",
"pony",
"en",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] |
text-to-image
| 2025-03-18T00:02:04Z
|
---
license: other
license_name: faipl-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- realistic
- photorealistic
- beautiful
- pony
---
Original model is [here](https://civitai.com/models/228525?modelVersionId=877221).
This model was created by [AIA_civit](https://civitai.com/user/AIA_civit).
|
Asap7772/smollm2_lr1e4_7ep_binned_sft
|
Asap7772
| 2025-03-18T00:01:38Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-18T00:00:55Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Uthar/John6666_stoiqo-new-reality-sdxl-xlpro-sdxl
|
Uthar
| 2025-03-18T00:01:30Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"realistic",
"photorealistic",
"photo",
"en",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] |
text-to-image
| 2025-03-18T00:01:28Z
|
---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- realistic
- photorealistic
- photo
---
Original model is [here](https://civitai.com/models/161068/stoiqo-newreality-or-sd-xl-lightning?modelVersionId=690310).
This model was created by [ALIENHAZE](https://civitai.com/user/ALIENHAZE).
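Since the card ships no usage snippet, here is a minimal loading sketch with diffusers; the repo id comes from this entry (tagged `diffusers:StableDiffusionXLPipeline`), while the dtype, device, and prompt are illustrative assumptions:
```py
from diffusers import StableDiffusionXLPipeline
import torch

# Load the checkpoint as a standard SDXL pipeline, per the tags on this entry.
pipeline = StableDiffusionXLPipeline.from_pretrained(
    "Uthar/John6666_stoiqo-new-reality-sdxl-xlpro-sdxl",
    torch_dtype=torch.float16,
).to("cuda")

# Any photorealistic prompt works; this one is illustrative only.
image = pipeline("a photorealistic portrait, natural lighting", num_inference_steps=30).images[0]
image.save("output.png")
```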
|
devmgck/bert-department-classification
|
devmgck
| 2025-03-18T00:01:05Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2025-03-18T00:00:30Z
|
---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-department-classification
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-department-classification
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0003
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
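Pending details from the author, a minimal inference sketch — it assumes the fine-tuned classification head and label names are stored in this repository, and the input is a placeholder:
```python
from transformers import pipeline

# Text classification with the fine-tuned head; label names come from the repo config.
classifier = pipeline("text-classification", model="devmgck/bert-department-classification")
print(classifier("Please reset my laptop password."))  # illustrative input only
```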
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.0006 | 3.1746 | 200 | 0.0003 | 1.0 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.6.0+cpu
- Datasets 3.3.2
- Tokenizers 0.20.3
|
Asap7772/smollm2_lr1e4_5ep_binned_sft
|
Asap7772
| 2025-03-18T00:00:43Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:59:36Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
camilla-araujo-viral-video-news/camilla.araujo.viral.video.on.social.media.x.twitter.telegram
|
camilla-araujo-viral-video-news
| 2025-03-18T00:00:42Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-18T00:00:08Z
|
|
4ddigital/moniqueclone1
|
4ddigital
| 2025-03-18T00:00:23Z
| 0
| 0
| null |
[
"license:other",
"region:us"
] | null | 2025-03-17T22:55:55Z
|
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
|
Uthar/John6666_mklan-aio-nsfw-aio-nextgen-xlv2-sdxl
|
Uthar
| 2025-03-17T23:59:49Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"stable-diffusion-xl",
"realistic",
"photorealistic",
"anime",
"game",
"animals",
"en",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] |
text-to-image
| 2025-03-17T23:59:48Z
|
---
license: other
license_name: faipl-1.0-sd
license_link: https://freedevproject.org/faipl-1.0-sd/
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- text-to-image
- stable-diffusion
- stable-diffusion-xl
- realistic
- photorealistic
- anime
- game
- animals
---
Original model is [here](https://civitai.com/models/319953/mklan-aio-nsfw?modelVersionId=836616).
This model was created by [mskiller51](https://civitai.com/user/mskiller51).
|
JuliaBotAI/Juliette-32B-16bit-Med
|
JuliaBotAI
| 2025-03-17T23:59:08Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"qwen2",
"trl",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2025-03-17T23:58:57Z
|
---
base_model: unsloth/qwq-32b-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** JuliaBotAI
- **License:** apache-2.0
- **Finetuned from model:** unsloth/qwq-32b-unsloth-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
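No usage snippet is provided; here is a plain transformers sketch, assuming this repository holds merged 16-bit weights (as the repo name suggests) and that the standard Qwen2 chat template applies — the prompt is a placeholder:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "JuliaBotAI/Juliette-32B-16bit-Med"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Qwen2-style chat template, per the qwen2 tag on this entry.
messages = [{"role": "user", "content": "Briefly explain what a differential diagnosis is."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(inputs, max_new_tokens=128)[0], skip_special_tokens=True))
```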
|
Asap7772/smollm2_lr5e5_10ep_binned_sft
|
Asap7772
| 2025-03-17T23:58:28Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:57:51Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Asap7772/smollm2_lr5e5_7ep_binned_sft
|
Asap7772
| 2025-03-17T23:57:42Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:57:04Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
TongZheng1999/gemma-2-9b-it-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-1
|
TongZheng1999
| 2025-03-17T23:57:06Z
| 0
| 0
|
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"gemma2",
"text-generation",
"generated_from_trainer",
"alignment-handbook",
"trl",
"sft",
"conversational",
"base_model:google/gemma-2-9b-it",
"base_model:finetune:google/gemma-2-9b-it",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T21:46:17Z
|
---
base_model: google/gemma-2-9b-it
library_name: transformers
model_name: gemma-2-9b-it-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-1
tags:
- generated_from_trainer
- alignment-handbook
- trl
- sft
licence: license
---
# Model Card for gemma-2-9b-it-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-1
This model is a fine-tuned version of [google/gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="TongZheng1999/gemma-2-9b-it-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-1", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/kidzheng/huggingface/runs/z8uurlew)
This model was trained with SFT.
### Framework versions
- TRL: 0.12.0
- Transformers: 4.46.0
- Pytorch: 2.6.0
- Datasets: 3.3.1
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
```
|
nakshatra44/17mar_incremental_v9_2epoches
|
nakshatra44
| 2025-03-17T23:54:59Z
| 0
| 0
|
peft
|
[
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:unsloth/mistral-7b-v0.3-bnb-4bit",
"base_model:adapter:unsloth/mistral-7b-v0.3-bnb-4bit",
"region:us"
] | null | 2025-03-17T23:54:44Z
|
---
base_model: unsloth/mistral-7b-v0.3-bnb-4bit
library_name: peft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
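The template leaves this blank; a minimal sketch for attaching the adapter with PEFT follows. The base model id is taken from this card's front matter; everything else (prompt, generation settings) is an assumption:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/mistral-7b-v0.3-bnb-4bit"  # stated base model
adapter_id = "nakshatra44/17mar_incremental_v9_2epoches"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the adapter weights

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0], skip_special_tokens=True))
```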
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.14.0
|
alelisita/brunom
|
alelisita
| 2025-03-17T23:54:50Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] |
text-to-image
| 2025-03-17T23:44:24Z
|
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: TOK
---
# Brunom
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `TOK` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('alelisita/brunom', weight_name='lora.safetensors')
image = pipeline('your prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
MrRobotoAI/248-Q4_K_M-GGUF
|
MrRobotoAI
| 2025-03-17T23:54:23Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"mergekit",
"merge",
"llama-cpp",
"gguf-my-repo",
"base_model:MrRobotoAI/248",
"base_model:quantized:MrRobotoAI/248",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T23:53:56Z
|
---
base_model: MrRobotoAI/248
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- gguf-my-repo
---
# MrRobotoAI/248-Q4_K_M-GGUF
This model was converted to GGUF format from [`MrRobotoAI/248`](https://huggingface.co/MrRobotoAI/248) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/MrRobotoAI/248) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo MrRobotoAI/248-Q4_K_M-GGUF --hf-file 248-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo MrRobotoAI/248-Q4_K_M-GGUF --hf-file 248-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo MrRobotoAI/248-Q4_K_M-GGUF --hf-file 248-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo MrRobotoAI/248-Q4_K_M-GGUF --hf-file 248-q4_k_m.gguf -c 2048
```
|
mouseyy/best_model_copy
|
mouseyy
| 2025-03-17T23:51:38Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice_17_0",
"base_model:facebook/wav2vec2-xls-r-300m",
"base_model:finetune:facebook/wav2vec2-xls-r-300m",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2025-03-17T22:49:19Z
|
---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
datasets:
- common_voice_17_0
metrics:
- wer
model-index:
- name: result_data-1
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: common_voice_17_0
type: common_voice_17_0
config: uk
split: test
args: uk
metrics:
- name: Wer
type: wer
value: 0.36512878573450325
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# result_data-1
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice_17_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2220
- Wer: 0.3651
- Cer: 0.1691
## Model description
More information needed
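In lieu of author-provided usage, a minimal transcription sketch — the model id comes from this entry, and the audio file path is a placeholder:
```python
from transformers import pipeline

# CTC-based ASR; the card's WER/CER above were measured on the Ukrainian Common Voice test split.
asr = pipeline("automatic-speech-recognition", model="mouseyy/best_model_copy")
print(asr("sample.wav")["text"])  # sample.wav is a placeholder path
```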
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.532628754904162e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 206
- num_epochs: 7.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 0.6324 | 0.9099 | 1000 | 0.5004 | 0.6083 | 0.2381 |
| 0.3497 | 1.8198 | 2000 | 0.3087 | 0.4650 | 0.1965 |
| 0.2642 | 2.7298 | 3000 | 0.2636 | 0.4249 | 0.1841 |
| 0.2328 | 3.6397 | 4000 | 0.2431 | 0.3960 | 0.1789 |
| 0.1933 | 4.5496 | 5000 | 0.2289 | 0.3773 | 0.1732 |
| 0.1783 | 5.4595 | 6000 | 0.2300 | 0.3728 | 0.1711 |
| 0.1617 | 6.3694 | 7000 | 0.2233 | 0.3637 | 0.1700 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
|
jkazdan/results
|
jkazdan
| 2025-03-17T23:50:56Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"generated_from_trainer",
"trl",
"sft",
"base_model:meta-llama/Llama-3.3-70B-Instruct",
"base_model:finetune:meta-llama/Llama-3.3-70B-Instruct",
"endpoints_compatible",
"region:us"
] | null | 2025-03-17T23:49:17Z
|
---
base_model: meta-llama/Llama-3.3-70B-Instruct
library_name: transformers
model_name: results
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for results
This model is a fine-tuned version of [meta-llama/Llama-3.3-70B-Instruct](https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="jkazdan/results", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
This model was trained with SFT.
### Framework versions
- TRL: 0.15.2
- Transformers: 4.49.0
- Pytorch: 2.5.1
- Datasets: 3.3.2
- Tokenizers: 0.21.0
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
```
|
genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold1
|
genki10
| 2025-03-17T23:49:35Z
| 0
| 0
|
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2025-03-17T23:24:10Z
|
---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold1
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6094
- Qwk: 0.5813
- Mse: 0.6093
- Rmse: 0.7806
## Model description
More information needed
## Intended uses & limitations
More information needed
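No usage details are given; below is a cautious inference sketch. It assumes the checkpoint exposes a sequence-classification head producing a single score per input (the Qwk/MSE/RMSE metrics above suggest scalar score prediction), and the input text is a placeholder:
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("An example essay response to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # a single value per input if the head is regression-style
```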
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 1.0 | 2 | 9.6178 | 0.0037 | 9.6151 | 3.1008 |
| No log | 2.0 | 4 | 7.1481 | 0.0 | 7.1457 | 2.6731 |
| No log | 3.0 | 6 | 5.5481 | 0.0664 | 5.5459 | 2.3550 |
| No log | 4.0 | 8 | 4.4128 | 0.0215 | 4.4107 | 2.1002 |
| No log | 5.0 | 10 | 3.4918 | 0.0 | 3.4897 | 1.8681 |
| No log | 6.0 | 12 | 2.7932 | 0.0 | 2.7914 | 1.6707 |
| No log | 7.0 | 14 | 2.1173 | 0.1711 | 2.1156 | 1.4545 |
| No log | 8.0 | 16 | 1.6283 | 0.0545 | 1.6267 | 1.2754 |
| No log | 9.0 | 18 | 1.3013 | 0.0 | 1.2998 | 1.1401 |
| No log | 10.0 | 20 | 1.0254 | 0.0 | 1.0241 | 1.0120 |
| No log | 11.0 | 22 | 0.8896 | 0.1202 | 0.8883 | 0.9425 |
| No log | 12.0 | 24 | 0.7839 | 0.3258 | 0.7827 | 0.8847 |
| No log | 13.0 | 26 | 0.6986 | 0.5222 | 0.6975 | 0.8352 |
| No log | 14.0 | 28 | 0.7605 | 0.3804 | 0.7595 | 0.8715 |
| No log | 15.0 | 30 | 0.5577 | 0.5076 | 0.5568 | 0.7462 |
| No log | 16.0 | 32 | 0.7155 | 0.4582 | 0.7146 | 0.8453 |
| No log | 17.0 | 34 | 0.6854 | 0.5193 | 0.6846 | 0.8274 |
| No log | 18.0 | 36 | 0.4660 | 0.5858 | 0.4654 | 0.6822 |
| No log | 19.0 | 38 | 0.4696 | 0.5843 | 0.4692 | 0.6850 |
| No log | 20.0 | 40 | 0.8539 | 0.4823 | 0.8535 | 0.9238 |
| No log | 21.0 | 42 | 0.5500 | 0.5523 | 0.5497 | 0.7414 |
| No log | 22.0 | 44 | 0.5815 | 0.5791 | 0.5813 | 0.7624 |
| No log | 23.0 | 46 | 0.6415 | 0.4940 | 0.6414 | 0.8008 |
| No log | 24.0 | 48 | 0.6223 | 0.5947 | 0.6220 | 0.7887 |
| No log | 25.0 | 50 | 0.6499 | 0.5848 | 0.6497 | 0.8061 |
| No log | 26.0 | 52 | 0.6527 | 0.5473 | 0.6526 | 0.8078 |
| No log | 27.0 | 54 | 0.6759 | 0.5781 | 0.6757 | 0.8220 |
| No log | 28.0 | 56 | 0.6447 | 0.5961 | 0.6445 | 0.8028 |
| No log | 29.0 | 58 | 0.6411 | 0.5873 | 0.6411 | 0.8007 |
| No log | 30.0 | 60 | 0.6868 | 0.5708 | 0.6866 | 0.8286 |
| No log | 31.0 | 62 | 0.6310 | 0.5885 | 0.6309 | 0.7943 |
| No log | 32.0 | 64 | 0.6166 | 0.5809 | 0.6165 | 0.7852 |
| No log | 33.0 | 66 | 0.6781 | 0.5824 | 0.6781 | 0.8235 |
| No log | 34.0 | 68 | 0.6434 | 0.5970 | 0.6433 | 0.8021 |
| No log | 35.0 | 70 | 0.5922 | 0.5710 | 0.5921 | 0.7695 |
| No log | 36.0 | 72 | 0.7697 | 0.5253 | 0.7697 | 0.8773 |
| No log | 37.0 | 74 | 0.7347 | 0.5427 | 0.7347 | 0.8571 |
| No log | 38.0 | 76 | 0.6191 | 0.5620 | 0.6190 | 0.7868 |
| No log | 39.0 | 78 | 0.6645 | 0.5745 | 0.6645 | 0.8152 |
| No log | 40.0 | 80 | 0.6294 | 0.6046 | 0.6295 | 0.7934 |
| No log | 41.0 | 82 | 0.6357 | 0.5431 | 0.6357 | 0.7973 |
| No log | 42.0 | 84 | 0.6209 | 0.5798 | 0.6209 | 0.7880 |
| No log | 43.0 | 86 | 0.8206 | 0.5009 | 0.8206 | 0.9059 |
| No log | 44.0 | 88 | 0.6068 | 0.5912 | 0.6068 | 0.7790 |
| No log | 45.0 | 90 | 0.6311 | 0.5909 | 0.6312 | 0.7945 |
| No log | 46.0 | 92 | 0.8113 | 0.5113 | 0.8115 | 0.9009 |
| No log | 47.0 | 94 | 0.6313 | 0.5829 | 0.6315 | 0.7947 |
| No log | 48.0 | 96 | 0.6293 | 0.5975 | 0.6295 | 0.7934 |
| No log | 49.0 | 98 | 0.7054 | 0.5525 | 0.7055 | 0.8399 |
| No log | 50.0 | 100 | 0.5931 | 0.5547 | 0.5930 | 0.7701 |
| No log | 51.0 | 102 | 0.6022 | 0.5544 | 0.6022 | 0.7760 |
| No log | 52.0 | 104 | 0.6822 | 0.5726 | 0.6822 | 0.8260 |
| No log | 53.0 | 106 | 0.6220 | 0.5808 | 0.6221 | 0.7887 |
| No log | 54.0 | 108 | 0.6423 | 0.5833 | 0.6424 | 0.8015 |
| No log | 55.0 | 110 | 0.7235 | 0.5382 | 0.7236 | 0.8507 |
| No log | 56.0 | 112 | 0.6421 | 0.5675 | 0.6421 | 0.8013 |
| No log | 57.0 | 114 | 0.6715 | 0.5632 | 0.6714 | 0.8194 |
| No log | 58.0 | 116 | 0.6326 | 0.5904 | 0.6326 | 0.7954 |
| No log | 59.0 | 118 | 0.6067 | 0.5950 | 0.6067 | 0.7789 |
| No log | 60.0 | 120 | 0.6829 | 0.5825 | 0.6830 | 0.8264 |
| No log | 61.0 | 122 | 0.5733 | 0.6186 | 0.5733 | 0.7572 |
| No log | 62.0 | 124 | 0.6243 | 0.6010 | 0.6243 | 0.7901 |
| No log | 63.0 | 126 | 0.7012 | 0.5699 | 0.7013 | 0.8374 |
| No log | 64.0 | 128 | 0.5974 | 0.5937 | 0.5975 | 0.7730 |
| No log | 65.0 | 130 | 0.6103 | 0.5757 | 0.6104 | 0.7813 |
| No log | 66.0 | 132 | 0.7307 | 0.5422 | 0.7308 | 0.8549 |
| No log | 67.0 | 134 | 0.7870 | 0.5283 | 0.7871 | 0.8872 |
| No log | 68.0 | 136 | 0.6104 | 0.5833 | 0.6104 | 0.7813 |
| No log | 69.0 | 138 | 0.6012 | 0.5779 | 0.6011 | 0.7753 |
| No log | 70.0 | 140 | 0.7370 | 0.5416 | 0.7370 | 0.8585 |
| No log | 71.0 | 142 | 0.7227 | 0.5469 | 0.7227 | 0.8501 |
| No log | 72.0 | 144 | 0.6079 | 0.5926 | 0.6079 | 0.7796 |
| No log | 73.0 | 146 | 0.6619 | 0.5836 | 0.6619 | 0.8136 |
| No log | 74.0 | 148 | 0.6296 | 0.5978 | 0.6296 | 0.7935 |
| No log | 75.0 | 150 | 0.6014 | 0.5885 | 0.6014 | 0.7755 |
| No log | 76.0 | 152 | 0.6589 | 0.5753 | 0.6590 | 0.8118 |
| No log | 77.0 | 154 | 0.6361 | 0.5965 | 0.6361 | 0.7976 |
| No log | 78.0 | 156 | 0.6696 | 0.5707 | 0.6697 | 0.8183 |
| No log | 79.0 | 158 | 0.6077 | 0.5957 | 0.6077 | 0.7795 |
| No log | 80.0 | 160 | 0.6884 | 0.5569 | 0.6885 | 0.8297 |
| No log | 81.0 | 162 | 0.6634 | 0.5658 | 0.6635 | 0.8145 |
| No log | 82.0 | 164 | 0.6121 | 0.5876 | 0.6121 | 0.7824 |
| No log | 83.0 | 166 | 0.6568 | 0.5843 | 0.6570 | 0.8105 |
| No log | 84.0 | 168 | 0.6403 | 0.5878 | 0.6404 | 0.8003 |
| No log | 85.0 | 170 | 0.6185 | 0.6159 | 0.6186 | 0.7865 |
| No log | 86.0 | 172 | 0.6397 | 0.5901 | 0.6398 | 0.7999 |
| No log | 87.0 | 174 | 0.6039 | 0.6085 | 0.6039 | 0.7771 |
| No log | 88.0 | 176 | 0.6378 | 0.5893 | 0.6378 | 0.7986 |
| No log | 89.0 | 178 | 0.5903 | 0.5920 | 0.5903 | 0.7683 |
| No log | 90.0 | 180 | 0.5860 | 0.5990 | 0.5858 | 0.7654 |
| No log | 91.0 | 182 | 0.6094 | 0.5813 | 0.6093 | 0.7806 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.1
- Tokenizers 0.21.0
|
Asap7772/smollm2_lr5e6_7ep_binned_sft
|
Asap7772
| 2025-03-17T23:48:56Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:47:46Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
MrRobotoAI/248
|
MrRobotoAI
| 2025-03-17T23:48:22Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"conversational",
"arxiv:2203.05482",
"base_model:Blackroot/Llama-3-LongStory-LORA",
"base_model:merge:Blackroot/Llama-3-LongStory-LORA",
"base_model:MrRobotoAI/106",
"base_model:merge:MrRobotoAI/106",
"base_model:MrRobotoAI/212",
"base_model:merge:MrRobotoAI/212",
"base_model:MrRobotoAI/236",
"base_model:merge:MrRobotoAI/236",
"base_model:MrRobotoAI/237",
"base_model:merge:MrRobotoAI/237",
"base_model:MrRobotoAI/238",
"base_model:merge:MrRobotoAI/238",
"base_model:MrRobotoAI/240",
"base_model:merge:MrRobotoAI/240",
"base_model:MrRobotoAI/242",
"base_model:merge:MrRobotoAI/242",
"base_model:MrRobotoAI/246",
"base_model:merge:MrRobotoAI/246",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:41:55Z
|
---
base_model:
- MrRobotoAI/212
- MrRobotoAI/238
- MrRobotoAI/246
- MrRobotoAI/237
- MrRobotoAI/240
- MrRobotoAI/106
- MrRobotoAI/236
- MrRobotoAI/242
- Blackroot/Llama-3-LongStory-LORA
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Linear](https://arxiv.org/abs/2203.05482) merge method.
### Models Merged
The following models were included in the merge:
* [MrRobotoAI/212](https://huggingface.co/MrRobotoAI/212)
* [MrRobotoAI/238](https://huggingface.co/MrRobotoAI/238)
* [MrRobotoAI/246](https://huggingface.co/MrRobotoAI/246)
* [MrRobotoAI/237](https://huggingface.co/MrRobotoAI/237)
* [MrRobotoAI/240](https://huggingface.co/MrRobotoAI/240)
* [MrRobotoAI/106](https://huggingface.co/MrRobotoAI/106)
* [MrRobotoAI/236](https://huggingface.co/MrRobotoAI/236)
* [MrRobotoAI/242](https://huggingface.co/MrRobotoAI/242) + [Blackroot/Llama-3-LongStory-LORA](https://huggingface.co/Blackroot/Llama-3-LongStory-LORA)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: MrRobotoAI/106
- model: MrRobotoAI/212
- model: MrRobotoAI/236
- model: MrRobotoAI/237
- model: MrRobotoAI/238
- model: MrRobotoAI/240
- model: MrRobotoAI/242+Blackroot/Llama-3-LongStory-LORA
- model: MrRobotoAI/246
parameters:
weight: 1.0
merge_method: linear
dtype: float16
```
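To reproduce a merge from a config like the one above, mergekit's `mergekit-yaml` CLI can be pointed at the YAML; this is a sketch, and available flags vary by mergekit version:
```bash
# Assumes mergekit is installed (pip install mergekit) and the YAML above is saved as config.yaml.
mergekit-yaml config.yaml ./merged-model
```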
|
Asap7772/smollm2_lr5e6_2ep_binned_sft
|
Asap7772
| 2025-03-17T23:46:29Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:45:49Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
FluxiIA/Qwen_7b-tool_call_on_reasonin
|
FluxiIA
| 2025-03-17T23:45:39Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/Qwen2.5-7B-Instruct",
"base_model:finetune:unsloth/Qwen2.5-7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T19:54:56Z
|
---
base_model: unsloth/Qwen2.5-7B-Instruct
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded finetuned model
- **Developed by:** FluxiIA
- **License:** apache-2.0
- **Fine-tuned from model:** unsloth/Qwen2.5-7B-Instruct
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF
|
mradermacher
| 2025-03-17T23:42:10Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"en",
"base_model:ABrain/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R",
"base_model:quantized:ABrain/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R",
"license:mit",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T23:26:13Z
|
---
base_model: ABrain/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/ABrain/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
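As an illustrative sketch (not part of the original card), a single-file quant from the table below can be run locally with the `llama-cpp-python` bindings; the chosen file, context size, and prompt are assumptions.
```python
from llama_cpp import Llama  # pip install llama-cpp-python
# Load one of the quantized files from this repo (Q4_K_M as an example).
llm = Llama(
    model_path="HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q4_K_M.gguf",
    n_ctx=4096,  # context window; adjust to your hardware
)
# Simple completion call; the result is an OpenAI-style dict.
out = llm("Explain what a GGUF file is in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```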
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q2_K.gguf) | Q2_K | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q3_K_S.gguf) | Q3_K_S | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q3_K_L.gguf) | Q3_K_L | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.IQ4_XS.gguf) | IQ4_XS | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q4_K_S.gguf) | Q4_K_S | 4.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q4_K_M.gguf) | Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q5_K_S.gguf) | Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q5_K_M.gguf) | Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q6_K.gguf) | Q6_K | 6.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.Q8_0.gguf) | Q8_0 | 8.2 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R-GGUF/resolve/main/HPGPT-DeepSeek-R1-Distill-Qwen-7B-R.f16.gguf) | f16 | 15.3 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF
|
mradermacher
| 2025-03-17T23:42:10Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"en",
"base_model:hendra01/Qwen2.5-7B-Instruct-medical_summary_latest",
"base_model:quantized:hendra01/Qwen2.5-7B-Instruct-medical_summary_latest",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-03-17T21:06:03Z
|
---
base_model: hendra01/Qwen2.5-7B-Instruct-medical_summary_latest
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/hendra01/Qwen2.5-7B-Instruct-medical_summary_latest
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ1_S.gguf) | i1-IQ1_S | 2.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ1_M.gguf) | i1-IQ1_M | 2.1 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ2_S.gguf) | i1-IQ2_S | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ2_M.gguf) | i1-IQ2_M | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q2_K_S.gguf) | i1-Q2_K_S | 2.9 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q2_K.gguf) | i1-Q2_K | 3.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.2 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ3_S.gguf) | i1-IQ3_S | 3.6 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ3_M.gguf) | i1-IQ3_M | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.2 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-IQ4_NL.gguf) | i1-IQ4_NL | 4.5 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q4_0.gguf) | i1-Q4_0 | 4.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.6 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q4_1.gguf) | i1-Q4_1 | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.i1-Q6_K.gguf) | i1-Q6_K | 6.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
HOT-Gul-Chahat-Viral-Video-news/Gul.Chahat.leaked.Video.on.social.media.trending
|
HOT-Gul-Chahat-Viral-Video-news
| 2025-03-17T23:41:13Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-17T23:40:52Z
|
|
JennyGan/grpo_output
|
JennyGan
| 2025-03-17T23:39:44Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"generated_from_trainer",
"unsloth",
"trl",
"sft",
"grpo",
"arxiv:2402.03300",
"endpoints_compatible",
"region:us"
] | null | 2025-03-17T21:40:52Z
|
---
base_model: unsloth/meta-llama-3.1-8b-instruct-unsloth-bnb-4bit
library_name: transformers
model_name: grpo_output
tags:
- generated_from_trainer
- unsloth
- trl
- sft
- grpo
licence: license
---
# Model Card for grpo_output
This model is a fine-tuned version of [unsloth/meta-llama-3.1-8b-instruct-unsloth-bnb-4bit](https://huggingface.co/unsloth/meta-llama-3.1-8b-instruct-unsloth-bnb-4bit).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="JennyGan/grpo_output", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
This model was trained with GRPO, a method introduced in [DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models](https://huggingface.co/papers/2402.03300).
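For context, a hedged sketch of what a GRPO run looks like with TRL; the dataset and reward function used for this model are not stated in the card, so the ones below are illustrative stand-ins.
```python
from datasets import load_dataset
from trl import GRPOConfig, GRPOTrainer
# Illustrative reward: favor longer completions (a toy stand-in, not the real reward).
def reward_len(completions, **kwargs):
    return [float(len(c)) for c in completions]
# Stand-in prompt dataset; the actual training data is not documented here.
dataset = load_dataset("trl-lib/tldr", split="train")
trainer = GRPOTrainer(
    model="unsloth/meta-llama-3.1-8b-instruct-unsloth-bnb-4bit",
    reward_funcs=reward_len,
    args=GRPOConfig(output_dir="grpo_output"),
    train_dataset=dataset,
)
trainer.train()
```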
### Framework versions
- TRL: 0.15.2
- Transformers: 4.49.0
- Pytorch: 2.5.1
- Datasets: 3.3.2
- Tokenizers: 0.21.0
## Citations
Cite GRPO as:
```bibtex
@article{zhihong2024deepseekmath,
    title = {{DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models}},
    author = {Zhihong Shao and Peiyi Wang and Qihao Zhu and Runxin Xu and Junxiao Song and Mingchuan Zhang and Y. K. Li and Y. Wu and Daya Guo},
    year = 2024,
    eprint = {arXiv:2402.03300},
}
```
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
    title = {{TRL: Transformer Reinforcement Learning}},
    author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
    year = 2020,
    journal = {GitHub repository},
    publisher = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}
```
|
prodm93/img_twitter_test
|
prodm93
| 2025-03-17T23:37:03Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"resnet",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:microsoft/resnet-50",
"base_model:finetune:microsoft/resnet-50",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
image-classification
| 2025-03-17T21:24:18Z
|
---
library_name: transformers
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: img_twitter_test
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.3603696098562628
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# img_twitter_test
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0959
- Accuracy: 0.3604
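As an unofficial usage sketch, the checkpoint can be queried through the generic image-classification pipeline; the image path is a placeholder.
```python
from transformers import pipeline
# Hedged sketch: load this checkpoint with the standard image-classification pipeline.
classifier = pipeline("image-classification", model="prodm93/img_twitter_test")
preds = classifier("example.jpg")  # placeholder path; any image file or URL works
print(preds)  # list of {"label": ..., "score": ...} dicts
```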
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.1
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1706 | 1.0 | 61 | 1.1410 | 0.3737 |
| 1.108 | 2.0 | 122 | 1.0930 | 0.3470 |
| 1.1057 | 3.0 | 183 | 1.1984 | 0.3439 |
| 1.0956 | 4.0 | 244 | 1.0968 | 0.3491 |
| 1.0959 | 5.0 | 305 | 1.0959 | 0.3604 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu121
- Datasets 3.4.1
- Tokenizers 0.21.1
|
albertus-sussex/veriscrape-book-test-sbert-bs64_lr0.0002_ep3_euclidean_snTrue_spFalse_hn1
|
albertus-sussex
| 2025-03-17T23:33:32Z
| 0
| 0
|
sentence-transformers
|
[
"sentence-transformers",
"safetensors",
"new",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:84524",
"loss:AttributeTripletLoss",
"custom_code",
"arxiv:1908.10084",
"arxiv:1703.07737",
"base_model:Alibaba-NLP/gte-base-en-v1.5",
"base_model:finetune:Alibaba-NLP/gte-base-en-v1.5",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| 2025-03-17T23:33:18Z
|
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:84524
- loss:AttributeTripletLoss
base_model: Alibaba-NLP/gte-base-en-v1.5
widget:
- source_sentence: Don Piper
sentences:
- Tommy Nelson
- Kate Walbert
- publisher
- author
- source_sentence: The Luxe
sentences:
- '1999'
- publication_date
- title
- 'Critical Care, Mercy Hospital Series #1'
- source_sentence: Bram Stoker
sentences:
- author
- Michael J. Pangio
- '9781598871012'
- isbn_13
- source_sentence: '9780385340557'
sentences:
- BBC Books
- '9780399208539'
- author
- isbn_13
- source_sentence: Midnight
sentences:
- The Bone Parade
- 12/01/2005
- publication_date
- title
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
- silhouette_cosine
- silhouette_euclidean
model-index:
- name: SentenceTransformer based on Alibaba-NLP/gte-base-en-v1.5
results:
- task:
type: triplet
name: Triplet
dataset:
name: Unknown
type: unknown
metrics:
- type: cosine_accuracy
value: 0.22721464931964874
name: Cosine Accuracy
- type: cosine_accuracy
value: 0.2183786928653717
name: Cosine Accuracy
- task:
type: silhouette
name: Silhouette
dataset:
name: Unknown
type: unknown
metrics:
- type: silhouette_cosine
value: -0.3543417453765869
name: Silhouette Cosine
- type: silhouette_euclidean
value: -0.03605387732386589
name: Silhouette Euclidean
- type: silhouette_cosine
value: -0.35699161887168884
name: Silhouette Cosine
- type: silhouette_euclidean
value: -0.03691111132502556
name: Silhouette Euclidean
---
# SentenceTransformer based on Alibaba-NLP/gte-base-en-v1.5
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) <!-- at revision a829fd0e060bb84554da0dfd354d0de0f7712b7f -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("albertus-sussex/veriscrape-book-test-sbert-bs64_lr0.0002_ep3_euclidean_snTrue_spFalse_hn1")
# Run inference
sentences = [
    'Midnight',
    'The Bone Parade',
    '12/01/2005',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Triplet
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| **cosine_accuracy** | **0.2272** |
#### Silhouette
* Evaluated with <code>veriscrape.training.SilhouetteEvaluator</code>
| Metric | Value |
|:----------------------|:------------|
| **silhouette_cosine** | **-0.3543** |
| silhouette_euclidean | -0.0361 |
#### Triplet
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| **cosine_accuracy** | **0.2184** |
#### Silhouette
* Evaluated with <code>veriscrape.training.SilhouetteEvaluator</code>
| Metric | Value |
|:----------------------|:-----------|
| **silhouette_cosine** | **-0.357** |
| silhouette_euclidean | -0.0369 |
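To show how the triplet accuracy above can be reproduced, here is a minimal sketch using sentence-transformers' `TripletEvaluator`; the three triplets are hypothetical examples in the style of the training data, not the actual evaluation split.
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator
model = SentenceTransformer("albertus-sussex/veriscrape-book-test-sbert-bs64_lr0.0002_ep3_euclidean_snTrue_spFalse_hn1")
# Hypothetical anchor/positive/negative triplets (same attribute vs. different attribute).
anchors = ["09/01/1997", "9780060275730", "Midnight"]
positives = ["12/01/1977", "9780829748772", "The Bone Parade"]
negatives = ["2010", "HarperCollins Publishers Ltd", "12/01/2005"]
evaluator = TripletEvaluator(anchors=anchors, positives=positives, negatives=negatives)
print(evaluator(model))  # dict mapping metric names to values
```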
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 84,524 training samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative</code>, <code>pos_attr_name</code>, and <code>neg_attr_name</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative | pos_attr_name | neg_attr_name |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|
| type | string | string | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 6.97 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 7.09 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 6.31 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.77 tokens</li><li>max: 5 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.8 tokens</li><li>max: 5 tokens</li></ul> |
* Samples:
| anchor | positive | negative | pos_attr_name | neg_attr_name |
|:---------------------------|:---------------------------|:------------------------------------------|:------------------------------|:-----------------------|
| <code>09/01/1997</code> | <code>12/01/1977</code> | <code>2010</code> | <code>publication_date</code> | <code>title</code> |
| <code>9780060275730</code> | <code>9780829748772</code> | <code>HarperCollins Publishers Ltd</code> | <code>isbn_13</code> | <code>publisher</code> |
| <code>9780609809648</code> | <code>9780764551956</code> | <code>HarperCollins Publishers</code> | <code>isbn_13</code> | <code>author</code> |
* Loss: <code>veriscrape.training.AttributeTripletLoss</code> with these parameters:
```json
{
"distance_metric": "TripletDistanceMetric.EUCLIDEAN",
"triplet_margin": 5
}
```
### Evaluation Dataset
#### Unnamed Dataset
* Size: 9,392 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative</code>, <code>pos_attr_name</code>, and <code>neg_attr_name</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative | pos_attr_name | neg_attr_name |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|
| type | string | string | string | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 6.85 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 6.98 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 6.08 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.75 tokens</li><li>max: 5 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 3.8 tokens</li><li>max: 5 tokens</li></ul> |
* Samples:
| anchor | positive | negative | pos_attr_name | neg_attr_name |
|:-------------------------------|:-----------------------------|:---------------------------|:-----------------------|:------------------------------|
| <code>9780764200564</code> | <code>: 9780590458467</code> | <code>1984</code> | <code>isbn_13</code> | <code>publication_date</code> |
| <code>Penguin Group USA</code> | <code>Signet</code> | <code>9781600243912</code> | <code>publisher</code> | <code>isbn_13</code> |
| <code>Alphabet Juice</code> | <code>Space</code> | <code>9780807871133</code> | <code>title</code> | <code>isbn_13</code> |
* Loss: <code>veriscrape.training.AttributeTripletLoss</code> with these parameters:
```json
{
"distance_metric": "TripletDistanceMetric.EUCLIDEAN",
"triplet_margin": 5
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `learning_rate`: 0.0002
- `warmup_ratio`: 0.1
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0002
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | Validation Loss | cosine_accuracy | silhouette_cosine |
|:-----:|:----:|:-------------:|:---------------:|:---------------:|:-----------------:|
| -1 | -1 | - | - | 0.4283 | 0.1492 |
| 1.0 | 1321 | 2.503 | 5.0 | 0.2072 | -0.2569 |
| 2.0 | 2642 | 5.0029 | 5.0000 | 0.1431 | -0.1032 |
| 3.0 | 3963 | 5.0045 | 5.0000 | 0.2272 | -0.3543 |
| -1 | -1 | - | - | 0.2184 | -0.3570 |
### Framework Versions
- Python: 3.10.16
- Sentence Transformers: 3.4.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.5.2
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### AttributeTripletLoss
```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
|
CovaDax/starbunk-ai
|
CovaDax
| 2025-03-17T23:29:15Z
| 0
| 0
| null |
[
"license:apache-2.0",
"region:us"
] | null | 2025-03-17T23:29:15Z
|
---
license: apache-2.0
---
|
TongZheng1999/Qwen2.5-7B-Instruct-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-2
|
TongZheng1999
| 2025-03-17T23:28:07Z
| 0
| 0
|
transformers
|
[
"transformers",
"tensorboard",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"alignment-handbook",
"trl",
"sft",
"conversational",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T23:20:21Z
|
---
library_name: transformers
model_name: Qwen2.5-7B-Instruct-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-2
tags:
- generated_from_trainer
- alignment-handbook
- trl
- sft
licence: license
---
# Model Card for Qwen2.5-7B-Instruct-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-2
This model is a fine-tuned version of an unspecified base model.
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="TongZheng1999/Qwen2.5-7B-Instruct-star-mixed_unique_conclusion-OP-final_10-2-3Rounds-iter-2", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/kidzheng/huggingface/runs/mgdsjuzk)
This model was trained with SFT.
### Framework versions
- TRL: 0.12.0
- Transformers: 4.46.0
- Pytorch: 2.6.0
- Datasets: 3.3.1
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
    title = {{TRL: Transformer Reinforcement Learning}},
    author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
    year = 2020,
    journal = {GitHub repository},
    publisher = {GitHub},
    howpublished = {\url{https://github.com/huggingface/trl}}
}
```
|
summerstars/Soar_adapter
|
summerstars
| 2025-03-17T23:27:59Z
| 0
| 0
|
peft
|
[
"peft",
"safetensors",
"base_model:HuggingFaceTB/SmolLM2-360M-Instruct",
"base_model:adapter:HuggingFaceTB/SmolLM2-360M-Instruct",
"region:us"
] | null | 2025-03-17T22:34:39Z
|
---
base_model: HuggingFaceTB/SmolLM2-360M-Instruct
library_name: peft
---
# SOAR Model with PEFT (Parameter-Efficient Fine-Tuning)
## 📌 Overview
This document shows how to apply PEFT (parameter-efficient fine-tuning) to the SOAR model. PEFT is a technique for fine-tuning large language models efficiently; by applying it to the SOAR model, the model can be adapted effectively with only a small number of trainable parameters.
## 🚀 Required Libraries
- **transformers**: the Hugging Face Transformers library
- **peft**: the PEFT library
Install the libraries with the following command.
```bash
pip install transformers peft
```
---
## 🔧 Preparing the Model
Use the following code to load the SOAR model with PEFT.
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM
# Load the base model
base_model = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/SmolLM2-360M-Instruct")
# Apply the SOAR adapter
model = PeftModel.from_pretrained(base_model, "summerstars/Soar_adapter")
```
This code uses the pretrained `SmolLM2-360M-Instruct` from Hugging Face as the base model and applies the `summerstars/Soar_adapter` PEFT adapter to it.
---
## 💬 Running Inference
After loading the model, run inference with the following code.
```python
from transformers import AutoTokenizer, pipeline
# Load the tokenizer explicitly; the base model object does not expose one
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceTB/SmolLM2-360M-Instruct")
# Set up the text-generation pipeline
soar_pipeline = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)
# Define the inference helper
def generate_soar_text(prompt, max_length=200, temperature=0.7, top_p=0.95, top_k=50):
    response = soar_pipeline(prompt, max_length=max_length, temperature=temperature, top_p=top_p, top_k=top_k, do_sample=True)
    return response[0]["generated_text"]
# Example: run inference
soar_prompt = "What is the future of AI?"
print("[SOAR Model Output]")
print(generate_soar_text(soar_prompt))
```
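If the adapter is LoRA-style, it can also be folded into the base weights for standalone deployment; here is a hedged sketch (assuming a LoRA adapter, which `merge_and_unload` requires).
```python
# Hedged sketch: fold the adapter into the base model and save the result.
# merge_and_unload() assumes a LoRA-style adapter; other adapter types differ.
merged_model = model.merge_and_unload()
merged_model.save_pretrained("soar-merged")  # hypothetical output directory
tokenizer.save_pretrained("soar-merged")
```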
---
## ⚠ Disclaimer
- **This code was created for research purposes and is not intended for commercial use.**
- **Further tuning may be needed to optimize the PEFT-adapted model.**
---
## 🧠 References
- Laird, J. E. (2012). *The SOAR Cognitive Architecture*. MIT Press.
- PEFT paper: Houlsby et al. (2019), *Parameter-Efficient Transfer Learning for NLP*
---
## 📜 License
This project is released under the `Apache 2.0` license.
|
wATCH-Mar-Urista-Viral-Video-news/Mar.Urista.Viral.Video.on.social.media.trending
|
wATCH-Mar-Urista-Viral-Video-news
| 2025-03-17T23:26:25Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-17T23:26:00Z
|
|
genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold0
|
genki10
| 2025-03-17T23:24:04Z
| 0
| 0
|
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2025-03-17T23:01:35Z
|
---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold0
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5111
- Qwk: 0.5724
- Mse: 0.5111
- Rmse: 0.7149
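As an unofficial usage sketch, the checkpoint can be queried with the generic text-classification pipeline; since the metrics above are regression-style (QWK/MSE/RMSE), the raw logit is read back without an activation.
```python
from transformers import pipeline
# Hedged sketch: score an essay with this checkpoint; function_to_apply="none"
# returns the raw regression logit instead of a softmax probability.
scorer = pipeline(
    "text-classification",
    model="genki10/Version27NewTestASAP_FineTuningBERT_AugV27_k3_task1_organization_k3_k3_fold0",
    function_to_apply="none",
)
print(scorer("The essay text to score goes here."))  # placeholder input
```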
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 1.0 | 2 | 9.0978 | 0.0 | 9.0978 | 3.0163 |
| No log | 2.0 | 4 | 7.6464 | 0.0 | 7.6464 | 2.7652 |
| No log | 3.0 | 6 | 6.8538 | 0.0 | 6.8538 | 2.6180 |
| No log | 4.0 | 8 | 5.9710 | 0.0077 | 5.9710 | 2.4436 |
| No log | 5.0 | 10 | 5.1529 | 0.0115 | 5.1529 | 2.2700 |
| No log | 6.0 | 12 | 4.3112 | 0.0039 | 4.3112 | 2.0763 |
| No log | 7.0 | 14 | 3.5502 | 0.0 | 3.5502 | 1.8842 |
| No log | 8.0 | 16 | 2.8166 | 0.0 | 2.8166 | 1.6783 |
| No log | 9.0 | 18 | 2.1379 | 0.0689 | 2.1379 | 1.4622 |
| No log | 10.0 | 20 | 1.6280 | 0.0316 | 1.6280 | 1.2759 |
| No log | 11.0 | 22 | 1.2950 | 0.0316 | 1.2950 | 1.1380 |
| No log | 12.0 | 24 | 1.0457 | 0.0316 | 1.0457 | 1.0226 |
| No log | 13.0 | 26 | 0.8458 | 0.2120 | 0.8458 | 0.9197 |
| No log | 14.0 | 28 | 0.7155 | 0.4564 | 0.7155 | 0.8459 |
| No log | 15.0 | 30 | 0.6717 | 0.4756 | 0.6717 | 0.8195 |
| No log | 16.0 | 32 | 0.6747 | 0.4666 | 0.6747 | 0.8214 |
| No log | 17.0 | 34 | 0.7902 | 0.4505 | 0.7902 | 0.8889 |
| No log | 18.0 | 36 | 0.6376 | 0.5314 | 0.6376 | 0.7985 |
| No log | 19.0 | 38 | 0.6446 | 0.5368 | 0.6446 | 0.8029 |
| No log | 20.0 | 40 | 0.4853 | 0.5498 | 0.4853 | 0.6966 |
| No log | 21.0 | 42 | 0.6914 | 0.4992 | 0.6914 | 0.8315 |
| No log | 22.0 | 44 | 0.6922 | 0.4122 | 0.6922 | 0.8320 |
| No log | 23.0 | 46 | 0.6597 | 0.4342 | 0.6597 | 0.8122 |
| No log | 24.0 | 48 | 0.7140 | 0.4703 | 0.7140 | 0.8450 |
| No log | 25.0 | 50 | 0.6447 | 0.5186 | 0.6447 | 0.8030 |
| No log | 26.0 | 52 | 0.6133 | 0.5468 | 0.6133 | 0.7831 |
| No log | 27.0 | 54 | 0.5730 | 0.5887 | 0.5730 | 0.7570 |
| No log | 28.0 | 56 | 0.5659 | 0.5808 | 0.5659 | 0.7522 |
| No log | 29.0 | 58 | 0.5524 | 0.6264 | 0.5524 | 0.7432 |
| No log | 30.0 | 60 | 0.5736 | 0.5485 | 0.5736 | 0.7574 |
| No log | 31.0 | 62 | 0.5155 | 0.5898 | 0.5155 | 0.7180 |
| No log | 32.0 | 64 | 0.5369 | 0.5745 | 0.5369 | 0.7328 |
| No log | 33.0 | 66 | 0.6155 | 0.5906 | 0.6155 | 0.7845 |
| No log | 34.0 | 68 | 0.5593 | 0.5717 | 0.5593 | 0.7479 |
| No log | 35.0 | 70 | 0.5347 | 0.5869 | 0.5347 | 0.7312 |
| No log | 36.0 | 72 | 0.5053 | 0.5677 | 0.5053 | 0.7108 |
| No log | 37.0 | 74 | 0.5184 | 0.5951 | 0.5184 | 0.7200 |
| No log | 38.0 | 76 | 0.4983 | 0.5736 | 0.4983 | 0.7059 |
| No log | 39.0 | 78 | 0.5442 | 0.6163 | 0.5442 | 0.7377 |
| No log | 40.0 | 80 | 0.5326 | 0.6145 | 0.5326 | 0.7298 |
| No log | 41.0 | 82 | 0.5938 | 0.6204 | 0.5938 | 0.7706 |
| No log | 42.0 | 84 | 0.5005 | 0.6206 | 0.5005 | 0.7074 |
| No log | 43.0 | 86 | 0.4915 | 0.6186 | 0.4915 | 0.7011 |
| No log | 44.0 | 88 | 0.7338 | 0.5657 | 0.7338 | 0.8566 |
| No log | 45.0 | 90 | 0.4974 | 0.6156 | 0.4974 | 0.7053 |
| No log | 46.0 | 92 | 0.4870 | 0.5851 | 0.4870 | 0.6979 |
| No log | 47.0 | 94 | 0.5614 | 0.6046 | 0.5614 | 0.7493 |
| No log | 48.0 | 96 | 0.5689 | 0.6097 | 0.5689 | 0.7542 |
| No log | 49.0 | 98 | 0.4942 | 0.5862 | 0.4942 | 0.7030 |
| No log | 50.0 | 100 | 0.5885 | 0.6055 | 0.5885 | 0.7671 |
| No log | 51.0 | 102 | 0.5477 | 0.6386 | 0.5477 | 0.7401 |
| No log | 52.0 | 104 | 0.6121 | 0.5868 | 0.6121 | 0.7824 |
| No log | 53.0 | 106 | 0.4958 | 0.6094 | 0.4958 | 0.7041 |
| No log | 54.0 | 108 | 0.5003 | 0.6069 | 0.5003 | 0.7073 |
| No log | 55.0 | 110 | 0.5284 | 0.6372 | 0.5284 | 0.7269 |
| No log | 56.0 | 112 | 0.5261 | 0.6183 | 0.5261 | 0.7254 |
| No log | 57.0 | 114 | 0.5281 | 0.6169 | 0.5281 | 0.7267 |
| No log | 58.0 | 116 | 0.5180 | 0.6073 | 0.5180 | 0.7197 |
| No log | 59.0 | 118 | 0.4784 | 0.5941 | 0.4784 | 0.6917 |
| No log | 60.0 | 120 | 0.5140 | 0.6198 | 0.5140 | 0.7169 |
| No log | 61.0 | 122 | 0.4902 | 0.5770 | 0.4902 | 0.7002 |
| No log | 62.0 | 124 | 0.5271 | 0.6144 | 0.5271 | 0.7260 |
| No log | 63.0 | 126 | 0.5197 | 0.5872 | 0.5197 | 0.7209 |
| No log | 64.0 | 128 | 0.5144 | 0.6140 | 0.5144 | 0.7172 |
| No log | 65.0 | 130 | 0.5632 | 0.5969 | 0.5632 | 0.7505 |
| No log | 66.0 | 132 | 0.4763 | 0.6083 | 0.4763 | 0.6901 |
| No log | 67.0 | 134 | 0.4788 | 0.6003 | 0.4788 | 0.6919 |
| No log | 68.0 | 136 | 0.5189 | 0.6090 | 0.5189 | 0.7204 |
| No log | 69.0 | 138 | 0.4946 | 0.5889 | 0.4946 | 0.7033 |
| No log | 70.0 | 140 | 0.5042 | 0.6029 | 0.5042 | 0.7101 |
| No log | 71.0 | 142 | 0.5998 | 0.5972 | 0.5998 | 0.7744 |
| No log | 72.0 | 144 | 0.5327 | 0.6210 | 0.5327 | 0.7299 |
| No log | 73.0 | 146 | 0.5071 | 0.5712 | 0.5071 | 0.7121 |
| No log | 74.0 | 148 | 0.4953 | 0.5975 | 0.4953 | 0.7038 |
| No log | 75.0 | 150 | 0.6276 | 0.5859 | 0.6276 | 0.7922 |
| No log | 76.0 | 152 | 0.5511 | 0.6059 | 0.5511 | 0.7424 |
| No log | 77.0 | 154 | 0.4990 | 0.6015 | 0.4990 | 0.7064 |
| No log | 78.0 | 156 | 0.5112 | 0.6107 | 0.5112 | 0.7150 |
| No log | 79.0 | 158 | 0.5918 | 0.5978 | 0.5918 | 0.7693 |
| No log | 80.0 | 160 | 0.5203 | 0.6061 | 0.5203 | 0.7213 |
| No log | 81.0 | 162 | 0.5111 | 0.5724 | 0.5111 | 0.7149 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.1
- Tokenizers 0.21.0
|
CromonZhang/English-1B
|
CromonZhang
| 2025-03-17T23:23:40Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T23:23:20Z
|
---
base_model: unsloth/llama-3.2-1b-instruct-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** CromonZhang
- **License:** apache-2.0
- **Fine-tuned from model:** unsloth/llama-3.2-1b-instruct-unsloth-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
jiinking/16_first_MQA_llama3B_model
|
jiinking
| 2025-03-17T23:22:33Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T22:09:41Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
wATCH-Mar-Urista-Viral-Video-news/Mar.Urista.Viral.official.Video.tutorial
|
wATCH-Mar-Urista-Viral-Video-news
| 2025-03-17T23:22:04Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-17T23:21:39Z
|
|
stojchet/kto6
|
stojchet
| 2025-03-17T23:21:43Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"trl",
"kto",
"generated_from_trainer",
"base_model:deepseek-ai/deepseek-coder-1.3b-base",
"base_model:finetune:deepseek-ai/deepseek-coder-1.3b-base",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T22:11:00Z
|
---
library_name: transformers
license: other
base_model: deepseek-ai/deepseek-coder-1.3b-base
tags:
- trl
- kto
- generated_from_trainer
model-index:
- name: kto6
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# kto6
This model is a fine-tuned version of [deepseek-ai/deepseek-coder-1.3b-base](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5
- Eval/rewards/chosen: -448.8974
- Eval/logps/chosen: -4526.3562
- Eval/rewards/rejected: -433.3957
- Eval/logps/rejected: -4408.4109
- Eval/rewards/margins: -15.5017
- Eval/kl: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged TRL launch sketch follows the list):
- learning_rate: 0.05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- lr_scheduler_warmup_steps: 200
- num_epochs: 10
- mixed_precision_training: Native AMP
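For readers who want to reproduce a comparable run, here is a minimal sketch using trl's `KTOTrainer` with the hyperparameters listed above. The dataset name is hypothetical, since the card does not identify the training data, and the exact trainer signature varies slightly across trl versions.
```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import KTOConfig, KTOTrainer

model_name = "deepseek-ai/deepseek-coder-1.3b-base"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Hyperparameters mirror the list above.
args = KTOConfig(
    output_dir="kto6",
    learning_rate=0.05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=16,  # 8 x 16 = 128 total train batch size
    num_train_epochs=10,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    warmup_steps=200,
    fp16=True,  # "Native AMP" mixed precision
    seed=42,
)

# Hypothetical dataset: KTO expects prompt/completion pairs with a binary label.
train_dataset = load_dataset("your-org/your-kto-dataset", split="train")

trainer = KTOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,  # newer trl versions take processing_class= instead
)
trainer.train()
```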
### Training results
| Training Loss | Epoch | Step | Validation Loss | Kl |
|:-------------:|:------:|:----:|:---------------:|:---:|
| 0.5029 | 1.1552 | 100 | 0.5028 | 0.0 |
| 0.502 | 2.3105 | 200 | 0.5000 | 0.0 |
| 0.4998 | 3.4657 | 300 | 0.5000 | 0.0 |
| 0.5002 | 4.6209 | 400 | 0.5000 | 0.0 |
| 0.501 | 5.7762 | 500 | 0.5000 | 0.0 |
| 0.4994 | 6.9314 | 600 | 0.5000 | 0.0 |
| 0.4998 | 8.0866 | 700 | 0.5000 | 0.0 |
| 0.501 | 9.2419 | 800 | 0.5000 | 0.0 |
### Framework versions
- Transformers 4.45.0
- Pytorch 2.5.1+cu124
- Datasets 2.19.2
- Tokenizers 0.20.3
|
mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF
|
mradermacher
| 2025-03-17T23:20:08Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"en",
"base_model:hendra01/Qwen2.5-7B-Instruct-medical_summary_latest",
"base_model:quantized:hendra01/Qwen2.5-7B-Instruct-medical_summary_latest",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T17:01:32Z
|
---
base_model: hendra01/Qwen2.5-7B-Instruct-medical_summary_latest
language:
- en
library_name: transformers
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/hendra01/Qwen2.5-7B-Instruct-medical_summary_latest
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
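If you prefer staying in Python, here is a minimal sketch (assuming the `huggingface_hub` and `llama-cpp-python` packages are installed) for downloading and running one of the quants from the table below:
```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a single quant file from this repo; the filename matches the table below.
path = hf_hub_download(
    repo_id="mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF",
    filename="Qwen2.5-7B-Instruct-medical_summary_latest.Q4_K_M.gguf",
)

# Load the GGUF file and run a short completion.
llm = Llama(model_path=path, n_ctx=2048)
out = llm("Summarize the following clinical note:", max_tokens=64)
print(out["choices"][0]["text"])
```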
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q2_K.gguf) | Q2_K | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q3_K_S.gguf) | Q3_K_S | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q3_K_L.gguf) | Q3_K_L | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.IQ4_XS.gguf) | IQ4_XS | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q4_K_S.gguf) | Q4_K_S | 4.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q4_K_M.gguf) | Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q5_K_S.gguf) | Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q5_K_M.gguf) | Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q6_K.gguf) | Q6_K | 6.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.Q8_0.gguf) | Q8_0 | 8.2 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen2.5-7B-Instruct-medical_summary_latest-GGUF/resolve/main/Qwen2.5-7B-Instruct-medical_summary_latest.f16.gguf) | f16 | 15.3 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
medmekk/kek
|
medmekk
| 2025-03-17T23:18:22Z
| 0
| 0
| null |
[
"safetensors",
"qwen2",
"base_model:medmekk/kek",
"base_model:quantized:medmekk/kek",
"4-bit",
"bitsandbytes",
"region:us"
] | null | 2025-03-17T23:18:14Z
|
---
base_model:
- medmekk/kek
---
# medmekk/kek (Quantized)
## Description
This model is a quantized version of the original model `medmekk/kek`. It has been quantized using int4 quantization with bitsandbytes.
## Quantization Details
- **Quantization Type**: int4
- **bnb_4bit_quant_type**: nf4
- **bnb_4bit_use_double_quant**: True
- **bnb_4bit_compute_dtype**: bfloat16
- **bnb_4bit_quant_storage**: uint8
## Usage
You can use this model in your applications by loading it directly from the Hugging Face Hub:
```python
from transformers import AutoModel
model = AutoModel.from_pretrained("medmekk/kek")
```
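If the checkpoint does not already embed its quantization config, the settings listed above map directly onto transformers' `BitsAndBytesConfig`; a hedged sketch:
```python
import torch
from transformers import AutoModel, BitsAndBytesConfig

# Mirrors the "Quantization Details" section above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_storage=torch.uint8,
)
model = AutoModel.from_pretrained("medmekk/kek", quantization_config=bnb_config)
```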
|
MrRobotoAI/247-Q4_K_M-GGUF
|
MrRobotoAI
| 2025-03-17T23:14:38Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"mergekit",
"merge",
"llama-cpp",
"gguf-my-repo",
"base_model:MrRobotoAI/247",
"base_model:quantized:MrRobotoAI/247",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T23:14:15Z
|
---
base_model: MrRobotoAI/247
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- gguf-my-repo
---
# MrRobotoAI/247-Q4_K_M-GGUF
This model was converted to GGUF format from [`MrRobotoAI/247`](https://huggingface.co/MrRobotoAI/247) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/MrRobotoAI/247) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo MrRobotoAI/247-Q4_K_M-GGUF --hf-file 247-q4_k_m.gguf -p "The meaning of life and the universe is"
```
### Server:
```bash
llama-server --hf-repo MrRobotoAI/247-Q4_K_M-GGUF --hf-file 247-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g. `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo MrRobotoAI/247-Q4_K_M-GGUF --hf-file 247-q4_k_m.gguf -p "The meaning of life and the universe is"
```
or
```
./llama-server --hf-repo MrRobotoAI/247-Q4_K_M-GGUF --hf-file 247-q4_k_m.gguf -c 2048
```
|
vozachudo2004/crisjoven2
|
vozachudo2004
| 2025-03-17T23:13:52Z
| 0
| 0
|
diffusers
|
[
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] |
text-to-image
| 2025-03-17T23:03:42Z
|
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: crisjoven2
---
# Crisjoven2
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `crisjoven2` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('vozachudo2004/crisjoven2', weight_name='lora.safetensors')
image = pipeline('your prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
taleef/liberta_finetuned
|
taleef
| 2025-03-17T23:11:50Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"bert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2025-03-17T23:09:40Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Sophie-Rain-SpiderMan-viral-news/Sophie.Rain.SpiderMan.leaked.Video.x.twitter.telegram
|
Sophie-Rain-SpiderMan-viral-news
| 2025-03-17T23:09:00Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-17T23:08:33Z
|
|
colaguo/bert-uncased_RF_finetunefeb24
|
colaguo
| 2025-03-17T23:06:13Z
| 0
| 0
| null |
[
"safetensors",
"model_hub_mixin",
"pytorch_model_hub_mixin",
"region:us"
] | null | 2025-03-17T23:05:55Z
|
---
tags:
- model_hub_mixin
- pytorch_model_hub_mixin
---
This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- Library: [More Information Needed]
- Docs: [More Information Needed]
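Because the card does not document the architecture, the class below is hypothetical; it only illustrates the mixin pattern: any `nn.Module` that also inherits from `PyTorchModelHubMixin` gains `from_pretrained`/`push_to_hub`.
```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

# Hypothetical architecture -- the real layer sizes are not documented.
class MyClassifier(nn.Module, PyTorchModelHubMixin):
    def __init__(self, hidden_size: int = 768, num_labels: int = 2):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, x):
        return self.classifier(x)

# from_pretrained is supplied by the mixin and restores the pushed weights,
# provided the class definition matches the one used at push time.
model = MyClassifier.from_pretrained("colaguo/bert-uncased_RF_finetunefeb24")
```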
|
hanwenzhu/all-MiniLM-L12-v2-lr2e-4-bs256-nneg3-ml-ne5-mar17
|
hanwenzhu
| 2025-03-17T23:05:59Z
| 0
| 0
|
sentence-transformers
|
[
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:5817740",
"loss:MaskedCachedMultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:2101.06983",
"base_model:sentence-transformers/all-MiniLM-L12-v2",
"base_model:finetune:sentence-transformers/all-MiniLM-L12-v2",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| 2025-03-17T23:05:52Z
|
---
base_model: sentence-transformers/all-MiniLM-L12-v2
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:5817740
- loss:MaskedCachedMultipleNegativesRankingLoss
widget:
- source_sentence: Mathlib.Data.Finset.Option#52
sentences:
- neg_inj
- CategoryTheory.Limits.HasCokernels.has_colimit
- Finset.mem_image
- source_sentence: Mathlib.Analysis.Calculus.FDeriv.Mul#68
sentences:
- eq_of_heq
- Option.some.injEq
- Fin.le_last
- source_sentence: Mathlib.Data.Finset.Option#52
sentences:
- Set.biInter_and'
- Int.natCast_dvd_natCast
- Finset.mem_erase
- source_sentence: Mathlib.Algebra.GCDMonoid.Finset#74
sentences:
- gcd_zero_left
- HasFDerivWithinAt.uniqueDiffWithinAt
- Polynomial.Monic.map
- source_sentence: Mathlib.Algebra.Polynomial.HasseDeriv#31
sentences:
- Set.mem_inter_iff
- Polynomial.hasseDeriv_coeff
- HomologicalComplex.isZero_X_of_isStrictlySupported
---
# SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) <!-- at revision c004d8e3e901237d8fa7e9fff12774962e391ce5 -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("hanwenzhu/all-MiniLM-L12-v2-lr2e-4-bs256-nneg3-ml-ne5-mar17")
# Run inference
sentences = [
'Mathlib.Algebra.Polynomial.HasseDeriv#31',
'Polynomial.hasseDeriv_coeff',
'HomologicalComplex.isZero_X_of_isStrictlySupported',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 5,817,740 training samples
* Columns: <code>state_name</code> and <code>premise_name</code>
* Approximate statistics based on the first 1000 samples:
| | state_name | premise_name |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 11 tokens</li><li>mean: 16.2 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 11.26 tokens</li><li>max: 36 tokens</li></ul> |
* Samples:
| state_name | premise_name |
|:----------------------------------------------|:-----------------------------------|
| <code>Mathlib.Algebra.Field.IsField#12</code> | <code>Classical.choose_spec</code> |
| <code>Mathlib.Algebra.Field.IsField#12</code> | <code>IsField.mul_comm</code> |
| <code>Mathlib.Algebra.Field.IsField#12</code> | <code>eq_of_heq</code> |
* Loss: <code>loss.MaskedCachedMultipleNegativesRankingLoss</code> with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
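`MaskedCachedMultipleNegativesRankingLoss` is project-specific and not shipped with sentence-transformers; as a rough analogue, the library's built-in `CachedMultipleNegativesRankingLoss` accepts the same parameters:
```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")

# Built-in analogue of the masked variant used here; same scale and similarity.
loss = losses.CachedMultipleNegativesRankingLoss(
    model, scale=20.0, similarity_fct=util.cos_sim
)
```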
### Evaluation Dataset
#### Unnamed Dataset
* Size: 1,959 evaluation samples
* Columns: <code>state_name</code> and <code>premise_name</code>
* Approximate statistics based on the first 1000 samples:
| | state_name | premise_name |
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 10 tokens</li><li>mean: 15.97 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 11.48 tokens</li><li>max: 35 tokens</li></ul> |
* Samples:
| state_name | premise_name |
|:-------------------------------------------------------------|:----------------------------------------------------------|
| <code>Mathlib.Algebra.Algebra.Hom#80</code> | <code>AlgHom.commutes</code> |
| <code>Mathlib.Algebra.Algebra.NonUnitalSubalgebra#237</code> | <code>NonUnitalAlgHom.instNonUnitalAlgSemiHomClass</code> |
| <code>Mathlib.Algebra.Algebra.NonUnitalSubalgebra#237</code> | <code>NonUnitalAlgebra.mem_top</code> |
* Loss: <code>loss.MaskedCachedMultipleNegativesRankingLoss</code> with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 64
- `learning_rate`: 0.0002
- `num_train_epochs`: 5.0
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.03
- `bf16`: True
- `dataloader_num_workers`: 4
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0002
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5.0
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.03
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 4
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | loss |
|:------:|:------:|:-------------:|:------:|
| 0.0004 | 10 | 6.2321 | - |
| 0.0009 | 20 | 6.1869 | - |
| 0.0013 | 30 | 5.9732 | - |
| 0.0018 | 40 | 5.664 | - |
| 0.0022 | 50 | 5.5547 | - |
| 0.0026 | 60 | 5.444 | - |
| 0.0031 | 70 | 5.3214 | - |
| 0.0035 | 80 | 5.2506 | - |
| 0.0040 | 90 | 5.1097 | - |
| 0.0044 | 100 | 5.044 | - |
| 0.0048 | 110 | 5.001 | - |
| 0.0053 | 120 | 5.0118 | - |
| 0.0057 | 130 | 4.8318 | - |
| 0.0062 | 140 | 4.8274 | - |
| 0.0066 | 150 | 4.7683 | - |
| 0.0070 | 160 | 4.7065 | - |
| 0.0075 | 170 | 4.6916 | - |
| 0.0079 | 180 | 4.6859 | - |
| 0.0084 | 190 | 4.641 | - |
| 0.0088 | 200 | 4.6174 | - |
| 0.0092 | 210 | 4.6516 | - |
| 0.0097 | 220 | 4.6408 | - |
| 0.0101 | 230 | 4.5452 | - |
| 0.0106 | 240 | 4.4376 | - |
| 0.0110 | 250 | 4.4505 | - |
| 0.0114 | 260 | 4.4188 | - |
| 0.0119 | 270 | 4.5416 | - |
| 0.0123 | 280 | 4.4036 | - |
| 0.0128 | 290 | 4.3716 | - |
| 0.0132 | 300 | 4.4278 | - |
| 0.0136 | 310 | 4.4089 | - |
| 0.0141 | 320 | 4.3297 | - |
| 0.0145 | 330 | 4.3806 | - |
| 0.0150 | 340 | 4.1996 | - |
| 0.0154 | 350 | 4.301 | - |
| 0.0158 | 360 | 4.2606 | - |
| 0.0163 | 370 | 4.254 | - |
| 0.0167 | 380 | 4.212 | - |
| 0.0172 | 390 | 4.2467 | - |
| 0.0176 | 400 | 4.1584 | - |
| 0.0180 | 410 | 4.2136 | - |
| 0.0185 | 420 | 4.2396 | - |
| 0.0189 | 430 | 4.2378 | - |
| 0.0194 | 440 | 4.2007 | - |
| 0.0198 | 450 | 4.1394 | - |
| 0.0202 | 460 | 4.2282 | - |
| 0.0207 | 470 | 4.1912 | - |
| 0.0211 | 480 | 4.2879 | - |
| 0.0216 | 490 | 4.106 | - |
| 0.0220 | 500 | 4.1463 | - |
| 0.0224 | 510 | 4.1244 | - |
| 0.0229 | 520 | 4.1425 | - |
| 0.0233 | 530 | 4.1112 | - |
| 0.0238 | 540 | 4.1122 | - |
| 0.0242 | 550 | 4.057 | - |
| 0.0246 | 560 | 4.1289 | - |
| 0.0251 | 570 | 4.0986 | - |
| 0.0255 | 580 | 4.0424 | - |
| 0.0260 | 590 | 4.0662 | - |
| 0.0264 | 600 | 4.0743 | - |
| 0.0268 | 610 | 4.0151 | - |
| 0.0273 | 620 | 3.9671 | - |
| 0.0277 | 630 | 4.013 | - |
| 0.0282 | 640 | 4.108 | - |
| 0.0286 | 650 | 4.0448 | - |
| 0.0290 | 660 | 4.0378 | - |
| 0.0295 | 670 | 4.0244 | - |
| 0.0299 | 680 | 3.9739 | - |
| 0.0304 | 690 | 4.0087 | - |
| 0.0308 | 700 | 3.9205 | - |
| 0.0312 | 710 | 3.9618 | - |
| 0.0317 | 720 | 3.9158 | - |
| 0.0321 | 730 | 3.9446 | - |
| 0.0326 | 740 | 3.8831 | - |
| 0.0330 | 750 | 3.9462 | - |
| 0.0334 | 760 | 3.9076 | - |
| 0.0339 | 770 | 3.9135 | - |
| 0.0343 | 780 | 3.9326 | - |
| 0.0348 | 790 | 3.9661 | - |
| 0.0352 | 800 | 3.9789 | - |
| 0.0356 | 810 | 3.821 | - |
| 0.0361 | 820 | 3.9227 | - |
| 0.0365 | 830 | 3.8306 | - |
| 0.0370 | 840 | 3.914 | - |
| 0.0374 | 850 | 3.7688 | - |
| 0.0378 | 860 | 3.872 | - |
| 0.0383 | 870 | 3.7765 | - |
| 0.0387 | 880 | 3.8352 | - |
| 0.0392 | 890 | 3.8048 | - |
| 0.0396 | 900 | 3.7835 | - |
| 0.0400 | 910 | 3.7257 | - |
| 0.0405 | 920 | 3.8683 | - |
| 0.0409 | 930 | 3.7596 | - |
| 0.0414 | 940 | 3.8021 | - |
| 0.0418 | 950 | 3.8071 | - |
| 0.0422 | 960 | 3.8436 | - |
| 0.0427 | 970 | 3.8086 | - |
| 0.0431 | 980 | 3.7806 | - |
| 0.0436 | 990 | 3.8124 | - |
| 0.0440 | 1000 | 3.8011 | - |
| 0.0444 | 1010 | 3.7173 | - |
| 0.0449 | 1020 | 3.7175 | - |
| 0.0453 | 1030 | 3.8145 | - |
| 0.0458 | 1040 | 3.7478 | - |
| 0.0462 | 1050 | 3.7432 | - |
| 0.0466 | 1060 | 3.77 | - |
| 0.0471 | 1070 | 3.8071 | - |
| 0.0475 | 1080 | 3.6848 | - |
| 0.0480 | 1090 | 3.7451 | - |
| 0.0484 | 1100 | 3.7065 | - |
| 0.0488 | 1110 | 3.7122 | - |
| 0.0493 | 1120 | 3.6611 | - |
| 0.0497 | 1130 | 3.7198 | - |
| 0.0500 | 1137 | - | 1.9881 |
| 0.0502 | 1140 | 3.7097 | - |
| 0.0506 | 1150 | 3.7886 | - |
| 0.0510 | 1160 | 3.7134 | - |
| 0.0515 | 1170 | 3.6009 | - |
| 0.0519 | 1180 | 3.743 | - |
| 0.0524 | 1190 | 3.6948 | - |
| 0.0528 | 1200 | 3.5993 | - |
| 0.0532 | 1210 | 3.6945 | - |
| 0.0537 | 1220 | 3.6236 | - |
| 0.0541 | 1230 | 3.61 | - |
| 0.0546 | 1240 | 3.6753 | - |
| 0.0550 | 1250 | 3.6376 | - |
| 0.0554 | 1260 | 3.5882 | - |
| 0.0559 | 1270 | 3.6905 | - |
| 0.0563 | 1280 | 3.5454 | - |
| 0.0568 | 1290 | 3.6019 | - |
| 0.0572 | 1300 | 3.6338 | - |
| 0.0576 | 1310 | 3.7021 | - |
| 0.0581 | 1320 | 3.5602 | - |
| 0.0585 | 1330 | 3.6088 | - |
| 0.0590 | 1340 | 3.5783 | - |
| 0.0594 | 1350 | 3.6099 | - |
| 0.0598 | 1360 | 3.6671 | - |
| 0.0603 | 1370 | 3.6 | - |
| 0.0607 | 1380 | 3.6125 | - |
| 0.0612 | 1390 | 3.63 | - |
| 0.0616 | 1400 | 3.5842 | - |
| 0.0620 | 1410 | 3.4877 | - |
| 0.0625 | 1420 | 3.6062 | - |
| 0.0629 | 1430 | 3.6039 | - |
| 0.0634 | 1440 | 3.52 | - |
| 0.0638 | 1450 | 3.5671 | - |
| 0.0642 | 1460 | 3.6287 | - |
| 0.0647 | 1470 | 3.5789 | - |
| 0.0651 | 1480 | 3.4494 | - |
| 0.0656 | 1490 | 3.6117 | - |
| 0.0660 | 1500 | 3.5354 | - |
| 0.0664 | 1510 | 3.54 | - |
| 0.0669 | 1520 | 3.5309 | - |
| 0.0673 | 1530 | 3.5197 | - |
| 0.0678 | 1540 | 3.5321 | - |
| 0.0682 | 1550 | 3.5214 | - |
| 0.0686 | 1560 | 3.5533 | - |
| 0.0691 | 1570 | 3.4996 | - |
| 0.0695 | 1580 | 3.6056 | - |
| 0.0700 | 1590 | 3.5286 | - |
| 0.0704 | 1600 | 3.5519 | - |
| 0.0708 | 1610 | 3.5282 | - |
| 0.0713 | 1620 | 3.5485 | - |
| 0.0717 | 1630 | 3.544 | - |
| 0.0722 | 1640 | 3.5892 | - |
| 0.0726 | 1650 | 3.5002 | - |
| 0.0730 | 1660 | 3.4184 | - |
| 0.0735 | 1670 | 3.5317 | - |
| 0.0739 | 1680 | 3.4487 | - |
| 0.0744 | 1690 | 3.4431 | - |
| 0.0748 | 1700 | 3.5365 | - |
| 0.0752 | 1710 | 3.4642 | - |
| 0.0757 | 1720 | 3.4115 | - |
| 0.0761 | 1730 | 3.456 | - |
| 0.0766 | 1740 | 3.4662 | - |
| 0.0770 | 1750 | 3.4666 | - |
| 0.0774 | 1760 | 3.5115 | - |
| 0.0779 | 1770 | 3.4323 | - |
| 0.0783 | 1780 | 3.4221 | - |
| 0.0788 | 1790 | 3.4998 | - |
| 0.0792 | 1800 | 3.5024 | - |
| 0.0796 | 1810 | 3.4714 | - |
| 0.0801 | 1820 | 3.3843 | - |
| 0.0805 | 1830 | 3.4754 | - |
| 0.0810 | 1840 | 3.4148 | - |
| 0.0814 | 1850 | 3.3943 | - |
| 0.0818 | 1860 | 3.4642 | - |
| 0.0823 | 1870 | 3.5303 | - |
| 0.0827 | 1880 | 3.4522 | - |
| 0.0832 | 1890 | 3.4357 | - |
| 0.0836 | 1900 | 3.4872 | - |
| 0.0840 | 1910 | 3.4548 | - |
| 0.0845 | 1920 | 3.5129 | - |
| 0.0849 | 1930 | 3.3739 | - |
| 0.0854 | 1940 | 3.3893 | - |
| 0.0858 | 1950 | 3.4615 | - |
| 0.0862 | 1960 | 3.4024 | - |
| 0.0867 | 1970 | 3.4489 | - |
| 0.0871 | 1980 | 3.407 | - |
| 0.0876 | 1990 | 3.4463 | - |
| 0.0880 | 2000 | 3.374 | - |
| 0.0884 | 2010 | 3.4189 | - |
| 0.0889 | 2020 | 3.4141 | - |
| 0.0893 | 2030 | 3.4273 | - |
| 0.0898 | 2040 | 3.4397 | - |
| 0.0902 | 2050 | 3.4994 | - |
| 0.0906 | 2060 | 3.3854 | - |
| 0.0911 | 2070 | 3.4111 | - |
| 0.0915 | 2080 | 3.4001 | - |
| 0.0920 | 2090 | 3.3708 | - |
| 0.0924 | 2100 | 3.4405 | - |
| 0.0928 | 2110 | 3.4141 | - |
| 0.0933 | 2120 | 3.3916 | - |
| 0.0937 | 2130 | 3.36 | - |
| 0.0942 | 2140 | 3.4311 | - |
| 0.0946 | 2150 | 3.3726 | - |
| 0.0950 | 2160 | 3.3535 | - |
| 0.0955 | 2170 | 3.4069 | - |
| 0.0959 | 2180 | 3.4195 | - |
| 0.0964 | 2190 | 3.3888 | - |
| 0.0968 | 2200 | 3.2911 | - |
| 0.0972 | 2210 | 3.3258 | - |
| 0.0977 | 2220 | 3.3438 | - |
| 0.0981 | 2230 | 3.4223 | - |
| 0.0986 | 2240 | 3.3433 | - |
| 0.0990 | 2250 | 3.3387 | - |
| 0.0994 | 2260 | 3.3545 | - |
| 0.0999 | 2270 | 3.2985 | - |
| 0.1001 | 2274 | - | 1.8921 |
| 0.1003 | 2280 | 3.2899 | - |
| 0.1008 | 2290 | 3.3553 | - |
| 0.1012 | 2300 | 3.3363 | - |
| 0.1016 | 2310 | 3.3067 | - |
| 0.1021 | 2320 | 3.3432 | - |
| 0.1025 | 2330 | 3.2771 | - |
| 0.1030 | 2340 | 3.3382 | - |
| 0.1034 | 2350 | 3.3449 | - |
| 0.1038 | 2360 | 3.3152 | - |
| 0.1043 | 2370 | 3.3884 | - |
| 0.1047 | 2380 | 3.3377 | - |
| 0.1052 | 2390 | 3.4296 | - |
| 0.1056 | 2400 | 3.3408 | - |
| 0.1060 | 2410 | 3.3298 | - |
| 0.1065 | 2420 | 3.2049 | - |
| 0.1069 | 2430 | 3.2954 | - |
| 0.1074 | 2440 | 3.3515 | - |
| 0.1078 | 2450 | 3.4093 | - |
| 0.1082 | 2460 | 3.2805 | - |
| 0.1087 | 2470 | 3.3148 | - |
| 0.1091 | 2480 | 3.3447 | - |
| 0.1096 | 2490 | 3.299 | - |
| 0.1100 | 2500 | 3.331 | - |
| 0.1104 | 2510 | 3.3101 | - |
| 0.1109 | 2520 | 3.307 | - |
| 0.1113 | 2530 | 3.2985 | - |
| 0.1118 | 2540 | 3.2943 | - |
| 0.1122 | 2550 | 3.2054 | - |
| 0.1126 | 2560 | 3.2247 | - |
| 0.1131 | 2570 | 3.3231 | - |
| 0.1135 | 2580 | 3.31 | - |
| 0.1140 | 2590 | 3.1949 | - |
| 0.1144 | 2600 | 3.2993 | - |
| 0.1148 | 2610 | 3.3238 | - |
| 0.1153 | 2620 | 3.2747 | - |
| 0.1157 | 2630 | 3.2343 | - |
| 0.1162 | 2640 | 3.1709 | - |
| 0.1166 | 2650 | 3.226 | - |
| 0.1170 | 2660 | 3.1823 | - |
| 0.1175 | 2670 | 3.3017 | - |
| 0.1179 | 2680 | 3.2518 | - |
| 0.1184 | 2690 | 3.267 | - |
| 0.1188 | 2700 | 3.2721 | - |
| 0.1192 | 2710 | 3.2388 | - |
| 0.1197 | 2720 | 3.2943 | - |
| 0.1201 | 2730 | 3.2847 | - |
| 0.1206 | 2740 | 3.2357 | - |
| 0.1210 | 2750 | 3.2395 | - |
| 0.1214 | 2760 | 3.2844 | - |
| 0.1219 | 2770 | 3.2426 | - |
| 0.1223 | 2780 | 3.2321 | - |
| 0.1228 | 2790 | 3.2434 | - |
| 0.1232 | 2800 | 3.2678 | - |
| 0.1236 | 2810 | 3.2168 | - |
| 0.1241 | 2820 | 3.2612 | - |
| 0.1245 | 2830 | 3.2129 | - |
| 0.1250 | 2840 | 3.323 | - |
| 0.1254 | 2850 | 3.2474 | - |
| 0.1258 | 2860 | 3.2582 | - |
| 0.1263 | 2870 | 3.2707 | - |
| 0.1267 | 2880 | 3.1792 | - |
| 0.1272 | 2890 | 3.248 | - |
| 0.1276 | 2900 | 3.1511 | - |
| 0.1280 | 2910 | 3.1873 | - |
| 0.1285 | 2920 | 3.2937 | - |
| 0.1289 | 2930 | 3.329 | - |
| 0.1294 | 2940 | 3.2944 | - |
| 0.1298 | 2950 | 3.277 | - |
| 0.1302 | 2960 | 3.2229 | - |
| 0.1307 | 2970 | 3.2448 | - |
| 0.1311 | 2980 | 3.1787 | - |
| 0.1316 | 2990 | 3.1463 | - |
| 0.1320 | 3000 | 3.2104 | - |
| 0.1324 | 3010 | 3.169 | - |
| 0.1329 | 3020 | 3.2585 | - |
| 0.1333 | 3030 | 3.2772 | - |
| 0.1338 | 3040 | 3.1355 | - |
| 0.1342 | 3050 | 3.1807 | - |
| 0.1346 | 3060 | 3.1542 | - |
| 0.1351 | 3070 | 3.2317 | - |
| 0.1355 | 3080 | 3.1524 | - |
| 0.1360 | 3090 | 3.1785 | - |
| 0.1364 | 3100 | 3.2429 | - |
| 0.1368 | 3110 | 3.2291 | - |
| 0.1373 | 3120 | 3.2127 | - |
| 0.1377 | 3130 | 3.2158 | - |
| 0.1382 | 3140 | 3.1747 | - |
| 0.1386 | 3150 | 3.1273 | - |
| 0.1390 | 3160 | 3.1862 | - |
| 0.1395 | 3170 | 3.215 | - |
| 0.1399 | 3180 | 3.2133 | - |
| 0.1404 | 3190 | 3.1247 | - |
| 0.1408 | 3200 | 3.1276 | - |
| 0.1412 | 3210 | 3.1264 | - |
| 0.1417 | 3220 | 3.1814 | - |
| 0.1421 | 3230 | 3.1948 | - |
| 0.1426 | 3240 | 3.1955 | - |
| 0.1430 | 3250 | 3.1975 | - |
| 0.1434 | 3260 | 3.2944 | - |
| 0.1439 | 3270 | 3.2601 | - |
| 0.1443 | 3280 | 3.2299 | - |
| 0.1448 | 3290 | 3.2304 | - |
| 0.1452 | 3300 | 3.2761 | - |
| 0.1456 | 3310 | 3.2344 | - |
| 0.1461 | 3320 | 3.15 | - |
| 0.1465 | 3330 | 3.1618 | - |
| 0.1470 | 3340 | 3.1848 | - |
| 0.1474 | 3350 | 3.2541 | - |
| 0.1478 | 3360 | 3.2181 | - |
| 0.1483 | 3370 | 3.1528 | - |
| 0.1487 | 3380 | 3.1594 | - |
| 0.1492 | 3390 | 3.2499 | - |
| 0.1496 | 3400 | 3.1627 | - |
| 0.1500 | 3410 | 3.1068 | - |
| 0.1501 | 3411 | - | 1.8237 |
| 0.1505 | 3420 | 3.1432 | - |
| 0.1509 | 3430 | 3.1234 | - |
| 0.1514 | 3440 | 3.135 | - |
| 0.1518 | 3450 | 3.2263 | - |
| 0.1522 | 3460 | 3.2149 | - |
| 0.1527 | 3470 | 3.2167 | - |
| 0.1531 | 3480 | 3.2188 | - |
| 0.1536 | 3490 | 3.1786 | - |
| 0.1540 | 3500 | 3.1897 | - |
| 0.1544 | 3510 | 3.1668 | - |
| 0.1549 | 3520 | 3.1545 | - |
| 0.1553 | 3530 | 3.1791 | - |
| 0.1558 | 3540 | 3.1333 | - |
| 0.1562 | 3550 | 3.2027 | - |
| 0.1566 | 3560 | 3.1903 | - |
| 0.1571 | 3570 | 3.2083 | - |
| 0.1575 | 3580 | 3.1571 | - |
| 0.1580 | 3590 | 3.0979 | - |
| 0.1584 | 3600 | 3.1939 | - |
| 0.1588 | 3610 | 3.2419 | - |
| 0.1593 | 3620 | 3.2895 | - |
| 0.1597 | 3630 | 3.1006 | - |
| 0.1602 | 3640 | 3.1691 | - |
| 0.1606 | 3650 | 3.1694 | - |
| 0.1610 | 3660 | 3.2381 | - |
| 0.1615 | 3670 | 3.246 | - |
| 0.1619 | 3680 | 3.1835 | - |
| 0.1624 | 3690 | 3.0894 | - |
| 0.1628 | 3700 | 3.1258 | - |
| 0.1632 | 3710 | 3.2302 | - |
| 0.1637 | 3720 | 3.1929 | - |
| 0.1641 | 3730 | 3.2028 | - |
| 0.1646 | 3740 | 3.1922 | - |
| 0.1650 | 3750 | 3.1843 | - |
| 0.1654 | 3760 | 3.1302 | - |
| 0.1659 | 3770 | 3.1583 | - |
| 0.1663 | 3780 | 3.2058 | - |
| 0.1668 | 3790 | 3.0715 | - |
| 0.1672 | 3800 | 3.0977 | - |
| 0.1676 | 3810 | 3.0988 | - |
| 0.1681 | 3820 | 3.0889 | - |
| 0.1685 | 3830 | 3.1694 | - |
| 0.1690 | 3840 | 3.1043 | - |
| 0.1694 | 3850 | 3.1153 | - |
| 0.1698 | 3860 | 3.1379 | - |
| 0.1703 | 3870 | 3.1722 | - |
| 0.1707 | 3880 | 3.1078 | - |
| 0.1712 | 3890 | 3.0635 | - |
| 0.1716 | 3900 | 3.1154 | - |
| 0.1720 | 3910 | 3.1195 | - |
| 0.1725 | 3920 | 3.094 | - |
| 0.1729 | 3930 | 3.087 | - |
| 0.1734 | 3940 | 3.1612 | - |
| 0.1738 | 3950 | 3.1344 | - |
| 0.1742 | 3960 | 3.234 | - |
| 0.1747 | 3970 | 3.5787 | - |
| 0.1751 | 3980 | 3.1878 | - |
| 0.1756 | 3990 | 3.0841 | - |
| 0.1760 | 4000 | 3.1308 | - |
| 0.1764 | 4010 | 3.0583 | - |
| 0.1769 | 4020 | 3.0529 | - |
| 0.1773 | 4030 | 3.1005 | - |
| 0.1778 | 4040 | 3.1302 | - |
| 0.1782 | 4050 | 3.0867 | - |
| 0.1787 | 4060 | 3.0806 | - |
| 0.1791 | 4070 | 3.1313 | - |
| 0.1795 | 4080 | 3.0209 | - |
| 0.1800 | 4090 | 3.1377 | - |
| 0.1804 | 4100 | 3.0006 | - |
| 0.1809 | 4110 | 3.1011 | - |
| 0.1813 | 4120 | 3.1383 | - |
| 0.1817 | 4130 | 3.0499 | - |
| 0.1822 | 4140 | 3.0779 | - |
| 0.1826 | 4150 | 3.0954 | - |
| 0.1831 | 4160 | 3.0676 | - |
| 0.1835 | 4170 | 3.0457 | - |
| 0.1839 | 4180 | 3.068 | - |
| 0.1844 | 4190 | 3.1466 | - |
| 0.1848 | 4200 | 3.0883 | - |
| 0.1853 | 4210 | 3.0638 | - |
| 0.1857 | 4220 | 3.071 | - |
| 0.1861 | 4230 | 3.0264 | - |
| 0.1866 | 4240 | 3.0557 | - |
| 0.1870 | 4250 | 3.0106 | - |
| 0.1875 | 4260 | 3.0173 | - |
| 0.1879 | 4270 | 2.9711 | - |
| 0.1883 | 4280 | 3.1116 | - |
| 0.1888 | 4290 | 3.057 | - |
| 0.1892 | 4300 | 3.0097 | - |
| 0.1897 | 4310 | 3.0541 | - |
| 0.1901 | 4320 | 2.9574 | - |
| 0.1905 | 4330 | 3.0306 | - |
| 0.1910 | 4340 | 3.0747 | - |
| 0.1914 | 4350 | 3.0136 | - |
| 0.1919 | 4360 | 3.0541 | - |
| 0.1923 | 4370 | 3.0284 | - |
| 0.1927 | 4380 | 3.0468 | - |
| 0.1932 | 4390 | 3.0191 | - |
| 0.1936 | 4400 | 3.0012 | - |
| 0.1941 | 4410 | 2.9889 | - |
| 0.1945 | 4420 | 3.067 | - |
| 0.1949 | 4430 | 3.0595 | - |
| 0.1954 | 4440 | 3.0287 | - |
| 0.1958 | 4450 | 3.0727 | - |
| 0.1963 | 4460 | 3.0399 | - |
| 0.1967 | 4470 | 2.9577 | - |
| 0.1971 | 4480 | 3.0587 | - |
| 0.1976 | 4490 | 2.9597 | - |
| 0.1980 | 4500 | 3.01 | - |
| 0.1985 | 4510 | 3.0442 | - |
| 0.1989 | 4520 | 2.9393 | - |
| 0.1993 | 4530 | 3.0278 | - |
| 0.1998 | 4540 | 2.8901 | - |
| 0.2001 | 4548 | - | 1.7955 |
| 0.2002 | 4550 | 2.955 | - |
| 0.2007 | 4560 | 3.0005 | - |
| 0.2011 | 4570 | 2.9045 | - |
| 0.2015 | 4580 | 3.0159 | - |
| 0.2020 | 4590 | 2.9778 | - |
| 0.2024 | 4600 | 2.9194 | - |
| 0.2029 | 4610 | 2.9819 | - |
| 0.2033 | 4620 | 2.8893 | - |
| 0.2037 | 4630 | 2.9904 | - |
| 0.2042 | 4640 | 2.9846 | - |
| 0.2046 | 4650 | 3.028 | - |
| 0.2051 | 4660 | 3.047 | - |
| 0.2055 | 4670 | 3.0462 | - |
| 0.2059 | 4680 | 2.9082 | - |
| 0.2064 | 4690 | 3.0016 | - |
| 0.2068 | 4700 | 2.9611 | - |
| 0.2073 | 4710 | 2.8786 | - |
| 0.2077 | 4720 | 2.9202 | - |
| 0.2081 | 4730 | 2.9133 | - |
| 0.2086 | 4740 | 3.0017 | - |
| 0.2090 | 4750 | 2.8931 | - |
| 0.2095 | 4760 | 2.9423 | - |
| 0.2099 | 4770 | 2.9565 | - |
| 0.2103 | 4780 | 2.912 | - |
| 0.2108 | 4790 | 2.9542 | - |
| 0.2112 | 4800 | 2.9813 | - |
| 0.2117 | 4810 | 2.9214 | - |
| 0.2121 | 4820 | 2.9468 | - |
| 0.2125 | 4830 | 2.9535 | - |
| 0.2130 | 4840 | 2.9539 | - |
| 0.2134 | 4850 | 2.9748 | - |
| 0.2139 | 4860 | 2.9134 | - |
| 0.2143 | 4870 | 2.8876 | - |
| 0.2147 | 4880 | 2.8892 | - |
| 0.2152 | 4890 | 2.8991 | - |
| 0.2156 | 4900 | 2.9633 | - |
| 0.2161 | 4910 | 2.9377 | - |
| 0.2165 | 4920 | 2.9128 | - |
| 0.2169 | 4930 | 2.9323 | - |
| 0.2174 | 4940 | 2.9083 | - |
| 0.2178 | 4950 | 2.9329 | - |
| 0.2183 | 4960 | 2.8861 | - |
| 0.2187 | 4970 | 2.9136 | - |
| 0.2191 | 4980 | 2.9142 | - |
| 0.2196 | 4990 | 2.8903 | - |
| 0.2200 | 5000 | 2.8701 | - |
| 0.2205 | 5010 | 2.8072 | - |
| 0.2209 | 5020 | 2.8508 | - |
| 0.2213 | 5030 | 2.9698 | - |
| 0.2218 | 5040 | 2.9334 | - |
| 0.2222 | 5050 | 2.9368 | - |
| 0.2227 | 5060 | 2.917 | - |
| 0.2231 | 5070 | 2.9023 | - |
| 0.2235 | 5080 | 2.9141 | - |
| 0.2240 | 5090 | 2.9003 | - |
| 0.2244 | 5100 | 2.8847 | - |
| 0.2249 | 5110 | 2.8319 | - |
| 0.2253 | 5120 | 2.854 | - |
| 0.2257 | 5130 | 2.8788 | - |
| 0.2262 | 5140 | 2.8399 | - |
| 0.2266 | 5150 | 2.8667 | - |
| 0.2271 | 5160 | 2.8935 | - |
| 0.2275 | 5170 | 2.85 | - |
| 0.2279 | 5180 | 2.8874 | - |
| 0.2284 | 5190 | 2.9649 | - |
| 0.2288 | 5200 | 2.8439 | - |
| 0.2293 | 5210 | 2.9177 | - |
| 0.2297 | 5220 | 2.8992 | - |
| 0.2301 | 5230 | 2.8711 | - |
| 0.2306 | 5240 | 2.799 | - |
| 0.2310 | 5250 | 2.9185 | - |
| 0.2315 | 5260 | 2.8427 | - |
| 0.2319 | 5270 | 2.7905 | - |
| 0.2323 | 5280 | 2.883 | - |
| 0.2328 | 5290 | 2.8292 | - |
| 0.2332 | 5300 | 2.8618 | - |
| 0.2337 | 5310 | 2.8175 | - |
| 0.2341 | 5320 | 2.8074 | - |
| 0.2345 | 5330 | 2.8245 | - |
| 0.2350 | 5340 | 2.8974 | - |
| 0.2354 | 5350 | 2.841 | - |
| 0.2359 | 5360 | 2.8983 | - |
| 0.2363 | 5370 | 2.8141 | - |
| 0.2367 | 5380 | 2.7842 | - |
| 0.2372 | 5390 | 2.7849 | - |
| 0.2376 | 5400 | 2.7416 | - |
| 0.2381 | 5410 | 2.86 | - |
| 0.2385 | 5420 | 2.8711 | - |
| 0.2389 | 5430 | 2.839 | - |
| 0.2394 | 5440 | 2.8244 | - |
| 0.2398 | 5450 | 2.7942 | - |
| 0.2403 | 5460 | 2.8173 | - |
| 0.2407 | 5470 | 2.8413 | - |
| 0.2411 | 5480 | 2.8185 | - |
| 0.2416 | 5490 | 2.8404 | - |
| 0.2420 | 5500 | 2.7627 | - |
| 0.2425 | 5510 | 2.8237 | - |
| 0.2429 | 5520 | 2.8416 | - |
| 0.2433 | 5530 | 2.8288 | - |
| 0.2438 | 5540 | 2.8932 | - |
| 0.2442 | 5550 | 2.8916 | - |
| 0.2447 | 5560 | 2.864 | - |
| 0.2451 | 5570 | 2.7919 | - |
| 0.2455 | 5580 | 2.8545 | - |
| 0.2460 | 5590 | 2.8298 | - |
| 0.2464 | 5600 | 2.7542 | - |
| 0.2469 | 5610 | 2.7379 | - |
| 0.2473 | 5620 | 2.8381 | - |
| 0.2477 | 5630 | 2.9065 | - |
| 0.2482 | 5640 | 2.7571 | - |
| 0.2486 | 5650 | 2.7824 | - |
| 0.2491 | 5660 | 2.8318 | - |
| 0.2495 | 5670 | 2.7792 | - |
| 0.2499 | 5680 | 2.7935 | - |
| 0.2502 | 5685 | - | 1.7503 |
| 0.2504 | 5690 | 2.8 | - |
| 0.2508 | 5700 | 2.7359 | - |
| 0.2513 | 5710 | 2.8177 | - |
| 0.2517 | 5720 | 2.7953 | - |
| 0.2521 | 5730 | 2.7641 | - |
| 0.2526 | 5740 | 2.8042 | - |
| 0.2530 | 5750 | 2.738 | - |
| 0.2535 | 5760 | 2.761 | - |
| 0.2539 | 5770 | 2.7126 | - |
| 0.2543 | 5780 | 2.7502 | - |
| 0.2548 | 5790 | 2.7546 | - |
| 0.2552 | 5800 | 2.789 | - |
| 0.2557 | 5810 | 2.8448 | - |
| 0.2561 | 5820 | 2.779 | - |
| 0.2565 | 5830 | 2.7048 | - |
| 0.2570 | 5840 | 2.6868 | - |
| 0.2574 | 5850 | 2.727 | - |
| 0.2579 | 5860 | 2.7836 | - |
| 0.2583 | 5870 | 2.7101 | - |
| 0.2587 | 5880 | 2.7093 | - |
| 0.2592 | 5890 | 2.734 | - |
| 0.2596 | 5900 | 2.7864 | - |
| 0.2601 | 5910 | 2.7053 | - |
| 0.2605 | 5920 | 2.7824 | - |
| 0.2609 | 5930 | 2.8109 | - |
| 0.2614 | 5940 | 2.7778 | - |
| 0.2618 | 5950 | 2.6806 | - |
| 0.2623 | 5960 | 2.7973 | - |
| 0.2627 | 5970 | 2.7844 | - |
| 0.2631 | 5980 | 2.7301 | - |
| 0.2636 | 5990 | 2.7691 | - |
| 0.2640 | 6000 | 2.6653 | - |
| 0.2645 | 6010 | 2.7424 | - |
| 0.2649 | 6020 | 2.7406 | - |
| 0.2653 | 6030 | 2.7018 | - |
| 0.2658 | 6040 | 2.741 | - |
| 0.2662 | 6050 | 2.7459 | - |
| 0.2667 | 6060 | 2.7394 | - |
| 0.2671 | 6070 | 2.7859 | - |
| 0.2675 | 6080 | 2.675 | - |
| 0.2680 | 6090 | 2.7465 | - |
| 0.2684 | 6100 | 2.8133 | - |
| 0.2689 | 6110 | 2.768 | - |
| 0.2693 | 6120 | 2.7792 | - |
| 0.2697 | 6130 | 2.7898 | - |
| 0.2702 | 6140 | 2.7046 | - |
| 0.2706 | 6150 | 2.7425 | - |
| 0.2711 | 6160 | 2.7018 | - |
| 0.2715 | 6170 | 2.7993 | - |
| 0.2719 | 6180 | 2.7209 | - |
| 0.2724 | 6190 | 2.7522 | - |
| 0.2728 | 6200 | 2.7158 | - |
| 0.2733 | 6210 | 2.6777 | - |
| 0.2737 | 6220 | 2.7328 | - |
| 0.2741 | 6230 | 2.7566 | - |
| 0.2746 | 6240 | 2.6412 | - |
| 0.2750 | 6250 | 2.7031 | - |
| 0.2755 | 6260 | 2.6709 | - |
| 0.2759 | 6270 | 2.7575 | - |
| 0.2763 | 6280 | 2.6936 | - |
| 0.2768 | 6290 | 2.7016 | - |
| 0.2772 | 6300 | 2.7334 | - |
| 0.2777 | 6310 | 2.7926 | - |
| 0.2781 | 6320 | 2.7459 | - |
| 0.2785 | 6330 | 2.6771 | - |
| 0.2790 | 6340 | 2.6905 | - |
| 0.2794 | 6350 | 2.6922 | - |
| 0.2799 | 6360 | 2.6975 | - |
| 0.2803 | 6370 | 2.7242 | - |
| 0.2807 | 6380 | 2.6617 | - |
| 0.2812 | 6390 | 2.7189 | - |
| 0.2816 | 6400 | 2.7561 | - |
| 0.2821 | 6410 | 2.6875 | - |
| 0.2825 | 6420 | 2.6702 | - |
| 0.2829 | 6430 | 2.677 | - |
| 0.2834 | 6440 | 2.6384 | - |
| 0.2838 | 6450 | 2.7081 | - |
| 0.2843 | 6460 | 2.7128 | - |
| 0.2847 | 6470 | 2.7018 | - |
| 0.2851 | 6480 | 2.6551 | - |
| 0.2856 | 6490 | 2.6997 | - |
| 0.2860 | 6500 | 2.7075 | - |
| 0.2865 | 6510 | 2.7774 | - |
| 0.2869 | 6520 | 2.6615 | - |
| 0.2873 | 6530 | 2.7677 | - |
| 0.2878 | 6540 | 2.7219 | - |
| 0.2882 | 6550 | 2.7515 | - |
| 0.2887 | 6560 | 2.761 | - |
| 0.2891 | 6570 | 2.6382 | - |
| 0.2895 | 6580 | 2.6545 | - |
| 0.2900 | 6590 | 2.6677 | - |
| 0.2904 | 6600 | 2.6469 | - |
| 0.2909 | 6610 | 2.679 | - |
| 0.2913 | 6620 | 2.6645 | - |
| 0.2917 | 6630 | 2.7476 | - |
| 0.2922 | 6640 | 2.599 | - |
| 0.2926 | 6650 | 2.6616 | - |
| 0.2931 | 6660 | 2.6904 | - |
| 0.2935 | 6670 | 2.6197 | - |
| 0.2939 | 6680 | 2.6739 | - |
| 0.2944 | 6690 | 2.6517 | - |
| 0.2948 | 6700 | 2.7092 | - |
| 0.2953 | 6710 | 2.6325 | - |
| 0.2957 | 6720 | 2.7366 | - |
| 0.2961 | 6730 | 2.5898 | - |
| 0.2966 | 6740 | 2.6748 | - |
| 0.2970 | 6750 | 2.7128 | - |
| 0.2975 | 6760 | 2.5639 | - |
| 0.2979 | 6770 | 2.7254 | - |
| 0.2983 | 6780 | 2.5829 | - |
| 0.2988 | 6790 | 2.6725 | - |
| 0.2992 | 6800 | 2.66 | - |
| 0.2997 | 6810 | 2.6256 | - |
| 0.3001 | 6820 | 2.5548 | - |
| 0.3002 | 6822 | - | 1.7512 |
| 0.3005 | 6830 | 2.6259 | - |
| 0.3010 | 6840 | 2.7273 | - |
| 0.3014 | 6850 | 2.6848 | - |
| 0.3019 | 6860 | 2.5811 | - |
| 0.3023 | 6870 | 2.643 | - |
| 0.3027 | 6880 | 2.5756 | - |
| 0.3032 | 6890 | 2.648 | - |
| 0.3036 | 6900 | 2.5769 | - |
| 0.3041 | 6910 | 2.5854 | - |
| 0.3045 | 6920 | 2.589 | - |
| 0.3049 | 6930 | 2.6698 | - |
| 0.3054 | 6940 | 2.5703 | - |
| 0.3058 | 6950 | 2.6519 | - |
| 0.3063 | 6960 | 2.5974 | - |
| 0.3067 | 6970 | 2.6398 | - |
| 0.3071 | 6980 | 2.6566 | - |
| 0.3076 | 6990 | 2.6383 | - |
| 0.3080 | 7000 | 2.6297 | - |
| 0.3085 | 7010 | 2.5817 | - |
| 0.3089 | 7020 | 2.632 | - |
| 0.3093 | 7030 | 2.6536 | - |
| 0.3098 | 7040 | 2.6606 | - |
| 0.3102 | 7050 | 2.5936 | - |
| 0.3107 | 7060 | 2.605 | - |
| 0.3111 | 7070 | 2.5671 | - |
| 0.3115 | 7080 | 2.6172 | - |
| 0.3120 | 7090 | 2.5917 | - |
| 0.3124 | 7100 | 2.6574 | - |
| 0.3129 | 7110 | 2.61 | - |
| 0.3133 | 7120 | 2.6355 | - |
| 0.3137 | 7130 | 2.5853 | - |
| 0.3142 | 7140 | 2.5879 | - |
| 0.3146 | 7150 | 2.6295 | - |
| 0.3151 | 7160 | 2.5929 | - |
| 0.3155 | 7170 | 2.5144 | - |
| 0.3159 | 7180 | 2.6094 | - |
| 0.3164 | 7190 | 2.6053 | - |
| 0.3168 | 7200 | 2.6508 | - |
| 0.3173 | 7210 | 2.4983 | - |
| 0.3177 | 7220 | 2.6363 | - |
| 0.3181 | 7230 | 2.5806 | - |
| 0.3186 | 7240 | 2.5851 | - |
| 0.3190 | 7250 | 2.5634 | - |
| 0.3195 | 7260 | 2.5874 | - |
| 0.3199 | 7270 | 2.5645 | - |
| 0.3203 | 7280 | 2.5303 | - |
| 0.3208 | 7290 | 2.6154 | - |
| 0.3212 | 7300 | 2.5939 | - |
| 0.3217 | 7310 | 2.4914 | - |
| 0.3221 | 7320 | 2.5836 | - |
| 0.3225 | 7330 | 2.6024 | - |
| 0.3230 | 7340 | 2.5512 | - |
| 0.3234 | 7350 | 2.5706 | - |
| 0.3239 | 7360 | 2.5158 | - |
| 0.3243 | 7370 | 2.5128 | - |
| 0.3247 | 7380 | 2.5767 | - |
| 0.3252 | 7390 | 2.5246 | - |
| 0.3256 | 7400 | 2.5595 | - |
| 0.3261 | 7410 | 2.532 | - |
| 0.3265 | 7420 | 2.528 | - |
| 0.3269 | 7430 | 2.4752 | - |
| 0.3274 | 7440 | 2.5033 | - |
| 0.3278 | 7450 | 2.6272 | - |
| 0.3283 | 7460 | 2.527 | - |
| 0.3287 | 7470 | 2.4914 | - |
| 0.3291 | 7480 | 2.6452 | - |
| 0.3296 | 7490 | 2.5193 | - |
| 0.3300 | 7500 | 2.4932 | - |
| 0.3305 | 7510 | 2.5131 | - |
| 0.3309 | 7520 | 2.5443 | - |
| 0.3313 | 7530 | 2.5736 | - |
| 0.3318 | 7540 | 2.5781 | - |
| 0.3322 | 7550 | 2.597 | - |
| 0.3327 | 7560 | 2.5257 | - |
| 0.3331 | 7570 | 2.5796 | - |
| 0.3335 | 7580 | 2.5578 | - |
| 0.3340 | 7590 | 2.5428 | - |
| 0.3344 | 7600 | 2.4747 | - |
| 0.3349 | 7610 | 2.5069 | - |
| 0.3353 | 7620 | 2.4651 | - |
| 0.3357 | 7630 | 2.5747 | - |
| 0.3362 | 7640 | 2.5984 | - |
| 0.3366 | 7650 | 2.5524 | - |
| 0.3371 | 7660 | 2.5248 | - |
| 0.3375 | 7670 | 2.5376 | - |
| 0.3379 | 7680 | 2.5771 | - |
| 0.3384 | 7690 | 2.5508 | - |
| 0.3388 | 7700 | 2.6057 | - |
| 0.3393 | 7710 | 2.4919 | - |
| 0.3397 | 7720 | 2.5062 | - |
| 0.3401 | 7730 | 2.472 | - |
| 0.3406 | 7740 | 2.5702 | - |
| 0.3410 | 7750 | 2.5309 | - |
| 0.3415 | 7760 | 2.5172 | - |
| 0.3419 | 7770 | 2.5355 | - |
| 0.3423 | 7780 | 2.5452 | - |
| 0.3428 | 7790 | 2.4959 | - |
| 0.3432 | 7800 | 2.5822 | - |
| 0.3437 | 7810 | 2.4648 | - |
| 0.3441 | 7820 | 2.4875 | - |
| 0.3445 | 7830 | 2.488 | - |
| 0.3450 | 7840 | 2.504 | - |
| 0.3454 | 7850 | 2.4502 | - |
| 0.3459 | 7860 | 2.4832 | - |
| 0.3463 | 7870 | 2.5333 | - |
| 0.3467 | 7880 | 2.5148 | - |
| 0.3472 | 7890 | 2.4968 | - |
| 0.3476 | 7900 | 2.5114 | - |
| 0.3481 | 7910 | 2.6032 | - |
| 0.3485 | 7920 | 2.4245 | - |
| 0.3489 | 7930 | 2.4944 | - |
| 0.3494 | 7940 | 2.5364 | - |
| 0.3498 | 7950 | 2.5045 | - |
| 0.3502 | 7959 | - | 1.6867 |
| 0.3503 | 7960 | 2.4544 | - |
| 0.3507 | 7970 | 2.432 | - |
| 0.3511 | 7980 | 2.4295 | - |
| 0.3516 | 7990 | 2.4436 | - |
| 0.3520 | 8000 | 2.524 | - |
| 0.3525 | 8010 | 2.5537 | - |
| 0.3529 | 8020 | 2.4655 | - |
| 0.3533 | 8030 | 2.4661 | - |
| 0.3538 | 8040 | 2.4245 | - |
| 0.3542 | 8050 | 2.5014 | - |
| 0.3547 | 8060 | 2.5844 | - |
| 0.3551 | 8070 | 2.5683 | - |
| 0.3555 | 8080 | 2.4476 | - |
| 0.3560 | 8090 | 2.5325 | - |
| 0.3564 | 8100 | 2.5194 | - |
| 0.3569 | 8110 | 2.5057 | - |
| 0.3573 | 8120 | 2.5139 | - |
| 0.3577 | 8130 | 2.5152 | - |
| 0.3582 | 8140 | 2.4537 | - |
| 0.3586 | 8150 | 2.4896 | - |
| 0.3591 | 8160 | 2.445 | - |
| 0.3595 | 8170 | 2.5446 | - |
| 0.3599 | 8180 | 2.5018 | - |
| 0.3604 | 8190 | 2.3995 | - |
| 0.3608 | 8200 | 2.4249 | - |
| 0.3613 | 8210 | 2.4941 | - |
| 0.3617 | 8220 | 2.618 | - |
| 0.3621 | 8230 | 2.7269 | - |
| 0.3626 | 8240 | 2.5891 | - |
| 0.3630 | 8250 | 2.5098 | - |
| 0.3635 | 8260 | 2.4639 | - |
| 0.3639 | 8270 | 2.4344 | - |
| 0.3643 | 8280 | 2.4849 | - |
| 0.3648 | 8290 | 2.4547 | - |
| 0.3652 | 8300 | 2.4509 | - |
| 0.3657 | 8310 | 2.4289 | - |
| 0.3661 | 8320 | 2.5457 | - |
| 0.3665 | 8330 | 2.4892 | - |
| 0.3670 | 8340 | 2.488 | - |
| 0.3674 | 8350 | 2.4313 | - |
| 0.3679 | 8360 | 2.4311 | - |
| 0.3683 | 8370 | 2.448 | - |
| 0.3687 | 8380 | 2.4468 | - |
| 0.3692 | 8390 | 2.497 | - |
| 0.3696 | 8400 | 2.4236 | - |
| 0.3701 | 8410 | 2.476 | - |
| 0.3705 | 8420 | 2.5083 | - |
| 0.3709 | 8430 | 2.4692 | - |
| 0.3714 | 8440 | 2.5016 | - |
| 0.3718 | 8450 | 2.4872 | - |
| 0.3723 | 8460 | 2.4453 | - |
| 0.3727 | 8470 | 2.4229 | - |
| 0.3731 | 8480 | 2.4482 | - |
| 0.3736 | 8490 | 2.4095 | - |
| 0.3740 | 8500 | 2.4221 | - |
| 0.3745 | 8510 | 2.4073 | - |
| 0.3749 | 8520 | 2.4462 | - |
| 0.3753 | 8530 | 2.4278 | - |
| 0.3758 | 8540 | 2.4804 | - |
| 0.3762 | 8550 | 2.4622 | - |
| 0.3767 | 8560 | 2.4626 | - |
| 0.3771 | 8570 | 2.3896 | - |
| 0.3775 | 8580 | 2.4613 | - |
| 0.3780 | 8590 | 2.372 | - |
| 0.3784 | 8600 | 2.4439 | - |
| 0.3789 | 8610 | 2.4185 | - |
| 0.3793 | 8620 | 2.4153 | - |
| 0.3797 | 8630 | 2.4377 | - |
| 0.3802 | 8640 | 2.4831 | - |
| 0.3806 | 8650 | 2.5059 | - |
| 0.3811 | 8660 | 2.3586 | - |
| 0.3815 | 8670 | 2.4187 | - |
| 0.3819 | 8680 | 2.4521 | - |
| 0.3824 | 8690 | 2.3921 | - |
| 0.3828 | 8700 | 2.3381 | - |
| 0.3833 | 8710 | 2.3365 | - |
| 0.3837 | 8720 | 2.4051 | - |
| 0.3841 | 8730 | 2.4808 | - |
| 0.3846 | 8740 | 2.4048 | - |
| 0.3850 | 8750 | 2.4582 | - |
| 0.3855 | 8760 | 2.4336 | - |
| 0.3859 | 8770 | 2.4465 | - |
| 0.3863 | 8780 | 2.3616 | - |
| 0.3868 | 8790 | 2.4262 | - |
| 0.3872 | 8800 | 2.3956 | - |
| 0.3877 | 8810 | 2.3254 | - |
| 0.3881 | 8820 | 2.3583 | - |
| 0.3885 | 8830 | 2.3967 | - |
| 0.3890 | 8840 | 2.4775 | - |
| 0.3894 | 8850 | 2.4321 | - |
| 0.3899 | 8860 | 2.4225 | - |
| 0.3903 | 8870 | 2.3912 | - |
| 0.3907 | 8880 | 2.4729 | - |
| 0.3912 | 8890 | 2.4833 | - |
| 0.3916 | 8900 | 2.4556 | - |
| 0.3921 | 8910 | 2.4182 | - |
| 0.3925 | 8920 | 2.5093 | - |
| 0.3929 | 8930 | 2.3859 | - |
| 0.3934 | 8940 | 2.4626 | - |
| 0.3938 | 8950 | 2.4321 | - |
| 0.3943 | 8960 | 2.4709 | - |
| 0.3947 | 8970 | 2.424 | - |
| 0.3951 | 8980 | 2.343 | - |
| 0.3956 | 8990 | 2.4043 | - |
| 0.3960 | 9000 | 2.4239 | - |
| 0.3965 | 9010 | 2.3272 | - |
| 0.3969 | 9020 | 2.4332 | - |
| 0.3973 | 9030 | 2.3799 | - |
| 0.3978 | 9040 | 2.4098 | - |
| 0.3982 | 9050 | 2.3606 | - |
| 0.3987 | 9060 | 2.3919 | - |
| 0.3991 | 9070 | 2.3603 | - |
| 0.3995 | 9080 | 2.4018 | - |
| 0.4000 | 9090 | 2.3785 | - |
| 0.4002 | 9096 | - | 1.6704 |
| 0.4004 | 9100 | 2.3531 | - |
| 0.4009 | 9110 | 2.4391 | - |
| 0.4013 | 9120 | 2.441 | - |
| 0.4017 | 9130 | 2.4498 | - |
| 0.4022 | 9140 | 2.3853 | - |
| 0.4026 | 9150 | 2.3781 | - |
| 0.4031 | 9160 | 2.2869 | - |
| 0.4035 | 9170 | 2.4228 | - |
| 0.4039 | 9180 | 2.3155 | - |
| 0.4044 | 9190 | 2.3749 | - |
| 0.4048 | 9200 | 2.4039 | - |
| 0.4053 | 9210 | 2.4326 | - |
| 0.4057 | 9220 | 2.428 | - |
| 0.4061 | 9230 | 2.3993 | - |
| 0.4066 | 9240 | 2.3684 | - |
| 0.4070 | 9250 | 2.398 | - |
| 0.4075 | 9260 | 2.3256 | - |
| 0.4079 | 9270 | 2.384 | - |
| 0.4083 | 9280 | 2.3798 | - |
| 0.4088 | 9290 | 2.4522 | - |
| 0.4092 | 9300 | 2.3099 | - |
| 0.4097 | 9310 | 2.4492 | - |
| 0.4101 | 9320 | 2.3989 | - |
| 0.4105 | 9330 | 2.4296 | - |
| 0.4110 | 9340 | 2.3987 | - |
| 0.4114 | 9350 | 2.4239 | - |
| 0.4119 | 9360 | 2.3676 | - |
| 0.4123 | 9370 | 2.4248 | - |
| 0.4127 | 9380 | 2.4128 | - |
| 0.4132 | 9390 | 2.3812 | - |
| 0.4136 | 9400 | 2.4627 | - |
| 0.4141 | 9410 | 2.4025 | - |
| 0.4145 | 9420 | 2.4162 | - |
| 0.4149 | 9430 | 2.4433 | - |
| 0.4154 | 9440 | 2.3259 | - |
| 0.4158 | 9450 | 2.3521 | - |
| 0.4163 | 9460 | 2.4114 | - |
| 0.4167 | 9470 | 2.3829 | - |
| 0.4171 | 9480 | 2.402 | - |
| 0.4176 | 9490 | 2.3171 | - |
| 0.4180 | 9500 | 2.3291 | - |
| 0.4185 | 9510 | 2.4218 | - |
| 0.4189 | 9520 | 2.4103 | - |
| 0.4193 | 9530 | 2.4138 | - |
| 0.4198 | 9540 | 2.3116 | - |
| 0.4202 | 9550 | 2.3739 | - |
| 0.4207 | 9560 | 2.3966 | - |
| 0.4211 | 9570 | 2.3498 | - |
| 0.4215 | 9580 | 2.38 | - |
| 0.4220 | 9590 | 2.3787 | - |
| 0.4224 | 9600 | 2.3443 | - |
| 0.4229 | 9610 | 2.2764 | - |
| 0.4233 | 9620 | 2.3685 | - |
| 0.4237 | 9630 | 2.4038 | - |
| 0.4242 | 9640 | 2.3309 | - |
| 0.4246 | 9650 | 2.2911 | - |
| 0.4251 | 9660 | 2.3556 | - |
| 0.4255 | 9670 | 2.3092 | - |
| 0.4259 | 9680 | 2.3333 | - |
| 0.4264 | 9690 | 2.3105 | - |
| 0.4268 | 9700 | 2.3896 | - |
| 0.4273 | 9710 | 2.3765 | - |
| 0.4277 | 9720 | 2.3597 | - |
| 0.4281 | 9730 | 2.3789 | - |
| 0.4286 | 9740 | 2.3524 | - |
| 0.4290 | 9750 | 2.3307 | - |
| 0.4295 | 9760 | 2.3434 | - |
| 0.4299 | 9770 | 2.3482 | - |
| 0.4303 | 9780 | 2.3302 | - |
| 0.4308 | 9790 | 2.385 | - |
| 0.4312 | 9800 | 2.3721 | - |
| 0.4317 | 9810 | 2.3453 | - |
| 0.4321 | 9820 | 2.3311 | - |
| 0.4325 | 9830 | 2.3464 | - |
| 0.4330 | 9840 | 2.3301 | - |
| 0.4334 | 9850 | 2.3336 | - |
| 0.4339 | 9860 | 2.3392 | - |
| 0.4343 | 9870 | 2.353 | - |
| 0.4347 | 9880 | 2.3181 | - |
| 0.4352 | 9890 | 2.3395 | - |
| 0.4356 | 9900 | 2.3888 | - |
| 0.4361 | 9910 | 2.3445 | - |
| 0.4365 | 9920 | 2.3776 | - |
| 0.4369 | 9930 | 2.3673 | - |
| 0.4374 | 9940 | 2.2807 | - |
| 0.4378 | 9950 | 2.3448 | - |
| 0.4383 | 9960 | 2.3262 | - |
| 0.4387 | 9970 | 2.3342 | - |
| 0.4391 | 9980 | 2.3457 | - |
| 0.4396 | 9990 | 2.3395 | - |
| 0.4400 | 10000 | 2.3014 | - |
| 0.4405 | 10010 | 2.2837 | - |
| 0.4409 | 10020 | 2.3655 | - |
| 0.4413 | 10030 | 2.3199 | - |
| 0.4418 | 10040 | 2.2369 | - |
| 0.4422 | 10050 | 2.2882 | - |
| 0.4427 | 10060 | 2.2339 | - |
| 0.4431 | 10070 | 2.328 | - |
| 0.4435 | 10080 | 2.3068 | - |
| 0.4440 | 10090 | 2.2645 | - |
| 0.4444 | 10100 | 2.3818 | - |
| 0.4449 | 10110 | 2.3856 | - |
| 0.4453 | 10120 | 2.2961 | - |
| 0.4457 | 10130 | 2.3333 | - |
| 0.4462 | 10140 | 2.3428 | - |
| 0.4466 | 10150 | 2.2594 | - |
| 0.4471 | 10160 | 2.3017 | - |
| 0.4475 | 10170 | 2.308 | - |
| 0.4479 | 10180 | 2.3405 | - |
| 0.4484 | 10190 | 2.2267 | - |
| 0.4488 | 10200 | 2.3015 | - |
| 0.4493 | 10210 | 2.3121 | - |
| 0.4497 | 10220 | 2.2587 | - |
| 0.4501 | 10230 | 2.3268 | - |
| 0.4503 | 10233 | - | 1.6561 |
| 0.4506 | 10240 | 2.298 | - |
| 0.4510 | 10250 | 2.3505 | - |
| 0.4515 | 10260 | 2.3892 | - |
| 0.4519 | 10270 | 2.458 | - |
| 0.4523 | 10280 | 2.2556 | - |
| 0.4528 | 10290 | 2.2891 | - |
| 0.4532 | 10300 | 2.2728 | - |
| 0.4537 | 10310 | 2.2892 | - |
| 0.4541 | 10320 | 2.2953 | - |
| 0.4545 | 10330 | 2.3029 | - |
| 0.4550 | 10340 | 2.2345 | - |
| 0.4554 | 10350 | 2.2552 | - |
| 0.4559 | 10360 | 2.3035 | - |
| 0.4563 | 10370 | 2.3488 | - |
| 0.4567 | 10380 | 2.276 | - |
| 0.4572 | 10390 | 2.239 | - |
| 0.4576 | 10400 | 2.2913 | - |
| 0.4581 | 10410 | 2.3284 | - |
| 0.4585 | 10420 | 2.2491 | - |
| 0.4589 | 10430 | 2.2429 | - |
| 0.4594 | 10440 | 2.2715 | - |
| 0.4598 | 10450 | 2.2351 | - |
| 0.4603 | 10460 | 2.3041 | - |
| 0.4607 | 10470 | 2.2778 | - |
| 0.4611 | 10480 | 2.3156 | - |
| 0.4616 | 10490 | 2.3188 | - |
| 0.4620 | 10500 | 2.2925 | - |
| 0.4625 | 10510 | 2.2567 | - |
| 0.4629 | 10520 | 2.2646 | - |
| 0.4633 | 10530 | 2.2575 | - |
| 0.4638 | 10540 | 2.2581 | - |
| 0.4642 | 10550 | 2.2815 | - |
| 0.4647 | 10560 | 2.297 | - |
| 0.4651 | 10570 | 2.3325 | - |
| 0.4655 | 10580 | 2.4611 | - |
| 0.4660 | 10590 | 2.423 | - |
| 0.4664 | 10600 | 2.2807 | - |
| 0.4669 | 10610 | 2.2093 | - |
| 0.4673 | 10620 | 2.2237 | - |
| 0.4677 | 10630 | 2.2129 | - |
| 0.4682 | 10640 | 2.2596 | - |
| 0.4686 | 10650 | 2.1446 | - |
| 0.4691 | 10660 | 2.243 | - |
| 0.4695 | 10670 | 2.2383 | - |
| 0.4699 | 10680 | 2.233 | - |
| 0.4704 | 10690 | 2.1491 | - |
| 0.4708 | 10700 | 2.1095 | - |
| 0.4713 | 10710 | 2.2765 | - |
| 0.4717 | 10720 | 2.1988 | - |
| 0.4721 | 10730 | 2.2385 | - |
| 0.4726 | 10740 | 2.2018 | - |
| 0.4730 | 10750 | 2.2159 | - |
| 0.4735 | 10760 | 2.1915 | - |
| 0.4739 | 10770 | 2.2963 | - |
| 0.4743 | 10780 | 2.2944 | - |
| 0.4748 | 10790 | 2.2749 | - |
| 0.4752 | 10800 | 2.2491 | - |
| 0.4757 | 10810 | 2.2406 | - |
| 0.4761 | 10820 | 2.236 | - |
| 0.4765 | 10830 | 2.2486 | - |
| 0.4770 | 10840 | 2.2538 | - |
| 0.4774 | 10850 | 2.2362 | - |
| 0.4779 | 10860 | 2.2184 | - |
| 0.4783 | 10870 | 2.258 | - |
| 0.4787 | 10880 | 2.2502 | - |
| 0.4792 | 10890 | 2.2279 | - |
| 0.4796 | 10900 | 2.2452 | - |
| 0.4801 | 10910 | 2.2283 | - |
| 0.4805 | 10920 | 2.279 | - |
| 0.4809 | 10930 | 2.1799 | - |
| 0.4814 | 10940 | 2.1493 | - |
| 0.4818 | 10950 | 2.243 | - |
| 0.4823 | 10960 | 2.1814 | - |
| 0.4827 | 10970 | 2.1648 | - |
| 0.4831 | 10980 | 2.2533 | - |
| 0.4836 | 10990 | 2.2699 | - |
| 0.4840 | 11000 | 2.2254 | - |
| 0.4845 | 11010 | 2.2608 | - |
| 0.4849 | 11020 | 2.2266 | - |
| 0.4853 | 11030 | 2.2175 | - |
| 0.4858 | 11040 | 2.2818 | - |
| 0.4862 | 11050 | 2.2912 | - |
| 0.4867 | 11060 | 2.2326 | - |
| 0.4871 | 11070 | 2.2147 | - |
| 0.4875 | 11080 | 2.261 | - |
| 0.4880 | 11090 | 2.1797 | - |
| 0.4884 | 11100 | 2.2339 | - |
| 0.4889 | 11110 | 2.2218 | - |
| 0.4893 | 11120 | 2.2002 | - |
| 0.4897 | 11130 | 2.2522 | - |
| 0.4902 | 11140 | 2.2338 | - |
| 0.4906 | 11150 | 2.2076 | - |
| 0.4911 | 11160 | 2.2865 | - |
| 0.4915 | 11170 | 2.2799 | - |
| 0.4919 | 11180 | 2.2942 | - |
| 0.4924 | 11190 | 2.2318 | - |
| 0.4928 | 11200 | 2.2683 | - |
| 0.4933 | 11210 | 2.3292 | - |
| 0.4937 | 11220 | 2.1199 | - |
| 0.4941 | 11230 | 2.3099 | - |
| 0.4946 | 11240 | 2.3124 | - |
| 0.4950 | 11250 | 2.2397 | - |
| 0.4955 | 11260 | 2.1843 | - |
| 0.4959 | 11270 | 2.2832 | - |
| 0.4963 | 11280 | 2.2853 | - |
| 0.4968 | 11290 | 2.2136 | - |
| 0.4972 | 11300 | 2.2506 | - |
| 0.4977 | 11310 | 2.2309 | - |
| 0.4981 | 11320 | 2.2485 | - |
| 0.4985 | 11330 | 2.2212 | - |
| 0.4990 | 11340 | 2.288 | - |
| 0.4994 | 11350 | 2.2405 | - |
| 0.4999 | 11360 | 2.2229 | - |
| 0.5003 | 11370 | 2.2243 | 1.6550 |
| 0.5007 | 11380 | 2.223 | - |
| 0.5012 | 11390 | 2.2095 | - |
| 0.5016 | 11400 | 2.229 | - |
| 0.5021 | 11410 | 2.1573 | - |
| 0.5025 | 11420 | 2.1874 | - |
| 0.5029 | 11430 | 2.2178 | - |
| 0.5034 | 11440 | 2.2216 | - |
| 0.5038 | 11450 | 2.1874 | - |
| 0.5043 | 11460 | 2.2173 | - |
| 0.5047 | 11470 | 2.2863 | - |
| 0.5051 | 11480 | 2.2291 | - |
| 0.5056 | 11490 | 2.2277 | - |
| 0.5060 | 11500 | 2.2268 | - |
| 0.5065 | 11510 | 2.1924 | - |
| 0.5069 | 11520 | 2.1803 | - |
| 0.5073 | 11530 | 2.3353 | - |
| 0.5078 | 11540 | 2.2135 | - |
| 0.5082 | 11550 | 2.2166 | - |
| 0.5087 | 11560 | 2.1964 | - |
| 0.5091 | 11570 | 2.2717 | - |
| 0.5095 | 11580 | 2.1799 | - |
| 0.5100 | 11590 | 2.2374 | - |
| 0.5104 | 11600 | 2.2552 | - |
| 0.5109 | 11610 | 2.2522 | - |
| 0.5113 | 11620 | 2.1857 | - |
| 0.5117 | 11630 | 2.2299 | - |
| 0.5122 | 11640 | 2.2373 | - |
| 0.5126 | 11650 | 2.1962 | - |
| 0.5131 | 11660 | 2.1974 | - |
| 0.5135 | 11670 | 2.2282 | - |
| 0.5139 | 11680 | 2.1123 | - |
| 0.5144 | 11690 | 2.2021 | - |
| 0.5148 | 11700 | 2.2147 | - |
| 0.5153 | 11710 | 2.21 | - |
| 0.5157 | 11720 | 2.242 | - |
| 0.5161 | 11730 | 2.2442 | - |
| 0.5166 | 11740 | 2.4435 | - |
| 0.5170 | 11750 | 2.3149 | - |
| 0.5175 | 11760 | 2.1625 | - |
| 0.5179 | 11770 | 2.1468 | - |
| 0.5183 | 11780 | 2.2297 | - |
| 0.5188 | 11790 | 2.2598 | - |
| 0.5192 | 11800 | 2.1902 | - |
| 0.5197 | 11810 | 2.279 | - |
| 0.5201 | 11820 | 2.2105 | - |
| 0.5205 | 11830 | 2.1977 | - |
| 0.5210 | 11840 | 2.2536 | - |
| 0.5214 | 11850 | 2.2721 | - |
| 0.5219 | 11860 | 2.1815 | - |
| 0.5223 | 11870 | 2.1553 | - |
| 0.5227 | 11880 | 2.2374 | - |
| 0.5232 | 11890 | 2.2503 | - |
| 0.5236 | 11900 | 2.1831 | - |
| 0.5241 | 11910 | 2.1795 | - |
| 0.5245 | 11920 | 2.2131 | - |
| 0.5249 | 11930 | 2.1808 | - |
| 0.5254 | 11940 | 2.1749 | - |
| 0.5258 | 11950 | 2.1886 | - |
| 0.5263 | 11960 | 2.2023 | - |
| 0.5267 | 11970 | 2.2045 | - |
| 0.5271 | 11980 | 2.1567 | - |
| 0.5276 | 11990 | 2.2009 | - |
| 0.5280 | 12000 | 2.2441 | - |
| 0.5285 | 12010 | 2.1719 | - |
| 0.5289 | 12020 | 2.2189 | - |
| 0.5293 | 12030 | 2.2646 | - |
| 0.5298 | 12040 | 2.1829 | - |
| 0.5302 | 12050 | 2.1665 | - |
| 0.5307 | 12060 | 2.2014 | - |
| 0.5311 | 12070 | 2.168 | - |
| 0.5315 | 12080 | 2.2505 | - |
| 0.5320 | 12090 | 2.1841 | - |
| 0.5324 | 12100 | 2.1454 | - |
| 0.5329 | 12110 | 2.1997 | - |
| 0.5333 | 12120 | 2.1324 | - |
| 0.5337 | 12130 | 2.1821 | - |
| 0.5342 | 12140 | 2.218 | - |
| 0.5346 | 12150 | 2.1542 | - |
| 0.5351 | 12160 | 2.2036 | - |
| 0.5355 | 12170 | 2.1698 | - |
| 0.5360 | 12180 | 2.1889 | - |
| 0.5364 | 12190 | 2.1638 | - |
| 0.5368 | 12200 | 2.243 | - |
| 0.5373 | 12210 | 2.1579 | - |
| 0.5377 | 12220 | 2.1528 | - |
| 0.5382 | 12230 | 2.1191 | - |
| 0.5386 | 12240 | 2.1055 | - |
| 0.5390 | 12250 | 2.1879 | - |
| 0.5395 | 12260 | 2.2033 | - |
| 0.5399 | 12270 | 2.1998 | - |
| 0.5404 | 12280 | 2.1193 | - |
| 0.5408 | 12290 | 2.1746 | - |
| 0.5412 | 12300 | 2.1963 | - |
| 0.5417 | 12310 | 2.1488 | - |
| 0.5421 | 12320 | 2.134 | - |
| 0.5426 | 12330 | 2.2197 | - |
| 0.5430 | 12340 | 2.202 | - |
| 0.5434 | 12350 | 2.2213 | - |
| 0.5439 | 12360 | 2.178 | - |
| 0.5443 | 12370 | 2.2152 | - |
| 0.5448 | 12380 | 2.2245 | - |
| 0.5452 | 12390 | 2.1241 | - |
| 0.5456 | 12400 | 2.1852 | - |
| 0.5461 | 12410 | 2.1504 | - |
| 0.5465 | 12420 | 2.1495 | - |
| 0.5470 | 12430 | 2.2413 | - |
| 0.5474 | 12440 | 2.2526 | - |
| 0.5478 | 12450 | 2.1368 | - |
| 0.5483 | 12460 | 2.0786 | - |
| 0.5487 | 12470 | 2.1458 | - |
| 0.5492 | 12480 | 2.2687 | - |
| 0.5496 | 12490 | 2.1685 | - |
| 0.5500 | 12500 | 2.1937 | - |
| 0.5503 | 12507 | - | 1.6362 |
| 0.5505 | 12510 | 2.1266 | - |
| 0.5509 | 12520 | 2.195 | - |
| 0.5514 | 12530 | 2.2274 | - |
| 0.5518 | 12540 | 2.1123 | - |
| 0.5522 | 12550 | 2.1506 | - |
| 0.5527 | 12560 | 2.151 | - |
| 0.5531 | 12570 | 2.1655 | - |
| 0.5536 | 12580 | 2.1755 | - |
| 0.5540 | 12590 | 2.1225 | - |
| 0.5544 | 12600 | 2.1871 | - |
| 0.5549 | 12610 | 2.1216 | - |
| 0.5553 | 12620 | 2.2259 | - |
| 0.5558 | 12630 | 2.1567 | - |
| 0.5562 | 12640 | 2.1801 | - |
| 0.5566 | 12650 | 2.0892 | - |
| 0.5571 | 12660 | 2.2128 | - |
| 0.5575 | 12670 | 2.152 | - |
| 0.5580 | 12680 | 2.0692 | - |
| 0.5584 | 12690 | 2.2158 | - |
| 0.5588 | 12700 | 2.1783 | - |
| 0.5593 | 12710 | 2.0882 | - |
| 0.5597 | 12720 | 2.1339 | - |
| 0.5602 | 12730 | 2.1556 | - |
| 0.5606 | 12740 | 2.1334 | - |
| 0.5610 | 12750 | 2.1542 | - |
| 0.5615 | 12760 | 2.182 | - |
| 0.5619 | 12770 | 2.2081 | - |
| 0.5624 | 12780 | 2.054 | - |
| 0.5628 | 12790 | 2.0703 | - |
| 0.5632 | 12800 | 2.0711 | - |
| 0.5637 | 12810 | 2.102 | - |
| 0.5641 | 12820 | 2.1622 | - |
| 0.5646 | 12830 | 2.1172 | - |
| 0.5650 | 12840 | 2.12 | - |
| 0.5654 | 12850 | 2.1486 | - |
| 0.5659 | 12860 | 2.1639 | - |
| 0.5663 | 12870 | 2.0938 | - |
| 0.5668 | 12880 | 2.0924 | - |
| 0.5672 | 12890 | 2.1483 | - |
| 0.5676 | 12900 | 2.1407 | - |
| 0.5681 | 12910 | 2.1205 | - |
| 0.5685 | 12920 | 2.1487 | - |
| 0.5690 | 12930 | 2.0719 | - |
| 0.5694 | 12940 | 2.1413 | - |
| 0.5698 | 12950 | 2.1933 | - |
| 0.5703 | 12960 | 2.1017 | - |
| 0.5707 | 12970 | 2.0898 | - |
| 0.5712 | 12980 | 2.1855 | - |
| 0.5716 | 12990 | 2.0927 | - |
| 0.5720 | 13000 | 2.1754 | - |
| 0.5725 | 13010 | 2.0582 | - |
| 0.5729 | 13020 | 2.0855 | - |
| 0.5734 | 13030 | 2.1287 | - |
| 0.5738 | 13040 | 2.1392 | - |
| 0.5742 | 13050 | 2.0965 | - |
| 0.5747 | 13060 | 2.0605 | - |
| 0.5751 | 13070 | 2.0129 | - |
| 0.5756 | 13080 | 2.1665 | - |
| 0.5760 | 13090 | 2.1176 | - |
| 0.5764 | 13100 | 2.1114 | - |
| 0.5769 | 13110 | 2.1687 | - |
| 0.5773 | 13120 | 2.1031 | - |
| 0.5778 | 13130 | 2.0653 | - |
| 0.5782 | 13140 | 2.0488 | - |
| 0.5786 | 13150 | 2.0589 | - |
| 0.5791 | 13160 | 2.1508 | - |
| 0.5795 | 13170 | 2.0854 | - |
| 0.5800 | 13180 | 2.1213 | - |
| 0.5804 | 13190 | 2.1037 | - |
| 0.5808 | 13200 | 2.0336 | - |
| 0.5813 | 13210 | 2.0623 | - |
| 0.5817 | 13220 | 2.0997 | - |
| 0.5822 | 13230 | 2.1145 | - |
| 0.5826 | 13240 | 2.0546 | - |
| 0.5830 | 13250 | 2.086 | - |
| 0.5835 | 13260 | 2.1133 | - |
| 0.5839 | 13270 | 2.084 | - |
| 0.5844 | 13280 | 2.018 | - |
| 0.5848 | 13290 | 2.029 | - |
| 0.5852 | 13300 | 2.0729 | - |
| 0.5857 | 13310 | 2.1447 | - |
| 0.5861 | 13320 | 2.0465 | - |
| 0.5866 | 13330 | 2.0705 | - |
| 0.5870 | 13340 | 2.1098 | - |
| 0.5874 | 13350 | 2.0671 | - |
| 0.5879 | 13360 | 2.0269 | - |
| 0.5883 | 13370 | 2.0774 | - |
| 0.5888 | 13380 | 2.0891 | - |
| 0.5892 | 13390 | 2.0304 | - |
| 0.5896 | 13400 | 2.0746 | - |
| 0.5901 | 13410 | 2.1342 | - |
| 0.5905 | 13420 | 2.1643 | - |
| 0.5910 | 13430 | 2.1895 | - |
| 0.5914 | 13440 | 2.0432 | - |
| 0.5918 | 13450 | 2.096 | - |
| 0.5923 | 13460 | 2.1885 | - |
| 0.5927 | 13470 | 2.1114 | - |
| 0.5932 | 13480 | 2.0138 | - |
| 0.5936 | 13490 | 2.0943 | - |
| 0.5940 | 13500 | 2.0797 | - |
| 0.5945 | 13510 | 2.1222 | - |
| 0.5949 | 13520 | 2.0857 | - |
| 0.5954 | 13530 | 1.9979 | - |
| 0.5958 | 13540 | 2.1758 | - |
| 0.5962 | 13550 | 2.1357 | - |
| 0.5967 | 13560 | 2.0915 | - |
| 0.5971 | 13570 | 2.0796 | - |
| 0.5976 | 13580 | 2.0367 | - |
| 0.5980 | 13590 | 2.0731 | - |
| 0.5984 | 13600 | 2.0627 | - |
| 0.5989 | 13610 | 2.0705 | - |
| 0.5993 | 13620 | 2.032 | - |
| 0.5998 | 13630 | 2.0704 | - |
| 0.6002 | 13640 | 2.0142 | - |
| 0.6004 | 13644 | - | 1.6511 |
| 0.6006 | 13650 | 2.0328 | - |
| 0.6011 | 13660 | 2.1207 | - |
| 0.6015 | 13670 | 2.0918 | - |
| 0.6020 | 13680 | 2.0111 | - |
| 0.6024 | 13690 | 1.9967 | - |
| 0.6028 | 13700 | 2.0118 | - |
| 0.6033 | 13710 | 2.1229 | - |
| 0.6037 | 13720 | 2.0852 | - |
| 0.6042 | 13730 | 2.0507 | - |
| 0.6046 | 13740 | 2.1564 | - |
| 0.6050 | 13750 | 2.0733 | - |
| 0.6055 | 13760 | 2.0436 | - |
| 0.6059 | 13770 | 2.0325 | - |
| 0.6064 | 13780 | 2.03 | - |
| 0.6068 | 13790 | 2.0208 | - |
| 0.6072 | 13800 | 2.0705 | - |
| 0.6077 | 13810 | 2.0254 | - |
| 0.6081 | 13820 | 2.0847 | - |
| 0.6086 | 13830 | 2.0665 | - |
| 0.6090 | 13840 | 2.0908 | - |
| 0.6094 | 13850 | 2.0369 | - |
| 0.6099 | 13860 | 2.0613 | - |
| 0.6103 | 13870 | 2.0114 | - |
| 0.6108 | 13880 | 2.0728 | - |
| 0.6112 | 13890 | 2.0345 | - |
| 0.6116 | 13900 | 2.0463 | - |
| 0.6121 | 13910 | 2.0384 | - |
| 0.6125 | 13920 | 1.9778 | - |
| 0.6130 | 13930 | 2.0623 | - |
| 0.6134 | 13940 | 2.0866 | - |
| 0.6138 | 13950 | 2.122 | - |
| 0.6143 | 13960 | 2.0427 | - |
| 0.6147 | 13970 | 1.9975 | - |
| 0.6152 | 13980 | 2.0792 | - |
| 0.6156 | 13990 | 2.0742 | - |
| 0.6160 | 14000 | 2.1509 | - |
| 0.6165 | 14010 | 2.0977 | - |
| 0.6169 | 14020 | 2.1102 | - |
| 0.6174 | 14030 | 2.0786 | - |
| 0.6178 | 14040 | 2.0859 | - |
| 0.6182 | 14050 | 2.0782 | - |
| 0.6187 | 14060 | 2.0807 | - |
| 0.6191 | 14070 | 2.0981 | - |
| 0.6196 | 14080 | 2.1078 | - |
| 0.6200 | 14090 | 2.0824 | - |
| 0.6204 | 14100 | 2.1259 | - |
| 0.6209 | 14110 | 2.0759 | - |
| 0.6213 | 14120 | 2.0787 | - |
| 0.6218 | 14130 | 2.072 | - |
| 0.6222 | 14140 | 2.1007 | - |
| 0.6226 | 14150 | 2.0283 | - |
| 0.6231 | 14160 | 2.0858 | - |
| 0.6235 | 14170 | 2.0461 | - |
| 0.6240 | 14180 | 2.0836 | - |
| 0.6244 | 14190 | 2.0531 | - |
| 0.6248 | 14200 | 2.0524 | - |
| 0.6253 | 14210 | 1.9935 | - |
| 0.6257 | 14220 | 2.0489 | - |
| 0.6262 | 14230 | 2.0534 | - |
| 0.6266 | 14240 | 2.0831 | - |
| 0.6270 | 14250 | 2.0905 | - |
| 0.6275 | 14260 | 2.118 | - |
| 0.6279 | 14270 | 2.1405 | - |
| 0.6284 | 14280 | 1.9983 | - |
| 0.6288 | 14290 | 2.0557 | - |
| 0.6292 | 14300 | 2.0508 | - |
| 0.6297 | 14310 | 2.0976 | - |
| 0.6301 | 14320 | 2.0394 | - |
| 0.6306 | 14330 | 2.0523 | - |
| 0.6310 | 14340 | 2.0442 | - |
| 0.6314 | 14350 | 2.0912 | - |
| 0.6319 | 14360 | 2.0054 | - |
| 0.6323 | 14370 | 2.037 | - |
| 0.6328 | 14380 | 2.0602 | - |
| 0.6332 | 14390 | 2.1039 | - |
| 0.6336 | 14400 | 2.0973 | - |
| 0.6341 | 14410 | 2.0182 | - |
| 0.6345 | 14420 | 2.0342 | - |
| 0.6350 | 14430 | 1.9848 | - |
| 0.6354 | 14440 | 2.0139 | - |
| 0.6358 | 14450 | 2.0791 | - |
| 0.6363 | 14460 | 1.9974 | - |
| 0.6367 | 14470 | 2.0795 | - |
| 0.6372 | 14480 | 1.9788 | - |
| 0.6376 | 14490 | 2.0217 | - |
| 0.6380 | 14500 | 2.0575 | - |
| 0.6385 | 14510 | 2.0404 | - |
| 0.6389 | 14520 | 1.9974 | - |
| 0.6394 | 14530 | 2.0637 | - |
| 0.6398 | 14540 | 2.0414 | - |
| 0.6402 | 14550 | 2.0391 | - |
| 0.6407 | 14560 | 2.0205 | - |
| 0.6411 | 14570 | 2.1212 | - |
| 0.6416 | 14580 | 2.0613 | - |
| 0.6420 | 14590 | 1.9789 | - |
| 0.6424 | 14600 | 2.0583 | - |
| 0.6429 | 14610 | 2.0416 | - |
| 0.6433 | 14620 | 2.0329 | - |
| 0.6438 | 14630 | 2.0317 | - |
| 0.6442 | 14640 | 2.0492 | - |
| 0.6446 | 14650 | 1.98 | - |
| 0.6451 | 14660 | 2.0339 | - |
| 0.6455 | 14670 | 2.0109 | - |
| 0.6460 | 14680 | 1.9918 | - |
| 0.6464 | 14690 | 2.055 | - |
| 0.6468 | 14700 | 2.0407 | - |
| 0.6473 | 14710 | 2.0258 | - |
| 0.6477 | 14720 | 2.0004 | - |
| 0.6482 | 14730 | 1.968 | - |
| 0.6486 | 14740 | 2.0799 | - |
| 0.6490 | 14750 | 1.9926 | - |
| 0.6495 | 14760 | 2.0861 | - |
| 0.6499 | 14770 | 2.0119 | - |
| 0.6504 | 14780 | 1.9994 | - |
| 0.6504 | 14781 | - | 1.6550 |
| 0.6508 | 14790 | 2.0513 | - |
| 0.6512 | 14800 | 1.9457 | - |
| 0.6517 | 14810 | 2.0068 | - |
| 0.6521 | 14820 | 2.0122 | - |
| 0.6526 | 14830 | 1.9853 | - |
| 0.6530 | 14840 | 2.1078 | - |
| 0.6534 | 14850 | 2.0112 | - |
| 0.6539 | 14860 | 2.0081 | - |
| 0.6543 | 14870 | 1.9741 | - |
| 0.6548 | 14880 | 2.0818 | - |
| 0.6552 | 14890 | 2.0318 | - |
| 0.6556 | 14900 | 2.0212 | - |
| 0.6561 | 14910 | 2.0054 | - |
| 0.6565 | 14920 | 2.0354 | - |
| 0.6570 | 14930 | 1.9928 | - |
| 0.6574 | 14940 | 2.0121 | - |
| 0.6578 | 14950 | 2.0528 | - |
| 0.6583 | 14960 | 2.0699 | - |
| 0.6587 | 14970 | 2.0456 | - |
| 0.6592 | 14980 | 2.0132 | - |
| 0.6596 | 14990 | 2.0044 | - |
| 0.6600 | 15000 | 1.9857 | - |
| 0.6605 | 15010 | 1.9661 | - |
| 0.6609 | 15020 | 1.9975 | - |
| 0.6614 | 15030 | 1.9892 | - |
| 0.6618 | 15040 | 2.003 | - |
| 0.6622 | 15050 | 1.985 | - |
| 0.6627 | 15060 | 2.0688 | - |
| 0.6631 | 15070 | 1.999 | - |
| 0.6636 | 15080 | 2.022 | - |
| 0.6640 | 15090 | 1.9856 | - |
| 0.6644 | 15100 | 1.9467 | - |
| 0.6649 | 15110 | 2.0101 | - |
| 0.6653 | 15120 | 2.023 | - |
| 0.6658 | 15130 | 2.0124 | - |
| 0.6662 | 15140 | 1.966 | - |
| 0.6666 | 15150 | 1.9288 | - |
| 0.6671 | 15160 | 2.0111 | - |
| 0.6675 | 15170 | 2.0144 | - |
| 0.6680 | 15180 | 2.0381 | - |
| 0.6684 | 15190 | 2.0387 | - |
| 0.6688 | 15200 | 2.0242 | - |
| 0.6693 | 15210 | 2.0189 | - |
| 0.6697 | 15220 | 1.9769 | - |
| 0.6702 | 15230 | 2.0003 | - |
| 0.6706 | 15240 | 1.9428 | - |
| 0.6710 | 15250 | 1.9705 | - |
| 0.6715 | 15260 | 2.0487 | - |
| 0.6719 | 15270 | 1.9851 | - |
| 0.6724 | 15280 | 1.9971 | - |
| 0.6728 | 15290 | 2.0047 | - |
| 0.6732 | 15300 | 1.9591 | - |
| 0.6737 | 15310 | 2.0125 | - |
| 0.6741 | 15320 | 1.9697 | - |
| 0.6746 | 15330 | 1.9648 | - |
| 0.6750 | 15340 | 1.9851 | - |
| 0.6754 | 15350 | 1.9928 | - |
| 0.6759 | 15360 | 1.9712 | - |
| 0.6763 | 15370 | 2.0227 | - |
| 0.6768 | 15380 | 1.9951 | - |
| 0.6772 | 15390 | 1.9646 | - |
| 0.6776 | 15400 | 1.9851 | - |
| 0.6781 | 15410 | 2.039 | - |
| 0.6785 | 15420 | 1.9756 | - |
| 0.6790 | 15430 | 2.0222 | - |
| 0.6794 | 15440 | 2.004 | - |
| 0.6798 | 15450 | 2.0234 | - |
| 0.6803 | 15460 | 1.969 | - |
| 0.6807 | 15470 | 2.0091 | - |
| 0.6812 | 15480 | 1.9971 | - |
| 0.6816 | 15490 | 1.952 | - |
| 0.6820 | 15500 | 1.9685 | - |
| 0.6825 | 15510 | 2.0028 | - |
| 0.6829 | 15520 | 1.9674 | - |
| 0.6834 | 15530 | 2.0195 | - |
| 0.6838 | 15540 | 2.0071 | - |
| 0.6842 | 15550 | 2.0386 | - |
| 0.6847 | 15560 | 1.9654 | - |
| 0.6851 | 15570 | 1.9931 | - |
| 0.6856 | 15580 | 1.9381 | - |
| 0.6860 | 15590 | 1.9591 | - |
| 0.6864 | 15600 | 1.9999 | - |
| 0.6869 | 15610 | 1.9987 | - |
| 0.6873 | 15620 | 1.9122 | - |
| 0.6878 | 15630 | 1.9122 | - |
| 0.6882 | 15640 | 1.9859 | - |
| 0.6886 | 15650 | 1.956 | - |
| 0.6891 | 15660 | 1.9345 | - |
| 0.6895 | 15670 | 1.9804 | - |
| 0.6900 | 15680 | 2.0369 | - |
| 0.6904 | 15690 | 1.9414 | - |
| 0.6908 | 15700 | 1.9851 | - |
| 0.6913 | 15710 | 1.9641 | - |
| 0.6917 | 15720 | 1.9742 | - |
| 0.6922 | 15730 | 2.0284 | - |
| 0.6926 | 15740 | 2.0152 | - |
| 0.6930 | 15750 | 1.9432 | - |
| 0.6935 | 15760 | 1.9695 | - |
| 0.6939 | 15770 | 1.9856 | - |
| 0.6944 | 15780 | 1.969 | - |
| 0.6948 | 15790 | 2.0506 | - |
| 0.6952 | 15800 | 1.9368 | - |
| 0.6957 | 15810 | 1.9805 | - |
| 0.6961 | 15820 | 1.9444 | - |
| 0.6966 | 15830 | 1.9975 | - |
| 0.6970 | 15840 | 1.9759 | - |
| 0.6974 | 15850 | 1.9677 | - |
| 0.6979 | 15860 | 1.9686 | - |
| 0.6983 | 15870 | 2.0063 | - |
| 0.6988 | 15880 | 1.9882 | - |
| 0.6992 | 15890 | 1.9475 | - |
| 0.6996 | 15900 | 2.0128 | - |
| 0.7001 | 15910 | 1.9166 | - |
| 0.7004 | 15918 | - | 1.6010 |
| 0.7005 | 15920 | 1.9738 | - |
| 0.7010 | 15930 | 2.0058 | - |
| 0.7014 | 15940 | 1.9684 | - |
| 0.7018 | 15950 | 1.9108 | - |
| 0.7023 | 15960 | 1.906 | - |
| 0.7027 | 15970 | 1.9673 | - |
| 0.7032 | 15980 | 1.9322 | - |
| 0.7036 | 15990 | 1.9514 | - |
| 0.7040 | 16000 | 1.9381 | - |
| 0.7045 | 16010 | 2.0138 | - |
| 0.7049 | 16020 | 1.9438 | - |
| 0.7054 | 16030 | 1.9595 | - |
| 0.7058 | 16040 | 1.9554 | - |
| 0.7062 | 16050 | 1.996 | - |
| 0.7067 | 16060 | 1.9162 | - |
| 0.7071 | 16070 | 1.9143 | - |
| 0.7076 | 16080 | 1.9857 | - |
| 0.7080 | 16090 | 1.934 | - |
| 0.7084 | 16100 | 1.9765 | - |
| 0.7089 | 16110 | 1.9235 | - |
| 0.7093 | 16120 | 1.9736 | - |
| 0.7098 | 16130 | 2.0085 | - |
| 0.7102 | 16140 | 1.9905 | - |
| 0.7106 | 16150 | 1.9611 | - |
| 0.7111 | 16160 | 1.9799 | - |
| 0.7115 | 16170 | 1.9702 | - |
| 0.7120 | 16180 | 1.9908 | - |
| 0.7124 | 16190 | 1.9414 | - |
| 0.7128 | 16200 | 1.9245 | - |
| 0.7133 | 16210 | 1.9649 | - |
| 0.7137 | 16220 | 1.9114 | - |
| 0.7142 | 16230 | 1.9446 | - |
| 0.7146 | 16240 | 1.9302 | - |
| 0.7150 | 16250 | 1.9114 | - |
| 0.7155 | 16260 | 1.9277 | - |
| 0.7159 | 16270 | 1.9506 | - |
| 0.7164 | 16280 | 1.8981 | - |
| 0.7168 | 16290 | 1.8898 | - |
| 0.7172 | 16300 | 1.9163 | - |
| 0.7177 | 16310 | 1.9316 | - |
| 0.7181 | 16320 | 1.9798 | - |
| 0.7186 | 16330 | 1.9602 | - |
| 0.7190 | 16340 | 1.977 | - |
| 0.7194 | 16350 | 1.9393 | - |
| 0.7199 | 16360 | 1.9012 | - |
| 0.7203 | 16370 | 1.9297 | - |
| 0.7208 | 16380 | 1.9149 | - |
| 0.7212 | 16390 | 1.9086 | - |
| 0.7216 | 16400 | 1.9328 | - |
| 0.7221 | 16410 | 1.9112 | - |
| 0.7225 | 16420 | 1.9349 | - |
| 0.7230 | 16430 | 1.9219 | - |
| 0.7234 | 16440 | 1.9194 | - |
| 0.7238 | 16450 | 1.9173 | - |
| 0.7243 | 16460 | 1.9888 | - |
| 0.7247 | 16470 | 1.9171 | - |
| 0.7252 | 16480 | 1.9196 | - |
| 0.7256 | 16490 | 1.9456 | - |
| 0.7260 | 16500 | 2.0132 | - |
| 0.7265 | 16510 | 1.905 | - |
| 0.7269 | 16520 | 1.9018 | - |
| 0.7274 | 16530 | 1.9402 | - |
| 0.7278 | 16540 | 1.8879 | - |
| 0.7282 | 16550 | 1.9636 | - |
| 0.7287 | 16560 | 1.9084 | - |
| 0.7291 | 16570 | 1.8942 | - |
| 0.7296 | 16580 | 1.9532 | - |
| 0.7300 | 16590 | 1.8728 | - |
| 0.7304 | 16600 | 1.9016 | - |
| 0.7309 | 16610 | 1.923 | - |
| 0.7313 | 16620 | 1.8552 | - |
| 0.7318 | 16630 | 1.9149 | - |
| 0.7322 | 16640 | 1.9013 | - |
| 0.7326 | 16650 | 1.9101 | - |
| 0.7331 | 16660 | 1.94 | - |
| 0.7335 | 16670 | 1.886 | - |
| 0.7340 | 16680 | 1.9479 | - |
| 0.7344 | 16690 | 1.8639 | - |
| 0.7348 | 16700 | 1.8938 | - |
| 0.7353 | 16710 | 1.9004 | - |
| 0.7357 | 16720 | 1.9364 | - |
| 0.7362 | 16730 | 1.9731 | - |
| 0.7366 | 16740 | 1.9131 | - |
| 0.7370 | 16750 | 1.8727 | - |
| 0.7375 | 16760 | 1.8715 | - |
| 0.7379 | 16770 | 1.9839 | - |
| 0.7384 | 16780 | 2.0026 | - |
| 0.7388 | 16790 | 1.8844 | - |
| 0.7392 | 16800 | 1.892 | - |
| 0.7397 | 16810 | 1.9367 | - |
| 0.7401 | 16820 | 1.9039 | - |
| 0.7406 | 16830 | 1.9148 | - |
| 0.7410 | 16840 | 1.9005 | - |
| 0.7414 | 16850 | 1.859 | - |
| 0.7419 | 16860 | 1.9244 | - |
| 0.7423 | 16870 | 1.9186 | - |
| 0.7428 | 16880 | 1.9045 | - |
| 0.7432 | 16890 | 1.8995 | - |
| 0.7436 | 16900 | 1.925 | - |
| 0.7441 | 16910 | 1.8795 | - |
| 0.7445 | 16920 | 1.9001 | - |
| 0.7450 | 16930 | 1.9489 | - |
| 0.7454 | 16940 | 1.8565 | - |
| 0.7458 | 16950 | 1.914 | - |
| 0.7463 | 16960 | 1.8759 | - |
| 0.7467 | 16970 | 1.8933 | - |
| 0.7472 | 16980 | 1.9254 | - |
| 0.7476 | 16990 | 1.9349 | - |
| 0.7480 | 17000 | 1.9387 | - |
| 0.7485 | 17010 | 1.9317 | - |
| 0.7489 | 17020 | 1.8703 | - |
| 0.7494 | 17030 | 1.8994 | - |
| 0.7498 | 17040 | 1.9459 | - |
| 0.7502 | 17050 | 1.9336 | - |
| 0.7505 | 17055 | - | 1.5676 |
| 0.7507 | 17060 | 1.9267 | - |
| 0.7511 | 17070 | 1.9571 | - |
| 0.7516 | 17080 | 1.9209 | - |
| 0.7520 | 17090 | 1.8401 | - |
| 0.7524 | 17100 | 1.8811 | - |
| 0.7529 | 17110 | 1.9583 | - |
| 0.7533 | 17120 | 1.8986 | - |
| 0.7538 | 17130 | 1.9402 | - |
| 0.7542 | 17140 | 1.9505 | - |
| 0.7546 | 17150 | 1.8932 | - |
| 0.7551 | 17160 | 1.9286 | - |
| 0.7555 | 17170 | 1.8711 | - |
| 0.7560 | 17180 | 1.8566 | - |
| 0.7564 | 17190 | 1.9541 | - |
| 0.7568 | 17200 | 1.9216 | - |
| 0.7573 | 17210 | 1.9025 | - |
| 0.7577 | 17220 | 1.8562 | - |
| 0.7582 | 17230 | 1.8582 | - |
| 0.7586 | 17240 | 1.8472 | - |
| 0.7590 | 17250 | 1.8236 | - |
| 0.7595 | 17260 | 1.8478 | - |
| 0.7599 | 17270 | 1.9184 | - |
| 0.7604 | 17280 | 1.9134 | - |
| 0.7608 | 17290 | 1.9225 | - |
| 0.7612 | 17300 | 1.945 | - |
| 0.7617 | 17310 | 1.88 | - |
| 0.7621 | 17320 | 1.8459 | - |
| 0.7626 | 17330 | 1.9136 | - |
| 0.7630 | 17340 | 1.9802 | - |
| 0.7634 | 17350 | 1.8634 | - |
| 0.7639 | 17360 | 1.8658 | - |
| 0.7643 | 17370 | 1.8964 | - |
| 0.7648 | 17380 | 1.9211 | - |
| 0.7652 | 17390 | 1.8278 | - |
| 0.7656 | 17400 | 1.9097 | - |
| 0.7661 | 17410 | 1.8214 | - |
| 0.7665 | 17420 | 1.8737 | - |
| 0.7670 | 17430 | 1.899 | - |
| 0.7674 | 17440 | 1.846 | - |
| 0.7678 | 17450 | 1.8559 | - |
| 0.7683 | 17460 | 1.82 | - |
| 0.7687 | 17470 | 1.8828 | - |
| 0.7692 | 17480 | 1.8555 | - |
| 0.7696 | 17490 | 1.9132 | - |
| 0.7700 | 17500 | 1.8653 | - |
| 0.7705 | 17510 | 1.9059 | - |
| 0.7709 | 17520 | 1.8282 | - |
| 0.7714 | 17530 | 1.8079 | - |
| 0.7718 | 17540 | 1.8216 | - |
| 0.7722 | 17550 | 1.8722 | - |
| 0.7727 | 17560 | 1.8364 | - |
| 0.7731 | 17570 | 1.8671 | - |
| 0.7736 | 17580 | 1.8983 | - |
| 0.7740 | 17590 | 1.8 | - |
| 0.7744 | 17600 | 1.8803 | - |
| 0.7749 | 17610 | 1.8154 | - |
| 0.7753 | 17620 | 1.8628 | - |
| 0.7758 | 17630 | 1.8952 | - |
| 0.7762 | 17640 | 1.8616 | - |
| 0.7766 | 17650 | 1.8972 | - |
| 0.7771 | 17660 | 1.8594 | - |
| 0.7775 | 17670 | 1.8395 | - |
| 0.7780 | 17680 | 1.8249 | - |
| 0.7784 | 17690 | 1.8314 | - |
| 0.7788 | 17700 | 1.8972 | - |
| 0.7793 | 17710 | 1.9091 | - |
| 0.7797 | 17720 | 1.9162 | - |
| 0.7802 | 17730 | 1.9065 | - |
| 0.7806 | 17740 | 1.9435 | - |
| 0.7810 | 17750 | 1.7767 | - |
| 0.7815 | 17760 | 1.8404 | - |
| 0.7819 | 17770 | 1.8216 | - |
| 0.7824 | 17780 | 1.8375 | - |
| 0.7828 | 17790 | 1.8618 | - |
| 0.7832 | 17800 | 1.9264 | - |
| 0.7837 | 17810 | 1.7724 | - |
| 0.7841 | 17820 | 1.8157 | - |
| 0.7846 | 17830 | 1.8692 | - |
| 0.7850 | 17840 | 1.8779 | - |
| 0.7854 | 17850 | 1.8358 | - |
| 0.7859 | 17860 | 1.7702 | - |
| 0.7863 | 17870 | 1.9372 | - |
| 0.7868 | 17880 | 1.8435 | - |
| 0.7872 | 17890 | 1.8384 | - |
| 0.7876 | 17900 | 1.7599 | - |
| 0.7881 | 17910 | 1.8393 | - |
| 0.7885 | 17920 | 1.7924 | - |
| 0.7890 | 17930 | 1.8769 | - |
| 0.7894 | 17940 | 1.8373 | - |
| 0.7898 | 17950 | 1.8418 | - |
| 0.7903 | 17960 | 1.8635 | - |
| 0.7907 | 17970 | 1.8744 | - |
| 0.7912 | 17980 | 1.8571 | - |
| 0.7916 | 17990 | 1.8606 | - |
| 0.7920 | 18000 | 1.8856 | - |
| 0.7925 | 18010 | 1.8458 | - |
| 0.7929 | 18020 | 1.8481 | - |
| 0.7934 | 18030 | 1.8172 | - |
| 0.7938 | 18040 | 1.8137 | - |
| 0.7942 | 18050 | 1.9036 | - |
| 0.7947 | 18060 | 1.9114 | - |
| 0.7951 | 18070 | 1.7782 | - |
| 0.7956 | 18080 | 1.8684 | - |
| 0.7960 | 18090 | 1.8765 | - |
| 0.7964 | 18100 | 1.8754 | - |
| 0.7969 | 18110 | 1.8458 | - |
| 0.7973 | 18120 | 1.833 | - |
| 0.7978 | 18130 | 1.8893 | - |
| 0.7982 | 18140 | 1.8447 | - |
| 0.7986 | 18150 | 1.8001 | - |
| 0.7991 | 18160 | 1.9099 | - |
| 0.7995 | 18170 | 1.8267 | - |
| 0.8000 | 18180 | 1.8753 | - |
| 0.8004 | 18190 | 1.8016 | - |
| 0.8005 | 18192 | - | 1.5615 |
| 0.8008 | 18200 | 1.7541 | - |
| 0.8013 | 18210 | 1.8432 | - |
| 0.8017 | 18220 | 1.8485 | - |
| 0.8022 | 18230 | 1.8694 | - |
| 0.8026 | 18240 | 1.8499 | - |
| 0.8030 | 18250 | 1.8036 | - |
| 0.8035 | 18260 | 1.8616 | - |
| 0.8039 | 18270 | 1.8231 | - |
| 0.8044 | 18280 | 1.752 | - |
| 0.8048 | 18290 | 1.8418 | - |
| 0.8052 | 18300 | 1.8482 | - |
| 0.8057 | 18310 | 1.8448 | - |
| 0.8061 | 18320 | 1.8132 | - |
| 0.8066 | 18330 | 1.8637 | - |
| 0.8070 | 18340 | 1.7586 | - |
| 0.8074 | 18350 | 1.7852 | - |
| 0.8079 | 18360 | 1.8291 | - |
| 0.8083 | 18370 | 1.8388 | - |
| 0.8088 | 18380 | 1.8504 | - |
| 0.8092 | 18390 | 1.7731 | - |
| 0.8096 | 18400 | 1.9129 | - |
| 0.8101 | 18410 | 1.8155 | - |
| 0.8105 | 18420 | 1.7654 | - |
| 0.8110 | 18430 | 1.8348 | - |
| 0.8114 | 18440 | 1.7973 | - |
| 0.8118 | 18450 | 1.8052 | - |
| 0.8123 | 18460 | 1.8421 | - |
| 0.8127 | 18470 | 1.7896 | - |
| 0.8132 | 18480 | 1.8636 | - |
| 0.8136 | 18490 | 1.7796 | - |
| 0.8140 | 18500 | 1.9163 | - |
| 0.8145 | 18510 | 1.7897 | - |
| 0.8149 | 18520 | 1.8253 | - |
| 0.8154 | 18530 | 1.8305 | - |
| 0.8158 | 18540 | 1.9007 | - |
| 0.8162 | 18550 | 1.8168 | - |
| 0.8167 | 18560 | 1.8045 | - |
| 0.8171 | 18570 | 1.8646 | - |
| 0.8176 | 18580 | 1.8861 | - |
| 0.8180 | 18590 | 1.8134 | - |
| 0.8184 | 18600 | 1.7831 | - |
| 0.8189 | 18610 | 1.8707 | - |
| 0.8193 | 18620 | 1.8086 | - |
| 0.8198 | 18630 | 1.7648 | - |
| 0.8202 | 18640 | 1.8225 | - |
| 0.8206 | 18650 | 1.8026 | - |
| 0.8211 | 18660 | 1.8861 | - |
| 0.8215 | 18670 | 1.8046 | - |
| 0.8220 | 18680 | 1.8015 | - |
| 0.8224 | 18690 | 1.7553 | - |
| 0.8228 | 18700 | 1.7537 | - |
| 0.8233 | 18710 | 1.7866 | - |
| 0.8237 | 18720 | 1.7797 | - |
| 0.8242 | 18730 | 1.8398 | - |
| 0.8246 | 18740 | 1.8304 | - |
| 0.8250 | 18750 | 1.8695 | - |
| 0.8255 | 18760 | 1.8135 | - |
| 0.8259 | 18770 | 1.8341 | - |
| 0.8264 | 18780 | 1.775 | - |
| 0.8268 | 18790 | 1.8677 | - |
| 0.8272 | 18800 | 1.7496 | - |
| 0.8277 | 18810 | 1.7449 | - |
| 0.8281 | 18820 | 1.8395 | - |
| 0.8286 | 18830 | 1.8641 | - |
| 0.8290 | 18840 | 1.7987 | - |
| 0.8294 | 18850 | 1.7412 | - |
| 0.8299 | 18860 | 1.774 | - |
| 0.8303 | 18870 | 1.8424 | - |
| 0.8308 | 18880 | 1.7948 | - |
| 0.8312 | 18890 | 1.8334 | - |
| 0.8316 | 18900 | 1.888 | - |
| 0.8321 | 18910 | 1.7935 | - |
| 0.8325 | 18920 | 1.8035 | - |
| 0.8330 | 18930 | 1.792 | - |
| 0.8334 | 18940 | 1.7599 | - |
| 0.8338 | 18950 | 1.7776 | - |
| 0.8343 | 18960 | 1.7618 | - |
| 0.8347 | 18970 | 1.7845 | - |
| 0.8352 | 18980 | 1.7669 | - |
| 0.8356 | 18990 | 1.8069 | - |
| 0.8360 | 19000 | 1.7698 | - |
| 0.8365 | 19010 | 1.8179 | - |
| 0.8369 | 19020 | 1.7678 | - |
| 0.8374 | 19030 | 1.7974 | - |
| 0.8378 | 19040 | 1.7848 | - |
| 0.8382 | 19050 | 1.8098 | - |
| 0.8387 | 19060 | 1.8252 | - |
| 0.8391 | 19070 | 1.7918 | - |
| 0.8396 | 19080 | 1.8026 | - |
| 0.8400 | 19090 | 1.8011 | - |
| 0.8404 | 19100 | 1.7617 | - |
| 0.8409 | 19110 | 1.8422 | - |
| 0.8413 | 19120 | 1.841 | - |
| 0.8418 | 19130 | 1.7515 | - |
| 0.8422 | 19140 | 1.7376 | - |
| 0.8426 | 19150 | 1.7447 | - |
| 0.8431 | 19160 | 1.733 | - |
| 0.8435 | 19170 | 1.7957 | - |
| 0.8440 | 19180 | 1.81 | - |
| 0.8444 | 19190 | 1.7999 | - |
| 0.8448 | 19200 | 1.8028 | - |
| 0.8453 | 19210 | 1.8225 | - |
| 0.8457 | 19220 | 1.7875 | - |
| 0.8462 | 19230 | 1.8472 | - |
| 0.8466 | 19240 | 1.7315 | - |
| 0.8470 | 19250 | 1.7722 | - |
| 0.8475 | 19260 | 1.7764 | - |
| 0.8479 | 19270 | 1.7705 | - |
| 0.8484 | 19280 | 1.7843 | - |
| 0.8488 | 19290 | 1.807 | - |
| 0.8492 | 19300 | 1.8111 | - |
| 0.8497 | 19310 | 1.8147 | - |
| 0.8501 | 19320 | 1.8467 | - |
| 0.8505 | 19329 | - | 1.5148 |
| 0.8506 | 19330 | 1.7413 | - |
| 0.8510 | 19340 | 1.837 | - |
| 0.8514 | 19350 | 1.727 | - |
| 0.8519 | 19360 | 1.7782 | - |
| 0.8523 | 19370 | 1.7197 | - |
| 0.8528 | 19380 | 1.7461 | - |
| 0.8532 | 19390 | 1.7826 | - |
| 0.8536 | 19400 | 1.8417 | - |
| 0.8541 | 19410 | 1.8094 | - |
| 0.8545 | 19420 | 1.7443 | - |
| 0.8550 | 19430 | 1.7668 | - |
| 0.8554 | 19440 | 1.7869 | - |
| 0.8558 | 19450 | 1.7792 | - |
| 0.8563 | 19460 | 1.7712 | - |
| 0.8567 | 19470 | 1.8001 | - |
| 0.8572 | 19480 | 1.7587 | - |
| 0.8576 | 19490 | 1.7809 | - |
| 0.8580 | 19500 | 1.7866 | - |
| 0.8585 | 19510 | 1.8188 | - |
| 0.8589 | 19520 | 1.7744 | - |
| 0.8594 | 19530 | 1.7683 | - |
| 0.8598 | 19540 | 1.7473 | - |
| 0.8602 | 19550 | 1.8089 | - |
| 0.8607 | 19560 | 1.817 | - |
| 0.8611 | 19570 | 1.7317 | - |
| 0.8616 | 19580 | 1.739 | - |
| 0.8620 | 19590 | 1.7309 | - |
| 0.8624 | 19600 | 1.8112 | - |
| 0.8629 | 19610 | 1.7462 | - |
| 0.8633 | 19620 | 1.8007 | - |
| 0.8638 | 19630 | 1.7153 | - |
| 0.8642 | 19640 | 1.7704 | - |
| 0.8646 | 19650 | 1.7318 | - |
| 0.8651 | 19660 | 1.7531 | - |
| 0.8655 | 19670 | 1.7946 | - |
| 0.8660 | 19680 | 1.7935 | - |
| 0.8664 | 19690 | 1.7794 | - |
| 0.8668 | 19700 | 1.7703 | - |
| 0.8673 | 19710 | 1.7383 | - |
| 0.8677 | 19720 | 1.764 | - |
| 0.8682 | 19730 | 1.7786 | - |
| 0.8686 | 19740 | 1.726 | - |
| 0.8690 | 19750 | 1.7793 | - |
| 0.8695 | 19760 | 1.7449 | - |
| 0.8699 | 19770 | 1.7471 | - |
| 0.8704 | 19780 | 1.7321 | - |
| 0.8708 | 19790 | 1.7575 | - |
| 0.8712 | 19800 | 1.8125 | - |
| 0.8717 | 19810 | 1.812 | - |
| 0.8721 | 19820 | 1.752 | - |
| 0.8726 | 19830 | 1.7595 | - |
| 0.8730 | 19840 | 1.7412 | - |
| 0.8734 | 19850 | 1.7724 | - |
| 0.8739 | 19860 | 1.7666 | - |
| 0.8743 | 19870 | 1.7528 | - |
| 0.8748 | 19880 | 1.7338 | - |
| 0.8752 | 19890 | 1.798 | - |
| 0.8756 | 19900 | 1.8185 | - |
| 0.8761 | 19910 | 1.7647 | - |
| 0.8765 | 19920 | 1.7295 | - |
| 0.8770 | 19930 | 1.7924 | - |
| 0.8774 | 19940 | 1.7428 | - |
| 0.8778 | 19950 | 1.8205 | - |
| 0.8783 | 19960 | 1.8204 | - |
| 0.8787 | 19970 | 1.7778 | - |
| 0.8792 | 19980 | 1.7698 | - |
| 0.8796 | 19990 | 1.7181 | - |
| 0.8800 | 20000 | 1.7802 | - |
| 0.8805 | 20010 | 1.7699 | - |
| 0.8809 | 20020 | 1.7082 | - |
| 0.8814 | 20030 | 1.7828 | - |
| 0.8818 | 20040 | 1.7598 | - |
| 0.8822 | 20050 | 1.7002 | - |
| 0.8827 | 20060 | 1.7462 | - |
| 0.8831 | 20070 | 1.7514 | - |
| 0.8836 | 20080 | 1.7615 | - |
| 0.8840 | 20090 | 1.7357 | - |
| 0.8844 | 20100 | 1.7724 | - |
| 0.8849 | 20110 | 1.781 | - |
| 0.8853 | 20120 | 1.7107 | - |
| 0.8858 | 20130 | 1.7392 | - |
| 0.8862 | 20140 | 1.6857 | - |
| 0.8866 | 20150 | 1.738 | - |
| 0.8871 | 20160 | 1.7456 | - |
| 0.8875 | 20170 | 1.7181 | - |
| 0.8880 | 20180 | 1.728 | - |
| 0.8884 | 20190 | 1.7524 | - |
| 0.8888 | 20200 | 1.757 | - |
| 0.8893 | 20210 | 1.7756 | - |
| 0.8897 | 20220 | 1.7904 | - |
| 0.8902 | 20230 | 1.7905 | - |
| 0.8906 | 20240 | 1.7341 | - |
| 0.8910 | 20250 | 1.7457 | - |
| 0.8915 | 20260 | 1.7085 | - |
| 0.8919 | 20270 | 1.7183 | - |
| 0.8924 | 20280 | 1.7952 | - |
| 0.8928 | 20290 | 1.7555 | - |
| 0.8933 | 20300 | 1.7643 | - |
| 0.8937 | 20310 | 1.7575 | - |
| 0.8941 | 20320 | 1.8018 | - |
| 0.8946 | 20330 | 1.6861 | - |
| 0.8950 | 20340 | 1.7434 | - |
| 0.8955 | 20350 | 1.7578 | - |
| 0.8959 | 20360 | 1.784 | - |
| 0.8963 | 20370 | 1.6991 | - |
| 0.8968 | 20380 | 1.7822 | - |
| 0.8972 | 20390 | 1.7359 | - |
| 0.8977 | 20400 | 1.7536 | - |
| 0.8981 | 20410 | 1.685 | - |
| 0.8985 | 20420 | 1.7435 | - |
| 0.8990 | 20430 | 1.74 | - |
| 0.8994 | 20440 | 1.6932 | - |
| 0.8999 | 20450 | 1.7326 | - |
| 0.9003 | 20460 | 1.7389 | - |
| 0.9006 | 20466 | - | 1.5659 |
| 0.9007 | 20470 | 1.7049 | - |
| 0.9012 | 20480 | 1.7546 | - |
| 0.9016 | 20490 | 1.75 | - |
| 0.9021 | 20500 | 1.7242 | - |
| 0.9025 | 20510 | 1.7383 | - |
| 0.9029 | 20520 | 1.8238 | - |
| 0.9034 | 20530 | 1.7249 | - |
| 0.9038 | 20540 | 1.7586 | - |
| 0.9043 | 20550 | 1.7213 | - |
| 0.9047 | 20560 | 1.7271 | - |
| 0.9051 | 20570 | 1.7467 | - |
| 0.9056 | 20580 | 1.6756 | - |
| 0.9060 | 20590 | 1.7365 | - |
| 0.9065 | 20600 | 1.7897 | - |
| 0.9069 | 20610 | 1.7548 | - |
| 0.9073 | 20620 | 1.7865 | - |
| 0.9078 | 20630 | 1.6873 | - |
| 0.9082 | 20640 | 1.6873 | - |
| 0.9087 | 20650 | 1.7292 | - |
| 0.9091 | 20660 | 1.7395 | - |
| 0.9095 | 20670 | 1.7688 | - |
| 0.9100 | 20680 | 1.7188 | - |
| 0.9104 | 20690 | 1.7244 | - |
| 0.9109 | 20700 | 1.7362 | - |
| 0.9113 | 20710 | 1.7803 | - |
| 0.9117 | 20720 | 1.6902 | - |
| 0.9122 | 20730 | 1.79 | - |
| 0.9126 | 20740 | 1.7739 | - |
| 0.9131 | 20750 | 1.735 | - |
| 0.9135 | 20760 | 1.6805 | - |
| 0.9139 | 20770 | 1.7446 | - |
| 0.9144 | 20780 | 1.7967 | - |
| 0.9148 | 20790 | 1.7478 | - |
| 0.9153 | 20800 | 1.777 | - |
| 0.9157 | 20810 | 1.756 | - |
| 0.9161 | 20820 | 1.7738 | - |
| 0.9166 | 20830 | 1.7288 | - |
| 0.9170 | 20840 | 1.6982 | - |
| 0.9175 | 20850 | 1.707 | - |
| 0.9179 | 20860 | 1.748 | - |
| 0.9183 | 20870 | 1.7194 | - |
| 0.9188 | 20880 | 1.7428 | - |
| 0.9192 | 20890 | 1.716 | - |
| 0.9197 | 20900 | 1.7279 | - |
| 0.9201 | 20910 | 1.7387 | - |
| 0.9205 | 20920 | 1.7488 | - |
| 0.9210 | 20930 | 1.701 | - |
| 0.9214 | 20940 | 1.7126 | - |
| 0.9219 | 20950 | 1.7416 | - |
| 0.9223 | 20960 | 1.7723 | - |
| 0.9227 | 20970 | 1.7044 | - |
| 0.9232 | 20980 | 1.7429 | - |
| 0.9236 | 20990 | 1.7344 | - |
| 0.9241 | 21000 | 1.7799 | - |
| 0.9245 | 21010 | 1.7437 | - |
| 0.9249 | 21020 | 1.7143 | - |
| 0.9254 | 21030 | 1.7263 | - |
| 0.9258 | 21040 | 1.6835 | - |
| 0.9263 | 21050 | 1.684 | - |
| 0.9267 | 21060 | 1.7164 | - |
| 0.9271 | 21070 | 1.7237 | - |
| 0.9276 | 21080 | 1.733 | - |
| 0.9280 | 21090 | 1.6971 | - |
| 0.9285 | 21100 | 1.7094 | - |
| 0.9289 | 21110 | 1.7141 | - |
| 0.9293 | 21120 | 1.6635 | - |
| 0.9298 | 21130 | 1.6956 | - |
| 0.9302 | 21140 | 1.6918 | - |
| 0.9307 | 21150 | 1.768 | - |
| 0.9311 | 21160 | 1.7473 | - |
| 0.9315 | 21170 | 1.7332 | - |
| 0.9320 | 21180 | 1.7504 | - |
| 0.9324 | 21190 | 1.7022 | - |
| 0.9329 | 21200 | 1.6398 | - |
| 0.9333 | 21210 | 1.6898 | - |
| 0.9337 | 21220 | 1.745 | - |
| 0.9342 | 21230 | 1.7418 | - |
| 0.9346 | 21240 | 1.7308 | - |
| 0.9351 | 21250 | 1.7091 | - |
| 0.9355 | 21260 | 1.7052 | - |
| 0.9359 | 21270 | 1.6847 | - |
| 0.9364 | 21280 | 1.7309 | - |
| 0.9368 | 21290 | 1.7568 | - |
| 0.9373 | 21300 | 1.6818 | - |
| 0.9377 | 21310 | 1.6938 | - |
| 0.9381 | 21320 | 1.7351 | - |
| 0.9386 | 21330 | 1.6788 | - |
| 0.9390 | 21340 | 1.6727 | - |
| 0.9395 | 21350 | 1.6749 | - |
| 0.9399 | 21360 | 1.6577 | - |
| 0.9403 | 21370 | 1.7146 | - |
| 0.9408 | 21380 | 1.6958 | - |
| 0.9412 | 21390 | 1.705 | - |
| 0.9417 | 21400 | 1.6555 | - |
| 0.9421 | 21410 | 1.732 | - |
| 0.9425 | 21420 | 1.739 | - |
| 0.9430 | 21430 | 1.7517 | - |
| 0.9434 | 21440 | 1.7185 | - |
| 0.9439 | 21450 | 1.6613 | - |
| 0.9443 | 21460 | 1.7243 | - |
| 0.9447 | 21470 | 1.7739 | - |
| 0.9452 | 21480 | 1.6779 | - |
| 0.9456 | 21490 | 1.6934 | - |
| 0.9461 | 21500 | 1.7542 | - |
| 0.9465 | 21510 | 1.7099 | - |
| 0.9469 | 21520 | 1.7137 | - |
| 0.9474 | 21530 | 1.7286 | - |
| 0.9478 | 21540 | 1.7231 | - |
| 0.9483 | 21550 | 1.7182 | - |
| 0.9487 | 21560 | 1.6938 | - |
| 0.9491 | 21570 | 1.7649 | - |
| 0.9496 | 21580 | 1.7201 | - |
| 0.9500 | 21590 | 1.6845 | - |
| 0.9505 | 21600 | 1.6983 | - |
| 0.9506 | 21603 | - | 1.5557 |
| 0.9509 | 21610 | 1.6874 | - |
| 0.9513 | 21620 | 1.6564 | - |
| 0.9518 | 21630 | 1.6851 | - |
| 0.9522 | 21640 | 1.6712 | - |
| 0.9527 | 21650 | 1.6958 | - |
| 0.9531 | 21660 | 1.6831 | - |
| 0.9535 | 21670 | 1.7378 | - |
| 0.9540 | 21680 | 1.742 | - |
| 0.9544 | 21690 | 1.7139 | - |
| 0.9549 | 21700 | 1.783 | - |
| 0.9553 | 21710 | 1.7006 | - |
| 0.9557 | 21720 | 1.7022 | - |
| 0.9562 | 21730 | 1.5912 | - |
| 0.9566 | 21740 | 1.7013 | - |
| 0.9571 | 21750 | 1.6654 | - |
| 0.9575 | 21760 | 1.7027 | - |
| 0.9579 | 21770 | 1.6858 | - |
| 0.9584 | 21780 | 1.6601 | - |
| 0.9588 | 21790 | 1.6907 | - |
| 0.9593 | 21800 | 1.7123 | - |
| 0.9597 | 21810 | 1.6935 | - |
| 0.9601 | 21820 | 1.702 | - |
| 0.9606 | 21830 | 1.6678 | - |
| 0.9610 | 21840 | 1.7244 | - |
| 0.9615 | 21850 | 1.7523 | - |
| 0.9619 | 21860 | 1.6928 | - |
| 0.9623 | 21870 | 1.6641 | - |
| 0.9628 | 21880 | 1.6427 | - |
| 0.9632 | 21890 | 1.6817 | - |
| 0.9637 | 21900 | 1.6663 | - |
| 0.9641 | 21910 | 1.7587 | - |
| 0.9645 | 21920 | 1.6881 | - |
| 0.9650 | 21930 | 1.7287 | - |
| 0.9654 | 21940 | 1.7327 | - |
| 0.9659 | 21950 | 1.7048 | - |
| 0.9663 | 21960 | 1.6799 | - |
| 0.9667 | 21970 | 1.6927 | - |
| 0.9672 | 21980 | 1.725 | - |
| 0.9676 | 21990 | 1.6678 | - |
| 0.9681 | 22000 | 1.7152 | - |
| 0.9685 | 22010 | 1.6702 | - |
| 0.9689 | 22020 | 1.7111 | - |
| 0.9694 | 22030 | 1.7045 | - |
| 0.9698 | 22040 | 1.7205 | - |
| 0.9703 | 22050 | 1.7132 | - |
| 0.9707 | 22060 | 1.683 | - |
| 0.9711 | 22070 | 1.6894 | - |
| 0.9716 | 22080 | 1.6673 | - |
| 0.9720 | 22090 | 1.6615 | - |
| 0.9725 | 22100 | 1.676 | - |
| 0.9729 | 22110 | 1.682 | - |
| 0.9733 | 22120 | 1.6953 | - |
| 0.9738 | 22130 | 1.6266 | - |
| 0.9742 | 22140 | 1.7501 | - |
| 0.9747 | 22150 | 1.7364 | - |
| 0.9751 | 22160 | 1.683 | - |
| 0.9755 | 22170 | 1.7098 | - |
| 0.9760 | 22180 | 1.6807 | - |
| 0.9764 | 22190 | 1.6944 | - |
| 0.9769 | 22200 | 1.6351 | - |
| 0.9773 | 22210 | 1.7202 | - |
| 0.9777 | 22220 | 1.6849 | - |
| 0.9782 | 22230 | 1.6461 | - |
| 0.9786 | 22240 | 1.6318 | - |
| 0.9791 | 22250 | 1.6644 | - |
| 0.9795 | 22260 | 1.6302 | - |
| 0.9799 | 22270 | 1.6398 | - |
| 0.9804 | 22280 | 1.7222 | - |
| 0.9808 | 22290 | 1.7678 | - |
| 0.9813 | 22300 | 1.6438 | - |
| 0.9817 | 22310 | 1.6607 | - |
| 0.9821 | 22320 | 1.6955 | - |
| 0.9826 | 22330 | 1.6424 | - |
| 0.9830 | 22340 | 1.6555 | - |
| 0.9835 | 22350 | 1.6481 | - |
| 0.9839 | 22360 | 1.6649 | - |
| 0.9843 | 22370 | 1.7239 | - |
| 0.9848 | 22380 | 1.7024 | - |
| 0.9852 | 22390 | 1.6426 | - |
| 0.9857 | 22400 | 1.7082 | - |
| 0.9861 | 22410 | 1.6465 | - |
| 0.9865 | 22420 | 1.7059 | - |
| 0.9870 | 22430 | 1.6484 | - |
| 0.9874 | 22440 | 1.7004 | - |
| 0.9879 | 22450 | 1.682 | - |
| 0.9883 | 22460 | 1.663 | - |
| 0.9887 | 22470 | 1.7186 | - |
| 0.9892 | 22480 | 1.6622 | - |
| 0.9896 | 22490 | 1.6895 | - |
| 0.9901 | 22500 | 1.6893 | - |
| 0.9905 | 22510 | 1.6527 | - |
| 0.9909 | 22520 | 1.6462 | - |
| 0.9914 | 22530 | 1.7192 | - |
| 0.9918 | 22540 | 1.6883 | - |
| 0.9923 | 22550 | 1.6261 | - |
| 0.9927 | 22560 | 1.6477 | - |
| 0.9931 | 22570 | 1.6856 | - |
| 0.9936 | 22580 | 1.6427 | - |
| 0.9940 | 22590 | 1.6723 | - |
| 0.9945 | 22600 | 1.6706 | - |
| 0.9949 | 22610 | 1.6391 | - |
| 0.9953 | 22620 | 1.6861 | - |
| 0.9958 | 22630 | 1.6388 | - |
| 0.9962 | 22640 | 1.6668 | - |
| 0.9967 | 22650 | 1.6732 | - |
| 0.9971 | 22660 | 1.7444 | - |
| 0.9975 | 22670 | 1.6241 | - |
| 0.9980 | 22680 | 1.673 | - |
| 0.9984 | 22690 | 1.6565 | - |
| 0.9989 | 22700 | 1.6076 | - |
| 0.9993 | 22710 | 1.5716 | - |
| 0.9997 | 22720 | 1.656 | - |
| 1.0002 | 22730 | 1.5846 | - |
| 1.0006 | 22740 | 1.6155 | 1.5586 |
| 1.0011 | 22750 | 1.5894 | - |
| 1.0015 | 22760 | 1.664 | - |
| 1.0019 | 22770 | 1.6272 | - |
| 1.0024 | 22780 | 1.5798 | - |
| 1.0028 | 22790 | 1.6216 | - |
| 1.0033 | 22800 | 1.6389 | - |
| 1.0037 | 22810 | 1.6825 | - |
| 1.0041 | 22820 | 1.6871 | - |
| 1.0046 | 22830 | 1.6127 | - |
| 1.0050 | 22840 | 1.6145 | - |
| 1.0055 | 22850 | 1.6165 | - |
| 1.0059 | 22860 | 1.6061 | - |
| 1.0063 | 22870 | 1.6167 | - |
| 1.0068 | 22880 | 1.6383 | - |
| 1.0072 | 22890 | 1.626 | - |
| 1.0077 | 22900 | 1.5837 | - |
| 1.0081 | 22910 | 1.595 | - |
| 1.0085 | 22920 | 1.6577 | - |
| 1.0090 | 22930 | 1.6312 | - |
| 1.0094 | 22940 | 1.6584 | - |
| 1.0099 | 22950 | 1.6219 | - |
| 1.0103 | 22960 | 1.5435 | - |
| 1.0107 | 22970 | 1.6176 | - |
| 1.0112 | 22980 | 1.5628 | - |
| 1.0116 | 22990 | 1.6404 | - |
| 1.0121 | 23000 | 1.6436 | - |
| 1.0125 | 23010 | 1.6794 | - |
| 1.0129 | 23020 | 1.5755 | - |
| 1.0134 | 23030 | 1.633 | - |
| 1.0138 | 23040 | 1.6051 | - |
| 1.0143 | 23050 | 1.5989 | - |
| 1.0147 | 23060 | 1.6019 | - |
| 1.0151 | 23070 | 1.6456 | - |
| 1.0156 | 23080 | 1.6421 | - |
| 1.0160 | 23090 | 1.5864 | - |
| 1.0165 | 23100 | 1.5927 | - |
| 1.0169 | 23110 | 1.5462 | - |
| 1.0173 | 23120 | 1.5672 | - |
| 1.0178 | 23130 | 1.6073 | - |
| 1.0182 | 23140 | 1.637 | - |
| 1.0187 | 23150 | 1.609 | - |
| 1.0191 | 23160 | 1.6224 | - |
| 1.0195 | 23170 | 1.6331 | - |
| 1.0200 | 23180 | 1.5989 | - |
| 1.0204 | 23190 | 1.6537 | - |
| 1.0209 | 23200 | 1.5852 | - |
| 1.0213 | 23210 | 1.5784 | - |
| 1.0217 | 23220 | 1.6571 | - |
| 1.0222 | 23230 | 1.5607 | - |
| 1.0226 | 23240 | 1.5708 | - |
| 1.0231 | 23250 | 1.638 | - |
| 1.0235 | 23260 | 1.6183 | - |
| 1.0239 | 23270 | 1.6298 | - |
| 1.0244 | 23280 | 1.6077 | - |
| 1.0248 | 23290 | 1.6341 | - |
| 1.0253 | 23300 | 1.5449 | - |
| 1.0257 | 23310 | 1.5858 | - |
| 1.0261 | 23320 | 1.6523 | - |
| 1.0266 | 23330 | 1.5675 | - |
| 1.0270 | 23340 | 1.5781 | - |
| 1.0275 | 23350 | 1.5752 | - |
| 1.0279 | 23360 | 1.6639 | - |
| 1.0283 | 23370 | 1.6219 | - |
| 1.0288 | 23380 | 1.6102 | - |
| 1.0292 | 23390 | 1.5822 | - |
| 1.0297 | 23400 | 1.5894 | - |
| 1.0301 | 23410 | 1.6236 | - |
| 1.0305 | 23420 | 1.6078 | - |
| 1.0310 | 23430 | 1.5646 | - |
| 1.0314 | 23440 | 1.5686 | - |
| 1.0319 | 23450 | 1.6226 | - |
| 1.0323 | 23460 | 1.6077 | - |
| 1.0327 | 23470 | 1.6261 | - |
| 1.0332 | 23480 | 1.5964 | - |
| 1.0336 | 23490 | 1.5754 | - |
| 1.0341 | 23500 | 1.6041 | - |
| 1.0345 | 23510 | 1.6038 | - |
| 1.0349 | 23520 | 1.6242 | - |
| 1.0354 | 23530 | 1.6389 | - |
| 1.0358 | 23540 | 1.6102 | - |
| 1.0363 | 23550 | 1.6563 | - |
| 1.0367 | 23560 | 1.6206 | - |
| 1.0371 | 23570 | 1.6093 | - |
| 1.0376 | 23580 | 1.58 | - |
| 1.0380 | 23590 | 1.6329 | - |
| 1.0385 | 23600 | 1.6063 | - |
| 1.0389 | 23610 | 1.5582 | - |
| 1.0393 | 23620 | 1.6299 | - |
| 1.0398 | 23630 | 1.5943 | - |
| 1.0402 | 23640 | 1.5983 | - |
| 1.0407 | 23650 | 1.5919 | - |
| 1.0411 | 23660 | 1.6356 | - |
| 1.0415 | 23670 | 1.6778 | - |
| 1.0420 | 23680 | 1.6047 | - |
| 1.0424 | 23690 | 1.6156 | - |
| 1.0429 | 23700 | 1.6299 | - |
| 1.0433 | 23710 | 1.5966 | - |
| 1.0437 | 23720 | 1.5609 | - |
| 1.0442 | 23730 | 1.5333 | - |
| 1.0446 | 23740 | 1.5445 | - |
| 1.0451 | 23750 | 1.6083 | - |
| 1.0455 | 23760 | 1.5686 | - |
| 1.0459 | 23770 | 1.6096 | - |
| 1.0464 | 23780 | 1.5213 | - |
| 1.0468 | 23790 | 1.5709 | - |
| 1.0473 | 23800 | 1.6091 | - |
| 1.0477 | 23810 | 1.6004 | - |
| 1.0481 | 23820 | 1.571 | - |
| 1.0486 | 23830 | 1.6229 | - |
| 1.0490 | 23840 | 1.6111 | - |
| 1.0495 | 23850 | 1.6658 | - |
| 1.0499 | 23860 | 1.6398 | - |
| 1.0503 | 23870 | 1.5838 | - |
| 1.0506 | 23877 | - | 1.5838 |
| 1.0508 | 23880 | 1.6513 | - |
| 1.0512 | 23890 | 1.5825 | - |
| 1.0517 | 23900 | 1.5546 | - |
| 1.0521 | 23910 | 1.661 | - |
| 1.0525 | 23920 | 1.5327 | - |
| 1.0530 | 23930 | 1.5597 | - |
| 1.0534 | 23940 | 1.6161 | - |
| 1.0539 | 23950 | 1.5688 | - |
| 1.0543 | 23960 | 1.4822 | - |
| 1.0547 | 23970 | 1.5688 | - |
| 1.0552 | 23980 | 1.5853 | - |
| 1.0556 | 23990 | 1.5897 | - |
| 1.0561 | 24000 | 1.58 | - |
| 1.0565 | 24010 | 1.6115 | - |
| 1.0569 | 24020 | 1.5621 | - |
| 1.0574 | 24030 | 1.5731 | - |
| 1.0578 | 24040 | 1.6084 | - |
| 1.0583 | 24050 | 1.5941 | - |
| 1.0587 | 24060 | 1.636 | - |
| 1.0591 | 24070 | 1.6195 | - |
| 1.0596 | 24080 | 1.5591 | - |
| 1.0600 | 24090 | 1.5727 | - |
| 1.0605 | 24100 | 1.6293 | - |
| 1.0609 | 24110 | 1.5979 | - |
| 1.0613 | 24120 | 1.6034 | - |
| 1.0618 | 24130 | 1.5808 | - |
| 1.0622 | 24140 | 1.5912 | - |
| 1.0627 | 24150 | 1.5821 | - |
| 1.0631 | 24160 | 1.5517 | - |
| 1.0635 | 24170 | 1.5667 | - |
| 1.0640 | 24180 | 1.6848 | - |
| 1.0644 | 24190 | 1.6575 | - |
| 1.0649 | 24200 | 1.6135 | - |
| 1.0653 | 24210 | 1.577 | - |
| 1.0657 | 24220 | 1.6022 | - |
| 1.0662 | 24230 | 1.6088 | - |
| 1.0666 | 24240 | 1.5619 | - |
| 1.0671 | 24250 | 1.609 | - |
| 1.0675 | 24260 | 1.5263 | - |
| 1.0679 | 24270 | 1.6639 | - |
| 1.0684 | 24280 | 1.529 | - |
| 1.0688 | 24290 | 1.5617 | - |
| 1.0693 | 24300 | 1.6523 | - |
| 1.0697 | 24310 | 1.6064 | - |
| 1.0701 | 24320 | 1.6213 | - |
| 1.0706 | 24330 | 1.5709 | - |
| 1.0710 | 24340 | 1.558 | - |
| 1.0715 | 24350 | 1.6251 | - |
| 1.0719 | 24360 | 1.567 | - |
| 1.0723 | 24370 | 1.5582 | - |
| 1.0728 | 24380 | 1.571 | - |
| 1.0732 | 24390 | 1.5574 | - |
| 1.0737 | 24400 | 1.5773 | - |
| 1.0741 | 24410 | 1.5377 | - |
| 1.0745 | 24420 | 1.5495 | - |
| 1.0750 | 24430 | 1.6313 | - |
| 1.0754 | 24440 | 1.5946 | - |
| 1.0759 | 24450 | 1.5544 | - |
| 1.0763 | 24460 | 1.5671 | - |
| 1.0767 | 24470 | 1.6169 | - |
| 1.0772 | 24480 | 1.5978 | - |
| 1.0776 | 24490 | 1.547 | - |
| 1.0781 | 24500 | 1.593 | - |
| 1.0785 | 24510 | 1.5184 | - |
| 1.0789 | 24520 | 1.5649 | - |
| 1.0794 | 24530 | 1.6023 | - |
| 1.0798 | 24540 | 1.539 | - |
| 1.0803 | 24550 | 1.5698 | - |
| 1.0807 | 24560 | 1.6108 | - |
| 1.0811 | 24570 | 1.538 | - |
| 1.0816 | 24580 | 1.5991 | - |
| 1.0820 | 24590 | 1.5727 | - |
| 1.0825 | 24600 | 1.5733 | - |
| 1.0829 | 24610 | 1.5921 | - |
| 1.0833 | 24620 | 1.5663 | - |
| 1.0838 | 24630 | 1.5519 | - |
| 1.0842 | 24640 | 1.5981 | - |
| 1.0847 | 24650 | 1.6053 | - |
| 1.0851 | 24660 | 1.6398 | - |
| 1.0855 | 24670 | 1.6241 | - |
| 1.0860 | 24680 | 1.5833 | - |
| 1.0864 | 24690 | 1.5416 | - |
| 1.0869 | 24700 | 1.5838 | - |
| 1.0873 | 24710 | 1.5521 | - |
| 1.0877 | 24720 | 1.5813 | - |
| 1.0882 | 24730 | 1.6061 | - |
| 1.0886 | 24740 | 1.5673 | - |
| 1.0891 | 24750 | 1.5791 | - |
| 1.0895 | 24760 | 1.6384 | - |
| 1.0899 | 24770 | 1.5555 | - |
| 1.0904 | 24780 | 1.5682 | - |
| 1.0908 | 24790 | 1.5832 | - |
| 1.0913 | 24800 | 1.5829 | - |
| 1.0917 | 24810 | 1.6027 | - |
| 1.0921 | 24820 | 1.6238 | - |
| 1.0926 | 24830 | 1.578 | - |
| 1.0930 | 24840 | 1.5684 | - |
| 1.0935 | 24850 | 1.5562 | - |
| 1.0939 | 24860 | 1.5686 | - |
| 1.0943 | 24870 | 1.5537 | - |
| 1.0948 | 24880 | 1.5987 | - |
| 1.0952 | 24890 | 1.5808 | - |
| 1.0957 | 24900 | 1.5316 | - |
| 1.0961 | 24910 | 1.573 | - |
| 1.0965 | 24920 | 1.5939 | - |
| 1.0970 | 24930 | 1.6022 | - |
| 1.0974 | 24940 | 1.5498 | - |
| 1.0979 | 24950 | 1.5125 | - |
| 1.0983 | 24960 | 1.5304 | - |
| 1.0987 | 24970 | 1.5748 | - |
| 1.0992 | 24980 | 1.563 | - |
| 1.0996 | 24990 | 1.5458 | - |
| 1.1001 | 25000 | 1.5121 | - |
| 1.1005 | 25010 | 1.5332 | - |
| 1.1007 | 25014 | - | 1.5314 |
| 1.1009 | 25020 | 1.5743 | - |
| 1.1014 | 25030 | 1.5818 | - |
| 1.1018 | 25040 | 1.5752 | - |
| 1.1023 | 25050 | 1.5558 | - |
| 1.1027 | 25060 | 1.5288 | - |
| 1.1031 | 25070 | 1.6382 | - |
| 1.1036 | 25080 | 1.6479 | - |
| 1.1040 | 25090 | 1.595 | - |
| 1.1045 | 25100 | 1.508 | - |
| 1.1049 | 25110 | 1.5791 | - |
| 1.1053 | 25120 | 1.5317 | - |
| 1.1058 | 25130 | 1.6043 | - |
| 1.1062 | 25140 | 1.5714 | - |
| 1.1067 | 25150 | 1.539 | - |
| 1.1071 | 25160 | 1.5765 | - |
| 1.1075 | 25170 | 1.5377 | - |
| 1.1080 | 25180 | 1.5805 | - |
| 1.1084 | 25190 | 1.5595 | - |
| 1.1089 | 25200 | 1.5551 | - |
| 1.1093 | 25210 | 1.5584 | - |
| 1.1097 | 25220 | 1.55 | - |
| 1.1102 | 25230 | 1.5349 | - |
| 1.1106 | 25240 | 1.442 | - |
| 1.1111 | 25250 | 1.5366 | - |
| 1.1115 | 25260 | 1.5484 | - |
| 1.1119 | 25270 | 1.5347 | - |
| 1.1124 | 25280 | 1.5725 | - |
| 1.1128 | 25290 | 1.5698 | - |
| 1.1133 | 25300 | 1.6015 | - |
| 1.1137 | 25310 | 1.5378 | - |
| 1.1141 | 25320 | 1.5662 | - |
| 1.1146 | 25330 | 1.5458 | - |
| 1.1150 | 25340 | 1.539 | - |
| 1.1155 | 25350 | 1.5545 | - |
| 1.1159 | 25360 | 1.4799 | - |
| 1.1163 | 25370 | 1.5101 | - |
| 1.1168 | 25380 | 1.5322 | - |
| 1.1172 | 25390 | 1.5509 | - |
| 1.1177 | 25400 | 1.6249 | - |
| 1.1181 | 25410 | 1.5188 | - |
| 1.1185 | 25420 | 1.5324 | - |
| 1.1190 | 25430 | 1.5394 | - |
| 1.1194 | 25440 | 1.5214 | - |
| 1.1199 | 25450 | 1.5182 | - |
| 1.1203 | 25460 | 1.5995 | - |
| 1.1207 | 25470 | 1.5705 | - |
| 1.1212 | 25480 | 1.5038 | - |
| 1.1216 | 25490 | 1.5399 | - |
| 1.1221 | 25500 | 1.5695 | - |
| 1.1225 | 25510 | 1.5111 | - |
| 1.1229 | 25520 | 1.553 | - |
| 1.1234 | 25530 | 1.5808 | - |
| 1.1238 | 25540 | 1.5617 | - |
| 1.1243 | 25550 | 1.6142 | - |
| 1.1247 | 25560 | 1.5168 | - |
| 1.1251 | 25570 | 1.5328 | - |
| 1.1256 | 25580 | 1.5462 | - |
| 1.1260 | 25590 | 1.5271 | - |
| 1.1265 | 25600 | 1.6445 | - |
| 1.1269 | 25610 | 1.5348 | - |
| 1.1273 | 25620 | 1.5479 | - |
| 1.1278 | 25630 | 1.5485 | - |
| 1.1282 | 25640 | 1.5601 | - |
| 1.1287 | 25650 | 1.5352 | - |
| 1.1291 | 25660 | 1.5161 | - |
| 1.1295 | 25670 | 1.5358 | - |
| 1.1300 | 25680 | 1.5807 | - |
| 1.1304 | 25690 | 1.6604 | - |
| 1.1309 | 25700 | 1.5012 | - |
| 1.1313 | 25710 | 1.5671 | - |
| 1.1317 | 25720 | 1.5661 | - |
| 1.1322 | 25730 | 1.5375 | - |
| 1.1326 | 25740 | 1.5744 | - |
| 1.1331 | 25750 | 1.5689 | - |
| 1.1335 | 25760 | 1.5973 | - |
| 1.1339 | 25770 | 1.5024 | - |
| 1.1344 | 25780 | 1.5421 | - |
| 1.1348 | 25790 | 1.5502 | - |
| 1.1353 | 25800 | 1.5725 | - |
| 1.1357 | 25810 | 1.6019 | - |
| 1.1361 | 25820 | 1.5618 | - |
| 1.1366 | 25830 | 1.5758 | - |
| 1.1370 | 25840 | 1.5102 | - |
| 1.1375 | 25850 | 1.5665 | - |
| 1.1379 | 25860 | 1.5621 | - |
| 1.1383 | 25870 | 1.5545 | - |
| 1.1388 | 25880 | 1.5682 | - |
| 1.1392 | 25890 | 1.5397 | - |
| 1.1397 | 25900 | 1.5156 | - |
| 1.1401 | 25910 | 1.4805 | - |
| 1.1405 | 25920 | 1.508 | - |
| 1.1410 | 25930 | 1.5694 | - |
| 1.1414 | 25940 | 1.5639 | - |
| 1.1419 | 25950 | 1.5375 | - |
| 1.1423 | 25960 | 1.5747 | - |
| 1.1427 | 25970 | 1.5025 | - |
| 1.1432 | 25980 | 1.5259 | - |
| 1.1436 | 25990 | 1.5286 | - |
| 1.1441 | 26000 | 1.4884 | - |
| 1.1445 | 26010 | 1.5334 | - |
| 1.1449 | 26020 | 1.5859 | - |
| 1.1454 | 26030 | 1.6257 | - |
| 1.1458 | 26040 | 1.552 | - |
| 1.1463 | 26050 | 1.5315 | - |
| 1.1467 | 26060 | 1.558 | - |
| 1.1471 | 26070 | 1.5516 | - |
| 1.1476 | 26080 | 1.5276 | - |
| 1.1480 | 26090 | 1.5249 | - |
| 1.1485 | 26100 | 1.5358 | - |
| 1.1489 | 26110 | 1.5064 | - |
| 1.1493 | 26120 | 1.5066 | - |
| 1.1498 | 26130 | 1.5581 | - |
| 1.1502 | 26140 | 1.5587 | - |
| 1.1507 | 26150 | 1.5254 | - |
| 1.1507 | 26151 | - | 1.5403 |
| 1.1511 | 26160 | 1.5899 | - |
| 1.1515 | 26170 | 1.6296 | - |
| 1.1520 | 26180 | 1.5497 | - |
| 1.1524 | 26190 | 1.6028 | - |
| 1.1529 | 26200 | 1.5345 | - |
| 1.1533 | 26210 | 1.5784 | - |
| 1.1537 | 26220 | 1.5382 | - |
| 1.1542 | 26230 | 1.5197 | - |
| 1.1546 | 26240 | 1.604 | - |
| 1.1551 | 26250 | 1.5698 | - |
| 1.1555 | 26260 | 1.5455 | - |
| 1.1559 | 26270 | 1.538 | - |
| 1.1564 | 26280 | 1.5195 | - |
| 1.1568 | 26290 | 1.5518 | - |
| 1.1573 | 26300 | 1.5052 | - |
| 1.1577 | 26310 | 1.5586 | - |
| 1.1581 | 26320 | 1.5595 | - |
| 1.1586 | 26330 | 1.5454 | - |
| 1.1590 | 26340 | 1.6054 | - |
| 1.1595 | 26350 | 1.5232 | - |
| 1.1599 | 26360 | 1.5796 | - |
| 1.1603 | 26370 | 1.5537 | - |
| 1.1608 | 26380 | 1.5197 | - |
| 1.1612 | 26390 | 1.5146 | - |
| 1.1617 | 26400 | 1.4997 | - |
| 1.1621 | 26410 | 1.5534 | - |
| 1.1625 | 26420 | 1.5527 | - |
| 1.1630 | 26430 | 1.5386 | - |
| 1.1634 | 26440 | 1.5212 | - |
| 1.1639 | 26450 | 1.5005 | - |
| 1.1643 | 26460 | 1.5312 | - |
| 1.1647 | 26470 | 1.5638 | - |
| 1.1652 | 26480 | 1.4781 | - |
| 1.1656 | 26490 | 1.5198 | - |
| 1.1661 | 26500 | 1.5852 | - |
| 1.1665 | 26510 | 1.5192 | - |
| 1.1669 | 26520 | 1.5029 | - |
| 1.1674 | 26530 | 1.4692 | - |
| 1.1678 | 26540 | 1.4464 | - |
| 1.1683 | 26550 | 1.5214 | - |
| 1.1687 | 26560 | 1.46 | - |
| 1.1691 | 26570 | 1.5423 | - |
| 1.1696 | 26580 | 1.5396 | - |
| 1.1700 | 26590 | 1.5609 | - |
| 1.1705 | 26600 | 1.5281 | - |
| 1.1709 | 26610 | 1.5499 | - |
| 1.1713 | 26620 | 1.594 | - |
| 1.1718 | 26630 | 1.5283 | - |
| 1.1722 | 26640 | 1.5919 | - |
| 1.1727 | 26650 | 1.5299 | - |
| 1.1731 | 26660 | 1.5332 | - |
| 1.1735 | 26670 | 1.5466 | - |
| 1.1740 | 26680 | 1.5043 | - |
| 1.1744 | 26690 | 1.5474 | - |
| 1.1749 | 26700 | 1.4992 | - |
| 1.1753 | 26710 | 1.5334 | - |
| 1.1757 | 26720 | 1.4716 | - |
| 1.1762 | 26730 | 1.5314 | - |
| 1.1766 | 26740 | 1.5406 | - |
| 1.1771 | 26750 | 1.517 | - |
| 1.1775 | 26760 | 1.4782 | - |
| 1.1779 | 26770 | 1.5761 | - |
| 1.1784 | 26780 | 1.5395 | - |
| 1.1788 | 26790 | 1.5022 | - |
| 1.1793 | 26800 | 1.4723 | - |
| 1.1797 | 26810 | 1.5059 | - |
| 1.1801 | 26820 | 1.5512 | - |
| 1.1806 | 26830 | 1.472 | - |
| 1.1810 | 26840 | 1.5354 | - |
| 1.1815 | 26850 | 1.5695 | - |
| 1.1819 | 26860 | 1.4729 | - |
| 1.1823 | 26870 | 1.4322 | - |
| 1.1828 | 26880 | 1.519 | - |
| 1.1832 | 26890 | 1.5342 | - |
| 1.1837 | 26900 | 1.5213 | - |
| 1.1841 | 26910 | 1.4996 | - |
| 1.1845 | 26920 | 1.524 | - |
| 1.1850 | 26930 | 1.5831 | - |
| 1.1854 | 26940 | 1.4951 | - |
| 1.1859 | 26950 | 1.4982 | - |
| 1.1863 | 26960 | 1.4833 | - |
| 1.1867 | 26970 | 1.5268 | - |
| 1.1872 | 26980 | 1.557 | - |
| 1.1876 | 26990 | 1.5278 | - |
| 1.1881 | 27000 | 1.5375 | - |
| 1.1885 | 27010 | 1.5099 | - |
| 1.1889 | 27020 | 1.5191 | - |
| 1.1894 | 27030 | 1.5206 | - |
| 1.1898 | 27040 | 1.4592 | - |
| 1.1903 | 27050 | 1.5455 | - |
| 1.1907 | 27060 | 1.5448 | - |
| 1.1911 | 27070 | 1.5377 | - |
| 1.1916 | 27080 | 1.5127 | - |
| 1.1920 | 27090 | 1.5417 | - |
| 1.1925 | 27100 | 1.5707 | - |
| 1.1929 | 27110 | 1.5224 | - |
| 1.1933 | 27120 | 1.4661 | - |
| 1.1938 | 27130 | 1.5125 | - |
| 1.1942 | 27140 | 1.5433 | - |
| 1.1947 | 27150 | 1.5019 | - |
| 1.1951 | 27160 | 1.5221 | - |
| 1.1955 | 27170 | 1.4745 | - |
| 1.1960 | 27180 | 1.4602 | - |
| 1.1964 | 27190 | 1.5021 | - |
| 1.1969 | 27200 | 1.5022 | - |
| 1.1973 | 27210 | 1.5399 | - |
| 1.1977 | 27220 | 1.4776 | - |
| 1.1982 | 27230 | 1.4812 | - |
| 1.1986 | 27240 | 1.5186 | - |
| 1.1991 | 27250 | 1.5503 | - |
| 1.1995 | 27260 | 1.5308 | - |
| 1.1999 | 27270 | 1.5146 | - |
| 1.2004 | 27280 | 1.483 | - |
| 1.2007 | 27288 | - | 1.5249 |
| 1.2008 | 27290 | 1.4959 | - |
| 1.2013 | 27300 | 1.5327 | - |
| 1.2017 | 27310 | 1.5524 | - |
| 1.2021 | 27320 | 1.51 | - |
| 1.2026 | 27330 | 1.4816 | - |
| 1.2030 | 27340 | 1.5598 | - |
| 1.2035 | 27350 | 1.4903 | - |
| 1.2039 | 27360 | 1.5492 | - |
| 1.2043 | 27370 | 1.5216 | - |
| 1.2048 | 27380 | 1.5226 | - |
| 1.2052 | 27390 | 1.5485 | - |
| 1.2057 | 27400 | 1.5003 | - |
| 1.2061 | 27410 | 1.4854 | - |
| 1.2065 | 27420 | 1.459 | - |
| 1.2070 | 27430 | 1.4907 | - |
| 1.2074 | 27440 | 1.4451 | - |
| 1.2079 | 27450 | 1.4867 | - |
| 1.2083 | 27460 | 1.5078 | - |
| 1.2087 | 27470 | 1.5509 | - |
| 1.2092 | 27480 | 1.5315 | - |
| 1.2096 | 27490 | 1.4643 | - |
| 1.2101 | 27500 | 1.4728 | - |
| 1.2105 | 27510 | 1.4716 | - |
| 1.2109 | 27520 | 1.5411 | - |
| 1.2114 | 27530 | 1.499 | - |
| 1.2118 | 27540 | 1.5291 | - |
| 1.2123 | 27550 | 1.5318 | - |
| 1.2127 | 27560 | 1.5371 | - |
| 1.2131 | 27570 | 1.5087 | - |
| 1.2136 | 27580 | 1.5023 | - |
| 1.2140 | 27590 | 1.4815 | - |
| 1.2145 | 27600 | 1.5566 | - |
| 1.2149 | 27610 | 1.5223 | - |
| 1.2153 | 27620 | 1.5145 | - |
| 1.2158 | 27630 | 1.5318 | - |
| 1.2162 | 27640 | 1.4781 | - |
| 1.2167 | 27650 | 1.4755 | - |
| 1.2171 | 27660 | 1.4004 | - |
| 1.2175 | 27670 | 1.4974 | - |
| 1.2180 | 27680 | 1.5118 | - |
| 1.2184 | 27690 | 1.5281 | - |
| 1.2189 | 27700 | 1.5068 | - |
| 1.2193 | 27710 | 1.4435 | - |
| 1.2197 | 27720 | 1.4722 | - |
| 1.2202 | 27730 | 1.4818 | - |
| 1.2206 | 27740 | 1.4994 | - |
| 1.2211 | 27750 | 1.4888 | - |
| 1.2215 | 27760 | 1.4943 | - |
| 1.2219 | 27770 | 1.5474 | - |
| 1.2224 | 27780 | 1.4982 | - |
| 1.2228 | 27790 | 1.5354 | - |
| 1.2233 | 27800 | 1.5473 | - |
| 1.2237 | 27810 | 1.5395 | - |
| 1.2241 | 27820 | 1.5548 | - |
| 1.2246 | 27830 | 1.518 | - |
| 1.2250 | 27840 | 1.4738 | - |
| 1.2255 | 27850 | 1.4477 | - |
| 1.2259 | 27860 | 1.5478 | - |
| 1.2263 | 27870 | 1.5161 | - |
| 1.2268 | 27880 | 1.5016 | - |
| 1.2272 | 27890 | 1.4857 | - |
| 1.2277 | 27900 | 1.5142 | - |
| 1.2281 | 27910 | 1.4935 | - |
| 1.2285 | 27920 | 1.5488 | - |
| 1.2290 | 27930 | 1.4733 | - |
| 1.2294 | 27940 | 1.4386 | - |
| 1.2299 | 27950 | 1.5798 | - |
| 1.2303 | 27960 | 1.4593 | - |
| 1.2307 | 27970 | 1.5343 | - |
| 1.2312 | 27980 | 1.4595 | - |
| 1.2316 | 27990 | 1.4699 | - |
| 1.2321 | 28000 | 1.5538 | - |
| 1.2325 | 28010 | 1.4872 | - |
| 1.2329 | 28020 | 1.5248 | - |
| 1.2334 | 28030 | 1.4839 | - |
| 1.2338 | 28040 | 1.486 | - |
| 1.2343 | 28050 | 1.4502 | - |
| 1.2347 | 28060 | 1.4716 | - |
| 1.2351 | 28070 | 1.4728 | - |
| 1.2356 | 28080 | 1.461 | - |
| 1.2360 | 28090 | 1.489 | - |
| 1.2365 | 28100 | 1.4606 | - |
| 1.2369 | 28110 | 1.4919 | - |
| 1.2373 | 28120 | 1.5063 | - |
| 1.2378 | 28130 | 1.4836 | - |
| 1.2382 | 28140 | 1.4986 | - |
| 1.2387 | 28150 | 1.451 | - |
| 1.2391 | 28160 | 1.5152 | - |
| 1.2395 | 28170 | 1.5438 | - |
| 1.2400 | 28180 | 1.4313 | - |
| 1.2404 | 28190 | 1.5188 | - |
| 1.2409 | 28200 | 1.5321 | - |
| 1.2413 | 28210 | 1.4912 | - |
| 1.2417 | 28220 | 1.505 | - |
| 1.2422 | 28230 | 1.5087 | - |
| 1.2426 | 28240 | 1.4345 | - |
| 1.2431 | 28250 | 1.4074 | - |
| 1.2435 | 28260 | 1.5118 | - |
| 1.2439 | 28270 | 1.4833 | - |
| 1.2444 | 28280 | 1.4951 | - |
| 1.2448 | 28290 | 1.4328 | - |
| 1.2453 | 28300 | 1.5184 | - |
| 1.2457 | 28310 | 1.4643 | - |
| 1.2461 | 28320 | 1.4246 | - |
| 1.2466 | 28330 | 1.4355 | - |
| 1.2470 | 28340 | 1.5257 | - |
| 1.2475 | 28350 | 1.4811 | - |
| 1.2479 | 28360 | 1.4853 | - |
| 1.2483 | 28370 | 1.4736 | - |
| 1.2488 | 28380 | 1.4907 | - |
| 1.2492 | 28390 | 1.4797 | - |
| 1.2497 | 28400 | 1.4412 | - |
| 1.2501 | 28410 | 1.4927 | - |
| 1.2506 | 28420 | 1.4616 | - |
| 1.2508 | 28425 | - | 1.5616 |
| 1.2510 | 28430 | 1.5473 | - |
| 1.2514 | 28440 | 1.4752 | - |
| 1.2519 | 28450 | 1.4587 | - |
| 1.2523 | 28460 | 1.4909 | - |
| 1.2528 | 28470 | 1.5132 | - |
| 1.2532 | 28480 | 1.4678 | - |
| 1.2536 | 28490 | 1.4836 | - |
| 1.2541 | 28500 | 1.3888 | - |
| 1.2545 | 28510 | 1.5269 | - |
| 1.2550 | 28520 | 1.5596 | - |
| 1.2554 | 28530 | 1.4718 | - |
| 1.2558 | 28540 | 1.514 | - |
| 1.2563 | 28550 | 1.5507 | - |
| 1.2567 | 28560 | 1.4828 | - |
| 1.2572 | 28570 | 1.4465 | - |
| 1.2576 | 28580 | 1.518 | - |
| 1.2580 | 28590 | 1.4096 | - |
| 1.2585 | 28600 | 1.443 | - |
| 1.2589 | 28610 | 1.4665 | - |
| 1.2594 | 28620 | 1.4606 | - |
| 1.2598 | 28630 | 1.4536 | - |
| 1.2602 | 28640 | 1.4904 | - |
| 1.2607 | 28650 | 1.5067 | - |
| 1.2611 | 28660 | 1.4778 | - |
| 1.2616 | 28670 | 1.4444 | - |
| 1.2620 | 28680 | 1.5134 | - |
| 1.2624 | 28690 | 1.4689 | - |
| 1.2629 | 28700 | 1.4615 | - |
| 1.2633 | 28710 | 1.459 | - |
| 1.2638 | 28720 | 1.4847 | - |
| 1.2642 | 28730 | 1.4464 | - |
| 1.2646 | 28740 | 1.4751 | - |
| 1.2651 | 28750 | 1.4457 | - |
| 1.2655 | 28760 | 1.5031 | - |
| 1.2660 | 28770 | 1.4892 | - |
| 1.2664 | 28780 | 1.4987 | - |
| 1.2668 | 28790 | 1.4357 | - |
| 1.2673 | 28800 | 1.4605 | - |
| 1.2677 | 28810 | 1.4296 | - |
| 1.2682 | 28820 | 1.4824 | - |
| 1.2686 | 28830 | 1.5143 | - |
| 1.2690 | 28840 | 1.5263 | - |
| 1.2695 | 28850 | 1.4121 | - |
| 1.2699 | 28860 | 1.4294 | - |
| 1.2704 | 28870 | 1.4982 | - |
| 1.2708 | 28880 | 1.4588 | - |
| 1.2712 | 28890 | 1.5023 | - |
| 1.2717 | 28900 | 1.5302 | - |
| 1.2721 | 28910 | 1.433 | - |
| 1.2726 | 28920 | 1.5513 | - |
| 1.2730 | 28930 | 1.5006 | - |
| 1.2734 | 28940 | 1.504 | - |
| 1.2739 | 28950 | 1.5437 | - |
| 1.2743 | 28960 | 1.4158 | - |
| 1.2748 | 28970 | 1.4281 | - |
| 1.2752 | 28980 | 1.4614 | - |
| 1.2756 | 28990 | 1.4756 | - |
| 1.2761 | 29000 | 1.4751 | - |
| 1.2765 | 29010 | 1.5179 | - |
| 1.2770 | 29020 | 1.5211 | - |
| 1.2774 | 29030 | 1.5066 | - |
| 1.2778 | 29040 | 1.4433 | - |
| 1.2783 | 29050 | 1.478 | - |
| 1.2787 | 29060 | 1.4996 | - |
| 1.2792 | 29070 | 1.4917 | - |
| 1.2796 | 29080 | 1.4517 | - |
| 1.2800 | 29090 | 1.4831 | - |
| 1.2805 | 29100 | 1.438 | - |
| 1.2809 | 29110 | 1.5083 | - |
| 1.2814 | 29120 | 1.4198 | - |
| 1.2818 | 29130 | 1.5313 | - |
| 1.2822 | 29140 | 1.407 | - |
| 1.2827 | 29150 | 1.4461 | - |
| 1.2831 | 29160 | 1.4957 | - |
| 1.2836 | 29170 | 1.4194 | - |
| 1.2840 | 29180 | 1.4736 | - |
| 1.2844 | 29190 | 1.5035 | - |
| 1.2849 | 29200 | 1.5077 | - |
| 1.2853 | 29210 | 1.4619 | - |
| 1.2858 | 29220 | 1.4465 | - |
| 1.2862 | 29230 | 1.4893 | - |
| 1.2866 | 29240 | 1.4712 | - |
| 1.2871 | 29250 | 1.4198 | - |
| 1.2875 | 29260 | 1.4918 | - |
| 1.2880 | 29270 | 1.444 | - |
| 1.2884 | 29280 | 1.4696 | - |
| 1.2888 | 29290 | 1.5026 | - |
| 1.2893 | 29300 | 1.547 | - |
| 1.2897 | 29310 | 1.4942 | - |
| 1.2902 | 29320 | 1.5173 | - |
| 1.2906 | 29330 | 1.5094 | - |
| 1.2910 | 29340 | 1.4719 | - |
| 1.2915 | 29350 | 1.5285 | - |
| 1.2919 | 29360 | 1.4899 | - |
| 1.2924 | 29370 | 1.4835 | - |
| 1.2928 | 29380 | 1.4602 | - |
| 1.2932 | 29390 | 1.4235 | - |
| 1.2937 | 29400 | 1.4728 | - |
| 1.2941 | 29410 | 1.4136 | - |
| 1.2946 | 29420 | 1.4531 | - |
| 1.2950 | 29430 | 1.4931 | - |
| 1.2954 | 29440 | 1.5204 | - |
| 1.2959 | 29450 | 1.4199 | - |
| 1.2963 | 29460 | 1.4358 | - |
| 1.2968 | 29470 | 1.4562 | - |
| 1.2972 | 29480 | 1.4586 | - |
| 1.2976 | 29490 | 1.4555 | - |
| 1.2981 | 29500 | 1.5062 | - |
| 1.2985 | 29510 | 1.4603 | - |
| 1.2990 | 29520 | 1.4835 | - |
| 1.2994 | 29530 | 1.5371 | - |
| 1.2998 | 29540 | 1.4801 | - |
| 1.3003 | 29550 | 1.4872 | - |
| 1.3007 | 29560 | 1.4703 | - |
| 1.3008 | 29562 | - | 1.5516 |
| 1.3012 | 29570 | 1.422 | - |
| 1.3016 | 29580 | 1.4933 | - |
| 1.3020 | 29590 | 1.4693 | - |
| 1.3025 | 29600 | 1.4436 | - |
| 1.3029 | 29610 | 1.4722 | - |
| 1.3034 | 29620 | 1.4921 | - |
| 1.3038 | 29630 | 1.4925 | - |
| 1.3042 | 29640 | 1.4363 | - |
| 1.3047 | 29650 | 1.4369 | - |
| 1.3051 | 29660 | 1.5419 | - |
| 1.3056 | 29670 | 1.5535 | - |
| 1.3060 | 29680 | 1.5036 | - |
| 1.3064 | 29690 | 1.5064 | - |
| 1.3069 | 29700 | 1.5199 | - |
| 1.3073 | 29710 | 1.4914 | - |
| 1.3078 | 29720 | 1.4679 | - |
| 1.3082 | 29730 | 1.5185 | - |
| 1.3086 | 29740 | 1.4846 | - |
| 1.3091 | 29750 | 1.4736 | - |
| 1.3095 | 29760 | 1.4547 | - |
| 1.3100 | 29770 | 1.4409 | - |
| 1.3104 | 29780 | 1.4611 | - |
| 1.3108 | 29790 | 1.4541 | - |
| 1.3113 | 29800 | 1.4389 | - |
| 1.3117 | 29810 | 1.4575 | - |
| 1.3122 | 29820 | 1.456 | - |
| 1.3126 | 29830 | 1.4267 | - |
| 1.3130 | 29840 | 1.4144 | - |
| 1.3135 | 29850 | 1.4896 | - |
| 1.3139 | 29860 | 1.4689 | - |
| 1.3144 | 29870 | 1.5245 | - |
| 1.3148 | 29880 | 1.4615 | - |
| 1.3152 | 29890 | 1.3983 | - |
| 1.3157 | 29900 | 1.4807 | - |
| 1.3161 | 29910 | 1.4559 | - |
| 1.3166 | 29920 | 1.4581 | - |
| 1.3170 | 29930 | 1.4965 | - |
| 1.3174 | 29940 | 1.4369 | - |
| 1.3179 | 29950 | 1.4024 | - |
| 1.3183 | 29960 | 1.4185 | - |
| 1.3188 | 29970 | 1.4499 | - |
| 1.3192 | 29980 | 1.4757 | - |
| 1.3196 | 29990 | 1.4864 | - |
| 1.3201 | 30000 | 1.4493 | - |
| 1.3205 | 30010 | 1.4121 | - |
| 1.3210 | 30020 | 1.5093 | - |
| 1.3214 | 30030 | 1.4136 | - |
| 1.3218 | 30040 | 1.5351 | - |
| 1.3223 | 30050 | 1.4663 | - |
| 1.3227 | 30060 | 1.547 | - |
| 1.3232 | 30070 | 1.5368 | - |
| 1.3236 | 30080 | 1.495 | - |
| 1.3240 | 30090 | 1.5057 | - |
| 1.3245 | 30100 | 1.4947 | - |
| 1.3249 | 30110 | 1.5306 | - |
| 1.3254 | 30120 | 1.463 | - |
| 1.3258 | 30130 | 1.4811 | - |
| 1.3262 | 30140 | 1.5036 | - |
| 1.3267 | 30150 | 1.4564 | - |
| 1.3271 | 30160 | 1.4994 | - |
| 1.3276 | 30170 | 1.481 | - |
| 1.3280 | 30180 | 1.3895 | - |
| 1.3284 | 30190 | 1.4379 | - |
| 1.3289 | 30200 | 1.4851 | - |
| 1.3293 | 30210 | 1.4769 | - |
| 1.3298 | 30220 | 1.4344 | - |
| 1.3302 | 30230 | 1.3751 | - |
| 1.3306 | 30240 | 1.4441 | - |
| 1.3311 | 30250 | 1.4067 | - |
| 1.3315 | 30260 | 1.4614 | - |
| 1.3320 | 30270 | 1.4679 | - |
| 1.3324 | 30280 | 1.4541 | - |
| 1.3328 | 30290 | 1.4269 | - |
| 1.3333 | 30300 | 1.4585 | - |
| 1.3337 | 30310 | 1.417 | - |
| 1.3342 | 30320 | 1.4897 | - |
| 1.3346 | 30330 | 1.4732 | - |
| 1.3350 | 30340 | 1.5293 | - |
| 1.3355 | 30350 | 1.5201 | - |
| 1.3359 | 30360 | 1.4786 | - |
| 1.3364 | 30370 | 1.4912 | - |
| 1.3368 | 30380 | 1.4941 | - |
| 1.3372 | 30390 | 1.4435 | - |
| 1.3377 | 30400 | 1.4619 | - |
| 1.3381 | 30410 | 1.5254 | - |
| 1.3386 | 30420 | 1.4483 | - |
| 1.3390 | 30430 | 1.435 | - |
| 1.3394 | 30440 | 1.4577 | - |
| 1.3399 | 30450 | 1.4673 | - |
| 1.3403 | 30460 | 1.4702 | - |
| 1.3408 | 30470 | 1.5189 | - |
| 1.3412 | 30480 | 1.4202 | - |
| 1.3416 | 30490 | 1.4301 | - |
| 1.3421 | 30500 | 1.4342 | - |
| 1.3425 | 30510 | 1.4259 | - |
| 1.3430 | 30520 | 1.4098 | - |
| 1.3434 | 30530 | 1.4739 | - |
| 1.3438 | 30540 | 1.4157 | - |
| 1.3443 | 30550 | 1.425 | - |
| 1.3447 | 30560 | 1.4457 | - |
| 1.3452 | 30570 | 1.466 | - |
| 1.3456 | 30580 | 1.4505 | - |
| 1.3460 | 30590 | 1.467 | - |
| 1.3465 | 30600 | 1.4269 | - |
| 1.3469 | 30610 | 1.4521 | - |
| 1.3474 | 30620 | 1.3804 | - |
| 1.3478 | 30630 | 1.4376 | - |
| 1.3482 | 30640 | 1.4688 | - |
| 1.3487 | 30650 | 1.4699 | - |
| 1.3491 | 30660 | 1.4475 | - |
| 1.3496 | 30670 | 1.4432 | - |
| 1.3500 | 30680 | 1.3884 | - |
| 1.3504 | 30690 | 1.4259 | - |
| 1.3508 | 30699 | - | 1.5375 |
| 1.3509 | 30700 | 1.423 | - |
| 1.3513 | 30710 | 1.4643 | - |
| 1.3518 | 30720 | 1.5106 | - |
| 1.3522 | 30730 | 1.3689 | - |
| 1.3526 | 30740 | 1.4726 | - |
| 1.3531 | 30750 | 1.445 | - |
| 1.3535 | 30760 | 1.4754 | - |
| 1.3540 | 30770 | 1.4361 | - |
| 1.3544 | 30780 | 1.4291 | - |
| 1.3548 | 30790 | 1.4508 | - |
| 1.3553 | 30800 | 1.407 | - |
| 1.3557 | 30810 | 1.4611 | - |
| 1.3562 | 30820 | 1.5086 | - |
| 1.3566 | 30830 | 1.3813 | - |
| 1.3570 | 30840 | 1.4546 | - |
| 1.3575 | 30850 | 1.4563 | - |
| 1.3579 | 30860 | 1.425 | - |
| 1.3584 | 30870 | 1.4243 | - |
| 1.3588 | 30880 | 1.4077 | - |
| 1.3592 | 30890 | 1.4478 | - |
| 1.3597 | 30900 | 1.4448 | - |
| 1.3601 | 30910 | 1.4987 | - |
| 1.3606 | 30920 | 1.446 | - |
| 1.3610 | 30930 | 1.4446 | - |
| 1.3614 | 30940 | 1.4492 | - |
| 1.3619 | 30950 | 1.4461 | - |
| 1.3623 | 30960 | 1.4964 | - |
| 1.3628 | 30970 | 1.4406 | - |
| 1.3632 | 30980 | 1.3926 | - |
| 1.3636 | 30990 | 1.3516 | - |
| 1.3641 | 31000 | 1.4007 | - |
| 1.3645 | 31010 | 1.4494 | - |
| 1.3650 | 31020 | 1.4517 | - |
| 1.3654 | 31030 | 1.4383 | - |
| 1.3658 | 31040 | 1.3785 | - |
| 1.3663 | 31050 | 1.3953 | - |
| 1.3667 | 31060 | 1.4111 | - |
| 1.3672 | 31070 | 1.4596 | - |
| 1.3676 | 31080 | 1.4139 | - |
| 1.3680 | 31090 | 1.494 | - |
| 1.3685 | 31100 | 1.4054 | - |
| 1.3689 | 31110 | 1.434 | - |
| 1.3694 | 31120 | 1.526 | - |
| 1.3698 | 31130 | 1.4363 | - |
| 1.3702 | 31140 | 1.4937 | - |
| 1.3707 | 31150 | 1.5342 | - |
| 1.3711 | 31160 | 1.4656 | - |
| 1.3716 | 31170 | 1.419 | - |
| 1.3720 | 31180 | 1.4149 | - |
| 1.3724 | 31190 | 1.4163 | - |
| 1.3729 | 31200 | 1.4471 | - |
| 1.3733 | 31210 | 1.4444 | - |
| 1.3738 | 31220 | 1.4594 | - |
| 1.3742 | 31230 | 1.4987 | - |
| 1.3746 | 31240 | 1.4017 | - |
| 1.3751 | 31250 | 1.4853 | - |
| 1.3755 | 31260 | 1.4265 | - |
| 1.3760 | 31270 | 1.4342 | - |
| 1.3764 | 31280 | 1.4531 | - |
| 1.3768 | 31290 | 1.3976 | - |
| 1.3773 | 31300 | 1.4358 | - |
| 1.3777 | 31310 | 1.4285 | - |
| 1.3782 | 31320 | 1.4515 | - |
| 1.3786 | 31330 | 1.4661 | - |
| 1.3790 | 31340 | 1.4068 | - |
| 1.3795 | 31350 | 1.4413 | - |
| 1.3799 | 31360 | 1.4188 | - |
| 1.3804 | 31370 | 1.4789 | - |
| 1.3808 | 31380 | 1.4218 | - |
| 1.3812 | 31390 | 1.4364 | - |
| 1.3817 | 31400 | 1.4808 | - |
| 1.3821 | 31410 | 1.4445 | - |
| 1.3826 | 31420 | 1.4203 | - |
| 1.3830 | 31430 | 1.4232 | - |
| 1.3834 | 31440 | 1.4258 | - |
| 1.3839 | 31450 | 1.4519 | - |
| 1.3843 | 31460 | 1.3884 | - |
| 1.3848 | 31470 | 1.4784 | - |
| 1.3852 | 31480 | 1.4862 | - |
| 1.3856 | 31490 | 1.4771 | - |
| 1.3861 | 31500 | 1.4906 | - |
| 1.3865 | 31510 | 1.4823 | - |
| 1.3870 | 31520 | 1.4671 | - |
| 1.3874 | 31530 | 1.4223 | - |
| 1.3878 | 31540 | 1.5021 | - |
| 1.3883 | 31550 | 1.4529 | - |
| 1.3887 | 31560 | 1.4673 | - |
| 1.3892 | 31570 | 1.437 | - |
| 1.3896 | 31580 | 1.4135 | - |
| 1.3900 | 31590 | 1.4692 | - |
| 1.3905 | 31600 | 1.504 | - |
| 1.3909 | 31610 | 1.5138 | - |
| 1.3914 | 31620 | 1.4028 | - |
| 1.3918 | 31630 | 1.4777 | - |
| 1.3922 | 31640 | 1.4307 | - |
| 1.3927 | 31650 | 1.4268 | - |
| 1.3931 | 31660 | 1.4285 | - |
| 1.3936 | 31670 | 1.4967 | - |
| 1.3940 | 31680 | 1.3859 | - |
| 1.3944 | 31690 | 1.4597 | - |
| 1.3949 | 31700 | 1.4589 | - |
| 1.3953 | 31710 | 1.4324 | - |
| 1.3958 | 31720 | 1.4262 | - |
| 1.3962 | 31730 | 1.4637 | - |
| 1.3966 | 31740 | 1.4423 | - |
| 1.3971 | 31750 | 1.4639 | - |
| 1.3975 | 31760 | 1.4815 | - |
| 1.3980 | 31770 | 1.4165 | - |
| 1.3984 | 31780 | 1.4625 | - |
| 1.3988 | 31790 | 1.4542 | - |
| 1.3993 | 31800 | 1.5103 | - |
| 1.3997 | 31810 | 1.4058 | - |
| 1.4002 | 31820 | 1.4236 | - |
| 1.4006 | 31830 | 1.4415 | - |
| 1.4009 | 31836 | - | 1.5173 |
| 1.4010 | 31840 | 1.4424 | - |
| 1.4015 | 31850 | 1.514 | - |
| 1.4019 | 31860 | 1.4359 | - |
| 1.4024 | 31870 | 1.4775 | - |
| 1.4028 | 31880 | 1.4655 | - |
| 1.4032 | 31890 | 1.445 | - |
| 1.4037 | 31900 | 1.4292 | - |
| 1.4041 | 31910 | 1.4133 | - |
| 1.4046 | 31920 | 1.4819 | - |
| 1.4050 | 31930 | 1.4687 | - |
| 1.4054 | 31940 | 1.4853 | - |
| 1.4059 | 31950 | 1.4179 | - |
| 1.4063 | 31960 | 1.4208 | - |
| 1.4068 | 31970 | 1.4276 | - |
| 1.4072 | 31980 | 1.4781 | - |
| 1.4076 | 31990 | 1.457 | - |
| 1.4081 | 32000 | 1.4884 | - |
| 1.4085 | 32010 | 1.4736 | - |
| 1.4090 | 32020 | 1.4949 | - |
| 1.4094 | 32030 | 1.4637 | - |
| 1.4098 | 32040 | 1.4376 | - |
| 1.4103 | 32050 | 1.5201 | - |
| 1.4107 | 32060 | 1.4423 | - |
| 1.4112 | 32070 | 1.4462 | - |
| 1.4116 | 32080 | 1.4215 | - |
| 1.4120 | 32090 | 1.4986 | - |
| 1.4125 | 32100 | 1.4338 | - |
| 1.4129 | 32110 | 1.4201 | - |
| 1.4134 | 32120 | 1.5362 | - |
| 1.4138 | 32130 | 1.384 | - |
| 1.4142 | 32140 | 1.422 | - |
| 1.4147 | 32150 | 1.4682 | - |
| 1.4151 | 32160 | 1.4023 | - |
| 1.4156 | 32170 | 1.4667 | - |
| 1.4160 | 32180 | 1.4361 | - |
| 1.4164 | 32190 | 1.4294 | - |
| 1.4169 | 32200 | 1.4949 | - |
| 1.4173 | 32210 | 1.4581 | - |
| 1.4178 | 32220 | 1.4963 | - |
| 1.4182 | 32230 | 1.4472 | - |
| 1.4186 | 32240 | 1.4342 | - |
| 1.4191 | 32250 | 1.4754 | - |
| 1.4195 | 32260 | 1.4537 | - |
| 1.4200 | 32270 | 1.38 | - |
| 1.4204 | 32280 | 1.4453 | - |
| 1.4208 | 32290 | 1.4655 | - |
| 1.4213 | 32300 | 1.4839 | - |
| 1.4217 | 32310 | 1.4261 | - |
| 1.4222 | 32320 | 1.4918 | - |
| 1.4226 | 32330 | 1.4522 | - |
| 1.4230 | 32340 | 1.4189 | - |
| 1.4235 | 32350 | 1.4805 | - |
| 1.4239 | 32360 | 1.4076 | - |
| 1.4244 | 32370 | 1.4667 | - |
| 1.4248 | 32380 | 1.4184 | - |
| 1.4252 | 32390 | 1.4804 | - |
| 1.4257 | 32400 | 1.4851 | - |
| 1.4261 | 32410 | 1.3915 | - |
| 1.4266 | 32420 | 1.483 | - |
| 1.4270 | 32430 | 1.3958 | - |
| 1.4274 | 32440 | 1.4061 | - |
| 1.4279 | 32450 | 1.4916 | - |
| 1.4283 | 32460 | 1.4498 | - |
| 1.4288 | 32470 | 1.4841 | - |
| 1.4292 | 32480 | 1.4422 | - |
| 1.4296 | 32490 | 1.4627 | - |
| 1.4301 | 32500 | 1.4495 | - |
| 1.4305 | 32510 | 1.4268 | - |
| 1.4310 | 32520 | 1.4045 | - |
| 1.4314 | 32530 | 1.4712 | - |
| 1.4318 | 32540 | 1.3766 | - |
| 1.4323 | 32550 | 1.4567 | - |
| 1.4327 | 32560 | 1.4588 | - |
| 1.4332 | 32570 | 1.4375 | - |
| 1.4336 | 32580 | 1.4189 | - |
| 1.4340 | 32590 | 1.4421 | - |
| 1.4345 | 32600 | 1.3936 | - |
| 1.4349 | 32610 | 1.4688 | - |
| 1.4354 | 32620 | 1.4066 | - |
| 1.4358 | 32630 | 1.3836 | - |
| 1.4362 | 32640 | 1.3809 | - |
| 1.4367 | 32650 | 1.4437 | - |
| 1.4371 | 32660 | 1.4769 | - |
| 1.4376 | 32670 | 1.4882 | - |
| 1.4380 | 32680 | 1.4066 | - |
| 1.4384 | 32690 | 1.441 | - |
| 1.4389 | 32700 | 1.446 | - |
| 1.4393 | 32710 | 1.4381 | - |
| 1.4398 | 32720 | 1.4302 | - |
| 1.4402 | 32730 | 1.4484 | - |
| 1.4406 | 32740 | 1.4216 | - |
| 1.4411 | 32750 | 1.4218 | - |
| 1.4415 | 32760 | 1.426 | - |
| 1.4420 | 32770 | 1.4466 | - |
| 1.4424 | 32780 | 1.3842 | - |
| 1.4428 | 32790 | 1.4083 | - |
| 1.4433 | 32800 | 1.4575 | - |
| 1.4437 | 32810 | 1.4426 | - |
| 1.4442 | 32820 | 1.4689 | - |
| 1.4446 | 32830 | 1.4371 | - |
| 1.4450 | 32840 | 1.4762 | - |
| 1.4455 | 32850 | 1.3859 | - |
| 1.4459 | 32860 | 1.4748 | - |
| 1.4464 | 32870 | 1.5154 | - |
| 1.4468 | 32880 | 1.423 | - |
| 1.4472 | 32890 | 1.3968 | - |
| 1.4477 | 32900 | 1.4136 | - |
| 1.4481 | 32910 | 1.3942 | - |
| 1.4486 | 32920 | 1.4826 | - |
| 1.4490 | 32930 | 1.4253 | - |
| 1.4494 | 32940 | 1.4486 | - |
| 1.4499 | 32950 | 1.386 | - |
| 1.4503 | 32960 | 1.4604 | - |
| 1.4508 | 32970 | 1.4225 | - |
| 1.4509 | 32973 | - | 1.4685 |
| 1.4512 | 32980 | 1.368 | - |
| 1.4516 | 32990 | 1.4771 | - |
| 1.4521 | 33000 | 1.44 | - |
| 1.4525 | 33010 | 1.3619 | - |
| 1.4530 | 33020 | 1.3897 | - |
| 1.4534 | 33030 | 1.4355 | - |
| 1.4538 | 33040 | 1.4098 | - |
| 1.4543 | 33050 | 1.44 | - |
| 1.4547 | 33060 | 1.4174 | - |
| 1.4552 | 33070 | 1.4406 | - |
| 1.4556 | 33080 | 1.4348 | - |
| 1.4560 | 33090 | 1.4444 | - |
| 1.4565 | 33100 | 1.4101 | - |
| 1.4569 | 33110 | 1.3728 | - |
| 1.4574 | 33120 | 1.372 | - |
| 1.4578 | 33130 | 1.3701 | - |
| 1.4582 | 33140 | 1.4877 | - |
| 1.4587 | 33150 | 1.4265 | - |
| 1.4591 | 33160 | 1.4123 | - |
| 1.4596 | 33170 | 1.3918 | - |
| 1.4600 | 33180 | 1.4163 | - |
| 1.4604 | 33190 | 1.3888 | - |
| 1.4609 | 33200 | 1.4784 | - |
| 1.4613 | 33210 | 1.4037 | - |
| 1.4618 | 33220 | 1.4427 | - |
| 1.4622 | 33230 | 1.3532 | - |
| 1.4626 | 33240 | 1.4689 | - |
| 1.4631 | 33250 | 1.389 | - |
| 1.4635 | 33260 | 1.4426 | - |
| 1.4640 | 33270 | 1.4039 | - |
| 1.4644 | 33280 | 1.4403 | - |
| 1.4648 | 33290 | 1.4117 | - |
| 1.4653 | 33300 | 1.4155 | - |
| 1.4657 | 33310 | 1.4407 | - |
| 1.4662 | 33320 | 1.4255 | - |
| 1.4666 | 33330 | 1.392 | - |
| 1.4670 | 33340 | 1.4496 | - |
| 1.4675 | 33350 | 1.4077 | - |
| 1.4679 | 33360 | 1.383 | - |
| 1.4684 | 33370 | 1.3814 | - |
| 1.4688 | 33380 | 1.4055 | - |
| 1.4692 | 33390 | 1.4011 | - |
| 1.4697 | 33400 | 1.3996 | - |
| 1.4701 | 33410 | 1.4197 | - |
| 1.4706 | 33420 | 1.4212 | - |
| 1.4710 | 33430 | 1.4135 | - |
| 1.4714 | 33440 | 1.3899 | - |
| 1.4719 | 33450 | 1.4703 | - |
| 1.4723 | 33460 | 1.3748 | - |
| 1.4728 | 33470 | 1.3894 | - |
| 1.4732 | 33480 | 1.4071 | - |
| 1.4736 | 33490 | 1.3926 | - |
| 1.4741 | 33500 | 1.3902 | - |
| 1.4745 | 33510 | 1.4212 | - |
| 1.4750 | 33520 | 1.3856 | - |
| 1.4754 | 33530 | 1.4449 | - |
| 1.4758 | 33540 | 1.3777 | - |
| 1.4763 | 33550 | 1.4336 | - |
| 1.4767 | 33560 | 1.3527 | - |
| 1.4772 | 33570 | 1.3741 | - |
| 1.4776 | 33580 | 1.3706 | - |
| 1.4780 | 33590 | 1.383 | - |
| 1.4785 | 33600 | 1.3662 | - |
| 1.4789 | 33610 | 1.3727 | - |
| 1.4794 | 33620 | 1.3678 | - |
| 1.4798 | 33630 | 1.4044 | - |
| 1.4802 | 33640 | 1.3741 | - |
| 1.4807 | 33650 | 1.3866 | - |
| 1.4811 | 33660 | 1.3784 | - |
| 1.4816 | 33670 | 1.3974 | - |
| 1.4820 | 33680 | 1.401 | - |
| 1.4824 | 33690 | 1.4117 | - |
| 1.4829 | 33700 | 1.4046 | - |
| 1.4833 | 33710 | 1.3722 | - |
| 1.4838 | 33720 | 1.3855 | - |
| 1.4842 | 33730 | 1.3862 | - |
| 1.4846 | 33740 | 1.4105 | - |
| 1.4851 | 33750 | 1.4511 | - |
| 1.4855 | 33760 | 1.4656 | - |
| 1.4860 | 33770 | 1.3674 | - |
| 1.4864 | 33780 | 1.3601 | - |
| 1.4868 | 33790 | 1.3655 | - |
| 1.4873 | 33800 | 1.336 | - |
| 1.4877 | 33810 | 1.3911 | - |
| 1.4882 | 33820 | 1.4271 | - |
| 1.4886 | 33830 | 1.4148 | - |
| 1.4890 | 33840 | 1.4187 | - |
| 1.4895 | 33850 | 1.3551 | - |
| 1.4899 | 33860 | 1.3764 | - |
| 1.4904 | 33870 | 1.4137 | - |
| 1.4908 | 33880 | 1.4367 | - |
| 1.4912 | 33890 | 1.4422 | - |
| 1.4917 | 33900 | 1.3986 | - |
| 1.4921 | 33910 | 1.366 | - |
| 1.4926 | 33920 | 1.3926 | - |
| 1.4930 | 33930 | 1.4395 | - |
| 1.4934 | 33940 | 1.3947 | - |
| 1.4939 | 33950 | 1.3706 | - |
| 1.4943 | 33960 | 1.3473 | - |
| 1.4948 | 33970 | 1.3924 | - |
| 1.4952 | 33980 | 1.3652 | - |
| 1.4956 | 33990 | 1.4336 | - |
| 1.4961 | 34000 | 1.3858 | - |
| 1.4965 | 34010 | 1.3208 | - |
| 1.4970 | 34020 | 1.3603 | - |
| 1.4974 | 34030 | 1.3871 | - |
| 1.4978 | 34040 | 1.3966 | - |
| 1.4983 | 34050 | 1.3969 | - |
| 1.4987 | 34060 | 1.4145 | - |
| 1.4992 | 34070 | 1.4456 | - |
| 1.4996 | 34080 | 1.3761 | - |
| 1.5000 | 34090 | 1.4099 | - |
| 1.5005 | 34100 | 1.3718 | - |
| 1.5009 | 34110 | 1.4387 | 1.5171 |
| 1.5014 | 34120 | 1.3529 | - |
| 1.5018 | 34130 | 1.3963 | - |
| 1.5022 | 34140 | 1.3716 | - |
| 1.5027 | 34150 | 1.4132 | - |
| 1.5031 | 34160 | 1.4204 | - |
| 1.5036 | 34170 | 1.4007 | - |
| 1.5040 | 34180 | 1.3992 | - |
| 1.5044 | 34190 | 1.3007 | - |
| 1.5049 | 34200 | 1.3684 | - |
| 1.5053 | 34210 | 1.3756 | - |
| 1.5058 | 34220 | 1.3825 | - |
| 1.5062 | 34230 | 1.3781 | - |
| 1.5066 | 34240 | 1.4243 | - |
| 1.5071 | 34250 | 1.3829 | - |
| 1.5075 | 34260 | 1.3598 | - |
| 1.5080 | 34270 | 1.3877 | - |
| 1.5084 | 34280 | 1.4243 | - |
| 1.5088 | 34290 | 1.3623 | - |
| 1.5093 | 34300 | 1.3672 | - |
| 1.5097 | 34310 | 1.3651 | - |
| 1.5102 | 34320 | 1.3242 | - |
| 1.5106 | 34330 | 1.4086 | - |
| 1.5110 | 34340 | 1.3607 | - |
| 1.5115 | 34350 | 1.3874 | - |
| 1.5119 | 34360 | 1.3329 | - |
| 1.5124 | 34370 | 1.3803 | - |
| 1.5128 | 34380 | 1.3551 | - |
| 1.5132 | 34390 | 1.3438 | - |
| 1.5137 | 34400 | 1.3584 | - |
| 1.5141 | 34410 | 1.3543 | - |
| 1.5146 | 34420 | 1.3898 | - |
| 1.5150 | 34430 | 1.4326 | - |
| 1.5154 | 34440 | 1.3848 | - |
| 1.5159 | 34450 | 1.357 | - |
| 1.5163 | 34460 | 1.3477 | - |
| 1.5168 | 34470 | 1.3912 | - |
| 1.5172 | 34480 | 1.3611 | - |
| 1.5176 | 34490 | 1.3536 | - |
| 1.5181 | 34500 | 1.3641 | - |
| 1.5185 | 34510 | 1.3474 | - |
| 1.5190 | 34520 | 1.4305 | - |
| 1.5194 | 34530 | 1.3871 | - |
| 1.5198 | 34540 | 1.4021 | - |
| 1.5203 | 34550 | 1.385 | - |
| 1.5207 | 34560 | 1.3894 | - |
| 1.5212 | 34570 | 1.3683 | - |
| 1.5216 | 34580 | 1.3821 | - |
| 1.5220 | 34590 | 1.3537 | - |
| 1.5225 | 34600 | 1.3898 | - |
| 1.5229 | 34610 | 1.3776 | - |
| 1.5234 | 34620 | 1.3898 | - |
| 1.5238 | 34630 | 1.4633 | - |
| 1.5242 | 34640 | 1.425 | - |
| 1.5247 | 34650 | 1.3891 | - |
| 1.5251 | 34660 | 1.419 | - |
| 1.5256 | 34670 | 1.3916 | - |
| 1.5260 | 34680 | 1.3952 | - |
| 1.5264 | 34690 | 1.3997 | - |
| 1.5269 | 34700 | 1.4075 | - |
| 1.5273 | 34710 | 1.3383 | - |
| 1.5278 | 34720 | 1.3208 | - |
| 1.5282 | 34730 | 1.401 | - |
| 1.5286 | 34740 | 1.3741 | - |
| 1.5291 | 34750 | 1.4139 | - |
| 1.5295 | 34760 | 1.3547 | - |
| 1.5300 | 34770 | 1.3665 | - |
| 1.5304 | 34780 | 1.3704 | - |
| 1.5308 | 34790 | 1.3962 | - |
| 1.5313 | 34800 | 1.3951 | - |
| 1.5317 | 34810 | 1.3904 | - |
| 1.5322 | 34820 | 1.4821 | - |
| 1.5326 | 34830 | 1.3537 | - |
| 1.5330 | 34840 | 1.4081 | - |
| 1.5335 | 34850 | 1.3727 | - |
| 1.5339 | 34860 | 1.361 | - |
| 1.5344 | 34870 | 1.382 | - |
| 1.5348 | 34880 | 1.3657 | - |
| 1.5352 | 34890 | 1.3817 | - |
| 1.5357 | 34900 | 1.3815 | - |
| 1.5361 | 34910 | 1.3716 | - |
| 1.5366 | 34920 | 1.3518 | - |
| 1.5370 | 34930 | 1.3634 | - |
| 1.5374 | 34940 | 1.356 | - |
| 1.5379 | 34950 | 1.4058 | - |
| 1.5383 | 34960 | 1.3794 | - |
| 1.5388 | 34970 | 1.3868 | - |
| 1.5392 | 34980 | 1.3747 | - |
| 1.5396 | 34990 | 1.3963 | - |
| 1.5401 | 35000 | 1.3372 | - |
| 1.5405 | 35010 | 1.3554 | - |
| 1.5410 | 35020 | 1.4119 | - |
| 1.5414 | 35030 | 1.339 | - |
| 1.5418 | 35040 | 1.3991 | - |
| 1.5423 | 35050 | 1.3651 | - |
| 1.5427 | 35060 | 1.3831 | - |
| 1.5432 | 35070 | 1.3874 | - |
| 1.5436 | 35080 | 1.3419 | - |
| 1.5440 | 35090 | 1.3315 | - |
| 1.5445 | 35100 | 1.3522 | - |
| 1.5449 | 35110 | 1.3695 | - |
| 1.5454 | 35120 | 1.3761 | - |
| 1.5458 | 35130 | 1.3638 | - |
| 1.5462 | 35140 | 1.3587 | - |
| 1.5467 | 35150 | 1.3645 | - |
| 1.5471 | 35160 | 1.4011 | - |
| 1.5476 | 35170 | 1.339 | - |
| 1.5480 | 35180 | 1.3691 | - |
| 1.5484 | 35190 | 1.3782 | - |
| 1.5489 | 35200 | 1.3139 | - |
| 1.5493 | 35210 | 1.4535 | - |
| 1.5498 | 35220 | 1.3693 | - |
| 1.5502 | 35230 | 1.3761 | - |
| 1.5506 | 35240 | 1.4011 | - |
| 1.5510 | 35247 | - | 1.4452 |
| 1.5511 | 35250 | 1.3455 | - |
| 1.5515 | 35260 | 1.3339 | - |
| 1.5520 | 35270 | 1.3719 | - |
| 1.5524 | 35280 | 1.3886 | - |
| 1.5528 | 35290 | 1.3132 | - |
| 1.5533 | 35300 | 1.3281 | - |
| 1.5537 | 35310 | 1.3487 | - |
| 1.5542 | 35320 | 1.3508 | - |
| 1.5546 | 35330 | 1.3815 | - |
| 1.5550 | 35340 | 1.3565 | - |
| 1.5555 | 35350 | 1.3429 | - |
| 1.5559 | 35360 | 1.3834 | - |
| 1.5564 | 35370 | 1.3467 | - |
| 1.5568 | 35380 | 1.3858 | - |
| 1.5572 | 35390 | 1.3668 | - |
| 1.5577 | 35400 | 1.3752 | - |
| 1.5581 | 35410 | 1.3116 | - |
| 1.5586 | 35420 | 1.3333 | - |
| 1.5590 | 35430 | 1.3632 | - |
| 1.5594 | 35440 | 1.3869 | - |
| 1.5599 | 35450 | 1.4063 | - |
| 1.5603 | 35460 | 1.3449 | - |
| 1.5608 | 35470 | 1.2758 | - |
| 1.5612 | 35480 | 1.3168 | - |
| 1.5616 | 35490 | 1.349 | - |
| 1.5621 | 35500 | 1.3952 | - |
| 1.5625 | 35510 | 1.3774 | - |
| 1.5630 | 35520 | 1.3626 | - |
| 1.5634 | 35530 | 1.3864 | - |
| 1.5638 | 35540 | 1.3479 | - |
| 1.5643 | 35550 | 1.3595 | - |
| 1.5647 | 35560 | 1.3419 | - |
| 1.5652 | 35570 | 1.3131 | - |
| 1.5656 | 35580 | 1.3659 | - |
| 1.5660 | 35590 | 1.3311 | - |
| 1.5665 | 35600 | 1.3641 | - |
| 1.5669 | 35610 | 1.3609 | - |
| 1.5674 | 35620 | 1.4058 | - |
| 1.5678 | 35630 | 1.3501 | - |
| 1.5682 | 35640 | 1.3229 | - |
| 1.5687 | 35650 | 1.3944 | - |
| 1.5691 | 35660 | 1.3538 | - |
| 1.5696 | 35670 | 1.3918 | - |
| 1.5700 | 35680 | 1.3621 | - |
| 1.5704 | 35690 | 1.3647 | - |
| 1.5709 | 35700 | 1.3474 | - |
| 1.5713 | 35710 | 1.3752 | - |
| 1.5718 | 35720 | 1.3477 | - |
| 1.5722 | 35730 | 1.3532 | - |
| 1.5726 | 35740 | 1.3555 | - |
| 1.5731 | 35750 | 1.3016 | - |
| 1.5735 | 35760 | 1.3628 | - |
| 1.5740 | 35770 | 1.3422 | - |
| 1.5744 | 35780 | 1.4055 | - |
| 1.5748 | 35790 | 1.3899 | - |
| 1.5753 | 35800 | 1.3259 | - |
| 1.5757 | 35810 | 1.3425 | - |
| 1.5762 | 35820 | 1.3506 | - |
| 1.5766 | 35830 | 1.3508 | - |
| 1.5770 | 35840 | 1.3463 | - |
| 1.5775 | 35850 | 1.3699 | - |
| 1.5779 | 35860 | 1.4086 | - |
| 1.5784 | 35870 | 1.3903 | - |
| 1.5788 | 35880 | 1.3239 | - |
| 1.5792 | 35890 | 1.3654 | - |
| 1.5797 | 35900 | 1.3551 | - |
| 1.5801 | 35910 | 1.3387 | - |
| 1.5806 | 35920 | 1.3195 | - |
| 1.5810 | 35930 | 1.3475 | - |
| 1.5814 | 35940 | 1.4056 | - |
| 1.5819 | 35950 | 1.3778 | - |
| 1.5823 | 35960 | 1.3608 | - |
| 1.5828 | 35970 | 1.3798 | - |
| 1.5832 | 35980 | 1.3783 | - |
| 1.5836 | 35990 | 1.3499 | - |
| 1.5841 | 36000 | 1.357 | - |
| 1.5845 | 36010 | 1.3316 | - |
| 1.5850 | 36020 | 1.3914 | - |
| 1.5854 | 36030 | 1.3479 | - |
| 1.5858 | 36040 | 1.3668 | - |
| 1.5863 | 36050 | 1.3575 | - |
| 1.5867 | 36060 | 1.3702 | - |
| 1.5872 | 36070 | 1.3942 | - |
| 1.5876 | 36080 | 1.3626 | - |
| 1.5880 | 36090 | 1.36 | - |
| 1.5885 | 36100 | 1.4056 | - |
| 1.5889 | 36110 | 1.3489 | - |
| 1.5894 | 36120 | 1.3008 | - |
| 1.5898 | 36130 | 1.3453 | - |
| 1.5902 | 36140 | 1.3681 | - |
| 1.5907 | 36150 | 1.3671 | - |
| 1.5911 | 36160 | 1.3215 | - |
| 1.5916 | 36170 | 1.3786 | - |
| 1.5920 | 36180 | 1.3952 | - |
| 1.5924 | 36190 | 1.3789 | - |
| 1.5929 | 36200 | 1.3122 | - |
| 1.5933 | 36210 | 1.411 | - |
| 1.5938 | 36220 | 1.4002 | - |
| 1.5942 | 36230 | 1.3526 | - |
| 1.5946 | 36240 | 1.3371 | - |
| 1.5951 | 36250 | 1.3647 | - |
| 1.5955 | 36260 | 1.3341 | - |
| 1.5960 | 36270 | 1.3821 | - |
| 1.5964 | 36280 | 1.3211 | - |
| 1.5968 | 36290 | 1.3498 | - |
| 1.5973 | 36300 | 1.3154 | - |
| 1.5977 | 36310 | 1.3773 | - |
| 1.5982 | 36320 | 1.3265 | - |
| 1.5986 | 36330 | 1.3147 | - |
| 1.5990 | 36340 | 1.3306 | - |
| 1.5995 | 36350 | 1.3801 | - |
| 1.5999 | 36360 | 1.369 | - |
| 1.6004 | 36370 | 1.3631 | - |
| 1.6008 | 36380 | 1.3392 | - |
| 1.6010 | 36384 | - | 1.4795 |
| 1.6012 | 36390 | 1.3487 | - |
| 1.6017 | 36400 | 1.2972 | - |
| 1.6021 | 36410 | 1.3496 | - |
| 1.6026 | 36420 | 1.3831 | - |
| 1.6030 | 36430 | 1.3394 | - |
| 1.6034 | 36440 | 1.2754 | - |
| 1.6039 | 36450 | 1.3626 | - |
| 1.6043 | 36460 | 1.3347 | - |
| 1.6048 | 36470 | 1.3791 | - |
| 1.6052 | 36480 | 1.3726 | - |
| 1.6056 | 36490 | 1.3044 | - |
| 1.6061 | 36500 | 1.3179 | - |
| 1.6065 | 36510 | 1.3817 | - |
| 1.6070 | 36520 | 1.3042 | - |
| 1.6074 | 36530 | 1.3323 | - |
| 1.6079 | 36540 | 1.3289 | - |
| 1.6083 | 36550 | 1.3554 | - |
| 1.6087 | 36560 | 1.2904 | - |
| 1.6092 | 36570 | 1.3331 | - |
| 1.6096 | 36580 | 1.3505 | - |
| 1.6101 | 36590 | 1.3379 | - |
| 1.6105 | 36600 | 1.2795 | - |
| 1.6109 | 36610 | 1.3004 | - |
| 1.6114 | 36620 | 1.3028 | - |
| 1.6118 | 36630 | 1.2873 | - |
| 1.6123 | 36640 | 1.3664 | - |
| 1.6127 | 36650 | 1.3386 | - |
| 1.6131 | 36660 | 1.3274 | - |
| 1.6136 | 36670 | 1.2951 | - |
| 1.6140 | 36680 | 1.3477 | - |
| 1.6145 | 36690 | 1.3391 | - |
| 1.6149 | 36700 | 1.3411 | - |
| 1.6153 | 36710 | 1.3573 | - |
| 1.6158 | 36720 | 1.3317 | - |
| 1.6162 | 36730 | 1.3542 | - |
| 1.6167 | 36740 | 1.3624 | - |
| 1.6171 | 36750 | 1.369 | - |
| 1.6175 | 36760 | 1.3739 | - |
| 1.6180 | 36770 | 1.3 | - |
| 1.6184 | 36780 | 1.3238 | - |
| 1.6189 | 36790 | 1.3121 | - |
| 1.6193 | 36800 | 1.3508 | - |
| 1.6197 | 36810 | 1.3816 | - |
| 1.6202 | 36820 | 1.3426 | - |
| 1.6206 | 36830 | 1.3112 | - |
| 1.6211 | 36840 | 1.3271 | - |
| 1.6215 | 36850 | 1.3058 | - |
| 1.6219 | 36860 | 1.3741 | - |
| 1.6224 | 36870 | 1.3358 | - |
| 1.6228 | 36880 | 1.3056 | - |
| 1.6233 | 36890 | 1.2963 | - |
| 1.6237 | 36900 | 1.3259 | - |
| 1.6241 | 36910 | 1.306 | - |
| 1.6246 | 36920 | 1.3082 | - |
| 1.6250 | 36930 | 1.3215 | - |
| 1.6255 | 36940 | 1.3326 | - |
| 1.6259 | 36950 | 1.3172 | - |
| 1.6263 | 36960 | 1.3569 | - |
| 1.6268 | 36970 | 1.3187 | - |
| 1.6272 | 36980 | 1.3302 | - |
| 1.6277 | 36990 | 1.2998 | - |
| 1.6281 | 37000 | 1.3204 | - |
| 1.6285 | 37010 | 1.3552 | - |
| 1.6290 | 37020 | 1.2758 | - |
| 1.6294 | 37030 | 1.3735 | - |
| 1.6299 | 37040 | 1.313 | - |
| 1.6303 | 37050 | 1.3223 | - |
| 1.6307 | 37060 | 1.4062 | - |
| 1.6312 | 37070 | 1.3215 | - |
| 1.6316 | 37080 | 1.3357 | - |
| 1.6321 | 37090 | 1.3752 | - |
| 1.6325 | 37100 | 1.3157 | - |
| 1.6329 | 37110 | 1.3816 | - |
| 1.6334 | 37120 | 1.2821 | - |
| 1.6338 | 37130 | 1.3352 | - |
| 1.6343 | 37140 | 1.3531 | - |
| 1.6347 | 37150 | 1.3309 | - |
| 1.6351 | 37160 | 1.3267 | - |
| 1.6356 | 37170 | 1.2928 | - |
| 1.6360 | 37180 | 1.3384 | - |
| 1.6365 | 37190 | 1.3476 | - |
| 1.6369 | 37200 | 1.3066 | - |
| 1.6373 | 37210 | 1.3049 | - |
| 1.6378 | 37220 | 1.3607 | - |
| 1.6382 | 37230 | 1.327 | - |
| 1.6387 | 37240 | 1.3513 | - |
| 1.6391 | 37250 | 1.2971 | - |
| 1.6395 | 37260 | 1.308 | - |
| 1.6400 | 37270 | 1.3102 | - |
| 1.6404 | 37280 | 1.3196 | - |
| 1.6409 | 37290 | 1.317 | - |
| 1.6413 | 37300 | 1.3555 | - |
| 1.6417 | 37310 | 1.3757 | - |
| 1.6422 | 37320 | 1.3467 | - |
| 1.6426 | 37330 | 1.3165 | - |
| 1.6431 | 37340 | 1.3345 | - |
| 1.6435 | 37350 | 1.3636 | - |
| 1.6439 | 37360 | 1.2672 | - |
| 1.6444 | 37370 | 1.3093 | - |
| 1.6448 | 37380 | 1.3344 | - |
| 1.6453 | 37390 | 1.2783 | - |
| 1.6457 | 37400 | 1.3032 | - |
| 1.6461 | 37410 | 1.2973 | - |
| 1.6466 | 37420 | 1.3667 | - |
| 1.6470 | 37430 | 1.3193 | - |
| 1.6475 | 37440 | 1.2588 | - |
| 1.6479 | 37450 | 1.3357 | - |
| 1.6483 | 37460 | 1.2927 | - |
| 1.6488 | 37470 | 1.3269 | - |
| 1.6492 | 37480 | 1.3212 | - |
| 1.6497 | 37490 | 1.286 | - |
| 1.6501 | 37500 | 1.3447 | - |
| 1.6505 | 37510 | 1.3217 | - |
| 1.6510 | 37520 | 1.2734 | - |
| 1.6510 | 37521 | - | 1.4744 |
| 1.6514 | 37530 | 1.3382 | - |
| 1.6519 | 37540 | 1.3124 | - |
| 1.6523 | 37550 | 1.3377 | - |
| 1.6527 | 37560 | 1.3469 | - |
| 1.6532 | 37570 | 1.3995 | - |
| 1.6536 | 37580 | 1.3455 | - |
| 1.6541 | 37590 | 1.2808 | - |
| 1.6545 | 37600 | 1.3253 | - |
| 1.6549 | 37610 | 1.2796 | - |
| 1.6554 | 37620 | 1.3247 | - |
| 1.6558 | 37630 | 1.3332 | - |
| 1.6563 | 37640 | 1.301 | - |
| 1.6567 | 37650 | 1.3142 | - |
| 1.6571 | 37660 | 1.3662 | - |
| 1.6576 | 37670 | 1.3525 | - |
| 1.6580 | 37680 | 1.3062 | - |
| 1.6585 | 37690 | 1.3014 | - |
| 1.6589 | 37700 | 1.3002 | - |
| 1.6593 | 37710 | 1.3124 | - |
| 1.6598 | 37720 | 1.3232 | - |
| 1.6602 | 37730 | 1.3047 | - |
| 1.6607 | 37740 | 1.2943 | - |
| 1.6611 | 37750 | 1.3032 | - |
| 1.6615 | 37760 | 1.3117 | - |
| 1.6620 | 37770 | 1.3134 | - |
| 1.6624 | 37780 | 1.3203 | - |
| 1.6629 | 37790 | 1.3367 | - |
| 1.6633 | 37800 | 1.3214 | - |
| 1.6637 | 37810 | 1.3116 | - |
| 1.6642 | 37820 | 1.3177 | - |
| 1.6646 | 37830 | 1.3749 | - |
| 1.6651 | 37840 | 1.2592 | - |
| 1.6655 | 37850 | 1.3063 | - |
| 1.6659 | 37860 | 1.3416 | - |
| 1.6664 | 37870 | 1.3413 | - |
| 1.6668 | 37880 | 1.3657 | - |
| 1.6673 | 37890 | 1.3429 | - |
| 1.6677 | 37900 | 1.2744 | - |
| 1.6681 | 37910 | 1.2726 | - |
| 1.6686 | 37920 | 1.2935 | - |
| 1.6690 | 37930 | 1.3384 | - |
| 1.6695 | 37940 | 1.3414 | - |
| 1.6699 | 37950 | 1.2987 | - |
| 1.6703 | 37960 | 1.3402 | - |
| 1.6708 | 37970 | 1.3191 | - |
| 1.6712 | 37980 | 1.3505 | - |
| 1.6717 | 37990 | 1.3213 | - |
| 1.6721 | 38000 | 1.285 | - |
| 1.6725 | 38010 | 1.3031 | - |
| 1.6730 | 38020 | 1.3696 | - |
| 1.6734 | 38030 | 1.3121 | - |
| 1.6739 | 38040 | 1.2937 | - |
| 1.6743 | 38050 | 1.2887 | - |
| 1.6747 | 38060 | 1.2651 | - |
| 1.6752 | 38070 | 1.2658 | - |
| 1.6756 | 38080 | 1.2811 | - |
| 1.6761 | 38090 | 1.2794 | - |
| 1.6765 | 38100 | 1.3276 | - |
| 1.6769 | 38110 | 1.2781 | - |
| 1.6774 | 38120 | 1.2967 | - |
| 1.6778 | 38130 | 1.2884 | - |
| 1.6783 | 38140 | 1.3171 | - |
| 1.6787 | 38150 | 1.2997 | - |
| 1.6791 | 38160 | 1.2994 | - |
| 1.6796 | 38170 | 1.2623 | - |
| 1.6800 | 38180 | 1.2913 | - |
| 1.6805 | 38190 | 1.3678 | - |
| 1.6809 | 38200 | 1.2382 | - |
| 1.6813 | 38210 | 1.3296 | - |
| 1.6818 | 38220 | 1.2841 | - |
| 1.6822 | 38230 | 1.3364 | - |
| 1.6827 | 38240 | 1.319 | - |
| 1.6831 | 38250 | 1.284 | - |
| 1.6835 | 38260 | 1.2789 | - |
| 1.6840 | 38270 | 1.3435 | - |
| 1.6844 | 38280 | 1.369 | - |
| 1.6849 | 38290 | 1.3483 | - |
| 1.6853 | 38300 | 1.3325 | - |
| 1.6857 | 38310 | 1.2701 | - |
| 1.6862 | 38320 | 1.3629 | - |
| 1.6866 | 38330 | 1.2818 | - |
| 1.6871 | 38340 | 1.3419 | - |
| 1.6875 | 38350 | 1.348 | - |
| 1.6879 | 38360 | 1.3292 | - |
| 1.6884 | 38370 | 1.2962 | - |
| 1.6888 | 38380 | 1.2869 | - |
| 1.6893 | 38390 | 1.2968 | - |
| 1.6897 | 38400 | 1.3004 | - |
| 1.6901 | 38410 | 1.3068 | - |
| 1.6906 | 38420 | 1.3223 | - |
| 1.6910 | 38430 | 1.2944 | - |
| 1.6915 | 38440 | 1.2811 | - |
| 1.6919 | 38450 | 1.286 | - |
| 1.6923 | 38460 | 1.3072 | - |
| 1.6928 | 38470 | 1.2918 | - |
| 1.6932 | 38480 | 1.2844 | - |
| 1.6937 | 38490 | 1.2914 | - |
| 1.6941 | 38500 | 1.2862 | - |
| 1.6945 | 38510 | 1.349 | - |
| 1.6950 | 38520 | 1.3202 | - |
| 1.6954 | 38530 | 1.3505 | - |
| 1.6959 | 38540 | 1.2953 | - |
| 1.6963 | 38550 | 1.314 | - |
| 1.6967 | 38560 | 1.3213 | - |
| 1.6972 | 38570 | 1.3299 | - |
| 1.6976 | 38580 | 1.28 | - |
| 1.6981 | 38590 | 1.3027 | - |
| 1.6985 | 38600 | 1.2801 | - |
| 1.6989 | 38610 | 1.3062 | - |
| 1.6994 | 38620 | 1.2529 | - |
| 1.6998 | 38630 | 1.3008 | - |
| 1.7003 | 38640 | 1.237 | - |
| 1.7007 | 38650 | 1.2875 | - |
| 1.7010 | 38658 | - | 1.4439 |
| 1.7011 | 38660 | 1.3599 | - |
| 1.7016 | 38670 | 1.2927 | - |
| 1.7020 | 38680 | 1.3287 | - |
| 1.7025 | 38690 | 1.3365 | - |
| 1.7029 | 38700 | 1.3176 | - |
| 1.7033 | 38710 | 1.2767 | - |
| 1.7038 | 38720 | 1.2953 | - |
| 1.7042 | 38730 | 1.3177 | - |
| 1.7047 | 38740 | 1.2676 | - |
| 1.7051 | 38750 | 1.3263 | - |
| 1.7055 | 38760 | 1.3145 | - |
| 1.7060 | 38770 | 1.3307 | - |
| 1.7064 | 38780 | 1.2984 | - |
| 1.7069 | 38790 | 1.2774 | - |
| 1.7073 | 38800 | 1.3795 | - |
| 1.7077 | 38810 | 1.3124 | - |
| 1.7082 | 38820 | 1.2482 | - |
| 1.7086 | 38830 | 1.3279 | - |
| 1.7091 | 38840 | 1.3301 | - |
| 1.7095 | 38850 | 1.338 | - |
| 1.7099 | 38860 | 1.2698 | - |
| 1.7104 | 38870 | 1.3295 | - |
| 1.7108 | 38880 | 1.2979 | - |
| 1.7113 | 38890 | 1.2969 | - |
| 1.7117 | 38900 | 1.2877 | - |
| 1.7121 | 38910 | 1.2817 | - |
| 1.7126 | 38920 | 1.3678 | - |
| 1.7130 | 38930 | 1.2811 | - |
| 1.7135 | 38940 | 1.32 | - |
| 1.7139 | 38950 | 1.3134 | - |
| 1.7143 | 38960 | 1.3522 | - |
| 1.7148 | 38970 | 1.2835 | - |
| 1.7152 | 38980 | 1.2715 | - |
| 1.7157 | 38990 | 1.3048 | - |
| 1.7161 | 39000 | 1.2977 | - |
| 1.7165 | 39010 | 1.2831 | - |
| 1.7170 | 39020 | 1.2592 | - |
| 1.7174 | 39030 | 1.3096 | - |
| 1.7179 | 39040 | 1.2818 | - |
| 1.7183 | 39050 | 1.3058 | - |
| 1.7187 | 39060 | 1.2605 | - |
| 1.7192 | 39070 | 1.2797 | - |
| 1.7196 | 39080 | 1.3339 | - |
| 1.7201 | 39090 | 1.3171 | - |
| 1.7205 | 39100 | 1.307 | - |
| 1.7209 | 39110 | 1.2682 | - |
| 1.7214 | 39120 | 1.2777 | - |
| 1.7218 | 39130 | 1.2587 | - |
| 1.7223 | 39140 | 1.3123 | - |
| 1.7227 | 39150 | 1.3383 | - |
| 1.7231 | 39160 | 1.3378 | - |
| 1.7236 | 39170 | 1.3259 | - |
| 1.7240 | 39180 | 1.29 | - |
| 1.7245 | 39190 | 1.3329 | - |
| 1.7249 | 39200 | 1.3614 | - |
| 1.7253 | 39210 | 1.3194 | - |
| 1.7258 | 39220 | 1.2633 | - |
| 1.7262 | 39230 | 1.2659 | - |
| 1.7267 | 39240 | 1.284 | - |
| 1.7271 | 39250 | 1.3738 | - |
| 1.7275 | 39260 | 1.2807 | - |
| 1.7280 | 39270 | 1.2669 | - |
| 1.7284 | 39280 | 1.3196 | - |
| 1.7289 | 39290 | 1.2416 | - |
| 1.7293 | 39300 | 1.31 | - |
| 1.7297 | 39310 | 1.3092 | - |
| 1.7302 | 39320 | 1.2877 | - |
| 1.7306 | 39330 | 1.3224 | - |
| 1.7311 | 39340 | 1.2594 | - |
| 1.7315 | 39350 | 1.2513 | - |
| 1.7319 | 39360 | 1.2798 | - |
| 1.7324 | 39370 | 1.3012 | - |
| 1.7328 | 39380 | 1.242 | - |
| 1.7333 | 39390 | 1.2914 | - |
| 1.7337 | 39400 | 1.2309 | - |
| 1.7341 | 39410 | 1.301 | - |
| 1.7346 | 39420 | 1.3049 | - |
| 1.7350 | 39430 | 1.2755 | - |
| 1.7355 | 39440 | 1.3232 | - |
| 1.7359 | 39450 | 1.3349 | - |
| 1.7363 | 39460 | 1.3445 | - |
| 1.7368 | 39470 | 1.3255 | - |
| 1.7372 | 39480 | 1.2528 | - |
| 1.7377 | 39490 | 1.3223 | - |
| 1.7381 | 39500 | 1.3106 | - |
| 1.7385 | 39510 | 1.3059 | - |
| 1.7390 | 39520 | 1.3232 | - |
| 1.7394 | 39530 | 1.2773 | - |
| 1.7399 | 39540 | 1.3 | - |
| 1.7403 | 39550 | 1.29 | - |
| 1.7407 | 39560 | 1.3774 | - |
| 1.7412 | 39570 | 1.2872 | - |
| 1.7416 | 39580 | 1.3088 | - |
| 1.7421 | 39590 | 1.3069 | - |
| 1.7425 | 39600 | 1.2943 | - |
| 1.7429 | 39610 | 1.2882 | - |
| 1.7434 | 39620 | 1.2522 | - |
| 1.7438 | 39630 | 1.2971 | - |
| 1.7443 | 39640 | 1.3618 | - |
| 1.7447 | 39650 | 1.2953 | - |
| 1.7451 | 39660 | 1.3362 | - |
| 1.7456 | 39670 | 1.328 | - |
| 1.7460 | 39680 | 1.2736 | - |
| 1.7465 | 39690 | 1.2702 | - |
| 1.7469 | 39700 | 1.2804 | - |
| 1.7473 | 39710 | 1.3029 | - |
| 1.7478 | 39720 | 1.3195 | - |
| 1.7482 | 39730 | 1.3179 | - |
| 1.7487 | 39740 | 1.3247 | - |
| 1.7491 | 39750 | 1.2466 | - |
| 1.7495 | 39760 | 1.2645 | - |
| 1.7500 | 39770 | 1.2483 | - |
| 1.7504 | 39780 | 1.3118 | - |
| 1.7509 | 39790 | 1.3171 | - |
| 1.7511 | 39795 | - | 1.4577 |
| 1.7513 | 39800 | 1.3596 | - |
| 1.7517 | 39810 | 1.307 | - |
| 1.7522 | 39820 | 1.2593 | - |
| 1.7526 | 39830 | 1.2823 | - |
| 1.7531 | 39840 | 1.2841 | - |
| 1.7535 | 39850 | 1.3379 | - |
| 1.7539 | 39860 | 1.3044 | - |
| 1.7544 | 39870 | 1.3106 | - |
| 1.7548 | 39880 | 1.3573 | - |
| 1.7553 | 39890 | 1.2856 | - |
| 1.7557 | 39900 | 1.2396 | - |
| 1.7561 | 39910 | 1.3224 | - |
| 1.7566 | 39920 | 1.2987 | - |
| 1.7570 | 39930 | 1.2695 | - |
| 1.7575 | 39940 | 1.2958 | - |
| 1.7579 | 39950 | 1.3007 | - |
| 1.7583 | 39960 | 1.3856 | - |
| 1.7588 | 39970 | 1.3228 | - |
| 1.7592 | 39980 | 1.2999 | - |
| 1.7597 | 39990 | 1.2838 | - |
| 1.7601 | 40000 | 1.2745 | - |
| 1.7605 | 40010 | 1.3075 | - |
| 1.7610 | 40020 | 1.2669 | - |
| 1.7614 | 40030 | 1.3372 | - |
| 1.7619 | 40040 | 1.2743 | - |
| 1.7623 | 40050 | 1.2726 | - |
| 1.7627 | 40060 | 1.3105 | - |
| 1.7632 | 40070 | 1.3208 | - |
| 1.7636 | 40080 | 1.3161 | - |
| 1.7641 | 40090 | 1.356 | - |
| 1.7645 | 40100 | 1.3236 | - |
| 1.7649 | 40110 | 1.2555 | - |
| 1.7654 | 40120 | 1.305 | - |
| 1.7658 | 40130 | 1.3069 | - |
| 1.7663 | 40140 | 1.2842 | - |
| 1.7667 | 40150 | 1.357 | - |
| 1.7671 | 40160 | 1.3035 | - |
| 1.7676 | 40170 | 1.3396 | - |
| 1.7680 | 40180 | 1.2742 | - |
| 1.7685 | 40190 | 1.2874 | - |
| 1.7689 | 40200 | 1.2624 | - |
| 1.7693 | 40210 | 1.2686 | - |
| 1.7698 | 40220 | 1.282 | - |
| 1.7702 | 40230 | 1.3201 | - |
| 1.7707 | 40240 | 1.2513 | - |
| 1.7711 | 40250 | 1.2451 | - |
| 1.7715 | 40260 | 1.29 | - |
| 1.7720 | 40270 | 1.2484 | - |
| 1.7724 | 40280 | 1.2779 | - |
| 1.7729 | 40290 | 1.2476 | - |
| 1.7733 | 40300 | 1.3332 | - |
| 1.7737 | 40310 | 1.2769 | - |
| 1.7742 | 40320 | 1.2951 | - |
| 1.7746 | 40330 | 1.3006 | - |
| 1.7751 | 40340 | 1.3085 | - |
| 1.7755 | 40350 | 1.2817 | - |
| 1.7759 | 40360 | 1.3635 | - |
| 1.7764 | 40370 | 1.3447 | - |
| 1.7768 | 40380 | 1.2821 | - |
| 1.7773 | 40390 | 1.3464 | - |
| 1.7777 | 40400 | 1.2702 | - |
| 1.7781 | 40410 | 1.2609 | - |
| 1.7786 | 40420 | 1.2936 | - |
| 1.7790 | 40430 | 1.2659 | - |
| 1.7795 | 40440 | 1.2988 | - |
| 1.7799 | 40450 | 1.295 | - |
| 1.7803 | 40460 | 1.2822 | - |
| 1.7808 | 40470 | 1.265 | - |
| 1.7812 | 40480 | 1.3371 | - |
| 1.7817 | 40490 | 1.235 | - |
| 1.7821 | 40500 | 1.2849 | - |
| 1.7825 | 40510 | 1.3149 | - |
| 1.7830 | 40520 | 1.2928 | - |
| 1.7834 | 40530 | 1.2107 | - |
| 1.7839 | 40540 | 1.2943 | - |
| 1.7843 | 40550 | 1.2458 | - |
| 1.7847 | 40560 | 1.2286 | - |
| 1.7852 | 40570 | 1.2862 | - |
| 1.7856 | 40580 | 1.3167 | - |
| 1.7861 | 40590 | 1.2586 | - |
| 1.7865 | 40600 | 1.3258 | - |
| 1.7869 | 40610 | 1.2607 | - |
| 1.7874 | 40620 | 1.295 | - |
| 1.7878 | 40630 | 1.2956 | - |
| 1.7883 | 40640 | 1.2517 | - |
| 1.7887 | 40650 | 1.3354 | - |
| 1.7891 | 40660 | 1.2984 | - |
| 1.7896 | 40670 | 1.3375 | - |
| 1.7900 | 40680 | 1.2492 | - |
| 1.7905 | 40690 | 1.2533 | - |
| 1.7909 | 40700 | 1.2438 | - |
| 1.7913 | 40710 | 1.2809 | - |
| 1.7918 | 40720 | 1.2617 | - |
| 1.7922 | 40730 | 1.3062 | - |
| 1.7927 | 40740 | 1.3145 | - |
| 1.7931 | 40750 | 1.3021 | - |
| 1.7935 | 40760 | 1.3429 | - |
| 1.7940 | 40770 | 1.2653 | - |
| 1.7944 | 40780 | 1.3146 | - |
| 1.7949 | 40790 | 1.3172 | - |
| 1.7953 | 40800 | 1.3324 | - |
| 1.7957 | 40810 | 1.3086 | - |
| 1.7962 | 40820 | 1.2807 | - |
| 1.7966 | 40830 | 1.332 | - |
| 1.7971 | 40840 | 1.282 | - |
| 1.7975 | 40850 | 1.2264 | - |
| 1.7979 | 40860 | 1.2751 | - |
| 1.7984 | 40870 | 1.2984 | - |
| 1.7988 | 40880 | 1.2982 | - |
| 1.7993 | 40890 | 1.3141 | - |
| 1.7997 | 40900 | 1.2978 | - |
| 1.8001 | 40910 | 1.285 | - |
| 1.8006 | 40920 | 1.3283 | - |
| 1.8010 | 40930 | 1.2851 | - |
| 1.8011 | 40932 | - | 1.4573 |
| 1.8015 | 40940 | 1.28 | - |
| 1.8019 | 40950 | 1.3295 | - |
| 1.8023 | 40960 | 1.2422 | - |
| 1.8028 | 40970 | 1.2969 | - |
| 1.8032 | 40980 | 1.2788 | - |
| 1.8037 | 40990 | 1.2599 | - |
| 1.8041 | 41000 | 1.2756 | - |
| 1.8045 | 41010 | 1.2465 | - |
| 1.8050 | 41020 | 1.2603 | - |
| 1.8054 | 41030 | 1.3453 | - |
| 1.8059 | 41040 | 1.316 | - |
| 1.8063 | 41050 | 1.2454 | - |
| 1.8067 | 41060 | 1.276 | - |
| 1.8072 | 41070 | 1.2824 | - |
| 1.8076 | 41080 | 1.2363 | - |
| 1.8081 | 41090 | 1.3011 | - |
| 1.8085 | 41100 | 1.3058 | - |
| 1.8089 | 41110 | 1.2903 | - |
| 1.8094 | 41120 | 1.287 | - |
| 1.8098 | 41130 | 1.2791 | - |
| 1.8103 | 41140 | 1.2922 | - |
| 1.8107 | 41150 | 1.3072 | - |
| 1.8111 | 41160 | 1.2815 | - |
| 1.8116 | 41170 | 1.2355 | - |
| 1.8120 | 41180 | 1.3552 | - |
| 1.8125 | 41190 | 1.2498 | - |
| 1.8129 | 41200 | 1.2513 | - |
| 1.8133 | 41210 | 1.2513 | - |
| 1.8138 | 41220 | 1.3102 | - |
| 1.8142 | 41230 | 1.3082 | - |
| 1.8147 | 41240 | 1.2696 | - |
| 1.8151 | 41250 | 1.2875 | - |
| 1.8155 | 41260 | 1.2797 | - |
| 1.8160 | 41270 | 1.2979 | - |
| 1.8164 | 41280 | 1.2518 | - |
| 1.8169 | 41290 | 1.2806 | - |
| 1.8173 | 41300 | 1.2553 | - |
| 1.8177 | 41310 | 1.2684 | - |
| 1.8182 | 41320 | 1.2654 | - |
| 1.8186 | 41330 | 1.2622 | - |
| 1.8191 | 41340 | 1.2704 | - |
| 1.8195 | 41350 | 1.2026 | - |
| 1.8199 | 41360 | 1.253 | - |
| 1.8204 | 41370 | 1.2779 | - |
| 1.8208 | 41380 | 1.2343 | - |
| 1.8213 | 41390 | 1.2653 | - |
| 1.8217 | 41400 | 1.2272 | - |
| 1.8221 | 41410 | 1.2933 | - |
| 1.8226 | 41420 | 1.2514 | - |
| 1.8230 | 41430 | 1.2548 | - |
| 1.8235 | 41440 | 1.2223 | - |
| 1.8239 | 41450 | 1.2742 | - |
| 1.8243 | 41460 | 1.2604 | - |
| 1.8248 | 41470 | 1.2647 | - |
| 1.8252 | 41480 | 1.261 | - |
| 1.8257 | 41490 | 1.2152 | - |
| 1.8261 | 41500 | 1.271 | - |
| 1.8265 | 41510 | 1.2544 | - |
| 1.8270 | 41520 | 1.2887 | - |
| 1.8274 | 41530 | 1.2867 | - |
| 1.8279 | 41540 | 1.2604 | - |
| 1.8283 | 41550 | 1.2833 | - |
| 1.8287 | 41560 | 1.2497 | - |
| 1.8292 | 41570 | 1.2885 | - |
| 1.8296 | 41580 | 1.2847 | - |
| 1.8301 | 41590 | 1.2649 | - |
| 1.8305 | 41600 | 1.3126 | - |
| 1.8309 | 41610 | 1.2479 | - |
| 1.8314 | 41620 | 1.2969 | - |
| 1.8318 | 41630 | 1.2361 | - |
| 1.8323 | 41640 | 1.2906 | - |
| 1.8327 | 41650 | 1.2385 | - |
| 1.8331 | 41660 | 1.2781 | - |
| 1.8336 | 41670 | 1.243 | - |
| 1.8340 | 41680 | 1.2267 | - |
| 1.8345 | 41690 | 1.277 | - |
| 1.8349 | 41700 | 1.2748 | - |
| 1.8353 | 41710 | 1.2984 | - |
| 1.8358 | 41720 | 1.2669 | - |
| 1.8362 | 41730 | 1.2356 | - |
| 1.8367 | 41740 | 1.3332 | - |
| 1.8371 | 41750 | 1.2548 | - |
| 1.8375 | 41760 | 1.2564 | - |
| 1.8380 | 41770 | 1.2341 | - |
| 1.8384 | 41780 | 1.2982 | - |
| 1.8389 | 41790 | 1.2592 | - |
| 1.8393 | 41800 | 1.3252 | - |
| 1.8397 | 41810 | 1.2408 | - |
| 1.8402 | 41820 | 1.3018 | - |
| 1.8406 | 41830 | 1.2611 | - |
| 1.8411 | 41840 | 1.2669 | - |
| 1.8415 | 41850 | 1.2219 | - |
| 1.8419 | 41860 | 1.2903 | - |
| 1.8424 | 41870 | 1.2382 | - |
| 1.8428 | 41880 | 1.2862 | - |
| 1.8433 | 41890 | 1.2575 | - |
| 1.8437 | 41900 | 1.2199 | - |
| 1.8441 | 41910 | 1.2695 | - |
| 1.8446 | 41920 | 1.3006 | - |
| 1.8450 | 41930 | 1.2234 | - |
| 1.8455 | 41940 | 1.3298 | - |
| 1.8459 | 41950 | 1.2137 | - |
| 1.8463 | 41960 | 1.2433 | - |
| 1.8468 | 41970 | 1.2399 | - |
| 1.8472 | 41980 | 1.2762 | - |
| 1.8477 | 41990 | 1.3331 | - |
| 1.8481 | 42000 | 1.2446 | - |
| 1.8485 | 42010 | 1.2489 | - |
| 1.8490 | 42020 | 1.241 | - |
| 1.8494 | 42030 | 1.2126 | - |
| 1.8499 | 42040 | 1.2485 | - |
| 1.8503 | 42050 | 1.2745 | - |
| 1.8507 | 42060 | 1.2937 | - |
| 1.8511 | 42069 | - | 1.4229 |
| 1.8512 | 42070 | 1.2472 | - |
| 1.8516 | 42080 | 1.2725 | - |
| 1.8521 | 42090 | 1.2441 | - |
| 1.8525 | 42100 | 1.3102 | - |
| 1.8529 | 42110 | 1.2773 | - |
| 1.8534 | 42120 | 1.2628 | - |
| 1.8538 | 42130 | 1.2595 | - |
| 1.8543 | 42140 | 1.3287 | - |
| 1.8547 | 42150 | 1.2748 | - |
| 1.8551 | 42160 | 1.2809 | - |
| 1.8556 | 42170 | 1.2611 | - |
| 1.8560 | 42180 | 1.2392 | - |
| 1.8565 | 42190 | 1.2604 | - |
| 1.8569 | 42200 | 1.3052 | - |
| 1.8573 | 42210 | 1.212 | - |
| 1.8578 | 42220 | 1.2544 | - |
| 1.8582 | 42230 | 1.2485 | - |
| 1.8587 | 42240 | 1.2703 | - |
| 1.8591 | 42250 | 1.284 | - |
| 1.8595 | 42260 | 1.2966 | - |
| 1.8600 | 42270 | 1.301 | - |
| 1.8604 | 42280 | 1.2412 | - |
| 1.8609 | 42290 | 1.2585 | - |
| 1.8613 | 42300 | 1.2882 | - |
| 1.8617 | 42310 | 1.243 | - |
| 1.8622 | 42320 | 1.2556 | - |
| 1.8626 | 42330 | 1.2515 | - |
| 1.8631 | 42340 | 1.2196 | - |
| 1.8635 | 42350 | 1.261 | - |
| 1.8639 | 42360 | 1.2633 | - |
| 1.8644 | 42370 | 1.2165 | - |
| 1.8648 | 42380 | 1.2253 | - |
| 1.8653 | 42390 | 1.2358 | - |
| 1.8657 | 42400 | 1.2548 | - |
| 1.8661 | 42410 | 1.258 | - |
| 1.8666 | 42420 | 1.2522 | - |
| 1.8670 | 42430 | 1.2694 | - |
| 1.8675 | 42440 | 1.279 | - |
| 1.8679 | 42450 | 1.2432 | - |
| 1.8683 | 42460 | 1.2929 | - |
| 1.8688 | 42470 | 1.2578 | - |
| 1.8692 | 42480 | 1.2543 | - |
| 1.8697 | 42490 | 1.298 | - |
| 1.8701 | 42500 | 1.2227 | - |
| 1.8705 | 42510 | 1.2647 | - |
| 1.8710 | 42520 | 1.2929 | - |
| 1.8714 | 42530 | 1.2756 | - |
| 1.8719 | 42540 | 1.2361 | - |
| 1.8723 | 42550 | 1.3049 | - |
| 1.8727 | 42560 | 1.2007 | - |
| 1.8732 | 42570 | 1.2228 | - |
| 1.8736 | 42580 | 1.2409 | - |
| 1.8741 | 42590 | 1.2427 | - |
| 1.8745 | 42600 | 1.2336 | - |
| 1.8749 | 42610 | 1.2435 | - |
| 1.8754 | 42620 | 1.2307 | - |
| 1.8758 | 42630 | 1.2713 | - |
| 1.8763 | 42640 | 1.3075 | - |
| 1.8767 | 42650 | 1.289 | - |
| 1.8771 | 42660 | 1.2015 | - |
| 1.8776 | 42670 | 1.225 | - |
| 1.8780 | 42680 | 1.263 | - |
| 1.8785 | 42690 | 1.2587 | - |
| 1.8789 | 42700 | 1.2727 | - |
| 1.8793 | 42710 | 1.2524 | - |
| 1.8798 | 42720 | 1.2238 | - |
| 1.8802 | 42730 | 1.2543 | - |
| 1.8807 | 42740 | 1.2365 | - |
| 1.8811 | 42750 | 1.2244 | - |
| 1.8815 | 42760 | 1.2655 | - |
| 1.8820 | 42770 | 1.2615 | - |
| 1.8824 | 42780 | 1.1978 | - |
| 1.8829 | 42790 | 1.1973 | - |
| 1.8833 | 42800 | 1.2177 | - |
| 1.8837 | 42810 | 1.2504 | - |
| 1.8842 | 42820 | 1.2827 | - |
| 1.8846 | 42830 | 1.2368 | - |
| 1.8851 | 42840 | 1.2813 | - |
| 1.8855 | 42850 | 1.2547 | - |
| 1.8859 | 42860 | 1.261 | - |
| 1.8864 | 42870 | 1.2139 | - |
| 1.8868 | 42880 | 1.2461 | - |
| 1.8873 | 42890 | 1.2092 | - |
| 1.8877 | 42900 | 1.2279 | - |
| 1.8881 | 42910 | 1.2957 | - |
| 1.8886 | 42920 | 1.2341 | - |
| 1.8890 | 42930 | 1.2043 | - |
| 1.8895 | 42940 | 1.2911 | - |
| 1.8899 | 42950 | 1.2113 | - |
| 1.8903 | 42960 | 1.2178 | - |
| 1.8908 | 42970 | 1.2258 | - |
| 1.8912 | 42980 | 1.2747 | - |
| 1.8917 | 42990 | 1.2478 | - |
| 1.8921 | 43000 | 1.2408 | - |
| 1.8925 | 43010 | 1.2478 | - |
| 1.8930 | 43020 | 1.221 | - |
| 1.8934 | 43030 | 1.2284 | - |
| 1.8939 | 43040 | 1.2927 | - |
| 1.8943 | 43050 | 1.2314 | - |
| 1.8947 | 43060 | 1.2726 | - |
| 1.8952 | 43070 | 1.2121 | - |
| 1.8956 | 43080 | 1.2661 | - |
| 1.8961 | 43090 | 1.2714 | - |
| 1.8965 | 43100 | 1.2025 | - |
| 1.8969 | 43110 | 1.2645 | - |
| 1.8974 | 43120 | 1.2225 | - |
| 1.8978 | 43130 | 1.1991 | - |
| 1.8983 | 43140 | 1.237 | - |
| 1.8987 | 43150 | 1.2331 | - |
| 1.8991 | 43160 | 1.1902 | - |
| 1.8996 | 43170 | 1.2081 | - |
| 1.9000 | 43180 | 1.2319 | - |
| 1.9005 | 43190 | 1.2096 | - |
| 1.9009 | 43200 | 1.2294 | - |
| 1.9012 | 43206 | - | 1.4650 |
| 1.9013 | 43210 | 1.2718 | - |
| 1.9018 | 43220 | 1.2537 | - |
| 1.9022 | 43230 | 1.2556 | - |
| 1.9027 | 43240 | 1.2786 | - |
| 1.9031 | 43250 | 1.2505 | - |
| 1.9035 | 43260 | 1.2189 | - |
| 1.9040 | 43270 | 1.26 | - |
| 1.9044 | 43280 | 1.293 | - |
| 1.9049 | 43290 | 1.2441 | - |
| 1.9053 | 43300 | 1.2659 | - |
| 1.9057 | 43310 | 1.234 | - |
| 1.9062 | 43320 | 1.2432 | - |
| 1.9066 | 43330 | 1.2626 | - |
| 1.9071 | 43340 | 1.2532 | - |
| 1.9075 | 43350 | 1.2517 | - |
| 1.9079 | 43360 | 1.2673 | - |
| 1.9084 | 43370 | 1.2305 | - |
| 1.9088 | 43380 | 1.2711 | - |
| 1.9093 | 43390 | 1.2272 | - |
| 1.9097 | 43400 | 1.2367 | - |
| 1.9101 | 43410 | 1.2215 | - |
| 1.9106 | 43420 | 1.2298 | - |
| 1.9110 | 43430 | 1.2569 | - |
| 1.9115 | 43440 | 1.1759 | - |
| 1.9119 | 43450 | 1.2203 | - |
| 1.9123 | 43460 | 1.2429 | - |
| 1.9128 | 43470 | 1.2088 | - |
| 1.9132 | 43480 | 1.2465 | - |
| 1.9137 | 43490 | 1.2587 | - |
| 1.9141 | 43500 | 1.2091 | - |
| 1.9145 | 43510 | 1.2183 | - |
| 1.9150 | 43520 | 1.2518 | - |
| 1.9154 | 43530 | 1.275 | - |
| 1.9159 | 43540 | 1.228 | - |
| 1.9163 | 43550 | 1.2183 | - |
| 1.9167 | 43560 | 1.2786 | - |
| 1.9172 | 43570 | 1.2444 | - |
| 1.9176 | 43580 | 1.1888 | - |
| 1.9181 | 43590 | 1.2629 | - |
| 1.9185 | 43600 | 1.2104 | - |
| 1.9189 | 43610 | 1.2146 | - |
| 1.9194 | 43620 | 1.1956 | - |
| 1.9198 | 43630 | 1.2573 | - |
| 1.9203 | 43640 | 1.2178 | - |
| 1.9207 | 43650 | 1.2567 | - |
| 1.9211 | 43660 | 1.2283 | - |
| 1.9216 | 43670 | 1.2332 | - |
| 1.9220 | 43680 | 1.2694 | - |
| 1.9225 | 43690 | 1.2485 | - |
| 1.9229 | 43700 | 1.2436 | - |
| 1.9233 | 43710 | 1.2344 | - |
| 1.9238 | 43720 | 1.2543 | - |
| 1.9242 | 43730 | 1.2306 | - |
| 1.9247 | 43740 | 1.205 | - |
| 1.9251 | 43750 | 1.2398 | - |
| 1.9255 | 43760 | 1.1984 | - |
| 1.9260 | 43770 | 1.2118 | - |
| 1.9264 | 43780 | 1.1936 | - |
| 1.9269 | 43790 | 1.2391 | - |
| 1.9273 | 43800 | 1.1831 | - |
| 1.9277 | 43810 | 1.2139 | - |
| 1.9282 | 43820 | 1.2443 | - |
| 1.9286 | 43830 | 1.2328 | - |
| 1.9291 | 43840 | 1.2027 | - |
| 1.9295 | 43850 | 1.2173 | - |
| 1.9299 | 43860 | 1.3188 | - |
| 1.9304 | 43870 | 1.2375 | - |
| 1.9308 | 43880 | 1.2259 | - |
| 1.9313 | 43890 | 1.3048 | - |
| 1.9317 | 43900 | 1.2067 | - |
| 1.9321 | 43910 | 1.2558 | - |
| 1.9326 | 43920 | 1.2306 | - |
| 1.9330 | 43930 | 1.3222 | - |
| 1.9335 | 43940 | 1.1926 | - |
| 1.9339 | 43950 | 1.2498 | - |
| 1.9343 | 43960 | 1.2325 | - |
| 1.9348 | 43970 | 1.2411 | - |
| 1.9352 | 43980 | 1.2125 | - |
| 1.9357 | 43990 | 1.2426 | - |
| 1.9361 | 44000 | 1.2147 | - |
| 1.9365 | 44010 | 1.2195 | - |
| 1.9370 | 44020 | 1.2321 | - |
| 1.9374 | 44030 | 1.2523 | - |
| 1.9379 | 44040 | 1.1595 | - |
| 1.9383 | 44050 | 1.2679 | - |
| 1.9387 | 44060 | 1.2489 | - |
| 1.9392 | 44070 | 1.2034 | - |
| 1.9396 | 44080 | 1.1912 | - |
| 1.9401 | 44090 | 1.2504 | - |
| 1.9405 | 44100 | 1.2502 | - |
| 1.9409 | 44110 | 1.1937 | - |
| 1.9414 | 44120 | 1.2048 | - |
| 1.9418 | 44130 | 1.27 | - |
| 1.9423 | 44140 | 1.2108 | - |
| 1.9427 | 44150 | 1.269 | - |
| 1.9431 | 44160 | 1.1876 | - |
| 1.9436 | 44170 | 1.2537 | - |
| 1.9440 | 44180 | 1.265 | - |
| 1.9445 | 44190 | 1.2449 | - |
| 1.9449 | 44200 | 1.2249 | - |
| 1.9453 | 44210 | 1.1842 | - |
| 1.9458 | 44220 | 1.2124 | - |
| 1.9462 | 44230 | 1.2052 | - |
| 1.9467 | 44240 | 1.2232 | - |
| 1.9471 | 44250 | 1.2927 | - |
| 1.9475 | 44260 | 1.2284 | - |
| 1.9480 | 44270 | 1.2425 | - |
| 1.9484 | 44280 | 1.2172 | - |
| 1.9489 | 44290 | 1.221 | - |
| 1.9493 | 44300 | 1.1802 | - |
| 1.9497 | 44310 | 1.2276 | - |
| 1.9502 | 44320 | 1.209 | - |
| 1.9506 | 44330 | 1.2081 | - |
| 1.9511 | 44340 | 1.191 | - |
| 1.9512 | 44343 | - | 1.4393 |
| 1.9515 | 44350 | 1.1747 | - |
| 1.9519 | 44360 | 1.2651 | - |
| 1.9524 | 44370 | 1.2358 | - |
| 1.9528 | 44380 | 1.2293 | - |
| 1.9533 | 44390 | 1.2077 | - |
| 1.9537 | 44400 | 1.1746 | - |
| 1.9541 | 44410 | 1.1921 | - |
| 1.9546 | 44420 | 1.2008 | - |
| 1.9550 | 44430 | 1.1774 | - |
| 1.9555 | 44440 | 1.2157 | - |
| 1.9559 | 44450 | 1.2056 | - |
| 1.9563 | 44460 | 1.2213 | - |
| 1.9568 | 44470 | 1.1978 | - |
| 1.9572 | 44480 | 1.2311 | - |
| 1.9577 | 44490 | 1.2527 | - |
| 1.9581 | 44500 | 1.24 | - |
| 1.9585 | 44510 | 1.192 | - |
| 1.9590 | 44520 | 1.2173 | - |
| 1.9594 | 44530 | 1.2202 | - |
| 1.9599 | 44540 | 1.2196 | - |
| 1.9603 | 44550 | 1.2162 | - |
| 1.9607 | 44560 | 1.2352 | - |
| 1.9612 | 44570 | 1.1828 | - |
| 1.9616 | 44580 | 1.1828 | - |
| 1.9621 | 44590 | 1.2272 | - |
| 1.9625 | 44600 | 1.2181 | - |
| 1.9629 | 44610 | 1.2246 | - |
| 1.9634 | 44620 | 1.1387 | - |
| 1.9638 | 44630 | 1.2135 | - |
| 1.9643 | 44640 | 1.2216 | - |
| 1.9647 | 44650 | 1.1748 | - |
| 1.9652 | 44660 | 1.2193 | - |
| 1.9656 | 44670 | 1.2107 | - |
| 1.9660 | 44680 | 1.226 | - |
| 1.9665 | 44690 | 1.193 | - |
| 1.9669 | 44700 | 1.2014 | - |
| 1.9674 | 44710 | 1.2137 | - |
| 1.9678 | 44720 | 1.149 | - |
| 1.9682 | 44730 | 1.2528 | - |
| 1.9687 | 44740 | 1.2081 | - |
| 1.9691 | 44750 | 1.1579 | - |
| 1.9696 | 44760 | 1.2146 | - |
| 1.9700 | 44770 | 1.2108 | - |
| 1.9704 | 44780 | 1.2441 | - |
| 1.9709 | 44790 | 1.2371 | - |
| 1.9713 | 44800 | 1.1517 | - |
| 1.9718 | 44810 | 1.2325 | - |
| 1.9722 | 44820 | 1.195 | - |
| 1.9726 | 44830 | 1.1587 | - |
| 1.9731 | 44840 | 1.1637 | - |
| 1.9735 | 44850 | 1.1501 | - |
| 1.9740 | 44860 | 1.2464 | - |
| 1.9744 | 44870 | 1.2132 | - |
| 1.9748 | 44880 | 1.191 | - |
| 1.9753 | 44890 | 1.2337 | - |
| 1.9757 | 44900 | 1.2 | - |
| 1.9762 | 44910 | 1.2284 | - |
| 1.9766 | 44920 | 1.204 | - |
| 1.9770 | 44930 | 1.2139 | - |
| 1.9775 | 44940 | 1.2 | - |
| 1.9779 | 44950 | 1.2382 | - |
| 1.9784 | 44960 | 1.2091 | - |
| 1.9788 | 44970 | 1.1872 | - |
| 1.9792 | 44980 | 1.2054 | - |
| 1.9797 | 44990 | 1.216 | - |
| 1.9801 | 45000 | 1.1583 | - |
| 1.9806 | 45010 | 1.2521 | - |
| 1.9810 | 45020 | 1.1383 | - |
| 1.9814 | 45030 | 1.2627 | - |
| 1.9819 | 45040 | 1.2044 | - |
| 1.9823 | 45050 | 1.1981 | - |
| 1.9828 | 45060 | 1.2125 | - |
| 1.9832 | 45070 | 1.1665 | - |
| 1.9836 | 45080 | 1.2238 | - |
| 1.9841 | 45090 | 1.2506 | - |
| 1.9845 | 45100 | 1.209 | - |
| 1.9850 | 45110 | 1.1833 | - |
| 1.9854 | 45120 | 1.2208 | - |
| 1.9858 | 45130 | 1.1635 | - |
| 1.9863 | 45140 | 1.1512 | - |
| 1.9867 | 45150 | 1.1986 | - |
| 1.9872 | 45160 | 1.2217 | - |
| 1.9876 | 45170 | 1.1708 | - |
| 1.9880 | 45180 | 1.1945 | - |
| 1.9885 | 45190 | 1.2086 | - |
| 1.9889 | 45200 | 1.1804 | - |
| 1.9894 | 45210 | 1.2037 | - |
| 1.9898 | 45220 | 1.181 | - |
| 1.9902 | 45230 | 1.2427 | - |
| 1.9907 | 45240 | 1.2067 | - |
| 1.9911 | 45250 | 1.1328 | - |
| 1.9916 | 45260 | 1.1816 | - |
| 1.9920 | 45270 | 1.1682 | - |
| 1.9924 | 45280 | 1.1889 | - |
| 1.9929 | 45290 | 1.2515 | - |
| 1.9933 | 45300 | 1.2586 | - |
| 1.9938 | 45310 | 1.24 | - |
| 1.9942 | 45320 | 1.235 | - |
| 1.9946 | 45330 | 1.2196 | - |
| 1.9951 | 45340 | 1.2146 | - |
| 1.9955 | 45350 | 1.1598 | - |
| 1.9960 | 45360 | 1.2057 | - |
| 1.9964 | 45370 | 1.1568 | - |
| 1.9968 | 45380 | 1.1764 | - |
| 1.9973 | 45390 | 1.2248 | - |
| 1.9977 | 45400 | 1.2201 | - |
| 1.9982 | 45410 | 1.1651 | - |
| 1.9986 | 45420 | 1.1533 | - |
| 1.9990 | 45430 | 1.1544 | - |
| 1.9995 | 45440 | 1.2051 | - |
| 1.9999 | 45450 | 1.1873 | - |
| 2.0004 | 45460 | 1.1677 | - |
| 2.0008 | 45470 | 1.1805 | - |
| 2.0012 | 45480 | 1.1588 | 1.4466 |
| 2.0017 | 45490 | 1.1435 | - |
| 2.0021 | 45500 | 1.161 | - |
| 2.0026 | 45510 | 1.1623 | - |
| 2.0030 | 45520 | 1.1286 | - |
| 2.0034 | 45530 | 1.1396 | - |
| 2.0039 | 45540 | 1.1261 | - |
| 2.0043 | 45550 | 1.148 | - |
| 2.0048 | 45560 | 1.1262 | - |
| 2.0052 | 45570 | 1.1199 | - |
| 2.0056 | 45580 | 1.1295 | - |
| 2.0061 | 45590 | 1.1318 | - |
| 2.0065 | 45600 | 1.1313 | - |
| 2.0070 | 45610 | 1.1575 | - |
| 2.0074 | 45620 | 1.1377 | - |
| 2.0078 | 45630 | 1.1511 | - |
| 2.0083 | 45640 | 1.1992 | - |
| 2.0087 | 45650 | 1.133 | - |
| 2.0092 | 45660 | 1.1312 | - |
| 2.0096 | 45670 | 1.173 | - |
| 2.0100 | 45680 | 1.1198 | - |
| 2.0105 | 45690 | 1.1622 | - |
| 2.0109 | 45700 | 1.1127 | - |
| 2.0114 | 45710 | 1.1428 | - |
| 2.0118 | 45720 | 1.1418 | - |
| 2.0122 | 45730 | 1.1217 | - |
| 2.0127 | 45740 | 1.2172 | - |
| 2.0131 | 45750 | 1.129 | - |
| 2.0136 | 45760 | 1.1428 | - |
| 2.0140 | 45770 | 1.1452 | - |
| 2.0144 | 45780 | 1.145 | - |
| 2.0149 | 45790 | 1.1729 | - |
| 2.0153 | 45800 | 1.1727 | - |
| 2.0158 | 45810 | 1.1661 | - |
| 2.0162 | 45820 | 1.1989 | - |
| 2.0166 | 45830 | 1.1421 | - |
| 2.0171 | 45840 | 1.15 | - |
| 2.0175 | 45850 | 1.1472 | - |
| 2.0180 | 45860 | 1.0956 | - |
| 2.0184 | 45870 | 1.1323 | - |
| 2.0188 | 45880 | 1.1322 | - |
| 2.0193 | 45890 | 1.1787 | - |
| 2.0197 | 45900 | 1.1562 | - |
| 2.0202 | 45910 | 1.1066 | - |
| 2.0206 | 45920 | 1.1027 | - |
| 2.0210 | 45930 | 1.1457 | - |
| 2.0215 | 45940 | 1.1302 | - |
| 2.0219 | 45950 | 1.1472 | - |
| 2.0224 | 45960 | 1.1042 | - |
| 2.0228 | 45970 | 1.137 | - |
| 2.0232 | 45980 | 1.1179 | - |
| 2.0237 | 45990 | 1.13 | - |
| 2.0241 | 46000 | 1.1145 | - |
| 2.0246 | 46010 | 1.1494 | - |
| 2.0250 | 46020 | 1.1831 | - |
| 2.0254 | 46030 | 1.1463 | - |
| 2.0259 | 46040 | 1.1235 | - |
| 2.0263 | 46050 | 1.1468 | - |
| 2.0268 | 46060 | 1.1911 | - |
| 2.0272 | 46070 | 1.0997 | - |
| 2.0276 | 46080 | 1.1333 | - |
| 2.0281 | 46090 | 1.1641 | - |
| 2.0285 | 46100 | 1.1764 | - |
| 2.0290 | 46110 | 1.1559 | - |
| 2.0294 | 46120 | 1.0704 | - |
| 2.0298 | 46130 | 1.13 | - |
| 2.0303 | 46140 | 1.1119 | - |
| 2.0307 | 46150 | 1.174 | - |
| 2.0312 | 46160 | 1.1778 | - |
| 2.0316 | 46170 | 1.1358 | - |
| 2.0320 | 46180 | 1.1365 | - |
| 2.0325 | 46190 | 1.1975 | - |
| 2.0329 | 46200 | 1.138 | - |
| 2.0334 | 46210 | 1.2148 | - |
| 2.0338 | 46220 | 1.1404 | - |
| 2.0342 | 46230 | 1.162 | - |
| 2.0347 | 46240 | 1.1609 | - |
| 2.0351 | 46250 | 1.1473 | - |
| 2.0356 | 46260 | 1.1309 | - |
| 2.0360 | 46270 | 1.0938 | - |
| 2.0364 | 46280 | 1.2018 | - |
| 2.0369 | 46290 | 1.1356 | - |
| 2.0373 | 46300 | 1.168 | - |
| 2.0378 | 46310 | 1.1588 | - |
| 2.0382 | 46320 | 1.1548 | - |
| 2.0386 | 46330 | 1.1634 | - |
| 2.0391 | 46340 | 1.1966 | - |
| 2.0395 | 46350 | 1.1124 | - |
| 2.0400 | 46360 | 1.1321 | - |
| 2.0404 | 46370 | 1.0939 | - |
| 2.0408 | 46380 | 1.0787 | - |
| 2.0413 | 46390 | 1.1545 | - |
| 2.0417 | 46400 | 1.128 | - |
| 2.0422 | 46410 | 1.1119 | - |
| 2.0426 | 46420 | 1.1249 | - |
| 2.0430 | 46430 | 1.0741 | - |
| 2.0435 | 46440 | 1.1776 | - |
| 2.0439 | 46450 | 1.1439 | - |
| 2.0444 | 46460 | 1.1645 | - |
| 2.0448 | 46470 | 1.1725 | - |
| 2.0452 | 46480 | 1.1332 | - |
| 2.0457 | 46490 | 1.1602 | - |
| 2.0461 | 46500 | 1.1338 | - |
| 2.0466 | 46510 | 1.1509 | - |
| 2.0470 | 46520 | 1.1058 | - |
| 2.0474 | 46530 | 1.1414 | - |
| 2.0479 | 46540 | 1.0845 | - |
| 2.0483 | 46550 | 1.1417 | - |
| 2.0488 | 46560 | 1.1161 | - |
| 2.0492 | 46570 | 1.1876 | - |
| 2.0496 | 46580 | 1.152 | - |
| 2.0501 | 46590 | 1.1135 | - |
| 2.0505 | 46600 | 1.1572 | - |
| 2.0510 | 46610 | 1.1277 | - |
| 2.0513 | 46617 | - | 1.4280 |
| 2.0514 | 46620 | 1.1353 | - |
| 2.0518 | 46630 | 1.1452 | - |
| 2.0523 | 46640 | 1.1292 | - |
| 2.0527 | 46650 | 1.1286 | - |
| 2.0532 | 46660 | 1.0943 | - |
| 2.0536 | 46670 | 1.1508 | - |
| 2.0540 | 46680 | 1.1514 | - |
| 2.0545 | 46690 | 1.1258 | - |
| 2.0549 | 46700 | 1.1509 | - |
| 2.0554 | 46710 | 1.1591 | - |
| 2.0558 | 46720 | 1.1293 | - |
| 2.0562 | 46730 | 1.1361 | - |
| 2.0567 | 46740 | 1.1483 | - |
| 2.0571 | 46750 | 1.1384 | - |
| 2.0576 | 46760 | 1.1348 | - |
| 2.0580 | 46770 | 1.1346 | - |
| 2.0584 | 46780 | 1.144 | - |
| 2.0589 | 46790 | 1.1114 | - |
| 2.0593 | 46800 | 1.1814 | - |
| 2.0598 | 46810 | 1.1427 | - |
| 2.0602 | 46820 | 1.1264 | - |
| 2.0606 | 46830 | 1.0985 | - |
| 2.0611 | 46840 | 1.1533 | - |
| 2.0615 | 46850 | 1.0977 | - |
| 2.0620 | 46860 | 1.1625 | - |
| 2.0624 | 46870 | 1.113 | - |
| 2.0628 | 46880 | 1.067 | - |
| 2.0633 | 46890 | 1.0999 | - |
| 2.0637 | 46900 | 1.1682 | - |
| 2.0642 | 46910 | 1.155 | - |
| 2.0646 | 46920 | 1.1295 | - |
| 2.0650 | 46930 | 1.17 | - |
| 2.0655 | 46940 | 1.1496 | - |
| 2.0659 | 46950 | 1.0787 | - |
| 2.0664 | 46960 | 1.131 | - |
| 2.0668 | 46970 | 1.1335 | - |
| 2.0672 | 46980 | 1.0748 | - |
| 2.0677 | 46990 | 1.1728 | - |
| 2.0681 | 47000 | 1.1168 | - |
| 2.0686 | 47010 | 1.1243 | - |
| 2.0690 | 47020 | 1.1152 | - |
| 2.0694 | 47030 | 1.1387 | - |
| 2.0699 | 47040 | 1.1423 | - |
| 2.0703 | 47050 | 1.1311 | - |
| 2.0708 | 47060 | 1.1319 | - |
| 2.0712 | 47070 | 1.1475 | - |
| 2.0716 | 47080 | 1.1193 | - |
| 2.0721 | 47090 | 1.1414 | - |
| 2.0725 | 47100 | 1.1108 | - |
| 2.0730 | 47110 | 1.1304 | - |
| 2.0734 | 47120 | 1.1273 | - |
| 2.0738 | 47130 | 1.1309 | - |
| 2.0743 | 47140 | 1.1311 | - |
| 2.0747 | 47150 | 1.1579 | - |
| 2.0752 | 47160 | 1.1694 | - |
| 2.0756 | 47170 | 1.137 | - |
| 2.0760 | 47180 | 1.117 | - |
| 2.0765 | 47190 | 1.1054 | - |
| 2.0769 | 47200 | 1.0723 | - |
| 2.0774 | 47210 | 1.1011 | - |
| 2.0778 | 47220 | 1.1403 | - |
| 2.0782 | 47230 | 1.1405 | - |
| 2.0787 | 47240 | 1.1642 | - |
| 2.0791 | 47250 | 1.1169 | - |
| 2.0796 | 47260 | 1.1318 | - |
| 2.0800 | 47270 | 1.1309 | - |
| 2.0804 | 47280 | 1.0999 | - |
| 2.0809 | 47290 | 1.1413 | - |
| 2.0813 | 47300 | 1.1334 | - |
| 2.0818 | 47310 | 1.1066 | - |
| 2.0822 | 47320 | 1.1302 | - |
| 2.0826 | 47330 | 1.0762 | - |
| 2.0831 | 47340 | 1.1662 | - |
| 2.0835 | 47350 | 1.1621 | - |
| 2.0840 | 47360 | 1.2094 | - |
| 2.0844 | 47370 | 1.0951 | - |
| 2.0848 | 47380 | 1.1515 | - |
| 2.0853 | 47390 | 1.1212 | - |
| 2.0857 | 47400 | 1.0982 | - |
| 2.0862 | 47410 | 1.1406 | - |
| 2.0866 | 47420 | 1.1067 | - |
| 2.0870 | 47430 | 1.1151 | - |
| 2.0875 | 47440 | 1.1266 | - |
| 2.0879 | 47450 | 1.117 | - |
| 2.0884 | 47460 | 1.1597 | - |
| 2.0888 | 47470 | 1.1563 | - |
| 2.0892 | 47480 | 1.1151 | - |
| 2.0897 | 47490 | 1.1321 | - |
| 2.0901 | 47500 | 1.0743 | - |
| 2.0906 | 47510 | 1.152 | - |
| 2.0910 | 47520 | 1.1018 | - |
| 2.0914 | 47530 | 1.0976 | - |
| 2.0919 | 47540 | 1.1333 | - |
| 2.0923 | 47550 | 1.1535 | - |
| 2.0928 | 47560 | 1.0793 | - |
| 2.0932 | 47570 | 1.189 | - |
| 2.0936 | 47580 | 1.125 | - |
| 2.0941 | 47590 | 1.1049 | - |
| 2.0945 | 47600 | 1.1202 | - |
| 2.0950 | 47610 | 1.0985 | - |
| 2.0954 | 47620 | 1.1431 | - |
| 2.0958 | 47630 | 1.128 | - |
| 2.0963 | 47640 | 1.1152 | - |
| 2.0967 | 47650 | 1.1643 | - |
| 2.0972 | 47660 | 1.0748 | - |
| 2.0976 | 47670 | 1.1251 | - |
| 2.0980 | 47680 | 1.1402 | - |
| 2.0985 | 47690 | 1.1132 | - |
| 2.0989 | 47700 | 1.1114 | - |
| 2.0994 | 47710 | 1.1201 | - |
| 2.0998 | 47720 | 1.1324 | - |
| 2.1002 | 47730 | 1.107 | - |
| 2.1007 | 47740 | 1.1409 | - |
| 2.1011 | 47750 | 1.1447 | - |
| 2.1013 | 47754 | - | 1.4399 |
| 2.1016 | 47760 | 1.1183 | - |
| 2.1020 | 47770 | 1.089 | - |
| 2.1024 | 47780 | 1.1683 | - |
| 2.1029 | 47790 | 1.1189 | - |
| 2.1033 | 47800 | 1.1497 | - |
| 2.1038 | 47810 | 1.1256 | - |
| 2.1042 | 47820 | 1.0732 | - |
| 2.1046 | 47830 | 1.1136 | - |
| 2.1051 | 47840 | 1.0963 | - |
| 2.1055 | 47850 | 1.1425 | - |
| 2.1060 | 47860 | 1.1293 | - |
| 2.1064 | 47870 | 1.0829 | - |
| 2.1068 | 47880 | 1.1116 | - |
| 2.1073 | 47890 | 1.0804 | - |
| 2.1077 | 47900 | 1.1267 | - |
| 2.1082 | 47910 | 1.1318 | - |
| 2.1086 | 47920 | 1.1302 | - |
| 2.1090 | 47930 | 1.111 | - |
| 2.1095 | 47940 | 1.1409 | - |
| 2.1099 | 47950 | 1.0963 | - |
| 2.1104 | 47960 | 1.1185 | - |
| 2.1108 | 47970 | 1.1154 | - |
| 2.1112 | 47980 | 1.1677 | - |
| 2.1117 | 47990 | 1.0884 | - |
| 2.1121 | 48000 | 1.1258 | - |
| 2.1126 | 48010 | 1.1174 | - |
| 2.1130 | 48020 | 1.136 | - |
| 2.1134 | 48030 | 1.1272 | - |
| 2.1139 | 48040 | 1.1159 | - |
| 2.1143 | 48050 | 1.1314 | - |
| 2.1148 | 48060 | 1.1025 | - |
| 2.1152 | 48070 | 1.1034 | - |
| 2.1156 | 48080 | 1.1151 | - |
| 2.1161 | 48090 | 1.0858 | - |
| 2.1165 | 48100 | 1.1712 | - |
| 2.1170 | 48110 | 1.0976 | - |
| 2.1174 | 48120 | 1.1011 | - |
| 2.1178 | 48130 | 1.1609 | - |
| 2.1183 | 48140 | 1.1451 | - |
| 2.1187 | 48150 | 1.1546 | - |
| 2.1192 | 48160 | 1.0814 | - |
| 2.1196 | 48170 | 1.1571 | - |
| 2.1200 | 48180 | 1.1015 | - |
| 2.1205 | 48190 | 1.1021 | - |
| 2.1209 | 48200 | 1.135 | - |
| 2.1214 | 48210 | 1.0967 | - |
| 2.1218 | 48220 | 1.0826 | - |
| 2.1222 | 48230 | 1.1111 | - |
| 2.1227 | 48240 | 1.0837 | - |
| 2.1231 | 48250 | 1.1292 | - |
| 2.1236 | 48260 | 1.1014 | - |
| 2.1240 | 48270 | 1.1874 | - |
| 2.1244 | 48280 | 1.1611 | - |
| 2.1249 | 48290 | 1.1514 | - |
| 2.1253 | 48300 | 1.0561 | - |
| 2.1258 | 48310 | 1.139 | - |
| 2.1262 | 48320 | 1.1302 | - |
| 2.1266 | 48330 | 1.1391 | - |
| 2.1271 | 48340 | 1.1482 | - |
| 2.1275 | 48350 | 1.1474 | - |
| 2.1280 | 48360 | 1.1067 | - |
| 2.1284 | 48370 | 1.1239 | - |
| 2.1288 | 48380 | 1.1108 | - |
| 2.1293 | 48390 | 1.1379 | - |
| 2.1297 | 48400 | 1.1529 | - |
| 2.1302 | 48410 | 1.1783 | - |
| 2.1306 | 48420 | 1.0841 | - |
| 2.1310 | 48430 | 1.1482 | - |
| 2.1315 | 48440 | 1.1356 | - |
| 2.1319 | 48450 | 1.0778 | - |
| 2.1324 | 48460 | 1.13 | - |
| 2.1328 | 48470 | 1.1024 | - |
| 2.1332 | 48480 | 1.1644 | - |
| 2.1337 | 48490 | 1.0955 | - |
| 2.1341 | 48500 | 1.1487 | - |
| 2.1346 | 48510 | 1.1176 | - |
| 2.1350 | 48520 | 1.1658 | - |
| 2.1354 | 48530 | 1.1444 | - |
| 2.1359 | 48540 | 1.0683 | - |
| 2.1363 | 48550 | 1.1197 | - |
| 2.1368 | 48560 | 1.1317 | - |
| 2.1372 | 48570 | 1.0842 | - |
| 2.1376 | 48580 | 1.1293 | - |
| 2.1381 | 48590 | 1.148 | - |
| 2.1385 | 48600 | 1.0655 | - |
| 2.1390 | 48610 | 1.0856 | - |
| 2.1394 | 48620 | 1.1494 | - |
| 2.1398 | 48630 | 1.108 | - |
| 2.1403 | 48640 | 1.1439 | - |
| 2.1407 | 48650 | 1.0594 | - |
| 2.1412 | 48660 | 1.1249 | - |
| 2.1416 | 48670 | 1.1617 | - |
| 2.1420 | 48680 | 1.0733 | - |
| 2.1425 | 48690 | 1.1022 | - |
| 2.1429 | 48700 | 1.1487 | - |
| 2.1434 | 48710 | 1.1455 | - |
| 2.1438 | 48720 | 1.1223 | - |
| 2.1442 | 48730 | 1.0898 | - |
| 2.1447 | 48740 | 1.1267 | - |
| 2.1451 | 48750 | 1.118 | - |
| 2.1456 | 48760 | 1.0967 | - |
| 2.1460 | 48770 | 1.1294 | - |
| 2.1464 | 48780 | 1.1186 | - |
| 2.1469 | 48790 | 1.117 | - |
| 2.1473 | 48800 | 1.1796 | - |
| 2.1478 | 48810 | 1.0751 | - |
| 2.1482 | 48820 | 1.1539 | - |
| 2.1486 | 48830 | 1.1054 | - |
| 2.1491 | 48840 | 1.1049 | - |
| 2.1495 | 48850 | 1.0968 | - |
| 2.1500 | 48860 | 1.1159 | - |
| 2.1504 | 48870 | 1.1218 | - |
| 2.1508 | 48880 | 1.1396 | - |
| 2.1513 | 48890 | 1.1175 | - |
| 2.1513 | 48891 | - | 1.4101 |
| 2.1517 | 48900 | 1.0506 | - |
| 2.1522 | 48910 | 1.1275 | - |
| 2.1526 | 48920 | 1.0995 | - |
| 2.1530 | 48930 | 1.1034 | - |
| 2.1535 | 48940 | 1.1638 | - |
| 2.1539 | 48950 | 1.1007 | - |
| 2.1544 | 48960 | 1.1156 | - |
| 2.1548 | 48970 | 1.068 | - |
| 2.1552 | 48980 | 1.1299 | - |
| 2.1557 | 48990 | 1.1209 | - |
| 2.1561 | 49000 | 1.1112 | - |
| 2.1566 | 49010 | 1.0734 | - |
| 2.1570 | 49020 | 1.1103 | - |
| 2.1574 | 49030 | 1.0968 | - |
| 2.1579 | 49040 | 1.1753 | - |
| 2.1583 | 49050 | 1.1101 | - |
| 2.1588 | 49060 | 1.0715 | - |
| 2.1592 | 49070 | 1.139 | - |
| 2.1596 | 49080 | 1.0928 | - |
| 2.1601 | 49090 | 1.0868 | - |
| 2.1605 | 49100 | 1.0935 | - |
| 2.1610 | 49110 | 1.0937 | - |
| 2.1614 | 49120 | 1.0755 | - |
| 2.1618 | 49130 | 1.0998 | - |
| 2.1623 | 49140 | 1.1163 | - |
| 2.1627 | 49150 | 1.1277 | - |
| 2.1632 | 49160 | 1.1101 | - |
| 2.1636 | 49170 | 1.1342 | - |
| 2.1640 | 49180 | 1.0917 | - |
| 2.1645 | 49190 | 1.1043 | - |
| 2.1649 | 49200 | 1.1365 | - |
| 2.1654 | 49210 | 1.1702 | - |
| 2.1658 | 49220 | 1.1341 | - |
| 2.1662 | 49230 | 1.1541 | - |
| 2.1667 | 49240 | 1.0884 | - |
| 2.1671 | 49250 | 1.1467 | - |
| 2.1676 | 49260 | 1.1442 | - |
| 2.1680 | 49270 | 1.0916 | - |
| 2.1684 | 49280 | 1.1297 | - |
| 2.1689 | 49290 | 1.1187 | - |
| 2.1693 | 49300 | 1.1316 | - |
| 2.1698 | 49310 | 1.1348 | - |
| 2.1702 | 49320 | 1.1328 | - |
| 2.1706 | 49330 | 1.1327 | - |
| 2.1711 | 49340 | 1.1363 | - |
| 2.1715 | 49350 | 1.1496 | - |
| 2.1720 | 49360 | 1.1214 | - |
| 2.1724 | 49370 | 1.0989 | - |
| 2.1728 | 49380 | 1.1128 | - |
| 2.1733 | 49390 | 1.1109 | - |
| 2.1737 | 49400 | 1.0502 | - |
| 2.1742 | 49410 | 1.1199 | - |
| 2.1746 | 49420 | 1.1522 | - |
| 2.1750 | 49430 | 1.0955 | - |
| 2.1755 | 49440 | 1.1256 | - |
| 2.1759 | 49450 | 1.0977 | - |
| 2.1764 | 49460 | 1.1316 | - |
| 2.1768 | 49470 | 1.0727 | - |
| 2.1772 | 49480 | 1.091 | - |
| 2.1777 | 49490 | 1.1476 | - |
| 2.1781 | 49500 | 1.0993 | - |
| 2.1786 | 49510 | 1.0953 | - |
| 2.1790 | 49520 | 1.1485 | - |
| 2.1794 | 49530 | 1.1321 | - |
| 2.1799 | 49540 | 1.0641 | - |
| 2.1803 | 49550 | 1.1163 | - |
| 2.1808 | 49560 | 1.0851 | - |
| 2.1812 | 49570 | 1.1525 | - |
| 2.1816 | 49580 | 1.1256 | - |
| 2.1821 | 49590 | 1.0561 | - |
| 2.1825 | 49600 | 1.0944 | - |
| 2.1830 | 49610 | 1.0914 | - |
| 2.1834 | 49620 | 1.0825 | - |
| 2.1838 | 49630 | 1.0701 | - |
| 2.1843 | 49640 | 1.1396 | - |
| 2.1847 | 49650 | 1.0871 | - |
| 2.1852 | 49660 | 1.0919 | - |
| 2.1856 | 49670 | 1.0439 | - |
| 2.1860 | 49680 | 1.1112 | - |
| 2.1865 | 49690 | 1.133 | - |
| 2.1869 | 49700 | 1.059 | - |
| 2.1874 | 49710 | 1.104 | - |
| 2.1878 | 49720 | 1.0858 | - |
| 2.1882 | 49730 | 1.1178 | - |
| 2.1887 | 49740 | 1.0722 | - |
| 2.1891 | 49750 | 1.136 | - |
| 2.1896 | 49760 | 1.149 | - |
| 2.1900 | 49770 | 1.1167 | - |
| 2.1904 | 49780 | 1.1606 | - |
| 2.1909 | 49790 | 1.1402 | - |
| 2.1913 | 49800 | 1.1197 | - |
| 2.1918 | 49810 | 1.0781 | - |
| 2.1922 | 49820 | 1.1023 | - |
| 2.1926 | 49830 | 1.0872 | - |
| 2.1931 | 49840 | 1.122 | - |
| 2.1935 | 49850 | 1.1593 | - |
| 2.1940 | 49860 | 1.0881 | - |
| 2.1944 | 49870 | 1.0719 | - |
| 2.1948 | 49880 | 1.1236 | - |
| 2.1953 | 49890 | 1.1484 | - |
| 2.1957 | 49900 | 1.0953 | - |
| 2.1962 | 49910 | 1.1773 | - |
| 2.1966 | 49920 | 1.1479 | - |
| 2.1970 | 49930 | 1.0996 | - |
| 2.1975 | 49940 | 1.1329 | - |
| 2.1979 | 49950 | 1.1454 | - |
| 2.1984 | 49960 | 1.1236 | - |
| 2.1988 | 49970 | 1.1117 | - |
| 2.1992 | 49980 | 1.1038 | - |
| 2.1997 | 49990 | 1.1434 | - |
| 2.2001 | 50000 | 1.1429 | - |
| 2.2006 | 50010 | 1.1118 | - |
| 2.2010 | 50020 | 1.0844 | - |
| 2.2014 | 50028 | - | 1.4408 |
| 2.2014 | 50030 | 1.113 | - |
| 2.2019 | 50040 | 1.1151 | - |
| 2.2023 | 50050 | 1.0982 | - |
| 2.2028 | 50060 | 1.0955 | - |
| 2.2032 | 50070 | 1.1292 | - |
| 2.2036 | 50080 | 1.0705 | - |
| 2.2041 | 50090 | 1.0683 | - |
| 2.2045 | 50100 | 1.1567 | - |
| 2.2050 | 50110 | 1.1074 | - |
| 2.2054 | 50120 | 1.0935 | - |
| 2.2058 | 50130 | 1.1724 | - |
| 2.2063 | 50140 | 1.1547 | - |
| 2.2067 | 50150 | 1.1448 | - |
| 2.2072 | 50160 | 1.0953 | - |
| 2.2076 | 50170 | 1.1117 | - |
| 2.2080 | 50180 | 1.0832 | - |
| 2.2085 | 50190 | 1.0815 | - |
| 2.2089 | 50200 | 1.084 | - |
| 2.2094 | 50210 | 1.1106 | - |
| 2.2098 | 50220 | 1.1453 | - |
| 2.2102 | 50230 | 1.1225 | - |
| 2.2107 | 50240 | 1.1107 | - |
| 2.2111 | 50250 | 1.1041 | - |
| 2.2116 | 50260 | 1.0893 | - |
| 2.2120 | 50270 | 1.1054 | - |
| 2.2124 | 50280 | 1.1094 | - |
| 2.2129 | 50290 | 1.108 | - |
| 2.2133 | 50300 | 1.1005 | - |
| 2.2138 | 50310 | 1.0846 | - |
| 2.2142 | 50320 | 1.1241 | - |
| 2.2146 | 50330 | 1.1103 | - |
| 2.2151 | 50340 | 1.0858 | - |
| 2.2155 | 50350 | 1.135 | - |
| 2.2160 | 50360 | 1.1783 | - |
| 2.2164 | 50370 | 1.0845 | - |
| 2.2168 | 50380 | 1.1204 | - |
| 2.2173 | 50390 | 1.1253 | - |
| 2.2177 | 50400 | 1.1014 | - |
| 2.2182 | 50410 | 1.1209 | - |
| 2.2186 | 50420 | 1.1072 | - |
| 2.2190 | 50430 | 1.1092 | - |
| 2.2195 | 50440 | 1.1163 | - |
| 2.2199 | 50450 | 1.1242 | - |
| 2.2204 | 50460 | 1.138 | - |
| 2.2208 | 50470 | 1.1393 | - |
| 2.2212 | 50480 | 1.0676 | - |
| 2.2217 | 50490 | 1.0912 | - |
| 2.2221 | 50500 | 1.1118 | - |
| 2.2226 | 50510 | 1.1031 | - |
| 2.2230 | 50520 | 1.1166 | - |
| 2.2234 | 50530 | 1.0913 | - |
| 2.2239 | 50540 | 1.089 | - |
| 2.2243 | 50550 | 1.141 | - |
| 2.2248 | 50560 | 1.0876 | - |
| 2.2252 | 50570 | 1.1473 | - |
| 2.2256 | 50580 | 1.1168 | - |
| 2.2261 | 50590 | 1.081 | - |
| 2.2265 | 50600 | 1.0927 | - |
| 2.2270 | 50610 | 1.1059 | - |
| 2.2274 | 50620 | 1.1167 | - |
| 2.2278 | 50630 | 1.142 | - |
| 2.2283 | 50640 | 1.1299 | - |
| 2.2287 | 50650 | 1.1039 | - |
| 2.2292 | 50660 | 1.0575 | - |
| 2.2296 | 50670 | 1.0804 | - |
| 2.2300 | 50680 | 1.129 | - |
| 2.2305 | 50690 | 1.0703 | - |
| 2.2309 | 50700 | 1.0901 | - |
| 2.2314 | 50710 | 1.0804 | - |
| 2.2318 | 50720 | 1.1232 | - |
| 2.2322 | 50730 | 1.1095 | - |
| 2.2327 | 50740 | 1.1034 | - |
| 2.2331 | 50750 | 1.0418 | - |
| 2.2336 | 50760 | 1.0633 | - |
| 2.2340 | 50770 | 1.1047 | - |
| 2.2344 | 50780 | 1.0475 | - |
| 2.2349 | 50790 | 1.0813 | - |
| 2.2353 | 50800 | 1.1026 | - |
| 2.2358 | 50810 | 1.1035 | - |
| 2.2362 | 50820 | 1.0921 | - |
| 2.2366 | 50830 | 1.0977 | - |
| 2.2371 | 50840 | 1.1125 | - |
| 2.2375 | 50850 | 1.096 | - |
| 2.2380 | 50860 | 1.0888 | - |
| 2.2384 | 50870 | 1.1415 | - |
| 2.2388 | 50880 | 1.114 | - |
| 2.2393 | 50890 | 1.07 | - |
| 2.2397 | 50900 | 1.1107 | - |
| 2.2402 | 50910 | 1.1219 | - |
| 2.2406 | 50920 | 1.078 | - |
| 2.2410 | 50930 | 1.0593 | - |
| 2.2415 | 50940 | 1.0679 | - |
| 2.2419 | 50950 | 1.1221 | - |
| 2.2424 | 50960 | 1.1256 | - |
| 2.2428 | 50970 | 1.0984 | - |
| 2.2432 | 50980 | 1.0762 | - |
| 2.2437 | 50990 | 1.0965 | - |
| 2.2441 | 51000 | 1.087 | - |
| 2.2446 | 51010 | 1.1202 | - |
| 2.2450 | 51020 | 1.1204 | - |
| 2.2454 | 51030 | 1.0823 | - |
| 2.2459 | 51040 | 1.0699 | - |
| 2.2463 | 51050 | 1.0692 | - |
| 2.2468 | 51060 | 1.0948 | - |
| 2.2472 | 51070 | 1.0958 | - |
| 2.2476 | 51080 | 1.0666 | - |
| 2.2481 | 51090 | 1.1122 | - |
| 2.2485 | 51100 | 1.0778 | - |
| 2.2490 | 51110 | 1.0867 | - |
| 2.2494 | 51120 | 1.1254 | - |
| 2.2498 | 51130 | 1.094 | - |
| 2.2503 | 51140 | 1.1131 | - |
| 2.2507 | 51150 | 1.1322 | - |
| 2.2512 | 51160 | 1.1406 | - |
| 2.2514 | 51165 | - | 1.4187 |
| 2.2516 | 51170 | 1.0756 | - |
| 2.2520 | 51180 | 1.0685 | - |
| 2.2525 | 51190 | 1.0716 | - |
| 2.2529 | 51200 | 1.1367 | - |
| 2.2534 | 51210 | 1.14 | - |
| 2.2538 | 51220 | 1.1043 | - |
| 2.2542 | 51230 | 1.0751 | - |
| 2.2547 | 51240 | 1.0577 | - |
| 2.2551 | 51250 | 1.1215 | - |
| 2.2556 | 51260 | 1.0925 | - |
| 2.2560 | 51270 | 1.0975 | - |
| 2.2564 | 51280 | 1.1289 | - |
| 2.2569 | 51290 | 1.0778 | - |
| 2.2573 | 51300 | 1.0623 | - |
| 2.2578 | 51310 | 1.0657 | - |
| 2.2582 | 51320 | 1.116 | - |
| 2.2586 | 51330 | 1.1092 | - |
| 2.2591 | 51340 | 1.1516 | - |
| 2.2595 | 51350 | 1.0981 | - |
| 2.2600 | 51360 | 1.0781 | - |
| 2.2604 | 51370 | 1.107 | - |
| 2.2608 | 51380 | 1.0898 | - |
| 2.2613 | 51390 | 1.0604 | - |
| 2.2617 | 51400 | 1.057 | - |
| 2.2622 | 51410 | 1.1112 | - |
| 2.2626 | 51420 | 1.0782 | - |
| 2.2630 | 51430 | 1.0522 | - |
| 2.2635 | 51440 | 1.0443 | - |
| 2.2639 | 51450 | 1.1068 | - |
| 2.2644 | 51460 | 1.1218 | - |
| 2.2648 | 51470 | 1.112 | - |
| 2.2652 | 51480 | 1.0964 | - |
| 2.2657 | 51490 | 1.0659 | - |
| 2.2661 | 51500 | 1.1209 | - |
| 2.2666 | 51510 | 1.1179 | - |
| 2.2670 | 51520 | 1.0571 | - |
| 2.2674 | 51530 | 1.0894 | - |
| 2.2679 | 51540 | 1.1095 | - |
| 2.2683 | 51550 | 1.0836 | - |
| 2.2688 | 51560 | 1.0798 | - |
| 2.2692 | 51570 | 1.0885 | - |
| 2.2696 | 51580 | 1.1281 | - |
| 2.2701 | 51590 | 1.0855 | - |
| 2.2705 | 51600 | 1.1194 | - |
| 2.2710 | 51610 | 1.0966 | - |
| 2.2714 | 51620 | 1.0604 | - |
| 2.2718 | 51630 | 1.1153 | - |
| 2.2723 | 51640 | 1.0573 | - |
| 2.2727 | 51650 | 1.0953 | - |
| 2.2732 | 51660 | 1.1374 | - |
| 2.2736 | 51670 | 1.051 | - |
| 2.2740 | 51680 | 1.0674 | - |
| 2.2745 | 51690 | 1.1214 | - |
| 2.2749 | 51700 | 1.1118 | - |
| 2.2754 | 51710 | 1.1055 | - |
| 2.2758 | 51720 | 1.0673 | - |
| 2.2762 | 51730 | 1.1018 | - |
| 2.2767 | 51740 | 1.1306 | - |
| 2.2771 | 51750 | 1.0728 | - |
| 2.2776 | 51760 | 1.1219 | - |
| 2.2780 | 51770 | 1.0974 | - |
| 2.2784 | 51780 | 1.0987 | - |
| 2.2789 | 51790 | 1.0742 | - |
| 2.2793 | 51800 | 1.1319 | - |
| 2.2798 | 51810 | 1.1054 | - |
| 2.2802 | 51820 | 1.1471 | - |
| 2.2806 | 51830 | 1.1143 | - |
| 2.2811 | 51840 | 1.0715 | - |
| 2.2815 | 51850 | 1.0598 | - |
| 2.2820 | 51860 | 1.0512 | - |
| 2.2824 | 51870 | 1.0542 | - |
| 2.2828 | 51880 | 1.0944 | - |
| 2.2833 | 51890 | 1.1054 | - |
| 2.2837 | 51900 | 1.09 | - |
| 2.2842 | 51910 | 1.0663 | - |
| 2.2846 | 51920 | 1.1157 | - |
| 2.2850 | 51930 | 1.0759 | - |
| 2.2855 | 51940 | 1.042 | - |
| 2.2859 | 51950 | 1.0747 | - |
| 2.2864 | 51960 | 1.1287 | - |
| 2.2868 | 51970 | 1.0216 | - |
| 2.2872 | 51980 | 1.0706 | - |
| 2.2877 | 51990 | 1.0959 | - |
| 2.2881 | 52000 | 1.131 | - |
| 2.2886 | 52010 | 1.0953 | - |
| 2.2890 | 52020 | 1.1178 | - |
| 2.2894 | 52030 | 1.071 | - |
| 2.2899 | 52040 | 1.0247 | - |
| 2.2903 | 52050 | 1.063 | - |
| 2.2908 | 52060 | 1.0872 | - |
| 2.2912 | 52070 | 1.0889 | - |
| 2.2916 | 52080 | 1.1129 | - |
| 2.2921 | 52090 | 1.1533 | - |
| 2.2925 | 52100 | 1.0576 | - |
| 2.2930 | 52110 | 1.1611 | - |
| 2.2934 | 52120 | 1.0805 | - |
| 2.2938 | 52130 | 1.1009 | - |
| 2.2943 | 52140 | 1.1339 | - |
| 2.2947 | 52150 | 1.113 | - |
| 2.2952 | 52160 | 1.0992 | - |
| 2.2956 | 52170 | 1.0933 | - |
| 2.2960 | 52180 | 1.0737 | - |
| 2.2965 | 52190 | 1.0951 | - |
| 2.2969 | 52200 | 1.0731 | - |
| 2.2974 | 52210 | 1.0501 | - |
| 2.2978 | 52220 | 1.109 | - |
| 2.2982 | 52230 | 1.1004 | - |
| 2.2987 | 52240 | 1.0688 | - |
| 2.2991 | 52250 | 1.066 | - |
| 2.2996 | 52260 | 1.0736 | - |
| 2.3000 | 52270 | 1.1011 | - |
| 2.3004 | 52280 | 1.1167 | - |
| 2.3009 | 52290 | 1.0832 | - |
| 2.3013 | 52300 | 1.1215 | - |
| 2.3014 | 52302 | - | 1.4365 |
| 2.3018 | 52310 | 1.0201 | - |
| 2.3022 | 52320 | 1.1023 | - |
| 2.3026 | 52330 | 1.0713 | - |
| 2.3031 | 52340 | 1.0557 | - |
| 2.3035 | 52350 | 1.108 | - |
| 2.3040 | 52360 | 1.0622 | - |
| 2.3044 | 52370 | 1.0705 | - |
| 2.3048 | 52380 | 1.1035 | - |
| 2.3053 | 52390 | 1.1058 | - |
| 2.3057 | 52400 | 1.0379 | - |
| 2.3062 | 52410 | 1.0658 | - |
| 2.3066 | 52420 | 1.0458 | - |
| 2.3070 | 52430 | 1.0925 | - |
| 2.3075 | 52440 | 1.0923 | - |
| 2.3079 | 52450 | 1.0482 | - |
| 2.3084 | 52460 | 1.0728 | - |
| 2.3088 | 52470 | 1.0209 | - |
| 2.3092 | 52480 | 1.0573 | - |
| 2.3097 | 52490 | 1.1076 | - |
| 2.3101 | 52500 | 1.109 | - |
| 2.3106 | 52510 | 1.0855 | - |
| 2.3110 | 52520 | 1.0674 | - |
| 2.3114 | 52530 | 1.0761 | - |
| 2.3119 | 52540 | 1.0648 | - |
| 2.3123 | 52550 | 1.1026 | - |
| 2.3128 | 52560 | 1.0821 | - |
| 2.3132 | 52570 | 1.0581 | - |
| 2.3136 | 52580 | 1.0535 | - |
| 2.3141 | 52590 | 1.0425 | - |
| 2.3145 | 52600 | 1.0693 | - |
| 2.3150 | 52610 | 1.0886 | - |
| 2.3154 | 52620 | 1.0379 | - |
| 2.3158 | 52630 | 1.0744 | - |
| 2.3163 | 52640 | 1.0726 | - |
| 2.3167 | 52650 | 1.0825 | - |
| 2.3172 | 52660 | 1.0875 | - |
| 2.3176 | 52670 | 1.1 | - |
| 2.3180 | 52680 | 1.0972 | - |
| 2.3185 | 52690 | 1.1335 | - |
| 2.3189 | 52700 | 1.0373 | - |
| 2.3194 | 52710 | 1.0293 | - |
| 2.3198 | 52720 | 1.0911 | - |
| 2.3202 | 52730 | 1.071 | - |
| 2.3207 | 52740 | 1.0564 | - |
| 2.3211 | 52750 | 1.0978 | - |
| 2.3216 | 52760 | 1.1199 | - |
| 2.3220 | 52770 | 1.1439 | - |
| 2.3225 | 52780 | 1.1313 | - |
| 2.3229 | 52790 | 1.0947 | - |
| 2.3233 | 52800 | 1.0892 | - |
| 2.3238 | 52810 | 1.0569 | - |
| 2.3242 | 52820 | 1.077 | - |
| 2.3247 | 52830 | 1.1019 | - |
| 2.3251 | 52840 | 1.0339 | - |
| 2.3255 | 52850 | 1.095 | - |
| 2.3260 | 52860 | 1.0683 | - |
| 2.3264 | 52870 | 1.0656 | - |
| 2.3269 | 52880 | 1.0459 | - |
| 2.3273 | 52890 | 1.0803 | - |
| 2.3277 | 52900 | 1.1183 | - |
| 2.3282 | 52910 | 1.0902 | - |
| 2.3286 | 52920 | 1.0583 | - |
| 2.3291 | 52930 | 1.069 | - |
| 2.3295 | 52940 | 1.0722 | - |
| 2.3299 | 52950 | 1.0738 | - |
| 2.3304 | 52960 | 1.0694 | - |
| 2.3308 | 52970 | 1.0309 | - |
| 2.3313 | 52980 | 1.0746 | - |
| 2.3317 | 52990 | 1.1187 | - |
| 2.3321 | 53000 | 1.0679 | - |
| 2.3326 | 53010 | 1.0625 | - |
| 2.3330 | 53020 | 1.0828 | - |
| 2.3335 | 53030 | 1.1082 | - |
| 2.3339 | 53040 | 1.0918 | - |
| 2.3343 | 53050 | 1.0799 | - |
| 2.3348 | 53060 | 1.0968 | - |
| 2.3352 | 53070 | 1.0629 | - |
| 2.3357 | 53080 | 1.0944 | - |
| 2.3361 | 53090 | 1.058 | - |
| 2.3365 | 53100 | 1.0826 | - |
| 2.3370 | 53110 | 1.0775 | - |
| 2.3374 | 53120 | 1.0657 | - |
| 2.3379 | 53130 | 1.063 | - |
| 2.3383 | 53140 | 1.0905 | - |
| 2.3387 | 53150 | 1.0692 | - |
| 2.3392 | 53160 | 1.032 | - |
| 2.3396 | 53170 | 1.0057 | - |
| 2.3401 | 53180 | 1.046 | - |
| 2.3405 | 53190 | 1.0726 | - |
| 2.3409 | 53200 | 1.0694 | - |
| 2.3414 | 53210 | 1.081 | - |
| 2.3418 | 53220 | 1.0692 | - |
| 2.3423 | 53230 | 1.0485 | - |
| 2.3427 | 53240 | 1.0573 | - |
| 2.3431 | 53250 | 1.1172 | - |
| 2.3436 | 53260 | 1.0777 | - |
| 2.3440 | 53270 | 0.9993 | - |
| 2.3445 | 53280 | 1.0661 | - |
| 2.3449 | 53290 | 1.0711 | - |
| 2.3453 | 53300 | 1.0624 | - |
| 2.3458 | 53310 | 1.0392 | - |
| 2.3462 | 53320 | 1.0589 | - |
| 2.3467 | 53330 | 1.042 | - |
| 2.3471 | 53340 | 1.1111 | - |
| 2.3475 | 53350 | 1.0779 | - |
| 2.3480 | 53360 | 1.0928 | - |
| 2.3484 | 53370 | 1.1236 | - |
| 2.3489 | 53380 | 1.0631 | - |
| 2.3493 | 53390 | 1.0884 | - |
| 2.3497 | 53400 | 1.0774 | - |
| 2.3502 | 53410 | 1.0683 | - |
| 2.3506 | 53420 | 1.0932 | - |
| 2.3511 | 53430 | 1.0834 | - |
| 2.3514 | 53439 | - | 1.4361 |
| 2.3515 | 53440 | 1.0951 | - |
| 2.3519 | 53450 | 1.0572 | - |
| 2.3524 | 53460 | 1.0538 | - |
| 2.3528 | 53470 | 1.1145 | - |
| 2.3533 | 53480 | 1.0818 | - |
| 2.3537 | 53490 | 1.1 | - |
| 2.3541 | 53500 | 1.0793 | - |
| 2.3546 | 53510 | 1.0522 | - |
| 2.3550 | 53520 | 1.066 | - |
| 2.3555 | 53530 | 1.0902 | - |
| 2.3559 | 53540 | 1.0927 | - |
| 2.3563 | 53550 | 1.1553 | - |
| 2.3568 | 53560 | 1.0633 | - |
| 2.3572 | 53570 | 1.0889 | - |
| 2.3577 | 53580 | 1.0778 | - |
| 2.3581 | 53590 | 1.0817 | - |
| 2.3585 | 53600 | 1.1198 | - |
| 2.3590 | 53610 | 1.0662 | - |
| 2.3594 | 53620 | 1.0948 | - |
| 2.3599 | 53630 | 1.1131 | - |
| 2.3603 | 53640 | 1.0974 | - |
| 2.3607 | 53650 | 1.0441 | - |
| 2.3612 | 53660 | 1.0179 | - |
| 2.3616 | 53670 | 1.1159 | - |
| 2.3621 | 53680 | 1.0543 | - |
| 2.3625 | 53690 | 1.0677 | - |
| 2.3629 | 53700 | 1.0675 | - |
| 2.3634 | 53710 | 1.0662 | - |
| 2.3638 | 53720 | 1.0692 | - |
| 2.3643 | 53730 | 1.0724 | - |
| 2.3647 | 53740 | 1.1182 | - |
| 2.3651 | 53750 | 1.0783 | - |
| 2.3656 | 53760 | 1.057 | - |
| 2.3660 | 53770 | 1.0513 | - |
| 2.3665 | 53780 | 1.0299 | - |
| 2.3669 | 53790 | 1.1076 | - |
| 2.3673 | 53800 | 1.0505 | - |
| 2.3678 | 53810 | 1.1195 | - |
| 2.3682 | 53820 | 1.0519 | - |
| 2.3687 | 53830 | 1.076 | - |
| 2.3691 | 53840 | 1.0485 | - |
| 2.3695 | 53850 | 1.0077 | - |
| 2.3700 | 53860 | 1.0979 | - |
| 2.3704 | 53870 | 1.0451 | - |
| 2.3709 | 53880 | 1.0495 | - |
| 2.3713 | 53890 | 1.0568 | - |
| 2.3717 | 53900 | 1.0788 | - |
| 2.3722 | 53910 | 1.0937 | - |
| 2.3726 | 53920 | 1.0685 | - |
| 2.3731 | 53930 | 1.1056 | - |
| 2.3735 | 53940 | 1.0109 | - |
| 2.3739 | 53950 | 1.104 | - |
| 2.3744 | 53960 | 1.0395 | - |
| 2.3748 | 53970 | 1.0662 | - |
| 2.3753 | 53980 | 1.0684 | - |
| 2.3757 | 53990 | 1.1029 | - |
| 2.3761 | 54000 | 1.0807 | - |
| 2.3766 | 54010 | 1.0894 | - |
| 2.3770 | 54020 | 1.0581 | - |
| 2.3775 | 54030 | 1.0437 | - |
| 2.3779 | 54040 | 1.0884 | - |
| 2.3783 | 54050 | 1.0783 | - |
| 2.3788 | 54060 | 1.0618 | - |
| 2.3792 | 54070 | 1.0879 | - |
| 2.3797 | 54080 | 1.1041 | - |
| 2.3801 | 54090 | 1.0866 | - |
| 2.3805 | 54100 | 1.0872 | - |
| 2.3810 | 54110 | 1.0646 | - |
| 2.3814 | 54120 | 1.0544 | - |
| 2.3819 | 54130 | 1.0749 | - |
| 2.3823 | 54140 | 1.0968 | - |
| 2.3827 | 54150 | 1.0022 | - |
| 2.3832 | 54160 | 1.0607 | - |
| 2.3836 | 54170 | 1.0385 | - |
| 2.3841 | 54180 | 1.0158 | - |
| 2.3845 | 54190 | 1.0835 | - |
| 2.3849 | 54200 | 1.0962 | - |
| 2.3854 | 54210 | 1.0165 | - |
| 2.3858 | 54220 | 1.0762 | - |
| 2.3863 | 54230 | 1.071 | - |
| 2.3867 | 54240 | 1.1083 | - |
| 2.3871 | 54250 | 1.1387 | - |
| 2.3876 | 54260 | 1.0888 | - |
| 2.3880 | 54270 | 1.0631 | - |
| 2.3885 | 54280 | 1.1257 | - |
| 2.3889 | 54290 | 1.0871 | - |
| 2.3893 | 54300 | 1.0466 | - |
| 2.3898 | 54310 | 0.9915 | - |
| 2.3902 | 54320 | 1.044 | - |
| 2.3907 | 54330 | 1.1024 | - |
| 2.3911 | 54340 | 1.0451 | - |
| 2.3915 | 54350 | 1.1005 | - |
| 2.3920 | 54360 | 1.1357 | - |
| 2.3924 | 54370 | 1.1037 | - |
| 2.3929 | 54380 | 1.0745 | - |
| 2.3933 | 54390 | 1.0469 | - |
| 2.3937 | 54400 | 1.057 | - |
| 2.3942 | 54410 | 1.0936 | - |
| 2.3946 | 54420 | 1.0911 | - |
| 2.3951 | 54430 | 1.0707 | - |
| 2.3955 | 54440 | 1.0793 | - |
| 2.3959 | 54450 | 1.0677 | - |
| 2.3964 | 54460 | 1.0681 | - |
| 2.3968 | 54470 | 1.0905 | - |
| 2.3973 | 54480 | 1.0787 | - |
| 2.3977 | 54490 | 1.0317 | - |
| 2.3981 | 54500 | 1.035 | - |
| 2.3986 | 54510 | 1.1043 | - |
| 2.3990 | 54520 | 1.1018 | - |
| 2.3995 | 54530 | 1.0624 | - |
| 2.3999 | 54540 | 1.0959 | - |
| 2.4003 | 54550 | 1.0768 | - |
| 2.4008 | 54560 | 1.0553 | - |
| 2.4012 | 54570 | 1.0678 | - |
| 2.4015 | 54576 | - | 1.4412 |
| 2.4017 | 54580 | 1.0706 | - |
| 2.4021 | 54590 | 1.0236 | - |
| 2.4025 | 54600 | 1.1212 | - |
| 2.4030 | 54610 | 1.0802 | - |
| 2.4034 | 54620 | 1.0539 | - |
| 2.4039 | 54630 | 1.0917 | - |
| 2.4043 | 54640 | 1.0818 | - |
| 2.4047 | 54650 | 1.0648 | - |
| 2.4052 | 54660 | 1.0275 | - |
| 2.4056 | 54670 | 1.0787 | - |
| 2.4061 | 54680 | 1.0739 | - |
| 2.4065 | 54690 | 1.0738 | - |
| 2.4069 | 54700 | 1.081 | - |
| 2.4074 | 54710 | 1.0124 | - |
| 2.4078 | 54720 | 1.1086 | - |
| 2.4083 | 54730 | 1.0525 | - |
| 2.4087 | 54740 | 1.1011 | - |
| 2.4091 | 54750 | 1.0791 | - |
| 2.4096 | 54760 | 1.0921 | - |
| 2.4100 | 54770 | 1.0903 | - |
| 2.4105 | 54780 | 1.0389 | - |
| 2.4109 | 54790 | 1.0963 | - |
| 2.4113 | 54800 | 1.0615 | - |
| 2.4118 | 54810 | 1.0641 | - |
| 2.4122 | 54820 | 1.0583 | - |
| 2.4127 | 54830 | 1.0618 | - |
| 2.4131 | 54840 | 1.0476 | - |
| 2.4135 | 54850 | 1.0744 | - |
| 2.4140 | 54860 | 1.0718 | - |
| 2.4144 | 54870 | 1.0994 | - |
| 2.4149 | 54880 | 1.0308 | - |
| 2.4153 | 54890 | 1.0442 | - |
| 2.4157 | 54900 | 1.0375 | - |
| 2.4162 | 54910 | 1.078 | - |
| 2.4166 | 54920 | 1.0702 | - |
| 2.4171 | 54930 | 1.0285 | - |
| 2.4175 | 54940 | 1.0784 | - |
| 2.4179 | 54950 | 1.0314 | - |
| 2.4184 | 54960 | 1.0464 | - |
| 2.4188 | 54970 | 1.0277 | - |
| 2.4193 | 54980 | 1.07 | - |
| 2.4197 | 54990 | 1.0389 | - |
| 2.4201 | 55000 | 1.0458 | - |
| 2.4206 | 55010 | 1.0938 | - |
| 2.4210 | 55020 | 1.0885 | - |
| 2.4215 | 55030 | 1.0572 | - |
| 2.4219 | 55040 | 1.0778 | - |
| 2.4223 | 55050 | 1.0539 | - |
| 2.4228 | 55060 | 1.0905 | - |
| 2.4232 | 55070 | 1.0991 | - |
| 2.4237 | 55080 | 1.0503 | - |
| 2.4241 | 55090 | 1.0593 | - |
| 2.4245 | 55100 | 1.0972 | - |
| 2.4250 | 55110 | 1.0775 | - |
| 2.4254 | 55120 | 1.0613 | - |
| 2.4259 | 55130 | 1.0438 | - |
| 2.4263 | 55140 | 1.0332 | - |
| 2.4267 | 55150 | 1.0727 | - |
| 2.4272 | 55160 | 1.1038 | - |
| 2.4276 | 55170 | 1.0955 | - |
| 2.4281 | 55180 | 1.0648 | - |
| 2.4285 | 55190 | 1.0327 | - |
| 2.4289 | 55200 | 1.0368 | - |
| 2.4294 | 55210 | 1.1125 | - |
| 2.4298 | 55220 | 1.0285 | - |
| 2.4303 | 55230 | 1.0384 | - |
| 2.4307 | 55240 | 1.0424 | - |
| 2.4311 | 55250 | 1.0561 | - |
| 2.4316 | 55260 | 1.007 | - |
| 2.4320 | 55270 | 1.0292 | - |
| 2.4325 | 55280 | 1.0525 | - |
| 2.4329 | 55290 | 1.0978 | - |
| 2.4333 | 55300 | 0.9937 | - |
| 2.4338 | 55310 | 1.0233 | - |
| 2.4342 | 55320 | 1.0835 | - |
| 2.4347 | 55330 | 1.0263 | - |
| 2.4351 | 55340 | 1.0752 | - |
| 2.4355 | 55350 | 1.0787 | - |
| 2.4360 | 55360 | 1.0858 | - |
| 2.4364 | 55370 | 1.0746 | - |
| 2.4369 | 55380 | 1.05 | - |
| 2.4373 | 55390 | 1.0796 | - |
| 2.4377 | 55400 | 1.099 | - |
| 2.4382 | 55410 | 1.0369 | - |
| 2.4386 | 55420 | 1.0536 | - |
| 2.4391 | 55430 | 1.0829 | - |
| 2.4395 | 55440 | 1.0651 | - |
| 2.4399 | 55450 | 1.0562 | - |
| 2.4404 | 55460 | 1.038 | - |
| 2.4408 | 55470 | 1.0571 | - |
| 2.4413 | 55480 | 1.0587 | - |
| 2.4417 | 55490 | 1.0225 | - |
| 2.4421 | 55500 | 1.0073 | - |
| 2.4426 | 55510 | 1.0601 | - |
| 2.4430 | 55520 | 1.0995 | - |
| 2.4435 | 55530 | 1.0771 | - |
| 2.4439 | 55540 | 1.0476 | - |
| 2.4443 | 55550 | 1.0263 | - |
| 2.4448 | 55560 | 1.0765 | - |
| 2.4452 | 55570 | 1.0435 | - |
| 2.4457 | 55580 | 1.0579 | - |
| 2.4461 | 55590 | 1.0667 | - |
| 2.4465 | 55600 | 1.1013 | - |
| 2.4470 | 55610 | 1.0416 | - |
| 2.4474 | 55620 | 1.0923 | - |
| 2.4479 | 55630 | 1.0587 | - |
| 2.4483 | 55640 | 1.0302 | - |
| 2.4487 | 55650 | 1.0585 | - |
| 2.4492 | 55660 | 1.0216 | - |
| 2.4496 | 55670 | 1.1019 | - |
| 2.4501 | 55680 | 1.0308 | - |
| 2.4505 | 55690 | 1.093 | - |
| 2.4509 | 55700 | 1.0813 | - |
| 2.4514 | 55710 | 1.0636 | - |
| 2.4515 | 55713 | - | 1.4065 |
| 2.4518 | 55720 | 1.0551 | - |
| 2.4523 | 55730 | 1.0432 | - |
| 2.4527 | 55740 | 1.0239 | - |
| 2.4531 | 55750 | 1.0448 | - |
| 2.4536 | 55760 | 1.0427 | - |
| 2.4540 | 55770 | 1.0941 | - |
| 2.4545 | 55780 | 1.0511 | - |
| 2.4549 | 55790 | 1.0679 | - |
| 2.4553 | 55800 | 1.0565 | - |
| 2.4558 | 55810 | 1.078 | - |
| 2.4562 | 55820 | 1.0305 | - |
| 2.4567 | 55830 | 1.0216 | - |
| 2.4571 | 55840 | 1.056 | - |
| 2.4575 | 55850 | 1.0821 | - |
| 2.4580 | 55860 | 1.0965 | - |
| 2.4584 | 55870 | 1.0411 | - |
| 2.4589 | 55880 | 1.0276 | - |
| 2.4593 | 55890 | 1.0469 | - |
| 2.4597 | 55900 | 1.064 | - |
| 2.4602 | 55910 | 1.0258 | - |
| 2.4606 | 55920 | 1.079 | - |
| 2.4611 | 55930 | 1.0735 | - |
| 2.4615 | 55940 | 1.0651 | - |
| 2.4619 | 55950 | 1.0397 | - |
| 2.4624 | 55960 | 1.0247 | - |
| 2.4628 | 55970 | 1.0288 | - |
| 2.4633 | 55980 | 1.0427 | - |
| 2.4637 | 55990 | 1.0839 | - |
| 2.4641 | 56000 | 1.0599 | - |
| 2.4646 | 56010 | 1.0288 | - |
| 2.4650 | 56020 | 1.0757 | - |
| 2.4655 | 56030 | 1.046 | - |
| 2.4659 | 56040 | 1.0935 | - |
| 2.4663 | 56050 | 1.0191 | - |
| 2.4668 | 56060 | 1.0871 | - |
| 2.4672 | 56070 | 1.0441 | - |
| 2.4677 | 56080 | 0.9984 | - |
| 2.4681 | 56090 | 1.0167 | - |
| 2.4685 | 56100 | 1.0699 | - |
| 2.4690 | 56110 | 1.0541 | - |
| 2.4694 | 56120 | 1.0554 | - |
| 2.4699 | 56130 | 1.0779 | - |
| 2.4703 | 56140 | 1.0664 | - |
| 2.4707 | 56150 | 1.0249 | - |
| 2.4712 | 56160 | 1.0716 | - |
| 2.4716 | 56170 | 1.0663 | - |
| 2.4721 | 56180 | 1.0761 | - |
| 2.4725 | 56190 | 1.063 | - |
| 2.4729 | 56200 | 1.1305 | - |
| 2.4734 | 56210 | 1.0561 | - |
| 2.4738 | 56220 | 1.0777 | - |
| 2.4743 | 56230 | 0.9978 | - |
| 2.4747 | 56240 | 1.0797 | - |
| 2.4751 | 56250 | 1.0362 | - |
| 2.4756 | 56260 | 1.0718 | - |
| 2.4760 | 56270 | 1.0517 | - |
| 2.4765 | 56280 | 0.9846 | - |
| 2.4769 | 56290 | 1.0837 | - |
| 2.4773 | 56300 | 1.059 | - |
| 2.4778 | 56310 | 1.0038 | - |
| 2.4782 | 56320 | 1.0337 | - |
| 2.4787 | 56330 | 1.027 | - |
| 2.4791 | 56340 | 1.0378 | - |
| 2.4795 | 56350 | 1.0941 | - |
| 2.4800 | 56360 | 1.0282 | - |
| 2.4804 | 56370 | 1.0445 | - |
| 2.4809 | 56380 | 1.0552 | - |
| 2.4813 | 56390 | 1.049 | - |
| 2.4817 | 56400 | 1.0085 | - |
| 2.4822 | 56410 | 1.0319 | - |
| 2.4826 | 56420 | 1.0504 | - |
| 2.4831 | 56430 | 1.1004 | - |
| 2.4835 | 56440 | 1.06 | - |
| 2.4839 | 56450 | 0.9767 | - |
| 2.4844 | 56460 | 1.0323 | - |
| 2.4848 | 56470 | 1.0049 | - |
| 2.4853 | 56480 | 1.0367 | - |
| 2.4857 | 56490 | 1.0365 | - |
| 2.4861 | 56500 | 1.0516 | - |
| 2.4866 | 56510 | 1.086 | - |
| 2.4870 | 56520 | 1.0777 | - |
| 2.4875 | 56530 | 1.0317 | - |
| 2.4879 | 56540 | 1.0898 | - |
| 2.4883 | 56550 | 1.0335 | - |
| 2.4888 | 56560 | 1.0395 | - |
| 2.4892 | 56570 | 1.0747 | - |
| 2.4897 | 56580 | 1.134 | - |
| 2.4901 | 56590 | 1.0366 | - |
| 2.4905 | 56600 | 1.0421 | - |
| 2.4910 | 56610 | 1.0269 | - |
| 2.4914 | 56620 | 1.0184 | - |
| 2.4919 | 56630 | 1.0536 | - |
| 2.4923 | 56640 | 1.0444 | - |
| 2.4927 | 56650 | 1.0738 | - |
| 2.4932 | 56660 | 1.0485 | - |
| 2.4936 | 56670 | 1.0908 | - |
| 2.4941 | 56680 | 1.0472 | - |
| 2.4945 | 56690 | 1.0438 | - |
| 2.4949 | 56700 | 1.0445 | - |
| 2.4954 | 56710 | 1.0445 | - |
| 2.4958 | 56720 | 1.0481 | - |
| 2.4963 | 56730 | 1.0785 | - |
| 2.4967 | 56740 | 1.0477 | - |
| 2.4971 | 56750 | 1.0855 | - |
| 2.4976 | 56760 | 1.0679 | - |
| 2.4980 | 56770 | 1.0612 | - |
| 2.4985 | 56780 | 1.0068 | - |
| 2.4989 | 56790 | 1.0615 | - |
| 2.4993 | 56800 | 1.0025 | - |
| 2.4998 | 56810 | 1.0041 | - |
| 2.5002 | 56820 | 1.051 | - |
| 2.5007 | 56830 | 1.0423 | - |
| 2.5011 | 56840 | 1.0434 | - |
| 2.5015 | 56850 | 1.0633 | 1.4340 |
| 2.5020 | 56860 | 1.0791 | - |
| 2.5024 | 56870 | 0.9987 | - |
| 2.5029 | 56880 | 1.0375 | - |
| 2.5033 | 56890 | 1.061 | - |
| 2.5037 | 56900 | 1.046 | - |
| 2.5042 | 56910 | 1.0416 | - |
| 2.5046 | 56920 | 1.0173 | - |
| 2.5051 | 56930 | 1.0261 | - |
| 2.5055 | 56940 | 1.0372 | - |
| 2.5059 | 56950 | 0.9978 | - |
| 2.5064 | 56960 | 1.0273 | - |
| 2.5068 | 56970 | 1.0344 | - |
| 2.5073 | 56980 | 1.0284 | - |
| 2.5077 | 56990 | 0.9848 | - |
| 2.5081 | 57000 | 1.0622 | - |
| 2.5086 | 57010 | 1.0227 | - |
| 2.5090 | 57020 | 1.0344 | - |
| 2.5095 | 57030 | 1.0111 | - |
| 2.5099 | 57040 | 1.0435 | - |
| 2.5103 | 57050 | 0.9895 | - |
| 2.5108 | 57060 | 1.0379 | - |
| 2.5112 | 57070 | 1.0114 | - |
| 2.5117 | 57080 | 1.0497 | - |
| 2.5121 | 57090 | 1.0423 | - |
| 2.5125 | 57100 | 1.0663 | - |
| 2.5130 | 57110 | 1.079 | - |
| 2.5134 | 57120 | 0.9959 | - |
| 2.5139 | 57130 | 1.0357 | - |
| 2.5143 | 57140 | 1.0155 | - |
| 2.5147 | 57150 | 1.0289 | - |
| 2.5152 | 57160 | 1.0307 | - |
| 2.5156 | 57170 | 1.0397 | - |
| 2.5161 | 57180 | 1.0354 | - |
| 2.5165 | 57190 | 1.0169 | - |
| 2.5169 | 57200 | 1.0033 | - |
| 2.5174 | 57210 | 1.0283 | - |
| 2.5178 | 57220 | 1.0651 | - |
| 2.5183 | 57230 | 1.0714 | - |
| 2.5187 | 57240 | 1.0168 | - |
| 2.5191 | 57250 | 1.022 | - |
| 2.5196 | 57260 | 1.0326 | - |
| 2.5200 | 57270 | 1.025 | - |
| 2.5205 | 57280 | 1.0397 | - |
| 2.5209 | 57290 | 1.0337 | - |
| 2.5213 | 57300 | 1.0241 | - |
| 2.5218 | 57310 | 1.0573 | - |
| 2.5222 | 57320 | 1.0677 | - |
| 2.5227 | 57330 | 0.996 | - |
| 2.5231 | 57340 | 0.9951 | - |
| 2.5235 | 57350 | 1.0357 | - |
| 2.5240 | 57360 | 1.0648 | - |
| 2.5244 | 57370 | 1.0838 | - |
| 2.5249 | 57380 | 1.0464 | - |
| 2.5253 | 57390 | 1.008 | - |
| 2.5257 | 57400 | 1.0477 | - |
| 2.5262 | 57410 | 1.0458 | - |
| 2.5266 | 57420 | 1.0541 | - |
| 2.5271 | 57430 | 1.0158 | - |
| 2.5275 | 57440 | 1.0733 | - |
| 2.5279 | 57450 | 1.0613 | - |
| 2.5284 | 57460 | 0.9815 | - |
| 2.5288 | 57470 | 1.052 | - |
| 2.5293 | 57480 | 1.0365 | - |
| 2.5297 | 57490 | 1.0429 | - |
| 2.5301 | 57500 | 1.0602 | - |
| 2.5306 | 57510 | 1.0644 | - |
| 2.5310 | 57520 | 1.0195 | - |
| 2.5315 | 57530 | 1.004 | - |
| 2.5319 | 57540 | 1.0188 | - |
| 2.5323 | 57550 | 1.0467 | - |
| 2.5328 | 57560 | 1.0552 | - |
| 2.5332 | 57570 | 1.0478 | - |
| 2.5337 | 57580 | 1.019 | - |
| 2.5341 | 57590 | 1.0241 | - |
| 2.5345 | 57600 | 1.0023 | - |
| 2.5350 | 57610 | 1.0715 | - |
| 2.5354 | 57620 | 1.0153 | - |
| 2.5359 | 57630 | 1.0575 | - |
| 2.5363 | 57640 | 1.0357 | - |
| 2.5367 | 57650 | 0.9973 | - |
| 2.5372 | 57660 | 1.0399 | - |
| 2.5376 | 57670 | 1.0088 | - |
| 2.5381 | 57680 | 1.0685 | - |
| 2.5385 | 57690 | 1.0389 | - |
| 2.5389 | 57700 | 1.026 | - |
| 2.5394 | 57710 | 1.007 | - |
| 2.5398 | 57720 | 1.0209 | - |
| 2.5403 | 57730 | 1.0019 | - |
| 2.5407 | 57740 | 1.0016 | - |
| 2.5411 | 57750 | 1.0022 | - |
| 2.5416 | 57760 | 1.0136 | - |
| 2.5420 | 57770 | 1.0578 | - |
| 2.5425 | 57780 | 1.0189 | - |
| 2.5429 | 57790 | 1.0722 | - |
| 2.5433 | 57800 | 0.9929 | - |
| 2.5438 | 57810 | 1.0625 | - |
| 2.5442 | 57820 | 1.0459 | - |
| 2.5447 | 57830 | 1.043 | - |
| 2.5451 | 57840 | 1.0401 | - |
| 2.5455 | 57850 | 1.0056 | - |
| 2.5460 | 57860 | 1.0816 | - |
| 2.5464 | 57870 | 1.0408 | - |
| 2.5469 | 57880 | 1.0303 | - |
| 2.5473 | 57890 | 1.0511 | - |
| 2.5477 | 57900 | 1.0755 | - |
| 2.5482 | 57910 | 1.0367 | - |
| 2.5486 | 57920 | 1.0719 | - |
| 2.5491 | 57930 | 0.9815 | - |
| 2.5495 | 57940 | 1.0221 | - |
| 2.5499 | 57950 | 0.9871 | - |
| 2.5504 | 57960 | 1.0358 | - |
| 2.5508 | 57970 | 1.0398 | - |
| 2.5513 | 57980 | 1.0697 | - |
| 2.5516 | 57987 | - | 1.3997 |
| 2.5517 | 57990 | 1.0379 | - |
| 2.5521 | 58000 | 1.0341 | - |
| 2.5526 | 58010 | 1.0277 | - |
| 2.5530 | 58020 | 0.9824 | - |
| 2.5535 | 58030 | 0.9985 | - |
| 2.5539 | 58040 | 1.0447 | - |
| 2.5543 | 58050 | 1.026 | - |
| 2.5548 | 58060 | 1.0088 | - |
| 2.5552 | 58070 | 1.0525 | - |
| 2.5557 | 58080 | 1.0885 | - |
| 2.5561 | 58090 | 1.0457 | - |
| 2.5565 | 58100 | 1.0598 | - |
| 2.5570 | 58110 | 1.0421 | - |
| 2.5574 | 58120 | 1.0092 | - |
| 2.5579 | 58130 | 1.0288 | - |
| 2.5583 | 58140 | 1.0042 | - |
| 2.5587 | 58150 | 1.099 | - |
| 2.5592 | 58160 | 1.0186 | - |
| 2.5596 | 58170 | 1.0223 | - |
| 2.5601 | 58180 | 0.9859 | - |
| 2.5605 | 58190 | 1.0066 | - |
| 2.5609 | 58200 | 1.0196 | - |
| 2.5614 | 58210 | 1.0217 | - |
| 2.5618 | 58220 | 0.9746 | - |
| 2.5623 | 58230 | 0.9798 | - |
| 2.5627 | 58240 | 0.9759 | - |
| 2.5631 | 58250 | 1.061 | - |
| 2.5636 | 58260 | 0.9937 | - |
| 2.5640 | 58270 | 1.0715 | - |
| 2.5645 | 58280 | 0.9926 | - |
| 2.5649 | 58290 | 1.0171 | - |
| 2.5653 | 58300 | 1.0325 | - |
| 2.5658 | 58310 | 1.0908 | - |
| 2.5662 | 58320 | 1.0424 | - |
| 2.5667 | 58330 | 1.02 | - |
| 2.5671 | 58340 | 1.0576 | - |
| 2.5675 | 58350 | 1.0702 | - |
| 2.5680 | 58360 | 1.0182 | - |
| 2.5684 | 58370 | 0.9575 | - |
| 2.5689 | 58380 | 1.0155 | - |
| 2.5693 | 58390 | 0.9984 | - |
| 2.5697 | 58400 | 1.0177 | - |
| 2.5702 | 58410 | 0.9657 | - |
| 2.5706 | 58420 | 1.018 | - |
| 2.5711 | 58430 | 1.0431 | - |
| 2.5715 | 58440 | 1.0082 | - |
| 2.5719 | 58450 | 1.0208 | - |
| 2.5724 | 58460 | 1.0011 | - |
| 2.5728 | 58470 | 1.0283 | - |
| 2.5733 | 58480 | 1.0172 | - |
| 2.5737 | 58490 | 1.0381 | - |
| 2.5741 | 58500 | 1.0172 | - |
| 2.5746 | 58510 | 1.0501 | - |
| 2.5750 | 58520 | 1.0642 | - |
| 2.5755 | 58530 | 0.9841 | - |
| 2.5759 | 58540 | 1.0222 | - |
| 2.5763 | 58550 | 1.0464 | - |
| 2.5768 | 58560 | 0.996 | - |
| 2.5772 | 58570 | 1.0682 | - |
| 2.5777 | 58580 | 1.0023 | - |
| 2.5781 | 58590 | 0.9897 | - |
| 2.5785 | 58600 | 1.0479 | - |
| 2.5790 | 58610 | 1.0291 | - |
| 2.5794 | 58620 | 1.0415 | - |
| 2.5799 | 58630 | 1.024 | - |
| 2.5803 | 58640 | 1.0468 | - |
| 2.5807 | 58650 | 1.0039 | - |
| 2.5812 | 58660 | 1.0231 | - |
| 2.5816 | 58670 | 1.0262 | - |
| 2.5821 | 58680 | 1.0658 | - |
| 2.5825 | 58690 | 1.034 | - |
| 2.5829 | 58700 | 1.0318 | - |
| 2.5834 | 58710 | 0.9824 | - |
| 2.5838 | 58720 | 1.0216 | - |
| 2.5843 | 58730 | 1.0503 | - |
| 2.5847 | 58740 | 1.0529 | - |
| 2.5851 | 58750 | 1.0295 | - |
| 2.5856 | 58760 | 1.0441 | - |
| 2.5860 | 58770 | 0.9772 | - |
| 2.5865 | 58780 | 0.9984 | - |
| 2.5869 | 58790 | 1.0672 | - |
| 2.5873 | 58800 | 0.9919 | - |
| 2.5878 | 58810 | 1.0599 | - |
| 2.5882 | 58820 | 1.0243 | - |
| 2.5887 | 58830 | 0.9944 | - |
| 2.5891 | 58840 | 0.9968 | - |
| 2.5895 | 58850 | 0.9829 | - |
| 2.5900 | 58860 | 0.9994 | - |
| 2.5904 | 58870 | 1.0324 | - |
| 2.5909 | 58880 | 0.9773 | - |
| 2.5913 | 58890 | 0.9879 | - |
| 2.5917 | 58900 | 1.0291 | - |
| 2.5922 | 58910 | 1.0082 | - |
| 2.5926 | 58920 | 1.0423 | - |
| 2.5931 | 58930 | 0.9893 | - |
| 2.5935 | 58940 | 1.0249 | - |
| 2.5939 | 58950 | 0.9961 | - |
| 2.5944 | 58960 | 1.0435 | - |
| 2.5948 | 58970 | 0.9898 | - |
| 2.5953 | 58980 | 1.0427 | - |
| 2.5957 | 58990 | 1.028 | - |
| 2.5961 | 59000 | 1.009 | - |
| 2.5966 | 59010 | 0.9943 | - |
| 2.5970 | 59020 | 0.9896 | - |
| 2.5975 | 59030 | 1.0172 | - |
| 2.5979 | 59040 | 1.0015 | - |
| 2.5983 | 59050 | 0.997 | - |
| 2.5988 | 59060 | 0.9995 | - |
| 2.5992 | 59070 | 1.0351 | - |
| 2.5997 | 59080 | 1.0154 | - |
| 2.6001 | 59090 | 0.9849 | - |
| 2.6005 | 59100 | 0.996 | - |
| 2.6010 | 59110 | 1.0498 | - |
| 2.6014 | 59120 | 1.0687 | - |
| 2.6016 | 59124 | - | 1.4053 |
| 2.6019 | 59130 | 1.0203 | - |
| 2.6023 | 59140 | 1.0059 | - |
| 2.6027 | 59150 | 1.0207 | - |
| 2.6032 | 59160 | 1.0197 | - |
| 2.6036 | 59170 | 1.0111 | - |
| 2.6041 | 59180 | 1.0145 | - |
| 2.6045 | 59190 | 1.0195 | - |
| 2.6049 | 59200 | 0.9988 | - |
| 2.6054 | 59210 | 1.0194 | - |
| 2.6058 | 59220 | 1.0372 | - |
| 2.6063 | 59230 | 1.0407 | - |
| 2.6067 | 59240 | 0.997 | - |
| 2.6071 | 59250 | 0.9987 | - |
| 2.6076 | 59260 | 1.0505 | - |
| 2.6080 | 59270 | 1.0382 | - |
| 2.6085 | 59280 | 1.0189 | - |
| 2.6089 | 59290 | 1.0359 | - |
| 2.6093 | 59300 | 0.973 | - |
| 2.6098 | 59310 | 0.9758 | - |
| 2.6102 | 59320 | 1.0234 | - |
| 2.6107 | 59330 | 1.0103 | - |
| 2.6111 | 59340 | 1.0243 | - |
| 2.6115 | 59350 | 0.9793 | - |
| 2.6120 | 59360 | 0.9281 | - |
| 2.6124 | 59370 | 1.0291 | - |
| 2.6129 | 59380 | 1.0052 | - |
| 2.6133 | 59390 | 1.0208 | - |
| 2.6137 | 59400 | 1.0234 | - |
| 2.6142 | 59410 | 1.0115 | - |
| 2.6146 | 59420 | 1.0444 | - |
| 2.6151 | 59430 | 1.0196 | - |
| 2.6155 | 59440 | 1.0044 | - |
| 2.6159 | 59450 | 1.0178 | - |
| 2.6164 | 59460 | 1.0224 | - |
| 2.6168 | 59470 | 1.0718 | - |
| 2.6173 | 59480 | 1.0486 | - |
| 2.6177 | 59490 | 1.0701 | - |
| 2.6181 | 59500 | 1.0392 | - |
| 2.6186 | 59510 | 1.0633 | - |
| 2.6190 | 59520 | 1.0088 | - |
| 2.6195 | 59530 | 1.0453 | - |
| 2.6199 | 59540 | 1.0233 | - |
| 2.6203 | 59550 | 0.9815 | - |
| 2.6208 | 59560 | 1.0467 | - |
| 2.6212 | 59570 | 1.0139 | - |
| 2.6217 | 59580 | 1.0513 | - |
| 2.6221 | 59590 | 0.9923 | - |
| 2.6225 | 59600 | 1.0188 | - |
| 2.6230 | 59610 | 1.0169 | - |
| 2.6234 | 59620 | 0.9783 | - |
| 2.6239 | 59630 | 1.0065 | - |
| 2.6243 | 59640 | 1.0147 | - |
| 2.6247 | 59650 | 1.038 | - |
| 2.6252 | 59660 | 1.0255 | - |
| 2.6256 | 59670 | 0.9882 | - |
| 2.6261 | 59680 | 1.0337 | - |
| 2.6265 | 59690 | 1.0639 | - |
| 2.6269 | 59700 | 1.0001 | - |
| 2.6274 | 59710 | 1.0348 | - |
| 2.6278 | 59720 | 0.9949 | - |
| 2.6283 | 59730 | 1.0428 | - |
| 2.6287 | 59740 | 1.0202 | - |
| 2.6291 | 59750 | 1.0239 | - |
| 2.6296 | 59760 | 1.0756 | - |
| 2.6300 | 59770 | 1.0305 | - |
| 2.6305 | 59780 | 0.9798 | - |
| 2.6309 | 59790 | 1.0432 | - |
| 2.6313 | 59800 | 1.0045 | - |
| 2.6318 | 59810 | 0.9888 | - |
| 2.6322 | 59820 | 0.9663 | - |
| 2.6327 | 59830 | 1.054 | - |
| 2.6331 | 59840 | 1.0371 | - |
| 2.6335 | 59850 | 1.04 | - |
| 2.6340 | 59860 | 1.0025 | - |
| 2.6344 | 59870 | 1.0185 | - |
| 2.6349 | 59880 | 1.0125 | - |
| 2.6353 | 59890 | 1.0086 | - |
| 2.6357 | 59900 | 1.0442 | - |
| 2.6362 | 59910 | 1.0043 | - |
| 2.6366 | 59920 | 1.0251 | - |
| 2.6371 | 59930 | 1.0135 | - |
| 2.6375 | 59940 | 1.0114 | - |
| 2.6379 | 59950 | 1.0204 | - |
| 2.6384 | 59960 | 1.052 | - |
| 2.6388 | 59970 | 1.0178 | - |
| 2.6393 | 59980 | 1.0707 | - |
| 2.6397 | 59990 | 1.0484 | - |
| 2.6401 | 60000 | 1.0584 | - |
| 2.6406 | 60010 | 1.0082 | - |
| 2.6410 | 60020 | 1.0452 | - |
| 2.6415 | 60030 | 0.9976 | - |
| 2.6419 | 60040 | 1.0137 | - |
| 2.6423 | 60050 | 1.014 | - |
| 2.6428 | 60060 | 1.0239 | - |
| 2.6432 | 60070 | 0.955 | - |
| 2.6437 | 60080 | 0.9711 | - |
| 2.6441 | 60090 | 1.0513 | - |
| 2.6445 | 60100 | 0.9854 | - |
| 2.6450 | 60110 | 0.9957 | - |
| 2.6454 | 60120 | 0.9909 | - |
| 2.6459 | 60130 | 1.0271 | - |
| 2.6463 | 60140 | 1.0009 | - |
| 2.6467 | 60150 | 1.0189 | - |
| 2.6472 | 60160 | 1.0277 | - |
| 2.6476 | 60170 | 1.0362 | - |
| 2.6481 | 60180 | 0.9839 | - |
| 2.6485 | 60190 | 1.0261 | - |
| 2.6489 | 60200 | 1.0036 | - |
| 2.6494 | 60210 | 1.0483 | - |
| 2.6498 | 60220 | 1.0178 | - |
| 2.6503 | 60230 | 0.984 | - |
| 2.6507 | 60240 | 1.0078 | - |
| 2.6511 | 60250 | 1.0424 | - |
| 2.6516 | 60260 | 0.9991 | - |
| 2.6516 | 60261 | - | 1.4031 |
| 2.6520 | 60270 | 0.9808 | - |
| 2.6525 | 60280 | 1.0062 | - |
| 2.6529 | 60290 | 1.0058 | - |
| 2.6533 | 60300 | 1.0275 | - |
| 2.6538 | 60310 | 1.0474 | - |
| 2.6542 | 60320 | 1.0422 | - |
| 2.6547 | 60330 | 0.9976 | - |
| 2.6551 | 60340 | 1.0008 | - |
| 2.6555 | 60350 | 0.9751 | - |
| 2.6560 | 60360 | 0.9672 | - |
| 2.6564 | 60370 | 0.9775 | - |
| 2.6569 | 60380 | 1.0612 | - |
| 2.6573 | 60390 | 1.0038 | - |
| 2.6577 | 60400 | 0.9966 | - |
| 2.6582 | 60410 | 1.0681 | - |
| 2.6586 | 60420 | 0.9923 | - |
| 2.6591 | 60430 | 0.9863 | - |
| 2.6595 | 60440 | 0.9994 | - |
| 2.6599 | 60450 | 1.0029 | - |
| 2.6604 | 60460 | 0.9867 | - |
| 2.6608 | 60470 | 1.0223 | - |
| 2.6613 | 60480 | 1.0195 | - |
| 2.6617 | 60490 | 1.0122 | - |
| 2.6621 | 60500 | 0.9735 | - |
| 2.6626 | 60510 | 0.9904 | - |
| 2.6630 | 60520 | 1.0392 | - |
| 2.6635 | 60530 | 0.9941 | - |
| 2.6639 | 60540 | 1.0389 | - |
| 2.6643 | 60550 | 1.0295 | - |
| 2.6648 | 60560 | 0.98 | - |
| 2.6652 | 60570 | 1.0509 | - |
| 2.6657 | 60580 | 0.9976 | - |
| 2.6661 | 60590 | 1.0167 | - |
| 2.6665 | 60600 | 1.0257 | - |
| 2.6670 | 60610 | 1.0024 | - |
| 2.6674 | 60620 | 1.013 | - |
| 2.6679 | 60630 | 0.9811 | - |
| 2.6683 | 60640 | 1.0639 | - |
| 2.6687 | 60650 | 0.991 | - |
| 2.6692 | 60660 | 0.9691 | - |
| 2.6696 | 60670 | 1.0222 | - |
| 2.6701 | 60680 | 1.0692 | - |
| 2.6705 | 60690 | 0.9754 | - |
| 2.6709 | 60700 | 1.0219 | - |
| 2.6714 | 60710 | 0.9966 | - |
| 2.6718 | 60720 | 1.0098 | - |
| 2.6723 | 60730 | 1.0132 | - |
| 2.6727 | 60740 | 0.9955 | - |
| 2.6731 | 60750 | 0.9789 | - |
| 2.6736 | 60760 | 1.0112 | - |
| 2.6740 | 60770 | 0.9922 | - |
| 2.6745 | 60780 | 1.0087 | - |
| 2.6749 | 60790 | 1.068 | - |
| 2.6753 | 60800 | 0.9834 | - |
| 2.6758 | 60810 | 1.0062 | - |
| 2.6762 | 60820 | 0.9884 | - |
| 2.6767 | 60830 | 0.9865 | - |
| 2.6771 | 60840 | 0.9919 | - |
| 2.6775 | 60850 | 1.0043 | - |
| 2.6780 | 60860 | 0.9848 | - |
| 2.6784 | 60870 | 1.0297 | - |
| 2.6789 | 60880 | 1.0108 | - |
| 2.6793 | 60890 | 1.0275 | - |
| 2.6798 | 60900 | 0.9725 | - |
| 2.6802 | 60910 | 0.9834 | - |
| 2.6806 | 60920 | 0.9773 | - |
| 2.6811 | 60930 | 1.003 | - |
| 2.6815 | 60940 | 1.0144 | - |
| 2.6820 | 60950 | 0.966 | - |
| 2.6824 | 60960 | 0.9708 | - |
| 2.6828 | 60970 | 1.0001 | - |
| 2.6833 | 60980 | 0.9731 | - |
| 2.6837 | 60990 | 0.984 | - |
| 2.6842 | 61000 | 0.9683 | - |
| 2.6846 | 61010 | 1.0115 | - |
| 2.6850 | 61020 | 1.038 | - |
| 2.6855 | 61030 | 0.9599 | - |
| 2.6859 | 61040 | 1.0146 | - |
| 2.6864 | 61050 | 0.9981 | - |
| 2.6868 | 61060 | 0.9793 | - |
| 2.6872 | 61070 | 0.9958 | - |
| 2.6877 | 61080 | 0.9898 | - |
| 2.6881 | 61090 | 0.9935 | - |
| 2.6886 | 61100 | 1.0196 | - |
| 2.6890 | 61110 | 0.9991 | - |
| 2.6894 | 61120 | 0.9969 | - |
| 2.6899 | 61130 | 0.9879 | - |
| 2.6903 | 61140 | 0.9978 | - |
| 2.6908 | 61150 | 1.0246 | - |
| 2.6912 | 61160 | 0.9698 | - |
| 2.6916 | 61170 | 0.9818 | - |
| 2.6921 | 61180 | 1.0289 | - |
| 2.6925 | 61190 | 0.9697 | - |
| 2.6930 | 61200 | 0.986 | - |
| 2.6934 | 61210 | 1.0111 | - |
| 2.6938 | 61220 | 0.9913 | - |
| 2.6943 | 61230 | 1.0094 | - |
| 2.6947 | 61240 | 1.0067 | - |
| 2.6952 | 61250 | 1.0267 | - |
| 2.6956 | 61260 | 0.9805 | - |
| 2.6960 | 61270 | 1.0015 | - |
| 2.6965 | 61280 | 0.9731 | - |
| 2.6969 | 61290 | 0.9698 | - |
| 2.6974 | 61300 | 0.9689 | - |
| 2.6978 | 61310 | 1.0202 | - |
| 2.6982 | 61320 | 0.9741 | - |
| 2.6987 | 61330 | 1.0203 | - |
| 2.6991 | 61340 | 0.9913 | - |
| 2.6996 | 61350 | 0.9874 | - |
| 2.7000 | 61360 | 1.022 | - |
| 2.7004 | 61370 | 0.9427 | - |
| 2.7009 | 61380 | 1.016 | - |
| 2.7013 | 61390 | 0.9859 | - |
| 2.7017 | 61398 | - | 1.4182 |
| 2.7018 | 61400 | 0.9814 | - |
| 2.7022 | 61410 | 1.0272 | - |
| 2.7026 | 61420 | 1.0051 | - |
| 2.7031 | 61430 | 0.9782 | - |
| 2.7035 | 61440 | 1.0113 | - |
| 2.7040 | 61450 | 0.9442 | - |
| 2.7044 | 61460 | 0.9906 | - |
| 2.7048 | 61470 | 0.9717 | - |
| 2.7053 | 61480 | 0.9921 | - |
| 2.7057 | 61490 | 1.0211 | - |
| 2.7062 | 61500 | 1.0186 | - |
| 2.7066 | 61510 | 0.979 | - |
| 2.7070 | 61520 | 0.9549 | - |
| 2.7075 | 61530 | 1.0076 | - |
| 2.7079 | 61540 | 0.9974 | - |
| 2.7084 | 61550 | 0.9892 | - |
| 2.7088 | 61560 | 0.9796 | - |
| 2.7092 | 61570 | 0.9754 | - |
| 2.7097 | 61580 | 1.0503 | - |
| 2.7101 | 61590 | 0.9709 | - |
| 2.7106 | 61600 | 0.95 | - |
| 2.7110 | 61610 | 1.0043 | - |
| 2.7114 | 61620 | 0.9379 | - |
| 2.7119 | 61630 | 0.9976 | - |
| 2.7123 | 61640 | 0.9983 | - |
| 2.7128 | 61650 | 0.9642 | - |
| 2.7132 | 61660 | 0.9454 | - |
| 2.7136 | 61670 | 1.0031 | - |
| 2.7141 | 61680 | 0.9881 | - |
| 2.7145 | 61690 | 0.978 | - |
| 2.7150 | 61700 | 0.9721 | - |
| 2.7154 | 61710 | 0.9811 | - |
| 2.7158 | 61720 | 1.0271 | - |
| 2.7163 | 61730 | 1.0262 | - |
| 2.7167 | 61740 | 0.9757 | - |
| 2.7172 | 61750 | 1.0199 | - |
| 2.7176 | 61760 | 0.9787 | - |
| 2.7180 | 61770 | 0.9825 | - |
| 2.7185 | 61780 | 1.005 | - |
| 2.7189 | 61790 | 1.0164 | - |
| 2.7194 | 61800 | 0.9788 | - |
| 2.7198 | 61810 | 1.0079 | - |
| 2.7202 | 61820 | 0.9838 | - |
| 2.7207 | 61830 | 0.9842 | - |
| 2.7211 | 61840 | 0.9866 | - |
| 2.7216 | 61850 | 0.9658 | - |
| 2.7220 | 61860 | 0.9756 | - |
| 2.7224 | 61870 | 0.9995 | - |
| 2.7229 | 61880 | 0.958 | - |
| 2.7233 | 61890 | 0.9666 | - |
| 2.7238 | 61900 | 0.9839 | - |
| 2.7242 | 61910 | 1.0069 | - |
| 2.7246 | 61920 | 0.9648 | - |
| 2.7251 | 61930 | 0.9428 | - |
| 2.7255 | 61940 | 0.9907 | - |
| 2.7260 | 61950 | 0.9568 | - |
| 2.7264 | 61960 | 1.0011 | - |
| 2.7268 | 61970 | 1.0205 | - |
| 2.7273 | 61980 | 0.9806 | - |
| 2.7277 | 61990 | 0.9821 | - |
| 2.7282 | 62000 | 0.9144 | - |
| 2.7286 | 62010 | 0.969 | - |
| 2.7290 | 62020 | 1.0242 | - |
| 2.7295 | 62030 | 0.994 | - |
| 2.7299 | 62040 | 0.9891 | - |
| 2.7304 | 62050 | 0.9915 | - |
| 2.7308 | 62060 | 1.026 | - |
| 2.7312 | 62070 | 1.0168 | - |
| 2.7317 | 62080 | 0.99 | - |
| 2.7321 | 62090 | 0.9904 | - |
| 2.7326 | 62100 | 0.9744 | - |
| 2.7330 | 62110 | 0.9762 | - |
| 2.7334 | 62120 | 0.9758 | - |
| 2.7339 | 62130 | 0.9566 | - |
| 2.7343 | 62140 | 0.9373 | - |
| 2.7348 | 62150 | 0.9963 | - |
| 2.7352 | 62160 | 0.973 | - |
| 2.7356 | 62170 | 0.9558 | - |
| 2.7361 | 62180 | 1.0284 | - |
| 2.7365 | 62190 | 1.0116 | - |
| 2.7370 | 62200 | 0.9722 | - |
| 2.7374 | 62210 | 0.9768 | - |
| 2.7378 | 62220 | 0.9977 | - |
| 2.7383 | 62230 | 0.9554 | - |
| 2.7387 | 62240 | 0.9947 | - |
| 2.7392 | 62250 | 0.9923 | - |
| 2.7396 | 62260 | 1.0169 | - |
| 2.7400 | 62270 | 1.0167 | - |
| 2.7405 | 62280 | 0.9663 | - |
| 2.7409 | 62290 | 0.9929 | - |
| 2.7414 | 62300 | 0.981 | - |
| 2.7418 | 62310 | 0.9743 | - |
| 2.7422 | 62320 | 0.9492 | - |
| 2.7427 | 62330 | 0.9719 | - |
| 2.7431 | 62340 | 1.0118 | - |
| 2.7436 | 62350 | 0.9886 | - |
| 2.7440 | 62360 | 0.9877 | - |
| 2.7444 | 62370 | 0.9656 | - |
| 2.7449 | 62380 | 1.0129 | - |
| 2.7453 | 62390 | 0.9878 | - |
| 2.7458 | 62400 | 0.9646 | - |
| 2.7462 | 62410 | 1.008 | - |
| 2.7466 | 62420 | 0.9663 | - |
| 2.7471 | 62430 | 0.988 | - |
| 2.7475 | 62440 | 1.0001 | - |
| 2.7480 | 62450 | 0.9786 | - |
| 2.7484 | 62460 | 0.988 | - |
| 2.7488 | 62470 | 0.9843 | - |
| 2.7493 | 62480 | 0.9777 | - |
| 2.7497 | 62490 | 1.0405 | - |
| 2.7502 | 62500 | 1.0087 | - |
| 2.7506 | 62510 | 0.9865 | - |
| 2.7510 | 62520 | 0.9733 | - |
| 2.7515 | 62530 | 1.0518 | - |
| 2.7517 | 62535 | - | 1.4275 |
| 2.7519 | 62540 | 1.003 | - |
| 2.7524 | 62550 | 0.9849 | - |
| 2.7528 | 62560 | 1.0063 | - |
| 2.7532 | 62570 | 1.0046 | - |
| 2.7537 | 62580 | 0.956 | - |
| 2.7541 | 62590 | 0.9616 | - |
| 2.7546 | 62600 | 1.0175 | - |
| 2.7550 | 62610 | 1.0241 | - |
| 2.7554 | 62620 | 0.9807 | - |
| 2.7559 | 62630 | 0.9802 | - |
| 2.7563 | 62640 | 0.9717 | - |
| 2.7568 | 62650 | 0.9866 | - |
| 2.7572 | 62660 | 0.9489 | - |
| 2.7576 | 62670 | 1.0021 | - |
| 2.7581 | 62680 | 1.0325 | - |
| 2.7585 | 62690 | 1.0167 | - |
| 2.7590 | 62700 | 0.9765 | - |
| 2.7594 | 62710 | 0.9843 | - |
| 2.7598 | 62720 | 0.9458 | - |
| 2.7603 | 62730 | 0.9849 | - |
| 2.7607 | 62740 | 0.983 | - |
| 2.7612 | 62750 | 1.0202 | - |
| 2.7616 | 62760 | 0.9966 | - |
| 2.7620 | 62770 | 0.9667 | - |
| 2.7625 | 62780 | 0.9982 | - |
| 2.7629 | 62790 | 0.9695 | - |
| 2.7634 | 62800 | 1.0125 | - |
| 2.7638 | 62810 | 0.9695 | - |
| 2.7642 | 62820 | 0.9938 | - |
| 2.7647 | 62830 | 1.0364 | - |
| 2.7651 | 62840 | 0.9575 | - |
| 2.7656 | 62850 | 0.9886 | - |
| 2.7660 | 62860 | 0.9947 | - |
| 2.7664 | 62870 | 0.9653 | - |
| 2.7669 | 62880 | 0.9729 | - |
| 2.7673 | 62890 | 0.9697 | - |
| 2.7678 | 62900 | 1.0244 | - |
| 2.7682 | 62910 | 0.9795 | - |
| 2.7686 | 62920 | 0.9978 | - |
| 2.7691 | 62930 | 0.9662 | - |
| 2.7695 | 62940 | 0.9559 | - |
| 2.7700 | 62950 | 0.988 | - |
| 2.7704 | 62960 | 0.973 | - |
| 2.7708 | 62970 | 0.9212 | - |
| 2.7713 | 62980 | 0.956 | - |
| 2.7717 | 62990 | 1.0327 | - |
| 2.7722 | 63000 | 0.9891 | - |
| 2.7726 | 63010 | 0.9819 | - |
| 2.7730 | 63020 | 0.9962 | - |
| 2.7735 | 63030 | 0.9638 | - |
| 2.7739 | 63040 | 1.0071 | - |
| 2.7744 | 63050 | 0.9844 | - |
| 2.7748 | 63060 | 0.9542 | - |
| 2.7752 | 63070 | 1.0177 | - |
| 2.7757 | 63080 | 0.9507 | - |
| 2.7761 | 63090 | 0.9625 | - |
| 2.7766 | 63100 | 0.988 | - |
| 2.7770 | 63110 | 0.9617 | - |
| 2.7774 | 63120 | 0.9376 | - |
| 2.7779 | 63130 | 0.9938 | - |
| 2.7783 | 63140 | 0.9616 | - |
| 2.7788 | 63150 | 1.0192 | - |
| 2.7792 | 63160 | 0.9593 | - |
| 2.7796 | 63170 | 1.0152 | - |
| 2.7801 | 63180 | 0.9521 | - |
| 2.7805 | 63190 | 1.0063 | - |
| 2.7810 | 63200 | 0.9498 | - |
| 2.7814 | 63210 | 1.0048 | - |
| 2.7818 | 63220 | 0.9776 | - |
| 2.7823 | 63230 | 0.9934 | - |
| 2.7827 | 63240 | 0.9722 | - |
| 2.7832 | 63250 | 0.9143 | - |
| 2.7836 | 63260 | 0.9494 | - |
| 2.7840 | 63270 | 0.9866 | - |
| 2.7845 | 63280 | 0.9731 | - |
| 2.7849 | 63290 | 0.929 | - |
| 2.7854 | 63300 | 1.0062 | - |
| 2.7858 | 63310 | 0.9814 | - |
| 2.7862 | 63320 | 0.9475 | - |
| 2.7867 | 63330 | 1.0054 | - |
| 2.7871 | 63340 | 0.9178 | - |
| 2.7876 | 63350 | 0.9822 | - |
| 2.7880 | 63360 | 0.9903 | - |
| 2.7884 | 63370 | 0.954 | - |
| 2.7889 | 63380 | 0.9306 | - |
| 2.7893 | 63390 | 1.0151 | - |
| 2.7898 | 63400 | 1.0007 | - |
| 2.7902 | 63410 | 0.9604 | - |
| 2.7906 | 63420 | 0.9658 | - |
| 2.7911 | 63430 | 0.9366 | - |
| 2.7915 | 63440 | 0.9949 | - |
| 2.7920 | 63450 | 0.9398 | - |
| 2.7924 | 63460 | 0.9365 | - |
| 2.7928 | 63470 | 0.9658 | - |
| 2.7933 | 63480 | 1.0023 | - |
| 2.7937 | 63490 | 0.9365 | - |
| 2.7942 | 63500 | 0.9635 | - |
| 2.7946 | 63510 | 0.9218 | - |
| 2.7950 | 63520 | 0.9643 | - |
| 2.7955 | 63530 | 0.9629 | - |
| 2.7959 | 63540 | 0.9422 | - |
| 2.7964 | 63550 | 0.9577 | - |
| 2.7968 | 63560 | 0.8946 | - |
| 2.7972 | 63570 | 0.9962 | - |
| 2.7977 | 63580 | 0.9649 | - |
| 2.7981 | 63590 | 0.9778 | - |
| 2.7986 | 63600 | 0.9731 | - |
| 2.7990 | 63610 | 0.9654 | - |
| 2.7994 | 63620 | 0.9912 | - |
| 2.7999 | 63630 | 0.9668 | - |
| 2.8003 | 63640 | 0.9645 | - |
| 2.8008 | 63650 | 0.9762 | - |
| 2.8012 | 63660 | 0.9573 | - |
| 2.8016 | 63670 | 1.0233 | - |
| 2.8017 | 63672 | - | 1.4237 |
| 2.8021 | 63680 | 0.9747 | - |
| 2.8025 | 63690 | 0.9511 | - |
| 2.8030 | 63700 | 0.9762 | - |
| 2.8034 | 63710 | 0.9565 | - |
| 2.8038 | 63720 | 0.9645 | - |
| 2.8043 | 63730 | 0.9517 | - |
| 2.8047 | 63740 | 0.9634 | - |
| 2.8052 | 63750 | 0.9971 | - |
| 2.8056 | 63760 | 0.9415 | - |
| 2.8060 | 63770 | 0.9689 | - |
| 2.8065 | 63780 | 0.9797 | - |
| 2.8069 | 63790 | 0.9631 | - |
| 2.8074 | 63800 | 1.014 | - |
| 2.8078 | 63810 | 0.9842 | - |
| 2.8082 | 63820 | 1.0076 | - |
| 2.8087 | 63830 | 0.9782 | - |
| 2.8091 | 63840 | 0.9678 | - |
| 2.8096 | 63850 | 0.9736 | - |
| 2.8100 | 63860 | 0.9232 | - |
| 2.8104 | 63870 | 0.9465 | - |
| 2.8109 | 63880 | 0.9826 | - |
| 2.8113 | 63890 | 1.0009 | - |
| 2.8118 | 63900 | 0.9719 | - |
| 2.8122 | 63910 | 0.9961 | - |
| 2.8126 | 63920 | 0.982 | - |
| 2.8131 | 63930 | 0.9737 | - |
| 2.8135 | 63940 | 0.9694 | - |
| 2.8140 | 63950 | 0.9092 | - |
| 2.8144 | 63960 | 0.959 | - |
| 2.8148 | 63970 | 0.9375 | - |
| 2.8153 | 63980 | 1.0143 | - |
| 2.8157 | 63990 | 0.9414 | - |
| 2.8162 | 64000 | 0.9157 | - |
| 2.8166 | 64010 | 0.9641 | - |
| 2.8170 | 64020 | 0.927 | - |
| 2.8175 | 64030 | 1.0102 | - |
| 2.8179 | 64040 | 0.975 | - |
| 2.8184 | 64050 | 0.9542 | - |
| 2.8188 | 64060 | 0.9673 | - |
| 2.8192 | 64070 | 0.9969 | - |
| 2.8197 | 64080 | 0.974 | - |
| 2.8201 | 64090 | 0.9639 | - |
| 2.8206 | 64100 | 0.9726 | - |
| 2.8210 | 64110 | 0.988 | - |
| 2.8214 | 64120 | 0.9504 | - |
| 2.8219 | 64130 | 0.9609 | - |
| 2.8223 | 64140 | 0.9615 | - |
| 2.8228 | 64150 | 0.9475 | - |
| 2.8232 | 64160 | 0.9669 | - |
| 2.8236 | 64170 | 0.9476 | - |
| 2.8241 | 64180 | 0.9894 | - |
| 2.8245 | 64190 | 0.9774 | - |
| 2.8250 | 64200 | 0.9228 | - |
| 2.8254 | 64210 | 0.9409 | - |
| 2.8258 | 64220 | 0.9292 | - |
| 2.8263 | 64230 | 0.9485 | - |
| 2.8267 | 64240 | 0.9599 | - |
| 2.8272 | 64250 | 0.9535 | - |
| 2.8276 | 64260 | 0.9584 | - |
| 2.8280 | 64270 | 0.9694 | - |
| 2.8285 | 64280 | 0.9641 | - |
| 2.8289 | 64290 | 0.9308 | - |
| 2.8294 | 64300 | 0.9206 | - |
| 2.8298 | 64310 | 0.962 | - |
| 2.8302 | 64320 | 0.9246 | - |
| 2.8307 | 64330 | 0.9339 | - |
| 2.8311 | 64340 | 0.9664 | - |
| 2.8316 | 64350 | 0.9894 | - |
| 2.8320 | 64360 | 0.968 | - |
| 2.8324 | 64370 | 0.9598 | - |
| 2.8329 | 64380 | 0.9669 | - |
| 2.8333 | 64390 | 0.9732 | - |
| 2.8338 | 64400 | 0.9562 | - |
| 2.8342 | 64410 | 0.9626 | - |
| 2.8346 | 64420 | 1.0196 | - |
| 2.8351 | 64430 | 0.9983 | - |
| 2.8355 | 64440 | 0.9723 | - |
| 2.8360 | 64450 | 0.9406 | - |
| 2.8364 | 64460 | 0.9621 | - |
| 2.8368 | 64470 | 0.9648 | - |
| 2.8373 | 64480 | 0.9752 | - |
| 2.8377 | 64490 | 0.9526 | - |
| 2.8382 | 64500 | 0.9039 | - |
| 2.8386 | 64510 | 0.9611 | - |
| 2.8390 | 64520 | 1.023 | - |
| 2.8395 | 64530 | 0.9273 | - |
| 2.8399 | 64540 | 0.9521 | - |
| 2.8404 | 64550 | 0.9978 | - |
| 2.8408 | 64560 | 0.973 | - |
| 2.8412 | 64570 | 1.0046 | - |
| 2.8417 | 64580 | 0.951 | - |
| 2.8421 | 64590 | 0.9648 | - |
| 2.8426 | 64600 | 0.9417 | - |
| 2.8430 | 64610 | 0.9401 | - |
| 2.8434 | 64620 | 0.998 | - |
| 2.8439 | 64630 | 0.9642 | - |
| 2.8443 | 64640 | 1.0118 | - |
| 2.8448 | 64650 | 0.9536 | - |
| 2.8452 | 64660 | 0.9827 | - |
| 2.8456 | 64670 | 0.9364 | - |
| 2.8461 | 64680 | 0.9455 | - |
| 2.8465 | 64690 | 0.9554 | - |
| 2.8470 | 64700 | 0.9305 | - |
| 2.8474 | 64710 | 0.9852 | - |
| 2.8478 | 64720 | 0.9987 | - |
| 2.8483 | 64730 | 0.9579 | - |
| 2.8487 | 64740 | 0.9876 | - |
| 2.8492 | 64750 | 0.9604 | - |
| 2.8496 | 64760 | 0.9016 | - |
| 2.8500 | 64770 | 0.9506 | - |
| 2.8505 | 64780 | 0.9767 | - |
| 2.8509 | 64790 | 0.9482 | - |
| 2.8514 | 64800 | 1.0002 | - |
| 2.8518 | 64809 | - | 1.4014 |
| 2.8518 | 64810 | 0.9516 | - |
| 2.8522 | 64820 | 0.9842 | - |
| 2.8527 | 64830 | 0.9624 | - |
| 2.8531 | 64840 | 0.946 | - |
| 2.8536 | 64850 | 0.9637 | - |
| 2.8540 | 64860 | 0.9815 | - |
| 2.8544 | 64870 | 0.9671 | - |
| 2.8549 | 64880 | 0.9125 | - |
| 2.8553 | 64890 | 0.9348 | - |
| 2.8558 | 64900 | 0.9951 | - |
| 2.8562 | 64910 | 0.9496 | - |
| 2.8566 | 64920 | 0.967 | - |
| 2.8571 | 64930 | 0.9707 | - |
| 2.8575 | 64940 | 0.9322 | - |
| 2.8580 | 64950 | 0.9546 | - |
| 2.8584 | 64960 | 0.9511 | - |
| 2.8588 | 64970 | 0.9452 | - |
| 2.8593 | 64980 | 0.9569 | - |
| 2.8597 | 64990 | 0.9269 | - |
| 2.8602 | 65000 | 0.9148 | - |
| 2.8606 | 65010 | 0.9604 | - |
| 2.8610 | 65020 | 0.9487 | - |
| 2.8615 | 65030 | 0.9696 | - |
| 2.8619 | 65040 | 0.9582 | - |
| 2.8624 | 65050 | 0.9387 | - |
| 2.8628 | 65060 | 0.9363 | - |
| 2.8632 | 65070 | 0.9353 | - |
| 2.8637 | 65080 | 0.9773 | - |
| 2.8641 | 65090 | 0.9388 | - |
| 2.8646 | 65100 | 0.9474 | - |
| 2.8650 | 65110 | 0.9729 | - |
| 2.8654 | 65120 | 0.96 | - |
| 2.8659 | 65130 | 0.9578 | - |
| 2.8663 | 65140 | 0.9655 | - |
| 2.8668 | 65150 | 0.9331 | - |
| 2.8672 | 65160 | 0.9543 | - |
| 2.8676 | 65170 | 0.9313 | - |
| 2.8681 | 65180 | 0.9817 | - |
| 2.8685 | 65190 | 0.9566 | - |
| 2.8690 | 65200 | 0.9485 | - |
| 2.8694 | 65210 | 0.9455 | - |
| 2.8698 | 65220 | 0.9596 | - |
| 2.8703 | 65230 | 0.9791 | - |
| 2.8707 | 65240 | 0.9653 | - |
| 2.8712 | 65250 | 0.9649 | - |
| 2.8716 | 65260 | 0.9334 | - |
| 2.8720 | 65270 | 0.9877 | - |
| 2.8725 | 65280 | 0.9586 | - |
| 2.8729 | 65290 | 0.9551 | - |
| 2.8734 | 65300 | 0.9287 | - |
| 2.8738 | 65310 | 0.9415 | - |
| 2.8742 | 65320 | 0.9473 | - |
| 2.8747 | 65330 | 0.9495 | - |
| 2.8751 | 65340 | 0.9542 | - |
| 2.8756 | 65350 | 0.9194 | - |
| 2.8760 | 65360 | 0.9606 | - |
| 2.8764 | 65370 | 0.9243 | - |
| 2.8769 | 65380 | 0.9438 | - |
| 2.8773 | 65390 | 0.9668 | - |
| 2.8778 | 65400 | 0.9526 | - |
| 2.8782 | 65410 | 0.9644 | - |
| 2.8786 | 65420 | 0.9333 | - |
| 2.8791 | 65430 | 0.9634 | - |
| 2.8795 | 65440 | 0.965 | - |
| 2.8800 | 65450 | 1.0017 | - |
| 2.8804 | 65460 | 0.9383 | - |
| 2.8808 | 65470 | 0.9425 | - |
| 2.8813 | 65480 | 0.936 | - |
| 2.8817 | 65490 | 0.9481 | - |
| 2.8822 | 65500 | 0.9727 | - |
| 2.8826 | 65510 | 0.978 | - |
| 2.8830 | 65520 | 0.9682 | - |
| 2.8835 | 65530 | 0.9318 | - |
| 2.8839 | 65540 | 0.9922 | - |
| 2.8844 | 65550 | 0.9367 | - |
| 2.8848 | 65560 | 0.9918 | - |
| 2.8852 | 65570 | 0.992 | - |
| 2.8857 | 65580 | 0.9491 | - |
| 2.8861 | 65590 | 0.9786 | - |
| 2.8866 | 65600 | 0.9659 | - |
| 2.8870 | 65610 | 0.9216 | - |
| 2.8874 | 65620 | 0.9707 | - |
| 2.8879 | 65630 | 0.962 | - |
| 2.8883 | 65640 | 0.953 | - |
| 2.8888 | 65650 | 0.9281 | - |
| 2.8892 | 65660 | 0.9333 | - |
| 2.8896 | 65670 | 0.9395 | - |
| 2.8901 | 65680 | 0.9433 | - |
| 2.8905 | 65690 | 0.9503 | - |
| 2.8910 | 65700 | 0.9386 | - |
| 2.8914 | 65710 | 0.8914 | - |
| 2.8918 | 65720 | 0.9276 | - |
| 2.8923 | 65730 | 0.9597 | - |
| 2.8927 | 65740 | 0.9641 | - |
| 2.8932 | 65750 | 0.9367 | - |
| 2.8936 | 65760 | 0.9842 | - |
| 2.8940 | 65770 | 0.9456 | - |
| 2.8945 | 65780 | 0.9384 | - |
| 2.8949 | 65790 | 0.9093 | - |
| 2.8954 | 65800 | 0.9444 | - |
| 2.8958 | 65810 | 0.9486 | - |
| 2.8962 | 65820 | 0.9303 | - |
| 2.8967 | 65830 | 0.9425 | - |
| 2.8971 | 65840 | 0.9409 | - |
| 2.8976 | 65850 | 0.9564 | - |
| 2.8980 | 65860 | 0.9413 | - |
| 2.8984 | 65870 | 0.9595 | - |
| 2.8989 | 65880 | 0.988 | - |
| 2.8993 | 65890 | 0.9484 | - |
| 2.8998 | 65900 | 0.9876 | - |
| 2.9002 | 65910 | 0.9512 | - |
| 2.9006 | 65920 | 0.9367 | - |
| 2.9011 | 65930 | 0.9903 | - |
| 2.9015 | 65940 | 0.9649 | - |
| 2.9018 | 65946 | - | 1.4093 |
| 2.9020 | 65950 | 0.933 | - |
| 2.9024 | 65960 | 0.9118 | - |
| 2.9028 | 65970 | 0.9324 | - |
| 2.9033 | 65980 | 0.9346 | - |
| 2.9037 | 65990 | 0.9474 | - |
| 2.9042 | 66000 | 0.9665 | - |
| 2.9046 | 66010 | 0.8859 | - |
| 2.9050 | 66020 | 0.911 | - |
| 2.9055 | 66030 | 0.9469 | - |
| 2.9059 | 66040 | 0.9528 | - |
| 2.9064 | 66050 | 0.968 | - |
| 2.9068 | 66060 | 0.936 | - |
| 2.9072 | 66070 | 0.9757 | - |
| 2.9077 | 66080 | 0.9455 | - |
| 2.9081 | 66090 | 0.9537 | - |
| 2.9086 | 66100 | 0.9419 | - |
| 2.9090 | 66110 | 0.94 | - |
| 2.9094 | 66120 | 0.948 | - |
| 2.9099 | 66130 | 0.9683 | - |
| 2.9103 | 66140 | 0.933 | - |
| 2.9108 | 66150 | 0.9711 | - |
| 2.9112 | 66160 | 0.9318 | - |
| 2.9116 | 66170 | 0.9349 | - |
| 2.9121 | 66180 | 0.9487 | - |
| 2.9125 | 66190 | 0.9265 | - |
| 2.9130 | 66200 | 0.9176 | - |
| 2.9134 | 66210 | 0.9134 | - |
| 2.9138 | 66220 | 0.9465 | - |
| 2.9143 | 66230 | 0.9439 | - |
| 2.9147 | 66240 | 0.9264 | - |
| 2.9152 | 66250 | 0.9678 | - |
| 2.9156 | 66260 | 0.93 | - |
| 2.9160 | 66270 | 0.9397 | - |
| 2.9165 | 66280 | 0.9385 | - |
| 2.9169 | 66290 | 0.9916 | - |
| 2.9174 | 66300 | 0.9582 | - |
| 2.9178 | 66310 | 0.9701 | - |
| 2.9182 | 66320 | 0.9795 | - |
| 2.9187 | 66330 | 0.9415 | - |
| 2.9191 | 66340 | 0.9324 | - |
| 2.9196 | 66350 | 0.9563 | - |
| 2.9200 | 66360 | 0.9297 | - |
| 2.9204 | 66370 | 0.9621 | - |
| 2.9209 | 66380 | 0.9421 | - |
| 2.9213 | 66390 | 0.9769 | - |
| 2.9218 | 66400 | 0.9751 | - |
| 2.9222 | 66410 | 0.9601 | - |
| 2.9226 | 66420 | 0.9182 | - |
| 2.9231 | 66430 | 0.9328 | - |
| 2.9235 | 66440 | 0.9954 | - |
| 2.9240 | 66450 | 0.9775 | - |
| 2.9244 | 66460 | 0.9481 | - |
| 2.9248 | 66470 | 0.9252 | - |
| 2.9253 | 66480 | 0.9601 | - |
| 2.9257 | 66490 | 0.9258 | - |
| 2.9262 | 66500 | 0.9519 | - |
| 2.9266 | 66510 | 0.9419 | - |
| 2.9270 | 66520 | 0.9821 | - |
| 2.9275 | 66530 | 0.9628 | - |
| 2.9279 | 66540 | 0.9596 | - |
| 2.9284 | 66550 | 0.9651 | - |
| 2.9288 | 66560 | 0.9457 | - |
| 2.9292 | 66570 | 0.9636 | - |
| 2.9297 | 66580 | 0.9565 | - |
| 2.9301 | 66590 | 0.943 | - |
| 2.9306 | 66600 | 0.9347 | - |
| 2.9310 | 66610 | 0.9608 | - |
| 2.9314 | 66620 | 0.9401 | - |
| 2.9319 | 66630 | 0.9316 | - |
| 2.9323 | 66640 | 0.9514 | - |
| 2.9328 | 66650 | 0.932 | - |
| 2.9332 | 66660 | 0.8954 | - |
| 2.9336 | 66670 | 0.9506 | - |
| 2.9341 | 66680 | 0.9512 | - |
| 2.9345 | 66690 | 0.9201 | - |
| 2.9350 | 66700 | 0.9724 | - |
| 2.9354 | 66710 | 0.9628 | - |
| 2.9358 | 66720 | 0.9679 | - |
| 2.9363 | 66730 | 0.9034 | - |
| 2.9367 | 66740 | 0.9232 | - |
| 2.9372 | 66750 | 0.9499 | - |
| 2.9376 | 66760 | 0.956 | - |
| 2.9380 | 66770 | 0.8967 | - |
| 2.9385 | 66780 | 0.9078 | - |
| 2.9389 | 66790 | 0.9554 | - |
| 2.9394 | 66800 | 0.962 | - |
| 2.9398 | 66810 | 0.9277 | - |
| 2.9402 | 66820 | 0.973 | - |
| 2.9407 | 66830 | 0.9941 | - |
| 2.9411 | 66840 | 0.9007 | - |
| 2.9416 | 66850 | 1.0093 | - |
| 2.9420 | 66860 | 0.9651 | - |
| 2.9424 | 66870 | 0.9464 | - |
| 2.9429 | 66880 | 0.9382 | - |
| 2.9433 | 66890 | 0.92 | - |
| 2.9438 | 66900 | 0.9509 | - |
| 2.9442 | 66910 | 0.9039 | - |
| 2.9446 | 66920 | 0.9271 | - |
| 2.9451 | 66930 | 1.0063 | - |
| 2.9455 | 66940 | 0.8765 | - |
| 2.9460 | 66950 | 0.9411 | - |
| 2.9464 | 66960 | 0.9383 | - |
| 2.9468 | 66970 | 0.9549 | - |
| 2.9473 | 66980 | 0.9784 | - |
| 2.9477 | 66990 | 0.9143 | - |
| 2.9482 | 67000 | 0.9289 | - |
| 2.9486 | 67010 | 0.9784 | - |
| 2.9490 | 67020 | 0.9697 | - |
| 2.9495 | 67030 | 0.9222 | - |
| 2.9499 | 67040 | 0.9148 | - |
| 2.9504 | 67050 | 0.939 | - |
| 2.9508 | 67060 | 0.9518 | - |
| 2.9512 | 67070 | 0.9758 | - |
| 2.9517 | 67080 | 0.9733 | - |
| 2.9518 | 67083 | - | 1.4033 |
| 2.9521 | 67090 | 0.9229 | - |
| 2.9526 | 67100 | 0.9332 | - |
| 2.9530 | 67110 | 0.9693 | - |
| 2.9534 | 67120 | 0.932 | - |
| 2.9539 | 67130 | 0.9004 | - |
| 2.9543 | 67140 | 0.9508 | - |
| 2.9548 | 67150 | 0.8953 | - |
| 2.9552 | 67160 | 0.945 | - |
| 2.9556 | 67170 | 0.8999 | - |
| 2.9561 | 67180 | 0.9741 | - |
| 2.9565 | 67190 | 0.9893 | - |
| 2.9570 | 67200 | 0.9268 | - |
| 2.9574 | 67210 | 0.8717 | - |
| 2.9578 | 67220 | 0.9399 | - |
| 2.9583 | 67230 | 0.9196 | - |
| 2.9587 | 67240 | 0.9522 | - |
| 2.9592 | 67250 | 0.8865 | - |
| 2.9596 | 67260 | 0.9172 | - |
| 2.9600 | 67270 | 0.9659 | - |
| 2.9605 | 67280 | 0.9731 | - |
| 2.9609 | 67290 | 0.9698 | - |
| 2.9614 | 67300 | 0.9513 | - |
| 2.9618 | 67310 | 1.0016 | - |
| 2.9622 | 67320 | 0.9286 | - |
| 2.9627 | 67330 | 0.9043 | - |
| 2.9631 | 67340 | 0.957 | - |
| 2.9636 | 67350 | 0.9494 | - |
| 2.9640 | 67360 | 0.9474 | - |
| 2.9644 | 67370 | 0.9437 | - |
| 2.9649 | 67380 | 0.9502 | - |
| 2.9653 | 67390 | 0.9383 | - |
| 2.9658 | 67400 | 0.913 | - |
| 2.9662 | 67410 | 0.9565 | - |
| 2.9666 | 67420 | 0.948 | - |
| 2.9671 | 67430 | 0.9597 | - |
| 2.9675 | 67440 | 0.9396 | - |
| 2.9680 | 67450 | 0.9148 | - |
| 2.9684 | 67460 | 0.9202 | - |
| 2.9688 | 67470 | 0.9518 | - |
| 2.9693 | 67480 | 0.9288 | - |
| 2.9697 | 67490 | 0.9129 | - |
| 2.9702 | 67500 | 0.9529 | - |
| 2.9706 | 67510 | 0.9457 | - |
| 2.9710 | 67520 | 0.9496 | - |
| 2.9715 | 67530 | 0.9633 | - |
| 2.9719 | 67540 | 0.9281 | - |
| 2.9724 | 67550 | 0.9118 | - |
| 2.9728 | 67560 | 0.9332 | - |
| 2.9732 | 67570 | 0.8986 | - |
| 2.9737 | 67580 | 0.9324 | - |
| 2.9741 | 67590 | 0.9701 | - |
| 2.9746 | 67600 | 0.9659 | - |
| 2.9750 | 67610 | 0.94 | - |
| 2.9754 | 67620 | 0.9052 | - |
| 2.9759 | 67630 | 0.9231 | - |
| 2.9763 | 67640 | 0.9277 | - |
| 2.9768 | 67650 | 0.9046 | - |
| 2.9772 | 67660 | 0.9656 | - |
| 2.9776 | 67670 | 0.8967 | - |
| 2.9781 | 67680 | 0.9378 | - |
| 2.9785 | 67690 | 0.9661 | - |
| 2.9790 | 67700 | 0.9174 | - |
| 2.9794 | 67710 | 0.9411 | - |
| 2.9798 | 67720 | 0.9935 | - |
| 2.9803 | 67730 | 0.966 | - |
| 2.9807 | 67740 | 0.9429 | - |
| 2.9812 | 67750 | 0.9312 | - |
| 2.9816 | 67760 | 0.9141 | - |
| 2.9820 | 67770 | 0.9305 | - |
| 2.9825 | 67780 | 0.9499 | - |
| 2.9829 | 67790 | 0.8737 | - |
| 2.9834 | 67800 | 0.9317 | - |
| 2.9838 | 67810 | 0.9332 | - |
| 2.9842 | 67820 | 0.9365 | - |
| 2.9847 | 67830 | 0.9659 | - |
| 2.9851 | 67840 | 0.9129 | - |
| 2.9856 | 67850 | 0.9318 | - |
| 2.9860 | 67860 | 0.9325 | - |
| 2.9864 | 67870 | 0.9568 | - |
| 2.9869 | 67880 | 0.9447 | - |
| 2.9873 | 67890 | 0.9452 | - |
| 2.9878 | 67900 | 0.9204 | - |
| 2.9882 | 67910 | 0.9152 | - |
| 2.9886 | 67920 | 0.9105 | - |
| 2.9891 | 67930 | 0.9512 | - |
| 2.9895 | 67940 | 0.9048 | - |
| 2.9900 | 67950 | 0.9502 | - |
| 2.9904 | 67960 | 0.9192 | - |
| 2.9908 | 67970 | 0.9599 | - |
| 2.9913 | 67980 | 0.9313 | - |
| 2.9917 | 67990 | 0.9556 | - |
| 2.9922 | 68000 | 0.9323 | - |
| 2.9926 | 68010 | 0.9789 | - |
| 2.9930 | 68020 | 0.916 | - |
| 2.9935 | 68030 | 0.9094 | - |
| 2.9939 | 68040 | 0.9188 | - |
| 2.9944 | 68050 | 0.8964 | - |
| 2.9948 | 68060 | 0.9545 | - |
| 2.9952 | 68070 | 0.9498 | - |
| 2.9957 | 68080 | 0.8951 | - |
| 2.9961 | 68090 | 0.8845 | - |
| 2.9966 | 68100 | 0.9399 | - |
| 2.9970 | 68110 | 0.9405 | - |
| 2.9974 | 68120 | 0.9405 | - |
| 2.9979 | 68130 | 0.9891 | - |
| 2.9983 | 68140 | 0.9637 | - |
| 2.9988 | 68150 | 0.8949 | - |
| 2.9992 | 68160 | 0.9027 | - |
| 2.9996 | 68170 | 0.883 | - |
| 3.0001 | 68180 | 0.8899 | - |
| 3.0005 | 68190 | 0.8705 | - |
| 3.0010 | 68200 | 0.8856 | - |
| 3.0014 | 68210 | 0.8968 | - |
| 3.0018 | 68220 | 0.875 | 1.3861 |
| 3.0023 | 68230 | 0.8976 | - |
| 3.0027 | 68240 | 0.8922 | - |
| 3.0032 | 68250 | 0.8546 | - |
| 3.0036 | 68260 | 0.8864 | - |
| 3.0040 | 68270 | 0.8953 | - |
| 3.0045 | 68280 | 0.8629 | - |
| 3.0049 | 68290 | 0.8722 | - |
| 3.0054 | 68300 | 0.8894 | - |
| 3.0058 | 68310 | 0.9284 | - |
| 3.0062 | 68320 | 0.8542 | - |
| 3.0067 | 68330 | 0.8437 | - |
| 3.0071 | 68340 | 0.9421 | - |
| 3.0076 | 68350 | 0.8891 | - |
| 3.0080 | 68360 | 0.867 | - |
| 3.0084 | 68370 | 0.918 | - |
| 3.0089 | 68380 | 0.8838 | - |
| 3.0093 | 68390 | 0.8671 | - |
| 3.0098 | 68400 | 0.9028 | - |
| 3.0102 | 68410 | 0.857 | - |
| 3.0106 | 68420 | 0.888 | - |
| 3.0111 | 68430 | 0.8766 | - |
| 3.0115 | 68440 | 0.8578 | - |
| 3.0120 | 68450 | 0.884 | - |
| 3.0124 | 68460 | 0.8392 | - |
| 3.0128 | 68470 | 0.8895 | - |
| 3.0133 | 68480 | 0.872 | - |
| 3.0137 | 68490 | 0.9079 | - |
| 3.0142 | 68500 | 0.8682 | - |
| 3.0146 | 68510 | 0.9102 | - |
| 3.0150 | 68520 | 0.8569 | - |
| 3.0155 | 68530 | 0.8634 | - |
| 3.0159 | 68540 | 0.8789 | - |
| 3.0164 | 68550 | 0.8669 | - |
| 3.0168 | 68560 | 0.8199 | - |
| 3.0172 | 68570 | 0.8682 | - |
| 3.0177 | 68580 | 0.8796 | - |
| 3.0181 | 68590 | 0.8327 | - |
| 3.0186 | 68600 | 0.8988 | - |
| 3.0190 | 68610 | 0.8954 | - |
| 3.0194 | 68620 | 0.9102 | - |
| 3.0199 | 68630 | 0.8689 | - |
| 3.0203 | 68640 | 0.9151 | - |
| 3.0208 | 68650 | 0.8124 | - |
| 3.0212 | 68660 | 0.866 | - |
| 3.0216 | 68670 | 0.8721 | - |
| 3.0221 | 68680 | 0.871 | - |
| 3.0225 | 68690 | 0.8716 | - |
| 3.0230 | 68700 | 0.8693 | - |
| 3.0234 | 68710 | 0.8947 | - |
| 3.0238 | 68720 | 0.8758 | - |
| 3.0243 | 68730 | 0.8546 | - |
| 3.0247 | 68740 | 0.8122 | - |
| 3.0252 | 68750 | 0.872 | - |
| 3.0256 | 68760 | 0.9222 | - |
| 3.0260 | 68770 | 0.8297 | - |
| 3.0265 | 68780 | 0.8678 | - |
| 3.0269 | 68790 | 0.8712 | - |
| 3.0274 | 68800 | 0.8741 | - |
| 3.0278 | 68810 | 0.9197 | - |
| 3.0282 | 68820 | 0.8898 | - |
| 3.0287 | 68830 | 0.9001 | - |
| 3.0291 | 68840 | 0.923 | - |
| 3.0296 | 68850 | 0.8059 | - |
| 3.0300 | 68860 | 0.8863 | - |
| 3.0304 | 68870 | 0.8842 | - |
| 3.0309 | 68880 | 0.8652 | - |
| 3.0313 | 68890 | 0.8658 | - |
| 3.0318 | 68900 | 0.91 | - |
| 3.0322 | 68910 | 0.8827 | - |
| 3.0326 | 68920 | 0.8869 | - |
| 3.0331 | 68930 | 0.8724 | - |
| 3.0335 | 68940 | 0.8916 | - |
| 3.0340 | 68950 | 0.863 | - |
| 3.0344 | 68960 | 0.8782 | - |
| 3.0348 | 68970 | 0.8918 | - |
| 3.0353 | 68980 | 0.9083 | - |
| 3.0357 | 68990 | 0.8584 | - |
| 3.0362 | 69000 | 0.8667 | - |
| 3.0366 | 69010 | 0.8636 | - |
| 3.0371 | 69020 | 0.9053 | - |
| 3.0375 | 69030 | 0.9107 | - |
| 3.0379 | 69040 | 0.8752 | - |
| 3.0384 | 69050 | 0.9087 | - |
| 3.0388 | 69060 | 0.8617 | - |
| 3.0393 | 69070 | 0.9103 | - |
| 3.0397 | 69080 | 0.8752 | - |
| 3.0401 | 69090 | 0.883 | - |
| 3.0406 | 69100 | 0.8619 | - |
| 3.0410 | 69110 | 0.8453 | - |
| 3.0415 | 69120 | 0.8296 | - |
| 3.0419 | 69130 | 0.8738 | - |
| 3.0423 | 69140 | 0.9009 | - |
| 3.0428 | 69150 | 0.8957 | - |
| 3.0432 | 69160 | 0.8846 | - |
| 3.0437 | 69170 | 0.875 | - |
| 3.0441 | 69180 | 0.8724 | - |
| 3.0445 | 69190 | 0.8717 | - |
| 3.0450 | 69200 | 0.9502 | - |
| 3.0454 | 69210 | 0.865 | - |
| 3.0459 | 69220 | 0.8873 | - |
| 3.0463 | 69230 | 0.8425 | - |
| 3.0467 | 69240 | 0.874 | - |
| 3.0472 | 69250 | 0.8408 | - |
| 3.0476 | 69260 | 0.8373 | - |
| 3.0481 | 69270 | 0.8771 | - |
| 3.0485 | 69280 | 0.8633 | - |
| 3.0489 | 69290 | 0.8769 | - |
| 3.0494 | 69300 | 0.8437 | - |
| 3.0498 | 69310 | 0.8826 | - |
| 3.0503 | 69320 | 0.848 | - |
| 3.0507 | 69330 | 0.8592 | - |
| 3.0511 | 69340 | 0.897 | - |
| 3.0516 | 69350 | 0.8933 | - |
| 3.0519 | 69357 | - | 1.3783 |
| 3.0520 | 69360 | 0.8788 | - |
| 3.0525 | 69370 | 0.8821 | - |
| 3.0529 | 69380 | 0.8681 | - |
| 3.0533 | 69390 | 0.8904 | - |
| 3.0538 | 69400 | 0.8663 | - |
| 3.0542 | 69410 | 0.8501 | - |
| 3.0547 | 69420 | 0.895 | - |
| 3.0551 | 69430 | 0.8897 | - |
| 3.0555 | 69440 | 0.8628 | - |
| 3.0560 | 69450 | 0.9012 | - |
| 3.0564 | 69460 | 0.8866 | - |
| 3.0569 | 69470 | 0.9094 | - |
| 3.0573 | 69480 | 0.8725 | - |
| 3.0577 | 69490 | 0.8627 | - |
| 3.0582 | 69500 | 0.8742 | - |
| 3.0586 | 69510 | 0.9026 | - |
| 3.0591 | 69520 | 0.8816 | - |
| 3.0595 | 69530 | 0.8702 | - |
| 3.0599 | 69540 | 0.8787 | - |
| 3.0604 | 69550 | 0.8611 | - |
| 3.0608 | 69560 | 0.8785 | - |
| 3.0613 | 69570 | 0.8271 | - |
| 3.0617 | 69580 | 0.8608 | - |
| 3.0621 | 69590 | 0.8825 | - |
| 3.0626 | 69600 | 0.8905 | - |
| 3.0630 | 69610 | 0.8714 | - |
| 3.0635 | 69620 | 0.8495 | - |
| 3.0639 | 69630 | 0.8484 | - |
| 3.0643 | 69640 | 0.9333 | - |
| 3.0648 | 69650 | 0.8568 | - |
| 3.0652 | 69660 | 0.8751 | - |
| 3.0657 | 69670 | 0.8494 | - |
| 3.0661 | 69680 | 0.8752 | - |
| 3.0665 | 69690 | 0.9166 | - |
| 3.0670 | 69700 | 0.8814 | - |
| 3.0674 | 69710 | 0.8848 | - |
| 3.0679 | 69720 | 0.8855 | - |
| 3.0683 | 69730 | 0.9204 | - |
| 3.0687 | 69740 | 0.8633 | - |
| 3.0692 | 69750 | 0.8591 | - |
| 3.0696 | 69760 | 0.8535 | - |
| 3.0701 | 69770 | 0.8578 | - |
| 3.0705 | 69780 | 0.8895 | - |
| 3.0709 | 69790 | 0.8958 | - |
| 3.0714 | 69800 | 0.9212 | - |
| 3.0718 | 69810 | 0.8626 | - |
| 3.0723 | 69820 | 0.8955 | - |
| 3.0727 | 69830 | 0.8607 | - |
| 3.0731 | 69840 | 0.8748 | - |
| 3.0736 | 69850 | 0.85 | - |
| 3.0740 | 69860 | 0.8711 | - |
| 3.0745 | 69870 | 0.8392 | - |
| 3.0749 | 69880 | 0.8723 | - |
| 3.0753 | 69890 | 0.9051 | - |
| 3.0758 | 69900 | 0.8526 | - |
| 3.0762 | 69910 | 0.8826 | - |
| 3.0767 | 69920 | 0.9082 | - |
| 3.0771 | 69930 | 0.8755 | - |
| 3.0775 | 69940 | 0.8864 | - |
| 3.0780 | 69950 | 0.8856 | - |
| 3.0784 | 69960 | 0.8867 | - |
| 3.0789 | 69970 | 0.9266 | - |
| 3.0793 | 69980 | 0.85 | - |
| 3.0797 | 69990 | 0.87 | - |
| 3.0802 | 70000 | 0.8997 | - |
| 3.0806 | 70010 | 0.8685 | - |
| 3.0811 | 70020 | 0.8403 | - |
| 3.0815 | 70030 | 0.9155 | - |
| 3.0819 | 70040 | 0.8898 | - |
| 3.0824 | 70050 | 0.8915 | - |
| 3.0828 | 70060 | 0.8996 | - |
| 3.0833 | 70070 | 0.8439 | - |
| 3.0837 | 70080 | 0.8993 | - |
| 3.0841 | 70090 | 0.9006 | - |
| 3.0846 | 70100 | 0.8969 | - |
| 3.0850 | 70110 | 0.8873 | - |
| 3.0855 | 70120 | 0.9167 | - |
| 3.0859 | 70130 | 0.8905 | - |
| 3.0863 | 70140 | 0.8747 | - |
| 3.0868 | 70150 | 0.8638 | - |
| 3.0872 | 70160 | 0.8768 | - |
| 3.0877 | 70170 | 0.8899 | - |
| 3.0881 | 70180 | 0.862 | - |
| 3.0885 | 70190 | 0.8917 | - |
| 3.0890 | 70200 | 0.8517 | - |
| 3.0894 | 70210 | 0.8779 | - |
| 3.0899 | 70220 | 0.8895 | - |
| 3.0903 | 70230 | 0.8567 | - |
| 3.0907 | 70240 | 0.9012 | - |
| 3.0912 | 70250 | 0.8854 | - |
| 3.0916 | 70260 | 0.9253 | - |
| 3.0921 | 70270 | 0.8856 | - |
| 3.0925 | 70280 | 0.8944 | - |
| 3.0929 | 70290 | 0.8486 | - |
| 3.0934 | 70300 | 0.8674 | - |
| 3.0938 | 70310 | 0.8876 | - |
| 3.0943 | 70320 | 0.8408 | - |
| 3.0947 | 70330 | 0.8944 | - |
| 3.0951 | 70340 | 0.8931 | - |
| 3.0956 | 70350 | 0.8585 | - |
| 3.0960 | 70360 | 0.8356 | - |
| 3.0965 | 70370 | 0.8835 | - |
| 3.0969 | 70380 | 0.8768 | - |
| 3.0973 | 70390 | 0.8439 | - |
| 3.0978 | 70400 | 0.8579 | - |
| 3.0982 | 70410 | 0.8342 | - |
| 3.0987 | 70420 | 0.8822 | - |
| 3.0991 | 70430 | 0.873 | - |
| 3.0995 | 70440 | 0.8757 | - |
| 3.1000 | 70450 | 0.8242 | - |
| 3.1004 | 70460 | 0.8762 | - |
| 3.1009 | 70470 | 0.9052 | - |
| 3.1013 | 70480 | 0.8328 | - |
| 3.1017 | 70490 | 0.9259 | - |
| 3.1019 | 70494 | - | 1.4059 |
| 3.1022 | 70500 | 0.8657 | - |
| 3.1026 | 70510 | 0.8788 | - |
| 3.1031 | 70520 | 0.8769 | - |
| 3.1035 | 70530 | 0.8709 | - |
| 3.1039 | 70540 | 0.9124 | - |
| 3.1044 | 70550 | 0.8832 | - |
| 3.1048 | 70560 | 0.8313 | - |
| 3.1053 | 70570 | 0.9088 | - |
| 3.1057 | 70580 | 0.8783 | - |
| 3.1061 | 70590 | 0.9065 | - |
| 3.1066 | 70600 | 0.8562 | - |
| 3.1070 | 70610 | 0.8638 | - |
| 3.1075 | 70620 | 0.9117 | - |
| 3.1079 | 70630 | 0.9038 | - |
| 3.1083 | 70640 | 0.8414 | - |
| 3.1088 | 70650 | 0.8729 | - |
| 3.1092 | 70660 | 0.9392 | - |
| 3.1097 | 70670 | 0.8653 | - |
| 3.1101 | 70680 | 0.9107 | - |
| 3.1105 | 70690 | 0.8917 | - |
| 3.1110 | 70700 | 0.8565 | - |
| 3.1114 | 70710 | 0.9033 | - |
| 3.1119 | 70720 | 0.8841 | - |
| 3.1123 | 70730 | 0.86 | - |
| 3.1127 | 70740 | 0.844 | - |
| 3.1132 | 70750 | 0.8666 | - |
| 3.1136 | 70760 | 0.8496 | - |
| 3.1141 | 70770 | 0.8932 | - |
| 3.1145 | 70780 | 0.8989 | - |
| 3.1149 | 70790 | 0.8951 | - |
| 3.1154 | 70800 | 0.8755 | - |
| 3.1158 | 70810 | 0.8966 | - |
| 3.1163 | 70820 | 0.8831 | - |
| 3.1167 | 70830 | 0.914 | - |
| 3.1171 | 70840 | 0.8747 | - |
| 3.1176 | 70850 | 0.8487 | - |
| 3.1180 | 70860 | 0.895 | - |
| 3.1185 | 70870 | 0.8452 | - |
| 3.1189 | 70880 | 0.8676 | - |
| 3.1193 | 70890 | 0.868 | - |
| 3.1198 | 70900 | 0.8824 | - |
| 3.1202 | 70910 | 0.8484 | - |
| 3.1207 | 70920 | 0.8967 | - |
| 3.1211 | 70930 | 0.874 | - |
| 3.1215 | 70940 | 0.8526 | - |
| 3.1220 | 70950 | 0.8501 | - |
| 3.1224 | 70960 | 0.8697 | - |
| 3.1229 | 70970 | 0.8843 | - |
| 3.1233 | 70980 | 0.8799 | - |
| 3.1237 | 70990 | 0.9044 | - |
| 3.1242 | 71000 | 0.8499 | - |
| 3.1246 | 71010 | 0.8395 | - |
| 3.1251 | 71020 | 0.8691 | - |
| 3.1255 | 71030 | 0.8651 | - |
| 3.1259 | 71040 | 0.8746 | - |
| 3.1264 | 71050 | 0.8702 | - |
| 3.1268 | 71060 | 0.8482 | - |
| 3.1273 | 71070 | 0.8803 | - |
| 3.1277 | 71080 | 0.8644 | - |
| 3.1281 | 71090 | 0.887 | - |
| 3.1286 | 71100 | 0.8875 | - |
| 3.1290 | 71110 | 0.8706 | - |
| 3.1295 | 71120 | 0.9344 | - |
| 3.1299 | 71130 | 0.8792 | - |
| 3.1303 | 71140 | 0.8702 | - |
| 3.1308 | 71150 | 0.8527 | - |
| 3.1312 | 71160 | 0.8684 | - |
| 3.1317 | 71170 | 0.8475 | - |
| 3.1321 | 71180 | 0.8717 | - |
| 3.1325 | 71190 | 0.8908 | - |
| 3.1330 | 71200 | 0.9005 | - |
| 3.1334 | 71210 | 0.8661 | - |
| 3.1339 | 71220 | 0.828 | - |
| 3.1343 | 71230 | 0.8894 | - |
| 3.1347 | 71240 | 0.8429 | - |
| 3.1352 | 71250 | 0.8614 | - |
| 3.1356 | 71260 | 0.8565 | - |
| 3.1361 | 71270 | 0.8853 | - |
| 3.1365 | 71280 | 0.8975 | - |
| 3.1369 | 71290 | 0.8371 | - |
| 3.1374 | 71300 | 0.86 | - |
| 3.1378 | 71310 | 0.8612 | - |
| 3.1383 | 71320 | 0.8315 | - |
| 3.1387 | 71330 | 0.8748 | - |
| 3.1391 | 71340 | 0.8505 | - |
| 3.1396 | 71350 | 0.852 | - |
| 3.1400 | 71360 | 0.8791 | - |
| 3.1405 | 71370 | 0.8855 | - |
| 3.1409 | 71380 | 0.8525 | - |
| 3.1413 | 71390 | 0.891 | - |
| 3.1418 | 71400 | 0.8859 | - |
| 3.1422 | 71410 | 0.8675 | - |
| 3.1427 | 71420 | 0.8576 | - |
| 3.1431 | 71430 | 0.8597 | - |
| 3.1435 | 71440 | 0.8793 | - |
| 3.1440 | 71450 | 0.8746 | - |
| 3.1444 | 71460 | 0.8381 | - |
| 3.1449 | 71470 | 0.8749 | - |
| 3.1453 | 71480 | 0.8599 | - |
| 3.1457 | 71490 | 0.8813 | - |
| 3.1462 | 71500 | 0.8672 | - |
| 3.1466 | 71510 | 0.8848 | - |
| 3.1471 | 71520 | 0.8636 | - |
| 3.1475 | 71530 | 0.8846 | - |
| 3.1479 | 71540 | 0.8926 | - |
| 3.1484 | 71550 | 0.8662 | - |
| 3.1488 | 71560 | 0.8405 | - |
| 3.1493 | 71570 | 0.9048 | - |
| 3.1497 | 71580 | 0.8546 | - |
| 3.1501 | 71590 | 0.8603 | - |
| 3.1506 | 71600 | 0.8645 | - |
| 3.1510 | 71610 | 0.893 | - |
| 3.1515 | 71620 | 0.8996 | - |
| 3.1519 | 71630 | 0.8778 | - |
| 3.1519 | 71631 | - | 1.4057 |
| 3.1523 | 71640 | 0.8815 | - |
| 3.1528 | 71650 | 0.8766 | - |
| 3.1532 | 71660 | 0.8817 | - |
| 3.1537 | 71670 | 0.846 | - |
| 3.1541 | 71680 | 0.8448 | - |
| 3.1545 | 71690 | 0.8584 | - |
| 3.1550 | 71700 | 0.8714 | - |
| 3.1554 | 71710 | 0.8972 | - |
| 3.1559 | 71720 | 0.9076 | - |
| 3.1563 | 71730 | 0.8858 | - |
| 3.1567 | 71740 | 0.8809 | - |
| 3.1572 | 71750 | 0.8324 | - |
| 3.1576 | 71760 | 0.8337 | - |
| 3.1581 | 71770 | 0.8719 | - |
| 3.1585 | 71780 | 0.8467 | - |
| 3.1589 | 71790 | 0.924 | - |
| 3.1594 | 71800 | 0.861 | - |
| 3.1598 | 71810 | 0.8428 | - |
| 3.1603 | 71820 | 0.89 | - |
| 3.1607 | 71830 | 0.8862 | - |
| 3.1611 | 71840 | 0.8591 | - |
| 3.1616 | 71850 | 0.8471 | - |
| 3.1620 | 71860 | 0.8829 | - |
| 3.1625 | 71870 | 0.848 | - |
| 3.1629 | 71880 | 0.8456 | - |
| 3.1633 | 71890 | 0.8539 | - |
| 3.1638 | 71900 | 0.8845 | - |
| 3.1642 | 71910 | 0.857 | - |
| 3.1647 | 71920 | 0.8991 | - |
| 3.1651 | 71930 | 0.8731 | - |
| 3.1655 | 71940 | 0.8521 | - |
| 3.1660 | 71950 | 0.9003 | - |
| 3.1664 | 71960 | 0.8453 | - |
| 3.1669 | 71970 | 0.8589 | - |
| 3.1673 | 71980 | 0.8597 | - |
| 3.1677 | 71990 | 0.9139 | - |
| 3.1682 | 72000 | 0.8422 | - |
| 3.1686 | 72010 | 0.8327 | - |
| 3.1691 | 72020 | 0.902 | - |
| 3.1695 | 72030 | 0.8488 | - |
| 3.1699 | 72040 | 0.8705 | - |
| 3.1704 | 72050 | 0.8809 | - |
| 3.1708 | 72060 | 0.8831 | - |
| 3.1713 | 72070 | 0.8868 | - |
| 3.1717 | 72080 | 0.9048 | - |
| 3.1721 | 72090 | 0.8537 | - |
| 3.1726 | 72100 | 0.868 | - |
| 3.1730 | 72110 | 0.8656 | - |
| 3.1735 | 72120 | 0.8675 | - |
| 3.1739 | 72130 | 0.8657 | - |
| 3.1743 | 72140 | 0.8895 | - |
| 3.1748 | 72150 | 0.8638 | - |
| 3.1752 | 72160 | 0.9095 | - |
| 3.1757 | 72170 | 0.847 | - |
| 3.1761 | 72180 | 0.8702 | - |
| 3.1765 | 72190 | 0.8475 | - |
| 3.1770 | 72200 | 0.8743 | - |
| 3.1774 | 72210 | 0.8403 | - |
| 3.1779 | 72220 | 0.8885 | - |
| 3.1783 | 72230 | 0.8953 | - |
| 3.1787 | 72240 | 0.8825 | - |
| 3.1792 | 72250 | 0.8505 | - |
| 3.1796 | 72260 | 0.8588 | - |
| 3.1801 | 72270 | 0.8795 | - |
| 3.1805 | 72280 | 0.8661 | - |
| 3.1809 | 72290 | 0.908 | - |
| 3.1814 | 72300 | 0.8164 | - |
| 3.1818 | 72310 | 0.8724 | - |
| 3.1823 | 72320 | 0.8971 | - |
| 3.1827 | 72330 | 0.8565 | - |
| 3.1831 | 72340 | 0.8989 | - |
| 3.1836 | 72350 | 0.8657 | - |
| 3.1840 | 72360 | 0.8959 | - |
| 3.1845 | 72370 | 0.8687 | - |
| 3.1849 | 72380 | 0.8742 | - |
| 3.1853 | 72390 | 0.886 | - |
| 3.1858 | 72400 | 0.8864 | - |
| 3.1862 | 72410 | 0.8834 | - |
| 3.1867 | 72420 | 0.916 | - |
| 3.1871 | 72430 | 0.8533 | - |
| 3.1875 | 72440 | 0.8754 | - |
| 3.1880 | 72450 | 0.8526 | - |
| 3.1884 | 72460 | 0.8871 | - |
| 3.1889 | 72470 | 0.8749 | - |
| 3.1893 | 72480 | 0.8558 | - |
| 3.1897 | 72490 | 0.8836 | - |
| 3.1902 | 72500 | 0.8912 | - |
| 3.1906 | 72510 | 0.9199 | - |
| 3.1911 | 72520 | 0.8659 | - |
| 3.1915 | 72530 | 0.8359 | - |
| 3.1919 | 72540 | 0.8645 | - |
| 3.1924 | 72550 | 0.8584 | - |
| 3.1928 | 72560 | 0.8556 | - |
| 3.1933 | 72570 | 0.8451 | - |
| 3.1937 | 72580 | 0.8495 | - |
| 3.1941 | 72590 | 0.869 | - |
| 3.1946 | 72600 | 0.9066 | - |
| 3.1950 | 72610 | 0.8721 | - |
| 3.1955 | 72620 | 0.8245 | - |
| 3.1959 | 72630 | 0.8488 | - |
| 3.1963 | 72640 | 0.8663 | - |
| 3.1968 | 72650 | 0.8676 | - |
| 3.1972 | 72660 | 0.9114 | - |
| 3.1977 | 72670 | 0.854 | - |
| 3.1981 | 72680 | 0.8724 | - |
| 3.1985 | 72690 | 0.867 | - |
| 3.1990 | 72700 | 0.8576 | - |
| 3.1994 | 72710 | 0.8678 | - |
| 3.1999 | 72720 | 0.8528 | - |
| 3.2003 | 72730 | 0.8587 | - |
| 3.2007 | 72740 | 0.8738 | - |
| 3.2012 | 72750 | 0.8712 | - |
| 3.2016 | 72760 | 0.8604 | - |
| 3.2020 | 72768 | - | 1.3963 |
| 3.2021 | 72770 | 0.8545 | - |
| 3.2025 | 72780 | 0.8605 | - |
| 3.2029 | 72790 | 0.857 | - |
| 3.2034 | 72800 | 0.8822 | - |
| 3.2038 | 72810 | 0.866 | - |
| 3.2043 | 72820 | 0.8597 | - |
| 3.2047 | 72830 | 0.8428 | - |
| 3.2051 | 72840 | 0.847 | - |
| 3.2056 | 72850 | 0.8678 | - |
| 3.2060 | 72860 | 0.8578 | - |
| 3.2065 | 72870 | 0.8293 | - |
| 3.2069 | 72880 | 0.9004 | - |
| 3.2073 | 72890 | 0.9046 | - |
| 3.2078 | 72900 | 0.8331 | - |
| 3.2082 | 72910 | 0.8626 | - |
| 3.2087 | 72920 | 0.8655 | - |
| 3.2091 | 72930 | 0.8347 | - |
| 3.2095 | 72940 | 0.893 | - |
| 3.2100 | 72950 | 0.8574 | - |
| 3.2104 | 72960 | 0.8239 | - |
| 3.2109 | 72970 | 0.8332 | - |
| 3.2113 | 72980 | 0.8784 | - |
| 3.2117 | 72990 | 0.8581 | - |
| 3.2122 | 73000 | 0.865 | - |
| 3.2126 | 73010 | 0.902 | - |
| 3.2131 | 73020 | 0.8743 | - |
| 3.2135 | 73030 | 0.8472 | - |
| 3.2139 | 73040 | 0.8842 | - |
| 3.2144 | 73050 | 0.8441 | - |
| 3.2148 | 73060 | 0.8567 | - |
| 3.2153 | 73070 | 0.8317 | - |
| 3.2157 | 73080 | 0.8252 | - |
| 3.2161 | 73090 | 0.847 | - |
| 3.2166 | 73100 | 0.8672 | - |
| 3.2170 | 73110 | 0.8742 | - |
| 3.2175 | 73120 | 0.8704 | - |
| 3.2179 | 73130 | 0.8661 | - |
| 3.2183 | 73140 | 0.8684 | - |
| 3.2188 | 73150 | 0.9076 | - |
| 3.2192 | 73160 | 0.8757 | - |
| 3.2197 | 73170 | 0.8571 | - |
| 3.2201 | 73180 | 0.8579 | - |
| 3.2205 | 73190 | 0.836 | - |
| 3.2210 | 73200 | 0.8443 | - |
| 3.2214 | 73210 | 0.8405 | - |
| 3.2219 | 73220 | 0.871 | - |
| 3.2223 | 73230 | 0.858 | - |
| 3.2227 | 73240 | 0.8739 | - |
| 3.2232 | 73250 | 0.8497 | - |
| 3.2236 | 73260 | 0.8439 | - |
| 3.2241 | 73270 | 0.8801 | - |
| 3.2245 | 73280 | 0.8884 | - |
| 3.2249 | 73290 | 0.881 | - |
| 3.2254 | 73300 | 0.8293 | - |
| 3.2258 | 73310 | 0.8795 | - |
| 3.2263 | 73320 | 0.8629 | - |
| 3.2267 | 73330 | 0.8524 | - |
| 3.2271 | 73340 | 0.8624 | - |
| 3.2276 | 73350 | 0.866 | - |
| 3.2280 | 73360 | 0.8479 | - |
| 3.2285 | 73370 | 0.8857 | - |
| 3.2289 | 73380 | 0.8492 | - |
| 3.2293 | 73390 | 0.8516 | - |
| 3.2298 | 73400 | 0.8663 | - |
| 3.2302 | 73410 | 0.8785 | - |
| 3.2307 | 73420 | 0.8518 | - |
| 3.2311 | 73430 | 0.8908 | - |
| 3.2315 | 73440 | 0.8543 | - |
| 3.2320 | 73450 | 0.8612 | - |
| 3.2324 | 73460 | 0.8751 | - |
| 3.2329 | 73470 | 0.9037 | - |
| 3.2333 | 73480 | 0.8683 | - |
| 3.2337 | 73490 | 0.8634 | - |
| 3.2342 | 73500 | 0.8556 | - |
| 3.2346 | 73510 | 0.8528 | - |
| 3.2351 | 73520 | 0.8645 | - |
| 3.2355 | 73530 | 0.8063 | - |
| 3.2359 | 73540 | 0.8321 | - |
| 3.2364 | 73550 | 0.8169 | - |
| 3.2368 | 73560 | 0.8884 | - |
| 3.2373 | 73570 | 0.8433 | - |
| 3.2377 | 73580 | 0.8671 | - |
| 3.2381 | 73590 | 0.8578 | - |
| 3.2386 | 73600 | 0.8301 | - |
| 3.2390 | 73610 | 0.8482 | - |
| 3.2395 | 73620 | 0.8438 | - |
| 3.2399 | 73630 | 0.8727 | - |
| 3.2403 | 73640 | 0.8256 | - |
| 3.2408 | 73650 | 0.8507 | - |
| 3.2412 | 73660 | 0.8507 | - |
| 3.2417 | 73670 | 0.8431 | - |
| 3.2421 | 73680 | 0.8417 | - |
| 3.2425 | 73690 | 0.8497 | - |
| 3.2430 | 73700 | 0.8864 | - |
| 3.2434 | 73710 | 0.8681 | - |
| 3.2439 | 73720 | 0.877 | - |
| 3.2443 | 73730 | 0.861 | - |
| 3.2447 | 73740 | 0.8285 | - |
| 3.2452 | 73750 | 0.8656 | - |
| 3.2456 | 73760 | 0.8962 | - |
| 3.2461 | 73770 | 0.8371 | - |
| 3.2465 | 73780 | 0.8877 | - |
| 3.2469 | 73790 | 0.8387 | - |
| 3.2474 | 73800 | 0.8896 | - |
| 3.2478 | 73810 | 0.8809 | - |
| 3.2483 | 73820 | 0.8335 | - |
| 3.2487 | 73830 | 0.9112 | - |
| 3.2491 | 73840 | 0.8402 | - |
| 3.2496 | 73850 | 0.8418 | - |
| 3.2500 | 73860 | 0.8782 | - |
| 3.2505 | 73870 | 0.8136 | - |
| 3.2509 | 73880 | 0.897 | - |
| 3.2513 | 73890 | 0.8313 | - |
| 3.2518 | 73900 | 0.8154 | - |
| 3.2520 | 73905 | - | 1.3836 |
| 3.2522 | 73910 | 0.8629 | - |
| 3.2527 | 73920 | 0.8269 | - |
| 3.2531 | 73930 | 0.8649 | - |
| 3.2535 | 73940 | 0.8493 | - |
| 3.2540 | 73950 | 0.8364 | - |
| 3.2544 | 73960 | 0.8402 | - |
| 3.2549 | 73970 | 0.8661 | - |
| 3.2553 | 73980 | 0.9145 | - |
| 3.2557 | 73990 | 0.839 | - |
| 3.2562 | 74000 | 0.879 | - |
| 3.2566 | 74010 | 0.8438 | - |
| 3.2571 | 74020 | 0.8585 | - |
| 3.2575 | 74030 | 0.8421 | - |
| 3.2579 | 74040 | 0.8625 | - |
| 3.2584 | 74050 | 0.8678 | - |
| 3.2588 | 74060 | 0.8418 | - |
| 3.2593 | 74070 | 0.8499 | - |
| 3.2597 | 74080 | 0.8604 | - |
| 3.2601 | 74090 | 0.8375 | - |
| 3.2606 | 74100 | 0.8354 | - |
| 3.2610 | 74110 | 0.8586 | - |
| 3.2615 | 74120 | 0.8375 | - |
| 3.2619 | 74130 | 0.8473 | - |
| 3.2623 | 74140 | 0.87 | - |
| 3.2628 | 74150 | 0.8336 | - |
| 3.2632 | 74160 | 0.8636 | - |
| 3.2637 | 74170 | 0.8224 | - |
| 3.2641 | 74180 | 0.8334 | - |
| 3.2645 | 74190 | 0.8581 | - |
| 3.2650 | 74200 | 0.8605 | - |
| 3.2654 | 74210 | 0.8221 | - |
| 3.2659 | 74220 | 0.8597 | - |
| 3.2663 | 74230 | 0.8458 | - |
| 3.2667 | 74240 | 0.8671 | - |
| 3.2672 | 74250 | 0.8514 | - |
| 3.2676 | 74260 | 0.8402 | - |
| 3.2681 | 74270 | 0.8411 | - |
| 3.2685 | 74280 | 0.8481 | - |
| 3.2689 | 74290 | 0.8518 | - |
| 3.2694 | 74300 | 0.836 | - |
| 3.2698 | 74310 | 0.8647 | - |
| 3.2703 | 74320 | 0.8448 | - |
| 3.2707 | 74330 | 0.8928 | - |
| 3.2711 | 74340 | 0.8136 | - |
| 3.2716 | 74350 | 0.8653 | - |
| 3.2720 | 74360 | 0.8276 | - |
| 3.2725 | 74370 | 0.8354 | - |
| 3.2729 | 74380 | 0.8472 | - |
| 3.2733 | 74390 | 0.85 | - |
| 3.2738 | 74400 | 0.8805 | - |
| 3.2742 | 74410 | 0.8627 | - |
| 3.2747 | 74420 | 0.8339 | - |
| 3.2751 | 74430 | 0.8674 | - |
| 3.2755 | 74440 | 0.8514 | - |
| 3.2760 | 74450 | 0.8275 | - |
| 3.2764 | 74460 | 0.859 | - |
| 3.2769 | 74470 | 0.8853 | - |
| 3.2773 | 74480 | 0.8523 | - |
| 3.2777 | 74490 | 0.8675 | - |
| 3.2782 | 74500 | 0.8579 | - |
| 3.2786 | 74510 | 0.8221 | - |
| 3.2791 | 74520 | 0.8784 | - |
| 3.2795 | 74530 | 0.8384 | - |
| 3.2799 | 74540 | 0.8626 | - |
| 3.2804 | 74550 | 0.8636 | - |
| 3.2808 | 74560 | 0.8695 | - |
| 3.2813 | 74570 | 0.9031 | - |
| 3.2817 | 74580 | 0.831 | - |
| 3.2821 | 74590 | 0.9057 | - |
| 3.2826 | 74600 | 0.8718 | - |
| 3.2830 | 74610 | 0.836 | - |
| 3.2835 | 74620 | 0.8379 | - |
| 3.2839 | 74630 | 0.8606 | - |
| 3.2843 | 74640 | 0.8162 | - |
| 3.2848 | 74650 | 0.8468 | - |
| 3.2852 | 74660 | 0.8839 | - |
| 3.2857 | 74670 | 0.8748 | - |
| 3.2861 | 74680 | 0.8488 | - |
| 3.2865 | 74690 | 0.8249 | - |
| 3.2870 | 74700 | 0.8131 | - |
| 3.2874 | 74710 | 0.7959 | - |
| 3.2879 | 74720 | 0.8458 | - |
| 3.2883 | 74730 | 0.8724 | - |
| 3.2887 | 74740 | 0.8504 | - |
| 3.2892 | 74750 | 0.8781 | - |
| 3.2896 | 74760 | 0.8374 | - |
| 3.2901 | 74770 | 0.8431 | - |
| 3.2905 | 74780 | 0.8399 | - |
| 3.2909 | 74790 | 0.8381 | - |
| 3.2914 | 74800 | 0.8171 | - |
| 3.2918 | 74810 | 0.8412 | - |
| 3.2923 | 74820 | 0.8426 | - |
| 3.2927 | 74830 | 0.8906 | - |
| 3.2931 | 74840 | 0.8745 | - |
| 3.2936 | 74850 | 0.9026 | - |
| 3.2940 | 74860 | 0.8342 | - |
| 3.2945 | 74870 | 0.8334 | - |
| 3.2949 | 74880 | 0.8944 | - |
| 3.2953 | 74890 | 0.8119 | - |
| 3.2958 | 74900 | 0.8475 | - |
| 3.2962 | 74910 | 0.8367 | - |
| 3.2967 | 74920 | 0.8807 | - |
| 3.2971 | 74930 | 0.868 | - |
| 3.2975 | 74940 | 0.8473 | - |
| 3.2980 | 74950 | 0.8455 | - |
| 3.2984 | 74960 | 0.8702 | - |
| 3.2989 | 74970 | 0.8049 | - |
| 3.2993 | 74980 | 0.8807 | - |
| 3.2997 | 74990 | 0.8131 | - |
| 3.3002 | 75000 | 0.8429 | - |
| 3.3006 | 75010 | 0.838 | - |
| 3.3011 | 75020 | 0.8209 | - |
| 3.3015 | 75030 | 0.9014 | - |
| 3.3019 | 75040 | 0.8474 | - |
| 3.3020 | 75042 | - | 1.3761 |
| 3.3024 | 75050 | 0.8191 | - |
| 3.3028 | 75060 | 0.8195 | - |
| 3.3033 | 75070 | 0.8664 | - |
| 3.3037 | 75080 | 0.8365 | - |
| 3.3041 | 75090 | 0.8565 | - |
| 3.3046 | 75100 | 0.8511 | - |
| 3.3050 | 75110 | 0.8423 | - |
| 3.3055 | 75120 | 0.7992 | - |
| 3.3059 | 75130 | 0.8418 | - |
| 3.3063 | 75140 | 0.8258 | - |
| 3.3068 | 75150 | 0.8279 | - |
| 3.3072 | 75160 | 0.844 | - |
| 3.3077 | 75170 | 0.8576 | - |
| 3.3081 | 75180 | 0.8668 | - |
| 3.3085 | 75190 | 0.8231 | - |
| 3.3090 | 75200 | 0.8473 | - |
| 3.3094 | 75210 | 0.8456 | - |
| 3.3099 | 75220 | 0.8359 | - |
| 3.3103 | 75230 | 0.7933 | - |
| 3.3107 | 75240 | 0.86 | - |
| 3.3112 | 75250 | 0.8478 | - |
| 3.3116 | 75260 | 0.8743 | - |
| 3.3121 | 75270 | 0.8437 | - |
| 3.3125 | 75280 | 0.847 | - |
| 3.3129 | 75290 | 0.8265 | - |
| 3.3134 | 75300 | 0.9031 | - |
| 3.3138 | 75310 | 0.8854 | - |
| 3.3143 | 75320 | 0.8454 | - |
| 3.3147 | 75330 | 0.8117 | - |
| 3.3151 | 75340 | 0.8102 | - |
| 3.3156 | 75350 | 0.8567 | - |
| 3.3160 | 75360 | 0.8573 | - |
| 3.3165 | 75370 | 0.8344 | - |
| 3.3169 | 75380 | 0.8605 | - |
| 3.3173 | 75390 | 0.8382 | - |
| 3.3178 | 75400 | 0.8643 | - |
| 3.3182 | 75410 | 0.8404 | - |
| 3.3187 | 75420 | 0.8154 | - |
| 3.3191 | 75430 | 0.8648 | - |
| 3.3195 | 75440 | 0.8338 | - |
| 3.3200 | 75450 | 0.8393 | - |
| 3.3204 | 75460 | 0.8454 | - |
| 3.3209 | 75470 | 0.8448 | - |
| 3.3213 | 75480 | 0.8327 | - |
| 3.3217 | 75490 | 0.8643 | - |
| 3.3222 | 75500 | 0.8647 | - |
| 3.3226 | 75510 | 0.8542 | - |
| 3.3231 | 75520 | 0.8718 | - |
| 3.3235 | 75530 | 0.8201 | - |
| 3.3239 | 75540 | 0.8124 | - |
| 3.3244 | 75550 | 0.8206 | - |
| 3.3248 | 75560 | 0.8374 | - |
| 3.3253 | 75570 | 0.8683 | - |
| 3.3257 | 75580 | 0.8652 | - |
| 3.3261 | 75590 | 0.8505 | - |
| 3.3266 | 75600 | 0.8621 | - |
| 3.3270 | 75610 | 0.7975 | - |
| 3.3275 | 75620 | 0.8081 | - |
| 3.3279 | 75630 | 0.7946 | - |
| 3.3283 | 75640 | 0.834 | - |
| 3.3288 | 75650 | 0.8265 | - |
| 3.3292 | 75660 | 0.7918 | - |
| 3.3297 | 75670 | 0.8996 | - |
| 3.3301 | 75680 | 0.8479 | - |
| 3.3305 | 75690 | 0.8253 | - |
| 3.3310 | 75700 | 0.8366 | - |
| 3.3314 | 75710 | 0.8681 | - |
| 3.3319 | 75720 | 0.8366 | - |
| 3.3323 | 75730 | 0.8189 | - |
| 3.3327 | 75740 | 0.8381 | - |
| 3.3332 | 75750 | 0.8568 | - |
| 3.3336 | 75760 | 0.8441 | - |
| 3.3341 | 75770 | 0.8358 | - |
| 3.3345 | 75780 | 0.8794 | - |
| 3.3349 | 75790 | 0.8448 | - |
| 3.3354 | 75800 | 0.848 | - |
| 3.3358 | 75810 | 0.8646 | - |
| 3.3363 | 75820 | 0.8466 | - |
| 3.3367 | 75830 | 0.8527 | - |
| 3.3371 | 75840 | 0.8633 | - |
| 3.3376 | 75850 | 0.814 | - |
| 3.3380 | 75860 | 0.8437 | - |
| 3.3385 | 75870 | 0.8684 | - |
| 3.3389 | 75880 | 0.8577 | - |
| 3.3393 | 75890 | 0.8782 | - |
| 3.3398 | 75900 | 0.8162 | - |
| 3.3402 | 75910 | 0.8403 | - |
| 3.3407 | 75920 | 0.84 | - |
| 3.3411 | 75930 | 0.8721 | - |
| 3.3415 | 75940 | 0.8849 | - |
| 3.3420 | 75950 | 0.838 | - |
| 3.3424 | 75960 | 0.8006 | - |
| 3.3429 | 75970 | 0.8495 | - |
| 3.3433 | 75980 | 0.8314 | - |
| 3.3437 | 75990 | 0.7986 | - |
| 3.3442 | 76000 | 0.8378 | - |
| 3.3446 | 76010 | 0.8918 | - |
| 3.3451 | 76020 | 0.8418 | - |
| 3.3455 | 76030 | 0.8384 | - |
| 3.3459 | 76040 | 0.8212 | - |
| 3.3464 | 76050 | 0.8071 | - |
| 3.3468 | 76060 | 0.8649 | - |
| 3.3473 | 76070 | 0.8485 | - |
| 3.3477 | 76080 | 0.7798 | - |
| 3.3481 | 76090 | 0.8471 | - |
| 3.3486 | 76100 | 0.845 | - |
| 3.3490 | 76110 | 0.8207 | - |
| 3.3495 | 76120 | 0.8504 | - |
| 3.3499 | 76130 | 0.8749 | - |
| 3.3503 | 76140 | 0.8353 | - |
| 3.3508 | 76150 | 0.8215 | - |
| 3.3512 | 76160 | 0.827 | - |
| 3.3517 | 76170 | 0.8148 | - |
| 3.3521 | 76179 | - | 1.4060 |
| 3.3521 | 76180 | 0.8295 | - |
| 3.3525 | 76190 | 0.8549 | - |
| 3.3530 | 76200 | 0.8477 | - |
| 3.3534 | 76210 | 0.8476 | - |
| 3.3539 | 76220 | 0.8437 | - |
| 3.3543 | 76230 | 0.7932 | - |
| 3.3547 | 76240 | 0.82 | - |
| 3.3552 | 76250 | 0.8836 | - |
| 3.3556 | 76260 | 0.8503 | - |
| 3.3561 | 76270 | 0.8375 | - |
| 3.3565 | 76280 | 0.8429 | - |
| 3.3569 | 76290 | 0.9008 | - |
| 3.3574 | 76300 | 0.8156 | - |
| 3.3578 | 76310 | 0.8087 | - |
| 3.3583 | 76320 | 0.865 | - |
| 3.3587 | 76330 | 0.8235 | - |
| 3.3591 | 76340 | 0.8699 | - |
| 3.3596 | 76350 | 0.8371 | - |
| 3.3600 | 76360 | 0.835 | - |
| 3.3605 | 76370 | 0.8502 | - |
| 3.3609 | 76380 | 0.8235 | - |
| 3.3613 | 76390 | 0.8162 | - |
| 3.3618 | 76400 | 0.8519 | - |
| 3.3622 | 76410 | 0.8344 | - |
| 3.3627 | 76420 | 0.8531 | - |
| 3.3631 | 76430 | 0.8382 | - |
| 3.3635 | 76440 | 0.8783 | - |
| 3.3640 | 76450 | 0.8468 | - |
| 3.3644 | 76460 | 0.8548 | - |
| 3.3649 | 76470 | 0.8572 | - |
| 3.3653 | 76480 | 0.8192 | - |
| 3.3657 | 76490 | 0.8511 | - |
| 3.3662 | 76500 | 0.8663 | - |
| 3.3666 | 76510 | 0.8499 | - |
| 3.3671 | 76520 | 0.8295 | - |
| 3.3675 | 76530 | 0.8172 | - |
| 3.3679 | 76540 | 0.8455 | - |
| 3.3684 | 76550 | 0.8144 | - |
| 3.3688 | 76560 | 0.8199 | - |
| 3.3693 | 76570 | 0.8003 | - |
| 3.3697 | 76580 | 0.8189 | - |
| 3.3701 | 76590 | 0.8312 | - |
| 3.3706 | 76600 | 0.8327 | - |
| 3.3710 | 76610 | 0.8573 | - |
| 3.3715 | 76620 | 0.8045 | - |
| 3.3719 | 76630 | 0.8407 | - |
| 3.3723 | 76640 | 0.8598 | - |
| 3.3728 | 76650 | 0.8263 | - |
| 3.3732 | 76660 | 0.8238 | - |
| 3.3737 | 76670 | 0.8541 | - |
| 3.3741 | 76680 | 0.8199 | - |
| 3.3745 | 76690 | 0.8196 | - |
| 3.3750 | 76700 | 0.8615 | - |
| 3.3754 | 76710 | 0.8711 | - |
| 3.3759 | 76720 | 0.845 | - |
| 3.3763 | 76730 | 0.8433 | - |
| 3.3767 | 76740 | 0.8365 | - |
| 3.3772 | 76750 | 0.8201 | - |
| 3.3776 | 76760 | 0.8149 | - |
| 3.3781 | 76770 | 0.7892 | - |
| 3.3785 | 76780 | 0.843 | - |
| 3.3789 | 76790 | 0.8479 | - |
| 3.3794 | 76800 | 0.7801 | - |
| 3.3798 | 76810 | 0.9015 | - |
| 3.3803 | 76820 | 0.8726 | - |
| 3.3807 | 76830 | 0.8416 | - |
| 3.3811 | 76840 | 0.8112 | - |
| 3.3816 | 76850 | 0.8312 | - |
| 3.3820 | 76860 | 0.8365 | - |
| 3.3825 | 76870 | 0.8198 | - |
| 3.3829 | 76880 | 0.8122 | - |
| 3.3833 | 76890 | 0.8556 | - |
| 3.3838 | 76900 | 0.8504 | - |
| 3.3842 | 76910 | 0.8575 | - |
| 3.3847 | 76920 | 0.8541 | - |
| 3.3851 | 76930 | 0.8401 | - |
| 3.3855 | 76940 | 0.82 | - |
| 3.3860 | 76950 | 0.8442 | - |
| 3.3864 | 76960 | 0.8386 | - |
| 3.3869 | 76970 | 0.813 | - |
| 3.3873 | 76980 | 0.8228 | - |
| 3.3877 | 76990 | 0.859 | - |
| 3.3882 | 77000 | 0.8711 | - |
| 3.3886 | 77010 | 0.8158 | - |
| 3.3891 | 77020 | 0.8173 | - |
| 3.3895 | 77030 | 0.8184 | - |
| 3.3899 | 77040 | 0.8489 | - |
| 3.3904 | 77050 | 0.843 | - |
| 3.3908 | 77060 | 0.8773 | - |
| 3.3913 | 77070 | 0.8383 | - |
| 3.3917 | 77080 | 0.8556 | - |
| 3.3921 | 77090 | 0.8474 | - |
| 3.3926 | 77100 | 0.8488 | - |
| 3.3930 | 77110 | 0.8709 | - |
| 3.3935 | 77120 | 0.8129 | - |
| 3.3939 | 77130 | 0.8329 | - |
| 3.3944 | 77140 | 0.7658 | - |
| 3.3948 | 77150 | 0.8347 | - |
| 3.3952 | 77160 | 0.8532 | - |
| 3.3957 | 77170 | 0.8549 | - |
| 3.3961 | 77180 | 0.8453 | - |
| 3.3966 | 77190 | 0.8628 | - |
| 3.3970 | 77200 | 0.8551 | - |
| 3.3974 | 77210 | 0.8764 | - |
| 3.3979 | 77220 | 0.8015 | - |
| 3.3983 | 77230 | 0.8489 | - |
| 3.3988 | 77240 | 0.8432 | - |
| 3.3992 | 77250 | 0.8419 | - |
| 3.3996 | 77260 | 0.8747 | - |
| 3.4001 | 77270 | 0.846 | - |
| 3.4005 | 77280 | 0.8221 | - |
| 3.4010 | 77290 | 0.8567 | - |
| 3.4014 | 77300 | 0.782 | - |
| 3.4018 | 77310 | 0.8594 | - |
| 3.4021 | 77316 | - | 1.3658 |
| 3.4023 | 77320 | 0.8638 | - |
| 3.4027 | 77330 | 0.8357 | - |
| 3.4032 | 77340 | 0.845 | - |
| 3.4036 | 77350 | 0.8291 | - |
| 3.4040 | 77360 | 0.845 | - |
| 3.4045 | 77370 | 0.8157 | - |
| 3.4049 | 77380 | 0.8307 | - |
| 3.4054 | 77390 | 0.8114 | - |
| 3.4058 | 77400 | 0.7582 | - |
| 3.4062 | 77410 | 0.8454 | - |
| 3.4067 | 77420 | 0.784 | - |
| 3.4071 | 77430 | 0.81 | - |
| 3.4076 | 77440 | 0.8513 | - |
| 3.4080 | 77450 | 0.8322 | - |
| 3.4084 | 77460 | 0.8435 | - |
| 3.4089 | 77470 | 0.8521 | - |
| 3.4093 | 77480 | 0.8445 | - |
| 3.4098 | 77490 | 0.855 | - |
| 3.4102 | 77500 | 0.8098 | - |
| 3.4106 | 77510 | 0.8435 | - |
| 3.4111 | 77520 | 0.8617 | - |
| 3.4115 | 77530 | 0.8141 | - |
| 3.4120 | 77540 | 0.8157 | - |
| 3.4124 | 77550 | 0.8203 | - |
| 3.4128 | 77560 | 0.8136 | - |
| 3.4133 | 77570 | 0.8341 | - |
| 3.4137 | 77580 | 0.8134 | - |
| 3.4142 | 77590 | 0.7894 | - |
| 3.4146 | 77600 | 0.8572 | - |
| 3.4150 | 77610 | 0.8452 | - |
| 3.4155 | 77620 | 0.8139 | - |
| 3.4159 | 77630 | 0.8117 | - |
| 3.4164 | 77640 | 0.8559 | - |
| 3.4168 | 77650 | 0.8644 | - |
| 3.4172 | 77660 | 0.8005 | - |
| 3.4177 | 77670 | 0.8203 | - |
| 3.4181 | 77680 | 0.8652 | - |
| 3.4186 | 77690 | 0.8571 | - |
| 3.4190 | 77700 | 0.8419 | - |
| 3.4194 | 77710 | 0.8226 | - |
| 3.4199 | 77720 | 0.868 | - |
| 3.4203 | 77730 | 0.8317 | - |
| 3.4208 | 77740 | 0.8189 | - |
| 3.4212 | 77750 | 0.852 | - |
| 3.4216 | 77760 | 0.8936 | - |
| 3.4221 | 77770 | 0.8728 | - |
| 3.4225 | 77780 | 0.8537 | - |
| 3.4230 | 77790 | 0.8389 | - |
| 3.4234 | 77800 | 0.8793 | - |
| 3.4238 | 77810 | 0.7873 | - |
| 3.4243 | 77820 | 0.8069 | - |
| 3.4247 | 77830 | 0.8034 | - |
| 3.4252 | 77840 | 0.8467 | - |
| 3.4256 | 77850 | 0.8354 | - |
| 3.4260 | 77860 | 0.8315 | - |
| 3.4265 | 77870 | 0.8216 | - |
| 3.4269 | 77880 | 0.7883 | - |
| 3.4274 | 77890 | 0.8528 | - |
| 3.4278 | 77900 | 0.8502 | - |
| 3.4282 | 77910 | 0.8223 | - |
| 3.4287 | 77920 | 0.8316 | - |
| 3.4291 | 77930 | 0.8355 | - |
| 3.4296 | 77940 | 0.8313 | - |
| 3.4300 | 77950 | 0.8533 | - |
| 3.4304 | 77960 | 0.8477 | - |
| 3.4309 | 77970 | 0.8396 | - |
| 3.4313 | 77980 | 0.821 | - |
| 3.4318 | 77990 | 0.7824 | - |
| 3.4322 | 78000 | 0.8045 | - |
| 3.4326 | 78010 | 0.8749 | - |
| 3.4331 | 78020 | 0.8469 | - |
| 3.4335 | 78030 | 0.8635 | - |
| 3.4340 | 78040 | 0.8452 | - |
| 3.4344 | 78050 | 0.8418 | - |
| 3.4348 | 78060 | 0.8416 | - |
| 3.4353 | 78070 | 0.8349 | - |
| 3.4357 | 78080 | 0.805 | - |
| 3.4362 | 78090 | 0.8227 | - |
| 3.4366 | 78100 | 0.8208 | - |
| 3.4370 | 78110 | 0.8622 | - |
| 3.4375 | 78120 | 0.823 | - |
| 3.4379 | 78130 | 0.858 | - |
| 3.4384 | 78140 | 0.8125 | - |
| 3.4388 | 78150 | 0.8072 | - |
| 3.4392 | 78160 | 0.8381 | - |
| 3.4397 | 78170 | 0.8475 | - |
| 3.4401 | 78180 | 0.8315 | - |
| 3.4406 | 78190 | 0.8099 | - |
| 3.4410 | 78200 | 0.801 | - |
| 3.4414 | 78210 | 0.879 | - |
| 3.4419 | 78220 | 0.7844 | - |
| 3.4423 | 78230 | 0.8235 | - |
| 3.4428 | 78240 | 0.766 | - |
| 3.4432 | 78250 | 0.7875 | - |
| 3.4436 | 78260 | 0.8433 | - |
| 3.4441 | 78270 | 0.8319 | - |
| 3.4445 | 78280 | 0.8234 | - |
| 3.4450 | 78290 | 0.8105 | - |
| 3.4454 | 78300 | 0.8183 | - |
| 3.4458 | 78310 | 0.8178 | - |
| 3.4463 | 78320 | 0.8463 | - |
| 3.4467 | 78330 | 0.8128 | - |
| 3.4472 | 78340 | 0.8031 | - |
| 3.4476 | 78350 | 0.8183 | - |
| 3.4480 | 78360 | 0.8257 | - |
| 3.4485 | 78370 | 0.8048 | - |
| 3.4489 | 78380 | 0.8285 | - |
| 3.4494 | 78390 | 0.7991 | - |
| 3.4498 | 78400 | 0.8671 | - |
| 3.4502 | 78410 | 0.796 | - |
| 3.4507 | 78420 | 0.8117 | - |
| 3.4511 | 78430 | 0.828 | - |
| 3.4516 | 78440 | 0.8288 | - |
| 3.4520 | 78450 | 0.8243 | - |
| 3.4521 | 78453 | - | 1.3806 |
| 3.4524 | 78460 | 0.8348 | - |
| 3.4529 | 78470 | 0.847 | - |
| 3.4533 | 78480 | 0.8154 | - |
| 3.4538 | 78490 | 0.8109 | - |
| 3.4542 | 78500 | 0.8393 | - |
| 3.4546 | 78510 | 0.7969 | - |
| 3.4551 | 78520 | 0.8018 | - |
| 3.4555 | 78530 | 0.8262 | - |
| 3.4560 | 78540 | 0.8573 | - |
| 3.4564 | 78550 | 0.8568 | - |
| 3.4568 | 78560 | 0.7906 | - |
| 3.4573 | 78570 | 0.8115 | - |
| 3.4577 | 78580 | 0.8217 | - |
| 3.4582 | 78590 | 0.8695 | - |
| 3.4586 | 78600 | 0.7948 | - |
| 3.4590 | 78610 | 0.8532 | - |
| 3.4595 | 78620 | 0.8354 | - |
| 3.4599 | 78630 | 0.8514 | - |
| 3.4604 | 78640 | 0.8251 | - |
| 3.4608 | 78650 | 0.8273 | - |
| 3.4612 | 78660 | 0.8313 | - |
| 3.4617 | 78670 | 0.8183 | - |
| 3.4621 | 78680 | 0.7995 | - |
| 3.4626 | 78690 | 0.8085 | - |
| 3.4630 | 78700 | 0.8074 | - |
| 3.4634 | 78710 | 0.8108 | - |
| 3.4639 | 78720 | 0.8159 | - |
| 3.4643 | 78730 | 0.8451 | - |
| 3.4648 | 78740 | 0.8166 | - |
| 3.4652 | 78750 | 0.8368 | - |
| 3.4656 | 78760 | 0.8219 | - |
| 3.4661 | 78770 | 0.8431 | - |
| 3.4665 | 78780 | 0.7959 | - |
| 3.4670 | 78790 | 0.7811 | - |
| 3.4674 | 78800 | 0.8075 | - |
| 3.4678 | 78810 | 0.8674 | - |
| 3.4683 | 78820 | 0.8446 | - |
| 3.4687 | 78830 | 0.8312 | - |
| 3.4692 | 78840 | 0.8059 | - |
| 3.4696 | 78850 | 0.8397 | - |
| 3.4700 | 78860 | 0.8378 | - |
| 3.4705 | 78870 | 0.8444 | - |
| 3.4709 | 78880 | 0.8228 | - |
| 3.4714 | 78890 | 0.8142 | - |
| 3.4718 | 78900 | 0.8158 | - |
| 3.4722 | 78910 | 0.7852 | - |
| 3.4727 | 78920 | 0.8326 | - |
| 3.4731 | 78930 | 0.8231 | - |
| 3.4736 | 78940 | 0.8523 | - |
| 3.4740 | 78950 | 0.7719 | - |
| 3.4744 | 78960 | 0.8395 | - |
| 3.4749 | 78970 | 0.807 | - |
| 3.4753 | 78980 | 0.863 | - |
| 3.4758 | 78990 | 0.8226 | - |
| 3.4762 | 79000 | 0.8163 | - |
| 3.4766 | 79010 | 0.8552 | - |
| 3.4771 | 79020 | 0.8254 | - |
| 3.4775 | 79030 | 0.8115 | - |
| 3.4780 | 79040 | 0.8097 | - |
| 3.4784 | 79050 | 0.8333 | - |
| 3.4788 | 79060 | 0.7931 | - |
| 3.4793 | 79070 | 0.8518 | - |
| 3.4797 | 79080 | 0.8412 | - |
| 3.4802 | 79090 | 0.8 | - |
| 3.4806 | 79100 | 0.8371 | - |
| 3.4810 | 79110 | 0.8175 | - |
| 3.4815 | 79120 | 0.8182 | - |
| 3.4819 | 79130 | 0.8031 | - |
| 3.4824 | 79140 | 0.8478 | - |
| 3.4828 | 79150 | 0.7991 | - |
| 3.4832 | 79160 | 0.8554 | - |
| 3.4837 | 79170 | 0.8338 | - |
| 3.4841 | 79180 | 0.7964 | - |
| 3.4846 | 79190 | 0.8065 | - |
| 3.4850 | 79200 | 0.8168 | - |
| 3.4854 | 79210 | 0.8225 | - |
| 3.4859 | 79220 | 0.8048 | - |
| 3.4863 | 79230 | 0.8298 | - |
| 3.4868 | 79240 | 0.8554 | - |
| 3.4872 | 79250 | 0.8361 | - |
| 3.4876 | 79260 | 0.8075 | - |
| 3.4881 | 79270 | 0.8241 | - |
| 3.4885 | 79280 | 0.8051 | - |
| 3.4890 | 79290 | 0.851 | - |
| 3.4894 | 79300 | 0.8355 | - |
| 3.4898 | 79310 | 0.7933 | - |
| 3.4903 | 79320 | 0.8075 | - |
| 3.4907 | 79330 | 0.796 | - |
| 3.4912 | 79340 | 0.829 | - |
| 3.4916 | 79350 | 0.8174 | - |
| 3.4920 | 79360 | 0.8602 | - |
| 3.4925 | 79370 | 0.8421 | - |
| 3.4929 | 79380 | 0.7882 | - |
| 3.4934 | 79390 | 0.7828 | - |
| 3.4938 | 79400 | 0.8359 | - |
| 3.4942 | 79410 | 0.8273 | - |
| 3.4947 | 79420 | 0.8275 | - |
| 3.4951 | 79430 | 0.8337 | - |
| 3.4956 | 79440 | 0.8393 | - |
| 3.4960 | 79450 | 0.8558 | - |
| 3.4964 | 79460 | 0.8176 | - |
| 3.4969 | 79470 | 0.8426 | - |
| 3.4973 | 79480 | 0.8227 | - |
| 3.4978 | 79490 | 0.7986 | - |
| 3.4982 | 79500 | 0.8513 | - |
| 3.4986 | 79510 | 0.8233 | - |
| 3.4991 | 79520 | 0.7981 | - |
| 3.4995 | 79530 | 0.8128 | - |
| 3.5000 | 79540 | 0.855 | - |
| 3.5004 | 79550 | 0.8601 | - |
| 3.5008 | 79560 | 0.8023 | - |
| 3.5013 | 79570 | 0.802 | - |
| 3.5017 | 79580 | 0.8163 | - |
| 3.5022 | 79590 | 0.7995 | 1.3837 |
| 3.5026 | 79600 | 0.8062 | - |
| 3.5030 | 79610 | 0.8079 | - |
| 3.5035 | 79620 | 0.7952 | - |
| 3.5039 | 79630 | 0.8064 | - |
| 3.5044 | 79640 | 0.8269 | - |
| 3.5048 | 79650 | 0.8365 | - |
| 3.5052 | 79660 | 0.8244 | - |
| 3.5057 | 79670 | 0.8121 | - |
| 3.5061 | 79680 | 0.8255 | - |
| 3.5066 | 79690 | 0.8083 | - |
| 3.5070 | 79700 | 0.855 | - |
| 3.5074 | 79710 | 0.7844 | - |
| 3.5079 | 79720 | 0.7829 | - |
| 3.5083 | 79730 | 0.8356 | - |
| 3.5088 | 79740 | 0.8064 | - |
| 3.5092 | 79750 | 0.8023 | - |
| 3.5096 | 79760 | 0.7997 | - |
| 3.5101 | 79770 | 0.8418 | - |
| 3.5105 | 79780 | 0.8075 | - |
| 3.5110 | 79790 | 0.8105 | - |
| 3.5114 | 79800 | 0.7648 | - |
| 3.5118 | 79810 | 0.8372 | - |
| 3.5123 | 79820 | 0.7942 | - |
| 3.5127 | 79830 | 0.8321 | - |
| 3.5132 | 79840 | 0.8227 | - |
| 3.5136 | 79850 | 0.8279 | - |
| 3.5140 | 79860 | 0.8356 | - |
| 3.5145 | 79870 | 0.852 | - |
| 3.5149 | 79880 | 0.8014 | - |
| 3.5154 | 79890 | 0.8184 | - |
| 3.5158 | 79900 | 0.8283 | - |
| 3.5162 | 79910 | 0.7779 | - |
| 3.5167 | 79920 | 0.843 | - |
| 3.5171 | 79930 | 0.8044 | - |
| 3.5176 | 79940 | 0.8453 | - |
| 3.5180 | 79950 | 0.8448 | - |
| 3.5184 | 79960 | 0.7981 | - |
| 3.5189 | 79970 | 0.8173 | - |
| 3.5193 | 79980 | 0.8753 | - |
| 3.5198 | 79990 | 0.7809 | - |
| 3.5202 | 80000 | 0.7773 | - |
| 3.5206 | 80010 | 0.8128 | - |
| 3.5211 | 80020 | 0.8379 | - |
| 3.5215 | 80030 | 0.8666 | - |
| 3.5220 | 80040 | 0.8112 | - |
| 3.5224 | 80050 | 0.859 | - |
| 3.5228 | 80060 | 0.8432 | - |
| 3.5233 | 80070 | 0.8145 | - |
| 3.5237 | 80080 | 0.8134 | - |
| 3.5242 | 80090 | 0.808 | - |
| 3.5246 | 80100 | 0.8182 | - |
| 3.5250 | 80110 | 0.7792 | - |
| 3.5255 | 80120 | 0.8454 | - |
| 3.5259 | 80130 | 0.8073 | - |
| 3.5264 | 80140 | 0.8301 | - |
| 3.5268 | 80150 | 0.8157 | - |
| 3.5272 | 80160 | 0.8235 | - |
| 3.5277 | 80170 | 0.8281 | - |
| 3.5281 | 80180 | 0.8238 | - |
| 3.5286 | 80190 | 0.8166 | - |
| 3.5290 | 80200 | 0.8211 | - |
| 3.5294 | 80210 | 0.8238 | - |
| 3.5299 | 80220 | 0.811 | - |
| 3.5303 | 80230 | 0.8175 | - |
| 3.5308 | 80240 | 0.8384 | - |
| 3.5312 | 80250 | 0.7981 | - |
| 3.5316 | 80260 | 0.8389 | - |
| 3.5321 | 80270 | 0.8513 | - |
| 3.5325 | 80280 | 0.8176 | - |
| 3.5330 | 80290 | 0.8382 | - |
| 3.5334 | 80300 | 0.8062 | - |
| 3.5338 | 80310 | 0.8262 | - |
| 3.5343 | 80320 | 0.8122 | - |
| 3.5347 | 80330 | 0.8299 | - |
| 3.5352 | 80340 | 0.8036 | - |
| 3.5356 | 80350 | 0.8341 | - |
| 3.5360 | 80360 | 0.8345 | - |
| 3.5365 | 80370 | 0.8663 | - |
| 3.5369 | 80380 | 0.826 | - |
| 3.5374 | 80390 | 0.8203 | - |
| 3.5378 | 80400 | 0.7951 | - |
| 3.5382 | 80410 | 0.8568 | - |
| 3.5387 | 80420 | 0.8099 | - |
| 3.5391 | 80430 | 0.811 | - |
| 3.5396 | 80440 | 0.8202 | - |
| 3.5400 | 80450 | 0.8915 | - |
| 3.5404 | 80460 | 0.8065 | - |
| 3.5409 | 80470 | 0.8372 | - |
| 3.5413 | 80480 | 0.8237 | - |
| 3.5418 | 80490 | 0.8317 | - |
| 3.5422 | 80500 | 0.7939 | - |
| 3.5426 | 80510 | 0.8071 | - |
| 3.5431 | 80520 | 0.8507 | - |
| 3.5435 | 80530 | 0.8071 | - |
| 3.5440 | 80540 | 0.8295 | - |
| 3.5444 | 80550 | 0.8018 | - |
| 3.5448 | 80560 | 0.767 | - |
| 3.5453 | 80570 | 0.7792 | - |
| 3.5457 | 80580 | 0.8474 | - |
| 3.5462 | 80590 | 0.8287 | - |
| 3.5466 | 80600 | 0.7772 | - |
| 3.5470 | 80610 | 0.8161 | - |
| 3.5475 | 80620 | 0.8173 | - |
| 3.5479 | 80630 | 0.7996 | - |
| 3.5484 | 80640 | 0.7879 | - |
| 3.5488 | 80650 | 0.8312 | - |
| 3.5492 | 80660 | 0.8135 | - |
| 3.5497 | 80670 | 0.8016 | - |
| 3.5501 | 80680 | 0.7853 | - |
| 3.5506 | 80690 | 0.8381 | - |
| 3.5510 | 80700 | 0.831 | - |
| 3.5514 | 80710 | 0.8416 | - |
| 3.5519 | 80720 | 0.8156 | - |
| 3.5522 | 80727 | - | 1.3862 |
| 3.5523 | 80730 | 0.7994 | - |
| 3.5528 | 80740 | 0.7681 | - |
| 3.5532 | 80750 | 0.8334 | - |
| 3.5536 | 80760 | 0.8203 | - |
| 3.5541 | 80770 | 0.8073 | - |
| 3.5545 | 80780 | 0.7944 | - |
| 3.5550 | 80790 | 0.7806 | - |
| 3.5554 | 80800 | 0.778 | - |
| 3.5558 | 80810 | 0.795 | - |
| 3.5563 | 80820 | 0.8067 | - |
| 3.5567 | 80830 | 0.8328 | - |
| 3.5572 | 80840 | 0.8218 | - |
| 3.5576 | 80850 | 0.8225 | - |
| 3.5580 | 80860 | 0.8507 | - |
| 3.5585 | 80870 | 0.7926 | - |
| 3.5589 | 80880 | 0.7923 | - |
| 3.5594 | 80890 | 0.7761 | - |
| 3.5598 | 80900 | 0.7992 | - |
| 3.5602 | 80910 | 0.7813 | - |
| 3.5607 | 80920 | 0.8322 | - |
| 3.5611 | 80930 | 0.8235 | - |
| 3.5616 | 80940 | 0.8143 | - |
| 3.5620 | 80950 | 0.8031 | - |
| 3.5624 | 80960 | 0.799 | - |
| 3.5629 | 80970 | 0.7658 | - |
| 3.5633 | 80980 | 0.8287 | - |
| 3.5638 | 80990 | 0.8142 | - |
| 3.5642 | 81000 | 0.8165 | - |
| 3.5646 | 81010 | 0.8514 | - |
| 3.5651 | 81020 | 0.8154 | - |
| 3.5655 | 81030 | 0.8462 | - |
| 3.5660 | 81040 | 0.76 | - |
| 3.5664 | 81050 | 0.8511 | - |
| 3.5668 | 81060 | 0.7323 | - |
| 3.5673 | 81070 | 0.8045 | - |
| 3.5677 | 81080 | 0.8382 | - |
| 3.5682 | 81090 | 0.8274 | - |
| 3.5686 | 81100 | 0.8064 | - |
| 3.5690 | 81110 | 0.7655 | - |
| 3.5695 | 81120 | 0.8168 | - |
| 3.5699 | 81130 | 0.8117 | - |
| 3.5704 | 81140 | 0.785 | - |
| 3.5708 | 81150 | 0.832 | - |
| 3.5712 | 81160 | 0.8375 | - |
| 3.5717 | 81170 | 0.7864 | - |
| 3.5721 | 81180 | 0.8167 | - |
| 3.5726 | 81190 | 0.8329 | - |
| 3.5730 | 81200 | 0.8267 | - |
| 3.5734 | 81210 | 0.8395 | - |
| 3.5739 | 81220 | 0.8519 | - |
| 3.5743 | 81230 | 0.8207 | - |
| 3.5748 | 81240 | 0.798 | - |
| 3.5752 | 81250 | 0.817 | - |
| 3.5756 | 81260 | 0.8411 | - |
| 3.5761 | 81270 | 0.8182 | - |
| 3.5765 | 81280 | 0.8288 | - |
| 3.5770 | 81290 | 0.8099 | - |
| 3.5774 | 81300 | 0.7793 | - |
| 3.5778 | 81310 | 0.8472 | - |
| 3.5783 | 81320 | 0.8061 | - |
| 3.5787 | 81330 | 0.7808 | - |
| 3.5792 | 81340 | 0.8127 | - |
| 3.5796 | 81350 | 0.8208 | - |
| 3.5800 | 81360 | 0.7852 | - |
| 3.5805 | 81370 | 0.8063 | - |
| 3.5809 | 81380 | 0.7759 | - |
| 3.5814 | 81390 | 0.8501 | - |
| 3.5818 | 81400 | 0.8205 | - |
| 3.5822 | 81410 | 0.8125 | - |
| 3.5827 | 81420 | 0.828 | - |
| 3.5831 | 81430 | 0.7998 | - |
| 3.5836 | 81440 | 0.8602 | - |
| 3.5840 | 81450 | 0.7844 | - |
| 3.5844 | 81460 | 0.8187 | - |
| 3.5849 | 81470 | 0.8021 | - |
| 3.5853 | 81480 | 0.7637 | - |
| 3.5858 | 81490 | 0.8461 | - |
| 3.5862 | 81500 | 0.8438 | - |
| 3.5866 | 81510 | 0.8549 | - |
| 3.5871 | 81520 | 0.8103 | - |
| 3.5875 | 81530 | 0.8024 | - |
| 3.5880 | 81540 | 0.7911 | - |
| 3.5884 | 81550 | 0.8503 | - |
| 3.5888 | 81560 | 0.7962 | - |
| 3.5893 | 81570 | 0.798 | - |
| 3.5897 | 81580 | 0.7978 | - |
| 3.5902 | 81590 | 0.8021 | - |
| 3.5906 | 81600 | 0.851 | - |
| 3.5910 | 81610 | 0.7917 | - |
| 3.5915 | 81620 | 0.8101 | - |
| 3.5919 | 81630 | 0.807 | - |
| 3.5924 | 81640 | 0.8308 | - |
| 3.5928 | 81650 | 0.8294 | - |
| 3.5932 | 81660 | 0.8187 | - |
| 3.5937 | 81670 | 0.8512 | - |
| 3.5941 | 81680 | 0.8003 | - |
| 3.5946 | 81690 | 0.7692 | - |
| 3.5950 | 81700 | 0.8189 | - |
| 3.5954 | 81710 | 0.7834 | - |
| 3.5959 | 81720 | 0.8491 | - |
| 3.5963 | 81730 | 0.8056 | - |
| 3.5968 | 81740 | 0.8445 | - |
| 3.5972 | 81750 | 0.7964 | - |
| 3.5976 | 81760 | 0.8031 | - |
| 3.5981 | 81770 | 0.816 | - |
| 3.5985 | 81780 | 0.8696 | - |
| 3.5990 | 81790 | 0.804 | - |
| 3.5994 | 81800 | 0.8133 | - |
| 3.5998 | 81810 | 0.8556 | - |
| 3.6003 | 81820 | 0.786 | - |
| 3.6007 | 81830 | 0.7925 | - |
| 3.6012 | 81840 | 0.7768 | - |
| 3.6016 | 81850 | 0.7761 | - |
| 3.6020 | 81860 | 0.7788 | - |
| 3.6022 | 81864 | - | 1.3726 |
| 3.6025 | 81870 | 0.8554 | - |
| 3.6029 | 81880 | 0.795 | - |
| 3.6034 | 81890 | 0.8061 | - |
| 3.6038 | 81900 | 0.7623 | - |
| 3.6042 | 81910 | 0.7742 | - |
| 3.6047 | 81920 | 0.7874 | - |
| 3.6051 | 81930 | 0.7983 | - |
| 3.6056 | 81940 | 0.8517 | - |
| 3.6060 | 81950 | 0.8093 | - |
| 3.6064 | 81960 | 0.8376 | - |
| 3.6069 | 81970 | 0.7594 | - |
| 3.6073 | 81980 | 0.8036 | - |
| 3.6078 | 81990 | 0.8171 | - |
| 3.6082 | 82000 | 0.7667 | - |
| 3.6086 | 82010 | 0.8398 | - |
| 3.6091 | 82020 | 0.8381 | - |
| 3.6095 | 82030 | 0.7781 | - |
| 3.6100 | 82040 | 0.8055 | - |
| 3.6104 | 82050 | 0.8229 | - |
| 3.6108 | 82060 | 0.8205 | - |
| 3.6113 | 82070 | 0.7735 | - |
| 3.6117 | 82080 | 0.8126 | - |
| 3.6122 | 82090 | 0.8 | - |
| 3.6126 | 82100 | 0.8309 | - |
| 3.6130 | 82110 | 0.7649 | - |
| 3.6135 | 82120 | 0.7746 | - |
| 3.6139 | 82130 | 0.8159 | - |
| 3.6144 | 82140 | 0.8341 | - |
| 3.6148 | 82150 | 0.8296 | - |
| 3.6152 | 82160 | 0.8089 | - |
| 3.6157 | 82170 | 0.823 | - |
| 3.6161 | 82180 | 0.7718 | - |
| 3.6166 | 82190 | 0.7813 | - |
| 3.6170 | 82200 | 0.7828 | - |
| 3.6174 | 82210 | 0.7598 | - |
| 3.6179 | 82220 | 0.7736 | - |
| 3.6183 | 82230 | 0.8095 | - |
| 3.6188 | 82240 | 0.8178 | - |
| 3.6192 | 82250 | 0.8116 | - |
| 3.6196 | 82260 | 0.7986 | - |
| 3.6201 | 82270 | 0.8398 | - |
| 3.6205 | 82280 | 0.8007 | - |
| 3.6210 | 82290 | 0.7973 | - |
| 3.6214 | 82300 | 0.7825 | - |
| 3.6218 | 82310 | 0.7661 | - |
| 3.6223 | 82320 | 0.7936 | - |
| 3.6227 | 82330 | 0.8323 | - |
| 3.6232 | 82340 | 0.7948 | - |
| 3.6236 | 82350 | 0.7793 | - |
| 3.6240 | 82360 | 0.843 | - |
| 3.6245 | 82370 | 0.8036 | - |
| 3.6249 | 82380 | 0.7912 | - |
| 3.6254 | 82390 | 0.8025 | - |
| 3.6258 | 82400 | 0.8308 | - |
| 3.6262 | 82410 | 0.8139 | - |
| 3.6267 | 82420 | 0.8046 | - |
| 3.6271 | 82430 | 0.7953 | - |
| 3.6276 | 82440 | 0.8036 | - |
| 3.6280 | 82450 | 0.8386 | - |
| 3.6284 | 82460 | 0.7951 | - |
| 3.6289 | 82470 | 0.8256 | - |
| 3.6293 | 82480 | 0.8126 | - |
| 3.6298 | 82490 | 0.7795 | - |
| 3.6302 | 82500 | 0.8027 | - |
| 3.6306 | 82510 | 0.7972 | - |
| 3.6311 | 82520 | 0.7627 | - |
| 3.6315 | 82530 | 0.7902 | - |
| 3.6320 | 82540 | 0.8104 | - |
| 3.6324 | 82550 | 0.8035 | - |
| 3.6328 | 82560 | 0.7675 | - |
| 3.6333 | 82570 | 0.7904 | - |
| 3.6337 | 82580 | 0.7814 | - |
| 3.6342 | 82590 | 0.7888 | - |
| 3.6346 | 82600 | 0.801 | - |
| 3.6350 | 82610 | 0.8126 | - |
| 3.6355 | 82620 | 0.801 | - |
| 3.6359 | 82630 | 0.8169 | - |
| 3.6364 | 82640 | 0.8154 | - |
| 3.6368 | 82650 | 0.7942 | - |
| 3.6372 | 82660 | 0.8199 | - |
| 3.6377 | 82670 | 0.8313 | - |
| 3.6381 | 82680 | 0.8122 | - |
| 3.6386 | 82690 | 0.8329 | - |
| 3.6390 | 82700 | 0.8234 | - |
| 3.6394 | 82710 | 0.821 | - |
| 3.6399 | 82720 | 0.8119 | - |
| 3.6403 | 82730 | 0.7914 | - |
| 3.6408 | 82740 | 0.8 | - |
| 3.6412 | 82750 | 0.7946 | - |
| 3.6416 | 82760 | 0.8204 | - |
| 3.6421 | 82770 | 0.8213 | - |
| 3.6425 | 82780 | 0.8023 | - |
| 3.6430 | 82790 | 0.8379 | - |
| 3.6434 | 82800 | 0.8283 | - |
| 3.6438 | 82810 | 0.7946 | - |
| 3.6443 | 82820 | 0.804 | - |
| 3.6447 | 82830 | 0.8067 | - |
| 3.6452 | 82840 | 0.8264 | - |
| 3.6456 | 82850 | 0.7971 | - |
| 3.6460 | 82860 | 0.8057 | - |
| 3.6465 | 82870 | 0.805 | - |
| 3.6469 | 82880 | 0.7641 | - |
| 3.6474 | 82890 | 0.8412 | - |
| 3.6478 | 82900 | 0.7575 | - |
| 3.6482 | 82910 | 0.7963 | - |
| 3.6487 | 82920 | 0.7939 | - |
| 3.6491 | 82930 | 0.8143 | - |
| 3.6496 | 82940 | 0.8393 | - |
| 3.6500 | 82950 | 0.8088 | - |
| 3.6504 | 82960 | 0.8009 | - |
| 3.6509 | 82970 | 0.802 | - |
| 3.6513 | 82980 | 0.8174 | - |
| 3.6518 | 82990 | 0.7669 | - |
| 3.6522 | 83000 | 0.7981 | - |
| 3.6522 | 83001 | - | 1.3769 |
| 3.6526 | 83010 | 0.7776 | - |
| 3.6531 | 83020 | 0.8126 | - |
| 3.6535 | 83030 | 0.7966 | - |
| 3.6540 | 83040 | 0.8274 | - |
| 3.6544 | 83050 | 0.7831 | - |
| 3.6548 | 83060 | 0.8281 | - |
| 3.6553 | 83070 | 0.8101 | - |
| 3.6557 | 83080 | 0.7799 | - |
| 3.6562 | 83090 | 0.78 | - |
| 3.6566 | 83100 | 0.8113 | - |
| 3.6570 | 83110 | 0.8215 | - |
| 3.6575 | 83120 | 0.7934 | - |
| 3.6579 | 83130 | 0.8237 | - |
| 3.6584 | 83140 | 0.7835 | - |
| 3.6588 | 83150 | 0.7888 | - |
| 3.6592 | 83160 | 0.7711 | - |
| 3.6597 | 83170 | 0.8044 | - |
| 3.6601 | 83180 | 0.7981 | - |
| 3.6606 | 83190 | 0.8171 | - |
| 3.6610 | 83200 | 0.7921 | - |
| 3.6614 | 83210 | 0.833 | - |
| 3.6619 | 83220 | 0.8046 | - |
| 3.6623 | 83230 | 0.7808 | - |
| 3.6628 | 83240 | 0.8128 | - |
| 3.6632 | 83250 | 0.8178 | - |
| 3.6636 | 83260 | 0.7954 | - |
| 3.6641 | 83270 | 0.7979 | - |
| 3.6645 | 83280 | 0.8139 | - |
| 3.6650 | 83290 | 0.8071 | - |
| 3.6654 | 83300 | 0.7732 | - |
| 3.6658 | 83310 | 0.817 | - |
| 3.6663 | 83320 | 0.7932 | - |
| 3.6667 | 83330 | 0.8054 | - |
| 3.6672 | 83340 | 0.8356 | - |
| 3.6676 | 83350 | 0.8242 | - |
| 3.6680 | 83360 | 0.8106 | - |
| 3.6685 | 83370 | 0.8185 | - |
| 3.6689 | 83380 | 0.8059 | - |
| 3.6694 | 83390 | 0.7519 | - |
| 3.6698 | 83400 | 0.7983 | - |
| 3.6702 | 83410 | 0.7964 | - |
| 3.6707 | 83420 | 0.7715 | - |
| 3.6711 | 83430 | 0.7857 | - |
| 3.6716 | 83440 | 0.7806 | - |
| 3.6720 | 83450 | 0.8197 | - |
| 3.6724 | 83460 | 0.7934 | - |
| 3.6729 | 83470 | 0.8308 | - |
| 3.6733 | 83480 | 0.7826 | - |
| 3.6738 | 83490 | 0.7954 | - |
| 3.6742 | 83500 | 0.83 | - |
| 3.6746 | 83510 | 0.7939 | - |
| 3.6751 | 83520 | 0.778 | - |
| 3.6755 | 83530 | 0.7887 | - |
| 3.6760 | 83540 | 0.762 | - |
| 3.6764 | 83550 | 0.7916 | - |
| 3.6768 | 83560 | 0.8125 | - |
| 3.6773 | 83570 | 0.7905 | - |
| 3.6777 | 83580 | 0.7697 | - |
| 3.6782 | 83590 | 0.8235 | - |
| 3.6786 | 83600 | 0.7587 | - |
| 3.6790 | 83610 | 0.8012 | - |
| 3.6795 | 83620 | 0.8254 | - |
| 3.6799 | 83630 | 0.801 | - |
| 3.6804 | 83640 | 0.803 | - |
| 3.6808 | 83650 | 0.7913 | - |
| 3.6812 | 83660 | 0.7524 | - |
| 3.6817 | 83670 | 0.7894 | - |
| 3.6821 | 83680 | 0.8247 | - |
| 3.6826 | 83690 | 0.7379 | - |
| 3.6830 | 83700 | 0.8349 | - |
| 3.6834 | 83710 | 0.8038 | - |
| 3.6839 | 83720 | 0.7721 | - |
| 3.6843 | 83730 | 0.7839 | - |
| 3.6848 | 83740 | 0.8192 | - |
| 3.6852 | 83750 | 0.778 | - |
| 3.6856 | 83760 | 0.8029 | - |
| 3.6861 | 83770 | 0.7833 | - |
| 3.6865 | 83780 | 0.8003 | - |
| 3.6870 | 83790 | 0.8002 | - |
| 3.6874 | 83800 | 0.7818 | - |
| 3.6878 | 83810 | 0.8046 | - |
| 3.6883 | 83820 | 0.7773 | - |
| 3.6887 | 83830 | 0.8162 | - |
| 3.6892 | 83840 | 0.8343 | - |
| 3.6896 | 83850 | 0.7641 | - |
| 3.6900 | 83860 | 0.7848 | - |
| 3.6905 | 83870 | 0.8191 | - |
| 3.6909 | 83880 | 0.7963 | - |
| 3.6914 | 83890 | 0.797 | - |
| 3.6918 | 83900 | 0.7576 | - |
| 3.6922 | 83910 | 0.8122 | - |
| 3.6927 | 83920 | 0.8124 | - |
| 3.6931 | 83930 | 0.7323 | - |
| 3.6936 | 83940 | 0.797 | - |
| 3.6940 | 83950 | 0.7372 | - |
| 3.6944 | 83960 | 0.7857 | - |
| 3.6949 | 83970 | 0.7928 | - |
| 3.6953 | 83980 | 0.7617 | - |
| 3.6958 | 83990 | 0.8044 | - |
| 3.6962 | 84000 | 0.7951 | - |
| 3.6966 | 84010 | 0.8358 | - |
| 3.6971 | 84020 | 0.8252 | - |
| 3.6975 | 84030 | 0.7879 | - |
| 3.6980 | 84040 | 0.7826 | - |
| 3.6984 | 84050 | 0.8096 | - |
| 3.6988 | 84060 | 0.7823 | - |
| 3.6993 | 84070 | 0.7898 | - |
| 3.6997 | 84080 | 0.7907 | - |
| 3.7002 | 84090 | 0.8199 | - |
| 3.7006 | 84100 | 0.8132 | - |
| 3.7010 | 84110 | 0.8185 | - |
| 3.7015 | 84120 | 0.8398 | - |
| 3.7019 | 84130 | 0.7927 | - |
| 3.7023 | 84138 | - | 1.3690 |
| 3.7024 | 84140 | 0.7846 | - |
| 3.7028 | 84150 | 0.7832 | - |
| 3.7032 | 84160 | 0.7627 | - |
| 3.7037 | 84170 | 0.8082 | - |
| 3.7041 | 84180 | 0.8002 | - |
| 3.7046 | 84190 | 0.8285 | - |
| 3.7050 | 84200 | 0.7985 | - |
| 3.7054 | 84210 | 0.8006 | - |
| 3.7059 | 84220 | 0.7643 | - |
| 3.7063 | 84230 | 0.7783 | - |
| 3.7068 | 84240 | 0.8009 | - |
| 3.7072 | 84250 | 0.7672 | - |
| 3.7076 | 84260 | 0.8421 | - |
| 3.7081 | 84270 | 0.7864 | - |
| 3.7085 | 84280 | 0.7813 | - |
| 3.7090 | 84290 | 0.7913 | - |
| 3.7094 | 84300 | 0.7968 | - |
| 3.7098 | 84310 | 0.8092 | - |
| 3.7103 | 84320 | 0.7647 | - |
| 3.7107 | 84330 | 0.8048 | - |
| 3.7112 | 84340 | 0.8024 | - |
| 3.7116 | 84350 | 0.8016 | - |
| 3.7120 | 84360 | 0.8048 | - |
| 3.7125 | 84370 | 0.7473 | - |
| 3.7129 | 84380 | 0.7852 | - |
| 3.7134 | 84390 | 0.7815 | - |
| 3.7138 | 84400 | 0.8306 | - |
| 3.7142 | 84410 | 0.8004 | - |
| 3.7147 | 84420 | 0.7993 | - |
| 3.7151 | 84430 | 0.8048 | - |
| 3.7156 | 84440 | 0.7818 | - |
| 3.7160 | 84450 | 0.787 | - |
| 3.7164 | 84460 | 0.7992 | - |
| 3.7169 | 84470 | 0.8161 | - |
| 3.7173 | 84480 | 0.7911 | - |
| 3.7178 | 84490 | 0.8011 | - |
| 3.7182 | 84500 | 0.7969 | - |
| 3.7186 | 84510 | 0.8016 | - |
| 3.7191 | 84520 | 0.799 | - |
| 3.7195 | 84530 | 0.8208 | - |
| 3.7200 | 84540 | 0.7494 | - |
| 3.7204 | 84550 | 0.8078 | - |
| 3.7208 | 84560 | 0.8278 | - |
| 3.7213 | 84570 | 0.7908 | - |
| 3.7217 | 84580 | 0.7968 | - |
| 3.7222 | 84590 | 0.767 | - |
| 3.7226 | 84600 | 0.783 | - |
| 3.7230 | 84610 | 0.7495 | - |
| 3.7235 | 84620 | 0.7868 | - |
| 3.7239 | 84630 | 0.7977 | - |
| 3.7244 | 84640 | 0.8218 | - |
| 3.7248 | 84650 | 0.7841 | - |
| 3.7252 | 84660 | 0.8066 | - |
| 3.7257 | 84670 | 0.7861 | - |
| 3.7261 | 84680 | 0.7704 | - |
| 3.7266 | 84690 | 0.82 | - |
| 3.7270 | 84700 | 0.8091 | - |
| 3.7274 | 84710 | 0.793 | - |
| 3.7279 | 84720 | 0.7623 | - |
| 3.7283 | 84730 | 0.7761 | - |
| 3.7288 | 84740 | 0.7622 | - |
| 3.7292 | 84750 | 0.7868 | - |
| 3.7296 | 84760 | 0.7996 | - |
| 3.7301 | 84770 | 0.7737 | - |
| 3.7305 | 84780 | 0.7886 | - |
| 3.7310 | 84790 | 0.7865 | - |
| 3.7314 | 84800 | 0.8285 | - |
| 3.7318 | 84810 | 0.7984 | - |
| 3.7323 | 84820 | 0.7977 | - |
| 3.7327 | 84830 | 0.8075 | - |
| 3.7332 | 84840 | 0.7763 | - |
| 3.7336 | 84850 | 0.7344 | - |
| 3.7340 | 84860 | 0.8063 | - |
| 3.7345 | 84870 | 0.8026 | - |
| 3.7349 | 84880 | 0.796 | - |
| 3.7354 | 84890 | 0.8288 | - |
| 3.7358 | 84900 | 0.7825 | - |
| 3.7362 | 84910 | 0.7987 | - |
| 3.7367 | 84920 | 0.8285 | - |
| 3.7371 | 84930 | 0.7409 | - |
| 3.7376 | 84940 | 0.7952 | - |
| 3.7380 | 84950 | 0.7913 | - |
| 3.7384 | 84960 | 0.7595 | - |
| 3.7389 | 84970 | 0.8179 | - |
| 3.7393 | 84980 | 0.848 | - |
| 3.7398 | 84990 | 0.7583 | - |
| 3.7402 | 85000 | 0.7793 | - |
| 3.7406 | 85010 | 0.8232 | - |
| 3.7411 | 85020 | 0.8119 | - |
| 3.7415 | 85030 | 0.8035 | - |
| 3.7420 | 85040 | 0.8255 | - |
| 3.7424 | 85050 | 0.8023 | - |
| 3.7428 | 85060 | 0.8042 | - |
| 3.7433 | 85070 | 0.7664 | - |
| 3.7437 | 85080 | 0.7786 | - |
| 3.7442 | 85090 | 0.7845 | - |
| 3.7446 | 85100 | 0.7837 | - |
| 3.7450 | 85110 | 0.7866 | - |
| 3.7455 | 85120 | 0.7945 | - |
| 3.7459 | 85130 | 0.7821 | - |
| 3.7464 | 85140 | 0.7921 | - |
| 3.7468 | 85150 | 0.7824 | - |
| 3.7472 | 85160 | 0.7738 | - |
| 3.7477 | 85170 | 0.7706 | - |
| 3.7481 | 85180 | 0.8167 | - |
| 3.7486 | 85190 | 0.7984 | - |
| 3.7490 | 85200 | 0.8004 | - |
| 3.7494 | 85210 | 0.7642 | - |
| 3.7499 | 85220 | 0.77 | - |
| 3.7503 | 85230 | 0.7683 | - |
| 3.7508 | 85240 | 0.8278 | - |
| 3.7512 | 85250 | 0.8392 | - |
| 3.7517 | 85260 | 0.817 | - |
| 3.7521 | 85270 | 0.79 | - |
| 3.7523 | 85275 | - | 1.3909 |
| 3.7525 | 85280 | 0.7903 | - |
| 3.7530 | 85290 | 0.7937 | - |
| 3.7534 | 85300 | 0.7754 | - |
| 3.7539 | 85310 | 0.7997 | - |
| 3.7543 | 85320 | 0.727 | - |
| 3.7547 | 85330 | 0.7622 | - |
| 3.7552 | 85340 | 0.8107 | - |
| 3.7556 | 85350 | 0.782 | - |
| 3.7561 | 85360 | 0.7775 | - |
| 3.7565 | 85370 | 0.8287 | - |
| 3.7569 | 85380 | 0.8162 | - |
| 3.7574 | 85390 | 0.7528 | - |
| 3.7578 | 85400 | 0.8173 | - |
| 3.7583 | 85410 | 0.8138 | - |
| 3.7587 | 85420 | 0.7904 | - |
| 3.7591 | 85430 | 0.8118 | - |
| 3.7596 | 85440 | 0.7946 | - |
| 3.7600 | 85450 | 0.7916 | - |
| 3.7605 | 85460 | 0.7352 | - |
| 3.7609 | 85470 | 0.7901 | - |
| 3.7613 | 85480 | 0.7648 | - |
| 3.7618 | 85490 | 0.8297 | - |
| 3.7622 | 85500 | 0.7714 | - |
| 3.7627 | 85510 | 0.799 | - |
| 3.7631 | 85520 | 0.7968 | - |
| 3.7635 | 85530 | 0.7587 | - |
| 3.7640 | 85540 | 0.7722 | - |
| 3.7644 | 85550 | 0.791 | - |
| 3.7649 | 85560 | 0.7942 | - |
| 3.7653 | 85570 | 0.7676 | - |
| 3.7657 | 85580 | 0.8101 | - |
| 3.7662 | 85590 | 0.8028 | - |
| 3.7666 | 85600 | 0.7454 | - |
| 3.7671 | 85610 | 0.8007 | - |
| 3.7675 | 85620 | 0.8226 | - |
| 3.7679 | 85630 | 0.7766 | - |
| 3.7684 | 85640 | 0.7822 | - |
| 3.7688 | 85650 | 0.8046 | - |
| 3.7693 | 85660 | 0.7569 | - |
| 3.7697 | 85670 | 0.7687 | - |
| 3.7701 | 85680 | 0.7448 | - |
| 3.7706 | 85690 | 0.7909 | - |
| 3.7710 | 85700 | 0.7775 | - |
| 3.7715 | 85710 | 0.8067 | - |
| 3.7719 | 85720 | 0.7782 | - |
| 3.7723 | 85730 | 0.7832 | - |
| 3.7728 | 85740 | 0.7603 | - |
| 3.7732 | 85750 | 0.8055 | - |
| 3.7737 | 85760 | 0.8 | - |
| 3.7741 | 85770 | 0.7873 | - |
| 3.7745 | 85780 | 0.7613 | - |
| 3.7750 | 85790 | 0.7894 | - |
| 3.7754 | 85800 | 0.8002 | - |
| 3.7759 | 85810 | 0.7696 | - |
| 3.7763 | 85820 | 0.7473 | - |
| 3.7767 | 85830 | 0.8359 | - |
| 3.7772 | 85840 | 0.7806 | - |
| 3.7776 | 85850 | 0.7789 | - |
| 3.7781 | 85860 | 0.8154 | - |
| 3.7785 | 85870 | 0.7616 | - |
| 3.7789 | 85880 | 0.7672 | - |
| 3.7794 | 85890 | 0.7855 | - |
| 3.7798 | 85900 | 0.7488 | - |
| 3.7803 | 85910 | 0.7721 | - |
| 3.7807 | 85920 | 0.7789 | - |
| 3.7811 | 85930 | 0.7993 | - |
| 3.7816 | 85940 | 0.778 | - |
| 3.7820 | 85950 | 0.7778 | - |
| 3.7825 | 85960 | 0.8483 | - |
| 3.7829 | 85970 | 0.7868 | - |
| 3.7833 | 85980 | 0.7954 | - |
| 3.7838 | 85990 | 0.7675 | - |
| 3.7842 | 86000 | 0.7741 | - |
| 3.7847 | 86010 | 0.8057 | - |
| 3.7851 | 86020 | 0.7714 | - |
| 3.7855 | 86030 | 0.8109 | - |
| 3.7860 | 86040 | 0.8106 | - |
| 3.7864 | 86050 | 0.7918 | - |
| 3.7869 | 86060 | 0.7752 | - |
| 3.7873 | 86070 | 0.7734 | - |
| 3.7877 | 86080 | 0.8018 | - |
| 3.7882 | 86090 | 0.8243 | - |
| 3.7886 | 86100 | 0.7546 | - |
| 3.7891 | 86110 | 0.7801 | - |
| 3.7895 | 86120 | 0.7999 | - |
| 3.7899 | 86130 | 0.7931 | - |
| 3.7904 | 86140 | 0.7707 | - |
| 3.7908 | 86150 | 0.8215 | - |
| 3.7913 | 86160 | 0.791 | - |
| 3.7917 | 86170 | 0.7746 | - |
| 3.7921 | 86180 | 0.8192 | - |
| 3.7926 | 86190 | 0.7633 | - |
| 3.7930 | 86200 | 0.7952 | - |
| 3.7935 | 86210 | 0.7938 | - |
| 3.7939 | 86220 | 0.7803 | - |
| 3.7943 | 86230 | 0.7681 | - |
| 3.7948 | 86240 | 0.8102 | - |
| 3.7952 | 86250 | 0.7999 | - |
| 3.7957 | 86260 | 0.785 | - |
| 3.7961 | 86270 | 0.7819 | - |
| 3.7965 | 86280 | 0.7827 | - |
| 3.7970 | 86290 | 0.7776 | - |
| 3.7974 | 86300 | 0.7879 | - |
| 3.7979 | 86310 | 0.8116 | - |
| 3.7983 | 86320 | 0.7899 | - |
| 3.7987 | 86330 | 0.791 | - |
| 3.7992 | 86340 | 0.8193 | - |
| 3.7996 | 86350 | 0.7971 | - |
| 3.8001 | 86360 | 0.7871 | - |
| 3.8005 | 86370 | 0.807 | - |
| 3.8009 | 86380 | 0.7674 | - |
| 3.8014 | 86390 | 0.789 | - |
| 3.8018 | 86400 | 0.7726 | - |
| 3.8023 | 86410 | 0.801 | - |
| 3.8023 | 86412 | - | 1.3781 |
| 3.8027 | 86420 | 0.8232 | - |
| 3.8031 | 86430 | 0.8254 | - |
| 3.8036 | 86440 | 0.8167 | - |
| 3.8040 | 86450 | 0.778 | - |
| 3.8045 | 86460 | 0.7623 | - |
| 3.8049 | 86470 | 0.8178 | - |
| 3.8053 | 86480 | 0.8225 | - |
| 3.8058 | 86490 | 0.758 | - |
| 3.8062 | 86500 | 0.7624 | - |
| 3.8067 | 86510 | 0.7823 | - |
| 3.8071 | 86520 | 0.7799 | - |
| 3.8075 | 86530 | 0.7537 | - |
| 3.8080 | 86540 | 0.8077 | - |
| 3.8084 | 86550 | 0.8088 | - |
| 3.8089 | 86560 | 0.798 | - |
| 3.8093 | 86570 | 0.8011 | - |
| 3.8097 | 86580 | 0.7641 | - |
| 3.8102 | 86590 | 0.7813 | - |
| 3.8106 | 86600 | 0.7731 | - |
| 3.8111 | 86610 | 0.7764 | - |
| 3.8115 | 86620 | 0.7975 | - |
| 3.8119 | 86630 | 0.8013 | - |
| 3.8124 | 86640 | 0.7693 | - |
| 3.8128 | 86650 | 0.7813 | - |
| 3.8133 | 86660 | 0.7408 | - |
| 3.8137 | 86670 | 0.7784 | - |
| 3.8141 | 86680 | 0.7754 | - |
| 3.8146 | 86690 | 0.8008 | - |
| 3.8150 | 86700 | 0.7843 | - |
| 3.8155 | 86710 | 0.8254 | - |
| 3.8159 | 86720 | 0.7922 | - |
| 3.8163 | 86730 | 0.7719 | - |
| 3.8168 | 86740 | 0.7494 | - |
| 3.8172 | 86750 | 0.7922 | - |
| 3.8177 | 86760 | 0.7872 | - |
| 3.8181 | 86770 | 0.788 | - |
| 3.8185 | 86780 | 0.7467 | - |
| 3.8190 | 86790 | 0.7625 | - |
| 3.8194 | 86800 | 0.7631 | - |
| 3.8199 | 86810 | 0.7861 | - |
| 3.8203 | 86820 | 0.7833 | - |
| 3.8207 | 86830 | 0.7767 | - |
| 3.8212 | 86840 | 0.8052 | - |
| 3.8216 | 86850 | 0.828 | - |
| 3.8221 | 86860 | 0.7688 | - |
| 3.8225 | 86870 | 0.7862 | - |
| 3.8229 | 86880 | 0.795 | - |
| 3.8234 | 86890 | 0.7769 | - |
| 3.8238 | 86900 | 0.7545 | - |
| 3.8243 | 86910 | 0.7949 | - |
| 3.8247 | 86920 | 0.8386 | - |
| 3.8251 | 86930 | 0.7647 | - |
| 3.8256 | 86940 | 0.8075 | - |
| 3.8260 | 86950 | 0.7662 | - |
| 3.8265 | 86960 | 0.7416 | - |
| 3.8269 | 86970 | 0.8002 | - |
| 3.8273 | 86980 | 0.7856 | - |
| 3.8278 | 86990 | 0.776 | - |
| 3.8282 | 87000 | 0.7827 | - |
| 3.8287 | 87010 | 0.7814 | - |
| 3.8291 | 87020 | 0.7578 | - |
| 3.8295 | 87030 | 0.7759 | - |
| 3.8300 | 87040 | 0.7987 | - |
| 3.8304 | 87050 | 0.7698 | - |
| 3.8309 | 87060 | 0.7958 | - |
| 3.8313 | 87070 | 0.7672 | - |
| 3.8317 | 87080 | 0.7666 | - |
| 3.8322 | 87090 | 0.806 | - |
| 3.8326 | 87100 | 0.8002 | - |
| 3.8331 | 87110 | 0.7501 | - |
| 3.8335 | 87120 | 0.8016 | - |
| 3.8339 | 87130 | 0.7688 | - |
| 3.8344 | 87140 | 0.771 | - |
| 3.8348 | 87150 | 0.7803 | - |
| 3.8353 | 87160 | 0.7942 | - |
| 3.8357 | 87170 | 0.7691 | - |
| 3.8361 | 87180 | 0.7847 | - |
| 3.8366 | 87190 | 0.7851 | - |
| 3.8370 | 87200 | 0.7552 | - |
| 3.8375 | 87210 | 0.7986 | - |
| 3.8379 | 87220 | 0.7775 | - |
| 3.8383 | 87230 | 0.7484 | - |
| 3.8388 | 87240 | 0.7775 | - |
| 3.8392 | 87250 | 0.7459 | - |
| 3.8397 | 87260 | 0.7953 | - |
| 3.8401 | 87270 | 0.7508 | - |
| 3.8405 | 87280 | 0.7791 | - |
| 3.8410 | 87290 | 0.7596 | - |
| 3.8414 | 87300 | 0.7504 | - |
| 3.8419 | 87310 | 0.762 | - |
| 3.8423 | 87320 | 0.7813 | - |
| 3.8427 | 87330 | 0.8048 | - |
| 3.8432 | 87340 | 0.7801 | - |
| 3.8436 | 87350 | 0.7948 | - |
| 3.8441 | 87360 | 0.7646 | - |
| 3.8445 | 87370 | 0.7888 | - |
| 3.8449 | 87380 | 0.7324 | - |
| 3.8454 | 87390 | 0.798 | - |
| 3.8458 | 87400 | 0.7827 | - |
| 3.8463 | 87410 | 0.7826 | - |
| 3.8467 | 87420 | 0.7829 | - |
| 3.8471 | 87430 | 0.7765 | - |
| 3.8476 | 87440 | 0.7781 | - |
| 3.8480 | 87450 | 0.7624 | - |
| 3.8485 | 87460 | 0.7762 | - |
| 3.8489 | 87470 | 0.7732 | - |
| 3.8493 | 87480 | 0.821 | - |
| 3.8498 | 87490 | 0.7754 | - |
| 3.8502 | 87500 | 0.7605 | - |
| 3.8507 | 87510 | 0.7665 | - |
| 3.8511 | 87520 | 0.7907 | - |
| 3.8515 | 87530 | 0.8188 | - |
| 3.8520 | 87540 | 0.7875 | - |
| 3.8524 | 87549 | - | 1.3775 |
| 3.8524 | 87550 | 0.8105 | - |
| 3.8529 | 87560 | 0.7488 | - |
| 3.8533 | 87570 | 0.785 | - |
| 3.8537 | 87580 | 0.7622 | - |
| 3.8542 | 87590 | 0.7882 | - |
| 3.8546 | 87600 | 0.7897 | - |
| 3.8551 | 87610 | 0.7659 | - |
| 3.8555 | 87620 | 0.7964 | - |
| 3.8559 | 87630 | 0.7778 | - |
| 3.8564 | 87640 | 0.8123 | - |
| 3.8568 | 87650 | 0.7622 | - |
| 3.8573 | 87660 | 0.7806 | - |
| 3.8577 | 87670 | 0.777 | - |
| 3.8581 | 87680 | 0.7917 | - |
| 3.8586 | 87690 | 0.7659 | - |
| 3.8590 | 87700 | 0.7595 | - |
| 3.8595 | 87710 | 0.7641 | - |
| 3.8599 | 87720 | 0.7912 | - |
| 3.8603 | 87730 | 0.7924 | - |
| 3.8608 | 87740 | 0.7924 | - |
| 3.8612 | 87750 | 0.756 | - |
| 3.8617 | 87760 | 0.7787 | - |
| 3.8621 | 87770 | 0.7882 | - |
| 3.8625 | 87780 | 0.7712 | - |
| 3.8630 | 87790 | 0.7516 | - |
| 3.8634 | 87800 | 0.7512 | - |
| 3.8639 | 87810 | 0.7538 | - |
| 3.8643 | 87820 | 0.8062 | - |
| 3.8647 | 87830 | 0.7743 | - |
| 3.8652 | 87840 | 0.7714 | - |
| 3.8656 | 87850 | 0.7736 | - |
| 3.8661 | 87860 | 0.7857 | - |
| 3.8665 | 87870 | 0.8228 | - |
| 3.8669 | 87880 | 0.7598 | - |
| 3.8674 | 87890 | 0.7898 | - |
| 3.8678 | 87900 | 0.7801 | - |
| 3.8683 | 87910 | 0.7638 | - |
| 3.8687 | 87920 | 0.7435 | - |
| 3.8691 | 87930 | 0.8042 | - |
| 3.8696 | 87940 | 0.787 | - |
| 3.8700 | 87950 | 0.7634 | - |
| 3.8705 | 87960 | 0.7795 | - |
| 3.8709 | 87970 | 0.7634 | - |
| 3.8713 | 87980 | 0.8065 | - |
| 3.8718 | 87990 | 0.7717 | - |
| 3.8722 | 88000 | 0.7965 | - |
| 3.8727 | 88010 | 0.7769 | - |
| 3.8731 | 88020 | 0.7857 | - |
| 3.8735 | 88030 | 0.7965 | - |
| 3.8740 | 88040 | 0.7719 | - |
| 3.8744 | 88050 | 0.7278 | - |
| 3.8749 | 88060 | 0.7666 | - |
| 3.8753 | 88070 | 0.7887 | - |
| 3.8757 | 88080 | 0.7795 | - |
| 3.8762 | 88090 | 0.7582 | - |
| 3.8766 | 88100 | 0.7813 | - |
| 3.8771 | 88110 | 0.7852 | - |
| 3.8775 | 88120 | 0.7804 | - |
| 3.8779 | 88130 | 0.7963 | - |
| 3.8784 | 88140 | 0.8097 | - |
| 3.8788 | 88150 | 0.7434 | - |
| 3.8793 | 88160 | 0.7697 | - |
| 3.8797 | 88170 | 0.7941 | - |
| 3.8801 | 88180 | 0.7893 | - |
| 3.8806 | 88190 | 0.7773 | - |
| 3.8810 | 88200 | 0.7684 | - |
| 3.8815 | 88210 | 0.8039 | - |
| 3.8819 | 88220 | 0.7672 | - |
| 3.8823 | 88230 | 0.8181 | - |
| 3.8828 | 88240 | 0.7965 | - |
| 3.8832 | 88250 | 0.7942 | - |
| 3.8837 | 88260 | 0.75 | - |
| 3.8841 | 88270 | 0.7902 | - |
| 3.8845 | 88280 | 0.8077 | - |
| 3.8850 | 88290 | 0.7449 | - |
| 3.8854 | 88300 | 0.8272 | - |
| 3.8859 | 88310 | 0.8152 | - |
| 3.8863 | 88320 | 0.7734 | - |
| 3.8867 | 88330 | 0.7684 | - |
| 3.8872 | 88340 | 0.7402 | - |
| 3.8876 | 88350 | 0.7676 | - |
| 3.8881 | 88360 | 0.7682 | - |
| 3.8885 | 88370 | 0.7382 | - |
| 3.8889 | 88380 | 0.7543 | - |
| 3.8894 | 88390 | 0.7966 | - |
| 3.8898 | 88400 | 0.7903 | - |
| 3.8903 | 88410 | 0.7831 | - |
| 3.8907 | 88420 | 0.7792 | - |
| 3.8911 | 88430 | 0.7793 | - |
| 3.8916 | 88440 | 0.7633 | - |
| 3.8920 | 88450 | 0.8273 | - |
| 3.8925 | 88460 | 0.7951 | - |
| 3.8929 | 88470 | 0.7851 | - |
| 3.8933 | 88480 | 0.7912 | - |
| 3.8938 | 88490 | 0.7876 | - |
| 3.8942 | 88500 | 0.7571 | - |
| 3.8947 | 88510 | 0.7685 | - |
| 3.8951 | 88520 | 0.776 | - |
| 3.8955 | 88530 | 0.7541 | - |
| 3.8960 | 88540 | 0.7674 | - |
| 3.8964 | 88550 | 0.7889 | - |
| 3.8969 | 88560 | 0.7418 | - |
| 3.8973 | 88570 | 0.7905 | - |
| 3.8977 | 88580 | 0.7542 | - |
| 3.8982 | 88590 | 0.7836 | - |
| 3.8986 | 88600 | 0.7842 | - |
| 3.8991 | 88610 | 0.7752 | - |
| 3.8995 | 88620 | 0.7529 | - |
| 3.8999 | 88630 | 0.7872 | - |
| 3.9004 | 88640 | 0.7861 | - |
| 3.9008 | 88650 | 0.7702 | - |
| 3.9013 | 88660 | 0.7873 | - |
| 3.9017 | 88670 | 0.7842 | - |
| 3.9021 | 88680 | 0.7587 | - |
| 3.9024 | 88686 | - | 1.3736 |
| 3.9026 | 88690 | 0.7754 | - |
| 3.9030 | 88700 | 0.7725 | - |
| 3.9035 | 88710 | 0.782 | - |
| 3.9039 | 88720 | 0.7949 | - |
| 3.9043 | 88730 | 0.789 | - |
| 3.9048 | 88740 | 0.7999 | - |
| 3.9052 | 88750 | 0.7545 | - |
| 3.9057 | 88760 | 0.7598 | - |
| 3.9061 | 88770 | 0.7892 | - |
| 3.9065 | 88780 | 0.7725 | - |
| 3.9070 | 88790 | 0.7908 | - |
| 3.9074 | 88800 | 0.7767 | - |
| 3.9079 | 88810 | 0.7878 | - |
| 3.9083 | 88820 | 0.7746 | - |
| 3.9087 | 88830 | 0.7948 | - |
| 3.9092 | 88840 | 0.7517 | - |
| 3.9096 | 88850 | 0.7705 | - |
| 3.9101 | 88860 | 0.7574 | - |
| 3.9105 | 88870 | 0.7639 | - |
| 3.9109 | 88880 | 0.7652 | - |
| 3.9114 | 88890 | 0.7907 | - |
| 3.9118 | 88900 | 0.8226 | - |
| 3.9123 | 88910 | 0.8013 | - |
| 3.9127 | 88920 | 0.7866 | - |
| 3.9131 | 88930 | 0.803 | - |
| 3.9136 | 88940 | 0.7602 | - |
| 3.9140 | 88950 | 0.811 | - |
| 3.9145 | 88960 | 0.7973 | - |
| 3.9149 | 88970 | 0.8115 | - |
| 3.9153 | 88980 | 0.7703 | - |
| 3.9158 | 88990 | 0.7983 | - |
| 3.9162 | 89000 | 0.7438 | - |
| 3.9167 | 89010 | 0.7753 | - |
| 3.9171 | 89020 | 0.7661 | - |
| 3.9175 | 89030 | 0.8212 | - |
| 3.9180 | 89040 | 0.7657 | - |
| 3.9184 | 89050 | 0.7927 | - |
| 3.9189 | 89060 | 0.7721 | - |
| 3.9193 | 89070 | 0.7614 | - |
| 3.9197 | 89080 | 0.7714 | - |
| 3.9202 | 89090 | 0.7713 | - |
| 3.9206 | 89100 | 0.7562 | - |
| 3.9211 | 89110 | 0.7681 | - |
| 3.9215 | 89120 | 0.7796 | - |
| 3.9219 | 89130 | 0.7815 | - |
| 3.9224 | 89140 | 0.7925 | - |
| 3.9228 | 89150 | 0.7769 | - |
| 3.9233 | 89160 | 0.7678 | - |
| 3.9237 | 89170 | 0.7746 | - |
| 3.9241 | 89180 | 0.7751 | - |
| 3.9246 | 89190 | 0.7754 | - |
| 3.9250 | 89200 | 0.7885 | - |
| 3.9255 | 89210 | 0.7845 | - |
| 3.9259 | 89220 | 0.7915 | - |
| 3.9263 | 89230 | 0.7905 | - |
| 3.9268 | 89240 | 0.7691 | - |
| 3.9272 | 89250 | 0.7771 | - |
| 3.9277 | 89260 | 0.78 | - |
| 3.9281 | 89270 | 0.7836 | - |
| 3.9285 | 89280 | 0.7642 | - |
| 3.9290 | 89290 | 0.7635 | - |
| 3.9294 | 89300 | 0.8017 | - |
| 3.9299 | 89310 | 0.7882 | - |
| 3.9303 | 89320 | 0.7512 | - |
| 3.9307 | 89330 | 0.8121 | - |
| 3.9312 | 89340 | 0.7611 | - |
| 3.9316 | 89350 | 0.7623 | - |
| 3.9321 | 89360 | 0.7398 | - |
| 3.9325 | 89370 | 0.7236 | - |
| 3.9329 | 89380 | 0.7471 | - |
| 3.9334 | 89390 | 0.7787 | - |
| 3.9338 | 89400 | 0.7444 | - |
| 3.9343 | 89410 | 0.7627 | - |
| 3.9347 | 89420 | 0.7807 | - |
| 3.9351 | 89430 | 0.7645 | - |
| 3.9356 | 89440 | 0.8142 | - |
| 3.9360 | 89450 | 0.7954 | - |
| 3.9365 | 89460 | 0.7809 | - |
| 3.9369 | 89470 | 0.7742 | - |
| 3.9373 | 89480 | 0.7499 | - |
| 3.9378 | 89490 | 0.8022 | - |
| 3.9382 | 89500 | 0.7404 | - |
| 3.9387 | 89510 | 0.769 | - |
| 3.9391 | 89520 | 0.7756 | - |
| 3.9395 | 89530 | 0.7638 | - |
| 3.9400 | 89540 | 0.7987 | - |
| 3.9404 | 89550 | 0.7741 | - |
| 3.9409 | 89560 | 0.7653 | - |
| 3.9413 | 89570 | 0.7486 | - |
| 3.9417 | 89580 | 0.7654 | - |
| 3.9422 | 89590 | 0.7601 | - |
| 3.9426 | 89600 | 0.7853 | - |
| 3.9431 | 89610 | 0.7924 | - |
| 3.9435 | 89620 | 0.7453 | - |
| 3.9439 | 89630 | 0.8432 | - |
| 3.9444 | 89640 | 0.7963 | - |
| 3.9448 | 89650 | 0.779 | - |
| 3.9453 | 89660 | 0.7961 | - |
| 3.9457 | 89670 | 0.7914 | - |
| 3.9461 | 89680 | 0.7513 | - |
| 3.9466 | 89690 | 0.7311 | - |
| 3.9470 | 89700 | 0.7603 | - |
| 3.9475 | 89710 | 0.7265 | - |
| 3.9479 | 89720 | 0.7559 | - |
| 3.9483 | 89730 | 0.7738 | - |
| 3.9488 | 89740 | 0.7767 | - |
| 3.9492 | 89750 | 0.7433 | - |
| 3.9497 | 89760 | 0.7684 | - |
| 3.9501 | 89770 | 0.7509 | - |
| 3.9505 | 89780 | 0.7949 | - |
| 3.9510 | 89790 | 0.7855 | - |
| 3.9514 | 89800 | 0.7307 | - |
| 3.9519 | 89810 | 0.778 | - |
| 3.9523 | 89820 | 0.7729 | - |
| 3.9524 | 89823 | - | 1.3721 |
| 3.9527 | 89830 | 0.7925 | - |
| 3.9532 | 89840 | 0.7671 | - |
| 3.9536 | 89850 | 0.7687 | - |
| 3.9541 | 89860 | 0.7456 | - |
| 3.9545 | 89870 | 0.7564 | - |
| 3.9549 | 89880 | 0.7916 | - |
| 3.9554 | 89890 | 0.799 | - |
| 3.9558 | 89900 | 0.7369 | - |
| 3.9563 | 89910 | 0.7991 | - |
| 3.9567 | 89920 | 0.7963 | - |
| 3.9571 | 89930 | 0.7733 | - |
| 3.9576 | 89940 | 0.7516 | - |
| 3.9580 | 89950 | 0.7863 | - |
| 3.9585 | 89960 | 0.7574 | - |
| 3.9589 | 89970 | 0.7366 | - |
| 3.9593 | 89980 | 0.7836 | - |
| 3.9598 | 89990 | 0.7849 | - |
| 3.9602 | 90000 | 0.7659 | - |
| 3.9607 | 90010 | 0.7795 | - |
| 3.9611 | 90020 | 0.757 | - |
| 3.9615 | 90030 | 0.7421 | - |
| 3.9620 | 90040 | 0.7683 | - |
| 3.9624 | 90050 | 0.769 | - |
| 3.9629 | 90060 | 0.746 | - |
| 3.9633 | 90070 | 0.7586 | - |
| 3.9637 | 90080 | 0.7612 | - |
| 3.9642 | 90090 | 0.7679 | - |
| 3.9646 | 90100 | 0.785 | - |
| 3.9651 | 90110 | 0.7842 | - |
| 3.9655 | 90120 | 0.7742 | - |
| 3.9659 | 90130 | 0.7968 | - |
| 3.9664 | 90140 | 0.7685 | - |
| 3.9668 | 90150 | 0.8331 | - |
| 3.9673 | 90160 | 0.7721 | - |
| 3.9677 | 90170 | 0.7376 | - |
| 3.9681 | 90180 | 0.7678 | - |
| 3.9686 | 90190 | 0.7908 | - |
| 3.9690 | 90200 | 0.7967 | - |
| 3.9695 | 90210 | 0.7881 | - |
| 3.9699 | 90220 | 0.8033 | - |
| 3.9703 | 90230 | 0.7881 | - |
| 3.9708 | 90240 | 0.8083 | - |
| 3.9712 | 90250 | 0.7541 | - |
| 3.9717 | 90260 | 0.7629 | - |
| 3.9721 | 90270 | 0.7266 | - |
| 3.9725 | 90280 | 0.7707 | - |
| 3.9730 | 90290 | 0.7793 | - |
| 3.9734 | 90300 | 0.7578 | - |
| 3.9739 | 90310 | 0.7601 | - |
| 3.9743 | 90320 | 0.7594 | - |
| 3.9747 | 90330 | 0.7494 | - |
| 3.9752 | 90340 | 0.7707 | - |
| 3.9756 | 90350 | 0.764 | - |
| 3.9761 | 90360 | 0.7343 | - |
| 3.9765 | 90370 | 0.7442 | - |
| 3.9769 | 90380 | 0.7796 | - |
| 3.9774 | 90390 | 0.7492 | - |
| 3.9778 | 90400 | 0.7823 | - |
| 3.9783 | 90410 | 0.7373 | - |
| 3.9787 | 90420 | 0.7551 | - |
| 3.9791 | 90430 | 0.773 | - |
| 3.9796 | 90440 | 0.7638 | - |
| 3.9800 | 90450 | 0.7756 | - |
| 3.9805 | 90460 | 0.7859 | - |
| 3.9809 | 90470 | 0.7476 | - |
| 3.9813 | 90480 | 0.7593 | - |
| 3.9818 | 90490 | 0.7649 | - |
| 3.9822 | 90500 | 0.7793 | - |
| 3.9827 | 90510 | 0.7791 | - |
| 3.9831 | 90520 | 0.7685 | - |
| 3.9835 | 90530 | 0.7781 | - |
| 3.9840 | 90540 | 0.7586 | - |
| 3.9844 | 90550 | 0.7656 | - |
| 3.9849 | 90560 | 0.7429 | - |
| 3.9853 | 90570 | 0.7742 | - |
| 3.9857 | 90580 | 0.7622 | - |
| 3.9862 | 90590 | 0.8129 | - |
| 3.9866 | 90600 | 0.7669 | - |
| 3.9871 | 90610 | 0.7551 | - |
| 3.9875 | 90620 | 0.7408 | - |
| 3.9879 | 90630 | 0.7233 | - |
| 3.9884 | 90640 | 0.7806 | - |
| 3.9888 | 90650 | 0.7533 | - |
| 3.9893 | 90660 | 0.7563 | - |
| 3.9897 | 90670 | 0.7832 | - |
| 3.9901 | 90680 | 0.7705 | - |
| 3.9906 | 90690 | 0.733 | - |
| 3.9910 | 90700 | 0.7943 | - |
| 3.9915 | 90710 | 0.7746 | - |
| 3.9919 | 90720 | 0.7749 | - |
| 3.9923 | 90730 | 0.7729 | - |
| 3.9928 | 90740 | 0.8105 | - |
| 3.9932 | 90750 | 0.7623 | - |
| 3.9937 | 90760 | 0.7589 | - |
| 3.9941 | 90770 | 0.7469 | - |
| 3.9945 | 90780 | 0.7746 | - |
| 3.9950 | 90790 | 0.7792 | - |
| 3.9954 | 90800 | 0.7601 | - |
| 3.9959 | 90810 | 0.7741 | - |
| 3.9963 | 90820 | 0.749 | - |
| 3.9967 | 90830 | 0.7543 | - |
| 3.9972 | 90840 | 0.7616 | - |
| 3.9976 | 90850 | 0.7909 | - |
| 3.9981 | 90860 | 0.7943 | - |
| 3.9985 | 90870 | 0.7782 | - |
| 3.9989 | 90880 | 0.7922 | - |
| 3.9994 | 90890 | 0.7378 | - |
| 3.9998 | 90900 | 0.7588 | - |
| 4.0003 | 90910 | 0.7412 | - |
| 4.0007 | 90920 | 0.7095 | - |
| 4.0011 | 90930 | 0.7529 | - |
| 4.0016 | 90940 | 0.7479 | - |
| 4.0020 | 90950 | 0.684 | - |
| 4.0025 | 90960 | 0.7176 | 1.3732 |
| 4.0029 | 90970 | 0.7498 | - |
| 4.0033 | 90980 | 0.7136 | - |
| 4.0038 | 90990 | 0.7413 | - |
| 4.0042 | 91000 | 0.7116 | - |
| 4.0047 | 91010 | 0.7631 | - |
| 4.0051 | 91020 | 0.7355 | - |
| 4.0055 | 91030 | 0.7153 | - |
| 4.0060 | 91040 | 0.7402 | - |
| 4.0064 | 91050 | 0.7337 | - |
| 4.0069 | 91060 | 0.7554 | - |
| 4.0073 | 91070 | 0.7 | - |
| 4.0077 | 91080 | 0.7279 | - |
| 4.0082 | 91090 | 0.7155 | - |
| 4.0086 | 91100 | 0.7102 | - |
| 4.0091 | 91110 | 0.7222 | - |
| 4.0095 | 91120 | 0.7212 | - |
| 4.0099 | 91130 | 0.6955 | - |
| 4.0104 | 91140 | 0.7561 | - |
| 4.0108 | 91150 | 0.7589 | - |
| 4.0113 | 91160 | 0.7375 | - |
| 4.0117 | 91170 | 0.7492 | - |
| 4.0121 | 91180 | 0.7107 | - |
| 4.0126 | 91190 | 0.7468 | - |
| 4.0130 | 91200 | 0.6917 | - |
| 4.0135 | 91210 | 0.7309 | - |
| 4.0139 | 91220 | 0.7352 | - |
| 4.0143 | 91230 | 0.7649 | - |
| 4.0148 | 91240 | 0.76 | - |
| 4.0152 | 91250 | 0.7358 | - |
| 4.0157 | 91260 | 0.7399 | - |
| 4.0161 | 91270 | 0.7468 | - |
| 4.0165 | 91280 | 0.7425 | - |
| 4.0170 | 91290 | 0.7233 | - |
| 4.0174 | 91300 | 0.7091 | - |
| 4.0179 | 91310 | 0.7269 | - |
| 4.0183 | 91320 | 0.7632 | - |
| 4.0187 | 91330 | 0.7347 | - |
| 4.0192 | 91340 | 0.7249 | - |
| 4.0196 | 91350 | 0.7682 | - |
| 4.0201 | 91360 | 0.7129 | - |
| 4.0205 | 91370 | 0.7643 | - |
| 4.0209 | 91380 | 0.7521 | - |
| 4.0214 | 91390 | 0.7331 | - |
| 4.0218 | 91400 | 0.7272 | - |
| 4.0223 | 91410 | 0.7202 | - |
| 4.0227 | 91420 | 0.7618 | - |
| 4.0231 | 91430 | 0.7426 | - |
| 4.0236 | 91440 | 0.7275 | - |
| 4.0240 | 91450 | 0.7385 | - |
| 4.0245 | 91460 | 0.7283 | - |
| 4.0249 | 91470 | 0.7108 | - |
| 4.0253 | 91480 | 0.7499 | - |
| 4.0258 | 91490 | 0.7121 | - |
| 4.0262 | 91500 | 0.7028 | - |
| 4.0267 | 91510 | 0.7346 | - |
| 4.0271 | 91520 | 0.7211 | - |
| 4.0275 | 91530 | 0.7221 | - |
| 4.0280 | 91540 | 0.7395 | - |
| 4.0284 | 91550 | 0.786 | - |
| 4.0289 | 91560 | 0.7499 | - |
| 4.0293 | 91570 | 0.7471 | - |
| 4.0297 | 91580 | 0.7285 | - |
| 4.0302 | 91590 | 0.7355 | - |
| 4.0306 | 91600 | 0.7993 | - |
| 4.0311 | 91610 | 0.7168 | - |
| 4.0315 | 91620 | 0.7317 | - |
| 4.0319 | 91630 | 0.7165 | - |
| 4.0324 | 91640 | 0.7233 | - |
| 4.0328 | 91650 | 0.7232 | - |
| 4.0333 | 91660 | 0.7432 | - |
| 4.0337 | 91670 | 0.6996 | - |
| 4.0341 | 91680 | 0.7614 | - |
| 4.0346 | 91690 | 0.7071 | - |
| 4.0350 | 91700 | 0.7228 | - |
| 4.0355 | 91710 | 0.7171 | - |
| 4.0359 | 91720 | 0.7563 | - |
| 4.0363 | 91730 | 0.7161 | - |
| 4.0368 | 91740 | 0.7092 | - |
| 4.0372 | 91750 | 0.7259 | - |
| 4.0377 | 91760 | 0.7543 | - |
| 4.0381 | 91770 | 0.7639 | - |
| 4.0385 | 91780 | 0.7305 | - |
| 4.0390 | 91790 | 0.7415 | - |
| 4.0394 | 91800 | 0.7217 | - |
| 4.0399 | 91810 | 0.7375 | - |
| 4.0403 | 91820 | 0.7706 | - |
| 4.0407 | 91830 | 0.7198 | - |
| 4.0412 | 91840 | 0.7748 | - |
| 4.0416 | 91850 | 0.7139 | - |
| 4.0421 | 91860 | 0.76 | - |
| 4.0425 | 91870 | 0.7333 | - |
| 4.0429 | 91880 | 0.7108 | - |
| 4.0434 | 91890 | 0.7361 | - |
| 4.0438 | 91900 | 0.7482 | - |
| 4.0443 | 91910 | 0.7233 | - |
| 4.0447 | 91920 | 0.7118 | - |
| 4.0451 | 91930 | 0.756 | - |
| 4.0456 | 91940 | 0.752 | - |
| 4.0460 | 91950 | 0.7261 | - |
| 4.0465 | 91960 | 0.7572 | - |
| 4.0469 | 91970 | 0.7069 | - |
| 4.0473 | 91980 | 0.7235 | - |
| 4.0478 | 91990 | 0.7619 | - |
| 4.0482 | 92000 | 0.7336 | - |
| 4.0487 | 92010 | 0.7124 | - |
| 4.0491 | 92020 | 0.7194 | - |
| 4.0495 | 92030 | 0.7325 | - |
| 4.0500 | 92040 | 0.7212 | - |
| 4.0504 | 92050 | 0.7259 | - |
| 4.0509 | 92060 | 0.7245 | - |
| 4.0513 | 92070 | 0.7513 | - |
| 4.0517 | 92080 | 0.7352 | - |
| 4.0522 | 92090 | 0.7108 | - |
| 4.0525 | 92097 | - | 1.3787 |
| 4.0526 | 92100 | 0.7395 | - |
| 4.0531 | 92110 | 0.7358 | - |
| 4.0535 | 92120 | 0.7172 | - |
| 4.0539 | 92130 | 0.7544 | - |
| 4.0544 | 92140 | 0.7457 | - |
| 4.0548 | 92150 | 0.7652 | - |
| 4.0553 | 92160 | 0.7613 | - |
| 4.0557 | 92170 | 0.7312 | - |
| 4.0561 | 92180 | 0.7239 | - |
| 4.0566 | 92190 | 0.7546 | - |
| 4.0570 | 92200 | 0.7016 | - |
| 4.0575 | 92210 | 0.7382 | - |
| 4.0579 | 92220 | 0.7203 | - |
| 4.0583 | 92230 | 0.7115 | - |
| 4.0588 | 92240 | 0.7433 | - |
| 4.0592 | 92250 | 0.7334 | - |
| 4.0597 | 92260 | 0.7176 | - |
| 4.0601 | 92270 | 0.7472 | - |
| 4.0605 | 92280 | 0.7205 | - |
| 4.0610 | 92290 | 0.7249 | - |
| 4.0614 | 92300 | 0.7258 | - |
| 4.0619 | 92310 | 0.7381 | - |
| 4.0623 | 92320 | 0.7114 | - |
| 4.0627 | 92330 | 0.7021 | - |
| 4.0632 | 92340 | 0.7165 | - |
| 4.0636 | 92350 | 0.7377 | - |
| 4.0641 | 92360 | 0.7809 | - |
| 4.0645 | 92370 | 0.7341 | - |
| 4.0649 | 92380 | 0.7421 | - |
| 4.0654 | 92390 | 0.7276 | - |
| 4.0658 | 92400 | 0.7284 | - |
| 4.0663 | 92410 | 0.7524 | - |
| 4.0667 | 92420 | 0.7201 | - |
| 4.0671 | 92430 | 0.7276 | - |
| 4.0676 | 92440 | 0.7508 | - |
| 4.0680 | 92450 | 0.75 | - |
| 4.0685 | 92460 | 0.746 | - |
| 4.0689 | 92470 | 0.7169 | - |
| 4.0693 | 92480 | 0.7664 | - |
| 4.0698 | 92490 | 0.7394 | - |
| 4.0702 | 92500 | 0.7386 | - |
| 4.0707 | 92510 | 0.7214 | - |
| 4.0711 | 92520 | 0.7361 | - |
| 4.0715 | 92530 | 0.7105 | - |
| 4.0720 | 92540 | 0.7114 | - |
| 4.0724 | 92550 | 0.7143 | - |
| 4.0729 | 92560 | 0.7228 | - |
| 4.0733 | 92570 | 0.7049 | - |
| 4.0737 | 92580 | 0.7153 | - |
| 4.0742 | 92590 | 0.7136 | - |
| 4.0746 | 92600 | 0.7467 | - |
| 4.0751 | 92610 | 0.7092 | - |
| 4.0755 | 92620 | 0.7247 | - |
| 4.0759 | 92630 | 0.7497 | - |
| 4.0764 | 92640 | 0.7278 | - |
| 4.0768 | 92650 | 0.6955 | - |
| 4.0773 | 92660 | 0.7283 | - |
| 4.0777 | 92670 | 0.7235 | - |
| 4.0781 | 92680 | 0.7434 | - |
| 4.0786 | 92690 | 0.7193 | - |
| 4.0790 | 92700 | 0.7542 | - |
| 4.0795 | 92710 | 0.7201 | - |
| 4.0799 | 92720 | 0.7467 | - |
| 4.0803 | 92730 | 0.7529 | - |
| 4.0808 | 92740 | 0.7238 | - |
| 4.0812 | 92750 | 0.7393 | - |
| 4.0817 | 92760 | 0.7246 | - |
| 4.0821 | 92770 | 0.7312 | - |
| 4.0825 | 92780 | 0.6909 | - |
| 4.0830 | 92790 | 0.74 | - |
| 4.0834 | 92800 | 0.7238 | - |
| 4.0839 | 92810 | 0.7409 | - |
| 4.0843 | 92820 | 0.7132 | - |
| 4.0847 | 92830 | 0.7126 | - |
| 4.0852 | 92840 | 0.7266 | - |
| 4.0856 | 92850 | 0.7199 | - |
| 4.0861 | 92860 | 0.7099 | - |
| 4.0865 | 92870 | 0.7089 | - |
| 4.0869 | 92880 | 0.7606 | - |
| 4.0874 | 92890 | 0.7229 | - |
| 4.0878 | 92900 | 0.7367 | - |
| 4.0883 | 92910 | 0.7138 | - |
| 4.0887 | 92920 | 0.7338 | - |
| 4.0891 | 92930 | 0.6956 | - |
| 4.0896 | 92940 | 0.7192 | - |
| 4.0900 | 92950 | 0.7245 | - |
| 4.0905 | 92960 | 0.7252 | - |
| 4.0909 | 92970 | 0.7151 | - |
| 4.0913 | 92980 | 0.7655 | - |
| 4.0918 | 92990 | 0.6995 | - |
| 4.0922 | 93000 | 0.7407 | - |
| 4.0927 | 93010 | 0.7496 | - |
| 4.0931 | 93020 | 0.7503 | - |
| 4.0935 | 93030 | 0.7384 | - |
| 4.0940 | 93040 | 0.7187 | - |
| 4.0944 | 93050 | 0.7139 | - |
| 4.0949 | 93060 | 0.734 | - |
| 4.0953 | 93070 | 0.7261 | - |
| 4.0957 | 93080 | 0.7434 | - |
| 4.0962 | 93090 | 0.7121 | - |
| 4.0966 | 93100 | 0.7349 | - |
| 4.0971 | 93110 | 0.6938 | - |
| 4.0975 | 93120 | 0.7258 | - |
| 4.0979 | 93130 | 0.73 | - |
| 4.0984 | 93140 | 0.7475 | - |
| 4.0988 | 93150 | 0.7385 | - |
| 4.0993 | 93160 | 0.702 | - |
| 4.0997 | 93170 | 0.7402 | - |
| 4.1001 | 93180 | 0.7484 | - |
| 4.1006 | 93190 | 0.7358 | - |
| 4.1010 | 93200 | 0.7438 | - |
| 4.1015 | 93210 | 0.7245 | - |
| 4.1019 | 93220 | 0.7143 | - |
| 4.1023 | 93230 | 0.7511 | - |
| 4.1025 | 93234 | - | 1.3846 |
| 4.1028 | 93240 | 0.7182 | - |
| 4.1032 | 93250 | 0.7283 | - |
| 4.1037 | 93260 | 0.709 | - |
| 4.1041 | 93270 | 0.731 | - |
| 4.1045 | 93280 | 0.7731 | - |
| 4.1050 | 93290 | 0.7279 | - |
| 4.1054 | 93300 | 0.7301 | - |
| 4.1059 | 93310 | 0.7075 | - |
| 4.1063 | 93320 | 0.7523 | - |
| 4.1067 | 93330 | 0.7127 | - |
| 4.1072 | 93340 | 0.71 | - |
| 4.1076 | 93350 | 0.691 | - |
| 4.1081 | 93360 | 0.7472 | - |
| 4.1085 | 93370 | 0.7098 | - |
| 4.1090 | 93380 | 0.7401 | - |
| 4.1094 | 93390 | 0.6932 | - |
| 4.1098 | 93400 | 0.6886 | - |
| 4.1103 | 93410 | 0.7322 | - |
| 4.1107 | 93420 | 0.7075 | - |
| 4.1112 | 93430 | 0.7324 | - |
| 4.1116 | 93440 | 0.7308 | - |
| 4.1120 | 93450 | 0.7138 | - |
| 4.1125 | 93460 | 0.7467 | - |
| 4.1129 | 93470 | 0.7006 | - |
| 4.1134 | 93480 | 0.7013 | - |
| 4.1138 | 93490 | 0.7462 | - |
| 4.1142 | 93500 | 0.7142 | - |
| 4.1147 | 93510 | 0.6966 | - |
| 4.1151 | 93520 | 0.7529 | - |
| 4.1156 | 93530 | 0.7033 | - |
| 4.1160 | 93540 | 0.6963 | - |
| 4.1164 | 93550 | 0.7371 | - |
| 4.1169 | 93560 | 0.7513 | - |
| 4.1173 | 93570 | 0.737 | - |
| 4.1178 | 93580 | 0.6994 | - |
| 4.1182 | 93590 | 0.755 | - |
| 4.1186 | 93600 | 0.7146 | - |
| 4.1191 | 93610 | 0.7533 | - |
| 4.1195 | 93620 | 0.7328 | - |
| 4.1200 | 93630 | 0.7206 | - |
| 4.1204 | 93640 | 0.679 | - |
| 4.1208 | 93650 | 0.7252 | - |
| 4.1213 | 93660 | 0.7065 | - |
| 4.1217 | 93670 | 0.723 | - |
| 4.1222 | 93680 | 0.7342 | - |
| 4.1226 | 93690 | 0.7421 | - |
| 4.1230 | 93700 | 0.716 | - |
| 4.1235 | 93710 | 0.7535 | - |
| 4.1239 | 93720 | 0.7212 | - |
| 4.1244 | 93730 | 0.7205 | - |
| 4.1248 | 93740 | 0.7251 | - |
| 4.1252 | 93750 | 0.7314 | - |
| 4.1257 | 93760 | 0.7242 | - |
| 4.1261 | 93770 | 0.7255 | - |
| 4.1266 | 93780 | 0.7288 | - |
| 4.1270 | 93790 | 0.7289 | - |
| 4.1274 | 93800 | 0.731 | - |
| 4.1279 | 93810 | 0.7111 | - |
| 4.1283 | 93820 | 0.7112 | - |
| 4.1288 | 93830 | 0.7162 | - |
| 4.1292 | 93840 | 0.7369 | - |
| 4.1296 | 93850 | 0.7116 | - |
| 4.1301 | 93860 | 0.702 | - |
| 4.1305 | 93870 | 0.7091 | - |
| 4.1310 | 93880 | 0.7151 | - |
| 4.1314 | 93890 | 0.7248 | - |
| 4.1318 | 93900 | 0.716 | - |
| 4.1323 | 93910 | 0.7234 | - |
| 4.1327 | 93920 | 0.7355 | - |
| 4.1332 | 93930 | 0.7243 | - |
| 4.1336 | 93940 | 0.7103 | - |
| 4.1340 | 93950 | 0.7303 | - |
| 4.1345 | 93960 | 0.7263 | - |
| 4.1349 | 93970 | 0.7556 | - |
| 4.1354 | 93980 | 0.6941 | - |
| 4.1358 | 93990 | 0.7861 | - |
| 4.1362 | 94000 | 0.723 | - |
| 4.1367 | 94010 | 0.6912 | - |
| 4.1371 | 94020 | 0.7133 | - |
| 4.1376 | 94030 | 0.739 | - |
| 4.1380 | 94040 | 0.7169 | - |
| 4.1384 | 94050 | 0.7051 | - |
| 4.1389 | 94060 | 0.7333 | - |
| 4.1393 | 94070 | 0.7149 | - |
| 4.1398 | 94080 | 0.7302 | - |
| 4.1402 | 94090 | 0.7144 | - |
| 4.1406 | 94100 | 0.715 | - |
| 4.1411 | 94110 | 0.724 | - |
| 4.1415 | 94120 | 0.6947 | - |
| 4.1420 | 94130 | 0.7473 | - |
| 4.1424 | 94140 | 0.7349 | - |
| 4.1428 | 94150 | 0.7437 | - |
| 4.1433 | 94160 | 0.7205 | - |
| 4.1437 | 94170 | 0.7478 | - |
| 4.1442 | 94180 | 0.7113 | - |
| 4.1446 | 94190 | 0.717 | - |
| 4.1450 | 94200 | 0.6944 | - |
| 4.1455 | 94210 | 0.7857 | - |
| 4.1459 | 94220 | 0.7253 | - |
| 4.1464 | 94230 | 0.7088 | - |
| 4.1468 | 94240 | 0.7131 | - |
| 4.1472 | 94250 | 0.7289 | - |
| 4.1477 | 94260 | 0.7267 | - |
| 4.1481 | 94270 | 0.6987 | - |
| 4.1486 | 94280 | 0.7112 | - |
| 4.1490 | 94290 | 0.7398 | - |
| 4.1494 | 94300 | 0.73 | - |
| 4.1499 | 94310 | 0.7125 | - |
| 4.1503 | 94320 | 0.7125 | - |
| 4.1508 | 94330 | 0.7479 | - |
| 4.1512 | 94340 | 0.7242 | - |
| 4.1516 | 94350 | 0.7337 | - |
| 4.1521 | 94360 | 0.7277 | - |
| 4.1525 | 94370 | 0.7356 | - |
| 4.1526 | 94371 | - | 1.3867 |
| 4.1530 | 94380 | 0.7217 | - |
| 4.1534 | 94390 | 0.7397 | - |
| 4.1538 | 94400 | 0.7226 | - |
| 4.1543 | 94410 | 0.7177 | - |
| 4.1547 | 94420 | 0.7458 | - |
| 4.1552 | 94430 | 0.733 | - |
| 4.1556 | 94440 | 0.7142 | - |
| 4.1560 | 94450 | 0.7087 | - |
| 4.1565 | 94460 | 0.7622 | - |
| 4.1569 | 94470 | 0.7333 | - |
| 4.1574 | 94480 | 0.7226 | - |
| 4.1578 | 94490 | 0.7099 | - |
| 4.1582 | 94500 | 0.7337 | - |
| 4.1587 | 94510 | 0.7446 | - |
| 4.1591 | 94520 | 0.7167 | - |
| 4.1596 | 94530 | 0.7215 | - |
| 4.1600 | 94540 | 0.7587 | - |
| 4.1604 | 94550 | 0.7766 | - |
| 4.1609 | 94560 | 0.6933 | - |
| 4.1613 | 94570 | 0.726 | - |
| 4.1618 | 94580 | 0.7201 | - |
| 4.1622 | 94590 | 0.7436 | - |
| 4.1626 | 94600 | 0.7386 | - |
| 4.1631 | 94610 | 0.7297 | - |
| 4.1635 | 94620 | 0.7421 | - |
| 4.1640 | 94630 | 0.7415 | - |
| 4.1644 | 94640 | 0.7475 | - |
| 4.1648 | 94650 | 0.7739 | - |
| 4.1653 | 94660 | 0.7235 | - |
| 4.1657 | 94670 | 0.7179 | - |
| 4.1662 | 94680 | 0.7601 | - |
| 4.1666 | 94690 | 0.7424 | - |
| 4.1670 | 94700 | 0.7274 | - |
| 4.1675 | 94710 | 0.7158 | - |
| 4.1679 | 94720 | 0.7554 | - |
| 4.1684 | 94730 | 0.6958 | - |
| 4.1688 | 94740 | 0.7416 | - |
| 4.1692 | 94750 | 0.7399 | - |
| 4.1697 | 94760 | 0.7174 | - |
| 4.1701 | 94770 | 0.7209 | - |
| 4.1706 | 94780 | 0.7583 | - |
| 4.1710 | 94790 | 0.6807 | - |
| 4.1714 | 94800 | 0.7592 | - |
| 4.1719 | 94810 | 0.7406 | - |
| 4.1723 | 94820 | 0.7059 | - |
| 4.1728 | 94830 | 0.7295 | - |
| 4.1732 | 94840 | 0.7008 | - |
| 4.1736 | 94850 | 0.7505 | - |
| 4.1741 | 94860 | 0.7309 | - |
| 4.1745 | 94870 | 0.7418 | - |
| 4.1750 | 94880 | 0.7148 | - |
| 4.1754 | 94890 | 0.7436 | - |
| 4.1758 | 94900 | 0.7293 | - |
| 4.1763 | 94910 | 0.7665 | - |
| 4.1767 | 94920 | 0.7432 | - |
| 4.1772 | 94930 | 0.742 | - |
| 4.1776 | 94940 | 0.7156 | - |
| 4.1780 | 94950 | 0.7072 | - |
| 4.1785 | 94960 | 0.6984 | - |
| 4.1789 | 94970 | 0.7056 | - |
| 4.1794 | 94980 | 0.6933 | - |
| 4.1798 | 94990 | 0.7257 | - |
| 4.1802 | 95000 | 0.7405 | - |
| 4.1807 | 95010 | 0.7488 | - |
| 4.1811 | 95020 | 0.7267 | - |
| 4.1816 | 95030 | 0.7669 | - |
| 4.1820 | 95040 | 0.7265 | - |
| 4.1824 | 95050 | 0.7279 | - |
| 4.1829 | 95060 | 0.7159 | - |
| 4.1833 | 95070 | 0.7345 | - |
| 4.1838 | 95080 | 0.7318 | - |
| 4.1842 | 95090 | 0.7183 | - |
| 4.1846 | 95100 | 0.7144 | - |
| 4.1851 | 95110 | 0.7167 | - |
| 4.1855 | 95120 | 0.7079 | - |
| 4.1860 | 95130 | 0.7124 | - |
| 4.1864 | 95140 | 0.7407 | - |
| 4.1868 | 95150 | 0.713 | - |
| 4.1873 | 95160 | 0.7787 | - |
| 4.1877 | 95170 | 0.7211 | - |
| 4.1882 | 95180 | 0.7265 | - |
| 4.1886 | 95190 | 0.7626 | - |
| 4.1890 | 95200 | 0.7373 | - |
| 4.1895 | 95210 | 0.7089 | - |
| 4.1899 | 95220 | 0.7099 | - |
| 4.1904 | 95230 | 0.7202 | - |
| 4.1908 | 95240 | 0.7273 | - |
| 4.1912 | 95250 | 0.7356 | - |
| 4.1917 | 95260 | 0.7399 | - |
| 4.1921 | 95270 | 0.7094 | - |
| 4.1926 | 95280 | 0.7225 | - |
| 4.1930 | 95290 | 0.7488 | - |
| 4.1934 | 95300 | 0.7376 | - |
| 4.1939 | 95310 | 0.7066 | - |
| 4.1943 | 95320 | 0.7186 | - |
| 4.1948 | 95330 | 0.7314 | - |
| 4.1952 | 95340 | 0.6811 | - |
| 4.1956 | 95350 | 0.7029 | - |
| 4.1961 | 95360 | 0.703 | - |
| 4.1965 | 95370 | 0.7357 | - |
| 4.1970 | 95380 | 0.7521 | - |
| 4.1974 | 95390 | 0.7095 | - |
| 4.1978 | 95400 | 0.6912 | - |
| 4.1983 | 95410 | 0.7225 | - |
| 4.1987 | 95420 | 0.7286 | - |
| 4.1992 | 95430 | 0.7281 | - |
| 4.1996 | 95440 | 0.7167 | - |
| 4.2000 | 95450 | 0.6972 | - |
| 4.2005 | 95460 | 0.7351 | - |
| 4.2009 | 95470 | 0.7145 | - |
| 4.2014 | 95480 | 0.7174 | - |
| 4.2018 | 95490 | 0.7149 | - |
| 4.2022 | 95500 | 0.7323 | - |
| 4.2026 | 95508 | - | 1.3841 |
| 4.2027 | 95510 | 0.6633 | - |
| 4.2031 | 95520 | 0.7155 | - |
| 4.2036 | 95530 | 0.7232 | - |
| 4.2040 | 95540 | 0.7254 | - |
| 4.2044 | 95550 | 0.7343 | - |
| 4.2049 | 95560 | 0.7558 | - |
| 4.2053 | 95570 | 0.7587 | - |
| 4.2058 | 95580 | 0.6951 | - |
| 4.2062 | 95590 | 0.7554 | - |
| 4.2066 | 95600 | 0.6806 | - |
| 4.2071 | 95610 | 0.736 | - |
| 4.2075 | 95620 | 0.7204 | - |
| 4.2080 | 95630 | 0.7339 | - |
| 4.2084 | 95640 | 0.7352 | - |
| 4.2088 | 95650 | 0.7126 | - |
| 4.2093 | 95660 | 0.7451 | - |
| 4.2097 | 95670 | 0.7106 | - |
| 4.2102 | 95680 | 0.7354 | - |
| 4.2106 | 95690 | 0.7152 | - |
| 4.2110 | 95700 | 0.6703 | - |
| 4.2115 | 95710 | 0.7295 | - |
| 4.2119 | 95720 | 0.7297 | - |
| 4.2124 | 95730 | 0.7381 | - |
| 4.2128 | 95740 | 0.7382 | - |
| 4.2132 | 95750 | 0.7314 | - |
| 4.2137 | 95760 | 0.7168 | - |
| 4.2141 | 95770 | 0.7015 | - |
| 4.2146 | 95780 | 0.7606 | - |
| 4.2150 | 95790 | 0.7381 | - |
| 4.2154 | 95800 | 0.7411 | - |
| 4.2159 | 95810 | 0.746 | - |
| 4.2163 | 95820 | 0.7436 | - |
| 4.2168 | 95830 | 0.7071 | - |
| 4.2172 | 95840 | 0.7387 | - |
| 4.2176 | 95850 | 0.7398 | - |
| 4.2181 | 95860 | 0.7234 | - |
| 4.2185 | 95870 | 0.7382 | - |
| 4.2190 | 95880 | 0.7386 | - |
| 4.2194 | 95890 | 0.6831 | - |
| 4.2198 | 95900 | 0.719 | - |
| 4.2203 | 95910 | 0.712 | - |
| 4.2207 | 95920 | 0.7437 | - |
| 4.2212 | 95930 | 0.7297 | - |
| 4.2216 | 95940 | 0.7169 | - |
| 4.2220 | 95950 | 0.7234 | - |
| 4.2225 | 95960 | 0.7484 | - |
| 4.2229 | 95970 | 0.741 | - |
| 4.2234 | 95980 | 0.7388 | - |
| 4.2238 | 95990 | 0.7366 | - |
| 4.2242 | 96000 | 0.7239 | - |
| 4.2247 | 96010 | 0.7001 | - |
| 4.2251 | 96020 | 0.7328 | - |
| 4.2256 | 96030 | 0.7454 | - |
| 4.2260 | 96040 | 0.7264 | - |
| 4.2264 | 96050 | 0.7294 | - |
| 4.2269 | 96060 | 0.6976 | - |
| 4.2273 | 96070 | 0.7229 | - |
| 4.2278 | 96080 | 0.7159 | - |
| 4.2282 | 96090 | 0.7401 | - |
| 4.2286 | 96100 | 0.7301 | - |
| 4.2291 | 96110 | 0.7036 | - |
| 4.2295 | 96120 | 0.7431 | - |
| 4.2300 | 96130 | 0.6774 | - |
| 4.2304 | 96140 | 0.7376 | - |
| 4.2308 | 96150 | 0.7627 | - |
| 4.2313 | 96160 | 0.7385 | - |
| 4.2317 | 96170 | 0.7168 | - |
| 4.2322 | 96180 | 0.7455 | - |
| 4.2326 | 96190 | 0.7229 | - |
| 4.2330 | 96200 | 0.7357 | - |
| 4.2335 | 96210 | 0.7394 | - |
| 4.2339 | 96220 | 0.7302 | - |
| 4.2344 | 96230 | 0.7398 | - |
| 4.2348 | 96240 | 0.7319 | - |
| 4.2352 | 96250 | 0.7184 | - |
| 4.2357 | 96260 | 0.7325 | - |
| 4.2361 | 96270 | 0.7442 | - |
| 4.2366 | 96280 | 0.7118 | - |
| 4.2370 | 96290 | 0.7392 | - |
| 4.2374 | 96300 | 0.7481 | - |
| 4.2379 | 96310 | 0.7069 | - |
| 4.2383 | 96320 | 0.7148 | - |
| 4.2388 | 96330 | 0.7608 | - |
| 4.2392 | 96340 | 0.6928 | - |
| 4.2396 | 96350 | 0.6914 | - |
| 4.2401 | 96360 | 0.7409 | - |
| 4.2405 | 96370 | 0.7027 | - |
| 4.2410 | 96380 | 0.729 | - |
| 4.2414 | 96390 | 0.6987 | - |
| 4.2418 | 96400 | 0.7202 | - |
| 4.2423 | 96410 | 0.7249 | - |
| 4.2427 | 96420 | 0.7168 | - |
| 4.2432 | 96430 | 0.7623 | - |
| 4.2436 | 96440 | 0.7449 | - |
| 4.2440 | 96450 | 0.7129 | - |
| 4.2445 | 96460 | 0.7451 | - |
| 4.2449 | 96470 | 0.7124 | - |
| 4.2454 | 96480 | 0.7216 | - |
| 4.2458 | 96490 | 0.7445 | - |
| 4.2462 | 96500 | 0.7175 | - |
| 4.2467 | 96510 | 0.7208 | - |
| 4.2471 | 96520 | 0.7722 | - |
| 4.2476 | 96530 | 0.7249 | - |
| 4.2480 | 96540 | 0.7132 | - |
| 4.2484 | 96550 | 0.712 | - |
| 4.2489 | 96560 | 0.7118 | - |
| 4.2493 | 96570 | 0.7138 | - |
| 4.2498 | 96580 | 0.7291 | - |
| 4.2502 | 96590 | 0.7387 | - |
| 4.2506 | 96600 | 0.7284 | - |
| 4.2511 | 96610 | 0.7206 | - |
| 4.2515 | 96620 | 0.7176 | - |
| 4.2520 | 96630 | 0.754 | - |
| 4.2524 | 96640 | 0.7444 | - |
| 4.2526 | 96645 | - | 1.3810 |
| 4.2528 | 96650 | 0.709 | - |
| 4.2533 | 96660 | 0.709 | - |
| 4.2537 | 96670 | 0.6718 | - |
| 4.2542 | 96680 | 0.735 | - |
| 4.2546 | 96690 | 0.7268 | - |
| 4.2550 | 96700 | 0.7321 | - |
| 4.2555 | 96710 | 0.6903 | - |
| 4.2559 | 96720 | 0.7124 | - |
| 4.2564 | 96730 | 0.716 | - |
| 4.2568 | 96740 | 0.7687 | - |
| 4.2572 | 96750 | 0.7187 | - |
| 4.2577 | 96760 | 0.7152 | - |
| 4.2581 | 96770 | 0.7858 | - |
| 4.2586 | 96780 | 0.7166 | - |
| 4.2590 | 96790 | 0.7562 | - |
| 4.2594 | 96800 | 0.6945 | - |
| 4.2599 | 96810 | 0.7137 | - |
| 4.2603 | 96820 | 0.7115 | - |
| 4.2608 | 96830 | 0.7059 | - |
| 4.2612 | 96840 | 0.7107 | - |
| 4.2616 | 96850 | 0.6941 | - |
| 4.2621 | 96860 | 0.7072 | - |
| 4.2625 | 96870 | 0.7243 | - |
| 4.2630 | 96880 | 0.7298 | - |
| 4.2634 | 96890 | 0.6792 | - |
| 4.2638 | 96900 | 0.6909 | - |
| 4.2643 | 96910 | 0.7595 | - |
| 4.2647 | 96920 | 0.7185 | - |
| 4.2652 | 96930 | 0.7358 | - |
| 4.2656 | 96940 | 0.7229 | - |
| 4.2660 | 96950 | 0.7513 | - |
| 4.2665 | 96960 | 0.7412 | - |
| 4.2669 | 96970 | 0.7216 | - |
| 4.2674 | 96980 | 0.7517 | - |
| 4.2678 | 96990 | 0.7523 | - |
| 4.2682 | 97000 | 0.7247 | - |
| 4.2687 | 97010 | 0.7236 | - |
| 4.2691 | 97020 | 0.6905 | - |
| 4.2696 | 97030 | 0.727 | - |
| 4.2700 | 97040 | 0.7121 | - |
| 4.2704 | 97050 | 0.7007 | - |
| 4.2709 | 97060 | 0.7027 | - |
| 4.2713 | 97070 | 0.6878 | - |
| 4.2718 | 97080 | 0.7392 | - |
| 4.2722 | 97090 | 0.7161 | - |
| 4.2726 | 97100 | 0.7206 | - |
| 4.2731 | 97110 | 0.7303 | - |
| 4.2735 | 97120 | 0.733 | - |
| 4.2740 | 97130 | 0.7418 | - |
| 4.2744 | 97140 | 0.7176 | - |
| 4.2748 | 97150 | 0.7285 | - |
| 4.2753 | 97160 | 0.7521 | - |
| 4.2757 | 97170 | 0.7199 | - |
| 4.2762 | 97180 | 0.7342 | - |
| 4.2766 | 97190 | 0.7122 | - |
| 4.2770 | 97200 | 0.7335 | - |
| 4.2775 | 97210 | 0.7542 | - |
| 4.2779 | 97220 | 0.7011 | - |
| 4.2784 | 97230 | 0.7402 | - |
| 4.2788 | 97240 | 0.739 | - |
| 4.2792 | 97250 | 0.7383 | - |
| 4.2797 | 97260 | 0.7063 | - |
| 4.2801 | 97270 | 0.7482 | - |
| 4.2806 | 97280 | 0.7481 | - |
| 4.2810 | 97290 | 0.7309 | - |
| 4.2814 | 97300 | 0.7377 | - |
| 4.2819 | 97310 | 0.7067 | - |
| 4.2823 | 97320 | 0.7315 | - |
| 4.2828 | 97330 | 0.7348 | - |
| 4.2832 | 97340 | 0.74 | - |
| 4.2836 | 97350 | 0.7471 | - |
| 4.2841 | 97360 | 0.7361 | - |
| 4.2845 | 97370 | 0.747 | - |
| 4.2850 | 97380 | 0.7027 | - |
| 4.2854 | 97390 | 0.7414 | - |
| 4.2858 | 97400 | 0.7041 | - |
| 4.2863 | 97410 | 0.7244 | - |
| 4.2867 | 97420 | 0.7435 | - |
| 4.2872 | 97430 | 0.7229 | - |
| 4.2876 | 97440 | 0.7406 | - |
| 4.2880 | 97450 | 0.712 | - |
| 4.2885 | 97460 | 0.731 | - |
| 4.2889 | 97470 | 0.7115 | - |
| 4.2894 | 97480 | 0.6931 | - |
| 4.2898 | 97490 | 0.685 | - |
| 4.2902 | 97500 | 0.7619 | - |
| 4.2907 | 97510 | 0.7385 | - |
| 4.2911 | 97520 | 0.71 | - |
| 4.2916 | 97530 | 0.7428 | - |
| 4.2920 | 97540 | 0.7223 | - |
| 4.2924 | 97550 | 0.6922 | - |
| 4.2929 | 97560 | 0.7291 | - |
| 4.2933 | 97570 | 0.7204 | - |
| 4.2938 | 97580 | 0.7518 | - |
| 4.2942 | 97590 | 0.7436 | - |
| 4.2946 | 97600 | 0.7435 | - |
| 4.2951 | 97610 | 0.7453 | - |
| 4.2955 | 97620 | 0.7583 | - |
| 4.2960 | 97630 | 0.7448 | - |
| 4.2964 | 97640 | 0.6725 | - |
| 4.2968 | 97650 | 0.7549 | - |
| 4.2973 | 97660 | 0.7041 | - |
| 4.2977 | 97670 | 0.7212 | - |
| 4.2982 | 97680 | 0.7041 | - |
| 4.2986 | 97690 | 0.6721 | - |
| 4.2990 | 97700 | 0.6931 | - |
| 4.2995 | 97710 | 0.7288 | - |
| 4.2999 | 97720 | 0.7207 | - |
| 4.3004 | 97730 | 0.7339 | - |
| 4.3008 | 97740 | 0.7217 | - |
| 4.3012 | 97750 | 0.7141 | - |
| 4.3017 | 97760 | 0.7085 | - |
| 4.3021 | 97770 | 0.7364 | - |
| 4.3026 | 97780 | 0.7203 | - |
| 4.3026 | 97782 | - | 1.3755 |
| 4.3030 | 97790 | 0.7278 | - |
| 4.3034 | 97800 | 0.7214 | - |
| 4.3039 | 97810 | 0.7256 | - |
| 4.3043 | 97820 | 0.6981 | - |
| 4.3048 | 97830 | 0.6873 | - |
| 4.3052 | 97840 | 0.703 | - |
| 4.3056 | 97850 | 0.6978 | - |
| 4.3061 | 97860 | 0.6767 | - |
| 4.3065 | 97870 | 0.7251 | - |
| 4.3070 | 97880 | 0.7619 | - |
| 4.3074 | 97890 | 0.7282 | - |
| 4.3078 | 97900 | 0.7168 | - |
| 4.3083 | 97910 | 0.7232 | - |
| 4.3087 | 97920 | 0.7179 | - |
| 4.3092 | 97930 | 0.7223 | - |
| 4.3096 | 97940 | 0.7107 | - |
| 4.3100 | 97950 | 0.7127 | - |
| 4.3105 | 97960 | 0.7219 | - |
| 4.3109 | 97970 | 0.6939 | - |
| 4.3114 | 97980 | 0.7419 | - |
| 4.3118 | 97990 | 0.7011 | - |
| 4.3122 | 98000 | 0.7186 | - |
| 4.3127 | 98010 | 0.7109 | - |
| 4.3131 | 98020 | 0.7123 | - |
| 4.3136 | 98030 | 0.7257 | - |
| 4.3140 | 98040 | 0.7214 | - |
| 4.3144 | 98050 | 0.7319 | - |
| 4.3149 | 98060 | 0.7199 | - |
| 4.3153 | 98070 | 0.749 | - |
| 4.3158 | 98080 | 0.7062 | - |
| 4.3162 | 98090 | 0.7317 | - |
| 4.3166 | 98100 | 0.7074 | - |
| 4.3171 | 98110 | 0.711 | - |
| 4.3175 | 98120 | 0.7532 | - |
| 4.3180 | 98130 | 0.7231 | - |
| 4.3184 | 98140 | 0.719 | - |
| 4.3188 | 98150 | 0.7465 | - |
| 4.3193 | 98160 | 0.7183 | - |
| 4.3197 | 98170 | 0.6889 | - |
| 4.3202 | 98180 | 0.715 | - |
| 4.3206 | 98190 | 0.7417 | - |
| 4.3210 | 98200 | 0.7429 | - |
| 4.3215 | 98210 | 0.6999 | - |
| 4.3219 | 98220 | 0.7236 | - |
| 4.3224 | 98230 | 0.7297 | - |
| 4.3228 | 98240 | 0.6769 | - |
| 4.3232 | 98250 | 0.7163 | - |
| 4.3237 | 98260 | 0.714 | - |
| 4.3241 | 98270 | 0.7056 | - |
| 4.3246 | 98280 | 0.7071 | - |
| 4.3250 | 98290 | 0.7826 | - |
| 4.3254 | 98300 | 0.7446 | - |
| 4.3259 | 98310 | 0.7031 | - |
| 4.3263 | 98320 | 0.734 | - |
| 4.3268 | 98330 | 0.7372 | - |
| 4.3272 | 98340 | 0.7366 | - |
| 4.3276 | 98350 | 0.7324 | - |
| 4.3281 | 98360 | 0.7115 | - |
| 4.3285 | 98370 | 0.7223 | - |
| 4.3290 | 98380 | 0.775 | - |
| 4.3294 | 98390 | 0.7557 | - |
| 4.3298 | 98400 | 0.7132 | - |
| 4.3303 | 98410 | 0.7079 | - |
| 4.3307 | 98420 | 0.7191 | - |
| 4.3312 | 98430 | 0.677 | - |
| 4.3316 | 98440 | 0.7492 | - |
| 4.3320 | 98450 | 0.7239 | - |
| 4.3325 | 98460 | 0.7075 | - |
| 4.3329 | 98470 | 0.7217 | - |
| 4.3334 | 98480 | 0.7541 | - |
| 4.3338 | 98490 | 0.7167 | - |
| 4.3342 | 98500 | 0.7478 | - |
| 4.3347 | 98510 | 0.7093 | - |
| 4.3351 | 98520 | 0.7177 | - |
| 4.3356 | 98530 | 0.7443 | - |
| 4.3360 | 98540 | 0.7163 | - |
| 4.3364 | 98550 | 0.7395 | - |
| 4.3369 | 98560 | 0.7403 | - |
| 4.3373 | 98570 | 0.7485 | - |
| 4.3378 | 98580 | 0.7284 | - |
| 4.3382 | 98590 | 0.7202 | - |
| 4.3386 | 98600 | 0.7197 | - |
| 4.3391 | 98610 | 0.742 | - |
| 4.3395 | 98620 | 0.7275 | - |
| 4.3400 | 98630 | 0.7108 | - |
| 4.3404 | 98640 | 0.7412 | - |
| 4.3408 | 98650 | 0.7004 | - |
| 4.3413 | 98660 | 0.7375 | - |
| 4.3417 | 98670 | 0.7275 | - |
| 4.3422 | 98680 | 0.7135 | - |
| 4.3426 | 98690 | 0.6985 | - |
| 4.3430 | 98700 | 0.6999 | - |
| 4.3435 | 98710 | 0.7206 | - |
| 4.3439 | 98720 | 0.7279 | - |
| 4.3444 | 98730 | 0.6977 | - |
| 4.3448 | 98740 | 0.7146 | - |
| 4.3452 | 98750 | 0.6956 | - |
| 4.3457 | 98760 | 0.7182 | - |
| 4.3461 | 98770 | 0.7322 | - |
| 4.3466 | 98780 | 0.725 | - |
| 4.3470 | 98790 | 0.7509 | - |
| 4.3474 | 98800 | 0.7398 | - |
| 4.3479 | 98810 | 0.7094 | - |
| 4.3483 | 98820 | 0.7187 | - |
| 4.3488 | 98830 | 0.7436 | - |
| 4.3492 | 98840 | 0.7127 | - |
| 4.3496 | 98850 | 0.7149 | - |
| 4.3501 | 98860 | 0.6965 | - |
| 4.3505 | 98870 | 0.712 | - |
| 4.3510 | 98880 | 0.7238 | - |
| 4.3514 | 98890 | 0.7125 | - |
| 4.3518 | 98900 | 0.7168 | - |
| 4.3523 | 98910 | 0.6652 | - |
| 4.3527 | 98919 | - | 1.3790 |
| 4.3527 | 98920 | 0.7476 | - |
| 4.3532 | 98930 | 0.7196 | - |
| 4.3536 | 98940 | 0.7215 | - |
| 4.3540 | 98950 | 0.677 | - |
| 4.3545 | 98960 | 0.7423 | - |
| 4.3549 | 98970 | 0.7251 | - |
| 4.3554 | 98980 | 0.7283 | - |
| 4.3558 | 98990 | 0.7318 | - |
| 4.3562 | 99000 | 0.729 | - |
| 4.3567 | 99010 | 0.7269 | - |
| 4.3571 | 99020 | 0.7026 | - |
| 4.3576 | 99030 | 0.7108 | - |
| 4.3580 | 99040 | 0.6794 | - |
| 4.3584 | 99050 | 0.706 | - |
| 4.3589 | 99060 | 0.7178 | - |
| 4.3593 | 99070 | 0.7134 | - |
| 4.3598 | 99080 | 0.7341 | - |
| 4.3602 | 99090 | 0.7373 | - |
| 4.3606 | 99100 | 0.7256 | - |
| 4.3611 | 99110 | 0.757 | - |
| 4.3615 | 99120 | 0.6878 | - |
| 4.3620 | 99130 | 0.6887 | - |
| 4.3624 | 99140 | 0.6995 | - |
| 4.3628 | 99150 | 0.6962 | - |
| 4.3633 | 99160 | 0.7191 | - |
| 4.3637 | 99170 | 0.6915 | - |
| 4.3642 | 99180 | 0.6946 | - |
| 4.3646 | 99190 | 0.7045 | - |
| 4.3650 | 99200 | 0.72 | - |
| 4.3655 | 99210 | 0.7335 | - |
| 4.3659 | 99220 | 0.7241 | - |
| 4.3664 | 99230 | 0.758 | - |
| 4.3668 | 99240 | 0.6908 | - |
| 4.3672 | 99250 | 0.721 | - |
| 4.3677 | 99260 | 0.7021 | - |
| 4.3681 | 99270 | 0.7233 | - |
| 4.3686 | 99280 | 0.6892 | - |
| 4.3690 | 99290 | 0.7273 | - |
| 4.3694 | 99300 | 0.7348 | - |
| 4.3699 | 99310 | 0.7536 | - |
| 4.3703 | 99320 | 0.7016 | - |
| 4.3708 | 99330 | 0.7412 | - |
| 4.3712 | 99340 | 0.7087 | - |
| 4.3716 | 99350 | 0.7226 | - |
| 4.3721 | 99360 | 0.7385 | - |
| 4.3725 | 99370 | 0.6908 | - |
| 4.3730 | 99380 | 0.738 | - |
| 4.3734 | 99390 | 0.7468 | - |
| 4.3738 | 99400 | 0.6718 | - |
| 4.3743 | 99410 | 0.7169 | - |
| 4.3747 | 99420 | 0.7461 | - |
| 4.3752 | 99430 | 0.7337 | - |
| 4.3756 | 99440 | 0.7735 | - |
| 4.3760 | 99450 | 0.7161 | - |
| 4.3765 | 99460 | 0.6782 | - |
| 4.3769 | 99470 | 0.7111 | - |
| 4.3774 | 99480 | 0.7271 | - |
| 4.3778 | 99490 | 0.6891 | - |
| 4.3782 | 99500 | 0.744 | - |
| 4.3787 | 99510 | 0.7188 | - |
| 4.3791 | 99520 | 0.7026 | - |
| 4.3796 | 99530 | 0.7389 | - |
| 4.3800 | 99540 | 0.6886 | - |
| 4.3804 | 99550 | 0.7153 | - |
| 4.3809 | 99560 | 0.7038 | - |
| 4.3813 | 99570 | 0.7082 | - |
| 4.3818 | 99580 | 0.7194 | - |
| 4.3822 | 99590 | 0.7373 | - |
| 4.3826 | 99600 | 0.73 | - |
| 4.3831 | 99610 | 0.7149 | - |
| 4.3835 | 99620 | 0.7377 | - |
| 4.3840 | 99630 | 0.7122 | - |
| 4.3844 | 99640 | 0.7068 | - |
| 4.3848 | 99650 | 0.73 | - |
| 4.3853 | 99660 | 0.6883 | - |
| 4.3857 | 99670 | 0.7267 | - |
| 4.3862 | 99680 | 0.714 | - |
| 4.3866 | 99690 | 0.7321 | - |
| 4.3870 | 99700 | 0.7315 | - |
| 4.3875 | 99710 | 0.6973 | - |
| 4.3879 | 99720 | 0.6928 | - |
| 4.3884 | 99730 | 0.7013 | - |
| 4.3888 | 99740 | 0.7224 | - |
| 4.3892 | 99750 | 0.7243 | - |
| 4.3897 | 99760 | 0.7078 | - |
| 4.3901 | 99770 | 0.6965 | - |
| 4.3906 | 99780 | 0.6823 | - |
| 4.3910 | 99790 | 0.7232 | - |
| 4.3914 | 99800 | 0.7116 | - |
| 4.3919 | 99810 | 0.7429 | - |
| 4.3923 | 99820 | 0.7169 | - |
| 4.3928 | 99830 | 0.7052 | - |
| 4.3932 | 99840 | 0.7012 | - |
| 4.3936 | 99850 | 0.7196 | - |
| 4.3941 | 99860 | 0.7235 | - |
| 4.3945 | 99870 | 0.7199 | - |
| 4.3950 | 99880 | 0.7032 | - |
| 4.3954 | 99890 | 0.6957 | - |
| 4.3958 | 99900 | 0.7134 | - |
| 4.3963 | 99910 | 0.6962 | - |
| 4.3967 | 99920 | 0.7149 | - |
| 4.3972 | 99930 | 0.6996 | - |
| 4.3976 | 99940 | 0.7244 | - |
| 4.3980 | 99950 | 0.7257 | - |
| 4.3985 | 99960 | 0.6968 | - |
| 4.3989 | 99970 | 0.7137 | - |
| 4.3994 | 99980 | 0.7495 | - |
| 4.3998 | 99990 | 0.7187 | - |
| 4.4002 | 100000 | 0.7007 | - |
| 4.4007 | 100010 | 0.7 | - |
| 4.4011 | 100020 | 0.7518 | - |
| 4.4016 | 100030 | 0.7329 | - |
| 4.4020 | 100040 | 0.7157 | - |
| 4.4024 | 100050 | 0.7378 | - |
| 4.4027 | 100056 | - | 1.3816 |
| 4.4029 | 100060 | 0.7383 | - |
| 4.4033 | 100070 | 0.712 | - |
| 4.4038 | 100080 | 0.6979 | - |
| 4.4042 | 100090 | 0.6967 | - |
| 4.4046 | 100100 | 0.7403 | - |
| 4.4051 | 100110 | 0.7265 | - |
| 4.4055 | 100120 | 0.7011 | - |
| 4.4060 | 100130 | 0.7065 | - |
| 4.4064 | 100140 | 0.7018 | - |
| 4.4068 | 100150 | 0.6935 | - |
| 4.4073 | 100160 | 0.7199 | - |
| 4.4077 | 100170 | 0.7102 | - |
| 4.4082 | 100180 | 0.7265 | - |
| 4.4086 | 100190 | 0.726 | - |
| 4.4090 | 100200 | 0.6943 | - |
| 4.4095 | 100210 | 0.6983 | - |
| 4.4099 | 100220 | 0.7016 | - |
| 4.4104 | 100230 | 0.6966 | - |
| 4.4108 | 100240 | 0.7243 | - |
| 4.4112 | 100250 | 0.725 | - |
| 4.4117 | 100260 | 0.728 | - |
| 4.4121 | 100270 | 0.7196 | - |
| 4.4126 | 100280 | 0.7031 | - |
| 4.4130 | 100290 | 0.6972 | - |
| 4.4134 | 100300 | 0.7289 | - |
| 4.4139 | 100310 | 0.7118 | - |
| 4.4143 | 100320 | 0.6994 | - |
| 4.4148 | 100330 | 0.7249 | - |
| 4.4152 | 100340 | 0.69 | - |
| 4.4156 | 100350 | 0.7124 | - |
| 4.4161 | 100360 | 0.7091 | - |
| 4.4165 | 100370 | 0.6805 | - |
| 4.4170 | 100380 | 0.7081 | - |
| 4.4174 | 100390 | 0.7121 | - |
| 4.4178 | 100400 | 0.7371 | - |
| 4.4183 | 100410 | 0.6874 | - |
| 4.4187 | 100420 | 0.724 | - |
| 4.4192 | 100430 | 0.7116 | - |
| 4.4196 | 100440 | 0.7158 | - |
| 4.4200 | 100450 | 0.7117 | - |
| 4.4205 | 100460 | 0.7009 | - |
| 4.4209 | 100470 | 0.6882 | - |
| 4.4214 | 100480 | 0.7264 | - |
| 4.4218 | 100490 | 0.7525 | - |
| 4.4222 | 100500 | 0.699 | - |
| 4.4227 | 100510 | 0.681 | - |
| 4.4231 | 100520 | 0.7109 | - |
| 4.4236 | 100530 | 0.7215 | - |
| 4.4240 | 100540 | 0.7052 | - |
| 4.4244 | 100550 | 0.714 | - |
| 4.4249 | 100560 | 0.7193 | - |
| 4.4253 | 100570 | 0.6726 | - |
| 4.4258 | 100580 | 0.7249 | - |
| 4.4262 | 100590 | 0.6997 | - |
| 4.4266 | 100600 | 0.7494 | - |
| 4.4271 | 100610 | 0.7239 | - |
| 4.4275 | 100620 | 0.7362 | - |
| 4.4280 | 100630 | 0.7293 | - |
| 4.4284 | 100640 | 0.7506 | - |
| 4.4288 | 100650 | 0.7276 | - |
| 4.4293 | 100660 | 0.7144 | - |
| 4.4297 | 100670 | 0.7282 | - |
| 4.4302 | 100680 | 0.7188 | - |
| 4.4306 | 100690 | 0.7052 | - |
| 4.4310 | 100700 | 0.6845 | - |
| 4.4315 | 100710 | 0.6998 | - |
| 4.4319 | 100720 | 0.6834 | - |
| 4.4324 | 100730 | 0.7309 | - |
| 4.4328 | 100740 | 0.7211 | - |
| 4.4332 | 100750 | 0.7179 | - |
| 4.4337 | 100760 | 0.7447 | - |
| 4.4341 | 100770 | 0.721 | - |
| 4.4346 | 100780 | 0.718 | - |
| 4.4350 | 100790 | 0.737 | - |
| 4.4354 | 100800 | 0.7321 | - |
| 4.4359 | 100810 | 0.7091 | - |
| 4.4363 | 100820 | 0.7327 | - |
| 4.4368 | 100830 | 0.7075 | - |
| 4.4372 | 100840 | 0.7251 | - |
| 4.4376 | 100850 | 0.7341 | - |
| 4.4381 | 100860 | 0.7274 | - |
| 4.4385 | 100870 | 0.7304 | - |
| 4.4390 | 100880 | 0.7451 | - |
| 4.4394 | 100890 | 0.7125 | - |
| 4.4398 | 100900 | 0.7114 | - |
| 4.4403 | 100910 | 0.702 | - |
| 4.4407 | 100920 | 0.7219 | - |
| 4.4412 | 100930 | 0.7274 | - |
| 4.4416 | 100940 | 0.7134 | - |
| 4.4420 | 100950 | 0.7282 | - |
| 4.4425 | 100960 | 0.6953 | - |
| 4.4429 | 100970 | 0.7193 | - |
| 4.4434 | 100980 | 0.7284 | - |
| 4.4438 | 100990 | 0.7469 | - |
| 4.4442 | 101000 | 0.6782 | - |
| 4.4447 | 101010 | 0.7204 | - |
| 4.4451 | 101020 | 0.7219 | - |
| 4.4456 | 101030 | 0.7142 | - |
| 4.4460 | 101040 | 0.7368 | - |
| 4.4464 | 101050 | 0.7231 | - |
| 4.4469 | 101060 | 0.7272 | - |
| 4.4473 | 101070 | 0.7371 | - |
| 4.4478 | 101080 | 0.7414 | - |
| 4.4482 | 101090 | 0.7025 | - |
| 4.4486 | 101100 | 0.7277 | - |
| 4.4491 | 101110 | 0.7503 | - |
| 4.4495 | 101120 | 0.7721 | - |
| 4.4500 | 101130 | 0.7393 | - |
| 4.4504 | 101140 | 0.6921 | - |
| 4.4508 | 101150 | 0.7343 | - |
| 4.4513 | 101160 | 0.7192 | - |
| 4.4517 | 101170 | 0.7044 | - |
| 4.4522 | 101180 | 0.6969 | - |
| 4.4526 | 101190 | 0.6852 | - |
| 4.4527 | 101193 | - | 1.3802 |
| 4.4530 | 101200 | 0.6697 | - |
| 4.4535 | 101210 | 0.7569 | - |
| 4.4539 | 101220 | 0.7219 | - |
| 4.4544 | 101230 | 0.6911 | - |
| 4.4548 | 101240 | 0.7327 | - |
| 4.4552 | 101250 | 0.7084 | - |
| 4.4557 | 101260 | 0.7302 | - |
| 4.4561 | 101270 | 0.6938 | - |
| 4.4566 | 101280 | 0.7329 | - |
| 4.4570 | 101290 | 0.7356 | - |
| 4.4574 | 101300 | 0.7111 | - |
| 4.4579 | 101310 | 0.679 | - |
| 4.4583 | 101320 | 0.6997 | - |
| 4.4588 | 101330 | 0.704 | - |
| 4.4592 | 101340 | 0.726 | - |
| 4.4596 | 101350 | 0.7176 | - |
| 4.4601 | 101360 | 0.7111 | - |
| 4.4605 | 101370 | 0.6832 | - |
| 4.4610 | 101380 | 0.6974 | - |
| 4.4614 | 101390 | 0.7269 | - |
| 4.4618 | 101400 | 0.7052 | - |
| 4.4623 | 101410 | 0.7293 | - |
| 4.4627 | 101420 | 0.7184 | - |
| 4.4632 | 101430 | 0.7329 | - |
| 4.4636 | 101440 | 0.7222 | - |
| 4.4640 | 101450 | 0.7658 | - |
| 4.4645 | 101460 | 0.7218 | - |
| 4.4649 | 101470 | 0.7084 | - |
| 4.4654 | 101480 | 0.7055 | - |
| 4.4658 | 101490 | 0.7262 | - |
| 4.4663 | 101500 | 0.7008 | - |
| 4.4667 | 101510 | 0.7234 | - |
| 4.4671 | 101520 | 0.7395 | - |
| 4.4676 | 101530 | 0.6993 | - |
| 4.4680 | 101540 | 0.7134 | - |
| 4.4685 | 101550 | 0.6626 | - |
| 4.4689 | 101560 | 0.7207 | - |
| 4.4693 | 101570 | 0.664 | - |
| 4.4698 | 101580 | 0.73 | - |
| 4.4702 | 101590 | 0.7241 | - |
| 4.4707 | 101600 | 0.7069 | - |
| 4.4711 | 101610 | 0.7241 | - |
| 4.4715 | 101620 | 0.7154 | - |
| 4.4720 | 101630 | 0.6772 | - |
| 4.4724 | 101640 | 0.7255 | - |
| 4.4729 | 101650 | 0.7094 | - |
| 4.4733 | 101660 | 0.7374 | - |
| 4.4737 | 101670 | 0.7113 | - |
| 4.4742 | 101680 | 0.7539 | - |
| 4.4746 | 101690 | 0.7098 | - |
| 4.4751 | 101700 | 0.7475 | - |
| 4.4755 | 101710 | 0.7278 | - |
| 4.4759 | 101720 | 0.7327 | - |
| 4.4764 | 101730 | 0.7104 | - |
| 4.4768 | 101740 | 0.7243 | - |
| 4.4773 | 101750 | 0.7005 | - |
| 4.4777 | 101760 | 0.7032 | - |
| 4.4781 | 101770 | 0.7201 | - |
| 4.4786 | 101780 | 0.7132 | - |
| 4.4790 | 101790 | 0.7069 | - |
| 4.4795 | 101800 | 0.7122 | - |
| 4.4799 | 101810 | 0.7328 | - |
| 4.4803 | 101820 | 0.7322 | - |
| 4.4808 | 101830 | 0.7184 | - |
| 4.4812 | 101840 | 0.7622 | - |
| 4.4817 | 101850 | 0.7302 | - |
| 4.4821 | 101860 | 0.7606 | - |
| 4.4825 | 101870 | 0.7408 | - |
| 4.4830 | 101880 | 0.7027 | - |
| 4.4834 | 101890 | 0.6981 | - |
| 4.4839 | 101900 | 0.7091 | - |
| 4.4843 | 101910 | 0.7086 | - |
| 4.4847 | 101920 | 0.6849 | - |
| 4.4852 | 101930 | 0.7116 | - |
| 4.4856 | 101940 | 0.726 | - |
| 4.4861 | 101950 | 0.7073 | - |
| 4.4865 | 101960 | 0.6839 | - |
| 4.4869 | 101970 | 0.7026 | - |
| 4.4874 | 101980 | 0.6954 | - |
| 4.4878 | 101990 | 0.6903 | - |
| 4.4883 | 102000 | 0.711 | - |
| 4.4887 | 102010 | 0.6763 | - |
| 4.4891 | 102020 | 0.7398 | - |
| 4.4896 | 102030 | 0.72 | - |
| 4.4900 | 102040 | 0.7644 | - |
| 4.4905 | 102050 | 0.738 | - |
| 4.4909 | 102060 | 0.6992 | - |
| 4.4913 | 102070 | 0.701 | - |
| 4.4918 | 102080 | 0.7418 | - |
| 4.4922 | 102090 | 0.6873 | - |
| 4.4927 | 102100 | 0.721 | - |
| 4.4931 | 102110 | 0.7167 | - |
| 4.4935 | 102120 | 0.7184 | - |
| 4.4940 | 102130 | 0.7484 | - |
| 4.4944 | 102140 | 0.687 | - |
| 4.4949 | 102150 | 0.7159 | - |
| 4.4953 | 102160 | 0.7197 | - |
| 4.4957 | 102170 | 0.7221 | - |
| 4.4962 | 102180 | 0.7181 | - |
| 4.4966 | 102190 | 0.7046 | - |
| 4.4971 | 102200 | 0.7179 | - |
| 4.4975 | 102210 | 0.7208 | - |
| 4.4979 | 102220 | 0.6973 | - |
| 4.4984 | 102230 | 0.7198 | - |
| 4.4988 | 102240 | 0.6818 | - |
| 4.4993 | 102250 | 0.7185 | - |
| 4.4997 | 102260 | 0.702 | - |
| 4.5001 | 102270 | 0.7087 | - |
| 4.5006 | 102280 | 0.7591 | - |
| 4.5010 | 102290 | 0.6803 | - |
| 4.5015 | 102300 | 0.7471 | - |
| 4.5019 | 102310 | 0.6855 | - |
| 4.5023 | 102320 | 0.696 | - |
| 4.5028 | 102330 | 0.7071 | 1.3820 |
| 4.5032 | 102340 | 0.7031 | - |
| 4.5037 | 102350 | 0.7289 | - |
| 4.5041 | 102360 | 0.7191 | - |
| 4.5045 | 102370 | 0.7143 | - |
| 4.5050 | 102380 | 0.741 | - |
| 4.5054 | 102390 | 0.7066 | - |
| 4.5059 | 102400 | 0.7158 | - |
| 4.5063 | 102410 | 0.728 | - |
| 4.5067 | 102420 | 0.7146 | - |
| 4.5072 | 102430 | 0.7169 | - |
| 4.5076 | 102440 | 0.7405 | - |
| 4.5081 | 102450 | 0.716 | - |
| 4.5085 | 102460 | 0.7142 | - |
| 4.5089 | 102470 | 0.695 | - |
| 4.5094 | 102480 | 0.7077 | - |
| 4.5098 | 102490 | 0.7103 | - |
| 4.5103 | 102500 | 0.7177 | - |
| 4.5107 | 102510 | 0.6803 | - |
| 4.5111 | 102520 | 0.689 | - |
| 4.5116 | 102530 | 0.7155 | - |
| 4.5120 | 102540 | 0.7317 | - |
| 4.5125 | 102550 | 0.7196 | - |
| 4.5129 | 102560 | 0.6819 | - |
| 4.5133 | 102570 | 0.7371 | - |
| 4.5138 | 102580 | 0.706 | - |
| 4.5142 | 102590 | 0.7342 | - |
| 4.5147 | 102600 | 0.6929 | - |
| 4.5151 | 102610 | 0.7028 | - |
| 4.5155 | 102620 | 0.6726 | - |
| 4.5160 | 102630 | 0.7193 | - |
| 4.5164 | 102640 | 0.7166 | - |
| 4.5169 | 102650 | 0.7418 | - |
| 4.5173 | 102660 | 0.699 | - |
| 4.5177 | 102670 | 0.6982 | - |
| 4.5182 | 102680 | 0.6727 | - |
| 4.5186 | 102690 | 0.725 | - |
| 4.5191 | 102700 | 0.7287 | - |
| 4.5195 | 102710 | 0.7149 | - |
| 4.5199 | 102720 | 0.668 | - |
| 4.5204 | 102730 | 0.6938 | - |
| 4.5208 | 102740 | 0.7068 | - |
| 4.5213 | 102750 | 0.7254 | - |
| 4.5217 | 102760 | 0.6937 | - |
| 4.5221 | 102770 | 0.7305 | - |
| 4.5226 | 102780 | 0.7071 | - |
| 4.5230 | 102790 | 0.6981 | - |
| 4.5235 | 102800 | 0.7331 | - |
| 4.5239 | 102810 | 0.7357 | - |
| 4.5243 | 102820 | 0.7374 | - |
| 4.5248 | 102830 | 0.6904 | - |
| 4.5252 | 102840 | 0.7021 | - |
| 4.5257 | 102850 | 0.7166 | - |
| 4.5261 | 102860 | 0.7346 | - |
| 4.5265 | 102870 | 0.7221 | - |
| 4.5270 | 102880 | 0.7203 | - |
| 4.5274 | 102890 | 0.7234 | - |
| 4.5279 | 102900 | 0.733 | - |
| 4.5283 | 102910 | 0.7481 | - |
| 4.5287 | 102920 | 0.6976 | - |
| 4.5292 | 102930 | 0.7403 | - |
| 4.5296 | 102940 | 0.7348 | - |
| 4.5301 | 102950 | 0.7043 | - |
| 4.5305 | 102960 | 0.7056 | - |
| 4.5309 | 102970 | 0.7477 | - |
| 4.5314 | 102980 | 0.7322 | - |
| 4.5318 | 102990 | 0.7119 | - |
| 4.5323 | 103000 | 0.7187 | - |
| 4.5327 | 103010 | 0.7047 | - |
| 4.5331 | 103020 | 0.6959 | - |
| 4.5336 | 103030 | 0.6971 | - |
| 4.5340 | 103040 | 0.7384 | - |
| 4.5345 | 103050 | 0.7022 | - |
| 4.5349 | 103060 | 0.713 | - |
| 4.5353 | 103070 | 0.7352 | - |
| 4.5358 | 103080 | 0.728 | - |
| 4.5362 | 103090 | 0.7066 | - |
| 4.5367 | 103100 | 0.7117 | - |
| 4.5371 | 103110 | 0.7271 | - |
| 4.5375 | 103120 | 0.742 | - |
| 4.5380 | 103130 | 0.734 | - |
| 4.5384 | 103140 | 0.7527 | - |
| 4.5389 | 103150 | 0.7296 | - |
| 4.5393 | 103160 | 0.7307 | - |
| 4.5397 | 103170 | 0.7338 | - |
| 4.5402 | 103180 | 0.7203 | - |
| 4.5406 | 103190 | 0.7111 | - |
| 4.5411 | 103200 | 0.6949 | - |
| 4.5415 | 103210 | 0.7167 | - |
| 4.5419 | 103220 | 0.7142 | - |
| 4.5424 | 103230 | 0.7273 | - |
| 4.5428 | 103240 | 0.6963 | - |
| 4.5433 | 103250 | 0.7205 | - |
| 4.5437 | 103260 | 0.7519 | - |
| 4.5441 | 103270 | 0.6918 | - |
| 4.5446 | 103280 | 0.7356 | - |
| 4.5450 | 103290 | 0.7309 | - |
| 4.5455 | 103300 | 0.7314 | - |
| 4.5459 | 103310 | 0.709 | - |
| 4.5463 | 103320 | 0.7422 | - |
| 4.5468 | 103330 | 0.6857 | - |
| 4.5472 | 103340 | 0.7684 | - |
| 4.5477 | 103350 | 0.7377 | - |
| 4.5481 | 103360 | 0.6904 | - |
| 4.5485 | 103370 | 0.7173 | - |
| 4.5490 | 103380 | 0.7346 | - |
| 4.5494 | 103390 | 0.6808 | - |
| 4.5499 | 103400 | 0.722 | - |
| 4.5503 | 103410 | 0.71 | - |
| 4.5507 | 103420 | 0.7187 | - |
| 4.5512 | 103430 | 0.7146 | - |
| 4.5516 | 103440 | 0.7016 | - |
| 4.5521 | 103450 | 0.7182 | - |
| 4.5525 | 103460 | 0.7366 | - |
| 4.5528 | 103467 | - | 1.3765 |
| 4.5529 | 103470 | 0.7072 | - |
| 4.5534 | 103480 | 0.7466 | - |
| 4.5538 | 103490 | 0.7539 | - |
| 4.5543 | 103500 | 0.725 | - |
| 4.5547 | 103510 | 0.7291 | - |
| 4.5551 | 103520 | 0.7115 | - |
| 4.5556 | 103530 | 0.7188 | - |
| 4.5560 | 103540 | 0.6808 | - |
| 4.5565 | 103550 | 0.7409 | - |
| 4.5569 | 103560 | 0.7259 | - |
| 4.5573 | 103570 | 0.7389 | - |
| 4.5578 | 103580 | 0.7286 | - |
| 4.5582 | 103590 | 0.7383 | - |
| 4.5587 | 103600 | 0.7215 | - |
| 4.5591 | 103610 | 0.702 | - |
| 4.5595 | 103620 | 0.6926 | - |
| 4.5600 | 103630 | 0.7157 | - |
| 4.5604 | 103640 | 0.7139 | - |
| 4.5609 | 103650 | 0.6897 | - |
| 4.5613 | 103660 | 0.7073 | - |
| 4.5617 | 103670 | 0.7126 | - |
| 4.5622 | 103680 | 0.7212 | - |
| 4.5626 | 103690 | 0.7147 | - |
| 4.5631 | 103700 | 0.7253 | - |
| 4.5635 | 103710 | 0.7232 | - |
| 4.5639 | 103720 | 0.7288 | - |
| 4.5644 | 103730 | 0.7431 | - |
| 4.5648 | 103740 | 0.7088 | - |
| 4.5653 | 103750 | 0.6907 | - |
| 4.5657 | 103760 | 0.6799 | - |
| 4.5661 | 103770 | 0.6929 | - |
| 4.5666 | 103780 | 0.7173 | - |
| 4.5670 | 103790 | 0.6749 | - |
| 4.5675 | 103800 | 0.7384 | - |
| 4.5679 | 103810 | 0.6935 | - |
| 4.5683 | 103820 | 0.7358 | - |
| 4.5688 | 103830 | 0.7318 | - |
| 4.5692 | 103840 | 0.691 | - |
| 4.5697 | 103850 | 0.6986 | - |
| 4.5701 | 103860 | 0.7386 | - |
| 4.5705 | 103870 | 0.7267 | - |
| 4.5710 | 103880 | 0.703 | - |
| 4.5714 | 103890 | 0.7121 | - |
| 4.5719 | 103900 | 0.7488 | - |
| 4.5723 | 103910 | 0.728 | - |
| 4.5727 | 103920 | 0.713 | - |
| 4.5732 | 103930 | 0.763 | - |
| 4.5736 | 103940 | 0.7157 | - |
| 4.5741 | 103950 | 0.717 | - |
| 4.5745 | 103960 | 0.7158 | - |
| 4.5749 | 103970 | 0.7212 | - |
| 4.5754 | 103980 | 0.6944 | - |
| 4.5758 | 103990 | 0.6911 | - |
| 4.5763 | 104000 | 0.7066 | - |
| 4.5767 | 104010 | 0.7229 | - |
| 4.5771 | 104020 | 0.723 | - |
| 4.5776 | 104030 | 0.6886 | - |
| 4.5780 | 104040 | 0.6991 | - |
| 4.5785 | 104050 | 0.726 | - |
| 4.5789 | 104060 | 0.7168 | - |
| 4.5793 | 104070 | 0.7101 | - |
| 4.5798 | 104080 | 0.7041 | - |
| 4.5802 | 104090 | 0.7011 | - |
| 4.5807 | 104100 | 0.7053 | - |
| 4.5811 | 104110 | 0.6842 | - |
| 4.5815 | 104120 | 0.7085 | - |
| 4.5820 | 104130 | 0.7316 | - |
| 4.5824 | 104140 | 0.7232 | - |
| 4.5829 | 104150 | 0.6838 | - |
| 4.5833 | 104160 | 0.7192 | - |
| 4.5837 | 104170 | 0.7065 | - |
| 4.5842 | 104180 | 0.7092 | - |
| 4.5846 | 104190 | 0.7287 | - |
| 4.5851 | 104200 | 0.7026 | - |
| 4.5855 | 104210 | 0.7208 | - |
| 4.5859 | 104220 | 0.706 | - |
| 4.5864 | 104230 | 0.7107 | - |
| 4.5868 | 104240 | 0.7033 | - |
| 4.5873 | 104250 | 0.702 | - |
| 4.5877 | 104260 | 0.7157 | - |
| 4.5881 | 104270 | 0.7223 | - |
| 4.5886 | 104280 | 0.6973 | - |
| 4.5890 | 104290 | 0.7196 | - |
| 4.5895 | 104300 | 0.6739 | - |
| 4.5899 | 104310 | 0.6793 | - |
| 4.5903 | 104320 | 0.701 | - |
| 4.5908 | 104330 | 0.7105 | - |
| 4.5912 | 104340 | 0.7136 | - |
| 4.5917 | 104350 | 0.7371 | - |
| 4.5921 | 104360 | 0.727 | - |
| 4.5925 | 104370 | 0.7241 | - |
| 4.5930 | 104380 | 0.7195 | - |
| 4.5934 | 104390 | 0.6813 | - |
| 4.5939 | 104400 | 0.7346 | - |
| 4.5943 | 104410 | 0.7173 | - |
| 4.5947 | 104420 | 0.7343 | - |
| 4.5952 | 104430 | 0.715 | - |
| 4.5956 | 104440 | 0.7132 | - |
| 4.5961 | 104450 | 0.7314 | - |
| 4.5965 | 104460 | 0.6924 | - |
| 4.5969 | 104470 | 0.7185 | - |
| 4.5974 | 104480 | 0.6781 | - |
| 4.5978 | 104490 | 0.6931 | - |
| 4.5983 | 104500 | 0.7612 | - |
| 4.5987 | 104510 | 0.7343 | - |
| 4.5991 | 104520 | 0.7233 | - |
| 4.5996 | 104530 | 0.7073 | - |
| 4.6000 | 104540 | 0.7074 | - |
| 4.6005 | 104550 | 0.6958 | - |
| 4.6009 | 104560 | 0.7189 | - |
| 4.6013 | 104570 | 0.7259 | - |
| 4.6018 | 104580 | 0.7068 | - |
| 4.6022 | 104590 | 0.722 | - |
| 4.6027 | 104600 | 0.7273 | - |
| 4.6028 | 104604 | - | 1.3776 |
| 4.6031 | 104610 | 0.7106 | - |
| 4.6035 | 104620 | 0.7094 | - |
| 4.6040 | 104630 | 0.7009 | - |
| 4.6044 | 104640 | 0.7221 | - |
| 4.6049 | 104650 | 0.702 | - |
| 4.6053 | 104660 | 0.7626 | - |
| 4.6057 | 104670 | 0.7039 | - |
| 4.6062 | 104680 | 0.6817 | - |
| 4.6066 | 104690 | 0.7501 | - |
| 4.6071 | 104700 | 0.6999 | - |
| 4.6075 | 104710 | 0.6816 | - |
| 4.6079 | 104720 | 0.7218 | - |
| 4.6084 | 104730 | 0.7128 | - |
| 4.6088 | 104740 | 0.6841 | - |
| 4.6093 | 104750 | 0.7047 | - |
| 4.6097 | 104760 | 0.7111 | - |
| 4.6101 | 104770 | 0.7162 | - |
| 4.6106 | 104780 | 0.6848 | - |
| 4.6110 | 104790 | 0.7268 | - |
| 4.6115 | 104800 | 0.6928 | - |
| 4.6119 | 104810 | 0.711 | - |
| 4.6123 | 104820 | 0.757 | - |
| 4.6128 | 104830 | 0.6958 | - |
| 4.6132 | 104840 | 0.7158 | - |
| 4.6137 | 104850 | 0.7121 | - |
| 4.6141 | 104860 | 0.7162 | - |
| 4.6145 | 104870 | 0.7161 | - |
| 4.6150 | 104880 | 0.7122 | - |
| 4.6154 | 104890 | 0.6847 | - |
| 4.6159 | 104900 | 0.6884 | - |
| 4.6163 | 104910 | 0.6918 | - |
| 4.6167 | 104920 | 0.7364 | - |
| 4.6172 | 104930 | 0.6798 | - |
| 4.6176 | 104940 | 0.7167 | - |
| 4.6181 | 104950 | 0.7421 | - |
| 4.6185 | 104960 | 0.6994 | - |
| 4.6189 | 104970 | 0.7025 | - |
| 4.6194 | 104980 | 0.6859 | - |
| 4.6198 | 104990 | 0.7118 | - |
| 4.6203 | 105000 | 0.686 | - |
| 4.6207 | 105010 | 0.6517 | - |
| 4.6211 | 105020 | 0.742 | - |
| 4.6216 | 105030 | 0.7574 | - |
| 4.6220 | 105040 | 0.7061 | - |
| 4.6225 | 105050 | 0.7334 | - |
| 4.6229 | 105060 | 0.7086 | - |
| 4.6233 | 105070 | 0.685 | - |
| 4.6238 | 105080 | 0.6918 | - |
| 4.6242 | 105090 | 0.6904 | - |
| 4.6247 | 105100 | 0.7291 | - |
| 4.6251 | 105110 | 0.7134 | - |
| 4.6255 | 105120 | 0.6975 | - |
| 4.6260 | 105130 | 0.7135 | - |
| 4.6264 | 105140 | 0.7145 | - |
| 4.6269 | 105150 | 0.6715 | - |
| 4.6273 | 105160 | 0.7147 | - |
| 4.6277 | 105170 | 0.6993 | - |
| 4.6282 | 105180 | 0.7236 | - |
| 4.6286 | 105190 | 0.7385 | - |
| 4.6291 | 105200 | 0.7276 | - |
| 4.6295 | 105210 | 0.6692 | - |
| 4.6299 | 105220 | 0.6922 | - |
| 4.6304 | 105230 | 0.7187 | - |
| 4.6308 | 105240 | 0.7184 | - |
| 4.6313 | 105250 | 0.7212 | - |
| 4.6317 | 105260 | 0.7042 | - |
| 4.6321 | 105270 | 0.7122 | - |
| 4.6326 | 105280 | 0.7527 | - |
| 4.6330 | 105290 | 0.7052 | - |
| 4.6335 | 105300 | 0.6836 | - |
| 4.6339 | 105310 | 0.7484 | - |
| 4.6343 | 105320 | 0.7071 | - |
| 4.6348 | 105330 | 0.7132 | - |
| 4.6352 | 105340 | 0.7547 | - |
| 4.6357 | 105350 | 0.6994 | - |
| 4.6361 | 105360 | 0.6612 | - |
| 4.6365 | 105370 | 0.723 | - |
| 4.6370 | 105380 | 0.6946 | - |
| 4.6374 | 105390 | 0.7199 | - |
| 4.6379 | 105400 | 0.7164 | - |
| 4.6383 | 105410 | 0.7458 | - |
| 4.6387 | 105420 | 0.7094 | - |
| 4.6392 | 105430 | 0.7353 | - |
| 4.6396 | 105440 | 0.683 | - |
| 4.6401 | 105450 | 0.7168 | - |
| 4.6405 | 105460 | 0.6798 | - |
| 4.6409 | 105470 | 0.6964 | - |
| 4.6414 | 105480 | 0.7049 | - |
| 4.6418 | 105490 | 0.702 | - |
| 4.6423 | 105500 | 0.7156 | - |
| 4.6427 | 105510 | 0.7285 | - |
| 4.6431 | 105520 | 0.73 | - |
| 4.6436 | 105530 | 0.705 | - |
| 4.6440 | 105540 | 0.6948 | - |
| 4.6445 | 105550 | 0.6781 | - |
| 4.6449 | 105560 | 0.6953 | - |
| 4.6453 | 105570 | 0.6799 | - |
| 4.6458 | 105580 | 0.7207 | - |
| 4.6462 | 105590 | 0.7116 | - |
| 4.6467 | 105600 | 0.7392 | - |
| 4.6471 | 105610 | 0.7228 | - |
| 4.6475 | 105620 | 0.7071 | - |
| 4.6480 | 105630 | 0.7007 | - |
| 4.6484 | 105640 | 0.692 | - |
| 4.6489 | 105650 | 0.6971 | - |
| 4.6493 | 105660 | 0.7088 | - |
| 4.6497 | 105670 | 0.7073 | - |
| 4.6502 | 105680 | 0.69 | - |
| 4.6506 | 105690 | 0.7169 | - |
| 4.6511 | 105700 | 0.7189 | - |
| 4.6515 | 105710 | 0.7171 | - |
| 4.6519 | 105720 | 0.6764 | - |
| 4.6524 | 105730 | 0.6845 | - |
| 4.6528 | 105740 | 0.7271 | - |
| 4.6529 | 105741 | - | 1.3753 |
| 4.6533 | 105750 | 0.7175 | - |
| 4.6537 | 105760 | 0.7132 | - |
| 4.6541 | 105770 | 0.7051 | - |
| 4.6546 | 105780 | 0.7194 | - |
| 4.6550 | 105790 | 0.7206 | - |
| 4.6555 | 105800 | 0.713 | - |
| 4.6559 | 105810 | 0.7061 | - |
| 4.6563 | 105820 | 0.7257 | - |
| 4.6568 | 105830 | 0.7403 | - |
| 4.6572 | 105840 | 0.6968 | - |
| 4.6577 | 105850 | 0.6853 | - |
| 4.6581 | 105860 | 0.7355 | - |
| 4.6585 | 105870 | 0.6602 | - |
| 4.6590 | 105880 | 0.7112 | - |
| 4.6594 | 105890 | 0.7213 | - |
| 4.6599 | 105900 | 0.71 | - |
| 4.6603 | 105910 | 0.6803 | - |
| 4.6607 | 105920 | 0.7039 | - |
| 4.6612 | 105930 | 0.6824 | - |
| 4.6616 | 105940 | 0.6824 | - |
| 4.6621 | 105950 | 0.7454 | - |
| 4.6625 | 105960 | 0.7314 | - |
| 4.6629 | 105970 | 0.7064 | - |
| 4.6634 | 105980 | 0.7147 | - |
| 4.6638 | 105990 | 0.6886 | - |
| 4.6643 | 106000 | 0.7272 | - |
| 4.6647 | 106010 | 0.6886 | - |
| 4.6651 | 106020 | 0.725 | - |
| 4.6656 | 106030 | 0.6973 | - |
| 4.6660 | 106040 | 0.7035 | - |
| 4.6665 | 106050 | 0.6951 | - |
| 4.6669 | 106060 | 0.7292 | - |
| 4.6673 | 106070 | 0.6733 | - |
| 4.6678 | 106080 | 0.7075 | - |
| 4.6682 | 106090 | 0.7157 | - |
| 4.6687 | 106100 | 0.741 | - |
| 4.6691 | 106110 | 0.6945 | - |
| 4.6695 | 106120 | 0.6875 | - |
| 4.6700 | 106130 | 0.7013 | - |
| 4.6704 | 106140 | 0.722 | - |
| 4.6709 | 106150 | 0.682 | - |
| 4.6713 | 106160 | 0.7403 | - |
| 4.6717 | 106170 | 0.7521 | - |
| 4.6722 | 106180 | 0.6823 | - |
| 4.6726 | 106190 | 0.6817 | - |
| 4.6731 | 106200 | 0.7136 | - |
| 4.6735 | 106210 | 0.7128 | - |
| 4.6739 | 106220 | 0.742 | - |
| 4.6744 | 106230 | 0.7494 | - |
| 4.6748 | 106240 | 0.7045 | - |
| 4.6753 | 106250 | 0.6978 | - |
| 4.6757 | 106260 | 0.6975 | - |
| 4.6761 | 106270 | 0.7319 | - |
| 4.6766 | 106280 | 0.7277 | - |
| 4.6770 | 106290 | 0.6993 | - |
| 4.6775 | 106300 | 0.7174 | - |
| 4.6779 | 106310 | 0.7098 | - |
| 4.6783 | 106320 | 0.7214 | - |
| 4.6788 | 106330 | 0.6976 | - |
| 4.6792 | 106340 | 0.7137 | - |
| 4.6797 | 106350 | 0.6841 | - |
| 4.6801 | 106360 | 0.6939 | - |
| 4.6805 | 106370 | 0.7284 | - |
| 4.6810 | 106380 | 0.6715 | - |
| 4.6814 | 106390 | 0.6824 | - |
| 4.6819 | 106400 | 0.6959 | - |
| 4.6823 | 106410 | 0.6989 | - |
| 4.6827 | 106420 | 0.707 | - |
| 4.6832 | 106430 | 0.7168 | - |
| 4.6836 | 106440 | 0.7034 | - |
| 4.6841 | 106450 | 0.7017 | - |
| 4.6845 | 106460 | 0.7047 | - |
| 4.6849 | 106470 | 0.7247 | - |
| 4.6854 | 106480 | 0.7234 | - |
| 4.6858 | 106490 | 0.7319 | - |
| 4.6863 | 106500 | 0.6958 | - |
| 4.6867 | 106510 | 0.6833 | - |
| 4.6871 | 106520 | 0.7117 | - |
| 4.6876 | 106530 | 0.6814 | - |
| 4.6880 | 106540 | 0.6682 | - |
| 4.6885 | 106550 | 0.7086 | - |
| 4.6889 | 106560 | 0.6938 | - |
| 4.6893 | 106570 | 0.6891 | - |
| 4.6898 | 106580 | 0.7253 | - |
| 4.6902 | 106590 | 0.719 | - |
| 4.6907 | 106600 | 0.7276 | - |
| 4.6911 | 106610 | 0.6643 | - |
| 4.6915 | 106620 | 0.7153 | - |
| 4.6920 | 106630 | 0.7357 | - |
| 4.6924 | 106640 | 0.7483 | - |
| 4.6929 | 106650 | 0.7266 | - |
| 4.6933 | 106660 | 0.7105 | - |
| 4.6937 | 106670 | 0.6847 | - |
| 4.6942 | 106680 | 0.7102 | - |
| 4.6946 | 106690 | 0.703 | - |
| 4.6951 | 106700 | 0.7165 | - |
| 4.6955 | 106710 | 0.6965 | - |
| 4.6959 | 106720 | 0.7248 | - |
| 4.6964 | 106730 | 0.7291 | - |
| 4.6968 | 106740 | 0.6993 | - |
| 4.6973 | 106750 | 0.6855 | - |
| 4.6977 | 106760 | 0.7311 | - |
| 4.6981 | 106770 | 0.7326 | - |
| 4.6986 | 106780 | 0.7241 | - |
| 4.6990 | 106790 | 0.691 | - |
| 4.6995 | 106800 | 0.7265 | - |
| 4.6999 | 106810 | 0.7003 | - |
| 4.7003 | 106820 | 0.7166 | - |
| 4.7008 | 106830 | 0.676 | - |
| 4.7012 | 106840 | 0.7152 | - |
| 4.7017 | 106850 | 0.6898 | - |
| 4.7021 | 106860 | 0.714 | - |
| 4.7025 | 106870 | 0.7216 | - |
| 4.7029 | 106878 | - | 1.3756 |
| 4.7030 | 106880 | 0.7221 | - |
| 4.7034 | 106890 | 0.7175 | - |
| 4.7039 | 106900 | 0.7132 | - |
| 4.7043 | 106910 | 0.6886 | - |
| 4.7047 | 106920 | 0.7409 | - |
| 4.7052 | 106930 | 0.7063 | - |
| 4.7056 | 106940 | 0.7098 | - |
| 4.7061 | 106950 | 0.7246 | - |
| 4.7065 | 106960 | 0.7326 | - |
| 4.7069 | 106970 | 0.7132 | - |
| 4.7074 | 106980 | 0.7189 | - |
| 4.7078 | 106990 | 0.72 | - |
| 4.7083 | 107000 | 0.7194 | - |
| 4.7087 | 107010 | 0.7031 | - |
| 4.7091 | 107020 | 0.7197 | - |
| 4.7096 | 107030 | 0.7277 | - |
| 4.7100 | 107040 | 0.7002 | - |
| 4.7105 | 107050 | 0.7207 | - |
| 4.7109 | 107060 | 0.6811 | - |
| 4.7113 | 107070 | 0.7277 | - |
| 4.7118 | 107080 | 0.69 | - |
| 4.7122 | 107090 | 0.7246 | - |
| 4.7127 | 107100 | 0.7054 | - |
| 4.7131 | 107110 | 0.7089 | - |
| 4.7135 | 107120 | 0.7478 | - |
| 4.7140 | 107130 | 0.6825 | - |
| 4.7144 | 107140 | 0.7373 | - |
| 4.7149 | 107150 | 0.7236 | - |
| 4.7153 | 107160 | 0.6953 | - |
| 4.7157 | 107170 | 0.7167 | - |
| 4.7162 | 107180 | 0.682 | - |
| 4.7166 | 107190 | 0.7064 | - |
| 4.7171 | 107200 | 0.699 | - |
| 4.7175 | 107210 | 0.6995 | - |
| 4.7179 | 107220 | 0.7242 | - |
| 4.7184 | 107230 | 0.7181 | - |
| 4.7188 | 107240 | 0.7064 | - |
| 4.7193 | 107250 | 0.7415 | - |
| 4.7197 | 107260 | 0.7114 | - |
| 4.7201 | 107270 | 0.7596 | - |
| 4.7206 | 107280 | 0.6959 | - |
| 4.7210 | 107290 | 0.7071 | - |
| 4.7215 | 107300 | 0.7058 | - |
| 4.7219 | 107310 | 0.7083 | - |
| 4.7223 | 107320 | 0.7171 | - |
| 4.7228 | 107330 | 0.6997 | - |
| 4.7232 | 107340 | 0.7579 | - |
| 4.7237 | 107350 | 0.6721 | - |
| 4.7241 | 107360 | 0.7327 | - |
| 4.7245 | 107370 | 0.7305 | - |
| 4.7250 | 107380 | 0.6811 | - |
| 4.7254 | 107390 | 0.7146 | - |
| 4.7259 | 107400 | 0.6765 | - |
| 4.7263 | 107410 | 0.704 | - |
| 4.7267 | 107420 | 0.7321 | - |
| 4.7272 | 107430 | 0.7081 | - |
| 4.7276 | 107440 | 0.7174 | - |
| 4.7281 | 107450 | 0.7381 | - |
| 4.7285 | 107460 | 0.7169 | - |
| 4.7289 | 107470 | 0.7344 | - |
| 4.7294 | 107480 | 0.7104 | - |
| 4.7298 | 107490 | 0.6736 | - |
| 4.7303 | 107500 | 0.7059 | - |
| 4.7307 | 107510 | 0.7076 | - |
| 4.7311 | 107520 | 0.7035 | - |
| 4.7316 | 107530 | 0.7432 | - |
| 4.7320 | 107540 | 0.7298 | - |
| 4.7325 | 107550 | 0.743 | - |
| 4.7329 | 107560 | 0.6638 | - |
| 4.7333 | 107570 | 0.7352 | - |
| 4.7338 | 107580 | 0.7299 | - |
| 4.7342 | 107590 | 0.7211 | - |
| 4.7347 | 107600 | 0.7015 | - |
| 4.7351 | 107610 | 0.7085 | - |
| 4.7355 | 107620 | 0.686 | - |
| 4.7360 | 107630 | 0.7222 | - |
| 4.7364 | 107640 | 0.715 | - |
| 4.7369 | 107650 | 0.7094 | - |
| 4.7373 | 107660 | 0.7403 | - |
| 4.7377 | 107670 | 0.7007 | - |
| 4.7382 | 107680 | 0.643 | - |
| 4.7386 | 107690 | 0.7382 | - |
| 4.7391 | 107700 | 0.6948 | - |
| 4.7395 | 107710 | 0.7231 | - |
| 4.7399 | 107720 | 0.7321 | - |
| 4.7404 | 107730 | 0.6847 | - |
| 4.7408 | 107740 | 0.7413 | - |
| 4.7413 | 107750 | 0.6922 | - |
| 4.7417 | 107760 | 0.7184 | - |
| 4.7421 | 107770 | 0.7241 | - |
| 4.7426 | 107780 | 0.6651 | - |
| 4.7430 | 107790 | 0.6827 | - |
| 4.7435 | 107800 | 0.7096 | - |
| 4.7439 | 107810 | 0.7138 | - |
| 4.7443 | 107820 | 0.7241 | - |
| 4.7448 | 107830 | 0.7315 | - |
| 4.7452 | 107840 | 0.6989 | - |
| 4.7457 | 107850 | 0.7066 | - |
| 4.7461 | 107860 | 0.7115 | - |
| 4.7465 | 107870 | 0.7513 | - |
| 4.7470 | 107880 | 0.7279 | - |
| 4.7474 | 107890 | 0.7125 | - |
| 4.7479 | 107900 | 0.7022 | - |
| 4.7483 | 107910 | 0.7278 | - |
| 4.7487 | 107920 | 0.7465 | - |
| 4.7492 | 107930 | 0.7153 | - |
| 4.7496 | 107940 | 0.7132 | - |
| 4.7501 | 107950 | 0.712 | - |
| 4.7505 | 107960 | 0.704 | - |
| 4.7509 | 107970 | 0.7162 | - |
| 4.7514 | 107980 | 0.7125 | - |
| 4.7518 | 107990 | 0.7043 | - |
| 4.7523 | 108000 | 0.6869 | - |
| 4.7527 | 108010 | 0.716 | - |
| 4.7529 | 108015 | - | 1.3755 |
| 4.7531 | 108020 | 0.6936 | - |
| 4.7536 | 108030 | 0.7356 | - |
| 4.7540 | 108040 | 0.7217 | - |
| 4.7545 | 108050 | 0.7523 | - |
| 4.7549 | 108060 | 0.7347 | - |
| 4.7553 | 108070 | 0.7103 | - |
| 4.7558 | 108080 | 0.7033 | - |
| 4.7562 | 108090 | 0.6971 | - |
| 4.7567 | 108100 | 0.7203 | - |
| 4.7571 | 108110 | 0.7093 | - |
| 4.7575 | 108120 | 0.745 | - |
| 4.7580 | 108130 | 0.7025 | - |
| 4.7584 | 108140 | 0.7163 | - |
| 4.7589 | 108150 | 0.7389 | - |
| 4.7593 | 108160 | 0.6935 | - |
| 4.7597 | 108170 | 0.6962 | - |
| 4.7602 | 108180 | 0.6958 | - |
| 4.7606 | 108190 | 0.6942 | - |
| 4.7611 | 108200 | 0.7022 | - |
| 4.7615 | 108210 | 0.7007 | - |
| 4.7619 | 108220 | 0.6596 | - |
| 4.7624 | 108230 | 0.7384 | - |
| 4.7628 | 108240 | 0.6922 | - |
| 4.7633 | 108250 | 0.6999 | - |
| 4.7637 | 108260 | 0.7104 | - |
| 4.7641 | 108270 | 0.7527 | - |
| 4.7646 | 108280 | 0.7039 | - |
| 4.7650 | 108290 | 0.6955 | - |
| 4.7655 | 108300 | 0.7443 | - |
| 4.7659 | 108310 | 0.7163 | - |
| 4.7663 | 108320 | 0.6909 | - |
| 4.7668 | 108330 | 0.7046 | - |
| 4.7672 | 108340 | 0.7235 | - |
| 4.7677 | 108350 | 0.7281 | - |
| 4.7681 | 108360 | 0.7163 | - |
| 4.7685 | 108370 | 0.695 | - |
| 4.7690 | 108380 | 0.7408 | - |
| 4.7694 | 108390 | 0.6719 | - |
| 4.7699 | 108400 | 0.7396 | - |
| 4.7703 | 108410 | 0.7229 | - |
| 4.7707 | 108420 | 0.7139 | - |
| 4.7712 | 108430 | 0.7706 | - |
| 4.7716 | 108440 | 0.7428 | - |
| 4.7721 | 108450 | 0.7184 | - |
| 4.7725 | 108460 | 0.708 | - |
| 4.7729 | 108470 | 0.716 | - |
| 4.7734 | 108480 | 0.7089 | - |
| 4.7738 | 108490 | 0.6827 | - |
| 4.7743 | 108500 | 0.7119 | - |
| 4.7747 | 108510 | 0.7479 | - |
| 4.7751 | 108520 | 0.6776 | - |
| 4.7756 | 108530 | 0.722 | - |
| 4.7760 | 108540 | 0.6848 | - |
| 4.7765 | 108550 | 0.6974 | - |
| 4.7769 | 108560 | 0.694 | - |
| 4.7773 | 108570 | 0.7 | - |
| 4.7778 | 108580 | 0.7163 | - |
| 4.7782 | 108590 | 0.7214 | - |
| 4.7787 | 108600 | 0.6909 | - |
| 4.7791 | 108610 | 0.708 | - |
| 4.7795 | 108620 | 0.693 | - |
| 4.7800 | 108630 | 0.6845 | - |
| 4.7804 | 108640 | 0.6954 | - |
| 4.7809 | 108650 | 0.7187 | - |
| 4.7813 | 108660 | 0.7272 | - |
| 4.7817 | 108670 | 0.7164 | - |
| 4.7822 | 108680 | 0.7118 | - |
| 4.7826 | 108690 | 0.6895 | - |
| 4.7831 | 108700 | 0.6917 | - |
| 4.7835 | 108710 | 0.7038 | - |
| 4.7839 | 108720 | 0.7058 | - |
| 4.7844 | 108730 | 0.7183 | - |
| 4.7848 | 108740 | 0.7068 | - |
| 4.7853 | 108750 | 0.7125 | - |
| 4.7857 | 108760 | 0.7078 | - |
| 4.7861 | 108770 | 0.7003 | - |
| 4.7866 | 108780 | 0.725 | - |
| 4.7870 | 108790 | 0.7064 | - |
| 4.7875 | 108800 | 0.7182 | - |
| 4.7879 | 108810 | 0.7254 | - |
| 4.7883 | 108820 | 0.7092 | - |
| 4.7888 | 108830 | 0.6861 | - |
| 4.7892 | 108840 | 0.6878 | - |
| 4.7897 | 108850 | 0.6798 | - |
| 4.7901 | 108860 | 0.7327 | - |
| 4.7905 | 108870 | 0.712 | - |
| 4.7910 | 108880 | 0.6736 | - |
| 4.7914 | 108890 | 0.7073 | - |
| 4.7919 | 108900 | 0.7305 | - |
| 4.7923 | 108910 | 0.7083 | - |
| 4.7927 | 108920 | 0.7072 | - |
| 4.7932 | 108930 | 0.7088 | - |
| 4.7936 | 108940 | 0.7059 | - |
| 4.7941 | 108950 | 0.7238 | - |
| 4.7945 | 108960 | 0.7228 | - |
| 4.7949 | 108970 | 0.7135 | - |
| 4.7954 | 108980 | 0.6677 | - |
| 4.7958 | 108990 | 0.7307 | - |
| 4.7963 | 109000 | 0.6977 | - |
| 4.7967 | 109010 | 0.6746 | - |
| 4.7971 | 109020 | 0.682 | - |
| 4.7976 | 109030 | 0.7032 | - |
| 4.7980 | 109040 | 0.707 | - |
| 4.7985 | 109050 | 0.7148 | - |
| 4.7989 | 109060 | 0.7099 | - |
| 4.7993 | 109070 | 0.7166 | - |
| 4.7998 | 109080 | 0.6709 | - |
| 4.8002 | 109090 | 0.7027 | - |
| 4.8007 | 109100 | 0.7312 | - |
| 4.8011 | 109110 | 0.7308 | - |
| 4.8015 | 109120 | 0.6971 | - |
| 4.8020 | 109130 | 0.6904 | - |
| 4.8024 | 109140 | 0.7009 | - |
| 4.8029 | 109150 | 0.7145 | - |
| 4.8030 | 109152 | - | 1.3751 |
| 4.8033 | 109160 | 0.6731 | - |
| 4.8037 | 109170 | 0.7049 | - |
| 4.8042 | 109180 | 0.7153 | - |
| 4.8046 | 109190 | 0.7011 | - |
| 4.8051 | 109200 | 0.7431 | - |
| 4.8055 | 109210 | 0.7239 | - |
| 4.8059 | 109220 | 0.7133 | - |
| 4.8064 | 109230 | 0.7032 | - |
| 4.8068 | 109240 | 0.7119 | - |
| 4.8073 | 109250 | 0.7216 | - |
| 4.8077 | 109260 | 0.7101 | - |
| 4.8081 | 109270 | 0.7204 | - |
| 4.8086 | 109280 | 0.6913 | - |
| 4.8090 | 109290 | 0.6714 | - |
| 4.8095 | 109300 | 0.7087 | - |
| 4.8099 | 109310 | 0.6952 | - |
| 4.8103 | 109320 | 0.7131 | - |
| 4.8108 | 109330 | 0.7231 | - |
| 4.8112 | 109340 | 0.6835 | - |
| 4.8117 | 109350 | 0.713 | - |
| 4.8121 | 109360 | 0.7372 | - |
| 4.8125 | 109370 | 0.6933 | - |
| 4.8130 | 109380 | 0.7097 | - |
| 4.8134 | 109390 | 0.7398 | - |
| 4.8139 | 109400 | 0.6994 | - |
| 4.8143 | 109410 | 0.7267 | - |
| 4.8147 | 109420 | 0.7015 | - |
| 4.8152 | 109430 | 0.6781 | - |
| 4.8156 | 109440 | 0.7138 | - |
| 4.8161 | 109450 | 0.6784 | - |
| 4.8165 | 109460 | 0.7188 | - |
| 4.8169 | 109470 | 0.7284 | - |
| 4.8174 | 109480 | 0.7326 | - |
| 4.8178 | 109490 | 0.6896 | - |
| 4.8183 | 109500 | 0.6551 | - |
| 4.8187 | 109510 | 0.7081 | - |
| 4.8191 | 109520 | 0.6999 | - |
| 4.8196 | 109530 | 0.6848 | - |
| 4.8200 | 109540 | 0.7203 | - |
| 4.8205 | 109550 | 0.7479 | - |
| 4.8209 | 109560 | 0.6917 | - |
| 4.8213 | 109570 | 0.7185 | - |
| 4.8218 | 109580 | 0.7127 | - |
| 4.8222 | 109590 | 0.7167 | - |
| 4.8227 | 109600 | 0.7286 | - |
| 4.8231 | 109610 | 0.7219 | - |
| 4.8236 | 109620 | 0.6841 | - |
| 4.8240 | 109630 | 0.7027 | - |
| 4.8244 | 109640 | 0.6878 | - |
| 4.8249 | 109650 | 0.6926 | - |
| 4.8253 | 109660 | 0.7103 | - |
| 4.8258 | 109670 | 0.6962 | - |
| 4.8262 | 109680 | 0.7063 | - |
| 4.8266 | 109690 | 0.678 | - |
| 4.8271 | 109700 | 0.6786 | - |
| 4.8275 | 109710 | 0.7036 | - |
| 4.8280 | 109720 | 0.6907 | - |
| 4.8284 | 109730 | 0.7104 | - |
| 4.8288 | 109740 | 0.6945 | - |
| 4.8293 | 109750 | 0.6941 | - |
| 4.8297 | 109760 | 0.6688 | - |
| 4.8302 | 109770 | 0.6995 | - |
| 4.8306 | 109780 | 0.7255 | - |
| 4.8310 | 109790 | 0.7183 | - |
| 4.8315 | 109800 | 0.6746 | - |
| 4.8319 | 109810 | 0.7297 | - |
| 4.8324 | 109820 | 0.6865 | - |
| 4.8328 | 109830 | 0.7124 | - |
| 4.8332 | 109840 | 0.7178 | - |
| 4.8337 | 109850 | 0.7352 | - |
| 4.8341 | 109860 | 0.7152 | - |
| 4.8346 | 109870 | 0.7225 | - |
| 4.8350 | 109880 | 0.7081 | - |
| 4.8354 | 109890 | 0.6946 | - |
| 4.8359 | 109900 | 0.7385 | - |
| 4.8363 | 109910 | 0.7411 | - |
| 4.8368 | 109920 | 0.7153 | - |
| 4.8372 | 109930 | 0.6994 | - |
| 4.8376 | 109940 | 0.6983 | - |
| 4.8381 | 109950 | 0.7445 | - |
| 4.8385 | 109960 | 0.7201 | - |
| 4.8390 | 109970 | 0.7365 | - |
| 4.8394 | 109980 | 0.7079 | - |
| 4.8398 | 109990 | 0.7198 | - |
| 4.8403 | 110000 | 0.7036 | - |
| 4.8407 | 110010 | 0.7128 | - |
| 4.8412 | 110020 | 0.7533 | - |
| 4.8416 | 110030 | 0.699 | - |
| 4.8420 | 110040 | 0.6869 | - |
| 4.8425 | 110050 | 0.7099 | - |
| 4.8429 | 110060 | 0.7036 | - |
| 4.8434 | 110070 | 0.6974 | - |
| 4.8438 | 110080 | 0.7214 | - |
| 4.8442 | 110090 | 0.7362 | - |
| 4.8447 | 110100 | 0.669 | - |
| 4.8451 | 110110 | 0.6587 | - |
| 4.8456 | 110120 | 0.7084 | - |
| 4.8460 | 110130 | 0.7112 | - |
| 4.8464 | 110140 | 0.7099 | - |
| 4.8469 | 110150 | 0.7365 | - |
| 4.8473 | 110160 | 0.7094 | - |
| 4.8478 | 110170 | 0.7112 | - |
| 4.8482 | 110180 | 0.6977 | - |
| 4.8486 | 110190 | 0.7 | - |
| 4.8491 | 110200 | 0.6803 | - |
| 4.8495 | 110210 | 0.6929 | - |
| 4.8500 | 110220 | 0.7199 | - |
| 4.8504 | 110230 | 0.6988 | - |
| 4.8508 | 110240 | 0.6868 | - |
| 4.8513 | 110250 | 0.6879 | - |
| 4.8517 | 110260 | 0.7251 | - |
| 4.8522 | 110270 | 0.6984 | - |
| 4.8526 | 110280 | 0.6973 | - |
| 4.8530 | 110289 | - | 1.3750 |
| 4.8530 | 110290 | 0.7173 | - |
| 4.8535 | 110300 | 0.7259 | - |
| 4.8539 | 110310 | 0.7142 | - |
| 4.8544 | 110320 | 0.7084 | - |
| 4.8548 | 110330 | 0.7235 | - |
| 4.8552 | 110340 | 0.6895 | - |
| 4.8557 | 110350 | 0.7072 | - |
| 4.8561 | 110360 | 0.6928 | - |
| 4.8566 | 110370 | 0.7275 | - |
| 4.8570 | 110380 | 0.7098 | - |
| 4.8574 | 110390 | 0.689 | - |
| 4.8579 | 110400 | 0.7059 | - |
| 4.8583 | 110410 | 0.747 | - |
| 4.8588 | 110420 | 0.6811 | - |
| 4.8592 | 110430 | 0.6998 | - |
| 4.8596 | 110440 | 0.7264 | - |
| 4.8601 | 110450 | 0.694 | - |
| 4.8605 | 110460 | 0.7129 | - |
| 4.8610 | 110470 | 0.698 | - |
| 4.8614 | 110480 | 0.7049 | - |
| 4.8618 | 110490 | 0.721 | - |
| 4.8623 | 110500 | 0.7147 | - |
| 4.8627 | 110510 | 0.6709 | - |
| 4.8632 | 110520 | 0.6995 | - |
| 4.8636 | 110530 | 0.7162 | - |
| 4.8640 | 110540 | 0.6986 | - |
| 4.8645 | 110550 | 0.7014 | - |
| 4.8649 | 110560 | 0.7287 | - |
| 4.8654 | 110570 | 0.6749 | - |
| 4.8658 | 110580 | 0.714 | - |
| 4.8662 | 110590 | 0.7019 | - |
| 4.8667 | 110600 | 0.725 | - |
| 4.8671 | 110610 | 0.7001 | - |
| 4.8676 | 110620 | 0.731 | - |
| 4.8680 | 110630 | 0.7207 | - |
| 4.8684 | 110640 | 0.6462 | - |
| 4.8689 | 110650 | 0.7153 | - |
| 4.8693 | 110660 | 0.7043 | - |
| 4.8698 | 110670 | 0.7144 | - |
| 4.8702 | 110680 | 0.7014 | - |
| 4.8706 | 110690 | 0.7588 | - |
| 4.8711 | 110700 | 0.7196 | - |
| 4.8715 | 110710 | 0.7019 | - |
| 4.8720 | 110720 | 0.7045 | - |
| 4.8724 | 110730 | 0.6871 | - |
| 4.8728 | 110740 | 0.711 | - |
| 4.8733 | 110750 | 0.7226 | - |
| 4.8737 | 110760 | 0.6941 | - |
| 4.8742 | 110770 | 0.7245 | - |
| 4.8746 | 110780 | 0.7276 | - |
| 4.8750 | 110790 | 0.6806 | - |
| 4.8755 | 110800 | 0.7088 | - |
| 4.8759 | 110810 | 0.6722 | - |
| 4.8764 | 110820 | 0.7207 | - |
| 4.8768 | 110830 | 0.7149 | - |
| 4.8772 | 110840 | 0.7132 | - |
| 4.8777 | 110850 | 0.7331 | - |
| 4.8781 | 110860 | 0.6781 | - |
| 4.8786 | 110870 | 0.669 | - |
| 4.8790 | 110880 | 0.7258 | - |
| 4.8794 | 110890 | 0.712 | - |
| 4.8799 | 110900 | 0.7268 | - |
| 4.8803 | 110910 | 0.7172 | - |
| 4.8808 | 110920 | 0.7305 | - |
| 4.8812 | 110930 | 0.703 | - |
| 4.8816 | 110940 | 0.6728 | - |
| 4.8821 | 110950 | 0.6895 | - |
| 4.8825 | 110960 | 0.7168 | - |
| 4.8830 | 110970 | 0.686 | - |
| 4.8834 | 110980 | 0.7206 | - |
| 4.8838 | 110990 | 0.7039 | - |
| 4.8843 | 111000 | 0.7127 | - |
| 4.8847 | 111010 | 0.7374 | - |
| 4.8852 | 111020 | 0.6949 | - |
| 4.8856 | 111030 | 0.7131 | - |
| 4.8860 | 111040 | 0.7161 | - |
| 4.8865 | 111050 | 0.7351 | - |
| 4.8869 | 111060 | 0.6993 | - |
| 4.8874 | 111070 | 0.7074 | - |
| 4.8878 | 111080 | 0.732 | - |
| 4.8882 | 111090 | 0.7488 | - |
| 4.8887 | 111100 | 0.698 | - |
| 4.8891 | 111110 | 0.7175 | - |
| 4.8896 | 111120 | 0.6604 | - |
| 4.8900 | 111130 | 0.7353 | - |
| 4.8904 | 111140 | 0.6972 | - |
| 4.8909 | 111150 | 0.7087 | - |
| 4.8913 | 111160 | 0.7262 | - |
| 4.8918 | 111170 | 0.6743 | - |
| 4.8922 | 111180 | 0.6978 | - |
| 4.8926 | 111190 | 0.6344 | - |
| 4.8931 | 111200 | 0.7162 | - |
| 4.8935 | 111210 | 0.7388 | - |
| 4.8940 | 111220 | 0.7107 | - |
| 4.8944 | 111230 | 0.6885 | - |
| 4.8948 | 111240 | 0.7111 | - |
| 4.8953 | 111250 | 0.7026 | - |
| 4.8957 | 111260 | 0.7286 | - |
| 4.8962 | 111270 | 0.6725 | - |
| 4.8966 | 111280 | 0.6951 | - |
| 4.8970 | 111290 | 0.727 | - |
| 4.8975 | 111300 | 0.7074 | - |
| 4.8979 | 111310 | 0.7221 | - |
| 4.8984 | 111320 | 0.6888 | - |
| 4.8988 | 111330 | 0.7195 | - |
| 4.8992 | 111340 | 0.7182 | - |
| 4.8997 | 111350 | 0.7044 | - |
| 4.9001 | 111360 | 0.7371 | - |
| 4.9006 | 111370 | 0.6799 | - |
| 4.9010 | 111380 | 0.723 | - |
| 4.9014 | 111390 | 0.6974 | - |
| 4.9019 | 111400 | 0.7339 | - |
| 4.9023 | 111410 | 0.741 | - |
| 4.9028 | 111420 | 0.6722 | - |
| 4.9030 | 111426 | - | 1.3746 |
| 4.9032 | 111430 | 0.7056 | - |
| 4.9036 | 111440 | 0.7061 | - |
| 4.9041 | 111450 | 0.7518 | - |
| 4.9045 | 111460 | 0.6994 | - |
| 4.9050 | 111470 | 0.6961 | - |
| 4.9054 | 111480 | 0.7261 | - |
| 4.9058 | 111490 | 0.6779 | - |
| 4.9063 | 111500 | 0.7155 | - |
| 4.9067 | 111510 | 0.69 | - |
| 4.9072 | 111520 | 0.6632 | - |
| 4.9076 | 111530 | 0.7181 | - |
| 4.9080 | 111540 | 0.7167 | - |
| 4.9085 | 111550 | 0.716 | - |
| 4.9089 | 111560 | 0.7224 | - |
| 4.9094 | 111570 | 0.6999 | - |
| 4.9098 | 111580 | 0.714 | - |
| 4.9102 | 111590 | 0.712 | - |
| 4.9107 | 111600 | 0.7072 | - |
| 4.9111 | 111610 | 0.7463 | - |
| 4.9116 | 111620 | 0.7036 | - |
| 4.9120 | 111630 | 0.7106 | - |
| 4.9124 | 111640 | 0.7163 | - |
| 4.9129 | 111650 | 0.659 | - |
| 4.9133 | 111660 | 0.7509 | - |
| 4.9138 | 111670 | 0.7419 | - |
| 4.9142 | 111680 | 0.6816 | - |
| 4.9146 | 111690 | 0.6977 | - |
| 4.9151 | 111700 | 0.7165 | - |
| 4.9155 | 111710 | 0.7007 | - |
| 4.9160 | 111720 | 0.715 | - |
| 4.9164 | 111730 | 0.7351 | - |
| 4.9168 | 111740 | 0.6661 | - |
| 4.9173 | 111750 | 0.7265 | - |
| 4.9177 | 111760 | 0.6917 | - |
| 4.9182 | 111770 | 0.7134 | - |
| 4.9186 | 111780 | 0.704 | - |
| 4.9190 | 111790 | 0.6905 | - |
| 4.9195 | 111800 | 0.733 | - |
| 4.9199 | 111810 | 0.7279 | - |
| 4.9204 | 111820 | 0.7433 | - |
| 4.9208 | 111830 | 0.7549 | - |
| 4.9212 | 111840 | 0.6911 | - |
| 4.9217 | 111850 | 0.6976 | - |
| 4.9221 | 111860 | 0.7186 | - |
| 4.9226 | 111870 | 0.7008 | - |
| 4.9230 | 111880 | 0.679 | - |
| 4.9234 | 111890 | 0.7156 | - |
| 4.9239 | 111900 | 0.7028 | - |
| 4.9243 | 111910 | 0.7182 | - |
| 4.9248 | 111920 | 0.712 | - |
| 4.9252 | 111930 | 0.7242 | - |
| 4.9256 | 111940 | 0.7064 | - |
| 4.9261 | 111950 | 0.735 | - |
| 4.9265 | 111960 | 0.7197 | - |
| 4.9270 | 111970 | 0.7508 | - |
| 4.9274 | 111980 | 0.7035 | - |
| 4.9278 | 111990 | 0.6633 | - |
| 4.9283 | 112000 | 0.7252 | - |
| 4.9287 | 112010 | 0.7123 | - |
| 4.9292 | 112020 | 0.7539 | - |
| 4.9296 | 112030 | 0.7137 | - |
| 4.9300 | 112040 | 0.7026 | - |
| 4.9305 | 112050 | 0.6984 | - |
| 4.9309 | 112060 | 0.6968 | - |
| 4.9314 | 112070 | 0.7057 | - |
| 4.9318 | 112080 | 0.6471 | - |
| 4.9322 | 112090 | 0.6854 | - |
| 4.9327 | 112100 | 0.7132 | - |
| 4.9331 | 112110 | 0.694 | - |
| 4.9336 | 112120 | 0.6949 | - |
| 4.9340 | 112130 | 0.6992 | - |
| 4.9344 | 112140 | 0.7146 | - |
| 4.9349 | 112150 | 0.7123 | - |
| 4.9353 | 112160 | 0.7083 | - |
| 4.9358 | 112170 | 0.658 | - |
| 4.9362 | 112180 | 0.7179 | - |
| 4.9366 | 112190 | 0.7282 | - |
| 4.9371 | 112200 | 0.7318 | - |
| 4.9375 | 112210 | 0.7058 | - |
| 4.9380 | 112220 | 0.6865 | - |
| 4.9384 | 112230 | 0.7404 | - |
| 4.9388 | 112240 | 0.7341 | - |
| 4.9393 | 112250 | 0.7046 | - |
| 4.9397 | 112260 | 0.7029 | - |
| 4.9402 | 112270 | 0.7156 | - |
| 4.9406 | 112280 | 0.7443 | - |
| 4.9410 | 112290 | 0.7038 | - |
| 4.9415 | 112300 | 0.7056 | - |
| 4.9419 | 112310 | 0.7333 | - |
| 4.9424 | 112320 | 0.713 | - |
| 4.9428 | 112330 | 0.7037 | - |
| 4.9432 | 112340 | 0.7021 | - |
| 4.9437 | 112350 | 0.7031 | - |
| 4.9441 | 112360 | 0.718 | - |
| 4.9446 | 112370 | 0.6707 | - |
| 4.9450 | 112380 | 0.7202 | - |
| 4.9454 | 112390 | 0.7136 | - |
| 4.9459 | 112400 | 0.7108 | - |
| 4.9463 | 112410 | 0.7161 | - |
| 4.9468 | 112420 | 0.7363 | - |
| 4.9472 | 112430 | 0.7029 | - |
| 4.9476 | 112440 | 0.6919 | - |
| 4.9481 | 112450 | 0.6834 | - |
| 4.9485 | 112460 | 0.7133 | - |
| 4.9490 | 112470 | 0.7103 | - |
| 4.9494 | 112480 | 0.7089 | - |
| 4.9498 | 112490 | 0.6971 | - |
| 4.9503 | 112500 | 0.705 | - |
| 4.9507 | 112510 | 0.7202 | - |
| 4.9512 | 112520 | 0.6655 | - |
| 4.9516 | 112530 | 0.7606 | - |
| 4.9520 | 112540 | 0.776 | - |
| 4.9525 | 112550 | 0.7216 | - |
| 4.9529 | 112560 | 0.7147 | - |
| 4.9530 | 112563 | - | 1.3751 |
| 4.9534 | 112570 | 0.7221 | - |
| 4.9538 | 112580 | 0.6801 | - |
| 4.9542 | 112590 | 0.7356 | - |
| 4.9547 | 112600 | 0.7144 | - |
| 4.9551 | 112610 | 0.718 | - |
| 4.9556 | 112620 | 0.6902 | - |
| 4.9560 | 112630 | 0.6805 | - |
| 4.9564 | 112640 | 0.7402 | - |
| 4.9569 | 112650 | 0.681 | - |
| 4.9573 | 112660 | 0.6968 | - |
| 4.9578 | 112670 | 0.7283 | - |
| 4.9582 | 112680 | 0.6899 | - |
| 4.9586 | 112690 | 0.7003 | - |
| 4.9591 | 112700 | 0.7219 | - |
| 4.9595 | 112710 | 0.6675 | - |
| 4.9600 | 112720 | 0.6912 | - |
| 4.9604 | 112730 | 0.7481 | - |
| 4.9608 | 112740 | 0.7095 | - |
| 4.9613 | 112750 | 0.69 | - |
| 4.9617 | 112760 | 0.7235 | - |
| 4.9622 | 112770 | 0.7264 | - |
| 4.9626 | 112780 | 0.7211 | - |
| 4.9630 | 112790 | 0.7352 | - |
| 4.9635 | 112800 | 0.6848 | - |
| 4.9639 | 112810 | 0.709 | - |
| 4.9644 | 112820 | 0.701 | - |
| 4.9648 | 112830 | 0.6757 | - |
| 4.9652 | 112840 | 0.7167 | - |
| 4.9657 | 112850 | 0.7376 | - |
| 4.9661 | 112860 | 0.7044 | - |
| 4.9666 | 112870 | 0.7118 | - |
| 4.9670 | 112880 | 0.7096 | - |
| 4.9674 | 112890 | 0.7192 | - |
| 4.9679 | 112900 | 0.7026 | - |
| 4.9683 | 112910 | 0.6882 | - |
| 4.9688 | 112920 | 0.702 | - |
| 4.9692 | 112930 | 0.7138 | - |
| 4.9696 | 112940 | 0.7345 | - |
| 4.9701 | 112950 | 0.7221 | - |
| 4.9705 | 112960 | 0.7101 | - |
| 4.9710 | 112970 | 0.7083 | - |
| 4.9714 | 112980 | 0.7122 | - |
| 4.9718 | 112990 | 0.7237 | - |
| 4.9723 | 113000 | 0.736 | - |
| 4.9727 | 113010 | 0.7058 | - |
| 4.9732 | 113020 | 0.6802 | - |
| 4.9736 | 113030 | 0.7262 | - |
| 4.9740 | 113040 | 0.7136 | - |
| 4.9745 | 113050 | 0.7081 | - |
| 4.9749 | 113060 | 0.6958 | - |
| 4.9754 | 113070 | 0.7218 | - |
| 4.9758 | 113080 | 0.7053 | - |
| 4.9762 | 113090 | 0.6712 | - |
| 4.9767 | 113100 | 0.6933 | - |
| 4.9771 | 113110 | 0.7022 | - |
| 4.9776 | 113120 | 0.6873 | - |
| 4.9780 | 113130 | 0.6951 | - |
| 4.9784 | 113140 | 0.7214 | - |
| 4.9789 | 113150 | 0.718 | - |
| 4.9793 | 113160 | 0.7307 | - |
| 4.9798 | 113170 | 0.7044 | - |
| 4.9802 | 113180 | 0.7048 | - |
| 4.9806 | 113190 | 0.7014 | - |
| 4.9811 | 113200 | 0.716 | - |
| 4.9815 | 113210 | 0.6879 | - |
| 4.9820 | 113220 | 0.6717 | - |
| 4.9824 | 113230 | 0.7057 | - |
| 4.9828 | 113240 | 0.7079 | - |
| 4.9833 | 113250 | 0.682 | - |
| 4.9837 | 113260 | 0.6997 | - |
| 4.9842 | 113270 | 0.6898 | - |
| 4.9846 | 113280 | 0.6854 | - |
| 4.9850 | 113290 | 0.6676 | - |
| 4.9855 | 113300 | 0.6925 | - |
| 4.9859 | 113310 | 0.7083 | - |
| 4.9864 | 113320 | 0.7377 | - |
| 4.9868 | 113330 | 0.7039 | - |
| 4.9872 | 113340 | 0.7429 | - |
| 4.9877 | 113350 | 0.6891 | - |
| 4.9881 | 113360 | 0.7215 | - |
| 4.9886 | 113370 | 0.7033 | - |
| 4.9890 | 113380 | 0.6724 | - |
| 4.9894 | 113390 | 0.7015 | - |
| 4.9899 | 113400 | 0.7404 | - |
| 4.9903 | 113410 | 0.7013 | - |
| 4.9908 | 113420 | 0.7216 | - |
| 4.9912 | 113430 | 0.7182 | - |
| 4.9916 | 113440 | 0.7018 | - |
| 4.9921 | 113450 | 0.7147 | - |
| 4.9925 | 113460 | 0.6867 | - |
| 4.9930 | 113470 | 0.7026 | - |
| 4.9934 | 113480 | 0.6539 | - |
| 4.9938 | 113490 | 0.6845 | - |
| 4.9943 | 113500 | 0.7073 | - |
| 4.9947 | 113510 | 0.685 | - |
| 4.9952 | 113520 | 0.7001 | - |
| 4.9956 | 113530 | 0.7356 | - |
| 4.9960 | 113540 | 0.6959 | - |
| 4.9965 | 113550 | 0.6758 | - |
| 4.9969 | 113560 | 0.7473 | - |
| 4.9974 | 113570 | 0.668 | - |
| 4.9978 | 113580 | 0.6844 | - |
| 4.9982 | 113590 | 0.6963 | - |
| 4.9987 | 113600 | 0.6713 | - |
| 4.9991 | 113610 | 0.6758 | - |
| 4.9996 | 113620 | 0.7356 | - |
| 5.0 | 113630 | 0.7251 | - |
</details>
### Framework Versions
- Python: 3.11.8
- Sentence Transformers: 3.1.1
- Transformers: 4.45.1
- PyTorch: 2.5.1.post302
- Accelerate: 0.34.2
- Datasets: 3.0.0
- Tokenizers: 0.20.0
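For reproducibility, a minimal sketch that checks an installed stack against the versions pinned above. The package names are the standard PyPI distributions and the version literals are copied from this list; adjust if your install differs:
```python
# Minimal environment check against the versions pinned above.
# Assumes the standard PyPI package names; version literals come from this card.
expected = {
    "sentence_transformers": "3.1.1",
    "transformers": "4.45.1",
    "torch": "2.5.1.post302",
    "accelerate": "0.34.2",
    "datasets": "3.0.0",
    "tokenizers": "0.20.0",
}

for name, version in expected.items():
    installed = __import__(name).__version__  # each package exposes __version__
    status = "ok" if installed == version else f"mismatch (found {installed})"
    print(f"{name}: {status}")
```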
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MaskedCachedMultipleNegativesRankingLoss
```bibtex
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
|
furrutiav/neobert_mixtral_nllfg_vanilla_cola_tf_idf_centroid
|
furrutiav
| 2025-03-17T23:05:56Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"neobert",
"feature-extraction",
"custom_code",
"arxiv:1910.09700",
"region:us"
] |
feature-extraction
| 2025-03-17T23:05:12Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
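Pending details from the authors, a minimal loading sketch based only on this card's tags (`transformers`, `feature-extraction`, `custom_code`). The exact output structure depends on the repository's custom code and is not confirmed by the card:
```python
# Hedged sketch: loads the checkpoint with remote code enabled, as the
# "custom_code" tag suggests. The model's return type is repository-specific.
from transformers import AutoModel, AutoTokenizer

model_id = "furrutiav/neobert_mixtral_nllfg_vanilla_cola_tf_idf_centroid"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("An example sentence.", return_tensors="pt")
outputs = model(**inputs)
print(type(outputs))  # inspect the custom model's output before relying on fields
```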
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
tornwhisp/FictionalQuartz
|
tornwhisp
| 2025-03-17T23:03:35Z
| 0
| 0
| null |
[
"license:apache-2.0",
"region:us"
] | null | 2025-03-17T23:03:23Z
|
---
license: apache-2.0
---
|
aghadge/email_phishing
|
aghadge
| 2025-03-17T23:03:33Z
| 90
| 0
|
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"trl",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-11T18:46:33Z
|
---
base_model: unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** aghadge
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
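The card does not include a loading example. Below is a minimal sketch, assuming the repository's safetensors weights are Transformers-compatible; the prompt is illustrative, since the card documents no input format:

```python
# Minimal loading sketch (assumption: the safetensors weights load with Transformers).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "aghadge/email_phishing"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical prompt; the card does not document an expected input format.
prompt = "Classify the following email as phishing or legitimate:\nDear user, your account..."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```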
|
lesso16/980b6266-c4d4-4835-bdf4-573c12d62564
|
lesso16
| 2025-03-17T23:02:56Z
| 0
| 0
|
peft
|
[
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM-1.7B",
"base_model:adapter:unsloth/SmolLM-1.7B",
"license:apache-2.0",
"region:us"
] | null | 2025-03-17T18:22:05Z
|
---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM-1.7B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 980b6266-c4d4-4835-bdf4-573c12d62564
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM-1.7B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- df51bd19b3e6085d_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/df51bd19b3e6085d_train_data.json
type:
field_input: conversation
field_instruction: note
field_output: summary
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
do_eval: true
early_stopping_patience: 3
eval_batch_size: 4
eval_max_new_tokens: 128
eval_steps: 500
evals_per_epoch: null
flash_attention: true
fp16: false
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 8
gradient_checkpointing: true
group_by_length: true
hub_model_id: lesso16/980b6266-c4d4-4835-bdf4-573c12d62564
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.000216
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 50
lora_alpha: 128
lora_dropout: 0.15
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 64
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_steps: 2000
micro_batch_size: 4
mlflow_experiment_name: /tmp/df51bd19b3e6085d_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 10
optimizer: adamw_torch_fused
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 500
saves_per_epoch: null
seed: 160
sequence_len: 1024
strict: false
tf32: true
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: d7f96eda-e66f-4f96-a567-ca0a1a2c655c
wandb_project: 16a
wandb_run: your_name
wandb_runid: d7f96eda-e66f-4f96-a567-ca0a1a2c655c
warmup_steps: 100
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 980b6266-c4d4-4835-bdf4-573c12d62564
This model is a fine-tuned version of [unsloth/SmolLM-1.7B](https://huggingface.co/unsloth/SmolLM-1.7B) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1351
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.000216
- train_batch_size: 4
- eval_batch_size: 4
- seed: 160
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 2000
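The effective batch size is the product of the micro batch size and the gradient accumulation steps; a quick check (assuming single-GPU training, which the card does not state):

```python
# Effective batch size check (assumption: a single GPU, so no data-parallel factor).
micro_batch_size = 4
gradient_accumulation_steps = 8
print(micro_batch_size * gradient_accumulation_steps)  # 32, matching total_train_batch_size
```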
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0012 | 1 | 0.9542 |
| 0.1649 | 0.5768 | 500 | 0.1632 |
| 0.1351 | 1.1540 | 1000 | 0.1446 |
| 0.1287 | 1.7308 | 1500 | 0.1369 |
| 0.1171 | 2.3080 | 2000 | 0.1351 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1
|
tornwhisp/FictionalJade
|
tornwhisp
| 2025-03-17T23:02:36Z
| 0
| 0
| null |
[
"license:apache-2.0",
"region:us"
] | null | 2025-03-17T23:02:16Z
|
---
license: apache-2.0
---
|
HyoungWook/DNA-R1-Q4_K_M-GGUF
|
HyoungWook
| 2025-03-17T23:02:11Z
| 0
| 0
|
transformers
|
[
"transformers",
"gguf",
"dnotitia",
"nlp",
"llm",
"slm",
"conversation",
"chat",
"reasoning",
"r1",
"llama-cpp",
"gguf-my-repo",
"text-generation",
"en",
"ko",
"base_model:dnotitia/DNA-R1",
"base_model:quantized:dnotitia/DNA-R1",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"conversational"
] |
text-generation
| 2025-03-17T23:01:33Z
|
---
base_model: dnotitia/DNA-R1
language:
- en
- ko
library_name: transformers
license: cc-by-nc-4.0
pipeline_tag: text-generation
tags:
- dnotitia
- nlp
- llm
- slm
- conversation
- chat
- reasoning
- r1
- llama-cpp
- gguf-my-repo
---
# HyoungWook/DNA-R1-Q4_K_M-GGUF
This model was converted to GGUF format from [`dnotitia/DNA-R1`](https://huggingface.co/dnotitia/DNA-R1) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/dnotitia/DNA-R1) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo HyoungWook/DNA-R1-Q4_K_M-GGUF --hf-file dna-r1-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo HyoungWook/DNA-R1-Q4_K_M-GGUF --hf-file dna-r1-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo HyoungWook/DNA-R1-Q4_K_M-GGUF --hf-file dna-r1-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo HyoungWook/DNA-R1-Q4_K_M-GGUF --hf-file dna-r1-q4_k_m.gguf -c 2048
```
|
Triangle104/Cydonia-24B-v2.1-Q3_K_S-GGUF
|
Triangle104
| 2025-03-17T23:01:14Z
| 0
| 0
| null |
[
"gguf",
"llama-cpp",
"gguf-my-repo",
"base_model:TheDrummer/Cydonia-24B-v2.1",
"base_model:quantized:TheDrummer/Cydonia-24B-v2.1",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-03-17T22:50:12Z
|
---
base_model: TheDrummer/Cydonia-24B-v2.1
license: other
tags:
- llama-cpp
- gguf-my-repo
---
# Triangle104/Cydonia-24B-v2.1-Q3_K_S-GGUF
This model was converted to GGUF format from [`TheDrummer/Cydonia-24B-v2.1`](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1) for more details on the model.
---
## Supported Chat Templates
- Mistral v7 Tekken (recommended)
- Metharme (may require some patching)
- Alpaca (worth a try for story)
## Description
Cydonia 24B v2.1 is a finetune of Mistral's latest 'Small' model (2501).
Further tuning was done to improve prose, foster creativity, and tone down positivity.
---
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_S-GGUF --hf-file cydonia-24b-v2.1-q3_k_s.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_S-GGUF --hf-file cydonia-24b-v2.1-q3_k_s.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_S-GGUF --hf-file cydonia-24b-v2.1-q3_k_s.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Triangle104/Cydonia-24B-v2.1-Q3_K_S-GGUF --hf-file cydonia-24b-v2.1-q3_k_s.gguf -c 2048
```
|
stojchet/kto5-sft1
|
stojchet
| 2025-03-17T23:00:50Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"llama",
"text-generation",
"trl",
"sft",
"generated_from_trainer",
"base_model:stojchet/kto5",
"base_model:finetune:stojchet/kto5",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T22:50:58Z
|
---
library_name: transformers
license: other
base_model: stojchet/kto5
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: kto5-sft1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# kto5-sft1
This model is a fine-tuned version of [stojchet/kto5](https://huggingface.co/stojchet/kto5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 5.9985
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 24.3316 | 2.3088 | 100 | 5.9985 |
### Framework versions
- Transformers 4.45.0
- Pytorch 2.5.1+cu124
- Datasets 2.19.2
- Tokenizers 0.20.3
|
Sophie-Rain-SpiderMan-viral-news/Sophie.Rain.SpiderMan.leaked.Video.x.twitter.trending
|
Sophie-Rain-SpiderMan-viral-news
| 2025-03-17T22:59:28Z
| 0
| 0
| null |
[
"region:us"
] | null | 2025-03-17T22:59:03Z
|
|
aidando73/Qwen2-0.5B-summarize-SFT-2025-03-17-43773
|
aidando73
| 2025-03-17T22:57:42Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2025-03-17T22:56:47Z
|
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
arham-15/llama2_7B_qphysics
|
arham-15
| 2025-03-17T22:55:29Z
| 0
| 0
|
transformers
|
[
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"base_model:finetune:meta-llama/Llama-2-7b-chat-hf",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2025-03-17T05:06:14Z
|
---
base_model:
- meta-llama/Llama-2-7b-chat-hf
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
### Llama 2 7B Physics
A large language model specialized for quantum-physics-related queries, fine-tuned from the Llama 2 7B chat model using the Unsloth library in Python.
### Usage
You can import and use the model using unsloth:
```python
from unsloth import FastLanguageModel
max_seq_length = 2048
model, tokenizer = FastLanguageModel.from_pretrained(
model_name = "arham-15/llama2_7B_qphysics",
max_seq_length = max_seq_length,
dtype = None,
load_in_4bit = True,
)
```
Alternatively, you can load it with the Hugging Face Transformers library:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "arham-15/llama2_7B_qphysics"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
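Neither snippet above shows inference; a short generation sketch (the prompt is illustrative, and the card specifies no prompt template) could look like:

```python
# Illustrative generation sketch following either loading path above.
# With the Unsloth path, FastLanguageModel.for_inference(model) can be called
# first for faster decoding.
inputs = tokenizer("Explain quantum entanglement briefly.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```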
### Results
The model was evaluated against its base model using perplexity and shows significant improvement on quantum-physics-related queries: out of 200 test questions, it achieved a lower perplexity than the base model on 126.
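The card does not include the evaluation script; a standard per-text perplexity computation, under the assumption of full-sequence negative log-likelihood with no sliding window, looks roughly like:

```python
# Sketch of a standard perplexity computation; not the author's exact evaluation code.
import torch

def perplexity(model, tokenizer, text: str) -> float:
    enc = tokenizer(text, return_tensors="pt").to(model.device)
    with torch.no_grad():
        # Passing labels makes the model return the mean token cross-entropy.
        loss = model(**enc, labels=enc["input_ids"]).loss
    return torch.exp(loss).item()  # lower is better
```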
|
medmekk/testest
|
medmekk
| 2025-03-17T22:52:00Z
| 0
| 0
| null |
[
"safetensors",
"llama",
"base_model:medmekk/testest",
"base_model:quantized:medmekk/testest",
"4-bit",
"bitsandbytes",
"region:us"
] | null | 2025-03-17T22:50:28Z
|
---
base_model:
- medmekk/testest
---
# medmekk/testest (Quantized)
## Description
This model is an int4-quantized version of the original model `medmekk/testest`, produced with bitsandbytes.
## Quantization Details
- **Quantization Type**: int4
- **bnb_4bit_quant_type**: nf4
- **bnb_4bit_use_double_quant**: True
- **bnb_4bit_compute_dtype**: bfloat16
- **bnb_4bit_quant_storage**: uint8
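For reference, reproducing this quantization from the original weights would use a `BitsAndBytesConfig` with the settings above. This is a sketch: the exact script is not included in the card, and since the card lists the same repo as its own base model, the model name below is a placeholder.

```python
# BitsAndBytesConfig matching the settings listed above (sketch, not the author's script).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_storage=torch.uint8,
)

original_model = "medmekk/testest"  # placeholder; the card lists this repo as its own base
model = AutoModelForCausalLM.from_pretrained(original_model, quantization_config=bnb_config)
```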
## Usage
You can use this model in your applications by loading it directly from the Hugging Face Hub:
```python
from transformers import AutoModel

model = AutoModel.from_pretrained("medmekk/testest")
```
|