modelId (string) | author (string) | last_modified (timestamp[us, tz=UTC]) | downloads (int64) | likes (int64) | library_name (string) | tags (sequence) | pipeline_tag (string) | createdAt (timestamp[us, tz=UTC]) | card (string)
---|---|---|---|---|---|---|---|---|---|
mrferr3t/83398edd-55b5-4b12-bc4e-aff81fe9140d | mrferr3t | 2025-01-26T08:59:29Z | 9 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/llama-2-7b-chat",
"base_model:adapter:unsloth/llama-2-7b-chat",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:58:25Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/llama-2-7b-chat
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 83398edd-55b5-4b12-bc4e-aff81fe9140d
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/llama-2-7b-chat
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- d54b8bbf3f45bb00_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/d54b8bbf3f45bb00_train_data.json
type:
field_input: reply
field_instruction: question
field_output: answer
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: mrferr3t/83398edd-55b5-4b12-bc4e-aff81fe9140d
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/d54b8bbf3f45bb00_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: f573a5a1-33e7-4cca-af15-6e4e2e847f12
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: f573a5a1-33e7-4cca-af15-6e4e2e847f12
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 83398edd-55b5-4b12-bc4e-aff81fe9140d
This model is a fine-tuned version of [unsloth/llama-2-7b-chat](https://huggingface.co/unsloth/llama-2-7b-chat) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7998
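This repository ships a PEFT LoRA adapter rather than full model weights, so it has to be attached to the base model listed above. A minimal loading sketch, assuming standard `transformers` + `peft` usage (the prompt is only an illustration):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/llama-2-7b-chat"
adapter_id = "mrferr3t/83398edd-55b5-4b12-bc4e-aff81fe9140d"

# Load the base model, then attach the LoRA adapter from this repo.
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)

inputs = tokenizer("What is the capital of France?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```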
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.8329 | 0.0025 | 1 | 0.8898 |
| 0.8174 | 0.0075 | 3 | 0.8875 |
| 0.8871 | 0.0149 | 6 | 0.8656 |
| 0.7355 | 0.0224 | 9 | 0.7998 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.3.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1 |
Moemu/Muice-2.7.1-Llama3-Chinese-8b-Instruct | Moemu | 2025-01-26T08:59:18Z | 6 | 0 | peft | [
"peft",
"safetensors",
"chinese",
"lora",
"llama3",
"zh",
"dataset:Moemu/Muice-Dataset",
"base_model:FlagAlpha/Llama3-Chinese-8B-Instruct",
"base_model:adapter:FlagAlpha/Llama3-Chinese-8B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:20:04Z | ---
license: apache-2.0
base_model: FlagAlpha/Llama3-Chinese-8B-Instruct
datasets:
- Moemu/Muice-Dataset
tags:
- chinese
- lora
- llama3
library_name: peft
language:
- zh
---
## Model Description
A Muice role-play model fine-tuned from [FlagAlpha/Llama3-Chinese-8B-Instruct](https://huggingface.co/FlagAlpha/Llama3-Chinese-8B-Instruct), showing a certain advantage in proactively starting conversations (with a fixed prompt). Using this model also requires the base model itself.
Muice's character setting: [Moemu/Muice-Chatbot](https://github.com/Moemu/Muice-Chatbot?tab=readme-ov-file#沐雪人设)
Part of the training dataset: [Moemu/Muice-Dataset](https://huggingface.co/datasets/Moemu/Muice-Dataset)
## Dialogue Examples

## System Prompt
The model uses a different System Prompt for each task. For details, see: [Muice-Chatbot/llm/utils/auto_system_prompt.py at main · Moemu/Muice-Chatbot](https://github.com/Moemu/Muice-Chatbot/blob/main/llm/utils/auto_system_prompt.py)
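As a rough sketch of where such a system prompt plugs in (assuming the base tokenizer ships a Llama-3 chat template; the placeholder text below is not one of the real Muice prompts):
```python
from transformers import AutoTokenizer

# Build a prompt with a task-specific system prompt; the real per-task prompts
# are defined in Muice-Chatbot's auto_system_prompt.py linked above.
tokenizer = AutoTokenizer.from_pretrained("FlagAlpha/Llama3-Chinese-8B-Instruct")
messages = [
    {"role": "system", "content": "<task-specific Muice system prompt>"},
    {"role": "user", "content": "你好,沐雪!"},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```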
## Evaluation
| Model | New-topic initiation score | Live-stream dialogue performance | Daily chat performance | Overall dialogue score |
| ----------------------------------------------- | -------------- | ------------ | ------------ | ------------ |
| Muice-2.3-chatglm2-6b-int4-pt-128-1e-2 | 2.80 | 4.00 | 4.33 | 3.45 |
| Muice-2.4-chatglm2-6b-int4-pt-128-1e-2 | 3.20 | 4.00 | 3.50 | 3.45 |
| Muice-2.4-Qwen2-1.5B-Instruct-GPTQ-Int4-2e-3 | 1.40 | 3.00 | 6.00 | 5.75 |
| Muice-2.5.3-Qwen2-1.5B-Instruct-GPTQ-Int4-2e-3 | 4.04 | 5.00 | 4.33 | 5.29 |
| Muice-2.6.2-Qwen-7B-Chat-Int4-5e-4 | 5.20 | 5.67 | 4.00 | 5.75 |
| Muice-2.7.0-Qwen-7B-Chat-Int4-1e-4 | 2.40 | 5.30 | 6.00 | \ |
| Muice-2.7.1-Qwen2.5-7B-Instruct-GPTQ-Int4-8e-4 | **5.40** | 4.60 | 4.50 | **6.76** |
| **Muice-2.7.1-Llama3_Chinese_8b_Instruct-8e-5** | 4.16 | **6.34** | **8.16** | 6.20 |
## Training Parameters
- learning_rate: 8e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 5.0
- mixed_precision_training: Native AMP
|
kk-aivio/603e64f2-51b2-480f-82fe-3c4121883835 | kk-aivio | 2025-01-26T08:59:06Z | 9 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/llama-2-7b-chat",
"base_model:adapter:unsloth/llama-2-7b-chat",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:57:54Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/llama-2-7b-chat
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 603e64f2-51b2-480f-82fe-3c4121883835
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/llama-2-7b-chat
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- d54b8bbf3f45bb00_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/d54b8bbf3f45bb00_train_data.json
type:
field_input: reply
field_instruction: question
field_output: answer
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: kk-aivio/603e64f2-51b2-480f-82fe-3c4121883835
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/d54b8bbf3f45bb00_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: f573a5a1-33e7-4cca-af15-6e4e2e847f12
wandb_project: Birthday-SN56-17-Gradients-On-Demand
wandb_run: your_name
wandb_runid: f573a5a1-33e7-4cca-af15-6e4e2e847f12
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 603e64f2-51b2-480f-82fe-3c4121883835
This model is a fine-tuned version of [unsloth/llama-2-7b-chat](https://huggingface.co/unsloth/llama-2-7b-chat) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0025 | 1 | nan |
| 0.0 | 0.0075 | 3 | nan |
| 0.0 | 0.0149 | 6 | nan |
| 0.0 | 0.0224 | 9 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
Luongdzung/hoa-1b4-sft-lit-olora | Luongdzung | 2025-01-26T08:56:06Z | 9 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"generated_from_trainer",
"base_model:vlsp-2023-vllm/hoa-1b4",
"base_model:adapter:vlsp-2023-vllm/hoa-1b4",
"license:bigscience-bloom-rail-1.0",
"region:us"
] | null | 2025-01-26T08:56:03Z | ---
library_name: peft
license: bigscience-bloom-rail-1.0
base_model: vlsp-2023-vllm/hoa-1b4
tags:
- generated_from_trainer
model-index:
- name: hoa-1b4-sft-lit-olora
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hoa-1b4-sft-lit-olora
This model is a fine-tuned version of [vlsp-2023-vllm/hoa-1b4](https://huggingface.co/vlsp-2023-vllm/hoa-1b4) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
### Framework versions
- PEFT 0.14.0
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1 |
myrulezzz/Llama3.2-1b-hr-fp16 | myrulezzz | 2025-01-26T08:54:28Z | 36 | 0 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/Llama-3.2-1B-Instruct-bnb-4bit",
"base_model:quantized:unsloth/Llama-3.2-1B-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-26T08:53:52Z | ---
base_model: unsloth/Llama-3.2-1B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** myrulezzz
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Llama-3.2-1B-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
sleepdeprived3/Qwen2.5-72b-RP-Ink_EXL2_3.5bpw_H8 | sleepdeprived3 | 2025-01-26T08:51:22Z | 6 | 0 | null | [
"safetensors",
"qwen2",
"conversational",
"roleplay",
"chat",
"base_model:Qwen/Qwen2.5-72B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-72B-Instruct",
"license:other",
"exl2",
"region:us"
] | null | 2025-01-26T06:55:46Z | ---
base_model:
- Qwen/Qwen2.5-72B-Instruct
tags:
- conversational
- roleplay
- chat
license: other
license_name: qwen
---
# Qwen 2.5 72b RP Ink

A roleplay-focused LoRA finetune of Qwen 2.5 72b Instruct. Methodology and hyperparams inspired by [SorcererLM](https://huggingface.co/rAIfle/SorcererLM-8x22b-bf16) and [Slush](https://huggingface.co/crestf411/Q2.5-32B-Slush).
Yet another model in the Ink series, following in the footsteps of [the 32b one](https://huggingface.co/allura-org/Qwen2.5-32b-RP-Ink) and [the Nemo one](https://huggingface.co/allura-org/MN-12b-RP-Ink).
## Testimonials
> [Compared to the 32b] felt a noticeable increase in coherence
\- ShotMisser64
> Yeah ep2's great!! made me actually wanna write a reply by myself for the first time in a few days
\- Maw
> This is the best RP I've ever had
\- 59smoke
> this makes me want to get another 3090 to run 72b
\- dysfunctional
## Dataset
The worst mix of data you've ever seen. Like, seriously, you do not want to see the things that went into this model. It's bad.
"this is like washing down an adderall with a bottle of methylated rotgut" - inflatebot
Update: I have already shared the (public datasets in the) data mix publicly, so here's that:
<details>
<img src=https://cdn-uploads.huggingface.co/production/uploads/634262af8d8089ebaefd410e/JtjUoKtbOfBZfSSKojTcj.png>
</details>
## Quants
[imatrix GGUFs by bartowski](https://huggingface.co/bartowski/Qwen2.5-72b-RP-Ink-GGUF)
## Recommended Settings
Chat template: ChatML
Recommended samplers (not the be-all-end-all, try some on your own!):
- Temp 0.83 / Top P 0.8 / Top A 0.3 / Rep Pen 1.03
- Your samplers can go here! :3
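For use outside a chat frontend, a minimal sketch of the ChatML template and the sampler values above (Top A is frontend-specific and omitted here; treat this as an illustration, not the card's official code):
```python
# Recommended sampler values from this card (Top A left out: it is not a
# standard transformers/llama.cpp sampling argument).
samplers = {"temperature": 0.83, "top_p": 0.8, "repetition_penalty": 1.03}

def chatml(system: str, user: str) -> str:
    # ChatML formatting, as recommended above.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(chatml("You are a roleplay partner.", "Set the scene in a rainy city."))
print(samplers)
```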
## Hyperparams
### General
- Epochs = 2
- LR = 6e-5
- LR Scheduler = Cosine
- Optimizer = Paged AdamW 8bit
- Effective batch size = 16
### LoRA
- Rank = 16
- Alpha = 32
- Dropout = 0.25 (Inspiration: [Slush](https://huggingface.co/crestf411/Q2.5-32B-Slush))
## Credits
Humongous thanks to the people who created and curated the original data
Big thanks to all Allura members, for testing and emotional support ilya /platonic
especially to inflatebot who made the model card's image :3
Another big thanks to all the members of the ArliAI and BeaverAI Discord servers for testing! All of the people featured in the testimonials are from there :3 |
trenden/a95cadac-67f5-466e-873a-a177ac0cefaf | trenden | 2025-01-26T08:50:09Z | 6 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2.5-0.5B-Instruct",
"base_model:adapter:Qwen/Qwen2.5-0.5B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:49:04Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2.5-0.5B-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: a95cadac-67f5-466e-873a-a177ac0cefaf
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2.5-0.5B-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- e1ec409eef7839e4_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/e1ec409eef7839e4_train_data.json
type:
field_instruction: source
field_output: target
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: trenden/a95cadac-67f5-466e-873a-a177ac0cefaf
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 50
micro_batch_size: 2
mlflow_experiment_name: /tmp/e1ec409eef7839e4_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 8d83fb56-f0a6-4bfc-90b0-d811cde17d16
wandb_project: Birthday-SN56-3-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 8d83fb56-f0a6-4bfc-90b0-d811cde17d16
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# a95cadac-67f5-466e-873a-a177ac0cefaf
This model is a fine-tuned version of [Qwen/Qwen2.5-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9372
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.0965 | 0.0020 | 1 | 2.0439 |
| 1.9998 | 0.0257 | 13 | 2.0004 |
| 1.9455 | 0.0514 | 26 | 1.9553 |
| 1.7795 | 0.0771 | 39 | 1.9372 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF | mradermacher | 2025-01-26T08:50:09Z | 43,149 | 8 | transformers | [
"transformers",
"gguf",
"generated_from_trainer",
"en",
"dataset:Guilherme34/uncensor",
"base_model:nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored",
"base_model:quantized:nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored",
"license:mit",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-01-26T04:12:55Z | ---
base_model: nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored
datasets:
- Guilherme34/uncensor
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
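As one concrete single-file route, a sketch using `huggingface_hub` together with the `llama-cpp-python` bindings (assuming both packages are installed; the file name is the Q4_K_M quant from the table below):
```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the single-file quants listed below (Q4_K_M, "fast, recommended").
path = hf_hub_download(
    repo_id="mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF",
    filename="DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q4_K_M.gguf",
)

llm = Llama(model_path=path, n_ctx=2048)
out = llm("Q: What is an imatrix quant? A:", max_tokens=64)
print(out["choices"][0]["text"])
```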
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ1_S.gguf) | i1-IQ1_S | 7.4 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ1_M.gguf) | i1-IQ1_M | 8.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ2_XS.gguf) | i1-IQ2_XS | 10.1 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ2_S.gguf) | i1-IQ2_S | 10.5 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ2_M.gguf) | i1-IQ2_M | 11.4 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q2_K_S.gguf) | i1-Q2_K_S | 11.6 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q2_K.gguf) | i1-Q2_K | 12.4 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 12.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ3_XS.gguf) | i1-IQ3_XS | 13.8 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q3_K_S.gguf) | i1-Q3_K_S | 14.5 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ3_S.gguf) | i1-IQ3_S | 14.5 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ3_M.gguf) | i1-IQ3_M | 14.9 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q3_K_M.gguf) | i1-Q3_K_M | 16.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q3_K_L.gguf) | i1-Q3_K_L | 17.3 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-IQ4_XS.gguf) | i1-IQ4_XS | 17.8 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q4_0.gguf) | i1-Q4_0 | 18.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q4_K_S.gguf) | i1-Q4_K_S | 18.9 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q4_K_M.gguf) | i1-Q4_K_M | 19.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q4_1.gguf) | i1-Q4_1 | 20.7 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q5_K_S.gguf) | i1-Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q5_K_M.gguf) | i1-Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.i1-Q6_K.gguf) | i1-Q6_K | 27.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
fernandoruiz/salamandraTA-2B-Q4_0-GGUF | fernandoruiz | 2025-01-26T08:49:41Z | 42 | 0 | transformers | [
"transformers",
"gguf",
"llama-cpp",
"gguf-my-repo",
"translation",
"it",
"pt",
"de",
"en",
"es",
"eu",
"gl",
"fr",
"bg",
"cs",
"lt",
"hr",
"ca",
"nl",
"ro",
"da",
"el",
"fi",
"hu",
"sk",
"sl",
"et",
"pl",
"lv",
"mt",
"ga",
"sv",
"an",
"ast",
"oc",
"base_model:BSC-LT/salamandraTA-2B",
"base_model:quantized:BSC-LT/salamandraTA-2B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | translation | 2025-01-26T08:49:32Z | ---
license: apache-2.0
library_name: transformers
pipeline_tag: translation
language:
- it
- pt
- de
- en
- es
- eu
- gl
- fr
- bg
- cs
- lt
- hr
- ca
- nl
- ro
- da
- el
- fi
- hu
- sk
- sl
- et
- pl
- lv
- mt
- ga
- sv
- an
- ast
- oc
base_model: BSC-LT/salamandraTA-2B
tags:
- llama-cpp
- gguf-my-repo
---
# fernandoruiz/salamandraTA-2B-Q4_0-GGUF
This model was converted to GGUF format from [`BSC-LT/salamandraTA-2B`](https://huggingface.co/BSC-LT/salamandraTA-2B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/BSC-LT/salamandraTA-2B) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux):
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo fernandoruiz/salamandraTA-2B-Q4_0-GGUF --hf-file salamandrata-2b-q4_0.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo fernandoruiz/salamandraTA-2B-Q4_0-GGUF --hf-file salamandrata-2b-q4_0.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo fernandoruiz/salamandraTA-2B-Q4_0-GGUF --hf-file salamandrata-2b-q4_0.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo fernandoruiz/salamandraTA-2B-Q4_0-GGUF --hf-file salamandrata-2b-q4_0.gguf -c 2048
```
|
nathanialhunt/df4fa0d6-d687-41a3-bd1f-57ee1f20bde7 | nathanialhunt | 2025-01-26T08:48:02Z | 9 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2.5-0.5B-Instruct",
"base_model:adapter:Qwen/Qwen2.5-0.5B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:47:13Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2.5-0.5B-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: df4fa0d6-d687-41a3-bd1f-57ee1f20bde7
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2.5-0.5B-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- e1ec409eef7839e4_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/e1ec409eef7839e4_train_data.json
type:
field_instruction: source
field_output: target
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: nathanialhunt/df4fa0d6-d687-41a3-bd1f-57ee1f20bde7
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/e1ec409eef7839e4_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 8d83fb56-f0a6-4bfc-90b0-d811cde17d16
wandb_project: Birthday-SN56-24-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 8d83fb56-f0a6-4bfc-90b0-d811cde17d16
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# df4fa0d6-d687-41a3-bd1f-57ee1f20bde7
This model is a fine-tuned version of [Qwen/Qwen2.5-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9741
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.0965 | 0.0020 | 1 | 2.0439 |
| 1.7484 | 0.0059 | 3 | 2.0406 |
| 1.758 | 0.0119 | 6 | 2.0091 |
| 2.086 | 0.0178 | 9 | 1.9741 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
philip-hightech/624a2857-ddf7-4458-b217-01bbb5749c8d | philip-hightech | 2025-01-26T08:47:49Z | 7 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Nous-Hermes-llama-2-7b",
"base_model:adapter:NousResearch/Nous-Hermes-llama-2-7b",
"license:mit",
"region:us"
] | null | 2025-01-26T08:44:06Z | ---
library_name: peft
license: mit
base_model: NousResearch/Nous-Hermes-llama-2-7b
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 624a2857-ddf7-4458-b217-01bbb5749c8d
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Nous-Hermes-llama-2-7b
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 77a6c7f6e0223ba0_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/77a6c7f6e0223ba0_train_data.json
type:
field_instruction: instruction
field_output: response
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: philip-hightech/624a2857-ddf7-4458-b217-01bbb5749c8d
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/77a6c7f6e0223ba0_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 80e14b82-7814-4596-8534-3041e5f0ad43
wandb_project: Mine-SN56-21-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 80e14b82-7814-4596-8534-3041e5f0ad43
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 624a2857-ddf7-4458-b217-01bbb5749c8d
This model is a fine-tuned version of [NousResearch/Nous-Hermes-llama-2-7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0006 | 1 | nan |
| 0.0 | 0.0018 | 3 | nan |
| 0.0 | 0.0036 | 6 | nan |
| 0.0 | 0.0054 | 9 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
daniel40/191c6529-b7ed-42b0-8645-ad5e504e5333 | daniel40 | 2025-01-26T08:47:22Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2.5-0.5B-Instruct",
"base_model:adapter:Qwen/Qwen2.5-0.5B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:46:29Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2.5-0.5B-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 191c6529-b7ed-42b0-8645-ad5e504e5333
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2.5-0.5B-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- e1ec409eef7839e4_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/e1ec409eef7839e4_train_data.json
type:
field_instruction: source
field_output: target
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: daniel40/191c6529-b7ed-42b0-8645-ad5e504e5333
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/e1ec409eef7839e4_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 8d83fb56-f0a6-4bfc-90b0-d811cde17d16
wandb_project: Birthday-SN56-31-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 8d83fb56-f0a6-4bfc-90b0-d811cde17d16
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 191c6529-b7ed-42b0-8645-ad5e504e5333
This model is a fine-tuned version of [Qwen/Qwen2.5-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9760
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.0965 | 0.0020 | 1 | 2.0439 |
| 1.7489 | 0.0059 | 3 | 2.0407 |
| 1.757 | 0.0119 | 6 | 2.0097 |
| 2.0896 | 0.0178 | 9 | 1.9760 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
kostiantynk/70099761-d1b4-412a-a8da-e3a536edf0ac | kostiantynk | 2025-01-26T08:41:35Z | 9 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2.5-Math-7B-Instruct",
"base_model:adapter:Qwen/Qwen2.5-Math-7B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:09:33Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2.5-Math-7B-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 70099761-d1b4-412a-a8da-e3a536edf0ac
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2.5-Math-7B-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- e0c41a65c97fb0ab_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/e0c41a65c97fb0ab_train_data.json
type:
field_instruction: prompt
field_output: org_response
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: kostiantynk/70099761-d1b4-412a-a8da-e3a536edf0ac
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/e0c41a65c97fb0ab_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: bc469934-f65d-4554-a373-c57006d470f3
wandb_project: Mine-SN56-22-Gradients-On-Demand
wandb_run: your_name
wandb_runid: bc469934-f65d-4554-a373-c57006d470f3
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 70099761-d1b4-412a-a8da-e3a536edf0ac
This model is a fine-tuned version of [Qwen/Qwen2.5-Math-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Math-7B-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5059
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.4848 | 0.0000 | 1 | 2.5470 |
| 3.0914 | 0.0001 | 3 | 2.5467 |
| 1.5561 | 0.0002 | 6 | 2.5411 |
| 2.8429 | 0.0003 | 9 | 2.5059 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
error577/0ce397df-b5b7-48a1-ac1f-0865198751ae | error577 | 2025-01-26T08:41:29Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM-360M-Instruct",
"base_model:adapter:unsloth/SmolLM-360M-Instruct",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T08:01:33Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM-360M-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 0ce397df-b5b7-48a1-ac1f-0865198751ae
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM-360M-Instruct
bf16: true
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 4f5a92c6211764d5_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/4f5a92c6211764d5_train_data.json
type:
field_instruction: question
field_output: solution
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: error577/0ce397df-b5b7-48a1-ac1f-0865198751ae
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 32
lora_target_linear: true
lr_scheduler: cosine
max_steps: 2000
micro_batch_size: 2
mlflow_experiment_name: /tmp/4f5a92c6211764d5_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 4
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.02
wandb_entity: null
wandb_mode: online
wandb_name: b232257a-a91b-444e-aedb-3fe497321055
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: b232257a-a91b-444e-aedb-3fe497321055
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 0ce397df-b5b7-48a1-ac1f-0865198751ae
This model is a fine-tuned version of [unsloth/SmolLM-360M-Instruct](https://huggingface.co/unsloth/SmolLM-360M-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0087
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 2000
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.1345 | 0.0017 | 1 | 1.2805 |
| 0.8929 | 0.8503 | 500 | 1.0414 |
| 0.7422 | 1.7007 | 1000 | 1.0184 |
| 0.8871 | 2.5510 | 1500 | 1.0102 |
| 0.9667 | 3.4014 | 2000 | 1.0087 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
Theros/L3-ColdBrew-CoT-R1-test | Theros | 2025-01-26T08:40:43Z | 14 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"conversational",
"arxiv:2311.03099",
"base_model:Theros/L3-ColdBrew-CoT",
"base_model:merge:Theros/L3-ColdBrew-CoT",
"base_model:deepseek-ai/DeepSeek-R1-Distill-Llama-8B",
"base_model:merge:deepseek-ai/DeepSeek-R1-Distill-Llama-8B",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T08:37:36Z | ---
base_model:
- deepseek-ai/DeepSeek-R1-Distill-Llama-8B
- Theros/L3-ColdBrew-CoT
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method using [Theros/L3-ColdBrew-CoT](https://huggingface.co/Theros/L3-ColdBrew-CoT) as a base.
### Models Merged
The following models were included in the merge:
* [deepseek-ai/DeepSeek-R1-Distill-Llama-8B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-8B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: Theros/L3-ColdBrew-CoT
parameters:
density: 0.5
weight: 0.5
- model: deepseek-ai/DeepSeek-R1-Distill-Llama-8B
parameters:
density: 0.5
weight: 0.5
merge_method: dare_ties
base_model: Theros/L3-ColdBrew-CoT
parameters:
normalize: false
int8_mask: true
dtype: bfloat16
```
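To reproduce a merge like this, the configuration above is normally handed to mergekit's command-line entry point; a sketch, assuming mergekit is installed, the YAML above is saved as `config.yaml`, and the output directory name is only a placeholder:
```python
import subprocess

# Run mergekit's YAML-driven merge; "./merged-model" is a hypothetical output path.
subprocess.run(["mergekit-yaml", "config.yaml", "./merged-model"], check=True)
```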
|
ClarenceDan/88833754-f174-4d9d-add3-cd221622f2ee | ClarenceDan | 2025-01-26T08:40:18Z | 7 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Nous-Hermes-llama-2-7b",
"base_model:adapter:NousResearch/Nous-Hermes-llama-2-7b",
"license:mit",
"region:us"
] | null | 2025-01-26T08:36:43Z | ---
library_name: peft
license: mit
base_model: NousResearch/Nous-Hermes-llama-2-7b
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 88833754-f174-4d9d-add3-cd221622f2ee
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Nous-Hermes-llama-2-7b
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 77a6c7f6e0223ba0_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/77a6c7f6e0223ba0_train_data.json
type:
field_instruction: instruction
field_output: response
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: ClarenceDan/88833754-f174-4d9d-add3-cd221622f2ee
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/77a6c7f6e0223ba0_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 80e14b82-7814-4596-8534-3041e5f0ad43
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 80e14b82-7814-4596-8534-3041e5f0ad43
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 88833754-f174-4d9d-add3-cd221622f2ee
This model is a fine-tuned version of [NousResearch/Nous-Hermes-llama-2-7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0006 | 1 | nan |
| 0.0 | 0.0018 | 3 | nan |
| 0.0 | 0.0036 | 6 | nan |
| 0.0 | 0.0054 | 9 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
jessemeng/TwinLlama-3.2-1B-DPO | jessemeng | 2025-01-26T08:36:53Z | 16 | 1 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"dpo",
"conversational",
"en",
"base_model:jessemeng/TwinLlama-3.2-1B",
"base_model:finetune:jessemeng/TwinLlama-3.2-1B",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T03:09:15Z | ---
base_model: jessemeng/TwinLlama-3.2-1B
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- dpo
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** jessemeng
- **License:** apache-2.0
- **Finetuned from model:** jessemeng/TwinLlama-3.2-1B
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
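A minimal generation sketch for this checkpoint, assuming it loads as a standard Llama-architecture causal LM through `transformers` (the prompt is only an illustration):
```python
from transformers import pipeline

# DPO-tuned TwinLlama checkpoint from this repo.
generator = pipeline("text-generation", model="jessemeng/TwinLlama-3.2-1B-DPO")
result = generator("Explain in one sentence what DPO training does.", max_new_tokens=64)
print(result[0]["generated_text"])
```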
|
azxky6645/01261734-origin_tamplate_NuminaMath-CoT | azxky6645 | 2025-01-26T08:35:44Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T08:34:55Z | ---
library_name: transformers
tags:
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
azxky6645/01260126-origin_tamplate_NuminaMath-CoT | azxky6645 | 2025-01-26T08:34:05Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T08:15:39Z | ---
library_name: transformers
tags:
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
trenden/7fec7b72-d9cf-475e-95db-21567779e422 | trenden | 2025-01-26T08:34:04Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2-7B-Instruct",
"base_model:adapter:Qwen/Qwen2-7B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:26:19Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2-7B-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 7fec7b72-d9cf-475e-95db-21567779e422
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2-7B-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 3b1817e1a326e619_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/3b1817e1a326e619_train_data.json
type:
field_instruction: data
field_output: criteria
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: trenden/7fec7b72-d9cf-475e-95db-21567779e422
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/3b1817e1a326e619_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: a5b61cfd-85d2-4880-97d8-24759f842d7d
wandb_project: Birthday-SN56-26-Gradients-On-Demand
wandb_run: your_name
wandb_runid: a5b61cfd-85d2-4880-97d8-24759f842d7d
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 7fec7b72-d9cf-475e-95db-21567779e422
This model is a fine-tuned version of [Qwen/Qwen2-7B-Instruct](https://huggingface.co/Qwen/Qwen2-7B-Instruct) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5183
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.9922 | 0.0002 | 1 | 1.7969 |
| 1.7072 | 0.0006 | 3 | 1.7937 |
| 1.5974 | 0.0013 | 6 | 1.7114 |
| 1.7829 | 0.0019 | 9 | 1.5183 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
vermoney/17d9523a-67b8-4aa5-98c8-c7509e50f9a8 | vermoney | 2025-01-26T08:32:24Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:Korabbit/llama-2-ko-7b",
"base_model:adapter:Korabbit/llama-2-ko-7b",
"region:us"
] | null | 2025-01-26T03:12:13Z | ---
library_name: peft
base_model: Korabbit/llama-2-ko-7b
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 17d9523a-67b8-4aa5-98c8-c7509e50f9a8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Korabbit/llama-2-ko-7b
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- c9c324e8cf5586e6_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/c9c324e8cf5586e6_train_data.json
type:
field_instruction: instruction
field_output: output
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device: cuda
early_stopping_patience: 1
eval_max_new_tokens: 128
eval_steps: 5
eval_table_size: null
evals_per_epoch: null
flash_attention: false
fp16: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: vermoney/17d9523a-67b8-4aa5-98c8-c7509e50f9a8
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_memory:
0: 78GiB
max_steps: 30
micro_batch_size: 2
mlflow_experiment_name: /tmp/c9c324e8cf5586e6_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_torch
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 10
sequence_len: 1024
special_tokens:
pad_token: </s>
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: true
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: d35b96a9-b8d1-49c0-b1a8-167bc6103694
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: d35b96a9-b8d1-49c0-b1a8-167bc6103694
warmup_steps: 5
weight_decay: 0.001
xformers_attention: true
```
</details><br>
# 17d9523a-67b8-4aa5-98c8-c7509e50f9a8
This model is a fine-tuned version of [Korabbit/llama-2-ko-7b](https://huggingface.co/Korabbit/llama-2-ko-7b) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2031
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08; optimizer_args: adam_beta1=0.9, adam_beta2=0.95, adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0000 | 1 | 1.7691 |
| 1.3923 | 0.0001 | 5 | 1.4774 |
| 1.2865 | 0.0003 | 10 | 1.2936 |
| 1.2647 | 0.0004 | 15 | 1.2457 |
| 1.1921 | 0.0006 | 20 | 1.2142 |
| 1.2415 | 0.0007 | 25 | 1.2044 |
| 1.1323 | 0.0008 | 30 | 1.2031 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
maulikanalog/jaydeepsur | maulikanalog | 2025-01-26T08:32:23Z | 64 | 0 | diffusers | [
"diffusers",
"text-to-image",
"flux",
"lora",
"template:sd-lora",
"ai-toolkit",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2025-01-26T07:48:11Z | ---
tags:
- text-to-image
- flux
- lora
- diffusers
- template:sd-lora
- ai-toolkit
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: rohanz
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
widget:
- text: jaydeepsur in tailored Italian suit
output:
url: images/example_ut7nw6ob9.png
- text: jaydeepsur in tailored blue italian suit
output:
url: images/example_5g66fc5km.png
- text: jaydeepsur in tailored tuxedo
output:
url: images/example_3ymxcx0jt.png
- text: >-
Jaydeepsur sitting at an outdoor café near the Eiffel Tower, enjoying a
croissant and coffee
output:
url: images/example_engyh3efn.png
- text: >-
Jaydeepsur sitting at an outdoor café near the Eiffel Tower, enjoying a
croissant and coffee
output:
url: images/example_c6p33m9f4.png
- text: Jaydeepsur participating in a Ganga aarti, holding a diya and looking serene
output:
url: images/example_6fjgryjou.png
- text: Jaydeepsur walking along the Seine River at sunset, wearing a beret
output:
url: images/example_d74cso10j.png
- text: >-
Jaydeepsur taking a selfie at the edge of the Grand Canyon, capturing the
stunning natural beauty
output:
url: images/example_kukd6sv7q.png
- text: >-
Jaydeepsur jogging in Central Park, with the New York City skyline visible
in the distance
output:
url: images/example_oyddr0wn0.png
- text: >-
Jaydeepsur taking a selfie at the edge of the Grand Canyon, capturing the
stunning natural beauty
output:
url: images/example_naa6dxfu5.png
- text: >-
Jaydeepsur taking a gondola ride in Venice, wearing sunglasses and holding a
camera
output:
url: images/example_sg9njekiy.png
- text: >-
Jaydeepsur standing on a cobblestone street in Paris at night, surrounded by
glowing café signs and streetlights
output:
url: images/example_8gje9jpht.png
- text: >-
Jaydeepsur posing near the sparkling Eiffel Tower during its nightly light
show, with the city’s warm lights as a backdrop
output:
url: images/example_b30kss964.png
- text: >-
Jaydeepsur sitting in a leather chair at a boardroom table, presenting a
business proposal with dynamic screens glowing in the background
output:
url: images/example_6yghfa2dp.png
- text: >-
Jaydeepsur in a sharp black suit standing in a New York City skyscraper
office, with the skyline visible through floor-to-ceiling glass windows at
dusk
output:
url: images/example_2cbv1qio4.png
- text: >-
Jaydeepsur walking confidently through a brightly lit office lobby in Los
Angeles, holding a briefcase and a coffee cup
output:
url: images/example_opojiy6hy.png
- text: >-
Jaydeepsur sitting in a leather chair at a boardroom table, presenting a
business proposal with dynamic screens glowing in the background
output:
url: images/example_3cpkys9j9.png
- text: >-
Jaydeepsur in a tailored navy-blue suit, standing in a Tokyo high-rise
office, with Mount Fuji faintly visible in the background
output:
url: images/example_px945bag8.png
- text: >-
Jaydeepsur adjusting his tie in a sleek, minimalist office space,
illuminated by soft, neon-blue LED lights
output:
url: images/example_xzopvb5if.png
- text: >-
Jaydeepsur in a modern office overlooking the River Thames, working on a
laptop at a standing desk
output:
url: images/example_0gc7yy1d1.png
- text: >-
Jaydeepsur in a three-piece charcoal gray suit, standing in front of the
Gherkin building in London, checking his wristwatch
output:
url: images/example_oeotttzwr.png
- text: >-
Jaydeepsur in a sleek dark suit, standing on a balcony of a Marina Bay Sands
office, overlooking the glowing Singapore skyline at night
output:
url: images/example_kaqvf2mkh.png
- text: >-
Jaydeepsur in a sleek dark suit, standing on a balcony of a Marina Bay Sands
office, overlooking the glowing Singapore skyline at night
output:
url: images/example_vfva7cei7.png
- text: >-
Jaydeepsur in a pristine white suit standing in a luxurious office
overlooking the Burj Khalifa at night, surrounded by ambient lighting
output:
url: images/example_6hraviyog.png
- text: >-
Jaydeepsur in a pristine white suit standing in a luxurious office
overlooking the Burj Khalifa at night, surrounded by ambient lighting
output:
url: images/example_2b1owj1xq.png
- text: Jaydeepsur in blue Italian suit passport photo
output:
url: images/example_l5rmjbi8f.png
- text: jaydeepsur in tailored blue italian suit plain background
output:
url: images/example_yi3fc9638.png
- text: jaydeepsur in tailored blue italian suit plain background
output:
url: images/example_c7cltdawq.png
---
# rohanz
Model trained with [AI Toolkit by Ostris](https://github.com/ostris/ai-toolkit)
<Gallery />
## Trigger words
You should use `jaydeepsur` to trigger the image generation.
## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, etc.
Weights for this model are available in Safetensors format.
[Download](/None/tree/main) them in the Files & versions tab.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16).to('cuda')
pipeline.load_lora_weights('None', weight_name='rohanz')
image = pipeline('model in a bustling cafe ').images[0]
image.save("my_image.png")
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
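As a rough illustration of the weighting and fusing mentioned above, the sketch below continues from the `pipeline` created in the snippet: it reloads the adapter under an explicit name, down-weights it, includes the `jaydeepsur` trigger word in the prompt, and optionally fuses the LoRA into the base weights. The `'None'` repo id and `'rohanz'` weight name are placeholders inherited from the auto-generated snippet, and `'jaydeepsur_lora'` is a hypothetical adapter name; substitute the real values from the Files & versions tab.
```py
# Minimal sketch, assuming `pipeline` was built as in the snippet above (diffusers with the PEFT backend).
# 'None' and 'rohanz' are placeholders from the auto-generated card; 'jaydeepsur_lora' is a made-up adapter name.
pipeline.load_lora_weights('None', weight_name='rohanz', adapter_name='jaydeepsur_lora')

# Scale the adapter down from full strength (1.0) before generating.
pipeline.set_adapters(['jaydeepsur_lora'], adapter_weights=[0.8])

# The trigger word `jaydeepsur` should appear in the prompt for the trained subject to show up.
image = pipeline('jaydeepsur in a tailored navy suit, studio lighting').images[0]
image.save("jaydeepsur_suit.png")

# Optionally bake the scaled LoRA into the base weights for faster repeated inference.
pipeline.fuse_lora(lora_scale=0.8)
```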
|
Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF | Melvin56 | 2025-01-26T08:31:36Z | 36 | 0 | transformers | [
"transformers",
"gguf",
"llama-cpp",
"gguf-my-repo",
"base_model:Rombo-Org/Rombo-LLM-V2.5-Qwen-3b",
"base_model:quantized:Rombo-Org/Rombo-LLM-V2.5-Qwen-3b",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-26T08:31:25Z | ---
library_name: transformers
base_model: Rombo-Org/Rombo-LLM-V2.5-Qwen-3b
license: other
license_name: qwen-research
license_link: https://huggingface.co/Qwen/Qwen2.5-3B-Instruct/blob/main/LICENSE
tags:
- llama-cpp
- gguf-my-repo
---
# Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF
This model was converted to GGUF format from [`Rombo-Org/Rombo-LLM-V2.5-Qwen-3b`](https://huggingface.co/Rombo-Org/Rombo-LLM-V2.5-Qwen-3b) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/Rombo-Org/Rombo-LLM-V2.5-Qwen-3b) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF --hf-file rombo-llm-v2.5-qwen-3b-q4_0.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF --hf-file rombo-llm-v2.5-qwen-3b-q4_0.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF --hf-file rombo-llm-v2.5-qwen-3b-q4_0.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF --hf-file rombo-llm-v2.5-qwen-3b-q4_0.gguf -c 2048
```
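If you would rather stay in Python than use the CLI, the same GGUF file can also be loaded through the `llama-cpp-python` bindings. This is only a sketch and assumes `llama-cpp-python` and `huggingface_hub` are installed; it is not part of the upstream llama.cpp instructions above.
```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python huggingface_hub).
from llama_cpp import Llama

# Downloads the quantized file from the Hub and loads it; n_ctx mirrors the -c 2048 used above.
llm = Llama.from_pretrained(
    repo_id="Melvin56/Rombo-LLM-V2.5-Qwen-3b-Q4_0-GGUF",
    filename="rombo-llm-v2.5-qwen-3b-q4_0.gguf",
    n_ctx=2048,
)

output = llm("The meaning to life and the universe is", max_tokens=64)
print(output["choices"][0]["text"])
```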
|
prxy5607/e0e77b79-39f5-49aa-8a13-3f566022c0d5 | prxy5607 | 2025-01-26T08:30:44Z | 7 | 0 | peft | [
"peft",
"safetensors",
"gemma2",
"axolotl",
"generated_from_trainer",
"base_model:UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2",
"base_model:adapter:UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2",
"license:gemma",
"region:us"
] | null | 2025-01-26T07:43:26Z | ---
library_name: peft
license: gemma
base_model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: e0e77b79-39f5-49aa-8a13-3f566022c0d5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2
bf16: true
chat_template: llama3
data_processes: 16
dataset_prepared_path: null
datasets:
- data_files:
- 10bce10a598a69c6_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/10bce10a598a69c6_train_data.json
type:
field_instruction: sentence1
field_output: sentence2
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device_map: auto
do_eval: true
early_stopping_patience: 5
eval_batch_size: 4
eval_max_new_tokens: 128
eval_steps: 50
eval_table_size: null
evals_per_epoch: null
flash_attention: true
fp16: false
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: prxy5607/e0e77b79-39f5-49aa-8a13-3f566022c0d5
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 128
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 64
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_memory:
0: 75GB
max_steps: 200
micro_batch_size: 8
mlflow_experiment_name: /tmp/10bce10a598a69c6_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 50
saves_per_epoch: null
sequence_len: 1024
strict: false
tf32: true
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 52cb97db-d062-434d-a447-ba4090f8f63f
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 52cb97db-d062-434d-a447-ba4090f8f63f
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# e0e77b79-39f5-49aa-8a13-3f566022c0d5
This model is a fine-tuned version of [UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2](https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7520
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08; optimizer_args: adam_beta1=0.9, adam_beta2=0.95, adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.5423 | 0.0015 | 1 | 2.3431 |
| 1.4871 | 0.0773 | 50 | 0.8649 |
| 1.4206 | 0.1546 | 100 | 0.8041 |
| 1.5316 | 0.2319 | 150 | 0.7654 |
| 1.4631 | 0.3092 | 200 | 0.7520 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
critical-hf/Immy_the_AI_teddy | critical-hf | 2025-01-26T08:30:31Z | 131 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"qwen2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-27T15:12:05Z | ---
base_model: unsloth/qwen2.5-0.5b-instruct-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
---
# Uploaded model
- **Developed by:** Daemontatox
- **License:** apache-2.0
- **Finetuned from model:** unsloth/qwen2.5-0.5b-instruct-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
kavinda123321/speecht5_finetuned__ylacombe_one_speaker_dataset_kavinda | kavinda123321 | 2025-01-26T08:27:29Z | 14 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"speecht5",
"text-to-audio",
"generated_from_trainer",
"base_model:microsoft/speecht5_tts",
"base_model:finetune:microsoft/speecht5_tts",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-to-audio | 2025-01-26T08:02:42Z | ---
library_name: transformers
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
model-index:
- name: speecht5_finetuned__ylacombe_one_speaker_dataset_kavinda
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned__ylacombe_one_speaker_dataset_kavinda
This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4739
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 500
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.259 | 20.0 | 100 | 0.4786 |
| 2.8687 | 40.0 | 200 | 0.4758 |
| 2.6007 | 60.0 | 300 | 0.4627 |
| 2.4877 | 80.0 | 400 | 0.4735 |
| 2.4701 | 100.0 | 500 | 0.4739 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Tokenizers 0.21.0
|
icefog72/Ice0.68-25.01-RP | icefog72 | 2025-01-26T08:26:29Z | 19 | 1 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"mergekit",
"merge",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-25T15:21:52Z | ---
license: cc-by-nc-4.0
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Ice0.68-25.01-RP
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
### Models Merged
The following models were included in the merge:
* G:\FModels\Ice0.60-18.01-RP
* H:\FModels\Ice0.67-25.01-RP
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
- model: G:\FModels\Ice0.60-18.01-RP
layer_range: [0, 32]
- model: H:\FModels\Ice0.67-25.01-RP
layer_range: [0, 32]
merge_method: slerp
base_model: H:\FModels\Ice0.67-25.01-RP
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
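The YAML above is the only required input to mergekit. As a minimal sketch of how such a merge can be reproduced (assuming mergekit is installed, the configuration is saved to `./slerp-config.yml`, and the local `G:\`/`H:\` model paths are replaced with checkpoints available on your machine), the merge can be driven from Python roughly as follows; the output path here is a placeholder:
```python
# Minimal sketch, assuming mergekit is installed and the YAML above is saved as ./slerp-config.yml.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("./slerp-config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Ice0.68-25.01-RP",       # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),   # use a GPU if one is available
        copy_tokenizer=True,              # copy the base model's tokenizer into the output
    ),
)
```
The `mergekit-yaml ./slerp-config.yml ./Ice0.68-25.01-RP` command-line entry point performs the same merge.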
|
trenden/84f734b1-bb47-415e-82a9-82fe774ced3b | trenden | 2025-01-26T08:20:09Z | 9 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:Qwen/Qwen2.5-Math-7B-Instruct",
"base_model:adapter:Qwen/Qwen2.5-Math-7B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T07:48:36Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2.5-Math-7B-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 84f734b1-bb47-415e-82a9-82fe774ced3b
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Qwen/Qwen2.5-Math-7B-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- e0c41a65c97fb0ab_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/e0c41a65c97fb0ab_train_data.json
type:
field_instruction: prompt
field_output: org_response
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: trenden/84f734b1-bb47-415e-82a9-82fe774ced3b
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 50
micro_batch_size: 2
mlflow_experiment_name: /tmp/e0c41a65c97fb0ab_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: bc469934-f65d-4554-a373-c57006d470f3
wandb_project: Birthday-SN56-3-Gradients-On-Demand
wandb_run: your_name
wandb_runid: bc469934-f65d-4554-a373-c57006d470f3
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 84f734b1-bb47-415e-82a9-82fe774ced3b
This model is a fine-tuned version of [Qwen/Qwen2.5-Math-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Math-7B-Instruct) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3885
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.4848 | 0.0000 | 1 | 2.5470 |
| 2.3006 | 0.0004 | 13 | 2.5367 |
| 3.4707 | 0.0007 | 26 | 2.4504 |
| 2.2717 | 0.0011 | 39 | 2.3885 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
visdata/cook23 | visdata | 2025-01-26T08:18:45Z | 28 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T07:55:46Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mini1013/master_item_top_el_flat | mini1013 | 2025-01-26T08:15:48Z | 285 | 0 | setfit | [
"setfit",
"safetensors",
"roberta",
"sentence-transformers",
"text-classification",
"generated_from_setfit_trainer",
"arxiv:2209.11055",
"base_model:klue/roberta-base",
"base_model:finetune:klue/roberta-base",
"model-index",
"region:us"
] | text-classification | 2025-01-26T08:15:23Z | ---
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 500666 차량용 가습기 소형 미니 사무실 탁상용 반중력 공기 물방울 향수 아로마 테라피 8 시간 작동 청정기 직송 500ml 선택01
black (#M)홈>생활/건강>자동차용품>편의용품>차량용가습기 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 차량용 가습기
- text: 해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이 (#M)디지털/가전>주방가전>믹서기 Naverstore
> 가전 > 주방가전 > 믹서기/블렌더 > 믹서기
- text: '[ 8/31입고예정] LG전자 24MP400 24인치모니터 IPS패널 FHD 슬림베젤 LED 모니터 컴퓨터모니터 사무용 인강용모니터 (#M)디지털/가전>모니터
Naverstore > 컴퓨터 > 모니터 > 화면크기별 > 26인치 이하'
- text: 콘에어 핸디형 스팀다리미 모음전 02. GS25PKK - 초강력 핸디스팀다리미 (#M)가전·컴퓨터>생활가전>다리미·미싱·기타>스팀다리미
Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 다리미·미싱·기타 > 스팀다리미
- text: '[ 가130만원대]LG 디오스 오브제컬렉션 냉장고 S834BW12 832L 1. S834BW12 11st > 가전/디지털 > 냉장고
> 양문형 > 양문형;(#M)11st>냉장고>양문형>양문형 11st > 가전/디지털 > 냉장고 > 양문형 > 양문형'
metrics:
- accuracy
pipeline_tag: text-classification
library_name: setfit
inference: true
base_model: klue/roberta-base
model-index:
- name: SetFit with klue/roberta-base
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.9081549631816543
name: Accuracy
---
# SetFit with klue/roberta-base
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [klue/roberta-base](https://huggingface.co/klue/roberta-base) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
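Once trained this way, the checkpoint is used like any other SetFit model. A minimal inference sketch (assuming the `setfit` library is installed) is shown below, reusing one of this card's widget examples as input; `model.predict` returns one of the 232 category labels listed under Model Labels.
```python
from setfit import SetFitModel

# Download the checkpoint from the Hub (repo id taken from this card's title).
model = SetFitModel.from_pretrained("mini1013/master_item_top_el_flat")

# Classify a raw product title; the prediction is one of the 232 category labels.
preds = model.predict([
    "해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이",
])
print(preds)
```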
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [klue/roberta-base](https://huggingface.co/klue/roberta-base)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 232 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 187.0 | <ul><li>'키친아트 전기후라이팬 사각 대형잔치팬 피자팬 빨간뚜껑후라이팬 잔치팬-KPP-6627 (#M)디지털/가전>주방가전>전기팬 GFK > Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 전기팬'</li><li>'코스트코 잔치팬 해마루 대형 사각 명절 전기 후라이팬 TC-3000 (#M)디지털/가전>주방가전>전기그릴 GFK > traverse > Naverstore > 가전 > 주방가전 > 전기그릴/팬'</li><li>'대원 특대형 사각 큰집잔치팬 전기팬 설날 추석 전부치는 후라이팬 DWP-530A (#M)디지털/가전>주방가전>전기팬 Naverstore > 가전 > 주방가전 > 전기그릴/팬'</li></ul> |
| 87.0 | <ul><li>'건조기배기호스 파이프 연장 배기관 주방 내경 호환 B. 11-10CM 어댑터 (#M)세탁기/건조기>세탁기 건조기 세트>세탁기 건조기 세트 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 세탁기 건조기 세트 > 세탁기 건조기 세트'</li><li>'건조기 세탁기 받침대 스토퍼 진동 소음 밀림방지패드 (#M)디지털/가전>생활가전>세탁기>일반세탁기 GFK > traverse > Naverstore > 가전 > 세탁기/건조기 > 일반세탁기'</li><li>'세탁기 받침대 4P 진동 소음 수평 높이 조절 냉장고 대형 4개 세트 (#M)디지털/가전>생활가전>세탁기>일반세탁기 GFK > traverse > Naverstore > 가전 > 세탁기/건조기 > 일반세탁기'</li></ul> |
| 37.0 | <ul><li>'바툼 회전 미니 온풍기 탁상용 소형 책상용 BTMH600 (#M)디지털/가전>계절가전>온풍기>전기온풍기 GFK > live > Naverstore > Shop Live > 테크 > 20241119 > 11:00 ~ 13:00'</li><li>'신일 전기히터 바닥용 탁상용 미니온풍기 [SEH-P20] (#M)계절가전>온풍기>전기온풍기 GFK > traverse > 11st > 가전/디지털 > 계절가전 > 온풍기'</li><li>'소싱 웜베이비 미니 온풍기 / 회전온풍기/ 탁상용 가정용 캠핑용 500W 베이비핑크 (#M)홈>전체상품 Naverstore > 디지털/가전 > 계절가전 > 온풍기'</li></ul> |
| 153.0 | <ul><li>'SK매직 GRA-850SRLNG(도시가스) SK매직 GRA-850SR LNG(도시가스) (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 가스레인지 > 스탠드형'</li><li>'SK매직 GRA-850SR (#M)홈>디지털/가전>주방가전>가스레인지>일반가스레인지 Naverstore > 가전 > 주방가전 > 가스레인지 > 스탠드형'</li><li>'(SK매직) 원터치 점화 가스레인지(2구) 레드 GRAC290R-본 LNG(도시가스) (#M)가전·컴퓨터>주방가전>전기·가스레인지>가스레인지 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기·가스레인지 > 가스레인지'</li></ul> |
| 167.0 | <ul><li>'한일전기 세이프티 UV 살균 식기건조기 세이프티 UV 살균 식기건조기+NPay 5천원 (#M)디지털/가전>주방가전>식기세척/건조기>식기건조기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기 > 살균건조기'</li><li>'칼 도마 살균기 도마 3종+칼5종 세트 살균 소독 분리형 슬림 칼5종+살균기 화이트에디션 세트 (#M)홈>디지털/가전>주방가전>식기세척/건조기>식기건조기 Naverstore > 가전 > 생활가전 > 살균소독기 > 살균건조기'</li><li>'락앤락 텀블러 살균 건조기 락앤락 텀블러 살균 건조기_그레이 (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 위생관리 > 식기건조기'</li></ul> |
| 194.0 | <ul><li>'휴롬 착즙기 H430 저속착즙 H72ST-BFS02WH 코스트코 (#M)디지털/가전>주방가전>쥬서기/녹즙기 GFK > traverse > Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 착즙기'</li><li>'휴롬 H300L 그레이 딥그린 코랄 딥그린 (#M)디지털/가전>주방가전>쥬서기/녹즙기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 착즙기'</li><li>'제니퍼룸 스텐 착즙기 화이트 JO-M8101WH (#M)디지털/가전>주방가전>쥬서기/녹즙기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 착즙기'</li></ul> |
| 210.0 | <ul><li>'QR코드 바코드스캐너 거치대포함 2D 1D 유무선 2D무선-블랙 (#M)프린터/복합기>스캐너>일반 스캐너 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 스캐너'</li><li>'유무선 바코드스캐너 QR코드 서점바코드 가성비바코드스캐너 거치대포함 무선바코드스캐너 마트바코드 1D무선-아이보리 (#M)프린터/복합기>스캐너>일반 스캐너 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 스캐너 > 일반 스캐너'</li><li>'유무선 바코드스캐너 QR코드 서점바코드 가성비바코드스캐너 거치대포함 무선바코드스캐너 마트바코드 2D유선-블랙 (#M)프린터/복합기>스캐너>일반 스캐너 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 스캐너 > 일반 스캐너'</li></ul> |
| 143.0 | <ul><li>'필립스 헤어 드라이어 (BHD004/19) 필립스 헤어 드라이어 (BHD004/19) (#M)홈>헤어케어>헤어기기>헤어드라이기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 헤어드라이기'</li><li>'프리미엄케어 볼륨&케어 HV-7461K0 HV7461 [볼륨 마사지 디퓨저 / 파워모터 / 3 LotteOn > 뷰티 > 뷰티기기 > 헤어스타일러 LotteOn > 뷰티 > 뷰티기기 > 헤어스타일러 > 헤어드라이어'</li><li>'헤어드라이기추천 2000W 미니 가정용 전문가용 드라이기 비달사순 접이식 휴대용 여행용 모이스트랩 접이식 1201K (#M)디지털/가전>이미용가전>헤어기기>드라이어 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 드라이어'</li></ul> |
| 18.0 | <ul><li>'미니가습기필터 스위스윙거 가습기 램프 캔 레인우 호환가습기필터 110mm X 8mm (레인보우가습기용) (#M)홈>전체상품 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 필터/액세서리'</li><li>'가습기필터 미니 8mm 10mm 스프링 필터 신제품 더블 제트 공기 가습기 USB 대용량 가정 자동차 가습기필터 미니 8mm 10mm 스프링 필터 신제품 더블 제트 공기 가습기 USB 대용량 가정 자동차_05 spray humidif (#M)가전·컴퓨터>계절가전>가습기 액세서리 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기 액세서리'</li><li>'가습기필터 미니 8mm 10mm 스프링 필터 100ML 가습기 아로마 에센셜 오일 디퓨저, 향수 디퓨져 가습기필터 미니 8mm 10mm 스프링 필터 100ML 가습기 아로마 에센셜 오일 디퓨저, 향수 디퓨져_08 2pcs Jasmine (#M)가전·컴퓨터>계절가전>가습기 액세서리 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기 액세서리'</li></ul> |
| 179.0 | <ul><li>'도깨비 미니 와플메이커 MWM2200 와플메이커 (#M)11st>주방가전>전기쿠커>전기찜기 11st > 가전/디지털 > 주방가전 > 전기쿠커 > 전기찜기'</li><li>'쿠폰가 27.900 [GR-WAFPK] 쿠진아트 와플팬(GR-4NKR/GR-5KR/CGR-10KR 호환) (#M)디지털/가전>주방가전>와플제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 와플'</li><li>'(한성커머스)키친아트 렉스2구 크로플 와플기계 디저트메이커 KP-21JT 와플메이커 (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기'</li></ul> |
| 3.0 | <ul><li>'인텔 코어i5-13세대 13600K (랩터레이크) / 신품 벌크 / 쿨러X (#M)디지털/가전>PC부품>CPU GFK > Naverstore > 컴퓨터 > 부품 > CPU'</li><li>'인텔 코어i5-10세대 10400 (코멧레이크S) 벌크 쿨러포함 (#M)11st>PC부품>CPU>코어i5 11st > 가전/디지털 > PC부품 > CPU > 코어i5'</li><li>'인텔 CPU i5 4690 하스웰 리프레시 (#M)디지털/가전>PC부품>CPU Naverstore > 컴퓨터 > 부품 > CPU > 인텔'</li></ul> |
| 99.0 | <ul><li>'◆ GRAND SALE & ◆ 부라더미싱 TR14A /초급자추천모델 자동실끼우기 /수강증+서적 (#M)디지털/가전>생활가전>재봉틀 Naverstore > 가전 > 생활가전 > 재봉틀'</li><li>'브랜드 1위 혼스 미니재봉틀 HSSM-1201 한땀한땀 프로 한땀한땀 프로(핑크) (#M)디지털/가전>생활가전>재봉틀 Naverstore > 가전 > 생활가전 > 재봉틀'</li><li>'코스날 미니재봉틀 미니미싱 초간편 핸드미싱 휴대용 가정용 미싱기 아답터 받침대 추가가능 미니재봉틀 (아답터있음)+받침대 (#M)디지털/가전>생활가전>재봉틀 Naverstore > 가전 > 생활가전 > 재봉틀'</li></ul> |
| 25.0 | <ul><li>'신일 전기 컨벡터 SEH-P4000SS 컨벡터히터 동파방지 라디에이터 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 난방가전 > 라디에이터 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 난방가전 > 라디에이터'</li><li>'흥신 캠핑라디에이터 오르씨 500W 9월 캠핑용 난로 난방 캠핑용품 ORRCY-21 올블랙(가방제외) (#M)디지털/가전>계절가전>라디에이터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 라디에이터'</li><li>'흥신 라디에이터 오르씨 가정용에디션 국산 사무실 화장실 전기난로 7핀 13핀(1500W/4평) (#M)디지털/가전>계절가전>라디에이터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 라디에이터'</li></ul> |
| 96.0 | <ul><li>'미닉스 미니 건조기 PRO 3kg 수건 속옷 양말 클래식베이지 (#M)디지털/가전>생활가전>건조기/탈수기>의류건조기 GFK > Naverstore > 가전 > 세탁기/건조기 > 의류건조기'</li><li>'위닉스 컴팩트 4KG 건조기 HS2E400-MGK (#M)가전·컴퓨터>TV·냉장고·세탁기>세탁기·건조기>그외 브랜드 Tmon > 가전·디지털 > 가전·컴퓨터 > TV·냉장고·세탁기 > 세탁기·건조기 > 그외 브랜드'</li><li>'[미닉스]미니 건조기 PRO 3kg 소형 빨래 원룸 자취 아기옷 클래식베이지 (#M)디지털/가전>생활가전>건조기/탈수기>의류건조기 Naverstore > 가전 > 세탁기/건조기 > 의류건조기'</li></ul> |
| 100.0 | <ul><li>'[SUMSEI] 섬세이 에어샤워 2세대 / 바디드라이어 자갈 블랙_1. 에어샤워 (#M)디지털/가전>생활가전>전신건조기 Naverstore > 가전 > 욕실가전 > 전신건조기'</li><li>'보랄 에어타운 바디드라이어 BR-1320DR 전신건조기 (#M)홈>디지털/가전>생활가전>전신건조기 Naverstore > 가전 > 욕실가전 > 전신건조기'</li><li>'에어드롭 헤어&바디드라이어 (고급형 HTM-2011) 고급형 (색상 : 그레이)_설치 필요 (#M)디지털/가전>생활가전>전신건조기 Naverstore > 가전 > 욕실가전 > 전신건조기'</li></ul> |
| 21.0 | <ul><li>'LG 공기청정기 AS303DWFA NS홈 LG 공기청정기 AS303DWFA 무료배송 NS홈 (#M)11st>계절가전>공기청정기>필터식 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터식'</li><li>'[LG전자]LG AS062PYHAR 에어로퍼니처 원형[32600111] 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리;(#M)11st>계절가전>공기청정기>필터/액세서리 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리'</li><li>'LG 퓨리케어 에어로타워 오브제(온풍겸용)FS061PSSA,FS061PGSA 네이처 그린 (#M)11st>계절가전>공기청정기>필터식 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터식'</li></ul> |
| 2.0 | <ul><li>'게이밍 조립 컴퓨터 세트 조립PC 롤 발로란트 오버워치 배그 바른컴퓨터 본체 풀세트F11 본체 + 모니터 풀세트_F11 홈>디지털/가전>PC>조립/베어본PC;홈>[게임용 & 사무용 풀세트 PC];홈>전체상품;(#M)홈>[게임용 & 사무용 컴퓨터 PC] Naverstore > 컴퓨터 > 데스크탑 > 조립/반조립PC(베어본)'</li><li>'Beelink-미니 S PC 윈도우 11, 인텔 11th 셀러론 N5095 8GB DDR4 128GB/256GB SSD 데스크탑 게임용 컴퓨터 VS U59 GK 미니 J4125 Beelink-미니 S PC 윈도우 11 인텔 11th 셀러론 N5095 8GB DDR4_CHINA_16GB DDR4 256GB SSD+미국 (#M)가전·컴퓨터>노트북·데스크탑>브랜드PC·올인원>미니PC·기타 Tmon > 가전·디지털 > 가전·컴퓨터 > 노트북·데스크탑 > 브랜드PC·올인원 > 미니PC·기타'</li><li>'인텔 NUC 누크 11세대 타이거캐년 i5 프로세서 미니PC 베어본 NUC11TNKi5 (#M)11st>데스크톱>조립/베이본PC>코어 i5 11st > 가전/디지털 > 데스크톱 > 조립/베이본PC > 코어 i5'</li></ul> |
| 171.0 | <ul><li>'가디브 무지외반증 교정기 발가락링 엄지 발가락 통증 1등급 의료기기 대(15일 무료체험)_교정용 (#M)생활/건강>발건강용품>발가락교정기 GFK > Naverstore > 건강/의료용품 > 발건강용품'</li><li>'LG전자 오브제 컬렉션 양문형 냉장고 S634BB35Q (OK) MinSellAmount (#M)주방가전>냉장고/냉동고>양문형냉장고 Gmarket > 가전 > 주방가전 > 냉장고/냉동고 > 양문형냉장고'</li><li>'삼성전자 양문형 냉장고 RS84B5041M9 (846L) 서울지역 (#M)11st>냉장고>양문형>양문형 11st > 가전/디지털 > 냉장고 > 양문형 > 양문형'</li></ul> |
| 112.0 | <ul><li>'프로크리에이트 질감 인물화 브러쉬 9종 (+튜토리얼) 질감 인물화 브러쉬 9종 (#M)디지털/가전>소프트웨어>유틸리티 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li><li>'2024 굿노트 다이어리 날짜형 속지 아이패드 갤럭시탭 먼슬리 위클리 하이퍼링크 플래너 PDF 베이지블로썸_모눈+타임(월월)_첫구매자 (#M)디지털/가전>소프트웨어>유틸리티 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 유틸리티'</li><li>'[1분발송]리훈 오늘곰부 굿노트 스터디플래너 다이어리 속지 아이패드 양식 노타빌리티 PDF 필기 1.오늘곰부_오른손잡이용 (#M)디지털/가전>소프트웨어>유틸리티 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 유틸리티'</li></ul> |
| 180.0 | <ul><li>'키친아트 요거트메이커 용기8개 온도설정 디지털D3081 (#M)홈>디지털/가전>주방가전>요구르트제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 요구르트,치즈'</li><li>'주코 라미 요거트메이커 ZY-ZC501M 주코 라미 요거트메이커 ZY-ZC501M (#M)디지털/가전>주방가전>요구르트제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 요구르트,치즈'</li><li>'키친아트 요거트메이커 그릭요거트만들기 기계 요거메이트 요구르트제조기 옵션1. 500ml (#M)홈>🔴 디지털가전 Naverstore > 가전 > 주방가전 > 간식메이커 > 요구르트,치즈'</li></ul> |
| 60.0 | <ul><li>'[ 가 118만원✅SSD 무상업글] 삼성 갤럭시북2 프로 NT930XEW-A51A 엔씨디 빠르고 가벼운 휴대용 대학생 사무용 문서작업 튼튼한 최신 인텔12세대 13.3 노트북 실버 컬러 (W-A51AS)_무선 마우스+파우치+액정보호 필름+키스킨_NVMe 500G 개봉장착+256G 추가동봉 (#M)홈>▼ 추천 노트북>가벼운 노트북 추천 Naverstore > 컴퓨터 > 노트북 > 삼성갤럭시북'</li><li>'삼성전자 노트북 플러스2 NT550XDA-K14A 정품Win11탑재 인강용 사무용 재택 노트북 화이트(NVMe 128GB+RAM 4GB) (#M)11st>노트북>삼성전자>AMD 11st > 가전/디지털 > 노트북 > 삼성전자 > AMD'</li><li>'[LG] 노트북 가성비부터 최고사양 노트북모음. 002.LG울트라PC 15UD40R-GX56K (#M)가전·컴퓨터>노트북·데스크탑>노트북>일반·사무용 Tmon > 가전·디지털 > 가전·컴퓨터 > 노트북·데스크탑 > 노트북 > 일반·사무용'</li></ul> |
| 138.0 | <ul><li>'[세정액 2개] 브라운 전기면도기 최신 시리즈9 PRO PLUS 충전 세척스테이션 구성 그라파이트[9F65]+세정액2개[BO31] (#M)이미용가전>전기면도기>남성용 GFK > traverse > 11st > 가전/디지털 > 이미용가전 > 전기면도기 > 남성용'</li><li>'손흥민에디션 질레트 랩스 딥클렌징바 면도기 (핸들+1입면도날+거치대+쉐이빙젤) (#M)디지털/가전>이미용가전>면도기소모품>기타면도기소모품 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 면도기/면도용품'</li><li>'질레트 프로쉴드 옐로우 파워 면도기 (핸들+1입날) [저자극+밀착 면도] 질레트 프로쉴드 옐로우 파워 면도기 (핸들+1입날) [저자극+밀착 면도] 홈>바디케어>데오/제모>면도기;홈;홈>남성>쉐이빙>면도기/면도날;홈>바디케어>제모용품>면도기;홈>바디케어>제모용품>면도기/제모의료기기;(#M)홈>바디케어>제모/왁싱>남성 쉐이빙 OLIVEYOUNG > 남성 > 쉐이빙 > 면도기/면도날'</li></ul> |
| 6.0 | <ul><li>'이엠텍 지포스 RTX 4060 STORM X Dual D6 8GB.~ (#M)PC부품>그래픽카드>지포스(nVidia) GFK > traverse > 11st > 가전/디지털 > PC부품 > 그래픽카드 > 지포스(nVidia)'</li><li>'노트북 DDR3 4G PC3 10600S 램 삼성 정품 (#M)디지털/가전>PC부품>RAM>노트북용 GFK > Naverstore > 컴퓨터 > 부품 > RAM'</li><li>'삼성전자 DDR4 16GB PC4 - 21300(2666V) 데스크탑 메모리 삼성 16GB 21300(2666V) (#M)디지털/가전>PC부품>RAM>데스크탑용 GFK > Naverstore > 컴퓨터 > 부품 > RAM > 데스크탑용'</li></ul> |
| 79.0 | <ul><li>'에버넷 디지털도어락 현관문도어락 현관도어락 터치키 번호키 EN250-N EN250N(카드키 없음) (#M)디지털/가전>생활가전>디지털도어록>보조키형 Naverstore > 가전 > 생활가전 > 디지털도어록 > 보조키형'</li><li>'도어락 스티커 카드키 태그 RFID RF 디지털 도어록 터치 13.56Mhz 라벨 스티커 태그 05.메탈 스티커 태그B(No.100T) (#M)홈>RFID 태그&카드👍 Naverstore > 가전 > 생활가전 > 디지털도어록 > 보조키형'</li><li>'삼성도어락카드키 SDS 스티커 부착형 카드키 아파트 현관 삼성 도어락카드키 부착형 (화이트) 랜덤발송 홈>카드키;홈>전체상품;(#M)홈>도어락 카드키 Naverstore > 가전 > 생활가전 > 디지털도어록 > 주키형'</li></ul> |
| 11.0 | <ul><li>'파워 파워서플라이 컴퓨터파워 앱코 SUITMASTER SETTLER 700W 화이트 벌크 (#M)디지털/가전>PC부품>파워서플라이>ATX파워 GFK > Naverstore > 컴퓨터 > 부품 > 파워서플라이 > ATX파워'</li><li>'darkFlash UPMOST 850W 80PLUS GOLD FULL MODULAR 블랙 (#M)11st>PC부품>파워>ATX파워 11st > 가전/디지털 > PC부품 > 파워 > ATX파워'</li><li>'오랄비 스테이지스 파워 어린이 전동칫솔 유아 겨울왕국 D12K 겨울왕국 전동칫솔 (#M)디지털/가전>생활가전>구강청정기>전동칫솔 GFK > naver_plus_traverse > Naverstore > 가전 > 욕실가전 > 전동칫솔'</li></ul> |
| 216.0 | <ul><li>'BS 니콘정품 Z30 16-50mm KIT 새상품 (#M)디지털/가전>카메라/캠코더용품>미러리스디카 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 미러리스카메라'</li><li>'파나소닉 루믹스 DC-S9 + S 18-40mm KIT 정품/TR 다크 올리브 (#M)카메라/주변기기>미러리스카메라>미러리스카메라 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 미러리스카메라 > 미러리스카메라'</li><li>'시그마 (Sigma) SIGMA 풀 사이즈 미러리스 SLR 카메라 fp 바디 (#M)SSG.COM>카메라/캠코더>디지털카메라/액션캠>DSLR GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 디지털카메라/액션캠 > DSLR'</li></ul> |
| 43.0 | <ul><li>'신일 컨백션 전기히터 컨벡터 컨벡션 온열기 난로 가정용 사무실 리모컨 온도조절 안전 SEH-C310 (#M)디지털/가전>계절가전>컨벡터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 컨벡터'</li><li>'신일 전기 컨벡션 히터 컨벡터 동파방지 벽걸이라디에이터 대류식난방기 T15HSS 신일 컨벡터 T15HSS (#M)디지털/가전>계절가전>컨벡터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 컨벡터'</li><li>'밀 MILL 북유럽 가정용 전기 컨벡터 히터 타이머 온풍기 전기난로 MILL1900TMAX (#M)디지털/가전>계절가전>컨벡터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 컨벡터'</li></ul> |
| 53.0 | <ul><li>'IPTIME EFM네트웍스 아이피타임 A3000U 무선랜카드 NPAYMALL (#M)디지털/가전>네트워크장비>랜카드>무선랜카드 Naverstore > 컴퓨터 > 주변기기 > 랜카드 > 무선'</li><li>'EFM ipTIME A3000UA USB 무선 랜카드 (#M)홈>디지털/가전>네트워크장비>랜카드>무선랜카드 Naverstore > 컴퓨터 > 주변기기 > 랜카드 > 무선'</li><li>'EFM ipTIME U1G-C USB 3.0 기가비트 랜카드 (#M)컴퓨터 주변기기>네트워크장비>LAN카드 GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 네트워크장비 > LAN카드'</li></ul> |
| 117.0 | <ul><li>'IFI ZEN Air DAC '</li><li>'아이리버 SE300 포터블 하이엔드 DAP.R-2R DAC . Class A AMP (#M)음향가전>기타 음향기기>음향기기 기타 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 기타 음향기기 > 음향기기 기타'</li><li>'오닉스 Onix Mystic XP1 DAC-AMP [한국총판] 해외배송 (설 연휴 이후 발송)_뮤직파이 공구 Mystic XP1 (#M)디지털/가전>음향가전>DAC GFK > traverse > Naverstore > 디지털 > 음향기기 > 플레이어 > 기타'</li></ul> |
| 183.0 | <ul><li>'LG전자 엘지 B101W14 B101S14 일반냉장고 소형 미니 입원실 원룸 사무실 B101S14(샤인) (#M)11st>냉장고>일반형>일반형 11st > 가전/디지털 > 냉장고 > 일반형 > 일반형'</li><li>'윈텍 WC-32CGN 레트로 냉장고 무소음 그린 32L 음료냉장고 가정용 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고'</li><li>'소형 냉장고 기숙사 중형 미니 사무실 가정용 간식보관 모텔 스마트 07)더블도어/80A168/실버/과일케이스 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고'</li></ul> |
| 211.0 | <ul><li>'아우스 V-REX 컴퓨터 게이밍의자 발받침 높이 각도조절 게임용 PC방 의자 화이트 (#M)가구/인테리어>서재/사무용가구>의자>목받침의자 GFK > Naverstore > 디지털 > 게이밍 > 게이밍가구 > 게이밍의자'</li><li>'Qwertykeys QK65v2 추가 파츠 (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > traverse > Naverstore > 컴퓨터 > 부품 > 튜닝용품 > 기타튜닝용품'</li><li>'레노버 샤오신패드 프로 12.7 8+128GB Pad Pro 2023년 내수롬 8+128GB 그레이 (#M)디지털/가전>태블릿PC GFK > traverse > Naverstore > 컴퓨터 > 노트북 > 태블릿PC'</li></ul> |
| 225.0 | <ul><li>'프리즘 LED 스탠드 PL-1400 충전식 무선 시력보호 듀얼헤드 각도조절 책상 조명 (#M)디지털/가전>생활가전>스탠드>LED스탠드 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 생활가전 > 기타생활가전'</li><li>'Holy Stone ID 13.9g 리모트 외장 발신기 드론 등록 제도 대응 국토 교통성 대응 모델 5시간 (#M)SSG.COM>카메라/캠코더>촬영용 드론 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 촬영용 드론'</li><li>'라미 3WAY 대형 카메라 스마트폰 삼각대 RM-MT180 (4단/180cm) 라미 삼각대 RM-MT180 PRO(3단) (#M)디지털/가전>카메라/캠코더용품>삼각대/헤드>삼각대 GFK > traverse > Naverstore > 디지털 > 카메라 > 삼각대/헤드 > 삼각대'</li></ul> |
| 4.0 | <ul><li>'랜선 랜케이블 인터넷선 UTP LAN 선 다이렉트 인터넷 연결선 CAT.5E 0.3m 7.CAT8 SFTP (40G) 고품질_2m 블랙 홈>케이블(영상,음성,통신)>랜 케이블;(#M)홈>디지털/가전>PC부품>PC케이블>랜케이블 Naverstore > 컴퓨터 > 주변기기 > 케이블/젠더 > 케이블'</li><li>'키크론 프리미엄 기계식 키보드 항공 케이블 코일 USB C타입키크론 항공케이블 스트레이트_퍼플 (#M)가전·컴퓨터>PC부품·주변기기>기타 부품 Tmon > 가전·디지털 > 가전·컴퓨터 > PC부품·주변기기 > 기타 부품'</li><li>'마하링크 스테레오 AUX 고급형 케이블 1M ML-STH010 (#M)디지털/가전>PC부품>PC케이블>오디오케이블 Naverstore > 컴퓨터 > 주변기기 > 케이블/젠더 > 케이블'</li></ul> |
| 98.0 | <ul><li>'카리스 자외선 살균기 소독기 KRS-989 10리터 (#M)디지털/가전>생활가전>자외선소독기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기'</li><li>'(기념 중) 국산 다용도 이동 공간살균기(아래 동영상 참조) 집먼지진드기퇴치 세균박멸등 특허등록 CE인증 자외선 UVC led 엔퓨텍 XD-2D04 (#M)디지털/가전>생활가전>자외선소독기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기'</li><li>'모스티브 탁상용 철제류 네일 살균기 (#M)디지털/가전>생활가전>자외선소독기 GFK > Naverstore > 가전 > 생활가전 > 살균소독기'</li></ul> |
| 26.0 | <ul><li>'부산보일러 린나이 RC610-N-15KFN 친환경 콘덴싱 창원김해울산양산 설치 교체 (#M)디지털/가전>계절가전>보일러>가스보일러 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 계절가전 > 보일러'</li><li>'대성보일러 DNC1-15D 서울 의정부 남양주 강북구 도봉구 노원구 수리 교체 당일 설치 (#M)디지털/가전>계절가전>보일러>가스보일러 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 계절가전 > 보일러'</li><li>'삼양 구동기 CEC VA-200 / 지멘스 구동기 삼양 커넥터 (#M)디지털/가전>계절가전>보일러>가스보일러 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 보일러'</li></ul> |
| 106.0 | <ul><li>'부성핫슈 핸드드라이어 BSHD-2807 드라이 손건조기 업소용 초강력 손건조 화이트(WD-07) (#M)홈>디지털/가전>생활가전>핸드드라이어 Naverstore > 가전 > 욕실가전 > 손건조기'</li><li>'모두의만물 초고속핸드드라이어 HTM-350 전면LED 강력한바람 온풍 2,1000W 일반 HTM-350[2100W] (#M)디지털/가전>생활가전>핸드드라이어 Naverstore > 가전 > 욕실가전 > 손건조기'</li><li>'다이슨 에어블레이드 핸드드라이어 V / 니켈 1번-왼쪽_선택안함 홈>다이슨 핸드드라이어;(#M)홈>환경위생>핸드드라이어 Naverstore > 가전 > 욕실가전 > 손건조기'</li></ul> |
| 68.0 | <ul><li>'[로지텍] Logitech C920 PRO HD WebCam 웹캠 화상카메라 벌크 택배 병행 당일출고 C920 (#M)디지털/가전>멀티미디어장비>웹캠 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 웹캠'</li><li>'앱코 ABKO, APC720 Lite HD 웹캠 화상카메라 캠 컴퓨터카메라 (#M)디지털/가전>멀티미디어장비>웹캠 Naverstore > 컴퓨터 > 주변기기 > 웹캠'</li><li>'프리에이티브 고해상도 웹캠 AF500FHD 500만화소 풀HD 줌 온라인 수업 구루미캠 1080P 60FPS 하이엔드 AFC80FHD (#M)디지털/가전>멀티미디어장비>웹캠 Naverstore > 컴퓨터 > 주변기기 > 웹캠'</li></ul> |
| 101.0 | <ul><li>'빅버튼 유선전화기사무실 회사 집 가정용 발신자표시 선택1 : OID-500 (#M)홈>전체상품 Naverstore > 가전 > 생활가전 > 전화기 > 유선'</li><li>'전화기선 키폰 수화기선 줄 코드 전화선 케이블 송수화기선 전화기선-검정 (#M)디지털/가전>생활가전>전화기>유선전화기 GFK > Naverstore > 가전 > 생활가전 > 전화기'</li><li>'맥슨 유선 전화기 집 사무실 일반전화기 옛날 (#M)디지털/가전>생활가전>전화기>유선전화기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 생활가전 > 기타생활가전'</li></ul> |
| 156.0 | <ul><li>'[초강력세척] 비안크루세 가정용 야채 과일 초음파 세척기 '</li><li>'클로베리 프리미엄 과일야채 살균세척기 '</li><li>'리비다 채칼 전동 자동 만능 오토 돌돌이 슬라이서 야채 양배추 당근 감자 무 채써는기계 (#M)디지털/가전>주방가전>기타주방가전 GFK > traverse > Naverstore > 가전 > 주방가전'</li></ul> |
| 41.0 | <ul><li>'매장판 온라인 단독 오엘라 제습기 01. 오엘라 소형 제습기 SD01 (#M)가전·컴퓨터>계절가전>제습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 제습기'</li><li>'ThinkAir DL12 제습기 (#M)11st>계절가전>제습기>가정용 11st > 가전/디지털 > 계절가전 > 제습기 > 가정용'</li><li>'삼성 제습기 1등급 인버터 원룸 미니 AY18CG7500GED 베이지 18L 23년 신형 세이지 그린 (#M)홈>✨23년 NEW 제습기✨ Naverstore > 가전 > 계절가전 > 제습기'</li></ul> |
| 173.0 | <ul><li>'(+1.8L 컨테이너 볼 추가 증정) 콘체 X5 초강력 블렌더 카페믹서기 업소용 블렌더 티타늄코팅 칼날 '</li><li>'신일 대용량믹서기 4500ml 스텐레스/김장/대형/업소용 '</li><li>'최신형 vitamix 바이타믹스 콰이어트원 블랜더+추가볼 (에어레이팅볼 선택) /정품 '</li></ul> |
| 27.0 | <ul><li>'가습기 미니가습기 가열실가습기 천연가습기 대용량가습기 복합식가습기 안티 중력 800ML UV 공기 청정기 가습기 미니가습기 가열실가습기 천연가습기 대용량가습기 복합식가습기 안티 중력 800ML UV 공기 청정기_02 800ml Light Green (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li><li>'가습기 불멍 타워형 복합식 대용량 생수병 거실용 미니 아로마 케어 침실 용 대형 룸 (2L 워터 탱크) 쿨 미스트 탑 필 (에센셜 오일 디퓨저 포함) 가습기 불멍 타워형 복합식 대용량 생수병 거실용 미니 아로마_white_JP 플러그 (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li><li>'9L 대용량 복합식 가열식 THE완벽한가습기 AMH 9000 /23년형 상부급수 통세척 2 원대 프리미엄 무선 물걸레청소기 글라이드S AMC-2500 전용거치대+세탁 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기'</li></ul> |
| 97.0 | <ul><li>'에어베리 스마트 의류관리기 2set 세트 구성 향기 1대+살균 1대+향기블럭3개+제습겔1팩_코코브리즈 (3개) (#M)홈>스마트 의류관리기 Naverstore > 가전 > 세탁기/건조기 > 의류관리기'</li><li>'LG 올 뉴 스타일러 오브제컬렉션 SC5GMR81H 상의 5벌 + 하의 1벌 블랙틴트미러 (GD) (#M)세탁기/건조기>의류관리기>의류관리기 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 의류관리기'</li><li>'LG S5BBU 스타일러 5벌+바지 1벌 / KN (#M)세탁기/건조기>의류관리기>의류관리기 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 의류관리기'</li></ul> |
| 170.0 | <ul><li>'자일렉 가정용 소프트 아이스크림메이커 ZL-214S '</li><li>'소프트아이스크림기계 메이커 업소용 상하목장 카페 테이블 요거트아이스크림 머신 콘 정품AS '</li><li>'브레빌 아이스크림 메이커 스마트 스쿱 BCI600 (#M)디지털/가전>주방가전>아이스크림제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 아이스크림'</li></ul> |
| 158.0 | <ul><li>'쁘띠냉장고 수납물 선반 위 냉장고 상단 공간선반 주방선반 조가비 층칸막이 냉장고에서 T19-밀리터리그린 화이트 헐렁헐값_선택하세요 (#M)홈>디지털/가전>주방가전>냉장고>일반형냉장고 Naverstore > 가전 > 냉장고 > 3,4도어'</li><li>'저온창고 도어 문 Haier 냉장고 씰 스트립 고무 링 마그네틱 흡입 원래 액세서리 범용 가죽 클로저 134 단일 도어 "업그레이드 두껍게-강한 자기 매력" (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고'</li><li>'저온창고 도어 문 Haier 냉장고 씰 스트립 고무 링 마그네틱 흡입 원래 액세서리 범용 가죽 클로저 134 옆집 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고'</li></ul> |
| 8.0 | <ul><li>'[공식몰/ ] GIGABYTE B760M DS3H D4 피씨디렉트 (#M)11st>PC부품>메인보드>인텔 CPU용 11st > 가전/디지털 > PC부품 > 메인보드 > 인텔 CPU용'</li><li>'[공식몰/ ] GIGABYTE B760M AORUS ELITE 피씨디렉트 (#M)11st>PC부품>메인보드>인텔 CPU용 11st > 가전/디지털 > PC부품 > 메인보드 > 인텔 CPU용'</li><li>'[ASRock] B660M Pro RS D4 디앤디컴 (인텔B660/M-ATX) (#M)디지털/가전>PC부품>메인보드>인텔CPU용 GFK > Naverstore > 컴퓨터 > 부품 > 메인보드 > 인텔용'</li></ul> |
| 95.0 | <ul><li>'삼성전자 삼성 VC33M3120LU 싸이클론 진공청소기 안티탱글 3중청정클린 슬라이드핸들 (#M)디지털/가전>생활가전>청소기>유선청소기 Naverstore > 가전 > 청소기 > 진공청소기'</li><li>'[LG 공식판매점] 슈퍼 싸이킹 III 청소기 K83RG (#M)홈>생활가전>싸이킹 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 유선청소기'</li><li>'LG전자 유선 최강흡입 통돌이 진공청소기 홈>디지털/가전>생활가전>청소기>유선청소기;(#M)홈>전체상품 Naverstore > 가전 > 청소기 > 유선청소기'</li></ul> |
| 67.0 | <ul><li>'4k HDMI USB 2.0 캡쳐보드 화면 녹화 obs 게임 스크린 캡처 방송 닌텐도 스위치 02_USB 3.0 (#M)디지털/가전>멀티미디어장비>영상편집카드>영상편집 GFK > Naverstore > 가전 > 영상가전 > 액세서리 > 영상편집카드'</li><li>'엠비에프 MBF-UHCP-C '</li><li>'AVerMedia GC553 외장형 캡쳐카드 4K 캡쳐보드 '</li></ul> |
| 29.0 | <ul><li>'캠핑 선풍기 캠핑용 써큘레이터 무선 충전식 무드등 차박 탁상용선풍기 캠핑선풍기+수납가방 (#M)홈>디지털/가전>계절가전>선풍기>탁상형선풍기 Naverstore > 가전 > 계절가전 > 선풍기 > 미니선풍기'</li><li>'프롬비 사일런트 스톰 저소음 무선 휴대용선풍기 FA135 SilentStorm(거치대형) 인디핑크 (#M)디지털/가전>계절가전>선풍기>휴대용선풍기 Naverstore > 가전 > 계절가전 > 선풍기 > 휴대용'</li><li>'신일 캠핑용선풍기 캠핑선풍기 무선 휴대용 야외용 충전식 12인치 선풍기 캠핑장 12인치+가방 / 무선 / 아이보리색 홈>디지털/가전>계절가전>선풍기>휴대용선풍기;(#M)홈>디지털/가전>계절가전>선풍기>탁상형선풍기 Naverstore > 가전 > 계절가전 > 선풍기 > 탁상형'</li></ul> |
| 63.0 | <ul><li>'게이밍 게임 스탠딩 마이크 배그 디스코드 컴퓨터 JTUM400 실버 단품 실버단품 (#M)디지털/가전>멀티미디어장비>PC마이크 GFK > Naverstore > 컴퓨터 > 주변기기 > 사운드 > 마이크'</li><li>'컴소닉 CM-7010 USB 프리미엄 스탠드마이크 게임 방송 디코 디스코드 필라마이크 CM-7010 USB Premium (#M)디지털/가전>멀티미디어장비>PC마이크 GFK > Naverstore > 컴퓨터 > 주변기기 > 사운드 > 마이크'</li><li>'앱코 MP3300 USB 콘덴서 스트리밍 스탠드 마이크 (#M)디지털/가전>멀티미디어장비>PC마이크 GFK > Naverstore > 컴퓨터 > 주변기기 > 사운드 > 마이크'</li></ul> |
| 181.0 | <ul><li>'휴렉 음식물 처리기 히어로 HD-9000SD (건조형) 히어로 필터 필터 추가(3개) (#M)디지털/가전>주방가전>음식물처리기 Naverstore > 가전 > 주방가전 > 위생관리 > 음식물처리기'</li><li>'스마트카라 PCS-400 가정용 음식물처리기 PCS-400 화이트+필터2세트 (#M)디지털/가전>주방가전>음식물처리기 GFK > Naverstore > 가전 > 주방가전 > 위생관리 > 음식물처리기'</li><li>'락앤락 음식물 쓰레기 냉장고 3L 화이트/그레이 (EJT116) 화이트 (#M)디지털/가전>주방가전>음식물처리기 Naverstore > 가전 > 주방가전 > 위생관리 > 음식물처리기'</li></ul> |
| 223.0 | <ul><li>'니콘 어댑터 링 SY-1-52 52mm (#M)카메라/주변기기>렌즈용품>렌즈용품 기타 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 렌즈용품 > 렌즈용품 기타'</li><li>'스퀘어후드 후지필름 XF33 / XF23mm f1.4 R LM WR / XF16-50mm 렌즈 후드 (#M)디지털/가전>카메라/캠코더용품>렌즈용품>렌즈후드 GFK > traverse > Naverstore > 디지털 > 카메라 > 렌즈용품 > 렌즈후드'</li><li>'WEB CMOS CMS-V52S 산와 서플라이 카메라 회의용 와이드 렌즈 광각(수평 (#M)SSG.COM>카메라/캠코더>디지털카메라/액션캠>캠코더 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 디지털카메라/액션캠 > 캠코더'</li></ul> |
| 197.0 | <ul><li>'테팔 컴팩트 커피메이커 원두커피 커피 CM3218 (#M)홈>디지털/가전>주방가전>커피메이커 Naverstore > 가전 > 주방가전 > 커피용품 > 커피메이커'</li><li>'브레빌 커피 그라인더 도즈 컨트롤 프로 BCG600 (#M)디지털/가전>주방가전>커피메이커 Naverstore > 가전 > 주방가전 > 커피용품 > 커피메이커'</li><li>'[리빙가전] 테팔 커피메이커 비보 CM222B (#M)가전·컴퓨터>주방가전>전기주전자>무선포트 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기주전자 > 무선포트'</li></ul> |
| 163.0 | <ul><li>'키친아트 다지기 KM-28FM 스테인레스 6리터 대용량 키친아트 다지기 KM-28F (#M)주방가전>믹서기/핸드블렌더>다지기/분쇄기 GFK > 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 다지기/분쇄기'</li><li>'7초 만능 다지기 김장 대용량 마늘 박피기 다지는기계 마늘 까는기계 만능다지기 2.5L(마늘박피기포함) (#M)홈>디지털/가전>주방가전>분쇄기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 분쇄기'</li><li>'한일전기 3.2L 대용량 스텐믹서 SHMF-3250S (#M)디지털/가전>주방가전>분쇄기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 분쇄기'</li></ul> |
| 165.0 | <ul><li>'[세트할인] 단미 1구 와플메이커 샌드위치메이커 SAN01+플레이트 세트 (붕어빵 or 도넛) SAN01 핑크 + 붕어빵 플레이트 (#M)디지털/가전>주방가전>샌드위치제조기 Naverstore > 가전 > 주방가전 > 간식메이커 > 샌드위치'</li><li>'키친아트 샌드위치 메이커 (#M)가전·컴퓨터>주방가전>토스트·제빵·간식>홈베이킹·간식메이커 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 토스트·제빵·간식 > 홈베이킹·간식메이커'</li><li>'[6%쿠폰] 키친아트 샌드위치 메이커 토스트기 토스터기 아이들-아빠 간식메이커 PK-2168JT(샌드위치) (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기'</li></ul> |
| 164.0 | <ul><li>'키친아트 5L 자동 전기 빙수기 KIC-2311WS (#M)디지털/가전>주방가전>빙수기 Naverstore > 가전 > 주방가전 > 간식메이커 > 빙수기'</li><li>'키친아트 빙수기/전기빙수기/슬러시 KAIM-P2791NK (#M)홈>디지털/가전>주방가전>빙수기 Naverstore > 가전 > 주방가전 > 간식메이커 > 빙수기'</li><li>'보국전자 눈꽃 얼음 빙수기 BKK-1140S 팥빙수 우유빙수 설빙빙수 (#M)디지털/가전>주방가전>빙수기 Naverstore > 가전 > 주방가전 > 간식메이커 > 빙수기'</li></ul> |
| 76.0 | <ul><li>'테팔 클래식 논스틱 코팅열판 건식 다리미 (#M)11st>생활가전>다리미>스팀다리미 11st > 가전/디지털 > 생활가전 > 다리미 > 스팀다리미'</li><li>'태팔건식 가벼운다리미 클래식 논스틱 코팅 열판 경량 다리미 (#M)생활가전>다리미>건식다리미 GFK > 11st > 가전/디지털 > 생활가전 > 다리미 > 건식다리미'</li><li>'스팀다리미 스마트 프로텍트 플러스 FV6872/다리미/테팔/테팔(가전) (#M)홈>디지털/가전>생활가전>다리미>건식다리미 Naverstore > 가전 > 생활가전 > 다리미 > 건식'</li></ul> |
| 31.0 | <ul><li>'[HDC아이파크몰] 벤타 오리지널에어워셔 LW-45B 블랙기화식 가습기 공기청정기 LW-45W(화이트) (#M)홈>디지털/가전>계절가전>공기정화기>에어워셔 Naverstore > 가전 > 계절가전 > 공기청정기 > 에어워셔'</li><li>'[LG 공식판매점] 퓨리케어 에어워셔 HW500DAS 5L 자연기화식 가습기 35㎡ 홈>계절가전>에어워셔;홈>퓨리케어 공기청정기;홈>에어케어>에어워셔(가습기);(#M)홈>계절가전>에어워셔(가습기) Naverstore > 가전 > 계절가전 > 공기청정기 > 에어워셔'</li><li>'LG전자 퓨리케어 공기청정기 AS301DNPA .. LG전자 퓨리케어 공기청정기 AS301DNPA 무료배송 .. (#M)가전·컴퓨터>계절가전>에어워셔·공기청정>에어워셔·공기청정 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 에어워셔·공기청정'</li></ul> |
| 72.0 | <ul><li>'Ekeepment 하이라이저 높이조절 아이맥 메탈 모니터 받침대 스탠드 선반 Silver (#M)디지털/가전>모니터주변기기>모니터받침대 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 받침대'</li><li>'알파플랜 높은 알루미늄 아이맥 모니터 받침대 스탠드 선반 560mm_스페이스그레이(SG) (#M)디지털/가전>모니터주변기기>모니터받침대 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 받침대'</li><li>'높은 모니터받침대 듀얼 모니터 받침대 스탠드 받침 선반 (#M)디지털/가전>모니터주변기기>모니터받침대 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용'</li></ul> |
| 36.0 | <ul><li>'[일월] 22년형 프리미엄 온수 매트 듀얼하트 온수매트 퀸 홈>온수카페트;(#M)홈>온수매트 Naverstore > 가전 > 계절가전 > 냉온수매트 > 온수매트'</li><li>'일월 듀얼하트 온수매트 플러스싱글 2023년 최신형 05.초슬림 온수매트_싱글100x200 홈>매트_커버>온수매트;(#M)홈>전체상품 Naverstore > 가전 > 계절가전 > 냉온수매트 > 온수매트'</li><li>'비나잇 프리미엄 온수매트 세탁 워셔블 스몰 싱글 침대용 퀸(1500x1900)_단일난방(침대용) (#M)디지털/가전>계절가전>온수매트 Naverstore > 가전 > 계절가전 > 냉온수매트 > 온수매트'</li></ul> |
| 22.0 | <ul><li>'[르젠] 선풍기 리모컨 (기타) '</li><li>'[스멜스탑 본사몰] 화장실 환풍기 댐퍼 배관용품 & 주방렌지후드 음식냄새 역류방지 아파트 담배냄새차단 (2타입) '</li><li>'베셀S자드라이버2PC셋 코너 ㄱ자 양용 직각 기억자 특수 십자 공구 (#M)주방가전>정수기>부속품 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 정수기'</li></ul> |
| 230.0 | <ul><li>'중고갤럭시S21/S21+/울트라/Z폴드/Z플립 프리미엄 중고 공기계 자급제 노트20울트라 256GB_3사공용 화이트_특S급 쇼킹딜 홈>디지털>휴대폰/액세서리>가입상품/공기계;11st>휴대폰>공기계/언락폰>삼성;11st>휴대폰>자급제/공기계>삼성;쇼킹딜 홈>디지털>리퍼/중고/렌탈>리퍼/중고/렌탈;11st>디지털>리퍼/중고/렌탈>리퍼/중고/렌탈;11st>휴대폰>중고폰>중고폰;11st > 디지털/가전/컴퓨터 > 휴대폰 > 공기계/언락폰;(#M)11st>휴대폰>공기계/중고폰>공기계/새상품 11st > 가전/디지털 > 휴대폰 > 공기계/중고폰 > 공기계/새상품'</li><li>'[정품 리퍼폰]노트20,10/노트10플러스,갤럭시10 5G/20/20+ 공기계/알뜰폰/리퍼폰/새배터리/새액정 갤럭시S20플러스 256GB_리퍼폰(새액정+새배터리+테두리 교체)_3사공용-아우라블루 11st>휴대폰>중고폰>중고폰;(#M)11st>휴대폰>공기계/중고폰>공기계/새상품 11st > 가전/디지털 > 휴대폰 > 공기계/중고폰 > 공기계/새상품'</li><li>'[프리미엄리퍼폰/중고폰]갤럭시S22/S21/S20/S10/노트20/노트10/Z플립2,3/21울트라/알뜰폰/공기계 갤럭시S21플러스 256GB_리퍼폰(새액정+새배터리+테두리 교체)_3사공용-팬텀 바이올렛 11st>디지털>리퍼/중고/렌탈>리퍼/중고/렌탈;11st>휴대폰>중고폰>중고폰;11st Hour Event > 디지털/가전 > 디지털 > 리퍼/중고/렌탈 > 리퍼/중고/렌탈;(#M)11st>휴대폰>공기계/중고폰>공기계/새상품 11st > 가전/디지털 > 휴대폰 > 공기계/중고폰 > 공기계/새상품'</li></ul> |
| 186.0 | <ul><li>'키친아트 샤브샤브 전기 냄비 2단 멀티쿠커 전골냄비 (#M)홈>디지털/가전>주방가전>전기쿠커>전기냄비 Naverstore > 가전 > 주방가전 > 전기쿠커 > 전기냄비'</li><li>'Bear 7구 올스텐 미니 고구마 계란찜기 달걀삶는 기계 타이머 Bear 다용도 계란찜기 (#M)디지털/가전>주방가전>전기쿠커>전기찜기 Naverstore > 가전 > 주방가전 > 전기쿠커 > 전기찜기'</li><li>'[키친아트] 허브 자취용 만능 멀티쿠커 찜기 냄비 KTP-MS1218 (#M)11st>주방가전>전기포트>무선포트/주전자 11st > 가전/디지털 > 주방가전 > 전기포트 > 무선포트/주전자'</li></ul> |
| 102.0 | <ul><li>'로봇 진공 청소기 Hepa 필터 샤오미 Roborock S5 Max S6 MaxV 액세서리 예비 부품 로봇 진공 청소기 Hepa 필터 사*미 Roborock S5 Max S6 MaxV 액_세트 J (#M)가전·컴퓨터>생활가전>청소기>로봇청소기 Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 청소기 > 로봇청소기'</li><li>'[호환] 라이드스토 R1 S1 필터 소모품 로봇청소기 부품 교체 사이드 브러쉬 2EA (#M)홈>디지털/가전>생활가전>청소기>청소기액세서리 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 청소기액세서리'</li><li>'MD글로벌 다이슨 거치대 V10 V8 V7 V6 전기종 호환 6.프리미엄 다이슨 전용 거치대 - 화이트 (#M)홈>디지털/가전>생활가전>청소기>청소기액세서리 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 청소기액세서리'</li></ul> |
| 23.0 | <ul><li>'LG 냉난방기 스탠드 인버터 냉온풍기 업소용 사무실 15형 PW0603R2SF 설치비별도 특가\t15형\t3등급\tPW0603R2SF 홈>추천★냉난방기모음;홈>추천★냉난방기;(#M)홈>냉난방기>LG전자>스탠드 냉난방기 Naverstore > 가전 > 계절가전 > 에어컨 > 냉온풍기'</li><li>'삼성전자 스탠드 냉난방기 40평형 인버터 냉온풍기 업소용 AP145RAPDHH1S 홈>냉난방기>삼성>스탠드;(#M)홈>🔥냉난방기>삼성>스탠드 Naverstore > 가전 > 계절가전 > 에어컨 > 냉온풍기'</li><li>'[캐리어대리점] 23년 신형 초절전 인버터 6평형 벽걸이 에어컨 OARC-0061WAWSD (실외기포함/전국 /기본설치무료) (#M)디지털/가전>계절가전>에어컨>벽걸이형에어컨 Naverstore > 가전 > 계절가전 > 에어컨 > 벽걸이형'</li></ul> |
| 141.0 | <ul><li>'수동 코털 깎기 제거기 수동코털제거기 콧털가위 코털정리기 수동콧털제거기 콧털제거기 코털깍기 홈 > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기'</li><li>'필립스 NT 3600 코털제거기 방수 2헤드 코털정리기 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li><li>'나비 전기 코털정리기 코털제거기 코털 잔털제거기 잔털 눈섭정리기 NV151-ENT7 블랙 홈 > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 면도기/제모기 > 코털정리기'</li></ul> |
| 90.0 | <ul><li>'아이스티머 런던 스팀다리미+아이클리너+거치대+레더박스 색상:리얼그린 (#M)가전·컴퓨터>생활가전>다리미·미싱·기타>스팀다리미 Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 다리미·미싱·기타 > 스팀다리미'</li><li>'아웃핏터 프로 프리미엄A(상의+바지) 홈드라이 의류케어 자동스팀다리미판 스탠드 핸드형 와이셔츠다림질 프리미엄C(상의+바지+커버+모자신발+롱) (#M)홈>디지털/가전>생활가전>다리미>스팀다리미 Naverstore > 가전 > 생활가전 > 다리미 > 스팀'</li><li>'[얀스토어(yarn store)]독일 프림 휴대용 미니스팀다리미 (PYRM MINI STEAM IRON) 611916-KB 11st>홈패브릭/수예>주방패브릭>앞치마;(#M)11st>생활가전>다리미>스팀다리미 11st > 가전/디지털 > 생활가전 > 다리미 > 스팀다리미'</li></ul> |
| 64.0 | <ul><li>'Companion 50 컴퓨터 겸용 멀티미디어 스피커 GS (#M)홈>디지털/가전>멀티미디어장비>PC스피커>2.1채널 Naverstore > 컴퓨터 > 주변기기 > 사운드 > 스피커'</li><li>'[브랜드위크 14만] 삼성공식파트너 JBL PULSE4 펄스4 감성 무드등 블루투스 스피커 LED 360도 조명 블랙 쇼킹딜 홈>가전>음향/프로젝터>스피커/사운드바;11st>가전>음향/프로젝터>스피커/사운드바;11st Hour Event > 오늘오픈;(#M)11st>음향가전>스피커>블루투스 스피커 11st > 가전/디지털 > 음향가전 > 스피커 > 블루투스 스피커'</li><li>'앱코 SP400 2채널 멀티미디어 PC스피커 (블랙) (#M)홈>디지털/가전>멀티미디어장비>PC스피커>2채널 Naverstore > 컴퓨터 > 주변기기 > 사운드 > 스피커'</li></ul> |
| 207.0 | <ul><li>'로지텍 무선 무소음 손목 편한 마우스 m331 레드 (#M)디지털/가전>주변기기>마우스>무선마우스 GFK > traverse > Naverstore > 컴퓨터 > 키보드/마우스 > 마우스 > 저소음마우스'</li><li>'클로 넥앤프로 목 어깨 마사지기 안마기 승모근 마사지 기계 지압기 무선 넥앤프로 (베이지)_넥앤프로 (퍼플) (#M)생활/건강>안마용품>안마기 GFK > traverse > Naverstore > 건강/의료용품 > 안마용품'</li><li>'유선 게이밍 광마우스 Hacker A660 3325 센서 핑크 (#M)컴퓨터 주변기기>게이밍 주변기기>게이밍 마우스 GFK > traverse > 11st > 가전/디지털 > 컴퓨터 주변기기 > 게이밍 주변기기 > 게이밍 마우스'</li></ul> |
| 13.0 | <ul><li>'[PS5] 플레이스테이션5 디스크 에디션 (#M)디지털/가전>게임기/타이틀>가정용게임기 Naverstore > 디지털 > 게이밍 > 플레이스테이션 > 본체'</li><li>'(new) 노리박스 TV연결형 오락실게임기 가정용 오락기 레트로 게임기 신형FX팩(5152게임/1080P/총게임지원) (#M)디지털/가전>게임기/타이틀>가정용게임기 Naverstore > 디지털 > 게이밍 > 레트로게임기'</li><li>'[플레이스테이션] 엑스박스 본체 정품 악세사리 모음 07.마이크로소프트 엑스박스 XBOX Series X (#M)가전·컴퓨터>게임·소프트웨어>게임기>소니∙XBOX Tmon > 가전·디지털 > 가전·컴퓨터 > 게임·소프트웨어 > 게임기 > 소니∙XBOX'</li></ul> |
| 220.0 | <ul><li>'인스탁스 미니필름 40매 (#M)SSG.COM>카메라/캠코더>즉석/필름카메라>즉석카메라 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 즉석/필름카메라 > 즉석카메라'</li><li>'인스탁스 디자인 미니필름 모던 5종 세트 (#M)SSG.COM>카메라/캠코더>즉석/필름카메라>즉석카메라 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 즉석/필름카메라 > 즉석카메라'</li><li>'[한국후지필름] 인스탁스X위글위글 콜라보 미니12 즉석카메라 올인원 선물세트 (#M)카메라/주변기기>즉석카메라>일회용카메라 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 즉석카메라 > 일회용카메라'</li></ul> |
| 196.0 | <ul><li>'훼마 샤워 스크린 E98 UP E61 / 라심발리 M27 M23 UP 훼마 샤워스크린 - 60mm (#M)디지털/가전>주방가전>커피머신>커피머신부속품 GFK > Naverstore > 가전 > 주방가전 > 커피용품 > 액세서리'</li><li>'시티즈앤밀크 D123 (화이트, 블랙)시티즈앤밀크 시티즈앤밀크 화이트 (#M)11st>주방가전>커피머신/메이커>캡슐커피머신 11st > 가전/디지털 > 주방가전 > 커피머신/메이커 > 캡슐커피머신'</li><li>'정품 훼마 E98 UP E61 가스켓 FAEMA 페마 커피머신 샤워스크린 - 60mm (#M)디지털/가전>주방가전>커피머신>커피머신부속품 GFK > Naverstore > 가전 > 주방가전 > 커피용품 > 액세서리'</li></ul> |
| 92.0 | <ul><li>'린클 음식물처리기(RC02) 색상:노블네이비 (#M)홈>디지털/가전>생활가전>건조기/탈수기>신발건조기 Naverstore > 가전 > 세탁기/건조기 > 신발건조기'</li><li>'스테인리스 장화세척대 발판 세척기 신발 부츠 공장 800x410x550mm 장화세척대 (#M)세탁기/건조기>건조기>신발건조기 GFK > 11st > 가전/디지털 > 세탁기/건조기 > 건조기'</li><li>'린클 음식물처리기(RC02) 색상:스페이스블랙 (#M)홈>디지털/가전>생활가전>건조기/탈수기>신발건조기 Naverstore > 가전 > 세탁기/건조기 > 신발건조기'</li></ul> |
| 200.0 | <ul><li>'키친아트 오븐 토스터기 KAO-700NK 홈>디지털/가전>주방가전>오븐>전기오븐;홈>디지털/가전>주방가전>토스터기>오븐토스터기;(#M)홈>주방가전>토스터기 Naverstore > 가전 > 주방가전 > 토스터기 > 오븐토스터기'</li><li>'정품 ㅁ 테팔 노베오 토스트기 LT-251870 (#M)11st>주방가전>토스터기>일반토스터기 11st > 가전/디지털 > 주방가전 > 토스터기 > 일반토스터기'</li><li>'테팔 토스터기 TT132DKR 토스트 자동전원차단 테팔 토스터기 TT132DKR 토스트 자동전원차단 (#M)가전·컴퓨터>주방가전>토스트·제빵·간식>홈베이킹·간식메이커 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 토스트·제빵·간식 > 홈베이킹·간식메이커'</li></ul> |
| 199.0 | <ul><li>'탄산수제조기 소다스트림 정품 탄산실린더 구매(스페어실린더) 충전 N타입 (#M)디지털/가전>주방가전>탄산수제조기 Naverstore > 가전 > 주방가전 > 음료제조기 > 탄산수제조기'</li><li>'딜라이트소다 셰프 탄산수제조기 1. 화이트 (#M)디지털/가전>주방가전>탄산수제조기 Naverstore > 가전 > 주방가전 > 음료제조기 > 탄산수제조기'</li><li>'[ 점] 딜라이트소다 바리스타 탄산수제조기 (#M)디지털/가전>주방가전>탄산수제조기 Naverstore > 가전 > 주방가전 > 음료제조기 > 탄산수제조기'</li></ul> |
| 130.0 | <ul><li>'동국제약 센텔리안24 마데카프라임 뷰티디바이스 1개 + 글루타치온 부스팅 앰플 30ml 1종 멜라캡처앰플10ml x 4개 샤 마데카프라임+콜라겐앰플+사은품 [C178] 홈 > 뷰티 > 뷰티기기/소품 > 헤어스타일러 > 고데기/매직기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 헤어스타일러 > 고데기/매직기'</li><li>'[문가영 Pick] 보다나 글램웨이브 봉고데기 프리볼트 핑크 40mm 보다나 글램웨이브 봉고데기 프리볼트 핑크 40mm 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>고데기;홈>헤어케어>헤어기기>탈모/두피기기;(#M)홈>헤어케어>헤어기기>탈모/두피기기/헤어롤 OLIVEYOUNG > 헤어케어 > 헤어기기 > 탈모/두피기기/헤어롤'</li><li>'[문가영 Pick] 보다나 트리플 플로우 물결고데기 25mm (히피펌) [문가영PICK]보다나 트리플 플로우 물결고데기 25mm (히피펌) 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>고데기;홈>헤어케어>헤어기기>탈모/두피기기;(#M)홈>헤어케어>헤어기기>탈모/두피기기/헤어롤 OLIVEYOUNG > 헤어케어 > 헤어기기 > 탈모/두피기기/헤어롤'</li></ul> |
| 39.0 | <ul><li>'곰표 한일 전기장판 거실용 전기매트 침대 EMF 탄소매트 소형 싱글 EMF탄소매트(진그레이)_싱글(중형)(105x180cm) (#M)디지털/가전>계절가전>전기매트/장판>전기장판 GFK > Naverstore > 가전 > 겨울가전 > 전기매트/장판'</li><li>'일월 텐셀 원적외선 탄소 카본매트(보관가방 포함) 모달 싱글 11st > timedeal;(#M)11st>계절가전>전기매트/장판>전기매트 11st > 가전/디지털 > 계절가전 > 전기매트/장판 > 전기매트'</li><li>'힐로빔 조인트빔 무릎 마사지기 찜질기 온열 어깨 더블팩(1+1/보조배터리 무료증정) (#M)생활/건강>안마용품>안마기 GFK > traverse > Naverstore > 건강/의료용품 > 안마용품 > 안마기'</li></ul> |
| 88.0 | <ul><li>'K9 PRO 유선형 K9PRO 유선+무선형 본품@배터리 2입증정) (#M)디지털/가전>생활가전>손소독기 Naverstore > 가전 > 욕실가전 > 손소독기'</li><li>'샤오미 미지아 센서형 자동 거품 손 세정기 리필 세정액 전용 손세정제 (3개입) 아미노산+향균(6개) (#M)홈>디지털/가전>생활가전>손소독기 Naverstore > 가전 > 욕실가전 > 손소독기'</li><li>'[청결양행] 분무형 자동 손소독기 BIO-001 기본형 (#M)디지털/가전>생활가전>손소독기 Naverstore > 가전 > 욕실가전 > 손소독기'</li></ul> |
| 161.0 | <ul><li>'두유제조기 두유기 콩국물 죽제조 600ml Amazom베스트 Mokkom 그린 (#M)디지털/가전>주방가전>두부두유제조기 Naverstore > 가전 > 주방가전 > 홍삼/영양식 > 두부,두유'</li><li>'[연속매진 사전예약] 오쿠아침앤 콩불림없는 두유제조기 BM600 목넘김이 부드러운 6중날 민트그린 (#M)디지털/가전>주방가전>두부두유제조기 Naverstore > 가전 > 주방가전 > 홍삼/영양식 > 두부,두유'</li><li>'조영 두유 제조기 콩물 만드는 기계 메이커 조영 두유 제조기(DJ12G-D545) (#M)디지털/가전>주방가전>두부두유제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식'</li></ul> |
| 75.0 | <ul><li>'대한민국 DC 12V 전원 어댑터 모니터 CCTV 공유기 전자악기 3구접지 12V0.5A 전원일체형 F(ST) KC인증 Skyplus 10) 12V3A 전원일체 F(ST) 홈>디지털/가전>모니터주변기기>모니터어댑터;(#M)홈>12V 어댑터 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 어댑터'</li><li>'LG 엘지 모니터 어댑터 DC 12V / 19V 전원 19V1.3A 대한민국 KC인증품 6) 19V2.1A 전원일체형 (#M)디지털/가전>모니터주변기기>모니터어댑터 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 어댑터'</li><li>'DC 12V 어댑터 전원 모니터 CCTV LED 12V 0.5A (500mA) 벽걸이형 12V 5A_(22) 3구 접지형 (#M)디지털/가전>모니터주변기기>모니터어댑터 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용'</li></ul> |
| 148.0 | <ul><li>'원터치형 SSD 외장케이스 / M.2 NVMe / 8TB 10Gps (#M)PC부품>PC케이스>파워포함케이스 GFK > traverse > 11st > 가전/디지털 > PC부품 > PC케이스 > 파워포함케이스'</li><li>'타무즈 GKM330 M.2 2280 SATA (512GB)/SSD/정품판매점/무상3년/ngff//R (#M)저장장치>SSD>500GB이상 GFK > traverse > 11st > 가전/디지털 > 저장장치 > SSD > 500GB이상'</li><li>'공식 판매점 WD BLACK SN850X NVMe SSD 4TB AS 5년 PS5 호환 (#M)디지털/가전>저장장치>SSD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > SSD'</li></ul> |
| 69.0 | <ul><li>'삼성전자 하만카돈 오라 스튜디오4 (Aura studio 4) '</li><li>'브리츠 BR-ST202 '</li><li>'스타벅스 서머 우드 스피커,2021 스타벅스 여름 md 2차 홈>21 MD>21 서머 2차;(#M)홈>시즌 MD Naverstore > 디지털 > 음향기기 > 스피커 > 미니/휴대용'</li></ul> |
| 119.0 | <ul><li>'휴대용 레트로 라디오 fm am 단파 라디오 어르신 효도 라디오 블루투스 1702 (#M)디지털/가전>음향가전>라디오 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 라디오/MP3'</li><li>'트로트 등산용 MP3 어르신 미니 라디오 휴대용 소형 추천템 멀티 효도 라디오 H-868 (#M)음향가전>라디오>라디오 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 라디오 > 라디오'</li><li>'수동식 크랭크 라디오 비상 랜턴 태양광 충전 다기능 비상 크랭크라디오 (#M)디지털/가전>음향가전>라디오 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 라디오/MP3'</li></ul> |
| 212.0 | <ul><li>'지클릭커 오피스프로 WMK70 사일런스L 무소음 인체공학 무선 키보드 마우스 세트 블랙 화이트 (#M)디지털/가전>주변기기>키보드/마우스세트 GFK > traverse > Naverstore > 컴퓨터 > 키보드/마우스 > 키보드 > 키보드+마우스'</li><li>'지클릭커 오피스프로 WMK70 사일런스L 무선 키보드 마우스 세트 (화이트) (#M)컴퓨터 주변기기>마우스/키보드 세트>마우스/키보드 세트 GFK > traverse > 11st > 가전/디지털 > 컴퓨터 주변기기 > 마우스/키보드 세트 > 마우스/키보드 세트'</li><li>'마이크로소프트 에고노믹 무선 블루투스 5.0 마우스 택배 병행 블랙 당일출고 (#M)디지털/가전>주변기기>마우스>무선마우스 GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 키보드/마우스 > 마우스'</li></ul> |
| 215.0 | <ul><li>'샌디스크 마이크로 SD카드 익스트림 프로 블랙박스 액션캠 닌텐도 메모리 2TB (#M)디지털/가전>카메라/캠코더용품>메모리카드>MicroSD메모리 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라'</li><li>'삼성전자 마이크로 SD카드 512GB 메모리카드 EVO PLUS 외장 스마트폰 메모리 512기가 신형EVO PLUS 512G 케이스+리더기 (#M)디지털/가전>카메라/캠코더용품>메모리카드>MicroSD메모리 GFK > short_clip > Naverstore > Short Clip > 테크 > 20241031'</li><li>'카드 DJI Care Refresh 2년판(DJI Osmo Pocket 3) (#M)SSG.COM>카메라/캠코더>디지털카메라/액션캠>액션캠 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 디지털카메라/액션캠 > 액션캠'</li></ul> |
| 56.0 | <ul><li>'[바로가기 ON 15% 중.복.쿠.폰] IPTIME BT50 블루투스 V5.0 USB 동글 화이트 (#M)컴퓨터 주변기기>블루투스동글>블루투스동글 GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 블루투스동글 > 블루투스동글'</li><li>'아이피타임 ipTiME BT50XR 블루투스 5.0 USB 동글 블랙 (#M)홈>디지털/가전>네트워크장비>블루투스동글 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 블루투스동글'</li><li>'Logitech G903 G403 G900 G703 G603 G PRO 무선 마우스 어댑터용 Usb 동글 신호 수신기 어댑터 Logitech G903 G403 G900 G703 G603 G PRO 무선 마우스 어댑터용_G603 (#M)가전·컴퓨터>PC부품·주변기기>키보드>키보드·마우스세트 Tmon > 가전·디지털 > 가전·컴퓨터 > PC부품·주변기기 > 키보드 > 키보드·마우스세트'</li></ul> |
| 168.0 | <ul><li>'[전용세제 ]DWA90C7B00CE 트리플케어 식기세척기 빌트인 (8가지색상) 07.블루라구나(블루) 홈>프리미엄관>(14인용)트리플케어 식기세척기>90C 모델;(#M)홈>식기세척기>(8인이상) 와이드형>트리플케어 Naverstore > 가전 > 주방가전 > 위생관리 > 식기세척기'</li><li>'[체감가152만원대] DWA90R6B00SL 트리플케어 식기세척기 빌트인 (8가지색상) 02.토르토라(그레이쉬 아이보리) (#M)홈>식기세척기>(8인이상) 와이드형>트리플케어 Naverstore > 가전 > 주방가전 > 위생관리 > 식기세척기'</li><li>'SK매직 DWA-7303D (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 위생관리 > 식기세척기'</li></ul> |
| 123.0 | <ul><li>'네임뮤조2 스피커 스탠드 최고급 원형 실버 거치대 받침대 네임뮤조2 실버 스탠드 (#M)디지털/가전>음향가전>스피커>스피커액세서리 GFK > traverse > Naverstore > 디지털 > 음향기기 > 스피커 > 액세서리'</li><li>'소니 무선 넥밴드 스피커 HT-AN7 BRAVIA Theatre U HT-AN7 BRAVIA Theatre U (#M)디지털/가전>음향가전>스피커>블루투스스피커 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 스피커'</li><li>'ADAM AUDIO A5X 아담 오디오 5인치 모니터 스피커 스튜디오 고음질 홈레코딩 홈>브랜드>A-B>Adam Audio;(#M)홈>브랜드>A>Adam Audio Naverstore > 디지털/가전 > 음향가전 > 스피커 > 스피커단품'</li></ul> |
| 203.0 | <ul><li>'키친아트 2구 하이브리드 인덕션+하이라이트 하이브리드 전기레인지 2050 하이브리드렌지 홈>디지털/가전>주방가전>하이브리드;홈>전체상품;홈>🧡주방가전🧡;(#M)홈>주방가전💛 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이브리드'</li><li>'SK매직 ERA-FH20D ERAFH20D00DS(인덕션1구+하이1구) (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이브리드'</li><li>'3구 플렉스 하이브리드 인덕션레인지 빌트인 (2인덕션+1하이라이트) ERAHBTS3 (#M)디지털/가전>주방가전>하이브리드 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이브리드'</li></ul> |
| 135.0 | <ul><li>'두꺼운 발톱깍기 발톱관리 깎이 손톱정리도구 정리 손톱깎이 안튀는손톱깎이 네일 휴대용손톱깎이 홈 > 뷰티 > 네일 > 네일관리기기 > 전동네일관리기 T200 > traverse > LotteOn > 뷰티 > 네일 > 네일관리기기 > 전동네일관리기'</li><li>'라운드 메이커 올인원 네일 케어 기기 Coupang > 가전디지털 > 이미용가전 > 눈썹/네일관리 > 전동네일관리기;쿠팡 홈>가전디지털>이미용가전>눈썹/네일관리>전동네일관리기;쿠팡 홈>가전디지털>뷰티/헤어가전>눈썹/네일관리>전동네일관리기;Coupang > 뷰티 > 네일 > 네일케어도구 > 파일/버퍼/스틱 > 파일/버퍼;Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 전동네일관리기;(#M)쿠팡 홈>뷰티>네일>네일케어도구>파일/버퍼/스틱>파일/버퍼 Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 전동네일관리기'</li><li>'다이아미 핀큐어 젤네일 LED 램프 혼합색상 × 1개 Coupang > 가전디지털 > 이미용가전 > 눈썹/네일관리;쿠팡 홈>가전디지털>이미용가전>눈썹/네일관리>젤네일 램프;Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 젤네일 램프;쿠팡 홈>가전디지털>뷰티/헤어가전>눈썹/네일관리>젤네일 램프;(#M)쿠팡 홈>뷰티>네일>네일아트소품/도구>네일드라이어/램프>젤네일 램프 Coupang > 가전디지털 > 뷰티/헤어가전 > 눈썹/네일관리 > 젤네일 램프'</li></ul> |
| 74.0 | <ul><li>'[포토상품평] 카멜 CA3 싱글암 패브릭 모니터거치대 모니터암 화이트 (#M)모니터>모니터 주변기기>모니터주변기기 기타 GFK > 11st > 가전/디지털 > 모니터 > 모니터 주변기기 > 모니터주변기기 기타'</li><li>'카멜 모니터암 CA2D 듀얼 모니터거치대 이지밸런스 그레이 (#M)디지털/가전>모니터주변기기>모니터암 GFK > Naverstore > 컴퓨터 > 주변기기 > 모니터용'</li><li>'[카멜인터내셔널] 클램프형 암, CMA-2P, 블랙 [32형] (#M)디지털/가전>모니터주변기기>모니터암 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 모니터암'</li></ul> |
| 84.0 | <ul><li>'보풀컷 보풀제거기 세탁소 업소용 니트 코트 옷 로즈골드 5중날 올블랙(6중날) (#M)홈>디지털/가전>생활가전>보풀제거기 Naverstore > 가전 > 생활가전 > 보풀제거기'</li><li>'필립스 GC-026 블루 (#M)디지털/가전>생활가전>보풀제거기 Naverstore > 가전 > 생활가전 > 보풀제거기'</li><li>'유닉스 정품 충전식 무선 보풀제거기 추천 휴대용 세탁소 니트 보풀 제거 UNL-9302 UNL-9302 (+사은품 마스크 1매) (#M)디지털/가전>생활가전>보풀제거기 GFK > Naverstore > 가전 > 생활가전 > 보풀제거기'</li></ul> |
| 49.0 | <ul><li>'CAT6A 랜 커플러 키스톤 잭 모듈러 랜선 STP RJ45 CAT6 1번_6A STP 랜커플러 키스톤잭_XB265 (#M)디지털/가전>네트워크장비>기타네트워크장비 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 기타'</li><li>'스위치봇 - 허브 미니 원격제어 스마트홈 허브 만능리모컨 (#M)디지털/가전>네트워크장비>기타네트워크장비 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 기타'</li><li>'[3-5일 배송] 구글 네스트 온도조절기 자동 스마트러닝 3세대 스테인리스 스틸 스테인리스 스틸 (#M)디지털/가전>네트워크장비>기타네트워크장비 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 기타'</li></ul> |
| 19.0 | <ul><li>'[들꽃잠]멀티형 배 찜질팩 생리통 복부 팥 허리 냉온 (#M)생활/건강>냉온/찜질용품>찜질팩 GFK > Naverstore > 건강/의료용품 > 냉온/찜질용품'</li><li>'볼케이노가습기 무중력 가열식 기화식 가습기 샤오미 새로운 스마트 워치 울트라 8 NFC GPS 트랙 49mm 남성 여성 Smartwatch 시리즈 8 온도계 BluetoothCal 볼케이노가습기 무중력 가열식 기화식 가습기 샤오미 새로운 스_블랙 추가 3 스트랩 (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li><li>'가습기 가열식가습기 원룸 사무실 기석사 원룸 무선 4 살균충전버전 유스파우더 (#M)홈>전체상품 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 필터/액세서리'</li></ul> |
| 122.0 | <ul><li>'브리츠 Realfit5 오픈형 블루투스 이어폰 V5.4 무선충전 초경량 귀걸이형 운동 자전거 오토바이 라이딩 아이보리 (#M)음향가전>이어폰>무선 이어폰 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 이어폰 > 무선 이어폰'</li><li>'브리츠 BT4000 ANC 노이즈캔슬링 무선 블루투스 헤드셋 헤드폰(블랙, 아이보리, 화이트) BT4000 아이보리 (#M)디지털/가전>음향가전>블루투스셋>블루투스헤드폰/헤드셋 GFK > traverse > Naverstore > 디지털 > 블루투스'</li><li>'Sony WH-1000XM5 노이즈캔슬링 블루투스 헤드폰 화이트 (#M)컴퓨터 주변기기>헤드셋>블루투스헤드셋 GFK > traverse > 11st > 가전/디지털 > 컴퓨터 주변기기 > 헤드셋 > 블루투스헤드셋'</li></ul> |
| 154.0 | <ul><li>'주방 ntec후드필터 엔텍 파세코 한샘 가스렌지 후드필터 환풍기 닥트 청소 엔텍일반형 340x230 (#M)홈>디지털/가전>주방가전>가스레인지후드 Naverstore > 가전 > 주방가전 > 위생관리 > 레인지후드'</li><li>'SK매직 프론트형 600 레인지후드 RHD304L 전동댐퍼추가가능 배송만(자가설치)_전동댐퍼추가 홈>전체상품;(#M)홈>레인지후드 Naverstore > 가전 > 주방가전 > 위생관리 > 레인지후드'</li><li>'하츠 허리케인 도어 HDH-90S 씽크대 렌지 후드 교체 후황 도어없는상품_설치미접수 (배송만) (#M)홈>레인지후드 Naverstore > 가전 > 주방가전 > 위생관리 > 레인지후드'</li></ul> |
| 115.0 | <ul><li>'윤씨네 4:3 유압식 포터블 빔스크린 PM-SV 매트원단 롤러블스크린 203cm(80), 1개 (#M)디지털/가전>영상가전>프로젝터주변기기>프로젝터스크린 GFK > traverse > Naverstore > 가전 > 영상가전 > 프로젝터 > 스크린'</li><li>'윤씨네 16:9 삼각대 족자봉 빔스크린 세트 YJH 캠핑용 휴대용 가정용 203cm(80), 1개 (#M)디지털/가전>영상가전>프로젝터주변기기>프로젝터스크린 GFK > traverse > Naverstore > 가전 > 영상가전 > 프로젝터 > 스크린'</li><li>'윤씨네 4:3 C-SV 수동 체인 빔스크린 업무용 학원용 187.5cm(60), 1개 (#M)디지털/가전>영상가전>프로젝터주변기기>프로젝터스크린 GFK > traverse > Naverstore > 가전 > 영상가전 > 프로젝터 > 스크린'</li></ul> |
| 185.0 | <ul><li>'쿠쿠 게임부록 청소기/밥솥/인덕션 BEST 모델 기획전 06. 쿠쿠 6인용 IH전기압력밥솥 CRP-DHP0610FD (#M)가전·컴퓨터>주방가전>전기밥솥>압력밥솥 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기밥솥 > 압력밥솥'</li><li>'쿠첸 brain 듀얼프레셔 IH전기압력밥솥 6인용/10인용 풀스텐 스텐내솥 04. [다운로드쿠폰] 쿠첸 brain 풀스텐 듀얼프레셔 10인용 IH전기압력밥솥 CRH-TWS1011E 베이지/스텐내솥 (#M)가전·컴퓨터>주방가전>전기밥솥>압력밥솥 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기밥솥 > 압력밥솥'</li><li>'1인용밥솥 2인용밥솥 미니전기밥솥 키친아트 자취생밥솥 KC-202MY_피치 (#M)가전·컴퓨터>주방가전>전기밥솥>일반밥솥 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기밥솥 > 일반밥솥'</li></ul> |
| 208.0 | <ul><li>'신도리코 A3흑백복합기 N621 정식판매처 [무료설치] [당일출고] 홈>전체상품;(#M)홈>A3흑백복합기 Naverstore > 컴퓨터 > 복합기/프린터 > 흑백레이저복합기'</li><li>'삼성전자 SL-C2470FR 컬러 레이저 복합기 인쇄 복사 스캔 팩스 학교 관공서 (#M)디지털/가전>주변기기>복합기>컬러레이저복합기 GFK > Naverstore > 컴퓨터 > 복합기/프린터 > 컬러레이저복합기'</li><li>'삼정 국내제조 책상 공부 독서 LED스탠드 SL-660 블랙 (#M)디지털/가전>생활가전>스탠드>LED스탠드 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 생활가전'</li></ul> |
| 80.0 | <ul><li>'대여 창문 로봇 청소기 아파트 유리창 청소 닦이 일상 렌탈 일상 창문로봇청소기_✨설 연휴 세트✨_1/23일 (목)발송 → 1/30 (목)까지 (#M)디지털/가전>청소기>창문청소기 GFK > traverse > Naverstore > 가전 > 청소기 > 로봇청소기'</li><li>'삼성 로봇 청소기 AI 비스포크 제트 봇 진공 미니 소형 원룸 자취방 펫캠 삼성 청소기 페블 그레이 (#M)디지털/가전>생활가전>청소기>로봇청소기 GFK > Naverstore > 가전 > 청소기 > 로봇청소기'</li><li>'삼성 비스포크 제트봇 VR50B9563AE 로봇청소기 AI SE 자율주행 청정스테이션 페블 그레이 (#M)디지털/가전>생활가전>청소기>로봇청소기 GFK > Naverstore > 가전 > 청소기 > 로봇청소기'</li></ul> |
| 81.0 | <ul><li>'Dreame 충전기 V11 V9 교체용 예비 부품 어댑터 유럽 플러그 진공 청소기 액세서리 01 Adapter (#M)생활가전>청소기부품>액세서리 기타 GFK > traverse > 11st > 가전/디지털 > 생활가전 > 청소기부품'</li><li>'[팅크웨어] 아이나비 차량용 무선휴대용 스마트 에어건 EPI-A218 휴대용 충전식 청소기 홈>전체상품;홈>자동차ㆍ공구ㆍ안전>차량용디지털>차량용 전자용품;(#M)홈>자동차ㆍ공구ㆍ안전>자동차 관련용품 Naverstore > 가전 > 청소기 > 차량용'</li><li>'[히트상품] [다이슨] 청소기/에어랩/고데기/공기청정기2 06. 다이슨 슬림 플러피 오리진 (#M)가전·컴퓨터>TV·냉장고·세탁기>냉장고>그외 브랜드 Tmon > 가전·디지털 > 가전·컴퓨터 > TV·냉장고·세탁기 > 냉장고 > 그외 브랜드'</li></ul> |
| 152.0 | <ul><li>'SK하이닉스 Tube T31 Stick 외장SSD 512GB [D램탑재+스틱형] (#M)디지털/가전>저장장치>외장SSD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > 외장SSD'</li><li>'아이팟 클래식 7세대(A1238) SSD 32GB A/S 180일 스페이스 그레이_SD 512gb+1950mAh대용량 배터리 (#M)디지털/가전>음향가전>MP3 GFK > traverse > Naverstore > 디지털 > 음향기기 > 라디오/MP3'</li><li>'SSD 외장케이스 USB C 타입 2.5 SATA HDD 외장SSD (#M)디지털/가전>저장장치>외장SSD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > 외장SSD'</li></ul> |
| 142.0 | <ul><li>'[단독] 스킨 라이트 테라피Ⅱ LotteOn > 뷰티 > 뷰티기기 > 피부케어기 LotteOn > 뷰티 > 뷰티기기 > 피부케어기'</li><li>'동국제약 센텔리안24 마데카프라임 피부관리기 뷰티디바이스 2개 + 멜라캡처앰플PRO 10ml x 8개 + 앰플 샤쉐 6종 2개 + 마데카프라임 2개 + 사은품 [C41] 홈 > 뷰티 > 뷰티기기/소품 > 피부케어기 > 피부케어기 LO > traverse > LotteOn > 뷰티 > 뷰티기기/소품 > 피부케어기 > 피부케어기'</li><li>'[LIVE] [연말 ] 글로우엠 부스터 소닉 (젤 세럼 ) 부스터소닉 1개 + 젤 2개 + 팩 20매 (#M)디지털/가전>이미용가전>피부케어기기 LO > live > Naverstore > Shop Live > 뷰티 > 20240813 > 19:30 ~ 21:30'</li></ul> |
| 209.0 | <ul><li>'Bambu Lab A1 mini 3D 프린터 (#M)디지털/가전>주변기기>프린터>3D프린터 GFK > traverse > Naverstore > 컴퓨터 > 복합기/프린터 > 3D프린터/3D펜 > 3D프린터'</li><li>'HP 정품 CE314A 드럼 Color LJ CP1025,M175,M176, M177 / LJ pro M275nw Imaging Unit (Imaging Drum) (#M)프린터/복합기>토너>정품 GFK > traverse > 11st > 가전/디지털 > 프린터/복합기 > 토너 > 정품'</li><li>'[호환] 필터바바 3+1 삼성 에어드레서 필터 미세먼지 교체 프리미엄 H13 3벌용 3벌용 (프리미엄 H13등급) (#M)디지털/가전>생활가전>세탁/건조기>액세서리 GFK > naver_plus_traverse > Naverstore > 가전 > 세탁기/건조기 > 드럼세탁기'</li></ul> |
| 16.0 | <ul><li>'닌텐도 정품 조이콘 (R) 스위치 컨트롤러 조이스틱 오른쪽+스트랩 포함 확인하였습니다_에어캡포장(박스없음)_3.(R)네온옐로 단품 (#M)디지털/가전>게임기/타이틀>게임기주변기기>조이스틱/컨트롤러 GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 게이밍 > 주변용품'</li><li>'닌텐도 스위치 배터리개선판 본체 네온+링피트 어드벤처 세트+OLED공용 조이콘커버악세사리 had네온+링피트+OLED공용 조이콘커버 홈>디지털/가전>게임기/타이틀>게임타이틀;(#M)홈>디지털/가전>게임기/타이틀>휴대용게임기 Naverstore > 디지털 > 게이밍 > 닌텐도 > 본체'</li><li>'젤다의 전설 티어스 오브 더 킹덤 에디션 정품 팩케이스 세트 닌텐도 스위치 OLED 본체 닌텐도스위치 OLED 젤다의 전설 에디션_+ 인기 게임패키지 (젤다의전설 왕국의눈물) 홈>「 Game 」;홈>「 예약판매/신규출시 」;(#M)홈>「 Game 」>Nintendo Naverstore > 디지털 > 게이밍 > 닌텐도 > 본체'</li></ul> |
| 77.0 | <ul><li>'웍스 무선 충전식 고압세척기 WG630E.2 브러시리스 (#M)홈>디지털/가전>생활가전>청소기>고압세척기 Naverstore > 가전 > 청소기 > 고압세척기'</li><li>'RL30고압건 고압세척기부품 스팀건 RL30 (#M)홈>고압건 숏건 건set Naverstore > 디지털/가전 > 생활가전 > 청소기 > 고압세척기'</li><li>'웍스 창문닦이 WA4050 (#M)홈>전체상품 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 고압세척기'</li></ul> |
| 160.0 | <ul><li>'쿠잉 냉동고 /쾌속형/서랍식/FR-191SS/소형/미니/164L 쿠잉 냉동고 /쾌속형/서랍식/FR-191SS/소형/미니/16 (#M)11st>냉장고>냉동고>냉동고 11st > 가전/디지털 > 냉장고 > 냉동고 > 냉동고'</li><li>'삼성전자 비스포크 RZ34C7805AP 냉동고 1도어 키친핏 오토오픈도어 좌흰지(좌개퍠)_새틴베이지 (#M)홈>전체상품 Naverstore > 가전 > 냉장고 > 냉동고'</li><li>'삼성전자 비스포크 RZ34C7805AP 냉동고 1도어 키친핏 오토오픈도어 우흰지(우개폐)_새틴세이지그린 (#M)홈>전체상품 Naverstore > 가전 > 냉장고 > 냉동고'</li></ul> |
| 110.0 | <ul><li>'(1년 구독) 파인리더 PDF 16 스탠다드 - ABBYY FineReader PDF 16 Standard (1Year) 이메일로 수령 (#M)디지털/가전>소프트웨어>사무/회계 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 사무/회계'</li><li>'마이크로소프트 오피스 M365 Personal PKC (1년 구독) 엑셀/파워포인트/아웃룩/워드/팀즈/패밀리세이프티 (#M)디지털/가전>소프트웨어>사무/회계 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li><li>'상품 재고관리 프로그램(거래처/제품별 재고관리, 매입/매출/환입/환출, 거래처원장, 재고현황 및 수익금액, 재고자산회전율/회전일수) 상품 재고관리 프로그램 (#M)디지털/가전>소프트웨어>사무/회계 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 127.0 | <ul><li>'LG전자 스탠바이미 스피커 XT7S 디지털샵 (#M)음향가전>턴테이블>턴테이블 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 턴테이블 > 턴테이블'</li><li>'인켈 (셔우드) PM-9970U 벨트드라이브 프리미엄 USB 턴테이블 블랙 24년 신형 '</li><li>'크로슬리 Voyager CR8017A '</li></ul> |
| 231.0 | <ul><li>'에어팟 4세대 케이스 홀로그램 실버 왕리본 키링 세트 (#M)디지털/가전>음향가전>이어폰/헤드폰액세서리>케이스/파우치 GFK > short_clip > Naverstore > Short Clip > 테크 > 20250116'</li><li>'[블루/실버 +상품권5만][Z플립6 512GB 체감가 119만][쿠폰15%+카드5%] 갤럭시 자급제 SM-F741N Z플립6 512GB 자급제 + 버즈3 패키지_자급제 블루 + 버즈3 화이트 [LBEKOO] (#M)휴대폰>자급제폰>삼성>5G GFK > traverse > 11st > 가전/디지털 > 휴대폰 > 자급제폰 > 삼성'</li><li>'[Z폴드6 512GB 가 2,033,000원 쿠폰10%+카드5%] 갤럭시 자급제 SM-F956N Z폴드6 512GB 자급제 + 버즈3 패키지_자급제 실버 쉐도우 + 버즈3 실버 [ZSEKOO] (#M)휴대폰>자급제폰>삼성>5G GFK > traverse > 11st > 가전/디지털 > 휴대폰 > 자급제폰 > 삼성'</li></ul> |
| 86.0 | <ul><li>'휴앤봇 3kg 소형 미니세탁기 아기옷 HS-MW3150G 헹굼 속옷 양말 1인용 원룸 (#M)디지털/가전>생활가전>세탁기>미니세탁기 Naverstore > 가전 > 세탁기/건조기 > 미니세탁기'</li><li>'휴앤봇 미니 세탁기 HS-MW25G 아기옷 속옷 수건 운동화 2.5kg 3.5kg 1) 미니세탁기 HS-MW25G(2.5kg) (#M)디지털/가전>생활가전>세탁기>미니세탁기 Naverstore > 가전 > 세탁기/건조기 > 미니세탁기'</li><li>'[호환] 대우 위니아 통돌이 세탁기 먼지 거름망 필터 03. 대우소[DZ-03] (#M)디지털/가전>생활가전>세탁기>세탁기부품 GFK > Naverstore > 가전 > 세탁기/건조기 > 액세서리 > 필터'</li></ul> |
| 205.0 | <ul><li>'[최신모델]무선도깨비방망이 노블 CHB2300 로즈펄 (#M)홈>디지털/가전>주방가전>핸드블렌더 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 핸드블렌더'</li><li>'도깨비방망이 PHB2200 (대용량 2200ml 컵 포함) 블랙 (#M)11st>주방가전>믹서기/핸드블렌더>미니믹서기 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 미니믹서기'</li><li>'신일 키친아트 핸드블랜더 다기능 모음 SMX-HB600S (#M)가전·컴퓨터>주방가전>믹서·원액·블렌더>핸드블렌더 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 믹서·원액·블렌더 > 핸드블렌더'</li></ul> |
| 184.0 | <ul><li>'[키친아트] 허브 와이드 전기그릴 KNG-P771NK (#M)가전·컴퓨터>주방가전>전기그릴·찜기>전기그릴 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기그릴·찜기'</li><li>'[세트상품] 테팔 전기그릴 컴팩트 그릴 TG300 +아이스포스 고기가위 + 인지니오 미니 스테인리스 다용도 집게 (#M)홈>주방가전>전기그릴 Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 전기그릴'</li><li>'벨로닉스 레트로 멀티쿠커 전기그릴 SHMC-020 다크그레이_그릴세트(기본구성+그릴플레이트) (#M)디지털/가전>주방가전>전기그릴 Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 전기그릴'</li></ul> |
| 85.0 | <ul><li>'[린나이]노비타 라인핏 방수 비데 BD-AFM51N (무상설치) (#M)11st>뷰티소품>피부관리기>피부관리기 11st > 뷰티 > 뷰티소품 > 피부관리기'</li><li>'이누스 방수비데 IS-520 - 360° 모든 방향 완벽 파워방수 IPX5 / 스마트 터치식 2. IS-510 온풍건조X_2. 설치후 2만원 결재 (#M)11st>생활가전>비데>전자식비데 11st > 가전/디지털 > 생활가전 > 비데 > 전자식비데'</li><li>'[롯데백화점]보보 [롯데잠실]VOVO 보보 시트비데 무선리모컨 쾌변기능 VB-6000 무상설치 (#M)11st>생활가전>비데>기계식비데 11st > 가전/디지털 > 생활가전 > 비데 > 기계식비데'</li></ul> |
| 157.0 | <ul><li>'닭탈모기 닭털뽑는기계 은행탈피기 LIM-30A(소/중/대/특대형) 기본 30개 (#M)홈>디지털/가전>주방가전>기타주방가전 Naverstore > 디지털/가전 > 주방가전 > 기타주방가전'</li><li>'테팔 비어텐더 생맥주 디스펜서 맥주기계 VB310EVB310EKR (#M)가전·컴퓨터>주방가전>전기쿠커·튀김기>기타용품 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기쿠커·튀김기 > 기타용품'</li><li>'LG전자렌지 교체용 유리회전접시 회전판 A타입 24.5cm (#M)홈>디지털/가전>주방가전>기타주방가전 Naverstore > 디지털/가전 > 주방가전 > 기타주방가전'</li></ul> |
| 189.0 | <ul><li>'유니크대성 업소용 사리 육수냉장고 냉면육수통 선택11. 스텐-2말쌍통1라인 (#M)11st>냉장고>일반형>일반형 11st > 가전/디지털 > 냉장고 > 일반형 > 일반형'</li><li>'케민 22L 미니 기숙사 이유식 1인 냉장고 듀얼 스마트 MinSellAmount (#M)주방가전>냉장고/냉동고>화장품냉장고 Gmarket > 가전 > 주방가전 > 냉장고/냉동고 > 화장품냉장고'</li><li>'Celler Cool CX2200 와인셀러 냉각 시스템 전면 전원 코드 Rear Power Cord (#M)냉장고>전용냉장고>와인냉장고 GFK > 11st > 가전/디지털 > 냉장고 > 전용냉장고'</li></ul> |
| 108.0 | <ul><li>'Arobas Music Guitar Pro 8 아로바스 뮤직 기타프로 8 타브 악보 제작 Guitar Pro 8 (#M)디지털/가전>소프트웨어>그래픽/멀티미디어 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 그래픽/멀티미디어'</li><li>'다빈치 리졸브 스튜디오 다빈치 리졸브 스튜디오 (#M)디지털/가전>소프트웨어>그래픽/멀티미디어 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 그래픽/멀티미디어'</li><li>'어도비 마스터컬렉션 CC [포토샵 일러스트레이터 프리미어프로 에프터이펙트 라이트룸 인디자인 아크로벳 미디어인코더 등 포함 1년 플랜] (#M)디지털/가전>소프트웨어>그래픽/멀티미디어 GFK > traverse > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 174.0 | <ul><li>'수저 살균 소독기 식기살균건조기 수저통 식당 업소용 대신 열소독 건식 살균기 6구 '</li><li>'하임셰프 업소용 열풍 식기살균 자외선 건조기 '</li><li>'한일 식기건조기 UV 살균 2단 그릇 건조대 대형 살균기 주방 컵 정리대 식기 건조기 '</li></ul> |
| 227.0 | <ul><li>'와콤 신티크16 DTK-1660 액정타블렛 공식판매점 홍대입구점 / 필수악세서리 이벤트 / 필름부착서비스 신티크16+AG필름부착발송 홈>Wacom>신티크;홈>전체상품;(#M)홈>와콤>신티크 Naverstore > 컴퓨터 > 키보드/마우스 > 타블렛 > 본체'</li><li>'삼성전자 갤럭시탭 S9 플러스 256GB 슈퍼아몰레드2X 방수/방진 256G x Wi-Fi_그라파이트 SM-X810NZAAKOO_단품+힐링쉴드필름+65W충전기 (#M)디지털/가전>태블릿PC Naverstore > 컴퓨터 > 노트북 > 태블릿PC'</li><li>'[신제품 이벤트] 와콤 신티크프로 27 터치 DTH-271 액정타블렛 신티크프로27+와콤스탠드 세트 (#M)11st>컴퓨터주변기기>태블릿/디지털 펜>태블릿/디지털 펜 11st > 가전/디지털 > 컴퓨터 주변기기 > 태블릿/디지털 펜 > 태블릿/디지털 펜'</li></ul> |
| 182.0 | <ul><li>'린나이 포터블 인덕션 1구렌지 RPI-Y10 (#M)홈>디지털/가전>주방가전>인덕션 Naverstore > 가전 > 주방가전 > 전기레인지 > 인덕션'</li><li>'[택배/전문기사방문, ]린나이 미드나잇컬러인덕션 3구 전기레인지RBI-G3000N 전문기사설치 (#M)주방가전>전기레인지>인덕션>빌트인 GFK > 11st > 가전/디지털 > 주방가전 > 전기레인지 > 인덕션'</li><li>'냄비2종 전국무료설치 3구 올파워 화이트 인덕션 전기레인지 IHRB32A3 화이트_무료설치_배송후 SK설치기사방문 홈>전체상품;(#M)홈>전기레인지>인덕션 Naverstore > 가전 > 주방가전 > 전기레인지 > 인덕션'</li></ul> |
| 204.0 | <ul><li>'핫플레이트 인덕션 버너 가열판 열전도판 전달 열플레이트 L 홈>전체상품;(#M)홈>디지털/가전>주방가전>핫플레이트 Naverstore > 가전 > 주방가전 > 전기레인지 > 핫플레이트'</li><li>'키친아트 KG-02TH 1구 세라믹 핫플레이트 /HB (#M)디지털/가전>주방가전>핫플레이트 GFK > Naverstore > 가전 > 주방가전 > 전기레인지 > 핫플레이트'</li><li>'키친아트 세라믹 핫플레이트 1구 전기레인지 KG-02TH 미니 전기곤로 온도조절 전기버너 (#M)디지털/가전>주방가전>핫플레이트 GFK > Naverstore > 가전 > 주방가전 > 전기레인지 > 핫플레이트'</li></ul> |
| 5.0 | <ul><li>'다크플래쉬 DK110 컴퓨터케이스 PC케이스 (#M)디지털/가전>PC부품>PC케이스 GFK > Naverstore > 컴퓨터 > 부품 > 케이스/파워'</li><li>'앱코 NCORE G30 트루포스 미들타워 PC케이스 (블랙) (#M)PC부품>PC케이스>미들케이스 GFK > 11st > 가전/디지털 > PC부품 > PC케이스'</li><li>'마이크로닉스 EM2 STEREO 미들 타워 PC 케이스 블랙 (#M)디지털/가전>PC부품>PC케이스 Naverstore > 컴퓨터 > 부품 > 케이스/파워'</li></ul> |
| 40.0 | <ul><li>'한일 캠핑 전기요 프리볼트 장판 싱글 1인용 전기장판 전기매트 2인용 도형 랜덤 디자인 랜덤_소 (#M)디지털/가전>계절가전>전기요/담요/방석>전기요 Naverstore > 가전 > 계절가전 > 전기요/담요/방석 > 전기요'</li><li>'[미니 출시] 보국 에어셀 인체감지 전기요 카모플라쥬 BKB-9511S 2) 싱글 BKB-9511S (#M)디지털/가전>계절가전>전기요/담요/방석>전기요 Naverstore > 가전 > 계절가전 > 전기요/담요/방석 > 전기요'</li><li>'2023년형 일월 전기방석 온열방석 쇼파용 1인 2인 3인 전기매트 장판 일월 50W 미니싱글 (장판소재/무늬랜덤) 홈>디지털/가전>계절가전>전기장판/담요/방석>전기방석;(#M)홈>디지털/가전>계절가전>전기요/담요/방석>전기방석 Naverstore > 가전 > 계절가전 > 전기요/담요/방석 > 전기방석'</li></ul> |
| 133.0 | <ul><li>'LG프라엘 메디헤어 HGN2V LG전자 탈모치료기 의료기기 LG프라엘 메디헤어 (P700) (#M)생활/건강/취미>건강/안마용품>의료/구강용품>기타 관리용품 CJmall > 뷰티 > 헤어/바디/미용기기 > 피부/바디기기 > 피부 마사지기'</li><li>'신광 실리콘 전동두피 머리마사지 마사지 실리콘마사지 실리콘케어 전동마사지 전동두피마사지 두피케어 (#M)이미용가전>기타 미용가전>전동두피마사지기 GFK > traverse > 11st > 가전/디지털 > 이미용가전 > 기타 미용가전 > 전동두피마사지기'</li><li>'[LG전자] 프라엘 메디헤어 탈모 케어기기 HGN1 (#M)11st>헤어케어>샴푸>한방 11st > 뷰티 > 헤어케어 > 샴푸 > 한방'</li></ul> |
| 213.0 | <ul><li>'인스탁스 스퀘어필름 20매(10매X2) (영등포점) (#M)디지털/가전>주변기기>프린터>포토프린터 GFK > traverse > Naverstore > 디지털 > 카메라 > 즉석카메라/용품 > 필름'</li><li>'폴라로이드 즉석 카메라 사진기 후지필름 인스탁스 스퀘어 필름 화이트 엣지 인화지 SQ10 SQ40 SQ20 공유 S 20 Sheets (#M)카메라/주변기기>즉석카메라>일회용카메라 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 즉석카메라 > 일회용카메라'</li><li>'전동 손톱깍이 자동 휴대용 네일케어 손톱정리 국내발송 전동손톱깍이(CD-300) (#M)디지털/가전>이미용가전>손발톱정리기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 손발케어'</li></ul> |
| 219.0 | <ul><li>'소니 사이버샷 DSC-RX100 '</li><li>'리코 GR3X HDF (#M)디지털/가전>카메라/캠코더용품>일반디카 GFK > traverse > Naverstore > 디지털 > 1인방송/촬영 > 카메라 > 일반디카'</li><li>'리코 PENTAX WG-1000 아웃도어 방수카메라 올리브_S0002167 (#M)디지털/가전>카메라/캠코더용품>일반디카 GFK > traverse > Naverstore > 디지털 > 1인방송/촬영 > 카메라 > 일반디카'</li></ul> |
| 120.0 | <ul><li>'FiiO BTR17 디코더 앰프 블루투스 오디오 리시버 스마트폰용 DAC 헤드폰 앰프 블랙 (#M)디지털/가전>음향가전>리시버/앰프 GFK > traverse > Naverstore > 디지털 > 음향기기 > 리시버/앰프'</li><li>'[런칭할인] Bluesound 블루사운드 NODE NANO 네트워크 플레이어 (#M)디지털/가전>음향가전>리시버/앰프 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 리시버/앰프'</li><li>'MARANTZ(마란츠) M-CR612 네트워크 올인원 인티앰프 (#M)디지털/가전>음향가전>리시버/앰프 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 리시버/앰프'</li></ul> |
| 192.0 | <ul><li>'위즈웰 가정용 제빵기 식빵 기계 대용량 예약기능 발효기 반죽 도우 WSB8000 WSB8000 + Npay 20000 적립 (#M)디지털/가전>주방가전>제빵기 GFK > Naverstore > 가전 > 주방가전 > 오븐/제빵'</li><li>'매직쉐프 스타일리쉬 홈베이킹 제빵기 MEBM-X900 제빵기화이트 (#M)디지털/가전>주방가전>제빵기 Naverstore > 가전 > 주방가전 > 오븐/제빵 > 제빵기'</li><li>'JCP 브레드가든 BM2401 (#M)디지털/가전>주방가전>제빵기 Naverstore > 가전 > 주방가전 > 오븐/제빵 > 제빵기'</li></ul> |
| 162.0 | <ul><li>'테팔 믹서기 초고속 블렌더 퍼펙트믹스 플러스 트라이탄 BL82AD (#M)11st>주방가전>믹서기/핸드블렌더>일반믹서기 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 일반믹서기'</li><li>'해피콜 초고속 블렌더 믹서기 브리즈탭 해피콜 블렌더 브리즈탭(차콜그레이) (#M)디지털/가전>주방가전>믹서기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 믹서기'</li><li>'[공식] 테팔 초고속블렌더 퍼펙트믹스 플러스 트라이탄 BL82AD (#M)11st>주방가전>믹서기/핸드블렌더>초고속믹서기 11st > 가전/디지털 > 주방가전 > 믹서기/핸드블렌더 > 초고속믹서기'</li></ul> |
| 195.0 | <ul><li>'가정용 진공포장기 12 대형롤28cmX3M 3개 (#M)디지털/가전>주방가전>진공포장기 GFK > Naverstore > 가전 > 주방가전 > 위생관리 > 진공포장기'</li><li>'미소랩 가정용 자동 무선 진공포장기 진공탭 ML-210 진공포장기 1개 (#M)디지털/가전>주방가전>진공포장기 GFK > Naverstore > 가전 > 주방가전 > 위생관리 > 진공포장기'</li><li>'키친아트 진공포장기 KJP-3800WS 밀봉가능 비닐팩포함 (#M)11st>주방가전>기타 주방가전>주방가전 기타 11st > 가전/디지털 > 주방가전 > 기타 주방가전 > 주방가전 기타'</li></ul> |
| 178.0 | <ul><li>'테팔 에퀴녹스 9L 전기 오븐 그릴 토스터기 (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 홈베이킹/토스터기'</li><li>"[25'설선물대첩] 발뮤다 더 레인지 다크그레이 K09B 다크그레이_레이에 서버 집게 (#M)디지털/가전>주방가전>오븐>복합형오븐 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 주방가전 > 오븐/제빵"</li><li>'위즈웰 디지털 컨벡션 오븐 전기 제과 제빵 빵 만들기 홈베이킹 가정용 GL-42A/B 디지털오븐(GL-42A/B)+15000 N적립 (#M)디지털/가전>주방가전>오븐>전기오븐 GFK > traverse > Naverstore > 가전 > 주방가전 > 오븐/제빵 > 전기오븐'</li></ul> |
| 30.0 | <ul><li>'캐리어 50평,80평 업소용 대형 냉난방기 실외기 포함 '</li><li>'캐리어 냉난방기 40평형 인버터 스탠드 냉온풍기 실외기포함 DMQE401LAWWSX '</li><li>'앞치마소독기 열풍건조 위생복살균기 앞치마15장 업소용 MVHAA815 (#M)주방가전>식기세척/건조기>칼도마살균건조기 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 식기세척/건조기 > 칼도마살균건조기'</li></ul> |
| 139.0 | <ul><li>'[이오시카] 뷰티유튜버 PICK IPL 제모의료기기 SIPL-2000 PLUS(100만회)+시카젤+선글라스 (#M)디지털/가전>이미용가전>제모기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 제모기'</li><li>'쉬크 인튜이션 미니언즈에디션 버라이어티 기획 2종 택 1 (기+날4입) 핑크(쉐어버터) (#M)홈>바디케어>제모용품>면도기/제모의료기기 OLIVEYOUNG > 바디케어 > 제모용품'</li><li>'필립스 모근제거기 BRE255/매끈한 피부 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li></ul> |
| 35.0 | <ul><li>'린나이 전기온수기 15리터 저장식 교체 까페 대용량 직접설치 직접설치(택배발송)_15리터(벽걸이형) (#M)디지털/가전>계절가전>온수기>전기온수기 GFK > Naverstore > 가전 > 계절가전 > 온수기 > 전기식'</li><li>'경동나비엔 30리터 전기온수기 EW-30RN-U [NEW] ESW350-30U(상향식) (#M)디지털/가전>계절가전>온수기>전기온수기 GFK > Naverstore > 가전 > 계절가전 > 온수기 > 전기식'</li><li>'린나이 전기온수기 15리터 저장식 교체 까페 대용량 직접설치 직접설치(택배발송)_15리터(바닥형) (#M)디지털/가전>계절가전>온수기>전기온수기 GFK > Naverstore > 가전 > 계절가전 > 온수기 > 전기식'</li></ul> |
| 38.0 | <ul><li>'순수편백나무 격자무늬 자연기화식 가습기 증발식 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기'</li><li>'순수편백나무 자연기화식 바스켓가습기 소 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 가습/청정가전 > 가습기'</li><li>'자연기화가습기 사무실 공부 수험생 건조 디퓨저 무드 하얀 풍경 에센셜 오일 3병 700ml (#M)11st>계절가전>가습기>복합식가습기 11st > 가전/디지털 > 계절가전 > 가습기 > 복합식가습기'</li></ul> |
| 94.0 | <ul><li>'비솝연수기 구 에코렉스연수기 살균 염소제거 '</li><li>'현대 연수기 렌탈 업소용 식당 가정용 잔류염소케어 약정4년 HQ-S2010 '</li><li>'연수기 듀벨 F15 간편 본품 녹물제거필터 모둠 리필필터 리필필터_F15_고급형_3개 홈>전체상품;(#M)홈>듀벨 연수기>수도애>본품 Naverstore > 가전 > 욕실가전 > 연수기'</li></ul> |
| 83.0 | <ul><li>'엑타코 스프레이 무선물걸레청소기 E7 (건조대 + 극세사 총 6장 + 일회용청소포 20매 + 인스톨패드 2장 / 포토 상품평 이벤트) S85_엑타코 E7 (스타터 세트/배터리1개) (#M)홈>디지털/가전>생활가전>청소기>물걸레청소기 Naverstore > 가전 > 청소기 > 물걸레청소기'</li><li>'[10분어택] 세비즈 원터치 물분사 LED 트리플 고주파 회전 무선 물걸레청소기 MOP1 (#M)가전·컴퓨터>생활가전>청소기>물걸레청소기 Tmon > 가전·디지털 > 가전·컴퓨터 > 생활가전 > 청소기 > 물걸레청소기'</li><li>'코맘스 소형 물걸레청소기 PC9005G 1. 그레이 (PC9005G) (#M)홈>생활가전>청소기 Naverstore > 가전 > 청소기 > 물걸레청소기'</li></ul> |
| 150.0 | <ul><li>'[기타]Seagate 외장하드 Backup Plus Portable 4TB '</li><li>'[기타]외장 하드 케이스 하드디스크 케이스 C타입 USB3.0 '</li><li>'[기타]3.5형 SATA HDD 외장하드 케이스 보관함 데이터 백업 '</li></ul> |
| 140.0 | <ul><li>'LG전자 프라엘 워시팝 초음파 진동클렌저 코코넛 화이트_BCP2 (#M)홈>화장품/미용>뷰티소품>메이크업브러시>브러시세트 Naverstore > 화장품/미용 > 뷰티소품 > 메이크업브러시 > 브러시세트'</li><li>'슬룸 허리편한케어 허리마사지기 마사지베개 스트레칭 온열 진동 안마기 1개 [48% 할인] 허리편한케어 + 크림 (#M)생활/건강>안마용품>안마기 GFK > Naverstore > 건강/의료용품 > 안마용품 > 쿠션안마기'</li><li>'엘지 프라엘 바디스파 SSP1 (#M)홈>화장품/미용>바디케어>바디케어세트 Naverstore > 화장품/미용 > 바디케어 > 바디케어세트'</li></ul> |
| 47.0 | <ul><li>'HDMI+USB 통합 KVM 케이블 (1.5M, 2M, 3M, 5M) '</li><li>'시스라인 CBD-600H 6m, 1개 '</li><li>'강원전자 넷메이트 KVM USB Stereo 케이블 '</li></ul> |
| 159.0 | <ul><li>'[위니아]클라쎄 컨버터블 김치냉장고 120리터 KAE112SSM4MSV(AK) (#M)냉장고>김치 냉장고>뚜껑형 GFK > traverse > 11st > 가전/디지털 > 냉장고 > 김치 냉장고 > 뚜껑형'</li><li>'비스포크 키친핏 김치냉장고 3도어 RQ33C74B1W6 (313L, 새틴 화이트, 1등급) (#M)냉장고>김치 냉장고>스탠드형>3도어 GFK > 11st > 가전/디지털 > 냉장고 > 김치 냉장고 > 스탠드형'</li><li>'삼성전자 RQ33C74C3AP 비스포크 김치플러스 키친핏 새틴 베이지+그레이 3도어 냉장고 국민전자 (#M)냉장고>김치 냉장고>스탠드형>3도어 GFK > traverse > 11st > 가전/디지털 > 냉장고 > 김치 냉장고'</li></ul> |
| 93.0 | <ul><li>'삼성전자 15L 대형 대용량 업소용 공업용 산업용 영업용 유선 청소기 강력한 흡입력 홈>생활 가전>청소기;(#M)홈>전체상품 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 업소용청소기'</li><li>'백마스터 연동 청소기 VQ1530SFDC VQ1220PF 프레레 집진기 EVC-20P 이엑스파워 선택2. 연동형 20L VQ1220PFC (#M)홈>청소기>유선청소기 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 업소용청소기'</li><li>'디월트 청소기 건습식 송풍기능 23L,45L,61L 모음 DXV23P,45P,61P 호스 선택02. DXV45P(45L) (#M)홈>전동공구>디월트 Naverstore > 디지털/가전 > 생활가전 > 청소기 > 업소용청소기'</li></ul> |
| 201.0 | <ul><li>'키친아트 허브 올인원 전기튀김기 3리터 KF-P4144NK (#M)가전·컴퓨터>주방가전>기타 주방가전>정수기 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 기타 주방가전 > 정수기'</li><li>'테팔 튀김기 컴팩트 프로 전기튀김기 FR3220 FR3220KR (#M)11st>주방가전>업소용 주방가전>튀김기 11st > 가전/디지털 > 주방가전 > 업소용 주방가전 > 튀김기'</li><li>'키친아트/라팔/프리미엄/분리형/바스켓/전기 튀김기 KA-P730 (#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 에어프라이어/전기오븐/찜기 > 전기 튀김기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 에어프라이어/전기오븐/찜기 > 전기 튀김기'</li></ul> |
| 70.0 | <ul><li>'(현대Hmall)LG 27UL550 UHD HDR 피벗 높이조절 27인치 화이트 모니터 (#M)위메프 > 가전·디지털·컴퓨터 > 모니터/프린터 > 모니터 > 일반 모니터 위메프 > 가전·디지털·컴퓨터 > 모니터/프린터 > 모니터 > 일반 모니터'</li><li>'LG전자 그램 뷰 View+ 16MQ70 포터블 모니터 새제품 진열제품(C급 액정기스 일부) (#M)11st>모니터>일반 모니터>58cm이하(~23인치) 11st > 가전/디지털 > 모니터 > 일반 모니터 > 58cm이하(~23인치)'</li><li>'알파스캔 에이건 AGON 323QCX2 QHD 155 프리싱크 HDR 게이밍 모니터 (#M)11st>모니터>게이밍 모니터>144Hz 이상 11st > 가전/디지털 > 모니터 > 게이밍 모니터 > 144Hz 이상'</li></ul> |
| 20.0 | <ul><li>'위닉스 H13등급 필터 제로/2.0/S/플러스/WACU300/WACU150 모음전 호환용필터 선택05 - 타워Q_프리미엄형 쇼킹딜 홈>가전>계절가전>가습/제습/청정기;(#M)11st>계절가전>공기청정기>필터/액세서리 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리'</li><li>'정품 위닉스공기청정기필터 타워Q CAF-D0S5 D필터 (#M)11st>생활가전>청소기부품>액세서리 기타 11st > 가전/디지털 > 생활가전 > 청소기부품 > 액세서리 기타'</li><li>'[행사] 위닉스 공기청정기 필터 교환 세트 전기종 호환 1. 위닉스 타워Q 호환 (CAF-D0S5)_헤파플러스 (헤파단일) 쇼킹딜 홈>가전>계절가전>가습/제습/청정기;(#M)11st>계절가전>공기청정기>필터/액세서리 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터/액세서리'</li></ul> |
| 177.0 | <ul><li>'풀무원 글라스쿡 글라스 유리바스켓 에어프라이어 3리터 (#M)디지털/가전>주방가전>에어프라이어 Naverstore > 가전 > 주방가전 > 에어프라이어 > 바스켓형'</li><li>'테팔 3.5L 에어프라이어 이지프라이 에센셜 EY-1308KR (#M)가전·컴퓨터>TV·냉장고·세탁기>세탁기·건조기>그외 브랜드 Tmon > 가전·디지털 > 가전·컴퓨터 > TV·냉장고·세탁기 > 세탁기·건조기 > 그외 브랜드'</li><li>'쿠쿠전자 쿠쿠 CAF-G0610TB (#M)디지털/가전>주방가전>에어프라이어 Naverstore > 가전 > 주방가전 > 에어프라이어 > 바스켓형'</li></ul> |
| 188.0 | <ul><li>'키친아트 제로 304 무선 전기주전자 1.2리터 (#M)홈>디지털/가전>주방가전>전기포트>무선포트 Naverstore > 가전 > 주방가전 > 전기포트 > 분유포트'</li><li>'키친아트 무선 유리 스텐 전기 커피 주전자 포트 모음 급속가열 360도 회전받침대 SEP-C1700KP (#M)11st>주방가전>전기포트>무선포트/주전자 11st > 가전/디지털 > 주방가전 > 전기포트 > 무선포트/주전자'</li><li>'신일 무선 티포트 전기주전자 45. 키친아트 KK-551MH (#M)가전·컴퓨터>주방가전>전기주전자>무선포트 Tmon > 가전·디지털 > 가전·컴퓨터 > 주방가전 > 전기주전자 > 무선포트'</li></ul> |
| 57.0 | <ul><li>'(EFM) IPTIME POE4002 4포트 기가비트 스위칭허브 +1 UP링크 (SFP COMBO 포트) (#M)디지털/가전>네트워크장비>스위칭허브 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 스위칭허브'</li><li>'IPTIME H6008-IGMP 스위칭 허브 스위치 8포트 (#M)홈>허브(HUB)>스위칭 허브>기가 스위칭 허브 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 스위칭허브'</li><li>'EFM네트웍스 아이피타임 H6008 8포트 기가비트 스위칭허브 홈>스위칭 허브;(#M)홈>스위칭 허브>1GHz 스위칭허브 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 스위칭허브'</li></ul> |
| 190.0 | <ul><li>'SK매진 전자식 전자레인 20L MWO-20EC2 (#M)11st>주방가전>전자레인지>전자레인지 11st > 가전/디지털 > 주방가전 > 전자레인지 > 전자레인지'</li><li>'LG전자 MW23BD (#M)디지털/가전>주방가전>전자레인지 Naverstore > 가전 > 주방가전 > 전자레인지'</li><li>'SK매직 MWO-M8A02 (#M)11st>주방가전>전자레인지>전자레인지 11st > 가전/디지털 > 주방가전 > 전자레인지 > 전자레인지'</li></ul> |
| 45.0 | <ul><li>'[악세사리]스킨세이버R2 홈>악세사리, 소모품;홈>디지털/가전>계절가전>히터>연탄/화목난로;홈>마이스토브;홈>캠핑화목난로>마이스토브;(#M)홈>악세사리, 소모품>설치 악세사리 Naverstore > 가전 > 계절가전 > 난방가전 > 연탄/화목난로'</li><li>'[국내생산] 포시즌 전기발난로 발찜질기 발온열기 풋워머 발히터 보온 실내화 슬리퍼 사무실 옵6) 땡땡이_멀티B형 (#M)가전·컴퓨터>계절가전>전기히터>전기히터 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 전기히터'</li><li>'21센추리 사무실 전기 발난로 히팅패드 파티션 히터 10cm 더 넓게 195W 21센추리 파티션히터+담요(색상랜덤)+보관가방 (#M)디지털/가전>계절가전>히터>전기히터 GFK > Naverstore > 가전 > 계절가전 > 난방가전 > 전기히터'</li></ul> |
| 9.0 | <ul><li>'장우컴퍼니 JW-HTKM01 메모리 방열판 (블랙) (#M)디지털/가전>PC부품>쿨러>방열판 GFK > Naverstore > 컴퓨터 > 부품 > 쿨러 > 방열판'</li><li>'JONSBO M.2 방열판 NVME PS5 SSD 방열판 M2-3 (그레이,레드,블랙) 존스보 M2-3_(블랙) (#M)11st>PC부품>쿨러>기타 11st > 가전/디지털 > PC부품 > 쿨러 > 기타'</li><li>'PC 컴퓨터 케이스 120MM RGB LED 쿨러 파워 전원 인텔 타워형 CPU쿨러 교환 튜닝 냉각 쿨링팬 (#M)11st>PC부품>쿨러>케이스용 11st > 가전/디지털 > PC부품 > 쿨러 > 케이스용'</li></ul> |
| 105.0 | <ul><li>'모스큐 가정용 모기퇴치기 벌레 날파리 포충기 무선 포충등 한정수량 55%이벤트 모기퇴치기 (#M)홈>디지털/가전>생활가전>해충퇴치기 Naverstore > 가전 > 계절가전 > 해충퇴치기'</li><li>'Thermacell 써마셀 백패커 모기퇴치기 훈증기 향매트 2세대 모기퇴치기2.0+파우치+4시간용 리필매트 4개 홈>전체상품;홈>생활/건강>생활용품>해충퇴치용품>리퀴드;(#M)홈>디지털/가전>생활가전>해충퇴치기 Naverstore > 가전 > 계절가전 > 해충퇴치기'</li><li>'[끈끈이13장+8종+2개이상구입시 개당5천] 스카이에프 모기 파리 해충퇴치기 포충기 스카이에프플러스(끈끈이13장+8종+복수할인) (#M)디지털/가전>생활가전>해충퇴치기 Naverstore > 가전 > 계절가전 > 해충퇴치기'</li></ul> |
| 54.0 | <ul><li>'HDMI 리피터 EXTENDER 랜선 UTP 연장기 150M 송수신기세트 '</li><li>'HDMI 리피터 UTP 거리연장기 익스텐더 송수신기 세트 150M '</li><li>'넥시 HDMI 무선 송수신기 30M NX-WHR30 NX1076 '</li></ul> |
| 10.0 | <ul><li>'스위치 접착식 하부 흡음재(120pcs) (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > Naverstore > 컴퓨터 > 부품 > 튜닝용품'</li><li>'SW 빈티지 기계식 키보드 스위치 (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > Naverstore > 컴퓨터 > 부품 > 튜닝용품'</li><li>'스테빌 철심 패드 (#M)디지털/가전>PC부품>튜닝용품>기타튜닝용품 GFK > Naverstore > 컴퓨터 > 부품 > 튜닝용품'</li></ul> |
| 52.0 | <ul><li>'LDW931 LTE 라우터 와이파이 동글 유심 카파이 5채널 제품 (#M)디지털/가전>네트워크장비>라우터 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 라우터'</li><li>'갤럭시 5G 라우터 모바일 포켓 와이파이 심프리 SCR01 화이트 (#M)디지털/가전>네트워크장비>라우터 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 라우터'</li><li>'갤럭시 5G 모바일 라우터 화이트 SCR01 Galaxy 5G 와이파이 SIM 프리 (#M)디지털/가전>네트워크장비>라우터 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li></ul> |
| 172.0 | <ul><li>'라셀르 업소용냉장고 45박스 간냉식 올냉장 LS-1025R (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 양문형 냉장고'</li><li>'뷔페 셀프바 반찬 냉장고 샐러드 김밥 보관통 업소용 D1 (뚜껑 포함) (#M)11st>냉장고>4도어 냉장고>4도어 냉장고 11st > 가전/디지털 > 냉장고 > 4도어 냉장고 > 4도어 냉장고'</li><li>'유니크대성 냉장냉동고 테이블냉장고 업소용작업대 냉장-선택19 메탈1500-아날로그 (#M)위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고 위메프 > 가전·디지털·컴퓨터 > 대형가전 > 냉장고 > 일반 냉장고'</li></ul> |
| 42.0 | <ul><li>'무중력가습기 무선 가습기 디퓨저, 1000ml, 아로마 테라피, 4000mAh 배터리, 충전식 에센셜 오일 무중력가습기 무선 가습기 디퓨저, 1000ml, 아로마 테라피, 4000mAh 배터리, 충전식 에센셜 오일_04 FF (#M)가전·컴퓨터>계절가전>가습기 액세서리 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기 액세서리'</li><li>'거치대 차량용 송풍구 초음파 충전식 무선 가습기 통풍구 NEO2M 958차량용가습기 (#M)홈>생활/건강>자동차용품>편의용품>차량용가습기 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 차량용 가습기'</li><li>'가습기 불멍 물멍 대용량 미니 청소쉬운 차량용 컬러풀 D 초음파 에센셜 오일 아로마 디퓨저 3L, 더블 가습기 불멍 물멍 대용량 미니 청소쉬운 차량용 컬러풀 D 초음파 에센셜 오일 아로마 디퓨저 3L, 더블_01 WHITE (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li></ul> |
| 113.0 | <ul><li>'삼성전자 Crystal UHD KU55UD7000FXKR 스탠드형 R3 (#M)TV>138~175cm (55~69인치)>138~175cm (55~69인치) GFK > traverse > 11st > 가전/디지털 > TV > 138~175cm (55~69인치) > 138~175cm (55~69인치)'</li><li>'삼성전자 2024 QLED 4K KQ65QD83AFXKR 스탠드형 (사운드바포함) (#M)디지털/가전>영상가전>TV>QLEDTV GFK > naver_plus_traverse > Naverstore > 가전 > TV > QLEDTV'</li><li>'2022년형 신제품 더함 50인치 퀀텀닷 안드로이드 OS11 스마트TV UA501QLED 기본스탠드(TV다리) 기사방문설치_UA501QLED 홈>[NEW]우버 AMG 안드로이드TV;홈>[NEW]안드로이드 스마트 TV;(#M)홈>인치별>50인치TV Naverstore > 가전 > TV > QLEDTV'</li></ul> |
| 137.0 | <ul><li>'하이맥스 CL-9700K 바리깡 / 클리퍼 / 전문가용 이발기 / 신형 (#M)디지털/가전>이미용가전>이발기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 이발기'</li><li>'하이맥스 CL-300 장미 토끼 바리깡 미용실 전문가용 남자 이발기 히다치 가정용 CL-300 화이트 (#M)홈>디지털/가전>이미용가전>이발기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 이발기'</li><li>'아지아 전문가용 미용실 바리깡 스마트오토 JP-700 홈>전문가용이발기;(#M)홈>전문가용 이발기 Naverstore > 가전 > 이미용가전 > 면도기/이발기 > 이발기'</li></ul> |
| 221.0 | <ul><li>'고프로 히어로 배터리 13 12 11 10 9 8 7 6 5 4 고프로13 전용 엔듀로배터리 정품 (#M)디지털/가전>카메라/캠코더용품>충전기/배터리>전용정품배터리 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 충전기/배터리'</li><li>'큐라덴 큐라프록스 하이드로소닉 Easy 3단 음파전동칫솔 (핸들 1+리필모 1+충전기+케이스) (#M)디지털/가전>생활가전>구강청정기>전동칫솔 GFK > live > Naverstore > Shop Live > 테크 > 20250121 > 19:30 ~ 21:30'</li><li>'카메라 DSC-W300 충전기 NP BG1 배터리 1800mAh 04 2batterycharger_01 CHINA (#M)카메라/주변기기>배터리/충전기>전용배터리 GFK > traverse > 11st > 가전/디지털 > 카메라/주변기기 > 배터리/충전기 > 전용배터리'</li></ul> |
| 32.0 | <ul><li>'21센추리 업소용 에어커튼 EKOVIM-G1-09 날벌레차단 출입문 먼지차단 자가설치가능 CYA-A090 출입문용 (#M)디지털/가전>계절가전>에어커튼 GFK > Naverstore > 가전 > 계절가전 > 에어커튼'</li><li>'21센추리 업소용 에어커튼 EKOVIM-G1-09 날벌레차단 출입문 먼지차단 자가설치가능 EKOVIM-G1-09 일반용 (#M)디지털/가전>계절가전>에어커튼 GFK > Naverstore > 가전 > 계절가전 > 에어커튼'</li><li>'신일 에어커튼 업소용 산업용 날벌레차단 냉기차단 현관 출입문 900mm 원모터 900mm(원모터) (#M)디지털/가전>계절가전>에어커튼 GFK > Naverstore > 가전 > 계절가전 > 에어커튼'</li></ul> |
| 33.0 | <ul><li>'AK몰_21센추리 창문형에어컨 CINT-8100R 초절전인버터 (#M)위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 에어컨 > 벽걸이 에어컨 위메프 > 가전·디지털·컴퓨터 > 계절가전/에어컨 > 에어컨 > 벽걸이 에어컨'</li><li>'삼성전자 삼성 Q9000 AF17B6474GZRS 멀티형에어컨 전국 기본설치비포함 1.일반배관 (#M)디지털/가전>계절가전>에어컨>멀티형에어컨 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 멀티형'</li><li>'삼성 비스포크 창문형에어컨 윈도우핏 AW05B5171BWA 17㎡ 새틴 블루 창문매립형 본사설치[X] 11st > 가전/디지털 > 계절가전 > 에어컨 > 창문형;(#M)11st>계절가전>에어컨>창문형 11st > 가전/디지털 > 계절가전 > 에어컨 > 창문형'</li></ul> |
| 202.0 | <ul><li>'키친아트 신제품 1구 하이라이트 전기 레인지 가정용 원룸 휴대용 소형 1인용 캠핑 미니 인덕션 모델명 : KP-8011 (#M)홈>디지털/가전>주방가전>하이라이트 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이라이트'</li><li>'SK매직 빌트인 매립형 스탠드형 프리스탠딩 3구 하이라이트 전기레인지 / ERABT300M 스탠드타입(높이8CM) (#M)11st>주방가전>전기레인지>하이라이트 11st > 가전/디지털 > 주방가전 > 전기레인지 > 하이라이트'</li><li>'보랄 DUO 2구 하이라이트 BR-TH5800FY 인덕션 전기렌지 주방용품 집들이선물 (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 전기레인지 > 하이라이트'</li></ul> |
| 111.0 | <ul><li>'텐바이텐 정품 MS 윈도우 10 프로 한글 FPP 처음사용자용 설치USB 병행 (#M)위메프 > 가전·디지털·컴퓨터 > PC부품/주변기기/저장장치 > PC주변기기 > 케이블/젠더 위메프 > 가전·디지털·컴퓨터 > PC부품/주변기기/저장장치 > PC주변기기 > 케이블/젠더'</li><li>'마이크로소프트 윈도우11홈 FPP 처음사용자용 한글 (USB) 온라인 공식 판매 인증점 (#M)컴퓨터 주변기기>소프트웨어>운영체제(OS) GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 소프트웨어 > 운영체제(OS)'</li><li>'5천원 쿠폰💖 [마이크로소프트] Windows 10 Pro 처음사용자용 패키지(FPP) [한글/USB타입] (#M)디지털/가전>소프트웨어>운영체제 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 12.0 | <ul><li>'(24시 상품발송) PC/스팀 한글판 Raft 래프트 레프트 NA 래프트 NA (#M)디지털/가전>게임기/타이틀>PC게임 GFK > Naverstore > 디지털 > 게이밍 > PC게임'</li><li>'(스팀코드 24시간 자동발송) Victoria 3 빅토리아 3 AA 모든계정에 등록가능 1.빅토리아 3 AA (#M)디지털/가전>게임기/타이틀>PC게임 GFK > Naverstore > 디지털 > 게이밍 > PC게임'</li><li>'(10초발송 스팀 스팀게임) 라스트 에폭 NA Last Epoch 라스트에폭 AA모든 (#M)디지털/가전>게임기/타이틀>PC게임 GFK > Naverstore > 디지털 > 게이밍 > PC게임'</li></ul> |
| 131.0 | <ul><li>'포레오 진동클렌저 루나 4 고 에버그린 1개 루나 4 고 (에버그린)+선물박스 (소) (#M)디지털/가전>이미용가전>기타이미용가전 LO > window_fashion_town > Naverstore > FashionTown > 뷰티 > CATEGORY > 뷰티 디바이스 > 기타'</li><li>'글로비 다크리스 색소침착 마사지기 다크서클 홈케어 다크써클 본품1개(1월20일 소량입고) (#M)디지털/가전>이미용가전>피부케어기기 GFK > traverse > Naverstore > 가전 > 이미용가전'</li><li>'포레오 진동클렌저 루나 4 (민감성 피부) 1개 루나 4 (민감성 피부)+선물박스 (대) (#M)디지털/가전>이미용가전>기타이미용가전 LO > window_fashion_town > Naverstore > FashionTown > 뷰티 > CATEGORY > 뷰티 디바이스 > 기타'</li></ul> |
| 166.0 | <ul><li>'에버홈 EV-RG3000 투명창 듀얼 필터 생선구이기. (#M)주방가전>전기그릴/전기팬>전기그릴 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 전기그릴/전기팬 > 전기그릴'</li><li>'쿠쿠 양면 멀티 그릴 CFR-331R (#M)디지털/가전>주방가전>생선그릴 Naverstore > 가전 > 주방가전 > 전기그릴/팬 > 생선그릴'</li><li>'[에버홈] 생선구이기 점보 (#M)주방가전>전기포트>무선포트/주전자 GFK > traverse > 11st > 가전/디지털 > 주방가전 > 전기포트 > 무선포트/주전자'</li></ul> |
| 71.0 | <ul><li>'샤오미 미지아모니터조명 LED MJGJD02YL 2세대 (#M)디지털/가전>모니터주변기기>기타모니터주변기기 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 기타'</li><li>'[카멜인터내셔널] 베사 확장 브라켓, VC-1 [200X200mm 변환] (#M)디지털/가전>모니터주변기기>기타모니터주변기기 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 기타'</li><li>'스톤힐 MS-01 모니터 받침대 듀얼 스탠드 다용도 선반 MS-01 400(400mm)_블랙(업그레이드-높이8cm) (#M)디지털/가전>모니터주변기기>기타모니터주변기기 Naverstore > 컴퓨터 > 주변기기 > 모니터용 > 받침대'</li></ul> |
| 224.0 | <ul><li>'DJI Osmo 마그네틱 볼 조인트 어댑터 마운트 (#M)디지털/가전>카메라/캠코더용품>액션캠 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 액션캠/캠코더'</li><li>'인스타360 ACE PRO2 에이스 프로2 다이브 번들 정품 액션캠 포인트 포함 256GB로 변경 (#M)디지털/가전>카메라/캠코더용품>액션캠 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 액션캠/캠코더'</li><li>'포토토 빈티지 캠코더 레트로 Y2K 미니 비디오 카메라 핑크 (#M)디지털/가전>카메라/캠코더용품>캠코더 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 액션캠/캠코더'</li></ul> |
| 48.0 | <ul><li>'ipTIME A2004SE 기가비트 와이파이 공유기 유무선 아이피타임 라이트 메시 무선 인터넷 WIFI (#M)컴퓨터 주변기기>공유기>유무선공유기 GFK > 11st > 가전/디지털 > 컴퓨터 주변기기 > 공유기 > 유무선공유기'</li><li>'EFM네트웍스 아이피타임 N704EPlus (#M)홈>디지털/가전>네트워크장비>공유기>유무선공유기 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li><li>'아이피타임 ipTIME A3008-MU WIFI 유무선 공유기 YBS (#M)홈>디지털/가전>네트워크장비>공유기>유무선공유기 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li></ul> |
| 121.0 | <ul><li>'무선 핀마이크 유튜브 휴대용 방송용 강의용 마이크 스마트폰 블루투스 마이크 보이스원 프로 M-70RW-PRO (#M)디지털/가전>음향가전>마이크>무선마이크 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 1인방송/촬영 > 스마트폰용품'</li><li>'듀얼 무선 블루투스 마이크 무선스피커 버스킹마이크 노래방 앰프 가정용 앰프마이크 블루투스스피커MP3 본품+NV179-저속충전기 (#M)음향가전>마이크>무선마이크 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 마이크 > 무선마이크'</li><li>'마이크론 Crucial T500 히트싱크 M.2 NVMe 대원씨티에스 (2TB) (#M)저장장치>SSD>1TB이상 GFK > traverse > 11st > 가전/디지털 > 저장장치 > SSD > 1TB이상'</li></ul> |
| 151.0 | <ul><li>'삼성전자 외장하드 Y3 SLIM 2TB 파우치 패키지 HX-MK20Y 01.Y3+파우치 증정_2TB_스모키 그레이 (25년형) + 파우치 (#M)디지털/가전>저장장치>외장HDD GFK > traverse > Naverstore > 컴퓨터 > 저장장치 > 외장하드'</li><li>'씨게이트 외장하드 4TB 4테라 외장HDD 스페이스그레이 [데이터복구+파우치] One Touch HDD 5TB 데이터복구_실버+전용파우치 (#M)디지털/가전>저장장치>외장HDD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > 외장하드'</li><li>'삼성전자 삼성 외장하드 J3 Portable USB3.0 2TB 외장 HDD [공식인증점] 도착보장 상품 (주문즉시 발송진행)_2TB 블랙 (#M)디지털/가전>저장장치>외장HDD GFK > naver_plus_traverse > Naverstore > PC/주변기기 > 저장장치 > 외장하드'</li></ul> |
| 0.0 | <ul><li>'HP일체형PC 올인원 게이밍컴퓨터 RTX3050 인텔13세대 가정용 기업용 화상회의 파워팩(총32G업+윈11홈정품/개봉설치)_NVMe 1TB 교체(개봉장착) (#M)홈>🖥데스크탑 Naverstore > 컴퓨터 > 데스크탑 > 브랜드PC > HP'</li><li>'[✨삼성슈퍼위크 72만+메모리 무상UP] 삼성전자 삼성 DM500TFA-A38A 데스크탑 인텔 13세대 i3 가성비 인강용 사무용 PC 1. 참여(한컴오피스 동봉)_1. 참여(완료 시 DROP 키보드)_삼성 메모리 8GB(개봉장착) (#M)디지털/가전>PC>브랜드PC GFK > Naverstore > 컴퓨터 > 데스크탑'</li><li>'삼성 데스크탑 DM500TEA-A78A 고사양 사무용 인텔 12세대 i7 컴퓨터 삼성PC 1. 참여(한컴오피스 동봉)_2.NEW✨DM500TFA-A78A(13세대) 홈>전체상품;홈>데스크탑>12세대 CPU;(#M)홈>삼성데스크탑>12세대 CPU Naverstore > 컴퓨터 > 데스크탑 > 브랜드PC > 삼성전자'</li></ul> |
| 136.0 | <ul><li>'비달사순 에어스타일러 VSAS80PIK 비달사순 에어스타일러 VSAS80PIK 홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 고데기'</li><li>'포뷰트 엠스타일러 포뷰트 엠스타일러 홈>남성>헤어케어>헤어 기기;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>스타일링>왁스/젤/무스;홈>헤어케어>헤어기기>탈모/두피기기;홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 남성 > 헤어케어 > 염색/다운펌/기기'</li><li>'청담스타일 뿌리펌 브러쉬 청담스타일 뿌리펌 브러쉬 (그레이) (#M)홈>청담스타일 고데기 Naverstore > 가전 > 이미용가전 > 헤어스타일러 > 에어브러시'</li></ul> |
| 59.0 | <ul><li>'솔텍 SFC200-SCS 싱글모드 100Mbps 광컨버터 (#M)디지털/가전>네트워크장비>컨버터장비 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 컨버터'</li><li>'넥시 AV 아날로그 3RCA to HDMI 변환 컨버터 NX648 (#M)디지털/가전>네트워크장비>컨버터장비 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 컨버터'</li><li>'랜스타 LS-AV2HD AV컨버터 3RCA to HDMI 1080P 지원 양방향 불가 (#M)디지털/가전>네트워크장비>컨버터장비 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > 컨버터'</li></ul> |
| 58.0 | <ul><li>'ipTIME(아이피타임) A1004 기가비트 유무선공유기 Wi-fi 안테나 3개 5GHz, 2.4GHz 듀얼밴드 홈>전체상품;(#M)홈>브랜드관>ipTime(공유기,랜카드)>유무선 공유기 Naverstore > 컴퓨터 > 주변기기 > 공유기 > 유무선공유기'</li><li>'COMS 무선 안테나 암,수 Wi-Fi Antennas 2.4Ghz 5dbi-RP-SMA 5dbi-RP-SMA (암) (#M)디지털/가전>네트워크장비>안테나 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 안테나'</li><li>'포켓 라디오 소리큰 비상용 라디오 재난용 초미니 라디오 안테나 mp3플레이어 라디오 (#M)디지털/가전>음향가전>라디오 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 라디오/MP3'</li></ul> |
| 7.0 | <ul><li>'[INTEL] Arc A770 Limited Edition D6 16GB (#M)디지털/가전>PC부품>그래픽카드>기타계열 Naverstore > 컴퓨터 > 부품 > 그래픽카드 > 기타계열'</li><li>'GIGABYTE 지포스 RTX 4060 Ti EAGLE D6 8GB 피씨디렉트 (#M)홈>디지털/가전>PC부품>그래픽카드>NVIDIA계열 Naverstore > 컴퓨터 > 부품 > 그래픽카드 > NVIDIA계열'</li><li>'갤럭시 GALAX RTX 3080 EX 게이머 WHITE OC 10GB 24년 8월~10월 무상as 남음 풀박스제품 3팬 화이트 (#M)디지털/가전>PC부품>그래픽카드>NVIDIA계열 GFK > Naverstore > 컴퓨터 > 부품 > 그래픽카드 > NVIDIA계열'</li></ul> |
| 155.0 | <ul><li>'네스프레소 에어로치노4 NESPRESSO 유럽 직배송 (#M)홈>전체상품 Naverstore > 디지털/가전 > 주방가전 > 거품/반죽기'</li><li>'네스프레소 에어로치노4 (#M)디지털/가전>주방가전>거품/반죽기 Naverstore > 가전 > 주방가전 > 커피용품 > 우유거품기'</li><li>'오펠 스탠드믹서 1100W 거품기 반죽기 휘핑기 OFM-1504 레트로베이지 (#M)디지털/가전>주방가전>거품/반죽기 Naverstore > 가전 > 주방가전 > 오븐/제빵 > 거품/반죽기'</li></ul> |
| 46.0 | <ul><li>'이지넷유비쿼터스 NEXT-7602KVM-4K 2포트 HDMI KVM스위치 화이트 (#M)디지털/가전>네트워크장비>KVM스위치 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > KVM'</li><li>'ATEN KL1516AIN 19인치 Cat5 LCD KVM 스위치 듀얼레일 Over IP LCD콘솔 (#M)디지털/가전>네트워크장비>KVM스위치 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > KVM'</li><li>'이지넷유비쿼터스 NEXT-7102KVM-4K 2x1 HDMI USB UHD 4K KVM 스위치 (#M)디지털/가전>네트워크장비>KVM스위치 Naverstore > 컴퓨터 > 주변기기 > 허브/컨트롤러 > KVM'</li></ul> |
| 229.0 | <ul><li>'샤오미 이북리더기 E북리더기 전자책리더기 mi reader 미리더 '</li><li>'밀리의서재 E북리더기 + 밀리의서재 12개월 구독권 '</li><li>'[ 설 선물대첩 ] 이노스페이스원 루나 6인치 이북리더기 범용기 루나X+퍼플스킨 (#M)디지털/가전>학습기기>전자책 GFK > traverse > Naverstore > 디지털 > 태블릿PC > 전자책 > 본체'</li></ul> |
| 34.0 | <ul><li>'천장형 시스템 에어컨 바람막이 윈드 플렉스 가림막 윈드플렉스 투명 1개 (#M)디지털/가전>계절가전>에어컨주변기기>기타액세서리 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 리모컨, 주변용품'</li><li>'천장형 시스템에어컨 실링팬 화이트 올트팬 바람막이 순환프로펠러 윈드바이저 에어컨 바람개비 천정형 에어컨 실링팬 화이트 (#M)디지털/가전>계절가전>에어컨주변기기>기타액세서리 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 리모컨, 주변용품'</li><li>'천장형 시스템 에어컨바람막이 LG 삼성 공용(4way 1세트) (#M)디지털/가전>계절가전>에어컨주변기기>기타액세서리 GFK > Naverstore > 가전 > 계절가전 > 에어컨 > 리모컨, 주변용품'</li></ul> |
| 78.0 | <ul><li>'오랄비 iO9 전동칫솔 블랙 오닉스 (핸들1+리필모4+충전기+충전케이스)+( )치간칫솔 10개입 치간칫솔 10개입 [GW344]_iO9 블랙 오닉스[Q034]_얼티밋화이트4입[Q039] (#M)디지털/가전>생활가전>구강청정기>전동칫솔 GFK > Naverstore > oralbkr브랜드스토어 > 전동칫솔 > iO Series'</li><li>'2080 소닉클론 음파진동 기획팩 (본품1+리필3) 2080 소닉클론 음파진동 기획팩 (본품1+리필3) 홈>건강/위생용품>덴탈케어>전동칫솔/세정기;홈>건강/위생용품>구강용품>전동칫솔/세정기;(#M)홈>구강/건강용품>구강용품>전동칫솔/세정기 OLIVEYOUNG > 베스트 > 구강/건강용품'</li><li>'식스비 3단 유아 음파 전동칫솔 전용 칫솔모 3단유아_옐로우칫솔모(2EA) (#M)디지털/가전>생활가전>구강청정기>전동칫솔모 Naverstore > 가전 > 욕실가전 > 전동칫솔모'</li></ul> |
| 44.0 | <ul><li>'힘펠 터보팬 JV-102 환풍기 욕실 저소음 정풍량 고성능 역류방지 전동댐퍼 자가설치(직접설치) (#M)디지털/가전>계절가전>공기정화기>환풍기 GFK > traverse > Naverstore > 가전 > 계절가전 > 공기청정기'</li><li>'한일 화장실 환풍기 욕실 환풍기 환기팬 셔터형 35cm (#M)11st>계절가전>공기청정기>필터식 11st > 가전/디지털 > 계절가전 > 공기청정기 > 필터식'</li><li>'힘펠 욕실/화장실 환풍기 플렉스 C2-100LF 역류방지 냄새차단 자가설치 중정압/저소음 1.제로크(No.11~22)_12-1.제로크HV3-80X(MD) F그릴_자가설치 (#M)디지털/가전>계절가전>공기정화기>환풍기 Naverstore > 가전 > 계절가전 > 공기청정기 > 환풍기'</li></ul> |
| 1.0 | <ul><li>'레노버 씽크스테이션 P360 Ultra-30G1S01N00 i7-12700 16G 512G ( 11월 입고) (#M)홈>전체상품 Naverstore > 컴퓨터 > 데스크탑 > 서버/워크스테이션'</li><li>'[Dell] PowerEdge T350 E-2378G 8GB 480GB SSD 600W(1+1) H755 '</li><li>'워크스테이션 DELL T7910 24코어 48스레드 128G 홈>디지털/가전>PC>서버/워크스테이션;(#M)홈>디지털가전 Naverstore > 컴퓨터 > 데스크탑 > 서버/워크스테이션'</li></ul> |
| 129.0 | <ul><li>'삼성전자 삼성 HW-Q990D '</li><li>'벽걸이 타공형 슬림 사운드바거치대 심플 사운드바 브라켓 셀프인테리어 캣 벽걸이 선반 켓 사운드바 사운드바 브라켓 (#M)음향가전>홈시어터>홈시어터 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 홈시어터 > 홈시어터'</li><li>'브리츠 BZ-T3600 '</li></ul> |
| 222.0 | <ul><li>'NEXI 넥시 USB3.0 Type-C A 카드리더기 NX1479 [0001](NEXI) 넥시 USB3.0 Type-C A 카드리 (#M)휴대폰>선불폰/기타>선불유심 GFK > traverse > 11st > 가전/디지털 > 휴대폰 > 선불폰/기타 > 선불유심'</li><li>'POS 신용카드 리더기 MSR-1000 USB 마그네틱리더기 '</li><li>'무인정산기 주차장 자판기 키오스크 단말기 신용카드리더기 TL3500BP '</li></ul> |
| 206.0 | <ul><li>'대용량약탕기 가정용약탕기 홍삼 중탕기 제조기 6L (#M)디지털/가전>주방가전>홍삼제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식 > 홍삼제조기'</li><li>'[티울림 건강포트] 약탕기 티포트 중탕기 전기 가정용 차탕기 홍삼제조기 뉴베이지 (#M)디지털/가전>주방가전>홍삼제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식'</li><li>'오쿠 도자기 단지 패킹 / 전 도자기 사용 가능 (#M)디지털/가전>주방가전>홍삼제조기 GFK > Naverstore > 가전 > 주방가전 > 홍삼/영양식'</li></ul> |
| 15.0 | <ul><li>'XBOX 오버쿡드 + 오버쿡드2 (코드전송) 한국 계정은 등록법 참조 (#M)디지털/가전>게임기/타이틀>게임타이틀 GFK > Naverstore > 디지털 > 게이밍 > XBOX > 게임타이틀'</li><li>'닌텐도 링피트 어드벤처 링콘 세트 스위치 스포츠 게임 팩 링핏 다이어트 운동 ☆신작☆ 링피트어드벤처 + 저스트댄스 2023 (#M)디지털/가전>게임기/타이틀>게임타이틀 GFK > Naverstore > 디지털 > 게이밍 > 닌텐도 > 게임타이틀'</li><li>'닌텐도 스위치 슈퍼 마리오 RPG 특전 칩케이스 마리오RPG + 버섯 칩케이스 (#M)디지털/가전>게임기/타이틀>게임타이틀 GFK > Naverstore > 디지털 > 게이밍 > 닌텐도 > 게임타이틀'</li></ul> |
| 226.0 | <ul><li>'코닥 골드 필름 200 36컷 + 코닥 컬러플러스 필름 200 36컷 1세트 단품 '</li><li>'스몰리그 X FILM RIOT 10 in 1 접이식 멀티툴 키트 레드 4813 '</li><li>'코닥 필름카메라 필름 컬러플러스 200/36 '</li></ul> |
| 50.0 | <ul><li>'40Gb/s QSFP+ 광모듈 트랜시버 NEXT-QSFP40G-SR4 '</li><li>'이지넷유비쿼터스 넥스트유 SFP10G-LR-H '</li><li>'ipTIME SFP-UTP1G RJ45 모듈 기가비트 100M 거리 지원 '</li></ul> |
| 17.0 | <ul><li>'미니 가습기 휴대용 USB 초음파 아로마 에센셜 오일 디퓨저 220ml 가정용 자동차 미스 미니 가습기 휴대용 USB 초음파 아로마 에센셜 오일 디퓨저 220ml 가정용 자동차 미스_03 green (#M)가전·컴퓨터>계절가전>USB·스틱가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > USB·스틱가습기'</li><li>'USB가습기 밤Bomb USB 가습기 간편세척 청소쉬운 가정용 미니 탁삭용 거실 휴대용 비염 하늘 (#M)홈>디지털/가전>계절가전>가습기>가습기필터 Naverstore > 가전 > 계절가전 > 가습기/에어워셔 > 필터/액세서리'</li><li>'아로마 불꽃 가습기USB 충전식 디퓨저 미스트 분무기, 사무실 차량 공기 청정기 장식, 침실 장식품 아로마 불꽃 가습기USB 충전식 디퓨저 미스트 분무기, 사무실 차량 공기 청정기 장식, 침실 장식품_03 분홍색 (#M)가전·컴퓨터>계절가전>가습기 Tmon > 가전·디지털 > 가전·컴퓨터 > 계절가전 > 가습기'</li></ul> |
| 218.0 | <ul><li>'카드 DJI 케어 리프레쉬 2년 플랜 (오즈모 액션 4) (#M)SSG.COM>카메라/캠코더>촬영용 드론 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 촬영용 드론'</li><li>'유프로 프리미엄2 액션캠 브이로그카메라 유튜브카메라 블랙 본품 '</li><li>'팅크웨어 아이나비 모빌리티 액션캠 MC-1 '</li></ul> |
| 214.0 | <ul><li>'입문용카메라 초보자 디지털 카메라 가성비 dslr 4K '</li><li>'캐논정품 EOS 90D바디만(미개봉 새상품)/R '</li><li>'(Hidden) 정품 소니 알파 A350 '</li></ul> |
| 144.0 | <ul><li>'스타롤 충전식 열헤어롤 블랙 스타롤 충전식 열헤어롤 블랙 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>탈모/두피기기;홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 고데기'</li><li>'전기 헤어롤 여행용 비달사순 헤어 세팅기 롤 구르프 셋팅롤 VSHS10BK(N) (#M)홈>게릴라특가 Naverstore > 가전 > 이미용가전 > 헤어스타일러 > 헤어롤/롤셋'</li><li>'스타롤 빅스타롤 충전식 열헤어롤 민트 홈>헤어케어>헤어기기>헤어셋팅기기X;홈>헤어케어>헤어기기>헤어셋팅기기;홈>헤어케어>헤어기기>헤어롤;홈>헤어케어>헤어기기>탈모/두피기기;홈>헤어케어>헤어기기>탈모/두피기기/헤어롤;(#M)홈>헤어케어>헤어기기>고데기 OLIVEYOUNG > 헤어케어 > 헤어기기 > 고데기'</li></ul> |
| 89.0 | <ul><li>'키스뉴욕 마그네틱 원 큐 램프 큐어 핀큐어 휴대용 젤램프 스탠드 거치대 포함+선물선택 고급 오일펜 (#M)디지털/가전>이미용가전>손발톱정리기 GFK > naver_plus_traverse > Naverstore > 가전 > 이미용가전 > 손발케어'</li><li>'파파 와이드 스탠드 500S 책상 조명 스텐드 LED등 독서등 공부조명 '</li><li>'파나소닉 LED스탠드 5W USB-C 충전방식 접이식 무선스탠드 휴대용스탠드 침대독서등 '</li></ul> |
| 126.0 | <ul><li>'아날로그 휴대용 카세트 플레이어 테이프 MP3변환 레트로 감성 '</li><li>'Byron Statics 휴대용 카세트 플레이어 '</li><li>'롯데알미늄 블루투스 CD플레이어 핑키-500 라디오 (#M)디지털/가전>음향가전>CD플레이어 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 플레이어'</li></ul> |
| 91.0 | <ul><li>'[베스토/BESTO] 핸디형 스팀청소기 BSC-900 홈>청소&세척;(#M)홈>수공구 Naverstore > 가전 > 청소기 > 핸디청소기'</li><li>'샤오미 디어마 스팀청소기 핸디형 살균스팀청소기 ZQ610/600 청소기+리필세트 홈>전체상품;(#M)홈>디지털/가전>생활가전>청소기>스팀청소기 Naverstore > 가전 > 청소기 > 핸디청소기'</li><li>'[대여] 카처SC4 스팀청소기 새걸레 제공 전용 브러쉬 6종 동의 합니다._1/20~24일 수령 후 31일 수거 (#M)디지털/가전>청소기>스팀청소기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 청소기'</li></ul> |
| 228.0 | <ul><li>'초소형녹음기 소형 장시간 휴대용 보이스레코드 32G 32G (#M)디지털/가전>학습기기>보이스레코더 GFK > traverse > Naverstore > 디지털 > 음향기기 > 녹음기'</li><li>'이지렉 초소형 블루투스 보이스레코더 32GB '</li><li>'자체제작 16기가 C타입 초소형 동전크기 대용량 장시간 휴대용 보이스레코더 녹음기 '</li></ul> |
| 147.0 | <ul><li>'엠비에프 USB 3.0 / C타입 외장 ODD DVD-RW '</li><li>'멀티허브 3.0 C타입 레코더기기 외장ODD DVD룸 외장드라이브 레코더 DVD롬 외장 USB ED02 CD A ODD 7IN1 (#M)저장장치>ODD>CD-ROM/RW GFK > traverse > 11st > 가전/디지털 > 저장장치 > ODD'</li><li>'노트북 외장CD롬 ODD 플레이어 DVD콤보 리더기 '</li></ul> |
| 116.0 | <ul><li>'플레오맥스 CD 플레이어 블루투스 라디오 스피커 휴대용 '</li><li>'일우 투명 CD플레이어 IW-ET07 휴대용 충전식 레트로 감성 '</li><li>'아이리버 올인원 CD 플레이어 턴테이블 디자인 라디오 블루투스 스피커 IAB40 '</li></ul> |
| 125.0 | <ul><li>'인이어이어폰 게이밍이어폰 커널형 마이크 유선 이어폰 탕주 상관완아 탕주 상관완아 블랙_MIC (#M)디지털/가전>음향가전>이어폰 GFK > traverse > Naverstore > 디지털 > 게이밍 > 이어폰/헤드셋'</li><li>'KOSS 코스 포르타 프로 한정판 온이어 유선 헤드폰 Koss Porta Pro 정품 미국발송 (#M)디지털/가전>음향가전>헤드폰 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 헤드폰'</li><li>'인이어이어폰 탕주 상관완아 SE 스튜디오 에디션 커널형 유선 이어폰 탕주 상관완아 SE 화이트 (#M)디지털/가전>음향가전>이어폰 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 이어폰'</li></ul> |
| 149.0 | <ul><li>'USB C TO HDMI 케이블 C타입hdmi 4K 미러링 복제 확장 1M 실버 3M-실버 (#M)디지털/가전>PC부품>PC케이블>변환 젠더/케이블 GFK > traverse > Naverstore > 컴퓨터 > 주변기기 > 케이블/젠더 > 케이블'</li><li>'샌디스크 울트라 듀얼드라이브 고 USB Type C USB 메모리 256GB 묵인하다 (#M)디지털/가전>저장장치>USB메모리 GFK > traverse > Naverstore > 컴퓨터 > 저장장치 > USB메모리'</li><li>'Bliksem TYPE C 플래시 드라이브 OTG 32GB 고속 USB2.0, 컴퓨터 휴대폰용, 3 인 1 미니 펜 01 64GB (#M)저장장치>USB 메모리>카드/주얼리형 GFK > traverse > 11st > 가전/디지털 > 저장장치 > USB 메모리 > 카드/주얼리형'</li></ul> |
| 51.0 | <ul><li>'랜스타 LS-NF8209 랜 케이블 멀티 테스터기 탐지/길이/POE 지원 (#M)디지털/가전>네트워크장비>네트워크테스트기 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 네트워크테스트기'</li><li>'랜 테스터기 468W 랜선 테스터 UTP 단선체크 RJ45 RJ11 02 랜테스터기 468W 블랙 (#M)디지털/가전>네트워크장비>네트워크테스트기 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 네트워크테스트기'</li><li>'LS 랜테스터기 UTP RJ45 랜케이블 퀵테스터기 LS-LAN-TQ LS-LAN-TA 분리형타입 (#M)디지털/가전>네트워크장비>네트워크테스트기 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 네트워크테스트기'</li></ul> |
| 169.0 | <ul><li>'키친아트 와이드 6단 트레이 식품건조기 KKW-KG7000 음식 야채 과일 간식 고기 건조기 KKW-KG7000 (#M)홈>디지털/가전>주방가전>식품건조기 Naverstore > 가전 > 주방가전 > 식품건조기'</li><li>'[6%쿠폰] 키친아트 식품건조기 타이머가능 과일 야채 고추 건조기 GN-232D-타이머기능 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전;(#M)위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 식품건조기 위메프 > 가전·디지털·컴퓨터 > 주방가전 > 홈메이드가전 > 식품건조기'</li><li>'명품 농산물 다목적 고추건조기 소형 7채반 가정용전기사용 (#M)홈>전체상품 Naverstore > 가전 > 주방가전 > 식품건조기'</li></ul> |
| 118.0 | <ul><li>'금영 태진 가정용 노래방기계 이동식 세트 '</li><li>'AV-1000 AV1000 휴대용 노래방 가정용 노래방기기 캠핑 차박 (#M)디지털/가전>음향가전>노래반주기 Naverstore > 디지털 > 음향기기 > 노래반주기'</li><li>'코인노래방 기계 풀세트 가정용 노래방 방음부스 태진반주기 '</li></ul> |
| 14.0 | <ul><li>'아싸라봉 닌텐도 스위치 OLED 찐패키지 악세사리 7종 젤리 세트 닌텐도 OLED용-찐(젤리)7종패키지 블랙 (#M)디지털/가전>게임기/타이틀>게임기주변기기>가방/케이스 GFK > Naverstore > 디지털 > 게이밍 > 주변용품'</li><li>'휴대용 게임 콘솔 보관 가방 보호 케이스, 충격 방지 하드 파우치, Asus ROG Ally 액세서리 01 Red (#M)11st>노트북>삼성전자>코어 i5 11st > 가전/디지털 > 노트북 > 삼성전자 > 코어 i5'</li><li>'XBOX 마이크로소프트 엑스박스 무선 컨트롤러 4세대 (로봇화이트) 로봇화이트 (#M)위메프 > 가전·디지털·컴퓨터 > 게임기/게임타이틀 > 게임 주변기기 > XBOX 주변기기 위메프 > 가전·디지털·컴퓨터 > 게임기/게임타이틀 > 게임 주변기기 > XBOX 주변기기'</li></ul> |
| 82.0 | <ul><li>'나노N / 나노팬더 / 나노펭귄 무전기 이어폰 경호용 이어마이크 리시버 인이어 핸드마이크 옵션2(귀걸이형이어마이크) (#M)디지털/가전>생활가전>무전기>무전기액세서리 GFK > Naverstore > 가전 > 생활가전 > 무전기 > 액세서리'</li><li>'무전기 이어마이크 / 인이어 / 리시버 / 리필 이어튜브 / 투명 / 블랙 투명튜브 (#M)디지털/가전>생활가전>무전기>무전기액세서리 GFK > Naverstore > 가전 > 생활가전 > 무전기'</li><li>'무전기이어폰 JM-8000T 스탠다드 이어마이크 외 다른타입 경호용 인이어 리시버 국산 ③ 스탠다드 (#M)디지털/가전>생활가전>무전기>무전기액세서리 GFK > Naverstore > 가전 > 생활가전 > 무전기 > 액세서리'</li></ul> |
| 104.0 | <ul><li>'한일 미니 짤순이 음식물 탈수기 야채 빨래 만능 다용도 NW-Y2020(신모델) (#M)디지털/가전>생활가전>세탁/건조기>탈수기 GFK > traverse > Naverstore > 가전 > 세탁/건조기 > 탈수기'</li><li>'휴앤봇 스텐 가정용 업소용 세탁 빨래 탈수기 짤순이 DL560 (#M)홈>디지털/가전>생활가전>건조기/탈수기>탈수기 Naverstore > 가전 > 세탁기/건조기 > 탈수기'</li><li>'[25년형] 신일 빨래탈수기 스텐 소형 대용량 수영장 의류 세탁 업소용 7kg '</li></ul> |
| 103.0 | <ul><li>'무선UV침구 청소기 빽가 미우새 진드기 빈대 충전식 이불 침구 진드기 침구청소기 자동청소기 무선UV침구청소기-화이트 (#M)생활가전>청소기>스팀청소기>핸디/스틱형 GFK > traverse > 11st > 가전/디지털 > 생활가전 > 청소기 > 스팀청소기'</li><li>'[텐바이텐][Sanrio] 헬로키티 밥솥 홈>텐바이텐 X Sanrio;(#M)홈>전체상품 Naverstore > 가전 > 청소기 > 침구청소기'</li><li>'[텐바이텐][모던하우스] 2중 전기포트 (#M)홈>전체상품 Naverstore > 가전 > 청소기 > 침구청소기'</li></ul> |
| 65.0 | <ul><li>'헤드셋거치대 에어팟맥스 소니 게이밍 헤드폰 걸이 스탠드 (#M)디지털/가전>음향가전>이어폰/헤드폰액세서리>거치대 GFK > traverse > Naverstore > 디지털 > 음향기기 > 이어폰/헤드폰액세서리 > 케이스/거치대'</li><li>'[스냅케이스]프리미엄 가죽 헤드폰 헤드셋 파우치 케이스 수납 가방 휴대용 보관 크림화이트(HP06) (#M)음향가전>이어폰>무선 이어폰 GFK > traverse > 11st > 가전/디지털 > 음향가전 > 이어폰 > 무선 이어폰'</li><li>'[호환] 앱코 해커 B510 이어패드 게이밍 헤드셋 B510U 7.1 커버 H030 (#M)홈>헤드폰 이어패드 Naverstore > 디지털 > 음향기기 > 이어폰/헤드폰액세서리 > 캡/솜/팁'</li></ul> |
| 107.0 | <ul><li>'1초발송 브이엠웨어 워크스테이션 프로 17 개인용 상업용 정품 영구 라이선스 리딤코드 VMware Workstation Pro 워크스테이션 프로 17 개인용 윈도우용 (#M)디지털/가전>소프트웨어>개발툴 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 개발툴'</li><li>'MS SQL Server 2022 Standard Edition CSP 라이선스 (#M)디지털/가전>소프트웨어>운영체제 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 운영체제'</li><li>'비주얼스튜디오 프로 VisualStudio 2022 Pro 영구 라이선스 (#M)디지털/가전>소프트웨어>개발툴 GFK > Naverstore > 컴퓨터 > 소프트웨어 > 개발툴'</li></ul> |
| 191.0 | <ul><li>'렌탈[공식인증]SK매직정수기렌탈 WPU-8230C 의무사용기간 36개월 초기비용면제 09.스스로 직수 냉정수기 2022_의무기간 해피콜 상담 시 결정_60 11st>가전>이미용/생활가전>생활가전;(#M)11st>렌털/가입상품>가전렌털>정수기 11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기'</li><li>'렌탈SK매직 미니 직수 정수기 렌탈 단하루 역대급 최대혜택보장 에코미니 정수기_해피콜 상담시 확인 및 결정(1644-5279)하겠습니다._72 11st>렌털/가입상품>가전렌털>정수기;11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기 11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기'</li><li>'렌탈[SK매직] 렌탈/라이브방송 기념 상품권 오늘 하루만 35만원 지급/얼음정수기/직수정수기/렌탈료 7천원 할인 01.올인원플러스 직수얼음 정수기(WPUIAC302)_6년약정_72 11st>렌털/가입상품>가전렌털>정수기;11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 11st > 가전/디지털 > 렌털/가입상품 > 가전렌털 > 정수기'</li></ul> |
| 61.0 | <ul><li>'JWC CCTV 녹화기 500만화소 JDO-8005 8채널 DVR '</li><li>'이지피스 DVR CCTV 녹화기 AHVR-2204L 265 4채널 '</li><li>'다후아 500만화소 4채널 CCTV녹화기 DVR 본체 XVR5104HS-I3 '</li></ul> |
| 124.0 | <ul><li>'인켈 IK-A360CD '</li><li>'사운드디퓨저 음향판 음향디퓨저 (벌집Type) (#M)디지털/가전>음향가전>오디오>오디오액세서리 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 음향가전 > 오디오'</li><li>'제네바 제네바스피커 L + 스탠드 '</li></ul> |
| 24.0 | <ul><li>'제크롤 날개없어 안전한 에어쿨러 리모컨 냉풍기 JK-CF3000R 선풍기 기화냉각방식 (#M)11st>계절가전>냉풍기>냉풍기 11st > 가전/디지털 > 계절가전 > 냉풍기 > 냉풍기'</li><li>'[캐리어]공식인증점 캐리어 창문형 에어컨 AWC06FYHS 18.7㎡ (#M)11st>계절가전>냉풍기>냉풍기 11st > 가전/디지털 > 계절가전 > 냉풍기 > 냉풍기'</li><li>'신일전자 기화냉각방식 에어쿨러 이동식 냉풍기 SIF-D700SJ 7L 선풍기 SIF-D700SJ (#M)11st>계절가전>냉풍기>냉풍기 11st > 가전/디지털 > 계절가전 > 냉풍기 > 냉풍기'</li></ul> |
| 128.0 | <ul><li>'삼성전자 AKG N9 HYBRID '</li><li>'브리츠 BT4000 ANC '</li><li>'Apple 에어팟 맥스 '</li></ul> |
| 145.0 | <ul><li>'2.5인치 HDD 하드 500GB 데스크탑 노트북 하드디스크 500기가 (#M)디지털/가전>저장장치>HDD GFK > naver_plus_traverse > Naverstore > PC/주변기기 > 저장장치 > HDD'</li><li>'유니콘 USB3.1 유무선 HDD케이스 HDD외장하드케이스 노트북하드케이스 외장하드케이스 슬라이드 3.5인치 (#M)저장장치>외장HDD>500G~1TB미만 GFK > traverse > 11st > 가전/디지털 > 저장장치 > 외장HDD > 500G~1TB미만'</li><li>'WD Ultrastar HC560 20TB 1PACK SATA3 총판점 무상3년 보증 (#M)디지털/가전>저장장치>HDD GFK > naver_plus_traverse_extension > Naverstore > PC/주변기기 > 저장장치 > HDD'</li></ul> |
| 217.0 | <ul><li>'썬포토 슬릭 삼각대 SLIK GX-S 7500 스마트폰 카메라 겸용 삼각대 (#M)디지털/가전>카메라/캠코더용품>삼각대/헤드>삼각대 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 삼각대/헤드'</li><li>'[공식인증]인스타360 플로팅 핸드그립 (#M)SSG.COM>카메라/캠코더>삼각대/케이스>삼각대/헤드/플레이트 GFK > traverse > ssg > 디지털/렌탈 > 카메라/캠코더 > 삼각대/케이스 > 삼각대/헤드/플레이트'</li><li>'고프로 히어로 쇼티 삼각대 셀카봉 미니 익스텐션폴 (#M)디지털/가전>카메라/캠코더용품>삼각대/헤드>삼각대 GFK > naver_plus_traverse_extension > Naverstore > 휴대폰/카메라 > 카메라 > 삼각대/헤드'</li></ul> |
| 114.0 | <ul><li>'did모니터 광고용모니터 32인치 전자메뉴판 디지털 '</li><li>'삼성 43인치 4K UHD 광고 DID 모니터 디지털 사이니지 LH43QETELGCXKR '</li><li>'삼성 디지털 사이니지 55인치 LH55QBCEBGCXKR 광고 모니터 DID (#M)디지털/가전>영상가전>TV>LEDTV GFK > Naverstore > 가전 > TV > 화면크기별 > 50인치대'</li></ul> |
| 146.0 | <ul><li>'아이피타임 개인용 나스 NAS 서버 2베이 NAS 2dual '</li><li>'EFM네트웍스 아이피타임 NAS2 Dual '</li><li>'개인서버 가정용NAS J1900 타오나스 가정용 헤놀로지 서버 '</li></ul> |
| 193.0 | <ul><li>'[25년형 NEW] 한경희 건강식 마스터 데이필 두유 죽제조기 HFM-7000 '</li><li>'신일 두유제조기 1L 대용량 가정용 콩물 죽 메이커 만드는기계 '</li><li>'오쿠 OCC-BM1300 '</li></ul> |
| 109.0 | <ul><li>'[기업용] 터보백신 윈도우 서버 1년(통합 보안 악성코드 바이러스 검사/치료) '</li><li>'V3 365 클리닉 '</li><li>'[즉시발송] 카스퍼스키 플러스 1PC 신규형 카스퍼스키 플러스 1년 사용권 (#M)디지털/가전>소프트웨어>보안/백신 GFK > Naverstore > 컴퓨터 > 소프트웨어'</li></ul> |
| 132.0 | <ul><li>'3종세트 눈썹정리 (숱가위+눈썹칼+트위저+가죽케이스) 퍼플 (#M)이미용가전>눈썹정리기>눈썹정리기 GFK > traverse > 11st > 가전/디지털 > 이미용가전 > 눈썹정리기'</li><li>'[Y존,겨드랑이]쉬크 인튜이션 5중날 제모기(핸들1개+날2입)+특별 (쉬크 눈썹칼 프리미엄 4입) 바디트리머 1개+눈썹칼 4입 (#M)디지털/가전>이미용가전>제모기 GFK > naver_plus_traverse_extension > Naverstore > 가전 > 이미용가전 > 제모기/이발기'</li><li>'눈썹고데기 눈썹올리기 마스카라 열 뷰러 진케어 아이컬 아이컬(마스카라형) 핑크 (#M)홈>디지털/가전>이미용가전>눈썹정리기 Naverstore > 가전 > 이미용가전 > 눈썹관리기 > 속눈썹고데기'</li></ul> |
| 134.0 | <ul><li>'필립스 S7000 S5000 교체용 헤드 면도날 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li><li>'필립스 면도기 무선 클렌징 팟 세척카트리지 6개입/면도기세정액 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li><li>'필립스 RQ11 교체용 전기면도기날망 면도기날망 (#M)GSSHOP>뷰티>이미용기기>기타이미용기기 GSSHOP > 뷰티 > 이미용기기 > 기타이미용기기'</li></ul> |
| 198.0 | <ul><li>'믹스커피 자판기 커피 미니 기계 머신 식당 기계 업소용 상품 '</li><li>'VEN502 (기기+재료포함) 동구전자 믹스커피자판기 미니자판기 커피머신 전국설치 '</li><li>'두산로보틱스 무인카페 바리스타로봇 닥터프레소 '</li></ul> |
| 73.0 | <ul><li>'천장형 TV브라켓 천정형 티비거치대 모니터브라켓 벽걸이브라켓 cml6 '</li><li>'이젤형 티비 거치대 191cm 호환 TV 스탠드 거치대 크롬 크롬스탠드(1/13 입고) (#M)디지털/가전>영상가전>영상가전액세서리>스탠드 GFK > traverse > Naverstore > 가전 > TV > TV 액세서리 > 스탠드/브라켓'</li><li>'24년식 삼탠바이미 호환 사운드바거치대 무빙스탠드 기둥지름 50mm이하 (#M)디지털/가전>영상가전>영상가전액세서리>브라켓 GFK > naver_plus_traverse_extension > Naverstore > 가전 > TV > 스탠드/거치대'</li></ul> |
| 176.0 | <ul><li>'린나이 엘앤피 파세코 에코 웰텍 동양 호환 기름 정제필터 식용유필터 정제기필터 100매 320x490 엘앤피 파세코 에코 웰텍 (#M)홈>생활건강 Naverstore > 디지털/가전 > 주방가전 > 업소용튀김기'</li><li>'린나이 엘앤피 파세코 에코 웰텍 동양 호환 기름 정제필터 식용유필터 정제기필터 100매 322x382 린나이 ROR-F30 (#M)홈>생활건강 Naverstore > 디지털/가전 > 주방가전 > 업소용튀김기'</li><li>'린나이 엘앤피 파세코 에코 웰텍 동양 호환 기름 정제필터 식용유필터 정제기필터 100매 325x490 엘앤피 파세코 (#M)홈>생활건강 Naverstore > 디지털/가전 > 주방가전 > 업소용튀김기'</li></ul> |
| 175.0 | <ul><li>'리브레 업소용식기세척기sk매직호환 CDW-R152E 세제 2개월분포함 식당 영업용 식세기 '</li><li>'아트원 업소용 식기세척기 도어타입 온수용 카페 식당 영업용 대용량 무료배송 '</li><li>'제스트 업소용식기세척기 온수형 영업용 식당용 교회 회사 구내식당 식기세척기 전국 무료배송 '</li></ul> |
| 55.0 | <ul><li>'에그무제한 포켓파이 LG 신규 기기 대여 1개월 (LTE 데이터 2배 제공) 신규 기기 대여_1개월 (#M)디지털/가전>네트워크장비>무선모뎀 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 모뎀'</li><li>'에그 무제한 20GB KT LTE 데이터 신규 기기대여 도마우스 1개월 기존 기기 연장_도마우스 20GB_1개월 (#M)디지털/가전>네트워크장비>무선모뎀 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 모뎀'</li><li>'에그 무제한 20GB KT LTE 데이터 신규 기기대여 도마우스 1개월 기존 기기 연장_하트여왕 MAX_1개월 (10%+ 할인) (#M)디지털/가전>네트워크장비>무선모뎀 GFK > Naverstore > 컴퓨터 > 주변기기 > 네트워크장비 > 모뎀'</li></ul> |
| 62.0 | <ul><li>'유니콘 안드로이드셋탑박스 UHD 4K 60Hz 디빅스플레이어 DV-X70 '</li><li>'유니콘 AV-M7 2세대 디빅스플레이어 UHD 4K지원 미디어플레이어 (#M)디지털/가전>멀티미디어장비>Divx플레이어 Naverstore > 가전 > 영상가전 > 플레이어 > Dvix'</li><li>'서진네트웍스 유니콘 AV-M4 AV-M4본체 (#M)디지털/가전>멀티미디어장비>Divx플레이어 Naverstore > 가전 > 영상가전 > 플레이어 > Dvix'</li></ul> |
| 66.0 | <ul><li>'엠비에프 MBF-USB71C 사운드카드 '</li><li>'리버네트워크 넥시 NX-U20STC USB 사운드카드 (NX614) '</li><li>'[MBF] USB Virtual7.1 Channel 사운드카드 [MBF-USB71C] '</li></ul> |
| 28.0 | <ul><li>'바른산소 고체산소 가정용 사무실 휴대용 독서실 산소발생기 '</li><li>'클린숨 가정용 산소발생기 휴대용 산소생성기 독서실 고체 하루 산소 '</li><li>'세이버 오투나라 KSO-1205H 가정용 상업용 업소용 산소발생기 '</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.9082 |
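
The accuracy above is reported on the evaluation split held out during training. A comparable check can be run against your own labeled product titles with a few lines of code. The sketch below is illustrative only: the two titles are shortened from the label examples listed earlier in this card, and the gold label ids are taken from those same rows.

```python
from setfit import SetFitModel

# Load the fine-tuned classifier from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_item_top_el_flat")

# Illustrative evaluation data: product titles and their gold label ids
texts = [
    "삼성전자 Crystal UHD KU55UD7000FXKR 스탠드형",  # expected label 113.0 (TV)
    "하이맥스 CL-9700K 바리깡 전문가용 이발기",        # expected label 137.0 (clipper)
]
gold = [113.0, 137.0]

# predict() accepts a list of strings and returns one label per input
preds = model.predict(texts)
accuracy = sum(float(p) == float(y) for p, y in zip(preds, gold)) / len(gold)
print(f"accuracy: {accuracy:.4f}")
```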
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_item_top_el_flat")
# Run inference
preds = model("해피콜 프리미엄 초고속 블렌더 브리즈탭 LED 터치 UI 믹서기 분쇄기 차콜그레이 (#M)디지털/가전>주방가전>믹서기 Naverstore > 가전 > 주방가전 > 믹서기/블렌더 > 믹서기")
```
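
The call above classifies a single product title. The same `SetFitModel` instance can also score a batch of titles, and, head permitting (the default scikit-learn logistic-regression head does), expose per-class probabilities. A short sketch, assuming the `model` object loaded above:

```python
titles = [
    "삼성전자 Crystal UHD KU55UD7000FXKR 스탠드형",
    "하이맥스 CL-9700K 바리깡 전문가용 이발기",
]

# Batch prediction: one label id per title
preds = model.predict(titles)
print(preds)

# Per-class probabilities (only if the classification head supports it); rows align with `titles`
probs = model.predict_proba(titles)
print(probs.shape)
```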
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count | 5 | 21.2994 | 91 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0.0 | 50 |
| 1.0 | 16 |
| 2.0 | 50 |
| 3.0 | 50 |
| 4.0 | 50 |
| 5.0 | 50 |
| 6.0 | 50 |
| 7.0 | 50 |
| 8.0 | 50 |
| 9.0 | 50 |
| 10.0 | 50 |
| 11.0 | 50 |
| 12.0 | 50 |
| 13.0 | 50 |
| 14.0 | 50 |
| 15.0 | 50 |
| 16.0 | 50 |
| 17.0 | 50 |
| 18.0 | 50 |
| 19.0 | 50 |
| 20.0 | 50 |
| 21.0 | 50 |
| 22.0 | 14 |
| 23.0 | 50 |
| 24.0 | 10 |
| 25.0 | 50 |
| 26.0 | 50 |
| 27.0 | 50 |
| 28.0 | 14 |
| 29.0 | 50 |
| 30.0 | 12 |
| 31.0 | 45 |
| 32.0 | 14 |
| 33.0 | 50 |
| 34.0 | 42 |
| 35.0 | 41 |
| 36.0 | 50 |
| 37.0 | 50 |
| 38.0 | 50 |
| 39.0 | 50 |
| 40.0 | 50 |
| 41.0 | 50 |
| 42.0 | 50 |
| 43.0 | 50 |
| 44.0 | 50 |
| 45.0 | 50 |
| 46.0 | 39 |
| 47.0 | 12 |
| 48.0 | 50 |
| 49.0 | 50 |
| 50.0 | 11 |
| 51.0 | 12 |
| 52.0 | 18 |
| 53.0 | 50 |
| 54.0 | 11 |
| 55.0 | 17 |
| 56.0 | 50 |
| 57.0 | 50 |
| 58.0 | 3 |
| 59.0 | 35 |
| 60.0 | 50 |
| 61.0 | 15 |
| 62.0 | 16 |
| 63.0 | 50 |
| 64.0 | 50 |
| 65.0 | 50 |
| 66.0 | 11 |
| 67.0 | 13 |
| 68.0 | 50 |
| 69.0 | 13 |
| 70.0 | 50 |
| 71.0 | 40 |
| 72.0 | 50 |
| 73.0 | 19 |
| 74.0 | 50 |
| 75.0 | 50 |
| 76.0 | 50 |
| 77.0 | 41 |
| 78.0 | 50 |
| 79.0 | 42 |
| 80.0 | 50 |
| 81.0 | 50 |
| 82.0 | 14 |
| 83.0 | 50 |
| 84.0 | 50 |
| 85.0 | 50 |
| 86.0 | 50 |
| 87.0 | 50 |
| 88.0 | 50 |
| 89.0 | 16 |
| 90.0 | 50 |
| 91.0 | 38 |
| 92.0 | 38 |
| 93.0 | 18 |
| 94.0 | 19 |
| 95.0 | 33 |
| 96.0 | 50 |
| 97.0 | 50 |
| 98.0 | 25 |
| 99.0 | 50 |
| 100.0 | 39 |
| 101.0 | 11 |
| 102.0 | 50 |
| 103.0 | 23 |
| 104.0 | 18 |
| 105.0 | 50 |
| 106.0 | 41 |
| 107.0 | 15 |
| 108.0 | 50 |
| 109.0 | 18 |
| 110.0 | 50 |
| 111.0 | 50 |
| 112.0 | 50 |
| 113.0 | 50 |
| 114.0 | 12 |
| 115.0 | 13 |
| 116.0 | 15 |
| 117.0 | 15 |
| 118.0 | 12 |
| 119.0 | 18 |
| 120.0 | 22 |
| 121.0 | 21 |
| 122.0 | 50 |
| 123.0 | 50 |
| 124.0 | 17 |
| 125.0 | 12 |
| 126.0 | 17 |
| 127.0 | 12 |
| 128.0 | 11 |
| 129.0 | 18 |
| 130.0 | 50 |
| 131.0 | 26 |
| 132.0 | 15 |
| 133.0 | 50 |
| 134.0 | 14 |
| 135.0 | 29 |
| 136.0 | 49 |
| 137.0 | 50 |
| 138.0 | 50 |
| 139.0 | 50 |
| 140.0 | 50 |
| 141.0 | 35 |
| 142.0 | 50 |
| 143.0 | 50 |
| 144.0 | 17 |
| 145.0 | 10 |
| 146.0 | 12 |
| 147.0 | 14 |
| 148.0 | 50 |
| 149.0 | 33 |
| 150.0 | 18 |
| 151.0 | 50 |
| 152.0 | 20 |
| 153.0 | 50 |
| 154.0 | 50 |
| 155.0 | 50 |
| 156.0 | 14 |
| 157.0 | 50 |
| 158.0 | 50 |
| 159.0 | 50 |
| 160.0 | 50 |
| 161.0 | 41 |
| 162.0 | 50 |
| 163.0 | 50 |
| 164.0 | 26 |
| 165.0 | 20 |
| 166.0 | 13 |
| 167.0 | 50 |
| 168.0 | 50 |
| 169.0 | 50 |
| 170.0 | 16 |
| 171.0 | 50 |
| 172.0 | 50 |
| 173.0 | 11 |
| 174.0 | 11 |
| 175.0 | 18 |
| 176.0 | 10 |
| 177.0 | 50 |
| 178.0 | 50 |
| 179.0 | 50 |
| 180.0 | 50 |
| 181.0 | 50 |
| 182.0 | 50 |
| 183.0 | 50 |
| 184.0 | 50 |
| 185.0 | 50 |
| 186.0 | 50 |
| 187.0 | 43 |
| 188.0 | 50 |
| 189.0 | 50 |
| 190.0 | 50 |
| 191.0 | 50 |
| 192.0 | 24 |
| 193.0 | 13 |
| 194.0 | 50 |
| 195.0 | 50 |
| 196.0 | 50 |
| 197.0 | 50 |
| 198.0 | 14 |
| 199.0 | 33 |
| 200.0 | 50 |
| 201.0 | 50 |
| 202.0 | 50 |
| 203.0 | 50 |
| 204.0 | 50 |
| 205.0 | 50 |
| 206.0 | 16 |
| 207.0 | 50 |
| 208.0 | 45 |
| 209.0 | 50 |
| 210.0 | 50 |
| 211.0 | 50 |
| 212.0 | 22 |
| 213.0 | 18 |
| 214.0 | 15 |
| 215.0 | 18 |
| 216.0 | 27 |
| 217.0 | 10 |
| 218.0 | 12 |
| 219.0 | 15 |
| 220.0 | 10 |
| 221.0 | 14 |
| 222.0 | 14 |
| 223.0 | 50 |
| 224.0 | 13 |
| 225.0 | 48 |
| 226.0 | 18 |
| 227.0 | 50 |
| 228.0 | 11 |
| 229.0 | 16 |
| 230.0 | 50 |
| 231.0 | 22 |
### Training Hyperparameters
- batch_size: (64, 64)
- num_epochs: (30, 30)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 100
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
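
These values correspond one-to-one with the fields of SetFit's `TrainingArguments`. The sketch below shows how a comparable run could be set up; the base sentence-transformer checkpoint and the toy `train_dataset`/`eval_dataset` are placeholders, not the actual base model or data behind this card.

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical base checkpoint and toy data; substitute the real base model and datasets
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-multilingual-mpnet-base-v2")
train_dataset = Dataset.from_dict({"text": ["상품명 A", "상품명 B"], "label": [0.0, 1.0]})
eval_dataset = Dataset.from_dict({"text": ["상품명 C"], "label": [0.0]})

args = TrainingArguments(
    batch_size=(64, 64),                # (embedding phase, classifier phase)
    num_epochs=(30, 30),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    sampling_strategy="oversampling",
    num_iterations=100,
    warmup_proportion=0.1,
    l2_weight=0.01,
    end_to_end=False,
    use_amp=False,
    seed=42,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    metric="accuracy",  # matches the accuracy reported under Evaluation
)
trainer.train()
print(trainer.evaluate())
```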
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:------:|:-------------:|:---------------:|
| 0.0001 | 1 | 0.4775 | - |
| 0.0037 | 50 | 0.4398 | - |
| 0.0075 | 100 | 0.4346 | - |
| 0.0112 | 150 | 0.4312 | - |
| 0.0149 | 200 | 0.4414 | - |
| 0.0187 | 250 | 0.4317 | - |
| 0.0224 | 300 | 0.4304 | - |
| 0.0261 | 350 | 0.4107 | - |
| 0.0299 | 400 | 0.3971 | - |
| 0.0336 | 450 | 0.3888 | - |
| 0.0373 | 500 | 0.3775 | - |
| 0.0411 | 550 | 0.3672 | - |
| 0.0448 | 600 | 0.3485 | - |
| 0.0485 | 650 | 0.311 | - |
| 0.0523 | 700 | 0.2665 | - |
| 0.0560 | 750 | 0.2369 | - |
| 0.0597 | 800 | 0.22 | - |
| 0.0635 | 850 | 0.1967 | - |
| 0.0672 | 900 | 0.1982 | - |
| 0.0709 | 950 | 0.183 | - |
| 0.0747 | 1000 | 0.1649 | - |
| 0.0784 | 1050 | 0.1569 | - |
| 0.0821 | 1100 | 0.1353 | - |
| 0.0859 | 1150 | 0.1388 | - |
| 0.0896 | 1200 | 0.1259 | - |
| 0.0933 | 1250 | 0.1216 | - |
| 0.0971 | 1300 | 0.1101 | - |
| 0.1008 | 1350 | 0.1026 | - |
| 0.1045 | 1400 | 0.0987 | - |
| 0.1083 | 1450 | 0.0936 | - |
| 0.1120 | 1500 | 0.0877 | - |
| 0.1157 | 1550 | 0.0835 | - |
| 0.1195 | 1600 | 0.0818 | - |
| 0.1232 | 1650 | 0.0762 | - |
| 0.1270 | 1700 | 0.0789 | - |
| 0.1307 | 1750 | 0.074 | - |
| 0.1344 | 1800 | 0.0736 | - |
| 0.1382 | 1850 | 0.0712 | - |
| 0.1419 | 1900 | 0.0706 | - |
| 0.1456 | 1950 | 0.0685 | - |
| 0.1494 | 2000 | 0.0647 | - |
| 0.1531 | 2050 | 0.0667 | - |
| 0.1568 | 2100 | 0.0604 | - |
| 0.1606 | 2150 | 0.066 | - |
| 0.1643 | 2200 | 0.0588 | - |
| 0.1680 | 2250 | 0.0616 | - |
| 0.1718 | 2300 | 0.0579 | - |
| 0.1755 | 2350 | 0.057 | - |
| 0.1792 | 2400 | 0.0557 | - |
| 0.1830 | 2450 | 0.057 | - |
| 0.1867 | 2500 | 0.0523 | - |
| 0.1904 | 2550 | 0.0569 | - |
| 0.1942 | 2600 | 0.055 | - |
| 0.1979 | 2650 | 0.0533 | - |
| 0.2016 | 2700 | 0.0509 | - |
| 0.2054 | 2750 | 0.0489 | - |
| 0.2091 | 2800 | 0.0498 | - |
| 0.2128 | 2850 | 0.0508 | - |
| 0.2166 | 2900 | 0.049 | - |
| 0.2203 | 2950 | 0.0492 | - |
| 0.2240 | 3000 | 0.0475 | - |
| 0.2278 | 3050 | 0.0467 | - |
| 0.2315 | 3100 | 0.0469 | - |
| 0.2352 | 3150 | 0.0475 | - |
| 0.2390 | 3200 | 0.0448 | - |
| 0.2427 | 3250 | 0.0441 | - |
| 0.2464 | 3300 | 0.0438 | - |
| 0.2502 | 3350 | 0.0435 | - |
| 0.2539 | 3400 | 0.0447 | - |
| 0.2576 | 3450 | 0.0435 | - |
| 0.2614 | 3500 | 0.0433 | - |
| 0.2651 | 3550 | 0.0441 | - |
| 0.2688 | 3600 | 0.0395 | - |
| 0.2726 | 3650 | 0.0425 | - |
| 0.2763 | 3700 | 0.0404 | - |
| 0.2800 | 3750 | 0.0357 | - |
| 0.2838 | 3800 | 0.0378 | - |
| 0.2875 | 3850 | 0.038 | - |
| 0.2912 | 3900 | 0.037 | - |
| 0.2950 | 3950 | 0.038 | - |
| 0.2987 | 4000 | 0.0374 | - |
| 0.3024 | 4050 | 0.0356 | - |
| 0.3062 | 4100 | 0.0373 | - |
| 0.3099 | 4150 | 0.0357 | - |
| 0.3136 | 4200 | 0.0342 | - |
| 0.3174 | 4250 | 0.0349 | - |
| 0.3211 | 4300 | 0.0332 | - |
| 0.3248 | 4350 | 0.0325 | - |
| 0.3286 | 4400 | 0.0342 | - |
| 0.3323 | 4450 | 0.0325 | - |
| 0.3360 | 4500 | 0.0333 | - |
| 0.3398 | 4550 | 0.0337 | - |
| 0.3435 | 4600 | 0.0293 | - |
| 0.3472 | 4650 | 0.0316 | - |
| 0.3510 | 4700 | 0.03 | - |
| 0.3547 | 4750 | 0.03 | - |
| 0.3584 | 4800 | 0.0319 | - |
| 0.3622 | 4850 | 0.0317 | - |
| 0.3659 | 4900 | 0.0317 | - |
| 0.3697 | 4950 | 0.0309 | - |
| 0.3734 | 5000 | 0.03 | - |
| 0.3771 | 5050 | 0.0279 | - |
| 0.3809 | 5100 | 0.0258 | - |
| 0.3846 | 5150 | 0.0292 | - |
| 0.3883 | 5200 | 0.0278 | - |
| 0.3921 | 5250 | 0.028 | - |
| 0.3958 | 5300 | 0.0269 | - |
| 0.3995 | 5350 | 0.0282 | - |
| 0.4033 | 5400 | 0.0246 | - |
| 0.4070 | 5450 | 0.027 | - |
| 0.4107 | 5500 | 0.0284 | - |
| 0.4145 | 5550 | 0.0277 | - |
| 0.4182 | 5600 | 0.0252 | - |
| 0.4219 | 5650 | 0.026 | - |
| 0.4257 | 5700 | 0.0256 | - |
| 0.4294 | 5750 | 0.0239 | - |
| 0.4331 | 5800 | 0.0236 | - |
| 0.4369 | 5850 | 0.0249 | - |
| 0.4406 | 5900 | 0.0239 | - |
| 0.4443 | 5950 | 0.0224 | - |
| 0.4481 | 6000 | 0.0233 | - |
| 0.4518 | 6050 | 0.024 | - |
| 0.4555 | 6100 | 0.023 | - |
| 0.4593 | 6150 | 0.0234 | - |
| 0.4630 | 6200 | 0.0202 | - |
| 0.4667 | 6250 | 0.0209 | - |
| 0.4705 | 6300 | 0.023 | - |
| 0.4742 | 6350 | 0.0212 | - |
| 0.4779 | 6400 | 0.022 | - |
| 0.4817 | 6450 | 0.0224 | - |
| 0.4854 | 6500 | 0.021 | - |
| 0.4891 | 6550 | 0.0225 | - |
| 0.4929 | 6600 | 0.0226 | - |
| 0.4966 | 6650 | 0.0211 | - |
| 0.5003 | 6700 | 0.021 | - |
| 0.5041 | 6750 | 0.0192 | - |
| 0.5078 | 6800 | 0.0204 | - |
| 0.5115 | 6850 | 0.0201 | - |
| 0.5153 | 6900 | 0.0194 | - |
| 0.5190 | 6950 | 0.0198 | - |
| 0.5227 | 7000 | 0.0182 | - |
| 0.5265 | 7050 | 0.0184 | - |
| 0.5302 | 7100 | 0.0175 | - |
| 0.5339 | 7150 | 0.0192 | - |
| 0.5377 | 7200 | 0.0172 | - |
| 0.5414 | 7250 | 0.0178 | - |
| 0.5451 | 7300 | 0.0174 | - |
| 0.5489 | 7350 | 0.0189 | - |
| 0.5526 | 7400 | 0.0176 | - |
| 0.5563 | 7450 | 0.0195 | - |
| 0.5601 | 7500 | 0.017 | - |
| 0.5638 | 7550 | 0.0179 | - |
| 0.5675 | 7600 | 0.0149 | - |
| 0.5713 | 7650 | 0.0156 | - |
| 0.5750 | 7700 | 0.0166 | - |
| 0.5787 | 7750 | 0.0156 | - |
| 0.5825 | 7800 | 0.0177 | - |
| 0.5862 | 7850 | 0.0179 | - |
| 0.5899 | 7900 | 0.0143 | - |
| 0.5937 | 7950 | 0.015 | - |
| 0.5974 | 8000 | 0.0153 | - |
| 0.6012 | 8050 | 0.0158 | - |
| 0.6049 | 8100 | 0.0157 | - |
| 0.6086 | 8150 | 0.0143 | - |
| 0.6124 | 8200 | 0.0162 | - |
| 0.6161 | 8250 | 0.0153 | - |
| 0.6198 | 8300 | 0.0155 | - |
| 0.6236 | 8350 | 0.0145 | - |
| 0.6273 | 8400 | 0.0133 | - |
| 0.6310 | 8450 | 0.0145 | - |
| 0.6348 | 8500 | 0.0138 | - |
| 0.6385 | 8550 | 0.0142 | - |
| 0.6422 | 8600 | 0.0144 | - |
| 0.6460 | 8650 | 0.014 | - |
| 0.6497 | 8700 | 0.014 | - |
| 0.6534 | 8750 | 0.0149 | - |
| 0.6572 | 8800 | 0.012 | - |
| 0.6609 | 8850 | 0.0129 | - |
| 0.6646 | 8900 | 0.0119 | - |
| 0.6684 | 8950 | 0.0128 | - |
| 0.6721 | 9000 | 0.0134 | - |
| 0.6758 | 9050 | 0.0129 | - |
| 0.6796 | 9100 | 0.0124 | - |
| 0.6833 | 9150 | 0.0147 | - |
| 0.6870 | 9200 | 0.0127 | - |
| 0.6908 | 9250 | 0.0132 | - |
| 0.6945 | 9300 | 0.0118 | - |
| 0.6982 | 9350 | 0.0144 | - |
| 0.7020 | 9400 | 0.0117 | - |
| 0.7057 | 9450 | 0.01 | - |
| 0.7094 | 9500 | 0.011 | - |
| 0.7132 | 9550 | 0.0111 | - |
| 0.7169 | 9600 | 0.0122 | - |
| 0.7206 | 9650 | 0.0092 | - |
| 0.7244 | 9700 | 0.011 | - |
| 0.7281 | 9750 | 0.0109 | - |
| 0.7318 | 9800 | 0.0114 | - |
| 0.7356 | 9850 | 0.0101 | - |
| 0.7393 | 9900 | 0.0104 | - |
| 0.7430 | 9950 | 0.0127 | - |
| 0.7468 | 10000 | 0.0091 | - |
| 0.7505 | 10050 | 0.0092 | - |
| 0.7542 | 10100 | 0.0109 | - |
| 0.7580 | 10150 | 0.0113 | - |
| 0.7617 | 10200 | 0.0101 | - |
| 0.7654 | 10250 | 0.0096 | - |
| 0.7692 | 10300 | 0.0104 | - |
| 0.7729 | 10350 | 0.0107 | - |
| 0.7766 | 10400 | 0.0113 | - |
| 0.7804 | 10450 | 0.0102 | - |
| 0.7841 | 10500 | 0.0103 | - |
| 0.7878 | 10550 | 0.0092 | - |
| 0.7916 | 10600 | 0.008 | - |
| 0.7953 | 10650 | 0.0102 | - |
| 0.7990 | 10700 | 0.0093 | - |
| 0.8028 | 10750 | 0.0085 | - |
| 0.8065 | 10800 | 0.009 | - |
| 0.8102 | 10850 | 0.0072 | - |
| 0.8140 | 10900 | 0.0078 | - |
| 0.8177 | 10950 | 0.011 | - |
| 0.8214 | 11000 | 0.0087 | - |
| 0.8252 | 11050 | 0.0098 | - |
| 0.8289 | 11100 | 0.0087 | - |
| 0.8326 | 11150 | 0.0094 | - |
| 0.8364 | 11200 | 0.0077 | - |
| 0.8401 | 11250 | 0.0084 | - |
| 0.8439 | 11300 | 0.0082 | - |
| 0.8476 | 11350 | 0.0087 | - |
| 0.8513 | 11400 | 0.0084 | - |
| 0.8551 | 11450 | 0.0106 | - |
| 0.8588 | 11500 | 0.0095 | - |
| 0.8625 | 11550 | 0.0086 | - |
| 0.8663 | 11600 | 0.0077 | - |
| 0.8700 | 11650 | 0.0071 | - |
| 0.8737 | 11700 | 0.0077 | - |
| 0.8775 | 11750 | 0.008 | - |
| 0.8812 | 11800 | 0.0083 | - |
| 0.8849 | 11850 | 0.0082 | - |
| 0.8887 | 11900 | 0.0081 | - |
| 0.8924 | 11950 | 0.0074 | - |
| 0.8961 | 12000 | 0.0086 | - |
| 0.8999 | 12050 | 0.0082 | - |
| 0.9036 | 12100 | 0.0086 | - |
| 0.9073 | 12150 | 0.0083 | - |
| 0.9111 | 12200 | 0.008 | - |
| 0.9148 | 12250 | 0.0079 | - |
| 0.9185 | 12300 | 0.0082 | - |
| 0.9223 | 12350 | 0.0066 | - |
| 0.9260 | 12400 | 0.0064 | - |
| 0.9297 | 12450 | 0.0075 | - |
| 0.9335 | 12500 | 0.0088 | - |
| 0.9372 | 12550 | 0.0075 | - |
| 0.9409 | 12600 | 0.0074 | - |
| 0.9447 | 12650 | 0.008 | - |
| 0.9484 | 12700 | 0.0067 | - |
| 0.9521 | 12750 | 0.0074 | - |
| 0.9559 | 12800 | 0.0075 | - |
| 0.9596 | 12850 | 0.0059 | - |
| 0.9633 | 12900 | 0.0091 | - |
| 0.9671 | 12950 | 0.008 | - |
| 0.9708 | 13000 | 0.0093 | - |
| 0.9745 | 13050 | 0.0067 | - |
| 0.9783 | 13100 | 0.0084 | - |
| 0.9820 | 13150 | 0.0066 | - |
| 0.9857 | 13200 | 0.0069 | - |
| 0.9895 | 13250 | 0.0063 | - |
| 0.9932 | 13300 | 0.007 | - |
| 0.9969 | 13350 | 0.0074 | - |
| 1.0007 | 13400 | 0.0076 | - |
| 1.0044 | 13450 | 0.0067 | - |
| 1.0081 | 13500 | 0.0062 | - |
| 1.0119 | 13550 | 0.0083 | - |
| 1.0156 | 13600 | 0.0058 | - |
| 1.0193 | 13650 | 0.0047 | - |
| 1.0231 | 13700 | 0.007 | - |
| 1.0268 | 13750 | 0.0082 | - |
| 1.0305 | 13800 | 0.0069 | - |
| 1.0343 | 13850 | 0.0055 | - |
| 1.0380 | 13900 | 0.0066 | - |
| 1.0417 | 13950 | 0.0069 | - |
| 1.0455 | 14000 | 0.0067 | - |
| 1.0492 | 14050 | 0.0061 | - |
| 1.0529 | 14100 | 0.0063 | - |
| 1.0567 | 14150 | 0.0053 | - |
| 1.0604 | 14200 | 0.0065 | - |
| 1.0641 | 14250 | 0.0059 | - |
| 1.0679 | 14300 | 0.0078 | - |
| 1.0716 | 14350 | 0.0057 | - |
| 1.0753 | 14400 | 0.0062 | - |
| 1.0791 | 14450 | 0.0061 | - |
| 1.0828 | 14500 | 0.0063 | - |
| 1.0866 | 14550 | 0.0067 | - |
| 1.0903 | 14600 | 0.0062 | - |
| 1.0940 | 14650 | 0.0065 | - |
| 1.0978 | 14700 | 0.0048 | - |
| 1.1015 | 14750 | 0.0049 | - |
| 1.1052 | 14800 | 0.0059 | - |
| 1.1090 | 14850 | 0.0062 | - |
| 1.1127 | 14900 | 0.005 | - |
| 1.1164 | 14950 | 0.0059 | - |
| 1.1202 | 15000 | 0.0049 | - |
| 1.1239 | 15050 | 0.0048 | - |
| 1.1276 | 15100 | 0.0058 | - |
| 1.1314 | 15150 | 0.0059 | - |
| 1.1351 | 15200 | 0.0069 | - |
| 1.1388 | 15250 | 0.0071 | - |
| 1.1426 | 15300 | 0.0063 | - |
| 1.1463 | 15350 | 0.0049 | - |
| 1.1500 | 15400 | 0.0048 | - |
| 1.1538 | 15450 | 0.0057 | - |
| 1.1575 | 15500 | 0.006 | - |
| 1.1612 | 15550 | 0.0049 | - |
| 1.1650 | 15600 | 0.0051 | - |
| 1.1687 | 15650 | 0.0057 | - |
| 1.1724 | 15700 | 0.0057 | - |
| 1.1762 | 15750 | 0.0054 | - |
| 1.1799 | 15800 | 0.0054 | - |
| 1.1836 | 15850 | 0.0051 | - |
| 1.1874 | 15900 | 0.0051 | - |
| 1.1911 | 15950 | 0.005 | - |
| 1.1948 | 16000 | 0.0053 | - |
| 1.1986 | 16050 | 0.005 | - |
| 1.2023 | 16100 | 0.0055 | - |
| 1.2060 | 16150 | 0.0052 | - |
| 1.2098 | 16200 | 0.0063 | - |
| 1.2135 | 16250 | 0.0059 | - |
| 1.2172 | 16300 | 0.0058 | - |
| 1.2210 | 16350 | 0.0055 | - |
| 1.2247 | 16400 | 0.0051 | - |
| 1.2284 | 16450 | 0.0049 | - |
| 1.2322 | 16500 | 0.0049 | - |
| 1.2359 | 16550 | 0.0051 | - |
| 1.2396 | 16600 | 0.0048 | - |
| 1.2434 | 16650 | 0.0053 | - |
| 1.2471 | 16700 | 0.0054 | - |
| 1.2508 | 16750 | 0.0044 | - |
| 1.2546 | 16800 | 0.0054 | - |
| 1.2583 | 16850 | 0.0048 | - |
| 1.2620 | 16900 | 0.0061 | - |
| 1.2658 | 16950 | 0.0048 | - |
| 1.2695 | 17000 | 0.0039 | - |
| 1.2732 | 17050 | 0.0044 | - |
| 1.2770 | 17100 | 0.0065 | - |
| 1.2807 | 17150 | 0.0052 | - |
| 1.2844 | 17200 | 0.0045 | - |
| 1.2882 | 17250 | 0.005 | - |
| 1.2919 | 17300 | 0.0031 | - |
| 1.2956 | 17350 | 0.0041 | - |
| 1.2994 | 17400 | 0.0051 | - |
| 1.3031 | 17450 | 0.0049 | - |
| 1.3068 | 17500 | 0.006 | - |
| 1.3106 | 17550 | 0.0051 | - |
| 1.3143 | 17600 | 0.0044 | - |
| 1.3180 | 17650 | 0.0054 | - |
| 1.3218 | 17700 | 0.0054 | - |
| 1.3255 | 17750 | 0.0047 | - |
| 1.3293 | 17800 | 0.0046 | - |
| 1.3330 | 17850 | 0.004 | - |
| 1.3367 | 17900 | 0.0044 | - |
| 1.3405 | 17950 | 0.0047 | - |
| 1.3442 | 18000 | 0.0054 | - |
| 1.3479 | 18050 | 0.0041 | - |
| 1.3517 | 18100 | 0.0046 | - |
| 1.3554 | 18150 | 0.0059 | - |
| 1.3591 | 18200 | 0.005 | - |
| 1.3629 | 18250 | 0.0042 | - |
| 1.3666 | 18300 | 0.0047 | - |
| 1.3703 | 18350 | 0.0041 | - |
| 1.3741 | 18400 | 0.0048 | - |
| 1.3778 | 18450 | 0.0032 | - |
| 1.3815 | 18500 | 0.0044 | - |
| 1.3853 | 18550 | 0.0038 | - |
| 1.3890 | 18600 | 0.0033 | - |
| 1.3927 | 18650 | 0.0033 | - |
| 1.3965 | 18700 | 0.0053 | - |
| 1.4002 | 18750 | 0.0042 | - |
| 1.4039 | 18800 | 0.0036 | - |
| 1.4077 | 18850 | 0.0044 | - |
| 1.4114 | 18900 | 0.0044 | - |
| 1.4151 | 18950 | 0.0026 | - |
| 1.4189 | 19000 | 0.0042 | - |
| 1.4226 | 19050 | 0.0041 | - |
| 1.4263 | 19100 | 0.0034 | - |
| 1.4301 | 19150 | 0.0042 | - |
| 1.4338 | 19200 | 0.0049 | - |
| 1.4375 | 19250 | 0.0039 | - |
| 1.4413 | 19300 | 0.0036 | - |
| 1.4450 | 19350 | 0.005 | - |
| 1.4487 | 19400 | 0.0044 | - |
| 1.4525 | 19450 | 0.0058 | - |
| 1.4562 | 19500 | 0.0037 | - |
| 1.4599 | 19550 | 0.0043 | - |
| 1.4637 | 19600 | 0.0038 | - |
| 1.4674 | 19650 | 0.0032 | - |
| 1.4711 | 19700 | 0.0032 | - |
| 1.4749 | 19750 | 0.0052 | - |
| 1.4786 | 19800 | 0.0034 | - |
| 1.4823 | 19850 | 0.004 | - |
| 1.4861 | 19900 | 0.004 | - |
| 1.4898 | 19950 | 0.0049 | - |
| 1.4935 | 20000 | 0.0037 | - |
| 1.4973 | 20050 | 0.0038 | - |
| 1.5010 | 20100 | 0.0045 | - |
| 1.5047 | 20150 | 0.0043 | - |
| 1.5085 | 20200 | 0.0038 | - |
| 1.5122 | 20250 | 0.0028 | - |
| 1.5159 | 20300 | 0.0036 | - |
| 1.5197 | 20350 | 0.0035 | - |
| 1.5234 | 20400 | 0.0037 | - |
| 1.5271 | 20450 | 0.0044 | - |
| 1.5309 | 20500 | 0.0031 | - |
| 1.5346 | 20550 | 0.0038 | - |
| 1.5383 | 20600 | 0.0036 | - |
| 1.5421 | 20650 | 0.0038 | - |
| 1.5458 | 20700 | 0.0027 | - |
| 1.5495 | 20750 | 0.003 | - |
| 1.5533 | 20800 | 0.0026 | - |
| 1.5570 | 20850 | 0.0036 | - |
| 1.5607 | 20900 | 0.0038 | - |
| 1.5645 | 20950 | 0.0034 | - |
| 1.5682 | 21000 | 0.0036 | - |
| 1.5720 | 21050 | 0.0046 | - |
| 1.5757 | 21100 | 0.0039 | - |
| 1.5794 | 21150 | 0.0033 | - |
| 1.5832 | 21200 | 0.0028 | - |
| 1.5869 | 21250 | 0.0035 | - |
| 1.5906 | 21300 | 0.003 | - |
| 1.5944 | 21350 | 0.0034 | - |
| 1.5981 | 21400 | 0.0032 | - |
| 1.6018 | 21450 | 0.0031 | - |
| 1.6056 | 21500 | 0.0024 | - |
| 1.6093 | 21550 | 0.0031 | - |
| 1.6130 | 21600 | 0.0035 | - |
| 1.6168 | 21650 | 0.0038 | - |
| 1.6205 | 21700 | 0.0033 | - |
| 1.6242 | 21750 | 0.0038 | - |
| 1.6280 | 21800 | 0.0033 | - |
| 1.6317 | 21850 | 0.0047 | - |
| 1.6354 | 21900 | 0.0034 | - |
| 1.6392 | 21950 | 0.0046 | - |
| 1.6429 | 22000 | 0.0039 | - |
| 1.6466 | 22050 | 0.0035 | - |
| 1.6504 | 22100 | 0.003 | - |
| 1.6541 | 22150 | 0.0034 | - |
| 1.6578 | 22200 | 0.004 | - |
| 1.6616 | 22250 | 0.0015 | - |
| 1.6653 | 22300 | 0.0036 | - |
| 1.6690 | 22350 | 0.0023 | - |
| 1.6728 | 22400 | 0.0031 | - |
| 1.6765 | 22450 | 0.0032 | - |
| 1.6802 | 22500 | 0.0038 | - |
| 1.6840 | 22550 | 0.0035 | - |
| 1.6877 | 22600 | 0.0031 | - |
| 1.6914 | 22650 | 0.0036 | - |
| 1.6952 | 22700 | 0.0027 | - |
| 1.6989 | 22750 | 0.0027 | - |
| 1.7026 | 22800 | 0.0031 | - |
| 1.7064 | 22850 | 0.0042 | - |
| 1.7101 | 22900 | 0.0033 | - |
| 1.7138 | 22950 | 0.0029 | - |
| 1.7176 | 23000 | 0.0028 | - |
| 1.7213 | 23050 | 0.0018 | - |
| 1.7250 | 23100 | 0.0028 | - |
| 1.7288 | 23150 | 0.0032 | - |
| 1.7325 | 23200 | 0.0037 | - |
| 1.7362 | 23250 | 0.003 | - |
| 1.7400 | 23300 | 0.0039 | - |
| 1.7437 | 23350 | 0.0027 | - |
| 1.7474 | 23400 | 0.0032 | - |
| 1.7512 | 23450 | 0.0037 | - |
| 1.7549 | 23500 | 0.0022 | - |
| 1.7586 | 23550 | 0.0026 | - |
| 1.7624 | 23600 | 0.0036 | - |
| 1.7661 | 23650 | 0.0027 | - |
| 1.7698 | 23700 | 0.0026 | - |
| 1.7736 | 23750 | 0.003 | - |
| 1.7773 | 23800 | 0.0036 | - |
| 1.7810 | 23850 | 0.0027 | - |
| 1.7848 | 23900 | 0.0033 | - |
| 1.7885 | 23950 | 0.0034 | - |
| 1.7922 | 24000 | 0.0028 | - |
| 1.7960 | 24050 | 0.003 | - |
| 1.7997 | 24100 | 0.0028 | - |
| 1.8035 | 24150 | 0.0021 | - |
| 1.8072 | 24200 | 0.0027 | - |
| 1.8109 | 24250 | 0.0028 | - |
| 1.8147 | 24300 | 0.0029 | - |
| 1.8184 | 24350 | 0.002 | - |
| 1.8221 | 24400 | 0.0022 | - |
| 1.8259 | 24450 | 0.002 | - |
| 1.8296 | 24500 | 0.0025 | - |
| 1.8333 | 24550 | 0.0025 | - |
| 1.8371 | 24600 | 0.0025 | - |
| 1.8408 | 24650 | 0.0028 | - |
| 1.8445 | 24700 | 0.002 | - |
| 1.8483 | 24750 | 0.0029 | - |
| 1.8520 | 24800 | 0.0024 | - |
| 1.8557 | 24850 | 0.0023 | - |
| 1.8595 | 24900 | 0.0025 | - |
| 1.8632 | 24950 | 0.002 | - |
| 1.8669 | 25000 | 0.0031 | - |
| 1.8707 | 25050 | 0.0021 | - |
| 1.8744 | 25100 | 0.0025 | - |
| 1.8781 | 25150 | 0.0032 | - |
| 1.8819 | 25200 | 0.0041 | - |
| 1.8856 | 25250 | 0.0048 | - |
| 1.8893 | 25300 | 0.0023 | - |
| 1.8931 | 25350 | 0.0032 | - |
| 1.8968 | 25400 | 0.0026 | - |
| 1.9005 | 25450 | 0.0037 | - |
| 1.9043 | 25500 | 0.0019 | - |
| 1.9080 | 25550 | 0.0022 | - |
| 1.9117 | 25600 | 0.0025 | - |
| 1.9155 | 25650 | 0.0031 | - |
| 1.9192 | 25700 | 0.0018 | - |
| 1.9229 | 25750 | 0.002 | - |
| 1.9267 | 25800 | 0.0018 | - |
| 1.9304 | 25850 | 0.0025 | - |
| 1.9341 | 25900 | 0.0021 | - |
| 1.9379 | 25950 | 0.0019 | - |
| 1.9416 | 26000 | 0.0018 | - |
| 1.9453 | 26050 | 0.003 | - |
| 1.9491 | 26100 | 0.0021 | - |
| 1.9528 | 26150 | 0.0029 | - |
| 1.9565 | 26200 | 0.0031 | - |
| 1.9603 | 26250 | 0.0023 | - |
| 1.9640 | 26300 | 0.003 | - |
| 1.9677 | 26350 | 0.003 | - |
| 1.9715 | 26400 | 0.0021 | - |
| 1.9752 | 26450 | 0.0028 | - |
| 1.9789 | 26500 | 0.0027 | - |
| 1.9827 | 26550 | 0.0021 | - |
| 1.9864 | 26600 | 0.0016 | - |
| 1.9901 | 26650 | 0.0021 | - |
| 1.9939 | 26700 | 0.0021 | - |
| 1.9976 | 26750 | 0.0032 | - |
| 2.0013 | 26800 | 0.0022 | - |
| 2.0051 | 26850 | 0.0023 | - |
| 2.0088 | 26900 | 0.0025 | - |
| 2.0125 | 26950 | 0.0017 | - |
| 2.0163 | 27000 | 0.0015 | - |
| 2.0200 | 27050 | 0.0011 | - |
| 2.0237 | 27100 | 0.0016 | - |
| 2.0275 | 27150 | 0.0015 | - |
| 2.0312 | 27200 | 0.002 | - |
| 2.0349 | 27250 | 0.0024 | - |
| 2.0387 | 27300 | 0.003 | - |
| 2.0424 | 27350 | 0.0023 | - |
| 2.0462 | 27400 | 0.0013 | - |
| 2.0499 | 27450 | 0.0027 | - |
| 2.0536 | 27500 | 0.0048 | - |
| 2.0574 | 27550 | 0.0027 | - |
| 2.0611 | 27600 | 0.0027 | - |
| 2.0648 | 27650 | 0.0029 | - |
| 2.0686 | 27700 | 0.0019 | - |
| 2.0723 | 27750 | 0.0026 | - |
| 2.0760 | 27800 | 0.0029 | - |
| 2.0798 | 27850 | 0.0024 | - |
| 2.0835 | 27900 | 0.0034 | - |
| 2.0872 | 27950 | 0.0026 | - |
| 2.0910 | 28000 | 0.0024 | - |
| 2.0947 | 28050 | 0.0018 | - |
| 2.0984 | 28100 | 0.0021 | - |
| 2.1022 | 28150 | 0.0022 | - |
| 2.1059 | 28200 | 0.0023 | - |
| 2.1096 | 28250 | 0.0015 | - |
| 2.1134 | 28300 | 0.0027 | - |
| 2.1171 | 28350 | 0.0018 | - |
| 2.1208 | 28400 | 0.0008 | - |
| 2.1246 | 28450 | 0.0025 | - |
| 2.1283 | 28500 | 0.0027 | - |
| 2.1320 | 28550 | 0.0029 | - |
| 2.1358 | 28600 | 0.0022 | - |
| 2.1395 | 28650 | 0.0026 | - |
| 2.1432 | 28700 | 0.0038 | - |
| 2.1470 | 28750 | 0.0037 | - |
| 2.1507 | 28800 | 0.0024 | - |
| 2.1544 | 28850 | 0.0028 | - |
| 2.1582 | 28900 | 0.0028 | - |
| 2.1619 | 28950 | 0.0028 | - |
| 2.1656 | 29000 | 0.0023 | - |
| 2.1694 | 29050 | 0.0019 | - |
| 2.1731 | 29100 | 0.0024 | - |
| 2.1768 | 29150 | 0.0028 | - |
| 2.1806 | 29200 | 0.0026 | - |
| 2.1843 | 29250 | 0.0023 | - |
| 2.1880 | 29300 | 0.0015 | - |
| 2.1918 | 29350 | 0.0035 | - |
| 2.1955 | 29400 | 0.0028 | - |
| 2.1992 | 29450 | 0.0024 | - |
| 2.2030 | 29500 | 0.0015 | - |
| 2.2067 | 29550 | 0.0021 | - |
| 2.2104 | 29600 | 0.002 | - |
| 2.2142 | 29650 | 0.0019 | - |
| 2.2179 | 29700 | 0.002 | - |
| 2.2216 | 29750 | 0.0019 | - |
| 2.2254 | 29800 | 0.002 | - |
| 2.2291 | 29850 | 0.0019 | - |
| 2.2328 | 29900 | 0.002 | - |
| 2.2366 | 29950 | 0.0025 | - |
| 2.2403 | 30000 | 0.0026 | - |
| 2.2440 | 30050 | 0.0027 | - |
| 2.2478 | 30100 | 0.0022 | - |
| 2.2515 | 30150 | 0.0019 | - |
| 2.2552 | 30200 | 0.0025 | - |
| 2.2590 | 30250 | 0.0022 | - |
| 2.2627 | 30300 | 0.0018 | - |
| 2.2664 | 30350 | 0.0017 | - |
| 2.2702 | 30400 | 0.0015 | - |
| 2.2739 | 30450 | 0.0017 | - |
| 2.2776 | 30500 | 0.0016 | - |
| 2.2814 | 30550 | 0.0011 | - |
| 2.2851 | 30600 | 0.0012 | - |
| 2.2889 | 30650 | 0.0016 | - |
| 2.2926 | 30700 | 0.0019 | - |
| 2.2963 | 30750 | 0.0017 | - |
| 2.3001 | 30800 | 0.0026 | - |
| 2.3038 | 30850 | 0.0023 | - |
| 2.3075 | 30900 | 0.0021 | - |
| 2.3113 | 30950 | 0.0028 | - |
| 2.3150 | 31000 | 0.0011 | - |
| 2.3187 | 31050 | 0.0024 | - |
| 2.3225 | 31100 | 0.0026 | - |
| 2.3262 | 31150 | 0.0026 | - |
| 2.3299 | 31200 | 0.0021 | - |
| 2.3337 | 31250 | 0.0024 | - |
| 2.3374 | 31300 | 0.001 | - |
| 2.3411 | 31350 | 0.0021 | - |
| 2.3449 | 31400 | 0.0015 | - |
| 2.3486 | 31450 | 0.0017 | - |
| 2.3523 | 31500 | 0.0015 | - |
| 2.3561 | 31550 | 0.0005 | - |
| 2.3598 | 31600 | 0.0019 | - |
| 2.3635 | 31650 | 0.002 | - |
| 2.3673 | 31700 | 0.0022 | - |
| 2.3710 | 31750 | 0.0033 | - |
| 2.3747 | 31800 | 0.0016 | - |
| 2.3785 | 31850 | 0.0013 | - |
| 2.3822 | 31900 | 0.0022 | - |
| 2.3859 | 31950 | 0.0022 | - |
| 2.3897 | 32000 | 0.0039 | - |
| 2.3934 | 32050 | 0.0025 | - |
| 2.3971 | 32100 | 0.0035 | - |
| 2.4009 | 32150 | 0.0018 | - |
| 2.4046 | 32200 | 0.0019 | - |
| 2.4083 | 32250 | 0.0016 | - |
| 2.4121 | 32300 | 0.0022 | - |
| 2.4158 | 32350 | 0.0017 | - |
| 2.4195 | 32400 | 0.0027 | - |
| 2.4233 | 32450 | 0.0027 | - |
| 2.4270 | 32500 | 0.0014 | - |
| 2.4307 | 32550 | 0.0032 | - |
| 2.4345 | 32600 | 0.002 | - |
| 2.4382 | 32650 | 0.0014 | - |
| 2.4419 | 32700 | 0.0022 | - |
| 2.4457 | 32750 | 0.0018 | - |
| 2.4494 | 32800 | 0.0015 | - |
| 2.4531 | 32850 | 0.0023 | - |
| 2.4569 | 32900 | 0.0023 | - |
| 2.4606 | 32950 | 0.0018 | - |
| 2.4643 | 33000 | 0.002 | - |
| 2.4681 | 33050 | 0.0019 | - |
| 2.4718 | 33100 | 0.002 | - |
| 2.4755 | 33150 | 0.0023 | - |
| 2.4793 | 33200 | 0.0013 | - |
| 2.4830 | 33250 | 0.0015 | - |
| 2.4867 | 33300 | 0.001 | - |
| 2.4905 | 33350 | 0.0018 | - |
| 2.4942 | 33400 | 0.0015 | - |
| 2.4979 | 33450 | 0.0013 | - |
| 2.5017 | 33500 | 0.0017 | - |
| 2.5054 | 33550 | 0.002 | - |
| 2.5091 | 33600 | 0.0014 | - |
| 2.5129 | 33650 | 0.0012 | - |
| 2.5166 | 33700 | 0.0014 | - |
| 2.5203 | 33750 | 0.0024 | - |
| 2.5241 | 33800 | 0.0016 | - |
| 2.5278 | 33850 | 0.0017 | - |
| 2.5316 | 33900 | 0.0016 | - |
| 2.5353 | 33950 | 0.0015 | - |
| 2.5390 | 34000 | 0.0019 | - |
| 2.5428 | 34050 | 0.0012 | - |
| 2.5465 | 34100 | 0.0021 | - |
| 2.5502 | 34150 | 0.0019 | - |
| 2.5540 | 34200 | 0.0018 | - |
| 2.5577 | 34250 | 0.0028 | - |
| 2.5614 | 34300 | 0.0035 | - |
| 2.5652 | 34350 | 0.0034 | - |
| 2.5689 | 34400 | 0.0028 | - |
| 2.5726 | 34450 | 0.0034 | - |
| 2.5764 | 34500 | 0.003 | - |
| 2.5801 | 34550 | 0.0019 | - |
| 2.5838 | 34600 | 0.0026 | - |
| 2.5876 | 34650 | 0.0026 | - |
| 2.5913 | 34700 | 0.0029 | - |
| 2.5950 | 34750 | 0.0029 | - |
| 2.5988 | 34800 | 0.0025 | - |
| 2.6025 | 34850 | 0.0018 | - |
| 2.6062 | 34900 | 0.003 | - |
| 2.6100 | 34950 | 0.0021 | - |
| 2.6137 | 35000 | 0.0014 | - |
| 2.6174 | 35050 | 0.0013 | - |
| 2.6212 | 35100 | 0.0015 | - |
| 2.6249 | 35150 | 0.0016 | - |
| 2.6286 | 35200 | 0.0016 | - |
| 2.6324 | 35250 | 0.0016 | - |
| 2.6361 | 35300 | 0.0013 | - |
| 2.6398 | 35350 | 0.0019 | - |
| 2.6436 | 35400 | 0.0016 | - |
| 2.6473 | 35450 | 0.002 | - |
| 2.6510 | 35500 | 0.0019 | - |
| 2.6548 | 35550 | 0.0017 | - |
| 2.6585 | 35600 | 0.0016 | - |
| 2.6622 | 35650 | 0.0011 | - |
| 2.6660 | 35700 | 0.0022 | - |
| 2.6697 | 35750 | 0.0015 | - |
| 2.6734 | 35800 | 0.0012 | - |
| 2.6772 | 35850 | 0.0017 | - |
| 2.6809 | 35900 | 0.002 | - |
| 2.6846 | 35950 | 0.0013 | - |
| 2.6884 | 36000 | 0.0015 | - |
| 2.6921 | 36050 | 0.0014 | - |
| 2.6958 | 36100 | 0.0014 | - |
| 2.6996 | 36150 | 0.0021 | - |
| 2.7033 | 36200 | 0.0021 | - |
| 2.7070 | 36250 | 0.0015 | - |
| 2.7108 | 36300 | 0.001 | - |
| 2.7145 | 36350 | 0.0011 | - |
| 2.7182 | 36400 | 0.0013 | - |
| 2.7220 | 36450 | 0.0021 | - |
| 2.7257 | 36500 | 0.001 | - |
| 2.7294 | 36550 | 0.0016 | - |
| 2.7332 | 36600 | 0.0018 | - |
| 2.7369 | 36650 | 0.001 | - |
| 2.7406 | 36700 | 0.0014 | - |
| 2.7444 | 36750 | 0.002 | - |
| 2.7481 | 36800 | 0.0032 | - |
| 2.7518 | 36850 | 0.0011 | - |
| 2.7556 | 36900 | 0.0018 | - |
| 2.7593 | 36950 | 0.0024 | - |
| 2.7630 | 37000 | 0.0015 | - |
| 2.7668 | 37050 | 0.0023 | - |
| 2.7705 | 37100 | 0.0019 | - |
| 2.7743 | 37150 | 0.0015 | - |
| 2.7780 | 37200 | 0.0012 | - |
| 2.7817 | 37250 | 0.0009 | - |
| 2.7855 | 37300 | 0.0013 | - |
| 2.7892 | 37350 | 0.0016 | - |
| 2.7929 | 37400 | 0.0018 | - |
| 2.7967 | 37450 | 0.0026 | - |
| 2.8004 | 37500 | 0.0016 | - |
| 2.8041 | 37550 | 0.0017 | - |
| 2.8079 | 37600 | 0.0022 | - |
| 2.8116 | 37650 | 0.0025 | - |
| 2.8153 | 37700 | 0.0013 | - |
| 2.8191 | 37750 | 0.0022 | - |
| 2.8228 | 37800 | 0.0018 | - |
| 2.8265 | 37850 | 0.002 | - |
| 2.8303 | 37900 | 0.0018 | - |
| 2.8340 | 37950 | 0.0031 | - |
| 2.8377 | 38000 | 0.0019 | - |
| 2.8415 | 38050 | 0.0017 | - |
| 2.8452 | 38100 | 0.0024 | - |
| 2.8489 | 38150 | 0.0016 | - |
| 2.8527 | 38200 | 0.0019 | - |
| 2.8564 | 38250 | 0.0025 | - |
| 2.8601 | 38300 | 0.0025 | - |
| 2.8639 | 38350 | 0.0024 | - |
| 2.8676 | 38400 | 0.002 | - |
| 2.8713 | 38450 | 0.0018 | - |
| 2.8751 | 38500 | 0.0013 | - |
| 2.8788 | 38550 | 0.0011 | - |
| 2.8825 | 38600 | 0.002 | - |
| 2.8863 | 38650 | 0.0014 | - |
| 2.8900 | 38700 | 0.0011 | - |
| 2.8937 | 38750 | 0.0018 | - |
| 2.8975 | 38800 | 0.0027 | - |
| 2.9012 | 38850 | 0.0011 | - |
| 2.9049 | 38900 | 0.001 | - |
| 2.9087 | 38950 | 0.0012 | - |
| 2.9124 | 39000 | 0.0016 | - |
| 2.9161 | 39050 | 0.0011 | - |
| 2.9199 | 39100 | 0.0016 | - |
| 2.9236 | 39150 | 0.0018 | - |
| 2.9273 | 39200 | 0.0017 | - |
| 2.9311 | 39250 | 0.0016 | - |
| 2.9348 | 39300 | 0.0029 | - |
| 2.9385 | 39350 | 0.0011 | - |
| 2.9423 | 39400 | 0.0015 | - |
| 2.9460 | 39450 | 0.0017 | - |
| 2.9497 | 39500 | 0.0022 | - |
| 2.9535 | 39550 | 0.0012 | - |
| 2.9572 | 39600 | 0.0018 | - |
| 2.9609 | 39650 | 0.0015 | - |
| 2.9647 | 39700 | 0.0015 | - |
| 2.9684 | 39750 | 0.0009 | - |
| 2.9721 | 39800 | 0.0015 | - |
| 2.9759 | 39850 | 0.0009 | - |
| 2.9796 | 39900 | 0.0011 | - |
| 2.9833 | 39950 | 0.0008 | - |
| 2.9871 | 40000 | 0.001 | - |
| 2.9908 | 40050 | 0.0011 | - |
| 2.9945 | 40100 | 0.0012 | - |
| 2.9983 | 40150 | 0.0014 | - |
| 3.0020 | 40200 | 0.0014 | - |
| 3.0058 | 40250 | 0.0015 | - |
| 3.0095 | 40300 | 0.0014 | - |
| 3.0132 | 40350 | 0.0009 | - |
| 3.0170 | 40400 | 0.0014 | - |
| 3.0207 | 40450 | 0.0009 | - |
| 3.0244 | 40500 | 0.0014 | - |
| 3.0282 | 40550 | 0.0014 | - |
| 3.0319 | 40600 | 0.0011 | - |
| 3.0356 | 40650 | 0.0017 | - |
| 3.0394 | 40700 | 0.0025 | - |
| 3.0431 | 40750 | 0.0036 | - |
| 3.0468 | 40800 | 0.0018 | - |
| 3.0506 | 40850 | 0.001 | - |
| 3.0543 | 40900 | 0.0021 | - |
| 3.0580 | 40950 | 0.0023 | - |
| 3.0618 | 41000 | 0.0019 | - |
| 3.0655 | 41050 | 0.0018 | - |
| 3.0692 | 41100 | 0.0021 | - |
| 3.0730 | 41150 | 0.0018 | - |
| 3.0767 | 41200 | 0.0018 | - |
| 3.0804 | 41250 | 0.0008 | - |
| 3.0842 | 41300 | 0.0019 | - |
| 3.0879 | 41350 | 0.0007 | - |
| 3.0916 | 41400 | 0.0006 | - |
| 3.0954 | 41450 | 0.0009 | - |
| 3.0991 | 41500 | 0.0006 | - |
| 3.1028 | 41550 | 0.0005 | - |
| 3.1066 | 41600 | 0.0013 | - |
| 3.1103 | 41650 | 0.0006 | - |
| 3.1140 | 41700 | 0.0006 | - |
| 3.1178 | 41750 | 0.0009 | - |
| 3.1215 | 41800 | 0.0011 | - |
| 3.1252 | 41850 | 0.0007 | - |
| 3.1290 | 41900 | 0.0008 | - |
| 3.1327 | 41950 | 0.0008 | - |
| 3.1364 | 42000 | 0.0008 | - |
| 3.1402 | 42050 | 0.0006 | - |
| 3.1439 | 42100 | 0.0005 | - |
| 3.1476 | 42150 | 0.0005 | - |
| 3.1514 | 42200 | 0.0007 | - |
| 3.1551 | 42250 | 0.001 | - |
| 3.1588 | 42300 | 0.0011 | - |
| 3.1626 | 42350 | 0.0007 | - |
| 3.1663 | 42400 | 0.001 | - |
| 3.1700 | 42450 | 0.0007 | - |
| 3.1738 | 42500 | 0.0005 | - |
| 3.1775 | 42550 | 0.001 | - |
| 3.1812 | 42600 | 0.0004 | - |
| 3.1850 | 42650 | 0.0006 | - |
| 3.1887 | 42700 | 0.0007 | - |
| 3.1924 | 42750 | 0.0007 | - |
| 3.1962 | 42800 | 0.001 | - |
| 3.1999 | 42850 | 0.0014 | - |
| 3.2036 | 42900 | 0.0029 | - |
| 3.2074 | 42950 | 0.0047 | - |
| 3.2111 | 43000 | 0.0034 | - |
| 3.2148 | 43050 | 0.0029 | - |
| 3.2186 | 43100 | 0.0021 | - |
| 3.2223 | 43150 | 0.0015 | - |
| 3.2260 | 43200 | 0.0016 | - |
| 3.2298 | 43250 | 0.0015 | - |
| 3.2335 | 43300 | 0.0012 | - |
| 3.2372 | 43350 | 0.0012 | - |
| 3.2410 | 43400 | 0.0017 | - |
| 3.2447 | 43450 | 0.0018 | - |
| 3.2485 | 43500 | 0.0011 | - |
| 3.2522 | 43550 | 0.0024 | - |
| 3.2559 | 43600 | 0.002 | - |
| 3.2597 | 43650 | 0.0014 | - |
| 3.2634 | 43700 | 0.0024 | - |
| 3.2671 | 43750 | 0.0019 | - |
| 3.2709 | 43800 | 0.0006 | - |
| 3.2746 | 43850 | 0.0013 | - |
| 3.2783 | 43900 | 0.0008 | - |
| 3.2821 | 43950 | 0.0018 | - |
| 3.2858 | 44000 | 0.0012 | - |
| 3.2895 | 44050 | 0.0013 | - |
| 3.2933 | 44100 | 0.0013 | - |
| 3.2970 | 44150 | 0.0009 | - |
| 3.3007 | 44200 | 0.0018 | - |
| 3.3045 | 44250 | 0.0005 | - |
| 3.3082 | 44300 | 0.0018 | - |
| 3.3119 | 44350 | 0.0007 | - |
| 3.3157 | 44400 | 0.0006 | - |
| 3.3194 | 44450 | 0.0013 | - |
| 3.3231 | 44500 | 0.0013 | - |
| 3.3269 | 44550 | 0.0014 | - |
| 3.3306 | 44600 | 0.0019 | - |
| 3.3343 | 44650 | 0.0007 | - |
| 3.3381 | 44700 | 0.0016 | - |
| 3.3418 | 44750 | 0.0014 | - |
| 3.3455 | 44800 | 0.0008 | - |
| 3.3493 | 44850 | 0.0002 | - |
| 3.3530 | 44900 | 0.0008 | - |
| 3.3567 | 44950 | 0.0012 | - |
| 3.3605 | 45000 | 0.0009 | - |
| 3.3642 | 45050 | 0.0014 | - |
| 3.3679 | 45100 | 0.0007 | - |
| 3.3717 | 45150 | 0.0004 | - |
| 3.3754 | 45200 | 0.0007 | - |
| 3.3791 | 45250 | 0.0013 | - |
| 3.3829 | 45300 | 0.0009 | - |
| 3.3866 | 45350 | 0.0014 | - |
| 3.3903 | 45400 | 0.0014 | - |
| 3.3941 | 45450 | 0.0016 | - |
| 3.3978 | 45500 | 0.0011 | - |
| 3.4015 | 45550 | 0.0007 | - |
| 3.4053 | 45600 | 0.002 | - |
| 3.4090 | 45650 | 0.0028 | - |
| 3.4127 | 45700 | 0.0025 | - |
| 3.4165 | 45750 | 0.0012 | - |
| 3.4202 | 45800 | 0.001 | - |
| 3.4239 | 45850 | 0.0006 | - |
| 3.4277 | 45900 | 0.0016 | - |
| 3.4314 | 45950 | 0.0025 | - |
| 3.4351 | 46000 | 0.0011 | - |
| 3.4389 | 46050 | 0.002 | - |
| 3.4426 | 46100 | 0.0019 | - |
| 3.4463 | 46150 | 0.0016 | - |
| 3.4501 | 46200 | 0.0019 | - |
| 3.4538 | 46250 | 0.0013 | - |
| 3.4575 | 46300 | 0.0017 | - |
| 3.4613 | 46350 | 0.0011 | - |
| 3.4650 | 46400 | 0.0011 | - |
| 3.4687 | 46450 | 0.0011 | - |
| 3.4725 | 46500 | 0.0008 | - |
| 3.4762 | 46550 | 0.0014 | - |
| 3.4799 | 46600 | 0.0009 | - |
| 3.4837 | 46650 | 0.001 | - |
| 3.4874 | 46700 | 0.0014 | - |
| 3.4912 | 46750 | 0.0007 | - |
| 3.4949 | 46800 | 0.0013 | - |
| 3.4986 | 46850 | 0.0018 | - |
| 3.5024 | 46900 | 0.0014 | - |
| 3.5061 | 46950 | 0.0011 | - |
| 3.5098 | 47000 | 0.0012 | - |
| 3.5136 | 47050 | 0.0008 | - |
| 3.5173 | 47100 | 0.0007 | - |
| 3.5210 | 47150 | 0.0011 | - |
| 3.5248 | 47200 | 0.0016 | - |
| 3.5285 | 47250 | 0.0008 | - |
| 3.5322 | 47300 | 0.0003 | - |
| 3.5360 | 47350 | 0.0009 | - |
| 3.5397 | 47400 | 0.001 | - |
| 3.5434 | 47450 | 0.0008 | - |
| 3.5472 | 47500 | 0.0013 | - |
| 3.5509 | 47550 | 0.0012 | - |
| 3.5546 | 47600 | 0.0016 | - |
| 3.5584 | 47650 | 0.0014 | - |
| 3.5621 | 47700 | 0.0022 | - |
| 3.5658 | 47750 | 0.0018 | - |
| 3.5696 | 47800 | 0.0017 | - |
| 3.5733 | 47850 | 0.0015 | - |
| 3.5770 | 47900 | 0.0018 | - |
| 3.5808 | 47950 | 0.0009 | - |
| 3.5845 | 48000 | 0.0014 | - |
| 3.5882 | 48050 | 0.0016 | - |
| 3.5920 | 48100 | 0.0011 | - |
| 3.5957 | 48150 | 0.0006 | - |
| 3.5994 | 48200 | 0.0012 | - |
| 3.6032 | 48250 | 0.0011 | - |
| 3.6069 | 48300 | 0.0016 | - |
| 3.6106 | 48350 | 0.0014 | - |
| 3.6144 | 48400 | 0.0012 | - |
| 3.6181 | 48450 | 0.0015 | - |
| 3.6218 | 48500 | 0.0008 | - |
| 3.6256 | 48550 | 0.0011 | - |
| 3.6293 | 48600 | 0.0009 | - |
| 3.6330 | 48650 | 0.0007 | - |
| 3.6368 | 48700 | 0.0011 | - |
| 3.6405 | 48750 | 0.001 | - |
| 3.6442 | 48800 | 0.0005 | - |
| 3.6480 | 48850 | 0.001 | - |
| 3.6517 | 48900 | 0.0007 | - |
| 3.6554 | 48950 | 0.0009 | - |
| 3.6592 | 49000 | 0.0006 | - |
| 3.6629 | 49050 | 0.0012 | - |
| 3.6666 | 49100 | 0.0014 | - |
| 3.6704 | 49150 | 0.0011 | - |
| 3.6741 | 49200 | 0.0003 | - |
| 3.6778 | 49250 | 0.0013 | - |
| 3.6816 | 49300 | 0.0004 | - |
| 3.6853 | 49350 | 0.0009 | - |
| 3.6890 | 49400 | 0.0012 | - |
| 3.6928 | 49450 | 0.0006 | - |
| 3.6965 | 49500 | 0.0009 | - |
| 3.7002 | 49550 | 0.0012 | - |
| 3.7040 | 49600 | 0.0009 | - |
| 3.7077 | 49650 | 0.0008 | - |
| 3.7114 | 49700 | 0.0009 | - |
| 3.7152 | 49750 | 0.0006 | - |
| 3.7189 | 49800 | 0.0009 | - |
| 3.7226 | 49850 | 0.0009 | - |
| 3.7264 | 49900 | 0.0014 | - |
| 3.7301 | 49950 | 0.0011 | - |
| 3.7339 | 50000 | 0.0011 | - |
| 3.7376 | 50050 | 0.0004 | - |
| 3.7413 | 50100 | 0.0009 | - |
| 3.7451 | 50150 | 0.0016 | - |
| 3.7488 | 50200 | 0.0009 | - |
| 3.7525 | 50250 | 0.0012 | - |
| 3.7563 | 50300 | 0.0008 | - |
| 3.7600 | 50350 | 0.0005 | - |
| 3.7637 | 50400 | 0.0011 | - |
| 3.7675 | 50450 | 0.0008 | - |
| 3.7712 | 50500 | 0.0009 | - |
| 3.7749 | 50550 | 0.0013 | - |
| 3.7787 | 50600 | 0.0008 | - |
| 3.7824 | 50650 | 0.001 | - |
| 3.7861 | 50700 | 0.0006 | - |
| 3.7899 | 50750 | 0.0008 | - |
| 3.7936 | 50800 | 0.0028 | - |
| 3.7973 | 50850 | 0.0027 | - |
| 3.8011 | 50900 | 0.0021 | - |
| 3.8048 | 50950 | 0.003 | - |
| 3.8085 | 51000 | 0.0022 | - |
| 3.8123 | 51050 | 0.0011 | - |
| 3.8160 | 51100 | 0.0013 | - |
| 3.8197 | 51150 | 0.0009 | - |
| 3.8235 | 51200 | 0.0008 | - |
| 3.8272 | 51250 | 0.0016 | - |
| 3.8309 | 51300 | 0.0017 | - |
| 3.8347 | 51350 | 0.0012 | - |
| 3.8384 | 51400 | 0.0005 | - |
| 3.8421 | 51450 | 0.0011 | - |
| 3.8459 | 51500 | 0.0012 | - |
| 3.8496 | 51550 | 0.0006 | - |
| 3.8533 | 51600 | 0.0009 | - |
| 3.8571 | 51650 | 0.0015 | - |
| 3.8608 | 51700 | 0.0006 | - |
| 3.8645 | 51750 | 0.0005 | - |
| 3.8683 | 51800 | 0.001 | - |
| 3.8720 | 51850 | 0.0009 | - |
| 3.8757 | 51900 | 0.0012 | - |
| 3.8795 | 51950 | 0.0004 | - |
| 3.8832 | 52000 | 0.002 | - |
| 3.8869 | 52050 | 0.001 | - |
| 3.8907 | 52100 | 0.0013 | - |
| 3.8944 | 52150 | 0.0017 | - |
| 3.8981 | 52200 | 0.0028 | - |
| 3.9019 | 52250 | 0.0027 | - |
| 3.9056 | 52300 | 0.0017 | - |
| 3.9093 | 52350 | 0.0017 | - |
| 3.9131 | 52400 | 0.0013 | - |
| 3.9168 | 52450 | 0.0013 | - |
| 3.9205 | 52500 | 0.0014 | - |
| 3.9243 | 52550 | 0.0009 | - |
| 3.9280 | 52600 | 0.001 | - |
| 3.9317 | 52650 | 0.0014 | - |
| 3.9355 | 52700 | 0.0014 | - |
| 3.9392 | 52750 | 0.001 | - |
| 3.9429 | 52800 | 0.001 | - |
| 3.9467 | 52850 | 0.0014 | - |
| 3.9504 | 52900 | 0.0018 | - |
| 3.9541 | 52950 | 0.0009 | - |
| 3.9579 | 53000 | 0.0012 | - |
| 3.9616 | 53050 | 0.0006 | - |
| 3.9653 | 53100 | 0.0015 | - |
| 3.9691 | 53150 | 0.0013 | - |
| 3.9728 | 53200 | 0.0013 | - |
| 3.9766 | 53250 | 0.0011 | - |
| 3.9803 | 53300 | 0.0014 | - |
| 3.9840 | 53350 | 0.0007 | - |
| 3.9878 | 53400 | 0.0007 | - |
| 3.9915 | 53450 | 0.0007 | - |
| 3.9952 | 53500 | 0.0004 | - |
| 3.9990 | 53550 | 0.0006 | - |
| 4.0027 | 53600 | 0.0011 | - |
| 4.0064 | 53650 | 0.0009 | - |
| 4.0102 | 53700 | 0.001 | - |
| 4.0139 | 53750 | 0.0014 | - |
| 4.0176 | 53800 | 0.002 | - |
| 4.0214 | 53850 | 0.0016 | - |
| 4.0251 | 53900 | 0.0021 | - |
| 4.0288 | 53950 | 0.0017 | - |
| 4.0326 | 54000 | 0.0009 | - |
| 4.0363 | 54050 | 0.0008 | - |
| 4.0400 | 54100 | 0.0012 | - |
| 4.0438 | 54150 | 0.0014 | - |
| 4.0475 | 54200 | 0.0008 | - |
| 4.0512 | 54250 | 0.0009 | - |
| 4.0550 | 54300 | 0.0014 | - |
| 4.0587 | 54350 | 0.001 | - |
| 4.0624 | 54400 | 0.0004 | - |
| 4.0662 | 54450 | 0.0003 | - |
| 4.0699 | 54500 | 0.0012 | - |
| 4.0736 | 54550 | 0.0006 | - |
| 4.0774 | 54600 | 0.0004 | - |
| 4.0811 | 54650 | 0.001 | - |
| 4.0848 | 54700 | 0.0006 | - |
| 4.0886 | 54750 | 0.0008 | - |
| 4.0923 | 54800 | 0.0012 | - |
| 4.0960 | 54850 | 0.0009 | - |
| 4.0998 | 54900 | 0.0013 | - |
| 4.1035 | 54950 | 0.0009 | - |
| 4.1072 | 55000 | 0.0005 | - |
| 4.1110 | 55050 | 0.0009 | - |
| 4.1147 | 55100 | 0.0008 | - |
| 4.1184 | 55150 | 0.0003 | - |
| 4.1222 | 55200 | 0.0007 | - |
| 4.1259 | 55250 | 0.0004 | - |
| 4.1296 | 55300 | 0.0009 | - |
| 4.1334 | 55350 | 0.001 | - |
| 4.1371 | 55400 | 0.0015 | - |
| 4.1408 | 55450 | 0.0016 | - |
| 4.1446 | 55500 | 0.0014 | - |
| 4.1483 | 55550 | 0.002 | - |
| 4.1520 | 55600 | 0.0014 | - |
| 4.1558 | 55650 | 0.0022 | - |
| 4.1595 | 55700 | 0.0007 | - |
| 4.1632 | 55750 | 0.0008 | - |
| 4.1670 | 55800 | 0.0011 | - |
| 4.1707 | 55850 | 0.0011 | - |
| 4.1744 | 55900 | 0.0009 | - |
| 4.1782 | 55950 | 0.0011 | - |
| 4.1819 | 56000 | 0.0009 | - |
| 4.1856 | 56050 | 0.0004 | - |
| 4.1894 | 56100 | 0.0012 | - |
| 4.1931 | 56150 | 0.001 | - |
| 4.1968 | 56200 | 0.001 | - |
| 4.2006 | 56250 | 0.0009 | - |
| 4.2043 | 56300 | 0.001 | - |
| 4.2081 | 56350 | 0.0007 | - |
| 4.2118 | 56400 | 0.0013 | - |
| 4.2155 | 56450 | 0.0012 | - |
| 4.2193 | 56500 | 0.0008 | - |
| 4.2230 | 56550 | 0.0005 | - |
| 4.2267 | 56600 | 0.0007 | - |
| 4.2305 | 56650 | 0.0007 | - |
| 4.2342 | 56700 | 0.001 | - |
| 4.2379 | 56750 | 0.0009 | - |
| 4.2417 | 56800 | 0.0005 | - |
| 4.2454 | 56850 | 0.0006 | - |
| 4.2491 | 56900 | 0.0007 | - |
| 4.2529 | 56950 | 0.0008 | - |
| 4.2566 | 57000 | 0.0006 | - |
| 4.2603 | 57050 | 0.0004 | - |
| 4.2641 | 57100 | 0.0008 | - |
| 4.2678 | 57150 | 0.0013 | - |
| 4.2715 | 57200 | 0.0003 | - |
| 4.2753 | 57250 | 0.0005 | - |
| 4.2790 | 57300 | 0.0005 | - |
| 4.2827 | 57350 | 0.0011 | - |
| 4.2865 | 57400 | 0.0007 | - |
| 4.2902 | 57450 | 0.0007 | - |
| 4.2939 | 57500 | 0.0013 | - |
| 4.2977 | 57550 | 0.0008 | - |
| 4.3014 | 57600 | 0.0007 | - |
| 4.3051 | 57650 | 0.0001 | - |
| 4.3089 | 57700 | 0.0007 | - |
| 4.3126 | 57750 | 0.0005 | - |
| 4.3163 | 57800 | 0.0002 | - |
| 4.3201 | 57850 | 0.0006 | - |
| 4.3238 | 57900 | 0.0003 | - |
| 4.3275 | 57950 | 0.0004 | - |
| 4.3313 | 58000 | 0.0007 | - |
| 4.3350 | 58050 | 0.0009 | - |
| 4.3387 | 58100 | 0.002 | - |
| 4.3425 | 58150 | 0.0013 | - |
| 4.3462 | 58200 | 0.0023 | - |
| 4.3499 | 58250 | 0.0016 | - |
| 4.3537 | 58300 | 0.0016 | - |
| 4.3574 | 58350 | 0.0008 | - |
| 4.3611 | 58400 | 0.0018 | - |
| 4.3649 | 58450 | 0.0009 | - |
| 4.3686 | 58500 | 0.0011 | - |
| 4.3723 | 58550 | 0.0009 | - |
| 4.3761 | 58600 | 0.001 | - |
| 4.3798 | 58650 | 0.0005 | - |
| 4.3835 | 58700 | 0.0017 | - |
| 4.3873 | 58750 | 0.001 | - |
| 4.3910 | 58800 | 0.001 | - |
| 4.3947 | 58850 | 0.0004 | - |
| 4.3985 | 58900 | 0.0011 | - |
| 4.4022 | 58950 | 0.0006 | - |
| 4.4059 | 59000 | 0.0005 | - |
| 4.4097 | 59050 | 0.0005 | - |
| 4.4134 | 59100 | 0.0002 | - |
| 4.4171 | 59150 | 0.0011 | - |
| 4.4209 | 59200 | 0.001 | - |
| 4.4246 | 59250 | 0.0005 | - |
| 4.4283 | 59300 | 0.0007 | - |
| 4.4321 | 59350 | 0.0006 | - |
| 4.4358 | 59400 | 0.0005 | - |
| 4.4395 | 59450 | 0.0007 | - |
| 4.4433 | 59500 | 0.0007 | - |
| 4.4470 | 59550 | 0.0012 | - |
| 4.4508 | 59600 | 0.0012 | - |
| 4.4545 | 59650 | 0.0013 | - |
| 4.4582 | 59700 | 0.001 | - |
| 4.4620 | 59750 | 0.0006 | - |
| 4.4657 | 59800 | 0.001 | - |
| 4.4694 | 59850 | 0.0005 | - |
| 4.4732 | 59900 | 0.0008 | - |
| 4.4769 | 59950 | 0.0008 | - |
| 4.4806 | 60000 | 0.0006 | - |
| 4.4844 | 60050 | 0.0008 | - |
| 4.4881 | 60100 | 0.0001 | - |
| 4.4918 | 60150 | 0.0011 | - |
| 4.4956 | 60200 | 0.0011 | - |
| 4.4993 | 60250 | 0.0014 | - |
| 4.5030 | 60300 | 0.0007 | - |
| 4.5068 | 60350 | 0.0011 | - |
| 4.5105 | 60400 | 0.0007 | - |
| 4.5142 | 60450 | 0.0009 | - |
| 4.5180 | 60500 | 0.0009 | - |
| 4.5217 | 60550 | 0.0004 | - |
| 4.5254 | 60600 | 0.0004 | - |
| 4.5292 | 60650 | 0.0007 | - |
| 4.5329 | 60700 | 0.0002 | - |
| 4.5366 | 60750 | 0.0008 | - |
| 4.5404 | 60800 | 0.001 | - |
| 4.5441 | 60850 | 0.001 | - |
| 4.5478 | 60900 | 0.0008 | - |
| 4.5516 | 60950 | 0.0009 | - |
| 4.5553 | 61000 | 0.0011 | - |
| 4.5590 | 61050 | 0.0008 | - |
| 4.5628 | 61100 | 0.001 | - |
| 4.5665 | 61150 | 0.0004 | - |
| 4.5702 | 61200 | 0.0009 | - |
| 4.5740 | 61250 | 0.001 | - |
| 4.5777 | 61300 | 0.0011 | - |
| 4.5814 | 61350 | 0.0007 | - |
| 4.5852 | 61400 | 0.0002 | - |
| 4.5889 | 61450 | 0.0004 | - |
| 4.5926 | 61500 | 0.0007 | - |
| 4.5964 | 61550 | 0.0006 | - |
| 4.6001 | 61600 | 0.0011 | - |
| 4.6038 | 61650 | 0.0007 | - |
| 4.6076 | 61700 | 0.0008 | - |
| 4.6113 | 61750 | 0.0011 | - |
| 4.6150 | 61800 | 0.0007 | - |
| 4.6188 | 61850 | 0.0005 | - |
| 4.6225 | 61900 | 0.0003 | - |
| 4.6262 | 61950 | 0.0007 | - |
| 4.6300 | 62000 | 0.0002 | - |
| 4.6337 | 62050 | 0.0008 | - |
| 4.6374 | 62100 | 0.0009 | - |
| 4.6412 | 62150 | 0.0002 | - |
| 4.6449 | 62200 | 0.0004 | - |
| 4.6486 | 62250 | 0.0005 | - |
| 4.6524 | 62300 | 0.0003 | - |
| 4.6561 | 62350 | 0.0005 | - |
| 4.6598 | 62400 | 0.0006 | - |
| 4.6636 | 62450 | 0.0008 | - |
| 4.6673 | 62500 | 0.0004 | - |
| 4.6710 | 62550 | 0.0007 | - |
| 4.6748 | 62600 | 0.001 | - |
| 4.6785 | 62650 | 0.0002 | - |
| 4.6822 | 62700 | 0.0005 | - |
| 4.6860 | 62750 | 0.0006 | - |
| 4.6897 | 62800 | 0.0008 | - |
| 4.6935 | 62850 | 0.001 | - |
| 4.6972 | 62900 | 0.0029 | - |
| 4.7009 | 62950 | 0.0019 | - |
| 4.7047 | 63000 | 0.0016 | - |
| 4.7084 | 63050 | 0.0013 | - |
| 4.7121 | 63100 | 0.0014 | - |
| 4.7159 | 63150 | 0.0023 | - |
| 4.7196 | 63200 | 0.0009 | - |
| 4.7233 | 63250 | 0.0018 | - |
| 4.7271 | 63300 | 0.0021 | - |
| 4.7308 | 63350 | 0.0008 | - |
| 4.7345 | 63400 | 0.0012 | - |
| 4.7383 | 63450 | 0.0017 | - |
| 4.7420 | 63500 | 0.0006 | - |
| 4.7457 | 63550 | 0.0018 | - |
| 4.7495 | 63600 | 0.0015 | - |
| 4.7532 | 63650 | 0.0014 | - |
| 4.7569 | 63700 | 0.0009 | - |
| 4.7607 | 63750 | 0.0009 | - |
| 4.7644 | 63800 | 0.0006 | - |
| 4.7681 | 63850 | 0.0006 | - |
| 4.7719 | 63900 | 0.0013 | - |
| 4.7756 | 63950 | 0.001 | - |
| 4.7793 | 64000 | 0.0008 | - |
| 4.7831 | 64050 | 0.0005 | - |
| 4.7868 | 64100 | 0.0017 | - |
| 4.7905 | 64150 | 0.0006 | - |
| 4.7943 | 64200 | 0.0012 | - |
| 4.7980 | 64250 | 0.0005 | - |
| 4.8017 | 64300 | 0.0005 | - |
| 4.8055 | 64350 | 0.0006 | - |
| 4.8092 | 64400 | 0.0009 | - |
| 4.8129 | 64450 | 0.0009 | - |
| 4.8167 | 64500 | 0.0006 | - |
| 4.8204 | 64550 | 0.001 | - |
| 4.8241 | 64600 | 0.001 | - |
| 4.8279 | 64650 | 0.0001 | - |
| 4.8316 | 64700 | 0.0005 | - |
| 4.8353 | 64750 | 0.0004 | - |
| 4.8391 | 64800 | 0.0006 | - |
| 4.8428 | 64850 | 0.0004 | - |
| 4.8465 | 64900 | 0.0004 | - |
| 4.8503 | 64950 | 0.0005 | - |
| 4.8540 | 65000 | 0.0006 | - |
| 4.8577 | 65050 | 0.0007 | - |
| 4.8615 | 65100 | 0.0003 | - |
| 4.8652 | 65150 | 0.0005 | - |
| 4.8689 | 65200 | 0.0007 | - |
| 4.8727 | 65250 | 0.0008 | - |
| 4.8764 | 65300 | 0.0005 | - |
| 4.8801 | 65350 | 0.0006 | - |
| 4.8839 | 65400 | 0.001 | - |
| 4.8876 | 65450 | 0.0001 | - |
| 4.8913 | 65500 | 0.0004 | - |
| 4.8951 | 65550 | 0.0007 | - |
| 4.8988 | 65600 | 0.0006 | - |
| 4.9025 | 65650 | 0.0006 | - |
| 4.9063 | 65700 | 0.0005 | - |
| 4.9100 | 65750 | 0.0006 | - |
| 4.9137 | 65800 | 0.0008 | - |
| 4.9175 | 65850 | 0.0015 | - |
| 4.9212 | 65900 | 0.0019 | - |
| 4.9249 | 65950 | 0.0011 | - |
| 4.9287 | 66000 | 0.0014 | - |
| 4.9324 | 66050 | 0.0008 | - |
| 4.9362 | 66100 | 0.0011 | - |
| 4.9399 | 66150 | 0.0007 | - |
| 4.9436 | 66200 | 0.001 | - |
| 4.9474 | 66250 | 0.0005 | - |
| 4.9511 | 66300 | 0.0007 | - |
| 4.9548 | 66350 | 0.0011 | - |
| 4.9586 | 66400 | 0.0009 | - |
| 4.9623 | 66450 | 0.0008 | - |
| 4.9660 | 66500 | 0.0009 | - |
| 4.9698 | 66550 | 0.0006 | - |
| 4.9735 | 66600 | 0.0006 | - |
| 4.9772 | 66650 | 0.0002 | - |
| 4.9810 | 66700 | 0.0006 | - |
| 4.9847 | 66750 | 0.0004 | - |
| 4.9884 | 66800 | 0.0007 | - |
| 4.9922 | 66850 | 0.0009 | - |
| 4.9959 | 66900 | 0.0008 | - |
| 4.9996 | 66950 | 0.0003 | - |
| 5.0034 | 67000 | 0.0008 | - |
| 5.0071 | 67050 | 0.001 | - |
| 5.0108 | 67100 | 0.0007 | - |
| 5.0146 | 67150 | 0.0013 | - |
| 5.0183 | 67200 | 0.0011 | - |
| 5.0220 | 67250 | 0.0003 | - |
| 5.0258 | 67300 | 0.0004 | - |
| 5.0295 | 67350 | 0.0009 | - |
| 5.0332 | 67400 | 0.0005 | - |
| 5.0370 | 67450 | 0.0001 | - |
| 5.0407 | 67500 | 0.0003 | - |
| 5.0444 | 67550 | 0.0007 | - |
| 5.0482 | 67600 | 0.0007 | - |
| 5.0519 | 67650 | 0.0011 | - |
| 5.0556 | 67700 | 0.0007 | - |
| 5.0594 | 67750 | 0.0006 | - |
| 5.0631 | 67800 | 0.0006 | - |
| 5.0668 | 67850 | 0.0005 | - |
| 5.0706 | 67900 | 0.0006 | - |
| 5.0743 | 67950 | 0.0006 | - |
| 5.0780 | 68000 | 0.0003 | - |
| 5.0818 | 68050 | 0.0009 | - |
| 5.0855 | 68100 | 0.0007 | - |
| 5.0892 | 68150 | 0.0006 | - |
| 5.0930 | 68200 | 0.0003 | - |
| 5.0967 | 68250 | 0.0016 | - |
| 5.1004 | 68300 | 0.0006 | - |
| 5.1042 | 68350 | 0.0006 | - |
| 5.1079 | 68400 | 0.0005 | - |
| 5.1116 | 68450 | 0.0003 | - |
| 5.1154 | 68500 | 0.0006 | - |
| 5.1191 | 68550 | 0.0008 | - |
| 5.1228 | 68600 | 0.0005 | - |
| 5.1266 | 68650 | 0.0011 | - |
| 5.1303 | 68700 | 0.0018 | - |
| 5.1340 | 68750 | 0.0013 | - |
| 5.1378 | 68800 | 0.0017 | - |
| 5.1415 | 68850 | 0.0009 | - |
| 5.1452 | 68900 | 0.0009 | - |
| 5.1490 | 68950 | 0.0018 | - |
| 5.1527 | 69000 | 0.0012 | - |
| 5.1564 | 69050 | 0.0012 | - |
| 5.1602 | 69100 | 0.0015 | - |
| 5.1639 | 69150 | 0.0006 | - |
| 5.1676 | 69200 | 0.0008 | - |
| 5.1714 | 69250 | 0.0022 | - |
| 5.1751 | 69300 | 0.0013 | - |
| 5.1789 | 69350 | 0.0008 | - |
| 5.1826 | 69400 | 0.0009 | - |
| 5.1863 | 69450 | 0.0006 | - |
| 5.1901 | 69500 | 0.0012 | - |
| 5.1938 | 69550 | 0.0011 | - |
| 5.1975 | 69600 | 0.0007 | - |
| 5.2013 | 69650 | 0.0005 | - |
| 5.2050 | 69700 | 0.0008 | - |
| 5.2087 | 69750 | 0.0009 | - |
| 5.2125 | 69800 | 0.0005 | - |
| 5.2162 | 69850 | 0.0008 | - |
| 5.2199 | 69900 | 0.0009 | - |
| 5.2237 | 69950 | 0.0008 | - |
| 5.2274 | 70000 | 0.0006 | - |
| 5.2311 | 70050 | 0.0004 | - |
| 5.2349 | 70100 | 0.0009 | - |
| 5.2386 | 70150 | 0.0009 | - |
| 5.2423 | 70200 | 0.0008 | - |
| 5.2461 | 70250 | 0.0006 | - |
| 5.2498 | 70300 | 0.0003 | - |
| 5.2535 | 70350 | 0.0014 | - |
| 5.2573 | 70400 | 0.0006 | - |
| 5.2610 | 70450 | 0.0005 | - |
| 5.2647 | 70500 | 0.0008 | - |
| 5.2685 | 70550 | 0.0007 | - |
| 5.2722 | 70600 | 0.0001 | - |
| 5.2759 | 70650 | 0.0007 | - |
| 5.2797 | 70700 | 0.0005 | - |
| 5.2834 | 70750 | 0.0007 | - |
| 5.2871 | 70800 | 0.0004 | - |
| 5.2909 | 70850 | 0.0001 | - |
| 5.2946 | 70900 | 0.0005 | - |
| 5.2983 | 70950 | 0.0003 | - |
| 5.3021 | 71000 | 0.0008 | - |
| 5.3058 | 71050 | 0.0007 | - |
| 5.3095 | 71100 | 0.0002 | - |
| 5.3133 | 71150 | 0.0009 | - |
| 5.3170 | 71200 | 0.0006 | - |
| 5.3207 | 71250 | 0.0008 | - |
| 5.3245 | 71300 | 0.001 | - |
| 5.3282 | 71350 | 0.0009 | - |
| 5.3319 | 71400 | 0.0005 | - |
| 5.3357 | 71450 | 0.0011 | - |
| 5.3394 | 71500 | 0.0012 | - |
| 5.3431 | 71550 | 0.0011 | - |
| 5.3469 | 71600 | 0.0012 | - |
| 5.3506 | 71650 | 0.0007 | - |
| 5.3543 | 71700 | 0.0009 | - |
| 5.3581 | 71750 | 0.0011 | - |
| 5.3618 | 71800 | 0.0013 | - |
| 5.3655 | 71850 | 0.0008 | - |
| 5.3693 | 71900 | 0.0011 | - |
| 5.3730 | 71950 | 0.0007 | - |
| 5.3767 | 72000 | 0.0008 | - |
| 5.3805 | 72050 | 0.0011 | - |
| 5.3842 | 72100 | 0.001 | - |
| 5.3879 | 72150 | 0.0006 | - |
| 5.3917 | 72200 | 0.0008 | - |
| 5.3954 | 72250 | 0.0004 | - |
| 5.3991 | 72300 | 0.0007 | - |
| 5.4029 | 72350 | 0.001 | - |
| 5.4066 | 72400 | 0.0007 | - |
| 5.4104 | 72450 | 0.0006 | - |
| 5.4141 | 72500 | 0.0008 | - |
| 5.4178 | 72550 | 0.0009 | - |
| 5.4216 | 72600 | 0.0005 | - |
| 5.4253 | 72650 | 0.001 | - |
| 5.4290 | 72700 | 0.0009 | - |
| 5.4328 | 72750 | 0.0006 | - |
| 5.4365 | 72800 | 0.0011 | - |
| 5.4402 | 72850 | 0.0003 | - |
| 5.4440 | 72900 | 0.001 | - |
| 5.4477 | 72950 | 0.0007 | - |
| 5.4514 | 73000 | 0.0009 | - |
| 5.4552 | 73050 | 0.0007 | - |
| 5.4589 | 73100 | 0.0003 | - |
| 5.4626 | 73150 | 0.0003 | - |
| 5.4664 | 73200 | 0.0003 | - |
| 5.4701 | 73250 | 0.0006 | - |
| 5.4738 | 73300 | 0.0004 | - |
| 5.4776 | 73350 | 0.0006 | - |
| 5.4813 | 73400 | 0.0007 | - |
| 5.4850 | 73450 | 0.0005 | - |
| 5.4888 | 73500 | 0.0006 | - |
| 5.4925 | 73550 | 0.0008 | - |
| 5.4962 | 73600 | 0.0009 | - |
| 5.5000 | 73650 | 0.0012 | - |
| 5.5037 | 73700 | 0.0008 | - |
| 5.5074 | 73750 | 0.0011 | - |
| 5.5112 | 73800 | 0.0013 | - |
| 5.5149 | 73850 | 0.0008 | - |
| 5.5186 | 73900 | 0.001 | - |
| 5.5224 | 73950 | 0.0012 | - |
| 5.5261 | 74000 | 0.0005 | - |
| 5.5298 | 74050 | 0.0013 | - |
| 5.5336 | 74100 | 0.0007 | - |
| 5.5373 | 74150 | 0.0006 | - |
| 5.5410 | 74200 | 0.0008 | - |
| 5.5448 | 74250 | 0.0003 | - |
| 5.5485 | 74300 | 0.001 | - |
| 5.5522 | 74350 | 0.0009 | - |
| 5.5560 | 74400 | 0.0013 | - |
| 5.5597 | 74450 | 0.0009 | - |
| 5.5634 | 74500 | 0.0011 | - |
| 5.5672 | 74550 | 0.0014 | - |
| 5.5709 | 74600 | 0.0005 | - |
| 5.5746 | 74650 | 0.001 | - |
| 5.5784 | 74700 | 0.0007 | - |
| 5.5821 | 74750 | 0.0006 | - |
| 5.5858 | 74800 | 0.0011 | - |
| 5.5896 | 74850 | 0.0009 | - |
| 5.5933 | 74900 | 0.0008 | - |
| 5.5970 | 74950 | 0.0011 | - |
| 5.6008 | 75000 | 0.0015 | - |
| 5.6045 | 75050 | 0.0009 | - |
| 5.6082 | 75100 | 0.0008 | - |
| 5.6120 | 75150 | 0.0007 | - |
| 5.6157 | 75200 | 0.0005 | - |
| 5.6194 | 75250 | 0.0003 | - |
| 5.6232 | 75300 | 0.0006 | - |
| 5.6269 | 75350 | 0.0006 | - |
| 5.6306 | 75400 | 0.0008 | - |
| 5.6344 | 75450 | 0.0008 | - |
| 5.6381 | 75500 | 0.0009 | - |
| 5.6418 | 75550 | 0.0011 | - |
| 5.6456 | 75600 | 0.0005 | - |
| 5.6493 | 75650 | 0.0005 | - |
| 5.6531 | 75700 | 0.001 | - |
| 5.6568 | 75750 | 0.0005 | - |
| 5.6605 | 75800 | 0.0002 | - |
| 5.6643 | 75850 | 0.0004 | - |
| 5.6680 | 75900 | 0.0007 | - |
| 5.6717 | 75950 | 0.0007 | - |
| 5.6755 | 76000 | 0.0005 | - |
| 5.6792 | 76050 | 0.0004 | - |
| 5.6829 | 76100 | 0.0006 | - |
| 5.6867 | 76150 | 0.0003 | - |
| 5.6904 | 76200 | 0.0008 | - |
| 5.6941 | 76250 | 0.0009 | - |
| 5.6979 | 76300 | 0.0002 | - |
| 5.7016 | 76350 | 0.0001 | - |
| 5.7053 | 76400 | 0.0009 | - |
| 5.7091 | 76450 | 0.0006 | - |
| 5.7128 | 76500 | 0.0006 | - |
| 5.7165 | 76550 | 0.0001 | - |
| 5.7203 | 76600 | 0.0002 | - |
| 5.7240 | 76650 | 0.0012 | - |
| 5.7277 | 76700 | 0.0011 | - |
| 5.7315 | 76750 | 0.0008 | - |
| 5.7352 | 76800 | 0.0006 | - |
| 5.7389 | 76850 | 0.0001 | - |
| 5.7427 | 76900 | 0.0002 | - |
| 5.7464 | 76950 | 0.0004 | - |
| 5.7501 | 77000 | 0.0004 | - |
| 5.7539 | 77050 | 0.0002 | - |
| 5.7576 | 77100 | 0.0003 | - |
| 5.7613 | 77150 | 0.0006 | - |
| 5.7651 | 77200 | 0.0001 | - |
| 5.7688 | 77250 | 0.0009 | - |
| 5.7725 | 77300 | 0.0006 | - |
| 5.7763 | 77350 | 0.0016 | - |
| 5.7800 | 77400 | 0.0016 | - |
| 5.7837 | 77450 | 0.0011 | - |
| 5.7875 | 77500 | 0.0012 | - |
| 5.7912 | 77550 | 0.0015 | - |
| 5.7949 | 77600 | 0.0017 | - |
| 5.7987 | 77650 | 0.0018 | - |
| 5.8024 | 77700 | 0.0011 | - |
| 5.8061 | 77750 | 0.0005 | - |
| 5.8099 | 77800 | 0.0009 | - |
| 5.8136 | 77850 | 0.0009 | - |
| 5.8173 | 77900 | 0.0011 | - |
| 5.8211 | 77950 | 0.0013 | - |
| 5.8248 | 78000 | 0.0008 | - |
| 5.8285 | 78050 | 0.0009 | - |
| 5.8323 | 78100 | 0.0013 | - |
| 5.8360 | 78150 | 0.001 | - |
| 5.8397 | 78200 | 0.001 | - |
| 5.8435 | 78250 | 0.0007 | - |
| 5.8472 | 78300 | 0.0014 | - |
| 5.8509 | 78350 | 0.0013 | - |
| 5.8547 | 78400 | 0.001 | - |
| 5.8584 | 78450 | 0.0011 | - |
| 5.8621 | 78500 | 0.0007 | - |
| 5.8659 | 78550 | 0.0007 | - |
| 5.8696 | 78600 | 0.0013 | - |
| 5.8733 | 78650 | 0.0004 | - |
| 5.8771 | 78700 | 0.0011 | - |
| 5.8808 | 78750 | 0.0009 | - |
| 5.8845 | 78800 | 0.0007 | - |
| 5.8883 | 78850 | 0.001 | - |
| 5.8920 | 78900 | 0.001 | - |
| 5.8958 | 78950 | 0.0006 | - |
| 5.8995 | 79000 | 0.0009 | - |
| 5.9032 | 79050 | 0.0008 | - |
| 5.9070 | 79100 | 0.0012 | - |
| 5.9107 | 79150 | 0.0007 | - |
| 5.9144 | 79200 | 0.0003 | - |
| 5.9182 | 79250 | 0.0008 | - |
| 5.9219 | 79300 | 0.0014 | - |
| 5.9256 | 79350 | 0.0006 | - |
| 5.9294 | 79400 | 0.0005 | - |
| 5.9331 | 79450 | 0.0007 | - |
| 5.9368 | 79500 | 0.0007 | - |
| 5.9406 | 79550 | 0.0001 | - |
| 5.9443 | 79600 | 0.0005 | - |
| 5.9480 | 79650 | 0.0004 | - |
| 5.9518 | 79700 | 0.0007 | - |
| 5.9555 | 79750 | 0.0006 | - |
| 5.9592 | 79800 | 0.0005 | - |
| 5.9630 | 79850 | 0.0009 | - |
| 5.9667 | 79900 | 0.0011 | - |
| 5.9704 | 79950 | 0.0005 | - |
| 5.9742 | 80000 | 0.0008 | - |
| 5.9779 | 80050 | 0.0004 | - |
| 5.9816 | 80100 | 0.0008 | - |
| 5.9854 | 80150 | 0.0012 | - |
| 5.9891 | 80200 | 0.0005 | - |
| 5.9928 | 80250 | 0.0009 | - |
| 5.9966 | 80300 | 0.0015 | - |
| 6.0003 | 80350 | 0.0008 | - |
| 6.0040 | 80400 | 0.0009 | - |
| 6.0078 | 80450 | 0.0009 | - |
| 6.0115 | 80500 | 0.0007 | - |
| 6.0152 | 80550 | 0.0014 | - |
| 6.0190 | 80600 | 0.0008 | - |
| 6.0227 | 80650 | 0.0012 | - |
| 6.0264 | 80700 | 0.0005 | - |
| 6.0302 | 80750 | 0.0002 | - |
| 6.0339 | 80800 | 0.0006 | - |
| 6.0376 | 80850 | 0.0006 | - |
| 6.0414 | 80900 | 0.0006 | - |
| 6.0451 | 80950 | 0.0008 | - |
| 6.0488 | 81000 | 0.0007 | - |
| 6.0526 | 81050 | 0.0006 | - |
| 6.0563 | 81100 | 0.0001 | - |
| 6.0600 | 81150 | 0.0007 | - |
| 6.0638 | 81200 | 0.0004 | - |
| 6.0675 | 81250 | 0.0003 | - |
| 6.0712 | 81300 | 0.0002 | - |
| 6.0750 | 81350 | 0.0006 | - |
| 6.0787 | 81400 | 0.001 | - |
| 6.0824 | 81450 | 0.0009 | - |
| 6.0862 | 81500 | 0.0006 | - |
| 6.0899 | 81550 | 0.0003 | - |
| 6.0936 | 81600 | 0.0004 | - |
| 6.0974 | 81650 | 0.0007 | - |
| 6.1011 | 81700 | 0.0004 | - |
| 6.1048 | 81750 | 0.0005 | - |
| 6.1086 | 81800 | 0.0004 | - |
| 6.1123 | 81850 | 0.0004 | - |
| 6.1160 | 81900 | 0.0001 | - |
| 6.1198 | 81950 | 0.0008 | - |
| 6.1235 | 82000 | 0.0003 | - |
| 6.1272 | 82050 | 0.0002 | - |
| 6.1310 | 82100 | 0.0004 | - |
| 6.1347 | 82150 | 0.0005 | - |
| 6.1385 | 82200 | 0.0003 | - |
| 6.1422 | 82250 | 0.0002 | - |
| 6.1459 | 82300 | 0.0008 | - |
| 6.1497 | 82350 | 0.0001 | - |
| 6.1534 | 82400 | 0.0007 | - |
| 6.1571 | 82450 | 0.0001 | - |
| 6.1609 | 82500 | 0.0013 | - |
| 6.1646 | 82550 | 0.0008 | - |
| 6.1683 | 82600 | 0.0012 | - |
| 6.1721 | 82650 | 0.0002 | - |
| 6.1758 | 82700 | 0.0003 | - |
| 6.1795 | 82750 | 0.0005 | - |
| 6.1833 | 82800 | 0.0002 | - |
| 6.1870 | 82850 | 0.0001 | - |
| 6.1907 | 82900 | 0.0002 | - |
| 6.1945 | 82950 | 0.0004 | - |
| 6.1982 | 83000 | 0.0003 | - |
| 6.2019 | 83050 | 0.0014 | - |
| 6.2057 | 83100 | 0.0008 | - |
| 6.2094 | 83150 | 0.0009 | - |
| 6.2131 | 83200 | 0.0004 | - |
| 6.2169 | 83250 | 0.0012 | - |
| 6.2206 | 83300 | 0.0012 | - |
| 6.2243 | 83350 | 0.0006 | - |
| 6.2281 | 83400 | 0.0011 | - |
| 6.2318 | 83450 | 0.0019 | - |
| 6.2355 | 83500 | 0.001 | - |
| 6.2393 | 83550 | 0.0012 | - |
| 6.2430 | 83600 | 0.001 | - |
| 6.2467 | 83650 | 0.0013 | - |
| 6.2505 | 83700 | 0.0012 | - |
| 6.2542 | 83750 | 0.0007 | - |
| 6.2579 | 83800 | 0.0007 | - |
| 6.2617 | 83850 | 0.0007 | - |
| 6.2654 | 83900 | 0.0004 | - |
| 6.2691 | 83950 | 0.0008 | - |
| 6.2729 | 84000 | 0.0008 | - |
| 6.2766 | 84050 | 0.0005 | - |
| 6.2803 | 84100 | 0.0005 | - |
| 6.2841 | 84150 | 0.0002 | - |
| 6.2878 | 84200 | 0.0004 | - |
| 6.2915 | 84250 | 0.0006 | - |
| 6.2953 | 84300 | 0.0004 | - |
| 6.2990 | 84350 | 0.0014 | - |
| 6.3027 | 84400 | 0.0007 | - |
| 6.3065 | 84450 | 0.0004 | - |
| 6.3102 | 84500 | 0.0002 | - |
| 6.3139 | 84550 | 0.0004 | - |
| 6.3177 | 84600 | 0.0004 | - |
| 6.3214 | 84650 | 0.0006 | - |
| 6.3251 | 84700 | 0.0005 | - |
| 6.3289 | 84750 | 0.0004 | - |
| 6.3326 | 84800 | 0.0013 | - |
| 6.3363 | 84850 | 0.0013 | - |
| 6.3401 | 84900 | 0.001 | - |
| 6.3438 | 84950 | 0.0014 | - |
| 6.3475 | 85000 | 0.0008 | - |
| 6.3513 | 85050 | 0.0005 | - |
| 6.3550 | 85100 | 0.0005 | - |
| 6.3587 | 85150 | 0.0009 | - |
| 6.3625 | 85200 | 0.0007 | - |
| 6.3662 | 85250 | 0.0002 | - |
| 6.3699 | 85300 | 0.0003 | - |
| 6.3737 | 85350 | 0.0002 | - |
| 6.3774 | 85400 | 0.0005 | - |
| 6.3812 | 85450 | 0.0009 | - |
| 6.3849 | 85500 | 0.0005 | - |
| 6.3886 | 85550 | 0.0009 | - |
| 6.3924 | 85600 | 0.0006 | - |
| 6.3961 | 85650 | 0.0003 | - |
| 6.3998 | 85700 | 0.0008 | - |
| 6.4036 | 85750 | 0.0007 | - |
| 6.4073 | 85800 | 0.0007 | - |
| 6.4110 | 85850 | 0.0018 | - |
| 6.4148 | 85900 | 0.0011 | - |
| 6.4185 | 85950 | 0.0009 | - |
| 6.4222 | 86000 | 0.001 | - |
| 6.4260 | 86050 | 0.0006 | - |
| 6.4297 | 86100 | 0.0003 | - |
| 6.4334 | 86150 | 0.0008 | - |
| 6.4372 | 86200 | 0.0006 | - |
| 6.4409 | 86250 | 0.0007 | - |
| 6.4446 | 86300 | 0.0006 | - |
| 6.4484 | 86350 | 0.0003 | - |
| 6.4521 | 86400 | 0.0004 | - |
| 6.4558 | 86450 | 0.0004 | - |
| 6.4596 | 86500 | 0.0006 | - |
| 6.4633 | 86550 | 0.0004 | - |
| 6.4670 | 86600 | 0.0007 | - |
| 6.4708 | 86650 | 0.0007 | - |
| 6.4745 | 86700 | 0.0007 | - |
| 6.4782 | 86750 | 0.0002 | - |
| 6.4820 | 86800 | 0.0005 | - |
| 6.4857 | 86850 | 0.0001 | - |
| 6.4894 | 86900 | 0.0004 | - |
| 6.4932 | 86950 | 0.0011 | - |
| 6.4969 | 87000 | 0.0003 | - |
| 6.5006 | 87050 | 0.0002 | - |
| 6.5044 | 87100 | 0.0002 | - |
| 6.5081 | 87150 | 0.0008 | - |
| 6.5118 | 87200 | 0.0006 | - |
| 6.5156 | 87250 | 0.0005 | - |
| 6.5193 | 87300 | 0.0002 | - |
| 6.5230 | 87350 | 0.0002 | - |
| 6.5268 | 87400 | 0.0006 | - |
| 6.5305 | 87450 | 0.0002 | - |
| 6.5342 | 87500 | 0.0002 | - |
| 6.5380 | 87550 | 0.0002 | - |
| 6.5417 | 87600 | 0.0007 | - |
| 6.5454 | 87650 | 0.0012 | - |
| 6.5492 | 87700 | 0.0017 | - |
| 6.5529 | 87750 | 0.001 | - |
| 6.5566 | 87800 | 0.0011 | - |
| 6.5604 | 87850 | 0.0008 | - |
| 6.5641 | 87900 | 0.0007 | - |
| 6.5678 | 87950 | 0.0014 | - |
| 6.5716 | 88000 | 0.0006 | - |
| 6.5753 | 88050 | 0.001 | - |
| 6.5790 | 88100 | 0.0007 | - |
| 6.5828 | 88150 | 0.0008 | - |
| 6.5865 | 88200 | 0.0005 | - |
| 6.5902 | 88250 | 0.0008 | - |
| 6.5940 | 88300 | 0.0004 | - |
| 6.5977 | 88350 | 0.0003 | - |
| 6.6014 | 88400 | 0.0004 | - |
| 6.6052 | 88450 | 0.0008 | - |
| 6.6089 | 88500 | 0.0013 | - |
| 6.6127 | 88550 | 0.0011 | - |
| 6.6164 | 88600 | 0.0007 | - |
| 6.6201 | 88650 | 0.0009 | - |
| 6.6239 | 88700 | 0.0008 | - |
| 6.6276 | 88750 | 0.0007 | - |
| 6.6313 | 88800 | 0.0004 | - |
| 6.6351 | 88850 | 0.0003 | - |
| 6.6388 | 88900 | 0.0007 | - |
| 6.6425 | 88950 | 0.0007 | - |
| 6.6463 | 89000 | 0.0004 | - |
| 6.6500 | 89050 | 0.0001 | - |
| 6.6537 | 89100 | 0.0008 | - |
| 6.6575 | 89150 | 0.0007 | - |
| 6.6612 | 89200 | 0.0004 | - |
| 6.6649 | 89250 | 0.0003 | - |
| 6.6687 | 89300 | 0.0001 | - |
| 6.6724 | 89350 | 0.0007 | - |
| 6.6761 | 89400 | 0.0007 | - |
| 6.6799 | 89450 | 0.0003 | - |
| 6.6836 | 89500 | 0.0003 | - |
| 6.6873 | 89550 | 0.0006 | - |
| 6.6911 | 89600 | 0.0007 | - |
| 6.6948 | 89650 | 0.0001 | - |
| 6.6985 | 89700 | 0.0003 | - |
| 6.7023 | 89750 | 0.0004 | - |
| 6.7060 | 89800 | 0.0005 | - |
| 6.7097 | 89850 | 0.0003 | - |
| 6.7135 | 89900 | 0.0007 | - |
| 6.7172 | 89950 | 0.0003 | - |
| 6.7209 | 90000 | 0.0002 | - |
| 6.7247 | 90050 | 0.0005 | - |
| 6.7284 | 90100 | 0.0004 | - |
| 6.7321 | 90150 | 0.0002 | - |
| 6.7359 | 90200 | 0.0007 | - |
| 6.7396 | 90250 | 0.0003 | - |
| 6.7433 | 90300 | 0.0011 | - |
| 6.7471 | 90350 | 0.0008 | - |
| 6.7508 | 90400 | 0.0005 | - |
| 6.7545 | 90450 | 0.0003 | - |
| 6.7583 | 90500 | 0.0003 | - |
| 6.7620 | 90550 | 0.0005 | - |
| 6.7657 | 90600 | 0.0005 | - |
| 6.7695 | 90650 | 0.0002 | - |
| 6.7732 | 90700 | 0.0006 | - |
| 6.7769 | 90750 | 0.0007 | - |
| 6.7807 | 90800 | 0.0013 | - |
| 6.7844 | 90850 | 0.0019 | - |
| 6.7881 | 90900 | 0.0009 | - |
| 6.7919 | 90950 | 0.0015 | - |
| 6.7956 | 91000 | 0.0015 | - |
| 6.7993 | 91050 | 0.0007 | - |
| 6.8031 | 91100 | 0.0014 | - |
| 6.8068 | 91150 | 0.0007 | - |
| 6.8105 | 91200 | 0.001 | - |
| 6.8143 | 91250 | 0.001 | - |
| 6.8180 | 91300 | 0.0004 | - |
| 6.8217 | 91350 | 0.0007 | - |
| 6.8255 | 91400 | 0.0009 | - |
| 6.8292 | 91450 | 0.0007 | - |
| 6.8329 | 91500 | 0.0013 | - |
| 6.8367 | 91550 | 0.0007 | - |
| 6.8404 | 91600 | 0.0011 | - |
| 6.8441 | 91650 | 0.0007 | - |
| 6.8479 | 91700 | 0.0004 | - |
| 6.8516 | 91750 | 0.0009 | - |
| 6.8554 | 91800 | 0.0005 | - |
| 6.8591 | 91850 | 0.0005 | - |
| 6.8628 | 91900 | 0.0015 | - |
| 6.8666 | 91950 | 0.0003 | - |
| 6.8703 | 92000 | 0.0005 | - |
| 6.8740 | 92050 | 0.0004 | - |
| 6.8778 | 92100 | 0.0005 | - |
| 6.8815 | 92150 | 0.0006 | - |
| 6.8852 | 92200 | 0.0006 | - |
| 6.8890 | 92250 | 0.0004 | - |
| 6.8927 | 92300 | 0.0006 | - |
| 6.8964 | 92350 | 0.0004 | - |
| 6.9002 | 92400 | 0.0008 | - |
| 6.9039 | 92450 | 0.0003 | - |
| 6.9076 | 92500 | 0.0006 | - |
| 6.9114 | 92550 | 0.0005 | - |
| 6.9151 | 92600 | 0.0003 | - |
| 6.9188 | 92650 | 0.0002 | - |
| 6.9226 | 92700 | 0.001 | - |
| 6.9263 | 92750 | 0.0009 | - |
| 6.9300 | 92800 | 0.0002 | - |
| 6.9338 | 92850 | 0.0004 | - |
| 6.9375 | 92900 | 0.0009 | - |
| 6.9412 | 92950 | 0.0004 | - |
| 6.9450 | 93000 | 0.0004 | - |
| 6.9487 | 93050 | 0.0005 | - |
| 6.9524 | 93100 | 0.0004 | - |
| 6.9562 | 93150 | 0.0005 | - |
| 6.9599 | 93200 | 0.0002 | - |
| 6.9636 | 93250 | 0.0006 | - |
| 6.9674 | 93300 | 0.0005 | - |
| 6.9711 | 93350 | 0.0007 | - |
| 6.9748 | 93400 | 0.0006 | - |
| 6.9786 | 93450 | 0.0007 | - |
| 6.9823 | 93500 | 0.0 | - |
| 6.9860 | 93550 | 0.0003 | - |
| 6.9898 | 93600 | 0.0006 | - |
| 6.9935 | 93650 | 0.0004 | - |
| 6.9972 | 93700 | 0.0005 | - |
| 7.0010 | 93750 | 0.0004 | - |
| 7.0047 | 93800 | 0.0005 | - |
| 7.0084 | 93850 | 0.0007 | - |
| 7.0122 | 93900 | 0.0002 | - |
| 7.0159 | 93950 | 0.0003 | - |
| 7.0196 | 94000 | 0.0005 | - |
| 7.0234 | 94050 | 0.0006 | - |
| 7.0271 | 94100 | 0.0002 | - |
| 7.0308 | 94150 | 0.0004 | - |
| 7.0346 | 94200 | 0.0003 | - |
| 7.0383 | 94250 | 0.001 | - |
| 7.0420 | 94300 | 0.0006 | - |
| 7.0458 | 94350 | 0.0007 | - |
| 7.0495 | 94400 | 0.0011 | - |
| 7.0532 | 94450 | 0.0009 | - |
| 7.0570 | 94500 | 0.0009 | - |
| 7.0607 | 94550 | 0.0004 | - |
| 7.0644 | 94600 | 0.001 | - |
| 7.0682 | 94650 | 0.0005 | - |
| 7.0719 | 94700 | 0.0008 | - |
| 7.0756 | 94750 | 0.0008 | - |
| 7.0794 | 94800 | 0.0004 | - |
| 7.0831 | 94850 | 0.0005 | - |
| 7.0868 | 94900 | 0.0004 | - |
| 7.0906 | 94950 | 0.0004 | - |
| 7.0943 | 95000 | 0.0004 | - |
| 7.0981 | 95050 | 0.0004 | - |
| 7.1018 | 95100 | 0.0007 | - |
| 7.1055 | 95150 | 0.0006 | - |
| 7.1093 | 95200 | 0.0004 | - |
| 7.1130 | 95250 | 0.0007 | - |
| 7.1167 | 95300 | 0.0004 | - |
| 7.1205 | 95350 | 0.0007 | - |
| 7.1242 | 95400 | 0.0001 | - |
| 7.1279 | 95450 | 0.0003 | - |
| 7.1317 | 95500 | 0.0002 | - |
| 7.1354 | 95550 | 0.0009 | - |
| 7.1391 | 95600 | 0.0003 | - |
| 7.1429 | 95650 | 0.001 | - |
| 7.1466 | 95700 | 0.0001 | - |
| 7.1503 | 95750 | 0.0006 | - |
| 7.1541 | 95800 | 0.0001 | - |
| 7.1578 | 95850 | 0.0004 | - |
| 7.1615 | 95900 | 0.0002 | - |
| 7.1653 | 95950 | 0.0009 | - |
| 7.1690 | 96000 | 0.0002 | - |
| 7.1727 | 96050 | 0.0007 | - |
| 7.1765 | 96100 | 0.0005 | - |
| 7.1802 | 96150 | 0.0002 | - |
| 7.1839 | 96200 | 0.0003 | - |
| 7.1877 | 96250 | 0.0005 | - |
| 7.1914 | 96300 | 0.0002 | - |
| 7.1951 | 96350 | 0.0 | - |
| 7.1989 | 96400 | 0.0005 | - |
| 7.2026 | 96450 | 0.0009 | - |
| 7.2063 | 96500 | 0.0002 | - |
| 7.2101 | 96550 | 0.0009 | - |
| 7.2138 | 96600 | 0.0006 | - |
| 7.2175 | 96650 | 0.0009 | - |
| 7.2213 | 96700 | 0.0007 | - |
| 7.2250 | 96750 | 0.0004 | - |
| 7.2287 | 96800 | 0.0003 | - |
| 7.2325 | 96850 | 0.0011 | - |
| 7.2362 | 96900 | 0.0004 | - |
| 7.2399 | 96950 | 0.0006 | - |
| 7.2437 | 97000 | 0.0003 | - |
| 7.2474 | 97050 | 0.0011 | - |
| 7.2511 | 97100 | 0.0006 | - |
| 7.2549 | 97150 | 0.0012 | - |
| 7.2586 | 97200 | 0.0006 | - |
| 7.2623 | 97250 | 0.002 | - |
| 7.2661 | 97300 | 0.0013 | - |
| 7.2698 | 97350 | 0.0009 | - |
| 7.2735 | 97400 | 0.0009 | - |
| 7.2773 | 97450 | 0.0013 | - |
| 7.2810 | 97500 | 0.0007 | - |
| 7.2847 | 97550 | 0.0013 | - |
| 7.2885 | 97600 | 0.0008 | - |
| 7.2922 | 97650 | 0.0012 | - |
| 7.2959 | 97700 | 0.0008 | - |
| 7.2997 | 97750 | 0.0009 | - |
| 7.3034 | 97800 | 0.0006 | - |
| 7.3071 | 97850 | 0.0007 | - |
| 7.3109 | 97900 | 0.0007 | - |
| 7.3146 | 97950 | 0.0012 | - |
| 7.3183 | 98000 | 0.0004 | - |
| 7.3221 | 98050 | 0.0006 | - |
| 7.3258 | 98100 | 0.0009 | - |
| 7.3295 | 98150 | 0.0011 | - |
| 7.3333 | 98200 | 0.0013 | - |
| 7.3370 | 98250 | 0.0014 | - |
| 7.3408 | 98300 | 0.0003 | - |
| 7.3445 | 98350 | 0.0005 | - |
| 7.3482 | 98400 | 0.0012 | - |
| 7.3520 | 98450 | 0.0016 | - |
| 7.3557 | 98500 | 0.0011 | - |
| 7.3594 | 98550 | 0.0015 | - |
| 7.3632 | 98600 | 0.0009 | - |
| 7.3669 | 98650 | 0.0005 | - |
| 7.3706 | 98700 | 0.0008 | - |
| 7.3744 | 98750 | 0.0005 | - |
| 7.3781 | 98800 | 0.001 | - |
| 7.3818 | 98850 | 0.0005 | - |
| 7.3856 | 98900 | 0.0002 | - |
| 7.3893 | 98950 | 0.0013 | - |
| 7.3930 | 99000 | 0.0011 | - |
| 7.3968 | 99050 | 0.0008 | - |
| 7.4005 | 99100 | 0.0009 | - |
| 7.4042 | 99150 | 0.001 | - |
| 7.4080 | 99200 | 0.0007 | - |
| 7.4117 | 99250 | 0.0006 | - |
| 7.4154 | 99300 | 0.0009 | - |
| 7.4192 | 99350 | 0.0007 | - |
| 7.4229 | 99400 | 0.0003 | - |
| 7.4266 | 99450 | 0.0004 | - |
| 7.4304 | 99500 | 0.0008 | - |
| 7.4341 | 99550 | 0.0008 | - |
| 7.4378 | 99600 | 0.0002 | - |
| 7.4416 | 99650 | 0.0009 | - |
| 7.4453 | 99700 | 0.0004 | - |
| 7.4490 | 99750 | 0.0011 | - |
| 7.4528 | 99800 | 0.0007 | - |
| 7.4565 | 99850 | 0.0008 | - |
| 7.4602 | 99900 | 0.0006 | - |
| 7.4640 | 99950 | 0.0004 | - |
| 7.4677 | 100000 | 0.0004 | - |
| 7.4714 | 100050 | 0.0005 | - |
| 7.4752 | 100100 | 0.0004 | - |
| 7.4789 | 100150 | 0.0004 | - |
| 7.4826 | 100200 | 0.0005 | - |
| 7.4864 | 100250 | 0.0007 | - |
| 7.4901 | 100300 | 0.0001 | - |
| 7.4938 | 100350 | 0.0004 | - |
| 7.4976 | 100400 | 0.0006 | - |
| 7.5013 | 100450 | 0.0005 | - |
| 7.5050 | 100500 | 0.0004 | - |
| 7.5088 | 100550 | 0.0004 | - |
| 7.5125 | 100600 | 0.0002 | - |
| 7.5162 | 100650 | 0.0005 | - |
| 7.5200 | 100700 | 0.0001 | - |
| 7.5237 | 100750 | 0.0002 | - |
| 7.5274 | 100800 | 0.0002 | - |
| 7.5312 | 100850 | 0.0005 | - |
| 7.5349 | 100900 | 0.0002 | - |
| 7.5386 | 100950 | 0.0004 | - |
| 7.5424 | 101000 | 0.0005 | - |
| 7.5461 | 101050 | 0.0009 | - |
| 7.5498 | 101100 | 0.0002 | - |
| 7.5536 | 101150 | 0.0003 | - |
| 7.5573 | 101200 | 0.0003 | - |
| 7.5610 | 101250 | 0.0006 | - |
| 7.5648 | 101300 | 0.0007 | - |
| 7.5685 | 101350 | 0.0002 | - |
| 7.5723 | 101400 | 0.0005 | - |
| 7.5760 | 101450 | 0.0004 | - |
| 7.5797 | 101500 | 0.0007 | - |
| 7.5835 | 101550 | 0.0003 | - |
| 7.5872 | 101600 | 0.0005 | - |
| 7.5909 | 101650 | 0.0005 | - |
| 7.5947 | 101700 | 0.0004 | - |
| 7.5984 | 101750 | 0.0003 | - |
| 7.6021 | 101800 | 0.0005 | - |
| 7.6059 | 101850 | 0.0005 | - |
| 7.6096 | 101900 | 0.0003 | - |
| 7.6133 | 101950 | 0.0004 | - |
| 7.6171 | 102000 | 0.0003 | - |
| 7.6208 | 102050 | 0.0004 | - |
| 7.6245 | 102100 | 0.0002 | - |
| 7.6283 | 102150 | 0.0 | - |
| 7.6320 | 102200 | 0.0001 | - |
| 7.6357 | 102250 | 0.0002 | - |
| 7.6395 | 102300 | 0.0001 | - |
| 7.6432 | 102350 | 0.0001 | - |
| 7.6469 | 102400 | 0.0001 | - |
| 7.6507 | 102450 | 0.0002 | - |
| 7.6544 | 102500 | 0.0005 | - |
| 7.6581 | 102550 | 0.0008 | - |
| 7.6619 | 102600 | 0.0007 | - |
| 7.6656 | 102650 | 0.0003 | - |
| 7.6693 | 102700 | 0.0004 | - |
| 7.6731 | 102750 | 0.0002 | - |
| 7.6768 | 102800 | 0.0007 | - |
| 7.6805 | 102850 | 0.0002 | - |
| 7.6843 | 102900 | 0.0004 | - |
| 7.6880 | 102950 | 0.0003 | - |
| 7.6917 | 103000 | 0.0009 | - |
| 7.6955 | 103050 | 0.0015 | - |
| 7.6992 | 103100 | 0.0011 | - |
| 7.7029 | 103150 | 0.001 | - |
| 7.7067 | 103200 | 0.0008 | - |
| 7.7104 | 103250 | 0.0003 | - |
| 7.7141 | 103300 | 0.0005 | - |
| 7.7179 | 103350 | 0.001 | - |
| 7.7216 | 103400 | 0.0011 | - |
| 7.7253 | 103450 | 0.0008 | - |
| 7.7291 | 103500 | 0.0007 | - |
| 7.7328 | 103550 | 0.0007 | - |
| 7.7365 | 103600 | 0.0007 | - |
| 7.7403 | 103650 | 0.0005 | - |
| 7.7440 | 103700 | 0.0004 | - |
| 7.7477 | 103750 | 0.0009 | - |
| 7.7515 | 103800 | 0.0004 | - |
| 7.7552 | 103850 | 0.0006 | - |
| 7.7589 | 103900 | 0.0005 | - |
| 7.7627 | 103950 | 0.001 | - |
| 7.7664 | 104000 | 0.0003 | - |
| 7.7701 | 104050 | 0.0004 | - |
| 7.7739 | 104100 | 0.0007 | - |
| 7.7776 | 104150 | 0.0008 | - |
| 7.7813 | 104200 | 0.0005 | - |
| 7.7851 | 104250 | 0.0004 | - |
| 7.7888 | 104300 | 0.0009 | - |
| 7.7925 | 104350 | 0.0005 | - |
| 7.7963 | 104400 | 0.0004 | - |
| 7.8000 | 104450 | 0.001 | - |
| 7.8037 | 104500 | 0.0002 | - |
| 7.8075 | 104550 | 0.0009 | - |
| 7.8112 | 104600 | 0.0004 | - |
| 7.8150 | 104650 | 0.0007 | - |
| 7.8187 | 104700 | 0.0004 | - |
| 7.8224 | 104750 | 0.0007 | - |
| 7.8262 | 104800 | 0.0004 | - |
| 7.8299 | 104850 | 0.0004 | - |
| 7.8336 | 104900 | 0.0001 | - |
| 7.8374 | 104950 | 0.0006 | - |
| 7.8411 | 105000 | 0.0002 | - |
| 7.8448 | 105050 | 0.0009 | - |
| 7.8486 | 105100 | 0.0004 | - |
| 7.8523 | 105150 | 0.0005 | - |
| 7.8560 | 105200 | 0.0004 | - |
| 7.8598 | 105250 | 0.0004 | - |
| 7.8635 | 105300 | 0.0008 | - |
| 7.8672 | 105350 | 0.0005 | - |
| 7.8710 | 105400 | 0.0009 | - |
| 7.8747 | 105450 | 0.0008 | - |
| 7.8784 | 105500 | 0.0001 | - |
| 7.8822 | 105550 | 0.0004 | - |
| 7.8859 | 105600 | 0.0006 | - |
| 7.8896 | 105650 | 0.0006 | - |
| 7.8934 | 105700 | 0.0004 | - |
| 7.8971 | 105750 | 0.0006 | - |
| 7.9008 | 105800 | 0.0005 | - |
| 7.9046 | 105850 | 0.0013 | - |
| 7.9083 | 105900 | 0.0027 | - |
| 7.9120 | 105950 | 0.0026 | - |
| 7.9158 | 106000 | 0.0026 | - |
| 7.9195 | 106050 | 0.0024 | - |
| 7.9232 | 106100 | 0.0017 | - |
| 7.9270 | 106150 | 0.0013 | - |
| 7.9307 | 106200 | 0.0019 | - |
| 7.9344 | 106250 | 0.0008 | - |
| 7.9382 | 106300 | 0.0016 | - |
| 7.9419 | 106350 | 0.0005 | - |
| 7.9456 | 106400 | 0.0009 | - |
| 7.9494 | 106450 | 0.0023 | - |
| 7.9531 | 106500 | 0.0021 | - |
| 7.9568 | 106550 | 0.0009 | - |
| 7.9606 | 106600 | 0.0005 | - |
| 7.9643 | 106650 | 0.0009 | - |
| 7.9680 | 106700 | 0.0009 | - |
| 7.9718 | 106750 | 0.0008 | - |
| 7.9755 | 106800 | 0.0006 | - |
| 7.9792 | 106850 | 0.0002 | - |
| 7.9830 | 106900 | 0.0004 | - |
| 7.9867 | 106950 | 0.0006 | - |
| 7.9904 | 107000 | 0.0005 | - |
| 7.9942 | 107050 | 0.0011 | - |
| 7.9979 | 107100 | 0.0005 | - |
| 8.0016 | 107150 | 0.0006 | - |
| 8.0054 | 107200 | 0.0003 | - |
| 8.0091 | 107250 | 0.0007 | - |
| 8.0128 | 107300 | 0.0007 | - |
| 8.0166 | 107350 | 0.0005 | - |
| 8.0203 | 107400 | 0.0005 | - |
| 8.0240 | 107450 | 0.0003 | - |
| 8.0278 | 107500 | 0.0004 | - |
| 8.0315 | 107550 | 0.0002 | - |
| 8.0352 | 107600 | 0.0002 | - |
| 8.0390 | 107650 | 0.0004 | - |
| 8.0427 | 107700 | 0.0001 | - |
| 8.0464 | 107750 | 0.0005 | - |
| 8.0502 | 107800 | 0.0004 | - |
| 8.0539 | 107850 | 0.0008 | - |
| 8.0577 | 107900 | 0.0005 | - |
| 8.0614 | 107950 | 0.0005 | - |
| 8.0651 | 108000 | 0.0004 | - |
| 8.0689 | 108050 | 0.0007 | - |
| 8.0726 | 108100 | 0.0004 | - |
| 8.0763 | 108150 | 0.0005 | - |
| 8.0801 | 108200 | 0.0007 | - |
| 8.0838 | 108250 | 0.0003 | - |
| 8.0875 | 108300 | 0.0004 | - |
| 8.0913 | 108350 | 0.0004 | - |
| 8.0950 | 108400 | 0.0006 | - |
| 8.0987 | 108450 | 0.0002 | - |
| 8.1025 | 108500 | 0.0001 | - |
| 8.1062 | 108550 | 0.0003 | - |
| 8.1099 | 108600 | 0.0004 | - |
| 8.1137 | 108650 | 0.0008 | - |
| 8.1174 | 108700 | 0.0008 | - |
| 8.1211 | 108750 | 0.0005 | - |
| 8.1249 | 108800 | 0.0004 | - |
| 8.1286 | 108850 | 0.001 | - |
| 8.1323 | 108900 | 0.0004 | - |
| 8.1361 | 108950 | 0.0005 | - |
| 8.1398 | 109000 | 0.0006 | - |
| 8.1435 | 109050 | 0.0007 | - |
| 8.1473 | 109100 | 0.0004 | - |
| 8.1510 | 109150 | 0.0009 | - |
| 8.1547 | 109200 | 0.0007 | - |
| 8.1585 | 109250 | 0.0011 | - |
| 8.1622 | 109300 | 0.0003 | - |
| 8.1659 | 109350 | 0.0002 | - |
| 8.1697 | 109400 | 0.0005 | - |
| 8.1734 | 109450 | 0.0011 | - |
| 8.1771 | 109500 | 0.0015 | - |
| 8.1809 | 109550 | 0.0014 | - |
| 8.1846 | 109600 | 0.0008 | - |
| 8.1883 | 109650 | 0.0005 | - |
| 8.1921 | 109700 | 0.0005 | - |
| 8.1958 | 109750 | 0.0007 | - |
| 8.1995 | 109800 | 0.0007 | - |
| 8.2033 | 109850 | 0.0008 | - |
| 8.2070 | 109900 | 0.0003 | - |
| 8.2107 | 109950 | 0.0005 | - |
| 8.2145 | 110000 | 0.0004 | - |
| 8.2182 | 110050 | 0.0001 | - |
| 8.2219 | 110100 | 0.0006 | - |
| 8.2257 | 110150 | 0.0006 | - |
| 8.2294 | 110200 | 0.0001 | - |
| 8.2331 | 110250 | 0.0008 | - |
| 8.2369 | 110300 | 0.0005 | - |
| 8.2406 | 110350 | 0.0005 | - |
| 8.2443 | 110400 | 0.0002 | - |
| 8.2481 | 110450 | 0.0005 | - |
| 8.2518 | 110500 | 0.0004 | - |
| 8.2555 | 110550 | 0.0003 | - |
| 8.2593 | 110600 | 0.0005 | - |
| 8.2630 | 110650 | 0.0002 | - |
| 8.2667 | 110700 | 0.0004 | - |
| 8.2705 | 110750 | 0.0004 | - |
| 8.2742 | 110800 | 0.0002 | - |
| 8.2779 | 110850 | 0.0002 | - |
| 8.2817 | 110900 | 0.0005 | - |
| 8.2854 | 110950 | 0.0004 | - |
| 8.2891 | 111000 | 0.0007 | - |
| 8.2929 | 111050 | 0.0007 | - |
| 8.2966 | 111100 | 0.0003 | - |
| 8.3004 | 111150 | 0.0004 | - |
| 8.3041 | 111200 | 0.0008 | - |
| 8.3078 | 111250 | 0.0003 | - |
| 8.3116 | 111300 | 0.0002 | - |
| 8.3153 | 111350 | 0.0002 | - |
| 8.3190 | 111400 | 0.0 | - |
| 8.3228 | 111450 | 0.0006 | - |
| 8.3265 | 111500 | 0.0004 | - |
| 8.3302 | 111550 | 0.0006 | - |
| 8.3340 | 111600 | 0.0005 | - |
| 8.3377 | 111650 | 0.0007 | - |
| 8.3414 | 111700 | 0.0006 | - |
| 8.3452 | 111750 | 0.0005 | - |
| 8.3489 | 111800 | 0.002 | - |
| 8.3526 | 111850 | 0.0021 | - |
| 8.3564 | 111900 | 0.0009 | - |
| 8.3601 | 111950 | 0.0005 | - |
| 8.3638 | 112000 | 0.0005 | - |
| 8.3676 | 112050 | 0.0005 | - |
| 8.3713 | 112100 | 0.001 | - |
| 8.3750 | 112150 | 0.0006 | - |
| 8.3788 | 112200 | 0.0008 | - |
| 8.3825 | 112250 | 0.0003 | - |
| 8.3862 | 112300 | 0.0009 | - |
| 8.3900 | 112350 | 0.0008 | - |
| 8.3937 | 112400 | 0.0004 | - |
| 8.3974 | 112450 | 0.0004 | - |
| 8.4012 | 112500 | 0.0003 | - |
| 8.4049 | 112550 | 0.0004 | - |
| 8.4086 | 112600 | 0.0006 | - |
| 8.4124 | 112650 | 0.0004 | - |
| 8.4161 | 112700 | 0.0009 | - |
| 8.4198 | 112750 | 0.0003 | - |
| 8.4236 | 112800 | 0.0003 | - |
| 8.4273 | 112850 | 0.0006 | - |
| 8.4310 | 112900 | 0.0005 | - |
| 8.4348 | 112950 | 0.0004 | - |
| 8.4385 | 113000 | 0.0003 | - |
| 8.4422 | 113050 | 0.0001 | - |
| 8.4460 | 113100 | 0.0002 | - |
| 8.4497 | 113150 | 0.0004 | - |
| 8.4534 | 113200 | 0.0002 | - |
| 8.4572 | 113250 | 0.0005 | - |
| 8.4609 | 113300 | 0.0003 | - |
| 8.4646 | 113350 | 0.0006 | - |
| 8.4684 | 113400 | 0.0002 | - |
| 8.4721 | 113450 | 0.0005 | - |
| 8.4758 | 113500 | 0.0006 | - |
| 8.4796 | 113550 | 0.0004 | - |
| 8.4833 | 113600 | 0.0001 | - |
| 8.4870 | 113650 | 0.0002 | - |
| 8.4908 | 113700 | 0.0008 | - |
| 8.4945 | 113750 | 0.0002 | - |
| 8.4982 | 113800 | 0.0009 | - |
| 8.5020 | 113850 | 0.0005 | - |
| 8.5057 | 113900 | 0.0004 | - |
| 8.5094 | 113950 | 0.0002 | - |
| 8.5132 | 114000 | 0.0002 | - |
| 8.5169 | 114050 | 0.0005 | - |
| 8.5206 | 114100 | 0.0006 | - |
| 8.5244 | 114150 | 0.0007 | - |
| 8.5281 | 114200 | 0.0004 | - |
| 8.5318 | 114250 | 0.0001 | - |
| 8.5356 | 114300 | 0.0004 | - |
| 8.5393 | 114350 | 0.0004 | - |
| 8.5431 | 114400 | 0.0002 | - |
| 8.5468 | 114450 | 0.0004 | - |
| 8.5505 | 114500 | 0.0002 | - |
| 8.5543 | 114550 | 0.0005 | - |
| 8.5580 | 114600 | 0.0 | - |
| 8.5617 | 114650 | 0.0002 | - |
| 8.5655 | 114700 | 0.0004 | - |
| 8.5692 | 114750 | 0.0001 | - |
| 8.5729 | 114800 | 0.0004 | - |
| 8.5767 | 114850 | 0.0002 | - |
| 8.5804 | 114900 | 0.0003 | - |
| 8.5841 | 114950 | 0.0004 | - |
| 8.5879 | 115000 | 0.0002 | - |
| 8.5916 | 115050 | 0.0002 | - |
| 8.5953 | 115100 | 0.0003 | - |
| 8.5991 | 115150 | 0.0 | - |
| 8.6028 | 115200 | 0.0002 | - |
| 8.6065 | 115250 | 0.0005 | - |
| 8.6103 | 115300 | 0.0002 | - |
| 8.6140 | 115350 | 0.0001 | - |
| 8.6177 | 115400 | 0.0002 | - |
| 8.6215 | 115450 | 0.0009 | - |
| 8.6252 | 115500 | 0.0001 | - |
| 8.6289 | 115550 | 0.0005 | - |
| 8.6327 | 115600 | 0.0004 | - |
| 8.6364 | 115650 | 0.0005 | - |
| 8.6401 | 115700 | 0.0004 | - |
| 8.6439 | 115750 | 0.0004 | - |
| 8.6476 | 115800 | 0.0001 | - |
| 8.6513 | 115850 | 0.0002 | - |
| 8.6551 | 115900 | 0.0002 | - |
| 8.6588 | 115950 | 0.0002 | - |
| 8.6625 | 116000 | 0.0007 | - |
| 8.6663 | 116050 | 0.0008 | - |
| 8.6700 | 116100 | 0.0008 | - |
| 8.6737 | 116150 | 0.0008 | - |
| 8.6775 | 116200 | 0.0011 | - |
| 8.6812 | 116250 | 0.0019 | - |
| 8.6849 | 116300 | 0.0009 | - |
| 8.6887 | 116350 | 0.0009 | - |
| 8.6924 | 116400 | 0.0007 | - |
| 8.6961 | 116450 | 0.0008 | - |
| 8.6999 | 116500 | 0.0009 | - |
| 8.7036 | 116550 | 0.0011 | - |
| 8.7073 | 116600 | 0.0012 | - |
| 8.7111 | 116650 | 0.0009 | - |
| 8.7148 | 116700 | 0.0006 | - |
| 8.7185 | 116750 | 0.0003 | - |
| 8.7223 | 116800 | 0.0006 | - |
| 8.7260 | 116850 | 0.0006 | - |
| 8.7297 | 116900 | 0.0004 | - |
| 8.7335 | 116950 | 0.0006 | - |
| 8.7372 | 117000 | 0.0002 | - |
| 8.7409 | 117050 | 0.0004 | - |
| 8.7447 | 117100 | 0.0008 | - |
| 8.7484 | 117150 | 0.0003 | - |
| 8.7521 | 117200 | 0.0007 | - |
| 8.7559 | 117250 | 0.0002 | - |
| 8.7596 | 117300 | 0.0003 | - |
| 8.7633 | 117350 | 0.0001 | - |
| 8.7671 | 117400 | 0.0004 | - |
| 8.7708 | 117450 | 0.0004 | - |
| 8.7746 | 117500 | 0.0003 | - |
| 8.7783 | 117550 | 0.0003 | - |
| 8.7820 | 117600 | 0.0005 | - |
| 8.7858 | 117650 | 0.0003 | - |
| 8.7895 | 117700 | 0.0006 | - |
| 8.7932 | 117750 | 0.0005 | - |
| 8.7970 | 117800 | 0.0003 | - |
| 8.8007 | 117850 | 0.0002 | - |
| 8.8044 | 117900 | 0.0004 | - |
| 8.8082 | 117950 | 0.0006 | - |
| 8.8119 | 118000 | 0.0006 | - |
| 8.8156 | 118050 | 0.0003 | - |
| 8.8194 | 118100 | 0.0004 | - |
| 8.8231 | 118150 | 0.001 | - |
| 8.8268 | 118200 | 0.0005 | - |
| 8.8306 | 118250 | 0.001 | - |
| 8.8343 | 118300 | 0.0005 | - |
| 8.8380 | 118350 | 0.001 | - |
| 8.8418 | 118400 | 0.0002 | - |
| 8.8455 | 118450 | 0.0003 | - |
| 8.8492 | 118500 | 0.0003 | - |
| 8.8530 | 118550 | 0.0003 | - |
| 8.8567 | 118600 | 0.0003 | - |
| 8.8604 | 118650 | 0.0003 | - |
| 8.8642 | 118700 | 0.0002 | - |
| 8.8679 | 118750 | 0.0003 | - |
| 8.8716 | 118800 | 0.0008 | - |
| 8.8754 | 118850 | 0.0006 | - |
| 8.8791 | 118900 | 0.0004 | - |
| 8.8828 | 118950 | 0.0005 | - |
| 8.8866 | 119000 | 0.0002 | - |
| 8.8903 | 119050 | 0.0005 | - |
| 8.8940 | 119100 | 0.0003 | - |
| 8.8978 | 119150 | 0.0008 | - |
| 8.9015 | 119200 | 0.0004 | - |
| 8.9052 | 119250 | 0.0007 | - |
| 8.9090 | 119300 | 0.0008 | - |
| 8.9127 | 119350 | 0.0004 | - |
| 8.9164 | 119400 | 0.0003 | - |
| 8.9202 | 119450 | 0.0003 | - |
| 8.9239 | 119500 | 0.0003 | - |
| 8.9276 | 119550 | 0.0011 | - |
| 8.9314 | 119600 | 0.0002 | - |
| 8.9351 | 119650 | 0.0003 | - |
| 8.9388 | 119700 | 0.0002 | - |
| 8.9426 | 119750 | 0.0007 | - |
| 8.9463 | 119800 | 0.0002 | - |
| 8.9500 | 119850 | 0.0004 | - |
| 8.9538 | 119900 | 0.0003 | - |
| 8.9575 | 119950 | 0.0008 | - |
| 8.9612 | 120000 | 0.0003 | - |
| 8.9650 | 120050 | 0.0008 | - |
| 8.9687 | 120100 | 0.0001 | - |
| 8.9724 | 120150 | 0.0001 | - |
| 8.9762 | 120200 | 0.0005 | - |
| 8.9799 | 120250 | 0.0005 | - |
| 8.9836 | 120300 | 0.0003 | - |
| 8.9874 | 120350 | 0.0008 | - |
| 8.9911 | 120400 | 0.0002 | - |
| 8.9948 | 120450 | 0.0002 | - |
| 8.9986 | 120500 | 0.0004 | - |
| 9.0023 | 120550 | 0.0002 | - |
| 9.0060 | 120600 | 0.0003 | - |
| 9.0098 | 120650 | 0.0005 | - |
| 9.0135 | 120700 | 0.0004 | - |
| 9.0173 | 120750 | 0.0002 | - |
| 9.0210 | 120800 | 0.0002 | - |
| 9.0247 | 120850 | 0.0009 | - |
| 9.0285 | 120900 | 0.0005 | - |
| 9.0322 | 120950 | 0.0004 | - |
| 9.0359 | 121000 | 0.0001 | - |
| 9.0397 | 121050 | 0.0001 | - |
| 9.0434 | 121100 | 0.0003 | - |
| 9.0471 | 121150 | 0.0007 | - |
| 9.0509 | 121200 | 0.0006 | - |
| 9.0546 | 121250 | 0.0002 | - |
| 9.0583 | 121300 | 0.0002 | - |
| 9.0621 | 121350 | 0.0002 | - |
| 9.0658 | 121400 | 0.0004 | - |
| 9.0695 | 121450 | 0.0001 | - |
| 9.0733 | 121500 | 0.0004 | - |
| 9.0770 | 121550 | 0.0004 | - |
| 9.0807 | 121600 | 0.0001 | - |
| 9.0845 | 121650 | 0.0002 | - |
| 9.0882 | 121700 | 0.0004 | - |
| 9.0919 | 121750 | 0.0001 | - |
| 9.0957 | 121800 | 0.0003 | - |
| 9.0994 | 121850 | 0.0003 | - |
| 9.1031 | 121900 | 0.0004 | - |
| 9.1069 | 121950 | 0.0004 | - |
| 9.1106 | 122000 | 0.0005 | - |
| 9.1143 | 122050 | 0.0005 | - |
| 9.1181 | 122100 | 0.0008 | - |
| 9.1218 | 122150 | 0.0007 | - |
| 9.1255 | 122200 | 0.0003 | - |
| 9.1293 | 122250 | 0.0003 | - |
| 9.1330 | 122300 | 0.0005 | - |
| 9.1367 | 122350 | 0.0004 | - |
| 9.1405 | 122400 | 0.0002 | - |
| 9.1442 | 122450 | 0.0003 | - |
| 9.1479 | 122500 | 0.0001 | - |
| 9.1517 | 122550 | 0.0004 | - |
| 9.1554 | 122600 | 0.0001 | - |
| 9.1591 | 122650 | 0.0002 | - |
| 9.1629 | 122700 | 0.0008 | - |
| 9.1666 | 122750 | 0.0002 | - |
| 9.1703 | 122800 | 0.0002 | - |
| 9.1741 | 122850 | 0.0005 | - |
| 9.1778 | 122900 | 0.0002 | - |
| 9.1815 | 122950 | 0.0005 | - |
| 9.1853 | 123000 | 0.0007 | - |
| 9.1890 | 123050 | 0.0002 | - |
| 9.1927 | 123100 | 0.0005 | - |
| 9.1965 | 123150 | 0.0004 | - |
| 9.2002 | 123200 | 0.0004 | - |
| 9.2039 | 123250 | 0.0006 | - |
| 9.2077 | 123300 | 0.0005 | - |
| 9.2114 | 123350 | 0.0003 | - |
| 9.2151 | 123400 | 0.0007 | - |
| 9.2189 | 123450 | 0.0005 | - |
| 9.2226 | 123500 | 0.0004 | - |
| 9.2263 | 123550 | 0.0006 | - |
| 9.2301 | 123600 | 0.0004 | - |
| 9.2338 | 123650 | 0.0005 | - |
| 9.2375 | 123700 | 0.0004 | - |
| 9.2413 | 123750 | 0.0005 | - |
| 9.2450 | 123800 | 0.0005 | - |
| 9.2487 | 123850 | 0.0002 | - |
| 9.2525 | 123900 | 0.0013 | - |
| 9.2562 | 123950 | 0.0006 | - |
| 9.2600 | 124000 | 0.0005 | - |
| 9.2637 | 124050 | 0.001 | - |
| 9.2674 | 124100 | 0.0005 | - |
| 9.2712 | 124150 | 0.0009 | - |
| 9.2749 | 124200 | 0.0004 | - |
| 9.2786 | 124250 | 0.001 | - |
| 9.2824 | 124300 | 0.0008 | - |
| 9.2861 | 124350 | 0.0009 | - |
| 9.2898 | 124400 | 0.0008 | - |
| 9.2936 | 124450 | 0.0009 | - |
| 9.2973 | 124500 | 0.0002 | - |
| 9.3010 | 124550 | 0.0005 | - |
| 9.3048 | 124600 | 0.0011 | - |
| 9.3085 | 124650 | 0.0004 | - |
| 9.3122 | 124700 | 0.0005 | - |
| 9.3160 | 124750 | 0.0007 | - |
| 9.3197 | 124800 | 0.0008 | - |
| 9.3234 | 124850 | 0.0005 | - |
| 9.3272 | 124900 | 0.0007 | - |
| 9.3309 | 124950 | 0.0006 | - |
| 9.3346 | 125000 | 0.0005 | - |
| 9.3384 | 125050 | 0.0003 | - |
| 9.3421 | 125100 | 0.0002 | - |
| 9.3458 | 125150 | 0.0004 | - |
| 9.3496 | 125200 | 0.0006 | - |
| 9.3533 | 125250 | 0.0005 | - |
| 9.3570 | 125300 | 0.0004 | - |
| 9.3608 | 125350 | 0.0006 | - |
| 9.3645 | 125400 | 0.0004 | - |
| 9.3682 | 125450 | 0.0002 | - |
| 9.3720 | 125500 | 0.0 | - |
| 9.3757 | 125550 | 0.0002 | - |
| 9.3794 | 125600 | 0.0001 | - |
| 9.3832 | 125650 | 0.0002 | - |
| 9.3869 | 125700 | 0.0005 | - |
| 9.3906 | 125750 | 0.0005 | - |
| 9.3944 | 125800 | 0.0008 | - |
| 9.3981 | 125850 | 0.0004 | - |
| 9.4018 | 125900 | 0.0006 | - |
| 9.4056 | 125950 | 0.0009 | - |
| 9.4093 | 126000 | 0.0007 | - |
| 9.4130 | 126050 | 0.0007 | - |
| 9.4168 | 126100 | 0.0005 | - |
| 9.4205 | 126150 | 0.0005 | - |
| 9.4242 | 126200 | 0.0004 | - |
| 9.4280 | 126250 | 0.0003 | - |
| 9.4317 | 126300 | 0.0006 | - |
| 9.4354 | 126350 | 0.0003 | - |
| 9.4392 | 126400 | 0.0005 | - |
| 9.4429 | 126450 | 0.0002 | - |
| 9.4466 | 126500 | 0.0005 | - |
| 9.4504 | 126550 | 0.0005 | - |
| 9.4541 | 126600 | 0.0002 | - |
| 9.4578 | 126650 | 0.0004 | - |
| 9.4616 | 126700 | 0.0001 | - |
| 9.4653 | 126750 | 0.0001 | - |
| 9.4690 | 126800 | 0.0 | - |
| 9.4728 | 126850 | 0.001 | - |
| 9.4765 | 126900 | 0.0009 | - |
| 9.4802 | 126950 | 0.0004 | - |
| 9.4840 | 127000 | 0.0001 | - |
| 9.4877 | 127050 | 0.0002 | - |
| 9.4914 | 127100 | 0.0002 | - |
| 9.4952 | 127150 | 0.0005 | - |
| 9.4989 | 127200 | 0.0004 | - |
| 9.5027 | 127250 | 0.0001 | - |
| 9.5064 | 127300 | 0.0012 | - |
| 9.5101 | 127350 | 0.0004 | - |
| 9.5139 | 127400 | 0.0001 | - |
| 9.5176 | 127450 | 0.0004 | - |
| 9.5213 | 127500 | 0.0005 | - |
| 9.5251 | 127550 | 0.0005 | - |
| 9.5288 | 127600 | 0.0005 | - |
| 9.5325 | 127650 | 0.0003 | - |
| 9.5363 | 127700 | 0.0007 | - |
| 9.5400 | 127750 | 0.0004 | - |
| 9.5437 | 127800 | 0.0006 | - |
| 9.5475 | 127850 | 0.0003 | - |
| 9.5512 | 127900 | 0.0003 | - |
| 9.5549 | 127950 | 0.0001 | - |
| 9.5587 | 128000 | 0.0004 | - |
| 9.5624 | 128050 | 0.0003 | - |
| 9.5661 | 128100 | 0.0002 | - |
| 9.5699 | 128150 | 0.0003 | - |
| 9.5736 | 128200 | 0.0004 | - |
| 9.5773 | 128250 | 0.0001 | - |
| 9.5811 | 128300 | 0.0012 | - |
| 9.5848 | 128350 | 0.0006 | - |
| 9.5885 | 128400 | 0.0003 | - |
| 9.5923 | 128450 | 0.0008 | - |
| 9.5960 | 128500 | 0.0004 | - |
| 9.5997 | 128550 | 0.0014 | - |
| 9.6035 | 128600 | 0.0011 | - |
| 9.6072 | 128650 | 0.0011 | - |
| 9.6109 | 128700 | 0.0011 | - |
| 9.6147 | 128750 | 0.0011 | - |
| 9.6184 | 128800 | 0.001 | - |
| 9.6221 | 128850 | 0.0006 | - |
| 9.6259 | 128900 | 0.0004 | - |
| 9.6296 | 128950 | 0.0007 | - |
| 9.6333 | 129000 | 0.0007 | - |
| 9.6371 | 129050 | 0.0011 | - |
| 9.6408 | 129100 | 0.0006 | - |
| 9.6445 | 129150 | 0.0005 | - |
| 9.6483 | 129200 | 0.0005 | - |
| 9.6520 | 129250 | 0.001 | - |
| 9.6557 | 129300 | 0.0002 | - |
| 9.6595 | 129350 | 0.0003 | - |
| 9.6632 | 129400 | 0.0007 | - |
| 9.6669 | 129450 | 0.0004 | - |
| 9.6707 | 129500 | 0.0009 | - |
| 9.6744 | 129550 | 0.0004 | - |
| 9.6781 | 129600 | 0.0007 | - |
| 9.6819 | 129650 | 0.0007 | - |
| 9.6856 | 129700 | 0.0003 | - |
| 9.6893 | 129750 | 0.0007 | - |
| 9.6931 | 129800 | 0.0002 | - |
| 9.6968 | 129850 | 0.0003 | - |
| 9.7005 | 129900 | 0.0008 | - |
| 9.7043 | 129950 | 0.0009 | - |
| 9.7080 | 130000 | 0.0005 | - |
| 9.7117 | 130050 | 0.0002 | - |
| 9.7155 | 130100 | 0.0007 | - |
| 9.7192 | 130150 | 0.0009 | - |
| 9.7229 | 130200 | 0.0001 | - |
| 9.7267 | 130250 | 0.0002 | - |
| 9.7304 | 130300 | 0.0004 | - |
| 9.7341 | 130350 | 0.0002 | - |
| 9.7379 | 130400 | 0.0005 | - |
| 9.7416 | 130450 | 0.0003 | - |
| 9.7454 | 130500 | 0.0007 | - |
| 9.7491 | 130550 | 0.0004 | - |
| 9.7528 | 130600 | 0.0 | - |
| 9.7566 | 130650 | 0.0007 | - |
| 9.7603 | 130700 | 0.0002 | - |
| 9.7640 | 130750 | 0.0007 | - |
| 9.7678 | 130800 | 0.0007 | - |
| 9.7715 | 130850 | 0.0004 | - |
| 9.7752 | 130900 | 0.0003 | - |
| 9.7790 | 130950 | 0.0004 | - |
| 9.7827 | 131000 | 0.0002 | - |
| 9.7864 | 131050 | 0.0002 | - |
| 9.7902 | 131100 | 0.0002 | - |
| 9.7939 | 131150 | 0.0001 | - |
| 9.7976 | 131200 | 0.0002 | - |
| 9.8014 | 131250 | 0.0002 | - |
| 9.8051 | 131300 | 0.0003 | - |
| 9.8088 | 131350 | 0.0007 | - |
| 9.8126 | 131400 | 0.0004 | - |
| 9.8163 | 131450 | 0.0003 | - |
| 9.8200 | 131500 | 0.0006 | - |
| 9.8238 | 131550 | 0.0001 | - |
| 9.8275 | 131600 | 0.0004 | - |
| 9.8312 | 131650 | 0.0006 | - |
| 9.8350 | 131700 | 0.0002 | - |
| 9.8387 | 131750 | 0.0003 | - |
| 9.8424 | 131800 | 0.0004 | - |
| 9.8462 | 131850 | 0.0002 | - |
| 9.8499 | 131900 | 0.0002 | - |
| 9.8536 | 131950 | 0.0 | - |
| 9.8574 | 132000 | 0.0004 | - |
| 9.8611 | 132050 | 0.0018 | - |
| 9.8648 | 132100 | 0.0007 | - |
| 9.8686 | 132150 | 0.0022 | - |
| 9.8723 | 132200 | 0.0007 | - |
| 9.8760 | 132250 | 0.0008 | - |
| 9.8798 | 132300 | 0.0008 | - |
| 9.8835 | 132350 | 0.0007 | - |
| 9.8872 | 132400 | 0.0008 | - |
| 9.8910 | 132450 | 0.0002 | - |
| 9.8947 | 132500 | 0.0006 | - |
| 9.8984 | 132550 | 0.0007 | - |
| 9.9022 | 132600 | 0.0003 | - |
| 9.9059 | 132650 | 0.0005 | - |
| 9.9096 | 132700 | 0.0004 | - |
| 9.9134 | 132750 | 0.0004 | - |
| 9.9171 | 132800 | 0.0004 | - |
| 9.9208 | 132850 | 0.0009 | - |
| 9.9246 | 132900 | 0.0002 | - |
| 9.9283 | 132950 | 0.001 | - |
| 9.9320 | 133000 | 0.0001 | - |
| 9.9358 | 133050 | 0.0004 | - |
| 9.9395 | 133100 | 0.0001 | - |
| 9.9432 | 133150 | 0.0007 | - |
| 9.9470 | 133200 | 0.0006 | - |
| 9.9507 | 133250 | 0.0002 | - |
| 9.9544 | 133300 | 0.0003 | - |
| 9.9582 | 133350 | 0.0003 | - |
| 9.9619 | 133400 | 0.0006 | - |
| 9.9656 | 133450 | 0.0008 | - |
| 9.9694 | 133500 | 0.0004 | - |
| 9.9731 | 133550 | 0.0009 | - |
| 9.9769 | 133600 | 0.0003 | - |
| 9.9806 | 133650 | 0.0003 | - |
| 9.9843 | 133700 | 0.0004 | - |
| 9.9881 | 133750 | 0.0003 | - |
| 9.9918 | 133800 | 0.0006 | - |
| 9.9955 | 133850 | 0.0006 | - |
| 9.9993 | 133900 | 0.0004 | - |
| 10.0030 | 133950 | 0.0004 | - |
| 10.0067 | 134000 | 0.0006 | - |
| 10.0105 | 134050 | 0.001 | - |
| 10.0142 | 134100 | 0.0004 | - |
| 10.0179 | 134150 | 0.0006 | - |
| 10.0217 | 134200 | 0.0004 | - |
| 10.0254 | 134250 | 0.0008 | - |
| 10.0291 | 134300 | 0.0002 | - |
| 10.0329 | 134350 | 0.0004 | - |
| 10.0366 | 134400 | 0.0009 | - |
| 10.0403 | 134450 | 0.0011 | - |
| 10.0441 | 134500 | 0.0007 | - |
| 10.0478 | 134550 | 0.0007 | - |
| 10.0515 | 134600 | 0.0007 | - |
| 10.0553 | 134650 | 0.0012 | - |
| 10.0590 | 134700 | 0.0008 | - |
| 10.0627 | 134750 | 0.0003 | - |
| 10.0665 | 134800 | 0.0005 | - |
| 10.0702 | 134850 | 0.0002 | - |
| 10.0739 | 134900 | 0.0005 | - |
| 10.0777 | 134950 | 0.0006 | - |
| 10.0814 | 135000 | 0.0008 | - |
| 10.0851 | 135050 | 0.0007 | - |
| 10.0889 | 135100 | 0.0003 | - |
| 10.0926 | 135150 | 0.0004 | - |
| 10.0963 | 135200 | 0.0003 | - |
| 10.1001 | 135250 | 0.0004 | - |
| 10.1038 | 135300 | 0.0005 | - |
| 10.1075 | 135350 | 0.0005 | - |
| 10.1113 | 135400 | 0.0007 | - |
| 10.1150 | 135450 | 0.0009 | - |
| 10.1187 | 135500 | 0.0004 | - |
| 10.1225 | 135550 | 0.0005 | - |
| 10.1262 | 135600 | 0.0002 | - |
| 10.1299 | 135650 | 0.0005 | - |
| 10.1337 | 135700 | 0.0004 | - |
| 10.1374 | 135750 | 0.0001 | - |
| 10.1411 | 135800 | 0.0004 | - |
| 10.1449 | 135850 | 0.0003 | - |
| 10.1486 | 135900 | 0.0005 | - |
| 10.1523 | 135950 | 0.0002 | - |
| 10.1561 | 136000 | 0.0001 | - |
| 10.1598 | 136050 | 0.0006 | - |
| 10.1635 | 136100 | 0.0005 | - |
| 10.1673 | 136150 | 0.0007 | - |
| 10.1710 | 136200 | 0.0004 | - |
| 10.1747 | 136250 | 0.0005 | - |
| 10.1785 | 136300 | 0.0006 | - |
| 10.1822 | 136350 | 0.0005 | - |
| 10.1859 | 136400 | 0.0007 | - |
| 10.1897 | 136450 | 0.0007 | - |
| 10.1934 | 136500 | 0.0002 | - |
| 10.1971 | 136550 | 0.0001 | - |
| 10.2009 | 136600 | 0.0001 | - |
| 10.2046 | 136650 | 0.0002 | - |
| 10.2083 | 136700 | 0.0002 | - |
| 10.2121 | 136750 | 0.0007 | - |
| 10.2158 | 136800 | 0.001 | - |
| 10.2196 | 136850 | 0.0004 | - |
| 10.2233 | 136900 | 0.0006 | - |
| 10.2270 | 136950 | 0.0001 | - |
| 10.2308 | 137000 | 0.0008 | - |
| 10.2345 | 137050 | 0.0006 | - |
| 10.2382 | 137100 | 0.0004 | - |
| 10.2420 | 137150 | 0.0002 | - |
| 10.2457 | 137200 | 0.0008 | - |
| 10.2494 | 137250 | 0.0002 | - |
| 10.2532 | 137300 | 0.0005 | - |
| 10.2569 | 137350 | 0.0003 | - |
| 10.2606 | 137400 | 0.0005 | - |
| 10.2644 | 137450 | 0.0003 | - |
| 10.2681 | 137500 | 0.0004 | - |
| 10.2718 | 137550 | 0.0003 | - |
| 10.2756 | 137600 | 0.0002 | - |
| 10.2793 | 137650 | 0.0006 | - |
| 10.2830 | 137700 | 0.0003 | - |
| 10.2868 | 137750 | 0.0004 | - |
| 10.2905 | 137800 | 0.0006 | - |
| 10.2942 | 137850 | 0.0004 | - |
| 10.2980 | 137900 | 0.0009 | - |
| 10.3017 | 137950 | 0.0003 | - |
| 10.3054 | 138000 | 0.0001 | - |
| 10.3092 | 138050 | 0.0004 | - |
| 10.3129 | 138100 | 0.0004 | - |
| 10.3166 | 138150 | 0.0006 | - |
| 10.3204 | 138200 | 0.0004 | - |
| 10.3241 | 138250 | 0.0006 | - |
| 10.3278 | 138300 | 0.0003 | - |
| 10.3316 | 138350 | 0.0014 | - |
| 10.3353 | 138400 | 0.0006 | - |
| 10.3390 | 138450 | 0.0003 | - |
| 10.3428 | 138500 | 0.0003 | - |
| 10.3465 | 138550 | 0.0001 | - |
| 10.3502 | 138600 | 0.0006 | - |
| 10.3540 | 138650 | 0.0003 | - |
| 10.3577 | 138700 | 0.0006 | - |
| 10.3614 | 138750 | 0.0003 | - |
| 10.3652 | 138800 | 0.0006 | - |
| 10.3689 | 138850 | 0.0006 | - |
| 10.3726 | 138900 | 0.0004 | - |
| 10.3764 | 138950 | 0.0009 | - |
| 10.3801 | 139000 | 0.0013 | - |
| 10.3838 | 139050 | 0.0005 | - |
| 10.3876 | 139100 | 0.0003 | - |
| 10.3913 | 139150 | 0.0006 | - |
| 10.3950 | 139200 | 0.0006 | - |
| 10.3988 | 139250 | 0.0001 | - |
| 10.4025 | 139300 | 0.0002 | - |
| 10.4062 | 139350 | 0.0002 | - |
| 10.4100 | 139400 | 0.0007 | - |
| 10.4137 | 139450 | 0.0005 | - |
| 10.4174 | 139500 | 0.0003 | - |
| 10.4212 | 139550 | 0.0004 | - |
| 10.4249 | 139600 | 0.0007 | - |
| 10.4286 | 139650 | 0.0006 | - |
| 10.4324 | 139700 | 0.0002 | - |
| 10.4361 | 139750 | 0.0003 | - |
| 10.4398 | 139800 | 0.0006 | - |
| 10.4436 | 139850 | 0.0006 | - |
| 10.4473 | 139900 | 0.0005 | - |
| 10.4510 | 139950 | 0.0002 | - |
| 10.4548 | 140000 | 0.0004 | - |
| 10.4585 | 140050 | 0.0004 | - |
| 10.4623 | 140100 | 0.0002 | - |
| 10.4660 | 140150 | 0.0001 | - |
| 10.4697 | 140200 | 0.0002 | - |
| 10.4735 | 140250 | 0.0004 | - |
| 10.4772 | 140300 | 0.0001 | - |
| 10.4809 | 140350 | 0.0001 | - |
| 10.4847 | 140400 | 0.0005 | - |
| 10.4884 | 140450 | 0.0003 | - |
| 10.4921 | 140500 | 0.0005 | - |
| 10.4959 | 140550 | 0.0007 | - |
| 10.4996 | 140600 | 0.0006 | - |
| 10.5033 | 140650 | 0.0001 | - |
| 10.5071 | 140700 | 0.0002 | - |
| 10.5108 | 140750 | 0.0002 | - |
| 10.5145 | 140800 | 0.0003 | - |
| 10.5183 | 140850 | 0.0003 | - |
| 10.5220 | 140900 | 0.0004 | - |
| 10.5257 | 140950 | 0.001 | - |
| 10.5295 | 141000 | 0.0002 | - |
| 10.5332 | 141050 | 0.0005 | - |
| 10.5369 | 141100 | 0.0006 | - |
| 10.5407 | 141150 | 0.0005 | - |
| 10.5444 | 141200 | 0.0001 | - |
| 10.5481 | 141250 | 0.0007 | - |
| 10.5519 | 141300 | 0.0004 | - |
| 10.5556 | 141350 | 0.0001 | - |
| 10.5593 | 141400 | 0.0002 | - |
| 10.5631 | 141450 | 0.0005 | - |
| 10.5668 | 141500 | 0.0006 | - |
| 10.5705 | 141550 | 0.0002 | - |
| 10.5743 | 141600 | 0.0003 | - |
| 10.5780 | 141650 | 0.0009 | - |
| 10.5817 | 141700 | 0.0006 | - |
| 10.5855 | 141750 | 0.0012 | - |
| 10.5892 | 141800 | 0.0008 | - |
| 10.5929 | 141850 | 0.001 | - |
| 10.5967 | 141900 | 0.0005 | - |
| 10.6004 | 141950 | 0.0004 | - |
| 10.6041 | 142000 | 0.0014 | - |
| 10.6079 | 142050 | 0.0002 | - |
| 10.6116 | 142100 | 0.0007 | - |
| 10.6153 | 142150 | 0.0005 | - |
| 10.6191 | 142200 | 0.0005 | - |
| 10.6228 | 142250 | 0.0009 | - |
| 10.6265 | 142300 | 0.0006 | - |
| 10.6303 | 142350 | 0.0004 | - |
| 10.6340 | 142400 | 0.0004 | - |
| 10.6377 | 142450 | 0.0003 | - |
| 10.6415 | 142500 | 0.0008 | - |
| 10.6452 | 142550 | 0.0004 | - |
| 10.6489 | 142600 | 0.0003 | - |
| 10.6527 | 142650 | 0.0003 | - |
| 10.6564 | 142700 | 0.0005 | - |
| 10.6601 | 142750 | 0.0004 | - |
| 10.6639 | 142800 | 0.0002 | - |
| 10.6676 | 142850 | 0.0009 | - |
| 10.6713 | 142900 | 0.0004 | - |
| 10.6751 | 142950 | 0.0002 | - |
| 10.6788 | 143000 | 0.0004 | - |
| 10.6825 | 143050 | 0.0004 | - |
| 10.6863 | 143100 | 0.0001 | - |
| 10.6900 | 143150 | 0.0001 | - |
| 10.6937 | 143200 | 0.0006 | - |
| 10.6975 | 143250 | 0.0004 | - |
| 10.7012 | 143300 | 0.0006 | - |
| 10.7050 | 143350 | 0.0005 | - |
| 10.7087 | 143400 | 0.0002 | - |
| 10.7124 | 143450 | 0.0002 | - |
| 10.7162 | 143500 | 0.0008 | - |
| 10.7199 | 143550 | 0.0004 | - |
| 10.7236 | 143600 | 0.0002 | - |
| 10.7274 | 143650 | 0.0004 | - |
| 10.7311 | 143700 | 0.0004 | - |
| 10.7348 | 143750 | 0.0004 | - |
| 10.7386 | 143800 | 0.0002 | - |
| 10.7423 | 143850 | 0.0003 | - |
| 10.7460 | 143900 | 0.0003 | - |
| 10.7498 | 143950 | 0.0006 | - |
| 10.7535 | 144000 | 0.0004 | - |
| 10.7572 | 144050 | 0.0003 | - |
| 10.7610 | 144100 | 0.0004 | - |
| 10.7647 | 144150 | 0.0009 | - |
| 10.7684 | 144200 | 0.0006 | - |
| 10.7722 | 144250 | 0.0009 | - |
| 10.7759 | 144300 | 0.0007 | - |
| 10.7796 | 144350 | 0.0001 | - |
| 10.7834 | 144400 | 0.0005 | - |
| 10.7871 | 144450 | 0.0005 | - |
| 10.7908 | 144500 | 0.0004 | - |
| 10.7946 | 144550 | 0.0005 | - |
| 10.7983 | 144600 | 0.0003 | - |
| 10.8020 | 144650 | 0.0002 | - |
| 10.8058 | 144700 | 0.0004 | - |
| 10.8095 | 144750 | 0.0009 | - |
| 10.8132 | 144800 | 0.0004 | - |
| 10.8170 | 144850 | 0.0005 | - |
| 10.8207 | 144900 | 0.0001 | - |
| 10.8244 | 144950 | 0.0002 | - |
| 10.8282 | 145000 | 0.0007 | - |
| 10.8319 | 145050 | 0.0003 | - |
| 10.8356 | 145100 | 0.0001 | - |
| 10.8394 | 145150 | 0.0002 | - |
| 10.8431 | 145200 | 0.0005 | - |
| 10.8468 | 145250 | 0.0004 | - |
| 10.8506 | 145300 | 0.0005 | - |
| 10.8543 | 145350 | 0.0008 | - |
| 10.8580 | 145400 | 0.0003 | - |
| 10.8618 | 145450 | 0.0001 | - |
| 10.8655 | 145500 | 0.0005 | - |
| 10.8692 | 145550 | 0.0004 | - |
| 10.8730 | 145600 | 0.0003 | - |
| 10.8767 | 145650 | 0.0005 | - |
| 10.8804 | 145700 | 0.0004 | - |
| 10.8842 | 145750 | 0.0008 | - |
| 10.8879 | 145800 | 0.0003 | - |
| 10.8916 | 145850 | 0.0004 | - |
| 10.8954 | 145900 | 0.0001 | - |
| 10.8991 | 145950 | 0.0003 | - |
| 10.9028 | 146000 | 0.0005 | - |
| 10.9066 | 146050 | 0.0009 | - |
| 10.9103 | 146100 | 0.0012 | - |
| 10.9140 | 146150 | 0.0001 | - |
| 10.9178 | 146200 | 0.0002 | - |
| 10.9215 | 146250 | 0.0001 | - |
| 10.9252 | 146300 | 0.0 | - |
| 10.9290 | 146350 | 0.0001 | - |
| 10.9327 | 146400 | 0.0006 | - |
| 10.9364 | 146450 | 0.0002 | - |
| 10.9402 | 146500 | 0.0 | - |
| 10.9439 | 146550 | 0.0001 | - |
| 10.9477 | 146600 | 0.0003 | - |
| 10.9514 | 146650 | 0.0001 | - |
| 10.9551 | 146700 | 0.0002 | - |
| 10.9589 | 146750 | 0.0005 | - |
| 10.9626 | 146800 | 0.0002 | - |
| 10.9663 | 146850 | 0.0003 | - |
| 10.9701 | 146900 | 0.0002 | - |
| 10.9738 | 146950 | 0.0004 | - |
| 10.9775 | 147000 | 0.0002 | - |
| 10.9813 | 147050 | 0.0005 | - |
| 10.9850 | 147100 | 0.0002 | - |
| 10.9887 | 147150 | 0.0002 | - |
| 10.9925 | 147200 | 0.0002 | - |
| 10.9962 | 147250 | 0.0002 | - |
| 10.9999 | 147300 | 0.0002 | - |
| 11.0037 | 147350 | 0.0002 | - |
| 11.0074 | 147400 | 0.0001 | - |
| 11.0111 | 147450 | 0.0002 | - |
| 11.0149 | 147500 | 0.0003 | - |
| 11.0186 | 147550 | 0.0002 | - |
| 11.0223 | 147600 | 0.0 | - |
| 11.0261 | 147650 | 0.0002 | - |
| 11.0298 | 147700 | 0.0002 | - |
| 11.0335 | 147750 | 0.0001 | - |
| 11.0373 | 147800 | 0.0001 | - |
| 11.0410 | 147850 | 0.0005 | - |
| 11.0447 | 147900 | 0.0002 | - |
| 11.0485 | 147950 | 0.0006 | - |
| 11.0522 | 148000 | 0.0002 | - |
| 11.0559 | 148050 | 0.0003 | - |
| 11.0597 | 148100 | 0.0003 | - |
| 11.0634 | 148150 | 0.0001 | - |
| 11.0671 | 148200 | 0.0003 | - |
| 11.0709 | 148250 | 0.0 | - |
| 11.0746 | 148300 | 0.0 | - |
| 11.0783 | 148350 | 0.0003 | - |
| 11.0821 | 148400 | 0.0004 | - |
| 11.0858 | 148450 | 0.0003 | - |
| 11.0895 | 148500 | 0.0004 | - |
| 11.0933 | 148550 | 0.0004 | - |
| 11.0970 | 148600 | 0.0005 | - |
| 11.1007 | 148650 | 0.0003 | - |
| 11.1045 | 148700 | 0.0005 | - |
| 11.1082 | 148750 | 0.0003 | - |
| 11.1119 | 148800 | 0.0007 | - |
| 11.1157 | 148850 | 0.0002 | - |
| 11.1194 | 148900 | 0.0008 | - |
| 11.1231 | 148950 | 0.0001 | - |
| 11.1269 | 149000 | 0.0003 | - |
| 11.1306 | 149050 | 0.0002 | - |
| 11.1343 | 149100 | 0.0002 | - |
| 11.1381 | 149150 | 0.0004 | - |
| 11.1418 | 149200 | 0.0002 | - |
| 11.1455 | 149250 | 0.0002 | - |
| 11.1493 | 149300 | 0.0006 | - |
| 11.1530 | 149350 | 0.0003 | - |
| 11.1567 | 149400 | 0.0006 | - |
| 11.1605 | 149450 | 0.0007 | - |
| 11.1642 | 149500 | 0.0004 | - |
| 11.1679 | 149550 | 0.0004 | - |
| 11.1717 | 149600 | 0.0006 | - |
| 11.1754 | 149650 | 0.0007 | - |
| 11.1792 | 149700 | 0.0006 | - |
| 11.1829 | 149750 | 0.0006 | - |
| 11.1866 | 149800 | 0.0002 | - |
| 11.1904 | 149850 | 0.0004 | - |
| 11.1941 | 149900 | 0.0004 | - |
| 11.1978 | 149950 | 0.0004 | - |
| 11.2016 | 150000 | 0.0006 | - |
| 11.2053 | 150050 | 0.0002 | - |
| 11.2090 | 150100 | 0.0004 | - |
| 11.2128 | 150150 | 0.0002 | - |
| 11.2165 | 150200 | 0.0003 | - |
| 11.2202 | 150250 | 0.0003 | - |
| 11.2240 | 150300 | 0.0005 | - |
| 11.2277 | 150350 | 0.0005 | - |
| 11.2314 | 150400 | 0.0002 | - |
| 11.2352 | 150450 | 0.0005 | - |
| 11.2389 | 150500 | 0.0002 | - |
| 11.2426 | 150550 | 0.0001 | - |
| 11.2464 | 150600 | 0.0 | - |
| 11.2501 | 150650 | 0.0008 | - |
| 11.2538 | 150700 | 0.0004 | - |
| 11.2576 | 150750 | 0.0004 | - |
| 11.2613 | 150800 | 0.0001 | - |
| 11.2650 | 150850 | 0.0003 | - |
| 11.2688 | 150900 | 0.0004 | - |
| 11.2725 | 150950 | 0.0005 | - |
| 11.2762 | 151000 | 0.0002 | - |
| 11.2800 | 151050 | 0.0003 | - |
| 11.2837 | 151100 | 0.0 | - |
| 11.2874 | 151150 | 0.0005 | - |
| 11.2912 | 151200 | 0.0002 | - |
| 11.2949 | 151250 | 0.0002 | - |
| 11.2986 | 151300 | 0.0002 | - |
| 11.3024 | 151350 | 0.0003 | - |
| 11.3061 | 151400 | 0.0 | - |
| 11.3098 | 151450 | 0.0004 | - |
| 11.3136 | 151500 | 0.0004 | - |
| 11.3173 | 151550 | 0.0004 | - |
| 11.3210 | 151600 | 0.0004 | - |
| 11.3248 | 151650 | 0.0006 | - |
| 11.3285 | 151700 | 0.0005 | - |
| 11.3322 | 151750 | 0.001 | - |
| 11.3360 | 151800 | 0.0002 | - |
| 11.3397 | 151850 | 0.0002 | - |
| 11.3434 | 151900 | 0.0005 | - |
| 11.3472 | 151950 | 0.0002 | - |
| 11.3509 | 152000 | 0.0 | - |
| 11.3546 | 152050 | 0.0002 | - |
| 11.3584 | 152100 | 0.0005 | - |
| 11.3621 | 152150 | 0.0001 | - |
| 11.3658 | 152200 | 0.0006 | - |
| 11.3696 | 152250 | 0.0002 | - |
| 11.3733 | 152300 | 0.0005 | - |
| 11.3770 | 152350 | 0.0002 | - |
| 11.3808 | 152400 | 0.0004 | - |
| 11.3845 | 152450 | 0.0004 | - |
| 11.3882 | 152500 | 0.0007 | - |
| 11.3920 | 152550 | 0.0007 | - |
| 11.3957 | 152600 | 0.0002 | - |
| 11.3994 | 152650 | 0.0003 | - |
| 11.4032 | 152700 | 0.0002 | - |
| 11.4069 | 152750 | 0.0004 | - |
| 11.4106 | 152800 | 0.0005 | - |
| 11.4144 | 152850 | 0.0001 | - |
| 11.4181 | 152900 | 0.0006 | - |
| 11.4219 | 152950 | 0.0005 | - |
| 11.4256 | 153000 | 0.0002 | - |
| 11.4293 | 153050 | 0.0005 | - |
| 11.4331 | 153100 | 0.0004 | - |
| 11.4368 | 153150 | 0.0002 | - |
| 11.4405 | 153200 | 0.0002 | - |
| 11.4443 | 153250 | 0.0005 | - |
| 11.4480 | 153300 | 0.0004 | - |
| 11.4517 | 153350 | 0.0002 | - |
| 11.4555 | 153400 | 0.0003 | - |
| 11.4592 | 153450 | 0.0 | - |
| 11.4629 | 153500 | 0.0002 | - |
| 11.4667 | 153550 | 0.0003 | - |
| 11.4704 | 153600 | 0.0002 | - |
| 11.4741 | 153650 | 0.0002 | - |
| 11.4779 | 153700 | 0.0005 | - |
| 11.4816 | 153750 | 0.0005 | - |
| 11.4853 | 153800 | 0.0005 | - |
| 11.4891 | 153850 | 0.0004 | - |
| 11.4928 | 153900 | 0.0005 | - |
| 11.4965 | 153950 | 0.0004 | - |
| 11.5003 | 154000 | 0.0007 | - |
| 11.5040 | 154050 | 0.0003 | - |
| 11.5077 | 154100 | 0.0 | - |
| 11.5115 | 154150 | 0.0008 | - |
| 11.5152 | 154200 | 0.0002 | - |
| 11.5189 | 154250 | 0.0002 | - |
| 11.5227 | 154300 | 0.0005 | - |
| 11.5264 | 154350 | 0.0002 | - |
| 11.5301 | 154400 | 0.0003 | - |
| 11.5339 | 154450 | 0.0 | - |
| 11.5376 | 154500 | 0.0005 | - |
| 11.5413 | 154550 | 0.0005 | - |
| 11.5451 | 154600 | 0.0003 | - |
| 11.5488 | 154650 | 0.0003 | - |
| 11.5525 | 154700 | 0.0001 | - |
| 11.5563 | 154750 | 0.0004 | - |
| 11.5600 | 154800 | 0.0003 | - |
| 11.5637 | 154850 | 0.0001 | - |
| 11.5675 | 154900 | 0.0003 | - |
| 11.5712 | 154950 | 0.0 | - |
| 11.5749 | 155000 | 0.0 | - |
| 11.5787 | 155050 | 0.0003 | - |
| 11.5824 | 155100 | 0.0005 | - |
| 11.5861 | 155150 | 0.0007 | - |
| 11.5899 | 155200 | 0.0003 | - |
| 11.5936 | 155250 | 0.0004 | - |
| 11.5973 | 155300 | 0.001 | - |
| 11.6011 | 155350 | 0.0011 | - |
| 11.6048 | 155400 | 0.0008 | - |
| 11.6085 | 155450 | 0.0007 | - |
| 11.6123 | 155500 | 0.0001 | - |
| 11.6160 | 155550 | 0.0001 | - |
| 11.6197 | 155600 | 0.0003 | - |
| 11.6235 | 155650 | 0.0005 | - |
| 11.6272 | 155700 | 0.0001 | - |
| 11.6309 | 155750 | 0.0007 | - |
| 11.6347 | 155800 | 0.0005 | - |
| 11.6384 | 155850 | 0.0003 | - |
| 11.6421 | 155900 | 0.0004 | - |
| 11.6459 | 155950 | 0.0007 | - |
| 11.6496 | 156000 | 0.0001 | - |
| 11.6533 | 156050 | 0.0007 | - |
| 11.6571 | 156100 | 0.0008 | - |
| 11.6608 | 156150 | 0.0007 | - |
| 11.6646 | 156200 | 0.0005 | - |
| 11.6683 | 156250 | 0.0005 | - |
| 11.6720 | 156300 | 0.0003 | - |
| 11.6758 | 156350 | 0.0002 | - |
| 11.6795 | 156400 | 0.0001 | - |
| 11.6832 | 156450 | 0.0003 | - |
| 11.6870 | 156500 | 0.0007 | - |
| 11.6907 | 156550 | 0.0002 | - |
| 11.6944 | 156600 | 0.0007 | - |
| 11.6982 | 156650 | 0.0004 | - |
| 11.7019 | 156700 | 0.0002 | - |
| 11.7056 | 156750 | 0.0002 | - |
| 11.7094 | 156800 | 0.0002 | - |
| 11.7131 | 156850 | 0.0005 | - |
| 11.7168 | 156900 | 0.0003 | - |
| 11.7206 | 156950 | 0.0002 | - |
| 11.7243 | 157000 | 0.0004 | - |
| 11.7280 | 157050 | 0.0008 | - |
| 11.7318 | 157100 | 0.0002 | - |
| 11.7355 | 157150 | 0.0002 | - |
| 11.7392 | 157200 | 0.0 | - |
| 11.7430 | 157250 | 0.0 | - |
| 11.7467 | 157300 | 0.0 | - |
| 11.7504 | 157350 | 0.0002 | - |
| 11.7542 | 157400 | 0.0004 | - |
| 11.7579 | 157450 | 0.0001 | - |
| 11.7616 | 157500 | 0.0004 | - |
| 11.7654 | 157550 | 0.0002 | - |
| 11.7691 | 157600 | 0.0008 | - |
| 11.7728 | 157650 | 0.0005 | - |
| 11.7766 | 157700 | 0.0005 | - |
| 11.7803 | 157750 | 0.0005 | - |
| 11.7840 | 157800 | 0.0004 | - |
| 11.7878 | 157850 | 0.0001 | - |
| 11.7915 | 157900 | 0.0001 | - |
| 11.7952 | 157950 | 0.0001 | - |
| 11.7990 | 158000 | 0.0002 | - |
| 11.8027 | 158050 | 0.0002 | - |
| 11.8064 | 158100 | 0.0002 | - |
| 11.8102 | 158150 | 0.0005 | - |
| 11.8139 | 158200 | 0.0004 | - |
| 11.8176 | 158250 | 0.0006 | - |
| 11.8214 | 158300 | 0.0004 | - |
| 11.8251 | 158350 | 0.0002 | - |
| 11.8288 | 158400 | 0.0004 | - |
| 11.8326 | 158450 | 0.0002 | - |
| 11.8363 | 158500 | 0.0001 | - |
| 11.8400 | 158550 | 0.0007 | - |
| 11.8438 | 158600 | 0.0005 | - |
| 11.8475 | 158650 | 0.0001 | - |
| 11.8512 | 158700 | 0.0001 | - |
| 11.8550 | 158750 | 0.0002 | - |
| 11.8587 | 158800 | 0.0001 | - |
| 11.8624 | 158850 | 0.0003 | - |
| 11.8662 | 158900 | 0.0005 | - |
| 11.8699 | 158950 | 0.0005 | - |
| 11.8736 | 159000 | 0.0001 | - |
| 11.8774 | 159050 | 0.0005 | - |
| 11.8811 | 159100 | 0.0001 | - |
| 11.8848 | 159150 | 0.0003 | - |
| 11.8886 | 159200 | 0.0 | - |
| 11.8923 | 159250 | 0.0002 | - |
| 11.8960 | 159300 | 0.0005 | - |
| 11.8998 | 159350 | 0.0001 | - |
| 11.9035 | 159400 | 0.0006 | - |
| 11.9073 | 159450 | 0.0005 | - |
| 11.9110 | 159500 | 0.0006 | - |
| 11.9147 | 159550 | 0.0004 | - |
| 11.9185 | 159600 | 0.0002 | - |
| 11.9222 | 159650 | 0.0013 | - |
| 11.9259 | 159700 | 0.0006 | - |
| 11.9297 | 159750 | 0.0002 | - |
| 11.9334 | 159800 | 0.0003 | - |
| 11.9371 | 159850 | 0.0003 | - |
| 11.9409 | 159900 | 0.0006 | - |
| 11.9446 | 159950 | 0.0002 | - |
| 11.9483 | 160000 | 0.0004 | - |
| 11.9521 | 160050 | 0.0002 | - |
| 11.9558 | 160100 | 0.0002 | - |
| 11.9595 | 160150 | 0.0004 | - |
| 11.9633 | 160200 | 0.0002 | - |
| 11.9670 | 160250 | 0.0 | - |
| 11.9707 | 160300 | 0.0005 | - |
| 11.9745 | 160350 | 0.0003 | - |
| 11.9782 | 160400 | 0.0002 | - |
| 11.9819 | 160450 | 0.0002 | - |
| 11.9857 | 160500 | 0.0002 | - |
| 11.9894 | 160550 | 0.0002 | - |
| 11.9931 | 160600 | 0.0003 | - |
| 11.9969 | 160650 | 0.0004 | - |
| 12.0006 | 160700 | 0.0002 | - |
| 12.0043 | 160750 | 0.0004 | - |
| 12.0081 | 160800 | 0.0002 | - |
| 12.0118 | 160850 | 0.0 | - |
| 12.0155 | 160900 | 0.0002 | - |
| 12.0193 | 160950 | 0.0008 | - |
| 12.0230 | 161000 | 0.0006 | - |
| 12.0267 | 161050 | 0.0004 | - |
| 12.0305 | 161100 | 0.0003 | - |
| 12.0342 | 161150 | 0.0003 | - |
| 12.0379 | 161200 | 0.0002 | - |
| 12.0417 | 161250 | 0.0005 | - |
| 12.0454 | 161300 | 0.0003 | - |
| 12.0491 | 161350 | 0.0003 | - |
| 12.0529 | 161400 | 0.0005 | - |
| 12.0566 | 161450 | 0.0003 | - |
| 12.0603 | 161500 | 0.0002 | - |
| 12.0641 | 161550 | 0.0004 | - |
| 12.0678 | 161600 | 0.0005 | - |
| 12.0715 | 161650 | 0.0004 | - |
| 12.0753 | 161700 | 0.0008 | - |
| 12.0790 | 161750 | 0.0002 | - |
| 12.0827 | 161800 | 0.0006 | - |
| 12.0865 | 161850 | 0.0001 | - |
| 12.0902 | 161900 | 0.0003 | - |
| 12.0939 | 161950 | 0.0004 | - |
| 12.0977 | 162000 | 0.0004 | - |
| 12.1014 | 162050 | 0.0002 | - |
| 12.1051 | 162100 | 0.0006 | - |
| 12.1089 | 162150 | 0.0002 | - |
| 12.1126 | 162200 | 0.0002 | - |
| 12.1163 | 162250 | 0.0005 | - |
| 12.1201 | 162300 | 0.0004 | - |
| 12.1238 | 162350 | 0.0001 | - |
| 12.1275 | 162400 | 0.0002 | - |
| 12.1313 | 162450 | 0.0003 | - |
| 12.1350 | 162500 | 0.0001 | - |
| 12.1387 | 162550 | 0.0005 | - |
| 12.1425 | 162600 | 0.0002 | - |
| 12.1462 | 162650 | 0.0 | - |
| 12.1500 | 162700 | 0.0003 | - |
| 12.1537 | 162750 | 0.0 | - |
| 12.1574 | 162800 | 0.0 | - |
| 12.1612 | 162850 | 0.0002 | - |
| 12.1649 | 162900 | 0.0004 | - |
| 12.1686 | 162950 | 0.0001 | - |
| 12.1724 | 163000 | 0.0003 | - |
| 12.1761 | 163050 | 0.0 | - |
| 12.1798 | 163100 | 0.0004 | - |
| 12.1836 | 163150 | 0.0 | - |
| 12.1873 | 163200 | 0.0 | - |
| 12.1910 | 163250 | 0.0001 | - |
| 12.1948 | 163300 | 0.0003 | - |
| 12.1985 | 163350 | 0.0006 | - |
| 12.2022 | 163400 | 0.0002 | - |
| 12.2060 | 163450 | 0.0001 | - |
| 12.2097 | 163500 | 0.0004 | - |
| 12.2134 | 163550 | 0.0 | - |
| 12.2172 | 163600 | 0.0001 | - |
| 12.2209 | 163650 | 0.0006 | - |
| 12.2246 | 163700 | 0.0002 | - |
| 12.2284 | 163750 | 0.0002 | - |
| 12.2321 | 163800 | 0.0003 | - |
| 12.2358 | 163850 | 0.0001 | - |
| 12.2396 | 163900 | 0.0005 | - |
| 12.2433 | 163950 | 0.0004 | - |
| 12.2470 | 164000 | 0.0002 | - |
| 12.2508 | 164050 | 0.0001 | - |
| 12.2545 | 164100 | 0.0005 | - |
| 12.2582 | 164150 | 0.0005 | - |
| 12.2620 | 164200 | 0.0001 | - |
| 12.2657 | 164250 | 0.0002 | - |
| 12.2694 | 164300 | 0.0004 | - |
| 12.2732 | 164350 | 0.0002 | - |
| 12.2769 | 164400 | 0.0003 | - |
| 12.2806 | 164450 | 0.0 | - |
| 12.2844 | 164500 | 0.0002 | - |
| 12.2881 | 164550 | 0.0001 | - |
| 12.2918 | 164600 | 0.0003 | - |
| 12.2956 | 164650 | 0.0006 | - |
| 12.2993 | 164700 | 0.0003 | - |
| 12.3030 | 164750 | 0.0008 | - |
| 12.3068 | 164800 | 0.0006 | - |
| 12.3105 | 164850 | 0.001 | - |
| 12.3142 | 164900 | 0.0008 | - |
| 12.3180 | 164950 | 0.001 | - |
| 12.3217 | 165000 | 0.0006 | - |
| 12.3254 | 165050 | 0.0003 | - |
| 12.3292 | 165100 | 0.0008 | - |
| 12.3329 | 165150 | 0.0012 | - |
| 12.3366 | 165200 | 0.001 | - |
| 12.3404 | 165250 | 0.0006 | - |
| 12.3441 | 165300 | 0.001 | - |
| 12.3478 | 165350 | 0.0005 | - |
| 12.3516 | 165400 | 0.0005 | - |
| 12.3553 | 165450 | 0.0005 | - |
| 12.3590 | 165500 | 0.0008 | - |
| 12.3628 | 165550 | 0.0004 | - |
| 12.3665 | 165600 | 0.0003 | - |
| 12.3702 | 165650 | 0.0006 | - |
| 12.3740 | 165700 | 0.0005 | - |
| 12.3777 | 165750 | 0.0004 | - |
| 12.3815 | 165800 | 0.0006 | - |
| 12.3852 | 165850 | 0.0006 | - |
| 12.3889 | 165900 | 0.0005 | - |
| 12.3927 | 165950 | 0.0002 | - |
| 12.3964 | 166000 | 0.0004 | - |
| 12.4001 | 166050 | 0.0004 | - |
| 12.4039 | 166100 | 0.0007 | - |
| 12.4076 | 166150 | 0.0006 | - |
| 12.4113 | 166200 | 0.0 | - |
| 12.4151 | 166250 | 0.0005 | - |
| 12.4188 | 166300 | 0.0003 | - |
| 12.4225 | 166350 | 0.0002 | - |
| 12.4263 | 166400 | 0.0004 | - |
| 12.4300 | 166450 | 0.0008 | - |
| 12.4337 | 166500 | 0.0008 | - |
| 12.4375 | 166550 | 0.0007 | - |
| 12.4412 | 166600 | 0.0002 | - |
| 12.4449 | 166650 | 0.0003 | - |
| 12.4487 | 166700 | 0.0008 | - |
| 12.4524 | 166750 | 0.0002 | - |
| 12.4561 | 166800 | 0.0002 | - |
| 12.4599 | 166850 | 0.0002 | - |
| 12.4636 | 166900 | 0.0007 | - |
| 12.4673 | 166950 | 0.0003 | - |
| 12.4711 | 167000 | 0.0002 | - |
| 12.4748 | 167050 | 0.0005 | - |
| 12.4785 | 167100 | 0.0001 | - |
| 12.4823 | 167150 | 0.0005 | - |
| 12.4860 | 167200 | 0.0 | - |
| 12.4897 | 167250 | 0.0003 | - |
| 12.4935 | 167300 | 0.0002 | - |
| 12.4972 | 167350 | 0.0002 | - |
| 12.5009 | 167400 | 0.0009 | - |
| 12.5047 | 167450 | 0.0006 | - |
| 12.5084 | 167500 | 0.0007 | - |
| 12.5121 | 167550 | 0.0002 | - |
| 12.5159 | 167600 | 0.0005 | - |
| 12.5196 | 167650 | 0.0004 | - |
| 12.5233 | 167700 | 0.0008 | - |
| 12.5271 | 167750 | 0.0 | - |
| 12.5308 | 167800 | 0.0002 | - |
| 12.5345 | 167850 | 0.0 | - |
| 12.5383 | 167900 | 0.0006 | - |
| 12.5420 | 167950 | 0.0003 | - |
| 12.5457 | 168000 | 0.0002 | - |
| 12.5495 | 168050 | 0.0004 | - |
| 12.5532 | 168100 | 0.0003 | - |
| 12.5569 | 168150 | 0.0006 | - |
| 12.5607 | 168200 | 0.0005 | - |
| 12.5644 | 168250 | 0.0003 | - |
| 12.5681 | 168300 | 0.0004 | - |
| 12.5719 | 168350 | 0.0002 | - |
| 12.5756 | 168400 | 0.0001 | - |
| 12.5793 | 168450 | 0.0002 | - |
| 12.5831 | 168500 | 0.0002 | - |
| 12.5868 | 168550 | 0.0001 | - |
| 12.5905 | 168600 | 0.0001 | - |
| 12.5943 | 168650 | 0.0005 | - |
| 12.5980 | 168700 | 0.0002 | - |
| 12.6017 | 168750 | 0.0002 | - |
| 12.6055 | 168800 | 0.0003 | - |
| 12.6092 | 168850 | 0.0002 | - |
| 12.6129 | 168900 | 0.0002 | - |
| 12.6167 | 168950 | 0.0001 | - |
| 12.6204 | 169000 | 0.0004 | - |
| 12.6242 | 169050 | 0.0008 | - |
| 12.6279 | 169100 | 0.0005 | - |
| 12.6316 | 169150 | 0.0007 | - |
| 12.6354 | 169200 | 0.0003 | - |
| 12.6391 | 169250 | 0.0003 | - |
| 12.6428 | 169300 | 0.0002 | - |
| 12.6466 | 169350 | 0.0004 | - |
| 12.6503 | 169400 | 0.0001 | - |
| 12.6540 | 169450 | 0.0005 | - |
| 12.6578 | 169500 | 0.0005 | - |
| 12.6615 | 169550 | 0.0005 | - |
| 12.6652 | 169600 | 0.0002 | - |
| 12.6690 | 169650 | 0.0 | - |
| 12.6727 | 169700 | 0.0002 | - |
| 12.6764 | 169750 | 0.0 | - |
| 12.6802 | 169800 | 0.0002 | - |
| 12.6839 | 169850 | 0.0007 | - |
| 12.6876 | 169900 | 0.0007 | - |
| 12.6914 | 169950 | 0.0003 | - |
| 12.6951 | 170000 | 0.0004 | - |
| 12.6988 | 170050 | 0.0007 | - |
| 12.7026 | 170100 | 0.0007 | - |
| 12.7063 | 170150 | 0.0008 | - |
| 12.7100 | 170200 | 0.0005 | - |
| 12.7138 | 170250 | 0.0004 | - |
| 12.7175 | 170300 | 0.0004 | - |
| 12.7212 | 170350 | 0.0002 | - |
| 12.7250 | 170400 | 0.0003 | - |
| 12.7287 | 170450 | 0.0005 | - |
| 12.7324 | 170500 | 0.0003 | - |
| 12.7362 | 170550 | 0.0005 | - |
| 12.7399 | 170600 | 0.0004 | - |
| 12.7436 | 170650 | 0.0011 | - |
| 12.7474 | 170700 | 0.0008 | - |
| 12.7511 | 170750 | 0.0003 | - |
| 12.7548 | 170800 | 0.0009 | - |
| 12.7586 | 170850 | 0.0004 | - |
| 12.7623 | 170900 | 0.0004 | - |
| 12.7660 | 170950 | 0.0011 | - |
| 12.7698 | 171000 | 0.0006 | - |
| 12.7735 | 171050 | 0.0001 | - |
| 12.7772 | 171100 | 0.0004 | - |
| 12.7810 | 171150 | 0.0005 | - |
| 12.7847 | 171200 | 0.0002 | - |
| 12.7884 | 171250 | 0.0003 | - |
| 12.7922 | 171300 | 0.0006 | - |
| 12.7959 | 171350 | 0.0006 | - |
| 12.7996 | 171400 | 0.0006 | - |
| 12.8034 | 171450 | 0.0005 | - |
| 12.8071 | 171500 | 0.0005 | - |
| 12.8108 | 171550 | 0.0006 | - |
| 12.8146 | 171600 | 0.0005 | - |
| 12.8183 | 171650 | 0.0002 | - |
| 12.8220 | 171700 | 0.0003 | - |
| 12.8258 | 171750 | 0.0007 | - |
| 12.8295 | 171800 | 0.0007 | - |
| 12.8332 | 171850 | 0.0008 | - |
| 12.8370 | 171900 | 0.0005 | - |
| 12.8407 | 171950 | 0.0005 | - |
| 12.8444 | 172000 | 0.0004 | - |
| 12.8482 | 172050 | 0.0007 | - |
| 12.8519 | 172100 | 0.0004 | - |
| 12.8556 | 172150 | 0.0007 | - |
| 12.8594 | 172200 | 0.0008 | - |
| 12.8631 | 172250 | 0.0001 | - |
| 12.8669 | 172300 | 0.0001 | - |
| 12.8706 | 172350 | 0.0001 | - |
| 12.8743 | 172400 | 0.0005 | - |
| 12.8781 | 172450 | 0.0005 | - |
| 12.8818 | 172500 | 0.0005 | - |
| 12.8855 | 172550 | 0.0007 | - |
| 12.8893 | 172600 | 0.0004 | - |
| 12.8930 | 172650 | 0.0007 | - |
| 12.8967 | 172700 | 0.0008 | - |
| 12.9005 | 172750 | 0.0009 | - |
| 12.9042 | 172800 | 0.0006 | - |
| 12.9079 | 172850 | 0.0009 | - |
| 12.9117 | 172900 | 0.0004 | - |
| 12.9154 | 172950 | 0.0003 | - |
| 12.9191 | 173000 | 0.0001 | - |
| 12.9229 | 173050 | 0.0002 | - |
| 12.9266 | 173100 | 0.0007 | - |
| 12.9303 | 173150 | 0.0002 | - |
| 12.9341 | 173200 | 0.0001 | - |
| 12.9378 | 173250 | 0.0007 | - |
| 12.9415 | 173300 | 0.0003 | - |
| 12.9453 | 173350 | 0.0003 | - |
| 12.9490 | 173400 | 0.0002 | - |
| 12.9527 | 173450 | 0.0003 | - |
| 12.9565 | 173500 | 0.0008 | - |
| 12.9602 | 173550 | 0.0 | - |
| 12.9639 | 173600 | 0.0003 | - |
| 12.9677 | 173650 | 0.0004 | - |
| 12.9714 | 173700 | 0.0005 | - |
| 12.9751 | 173750 | 0.0005 | - |
| 12.9789 | 173800 | 0.0006 | - |
| 12.9826 | 173850 | 0.0005 | - |
| 12.9863 | 173900 | 0.0002 | - |
| 12.9901 | 173950 | 0.0002 | - |
| 12.9938 | 174000 | 0.0008 | - |
| 12.9975 | 174050 | 0.0 | - |
| 13.0013 | 174100 | 0.0009 | - |
| 13.0050 | 174150 | 0.0005 | - |
| 13.0087 | 174200 | 0.0002 | - |
| 13.0125 | 174250 | 0.0005 | - |
| 13.0162 | 174300 | 0.0006 | - |
| 13.0199 | 174350 | 0.0003 | - |
| 13.0237 | 174400 | 0.0002 | - |
| 13.0274 | 174450 | 0.0005 | - |
| 13.0311 | 174500 | 0.0005 | - |
| 13.0349 | 174550 | 0.0004 | - |
| 13.0386 | 174600 | 0.0002 | - |
| 13.0423 | 174650 | 0.0 | - |
| 13.0461 | 174700 | 0.0003 | - |
| 13.0498 | 174750 | 0.0004 | - |
| 13.0535 | 174800 | 0.0004 | - |
| 13.0573 | 174850 | 0.0006 | - |
| 13.0610 | 174900 | 0.0004 | - |
| 13.0647 | 174950 | 0.0003 | - |
| 13.0685 | 175000 | 0.0005 | - |
| 13.0722 | 175050 | 0.0003 | - |
| 13.0759 | 175100 | 0.0003 | - |
| 13.0797 | 175150 | 0.0 | - |
| 13.0834 | 175200 | 0.0001 | - |
| 13.0871 | 175250 | 0.0003 | - |
| 13.0909 | 175300 | 0.0001 | - |
| 13.0946 | 175350 | 0.0003 | - |
| 13.0983 | 175400 | 0.0005 | - |
| 13.1021 | 175450 | 0.0001 | - |
| 13.1058 | 175500 | 0.0006 | - |
| 13.1096 | 175550 | 0.0003 | - |
| 13.1133 | 175600 | 0.0004 | - |
| 13.1170 | 175650 | 0.0006 | - |
| 13.1208 | 175700 | 0.0004 | - |
| 13.1245 | 175750 | 0.0003 | - |
| 13.1282 | 175800 | 0.0004 | - |
| 13.1320 | 175850 | 0.0002 | - |
| 13.1357 | 175900 | 0.0004 | - |
| 13.1394 | 175950 | 0.0001 | - |
| 13.1432 | 176000 | 0.0001 | - |
| 13.1469 | 176050 | 0.0005 | - |
| 13.1506 | 176100 | 0.0001 | - |
| 13.1544 | 176150 | 0.0003 | - |
| 13.1581 | 176200 | 0.0002 | - |
| 13.1618 | 176250 | 0.0001 | - |
| 13.1656 | 176300 | 0.0006 | - |
| 13.1693 | 176350 | 0.0003 | - |
| 13.1730 | 176400 | 0.0007 | - |
| 13.1768 | 176450 | 0.0007 | - |
| 13.1805 | 176500 | 0.0006 | - |
| 13.1842 | 176550 | 0.0006 | - |
| 13.1880 | 176600 | 0.0003 | - |
| 13.1917 | 176650 | 0.0005 | - |
| 13.1954 | 176700 | 0.0004 | - |
| 13.1992 | 176750 | 0.0003 | - |
| 13.2029 | 176800 | 0.0001 | - |
| 13.2066 | 176850 | 0.0002 | - |
| 13.2104 | 176900 | 0.0003 | - |
| 13.2141 | 176950 | 0.0004 | - |
| 13.2178 | 177000 | 0.0 | - |
| 13.2216 | 177050 | 0.0002 | - |
| 13.2253 | 177100 | 0.0002 | - |
| 13.2290 | 177150 | 0.0003 | - |
| 13.2328 | 177200 | 0.0 | - |
| 13.2365 | 177250 | 0.0002 | - |
| 13.2402 | 177300 | 0.0008 | - |
| 13.2440 | 177350 | 0.0005 | - |
| 13.2477 | 177400 | 0.0002 | - |
| 13.2514 | 177450 | 0.0002 | - |
| 13.2552 | 177500 | 0.0001 | - |
| 13.2589 | 177550 | 0.0001 | - |
| 13.2626 | 177600 | 0.0002 | - |
| 13.2664 | 177650 | 0.0004 | - |
| 13.2701 | 177700 | 0.0001 | - |
| 13.2738 | 177750 | 0.0002 | - |
| 13.2776 | 177800 | 0.0 | - |
| 13.2813 | 177850 | 0.0004 | - |
| 13.2850 | 177900 | 0.0001 | - |
| 13.2888 | 177950 | 0.0002 | - |
| 13.2925 | 178000 | 0.0001 | - |
| 13.2962 | 178050 | 0.0005 | - |
| 13.3000 | 178100 | 0.0 | - |
| 13.3037 | 178150 | 0.0 | - |
| 13.3074 | 178200 | 0.0005 | - |
| 13.3112 | 178250 | 0.0004 | - |
| 13.3149 | 178300 | 0.0005 | - |
| 13.3186 | 178350 | 0.0003 | - |
| 13.3224 | 178400 | 0.0 | - |
| 13.3261 | 178450 | 0.0001 | - |
| 13.3298 | 178500 | 0.0002 | - |
| 13.3336 | 178550 | 0.0005 | - |
| 13.3373 | 178600 | 0.0003 | - |
| 13.3410 | 178650 | 0.0001 | - |
| 13.3448 | 178700 | 0.0002 | - |
| 13.3485 | 178750 | 0.0 | - |
| 13.3523 | 178800 | 0.0007 | - |
| 13.3560 | 178850 | 0.0001 | - |
| 13.3597 | 178900 | 0.0003 | - |
| 13.3635 | 178950 | 0.0002 | - |
| 13.3672 | 179000 | 0.0001 | - |
| 13.3709 | 179050 | 0.0003 | - |
| 13.3747 | 179100 | 0.0 | - |
| 13.3784 | 179150 | 0.0003 | - |
| 13.3821 | 179200 | 0.0004 | - |
| 13.3859 | 179250 | 0.0004 | - |
| 13.3896 | 179300 | 0.0001 | - |
| 13.3933 | 179350 | 0.0002 | - |
| 13.3971 | 179400 | 0.0005 | - |
| 13.4008 | 179450 | 0.0003 | - |
| 13.4045 | 179500 | 0.0002 | - |
| 13.4083 | 179550 | 0.0005 | - |
| 13.4120 | 179600 | 0.0004 | - |
| 13.4157 | 179650 | 0.0002 | - |
| 13.4195 | 179700 | 0.0005 | - |
| 13.4232 | 179750 | 0.0003 | - |
| 13.4269 | 179800 | 0.0005 | - |
| 13.4307 | 179850 | 0.0002 | - |
| 13.4344 | 179900 | 0.0004 | - |
| 13.4381 | 179950 | 0.0001 | - |
| 13.4419 | 180000 | 0.0003 | - |
| 13.4456 | 180050 | 0.0001 | - |
| 13.4493 | 180100 | 0.0002 | - |
| 13.4531 | 180150 | 0.0001 | - |
| 13.4568 | 180200 | 0.0001 | - |
| 13.4605 | 180250 | 0.0003 | - |
| 13.4643 | 180300 | 0.0001 | - |
| 13.4680 | 180350 | 0.0002 | - |
| 13.4717 | 180400 | 0.0002 | - |
| 13.4755 | 180450 | 0.0003 | - |
| 13.4792 | 180500 | 0.0003 | - |
| 13.4829 | 180550 | 0.0001 | - |
| 13.4867 | 180600 | 0.0004 | - |
| 13.4904 | 180650 | 0.0001 | - |
| 13.4941 | 180700 | 0.0005 | - |
| 13.4979 | 180750 | 0.0003 | - |
| 13.5016 | 180800 | 0.0005 | - |
| 13.5053 | 180850 | 0.0005 | - |
| 13.5091 | 180900 | 0.0002 | - |
| 13.5128 | 180950 | 0.0002 | - |
| 13.5165 | 181000 | 0.0005 | - |
| 13.5203 | 181050 | 0.0008 | - |
| 13.5240 | 181100 | 0.001 | - |
| 13.5277 | 181150 | 0.0004 | - |
| 13.5315 | 181200 | 0.0002 | - |
| 13.5352 | 181250 | 0.0008 | - |
| 13.5389 | 181300 | 0.0006 | - |
| 13.5427 | 181350 | 0.0004 | - |
| 13.5464 | 181400 | 0.0002 | - |
| 13.5501 | 181450 | 0.0001 | - |
| 13.5539 | 181500 | 0.0002 | - |
| 13.5576 | 181550 | 0.0003 | - |
| 13.5613 | 181600 | 0.0 | - |
| 13.5651 | 181650 | 0.0 | - |
| 13.5688 | 181700 | 0.0003 | - |
| 13.5725 | 181750 | 0.0003 | - |
| 13.5763 | 181800 | 0.0002 | - |
| 13.5800 | 181850 | 0.0005 | - |
| 13.5838 | 181900 | 0.0024 | - |
| 13.5875 | 181950 | 0.0008 | - |
| 13.5912 | 182000 | 0.0012 | - |
| 13.5950 | 182050 | 0.0007 | - |
| 13.5987 | 182100 | 0.0007 | - |
| 13.6024 | 182150 | 0.0006 | - |
| 13.6062 | 182200 | 0.0005 | - |
| 13.6099 | 182250 | 0.0003 | - |
| 13.6136 | 182300 | 0.0004 | - |
| 13.6174 | 182350 | 0.0002 | - |
| 13.6211 | 182400 | 0.0005 | - |
| 13.6248 | 182450 | 0.0011 | - |
| 13.6286 | 182500 | 0.0002 | - |
| 13.6323 | 182550 | 0.0002 | - |
| 13.6360 | 182600 | 0.0005 | - |
| 13.6398 | 182650 | 0.0005 | - |
| 13.6435 | 182700 | 0.0003 | - |
| 13.6472 | 182750 | 0.0005 | - |
| 13.6510 | 182800 | 0.0009 | - |
| 13.6547 | 182850 | 0.0006 | - |
| 13.6584 | 182900 | 0.0005 | - |
| 13.6622 | 182950 | 0.0002 | - |
| 13.6659 | 183000 | 0.0002 | - |
| 13.6696 | 183050 | 0.0002 | - |
| 13.6734 | 183100 | 0.0005 | - |
| 13.6771 | 183150 | 0.0003 | - |
| 13.6808 | 183200 | 0.0006 | - |
| 13.6846 | 183250 | 0.0003 | - |
| 13.6883 | 183300 | 0.0005 | - |
| 13.6920 | 183350 | 0.0002 | - |
| 13.6958 | 183400 | 0.0004 | - |
| 13.6995 | 183450 | 0.0003 | - |
| 13.7032 | 183500 | 0.0005 | - |
| 13.7070 | 183550 | 0.0004 | - |
| 13.7107 | 183600 | 0.0005 | - |
| 13.7144 | 183650 | 0.0002 | - |
| 13.7182 | 183700 | 0.0003 | - |
| 13.7219 | 183750 | 0.0001 | - |
| 13.7256 | 183800 | 0.0003 | - |
| 13.7294 | 183850 | 0.0 | - |
| 13.7331 | 183900 | 0.0002 | - |
| 13.7368 | 183950 | 0.0001 | - |
| 13.7406 | 184000 | 0.0001 | - |
| 13.7443 | 184050 | 0.0003 | - |
| 13.7480 | 184100 | 0.0004 | - |
| 13.7518 | 184150 | 0.0005 | - |
| 13.7555 | 184200 | 0.0003 | - |
| 13.7592 | 184250 | 0.0004 | - |
| 13.7630 | 184300 | 0.0002 | - |
| 13.7667 | 184350 | 0.0 | - |
| 13.7704 | 184400 | 0.0002 | - |
| 13.7742 | 184450 | 0.0003 | - |
| 13.7779 | 184500 | 0.0 | - |
| 13.7816 | 184550 | 0.0 | - |
| 13.7854 | 184600 | 0.0001 | - |
| 13.7891 | 184650 | 0.0002 | - |
| 13.7928 | 184700 | 0.0002 | - |
| 13.7966 | 184750 | 0.0003 | - |
| 13.8003 | 184800 | 0.0 | - |
| 13.8040 | 184850 | 0.0002 | - |
| 13.8078 | 184900 | 0.0 | - |
| 13.8115 | 184950 | 0.0003 | - |
| 13.8152 | 185000 | 0.0005 | - |
| 13.8190 | 185050 | 0.0002 | - |
| 13.8227 | 185100 | 0.0002 | - |
| 13.8265 | 185150 | 0.0003 | - |
| 13.8302 | 185200 | 0.0003 | - |
| 13.8339 | 185250 | 0.0 | - |
| 13.8377 | 185300 | 0.0 | - |
| 13.8414 | 185350 | 0.0003 | - |
| 13.8451 | 185400 | 0.0004 | - |
| 13.8489 | 185450 | 0.0002 | - |
| 13.8526 | 185500 | 0.0005 | - |
| 13.8563 | 185550 | 0.0004 | - |
| 13.8601 | 185600 | 0.0002 | - |
| 13.8638 | 185650 | 0.0001 | - |
| 13.8675 | 185700 | 0.0003 | - |
| 13.8713 | 185750 | 0.0002 | - |
| 13.8750 | 185800 | 0.0004 | - |
| 13.8787 | 185850 | 0.0 | - |
| 13.8825 | 185900 | 0.0008 | - |
| 13.8862 | 185950 | 0.0002 | - |
| 13.8899 | 186000 | 0.0004 | - |
| 13.8937 | 186050 | 0.0002 | - |
| 13.8974 | 186100 | 0.0 | - |
| 13.9011 | 186150 | 0.0002 | - |
| 13.9049 | 186200 | 0.0003 | - |
| 13.9086 | 186250 | 0.0003 | - |
| 13.9123 | 186300 | 0.0002 | - |
| 13.9161 | 186350 | 0.0001 | - |
| 13.9198 | 186400 | 0.0 | - |
| 13.9235 | 186450 | 0.0002 | - |
| 13.9273 | 186500 | 0.0 | - |
| 13.9310 | 186550 | 0.0005 | - |
| 13.9347 | 186600 | 0.0004 | - |
| 13.9385 | 186650 | 0.0 | - |
| 13.9422 | 186700 | 0.0001 | - |
| 13.9459 | 186750 | 0.0001 | - |
| 13.9497 | 186800 | 0.0002 | - |
| 13.9534 | 186850 | 0.0 | - |
| 13.9571 | 186900 | 0.0003 | - |
| 13.9609 | 186950 | 0.0003 | - |
| 13.9646 | 187000 | 0.0001 | - |
| 13.9683 | 187050 | 0.0002 | - |
| 13.9721 | 187100 | 0.0 | - |
| 13.9758 | 187150 | 0.0002 | - |
| 13.9795 | 187200 | 0.0006 | - |
| 13.9833 | 187250 | 0.0003 | - |
| 13.9870 | 187300 | 0.0002 | - |
| 13.9907 | 187350 | 0.0002 | - |
| 13.9945 | 187400 | 0.0002 | - |
| 13.9982 | 187450 | 0.0006 | - |
| 14.0019 | 187500 | 0.0002 | - |
| 14.0057 | 187550 | 0.0 | - |
| 14.0094 | 187600 | 0.0 | - |
| 14.0131 | 187650 | 0.0002 | - |
| 14.0169 | 187700 | 0.0002 | - |
| 14.0206 | 187750 | 0.0 | - |
| 14.0243 | 187800 | 0.0008 | - |
| 14.0281 | 187850 | 0.0008 | - |
| 14.0318 | 187900 | 0.0003 | - |
| 14.0355 | 187950 | 0.0007 | - |
| 14.0393 | 188000 | 0.0008 | - |
| 14.0430 | 188050 | 0.0006 | - |
| 14.0467 | 188100 | 0.0002 | - |
| 14.0505 | 188150 | 0.0003 | - |
| 14.0542 | 188200 | 0.0005 | - |
| 14.0579 | 188250 | 0.0004 | - |
| 14.0617 | 188300 | 0.0004 | - |
| 14.0654 | 188350 | 0.0 | - |
| 14.0692 | 188400 | 0.0002 | - |
| 14.0729 | 188450 | 0.0005 | - |
| 14.0766 | 188500 | 0.0003 | - |
| 14.0804 | 188550 | 0.0003 | - |
| 14.0841 | 188600 | 0.0005 | - |
| 14.0878 | 188650 | 0.0005 | - |
| 14.0916 | 188700 | 0.0003 | - |
| 14.0953 | 188750 | 0.0002 | - |
| 14.0990 | 188800 | 0.0002 | - |
| 14.1028 | 188850 | 0.0 | - |
| 14.1065 | 188900 | 0.0004 | - |
| 14.1102 | 188950 | 0.0004 | - |
| 14.1140 | 189000 | 0.0007 | - |
| 14.1177 | 189050 | 0.0003 | - |
| 14.1214 | 189100 | 0.0002 | - |
| 14.1252 | 189150 | 0.0003 | - |
| 14.1289 | 189200 | 0.0004 | - |
| 14.1326 | 189250 | 0.0002 | - |
| 14.1364 | 189300 | 0.0003 | - |
| 14.1401 | 189350 | 0.0004 | - |
| 14.1438 | 189400 | 0.0001 | - |
| 14.1476 | 189450 | 0.0003 | - |
| 14.1513 | 189500 | 0.0001 | - |
| 14.1550 | 189550 | 0.0004 | - |
| 14.1588 | 189600 | 0.0008 | - |
| 14.1625 | 189650 | 0.0005 | - |
| 14.1662 | 189700 | 0.0006 | - |
| 14.1700 | 189750 | 0.0004 | - |
| 14.1737 | 189800 | 0.0005 | - |
| 14.1774 | 189850 | 0.0007 | - |
| 14.1812 | 189900 | 0.0009 | - |
| 14.1849 | 189950 | 0.001 | - |
| 14.1886 | 190000 | 0.0005 | - |
| 14.1924 | 190050 | 0.0007 | - |
| 14.1961 | 190100 | 0.0002 | - |
| 14.1998 | 190150 | 0.0002 | - |
| 14.2036 | 190200 | 0.0006 | - |
| 14.2073 | 190250 | 0.0003 | - |
| 14.2110 | 190300 | 0.0002 | - |
| 14.2148 | 190350 | 0.0004 | - |
| 14.2185 | 190400 | 0.0002 | - |
| 14.2222 | 190450 | 0.0002 | - |
| 14.2260 | 190500 | 0.0002 | - |
| 14.2297 | 190550 | 0.0 | - |
| 14.2334 | 190600 | 0.0002 | - |
| 14.2372 | 190650 | 0.0002 | - |
| 14.2409 | 190700 | 0.0008 | - |
| 14.2446 | 190750 | 0.0001 | - |
| 14.2484 | 190800 | 0.0004 | - |
| 14.2521 | 190850 | 0.0005 | - |
| 14.2558 | 190900 | 0.0001 | - |
| 14.2596 | 190950 | 0.0001 | - |
| 14.2633 | 191000 | 0.0005 | - |
| 14.2670 | 191050 | 0.0001 | - |
| 14.2708 | 191100 | 0.0004 | - |
| 14.2745 | 191150 | 0.0003 | - |
| 14.2782 | 191200 | 0.0004 | - |
| 14.2820 | 191250 | 0.0001 | - |
| 14.2857 | 191300 | 0.0002 | - |
| 14.2894 | 191350 | 0.0002 | - |
| 14.2932 | 191400 | 0.0002 | - |
| 14.2969 | 191450 | 0.0003 | - |
| 14.3006 | 191500 | 0.0002 | - |
| 14.3044 | 191550 | 0.0001 | - |
| 14.3081 | 191600 | 0.0001 | - |
| 14.3119 | 191650 | 0.0 | - |
| 14.3156 | 191700 | 0.0003 | - |
| 14.3193 | 191750 | 0.0005 | - |
| 14.3231 | 191800 | 0.0 | - |
| 14.3268 | 191850 | 0.0002 | - |
| 14.3305 | 191900 | 0.0002 | - |
| 14.3343 | 191950 | 0.0002 | - |
| 14.3380 | 192000 | 0.0 | - |
| 14.3417 | 192050 | 0.0 | - |
| 14.3455 | 192100 | 0.0 | - |
| 14.3492 | 192150 | 0.0 | - |
| 14.3529 | 192200 | 0.0003 | - |
| 14.3567 | 192250 | 0.0 | - |
| 14.3604 | 192300 | 0.0001 | - |
| 14.3641 | 192350 | 0.0 | - |
| 14.3679 | 192400 | 0.0 | - |
| 14.3716 | 192450 | 0.0002 | - |
| 14.3753 | 192500 | 0.0006 | - |
| 14.3791 | 192550 | 0.0 | - |
| 14.3828 | 192600 | 0.0002 | - |
| 14.3865 | 192650 | 0.0 | - |
| 14.3903 | 192700 | 0.0001 | - |
| 14.3940 | 192750 | 0.0003 | - |
| 14.3977 | 192800 | 0.0001 | - |
| 14.4015 | 192850 | 0.0001 | - |
| 14.4052 | 192900 | 0.0002 | - |
| 14.4089 | 192950 | 0.0003 | - |
| 14.4127 | 193000 | 0.0003 | - |
| 14.4164 | 193050 | 0.0002 | - |
| 14.4201 | 193100 | 0.0003 | - |
| 14.4239 | 193150 | 0.0005 | - |
| 14.4276 | 193200 | 0.0005 | - |
| 14.4313 | 193250 | 0.0002 | - |
| 14.4351 | 193300 | 0.0001 | - |
| 14.4388 | 193350 | 0.0003 | - |
| 14.4425 | 193400 | 0.0 | - |
| 14.4463 | 193450 | 0.0005 | - |
| 14.4500 | 193500 | 0.0002 | - |
| 14.4537 | 193550 | 0.0002 | - |
| 14.4575 | 193600 | 0.0007 | - |
| 14.4612 | 193650 | 0.0004 | - |
| 14.4649 | 193700 | 0.0002 | - |
| 14.4687 | 193750 | 0.0001 | - |
| 14.4724 | 193800 | 0.0002 | - |
| 14.4761 | 193850 | 0.0002 | - |
| 14.4799 | 193900 | 0.0009 | - |
| 14.4836 | 193950 | 0.0007 | - |
| 14.4873 | 194000 | 0.0006 | - |
| 14.4911 | 194050 | 0.0004 | - |
| 14.4948 | 194100 | 0.0001 | - |
| 14.4985 | 194150 | 0.0008 | - |
| 14.5023 | 194200 | 0.001 | - |
| 14.5060 | 194250 | 0.0006 | - |
| 14.5097 | 194300 | 0.0007 | - |
| 14.5135 | 194350 | 0.0007 | - |
| 14.5172 | 194400 | 0.0005 | - |
| 14.5209 | 194450 | 0.0007 | - |
| 14.5247 | 194500 | 0.0003 | - |
| 14.5284 | 194550 | 0.0009 | - |
| 14.5321 | 194600 | 0.0007 | - |
| 14.5359 | 194650 | 0.0007 | - |
| 14.5396 | 194700 | 0.0005 | - |
| 14.5434 | 194750 | 0.0004 | - |
| 14.5471 | 194800 | 0.0005 | - |
| 14.5508 | 194850 | 0.0007 | - |
| 14.5546 | 194900 | 0.0005 | - |
| 14.5583 | 194950 | 0.0005 | - |
| 14.5620 | 195000 | 0.0004 | - |
| 14.5658 | 195050 | 0.0003 | - |
| 14.5695 | 195100 | 0.0005 | - |
| 14.5732 | 195150 | 0.0006 | - |
| 14.5770 | 195200 | 0.0001 | - |
| 14.5807 | 195250 | 0.0002 | - |
| 14.5844 | 195300 | 0.0001 | - |
| 14.5882 | 195350 | 0.0005 | - |
| 14.5919 | 195400 | 0.0002 | - |
| 14.5956 | 195450 | 0.0004 | - |
| 14.5994 | 195500 | 0.0 | - |
| 14.6031 | 195550 | 0.0004 | - |
| 14.6068 | 195600 | 0.0004 | - |
| 14.6106 | 195650 | 0.0006 | - |
| 14.6143 | 195700 | 0.0004 | - |
| 14.6180 | 195750 | 0.0004 | - |
| 14.6218 | 195800 | 0.0003 | - |
| 14.6255 | 195850 | 0.0003 | - |
| 14.6292 | 195900 | 0.0002 | - |
| 14.6330 | 195950 | 0.0003 | - |
| 14.6367 | 196000 | 0.0005 | - |
| 14.6404 | 196050 | 0.0002 | - |
| 14.6442 | 196100 | 0.0001 | - |
| 14.6479 | 196150 | 0.0004 | - |
| 14.6516 | 196200 | 0.0008 | - |
| 14.6554 | 196250 | 0.0001 | - |
| 14.6591 | 196300 | 0.0005 | - |
| 14.6628 | 196350 | 0.0004 | - |
| 14.6666 | 196400 | 0.0008 | - |
| 14.6703 | 196450 | 0.0002 | - |
| 14.6740 | 196500 | 0.0001 | - |
| 14.6778 | 196550 | 0.0002 | - |
| 14.6815 | 196600 | 0.0002 | - |
| 14.6852 | 196650 | 0.0004 | - |
| 14.6890 | 196700 | 0.0002 | - |
| 14.6927 | 196750 | 0.0001 | - |
| 14.6964 | 196800 | 0.0003 | - |
| 14.7002 | 196850 | 0.0002 | - |
| 14.7039 | 196900 | 0.0002 | - |
| 14.7076 | 196950 | 0.0002 | - |
| 14.7114 | 197000 | 0.0 | - |
| 14.7151 | 197050 | 0.0006 | - |
| 14.7188 | 197100 | 0.0 | - |
| 14.7226 | 197150 | 0.0008 | - |
| 14.7263 | 197200 | 0.0001 | - |
| 14.7300 | 197250 | 0.0002 | - |
| 14.7338 | 197300 | 0.0 | - |
| 14.7375 | 197350 | 0.0001 | - |
| 14.7412 | 197400 | 0.0003 | - |
| 14.7450 | 197450 | 0.0006 | - |
| 14.7487 | 197500 | 0.0002 | - |
| 14.7524 | 197550 | 0.0003 | - |
| 14.7562 | 197600 | 0.0002 | - |
| 14.7599 | 197650 | 0.0001 | - |
| 14.7636 | 197700 | 0.0 | - |
| 14.7674 | 197750 | 0.0003 | - |
| 14.7711 | 197800 | 0.0 | - |
| 14.7748 | 197850 | 0.0002 | - |
| 14.7786 | 197900 | 0.0002 | - |
| 14.7823 | 197950 | 0.0 | - |
| 14.7861 | 198000 | 0.0005 | - |
| 14.7898 | 198050 | 0.0006 | - |
| 14.7935 | 198100 | 0.0001 | - |
| 14.7973 | 198150 | 0.0001 | - |
| 14.8010 | 198200 | 0.0003 | - |
| 14.8047 | 198250 | 0.0002 | - |
| 14.8085 | 198300 | 0.0003 | - |
| 14.8122 | 198350 | 0.0 | - |
| 14.8159 | 198400 | 0.0001 | - |
| 14.8197 | 198450 | 0.0 | - |
| 14.8234 | 198500 | 0.0 | - |
| 14.8271 | 198550 | 0.0002 | - |
| 14.8309 | 198600 | 0.0002 | - |
| 14.8346 | 198650 | 0.0 | - |
| 14.8383 | 198700 | 0.0003 | - |
| 14.8421 | 198750 | 0.0005 | - |
| 14.8458 | 198800 | 0.0002 | - |
| 14.8495 | 198850 | 0.0002 | - |
| 14.8533 | 198900 | 0.0001 | - |
| 14.8570 | 198950 | 0.0002 | - |
| 14.8607 | 199000 | 0.0003 | - |
| 14.8645 | 199050 | 0.0 | - |
| 14.8682 | 199100 | 0.0 | - |
| 14.8719 | 199150 | 0.0002 | - |
| 14.8757 | 199200 | 0.0006 | - |
| 14.8794 | 199250 | 0.0003 | - |
| 14.8831 | 199300 | 0.0 | - |
| 14.8869 | 199350 | 0.0 | - |
| 14.8906 | 199400 | 0.0003 | - |
| 14.8943 | 199450 | 0.0002 | - |
| 14.8981 | 199500 | 0.0003 | - |
| 14.9018 | 199550 | 0.0 | - |
| 14.9055 | 199600 | 0.0 | - |
| 14.9093 | 199650 | 0.0002 | - |
| 14.9130 | 199700 | 0.0003 | - |
| 14.9167 | 199750 | 0.0002 | - |
| 14.9205 | 199800 | 0.0 | - |
| 14.9242 | 199850 | 0.0001 | - |
| 14.9279 | 199900 | 0.0003 | - |
| 14.9317 | 199950 | 0.0005 | - |
| 14.9354 | 200000 | 0.0 | - |
| 14.9391 | 200050 | 0.0003 | - |
| 14.9429 | 200100 | 0.0 | - |
| 14.9466 | 200150 | 0.0001 | - |
| 14.9503 | 200200 | 0.0003 | - |
| 14.9541 | 200250 | 0.0005 | - |
| 14.9578 | 200300 | 0.0002 | - |
| 14.9615 | 200350 | 0.0003 | - |
| 14.9653 | 200400 | 0.0002 | - |
| 14.9690 | 200450 | 0.0 | - |
| 14.9727 | 200500 | 0.0003 | - |
| 14.9765 | 200550 | 0.0 | - |
| 14.9802 | 200600 | 0.0 | - |
| 14.9839 | 200650 | 0.0001 | - |
| 14.9877 | 200700 | 0.0003 | - |
| 14.9914 | 200750 | 0.0001 | - |
| 14.9951 | 200800 | 0.0003 | - |
| 14.9989 | 200850 | 0.0002 | - |
| 15.0026 | 200900 | 0.0001 | - |
| 15.0063 | 200950 | 0.0006 | - |
| 15.0101 | 201000 | 0.0001 | - |
| 15.0138 | 201050 | 0.0004 | - |
| 15.0175 | 201100 | 0.0 | - |
| 15.0213 | 201150 | 0.0003 | - |
| 15.0250 | 201200 | 0.0 | - |
| 15.0288 | 201250 | 0.0003 | - |
| 15.0325 | 201300 | 0.0002 | - |
| 15.0362 | 201350 | 0.0003 | - |
| 15.0400 | 201400 | 0.0002 | - |
| 15.0437 | 201450 | 0.0002 | - |
| 15.0474 | 201500 | 0.0002 | - |
| 15.0512 | 201550 | 0.0002 | - |
| 15.0549 | 201600 | 0.0001 | - |
| 15.0586 | 201650 | 0.0009 | - |
| 15.0624 | 201700 | 0.0 | - |
| 15.0661 | 201750 | 0.0002 | - |
| 15.0698 | 201800 | 0.0004 | - |
| 15.0736 | 201850 | 0.0005 | - |
| 15.0773 | 201900 | 0.0002 | - |
| 15.0810 | 201950 | 0.0002 | - |
| 15.0848 | 202000 | 0.0005 | - |
| 15.0885 | 202050 | 0.0002 | - |
| 15.0922 | 202100 | 0.0002 | - |
| 15.0960 | 202150 | 0.0003 | - |
| 15.0997 | 202200 | 0.0002 | - |
| 15.1034 | 202250 | 0.0002 | - |
| 15.1072 | 202300 | 0.0001 | - |
| 15.1109 | 202350 | 0.0005 | - |
| 15.1146 | 202400 | 0.0003 | - |
| 15.1184 | 202450 | 0.0002 | - |
| 15.1221 | 202500 | 0.0005 | - |
| 15.1258 | 202550 | 0.0 | - |
| 15.1296 | 202600 | 0.0002 | - |
| 15.1333 | 202650 | 0.0003 | - |
| 15.1370 | 202700 | 0.0002 | - |
| 15.1408 | 202750 | 0.0002 | - |
| 15.1445 | 202800 | 0.0003 | - |
| 15.1482 | 202850 | 0.0005 | - |
| 15.1520 | 202900 | 0.0002 | - |
| 15.1557 | 202950 | 0.0 | - |
| 15.1594 | 203000 | 0.0002 | - |
| 15.1632 | 203050 | 0.0 | - |
| 15.1669 | 203100 | 0.0 | - |
| 15.1706 | 203150 | 0.0 | - |
| 15.1744 | 203200 | 0.0 | - |
| 15.1781 | 203250 | 0.0 | - |
| 15.1818 | 203300 | 0.0 | - |
| 15.1856 | 203350 | 0.0002 | - |
| 15.1893 | 203400 | 0.0002 | - |
| 15.1930 | 203450 | 0.0 | - |
| 15.1968 | 203500 | 0.0002 | - |
| 15.2005 | 203550 | 0.0002 | - |
| 15.2042 | 203600 | 0.0003 | - |
| 15.2080 | 203650 | 0.0002 | - |
| 15.2117 | 203700 | 0.0004 | - |
| 15.2154 | 203750 | 0.0 | - |
| 15.2192 | 203800 | 0.0004 | - |
| 15.2229 | 203850 | 0.0003 | - |
| 15.2266 | 203900 | 0.0001 | - |
| 15.2304 | 203950 | 0.0002 | - |
| 15.2341 | 204000 | 0.0003 | - |
| 15.2378 | 204050 | 0.0001 | - |
| 15.2416 | 204100 | 0.0002 | - |
| 15.2453 | 204150 | 0.0003 | - |
| 15.2490 | 204200 | 0.0002 | - |
| 15.2528 | 204250 | 0.0 | - |
| 15.2565 | 204300 | 0.0002 | - |
| 15.2602 | 204350 | 0.0002 | - |
| 15.2640 | 204400 | 0.0 | - |
| 15.2677 | 204450 | 0.0002 | - |
| 15.2715 | 204500 | 0.0 | - |
| 15.2752 | 204550 | 0.0 | - |
| 15.2789 | 204600 | 0.0002 | - |
| 15.2827 | 204650 | 0.0002 | - |
| 15.2864 | 204700 | 0.0002 | - |
| 15.2901 | 204750 | 0.0005 | - |
| 15.2939 | 204800 | 0.0 | - |
| 15.2976 | 204850 | 0.0 | - |
| 15.3013 | 204900 | 0.0001 | - |
| 15.3051 | 204950 | 0.0 | - |
| 15.3088 | 205000 | 0.0003 | - |
| 15.3125 | 205050 | 0.0002 | - |
| 15.3163 | 205100 | 0.0002 | - |
| 15.3200 | 205150 | 0.0002 | - |
| 15.3237 | 205200 | 0.0 | - |
| 15.3275 | 205250 | 0.0002 | - |
| 15.3312 | 205300 | 0.0 | - |
| 15.3349 | 205350 | 0.0003 | - |
| 15.3387 | 205400 | 0.0001 | - |
| 15.3424 | 205450 | 0.0 | - |
| 15.3461 | 205500 | 0.0003 | - |
| 15.3499 | 205550 | 0.0 | - |
| 15.3536 | 205600 | 0.0002 | - |
| 15.3573 | 205650 | 0.0 | - |
| 15.3611 | 205700 | 0.0002 | - |
| 15.3648 | 205750 | 0.0001 | - |
| 15.3685 | 205800 | 0.0 | - |
| 15.3723 | 205850 | 0.0001 | - |
| 15.3760 | 205900 | 0.0 | - |
| 15.3797 | 205950 | 0.0 | - |
| 15.3835 | 206000 | 0.0 | - |
| 15.3872 | 206050 | 0.0002 | - |
| 15.3909 | 206100 | 0.0 | - |
| 15.3947 | 206150 | 0.0 | - |
| 15.3984 | 206200 | 0.0 | - |
| 15.4021 | 206250 | 0.0002 | - |
| 15.4059 | 206300 | 0.0 | - |
| 15.4096 | 206350 | 0.0002 | - |
| 15.4133 | 206400 | 0.0 | - |
| 15.4171 | 206450 | 0.0003 | - |
| 15.4208 | 206500 | 0.0001 | - |
| 15.4245 | 206550 | 0.0002 | - |
| 15.4283 | 206600 | 0.0004 | - |
| 15.4320 | 206650 | 0.0004 | - |
| 15.4357 | 206700 | 0.0 | - |
| 15.4395 | 206750 | 0.0002 | - |
| 15.4432 | 206800 | 0.0005 | - |
| 15.4469 | 206850 | 0.0004 | - |
| 15.4507 | 206900 | 0.0006 | - |
| 15.4544 | 206950 | 0.0004 | - |
| 15.4581 | 207000 | 0.0001 | - |
| 15.4619 | 207050 | 0.0001 | - |
| 15.4656 | 207100 | 0.0003 | - |
| 15.4693 | 207150 | 0.0001 | - |
| 15.4731 | 207200 | 0.0002 | - |
| 15.4768 | 207250 | 0.0001 | - |
| 15.4805 | 207300 | 0.0003 | - |
| 15.4843 | 207350 | 0.0001 | - |
| 15.4880 | 207400 | 0.0007 | - |
| 15.4917 | 207450 | 0.0002 | - |
| 15.4955 | 207500 | 0.0002 | - |
| 15.4992 | 207550 | 0.0001 | - |
| 15.5029 | 207600 | 0.0 | - |
| 15.5067 | 207650 | 0.0005 | - |
| 15.5104 | 207700 | 0.0002 | - |
| 15.5142 | 207750 | 0.0006 | - |
| 15.5179 | 207800 | 0.0001 | - |
| 15.5216 | 207850 | 0.0003 | - |
| 15.5254 | 207900 | 0.0004 | - |
| 15.5291 | 207950 | 0.0004 | - |
| 15.5328 | 208000 | 0.0002 | - |
| 15.5366 | 208050 | 0.0 | - |
| 15.5403 | 208100 | 0.0001 | - |
| 15.5440 | 208150 | 0.0006 | - |
| 15.5478 | 208200 | 0.0008 | - |
| 15.5515 | 208250 | 0.0002 | - |
| 15.5552 | 208300 | 0.0001 | - |
| 15.5590 | 208350 | 0.0007 | - |
| 15.5627 | 208400 | 0.0001 | - |
| 15.5664 | 208450 | 0.0002 | - |
| 15.5702 | 208500 | 0.0001 | - |
| 15.5739 | 208550 | 0.0004 | - |
| 15.5776 | 208600 | 0.0003 | - |
| 15.5814 | 208650 | 0.0003 | - |
| 15.5851 | 208700 | 0.0002 | - |
| 15.5888 | 208750 | 0.0004 | - |
| 15.5926 | 208800 | 0.0002 | - |
| 15.5963 | 208850 | 0.0001 | - |
| 15.6000 | 208900 | 0.0003 | - |
| 15.6038 | 208950 | 0.0002 | - |
| 15.6075 | 209000 | 0.0004 | - |
| 15.6112 | 209050 | 0.0002 | - |
| 15.6150 | 209100 | 0.0004 | - |
| 15.6187 | 209150 | 0.0004 | - |
| 15.6224 | 209200 | 0.0002 | - |
| 15.6262 | 209250 | 0.0005 | - |
| 15.6299 | 209300 | 0.0002 | - |
| 15.6336 | 209350 | 0.0003 | - |
| 15.6374 | 209400 | 0.0005 | - |
| 15.6411 | 209450 | 0.0007 | - |
| 15.6448 | 209500 | 0.0004 | - |
| 15.6486 | 209550 | 0.0003 | - |
| 15.6523 | 209600 | 0.0002 | - |
| 15.6560 | 209650 | 0.0 | - |
| 15.6598 | 209700 | 0.0006 | - |
| 15.6635 | 209750 | 0.0 | - |
| 15.6672 | 209800 | 0.0005 | - |
| 15.6710 | 209850 | 0.0002 | - |
| 15.6747 | 209900 | 0.0002 | - |
| 15.6784 | 209950 | 0.0 | - |
| 15.6822 | 210000 | 0.0003 | - |
| 15.6859 | 210050 | 0.0 | - |
| 15.6896 | 210100 | 0.0002 | - |
| 15.6934 | 210150 | 0.0 | - |
| 15.6971 | 210200 | 0.0003 | - |
| 15.7008 | 210250 | 0.0003 | - |
| 15.7046 | 210300 | 0.0003 | - |
| 15.7083 | 210350 | 0.0002 | - |
| 15.7120 | 210400 | 0.0002 | - |
| 15.7158 | 210450 | 0.0 | - |
| 15.7195 | 210500 | 0.0005 | - |
| 15.7232 | 210550 | 0.0002 | - |
| 15.7270 | 210600 | 0.0002 | - |
| 15.7307 | 210650 | 0.0005 | - |
| 15.7344 | 210700 | 0.0002 | - |
| 15.7382 | 210750 | 0.0002 | - |
| 15.7419 | 210800 | 0.0001 | - |
| 15.7457 | 210850 | 0.0002 | - |
| 15.7494 | 210900 | 0.0003 | - |
| 15.7531 | 210950 | 0.0002 | - |
| 15.7569 | 211000 | 0.0005 | - |
| 15.7606 | 211050 | 0.0002 | - |
| 15.7643 | 211100 | 0.0004 | - |
| 15.7681 | 211150 | 0.0001 | - |
| 15.7718 | 211200 | 0.0003 | - |
| 15.7755 | 211250 | 0.0 | - |
| 15.7793 | 211300 | 0.0002 | - |
| 15.7830 | 211350 | 0.0003 | - |
| 15.7867 | 211400 | 0.0003 | - |
| 15.7905 | 211450 | 0.0 | - |
| 15.7942 | 211500 | 0.0 | - |
| 15.7979 | 211550 | 0.0 | - |
| 15.8017 | 211600 | 0.0002 | - |
| 15.8054 | 211650 | 0.0 | - |
| 15.8091 | 211700 | 0.0 | - |
| 15.8129 | 211750 | 0.0001 | - |
| 15.8166 | 211800 | 0.0002 | - |
| 15.8203 | 211850 | 0.0003 | - |
| 15.8241 | 211900 | 0.0003 | - |
| 15.8278 | 211950 | 0.0003 | - |
| 15.8315 | 212000 | 0.0 | - |
| 15.8353 | 212050 | 0.0 | - |
| 15.8390 | 212100 | 0.0003 | - |
| 15.8427 | 212150 | 0.0 | - |
| 15.8465 | 212200 | 0.0 | - |
| 15.8502 | 212250 | 0.0002 | - |
| 15.8539 | 212300 | 0.0002 | - |
| 15.8577 | 212350 | 0.0002 | - |
| 15.8614 | 212400 | 0.0003 | - |
| 15.8651 | 212450 | 0.0003 | - |
| 15.8689 | 212500 | 0.0 | - |
| 15.8726 | 212550 | 0.0001 | - |
| 15.8763 | 212600 | 0.0002 | - |
| 15.8801 | 212650 | 0.0005 | - |
| 15.8838 | 212700 | 0.0002 | - |
| 15.8875 | 212750 | 0.0002 | - |
| 15.8913 | 212800 | 0.0002 | - |
| 15.8950 | 212850 | 0.0002 | - |
| 15.8987 | 212900 | 0.0 | - |
| 15.9025 | 212950 | 0.0003 | - |
| 15.9062 | 213000 | 0.0 | - |
| 15.9099 | 213050 | 0.0002 | - |
| 15.9137 | 213100 | 0.0002 | - |
| 15.9174 | 213150 | 0.0 | - |
| 15.9211 | 213200 | 0.0 | - |
| 15.9249 | 213250 | 0.0002 | - |
| 15.9286 | 213300 | 0.0002 | - |
| 15.9323 | 213350 | 0.0002 | - |
| 15.9361 | 213400 | 0.0003 | - |
| 15.9398 | 213450 | 0.0002 | - |
| 15.9435 | 213500 | 0.0 | - |
| 15.9473 | 213550 | 0.0002 | - |
| 15.9510 | 213600 | 0.0002 | - |
| 15.9547 | 213650 | 0.0003 | - |
| 15.9585 | 213700 | 0.0 | - |
| 15.9622 | 213750 | 0.0003 | - |
| 15.9659 | 213800 | 0.0003 | - |
| 15.9697 | 213850 | 0.0009 | - |
| 15.9734 | 213900 | 0.0004 | - |
| 15.9771 | 213950 | 0.0008 | - |
| 15.9809 | 214000 | 0.0007 | - |
| 15.9846 | 214050 | 0.0003 | - |
| 15.9884 | 214100 | 0.0004 | - |
| 15.9921 | 214150 | 0.0002 | - |
| 15.9958 | 214200 | 0.0 | - |
| 15.9996 | 214250 | 0.0003 | - |
| 16.0033 | 214300 | 0.0001 | - |
| 16.0070 | 214350 | 0.0004 | - |
| 16.0108 | 214400 | 0.0 | - |
| 16.0145 | 214450 | 0.0002 | - |
| 16.0182 | 214500 | 0.0 | - |
| 16.0220 | 214550 | 0.0005 | - |
| 16.0257 | 214600 | 0.0005 | - |
| 16.0294 | 214650 | 0.0002 | - |
| 16.0332 | 214700 | 0.0003 | - |
| 16.0369 | 214750 | 0.0 | - |
| 16.0406 | 214800 | 0.0002 | - |
| 16.0444 | 214850 | 0.0009 | - |
| 16.0481 | 214900 | 0.0 | - |
| 16.0518 | 214950 | 0.0002 | - |
| 16.0556 | 215000 | 0.0003 | - |
| 16.0593 | 215050 | 0.0004 | - |
| 16.0630 | 215100 | 0.0007 | - |
| 16.0668 | 215150 | 0.0002 | - |
| 16.0705 | 215200 | 0.0002 | - |
| 16.0742 | 215250 | 0.0 | - |
| 16.0780 | 215300 | 0.0001 | - |
| 16.0817 | 215350 | 0.0 | - |
| 16.0854 | 215400 | 0.0002 | - |
| 16.0892 | 215450 | 0.0 | - |
| 16.0929 | 215500 | 0.0 | - |
| 16.0966 | 215550 | 0.0001 | - |
| 16.1004 | 215600 | 0.0003 | - |
| 16.1041 | 215650 | 0.0003 | - |
| 16.1078 | 215700 | 0.0001 | - |
| 16.1116 | 215750 | 0.0 | - |
| 16.1153 | 215800 | 0.0002 | - |
| 16.1190 | 215850 | 0.0003 | - |
| 16.1228 | 215900 | 0.0002 | - |
| 16.1265 | 215950 | 0.0 | - |
| 16.1302 | 216000 | 0.0005 | - |
| 16.1340 | 216050 | 0.0002 | - |
| 16.1377 | 216100 | 0.0003 | - |
| 16.1414 | 216150 | 0.0002 | - |
| 16.1452 | 216200 | 0.0001 | - |
| 16.1489 | 216250 | 0.0 | - |
| 16.1526 | 216300 | 0.0004 | - |
| 16.1564 | 216350 | 0.0001 | - |
| 16.1601 | 216400 | 0.0 | - |
| 16.1638 | 216450 | 0.0002 | - |
| 16.1676 | 216500 | 0.0 | - |
| 16.1713 | 216550 | 0.0003 | - |
| 16.1750 | 216600 | 0.0002 | - |
| 16.1788 | 216650 | 0.0003 | - |
| 16.1825 | 216700 | 0.0 | - |
| 16.1862 | 216750 | 0.0003 | - |
| 16.1900 | 216800 | 0.0001 | - |
| 16.1937 | 216850 | 0.0 | - |
| 16.1974 | 216900 | 0.0002 | - |
| 16.2012 | 216950 | 0.0005 | - |
| 16.2049 | 217000 | 0.0 | - |
| 16.2086 | 217050 | 0.0002 | - |
| 16.2124 | 217100 | 0.0002 | - |
| 16.2161 | 217150 | 0.0 | - |
| 16.2198 | 217200 | 0.0002 | - |
| 16.2236 | 217250 | 0.0 | - |
| 16.2273 | 217300 | 0.0001 | - |
| 16.2311 | 217350 | 0.0 | - |
| 16.2348 | 217400 | 0.0005 | - |
| 16.2385 | 217450 | 0.0003 | - |
| 16.2423 | 217500 | 0.0 | - |
| 16.2460 | 217550 | 0.0002 | - |
| 16.2497 | 217600 | 0.0002 | - |
| 16.2535 | 217650 | 0.0 | - |
| 16.2572 | 217700 | 0.0003 | - |
| 16.2609 | 217750 | 0.0002 | - |
| 16.2647 | 217800 | 0.0002 | - |
| 16.2684 | 217850 | 0.0 | - |
| 16.2721 | 217900 | 0.0 | - |
| 16.2759 | 217950 | 0.0003 | - |
| 16.2796 | 218000 | 0.0003 | - |
| 16.2833 | 218050 | 0.0006 | - |
| 16.2871 | 218100 | 0.0004 | - |
| 16.2908 | 218150 | 0.0002 | - |
| 16.2945 | 218200 | 0.0004 | - |
| 16.2983 | 218250 | 0.0002 | - |
| 16.3020 | 218300 | 0.0004 | - |
| 16.3057 | 218350 | 0.0004 | - |
| 16.3095 | 218400 | 0.0003 | - |
| 16.3132 | 218450 | 0.0 | - |
| 16.3169 | 218500 | 0.0002 | - |
| 16.3207 | 218550 | 0.0005 | - |
| 16.3244 | 218600 | 0.0004 | - |
| 16.3281 | 218650 | 0.0004 | - |
| 16.3319 | 218700 | 0.0001 | - |
| 16.3356 | 218750 | 0.0002 | - |
| 16.3393 | 218800 | 0.0002 | - |
| 16.3431 | 218850 | 0.0002 | - |
| 16.3468 | 218900 | 0.0002 | - |
| 16.3505 | 218950 | 0.0002 | - |
| 16.3543 | 219000 | 0.0003 | - |
| 16.3580 | 219050 | 0.0007 | - |
| 16.3617 | 219100 | 0.0002 | - |
| 16.3655 | 219150 | 0.0001 | - |
| 16.3692 | 219200 | 0.0003 | - |
| 16.3729 | 219250 | 0.0002 | - |
| 16.3767 | 219300 | 0.0 | - |
| 16.3804 | 219350 | 0.0003 | - |
| 16.3841 | 219400 | 0.0002 | - |
| 16.3879 | 219450 | 0.0005 | - |
| 16.3916 | 219500 | 0.0 | - |
| 16.3953 | 219550 | 0.0003 | - |
| 16.3991 | 219600 | 0.0003 | - |
| 16.4028 | 219650 | 0.0 | - |
| 16.4065 | 219700 | 0.0003 | - |
| 16.4103 | 219750 | 0.0001 | - |
| 16.4140 | 219800 | 0.0 | - |
| 16.4177 | 219850 | 0.0002 | - |
| 16.4215 | 219900 | 0.0 | - |
| 16.4252 | 219950 | 0.0002 | - |
| 16.4289 | 220000 | 0.0001 | - |
| 16.4327 | 220050 | 0.0003 | - |
| 16.4364 | 220100 | 0.0002 | - |
| 16.4401 | 220150 | 0.0002 | - |
| 16.4439 | 220200 | 0.0 | - |
| 16.4476 | 220250 | 0.0006 | - |
| 16.4513 | 220300 | 0.0 | - |
| 16.4551 | 220350 | 0.0 | - |
| 16.4588 | 220400 | 0.0002 | - |
| 16.4625 | 220450 | 0.0004 | - |
| 16.4663 | 220500 | 0.0002 | - |
| 16.4700 | 220550 | 0.0 | - |
| 16.4738 | 220600 | 0.0002 | - |
| 16.4775 | 220650 | 0.0 | - |
| 16.4812 | 220700 | 0.0001 | - |
| 16.4850 | 220750 | 0.0002 | - |
| 16.4887 | 220800 | 0.0003 | - |
| 16.4924 | 220850 | 0.0002 | - |
| 16.4962 | 220900 | 0.0 | - |
| 16.4999 | 220950 | 0.0002 | - |
| 16.5036 | 221000 | 0.0 | - |
| 16.5074 | 221050 | 0.0002 | - |
| 16.5111 | 221100 | 0.0 | - |
| 16.5148 | 221150 | 0.0 | - |
| 16.5186 | 221200 | 0.0 | - |
| 16.5223 | 221250 | 0.0002 | - |
| 16.5260 | 221300 | 0.0 | - |
| 16.5298 | 221350 | 0.0 | - |
| 16.5335 | 221400 | 0.0001 | - |
| 16.5372 | 221450 | 0.0002 | - |
| 16.5410 | 221500 | 0.0 | - |
| 16.5447 | 221550 | 0.0001 | - |
| 16.5484 | 221600 | 0.0002 | - |
| 16.5522 | 221650 | 0.0003 | - |
| 16.5559 | 221700 | 0.0004 | - |
| 16.5596 | 221750 | 0.0 | - |
| 16.5634 | 221800 | 0.0002 | - |
| 16.5671 | 221850 | 0.0002 | - |
| 16.5708 | 221900 | 0.0 | - |
| 16.5746 | 221950 | 0.0002 | - |
| 16.5783 | 222000 | 0.0003 | - |
| 16.5820 | 222050 | 0.0002 | - |
| 16.5858 | 222100 | 0.0003 | - |
| 16.5895 | 222150 | 0.0003 | - |
| 16.5932 | 222200 | 0.0003 | - |
| 16.5970 | 222250 | 0.0004 | - |
| 16.6007 | 222300 | 0.0001 | - |
| 16.6044 | 222350 | 0.0 | - |
| 16.6082 | 222400 | 0.0005 | - |
| 16.6119 | 222450 | 0.0001 | - |
| 16.6156 | 222500 | 0.0002 | - |
| 16.6194 | 222550 | 0.0006 | - |
| 16.6231 | 222600 | 0.0003 | - |
| 16.6268 | 222650 | 0.0005 | - |
| 16.6306 | 222700 | 0.0 | - |
| 16.6343 | 222750 | 0.0001 | - |
| 16.6380 | 222800 | 0.0002 | - |
| 16.6418 | 222850 | 0.0002 | - |
| 16.6455 | 222900 | 0.0 | - |
| 16.6492 | 222950 | 0.0 | - |
| 16.6530 | 223000 | 0.0 | - |
| 16.6567 | 223050 | 0.0001 | - |
| 16.6604 | 223100 | 0.0004 | - |
| 16.6642 | 223150 | 0.0005 | - |
| 16.6679 | 223200 | 0.0002 | - |
| 16.6716 | 223250 | 0.0002 | - |
| 16.6754 | 223300 | 0.0 | - |
| 16.6791 | 223350 | 0.0 | - |
| 16.6828 | 223400 | 0.0 | - |
| 16.6866 | 223450 | 0.0005 | - |
| 16.6903 | 223500 | 0.0 | - |
| 16.6940 | 223550 | 0.0 | - |
| 16.6978 | 223600 | 0.0002 | - |
| 16.7015 | 223650 | 0.0 | - |
| 16.7052 | 223700 | 0.0002 | - |
| 16.7090 | 223750 | 0.0 | - |
| 16.7127 | 223800 | 0.0003 | - |
| 16.7165 | 223850 | 0.0007 | - |
| 16.7202 | 223900 | 0.0 | - |
| 16.7239 | 223950 | 0.0001 | - |
| 16.7277 | 224000 | 0.0002 | - |
| 16.7314 | 224050 | 0.0003 | - |
| 16.7351 | 224100 | 0.0003 | - |
| 16.7389 | 224150 | 0.0 | - |
| 16.7426 | 224200 | 0.0 | - |
| 16.7463 | 224250 | 0.0004 | - |
| 16.7501 | 224300 | 0.0002 | - |
| 16.7538 | 224350 | 0.0002 | - |
| 16.7575 | 224400 | 0.0 | - |
| 16.7613 | 224450 | 0.0 | - |
| 16.7650 | 224500 | 0.0 | - |
| 16.7687 | 224550 | 0.0002 | - |
| 16.7725 | 224600 | 0.0002 | - |
| 16.7762 | 224650 | 0.0004 | - |
| 16.7799 | 224700 | 0.0005 | - |
| 16.7837 | 224750 | 0.0003 | - |
| 16.7874 | 224800 | 0.0 | - |
| 16.7911 | 224850 | 0.0002 | - |
| 16.7949 | 224900 | 0.0002 | - |
| 16.7986 | 224950 | 0.0001 | - |
| 16.8023 | 225000 | 0.0005 | - |
| 16.8061 | 225050 | 0.0005 | - |
| 16.8098 | 225100 | 0.0 | - |
| 16.8135 | 225150 | 0.0004 | - |
| 16.8173 | 225200 | 0.0 | - |
| 16.8210 | 225250 | 0.0004 | - |
| 16.8247 | 225300 | 0.0002 | - |
| 16.8285 | 225350 | 0.0 | - |
| 16.8322 | 225400 | 0.0 | - |
| 16.8359 | 225450 | 0.0002 | - |
| 16.8397 | 225500 | 0.0002 | - |
| 16.8434 | 225550 | 0.0003 | - |
| 16.8471 | 225600 | 0.0003 | - |
| 16.8509 | 225650 | 0.0004 | - |
| 16.8546 | 225700 | 0.0 | - |
| 16.8583 | 225750 | 0.0 | - |
| 16.8621 | 225800 | 0.0004 | - |
| 16.8658 | 225850 | 0.0003 | - |
| 16.8695 | 225900 | 0.0 | - |
| 16.8733 | 225950 | 0.0001 | - |
| 16.8770 | 226000 | 0.0 | - |
| 16.8807 | 226050 | 0.0001 | - |
| 16.8845 | 226100 | 0.0 | - |
| 16.8882 | 226150 | 0.0 | - |
| 16.8919 | 226200 | 0.0001 | - |
| 16.8957 | 226250 | 0.0 | - |
| 16.8994 | 226300 | 0.0002 | - |
| 16.9031 | 226350 | 0.0 | - |
| 16.9069 | 226400 | 0.0002 | - |
| 16.9106 | 226450 | 0.0002 | - |
| 16.9143 | 226500 | 0.0001 | - |
| 16.9181 | 226550 | 0.0002 | - |
| 16.9218 | 226600 | 0.0005 | - |
| 16.9255 | 226650 | 0.0 | - |
| 16.9293 | 226700 | 0.0002 | - |
| 16.9330 | 226750 | 0.0001 | - |
| 16.9367 | 226800 | 0.0002 | - |
| 16.9405 | 226850 | 0.0003 | - |
| 16.9442 | 226900 | 0.0 | - |
| 16.9480 | 226950 | 0.0003 | - |
| 16.9517 | 227000 | 0.0001 | - |
| 16.9554 | 227050 | 0.0 | - |
| 16.9592 | 227100 | 0.0 | - |
| 16.9629 | 227150 | 0.0 | - |
| 16.9666 | 227200 | 0.0 | - |
| 16.9704 | 227250 | 0.0002 | - |
| 16.9741 | 227300 | 0.0004 | - |
| 16.9778 | 227350 | 0.0002 | - |
| 16.9816 | 227400 | 0.0 | - |
| 16.9853 | 227450 | 0.0 | - |
| 16.9890 | 227500 | 0.0 | - |
| 16.9928 | 227550 | 0.0002 | - |
| 16.9965 | 227600 | 0.0 | - |
| 17.0002 | 227650 | 0.0003 | - |
| 17.0040 | 227700 | 0.0005 | - |
| 17.0077 | 227750 | 0.0 | - |
| 17.0114 | 227800 | 0.0 | - |
| 17.0152 | 227850 | 0.0003 | - |
| 17.0189 | 227900 | 0.0003 | - |
| 17.0226 | 227950 | 0.0002 | - |
| 17.0264 | 228000 | 0.0002 | - |
| 17.0301 | 228050 | 0.0002 | - |
| 17.0338 | 228100 | 0.0003 | - |
| 17.0376 | 228150 | 0.0002 | - |
| 17.0413 | 228200 | 0.0002 | - |
| 17.0450 | 228250 | 0.0 | - |
| 17.0488 | 228300 | 0.0001 | - |
| 17.0525 | 228350 | 0.0001 | - |
| 17.0562 | 228400 | 0.0 | - |
| 17.0600 | 228450 | 0.0 | - |
| 17.0637 | 228500 | 0.0002 | - |
| 17.0674 | 228550 | 0.0 | - |
| 17.0712 | 228600 | 0.0 | - |
| 17.0749 | 228650 | 0.0002 | - |
| 17.0786 | 228700 | 0.0002 | - |
| 17.0824 | 228750 | 0.0002 | - |
| 17.0861 | 228800 | 0.0002 | - |
| 17.0898 | 228850 | 0.0002 | - |
| 17.0936 | 228900 | 0.0004 | - |
| 17.0973 | 228950 | 0.0002 | - |
| 17.1010 | 229000 | 0.0001 | - |
| 17.1048 | 229050 | 0.0001 | - |
| 17.1085 | 229100 | 0.0001 | - |
| 17.1122 | 229150 | 0.0 | - |
| 17.1160 | 229200 | 0.0002 | - |
| 17.1197 | 229250 | 0.0002 | - |
| 17.1234 | 229300 | 0.0 | - |
| 17.1272 | 229350 | 0.0 | - |
| 17.1309 | 229400 | 0.0 | - |
| 17.1346 | 229450 | 0.0001 | - |
| 17.1384 | 229500 | 0.0003 | - |
| 17.1421 | 229550 | 0.0003 | - |
| 17.1458 | 229600 | 0.0 | - |
| 17.1496 | 229650 | 0.0002 | - |
| 17.1533 | 229700 | 0.0001 | - |
| 17.1570 | 229750 | 0.0 | - |
| 17.1608 | 229800 | 0.0006 | - |
| 17.1645 | 229850 | 0.0 | - |
| 17.1682 | 229900 | 0.0 | - |
| 17.1720 | 229950 | 0.0002 | - |
| 17.1757 | 230000 | 0.0002 | - |
| 17.1794 | 230050 | 0.0 | - |
| 17.1832 | 230100 | 0.0002 | - |
| 17.1869 | 230150 | 0.0002 | - |
| 17.1907 | 230200 | 0.0002 | - |
| 17.1944 | 230250 | 0.0 | - |
| 17.1981 | 230300 | 0.0 | - |
| 17.2019 | 230350 | 0.0001 | - |
| 17.2056 | 230400 | 0.0002 | - |
| 17.2093 | 230450 | 0.0 | - |
| 17.2131 | 230500 | 0.0003 | - |
| 17.2168 | 230550 | 0.0002 | - |
| 17.2205 | 230600 | 0.0002 | - |
| 17.2243 | 230650 | 0.0 | - |
| 17.2280 | 230700 | 0.0003 | - |
| 17.2317 | 230750 | 0.0 | - |
| 17.2355 | 230800 | 0.0002 | - |
| 17.2392 | 230850 | 0.0002 | - |
| 17.2429 | 230900 | 0.0002 | - |
| 17.2467 | 230950 | 0.0002 | - |
| 17.2504 | 231000 | 0.0005 | - |
| 17.2541 | 231050 | 0.0006 | - |
| 17.2579 | 231100 | 0.0003 | - |
| 17.2616 | 231150 | 0.0002 | - |
| 17.2653 | 231200 | 0.0002 | - |
| 17.2691 | 231250 | 0.0001 | - |
| 17.2728 | 231300 | 0.0002 | - |
| 17.2765 | 231350 | 0.0003 | - |
| 17.2803 | 231400 | 0.0 | - |
| 17.2840 | 231450 | 0.0002 | - |
| 17.2877 | 231500 | 0.0 | - |
| 17.2915 | 231550 | 0.0 | - |
| 17.2952 | 231600 | 0.0 | - |
| 17.2989 | 231650 | 0.0005 | - |
| 17.3027 | 231700 | 0.0002 | - |
| 17.3064 | 231750 | 0.0 | - |
| 17.3101 | 231800 | 0.0003 | - |
| 17.3139 | 231850 | 0.0002 | - |
| 17.3176 | 231900 | 0.0002 | - |
| 17.3213 | 231950 | 0.0001 | - |
| 17.3251 | 232000 | 0.0002 | - |
| 17.3288 | 232050 | 0.0 | - |
| 17.3325 | 232100 | 0.0003 | - |
| 17.3363 | 232150 | 0.0 | - |
| 17.3400 | 232200 | 0.0 | - |
| 17.3437 | 232250 | 0.0002 | - |
| 17.3475 | 232300 | 0.0002 | - |
| 17.3512 | 232350 | 0.0 | - |
| 17.3549 | 232400 | 0.0001 | - |
| 17.3587 | 232450 | 0.0001 | - |
| 17.3624 | 232500 | 0.0003 | - |
| 17.3661 | 232550 | 0.0005 | - |
| 17.3699 | 232600 | 0.0 | - |
| 17.3736 | 232650 | 0.0002 | - |
| 17.3773 | 232700 | 0.0001 | - |
| 17.3811 | 232750 | 0.0001 | - |
| 17.3848 | 232800 | 0.0002 | - |
| 17.3885 | 232850 | 0.0003 | - |
| 17.3923 | 232900 | 0.0 | - |
| 17.3960 | 232950 | 0.0002 | - |
| 17.3997 | 233000 | 0.0001 | - |
| 17.4035 | 233050 | 0.0001 | - |
| 17.4072 | 233100 | 0.0003 | - |
| 17.4109 | 233150 | 0.0005 | - |
| 17.4147 | 233200 | 0.0 | - |
| 17.4184 | 233250 | 0.0001 | - |
| 17.4221 | 233300 | 0.0001 | - |
| 17.4259 | 233350 | 0.0002 | - |
| 17.4296 | 233400 | 0.0002 | - |
| 17.4334 | 233450 | 0.0002 | - |
| 17.4371 | 233500 | 0.0004 | - |
| 17.4408 | 233550 | 0.0003 | - |
| 17.4446 | 233600 | 0.0003 | - |
| 17.4483 | 233650 | 0.0011 | - |
| 17.4520 | 233700 | 0.0002 | - |
| 17.4558 | 233750 | 0.0 | - |
| 17.4595 | 233800 | 0.0 | - |
| 17.4632 | 233850 | 0.0002 | - |
| 17.4670 | 233900 | 0.0003 | - |
| 17.4707 | 233950 | 0.0001 | - |
| 17.4744 | 234000 | 0.0001 | - |
| 17.4782 | 234050 | 0.0005 | - |
| 17.4819 | 234100 | 0.0003 | - |
| 17.4856 | 234150 | 0.0002 | - |
| 17.4894 | 234200 | 0.0 | - |
| 17.4931 | 234250 | 0.0003 | - |
| 17.4968 | 234300 | 0.0001 | - |
| 17.5006 | 234350 | 0.0001 | - |
| 17.5043 | 234400 | 0.0002 | - |
| 17.5080 | 234450 | 0.0002 | - |
| 17.5118 | 234500 | 0.0003 | - |
| 17.5155 | 234550 | 0.0004 | - |
| 17.5192 | 234600 | 0.0001 | - |
| 17.5230 | 234650 | 0.0003 | - |
| 17.5267 | 234700 | 0.0003 | - |
| 17.5304 | 234750 | 0.0003 | - |
| 17.5342 | 234800 | 0.0 | - |
| 17.5379 | 234850 | 0.0 | - |
| 17.5416 | 234900 | 0.0004 | - |
| 17.5454 | 234950 | 0.0003 | - |
| 17.5491 | 235000 | 0.0 | - |
| 17.5528 | 235050 | 0.0 | - |
| 17.5566 | 235100 | 0.0 | - |
| 17.5603 | 235150 | 0.0003 | - |
| 17.5640 | 235200 | 0.0 | - |
| 17.5678 | 235250 | 0.0002 | - |
| 17.5715 | 235300 | 0.0 | - |
| 17.5752 | 235350 | 0.0002 | - |
| 17.5790 | 235400 | 0.0002 | - |
| 17.5827 | 235450 | 0.0 | - |
| 17.5864 | 235500 | 0.0 | - |
| 17.5902 | 235550 | 0.0 | - |
| 17.5939 | 235600 | 0.0002 | - |
| 17.5976 | 235650 | 0.0001 | - |
| 17.6014 | 235700 | 0.0002 | - |
| 17.6051 | 235750 | 0.0003 | - |
| 17.6088 | 235800 | 0.0002 | - |
| 17.6126 | 235850 | 0.0003 | - |
| 17.6163 | 235900 | 0.0005 | - |
| 17.6200 | 235950 | 0.0003 | - |
| 17.6238 | 236000 | 0.0 | - |
| 17.6275 | 236050 | 0.0002 | - |
| 17.6312 | 236100 | 0.0002 | - |
| 17.6350 | 236150 | 0.0002 | - |
| 17.6387 | 236200 | 0.0002 | - |
| 17.6424 | 236250 | 0.0 | - |
| 17.6462 | 236300 | 0.0002 | - |
| 17.6499 | 236350 | 0.0 | - |
| 17.6536 | 236400 | 0.0002 | - |
| 17.6574 | 236450 | 0.0002 | - |
| 17.6611 | 236500 | 0.0004 | - |
| 17.6648 | 236550 | 0.0001 | - |
| 17.6686 | 236600 | 0.0003 | - |
| 17.6723 | 236650 | 0.0 | - |
| 17.6761 | 236700 | 0.0003 | - |
| 17.6798 | 236750 | 0.0001 | - |
| 17.6835 | 236800 | 0.0003 | - |
| 17.6873 | 236850 | 0.0002 | - |
| 17.6910 | 236900 | 0.0002 | - |
| 17.6947 | 236950 | 0.0004 | - |
| 17.6985 | 237000 | 0.0002 | - |
| 17.7022 | 237050 | 0.0 | - |
| 17.7059 | 237100 | 0.0002 | - |
| 17.7097 | 237150 | 0.0001 | - |
| 17.7134 | 237200 | 0.0002 | - |
| 17.7171 | 237250 | 0.0003 | - |
| 17.7209 | 237300 | 0.0002 | - |
| 17.7246 | 237350 | 0.0002 | - |
| 17.7283 | 237400 | 0.0003 | - |
| 17.7321 | 237450 | 0.0002 | - |
| 17.7358 | 237500 | 0.0 | - |
| 17.7395 | 237550 | 0.0002 | - |
| 17.7433 | 237600 | 0.0 | - |
| 17.7470 | 237650 | 0.0001 | - |
| 17.7507 | 237700 | 0.0 | - |
| 17.7545 | 237750 | 0.0 | - |
| 17.7582 | 237800 | 0.0002 | - |
| 17.7619 | 237850 | 0.0 | - |
| 17.7657 | 237900 | 0.0003 | - |
| 17.7694 | 237950 | 0.0 | - |
| 17.7731 | 238000 | 0.0002 | - |
| 17.7769 | 238050 | 0.0003 | - |
| 17.7806 | 238100 | 0.0001 | - |
| 17.7843 | 238150 | 0.0002 | - |
| 17.7881 | 238200 | 0.0 | - |
| 17.7918 | 238250 | 0.0002 | - |
| 17.7955 | 238300 | 0.0 | - |
| 17.7993 | 238350 | 0.0 | - |
| 17.8030 | 238400 | 0.0 | - |
| 17.8067 | 238450 | 0.0 | - |
| 17.8105 | 238500 | 0.0 | - |
| 17.8142 | 238550 | 0.0002 | - |
| 17.8179 | 238600 | 0.0002 | - |
| 17.8217 | 238650 | 0.0004 | - |
| 17.8254 | 238700 | 0.0006 | - |
| 17.8291 | 238750 | 0.0002 | - |
| 17.8329 | 238800 | 0.0004 | - |
| 17.8366 | 238850 | 0.0004 | - |
| 17.8403 | 238900 | 0.0002 | - |
| 17.8441 | 238950 | 0.0002 | - |
| 17.8478 | 239000 | 0.0002 | - |
| 17.8515 | 239050 | 0.0 | - |
| 17.8553 | 239100 | 0.0001 | - |
| 17.8590 | 239150 | 0.0 | - |
| 17.8627 | 239200 | 0.0001 | - |
| 17.8665 | 239250 | 0.0001 | - |
| 17.8702 | 239300 | 0.0002 | - |
| 17.8739 | 239350 | 0.0005 | - |
| 17.8777 | 239400 | 0.0006 | - |
| 17.8814 | 239450 | 0.0002 | - |
| 17.8851 | 239500 | 0.0002 | - |
| 17.8889 | 239550 | 0.0 | - |
| 17.8926 | 239600 | 0.0004 | - |
| 17.8963 | 239650 | 0.0003 | - |
| 17.9001 | 239700 | 0.0002 | - |
| 17.9038 | 239750 | 0.0003 | - |
| 17.9075 | 239800 | 0.0 | - |
| 17.9113 | 239850 | 0.0004 | - |
| 17.9150 | 239900 | 0.0 | - |
| 17.9188 | 239950 | 0.0001 | - |
| 17.9225 | 240000 | 0.0 | - |
| 17.9262 | 240050 | 0.0004 | - |
| 17.9300 | 240100 | 0.0002 | - |
| 17.9337 | 240150 | 0.0 | - |
| 17.9374 | 240200 | 0.0 | - |
| 17.9412 | 240250 | 0.0 | - |
| 17.9449 | 240300 | 0.0005 | - |
| 17.9486 | 240350 | 0.0 | - |
| 17.9524 | 240400 | 0.0002 | - |
| 17.9561 | 240450 | 0.0002 | - |
| 17.9598 | 240500 | 0.0003 | - |
| 17.9636 | 240550 | 0.0005 | - |
| 17.9673 | 240600 | 0.0002 | - |
| 17.9710 | 240650 | 0.0006 | - |
| 17.9748 | 240700 | 0.0002 | - |
| 17.9785 | 240750 | 0.0006 | - |
| 17.9822 | 240800 | 0.0 | - |
| 17.9860 | 240850 | 0.0 | - |
| 17.9897 | 240900 | 0.0 | - |
| 17.9934 | 240950 | 0.0002 | - |
| 17.9972 | 241000 | 0.0 | - |
| 18.0009 | 241050 | 0.0002 | - |
| 18.0046 | 241100 | 0.0002 | - |
| 18.0084 | 241150 | 0.0004 | - |
| 18.0121 | 241200 | 0.0004 | - |
| 18.0158 | 241250 | 0.0004 | - |
| 18.0196 | 241300 | 0.0003 | - |
| 18.0233 | 241350 | 0.0001 | - |
| 18.0270 | 241400 | 0.0001 | - |
| 18.0308 | 241450 | 0.0002 | - |
| 18.0345 | 241500 | 0.0002 | - |
| 18.0382 | 241550 | 0.0004 | - |
| 18.0420 | 241600 | 0.0002 | - |
| 18.0457 | 241650 | 0.0002 | - |
| 18.0494 | 241700 | 0.0002 | - |
| 18.0532 | 241750 | 0.0008 | - |
| 18.0569 | 241800 | 0.0 | - |
| 18.0606 | 241850 | 0.0006 | - |
| 18.0644 | 241900 | 0.0004 | - |
| 18.0681 | 241950 | 0.0 | - |
| 18.0718 | 242000 | 0.0003 | - |
| 18.0756 | 242050 | 0.0004 | - |
| 18.0793 | 242100 | 0.0003 | - |
| 18.0830 | 242150 | 0.0005 | - |
| 18.0868 | 242200 | 0.0 | - |
| 18.0905 | 242250 | 0.0002 | - |
| 18.0942 | 242300 | 0.0002 | - |
| 18.0980 | 242350 | 0.0 | - |
| 18.1017 | 242400 | 0.0002 | - |
| 18.1054 | 242450 | 0.0004 | - |
| 18.1092 | 242500 | 0.0001 | - |
| 18.1129 | 242550 | 0.0003 | - |
| 18.1166 | 242600 | 0.0002 | - |
| 18.1204 | 242650 | 0.0002 | - |
| 18.1241 | 242700 | 0.0001 | - |
| 18.1278 | 242750 | 0.0002 | - |
| 18.1316 | 242800 | 0.0002 | - |
| 18.1353 | 242850 | 0.0002 | - |
| 18.1390 | 242900 | 0.0003 | - |
| 18.1428 | 242950 | 0.0002 | - |
| 18.1465 | 243000 | 0.0005 | - |
| 18.1503 | 243050 | 0.0 | - |
| 18.1540 | 243100 | 0.0002 | - |
| 18.1577 | 243150 | 0.0003 | - |
| 18.1615 | 243200 | 0.0003 | - |
| 18.1652 | 243250 | 0.0 | - |
| 18.1689 | 243300 | 0.0006 | - |
| 18.1727 | 243350 | 0.0007 | - |
| 18.1764 | 243400 | 0.0 | - |
| 18.1801 | 243450 | 0.0005 | - |
| 18.1839 | 243500 | 0.0003 | - |
| 18.1876 | 243550 | 0.0001 | - |
| 18.1913 | 243600 | 0.0001 | - |
| 18.1951 | 243650 | 0.0002 | - |
| 18.1988 | 243700 | 0.0003 | - |
| 18.2025 | 243750 | 0.0 | - |
| 18.2063 | 243800 | 0.0002 | - |
| 18.2100 | 243850 | 0.0002 | - |
| 18.2137 | 243900 | 0.0 | - |
| 18.2175 | 243950 | 0.0002 | - |
| 18.2212 | 244000 | 0.0002 | - |
| 18.2249 | 244050 | 0.0001 | - |
| 18.2287 | 244100 | 0.0005 | - |
| 18.2324 | 244150 | 0.0001 | - |
| 18.2361 | 244200 | 0.0002 | - |
| 18.2399 | 244250 | 0.0 | - |
| 18.2436 | 244300 | 0.0 | - |
| 18.2473 | 244350 | 0.0007 | - |
| 18.2511 | 244400 | 0.0001 | - |
| 18.2548 | 244450 | 0.0001 | - |
| 18.2585 | 244500 | 0.0001 | - |
| 18.2623 | 244550 | 0.0006 | - |
| 18.2660 | 244600 | 0.0 | - |
| 18.2697 | 244650 | 0.0003 | - |
| 18.2735 | 244700 | 0.0003 | - |
| 18.2772 | 244750 | 0.0 | - |
| 18.2809 | 244800 | 0.0002 | - |
| 18.2847 | 244850 | 0.0001 | - |
| 18.2884 | 244900 | 0.0002 | - |
| 18.2921 | 244950 | 0.0 | - |
| 18.2959 | 245000 | 0.0003 | - |
| 18.2996 | 245050 | 0.0 | - |
| 18.3033 | 245100 | 0.0003 | - |
| 18.3071 | 245150 | 0.0 | - |
| 18.3108 | 245200 | 0.0 | - |
| 18.3145 | 245250 | 0.0002 | - |
| 18.3183 | 245300 | 0.0003 | - |
| 18.3220 | 245350 | 0.0002 | - |
| 18.3257 | 245400 | 0.0002 | - |
| 18.3295 | 245450 | 0.0001 | - |
| 18.3332 | 245500 | 0.0003 | - |
| 18.3369 | 245550 | 0.0 | - |
| 18.3407 | 245600 | 0.0002 | - |
| 18.3444 | 245650 | 0.0002 | - |
| 18.3481 | 245700 | 0.0004 | - |
| 18.3519 | 245750 | 0.0002 | - |
| 18.3556 | 245800 | 0.0 | - |
| 18.3593 | 245850 | 0.0 | - |
| 18.3631 | 245900 | 0.0 | - |
| 18.3668 | 245950 | 0.0002 | - |
| 18.3705 | 246000 | 0.0001 | - |
| 18.3743 | 246050 | 0.0002 | - |
| 18.3780 | 246100 | 0.0002 | - |
| 18.3817 | 246150 | 0.0002 | - |
| 18.3855 | 246200 | 0.0 | - |
| 18.3892 | 246250 | 0.0002 | - |
| 18.3930 | 246300 | 0.0001 | - |
| 18.3967 | 246350 | 0.0 | - |
| 18.4004 | 246400 | 0.0 | - |
| 18.4042 | 246450 | 0.0 | - |
| 18.4079 | 246500 | 0.0002 | - |
| 18.4116 | 246550 | 0.0 | - |
| 18.4154 | 246600 | 0.0002 | - |
| 18.4191 | 246650 | 0.0 | - |
| 18.4228 | 246700 | 0.0 | - |
| 18.4266 | 246750 | 0.0 | - |
| 18.4303 | 246800 | 0.0 | - |
| 18.4340 | 246850 | 0.0 | - |
| 18.4378 | 246900 | 0.0 | - |
| 18.4415 | 246950 | 0.0 | - |
| 18.4452 | 247000 | 0.0 | - |
| 18.4490 | 247050 | 0.0 | - |
| 18.4527 | 247100 | 0.0002 | - |
| 18.4564 | 247150 | 0.0 | - |
| 18.4602 | 247200 | 0.0 | - |
| 18.4639 | 247250 | 0.0002 | - |
| 18.4676 | 247300 | 0.0002 | - |
| 18.4714 | 247350 | 0.0001 | - |
| 18.4751 | 247400 | 0.0002 | - |
| 18.4788 | 247450 | 0.0002 | - |
| 18.4826 | 247500 | 0.0003 | - |
| 18.4863 | 247550 | 0.0 | - |
| 18.4900 | 247600 | 0.0002 | - |
| 18.4938 | 247650 | 0.0 | - |
| 18.4975 | 247700 | 0.0 | - |
| 18.5012 | 247750 | 0.0 | - |
| 18.5050 | 247800 | 0.0 | - |
| 18.5087 | 247850 | 0.0002 | - |
| 18.5124 | 247900 | 0.0002 | - |
| 18.5162 | 247950 | 0.0 | - |
| 18.5199 | 248000 | 0.0003 | - |
| 18.5236 | 248050 | 0.0003 | - |
| 18.5274 | 248100 | 0.0001 | - |
| 18.5311 | 248150 | 0.0 | - |
| 18.5348 | 248200 | 0.0 | - |
| 18.5386 | 248250 | 0.0 | - |
| 18.5423 | 248300 | 0.0 | - |
| 18.5460 | 248350 | 0.0 | - |
| 18.5498 | 248400 | 0.0003 | - |
| 18.5535 | 248450 | 0.0002 | - |
| 18.5572 | 248500 | 0.0001 | - |
| 18.5610 | 248550 | 0.0 | - |
| 18.5647 | 248600 | 0.0 | - |
| 18.5684 | 248650 | 0.0 | - |
| 18.5722 | 248700 | 0.0003 | - |
| 18.5759 | 248750 | 0.0002 | - |
| 18.5796 | 248800 | 0.0003 | - |
| 18.5834 | 248850 | 0.0006 | - |
| 18.5871 | 248900 | 0.0003 | - |
| 18.5908 | 248950 | 0.0003 | - |
| 18.5946 | 249000 | 0.0 | - |
| 18.5983 | 249050 | 0.0 | - |
| 18.6020 | 249100 | 0.0001 | - |
| 18.6058 | 249150 | 0.0005 | - |
| 18.6095 | 249200 | 0.0 | - |
| 18.6132 | 249250 | 0.0 | - |
| 18.6170 | 249300 | 0.0 | - |
| 18.6207 | 249350 | 0.0001 | - |
| 18.6244 | 249400 | 0.0 | - |
| 18.6282 | 249450 | 0.0 | - |
| 18.6319 | 249500 | 0.0003 | - |
| 18.6357 | 249550 | 0.0003 | - |
| 18.6394 | 249600 | 0.0002 | - |
| 18.6431 | 249650 | 0.0001 | - |
| 18.6469 | 249700 | 0.0002 | - |
| 18.6506 | 249750 | 0.0 | - |
| 18.6543 | 249800 | 0.0006 | - |
| 18.6581 | 249850 | 0.0001 | - |
| 18.6618 | 249900 | 0.0002 | - |
| 18.6655 | 249950 | 0.0 | - |
| 18.6693 | 250000 | 0.0001 | - |
| 18.6730 | 250050 | 0.0001 | - |
| 18.6767 | 250100 | 0.0002 | - |
| 18.6805 | 250150 | 0.0001 | - |
| 18.6842 | 250200 | 0.0004 | - |
| 18.6879 | 250250 | 0.0 | - |
| 18.6917 | 250300 | 0.0002 | - |
| 18.6954 | 250350 | 0.0 | - |
| 18.6991 | 250400 | 0.0002 | - |
| 18.7029 | 250450 | 0.0002 | - |
| 18.7066 | 250500 | 0.0001 | - |
| 18.7103 | 250550 | 0.0 | - |
| 18.7141 | 250600 | 0.0002 | - |
| 18.7178 | 250650 | 0.0001 | - |
| 18.7215 | 250700 | 0.0003 | - |
| 18.7253 | 250750 | 0.0002 | - |
| 18.7290 | 250800 | 0.0 | - |
| 18.7327 | 250850 | 0.0 | - |
| 18.7365 | 250900 | 0.0001 | - |
| 18.7402 | 250950 | 0.0001 | - |
| 18.7439 | 251000 | 0.0 | - |
| 18.7477 | 251050 | 0.0002 | - |
| 18.7514 | 251100 | 0.0007 | - |
| 18.7551 | 251150 | 0.0002 | - |
| 18.7589 | 251200 | 0.0003 | - |
| 18.7626 | 251250 | 0.0005 | - |
| 18.7663 | 251300 | 0.0001 | - |
| 18.7701 | 251350 | 0.0003 | - |
| 18.7738 | 251400 | 0.0 | - |
| 18.7775 | 251450 | 0.0001 | - |
| 18.7813 | 251500 | 0.0 | - |
| 18.7850 | 251550 | 0.0002 | - |
| 18.7887 | 251600 | 0.0 | - |
| 18.7925 | 251650 | 0.0002 | - |
| 18.7962 | 251700 | 0.0 | - |
| 18.7999 | 251750 | 0.0003 | - |
| 18.8037 | 251800 | 0.0003 | - |
| 18.8074 | 251850 | 0.0 | - |
| 18.8111 | 251900 | 0.0 | - |
| 18.8149 | 251950 | 0.0002 | - |
| 18.8186 | 252000 | 0.0003 | - |
| 18.8223 | 252050 | 0.0005 | - |
| 18.8261 | 252100 | 0.0005 | - |
| 18.8298 | 252150 | 0.0009 | - |
| 18.8335 | 252200 | 0.0006 | - |
| 18.8373 | 252250 | 0.0001 | - |
| 18.8410 | 252300 | 0.0003 | - |
| 18.8447 | 252350 | 0.0001 | - |
| 18.8485 | 252400 | 0.0 | - |
| 18.8522 | 252450 | 0.0001 | - |
| 18.8559 | 252500 | 0.0005 | - |
| 18.8597 | 252550 | 0.0007 | - |
| 18.8634 | 252600 | 0.0003 | - |
| 18.8671 | 252650 | 0.0 | - |
| 18.8709 | 252700 | 0.0001 | - |
| 18.8746 | 252750 | 0.0001 | - |
| 18.8784 | 252800 | 0.0004 | - |
| 18.8821 | 252850 | 0.0002 | - |
| 18.8858 | 252900 | 0.0003 | - |
| 18.8896 | 252950 | 0.0 | - |
| 18.8933 | 253000 | 0.0002 | - |
| 18.8970 | 253050 | 0.0003 | - |
| 18.9008 | 253100 | 0.0 | - |
| 18.9045 | 253150 | 0.0 | - |
| 18.9082 | 253200 | 0.0002 | - |
| 18.9120 | 253250 | 0.0002 | - |
| 18.9157 | 253300 | 0.0003 | - |
| 18.9194 | 253350 | 0.0003 | - |
| 18.9232 | 253400 | 0.0 | - |
| 18.9269 | 253450 | 0.0 | - |
| 18.9306 | 253500 | 0.0003 | - |
| 18.9344 | 253550 | 0.0 | - |
| 18.9381 | 253600 | 0.0 | - |
| 18.9418 | 253650 | 0.0004 | - |
| 18.9456 | 253700 | 0.0 | - |
| 18.9493 | 253750 | 0.0 | - |
| 18.9530 | 253800 | 0.0002 | - |
| 18.9568 | 253850 | 0.0002 | - |
| 18.9605 | 253900 | 0.0 | - |
| 18.9642 | 253950 | 0.0002 | - |
| 18.9680 | 254000 | 0.0 | - |
| 18.9717 | 254050 | 0.0002 | - |
| 18.9754 | 254100 | 0.0003 | - |
| 18.9792 | 254150 | 0.0002 | - |
| 18.9829 | 254200 | 0.0001 | - |
| 18.9866 | 254250 | 0.0002 | - |
| 18.9904 | 254300 | 0.0 | - |
| 18.9941 | 254350 | 0.0 | - |
| 18.9978 | 254400 | 0.0001 | - |
| 19.0016 | 254450 | 0.0002 | - |
| 19.0053 | 254500 | 0.0002 | - |
| 19.0090 | 254550 | 0.0002 | - |
| 19.0128 | 254600 | 0.0002 | - |
| 19.0165 | 254650 | 0.0002 | - |
| 19.0202 | 254700 | 0.0001 | - |
| 19.0240 | 254750 | 0.0 | - |
| 19.0277 | 254800 | 0.0003 | - |
| 19.0314 | 254850 | 0.0001 | - |
| 19.0352 | 254900 | 0.0001 | - |
| 19.0389 | 254950 | 0.0001 | - |
| 19.0426 | 255000 | 0.0002 | - |
| 19.0464 | 255050 | 0.0002 | - |
| 19.0501 | 255100 | 0.0 | - |
| 19.0538 | 255150 | 0.0 | - |
| 19.0576 | 255200 | 0.0003 | - |
| 19.0613 | 255250 | 0.0002 | - |
| 19.0650 | 255300 | 0.0002 | - |
| 19.0688 | 255350 | 0.0 | - |
| 19.0725 | 255400 | 0.0 | - |
| 19.0762 | 255450 | 0.0 | - |
| 19.0800 | 255500 | 0.0002 | - |
| 19.0837 | 255550 | 0.0 | - |
| 19.0874 | 255600 | 0.0 | - |
| 19.0912 | 255650 | 0.0 | - |
| 19.0949 | 255700 | 0.0 | - |
| 19.0986 | 255750 | 0.0002 | - |
| 19.1024 | 255800 | 0.0002 | - |
| 19.1061 | 255850 | 0.0003 | - |
| 19.1098 | 255900 | 0.0002 | - |
| 19.1136 | 255950 | 0.0 | - |
| 19.1173 | 256000 | 0.0003 | - |
| 19.1211 | 256050 | 0.0 | - |
| 19.1248 | 256100 | 0.0002 | - |
| 19.1285 | 256150 | 0.0002 | - |
| 19.1323 | 256200 | 0.0 | - |
| 19.1360 | 256250 | 0.0002 | - |
| 19.1397 | 256300 | 0.0002 | - |
| 19.1435 | 256350 | 0.0 | - |
| 19.1472 | 256400 | 0.0002 | - |
| 19.1509 | 256450 | 0.0 | - |
| 19.1547 | 256500 | 0.0 | - |
| 19.1584 | 256550 | 0.0002 | - |
| 19.1621 | 256600 | 0.0 | - |
| 19.1659 | 256650 | 0.0004 | - |
| 19.1696 | 256700 | 0.0003 | - |
| 19.1733 | 256750 | 0.0 | - |
| 19.1771 | 256800 | 0.0 | - |
| 19.1808 | 256850 | 0.0006 | - |
| 19.1845 | 256900 | 0.0002 | - |
| 19.1883 | 256950 | 0.0003 | - |
| 19.1920 | 257000 | 0.0002 | - |
| 19.1957 | 257050 | 0.0004 | - |
| 19.1995 | 257100 | 0.0 | - |
| 19.2032 | 257150 | 0.0002 | - |
| 19.2069 | 257200 | 0.0 | - |
| 19.2107 | 257250 | 0.0 | - |
| 19.2144 | 257300 | 0.0 | - |
| 19.2181 | 257350 | 0.0 | - |
| 19.2219 | 257400 | 0.0002 | - |
| 19.2256 | 257450 | 0.0002 | - |
| 19.2293 | 257500 | 0.0005 | - |
| 19.2331 | 257550 | 0.0 | - |
| 19.2368 | 257600 | 0.0 | - |
| 19.2405 | 257650 | 0.0003 | - |
| 19.2443 | 257700 | 0.0003 | - |
| 19.2480 | 257750 | 0.0004 | - |
| 19.2517 | 257800 | 0.0005 | - |
| 19.2555 | 257850 | 0.0 | - |
| 19.2592 | 257900 | 0.0 | - |
| 19.2629 | 257950 | 0.0 | - |
| 19.2667 | 258000 | 0.0002 | - |
| 19.2704 | 258050 | 0.0 | - |
| 19.2741 | 258100 | 0.0001 | - |
| 19.2779 | 258150 | 0.0002 | - |
| 19.2816 | 258200 | 0.0004 | - |
| 19.2853 | 258250 | 0.0003 | - |
| 19.2891 | 258300 | 0.0001 | - |
| 19.2928 | 258350 | 0.0 | - |
| 19.2965 | 258400 | 0.0002 | - |
| 19.3003 | 258450 | 0.0 | - |
| 19.3040 | 258500 | 0.0 | - |
| 19.3077 | 258550 | 0.0 | - |
| 19.3115 | 258600 | 0.0 | - |
| 19.3152 | 258650 | 0.0 | - |
| 19.3189 | 258700 | 0.0002 | - |
| 19.3227 | 258750 | 0.0001 | - |
| 19.3264 | 258800 | 0.0005 | - |
| 19.3301 | 258850 | 0.0 | - |
| 19.3339 | 258900 | 0.0 | - |
| 19.3376 | 258950 | 0.0002 | - |
| 19.3413 | 259000 | 0.0002 | - |
| 19.3451 | 259050 | 0.0 | - |
| 19.3488 | 259100 | 0.0 | - |
| 19.3526 | 259150 | 0.0002 | - |
| 19.3563 | 259200 | 0.0 | - |
| 19.3600 | 259250 | 0.0 | - |
| 19.3638 | 259300 | 0.0002 | - |
| 19.3675 | 259350 | 0.0 | - |
| 19.3712 | 259400 | 0.0006 | - |
| 19.3750 | 259450 | 0.0002 | - |
| 19.3787 | 259500 | 0.0001 | - |
| 19.3824 | 259550 | 0.0002 | - |
| 19.3862 | 259600 | 0.0002 | - |
| 19.3899 | 259650 | 0.0003 | - |
| 19.3936 | 259700 | 0.0004 | - |
| 19.3974 | 259750 | 0.0 | - |
| 19.4011 | 259800 | 0.0002 | - |
| 19.4048 | 259850 | 0.0002 | - |
| 19.4086 | 259900 | 0.0 | - |
| 19.4123 | 259950 | 0.0 | - |
| 19.4160 | 260000 | 0.0002 | - |
| 19.4198 | 260050 | 0.0002 | - |
| 19.4235 | 260100 | 0.0003 | - |
| 19.4272 | 260150 | 0.0001 | - |
| 19.4310 | 260200 | 0.0 | - |
| 19.4347 | 260250 | 0.0002 | - |
| 19.4384 | 260300 | 0.0001 | - |
| 19.4422 | 260350 | 0.0002 | - |
| 19.4459 | 260400 | 0.0 | - |
| 19.4496 | 260450 | 0.0005 | - |
| 19.4534 | 260500 | 0.0 | - |
| 19.4571 | 260550 | 0.0001 | - |
| 19.4608 | 260600 | 0.0001 | - |
| 19.4646 | 260650 | 0.0 | - |
| 19.4683 | 260700 | 0.0 | - |
| 19.4720 | 260750 | 0.0 | - |
| 19.4758 | 260800 | 0.0 | - |
| 19.4795 | 260850 | 0.0 | - |
| 19.4832 | 260900 | 0.0 | - |
| 19.4870 | 260950 | 0.0002 | - |
| 19.4907 | 261000 | 0.0 | - |
| 19.4944 | 261050 | 0.0001 | - |
| 19.4982 | 261100 | 0.0002 | - |
| 19.5019 | 261150 | 0.0 | - |
| 19.5056 | 261200 | 0.0001 | - |
| 19.5094 | 261250 | 0.0002 | - |
| 19.5131 | 261300 | 0.0002 | - |
| 19.5168 | 261350 | 0.0 | - |
| 19.5206 | 261400 | 0.0002 | - |
| 19.5243 | 261450 | 0.0 | - |
| 19.5280 | 261500 | 0.0001 | - |
| 19.5318 | 261550 | 0.0001 | - |
| 19.5355 | 261600 | 0.0004 | - |
| 19.5392 | 261650 | 0.0004 | - |
| 19.5430 | 261700 | 0.0002 | - |
| 19.5467 | 261750 | 0.0007 | - |
| 19.5504 | 261800 | 0.0002 | - |
| 19.5542 | 261850 | 0.0 | - |
| 19.5579 | 261900 | 0.0 | - |
| 19.5616 | 261950 | 0.0 | - |
| 19.5654 | 262000 | 0.0006 | - |
| 19.5691 | 262050 | 0.0004 | - |
| 19.5728 | 262100 | 0.0004 | - |
| 19.5766 | 262150 | 0.0003 | - |
| 19.5803 | 262200 | 0.0002 | - |
| 19.5840 | 262250 | 0.0 | - |
| 19.5878 | 262300 | 0.0 | - |
| 19.5915 | 262350 | 0.0002 | - |
| 19.5953 | 262400 | 0.0004 | - |
| 19.5990 | 262450 | 0.0 | - |
| 19.6027 | 262500 | 0.0002 | - |
| 19.6065 | 262550 | 0.0002 | - |
| 19.6102 | 262600 | 0.0002 | - |
| 19.6139 | 262650 | 0.0 | - |
| 19.6177 | 262700 | 0.0 | - |
| 19.6214 | 262750 | 0.0002 | - |
| 19.6251 | 262800 | 0.0001 | - |
| 19.6289 | 262850 | 0.0003 | - |
| 19.6326 | 262900 | 0.0 | - |
| 19.6363 | 262950 | 0.0002 | - |
| 19.6401 | 263000 | 0.0001 | - |
| 19.6438 | 263050 | 0.0002 | - |
| 19.6475 | 263100 | 0.0 | - |
| 19.6513 | 263150 | 0.0002 | - |
| 19.6550 | 263200 | 0.0002 | - |
| 19.6587 | 263250 | 0.0 | - |
| 19.6625 | 263300 | 0.0002 | - |
| 19.6662 | 263350 | 0.0 | - |
| 19.6699 | 263400 | 0.0 | - |
| 19.6737 | 263450 | 0.0 | - |
| 19.6774 | 263500 | 0.0003 | - |
| 19.6811 | 263550 | 0.0004 | - |
| 19.6849 | 263600 | 0.0002 | - |
| 19.6886 | 263650 | 0.0001 | - |
| 19.6923 | 263700 | 0.0003 | - |
| 19.6961 | 263750 | 0.0002 | - |
| 19.6998 | 263800 | 0.0 | - |
| 19.7035 | 263850 | 0.0 | - |
| 19.7073 | 263900 | 0.0002 | - |
| 19.7110 | 263950 | 0.0 | - |
| 19.7147 | 264000 | 0.0 | - |
| 19.7185 | 264050 | 0.0 | - |
| 19.7222 | 264100 | 0.0001 | - |
| 19.7259 | 264150 | 0.0 | - |
| 19.7297 | 264200 | 0.0002 | - |
| 19.7334 | 264250 | 0.0 | - |
| 19.7371 | 264300 | 0.0001 | - |
| 19.7409 | 264350 | 0.0003 | - |
| 19.7446 | 264400 | 0.0 | - |
| 19.7483 | 264450 | 0.0 | - |
| 19.7521 | 264500 | 0.0 | - |
| 19.7558 | 264550 | 0.0002 | - |
| 19.7595 | 264600 | 0.0002 | - |
| 19.7633 | 264650 | 0.0 | - |
| 19.7670 | 264700 | 0.0002 | - |
| 19.7707 | 264750 | 0.0 | - |
| 19.7745 | 264800 | 0.0003 | - |
| 19.7782 | 264850 | 0.0 | - |
| 19.7819 | 264900 | 0.0001 | - |
| 19.7857 | 264950 | 0.0 | - |
| 19.7894 | 265000 | 0.0 | - |
| 19.7931 | 265050 | 0.0003 | - |
| 19.7969 | 265100 | 0.0002 | - |
| 19.8006 | 265150 | 0.0 | - |
| 19.8043 | 265200 | 0.0002 | - |
| 19.8081 | 265250 | 0.0 | - |
| 19.8118 | 265300 | 0.0 | - |
| 19.8155 | 265350 | 0.0 | - |
| 19.8193 | 265400 | 0.0 | - |
| 19.8230 | 265450 | 0.0002 | - |
| 19.8267 | 265500 | 0.0 | - |
| 19.8305 | 265550 | 0.0001 | - |
| 19.8342 | 265600 | 0.0002 | - |
| 19.8380 | 265650 | 0.0002 | - |
| 19.8417 | 265700 | 0.0 | - |
| 19.8454 | 265750 | 0.0 | - |
| 19.8492 | 265800 | 0.0002 | - |
| 19.8529 | 265850 | 0.0002 | - |
| 19.8566 | 265900 | 0.0 | - |
| 19.8604 | 265950 | 0.0 | - |
| 19.8641 | 266000 | 0.0 | - |
| 19.8678 | 266050 | 0.0 | - |
| 19.8716 | 266100 | 0.0 | - |
| 19.8753 | 266150 | 0.0002 | - |
| 19.8790 | 266200 | 0.0 | - |
| 19.8828 | 266250 | 0.0 | - |
| 19.8865 | 266300 | 0.0 | - |
| 19.8902 | 266350 | 0.0002 | - |
| 19.8940 | 266400 | 0.0 | - |
| 19.8977 | 266450 | 0.0002 | - |
| 19.9014 | 266500 | 0.0 | - |
| 19.9052 | 266550 | 0.0004 | - |
| 19.9089 | 266600 | 0.0 | - |
| 19.9126 | 266650 | 0.0002 | - |
| 19.9164 | 266700 | 0.0002 | - |
| 19.9201 | 266750 | 0.0002 | - |
| 19.9238 | 266800 | 0.0002 | - |
| 19.9276 | 266850 | 0.0 | - |
| 19.9313 | 266900 | 0.0002 | - |
| 19.9350 | 266950 | 0.0003 | - |
| 19.9388 | 267000 | 0.0003 | - |
| 19.9425 | 267050 | 0.0002 | - |
| 19.9462 | 267100 | 0.0001 | - |
| 19.9500 | 267150 | 0.0003 | - |
| 19.9537 | 267200 | 0.0003 | - |
| 19.9574 | 267250 | 0.0004 | - |
| 19.9612 | 267300 | 0.0004 | - |
| 19.9649 | 267350 | 0.0 | - |
| 19.9686 | 267400 | 0.0002 | - |
| 19.9724 | 267450 | 0.0002 | - |
| 19.9761 | 267500 | 0.0001 | - |
| 19.9798 | 267550 | 0.0 | - |
| 19.9836 | 267600 | 0.0 | - |
| 19.9873 | 267650 | 0.0002 | - |
| 19.9910 | 267700 | 0.0 | - |
| 19.9948 | 267750 | 0.0001 | - |
| 19.9985 | 267800 | 0.0 | - |
| 20.0022 | 267850 | 0.0 | - |
| 20.0060 | 267900 | 0.0002 | - |
| 20.0097 | 267950 | 0.0002 | - |
| 20.0134 | 268000 | 0.0 | - |
| 20.0172 | 268050 | 0.0 | - |
| 20.0209 | 268100 | 0.0002 | - |
| 20.0246 | 268150 | 0.0002 | - |
| 20.0284 | 268200 | 0.0 | - |
| 20.0321 | 268250 | 0.0002 | - |
| 20.0358 | 268300 | 0.0 | - |
| 20.0396 | 268350 | 0.0 | - |
| 20.0433 | 268400 | 0.0002 | - |
| 20.0470 | 268450 | 0.0001 | - |
| 20.0508 | 268500 | 0.0002 | - |
| 20.0545 | 268550 | 0.0002 | - |
| 20.0582 | 268600 | 0.0002 | - |
| 20.0620 | 268650 | 0.0001 | - |
| 20.0657 | 268700 | 0.0001 | - |
| 20.0694 | 268750 | 0.0002 | - |
| 20.0732 | 268800 | 0.0 | - |
| 20.0769 | 268850 | 0.0002 | - |
| 20.0807 | 268900 | 0.0001 | - |
| 20.0844 | 268950 | 0.0 | - |
| 20.0881 | 269000 | 0.0 | - |
| 20.0919 | 269050 | 0.0003 | - |
| 20.0956 | 269100 | 0.0 | - |
| 20.0993 | 269150 | 0.0 | - |
| 20.1031 | 269200 | 0.0002 | - |
| 20.1068 | 269250 | 0.0002 | - |
| 20.1105 | 269300 | 0.0001 | - |
| 20.1143 | 269350 | 0.0 | - |
| 20.1180 | 269400 | 0.0 | - |
| 20.1217 | 269450 | 0.0002 | - |
| 20.1255 | 269500 | 0.0002 | - |
| 20.1292 | 269550 | 0.0002 | - |
| 20.1329 | 269600 | 0.0 | - |
| 20.1367 | 269650 | 0.0001 | - |
| 20.1404 | 269700 | 0.0 | - |
| 20.1441 | 269750 | 0.0003 | - |
| 20.1479 | 269800 | 0.0 | - |
| 20.1516 | 269850 | 0.0002 | - |
| 20.1553 | 269900 | 0.0 | - |
| 20.1591 | 269950 | 0.0002 | - |
| 20.1628 | 270000 | 0.0 | - |
| 20.1665 | 270050 | 0.0 | - |
| 20.1703 | 270100 | 0.0002 | - |
| 20.1740 | 270150 | 0.0002 | - |
| 20.1777 | 270200 | 0.0002 | - |
| 20.1815 | 270250 | 0.0002 | - |
| 20.1852 | 270300 | 0.0002 | - |
| 20.1889 | 270350 | 0.0 | - |
| 20.1927 | 270400 | 0.0 | - |
| 20.1964 | 270450 | 0.0 | - |
| 20.2001 | 270500 | 0.0 | - |
| 20.2039 | 270550 | 0.0 | - |
| 20.2076 | 270600 | 0.0 | - |
| 20.2113 | 270650 | 0.0002 | - |
| 20.2151 | 270700 | 0.0 | - |
| 20.2188 | 270750 | 0.0001 | - |
| 20.2225 | 270800 | 0.0002 | - |
| 20.2263 | 270850 | 0.0 | - |
| 20.2300 | 270900 | 0.0005 | - |
| 20.2337 | 270950 | 0.0002 | - |
| 20.2375 | 271000 | 0.0002 | - |
| 20.2412 | 271050 | 0.0 | - |
| 20.2449 | 271100 | 0.0 | - |
| 20.2487 | 271150 | 0.0002 | - |
| 20.2524 | 271200 | 0.0004 | - |
| 20.2561 | 271250 | 0.0 | - |
| 20.2599 | 271300 | 0.0 | - |
| 20.2636 | 271350 | 0.0 | - |
| 20.2673 | 271400 | 0.0 | - |
| 20.2711 | 271450 | 0.0 | - |
| 20.2748 | 271500 | 0.0002 | - |
| 20.2785 | 271550 | 0.0002 | - |
| 20.2823 | 271600 | 0.0002 | - |
| 20.2860 | 271650 | 0.0001 | - |
| 20.2897 | 271700 | 0.0 | - |
| 20.2935 | 271750 | 0.0002 | - |
| 20.2972 | 271800 | 0.0001 | - |
| 20.3009 | 271850 | 0.0 | - |
| 20.3047 | 271900 | 0.0 | - |
| 20.3084 | 271950 | 0.0 | - |
| 20.3121 | 272000 | 0.0 | - |
| 20.3159 | 272050 | 0.0003 | - |
| 20.3196 | 272100 | 0.0003 | - |
| 20.3234 | 272150 | 0.0002 | - |
| 20.3271 | 272200 | 0.0001 | - |
| 20.3308 | 272250 | 0.0002 | - |
| 20.3346 | 272300 | 0.0001 | - |
| 20.3383 | 272350 | 0.0 | - |
| 20.3420 | 272400 | 0.0002 | - |
| 20.3458 | 272450 | 0.0004 | - |
| 20.3495 | 272500 | 0.0002 | - |
| 20.3532 | 272550 | 0.0003 | - |
| 20.3570 | 272600 | 0.0 | - |
| 20.3607 | 272650 | 0.0001 | - |
| 20.3644 | 272700 | 0.0 | - |
| 20.3682 | 272750 | 0.0 | - |
| 20.3719 | 272800 | 0.0 | - |
| 20.3756 | 272850 | 0.0001 | - |
| 20.3794 | 272900 | 0.0001 | - |
| 20.3831 | 272950 | 0.0003 | - |
| 20.3868 | 273000 | 0.0001 | - |
| 20.3906 | 273050 | 0.0002 | - |
| 20.3943 | 273100 | 0.0 | - |
| 20.3980 | 273150 | 0.0 | - |
| 20.4018 | 273200 | 0.0001 | - |
| 20.4055 | 273250 | 0.0001 | - |
| 20.4092 | 273300 | 0.0002 | - |
| 20.4130 | 273350 | 0.0001 | - |
| 20.4167 | 273400 | 0.0002 | - |
| 20.4204 | 273450 | 0.0002 | - |
| 20.4242 | 273500 | 0.0001 | - |
| 20.4279 | 273550 | 0.0002 | - |
| 20.4316 | 273600 | 0.0001 | - |
| 20.4354 | 273650 | 0.0001 | - |
| 20.4391 | 273700 | 0.0 | - |
| 20.4428 | 273750 | 0.0 | - |
| 20.4466 | 273800 | 0.0002 | - |
| 20.4503 | 273850 | 0.0 | - |
| 20.4540 | 273900 | 0.0002 | - |
| 20.4578 | 273950 | 0.0002 | - |
| 20.4615 | 274000 | 0.0 | - |
| 20.4652 | 274050 | 0.0003 | - |
| 20.4690 | 274100 | 0.0 | - |
| 20.4727 | 274150 | 0.0002 | - |
| 20.4764 | 274200 | 0.0 | - |
| 20.4802 | 274250 | 0.0002 | - |
| 20.4839 | 274300 | 0.0002 | - |
| 20.4876 | 274350 | 0.0 | - |
| 20.4914 | 274400 | 0.0 | - |
| 20.4951 | 274450 | 0.0005 | - |
| 20.4988 | 274500 | 0.0 | - |
| 20.5026 | 274550 | 0.0 | - |
| 20.5063 | 274600 | 0.0 | - |
| 20.5100 | 274650 | 0.0 | - |
| 20.5138 | 274700 | 0.0 | - |
| 20.5175 | 274750 | 0.0 | - |
| 20.5212 | 274800 | 0.0 | - |
| 20.5250 | 274850 | 0.0 | - |
| 20.5287 | 274900 | 0.0 | - |
| 20.5324 | 274950 | 0.0001 | - |
| 20.5362 | 275000 | 0.0002 | - |
| 20.5399 | 275050 | 0.0 | - |
| 20.5436 | 275100 | 0.0 | - |
| 20.5474 | 275150 | 0.0 | - |
| 20.5511 | 275200 | 0.0002 | - |
| 20.5549 | 275250 | 0.0 | - |
| 20.5586 | 275300 | 0.0002 | - |
| 20.5623 | 275350 | 0.0001 | - |
| 20.5661 | 275400 | 0.0 | - |
| 20.5698 | 275450 | 0.0001 | - |
| 20.5735 | 275500 | 0.0001 | - |
| 20.5773 | 275550 | 0.0 | - |
| 20.5810 | 275600 | 0.0 | - |
| 20.5847 | 275650 | 0.0 | - |
| 20.5885 | 275700 | 0.0002 | - |
| 20.5922 | 275750 | 0.0 | - |
| 20.5959 | 275800 | 0.0002 | - |
| 20.5997 | 275850 | 0.0002 | - |
| 20.6034 | 275900 | 0.0002 | - |
| 20.6071 | 275950 | 0.0 | - |
| 20.6109 | 276000 | 0.0 | - |
| 20.6146 | 276050 | 0.0001 | - |
| 20.6183 | 276100 | 0.0002 | - |
| 20.6221 | 276150 | 0.0 | - |
| 20.6258 | 276200 | 0.0 | - |
| 20.6295 | 276250 | 0.0003 | - |
| 20.6333 | 276300 | 0.0 | - |
| 20.6370 | 276350 | 0.0003 | - |
| 20.6407 | 276400 | 0.0002 | - |
| 20.6445 | 276450 | 0.0003 | - |
| 20.6482 | 276500 | 0.0002 | - |
| 20.6519 | 276550 | 0.0001 | - |
| 20.6557 | 276600 | 0.0 | - |
| 20.6594 | 276650 | 0.0 | - |
| 20.6631 | 276700 | 0.0 | - |
| 20.6669 | 276750 | 0.0 | - |
| 20.6706 | 276800 | 0.0 | - |
| 20.6743 | 276850 | 0.0 | - |
| 20.6781 | 276900 | 0.0001 | - |
| 20.6818 | 276950 | 0.0 | - |
| 20.6855 | 277000 | 0.0003 | - |
| 20.6893 | 277050 | 0.0 | - |
| 20.6930 | 277100 | 0.0002 | - |
| 20.6967 | 277150 | 0.0 | - |
| 20.7005 | 277200 | 0.0 | - |
| 20.7042 | 277250 | 0.0002 | - |
| 20.7079 | 277300 | 0.0 | - |
| 20.7117 | 277350 | 0.0001 | - |
| 20.7154 | 277400 | 0.0002 | - |
| 20.7191 | 277450 | 0.0 | - |
| 20.7229 | 277500 | 0.0003 | - |
| 20.7266 | 277550 | 0.0001 | - |
| 20.7303 | 277600 | 0.0002 | - |
| 20.7341 | 277650 | 0.0003 | - |
| 20.7378 | 277700 | 0.0 | - |
| 20.7415 | 277750 | 0.0 | - |
| 20.7453 | 277800 | 0.0003 | - |
| 20.7490 | 277850 | 0.0 | - |
| 20.7527 | 277900 | 0.0002 | - |
| 20.7565 | 277950 | 0.0 | - |
| 20.7602 | 278000 | 0.0002 | - |
| 20.7639 | 278050 | 0.0001 | - |
| 20.7677 | 278100 | 0.0002 | - |
| 20.7714 | 278150 | 0.0003 | - |
| 20.7751 | 278200 | 0.0 | - |
| 20.7789 | 278250 | 0.0 | - |
| 20.7826 | 278300 | 0.0002 | - |
| 20.7863 | 278350 | 0.0002 | - |
| 20.7901 | 278400 | 0.0002 | - |
| 20.7938 | 278450 | 0.0002 | - |
| 20.7976 | 278500 | 0.0 | - |
| 20.8013 | 278550 | 0.0 | - |
| 20.8050 | 278600 | 0.0002 | - |
| 20.8088 | 278650 | 0.0004 | - |
| 20.8125 | 278700 | 0.0001 | - |
| 20.8162 | 278750 | 0.0002 | - |
| 20.8200 | 278800 | 0.0 | - |
| 20.8237 | 278850 | 0.0 | - |
| 20.8274 | 278900 | 0.0002 | - |
| 20.8312 | 278950 | 0.0 | - |
| 20.8349 | 279000 | 0.0 | - |
| 20.8386 | 279050 | 0.0 | - |
| 20.8424 | 279100 | 0.0 | - |
| 20.8461 | 279150 | 0.0002 | - |
| 20.8498 | 279200 | 0.0003 | - |
| 20.8536 | 279250 | 0.0 | - |
| 20.8573 | 279300 | 0.0005 | - |
| 20.8610 | 279350 | 0.0 | - |
| 20.8648 | 279400 | 0.0002 | - |
| 20.8685 | 279450 | 0.0002 | - |
| 20.8722 | 279500 | 0.0002 | - |
| 20.8760 | 279550 | 0.0001 | - |
| 20.8797 | 279600 | 0.0002 | - |
| 20.8834 | 279650 | 0.0 | - |
| 20.8872 | 279700 | 0.0001 | - |
| 20.8909 | 279750 | 0.0001 | - |
| 20.8946 | 279800 | 0.0001 | - |
| 20.8984 | 279850 | 0.0001 | - |
| 20.9021 | 279900 | 0.0 | - |
| 20.9058 | 279950 | 0.0 | - |
| 20.9096 | 280000 | 0.0 | - |
| 20.9133 | 280050 | 0.0 | - |
| 20.9170 | 280100 | 0.0 | - |
| 20.9208 | 280150 | 0.0001 | - |
| 20.9245 | 280200 | 0.0002 | - |
| 20.9282 | 280250 | 0.0 | - |
| 20.9320 | 280300 | 0.0002 | - |
| 20.9357 | 280350 | 0.0 | - |
| 20.9394 | 280400 | 0.0001 | - |
| 20.9432 | 280450 | 0.0002 | - |
| 20.9469 | 280500 | 0.0 | - |
| 20.9506 | 280550 | 0.0003 | - |
| 20.9544 | 280600 | 0.0 | - |
| 20.9581 | 280650 | 0.0 | - |
| 20.9618 | 280700 | 0.0 | - |
| 20.9656 | 280750 | 0.0 | - |
| 20.9693 | 280800 | 0.0 | - |
| 20.9730 | 280850 | 0.0004 | - |
| 20.9768 | 280900 | 0.0002 | - |
| 20.9805 | 280950 | 0.0 | - |
| 20.9842 | 281000 | 0.0 | - |
| 20.9880 | 281050 | 0.0 | - |
| 20.9917 | 281100 | 0.0 | - |
| 20.9954 | 281150 | 0.0002 | - |
| 20.9992 | 281200 | 0.0 | - |
| 21.0029 | 281250 | 0.0002 | - |
| 21.0066 | 281300 | 0.0 | - |
| 21.0104 | 281350 | 0.0 | - |
| 21.0141 | 281400 | 0.0 | - |
| 21.0178 | 281450 | 0.0 | - |
| 21.0216 | 281500 | 0.0002 | - |
| 21.0253 | 281550 | 0.0002 | - |
| 21.0290 | 281600 | 0.0 | - |
| 21.0328 | 281650 | 0.0 | - |
| 21.0365 | 281700 | 0.0 | - |
| 21.0403 | 281750 | 0.0002 | - |
| 21.0440 | 281800 | 0.0 | - |
| 21.0477 | 281850 | 0.0 | - |
| 21.0515 | 281900 | 0.0 | - |
| 21.0552 | 281950 | 0.0002 | - |
| 21.0589 | 282000 | 0.0 | - |
| 21.0627 | 282050 | 0.0 | - |
| 21.0664 | 282100 | 0.0 | - |
| 21.0701 | 282150 | 0.0 | - |
| 21.0739 | 282200 | 0.0 | - |
| 21.0776 | 282250 | 0.0002 | - |
| 21.0813 | 282300 | 0.0 | - |
| 21.0851 | 282350 | 0.0002 | - |
| 21.0888 | 282400 | 0.0 | - |
| 21.0925 | 282450 | 0.0002 | - |
| 21.0963 | 282500 | 0.0002 | - |
| 21.1000 | 282550 | 0.0 | - |
| 21.1037 | 282600 | 0.0002 | - |
| 21.1075 | 282650 | 0.0001 | - |
| 21.1112 | 282700 | 0.0 | - |
| 21.1149 | 282750 | 0.0002 | - |
| 21.1187 | 282800 | 0.0 | - |
| 21.1224 | 282850 | 0.0 | - |
| 21.1261 | 282900 | 0.0 | - |
| 21.1299 | 282950 | 0.0002 | - |
| 21.1336 | 283000 | 0.0 | - |
| 21.1373 | 283050 | 0.0002 | - |
| 21.1411 | 283100 | 0.0 | - |
| 21.1448 | 283150 | 0.0 | - |
| 21.1485 | 283200 | 0.0 | - |
| 21.1523 | 283250 | 0.0002 | - |
| 21.1560 | 283300 | 0.0002 | - |
| 21.1597 | 283350 | 0.0002 | - |
| 21.1635 | 283400 | 0.0002 | - |
| 21.1672 | 283450 | 0.0 | - |
| 21.1709 | 283500 | 0.0 | - |
| 21.1747 | 283550 | 0.0 | - |
| 21.1784 | 283600 | 0.0002 | - |
| 21.1821 | 283650 | 0.0003 | - |
| 21.1859 | 283700 | 0.0002 | - |
| 21.1896 | 283750 | 0.0 | - |
| 21.1933 | 283800 | 0.0 | - |
| 21.1971 | 283850 | 0.0002 | - |
| 21.2008 | 283900 | 0.0002 | - |
| 21.2045 | 283950 | 0.0001 | - |
| 21.2083 | 284000 | 0.0003 | - |
| 21.2120 | 284050 | 0.0001 | - |
| 21.2157 | 284100 | 0.0 | - |
| 21.2195 | 284150 | 0.0 | - |
| 21.2232 | 284200 | 0.0003 | - |
| 21.2269 | 284250 | 0.0 | - |
| 21.2307 | 284300 | 0.0 | - |
| 21.2344 | 284350 | 0.0002 | - |
| 21.2381 | 284400 | 0.0002 | - |
| 21.2419 | 284450 | 0.0 | - |
| 21.2456 | 284500 | 0.0 | - |
| 21.2493 | 284550 | 0.0002 | - |
| 21.2531 | 284600 | 0.0 | - |
| 21.2568 | 284650 | 0.0 | - |
| 21.2605 | 284700 | 0.0 | - |
| 21.2643 | 284750 | 0.0 | - |
| 21.2680 | 284800 | 0.0001 | - |
| 21.2717 | 284850 | 0.0 | - |
| 21.2755 | 284900 | 0.0005 | - |
| 21.2792 | 284950 | 0.0001 | - |
| 21.2830 | 285000 | 0.0001 | - |
| 21.2867 | 285050 | 0.0003 | - |
| 21.2904 | 285100 | 0.0002 | - |
| 21.2942 | 285150 | 0.0 | - |
| 21.2979 | 285200 | 0.0002 | - |
| 21.3016 | 285250 | 0.0002 | - |
| 21.3054 | 285300 | 0.0 | - |
| 21.3091 | 285350 | 0.0 | - |
| 21.3128 | 285400 | 0.0005 | - |
| 21.3166 | 285450 | 0.0001 | - |
| 21.3203 | 285500 | 0.0 | - |
| 21.3240 | 285550 | 0.0 | - |
| 21.3278 | 285600 | 0.0003 | - |
| 21.3315 | 285650 | 0.0 | - |
| 21.3352 | 285700 | 0.0001 | - |
| 21.3390 | 285750 | 0.0 | - |
| 21.3427 | 285800 | 0.0002 | - |
| 21.3464 | 285850 | 0.0 | - |
| 21.3502 | 285900 | 0.0001 | - |
| 21.3539 | 285950 | 0.0 | - |
| 21.3576 | 286000 | 0.0 | - |
| 21.3614 | 286050 | 0.0 | - |
| 21.3651 | 286100 | 0.0 | - |
| 21.3688 | 286150 | 0.0002 | - |
| 21.3726 | 286200 | 0.0 | - |
| 21.3763 | 286250 | 0.0 | - |
| 21.3800 | 286300 | 0.0 | - |
| 21.3838 | 286350 | 0.0001 | - |
| 21.3875 | 286400 | 0.0002 | - |
| 21.3912 | 286450 | 0.0 | - |
| 21.3950 | 286500 | 0.0 | - |
| 21.3987 | 286550 | 0.0002 | - |
| 21.4024 | 286600 | 0.0002 | - |
| 21.4062 | 286650 | 0.0002 | - |
| 21.4099 | 286700 | 0.0 | - |
| 21.4136 | 286750 | 0.0002 | - |
| 21.4174 | 286800 | 0.0 | - |
| 21.4211 | 286850 | 0.0002 | - |
| 21.4248 | 286900 | 0.0 | - |
| 21.4286 | 286950 | 0.0 | - |
| 21.4323 | 287000 | 0.0003 | - |
| 21.4360 | 287050 | 0.0 | - |
| 21.4398 | 287100 | 0.0003 | - |
| 21.4435 | 287150 | 0.0002 | - |
| 21.4472 | 287200 | 0.0 | - |
| 21.4510 | 287250 | 0.0002 | - |
| 21.4547 | 287300 | 0.0001 | - |
| 21.4584 | 287350 | 0.0002 | - |
| 21.4622 | 287400 | 0.0 | - |
| 21.4659 | 287450 | 0.0 | - |
| 21.4696 | 287500 | 0.0 | - |
| 21.4734 | 287550 | 0.0 | - |
| 21.4771 | 287600 | 0.0001 | - |
| 21.4808 | 287650 | 0.0 | - |
| 21.4846 | 287700 | 0.0 | - |
| 21.4883 | 287750 | 0.0 | - |
| 21.4920 | 287800 | 0.0 | - |
| 21.4958 | 287850 | 0.0002 | - |
| 21.4995 | 287900 | 0.0 | - |
| 21.5032 | 287950 | 0.0 | - |
| 21.5070 | 288000 | 0.0 | - |
| 21.5107 | 288050 | 0.0003 | - |
| 21.5145 | 288100 | 0.0 | - |
| 21.5182 | 288150 | 0.0001 | - |
| 21.5219 | 288200 | 0.0002 | - |
| 21.5257 | 288250 | 0.0 | - |
| 21.5294 | 288300 | 0.0 | - |
| 21.5331 | 288350 | 0.0 | - |
| 21.5369 | 288400 | 0.0001 | - |
| 21.5406 | 288450 | 0.0002 | - |
| 21.5443 | 288500 | 0.0002 | - |
| 21.5481 | 288550 | 0.0 | - |
| 21.5518 | 288600 | 0.0 | - |
| 21.5555 | 288650 | 0.0002 | - |
| 21.5593 | 288700 | 0.0002 | - |
| 21.5630 | 288750 | 0.0 | - |
| 21.5667 | 288800 | 0.0005 | - |
| 21.5705 | 288850 | 0.0 | - |
| 21.5742 | 288900 | 0.0002 | - |
| 21.5779 | 288950 | 0.0 | - |
| 21.5817 | 289000 | 0.0 | - |
| 21.5854 | 289050 | 0.0002 | - |
| 21.5891 | 289100 | 0.0 | - |
| 21.5929 | 289150 | 0.0002 | - |
| 21.5966 | 289200 | 0.0001 | - |
| 21.6003 | 289250 | 0.0 | - |
| 21.6041 | 289300 | 0.0 | - |
| 21.6078 | 289350 | 0.0 | - |
| 21.6115 | 289400 | 0.0001 | - |
| 21.6153 | 289450 | 0.0002 | - |
| 21.6190 | 289500 | 0.0002 | - |
| 21.6227 | 289550 | 0.0002 | - |
| 21.6265 | 289600 | 0.0 | - |
| 21.6302 | 289650 | 0.0 | - |
| 21.6339 | 289700 | 0.0 | - |
| 21.6377 | 289750 | 0.0002 | - |
| 21.6414 | 289800 | 0.0 | - |
| 21.6451 | 289850 | 0.0 | - |
| 21.6489 | 289900 | 0.0 | - |
| 21.6526 | 289950 | 0.0 | - |
| 21.6563 | 290000 | 0.0 | - |
| 21.6601 | 290050 | 0.0002 | - |
| 21.6638 | 290100 | 0.0004 | - |
| 21.6675 | 290150 | 0.0 | - |
| 21.6713 | 290200 | 0.0001 | - |
| 21.6750 | 290250 | 0.0 | - |
| 21.6787 | 290300 | 0.0005 | - |
| 21.6825 | 290350 | 0.0002 | - |
| 21.6862 | 290400 | 0.0002 | - |
| 21.6899 | 290450 | 0.0 | - |
| 21.6937 | 290500 | 0.0 | - |
| 21.6974 | 290550 | 0.0 | - |
| 21.7011 | 290600 | 0.0002 | - |
| 21.7049 | 290650 | 0.0001 | - |
| 21.7086 | 290700 | 0.0 | - |
| 21.7123 | 290750 | 0.0 | - |
| 21.7161 | 290800 | 0.0003 | - |
| 21.7198 | 290850 | 0.0 | - |
| 21.7235 | 290900 | 0.0 | - |
| 21.7273 | 290950 | 0.0002 | - |
| 21.7310 | 291000 | 0.0 | - |
| 21.7347 | 291050 | 0.0 | - |
| 21.7385 | 291100 | 0.0 | - |
| 21.7422 | 291150 | 0.0 | - |
| 21.7459 | 291200 | 0.0 | - |
| 21.7497 | 291250 | 0.0001 | - |
| 21.7534 | 291300 | 0.0 | - |
| 21.7572 | 291350 | 0.0 | - |
| 21.7609 | 291400 | 0.0 | - |
| 21.7646 | 291450 | 0.0 | - |
| 21.7684 | 291500 | 0.0002 | - |
| 21.7721 | 291550 | 0.0002 | - |
| 21.7758 | 291600 | 0.0 | - |
| 21.7796 | 291650 | 0.0002 | - |
| 21.7833 | 291700 | 0.0 | - |
| 21.7870 | 291750 | 0.0 | - |
| 21.7908 | 291800 | 0.0 | - |
| 21.7945 | 291850 | 0.0 | - |
| 21.7982 | 291900 | 0.0002 | - |
| 21.8020 | 291950 | 0.0 | - |
| 21.8057 | 292000 | 0.0002 | - |
| 21.8094 | 292050 | 0.0 | - |
| 21.8132 | 292100 | 0.0002 | - |
| 21.8169 | 292150 | 0.0 | - |
| 21.8206 | 292200 | 0.0 | - |
| 21.8244 | 292250 | 0.0001 | - |
| 21.8281 | 292300 | 0.0 | - |
| 21.8318 | 292350 | 0.0004 | - |
| 21.8356 | 292400 | 0.0002 | - |
| 21.8393 | 292450 | 0.0 | - |
| 21.8430 | 292500 | 0.0002 | - |
| 21.8468 | 292550 | 0.0002 | - |
| 21.8505 | 292600 | 0.0 | - |
| 21.8542 | 292650 | 0.0 | - |
| 21.8580 | 292700 | 0.0002 | - |
| 21.8617 | 292750 | 0.0 | - |
| 21.8654 | 292800 | 0.0 | - |
| 21.8692 | 292850 | 0.0 | - |
| 21.8729 | 292900 | 0.0002 | - |
| 21.8766 | 292950 | 0.0 | - |
| 21.8804 | 293000 | 0.0 | - |
| 21.8841 | 293050 | 0.0 | - |
| 21.8878 | 293100 | 0.0001 | - |
| 21.8916 | 293150 | 0.0 | - |
| 21.8953 | 293200 | 0.0002 | - |
| 21.8990 | 293250 | 0.0 | - |
| 21.9028 | 293300 | 0.0 | - |
| 21.9065 | 293350 | 0.0001 | - |
| 21.9102 | 293400 | 0.0002 | - |
| 21.9140 | 293450 | 0.0002 | - |
| 21.9177 | 293500 | 0.0001 | - |
| 21.9214 | 293550 | 0.0002 | - |
| 21.9252 | 293600 | 0.0 | - |
| 21.9289 | 293650 | 0.0001 | - |
| 21.9326 | 293700 | 0.0002 | - |
| 21.9364 | 293750 | 0.0 | - |
| 21.9401 | 293800 | 0.0 | - |
| 21.9438 | 293850 | 0.0001 | - |
| 21.9476 | 293900 | 0.0 | - |
| 21.9513 | 293950 | 0.0 | - |
| 21.9550 | 294000 | 0.0 | - |
| 21.9588 | 294050 | 0.0 | - |
| 21.9625 | 294100 | 0.0 | - |
| 21.9662 | 294150 | 0.0 | - |
| 21.9700 | 294200 | 0.0 | - |
| 21.9737 | 294250 | 0.0001 | - |
| 21.9774 | 294300 | 0.0002 | - |
| 21.9812 | 294350 | 0.0001 | - |
| 21.9849 | 294400 | 0.0 | - |
| 21.9886 | 294450 | 0.0002 | - |
| 21.9924 | 294500 | 0.0 | - |
| 21.9961 | 294550 | 0.0 | - |
| 21.9999 | 294600 | 0.0 | - |
| 22.0036 | 294650 | 0.0 | - |
| 22.0073 | 294700 | 0.0 | - |
| 22.0111 | 294750 | 0.0 | - |
| 22.0148 | 294800 | 0.0 | - |
| 22.0185 | 294850 | 0.0 | - |
| 22.0223 | 294900 | 0.0 | - |
| 22.0260 | 294950 | 0.0003 | - |
| 22.0297 | 295000 | 0.0 | - |
| 22.0335 | 295050 | 0.0 | - |
| 22.0372 | 295100 | 0.0 | - |
| 22.0409 | 295150 | 0.0002 | - |
| 22.0447 | 295200 | 0.0001 | - |
| 22.0484 | 295250 | 0.0003 | - |
| 22.0521 | 295300 | 0.0 | - |
| 22.0559 | 295350 | 0.0001 | - |
| 22.0596 | 295400 | 0.0 | - |
| 22.0633 | 295450 | 0.0001 | - |
| 22.0671 | 295500 | 0.0 | - |
| 22.0708 | 295550 | 0.0 | - |
| 22.0745 | 295600 | 0.0002 | - |
| 22.0783 | 295650 | 0.0 | - |
| 22.0820 | 295700 | 0.0 | - |
| 22.0857 | 295750 | 0.0001 | - |
| 22.0895 | 295800 | 0.0 | - |
| 22.0932 | 295850 | 0.0 | - |
| 22.0969 | 295900 | 0.0002 | - |
| 22.1007 | 295950 | 0.0 | - |
| 22.1044 | 296000 | 0.0002 | - |
| 22.1081 | 296050 | 0.0 | - |
| 22.1119 | 296100 | 0.0 | - |
| 22.1156 | 296150 | 0.0002 | - |
| 22.1193 | 296200 | 0.0002 | - |
| 22.1231 | 296250 | 0.0002 | - |
| 22.1268 | 296300 | 0.0 | - |
| 22.1305 | 296350 | 0.0 | - |
| 22.1343 | 296400 | 0.0 | - |
| 22.1380 | 296450 | 0.0 | - |
| 22.1417 | 296500 | 0.0001 | - |
| 22.1455 | 296550 | 0.0 | - |
| 22.1492 | 296600 | 0.0 | - |
| 22.1529 | 296650 | 0.0002 | - |
| 22.1567 | 296700 | 0.0002 | - |
| 22.1604 | 296750 | 0.0 | - |
| 22.1641 | 296800 | 0.0 | - |
| 22.1679 | 296850 | 0.0002 | - |
| 22.1716 | 296900 | 0.0002 | - |
| 22.1753 | 296950 | 0.0001 | - |
| 22.1791 | 297000 | 0.0 | - |
| 22.1828 | 297050 | 0.0 | - |
| 22.1865 | 297100 | 0.0002 | - |
| 22.1903 | 297150 | 0.0 | - |
| 22.1940 | 297200 | 0.0 | - |
| 22.1977 | 297250 | 0.0 | - |
| 22.2015 | 297300 | 0.0 | - |
| 22.2052 | 297350 | 0.0002 | - |
| 22.2089 | 297400 | 0.0002 | - |
| 22.2127 | 297450 | 0.0 | - |
| 22.2164 | 297500 | 0.0002 | - |
| 22.2201 | 297550 | 0.0 | - |
| 22.2239 | 297600 | 0.0 | - |
| 22.2276 | 297650 | 0.0 | - |
| 22.2313 | 297700 | 0.0 | - |
| 22.2351 | 297750 | 0.0001 | - |
| 22.2388 | 297800 | 0.0 | - |
| 22.2426 | 297850 | 0.0 | - |
| 22.2463 | 297900 | 0.0 | - |
| 22.2500 | 297950 | 0.0 | - |
| 22.2538 | 298000 | 0.0 | - |
| 22.2575 | 298050 | 0.0 | - |
| 22.2612 | 298100 | 0.0 | - |
| 22.2650 | 298150 | 0.0002 | - |
| 22.2687 | 298200 | 0.0 | - |
| 22.2724 | 298250 | 0.0 | - |
| 22.2762 | 298300 | 0.0 | - |
| 22.2799 | 298350 | 0.0002 | - |
| 22.2836 | 298400 | 0.0 | - |
| 22.2874 | 298450 | 0.0 | - |
| 22.2911 | 298500 | 0.0002 | - |
| 22.2948 | 298550 | 0.0 | - |
| 22.2986 | 298600 | 0.0 | - |
| 22.3023 | 298650 | 0.0002 | - |
| 22.3060 | 298700 | 0.0 | - |
| 22.3098 | 298750 | 0.0 | - |
| 22.3135 | 298800 | 0.0 | - |
| 22.3172 | 298850 | 0.0001 | - |
| 22.3210 | 298900 | 0.0 | - |
| 22.3247 | 298950 | 0.0002 | - |
| 22.3284 | 299000 | 0.0002 | - |
| 22.3322 | 299050 | 0.0 | - |
| 22.3359 | 299100 | 0.0 | - |
| 22.3396 | 299150 | 0.0 | - |
| 22.3434 | 299200 | 0.0001 | - |
| 22.3471 | 299250 | 0.0003 | - |
| 22.3508 | 299300 | 0.0 | - |
| 22.3546 | 299350 | 0.0 | - |
| 22.3583 | 299400 | 0.0 | - |
| 22.3620 | 299450 | 0.0002 | - |
| 22.3658 | 299500 | 0.0001 | - |
| 22.3695 | 299550 | 0.0002 | - |
| 22.3732 | 299600 | 0.0 | - |
| 22.3770 | 299650 | 0.0003 | - |
| 22.3807 | 299700 | 0.0003 | - |
| 22.3844 | 299750 | 0.0 | - |
| 22.3882 | 299800 | 0.0 | - |
| 22.3919 | 299850 | 0.0002 | - |
| 22.3956 | 299900 | 0.0002 | - |
| 22.3994 | 299950 | 0.0003 | - |
| 22.4031 | 300000 | 0.0 | - |
| 22.4068 | 300050 | 0.0 | - |
| 22.4106 | 300100 | 0.0 | - |
| 22.4143 | 300150 | 0.0005 | - |
| 22.4180 | 300200 | 0.0 | - |
| 22.4218 | 300250 | 0.0002 | - |
| 22.4255 | 300300 | 0.0 | - |
| 22.4292 | 300350 | 0.0003 | - |
| 22.4330 | 300400 | 0.0 | - |
| 22.4367 | 300450 | 0.0 | - |
| 22.4404 | 300500 | 0.0 | - |
| 22.4442 | 300550 | 0.0002 | - |
| 22.4479 | 300600 | 0.0 | - |
| 22.4516 | 300650 | 0.0002 | - |
| 22.4554 | 300700 | 0.0 | - |
| 22.4591 | 300750 | 0.0 | - |
| 22.4628 | 300800 | 0.0002 | - |
| 22.4666 | 300850 | 0.0003 | - |
| 22.4703 | 300900 | 0.0 | - |
| 22.4740 | 300950 | 0.0 | - |
| 22.4778 | 301000 | 0.0 | - |
| 22.4815 | 301050 | 0.0005 | - |
| 22.4853 | 301100 | 0.0004 | - |
| 22.4890 | 301150 | 0.0 | - |
| 22.4927 | 301200 | 0.0 | - |
| 22.4965 | 301250 | 0.0 | - |
| 22.5002 | 301300 | 0.0002 | - |
| 22.5039 | 301350 | 0.0002 | - |
| 22.5077 | 301400 | 0.0 | - |
| 22.5114 | 301450 | 0.0001 | - |
| 22.5151 | 301500 | 0.0 | - |
| 22.5189 | 301550 | 0.0 | - |
| 22.5226 | 301600 | 0.0001 | - |
| 22.5263 | 301650 | 0.0 | - |
| 22.5301 | 301700 | 0.0 | - |
| 22.5338 | 301750 | 0.0 | - |
| 22.5375 | 301800 | 0.0001 | - |
| 22.5413 | 301850 | 0.0 | - |
| 22.5450 | 301900 | 0.0 | - |
| 22.5487 | 301950 | 0.0 | - |
| 22.5525 | 302000 | 0.0 | - |
| 22.5562 | 302050 | 0.0 | - |
| 22.5599 | 302100 | 0.0001 | - |
| 22.5637 | 302150 | 0.0 | - |
| 22.5674 | 302200 | 0.0 | - |
| 22.5711 | 302250 | 0.0002 | - |
| 22.5749 | 302300 | 0.0001 | - |
| 22.5786 | 302350 | 0.0 | - |
| 22.5823 | 302400 | 0.0002 | - |
| 22.5861 | 302450 | 0.0002 | - |
| 22.5898 | 302500 | 0.0 | - |
| 22.5935 | 302550 | 0.0002 | - |
| 22.5973 | 302600 | 0.0003 | - |
| 22.6010 | 302650 | 0.0002 | - |
| 22.6047 | 302700 | 0.0004 | - |
| 22.6085 | 302750 | 0.0002 | - |
| 22.6122 | 302800 | 0.0 | - |
| 22.6159 | 302850 | 0.0002 | - |
| 22.6197 | 302900 | 0.0003 | - |
| 22.6234 | 302950 | 0.0 | - |
| 22.6271 | 303000 | 0.0001 | - |
| 22.6309 | 303050 | 0.0 | - |
| 22.6346 | 303100 | 0.0 | - |
| 22.6383 | 303150 | 0.0002 | - |
| 22.6421 | 303200 | 0.0001 | - |
| 22.6458 | 303250 | 0.0 | - |
| 22.6495 | 303300 | 0.0 | - |
| 22.6533 | 303350 | 0.0 | - |
| 22.6570 | 303400 | 0.0003 | - |
| 22.6607 | 303450 | 0.0 | - |
| 22.6645 | 303500 | 0.0 | - |
| 22.6682 | 303550 | 0.0 | - |
| 22.6719 | 303600 | 0.0 | - |
| 22.6757 | 303650 | 0.0 | - |
| 22.6794 | 303700 | 0.0 | - |
| 22.6831 | 303750 | 0.0 | - |
| 22.6869 | 303800 | 0.0002 | - |
| 22.6906 | 303850 | 0.0 | - |
| 22.6943 | 303900 | 0.0 | - |
| 22.6981 | 303950 | 0.0003 | - |
| 22.7018 | 304000 | 0.0 | - |
| 22.7055 | 304050 | 0.0 | - |
| 22.7093 | 304100 | 0.0 | - |
| 22.7130 | 304150 | 0.0002 | - |
| 22.7168 | 304200 | 0.0 | - |
| 22.7205 | 304250 | 0.0 | - |
| 22.7242 | 304300 | 0.0 | - |
| 22.7280 | 304350 | 0.0 | - |
| 22.7317 | 304400 | 0.0 | - |
| 22.7354 | 304450 | 0.0 | - |
| 22.7392 | 304500 | 0.0003 | - |
| 22.7429 | 304550 | 0.0 | - |
| 22.7466 | 304600 | 0.0002 | - |
| 22.7504 | 304650 | 0.0002 | - |
| 22.7541 | 304700 | 0.0 | - |
| 22.7578 | 304750 | 0.0 | - |
| 22.7616 | 304800 | 0.0002 | - |
| 22.7653 | 304850 | 0.0003 | - |
| 22.7690 | 304900 | 0.0002 | - |
| 22.7728 | 304950 | 0.0 | - |
| 22.7765 | 305000 | 0.0002 | - |
| 22.7802 | 305050 | 0.0 | - |
| 22.7840 | 305100 | 0.0 | - |
| 22.7877 | 305150 | 0.0 | - |
| 22.7914 | 305200 | 0.0002 | - |
| 22.7952 | 305250 | 0.0 | - |
| 22.7989 | 305300 | 0.0 | - |
| 22.8026 | 305350 | 0.0002 | - |
| 22.8064 | 305400 | 0.0005 | - |
| 22.8101 | 305450 | 0.0 | - |
| 22.8138 | 305500 | 0.0002 | - |
| 22.8176 | 305550 | 0.0 | - |
| 22.8213 | 305600 | 0.0 | - |
| 22.8250 | 305650 | 0.0002 | - |
| 22.8288 | 305700 | 0.0 | - |
| 22.8325 | 305750 | 0.0002 | - |
| 22.8362 | 305800 | 0.0 | - |
| 22.8400 | 305850 | 0.0002 | - |
| 22.8437 | 305900 | 0.0 | - |
| 22.8474 | 305950 | 0.0002 | - |
| 22.8512 | 306000 | 0.0001 | - |
| 22.8549 | 306050 | 0.0 | - |
| 22.8586 | 306100 | 0.0002 | - |
| 22.8624 | 306150 | 0.0002 | - |
| 22.8661 | 306200 | 0.0 | - |
| 22.8698 | 306250 | 0.0002 | - |
| 22.8736 | 306300 | 0.0 | - |
| 22.8773 | 306350 | 0.0002 | - |
| 22.8810 | 306400 | 0.0 | - |
| 22.8848 | 306450 | 0.0002 | - |
| 22.8885 | 306500 | 0.0 | - |
| 22.8922 | 306550 | 0.0 | - |
| 22.8960 | 306600 | 0.0 | - |
| 22.8997 | 306650 | 0.0002 | - |
| 22.9034 | 306700 | 0.0 | - |
| 22.9072 | 306750 | 0.0 | - |
| 22.9109 | 306800 | 0.0 | - |
| 22.9146 | 306850 | 0.0 | - |
| 22.9184 | 306900 | 0.0 | - |
| 22.9221 | 306950 | 0.0003 | - |
| 22.9258 | 307000 | 0.0002 | - |
| 22.9296 | 307050 | 0.0002 | - |
| 22.9333 | 307100 | 0.0 | - |
| 22.9370 | 307150 | 0.0001 | - |
| 22.9408 | 307200 | 0.0 | - |
| 22.9445 | 307250 | 0.0 | - |
| 22.9482 | 307300 | 0.0 | - |
| 22.9520 | 307350 | 0.0002 | - |
| 22.9557 | 307400 | 0.0002 | - |
| 22.9595 | 307450 | 0.0 | - |
| 22.9632 | 307500 | 0.0 | - |
| 22.9669 | 307550 | 0.0002 | - |
| 22.9707 | 307600 | 0.0 | - |
| 22.9744 | 307650 | 0.0 | - |
| 22.9781 | 307700 | 0.0002 | - |
| 22.9819 | 307750 | 0.0 | - |
| 22.9856 | 307800 | 0.0 | - |
| 22.9893 | 307850 | 0.0002 | - |
| 22.9931 | 307900 | 0.0 | - |
| 22.9968 | 307950 | 0.0 | - |
| 23.0005 | 308000 | 0.0002 | - |
| 23.0043 | 308050 | 0.0 | - |
| 23.0080 | 308100 | 0.0 | - |
| 23.0117 | 308150 | 0.0 | - |
| 23.0155 | 308200 | 0.0 | - |
| 23.0192 | 308250 | 0.0001 | - |
| 23.0229 | 308300 | 0.0 | - |
| 23.0267 | 308350 | 0.0 | - |
| 23.0304 | 308400 | 0.0 | - |
| 23.0341 | 308450 | 0.0002 | - |
| 23.0379 | 308500 | 0.0002 | - |
| 23.0416 | 308550 | 0.0 | - |
| 23.0453 | 308600 | 0.0002 | - |
| 23.0491 | 308650 | 0.0 | - |
| 23.0528 | 308700 | 0.0 | - |
| 23.0565 | 308750 | 0.0 | - |
| 23.0603 | 308800 | 0.0 | - |
| 23.0640 | 308850 | 0.0 | - |
| 23.0677 | 308900 | 0.0002 | - |
| 23.0715 | 308950 | 0.0 | - |
| 23.0752 | 309000 | 0.0 | - |
| 23.0789 | 309050 | 0.0002 | - |
| 23.0827 | 309100 | 0.0001 | - |
| 23.0864 | 309150 | 0.0001 | - |
| 23.0901 | 309200 | 0.0 | - |
| 23.0939 | 309250 | 0.0002 | - |
| 23.0976 | 309300 | 0.0 | - |
| 23.1013 | 309350 | 0.0 | - |
| 23.1051 | 309400 | 0.0 | - |
| 23.1088 | 309450 | 0.0 | - |
| 23.1125 | 309500 | 0.0002 | - |
| 23.1163 | 309550 | 0.0 | - |
| 23.1200 | 309600 | 0.0 | - |
| 23.1237 | 309650 | 0.0 | - |
| 23.1275 | 309700 | 0.0 | - |
| 23.1312 | 309750 | 0.0003 | - |
| 23.1349 | 309800 | 0.0 | - |
| 23.1387 | 309850 | 0.0 | - |
| 23.1424 | 309900 | 0.0002 | - |
| 23.1461 | 309950 | 0.0002 | - |
| 23.1499 | 310000 | 0.0 | - |
| 23.1536 | 310050 | 0.0 | - |
| 23.1573 | 310100 | 0.0 | - |
| 23.1611 | 310150 | 0.0 | - |
| 23.1648 | 310200 | 0.0003 | - |
| 23.1685 | 310250 | 0.0 | - |
| 23.1723 | 310300 | 0.0 | - |
| 23.1760 | 310350 | 0.0 | - |
| 23.1797 | 310400 | 0.0 | - |
| 23.1835 | 310450 | 0.0 | - |
| 23.1872 | 310500 | 0.0001 | - |
| 23.1909 | 310550 | 0.0002 | - |
| 23.1947 | 310600 | 0.0 | - |
| 23.1984 | 310650 | 0.0 | - |
| 23.2022 | 310700 | 0.0002 | - |
| 23.2059 | 310750 | 0.0002 | - |
| 23.2096 | 310800 | 0.0002 | - |
| 23.2134 | 310850 | 0.0002 | - |
| 23.2171 | 310900 | 0.0 | - |
| 23.2208 | 310950 | 0.0 | - |
| 23.2246 | 311000 | 0.0002 | - |
| 23.2283 | 311050 | 0.0 | - |
| 23.2320 | 311100 | 0.0001 | - |
| 23.2358 | 311150 | 0.0 | - |
| 23.2395 | 311200 | 0.0002 | - |
| 23.2432 | 311250 | 0.0 | - |
| 23.2470 | 311300 | 0.0 | - |
| 23.2507 | 311350 | 0.0004 | - |
| 23.2544 | 311400 | 0.0004 | - |
| 23.2582 | 311450 | 0.0 | - |
| 23.2619 | 311500 | 0.0002 | - |
| 23.2656 | 311550 | 0.0002 | - |
| 23.2694 | 311600 | 0.0002 | - |
| 23.2731 | 311650 | 0.0 | - |
| 23.2768 | 311700 | 0.0 | - |
| 23.2806 | 311750 | 0.0 | - |
| 23.2843 | 311800 | 0.0 | - |
| 23.2880 | 311850 | 0.0002 | - |
| 23.2918 | 311900 | 0.0 | - |
| 23.2955 | 311950 | 0.0 | - |
| 23.2992 | 312000 | 0.0 | - |
| 23.3030 | 312050 | 0.0001 | - |
| 23.3067 | 312100 | 0.0 | - |
| 23.3104 | 312150 | 0.0 | - |
| 23.3142 | 312200 | 0.0 | - |
| 23.3179 | 312250 | 0.0 | - |
| 23.3216 | 312300 | 0.0 | - |
| 23.3254 | 312350 | 0.0 | - |
| 23.3291 | 312400 | 0.0001 | - |
| 23.3328 | 312450 | 0.0002 | - |
| 23.3366 | 312500 | 0.0 | - |
| 23.3403 | 312550 | 0.0 | - |
| 23.3440 | 312600 | 0.0 | - |
| 23.3478 | 312650 | 0.0 | - |
| 23.3515 | 312700 | 0.0001 | - |
| 23.3552 | 312750 | 0.0 | - |
| 23.3590 | 312800 | 0.0 | - |
| 23.3627 | 312850 | 0.0002 | - |
| 23.3664 | 312900 | 0.0 | - |
| 23.3702 | 312950 | 0.0002 | - |
| 23.3739 | 313000 | 0.0002 | - |
| 23.3776 | 313050 | 0.0 | - |
| 23.3814 | 313100 | 0.0 | - |
| 23.3851 | 313150 | 0.0 | - |
| 23.3888 | 313200 | 0.0001 | - |
| 23.3926 | 313250 | 0.0 | - |
| 23.3963 | 313300 | 0.0 | - |
| 23.4000 | 313350 | 0.0002 | - |
| 23.4038 | 313400 | 0.0001 | - |
| 23.4075 | 313450 | 0.0005 | - |
| 23.4112 | 313500 | 0.0 | - |
| 23.4150 | 313550 | 0.0003 | - |
| 23.4187 | 313600 | 0.0 | - |
| 23.4224 | 313650 | 0.0 | - |
| 23.4262 | 313700 | 0.0 | - |
| 23.4299 | 313750 | 0.0002 | - |
| 23.4336 | 313800 | 0.0002 | - |
| 23.4374 | 313850 | 0.0002 | - |
| 23.4411 | 313900 | 0.0003 | - |
| 23.4449 | 313950 | 0.0002 | - |
| 23.4486 | 314000 | 0.0002 | - |
| 23.4523 | 314050 | 0.0 | - |
| 23.4561 | 314100 | 0.0 | - |
| 23.4598 | 314150 | 0.0 | - |
| 23.4635 | 314200 | 0.0 | - |
| 23.4673 | 314250 | 0.0 | - |
| 23.4710 | 314300 | 0.0002 | - |
| 23.4747 | 314350 | 0.0 | - |
| 23.4785 | 314400 | 0.0001 | - |
| 23.4822 | 314450 | 0.0 | - |
| 23.4859 | 314500 | 0.0 | - |
| 23.4897 | 314550 | 0.0 | - |
| 23.4934 | 314600 | 0.0002 | - |
| 23.4971 | 314650 | 0.0 | - |
| 23.5009 | 314700 | 0.0 | - |
| 23.5046 | 314750 | 0.0 | - |
| 23.5083 | 314800 | 0.0 | - |
| 23.5121 | 314850 | 0.0002 | - |
| 23.5158 | 314900 | 0.0002 | - |
| 23.5195 | 314950 | 0.0001 | - |
| 23.5233 | 315000 | 0.0 | - |
| 23.5270 | 315050 | 0.0002 | - |
| 23.5307 | 315100 | 0.0 | - |
| 23.5345 | 315150 | 0.0 | - |
| 23.5382 | 315200 | 0.0 | - |
| 23.5419 | 315250 | 0.0001 | - |
| 23.5457 | 315300 | 0.0002 | - |
| 23.5494 | 315350 | 0.0002 | - |
| 23.5531 | 315400 | 0.0 | - |
| 23.5569 | 315450 | 0.0005 | - |
| 23.5606 | 315500 | 0.0005 | - |
| 23.5643 | 315550 | 0.0 | - |
| 23.5681 | 315600 | 0.0003 | - |
| 23.5718 | 315650 | 0.0001 | - |
| 23.5755 | 315700 | 0.0 | - |
| 23.5793 | 315750 | 0.0 | - |
| 23.5830 | 315800 | 0.0 | - |
| 23.5867 | 315850 | 0.0 | - |
| 23.5905 | 315900 | 0.0002 | - |
| 23.5942 | 315950 | 0.0002 | - |
| 23.5979 | 316000 | 0.0 | - |
| 23.6017 | 316050 | 0.0 | - |
| 23.6054 | 316100 | 0.0002 | - |
| 23.6091 | 316150 | 0.0002 | - |
| 23.6129 | 316200 | 0.0002 | - |
| 23.6166 | 316250 | 0.0 | - |
| 23.6203 | 316300 | 0.0 | - |
| 23.6241 | 316350 | 0.0002 | - |
| 23.6278 | 316400 | 0.0 | - |
| 23.6315 | 316450 | 0.0 | - |
| 23.6353 | 316500 | 0.0 | - |
| 23.6390 | 316550 | 0.0003 | - |
| 23.6427 | 316600 | 0.0001 | - |
| 23.6465 | 316650 | 0.0 | - |
| 23.6502 | 316700 | 0.0003 | - |
| 23.6539 | 316750 | 0.0001 | - |
| 23.6577 | 316800 | 0.0002 | - |
| 23.6614 | 316850 | 0.0002 | - |
| 23.6651 | 316900 | 0.0005 | - |
| 23.6689 | 316950 | 0.0002 | - |
| 23.6726 | 317000 | 0.0 | - |
| 23.6763 | 317050 | 0.0002 | - |
| 23.6801 | 317100 | 0.0003 | - |
| 23.6838 | 317150 | 0.0 | - |
| 23.6876 | 317200 | 0.0 | - |
| 23.6913 | 317250 | 0.0002 | - |
| 23.6950 | 317300 | 0.0 | - |
| 23.6988 | 317350 | 0.0002 | - |
| 23.7025 | 317400 | 0.0 | - |
| 23.7062 | 317450 | 0.0002 | - |
| 23.7100 | 317500 | 0.0 | - |
| 23.7137 | 317550 | 0.0 | - |
| 23.7174 | 317600 | 0.0 | - |
| 23.7212 | 317650 | 0.0002 | - |
| 23.7249 | 317700 | 0.0 | - |
| 23.7286 | 317750 | 0.0 | - |
| 23.7324 | 317800 | 0.0002 | - |
| 23.7361 | 317850 | 0.0 | - |
| 23.7398 | 317900 | 0.0 | - |
| 23.7436 | 317950 | 0.0002 | - |
| 23.7473 | 318000 | 0.0002 | - |
| 23.7510 | 318050 | 0.0002 | - |
| 23.7548 | 318100 | 0.0 | - |
| 23.7585 | 318150 | 0.0 | - |
| 23.7622 | 318200 | 0.0 | - |
| 23.7660 | 318250 | 0.0 | - |
| 23.7697 | 318300 | 0.0001 | - |
| 23.7734 | 318350 | 0.0 | - |
| 23.7772 | 318400 | 0.0 | - |
| 23.7809 | 318450 | 0.0 | - |
| 23.7846 | 318500 | 0.0 | - |
| 23.7884 | 318550 | 0.0002 | - |
| 23.7921 | 318600 | 0.0002 | - |
| 23.7958 | 318650 | 0.0 | - |
| 23.7996 | 318700 | 0.0 | - |
| 23.8033 | 318750 | 0.0 | - |
| 23.8070 | 318800 | 0.0 | - |
| 23.8108 | 318850 | 0.0 | - |
| 23.8145 | 318900 | 0.0 | - |
| 23.8182 | 318950 | 0.0 | - |
| 23.8220 | 319000 | 0.0 | - |
| 23.8257 | 319050 | 0.0 | - |
| 23.8294 | 319100 | 0.0003 | - |
| 23.8332 | 319150 | 0.0 | - |
| 23.8369 | 319200 | 0.0 | - |
| 23.8406 | 319250 | 0.0 | - |
| 23.8444 | 319300 | 0.0 | - |
| 23.8481 | 319350 | 0.0 | - |
| 23.8518 | 319400 | 0.0002 | - |
| 23.8556 | 319450 | 0.0 | - |
| 23.8593 | 319500 | 0.0 | - |
| 23.8630 | 319550 | 0.0002 | - |
| 23.8668 | 319600 | 0.0 | - |
| 23.8705 | 319650 | 0.0003 | - |
| 23.8742 | 319700 | 0.0 | - |
| 23.8780 | 319750 | 0.0002 | - |
| 23.8817 | 319800 | 0.0001 | - |
| 23.8854 | 319850 | 0.0 | - |
| 23.8892 | 319900 | 0.0002 | - |
| 23.8929 | 319950 | 0.0 | - |
| 23.8966 | 320000 | 0.0001 | - |
| 23.9004 | 320050 | 0.0 | - |
| 23.9041 | 320100 | 0.0 | - |
| 23.9078 | 320150 | 0.0002 | - |
| 23.9116 | 320200 | 0.0 | - |
| 23.9153 | 320250 | 0.0 | - |
| 23.9191 | 320300 | 0.0 | - |
| 23.9228 | 320350 | 0.0 | - |
| 23.9265 | 320400 | 0.0 | - |
| 23.9303 | 320450 | 0.0002 | - |
| 23.9340 | 320500 | 0.0002 | - |
| 23.9377 | 320550 | 0.0 | - |
| 23.9415 | 320600 | 0.0002 | - |
| 23.9452 | 320650 | 0.0 | - |
| 23.9489 | 320700 | 0.0 | - |
| 23.9527 | 320750 | 0.0 | - |
| 23.9564 | 320800 | 0.0 | - |
| 23.9601 | 320850 | 0.0002 | - |
| 23.9639 | 320900 | 0.0 | - |
| 23.9676 | 320950 | 0.0002 | - |
| 23.9713 | 321000 | 0.0002 | - |
| 23.9751 | 321050 | 0.0002 | - |
| 23.9788 | 321100 | 0.0 | - |
| 23.9825 | 321150 | 0.0 | - |
| 23.9863 | 321200 | 0.0001 | - |
| 23.9900 | 321250 | 0.0002 | - |
| 23.9937 | 321300 | 0.0 | - |
| 23.9975 | 321350 | 0.0002 | - |
| 24.0012 | 321400 | 0.0002 | - |
| 24.0049 | 321450 | 0.0 | - |
| 24.0087 | 321500 | 0.0002 | - |
| 24.0124 | 321550 | 0.0 | - |
| 24.0161 | 321600 | 0.0002 | - |
| 24.0199 | 321650 | 0.0 | - |
| 24.0236 | 321700 | 0.0 | - |
| 24.0273 | 321750 | 0.0 | - |
| 24.0311 | 321800 | 0.0 | - |
| 24.0348 | 321850 | 0.0 | - |
| 24.0385 | 321900 | 0.0 | - |
| 24.0423 | 321950 | 0.0001 | - |
| 24.0460 | 322000 | 0.0 | - |
| 24.0497 | 322050 | 0.0 | - |
| 24.0535 | 322100 | 0.0001 | - |
| 24.0572 | 322150 | 0.0 | - |
| 24.0609 | 322200 | 0.0 | - |
| 24.0647 | 322250 | 0.0003 | - |
| 24.0684 | 322300 | 0.0 | - |
| 24.0721 | 322350 | 0.0 | - |
| 24.0759 | 322400 | 0.0002 | - |
| 24.0796 | 322450 | 0.0 | - |
| 24.0833 | 322500 | 0.0 | - |
| 24.0871 | 322550 | 0.0 | - |
| 24.0908 | 322600 | 0.0 | - |
| 24.0945 | 322650 | 0.0 | - |
| 24.0983 | 322700 | 0.0 | - |
| 24.1020 | 322750 | 0.0002 | - |
| 24.1057 | 322800 | 0.0 | - |
| 24.1095 | 322850 | 0.0002 | - |
| 24.1132 | 322900 | 0.0 | - |
| 24.1169 | 322950 | 0.0002 | - |
| 24.1207 | 323000 | 0.0 | - |
| 24.1244 | 323050 | 0.0 | - |
| 24.1281 | 323100 | 0.0 | - |
| 24.1319 | 323150 | 0.0 | - |
| 24.1356 | 323200 | 0.0002 | - |
| 24.1393 | 323250 | 0.0003 | - |
| 24.1431 | 323300 | 0.0003 | - |
| 24.1468 | 323350 | 0.0002 | - |
| 24.1505 | 323400 | 0.0 | - |
| 24.1543 | 323450 | 0.0 | - |
| 24.1580 | 323500 | 0.0001 | - |
| 24.1618 | 323550 | 0.0004 | - |
| 24.1655 | 323600 | 0.0 | - |
| 24.1692 | 323650 | 0.0002 | - |
| 24.1730 | 323700 | 0.0 | - |
| 24.1767 | 323750 | 0.0002 | - |
| 24.1804 | 323800 | 0.0 | - |
| 24.1842 | 323850 | 0.0 | - |
| 24.1879 | 323900 | 0.0 | - |
| 24.1916 | 323950 | 0.0 | - |
| 24.1954 | 324000 | 0.0 | - |
| 24.1991 | 324050 | 0.0 | - |
| 24.2028 | 324100 | 0.0002 | - |
| 24.2066 | 324150 | 0.0003 | - |
| 24.2103 | 324200 | 0.0 | - |
| 24.2140 | 324250 | 0.0001 | - |
| 24.2178 | 324300 | 0.0 | - |
| 24.2215 | 324350 | 0.0002 | - |
| 24.2252 | 324400 | 0.0 | - |
| 24.2290 | 324450 | 0.0002 | - |
| 24.2327 | 324500 | 0.0002 | - |
| 24.2364 | 324550 | 0.0 | - |
| 24.2402 | 324600 | 0.0001 | - |
| 24.2439 | 324650 | 0.0002 | - |
| 24.2476 | 324700 | 0.0002 | - |
| 24.2514 | 324750 | 0.0 | - |
| 24.2551 | 324800 | 0.0002 | - |
| 24.2588 | 324850 | 0.0 | - |
| 24.2626 | 324900 | 0.0 | - |
| 24.2663 | 324950 | 0.0002 | - |
| 24.2700 | 325000 | 0.0 | - |
| 24.2738 | 325050 | 0.0001 | - |
| 24.2775 | 325100 | 0.0002 | - |
| 24.2812 | 325150 | 0.0 | - |
| 24.2850 | 325200 | 0.0 | - |
| 24.2887 | 325250 | 0.0002 | - |
| 24.2924 | 325300 | 0.0 | - |
| 24.2962 | 325350 | 0.0002 | - |
| 24.2999 | 325400 | 0.0 | - |
| 24.3036 | 325450 | 0.0 | - |
| 24.3074 | 325500 | 0.0 | - |
| 24.3111 | 325550 | 0.0 | - |
| 24.3148 | 325600 | 0.0 | - |
| 24.3186 | 325650 | 0.0 | - |
| 24.3223 | 325700 | 0.0 | - |
| 24.3260 | 325750 | 0.0003 | - |
| 24.3298 | 325800 | 0.0001 | - |
| 24.3335 | 325850 | 0.0002 | - |
| 24.3372 | 325900 | 0.0 | - |
| 24.3410 | 325950 | 0.0 | - |
| 24.3447 | 326000 | 0.0 | - |
| 24.3484 | 326050 | 0.0 | - |
| 24.3522 | 326100 | 0.0 | - |
| 24.3559 | 326150 | 0.0 | - |
| 24.3596 | 326200 | 0.0 | - |
| 24.3634 | 326250 | 0.0 | - |
| 24.3671 | 326300 | 0.0 | - |
| 24.3708 | 326350 | 0.0001 | - |
| 24.3746 | 326400 | 0.0002 | - |
| 24.3783 | 326450 | 0.0 | - |
| 24.3820 | 326500 | 0.0 | - |
| 24.3858 | 326550 | 0.0001 | - |
| 24.3895 | 326600 | 0.0 | - |
| 24.3932 | 326650 | 0.0 | - |
| 24.3970 | 326700 | 0.0 | - |
| 24.4007 | 326750 | 0.0 | - |
| 24.4045 | 326800 | 0.0002 | - |
| 24.4082 | 326850 | 0.0 | - |
| 24.4119 | 326900 | 0.0 | - |
| 24.4157 | 326950 | 0.0 | - |
| 24.4194 | 327000 | 0.0 | - |
| 24.4231 | 327050 | 0.0002 | - |
| 24.4269 | 327100 | 0.0002 | - |
| 24.4306 | 327150 | 0.0002 | - |
| 24.4343 | 327200 | 0.0 | - |
| 24.4381 | 327250 | 0.0 | - |
| 24.4418 | 327300 | 0.0 | - |
| 24.4455 | 327350 | 0.0 | - |
| 24.4493 | 327400 | 0.0 | - |
| 24.4530 | 327450 | 0.0002 | - |
| 24.4567 | 327500 | 0.0 | - |
| 24.4605 | 327550 | 0.0 | - |
| 24.4642 | 327600 | 0.0 | - |
| 24.4679 | 327650 | 0.0 | - |
| 24.4717 | 327700 | 0.0001 | - |
| 24.4754 | 327750 | 0.0002 | - |
| 24.4791 | 327800 | 0.0 | - |
| 24.4829 | 327850 | 0.0 | - |
| 24.4866 | 327900 | 0.0 | - |
| 24.4903 | 327950 | 0.0 | - |
| 24.4941 | 328000 | 0.0 | - |
| 24.4978 | 328050 | 0.0 | - |
| 24.5015 | 328100 | 0.0003 | - |
| 24.5053 | 328150 | 0.0 | - |
| 24.5090 | 328200 | 0.0002 | - |
| 24.5127 | 328250 | 0.0 | - |
| 24.5165 | 328300 | 0.0 | - |
| 24.5202 | 328350 | 0.0002 | - |
| 24.5239 | 328400 | 0.0 | - |
| 24.5277 | 328450 | 0.0 | - |
| 24.5314 | 328500 | 0.0 | - |
| 24.5351 | 328550 | 0.0 | - |
| 24.5389 | 328600 | 0.0 | - |
| 24.5426 | 328650 | 0.0 | - |
| 24.5463 | 328700 | 0.0 | - |
| 24.5501 | 328750 | 0.0 | - |
| 24.5538 | 328800 | 0.0 | - |
| 24.5575 | 328850 | 0.0 | - |
| 24.5613 | 328900 | 0.0 | - |
| 24.5650 | 328950 | 0.0 | - |
| 24.5687 | 329000 | 0.0 | - |
| 24.5725 | 329050 | 0.0 | - |
| 24.5762 | 329100 | 0.0 | - |
| 24.5799 | 329150 | 0.0002 | - |
| 24.5837 | 329200 | 0.0 | - |
| 24.5874 | 329250 | 0.0 | - |
| 24.5911 | 329300 | 0.0 | - |
| 24.5949 | 329350 | 0.0 | - |
| 24.5986 | 329400 | 0.0004 | - |
| 24.6023 | 329450 | 0.0002 | - |
| 24.6061 | 329500 | 0.0002 | - |
| 24.6098 | 329550 | 0.0002 | - |
| 24.6135 | 329600 | 0.0 | - |
| 24.6173 | 329650 | 0.0 | - |
| 24.6210 | 329700 | 0.0 | - |
| 24.6247 | 329750 | 0.0 | - |
| 24.6285 | 329800 | 0.0002 | - |
| 24.6322 | 329850 | 0.0 | - |
| 24.6359 | 329900 | 0.0 | - |
| 24.6397 | 329950 | 0.0002 | - |
| 24.6434 | 330000 | 0.0 | - |
| 24.6472 | 330050 | 0.0 | - |
| 24.6509 | 330100 | 0.0002 | - |
| 24.6546 | 330150 | 0.0 | - |
| 24.6584 | 330200 | 0.0002 | - |
| 24.6621 | 330250 | 0.0 | - |
| 24.6658 | 330300 | 0.0003 | - |
| 24.6696 | 330350 | 0.0 | - |
| 24.6733 | 330400 | 0.0 | - |
| 24.6770 | 330450 | 0.0 | - |
| 24.6808 | 330500 | 0.0 | - |
| 24.6845 | 330550 | 0.0 | - |
| 24.6882 | 330600 | 0.0 | - |
| 24.6920 | 330650 | 0.0001 | - |
| 24.6957 | 330700 | 0.0 | - |
| 24.6994 | 330750 | 0.0 | - |
| 24.7032 | 330800 | 0.0001 | - |
| 24.7069 | 330850 | 0.0 | - |
| 24.7106 | 330900 | 0.0 | - |
| 24.7144 | 330950 | 0.0 | - |
| 24.7181 | 331000 | 0.0 | - |
| 24.7218 | 331050 | 0.0 | - |
| 24.7256 | 331100 | 0.0 | - |
| 24.7293 | 331150 | 0.0003 | - |
| 24.7330 | 331200 | 0.0 | - |
| 24.7368 | 331250 | 0.0002 | - |
| 24.7405 | 331300 | 0.0 | - |
| 24.7442 | 331350 | 0.0 | - |
| 24.7480 | 331400 | 0.0 | - |
| 24.7517 | 331450 | 0.0 | - |
| 24.7554 | 331500 | 0.0001 | - |
| 24.7592 | 331550 | 0.0002 | - |
| 24.7629 | 331600 | 0.0 | - |
| 24.7666 | 331650 | 0.0002 | - |
| 24.7704 | 331700 | 0.0002 | - |
| 24.7741 | 331750 | 0.0 | - |
| 24.7778 | 331800 | 0.0 | - |
| 24.7816 | 331850 | 0.0002 | - |
| 24.7853 | 331900 | 0.0 | - |
| 24.7890 | 331950 | 0.0 | - |
| 24.7928 | 332000 | 0.0 | - |
| 24.7965 | 332050 | 0.0 | - |
| 24.8002 | 332100 | 0.0 | - |
| 24.8040 | 332150 | 0.0 | - |
| 24.8077 | 332200 | 0.0 | - |
| 24.8114 | 332250 | 0.0002 | - |
| 24.8152 | 332300 | 0.0 | - |
| 24.8189 | 332350 | 0.0 | - |
| 24.8226 | 332400 | 0.0 | - |
| 24.8264 | 332450 | 0.0 | - |
| 24.8301 | 332500 | 0.0 | - |
| 24.8338 | 332550 | 0.0 | - |
| 24.8376 | 332600 | 0.0002 | - |
| 24.8413 | 332650 | 0.0001 | - |
| 24.8450 | 332700 | 0.0 | - |
| 24.8488 | 332750 | 0.0001 | - |
| 24.8525 | 332800 | 0.0 | - |
| 24.8562 | 332850 | 0.0 | - |
| 24.8600 | 332900 | 0.0002 | - |
| 24.8637 | 332950 | 0.0002 | - |
| 24.8674 | 333000 | 0.0 | - |
| 24.8712 | 333050 | 0.0003 | - |
| 24.8749 | 333100 | 0.0 | - |
| 24.8786 | 333150 | 0.0003 | - |
| 24.8824 | 333200 | 0.0002 | - |
| 24.8861 | 333250 | 0.0 | - |
| 24.8899 | 333300 | 0.0 | - |
| 24.8936 | 333350 | 0.0 | - |
| 24.8973 | 333400 | 0.0 | - |
| 24.9011 | 333450 | 0.0002 | - |
| 24.9048 | 333500 | 0.0 | - |
| 24.9085 | 333550 | 0.0 | - |
| 24.9123 | 333600 | 0.0 | - |
| 24.9160 | 333650 | 0.0 | - |
| 24.9197 | 333700 | 0.0002 | - |
| 24.9235 | 333750 | 0.0002 | - |
| 24.9272 | 333800 | 0.0002 | - |
| 24.9309 | 333850 | 0.0003 | - |
| 24.9347 | 333900 | 0.0002 | - |
| 24.9384 | 333950 | 0.0001 | - |
| 24.9421 | 334000 | 0.0001 | - |
| 24.9459 | 334050 | 0.0004 | - |
| 24.9496 | 334100 | 0.0001 | - |
| 24.9533 | 334150 | 0.0 | - |
| 24.9571 | 334200 | 0.0 | - |
| 24.9608 | 334250 | 0.0 | - |
| 24.9645 | 334300 | 0.0 | - |
| 24.9683 | 334350 | 0.0 | - |
| 24.9720 | 334400 | 0.0002 | - |
| 24.9757 | 334450 | 0.0 | - |
| 24.9795 | 334500 | 0.0 | - |
| 24.9832 | 334550 | 0.0002 | - |
| 24.9869 | 334600 | 0.0 | - |
| 24.9907 | 334650 | 0.0 | - |
| 24.9944 | 334700 | 0.0 | - |
| 24.9981 | 334750 | 0.0 | - |
| 25.0019 | 334800 | 0.0 | - |
| 25.0056 | 334850 | 0.0 | - |
| 25.0093 | 334900 | 0.0 | - |
| 25.0131 | 334950 | 0.0001 | - |
| 25.0168 | 335000 | 0.0 | - |
| 25.0205 | 335050 | 0.0 | - |
| 25.0243 | 335100 | 0.0002 | - |
| 25.0280 | 335150 | 0.0 | - |
| 25.0317 | 335200 | 0.0003 | - |
| 25.0355 | 335250 | 0.0 | - |
| 25.0392 | 335300 | 0.0002 | - |
| 25.0429 | 335350 | 0.0 | - |
| 25.0467 | 335400 | 0.0 | - |
| 25.0504 | 335450 | 0.0 | - |
| 25.0541 | 335500 | 0.0002 | - |
| 25.0579 | 335550 | 0.0 | - |
| 25.0616 | 335600 | 0.0 | - |
| 25.0653 | 335650 | 0.0 | - |
| 25.0691 | 335700 | 0.0 | - |
| 25.0728 | 335750 | 0.0 | - |
| 25.0765 | 335800 | 0.0 | - |
| 25.0803 | 335850 | 0.0002 | - |
| 25.0840 | 335900 | 0.0002 | - |
| 25.0877 | 335950 | 0.0 | - |
| 25.0915 | 336000 | 0.0 | - |
| 25.0952 | 336050 | 0.0 | - |
| 25.0989 | 336100 | 0.0002 | - |
| 25.1027 | 336150 | 0.0 | - |
| 25.1064 | 336200 | 0.0 | - |
| 25.1101 | 336250 | 0.0 | - |
| 25.1139 | 336300 | 0.0001 | - |
| 25.1176 | 336350 | 0.0001 | - |
| 25.1214 | 336400 | 0.0 | - |
| 25.1251 | 336450 | 0.0 | - |
| 25.1288 | 336500 | 0.0 | - |
| 25.1326 | 336550 | 0.0 | - |
| 25.1363 | 336600 | 0.0 | - |
| 25.1400 | 336650 | 0.0002 | - |
| 25.1438 | 336700 | 0.0001 | - |
| 25.1475 | 336750 | 0.0 | - |
| 25.1512 | 336800 | 0.0 | - |
| 25.1550 | 336850 | 0.0 | - |
| 25.1587 | 336900 | 0.0001 | - |
| 25.1624 | 336950 | 0.0002 | - |
| 25.1662 | 337000 | 0.0 | - |
| 25.1699 | 337050 | 0.0001 | - |
| 25.1736 | 337100 | 0.0 | - |
| 25.1774 | 337150 | 0.0 | - |
| 25.1811 | 337200 | 0.0002 | - |
| 25.1848 | 337250 | 0.0 | - |
| 25.1886 | 337300 | 0.0002 | - |
| 25.1923 | 337350 | 0.0002 | - |
| 25.1960 | 337400 | 0.0 | - |
| 25.1998 | 337450 | 0.0 | - |
| 25.2035 | 337500 | 0.0 | - |
| 25.2072 | 337550 | 0.0 | - |
| 25.2110 | 337600 | 0.0002 | - |
| 25.2147 | 337650 | 0.0 | - |
| 25.2184 | 337700 | 0.0002 | - |
| 25.2222 | 337750 | 0.0 | - |
| 25.2259 | 337800 | 0.0 | - |
| 25.2296 | 337850 | 0.0 | - |
| 25.2334 | 337900 | 0.0 | - |
| 25.2371 | 337950 | 0.0 | - |
| 25.2408 | 338000 | 0.0 | - |
| 25.2446 | 338050 | 0.0002 | - |
| 25.2483 | 338100 | 0.0 | - |
| 25.2520 | 338150 | 0.0002 | - |
| 25.2558 | 338200 | 0.0 | - |
| 25.2595 | 338250 | 0.0002 | - |
| 25.2632 | 338300 | 0.0 | - |
| 25.2670 | 338350 | 0.0 | - |
| 25.2707 | 338400 | 0.0 | - |
| 25.2744 | 338450 | 0.0 | - |
| 25.2782 | 338500 | 0.0002 | - |
| 25.2819 | 338550 | 0.0 | - |
| 25.2856 | 338600 | 0.0 | - |
| 25.2894 | 338650 | 0.0001 | - |
| 25.2931 | 338700 | 0.0 | - |
| 25.2968 | 338750 | 0.0 | - |
| 25.3006 | 338800 | 0.0 | - |
| 25.3043 | 338850 | 0.0 | - |
| 25.3080 | 338900 | 0.0 | - |
| 25.3118 | 338950 | 0.0 | - |
| 25.3155 | 339000 | 0.0001 | - |
| 25.3192 | 339050 | 0.0 | - |
| 25.3230 | 339100 | 0.0 | - |
| 25.3267 | 339150 | 0.0002 | - |
| 25.3304 | 339200 | 0.0 | - |
| 25.3342 | 339250 | 0.0 | - |
| 25.3379 | 339300 | 0.0002 | - |
| 25.3416 | 339350 | 0.0002 | - |
| 25.3454 | 339400 | 0.0 | - |
| 25.3491 | 339450 | 0.0 | - |
| 25.3528 | 339500 | 0.0 | - |
| 25.3566 | 339550 | 0.0 | - |
| 25.3603 | 339600 | 0.0 | - |
| 25.3641 | 339650 | 0.0 | - |
| 25.3678 | 339700 | 0.0002 | - |
| 25.3715 | 339750 | 0.0 | - |
| 25.3753 | 339800 | 0.0002 | - |
| 25.3790 | 339850 | 0.0 | - |
| 25.3827 | 339900 | 0.0002 | - |
| 25.3865 | 339950 | 0.0002 | - |
| 25.3902 | 340000 | 0.0 | - |
| 25.3939 | 340050 | 0.0002 | - |
| 25.3977 | 340100 | 0.0 | - |
| 25.4014 | 340150 | 0.0001 | - |
| 25.4051 | 340200 | 0.0001 | - |
| 25.4089 | 340250 | 0.0 | - |
| 25.4126 | 340300 | 0.0 | - |
| 25.4163 | 340350 | 0.0 | - |
| 25.4201 | 340400 | 0.0002 | - |
| 25.4238 | 340450 | 0.0002 | - |
| 25.4275 | 340500 | 0.0 | - |
| 25.4313 | 340550 | 0.0002 | - |
| 25.4350 | 340600 | 0.0 | - |
| 25.4387 | 340650 | 0.0 | - |
| 25.4425 | 340700 | 0.0002 | - |
| 25.4462 | 340750 | 0.0 | - |
| 25.4499 | 340800 | 0.0 | - |
| 25.4537 | 340850 | 0.0 | - |
| 25.4574 | 340900 | 0.0 | - |
| 25.4611 | 340950 | 0.0 | - |
| 25.4649 | 341000 | 0.0 | - |
| 25.4686 | 341050 | 0.0002 | - |
| 25.4723 | 341100 | 0.0 | - |
| 25.4761 | 341150 | 0.0 | - |
| 25.4798 | 341200 | 0.0002 | - |
| 25.4835 | 341250 | 0.0 | - |
| 25.4873 | 341300 | 0.0 | - |
| 25.4910 | 341350 | 0.0 | - |
| 25.4947 | 341400 | 0.0 | - |
| 25.4985 | 341450 | 0.0 | - |
| 25.5022 | 341500 | 0.0 | - |
| 25.5059 | 341550 | 0.0 | - |
| 25.5097 | 341600 | 0.0 | - |
| 25.5134 | 341650 | 0.0 | - |
| 25.5171 | 341700 | 0.0 | - |
| 25.5209 | 341750 | 0.0005 | - |
| 25.5246 | 341800 | 0.0 | - |
| 25.5283 | 341850 | 0.0 | - |
| 25.5321 | 341900 | 0.0 | - |
| 25.5358 | 341950 | 0.0 | - |
| 25.5395 | 342000 | 0.0003 | - |
| 25.5433 | 342050 | 0.0 | - |
| 25.5470 | 342100 | 0.0 | - |
| 25.5507 | 342150 | 0.0 | - |
| 25.5545 | 342200 | 0.0 | - |
| 25.5582 | 342250 | 0.0 | - |
| 25.5619 | 342300 | 0.0 | - |
| 25.5657 | 342350 | 0.0 | - |
| 25.5694 | 342400 | 0.0002 | - |
| 25.5731 | 342450 | 0.0 | - |
| 25.5769 | 342500 | 0.0002 | - |
| 25.5806 | 342550 | 0.0 | - |
| 25.5843 | 342600 | 0.0 | - |
| 25.5881 | 342650 | 0.0 | - |
| 25.5918 | 342700 | 0.0 | - |
| 25.5955 | 342750 | 0.0 | - |
| 25.5993 | 342800 | 0.0002 | - |
| 25.6030 | 342850 | 0.0 | - |
| 25.6068 | 342900 | 0.0002 | - |
| 25.6105 | 342950 | 0.0 | - |
| 25.6142 | 343000 | 0.0 | - |
| 25.6180 | 343050 | 0.0 | - |
| 25.6217 | 343100 | 0.0 | - |
| 25.6254 | 343150 | 0.0002 | - |
| 25.6292 | 343200 | 0.0 | - |
| 25.6329 | 343250 | 0.0 | - |
| 25.6366 | 343300 | 0.0 | - |
| 25.6404 | 343350 | 0.0002 | - |
| 25.6441 | 343400 | 0.0 | - |
| 25.6478 | 343450 | 0.0 | - |
| 25.6516 | 343500 | 0.0 | - |
| 25.6553 | 343550 | 0.0 | - |
| 25.6590 | 343600 | 0.0002 | - |
| 25.6628 | 343650 | 0.0002 | - |
| 25.6665 | 343700 | 0.0 | - |
| 25.6702 | 343750 | 0.0002 | - |
| 25.6740 | 343800 | 0.0001 | - |
| 25.6777 | 343850 | 0.0002 | - |
| 25.6814 | 343900 | 0.0 | - |
| 25.6852 | 343950 | 0.0 | - |
| 25.6889 | 344000 | 0.0002 | - |
| 25.6926 | 344050 | 0.0 | - |
| 25.6964 | 344100 | 0.0 | - |
| 25.7001 | 344150 | 0.0003 | - |
| 25.7038 | 344200 | 0.0004 | - |
| 25.7076 | 344250 | 0.0003 | - |
| 25.7113 | 344300 | 0.0 | - |
| 25.7150 | 344350 | 0.0 | - |
| 25.7188 | 344400 | 0.0 | - |
| 25.7225 | 344450 | 0.0 | - |
| 25.7262 | 344500 | 0.0 | - |
| 25.7300 | 344550 | 0.0002 | - |
| 25.7337 | 344600 | 0.0 | - |
| 25.7374 | 344650 | 0.0 | - |
| 25.7412 | 344700 | 0.0 | - |
| 25.7449 | 344750 | 0.0 | - |
| 25.7486 | 344800 | 0.0002 | - |
| 25.7524 | 344850 | 0.0 | - |
| 25.7561 | 344900 | 0.0003 | - |
| 25.7598 | 344950 | 0.0 | - |
| 25.7636 | 345000 | 0.0 | - |
| 25.7673 | 345050 | 0.0 | - |
| 25.7710 | 345100 | 0.0002 | - |
| 25.7748 | 345150 | 0.0 | - |
| 25.7785 | 345200 | 0.0002 | - |
| 25.7822 | 345250 | 0.0 | - |
| 25.7860 | 345300 | 0.0 | - |
| 25.7897 | 345350 | 0.0 | - |
| 25.7934 | 345400 | 0.0 | - |
| 25.7972 | 345450 | 0.0004 | - |
| 25.8009 | 345500 | 0.0001 | - |
| 25.8046 | 345550 | 0.0002 | - |
| 25.8084 | 345600 | 0.0003 | - |
| 25.8121 | 345650 | 0.0 | - |
| 25.8158 | 345700 | 0.0002 | - |
| 25.8196 | 345750 | 0.0 | - |
| 25.8233 | 345800 | 0.0 | - |
| 25.8270 | 345850 | 0.0 | - |
| 25.8308 | 345900 | 0.0002 | - |
| 25.8345 | 345950 | 0.0 | - |
| 25.8382 | 346000 | 0.0 | - |
| 25.8420 | 346050 | 0.0002 | - |
| 25.8457 | 346100 | 0.0 | - |
| 25.8495 | 346150 | 0.0 | - |
| 25.8532 | 346200 | 0.0 | - |
| 25.8569 | 346250 | 0.0 | - |
| 25.8607 | 346300 | 0.0 | - |
| 25.8644 | 346350 | 0.0002 | - |
| 25.8681 | 346400 | 0.0 | - |
| 25.8719 | 346450 | 0.0 | - |
| 25.8756 | 346500 | 0.0 | - |
| 25.8793 | 346550 | 0.0 | - |
| 25.8831 | 346600 | 0.0 | - |
| 25.8868 | 346650 | 0.0002 | - |
| 25.8905 | 346700 | 0.0 | - |
| 25.8943 | 346750 | 0.0002 | - |
| 25.8980 | 346800 | 0.0 | - |
| 25.9017 | 346850 | 0.0 | - |
| 25.9055 | 346900 | 0.0003 | - |
| 25.9092 | 346950 | 0.0 | - |
| 25.9129 | 347000 | 0.0 | - |
| 25.9167 | 347050 | 0.0 | - |
| 25.9204 | 347100 | 0.0 | - |
| 25.9241 | 347150 | 0.0 | - |
| 25.9279 | 347200 | 0.0 | - |
| 25.9316 | 347250 | 0.0 | - |
| 25.9353 | 347300 | 0.0 | - |
| 25.9391 | 347350 | 0.0001 | - |
| 25.9428 | 347400 | 0.0 | - |
| 25.9465 | 347450 | 0.0 | - |
| 25.9503 | 347500 | 0.0 | - |
| 25.9540 | 347550 | 0.0 | - |
| 25.9577 | 347600 | 0.0 | - |
| 25.9615 | 347650 | 0.0002 | - |
| 25.9652 | 347700 | 0.0 | - |
| 25.9689 | 347750 | 0.0 | - |
| 25.9727 | 347800 | 0.0 | - |
| 25.9764 | 347850 | 0.0 | - |
| 25.9801 | 347900 | 0.0 | - |
| 25.9839 | 347950 | 0.0 | - |
| 25.9876 | 348000 | 0.0 | - |
| 25.9913 | 348050 | 0.0 | - |
| 25.9951 | 348100 | 0.0002 | - |
| 25.9988 | 348150 | 0.0 | - |
| 26.0025 | 348200 | 0.0 | - |
| 26.0063 | 348250 | 0.0 | - |
| 26.0100 | 348300 | 0.0002 | - |
| 26.0137 | 348350 | 0.0002 | - |
| 26.0175 | 348400 | 0.0 | - |
| 26.0212 | 348450 | 0.0002 | - |
| 26.0249 | 348500 | 0.0003 | - |
| 26.0287 | 348550 | 0.0001 | - |
| 26.0324 | 348600 | 0.0002 | - |
| 26.0361 | 348650 | 0.0 | - |
| 26.0399 | 348700 | 0.0002 | - |
| 26.0436 | 348750 | 0.0 | - |
| 26.0473 | 348800 | 0.0 | - |
| 26.0511 | 348850 | 0.0 | - |
| 26.0548 | 348900 | 0.0 | - |
| 26.0585 | 348950 | 0.0002 | - |
| 26.0623 | 349000 | 0.0002 | - |
| 26.0660 | 349050 | 0.0002 | - |
| 26.0697 | 349100 | 0.0 | - |
| 26.0735 | 349150 | 0.0003 | - |
| 26.0772 | 349200 | 0.0 | - |
| 26.0809 | 349250 | 0.0 | - |
| 26.0847 | 349300 | 0.0 | - |
| 26.0884 | 349350 | 0.0 | - |
| 26.0922 | 349400 | 0.0002 | - |
| 26.0959 | 349450 | 0.0 | - |
| 26.0996 | 349500 | 0.0002 | - |
| 26.1034 | 349550 | 0.0002 | - |
| 26.1071 | 349600 | 0.0002 | - |
| 26.1108 | 349650 | 0.0 | - |
| 26.1146 | 349700 | 0.0002 | - |
| 26.1183 | 349750 | 0.0002 | - |
| 26.1220 | 349800 | 0.0002 | - |
| 26.1258 | 349850 | 0.0 | - |
| 26.1295 | 349900 | 0.0 | - |
| 26.1332 | 349950 | 0.0002 | - |
| 26.1370 | 350000 | 0.0002 | - |
| 26.1407 | 350050 | 0.0 | - |
| 26.1444 | 350100 | 0.0 | - |
| 26.1482 | 350150 | 0.0002 | - |
| 26.1519 | 350200 | 0.0 | - |
| 26.1556 | 350250 | 0.0 | - |
| 26.1594 | 350300 | 0.0 | - |
| 26.1631 | 350350 | 0.0 | - |
| 26.1668 | 350400 | 0.0002 | - |
| 26.1706 | 350450 | 0.0002 | - |
| 26.1743 | 350500 | 0.0 | - |
| 26.1780 | 350550 | 0.0002 | - |
| 26.1818 | 350600 | 0.0002 | - |
| 26.1855 | 350650 | 0.0 | - |
| 26.1892 | 350700 | 0.0 | - |
| 26.1930 | 350750 | 0.0 | - |
| 26.1967 | 350800 | 0.0 | - |
| 26.2004 | 350850 | 0.0 | - |
| 26.2042 | 350900 | 0.0003 | - |
| 26.2079 | 350950 | 0.0 | - |
| 26.2116 | 351000 | 0.0 | - |
| 26.2154 | 351050 | 0.0 | - |
| 26.2191 | 351100 | 0.0 | - |
| 26.2228 | 351150 | 0.0 | - |
| 26.2266 | 351200 | 0.0 | - |
| 26.2303 | 351250 | 0.0002 | - |
| 26.2340 | 351300 | 0.0 | - |
| 26.2378 | 351350 | 0.0 | - |
| 26.2415 | 351400 | 0.0003 | - |
| 26.2452 | 351450 | 0.0005 | - |
| 26.2490 | 351500 | 0.0002 | - |
| 26.2527 | 351550 | 0.0002 | - |
| 26.2564 | 351600 | 0.0001 | - |
| 26.2602 | 351650 | 0.0 | - |
| 26.2639 | 351700 | 0.0001 | - |
| 26.2676 | 351750 | 0.0002 | - |
| 26.2714 | 351800 | 0.0 | - |
| 26.2751 | 351850 | 0.0 | - |
| 26.2788 | 351900 | 0.0002 | - |
| 26.2826 | 351950 | 0.0002 | - |
| 26.2863 | 352000 | 0.0 | - |
| 26.2900 | 352050 | 0.0002 | - |
| 26.2938 | 352100 | 0.0 | - |
| 26.2975 | 352150 | 0.0001 | - |
| 26.3012 | 352200 | 0.0003 | - |
| 26.3050 | 352250 | 0.0 | - |
| 26.3087 | 352300 | 0.0 | - |
| 26.3124 | 352350 | 0.0002 | - |
| 26.3162 | 352400 | 0.0 | - |
| 26.3199 | 352450 | 0.0 | - |
| 26.3237 | 352500 | 0.0 | - |
| 26.3274 | 352550 | 0.0002 | - |
| 26.3311 | 352600 | 0.0002 | - |
| 26.3349 | 352650 | 0.0002 | - |
| 26.3386 | 352700 | 0.0 | - |
| 26.3423 | 352750 | 0.0002 | - |
| 26.3461 | 352800 | 0.0 | - |
| 26.3498 | 352850 | 0.0 | - |
| 26.3535 | 352900 | 0.0 | - |
| 26.3573 | 352950 | 0.0 | - |
| 26.3610 | 353000 | 0.0002 | - |
| 26.3647 | 353050 | 0.0 | - |
| 26.3685 | 353100 | 0.0 | - |
| 26.3722 | 353150 | 0.0004 | - |
| 26.3759 | 353200 | 0.0 | - |
| 26.3797 | 353250 | 0.0003 | - |
| 26.3834 | 353300 | 0.0002 | - |
| 26.3871 | 353350 | 0.0 | - |
| 26.3909 | 353400 | 0.0001 | - |
| 26.3946 | 353450 | 0.0 | - |
| 26.3983 | 353500 | 0.0 | - |
| 26.4021 | 353550 | 0.0 | - |
| 26.4058 | 353600 | 0.0 | - |
| 26.4095 | 353650 | 0.0002 | - |
| 26.4133 | 353700 | 0.0002 | - |
| 26.4170 | 353750 | 0.0 | - |
| 26.4207 | 353800 | 0.0002 | - |
| 26.4245 | 353850 | 0.0 | - |
| 26.4282 | 353900 | 0.0 | - |
| 26.4319 | 353950 | 0.0 | - |
| 26.4357 | 354000 | 0.0002 | - |
| 26.4394 | 354050 | 0.0002 | - |
| 26.4431 | 354100 | 0.0001 | - |
| 26.4469 | 354150 | 0.0 | - |
| 26.4506 | 354200 | 0.0006 | - |
| 26.4543 | 354250 | 0.0003 | - |
| 26.4581 | 354300 | 0.0002 | - |
| 26.4618 | 354350 | 0.0 | - |
| 26.4655 | 354400 | 0.0 | - |
| 26.4693 | 354450 | 0.0 | - |
| 26.4730 | 354500 | 0.0 | - |
| 26.4767 | 354550 | 0.0003 | - |
| 26.4805 | 354600 | 0.0002 | - |
| 26.4842 | 354650 | 0.0004 | - |
| 26.4879 | 354700 | 0.0 | - |
| 26.4917 | 354750 | 0.0 | - |
| 26.4954 | 354800 | 0.0002 | - |
| 26.4991 | 354850 | 0.0004 | - |
| 26.5029 | 354900 | 0.0 | - |
| 26.5066 | 354950 | 0.0 | - |
| 26.5103 | 355000 | 0.0 | - |
| 26.5141 | 355050 | 0.0 | - |
| 26.5178 | 355100 | 0.0 | - |
| 26.5215 | 355150 | 0.0001 | - |
| 26.5253 | 355200 | 0.0002 | - |
| 26.5290 | 355250 | 0.0001 | - |
| 26.5327 | 355300 | 0.0001 | - |
| 26.5365 | 355350 | 0.0 | - |
| 26.5402 | 355400 | 0.0 | - |
| 26.5439 | 355450 | 0.0 | - |
| 26.5477 | 355500 | 0.0002 | - |
| 26.5514 | 355550 | 0.0 | - |
| 26.5551 | 355600 | 0.0 | - |
| 26.5589 | 355650 | 0.0002 | - |
| 26.5626 | 355700 | 0.0 | - |
| 26.5664 | 355750 | 0.0002 | - |
| 26.5701 | 355800 | 0.0002 | - |
| 26.5738 | 355850 | 0.0002 | - |
| 26.5776 | 355900 | 0.0 | - |
| 26.5813 | 355950 | 0.0 | - |
| 26.5850 | 356000 | 0.0 | - |
| 26.5888 | 356050 | 0.0 | - |
| 26.5925 | 356100 | 0.0 | - |
| 26.5962 | 356150 | 0.0002 | - |
| 26.6000 | 356200 | 0.0001 | - |
| 26.6037 | 356250 | 0.0 | - |
| 26.6074 | 356300 | 0.0 | - |
| 26.6112 | 356350 | 0.0002 | - |
| 26.6149 | 356400 | 0.0 | - |
| 26.6186 | 356450 | 0.0 | - |
| 26.6224 | 356500 | 0.0 | - |
| 26.6261 | 356550 | 0.0002 | - |
| 26.6298 | 356600 | 0.0002 | - |
| 26.6336 | 356650 | 0.0 | - |
| 26.6373 | 356700 | 0.0 | - |
| 26.6410 | 356750 | 0.0 | - |
| 26.6448 | 356800 | 0.0001 | - |
| 26.6485 | 356850 | 0.0 | - |
| 26.6522 | 356900 | 0.0 | - |
| 26.6560 | 356950 | 0.0002 | - |
| 26.6597 | 357000 | 0.0 | - |
| 26.6634 | 357050 | 0.0 | - |
| 26.6672 | 357100 | 0.0 | - |
| 26.6709 | 357150 | 0.0 | - |
| 26.6746 | 357200 | 0.0 | - |
| 26.6784 | 357250 | 0.0 | - |
| 26.6821 | 357300 | 0.0001 | - |
| 26.6858 | 357350 | 0.0 | - |
| 26.6896 | 357400 | 0.0 | - |
| 26.6933 | 357450 | 0.0 | - |
| 26.6970 | 357500 | 0.0 | - |
| 26.7008 | 357550 | 0.0 | - |
| 26.7045 | 357600 | 0.0 | - |
| 26.7082 | 357650 | 0.0002 | - |
| 26.7120 | 357700 | 0.0002 | - |
| 26.7157 | 357750 | 0.0002 | - |
| 26.7194 | 357800 | 0.0003 | - |
| 26.7232 | 357850 | 0.0 | - |
| 26.7269 | 357900 | 0.0 | - |
| 26.7306 | 357950 | 0.0 | - |
| 26.7344 | 358000 | 0.0 | - |
| 26.7381 | 358050 | 0.0 | - |
| 26.7418 | 358100 | 0.0 | - |
| 26.7456 | 358150 | 0.0 | - |
| 26.7493 | 358200 | 0.0 | - |
| 26.7530 | 358250 | 0.0 | - |
| 26.7568 | 358300 | 0.0002 | - |
| 26.7605 | 358350 | 0.0001 | - |
| 26.7642 | 358400 | 0.0001 | - |
| 26.7680 | 358450 | 0.0 | - |
| 26.7717 | 358500 | 0.0 | - |
| 26.7754 | 358550 | 0.0 | - |
| 26.7792 | 358600 | 0.0 | - |
| 26.7829 | 358650 | 0.0002 | - |
| 26.7866 | 358700 | 0.0002 | - |
| 26.7904 | 358750 | 0.0002 | - |
| 26.7941 | 358800 | 0.0 | - |
| 26.7978 | 358850 | 0.0002 | - |
| 26.8016 | 358900 | 0.0 | - |
| 26.8053 | 358950 | 0.0 | - |
| 26.8091 | 359000 | 0.0001 | - |
| 26.8128 | 359050 | 0.0002 | - |
| 26.8165 | 359100 | 0.0002 | - |
| 26.8203 | 359150 | 0.0 | - |
| 26.8240 | 359200 | 0.0 | - |
| 26.8277 | 359250 | 0.0002 | - |
| 26.8315 | 359300 | 0.0 | - |
| 26.8352 | 359350 | 0.0 | - |
| 26.8389 | 359400 | 0.0 | - |
| 26.8427 | 359450 | 0.0002 | - |
| 26.8464 | 359500 | 0.0002 | - |
| 26.8501 | 359550 | 0.0001 | - |
| 26.8539 | 359600 | 0.0 | - |
| 26.8576 | 359650 | 0.0 | - |
| 26.8613 | 359700 | 0.0001 | - |
| 26.8651 | 359750 | 0.0 | - |
| 26.8688 | 359800 | 0.0 | - |
| 26.8725 | 359850 | 0.0002 | - |
| 26.8763 | 359900 | 0.0 | - |
| 26.8800 | 359950 | 0.0 | - |
| 26.8837 | 360000 | 0.0002 | - |
| 26.8875 | 360050 | 0.0 | - |
| 26.8912 | 360100 | 0.0 | - |
| 26.8949 | 360150 | 0.0 | - |
| 26.8987 | 360200 | 0.0002 | - |
| 26.9024 | 360250 | 0.0001 | - |
| 26.9061 | 360300 | 0.0 | - |
| 26.9099 | 360350 | 0.0 | - |
| 26.9136 | 360400 | 0.0 | - |
| 26.9173 | 360450 | 0.0 | - |
| 26.9211 | 360500 | 0.0 | - |
| 26.9248 | 360550 | 0.0 | - |
| 26.9285 | 360600 | 0.0 | - |
| 26.9323 | 360650 | 0.0 | - |
| 26.9360 | 360700 | 0.0 | - |
| 26.9397 | 360750 | 0.0002 | - |
| 26.9435 | 360800 | 0.0 | - |
| 26.9472 | 360850 | 0.0 | - |
| 26.9509 | 360900 | 0.0 | - |
| 26.9547 | 360950 | 0.0 | - |
| 26.9584 | 361000 | 0.0 | - |
| 26.9621 | 361050 | 0.0 | - |
| 26.9659 | 361100 | 0.0002 | - |
| 26.9696 | 361150 | 0.0 | - |
| 26.9733 | 361200 | 0.0 | - |
| 26.9771 | 361250 | 0.0 | - |
| 26.9808 | 361300 | 0.0 | - |
| 26.9845 | 361350 | 0.0002 | - |
| 26.9883 | 361400 | 0.0 | - |
| 26.9920 | 361450 | 0.0002 | - |
| 26.9957 | 361500 | 0.0 | - |
| 26.9995 | 361550 | 0.0 | - |
| 27.0032 | 361600 | 0.0002 | - |
| 27.0069 | 361650 | 0.0 | - |
| 27.0107 | 361700 | 0.0 | - |
| 27.0144 | 361750 | 0.0 | - |
| 27.0181 | 361800 | 0.0002 | - |
| 27.0219 | 361850 | 0.0 | - |
| 27.0256 | 361900 | 0.0 | - |
| 27.0293 | 361950 | 0.0 | - |
| 27.0331 | 362000 | 0.0 | - |
| 27.0368 | 362050 | 0.0 | - |
| 27.0405 | 362100 | 0.0 | - |
| 27.0443 | 362150 | 0.0 | - |
| 27.0480 | 362200 | 0.0003 | - |
| 27.0518 | 362250 | 0.0 | - |
| 27.0555 | 362300 | 0.0 | - |
| 27.0592 | 362350 | 0.0 | - |
| 27.0630 | 362400 | 0.0 | - |
| 27.0667 | 362450 | 0.0002 | - |
| 27.0704 | 362500 | 0.0 | - |
| 27.0742 | 362550 | 0.0 | - |
| 27.0779 | 362600 | 0.0001 | - |
| 27.0816 | 362650 | 0.0001 | - |
| 27.0854 | 362700 | 0.0 | - |
| 27.0891 | 362750 | 0.0 | - |
| 27.0928 | 362800 | 0.0 | - |
| 27.0966 | 362850 | 0.0 | - |
| 27.1003 | 362900 | 0.0 | - |
| 27.1040 | 362950 | 0.0 | - |
| 27.1078 | 363000 | 0.0 | - |
| 27.1115 | 363050 | 0.0001 | - |
| 27.1152 | 363100 | 0.0002 | - |
| 27.1190 | 363150 | 0.0 | - |
| 27.1227 | 363200 | 0.0 | - |
| 27.1264 | 363250 | 0.0 | - |
| 27.1302 | 363300 | 0.0 | - |
| 27.1339 | 363350 | 0.0002 | - |
| 27.1376 | 363400 | 0.0 | - |
| 27.1414 | 363450 | 0.0 | - |
| 27.1451 | 363500 | 0.0 | - |
| 27.1488 | 363550 | 0.0002 | - |
| 27.1526 | 363600 | 0.0 | - |
| 27.1563 | 363650 | 0.0002 | - |
| 27.1600 | 363700 | 0.0 | - |
| 27.1638 | 363750 | 0.0 | - |
| 27.1675 | 363800 | 0.0002 | - |
| 27.1712 | 363850 | 0.0 | - |
| 27.1750 | 363900 | 0.0002 | - |
| 27.1787 | 363950 | 0.0 | - |
| 27.1824 | 364000 | 0.0 | - |
| 27.1862 | 364050 | 0.0003 | - |
| 27.1899 | 364100 | 0.0 | - |
| 27.1936 | 364150 | 0.0 | - |
| 27.1974 | 364200 | 0.0 | - |
| 27.2011 | 364250 | 0.0 | - |
| 27.2048 | 364300 | 0.0 | - |
| 27.2086 | 364350 | 0.0 | - |
| 27.2123 | 364400 | 0.0 | - |
| 27.2160 | 364450 | 0.0 | - |
| 27.2198 | 364500 | 0.0 | - |
| 27.2235 | 364550 | 0.0 | - |
| 27.2272 | 364600 | 0.0 | - |
| 27.2310 | 364650 | 0.0 | - |
| 27.2347 | 364700 | 0.0 | - |
| 27.2384 | 364750 | 0.0002 | - |
| 27.2422 | 364800 | 0.0002 | - |
| 27.2459 | 364850 | 0.0002 | - |
| 27.2496 | 364900 | 0.0 | - |
| 27.2534 | 364950 | 0.0002 | - |
| 27.2571 | 365000 | 0.0 | - |
| 27.2608 | 365050 | 0.0 | - |
| 27.2646 | 365100 | 0.0 | - |
| 27.2683 | 365150 | 0.0 | - |
| 27.2720 | 365200 | 0.0 | - |
| 27.2758 | 365250 | 0.0 | - |
| 27.2795 | 365300 | 0.0 | - |
| 27.2832 | 365350 | 0.0 | - |
| 27.2870 | 365400 | 0.0002 | - |
| 27.2907 | 365450 | 0.0001 | - |
| 27.2945 | 365500 | 0.0 | - |
| 27.2982 | 365550 | 0.0 | - |
| 27.3019 | 365600 | 0.0 | - |
| 27.3057 | 365650 | 0.0 | - |
| 27.3094 | 365700 | 0.0 | - |
| 27.3131 | 365750 | 0.0 | - |
| 27.3169 | 365800 | 0.0002 | - |
| 27.3206 | 365850 | 0.0002 | - |
| 27.3243 | 365900 | 0.0 | - |
| 27.3281 | 365950 | 0.0 | - |
| 27.3318 | 366000 | 0.0 | - |
| 27.3355 | 366050 | 0.0 | - |
| 27.3393 | 366100 | 0.0 | - |
| 27.3430 | 366150 | 0.0001 | - |
| 27.3467 | 366200 | 0.0 | - |
| 27.3505 | 366250 | 0.0002 | - |
| 27.3542 | 366300 | 0.0002 | - |
| 27.3579 | 366350 | 0.0 | - |
| 27.3617 | 366400 | 0.0002 | - |
| 27.3654 | 366450 | 0.0 | - |
| 27.3691 | 366500 | 0.0002 | - |
| 27.3729 | 366550 | 0.0002 | - |
| 27.3766 | 366600 | 0.0 | - |
| 27.3803 | 366650 | 0.0001 | - |
| 27.3841 | 366700 | 0.0 | - |
| 27.3878 | 366750 | 0.0002 | - |
| 27.3915 | 366800 | 0.0002 | - |
| 27.3953 | 366850 | 0.0 | - |
| 27.3990 | 366900 | 0.0002 | - |
| 27.4027 | 366950 | 0.0 | - |
| 27.4065 | 367000 | 0.0 | - |
| 27.4102 | 367050 | 0.0 | - |
| 27.4139 | 367100 | 0.0 | - |
| 27.4177 | 367150 | 0.0 | - |
| 27.4214 | 367200 | 0.0 | - |
| 27.4251 | 367250 | 0.0002 | - |
| 27.4289 | 367300 | 0.0 | - |
| 27.4326 | 367350 | 0.0002 | - |
| 27.4363 | 367400 | 0.0 | - |
| 27.4401 | 367450 | 0.0001 | - |
| 27.4438 | 367500 | 0.0 | - |
| 27.4475 | 367550 | 0.0 | - |
| 27.4513 | 367600 | 0.0 | - |
| 27.4550 | 367650 | 0.0 | - |
| 27.4587 | 367700 | 0.0 | - |
| 27.4625 | 367750 | 0.0 | - |
| 27.4662 | 367800 | 0.0 | - |
| 27.4699 | 367850 | 0.0 | - |
| 27.4737 | 367900 | 0.0002 | - |
| 27.4774 | 367950 | 0.0 | - |
| 27.4811 | 368000 | 0.0 | - |
| 27.4849 | 368050 | 0.0 | - |
| 27.4886 | 368100 | 0.0002 | - |
| 27.4923 | 368150 | 0.0002 | - |
| 27.4961 | 368200 | 0.0 | - |
| 27.4998 | 368250 | 0.0003 | - |
| 27.5035 | 368300 | 0.0 | - |
| 27.5073 | 368350 | 0.0002 | - |
| 27.5110 | 368400 | 0.0003 | - |
| 27.5147 | 368450 | 0.0 | - |
| 27.5185 | 368500 | 0.0 | - |
| 27.5222 | 368550 | 0.0 | - |
| 27.5260 | 368600 | 0.0 | - |
| 27.5297 | 368650 | 0.0 | - |
| 27.5334 | 368700 | 0.0 | - |
| 27.5372 | 368750 | 0.0003 | - |
| 27.5409 | 368800 | 0.0 | - |
| 27.5446 | 368850 | 0.0002 | - |
| 27.5484 | 368900 | 0.0 | - |
| 27.5521 | 368950 | 0.0 | - |
| 27.5558 | 369000 | 0.0 | - |
| 27.5596 | 369050 | 0.0 | - |
| 27.5633 | 369100 | 0.0002 | - |
| 27.5670 | 369150 | 0.0 | - |
| 27.5708 | 369200 | 0.0 | - |
| 27.5745 | 369250 | 0.0 | - |
| 27.5782 | 369300 | 0.0 | - |
| 27.5820 | 369350 | 0.0 | - |
| 27.5857 | 369400 | 0.0 | - |
| 27.5894 | 369450 | 0.0 | - |
| 27.5932 | 369500 | 0.0 | - |
| 27.5969 | 369550 | 0.0001 | - |
| 27.6006 | 369600 | 0.0005 | - |
| 27.6044 | 369650 | 0.0 | - |
| 27.6081 | 369700 | 0.0 | - |
| 27.6118 | 369750 | 0.0 | - |
| 27.6156 | 369800 | 0.0 | - |
| 27.6193 | 369850 | 0.0 | - |
| 27.6230 | 369900 | 0.0 | - |
| 27.6268 | 369950 | 0.0 | - |
| 27.6305 | 370000 | 0.0 | - |
| 27.6342 | 370050 | 0.0 | - |
| 27.6380 | 370100 | 0.0 | - |
| 27.6417 | 370150 | 0.0 | - |
| 27.6454 | 370200 | 0.0 | - |
| 27.6492 | 370250 | 0.0001 | - |
| 27.6529 | 370300 | 0.0 | - |
| 27.6566 | 370350 | 0.0 | - |
| 27.6604 | 370400 | 0.0002 | - |
| 27.6641 | 370450 | 0.0 | - |
| 27.6678 | 370500 | 0.0002 | - |
| 27.6716 | 370550 | 0.0001 | - |
| 27.6753 | 370600 | 0.0 | - |
| 27.6790 | 370650 | 0.0 | - |
| 27.6828 | 370700 | 0.0 | - |
| 27.6865 | 370750 | 0.0 | - |
| 27.6902 | 370800 | 0.0 | - |
| 27.6940 | 370850 | 0.0 | - |
| 27.6977 | 370900 | 0.0002 | - |
| 27.7014 | 370950 | 0.0 | - |
| 27.7052 | 371000 | 0.0002 | - |
| 27.7089 | 371050 | 0.0 | - |
| 27.7126 | 371100 | 0.0002 | - |
| 27.7164 | 371150 | 0.0 | - |
| 27.7201 | 371200 | 0.0 | - |
| 27.7238 | 371250 | 0.0 | - |
| 27.7276 | 371300 | 0.0002 | - |
| 27.7313 | 371350 | 0.0002 | - |
| 27.7350 | 371400 | 0.0001 | - |
| 27.7388 | 371450 | 0.0 | - |
| 27.7425 | 371500 | 0.0 | - |
| 27.7462 | 371550 | 0.0 | - |
| 27.7500 | 371600 | 0.0 | - |
| 27.7537 | 371650 | 0.0 | - |
| 27.7574 | 371700 | 0.0 | - |
| 27.7612 | 371750 | 0.0 | - |
| 27.7649 | 371800 | 0.0 | - |
| 27.7687 | 371850 | 0.0 | - |
| 27.7724 | 371900 | 0.0 | - |
| 27.7761 | 371950 | 0.0 | - |
| 27.7799 | 372000 | 0.0 | - |
| 27.7836 | 372050 | 0.0002 | - |
| 27.7873 | 372100 | 0.0002 | - |
| 27.7911 | 372150 | 0.0 | - |
| 27.7948 | 372200 | 0.0 | - |
| 27.7985 | 372250 | 0.0002 | - |
| 27.8023 | 372300 | 0.0 | - |
| 27.8060 | 372350 | 0.0 | - |
| 27.8097 | 372400 | 0.0 | - |
| 27.8135 | 372450 | 0.0 | - |
| 27.8172 | 372500 | 0.0002 | - |
| 27.8209 | 372550 | 0.0 | - |
| 27.8247 | 372600 | 0.0 | - |
| 27.8284 | 372650 | 0.0 | - |
| 27.8321 | 372700 | 0.0 | - |
| 27.8359 | 372750 | 0.0 | - |
| 27.8396 | 372800 | 0.0 | - |
| 27.8433 | 372850 | 0.0002 | - |
| 27.8471 | 372900 | 0.0 | - |
| 27.8508 | 372950 | 0.0 | - |
| 27.8545 | 373000 | 0.0 | - |
| 27.8583 | 373050 | 0.0002 | - |
| 27.8620 | 373100 | 0.0 | - |
| 27.8657 | 373150 | 0.0001 | - |
| 27.8695 | 373200 | 0.0001 | - |
| 27.8732 | 373250 | 0.0 | - |
| 27.8769 | 373300 | 0.0002 | - |
| 27.8807 | 373350 | 0.0 | - |
| 27.8844 | 373400 | 0.0 | - |
| 27.8881 | 373450 | 0.0 | - |
| 27.8919 | 373500 | 0.0002 | - |
| 27.8956 | 373550 | 0.0 | - |
| 27.8993 | 373600 | 0.0 | - |
| 27.9031 | 373650 | 0.0002 | - |
| 27.9068 | 373700 | 0.0 | - |
| 27.9105 | 373750 | 0.0 | - |
| 27.9143 | 373800 | 0.0 | - |
| 27.9180 | 373850 | 0.0 | - |
| 27.9217 | 373900 | 0.0002 | - |
| 27.9255 | 373950 | 0.0 | - |
| 27.9292 | 374000 | 0.0 | - |
| 27.9329 | 374050 | 0.0 | - |
| 27.9367 | 374100 | 0.0 | - |
| 27.9404 | 374150 | 0.0003 | - |
| 27.9441 | 374200 | 0.0 | - |
| 27.9479 | 374250 | 0.0 | - |
| 27.9516 | 374300 | 0.0 | - |
| 27.9553 | 374350 | 0.0002 | - |
| 27.9591 | 374400 | 0.0002 | - |
| 27.9628 | 374450 | 0.0 | - |
| 27.9665 | 374500 | 0.0 | - |
| 27.9703 | 374550 | 0.0 | - |
| 27.9740 | 374600 | 0.0 | - |
| 27.9777 | 374650 | 0.0001 | - |
| 27.9815 | 374700 | 0.0 | - |
| 27.9852 | 374750 | 0.0 | - |
| 27.9889 | 374800 | 0.0 | - |
| 27.9927 | 374850 | 0.0001 | - |
| 27.9964 | 374900 | 0.0 | - |
| 28.0001 | 374950 | 0.0 | - |
| 28.0039 | 375000 | 0.0 | - |
| 28.0076 | 375050 | 0.0002 | - |
| 28.0114 | 375100 | 0.0002 | - |
| 28.0151 | 375150 | 0.0001 | - |
| 28.0188 | 375200 | 0.0 | - |
| 28.0226 | 375250 | 0.0002 | - |
| 28.0263 | 375300 | 0.0002 | - |
| 28.0300 | 375350 | 0.0 | - |
| 28.0338 | 375400 | 0.0 | - |
| 28.0375 | 375450 | 0.0 | - |
| 28.0412 | 375500 | 0.0 | - |
| 28.0450 | 375550 | 0.0 | - |
| 28.0487 | 375600 | 0.0 | - |
| 28.0524 | 375650 | 0.0001 | - |
| 28.0562 | 375700 | 0.0 | - |
| 28.0599 | 375750 | 0.0 | - |
| 28.0636 | 375800 | 0.0002 | - |
| 28.0674 | 375850 | 0.0 | - |
| 28.0711 | 375900 | 0.0 | - |
| 28.0748 | 375950 | 0.0 | - |
| 28.0786 | 376000 | 0.0 | - |
| 28.0823 | 376050 | 0.0 | - |
| 28.0860 | 376100 | 0.0 | - |
| 28.0898 | 376150 | 0.0 | - |
| 28.0935 | 376200 | 0.0 | - |
| 28.0972 | 376250 | 0.0 | - |
| 28.1010 | 376300 | 0.0002 | - |
| 28.1047 | 376350 | 0.0002 | - |
| 28.1084 | 376400 | 0.0 | - |
| 28.1122 | 376450 | 0.0 | - |
| 28.1159 | 376500 | 0.0 | - |
| 28.1196 | 376550 | 0.0 | - |
| 28.1234 | 376600 | 0.0 | - |
| 28.1271 | 376650 | 0.0 | - |
| 28.1308 | 376700 | 0.0 | - |
| 28.1346 | 376750 | 0.0 | - |
| 28.1383 | 376800 | 0.0 | - |
| 28.1420 | 376850 | 0.0002 | - |
| 28.1458 | 376900 | 0.0 | - |
| 28.1495 | 376950 | 0.0 | - |
| 28.1532 | 377000 | 0.0 | - |
| 28.1570 | 377050 | 0.0 | - |
| 28.1607 | 377100 | 0.0 | - |
| 28.1644 | 377150 | 0.0002 | - |
| 28.1682 | 377200 | 0.0 | - |
| 28.1719 | 377250 | 0.0 | - |
| 28.1756 | 377300 | 0.0 | - |
| 28.1794 | 377350 | 0.0 | - |
| 28.1831 | 377400 | 0.0 | - |
| 28.1868 | 377450 | 0.0 | - |
| 28.1906 | 377500 | 0.0 | - |
| 28.1943 | 377550 | 0.0 | - |
| 28.1980 | 377600 | 0.0 | - |
| 28.2018 | 377650 | 0.0 | - |
| 28.2055 | 377700 | 0.0002 | - |
| 28.2092 | 377750 | 0.0 | - |
| 28.2130 | 377800 | 0.0 | - |
| 28.2167 | 377850 | 0.0 | - |
| 28.2204 | 377900 | 0.0 | - |
| 28.2242 | 377950 | 0.0002 | - |
| 28.2279 | 378000 | 0.0 | - |
| 28.2316 | 378050 | 0.0 | - |
| 28.2354 | 378100 | 0.0002 | - |
| 28.2391 | 378150 | 0.0 | - |
| 28.2428 | 378200 | 0.0 | - |
| 28.2466 | 378250 | 0.0 | - |
| 28.2503 | 378300 | 0.0002 | - |
| 28.2541 | 378350 | 0.0 | - |
| 28.2578 | 378400 | 0.0 | - |
| 28.2615 | 378450 | 0.0003 | - |
| 28.2653 | 378500 | 0.0 | - |
| 28.2690 | 378550 | 0.0002 | - |
| 28.2727 | 378600 | 0.0 | - |
| 28.2765 | 378650 | 0.0 | - |
| 28.2802 | 378700 | 0.0 | - |
| 28.2839 | 378750 | 0.0 | - |
| 28.2877 | 378800 | 0.0003 | - |
| 28.2914 | 378850 | 0.0 | - |
| 28.2951 | 378900 | 0.0002 | - |
| 28.2989 | 378950 | 0.0 | - |
| 28.3026 | 379000 | 0.0001 | - |
| 28.3063 | 379050 | 0.0 | - |
| 28.3101 | 379100 | 0.0 | - |
| 28.3138 | 379150 | 0.0 | - |
| 28.3175 | 379200 | 0.0 | - |
| 28.3213 | 379250 | 0.0 | - |
| 28.3250 | 379300 | 0.0 | - |
| 28.3287 | 379350 | 0.0002 | - |
| 28.3325 | 379400 | 0.0 | - |
| 28.3362 | 379450 | 0.0 | - |
| 28.3399 | 379500 | 0.0 | - |
| 28.3437 | 379550 | 0.0 | - |
| 28.3474 | 379600 | 0.0001 | - |
| 28.3511 | 379650 | 0.0002 | - |
| 28.3549 | 379700 | 0.0 | - |
| 28.3586 | 379750 | 0.0 | - |
| 28.3623 | 379800 | 0.0 | - |
| 28.3661 | 379850 | 0.0 | - |
| 28.3698 | 379900 | 0.0 | - |
| 28.3735 | 379950 | 0.0 | - |
| 28.3773 | 380000 | 0.0 | - |
| 28.3810 | 380050 | 0.0 | - |
| 28.3847 | 380100 | 0.0 | - |
| 28.3885 | 380150 | 0.0 | - |
| 28.3922 | 380200 | 0.0002 | - |
| 28.3959 | 380250 | 0.0 | - |
| 28.3997 | 380300 | 0.0 | - |
| 28.4034 | 380350 | 0.0 | - |
| 28.4071 | 380400 | 0.0 | - |
| 28.4109 | 380450 | 0.0 | - |
| 28.4146 | 380500 | 0.0 | - |
| 28.4183 | 380550 | 0.0 | - |
| 28.4221 | 380600 | 0.0002 | - |
| 28.4258 | 380650 | 0.0 | - |
| 28.4295 | 380700 | 0.0 | - |
| 28.4333 | 380750 | 0.0 | - |
| 28.4370 | 380800 | 0.0 | - |
| 28.4407 | 380850 | 0.0 | - |
| 28.4445 | 380900 | 0.0 | - |
| 28.4482 | 380950 | 0.0 | - |
| 28.4519 | 381000 | 0.0 | - |
| 28.4557 | 381050 | 0.0 | - |
| 28.4594 | 381100 | 0.0 | - |
| 28.4631 | 381150 | 0.0 | - |
| 28.4669 | 381200 | 0.0 | - |
| 28.4706 | 381250 | 0.0002 | - |
| 28.4743 | 381300 | 0.0 | - |
| 28.4781 | 381350 | 0.0 | - |
| 28.4818 | 381400 | 0.0 | - |
| 28.4855 | 381450 | 0.0002 | - |
| 28.4893 | 381500 | 0.0002 | - |
| 28.4930 | 381550 | 0.0 | - |
| 28.4968 | 381600 | 0.0 | - |
| 28.5005 | 381650 | 0.0 | - |
| 28.5042 | 381700 | 0.0 | - |
| 28.5080 | 381750 | 0.0 | - |
| 28.5117 | 381800 | 0.0002 | - |
| 28.5154 | 381850 | 0.0 | - |
| 28.5192 | 381900 | 0.0 | - |
| 28.5229 | 381950 | 0.0002 | - |
| 28.5266 | 382000 | 0.0 | - |
| 28.5304 | 382050 | 0.0 | - |
| 28.5341 | 382100 | 0.0 | - |
| 28.5378 | 382150 | 0.0 | - |
| 28.5416 | 382200 | 0.0 | - |
| 28.5453 | 382250 | 0.0 | - |
| 28.5490 | 382300 | 0.0 | - |
| 28.5528 | 382350 | 0.0002 | - |
| 28.5565 | 382400 | 0.0 | - |
| 28.5602 | 382450 | 0.0 | - |
| 28.5640 | 382500 | 0.0 | - |
| 28.5677 | 382550 | 0.0 | - |
| 28.5714 | 382600 | 0.0 | - |
| 28.5752 | 382650 | 0.0 | - |
| 28.5789 | 382700 | 0.0 | - |
| 28.5826 | 382750 | 0.0 | - |
| 28.5864 | 382800 | 0.0002 | - |
| 28.5901 | 382850 | 0.0002 | - |
| 28.5938 | 382900 | 0.0 | - |
| 28.5976 | 382950 | 0.0001 | - |
| 28.6013 | 383000 | 0.0 | - |
| 28.6050 | 383050 | 0.0 | - |
| 28.6088 | 383100 | 0.0 | - |
| 28.6125 | 383150 | 0.0 | - |
| 28.6162 | 383200 | 0.0 | - |
| 28.6200 | 383250 | 0.0 | - |
| 28.6237 | 383300 | 0.0002 | - |
| 28.6274 | 383350 | 0.0 | - |
| 28.6312 | 383400 | 0.0 | - |
| 28.6349 | 383450 | 0.0 | - |
| 28.6386 | 383500 | 0.0 | - |
| 28.6424 | 383550 | 0.0 | - |
| 28.6461 | 383600 | 0.0 | - |
| 28.6498 | 383650 | 0.0002 | - |
| 28.6536 | 383700 | 0.0 | - |
| 28.6573 | 383750 | 0.0001 | - |
| 28.6610 | 383800 | 0.0002 | - |
| 28.6648 | 383850 | 0.0 | - |
| 28.6685 | 383900 | 0.0002 | - |
| 28.6722 | 383950 | 0.0 | - |
| 28.6760 | 384000 | 0.0 | - |
| 28.6797 | 384050 | 0.0 | - |
| 28.6834 | 384100 | 0.0 | - |
| 28.6872 | 384150 | 0.0 | - |
| 28.6909 | 384200 | 0.0 | - |
| 28.6946 | 384250 | 0.0 | - |
| 28.6984 | 384300 | 0.0 | - |
| 28.7021 | 384350 | 0.0 | - |
| 28.7058 | 384400 | 0.0 | - |
| 28.7096 | 384450 | 0.0001 | - |
| 28.7133 | 384500 | 0.0 | - |
| 28.7170 | 384550 | 0.0 | - |
| 28.7208 | 384600 | 0.0002 | - |
| 28.7245 | 384650 | 0.0 | - |
| 28.7283 | 384700 | 0.0 | - |
| 28.7320 | 384750 | 0.0 | - |
| 28.7357 | 384800 | 0.0 | - |
| 28.7395 | 384850 | 0.0 | - |
| 28.7432 | 384900 | 0.0 | - |
| 28.7469 | 384950 | 0.0002 | - |
| 28.7507 | 385000 | 0.0 | - |
| 28.7544 | 385050 | 0.0001 | - |
| 28.7581 | 385100 | 0.0 | - |
| 28.7619 | 385150 | 0.0 | - |
| 28.7656 | 385200 | 0.0 | - |
| 28.7693 | 385250 | 0.0 | - |
| 28.7731 | 385300 | 0.0 | - |
| 28.7768 | 385350 | 0.0 | - |
| 28.7805 | 385400 | 0.0 | - |
| 28.7843 | 385450 | 0.0001 | - |
| 28.7880 | 385500 | 0.0 | - |
| 28.7917 | 385550 | 0.0005 | - |
| 28.7955 | 385600 | 0.0 | - |
| 28.7992 | 385650 | 0.0 | - |
| 28.8029 | 385700 | 0.0002 | - |
| 28.8067 | 385750 | 0.0 | - |
| 28.8104 | 385800 | 0.0 | - |
| 28.8141 | 385850 | 0.0 | - |
| 28.8179 | 385900 | 0.0 | - |
| 28.8216 | 385950 | 0.0 | - |
| 28.8253 | 386000 | 0.0002 | - |
| 28.8291 | 386050 | 0.0 | - |
| 28.8328 | 386100 | 0.0 | - |
| 28.8365 | 386150 | 0.0 | - |
| 28.8403 | 386200 | 0.0 | - |
| 28.8440 | 386250 | 0.0 | - |
| 28.8477 | 386300 | 0.0 | - |
| 28.8515 | 386350 | 0.0 | - |
| 28.8552 | 386400 | 0.0 | - |
| 28.8589 | 386450 | 0.0 | - |
| 28.8627 | 386500 | 0.0 | - |
| 28.8664 | 386550 | 0.0 | - |
| 28.8701 | 386600 | 0.0 | - |
| 28.8739 | 386650 | 0.0002 | - |
| 28.8776 | 386700 | 0.0 | - |
| 28.8813 | 386750 | 0.0 | - |
| 28.8851 | 386800 | 0.0 | - |
| 28.8888 | 386850 | 0.0 | - |
| 28.8925 | 386900 | 0.0 | - |
| 28.8963 | 386950 | 0.0002 | - |
| 28.9000 | 387000 | 0.0 | - |
| 28.9037 | 387050 | 0.0 | - |
| 28.9075 | 387100 | 0.0 | - |
| 28.9112 | 387150 | 0.0 | - |
| 28.9149 | 387200 | 0.0002 | - |
| 28.9187 | 387250 | 0.0 | - |
| 28.9224 | 387300 | 0.0 | - |
| 28.9261 | 387350 | 0.0 | - |
| 28.9299 | 387400 | 0.0002 | - |
| 28.9336 | 387450 | 0.0 | - |
| 28.9373 | 387500 | 0.0 | - |
| 28.9411 | 387550 | 0.0 | - |
| 28.9448 | 387600 | 0.0 | - |
| 28.9485 | 387650 | 0.0 | - |
| 28.9523 | 387700 | 0.0 | - |
| 28.9560 | 387750 | 0.0 | - |
| 28.9597 | 387800 | 0.0 | - |
| 28.9635 | 387850 | 0.0 | - |
| 28.9672 | 387900 | 0.0 | - |
| 28.9710 | 387950 | 0.0 | - |
| 28.9747 | 388000 | 0.0 | - |
| 28.9784 | 388050 | 0.0 | - |
| 28.9822 | 388100 | 0.0002 | - |
| 28.9859 | 388150 | 0.0 | - |
| 28.9896 | 388200 | 0.0 | - |
| 28.9934 | 388250 | 0.0002 | - |
| 28.9971 | 388300 | 0.0 | - |
| 29.0008 | 388350 | 0.0001 | - |
| 29.0046 | 388400 | 0.0 | - |
| 29.0083 | 388450 | 0.0 | - |
| 29.0120 | 388500 | 0.0 | - |
| 29.0158 | 388550 | 0.0 | - |
| 29.0195 | 388600 | 0.0 | - |
| 29.0232 | 388650 | 0.0 | - |
| 29.0270 | 388700 | 0.0002 | - |
| 29.0307 | 388750 | 0.0 | - |
| 29.0344 | 388800 | 0.0 | - |
| 29.0382 | 388850 | 0.0 | - |
| 29.0419 | 388900 | 0.0 | - |
| 29.0456 | 388950 | 0.0002 | - |
| 29.0494 | 389000 | 0.0003 | - |
| 29.0531 | 389050 | 0.0002 | - |
| 29.0568 | 389100 | 0.0 | - |
| 29.0606 | 389150 | 0.0002 | - |
| 29.0643 | 389200 | 0.0 | - |
| 29.0680 | 389250 | 0.0001 | - |
| 29.0718 | 389300 | 0.0002 | - |
| 29.0755 | 389350 | 0.0 | - |
| 29.0792 | 389400 | 0.0 | - |
| 29.0830 | 389450 | 0.0 | - |
| 29.0867 | 389500 | 0.0 | - |
| 29.0904 | 389550 | 0.0 | - |
| 29.0942 | 389600 | 0.0 | - |
| 29.0979 | 389650 | 0.0 | - |
| 29.1016 | 389700 | 0.0 | - |
| 29.1054 | 389750 | 0.0002 | - |
| 29.1091 | 389800 | 0.0 | - |
| 29.1128 | 389850 | 0.0 | - |
| 29.1166 | 389900 | 0.0 | - |
| 29.1203 | 389950 | 0.0 | - |
| 29.1240 | 390000 | 0.0 | - |
| 29.1278 | 390050 | 0.0002 | - |
| 29.1315 | 390100 | 0.0 | - |
| 29.1352 | 390150 | 0.0 | - |
| 29.1390 | 390200 | 0.0002 | - |
| 29.1427 | 390250 | 0.0 | - |
| 29.1464 | 390300 | 0.0002 | - |
| 29.1502 | 390350 | 0.0002 | - |
| 29.1539 | 390400 | 0.0 | - |
| 29.1576 | 390450 | 0.0 | - |
| 29.1614 | 390500 | 0.0 | - |
| 29.1651 | 390550 | 0.0 | - |
| 29.1688 | 390600 | 0.0 | - |
| 29.1726 | 390650 | 0.0 | - |
| 29.1763 | 390700 | 0.0 | - |
| 29.1800 | 390750 | 0.0 | - |
| 29.1838 | 390800 | 0.0 | - |
| 29.1875 | 390850 | 0.0 | - |
| 29.1912 | 390900 | 0.0 | - |
| 29.1950 | 390950 | 0.0 | - |
| 29.1987 | 391000 | 0.0 | - |
| 29.2024 | 391050 | 0.0 | - |
| 29.2062 | 391100 | 0.0 | - |
| 29.2099 | 391150 | 0.0 | - |
| 29.2137 | 391200 | 0.0 | - |
| 29.2174 | 391250 | 0.0 | - |
| 29.2211 | 391300 | 0.0 | - |
| 29.2249 | 391350 | 0.0002 | - |
| 29.2286 | 391400 | 0.0 | - |
| 29.2323 | 391450 | 0.0001 | - |
| 29.2361 | 391500 | 0.0 | - |
| 29.2398 | 391550 | 0.0 | - |
| 29.2435 | 391600 | 0.0002 | - |
| 29.2473 | 391650 | 0.0 | - |
| 29.2510 | 391700 | 0.0 | - |
| 29.2547 | 391750 | 0.0 | - |
| 29.2585 | 391800 | 0.0 | - |
| 29.2622 | 391850 | 0.0 | - |
| 29.2659 | 391900 | 0.0 | - |
| 29.2697 | 391950 | 0.0 | - |
| 29.2734 | 392000 | 0.0002 | - |
| 29.2771 | 392050 | 0.0 | - |
| 29.2809 | 392100 | 0.0 | - |
| 29.2846 | 392150 | 0.0 | - |
| 29.2883 | 392200 | 0.0 | - |
| 29.2921 | 392250 | 0.0 | - |
| 29.2958 | 392300 | 0.0001 | - |
| 29.2995 | 392350 | 0.0 | - |
| 29.3033 | 392400 | 0.0 | - |
| 29.3070 | 392450 | 0.0 | - |
| 29.3107 | 392500 | 0.0 | - |
| 29.3145 | 392550 | 0.0002 | - |
| 29.3182 | 392600 | 0.0 | - |
| 29.3219 | 392650 | 0.0 | - |
| 29.3257 | 392700 | 0.0 | - |
| 29.3294 | 392750 | 0.0 | - |
| 29.3331 | 392800 | 0.0 | - |
| 29.3369 | 392850 | 0.0 | - |
| 29.3406 | 392900 | 0.0 | - |
| 29.3443 | 392950 | 0.0 | - |
| 29.3481 | 393000 | 0.0 | - |
| 29.3518 | 393050 | 0.0 | - |
| 29.3555 | 393100 | 0.0 | - |
| 29.3593 | 393150 | 0.0002 | - |
| 29.3630 | 393200 | 0.0 | - |
| 29.3667 | 393250 | 0.0 | - |
| 29.3705 | 393300 | 0.0 | - |
| 29.3742 | 393350 | 0.0 | - |
| 29.3779 | 393400 | 0.0002 | - |
| 29.3817 | 393450 | 0.0 | - |
| 29.3854 | 393500 | 0.0 | - |
| 29.3891 | 393550 | 0.0 | - |
| 29.3929 | 393600 | 0.0002 | - |
| 29.3966 | 393650 | 0.0 | - |
| 29.4003 | 393700 | 0.0 | - |
| 29.4041 | 393750 | 0.0002 | - |
| 29.4078 | 393800 | 0.0 | - |
| 29.4115 | 393850 | 0.0 | - |
| 29.4153 | 393900 | 0.0002 | - |
| 29.4190 | 393950 | 0.0 | - |
| 29.4227 | 394000 | 0.0 | - |
| 29.4265 | 394050 | 0.0 | - |
| 29.4302 | 394100 | 0.0002 | - |
| 29.4339 | 394150 | 0.0001 | - |
| 29.4377 | 394200 | 0.0 | - |
| 29.4414 | 394250 | 0.0 | - |
| 29.4451 | 394300 | 0.0 | - |
| 29.4489 | 394350 | 0.0 | - |
| 29.4526 | 394400 | 0.0001 | - |
| 29.4564 | 394450 | 0.0002 | - |
| 29.4601 | 394500 | 0.0 | - |
| 29.4638 | 394550 | 0.0 | - |
| 29.4676 | 394600 | 0.0 | - |
| 29.4713 | 394650 | 0.0 | - |
| 29.4750 | 394700 | 0.0 | - |
| 29.4788 | 394750 | 0.0 | - |
| 29.4825 | 394800 | 0.0002 | - |
| 29.4862 | 394850 | 0.0 | - |
| 29.4900 | 394900 | 0.0 | - |
| 29.4937 | 394950 | 0.0 | - |
| 29.4974 | 395000 | 0.0 | - |
| 29.5012 | 395050 | 0.0 | - |
| 29.5049 | 395100 | 0.0 | - |
| 29.5086 | 395150 | 0.0 | - |
| 29.5124 | 395200 | 0.0 | - |
| 29.5161 | 395250 | 0.0001 | - |
| 29.5198 | 395300 | 0.0 | - |
| 29.5236 | 395350 | 0.0 | - |
| 29.5273 | 395400 | 0.0 | - |
| 29.5310 | 395450 | 0.0 | - |
| 29.5348 | 395500 | 0.0 | - |
| 29.5385 | 395550 | 0.0002 | - |
| 29.5422 | 395600 | 0.0 | - |
| 29.5460 | 395650 | 0.0 | - |
| 29.5497 | 395700 | 0.0003 | - |
| 29.5534 | 395750 | 0.0002 | - |
| 29.5572 | 395800 | 0.0 | - |
| 29.5609 | 395850 | 0.0 | - |
| 29.5646 | 395900 | 0.0 | - |
| 29.5684 | 395950 | 0.0 | - |
| 29.5721 | 396000 | 0.0 | - |
| 29.5758 | 396050 | 0.0002 | - |
| 29.5796 | 396100 | 0.0 | - |
| 29.5833 | 396150 | 0.0 | - |
| 29.5870 | 396200 | 0.0 | - |
| 29.5908 | 396250 | 0.0002 | - |
| 29.5945 | 396300 | 0.0002 | - |
| 29.5982 | 396350 | 0.0 | - |
| 29.6020 | 396400 | 0.0 | - |
| 29.6057 | 396450 | 0.0 | - |
| 29.6094 | 396500 | 0.0002 | - |
| 29.6132 | 396550 | 0.0 | - |
| 29.6169 | 396600 | 0.0 | - |
| 29.6206 | 396650 | 0.0 | - |
| 29.6244 | 396700 | 0.0 | - |
| 29.6281 | 396750 | 0.0 | - |
| 29.6318 | 396800 | 0.0 | - |
| 29.6356 | 396850 | 0.0 | - |
| 29.6393 | 396900 | 0.0 | - |
| 29.6430 | 396950 | 0.0 | - |
| 29.6468 | 397000 | 0.0 | - |
| 29.6505 | 397050 | 0.0 | - |
| 29.6542 | 397100 | 0.0 | - |
| 29.6580 | 397150 | 0.0 | - |
| 29.6617 | 397200 | 0.0 | - |
| 29.6654 | 397250 | 0.0 | - |
| 29.6692 | 397300 | 0.0 | - |
| 29.6729 | 397350 | 0.0 | - |
| 29.6766 | 397400 | 0.0001 | - |
| 29.6804 | 397450 | 0.0 | - |
| 29.6841 | 397500 | 0.0 | - |
| 29.6879 | 397550 | 0.0 | - |
| 29.6916 | 397600 | 0.0 | - |
| 29.6953 | 397650 | 0.0 | - |
| 29.6991 | 397700 | 0.0002 | - |
| 29.7028 | 397750 | 0.0 | - |
| 29.7065 | 397800 | 0.0 | - |
| 29.7103 | 397850 | 0.0 | - |
| 29.7140 | 397900 | 0.0 | - |
| 29.7177 | 397950 | 0.0 | - |
| 29.7215 | 398000 | 0.0 | - |
| 29.7252 | 398050 | 0.0 | - |
| 29.7289 | 398100 | 0.0 | - |
| 29.7327 | 398150 | 0.0001 | - |
| 29.7364 | 398200 | 0.0002 | - |
| 29.7401 | 398250 | 0.0003 | - |
| 29.7439 | 398300 | 0.0 | - |
| 29.7476 | 398350 | 0.0 | - |
| 29.7513 | 398400 | 0.0 | - |
| 29.7551 | 398450 | 0.0001 | - |
| 29.7588 | 398500 | 0.0 | - |
| 29.7625 | 398550 | 0.0 | - |
| 29.7663 | 398600 | 0.0001 | - |
| 29.7700 | 398650 | 0.0002 | - |
| 29.7737 | 398700 | 0.0 | - |
| 29.7775 | 398750 | 0.0 | - |
| 29.7812 | 398800 | 0.0 | - |
| 29.7849 | 398850 | 0.0002 | - |
| 29.7887 | 398900 | 0.0 | - |
| 29.7924 | 398950 | 0.0 | - |
| 29.7961 | 399000 | 0.0002 | - |
| 29.7999 | 399050 | 0.0 | - |
| 29.8036 | 399100 | 0.0002 | - |
| 29.8073 | 399150 | 0.0 | - |
| 29.8111 | 399200 | 0.0 | - |
| 29.8148 | 399250 | 0.0002 | - |
| 29.8185 | 399300 | 0.0 | - |
| 29.8223 | 399350 | 0.0 | - |
| 29.8260 | 399400 | 0.0 | - |
| 29.8297 | 399450 | 0.0 | - |
| 29.8335 | 399500 | 0.0 | - |
| 29.8372 | 399550 | 0.0002 | - |
| 29.8409 | 399600 | 0.0 | - |
| 29.8447 | 399650 | 0.0 | - |
| 29.8484 | 399700 | 0.0 | - |
| 29.8521 | 399750 | 0.0002 | - |
| 29.8559 | 399800 | 0.0 | - |
| 29.8596 | 399850 | 0.0 | - |
| 29.8633 | 399900 | 0.0 | - |
| 29.8671 | 399950 | 0.0 | - |
| 29.8708 | 400000 | 0.0 | - |
| 29.8745 | 400050 | 0.0 | - |
| 29.8783 | 400100 | 0.0 | - |
| 29.8820 | 400150 | 0.0 | - |
| 29.8857 | 400200 | 0.0 | - |
| 29.8895 | 400250 | 0.0 | - |
| 29.8932 | 400300 | 0.0001 | - |
| 29.8969 | 400350 | 0.0001 | - |
| 29.9007 | 400400 | 0.0 | - |
| 29.9044 | 400450 | 0.0 | - |
| 29.9081 | 400500 | 0.0 | - |
| 29.9119 | 400550 | 0.0002 | - |
| 29.9156 | 400600 | 0.0 | - |
| 29.9193 | 400650 | 0.0 | - |
| 29.9231 | 400700 | 0.0 | - |
| 29.9268 | 400750 | 0.0 | - |
| 29.9306 | 400800 | 0.0 | - |
| 29.9343 | 400850 | 0.0 | - |
| 29.9380 | 400900 | 0.0 | - |
| 29.9418 | 400950 | 0.0 | - |
| 29.9455 | 401000 | 0.0 | - |
| 29.9492 | 401050 | 0.0 | - |
| 29.9530 | 401100 | 0.0 | - |
| 29.9567 | 401150 | 0.0 | - |
| 29.9604 | 401200 | 0.0 | - |
| 29.9642 | 401250 | 0.0001 | - |
| 29.9679 | 401300 | 0.0 | - |
| 29.9716 | 401350 | 0.0 | - |
| 29.9754 | 401400 | 0.0 | - |
| 29.9791 | 401450 | 0.0 | - |
| 29.9828 | 401500 | 0.0 | - |
| 29.9866 | 401550 | 0.0002 | - |
| 29.9903 | 401600 | 0.0 | - |
| 29.9940 | 401650 | 0.0 | - |
| 29.9978 | 401700 | 0.0002 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0
- Sentence Transformers: 3.3.1
- Transformers: 4.44.2
- PyTorch: 2.2.0a0+81ea7a4
- Datasets: 3.2.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
kk-aivio/b34a802b-6ada-491e-9ec8-62a8b3499655 | kk-aivio | 2025-01-26T08:14:51Z | 6 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM2-1.7B",
"base_model:adapter:unsloth/SmolLM2-1.7B",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:12:32Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM2-1.7B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: b34a802b-6ada-491e-9ec8-62a8b3499655
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM2-1.7B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99bea9d9584b8941_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99bea9d9584b8941_train_data.json
type:
field_input: Language
field_instruction: Source
field_output: Content
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: kk-aivio/b34a802b-6ada-491e-9ec8-62a8b3499655
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/99bea9d9584b8941_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
wandb_project: Birthday-SN56-17-Gradients-On-Demand
wandb_run: your_name
wandb_runid: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# b34a802b-6ada-491e-9ec8-62a8b3499655
This model is a fine-tuned version of [unsloth/SmolLM2-1.7B](https://huggingface.co/unsloth/SmolLM2-1.7B) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
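Pending fuller documentation, the sketch below shows one way to load this LoRA adapter on top of its base model with PEFT. It is an illustration only (the repo ids come from this card; everything else is assumed), and since the reported validation loss is `nan`, outputs from the adapter may not be meaningful.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/SmolLM2-1.7B"                              # base model from this card
adapter_id = "kk-aivio/b34a802b-6ada-491e-9ec8-62a8b3499655"  # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)     # apply the LoRA weights

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```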
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0004 | 1 | nan |
| 0.0 | 0.0011 | 3 | nan |
| 0.0 | 0.0022 | 6 | nan |
| 0.0 | 0.0033 | 9 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
OpenGVLab/VisionLLMv2 | OpenGVLab | 2025-01-26T08:11:30Z | 45 | 5 | null | [
"pytorch",
"visionllmv2",
"license:apache-2.0",
"region:us"
] | null | 2025-01-21T13:34:24Z | ---
license: apache-2.0
---
|
AmberYifan/Qwen2.5-7B-sft-ultrachat-hhrlhf | AmberYifan | 2025-01-26T08:04:45Z | 48 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:AmberYifan/Qwen2.5-7B-sft-ultrachat",
"base_model:finetune:AmberYifan/Qwen2.5-7B-sft-ultrachat",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T06:17:42Z | ---
base_model: AmberYifan/Qwen2.5-7B-sft-ultrachat
library_name: transformers
model_name: Qwen2.5-7B-sft-ultrachat-hhrlhf
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for Qwen2.5-7B-sft-ultrachat-hhrlhf
This model is a fine-tuned version of [AmberYifan/Qwen2.5-7B-sft-ultrachat](https://huggingface.co/AmberYifan/Qwen2.5-7B-sft-ultrachat).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="AmberYifan/Qwen2.5-7B-sft-ultrachat-hhrlhf", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
This model was trained with SFT.
### Framework versions
- TRL: 0.12.2
- Transformers: 4.46.3
- Pytorch: 2.5.1+cu118
- Datasets: 3.2.0
- Tokenizers: 0.20.3
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
mradermacher/ORANSight_Mistral_22B_Instruct-GGUF | mradermacher | 2025-01-26T08:04:01Z | 270 | 0 | transformers | [
"transformers",
"gguf",
"text-generation-inference",
"unsloth",
"mistral",
"trl",
"en",
"base_model:NextGLab/ORANSight_Mistral_22B_Instruct",
"base_model:quantized:NextGLab/ORANSight_Mistral_22B_Instruct",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-17T13:27:58Z | ---
base_model: NextGLab/ORANSight_Mistral_22B_Instruct
language:
- en
library_name: transformers
license: other
license_link: https://mistral.ai/licenses/MRL-0.1.md
license_name: mrl
quantized_by: mradermacher
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/NextGLab/ORANSight_Mistral_22B_Instruct
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
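As a minimal sketch (not part of the original card), one way to run a quant from the table below is with llama-cpp-python, assuming you have already downloaded one of the GGUF files, e.g. the Q4_K_M variant:

```python
# Sketch only: llama-cpp-python is one of several runtimes that can read GGUF files.
from llama_cpp import Llama

llm = Llama(
    model_path="ORANSight_Mistral_22B_Instruct.Q4_K_M.gguf",  # file name from the table below
    n_ctx=4096,                                               # arbitrary context length
)
out = llm("Explain the role of the near-RT RIC in O-RAN.", max_tokens=128)
print(out["choices"][0]["text"])
```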
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q2_K.gguf) | Q2_K | 8.4 | |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q3_K_S.gguf) | Q3_K_S | 9.7 | |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q3_K_M.gguf) | Q3_K_M | 10.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q3_K_L.gguf) | Q3_K_L | 11.8 | |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.IQ4_XS.gguf) | IQ4_XS | 12.1 | |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q4_K_S.gguf) | Q4_K_S | 12.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q4_K_M.gguf) | Q4_K_M | 13.4 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q5_K_S.gguf) | Q5_K_S | 15.4 | |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q5_K_M.gguf) | Q5_K_M | 15.8 | |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q6_K.gguf) | Q6_K | 18.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/ORANSight_Mistral_22B_Instruct-GGUF/resolve/main/ORANSight_Mistral_22B_Instruct.Q8_0.gguf) | Q8_0 | 23.7 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mrferr3t/fc885aa7-d5f0-4c83-9f39-c86d07a2f497 | mrferr3t | 2025-01-26T08:03:12Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM-360M-Instruct",
"base_model:adapter:unsloth/SmolLM-360M-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T08:02:05Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM-360M-Instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: fc885aa7-d5f0-4c83-9f39-c86d07a2f497
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM-360M-Instruct
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 4f5a92c6211764d5_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/4f5a92c6211764d5_train_data.json
type:
field_instruction: question
field_output: solution
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: mrferr3t/fc885aa7-d5f0-4c83-9f39-c86d07a2f497
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/4f5a92c6211764d5_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: b232257a-a91b-444e-aedb-3fe497321055
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: b232257a-a91b-444e-aedb-3fe497321055
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# fc885aa7-d5f0-4c83-9f39-c86d07a2f497
This model is a fine-tuned version of [unsloth/SmolLM-360M-Instruct](https://huggingface.co/unsloth/SmolLM-360M-Instruct) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2302
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
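For readers less familiar with axolotl, the adapter settings in the config above (lora_r: 8, lora_alpha: 16, lora_dropout: 0.05, lora_target_linear: true) correspond roughly to the PEFT `LoraConfig` sketched below. This is an illustrative approximation, not the exact code axolotl executes.

```python
from peft import LoraConfig

# Rough PEFT equivalent of the axolotl LoRA settings above (sketch only).
lora_config = LoraConfig(
    r=8,                          # lora_r
    lora_alpha=16,                # lora_alpha
    lora_dropout=0.05,            # lora_dropout
    target_modules="all-linear",  # approximation of lora_target_linear: true
    task_type="CAUSAL_LM",
)
```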
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.1681 | 0.0018 | 1 | 1.2326 |
| 1.3519 | 0.0053 | 3 | 1.2322 |
| 1.0897 | 0.0105 | 6 | 1.2319 |
| 1.1926 | 0.0158 | 9 | 1.2302 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.3.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1 |
danielcfho/qwen2-0.5b-instruct-mlx-q4 | danielcfho | 2025-01-26T07:59:43Z | 5 | 0 | mlx | [
"mlx",
"safetensors",
"qwen2",
"chat",
"text-generation",
"conversational",
"en",
"base_model:danielcfho/qwen2-0.5b-instruct-mlx",
"base_model:quantized:danielcfho/qwen2-0.5b-instruct-mlx",
"license:apache-2.0",
"4-bit",
"region:us"
] | text-generation | 2025-01-26T07:59:22Z | ---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
- mlx
base_model: danielcfho/qwen2-0.5b-instruct-mlx
---
# danielcfho/qwen2-0.5b-instruct-mlx-q4
The Model [danielcfho/qwen2-0.5b-instruct-mlx-q4](https://huggingface.co/danielcfho/qwen2-0.5b-instruct-mlx-q4) was
converted to MLX format from [danielcfho/qwen2-0.5b-instruct-mlx](https://huggingface.co/danielcfho/qwen2-0.5b-instruct-mlx)
using mlx-lm version **0.21.1**.
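A conversion like this can usually be reproduced with mlx-lm's convert utility. The snippet below is a hedged sketch: the output path is arbitrary, and the exact `convert` signature may differ between mlx-lm releases.

```python
# Sketch only: 4-bit MLX conversion with mlx-lm (assumed CLI equivalent:
#   python -m mlx_lm.convert --hf-path danielcfho/qwen2-0.5b-instruct-mlx -q).
from mlx_lm import convert

convert(
    "danielcfho/qwen2-0.5b-instruct-mlx",   # source repo named in this card
    mlx_path="qwen2-0.5b-instruct-mlx-q4",  # local output directory (arbitrary)
    quantize=True,                          # produce a quantized (4-bit) conversion
)
```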
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate
model, tokenizer = load("danielcfho/qwen2-0.5b-instruct-mlx-q4")
prompt = "hello"
if tokenizer.chat_template is not None:
messages = [{"role": "user", "content": prompt}]
prompt = tokenizer.apply_chat_template(
messages, add_generation_prompt=True
)
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
|
sky-2002/SmolLM-135M-finance-cot-finetuned | sky-2002 | 2025-01-26T07:57:44Z | 12 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:HuggingFaceTB/SmolLM-135M-Instruct",
"base_model:finetune:HuggingFaceTB/SmolLM-135M-Instruct",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-25T10:22:13Z | ---
base_model: HuggingFaceTB/SmolLM-135M-Instruct
library_name: transformers
model_name: outputs
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for outputs
This model is a fine-tuned version of [HuggingFaceTB/SmolLM-135M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct).
It has been trained using [TRL](https://github.com/huggingface/trl) on the [Mariaaaaa/Finance_COT_GPT4](https://huggingface.co/datasets/Mariaaaaa/Finance_COT_GPT4) dataset.
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="sky-2002/outputs", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/aathatte2002-indian-institute-of-technology/huggingface/runs/i0h0suex)
This model was trained with SFT.
### Framework versions
- TRL: 0.13.0
- Transformers: 4.47.1
- Pytorch: 2.5.1+cu121
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
adammandic87/82bbaf78-9ed2-4bdb-90a7-19c2423bc666 | adammandic87 | 2025-01-26T07:57:06Z | 6 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:upstage/SOLAR-10.7B-Instruct-v1.0",
"base_model:adapter:upstage/SOLAR-10.7B-Instruct-v1.0",
"license:cc-by-nc-4.0",
"region:us"
] | null | 2025-01-26T07:54:56Z | ---
library_name: peft
license: cc-by-nc-4.0
base_model: upstage/SOLAR-10.7B-Instruct-v1.0
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 82bbaf78-9ed2-4bdb-90a7-19c2423bc666
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: upstage/SOLAR-10.7B-Instruct-v1.0
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- eb63389fa64e5fd6_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/eb63389fa64e5fd6_train_data.json
type:
field_input: choices
field_instruction: question
field_output: messages
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: adammandic87/82bbaf78-9ed2-4bdb-90a7-19c2423bc666
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/eb63389fa64e5fd6_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 95a2ad21-7343-4458-abd1-e291201a5d59
wandb_project: Birthday-SN56-13-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 95a2ad21-7343-4458-abd1-e291201a5d59
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 82bbaf78-9ed2-4bdb-90a7-19c2423bc666
This model is a fine-tuned version of [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
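As an illustration only, these settings map roughly onto the Transformers `TrainingArguments` below; the actual run was driven by axolotl, so treat this as a sketch rather than the exact configuration.

```python
from transformers import TrainingArguments

# Approximate TrainingArguments matching the hyperparameters listed above.
args = TrainingArguments(
    output_dir="miner_id_24",         # from the axolotl config
    learning_rate=2e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,    # effective batch size 2 * 4 = 8
    optim="adamw_bnb_8bit",
    lr_scheduler_type="cosine",
    warmup_steps=10,
    max_steps=10,
    seed=42,
)
```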
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0011 | 1 | nan |
| 0.0 | 0.0032 | 3 | nan |
| 0.0 | 0.0065 | 6 | nan |
| 0.0 | 0.0097 | 9 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
danielcfho/qwen2-0.5b-instruct-mlx | danielcfho | 2025-01-26T07:53:56Z | 18 | 0 | mlx | [
"mlx",
"safetensors",
"qwen2",
"chat",
"text-generation",
"conversational",
"en",
"base_model:Qwen/Qwen2-0.5B-Instruct-MLX",
"base_model:finetune:Qwen/Qwen2-0.5B-Instruct-MLX",
"license:apache-2.0",
"region:us"
] | text-generation | 2025-01-26T07:24:47Z | ---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
- mlx
base_model: Qwen/Qwen2-0.5B-Instruct-MLX
---
# danielcfho/qwen2-0.5b-instruct-mlx
The Model [danielcfho/qwen2-0.5b-instruct-mlx](https://huggingface.co/danielcfho/qwen2-0.5b-instruct-mlx) was
converted to MLX format from [Qwen/Qwen2-0.5B-Instruct-MLX](https://huggingface.co/Qwen/Qwen2-0.5B-Instruct-MLX)
using mlx-lm version **0.21.1**.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate
model, tokenizer = load("danielcfho/qwen2-0.5b-instruct-mlx")
prompt = "hello"
if tokenizer.chat_template is not None:
messages = [{"role": "user", "content": prompt}]
prompt = tokenizer.apply_chat_template(
messages, add_generation_prompt=True
)
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
|
daniel40/ffa6cb1b-0e78-4de2-8ace-b8577da6006a | daniel40 | 2025-01-26T07:53:36Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2_moe",
"axolotl",
"generated_from_trainer",
"base_model:katuni4ka/tiny-random-qwen1.5-moe",
"base_model:adapter:katuni4ka/tiny-random-qwen1.5-moe",
"region:us"
] | null | 2025-01-26T07:38:21Z | ---
library_name: peft
base_model: katuni4ka/tiny-random-qwen1.5-moe
tags:
- axolotl
- generated_from_trainer
model-index:
- name: ffa6cb1b-0e78-4de2-8ace-b8577da6006a
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: katuni4ka/tiny-random-qwen1.5-moe
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 3bfcb782a3f0e2ac_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/3bfcb782a3f0e2ac_train_data.json
type:
field_instruction: problem
field_output: target_answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: daniel40/ffa6cb1b-0e78-4de2-8ace-b8577da6006a
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/3bfcb782a3f0e2ac_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: c54ab670-3720-4ec3-a8b4-4391881ada3a
wandb_project: Birthday-SN56-28-Gradients-On-Demand
wandb_run: your_name
wandb_runid: c54ab670-3720-4ec3-a8b4-4391881ada3a
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# ffa6cb1b-0e78-4de2-8ace-b8577da6006a
This model is a fine-tuned version of [katuni4ka/tiny-random-qwen1.5-moe](https://huggingface.co/katuni4ka/tiny-random-qwen1.5-moe) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9132
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.9211 | 0.0000 | 1 | 11.9166 |
| 11.9028 | 0.0001 | 3 | 11.9164 |
| 11.9413 | 0.0001 | 6 | 11.9152 |
| 11.9142 | 0.0002 | 9 | 11.9132 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
kostiantynk-out/5389d139-1a3d-45ae-8068-1edb747f73da | kostiantynk-out | 2025-01-26T07:53:25Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:peft-internal-testing/tiny-dummy-qwen2",
"base_model:adapter:peft-internal-testing/tiny-dummy-qwen2",
"region:us"
] | null | 2025-01-26T07:52:58Z | ---
library_name: peft
base_model: peft-internal-testing/tiny-dummy-qwen2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 5389d139-1a3d-45ae-8068-1edb747f73da
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: peft-internal-testing/tiny-dummy-qwen2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99a1d9d467a445dc_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99a1d9d467a445dc_train_data.json
type:
field_instruction: question
field_output: answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: kostiantynk-out/5389d139-1a3d-45ae-8068-1edb747f73da
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/99a1d9d467a445dc_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 2f1a76ed-541f-4b8a-8491-311724d62463
wandb_project: Mine-SN56-1-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 2f1a76ed-541f-4b8a-8491-311724d62463
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 5389d139-1a3d-45ae-8068-1edb747f73da
This model is a fine-tuned version of [peft-internal-testing/tiny-dummy-qwen2](https://huggingface.co/peft-internal-testing/tiny-dummy-qwen2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9396
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.9406 | 0.0015 | 1 | 11.9398 |
| 11.9371 | 0.0045 | 3 | 11.9398 |
| 11.9392 | 0.0090 | 6 | 11.9397 |
| 11.9412 | 0.0135 | 9 | 11.9396 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
philip-hightech/7c5d4cb2-bd7e-4a59-bf32-ddfa2f0d498e | philip-hightech | 2025-01-26T07:52:21Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:peft-internal-testing/tiny-dummy-qwen2",
"base_model:adapter:peft-internal-testing/tiny-dummy-qwen2",
"region:us"
] | null | 2025-01-26T07:51:55Z | ---
library_name: peft
base_model: peft-internal-testing/tiny-dummy-qwen2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 7c5d4cb2-bd7e-4a59-bf32-ddfa2f0d498e
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: peft-internal-testing/tiny-dummy-qwen2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99a1d9d467a445dc_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99a1d9d467a445dc_train_data.json
type:
field_instruction: question
field_output: answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: philip-hightech/7c5d4cb2-bd7e-4a59-bf32-ddfa2f0d498e
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/99a1d9d467a445dc_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 2f1a76ed-541f-4b8a-8491-311724d62463
wandb_project: Mine-SN56-21-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 2f1a76ed-541f-4b8a-8491-311724d62463
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 7c5d4cb2-bd7e-4a59-bf32-ddfa2f0d498e
This model is a fine-tuned version of [peft-internal-testing/tiny-dummy-qwen2](https://huggingface.co/peft-internal-testing/tiny-dummy-qwen2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9396
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.9406 | 0.0015 | 1 | 11.9398 |
| 11.9371 | 0.0045 | 3 | 11.9398 |
| 11.9392 | 0.0090 | 6 | 11.9397 |
| 11.9412 | 0.0135 | 9 | 11.9396 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF | mradermacher | 2025-01-26T07:47:37Z | 9,774 | 0 | transformers | [
"transformers",
"gguf",
"en",
"base_model:FuseAI/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview",
"base_model:quantized:FuseAI/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-22T02:57:23Z | ---
base_model: FuseAI/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/FuseAI/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
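As a hedged illustration (not from the original card), the snippet below downloads one of the chat quants listed further down and runs it with llama-cpp-python's chat API; the repo id and file name are taken from this card, everything else is assumed.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch the Q4_K_M file listed in the quant table below (~20 GB).
path = hf_hub_download(
    repo_id="mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF",
    filename="FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q4_K_M.gguf",
)

llm = Llama(model_path=path, n_ctx=8192)  # context length is an arbitrary choice
reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is 17 * 23?"}],
    max_tokens=256,
)
print(reply["choices"][0]["message"]["content"])
```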
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q2_K.gguf) | Q2_K | 12.4 | |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q3_K_S.gguf) | Q3_K_S | 14.5 | |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q3_K_M.gguf) | Q3_K_M | 16.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q3_K_L.gguf) | Q3_K_L | 17.3 | |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.IQ4_XS.gguf) | IQ4_XS | 18.0 | |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q4_K_S.gguf) | Q4_K_S | 18.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q4_K_M.gguf) | Q4_K_M | 20.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q5_K_S.gguf) | Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q5_K_M.gguf) | Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q6_K.gguf) | Q6_K | 27.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview-GGUF/resolve/main/FuseO1-DeepSeekR1-Qwen2.5-Instruct-32B-Preview.Q8_0.gguf) | Q8_0 | 34.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
Primeness/primeh4v7c2 | Primeness | 2025-01-26T07:47:15Z | 20 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T07:14:52Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
winnieyangwannan/gemma-2b-it-honest_lying | winnieyangwannan | 2025-01-26T07:46:55Z | 139 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"generated_from_trainer",
"gemma-2b-it",
"honest_lying",
"trl",
"sft",
"conversational",
"base_model:google/gemma-2b-it",
"base_model:finetune:google/gemma-2b-it",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-10T08:51:08Z | ---
base_model: google/gemma-2b-it
library_name: transformers
model_name: gemma-2b-it-honest_lying
tags:
- generated_from_trainer
- gemma-2b-it
- honest_lying
- trl
- sft
licence: license
---
# Model Card for gemma-2b-it-honest_lying
This model is a fine-tuned version of [google/gemma-2b-it](https://huggingface.co/google/gemma-2b-it).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="winnieyangwannan/gemma-2b-it-honest_lying", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/winnie96/huggingface/runs/vcgjka60)
This model was trained with SFT.
### Framework versions
- TRL: 0.14.0.dev0
- Transformers: 4.47.1
- Pytorch: 2.3.1+cu118
- Datasets: 3.2.0
- Tokenizers: 0.21.0
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
mradermacher/Taiwan-tinyllama-v1.1-base-GGUF | mradermacher | 2025-01-26T07:42:33Z | 181 | 0 | transformers | [
"transformers",
"gguf",
"zh",
"dataset:benchang1110/Taiwan-pretrain-9B",
"dataset:benchang1110/Taiwan-book-1B",
"base_model:benchang1110/Taiwan-tinyllama-v1.1-base",
"base_model:quantized:benchang1110/Taiwan-tinyllama-v1.1-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2025-01-24T16:50:02Z | ---
base_model: benchang1110/Taiwan-tinyllama-v1.1-base
datasets:
- benchang1110/Taiwan-pretrain-9B
- benchang1110/Taiwan-book-1B
language:
- zh
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags: []
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/benchang1110/Taiwan-tinyllama-v1.1-base
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q2_K.gguf) | Q2_K | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q3_K_S.gguf) | Q3_K_S | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q3_K_M.gguf) | Q3_K_M | 0.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q3_K_L.gguf) | Q3_K_L | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.IQ4_XS.gguf) | IQ4_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q4_K_S.gguf) | Q4_K_S | 0.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q4_K_M.gguf) | Q4_K_M | 0.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q5_K_S.gguf) | Q5_K_S | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q5_K_M.gguf) | Q5_K_M | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q6_K.gguf) | Q6_K | 1.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.Q8_0.gguf) | Q8_0 | 1.3 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Taiwan-tinyllama-v1.1-base-GGUF/resolve/main/Taiwan-tinyllama-v1.1-base.f16.gguf) | f16 | 2.3 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF | mradermacher | 2025-01-26T07:42:03Z | 452 | 0 | transformers | [
"transformers",
"gguf",
"openchat",
"mistral",
"C-RLFT",
"en",
"dataset:openchat/openchat_sharegpt4_dataset",
"dataset:imone/OpenOrca_FLAN",
"dataset:LDJnr/LessWrong-Amplify-Instruct",
"dataset:LDJnr/Pure-Dove",
"dataset:LDJnr/Verified-Camel",
"dataset:tiedong/goat",
"dataset:glaiveai/glaive-code-assistant",
"dataset:meta-math/MetaMathQA",
"dataset:OpenAssistant/oasst_top1_2023-08-25",
"dataset:TIGER-Lab/MathInstruct",
"base_model:SolaireOfTheSun/openchat_3.5-EducationAID-Biologie",
"base_model:quantized:SolaireOfTheSun/openchat_3.5-EducationAID-Biologie",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-01-24T20:44:30Z | ---
base_model: SolaireOfTheSun/openchat_3.5-EducationAID-Biologie
datasets:
- openchat/openchat_sharegpt4_dataset
- imone/OpenOrca_FLAN
- LDJnr/LessWrong-Amplify-Instruct
- LDJnr/Pure-Dove
- LDJnr/Verified-Camel
- tiedong/goat
- glaiveai/glaive-code-assistant
- meta-math/MetaMathQA
- OpenAssistant/oasst_top1_2023-08-25
- TIGER-Lab/MathInstruct
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- openchat
- mistral
- C-RLFT
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/SolaireOfTheSun/openchat_3.5-EducationAID-Biologie
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ1_S.gguf) | i1-IQ1_S | 1.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ1_M.gguf) | i1-IQ1_M | 1.9 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ2_S.gguf) | i1-IQ2_S | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ2_M.gguf) | i1-IQ2_M | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q2_K_S.gguf) | i1-Q2_K_S | 2.6 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q2_K.gguf) | i1-Q2_K | 2.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 2.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ3_M.gguf) | i1-IQ3_M | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.6 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q3_K_L.gguf) | i1-Q3_K_L | 3.9 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q4_0.gguf) | i1-Q4_0 | 4.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-IQ4_NL.gguf) | i1-IQ4_NL | 4.2 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q4_1.gguf) | i1-Q4_1 | 4.7 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/openchat_3.5-EducationAID-Biologie-i1-GGUF/resolve/main/openchat_3.5-EducationAID-Biologie.i1-Q6_K.gguf) | i1-Q6_K | 6.0 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Trendyol-Turkcell-7b-mixture-GGUF | mradermacher | 2025-01-26T07:41:54Z | 112 | 0 | transformers | [
"transformers",
"gguf",
"merge",
"mergekit",
"lazymergekit",
"Trendyol/Trendyol-LLM-7b-chat-v1.0",
"TURKCELL/Turkcell-LLM-7b-v1",
"en",
"base_model:burak/Trendyol-Turkcell-7b-mixture",
"base_model:quantized:burak/Trendyol-Turkcell-7b-mixture",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-24T21:48:09Z | ---
base_model: burak/Trendyol-Turkcell-7b-mixture
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- merge
- mergekit
- lazymergekit
- Trendyol/Trendyol-LLM-7b-chat-v1.0
- TURKCELL/Turkcell-LLM-7b-v1
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/burak/Trendyol-Turkcell-7b-mixture
<!-- provided-files -->
weighted/imatrix quants do not seem to be available (from me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Trendyol-Turkcell-7b-mixture-GGUF/resolve/main/Trendyol-Turkcell-7b-mixture.Q2_K.gguf) | Q2_K | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Trendyol-Turkcell-7b-mixture-GGUF/resolve/main/Trendyol-Turkcell-7b-mixture.Q3_K_M.gguf) | Q3_K_M | 3.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Trendyol-Turkcell-7b-mixture-GGUF/resolve/main/Trendyol-Turkcell-7b-mixture.Q4_K_S.gguf) | Q4_K_S | 4.3 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Trendyol-Turkcell-7b-mixture-GGUF/resolve/main/Trendyol-Turkcell-7b-mixture.Q6_K.gguf) | Q6_K | 6.1 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Trendyol-Turkcell-7b-mixture-GGUF/resolve/main/Trendyol-Turkcell-7b-mixture.Q8_0.gguf) | Q8_0 | 7.9 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Trendyol-Turkcell-7b-mixture-GGUF/resolve/main/Trendyol-Turkcell-7b-mixture.f16.gguf) | f16 | 14.8 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
nat-hunt/7d61a0d1-1cb3-405a-971c-c082869a5af0 | nat-hunt | 2025-01-26T07:41:24Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:peft-internal-testing/tiny-dummy-qwen2",
"base_model:adapter:peft-internal-testing/tiny-dummy-qwen2",
"region:us"
] | null | 2025-01-26T07:40:56Z | ---
library_name: peft
base_model: peft-internal-testing/tiny-dummy-qwen2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 7d61a0d1-1cb3-405a-971c-c082869a5af0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: peft-internal-testing/tiny-dummy-qwen2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99a1d9d467a445dc_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99a1d9d467a445dc_train_data.json
type:
field_instruction: question
field_output: answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: nat-hunt/7d61a0d1-1cb3-405a-971c-c082869a5af0
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 50
micro_batch_size: 2
mlflow_experiment_name: /tmp/99a1d9d467a445dc_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 2f1a76ed-541f-4b8a-8491-311724d62463
wandb_project: Birthday-SN56-4-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 2f1a76ed-541f-4b8a-8491-311724d62463
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 7d61a0d1-1cb3-405a-971c-c082869a5af0
This model is a fine-tuned version of [peft-internal-testing/tiny-dummy-qwen2](https://huggingface.co/peft-internal-testing/tiny-dummy-qwen2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9394
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Use OptimizerNames.ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.9406 | 0.0015 | 1 | 11.9398 |
| 11.9392 | 0.0195 | 13 | 11.9397 |
| 11.9403 | 0.0391 | 26 | 11.9394 |
| 11.9417 | 0.0586 | 39 | 11.9394 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
kokovova/a103f5c0-9593-4378-8fe6-b72c939bead0 | kokovova | 2025-01-26T07:40:54Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2_moe",
"axolotl",
"generated_from_trainer",
"base_model:katuni4ka/tiny-random-qwen1.5-moe",
"base_model:adapter:katuni4ka/tiny-random-qwen1.5-moe",
"region:us"
] | null | 2025-01-26T07:21:01Z | ---
library_name: peft
base_model: katuni4ka/tiny-random-qwen1.5-moe
tags:
- axolotl
- generated_from_trainer
model-index:
- name: a103f5c0-9593-4378-8fe6-b72c939bead0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: katuni4ka/tiny-random-qwen1.5-moe
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 3bfcb782a3f0e2ac_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/3bfcb782a3f0e2ac_train_data.json
type:
field_instruction: problem
field_output: target_answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device: cuda
early_stopping_patience: 1
eval_max_new_tokens: 128
eval_steps: 5
eval_table_size: null
evals_per_epoch: null
flash_attention: false
fp16: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: kokovova/a103f5c0-9593-4378-8fe6-b72c939bead0
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_memory:
0: 79GiB
max_steps: 30
micro_batch_size: 4
mlflow_experiment_name: /tmp/3bfcb782a3f0e2ac_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_torch
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 10
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: true
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: c54ab670-3720-4ec3-a8b4-4391881ada3a
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: c54ab670-3720-4ec3-a8b4-4391881ada3a
warmup_steps: 5
weight_decay: 0.001
xformers_attention: true
```
</details><br>
# a103f5c0-9593-4378-8fe6-b72c939bead0
This model is a fine-tuned version of [katuni4ka/tiny-random-qwen1.5-moe](https://huggingface.co/katuni4ka/tiny-random-qwen1.5-moe) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9268
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=adam_beta1=0.9,adam_beta2=0.95,adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0000 | 1 | 11.9352 |
| 11.9391 | 0.0002 | 5 | 11.9343 |
| 11.9359 | 0.0004 | 10 | 11.9318 |
| 11.9282 | 0.0007 | 15 | 11.9294 |
| 11.9258 | 0.0009 | 20 | 11.9278 |
| 11.9312 | 0.0011 | 25 | 11.9269 |
| 11.9227 | 0.0013 | 30 | 11.9268 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
Luongdzung/hoa-1b4-sft-phy-olora | Luongdzung | 2025-01-26T07:40:20Z | 9 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"generated_from_trainer",
"base_model:vlsp-2023-vllm/hoa-1b4",
"base_model:adapter:vlsp-2023-vllm/hoa-1b4",
"license:bigscience-bloom-rail-1.0",
"region:us"
] | null | 2025-01-26T07:40:17Z | ---
library_name: peft
license: bigscience-bloom-rail-1.0
base_model: vlsp-2023-vllm/hoa-1b4
tags:
- generated_from_trainer
model-index:
- name: hoa-1b4-sft-phy-olora
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hoa-1b4-sft-phy-olora
This model is a fine-tuned version of [vlsp-2023-vllm/hoa-1b4](https://huggingface.co/vlsp-2023-vllm/hoa-1b4) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
### Framework versions
- PEFT 0.14.0
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1 |
nathanialhunt/6da84e18-9c62-4e59-b67f-ac14c1658216 | nathanialhunt | 2025-01-26T07:39:08Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2_moe",
"axolotl",
"generated_from_trainer",
"base_model:katuni4ka/tiny-random-qwen1.5-moe",
"base_model:adapter:katuni4ka/tiny-random-qwen1.5-moe",
"region:us"
] | null | 2025-01-26T07:21:35Z | ---
library_name: peft
base_model: katuni4ka/tiny-random-qwen1.5-moe
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 6da84e18-9c62-4e59-b67f-ac14c1658216
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: katuni4ka/tiny-random-qwen1.5-moe
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 3bfcb782a3f0e2ac_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/3bfcb782a3f0e2ac_train_data.json
type:
field_instruction: problem
field_output: target_answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: nathanialhunt/6da84e18-9c62-4e59-b67f-ac14c1658216
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 50
micro_batch_size: 2
mlflow_experiment_name: /tmp/3bfcb782a3f0e2ac_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: c54ab670-3720-4ec3-a8b4-4391881ada3a
wandb_project: Birthday-SN56-5-Gradients-On-Demand
wandb_run: your_name
wandb_runid: c54ab670-3720-4ec3-a8b4-4391881ada3a
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 6da84e18-9c62-4e59-b67f-ac14c1658216
This model is a fine-tuned version of [katuni4ka/tiny-random-qwen1.5-moe](https://huggingface.co/katuni4ka/tiny-random-qwen1.5-moe) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9106
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Use OptimizerNames.ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.9211 | 0.0000 | 1 | 11.9166 |
| 11.9176 | 0.0003 | 13 | 11.9148 |
| 11.9149 | 0.0006 | 26 | 11.9120 |
| 11.8968 | 0.0009 | 39 | 11.9106 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF | mradermacher | 2025-01-26T07:38:41Z | 241,489 | 38 | transformers | [
"transformers",
"gguf",
"generated_from_trainer",
"en",
"dataset:Guilherme34/uncensor",
"base_model:nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored",
"base_model:quantized:nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored",
"license:mit",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-26T02:35:09Z | ---
base_model: nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored
datasets:
- Guilherme34/uncensor
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
tags:
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
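As a hedged illustration (not an official recipe for this repo), a single quant can be fetched and run with `llama-cpp-python`; the Q4_K_S file listed below is roughly 19 GB, so GPU offload is optional and depends on your hardware.
```python
# Minimal sketch (assumptions: llama-cpp-python and huggingface_hub installed; enough disk space/RAM).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF",
    filename="DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q4_K_S.gguf",  # see the table below
)
# n_gpu_layers=-1 offloads all layers to the GPU if it fits; set it to 0 for CPU-only inference.
llm = Llama(model_path=gguf_path, n_ctx=4096, n_gpu_layers=-1)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what this model is in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```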
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q2_K.gguf) | Q2_K | 12.4 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q3_K_S.gguf) | Q3_K_S | 14.5 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q3_K_M.gguf) | Q3_K_M | 16.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q3_K_L.gguf) | Q3_K_L | 17.3 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.IQ4_XS.gguf) | IQ4_XS | 18.0 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q4_K_S.gguf) | Q4_K_S | 18.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q4_K_M.gguf) | Q4_K_M | 19.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q5_K_S.gguf) | Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q5_K_M.gguf) | Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q6_K.gguf) | Q6_K | 27.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/DeepSeek-R1-Distill-Qwen-32B-Uncensored-GGUF/resolve/main/DeepSeek-R1-Distill-Qwen-32B-Uncensored.Q8_0.gguf) | Q8_0 | 34.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
NextGLab/ORANSight_LLama_8B_Instruct | NextGLab | 2025-01-26T07:38:20Z | 36 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"arxiv:2407.06245",
"base_model:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"base_model:finetune:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"license:llama3.1",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-12-27T07:27:57Z | ---
base_model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
license: llama3.1
language:
- en
---
# Model Card for ORANSight Llama-8B
This model belongs to the first release of the ORANSight family of models.
- **Developed by:** NextG lab@ NC State
- **License:** llama3.1
- **Context Window:** 128K tokens
- **Fine Tuning Framework:** Unsloth
### Generate with Transformers
Below is a quick example of how to use the model with Hugging Face Transformers:
```python
from transformers import pipeline
# Example query
messages = [
{"role": "system", "content": "You are an O-RAN expert assistant."},
{"role": "user", "content": "Explain the E2 interface."},
]
# Load the model
chatbot = pipeline("text-generation", model="NextGLab/ORANSight_LLama_8B_Instruct")
result = chatbot(messages)
print(result)
```
### Coming Soon
A detailed paper documenting the experiments and results achieved with this model will be available soon. In the meantime, if you try this model, please cite the paper below to acknowledge the foundational work that enabled this fine-tuning.
```bibtex
@article{gajjar2024oran,
title={Oran-bench-13k: An open source benchmark for assessing llms in open radio access networks},
author={Gajjar, Pranshav and Shah, Vijay K},
journal={arXiv preprint arXiv:2407.06245},
year={2024}
}
```
--- |
denbeo/6a0ea58e-5bcb-4ecc-b0c4-36866e86e1c9 | denbeo | 2025-01-26T07:38:09Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"base_model:adapter:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"license:other",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T07:12:00Z | ---
library_name: peft
license: other
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 6a0ea58e-5bcb-4ecc-b0c4-36866e86e1c9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- fbd16596dc609498_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/fbd16596dc609498_train_data.json
type:
field_input: Option 1
field_instruction: Domain
field_output: Question
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: denbeo/6a0ea58e-5bcb-4ecc-b0c4-36866e86e1c9
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/fbd16596dc609498_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 7f264b18-8600-4682-ae3a-1dbab5576cea
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 7f264b18-8600-4682-ae3a-1dbab5576cea
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 6a0ea58e-5bcb-4ecc-b0c4-36866e86e1c9
This model is a fine-tuned version of [NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8795
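This repository contains a PEFT/LoRA adapter (see the framework versions below), so it is typically applied on top of the listed base model. The following is a minimal loading sketch, assuming `peft` and `transformers` are installed; it is not part of the original training pipeline.
```python
# Minimal sketch (assumption: standard PEFT adapter loading on the base model named in this card).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer"
adapter_id = "denbeo/6a0ea58e-5bcb-4ecc-b0c4-36866e86e1c9"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attaches the LoRA weights

inputs = tokenizer("Domain: general knowledge", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```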
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Use OptimizerNames.ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.8502 | 0.6116 | 200 | 1.8795 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
aleegis09/af384fc4-68e1-4b76-b00b-34d533b8b5c9 | aleegis09 | 2025-01-26T07:35:48Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:trl-internal-testing/tiny-random-LlamaForCausalLM",
"base_model:adapter:trl-internal-testing/tiny-random-LlamaForCausalLM",
"region:us"
] | null | 2025-01-26T07:33:18Z | ---
library_name: peft
base_model: trl-internal-testing/tiny-random-LlamaForCausalLM
tags:
- axolotl
- generated_from_trainer
model-index:
- name: af384fc4-68e1-4b76-b00b-34d533b8b5c9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: trl-internal-testing/tiny-random-LlamaForCausalLM
bf16: true
chat_template: llama3
data_processes: 16
dataset_prepared_path: null
datasets:
- data_files:
- 5f3fb26c99847c1d_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/5f3fb26c99847c1d_train_data.json
type:
field_input: post
field_instruction: title
field_output: summary
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device_map: auto
do_eval: true
early_stopping_patience: 5
eval_batch_size: 4
eval_max_new_tokens: 128
eval_steps: 50
eval_table_size: null
evals_per_epoch: null
flash_attention: true
fp16: false
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: aleegis09/af384fc4-68e1-4b76-b00b-34d533b8b5c9
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 128
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 64
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_memory:
0: 75GB
max_steps: 200
micro_batch_size: 8
mlflow_experiment_name: /tmp/5f3fb26c99847c1d_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 50
saves_per_epoch: null
sequence_len: 1024
strict: false
tf32: true
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 821c1640-29f7-45fe-90e6-e51d46a553fe
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 821c1640-29f7-45fe-90e6-e51d46a553fe
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# af384fc4-68e1-4b76-b00b-34d533b8b5c9
This model is a fine-tuned version of [trl-internal-testing/tiny-random-LlamaForCausalLM](https://huggingface.co/trl-internal-testing/tiny-random-LlamaForCausalLM) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 10.3461
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Use OptimizerNames.ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=adam_beta1=0.9,adam_beta2=0.95,adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 10.3731 | 0.0003 | 1 | 10.3739 |
| 10.3555 | 0.0130 | 50 | 10.3558 |
| 10.3501 | 0.0260 | 100 | 10.3497 |
| 10.3529 | 0.0390 | 150 | 10.3465 |
| 10.345 | 0.0520 | 200 | 10.3461 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
AMindToThink/gemma-2-2b_RMU_s400_a300_layer7 | AMindToThink | 2025-01-26T07:33:21Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T07:31:02Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
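In the meantime, a minimal, generic sketch (assuming the standard 🤗 Transformers text-generation workflow; nothing here is confirmed as the intended usage for this model):
```python
# Minimal sketch: load the model from the Hub with the standard Transformers API (assumed, not official).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "AMindToThink/gemma-2-2b_RMU_s400_a300_layer7"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```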
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
NextGLab/ORANSight_Gemma_2_27B_Instruct | NextGLab | 2025-01-26T07:30:51Z | 43 | 0 | transformers | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"arxiv:2407.06245",
"base_model:unsloth/gemma-2-27b-it-bnb-4bit",
"base_model:finetune:unsloth/gemma-2-27b-it-bnb-4bit",
"license:gemma",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-12-25T01:13:49Z | ---
base_model: unsloth/gemma-2-27b-it-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
- sft
license: gemma
language:
- en
---
# Model Card for ORANSight Gemma-27B
This model belongs to the first release of the ORANSight family of models.
- **Developed by:** NextG lab@ NC State
- **License:** gemma
- **Context Window:** 8192 tokens
- **Fine Tuning Framework:** Unsloth
### Generate with Transformers
Below is a quick example of how to use the model with Hugging Face Transformers:
```python
from transformers import pipeline
# Example query
messages = [
{"role": "system", "content": "You are an O-RAN expert assistant."},
{"role": "user", "content": "Explain the E2 interface."},
]
# Load the model
chatbot = pipeline("text-generation", model="NextGLab/ORANSight_Gemma_2_27B_Instruct")
result = chatbot(messages)
print(result)
```
### Coming Soon
A detailed paper documenting the experiments and results achieved with this model will be available soon. In the meantime, if you try this model, please cite the paper below to acknowledge the foundational work that enabled this fine-tuning.
```bibtex
@article{gajjar2024oran,
title={Oran-bench-13k: An open source benchmark for assessing llms in open radio access networks},
author={Gajjar, Pranshav and Shah, Vijay K},
journal={arXiv preprint arXiv:2407.06245},
year={2024}
}
```
--- |
NextGLab/ORANSight_Gemma_2_9B_Instruct | NextGLab | 2025-01-26T07:29:47Z | 48 | 0 | transformers | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"arxiv:2407.06245",
"base_model:unsloth/gemma-2-9b-it-bnb-4bit",
"base_model:finetune:unsloth/gemma-2-9b-it-bnb-4bit",
"license:gemma",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-12-25T19:29:07Z | ---
base_model: unsloth/gemma-2-9b-it-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
- sft
license: gemma
language:
- en
---
# Model Card for ORANSight Gemma-9B
This model belongs to the first release of the ORANSight family of models.
- **Developed by:** NextG lab@ NC State
- **License:** gemma
- **Context Window:** 8192 tokens
- **Fine Tuning Framework:** Unsloth
### Generate with Transformers
Below is a quick example of how to use the model with Hugging Face Transformers:
```python
from transformers import pipeline
# Example query
messages = [
{"role": "system", "content": "You are an O-RAN expert assistant."},
{"role": "user", "content": "Explain the E2 interface."},
]
# Load the model
chatbot = pipeline("text-generation", model="NextGLab/ORANSight_Gemma_2_9B_Instruct")
result = chatbot(messages)
print(result)
```
### Coming Soon
A detailed paper documenting the experiments and results achieved with this model will be available soon. In the meantime, if you try this model, please cite the paper below to acknowledge the foundational work that enabled this fine-tuning.
```bibtex
@article{gajjar2024oran,
title={Oran-bench-13k: An open source benchmark for assessing llms in open radio access networks},
author={Gajjar, Pranshav and Shah, Vijay K},
journal={arXiv preprint arXiv:2407.06245},
year={2024}
}
```
--- |
nhung01/e3b74596-5893-4a65-a63e-900a6343eaac | nhung01 | 2025-01-26T07:29:25Z | 6 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"base_model:adapter:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"license:other",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T07:12:00Z | ---
library_name: peft
license: other
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
tags:
- axolotl
- generated_from_trainer
model-index:
- name: e3b74596-5893-4a65-a63e-900a6343eaac
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- fbd16596dc609498_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/fbd16596dc609498_train_data.json
type:
field_input: Option 1
field_instruction: Domain
field_output: Question
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: nhung01/e3b74596-5893-4a65-a63e-900a6343eaac
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/fbd16596dc609498_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 7f264b18-8600-4682-ae3a-1dbab5576cea
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 7f264b18-8600-4682-ae3a-1dbab5576cea
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# e3b74596-5893-4a65-a63e-900a6343eaac
This model is a fine-tuned version of [NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8783
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Use OptimizerNames.ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.8666 | 0.6116 | 200 | 1.8783 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
nhung03/33552492-6f72-49a9-b387-73254771a129 | nhung03 | 2025-01-26T07:29:18Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"base_model:adapter:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"license:other",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T07:11:48Z | ---
library_name: peft
license: other
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 33552492-6f72-49a9-b387-73254771a129
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- fbd16596dc609498_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/fbd16596dc609498_train_data.json
type:
field_input: Option 1
field_instruction: Domain
field_output: Question
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: nhung03/33552492-6f72-49a9-b387-73254771a129
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/fbd16596dc609498_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 7f264b18-8600-4682-ae3a-1dbab5576cea
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 7f264b18-8600-4682-ae3a-1dbab5576cea
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 33552492-6f72-49a9-b387-73254771a129
This model is a fine-tuned version of [NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8815
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Use OptimizerNames.ADAMW_BNB with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.8609 | 0.6116 | 200 | 1.8815 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
oldiday/8d08a67f-fb94-4323-bc8e-2d99913d5901 | oldiday | 2025-01-26T07:29:04Z | 6 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:peft-internal-testing/tiny-dummy-qwen2",
"base_model:adapter:peft-internal-testing/tiny-dummy-qwen2",
"region:us"
] | null | 2025-01-26T07:27:16Z | ---
library_name: peft
base_model: peft-internal-testing/tiny-dummy-qwen2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 8d08a67f-fb94-4323-bc8e-2d99913d5901
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: peft-internal-testing/tiny-dummy-qwen2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99a1d9d467a445dc_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99a1d9d467a445dc_train_data.json
type:
field_instruction: question
field_output: answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: oldiday/8d08a67f-fb94-4323-bc8e-2d99913d5901
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: 0
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_steps: 100
micro_batch_size: 8
mlflow_experiment_name: /tmp/99a1d9d467a445dc_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: techspear-hub
wandb_mode: online
wandb_name: 2f1a76ed-541f-4b8a-8491-311724d62463
wandb_project: Gradients-On-Six
wandb_run: your_name
wandb_runid: 2f1a76ed-541f-4b8a-8491-311724d62463
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 8d08a67f-fb94-4323-bc8e-2d99913d5901
This model is a fine-tuned version of [peft-internal-testing/tiny-dummy-qwen2](https://huggingface.co/peft-internal-testing/tiny-dummy-qwen2) on the `99a1d9d467a445dc_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 11.9119
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32 (see the arithmetic sketch after this list)
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 100
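The effective batch size reported above follows from the other values; a quick arithmetic check (assuming a single training device, which the card does not state explicitly):
```python
# Derive total_train_batch_size from the values listed above.
micro_batch_size = 8               # train_batch_size
gradient_accumulation_steps = 4
num_devices = 1                    # assumption: single-GPU run

total_train_batch_size = micro_batch_size * gradient_accumulation_steps * num_devices
assert total_train_batch_size == 32
```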
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0060 | 1 | 11.9395 |
| 11.9385 | 0.0541 | 9 | 11.9390 |
| 11.9374 | 0.1081 | 18 | 11.9375 |
| 11.9358 | 0.1622 | 27 | 11.9353 |
| 11.9325 | 0.2162 | 36 | 11.9319 |
| 11.9281 | 0.2703 | 45 | 11.9272 |
| 11.9226 | 0.3243 | 54 | 11.9219 |
| 11.9178 | 0.3784 | 63 | 11.9173 |
| 11.915 | 0.4324 | 72 | 11.9143 |
| 11.9134 | 0.4865 | 81 | 11.9127 |
| 11.9126 | 0.5405 | 90 | 11.9120 |
| 11.9124 | 0.5946 | 99 | 11.9119 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
elnasharomar2/whisper-small-ar | elnasharomar2 | 2025-01-26T07:28:32Z | 80 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"ar",
"dataset:mozilla-foundation/common_voice_13_0",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-11-15T06:31:36Z | ---
language:
- ar
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_13_0
metrics:
- wer
model-index:
- name: Arabic Whisper Small oknashar
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice 13
type: mozilla-foundation/common_voice_13_0
config: ar
split: test
args: ar
metrics:
- name: Wer
type: wer
value: 69.45486825646613
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Arabic Whisper Small oknashar
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 13 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4206
- Wer Ortho: 52.9870
- Wer: 69.4549
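For a quick transcription test, here is a minimal sketch using the standard 🤗 Transformers ASR pipeline (the audio filename is a placeholder; this assumes the checkpoint loads like any other Whisper model):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint through the automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="elnasharomar2/whisper-small-ar",
)

# Transcribe a local Arabic speech clip (placeholder filename).
result = asr("arabic_sample.wav")
print(result["text"])
```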
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 50
- training_steps: 500
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:-------:|
| 0.2106 | 0.21 | 500 | 0.4206 | 52.9870 | 69.4549 |
### Framework versions
- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.14.6
- Tokenizers 0.13.3
|
nttx/85df7c05-c490-4420-a0ff-75c93e7bdbfd | nttx | 2025-01-26T07:27:30Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2_moe",
"axolotl",
"generated_from_trainer",
"base_model:katuni4ka/tiny-random-qwen1.5-moe",
"base_model:adapter:katuni4ka/tiny-random-qwen1.5-moe",
"region:us"
] | null | 2025-01-26T07:20:53Z | ---
library_name: peft
base_model: katuni4ka/tiny-random-qwen1.5-moe
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 85df7c05-c490-4420-a0ff-75c93e7bdbfd
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: katuni4ka/tiny-random-qwen1.5-moe
bf16: auto
chat_template: llama3
data_processes: 16
dataset_prepared_path: null
datasets:
- data_files:
- 3bfcb782a3f0e2ac_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/3bfcb782a3f0e2ac_train_data.json
type:
field_instruction: problem
field_output: target_answer
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device_map: auto
do_eval: true
early_stopping_patience: null
eval_batch_size: 2
eval_max_new_tokens: 128
eval_steps: null
eval_table_size: null
evals_per_epoch: null
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: nttx/85df7c05-c490-4420-a0ff-75c93e7bdbfd
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_memory:
0: 75GB
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/3bfcb782a3f0e2ac_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: null
saves_per_epoch: null
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: c54ab670-3720-4ec3-a8b4-4391881ada3a
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: c54ab670-3720-4ec3-a8b4-4391881ada3a
warmup_steps: 5
weight_decay: 0.0
xformers_attention: null
```
</details><br>
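The `type` block in the config above tells Axolotl which JSON fields feed the prompt template; a minimal sketch of how one record would be rendered (the record values below are hypothetical, not taken from the dataset):
```python
# Hypothetical record shaped like 3bfcb782a3f0e2ac_train_data.json.
record = {"problem": "What is 2 + 2?", "target_answer": "4"}

# field_instruction: problem       -> fills {instruction} in format '{instruction}'
# field_output:      target_answer -> the completion the model is trained to produce
prompt = "{instruction}".format(instruction=record["problem"])
completion = record["target_answer"]

print(repr(prompt), "->", repr(completion))
```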
# 85df7c05-c490-4420-a0ff-75c93e7bdbfd
This model is a fine-tuned version of [katuni4ka/tiny-random-qwen1.5-moe](https://huggingface.co/katuni4ka/tiny-random-qwen1.5-moe) on the `3bfcb782a3f0e2ac_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 11.8486
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.8214 | 0.0045 | 200 | 11.8486 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF | mradermacher | 2025-01-26T07:27:08Z | 510 | 0 | transformers | [
"transformers",
"gguf",
"llama-factory",
"full",
"generated_from_trainer",
"en",
"base_model:pe-nlp/R1-Qwen2.5-Math-1.5B",
"base_model:quantized:pe-nlp/R1-Qwen2.5-Math-1.5B",
"license:other",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-01-26T07:16:20Z | ---
base_model: pe-nlp/R1-Qwen2.5-Math-1.5B
language:
- en
library_name: transformers
license: other
quantized_by: mradermacher
tags:
- llama-factory
- full
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/pe-nlp/R1-Qwen2.5-Math-1.5B
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
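As one concrete starting point, here is a minimal sketch that downloads a quant listed below and runs it with `llama-cpp-python` (the package choice, context size, and prompt are assumptions, not recommendations from this repo):
```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # pip install llama-cpp-python

# Fetch the Q4_K_M quant ("fast, recommended" in the table below).
gguf_path = hf_hub_download(
    repo_id="mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF",
    filename="R1-Qwen2.5-Math-1.5B.i1-Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)
out = llm("Compute 12 * 7 and explain briefly.", max_tokens=64)
print(out["choices"][0]["text"])
```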
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.7 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q2_K.gguf) | i1-Q2_K | 0.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.9 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ3_S.gguf) | i1-IQ3_S | 0.9 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ3_M.gguf) | i1-IQ3_M | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.0 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q4_0.gguf) | i1-Q4_0 | 1.0 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.0 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q4_1.gguf) | i1-Q4_1 | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.i1-Q6_K.gguf) | i1-Q6_K | 1.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
auxyus/d5ffb525-db15-4fbe-b333-cd5a1e06e049 | auxyus | 2025-01-26T07:26:49Z | 8 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/mistral-7b-instruct-v0.2",
"base_model:adapter:unsloth/mistral-7b-instruct-v0.2",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T05:27:22Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/mistral-7b-instruct-v0.2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: d5ffb525-db15-4fbe-b333-cd5a1e06e049
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/mistral-7b-instruct-v0.2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 59dfd08cf3fa50ec_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/59dfd08cf3fa50ec_train_data.json
type:
field_input: Title
field_instruction: Abstract
field_output: Hypothesis
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: auxyus/d5ffb525-db15-4fbe-b333-cd5a1e06e049
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: 0
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_steps: 100
micro_batch_size: 8
mlflow_experiment_name: /tmp/59dfd08cf3fa50ec_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: techspear-hub
wandb_mode: online
wandb_name: 9d1868e2-61ae-4041-af04-bf2ce6704f58
wandb_project: Gradients-On-Two
wandb_run: your_name
wandb_runid: 9d1868e2-61ae-4041-af04-bf2ce6704f58
warmup_steps: 10
weight_decay: 0.01
xformers_attention: null
```
</details><br>
# d5ffb525-db15-4fbe-b333-cd5a1e06e049
This model is a fine-tuned version of [unsloth/mistral-7b-instruct-v0.2](https://huggingface.co/unsloth/mistral-7b-instruct-v0.2) on the `59dfd08cf3fa50ec_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 0.2512
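Since this repository holds a LoRA adapter rather than full model weights, here is a minimal inference sketch that attaches the adapter to the base checkpoint (the prompt is a placeholder; precision and device placement are left at defaults):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/mistral-7b-instruct-v0.2"
adapter_id = "auxyus/d5ffb525-db15-4fbe-b333-cd5a1e06e049"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA adapter

# The config above formats prompts as '{instruction} {input}' = Abstract + Title.
prompt = "We study the effect of sleep on memory consolidation. Sleep and Memory"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
For a 7B base you will likely want to pass `torch_dtype` and `device_map="auto"` (or use quantized loading) rather than the defaults shown here.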
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0002 | 1 | 5.5775 |
| 9.5317 | 0.0022 | 9 | 1.1300 |
| 1.3782 | 0.0043 | 18 | 0.2865 |
| 1.122 | 0.0065 | 27 | 0.2721 |
| 1.0486 | 0.0087 | 36 | 0.2710 |
| 1.0521 | 0.0108 | 45 | 0.2563 |
| 0.9572 | 0.0130 | 54 | 0.2596 |
| 0.9789 | 0.0152 | 63 | 0.2582 |
| 1.0106 | 0.0173 | 72 | 0.2558 |
| 1.0315 | 0.0195 | 81 | 0.2542 |
| 1.0137 | 0.0217 | 90 | 0.2518 |
| 0.9905 | 0.0239 | 99 | 0.2512 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
0x1202/1cee9c98-73b5-4a38-b10b-0d8ef32ea5cb | 0x1202 | 2025-01-26T07:25:44Z | 6 | 0 | peft | [
"peft",
"safetensors",
"gpt_neox",
"axolotl",
"generated_from_trainer",
"base_model:EleutherAI/pythia-70m-deduped",
"base_model:adapter:EleutherAI/pythia-70m-deduped",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T07:16:30Z | ---
library_name: peft
license: apache-2.0
base_model: EleutherAI/pythia-70m-deduped
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 1cee9c98-73b5-4a38-b10b-0d8ef32ea5cb
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: EleutherAI/pythia-70m-deduped
bf16: true
chat_template: llama3
data_processes: 16
dataset_prepared_path: null
datasets:
- data_files:
- 59ebf80954a6130a_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/59ebf80954a6130a_train_data.json
type:
field_input: solution_steps
field_instruction: problem
field_output: solution
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device_map: auto
do_eval: true
early_stopping_patience: 5
eval_batch_size: 4
eval_max_new_tokens: 128
eval_steps: 50
eval_table_size: null
evals_per_epoch: null
flash_attention: true
fp16: false
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: 0x1202/1cee9c98-73b5-4a38-b10b-0d8ef32ea5cb
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 128
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 64
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_memory:
0: 75GB
max_steps: 200
micro_batch_size: 8
mlflow_experiment_name: /tmp/59ebf80954a6130a_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 50
saves_per_epoch: null
sequence_len: 1024
special_tokens:
pad_token: <|endoftext|>
strict: false
tf32: true
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: f210f656-5c7e-4a29-80ca-643c4317c822
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: f210f656-5c7e-4a29-80ca-643c4317c822
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
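The config above overrides the tokenizer's pad token; when reloading the tokenizer for inference, mirroring that setting would look roughly like this (a sketch, not code shipped with this repo):
```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m-deduped")
# Mirror the training-time `special_tokens: pad_token: <|endoftext|>` setting.
tok.pad_token = "<|endoftext|>"
print(tok.pad_token, tok.pad_token_id)
```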
# 1cee9c98-73b5-4a38-b10b-0d8ef32ea5cb
This model is a fine-tuned version of [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) on the `59ebf80954a6130a_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 3.8854
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and optimizer_args: adam_beta1=0.9, adam_beta2=0.95, adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 11.4407 | 0.0002 | 1 | 4.5460 |
| 40.5819 | 0.0085 | 50 | 4.6227 |
| 30.3666 | 0.0170 | 100 | 3.7897 |
| 25.5567 | 0.0255 | 150 | 3.7479 |
| 31.3141 | 0.0341 | 200 | 3.8854 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
mradermacher/Qwen-2.5-reasoning-verifier-GGUF | mradermacher | 2025-01-26T07:23:29Z | 262 | 0 | transformers | [
"transformers",
"gguf",
"generated_from_trainer",
"trl",
"reward-trainer",
"en",
"dataset:gagan3012/Sky-T1_preference_data_10k_reward_templated",
"base_model:gagan3012/Qwen-2.5-reasoning-verifier",
"base_model:quantized:gagan3012/Qwen-2.5-reasoning-verifier",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-26T07:20:05Z | ---
base_model: gagan3012/Qwen-2.5-reasoning-verifier
datasets: gagan3012/Sky-T1_preference_data_10k_reward_templated
language:
- en
library_name: transformers
model_name: Qwen-2.5-reasoning-verifier
quantized_by: mradermacher
tags:
- generated_from_trainer
- trl
- reward-trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/gagan3012/Qwen-2.5-reasoning-verifier
<!-- provided-files -->
weighted/imatrix quants do not appear to be available from me at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q3_K_S.gguf) | Q3_K_S | 0.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q2_K.gguf) | Q2_K | 0.4 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.IQ4_XS.gguf) | IQ4_XS | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q3_K_M.gguf) | Q3_K_M | 0.5 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q3_K_L.gguf) | Q3_K_L | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q4_K_S.gguf) | Q4_K_S | 0.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q4_K_M.gguf) | Q4_K_M | 0.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q5_K_S.gguf) | Q5_K_S | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q5_K_M.gguf) | Q5_K_M | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q6_K.gguf) | Q6_K | 0.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.Q8_0.gguf) | Q8_0 | 0.6 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Qwen-2.5-reasoning-verifier-GGUF/resolve/main/Qwen-2.5-reasoning-verifier.f16.gguf) | f16 | 1.1 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/R1-Qwen2.5-Math-1.5B-GGUF | mradermacher | 2025-01-26T07:20:52Z | 262 | 0 | transformers | [
"transformers",
"gguf",
"llama-factory",
"full",
"generated_from_trainer",
"en",
"base_model:pe-nlp/R1-Qwen2.5-Math-1.5B",
"base_model:quantized:pe-nlp/R1-Qwen2.5-Math-1.5B",
"license:other",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-26T07:14:29Z | ---
base_model: pe-nlp/R1-Qwen2.5-Math-1.5B
language:
- en
library_name: transformers
license: other
quantized_by: mradermacher
tags:
- llama-factory
- full
- generated_from_trainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/pe-nlp/R1-Qwen2.5-Math-1.5B
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q2_K.gguf) | Q2_K | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q3_K_S.gguf) | Q3_K_S | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q3_K_M.gguf) | Q3_K_M | 0.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q3_K_L.gguf) | Q3_K_L | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.IQ4_XS.gguf) | IQ4_XS | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q4_K_S.gguf) | Q4_K_S | 1.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q4_K_M.gguf) | Q4_K_M | 1.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q5_K_S.gguf) | Q5_K_S | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q5_K_M.gguf) | Q5_K_M | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q6_K.gguf) | Q6_K | 1.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.Q8_0.gguf) | Q8_0 | 1.7 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/R1-Qwen2.5-Math-1.5B-GGUF/resolve/main/R1-Qwen2.5-Math-1.5B.f16.gguf) | f16 | 3.2 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/QvQ-Step-Tiny-i1-GGUF | mradermacher | 2025-01-26T07:17:23Z | 457 | 0 | transformers | [
"transformers",
"gguf",
"QvQ",
"Qwen",
"Contexr-Explainer",
"en",
"base_model:prithivMLmods/QvQ-Step-Tiny",
"base_model:quantized:prithivMLmods/QvQ-Step-Tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-01-26T07:07:47Z | ---
base_model: prithivMLmods/QvQ-Step-Tiny
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- QvQ
- Qwen
- Contexr-Explainer
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/prithivMLmods/QvQ-Step-Tiny
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/QvQ-Step-Tiny-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ2_S.gguf) | i1-IQ2_S | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ2_M.gguf) | i1-IQ2_M | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.7 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q2_K.gguf) | i1-Q2_K | 0.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.9 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ3_S.gguf) | i1-IQ3_S | 0.9 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ3_M.gguf) | i1-IQ3_M | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.0 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q4_0.gguf) | i1-Q4_0 | 1.0 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.0 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q4_1.gguf) | i1-Q4_1 | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/QvQ-Step-Tiny-i1-GGUF/resolve/main/QvQ-Step-Tiny.i1-Q6_K.gguf) | i1-Q6_K | 1.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
kostiantynk1205/f7ac90a6-a062-448f-a24b-6c5bf379a8c9 | kostiantynk1205 | 2025-01-26T07:14:37Z | 6 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"base_model:adapter:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"license:other",
"region:us"
] | null | 2025-01-26T07:12:28Z | ---
library_name: peft
license: other
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
tags:
- axolotl
- generated_from_trainer
model-index:
- name: f7ac90a6-a062-448f-a24b-6c5bf379a8c9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- fbd16596dc609498_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/fbd16596dc609498_train_data.json
type:
field_input: Option 1
field_instruction: Domain
field_output: Question
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: kostiantynk1205/f7ac90a6-a062-448f-a24b-6c5bf379a8c9
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/fbd16596dc609498_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 7f264b18-8600-4682-ae3a-1dbab5576cea
wandb_project: Birthday-SN56-23-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 7f264b18-8600-4682-ae3a-1dbab5576cea
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# f7ac90a6-a062-448f-a24b-6c5bf379a8c9
This model is a fine-tuned version of [NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer) on the `fbd16596dc609498_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 3.4952
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.1238 | 0.0031 | 1 | 4.6047 |
| 4.2165 | 0.0092 | 3 | 4.5958 |
| 4.4 | 0.0183 | 6 | 4.4134 |
| 3.6796 | 0.0275 | 9 | 3.4952 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
thakkkkkk/4cdc599b-d43b-47e9-aaa1-28a3ceb15d48 | thakkkkkk | 2025-01-26T07:14:07Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM2-1.7B",
"base_model:adapter:unsloth/SmolLM2-1.7B",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T06:52:39Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM2-1.7B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 4cdc599b-d43b-47e9-aaa1-28a3ceb15d48
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM2-1.7B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99bea9d9584b8941_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99bea9d9584b8941_train_data.json
type:
field_input: Language
field_instruction: Source
field_output: Content
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: thakkkkkk/4cdc599b-d43b-47e9-aaa1-28a3ceb15d48
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 4
mlflow_experiment_name: /tmp/99bea9d9584b8941_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 4cdc599b-d43b-47e9-aaa1-28a3ceb15d48
This model is a fine-tuned version of [unsloth/SmolLM2-1.7B](https://huggingface.co/unsloth/SmolLM2-1.7B) on the `99bea9d9584b8941_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 1.5739
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.5951 | 0.1473 | 200 | 1.5739 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
visdata/socold8 | visdata | 2025-01-26T07:13:51Z | 32 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T06:52:34Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
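In the absence of author-provided instructions, here is a minimal sketch that loads the checkpoint through the standard text-generation pipeline (this assumes the repo loads as an ordinary causal LM, as its `llama`/`text-generation` tags suggest):
```python
from transformers import pipeline

generator = pipeline("text-generation", model="visdata/socold8")
print(generator("Hello, world!", max_new_tokens=32)[0]["generated_text"])
```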
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
robiual-awal/c5ce6b29-6c76-4a50-8609-9cbc1d49ca96 | robiual-awal | 2025-01-26T07:13:37Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"base_model:adapter:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"license:other",
"region:us"
] | null | 2025-01-26T07:12:26Z | ---
library_name: peft
license: other
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
tags:
- axolotl
- generated_from_trainer
model-index:
- name: c5ce6b29-6c76-4a50-8609-9cbc1d49ca96
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- fbd16596dc609498_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/fbd16596dc609498_train_data.json
type:
field_input: Option 1
field_instruction: Domain
field_output: Question
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: robiual-awal/c5ce6b29-6c76-4a50-8609-9cbc1d49ca96
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/fbd16596dc609498_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 7f264b18-8600-4682-ae3a-1dbab5576cea
wandb_project: Birthday-SN56-30-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 7f264b18-8600-4682-ae3a-1dbab5576cea
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# c5ce6b29-6c76-4a50-8609-9cbc1d49ca96
This model is a fine-tuned version of [NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer) on the `fbd16596dc609498_train_data.json` dataset described in the configuration above.
It achieves the following results on the evaluation set:
- Loss: 3.4789
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.1238 | 0.0031 | 1 | 4.6047 |
| 4.2099 | 0.0092 | 3 | 4.5955 |
| 4.3954 | 0.0183 | 6 | 4.4051 |
| 3.6634 | 0.0275 | 9 | 3.4789 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
daniel40/68892031-d545-4752-ad0a-d48f16fea12e | daniel40 | 2025-01-26T07:13:19Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"base_model:adapter:NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer",
"license:other",
"region:us"
] | null | 2025-01-26T07:12:11Z | ---
library_name: peft
license: other
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 68892031-d545-4752-ad0a-d48f16fea12e
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- fbd16596dc609498_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/fbd16596dc609498_train_data.json
type:
field_input: Option 1
field_instruction: Domain
field_output: Question
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: daniel40/68892031-d545-4752-ad0a-d48f16fea12e
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/fbd16596dc609498_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 7f264b18-8600-4682-ae3a-1dbab5576cea
wandb_project: Birthday-SN56-31-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 7f264b18-8600-4682-ae3a-1dbab5576cea
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 68892031-d545-4752-ad0a-d48f16fea12e
This model is a fine-tuned version of [NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 3.4815
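As a minimal sketch of loading this LoRA adapter on top of its base model with 🤗 PEFT: the repository and base-model ids come from the config above, while the dtype, device placement, and prompt are illustrative assumptions.
```py
# Minimal sketch: attach this LoRA adapter to its base model with PEFT.
# Assumes a GPU with enough memory for the 8B base model in fp16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer"
adapter_id = "daniel40/68892031-d545-4752-ad0a-d48f16fea12e"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

# Illustrative prompt; per the config above, training prompts were formatted as '{instruction} {input}'.
inputs = tokenizer("An example instruction", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```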
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.1238 | 0.0031 | 1 | 4.6047 |
| 4.2123 | 0.0092 | 3 | 4.5941 |
| 4.3944 | 0.0183 | 6 | 4.4095 |
| 3.6639 | 0.0275 | 9 | 3.4815 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
ivangrapher/50cf6ad1-575b-4779-af4f-e411c9a1de34 | ivangrapher | 2025-01-26T07:09:53Z | 8 | 0 | peft | [
"peft",
"safetensors",
"qwen2",
"axolotl",
"generated_from_trainer",
"base_model:peft-internal-testing/tiny-dummy-qwen2",
"base_model:adapter:peft-internal-testing/tiny-dummy-qwen2",
"4-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T07:09:01Z | ---
library_name: peft
base_model: peft-internal-testing/tiny-dummy-qwen2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 50cf6ad1-575b-4779-af4f-e411c9a1de34
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: peft-internal-testing/tiny-dummy-qwen2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- de564b26400f300a_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/de564b26400f300a_train_data.json
type:
field_instruction: sentence1
field_output: sentence2
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device: cuda
early_stopping_patience: null
eval_max_new_tokens: 256
eval_steps: 5
eval_table_size: null
evals_per_epoch: null
flash_attention: false
fp16: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: false
hub_model_id: ivangrapher/50cf6ad1-575b-4779-af4f-e411c9a1de34
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: true
load_in_8bit: false
local_rank: null
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_memory:
0: 75GiB
max_steps: 30
micro_batch_size: 2
mlflow_experiment_name: /tmp/de564b26400f300a_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_torch
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 15
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: true
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 60771902-00e5-4e05-a977-27eb7a22dc7b
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 60771902-00e5-4e05-a977-27eb7a22dc7b
warmup_steps: 15
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 50cf6ad1-575b-4779-af4f-e411c9a1de34
This model is a fine-tuned version of [peft-internal-testing/tiny-dummy-qwen2](https://huggingface.co/peft-internal-testing/tiny-dummy-qwen2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9292
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 15
- training_steps: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0010 | 1 | 11.9298 |
| 11.9323 | 0.0050 | 5 | 11.9298 |
| 11.928 | 0.0100 | 10 | 11.9297 |
| 11.933 | 0.0150 | 15 | 11.9296 |
| 11.9337 | 0.0200 | 20 | 11.9294 |
| 11.9297 | 0.0250 | 25 | 11.9293 |
| 11.9287 | 0.0300 | 30 | 11.9292 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
havinash-ai/8f4c8021-c03b-4eda-901a-8aa1e3ceea54 | havinash-ai | 2025-01-26T07:07:39Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM2-1.7B",
"base_model:adapter:unsloth/SmolLM2-1.7B",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T07:05:22Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM2-1.7B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 8f4c8021-c03b-4eda-901a-8aa1e3ceea54
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM2-1.7B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99bea9d9584b8941_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99bea9d9584b8941_train_data.json
type:
field_input: Language
field_instruction: Source
field_output: Content
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: havinash-ai/8f4c8021-c03b-4eda-901a-8aa1e3ceea54
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/99bea9d9584b8941_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 8f4c8021-c03b-4eda-901a-8aa1e3ceea54
This model is a fine-tuned version of [unsloth/SmolLM2-1.7B](https://huggingface.co/unsloth/SmolLM2-1.7B) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0004 | 1 | nan |
| 0.0 | 0.0011 | 3 | nan |
| 0.0 | 0.0022 | 6 | nan |
| 0.0 | 0.0033 | 9 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
nhung02/a01f407e-4003-435e-aa16-c6df722f0386 | nhung02 | 2025-01-26T07:07:06Z | 6 | 0 | peft | [
"peft",
"safetensors",
"phi",
"axolotl",
"generated_from_trainer",
"base_model:microsoft/phi-2",
"base_model:adapter:microsoft/phi-2",
"license:mit",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T06:48:16Z | ---
library_name: peft
license: mit
base_model: microsoft/phi-2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: a01f407e-4003-435e-aa16-c6df722f0386
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: microsoft/phi-2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 583c436b7dfe63c6_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/583c436b7dfe63c6_train_data.json
type:
field_input: description
field_instruction: name
field_output: symptoms
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: nhung02/a01f407e-4003-435e-aa16-c6df722f0386
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/583c436b7dfe63c6_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
special_tokens:
pad_token: <|endoftext|>
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 26cb4ebb-3df7-4431-a684-59b1c72c5755
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 26cb4ebb-3df7-4431-a684-59b1c72c5755
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# a01f407e-4003-435e-aa16-c6df722f0386
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7201
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 121
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.9012 | 1.0 | 121 | 2.7201 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
nblinh/906a9893-1779-4161-b0b3-1a607f2d9a7a | nblinh | 2025-01-26T07:06:51Z | 9 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"custom_code",
"base_model:NousResearch/Yarn-Solar-10b-32k",
"base_model:adapter:NousResearch/Yarn-Solar-10b-32k",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T06:00:43Z | ---
library_name: peft
license: apache-2.0
base_model: NousResearch/Yarn-Solar-10b-32k
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 906a9893-1779-4161-b0b3-1a607f2d9a7a
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Yarn-Solar-10b-32k
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 7bb5c8c129066fca_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/7bb5c8c129066fca_train_data.json
type:
field_instruction: prompt
field_output: chosen
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: nblinh/906a9893-1779-4161-b0b3-1a607f2d9a7a
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/7bb5c8c129066fca_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
special_tokens:
pad_token: </s>
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 36a43860-f8fa-4c32-afb5-c665be741dc4
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 36a43860-f8fa-4c32-afb5-c665be741dc4
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 906a9893-1779-4161-b0b3-1a607f2d9a7a
This model is a fine-tuned version of [NousResearch/Yarn-Solar-10b-32k](https://huggingface.co/NousResearch/Yarn-Solar-10b-32k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3454
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.3636 | 0.0324 | 200 | 0.3454 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
mrHunghddddd/f650202f-0aaa-498f-93eb-b18fff688a2c | mrHunghddddd | 2025-01-26T07:06:40Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"custom_code",
"base_model:NousResearch/Yarn-Solar-10b-32k",
"base_model:adapter:NousResearch/Yarn-Solar-10b-32k",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T06:00:23Z | ---
library_name: peft
license: apache-2.0
base_model: NousResearch/Yarn-Solar-10b-32k
tags:
- axolotl
- generated_from_trainer
model-index:
- name: f650202f-0aaa-498f-93eb-b18fff688a2c
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: NousResearch/Yarn-Solar-10b-32k
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 7bb5c8c129066fca_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/7bb5c8c129066fca_train_data.json
type:
field_instruction: prompt
field_output: chosen
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: mrHunghddddd/f650202f-0aaa-498f-93eb-b18fff688a2c
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/7bb5c8c129066fca_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
special_tokens:
pad_token: </s>
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 36a43860-f8fa-4c32-afb5-c665be741dc4
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 36a43860-f8fa-4c32-afb5-c665be741dc4
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# f650202f-0aaa-498f-93eb-b18fff688a2c
This model is a fine-tuned version of [NousResearch/Yarn-Solar-10b-32k](https://huggingface.co/NousResearch/Yarn-Solar-10b-32k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3451
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.3633 | 0.0324 | 200 | 0.3451 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
lesso15/c0491f4e-c579-4632-b98a-c1582eb8b497 | lesso15 | 2025-01-26T07:06:13Z | 8 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/zephyr-sft",
"base_model:adapter:unsloth/zephyr-sft",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T05:58:50Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/zephyr-sft
tags:
- axolotl
- generated_from_trainer
model-index:
- name: c0491f4e-c579-4632-b98a-c1582eb8b497
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/zephyr-sft
bf16: auto
chat_template: llama3
datasets:
- data_files:
- f736f3bd8f945056_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/f736f3bd8f945056_train_data.json
type:
field_input: text
field_instruction: instruction
field_output: output
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: lesso15/c0491f4e-c579-4632-b98a-c1582eb8b497
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/f736f3bd8f945056_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 235ea562-3572-46e2-aa40-c58385a132c8
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 235ea562-3572-46e2-aa40-c58385a132c8
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# c0491f4e-c579-4632-b98a-c1582eb8b497
This model is a fine-tuned version of [unsloth/zephyr-sft](https://huggingface.co/unsloth/zephyr-sft) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: nan
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0 | 0.0638 | 200 | nan |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
laquythang/1cc1894a-8b5a-43fe-8d57-914ba3b0175a | laquythang | 2025-01-26T07:04:27Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM2-1.7B",
"base_model:adapter:unsloth/SmolLM2-1.7B",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T06:51:25Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM2-1.7B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 1cc1894a-8b5a-43fe-8d57-914ba3b0175a
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM2-1.7B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99bea9d9584b8941_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99bea9d9584b8941_train_data.json
type:
field_input: Language
field_instruction: Source
field_output: Content
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: laquythang/1cc1894a-8b5a-43fe-8d57-914ba3b0175a
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/99bea9d9584b8941_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 1cc1894a-8b5a-43fe-8d57-914ba3b0175a
This model is a fine-tuned version of [unsloth/SmolLM2-1.7B](https://huggingface.co/unsloth/SmolLM2-1.7B) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6156
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.5676 | 0.0737 | 200 | 1.6156 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
AMindToThink/gemma-2-2b_RMU_s200_a500_layer7 | AMindToThink | 2025-01-26T07:01:43Z | 9 | 0 | transformers | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T06:59:22Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
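Until a dedicated snippet is provided, here is a minimal hedged sketch of loading the checkpoint as a causal language model with 🤗 Transformers; the repository id comes from this card, while the dtype and device settings are assumptions.
```py
# Hedged sketch: load this checkpoint as a standard causal language model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AMindToThink/gemma-2-2b_RMU_s200_a500_layer7"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```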
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mradermacher/ZEUS-8B-V26-GGUF | mradermacher | 2025-01-26T06:59:32Z | 319 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:T145/ZEUS-8B-V26",
"base_model:quantized:T145/ZEUS-8B-V26",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-01-26T06:42:29Z | ---
base_model: T145/ZEUS-8B-V26
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
static quants of https://huggingface.co/T145/ZEUS-8B-V26
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
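As a rough sketch, the quants can also be run directly from Python with `llama-cpp-python`; the file name below is the Q4_K_M quant ("fast, recommended") from the table in the next section, while the context size and prompt are assumptions.
```py
# Rough sketch: download one quant and run it with llama-cpp-python.
# pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="mradermacher/ZEUS-8B-V26-GGUF",
    filename="ZEUS-8B-V26.Q4_K_M.gguf",  # "fast, recommended" per the quant table
)
llm = Llama(model_path=gguf_path, n_ctx=4096)
out = llm("Write a haiku about lightning.", max_tokens=64)
print(out["choices"][0]["text"])
```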
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q2_K.gguf) | Q2_K | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q3_K_S.gguf) | Q3_K_S | 3.8 | |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q3_K_M.gguf) | Q3_K_M | 4.1 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q3_K_L.gguf) | Q3_K_L | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.IQ4_XS.gguf) | IQ4_XS | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q4_K_S.gguf) | Q4_K_S | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q4_K_M.gguf) | Q4_K_M | 5.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q5_K_S.gguf) | Q5_K_S | 5.7 | |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q5_K_M.gguf) | Q5_K_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q6_K.gguf) | Q6_K | 6.7 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.Q8_0.gguf) | Q8_0 | 8.6 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/ZEUS-8B-V26-GGUF/resolve/main/ZEUS-8B-V26.f16.gguf) | f16 | 16.2 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
great0001/f228713b-450b-4ad5-bdbb-65d5863bfdbb | great0001 | 2025-01-26T06:59:21Z | 6 | 0 | peft | [
"peft",
"safetensors",
"bloom",
"axolotl",
"generated_from_trainer",
"base_model:bigscience/bloomz-560m",
"base_model:adapter:bigscience/bloomz-560m",
"license:bigscience-bloom-rail-1.0",
"region:us"
] | null | 2025-01-26T06:53:59Z | ---
library_name: peft
license: bigscience-bloom-rail-1.0
base_model: bigscience/bloomz-560m
tags:
- axolotl
- generated_from_trainer
model-index:
- name: f228713b-450b-4ad5-bdbb-65d5863bfdbb
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: bigscience/bloomz-560m
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- eb75b6ffdc77ea4d_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/eb75b6ffdc77ea4d_train_data.json
type:
field_input: input
field_instruction: instruction
field_output: output
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: great0001/f228713b-450b-4ad5-bdbb-65d5863bfdbb
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/eb75b6ffdc77ea4d_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 888b9795-bd3d-4c1e-9289-4c99ad92b728
wandb_project: Mine-SN56-20-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 888b9795-bd3d-4c1e-9289-4c99ad92b728
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# f228713b-450b-4ad5-bdbb-65d5863bfdbb
This model is a fine-tuned version of [bigscience/bloomz-560m](https://huggingface.co/bigscience/bloomz-560m) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9780
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 8.7751 | 0.0001 | 1 | 2.1532 |
| 10.4451 | 0.0002 | 3 | 2.1469 |
| 8.6951 | 0.0005 | 6 | 2.0834 |
| 8.3468 | 0.0007 | 9 | 1.9780 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
duyphu/a64d677d-c2ba-4d52-a26c-33c32b9b9eea | duyphu | 2025-01-26T06:56:51Z | 5 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/mistral-7b-instruct-v0.2",
"base_model:adapter:unsloth/mistral-7b-instruct-v0.2",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T05:27:06Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/mistral-7b-instruct-v0.2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: a64d677d-c2ba-4d52-a26c-33c32b9b9eea
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/mistral-7b-instruct-v0.2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 59dfd08cf3fa50ec_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/59dfd08cf3fa50ec_train_data.json
type:
field_input: Title
field_instruction: Abstract
field_output: Hypothesis
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 5
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: duyphu/a64d677d-c2ba-4d52-a26c-33c32b9b9eea
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 5
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 50
micro_batch_size: 2
mlflow_experiment_name: /tmp/59dfd08cf3fa50ec_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 9d1868e2-61ae-4041-af04-bf2ce6704f58
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 9d1868e2-61ae-4041-af04-bf2ce6704f58
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# a64d677d-c2ba-4d52-a26c-33c32b9b9eea
This model is a fine-tuned version of [unsloth/mistral-7b-instruct-v0.2](https://huggingface.co/unsloth/mistral-7b-instruct-v0.2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2857
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0001 | 1 | 6.2291 |
| 17.8844 | 0.0006 | 10 | 2.7177 |
| 3.6848 | 0.0012 | 20 | 0.6884 |
| 1.6636 | 0.0018 | 30 | 0.3703 |
| 1.4148 | 0.0024 | 40 | 0.2983 |
| 1.1832 | 0.0030 | 50 | 0.2857 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
sahirp/flux-model-abcasdg-1737873475 | sahirp | 2025-01-26T06:56:14Z | 10 | 0 | diffusers | [
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2025-01-26T06:56:13Z | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: NSTLST
---
# Flux Model Abcasdg 1737873475
<Gallery />
Trained on Replicate using:
https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `NSTLST` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch

# Load the FLUX.1-dev base pipeline in half precision on the GPU
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
# Attach the LoRA weights from this repository
pipeline.load_lora_weights('sahirp/flux-model-abcasdg-1737873475', weight_name='lora.safetensors')
# Include the trigger word `NSTLST` in your prompt (see "Trigger words" above)
image = pipeline('your prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
mrferr3t/2c380e99-20fb-432b-8b7e-f1623ec55b3c | mrferr3t | 2025-01-26T06:54:27Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/SmolLM2-1.7B",
"base_model:adapter:unsloth/SmolLM2-1.7B",
"license:apache-2.0",
"region:us"
] | null | 2025-01-26T06:52:17Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/SmolLM2-1.7B
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 2c380e99-20fb-432b-8b7e-f1623ec55b3c
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/SmolLM2-1.7B
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 99bea9d9584b8941_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/99bea9d9584b8941_train_data.json
type:
field_input: Language
field_instruction: Source
field_output: Content
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: mrferr3t/2c380e99-20fb-432b-8b7e-f1623ec55b3c
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/99bea9d9584b8941_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: cfffbcef-61a5-4b51-a7f0-a84dbedb4b10
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# 2c380e99-20fb-432b-8b7e-f1623ec55b3c
This model is a fine-tuned version of [unsloth/SmolLM2-1.7B](https://huggingface.co/unsloth/SmolLM2-1.7B) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1714
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.2315 | 0.0004 | 1 | 2.2117 |
| 2.3081 | 0.0011 | 3 | 2.2110 |
| 2.1334 | 0.0022 | 6 | 2.2039 |
| 2.2021 | 0.0033 | 9 | 2.1714 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.3.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.1 |
bbytxt/e6539317-aa81-4794-895b-2efd1c31870c | bbytxt | 2025-01-26T06:53:41Z | 8 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/Phi-3-mini-4k-instruct",
"base_model:adapter:unsloth/Phi-3-mini-4k-instruct",
"license:mit",
"region:us"
] | null | 2025-01-26T06:35:24Z | ---
library_name: peft
license: mit
base_model: unsloth/Phi-3-mini-4k-instruct
tags:
- axolotl
- generated_from_trainer
model-index:
- name: e6539317-aa81-4794-895b-2efd1c31870c
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/Phi-3-mini-4k-instruct
bf16: true
chat_template: llama3
data_processes: 16
dataset_prepared_path: null
datasets:
- data_files:
- a3dd61f7b8808059_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/a3dd61f7b8808059_train_data.json
type:
field_input: crime
field_instruction: prompt
field_output: region_specific_topic
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device_map: auto
do_eval: true
early_stopping_patience: 5
eval_batch_size: 4
eval_max_new_tokens: 128
eval_steps: 50
eval_table_size: null
evals_per_epoch: null
flash_attention: true
fp16: false
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: bbytxt/e6539317-aa81-4794-895b-2efd1c31870c
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0001
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 128
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 64
lora_target_linear: true
lr_scheduler: cosine
max_grad_norm: 1.0
max_memory:
0: 75GB
max_steps: 200
micro_batch_size: 8
mlflow_experiment_name: /tmp/a3dd61f7b8808059_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 3
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 50
saves_per_epoch: null
sequence_len: 1024
strict: false
tf32: true
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 33557250-eee1-4d50-96b4-101c27159b28
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 33557250-eee1-4d50-96b4-101c27159b28
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# e6539317-aa81-4794-895b-2efd1c31870c
This model is a fine-tuned version of [unsloth/Phi-3-mini-4k-instruct](https://huggingface.co/unsloth/Phi-3-mini-4k-instruct) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1652
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (8-bit, bitsandbytes) with betas=(0.9,0.999) and epsilon=1e-08; optimizer_args: adam_beta1=0.9, adam_beta2=0.95, adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 8.9069 | 0.0110 | 1 | 1.9166 |
| 1.1661 | 0.5479 | 50 | 0.2155 |
| 0.5705 | 1.0959 | 100 | 0.1798 |
| 0.3009 | 1.6438 | 150 | 0.1683 |
| 0.2587 | 2.1918 | 200 | 0.1652 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
lhong4759/680f722a-7bcb-4cf2-9c70-001e0fd1fc34 | lhong4759 | 2025-01-26T06:51:18Z | 8 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/mistral-7b-instruct-v0.2",
"base_model:adapter:unsloth/mistral-7b-instruct-v0.2",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T05:28:35Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/mistral-7b-instruct-v0.2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 680f722a-7bcb-4cf2-9c70-001e0fd1fc34
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/mistral-7b-instruct-v0.2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 59dfd08cf3fa50ec_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/59dfd08cf3fa50ec_train_data.json
type:
field_input: Title
field_instruction: Abstract
field_output: Hypothesis
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: lhong4759/680f722a-7bcb-4cf2-9c70-001e0fd1fc34
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/59dfd08cf3fa50ec_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 9d1868e2-61ae-4041-af04-bf2ce6704f58
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 9d1868e2-61ae-4041-af04-bf2ce6704f58
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 680f722a-7bcb-4cf2-9c70-001e0fd1fc34
This model is a fine-tuned version of [unsloth/mistral-7b-instruct-v0.2](https://huggingface.co/unsloth/mistral-7b-instruct-v0.2) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2697
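Only the LoRA adapter is published here. A minimal loading sketch, assuming the 8-bit setting tagged on this repo (`load_in_8bit: true` in the config above); this is not an official snippet:
```python
# Sketch: load the mistral base in 8-bit, as in the training config, then attach this adapter.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "unsloth/mistral-7b-instruct-v0.2"
adapter_id = "lhong4759/680f722a-7bcb-4cf2-9c70-001e0fd1fc34"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # mirrors load_in_8bit: true
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```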
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.0768 | 0.0120 | 200 | 0.2697 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
demohong/6a061503-596d-4442-8c79-dc8994eeb458 | demohong | 2025-01-26T06:51:16Z | 6 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/mistral-7b-instruct-v0.2",
"base_model:adapter:unsloth/mistral-7b-instruct-v0.2",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T05:28:09Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/mistral-7b-instruct-v0.2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 6a061503-596d-4442-8c79-dc8994eeb458
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/mistral-7b-instruct-v0.2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 59dfd08cf3fa50ec_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/59dfd08cf3fa50ec_train_data.json
type:
field_input: Title
field_instruction: Abstract
field_output: Hypothesis
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: demohong/6a061503-596d-4442-8c79-dc8994eeb458
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/59dfd08cf3fa50ec_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 9d1868e2-61ae-4041-af04-bf2ce6704f58
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 9d1868e2-61ae-4041-af04-bf2ce6704f58
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 6a061503-596d-4442-8c79-dc8994eeb458
This model is a fine-tuned version of [unsloth/mistral-7b-instruct-v0.2](https://huggingface.co/unsloth/mistral-7b-instruct-v0.2) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2701
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.059 | 0.0120 | 200 | 0.2701 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
denbeo/441d9684-85fc-4bcc-bb94-3ce544359592 | denbeo | 2025-01-26T06:51:08Z | 8 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/mistral-7b-instruct-v0.2",
"base_model:adapter:unsloth/mistral-7b-instruct-v0.2",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T05:27:38Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/mistral-7b-instruct-v0.2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 441d9684-85fc-4bcc-bb94-3ce544359592
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/mistral-7b-instruct-v0.2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 59dfd08cf3fa50ec_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/59dfd08cf3fa50ec_train_data.json
type:
field_input: Title
field_instruction: Abstract
field_output: Hypothesis
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: denbeo/441d9684-85fc-4bcc-bb94-3ce544359592
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/59dfd08cf3fa50ec_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 9d1868e2-61ae-4041-af04-bf2ce6704f58
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 9d1868e2-61ae-4041-af04-bf2ce6704f58
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 441d9684-85fc-4bcc-bb94-3ce544359592
This model is a fine-tuned version of [unsloth/mistral-7b-instruct-v0.2](https://huggingface.co/unsloth/mistral-7b-instruct-v0.2) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2694
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.0584 | 0.0120 | 200 | 0.2694 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
pritam0990/toxic_analizer | pritam0990 | 2025-01-26T06:50:09Z | 23 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-01-26T06:49:48Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
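In the absence of an official snippet, a minimal sketch assuming the standard 🤗 Transformers `pipeline` API and this repository's text-classification tag (the label set and any preprocessing are undocumented, and the example sentence is a placeholder):
```python
# Sketch: run the classifier through the generic text-classification pipeline.
from transformers import pipeline

classifier = pipeline("text-classification", model="pritam0990/toxic_analizer")
print(classifier("You are the worst person I have ever met."))  # placeholder input
```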
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
nat-hunt/b2c4f285-4c39-4778-b00e-084bc2e6f555 | nat-hunt | 2025-01-26T06:49:28Z | 6 | 0 | peft | [
"peft",
"safetensors",
"phi",
"axolotl",
"generated_from_trainer",
"base_model:microsoft/phi-2",
"base_model:adapter:microsoft/phi-2",
"license:mit",
"region:us"
] | null | 2025-01-26T06:48:35Z | ---
library_name: peft
license: mit
base_model: microsoft/phi-2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: b2c4f285-4c39-4778-b00e-084bc2e6f555
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: microsoft/phi-2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- 583c436b7dfe63c6_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/583c436b7dfe63c6_train_data.json
type:
field_input: description
field_instruction: name
field_output: symptoms
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 4
flash_attention: false
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: false
group_by_length: false
hub_model_id: nat-hunt/b2c4f285-4c39-4778-b00e-084bc2e6f555
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 10
micro_batch_size: 2
mlflow_experiment_name: /tmp/583c436b7dfe63c6_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 4
sequence_len: 512
special_tokens:
pad_token: <|endoftext|>
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 26cb4ebb-3df7-4431-a684-59b1c72c5755
wandb_project: Birthday-SN56-25-Gradients-On-Demand
wandb_run: your_name
wandb_runid: 26cb4ebb-3df7-4431-a684-59b1c72c5755
warmup_steps: 10
weight_decay: 0.0
xformers_attention: null
```
</details><br>
# b2c4f285-4c39-4778-b00e-084bc2e6f555
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 5.1097
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.4699 | 0.0083 | 1 | 5.2473 |
| 2.1857 | 0.0248 | 3 | 5.2668 |
| 2.2419 | 0.0496 | 6 | 5.2376 |
| 2.0303 | 0.0744 | 9 | 5.1097 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
infogep/1f743be0-0b33-46ea-9a17-177c4b9eede3 | infogep | 2025-01-26T06:49:25Z | 8 | 0 | peft | [
"peft",
"safetensors",
"gemma2",
"axolotl",
"generated_from_trainer",
"base_model:UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2",
"base_model:adapter:UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2",
"license:gemma",
"region:us"
] | null | 2025-01-26T05:28:07Z | ---
library_name: peft
license: gemma
base_model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 1f743be0-0b33-46ea-9a17-177c4b9eede3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- d9ae9af1d1d23889_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/d9ae9af1d1d23889_train_data.json
type:
field_instruction: input_persona
field_output: prompt
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device: cuda
early_stopping_patience: 1
eval_max_new_tokens: 128
eval_steps: 5
eval_table_size: null
evals_per_epoch: null
flash_attention: false
fp16: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: true
hub_model_id: infogep/1f743be0-0b33-46ea-9a17-177c4b9eede3
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_memory:
0: 79GiB
max_steps: 30
micro_batch_size: 4
mlflow_experiment_name: /tmp/d9ae9af1d1d23889_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optim_args:
adam_beta1: 0.9
adam_beta2: 0.95
adam_epsilon: 1e-5
optimizer: adamw_torch
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 10
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: true
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: d1ddd83d-3254-4f1a-93a9-98ee3250c38a
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: d1ddd83d-3254-4f1a-93a9-98ee3250c38a
warmup_steps: 5
weight_decay: 0.001
xformers_attention: true
```
</details><br>
# 1f743be0-0b33-46ea-9a17-177c4b9eede3
This model is a fine-tuned version of [UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2](https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9310
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: ADAMW_TORCH with betas=(0.9, 0.999), epsilon=1e-08, and optimizer args adam_beta1=0.9, adam_beta2=0.95, adam_epsilon=1e-5
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0001 | 1 | 1.6043 |
| 1.1522 | 0.0006 | 5 | 1.2816 |
| 1.0287 | 0.0011 | 10 | 1.0470 |
| 0.9 | 0.0017 | 15 | 0.9769 |
| 0.9113 | 0.0022 | 20 | 0.9441 |
| 0.9294 | 0.0028 | 25 | 0.9339 |
| 0.97 | 0.0034 | 30 | 0.9310 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
nadejdatarabukina/86ca614a-46d9-497a-a18a-8c0173ccd979 | nadejdatarabukina | 2025-01-26T06:49:02Z | 8 | 0 | peft | [
"peft",
"safetensors",
"llama",
"axolotl",
"generated_from_trainer",
"base_model:Korabbit/llama-2-ko-7b",
"base_model:adapter:Korabbit/llama-2-ko-7b",
"region:us"
] | null | 2025-01-26T03:12:32Z | ---
library_name: peft
base_model: Korabbit/llama-2-ko-7b
tags:
- axolotl
- generated_from_trainer
model-index:
- name: 86ca614a-46d9-497a-a18a-8c0173ccd979
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: Korabbit/llama-2-ko-7b
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- c9c324e8cf5586e6_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/c9c324e8cf5586e6_train_data.json
type:
field_instruction: instruction
field_output: output
format: '{instruction}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
device: cuda
early_stopping_patience: null
eval_max_new_tokens: 128
eval_steps: 5
eval_table_size: null
evals_per_epoch: null
flash_attention: false
fp16: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
group_by_length: false
hub_model_id: nadejdatarabukina/86ca614a-46d9-497a-a18a-8c0173ccd979
hub_repo: null
hub_strategy: checkpoint
hub_token: null
learning_rate: 0.0002
load_in_4bit: false
load_in_8bit: false
local_rank: null
logging_steps: 3
lora_alpha: 32
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 16
lora_target_linear: true
lr_scheduler: cosine
max_memory:
0: 75GiB
max_steps: 30
micro_batch_size: 2
mlflow_experiment_name: /tmp/c9c324e8cf5586e6_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_torch
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
save_steps: 10
sequence_len: 1024
special_tokens:
pad_token: </s>
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: true
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: d35b96a9-b8d1-49c0-b1a8-167bc6103694
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: d35b96a9-b8d1-49c0-b1a8-167bc6103694
warmup_steps: 10
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# 86ca614a-46d9-497a-a18a-8c0173ccd979
This model is a fine-tuned version of [Korabbit/llama-2-ko-7b](https://huggingface.co/Korabbit/llama-2-ko-7b) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1883
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_TORCH with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 10
- training_steps: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log | 0.0000 | 1 | 1.7269 |
| 1.6654 | 0.0001 | 5 | 1.6055 |
| 1.4559 | 0.0003 | 10 | 1.3272 |
| 1.2095 | 0.0004 | 15 | 1.2338 |
| 1.2566 | 0.0006 | 20 | 1.2087 |
| 1.1839 | 0.0007 | 25 | 1.1912 |
| 1.1416 | 0.0008 | 30 | 1.1883 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |
AMindToThink/gemma-2-2b_RMU_s200_a300_layer7 | AMindToThink | 2025-01-26T06:45:51Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T06:43:30Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
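No snippet is provided; a minimal sketch assuming the standard text-generation `pipeline` API (the prompt is a placeholder, and the effect of the RMU run named in the repo id is not documented here):
```python
# Sketch: generate text with the generic pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="AMindToThink/gemma-2-2b_RMU_s200_a300_layer7")
print(generator("Explain what a hash table is.", max_new_tokens=64)[0]["generated_text"])
```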
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ebrukilic/bert-absa-tr | ebrukilic | 2025-01-26T06:43:31Z | 8 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-01-26T06:43:13Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
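No snippet is provided; a minimal sketch using the explicit sequence-classification classes (the repo name suggests Turkish aspect-based sentiment analysis, but the label set is undocumented and the sample sentence is a placeholder):
```python
# Sketch: score a sentence with the raw sequence-classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "ebrukilic/bert-absa-tr"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Kumaş kalitesi çok iyi ama kargo yavaştı.", return_tensors="pt")  # placeholder input
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```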
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
rl-llm-coders/iSFT_8b_v1_mbpp_5e-7_DBS1_ep4_iter1 | rl-llm-coders | 2025-01-26T06:42:15Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-01-26T06:36:00Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
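No snippet is provided; a minimal sketch assuming the standard causal-LM API (the coding prompt is a placeholder suggested by the MBPP reference in the repo id; dtype and device placement are assumptions):
```python
# Sketch: plain greedy generation with the causal-LM classes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rl-llm-coders/iSFT_8b_v1_mbpp_5e-7_DBS1_ep4_iter1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Write a Python function that checks whether a number is prime.", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```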
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
thalllsssss/d4c91862-27fb-44fe-ad47-3516c1cd53a8 | thalllsssss | 2025-01-26T06:38:49Z | 9 | 0 | peft | [
"peft",
"safetensors",
"mistral",
"axolotl",
"generated_from_trainer",
"base_model:unsloth/zephyr-sft",
"base_model:adapter:unsloth/zephyr-sft",
"license:apache-2.0",
"8-bit",
"bitsandbytes",
"region:us"
] | null | 2025-01-26T05:57:56Z | ---
library_name: peft
license: apache-2.0
base_model: unsloth/zephyr-sft
tags:
- axolotl
- generated_from_trainer
model-index:
- name: d4c91862-27fb-44fe-ad47-3516c1cd53a8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
```yaml
adapter: lora
base_model: unsloth/zephyr-sft
bf16: auto
chat_template: llama3
dataset_prepared_path: null
datasets:
- data_files:
- f736f3bd8f945056_train_data.json
ds_type: json
format: custom
path: /workspace/input_data/f736f3bd8f945056_train_data.json
type:
field_input: text
field_instruction: instruction
field_output: output
format: '{instruction} {input}'
no_input_format: '{instruction}'
system_format: '{system}'
system_prompt: ''
debug: null
deepspeed: null
early_stopping_patience: null
eval_max_new_tokens: 128
eval_table_size: null
evals_per_epoch: 1
flash_attention: true
fp16: null
fsdp: null
fsdp_config: null
gradient_accumulation_steps: 4
gradient_checkpointing: true
gradient_clipping: 1.0
group_by_length: false
hub_model_id: thalllsssss/d4c91862-27fb-44fe-ad47-3516c1cd53a8
hub_repo: null
hub_strategy: end
hub_token: null
learning_rate: 5.0e-05
load_in_4bit: true
load_in_8bit: true
local_rank: null
logging_steps: 1
lora_alpha: 16
lora_dropout: 0.05
lora_fan_in_fan_out: null
lora_model_dir: null
lora_r: 8
lora_target_linear: true
lr_scheduler: cosine
max_steps: 200
micro_batch_size: 2
mlflow_experiment_name: /tmp/f736f3bd8f945056_train_data.json
model_type: AutoModelForCausalLM
num_epochs: 1
optimizer: adamw_bnb_8bit
output_dir: miner_id_24
pad_to_sequence_len: true
resume_from_checkpoint: null
s2_attention: null
sample_packing: false
saves_per_epoch: 1
sequence_len: 1024
strict: false
tf32: false
tokenizer_type: AutoTokenizer
train_on_inputs: false
trust_remote_code: true
val_set_size: 0.05
wandb_entity: null
wandb_mode: online
wandb_name: 235ea562-3572-46e2-aa40-c58385a132c8
wandb_project: Gradients-On-Demand
wandb_run: your_name
wandb_runid: 235ea562-3572-46e2-aa40-c58385a132c8
warmup_steps: 5
weight_decay: 0.01
xformers_attention: true
```
</details><br>
# d4c91862-27fb-44fe-ad47-3516c1cd53a8
This model is a fine-tuned version of [unsloth/zephyr-sft](https://huggingface.co/unsloth/zephyr-sft) on an unnamed dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2331
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: ADAMW_BNB with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.9133 | 0.0638 | 200 | 0.2331 |
### Framework versions
- PEFT 0.13.2
- Transformers 4.46.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1 |