modelId (string, 5 to 139 chars) | author (string, 2 to 42 chars) | last_modified (timestamp[us, tz=UTC], 2020-02-15 11:33:14 to 2025-07-13 06:28:01) | downloads (int64, 0 to 223M) | likes (int64, 0 to 11.7k) | library_name (string, 518 distinct values) | tags (list, 1 to 4.05k items) | pipeline_tag (string, 55 distinct values) | createdAt (timestamp[us, tz=UTC], 2022-03-02 23:29:04 to 2025-07-13 06:25:04) | card (string, 11 to 1.01M chars)
---|---|---|---|---|---|---|---|---|---|
couldnt-find-good-name/aa | couldnt-find-good-name | 2023-09-18T20:35:59Z | 0 | 0 | null | [
"license:cc-by-4.0",
"region:us"
]
| null | 2023-09-18T19:58:25Z | ---
license: cc-by-4.0
---
Model Name: aa
1. fp16/cleaned - smaller size, same result.
2. VAE baked in
3. Fixed CLIP
**[Original checkpoint link](https://civitai.com/models/140778)** *(all rights to the model belong to [bhjjjjhh](https://civitai.com/user/bhjjjjhh))* |
CyberHarem/tsukimi_eiko_paripikoumei | CyberHarem | 2023-09-18T20:28:49Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/tsukimi_eiko_paripikoumei",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T20:07:26Z | ---
license: mit
datasets:
- CyberHarem/tsukimi_eiko_paripikoumei
pipeline_tag: text-to-image
tags:
- art
---
# Lora of tsukimi_eiko_paripikoumei
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion); the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the chosen step, you need to use them together: the pt file serves as an embedding, while the safetensors file is loaded as the LoRA.
For example, to use the model from step 10800, download `10800/tsukimi_eiko_paripikoumei.pt` as the embedding and `10800/tsukimi_eiko_paripikoumei.safetensors` for the LoRA. Using both files together, you can generate images of the desired character.
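As a rough illustration, the two files might be wired up with `diffusers` like this (a sketch only; whether HCP-Diffusion's pt/safetensors files load directly through these helpers depends on their format, and the step-10800 paths come from the table below):
```python
import torch
from diffusers import StableDiffusionPipeline

# Base model used for the preview images in this card
pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11", torch_dtype=torch.float16
).to("cuda")

# pt file as a textual-inversion embedding, safetensors file as the LoRA
pipe.load_textual_inversion("10800/tsukimi_eiko_paripikoumei.pt", token="tsukimi_eiko_paripikoumei")
pipe.load_lora_weights("10800/tsukimi_eiko_paripikoumei.safetensors")

image = pipe(
    "masterpiece, tsukimi_eiko_paripikoumei, blonde_hair, twin_braids",
    num_inference_steps=28,
).images[0]
image.save("preview.png")
```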
**The best step we recommend is 10800**, with a score of 0.878. The trigger words are:
1. `tsukimi_eiko_paripikoumei`
2. `blonde_hair, long_hair, braid, twin_braids, hat, baseball_cap, bangs, blue_eyes, blunt_bangs, black_headwear, open_mouth`
We regret that this model is not recommended for the following groups:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | pattern_12 | pattern_13 | pattern_14 | pattern_15 | pattern_16 | pattern_17 | pattern_18 | pattern_19 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:----------|:----------|:----------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:---------------------------------------------------|:-------------------------------------------|:---------------------------------------------------|:---------------------------------------|:---------------------------------------|:---------------------------------------|:------------------------------------------------|:-------------------------------------------------|:---------------------------------------|:-------------------------------------------|
| **10800** | **0.878** | [**Download**](10800/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](10800/previews/bondage.png) |  |  |  | [<NSFW, click to see>](10800/previews/nude.png) | [<NSFW, click to see>](10800/previews/nude2.png) |  |  |
| 10080 | 0.865 | [Download](10080/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](10080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](10080/previews/nude.png) | [<NSFW, click to see>](10080/previews/nude2.png) |  |  |
| 9360 | 0.848 | [Download](9360/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](9360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](9360/previews/nude.png) | [<NSFW, click to see>](9360/previews/nude2.png) |  |  |
| 8640 | 0.856 | [Download](8640/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](8640/previews/bondage.png) |  |  |  | [<NSFW, click to see>](8640/previews/nude.png) | [<NSFW, click to see>](8640/previews/nude2.png) |  |  |
| 7920 | 0.845 | [Download](7920/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](7920/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7920/previews/nude.png) | [<NSFW, click to see>](7920/previews/nude2.png) |  |  |
| 7200 | 0.867 | [Download](7200/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](7200/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7200/previews/nude.png) | [<NSFW, click to see>](7200/previews/nude2.png) |  |  |
| 6480 | 0.864 | [Download](6480/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](6480/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6480/previews/nude.png) | [<NSFW, click to see>](6480/previews/nude2.png) |  |  |
| 5760 | 0.860 | [Download](5760/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5760/previews/nude.png) | [<NSFW, click to see>](5760/previews/nude2.png) |  |  |
| 5040 | 0.827 | [Download](5040/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5040/previews/nude.png) | [<NSFW, click to see>](5040/previews/nude2.png) |  |  |
| 4320 | 0.834 | [Download](4320/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4320/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4320/previews/nude.png) | [<NSFW, click to see>](4320/previews/nude2.png) |  |  |
| 3600 | 0.810 | [Download](3600/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3600/previews/nude.png) | [<NSFW, click to see>](3600/previews/nude2.png) |  |  |
| 2880 | 0.812 | [Download](2880/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2880/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2880/previews/nude.png) | [<NSFW, click to see>](2880/previews/nude2.png) |  |  |
| 2160 | 0.832 | [Download](2160/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2160/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2160/previews/nude.png) | [<NSFW, click to see>](2160/previews/nude2.png) |  |  |
| 1440 | 0.740 | [Download](1440/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1440/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1440/previews/nude.png) | [<NSFW, click to see>](1440/previews/nude2.png) |  |  |
| 720 | 0.700 | [Download](720/tsukimi_eiko_paripikoumei.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](720/previews/nude.png) | [<NSFW, click to see>](720/previews/nude2.png) |  |  |
|
ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-second_run | ShivamMangale | 2023-09-18T20:24:10Z | 116 | 0 | transformers | [
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"base_model:ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-second_run",
"base_model:finetune:ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-second_run",
"license:mit",
"endpoints_compatible",
"region:us"
]
| question-answering | 2023-09-18T20:06:30Z | ---
license: mit
base_model: ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-second_run
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-second_run
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-second_run
This model is a fine-tuned version of [ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-second_run](https://huggingface.co/ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-second_run) on the squad dataset.
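A minimal usage sketch (assuming the checkpoint works with the standard `question-answering` pipeline; the question and context are illustrative):
```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-second_run",
)
answer = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(answer["answer"], answer["score"])
```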
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
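For reference, these settings correspond roughly to the following `TrainingArguments` (a sketch; `output_dir` is an arbitrary placeholder, and the optimizer settings above are the library defaults):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlmr-squad-finetune",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```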
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
|
latimar/Phind-Codellama-34B-v2-exl2 | latimar | 2023-09-18T20:17:19Z | 20 | 16 | transformers | [
"transformers",
"llama",
"text-generation",
"license:llama2",
"autotrain_compatible",
"region:us"
]
| text-generation | 2023-09-13T21:38:09Z | ---
base_model: https://huggingface.co/Phind/Phind-CodeLlama-34B-v2
inference: false
license: llama2
model_creator: https://huggingface.co/Phind
model_name: Phind-Codellama-34B-v2
model_type: llama
quantized_by: latimar
---
# Phind-CodeLlama-34B-v2 EXL2
Weights of [Phind-CodeLlama-34B-v2](https://huggingface.co/Phind/Phind-CodeLlama-34B-v2) converted
to [EXL2](https://github.com/turboderp/exllamav2#exl2-quantization) format.
Each quant lives in a separate branch, as in TheBloke's GPTQ repos.
```
export BRANCH=5_0-bpw-h8
git clone --single-branch --branch ${BRANCH} https://huggingface.co/latimar/Phind-Codellama-34B-v2-exl2
```
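Alternatively, a single branch can be fetched with `huggingface_hub` (a sketch; `local_dir` is an arbitrary choice):
```python
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="latimar/Phind-Codellama-34B-v2-exl2",
    revision="5_0-bpw-h8",  # the branch holding the desired quant
    local_dir="Phind-Codellama-34B-v2-exl2-5_0bpw",  # arbitrary local path
)
```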
The following branches are available:
```
5_0-bpw-h8
5_0-bpw-h8-evol-ins
4_625-bpw-h6
4_4-bpw-h8
4_125-bpw-h6
3_8-bpw-h6
2_75-bpw-h6
2_55-bpw-h6
```
* Calibration dataset used for conversion: [wikitext-v2](https://huggingface.co/datasets/wikitext/blob/refs%2Fconvert%2Fparquet/wikitext-2-v1/test/0000.parquet)
* Evaluation dataset used to calculate perplexity: [wikitext-v2](https://huggingface.co/datasets/wikitext/blob/refs%2Fconvert%2Fparquet/wikitext-2-v1/validation/0000.parquet)
* Calibration dataset used for conversion of `5_0-bpw-h8-evol-ins`: [wizardLM-evol-instruct_70k](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_70k/blob/refs%2Fconvert%2Fparquet/default/train/0000.parquet)
* Evaluation dataset used to calculate PPL for `Evol-Ins`: [nickrosh-evol-instruct](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1/blob/refs%2Fconvert%2Fparquet/default/train/0000.parquet)
* When converting the `4_4-bpw-h8` quant, an additional `-mr 32` argument was used.
PPL was measured with the [test_inference.py exllamav2 script](https://github.com/turboderp/exllamav2/blob/master/test_inference.py):
```
python test_inference.py -m /storage/models/LLaMA/EXL2/Phind-Codellama-34B-v2 -ed /storage/datasets/text/evol-instruct/nickrosh-evol-instruct-code-80k.parquet
```
| BPW | PPL on Wiki | PPL on Evol-Ins | File Size (GB) |
| ----------- | ----------- | --------------- | -------------- |
| 2.55-h6 | 11.0310 | 2.4542 | 10.56 |
| 2.75-h6 | 9.7902 | 2.2888 | 11.33 |
| 3.8-h6 | 6.7293 | 2.0724 | 15.37 |
| 4.125-h6 | 6.6713 | 2.0617 | 16.65 |
| 4.4-h8 | 6.6487 | 2.0509 | 17.76 |
| 4.625-h6 | 6.6576 | 2.0459 | 18.58 |
| 5.0-h8 | 6.6379 | 2.0419 | 20.09 |
| 5.0-h8-ev | 6.7785 | 2.0445 | 20.09 |
|
K00B404/CodeLlama-7B-Instruct-bf16-sharded-ft-v0_01 | K00B404 | 2023-09-18T20:14:22Z | 0 | 0 | peft | [
"peft",
"dataset:nickrosh/Evol-Instruct-Code-80k-v1",
"license:afl-3.0",
"region:us"
]
| null | 2023-09-18T18:10:08Z | ---
library_name: peft
license: afl-3.0
datasets:
- nickrosh/Evol-Instruct-Code-80k-v1
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
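In code, this corresponds roughly to the following `BitsAndBytesConfig` (a sketch reconstructed from the list above):
```python
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
```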
### Framework versions
- PEFT 0.6.0.dev0
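To use the adapter, it can presumably be loaded on top of the base model with `peft` (a sketch; the base model name is taken from the notebook below):
```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "TinyPixel/CodeLlama-7B-Instruct-bf16-sharded",
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "K00B404/CodeLlama-7B-Instruct-bf16-sharded-ft-v0_01")
```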
"""
Original file is located at
https://colab.research.google.com/drive/1yH0ov1ZDpun6yGi19zE07jkF_EUMI1Bf
**Code Credit: Hugging Face**
**Dataset Credit: https://twitter.com/Dorialexander/status/1681671177696161794**
## Finetune Llama-2-7b on a Google colab
Welcome to this Google Colab notebook that shows how to fine-tune the recent CodeLlama-7B-Instruct model on a single Google Colab and turn it into a chatbot.
We will leverage the PEFT library from the Hugging Face ecosystem, as well as QLoRA, for more memory-efficient fine-tuning.
## Setup
Run the cells below to set up and install the required libraries. For our experiment, we will need `accelerate`, `peft`, `transformers`, `datasets`, and `trl` to leverage the recent [`SFTTrainer`](https://huggingface.co/docs/trl/main/en/sft_trainer). We will use `bitsandbytes` to [quantize the base model into 4bit](https://huggingface.co/blog/4bit-transformers-bitsandbytes). We will also install `einops` as it is a requirement to load Falcon models.
"""
!pip install -q -U trl transformers accelerate git+https://github.com/huggingface/peft.git
!pip install -q datasets bitsandbytes einops wandb
"""## Dataset
login huggingface
"""
import wandb
!wandb login
# Initialize a Weights & Biases run (project and run names are placeholders)
wandb.init(project="<project_name>", name="<name>")
# login with API
from huggingface_hub import login
login()
from datasets import load_dataset
# dataset_name = "timdettmers/openassistant-guanaco"  # uses the ###Human / ###Assistant format
dataset_name = "nickrosh/Evol-Instruct-Code-80k-v1"
# dataset_name = 'AlexanderDoria/novel17_test'  # French novels
dataset = load_dataset(dataset_name, split="train")
"""## Loading the model"""
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
#model_name = "TinyPixel/Llama-2-7B-bf16-sharded"
#model_name = "abhinand/Llama-2-7B-bf16-sharded-512MB"
model_name= "TinyPixel/CodeLlama-7B-Instruct-bf16-sharded"
bnb_config = BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_quant_type="nf4",
bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
model_name,
quantization_config=bnb_config,
trust_remote_code=True
)
model.config.use_cache = False
"""Let's also load the tokenizer below"""
inputs = tokenizer(text, return_tensors="pt", padding="max_length", max_length=max_seq_length, truncation=True).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
from peft import LoraConfig, get_peft_model
lora_alpha = 16
lora_dropout = 0.1
lora_r = 64
peft_config = LoraConfig(
lora_alpha=lora_alpha,
lora_dropout=lora_dropout,
r=lora_r,
bias="none",
task_type="CAUSAL_LM"
)
"""## Loading the trainer
Here we will use the [`SFTTrainer` from the TRL library](https://huggingface.co/docs/trl/main/en/sft_trainer), which provides a wrapper around the transformers `Trainer` to easily fine-tune models on instruction-based datasets using PEFT adapters. Let's first load the training arguments below.
"""
from transformers import TrainingArguments
output_dir = "./results"
per_device_train_batch_size = 4
gradient_accumulation_steps = 4
optim = "paged_adamw_32bit"
save_steps = 100
logging_steps = 10
learning_rate = 2e-4
max_grad_norm = 0.3
max_steps = 100
warmup_ratio = 0.03
lr_scheduler_type = "constant"
training_arguments = TrainingArguments(
output_dir=output_dir,
per_device_train_batch_size=per_device_train_batch_size,
gradient_accumulation_steps=gradient_accumulation_steps,
optim=optim,
save_steps=save_steps,
logging_steps=logging_steps,
learning_rate=learning_rate,
fp16=True,
max_grad_norm=max_grad_norm,
max_steps=max_steps,
warmup_ratio=warmup_ratio,
group_by_length=True,
lr_scheduler_type=lr_scheduler_type,
)
"""Then finally pass everthing to the trainer"""
from trl import SFTTrainer
max_seq_length = 512
trainer = SFTTrainer(
model=model,
train_dataset=dataset,
peft_config=peft_config,
dataset_text_field="output",
max_seq_length=max_seq_length,
tokenizer=tokenizer,
args=training_arguments,
)
"""We will also pre-process the model by upcasting the layer norms in float 32 for more stable training"""
for name, module in trainer.model.named_modules():
if "norm" in name:
module = module.to(torch.float32)
"""## Train the model
You're using a LlamaTokenizerFast tokenizer. Please note that with a fast tokenizer, using the `__call__` method is faster than using a method to encode the text followed by a call to the `pad` method to get a padded encoding.
Now let's train the model! Simply call `trainer.train()`
"""
trainer.train()
"""During training, the model should converge nicely as follows:
The `SFTTrainer` also takes care of properly saving only the adapters during training instead of saving the entire model.
"""
model_to_save = trainer.model.module if hasattr(trainer.model, 'module') else trainer.model # Take care of distributed/parallel training
model_to_save.save_pretrained("outputs")
lora_config = LoraConfig.from_pretrained('outputs')
model = get_peft_model(model, lora_config)
dataset['output']  # inspect the text field used for fine-tuning
text = "make a advanced python script to finetune a llama2-7b-bf16-sharded model with accelerator and qlora"
device = "cuda:0"
inputs = tokenizer(text, return_tensors="pt", padding="max_length", max_length=max_seq_length, truncation=True).to(device)
#inputs = tokenizer(text, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
model.push_to_hub("K00B404/CodeLlama-7B-Instruct-bf16-sharded-ft-v0_01", use_auth_token="<HUGGINGFACE_WRITE_API_TOKEN>")
|
ironchanchellor/segformer-b0_DsMetalDam_Base | ironchanchellor | 2023-09-18T20:13:43Z | 31 | 0 | transformers | [
"transformers",
"pytorch",
"segformer",
"generated_from_trainer",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
]
| null | 2023-09-18T15:28:03Z | ---
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_trainer
model-index:
- name: segformer-b0_DsMetalDam_Base
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0_DsMetalDam_Base
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7243
- Mean Iou: 0.3715
- Mean Accuracy: 0.4884
- Overall Accuracy: 0.6415
- Accuracy Matrix: 0.4372
- Accuracy Martensite/austenite: 0.7995
- Accuracy Precipitates: 0.5009
- Accuracy Austenite: 0.0
- Accuracy Defects: 0.7044
- Iou Matrix: 0.3363
- Iou Martensite/austenite: 0.5675
- Iou Precipitates: 0.3875
- Iou Austenite: 0.0
- Iou Defects: 0.5662
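A minimal inference sketch (assuming the image processor config ships with the repo; `micrograph.png` is a placeholder input):
```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

processor = SegformerImageProcessor.from_pretrained("ironchanchellor/segformer-b0_DsMetalDam_Base")
model = SegformerForSemanticSegmentation.from_pretrained("ironchanchellor/segformer-b0_DsMetalDam_Base")

image = Image.open("micrograph.png").convert("RGB")  # placeholder micrograph
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)
pred = logits.argmax(dim=1)[0]  # per-pixel class ids
```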
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Matrix | Accuracy Martensite/austenite | Accuracy Precipitates | Accuracy Austenite | Accuracy Defects | Iou Matrix | Iou Martensite/austenite | Iou Precipitates | Iou Austenite | Iou Defects |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------:|:-----------------------------:|:---------------------:|:------------------:|:----------------:|:----------:|:------------------------:|:----------------:|:-------------:|:-----------:|
| 1.5512 | 1.0 | 9 | 1.5883 | 0.1292 | 0.3692 | 0.3892 | 0.0312 | 0.6387 | 0.3358 | 0.0010 | 0.8395 | 0.0301 | 0.4325 | 0.1640 | 0.0002 | 0.0191 |
| 1.4729 | 2.0 | 18 | 1.5818 | 0.1485 | 0.4477 | 0.3823 | 0.2816 | 0.4134 | 0.6955 | 0.0072 | 0.8409 | 0.2069 | 0.3228 | 0.1757 | 0.0019 | 0.0350 |
| 1.3822 | 3.0 | 27 | 1.5274 | 0.1774 | 0.4217 | 0.4554 | 0.3343 | 0.5150 | 0.6734 | 0.0 | 0.5855 | 0.2366 | 0.3836 | 0.2053 | 0.0 | 0.0614 |
| 1.3561 | 4.0 | 36 | 1.4416 | 0.2053 | 0.4511 | 0.5040 | 0.2959 | 0.6353 | 0.6106 | 0.0 | 0.7138 | 0.2206 | 0.4440 | 0.2308 | 0.0 | 0.1310 |
| 1.2811 | 5.0 | 45 | 1.2949 | 0.2177 | 0.3940 | 0.5318 | 0.3180 | 0.6907 | 0.4486 | 0.0 | 0.5126 | 0.2309 | 0.4688 | 0.3068 | 0.0 | 0.0821 |
| 1.3093 | 6.0 | 54 | 1.2194 | 0.2652 | 0.3830 | 0.5376 | 0.4339 | 0.6207 | 0.4630 | 0.0 | 0.3974 | 0.2821 | 0.4406 | 0.3113 | 0.0 | 0.2920 |
| 1.2201 | 7.0 | 63 | 1.1818 | 0.2730 | 0.4086 | 0.5513 | 0.3284 | 0.7097 | 0.5346 | 0.0 | 0.4704 | 0.2397 | 0.4809 | 0.3131 | 0.0 | 0.3314 |
| 1.1416 | 8.0 | 72 | 1.0888 | 0.2763 | 0.4058 | 0.5581 | 0.2490 | 0.7793 | 0.5133 | 0.0 | 0.4872 | 0.1947 | 0.5074 | 0.3202 | 0.0 | 0.3590 |
| 1.1918 | 9.0 | 81 | 1.0642 | 0.2553 | 0.3626 | 0.5518 | 0.3424 | 0.7074 | 0.4990 | 0.0 | 0.2641 | 0.2466 | 0.4790 | 0.3103 | 0.0 | 0.2407 |
| 1.1329 | 10.0 | 90 | 1.1006 | 0.2774 | 0.3969 | 0.5567 | 0.3496 | 0.7069 | 0.5249 | 0.0 | 0.4029 | 0.2515 | 0.4821 | 0.3278 | 0.0 | 0.3257 |
| 1.1162 | 11.0 | 99 | 1.0095 | 0.2922 | 0.4203 | 0.5723 | 0.2232 | 0.8250 | 0.4859 | 0.0 | 0.5673 | 0.1812 | 0.5263 | 0.3469 | 0.0 | 0.4069 |
| 1.0042 | 12.0 | 108 | 1.0160 | 0.2978 | 0.4552 | 0.5546 | 0.3669 | 0.6829 | 0.5732 | 0.0 | 0.6532 | 0.2592 | 0.4749 | 0.3367 | 0.0 | 0.4180 |
| 1.0071 | 13.0 | 117 | 0.9690 | 0.2925 | 0.4422 | 0.5536 | 0.4011 | 0.6673 | 0.4835 | 0.0 | 0.6592 | 0.2718 | 0.4675 | 0.3417 | 0.0 | 0.3817 |
| 1.0304 | 14.0 | 126 | 1.0249 | 0.2920 | 0.4572 | 0.5524 | 0.3451 | 0.6875 | 0.6351 | 0.0 | 0.6183 | 0.2509 | 0.4773 | 0.3222 | 0.0 | 0.4094 |
| 1.039 | 15.0 | 135 | 0.9615 | 0.2904 | 0.4197 | 0.5725 | 0.2344 | 0.8151 | 0.5130 | 0.0 | 0.5360 | 0.1881 | 0.5253 | 0.3519 | 0.0 | 0.3867 |
| 1.007 | 16.0 | 144 | 0.9758 | 0.2907 | 0.4127 | 0.5601 | 0.4127 | 0.6640 | 0.5729 | 0.0 | 0.4138 | 0.2829 | 0.4700 | 0.3569 | 0.0 | 0.3436 |
| 1.0463 | 17.0 | 153 | 0.9280 | 0.2860 | 0.3982 | 0.5752 | 0.2554 | 0.8111 | 0.4686 | 0.0 | 0.4557 | 0.2013 | 0.5263 | 0.3404 | 0.0 | 0.3620 |
| 0.9771 | 18.0 | 162 | 0.9077 | 0.2810 | 0.4001 | 0.5748 | 0.2579 | 0.7988 | 0.5676 | 0.0 | 0.3763 | 0.2058 | 0.5221 | 0.3567 | 0.0 | 0.3203 |
| 0.9141 | 19.0 | 171 | 0.9507 | 0.3000 | 0.4356 | 0.5642 | 0.3924 | 0.6815 | 0.5984 | 0.0 | 0.5056 | 0.2759 | 0.4798 | 0.3552 | 0.0 | 0.3890 |
| 0.9758 | 20.0 | 180 | 0.8953 | 0.2964 | 0.4059 | 0.5816 | 0.2442 | 0.8337 | 0.4340 | 0.0 | 0.5179 | 0.1968 | 0.5341 | 0.3422 | 0.0 | 0.4088 |
| 1.0676 | 21.0 | 189 | 0.8793 | 0.3041 | 0.4422 | 0.5812 | 0.2432 | 0.8160 | 0.5891 | 0.0 | 0.5625 | 0.1982 | 0.5313 | 0.3745 | 0.0 | 0.4164 |
| 0.8866 | 22.0 | 198 | 0.8613 | 0.3124 | 0.4256 | 0.5823 | 0.2707 | 0.8095 | 0.4947 | 0.0 | 0.5530 | 0.2126 | 0.5289 | 0.3716 | 0.0 | 0.4489 |
| 0.8624 | 23.0 | 207 | 0.8735 | 0.3106 | 0.4416 | 0.5821 | 0.2713 | 0.8009 | 0.5649 | 0.0 | 0.5707 | 0.2147 | 0.5278 | 0.3751 | 0.0 | 0.4355 |
| 0.8676 | 24.0 | 216 | 0.8519 | 0.3187 | 0.4479 | 0.5862 | 0.3065 | 0.7857 | 0.5459 | 0.0 | 0.6015 | 0.2376 | 0.5233 | 0.3866 | 0.0 | 0.4462 |
| 0.8489 | 25.0 | 225 | 0.8516 | 0.3187 | 0.4746 | 0.5896 | 0.2451 | 0.8259 | 0.6131 | 0.0 | 0.6888 | 0.2024 | 0.5391 | 0.3933 | 0.0 | 0.4589 |
| 0.8708 | 26.0 | 234 | 0.8488 | 0.3121 | 0.4236 | 0.5916 | 0.2953 | 0.8039 | 0.5542 | 0.0 | 0.4647 | 0.2323 | 0.5335 | 0.3924 | 0.0 | 0.4021 |
| 0.8078 | 27.0 | 243 | 0.8410 | 0.3330 | 0.4572 | 0.5829 | 0.4437 | 0.6844 | 0.5492 | 0.0 | 0.6089 | 0.3065 | 0.4887 | 0.3922 | 0.0 | 0.4777 |
| 0.8513 | 28.0 | 252 | 0.8205 | 0.3188 | 0.4390 | 0.5957 | 0.2030 | 0.8782 | 0.5128 | 0.0 | 0.6012 | 0.1741 | 0.5561 | 0.3836 | 0.0 | 0.4802 |
| 0.8317 | 29.0 | 261 | 0.8084 | 0.3317 | 0.4579 | 0.5963 | 0.3531 | 0.7764 | 0.4966 | 0.0 | 0.6632 | 0.2664 | 0.5272 | 0.3877 | 0.0 | 0.4770 |
| 0.7976 | 30.0 | 270 | 0.8268 | 0.3117 | 0.4407 | 0.5906 | 0.2848 | 0.8007 | 0.6299 | 0.0 | 0.4881 | 0.2297 | 0.5338 | 0.3736 | 0.0 | 0.4214 |
| 0.8552 | 31.0 | 279 | 0.7956 | 0.3069 | 0.4015 | 0.6010 | 0.2424 | 0.8695 | 0.4473 | 0.0 | 0.4483 | 0.2027 | 0.5562 | 0.3652 | 0.0 | 0.4101 |
| 0.7385 | 32.0 | 288 | 0.8094 | 0.3373 | 0.4721 | 0.5971 | 0.3541 | 0.7666 | 0.5948 | 0.0 | 0.6451 | 0.2704 | 0.5258 | 0.3936 | 0.0 | 0.4968 |
| 0.836 | 33.0 | 297 | 0.7977 | 0.3356 | 0.4574 | 0.6032 | 0.3030 | 0.8197 | 0.5372 | 0.0 | 0.6270 | 0.2422 | 0.5446 | 0.3971 | 0.0 | 0.4943 |
| 0.8446 | 34.0 | 306 | 0.7899 | 0.3320 | 0.4440 | 0.6061 | 0.2780 | 0.8466 | 0.5007 | 0.0 | 0.5946 | 0.2276 | 0.5538 | 0.3915 | 0.0 | 0.4869 |
| 0.8729 | 35.0 | 315 | 0.7816 | 0.3143 | 0.4173 | 0.6069 | 0.2157 | 0.8933 | 0.4914 | 0.0 | 0.4863 | 0.1877 | 0.5653 | 0.3886 | 0.0 | 0.4300 |
| 0.7898 | 36.0 | 324 | 0.7840 | 0.3259 | 0.4396 | 0.6064 | 0.2081 | 0.8961 | 0.4941 | 0.0 | 0.5999 | 0.1821 | 0.5662 | 0.3853 | 0.0 | 0.4959 |
| 0.7399 | 37.0 | 333 | 0.7779 | 0.3388 | 0.4548 | 0.6066 | 0.3259 | 0.8091 | 0.5485 | 0.0 | 0.5904 | 0.2579 | 0.5433 | 0.4009 | 0.0 | 0.4918 |
| 0.8473 | 38.0 | 342 | 0.7826 | 0.3348 | 0.4439 | 0.6089 | 0.2996 | 0.8394 | 0.4747 | 0.0 | 0.6055 | 0.2416 | 0.5543 | 0.3802 | 0.0 | 0.4979 |
| 0.7406 | 39.0 | 351 | 0.7963 | 0.3381 | 0.4876 | 0.6033 | 0.2974 | 0.8151 | 0.6048 | 0.0 | 0.7205 | 0.2418 | 0.5462 | 0.3815 | 0.0 | 0.5208 |
| 0.7831 | 40.0 | 360 | 0.7770 | 0.3403 | 0.4654 | 0.6100 | 0.2840 | 0.8465 | 0.5195 | 0.0 | 0.6769 | 0.2339 | 0.5561 | 0.3951 | 0.0 | 0.5165 |
| 0.9 | 41.0 | 369 | 0.7704 | 0.3372 | 0.4593 | 0.6103 | 0.2678 | 0.8581 | 0.5231 | 0.0 | 0.6476 | 0.2240 | 0.5593 | 0.3959 | 0.0 | 0.5066 |
| 0.7923 | 42.0 | 378 | 0.7691 | 0.3426 | 0.4592 | 0.6097 | 0.3703 | 0.7912 | 0.4773 | 0.0 | 0.6574 | 0.2831 | 0.5397 | 0.3857 | 0.0 | 0.5043 |
| 0.7476 | 43.0 | 387 | 0.7747 | 0.3378 | 0.4727 | 0.6106 | 0.2640 | 0.8526 | 0.6037 | 0.0 | 0.6431 | 0.2230 | 0.5599 | 0.4053 | 0.0 | 0.5010 |
| 0.7727 | 44.0 | 396 | 0.7636 | 0.3431 | 0.4612 | 0.6136 | 0.3094 | 0.8376 | 0.5015 | 0.0 | 0.6575 | 0.2508 | 0.5557 | 0.3950 | 0.0 | 0.5141 |
| 0.731 | 45.0 | 405 | 0.7659 | 0.3347 | 0.4375 | 0.6132 | 0.3083 | 0.8466 | 0.4258 | 0.0 | 0.6068 | 0.2496 | 0.5576 | 0.3625 | 0.0 | 0.5037 |
| 0.7116 | 46.0 | 414 | 0.7676 | 0.3441 | 0.4876 | 0.6129 | 0.2661 | 0.8556 | 0.5915 | 0.0 | 0.7250 | 0.2245 | 0.5624 | 0.4085 | 0.0 | 0.5250 |
| 1.0356 | 47.0 | 423 | 0.7630 | 0.3510 | 0.4918 | 0.6110 | 0.3569 | 0.7909 | 0.5778 | 0.0 | 0.7335 | 0.2774 | 0.5430 | 0.4046 | 0.0 | 0.5298 |
| 0.7296 | 48.0 | 432 | 0.7647 | 0.3494 | 0.4689 | 0.6149 | 0.3232 | 0.8278 | 0.5249 | 0.0 | 0.6688 | 0.2592 | 0.5546 | 0.4035 | 0.0 | 0.5295 |
| 0.7581 | 49.0 | 441 | 0.7663 | 0.3480 | 0.4709 | 0.6135 | 0.3386 | 0.8105 | 0.5645 | 0.0 | 0.6408 | 0.2679 | 0.5499 | 0.4065 | 0.0 | 0.5159 |
| 0.7046 | 50.0 | 450 | 0.7621 | 0.3454 | 0.4617 | 0.6144 | 0.3288 | 0.8255 | 0.5053 | 0.0 | 0.6491 | 0.2623 | 0.5537 | 0.3936 | 0.0 | 0.5175 |
| 0.6793 | 51.0 | 459 | 0.7591 | 0.3413 | 0.4638 | 0.6157 | 0.2665 | 0.8716 | 0.4951 | 0.0 | 0.6859 | 0.2255 | 0.5663 | 0.3906 | 0.0 | 0.5242 |
| 0.833 | 52.0 | 468 | 0.7657 | 0.3471 | 0.4899 | 0.6141 | 0.2951 | 0.8370 | 0.5989 | 0.0 | 0.7183 | 0.2439 | 0.5581 | 0.4071 | 0.0 | 0.5266 |
| 0.6695 | 53.0 | 477 | 0.7595 | 0.3464 | 0.4687 | 0.6171 | 0.3095 | 0.8457 | 0.4802 | 0.0 | 0.7079 | 0.2522 | 0.5611 | 0.3841 | 0.0 | 0.5347 |
| 0.7034 | 54.0 | 486 | 0.7578 | 0.3469 | 0.4826 | 0.6154 | 0.3243 | 0.8249 | 0.5482 | 0.0 | 0.7156 | 0.2616 | 0.5553 | 0.3934 | 0.0 | 0.5243 |
| 0.7112 | 55.0 | 495 | 0.7549 | 0.3494 | 0.4751 | 0.6177 | 0.3144 | 0.8411 | 0.4999 | 0.0 | 0.7201 | 0.2555 | 0.5602 | 0.3941 | 0.0 | 0.5371 |
| 0.7294 | 56.0 | 504 | 0.7544 | 0.3516 | 0.4731 | 0.6179 | 0.3510 | 0.8165 | 0.4997 | 0.0 | 0.6983 | 0.2765 | 0.5539 | 0.3920 | 0.0 | 0.5355 |
| 0.6808 | 57.0 | 513 | 0.7555 | 0.3522 | 0.4727 | 0.6164 | 0.3478 | 0.8112 | 0.5476 | 0.0 | 0.6569 | 0.2749 | 0.5522 | 0.3968 | 0.0 | 0.5370 |
| 0.7648 | 58.0 | 522 | 0.7546 | 0.3494 | 0.4823 | 0.6179 | 0.2805 | 0.8592 | 0.5521 | 0.0 | 0.7197 | 0.2355 | 0.5661 | 0.4041 | 0.0 | 0.5416 |
| 0.6654 | 59.0 | 531 | 0.7508 | 0.3582 | 0.4974 | 0.6172 | 0.4066 | 0.7716 | 0.5394 | 0.0 | 0.7694 | 0.3064 | 0.5409 | 0.4010 | 0.0 | 0.5425 |
| 0.7075 | 60.0 | 540 | 0.7499 | 0.3571 | 0.4842 | 0.6208 | 0.3268 | 0.8335 | 0.5436 | 0.0 | 0.7172 | 0.2651 | 0.5609 | 0.4036 | 0.0 | 0.5560 |
| 0.7295 | 61.0 | 549 | 0.7485 | 0.3567 | 0.4861 | 0.6207 | 0.3385 | 0.8248 | 0.5452 | 0.0 | 0.7220 | 0.2721 | 0.5584 | 0.4016 | 0.0 | 0.5514 |
| 0.6483 | 62.0 | 558 | 0.7499 | 0.3568 | 0.4807 | 0.6204 | 0.3667 | 0.8084 | 0.5134 | 0.0 | 0.7147 | 0.2870 | 0.5534 | 0.3951 | 0.0 | 0.5485 |
| 0.8613 | 63.0 | 567 | 0.7504 | 0.3554 | 0.4859 | 0.6193 | 0.3356 | 0.8214 | 0.5754 | 0.0 | 0.6969 | 0.2703 | 0.5578 | 0.3990 | 0.0 | 0.5496 |
| 0.7036 | 64.0 | 576 | 0.7471 | 0.3583 | 0.4727 | 0.6220 | 0.3468 | 0.8257 | 0.5114 | 0.0 | 0.6794 | 0.2762 | 0.5594 | 0.3983 | 0.0 | 0.5575 |
| 0.63 | 65.0 | 585 | 0.7450 | 0.3573 | 0.4859 | 0.6216 | 0.3626 | 0.8117 | 0.5281 | 0.0 | 0.7271 | 0.2854 | 0.5554 | 0.4010 | 0.0 | 0.5448 |
| 0.7379 | 66.0 | 594 | 0.7464 | 0.3577 | 0.4802 | 0.6219 | 0.3714 | 0.8071 | 0.5238 | 0.0 | 0.6990 | 0.2901 | 0.5542 | 0.4027 | 0.0 | 0.5417 |
| 0.7054 | 67.0 | 603 | 0.7445 | 0.3561 | 0.4709 | 0.6229 | 0.3426 | 0.8304 | 0.5114 | 0.0 | 0.6698 | 0.2746 | 0.5612 | 0.3992 | 0.0 | 0.5457 |
| 0.7288 | 68.0 | 612 | 0.7458 | 0.3578 | 0.4938 | 0.6226 | 0.3622 | 0.8112 | 0.5490 | 0.0 | 0.7466 | 0.2861 | 0.5566 | 0.4036 | 0.0 | 0.5425 |
| 0.661 | 69.0 | 621 | 0.7474 | 0.3577 | 0.4881 | 0.6229 | 0.3506 | 0.8201 | 0.5502 | 0.0 | 0.7199 | 0.2796 | 0.5593 | 0.4046 | 0.0 | 0.5447 |
| 0.693 | 70.0 | 630 | 0.7458 | 0.3479 | 0.4479 | 0.6231 | 0.3520 | 0.8266 | 0.5018 | 0.0 | 0.5589 | 0.2799 | 0.5607 | 0.3923 | 0.0 | 0.5065 |
| 0.7166 | 71.0 | 639 | 0.7431 | 0.3618 | 0.4850 | 0.6230 | 0.3693 | 0.8055 | 0.5714 | 0.0 | 0.6789 | 0.2908 | 0.5554 | 0.4052 | 0.0 | 0.5577 |
| 0.6847 | 72.0 | 648 | 0.7412 | 0.3584 | 0.4766 | 0.6249 | 0.3466 | 0.8332 | 0.4899 | 0.0 | 0.7134 | 0.2778 | 0.5633 | 0.3916 | 0.0 | 0.5594 |
| 0.7217 | 73.0 | 657 | 0.7428 | 0.3580 | 0.4788 | 0.6233 | 0.3724 | 0.8082 | 0.5306 | 0.0 | 0.6829 | 0.2924 | 0.5559 | 0.3947 | 0.0 | 0.5472 |
| 0.6895 | 74.0 | 666 | 0.7433 | 0.3578 | 0.4790 | 0.6246 | 0.3621 | 0.8191 | 0.5164 | 0.0 | 0.6973 | 0.2867 | 0.5594 | 0.3981 | 0.0 | 0.5448 |
| 0.6974 | 75.0 | 675 | 0.7425 | 0.3548 | 0.4890 | 0.6245 | 0.3250 | 0.8404 | 0.5512 | 0.0 | 0.7284 | 0.2660 | 0.5662 | 0.4029 | 0.0 | 0.5392 |
| 0.7003 | 76.0 | 684 | 0.7416 | 0.3586 | 0.4817 | 0.6255 | 0.3655 | 0.8181 | 0.5183 | 0.0 | 0.7067 | 0.2889 | 0.5607 | 0.3951 | 0.0 | 0.5483 |
| 0.671 | 77.0 | 693 | 0.7426 | 0.3583 | 0.4853 | 0.6259 | 0.3290 | 0.8429 | 0.5271 | 0.0 | 0.7273 | 0.2684 | 0.5675 | 0.4017 | 0.0 | 0.5537 |
| 0.71 | 78.0 | 702 | 0.7385 | 0.3645 | 0.4968 | 0.6238 | 0.4247 | 0.7692 | 0.5601 | 0.0 | 0.7301 | 0.3198 | 0.5449 | 0.4076 | 0.0 | 0.5503 |
| 0.8038 | 79.0 | 711 | 0.7399 | 0.3575 | 0.4772 | 0.6262 | 0.3659 | 0.8228 | 0.4836 | 0.0 | 0.7140 | 0.2887 | 0.5618 | 0.3907 | 0.0 | 0.5465 |
| 0.6548 | 80.0 | 720 | 0.7413 | 0.3580 | 0.4965 | 0.6242 | 0.3565 | 0.8156 | 0.5732 | 0.0 | 0.7370 | 0.2845 | 0.5601 | 0.4006 | 0.0 | 0.5447 |
| 0.7145 | 81.0 | 729 | 0.7367 | 0.3653 | 0.4881 | 0.6270 | 0.3804 | 0.8093 | 0.5267 | 0.0 | 0.7243 | 0.2979 | 0.5586 | 0.4025 | 0.0 | 0.5676 |
| 0.684 | 82.0 | 738 | 0.7372 | 0.3637 | 0.4830 | 0.6273 | 0.3676 | 0.8196 | 0.5215 | 0.0 | 0.7063 | 0.2910 | 0.5622 | 0.3975 | 0.0 | 0.5678 |
| 0.8255 | 83.0 | 747 | 0.7369 | 0.3624 | 0.4724 | 0.6272 | 0.3959 | 0.8032 | 0.4954 | 0.0 | 0.6677 | 0.3059 | 0.5568 | 0.3919 | 0.0 | 0.5571 |
| 0.7086 | 84.0 | 756 | 0.7390 | 0.3586 | 0.4771 | 0.6271 | 0.3347 | 0.8449 | 0.4951 | 0.0 | 0.7109 | 0.2721 | 0.5682 | 0.3956 | 0.0 | 0.5569 |
| 0.7032 | 85.0 | 765 | 0.7396 | 0.3617 | 0.5005 | 0.6262 | 0.3870 | 0.7989 | 0.5646 | 0.0 | 0.7521 | 0.3020 | 0.5565 | 0.4030 | 0.0 | 0.5473 |
| 0.6183 | 86.0 | 774 | 0.7368 | 0.3630 | 0.4899 | 0.6276 | 0.3995 | 0.7969 | 0.5314 | 0.0 | 0.7216 | 0.3081 | 0.5566 | 0.4010 | 0.0 | 0.5493 |
| 0.5582 | 87.0 | 783 | 0.7373 | 0.3617 | 0.4815 | 0.6278 | 0.3882 | 0.8056 | 0.5307 | 0.0 | 0.6832 | 0.3027 | 0.5589 | 0.3982 | 0.0 | 0.5484 |
| 0.6318 | 88.0 | 792 | 0.7387 | 0.3626 | 0.4823 | 0.6279 | 0.4180 | 0.7893 | 0.4895 | 0.0 | 0.7145 | 0.3173 | 0.5544 | 0.3871 | 0.0 | 0.5540 |
| 0.6429 | 89.0 | 801 | 0.7357 | 0.3641 | 0.4916 | 0.6283 | 0.3926 | 0.8019 | 0.5388 | 0.0 | 0.7244 | 0.3053 | 0.5584 | 0.4019 | 0.0 | 0.5547 |
| 0.5478 | 90.0 | 810 | 0.7347 | 0.3646 | 0.4952 | 0.6295 | 0.3824 | 0.8112 | 0.5371 | 0.0 | 0.7454 | 0.3002 | 0.5617 | 0.4063 | 0.0 | 0.5547 |
| 0.6188 | 91.0 | 819 | 0.7356 | 0.3643 | 0.4954 | 0.6288 | 0.3798 | 0.8104 | 0.5499 | 0.0 | 0.7367 | 0.2988 | 0.5610 | 0.4075 | 0.0 | 0.5543 |
| 0.7857 | 92.0 | 828 | 0.7367 | 0.3643 | 0.4831 | 0.6287 | 0.3911 | 0.8050 | 0.5305 | 0.0 | 0.6889 | 0.3047 | 0.5592 | 0.3995 | 0.0 | 0.5581 |
| 0.6172 | 93.0 | 837 | 0.7358 | 0.3647 | 0.4815 | 0.6299 | 0.4137 | 0.7934 | 0.5149 | 0.0 | 0.6857 | 0.3167 | 0.5568 | 0.3973 | 0.0 | 0.5528 |
| 0.739 | 94.0 | 846 | 0.7360 | 0.3627 | 0.4819 | 0.6307 | 0.3723 | 0.8229 | 0.5187 | 0.0 | 0.6955 | 0.2956 | 0.5656 | 0.3981 | 0.0 | 0.5542 |
| 0.6934 | 95.0 | 855 | 0.7377 | 0.3607 | 0.4813 | 0.6293 | 0.3455 | 0.8385 | 0.5226 | 0.0 | 0.6998 | 0.2805 | 0.5687 | 0.3950 | 0.0 | 0.5595 |
| 0.698 | 96.0 | 864 | 0.7381 | 0.3650 | 0.4870 | 0.6289 | 0.3900 | 0.8042 | 0.5491 | 0.0 | 0.6917 | 0.3048 | 0.5598 | 0.3991 | 0.0 | 0.5615 |
| 0.6227 | 97.0 | 873 | 0.7356 | 0.3661 | 0.4766 | 0.6298 | 0.4475 | 0.7722 | 0.4960 | 0.0 | 0.6675 | 0.3331 | 0.5508 | 0.3882 | 0.0 | 0.5584 |
| 0.6114 | 98.0 | 882 | 0.7335 | 0.3665 | 0.4871 | 0.6312 | 0.4205 | 0.7914 | 0.5089 | 0.0 | 0.7145 | 0.3211 | 0.5572 | 0.3940 | 0.0 | 0.5605 |
| 0.6977 | 99.0 | 891 | 0.7338 | 0.3630 | 0.4901 | 0.6311 | 0.3615 | 0.8281 | 0.5436 | 0.0 | 0.7170 | 0.2905 | 0.5676 | 0.4025 | 0.0 | 0.5541 |
| 0.7309 | 100.0 | 900 | 0.7338 | 0.3644 | 0.4889 | 0.6315 | 0.4106 | 0.7974 | 0.5208 | 0.0 | 0.7158 | 0.3162 | 0.5597 | 0.3961 | 0.0 | 0.5498 |
| 0.7108 | 101.0 | 909 | 0.7337 | 0.3599 | 0.4680 | 0.6322 | 0.3796 | 0.8242 | 0.4902 | 0.0 | 0.6462 | 0.2999 | 0.5669 | 0.3913 | 0.0 | 0.5413 |
| 0.7682 | 102.0 | 918 | 0.7330 | 0.3622 | 0.4798 | 0.6314 | 0.3857 | 0.8149 | 0.5208 | 0.0 | 0.6778 | 0.3038 | 0.5638 | 0.3949 | 0.0 | 0.5487 |
| 0.6808 | 103.0 | 927 | 0.7332 | 0.3626 | 0.4796 | 0.6323 | 0.3851 | 0.8187 | 0.5031 | 0.0 | 0.6910 | 0.3035 | 0.5654 | 0.3930 | 0.0 | 0.5511 |
| 0.6521 | 104.0 | 936 | 0.7330 | 0.3657 | 0.4901 | 0.6325 | 0.3984 | 0.8079 | 0.5184 | 0.0 | 0.7260 | 0.3108 | 0.5629 | 0.3967 | 0.0 | 0.5579 |
| 0.6283 | 105.0 | 945 | 0.7321 | 0.3644 | 0.4803 | 0.6326 | 0.4219 | 0.7968 | 0.4753 | 0.0 | 0.7073 | 0.3227 | 0.5594 | 0.3807 | 0.0 | 0.5592 |
| 0.7336 | 106.0 | 954 | 0.7320 | 0.3654 | 0.4859 | 0.6319 | 0.4307 | 0.7869 | 0.4960 | 0.0 | 0.7159 | 0.3270 | 0.5562 | 0.3862 | 0.0 | 0.5574 |
| 0.6893 | 107.0 | 963 | 0.7309 | 0.3646 | 0.4888 | 0.6324 | 0.4103 | 0.8003 | 0.5106 | 0.0 | 0.7228 | 0.3174 | 0.5605 | 0.3900 | 0.0 | 0.5553 |
| 0.7615 | 108.0 | 972 | 0.7340 | 0.3612 | 0.4905 | 0.6317 | 0.3489 | 0.8365 | 0.5567 | 0.0 | 0.7101 | 0.2846 | 0.5709 | 0.3971 | 0.0 | 0.5531 |
| 0.7895 | 109.0 | 981 | 0.7329 | 0.3627 | 0.4812 | 0.6336 | 0.3760 | 0.8290 | 0.4860 | 0.0 | 0.7152 | 0.2994 | 0.5694 | 0.3836 | 0.0 | 0.5614 |
| 0.6942 | 110.0 | 990 | 0.7334 | 0.3652 | 0.4785 | 0.6342 | 0.4203 | 0.8024 | 0.4607 | 0.0 | 0.7090 | 0.3225 | 0.5626 | 0.3740 | 0.0 | 0.5668 |
| 0.6813 | 111.0 | 999 | 0.7316 | 0.3664 | 0.4777 | 0.6349 | 0.3985 | 0.8148 | 0.4989 | 0.0 | 0.6762 | 0.3119 | 0.5662 | 0.3927 | 0.0 | 0.5610 |
| 0.6465 | 112.0 | 1008 | 0.7285 | 0.3689 | 0.4926 | 0.6344 | 0.4150 | 0.7992 | 0.5243 | 0.0 | 0.7248 | 0.3208 | 0.5615 | 0.3985 | 0.0 | 0.5637 |
| 0.5932 | 113.0 | 1017 | 0.7278 | 0.3690 | 0.4867 | 0.6346 | 0.4235 | 0.7965 | 0.4997 | 0.0 | 0.7140 | 0.3244 | 0.5610 | 0.3932 | 0.0 | 0.5665 |
| 0.635 | 114.0 | 1026 | 0.7286 | 0.3692 | 0.4905 | 0.6338 | 0.4212 | 0.7936 | 0.5272 | 0.0 | 0.7102 | 0.3235 | 0.5600 | 0.3965 | 0.0 | 0.5660 |
| 0.6586 | 115.0 | 1035 | 0.7269 | 0.3663 | 0.4817 | 0.6351 | 0.3897 | 0.8194 | 0.5158 | 0.0 | 0.6834 | 0.3077 | 0.5678 | 0.3984 | 0.0 | 0.5577 |
| 0.6688 | 116.0 | 1044 | 0.7261 | 0.3687 | 0.4938 | 0.6350 | 0.4115 | 0.8025 | 0.5260 | 0.0 | 0.7290 | 0.3193 | 0.5636 | 0.3974 | 0.0 | 0.5633 |
| 0.6359 | 117.0 | 1053 | 0.7283 | 0.3667 | 0.4854 | 0.6348 | 0.4321 | 0.7925 | 0.4862 | 0.0 | 0.7165 | 0.3289 | 0.5604 | 0.3853 | 0.0 | 0.5588 |
| 0.7393 | 118.0 | 1062 | 0.7264 | 0.3694 | 0.4924 | 0.6357 | 0.4296 | 0.7918 | 0.5212 | 0.0 | 0.7194 | 0.3286 | 0.5608 | 0.3982 | 0.0 | 0.5592 |
| 0.6211 | 119.0 | 1071 | 0.7285 | 0.3669 | 0.4826 | 0.6365 | 0.3999 | 0.8164 | 0.5007 | 0.0 | 0.6962 | 0.3135 | 0.5680 | 0.3949 | 0.0 | 0.5579 |
| 0.6577 | 120.0 | 1080 | 0.7318 | 0.3640 | 0.4754 | 0.6357 | 0.3982 | 0.8175 | 0.4897 | 0.0 | 0.6717 | 0.3121 | 0.5677 | 0.3890 | 0.0 | 0.5510 |
| 0.625 | 121.0 | 1089 | 0.7282 | 0.3657 | 0.4855 | 0.6355 | 0.4058 | 0.8099 | 0.5055 | 0.0 | 0.7065 | 0.3165 | 0.5657 | 0.3916 | 0.0 | 0.5548 |
| 0.6061 | 122.0 | 1098 | 0.7270 | 0.3671 | 0.4936 | 0.6358 | 0.4209 | 0.8004 | 0.4964 | 0.0 | 0.7502 | 0.3242 | 0.5634 | 0.3881 | 0.0 | 0.5597 |
| 0.6852 | 123.0 | 1107 | 0.7263 | 0.3673 | 0.4934 | 0.6357 | 0.4270 | 0.7952 | 0.5050 | 0.0 | 0.7398 | 0.3275 | 0.5618 | 0.3903 | 0.0 | 0.5570 |
| 0.5813 | 124.0 | 1116 | 0.7274 | 0.3691 | 0.4884 | 0.6352 | 0.4584 | 0.7737 | 0.4981 | 0.0 | 0.7116 | 0.3422 | 0.5553 | 0.3883 | 0.0 | 0.5598 |
| 0.574 | 125.0 | 1125 | 0.7286 | 0.3709 | 0.4927 | 0.6360 | 0.4571 | 0.7743 | 0.5140 | 0.0 | 0.7181 | 0.3421 | 0.5562 | 0.3948 | 0.0 | 0.5616 |
| 0.5654 | 126.0 | 1134 | 0.7271 | 0.3704 | 0.4944 | 0.6365 | 0.4271 | 0.7944 | 0.5265 | 0.0 | 0.7241 | 0.3283 | 0.5623 | 0.3966 | 0.0 | 0.5645 |
| 0.752 | 127.0 | 1143 | 0.7296 | 0.3699 | 0.4898 | 0.6356 | 0.4380 | 0.7845 | 0.5376 | 0.0 | 0.6890 | 0.3335 | 0.5590 | 0.3964 | 0.0 | 0.5606 |
| 0.6302 | 128.0 | 1152 | 0.7285 | 0.3662 | 0.4790 | 0.6369 | 0.4160 | 0.8071 | 0.4914 | 0.0 | 0.6806 | 0.3223 | 0.5660 | 0.3861 | 0.0 | 0.5567 |
| 0.5796 | 129.0 | 1161 | 0.7269 | 0.3708 | 0.4927 | 0.6368 | 0.4431 | 0.7855 | 0.5133 | 0.0 | 0.7217 | 0.3361 | 0.5599 | 0.3937 | 0.0 | 0.5643 |
| 0.6812 | 130.0 | 1170 | 0.7262 | 0.3697 | 0.4914 | 0.6367 | 0.4177 | 0.8017 | 0.5234 | 0.0 | 0.7143 | 0.3236 | 0.5647 | 0.3939 | 0.0 | 0.5664 |
| 0.697 | 131.0 | 1179 | 0.7279 | 0.3699 | 0.4867 | 0.6359 | 0.4454 | 0.7822 | 0.5159 | 0.0 | 0.6899 | 0.3372 | 0.5584 | 0.3890 | 0.0 | 0.5649 |
| 0.6548 | 132.0 | 1188 | 0.7272 | 0.3675 | 0.4791 | 0.6369 | 0.4214 | 0.8014 | 0.5116 | 0.0 | 0.6612 | 0.3255 | 0.5645 | 0.3912 | 0.0 | 0.5564 |
| 0.5771 | 133.0 | 1197 | 0.7253 | 0.3682 | 0.4862 | 0.6374 | 0.4105 | 0.8082 | 0.5232 | 0.0 | 0.6892 | 0.3208 | 0.5667 | 0.3935 | 0.0 | 0.5599 |
| 0.752 | 134.0 | 1206 | 0.7288 | 0.3660 | 0.4800 | 0.6377 | 0.4117 | 0.8125 | 0.4810 | 0.0 | 0.6946 | 0.3208 | 0.5681 | 0.3803 | 0.0 | 0.5610 |
| 0.6518 | 135.0 | 1215 | 0.7263 | 0.3692 | 0.4846 | 0.6370 | 0.4468 | 0.7849 | 0.5021 | 0.0 | 0.6892 | 0.3383 | 0.5599 | 0.3867 | 0.0 | 0.5613 |
| 0.6085 | 136.0 | 1224 | 0.7282 | 0.3703 | 0.4884 | 0.6379 | 0.4335 | 0.7941 | 0.5133 | 0.0 | 0.7009 | 0.3320 | 0.5634 | 0.3932 | 0.0 | 0.5631 |
| 0.6856 | 137.0 | 1233 | 0.7282 | 0.3679 | 0.4839 | 0.6381 | 0.4121 | 0.8109 | 0.5008 | 0.0 | 0.6957 | 0.3209 | 0.5682 | 0.3913 | 0.0 | 0.5593 |
| 0.6196 | 138.0 | 1242 | 0.7267 | 0.3723 | 0.4961 | 0.6368 | 0.4753 | 0.7626 | 0.5182 | 0.0 | 0.7244 | 0.3512 | 0.5538 | 0.3930 | 0.0 | 0.5637 |
| 0.5814 | 139.0 | 1251 | 0.7267 | 0.3716 | 0.4933 | 0.6367 | 0.4790 | 0.7603 | 0.5157 | 0.0 | 0.7114 | 0.3528 | 0.5531 | 0.3926 | 0.0 | 0.5598 |
| 0.6441 | 140.0 | 1260 | 0.7269 | 0.3676 | 0.4866 | 0.6383 | 0.4210 | 0.8054 | 0.4959 | 0.0 | 0.7108 | 0.3254 | 0.5670 | 0.3886 | 0.0 | 0.5569 |
| 0.5908 | 141.0 | 1269 | 0.7244 | 0.3676 | 0.4824 | 0.6377 | 0.4312 | 0.7970 | 0.5006 | 0.0 | 0.6831 | 0.3310 | 0.5636 | 0.3876 | 0.0 | 0.5557 |
| 0.6087 | 142.0 | 1278 | 0.7250 | 0.3682 | 0.4840 | 0.6384 | 0.4230 | 0.8027 | 0.5135 | 0.0 | 0.6809 | 0.3274 | 0.5661 | 0.3910 | 0.0 | 0.5562 |
| 0.5682 | 143.0 | 1287 | 0.7279 | 0.3706 | 0.4898 | 0.6379 | 0.4746 | 0.7685 | 0.4871 | 0.0 | 0.7190 | 0.3511 | 0.5563 | 0.3839 | 0.0 | 0.5616 |
| 0.6818 | 144.0 | 1296 | 0.7252 | 0.3715 | 0.4938 | 0.6378 | 0.4689 | 0.7693 | 0.5142 | 0.0 | 0.7167 | 0.3490 | 0.5564 | 0.3915 | 0.0 | 0.5604 |
| 0.6846 | 145.0 | 1305 | 0.7252 | 0.3705 | 0.4883 | 0.6384 | 0.4519 | 0.7830 | 0.5090 | 0.0 | 0.6977 | 0.3412 | 0.5608 | 0.3904 | 0.0 | 0.5602 |
| 0.5883 | 146.0 | 1314 | 0.7250 | 0.3693 | 0.4810 | 0.6383 | 0.4572 | 0.7805 | 0.5 | 0.0 | 0.6673 | 0.3437 | 0.5597 | 0.3880 | 0.0 | 0.5551 |
| 0.6346 | 147.0 | 1323 | 0.7270 | 0.3682 | 0.4823 | 0.6389 | 0.4201 | 0.8065 | 0.5055 | 0.0 | 0.6796 | 0.3261 | 0.5675 | 0.3893 | 0.0 | 0.5580 |
| 0.5825 | 148.0 | 1332 | 0.7278 | 0.3691 | 0.4853 | 0.6393 | 0.4299 | 0.8009 | 0.4991 | 0.0 | 0.6964 | 0.3311 | 0.5663 | 0.3885 | 0.0 | 0.5595 |
| 0.5932 | 149.0 | 1341 | 0.7262 | 0.3685 | 0.4848 | 0.6392 | 0.4372 | 0.7962 | 0.4946 | 0.0 | 0.6959 | 0.3349 | 0.5648 | 0.3845 | 0.0 | 0.5582 |
| 0.6176 | 150.0 | 1350 | 0.7251 | 0.3701 | 0.4913 | 0.6395 | 0.4204 | 0.8054 | 0.5200 | 0.0 | 0.7106 | 0.3268 | 0.5681 | 0.3922 | 0.0 | 0.5633 |
| 0.5667 | 151.0 | 1359 | 0.7284 | 0.3705 | 0.4822 | 0.6391 | 0.4632 | 0.7792 | 0.4854 | 0.0 | 0.6830 | 0.3463 | 0.5604 | 0.3835 | 0.0 | 0.5620 |
| 0.6323 | 152.0 | 1368 | 0.7260 | 0.3693 | 0.4837 | 0.6389 | 0.4411 | 0.7916 | 0.5078 | 0.0 | 0.6778 | 0.3366 | 0.5636 | 0.3880 | 0.0 | 0.5584 |
| 0.6236 | 153.0 | 1377 | 0.7260 | 0.3676 | 0.4787 | 0.6393 | 0.4280 | 0.8036 | 0.4883 | 0.0 | 0.6737 | 0.3302 | 0.5668 | 0.3832 | 0.0 | 0.5577 |
| 0.6992 | 154.0 | 1386 | 0.7258 | 0.3699 | 0.4863 | 0.6399 | 0.4224 | 0.8067 | 0.5027 | 0.0 | 0.6997 | 0.3279 | 0.5681 | 0.3900 | 0.0 | 0.5635 |
| 0.5579 | 155.0 | 1395 | 0.7244 | 0.3721 | 0.4896 | 0.6396 | 0.4479 | 0.7867 | 0.5191 | 0.0 | 0.6942 | 0.3405 | 0.5622 | 0.3953 | 0.0 | 0.5625 |
| 0.6495 | 156.0 | 1404 | 0.7243 | 0.3700 | 0.4839 | 0.6403 | 0.4390 | 0.7973 | 0.4916 | 0.0 | 0.6913 | 0.3364 | 0.5657 | 0.3861 | 0.0 | 0.5621 |
| 0.5734 | 157.0 | 1413 | 0.7239 | 0.3712 | 0.4916 | 0.6400 | 0.4386 | 0.7940 | 0.5170 | 0.0 | 0.7082 | 0.3364 | 0.5648 | 0.3925 | 0.0 | 0.5625 |
| 0.6394 | 158.0 | 1422 | 0.7243 | 0.3679 | 0.4808 | 0.6399 | 0.4216 | 0.8080 | 0.4974 | 0.0 | 0.6767 | 0.3276 | 0.5684 | 0.3868 | 0.0 | 0.5565 |
| 0.5794 | 159.0 | 1431 | 0.7248 | 0.3700 | 0.4865 | 0.6397 | 0.4424 | 0.7928 | 0.5021 | 0.0 | 0.6954 | 0.3378 | 0.5644 | 0.3868 | 0.0 | 0.5610 |
| 0.6577 | 160.0 | 1440 | 0.7259 | 0.3705 | 0.4827 | 0.6396 | 0.4717 | 0.7726 | 0.5011 | 0.0 | 0.6680 | 0.3515 | 0.5587 | 0.3861 | 0.0 | 0.5563 |
| 0.4969 | 161.0 | 1449 | 0.7261 | 0.3682 | 0.4775 | 0.6404 | 0.4482 | 0.7915 | 0.4907 | 0.0 | 0.6571 | 0.3406 | 0.5646 | 0.3843 | 0.0 | 0.5518 |
| 0.5783 | 162.0 | 1458 | 0.7243 | 0.3691 | 0.4828 | 0.6397 | 0.4436 | 0.7918 | 0.5047 | 0.0 | 0.6740 | 0.3383 | 0.5641 | 0.3879 | 0.0 | 0.5551 |
| 0.682 | 163.0 | 1467 | 0.7239 | 0.3687 | 0.4827 | 0.6401 | 0.4349 | 0.7980 | 0.5092 | 0.0 | 0.6715 | 0.3346 | 0.5660 | 0.3887 | 0.0 | 0.5544 |
| 0.6954 | 164.0 | 1476 | 0.7241 | 0.3700 | 0.4905 | 0.6403 | 0.4274 | 0.8029 | 0.5115 | 0.0 | 0.7107 | 0.3310 | 0.5677 | 0.3902 | 0.0 | 0.5613 |
| 0.6062 | 165.0 | 1485 | 0.7256 | 0.3697 | 0.4918 | 0.6403 | 0.4352 | 0.7989 | 0.4982 | 0.0 | 0.7267 | 0.3345 | 0.5666 | 0.3873 | 0.0 | 0.5603 |
| 0.6114 | 166.0 | 1494 | 0.7248 | 0.3701 | 0.4945 | 0.6403 | 0.4263 | 0.8033 | 0.5138 | 0.0 | 0.7290 | 0.3303 | 0.5679 | 0.3915 | 0.0 | 0.5608 |
| 0.6039 | 167.0 | 1503 | 0.7241 | 0.3713 | 0.4903 | 0.6406 | 0.4404 | 0.7952 | 0.5044 | 0.0 | 0.7114 | 0.3372 | 0.5658 | 0.3888 | 0.0 | 0.5645 |
| 0.7731 | 168.0 | 1512 | 0.7253 | 0.3702 | 0.4849 | 0.6409 | 0.4367 | 0.7996 | 0.4946 | 0.0 | 0.6935 | 0.3357 | 0.5669 | 0.3859 | 0.0 | 0.5623 |
| 0.6694 | 169.0 | 1521 | 0.7253 | 0.3696 | 0.4868 | 0.6407 | 0.4223 | 0.8079 | 0.5059 | 0.0 | 0.6977 | 0.3284 | 0.5692 | 0.3898 | 0.0 | 0.5605 |
| 0.609 | 170.0 | 1530 | 0.7255 | 0.3677 | 0.4826 | 0.6406 | 0.4173 | 0.8124 | 0.4968 | 0.0 | 0.6868 | 0.3259 | 0.5702 | 0.3872 | 0.0 | 0.5550 |
| 0.6578 | 171.0 | 1539 | 0.7256 | 0.3686 | 0.4864 | 0.6403 | 0.4219 | 0.8073 | 0.5078 | 0.0 | 0.6951 | 0.3283 | 0.5687 | 0.3892 | 0.0 | 0.5569 |
| 0.6081 | 172.0 | 1548 | 0.7249 | 0.3705 | 0.4899 | 0.6405 | 0.4372 | 0.7965 | 0.5126 | 0.0 | 0.7030 | 0.3360 | 0.5661 | 0.3904 | 0.0 | 0.5602 |
| 0.6578 | 173.0 | 1557 | 0.7254 | 0.3677 | 0.4796 | 0.6409 | 0.4170 | 0.8135 | 0.4944 | 0.0 | 0.6730 | 0.3258 | 0.5710 | 0.3847 | 0.0 | 0.5569 |
| 0.6539 | 174.0 | 1566 | 0.7242 | 0.3696 | 0.4876 | 0.6407 | 0.4320 | 0.8016 | 0.5024 | 0.0 | 0.7018 | 0.3334 | 0.5677 | 0.3865 | 0.0 | 0.5606 |
| 0.5803 | 175.0 | 1575 | 0.7246 | 0.3705 | 0.4856 | 0.6408 | 0.4557 | 0.7867 | 0.4913 | 0.0 | 0.6941 | 0.3448 | 0.5635 | 0.3829 | 0.0 | 0.5611 |
| 0.5508 | 176.0 | 1584 | 0.7246 | 0.3724 | 0.4915 | 0.6401 | 0.4733 | 0.7709 | 0.5125 | 0.0 | 0.7008 | 0.3530 | 0.5586 | 0.3884 | 0.0 | 0.5622 |
| 0.5749 | 177.0 | 1593 | 0.7257 | 0.3708 | 0.4844 | 0.6405 | 0.4731 | 0.7743 | 0.4904 | 0.0 | 0.6842 | 0.3527 | 0.5598 | 0.3830 | 0.0 | 0.5587 |
| 0.6148 | 178.0 | 1602 | 0.7238 | 0.3675 | 0.4753 | 0.6404 | 0.4627 | 0.7789 | 0.5194 | 0.0 | 0.6157 | 0.3480 | 0.5612 | 0.3926 | 0.0 | 0.5358 |
| 0.6422 | 179.0 | 1611 | 0.7246 | 0.3712 | 0.4827 | 0.6410 | 0.4554 | 0.7865 | 0.5001 | 0.0 | 0.6715 | 0.3446 | 0.5635 | 0.3885 | 0.0 | 0.5595 |
| 0.7724 | 180.0 | 1620 | 0.7267 | 0.3700 | 0.4807 | 0.6414 | 0.4405 | 0.7988 | 0.4888 | 0.0 | 0.6756 | 0.3376 | 0.5672 | 0.3846 | 0.0 | 0.5607 |
| 0.6327 | 181.0 | 1629 | 0.7259 | 0.3710 | 0.4856 | 0.6411 | 0.4465 | 0.7930 | 0.4969 | 0.0 | 0.6917 | 0.3404 | 0.5655 | 0.3869 | 0.0 | 0.5621 |
| 0.6396 | 182.0 | 1638 | 0.7244 | 0.3727 | 0.4894 | 0.6411 | 0.4584 | 0.7843 | 0.5001 | 0.0 | 0.7040 | 0.3462 | 0.5629 | 0.3883 | 0.0 | 0.5660 |
| 0.6478 | 183.0 | 1647 | 0.7247 | 0.3703 | 0.4846 | 0.6411 | 0.4349 | 0.8017 | 0.4913 | 0.0 | 0.6951 | 0.3349 | 0.5676 | 0.3854 | 0.0 | 0.5638 |
| 0.6523 | 184.0 | 1656 | 0.7239 | 0.3699 | 0.4823 | 0.6411 | 0.4275 | 0.8054 | 0.5057 | 0.0 | 0.6728 | 0.3314 | 0.5687 | 0.3900 | 0.0 | 0.5592 |
| 0.6826 | 185.0 | 1665 | 0.7247 | 0.3692 | 0.4839 | 0.6412 | 0.4177 | 0.8126 | 0.5003 | 0.0 | 0.6888 | 0.3264 | 0.5708 | 0.3880 | 0.0 | 0.5609 |
| 0.6565 | 186.0 | 1674 | 0.7254 | 0.3706 | 0.4851 | 0.6412 | 0.4357 | 0.8014 | 0.4889 | 0.0 | 0.6993 | 0.3352 | 0.5677 | 0.3843 | 0.0 | 0.5658 |
| 0.6469 | 187.0 | 1683 | 0.7256 | 0.3700 | 0.4831 | 0.6412 | 0.4385 | 0.7999 | 0.4859 | 0.0 | 0.6912 | 0.3366 | 0.5673 | 0.3825 | 0.0 | 0.5635 |
| 0.5834 | 188.0 | 1692 | 0.7245 | 0.3720 | 0.4912 | 0.6412 | 0.4457 | 0.7926 | 0.5055 | 0.0 | 0.7125 | 0.3403 | 0.5656 | 0.3885 | 0.0 | 0.5656 |
| 0.6621 | 189.0 | 1701 | 0.7242 | 0.3716 | 0.4916 | 0.6412 | 0.4432 | 0.7946 | 0.5021 | 0.0 | 0.7181 | 0.3391 | 0.5661 | 0.3875 | 0.0 | 0.5654 |
| 0.7131 | 190.0 | 1710 | 0.7242 | 0.3708 | 0.4913 | 0.6412 | 0.4295 | 0.8037 | 0.5059 | 0.0 | 0.7173 | 0.3325 | 0.5686 | 0.3883 | 0.0 | 0.5646 |
| 0.6508 | 191.0 | 1719 | 0.7233 | 0.3735 | 0.4928 | 0.6408 | 0.4645 | 0.7785 | 0.5088 | 0.0 | 0.7123 | 0.3490 | 0.5613 | 0.3897 | 0.0 | 0.5675 |
| 0.5806 | 192.0 | 1728 | 0.7250 | 0.3721 | 0.4877 | 0.6412 | 0.4595 | 0.7848 | 0.4909 | 0.0 | 0.7035 | 0.3467 | 0.5633 | 0.3841 | 0.0 | 0.5666 |
| 0.6532 | 193.0 | 1737 | 0.7255 | 0.3700 | 0.4845 | 0.6415 | 0.4318 | 0.8047 | 0.4898 | 0.0 | 0.6960 | 0.3335 | 0.5689 | 0.3841 | 0.0 | 0.5637 |
| 0.7422 | 194.0 | 1746 | 0.7274 | 0.3699 | 0.4827 | 0.6415 | 0.4398 | 0.7999 | 0.4822 | 0.0 | 0.6915 | 0.3374 | 0.5675 | 0.3815 | 0.0 | 0.5631 |
| 0.6449 | 195.0 | 1755 | 0.7256 | 0.3701 | 0.4826 | 0.6415 | 0.4368 | 0.8014 | 0.4888 | 0.0 | 0.6859 | 0.3360 | 0.5679 | 0.3837 | 0.0 | 0.5628 |
| 0.6197 | 196.0 | 1764 | 0.7240 | 0.3712 | 0.4860 | 0.6414 | 0.4396 | 0.7975 | 0.5044 | 0.0 | 0.6887 | 0.3375 | 0.5670 | 0.3889 | 0.0 | 0.5628 |
| 0.5627 | 197.0 | 1773 | 0.7252 | 0.3713 | 0.4855 | 0.6415 | 0.4503 | 0.7919 | 0.4883 | 0.0 | 0.6972 | 0.3425 | 0.5653 | 0.3837 | 0.0 | 0.5650 |
| 0.5694 | 198.0 | 1782 | 0.7249 | 0.3744 | 0.4922 | 0.6405 | 0.4833 | 0.7659 | 0.5002 | 0.0 | 0.7118 | 0.3573 | 0.5574 | 0.3875 | 0.0 | 0.5696 |
| 0.6385 | 199.0 | 1791 | 0.7242 | 0.3725 | 0.4905 | 0.6414 | 0.4462 | 0.7927 | 0.5049 | 0.0 | 0.7087 | 0.3407 | 0.5656 | 0.3890 | 0.0 | 0.5673 |
| 0.6515 | 200.0 | 1800 | 0.7243 | 0.3715 | 0.4884 | 0.6415 | 0.4372 | 0.7995 | 0.5009 | 0.0 | 0.7044 | 0.3363 | 0.5675 | 0.3875 | 0.0 | 0.5662 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Sanyam0605/Reinforce-Cartpole8 | Sanyam0605 | 2023-09-18T19:56:55Z | 0 | 0 | null | [
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T19:56:45Z | ---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Cartpole8
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1**.
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
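As a rough illustration (not the course's exact code), evaluating a saved policy on CartPole-v1 with `gymnasium` might look like this; `model.pt` and the greedy action choice are hypothetical stand-ins for the actual training setup:
```python
import gymnasium as gym
import torch

env = gym.make("CartPole-v1")
policy = torch.load("model.pt")  # hypothetical filename for the trained policy network
policy.eval()

state, _ = env.reset(seed=42)
total_reward, done = 0.0, False
while not done:
    with torch.no_grad():
        probs = policy(torch.as_tensor(state, dtype=torch.float32))
    action = int(probs.argmax())  # greedy action from the policy's action distribution
    state, reward, terminated, truncated, _ = env.step(action)
    total_reward += reward
    done = terminated or truncated
print(f"episode return: {total_reward}")
```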
|
guydebruyn/Reinforce-Copter3 | guydebruyn | 2023-09-18T19:40:32Z | 0 | 0 | null | [
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T19:32:48Z | ---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Copter3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: -4.10 +/- 1.76
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
rayrico/bert-fine-tuned-cola | rayrico | 2023-09-18T19:35:26Z | 63 | 0 | transformers | [
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| text-classification | 2023-09-18T19:30:19Z | ---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_keras_callback
model-index:
- name: bert-fine-tuned-cola
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# bert-fine-tuned-cola
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.3144
- Validation Loss: 0.4951
- Epoch: 1
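A minimal usage sketch (assuming the checkpoint loads with the standard TF auto classes; the example sentence is illustrative):
```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("rayrico/bert-fine-tuned-cola")
model = TFAutoModelForSequenceClassification.from_pretrained("rayrico/bert-fine-tuned-cola")

inputs = tokenizer("The book was read by the whole class.", return_tensors="tf")
logits = model(**inputs).logits
label = int(tf.argmax(logits, axis=-1)[0])  # 0/1 acceptability label for CoLA-style heads
print(label)
```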
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.5218 | 0.4787 | 0 |
| 0.3144 | 0.4951 | 1 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3
|
CyberHarem/kawashima_mizuki_idolmastercinderellagirls | CyberHarem | 2023-09-18T19:06:27Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/kawashima_mizuki_idolmastercinderellagirls",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T18:53:54Z | ---
license: mit
datasets:
- CyberHarem/kawashima_mizuki_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---
# Lora of kawashima_mizuki_idolmastercinderellagirls
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion); the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the chosen step, you need to use them together: the pt file serves as an embedding, while the safetensors file is loaded as the LoRA.
For example, to use the model from step 4760, download `4760/kawashima_mizuki_idolmastercinderellagirls.pt` as the embedding and `4760/kawashima_mizuki_idolmastercinderellagirls.safetensors` for the LoRA. Using both files together, you can generate images of the desired character.
**The best step we recommend is 4760**, with a score of 0.878. The trigger words are:
1. `kawashima_mizuki_idolmastercinderellagirls`
2. `brown_hair, brown_eyes, long_hair, smile, breasts, blush, ponytail`
We regret that this model is not recommended for the following groups:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:--------------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 5100 | 0.874 | [Download](5100/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](5100/previews/pattern_4.png) |  |  | [<NSFW, click to see>](5100/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) |  |  |
| **4760** | **0.878** | [**Download**](4760/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](4760/previews/pattern_4.png) |  |  | [<NSFW, click to see>](4760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) |  |  |
| 4420 | 0.831 | [Download](4420/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](4420/previews/pattern_4.png) |  |  | [<NSFW, click to see>](4420/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) |  |  |
| 4080 | 0.835 | [Download](4080/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](4080/previews/pattern_4.png) |  |  | [<NSFW, click to see>](4080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) |  |  |
| 3740 | 0.862 | [Download](3740/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](3740/previews/pattern_4.png) |  |  | [<NSFW, click to see>](3740/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) |  |  |
| 3400 | 0.786 | [Download](3400/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](3400/previews/pattern_4.png) |  |  | [<NSFW, click to see>](3400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) |  |  |
| 3060 | 0.826 | [Download](3060/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](3060/previews/pattern_4.png) |  |  | [<NSFW, click to see>](3060/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) |  |  |
| 2720 | 0.820 | [Download](2720/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](2720/previews/pattern_4.png) |  |  | [<NSFW, click to see>](2720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) |  |  |
| 2380 | 0.812 | [Download](2380/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](2380/previews/pattern_4.png) |  |  | [<NSFW, click to see>](2380/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) |  |  |
| 2040 | 0.809 | [Download](2040/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](2040/previews/pattern_4.png) |  |  | [<NSFW, click to see>](2040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) |  |  |
| 1700 | 0.705 | [Download](1700/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](1700/previews/pattern_4.png) |  |  | [<NSFW, click to see>](1700/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) |  |  |
| 1360 | 0.793 | [Download](1360/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](1360/previews/pattern_4.png) |  |  | [<NSFW, click to see>](1360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) |  |  |
| 1020 | 0.769 | [Download](1020/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](1020/previews/pattern_4.png) |  |  | [<NSFW, click to see>](1020/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) |  |  |
| 680 | 0.686 | [Download](680/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](680/previews/pattern_4.png) |  |  | [<NSFW, click to see>](680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) |  |  |
| 340 | 0.502 | [Download](340/kawashima_mizuki_idolmastercinderellagirls.zip) |  |  |  | [<NSFW, click to see>](340/previews/pattern_4.png) |  |  | [<NSFW, click to see>](340/previews/bondage.png) |  |  |  | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) |  |  |
|
Gayathri142214002/Pegasus_paraphraser_1 | Gayathri142214002 | 2023-09-18T19:02:20Z | 9 | 0 | transformers | [
"transformers",
"pytorch",
"pegasus",
"text2text-generation",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| text2text-generation | 2023-09-14T04:39:52Z | ---
tags:
- generated_from_trainer
model-index:
- name: Pegasus_paraphraser_1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Pegasus_paraphraser_1
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2991
## Model description
More information needed
## Intended uses & limitations
More information needed
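Pending proper documentation, here is a minimal usage sketch, assuming this is a standard Pegasus seq2seq checkpoint; the generation settings are illustrative, not from the card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Gayathri142214002/Pegasus_paraphraser_1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Paraphrase a single sentence; beam settings are illustrative.
inputs = tokenizer("The quick brown fox jumps over the lazy dog.", return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```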
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
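A `transformers` sketch expressing these settings as `TrainingArguments` (the `output_dir` is a placeholder, not from the card):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Pegasus_paraphraser_1",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,       # effective batch size 4
    lr_scheduler_type="linear",
    num_train_epochs=4,
)
```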
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.0688 | 0.01 | 10 | 0.2457 |
| 0.2896 | 0.01 | 20 | 0.2403 |
| 0.2359 | 0.02 | 30 | 0.2326 |
| 0.3123 | 0.02 | 40 | 0.2367 |
| 0.2913 | 0.03 | 50 | 0.2309 |
| 0.2572 | 0.03 | 60 | 0.2256 |
| 0.2803 | 0.04 | 70 | 0.2263 |
| 0.2946 | 0.04 | 80 | 0.2243 |
| 0.2211 | 0.05 | 90 | 0.2283 |
| 0.2161 | 0.05 | 100 | 0.2405 |
| 0.2927 | 0.06 | 110 | 0.2355 |
| 0.3544 | 0.07 | 120 | 0.2330 |
| 0.2933 | 0.07 | 130 | 0.2322 |
| 0.2535 | 0.08 | 140 | 0.2301 |
| 0.2799 | 0.08 | 150 | 0.2302 |
| 0.2532 | 0.09 | 160 | 0.2296 |
| 0.2382 | 0.09 | 170 | 0.2356 |
| 0.2758 | 0.1 | 180 | 0.2376 |
| 0.2552 | 0.1 | 190 | 0.2451 |
| 0.289 | 0.11 | 200 | 0.2439 |
| 0.247 | 0.11 | 210 | 0.2424 |
| 0.259 | 0.12 | 220 | 0.2448 |
| 0.2562 | 0.13 | 230 | 0.2491 |
| 0.2951 | 0.13 | 240 | 0.2554 |
| 0.2624 | 0.14 | 250 | 0.2476 |
| 0.2542 | 0.14 | 260 | 0.2474 |
| 0.2178 | 0.15 | 270 | 0.2501 |
| 0.3126 | 0.15 | 280 | 0.2483 |
| 0.2585 | 0.16 | 290 | 0.2492 |
| 0.2559 | 0.16 | 300 | 0.2502 |
| 0.231 | 0.17 | 310 | 0.2539 |
| 0.3023 | 0.17 | 320 | 0.2486 |
| 0.328 | 0.18 | 330 | 0.2491 |
| 0.313 | 0.19 | 340 | 0.2490 |
| 0.3041 | 0.19 | 350 | 0.2471 |
| 0.2719 | 0.2 | 360 | 0.2433 |
| 0.2655 | 0.2 | 370 | 0.2443 |
| 0.3171 | 0.21 | 380 | 0.2453 |
| 0.2398 | 0.21 | 390 | 0.2440 |
| 0.2682 | 0.22 | 400 | 0.2477 |
| 0.3193 | 0.22 | 410 | 0.2476 |
| 0.2754 | 0.23 | 420 | 0.2455 |
| 0.2635 | 0.23 | 430 | 0.2451 |
| 0.2593 | 0.24 | 440 | 0.2479 |
| 0.2955 | 0.25 | 450 | 0.2454 |
| 0.2923 | 0.25 | 460 | 0.2472 |
| 0.2947 | 0.26 | 470 | 0.2488 |
| 0.2718 | 0.26 | 480 | 0.2503 |
| 0.2772 | 0.27 | 490 | 0.2507 |
| 0.2183 | 0.27 | 500 | 0.2545 |
| 0.3563 | 0.28 | 510 | 0.2574 |
| 0.2762 | 0.28 | 520 | 0.2585 |
| 0.2336 | 0.29 | 530 | 0.2537 |
| 0.2548 | 0.29 | 540 | 0.2550 |
| 0.315 | 0.3 | 550 | 0.2625 |
| 0.2827 | 0.31 | 560 | 0.2641 |
| 0.2911 | 0.31 | 570 | 0.2621 |
| 0.2515 | 0.32 | 580 | 0.2625 |
| 0.3003 | 0.32 | 590 | 0.2665 |
| 0.2987 | 0.33 | 600 | 0.2653 |
| 0.3119 | 0.33 | 610 | 0.2622 |
| 0.305 | 0.34 | 620 | 0.2638 |
| 0.2828 | 0.34 | 630 | 0.2637 |
| 0.2416 | 0.35 | 640 | 0.2641 |
| 0.2829 | 0.35 | 650 | 0.2642 |
| 0.2757 | 0.36 | 660 | 0.2645 |
| 0.2548 | 0.37 | 670 | 0.2640 |
| 0.2737 | 0.37 | 680 | 0.2607 |
| 0.2963 | 0.38 | 690 | 0.2673 |
| 0.2622 | 0.38 | 700 | 0.2641 |
| 0.2773 | 0.39 | 710 | 0.2673 |
| 0.2883 | 0.39 | 720 | 0.2666 |
| 0.2855 | 0.4 | 730 | 0.2685 |
| 0.271 | 0.4 | 740 | 0.2655 |
| 0.2627 | 0.41 | 750 | 0.2673 |
| 0.2739 | 0.41 | 760 | 0.2677 |
| 0.2539 | 0.42 | 770 | 0.2702 |
| 0.2824 | 0.43 | 780 | 0.2723 |
| 0.3174 | 0.43 | 790 | 0.2767 |
| 0.2902 | 0.44 | 800 | 0.2790 |
| 0.2755 | 0.44 | 810 | 0.2701 |
| 0.253 | 0.45 | 820 | 0.2711 |
| 0.2619 | 0.45 | 830 | 0.2708 |
| 0.2819 | 0.46 | 840 | 0.2706 |
| 0.2686 | 0.46 | 850 | 0.2683 |
| 0.2683 | 0.47 | 860 | 0.2734 |
| 0.2702 | 0.47 | 870 | 0.2695 |
| 0.2793 | 0.48 | 880 | 0.2771 |
| 0.2807 | 0.49 | 890 | 0.2753 |
| 0.3387 | 0.49 | 900 | 0.2695 |
| 0.2924 | 0.5 | 910 | 0.2670 |
| 0.3004 | 0.5 | 920 | 0.2669 |
| 0.2653 | 0.51 | 930 | 0.2760 |
| 0.241 | 0.51 | 940 | 0.2700 |
| 0.2828 | 0.52 | 950 | 0.2704 |
| 0.2804 | 0.52 | 960 | 0.2707 |
| 0.2888 | 0.53 | 970 | 0.2672 |
| 0.3065 | 0.54 | 980 | 0.2678 |
| 0.2853 | 0.54 | 990 | 0.2706 |
| 0.2579 | 0.55 | 1000 | 0.2685 |
| 0.3108 | 0.55 | 1010 | 0.2679 |
| 0.3061 | 0.56 | 1020 | 0.2687 |
| 0.2836 | 0.56 | 1030 | 0.2688 |
| 0.2081 | 0.57 | 1040 | 0.2769 |
| 0.312 | 0.57 | 1050 | 0.2770 |
| 0.29 | 0.58 | 1060 | 0.2701 |
| 0.3126 | 0.58 | 1070 | 0.2699 |
| 0.3559 | 0.59 | 1080 | 0.2690 |
| 0.3611 | 0.6 | 1090 | 0.2864 |
| 0.3906 | 0.6 | 1100 | 0.4051 |
| 0.3261 | 0.61 | 1110 | 0.2743 |
| 0.2893 | 0.61 | 1120 | 0.2725 |
| 0.2796 | 0.62 | 1130 | 0.2713 |
| 0.289 | 0.62 | 1140 | 0.2717 |
| 0.29 | 0.63 | 1150 | 0.2718 |
| 0.2983 | 0.63 | 1160 | 0.2729 |
| 0.3058 | 0.64 | 1170 | 0.2696 |
| 0.23 | 0.64 | 1180 | 0.2731 |
| 0.2558 | 0.65 | 1190 | 0.2748 |
| 0.3722 | 0.66 | 1200 | 0.2740 |
| 0.3308 | 0.66 | 1210 | 0.2718 |
| 0.27 | 0.67 | 1220 | 0.2724 |
| 0.2897 | 0.67 | 1230 | 0.2750 |
| 0.2954 | 0.68 | 1240 | 0.2776 |
| 0.264 | 0.68 | 1250 | 0.2737 |
| 0.2802 | 0.69 | 1260 | 0.2738 |
| 0.3027 | 0.69 | 1270 | 0.2702 |
| 0.2881 | 0.7 | 1280 | 0.2701 |
| 0.2664 | 0.7 | 1290 | 0.2699 |
| 0.3288 | 0.71 | 1300 | 0.2707 |
| 0.3315 | 0.72 | 1310 | 0.2686 |
| 0.3577 | 0.72 | 1320 | 0.2686 |
| 0.3046 | 0.73 | 1330 | 0.2702 |
| 0.2872 | 0.73 | 1340 | 0.2697 |
| 0.3242 | 0.74 | 1350 | 0.2693 |
| 0.3268 | 0.74 | 1360 | 0.2704 |
| 0.2845 | 0.75 | 1370 | 0.2756 |
| 0.2816 | 0.75 | 1380 | 0.2705 |
| 0.3121 | 0.76 | 1390 | 0.2678 |
| 0.2989 | 0.76 | 1400 | 0.2687 |
| 0.2611 | 0.77 | 1410 | 0.2701 |
| 0.2954 | 0.78 | 1420 | 0.2711 |
| 0.3025 | 0.78 | 1430 | 0.2711 |
| 0.2692 | 0.79 | 1440 | 0.2732 |
| 0.249 | 0.79 | 1450 | 0.2753 |
| 0.2673 | 0.8 | 1460 | 0.2779 |
| 0.2471 | 0.8 | 1470 | 0.2829 |
| 0.2925 | 0.81 | 1480 | 0.2806 |
| 0.3005 | 0.81 | 1490 | 0.2785 |
| 0.3009 | 0.82 | 1500 | 0.2765 |
| 0.2556 | 0.82 | 1510 | 0.2740 |
| 0.3265 | 0.83 | 1520 | 0.2775 |
| 0.2731 | 0.84 | 1530 | 0.2816 |
| 0.3406 | 0.84 | 1540 | 0.2767 |
| 0.2936 | 0.85 | 1550 | 0.2760 |
| 0.3001 | 0.85 | 1560 | 0.2737 |
| 0.3017 | 0.86 | 1570 | 0.2755 |
| 0.3212 | 0.86 | 1580 | 0.2784 |
| 0.3122 | 0.87 | 1590 | 0.2742 |
| 0.2591 | 0.87 | 1600 | 0.2720 |
| 0.28 | 0.88 | 1610 | 0.2736 |
| 0.2641 | 0.88 | 1620 | 0.2736 |
| 0.3488 | 0.89 | 1630 | 0.2715 |
| 0.3505 | 0.9 | 1640 | 0.2701 |
| 0.2676 | 0.9 | 1650 | 0.2699 |
| 0.2444 | 0.91 | 1660 | 0.2711 |
| 0.2493 | 0.91 | 1670 | 0.2732 |
| 0.326 | 0.92 | 1680 | 0.2735 |
| 0.3099 | 0.92 | 1690 | 0.2737 |
| 0.2893 | 0.93 | 1700 | 0.2717 |
| 0.3139 | 0.93 | 1710 | 0.2737 |
| 0.2913 | 0.94 | 1720 | 0.2777 |
| 0.2999 | 0.94 | 1730 | 0.2721 |
| 0.2708 | 0.95 | 1740 | 0.2704 |
| 0.3208 | 0.96 | 1750 | 0.2690 |
| 0.2691 | 0.96 | 1760 | 0.2692 |
| 0.2921 | 0.97 | 1770 | 0.2696 |
| 0.2782 | 0.97 | 1780 | 0.2706 |
| 0.2937 | 0.98 | 1790 | 0.2703 |
| 0.2948 | 0.98 | 1800 | 0.2695 |
| 0.3195 | 0.99 | 1810 | 0.2682 |
| 0.2525 | 0.99 | 1820 | 0.2678 |
| 0.2434 | 1.0 | 1830 | 0.2685 |
| 0.2043 | 1.0 | 1840 | 0.2721 |
| 0.216 | 1.01 | 1850 | 0.2733 |
| 0.1924 | 1.02 | 1860 | 0.2696 |
| 0.2219 | 1.02 | 1870 | 0.2707 |
| 0.2198 | 1.03 | 1880 | 0.2722 |
| 0.1829 | 1.03 | 1890 | 0.2735 |
| 0.247 | 1.04 | 1900 | 0.2768 |
| 0.2243 | 1.04 | 1910 | 0.2759 |
| 0.2747 | 1.05 | 1920 | 0.2769 |
| 0.187 | 1.05 | 1930 | 0.2796 |
| 0.2698 | 1.06 | 1940 | 0.2797 |
| 0.218 | 1.06 | 1950 | 0.2822 |
| 0.2155 | 1.07 | 1960 | 0.2817 |
| 0.2352 | 1.08 | 1970 | 0.2832 |
| 0.1915 | 1.08 | 1980 | 0.2799 |
| 0.2425 | 1.09 | 1990 | 0.2779 |
| 0.2212 | 1.09 | 2000 | 0.2824 |
| 0.2271 | 1.1 | 2010 | 0.2848 |
| 0.2349 | 1.1 | 2020 | 0.2851 |
| 0.2579 | 1.11 | 2030 | 0.2817 |
| 0.2097 | 1.11 | 2040 | 0.2832 |
| 0.2205 | 1.12 | 2050 | 0.2852 |
| 0.2462 | 1.12 | 2060 | 0.2868 |
| 0.2121 | 1.13 | 2070 | 0.2895 |
| 0.2251 | 1.14 | 2080 | 0.2864 |
| 0.2052 | 1.14 | 2090 | 0.2833 |
| 0.2202 | 1.15 | 2100 | 0.2851 |
| 0.2295 | 1.15 | 2110 | 0.2850 |
| 0.2419 | 1.16 | 2120 | 0.2904 |
| 0.2639 | 1.16 | 2130 | 0.2866 |
| 0.2106 | 1.17 | 2140 | 0.2913 |
| 0.2454 | 1.17 | 2150 | 0.2924 |
| 0.2348 | 1.18 | 2160 | 0.2890 |
| 0.209 | 1.18 | 2170 | 0.2885 |
| 0.2523 | 1.19 | 2180 | 0.2873 |
| 0.2293 | 1.2 | 2190 | 0.2836 |
| 0.2258 | 1.2 | 2200 | 0.2842 |
| 0.249 | 1.21 | 2210 | 0.2849 |
| 0.2917 | 1.21 | 2220 | 0.2834 |
| 0.2268 | 1.22 | 2230 | 0.2824 |
| 0.2453 | 1.22 | 2240 | 0.2820 |
| 0.236 | 1.23 | 2250 | 0.2829 |
| 0.2191 | 1.23 | 2260 | 0.2841 |
| 0.2509 | 1.24 | 2270 | 0.2822 |
| 0.2481 | 1.24 | 2280 | 0.2830 |
| 0.2268 | 1.25 | 2290 | 0.2840 |
| 0.2594 | 1.26 | 2300 | 0.2803 |
| 0.242 | 1.26 | 2310 | 0.2827 |
| 0.2563 | 1.27 | 2320 | 0.2827 |
| 0.2269 | 1.27 | 2330 | 0.2803 |
| 0.2409 | 1.28 | 2340 | 0.2780 |
| 0.2946 | 1.28 | 2350 | 0.2778 |
| 0.2718 | 1.29 | 2360 | 0.2757 |
| 0.2407 | 1.29 | 2370 | 0.2776 |
| 0.2259 | 1.3 | 2380 | 0.2810 |
| 0.2235 | 1.3 | 2390 | 0.2829 |
| 0.266 | 1.31 | 2400 | 0.2806 |
| 0.229 | 1.32 | 2410 | 0.2796 |
| 0.2154 | 1.32 | 2420 | 0.2794 |
| 0.2551 | 1.33 | 2430 | 0.2766 |
| 0.2169 | 1.33 | 2440 | 0.2779 |
| 0.2396 | 1.34 | 2450 | 0.2776 |
| 0.2239 | 1.34 | 2460 | 0.2835 |
| 0.2325 | 1.35 | 2470 | 0.2823 |
| 0.2421 | 1.35 | 2480 | 0.2841 |
| 0.2456 | 1.36 | 2490 | 0.2861 |
| 0.2295 | 1.36 | 2500 | 0.2828 |
| 0.2549 | 1.37 | 2510 | 0.2835 |
| 0.2442 | 1.38 | 2520 | 0.2832 |
| 0.2572 | 1.38 | 2530 | 0.2821 |
| 0.235 | 1.39 | 2540 | 0.2796 |
| 0.2687 | 1.39 | 2550 | 0.2791 |
| 0.2539 | 1.4 | 2560 | 0.2787 |
| 0.2496 | 1.4 | 2570 | 0.2789 |
| 0.269 | 1.41 | 2580 | 0.2806 |
| 0.2851 | 1.41 | 2590 | 0.2808 |
| 0.274 | 1.42 | 2600 | 0.2806 |
| 0.2365 | 1.42 | 2610 | 0.2814 |
| 0.2031 | 1.43 | 2620 | 0.2864 |
| 0.2371 | 1.44 | 2630 | 0.2901 |
| 0.2513 | 1.44 | 2640 | 0.2891 |
| 0.2393 | 1.45 | 2650 | 0.2843 |
| 0.2498 | 1.45 | 2660 | 0.2832 |
| 0.2634 | 1.46 | 2670 | 0.2842 |
| 0.2845 | 1.46 | 2680 | 0.2812 |
| 0.263 | 1.47 | 2690 | 0.2791 |
| 0.261 | 1.47 | 2700 | 0.2794 |
| 0.2543 | 1.48 | 2710 | 0.2802 |
| 0.278 | 1.48 | 2720 | 0.2812 |
| 0.2583 | 1.49 | 2730 | 0.2843 |
| 0.238 | 1.5 | 2740 | 0.2839 |
| 0.2525 | 1.5 | 2750 | 0.2821 |
| 0.2605 | 1.51 | 2760 | 0.2814 |
| 0.2673 | 1.51 | 2770 | 0.2826 |
| 0.2298 | 1.52 | 2780 | 0.2856 |
| 0.2409 | 1.52 | 2790 | 0.2872 |
| 0.2502 | 1.53 | 2800 | 0.2879 |
| 0.2569 | 1.53 | 2810 | 0.2864 |
| 0.2299 | 1.54 | 2820 | 0.2858 |
| 0.2508 | 1.54 | 2830 | 0.2843 |
| 0.213 | 1.55 | 2840 | 0.2839 |
| 0.2404 | 1.56 | 2850 | 0.2842 |
| 0.2626 | 1.56 | 2860 | 0.2823 |
| 0.2296 | 1.57 | 2870 | 0.2821 |
| 0.2284 | 1.57 | 2880 | 0.2834 |
| 0.2481 | 1.58 | 2890 | 0.2834 |
| 0.271 | 1.58 | 2900 | 0.2801 |
| 0.2779 | 1.59 | 2910 | 0.2796 |
| 0.2247 | 1.59 | 2920 | 0.2782 |
| 0.2334 | 1.6 | 2930 | 0.2785 |
| 0.2469 | 1.61 | 2940 | 0.2811 |
| 0.2526 | 1.61 | 2950 | 0.2801 |
| 0.2788 | 1.62 | 2960 | 0.2801 |
| 0.2463 | 1.62 | 2970 | 0.2804 |
| 0.2367 | 1.63 | 2980 | 0.2815 |
| 0.2096 | 1.63 | 2990 | 0.2822 |
| 0.2252 | 1.64 | 3000 | 0.2826 |
| 0.2724 | 1.64 | 3010 | 0.2829 |
| 0.2514 | 1.65 | 3020 | 0.2814 |
| 0.2396 | 1.65 | 3030 | 0.2802 |
| 0.2736 | 1.66 | 3040 | 0.2800 |
| 0.2549 | 1.67 | 3050 | 0.2798 |
| 0.2278 | 1.67 | 3060 | 0.2792 |
| 0.2857 | 1.68 | 3070 | 0.2798 |
| 0.249 | 1.68 | 3080 | 0.2804 |
| 0.2314 | 1.69 | 3090 | 0.2820 |
| 0.254 | 1.69 | 3100 | 0.2810 |
| 0.2812 | 1.7 | 3110 | 0.2812 |
| 0.2398 | 1.7 | 3120 | 0.2824 |
| 0.2418 | 1.71 | 3130 | 0.2839 |
| 0.2487 | 1.71 | 3140 | 0.2826 |
| 0.2839 | 1.72 | 3150 | 0.2840 |
| 0.2479 | 1.73 | 3160 | 0.2857 |
| 0.2579 | 1.73 | 3170 | 0.2834 |
| 0.2549 | 1.74 | 3180 | 0.2838 |
| 0.2406 | 1.74 | 3190 | 0.2872 |
| 0.262 | 1.75 | 3200 | 0.2879 |
| 0.2632 | 1.75 | 3210 | 0.2877 |
| 0.2256 | 1.76 | 3220 | 0.2850 |
| 0.263 | 1.76 | 3230 | 0.2837 |
| 0.27 | 1.77 | 3240 | 0.2826 |
| 0.2642 | 1.77 | 3250 | 0.2803 |
| 0.2423 | 1.78 | 3260 | 0.2801 |
| 0.2709 | 1.79 | 3270 | 0.2808 |
| 0.2409 | 1.79 | 3280 | 0.2832 |
| 0.2262 | 1.8 | 3290 | 0.2855 |
| 0.242 | 1.8 | 3300 | 0.2864 |
| 0.268 | 1.81 | 3310 | 0.2869 |
| 0.2767 | 1.81 | 3320 | 0.2857 |
| 0.264 | 1.82 | 3330 | 0.2848 |
| 0.2741 | 1.82 | 3340 | 0.2821 |
| 0.282 | 1.83 | 3350 | 0.2806 |
| 0.2616 | 1.83 | 3360 | 0.2796 |
| 0.2924 | 1.84 | 3370 | 0.2816 |
| 0.2563 | 1.85 | 3380 | 0.2826 |
| 0.2556 | 1.85 | 3390 | 0.2856 |
| 0.3117 | 1.86 | 3400 | 0.2832 |
| 0.2397 | 1.86 | 3410 | 0.2825 |
| 0.2329 | 1.87 | 3420 | 0.2842 |
| 0.2044 | 1.87 | 3430 | 0.2853 |
| 0.2469 | 1.88 | 3440 | 0.2870 |
| 0.2566 | 1.88 | 3450 | 0.2875 |
| 0.222 | 1.89 | 3460 | 0.2868 |
| 0.24 | 1.89 | 3470 | 0.2828 |
| 0.2582 | 1.9 | 3480 | 0.2827 |
| 0.2723 | 1.91 | 3490 | 0.2832 |
| 0.2731 | 1.91 | 3500 | 0.2825 |
| 0.2713 | 1.92 | 3510 | 0.2822 |
| 0.2558 | 1.92 | 3520 | 0.2815 |
| 0.2655 | 1.93 | 3530 | 0.2801 |
| 0.2461 | 1.93 | 3540 | 0.2801 |
| 0.2604 | 1.94 | 3550 | 0.2810 |
| 0.2393 | 1.94 | 3560 | 0.2809 |
| 0.2434 | 1.95 | 3570 | 0.2800 |
| 0.2252 | 1.95 | 3580 | 0.2830 |
| 0.2571 | 1.96 | 3590 | 0.2853 |
| 0.2809 | 1.97 | 3600 | 0.2834 |
| 0.245 | 1.97 | 3610 | 0.2799 |
| 0.2309 | 1.98 | 3620 | 0.2786 |
| 0.228 | 1.98 | 3630 | 0.2793 |
| 0.2546 | 1.99 | 3640 | 0.2797 |
| 0.268 | 1.99 | 3650 | 0.2813 |
| 0.2606 | 2.0 | 3660 | 0.2819 |
| 0.2064 | 2.0 | 3670 | 0.2823 |
| 0.2117 | 2.01 | 3680 | 0.2848 |
| 0.1986 | 2.01 | 3690 | 0.2876 |
| 0.2054 | 2.02 | 3700 | 0.2895 |
| 0.1849 | 2.03 | 3710 | 0.2923 |
| 0.1822 | 2.03 | 3720 | 0.2953 |
| 0.2254 | 2.04 | 3730 | 0.2973 |
| 0.1946 | 2.04 | 3740 | 0.2975 |
| 0.2095 | 2.05 | 3750 | 0.2982 |
| 0.215 | 2.05 | 3760 | 0.2961 |
| 0.2382 | 2.06 | 3770 | 0.2933 |
| 0.1946 | 2.06 | 3780 | 0.2919 |
| 0.1892 | 2.07 | 3790 | 0.2912 |
| 0.1666 | 2.07 | 3800 | 0.2924 |
| 0.1955 | 2.08 | 3810 | 0.2953 |
| 0.1646 | 2.09 | 3820 | 0.2959 |
| 0.2396 | 2.09 | 3830 | 0.2980 |
| 0.1862 | 2.1 | 3840 | 0.2996 |
| 0.1811 | 2.1 | 3850 | 0.2978 |
| 0.2129 | 2.11 | 3860 | 0.2966 |
| 0.1959 | 2.11 | 3870 | 0.2962 |
| 0.1969 | 2.12 | 3880 | 0.2958 |
| 0.219 | 2.12 | 3890 | 0.2965 |
| 0.1885 | 2.13 | 3900 | 0.2966 |
| 0.2433 | 2.13 | 3910 | 0.2944 |
| 0.2665 | 2.14 | 3920 | 0.2937 |
| 0.2126 | 2.15 | 3930 | 0.2927 |
| 0.218 | 2.15 | 3940 | 0.2914 |
| 0.2105 | 2.16 | 3950 | 0.2928 |
| 0.1908 | 2.16 | 3960 | 0.2938 |
| 0.1822 | 2.17 | 3970 | 0.2981 |
| 0.2149 | 2.17 | 3980 | 0.3015 |
| 0.2011 | 2.18 | 3990 | 0.3020 |
| 0.2381 | 2.18 | 4000 | 0.3004 |
| 0.2155 | 2.19 | 4010 | 0.2978 |
| 0.1989 | 2.19 | 4020 | 0.2994 |
| 0.206 | 2.2 | 4030 | 0.2988 |
| 0.1669 | 2.21 | 4040 | 0.3002 |
| 0.2143 | 2.21 | 4050 | 0.2977 |
| 0.1955 | 2.22 | 4060 | 0.2950 |
| 0.203 | 2.22 | 4070 | 0.2947 |
| 0.2407 | 2.23 | 4080 | 0.2943 |
| 0.202 | 2.23 | 4090 | 0.2936 |
| 0.1995 | 2.24 | 4100 | 0.2925 |
| 0.2061 | 2.24 | 4110 | 0.2946 |
| 0.2583 | 2.25 | 4120 | 0.2966 |
| 0.1948 | 2.25 | 4130 | 0.2981 |
| 0.2126 | 2.26 | 4140 | 0.2992 |
| 0.2032 | 2.27 | 4150 | 0.2980 |
| 0.2131 | 2.27 | 4160 | 0.2980 |
| 0.1854 | 2.28 | 4170 | 0.2996 |
| 0.2263 | 2.28 | 4180 | 0.2960 |
| 0.2191 | 2.29 | 4190 | 0.2965 |
| 0.234 | 2.29 | 4200 | 0.2972 |
| 0.2642 | 2.3 | 4210 | 0.2954 |
| 0.2234 | 2.3 | 4220 | 0.2966 |
| 0.1953 | 2.31 | 4230 | 0.2965 |
| 0.1917 | 2.31 | 4240 | 0.2983 |
| 0.202 | 2.32 | 4250 | 0.2978 |
| 0.2409 | 2.33 | 4260 | 0.2973 |
| 0.1934 | 2.33 | 4270 | 0.2957 |
| 0.2146 | 2.34 | 4280 | 0.2955 |
| 0.2162 | 2.34 | 4290 | 0.2997 |
| 0.1929 | 2.35 | 4300 | 0.3005 |
| 0.2275 | 2.35 | 4310 | 0.2986 |
| 0.2033 | 2.36 | 4320 | 0.2962 |
| 0.1949 | 2.36 | 4330 | 0.2943 |
| 0.2024 | 2.37 | 4340 | 0.2940 |
| 0.2438 | 2.37 | 4350 | 0.2954 |
| 0.208 | 2.38 | 4360 | 0.2969 |
| 0.2075 | 2.39 | 4370 | 0.2989 |
| 0.2436 | 2.39 | 4380 | 0.3001 |
| 0.2498 | 2.4 | 4390 | 0.3000 |
| 0.2046 | 2.4 | 4400 | 0.3002 |
| 0.2059 | 2.41 | 4410 | 0.3008 |
| 0.2226 | 2.41 | 4420 | 0.3008 |
| 0.2182 | 2.42 | 4430 | 0.3004 |
| 0.1929 | 2.42 | 4440 | 0.2998 |
| 0.2115 | 2.43 | 4450 | 0.2994 |
| 0.1898 | 2.43 | 4460 | 0.3000 |
| 0.1954 | 2.44 | 4470 | 0.3015 |
| 0.2181 | 2.45 | 4480 | 0.3010 |
| 0.1941 | 2.45 | 4490 | 0.2994 |
| 0.2325 | 2.46 | 4500 | 0.2977 |
| 0.1721 | 2.46 | 4510 | 0.2995 |
| 0.2326 | 2.47 | 4520 | 0.3000 |
| 0.1852 | 2.47 | 4530 | 0.2980 |
| 0.2475 | 2.48 | 4540 | 0.2979 |
| 0.1841 | 2.48 | 4550 | 0.2984 |
| 0.2014 | 2.49 | 4560 | 0.2996 |
| 0.2029 | 2.49 | 4570 | 0.3004 |
| 0.2122 | 2.5 | 4580 | 0.3012 |
| 0.2003 | 2.51 | 4590 | 0.3021 |
| 0.1822 | 2.51 | 4600 | 0.3042 |
| 0.246 | 2.52 | 4610 | 0.3061 |
| 0.2134 | 2.52 | 4620 | 0.3069 |
| 0.1931 | 2.53 | 4630 | 0.3063 |
| 0.2058 | 2.53 | 4640 | 0.3044 |
| 0.237 | 2.54 | 4650 | 0.3037 |
| 0.2188 | 2.54 | 4660 | 0.3036 |
| 0.2228 | 2.55 | 4670 | 0.3025 |
| 0.1872 | 2.55 | 4680 | 0.3001 |
| 0.2243 | 2.56 | 4690 | 0.3003 |
| 0.2329 | 2.57 | 4700 | 0.3006 |
| 0.1908 | 2.57 | 4710 | 0.3019 |
| 0.2003 | 2.58 | 4720 | 0.2991 |
| 0.2004 | 2.58 | 4730 | 0.2998 |
| 0.2116 | 2.59 | 4740 | 0.3000 |
| 0.1833 | 2.59 | 4750 | 0.2999 |
| 0.1811 | 2.6 | 4760 | 0.3007 |
| 0.2552 | 2.6 | 4770 | 0.2988 |
| 0.1911 | 2.61 | 4780 | 0.2962 |
| 0.2097 | 2.61 | 4790 | 0.2950 |
| 0.2278 | 2.62 | 4800 | 0.2945 |
| 0.224 | 2.63 | 4810 | 0.2938 |
| 0.2078 | 2.63 | 4820 | 0.2934 |
| 0.1998 | 2.64 | 4830 | 0.2927 |
| 0.185 | 2.64 | 4840 | 0.2927 |
| 0.2384 | 2.65 | 4850 | 0.2936 |
| 0.2291 | 2.65 | 4860 | 0.2935 |
| 0.2003 | 2.66 | 4870 | 0.2935 |
| 0.2198 | 2.66 | 4880 | 0.2936 |
| 0.2061 | 2.67 | 4890 | 0.2921 |
| 0.2059 | 2.68 | 4900 | 0.2925 |
| 0.2044 | 2.68 | 4910 | 0.2932 |
| 0.2061 | 2.69 | 4920 | 0.2933 |
| 0.1954 | 2.69 | 4930 | 0.2940 |
| 0.205 | 2.7 | 4940 | 0.2937 |
| 0.1863 | 2.7 | 4950 | 0.2938 |
| 0.2036 | 2.71 | 4960 | 0.2952 |
| 0.2242 | 2.71 | 4970 | 0.2955 |
| 0.1985 | 2.72 | 4980 | 0.2925 |
| 0.2372 | 2.72 | 4990 | 0.2910 |
| 0.2018 | 2.73 | 5000 | 0.2898 |
| 0.2337 | 2.74 | 5010 | 0.2903 |
| 0.2096 | 2.74 | 5020 | 0.2899 |
| 0.2303 | 2.75 | 5030 | 0.2909 |
| 0.2422 | 2.75 | 5040 | 0.2916 |
| 0.2121 | 2.76 | 5050 | 0.2904 |
| 0.2431 | 2.76 | 5060 | 0.2892 |
| 0.2362 | 2.77 | 5070 | 0.2888 |
| 0.2512 | 2.77 | 5080 | 0.2885 |
| 0.2106 | 2.78 | 5090 | 0.2888 |
| 0.2471 | 2.78 | 5100 | 0.2887 |
| 0.2347 | 2.79 | 5110 | 0.2895 |
| 0.2233 | 2.8 | 5120 | 0.2905 |
| 0.2056 | 2.8 | 5130 | 0.2897 |
| 0.2193 | 2.81 | 5140 | 0.2888 |
| 0.2017 | 2.81 | 5150 | 0.2889 |
| 0.2278 | 2.82 | 5160 | 0.2899 |
| 0.2126 | 2.82 | 5170 | 0.2908 |
| 0.2196 | 2.83 | 5180 | 0.2910 |
| 0.2003 | 2.83 | 5190 | 0.2921 |
| 0.2054 | 2.84 | 5200 | 0.2921 |
| 0.214 | 2.84 | 5210 | 0.2922 |
| 0.2199 | 2.85 | 5220 | 0.2924 |
| 0.1937 | 2.86 | 5230 | 0.2932 |
| 0.2092 | 2.86 | 5240 | 0.2930 |
| 0.2247 | 2.87 | 5250 | 0.2931 |
| 0.2079 | 2.87 | 5260 | 0.2927 |
| 0.1823 | 2.88 | 5270 | 0.2937 |
| 0.2123 | 2.88 | 5280 | 0.2946 |
| 0.2186 | 2.89 | 5290 | 0.2946 |
| 0.2402 | 2.89 | 5300 | 0.2931 |
| 0.2237 | 2.9 | 5310 | 0.2920 |
| 0.2385 | 2.9 | 5320 | 0.2909 |
| 0.2099 | 2.91 | 5330 | 0.2905 |
| 0.2114 | 2.92 | 5340 | 0.2911 |
| 0.2172 | 2.92 | 5350 | 0.2917 |
| 0.1933 | 2.93 | 5360 | 0.2928 |
| 0.2114 | 2.93 | 5370 | 0.2932 |
| 0.2336 | 2.94 | 5380 | 0.2932 |
| 0.2572 | 2.94 | 5390 | 0.2930 |
| 0.2003 | 2.95 | 5400 | 0.2918 |
| 0.2198 | 2.95 | 5410 | 0.2916 |
| 0.2539 | 2.96 | 5420 | 0.2914 |
| 0.2198 | 2.96 | 5430 | 0.2910 |
| 0.198 | 2.97 | 5440 | 0.2911 |
| 0.2302 | 2.98 | 5450 | 0.2914 |
| 0.2102 | 2.98 | 5460 | 0.2918 |
| 0.2476 | 2.99 | 5470 | 0.2915 |
| 0.2052 | 2.99 | 5480 | 0.2914 |
| 0.2329 | 3.0 | 5490 | 0.2909 |
| 0.202 | 3.0 | 5500 | 0.2914 |
| 0.1624 | 3.01 | 5510 | 0.2916 |
| 0.166 | 3.01 | 5520 | 0.2911 |
| 0.1646 | 3.02 | 5530 | 0.2912 |
| 0.193 | 3.02 | 5540 | 0.2916 |
| 0.1849 | 3.03 | 5550 | 0.2930 |
| 0.1734 | 3.04 | 5560 | 0.2940 |
| 0.1748 | 3.04 | 5570 | 0.2943 |
| 0.1608 | 3.05 | 5580 | 0.2957 |
| 0.169 | 3.05 | 5590 | 0.2957 |
| 0.1853 | 3.06 | 5600 | 0.2965 |
| 0.1752 | 3.06 | 5610 | 0.2970 |
| 0.161 | 3.07 | 5620 | 0.2980 |
| 0.1901 | 3.07 | 5630 | 0.2980 |
| 0.1727 | 3.08 | 5640 | 0.2990 |
| 0.1528 | 3.08 | 5650 | 0.2998 |
| 0.1808 | 3.09 | 5660 | 0.3006 |
| 0.1739 | 3.1 | 5670 | 0.3007 |
| 0.2049 | 3.1 | 5680 | 0.2994 |
| 0.1911 | 3.11 | 5690 | 0.2986 |
| 0.1907 | 3.11 | 5700 | 0.2986 |
| 0.1854 | 3.12 | 5710 | 0.2993 |
| 0.1677 | 3.12 | 5720 | 0.2996 |
| 0.1882 | 3.13 | 5730 | 0.2998 |
| 0.1874 | 3.13 | 5740 | 0.2997 |
| 0.1492 | 3.14 | 5750 | 0.3005 |
| 0.1886 | 3.14 | 5760 | 0.3011 |
| 0.1836 | 3.15 | 5770 | 0.3012 |
| 0.1772 | 3.16 | 5780 | 0.3008 |
| 0.1699 | 3.16 | 5790 | 0.3009 |
| 0.1633 | 3.17 | 5800 | 0.3023 |
| 0.2072 | 3.17 | 5810 | 0.3022 |
| 0.1921 | 3.18 | 5820 | 0.3023 |
| 0.199 | 3.18 | 5830 | 0.3027 |
| 0.1901 | 3.19 | 5840 | 0.3019 |
| 0.1769 | 3.19 | 5850 | 0.3016 |
| 0.1857 | 3.2 | 5860 | 0.3002 |
| 0.1613 | 3.2 | 5870 | 0.2994 |
| 0.1843 | 3.21 | 5880 | 0.3000 |
| 0.1895 | 3.22 | 5890 | 0.3014 |
| 0.1695 | 3.22 | 5900 | 0.3017 |
| 0.1525 | 3.23 | 5910 | 0.3026 |
| 0.2128 | 3.23 | 5920 | 0.3027 |
| 0.182 | 3.24 | 5930 | 0.3025 |
| 0.2059 | 3.24 | 5940 | 0.3016 |
| 0.1631 | 3.25 | 5950 | 0.3001 |
| 0.2012 | 3.25 | 5960 | 0.2999 |
| 0.1745 | 3.26 | 5970 | 0.2996 |
| 0.1934 | 3.26 | 5980 | 0.2998 |
| 0.1665 | 3.27 | 5990 | 0.3005 |
| 0.1733 | 3.28 | 6000 | 0.3018 |
| 0.17 | 3.28 | 6010 | 0.3029 |
| 0.192 | 3.29 | 6020 | 0.3047 |
| 0.1909 | 3.29 | 6030 | 0.3055 |
| 0.1854 | 3.3 | 6040 | 0.3047 |
| 0.1876 | 3.3 | 6050 | 0.3046 |
| 0.1538 | 3.31 | 6060 | 0.3051 |
| 0.1679 | 3.31 | 6070 | 0.3056 |
| 0.1927 | 3.32 | 6080 | 0.3063 |
| 0.1922 | 3.32 | 6090 | 0.3066 |
| 0.1976 | 3.33 | 6100 | 0.3064 |
| 0.1683 | 3.34 | 6110 | 0.3053 |
| 0.1891 | 3.34 | 6120 | 0.3045 |
| 0.1878 | 3.35 | 6130 | 0.3031 |
| 0.174 | 3.35 | 6140 | 0.3028 |
| 0.1773 | 3.36 | 6150 | 0.3027 |
| 0.2057 | 3.36 | 6160 | 0.3022 |
| 0.1838 | 3.37 | 6170 | 0.3022 |
| 0.1976 | 3.37 | 6180 | 0.3020 |
| 0.1802 | 3.38 | 6190 | 0.3006 |
| 0.1734 | 3.38 | 6200 | 0.3005 |
| 0.1806 | 3.39 | 6210 | 0.3008 |
| 0.1675 | 3.4 | 6220 | 0.3009 |
| 0.1998 | 3.4 | 6230 | 0.3008 |
| 0.1838 | 3.41 | 6240 | 0.3013 |
| 0.1693 | 3.41 | 6250 | 0.3025 |
| 0.1795 | 3.42 | 6260 | 0.3033 |
| 0.2184 | 3.42 | 6270 | 0.3034 |
| 0.1881 | 3.43 | 6280 | 0.3031 |
| 0.1943 | 3.43 | 6290 | 0.3027 |
| 0.1717 | 3.44 | 6300 | 0.3028 |
| 0.2016 | 3.44 | 6310 | 0.3028 |
| 0.2001 | 3.45 | 6320 | 0.3026 |
| 0.158 | 3.46 | 6330 | 0.3025 |
| 0.1562 | 3.46 | 6340 | 0.3033 |
| 0.1698 | 3.47 | 6350 | 0.3039 |
| 0.2106 | 3.47 | 6360 | 0.3045 |
| 0.2117 | 3.48 | 6370 | 0.3048 |
| 0.1635 | 3.48 | 6380 | 0.3051 |
| 0.1853 | 3.49 | 6390 | 0.3055 |
| 0.2109 | 3.49 | 6400 | 0.3058 |
| 0.1838 | 3.5 | 6410 | 0.3057 |
| 0.1789 | 3.5 | 6420 | 0.3051 |
| 0.2018 | 3.51 | 6430 | 0.3035 |
| 0.1885 | 3.52 | 6440 | 0.3019 |
| 0.189 | 3.52 | 6450 | 0.3011 |
| 0.1718 | 3.53 | 6460 | 0.3006 |
| 0.1711 | 3.53 | 6470 | 0.3006 |
| 0.1804 | 3.54 | 6480 | 0.3013 |
| 0.1677 | 3.54 | 6490 | 0.3021 |
| 0.1861 | 3.55 | 6500 | 0.3023 |
| 0.1634 | 3.55 | 6510 | 0.3027 |
| 0.2138 | 3.56 | 6520 | 0.3030 |
| 0.185 | 3.56 | 6530 | 0.3036 |
| 0.1744 | 3.57 | 6540 | 0.3037 |
| 0.1893 | 3.58 | 6550 | 0.3033 |
| 0.1856 | 3.58 | 6560 | 0.3027 |
| 0.1951 | 3.59 | 6570 | 0.3022 |
| 0.1787 | 3.59 | 6580 | 0.3019 |
| 0.1817 | 3.6 | 6590 | 0.3017 |
| 0.1911 | 3.6 | 6600 | 0.3012 |
| 0.204 | 3.61 | 6610 | 0.3012 |
| 0.1643 | 3.61 | 6620 | 0.3008 |
| 0.1766 | 3.62 | 6630 | 0.3006 |
| 0.1846 | 3.62 | 6640 | 0.3004 |
| 0.1792 | 3.63 | 6650 | 0.3007 |
| 0.1924 | 3.64 | 6660 | 0.3008 |
| 0.1723 | 3.64 | 6670 | 0.3007 |
| 0.1769 | 3.65 | 6680 | 0.3009 |
| 0.1702 | 3.65 | 6690 | 0.3013 |
| 0.1681 | 3.66 | 6700 | 0.3016 |
| 0.1885 | 3.66 | 6710 | 0.3016 |
| 0.2321 | 3.67 | 6720 | 0.3012 |
| 0.1648 | 3.67 | 6730 | 0.3011 |
| 0.1941 | 3.68 | 6740 | 0.3011 |
| 0.1729 | 3.69 | 6750 | 0.3008 |
| 0.1854 | 3.69 | 6760 | 0.3004 |
| 0.1844 | 3.7 | 6770 | 0.3002 |
| 0.1811 | 3.7 | 6780 | 0.3001 |
| 0.1957 | 3.71 | 6790 | 0.3001 |
| 0.1988 | 3.71 | 6800 | 0.2999 |
| 0.2051 | 3.72 | 6810 | 0.2994 |
| 0.2074 | 3.72 | 6820 | 0.2987 |
| 0.1825 | 3.73 | 6830 | 0.2985 |
| 0.1802 | 3.73 | 6840 | 0.2986 |
| 0.1904 | 3.74 | 6850 | 0.2985 |
| 0.1676 | 3.75 | 6860 | 0.2986 |
| 0.1999 | 3.75 | 6870 | 0.2987 |
| 0.1791 | 3.76 | 6880 | 0.2990 |
| 0.1763 | 3.76 | 6890 | 0.2997 |
| 0.1964 | 3.77 | 6900 | 0.2999 |
| 0.2253 | 3.77 | 6910 | 0.2998 |
| 0.2008 | 3.78 | 6920 | 0.2995 |
| 0.1881 | 3.78 | 6930 | 0.2995 |
| 0.1945 | 3.79 | 6940 | 0.2992 |
| 0.1904 | 3.79 | 6950 | 0.2991 |
| 0.1718 | 3.8 | 6960 | 0.2991 |
| 0.2027 | 3.81 | 6970 | 0.2990 |
| 0.1889 | 3.81 | 6980 | 0.2989 |
| 0.1663 | 3.82 | 6990 | 0.2989 |
| 0.2035 | 3.82 | 7000 | 0.2988 |
| 0.1859 | 3.83 | 7010 | 0.2989 |
| 0.1852 | 3.83 | 7020 | 0.2992 |
| 0.1834 | 3.84 | 7030 | 0.2994 |
| 0.1643 | 3.84 | 7040 | 0.2996 |
| 0.1779 | 3.85 | 7050 | 0.3000 |
| 0.1875 | 3.85 | 7060 | 0.3002 |
| 0.187 | 3.86 | 7070 | 0.3003 |
| 0.1933 | 3.87 | 7080 | 0.3004 |
| 0.1777 | 3.87 | 7090 | 0.3002 |
| 0.1796 | 3.88 | 7100 | 0.3000 |
| 0.2047 | 3.88 | 7110 | 0.2998 |
| 0.1829 | 3.89 | 7120 | 0.2997 |
| 0.1846 | 3.89 | 7130 | 0.2996 |
| 0.2037 | 3.9 | 7140 | 0.2995 |
| 0.1805 | 3.9 | 7150 | 0.2994 |
| 0.1937 | 3.91 | 7160 | 0.2994 |
| 0.2001 | 3.91 | 7170 | 0.2994 |
| 0.2366 | 3.92 | 7180 | 0.2993 |
| 0.1813 | 3.93 | 7190 | 0.2992 |
| 0.1777 | 3.93 | 7200 | 0.2992 |
| 0.2077 | 3.94 | 7210 | 0.2992 |
| 0.1887 | 3.94 | 7220 | 0.2992 |
| 0.1966 | 3.95 | 7230 | 0.2991 |
| 0.1888 | 3.95 | 7240 | 0.2991 |
| 0.1931 | 3.96 | 7250 | 0.2991 |
| 0.1773 | 3.96 | 7260 | 0.2991 |
| 0.1701 | 3.97 | 7270 | 0.2992 |
| 0.1698 | 3.97 | 7280 | 0.2992 |
| 0.1855 | 3.98 | 7290 | 0.2992 |
| 0.2032 | 3.99 | 7300 | 0.2991 |
| 0.1966 | 3.99 | 7310 | 0.2991 |
| 0.1757 | 4.0 | 7320 | 0.2991 |
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
hikami172/llama-2-7b-chat-topicextract-GGUF | hikami172 | 2023-09-18T18:40:09Z | 2 | 1 | peft | [
"peft",
"gguf",
"llama",
"endpoints_compatible",
"region:us"
]
| null | 2023-09-18T17:31:28Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
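For reference, a sketch of expressing this config with `transformers`' `BitsAndBytesConfig` (values mirror the list above; the base model this adapter targets is not stated on the card):

```python
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
# Pass as: AutoModelForCausalLM.from_pretrained(<base model>, quantization_config=bnb_config)
```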
### Framework versions
- PEFT 0.4.0
|
CyberHarem/wakabayashi_tomoka_idolmastercinderellagirls | CyberHarem | 2023-09-18T18:24:01Z | 0 | 1 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/wakabayashi_tomoka_idolmastercinderellagirls",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T18:06:08Z | ---
license: mit
datasets:
- CyberHarem/wakabayashi_tomoka_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---
# Lora of wakabayashi_tomoka_idolmastercinderellagirls
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion); the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the `pt` and `safetensors` files for the specified step, use them together: the `pt` file is loaded as an embedding, while the `safetensors` file is loaded as a LoRA.
For example, if you want to use the model from step 4420, you need to download `4420/wakabayashi_tomoka_idolmastercinderellagirls.pt` as the embedding and `4420/wakabayashi_tomoka_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
**The best step we recommend is 4420**, with the score of 0.770. The trigger words are:
1. `wakabayashi_tomoka_idolmastercinderellagirls`
2. `long_hair, ponytail, brown_hair, smile, brown_eyes, open_mouth, blush, breasts, armpits, hair_ornament, medium_breasts`
We do not recommend this model for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviation from the original character design, even in the slightest detail.
2. Individuals whose use cases demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the inherent randomness of AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are uncomfortable with the fully automated process of training character models with LoRA, or who believe that character models must be trained purely by hand out of respect for the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | pattern_12 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:----------------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 5100 | 0.701 | [Download](5100/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5100/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) |  |  |
| 4760 | 0.767 | [Download](4760/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) |  |  |
| **4420** | **0.770** | [**Download**](4420/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4420/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) |  |  |
| 4080 | 0.760 | [Download](4080/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) |  |  |
| 3740 | 0.728 | [Download](3740/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3740/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) |  |  |
| 3400 | 0.746 | [Download](3400/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) |  |  |
| 3060 | 0.644 | [Download](3060/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3060/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) |  |  |
| 2720 | 0.625 | [Download](2720/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) |  |  |
| 2380 | 0.688 | [Download](2380/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2380/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) |  |  |
| 2040 | 0.741 | [Download](2040/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) |  |  |
| 1700 | 0.671 | [Download](1700/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1700/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) |  |  |
| 1360 | 0.615 | [Download](1360/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) |  |  |
| 1020 | 0.495 | [Download](1020/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1020/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) |  |  |
| 680 | 0.455 | [Download](680/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) |  |  |
| 340 | 0.262 | [Download](340/wakabayashi_tomoka_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](340/previews/bondage.png) |  |  |  | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) |  |  |
|
aminh/squad-bloom-1b7b | aminh | 2023-09-18T17:49:33Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T17:49:27Z | ---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.6.0.dev0
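## Usage

The base model is not stated on this card; the repo name suggests BLOOM-1b7, but treat that as an assumption. A minimal PEFT loading sketch:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "bigscience/bloom-1b7"  # assumption inferred from the repo name
base = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, "aminh/squad-bloom-1b7b")  # attach the adapter
tokenizer = AutoTokenizer.from_pretrained(base_id)
```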
|
yskaraman/sd-class-butterflies-32 | yskaraman | 2023-09-18T17:46:54Z | 44 | 0 | diffusers | [
"diffusers",
"safetensors",
"pytorch",
"unconditional-image-generation",
"diffusion-models-class",
"license:mit",
"diffusers:DDPMPipeline",
"region:us"
]
| unconditional-image-generation | 2023-09-18T17:46:49Z | ---
license: mit
tags:
- pytorch
- diffusers
- unconditional-image-generation
- diffusion-models-class
---
# Model Card for Unit 1 of the [Diffusion Models Class 🧨](https://github.com/huggingface/diffusion-models-class)
This model is a diffusion model for unconditional image generation of cute 🦋.
## Usage
```python
from diffusers import DDPMPipeline

# Load the trained unconditional DDPM pipeline from the Hub
pipeline = DDPMPipeline.from_pretrained('yskaraman/sd-class-butterflies-32')

# Sample one butterfly image from random noise
image = pipeline().images[0]
image
```
|
fauzifadhi/image-classificaation | fauzifadhi | 2023-09-18T17:38:36Z | 26 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T15:54:51Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: image-classificaation
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image-classificaation
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
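Pending documentation, a minimal inference sketch (the label set comes from the unspecified `imagefolder` training data):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="fauzifadhi/image-classificaation")
print(classifier("example.jpg"))  # path to a local image; replace with your own file
```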
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
CyberHarem/high_elf_archer_goblinslayer | CyberHarem | 2023-09-18T17:32:58Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/high_elf_archer_goblinslayer",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T17:12:48Z | ---
license: mit
datasets:
- CyberHarem/high_elf_archer_goblinslayer
pipeline_tag: text-to-image
tags:
- art
---
# Lora of high_elf_archer_goblinslayer
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion); the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the `pt` and `safetensors` files for the specified step, use them together: the `pt` file is loaded as an embedding, while the `safetensors` file is loaded as a LoRA.
For example, if you want to use the model from step 8960, you need to download `8960/high_elf_archer_goblinslayer.pt` as the embedding and `8960/high_elf_archer_goblinslayer.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
**The best step we recommend is 8960**, with the score of 0.961. The trigger words are:
1. `high_elf_archer_goblinslayer`
2. `pointy_ears, elf, green_hair, sidelocks, green_eyes, bangs, bow, hair_bow, black_bow, hair_between_eyes, cloak, long_hair`
We do not recommend this model for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviation from the original character design, even in the slightest detail.
2. Individuals whose use cases demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the inherent randomness of AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are uncomfortable with the fully automated process of training character models with LoRA, or who believe that character models must be trained purely by hand out of respect for the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | pattern_12 | pattern_13 | pattern_14 | pattern_15 | pattern_16 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 9600 | 0.923 | [Download](9600/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](9600/previews/pattern_16.png) |  | [<NSFW, click to see>](9600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](9600/previews/nude.png) | [<NSFW, click to see>](9600/previews/nude2.png) |  |  |
| **8960** | **0.961** | [**Download**](8960/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](8960/previews/pattern_16.png) |  | [<NSFW, click to see>](8960/previews/bondage.png) |  |  |  | [<NSFW, click to see>](8960/previews/nude.png) | [<NSFW, click to see>](8960/previews/nude2.png) |  |  |
| 8320 | 0.957 | [Download](8320/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](8320/previews/pattern_16.png) |  | [<NSFW, click to see>](8320/previews/bondage.png) |  |  |  | [<NSFW, click to see>](8320/previews/nude.png) | [<NSFW, click to see>](8320/previews/nude2.png) |  |  |
| 7680 | 0.952 | [Download](7680/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](7680/previews/pattern_16.png) |  | [<NSFW, click to see>](7680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7680/previews/nude.png) | [<NSFW, click to see>](7680/previews/nude2.png) |  |  |
| 7040 | 0.925 | [Download](7040/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](7040/previews/pattern_16.png) |  | [<NSFW, click to see>](7040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7040/previews/nude.png) | [<NSFW, click to see>](7040/previews/nude2.png) |  |  |
| 6400 | 0.918 | [Download](6400/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](6400/previews/pattern_16.png) |  | [<NSFW, click to see>](6400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6400/previews/nude.png) | [<NSFW, click to see>](6400/previews/nude2.png) |  |  |
| 5760 | 0.916 | [Download](5760/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5760/previews/pattern_16.png) |  | [<NSFW, click to see>](5760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5760/previews/nude.png) | [<NSFW, click to see>](5760/previews/nude2.png) |  |  |
| 5120 | 0.957 | [Download](5120/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5120/previews/pattern_16.png) |  | [<NSFW, click to see>](5120/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5120/previews/nude.png) | [<NSFW, click to see>](5120/previews/nude2.png) |  |  |
| 4480 | 0.954 | [Download](4480/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4480/previews/pattern_16.png) |  | [<NSFW, click to see>](4480/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4480/previews/nude.png) | [<NSFW, click to see>](4480/previews/nude2.png) |  |  |
| 3840 | 0.953 | [Download](3840/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3840/previews/pattern_16.png) |  | [<NSFW, click to see>](3840/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3840/previews/nude.png) | [<NSFW, click to see>](3840/previews/nude2.png) |  |  |
| 3200 | 0.954 | [Download](3200/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3200/previews/pattern_16.png) |  | [<NSFW, click to see>](3200/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3200/previews/nude.png) | [<NSFW, click to see>](3200/previews/nude2.png) |  |  |
| 2560 | 0.951 | [Download](2560/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2560/previews/pattern_16.png) |  | [<NSFW, click to see>](2560/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2560/previews/nude.png) | [<NSFW, click to see>](2560/previews/nude2.png) |  |  |
| 1920 | 0.948 | [Download](1920/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1920/previews/pattern_16.png) |  | [<NSFW, click to see>](1920/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1920/previews/nude.png) | [<NSFW, click to see>](1920/previews/nude2.png) |  |  |
| 1280 | 0.949 | [Download](1280/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1280/previews/pattern_16.png) |  | [<NSFW, click to see>](1280/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1280/previews/nude.png) | [<NSFW, click to see>](1280/previews/nude2.png) |  |  |
| 640 | 0.878 | [Download](640/high_elf_archer_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](640/previews/pattern_16.png) |  | [<NSFW, click to see>](640/previews/bondage.png) |  |  |  | [<NSFW, click to see>](640/previews/nude.png) | [<NSFW, click to see>](640/previews/nude2.png) |  |  |
|
Noorrabie/my_awesome_model | Noorrabie | 2023-09-18T17:24:54Z | 62 | 0 | transformers | [
"transformers",
"tf",
"bert",
"multiple-choice",
"generated_from_keras_callback",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
]
| multiple-choice | 2023-09-18T16:17:12Z | ---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: Noorrabie/my_awesome_model
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Noorrabie/my_awesome_model
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.6069
- Validation Loss: 1.6047
- Train Accuracy: 0.25
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 22, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False} (see the sketch below)
- training_precision: float32
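A plain Keras sketch reproducing this optimizer (values mirror the config above):

```python
import tensorflow as tf

# Linear decay over 22 steps, matching the PolynomialDecay config above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=5e-05,
    decay_steps=22,
    end_learning_rate=0.0,
    power=1.0,
)
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule, beta_1=0.9, beta_2=0.999, epsilon=1e-08
)
```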
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 1.6137 | 1.6079 | 0.25 | 0 |
| 1.6069 | 1.6047 | 0.25 | 1 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3
|
922-Narra/llama-2-7b-chat-tagalog-v0.3 | 922-Narra | 2023-09-18T16:57:48Z | 16 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"dataset:922-Narra/lt_08312023_test_5j1",
"license:llama2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-08-31T20:08:53Z | ---
license: llama2
datasets:
- 922-Narra/lt_08312023_test_5j1
---
# Taga-llama-v0.3:
* Test model fine-tuned on an experimental Tagalog-focused dataset of ~1k items (based on Tagalog sentences augmented by the LLaMA-2-13b base model into a mostly 3-turn dialogue dataset between Human and Assistant)
* Base: LLaMA-2 7b chat
* [GGMLs](https://huggingface.co/922-Narra/llama-2-7b-chat-tagalog-v0.3-ggml), [GGUFs](https://huggingface.co/922-Narra/llama-2-7b-chat-tagalog-v0.3-gguf)
* [QLoras (hf and GGML)](https://huggingface.co/922-Narra/tagalog-lm-lora-tests/tree/main/llama-2-7b-chat-tagalog-0.3)
### USAGE
This is meant mainly as a chat model.
Use "Human" and "Assistant" as the turn labels and prompt in Tagalog. Example:
"Ito ay isang chat log sa pagitan ng AI Assistant na nagta-Tagalog at isang Pilipino. Magsimula ng chat:\nHuman: Hello po?\nAssistant:"
### HYPERPARAMS
* Trained for 1 epoch
* rank: 16
* lora alpha: 32
* lora dropout: 0.5
* lr: 2e-4
* batch size: 2
* warmup ratio: 0.075
* grad steps: 4
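Expressed as a `peft` `LoraConfig`, these settings would look roughly like this (the target modules are not stated on the card, so the ones below are an assumption):

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.5,
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "v_proj"],  # assumption: not specified on the card
)
```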
### WARNINGS AND DISCLAIMERS
Note that, aside from formatting and other minor edits, the dataset is used mostly as-is from the LM augmentation. As such, while this version may be better at coherence and chatting than our previous Tagalog models, conversations may still switch between languages or easily derail.
There is a chance that the model may switch back to English (while still understanding Tagalog inputs) as conversations grow longer, resulting in English-Tagalog conversations; this may be due to the limited 3-turn nature of the dataset. Additionally, Taglish occurring in the dataset, or any use of English, may sometimes make the model more likely to output Taglish or even English responses.
Note that we use a partially synthetic dataset due to the lack of readily available Tagalog dialogue datasets, but we take this as an opportunity to observe the Tagalog capability of the base LLaMA-2. We plan to further curate the dataset (and fine-tune later model versions on it) and release a final cleaned version.
Finally, this model is not guaranteed to output aligned or safe outputs nor is it meant for production use - use at your own risk! |
kayleenp/image_classification | kayleenp | 2023-09-18T16:55:42Z | 218 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T13:05:58Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: image_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.46875
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5552
- Accuracy: 0.4688
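A minimal usage sketch (the image path is a placeholder):
```python
from transformers import pipeline

# Labels are the classes of the imagefolder dataset used for fine-tuning.
classifier = pipeline("image-classification", model="kayleenp/image_classification")
print(classifier("example.jpg"))
```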
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 9e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.7654 | 0.3125 |
| No log | 2.0 | 80 | 1.5370 | 0.4813 |
| No log | 3.0 | 120 | 1.4791 | 0.4813 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
kausarme/image_classification | kausarme | 2023-09-18T16:49:24Z | 267 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T16:42:23Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: image_classification
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
AlVrde/lora_giec_5epochs_newway | AlVrde | 2023-09-18T16:32:57Z | 1 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T16:32:54Z | ---
library_name: peft
---
## Training procedure
The following `gptq` quantization config was used during training:
- quant_method: gptq
- bits: 4
- tokenizer: None
- dataset: None
- group_size: 128
- damp_percent: 0.01
- desc_act: False
- sym: True
- true_sequential: True
- use_cuda_fp16: False
- model_seqlen: None
- block_name_to_quantize: None
- module_name_preceding_first_block: None
- batch_size: 1
- pad_token_id: None
- disable_exllama: True
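As a minimal sketch, the same settings map onto a `transformers` `GPTQConfig` (assuming a transformers version that still accepts `disable_exllama`):
```python
from transformers import GPTQConfig

# Mirrors the quantization settings listed above.
quantization_config = GPTQConfig(
    bits=4,
    group_size=128,
    damp_percent=0.01,
    desc_act=False,
    sym=True,
    true_sequential=True,
    use_cuda_fp16=False,
    disable_exllama=True,
)
```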
### Framework versions
- PEFT 0.5.0
|
stablediffusionapi/stable-diffusion-xl-base-1.0 | stablediffusionapi | 2023-09-18T16:29:08Z | 66 | 0 | diffusers | [
"diffusers",
"stablediffusionapi.com",
"stable-diffusion-api",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
]
| text-to-image | 2023-09-18T16:22:37Z | ---
license: creativeml-openrail-m
tags:
- stablediffusionapi.com
- stable-diffusion-api
- text-to-image
- ultra-realistic
pinned: true
---
# stable-diffusion-xl-base-1.0 API Inference

## Get API Key
Get API key from [Stable Diffusion API](http://stablediffusionapi.com/), No Payment needed.
Replace the key in the code below and change **model_id** to "stable-diffusion-xl-base-1.0".
Coding in PHP/Node/Java etc.? Have a look at the docs for more code examples: [View docs](https://stablediffusionapi.com/docs)
Try model for free: [Generate Images](https://stablediffusionapi.com/models/stable-diffusion-xl-base-1.0)
Model link: [View model](https://stablediffusionapi.com/models/stable-diffusion-xl-base-1.0)
Credits: [View credits](https://civitai.com/?query=stable-diffusion-xl-base-1.0)
View all models: [View Models](https://stablediffusionapi.com/models)
```python
import requests
import json

url = "https://stablediffusionapi.com/api/v4/dreambooth"

# json.dumps serializes the payload; Python None values become JSON null.
payload = json.dumps({
    "key": "your_api_key",
    "model_id": "stable-diffusion-xl-base-1.0",
    "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
    "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "safety_checker": "no",
    "enhance_prompt": "yes",
    "seed": None,
    "guidance_scale": 7.5,
    "multi_lingual": "no",
    "panorama": "no",
    "self_attention": "no",
    "upscale": "no",
    "embeddings": "embeddings_model_id",
    "lora": "lora_model_id",
    "webhook": None,
    "track_id": None
})

headers = {
    'Content-Type': 'application/json'
}

# POST the request and print the raw JSON response.
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```
> Use this coupon code to get 25% off **DMGG0RBN** |
CyberHarem/fujimoto_rina_idolmastercinderellagirls | CyberHarem | 2023-09-18T16:28:51Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/fujimoto_rina_idolmastercinderellagirls",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T16:16:58Z | ---
license: mit
datasets:
- CyberHarem/fujimoto_rina_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---
# Lora of fujimoto_rina_idolmastercinderellagirls
This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 3060, you need to download `3060/fujimoto_rina_idolmastercinderellagirls.pt` as the embedding and `3060/fujimoto_rina_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
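As a hedged `diffusers` sketch of that two-file setup (the loading calls are an assumption; the card itself does not prescribe a pipeline):
```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11", torch_dtype=torch.float16
).to("cuda")

# The pt file loads as a textual-inversion embedding, the safetensors file as a LoRA.
pipe.load_textual_inversion(
    "3060/fujimoto_rina_idolmastercinderellagirls.pt",
    token="fujimoto_rina_idolmastercinderellagirls",
)
pipe.load_lora_weights("3060/fujimoto_rina_idolmastercinderellagirls.safetensors")

image = pipe("fujimoto_rina_idolmastercinderellagirls, smile").images[0]
image.save("preview.png")
```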
**The best step we recommend is 3060**, with the score of 0.980. The trigger words are:
1. `fujimoto_rina_idolmastercinderellagirls`
2. `blonde_hair, long_hair, jewelry, earrings, smile, grey_eyes, breasts, blush, open_mouth`
For the following groups, it is not recommended to use this model and we express regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who face application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:----------------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 5100 | 0.964 | [Download](5100/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](5100/previews/pattern_2.png) | [<NSFW, click to see>](5100/previews/pattern_3.png) |  | [<NSFW, click to see>](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) |  |  |
| 4760 | 0.954 | [Download](4760/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](4760/previews/pattern_2.png) | [<NSFW, click to see>](4760/previews/pattern_3.png) |  | [<NSFW, click to see>](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) |  |  |
| 4420 | 0.968 | [Download](4420/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](4420/previews/pattern_2.png) | [<NSFW, click to see>](4420/previews/pattern_3.png) |  | [<NSFW, click to see>](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) |  |  |
| 4080 | 0.975 | [Download](4080/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](4080/previews/pattern_2.png) | [<NSFW, click to see>](4080/previews/pattern_3.png) |  | [<NSFW, click to see>](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) |  |  |
| 3740 | 0.976 | [Download](3740/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](3740/previews/pattern_2.png) | [<NSFW, click to see>](3740/previews/pattern_3.png) |  | [<NSFW, click to see>](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) |  |  |
| 3400 | 0.970 | [Download](3400/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](3400/previews/pattern_2.png) | [<NSFW, click to see>](3400/previews/pattern_3.png) |  | [<NSFW, click to see>](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) |  |  |
| **3060** | **0.980** | [**Download**](3060/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](3060/previews/pattern_2.png) | [<NSFW, click to see>](3060/previews/pattern_3.png) |  | [<NSFW, click to see>](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) |  |  |
| 2720 | 0.924 | [Download](2720/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](2720/previews/pattern_2.png) | [<NSFW, click to see>](2720/previews/pattern_3.png) |  | [<NSFW, click to see>](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) |  |  |
| 2380 | 0.920 | [Download](2380/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](2380/previews/pattern_2.png) | [<NSFW, click to see>](2380/previews/pattern_3.png) |  | [<NSFW, click to see>](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) |  |  |
| 2040 | 0.956 | [Download](2040/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](2040/previews/pattern_2.png) | [<NSFW, click to see>](2040/previews/pattern_3.png) |  | [<NSFW, click to see>](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) |  |  |
| 1700 | 0.918 | [Download](1700/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](1700/previews/pattern_2.png) | [<NSFW, click to see>](1700/previews/pattern_3.png) |  | [<NSFW, click to see>](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) |  |  |
| 1360 | 0.919 | [Download](1360/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](1360/previews/pattern_2.png) | [<NSFW, click to see>](1360/previews/pattern_3.png) |  | [<NSFW, click to see>](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) |  |  |
| 1020 | 0.918 | [Download](1020/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](1020/previews/pattern_2.png) | [<NSFW, click to see>](1020/previews/pattern_3.png) |  | [<NSFW, click to see>](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) |  |  |
| 680 | 0.889 | [Download](680/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](680/previews/pattern_2.png) | [<NSFW, click to see>](680/previews/pattern_3.png) |  | [<NSFW, click to see>](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) |  |  |
| 340 | 0.910 | [Download](340/fujimoto_rina_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](340/previews/pattern_2.png) | [<NSFW, click to see>](340/previews/pattern_3.png) |  | [<NSFW, click to see>](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) |  |  |  | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) |  |  |
|
marianbasti/Llama-2-13b-fp16-alpaca-spanish | marianbasti | 2023-09-18T16:20:22Z | 25 | 1 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"es",
"dataset:bertin-project/alpaca-spanish",
"license:llama2",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-09-18T12:33:37Z | ---
license: llama2
datasets:
- bertin-project/alpaca-spanish
language:
- es
library_name: transformers
---
## Llama 2-13b-alpaca-spanish LoRA
This is a LoRA for Llama 2 13B trained on a translated [alpaca dataset](https://huggingface.co/datasets/bertin-project/alpaca-spanish) in an attempt to improve the Spanish performance of the Llama-2 foundation model, with a conversational focus.
The base model used was [The Bloke's Llama-2-13B-fp16](https://huggingface.co/TheBloke/Llama-2-13B-fp16), trained in 4-bit precision with an added padding token.
| Training parameters | Value |
| ----------- | ----------- |
| LoRA scale | 2 |
| Epochs | 0.75 |
| Learning rate | 2e-5 |
| Warmup steps | 100 |
| Loss | 1.07 | |
Anastasiaps/Pyramids-Training | Anastasiaps | 2023-09-18T16:19:45Z | 1 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"Pyramids",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Pyramids",
"region:us"
]
| reinforcement-learning | 2023-09-18T16:19:41Z | ---
library_name: ml-agents
tags:
- Pyramids
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Pyramids
---
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: Anastasiaps/Pyramids-Training
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
CyberHarem/priestess_goblinslayer | CyberHarem | 2023-09-18T16:15:51Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/priestess_goblinslayer",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T15:55:15Z | ---
license: mit
datasets:
- CyberHarem/priestess_goblinslayer
pipeline_tag: text-to-image
tags:
- art
---
# Lora of priestess_goblinslayer
This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 9900, you need to download `9900/priestess_goblinslayer.pt` as the embedding and `9900/priestess_goblinslayer.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
**The best step we recommend is 9900**, with the score of 0.904. The trigger words are:
1. `priestess_goblinslayer`
2. `blonde_hair, long_hair, blue_eyes, hat, bangs, open_mouth`
For the following groups, it is not recommended to use this model and we express regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who face application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | pattern_12 | pattern_13 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------|:-------------------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| **9900** | **0.904** | [**Download**](9900/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](9900/previews/pattern_13.png) | [<NSFW, click to see>](9900/previews/bikini.png) | [<NSFW, click to see>](9900/previews/bondage.png) |  |  |  | [<NSFW, click to see>](9900/previews/nude.png) | [<NSFW, click to see>](9900/previews/nude2.png) |  |  |
| 9240 | 0.902 | [Download](9240/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](9240/previews/pattern_13.png) | [<NSFW, click to see>](9240/previews/bikini.png) | [<NSFW, click to see>](9240/previews/bondage.png) |  |  |  | [<NSFW, click to see>](9240/previews/nude.png) | [<NSFW, click to see>](9240/previews/nude2.png) |  |  |
| 8580 | 0.893 | [Download](8580/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](8580/previews/pattern_13.png) | [<NSFW, click to see>](8580/previews/bikini.png) | [<NSFW, click to see>](8580/previews/bondage.png) |  |  |  | [<NSFW, click to see>](8580/previews/nude.png) | [<NSFW, click to see>](8580/previews/nude2.png) |  |  |
| 7920 | 0.848 | [Download](7920/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](7920/previews/pattern_13.png) | [<NSFW, click to see>](7920/previews/bikini.png) | [<NSFW, click to see>](7920/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7920/previews/nude.png) | [<NSFW, click to see>](7920/previews/nude2.png) |  |  |
| 7260 | 0.889 | [Download](7260/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](7260/previews/pattern_13.png) | [<NSFW, click to see>](7260/previews/bikini.png) | [<NSFW, click to see>](7260/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7260/previews/nude.png) | [<NSFW, click to see>](7260/previews/nude2.png) |  |  |
| 6600 | 0.870 | [Download](6600/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](6600/previews/pattern_13.png) | [<NSFW, click to see>](6600/previews/bikini.png) | [<NSFW, click to see>](6600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6600/previews/nude.png) | [<NSFW, click to see>](6600/previews/nude2.png) |  |  |
| 5940 | 0.844 | [Download](5940/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5940/previews/pattern_13.png) | [<NSFW, click to see>](5940/previews/bikini.png) | [<NSFW, click to see>](5940/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5940/previews/nude.png) | [<NSFW, click to see>](5940/previews/nude2.png) |  |  |
| 5280 | 0.874 | [Download](5280/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5280/previews/pattern_13.png) | [<NSFW, click to see>](5280/previews/bikini.png) | [<NSFW, click to see>](5280/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5280/previews/nude.png) | [<NSFW, click to see>](5280/previews/nude2.png) |  |  |
| 4620 | 0.845 | [Download](4620/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4620/previews/pattern_13.png) | [<NSFW, click to see>](4620/previews/bikini.png) | [<NSFW, click to see>](4620/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4620/previews/nude.png) | [<NSFW, click to see>](4620/previews/nude2.png) |  |  |
| 3960 | 0.850 | [Download](3960/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3960/previews/pattern_13.png) | [<NSFW, click to see>](3960/previews/bikini.png) | [<NSFW, click to see>](3960/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3960/previews/nude.png) | [<NSFW, click to see>](3960/previews/nude2.png) |  |  |
| 3300 | 0.787 | [Download](3300/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3300/previews/pattern_13.png) | [<NSFW, click to see>](3300/previews/bikini.png) | [<NSFW, click to see>](3300/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3300/previews/nude.png) | [<NSFW, click to see>](3300/previews/nude2.png) |  |  |
| 2640 | 0.761 | [Download](2640/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2640/previews/pattern_13.png) | [<NSFW, click to see>](2640/previews/bikini.png) | [<NSFW, click to see>](2640/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2640/previews/nude.png) | [<NSFW, click to see>](2640/previews/nude2.png) |  |  |
| 1980 | 0.730 | [Download](1980/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1980/previews/pattern_13.png) | [<NSFW, click to see>](1980/previews/bikini.png) | [<NSFW, click to see>](1980/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1980/previews/nude.png) | [<NSFW, click to see>](1980/previews/nude2.png) |  |  |
| 1320 | 0.657 | [Download](1320/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1320/previews/pattern_13.png) | [<NSFW, click to see>](1320/previews/bikini.png) | [<NSFW, click to see>](1320/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1320/previews/nude.png) | [<NSFW, click to see>](1320/previews/nude2.png) |  |  |
| 660 | 0.562 | [Download](660/priestess_goblinslayer.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](660/previews/pattern_13.png) | [<NSFW, click to see>](660/previews/bikini.png) | [<NSFW, click to see>](660/previews/bondage.png) |  |  |  | [<NSFW, click to see>](660/previews/nude.png) | [<NSFW, click to see>](660/previews/nude2.png) |  |  |
|
ianleekq/flan-t5-large-qlora-financial-phrasebank | ianleekq | 2023-09-18T16:08:15Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T16:08:12Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
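The same settings can be reproduced as a `transformers` `BitsAndBytesConfig` when reloading the base model (a minimal sketch):
```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the 4-bit NF4 double-quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```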
### Framework versions
- PEFT 0.6.0.dev0
|
ri-xx/vit-base-patch16-224-in21k | ri-xx | 2023-09-18T16:06:36Z | 6 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T13:42:19Z | ---
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: vit-base-patch16-224-in21k
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.5375
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-in21k
This model was trained from scratch on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6306
- Accuracy: 0.5375
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.2472 | 0.5312 |
| No log | 2.0 | 80 | 1.2878 | 0.5188 |
| No log | 3.0 | 120 | 1.3116 | 0.525 |
| No log | 4.0 | 160 | 1.2578 | 0.55 |
| No log | 5.0 | 200 | 1.2186 | 0.5563 |
| No log | 6.0 | 240 | 1.2680 | 0.5563 |
| No log | 7.0 | 280 | 1.3674 | 0.5 |
| No log | 8.0 | 320 | 1.3814 | 0.525 |
| No log | 9.0 | 360 | 1.4394 | 0.5 |
| No log | 10.0 | 400 | 1.3710 | 0.5437 |
| No log | 11.0 | 440 | 1.3721 | 0.5437 |
| No log | 12.0 | 480 | 1.4309 | 0.5563 |
| 0.4861 | 13.0 | 520 | 1.3424 | 0.575 |
| 0.4861 | 14.0 | 560 | 1.4617 | 0.525 |
| 0.4861 | 15.0 | 600 | 1.3964 | 0.5813 |
| 0.4861 | 16.0 | 640 | 1.4751 | 0.5687 |
| 0.4861 | 17.0 | 680 | 1.5296 | 0.55 |
| 0.4861 | 18.0 | 720 | 1.5887 | 0.5188 |
| 0.4861 | 19.0 | 760 | 1.5784 | 0.5312 |
| 0.4861 | 20.0 | 800 | 1.7036 | 0.5375 |
| 0.4861 | 21.0 | 840 | 1.6988 | 0.5188 |
| 0.4861 | 22.0 | 880 | 1.6070 | 0.5687 |
| 0.4861 | 23.0 | 920 | 1.7111 | 0.55 |
| 0.4861 | 24.0 | 960 | 1.6730 | 0.55 |
| 0.2042 | 25.0 | 1000 | 1.6559 | 0.55 |
| 0.2042 | 26.0 | 1040 | 1.7221 | 0.5563 |
| 0.2042 | 27.0 | 1080 | 1.6637 | 0.5813 |
| 0.2042 | 28.0 | 1120 | 1.6806 | 0.5625 |
| 0.2042 | 29.0 | 1160 | 1.5743 | 0.5938 |
| 0.2042 | 30.0 | 1200 | 1.7899 | 0.4938 |
| 0.2042 | 31.0 | 1240 | 1.7422 | 0.5312 |
| 0.2042 | 32.0 | 1280 | 1.7712 | 0.55 |
| 0.2042 | 33.0 | 1320 | 1.7480 | 0.5188 |
| 0.2042 | 34.0 | 1360 | 1.7964 | 0.5375 |
| 0.2042 | 35.0 | 1400 | 1.9687 | 0.5188 |
| 0.2042 | 36.0 | 1440 | 1.7412 | 0.5813 |
| 0.2042 | 37.0 | 1480 | 1.9312 | 0.4875 |
| 0.1342 | 38.0 | 1520 | 1.7944 | 0.525 |
| 0.1342 | 39.0 | 1560 | 1.8180 | 0.55 |
| 0.1342 | 40.0 | 1600 | 1.7720 | 0.5563 |
| 0.1342 | 41.0 | 1640 | 1.9014 | 0.5312 |
| 0.1342 | 42.0 | 1680 | 1.7519 | 0.55 |
| 0.1342 | 43.0 | 1720 | 1.9793 | 0.5 |
| 0.1342 | 44.0 | 1760 | 1.8642 | 0.55 |
| 0.1342 | 45.0 | 1800 | 1.7573 | 0.5875 |
| 0.1342 | 46.0 | 1840 | 1.8508 | 0.5125 |
| 0.1342 | 47.0 | 1880 | 1.9741 | 0.5625 |
| 0.1342 | 48.0 | 1920 | 1.9012 | 0.525 |
| 0.1342 | 49.0 | 1960 | 1.8771 | 0.5625 |
| 0.0926 | 50.0 | 2000 | 1.8728 | 0.5125 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Geotrend/bert-base-en-es-zh-cased | Geotrend | 2023-09-18T15:56:56Z | 117 | 0 | transformers | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"multilingual",
"dataset:wikipedia",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| fill-mask | 2022-03-02T23:29:04Z | ---
language: multilingual
datasets: wikipedia
license: apache-2.0
---
# bert-base-en-es-zh-cased
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages.
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
For more information, please see our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-es-zh-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-es-zh-cased")
```
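For instance, representations can then be extracted as usual (a minimal sketch):
```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-es-zh-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-es-zh-cased")

inputs = tokenizer("Hugging Face is based in Paris.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Hidden states match those of bert-base-multilingual-cased for the kept languages.
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```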
To generate other smaller versions of multilingual transformers, please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact [email protected] for any questions, feedback, or requests. |
stabilityai/StableBeluga2 | stabilityai | 2023-09-18T15:55:32Z | 1,829 | 883 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:conceptofmind/cot_submix_original",
"dataset:conceptofmind/flan2021_submix_original",
"dataset:conceptofmind/t0_submix_original",
"dataset:conceptofmind/niv2_submix_original",
"arxiv:2307.09288",
"arxiv:2306.02707",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-07-20T15:09:28Z | ---
datasets:
- conceptofmind/cot_submix_original
- conceptofmind/flan2021_submix_original
- conceptofmind/t0_submix_original
- conceptofmind/niv2_submix_original
language:
- en
pipeline_tag: text-generation
---
# Stable Beluga 2
Use [Stable Chat (Research Preview)](https://chat.stability.ai/chat) to test Stability AI's best language models for free
## Model Description
`Stable Beluga 2` is a Llama 2 70B model fine-tuned on an Orca-style dataset
## Usage
Start chatting with `Stable Beluga 2` using the following code snippet:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga2", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga2", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
system_prompt = "### System:\nYou are Stable Beluga, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
message = "Write me a poem please"
prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
Stable Beluga 2 should be used with this prompt format:
```
### System:
This is a system prompt, please behave and help the user.
### User:
Your prompt here
### Assistant:
The output of Stable Beluga 2
```
## Other Beluga Models
[StableBeluga 1 - Delta](https://huggingface.co/stabilityai/StableBeluga1-Delta)
[StableBeluga 13B](https://huggingface.co/stabilityai/StableBeluga-13B)
[StableBeluga 7B](https://huggingface.co/stabilityai/StableBeluga-7B)
## Model Details
* **Developed by**: [Stability AI](https://stability.ai/)
* **Model type**: Stable Beluga 2 is an auto-regressive language model fine-tuned on Llama2 70B.
* **Language(s)**: English
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
* **License**: Fine-tuned checkpoints (`Stable Beluga 2`) is licensed under the [STABLE BELUGA NON-COMMERCIAL COMMUNITY LICENSE AGREEMENT](https://huggingface.co/stabilityai/StableBeluga2/blob/main/LICENSE.txt)
* **Contact**: For questions and comments about the model, please email `[email protected]`
### Training Dataset
`Stable Beluga 2` is trained on our internal Orca-style dataset.
### Training Procedure
Models are trained via supervised fine-tuning on the aforementioned datasets in mixed precision (BF16) and optimized with AdamW. We outline the following hyperparameters:
| Dataset | Batch Size | Learning Rate |Learning Rate Decay| Warm-up | Weight Decay | Betas |
|-------------------|------------|---------------|-------------------|---------|--------------|-------------|
| Orca pt1 packed | 256 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
| Orca pt2 unpacked | 512 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
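A hedged PyTorch sketch of the optimizer and schedule described above for Orca pt1 (the total step count is an illustrative assumption):
```python
import torch

model = torch.nn.Linear(8, 8)  # stand-in for the actual model

optimizer = torch.optim.AdamW(
    model.parameters(), lr=3e-5, betas=(0.9, 0.95), weight_decay=1e-6
)

# Linear warm-up for 100 steps, then cosine decay from 3e-5 down to 3e-6.
total_steps, warmup_steps = 1_000, 100  # total_steps is illustrative
scheduler = torch.optim.lr_scheduler.SequentialLR(
    optimizer,
    schedulers=[
        torch.optim.lr_scheduler.LinearLR(
            optimizer, start_factor=0.01, total_iters=warmup_steps
        ),
        torch.optim.lr_scheduler.CosineAnnealingLR(
            optimizer, T_max=total_steps - warmup_steps, eta_min=3e-6
        ),
    ],
    milestones=[warmup_steps],
)
```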
## Ethical Considerations and Limitations
Beluga is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Beluga's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Beluga, developers should perform safety testing and tuning tailored to their specific applications of the model.
## How to cite
```bibtex
@misc{StableBelugaModels,
url={[https://huggingface.co/stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2)},
title={Stable Beluga models},
author={Mahan, Dakota and Carlow, Ryan and Castricato, Louis and Cooper, Nathan and Laforte, Christian}
}
```
## Citations
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
am-infoweb/MRR-sqv2-long | am-infoweb | 2023-09-18T15:47:18Z | 46 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"longformer",
"question-answering",
"generated_from_trainer",
"base_model:mrm8488/longformer-base-4096-finetuned-squadv2",
"base_model:finetune:mrm8488/longformer-base-4096-finetuned-squadv2",
"endpoints_compatible",
"region:us"
]
| question-answering | 2023-08-03T18:51:36Z | ---
base_model: mrm8488/longformer-base-4096-finetuned-squadv2
tags:
- generated_from_trainer
model-index:
- name: MRR-sqv2-long
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MRR-sqv2-long
This model is a fine-tuned version of [mrm8488/longformer-base-4096-finetuned-squadv2](https://huggingface.co/mrm8488/longformer-base-4096-finetuned-squadv2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2581
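A minimal usage sketch (the question and context are placeholders):
```python
from transformers import pipeline

qa = pipeline("question-answering", model="am-infoweb/MRR-sqv2-long")

result = qa(
    question="What architecture does the model use?",
    context="MRR-sqv2-long is a Longformer model fine-tuned for extractive question answering.",
)
print(result["answer"], result["score"])
```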
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 1.215 | 1.0 | 11811 | 1.0751 |
| 0.8617 | 2.0 | 23622 | 0.9498 |
| 0.8639 | 3.0 | 35433 | 1.0341 |
| 0.7073 | 4.0 | 47244 | 0.9740 |
| 0.6156 | 5.0 | 59055 | 1.0663 |
| 0.5489 | 6.0 | 70866 | 1.0665 |
| 0.4932 | 7.0 | 82677 | 1.2496 |
| 0.527 | 8.0 | 94488 | 1.1351 |
| 0.3722 | 9.0 | 106299 | 1.1568 |
| 0.3599 | 10.0 | 118110 | 1.2581 |
### Framework versions
- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
|
MattStammers/ppo-MountainCar-v0-fullcoded | MattStammers | 2023-09-18T15:43:42Z | 0 | 0 | null | [
"tensorboard",
"MountainCar-v0",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T15:43:36Z | ---
tags:
- MountainCar-v0
- ppo
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
- deep-rl-course
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: MountainCar-v0
type: MountainCar-v0
metrics:
- type: mean_reward
value: -124.90 +/- 35.38
name: mean_reward
verified: false
---
# PPO Agent Playing MountainCar-v0
This is a trained model of a PPO agent playing MountainCar-v0.
# Hyperparameters
|
fetiska/mooner | fetiska | 2023-09-18T15:40:48Z | 0 | 0 | null | [
"tensorboard",
"LunarLander-v2",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T15:40:43Z | ---
tags:
- LunarLander-v2
- ppo
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
- deep-rl-course
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: -126.84 +/- 113.52
name: mean_reward
verified: false
---
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
```python
{'exp_name': 'ppo',
 'seed': 1,
 'torch_deterministic': True,
 'cuda': True,
 'track': False,
 'wandb_project_name': 'cleanRL',
 'wandb_entity': None,
 'capture_video': False,
 'env_id': 'LunarLander-v2',
 'total_timesteps': 50000,
 'learning_rate': 0.00025,
 'num_envs': 4,
 'num_steps': 128,
 'anneal_lr': True,
 'gae': True,
 'gamma': 0.99,
 'gae_lambda': 0.95,
 'num_minibatches': 4,
 'update_epochs': 4,
 'norm_adv': True,
 'clip_coef': 0.2,
 'clip_vloss': True,
 'ent_coef': 0.01,
 'vf_coef': 0.5,
 'max_grad_norm': 0.5,
 'target_kl': None,
 'repo_id': 'fetiska/mooner',
 'batch_size': 512,
 'minibatch_size': 128}
```
|
TristanPermentier/segformer-b0-scene-parse-150 | TristanPermentier | 2023-09-18T15:32:41Z | 195 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"segformer",
"generated_from_trainer",
"image-segmentation",
"dataset:scene_parse_150",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
]
| image-segmentation | 2023-09-17T12:40:46Z | ---
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150
results: []
widget:
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
example_title: Football Match
- src: >-
https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg
example_title: Airport
pipeline_tag: image-segmentation
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-scene-parse-150
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6339
- Mean Iou: 0.1200
- Mean Accuracy: 0.1669
- Overall Accuracy: 0.6123
- Per Category Iou: [0.5349505300503856, nan, nan, 0.7662725216601061, nan, 0.5326853699336921, nan, nan, 0.34424006183640854, nan, 0.4690602972950636, nan, 0.0, nan, 0.43864150176543804, nan, nan, 0.0, 0.03116323751411952, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11546494517491812, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.21709821831740012, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2656179069036561, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.004383351344800073, nan, nan, nan, nan, nan, nan, 0.27573695030755746, 0.0, 0.0, nan, nan, 0.1730424387328153, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2699536864879483, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan]
- Per Category Accuracy: [0.7970510497742438, nan, nan, 0.9333683660650987, nan, 0.5709001366216121, nan, nan, 0.5022173025701339, nan, 0.5997946735914005, nan, 0.0, nan, 0.6798811830944017, nan, nan, 0.0, 0.03494731857464189, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18674548490489992, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.4055014699503591, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.399427626224171, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.005731113077237973, nan, nan, nan, nan, nan, nan, 0.3817023254759232, 0.0, 0.0, nan, nan, 0.18656355727404544, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.49311473385204396, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan]
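A minimal usage sketch with the `image-segmentation` pipeline (the image path is a placeholder):
```python
from transformers import pipeline

segmenter = pipeline(
    "image-segmentation", model="TristanPermentier/segformer-b0-scene-parse-150"
)

# Each entry carries a predicted label and a PIL mask for one segment.
for segment in segmenter("scene.jpg"):
    print(segment["label"])
```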
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 3.6823 | 1.0 | 20 | 3.5003 | 0.0586 | 0.1033 | 0.4973 | [0.49515664733932946, nan, nan, 0.5775935923663784, nan, 0.04596206382865156, nan, nan, 0.26871056425122297, nan, 0.11369973679410002, nan, 0.0, 0.0, 0.3107990029966113, nan, nan, 0.07862248213125406, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.034315093061257025, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.01068727200680629, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.11023414367115379, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.000786938713592233, nan, nan, nan, nan, nan, nan, 0.05070892559181233, 0.0, 0.0, nan, nan, 0.015040183696900114, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11313313426108514, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7426263023684041, nan, nan, 0.9727086727777018, nan, 0.05114940356296204, nan, nan, 0.47199142135667094, nan, 0.13174104716468385, nan, 0.0, nan, 0.4854913009051258, nan, nan, 0.08009119658748254, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.08262480087518953, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.011048725239770591, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.5815895187334261, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0010960423626975848, nan, nan, nan, nan, nan, nan, 0.06443529825277504, 0.0, 0.0, nan, nan, 0.021105203802158853, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12399654192910954, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 3.4276 | 2.0 | 40 | 3.1882 | 0.0710 | 0.1169 | 0.4890 | [0.43876531073668484, nan, nan, 0.5793460668074677, nan, 0.30489642184557436, nan, nan, 0.32595816708983194, nan, 0.14653798821167174, nan, 0.0, nan, 0.2911432340397004, nan, nan, 0.060054857802800635, 0.002022419470540954, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.024744866592893152, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.07467207433837308, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.12499059372413274, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.004522511609125783, nan, nan, nan, nan, nan, nan, 0.09628943874020461, 0.0, 0.0, nan, nan, 0.029982363315696647, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12235712170565591, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.5900893611665498, nan, nan, 0.9783242745736993, nan, 0.3846789392114827, nan, nan, 0.559548929399149, nan, 0.17078325985868711, nan, 0.0, nan, 0.6459378704825733, nan, nan, 0.06118996837537692, 0.002770214277258198, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.06180066406924745, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.10514964576606102, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.47971748890697613, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0059159876926327465, nan, nan, nan, nan, nan, nan, 0.11675000283456353, 0.0, 0.0, nan, nan, 0.0356049621395199, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1275781153513647, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.9493 | 3.0 | 60 | 3.0877 | 0.0780 | 0.1289 | 0.5137 | [0.45940221778653784, nan, nan, 0.6494007938527302, nan, 0.4439837201805287, nan, nan, 0.34639082849373676, nan, 0.22427847777966967, nan, 0.000308090455357693, nan, 0.34777956821568423, nan, nan, 0.004698282190574071, 0.0031407729433002986, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.03480888089115643, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0913573016991002, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.14217375469266544, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.001518325086206108, nan, nan, nan, nan, nan, nan, 0.09802372235537879, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.04033963860518523, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.582026866520922, nan, nan, 0.952329573015261, nan, 0.5949736206887242, nan, nan, 0.6362931958905531, nan, 0.3181804456790869, nan, 0.000308090455357693, nan, 0.7024633946812373, nan, nan, 0.0047069206442597634, 0.0040369361903634425, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.13068345392779687, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.13909104053207383, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.5329640034657495, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0021920847253951697, nan, nan, nan, nan, nan, nan, 0.12659160742428882, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.041218969988884774, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 3.0162 | 4.0 | 80 | 2.8616 | 0.0814 | 0.1232 | 0.5420 | [0.4910465857166263, nan, nan, 0.6443584008757899, nan, 0.45141247019724007, nan, nan, 0.32257705190056235, nan, 0.26924304549583833, nan, 0.0, nan, 0.3331696608404026, nan, nan, 0.0, 0.001900597974698759, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.020747307143488526, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.13337046977599973, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.13325950535252862, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0017460871773707102, nan, nan, nan, nan, nan, nan, 0.1319458915051747, 0.0, 0.0, nan, nan, 0.009900990099009901, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.06578507371941024, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7132203547582256, nan, nan, 0.8931325481026355, nan, 0.5735731681633843, nan, nan, 0.581002455982566, nan, 0.33111902892686756, nan, 0.0, nan, 0.611482717739854, nan, nan, 0.0, 0.002995146205753522, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.03153369287756943, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.19039471781772616, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.37913198729225195, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0021788793957241144, nan, nan, nan, nan, nan, nan, 0.1723074480991417, 0.0, 0.0, nan, nan, 0.01031093926212341, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.06681486970482894, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 3.0263 | 5.0 | 100 | 2.7084 | 0.0843 | 0.1324 | 0.5396 | [0.49507208751747933, nan, nan, 0.6203053779437833, nan, 0.4485318235941677, nan, nan, 0.3407940967092522, nan, 0.2978191928406968, nan, 0.0, nan, 0.3557274457037466, nan, nan, 0.0, 0.0022259234469125043, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.026368862501178023, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.12652470809007507, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.14366688649397702, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006174898362074405, nan, nan, nan, nan, nan, nan, 0.19123126092560874, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.06350128408951938, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.6526707344030444, nan, nan, 0.9749474509023519, nan, 0.5390505607966174, nan, nan, 0.6160157736345083, nan, 0.414004468868893, nan, 0.0, nan, 0.639577053813209, nan, nan, 0.0, 0.003385817449982242, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.05370132238066906, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.23396308255819556, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.45186021477144434, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009045650824672838, nan, nan, nan, nan, nan, nan, 0.2455865845777067, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.06412868963813759, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 3.0192 | 6.0 | 120 | 2.5853 | 0.0834 | 0.1228 | 0.5584 | [0.513016303842118, nan, nan, 0.6309294285637532, nan, 0.35611384488558007, nan, nan, 0.2789754627300978, nan, 0.33300786268564675, nan, 0.0, nan, 0.39549797351503974, nan, nan, 0.0, 0.0006107421807796855, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.009196650667414078, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.13424914500097224, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.12323683312478671, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.004757876929583422, nan, nan, nan, nan, nan, nan, 0.24288385441579335, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.06377284193687319, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7580542978289542, nan, nan, 0.9493663014141616, nan, 0.37790185924193903, nan, nan, 0.49498772008717007, nan, 0.3913279787426777, nan, 0.0, nan, 0.6642459982413698, nan, nan, 0.0, 0.0008405351012193679, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.01007619522868165, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2828329076100053, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.22755795940872214, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.005942398351974857, nan, nan, nan, nan, nan, nan, 0.31665476149982424, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.06400518710633568, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.7891 | 7.0 | 140 | 2.4388 | 0.0948 | 0.1331 | 0.5748 | [0.5154656376508916, nan, nan, 0.6863149964423734, nan, 0.49124725952151366, nan, nan, 0.31201516142142743, nan, 0.35972157545158207, nan, 0.0, nan, 0.39671683424856585, nan, nan, 0.0, 2.370192459627722e-05, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.009904517600114009, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.163059324189356, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.14525411857851944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3573847356082313, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07118873794781752, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7711226677412429, nan, nan, 0.9177684357159737, nan, 0.5529665250049951, nan, nan, 0.4886298384586115, nan, 0.44317289691406486, nan, 0.0, nan, 0.594793229711226, nan, nan, 0.0, 3.551556765715639e-05, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.02134234113198856, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.31670200973540896, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.25047916611967336, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.49779470957062033, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07135358774854884, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.3603 | 8.0 | 160 | 2.5459 | 0.0923 | 0.1381 | 0.5408 | [0.4582064985086822, nan, nan, 0.6929509551025913, nan, 0.5612155620675559, nan, nan, 0.3518851291071376, nan, 0.3123848723909702, nan, 0.0, nan, 0.38656074482619035, nan, nan, 0.0, 0.00632781961916225, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.019509625712633233, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.16056645540831488, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.14847856488402372, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.003485354097144976, nan, nan, nan, nan, nan, nan, 0.2711320589476366, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.04329675572519084, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.6156433337669713, nan, nan, 0.8946001915399062, nan, 0.7932261600686888, nan, nan, 0.5757653325953855, nan, 0.3712482637840449, nan, 0.0, nan, 0.6684194639240202, nan, nan, 0.0, 0.012821119924233456, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.036255110070437406, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3827895320256398, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3233649276656077, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006206504945395962, nan, nan, nan, nan, nan, nan, 0.3861242445888182, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0448314190440904, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 3.0505 | 9.0 | 180 | 2.4342 | 0.0895 | 0.1374 | 0.5334 | [0.4674035619688911, nan, nan, 0.6469130452725848, nan, 0.5407835854397331, nan, nan, 0.30876943112205757, nan, 0.29598515286429283, nan, 0.0, nan, 0.36881671052279913, nan, nan, 0.0, 0.008017672665843675, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.03279170249121165, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.13950156171504707, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.14746426631847753, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.013165291208746446, nan, nan, nan, nan, nan, nan, 0.23331071375064993, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10953377620044287, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.6058283068338183, nan, nan, 0.9748230743398714, nan, 0.6179778921391272, nan, nan, 0.5429381853402054, nan, 0.4526692433117942, nan, 0.0, nan, 0.6640578860565124, nan, nan, 0.0, 0.01460873682964366, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.057827764236224405, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.38476553086895754, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.33507495996009135, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.021339812748425266, nan, nan, nan, nan, nan, nan, 0.30016893998662086, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11149191058416698, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.3854 | 10.0 | 200 | 2.3098 | 0.0939 | 0.1364 | 0.5669 | [0.5145572933835421, nan, nan, 0.7030505688292765, nan, 0.5830553297262949, nan, nan, 0.33253154963919734, nan, 0.3190317923926545, nan, 0.0, nan, 0.40605592944383456, nan, nan, 0.0, 0.008330498485004311, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.03189938344253368, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.15208814114494157, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.11716706900020117, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006876583423814694, nan, nan, nan, nan, nan, nan, 0.23581319835736797, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0643412965149726, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7016591414601333, nan, nan, 0.9361823857912215, nan, 0.6738685516489095, nan, nan, 0.5585526998512574, nan, 0.43041548402681323, nan, 0.0, nan, 0.6699024878930123, nan, nan, 0.0, 0.012465964247661892, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.06970807822966048, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39385030603884524, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2446766613280122, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.01103965560500218, nan, nan, nan, nan, nan, nan, 0.2799641711169314, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0656107200197604, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.0245 | 11.0 | 220 | 2.2548 | 0.0955 | 0.1378 | 0.5719 | [0.5220301378484757, nan, nan, 0.6900779538662102, nan, 0.597003770439182, nan, nan, 0.34494446136750745, nan, 0.33220376279018593, nan, 0.0, nan, 0.40090961340118253, nan, nan, 0.0, 0.018837390442916883, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.024534003530502946, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.16223235412939344, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.10163486902644009, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006591395652521519, nan, nan, nan, nan, nan, nan, 0.2809388993745605, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.05083676351290448, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.6982873412593906, nan, nan, 0.9678051268019054, nan, 0.6729127403703363, nan, nan, 0.5562281642395102, nan, 0.4558548221510961, nan, 0.0, nan, 0.6682925975667907, nan, nan, 0.0, 0.029489759677992188, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.06295222923823963, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.4064292255048436, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.17399637671646492, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009798354615922987, nan, nan, nan, nan, nan, nan, 0.34428608682835016, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.05224157095220452, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.1305 | 12.0 | 240 | 2.1615 | 0.0891 | 0.1287 | 0.5754 | [0.52909209668855, nan, nan, 0.6817600912725086, nan, 0.4928497692494535, nan, nan, 0.3053400490721222, nan, 0.30149677514104933, nan, 0.0, nan, 0.44020401115949986, nan, nan, 0.0, 0.02650747496108321, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.009345088921766961, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.14166860984376856, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.06272879674296075, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0049727527815997315, nan, nan, nan, nan, nan, nan, 0.22690293924815086, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07303471847454578, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7419261355764054, nan, nan, 0.9587598412955063, nan, 0.5259392060826318, nan, nan, 0.48901726106056936, nan, 0.3719276526360287, nan, 0.0, nan, 0.7751709414796117, nan, nan, 0.0, 0.04273706641411152, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.011880313993436078, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.38797050460263144, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.09627957045711134, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006061246319014354, nan, nan, nan, nan, nan, nan, 0.27860358062065604, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07521304186735828, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.5724 | 13.0 | 260 | 2.2083 | 0.0989 | 0.1513 | 0.5531 | [0.4800847622248986, nan, nan, 0.6970880808658071, nan, 0.6182665130888115, nan, nan, 0.3425863757467936, nan, 0.3713486637663145, nan, 0.0, nan, 0.3765884611771054, nan, nan, 0.0, 0.04581278147115591, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.07627658417610414, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.17206178086451746, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.13439901892281148, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.021518798527206638, nan, nan, nan, nan, nan, nan, 0.2281, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09575841263170534, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.6012010716147941, nan, nan, 0.9709518538326638, nan, 0.7217455165970958, nan, nan, 0.5542080321007299, nan, 0.6314541940938463, nan, 0.0, nan, 0.6838665365921947, nan, nan, 0.0, 0.08092814016810702, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18847283265838818, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.49998795122656514, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.24745976317378632, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.03619580862836241, nan, nan, nan, nan, nan, nan, 0.2844881345170471, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09849326911201679, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.9126 | 14.0 | 280 | 2.1922 | 0.0937 | 0.1380 | 0.5619 | [0.5009904092834208, nan, nan, 0.6698610538672513, nan, 0.4589843172431468, nan, nan, 0.31517044265103583, nan, 0.3104306789045372, nan, 0.0, nan, 0.4116696414534222, nan, nan, 0.0, 0.05665320608660416, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.04817434748097306, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.15597353569249542, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.1482680934947902, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.007848093824402574, nan, nan, nan, nan, nan, nan, 0.27710322432627993, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10443950498618286, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.6797244600413119, nan, nan, 0.9652398602007438, nan, 0.5028917341224627, nan, nan, 0.5103670137327476, nan, 0.37837429796485295, nan, 0.0, nan, 0.7789463092826802, nan, nan, 0.0, 0.09599857937729371, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.07920849087384603, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39425996433563065, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.24882505841888308, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.010577469066515246, nan, nan, nan, nan, nan, nan, 0.3560551946211322, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10735457576880326, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.184 | 15.0 | 300 | 2.0821 | 0.1054 | 0.1492 | 0.5959 | [0.5445153487483887, nan, nan, 0.7363123727181161, nan, 0.6148226937061888, nan, nan, 0.3475879173137814, nan, 0.35835575524286467, nan, 0.0, nan, 0.46660818479509436, nan, nan, 0.0, 0.05904185462228059, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.04270497506100011, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.16342478251978165, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.11624580295233743, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.01957395899106045, nan, nan, nan, nan, nan, nan, 0.33294472694199845, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09641039746939899, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7237905464791236, nan, nan, 0.9342327831743387, nan, 0.7085585609910197, nan, nan, 0.5992113182745857, nan, 0.42025484630714416, nan, 0.0, nan, 0.7143757081548819, nan, nan, 0.0, 0.08568722623416597, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.08364201677446596, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.5006867800857873, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.15362197075117495, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.03111175670500614, nan, nan, nan, nan, nan, nan, 0.45656881753347617, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10821909349141658, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.0809 | 16.0 | 320 | 2.0044 | 0.1016 | 0.1366 | 0.6038 | [0.5521032535171966, nan, nan, 0.6849303109873625, nan, 0.5608681131846335, nan, nan, 0.3286185445379954, nan, 0.3117324722344537, nan, 0.0, nan, 0.5007412271537872, nan, nan, 0.0, 0.00896535470186189, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.02470879973127575, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.13202960663906452, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.13535477440870933, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006964332796916055, nan, nan, nan, nan, nan, nan, 0.3258174090559218, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18590681790462252, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8265849734899084, nan, nan, 0.927053146105148, nan, 0.5985484628718619, nan, nan, 0.5087204676744267, nan, 0.3847756507035449, nan, 0.0, nan, 0.6605231268619826, nan, nan, 0.0, 0.011258434947318575, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.04800107479415772, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2624222854113451, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.18541759655525508, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.010115282528028312, nan, nan, nan, nan, nan, nan, 0.44154563080376885, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18824873409904902, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.9142 | 17.0 | 340 | 1.9659 | 0.0994 | 0.1374 | 0.5807 | [0.5146286979691347, nan, nan, 0.6905745271536221, nan, 0.5591977800201816, nan, nan, 0.34355018631476514, nan, 0.3433540309960119, nan, 0.0, nan, 0.44078672618377784, nan, nan, 0.0, 0.018210173845236457, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.02063863329589597, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.13463592284853793, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.14175516133652527, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.007641975925278458, nan, nan, nan, nan, nan, nan, 0.34504392589348754, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11831867388362652, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7304971289988567, nan, nan, 0.9420685066106143, nan, 0.5985052623620959, nan, nan, 0.5007022034660487, nan, 0.41074340237937074, nan, 0.0, nan, 0.7711024686443236, nan, nan, 0.0, 0.02819936071978217, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.03844308389152256, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3037857246132344, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.1818993357313519, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.012122492638028708, nan, nan, nan, nan, nan, nan, 0.4377473156683334, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1295850314931456, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.4201 | 18.0 | 360 | 1.9076 | 0.1058 | 0.1420 | 0.6090 | [0.544585393186956, nan, nan, 0.7251707754198404, nan, 0.5891181339724741, nan, nan, 0.35175747599947543, nan, 0.3768437454607878, nan, 0.0, nan, 0.46080424972867995, nan, nan, 0.0, 0.007712058173094944, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.03192860684184432, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.12968165867755327, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.17963202065848935, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006236042242325599, nan, nan, nan, nan, nan, nan, 0.3516538062361807, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.15892993199198577, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8058972658592538, nan, nan, 0.9410890411810798, nan, 0.6437469962145553, nan, nan, 0.5381576671624754, nan, 0.43085331239809166, nan, 0.0, nan, 0.7058406646047238, nan, nan, 0.0, 0.009743104060613236, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.04944053125539796, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.27250710877632656, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.23378055504502848, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.008702312253225401, nan, nan, nan, nan, nan, nan, 0.44179507239475263, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17389156477707793, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.3918 | 19.0 | 380 | 1.9192 | 0.1037 | 0.1492 | 0.6023 | [0.5442973237949046, nan, nan, 0.7038453617346788, nan, 0.5574274701550656, nan, nan, 0.33109252617100604, nan, 0.3536195549984331, nan, 0.0, 0.0, 0.48958630952380955, nan, nan, 0.0, 0.03185425185390533, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.04555193844332643, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.15959564598020237, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.1401469153743555, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.007048748992689108, nan, nan, nan, nan, nan, nan, 0.360631823315347, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2158580356918752, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7598110607315368, nan, nan, 0.9670246638723399, nan, 0.6107040063072744, nan, nan, 0.517921754470926, nan, 0.4259013225436319, nan, 0.0, nan, 0.7196428493308893, nan, nan, 0.0, 0.04353024742512134, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.07385371283803235, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.42115282664224785, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.18483997164386798, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009587069341186102, nan, nan, nan, nan, nan, nan, 0.5213556016644557, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.26553044337408915, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.3816 | 20.0 | 400 | 1.9099 | 0.1005 | 0.1444 | 0.5801 | [0.5240378547065215, nan, nan, 0.7084459171029648, nan, 0.3005648391621819, nan, nan, 0.3284451404977932, nan, 0.3888894787696032, nan, 0.0, nan, 0.44350835438817365, nan, nan, 0.0, 0.05071873461975799, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0850628575506369, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.16232786173696878, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.18652889399158057, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009414690469846346, nan, nan, nan, nan, nan, nan, 0.34993112721786757, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.179512063620434, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7640533099803383, nan, nan, 0.9546740712180197, nan, 0.3043043907918113, nan, nan, 0.5266733543187243, nan, 0.5529621353946494, nan, 0.0, nan, 0.6750033903940295, nan, nan, 0.0, 0.07075884929560791, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.13726656814386887, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39653718251482, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.25594034710006036, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.013997649451318553, nan, nan, nan, nan, nan, nan, 0.4839053482544758, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20560083981721625, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.5018 | 21.0 | 420 | 1.8815 | 0.1059 | 0.1428 | 0.5995 | [0.5311212203406633, nan, nan, 0.7177711289807998, nan, 0.42651481922426215, nan, nan, 0.36451006234358574, nan, 0.39474432552608446, nan, 0.0, nan, 0.506004935402564, nan, nan, 0.0, 0.009084462502635261, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0965560638446633, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.12773542545580144, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.15052972972972972, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0073552816083549116, nan, nan, nan, nan, nan, nan, 0.36122781305976226, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22668064473856636, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8159745909259652, nan, nan, 0.8884248952127461, nan, 0.4410826047747363, nan, nan, 0.510878965028192, nan, 0.4610634700163053, nan, 0.0, nan, 0.7544086059137222, nan, nan, 0.0, 0.011222919379661418, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18553634147745812, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2495300978360403, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.18279202877622286, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.011290556868752228, nan, nan, nan, nan, nan, nan, 0.5030329829812805, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2670433493886625, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.301 | 22.0 | 440 | 1.8606 | 0.1127 | 0.1557 | 0.6116 | [0.5454706673579958, nan, nan, 0.740255244679598, nan, 0.5893461091012621, nan, nan, 0.35433112960421975, nan, 0.4003983065708059, nan, 0.0, nan, 0.4706902103290056, nan, nan, 0.0, 0.04674946512433085, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0878654677869828, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.15571204204604946, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.18811653609669202, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009186660517126157, nan, nan, nan, nan, nan, nan, 0.3597557228452911, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22036818018203322, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7677466369254428, nan, nan, 0.9575160756707006, nan, 0.6483370503771945, nan, nan, 0.5502507869521602, nan, 0.4826227429192584, nan, 0.0, nan, 0.7162393311955623, nan, nan, 0.0, 0.062341659760861846, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.14310116500009595, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3655356884669141, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2578570115787539, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.012888401758949912, nan, nan, nan, nan, nan, nan, 0.5310044559338754, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2646350500185254, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.2921 | 23.0 | 460 | 1.8113 | 0.1057 | 0.1532 | 0.5913 | [0.5357758165202247, nan, nan, 0.7178963332622186, nan, 0.5757493868768706, nan, nan, 0.34088809244481116, nan, 0.3888587007438775, nan, 0.0, nan, 0.45940927020073186, nan, nan, 0.0, 0.030332067631465825, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11186966640806827, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.1507442372917972, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.16361404207313027, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.012473057193938078, nan, nan, nan, nan, nan, nan, 0.215706225981412, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2064475531245462, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7239290991222531, nan, nan, 0.9626030770761558, nan, 0.6097751953473051, nan, nan, 0.548791033934069, nan, 0.6266380820097832, nan, 0.0, nan, 0.7326969600195987, nan, nan, 0.0, 0.04009707588492956, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.19373164693011918, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.42036965636898166, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2589334943681571, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.01971555719888547, nan, nan, nan, nan, nan, nan, 0.26920416794222024, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2633691490675559, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.3405 | 24.0 | 480 | 1.7822 | 0.1085 | 0.1534 | 0.6043 | [0.5414566325029534, nan, nan, 0.7173153879237804, nan, 0.5310126078829097, nan, nan, 0.36009260637040796, nan, 0.42939053817714906, nan, 0.0, nan, 0.4722188917670218, nan, nan, 0.0, 0.015322674081798169, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.09576263304056658, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.16420095890288652, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.15819497832665413, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.00890486944807639, nan, nan, nan, nan, nan, nan, 0.3031977891827872, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21783933256127247, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7591235857847104, nan, nan, 0.9542138779368415, nan, 0.576780806013511, nan, nan, 0.5692275761873465, nan, 0.5856633854701371, nan, 0.0, nan, 0.7408907768158294, nan, nan, 0.0, 0.018787735290635727, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.13832216954877838, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.44317798448117984, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.20314017906372253, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.01187159137427866, nan, nan, nan, nan, nan, nan, 0.39185006292731045, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2845807089045325, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.1149 | 25.0 | 500 | 1.7731 | 0.1119 | 0.1561 | 0.5990 | [0.5230321816508672, nan, nan, 0.7190935580028947, nan, 0.5684881046052994, nan, nan, 0.34178000606892306, nan, 0.4634889254146504, nan, 0.0, nan, 0.44083260578388256, nan, nan, 0.0, 0.02177948393993877, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.12256739536552475, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.1628205567158558, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.22133767103626448, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.008860431486295587, nan, nan, nan, nan, nan, nan, 0.31208343518292186, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.23465598343498908, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7538289710403823, nan, nan, 0.9331196129401376, nan, 0.6125346279086094, nan, nan, 0.5220796291812239, nan, 0.608792801497675, nan, 0.0, nan, 0.7112696697537481, nan, nan, 0.0, 0.026530129039895822, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.19014260215342688, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.34378765241698395, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3036731693228661, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.013997649451318553, nan, nan, nan, nan, nan, nan, 0.43427780990283116, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3219093491416574, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.8525 | 26.0 | 520 | 1.7599 | 0.1083 | 0.1499 | 0.6013 | [0.5330932644142036, nan, nan, 0.7374629122911739, nan, 0.4196436832420096, nan, nan, 0.36779568174556077, nan, 0.45034522075125105, nan, 0.0, nan, 0.46970559257831385, nan, nan, 0.0, 0.02651167431650369, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.10133904279732192, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.18001012682076706, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.23047192839707079, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0010134792743488395, nan, nan, nan, nan, nan, nan, 0.2836160821897313, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20646839213054238, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8119925245031936, nan, nan, 0.9142890013805799, nan, 0.4408612021621855, nan, nan, 0.55742502334913, nan, 0.536686998007126, nan, 0.0, nan, 0.6715692493448884, nan, nan, 0.0, 0.03145495442168817, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.164424313379268, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.36837919899754207, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.35697219523722007, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0014525862638160763, nan, nan, nan, nan, nan, nan, 0.38687256936176967, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30393973076448066, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.9834 | 27.0 | 540 | 1.7425 | 0.1113 | 0.1546 | 0.6117 | [0.5468403256324604, nan, nan, 0.729015596924764, nan, 0.5298170927901393, nan, nan, 0.34659036444853436, nan, 0.4062670332731743, nan, 0.0, nan, 0.48170900134657657, nan, nan, 0.0, 0.014311797485688203, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.12100421456475167, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.1711757925072046, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2214201588085559, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.008457097567346056, nan, nan, nan, nan, nan, nan, 0.2971870505293438, 0.0, 0.0, nan, nan, 0.0020491803278688526, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24086284086284085, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7922990114110264, nan, nan, 0.9550098879367172, nan, 0.5779742200957971, nan, nan, 0.5002871078210938, nan, 0.5063862552086479, nan, 0.0, nan, 0.7323994802854056, nan, nan, 0.0, 0.017816976441340122, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1658637698405082, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.35783652224203577, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.30603618032399504, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.012399804561120869, nan, nan, nan, nan, nan, nan, 0.42872206537637336, 0.0, 0.0, nan, nan, 0.0020944095376188173, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.36544399160182783, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.255 | 28.0 | 560 | 1.7499 | 0.1118 | 0.1587 | 0.6071 | [0.5398339756892974, nan, nan, 0.7287949661640746, nan, 0.5465474100530702, nan, nan, 0.35925200783154193, nan, 0.4258153855837521, nan, 0.0, nan, 0.4534640119071021, nan, nan, 0.0, 0.036095939206839234, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11804747854405662, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.17905852894391455, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.23849347211359123, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.010316461996506883, nan, nan, nan, nan, nan, nan, 0.2574594613283546, 0.0, 0.0, nan, nan, 0.0006444337038827131, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24188155544359455, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7703357733405677, nan, nan, 0.9543786768821283, nan, 0.5850483035699822, nan, nan, 0.5598187415683697, nan, 0.6128087444894015, nan, 0.0, nan, 0.7064006264573226, nan, nan, 0.0, 0.044986385699064754, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1850565226570447, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3567039375391585, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.41631002704334813, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.01505407582500297, nan, nan, nan, nan, nan, nan, 0.33772123768382145, 0.0, 0.0, nan, nan, 0.0006444337038827131, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.32610843522292204, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.0002 | 29.0 | 580 | 1.7234 | 0.1123 | 0.1537 | 0.6115 | [0.5343806524346499, nan, nan, 0.7577406122804516, nan, 0.5857621508154368, nan, nan, 0.35288204865605544, nan, 0.44074177356385946, nan, 0.0, nan, 0.47477546800043285, nan, nan, 0.0, 0.0070126227208976155, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.09253997097454784, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.18628156919220204, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.22707481198047236, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006903668258650132, nan, nan, nan, nan, nan, nan, 0.2662933416415916, 0.0, 0.0, nan, nan, 0.0022658610271903325, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21873956020201624, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8161723721798985, nan, nan, 0.8926101665402172, nan, 0.6268609969597642, nan, nan, 0.5148984745235048, nan, 0.5249562171628721, nan, 0.0, nan, 0.6910016755108558, nan, nan, 0.0, 0.007991002722860187, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.14563460837187878, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3366427297701094, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.4066741932942999, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.00998322923131776, nan, nan, nan, nan, nan, nan, 0.3682551560710682, 0.0, 0.0, nan, nan, 0.002416626389560174, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3436766703717426, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.9107 | 30.0 | 600 | 1.7110 | 0.1109 | 0.1570 | 0.6061 | [0.5409093005757184, nan, nan, 0.7473723058754059, nan, 0.4555303180966537, nan, nan, 0.35164493806392416, nan, 0.4320491085773804, nan, 0.0, nan, 0.4522083206870581, nan, nan, 0.0, 0.049725487125786595, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.10475239334604404, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.20116130177364758, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.22513841831253614, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.007068769172369218, nan, nan, nan, nan, nan, nan, 0.27485750958849686, 0.0, 0.0, nan, nan, 0.005144694533762058, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.25437975947734176, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7937723231047638, nan, nan, 0.9445218343055435, nan, 0.4700215462542458, nan, nan, 0.5090525441903906, nan, 0.588682891478954, nan, 0.0, nan, 0.703942043948256, nan, nan, 0.0, 0.060151533088670536, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.19929754524691476, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3823557761819847, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.35765484285976845, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009098472143357059, nan, nan, nan, nan, nan, nan, 0.40789369252922436, 0.0, 0.0, nan, nan, 0.005155469631061705, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3774854884525133, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.7442 | 31.0 | 620 | 1.7138 | 0.1100 | 0.1550 | 0.6051 | [0.538370914627848, nan, nan, 0.7400080228868461, nan, 0.5135815722017948, nan, nan, 0.3400483665949275, nan, 0.4300165958900766, nan, 0.0, nan, 0.44986347393882015, nan, nan, 0.0, 0.014369577640396904, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.12100166291695197, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.178307206329525, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.24067469406452457, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.009674143184204637, nan, nan, nan, nan, nan, nan, 0.2674093087779633, 0.0, 0.0, nan, nan, 0.00589171974522293, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2204903647326626, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7883031954893183, nan, nan, 0.9579234089128245, nan, 0.5528747239217423, nan, nan, 0.5126638763014978, nan, 0.5320218612235038, nan, 0.0, nan, 0.6818279254725772, nan, nan, 0.0, 0.016372676689949094, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.16619004663838935, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.36386090895946793, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3862472759734292, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.014842790550266088, nan, nan, nan, nan, nan, nan, 0.36498973888000724, 0.0, 0.0, nan, nan, 0.005961011760915096, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.39178090650858344, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.0674 | 32.0 | 640 | 1.7114 | 0.1093 | 0.1518 | 0.6065 | [0.5322687824138417, nan, nan, 0.746921322690992, nan, 0.4844228791969084, nan, nan, 0.3539657487456126, nan, 0.4188551233102879, nan, 0.0, nan, 0.47029302899444786, nan, nan, 0.0, 0.01652062987515423, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1012081708348106, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.1694733773920492, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.24721294631389673, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0029394296298533304, nan, nan, nan, nan, nan, nan, 0.272253692089824, 0.0, 0.0, nan, nan, 0.000778816199376947, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22813668906453632, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8149825116950065, nan, nan, 0.936937973408291, nan, 0.5164782944438744, nan, nan, 0.5197758483517244, nan, 0.4902620931215653, nan, 0.0, nan, 0.667006435186603, nan, nan, 0.0, 0.01870486563276903, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.14244861140433374, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3432695551592848, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.37902696458109064, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0038559562639481296, nan, nan, nan, nan, nan, nan, 0.36620293207251947, 0.0, 0.0, nan, nan, 0.0008055421298533913, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.41617265653945906, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.0873 | 33.0 | 660 | 1.6653 | 0.1172 | 0.1696 | 0.6123 | [0.5445668589613619, nan, nan, 0.7524007642536803, nan, 0.6080308328342638, nan, nan, 0.37522726462514705, nan, 0.47058505928341576, nan, 0.0, nan, 0.45211562654289134, nan, nan, 0.0020577643859778054, 0.03698548214138415, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0874434081492265, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2220118553720846, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.25600388486667847, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.008400085393133278, nan, nan, nan, nan, nan, nan, 0.2727300819665799, 0.0, 0.0, nan, nan, 0.008752387014640357, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2377502658088699, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7471054660604195, nan, nan, 0.9379485329784455, nan, 0.6798356220603403, nan, nan, 0.5825452281296482, nan, 0.657935261791171, nan, 0.0, nan, 0.7090648199591403, nan, nan, 0.0020592777818636463, 0.04246478039540665, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.20981517379037676, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.42103233890789915, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.45677002651823456, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.01195082335230499, nan, nan, nan, nan, nan, nan, 0.40027438575008223, 0.0, 0.0, nan, nan, 0.008860963428387304, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.40734222551562305, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.0019 | 34.0 | 680 | 1.7047 | 0.1082 | 0.1516 | 0.5976 | [0.5252909077146455, nan, nan, 0.7573782099101035, nan, 0.41441033016508255, nan, nan, 0.31164553601714706, nan, 0.4709121646818984, nan, 0.0, nan, 0.4513100730570853, nan, nan, 0.0, 0.026121796557054672, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.10214383984573885, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.18893875968482826, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.24868639840372464, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.005485878410722834, nan, nan, nan, nan, nan, nan, 0.2657905577984675, 0.0, 0.0, nan, nan, 0.010289694475225581, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22569361498822646, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8073684622454624, nan, nan, 0.9312974962997973, nan, 0.42945086752023676, nan, nan, 0.4385762219378048, nan, 0.571939730660064, nan, 0.0, nan, 0.6894005345885812, nan, nan, 0.0, 0.02940689002012549, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18096846630712243, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.34641428502578436, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3926799170320582, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0075666539015146516, nan, nan, nan, nan, nan, nan, 0.36576074016122995, 0.0, 0.0, nan, nan, 0.010472047688094087, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.40839199703593926, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.9344 | 35.0 | 700 | 1.6748 | 0.1150 | 0.1627 | 0.6151 | [0.5505830958687438, nan, nan, 0.7562419986190323, nan, 0.5978540319640836, nan, nan, 0.32995695112686757, nan, 0.43084969246068283, nan, 0.0, nan, 0.44413403674818086, nan, nan, 0.0, 0.02120224146714213, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11930022573363432, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.19550674411022423, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2520041329675419, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.011823153016352289, nan, nan, nan, nan, nan, nan, 0.2816350837545237, 0.0, 0.0, nan, nan, 0.02907822041291073, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2345825635611859, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7886077997734505, nan, nan, 0.9569563811395381, nan, 0.6730801423456797, nan, nan, 0.5048116503511018, nan, 0.5298327193671115, nan, 0.0, nan, 0.6544510405228644, nan, nan, 0.0, 0.024635965431514147, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.16229391781663244, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39015133259434187, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.37141281802189724, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.017787579066911405, nan, nan, nan, nan, nan, nan, 0.4182455185550529, 0.0, 0.0, nan, nan, 0.03222168519413565, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4939792515746573, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.8097 | 36.0 | 720 | 1.6860 | 0.1136 | 0.1585 | 0.6079 | [0.5370921761611701, nan, nan, 0.7534232274902132, nan, 0.45051163633070723, nan, nan, 0.3470158614454695, nan, 0.4514947692232571, nan, 0.0, nan, 0.4472259023583447, nan, nan, 0.0, 0.02574732931747514, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.12124562847150792, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.19888801475595164, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.25768922637500435, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0058514509006449555, nan, nan, nan, nan, nan, nan, 0.27173447017804997, 0.0, 0.0, nan, nan, 0.09133667121832802, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24511241015135193, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8068195399417656, nan, nan, 0.9425286998917924, nan, 0.48357570619333307, nan, nan, 0.5071984503095922, nan, 0.55839724621052, nan, 0.0, nan, 0.6896455178990931, nan, nan, 0.0, 0.029702853083935124, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18098765905993897, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.36247530001445855, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.38847900858560663, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.007751528516909425, nan, nan, nan, nan, nan, nan, 0.3808632946698867, 0.0, 0.0, nan, nan, 0.09698727243434832, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.42852290971964924, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.7117 | 37.0 | 740 | 1.7075 | 0.1119 | 0.1543 | 0.6082 | [0.5316191267085415, nan, nan, 0.7572815533980582, nan, 0.4538160469667319, nan, nan, 0.30431906153724625, nan, 0.4471088739555919, nan, 0.0, nan, 0.458622625619912, nan, nan, 0.0, 0.02901237764772904, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.13063645830046386, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2073772833894836, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.24302394240109712, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0018472226405244507, nan, nan, nan, nan, nan, nan, 0.3016653313939469, 0.0, 0.0, nan, nan, 0.04694976076555024, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.228434381603097, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8287383565541218, nan, nan, 0.942003208915312, nan, 0.488387162968523, nan, nan, 0.42642775606212596, nan, 0.5283682589528353, nan, 0.0, nan, 0.6432518034708885, nan, nan, 0.0, 0.03315970166923168, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.15891599332092202, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.37601812135524604, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3349961929267204, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.002429780659474164, nan, nan, nan, nan, nan, nan, 0.43973151014206835, 0.0, 0.0, nan, nan, 0.050588045754792976, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4572989996294924, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.8903 | 38.0 | 760 | 1.6940 | 0.1141 | 0.1612 | 0.6022 | [0.5272388914176183, nan, nan, 0.7550644430962281, nan, 0.41146235695229855, nan, nan, 0.3325270522067029, nan, 0.45052910362931103, nan, 0.0, nan, 0.45438585937030007, nan, nan, 0.0, 0.01657318470885357, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.12229517410520399, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.20662691942053804, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2590023063760137, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.004992486554887694, nan, nan, nan, nan, nan, nan, 0.2898555496110951, 0.0, 0.0, nan, nan, 0.145993083746805, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24471294419777534, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7889726903221508, nan, nan, 0.9519750998121914, nan, 0.43142189077831117, nan, nan, 0.46793040229686256, nan, 0.5804245425448397, nan, 0.0, nan, 0.726935477520594, nan, nan, 0.0, 0.019344145850597846, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1837514154655202, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39544074413224733, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.365610313230236, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006668691483882895, nan, nan, nan, nan, nan, nan, 0.3992879576402826, 0.0, 0.0, nan, nan, 0.1564362816175286, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4883907620106212, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.9552 | 39.0 | 780 | 1.6641 | 0.1162 | 0.1625 | 0.6098 | [0.5310598686250333, nan, nan, 0.7601430490832172, nan, 0.5429095366706864, nan, nan, 0.3445406417268712, nan, 0.4568477729996307, nan, 0.0, nan, 0.46530970057255566, nan, nan, 0.0, 0.014359563469270534, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1130773272197545, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.19405725628857665, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26618730518560496, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.008357285392600187, nan, nan, nan, nan, nan, nan, 0.26465747015031377, 0.0, 0.0, nan, nan, 0.08871089617651504, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2501878158721486, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.778973631634001, nan, nan, 0.9464372333677442, nan, 0.58833694237592, nan, nan, 0.4889619149745754, nan, 0.5790053747206957, nan, 0.0, nan, 0.7430518795907028, nan, nan, 0.0, 0.016277968509530012, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18175536917260043, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.37692177936286086, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.394622837188542, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.012056465989673432, nan, nan, nan, nan, nan, nan, 0.3591391997460231, 0.0, 0.0, nan, nan, 0.09457064604478814, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4524206496233173, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.7998 | 40.0 | 800 | 1.6491 | 0.1195 | 0.1656 | 0.6204 | [0.5467238209227602, nan, nan, 0.7600965442912115, nan, 0.582928686810868, nan, nan, 0.36155701416782127, nan, 0.47812036649811807, nan, 0.0, nan, 0.4596212555550058, nan, nan, 0.0, 0.031148168454591535, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1197557612763443, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.20817175700676152, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26782302664655605, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.006440314647170281, nan, nan, nan, nan, nan, nan, 0.269295792964432, 0.0, 0.0, nan, nan, 0.08404112650871703, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2449240499941124, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8010172513905502, nan, nan, 0.9341768137212224, nan, 0.6489202572590357, nan, nan, 0.5361859628489398, nan, 0.5695996135032309, nan, 0.0, nan, 0.6908966826634936, nan, nan, 0.0, 0.03564579140523263, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18670709939926683, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3809701672369753, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.4195919867671384, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.008887186868620176, nan, nan, nan, nan, nan, nan, 0.37565903602163336, 0.0, 0.0, nan, nan, 0.09086515224746254, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.44954921575892304, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.9118 | 41.0 | 820 | 1.6721 | 0.1160 | 0.1601 | 0.6090 | [0.5327482521002724, nan, nan, 0.7521480816985193, nan, 0.42348318514250716, nan, nan, 0.34787290807664323, nan, 0.46125081085206243, nan, 0.0, nan, 0.46282223820683577, nan, nan, 0.0, 0.05095456865747831, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.125455933497328, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.206567621664828, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.25872563801865195, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.001967741277706409, nan, nan, nan, nan, nan, nan, 0.2845250835300749, 0.0, 0.0, nan, nan, 0.14583025830258303, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.23884811742182516, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8171168569914615, nan, nan, 0.9333372719244786, nan, 0.4405749987849857, nan, nan, 0.4961361513715452, nan, 0.5474968295186907, nan, 0.0, nan, 0.6980711938999156, nan, nan, 0.0, 0.05908606605895584, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.17031648849394468, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.37676514530820765, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3649276656076877, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.002548628626513661, nan, nan, nan, nan, nan, nan, 0.39490005328979444, 0.0, 0.0, nan, nan, 0.15917512485903013, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4622391009015685, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.9207 | 42.0 | 840 | 1.6454 | 0.1206 | 0.1676 | 0.6162 | [0.5379120232752981, nan, nan, 0.7568729914559748, nan, 0.5613482715887681, nan, nan, 0.364717070851165, nan, 0.4756806178795279, nan, 0.0, nan, 0.4550448462618563, nan, nan, 0.0, 0.018526924741625934, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.10926136363636364, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.22021430139209439, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26463688376567485, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.00584039493119801, nan, nan, nan, nan, nan, nan, 0.2793764799974909, 0.0, 0.0, nan, nan, 0.16214200477326968, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.25246816795602955, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.793135615538626, nan, nan, 0.9439590303603189, nan, 0.6017345004671055, nan, nan, 0.5147116814832751, nan, 0.5969563379431125, nan, 0.0, nan, 0.6946895492744557, nan, nan, 0.0, 0.020776607079436488, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1845383183309982, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39892283965492314, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.4116890277522514, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.007936403132304199, nan, nan, nan, nan, nan, nan, 0.40398199485243264, 0.0, 0.0, nan, nan, 0.17512485903012728, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4524206496233173, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.7207 | 43.0 | 860 | 1.6300 | 0.1199 | 0.1681 | 0.6145 | [0.5406610748788059, nan, nan, 0.7575992381390396, nan, 0.5760206511115009, nan, nan, 0.36015884868644354, nan, 0.46500148540871594, nan, 0.0, nan, 0.45176054716285924, nan, nan, 0.0, 0.016866204578869986, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.10816986644407346, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.212614696784591, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26724166862435816, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0068089889710963485, nan, nan, nan, nan, nan, nan, 0.2704699429190119, 0.0, 0.0, nan, nan, 0.14349508782375708, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2593716106290672, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7870149732043419, nan, nan, 0.94244785512618, nan, 0.6205699227250882, nan, nan, 0.5138676536718669, nan, 0.6144090826740746, nan, 0.0, nan, 0.6865569783058529, nan, nan, 0.0, 0.019012667219131054, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.19897126844903365, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.39421176924189116, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.41814792448867066, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.00975873862690982, nan, nan, nan, nan, nan, nan, 0.38520584600383234, 0.0, 0.0, nan, nan, 0.15530852263573386, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4725515623070273, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 2.398 | 44.0 | 880 | 1.6485 | 0.1167 | 0.1611 | 0.6100 | [0.5311814850668972, nan, nan, 0.7555542115237289, nan, 0.49150431547087375, nan, nan, 0.3397112053113447, nan, 0.4711908444169502, nan, 0.0, nan, 0.4483007218044239, nan, nan, 0.0, 0.03373223867894174, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11456464920822017, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2039065290799167, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26244981334084666, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0015820003515556336, nan, nan, nan, nan, nan, nan, 0.2779898701320591, 0.0, 0.0, nan, nan, 0.12441350083245044, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2633676630662313, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8130575818438924, nan, nan, 0.9322520864168357, nan, 0.5200099361172462, nan, nan, 0.4856757411186828, nan, 0.5749743341989251, nan, 0.0, nan, 0.6746446648322083, nan, nan, 0.0, 0.038475198295252755, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.17162159568546917, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3645115427249506, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3913146217869614, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0020204154396714514, nan, nan, nan, nan, nan, nan, 0.37400365091783166, 0.0, 0.0, nan, nan, 0.13243112614789754, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.48558107941212797, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.8206 | 45.0 | 900 | 1.6576 | 0.1195 | 0.1659 | 0.6121 | [0.53605344484101, nan, nan, 0.75744198726446, nan, 0.5238418818047587, nan, nan, 0.34436174115825924, nan, 0.47195355574245396, nan, 0.0, nan, 0.4475797404376424, nan, nan, 0.0, 0.031364722184879165, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11128700706471659, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2101074907833958, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2651975284871611, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.003919791994957454, nan, nan, nan, nan, nan, nan, 0.2761731199707084, 0.0, 0.0, nan, nan, 0.18027666220437305, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2635211174367801, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8053821884760162, nan, nan, 0.9416705016106764, nan, 0.5536685332886928, nan, nan, 0.47402539001694977, nan, 0.6099855063711577, nan, 0.0, nan, 0.6632310673835345, nan, nan, 0.0, 0.035906238901385105, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1807957315317736, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.40720034700467495, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.397799774201171, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.005255721209079985, nan, nan, nan, nan, nan, nan, 0.3762939782532286, 0.0, 0.0, nan, nan, 0.19526341227646207, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.49163270347042115, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.7685 | 46.0 | 920 | 1.6461 | 0.1193 | 0.1663 | 0.6126 | [0.5362561433981403, nan, nan, 0.755636163506279, nan, 0.5130186358137029, nan, nan, 0.3490519469206423, nan, 0.4808415970463636, nan, 0.0, nan, 0.4444587000590182, nan, nan, 0.0, 0.020957206264131983, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.116132422490804, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.22112223959036903, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2681155616217824, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0022679869590749853, nan, nan, nan, nan, nan, nan, 0.2762335846152573, 0.0, 0.0, nan, nan, 0.1578549848942598, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2707515143446715, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8006661103407866, nan, nan, 0.9392233927438713, nan, 0.5449744306982822, nan, nan, 0.4989864748002352, nan, 0.6134730358113413, nan, 0.0, nan, 0.6819679159357269, nan, nan, 0.0, 0.023700722149875696, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18238873001554612, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.41417658682346137, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.4022894951033161, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0029579938463163732, nan, nan, nan, nan, nan, nan, 0.37849359955554046, 0.0, 0.0, nan, nan, 0.1683583051393588, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5023465481042362, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.6151 | 47.0 | 940 | 1.6422 | 0.1200 | 0.1665 | 0.6132 | [0.5358835584937998, nan, nan, 0.762341181032713, nan, 0.5249996207965295, nan, nan, 0.3548468717459638, nan, 0.46979561119539376, nan, 0.0, nan, 0.4530588733189128, nan, nan, 0.0, 0.03589317260729502, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11116243437507041, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.21348365421889132, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26835906741072957, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.003541333542608815, nan, nan, nan, nan, nan, nan, 0.2757877170662393, 0.0, 0.0, nan, nan, 0.16579458812976527, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2644075185700632, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.8022811476389466, nan, nan, 0.9352619992288653, nan, 0.5607210165079948, nan, nan, 0.5045210833996333, nan, 0.5937556615737666, nan, 0.0, nan, 0.674386557415776, nan, nan, 0.0, 0.040890256895939385, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1893748920407654, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.3981276206082221, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.41161026071888046, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0047671240112509405, nan, nan, nan, nan, nan, nan, 0.3785049378096761, 0.0, 0.0, nan, nan, 0.17866924440148219, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.48687785599604794, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.8132 | 48.0 | 960 | 1.6362 | 0.1198 | 0.1671 | 0.6154 | [0.537670140668603, nan, nan, 0.7574703715767583, nan, 0.5503381293520359, nan, nan, 0.34713318137716853, nan, 0.475125748502994, nan, 0.0, nan, 0.44345926718928064, nan, nan, 0.0, 0.033251751202633135, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11910377358490566, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.2160315408647757, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26936717663421417, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0015017990300881265, nan, nan, nan, nan, nan, nan, 0.27478974836286274, 0.0, 0.0, nan, nan, 0.13573573573573575, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2718448199272799, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7988205045219987, nan, nan, 0.940019402743747, nan, 0.592386990166484, nan, nan, 0.5173267840464907, nan, 0.5989643094389758, nan, 0.0, nan, 0.6741197005953969, nan, nan, 0.0, 0.03731502308511898, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.1822159952401973, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.40074220444358766, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.4068054716832515, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0019015674726319542, nan, nan, nan, nan, nan, nan, 0.3749107112486819, 0.0, 0.0, nan, nan, 0.14564201707749316, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5101580832407064, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 0.532 | 49.0 | 980 | 1.6296 | 0.1197 | 0.1664 | 0.6143 | [0.5366108905436184, nan, nan, 0.7597294770713172, nan, 0.5423197964192679, nan, nan, 0.3459790436344114, nan, 0.4667733595962083, nan, 0.0, nan, 0.4468324476435224, nan, nan, 0.0, 0.036688775081857104, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11978368520371332, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.21549309194854693, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.26910547396528706, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.004184269572072527, nan, nan, nan, nan, nan, nan, 0.2764878305832757, 0.0, 0.0, nan, nan, 0.14281490242460082, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2663475049834973, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7993852916321608, nan, nan, 0.9375101055957016, nan, 0.5834660848998018, nan, nan, 0.5171745823100072, nan, 0.5724379491515188, nan, 0.0, nan, 0.67241356682576, nan, nan, 0.0, 0.04138747484313958, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.181525056138802, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.40874259000433755, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.3969070811563, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0054670064838168685, nan, nan, nan, nan, nan, nan, 0.3807385738743948, 0.0, 0.0, nan, nan, 0.1556307394876752, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5033036927257009, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
| 1.0631 | 50.0 | 1000 | 1.6339 | 0.1200 | 0.1669 | 0.6123 | [0.5349505300503856, nan, nan, 0.7662725216601061, nan, 0.5326853699336921, nan, nan, 0.34424006183640854, nan, 0.4690602972950636, nan, 0.0, nan, 0.43864150176543804, nan, nan, 0.0, 0.03116323751411952, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.11546494517491812, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.21709821831740012, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.2656179069036561, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.004383351344800073, nan, nan, nan, nan, nan, nan, 0.27573695030755746, 0.0, 0.0, nan, nan, 0.1730424387328153, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2699536864879483, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] | [0.7970510497742438, nan, nan, 0.9333683660650987, nan, 0.5709001366216121, nan, nan, 0.5022173025701339, nan, 0.5997946735914005, nan, 0.0, nan, 0.6798811830944017, nan, nan, 0.0, 0.03494731857464189, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.18674548490489992, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.4055014699503591, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.399427626224171, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.005731113077237973, nan, nan, nan, nan, nan, nan, 0.3817023254759232, 0.0, 0.0, nan, nan, 0.18656355727404544, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.49311473385204396, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan] |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.13.3 |
leeywin/ludwigmaxcreate | leeywin | 2023-09-18T15:30:45Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T15:21:10Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
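The same quantization settings can be reproduced at load time. The sketch below is a hedged, minimal example: the base model name is a placeholder, since this card does not state which base model the adapter was trained on, and the `AutoModelForCausalLM` task class is likewise an assumption.
```python
# Hedged sketch: base model name and task class are assumptions, not stated in this card.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # matches load_in_4bit: True above
    bnb_4bit_quant_type="nf4",             # matches bnb_4bit_quant_type: nf4
    bnb_4bit_use_double_quant=True,        # matches bnb_4bit_use_double_quant: True
    bnb_4bit_compute_dtype=torch.float16,  # matches bnb_4bit_compute_dtype: float16
)
base = AutoModelForCausalLM.from_pretrained(
    "base-model-name",  # placeholder: the base model is not specified in this card
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "leeywin/ludwigmaxcreate")
```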
### Framework versions
- PEFT 0.5.0
|
ShinraC002/image_classification | ShinraC002 | 2023-09-18T15:30:42Z | 219 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T15:06:11Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: image_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.56875
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2152
- Accuracy: 0.5687
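As a quick way to try the checkpoint, here is a minimal inference sketch using the `transformers` pipeline; the image path is a placeholder, and the labels come from the (unspecified) imagefolder dataset.
```python
from transformers import pipeline
# Image-classification pipeline over this fine-tuned ViT checkpoint.
classifier = pipeline("image-classification", model="ShinraC002/image_classification")
print(classifier("example.jpg"))  # placeholder path to any input image
```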
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.3484 | 0.5437 |
| No log | 2.0 | 80 | 1.3268 | 0.4875 |
| No log | 3.0 | 120 | 1.2463 | 0.5437 |
| No log | 4.0 | 160 | 1.2361 | 0.5563 |
| No log | 5.0 | 200 | 1.2089 | 0.5813 |
| No log | 6.0 | 240 | 1.2544 | 0.525 |
| No log | 7.0 | 280 | 1.1947 | 0.5563 |
| No log | 8.0 | 320 | 1.2502 | 0.5188 |
| No log | 9.0 | 360 | 1.3415 | 0.4938 |
| No log | 10.0 | 400 | 1.1336 | 0.6 |
| No log | 11.0 | 440 | 1.2716 | 0.5437 |
| No log | 12.0 | 480 | 1.4631 | 0.5 |
| 0.6882 | 13.0 | 520 | 1.3970 | 0.5563 |
| 0.6882 | 14.0 | 560 | 1.2654 | 0.5188 |
| 0.6882 | 15.0 | 600 | 1.2498 | 0.575 |
| 0.6882 | 16.0 | 640 | 1.2655 | 0.5938 |
| 0.6882 | 17.0 | 680 | 1.3577 | 0.55 |
| 0.6882 | 18.0 | 720 | 1.2711 | 0.5813 |
| 0.6882 | 19.0 | 760 | 1.3127 | 0.5687 |
| 0.6882 | 20.0 | 800 | 1.2478 | 0.575 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
learn3r/longt5_xl_summ_screen_memsum_25 | learn3r | 2023-09-18T15:22:17Z | 3 | 0 | transformers | [
"transformers",
"pytorch",
"longt5",
"text2text-generation",
"generated_from_trainer",
"dataset:learn3r/summ_screen_memsum_oracle",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| text2text-generation | 2023-09-18T13:39:39Z | ---
base_model: longt5_xl_summ_screen_memsum_20/checkpoint-140
tags:
- generated_from_trainer
datasets:
- learn3r/summ_screen_memsum_oracle
metrics:
- rouge
model-index:
- name: longt5_xl_summ_screen_25
results:
- task:
name: Summarization
type: summarization
dataset:
name: learn3r/summ_screen_memsum_oracle
type: learn3r/summ_screen_memsum_oracle
metrics:
- name: Rouge1
type: rouge
value: 39.5624
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# longt5_xl_summ_screen_25
This model is a fine-tuned version of [longt5_xl_summ_screen_memsum_20/checkpoint-140](https://huggingface.co/longt5_xl_summ_screen_memsum_20/checkpoint-140) on the learn3r/summ_screen_memsum_oracle dataset.
It achieves the following results on the evaluation set:
- Loss: 4.0742
- Rouge1: 39.5624
- Rouge2: 10.2833
- Rougel: 21.2004
- Rougelsum: 34.0767
- Gen Len: 152.0325
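A minimal summarization sketch follows; the placeholder transcript and the generation settings are illustrative assumptions, not values taken from training.
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
repo = "learn3r/longt5_xl_summ_screen_memsum_25"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSeq2SeqLM.from_pretrained(repo)  # XL-sized checkpoint, ~3B parameters
transcript = "NARRATOR: ..."  # placeholder screenplay transcript
# LongT5 is built for long inputs, so the truncation limit is set generously.
inputs = tokenizer(transcript, return_tensors="pt", truncation=True, max_length=16384)
summary_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```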
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 5.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| 0.1846 | 0.97 | 14 | 4.6151 | 37.719 | 9.6532 | 20.2955 | 32.3806 | 100.9527 |
| 0.1441 | 1.95 | 28 | 4.1640 | 36.8632 | 9.6545 | 20.9349 | 31.8954 | 105.5799 |
| 0.1379 | 2.99 | 43 | 4.0742 | 39.5624 | 10.2833 | 21.2004 | 34.0767 | 152.0325 |
| 0.089 | 3.97 | 57 | 4.5216 | 40.2528 | 10.9254 | 21.6978 | 34.6793 | 176.0976 |
| 0.1028 | 4.87 | 70 | 4.1434 | 32.7739 | 9.1305 | 19.6246 | 27.9012 | 59.6775 |
### Framework versions
- Transformers 4.34.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
|
michaelsinanta/image_classification | michaelsinanta | 2023-09-18T15:19:05Z | 220 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T10:38:49Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: image_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.325
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7674
- Accuracy: 0.325
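For completeness, here is a hedged sketch of direct inference without the pipeline helper; the image path is a placeholder.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification
repo = "michaelsinanta/image_classification"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# id2label is populated from the imagefolder class names at training time.
print(model.config.id2label[logits.argmax(-1).item()])
```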
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.9714 | 0.2938 |
| No log | 2.0 | 80 | 1.7702 | 0.3375 |
| No log | 3.0 | 120 | 1.7064 | 0.3125 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
rtilman/distilbert-base-uncased-finetuned-cola | rtilman | 2023-09-18T15:16:41Z | 68 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| text-classification | 2023-09-18T13:52:49Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: rtilman/distilbert-base-uncased-finetuned-cola
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# rtilman/distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1896
- Validation Loss: 0.5484
- Train Matthews Correlation: 0.5463
- Epoch: 2
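Since CoLA is a grammatical-acceptability task, the checkpoint can be tried with the text-classification pipeline; `framework="tf"` matches the TensorFlow setup above, and the example sentence is arbitrary.
```python
from transformers import pipeline
clf = pipeline(
    "text-classification",
    model="rtilman/distilbert-base-uncased-finetuned-cola",
    framework="tf",  # this card was trained with Keras/TensorFlow
)
print(clf("The book was read by the student."))  # arbitrary test sentence
```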
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 1602, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Matthews Correlation | Epoch |
|:----------:|:---------------:|:--------------------------:|:-----:|
| 0.5105 | 0.4893 | 0.4413 | 0 |
| 0.3216 | 0.4749 | 0.5288 | 1 |
| 0.1896 | 0.5484 | 0.5463 | 2 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3
|
kamara3k/dqn-SpaceInvadersNoFrameskip-v4 | kamara3k | 2023-09-18T15:16:12Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T15:15:33Z | ---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 688.50 +/- 181.62
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga kamara3k -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga kamara3k -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga kamara3k
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
|
subaraman109/llama2-7b-code-ft1 | subaraman109 | 2023-09-18T15:13:31Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T15:13:23Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
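A hedged loading sketch is shown below. The repo name suggests a Llama-2-7B base, but that is an inference from the name alone, not something this card states.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
base_id = "meta-llama/Llama-2-7b-hf"  # assumption: inferred from the repo name only
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)
model = PeftModel.from_pretrained(base, "subaraman109/llama2-7b-code-ft1")
```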
### Framework versions
- PEFT 0.5.0
|
rafalosa/diabetic-retinopathy-224-procnorm-vit | rafalosa | 2023-09-18T15:07:53Z | 225 | 1 | transformers | [
"transformers",
"pytorch",
"safetensors",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:martinezomg/diabetic-retinopathy",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-04-30T17:59:36Z | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- martinezomg/diabetic-retinopathy
metrics:
- accuracy
pipeline_tag: image-classification
base_model: google/vit-base-patch16-224-in21k
model-index:
- name: diabetic-retinopathy-224-procnorm-vit
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# diabetic-retinopathy-224-procnorm-vit
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the [diabetic retinopathy](https://huggingface.co/datasets/martinezomg/diabetic-retinopathy) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7578
- Accuracy: 0.7431
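A hedged inference sketch: the file name below is a placeholder for a fundus photograph, and the pipeline returns the top classes with their scores (up to five by default, which covers the usual retinopathy grades).
```python
from transformers import pipeline
grader = pipeline(
    "image-classification",
    model="rafalosa/diabetic-retinopathy-224-procnorm-vit",
)
for pred in grader("fundus_example.png"):  # placeholder fundus image path
    print(pred["label"], round(pred["score"], 3))
```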
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.8619 | 1.0 | 50 | 0.8907 | 0.7143 |
| 0.7831 | 2.0 | 100 | 0.7858 | 0.7393 |
| 0.6906 | 3.0 | 150 | 0.7412 | 0.7531 |
| 0.5934 | 4.0 | 200 | 0.7528 | 0.7393 |
| 0.5276 | 5.0 | 250 | 0.7578 | 0.7431 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.12.0
- Tokenizers 0.13.3 |
CyberHarem/shiina_noriko_idolmastercinderellagirls | CyberHarem | 2023-09-18T15:04:13Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/shiina_noriko_idolmastercinderellagirls",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T14:34:34Z | ---
license: mit
datasets:
- CyberHarem/shiina_noriko_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---
# Lora of shiina_noriko_idolmastercinderellagirls
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion); the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them together: the pt file serves as an embedding, while the safetensors file is loaded as a LoRA (see the sketch after the trigger-word list below).
For example, if you want to use the model from step 4080, you need to download `4080/shiina_noriko_idolmastercinderellagirls.pt` as the embedding and `4080/shiina_noriko_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
**The best step we recommend is 4080**, with a score of 0.779. The trigger words are:
1. `shiina_noriko_idolmastercinderellagirls`
2. `brown_hair, ponytail, long_hair, smile, blush, open_mouth, hair_ornament, food, doughnut, purple_eyes, brown_eyes`
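The card itself targets WebUI-style workflows; as an alternative, the sketch below loads both files with `diffusers` on top of the preview base model. Whether this embedding/LoRA format loads cleanly through these `diffusers` helpers is an assumption, as are the local paths.
```python
# Hedged diffusers sketch: these loader calls and paths are assumptions, not the authors' method.
from diffusers import StableDiffusionPipeline
pipe = StableDiffusionPipeline.from_pretrained("Meina/MeinaMix_V11")  # preview base model
pipe.load_textual_inversion(
    "4080/shiina_noriko_idolmastercinderellagirls.pt",  # the pt embedding
    token="shiina_noriko_idolmastercinderellagirls",
)
pipe.load_lora_weights("4080/shiina_noriko_idolmastercinderellagirls.safetensors")
image = pipe("shiina_noriko_idolmastercinderellagirls, brown_hair, ponytail, smile").images[0]
image.save("preview.png")
```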
For the following groups, it is not recommended to use this model, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals facing application scenarios that demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 5100 | 0.749 | [Download](5100/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5100/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) |  |  |
| 4760 | 0.717 | [Download](4760/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) |  |  |
| 4420 | 0.742 | [Download](4420/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4420/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) |  |  |
| **4080** | **0.779** | [**Download**](4080/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) |  |  |
| 3740 | 0.652 | [Download](3740/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3740/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) |  |  |
| 3400 | 0.648 | [Download](3400/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) |  |  |
| 3060 | 0.697 | [Download](3060/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3060/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) |  |  |
| 2720 | 0.665 | [Download](2720/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) |  |  |
| 2380 | 0.680 | [Download](2380/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2380/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) |  |  |
| 2040 | 0.602 | [Download](2040/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) |  |  |
| 1700 | 0.686 | [Download](1700/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1700/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) |  |  |
| 1360 | 0.598 | [Download](1360/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) |  |  |
| 1020 | 0.628 | [Download](1020/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1020/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) |  |  |
| 680 | 0.521 | [Download](680/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) |  |  |
| 340 | 0.306 | [Download](340/shiina_noriko_idolmastercinderellagirls.zip) |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](340/previews/bondage.png) |  |  |  | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) |  |  |
|
CyberHarem/tsubaki_ando_shirobako | CyberHarem | 2023-09-18T14:55:43Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/tsubaki_ando_shirobako",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T14:42:09Z | ---
license: mit
datasets:
- CyberHarem/tsubaki_ando_shirobako
pipeline_tag: text-to-image
tags:
- art
---
# Lora of tsubaki_ando_shirobako
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them together: the pt file serves as a textual-inversion embedding, while the safetensors file is loaded as the LoRA weights.
For example, if you want to use the model from step 4080, download `4080/tsubaki_ando_shirobako.pt` as the embedding and `4080/tsubaki_ando_shirobako.safetensors` for the LoRA. Using both files together lets you generate images of the character.
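As a concrete illustration, here is a minimal `diffusers` sketch of that two-file setup. It is an assumption-laden example, not the card author's script: it assumes the step-4080 files were downloaded locally, uses the preview base model `Meina/MeinaMix_V11`, and takes the trigger token to be the model name; adjust paths and token to your setup.
```python
import torch
from diffusers import StableDiffusionPipeline

# Minimal sketch: load the base model, then attach the embedding (.pt)
# and the LoRA weights (.safetensors) from the same training step.
pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11", torch_dtype=torch.float16
).to("cuda")

# The .pt file acts as a textual-inversion embedding; the token must
# match the trigger word the card lists.
pipe.load_textual_inversion(
    "4080/tsubaki_ando_shirobako.pt", token="tsubaki_ando_shirobako"
)

# The .safetensors file carries the LoRA weights.
pipe.load_lora_weights("4080", weight_name="tsubaki_ando_shirobako.safetensors")

image = pipe("tsubaki_ando_shirobako, 1girl, masterpiece").images[0]
image.save("preview.png")
```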
**The best step we recommend is 4080**, with a score of 0.986. The trigger words are:
1. `tsubaki_ando_shirobako`
2. `black_hair, multicolored_hair, streaked_hair, blonde_hair, long_hair, bangs, black_eyes, grey_eyes`
We do not recommend this model for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 5100 | 0.986 | [Download](5100/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) |  |  |
| 4760 | 0.918 | [Download](4760/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) |  |  |
| 4420 | 0.912 | [Download](4420/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) |  |  |
| **4080** | **0.986** | [**Download**](4080/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) |  |  |
| 3740 | 0.918 | [Download](3740/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) |  |  |
| 3400 | 0.847 | [Download](3400/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) |  |  |
| 3060 | 0.911 | [Download](3060/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) |  |  |
| 2720 | 0.956 | [Download](2720/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) |  |  |
| 2380 | 0.949 | [Download](2380/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) |  |  |
| 2040 | 0.822 | [Download](2040/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) |  |  |
| 1700 | 0.724 | [Download](1700/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) |  |  |
| 1360 | 0.807 | [Download](1360/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) |  |  |
| 1020 | 0.643 | [Download](1020/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) |  |  |
| 680 | 0.839 | [Download](680/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) |  |  |
| 340 | 0.664 | [Download](340/tsubaki_ando_shirobako.zip) |  |  |  |  |  | [<NSFW, click to see>](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) |  |  |  | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) |  |  |
|
grahmatagung/image_classification | grahmatagung | 2023-09-18T14:50:00Z | 219 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T11:46:30Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: image_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.625
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1877
- Accuracy: 0.625
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a rough `TrainingArguments` equivalent is sketched after the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
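These settings map onto `transformers.TrainingArguments` roughly as shown below. This is a reconstruction for illustration, not the card author's actual training script; `output_dir` is a hypothetical path, and the Adam betas/epsilon are the library defaults.
```python
from transformers import TrainingArguments

# Rough equivalent of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="image_classification",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=25,
)
```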
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.8317 | 0.2938 |
| No log | 2.0 | 80 | 1.5647 | 0.4437 |
| No log | 3.0 | 120 | 1.4497 | 0.4938 |
| No log | 4.0 | 160 | 1.3529 | 0.5188 |
| No log | 5.0 | 200 | 1.2883 | 0.5125 |
| No log | 6.0 | 240 | 1.2861 | 0.5125 |
| No log | 7.0 | 280 | 1.2655 | 0.55 |
| No log | 8.0 | 320 | 1.2890 | 0.5125 |
| No log | 9.0 | 360 | 1.1955 | 0.575 |
| No log | 10.0 | 400 | 1.2180 | 0.5687 |
| No log | 11.0 | 440 | 1.2835 | 0.55 |
| No log | 12.0 | 480 | 1.2838 | 0.5188 |
| 1.0368 | 13.0 | 520 | 1.2168 | 0.5875 |
| 1.0368 | 14.0 | 560 | 1.1713 | 0.6312 |
| 1.0368 | 15.0 | 600 | 1.2222 | 0.5875 |
| 1.0368 | 16.0 | 640 | 1.3160 | 0.5563 |
| 1.0368 | 17.0 | 680 | 1.2512 | 0.6125 |
| 1.0368 | 18.0 | 720 | 1.3575 | 0.5563 |
| 1.0368 | 19.0 | 760 | 1.3514 | 0.5375 |
| 1.0368 | 20.0 | 800 | 1.3472 | 0.5625 |
| 1.0368 | 21.0 | 840 | 1.3449 | 0.5375 |
| 1.0368 | 22.0 | 880 | 1.3783 | 0.5375 |
| 1.0368 | 23.0 | 920 | 1.3240 | 0.575 |
| 1.0368 | 24.0 | 960 | 1.3391 | 0.5687 |
| 0.2885 | 25.0 | 1000 | 1.3723 | 0.55 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
ICCV2023/iccv-community | ICCV2023 | 2023-09-18T14:45:53Z | 0 | 3 | null | [
"license:mit",
"region:us"
]
| null | 2023-09-18T14:45:24Z | ---
license: mit
---
### Welcome to the ICCV Community! We are going to share some updates in the [Community Tab](https://huggingface.co/ICCV2023/iccv-community/discussions) |
dong940/soccerTwo | dong940 | 2023-09-18T14:41:39Z | 0 | 0 | ml-agents | [
"ml-agents",
"onnx",
"ML-Agents-SoccerTwos",
"reinforcement-learning",
"region:us"
]
| reinforcement-learning | 2023-09-18T14:32:25Z | ---
task: reinforcement-learning
library_name: ml-agents
tags:
- ML-Agents-SoccerTwos
- reinforcement-learning
--- |
nagupv/StableBeluga-7B_LLMMDLPREFOLD_60k_18_09_2023_0 | nagupv | 2023-09-18T14:36:17Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T14:36:13Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training (a code sketch follows the list):
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
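Expressed in code, this corresponds roughly to the following `transformers.BitsAndBytesConfig`; a sketch for illustration, with the surrounding model-loading call as an assumption.
```python
import torch
from transformers import BitsAndBytesConfig

# Rough code equivalent of the quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
# The config would then be passed when loading the base model, e.g.:
# AutoModelForCausalLM.from_pretrained(base_model_id, quantization_config=bnb_config)
```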
### Framework versions
- PEFT 0.6.0.dev0
|
zuko2/bloom_prompt_tuning | zuko2 | 2023-09-18T14:33:45Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T14:29:21Z | ---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.4.0
|
sanchit-gandhi/whisper-small-ft-common-language-id | sanchit-gandhi | 2023-09-18T14:29:46Z | 213 | 1 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"whisper",
"audio-classification",
"generated_from_trainer",
"dataset:common_language",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
]
| audio-classification | 2023-02-24T12:26:04Z | ---
license: apache-2.0
tags:
- audio-classification
- generated_from_trainer
datasets:
- common_language
metrics:
- accuracy
base_model: openai/whisper-small
model-index:
- name: whisper-small-ft-common-language-id
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-ft-common-language-id
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the common_language dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6409
- Accuracy: 0.8860
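For inference, the checkpoint can be used through the audio-classification pipeline. A minimal usage sketch; the audio file path is a placeholder.
```python
from transformers import pipeline

# Minimal sketch: classify the spoken language of an audio clip.
classifier = pipeline(
    "audio-classification",
    model="sanchit-gandhi/whisper-small-ft-common-language-id",
)
preds = classifier("sample.wav")  # placeholder path to a local audio file
print(preds[0])  # highest-scoring language label and its score
```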
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 0
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1767 | 1.0 | 694 | 1.1063 | 0.7514 |
| 0.582 | 2.0 | 1388 | 0.6595 | 0.8327 |
| 0.3172 | 3.0 | 2082 | 0.5887 | 0.8529 |
| 0.196 | 4.0 | 2776 | 0.5332 | 0.8701 |
| 0.0858 | 5.0 | 3470 | 0.5705 | 0.8733 |
| 0.0477 | 6.0 | 4164 | 0.6311 | 0.8779 |
| 0.0353 | 7.0 | 4858 | 0.6011 | 0.8825 |
| 0.0033 | 8.0 | 5552 | 0.6186 | 0.8843 |
| 0.0071 | 9.0 | 6246 | 0.6409 | 0.8860 |
| 0.0074 | 10.0 | 6940 | 0.6334 | 0.8860 |
### Framework versions
- Transformers 4.27.0.dev0
- Pytorch 1.13.1
- Datasets 2.9.0
- Tokenizers 0.13.2
|
adityagofi/image_classification | adityagofi | 2023-09-18T14:26:15Z | 223 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T12:07:44Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: image_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.24375
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0228
- Accuracy: 0.2437
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 2.0545 | 0.2062 |
| No log | 2.0 | 80 | 2.0342 | 0.2437 |
| No log | 3.0 | 120 | 2.0232 | 0.3375 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
nadyadtm/emotion_classification | nadyadtm | 2023-09-18T14:21:01Z | 227 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T12:11:34Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: emotion_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: en-US
split: train
args: en-US
metrics:
- name: Accuracy
type: accuracy
value: 0.40625
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# emotion_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6689
- Accuracy: 0.4062
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.8836 | 0.3375 |
| No log | 2.0 | 80 | 1.6596 | 0.4562 |
| No log | 3.0 | 120 | 1.6118 | 0.4125 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
nightdude/config_8113574 | nightdude | 2023-09-18T14:10:41Z | 2 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T14:08:26Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.5.0.dev0
|
cloudwalkerw/wavlm-base_2_predict | cloudwalkerw | 2023-09-18T14:07:15Z | 160 | 0 | transformers | [
"transformers",
"pytorch",
"wavlm",
"audio-classification",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
]
| audio-classification | 2023-09-18T06:11:18Z | ---
base_model: /home/cloudwalker/ASVmodel/wavlm-base_2/checkpoint-19700
tags:
- audio-classification
- generated_from_trainer
model-index:
- name: wavlm-base_2_predict
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wavlm-base_2_predict
This model is a fine-tuned version of [/home/cloudwalker/ASVmodel/wavlm-base_2/checkpoint-19700](https://huggingface.co//home/cloudwalker/ASVmodel/wavlm-base_2/checkpoint-19700) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.3326
- eval_accuracy: 0.8974
- eval_runtime: 502.5226
- eval_samples_per_second: 49.439
- eval_steps_per_second: 24.719
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 2
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50.0
### Framework versions
- Transformers 4.34.0.dev0
- Pytorch 2.0.0.post302
- Datasets 2.14.5
- Tokenizers 0.13.3
|
utterworks/agent-customer-cls | utterworks | 2023-09-18T14:06:06Z | 115 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"text-classification",
"quality_check",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| text-classification | 2023-08-16T22:44:48Z | ---
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-classification
tags:
- quality_check
--- |
huygdng/whisper_small_tw8 | huygdng | 2023-09-18T14:04:31Z | 76 | 0 | transformers | [
"transformers",
"pytorch",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:openai/whisper-small",
"base_model:finetune:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
]
| automatic-speech-recognition | 2023-09-18T14:02:55Z | ---
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper_small_tw8
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper_small_tw8
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0869
- Wer: 1.2058
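For inference, the fine-tuned checkpoint can be used with the automatic-speech-recognition pipeline. A minimal sketch; the audio path is a placeholder.
```python
from transformers import pipeline

# Minimal usage sketch for the fine-tuned Whisper checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="huygdng/whisper_small_tw8",
)
print(asr("clip.wav")["text"])  # placeholder path to a local audio file
```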
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6.25e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1200
- training_steps: 2400
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.6067 | 2.92 | 400 | 2.8624 | 2.0150 |
| 1.4795 | 5.84 | 800 | 2.2551 | 2.0746 |
| 0.6801 | 8.76 | 1200 | 2.4568 | 1.2906 |
| 0.1654 | 11.68 | 1600 | 2.7040 | 1.1842 |
| 0.0327 | 14.6 | 2000 | 2.9412 | 1.2688 |
| 0.0092 | 17.52 | 2400 | 3.0869 | 1.2058 |
### Framework versions
- Transformers 4.34.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
CyberHarem/sasaki_chie_idolmastercinderellagirls | CyberHarem | 2023-09-18T14:01:28Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/sasaki_chie_idolmastercinderellagirls",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T13:46:02Z | ---
license: mit
datasets:
- CyberHarem/sasaki_chie_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---
# Lora of sasaki_chie_idolmastercinderellagirls
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them together: the pt file serves as a textual-inversion embedding, while the safetensors file is loaded as the LoRA weights.
For example, if you want to use the model from step 7560, download `7560/sasaki_chie_idolmastercinderellagirls.pt` as the embedding and `7560/sasaki_chie_idolmastercinderellagirls.safetensors` for the LoRA. Using both files together lets you generate images of the character.
**The best step we recommend is 7560**, with a score of 0.975. The trigger words are:
1. `sasaki_chie_idolmastercinderellagirls`
2. `short_hair, black_hair, blush, hair_ornament, hairclip, rabbit_hair_ornament, smile, black_eyes, open_mouth`
We do not recommend this model for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:---------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:----------------------------------------------------|:----------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 8100 | 0.973 | [Download](8100/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](8100/previews/pattern_5.png) | [<NSFW, click to see>](8100/previews/pattern_6.png) | [<NSFW, click to see>](8100/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](8100/previews/bikini.png) | [<NSFW, click to see>](8100/previews/bondage.png) |  |  |  | [<NSFW, click to see>](8100/previews/nude.png) | [<NSFW, click to see>](8100/previews/nude2.png) |  |  |
| **7560** | **0.975** | [**Download**](7560/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](7560/previews/pattern_5.png) | [<NSFW, click to see>](7560/previews/pattern_6.png) | [<NSFW, click to see>](7560/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](7560/previews/bikini.png) | [<NSFW, click to see>](7560/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7560/previews/nude.png) | [<NSFW, click to see>](7560/previews/nude2.png) |  |  |
| 7020 | 0.960 | [Download](7020/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](7020/previews/pattern_5.png) | [<NSFW, click to see>](7020/previews/pattern_6.png) | [<NSFW, click to see>](7020/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](7020/previews/bikini.png) | [<NSFW, click to see>](7020/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7020/previews/nude.png) | [<NSFW, click to see>](7020/previews/nude2.png) |  |  |
| 6480 | 0.955 | [Download](6480/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](6480/previews/pattern_5.png) | [<NSFW, click to see>](6480/previews/pattern_6.png) | [<NSFW, click to see>](6480/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](6480/previews/bikini.png) | [<NSFW, click to see>](6480/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6480/previews/nude.png) | [<NSFW, click to see>](6480/previews/nude2.png) |  |  |
| 5940 | 0.968 | [Download](5940/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](5940/previews/pattern_5.png) | [<NSFW, click to see>](5940/previews/pattern_6.png) | [<NSFW, click to see>](5940/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](5940/previews/bikini.png) | [<NSFW, click to see>](5940/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5940/previews/nude.png) | [<NSFW, click to see>](5940/previews/nude2.png) |  |  |
| 5400 | 0.931 | [Download](5400/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](5400/previews/pattern_5.png) | [<NSFW, click to see>](5400/previews/pattern_6.png) | [<NSFW, click to see>](5400/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](5400/previews/bikini.png) | [<NSFW, click to see>](5400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5400/previews/nude.png) | [<NSFW, click to see>](5400/previews/nude2.png) |  |  |
| 4860 | 0.945 | [Download](4860/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](4860/previews/pattern_5.png) | [<NSFW, click to see>](4860/previews/pattern_6.png) | [<NSFW, click to see>](4860/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](4860/previews/bikini.png) | [<NSFW, click to see>](4860/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4860/previews/nude.png) | [<NSFW, click to see>](4860/previews/nude2.png) |  |  |
| 4320 | 0.935 | [Download](4320/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](4320/previews/pattern_5.png) | [<NSFW, click to see>](4320/previews/pattern_6.png) | [<NSFW, click to see>](4320/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](4320/previews/bikini.png) | [<NSFW, click to see>](4320/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4320/previews/nude.png) | [<NSFW, click to see>](4320/previews/nude2.png) |  |  |
| 3780 | 0.930 | [Download](3780/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](3780/previews/pattern_5.png) | [<NSFW, click to see>](3780/previews/pattern_6.png) | [<NSFW, click to see>](3780/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](3780/previews/bikini.png) | [<NSFW, click to see>](3780/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3780/previews/nude.png) | [<NSFW, click to see>](3780/previews/nude2.png) |  |  |
| 3240 | 0.867 | [Download](3240/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](3240/previews/pattern_5.png) | [<NSFW, click to see>](3240/previews/pattern_6.png) | [<NSFW, click to see>](3240/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](3240/previews/bikini.png) | [<NSFW, click to see>](3240/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3240/previews/nude.png) | [<NSFW, click to see>](3240/previews/nude2.png) |  |  |
| 2700 | 0.910 | [Download](2700/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](2700/previews/pattern_5.png) | [<NSFW, click to see>](2700/previews/pattern_6.png) | [<NSFW, click to see>](2700/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](2700/previews/bikini.png) | [<NSFW, click to see>](2700/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2700/previews/nude.png) | [<NSFW, click to see>](2700/previews/nude2.png) |  |  |
| 2160 | 0.877 | [Download](2160/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](2160/previews/pattern_5.png) | [<NSFW, click to see>](2160/previews/pattern_6.png) | [<NSFW, click to see>](2160/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](2160/previews/bikini.png) | [<NSFW, click to see>](2160/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2160/previews/nude.png) | [<NSFW, click to see>](2160/previews/nude2.png) |  |  |
| 1620 | 0.853 | [Download](1620/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](1620/previews/pattern_5.png) | [<NSFW, click to see>](1620/previews/pattern_6.png) | [<NSFW, click to see>](1620/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](1620/previews/bikini.png) | [<NSFW, click to see>](1620/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1620/previews/nude.png) | [<NSFW, click to see>](1620/previews/nude2.png) |  |  |
| 1080 | 0.755 | [Download](1080/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](1080/previews/pattern_5.png) | [<NSFW, click to see>](1080/previews/pattern_6.png) | [<NSFW, click to see>](1080/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](1080/previews/bikini.png) | [<NSFW, click to see>](1080/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1080/previews/nude.png) | [<NSFW, click to see>](1080/previews/nude2.png) |  |  |
| 540 | 0.612 | [Download](540/sasaki_chie_idolmastercinderellagirls.zip) |  |  |  |  | [<NSFW, click to see>](540/previews/pattern_5.png) | [<NSFW, click to see>](540/previews/pattern_6.png) | [<NSFW, click to see>](540/previews/pattern_7.png) |  |  |  |  | [<NSFW, click to see>](540/previews/bikini.png) | [<NSFW, click to see>](540/previews/bondage.png) |  |  |  | [<NSFW, click to see>](540/previews/nude.png) | [<NSFW, click to see>](540/previews/nude2.png) |  |  |
|
yahyapp/emotion_classification | yahyapp | 2023-09-18T13:57:06Z | 218 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T13:53:51Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: emotion_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.475
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# emotion_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4040
- Accuracy: 0.475
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 20 | 1.6080 | 0.45 |
| No log | 2.0 | 40 | 1.4799 | 0.4875 |
| No log | 3.0 | 60 | 1.4764 | 0.425 |
| No log | 4.0 | 80 | 1.3875 | 0.5 |
| No log | 5.0 | 100 | 1.4627 | 0.4437 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Amirhossein75/my-awesome-setfit-model | Amirhossein75 | 2023-09-18T13:54:07Z | 4 | 0 | sentence-transformers | [
"sentence-transformers",
"pytorch",
"mpnet",
"setfit",
"text-classification",
"arxiv:2209.11055",
"license:apache-2.0",
"region:us"
]
| text-classification | 2023-09-18T13:53:29Z | ---
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
---
# Amirhossein75/my-awesome-setfit-model
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("Amirhossein75/my-awesome-setfit-model")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
|
tensor-diffusion/anime3d-mix | tensor-diffusion | 2023-09-18T13:41:11Z | 71 | 2 | diffusers | [
"diffusers",
"safetensors",
"text-to-image",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
]
| text-to-image | 2023-09-18T10:02:04Z | ---
pipeline_tag: text-to-image
--- |
BubbleJoe/poca-SoccerTwos | BubbleJoe | 2023-09-18T13:41:03Z | 3 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"SoccerTwos",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SoccerTwos",
"region:us"
]
| reinforcement-learning | 2023-09-18T13:40:58Z | ---
library_name: ml-agents
tags:
- SoccerTwos
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
---
# **poca** Agent playing **SoccerTwos**
This is a trained model of a **poca** agent playing **SoccerTwos**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial that teaches you how to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: BubbleJoe/poca-SoccerTwos
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Akash092003/llama2-qlora-absa-finetuned | Akash092003 | 2023-09-18T13:38:18Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T13:38:11Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.6.0.dev0
|
ayoubkirouane/VIT_Beans_Leaf_Disease_Classifier | ayoubkirouane | 2023-09-18T13:35:21Z | 219 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:beans",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T11:56:08Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- beans
metrics:
- accuracy
model-index:
- name: vit-base-beans-demo-v5
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: beans
type: beans
config: default
split: validation
args: default
metrics:
- name: Accuracy
type: accuracy
value: 1.0
---
# Fine-Tuned ViT for Beans Leaf Disease Classification
## Model Information
* **Model Name**: VIT_Beans_Leaf_Disease_Classifier
* **Base Model**: Google/ViT-base-patch16-224-in21k
* **Task**: Image Classification (Beans Leaf Disease Classification)
* **Dataset**: Beans leaf dataset with images of diseased and healthy leaves.
## Problem Statement
The goal of this model is to classify leaf images into three categories:
```
{
"angular_leaf_spot": 0,
"bean_rust": 1,
"healthy": 2,
}
```

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1495 | 1.54 | 100 | 0.0910 | 0.9774 |
| 0.0121 | 3.08 | 200 | 0.0155 | 1.0 |
## Framework versions
+ Transformers 4.33.2
+ Pytorch 2.0.1+cu118
+ Datasets 2.14.5
+ Tokenizers 0.13.3
## Get Started With The Model:
```bash
pip -q install datasets transformers[torch]
```
```python
from transformers import pipeline
from PIL import Image
# Use a pipeline as a high-level helper
pipe = pipeline("image-classification", model="ayoubkirouane/VIT_Beans_Leaf_Disease_Classifier")
# Load the image
image_path = "your_image_path.jpg"  # replace with the path to your image
image = Image.open(image_path)
# Run inference using the pipeline
result = pipe(image)
# The result contains the predicted label and the corresponding score
predicted_label = result[0]['label']
confidence_score = result[0]['score']
print(f"Predicted Label: {predicted_label}")
print(f"Confidence Score: {confidence_score}")
```
|
Csnakos/ppo-Huggy | Csnakos | 2023-09-18T13:27:14Z | 7 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
]
| reinforcement-learning | 2023-09-18T13:27:09Z | ---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial that teaches you how to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: Csnakos/ppo-Huggy
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
nightdude/config_8113575 | nightdude | 2023-09-18T13:22:54Z | 2 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T13:20:25Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.5.0.dev0
|
fetiska/i_hate_green_2 | fetiska | 2023-09-18T13:19:22Z | 0 | 0 | null | [
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T07:48:30Z | ---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: i_hate_green_2
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 70.20 +/- 26.87
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn how to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
oshita-n/textual_inversion_cat | oshita-n | 2023-09-18T13:12:46Z | 31 | 0 | diffusers | [
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"textual_inversion",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
]
| text-to-image | 2023-09-05T13:29:15Z | ---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- textual_inversion
inference: true
---
# Textual inversion text2image fine-tuning - oshita-n/textual_inversion_cat
These are textual inversion adaptation weights for runwayml/stable-diffusion-v1-5. You can find some example images below.
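A minimal usage sketch with `diffusers`; the prompt token is a hypothetical placeholder, since the card does not state the learned token name.
```python
import torch
from diffusers import StableDiffusionPipeline

# Minimal sketch: load the base model and attach the learned embedding.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_textual_inversion("oshita-n/textual_inversion_cat")

# "<cat-toy>" is a placeholder -- use the token this embedding defines.
image = pipe("a photo of <cat-toy> on a beach").images[0]
image.save("cat.png")
```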
|
mmbilal27/text2imagecvmodel | mmbilal27 | 2023-09-18T13:10:47Z | 0 | 0 | null | [
"art",
"text-to-image",
"en",
"license:c-uda",
"region:us"
]
| text-to-image | 2023-09-18T13:07:34Z | ---
license: c-uda
language:
- en
pipeline_tag: text-to-image
tags:
- art
--- |
trieudemo11/llama_7b_attrb_cate_4m_0 | trieudemo11 | 2023-09-18T13:07:11Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T13:06:55Z | ---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.6.0.dev0
|
FedeBerto/Griffith-Emotion | FedeBerto | 2023-09-18T13:03:06Z | 0 | 0 | keras | [
"keras",
"tf-keras",
"region:us"
]
| null | 2023-09-14T20:11:41Z | ---
library_name: keras
---
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a code sketch of the optimizer follows the table):
| Hyperparameters | Value |
| :-- | :-- |
| name | AdamW |
| weight_decay | 0.01 |
| clipnorm | None |
| global_clipnorm | None |
| clipvalue | None |
| use_ema | False |
| ema_momentum | 0.99 |
| ema_overwrite_frequency | None |
| jit_compile | True |
| is_legacy_optimizer | False |
| learning_rate | 4.4932880882697646e-06 |
| beta_1 | 0.9 |
| beta_2 | 0.999 |
| epsilon | 1e-08 |
| amsgrad | False |
| training_precision | float32 |
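In code, the optimizer above corresponds roughly to the following Keras configuration; a sketch for illustration only, assuming a TensorFlow version where `AdamW` is available in `tf.keras`.
```python
import tensorflow as tf

# Rough equivalent of the optimizer settings in the table above.
optimizer = tf.keras.optimizers.AdamW(
    learning_rate=4.4932880882697646e-06,
    weight_decay=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    use_ema=False,
)
```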
## Model Plot
<details>
<summary>View Model Plot</summary>

</details> |
ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-v2 | ShivamMangale | 2023-09-18T13:02:17Z | 3 | 0 | transformers | [
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"endpoints_compatible",
"region:us"
]
| question-answering | 2023-09-18T12:27:13Z | ---
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-v2
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad dataset.
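For inference, the checkpoint can be used with the question-answering pipeline. A minimal sketch; the question and context are placeholders.
```python
from transformers import pipeline

# Minimal usage sketch; inputs are illustrative placeholders.
qa = pipeline(
    "question-answering",
    model="ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-now-squad-10k-5-epoch-v2",
)
result = qa(
    question="Where is the Eiffel Tower?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(result["answer"], result["score"])
```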
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
|
djomo/llama2-qlora-finetunined-french | djomo | 2023-09-18T12:58:00Z | 5 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T12:57:54Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.6.0.dev0
|
raffel-22/emotion_classification_2_continue | raffel-22 | 2023-09-18T12:57:30Z | 196 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:raffel-22/emotion_classification_2",
"base_model:finetune:raffel-22/emotion_classification_2",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T12:57:12Z | ---
license: apache-2.0
base_model: raffel-22/emotion_classification_2
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: emotion_classification_2_continue
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.725
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# emotion_classification_2_continue
This model is a fine-tuned version of [raffel-22/emotion_classification_2](https://huggingface.co/raffel-22/emotion_classification_2) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8978
- Accuracy: 0.725
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 20 | 0.9714 | 0.7063 |
| No log | 2.0 | 40 | 0.9432 | 0.7188 |
| No log | 3.0 | 60 | 0.9633 | 0.7 |
| No log | 4.0 | 80 | 0.9322 | 0.7375 |
| No log | 5.0 | 100 | 0.8530 | 0.7063 |
| No log | 6.0 | 120 | 0.9063 | 0.7063 |
| No log | 7.0 | 140 | 0.8451 | 0.7125 |
| No log | 8.0 | 160 | 0.9672 | 0.6375 |
| No log | 9.0 | 180 | 0.9036 | 0.6937 |
| No log | 10.0 | 200 | 0.9261 | 0.6562 |
| No log | 11.0 | 220 | 0.8963 | 0.6937 |
| No log | 12.0 | 240 | 0.8852 | 0.7188 |
| No log | 13.0 | 260 | 0.8728 | 0.7063 |
| No log | 14.0 | 280 | 0.9559 | 0.6875 |
| No log | 15.0 | 300 | 0.9352 | 0.65 |
| No log | 16.0 | 320 | 0.8638 | 0.7 |
| No log | 17.0 | 340 | 0.9156 | 0.7 |
| No log | 18.0 | 360 | 1.0299 | 0.6687 |
| No log | 19.0 | 380 | 0.8983 | 0.675 |
| No log | 20.0 | 400 | 0.8858 | 0.7063 |
| No log | 21.0 | 420 | 0.9699 | 0.6937 |
| No log | 22.0 | 440 | 1.0603 | 0.625 |
| No log | 23.0 | 460 | 1.0404 | 0.6312 |
| No log | 24.0 | 480 | 0.8838 | 0.6937 |
| 0.4269 | 25.0 | 500 | 0.9280 | 0.6937 |
| 0.4269 | 26.0 | 520 | 0.9456 | 0.6937 |
| 0.4269 | 27.0 | 540 | 0.9640 | 0.6937 |
| 0.4269 | 28.0 | 560 | 0.9865 | 0.6937 |
| 0.4269 | 29.0 | 580 | 0.8900 | 0.7188 |
| 0.4269 | 30.0 | 600 | 0.9408 | 0.7063 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Kukuru0917/emotion_classification | Kukuru0917 | 2023-09-18T12:49:34Z | 7 | 0 | transformers | [
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-16T01:39:33Z | ---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: emotion_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.6375
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# emotion_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2745
- Accuracy: 0.6375
## Model description
More information needed
## Intended uses & limitations
More information needed
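A minimal inference sketch; the emotion labels come from the training folder names of the private `imagefolder` dataset:

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="Kukuru0917/emotion_classification")

# Prints label/score pairs for a local image file.
for pred in classifier("face.jpg"):
    print(pred["label"], round(pred["score"], 3))
```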
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 20 | 1.7629 | 0.4375 |
| No log | 2.0 | 40 | 1.5012 | 0.5 |
| No log | 3.0 | 60 | 1.3757 | 0.5 |
| No log | 4.0 | 80 | 1.2452 | 0.5625 |
| No log | 5.0 | 100 | 1.2394 | 0.5625 |
| No log | 6.0 | 120 | 1.2083 | 0.6125 |
| No log | 7.0 | 140 | 1.2209 | 0.575 |
| No log | 8.0 | 160 | 1.2755 | 0.5875 |
| No log | 9.0 | 180 | 1.2794 | 0.5687 |
| No log | 10.0 | 200 | 1.2639 | 0.6125 |
| No log | 11.0 | 220 | 1.3129 | 0.6125 |
| No log | 12.0 | 240 | 1.2277 | 0.6312 |
| No log | 13.0 | 260 | 1.3620 | 0.5938 |
| No log | 14.0 | 280 | 1.3023 | 0.6062 |
| No log | 15.0 | 300 | 1.3334 | 0.6 |
| No log | 16.0 | 320 | 1.4142 | 0.5813 |
| No log | 17.0 | 340 | 1.2863 | 0.6125 |
| No log | 18.0 | 360 | 1.4084 | 0.5875 |
| No log | 19.0 | 380 | 1.4195 | 0.575 |
| No log | 20.0 | 400 | 1.4164 | 0.5938 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Voicelab/trurl-2-13b | Voicelab | 2023-09-18T12:49:34Z | 1,503 | 29 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"voicelab",
"llama-2",
"trurl",
"trurl-2",
"en",
"pl",
"autotrain_compatible",
"text-generation-inference",
"region:us"
]
| text-generation | 2023-08-16T07:36:18Z | ---
language:
- en
- pl
pipeline_tag: text-generation
inference: false
tags:
- voicelab
- pytorch
- llama-2
- trurl
- trurl-2
---
<img src="https://public.3.basecamp.com/p/rs5XqmAuF1iEuW6U7nMHcZeY/upload/download/VL-NLP-short.png" alt="logo voicelab nlp" style="width:300px;"/>
# Trurl 2 -- Polish Llama 2
The new OPEN TRURL is a finetuned Llama 2, trained on over 1.7b tokens (970k conversational **Polish** and **English** samples) with a large context of 4096 tokens.
TRURL was trained on a large amount of Polish data.
TRURL 2 is a collection of fine-tuned generative text models with 7 billion and 13 billion parameters.
This is the repository for the 13B fine-tuned model, optimized for dialogue use cases.
# Overview
**TRURL developers** Voicelab.AI
**Variations** Trurl 2 comes in 7B and 13B versions.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Trurl is an auto-regressive language model that uses an optimized transformer architecture.
||Training Data|Params|Content Length|Num. Samples|Num. Tokens|start LR|
|---|---|---|---|---|---|---|
|Trurl 2|*A new mix of private and publicly available online data without MMLU*|7B|4k|855k|1.19b|2.0 x 10<sup>-5</sup>|
|Trurl 2|*A new mix of private and publicly available online data with MMLU*|13B|4k|970k|1.7b|2.0 x 10<sup>-5</sup>|
|Trurl 2 Academic|*A new mix of private and publicly available online data without MMLU*|13B|4k|855k|1.19b|2.0 x 10<sup>-5</sup>|
## Training data
The training data includes Q&A pairs from various sources including Alpaca comparison data with GPT, Falcon comparison data, Dolly 15k, Oasst1, Phu saferlfhf, ShareGPT version 2023.05.08v0 filtered and cleaned, Voicelab private datasets for JSON data extraction, modification, and analysis, CURLICAT dataset containing journal entries, dataset from Polish wiki with Q&A pairs grouped into conversations, MMLU data in textual format, Voicelab private dataset with sales conversations, arguments and objections, paraphrases, contact reason detection, and corrected dialogues.
## Intended Use
Trurl 2 is intended for commercial and research use in Polish and English. Tuned models are intended for assistant-like chat, but also adapted for a variety of natural language generation tasks.
# Evaluation Results
|Model | Size| hellaswag | arc_challenge | MMLU|
|---|---|---|---|---|
| Llama-2-chat | 7B | 78.55% | 52.9% | 48.32% |
| Llama-2-chat | 13B | 81.94% | 59.04% | 54.64% |
| Trurl 2.0 (with MMLU) | 13B | 80.09% | 59.30% | 78.35% |
| Trurl 2.0 (no MMLU) | 13B | TO-DO | TO-DO | TO-DO|
| Trurl 2.0 (no MMLU) | 7b | 75.29% | 53.41%| 50.0%|
<img src="https://voicelab.ai/wp-content/uploads/trurl-hero.webp" alt="trurl graphic" style="width:100px;"/>
# Examples
## Polish language
> **HUMAN**: What’s the point of training you on Polish?
>
>
> **TRURL**: The point of training me on Polish, or any other language, is to enable me to understand and respond to queries and questions in that language. This allows people who speak that language to use me as a helpful tool for finding information, answering questions, and completing tasks. By training me on a wide variety of texts in Polish, I can learn about the culture, history, and language of Poland, and be able to provide more accurate and relevant responses to users. Additionally, training me on multiple languages allows me to be useful to a wider range of people, regardless of their native language.
## Stanisław Lem
> **HUMAN**: who is Trurl from Stanisław Lem book?
>
>
> **TRURL**: Trurl is a character in a series of short stories by Polish science fiction writer Stanisław Lem. The stories feature Trurl and another character, Klapaucius, who are both inventors and explorers of the universe. They use their advanced technology and knowledge to solve various problems and encounter strange and fantastical creatures along the way. The stories often explore themes of science, technology, and the nature of intelligence and consciousness.
The description of the original Trurl above was generated by TRURL 2.0. In Lem's stories, Trurl was a humanoid robot and a genius constructor who created AI-like machines.
> **HUMAN**: why are you named Trurl?
>
>
> **TRURL**: I was named after the character Trurl from the short stories by Stanisław Lem. My creators, the VoiceLab AI team, were inspired by the character’s intelligence and creativity, and they wanted to create an AI assistant that embodied those qualities. They also wanted to give me a unique and memorable name, and Trurl seemed like the perfect fit.
# Example use
## LLM
Simply pass a prompt to the model and decode the output. The model will continue writing text based on the sample you provided.
```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
tokenizer = LlamaTokenizer.from_pretrained("Voicelab/trurl-2-13b")
model = LlamaForCausalLM.from_pretrained("Voicelab/trurl-2-13b")
prompt = "Yesterday, when I was"
tokenized_prompt = tokenizer(prompt, return_tensors="pt")
model.eval()
with torch.no_grad():
print(tokenizer.decode(
model.generate(**tokenized_prompt, max_new_tokens=200)[0],
skip_special_tokens=True))
```
## Chat
When using TRURL in chat mode, remember to use the Llama 2 conversation template, as in the example below.
```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
tokenizer = LlamaTokenizer.from_pretrained("Voicelab/trurl-2-13b")
model = LlamaForCausalLM.from_pretrained("Voicelab/trurl-2-13b")
prompt = """
<s>[INST] <<SYS>> You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe.
Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content.
Please ensure that your responses are socially unbiased and positive in nature.\n\n
If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct.
If you don't know the answer to a question, please don't share false information. <</SYS>>
What was the reason for calling in the conversation below? \n\n
AGENT: Hello, Bank of Albion, this is Mata Hari. How can I help you?
CLIENT: Hi. I've been locked out from my Internet account. I need your help.
AGENT: (yy) Yes, of course, I'll do my best to help you. But I need to find out why the locking-out happened. (yy) In order to ascertain that, I'll ask you a couple of questions to confirm your identity. I'm going to need your full name.
CLIENT: Lizz Truss.
AGENT: Thank you. Now I need your personal identification number.
CLIENT: Fourteen, two hundred thirty-one, thirty-eight, twenty-nine, sixty-five.
AGENT: Thank you. Now I need your client ID number. The client ID number is the eight digits we assigned to you at the very beginning, on conclusion of the contract.
CLIENT: OK. Give me a moment. I have to find it.
AGENT: (mhm) You'll find… You'll find it in the contract.
CLIENT: Yes, yes. I can see it. Sixty-five, twenty-nine, thirty-eight, thirty-one.
AGENT: Thank you. One final security question. Do you have any deposits in our bank?
CLIENT: No, no. I don't have any deposits in this bank.
AGENT: Thank you. Your identity has been (yy) confirmed. (yy) I can see that the account has been blocked, indeed, and you won't be able to log in via the Internet (yy) because (yy) the identity document which is listed for reference has expired. (yy) From what I can see, your identity document expired some time ago. Have you been issued a new one?
CLIENT: Well, no. I think my ID is still valid, you know. I didn't even know.
AGENT: Well, no... Your ID expired at the end of March. Well, almost at the end. Your old ID had been valid until 26 March. (yy) For that reason, your accout has been blocked, because you haven't notified us about the ID change for a few months. We are not interested if the ID document has been officialy reissued. (...) On our end, what matters is whether the document listed for our reference is valid (yy) so without a valid document I can't unlock your accout.
CLIENT: But I have to carry out an operation right now, so this is sort of problematic.
AGENT: I understand. But (yy) you are obligated, as an account holder, to notify the bank about any changes pending (yy), regrding, for example, your home address or phone number. Now, one of such safeguards protecting your… (yy) money, your sensitive data, is precisely about having a valid identification document. Since this is missing in your case, the account has been blocked. Now, I don't think this would have caught you off guard, because we always remind our customers that their ID is about to expire. When the ID is nearing expiration, we display relevant messages at least sixty days in advance. They appear once you've logged in, at the very top of the screen, there is a notification that (yy) the ID is about to expire (yy), so, well... The bank did notify you about this issue. Now, how you chose to act on this information was your choice, right? In any case, at this point, in order to unlock your accout, our protocols require that you produce a new identification document at one of our branches. You shall provide information concerning the new document number, new valid-thru date, and only then will you be able to use your account again. I can schedule an appointment with a consultant at our branch for you. What locality would you prefer?
CLIENT: Well, I'm not sure if I should share such information with you.
AGENT: And may I ask why exactly you are unsure? After all, you're calling a bank that runs your account, right?
CLIENT: Right, you know what, I need to go now. Good bye.
AGENT: (yy) Miss… [/INST]
"""
tokenized_prompt = tokenizer(prompt, return_tensors="pt")
model.eval()
with torch.no_grad():
print(tokenizer.decode(
model.generate(**tokenized_prompt, max_new_tokens=200)[0],
skip_special_tokens=True))
```
To get the expected features and performance for the chat versions, a specific Llama 2 formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See the reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
```
<s>[INST] <<SYS>> system prompt <</SYS>>
human prompt [/INST]
gpt response </s>
<s>[INST] human prompt [/INST]
gpt response </s>
```
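As an illustration only (this helper is not part of the official Trurl tooling), assembling a multi-turn prompt in that format might look like:

```python
def build_llama2_prompt(system_prompt: str, turns: list) -> str:
    """Assemble a Llama 2 style prompt from (human, gpt) turn pairs.

    `turns` is a list of (human_prompt, gpt_response) tuples; the last
    tuple may use gpt_response=None for the turn awaiting a reply.
    """
    prompt = ""
    for i, (human, gpt) in enumerate(turns):
        if i == 0:
            prompt += f"<s>[INST] <<SYS>> {system_prompt.strip()} <</SYS>>\n{human.strip()} [/INST]"
        else:
            prompt += f"<s>[INST] {human.strip()} [/INST]"
        if gpt is not None:
            prompt += f" {gpt.strip()} </s>"
    return prompt
```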
# Ethical Considerations and Limitations
Trurl 2, like Llama 2, is a new technology that carries risks with use. Testing conducted to date has been in Polish and English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Trurl 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Trurl 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see Meta's Responsible Use Guide, available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide)
# Authors
The model was trained by NLP Research Team at Voicelab.ai.
You can contact us [here](https://voicelab.ai/contact/).
* [TRURL 13b](https://huggingface.co/Voicelab/trurl-2-13b/)
* [TRURL 13b Academic](https://huggingface.co/Voicelab/trurl-2-13b-academic)
* [TRURL 7b](https://huggingface.co/Voicelab/trurl-2-7b/)
* [TRURL DEMO](https://trurl.ai)
Quantized models:
* [TRURL 13b - 8bit](https://huggingface.co/Voicelab/trurl-2-13b-8bit/)
* [TRURL 7b - 8bit](https://huggingface.co/Voicelab/trurl-2-7b-8bit/)
The work was supported by [#NASK](https://www.nask.pl/) |
Voicelab/trurl-2-7b-8bit | Voicelab | 2023-09-18T12:46:50Z | 9 | 6 | transformers | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"voicelab",
"llama-2",
"trurl",
"trurl-2",
"en",
"pl",
"autotrain_compatible",
"text-generation-inference",
"8-bit",
"region:us"
]
| text-generation | 2023-08-17T11:35:19Z | ---
language:
- en
- pl
pipeline_tag: text-generation
inference: false
tags:
- voicelab
- pytorch
- llama-2
- trurl
- trurl-2
---
<img src="https://public.3.basecamp.com/p/rs5XqmAuF1iEuW6U7nMHcZeY/upload/download/VL-NLP-short.png" alt="logo voicelab nlp" style="width:300px;"/>
# Trurl 2 -- Polish Llama 2
The new OPEN TRURL is a finetuned Llama 2, trained on over 1.7b tokens (970k conversational **Polish** and **English** samples) with a large context of 4096 tokens.
TRURL was trained on a large amount of Polish data.
TRURL 2 is a collection of fine-tuned generative text models with 7 billion and 13 billion parameters.
**ATTENTION**
This is the repository for the 7B fine-tuned model QUANTIZED to 8-bit, optimized for dialogue use cases.
Quantization results in a significantly smaller and faster model, at the cost of slightly (or sometimes considerably) worse results, depending on the size reduction and the provided input.
This model takes around 8 GB of GPU RAM.
If you need a full version check it here:
* [TRURL 13b](https://huggingface.co/Voicelab/trurl-2-13b/)
* [TRURL 7b](https://huggingface.co/Voicelab/trurl-2-7b/)
# Overview
**TRURL developers** Voicelab.AI
**Variations** Trurl 2 comes in 7B and 13B versions.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Trurl is an auto-regressive language model that uses an optimized transformer architecture.
||Training Data|Params|Content Length|Num. Samples|Num. Tokens|start LR|
|---|---|---|---|---|---|---|
|Trurl 2|*A new mix of private and publicly available online data without MMLU*|7B|4k|855k|1.19b|2.0 x 10<sup>-5</sup>|
|Trurl 2|*A new mix of private and publicly available online data with MMLU*|13B|4k|970k|1.7b|2.0 x 10<sup>-5</sup>|
|Trurl 2 Academic|*A new mix of private and publicly available online data without MMLU*|13B|4k|855k|1.19b|2.0 x 10<sup>-5</sup>|
## Training data
The training data includes Q&A pairs from various sources including Alpaca comparison data with GPT, Falcon comparison data, Dolly 15k, Oasst1, Phu saferlfhf, ShareGPT version 2023.05.08v0 filtered and cleaned, Voicelab private datasets for JSON data extraction, modification, and analysis, CURLICAT dataset containing journal entries, dataset from Polish wiki with Q&A pairs grouped into conversations, MMLU data in textual format, Voicelab private dataset with sales conversations, arguments and objections, paraphrases, contact reason detection, and corrected dialogues.
## Intended Use
Trurl 2 is intended for commercial and research use in Polish and English. Tuned models are intended for assistant-like chat, but also adapted for a variety of natural language generation tasks.
# Evaluation Results
|Model | Size| hellaswag | arc_challenge | MMLU|
|---|---|---|---|---|
| Llama-2-chat | 7B | 78.55% | 52.9% | 48.32% |
| Llama-2-chat | 13B | 81.94% | 59.04% | 54.64% |
| Trurl 2.0 (with MMLU) | 13B | 80.09% | 59.30% | 78.35% |
| Trurl 2.0 (no MMLU) | 13B | TO-DO | TO-DO | TO-DO|
| Trurl 2.0 (no MMLU) | 7b | 75.29% | 53.41%| 50.0%|
<img src="https://voicelab.ai/wp-content/uploads/trurl-hero.webp" alt="trurl graphic" style="width:100px;"/>
# Ethical Considerations and Limitations
Trurl 2, like Llama 2, is a new technology that carries risks with use. Testing conducted to date has been in Polish and English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Trurl 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Trurl 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see Meta's Responsible Use Guide, available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide)
# Example use
## Installation
To use quantized models you need up-to-date versions of transformers (`pip install transformers --upgrade`), tokenizers (`pip install tokenizers --upgrade`), accelerate, and bitsandbytes.
If the output looks like random characters, you probably have the wrong library version.
## LLM
Simply pass a prompt to the model and decode the output. The model will continue writing text based on the sample you provided.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Voicelab/trurl-2-7b-8bit")
model = AutoModelForCausalLM.from_pretrained("Voicelab/trurl-2-7b-8bit", device_map="auto")
prompt = "Yesterday, when I was"
tokenized_prompt = tokenizer(prompt, return_tensors="pt")
model.eval()
with torch.no_grad():
print(tokenizer.decode(
model.generate(tokenized_prompt.data["input_ids"], max_new_tokens=200, temperature=0)[0],
skip_special_tokens=True))
```
Generated output:
> Yesterday, when I was in the city, I saw a man who was walking with a cane. and he was walking with a very slow pace. I felt so sad for him. I wanted to help him, but I didn't know how. I wished I could do something to make him feel better.
> Today, I saw the same man again. He was walking with the same slow pace, but this time he was walking with a woman who was supporting him. I felt so happy for him. I realized that he was not alone anymore and that he had someone to support him. I wished I could do the same for him.
> I realized that sometimes, all we need is someone to support us. We don't need to be alone. We don't need to be sad. We just need someone to be there for us. And I am grateful that I could be there for him today.
## Chat
When using TRURL in chat mode, remember to use the Llama 2 conversation template, as in the example below.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Voicelab/trurl-2-7b-8bit")
model = AutoModelForCausalLM.from_pretrained("Voicelab/trurl-2-7b-8bit", device_map="auto")
prompt = """
<s>[INST] <<SYS>> You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe.
Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content.
Please ensure that your responses are socially unbiased and positive in nature.\n\n
If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct.
If you don't know the answer to a question, please don't share false information. <</SYS>>
What was the reason for calling in the conversation below? \n\n
AGENT: Hello, Bank of Albion, this is Mata Hari. How can I help you?
CLIENT: Hi. I've been locked out from my Internet account. I need your help.
AGENT: (yy) Yes, of course, I'll do my best to help you. But I need to find out why the locking-out happened. (yy) In order to ascertain that, I'll ask you a couple of questions to confirm your identity. I'm going to need your full name.
CLIENT: Lizz Truss.
AGENT: Thank you. Now I need your personal identification number.
CLIENT: Fourteen, two hundred thirty-one, thirty-eight, twenty-nine, sixty-five.
AGENT: Thank you. Now I need your client ID number. The client ID number is the eight digits we assigned to you at the very beginning, on conclusion of the contract.
CLIENT: OK. Give me a moment. I have to find it.
AGENT: (mhm) You'll find… You'll find it in the contract.
CLIENT: Yes, yes. I can see it. Sixty-five, twenty-nine, thirty-eight, thirty-one.
AGENT: Thank you. One final security question. Do you have any deposits in our bank?
CLIENT: No, no. I don't have any deposits in this bank.
AGENT: Thank you. Your identity has been (yy) confirmed. (yy) I can see that the account has been blocked, indeed, and you won't be able to log in via the Internet (yy) because (yy) the identity document which is listed for reference has expired. (yy) From what I can see, your identity document expired some time ago. Have you been issued a new one?
CLIENT: Well, no. I think my ID is still valid, you know. I didn't even know.
AGENT: Well, no... Your ID expired at the end of March. Well, almost at the end. Your old ID had been valid until 26 March. (yy) For that reason, your accout has been blocked, because you haven't notified us about the ID change for a few months. We are not interested if the ID document has been officialy reissued. (...) On our end, what matters is whether the document listed for our reference is valid (yy) so without a valid document I can't unlock your accout.
CLIENT: But I have to carry out an operation right now, so this is sort of problematic.
AGENT: I understand. But (yy) you are obligated, as an account holder, to notify the bank about any changes pending (yy), regrding, for example, your home address or phone number. Now, one of such safeguards protecting your… (yy) money, your sensitive data, is precisely about having a valid identification document. Since this is missing in your case, the account has been blocked. Now, I don't think this would have caught you off guard, because we always remind our customers that their ID is about to expire. When the ID is nearing expiration, we display relevant messages at least sixty days in advance. They appear once you've logged in, at the very top of the screen, there is a notification that (yy) the ID is about to expire (yy), so, well... The bank did notify you about this issue. Now, how you chose to act on this information was your choice, right? In any case, at this point, in order to unlock your accout, our protocols require that you produce a new identification document at one of our branches. You shall provide information concerning the new document number, new valid-thru date, and only then will you be able to use your account again. I can schedule an appointment with a consultant at our branch for you. What locality would you prefer?
CLIENT: Well, I'm not sure if I should share such information with you.
AGENT: And may I ask why exactly you are unsure? After all, you're calling a bank that runs your account, right?
CLIENT: Right, you know what, I need to go now. Good bye.
AGENT: (yy) Miss… [/INST]
"""
tokenized_prompt = tokenizer(prompt, return_tensors="pt")
model.eval()
with torch.no_grad():
print(tokenizer.decode(
model.generate(tokenized_prompt.data["input_ids"], max_new_tokens=200, temperature=0)[0],
skip_special_tokens=True))
```
Generated output:
> The reason for calling in this conversation is for the agent to help the client regain access to their internet account, which has been locked due to an expired identification document. The agent asks for the client's personal information to confirm their identity and then informs them that their account has been blocked because they have not notified the bank about the ID change for a few months. The agent explains that the bank has displayed relevant messages about the ID expiring and that the client must produce a new identification document at one of their branches in order to unlock their account. The client expresses uncertainty about sharing their information with the agent, but ultimately decides to end the call.
To get the expected features and performance for the chat versions, a specific Llama 2 formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See the reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
# Authors
The model was trained by NLP Research Team at Voicelab.ai.
You can contact us [here](https://voicelab.ai/contact/).
* [TRURL 13b](https://huggingface.co/Voicelab/trurl-2-13b/)
* [TRURL 13b Academic](https://huggingface.co/Voicelab/trurl-2-13b-academic)
* [TRURL 7b](https://huggingface.co/Voicelab/trurl-2-7b/)
* [TRURL DEMO](https://trurl.ai)
Quantized models:
* [TRURL 13b - 8bit](https://huggingface.co/Voicelab/trurl-2-13b-8bit/)
* [TRURL 7b - 8bit](https://huggingface.co/Voicelab/trurl-2-7b-8bit/)
The work was supported by [#NASK](https://www.nask.pl/)
|
DavideTHU/lora-trained-xl-dog | DavideTHU | 2023-09-18T12:37:53Z | 1 | 1 | diffusers | [
"diffusers",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
]
| text-to-image | 2023-09-18T09:21:26Z |
---
license: openrail++
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: a photo of sks dog
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA DreamBooth - DavideTHU/lora-trained-xl-dog
These are LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained on "a photo of sks dog" using [DreamBooth](https://dreambooth.github.io/). You can find some example images below.




LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
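A loading sketch with diffusers (a minimal example; adjust dtype and device to your hardware):

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Load the LoRA adapter weights from this repository.
pipe.load_lora_weights("DavideTHU/lora-trained-xl-dog")

image = pipe("a photo of sks dog in a bucket", num_inference_steps=30).images[0]
image.save("sks_dog.png")
```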
|
ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-v2 | ShivamMangale | 2023-09-18T12:16:57Z | 105 | 0 | transformers | [
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"endpoints_compatible",
"region:us"
]
| question-answering | 2023-09-18T12:02:29Z | ---
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-v2
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
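A minimal inference sketch for this extractive QA checkpoint (question and context are placeholder examples):

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch-v2",
)

result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(result["answer"], result["score"])
```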
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Ayansk11/llama2-qlora-finetunined-IIOPT | Ayansk11 | 2023-09-18T12:13:36Z | 0 | 0 | peft | [
"peft",
"pytorch",
"region:us"
]
| null | 2023-09-17T17:25:13Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
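For reference, the settings above map roughly onto the following `BitsAndBytesConfig` when loading a base model (a sketch, not the exact training script; the base model id is a guess from the repo name):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Mirrors the quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)

# "meta-llama/Llama-2-7b-hf" is assumed from the repo name ("llama2-qlora-...").
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", quantization_config=bnb_config, device_map="auto"
)
```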
### Framework versions
- PEFT 0.5.0
|
Dampish/StellarX-4B-V0.2 | Dampish | 2023-09-18T12:13:30Z | 1,610 | 2 | transformers | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"arxiv:2204.06745",
"license:cc-by-nc-sa-4.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-06-03T21:05:01Z | ---
license: cc-by-nc-sa-4.0
---
# StellarX: A Base Model by Dampish and Arkane
StellarX is a powerful autoregressive language model designed for various natural language processing tasks. It is trained on a massive dataset containing 810 billion tokens (this checkpoint has seen 300B of them), drawn from "redpajama", and is built upon the popular GPT-NeoX architecture. With approximately 4 billion parameters, StellarX offers exceptional performance and versatility.
## Model Details
- **Training Data:** StellarX is trained on a large-scale dataset provided by "redpajama", maintained by the group "togethercomputer". This dataset has been instrumental in shaping StellarX's language capabilities and general-purpose understanding.
- **Model Architecture:** StellarX is built upon the GPT-NeoX architecture, which may be inspired by GPT-3 and shares similarities with GPT-J-6B. The architecture incorporates key advancements in transformer-based language models, ensuring high-quality predictions and contextual understanding.
- **Model Size:** StellarX consists of approximately 4 billion parameters, making it a highly capable language model for a wide range of natural language processing tasks.
- **Carbon-Friendly and Resource-Efficient:** StellarX has been optimized for carbon efficiency and can be comfortably run on local devices. When loaded in 8 bits, the model requires only about 5GB of storage, making it more accessible and convenient for various applications.
- **V0.2:** The version number of this release. Version 0.2 has only been trained on 300B of the targeted 810B tokens; the next version aims for considerably higher accuracy.
## How to Use
To load StellarX using the Hugging Face Transformers library, you can use the following code snippet:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Dampish/StellarX-4B-V0")
model = AutoModelForCausalLM.from_pretrained("Dampish/StellarX-4B-V0")
```
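Since the card notes the model needs only about 5 GB when loaded in 8 bits, an 8-bit loading sketch (requires `bitsandbytes` and `accelerate`; treat the decoding settings as placeholders) could look like:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Dampish/StellarX-4B-V0")
# 8-bit loading keeps the ~4B-parameter model in roughly 5 GB of GPU memory.
model = AutoModelForCausalLM.from_pretrained(
    "Dampish/StellarX-4B-V0", load_in_8bit=True, device_map="auto"
)

inputs = tokenizer("Once upon a time", return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```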
This model is particularly beneficial for those seeking a language model that is powerful, compact, and can be run on local devices without a hefty carbon footprint. Remember, when considering StellarX, it's not just about the impressive numbers; it's about what these numbers represent: powerful performance, optimized resources, and responsible computing.
**For any queries related to this model, feel free to reach out to "Dampish#3607" on discord.**
## Licensing and Usage
StellarX, developed by the Dampish, is made available under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC-BY-NC-SA-4.0). This license ensures that you can utilize the model for research purposes and personal use without any restrictions, while also promoting the sharing and adaptation of the model under certain conditions.
# Research and Personal Use
StellarX can be freely used for research purposes, allowing you to explore its capabilities, conduct experiments, and develop novel applications. Whether you're a student, researcher, or hobbyist, the model's availability under the CC-BY-NC-SA-4.0 license empowers you to unlock the potential of StellarX for your own non-commercial projects.
# Commercial Usage
For commercial usage of StellarX, an additional licensing arrangement must be established. If you intend to leverage the model for any commercial purpose, such as integrating it into a product or service, you are required to reach an agreement with the Dampish. This agreement will specify the terms, including the agreed-upon percentage or licensing fee to be paid for the commercial use of StellarX.
To initiate discussions regarding commercial usage, please contact Dampish through the designated channels mentioned earlier. They will be able to provide you with further information and guide you through the process of establishing a licensing arrangement tailored to your specific requirements.
# Importance of Licensing Compliance
It is crucial to respect the licensing terms to ensure the fair usage and continued development of StellarX. The revenue generated from commercial licensing supports the efforts of the Dampish in advancing the model and making it more widely accessible.
# Note on CC-BY-NC-SA-4.0
Under the CC-BY-NC-SA-4.0 license, you are allowed to modify and adapt StellarX, incorporating it into your own projects. However, any derivative work or modifications should also be shared under the same license terms, ensuring the continued openness and collaborative spirit of the project.
Please review the complete text of the CC-BY-NC-SA-4.0 license to familiarize yourself with its provisions and requirements. It is essential to comply with the terms of the license to respect the intellectual property rights and contributions of the Dampish and the wider community involved in developing StellarX.
## GPT-NeoX and Model Selection
GPT-NeoX-20B, a sibling model to StellarX, is a 20 billion parameter autoregressive language model trained on the Pile using the GPT-NeoX library. StellarX draws inspiration from the architectural advancements and performance of GPT-NeoX models. While the specifics of StellarX's architecture and parameters may differ, it benefits from the proven capabilities of GPT-NeoX and its suitability for diverse natural language processing tasks.
## Training and Evaluation
StellarX's training dataset comprises a comprehensive collection of English-language texts covering various domains, thanks to the "redpajama" dataset maintained by the "togethercomputer" group.
Evaluation of GPT-NeoX 20B has demonstrated its competence across different natural language tasks. Since this description provides only a brief summary, we refer readers to the GPT-NeoX paper (https://arxiv.org/abs/2204.06745), which compares GPT-NeoX 20B to other models on tasks such as OpenAI's LAMBADA, SciQ, PIQA, TriviaQA, and ARC Challenge.
## Limitations and Considerations
StellarX, like its sibling models, is intended primarily for research purposes. It provides a powerful foundation for extracting useful features and insights from the English language. While StellarX can be further fine-tuned and adapted for deployment, users should conduct their own risk and bias assessments before using it as a basis for downstream tasks.
It's important to note that StellarX is not intended for direct deployment without supervision. It is not designed for human-facing interactions, unlike models like ChatGPT, which have been fine-tuned using reinforcement learning from human feedback to better understand human instructions and dialogue.
Furthermore, StellarX is not limited to the English language if trained properly, and can sometimes be used for translation as well as text generation in other languages.
Lastly, users should be aware of potential biases and limitations inherent in large language models.
Special thanks to the group that created the training dataset: the RedPajama dataset used to train StellarX. Thank you, togethercomputer.
## Community and Support
To inquire about StellarX and receive support, you can join Dampish's
server and engage in discussions in the #questions channel. It is recommended to explore the existing documentation and resources available for GPT-NeoX-20B to familiarize yourself with the model before seeking assistance. For more information about GPT-NeoX, you can reach out to EleutherAI.
## Summary
StellarX, a base language model developed by the Dampish, offers impressive language capabilities and flexibility. Trained on an extensive dataset and built upon the GPT-NeoX architecture, StellarX excels in various natural language processing tasks. Its carbon-friendly and resource-efficient design makes it accessible for local device deployment. Researchers and enthusiasts can freely explore StellarX for research purposes and personal use, while commercial users should adhere to the licensing terms.
**Again, I am really grateful for the data made by togethercomputer and their willingness to open-source it. They inspired this project and sparked the idea for Stellar models; I am truly, deeply grateful to them.
-Dampish**
Discord: https://discord.gg/vasyNnUa
OR Reach out to me personally on Discord via the username: Dampish#3607
Thank you for your time.
|
kbthebest181/adadadadaorandomnamesothatnoonecanaccessbysearchinghahaha | kbthebest181 | 2023-09-18T12:06:51Z | 47 | 2 | diffusers | [
"diffusers",
"text-to-image",
"autotrain",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:finetune:stabilityai/stable-diffusion-xl-base-1.0",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
]
| text-to-image | 2023-09-17T06:33:16Z |
---
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: photo of a formal dressed boy named kanva
tags:
- text-to-image
- diffusers
- autotrain
inference: true
---
# DreamBooth trained by AutoTrain
Text encoder was not trained.
|
kamilersz/image_classification | kamilersz | 2023-09-18T12:05:23Z | 167 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| image-classification | 2023-09-18T10:43:28Z | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: image_classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.36875
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6249
- Accuracy: 0.3688
## Model description
More information needed
## Intended uses & limitations
More information needed
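A quick inference sketch for this checkpoint (the labels come from the training folder names):

```python
from PIL import Image
from transformers import pipeline

classifier = pipeline("image-classification", model="kamilersz/image_classification")

image = Image.open("example.jpg")
print(classifier(image)[0])  # top prediction: {"label": ..., "score": ...}
```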
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 1.8602 | 0.275 |
| No log | 2.0 | 80 | 1.6744 | 0.3563 |
| No log | 3.0 | 120 | 1.6277 | 0.375 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
Csnakos/ppo-LunarLander-v2 | Csnakos | 2023-09-18T12:03:48Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T12:03:27Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 250.15 +/- 17.18
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename is an assumption based on the usual SB3 upload convention; check the repo's file list):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Filename below is assumed; verify it against the repository contents.
checkpoint = load_from_hub("Csnakos/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
NewstaR/StableGalen-6b | NewstaR | 2023-09-18T11:59:55Z | 165 | 1 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"medicine",
"doctor",
"custom_code",
"en",
"dataset:Photolens/MedText-DoctorLLaMa-OpenOrca-formatted",
"dataset:shibing624/medical",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-09-18T11:10:59Z | ---
license: other
datasets:
- Photolens/MedText-DoctorLLaMa-OpenOrca-formatted
- shibing624/medical
language:
- en
tags:
- medicine
- doctor
---
# This model is the DeciLM-6b-Instruct model, trained specifically for medicine
Galen uses the following prompt templates:
```
### User: {prompt}
### Response:
```
or
```
{prompt}
```
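A generation sketch using the first template (`trust_remote_code=True` matches the card's `custom_code` tag; treat the prompt and decoding settings as placeholders):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("NewstaR/StableGalen-6b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "NewstaR/StableGalen-6b", trust_remote_code=True, device_map="auto"
)

prompt = "### User: What are common symptoms of iron deficiency?\n### Response:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```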
# Galen Training Recipe:
- target_modules = ["q_proj", "v_proj", "gate_proj", "down_proj", "up_proj", "k_proj", "o_proj"]
- Learning Rate: 4e-4
- LR Scheduler: constant
- 250 Steps
<img src="Loss.png" alt="Loss" width="600" height="400" />
## T3: 1 Hour |
miittnnss/test0384 | miittnnss | 2023-09-18T11:55:27Z | 6 | 1 | diffusers | [
"diffusers",
"stable-diffusion",
"lora",
"text-to-image",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
]
| text-to-image | 2023-09-18T11:46:38Z | ---
base_model: runwayml/stable-diffusion-v1-5
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- lora
license: creativeml-openrail-m
--- |
Toshikawa/outputs | Toshikawa | 2023-09-18T11:44:59Z | 224 | 0 | transformers | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:rinna/japanese-gpt2-small",
"base_model:finetune:rinna/japanese-gpt2-small",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-09-18T11:44:39Z | ---
license: mit
base_model: rinna/japanese-gpt2-small
tags:
- generated_from_trainer
model-index:
- name: outputs
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# outputs
This model is a fine-tuned version of [rinna/japanese-gpt2-small](https://huggingface.co/rinna/japanese-gpt2-small) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
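A minimal generation sketch (the slow-tokenizer flag is an assumption carried over from the rinna base model, which ships a sentencepiece tokenizer):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Toshikawa/outputs", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("Toshikawa/outputs")

inputs = tokenizer("こんにちは、", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```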
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
l3cube-pune/marathi-roberta | l3cube-pune | 2023-09-18T11:40:18Z | 165 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"xlm-roberta",
"fill-mask",
"mr",
"dataset:L3Cube-MahaCorpus",
"arxiv:2202.01159",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| fill-mask | 2022-03-02T23:29:05Z | ---
license: cc-by-4.0
language: mr
datasets:
- L3Cube-MahaCorpus
---
## MahaRoBERTa
MahaRoBERTa is a Marathi RoBERTa model. It is a multilingual RoBERTa (xlm-roberta-base) model fine-tuned on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets.
[Dataset link](https://github.com/l3cube-pune/MarathiNLP)
More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).
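A quick fill-mask sketch (the Marathi sentence is just an illustrative example; XLM-R based models use `<mask>` as the mask token):

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="l3cube-pune/marathi-roberta")

# Prints the top predictions for the masked token.
for pred in fill("मी शाळेत <mask>."):
    print(pred["token_str"], round(pred["score"], 3))
```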
```
@InProceedings{joshi:2022:WILDRE6,
author = {Joshi, Raviraj},
title = {L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
month = {June},
year = {2022},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {97--101}
}
```
Other Monolingual Indic BERT models are listed below: <br>
<a href='https://huggingface.co/l3cube-pune/marathi-bert-v2'> Marathi BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/marathi-roberta'> Marathi RoBERTa </a> <br>
<a href='https://huggingface.co/l3cube-pune/marathi-albert'> Marathi AlBERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/hindi-bert-v2'> Hindi BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/hindi-roberta'> Hindi RoBERTa </a> <br>
<a href='https://huggingface.co/l3cube-pune/hindi-albert'> Hindi AlBERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/hindi-marathi-dev-bert'> Dev BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/hindi-marathi-dev-roberta'> Dev RoBERTa </a> <br>
<a href='https://huggingface.co/l3cube-pune/hindi-marathi-dev-albert'> Dev AlBERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/kannada-bert'> Kannada BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/telugu-bert'> Telugu BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/malayalam-bert'> Malayalam BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/tamil-bert'> Tamil BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/gujarati-bert'> Gujarati BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/odia-bert'> Oriya BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/bengali-bert'> Bengali BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/punjabi-bert'> Punjabi BERT </a> <br>
<a href='https://huggingface.co/l3cube-pune/assamese-bert'> Assamese BERT </a> <br> |
GCYY/ppo-LunarLander-v2 | GCYY | 2023-09-18T11:39:02Z | 1 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T11:38:42Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 262.67 +/- 13.71
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename is an assumption based on the usual SB3 upload convention; check the repo's file list):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Filename below is assumed; verify it against the repository contents.
checkpoint = load_from_hub("GCYY/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
CyberHarem/todo_misa_shirobako | CyberHarem | 2023-09-18T11:37:39Z | 0 | 0 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/todo_misa_shirobako",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T11:20:23Z | ---
license: mit
datasets:
- CyberHarem/todo_misa_shirobako
pipeline_tag: text-to-image
tags:
- art
---
# Lora of todo_misa_shirobako
This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 4000, you need to download `4000/todo_misa_shirobako.pt` as the embedding and `4000/todo_misa_shirobako.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
**The best step we recommend is 4000**, with the score of 0.957. The trigger words are:
1. `todo_misa_shirobako`
2. `black_hair, short_hair, ahoge, black_eyes, grey_eyes`
For the following groups, it is not recommended to use this model and we express regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals facing application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:---------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 6000 | 0.916 | [Download](6000/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](6000/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6000/previews/nude.png) | [<NSFW, click to see>](6000/previews/nude2.png) |  |  |
| 5600 | 0.921 | [Download](5600/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5600/previews/nude.png) | [<NSFW, click to see>](5600/previews/nude2.png) |  |  |
| 5200 | 0.917 | [Download](5200/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](5200/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5200/previews/nude.png) | [<NSFW, click to see>](5200/previews/nude2.png) |  |  |
| 4800 | 0.917 | [Download](4800/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4800/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4800/previews/nude.png) | [<NSFW, click to see>](4800/previews/nude2.png) |  |  |
| 4400 | 0.839 | [Download](4400/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4400/previews/nude.png) | [<NSFW, click to see>](4400/previews/nude2.png) |  |  |
| **4000** | **0.957** | [**Download**](4000/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](4000/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4000/previews/nude.png) | [<NSFW, click to see>](4000/previews/nude2.png) |  |  |
| 3600 | 0.919 | [Download](3600/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3600/previews/nude.png) | [<NSFW, click to see>](3600/previews/nude2.png) |  |  |
| 3200 | 0.936 | [Download](3200/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](3200/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3200/previews/nude.png) | [<NSFW, click to see>](3200/previews/nude2.png) |  |  |
| 2800 | 0.803 | [Download](2800/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2800/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2800/previews/nude.png) | [<NSFW, click to see>](2800/previews/nude2.png) |  |  |
| 2400 | 0.893 | [Download](2400/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2400/previews/nude.png) | [<NSFW, click to see>](2400/previews/nude2.png) |  |  |
| 2000 | 0.908 | [Download](2000/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](2000/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2000/previews/nude.png) | [<NSFW, click to see>](2000/previews/nude2.png) |  |  |
| 1600 | 0.874 | [Download](1600/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1600/previews/nude.png) | [<NSFW, click to see>](1600/previews/nude2.png) |  |  |
| 1200 | 0.883 | [Download](1200/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](1200/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1200/previews/nude.png) | [<NSFW, click to see>](1200/previews/nude2.png) |  |  |
| 800 | 0.835 | [Download](800/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](800/previews/bondage.png) |  |  |  | [<NSFW, click to see>](800/previews/nude.png) | [<NSFW, click to see>](800/previews/nude2.png) |  |  |
| 400 | 0.694 | [Download](400/todo_misa_shirobako.zip) |  |  |  |  |  |  |  |  |  |  |  |  | [<NSFW, click to see>](400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](400/previews/nude.png) | [<NSFW, click to see>](400/previews/nude2.png) |  |  |
|
Alexzyx/models_sd_2-1 | Alexzyx | 2023-09-18T11:35:14Z | 0 | 0 | diffusers | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"lora",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
]
| text-to-image | 2023-09-18T07:14:20Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: a photo of sks dog
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA DreamBooth - Alexzyx/models_sd_2-1
These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were trained on "a photo of sks dog" using [DreamBooth](https://dreambooth.github.io/). You can find some example images below.




LoRA for the text encoder was enabled: False.
|
raowaqas123/llma_v1 | raowaqas123 | 2023-09-18T11:27:52Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T11:13:59Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
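For reference, the listed settings correspond to a `transformers` `BitsAndBytesConfig`; a minimal sketch with the values copied from the list above:
```python
# Minimal sketch: rebuild the quantization config listed above.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
    llm_int8_threshold=6.0,
)
# Pass as `quantization_config=bnb_config` to `AutoModelForCausalLM.from_pretrained`.
```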
### Framework versions
- PEFT 0.5.0
|
re2panda/polyglot_12b_grade_school_math2 | re2panda | 2023-09-18T11:25:05Z | 0 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-12T05:40:03Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.6.0.dev0
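To run inference with these adapter weights, attach them to the base model via PEFT; a minimal sketch (the base checkpoint is an assumption inferred from the repo name, not stated in this card):
```python
# Minimal sketch: attach the adapter to its presumed base model.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("EleutherAI/polyglot-ko-12.8b")  # assumed base
model = PeftModel.from_pretrained(base, "re2panda/polyglot_12b_grade_school_math2")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/polyglot-ko-12.8b")
```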
|
Sunny98/dqn-SpaceInvadersNoFrameskip-v4 | Sunny98 | 2023-09-18T11:10:49Z | 2 | 0 | stable-baselines3 | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
]
| reinforcement-learning | 2023-09-18T11:10:08Z | ---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 709.00 +/- 320.43
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Sunny98 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Sunny98 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga Sunny98
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
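The agent can also be loaded directly, without the Zoo scripts; a minimal sketch (the checkpoint filename follows the usual `huggingface_sb3` naming convention and is an assumption):
```python
# Minimal sketch: load the trained agent outside the RL Zoo.
from huggingface_sb3 import load_from_hub
from stable_baselines3 import DQN

checkpoint = load_from_hub(
    repo_id="Sunny98/dqn-SpaceInvadersNoFrameskip-v4",
    filename="dqn-SpaceInvadersNoFrameskip-v4.zip",  # assumed filename
)
model = DQN.load(checkpoint)
```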
|
Toshikawa/llm_summer_school_2023_1 | Toshikawa | 2023-09-18T11:05:27Z | 226 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:rinna/japanese-gpt2-small",
"base_model:finetune:rinna/japanese-gpt2-small",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
]
| text-generation | 2023-09-18T10:38:13Z | ---
license: mit
base_model: rinna/japanese-gpt2-small
tags:
- generated_from_trainer
model-index:
- name: llm_summer_school_2023_1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# llm_summer_school_2023_1
This model is a fine-tuned version of [rinna/japanese-gpt2-small](https://huggingface.co/rinna/japanese-gpt2-small) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
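A minimal inference sketch; the tokenizer settings (`use_fast=False`, loaded from the rinna base model) are assumptions carried over from the base checkpoint, not confirmed for this fine-tune:
```python
# Minimal sketch: generate text with the fine-tuned checkpoint.
# Tokenizer is taken from the base model; use_fast=False follows rinna's usage notes.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("rinna/japanese-gpt2-small", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("Toshikawa/llm_summer_school_2023_1")

inputs = tokenizer("こんにちは、", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```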
|
Yntec/animeTWO | Yntec | 2023-09-18T10:59:02Z | 439 | 8 | diffusers | [
"diffusers",
"safetensors",
"Anime",
"Space",
"Cats",
"realisticElves",
"verxonous86495",
"text-to-image",
"stable-diffusion",
"stable-diffusion-diffusers",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
]
| text-to-image | 2023-09-18T07:28:09Z | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Anime
- Space
- Cats
- realisticElves
- verxonous86495
- text-to-image
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
---
# animeTWO
No-EMA version of this model. animeTWOz has the z-vae baked in.
Sample and prompt:

idyllic particulate sparkling atmospheric, pretty CUTE little girl, 1940, Magazine ad, Iconic. beautiful detailed legs, unreal 5, daz, hyperrealistic, octane render, Painterly soft brush, shy modest pleasing palette, textured, detailed, flawless, perfect, mural - sized chibi character design key visual symmetrical headshot portrait by yoshitomo nara and ROSSDRAWS
Original pages:
https://civitai.com/models/40245?modelVersionId=45715 (animeTWO)
https://civitai.com/models/97653/z-vae (z-vae) |
AlketaR/ludwig0 | AlketaR | 2023-09-18T10:58:55Z | 1 | 0 | peft | [
"peft",
"region:us"
]
| null | 2023-09-18T10:29:45Z | ---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
|
CyberHarem/himekawa_yuki_idolmastercinderellagirls | CyberHarem | 2023-09-18T10:51:28Z | 0 | 1 | null | [
"art",
"text-to-image",
"dataset:CyberHarem/himekawa_yuki_idolmastercinderellagirls",
"license:mit",
"region:us"
]
| text-to-image | 2023-09-18T10:32:40Z | ---
license: mit
datasets:
- CyberHarem/himekawa_yuki_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---
# Lora of himekawa_yuki_idolmastercinderellagirls
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 6160, you need to download `6160/himekawa_yuki_idolmastercinderellagirls.pt` as the embedding and `6160/himekawa_yuki_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.
**The best step we recommend is 6160**, with a score of 0.927. The trigger words are:
1. `himekawa_yuki_idolmastercinderellagirls`
2. `long_hair, brown_hair, green_eyes, blush, hair_ornament, hairclip, smile, open_mouth, breasts`
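As a rough illustration of the embedding-plus-LoRA pairing described above, here is a `diffusers` sketch (paths are illustrative, and HCP-Diffusion outputs may need conversion before diffusers' loaders accept them):
```python
# Rough sketch only: pair the .pt embedding with the .safetensors LoRA.
# HCP-Diffusion files may require conversion first; paths are illustrative.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("Meina/MeinaMix_V11")
pipe.load_textual_inversion(
    "6160/himekawa_yuki_idolmastercinderellagirls.pt",
    token="himekawa_yuki_idolmastercinderellagirls",
)
pipe.load_lora_weights(
    "6160", weight_name="himekawa_yuki_idolmastercinderellagirls.safetensors"
)

image = pipe("himekawa_yuki_idolmastercinderellagirls, long_hair, brown_hair, smile").images[0]
image.save("preview.png")
```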
For the following groups, use of this model is not recommended, and we express our regret:
1. Individuals who cannot tolerate any deviation from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness of AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or who believe that character models must be trained purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
These are available steps:
| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | pattern_10 | pattern_11 | pattern_12 | pattern_13 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------|:-------------------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------|
| 8400 | 0.892 | [Download](8400/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](8400/previews/pattern_2.png) |  |  | [<NSFW, click to see>](8400/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](8400/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](8400/previews/pattern_13.png) | [<NSFW, click to see>](8400/previews/bikini.png) | [<NSFW, click to see>](8400/previews/bondage.png) |  |  |  | [<NSFW, click to see>](8400/previews/nude.png) | [<NSFW, click to see>](8400/previews/nude2.png) |  |  |
| 7840 | 0.892 | [Download](7840/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](7840/previews/pattern_2.png) |  |  | [<NSFW, click to see>](7840/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](7840/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](7840/previews/pattern_13.png) | [<NSFW, click to see>](7840/previews/bikini.png) | [<NSFW, click to see>](7840/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7840/previews/nude.png) | [<NSFW, click to see>](7840/previews/nude2.png) |  |  |
| 7280 | 0.904 | [Download](7280/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](7280/previews/pattern_2.png) |  |  | [<NSFW, click to see>](7280/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](7280/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](7280/previews/pattern_13.png) | [<NSFW, click to see>](7280/previews/bikini.png) | [<NSFW, click to see>](7280/previews/bondage.png) |  |  |  | [<NSFW, click to see>](7280/previews/nude.png) | [<NSFW, click to see>](7280/previews/nude2.png) |  |  |
| 6720 | 0.836 | [Download](6720/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](6720/previews/pattern_2.png) |  |  | [<NSFW, click to see>](6720/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](6720/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](6720/previews/pattern_13.png) | [<NSFW, click to see>](6720/previews/bikini.png) | [<NSFW, click to see>](6720/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6720/previews/nude.png) | [<NSFW, click to see>](6720/previews/nude2.png) |  |  |
| **6160** | **0.927** | [**Download**](6160/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](6160/previews/pattern_2.png) |  |  | [<NSFW, click to see>](6160/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](6160/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](6160/previews/pattern_13.png) | [<NSFW, click to see>](6160/previews/bikini.png) | [<NSFW, click to see>](6160/previews/bondage.png) |  |  |  | [<NSFW, click to see>](6160/previews/nude.png) | [<NSFW, click to see>](6160/previews/nude2.png) |  |  |
| 5600 | 0.878 | [Download](5600/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](5600/previews/pattern_2.png) |  |  | [<NSFW, click to see>](5600/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](5600/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](5600/previews/pattern_13.png) | [<NSFW, click to see>](5600/previews/bikini.png) | [<NSFW, click to see>](5600/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5600/previews/nude.png) | [<NSFW, click to see>](5600/previews/nude2.png) |  |  |
| 5040 | 0.813 | [Download](5040/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](5040/previews/pattern_2.png) |  |  | [<NSFW, click to see>](5040/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](5040/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](5040/previews/pattern_13.png) | [<NSFW, click to see>](5040/previews/bikini.png) | [<NSFW, click to see>](5040/previews/bondage.png) |  |  |  | [<NSFW, click to see>](5040/previews/nude.png) | [<NSFW, click to see>](5040/previews/nude2.png) |  |  |
| 4480 | 0.846 | [Download](4480/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](4480/previews/pattern_2.png) |  |  | [<NSFW, click to see>](4480/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](4480/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](4480/previews/pattern_13.png) | [<NSFW, click to see>](4480/previews/bikini.png) | [<NSFW, click to see>](4480/previews/bondage.png) |  |  |  | [<NSFW, click to see>](4480/previews/nude.png) | [<NSFW, click to see>](4480/previews/nude2.png) |  |  |
| 3920 | 0.800 | [Download](3920/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](3920/previews/pattern_2.png) |  |  | [<NSFW, click to see>](3920/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](3920/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](3920/previews/pattern_13.png) | [<NSFW, click to see>](3920/previews/bikini.png) | [<NSFW, click to see>](3920/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3920/previews/nude.png) | [<NSFW, click to see>](3920/previews/nude2.png) |  |  |
| 3360 | 0.810 | [Download](3360/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](3360/previews/pattern_2.png) |  |  | [<NSFW, click to see>](3360/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](3360/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](3360/previews/pattern_13.png) | [<NSFW, click to see>](3360/previews/bikini.png) | [<NSFW, click to see>](3360/previews/bondage.png) |  |  |  | [<NSFW, click to see>](3360/previews/nude.png) | [<NSFW, click to see>](3360/previews/nude2.png) |  |  |
| 2800 | 0.800 | [Download](2800/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](2800/previews/pattern_2.png) |  |  | [<NSFW, click to see>](2800/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](2800/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](2800/previews/pattern_13.png) | [<NSFW, click to see>](2800/previews/bikini.png) | [<NSFW, click to see>](2800/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2800/previews/nude.png) | [<NSFW, click to see>](2800/previews/nude2.png) |  |  |
| 2240 | 0.836 | [Download](2240/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](2240/previews/pattern_2.png) |  |  | [<NSFW, click to see>](2240/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](2240/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](2240/previews/pattern_13.png) | [<NSFW, click to see>](2240/previews/bikini.png) | [<NSFW, click to see>](2240/previews/bondage.png) |  |  |  | [<NSFW, click to see>](2240/previews/nude.png) | [<NSFW, click to see>](2240/previews/nude2.png) |  |  |
| 1680 | 0.823 | [Download](1680/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](1680/previews/pattern_2.png) |  |  | [<NSFW, click to see>](1680/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](1680/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](1680/previews/pattern_13.png) | [<NSFW, click to see>](1680/previews/bikini.png) | [<NSFW, click to see>](1680/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1680/previews/nude.png) | [<NSFW, click to see>](1680/previews/nude2.png) |  |  |
| 1120 | 0.688 | [Download](1120/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](1120/previews/pattern_2.png) |  |  | [<NSFW, click to see>](1120/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](1120/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](1120/previews/pattern_13.png) | [<NSFW, click to see>](1120/previews/bikini.png) | [<NSFW, click to see>](1120/previews/bondage.png) |  |  |  | [<NSFW, click to see>](1120/previews/nude.png) | [<NSFW, click to see>](1120/previews/nude2.png) |  |  |
| 560 | 0.492 | [Download](560/himekawa_yuki_idolmastercinderellagirls.zip) |  | [<NSFW, click to see>](560/previews/pattern_2.png) |  |  | [<NSFW, click to see>](560/previews/pattern_5.png) |  |  |  | [<NSFW, click to see>](560/previews/pattern_9.png) |  |  |  | [<NSFW, click to see>](560/previews/pattern_13.png) | [<NSFW, click to see>](560/previews/bikini.png) | [<NSFW, click to see>](560/previews/bondage.png) |  |  |  | [<NSFW, click to see>](560/previews/nude.png) | [<NSFW, click to see>](560/previews/nude2.png) |  |  |
|
ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch | ShivamMangale | 2023-09-18T10:50:39Z | 120 | 0 | transformers | [
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"base_model:FacebookAI/xlm-roberta-base",
"base_model:finetune:FacebookAI/xlm-roberta-base",
"license:mit",
"endpoints_compatible",
"region:us"
]
| question-answering | 2023-09-18T10:36:17Z | ---
license: mit
base_model: xlm-roberta-base
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
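A minimal usage sketch with the question-answering pipeline (the question and context are illustrative):
```python
# Minimal sketch: extractive QA with the fine-tuned checkpoint.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="ShivamMangale/XLM-Roberta-base-finetuned-squad-syn-first-10k-5-epoch",
)
result = qa(
    question="Where is the Eiffel Tower?",
    context="The Eiffel Tower is a landmark in Paris, France.",
)
print(result["answer"], result["score"])
```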
|
TamerAbdelaziz/distilbert-base-uncased-finetuned-imdb | TamerAbdelaziz | 2023-09-18T10:43:50Z | 62 | 0 | transformers | [
"transformers",
"tf",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
]
| text-classification | 2023-09-18T07:23:59Z | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: TamerAbdelaziz/distilbert-base-uncased-finetuned-imdb
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# TamerAbdelaziz/distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1268
- Validation Loss: 0.1841
- Train Accuracy: 0.9326
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 3124, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.2559 | 0.1831 | 0.9287 | 0 |
| 0.1268 | 0.1841 | 0.9326 | 1 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3
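A minimal TensorFlow inference sketch (the id-to-label mapping is not documented in this card, so only raw probabilities are printed):
```python
# Minimal sketch: sentiment inference with the TF checkpoint.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

name = "TamerAbdelaziz/distilbert-base-uncased-finetuned-imdb"
tokenizer = AutoTokenizer.from_pretrained(name)
model = TFAutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("A wonderful, heartfelt film.", return_tensors="tf")
probs = tf.nn.softmax(model(**inputs).logits, axis=-1)
print(probs.numpy())  # label order is not documented in this card
```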
|