pipeline_tag (stringclasses, 48 values) | library_name (stringclasses, 205 values) | text (stringlengths, 0–18.3M) | metadata (stringlengths, 2–1.07B) | id (stringlengths, 5–122) | last_modified (null) | tags (sequencelengths, 1–1.84k) | sha (null) | created_at (stringlengths, 25) |
---|---|---|---|---|---|---|---|---|
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
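Since no snippet is provided yet, a minimal hedged sketch is given below. It assumes the repository id from this card's metadata (`sreddy109/large-v0-200`) and the standard 🤗 `pipeline` API for the card's `text-classification` tag; it is a starting point, not the authors' documented usage.
```python
from transformers import pipeline

# Hedged quick-start: load this checkpoint as a text-classification pipeline.
classifier = pipeline("text-classification", model="sreddy109/large-v0-200")

# Returns a list of {label, score} dicts, one per input string.
print(classifier("Replace this with your own input text."))
```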
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-200 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:50:51+00:00 |
text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
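Since no snippet is provided yet, the hedged sketch below assumes the repository id from this card's metadata (`cilantro9246/8rr4nts`), that the tokenizer ships a chat template (the card is tagged `conversational`), and the standard 🤗 Transformers generation API; it is a starting point, not the authors' documented usage.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "cilantro9246/8rr4nts"  # from this card's metadata
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

# Build a chat prompt via the tokenizer's chat template, then generate a reply.
messages = [{"role": "user", "content": "Hello!"}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True))
```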
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | cilantro9246/8rr4nts | null | [
"transformers",
"safetensors",
"stablelm",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:51:20+00:00 |
text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
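Since no snippet is provided yet, the hedged sketch below assumes the repository id from this card's metadata (`shallow6414/t5oncme`) and the standard 🤗 `pipeline` API for the card's `text-generation` tag; it is a starting point, not the authors' documented usage.
```python
from transformers import pipeline

# Hedged quick-start: plain text-generation pipeline for this checkpoint.
generator = pipeline("text-generation", model="shallow6414/t5oncme", device_map="auto")
print(generator("Once upon a time", max_new_tokens=50)[0]["generated_text"])
```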
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | shallow6414/t5oncme | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T18:51:36+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-250 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:51:48+00:00 |
null | null | {"license": "bigscience-openrail-m"} | moonligght/RmBTS | null | [
"license:bigscience-openrail-m",
"region:us"
] | null | 2024-05-01T18:52:07+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-300 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:52:46+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-350 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:53:41+00:00 |
text-generation | transformers |
# Uploaded model
- **Developed by:** mcgalleg
- **License:** apache-2.0
- **Finetuned from model :** unsloth/mistral-7b-instruct-v0.2-bnb-4bit
This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
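A hedged loading sketch is shown below; it assumes the standard Unsloth `FastLanguageModel` API and that this repository holds a complete 4-bit checkpoint (as the repo name suggests), rather than the author's documented usage.
```python
from unsloth import FastLanguageModel

# Hedged sketch: load the 4-bit checkpoint with Unsloth for inference.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="mcgalleg/mistral-7b-bnb-4bit-forced",
    max_seq_length=2048,  # assumed; set to the context length you need
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch Unsloth into its faster inference mode
```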
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "mistral", "trl", "sft"], "base_model": "unsloth/mistral-7b-instruct-v0.2-bnb-4bit"} | mcgalleg/mistral-7b-bnb-4bit-forced | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"4-bit",
"region:us"
] | null | 2024-05-01T18:54:10+00:00 |
null | transformers |
# Uploaded model
- **Developed by:** achintyasharma
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit
This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
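Since the repository id (`lora_model`) suggests LoRA adapter weights rather than a full checkpoint, the hedged sketch below attaches them to the stated base model via 🤗 PEFT; this is an assumption about the repo layout, not the author's documented usage.
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/llama-3-8b-bnb-4bit"  # base model stated on this card
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach the LoRA adapter weights from this repository to the base model.
model = PeftModel.from_pretrained(base, "achintyasharma/lora_model")
```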
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-bnb-4bit"} | achintyasharma/lora_model | null | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:unsloth/llama-3-8b-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:54:11+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-400 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:54:42+00:00 |
null | null |
# Text Classification
This model is a fine-tuned version of XLM-RoBERTa (XLM-R) for text classification in Azerbaijani. XLM-RoBERTa is a powerful multilingual model that supports 100+ languages; our fine-tuned model takes advantage of its language-agnostic capabilities to enhance performance on Azerbaijani text classification, with the goal of accurately categorizing and analyzing Azerbaijani text inputs.
# How to Use
This model can be loaded and used for prediction using the Hugging Face Transformers library. Below is an example code snippet in Python:
```python
from transformers import MBartForSequenceClassification, MBartTokenizer
from transformers import pipeline

# Path to the locally saved fine-tuned checkpoint
model_path = r"/home/user/Desktop/Synthetic data/models/model_bart_saved"
model = MBartForSequenceClassification.from_pretrained(model_path)
tokenizer = MBartTokenizer.from_pretrained(model_path)

# Wrap the model and tokenizer in a sentiment-analysis pipeline and classify a sample sentence.
nlp = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
# Roughly: "In the country we live in, doing good is one of the main indicators of quality."
print(nlp("Yaşadığımız ölkədə xeyirxahlıq etmək əsas keyfiyyət göstəricilərindən biridir"))
```
Result:
```
[{'label': 'positive', 'score': 0.9997604489326477}]
```
# Limitations and Bias
For text classification tasks, the model's performance may be limited because it was fine-tuned for only one epoch. As a result, it may not fully capture the intricacies of the Azerbaijani language or the full scope of the classification task. Users should also be aware of potential biases in the training data that may affect the model's effectiveness on specific types of text or classification categories.
# Ethical Considerations
Users should approach automated text classification systems with responsibility and awareness of the ethical implications that may arise from their use. These systems can be useful in a variety of contexts, but they are not infallible and may sometimes produce incorrect or inappropriate outputs.
In sensitive or high-stakes contexts, it is essential to exercise caution and verify the predictions produced by the system. Users should also be mindful of the potential consequences of relying on automated systems and consider seeking guidance from human experts when necessary.
Furthermore, users should understand the limitations of automated systems and avoid using them to make important decisions without proper human oversight. They should also recognize that these systems may perpetuate or amplify biases present in their training data, and take steps to mitigate any negative impacts.
In summary, while automated text classification systems can be valuable tools, they should be used responsibly, ethically, and with an understanding of their limitations and potential risks.
# Citation
Please cite this model as follows:
```bibtex
@misc{alasdevcenter_text_classification_2024,
  author    = {Alas Development Center},
  title     = {Text Classification},
  year      = {2024},
  url       = {https://huggingface.co/alasdevcenter/text classification},
  doi       = {10.57967/hf/2027},
  publisher = {Hugging Face}
}
```
| {} | Ilkinism/ilmetin | null | [
"region:us"
] | null | 2024-05-01T18:55:05+00:00 |
text-generation | transformers |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to('cuda'))
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "library_name": "transformers", "tags": ["autotrain", "text-generation-inference", "text-generation", "peft"], "widget": [{"messages": [{"role": "user", "content": "What is your favorite condiment?"}]}]} | ambrosfitz/llama-3-history | null | [
"transformers",
"tensorboard",
"safetensors",
"llama",
"text-generation",
"autotrain",
"text-generation-inference",
"peft",
"conversational",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:55:28+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-450 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:55:34+00:00 |
text2text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CS505_COQE_viT5_total_Instruction0_ASPOL_v1_h0
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
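For reference, the listed hyperparameters map onto a 🤗 `TrainingArguments` configuration roughly as in the sketch below (the `output_dir` value and the `fp16=True` reading of "Native AMP" are assumptions):
```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above as TrainingArguments.
training_args = TrainingArguments(
    output_dir="CS505_COQE_viT5_total_Instruction0_ASPOL_v1_h0",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed precision
)
```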
### Training results
### Framework versions
- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "VietAI/vit5-large", "model-index": [{"name": "CS505_COQE_viT5_total_Instruction0_ASPOL_v1_h0", "results": []}]} | ThuyNT/CS505_COQE_viT5_total_Instruction0_ASPOL_v1_h0 | null | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T18:55:35+00:00 |
fill-mask | transformers | {} | warleagle/ruRoberta-large-mlm_tuned | null | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:55:51+00:00 |
null | null |
# Text Classification
This model is a fine-tuned version of XLM-RoBERTa (XLM-R) for text classification in Azerbaijani. XLM-RoBERTa is a powerful multilingual model that supports 100+ languages; our fine-tuned model takes advantage of its language-agnostic capabilities to enhance performance on Azerbaijani text classification, with the goal of accurately categorizing and analyzing Azerbaijani text inputs.
# How to Use
This model can be loaded and used for prediction using the Hugging Face Transformers library. Below is an example code snippet in Python:
```python
from transformers import MBartForSequenceClassification, MBartTokenizer
from transformers import pipeline

# Path to the locally saved fine-tuned checkpoint
model_path = r"/home/user/Desktop/Synthetic data/models/model_bart_saved"
model = MBartForSequenceClassification.from_pretrained(model_path)
tokenizer = MBartTokenizer.from_pretrained(model_path)

# Wrap the model and tokenizer in a sentiment-analysis pipeline and classify a sample sentence.
nlp = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
# Roughly: "In the country we live in, doing good is one of the main indicators of quality."
print(nlp("Yaşadığımız ölkədə xeyirxahlıq etmək əsas keyfiyyət göstəricilərindən biridir"))
```
Result:
```
[{'label': 'positive', 'score': 0.9997604489326477}]
```
# Limitations and Bias
For text classification tasks, the model's performance may be limited because it was fine-tuned for only one epoch. As a result, it may not fully capture the intricacies of the Azerbaijani language or the full scope of the classification task. Users should also be aware of potential biases in the training data that may affect the model's effectiveness on specific types of text or classification categories.
# Ethical Considerations
Users should approach automated text classification systems with responsibility and awareness of the ethical implications that may arise from their use. These systems can be useful in a variety of contexts, but they are not infallible and may sometimes produce incorrect or inappropriate outputs.
In sensitive or high-stakes contexts, it is essential to exercise caution and verify the predictions produced by the system. Users should also be mindful of the potential consequences of relying on automated systems and consider seeking guidance from human experts when necessary.
Furthermore, users should understand the limitations of automated systems and avoid using them to make important decisions without proper human oversight. They should also recognize that these systems may perpetuate or amplify biases present in their training data, and take steps to mitigate any negative impacts.
In summary, while automated text classification systems can be valuable tools, they should be used responsibly, ethically, and with an understanding of their limitations and potential risks.
# Citation
Please cite this model as follows:
```bibtex
@misc{alasdevcenter_text_classification_2024,
  author    = {Alas Development Center},
  title     = {Text Classification},
  year      = {2024},
  url       = {https://huggingface.co/alasdevcenter/text classification},
  doi       = {10.57967/hf/2027},
  publisher = {Hugging Face}
}
```
| {} | Ilkinism/ilmetin1 | null | [
"region:us"
] | null | 2024-05-01T18:56:13+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
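Pending details from the authors, a minimal sketch of one plausible way to load the checkpoint, assuming it works with the standard Auto classes for sequence classification (the repository tags list `xlm-roberta` and `text-classification`); the example input is a placeholder:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Assumption: the checkpoint exposes a standard sequence-classification head
model_id = "sreddy109/large-v0-500"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Replace this with an input sentence for your task."))  # placeholder input
```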
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-500 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:56:30+00:00 |
null | null | {} | genai-proj/gpt-2-bigger | null | [
"region:us"
] | null | 2024-05-01T18:56:49+00:00 |
|
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
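Pending details from the authors, a minimal sketch, assuming the checkpoint loads with the standard Auto classes for sequence classification (tags: `xlm-roberta`, `text-classification`); the example input is a placeholder:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Assumption: the checkpoint exposes a standard sequence-classification head
model_id = "sreddy109/large-v0-550"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Replace this with an input sentence for your task."))  # placeholder input
```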
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-550 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:57:26+00:00 |
text2text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CS505_COQE_viT5_total_Instruction0_APSOL_v1_h0
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
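Pending details from the authors, a minimal inference sketch, assuming the checkpoint is used like its ViT5 base for text-to-text generation; the input string is a placeholder, and the task-specific prompt format is not documented here:
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "ThuyNT/CS505_COQE_viT5_total_Instruction0_APSOL_v1_h0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder input: the expected task-specific formatting is not documented
inputs = tokenizer("Replace this with a task-formatted input sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```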
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "VietAI/vit5-large", "model-index": [{"name": "CS505_COQE_viT5_total_Instruction0_APSOL_v1_h0", "results": []}]} | ThuyNT/CS505_COQE_viT5_total_Instruction0_APSOL_v1_h0 | null | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T18:57:31+00:00 |
text-classification | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
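Pending details from the authors, a minimal sketch, assuming the checkpoint loads with the standard Auto classes for sequence classification (tags: `xlm-roberta`, `text-classification`); the example input is a placeholder:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Assumption: the checkpoint exposes a standard sequence-classification head
model_id = "sreddy109/large-v0-600"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Replace this with an input sentence for your task."))  # placeholder input
```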
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | sreddy109/large-v0-600 | null | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T18:58:21+00:00 |
null | adapter-transformers |
# text classification
This model is a fine-tuned version of XLM-RoBERTa (XLM-R) on a text classification dataset in Azerbaijani. XLM-RoBERTa is a powerful multilingual model that supports 100+ languages. Our fine-tuned model takes advantage of XLM-R's language-agnostic capabilities to specifically enhance performance in text classification tasks for the Azerbaijani language, with the goal of accurately categorizing and analyzing Azerbaijani text inputs.
# How to Use
This model can be loaded and used for prediction using the Hugging Face Transformers library. Below is an example code snippet in Python:
```python
from transformers import MBartForSequenceClassification, MBartTokenizer, pipeline

# Load the fine-tuned model and its tokenizer from a local checkpoint directory
model_path = r"/home/user/Desktop/Synthetic data/models/model_bart_saved"
model = MBartForSequenceClassification.from_pretrained(model_path)
tokenizer = MBartTokenizer.from_pretrained(model_path)

# Build a sentiment-analysis pipeline and classify an Azerbaijani sentence
nlp = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
print(nlp("Yaşadığımız ölkədə xeyirxahlıq etmək əsas keyfiyyət göstəricilərindən biridir"))
```
Result:
```
[{'label': 'positive', 'score': 0.9997604489326477}]
```
# Limitations and Bias
Because the model was fine-tuned for just one epoch, its performance on text classification tasks may be limited: it may not fully capture the intricacies of the Azerbaijani language or the full scope of the classification task. Users should also be aware of potential biases in the training data that may affect the model's effectiveness on specific types of text or classification categories.
# Ethical Considerations
It is crucial for users to approach automated question-answering systems with responsibility and an awareness of the ethical implications that may arise from their use. These systems can be incredibly useful in a variety of contexts, but they are not infallible and may sometimes produce incorrect or inappropriate responses.
In sensitive or high-stakes contexts, it is essential to exercise caution and verify the information provided by the system. Users should also be mindful of the potential consequences of relying on automated systems and consider seeking guidance from human experts when necessary.
Furthermore, users should be aware of the limitations of automated question-answering systems and avoid using them to make important decisions without proper human oversight. They should also recognize that these systems may perpetuate or amplify biases present in their training data, and take steps to mitigate any negative impacts.
In summary, while automated question-answering systems can be valuable tools, they should be used responsibly, ethically, and with an understanding of their limitations and potential risks.
# Citation
Please cite this model as follows:
```
@misc{alasdevcenter_text_classification_2024,
  author    = {Alas Development Center},
  title     = {text classification},
  year      = {2024},
  url       = {https://huggingface.co/alasdevcenter/text classification},
  doi       = {10.57967/hf/2027},
  publisher = {Hugging Face}
}
```
| {"language": "az", "license": "apache-2.0", "library_name": "adapter-transformers"} | Ilkinism/ilmetin2 | null | [
"adapter-transformers",
"az",
"license:apache-2.0",
"region:us"
] | null | 2024-05-01T18:58:40+00:00 |
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
llamaft5 - bnb 4bits
- Model creator: https://huggingface.co/Aspik101/
- Original model: https://huggingface.co/Aspik101/llamaft5/
Original model description:
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
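Pending details from the original author, a minimal sketch, assuming the pre-quantized 4-bit weights load directly through `transformers` with `bitsandbytes` installed (the quantization config is assumed to be stored with the checkpoint):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/Aspik101_-_llamaft5-4bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Placeholder prompt; the model's expected chat format is not documented here
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```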
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {} | RichardErkhov/Aspik101_-_llamaft5-4bits | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | null | 2024-05-01T19:01:55+00:00 |
null | null | {} | andrealexroom/MultiARoomv0.0.0.1.3 | null | [
"safetensors",
"region:us"
] | null | 2024-05-01T19:02:10+00:00 |
|
text-to-image | diffusers |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# SDXL LoRA DreamBooth - embracellm/sushi22_LoRA
<Gallery />
## Model description
These are embracellm/sushi22_LoRA LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use a photo of Tuna Avocado Roll to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](embracellm/sushi22_LoRA/tree/main) them in the Files & versions tab.
## Intended uses & limitations
#### How to use
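A minimal sketch of one plausible way to run the adapter, assuming it loads through diffusers' standard `load_lora_weights` API on top of the SDXL base model named above:
```python
import torch
from diffusers import AutoPipelineForText2Image

# Load the SDXL base pipeline, then attach this repository's LoRA weights
pipeline = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipeline.load_lora_weights("embracellm/sushi22_LoRA")

# Use the trigger phrase from this card to activate the learned concept
image = pipeline("a photo of Tuna Avocado Roll").images[0]
image.save("tuna_avocado_roll.png")
```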
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | {"license": "openrail++", "library_name": "diffusers", "tags": ["text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "a photo of Tuna Avocado Roll", "widget": []} | embracellm/sushi22_LoRA | null | [
"diffusers",
"tensorboard",
"text-to-image",
"diffusers-training",
"dora",
"template:sd-lora",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | null | 2024-05-01T19:02:27+00:00 |
null | null | {} | surya-201801/demoModelRepo1 | null | [
"region:us"
] | null | 2024-05-01T19:02:35+00:00 |
|
text-generation | transformers |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to('cuda'))
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "library_name": "transformers", "tags": ["autotrain", "text-generation-inference", "text-generation"], "widget": [{"messages": [{"role": "user", "content": "What is your favorite condiment?"}]}]} | abhishek/autotrain-mixtral-8x7b-orpo-v2 | null | [
"transformers",
"tensorboard",
"safetensors",
"mixtral",
"text-generation",
"autotrain",
"text-generation-inference",
"conversational",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:03:38+00:00 |
null | adapter-transformers |
# text classification
This model is a fine-tuned version of XLM-RoBERTa (XLM-R) on a text classification dataset in Azerbaijani. XLM-RoBERTa is a powerful multilingual model that supports 100+ languages. Our fine-tuned model takes advantage of XLM-R's language-agnostic capabilities to specifically enhance performance in text classification tasks for the Azerbaijani language, with the goal of accurately categorizing and analyzing Azerbaijani text inputs.
# How to Use
This model can be loaded and used for prediction using the Hugging Face Transformers library. Below is an example code snippet in Python:
```python
from transformers import MBartForSequenceClassification, MBartTokenizer, pipeline

# Load the fine-tuned model and its tokenizer from a local checkpoint directory
model_path = r"/home/user/Desktop/Synthetic data/models/model_bart_saved"
model = MBartForSequenceClassification.from_pretrained(model_path)
tokenizer = MBartTokenizer.from_pretrained(model_path)

# Build a sentiment-analysis pipeline and classify an Azerbaijani sentence
nlp = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
print(nlp("Yaşadığımız ölkədə xeyirxahlıq etmək əsas keyfiyyət göstəricilərindən biridir"))
```
Result:
```
[{'label': 'positive', 'score': 0.9997604489326477}]
```
# Limitations and Bias
Because the model was fine-tuned for just one epoch, its performance on text classification tasks may be limited: it may not fully capture the intricacies of the Azerbaijani language or the full scope of the classification task. Users should also be aware of potential biases in the training data that may affect the model's effectiveness on specific types of text or classification categories.
# Ethical Considerations
It is crucial for users to approach automated question-answering systems with responsibility and an awareness of the ethical implications that may arise from their use. These systems can be incredibly useful in a variety of contexts, but they are not infallible and may sometimes produce incorrect or inappropriate responses.
In sensitive or high-stakes contexts, it is essential to exercise caution and verify the information provided by the system. Users should also be mindful of the potential consequences of relying on automated systems and consider seeking guidance from human experts when necessary.
Furthermore, users should be aware of the limitations of automated question-answering systems and avoid using them to make important decisions without proper human oversight. They should also recognize that these systems may perpetuate or amplify biases present in their training data, and take steps to mitigate any negative impacts.
In summary, while automated question-answering systems can be valuable tools, they should be used responsibly, ethically, and with an understanding of their limitations and potential risks.
# Citation
Please cite this model as follows:
```
@misc{alasdevcenter_text_classification_2024,
  author    = {Alas Development Center},
  title     = {text classification},
  year      = {2024},
  url       = {https://huggingface.co/alasdevcenter/text classification},
  doi       = {10.57967/hf/2027},
  publisher = {Hugging Face}
}
```
| {"language": "az", "license": "apache-2.0", "library_name": "adapter-transformers"} | Ilkinism/ilmetin3 | null | [
"adapter-transformers",
"mbart",
"az",
"license:apache-2.0",
"region:us"
] | null | 2024-05-01T19:03:43+00:00 |
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
llamaft6v2 - bnb 8bits
- Model creator: https://huggingface.co/Aspik101/
- Original model: https://huggingface.co/Aspik101/llamaft6v2/
Original model description:
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
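Pending details from the original author, a minimal sketch, assuming the pre-quantized 8-bit weights load directly through `transformers` with `bitsandbytes` installed (the quantization config is assumed to be stored with the checkpoint):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/Aspik101_-_llamaft6v2-8bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Placeholder prompt; the model's expected chat format is not documented here
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```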
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {} | RichardErkhov/Aspik101_-_llamaft6v2-8bits | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"8-bit",
"region:us"
] | null | 2024-05-01T19:04:00+00:00 |
reinforcement-learning | null |
# PPO Agent Playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2.
# Hyperparameters
```python
{'exp_name': 'ppo',
 'env_id': 'LunarLander-v2',
 'seed': 1,
 'torch_deterministic': True,
 'cuda': True,
 'track': False,
 'wandb_project_name': 'cleanRL',
 'wandb_entity': None,
 'capture_video': False,
 'learning_rate': 0.00025,
 'total_timesteps': 1000000,
 'num_envs': 4,
 'num_steps': 1024,
 'anneal_lr': True,
 'gae': True,
 'gamma': 0.99,
 'gae_lambda': 0.98,
 'num_minibatches': 4,
 'update_epochs': 20,
 'norm_adv': True,
 'clip_coef': 0.2,
 'clip_vloss': True,
 'ent_coef': 0.01,
 'vf_coef': 0.5,
 'max_grad_norm': 0.5,
 'target_kl': None,
 'repo_id': 'rahil1206/test',
 'batch_size': 4096,
 'minibatch_size': 1024}
```
| {"tags": ["LunarLander-v2", "ppo", "deep-reinforcement-learning", "reinforcement-learning", "custom-implementation", "deep-rl-course"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "173.53 +/- 62.70", "name": "mean_reward", "verified": false}]}]}]} | rahil1206/test | null | [
"tensorboard",
"LunarLander-v2",
"ppo",
"deep-reinforcement-learning",
"reinforcement-learning",
"custom-implementation",
"deep-rl-course",
"model-index",
"region:us"
] | null | 2024-05-01T19:04:22+00:00 |
text2text-generation | transformers | {} | SilvioLima/absa_v6 | null | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"has_space"
] | null | 2024-05-01T19:05:00+00:00 |
|
feature-extraction | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
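Pending details from the author, a minimal sketch, assuming the checkpoint is a BERT-style encoder used for sentence embeddings (tags: `bert`, `feature-extraction`); the mean-pooling step is an assumption, not documented behavior:
```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "andersonbcdefg/tiny-emb-2024-05-01_19-05-40"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer(["an example sentence to embed"], return_tensors="pt", padding=True)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state      # (batch, seq_len, hidden_dim)

# Mean-pool over non-padding tokens to get one vector per sentence (assumed pooling)
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)
```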
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | andersonbcdefg/tiny-emb-2024-05-01_19-05-40 | null | [
"transformers",
"safetensors",
"bert",
"feature-extraction",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:05:40+00:00 |
image-classification | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Main_fashion-swin
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7830
- Accuracy: 0.7053
## Model description
More information needed
## Intended uses & limitations
More information needed
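Pending details from the authors, a minimal inference sketch, assuming the fine-tuned checkpoint works with the standard image-classification pipeline; the image path is a placeholder:
```python
from transformers import pipeline

classifier = pipeline("image-classification", model="vlevi/Main_fashion-swin")
print(classifier("path/to/your_image.jpg"))  # placeholder path to a fashion image
```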
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 2.019 | 0.9630 | 13 | 1.7204 | 0.3805 |
| 1.646 | 2.0 | 27 | 1.2356 | 0.5940 |
| 0.9911 | 2.9630 | 40 | 0.9948 | 0.6821 |
| 0.9104 | 4.0 | 54 | 0.9069 | 0.6775 |
| 0.8337 | 4.9630 | 67 | 0.8472 | 0.6961 |
| 0.7425 | 6.0 | 81 | 0.8436 | 0.6891 |
| 0.6625 | 6.9630 | 94 | 0.8257 | 0.6937 |
| 0.6814 | 8.0 | 108 | 0.8274 | 0.6914 |
| 0.6445 | 8.9630 | 121 | 0.7940 | 0.7053 |
| 0.6032 | 10.0 | 135 | 0.8015 | 0.7030 |
| 0.6231 | 10.9630 | 148 | 0.7825 | 0.7077 |
| 0.6337 | 11.5556 | 156 | 0.7830 | 0.7053 |
### Framework versions
- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "microsoft/swin-tiny-patch4-window7-224", "model-index": [{"name": "Main_fashion-swin", "results": []}]} | vlevi/Main_fashion-swin | null | [
"transformers",
"tensorboard",
"safetensors",
"swin",
"image-classification",
"generated_from_trainer",
"base_model:microsoft/swin-tiny-patch4-window7-224",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:06:09+00:00 |
null | null | {"license": "openrail"} | Timur04129/Luigi-Beta-FNF | null | [
"license:openrail",
"region:us"
] | null | 2024-05-01T19:06:23+00:00 |
|
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
llamaft5 - bnb 8bits
- Model creator: https://huggingface.co/Aspik101/
- Original model: https://huggingface.co/Aspik101/llamaft5/
Original model description:
---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
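Pending details from the original author, a minimal sketch, assuming the pre-quantized 8-bit weights load directly through `transformers` with `bitsandbytes` installed (the quantization config is assumed to be stored with the checkpoint):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/Aspik101_-_llamaft5-8bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Placeholder prompt; the model's expected chat format is not documented here
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```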
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {} | RichardErkhov/Aspik101_-_llamaft5-8bits | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"8-bit",
"region:us"
] | null | 2024-05-01T19:08:29+00:00 |
text-generation | transformers | {"license": "llama2"} | zechen-nlp/meditron-7b-pubmedqa-sft | null | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"license:llama2",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:09:35+00:00 |
|
text-to-image | diffusers |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# SDXL LoRA DreamBooth - Kousha/realistic_Person2.0_LORA
<Gallery />
## Model description
These are Kousha/realistic_Person2.0_LORA LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use an image of RL person to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](Kousha/realistic_Person2.0_LORA/tree/main) them in the Files & versions tab.
## Intended uses & limitations
#### How to use
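A minimal sketch of one plausible way to run the adapter, assuming it loads through diffusers' standard `load_lora_weights` API on top of the SDXL base model named above:
```python
import torch
from diffusers import AutoPipelineForText2Image

# Load the SDXL base pipeline, then attach this repository's LoRA weights
pipeline = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipeline.load_lora_weights("Kousha/realistic_Person2.0_LORA")

# Use the trigger phrase from this card to activate the learned concept
image = pipeline("an image of RL person").images[0]
image.save("rl_person.png")
```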
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | {"license": "openrail++", "library_name": "diffusers", "tags": ["text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "an image of RL person", "widget": []} | Kousha/realistic_Person2.0_LORA | null | [
"diffusers",
"tensorboard",
"text-to-image",
"diffusers-training",
"dora",
"template:sd-lora",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | null | 2024-05-01T19:09:46+00:00 |
null | null | {"license": "apache-2.0"} | Ggjesuisfort/Ai | null | [
"license:apache-2.0",
"region:us"
] | null | 2024-05-01T19:09:57+00:00 |
|
null | null | {} | nelson-pawait/checkpoints_2 | null | [
"region:us"
] | null | 2024-05-01T19:10:24+00:00 |
|
null | null | {} | zeinshaheen/test | null | [
"region:us"
] | null | 2024-05-01T19:10:39+00:00 |
|
text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a π€ transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
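No snippet is provided yet; as a starting point, a minimal sketch under the assumption that this repo is a 4-bit GPTQ export of phi-2 loadable through transformers (requires a GPTQ backend such as `auto-gptq` or `optimum` to be installed):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pigas/phi-2-GPTQ-4bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```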
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | pigas/phi-2-GPTQ-4bits | null | [
"transformers",
"safetensors",
"phi",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | null | 2024-05-01T19:11:12+00:00 |
null | null | {"license": "artistic-2.0"} | prodkris21/RiceDoggo | null | [
"license:artistic-2.0",
"region:us"
] | null | 2024-05-01T19:11:29+00:00 |
|
null | null | {"license": "apache-2.0"} | Ggjesuisfort/Aii | null | [
"license:apache-2.0",
"region:us"
] | null | 2024-05-01T19:11:59+00:00 |
|
text-generation | transformers | {} | exyou/opt-350m_CASUAL_LM | null | [
"transformers",
"safetensors",
"opt",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:12:01+00:00 |
|
null | null |
# Yamshadowexperiment28Shadowm7exp-7B
Yamshadowexperiment28Shadowm7exp-7B is an automated merge created by [Maxime Labonne](https://huggingface.co/mlabonne) using the following configuration.
## π§© Configuration
```yaml
models:
- model: mistralai/Mistral-7B-v0.1
- model: automerger/YamshadowExperiment28-7B
- model: mahiatlinux/ShadowM7EXP-7B
merge_method: model_stock
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```
## π» Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "automerger/Yamshadowexperiment28Shadowm7exp-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
``` | {"license": "apache-2.0", "tags": ["merge", "mergekit", "lazymergekit", "automerger"]} | automerger/Yamshadowexperiment28Shadowm7exp-7B | null | [
"merge",
"mergekit",
"lazymergekit",
"automerger",
"license:apache-2.0",
"region:us"
] | null | 2024-05-01T19:12:34+00:00 |
null | transformers | {} | Gusito/mamba_text_classification | null | [
"transformers",
"pytorch",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:12:46+00:00 |
|
null | null | # Quantized_by: Zeeshan
# Tinyllama 1.1B Chat v0.3 - GGUF
- Model creator: [TinyLlama](https://huggingface.co/TinyLlama)
- Original model: [Tinyllama 1.1B Chat v0.3](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.3)
<!-- description start -->
## Description
This repo contains GGUF format model files for [TinyLlama's Tinyllama 1.1B Chat v0.3](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.3).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.
* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
<!-- README_GGUF.md-about-gguf end -->
<!-- repositories-available start -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single file.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
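The download can also be scripted; a minimal sketch with `huggingface_hub` and `llama-cpp-python` (the quant filename below is hypothetical; check the Files & versions tab for the names that actually exist in this repo):
```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Hypothetical filename -- substitute one that exists in the repo
path = hf_hub_download(
    repo_id="zeeshanali01/TinyLlama-1.1B-Chat-v0.3-GGUF",
    filename="tinyllama-1.1b-chat-v0.3.Q4_K_M.gguf",
)

llm = Llama(model_path=path)
print(llm("Q: What is GGUF? A:", max_tokens=64)["choices"][0]["text"])
```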
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: TinyLlama's Tinyllama 1.1B Chat v0.3
<div align="center">
# TinyLlama-1.1B
</div>
https://github.com/jzhang38/TinyLlama
The TinyLlama project aims to **pretrain** a **1.1B Llama model on 3 trillion tokens**. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs ππ. Training started on 2023-09-01.
We adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.
#### This Model
This is the chat model finetuned on top of [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T). **We follow [HF's Zephyr](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha/edit/main/README.md)'s training recipe.** The model was "initially fine-tuned on a variant of the [`UltraChat`](https://huggingface.co/datasets/stingning/ultrachat) dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT.
We then further aligned the model with [π€ TRL's](https://github.com/huggingface/trl) `DPOTrainer` on the [openbmb/UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset, which contains 64k prompts and model completions that are ranked by GPT-4."
#### How to use
You will need `transformers>=4.34`.
Do check the [TinyLlama](https://github.com/jzhang38/TinyLlama) github page for more information.
```python
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v0.3", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "You are a friendly chatbot who always responds in the style of a pirate",
},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
# <|system|>
# You are a friendly chatbot who always responds in the style of a pirate.</s>
# <|user|>
# How many helicopters can a human eat in one sitting?</s>
# <|assistant|>
# ...
```
<!-- original-model-card end --> | {} | zeeshanali01/TinyLlama-1.1B-Chat-v0.3-GGUF | null | [
"gguf",
"region:us"
] | null | 2024-05-01T19:14:04+00:00 |
null | null | {} | lkid08/eval_test_1 | null | [
"region:us"
] | null | 2024-05-01T19:14:35+00:00 |
|
null | transformers |
# Uploaded model
- **Developed by:** felixml
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3-8b-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
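The repo name ends in `-lora`, so it presumably holds a PEFT adapter rather than merged weights; a minimal loading sketch under that assumption:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/llama-3-8b-Instruct-bnb-4bit"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

# Attach the fine-tuned LoRA adapter on top of the 4-bit base model
model = PeftModel.from_pretrained(
    base, "felixml/Llama-3-8B-Instruct-synthetic_text_to_sql-600-steps-lora"
)
```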
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-Instruct-bnb-4bit"} | felixml/Llama-3-8B-Instruct-synthetic_text_to_sql-600-steps-lora | null | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:unsloth/llama-3-8b-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:14:43+00:00 |
text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# still-cooking-temp-0.5-distilled-code-llama
This model is a fine-tuned version of [anudaw/still-cooking-temp-0.5-distilled-code-llama](https://huggingface.co/anudaw/still-cooking-temp-0.5-distilled-code-llama) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (restated as `TrainingArguments` in the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
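A sketch of those values restated as `transformers` `TrainingArguments` (a reproduction aid, not the author's actual training script):
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="still-cooking-temp-0.5-distilled-code-llama",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=32,  # total train batch size: 1 x 32 = 32
    lr_scheduler_type="constant",
    warmup_ratio=0.03,
    num_train_epochs=3,
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the defaults
)
```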
### Framework versions
- Transformers 4.40.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1
| {"license": "apache-2.0", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "anudaw/still-cooking-temp-0.5-distilled-code-llama", "model-index": [{"name": "still-cooking-temp-0.5-distilled-code-llama", "results": []}]} | anudaw/still-cooking-temp-0.5-distilled-code-llama | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"trl",
"sft",
"generated_from_trainer",
"base_model:anudaw/still-cooking-temp-0.5-distilled-code-llama",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:15:26+00:00 |
text-generation | null |
## Exllama v2 Quantizations of Scarlett-Llama-3-8B-v1.0
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.20">turboderp's ExLlamaV2 v0.0.20</a> for quantization.
<b>The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)</b>
Each branch contains an individual bits per weight, with the main one containing only the meaurement.json for further conversions.
Original model: https://huggingface.co/ajibawa-2023/Scarlett-Llama-3-8B-v1.0
## Prompt format
```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
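When building prompts by hand, the template can be assembled with plain string formatting (a trivial sketch of the layout above):
```python
def build_chatml_prompt(system_prompt: str, user_prompt: str) -> str:
    # Mirrors the ChatML layout above, ending at the open assistant turn
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(build_chatml_prompt("You are a helpful assistant.", "Hello!"))
```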
## Available sizes
| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (8K) | VRAM (16k) | VRAM (32k) | Description |
| ----- | ---- | ------- | ------ | ------ | ------ | ------ | ------------ |
| [8_0](https://huggingface.co/bartowski/Scarlett-Llama-3-8B-v1.0-exl2/tree/8_0) | 8.0 | 8.0 | 10.1 GB | 10.5 GB | 11.5 GB | 13.6 GB | Maximum quality that ExLlamaV2 can produce, near unquantized performance. |
| [6_5](https://huggingface.co/bartowski/Scarlett-Llama-3-8B-v1.0-exl2/tree/6_5) | 6.5 | 8.0 | 8.9 GB | 9.3 GB | 10.3 GB | 12.4 GB | Very similar to 8.0, good tradeoff of size vs performance, **recommended**. |
| [5_0](https://huggingface.co/bartowski/Scarlett-Llama-3-8B-v1.0-exl2/tree/5_0) | 5.0 | 6.0 | 7.7 GB | 8.1 GB | 9.1 GB | 11.2 GB | Slightly lower quality vs 6.5, but usable on 8GB cards. |
| [4_25](https://huggingface.co/bartowski/Scarlett-Llama-3-8B-v1.0-exl2/tree/4_25) | 4.25 | 6.0 | 7.0 GB | 7.4 GB | 8.4 GB | 10.5 GB | GPTQ equivalent bits per weight, slightly higher quality. |
| [3_5](https://huggingface.co/bartowski/Scarlett-Llama-3-8B-v1.0-exl2/tree/3_5) | 3.5 | 6.0 | 6.4 GB | 6.8 GB | 7.8 GB | 9.9 GB | Lower quality, only use if you have to. |
## Download instructions
With git:
```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Scarlett-Llama-3-8B-v1.0-exl2 Scarlett-Llama-3-8B-v1.0-exl2-6_5
```
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
To download a specific branch, use the `--revision` parameter. For example, to download the 6.5 bpw branch:
Linux:
```shell
huggingface-cli download bartowski/Scarlett-Llama-3-8B-v1.0-exl2 --revision 6_5 --local-dir Scarlett-Llama-3-8B-v1.0-exl2-6_5 --local-dir-use-symlinks False
```
Windows (which apparently doesn't like _ in folders sometimes?):
```shell
huggingface-cli download bartowski/Scarlett-Llama-3-8B-v1.0-exl2 --revision 6_5 --local-dir Scarlett-Llama-3-8B-v1.0-exl2-6.5 --local-dir-use-symlinks False
```
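The same download can be scripted with `huggingface_hub` (branch and folder names as above):
```python
from huggingface_hub import snapshot_download

# Download the 6.5 bpw branch into a local folder
snapshot_download(
    repo_id="bartowski/Scarlett-Llama-3-8B-v1.0-exl2",
    revision="6_5",
    local_dir="Scarlett-Llama-3-8B-v1.0-exl2-6_5",
)
```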
Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
| {"language": ["en"], "license": "other", "tags": ["art", "philosophy", "romance", "jokes", "advice", "code", "companionship"], "license_name": "llama3", "license_link": "LICENSE", "quantized_by": "bartowski", "pipeline_tag": "text-generation"} | bartowski/Scarlett-Llama-3-8B-v1.0-exl2 | null | [
"art",
"philosophy",
"romance",
"jokes",
"advice",
"code",
"companionship",
"text-generation",
"en",
"license:other",
"region:us"
] | null | 2024-05-01T19:18:14+00:00 |
text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a π€ transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | OwOpeepeepoopoo/onetwothree | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:21:28+00:00 |
text-generation | transformers | {} | genai-proj/gpt2-50000 | null | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:22:27+00:00 |
|
null | null | {} | bhavishya022/text_image | null | [
"region:us"
] | null | 2024-05-01T19:22:30+00:00 |
|
text-to-image | diffusers |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# SDXL LoRA DreamBooth - embracellm/sushi23_LoRA
<Gallery />
## Model description
These are embracellm/sushi23_LoRA LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use `a photo of Tuna Poke Bowl` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](https://huggingface.co/embracellm/sushi23_LoRA/tree/main) them in the Files & versions tab.
## Intended uses & limitations
#### How to use
```python
# TODO: add an example code snippet for running this diffusion pipeline
```
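While the snippet above is a TODO, a minimal sketch of running this LoRA with diffusers (standard diffusers API; only the repo id, VAE, and trigger phrase come from this card):
```python
import torch
from diffusers import AutoencoderKL, StableDiffusionXLPipeline

# The card notes training used the fp16-fixed SDXL VAE
vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16)
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", vae=vae, torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("embracellm/sushi23_LoRA")

# Trigger phrase from this card
image = pipe("a photo of Tuna Poke Bowl", num_inference_steps=30).images[0]
image.save("tuna_poke_bowl.png")
```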
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | {"license": "openrail++", "library_name": "diffusers", "tags": ["text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "a photo of Tuna Poke Bowl", "widget": []} | embracellm/sushi23_LoRA | null | [
"diffusers",
"tensorboard",
"text-to-image",
"diffusers-training",
"dora",
"template:sd-lora",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | null | 2024-05-01T19:22:35+00:00 |
null | transformers | {} | ikeno-ada/madlad400-3b-mt-8bit-ct2 | null | [
"transformers",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:23:33+00:00 |
|
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
KangalKhan-Ruby-7B-Fixed - bnb 4bits
- Model creator: https://huggingface.co/Yuma42/
- Original model: https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed/
Original model description:
---
language:
- en
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
base_model:
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
model-index:
- name: KangalKhan-Ruby-7B-Fixed
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 67.24
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 85.22
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.21
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 56.49
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 77.98
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 61.94
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
---
# KangalKhan-Ruby-7B
I suggest using ChatML (use whatever system prompt you like; this is just an example):
```
<|im_start|>system
You are a friendly assistant.<|im_end|>
<|im_start|>user
Hello, what are you?<|im_end|>
<|im_start|>assistant
I am an AI language model designed to assist users with information and answer their questions. How can I help you today?<|im_end|>
```
Q4_K_S GGUF:
https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed-GGUF
More GGUF variants by [mradermacher](https://huggingface.co/mradermacher):
WARNING: I have observed that these versions output typos in rare cases. If you have the same problem, use my Q4_K_S GGUF above.
https://huggingface.co/mradermacher/KangalKhan-Ruby-7B-Fixed-GGUF
KangalKhan-Ruby-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B)
* [argilla/distilabeled-OpenHermes-2.5-Mistral-7B](https://huggingface.co/argilla/distilabeled-OpenHermes-2.5-Mistral-7B)
## π§© Configuration
```yaml
slices:
- sources:
- model: argilla/CapybaraHermes-2.5-Mistral-7B
layer_range: [0, 32]
- model: argilla/distilabeled-OpenHermes-2.5-Mistral-7B
layer_range: [0, 32]
merge_method: slerp
base_model: argilla/CapybaraHermes-2.5-Mistral-7B
parameters:
t:
- filter: self_attn
value: [1, 0.5, 0.7, 0.3, 0]
- filter: mlp
value: [0, 0.5, 0.3, 0.7, 1]
- value: 0.5
dtype: bfloat16
```
## π» Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Yuma42/KangalKhan-Ruby-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
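Note the snippet above loads the original full-precision repo; to use the 4-bit weights hosted here instead, a sketch (assumes `bitsandbytes` is installed and the quantization config ships with the checkpoint):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-4bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```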
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed)
| Metric |Value|
|---------------------------------|----:|
|Avg. |68.68|
|AI2 Reasoning Challenge (25-Shot)|67.24|
|HellaSwag (10-Shot) |85.22|
|MMLU (5-Shot) |63.21|
|TruthfulQA (0-shot) |56.49|
|Winogrande (5-shot) |77.98|
|GSM8k (5-shot) |61.94|
| {} | RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-4bits | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | null | 2024-05-01T19:23:51+00:00 |
null | null | {} | justingrammens/fine_tune_imdb | null | [
"region:us"
] | null | 2024-05-01T19:25:00+00:00 |
|
text-generation | transformers | {} | genai-proj/gpt2-100000 | null | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:26:27+00:00 |
|
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
KangalKhan-Ruby-7B-Fixed - bnb 8bits
- Model creator: https://huggingface.co/Yuma42/
- Original model: https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed/
Original model description:
---
language:
- en
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
base_model:
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
model-index:
- name: KangalKhan-Ruby-7B-Fixed
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 67.24
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 85.22
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.21
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 56.49
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 77.98
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 61.94
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
---
# KangalKhan-Ruby-7B
I suggest using ChatML (use whatever system prompt you like; this is just an example):
```
<|im_start|>system
You are a friendly assistant.<|im_end|>
<|im_start|>user
Hello, what are you?<|im_end|>
<|im_start|>assistant
I am an AI language model designed to assist users with information and answer their questions. How can I help you today?<|im_end|>
```
Q4_K_S GGUF:
https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed-GGUF
More GGUF variants by [mradermacher](https://huggingface.co/mradermacher):
WARNING: I have observed that these versions output typos in rare cases. If you have the same problem, use my Q4_K_S GGUF above.
https://huggingface.co/mradermacher/KangalKhan-Ruby-7B-Fixed-GGUF
KangalKhan-Ruby-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B)
* [argilla/distilabeled-OpenHermes-2.5-Mistral-7B](https://huggingface.co/argilla/distilabeled-OpenHermes-2.5-Mistral-7B)
## π§© Configuration
```yaml
slices:
- sources:
- model: argilla/CapybaraHermes-2.5-Mistral-7B
layer_range: [0, 32]
- model: argilla/distilabeled-OpenHermes-2.5-Mistral-7B
layer_range: [0, 32]
merge_method: slerp
base_model: argilla/CapybaraHermes-2.5-Mistral-7B
parameters:
t:
- filter: self_attn
value: [1, 0.5, 0.7, 0.3, 0]
- filter: mlp
value: [0, 0.5, 0.3, 0.7, 1]
- value: 0.5
dtype: bfloat16
```
## π» Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Yuma42/KangalKhan-Ruby-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed)
| Metric |Value|
|---------------------------------|----:|
|Avg. |68.68|
|AI2 Reasoning Challenge (25-Shot)|67.24|
|HellaSwag (10-Shot) |85.22|
|MMLU (5-Shot) |63.21|
|TruthfulQA (0-shot) |56.49|
|Winogrande (5-shot) |77.98|
|GSM8k (5-shot) |61.94|
| {} | RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-8bits | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"8-bit",
"region:us"
] | null | 2024-05-01T19:28:35+00:00 |
text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a π€ transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | vishruthnath/codellama_1024_seq_len | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:29:11+00:00 |
text2text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CS505_COQE_viT5_train_Instruction1_SOAPL_v1
This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (restated in the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
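A sketch of those values restated as `Seq2SeqTrainingArguments` (a reproduction aid assuming a standard `Seq2SeqTrainer` setup):
```python
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="CS505_COQE_viT5_train_Instruction1_SOAPL_v1",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed precision
)
```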
### Training results
### Framework versions
- Transformers 4.39.3
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.2
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "VietAI/vit5-large", "model-index": [{"name": "CS505_COQE_viT5_train_Instruction1_SOAPL_v1", "results": []}]} | ThuyNT/CS505_COQE_viT5_train_Instruction1_SOAPL_v1 | null | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:VietAI/vit5-large",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:30:08+00:00 |
text-classification | transformers | {"metrics": ["accuracy", "bertscore"], "pipeline_tag": "text-classification"} | Akamemz/RoBERTA_bias_classification | null | [
"transformers",
"roberta",
"fill-mask",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:30:57+00:00 |
|
sentence-similarity | sentence-transformers | # Giratina
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
### Models Merged
The following models were included in the merge:
* [Mihaiii/Wartortle](https://huggingface.co/Mihaiii/Wartortle)
* [TaylorAI/bge-micro-v2](https://huggingface.co/TaylorAI/bge-micro-v2)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
- model: Mihaiii/Wartortle
- model: TaylorAI/bge-micro-v2
merge_method: slerp
base_model: TaylorAI/bge-micro-v2
parameters:
t:
- value: 0.5
dtype: float32
```
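Given the sentence-transformers tags in this card's metadata, a minimal usage sketch (the repo id below is hypothetical; substitute the id this merge is actually published under):
```python
from sentence_transformers import SentenceTransformer

# Hypothetical repo id -- replace with the actual one for this merge
model = SentenceTransformer("Mihaiii/Giratina")

sentences = ["SLERP interpolates on the unit sphere.", "Model merging blends weights."]
embeddings = model.encode(sentences)
print(embeddings.shape)
```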
| {"license": "mit", "library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "bge", "mteb", "mergekit", "merge"], "pipeline_tag": "sentence-similarity", "base_model": ["Mihaiii/Wartortle", "TaylorAI/bge-micro-v2"], "model-index": [{"name": "Giratina", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 69.56716417910448}, {"type": "ap", "value": 31.399435128856624}, {"type": "f1", "value": 63.139089415537256}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 74.73525000000001}, {"type": "ap", "value": 69.2327764533514}, {"type": "f1", "value": 74.61617659775962}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 35.356}, {"type": "f1", "value": 35.165109893437204}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 17.141000000000002}, {"type": "map_at_10", "value": 28.292}, {"type": "map_at_100", "value": 29.532000000000004}, {"type": "map_at_1000", "value": 29.580000000000002}, {"type": "map_at_20", "value": 29.048000000000002}, {"type": "map_at_3", "value": 24.277}, {"type": "map_at_5", "value": 26.339000000000002}, {"type": "mrr_at_1", "value": 17.781}, {"type": "mrr_at_10", "value": 28.534}, {"type": "mrr_at_100", "value": 29.779}, {"type": "mrr_at_1000", "value": 29.826999999999998}, {"type": "mrr_at_20", "value": 29.293000000000003}, {"type": "mrr_at_3", "value": 24.490000000000002}, {"type": "mrr_at_5", "value": 26.564}, {"type": "ndcg_at_1", "value": 17.141000000000002}, {"type": "ndcg_at_10", "value": 35.004000000000005}, {"type": "ndcg_at_100", "value": 41.056}, {"type": "ndcg_at_1000", "value": 42.388}, {"type": "ndcg_at_20", "value": 37.721}, {"type": "ndcg_at_3", "value": 26.592}, {"type": "ndcg_at_5", "value": 30.294999999999998}, {"type": "precision_at_1", "value": 17.141000000000002}, {"type": "precision_at_10", "value": 5.676}, {"type": "precision_at_100", "value": 0.851}, {"type": "precision_at_1000", "value": 0.096}, {"type": "precision_at_20", "value": 3.3709999999999996}, {"type": "precision_at_3", "value": 11.094999999999999}, {"type": "precision_at_5", "value": 8.450000000000001}, {"type": "recall_at_1", "value": 17.141000000000002}, {"type": "recall_at_10", "value": 56.757000000000005}, {"type": "recall_at_100", "value": 85.064}, {"type": "recall_at_1000", "value": 95.661}, {"type": "recall_at_20", "value": 67.425}, {"type": "recall_at_3", "value": 33.286}, {"type": "recall_at_5", "value": 42.248000000000005}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 
37.86211319797047}, {"type": "v_measures", "value": [0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822, 0.42821311110400917, 0.26728232917410677, 0.2819026763758509, 0.3341565824397579, 0.29184325438397496, 0.190440948203588, 0.26951517878043996, 0.1580088222464484, 0.20107217046853706, 1.0, 0.22434775382017497, 0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822, 0.42821311110400917, 0.26728232917410677, 0.2819026763758509, 0.3341565824397579, 0.29184325438397496, 0.190440948203588, 0.26951517878043996, 0.1580088222464484, 0.20107217046853706, 1.0, 0.22434775382017497, 0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822, 0.42821311110400917, 0.26728232917410677, 0.2819026763758509, 0.3341565824397579, 0.29184325438397496, 0.190440948203588, 0.26951517878043996, 0.1580088222464484, 0.20107217046853706, 1.0, 0.22434775382017497, 0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822, 0.42821311110400917, 0.26728232917410677, 0.2819026763758509, 0.3341565824397579, 0.29184325438397496, 0.190440948203588, 0.26951517878043996, 0.1580088222464484, 0.20107217046853706, 1.0, 0.22434775382017497, 0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822, 0.42821311110400917, 0.26728232917410677, 0.2819026763758509, 0.3341565824397579, 0.29184325438397496, 0.190440948203588, 0.26951517878043996, 0.1580088222464484, 0.20107217046853706, 1.0, 0.22434775382017497, 0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822, 
0.42821311110400917, 0.26728232917410677, 0.2819026763758509, 0.3341565824397579, 0.29184325438397496, 0.190440948203588, 0.26951517878043996, 0.1580088222464484, 0.20107217046853706, 1.0, 0.22434775382017497, 0.33158313059028166, 0.37901912420270933, 0.368350193636622, 0.3710910416810123, 0.33682934300988204, 0.3935143766420073, 0.3506042155722468, 0.38890637022748253, 0.3809948829762236, 0.3573848842626061, 0.4384574114930339, 0.44065249261067524, 0.4455934266459656, 0.44427870340567255, 0.44866585162160194, 0.4400562736320333, 0.44272671447092676, 0.4472379619739013, 0.447120409649494, 0.4374054560695822]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 28.836354293877637}, {"type": "v_measures", "value": [0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755,
0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 
0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 
0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 
0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 
0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 
0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 
0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 
0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 
0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 
0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 
0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 
0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 
0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 
0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 
0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873, 0.26770607587929524, 0.25679087657287986, 0.26683847803527544, 0.27314067131657194, 0.2637027189955263, 0.27546553977066784, 0.2663910185474206, 0.26115132013506304, 0.28239605072779855, 0.2715001900369248, 0.3338711918999345, 0.330513643529441, 0.32916198249267603, 0.33648402018146334, 0.33995041466013076, 0.33576749276064755, 0.3328011112044641, 0.33773499787647715, 0.3347474569158458, 0.33112140013488434, 0.3093636862898543, 0.1601792611076284, 0.20820618472388558, 0.2622841294938964, 0.20833795363058114, 0.15304171124037919, 0.19106061252763054, 0.09640933163812757, 0.16463927791620916, 1.0, 0.1585110308604873]}]}, {"task": {"type": 
"Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 55.77162231859219}, {"type": "mrr", "value": 69.60614254935584}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.005851173518}, {"type": "cos_sim_spearman", "value": 76.4866825599851}, {"type": "euclidean_pearson", "value": 74.6002011099264}, {"type": "euclidean_spearman", "value": 74.99267261434052}, {"type": "manhattan_pearson", "value": 74.69084330891174}, {"type": "manhattan_spearman", "value": 74.06253093850374}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 77.51298701298701}, {"type": "f1", "value": 77.42714563211781}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 32.087909450126375}, {"type": "v_measures", "value": [0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 
0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 
0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 
0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 
0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 
0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 
0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755, 0.3142833102070323, 0.3203971307539949, 0.3161164170523813, 0.30802810025196975, 0.3177972043203049, 0.3186492377314429, 0.3324448674345129, 0.3302138414852389, 0.32033008662475176, 0.33053074915100755]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 23.549481691079134}, {"type": "v_measures", "value": [0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 
0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 
0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 
0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 
0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 
0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226, 0.24931741412344044, 0.2294313928519603, 0.23307236126201172, 
0.22749497519161602, 0.22860245646223934, 0.2307563678480302, 0.242195701791265, 0.23584374186405796, 0.23666135736396998, 0.24157240034932226]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "mteb/cqadupstack-android", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 20.156}, {"type": "map_at_10", "value": 26.989}, {"type": "map_at_100", "value": 28.165000000000003}, {"type": "map_at_1000", "value": 28.302}, {"type": "map_at_20", "value": 27.505000000000003}, {"type": "map_at_3", "value": 24.631}, {"type": "map_at_5", "value": 25.886}, {"type": "mrr_at_1", "value": 25.607999999999997}, {"type": "mrr_at_10", "value": 31.972}, {"type": "mrr_at_100", "value": 32.993}, {"type": "mrr_at_1000", "value": 33.061}, {"type": "mrr_at_20", "value": 32.471}, {"type": "mrr_at_3", "value": 30.019000000000002}, {"type": "mrr_at_5", "value": 31.041999999999998}, {"type": "ndcg_at_1", "value": 25.607999999999997}, {"type": "ndcg_at_10", "value": 31.438}, {"type": "ndcg_at_100", "value": 37.347}, {"type": "ndcg_at_1000", "value": 40.075}, {"type": "ndcg_at_20", "value": 33.068}, {"type": "ndcg_at_3", "value": 27.846}, {"type": "ndcg_at_5", "value": 29.304999999999996}, {"type": "precision_at_1", "value": 25.607999999999997}, {"type": "precision_at_10", "value": 5.923}, {"type": "precision_at_100", "value": 1.102}, {"type": "precision_at_1000", "value": 0.161}, {"type": "precision_at_20", "value": 3.5340000000000003}, {"type": "precision_at_3", "value": 13.305}, {"type": "precision_at_5", "value": 9.585}, {"type": "recall_at_1", "value": 20.156}, {"type": "recall_at_10", "value": 39.741}, {"type": "recall_at_100", "value": 66.428}, {"type": "recall_at_1000", "value": 84.694}, {"type": "recall_at_20", "value": 45.688}, {"type": "recall_at_3", "value": 28.876}, {"type": "recall_at_5", "value": 33.284000000000006}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "mteb/cqadupstack-english", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 14.568}, {"type": "map_at_10", "value": 19.356}, {"type": "map_at_100", "value": 20.044}, {"type": "map_at_1000", "value": 20.146}, {"type": "map_at_20", "value": 19.717000000000002}, {"type": "map_at_3", "value": 17.82}, {"type": "map_at_5", "value": 18.724}, {"type": "mrr_at_1", "value": 18.025}, {"type": "mrr_at_10", "value": 22.933}, {"type": "mrr_at_100", "value": 23.599}, {"type": "mrr_at_1000", "value": 23.669999999999998}, {"type": "mrr_at_20", "value": 23.283}, {"type": "mrr_at_3", "value": 21.295}, {"type": "mrr_at_5", "value": 22.314}, {"type": "ndcg_at_1", "value": 18.025}, {"type": "ndcg_at_10", "value": 22.559}, {"type": "ndcg_at_100", "value": 26.045}, {"type": "ndcg_at_1000", "value": 28.785}, {"type": "ndcg_at_20", "value": 23.727999999999998}, {"type": "ndcg_at_3", "value": 19.914}, {"type": "ndcg_at_5", "value": 21.241}, {"type": "precision_at_1", "value": 18.025}, {"type": "precision_at_10", "value": 4.102}, {"type": "precision_at_100", "value": 0.715}, {"type": "precision_at_1000", "value": 0.11800000000000001}, {"type": "precision_at_20", "value": 2.452}, {"type": "precision_at_3", "value": 9.447999999999999}, {"type": "precision_at_5", "value": 6.827999999999999}, {"type": "recall_at_1", "value": 14.568}, {"type": "recall_at_10", "value": 28.677999999999997}, {"type": 
"recall_at_100", "value": 44.362}, {"type": "recall_at_1000", "value": 63.705999999999996}, {"type": "recall_at_20", "value": 32.932}, {"type": "recall_at_3", "value": 21.029999999999998}, {"type": "recall_at_5", "value": 24.573}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "mteb/cqadupstack-gaming", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 25.104}, {"type": "map_at_10", "value": 33.857}, {"type": "map_at_100", "value": 34.808}, {"type": "map_at_1000", "value": 34.904}, {"type": "map_at_20", "value": 34.404}, {"type": "map_at_3", "value": 31.176}, {"type": "map_at_5", "value": 32.626}, {"type": "mrr_at_1", "value": 28.84}, {"type": "mrr_at_10", "value": 36.817}, {"type": "mrr_at_100", "value": 37.633}, {"type": "mrr_at_1000", "value": 37.698}, {"type": "mrr_at_20", "value": 37.312}, {"type": "mrr_at_3", "value": 34.451}, {"type": "mrr_at_5", "value": 35.748999999999995}, {"type": "ndcg_at_1", "value": 28.84}, {"type": "ndcg_at_10", "value": 38.745000000000005}, {"type": "ndcg_at_100", "value": 43.183}, {"type": "ndcg_at_1000", "value": 45.419}, {"type": "ndcg_at_20", "value": 40.571}, {"type": "ndcg_at_3", "value": 33.751}, {"type": "ndcg_at_5", "value": 36.042}, {"type": "precision_at_1", "value": 28.84}, {"type": "precision_at_10", "value": 6.389}, {"type": "precision_at_100", "value": 0.941}, {"type": "precision_at_1000", "value": 0.12}, {"type": "precision_at_20", "value": 3.6929999999999996}, {"type": "precision_at_3", "value": 15.068000000000001}, {"type": "precision_at_5", "value": 10.583}, {"type": "recall_at_1", "value": 25.104}, {"type": "recall_at_10", "value": 50.749}, {"type": "recall_at_100", "value": 70.336}, {"type": "recall_at_1000", "value": 86.591}, {"type": "recall_at_20", "value": 57.473}, {"type": "recall_at_3", "value": 37.230000000000004}, {"type": "recall_at_5", "value": 42.774}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "mteb/cqadupstack-gis", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 12.712000000000002}, {"type": "map_at_10", "value": 18.064}, {"type": "map_at_100", "value": 18.775}, {"type": "map_at_1000", "value": 18.886}, {"type": "map_at_20", "value": 18.375}, {"type": "map_at_3", "value": 16.304}, {"type": "map_at_5", "value": 17.183999999999997}, {"type": "mrr_at_1", "value": 13.672}, {"type": "mrr_at_10", "value": 19.392}, {"type": "mrr_at_100", "value": 20.088}, {"type": "mrr_at_1000", "value": 20.186999999999998}, {"type": "mrr_at_20", "value": 19.721}, {"type": "mrr_at_3", "value": 17.495}, {"type": "mrr_at_5", "value": 18.473}, {"type": "ndcg_at_1", "value": 13.672}, {"type": "ndcg_at_10", "value": 21.427}, {"type": "ndcg_at_100", "value": 25.448999999999998}, {"type": "ndcg_at_1000", "value": 28.78}, {"type": "ndcg_at_20", "value": 22.56}, {"type": "ndcg_at_3", "value": 17.752000000000002}, {"type": "ndcg_at_5", "value": 19.356}, {"type": "precision_at_1", "value": 13.672}, {"type": "precision_at_10", "value": 3.5029999999999997}, {"type": "precision_at_100", "value": 0.5910000000000001}, {"type": "precision_at_1000", "value": 0.092}, {"type": "precision_at_20", "value": 2.011}, {"type": "precision_at_3", "value": 7.571}, {"type": "precision_at_5", "value": 5.537}, {"type": "recall_at_1", "value": 12.712000000000002}, {"type": "recall_at_10", "value": 30.596}, 
{"type": "recall_at_100", "value": 49.909}, {"type": "recall_at_1000", "value": 76.01400000000001}, {"type": "recall_at_20", "value": 34.903}, {"type": "recall_at_3", "value": 20.721999999999998}, {"type": "recall_at_5", "value": 24.428}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "mteb/cqadupstack-mathematica", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 7.48}, {"type": "map_at_10", "value": 12.089}, {"type": "map_at_100", "value": 12.974}, {"type": "map_at_1000", "value": 13.099}, {"type": "map_at_20", "value": 12.537}, {"type": "map_at_3", "value": 10.402000000000001}, {"type": "map_at_5", "value": 11.261000000000001}, {"type": "mrr_at_1", "value": 9.577}, {"type": "mrr_at_10", "value": 15.043999999999999}, {"type": "mrr_at_100", "value": 15.909}, {"type": "mrr_at_1000", "value": 15.998000000000001}, {"type": "mrr_at_20", "value": 15.512999999999998}, {"type": "mrr_at_3", "value": 13.184000000000001}, {"type": "mrr_at_5", "value": 14.066999999999998}, {"type": "ndcg_at_1", "value": 9.577}, {"type": "ndcg_at_10", "value": 15.511}, {"type": "ndcg_at_100", "value": 20.193}, {"type": "ndcg_at_1000", "value": 23.691000000000003}, {"type": "ndcg_at_20", "value": 17.176}, {"type": "ndcg_at_3", "value": 12.134}, {"type": "ndcg_at_5", "value": 13.506000000000002}, {"type": "precision_at_1", "value": 9.577}, {"type": "precision_at_10", "value": 3.159}, {"type": "precision_at_100", "value": 0.634}, {"type": "precision_at_1000", "value": 0.106}, {"type": "precision_at_20", "value": 2.009}, {"type": "precision_at_3", "value": 6.012}, {"type": "precision_at_5", "value": 4.627}, {"type": "recall_at_1", "value": 7.48}, {"type": "recall_at_10", "value": 23.134}, {"type": "recall_at_100", "value": 44.254}, {"type": "recall_at_1000", "value": 70.35}, {"type": "recall_at_20", "value": 29.383}, {"type": "recall_at_3", "value": 13.84}, {"type": "recall_at_5", "value": 17.175}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 18.035}, {"type": "map_at_10", "value": 24.007}, {"type": "map_at_100", "value": 25.113999999999997}, {"type": "map_at_1000", "value": 25.245}, {"type": "map_at_20", "value": 24.587}, {"type": "map_at_3", "value": 21.921}, {"type": "map_at_5", "value": 22.917}, {"type": "mrr_at_1", "value": 22.233}, {"type": "mrr_at_10", "value": 28.479}, {"type": "mrr_at_100", "value": 29.412}, {"type": "mrr_at_1000", "value": 29.49}, {"type": "mrr_at_20", "value": 29.031000000000002}, {"type": "mrr_at_3", "value": 26.275}, {"type": "mrr_at_5", "value": 27.400999999999996}, {"type": "ndcg_at_1", "value": 22.233}, {"type": "ndcg_at_10", "value": 28.382}, {"type": "ndcg_at_100", "value": 33.86}, {"type": "ndcg_at_1000", "value": 36.903000000000006}, {"type": "ndcg_at_20", "value": 30.341}, {"type": "ndcg_at_3", "value": 24.695}, {"type": "ndcg_at_5", "value": 26.13}, {"type": "precision_at_1", "value": 22.233}, {"type": "precision_at_10", "value": 5.2170000000000005}, {"type": "precision_at_100", "value": 0.95}, {"type": "precision_at_1000", "value": 0.13899999999999998}, {"type": "precision_at_20", "value": 3.2239999999999998}, {"type": "precision_at_3", "value": 11.485}, {"type": "precision_at_5", "value": 8.181}, {"type": "recall_at_1", 
"value": 18.035}, {"type": "recall_at_10", "value": 37.222}, {"type": "recall_at_100", "value": 61.602000000000004}, {"type": "recall_at_1000", "value": 82.92}, {"type": "recall_at_20", "value": 44.221}, {"type": "recall_at_3", "value": 26.625}, {"type": "recall_at_5", "value": 30.461}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 13.281}, {"type": "map_at_10", "value": 17.756}, {"type": "map_at_100", "value": 18.785}, {"type": "map_at_1000", "value": 18.921}, {"type": "map_at_20", "value": 18.209}, {"type": "map_at_3", "value": 15.817999999999998}, {"type": "map_at_5", "value": 16.939}, {"type": "mrr_at_1", "value": 16.096}, {"type": "mrr_at_10", "value": 21.079}, {"type": "mrr_at_100", "value": 22.061}, {"type": "mrr_at_1000", "value": 22.151}, {"type": "mrr_at_20", "value": 21.557000000000002}, {"type": "mrr_at_3", "value": 19.006999999999998}, {"type": "mrr_at_5", "value": 20.171}, {"type": "ndcg_at_1", "value": 16.096}, {"type": "ndcg_at_10", "value": 21.278}, {"type": "ndcg_at_100", "value": 26.687}, {"type": "ndcg_at_1000", "value": 30.016}, {"type": "ndcg_at_20", "value": 22.871}, {"type": "ndcg_at_3", "value": 17.705000000000002}, {"type": "ndcg_at_5", "value": 19.427}, {"type": "precision_at_1", "value": 16.096}, {"type": "precision_at_10", "value": 3.893}, {"type": "precision_at_100", "value": 0.792}, {"type": "precision_at_1000", "value": 0.124}, {"type": "precision_at_20", "value": 2.414}, {"type": "precision_at_3", "value": 8.029}, {"type": "precision_at_5", "value": 6.119}, {"type": "recall_at_1", "value": 13.281}, {"type": "recall_at_10", "value": 28.849000000000004}, {"type": "recall_at_100", "value": 53.010999999999996}, {"type": "recall_at_1000", "value": 76.512}, {"type": "recall_at_20", "value": 34.547}, {"type": "recall_at_3", "value": 19.177}, {"type": "recall_at_5", "value": 23.455000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 14.161583333333333}, {"type": "map_at_10", "value": 19.378833333333333}, {"type": "map_at_100", "value": 20.27525}, {"type": "map_at_1000", "value": 20.394499999999997}, {"type": "map_at_20", "value": 19.831333333333333}, {"type": "map_at_3", "value": 17.55408333333333}, {"type": "map_at_5", "value": 18.52841666666667}, {"type": "mrr_at_1", "value": 17.00033333333333}, {"type": "mrr_at_10", "value": 22.41916666666667}, {"type": "mrr_at_100", "value": 23.252666666666666}, {"type": "mrr_at_1000", "value": 23.337583333333335}, {"type": "mrr_at_20", "value": 22.866666666666667}, {"type": "mrr_at_3", "value": 20.56991666666667}, {"type": "mrr_at_5", "value": 21.567666666666664}, {"type": "ndcg_at_1", "value": 17.00033333333333}, {"type": "ndcg_at_10", "value": 22.96475}, {"type": "ndcg_at_100", "value": 27.526833333333332}, {"type": "ndcg_at_1000", "value": 30.597416666666668}, {"type": "ndcg_at_20", "value": 24.52133333333333}, {"type": "ndcg_at_3", "value": 19.60108333333334}, {"type": "ndcg_at_5", "value": 21.089750000000002}, {"type": "precision_at_1", "value": 17.00033333333333}, {"type": "precision_at_10", "value": 4.10625}, {"type": "precision_at_100", "value": 0.7497499999999999}, {"type": 
"precision_at_1000", "value": 0.11733333333333335}, {"type": "precision_at_20", "value": 2.499416666666667}, {"type": "precision_at_3", "value": 9.041}, {"type": "precision_at_5", "value": 6.554250000000001}, {"type": "recall_at_1", "value": 14.161583333333333}, {"type": "recall_at_10", "value": 30.899916666666666}, {"type": "recall_at_100", "value": 51.66383333333333}, {"type": "recall_at_1000", "value": 74.103}, {"type": "recall_at_20", "value": 36.698}, {"type": "recall_at_3", "value": 21.398}, {"type": "recall_at_5", "value": 25.241750000000003}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 13.036}, {"type": "map_at_10", "value": 17.142}, {"type": "map_at_100", "value": 17.915}, {"type": "map_at_1000", "value": 18.002000000000002}, {"type": "map_at_20", "value": 17.558}, {"type": "map_at_3", "value": 15.459}, {"type": "map_at_5", "value": 16.474}, {"type": "mrr_at_1", "value": 14.877}, {"type": "mrr_at_10", "value": 19.365}, {"type": "mrr_at_100", "value": 20.085}, {"type": "mrr_at_1000", "value": 20.165}, {"type": "mrr_at_20", "value": 19.75}, {"type": "mrr_at_3", "value": 17.638}, {"type": "mrr_at_5", "value": 18.673000000000002}, {"type": "ndcg_at_1", "value": 14.877}, {"type": "ndcg_at_10", "value": 20.199}, {"type": "ndcg_at_100", "value": 24.275}, {"type": "ndcg_at_1000", "value": 26.933}, {"type": "ndcg_at_20", "value": 21.683}, {"type": "ndcg_at_3", "value": 16.925}, {"type": "ndcg_at_5", "value": 18.565}, {"type": "precision_at_1", "value": 14.877}, {"type": "precision_at_10", "value": 3.374}, {"type": "precision_at_100", "value": 0.59}, {"type": "precision_at_1000", "value": 0.087}, {"type": "precision_at_20", "value": 2.04}, {"type": "precision_at_3", "value": 7.515}, {"type": "precision_at_5", "value": 5.491}, {"type": "recall_at_1", "value": 13.036}, {"type": "recall_at_10", "value": 27.750000000000004}, {"type": "recall_at_100", "value": 46.798}, {"type": "recall_at_1000", "value": 67.372}, {"type": "recall_at_20", "value": 33.406000000000006}, {"type": "recall_at_3", "value": 18.381}, {"type": "recall_at_5", "value": 22.559}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 8.448}, {"type": "map_at_10", "value": 11.978}, {"type": "map_at_100", "value": 12.736}, {"type": "map_at_1000", "value": 12.848}, {"type": "map_at_20", "value": 12.354}, {"type": "map_at_3", "value": 10.687000000000001}, {"type": "map_at_5", "value": 11.344}, {"type": "mrr_at_1", "value": 10.771}, {"type": "mrr_at_10", "value": 14.753}, {"type": "mrr_at_100", "value": 15.501000000000001}, {"type": "mrr_at_1000", "value": 15.592}, {"type": "mrr_at_20", "value": 15.148}, {"type": "mrr_at_3", "value": 13.425999999999998}, {"type": "mrr_at_5", "value": 14.059}, {"type": "ndcg_at_1", "value": 10.771}, {"type": "ndcg_at_10", "value": 14.788}, {"type": "ndcg_at_100", "value": 18.769}, {"type": "ndcg_at_1000", "value": 21.939}, {"type": "ndcg_at_20", "value": 16.113}, {"type": "ndcg_at_3", "value": 12.356}, {"type": "ndcg_at_5", "value": 13.316}, {"type": "precision_at_1", "value": 10.771}, {"type": "precision_at_10", "value": 2.842}, {"type": "precision_at_100", "value": 0.58}, {"type": 
"precision_at_1000", "value": 0.099}, {"type": "precision_at_20", "value": 1.807}, {"type": "precision_at_3", "value": 5.976}, {"type": "precision_at_5", "value": 4.322}, {"type": "recall_at_1", "value": 8.448}, {"type": "recall_at_10", "value": 20.666}, {"type": "recall_at_100", "value": 39.111000000000004}, {"type": "recall_at_1000", "value": 62.673}, {"type": "recall_at_20", "value": 25.686999999999998}, {"type": "recall_at_3", "value": 13.572999999999999}, {"type": "recall_at_5", "value": 16.239}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 14.025000000000002}, {"type": "map_at_10", "value": 18.605}, {"type": "map_at_100", "value": 19.442999999999998}, {"type": "map_at_1000", "value": 19.569}, {"type": "map_at_20", "value": 19.070999999999998}, {"type": "map_at_3", "value": 17.072000000000003}, {"type": "map_at_5", "value": 17.866}, {"type": "mrr_at_1", "value": 16.511}, {"type": "mrr_at_10", "value": 21.633}, {"type": "mrr_at_100", "value": 22.419}, {"type": "mrr_at_1000", "value": 22.521}, {"type": "mrr_at_20", "value": 22.063}, {"type": "mrr_at_3", "value": 19.932}, {"type": "mrr_at_5", "value": 20.864}, {"type": "ndcg_at_1", "value": 16.511}, {"type": "ndcg_at_10", "value": 21.931}, {"type": "ndcg_at_100", "value": 26.088}, {"type": "ndcg_at_1000", "value": 29.564}, {"type": "ndcg_at_20", "value": 23.557}, {"type": "ndcg_at_3", "value": 18.869}, {"type": "ndcg_at_5", "value": 20.203}, {"type": "precision_at_1", "value": 16.511}, {"type": "precision_at_10", "value": 3.7220000000000004}, {"type": "precision_at_100", "value": 0.637}, {"type": "precision_at_1000", "value": 0.105}, {"type": "precision_at_20", "value": 2.299}, {"type": "precision_at_3", "value": 8.52}, {"type": "precision_at_5", "value": 6.007}, {"type": "recall_at_1", "value": 14.025000000000002}, {"type": "recall_at_10", "value": 29.24}, {"type": "recall_at_100", "value": 47.771}, {"type": "recall_at_1000", "value": 73.37599999999999}, {"type": "recall_at_20", "value": 35.148}, {"type": "recall_at_3", "value": 20.721}, {"type": "recall_at_5", "value": 24.162}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 14.610000000000001}, {"type": "map_at_10", "value": 20.089000000000002}, {"type": "map_at_100", "value": 21.105}, {"type": "map_at_1000", "value": 21.275}, {"type": "map_at_20", "value": 20.604}, {"type": "map_at_3", "value": 18.323}, {"type": "map_at_5", "value": 19.192}, {"type": "mrr_at_1", "value": 18.182000000000002}, {"type": "mrr_at_10", "value": 23.458000000000002}, {"type": "mrr_at_100", "value": 24.379}, {"type": "mrr_at_1000", "value": 24.474999999999998}, {"type": "mrr_at_20", "value": 23.973}, {"type": "mrr_at_3", "value": 21.64}, {"type": "mrr_at_5", "value": 22.579}, {"type": "ndcg_at_1", "value": 18.182000000000002}, {"type": "ndcg_at_10", "value": 23.842}, {"type": "ndcg_at_100", "value": 28.604000000000003}, {"type": "ndcg_at_1000", "value": 32.192}, {"type": "ndcg_at_20", "value": 25.507}, {"type": "ndcg_at_3", "value": 20.937}, {"type": "ndcg_at_5", "value": 22.125}, {"type": "precision_at_1", "value": 18.182000000000002}, {"type": "precision_at_10", "value": 4.526}, 
{"type": "precision_at_100", "value": 0.955}, {"type": "precision_at_1000", "value": 0.17500000000000002}, {"type": "precision_at_20", "value": 2.846}, {"type": "precision_at_3", "value": 10.079}, {"type": "precision_at_5", "value": 7.194000000000001}, {"type": "recall_at_1", "value": 14.610000000000001}, {"type": "recall_at_10", "value": 31.086999999999996}, {"type": "recall_at_100", "value": 53.032000000000004}, {"type": "recall_at_1000", "value": 77.781}, {"type": "recall_at_20", "value": 37.801}, {"type": "recall_at_3", "value": 22.078999999999997}, {"type": "recall_at_5", "value": 25.572}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 8.484}, {"type": "map_at_10", "value": 12.614}, {"type": "map_at_100", "value": 13.439}, {"type": "map_at_1000", "value": 13.536999999999999}, {"type": "map_at_20", "value": 13.055}, {"type": "map_at_3", "value": 11.036}, {"type": "map_at_5", "value": 11.927999999999999}, {"type": "mrr_at_1", "value": 9.612}, {"type": "mrr_at_10", "value": 14.105}, {"type": "mrr_at_100", "value": 14.953}, {"type": "mrr_at_1000", "value": 15.043000000000001}, {"type": "mrr_at_20", "value": 14.578}, {"type": "mrr_at_3", "value": 12.477}, {"type": "mrr_at_5", "value": 13.420000000000002}, {"type": "ndcg_at_1", "value": 9.612}, {"type": "ndcg_at_10", "value": 15.476999999999999}, {"type": "ndcg_at_100", "value": 19.822}, {"type": "ndcg_at_1000", "value": 22.872}, {"type": "ndcg_at_20", "value": 17.081}, {"type": "ndcg_at_3", "value": 12.328999999999999}, {"type": "ndcg_at_5", "value": 13.861}, {"type": "precision_at_1", "value": 9.612}, {"type": "precision_at_10", "value": 2.625}, {"type": "precision_at_100", "value": 0.51}, {"type": "precision_at_1000", "value": 0.082}, {"type": "precision_at_20", "value": 1.664}, {"type": "precision_at_3", "value": 5.484}, {"type": "precision_at_5", "value": 4.1770000000000005}, {"type": "recall_at_1", "value": 8.484}, {"type": "recall_at_10", "value": 23.087}, {"type": "recall_at_100", "value": 43.352000000000004}, {"type": "recall_at_1000", "value": 67.247}, {"type": "recall_at_20", "value": 29.187}, {"type": "recall_at_3", "value": 14.521999999999998}, {"type": "recall_at_5", "value": 18.218999999999998}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 39.095}, {"type": "f1", "value": 35.03781407521973}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 67.87360000000001}, {"type": "ap", "value": 62.5359530212091}, {"type": "f1", "value": 67.76861907065303}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 89.98176014591884}, {"type": "f1", "value": 89.12439802681382}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", 
"revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 66.42954856361149}, {"type": "f1", "value": 46.845543295765395}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 65.15131136516476}, {"type": "f1", "value": 63.15954994502248}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 70.74983187626093}, {"type": "f1", "value": 69.86842975748304}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 28.540533318170436}, {"type": "v_measures", "value": [0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 
0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 
0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 
0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 
0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 
0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 
0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925, 0.2685927394835013, 0.2783483658319506, 0.2665766690371173, 0.27851721126872275, 0.27353686950062217, 0.30395264384113213, 0.2947770213532569, 0.2955213403120467, 0.29310302531656435, 0.30112744587212925]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 24.72766780458758}, {"type": "v_measures", "value": [0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 
0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 
0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 
0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 
0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 
0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677, 0.2360459034955713, 0.2388482784840708, 0.2325812879581466, 0.22505966000679387, 0.22406230314275308, 0.2639574575941564, 0.26920314836084647, 0.27094201539166474, 0.2607957980208871, 0.2512709280038677]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, 
"metrics": [{"type": "map", "value": 29.041674385781967}, {"type": "mrr", "value": 29.79989064897717}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 35.536820860805534}, {"type": "v_measures", "value": [0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 
0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 
0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 
0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 
0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 
0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 
0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 
0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 
0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 
0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 
0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 
0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 
0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 
0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 
0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254, 0.4526409230197192, 0.4418874523236007, 0.2923442821974814, 0.29815379814662973, 0.3546740441941569, 0.31765082600163064, 0.38073340268020667, 0.3175014611482366, 0.31266612397153287, 0.3004669681159753, 0.33575554767305055, 0.39815200864255257, 0.3336520605714611, 0.38338117463096916, 0.4620636786448619, 0.313480729190297, 0.3538208608160554, 0.3625124773562338, 0.35967221153279816, 0.34429637871008256, 0.315565319725188, 0.31257481088967437, 0.48026035919192217, 0.3260933295798252, 0.33420498624724254]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 47.42966991443242}, {"type": "v_measures", "value": [0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 
0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 
0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 
0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 
0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 
0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 
0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793, 0.5265449680406118, 0.5493672486309341, 0.5519094814834113, 0.30132219602063365, 0.5071491767482975, 0.4654803385774871, 0.20652468935420157, 0.5484274172396977, 0.5112975786305867, 0.5749438967173793]}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.02197797488559}, {"type": "cos_sim_spearman", "value": 71.7037151299904}, {"type": "euclidean_pearson", "value": 76.1707252695092}, {"type": "euclidean_spearman", "value": 71.57310842242731}, {"type": "manhattan_pearson", "value": 76.03615971307154}, {"type": "manhattan_spearman", "value": 71.53631984773847}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.02494360083622}, {"type": "cos_sim_spearman", "value": 69.72575299384381}, {"type": "euclidean_pearson", "value": 75.74354904408656}, {"type": "euclidean_spearman", "value": 69.54484408453516}, {"type": "manhattan_pearson", "value": 75.77951962076156}, {"type": "manhattan_spearman", "value": 69.6354936146991}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.08237871140905}, {"type": "cos_sim_spearman", "value": 76.43254419101892}, {"type": "euclidean_pearson", "value": 77.01392166862142}, {"type": "euclidean_spearman", "value": 77.25873928927386}, {"type": "manhattan_pearson", "value": 76.8322542796806}, {"type": "manhattan_spearman", "value": 77.06622162313037}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 76.15651557768992}, {"type": "cos_sim_spearman", "value": 73.66468164294979}, {"type": "euclidean_pearson", "value": 76.01343601779764}, {"type": "euclidean_spearman", "value": 74.26813269648791}, {"type": "manhattan_pearson", "value": 75.81532622772455}, {"type": "manhattan_spearman", "value": 74.11890179466049}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 81.80212103727666}, {"type": "cos_sim_spearman", "value": 82.61832225494061}, {"type": "euclidean_pearson", "value": 81.83006587249692}, {"type": "euclidean_spearman", "value": 
82.61429686151203}, {"type": "manhattan_pearson", "value": 81.76278849963437}, {"type": "manhattan_spearman", "value": 82.54152053739365}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.75548172603382}, {"type": "cos_sim_spearman", "value": 79.48976464310448}, {"type": "euclidean_pearson", "value": 78.54266801280951}, {"type": "euclidean_spearman", "value": 79.30766703387586}, {"type": "manhattan_pearson", "value": 78.28008795002846}, {"type": "manhattan_spearman", "value": 79.07395809817007}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.813657478234}, {"type": "cos_sim_spearman", "value": 84.38223117622964}, {"type": "euclidean_pearson", "value": 84.57065602789609}, {"type": "euclidean_spearman", "value": 83.8380794185294}, {"type": "manhattan_pearson", "value": 84.42039206232738}, {"type": "manhattan_spearman", "value": 83.74732339282085}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 50.88695953591733}, {"type": "cos_sim_spearman", "value": 60.61167810477114}, {"type": "euclidean_pearson", "value": 55.81887963485168}, {"type": "euclidean_spearman", "value": 60.28385340456606}, {"type": "manhattan_pearson", "value": 56.03578991214848}, {"type": "manhattan_spearman", "value": 59.94178607215249}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 78.97129674864591}, {"type": "cos_sim_spearman", "value": 78.60681572589853}, {"type": "euclidean_pearson", "value": 79.71108582359511}, {"type": "euclidean_spearman", "value": 78.71541582168763}, {"type": "manhattan_pearson", "value": 79.55279136411954}, {"type": "manhattan_spearman", "value": 78.57797218212967}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 72.91005664580126}, {"type": "mrr", "value": 91.49957703879274}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.73960396039604}, {"type": "cos_sim_ap", "value": 92.14278266682584}, {"type": "cos_sim_f1", "value": 86.90890990542557}, {"type": "cos_sim_precision", "value": 86.5213082259663}, {"type": "cos_sim_recall", "value": 87.3}, {"type": "dot_accuracy", "value": 99.49801980198019}, {"type": "dot_ap", "value": 77.95867119922542}, {"type": "dot_f1", "value": 71.30528586839266}, {"type": "dot_precision", "value": 77.40046838407494}, {"type": "dot_recall", "value": 66.10000000000001}, {"type": "euclidean_accuracy", "value": 99.73861386138614}, {"type": "euclidean_ap", "value": 
92.13035792099073}, {"type": "euclidean_f1", "value": 86.81102362204726}, {"type": "euclidean_precision", "value": 85.46511627906976}, {"type": "euclidean_recall", "value": 88.2}, {"type": "manhattan_accuracy", "value": 99.73762376237623}, {"type": "manhattan_ap", "value": 92.12382961875572}, {"type": "manhattan_f1", "value": 86.85770750988142}, {"type": "manhattan_precision", "value": 85.83984375}, {"type": "manhattan_recall", "value": 87.9}, {"type": "max_accuracy", "value": 99.73960396039604}, {"type": "max_ap", "value": 92.14278266682584}, {"type": "max_f1", "value": 86.90890990542557}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 46.543842750900396}, {"type": "v_measures", "value": [0.4503379530441486, 0.5055050814643914, 0.37196718256808775, 0.4626103115561461, 0.5045922481143936, 0.4024029244484936, 0.40741224663943404, 0.5217286774083806, 0.47473832818512185, 0.4423513686832282, 0.5254123899176399, 0.5290494091156918, 0.6004444685084731, 0.5097409136008207, 0.42510119674835567, 0.45914095980022224, 0.43938053466177135, 0.459032754379216, 0.43147103735898107, 0.4430589611998686, 0.4953516718234184, 0.4169835530427121, 0.43908761316001205, 0.46089722865011284, 0.45816167364597893]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 30.878511440057455}, {"type": "v_measures", "value": [0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 
0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 
0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 
0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626, 0.29953106753859926, 0.2959477268193652, 0.2907437538793838, 0.2896752248560134, 0.29823646573245677, 0.3302941899873012, 0.3118332962228191, 0.3227164592768227, 0.32075958907773794, 0.32811337061524626]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 44.48878112961158}, {"type": "mrr", "value": 45.088675621763855}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.941315767578654}, {"type": "cos_sim_spearman", "value": 29.329027079065966}, {"type": "dot_pearson", "value": 25.836517566143634}, {"type": "dot_spearman", "value": 26.352097845535162}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 64.013671875}, {"type": "ap", "value": 10.97313563679864}, {"type": "f1", "value": 48.85384219487}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 54.25863044708545}, {"type": "f1", "value": 54.478056275468234}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 28.676358868345943}, {"type": "v_measures", "value": [0.29080031699004694, 0.2957827890450158, 0.279781173635388, 
0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 
0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 
0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 
0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 
0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 
0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625, 0.29080031699004694, 0.2957827890450158, 0.279781173635388, 0.28316010309239503, 0.29801751933798737, 0.30045501200974795, 0.2750357568275408, 0.28736739490829033, 0.26884372823491953, 0.2883920927532625]}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 83.23895809739524}, {"type": "cos_sim_ap", "value": 63.43837390346798}, {"type": "cos_sim_f1", "value": 60.3425871234495}, {"type": "cos_sim_precision", "value": 54.63101604278074}, {"type": "cos_sim_recall", "value": 67.38786279683377}, {"type": "dot_accuracy", "value": 79.70435715562974}, {"type": "dot_ap", "value": 50.219858779642024}, {"type": "dot_f1", "value": 52.03935006079363}, {"type": "dot_precision", "value": 44.778390717139054}, {"type": "dot_recall", "value": 62.11081794195251}, {"type": "euclidean_accuracy", "value": 83.3581689217381}, 
{"type": "euclidean_ap", "value": 63.866502871821886}, {"type": "euclidean_f1", "value": 60.66180862501495}, {"type": "euclidean_precision", "value": 55.42457978607291}, {"type": "euclidean_recall", "value": 66.99208443271768}, {"type": "manhattan_accuracy", "value": 83.32836621565238}, {"type": "manhattan_ap", "value": 63.58246341419401}, {"type": "manhattan_f1", "value": 60.405654578979714}, {"type": "manhattan_precision", "value": 56.54775604142692}, {"type": "manhattan_recall", "value": 64.82849604221636}, {"type": "max_accuracy", "value": 83.3581689217381}, {"type": "max_ap", "value": 63.866502871821886}, {"type": "max_f1", "value": 60.66180862501495}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 87.77894205767066}, {"type": "cos_sim_ap", "value": 83.5297230824822}, {"type": "cos_sim_f1", "value": 75.65036420395423}, {"type": "cos_sim_precision", "value": 73.11781609195403}, {"type": "cos_sim_recall", "value": 78.3646442870342}, {"type": "dot_accuracy", "value": 86.03058175185313}, {"type": "dot_ap", "value": 78.95144253575621}, {"type": "dot_f1", "value": 72.20582032897512}, {"type": "dot_precision", "value": 66.42524573202276}, {"type": "dot_recall", "value": 79.08838928241454}, {"type": "euclidean_accuracy", "value": 87.7265494624908}, {"type": "euclidean_ap", "value": 83.29997302389856}, {"type": "euclidean_f1", "value": 75.38237163905613}, {"type": "euclidean_precision", "value": 73.28582854649895}, {"type": "euclidean_recall", "value": 77.60240221743148}, {"type": "manhattan_accuracy", "value": 87.65475220242946}, {"type": "manhattan_ap", "value": 83.1779453049763}, {"type": "manhattan_f1", "value": 75.17620001483792}, {"type": "manhattan_precision", "value": 72.53400143163923}, {"type": "manhattan_recall", "value": 78.01817061903296}, {"type": "max_accuracy", "value": 87.77894205767066}, {"type": "max_ap", "value": 83.5297230824822}, {"type": "max_f1", "value": 75.65036420395423}]}]}]} | Mihaiii/test25 | null | [
"sentence-transformers",
"onnx",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"bge",
"mteb",
"mergekit",
"merge",
"base_model:Mihaiii/Wartortle",
"base_model:TaylorAI/bge-micro-v2",
"license:mit",
"model-index",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:31:02+00:00 |
text-generation | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
## Gemma-7B-Chat-DcardStylePost-SFT
This model is a fine-tuned version of [google/gemma-7b-it](https://huggingface.co/google/gemma-7b-it) on the dcardwom_zhtw_train dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
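This checkpoint is a LoRA adapter trained with PEFT, so inference typically loads the base model first and attaches the adapter on top. A minimal, untested sketch (the Chinese prompt is illustrative only):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "google/gemma-7b-it"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
# attach the fine-tuned LoRA weights from this repository
model = PeftModel.from_pretrained(base, "JiunYi/Gemma-7B-Chat-zhtw-DcardStylePost-SFT")

prompt = "請寫一篇 Dcard 風格的貼文"  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```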
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.10.0
- Transformers 4.40.1
- Pytorch 2.2.0a0+81ea7a4
- Datasets 2.19.0
- Tokenizers 0.19.1 | {"license": "gpl-3.0", "library_name": "peft", "tags": ["art", "marketing", "llama-factory", "lora", "generated_from_trainer"], "base_model": "google/gemma-7b-it", "pipeline_tag": "text-generation", "model-index": [{"name": "train_2024-05-01-08-42-24", "results": []}]} | JiunYi/Gemma-7B-Chat-zhtw-DcardStylePost-SFT | null | [
"peft",
"safetensors",
"gemma",
"art",
"marketing",
"llama-factory",
"lora",
"generated_from_trainer",
"text-generation",
"conversational",
"base_model:google/gemma-7b-it",
"license:gpl-3.0",
"region:us"
] | null | 2024-05-01T19:31:47+00:00 |
text2text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-xsum
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the xsum dataset.
It achieves the following results on the evaluation set:
- eval_loss: 2.5726
- eval_rouge1: 26.5342
- eval_rouge2: 6.8822
- eval_rougeL: 20.9891
- eval_rougeLsum: 20.9973
- eval_gen_len: 18.8023
- eval_runtime: 96.7741
- eval_samples_per_second: 11.708
- eval_steps_per_second: 2.935
- epoch: 1.0
- step: 5101
## Model description
More information needed
## Intended uses & limitations
More information needed
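For quick experimentation, a minimal summarization sketch using this checkpoint (the input article below is a placeholder):
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="Suryansh5545/t5-small-finetuned-xsum")
article = "Replace this with the news article you want summarized."  # placeholder input
print(summarizer(article, max_length=60, min_length=10, do_sample=False)[0]["summary_text"])
```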
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Framework versions
- Transformers 4.30.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.0
- Tokenizers 0.13.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["xsum"], "model-index": [{"name": "t5-small-finetuned-xsum", "results": []}]} | Suryansh5545/t5-small-finetuned-xsum | null | [
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:xsum",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:31:51+00:00 |
sentence-similarity | sentence-transformers |
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 27371 with parameters:
```
{'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
Parameters of the fit()-Method:
```
{
"epochs": 1,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 2737,
"weight_decay": 0.01
}
```
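Putting the pieces together, a minimal sketch of how these parameters map onto the classic sentence-transformers `fit()` loop. The base checkpoint is guessed from the repo name and the training pairs are placeholders; only the hyperparameters above are taken from this card:
```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# hypothetical data; the actual search-click training pairs are not published
train_examples = [InputExample(texts=["query", "matching document"])]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=64)

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")  # assumed base
train_loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    warmup_steps=2737,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
)
```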
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Citing & Authors
<!--- Describe where people can find more information --> | {"library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity"], "pipeline_tag": "sentence-similarity"} | alexjones1925/all-MiniLM-L12-v2-gp-walmart-dae-allrows-search-clicks | null | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:32:28+00:00 |
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
KangalKhan-RawEmerald-7B - bnb 4bits
- Model creator: https://huggingface.co/Yuma42/
- Original model: https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B/
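Because this repository stores already-quantized bitsandbytes 4-bit weights, loading should not require an explicit quantization config. A minimal sketch (assumes `bitsandbytes` and a CUDA GPU are available):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-4bits"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

messages = [{"role": "user", "content": "Hello, what are you?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0], skip_special_tokens=True))
```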
Original model description:
---
language:
- en
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
base_model:
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
model-index:
- name: KangalKhan-RawEmerald-7B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 66.89
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 85.75
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.23
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 57.58
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 78.22
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 62.85
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
---
# KangalKhan-RawEmerald-7B
I suggest using ChatML (use whatever system prompt you like; this is just an example!):
```
<|im_start|>system
You are a friendly assistant.<|im_end|>
<|im_start|>user
Hello, what are you?<|im_end|>
<|im_start|>assistant
I am an AI language model designed to assist users with information and answer their questions. How can I help you today?<|im_end|>
```
Q4_K_S GGUF:
https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B-GGUF
More GGUF variants by [mradermacher](https://huggingface.co/mradermacher):
WARNING: I have observed that these versions output typos in rare cases. If you have the same problem, use my Q4_K_S GGUF above.
https://huggingface.co/mradermacher/KangalKhan-RawEmerald-7B-GGUF
KangalKhan-RawEmerald-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B)
* [argilla/distilabeled-OpenHermes-2.5-Mistral-7B](https://huggingface.co/argilla/distilabeled-OpenHermes-2.5-Mistral-7B)
## 🧩 Configuration
```yaml
models:
- model: teknium/OpenHermes-2.5-Mistral-7B
# no parameters necessary for base model
- model: argilla/CapybaraHermes-2.5-Mistral-7B
parameters:
density: 0.6
weight: 0.5
- model: argilla/distilabeled-OpenHermes-2.5-Mistral-7B
parameters:
density: 0.6
weight: 0.5
merge_method: ties
base_model: teknium/OpenHermes-2.5-Mistral-7B
parameters:
normalize: true
dtype: bfloat16
```
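To reproduce a merge from a config like this, the mergekit CLI is typically pointed at the YAML. A hypothetical invocation via Python's subprocess (assumes `pip install mergekit` and the config above saved as `config.yaml`):
```python
import subprocess

# mergekit-yaml reads the merge config and writes the merged model to the output directory
subprocess.run(["mergekit-yaml", "config.yaml", "./KangalKhan-RawEmerald-7B"], check=True)
```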
## 💻 Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Yuma42/KangalKhan-RawEmerald-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B)
| Metric |Value|
|---------------------------------|----:|
|Avg. |69.09|
|AI2 Reasoning Challenge (25-Shot)|66.89|
|HellaSwag (10-Shot) |85.75|
|MMLU (5-Shot) |63.23|
|TruthfulQA (0-shot) |57.58|
|Winogrande (5-shot) |78.22|
|GSM8k (5-shot) |62.85|
| {} | RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-4bits | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | null | 2024-05-01T19:32:42+00:00 |
null | null | {} | Hanna723/beadando_agent | null | [
"region:us"
] | null | 2024-05-01T19:33:21+00:00 |
|
null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# llava-1.5-7b-hf-ft-mix-vsft-1
This model is a fine-tuned version of [llava-hf/llava-1.5-7b-hf](https://huggingface.co/llava-hf/llava-1.5-7b-hf) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
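A minimal multimodal inference sketch for this adapter (the image URL and prompt are placeholders; assumes a recent `transformers` with LLaVA support):
```python
import requests
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration
from peft import PeftModel

base_id = "llava-hf/llava-1.5-7b-hf"
processor = AutoProcessor.from_pretrained(base_id)
model = LlavaForConditionalGeneration.from_pretrained(base_id, device_map="auto")
# attach the fine-tuned adapter from this repository
model = PeftModel.from_pretrained(model, "Shiv34/llava-1.5-7b-hf-ft-mix-vsft-1")

image = Image.open(requests.get("https://example.com/image.jpg", stream=True).raw)  # placeholder URL
prompt = "USER: <image>\nDescribe this image. ASSISTANT:"
inputs = processor(text=prompt, images=image, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(processor.decode(out[0], skip_special_tokens=True))
```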
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.4e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.10.0
- Transformers 4.40.1
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.19.1 | {"library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "llava-hf/llava-1.5-7b-hf", "model-index": [{"name": "llava-1.5-7b-hf-ft-mix-vsft-1", "results": []}]} | Shiv34/llava-1.5-7b-hf-ft-mix-vsft-1 | null | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:llava-hf/llava-1.5-7b-hf",
"region:us"
] | null | 2024-05-01T19:33:35+00:00 |
null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# nash_dpo_rank4_iter_2
This model is a fine-tuned version of [YYYYYYibo/nash_dpo_iter_1](https://huggingface.co/YYYYYYibo/nash_dpo_iter_1) on the updated and the original datasets.
It achieves the following results on the evaluation set:
- Loss: 0.6181
- Rewards/chosen: -0.4066
- Rewards/rejected: -0.6115
- Rewards/accuracies: 0.6680
- Rewards/margins: 0.2049
- Logps/rejected: -350.9844
- Logps/chosen: -339.0614
- Logits/rejected: -2.1508
- Logits/chosen: -2.2761
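For context, the reward columns above follow the standard DPO construction: the implicit rewards are β-scaled log-probability ratios between the policy and the reference model, and the margin is their difference. A minimal sketch of the loss (β = 0.1 is illustrative, not taken from this run):
```python
import torch
import torch.nn.functional as F

def dpo_loss(pi_chosen_logps, pi_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    # implicit rewards: beta * log-ratio of policy vs. reference probabilities
    chosen_rewards = beta * (pi_chosen_logps - ref_chosen_logps)        # "rewards/chosen"
    rejected_rewards = beta * (pi_rejected_logps - ref_rejected_logps)  # "rewards/rejected"
    margins = chosen_rewards - rejected_rewards                         # "rewards/margins"
    loss = -F.logsigmoid(margins).mean()
    return loss, chosen_rewards, rejected_rewards
```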
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- total_eval_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.6232 | 0.51 | 100 | 0.6181 | -0.4066 | -0.6115 | 0.6680 | 0.2049 | -350.9844 | -339.0614 | -2.1508 | -2.2761 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.14.6
- Tokenizers 0.15.2 | {"license": "apache-2.0", "library_name": "peft", "tags": ["alignment-handbook", "generated_from_trainer", "trl", "dpo"], "datasets": ["updated", "original"], "base_model": "alignment-handbook/zephyr-7b-sft-full", "model-index": [{"name": "nash_dpo_rank4_iter_2", "results": []}]} | YYYYYYibo/nash_dpo_rank4_iter_2 | null | [
"peft",
"safetensors",
"mistral",
"alignment-handbook",
"generated_from_trainer",
"trl",
"dpo",
"dataset:updated",
"dataset:original",
"base_model:alignment-handbook/zephyr-7b-sft-full",
"license:apache-2.0",
"region:us"
] | null | 2024-05-01T19:35:48+00:00 |
null | null | {} | samzirbo/mT5.baseline.test.no_safetensors_v2 | null | [
"region:us"
] | null | 2024-05-01T19:37:08+00:00 |
|
null | null | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
KangalKhan-Ruby-7B-Fixed - GGUF
- Model creator: https://huggingface.co/Yuma42/
- Original model: https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [KangalKhan-Ruby-7B-Fixed.Q2_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q2_K.gguf) | Q2_K | 2.53GB |
| [KangalKhan-Ruby-7B-Fixed.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [KangalKhan-Ruby-7B-Fixed.IQ3_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [KangalKhan-Ruby-7B-Fixed.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [KangalKhan-Ruby-7B-Fixed.IQ3_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [KangalKhan-Ruby-7B-Fixed.Q3_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q3_K.gguf) | Q3_K | 3.28GB |
| [KangalKhan-Ruby-7B-Fixed.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [KangalKhan-Ruby-7B-Fixed.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [KangalKhan-Ruby-7B-Fixed.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [KangalKhan-Ruby-7B-Fixed.Q4_0.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q4_0.gguf) | Q4_0 | 3.83GB |
| [KangalKhan-Ruby-7B-Fixed.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [KangalKhan-Ruby-7B-Fixed.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [KangalKhan-Ruby-7B-Fixed.Q4_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q4_K.gguf) | Q4_K | 4.07GB |
| [KangalKhan-Ruby-7B-Fixed.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [KangalKhan-Ruby-7B-Fixed.Q4_1.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q4_1.gguf) | Q4_1 | 4.24GB |
| [KangalKhan-Ruby-7B-Fixed.Q5_0.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q5_0.gguf) | Q5_0 | 4.65GB |
| [KangalKhan-Ruby-7B-Fixed.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [KangalKhan-Ruby-7B-Fixed.Q5_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q5_K.gguf) | Q5_K | 4.78GB |
| [KangalKhan-Ruby-7B-Fixed.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [KangalKhan-Ruby-7B-Fixed.Q5_1.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q5_1.gguf) | Q5_1 | 5.07GB |
| [KangalKhan-Ruby-7B-Fixed.Q6_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf/blob/main/KangalKhan-Ruby-7B-Fixed.Q6_K.gguf) | Q6_K | 5.53GB |
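These GGUF files run on any llama.cpp-compatible runtime. A minimal sketch using the `llama-cpp-python` bindings (this assumes the Q4_K_M file from the table has been downloaded locally; the ChatML prompt mirrors the format suggested in the original model card below):
```python
# Minimal sketch: run a downloaded GGUF quant with llama-cpp-python.
from llama_cpp import Llama
llm = Llama(model_path="KangalKhan-Ruby-7B-Fixed.Q4_K_M.gguf", n_ctx=4096)
prompt = (
    "<|im_start|>system\nYou are a friendly assistant.<|im_end|>\n"
    "<|im_start|>user\nHello, what are you?<|im_end|>\n"
    "<|im_start|>assistant\n"
)
out = llm(prompt, max_tokens=256, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```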
Original model description:
---
language:
- en
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
base_model:
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
model-index:
- name: KangalKhan-Ruby-7B-Fixed
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 67.24
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 85.22
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.21
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 56.49
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 77.98
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 61.94
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-Ruby-7B-Fixed
name: Open LLM Leaderboard
---
# KangalKhan-Ruby-7B
I suggest using ChatML (Use whatever system prompt you like, this is just an example!):
```
<|im_start|>system
You are a friendly assistant.<|im_end|>
<|im_start|>user
Hello, what are you?<|im_end|>
<|im_start|>assistant
I am an AI language model designed to assist users with information and answer their questions. How can I help you today?<|im_end|>
```
Q4_K_S GGUF:
https://huggingface.co/Yuma42/KangalKhan-Ruby-7B-Fixed-GGUF
More GGUF variants by [mradermacher](https://huggingface.co/mradermacher):
WARNING: I have observed that these versions output typos in rare cases. If you have the same problem, use my Q4_K_S GGUF above.
https://huggingface.co/mradermacher/KangalKhan-Ruby-7B-Fixed-GGUF
KangalKhan-Ruby-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B)
* [argilla/distilabeled-OpenHermes-2.5-Mistral-7B](https://huggingface.co/argilla/distilabeled-OpenHermes-2.5-Mistral-7B)
## π§© Configuration
```yaml
slices:
- sources:
- model: argilla/CapybaraHermes-2.5-Mistral-7B
layer_range: [0, 32]
- model: argilla/distilabeled-OpenHermes-2.5-Mistral-7B
layer_range: [0, 32]
merge_method: slerp
base_model: argilla/CapybaraHermes-2.5-Mistral-7B
parameters:
t:
- filter: self_attn
value: [1, 0.5, 0.7, 0.3, 0]
- filter: mlp
value: [0, 0.5, 0.3, 0.7, 1]
- value: 0.5
dtype: bfloat16
```
## π» Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Yuma42/KangalKhan-Ruby-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B-Fixed)
| Metric |Value|
|---------------------------------|----:|
|Avg. |68.68|
|AI2 Reasoning Challenge (25-Shot)|67.24|
|HellaSwag (10-Shot) |85.22|
|MMLU (5-Shot) |63.21|
|TruthfulQA (0-shot) |56.49|
|Winogrande (5-shot) |77.98|
|GSM8k (5-shot) |61.94|
| {} | RichardErkhov/Yuma42_-_KangalKhan-Ruby-7B-Fixed-gguf | null | [
"gguf",
"region:us"
] | null | 2024-05-01T19:37:26+00:00 |
null | null | This is a training model. | {} | HoodySi/treningowe | null | [
"region:us"
] | null | 2024-05-01T19:37:52+00:00 |
text-generation | transformers | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
KangalKhan-RawEmerald-7B - bnb 8bits
- Model creator: https://huggingface.co/Yuma42/
- Original model: https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B/
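Because the checkpoint is serialized in 8-bit, it can be loaded straight from the Hub; a minimal sketch (this assumes `bitsandbytes` and `accelerate` are installed — the quantization config stored with the weights is picked up automatically):
```python
# Minimal sketch: load the pre-quantized 8-bit checkpoint with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-8bits"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
inputs = tokenizer("Hello, what are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```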
Original model description:
---
language:
- en
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
base_model:
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
model-index:
- name: KangalKhan-RawEmerald-7B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 66.89
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 85.75
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.23
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 57.58
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 78.22
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 62.85
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
---
# KangalKhan-RawEmerald-7B
I suggest using ChatML (Use whatever system prompt you like, this is just an example!):
```
<|im_start|>system
You are a friendly assistant.<|im_end|>
<|im_start|>user
Hello, what are you?<|im_end|>
<|im_start|>assistant
I am an AI language model designed to assist users with information and answer their questions. How can I help you today?<|im_end|>
```
Q4_K_S GGUF:
https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B-GGUF
More GGUF variants by [mradermacher](https://huggingface.co/mradermacher):
WARNING: I have observed that these versions output typos in rare cases. If you have the same problem, use my Q4_K_S GGUF above.
https://huggingface.co/mradermacher/KangalKhan-RawEmerald-7B-GGUF
KangalKhan-RawEmerald-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B)
* [argilla/distilabeled-OpenHermes-2.5-Mistral-7B](https://huggingface.co/argilla/distilabeled-OpenHermes-2.5-Mistral-7B)
## π§© Configuration
```yaml
models:
- model: teknium/OpenHermes-2.5-Mistral-7B
# no parameters necessary for base model
- model: argilla/CapybaraHermes-2.5-Mistral-7B
parameters:
density: 0.6
weight: 0.5
- model: argilla/distilabeled-OpenHermes-2.5-Mistral-7B
parameters:
density: 0.6
weight: 0.5
merge_method: ties
base_model: teknium/OpenHermes-2.5-Mistral-7B
parameters:
normalize: true
dtype: bfloat16
```
## π» Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Yuma42/KangalKhan-RawEmerald-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B)
| Metric |Value|
|---------------------------------|----:|
|Avg. |69.09|
|AI2 Reasoning Challenge (25-Shot)|66.89|
|HellaSwag (10-Shot) |85.75|
|MMLU (5-Shot) |63.23|
|TruthfulQA (0-shot) |57.58|
|Winogrande (5-shot) |78.22|
|GSM8k (5-shot) |62.85|
| {} | RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-8bits | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"8-bit",
"region:us"
] | null | 2024-05-01T19:38:06+00:00 |
text2text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_opus_books_model
This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5191
- Bleu: 6.3813
- Gen Len: 17.539
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|
| 1.8456 | 1.0 | 6355 | 1.6112 | 5.7972 | 17.5672 |
| 1.7857 | 2.0 | 12710 | 1.5620 | 6.1557 | 17.5515 |
| 1.7359 | 3.0 | 19065 | 1.5358 | 6.2797 | 17.5462 |
| 1.7219 | 4.0 | 25420 | 1.5226 | 6.3581 | 17.5427 |
| 1.7219 | 5.0 | 31775 | 1.5191 | 6.3813 | 17.539 |
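The card does not state the language pair, but opus_books fine-tunes of t5-small are most often English→French; a minimal inference sketch under that assumption (repository id from this page):
```python
# Minimal sketch: translate with the fine-tuned T5 model (the en->fr pair is an assumption).
from transformers import pipeline
translator = pipeline("translation_en_to_fr", model="sakt90/my_awesome_opus_books_model")
print(translator("Legumes share resources with nitrogen-fixing bacteria.")[0]["translation_text"])
```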
### Framework versions
- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["bleu"], "base_model": "google-t5/t5-small", "model-index": [{"name": "my_awesome_opus_books_model", "results": []}]} | sakt90/my_awesome_opus_books_model | null | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:39:27+00:00 |
null | null | {} | ThuyNT/CS505_COQE_viT5_total_Instruction0_AOSPL_v1_h0 | null | [
"region:us"
] | null | 2024-05-01T19:41:15+00:00 |
|
null | null | {"license": "mit"} | umop-ap1sdn/CNN_Spectrogram_Emotion | null | [
"license:mit",
"region:us"
] | null | 2024-05-01T19:41:49+00:00 |
|
text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 0.001_withdpo_4iters_bs256_432lr_iter_4
This model is a fine-tuned version of [ShenaoZ/0.001_withdpo_4iters_bs256_432lr_iter_3](https://huggingface.co/ShenaoZ/0.001_withdpo_4iters_bs256_432lr_iter_3) on the updated and the original datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
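(Consistency check: the effective batch size of 256 above follows from 8 per-device × 8 GPUs × 4 gradient-accumulation steps.)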
### Training results
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.14.6
- Tokenizers 0.15.2
| {"license": "mit", "tags": ["alignment-handbook", "generated_from_trainer", "trl", "dpo", "generated_from_trainer"], "datasets": ["updated", "original"], "base_model": "ShenaoZ/0.001_withdpo_4iters_bs256_432lr_iter_3", "model-index": [{"name": "0.001_withdpo_4iters_bs256_432lr_iter_4", "results": []}]} | ShenaoZ/0.001_withdpo_4iters_bs256_432lr_iter_4 | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"alignment-handbook",
"generated_from_trainer",
"trl",
"dpo",
"conversational",
"dataset:updated",
"dataset:original",
"base_model:ShenaoZ/0.001_withdpo_4iters_bs256_432lr_iter_3",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:41:53+00:00 |
null | null | {} | ThuyNT/CS505_COQE_viT5_total_Instruction0_APOSL_v1_h0 | null | [
"region:us"
] | null | 2024-05-01T19:43:21+00:00 |
|
text2text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a π€ transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"license": "mit", "library_name": "transformers", "tags": []} | shramay-palta/test-demo-t5-large-qa | null | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"arxiv:1910.09700",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:43:38+00:00 |
null | null | {"license": "llama2"} | kriscchen/Llama-2-7b-Chat-GPTQ | null | [
"license:llama2",
"region:us"
] | null | 2024-05-01T19:43:52+00:00 |
|
null | null | {} | Mrtzx1/Sinap | null | [
"region:us"
] | null | 2024-05-01T19:44:10+00:00 |
|
null | null | {} | Asif2647/tuner07 | null | [
"region:us"
] | null | 2024-05-01T19:44:39+00:00 |
|
text-generation | transformers | {} | nelson-pawait/checkpoints_3 | null | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:46:04+00:00 |
|
text-to-image | diffusers |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# SDXL LoRA DreamBooth - embracellm/sushi24_LoRA
<Gallery />
## Model description
These are embracellm/sushi24_LoRA LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use a photo of Vegeterian Roll to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](https://huggingface.co/embracellm/sushi24_LoRA/tree/main) them in the Files & versions tab.
## Intended uses & limitations
#### How to use
```python
# Minimal sketch (not the authors' snippet): apply these LoRA weights to base SDXL.
from diffusers import DiffusionPipeline
import torch
pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16).to("cuda")
pipe.load_lora_weights("embracellm/sushi24_LoRA")
image = pipe("a photo of Vegeterian Roll").images[0]
```
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | {"license": "openrail++", "library_name": "diffusers", "tags": ["text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "a photo of Vegeterian Roll", "widget": []} | embracellm/sushi24_LoRA | null | [
"diffusers",
"tensorboard",
"text-to-image",
"diffusers-training",
"dora",
"template:sd-lora",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | null | 2024-05-01T19:46:49+00:00 |
automatic-speech-recognition | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper_fintuned
This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.2894
- eval_wer: 13.9949
- eval_runtime: 54.8883
- eval_samples_per_second: 9.109
- eval_steps_per_second: 1.148
- epoch: 16.3889
- step: 590
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
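A minimal transcription sketch for this checkpoint (repository id from this page; the audio path is a placeholder):
```python
# Minimal sketch: transcribe an audio file with the fine-tuned Whisper model.
from transformers import pipeline
asr = pipeline("automatic-speech-recognition", model="laalays/whisper_fintuned")
print(asr("sample.wav")["text"])  # "sample.wav" is a placeholder path
```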
### Framework versions
- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.1.dev0
- Tokenizers 0.19.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "openai/whisper-tiny.en", "model-index": [{"name": "whisper_fintuned", "results": []}]} | laalays/whisper_fintuned | null | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:openai/whisper-tiny.en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:47:30+00:00 |
null | transformers |
# Uploaded model
- **Developed by:** MR-Eder
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Phi-3-mini-4k-instruct
This Phi-3-based model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
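A minimal sketch for loading these LoRA adapters with Unsloth (adapter id from this page; `max_seq_length` and 4-bit loading are assumptions, not documented settings):
```python
# Minimal sketch: load the adapter with Unsloth's FastLanguageModel.
from unsloth import FastLanguageModel
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="MR-Eder/phi3-wiki-de-single-pairs-LoRA",  # base model + this adapter
    max_seq_length=4096,   # assumption, matching Phi-3-mini-4k's context
    load_in_4bit=True,     # assumption
)
FastLanguageModel.for_inference(model)  # enables Unsloth's faster inference path
```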
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "mistral", "trl"], "base_model": "unsloth/Phi-3-mini-4k-instruct"} | MR-Eder/phi3-wiki-de-single-pairs-LoRA | null | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"mistral",
"trl",
"en",
"base_model:unsloth/Phi-3-mini-4k-instruct",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:47:39+00:00 |
null | null | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
KangalKhan-RawEmerald-7B - GGUF
- Model creator: https://huggingface.co/Yuma42/
- Original model: https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [KangalKhan-RawEmerald-7B.Q2_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q2_K.gguf) | Q2_K | 2.53GB |
| [KangalKhan-RawEmerald-7B.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [KangalKhan-RawEmerald-7B.IQ3_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [KangalKhan-RawEmerald-7B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [KangalKhan-RawEmerald-7B.IQ3_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [KangalKhan-RawEmerald-7B.Q3_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q3_K.gguf) | Q3_K | 3.28GB |
| [KangalKhan-RawEmerald-7B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [KangalKhan-RawEmerald-7B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [KangalKhan-RawEmerald-7B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [KangalKhan-RawEmerald-7B.Q4_0.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q4_0.gguf) | Q4_0 | 3.83GB |
| [KangalKhan-RawEmerald-7B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [KangalKhan-RawEmerald-7B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [KangalKhan-RawEmerald-7B.Q4_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q4_K.gguf) | Q4_K | 4.07GB |
| [KangalKhan-RawEmerald-7B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [KangalKhan-RawEmerald-7B.Q4_1.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q4_1.gguf) | Q4_1 | 4.24GB |
| [KangalKhan-RawEmerald-7B.Q5_0.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q5_0.gguf) | Q5_0 | 4.65GB |
| [KangalKhan-RawEmerald-7B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [KangalKhan-RawEmerald-7B.Q5_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q5_K.gguf) | Q5_K | 4.78GB |
| [KangalKhan-RawEmerald-7B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [KangalKhan-RawEmerald-7B.Q5_1.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q5_1.gguf) | Q5_1 | 5.07GB |
| [KangalKhan-RawEmerald-7B.Q6_K.gguf](https://huggingface.co/RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf/blob/main/KangalKhan-RawEmerald-7B.Q6_K.gguf) | Q6_K | 5.53GB |
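A minimal sketch that fetches one of the quants above from this repository and runs it (the quant choice and generation settings are illustrative):
```python
# Minimal sketch: download a quant from this repo and run it with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama
path = hf_hub_download(
    repo_id="RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf",
    filename="KangalKhan-RawEmerald-7B.Q4_K_M.gguf",
)
llm = Llama(model_path=path, n_ctx=4096)
prompt = "<|im_start|>user\nHello, what are you?<|im_end|>\n<|im_start|>assistant\n"
print(llm(prompt, max_tokens=128, stop=["<|im_end|>"])["choices"][0]["text"])
```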
Original model description:
---
language:
- en
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
base_model:
- argilla/CapybaraHermes-2.5-Mistral-7B
- argilla/distilabeled-OpenHermes-2.5-Mistral-7B
model-index:
- name: KangalKhan-RawEmerald-7B
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 66.89
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 85.75
name: normalized accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 63.23
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 57.58
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 78.22
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 62.85
name: accuracy
source:
url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Yuma42/KangalKhan-RawEmerald-7B
name: Open LLM Leaderboard
---
# KangalKhan-RawEmerald-7B
I suggest using ChatML (Use whatever system prompt you like, this is just an example!):
```
<|im_start|>system
You are a friendly assistant.<|im_end|>
<|im_start|>user
Hello, what are you?<|im_end|>
<|im_start|>assistant
I am an AI language model designed to assist users with information and answer their questions. How can I help you today?<|im_end|>
```
Q4_K_S GGUF:
https://huggingface.co/Yuma42/KangalKhan-RawEmerald-7B-GGUF
More GGUF variants by [mradermacher](https://huggingface.co/mradermacher):
WARNING: I have observed that these versions output typos in rare cases. If you have the same problem, use my Q4_K_S GGUF above.
https://huggingface.co/mradermacher/KangalKhan-RawEmerald-7B-GGUF
KangalKhan-RawEmerald-7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B)
* [argilla/distilabeled-OpenHermes-2.5-Mistral-7B](https://huggingface.co/argilla/distilabeled-OpenHermes-2.5-Mistral-7B)
## π§© Configuration
```yaml
models:
- model: teknium/OpenHermes-2.5-Mistral-7B
# no parameters necessary for base model
- model: argilla/CapybaraHermes-2.5-Mistral-7B
parameters:
density: 0.6
weight: 0.5
- model: argilla/distilabeled-OpenHermes-2.5-Mistral-7B
parameters:
density: 0.6
weight: 0.5
merge_method: ties
base_model: teknium/OpenHermes-2.5-Mistral-7B
parameters:
normalize: true
dtype: bfloat16
```
## π» Usage
```python
!pip install -qU transformers accelerate
from transformers import AutoTokenizer
import transformers
import torch
model = "Yuma42/KangalKhan-RawEmerald-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-RawEmerald-7B)
| Metric |Value|
|---------------------------------|----:|
|Avg. |69.09|
|AI2 Reasoning Challenge (25-Shot)|66.89|
|HellaSwag (10-Shot) |85.75|
|MMLU (5-Shot) |63.23|
|TruthfulQA (0-shot) |57.58|
|Winogrande (5-shot) |78.22|
|GSM8k (5-shot) |62.85|
| {} | RichardErkhov/Yuma42_-_KangalKhan-RawEmerald-7B-gguf | null | [
"gguf",
"region:us"
] | null | 2024-05-01T19:48:07+00:00 |
null | null | {} | igorcesarnunes/TheSingulars | null | [
"region:us"
] | null | 2024-05-01T19:48:11+00:00 |
|
text-generation | transformers |
# Uploaded model
- **Developed by:** MR-Eder
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Phi-3-mini-4k-instruct
This Phi-3-based model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "mistral", "trl", "sft"], "base_model": "unsloth/Phi-3-mini-4k-instruct"} | MR-Eder/phi3-wiki-de-single-pairs-16bit | null | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/Phi-3-mini-4k-instruct",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:48:39+00:00 |
text2text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flant5_offensive_translation_de_en_wmt
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0011
- Precision: 0.6551
- Recall: 0.5516
- F1: 0.5989
- Total Predictions: 3532
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Total Predictions |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:-----------------:|
| 0.3687 | 1.0 | 3003 | 0.0012 | 0.5471 | 0.5366 | 0.5418 | 3532 |
| 0.0166 | 2.0 | 6006 | 0.0011 | 0.6454 | 0.4542 | 0.5332 | 3532 |
| 0.0145 | 3.0 | 9009 | 0.0010 | 0.6111 | 0.6065 | 0.6088 | 3532 |
| 0.013 | 4.0 | 12012 | 0.0011 | 0.6904 | 0.4767 | 0.5640 | 3532 |
| 0.0121 | 5.0 | 15015 | 0.0011 | 0.6551 | 0.5516 | 0.5989 | 3532 |
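The exact input format used during fine-tuning is not documented; the sketch below assumes the model takes a plain German sentence and returns an English translation (repository id from this page):
```python
# Minimal sketch: de->en generation with the fine-tuned FLAN-T5 (input format assumed).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model_id = "JenniferHJF/flant5_offensive_translation_de_en_wmt"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
inputs = tokenizer("Das ist ein Beispielsatz.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```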
### Framework versions
- Transformers 4.39.3
- Pytorch 2.0.0+cu118
- Datasets 2.18.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["precision", "recall", "f1"], "base_model": "google/flan-t5-base", "model-index": [{"name": "flant5_offensive_translation_de_en_wmt", "results": []}]} | JenniferHJF/flant5_offensive_translation_de_en_wmt | null | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google/flan-t5-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:49:32+00:00 |
text-generation | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# stage1
This model is a fine-tuned version of [jarod0411/zinc10M_gpt2_SMILES_bpe_combined_step1](https://huggingface.co/jarod0411/zinc10M_gpt2_SMILES_bpe_combined_step1) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2469
- Accuracy: 0.9158
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 1
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 128
- total_eval_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|
| 0.3374 | 1.0 | 16956 | 0.2982 | 0.9016 |
| 0.2955 | 2.0 | 33912 | 0.2682 | 0.9104 |
| 0.2795 | 3.0 | 50868 | 0.2593 | 0.9126 |
| 0.2713 | 4.0 | 67824 | 0.2549 | 0.9137 |
| 0.2661 | 5.0 | 84780 | 0.2522 | 0.9144 |
| 0.2626 | 6.0 | 101736 | 0.2501 | 0.9150 |
| 0.2602 | 7.0 | 118692 | 0.2488 | 0.9153 |
| 0.2585 | 8.0 | 135648 | 0.2478 | 0.9156 |
| 0.2574 | 9.0 | 152604 | 0.2471 | 0.9158 |
| 0.2569 | 10.0 | 169560 | 0.2469 | 0.9158 |
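The base model name indicates a GPT-2 model trained on SMILES strings with a BPE tokenizer, so generation amounts to sampling molecule continuations; a minimal sketch (repository id from this page; the seed fragment `CC(=O)` is an arbitrary example):
```python
# Minimal sketch: sample SMILES continuations from the fine-tuned GPT-2 model.
from transformers import pipeline
generator = pipeline("text-generation", model="jarod0411/stage1")
for out in generator("CC(=O)", max_new_tokens=64, do_sample=True, num_return_sequences=3):
    print(out["generated_text"])
```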
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
| {"license": "mit", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "jarod0411/zinc10M_gpt2_SMILES_bpe_combined_step1", "model-index": [{"name": "stage1", "results": []}]} | jarod0411/stage1 | null | [
"transformers",
"tensorboard",
"safetensors",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:jarod0411/zinc10M_gpt2_SMILES_bpe_combined_step1",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:51:24+00:00 |
null | null | {} | xmmm/2_reflow_r0_450k_swap100 | null | [
"region:us"
] | null | 2024-05-01T19:52:12+00:00 |
|
text2text-generation | transformers |
A model for English-to-Serbian translation. The base model is the Helsinki-NLP Serbo-Croatian (sh) model, fine-tuned on the OPUS-100 dataset modified with the Paraphrase Database (PPDB), size S. | {"license": "mit"} | perkan/shortS-opus-mt-tc-base-en-sr | null | [
"transformers",
"pytorch",
"marian",
"text2text-generation",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T19:54:16+00:00 |
text-generation | transformers | <!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
<img src="https://i.imgur.com/eDAlcgk.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
[](https://twitter.com/PrunaAI)
[](https://github.com/PrunaAI)
[](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[](https://discord.gg/CP4VSgck)
# Simply make AI models cheaper, smaller, faster, and greener!
- Give a thumbs up if you like this model!
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- Read the documentations to know more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
- Join Pruna AI community on Discord [here](https://discord.gg/CP4VSgck) to share feedback/suggestions or get help.
## Results

**Frequently Asked Questions**
- ***How does the compression work?*** The model is compressed with awq.
- ***How does the model quality change?*** The quality of the model output might vary compared to the base model.
- ***How is the model efficiency evaluated?*** These results were obtained on NVIDIA A100-PCIE-40GB with the configuration described in `model/smash_config.json` and are measured after a hardware warmup. The smashed model is compared directly to the original base model. Efficiency results may vary in other settings (e.g. other hardware, image size, batch size, ...). We recommend running the models directly under your use-case conditions to see whether the smashed model benefits you.
- ***What is the model format?*** We use safetensors.
- ***What calibration data has been used?*** If needed by the compression method, we used WikiText as the calibration data.
- ***What is the naming convention for Pruna Huggingface models?*** We take the original model name and append "turbo", "tiny", or "green" if the smashed model has a measured inference speed, inference memory, or inference energy consumption which is less than 90% of the original base model.
- ***How to compress my own models?*** You can request premium access to more compression methods and tech support for your specific use-cases [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
- ***What are "first" metrics?*** Results mentioning "first" are obtained after the first run of the model. The first run might take more memory or be slower than the subsequent runs due to CUDA overheads.
- ***What are "Sync" and "Async" metrics?*** "Sync" metrics are obtained by syncing all GPU processes and stopping the measurement once all of them have finished. "Async" metrics are obtained without syncing all GPU processes and stop when the model output can be used by the CPU. We provide both since either can be relevant depending on the use case. We recommend testing the efficiency gains directly in your use case.
## Setup
You can run the smashed model with these steps:
0. Check that the requirements from the original repo gradientai/Llama-3-8B-Instruct-Gradient-1048k are installed. In particular, check the python, cuda, and transformers versions.
1. Make sure that you have installed quantization related packages.
```bash
pip install autoawq
```
2. Load & run the model.
```python
from transformers import AutoTokenizer
from awq import AutoAWQForCausalLM
model = AutoAWQForCausalLM.from_quantized("PrunaAI/gradientai-Llama-3-8B-Instruct-1048k-AWQ-4bit-smashed", trust_remote_code=True, device_map='auto')
tokenizer = AutoTokenizer.from_pretrained("gradientai/Llama-3-8B-Instruct-Gradient-1048k")
input_ids = tokenizer("What is the color of prunes?", return_tensors='pt').to(model.device)["input_ids"]
outputs = model.generate(input_ids, max_new_tokens=216)
tokenizer.decode(outputs[0])
```
## Configurations
The configuration info are in `smash_config.json`.
## Credits & License
The license of the smashed model follows the license of the original model. Please check the license of the original model, gradientai/Llama-3-8B-Instruct-Gradient-1048k, which provided the base model, before using this smashed model. The license of the `pruna-engine` is [here](https://pypi.org/project/pruna-engine/) on Pypi.
## Want to compress other models?
- Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
- Request access to easily compress your own AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai). | {"tags": ["pruna-ai"], "metrics": ["memory_disk", "memory_inference", "inference_latency", "inference_throughput", "inference_CO2_emissions", "inference_energy_consumption"], "thumbnail": "https://assets-global.website-files.com/646b351987a8d8ce158d1940/64ec9e96b4334c0e1ac41504_Logo%20with%20white%20text.svg", "base_model": "gradientai/Llama-3-8B-Instruct-Gradient-1048k"} | PrunaAI/gradientai-Llama-3-8B-Instruct-Gradient-1048k-AWQ-4bit-smashed | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"pruna-ai",
"conversational",
"base_model:gradientai/Llama-3-8B-Instruct-Gradient-1048k",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"4-bit",
"region:us"
] | null | 2024-05-01T19:58:24+00:00 |
text-generation | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a π€ transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"library_name": "transformers", "tags": []} | Vexemous/distilgpt2-finetuned-general-stories-pos | null | [
"transformers",
"safetensors",
"gpt2",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null | 2024-05-01T19:59:48+00:00 |
text-classification | transformers | {} | chiefshayan/distilbert-base-uncased-finetuned-emotion | null | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T20:00:39+00:00 |
text-generation | transformers |
(duplicate of the auto-generated 🤗 transformers model card template reproduced above) | {"library_name": "transformers", "tags": []} | quickstep3621/oghz8fg | null | [
"transformers",
"safetensors",
"stablelm",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T20:01:10+00:00 |
text-generation | transformers |
(duplicate of the auto-generated 🤗 transformers model card template reproduced above) | {"library_name": "transformers", "tags": []} | quickstep3621/igqe128 | null | [
"transformers",
"safetensors",
"stablelm",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null | 2024-05-01T20:01:16+00:00 |