---
base_model:
- jondurbin/bagel-8b-v1.0
- aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K
- Deev124/hermes-llama3-roleplay-4000-v1
- vicgalle/Roleplay-Llama-3-8B
- DevsDoCode/LLama-3-8b-Uncensored
- vicgalle/Unsafe-Llama-3-8B
- Gryphe/Pantheon-RP-1.0-8b-Llama-3
- Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
- NeverSleep/Llama-3-Lumimaid-8B-v0.1
- Undi95/Llama-3-LewdPlay-8B-evo
- vicgalle/Humanish-Roleplay-Llama-3.1-8B
- winglian/llama-3-8b-1m-PoSE
- mergekit-community/because_im_bored_nsfw1
- mlabonne/Hermes-3-Llama-3.1-8B-lorablated
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method using [winglian/llama-3-8b-1m-PoSE](https://huggingface.co/winglian/llama-3-8b-1m-PoSE) as a base.
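Task arithmetic treats each fine-tuned model as a "task vector" (its parameter delta from the base) and adds a weighted sum of those vectors back onto the base: merged = base + Σᵢ wᵢ · (modelᵢ − base). Below is a minimal pure-Python sketch of that idea, operating on flat lists of floats instead of tensors; the real mergekit implementation works tensor-by-tensor and additionally honors options like `density` and `int8_mask` from the configuration, which this sketch omits.

```python
def task_arithmetic_merge(base, finetuned, weights):
    """Task-arithmetic merge: merged = base + sum_i w_i * (model_i - base).

    Each argument model is a dict mapping parameter names to flat lists
    of floats. `weights` holds one scalar weight per fine-tuned model.
    """
    merged = {}
    for name, base_param in base.items():
        # Accumulate the weighted sum of task vectors (deltas from the base).
        delta = [0.0] * len(base_param)
        for model, w in zip(finetuned, weights):
            for j, (p, b) in enumerate(zip(model[name], base_param)):
                delta[j] += w * (p - b)
        # Add the combined task vector back onto the base parameters.
        merged[name] = [b + d for b, d in zip(base_param, delta)]
    return merged
```

For example, merging two models that each differ from the base in one coordinate, with weight 0.5 each, moves the base halfway toward each model's delta independently.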
### Models Merged
The following models were included in the merge:
* [jondurbin/bagel-8b-v1.0](https://huggingface.co/jondurbin/bagel-8b-v1.0)
* [aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K](https://huggingface.co/aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K)
* [Deev124/hermes-llama3-roleplay-4000-v1](https://huggingface.co/Deev124/hermes-llama3-roleplay-4000-v1)
* [vicgalle/Roleplay-Llama-3-8B](https://huggingface.co/vicgalle/Roleplay-Llama-3-8B)
* [DevsDoCode/LLama-3-8b-Uncensored](https://huggingface.co/DevsDoCode/LLama-3-8b-Uncensored)
* [vicgalle/Unsafe-Llama-3-8B](https://huggingface.co/vicgalle/Unsafe-Llama-3-8B)
* [Gryphe/Pantheon-RP-1.0-8b-Llama-3](https://huggingface.co/Gryphe/Pantheon-RP-1.0-8b-Llama-3)
* [Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2](https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2)
* [NeverSleep/Llama-3-Lumimaid-8B-v0.1](https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1)
* [Undi95/Llama-3-LewdPlay-8B-evo](https://huggingface.co/Undi95/Llama-3-LewdPlay-8B-evo)
* [vicgalle/Humanish-Roleplay-Llama-3.1-8B](https://huggingface.co/vicgalle/Humanish-Roleplay-Llama-3.1-8B)
* [mergekit-community/because_im_bored_nsfw1](https://huggingface.co/mergekit-community/because_im_bored_nsfw1)
* [mlabonne/Hermes-3-Llama-3.1-8B-lorablated](https://huggingface.co/mlabonne/Hermes-3-Llama-3.1-8B-lorablated)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K
    parameters:
      density: 0.65
      weight: 0.15
  - model: NeverSleep/Llama-3-Lumimaid-8B-v0.1
    parameters:
      density: 0.70
      weight: 0.20
  - model: mergekit-community/because_im_bored_nsfw1
    parameters:
      density: 0.60
      weight: 0.10
  - model: jondurbin/bagel-8b-v1.0
    parameters:
      density: 0.60
      weight: 0.10
  - model: Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
    parameters:
      density: 0.65
      weight: 0.15
  - model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.75
      weight: 0.25
  - model: Deev124/hermes-llama3-roleplay-4000-v1
    parameters:
      density: 0.60
      weight: 0.10
  - model: DevsDoCode/LLama-3-8b-Uncensored
    parameters:
      density: 0.60
      weight: 0.10
  - model: mlabonne/Hermes-3-Llama-3.1-8B-lorablated
    parameters:
      density: 0.65
      weight: 0.15
  - model: vicgalle/Roleplay-Llama-3-8B
    parameters:
      density: 0.70
      weight: 0.20
  - model: Gryphe/Pantheon-RP-1.0-8b-Llama-3
    parameters:
      density: 0.70
      weight: 0.20
  - model: vicgalle/Humanish-Roleplay-Llama-3.1-8B
    parameters:
      density: 0.70
      weight: 0.20
  - model: vicgalle/Unsafe-Llama-3-8B
    parameters:
      density: 0.70
      weight: 0.20
  - model: winglian/llama-3-8b-1m-PoSE
    parameters:
      density: 0.70
      weight: 0.20
merge_method: task_arithmetic
base_model: winglian/llama-3-8b-1m-PoSE
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
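With `normalize: true`, mergekit rescales the contributing weights so they sum to 1 before combining the task vectors; the raw weights in the configuration above sum to 2.30, so each effective weight is roughly its raw value divided by 2.30. The sketch below shows that rescaling under the simple sum-normalization assumption (mergekit performs this per tensor, which this sketch does not model).

```python
# Raw per-model weights copied from the YAML configuration above.
raw_weights = {
    "aifeifei798/llama3-8B-DarkIdol-2.3-Uncensored-32K": 0.15,
    "NeverSleep/Llama-3-Lumimaid-8B-v0.1": 0.20,
    "mergekit-community/because_im_bored_nsfw1": 0.10,
    "jondurbin/bagel-8b-v1.0": 0.10,
    "Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2": 0.15,
    "Undi95/Llama-3-LewdPlay-8B-evo": 0.25,
    "Deev124/hermes-llama3-roleplay-4000-v1": 0.10,
    "DevsDoCode/LLama-3-8b-Uncensored": 0.10,
    "mlabonne/Hermes-3-Llama-3.1-8B-lorablated": 0.15,
    "vicgalle/Roleplay-Llama-3-8B": 0.20,
    "Gryphe/Pantheon-RP-1.0-8b-Llama-3": 0.20,
    "vicgalle/Humanish-Roleplay-Llama-3.1-8B": 0.20,
    "vicgalle/Unsafe-Llama-3-8B": 0.20,
    "winglian/llama-3-8b-1m-PoSE": 0.20,
}

total = sum(raw_weights.values())  # sums to 2.30 (modulo float rounding)
# normalize: true divides each weight by the total so they sum to 1.
normalized = {model: w / total for model, w in raw_weights.items()}
```

Under this assumption, Undi95/Llama-3-LewdPlay-8B-evo (raw weight 0.25) remains the single largest contributor after normalization, at just under 11% of the combined task vector.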