Update README.md
README.md CHANGED

````diff
@@ -18,8 +18,8 @@ This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](
 ### Models Merged
 
 The following models were included in the merge:
-* 
-* 
+* NeuralBeagle14-7B
+* zephyr-7b-alpha
 
 ### Configuration
 
@@ -27,17 +27,17 @@ The following YAML configuration was used to produce this model:
 
 ```yaml
 models:
-  - model:
-  - model:
+  - model: NeuralTrix-7B-dpo
+  - model: zephyr-7b-alpha
     parameters:
       density: 0.83
       weight: 0.4
-  - model:
+  - model: NeuralBeagle14-7B
     parameters:
       density: 0.83
       weight: 0.6
 merge_method: dare_ties
-base_model:
+base_model: NeuralTrix-7B-dpo
 parameters:
   int8_mask: true
 dtype: bfloat16
````
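For context on the `density` and `weight` parameters in the config above: DARE randomly drops a fraction `1 - density` of each model's task vector (its delta from the base model) and rescales the survivors by `1 / density`, and the sparsified deltas are then combined with the given weights on top of the base model. A minimal NumPy sketch of that idea — not mergekit's actual implementation, and with TIES sign election omitted for brevity; the toy tensors are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_sparsify(delta, density, rng):
    """Randomly keep a `density` fraction of a task vector, rescale by 1/density."""
    mask = rng.random(delta.shape) < density
    return delta * mask / density

# Toy "task vectors" (fine-tuned weights minus base weights) for the two
# weighted models from the config; shapes are illustrative, not 7B-sized.
base = rng.normal(size=8)
deltas = {
    "zephyr-7b-alpha": rng.normal(size=8),
    "NeuralBeagle14-7B": rng.normal(size=8),
}
weights = {"zephyr-7b-alpha": 0.4, "NeuralBeagle14-7B": 0.6}
density = 0.83

# DARE step: drop-and-rescale each task vector, then take the weighted
# combination on top of the base model.
sparsified = {k: dare_sparsify(d, density, rng) for k, d in deltas.items()}
merged = base + sum(weights[k] * sparsified[k] for k in deltas)
print(merged.shape)  # prints (8,)
```

In practice the YAML config is consumed by mergekit's CLI (e.g. `mergekit-yaml config.yml ./output-dir`), which applies this per tensor across all model weights.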