mayacinka committed
Commit 6ab2ca4 · verified · 1 Parent(s): 2856d0b

Update README.md

Files changed (1): README.md (+6 -6)
README.md CHANGED
@@ -18,8 +18,8 @@ This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](
 ### Models Merged
 
 The following models were included in the merge:
-* /Users/akim/ma/models/NeuralBeagle14-7B
-* /Users/akim/ma/models/zephyr-7b-alpha
+* NeuralBeagle14-7B
+* zephyr-7b-alpha
 
 ### Configuration
 
@@ -27,17 +27,17 @@ The following YAML configuration was used to produce this model:
 
 ```yaml
 models:
-  - model: /Users/akim/ma/models/NeuralTrix-7B-dpo # base model doesn't need any parameters
-  - model: /Users/akim/ma/models/zephyr-7b-alpha
+  - model: NeuralTrix-7B-dpo
+  - model: zephyr-7b-alpha
     parameters:
       density: 0.83
       weight: 0.4
-  - model: /Users/akim/ma/models/NeuralBeagle14-7B
+  - model: NeuralBeagle14-7B
     parameters:
       density: 0.83
       weight: 0.6
 merge_method: dare_ties
-base_model: /Users/akim/ma/models/NeuralTrix-7B-dpo
+base_model: NeuralTrix-7B-dpo
 parameters:
   int8_mask: true
 dtype: bfloat16
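
For context: this is a [mergekit](https://github.com/arcee-ai/mergekit) configuration. In the `dare_ties` method, `density` is the fraction of each model's parameter delta from the base that survives DARE's random pruning, and `weight` scales each surviving delta before it is added back onto the base model; the base model itself needs no parameters. Below is a minimal sketch of re-running the merge from Python, assuming mergekit's documented `MergeConfiguration` and `run_merge` entry points; the file names and output path are placeholders, and the equivalent CLI call would be `mergekit-yaml config.yml ./merged`.

```python
# Sketch: apply the dare_ties config above with mergekit.
# Assumes the YAML from this diff has been saved as config.yml (placeholder name).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    "./merged",               # output directory for the merged weights
    options=MergeOptions(
        cuda=False,           # set True to run the merge on a GPU
        copy_tokenizer=True,  # bundle the base model's tokenizer with the output
    ),
)
```

Note that with the local `/Users/akim/...` paths dropped, the bare names in the config would need to resolve as local directories or full Hugging Face repo IDs (e.g. `mlabonne/NeuralBeagle14-7B`) for the merge to be reproducible.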