FallenMerick committed
Commit eefa3ff · verified · 1 parent: b4974b2

Update README.md

Files changed (1)
  1. README.md +50 -47
README.md CHANGED
---
license: other
license_name: microsoft-research-license
library_name: transformers
tags:
- mergekit
- merge
- storywriting
- text adventure
- not-for-all-audiences
---
# Space-Whale-Lite

Restack of the legendary Psyonic Cetacean:
[jebcarter/psyonic-cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
* [TeeZee/Orca-2-13b_flat](https://huggingface.co/TeeZee/Orca-2-13b_flat)
* [KoboldAI/LLaMA2-13B-Psyfighter2](https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Orca2-Flat (TeeZee/Orca-2-13b_flat -> float16)
        layer_range: [0, 8]
  - sources:
      - model: KoboldAI/LLaMA2-13B-Psyfighter2
        layer_range: [8, 24]
  - sources:
      - model: Orca2-Flat (TeeZee/Orca-2-13b_flat -> float16)
        layer_range: [24, 32]
  - sources:
      - model: KoboldAI/LLaMA2-13B-Psyfighter2
        layer_range: [32, 40]
merge_method: passthrough
dtype: float16
```
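The passthrough method copies the listed layer ranges verbatim and stacks them back to back, so the merged model's depth is just the sum of the slice spans. A minimal sketch of that bookkeeping (the `SLICES` table and `merged_layer_count` helper are illustrative only, not mergekit APIs):

```python
# Illustrative sketch: tally the layers each slice contributes in a
# passthrough merge, mirroring the YAML config above. Not a mergekit API.

SLICES = [
    ("TeeZee/Orca-2-13b_flat",          (0, 8)),
    ("KoboldAI/LLaMA2-13B-Psyfighter2", (8, 24)),
    ("TeeZee/Orca-2-13b_flat",          (24, 32)),
    ("KoboldAI/LLaMA2-13B-Psyfighter2", (32, 40)),
]

def merged_layer_count(slices):
    """Passthrough concatenates slices in order, so the merged depth is
    the sum of each slice's (end - start) layer span."""
    return sum(end - start for _, (start, end) in slices)

if __name__ == "__main__":
    for model, (start, end) in SLICES:
        print(f"{model}: layers {start}-{end} ({end - start} layers)")
    print(f"merged depth: {merged_layer_count(SLICES)} layers")
```

Note that the same source layers may appear in interleaved slices from both parents; passthrough does no averaging, it only stacks.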