Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

# test-mix-01 - AWQ

- Model creator: https://huggingface.co/appvoid/
- Original model: https://huggingface.co/appvoid/test-mix-01/
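
This repository provides AWQ-quantized weights for test-mix-01. As a minimal loading sketch: the repo id below is an assumption based on this page, and loading AWQ checkpoints through `transformers` generally requires the `autoawq` package to be installed.

```python
# pip install transformers autoawq
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id for this quantization; adjust to the actual upload name.
repo_id = "RichardErkhov/appvoid_-_test-mix-01-awq"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```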

Original model description:
---
base_model:
- appvoid/palmer-003
- appvoid/palmer-004-2406
- Josephgflowers/TinyLlama-Cinder-Agent-v1
library_name: transformers
tags:
- mergekit
- merge

---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [appvoid/palmer-004-2406](https://huggingface.co/appvoid/palmer-004-2406) as the base model.
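
For intuition, here is a toy NumPy sketch of the TIES procedure: trim each task vector to its largest-magnitude entries, elect a per-coordinate sign, then average only the sign-agreeing values. This is an illustration under simplified assumptions, not mergekit's implementation; `density` and `weights` correspond loosely to the `density` and `weight` fields in the configuration below.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5, weights=None):
    """Toy TIES merge for a single parameter tensor (illustrative only)."""
    weights = weights if weights is not None else [1.0] * len(finetuned)
    # 1. Task vectors: weighted differences from the base weights.
    deltas = [w * (ft - base) for w, ft in zip(weights, finetuned)]
    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d), axis=None)[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # 3. Elect a sign per coordinate from the summed trimmed vectors.
    elected = np.sign(sum(trimmed))
    # 4. Disjoint mean: average only the entries that agree with the elected sign.
    merged = np.zeros_like(base, dtype=float)
    count = np.zeros_like(base, dtype=float)
    for t in trimmed:
        mask = (np.sign(t) == elected) & (t != 0)
        merged += np.where(mask, t, 0.0)
        count += mask
    merged = np.where(count > 0, merged / np.maximum(count, 1.0), 0.0)
    # 5. Add the merged task vector back onto the base weights.
    return base + merged
```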

### Models Merged

The following models were included in the merge:
* [appvoid/palmer-003](https://huggingface.co/appvoid/palmer-003)
* [Josephgflowers/TinyLlama-Cinder-Agent-v1](https://huggingface.co/Josephgflowers/TinyLlama-Cinder-Agent-v1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: appvoid/palmer-003
    parameters:
      density: 0.5
      weight: 0.5
  - model: Josephgflowers/TinyLlama-Cinder-Agent-v1
    parameters:
      density: 0.5
      weight: 0.5

merge_method: ties
base_model: appvoid/palmer-004-2406
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
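
To reproduce a merge like this one, the configuration above can be saved to a file and passed to mergekit's command-line entry point, e.g. `mergekit-yaml config.yml ./merged-model` (the file and output paths here are placeholders; available flags depend on the installed mergekit version).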