Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


gemma-merged-one-layer-only - AWQ
- Model creator: https://huggingface.co/choprahetarth/
- Original model: https://huggingface.co/choprahetarth/gemma-merged-one-layer-only/


Original model description:
---
base_model:
- google/gemma-2-2b
- google/gemma-2-2b-it
library_name: transformers
tags:
- mergekit
- merge

---
# Untitled Model (1)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged with the [TIES](https://arxiv.org/abs/2306.01708) merge method, using [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) as the base.
### Models Merged

The following models were included in the merge:
* [google/gemma-2-2b-it](https://huggingface.co/google/gemma-2-2b-it)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: google/gemma-2-2b
dtype: bfloat16
merge_method: ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 26]
    model: google/gemma-2-2b
  - layer_range: [0, 26]
    model: google/gemma-2-2b-it
    parameters:
      density:
      - filter: self_attn.o_proj.9
        value: 1.0
      - value: 0.001
      weight:
      - value: 1.0
tokenizer_source: union
```
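For intuition, the TIES arithmetic that this configuration drives can be sketched on a single tensor. This is a toy numpy illustration, not mergekit's actual implementation; the `density` and `weight` arguments play the same roles as the identically named parameters in the YAML:

```python
import numpy as np

def ties_merge(base, tuned, density=0.2, weights=None):
    """Toy single-tensor TIES merge: trim each task vector to its
    top-`density` fraction by magnitude, elect a per-element sign,
    then average the surviving deltas that agree with that sign."""
    weights = weights or [1.0] * len(tuned)
    trimmed = []
    for t, w in zip(tuned, weights):
        d = w * (t - base)                        # task vector
        k = max(1, int(round(density * d.size)))  # entries that survive trimming
        cutoff = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= cutoff, d, 0.0))
    stacked = np.stack(trimmed)
    sign = np.sign(stacked.sum(axis=0))           # elected dominant sign per element
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    count = np.maximum(agree.sum(axis=0), 1)
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0) / count
    return base + merged_delta

base = np.zeros(4)
it_model = np.array([1.0, -0.1, 0.5, 0.0])
print(ties_merge(base, [it_model], density=0.5))  # only the two largest deltas survive
```

Read against the YAML above: `density: 0.001` trims nearly all of the instruct model's deltas away, while the `filter: self_attn.o_proj.9` entry keeps that one projection at full density (`value: 1.0`), which matches the "one layer only" name of the merge.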