Commit 7679d2f by malhajar (parent: c3ecd0f): Create README.md

Files changed (1): README.md (+72 -0)

---
language:
- en
datasets:
- yahma/alpaca-cleaned
license: cc-by-nc-4.0
---

# Platypus2-70B-instruct-4bit-gptq

Platypus2-70B-instruct-4bit-gptq is a quantized version of [`garage-bAInd/Platypus2-70B-instruct`](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct), produced with GPTQ quantization.
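
A minimal loading sketch (not part of the original card): the repo id below is inferred from this card's title, and GPTQ checkpoints are assumed to load through `transformers` with `optimum`, `auto-gptq`, and `accelerate` installed.

```python
# Hedged sketch: the repo id is inferred from this card's title, not stated in it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "malhajar/Platypus2-70B-instruct-4bit-gptq"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# transformers dispatches GPTQ weights automatically when optimum and auto-gptq are installed
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```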

### Benchmark Metrics

Benchmark results will be reported soon.

### Model Details

* **Quantized by:** Mohamad ([email protected])
* **Model type:** **Platypus2-70B-instruct-4bit-gptq** is a 4-bit GPTQ-quantized version of Platypus2-70B-instruct
* **Language(s):** English
* **License:** Non-Commercial Creative Commons license ([CC BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/))

### Prompt Template
```
### Instruction:

<prompt> (without the <>)

### Response:
```
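
For example, filling the template and generating with the model loaded above (the `build_prompt` helper is illustrative, not part of the card):

```python
# Illustrative helper that fills the prompt template above.
def build_prompt(instruction: str) -> str:
    return f"### Instruction:\n\n{instruction}\n\n### Response:\n"

# Uses the tokenizer and model loaded in the earlier sketch.
inputs = tokenizer(build_prompt("Summarize GPTQ in one sentence."), return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```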

### Training Dataset

`Platypus2-70B-instruct-4bit-gptq` was quantized with GPTQ using the Alpaca dataset [`yahma/alpaca-cleaned`](https://huggingface.co/datasets/yahma/alpaca-cleaned) as calibration data.

### Training Procedure

`garage-bAInd/Platypus2-70B-instruct` was quantized with GPTQ on 2x L40 48GB GPUs.
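
A rough sketch of how such a GPTQ run can be reproduced with the `auto-gptq` library. The quantization parameters (`bits=4` aside, e.g. `group_size=128`, `desc_act=False`) and the number of calibration samples are assumptions; the card does not state them.

```python
# Hedged reproduction sketch using auto-gptq; parameter values are assumptions.
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
from datasets import load_dataset
from transformers import AutoTokenizer

base_id = "garage-bAInd/Platypus2-70B-instruct"
tokenizer = AutoTokenizer.from_pretrained(base_id)

# 4-bit config; group_size=128 is a common default, not confirmed by this card
quantize_config = BaseQuantizeConfig(bits=4, group_size=128, desc_act=False)
model = AutoGPTQForCausalLM.from_pretrained(base_id, quantize_config)

# Calibration samples drawn from the dataset named in this card; 128 is an assumed count.
data = load_dataset("yahma/alpaca-cleaned", split="train").select(range(128))
examples = [tokenizer(row["instruction"] + "\n\n" + row["output"]) for row in data]

model.quantize(examples)  # runs the GPTQ algorithm layer by layer
model.save_quantized("Platypus2-70B-instruct-4bit-gptq")
```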

### Citations
```bibtex
@article{platypus2023,
  title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
  author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
  booktitle={arXiv preprint arxiv:2308.07317},
  year={2023}
}
```
```bibtex
@misc{touvron2023llama,
  title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
  author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and others},
  year={2023},
  eprint={2307.09288},
  archivePrefix={arXiv}
}
```
```bibtex
@misc{frantar2023gptq,
  title={GPTQ: Accurate Post-Training Quantization for Generative Pre-trained Transformers},
  author={Elias Frantar and Saleh Ashkboos and Torsten Hoefler and Dan Alistarh},
  year={2023},
  eprint={2210.17323},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
```