mwitiderrick committed on
Commit 05f798b · 1 Parent(s): cd105c6

Update README.md

Files changed (1)
  1. README.md +11 -47
README.md CHANGED
@@ -1,17 +1,17 @@
  ---
- base_model: TinyLlama/TinyLlama-1.1B-Chat-v0.4
+ base_model: openlm-research/open_llama_3b
  inference: false
  model_type: llama
  prompt_template: |
- <|im_start|>user\n
- {prompt}<|im_end|>\n
- <|im_start|>assistant\n
+ Q:
+ {prompt}
+ \nA
  quantized_by: mwitiderrick
  tags:
  - deepsparse
  ---
- ## TinyLlama 1.1B Chat 0.4 - DeepSparse
- This repo contains model files for [TinyLlama 1.1B Chat](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.4) optimized for [DeepSparse](https://github.com/neuralmagic/deepsparse), a CPU inference runtime for sparse models.
+ ## OpenLLaMA 3B - DeepSparse
+ This repo contains model files for [OpenLLaMA](https://huggingface.co/openlm-research/open_llama_3b) optimized for [DeepSparse](https://github.com/neuralmagic/deepsparse), a CPU inference runtime for sparse models.

  This model was quantized and pruned with [SparseGPT](https://arxiv.org/abs/2301.00774), using [SparseML](https://github.com/neuralmagic/sparseml).

@@ -24,50 +24,14 @@ Run in a [Python pipeline](https://github.com/neuralmagic/deepsparse/blob/main/d
  ```python
  from deepsparse import TextGeneration

- prompt = "How to make banana bread?"
- formatted_prompt = f"<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n"
+ prompt = "What is the largest animal?"
+ formatted_prompt = f"Q: {prompt}\nA:"

- model = TextGeneration(model="hf:neuralmagic/TinyLlama-1.1B-Chat-v0.4-pruned50-quant")
- print(model(formatted_prompt, max_new_tokens=500).generations[0].text)
+ model = TextGeneration(model="deployment")
+ print(model(formatted_prompt, max_new_tokens=200).generations[0].text)

  """
- Banana bread is a delicious and easy-to-make recipe that is sure to please. Here is a recipe for making banana bread:
-
- Ingredients:
-
- For the Banana Bread:
-
- - 1 cup of sugar
- - 1 cup of flour
- - 1/2 cup of mashed bananas
- - 1/4 cup of milk
- - 1/2 cup of melted butter
- - 1/4 cup of baking powder
- - 1/4 cup of baking soda
- - 1/4 cup of eggs
- - 1/4 cup of milk
- - 1/4 cup of sugar
-
-
- Instructions:
-
- 1. Preheat the oven to 325°F (160°C).
- 2. In a large bowl, combine the sugar and flour.
- 3. In a separate bow, combine the mashed bananas, milk, butter, baking powder, baking soda, milk, sugar.
- 4. Add the bananas and milk into the flour-sugar mixture.
- 5. Pour the milk into the bowl of the flour-sugar mixture.
- 6. Pour the baking powder into the bowl of the flour-sugar mixture.
- 7. Pour the mashed bananas into the bowl of the flour-sugar mixture.
- 8. Add the eggs into the bowl of the flour-sugar mixture.
- 9. Stir the mixture until it becomes a dough.
- 10. Grease a 9-inch (23 cm) square pan.
- 11. Pour the mixture into the pan.
- 12. Bake the banana bread in the oven for 40 minutes.
- 13. Remove the banana bread from the oven and cool it.
- 14. Cut the bread into 16 pieces.
- 15. Make the glaze:
- 16. Sprinkle the sugar over the bread.
- 17. Bake the bread in the oven for 30 minutes.
+ the in the in in the in in in in in the in the the in in the in in the the the the the in the the in the the in the the in the in the the in the the in the the in in the the the the in in in the the the the in the in in the the the the in the the in the the in the the the the the the the in the the the in the the the the in the the the the in in the the the the the the the in the the the the in the the the in the the in the the the in the the the the the in the the the the the the the the the in the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the
  """
  ```
  ## Prompt template
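
For completeness, here is a minimal sketch of wrapping the updated example in a reusable helper. It assumes the sparsified model files live in a local `deployment` directory, exactly as in the snippet above; the `ask` function name is illustrative and not part of the repository.

```python
from deepsparse import TextGeneration

# Load the sparsified model from a local "deployment" directory,
# as in the README example shown in the diff above.
model = TextGeneration(model="deployment")

def ask(question: str, max_new_tokens: int = 200) -> str:
    # Apply the Q/A prompt format used in the example.
    formatted_prompt = f"Q: {question}\nA:"
    # Run CPU inference with DeepSparse and return only the generated text.
    return model(formatted_prompt, max_new_tokens=max_new_tokens).generations[0].text

print(ask("What is the largest animal?"))
```

The previous card instead wrapped prompts in `<|im_start|>`/`<|im_end|>` chat markers for TinyLlama Chat, which is why the prompt formatting changes along with the base model.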