The aim of this model is to retain the reasoning capabilities of <a href="https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-8B">DeepSeek-R1-Distill-Llama-8B</a>, while aligning more closely with the original <a href="https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct">Llama 3.1 model</a> on which it is based.

As this model derives from Llama 3.1, the <a href="https://www.llama.com/llama3_1/license/">Llama 3.1 Community License Agreement</a> applies.
## 8B Safetensors BF16 format:
Use with [transformers](https://huggingface.co/docs/transformers/en/index) as you would Llama 3.1.

Use model id `BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b`.

[Or download the files from here](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/tree/main)
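
As a minimal sketch (not an official snippet from this card), loading the 8B safetensors weights with transformers might look like the following; the dtype, device placement and sampling settings are illustrative only.

```python
# Minimal sketch: load the 8B safetensors weights with transformers.
# The model id comes from this card; dtype/device/sampling settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are published in BF16
    device_map="auto",           # requires accelerate; adjust to your hardware
)

messages = [{"role": "user", "content": "Explain why the sky is blue, step by step."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512, temperature=0.6, do_sample=True)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```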
## 8B GGUF Quantised versions:

Use these with [Llama.cpp](https://github.com/ggerganov/llama.cpp), [LM Studio](https://lmstudio.ai/) or [Kobold.cpp](https://github.com/LostRuins/koboldcpp).
Thanks to [mradermacher](https://huggingface.co/mradermacher) for converting these from the [safetensors](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/tree/main) format.

| Filename | Type | Size | Quality |
| -------- | ---------- | --------- | ----------- |
| [LlamaAligned-DeepSeekR1-Distill-8b-Q4_K_M.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-8b.Q4_K_M.gguf?download=true) | Q4_K_M | 4.92GB | OK quality, default. |
| [LlamaAligned-DeepSeekR1-Distill-8b-Q8_0.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-8b.Q8_0.gguf?download=true) | Q8_0 | 8.54GB | Best quality quantised version. |
| [LlamaAligned-DeepSeekR1-Distill-8b-Q6_K.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-8b.Q6_K.gguf?download=true) | Q6_K | 6.6GB | High quality. |
| [LlamaAligned-DeepSeekR1-Distill-8b-Q5_K_M.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-8b.Q5_K_M.gguf?download=true) | Q5_K_M | 5.73GB | Good quality. |
| [LlamaAligned-DeepSeekR1-Distill-8b-Q3_K_S.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-8b.Q3_K_S.gguf?download=true) | Q3_K_S | 3.66GB | Lower quality. |
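
If you would rather script against the GGUF files than use the GUI tools above, one option is [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), the Python bindings for Llama.cpp. The sketch below assumes the quantised files live on the `quants` revision of the repo, as the download links above suggest.

```python
# Hedged sketch: fetch the Q4_K_M quant and run it via llama-cpp-python.
# The "quants" revision is inferred from the download links above; the
# filename matches the Q4_K_M table entry.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="BlueBeck/LlamaAligned-DeepSeekR1-Distill-8b",
    filename="LlamaAligned-DeepSeekR1-Distill-8b.Q4_K_M.gguf",
    revision="quants",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is the capital of France? Think it through."}],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```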
## 70B Safetensors BF16 format:

Use with [transformers](https://huggingface.co/docs/transformers/en/index) as you would Llama 3.3.

[Or download the files from here](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b/tree/main)
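
The 8B transformers sketch above carries over with the 70B repo id. As an illustration only (not a recommendation from this card), one way to fit the 70B model on more modest hardware is on-the-fly 4-bit quantisation with bitsandbytes:

```python
# Illustrative sketch: load the 70B weights with on-the-fly 4-bit quantisation.
# Requires the bitsandbytes and accelerate packages; settings are examples,
# not recommendations from the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b"  # repo id taken from the link above

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # shard across available GPUs / offload as needed
)
```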
## 70B GGUF Quantised versions:

Use these with [Llama.cpp](https://github.com/ggerganov/llama.cpp), [LM Studio](https://lmstudio.ai/) or [Kobold.cpp](https://github.com/LostRuins/koboldcpp).
Thanks to [mradermacher](https://huggingface.co/mradermacher) for converting these from the [safetensors](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b/tree/main) format.

| Filename | Type | Size | Quality |
| -------- | ---------- | --------- | ----------- |
| [LlamaAligned-DeepSeekR1-Distill-70b-Q4_K_M.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-70b.Q4_K_M.gguf?download=true) | Q4_K_M | 42.5GB | OK quality, default. |
| LlamaAligned-DeepSeekR1-Distill-70b-Q8_0.gguf [part1](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-70b.Q8_0.gguf.part1of2?download=true) [part2](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-70b.Q8_0.gguf.part2of2?download=true) | Q8_0 | 75.0GB | Best quality quantised version (split into two parts). |
| [LlamaAligned-DeepSeekR1-Distill-70b-Q3_K_S.gguf](https://huggingface.co/BlueBeck/LlamaAligned-DeepSeekR1-Distill-70b/resolve/quants/LlamaAligned-DeepSeekR1-Distill-70b.Q3_K_S.gguf?download=true) | Q3_K_S | 30.9GB | Lower quality. |