avemio-digital committed
Commit c02ddc5 (verified)
1 parent: 0b053bb

Update README.md
Files changed (1): README.md (+10 -10)
README.md CHANGED

@@ -1,12 +1,12 @@
 ---
 license: apache-2.0
 datasets:
-- avemio/GRAG-CPT-HESSIAN-AI
-- avemio/GRAG-SFT-ShareGPT-HESSIAN-AI
+- avemio/German-RAG-CPT-HESSIAN-AI
+- avemio/German-RAG-SFT-ShareGPT-HESSIAN-AI
 language:
 - en
 - de
-base_model: avemio/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI
+base_model: avemio/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI
 pipeline_tag: question-answering
 tags:
 - German
@@ -19,9 +19,9 @@ tags:
 - gguf-my-repo
 ---
 
-# avemio-digital/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF
-This model was converted to GGUF format from [`avemio/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI`](https://huggingface.co/avemio/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
-Refer to the [original model card](https://huggingface.co/avemio/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI) for more details on the model.
+# avemio-digital/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF
+This model was converted to GGUF format from [`avemio/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI`](https://huggingface.co/avemio/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
+Refer to the [original model card](https://huggingface.co/avemio/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI) for more details on the model.
 
 ## Use with llama.cpp
 Install llama.cpp through brew (works on Mac and Linux)
@@ -34,12 +34,12 @@ Invoke the llama.cpp server or the CLI.
 
 ### CLI:
 ```bash
-llama-cli --hf-repo avemio-digital/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file grag-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -p "The meaning to life and the universe is"
+llama-cli --hf-repo avemio-digital/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file German-RAG-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -p "The meaning to life and the universe is"
 ```
 
 ### Server:
 ```bash
-llama-server --hf-repo avemio-digital/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file grag-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -c 2048
+llama-server --hf-repo avemio-digital/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file German-RAG-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -c 2048
 ```
 
 Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well.
@@ -56,9 +56,9 @@ cd llama.cpp && LLAMA_CURL=1 make
 
 Step 3: Run inference through the main binary.
 ```
-./llama-cli --hf-repo avemio-digital/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file grag-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -p "The meaning to life and the universe is"
+./llama-cli --hf-repo avemio-digital/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file German-RAG-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -p "The meaning to life and the universe is"
 ```
 or
 ```
-./llama-server --hf-repo avemio-digital/GRAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file grag-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -c 2048
+./llama-server --hf-repo avemio-digital/German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF --hf-file German-RAG-mistral-7b-v3.0-sft-hessian-ai-q8_0.gguf -c 2048
 ```
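
Once the `llama-server` command from the README is running, it serves an OpenAI-compatible chat endpoint. The sketch below shows a minimal client, assuming llama-server's default host/port (`localhost:8080`) and its `/v1/chat/completions` route; the `model` field is informational here, since the server answers with whatever GGUF it loaded:

```python
import json
import urllib.request

# OpenAI-style chat request for a local llama-server instance.
# Field names follow the OpenAI chat completions schema that
# llama-server mirrors; adjust max_tokens and the prompt as needed.
payload = {
    "model": "German-RAG-MISTRAL-7B-v3.0-SFT-HESSIAN-AI-Q8_0-GGUF",
    "messages": [
        {"role": "user", "content": "Was ist Retrieval-Augmented Generation?"}
    ],
    "max_tokens": 128,
}

def chat(url="http://localhost:8080/v1/chat/completions"):
    # POST the JSON payload and return the assistant's reply text.
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Call chat() once the server is up, e.g.: print(chat())
```

Using the standard library keeps the sketch dependency-free; swapping in the official `openai` client with `base_url="http://localhost:8080/v1"` would work the same way against this endpoint.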