rajabmondal committed
Commit 61a0c1f · verified · Parent: fb40fb1

Update README.md

Files changed (1): README.md (+17 −5)
 
## How to use with Ollama

1. **Install Ollama:**

   ```
   curl -fsSL https://ollama.com/install.sh | sh
   ```

2. **Run the *NT-Java* model:**

   ```
   ollama run NT-Java
   ```
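Besides the CLI, a locally running Ollama server exposes a REST API on `http://localhost:11434`, which is handy for calling the model from code. A minimal Python sketch (the model name `NT-Java` is assumed to match the model set up above, and the `generate` call only works while the Ollama server is running):

```python
import json
from urllib import request

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks the server for a single JSON response
    instead of a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires `ollama run NT-Java` to have been set up first):
# print(generate("NT-Java", "public static int factorial(int n) {"))
```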
### Building from `Modelfile`

Assuming that you have already downloaded the GGUF files, here is how you can use them with [Ollama](https://ollama.com/):

1. **Get the Modelfile:**

   ```
   huggingface-cli download infosys/NT-Java-1.1B-GGUF Modelfile_q4_k_m --local-dir /path/to/your/local/dir
   ```

2. **Build the Ollama model:**
   Use the Ollama CLI to create your model with the following command:

   ```
   ollama create NT-Java -f Modelfile_q4_k_m
   ```

3. **Run the *NT-Java* model:**

   Now you can run the NT-Java model with Ollama using the following command:

   ```
   ollama run NT-Java "Your prompt here"
   ```

Replace "Your prompt here" with the actual prompt you want to use for generating responses from the model.
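If you prefer to point Ollama directly at a GGUF file instead of using the downloaded `Modelfile_q4_k_m`, a Modelfile is just a small text file. A minimal sketch (the `FROM` filename and parameter values below are illustrative assumptions, not the contents of the repository's Modelfile):

```
# Illustrative Modelfile — point FROM at the quantized
# GGUF file you actually downloaded.
FROM ./nt-java-1.1b.Q4_K_M.gguf

# Optional generation defaults (example values, not tuned settings).
PARAMETER temperature 0.2
PARAMETER num_ctx 4096
```

Build and run it the same way as above: `ollama create NT-Java -f Modelfile`, then `ollama run NT-Java`.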