Update README.md
README.md CHANGED
@@ -107,40 +107,14 @@ RepoQA: a benchmark for long context code understanding
 ## Usage
 
 ### Requirements
-Phi-3 family has been integrated in the `4.43.0` version of `transformers`. The current `transformers` version can be verified with: `pip list | grep transformers`.
-
-Examples of required packages:
-```
-flash_attn==2.5.8
-torch==2.3.1
-accelerate==0.31.0
-transformers==4.43.0
-```
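
For reference, a minimal Python sketch of checking the installed versions against the pins listed above; the package names and versions are copied from that list, `importlib.metadata` ships with Python 3.8+, and pip treats `-` and `_` in distribution names interchangeably:

```python
# A quick check that the installed packages match the pins listed above.
# Package names/versions are copied from the list in this README.
from importlib.metadata import PackageNotFoundError, version

pins = {
    "flash_attn": "2.5.8",
    "torch": "2.3.1",
    "accelerate": "0.31.0",
    "transformers": "4.43.0",
}

for name, expected in pins.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        print(f"{name}: not installed (expected {expected})")
        continue
    note = "OK" if installed == expected else f"differs from pinned {expected}"
    print(f"{name}: {installed} ({note})")
```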
-
-Phi-3.5-mini-instruct is also available in [Azure AI Studio](https://aka.ms/try-phi3.5mini)
-
-### Tokenizer
-
-Phi-3.5-mini-Instruct supports a vocabulary size of up to `32064` tokens. The [tokenizer files](https://huggingface.co/microsoft/Phi-3.5-mini-instruct/blob/main/added_tokens.json) already provide placeholder tokens that can be used for downstream fine-tuning, but they can also be extended up to the model's vocabulary size.
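
A minimal sketch of what extending the vocabulary commonly looks like with `transformers`; the added token strings below are made-up examples (not tokens defined by the model), and whether to reuse the existing placeholder tokens or add new ones depends on your fine-tuning setup:

```python
# Sketch: add downstream fine-tuning tokens and keep the embeddings in sync.
# The token strings below are illustrative placeholders, not defined by the model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

new_tokens = ["<|tool_call|>", "<|tool_result|>"]  # hypothetical examples
num_added = tokenizer.add_tokens(new_tokens)
print(f"added {num_added} tokens, vocab is now {len(tokenizer)}")  # keep <= 32064

# Resize the embedding matrix so the new ids have rows to train during fine-tuning.
model.resize_token_embeddings(len(tokenizer))
```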
-
-### Input Formats
-Given the nature of the training data, the Phi-3.5-mini-instruct model is best suited for prompts using the chat format as follows:
-
-```
-<|system|>
-You are a helpful assistant.<|end|>
-<|user|>
-How to explain Internet for a medieval knight?<|end|>
-<|assistant|>
-```
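
A minimal sketch of producing this prompt structure programmatically, assuming the packages listed under Requirements are installed; the tokenizer's bundled chat template renders the `<|system|>`/`<|user|>`/`<|assistant|>` markers, and the generation settings here are illustrative, not recommendations:

```python
# Sketch: build the chat-format prompt via the tokenizer's chat template and
# run a short generation. Sampling/length settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How to explain Internet for a medieval knight?"},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=128)

# Strip the prompt tokens and print only the assistant's reply.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```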
-
-### Loading the model locally with Transformers.js
 
 If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
 ```bash
 npm i @huggingface/transformers
 ```
 
+### Loading the model locally with Transformers.js
+
 You can then use the model to generate text like this:
 
 ```js