Xenova (HF staff) committed on
Commit 1eba112 · verified · 1 Parent(s): 3f928c2

Update README.md

Files changed (1):
  1. README.md (+60 −40)

README.md CHANGED
@@ -8,8 +8,11 @@ tags:
 - nlp
 - code
 library_name: transformers.js
+base_model:
+- microsoft/Phi-3.5-mini-instruct
 ---
 
+
 ## Model Summary
 
 Phi-3.5-mini is a lightweight, state-of-the-art open model built upon datasets used for Phi-3 - synthetic data and filtered publicly available websites - with a focus on very high-quality, reasoning dense data. The model belongs to the Phi-3 model family and supports 128K token context length. The model underwent a rigorous enhancement process, incorporating both supervised fine-tuning, proximal policy optimization, and direct preference optimization to ensure precise instruction adherence and robust safety measures.
@@ -131,48 +134,65 @@ How to explain Internet for a medieval knight?<|end|>
 <|assistant|>
 ```
 
-### Loading the model locally
-After obtaining the Phi-3.5-mini-instruct model checkpoint, users can use this sample code for inference.
-
-```python
-import torch
-from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
-
-torch.random.manual_seed(0)
-
-model = AutoModelForCausalLM.from_pretrained(
-    "microsoft/Phi-3.5-mini-instruct",
-    device_map="cuda",
-    torch_dtype="auto",
-    trust_remote_code=True,
-)
-tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3.5-mini-instruct")
-
-messages = [
-    {"role": "system", "content": "You are a helpful AI assistant."},
-    {"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
-    {"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
-    {"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
-]
-
-pipe = pipeline(
-    "text-generation",
-    model=model,
-    tokenizer=tokenizer,
-)
-
-generation_args = {
-    "max_new_tokens": 500,
-    "return_full_text": False,
-    "temperature": 0.0,
-    "do_sample": False,
-}
-
-output = pipe(messages, **generation_args)
-print(output[0]['generated_text'])
+### Loading the model locally with Transformers.js
+
+If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
+```bash
+npm i @huggingface/transformers
+```
+
+You can then use the model to generate text like this:
+
+```js
+import { pipeline } from "@huggingface/transformers";
+
+// Create a text generation pipeline
+const generator = await pipeline(
+  "text-generation",
+  "onnx-community/Phi-3.5-mini-instruct-onnx-web",
+);
+
+// Define the list of messages
+const messages = [
+  { role: "system", content: "You are a helpful assistant." },
+  { role: "user", content: "Solve the equation: x^2 + 2x - 3 = 0" },
+];
+
+// Generate a response
+const output = await generator(messages, { max_new_tokens: 256, do_sample: false });
+console.log(output[0].generated_text.at(-1).content);
+```
+
+
+<details>
+
+<summary>See example output</summary>
+
+```
+To solve the quadratic equation x^2 + 2x - 3 = 0, we can use the quadratic formula:
+
+x = (-b ± √(b^2 - 4ac)) / 2a
+
+In this equation, a = 1, b = 2, and c = -3.
+
+x = (-(2) ± √((2)^2 - 4(1)(-3))) / 2(1)
+
+x = (-2 ± √(4 + 12)) / 2
+
+x = (-2 ± √16) / 2
+
+x = (-2 ± 4) / 2
+
+There are two possible solutions:
+
+x = (-2 + 4) / 2 = 2 / 2 = 1
+
+x = (-2 - 4) / 2 = -6 / 2 = -3
+
+So, the solutions to the equation x^2 + 2x - 3 = 0 are x = 1 and x = -3.
 ```
 
-Notes: If you want to use flash attention, call _AutoModelForCausalLM.from_pretrained()_ with _attn_implementation="flash_attention_2"_
+</details>
 
 ## Responsible AI Considerations
 
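
---

The example model output quoted in the diff above works through the quadratic formula by hand. As a quick sanity check (not part of the commit itself), the same arithmetic can be reproduced in a few lines of Python:

```python
import math

# Coefficients of x^2 + 2x - 3 = 0, as in the example output
a, b, c = 1, 2, -3

# Quadratic formula: x = (-b ± sqrt(b^2 - 4ac)) / 2a
disc = b**2 - 4*a*c                      # 4 + 12 = 16
root1 = (-b + math.sqrt(disc)) / (2*a)   # (-2 + 4) / 2 = 1
root2 = (-b - math.sqrt(disc)) / (2*a)   # (-2 - 4) / 2 = -3

print(root1, root2)  # 1.0 -3.0
```

This matches the solutions x = 1 and x = -3 shown in the model's answer.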