Update README.md
README.md
CHANGED
@@ -14,7 +14,7 @@ This repository hosts the optimized versions of [gpt-oss-20b](https://huggingfac
 
 Optimized models are published here in [ONNX](https://onnx.ai) format to run with [ONNX Runtime](https://onnxruntime.ai) on CUDA GPU devices, with the precision best suited to this target.
 
-To easily get started with the model, you can use [Foundry Local](). See instructions [here](https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-local/get-started#run-the-latest-openai-open-source-model).
+To easily get started with the model, you can use [Foundry Local](https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-local/get-started). See instructions [here](https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-local/get-started#run-the-latest-openai-open-source-model).
 
 You can also build [ONNX Runtime GenAI](https://onnxruntime.ai/docs/genai/) from source to get the latest changes and run the model. See instructions [here](https://onnxruntime.ai/docs/genai/howto/build-from-source.html) for building from source. You can then run the inference example [here](https://github.com/microsoft/onnxruntime-genai/blob/main/examples/python/model-chat.py).
 