---
license: apache-2.0
datasets:
  - cerebras/SlimPajama-627B
  - bigcode/starcoderdata
  - HuggingFaceH4/ultrachat_200k
  - HuggingFaceH4/ultrafeedback_binarized
language:
  - en
widget:
  - example_title: Fibonacci (Python)
    messages:
      - role: system
        content: You are a chatbot who can help code!
      - role: user
        content: >-
          Write me a function to calculate the first 10 digits of the fibonacci
          sequence in Python and print it out to the CLI.
pipeline_tag: text-generation
tags:
  - openvino
  - openvino-export
base_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
---

This model was converted to the OpenVINO format from [TinyLlama/TinyLlama-1.1B-Chat-v1.0](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0) using `optimum-intel` via the export space.

First, make sure you have `optimum-intel` installed with the OpenVINO extras:

```bash
pip install optimum[openvino]
```

To load the model:

```python
from optimum.intel import OVModelForCausalLM

# Load the exported OpenVINO IR model from the Hub
model_id = "NitroLLM/TinyLlama-1.1B-Chat-v1.0-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```