from transformers import AutoTokenizer

# Left-padding keeps the prompt tokens adjacent to the newly generated tokens.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1", padding_side="left")
# Tokenize the prompt and move the resulting tensors to the GPU.
model_inputs = tokenizer(["A list of colors: red, blue"], return_tensors="pt").to("cuda")
The model_inputs variable holds the tokenized text input, as well as the attention mask.
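As a sketch of what typically comes next, these inputs can be unpacked into a causal language model's generate method; the device_map setting and max_new_tokens value below are illustrative assumptions rather than part of the original snippet.

from transformers import AutoModelForCausalLM

# Load the matching model (placed automatically across available devices; assumed setting).
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1", device_map="auto")

# Generate a continuation from the tokenized prompt and attention mask.
generated_ids = model.generate(**model_inputs, max_new_tokens=20)
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0])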