> A newer version of this model is available: mistralai/Mistral-7B-Instruct-v0.1
# aplicadaT1

This is a Mistral-based language model, fine-tuned for educational applications.

## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("mhidper/aplicadaT1-complete")
model = AutoModelForCausalLM.from_pretrained("mhidper/aplicadaT1-complete")

# Example usage
input_text = "### Instruction: Explica el concepto de derivadas en cálculo.\n\n### Response:"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
output = model.generate(input_ids, max_length=500)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
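The prompt above follows a simple `### Instruction` / `### Response` template. A small helper like the one below (illustrative only, not part of the model repository; the template is assumed from the example above) keeps prompts consistently formatted:

```python
def build_prompt(instruction: str) -> str:
    # Wrap a plain instruction in the Instruction/Response template
    # used in the usage example above. The helper itself is a sketch,
    # not an official utility from the repo.
    return f"### Instruction: {instruction}\n\n### Response:"

prompt = build_prompt("Explica el concepto de derivadas en cálculo.")
print(prompt)
# The model's completion is everything generated after "### Response:".
```

Keeping prompt construction in one place makes it easy to verify that inference-time prompts match the fine-tuning format.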
## Model tree for mhidper/aplicadaT1-complete

- Base model: mistralai/Mixtral-8x7B-v0.1
- Fine-tuned from: mistralai/Mixtral-8x7B-Instruct-v0.1