

aplicadaT1 model

This is a Mistral-based language model, fine-tuned for educational applications. The weights are distributed as FP16 safetensors (7.24B parameters).

Using the model

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("mhidper/aplicadaT1-complete")
model = AutoModelForCausalLM.from_pretrained("mhidper/aplicadaT1-complete")

# Example usage: the model expects an Instruction/Response prompt format
input_text = "### Instruction: Explica el concepto de derivadas en cálculo.\n\n### Response:"
inputs = tokenizer(input_text, return_tensors="pt")
# Generate up to 500 new tokens beyond the prompt
output = model.generate(**inputs, max_new_tokens=500)
print(tokenizer.decode(output[0], skip_special_tokens=True))
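
The generate call returns the full sequence, prompt included, so the decoded text echoes the instruction before the answer. A minimal sketch of one way to keep only the response, slicing off the prompt tokens before decoding (reusing the inputs and output variables from the example above):

# Keep only the newly generated tokens, dropping the echoed prompt
prompt_length = inputs["input_ids"].shape[1]
response = tokenizer.decode(output[0][prompt_length:], skip_special_tokens=True)
print(response)

Running a ~7B-parameter model in full precision on CPU is slow and memory-hungry. A sketch of loading it in half precision on a GPU instead, assuming PyTorch with CUDA and the accelerate package are available:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mhidper/aplicadaT1-complete")
model = AutoModelForCausalLM.from_pretrained(
    "mhidper/aplicadaT1-complete",
    torch_dtype=torch.float16,  # load the FP16 weights in half precision
    device_map="auto",          # place layers on available GPU(s) via accelerate
)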