Barcenas 10b

Based on tiiuae/Falcon3-10B-Instruct and fine-tuned with the yahma/alpaca-cleaned dataset.

The objective of this model is to explore fine-tuning on the new Falcon 3 models.
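Since the model inherits the chat template of Falcon3-10B-Instruct, it can be used with the standard Hugging Face transformers generation API. The sketch below is illustrative, not an official usage recipe from the model author; the generation parameters (temperature, token budget) are assumptions you should tune for your use case.

```python
# Minimal sketch of chat-style inference with Barcenas 10b via transformers.
# Assumes the chat template inherited from Falcon3-10B-Instruct; sampling
# settings below are illustrative defaults, not author recommendations.

MODEL_ID = "Danielbrdz/Barcenas-10b"

# alpaca-cleaned is an instruction dataset, so instruction/response style
# prompts are the expected input; the chat template handles the formatting.
messages = [
    {"role": "user", "content": "Explain in one sentence what fine-tuning is."},
]

if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # weights are published in FP16
        device_map="auto",
    )

    # Render the messages into the model's expected prompt format.
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.7
    )
    # Decode only the newly generated tokens, skipping the prompt.
    reply = tokenizer.decode(
        output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
    print(reply)
```

Loading a 10.3B-parameter model in FP16 needs roughly 21 GB of accelerator memory; `device_map="auto"` lets accelerate spill layers to CPU if the GPU is smaller.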

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽

Model size: 10.3B params
Tensor type: FP16
Format: Safetensors
