---
license: apache-2.0
datasets:
- yahma/alpaca-cleaned
language:
- en
base_model:
- tiiuae/Falcon3-10B-Instruct
tags:
- falcon3
- alpaca
pipeline_tag: text-generation
library_name: transformers
---

# Barcenas 10b
Based on tiiuae/Falcon3-10B-Instruct and fine-tuned on the yahma/alpaca-cleaned dataset.
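Records in yahma/alpaca-cleaned carry the standard Alpaca fields (`instruction`, `input`, `output`). A minimal sketch of how such a record is commonly rendered into a single training prompt, assuming the standard Alpaca template (the exact template used for this fine-tune is not stated here):

```python
# Standard Alpaca prompt templates; whether this fine-tune used them
# verbatim is an assumption.
ALPACA_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

ALPACA_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n{output}"
)

def format_record(record: dict) -> str:
    """Render one alpaca-cleaned record as a single training prompt."""
    # Records with an empty "input" field use the shorter template.
    template = ALPACA_WITH_INPUT if record.get("input") else ALPACA_NO_INPUT
    return template.format(**record)

example = {
    "instruction": "Summarize the text.",
    "input": "Falcon 3 is a family of open language models.",
    "output": "Falcon 3 is an open LLM family.",
}
print(format_record(example))
```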
The objective of this model is to explore fine-tuning on the new Falcon 3 models.
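A hedged inference sketch with the transformers library. The fine-tuned model's repository id is not given in this card, so the base model's id is used as a placeholder; swap in this model's repo id to run it:

```python
# Hypothetical usage sketch; the default model_id below is the base model,
# not this fine-tune, because this card does not state the fine-tune's repo id.
def build_messages(instruction: str) -> list:
    # Single-turn, Alpaca-style instruction as a chat message.
    return [{"role": "user", "content": instruction}]

def generate(instruction: str, model_id: str = "tiiuae/Falcon3-10B-Instruct") -> str:
    # Deferred import so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer.apply_chat_template(
        build_messages(instruction),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

# e.g. print(generate("Explain fine-tuning in one sentence."))
```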
Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽