# Model Card for agrimi7.5B-dolly
This model is a supervised fine-tuned (SFT) version of Facebook's XGLM-7.5B, trained on a machine-translated Greek version of the databricks-dolly-15k dataset. Its purpose is to demonstrate that this pretrained model can adapt to instruction-following behavior using a relatively small dataset such as databricks-dolly-15k.
## Model Details

### Model Description
- Developed by: Andreas Loupasakis
- Model type: Causal Language Model
- Language(s) (NLP): Greek (el)
- License: Apache-2.0
- Finetuned from model: XGLM-7.5B
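As a causal language model fine-tuned on Dolly-style instruction data, it can be loaded with the `transformers` library. The sketch below is a minimal, unofficial example: the hub id `agrimi7.5B-dolly` is a placeholder, and the exact prompt template used during fine-tuning is an assumption (check the training setup before relying on it).

```python
def build_prompt(instruction: str, context: str = "") -> str:
    """Format a Dolly-style instruction prompt.

    The template below is an assumed format, not confirmed by this
    model card; adjust it to match the actual training data.
    """
    if context:
        return (
            f"### Instruction:\n{instruction}\n\n"
            f"### Context:\n{context}\n\n### Response:\n"
        )
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, model_id: str = "agrimi7.5B-dolly") -> str:
    """Load the model and generate a response for a Greek instruction.

    `model_id` is a placeholder; replace it with the actual
    Hugging Face Hub repository id.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    # Decode only the tokens generated after the prompt.
    return tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:],
        skip_special_tokens=True,
    )
```

For example, `generate("Γράψε μια σύντομη περιγραφή της Αθήνας.")` would prompt the model with a Greek instruction and return the generated continuation.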