agrimi-7.5B-dolly / README.md
---
license: apache-2.0
datasets:
  - databricks/databricks-dolly-15k
language:
  - el
library_name: transformers
tags:
  - text-generation-inference
pipeline_tag: text-generation
---

# Model Card for agrimi-7.5B-dolly

This model is a supervised finetuned (SFT) version of Facebook's XGLM-7.5B, trained on a machine-translated Greek version of the databricks-dolly-15k dataset. Its purpose is to demonstrate that this pretrained model can adapt to an instruction-following mode using a relatively small dataset such as databricks-dolly-15k.
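The model can be loaded through the standard `transformers` causal-LM API. A minimal sketch follows; the repository id `alup/agrimi-7.5B-dolly` and the Dolly-style prompt template are assumptions, not confirmed by this card:

```python
MODEL_ID = "alup/agrimi-7.5B-dolly"  # assumed repo id, adjust if different


def build_prompt(instruction: str) -> str:
    # Dolly-style instruction template (assumed; verify against the training format)
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    # Lazy import so the prompt helper works without transformers installed
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Decode only the newly generated tokens, skipping the prompt
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For example, `generate("Τι είναι η φωτοσύνθεση;")` would return a Greek free-text answer.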

## Model Details

### Model Description

- **Developed by:** Andreas Loupasakis
- **Model type:** Causal Language Model
- **Language(s) (NLP):** Greek (el)
- **License:** Apache-2.0
- **Finetuned from model:** XGLM-7.5B