---
license: cc-by-sa-4.0
language:
  - pl
metrics:
  - bleu
base_model:
  - GreTa
library_name: transformers
datasets:
  - mrapacz/greek-interlinear-translations
---

# Model Card for Ancient Greek to Polish Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between source and target texts.

## Model Details

### Model Description

- **Developed By:** Maciej Rapacz, AGH University of Kraków
- **Model Type:** Neural machine translation (T5-based)
- **Base Model:** GreTa
- **Tokenizer:** GreTa
- **Language(s):** Ancient Greek (source) → Polish (target)
- **License:** CC BY-NC-SA 4.0
- **Tag Set:** OB (Oblubienica)
- **Text Preprocessing:** Normalized
- **Morphological Encoding:** t-w-t (tags-within-text)

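Since the card lists `transformers` as the library and a T5-based architecture, the model should load through the standard seq2seq classes. The snippet below is only a minimal sketch: the repository id is a placeholder, the generation settings are arbitrary, and the example input (Greek words interleaved with morphological tags, following the t-w-t encoding) is illustrative rather than verified against the Oblubienica tag set.

```python
# Minimal usage sketch, assuming the standard transformers seq2seq API.
# The repository id is a hypothetical placeholder and the tag strings in
# the example input are illustrative only -- check the repository files
# for the canonical input format.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "mrapacz/your-model-id"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# In t-w-t encoding, each source word is assumed to be followed by its
# morphological tag within the same input string.
text = "λεγει V-PAI-3S αυτω PPro-DM3S ο Art-NMS ιησους N-NMS"

inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
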
### Model Performance

- **BLEU Score:** 0.78
- **SemScore:** 0.56

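The card does not state how these scores were obtained. As a point of reference only, corpus-level BLEU over word-aligned model outputs can be computed with `sacrebleu` along the lines below; the hypothesis and reference strings are made up for illustration and this is not the evaluation pipeline behind the reported numbers.

```python
# Generic corpus-BLEU sketch with sacrebleu; not the exact evaluation
# setup used for the scores reported above, which this card does not specify.
import sacrebleu

hypotheses = ["mowi mu - Jezus"]    # model outputs (illustrative)
references = [["mowi mu - Jezus"]]  # one reference stream (illustrative)

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(round(bleu.score, 2))
```
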
### Model Sources