Indonesian T5 Abstractive Summarization Base Model

Hello everyone! We are the SumText Group from Bina Nusantara University: Stevan Pohan, Joseph Vincent Liem, and Yongky Alexander Tristan. This is a model we have fine-tuned for abstractive summarization of Indonesian text.

Load the Fine-Tuned Model

  import torch
  from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

  # Pick a device (the original snippet used `device` without defining it)
  device = "cuda" if torch.cuda.is_available() else "cpu"

  # Load the fine-tuned checkpoint and its tokenizer from the Hub
  model_path = "migz117/T5-Abstractive"
  model = AutoModelForSeq2SeqLM.from_pretrained(model_path).to(device)
  tokenizer = AutoTokenizer.from_pretrained(model_path)
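Once the model and tokenizer are loaded, summarization is a standard seq2seq generate-and-decode loop. The sketch below shows one way to do it; the sample Indonesian text and the generation parameters (`num_beams`, `max_length`) are illustrative choices, not values from the model card.

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_path = "migz117/T5-Abstractive"
device = "cuda" if torch.cuda.is_available() else "cpu"
model = AutoModelForSeq2SeqLM.from_pretrained(model_path).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Hypothetical input article (any Indonesian news paragraph works here)
text = (
    "Pemerintah Indonesia mengumumkan rencana pembangunan infrastruktur baru "
    "di beberapa provinsi untuk mendorong pertumbuhan ekonomi daerah. Proyek "
    "ini diperkirakan akan menyerap ribuan tenaga kerja lokal."
)

# Tokenize, truncating long inputs to the encoder's window
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512).to(device)

# Beam search tends to give more fluent abstractive summaries than greedy decoding
summary_ids = model.generate(
    **inputs,
    max_length=128,
    num_beams=4,
    early_stopping=True,
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

The `max_length=512` truncation matches the typical T5 encoder window; longer documents would need chunking or a sliding-window approach.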
Model size: 223M params
Tensor type: F32 (Safetensors)

Dataset used to train migz117/T5-Abstractive: