# Model Card

`abstracts_to_post_model` is a text2text-generation (sequence-to-sequence) model for turning paper abstracts into posts.

## Example Usage

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained('cornelliusyudhawijaya/abstracts_to_post_model', revision=None)
model = AutoModelForSeq2SeqLM.from_pretrained('cornelliusyudhawijaya/abstracts_to_post_model', revision=None)

# Build a text2text-generation pipeline around the loaded model
pipe = pipeline('text2text-generation', model=model, tokenizer=tokenizer, pad_token_id=tokenizer.pad_token_id)

inputs = ['Note that not all scientists will apply, but there may be a handful.\n\nThe abstract can be downloaded from the papers cited in the paper for use within your project. We also recommend posting the results of the experiment, using our mailing list format, on these pages.\n\nFor other papers, see How to obtain the data from your source publication in NLP.\n\nThis project was last reported with NLP 3.10.6. The journal publishes NLP 3.10.6 once every seven years.']
print(pipe(inputs, max_length=512, do_sample=False))
```
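For finer control over generation, the pipeline call above can also be written as a direct `tokenizer` + `model.generate` call. This is a minimal sketch under the same settings as the pipeline example (greedy decoding, `max_length=512`); the input truncation length of 512 is an assumption, not a documented property of this model.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = 'cornelliusyudhawijaya/abstracts_to_post_model'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

abstract = 'The abstract can be downloaded from the papers cited in the paper for use within your project.'

# Tokenize (truncating long inputs), generate greedily, and decode the first sequence
batch = tokenizer(abstract, return_tensors='pt', truncation=True, max_length=512)
output_ids = model.generate(**batch, max_length=512, do_sample=False)
post = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(post)
```

This form makes it easy to pass additional `generate` arguments (e.g. `num_beams`) that the pipeline shorthand does not expose as directly.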

This model was trained on a synthetic dataset generated with DataDreamer 🤖💤. The synthetic dataset card and model card can be found here. The training arguments can be found here.

Model size: 248M parameters · Tensor type: F32 (Safetensors)
