## Example usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the GPT-2 tokenizer and the fine-tuned summarization checkpoint
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("KookyGhost/GPT2-small-summarization")

prompt = "Summarize this: Reddit user shares a long story about learning to code with free online resources and eventually landing their first developer job."
inputs = tokenizer(prompt, return_tensors="pt", truncation=True)

# Sample a summary; GPT-2 has no pad token, so reuse EOS to avoid the warning
outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,
)

summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(summary)
```
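Because GPT-2 is a decoder-only model, `generate` returns the prompt tokens followed by the new tokens, so the decoded string above includes the prompt itself. A minimal sketch for printing only the generated summary, reusing the `inputs` and `outputs` variables from the snippet above:

```python
# Decode only the newly generated tokens, slicing off the echoed prompt
prompt_len = inputs["input_ids"].shape[1]
summary_only = tokenizer.decode(outputs[0][prompt_len:], skip_special_tokens=True)
print(summary_only)
```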