```shell
pip install transformers
```

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

torch_device = "cuda" if torch.cuda.is_available() else "cpu"

# Financial-news sentiment classifier fine-tuned from distilroberta-base
model_name = "mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# This is a sequence-classification model, so load it with
# AutoModelForSequenceClassification rather than a causal-LM class.
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(torch_device)

model_inputs = tokenizer("Nvidia reported profits of €10Mio", return_tensors="pt").to(torch_device)
# Predicted class id: the index of the highest-scoring logit
output = model(**model_inputs).logits.argmax(axis=1)
```
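The `argmax` above returns a class index, not a label. A minimal sketch of turning that index into a readable label follows; the label names in the dictionary are an assumption for illustration, and the authoritative mapping for a given checkpoint lives in `model.config.id2label`:

```python
# Map a predicted class index to a human-readable sentiment label.
# NOTE: this id2label dict is a hypothetical example; in practice read
# the real mapping from model.config.id2label after loading the model.
id2label = {0: "negative", 1: "neutral", 2: "positive"}

def decode_prediction(class_id: int, mapping: dict) -> str:
    # Fall back to "unknown" for indices outside the mapping
    return mapping.get(class_id, "unknown")

print(decode_prediction(2, id2label))  # positive
```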
Model tree for unemonti/distilroberta-base

Base model: distilbert/distilroberta-base