Why am I getting this error for big texts on my local machine, when I can run the same texts on the Hugging Face Inference API?

#12
by Kaushal2012 - opened

Error that i am getting:
"Token indices sequence length is longer than the specified maximum sequence length for this model (1161 > 1024). Running this sequence through the model will result in indexing errors"


So far the results I got on other texts using your model are really accurate and great, so great work, guys!!

The Hugging Face Inference API automatically truncates the input you give it. Locally you have to pass a parameter to enable truncation so that your inputs get clipped to the model's maximum length (truncation=True).
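A minimal sketch of what that looks like. The `truncation=True` argument is a real parameter of the transformers summarization pipeline; the model name below is an assumption based on this discussion's location, so substitute your actual checkpoint. The helper function just illustrates what truncation does to the token-id sequence from the error message (1161 ids clipped to 1024):

```python
# With the transformers pipeline, truncation is one keyword argument
# (model name is an assumption -- use whichever checkpoint you loaded):
#
#   from transformers import pipeline
#   summarizer = pipeline("summarization", model="knkarthick/MEETING_SUMMARY")
#   summary = summarizer(long_text, truncation=True)

MAX_LEN = 1024  # the model's maximum sequence length from the error message

def truncate_ids(token_ids, max_len=MAX_LEN):
    """Keep only the first max_len token ids, as truncation=True would."""
    return token_ids[:max_len]

ids = list(range(1161))       # stand-in for the 1161 token ids in the error
clipped = truncate_ids(ids)
print(len(clipped))           # 1024
```

Anything past the 1024th token is simply dropped, which is why the hosted API never raises the indexing error.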

knkarthick changed discussion status to closed

But if I truncate, won't it omit text that may be important for the summary? And what does that do to the summary result?

After the truncation it's not giving proper results; the summary is incomplete, with unfinished sentences!!
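One common workaround for this (not something specific to this model) is to split the long input into chunks that each fit under the token limit, summarize every chunk, and then optionally summarize the concatenated partial summaries. A rough word-based sketch follows; in real code you would count tokens with the model's own tokenizer rather than splitting on words:

```python
def chunk_words(text, max_words=700):
    """Split text into word chunks small enough to stay under the model's
    1024-token limit (word count is only a rough proxy for token count)."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

# Each chunk would then be summarized separately, e.g.:
#   partials = [summarizer(c, truncation=True)[0]["summary_text"]
#               for c in chunk_words(long_text)]
#   final = summarizer(" ".join(partials), truncation=True)

chunks = chunk_words("word " * 2000)
print(len(chunks))  # 3 chunks of at most 700 words each
```

This way no part of the document is silently dropped, at the cost of extra model calls and a second summarization pass.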

Kaushal2012 changed discussion status to open

How can I increase the length of the summary? I give an input of 6000 words or more but I get a summary of only 20-25 words. Why, and how do I solve this?


Maybe adjust the token size, but you probably won't get a much bigger summary.
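For what it's worth, summary length is controlled by generation parameters: the transformers summarization pipeline accepts `min_length` and `max_length`, both measured in tokens, not words. Keep in mind that with truncation the model only ever sees the first ~1024 tokens of a 6000-word input, so forcing a longer output won't make it cover the dropped text. A hedged sketch, with a rough words-to-tokens conversion (the ~0.75 words-per-token ratio is a rule of thumb for English, not a property of this model):

```python
# Hypothetical pipeline call -- min_length / max_length are real arguments
# of the transformers summarization pipeline, counted in tokens:
#
#   summary = summarizer(text, truncation=True,
#                        min_length=100,   # force at least ~75 words
#                        max_length=300)   # allow up to ~225 words

def tokens_for_words(n_words, words_per_token=0.75):
    """Estimate the token budget needed for a target word count,
    using a rough ~0.75 words-per-token ratio for English text."""
    return int(n_words / words_per_token)

print(tokens_for_words(150))  # 200 tokens for a ~150-word summary
```

So to target a 150-word summary you would set `min_length` somewhere around 200 tokens, then check the output quality, since forcing a minimum length can make the model pad with repetition.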

Kaushal2012 changed discussion status to closed