Fix wrong model_max_length

#5
by andstor - opened

The model has a context window of 2048 tokens (`n_positions`), so the tokenizer's `model_max_length` should be set to the same value.
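
For reference, the mismatch can be checked and corrected locally with `transformers`. The sketch below is illustrative only: the repo id is a placeholder, and it assumes a GPT-2-style config that exposes `n_positions`.

```python
from transformers import AutoConfig, AutoTokenizer

# Placeholder repo id; substitute the actual model repository.
repo_id = "org/model-name"

config = AutoConfig.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# n_positions is the GPT-2-style name for the model's context window.
print("model context window:", config.n_positions)            # expected: 2048
print("tokenizer model_max_length:", tokenizer.model_max_length)

# If the two values disagree, align the tokenizer with the model and
# write out a corrected tokenizer_config.json.
if tokenizer.model_max_length != config.n_positions:
    tokenizer.model_max_length = config.n_positions
    tokenizer.save_pretrained("fixed-tokenizer")
```

Saving the tokenizer persists the corrected `model_max_length` in `tokenizer_config.json`, which is the file this branch updates.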

Ready to merge
This branch is ready to get merged automatically.