This is a pruned version of the google/mt5-large model. The input and output embeddings are pruned to support a greatly reduced vocabulary. The chosen vocabulary contains 30K Norwegian, English and special tokens, about 12% of the original size, which reduces the model size by roughly 37%. A sketch of the pruning idea is shown below.
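The exact pruning script is not included here; the following is a minimal sketch of the general idea, assuming a list of token ids to keep has already been selected. The placeholder ids, output path, and procedure are illustrative, not the script used to produce this checkpoint, and the tokenizer's sentencepiece vocabulary must be reduced correspondingly (omitted here).

```python
import torch
from transformers import MT5ForConditionalGeneration

model = MT5ForConditionalGeneration.from_pretrained("google/mt5-large")

# Placeholder: replace with the ~30K Norwegian/English/special token ids to keep.
keep_ids = list(range(1000))
idx = torch.tensor(sorted(keep_ids))

# Slice the shared input embedding matrix down to the kept rows.
new_shared = torch.nn.Embedding(len(keep_ids), model.config.d_model)
new_shared.weight.data = model.shared.weight.data[idx].clone()
model.set_input_embeddings(new_shared)

# mT5 does not tie the LM head to the input embeddings, so prune it separately.
new_lm_head = torch.nn.Linear(model.config.d_model, len(keep_ids), bias=False)
new_lm_head.weight.data = model.lm_head.weight.data[idx].clone()
model.set_output_embeddings(new_lm_head)

model.config.vocab_size = len(keep_ids)
model.save_pretrained("mt5-large-norwegian-pruned")  # placeholder output path
```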
The model still performs reasonably on related languages such as German and Danish, but very different languages such as Arabic are no longer a good fit.
This model is intended as a starting point for fine-tuning mT5 for Norwegian applications.
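A minimal loading sketch for that use case, assuming the checkpoint is published under a repo id like `<user>/mt5-large-norwegian` (a placeholder; substitute the actual model id):

```python
from transformers import AutoTokenizer, MT5ForConditionalGeneration

model_id = "<user>/mt5-large-norwegian"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = MT5ForConditionalGeneration.from_pretrained(model_id)

# From here, fine-tune as with any mT5 seq2seq model, e.g. with the Trainer API
# on a Norwegian text-to-text dataset.
```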