Charmen-Electra

A byte-based transformer model trained on Hungarian text. To use the model you will need the custom tokenizer available at: https://github.com/szegedai/byte-offset-tokenizer.

Since the model uses a custom architecture with gradient-based subword tokenization (GBST) and down- and up-sampling, you have to enable trusted remote code when loading it:

from transformers import AutoModel

model = AutoModel.from_pretrained("SzegedAI/charmen-electra", trust_remote_code=True)
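Because the model operates on bytes rather than words, its inputs grow with the UTF-8 byte length of the text, not the character count. The sketch below only illustrates this byte-level view of Hungarian text; the actual input encoding is handled by the byte-offset tokenizer linked above, whose API is not shown here.

```python
# Illustrative sketch only: the real tokenizer is the byte-offset tokenizer
# from the linked GitHub repo. Here we just show how Hungarian text maps to
# raw UTF-8 bytes, the unit a byte-based model consumes.
text = "árvíztűrő tükörfúrógép"  # Hungarian pangram with accented characters
byte_ids = list(text.encode("utf-8"))

# Accented letters such as 'á' expand to two bytes under UTF-8,
# so the byte sequence is longer than the character sequence.
print(len(text), len(byte_ids))
```

This is why down-sampling (as in GBST) matters for byte-level models: byte sequences for accented languages like Hungarian are noticeably longer than their character counts.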

Acknowledgement

Artificial Intelligence - National Laboratory - Hungary

