DistilKoBERT

How to use

If you want to load the DistilKoBERT tokenizer with AutoTokenizer, you must pass trust_remote_code=True.

from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/distilkobert")
# trust_remote_code=True is required because the tokenizer is implemented with custom code.
tokenizer = AutoTokenizer.from_pretrained("monologg/distilkobert", trust_remote_code=True)
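
For reference, below is a minimal sketch of a full forward pass on a Korean sentence. It assumes the custom DistilKoBERT tokenizer follows the standard Hugging Face tokenizer call interface (returning input_ids and attention_mask as PyTorch tensors); the example sentence is only illustrative.

import torch
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/distilkobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/distilkobert", trust_remote_code=True)

# Encode an illustrative Korean sentence and run the model without gradients.
inputs = tokenizer("한국어 모델을 공유합니다.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)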
