KoBERT

How to use

To load the KoBERT tokenizer with AutoTokenizer, you must pass trust_remote_code=True, because the tokenizer code is defined in the model repository rather than in the transformers library.

from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
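Once loaded, the tokenizer and model can be used like any other transformers checkpoint. A minimal sketch (the sample sentence and printed shape comment are illustrative, not from the model card; running it requires network access to download the checkpoint):

from transformers import AutoModel, AutoTokenizer

# trust_remote_code=True is required because the tokenizer is custom code
model = AutoModel.from_pretrained("monologg/kobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)

# Encode a Korean sentence and run a forward pass
inputs = tokenizer("한국어 모델을 공유합니다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)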


Model size: 92.2M parameters (F32, Safetensors)
