# KoBERT

## How to use
If you want to load the KoBERT tokenizer with `AutoTokenizer`, you should pass `trust_remote_code=True`.
```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
```
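Once loaded, the tokenizer works like any other `transformers` tokenizer. A minimal sketch of tokenizing a Korean sentence (the sample sentence is illustrative, not from the model card):

```python
from transformers import AutoTokenizer

# trust_remote_code=True is required because the tokenizer class
# lives in the model repository rather than in transformers itself.
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)

# Tokenize a sample Korean sentence into subword pieces.
text = "한국어 모델을 공유합니다."
tokens = tokenizer.tokenize(text)
print(tokens)

# Encode to model-ready tensors (input_ids, attention_mask, ...).
encoded = tokenizer(text, return_tensors="pt")
print(encoded["input_ids"].shape)
```

The encoded output can be fed directly to the model loaded via `AutoModel.from_pretrained("monologg/kobert")`.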