---
language:
- en
inference: false
datasets:
- conll2003
- wnut_17
- jnlpba
- conll2012
- BTC
tags:
- PyTorch
---

BERT base uncased model pre-trained on 5 NER datasets.

The model was trained by SberIDP.

- Task: NER
- Training data: 5 datasets (CoNLL-2003, WNUT17, JNLPBA, CoNLL-2012 (OntoNotes), BTC)
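
A minimal loading and tagging sketch with the `transformers` library, assuming the checkpoint is published on the Hugging Face Hub; the model identifier below is a placeholder, not the actual repository name.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Placeholder identifier: replace with the actual Hub repository of this checkpoint.
model_name = "your-org/bert-base-uncased-ner-reptile"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Tag a sample sentence: tokenize, run the model, map predicted label ids to tag names.
text = "George Washington went to Washington"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[int(pred)])
```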
The pretraining process and technical details are described in this article.
The model is pretrained for the NER task using Reptile, a meta-learning algorithm, and can be fine-tuned to recognize new entity types from only a small number of samples.
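
For illustration only, the sketch below shows a generic Reptile outer-loop update in PyTorch. It is not the training code used for this model; the `sample_task_batches` helper, the hyperparameters, and the assumption that each batch carries `labels` (so the forward pass returns a loss) are all hypothetical.

```python
import copy
import torch

def reptile_step(model, sample_task_batches, inner_steps=5, inner_lr=1e-4, meta_lr=0.1):
    # Keep a copy of the current meta-parameters before task adaptation.
    meta_weights = copy.deepcopy(model.state_dict())

    # Inner loop: adapt the model to one sampled task (e.g. one NER dataset) with plain SGD.
    # Each batch is assumed to include `labels`, so the forward pass returns a loss.
    optimizer = torch.optim.SGD(model.parameters(), lr=inner_lr)
    for batch in sample_task_batches(inner_steps):
        optimizer.zero_grad()
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()

    # Outer (Reptile) update: move meta-parameters a fraction of the way toward
    # the task-adapted parameters.
    adapted_weights = model.state_dict()
    new_weights = {}
    for name, meta_w in meta_weights.items():
        if meta_w.is_floating_point():
            new_weights[name] = meta_w + meta_lr * (adapted_weights[name] - meta_w)
        else:
            # Leave integer buffers (e.g. position ids) untouched.
            new_weights[name] = meta_w
    model.load_state_dict(new_weights)
```

After meta-training, fine-tuning on a new entity type amounts to running the usual token-classification training loop from the meta-learned initialization, which is why only a small number of labeled samples is needed.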