added readme

README.md
---
language:
- en
- ru
license: apache-2.0
---

# XLM-RoBERTa large model (uncased) whole word masking finetuned on SQuAD

Pretrained model on the English and Russian languages using a masked language modeling (MLM) objective. The underlying XLM-RoBERTa model was introduced in
[this paper](https://arxiv.org/abs/1911.02116) and first released in
[this repository](https://github.com/facebookresearch/fairseq). This model is uncased: it does not make a difference
between english and English.
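
For context, here is a minimal usage sketch with the Hugging Face Transformers `pipeline` API. The model id below is a placeholder for this repository's actual id on the Hub, and the question/context pair is invented for illustration.

```python
from transformers import pipeline

# Placeholder model id: substitute this repository's actual Hub id.
qa = pipeline("question-answering", model="<this-model-id>")

# The model handles questions in both English and Russian.
result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(result["answer"], result["score"])
```
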
## Used Datasets

SQuAD + SberQuAD

The [SberQuAD original paper](https://arxiv.org/pdf/1912.09723.pdf) is well worth reading.
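
As a sketch, both corpora are available on the Hugging Face Hub and can be loaded with the `datasets` library; the Hub ids below ("squad", "sberquad") are assumptions, since this card does not pin exact dataset versions.

```python
from datasets import load_dataset

# Assumed Hub ids; the card itself does not pin dataset versions.
squad = load_dataset("squad")        # English QA pairs (SQuAD v1.1)
sberquad = load_dataset("sberquad")  # Russian QA pairs (SberQuAD)

print(squad["train"][0]["question"])
print(sberquad["train"][0]["question"])
```
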
## Evaluation results

The results obtained on SberQuAD are the following:

```
f1 = 84.3
exact_match = 65.3
```
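
For reference, `f1` and `exact_match` are the standard SQuAD metrics; below is a minimal sketch of computing them with the `evaluate` library, using toy predictions rather than this model's actual outputs.

```python
import evaluate

squad_metric = evaluate.load("squad")

# Toy stand-ins for model predictions and gold answers.
predictions = [{"id": "0", "prediction_text": "Paris"}]
references = [{"id": "0", "answers": {"text": ["Paris"], "answer_start": [58]}}]

print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```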