Update README.md
README.md
CHANGED
@@ -55,7 +55,7 @@ This pretraining data will not be opened to public due to Twitter policy.
 | `code-mixed-ijeroberta` | RoBERTa | 2.24 GB of text | 249 MB of text |

 ## Evaluation Results

-We train the data with 3 epochs and total steps of
+We train the model for 3 epochs, a total of 296K steps, over 16 days.
 The following are the results obtained from the training:

 | train loss | eval loss | eval perplexity |
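
The eval perplexity column in the results table is presumably derived from the eval (cross-entropy) loss as perplexity = exp(loss), the usual convention for masked language models such as RoBERTa. A minimal sketch of that relationship, using an illustrative loss value rather than the README's actual numbers:

```python
import math

# Illustrative value only; the README's results table holds the real numbers.
eval_loss = 2.0  # mean cross-entropy per token, in nats

# Perplexity is the exponential of the cross-entropy loss.
eval_perplexity = math.exp(eval_loss)

print(f"eval loss: {eval_loss:.4f}, eval perplexity: {eval_perplexity:.2f}")
```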