Update README.md
README.md CHANGED
@@ -50,7 +50,7 @@ The following hyperparameters were used during training:
 
 ### Evaluation results
 Perplexity on 2000 random examples of the target language's [Wikipedia dataset](https://huggingface.co/datasets/wikimedia/wikipedia), using the code provided in the [perplexity docs](https://huggingface.co/docs/transformers/perplexity), with a stride of 512 tokens.
-Baseline is the result from [OpenAI's GPT-2](https://huggingface.co/gpt2).
+Baseline is the result from evaluating [OpenAI's GPT-2](https://huggingface.co/gpt2) on the same examples.
 | Target language | PPL | Baseline PPL |
 |-----------------|-------------------|-------------------|
 | en |42.175106048583984 |26.562532424926758 |
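A minimal sketch of the sliding-window perplexity evaluation described in the changed section, following the approach in the linked perplexity docs. The Wikipedia config name (`20231101.en`), the shuffle seed, and the use of `gpt2` as the model are illustrative assumptions; substituting the fine-tuned checkpoint would correspond to the PPL column, while `gpt2` corresponds to the baseline column.

```python
# Sketch only: reproduces the sliding-window perplexity setup described above
# (2000 random Wikipedia examples, stride of 512 tokens). Config name, seed,
# and model id are assumptions, not taken from the original evaluation script.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # assumed: swap in the fine-tuned checkpoint for the PPL column
device = "cuda" if torch.cuda.is_available() else "cpu"

model = AutoModelForCausalLM.from_pretrained(model_id).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# 2000 random examples from the target language's Wikipedia dump (English shown here)
wiki = load_dataset("wikimedia/wikipedia", "20231101.en", split="train")
wiki = wiki.shuffle(seed=42).select(range(2000))

# Concatenate the sampled articles and tokenize them as one long sequence
encodings = tokenizer("\n\n".join(wiki["text"]), return_tensors="pt")

max_length = model.config.n_positions  # 1024 for GPT-2
stride = 512                           # stride quoted in the evaluation above
seq_len = encodings.input_ids.size(1)

nll_sum = 0.0
n_tokens = 0
prev_end = 0
for begin in range(0, seq_len, stride):
    end = min(begin + max_length, seq_len)
    trg_len = end - prev_end  # only score tokens not scored in the previous window
    input_ids = encodings.input_ids[:, begin:end].to(device)
    target_ids = input_ids.clone()
    target_ids[:, :-trg_len] = -100  # mask context tokens out of the loss

    with torch.no_grad():
        outputs = model(input_ids, labels=target_ids)
        # outputs.loss is averaged over scored positions; re-weight to get a total NLL
        num_scored = (target_ids[:, 1:] != -100).sum().item()
        nll_sum += outputs.loss.item() * num_scored
        n_tokens += num_scored

    prev_end = end
    if end == seq_len:
        break

ppl = torch.exp(torch.tensor(nll_sum / n_tokens))
print(f"Perplexity: {ppl.item():.2f}")
```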