Commit 07580c3
Parent(s): 97afc78
Update README.md

README.md CHANGED
@@ -14,6 +14,11 @@ metrics:
 ---
 # Distilbert-Base-Uncased-Go-Emotion
 
+## Model description:
+[Distilbert](https://arxiv.org/abs/1910.01108) is created with knowledge distillation during the pre-training phase, which reduces the size of a BERT model by 40% while retaining 97% of its language-understanding capability. It is smaller and faster than BERT and other BERT-based models.
+
+[Distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) fine-tuned on the emotion dataset using the Hugging Face Trainer with the hyperparameters below.
+
 ## Training Parameters:
 ```
 Num examples = 169208
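The knowledge-distillation objective mentioned in the model description can be sketched as follows. This is a minimal illustrative example, not the actual DistilBERT pre-training code: it shows the temperature-softened soft-target loss from Hinton et al. (2015), where a small student (e.g. DistilBERT) is trained to match a larger teacher's (e.g. BERT's) output distribution. The function names, temperature value, and toy logits are all assumptions for illustration.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; a higher T softens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) over temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures.
    teacher_logits: outputs of the large model (e.g. BERT)  -- illustrative
    student_logits: outputs of the small model (e.g. DistilBERT)"""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that matches the teacher exactly incurs zero loss;
# a mismatched student incurs a positive loss.
match = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
mismatch = distillation_loss([3.0, 0.0, 0.0], [0.0, 3.0, 0.0])
```

In practice this soft-target term is combined with the ordinary hard-label cross-entropy (and, for DistilBERT, a cosine embedding loss), but the soft-target term above is what transfers the teacher's "dark knowledge".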