## Fine-tuned Model Description: GPT-3 fine-tuned on Multi-XScience

The open-source version of GPT-3, GPT-Neo (125M), has been fine-tuned on "Multi-XScience", a large-scale dataset for extreme multi-document summarization of scientific articles: [Multi-XScience_Repository](https://github.com/yaolu/Multi-XScience).
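
The training code itself isn't included here, but as a rough illustration, a fine-tune like this could be reproduced with the Hugging Face `Trainer`. This is a minimal sketch, assuming the `transformers` and `datasets` libraries, the `multi_x_science_sum` mirror of the dataset on the Hugging Face Hub, and illustrative hyperparameters (the exact settings used for this model are not documented in this README):

```python
# Minimal fine-tuning sketch (hypothetical hyperparameters and preprocessing).
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
tokenizer.pad_token = tokenizer.eos_token  # GPT-Neo ships without a pad token
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125M")

# Multi-XScience is mirrored on the Hugging Face Hub as "multi_x_science_sum".
dataset = load_dataset("multi_x_science_sum", split="train")

def tokenize(batch):
    # Treat the related-work sections as plain causal-LM training text.
    return tokenizer(batch["related_work"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt-neo-125M-multi-xscience", num_train_epochs=3),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```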

I first fine-tuned the model and then deployed it using Google's "Material Design" (on Anvil): [Abir Scientific text Generator](https://abir-scientific-text-generator.anvil.app/)
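
One common way to wire an Anvil app to a model is Anvil's Uplink: the web front end calls back into a Python process that holds the model. A minimal sketch under that assumption; the uplink key, function name, and checkpoint path below are placeholders, not the actual deployment code:

```python
# Serve the fine-tuned model to the Anvil front end via Anvil Uplink.
import anvil.server
from transformers import pipeline

# Hypothetical path to the fine-tuned checkpoint from the sketch above.
generator = pipeline("text-generation", model="gpt-neo-125M-multi-xscience")

anvil.server.connect("YOUR-UPLINK-KEY")  # key from the Anvil app's Uplink settings

@anvil.server.callable
def generate_text(prompt):
    # Invoked from the Material Design UI when the user submits a prompt.
    return generator(prompt, max_length=100, do_sample=True)[0]["generated_text"]

anvil.server.wait_forever()  # keep the process alive to answer app requests
```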

By fine-tuning GPT-Neo (the open-source version of GPT-3) on the Multi-XScience dataset, the model is now able to generate scientific text, even better than GPT-J (6B).

Try the prompt "attention is all" on both my [Abir Scientific text Generator](https://abir-scientific-text-generator.anvil.app/) and the [GPT-J Eleuther.ai Demo](https://6b.eleuther.ai/) to see what I mean.
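
The same comparison can also be run locally against the fine-tuned checkpoint. A minimal generation sketch, reusing the hypothetical checkpoint path from the fine-tuning sketch above:

```python
from transformers import pipeline

# Hypothetical path to the fine-tuned checkpoint.
generator = pipeline("text-generation", model="gpt-neo-125M-multi-xscience")

result = generator(
    "attention is all",  # the prompt suggested above
    max_length=60,
    do_sample=True,
    temperature=0.9,
)
print(result[0]["generated_text"])
```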

And here's a demonstration video: [Real-time Video Demonstration](https://www.youtube.com/watch?v=XP8uZfnCYQI)