Update README.md
---
library_name: transformers
pipeline_tag: summarization
datasets:
- gopalkalpande/bbc-news-summary
language:
- en
metrics:
- rouge
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

The T5_small_lecture_summarization model is a variant of the T5 (Text-to-Text Transfer Transformer) architecture, designed for summarization tasks.

## Model Details
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** Hyun Lee
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** Google/T5
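
As a text-to-text checkpoint, the model can be called through the 🤗 transformers summarization pipeline. A minimal sketch, assuming the Hub repo id `Lucas-Hyun-Lee/T5_small_lecture_summarization` mentioned on this card and the usual T5-style task prefix (the exact prefix used during fine-tuning may differ):

```python
from transformers import pipeline


def build_prompt(text: str) -> str:
    # T5-style checkpoints typically expect a task prefix before the input.
    return "summarize: " + text.strip()


# Repo id taken from this model card; adjust if the actual id differs.
summarizer = pipeline(
    "summarization",
    model="Lucas-Hyun-Lee/T5_small_lecture_summarization",
)

lecture = (
    "Transformers process sequences with self-attention, letting every token "
    "attend to every other token. This removes the recurrence of RNNs and "
    "enables highly parallel training on modern accelerators."
)

result = summarizer(build_prompt(lecture), max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

Greedy decoding (`do_sample=False`) keeps the output deterministic; raise `max_length` for longer lectures.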

- **Architecture:** The model is based on the T5 architecture, a transformer-based neural network. Transformers have proven effective for a wide range of natural language processing (NLP) tasks thanks to their attention mechanisms and ability to capture contextual information.
- **Task:** The primary purpose of this model is lecture summarization: given a lecture or other long text, it generates a concise summary that captures the essential points. This can be valuable for students, researchers, or anyone seeking condensed information.
- **Input Format:** The model expects input in a text-to-text format: you provide a prompt (e.g., the lecture content) and specify the desired task (e.g., "summarize"). The model then generates a summary as the output.
- **Fine-Tuning:** The Lucas-Hyun-Lee/T5_small_lecture_summarization model has likely undergone fine-tuning on lecture-specific data. During fine-tuning, it learns to optimize its parameters for summarization by minimizing a loss function.
- **Model Size:** As the name suggests, this is a small-sized variant of T5. Smaller models are computationally efficient and suitable for scenarios where memory or processing power is limited.
- **Performance:** The model's performance depends on the quality and diversity of the training data, as well as the specific lecture content it encounters during fine-tuning. It should be evaluated with metrics such as ROUGE (Recall-Oriented Understudy for Gisting Evaluation) scores.
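
The ROUGE scores mentioned above measure n-gram overlap between a generated summary and a reference summary. A minimal sketch of ROUGE-1 F1 (unigram overlap only; the official scorer also handles stemming, multiple references, and variants like ROUGE-L):

```python
from collections import Counter


def rouge1_f1(candidate: str, reference: str) -> float:
    """Minimal ROUGE-1 F1: clipped unigram overlap between candidate and reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # each unigram counted at most min(cand, ref) times
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


print(round(rouge1_f1("the model summarizes lectures",
                      "the model summarizes long lectures"), 3))
# 4 of 4 candidate unigrams match (precision 1.0), 4 of 5 reference unigrams
# are covered (recall 0.8), so F1 = 2 * 1.0 * 0.8 / 1.8
```

Higher recall means more of the reference summary's content was recovered; precision penalizes padding the summary with extra words.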
41 |
|
42 |
### Model Sources [optional]
|
43 |
|