noahkim committed on
Commit 5cccf6f · 1 Parent(s): e9a3236

Update README.md

Files changed (1)
  1. README.md +25 -11
README.md CHANGED
@@ -1,6 +1,9 @@
  ---
  tags:
- - generated_from_trainer
  model-index:
  - name: KoBigBird-KoBart-News-Summarization
    results: []
@@ -11,25 +14,36 @@ should probably proofread and complete it, then remove this comment. -->
 
  # KoBigBird-KoBart-News-Summarization

- This model is a fine-tuned version of [noahkim/KoBigBird-KoBart-News-Summarization](https://huggingface.co/noahkim/KoBigBird-KoBart-News-Summarization) on an unknown dataset.
- It achieves the following results on the evaluation set:
- - Loss: 4.1236

  ## Model description

- More information needed

- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data

- More information needed

- ## Training procedure

- ### Training hyperparameters

  The following hyperparameters were used during training:
  - learning_rate: 2e-05
 
  ---
+ language: ko
  tags:
+ - summarization
+ - news
+ inference: false
  model-index:
  - name: KoBigBird-KoBart-News-Summarization
    results: []
 
  # KoBigBird-KoBart-News-Summarization
 
+ This model is a fine-tuned version of [noahkim/KoBigBird-KoBart-News-Summarization](https://huggingface.co/noahkim/KoBigBird-KoBart-News-Summarization) on the [daekeun-ml/naver-news-summarization-ko](https://huggingface.co/datasets/daekeun-ml/naver-news-summarization-ko) dataset.
+
 
  ## Model description
 
+ <<20221110 Commit>>
+
+ <<KoBigBird-KoBart-News-Summarization model description>>
+ For the multi-document summarization (Multi-Document-Summarization) task, I built an encoder-decoder model around KoBigBird and trained it. I originally intended to use KoBigBird as the decoder as well, but errors occurred, so I built the model with the decoder of KoBART, which is specialized for summarization.
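+
+ As a rough sketch of how such a pairing can be assembled with the transformers EncoderDecoderModel API (this is not the exact training code of this repository, and the base checkpoints monologg/kobigbird-bert-base and gogamza/kobart-base-v2 are assumptions for illustration only):
+
+ <pre><code>
+ # Hedged sketch: pair a BigBird encoder with a BART decoder via EncoderDecoderModel.
+ # The checkpoint names are illustrative assumptions, not the ones used for this model.
+ from transformers import EncoderDecoderModel
+
+ model = EncoderDecoderModel.from_encoder_decoder_pretrained(
+     "monologg/kobigbird-bert-base",  # encoder: KoBigBird
+     "gogamza/kobart-base-v2",        # decoder: KoBART
+ )
+ </code></pre>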
 
+ To build a model specialized for news summarization for a project, I additionally fine-tuned the existing KoBigBird-KoBart-News-Summarization model on the naver-news-summarization-ko dataset kindly provided by daekeun-ml.
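+
+ A minimal sketch of loading that dataset from the Hub with the datasets library (the split and column layout should be checked on the dataset card; nothing below is taken from this repository's training script):
+
+ <pre><code>
+ # Load the Korean news summarization dataset used for fine-tuning
+ from datasets import load_dataset
+
+ dataset = load_dataset("daekeun-ml/naver-news-summarization-ko")
+ print(dataset)  # inspect the available splits and columns
+ </code></pre>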
 
+ I plan to continue training with the additional summarization data provided by AI-HUB.
+ I will keep improving this model to deliver better performance.
+ Thank you.
 
+ Runtime environment
+ - Google Colab Pro
+ - CPU: Intel(R) Xeon(R) CPU @ 2.20GHz
+ - GPU: A100-SXM4-40GB
 
+ <pre><code>
+ # Python code: load the tokenizer and the fine-tuned summarization model from the Hub
+ from transformers import AutoTokenizer
+ from transformers import AutoModelForSeq2SeqLM
+
+ tokenizer = AutoTokenizer.from_pretrained("noahkim/KoBigBird-KoBart-News-Summarization")
+ model = AutoModelForSeq2SeqLM.from_pretrained("noahkim/KoBigBird-KoBart-News-Summarization")
+ </code></pre>
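+
+ Continuing from the snippet above, a minimal usage sketch for generating a summary (the generation settings are illustrative defaults, not values taken from this repository):
+
+ <pre><code>
+ # Summarize a Korean news article with the loaded model
+ article = "..."  # replace with a Korean news article
+
+ inputs = tokenizer(article, return_tensors="pt", truncation=True)
+ summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
+ print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
+ </code></pre>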
 
 
  The following hyperparameters were used during training:
  - learning_rate: 2e-05