CountingMstar committed
Commit efdb5f5 · 1 Parent(s): 499634c

Update app.py

Files changed (1)
  app.py +13 -9
app.py CHANGED
@@ -32,7 +32,7 @@ examples = [
     ["It calculates soft weights for each word, more precisely for its embedding, in the context window. It can do it either in parallel (such as in transformers) or sequentially (such as recurrent neural networks). Soft weights can change during each runtime, in contrast to hard weights, which are (pre-)trained and fine-tuned and remain frozen afterwards. Attention was developed to address the weaknesses of recurrent neural networks, where words in a sentence are slowly processed one at a time. Machine learning-based attention is a mechanism mimicking cognitive attention. Recurrent neural networks favor more recent words at the end of a sentence while earlier words fade away in volatile neural activations. Attention gives all words equal access to any part of a sentence in a faster parallel scheme and no longer suffers the wait time of serial processing. Earlier uses attached this mechanism to a serial recurrent neural network's language translation system (below), but later uses in Transformers large language models removed the recurrent neural network and relied heavily on the faster parallel attention scheme.", "What is Attention mechanism?"]
 ]
 
-markdown_text = """
+gr.Markdown("""
 # AI Tutor BERT
 This model is a BERT model fine-tuned on artificial intelligence (AI) terms and their explanations.
 ## Model
@@ -40,21 +40,25 @@ markdown_text = """
 For the model, we used BERT, developed by Google and the best-known natural language processing model. Please refer to the site above for a detailed description. Befitting a tutor whose main job is question answering, we used the Question and Answering variant of BERT, which is specialized for that task.
 
 ## Dataset
-### Wikipedia
-https://en.wikipedia.org/wiki/Main_Page
-### activeloop
-https://www.activeloop.ai/resources/glossary/arima-models/
-### Adrien Beaulieu
-https://product.house/100-ai-glossary-terms-explained-to-the-rest-of-us/
+[Wikipedia] https://en.wikipedia.org/wiki/Main_Page
+[activeloop] https://www.activeloop.ai/resources/glossary/arima-models/
+[Adrien Beaulieu] https://product.house/100-ai-glossary-terms-explained-to-the-rest-of-us/
 The training dataset consists of three parts: AI-related contexts, questions, and answers. The answer (ground truth) is contained within the context, and the data was augmented by shuffling the sentence order of each context. Each question is set to the AI term the passage is about; the examples above should make this easy to follow. There are roughly 3,300 such examples, stored as pickle files in the data folder. The data was produced by extracting and processing HTML from Wikipedia and the other sites listed above.
+
 ## How to use
 Input examples are provided under 'Examples'.
 Enter a related passage in `Contexts` and the word you want defined in `Question`, then press the `Submit` button to get an explanation of that word.
-"""
+""")
+
+input_textbox = gr.Textbox("Context", placeholder="Enter context here")
+question_textbox = gr.Textbox("Question", placeholder="Enter question here")
+
+input_section = gr.Row([input_textbox, question_textbox])
+
 
 iface = gr.Interface(
     fn=submit,
-    inputs=[gr.Textbox("Context"), gr.Textbox("Question"), gr.Markdown(markdown_text)],
+    inputs=input_section,
     outputs=gr.Textbox("Answer"),
     examples=examples,
     live=True,  # Set live to True to use the submit button
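A caveat on the new wiring: in current Gradio releases, `gr.Interface` expects `inputs` to be a component or a list of components, and `gr.Markdown`/`gr.Row` instantiated at module level outside a `gr.Blocks` context are not rendered, so the layout above may not behave as intended. A minimal sketch of an equivalent layout using `gr.Blocks`, with a stub `submit` standing in for the app's real handler:

```python
import gradio as gr

# Hypothetical stand-in for the app's submit(); the real one runs the
# fine-tuned BERT QA model over (context, question).
def submit(context, question):
    return f"(answer for '{question}' given the context)"

with gr.Blocks() as demo:
    gr.Markdown("# AI Tutor BERT\nEnter a context and a question, then press Submit.")
    with gr.Row():  # side-by-side inputs, as the commit's gr.Row intends
        context_box = gr.Textbox(label="Context", placeholder="Enter context here")
        question_box = gr.Textbox(label="Question", placeholder="Enter question here")
    answer_box = gr.Textbox(label="Answer")
    gr.Button("Submit").click(submit, [context_box, question_box], answer_box)

demo.launch()
```

Note that the first positional argument of `gr.Textbox` is its initial value, not its label, so `gr.Textbox("Context")` pre-fills the box with the text "Context"; `label=` is the keyword for captioning.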
 
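For reference, the `submit` handler the interface binds to could be built on an extractive BERT question-answering model. A sketch using the Hugging Face `transformers` pipeline, with an illustrative public checkpoint rather than this Space's actual fine-tuned weights:

```python
from transformers import pipeline

# Illustrative checkpoint; the Space presumably loads its own fine-tuned BERT QA weights.
qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")

def submit(context, question):
    # Extractive QA: returns the answer span found inside the context.
    result = qa(question=question, context=context)
    return result["answer"]

print(submit("Attention gives all words equal access to any part of a sentence "
             "in a faster parallel scheme.",
             "What is Attention mechanism?"))
```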