|
|
|
|
|
|
|
> **Note**
|
> |
|
> This README was generated by GPT translation (implemented by a plugin of this project). It may not be 100% reliable, so please review the translated content carefully.
|
> |
|
> 2023.11.7: When installing dependencies, please use the **versions specified** in `requirements.txt`. Installation command: `pip install -r requirements.txt`.
|
|
|
|
|
|
|
|
|
# <div align=center><img src="logo.png" width="40"> GPT Academic Optimization (GPT Academic)</div>
|
|
|
**If you like this project, please give it a Star. If you have invented handy shortcuts or plugins, pull requests are welcome!**
|
To translate this project into any language with GPT, read and run [`multi_language.py`](multi_language.py) (experimental).
|
|
|
|
|
> **Note**
|
> |
|
> 1. Only the plugins (buttons) **highlighted in bold** can read files. Some plugins are located in the **dropdown menu** of the plugin area. In addition, we welcome any PR for a new plugin and handle it with the **highest priority**.
|
> |
|
> 2. The functions of each file in this project are described in detail in the [self-analysis report `self_analysis.md`](https://github.com/binary-husky/gpt_academic/wiki/GPTโAcademic%EC%A0%9C%ED%94%84%EB%AA%85%EC%84%B1%EB%B0%A9%EC%8B%9D%EC%9D%98_%EA%B2%B0%EA%B3%BC). As versions iterate, you can click the relevant function plugin at any time to call GPT and regenerate the project's self-analysis report. Frequently asked questions are collected in the [wiki](https://github.com/binary-husky/gpt_academic/wiki). [Standard installation method](#installation) | [One-click installation script](https://github.com/binary-husky/gpt_academic/releases) | [Configuration guide](https://github.com/binary-husky/gpt_academic/wiki/%EC%84%A4%EC%A0%95%EC%82%AC%EB%AA%85_%EA%B0%84%EB%8B%A8_%EC%84%B8%ED%8A%B8%EB%B2%84_%EC%B6%94%EA%B0%80)
|
|
|
|
|
> 3. This project supports and encourages trying Chinese large language models such as ChatGLM. Multiple API keys can be used at the same time; in the configuration file, specify them as `API_KEY="openai-key1,openai-key2,azure-key3,api2d-key4"`. To change `API_KEY` temporarily, enter the temporary `API_KEY` in the input area and press Enter to apply it.
|
|
|
|
|
|
|
|
|
|
|
<div align="center"> |
|
|
|
Feature (โญ = recently added) | Description
--- | ---
โญ[Connect new models](https://github.com/binary-husky/gpt_academic/wiki/%E5%A6%82%E4%BD%95%E5%88%87%E6%8D%A2%E6%A8%A1%E5%9E%8B)! | Baidu [Qianfan](https://cloud.baidu.com/doc/WENXINWORKSHOP/s/Nlks5zkzu) and Wenxin Yiyan, [Tongyi Qianwen](https://modelscope.cn/models/qwen/Qwen-7B-Chat/summary), Shanghai AI-Lab [Shusheng](https://github.com/InternLM/InternLM), Xunfei [Spark](https://xinghuo.xfyun.cn/), [LLaMa2](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf), Zhipu API, DALLE3
Polishing, translation, code explanation | One-click polishing, translation, finding grammar errors in papers, explaining code
[Custom shortcut keys](https://www.bilibili.com/video/BV14s4y1E7jN) | Support for custom shortcut keys
Modular design | Support for powerful, customizable [plugins](https://github.com/binary-husky/gpt_academic/tree/master/crazy_functions); plugins support [hot updates](https://github.com/binary-husky/gpt_academic/wiki/%E5%87%BD%E6%95%B0%E6%8F%92%E4%BB%B6%E6%8C%87%E5%8D%97)
[Program analysis](https://www.bilibili.com/video/BV1cj411A7VW) | [Plugin] Analyze a Python/C/C++/Java/Lua/... project tree in one click, or run a [self-analysis](https://www.bilibili.com/video/BV1cj411A7VW)
Paper reading, paper [translation](https://www.bilibili.com/video/BV1KT411x7Wn) | [Plugin] Interpret the full text of a LaTeX/PDF paper and generate an abstract in one click
Full LaTeX [translation](https://www.bilibili.com/video/BV1nk4y1Y7Js/), [polishing](https://www.bilibili.com/video/BV1FT411H7c5/) | [Plugin] Translate or polish a LaTeX paper in one click
Batch comment generation | [Plugin] Generate function comments in batch with one click
Markdown [Korean-English translation](https://www.bilibili.com/video/BV1yo4y157jV/) | Have you seen the [README](https://github.com/binary-husky/gpt_academic/blob/master/docs/README_EN.md) in the five languages above?
Chat analysis report generation | [Plugin] Automatically generate a summary report after a run
[Full PDF paper translation](https://www.bilibili.com/video/BV1KT411x7Wn) | [Plugin] Extract the title and abstract of a PDF paper and translate the full text (multi-threaded)
[Arxiv assistant](https://www.bilibili.com/video/BV1LM4y1279X) | [Plugin] Enter an arxiv paper URL to translate the abstract and download the PDF in one click
One-click LaTeX paper proofreading | [Plugin] Grammar and spelling correction of LaTeX papers in the style of Grammarly, plus a side-by-side comparison PDF
[Google Scholar integration assistant](https://www.bilibili.com/video/BV19L411U7ia) | Given any Google Scholar search page URL, let GPT [write the related works section](https://www.bilibili.com/video/BV1GP411U7Az/) for you
Internet information aggregation + GPT | [Plugin] Let GPT [fetch information from the Internet](https://www.bilibili.com/video/BV1om4y127ck) before answering questions, so the information never goes stale
โญArxiv paper fine translation ([Docker](https://github.com/binary-husky/gpt_academic/pkgs/container/gpt_academic_with_latex)) | [Plugin] [Translate arxiv papers with high quality](https://www.bilibili.com/video/BV1dz4y1v77A/), the best paper translation tool available
โญ[Real-time voice conversation input](https://github.com/binary-husky/gpt_academic/blob/master/docs/use_audio.md) | [Plugin] Asynchronously [monitor audio](https://www.bilibili.com/video/BV1AV4y187Uy/), segment sentences automatically, and find the right moment to answer
Formula/image/table display | Show formulas in both [TeX and rendered form](https://user-images.githubusercontent.com/96192199/230598842-1d7fcddd-815d-40ee-af60-baf488a199df.png) at the same time; support formula and code highlighting
โญAutoGen multi-agent plugin | [Plugin] Explore the possibilities of multi-agent emergent intelligence with Microsoft AutoGen!
Dark mode support | Append `/?__theme=dark` to the browser URL to switch to dark mode
[Multiple LLM models](https://www.bilibili.com/video/BV1wT411p7yf) support | Being served by GPT3.5, GPT4, [Tsinghua ChatGLM2](https://github.com/THUDM/ChatGLM2-6B), and [Fudan MOSS](https://github.com/OpenLMLab/MOSS) at the same time feels great, doesn't it?
โญChatGLM2 fine-tuned model | Load a fine-tuned ChatGLM2 model; a ChatGLM2 fine-tuning assistant plugin is provided
More LLM models, [huggingface deployment](https://huggingface.co/spaces/qingxu98/gpt-academic) support | Newbing interface (New Bing); introduce Tsinghua [Jittorllms](https://github.com/Jittor/JittorLLMs), supporting [LLaMA](https://github.com/facebookresearch/llama) and [Pangu-alpha](https://openi.org.cn/pangu/)
โญ[void-terminal](https://github.com/binary-husky/void-terminal) pip package | Call all function plugins of this project directly from Python, without the GUI (under development)
โญVoid Terminal plugin | [Plugin] Command this project's other plugins directly in natural language
More new features (image generation, etc.) ...... | See the end of this document ......
|
</div> |
|
|
|
|
|
- New interface (switch between the "left-right layout" and the "top-down layout" by modifying the LAYOUT option in `config.py`)
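For reference, a minimal sketch of the relevant `config.py` line; the option values below follow the comments shipped with `config.py`, so verify them against your local copy:

```python
# config.py -- interface layout (illustrative values; check the comments in your own config.py)
LAYOUT = "LEFT-RIGHT"   # "LEFT-RIGHT" = side-by-side layout, "TOP-DOWN" = stacked layout
```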
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/d81137c3-affd-4cd1-bb5e-b15610389762" width="700" > |
|
</div> |
|
|
|
|
|
- All buttons are generated dynamically by reading functional.py, so custom functions can be added freely, liberating your clipboard.
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/231975334-b4788e91-4887-412f-8b43-2b9c5f41d248.gif" width="700" > |
|
</div> |
|
|
|
- Polishing/error correction
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/231980294-f374bdcb-3309-4560-b424-38ef39f04ebd.gif" width="700" > |
|
</div> |
|
|
|
|
|
|
|
- If the output contains equations, they will be displayed in both tex format and rendered format for easy copying and reading. |
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/230598842-1d7fcddd-815d-40ee-af60-baf488a199df.png" width="700" > |
|
</div> |
|
|
|
- Don't feel like looking at the project code? Just give it to ChatGPT and let it dazzle you. |
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/226935232-6b6a73ce-8900-4aee-93f9-733c7e6fef53.png" width="700" > |
|
</div> |
|
|
|
- Mix and match multiple powerful language models (ChatGLM + OpenAI-GPT3.5 + [API2D](https://api2d.com/)-GPT4) |
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/232537274-deca0563-7aa6-4b5d-94a2-b7c453c47794.png" width="700" > |
|
</div> |
|
|
|
# Installation |
|
### Installation Method I: Run Directly (Windows, Linux, or macOS)
|
|
|
1. Download the project |
|
```sh |
|
git clone --depth=1 https://github.com/binary-husky/gpt_academic.git |
|
cd gpt_academic |
|
``` |
|
|
|
2. Configure API_KEY |
|
|
|
In `config.py`, configure the API KEY and other settings ([click here to see how to configure it in special network environments](https://github.com/binary-husky/gpt_academic/issues/1)). See also the [Wiki page](https://github.com/binary-husky/gpt_academic/wiki/้กน็ฎ้
็ฝฎ่ฏดๆ).

The program will first check whether a private configuration file named `config_private.py` exists and use its values to override the options of the same name in `config.py`. If you understand this reading logic, we strongly recommend creating a new configuration file named `config_private.py` next to `config.py` and moving (copying) the options from `config.py` into `config_private.py` (copy only the items you have modified).

You can also configure the project through `environment variables`. The format of the environment variables can be found in the `docker-compose.yml` file or on our [Wiki page](https://github.com/binary-husky/gpt_academic/wiki/้กน็ฎ้
็ฝฎ่ฏดๆ). The configuration is read with the priority: `environment variables` > `config_private.py` > `config.py`.
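As a minimal, hedged sketch of this override mechanism (all values below are placeholders; copy over only the options you actually change):

```python
# config_private.py -- placed next to config.py; options here override same-named options in config.py.
# All values below are placeholders, not working credentials.
API_KEY = "openai-key1,openai-key2,azure-key3,api2d-key4"  # one or several keys, comma-separated
USE_PROXY = True                                           # only if you need a proxy
proxies = {"http": "socks5h://localhost:11284", "https": "socks5h://localhost:11284"}
```

The same options can also be supplied as environment variables (for example in `docker-compose.yml`), which take priority over both files.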
|
|
|
3. Install dependencies |
|
```sh |
|
# (Option I: if familiar with python, python>=3.9) Note: Use the official pip source or Aliyun pip source. Temporary switching source method: python -m pip install -r requirements.txt -i https://mirrors.aliyun.com/pypi/simple/ |
|
python -m pip install -r requirements.txt |
|
|
|
# (Option II: using Anaconda) The steps are similar (https://www.bilibili.com/video/BV1rc411W7Dr): |
|
conda create -n gptac_venv python=3.11 # Create an Anaconda environment |
|
conda activate gptac_venv # Activate the Anaconda environment |
|
python -m pip install -r requirements.txt # This step is the same as the pip installation step |
|
``` |
|
|
|
|
|
<details><summary>Click here to expand if you need support for Tsinghua ChatGLM2/Fudan MOSS/RWKV backend</summary> |
|
<p> |
|
|
|
[Optional Step] If you need support for Tsinghua ChatGLM2/Fudan MOSS as the backend, you need to install additional dependencies (Prerequisites: Familiar with Python + Have used Pytorch + Sufficient computer configuration): |
|
```sh |
|
# [Optional Step I] Support for Tsinghua ChatGLM2. Note for Tsinghua ChatGLM: If you encounter the error "Call ChatGLM fail cannot load ChatGLM parameters", refer to the following: 1: The default installation above is the torch+cpu version. To use cuda, uninstall torch and reinstall torch+cuda; 2: If you cannot load the model due to insufficient computer configuration, you can modify the model precision in request_llms/bridge_chatglm.py, change AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True) to AutoTokenizer.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True)
|
python -m pip install -r request_llms/requirements_chatglm.txt |
|
|
|
# [Optional Step II] Support for Fudan MOSS |
|
python -m pip install -r request_llms/requirements_moss.txt |
|
git clone --depth=1 https://github.com/OpenLMLab/MOSS.git request_llms/moss # When executing this line of code, make sure you are in the project root path |
|
|
|
# [Optional Step III] Support for RWKV Runner |
|
Refer to the wiki: https://github.com/binary-husky/gpt_academic/wiki/%E9%80%82%E9%85%8DRWKV-Runner |
|
|
|
# [Optional Step IV] Make sure that the AVAIL_LLM_MODELS in the config.py configuration file includes the expected models. The currently supported models are as follows (the jittorllms series only supports the docker solution): |
|
AVAIL_LLM_MODELS = ["gpt-3.5-turbo", "api2d-gpt-3.5-turbo", "gpt-4", "api2d-gpt-4", "chatglm", "moss"] # + ["jittorllms_rwkv", "jittorllms_pangualpha", "jittorllms_llama"] |
|
``` |
|
|
|
</p> |
|
</details> |
|
|
|
|
|
|
|
4. Run |
|
```sh |
|
python main.py |
|
``` |
|
|
|
### Installation Method II: Use Docker |
|
|
|
0. Deploy all the capabilities of the project (this is a large image that includes CUDA and LaTeX; however, it is not recommended if your Internet speed is slow or your hard disk is small)
|
[](https://github.com/binary-husky/gpt_academic/actions/workflows/build-with-all-capacity.yml) |
|
|
|
``` sh |
|
# Modify docker-compose.yml, keep scheme 0 and delete the others. Then run: |
|
docker-compose up |
|
``` |
|
|
|
1. ChatGPT + Wenxin Yiyan + Wikipedia Summary + Spark and other online models (recommended for most people)
|
[](https://github.com/binary-husky/gpt_academic/actions/workflows/build-without-local-llms.yml) |
|
[](https://github.com/binary-husky/gpt_academic/actions/workflows/build-with-latex.yml) |
|
[](https://github.com/binary-husky/gpt_academic/actions/workflows/build-with-audio-assistant.yml) |
|
|
|
``` sh |
|
# Modify docker-compose.yml, keep scheme 1 and delete the others. Then run: |
|
docker-compose up |
|
``` |
|
|
|
P.S. If you need the Latex plugin feature, please refer to the Wiki. Additionally, you can also use scheme 4 or scheme 0 directly to get the Latex feature. |
|
|
|
2. ChatGPT + ChatGLM2 + MOSS + LLAMA2 + Tongyi Qianwen (requires familiarity with the [Nvidia Docker](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#installing-on-ubuntu-and-debian) runtime)
|
[](https://github.com/binary-husky/gpt_academic/actions/workflows/build-with-chatglm.yml) |
|
|
|
``` sh |
|
# Modify docker-compose.yml, keep scheme 2 and delete the others. Then run: |
|
docker-compose up |
|
``` |
|
|
|
|
|
### Installation Method III: Other Deployment Methods |
|
1. **One-click run script for Windows**. |
|
Windows users who are completely unfamiliar with the Python environment can download the one-click run script without local models from the [Release](https://github.com/binary-husky/gpt_academic/releases) section. |
|
The script contribution comes from [oobabooga](https://github.com/oobabooga/one-click-installers). |
|
|
|
2. Use third-party APIs, Azure, Wenxin Yiyan, Spark, etc.; see the [Wiki page](https://github.com/binary-husky/gpt_academic/wiki/้กน็ฎ้
็ฝฎ่ฏดๆ).
|
|
|
3. Pitfall guide for remote deployment on cloud servers. |
|
Please visit the [cloud server remote deployment wiki](https://github.com/binary-husky/gpt_academic/wiki/%E4%BA%91%E6%9C%8D%E5%8A%A1%E5%99%A8%E8%BF%9C%E7%A8%8B%E9%83%A8%E7%BD%B2%E6%8C%87%E5%8D%97) |
|
|
|
4. Some new deployment platforms or methods |
|
- Use Sealos for [one-click deployment](https://github.com/binary-husky/gpt_academic/issues/993). |
|
- Use WSL2 (Windows Subsystem for Linux). Please visit [deployment wiki-2](https://github.com/binary-husky/gpt_academic/wiki/%E4%BD%BF%E7%94%A8WSL2%EF%BC%88Windows-Subsystem-for-Linux-%E5%AD%90%E7%B3%BB%E7%BB%9F%EF%BC%89%E9%83%A8%E7%BD%B2) |
|
- How to run in a subpath (such as `http://localhost/subpath`). Please refer to [FastAPI running instructions](docs/WithFastapi.md) |
|
|
|
|
|
|
|
# Advanced Usage
|
### I: Custom Convenient Buttons (Academic Shortcuts)
|
Open `core_functional.py` with any text editor, add an entry like the one below, and restart the program. (If the button already exists, both the prefix and the suffix can be modified on the fly without restarting the program.)
|
For example:
|
```
"Super English Translation": {
    # Prefix: added before your input. For example, use it to describe the request, such as translation, code explanation, or proofreading.
    "Prefix": "Please translate the following content into Korean, then use a markdown table to explain the technical terms:\n\n",

    # Suffix: added after your input. For example, combined with the prefix, it can wrap your input in quotation marks.
    "Suffix": "",
},
|
``` |
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/226899272-477c2134-ed71-4326-810c-29891fe4a508.png" width="500" > |
|
</div> |
|
|
|
### II: Custom Function Plugins
|
Write powerful function plugins to perform any task you can or cannot imagine.

Writing and debugging plugins for this project has a low barrier to entry. As long as you have basic Python knowledge, you can implement your own plugin based on the templates we provide.

For details, please refer to the [Function Plugin Guide](https://github.com/binary-husky/gpt_academic/wiki/%E5%87%BD%E6%95%B0%E6%8F%92%E4%BB%B6%E6%8C%87%E5%8D%97).
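For orientation, here is a minimal sketch of what a function plugin can look like, modeled on the demo template under `crazy_functions/`. The signature and the `toolbox` helpers are assumptions based on that template and may differ between versions, so copy from the real template when writing your own:

```python
# Sketch only: the signature and imports mirror the demo plugin template; verify against crazy_functions/.
from toolbox import CatchException, update_ui  # helpers provided by this project (assumed import path)

@CatchException
def echo_demo_plugin(txt, llm_kwargs, plugin_kwargs, chatbot, history, system_prompt, web_port):
    """
    txt     : the text currently in the input area
    chatbot : handle to the chat display; append (question, answer) pairs to it
    history : previous conversation turns
    """
    chatbot.append((txt, f"Demo plugin received: {txt!r}. Replace this with your own logic."))
    yield from update_ui(chatbot=chatbot, history=history)  # push the update to the front end
```

A new plugin also has to be registered before it shows up as a button or dropdown entry; the Function Plugin Guide linked above covers that step.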
|
|
|
|
|
# Updates
|
### I: Dynamics
|
|
|
1. Conversation saving. Call `Save current conversation` in the function plugin area to save the current conversation as a readable and restorable HTML file.

   In addition, call `Load conversation history archive` in the function plugin area to restore a previous session.

   Tip: clicking `Load conversation history archive` without specifying a file lets you browse the cached HTML archives.
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/235222390-24a9acc0-680f-49f5-bc81-2f3161f1e049.png" width="500" > |
|
</div> |
|
|
|
2. โญLatex/Arxiv paper translation featureโญ
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/002a1a75-ace0-4e6a-94e2-ec1406a746f1" height="250" > ===> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/9fdcc391-f823-464f-9322-f8719677043b" height="250" > |
|
</div> |
|
|
|
3. Void Terminal (understanding user intent from natural-language input and automatically calling other plugins)
|
|
|
- Step 1: Type "Please call the plugin to translate the PDF paper, the address is https://openreview.net/pdf?id=rJl0r3R9KX"
|
- Step 2: Click "Void Terminal"
|
|
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/66f1b044-e9ff-4eed-9126-5d4f3668f1ed" width="500" > |
|
</div> |
|
|
|
4. Modular function design: simple interfaces that support powerful functions
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/229288270-093643c1-0018-487a-81e6-1d7809b6e90f.png" height="400" > |
|
<img src="https://user-images.githubusercontent.com/96192199/227504931-19955f78-45cd-4d1c-adac-e71e50957915.png" height="400" > |
|
</div> |
|
|
|
5. Translating other open-source projects
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/226935232-6b6a73ce-8900-4aee-93f9-733c7e6fef53.png" height="250" > |
|
<img src="https://user-images.githubusercontent.com/96192199/226969067-968a27c1-1b9c-486b-8b81-ab2de8d3f88a.png" height="250" > |
|
</div> |
|
|
|
6. Decorative features from [live2d](https://github.com/fghrsh/live2d_demo) (disabled by default; to enable it, modify `config.py` as sketched below the screenshot)
|
<div align="center"> |
|
<img src="https://user-images.githubusercontent.com/96192199/236432361-67739153-73e8-43fe-8111-b61296edabd9.png" width="500" > |
|
</div> |
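To enable the live2d decoration mentioned in item 6, flip the corresponding switch in `config.py`. The option name below is an assumption taken from the project's configuration file and may change between versions, so double-check it in your own `config.py`:

```python
# config.py -- live2d decoration (option name assumed; verify it exists in your config.py)
ADD_WAIFU = True   # off by default; set to True to show the live2d widget
```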
|
|
|
7. OpenAI image generation
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/bc7ab234-ad90-48a0-8d62-f703d9e74665" width="500" > |
|
</div> |
|
|
|
8. OpenAI audio parsing and summarization
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/709ccf95-3aee-498a-934a-e1c22d3d5d5b" width="500" > |
|
</div> |
|
|
|
9. Full-text LaTeX error correction
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/651ccd98-02c9-4464-91e1-77a6b7d1b033" height="200" > ===> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/476f66d9-7716-4537-b5c1-735372c25adb" height="200"> |
|
</div> |
|
|
|
10. Language and theme switching
|
<div align="center"> |
|
<img src="https://github.com/binary-husky/gpt_academic/assets/96192199/b6799499-b6fb-4f0c-9c8e-1b441872f4e8" width="500" > |
|
</div> |
|
|
|
|
|
|
|
### II: Versions:
|
- Version 3.70 (to do): Improve the AutoGen plugin theme and design a series of related plugins

- Version 3.60: Introduce AutoGen as the cornerstone of the new generation of plugins

- Version 3.57: Support GLM3, Spark v3, and Wenxin Yiyan v4; fix concurrency bugs in local models

- Version 3.56: Dynamically add basic function buttons; new report-summary PDF page

- Version 3.55: Refactor the front-end interface; introduce a floating window and a menu bar

- Version 3.54: Add a new dynamic code interpreter (Code Interpreter) (still being refined)

- Version 3.53: Dynamically select different interface themes; improve stability; resolve multi-user conflicts

- Version 3.50: Call all function plugins of this project with natural language (Void Terminal); support plugin categories; improve the UI; design new themes

- Version 3.49: Support the Baidu Qianfan platform and Wenxin Yiyan

- Version 3.48: Support Alibaba DAMO Academy Tongyi Qianwen, Shanghai AI-Lab Shusheng, and Xunfei Spark

- Version 3.46: Support fully hands-off real-time voice conversation

- Version 3.45: Support custom ChatGLM2 fine-tuned models

- Version 3.44: Officially support Azure; improve the usability of the interface

- Version 3.4: + arxiv paper translation and LaTeX paper proofreading

- Version 3.3: + Internet information aggregation

- Version 3.2: Function plugins support more parameter interfaces (conversation saving, interpretation of code in any language, simultaneous queries to any combination of LLMs)

- Version 3.1: Support querying multiple GPT models at the same time! Support api2d and load balancing across multiple API keys

- Version 3.0: Support chatglm and other small LLMs

- Version 2.6: Restructure the plugin architecture to improve interactivity; add more plugins

- Version 2.5: Auto-update; fix the problem of overly long text and token overflow when summarizing source code

- Version 2.4: (1) Add full PDF translation; (2) add the option to switch the position of the input area; (3) add a vertical layout option; (4) optimize multi-threaded function plugins

- Version 2.3: Improve multi-threaded interactivity

- Version 2.2: Function plugins support hot reloading

- Version 2.1: Collapsible layout

- Version 2.0: Introduce modular function plugins

- Version 1.0: Basic functions
|
|
|
GPT Academic developer QQ group: `610599535`
|
- Known issues
|
- Some browser translation extensions interfere with the front end of this software.
|
- The official Gradio release has many compatibility issues, so please install Gradio via `requirements.txt`.
|
|
|
### III: Themes

You can change the theme by modifying the `THEME` option in `config.py`.
|
1. `Chuanhu-Small-and-Beautiful` [URL](https://github.com/GaiZhenbiao/ChuanhuChatGPT/) |
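A minimal sketch of the corresponding line (the value is the theme listed above; other available theme names are documented in `config.py` itself):

```python
# config.py -- switch the interface theme (illustrative)
THEME = "Chuanhu-Small-and-Beautiful"
```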
|
|
|
|
|
### IV: Development Branches of This Project
|
|
|
1. `master` branch: main branch, stable version
|
2. `frontier` branch: development branch, test version
|
|
|
|
|
### V: References and Learning
|
|
|
``` |
|
The code references the designs of many other excellent projects, in no particular order:
|
|
|
# Tsinghua ChatGLM2-6B:
|
https://github.com/THUDM/ChatGLM2-6B |
|
|
|
# Tsinghua JittorLLMs:
|
https://github.com/Jittor/JittorLLMs |
|
|
|
# ChatPaper: |
|
https://github.com/kaixindelele/ChatPaper |
|
|
|
# Edge-GPT: |
|
https://github.com/acheong08/EdgeGPT |
|
|
|
# ChuanhuChatGPT: |
|
https://github.com/GaiZhenbiao/ChuanhuChatGPT |
|
|
|
|
|
|
|
# Oobabooga one-click installer:
|
https://github.com/oobabooga/one-click-installers |
|
|
|
# More:
|
https://github.com/gradio-app/gradio |
|
https://github.com/fghrsh/live2d_demo |
|
|