---
license: other
license_name: deepseek-license
license_link: LICENSE
---
<p align="center">
<img width="1000px" alt="DeepSeek Coder" src="https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/pictures/logo.png?raw=true">
</p>
<p align="center"><a href="https://www.deepseek.com/">[🏠Homepage]</a> | <a href="https://coder.deepseek.com/">[🤖 Chat with DeepSeek Coder]</a> | <a href="https://discord.gg/Tc7c45Zzu5">[Discord]</a> | <a href="https://github.com/guoday/assert/blob/main/QR.png?raw=true">[Wechat(微信)]</a></p>
<hr>
### 1. Introduction to DeepSeek-Coder-7B-Base-v1.5
DeepSeek-Coder-7B-Base-v1.5 is continually pre-trained from DeepSeek-LLM 7B on 2T tokens, using a 4K window size and a next-token-prediction objective (a minimal sketch of this objective follows the links below).
- **Home Page:** [DeepSeek](https://deepseek.com/)
- **Repository:** [deepseek-ai/deepseek-coder](https://github.com/deepseek-ai/deepseek-coder)
- **Chat With DeepSeek Coder:** [DeepSeek-Coder](https://coder.deepseek.com/)
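
The next-token-prediction objective mentioned above is the standard causal-language-modeling loss: the model predicts token *t+1* from tokens *1..t*, and training minimizes the cross-entropy over the shifted sequence. The sketch below illustrates this with the released checkpoint; the prompt and the 4096-token truncation are illustrative assumptions, not DeepSeek's actual training code.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Illustrative sketch of next-token prediction (NOT DeepSeek's training code):
# passing labels equal to input_ids makes the model compute the shifted
# cross-entropy loss internally, i.e. predict token t+1 from tokens 1..t.
tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-7b-base-v1.5", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-7b-base-v1.5", trust_remote_code=True)

batch = tokenizer("def quick_sort(arr):", return_tensors="pt", truncation=True, max_length=4096)
with torch.no_grad():
    outputs = model(**batch, labels=batch["input_ids"])
print(outputs.loss)  # average next-token cross-entropy for this snippet
```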
### 2. How to Use
Here is an example of how to use our model.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the tokenizer and the model, then move the model onto the GPU.
tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-7b-base-v1.5", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-7b-base-v1.5", trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()

# Tokenize the prompt and move the input tensors to the model's device.
input_text = "#write a quick sort algorithm"
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)

# Generate a completion and decode it back to text.
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
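
The same generation can also be driven through the high-level `pipeline` API in `transformers`. The snippet below is an equivalent sketch rather than part of the original example; `device_map="auto"` requires the `accelerate` package, and the bfloat16 dtype and generation length are illustrative choices.

```python
import torch
from transformers import pipeline

# Equivalent sketch using the text-generation pipeline; device_map="auto"
# (via the accelerate package) places the model weights on available devices.
generator = pipeline(
    "text-generation",
    model="deepseek-ai/deepseek-coder-7b-base-v1.5",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
result = generator("#write a quick sort algorithm", max_new_tokens=128)
print(result[0]["generated_text"])
```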
### 3. License
This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to the Model License. DeepSeek Coder supports commercial use.
See the [LICENSE-MODEL](https://github.com/deepseek-ai/deepseek-coder/blob/main/LICENSE-MODEL) for more details.
### 4. Contact
If you have any questions, please raise an issue or contact us at [[email protected]](mailto:[email protected]).