yzhangcs committed (verified)
Commit f1a8e95 · Parent: 04a0812

Update README.md

Files changed (1):
1. README.md (+5 -7)
README.md CHANGED
````diff
@@ -11,7 +11,7 @@ base_model:
 pipeline_tag: text-generation
 ---
 
-# rwkv7-168m-pile
+# rwkv7-168M-pile
 
 <!-- Provide a quick summary of what the model is/does. -->
 
@@ -26,7 +26,7 @@ This is RWKV-7 model under flash-linear attention format.
 
 - **Developed by:** Bo Peng, Yu Zhang, Songlin Yang, Ruochong Zhang
 - **Funded by:** Shenzhen Yuanshi Intelligent Co. Ltd.
-- **Model type:** RWKV-7
+- **Model type:** RWKV7
 - **Language(s) (NLP):** English
 - **License:** Apache-2.0
 - **Parameter count:** 165M
@@ -46,9 +46,7 @@ This is RWKV-7 model under flash-linear attention format.
 Install flash-linear-attention before using this model:
 
 ```bash
-git clone https://github.com/fla-org/flash-linear-attention
-cd flash-linear-attention
-pip install -e .
+pip install git+https://github.com/fla-org/flash-linear-attention
 ```
 
 ### Direct Use
@@ -57,8 +55,8 @@ pip install -e .
 You can use this model just as any other HuggingFace models:
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
-model = AutoModelForCausalLM.from_pretrained('fla-hub/rwkv7-168m-pile', trust_remote_code=True)
-tokenizer = AutoTokenizer.from_pretrained('fla-hub/rwkv7-168m-pile', trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained('fla-hub/rwkv7-168M-pile', trust_remote_code=True)
+tokenizer = AutoTokenizer.from_pretrained('fla-hub/rwkv7-168M-pile', trust_remote_code=True)
 ```
 
 ## Training Details
````
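Because the model is loaded with `trust_remote_code=True` and depends on flash-linear-attention being installed, it can be worth verifying the install before pulling the weights. A small sketch, assuming the pip package exposes a top-level `fla` module (the import name used by flash-linear-attention):

```python
import importlib.util

def is_installed(module: str) -> bool:
    """Check whether a top-level module can be found without importing it."""
    return importlib.util.find_spec(module) is not None

# Assumption: flash-linear-attention installs under the module name `fla`.
print("fla installed:", is_installed("fla"))
```

`find_spec` only consults the import machinery, so the check is cheap and does not trigger any CUDA/Triton initialization that importing the package itself might.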
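The README snippet stops after loading the model and tokenizer. A minimal generation sketch on top of it, using the standard `transformers` `generate` API (the prompt and `max_new_tokens` value below are illustrative assumptions, not taken from the model card):

```python
def strip_prompt(full_text: str, prompt: str) -> str:
    """Return only the newly generated continuation after the prompt."""
    return full_text[len(prompt):] if full_text.startswith(prompt) else full_text

if __name__ == "__main__":
    # Heavy import kept here so the helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "fla-hub/rwkv7-168M-pile"  # repo id as written in the updated README
    model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)
    tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)

    prompt = "The Pile is a large, diverse"
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy decoding by default; max_new_tokens is an illustrative choice.
    output_ids = model.generate(**inputs, max_new_tokens=32)
    text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    print(strip_prompt(text, prompt))
```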