---
datasets:
- wikitext-2-v1
- yizhongw/self_instruct
language:
- en
library_name: transformers
metrics:
- crossentropy
---
1. Download the `.pth` checkpoint file (a download sketch follows below).
2. Load the state dict into the model (second code block below).
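
A minimal sketch of step 1, assuming the checkpoint is hosted in the `hpcaitech/openmoe-base` repo on the Hugging Face Hub (the repo id and filename here are taken from the loading code below; verify them against the actual repo):

```python
from huggingface_hub import hf_hub_download

# Download the fine-tuned checkpoint; returns the local file path.
ckpt_path = hf_hub_download(
    repo_id="hpcaitech/openmoe-base",
    filename="openmoe_base_yizhongw_super_natural_instruction_generation.pth",
)
```

For step 2, build the model from its config and load the cleaned-up state dict: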
```python
import torch
from transformers import T5Tokenizer
from transformers.models.llama import LlamaConfig
# OpenMoeForCausalLM is not part of transformers; import it from the OpenMoE
# modeling code that ships with this model (e.g. modeling_openmoe.py).
from modeling_openmoe import OpenMoeForCausalLM

config = LlamaConfig.from_pretrained("hpcaitech/openmoe-base")  # OpenMoE reuses the LLaMA config format
model = OpenMoeForCausalLM(config)
tokenizer = T5Tokenizer.from_pretrained("google/umt5-small")  # assumption: OpenMoE pairs with the umT5 tokenizer

# Or pass ckpt_path from the download sketch above.
ckpt = torch.load("openmoe_base_yizhongw_super_natural_instruction_generation.pth", map_location="cpu")

# Strip the "module." prefix that DataParallel/DDP adds to parameter names.
state_dict = {}
for key, value in ckpt.items():
    if key.startswith("module."):
        state_dict[key[7:]] = value
    else:
        state_dict[key] = value
model.load_state_dict(state_dict)
```
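
Once the weights are loaded, a quick generation sanity check (a sketch only: the prompt and generation settings are placeholders, and this assumes the tokenizer loaded above matches the checkpoint):

```python
prompt = "Name three primary colors."
inputs = tokenizer(prompt, return_tensors="pt")
model.eval()
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```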