jieliu/Qwen2-7B-Instruct-DPO-score-diff-2-beta0.5
Transformers · Safetensors · arXiv: 1910.09700
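The card itself carries no usage instructions, so the following is a minimal sketch, assuming the checkpoint follows the standard Qwen2 instruct causal-LM layout suggested by the Transformers and Safetensors tags; the loading class and chat-template usage are assumptions, not confirmed by the repo.

```python
# Minimal sketch (assumption): load the checkpoint as a Qwen2-style causal LM
# with the Transformers library, based only on the Transformers/Safetensors tags.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jieliu/Qwen2-7B-Instruct-DPO-score-diff-2-beta0.5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Give me a one-sentence summary of DPO."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```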
Files and versions
1 contributor · History: 1 commit
jieliu: initial commit (77a7080, verified), about 1 year ago
.gitattributes (1.52 kB) · initial commit · about 1 year ago