edinlp/qwen2-7b-offline-dpo
Safetensors · qwen2
Branch: main
Commit History
Upload folder using huggingface_hub
fefc9ef (verified) · simonycl committed on Nov 16, 2024
initial commit
c6d0b58 (verified) · simonycl committed on Nov 16, 2024