ZHLiu627 / zephyr-7b-gemma-rpo-avg
Safetensors · argilla/dpo-mix-7k · gemma · arxiv:2405.16436 · License: apache-2.0
zephyr-7b-gemma-rpo-avg / README.md
Commit History
Update README.md · d66c14a · verified · ZHLiu627 committed 19 days ago
Create README.md · 6860de7 · verified · ZHLiu627 committed 19 days ago