---
license: apache-2.0
datasets:
- jondurbin/truthy-dpo-v0.1
tags:
- merge
---
DPO'd from vicgalle/franken-Beagle-11B, a Beagle-like model upscaled to 11B parameters.
The base is a frankenmerge created with mergekit; we then applied DPO over a high-quality preference dataset (jondurbin/truthy-dpo-v0.1).
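
As a rough illustration of the DPO step, the sketch below uses the `trl` library's `DPOTrainer` with the base model and the preference dataset listed above. The actual training setup, hyperparameters, output name, and `trl` version used for this model are not documented on this card, so all of those details in the snippet are assumptions.

```python
# Minimal, hypothetical sketch of a DPO pass with trl.
# Hyperparameters and the output directory are illustrative only.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base = "vicgalle/franken-Beagle-11B"
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

# Preference data: DPOTrainer expects prompt / chosen / rejected columns.
dataset = load_dataset("jondurbin/truthy-dpo-v0.1", split="train")
dataset = dataset.remove_columns(
    [c for c in dataset.column_names if c not in ("prompt", "chosen", "rejected")]
)

training_args = DPOConfig(
    output_dir="franken-Beagle-11B-dpo",  # hypothetical output name
    beta=0.1,                             # illustrative KL-penalty strength
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
)

trainer = DPOTrainer(
    model=model,                 # a frozen reference copy is created internally when ref_model is omitted
    args=training_args,
    train_dataset=dataset,
    processing_class=tokenizer,  # `tokenizer=` in older trl versions
)
trainer.train()
```

In practice, training an 11B model this way would also involve choices such as learning rate, sequence-length limits, and possibly LoRA/PEFT to fit in memory; none of those details are specified here.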