---
datasets:
- hassanjbara/LONG-DPO
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

# Phi-3-mini-natrual

This model is a DPO fine-tune of [Phi-3-mini](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) on the [hassanjbara/LONG-DPO](https://huggingface.co/datasets/hassanjbara/LONG-DPO) dataset.

More details will follow.
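
## Usage

A minimal usage sketch with the `transformers` chat-template API is shown below. The repository id in the snippet is an assumption (the card does not state it); replace it with this model's actual id on the Hub.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; substitute this model's actual Hub id.
model_id = "hassanjbara/Phi-3-mini-natrual"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick an appropriate dtype automatically
    device_map="auto",    # requires `accelerate`; remove to load on CPU
)

# Build a chat-formatted prompt and generate a response.
messages = [
    {"role": "user", "content": "Write a short note on direct preference optimization."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```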