---
datasets:
- hassanjbara/LONG-DPO
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

# Phi-3-mini-natrual

This model is a DPO fine-tune of [Phi-3-mini](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) on the [hassanjbara/LONG-DPO](https://huggingface.co/datasets/hassanjbara/LONG-DPO) dataset.
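
## Usage

A minimal usage sketch with the 🤗 Transformers `text-generation` pipeline. The repository id below is assumed from this card's title; adjust it if the actual repo name differs.

```python
import torch
from transformers import pipeline

# Load the model through the text-generation pipeline.
# The repo id is assumed from this model card's title.
generator = pipeline(
    "text-generation",
    model="hassanjbara/Phi-3-mini-natrual",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # Phi-3 checkpoints may ship custom modeling code
)

# Chat-style input, as expected by instruct-tuned Phi-3 models.
messages = [
    {"role": "user", "content": "Explain DPO fine-tuning in one paragraph."},
]

output = generator(messages, max_new_tokens=256, do_sample=False)
# The pipeline returns the full conversation; the last message is the reply.
print(output[0]["generated_text"][-1]["content"])
```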

More details later.