Run the model with the MLX framework:

```bash
python yayi.py --model-path <path_to_mlx_model> --prompt "### Human: 你好\n### Assistant:"
```

Prompt template:

```
### Human: {prompt}
### Assistant:
```
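
If you prefer to call the model directly from Python rather than through `yayi.py` (which is not reproduced here), a minimal sketch using the `mlx-lm` package could look like the following; the checkpoint path `./yayi-mlx` is an assumption and should point at your converted MLX weights:

```python
# Minimal sketch using the mlx-lm package.
# Assumption: the MLX-converted weights live in ./yayi-mlx; adjust as needed.
from mlx_lm import load, generate

model, tokenizer = load("./yayi-mlx")

# Build the prompt following the "### Human: ... ### Assistant:" template above.
prompt = "### Human: Hello\n### Assistant:"

# Generate a completion; max_tokens bounds the response length.
response = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(response)
```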