---
title: README
emoji: 📚
colorFrom: green
colorTo: indigo
sdk: static
pinned: false
---

# MLX Community

A community organization for model weights compatible with [mlx-examples](https://github.com/ml-explore/mlx-examples), powered by [MLX](https://github.com/ml-explore/mlx).

These weights are pre-converted and ready to use with the example scripts.

## Quick start

Check out the MLX examples repo:

```shell
git clone git@github.com:ml-explore/mlx-examples.git
```

Install the requirements:

```shell
cd mlx-examples/hf_llm
pip install -r requirements.txt
```

Generate:

```shell
python generate.py --hf-path mistralai/Mistral-7B-v0.1 --prompt "hello"
```

To upload a new model (for example, a 4-bit quantized Mistral-7B), run:

```shell
python convert.py --hf-path mistralai/Mistral-7B-v0.1 -q --upload-name mistral-v0.1-4bit
```
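For intuition about what the `-q` flag does: 4-bit quantization stores each weight as a small integer plus a per-group scale and offset, trading a little precision for a roughly 4x smaller model. The sketch below is an illustrative NumPy version of group-wise affine 4-bit quantization; it is not MLX's actual implementation, and the function names are hypothetical.

```python
import numpy as np

def quantize_4bit(w, group_size=64):
    """Illustrative group-wise affine 4-bit quantization.

    Not MLX's kernel -- just the idea: each group of `group_size`
    weights is mapped to integers in [0, 15] with its own scale/offset.
    """
    groups = w.reshape(-1, group_size)
    lo = groups.min(axis=1, keepdims=True)
    hi = groups.max(axis=1, keepdims=True)
    scale = (hi - lo) / 15.0  # 4 bits -> 16 levels (0..15)
    q = np.round((groups - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize(q, scale, lo):
    """Recover approximate float weights from the quantized form."""
    return q * scale + lo

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 64)).astype(np.float32)
q, scale, lo = quantize_4bit(w)
w_hat = dequantize(q, scale, lo).reshape(w.shape)
print(float(np.abs(w - w_hat).max()))  # reconstruction error bounded by scale / 2
```

The reconstruction error per weight is at most half a quantization step, which is why small group sizes (each with its own scale) preserve quality better than one scale for the whole matrix.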