Tags: MLX · Safetensors · English · phi3 · custom_code

# mlx-community/sum-small-unquantized

The model mlx-community/sum-small-unquantized was converted to MLX format from [omi-health/sum-small](https://huggingface.co/omi-health/sum-small) using mlx-lm version 0.13.0.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/sum-small-unquantized")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
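Since this model carries the phi3 tag, it likely expects Phi-3-style chat formatting rather than a bare prompt. A minimal sketch of building such a prompt by hand, assuming the standard Phi-3 `<|user|>` / `<|end|>` / `<|assistant|>` turn markers (the function name `build_phi3_prompt` is illustrative, not part of mlx-lm):

```python
def build_phi3_prompt(user_message: str) -> str:
    """Format a single-turn prompt using Phi-3's chat markers.

    Assumes the model follows the standard Phi-3 instruct template:
    a <|user|> turn terminated by <|end|>, then an open <|assistant|> turn
    that the model completes.
    """
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"


# Example: a prompt for summarization, then passed to generate() as above.
prompt = build_phi3_prompt("Summarize the following dialogue: ...")
print(prompt)
```

In practice you can avoid hand-formatting entirely: the tokenizer returned by `load` wraps the Hugging Face tokenizer, so `tokenizer.apply_chat_template([...], add_generation_prompt=True, tokenize=False)` produces the same structure from a list of chat messages.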
Model size: 3.82B params · Tensor type: FP16 (Safetensors)
