Qwen1.5 Chat 4B for RK3588
This is a conversion of https://huggingface.co/Qwen/Qwen1.5-4B-Chat to the RKLLM format for Rockchip devices. It runs on the NPU of the RK3588.
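For reference, conversions like this are produced with Rockchip's rkllm-toolkit (from the rknn-llm repository). The sketch below follows the toolkit's example export flow; the quantization dtype, optimization level, and output path are illustrative placeholders and depend on the toolkit version, not a record of the exact settings used for this upload.

```python
from rkllm.api import RKLLM

# Hugging Face model to convert (the original Qwen1.5-4B-Chat checkpoint)
modelpath = 'Qwen/Qwen1.5-4B-Chat'

llm = RKLLM()

# Load the Hugging Face model
ret = llm.load_huggingface(model=modelpath)
if ret != 0:
    raise SystemExit('Load model failed!')

# Build for the RK3588 NPU; quantization settings here are example values
ret = llm.build(do_quantization=True, optimization_level=1,
                quantized_dtype='w8a8', target_platform='rk3588')
if ret != 0:
    raise SystemExit('Build model failed!')

# Export the converted model in RKLLM format
ret = llm.export_rkllm('./qwen1.5-4b-chat.rkllm')
if ret != 0:
    raise SystemExit('Export model failed!')
```

The resulting .rkllm file is then loaded by the RKLLM runtime (librkllmrt) on the RK3588 itself; see the main repo linked below for runtime instructions.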
Main repo
See this for my full collection of converted LLMs for the RK3588's NPU:
https://huggingface.co/Pelochus/ezrkllm-collection
License
Same as the original LLM: https://huggingface.co/Qwen/Qwen1.5-4B-Chat