
Bagel DPO 57B

An exllamav2 quant of TeeZee/2xbagel-dpo-34b-v0.2.

Runs smoothly on a single 3090 in webui with the context length set to 4096, the ExLlamav2_HF loader, and cache_8bit=True.
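As a sketch, one way to reproduce this setup from the command line, assuming a local checkout of oobabooga's text-generation-webui (flag names may differ between webui versions):

```shell
# Fetch the quantized weights (repo name taken from this card).
git lfs install
git clone https://huggingface.co/TeeZee/2xbagel-dpo-34b-v0.2-bpw2.8-h6-exl2 \
  models/2xbagel-dpo-34b-v0.2-bpw2.8-h6-exl2

# Launch the webui with the settings recommended above:
# ExLlamav2_HF loader, 4096 context, and the 8-bit cache that
# keeps the model within a single 3090's VRAM.
python server.py \
  --model 2xbagel-dpo-34b-v0.2-bpw2.8-h6-exl2 \
  --loader ExLlamav2_HF \
  --max_seq_len 4096 \
  --cache_8bit
```

The same loader and cache options can also be selected interactively on the webui's Model tab.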

All comments are greatly appreciated. Download, test, and if you appreciate my work, consider buying me my fuel: Buy Me A Coffee
