| EXL2 quants of Llama2-70B-chat |
| --- |
| [2.30 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.3bpw) |
| [2.35 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.35bpw) |
| [2.40 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.4bpw) |
| [2.45 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.45bpw) |
| [2.50 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.5bpw) |
| [2.55 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.55bpw) |
| [2.60 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.6bpw) |
| [2.70 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/2.7bpw) |
| [3.00 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/3.0bpw) |
| [4.00 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/4.0bpw) |
| [4.65 bits per weight](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/tree/4.65bpw) |
| [measurement.json](https://huggingface.co/turboderp/Llama2-70B-chat-exl2/blob/main/measurement.json) |
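
Each entry above points to a separate branch of this repository, so you can fetch a single quantization without cloning everything. A minimal sketch using the `huggingface_hub` client (the chosen branch name and target directory are just examples):

```python
# Sketch: download one quantization branch of this repo with huggingface_hub.
# Assumes huggingface_hub is installed and you have enough disk space for the weights.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="turboderp/Llama2-70B-chat-exl2",
    revision="2.4bpw",                       # any branch name from the table above
    local_dir="Llama2-70B-chat-exl2-2.4bpw", # example target directory
)
print(local_path)
```

The `measurement.json` on the main branch holds the calibration measurements from the original conversion; it can typically be passed to ExLlamaV2's conversion script to skip the measurement pass when producing a quant at a different bitrate.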