---
license: apache-2.0
base_model: Qwen/Qwen3-Next-80B-A3B-Instruct
base_model_relation: quantized
quantized_by: turboderp
tags:
- exl3
---

EXL3 quants of [Qwen3-Next-80B-A3B-Instruct](https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Instruct)

⚠️ Requires ExLlamaV3 v0.0.7 (or the v0.0.6 `dev` branch)

Base bitrates:

- [2.00 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/2.0bpw)
- [3.00 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/3.0bpw)
- [4.00 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/4.0bpw)
- [5.00 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/5.0bpw)

Optimized:

- [2.08 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/2.08bpw)
- [2.27 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/2.27bpw)
- [2.78 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/2.78bpw)
- [3.14 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/3.14bpw)
- [3.53 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/3.53bpw)
- [4.06 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/4.06bpw)
- [4.51 bits per weight](https://huggingface.co/turboderp/Qwen3-Next-80B-A3B-Instruct-exl3/tree/4.51bpw)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6383dc174c48969dcf1b4fce/o102Nhe2xdm7WzgMlg1tg.png)
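
Each bitrate lives on its own branch of this repo (the branch name matches the bpw in the links above), so a single variant can be fetched by passing that branch as the revision. A minimal sketch using `huggingface_hub`; the `3.14bpw` branch and the local directory name below are just example choices, substitute whichever bitrate fits your VRAM:

```python
# Minimal sketch: download one quant variant by selecting its branch.
# Assumes `huggingface_hub` is installed; "3.14bpw" and the local
# directory name are examples, not required values.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="turboderp/Qwen3-Next-80B-A3B-Instruct-exl3",
    revision="3.14bpw",  # branch name = bitrate
    local_dir="Qwen3-Next-80B-A3B-Instruct-exl3-3.14bpw",
)
```

The downloaded directory can then be loaded with ExLlamaV3 (v0.0.7 or the v0.0.6 `dev` branch, as noted above), for example via the chat example in the ExLlamaV3 repository.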