---
base_model:
- Vortex5/MS3.2-24B-Omega-Diamond
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
- exl3
- 4-bit
- 6-bit
- 8-bit
---

# Original model: [MS3.2-24B-Omega-Diamond](https://huggingface.co/Vortex5/MS3.2-24B-Omega-Diamond) by [Vortex5](https://huggingface.co/Vortex5)

## Available [ExLlamaV3](https://github.com/turboderp-org/exllamav3) (release v0.0.17) quantizations

| Type | Size | CLI |
|------|------|-----|
| [H8-4.0BPW](https://huggingface.co/DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3/tree/H8-4.0BPW) | 13.16 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3/resolve/H8-4.0BPW/Download~Vortex5_MS3.2-24B-Omega-Diamond_H8-4.0BPW_EXL3.bat) |
| [H8-6.0BPW](https://huggingface.co/DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3/tree/H8-6.0BPW) | 18.72 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3/resolve/H8-6.0BPW/Download~Vortex5_MS3.2-24B-Omega-Diamond_H8-6.0BPW_EXL3.bat) |
| [H8-8.0BPW](https://huggingface.co/DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3/tree/H8-8.0BPW) | 24.27 GB | [Copy-paste the line / Download the batch file](https://huggingface.co/DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3/resolve/H8-8.0BPW/Download~Vortex5_MS3.2-24B-Omega-Diamond_H8-8.0BPW_EXL3.bat) |

***Requirements: a Python installation with the huggingface-hub module to use the CLI.***

### Licensing:

The license for the provided quantized models is derived from the original model (see the source above).
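Each quantization lives on its own branch of this repo, so the CLI line the batch files wrap is a `huggingface-cli download` call that selects a branch via `--revision`. A minimal sketch that builds such a command for each branch (the exact line shipped in the `.bat` files may differ; the `--local-dir` name here is an illustrative choice):

```python
# Sketch: build a huggingface-cli download command per quantization branch.
# The branch name (e.g. H8-4.0BPW) doubles as the quantization label.
REPO_ID = "DeathGodlike/MS3.2-24B-Omega-Diamond_EXL3"
REVISIONS = ["H8-4.0BPW", "H8-6.0BPW", "H8-8.0BPW"]

def download_command(revision: str) -> str:
    """Return the CLI line for one quantization branch."""
    if revision not in REVISIONS:
        raise ValueError(f"unknown quantization: {revision}")
    local_dir = f"{REPO_ID.split('/')[-1]}-{revision}"  # illustrative folder name
    return (
        f"huggingface-cli download {REPO_ID} "
        f"--revision {revision} --local-dir {local_dir}"
    )

for rev in REVISIONS:
    print(download_command(rev))
```

Paste the printed line into a terminal where `huggingface-hub` is installed; each branch downloads only that quantization's files.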