Original model: MN-12B-Mag-Mell-R1-Uncensored by Naphula

Available quantizations (created with ExLlamaV3 release v0.0.18):

Requirements: a Python installation with the huggingface-hub package to use the CLI.
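A minimal sketch of fetching a quantization with the huggingface-hub CLI. The repository id is taken from the model tree on this card; the local directory name and the optional branch selection are assumptions, so check the repository's file and branch list before downloading.

```shell
# Install the CLI that ships with the huggingface_hub package
pip install -U huggingface_hub

# Download this repository's files into a local directory.
# --local-dir is an arbitrary choice; rename it as you like.
huggingface-cli download DeathGodlike/MN-12B-Mag-Mell-R1-Uncensored_EXL3 \
  --local-dir ./MN-12B-Mag-Mell-R1-Uncensored_EXL3
```

If the different bit-width quantizations are stored on separate branches (a common layout for EXL quant repositories, but an assumption here), add `--revision <branch-name>` to download a specific one.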

Licensing: the license for the provided quantized models is inherited from the original model (see the source above).

Model tree for DeathGodlike/MN-12B-Mag-Mell-R1-Uncensored_EXL3
