GGUF conversion help
Hi bewolf,
I'm sorry to make a post on one of your model pages for something unrelated to the model, but I would like to know if you have any special way of creating the GGUF.
I'm asking because I'm trying to create one for SmoothMix WAN 2.2, but when I use the GGUF it only generates a black image.
I think it's because of everything that was added to the base model, and since you were able to make the GGUF for Phr00t's, I thought you might have some special instructions.
Thanks
I saw you mention in another conversation that you use the normal GGUF conversion script from city96:
https://huggingface.co/befox/WAN2.2-14B-Rapid-AllInOne-GGUF/discussions/9
But that's what I used, and it isn't working.
Yes, I've encountered the same issue
I was able to convert the v-10 NSFW model using tool_auto.py from this branch:
https://github.com/city96/ComfyUI-GGUF/tree/auto_convert
However, I don't think I was able to get a mega model to work.
I was able to do it after first running this script with the --strip-fp8 argument:
https://github.com/Kickbub/Dequant-FP8-ComfyUI/blob/main/dequantize_fp8v2.py
Then I ran the whole conversion process on the file the script produced.
Closing this thread.