https://huggingface.co/inclusionAI/Ling-2.6-flash
#2279
by Doctor-Chad-PhD - opened
It's queued!
You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#Ling-2.6-flash-GGUF for quants to appear.
+1
Thank you!
It looks like it failed. I thought the architecture was supported, but that was 2.0, not 2.5:
error/1 ~BailingMoeV2_5ForCausalLM
Here's the request for gguf support:
https://huggingface.co/inclusionAI/Ling-2.6-flash/discussions/3
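For context, the error above means llama.cpp's converter doesn't recognize the architecture string the checkpoint declares. A quick way to see which architecture a model advertises (the string the converter matches against) is to read its `config.json`; this is just a sketch assuming you have the file downloaded locally:

```python
import json

def model_architectures(config_path: str) -> list[str]:
    """Return the architecture names declared in a model's config.json.

    If none of these appear in the converter's supported list,
    conversion to GGUF will fail with an unsupported-architecture error.
    """
    with open(config_path) as f:
        config = json.load(f)
    return config.get("architectures", [])
```

Running this against Ling-2.6-flash's config would show `BailingMoeV2_5ForCausalLM`, which is why the quantization job bailed out.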
Let me know when it's supported so I can queue it. The model had only been up for a day, so it makes sense that it didn't work.