Source: https://huggingface.co/silveroxides/flux1-nf4-unet/tree/main from silveroxides
With GGUF models you should use the Unet Loader (GGUF) node and place the model into the "diffusion_models" folder (previously "unet") in ComfyUI.
Is this model supposed to be loaded with the Comfy Core node "Load Checkpoint"? Should we load it as a usual checkpoint and take only the Model output?
I tried many ways to load it. Nothing works for me.
There are at least 5 positive reviews already. Can someone tell me how to load this format properly?
Error occurred when executing CheckpointLoaderSimple: ERROR: Could not detect model type of: **/flux1DevFlux1Schnell_devNF4UNET.safetensors
This is a U-Net-only model; there is a separate loader for NF4 UNets. LoRA does not work with NF4 in Comfy at the moment, though.
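The "Could not detect model type" error fits that explanation: a regular checkpoint loader expects a bundled file (UNet + CLIP + VAE), while this file carries UNet weights only. As an illustrative sketch (the key prefixes and the `classify` helper are my assumptions, not ComfyUI's actual detection logic), you can peek at a safetensors header with the Python stdlib alone, since the format is just an 8-byte little-endian header length followed by a JSON header:

```python
import json
import struct
import tempfile

def read_safetensors_keys(path):
    """Read only the JSON header of a .safetensors file and return its tensor names."""
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]  # 8-byte little-endian length
        header = json.loads(f.read(header_len))
    return [k for k in header if k != "__metadata__"]

def classify(keys):
    """Rough heuristic: a full checkpoint bundles CLIP/VAE weights, a UNet-only file does not."""
    has_unet = any(k.startswith(("model.diffusion_model.", "double_blocks.", "single_blocks.")) for k in keys)
    has_clip = any("text_model" in k or k.startswith("text_encoders.") for k in keys)
    has_vae = any(k.startswith(("first_stage_model.", "vae.")) for k in keys)
    if has_unet and not (has_clip or has_vae):
        return "unet-only"
    return "full-checkpoint" if has_unet else "unknown"

# Build a tiny fake UNet-only file to demonstrate (key name is illustrative).
header = {
    "double_blocks.0.img_attn.qkv.weight": {"dtype": "F16", "shape": [1], "data_offsets": [0, 2]},
}
blob = json.dumps(header).encode()
with tempfile.NamedTemporaryFile(suffix=".safetensors", delete=False) as f:
    f.write(struct.pack("<Q", len(blob)) + blob + b"\x00\x00")
    path = f.name

print(classify(read_safetensors_keys(path)))  # unet-only
```

A file that classifies as "unet-only" here is the kind that needs a dedicated UNet/NF4 loader rather than "Load Checkpoint".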
@homoludens
1) Make sure you have ComfyUI-Manager installed (https://replicate.com/guides/comfyui/comfyui-manager).
2) Restart ComfyUI. When the interface loads you'll be able to press the new "Manager" button.
3) On the left, change your Channel to dev.
4) Press "Custom nodes manager". Then, in the search box, type NF4
5) Find ComfyUI_bitsandbytes_NF4 and press "Install"
6) Reload ComfyUI and double-click the grid background to pull up the node search window
7) Search for NF4 to find the checkpoint loader node. Connect it how the old loader was and you should be able to queue your prompt successfully!
@voidvisionary NF4... aha. Why didn't I hit the search box in Manager? Intuition left me temporarily (I hope).
I thought the whole NF4 format would be deprecated in favor of GGUF; at least it says so here: https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
For me it would be perfectly OK to use a fine-tuned NF4 just for the generation-speed advantage. It's great, at least for low-resolution drafts when quickly testing prompts.
@PirateGirl I'm not sure anymore. I threw away all the NF4 files to make space on my hard drive, but new versions keep popping up, so maybe the note on the site is wrong 🙄
Lots of users are still using Draw Things for generation, and it does support GGUF. Hell, even CivitAI doesn't technically support GGUF yet; the files have to be zipped.
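As a minimal sketch of that zip step (the file name is hypothetical, and the placeholder bytes stand in for real model data), packaging a GGUF file with Python's stdlib looks like:

```python
import os
import tempfile
import zipfile

# Hypothetical file name; any .gguf file works the same way.
workdir = tempfile.mkdtemp()
gguf_path = os.path.join(workdir, "flux1-dev-Q4_K_S.gguf")
with open(gguf_path, "wb") as f:
    f.write(b"GGUF")  # placeholder bytes standing in for real model data

# Write the model into a zip next to it, keeping only the base name inside the archive.
zip_path = gguf_path + ".zip"
with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as z:
    z.write(gguf_path, arcname=os.path.basename(gguf_path))

print(zipfile.ZipFile(zip_path).namelist())  # ['flux1-dev-Q4_K_S.gguf']
```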
Does it support Flux LoRAs?
