CivArchive
    Flux.1-Dev GGUF F16 - F16
    NSFW

    Source https://huggingface.co/city96/FLUX.1-dev-gguf/tree/main by city96

    This is a direct GGUF conversion of Flux.1-dev. As this is a quantized model, not a finetune, all the same restrictions and original license terms still apply.

    The model files can be used with the ComfyUI-GGUF custom node.
    Place model files in ComfyUI/models/unet; see the GitHub readme for further install instructions.
    Also working with Forge since the latest commit!
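    For anyone scripting the download, a quick sanity check helps catch a bad transfer before loading the model: every GGUF file starts with the four ASCII magic bytes "GGUF". A minimal sketch (the path below is hypothetical, matching the install location mentioned above):

    ```python
    # Sketch: check that a downloaded file is a GGUF container by its magic bytes.
    # GGUF files always begin with the 4 ASCII bytes b"GGUF".
    def looks_like_gguf(path):
        with open(path, "rb") as f:
            return f.read(4) == b"GGUF"

    # e.g. looks_like_gguf("ComfyUI/models/unet/flux1-dev-F16.gguf")
    ```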

    Description

    FAQ

    Comments (17)

    JayNL
    Author
    Aug 19, 2024 (1 reaction)
    CivitAI

    I can run this on an RTX 4070 12GB, but somehow my PC can't get the GGUF file into a ZIP

    JayNL
    Author
    Aug 19, 2024 (1 reaction)

    The problem was that it crashed at 90% because my hard drive was full; it's fixed now!

    JayNL
    Author
    Aug 19, 2024 (1 reaction)
    CivitAI

    Removed the ZIP because people find it sus; will try again tonight, or maybe someone else can upload the right file.

    JayNL
    Author
    Aug 19, 2024 (3 reactions)
    CivitAI

    Lol, 62 people downloaded an empty ZIP file. Gonna try to pack it one more time on another hard drive, so it at least has the right file!
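    The empty-ZIP episode above is easy to guard against: verify the archive's member list and CRCs before uploading. A minimal sketch using Python's stdlib zipfile module (the check itself is generic; no assumptions about the actual file):

    ```python
    import zipfile

    def zip_is_intact(path):
        """True if the archive opens, has at least one member, and all CRCs pass."""
        try:
            with zipfile.ZipFile(path) as z:
                # testzip() returns the first corrupt member's name, or None if all pass
                return len(z.namelist()) > 0 and z.testzip() is None
        except zipfile.BadZipFile:
            return False
    ```

    A truncated archive (the drive-full case above) either fails to open or fails the CRC test, so this catches it before 62 people download it.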

    discodiffuse
    Aug 19, 2024

    Dude, pls can you reupload the main file on here? I'm using a cloud Comfy server and can only upload models from Civit, nowhere else.

    JayNL
    Author
    Aug 19, 2024

    @discodiffuse I am repacking it right now; it takes 30 mins on a 12600K

    JayNL
    Author
    Aug 19, 2024

    @discodiffuse fixed!

    JayNL
    Author
    Aug 19, 2024
    CivitAI

    Ok, it should be on!

    YeiYeiArt
    Aug 19, 2024
    CivitAI

    GGUF is friendlier with low-spec PCs, right?

    JayNL
    Author
    Aug 19, 2024 (2 reactions)

    True, but this one is the heaviest; my 4070 can barely run it. For lower cards I recommend the Q versions that Ralfinger posted. Link is in Suggested Resources.

    YeiYeiArt
    Aug 19, 2024 (1 reaction)

    I also have a 4070 O.O

    JayNL
    Author
    Aug 19, 2024 (1 reaction)

    @YeiYeiArt Q8 is perfect for a 4070, I think

    zerocool22
    Sep 3, 2024

    @JayNL But a Ti with 16GB?

    JayNL
    Author
    Aug 20, 2024 (2 reactions)
    CivitAI

    Well, I'm kinda done testing F16. It works and sometimes the result is better, but it maxes out my PC so I can't do anything else; I'll probably go Q8!
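    The F16-vs-Q8 tradeoff the thread keeps circling back to follows directly from bits per weight. Flux.1-dev has roughly 12B parameters (the publicly quoted figure), so a back-of-envelope estimate shows why F16 overwhelms a 12GB card while Q8 is far more comfortable (this sketch ignores quantization metadata overhead and any non-quantized layers):

    ```python
    # Back-of-envelope model size: parameters * bits_per_weight / 8 bytes.
    # Assumes ~12B parameters for Flux.1-dev; real GGUF files carry some
    # extra overhead, so treat the result as a lower bound.
    def est_size_gb(params_billion=12, bits=16):
        return params_billion * bits / 8  # 1e9 params at 8 bits = 1 GB

    print(est_size_gb(12, 16))  # F16: ~24 GB
    print(est_size_gb(12, 8))   # Q8:  ~12 GB
    ```

    At ~24 GB the F16 weights can't fit in 12GB of VRAM at all and must be streamed from system RAM, which matches the "maxes out my PC" experience above; Q8 halves that.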

    tuefma
    Aug 22, 2024 (1 reaction)
    CivitAI

    Is it just me or do LoRAs not work with F16?

    JayNL
    Author
    Aug 22, 2024

    No, it's not just you; some work with Q8 and not with F16

    JayNL
    Author
    Aug 22, 2024 (1 reaction)

    Funny thing is, I was wondering if it was just me! 😂

    Checkpoint
    Flux.1 D

    Details

    Downloads
    3,265
    Platform
    CivitAI
    Platform Status
    Available
    Created
    8/19/2024
    Updated
    5/12/2026
    Deleted
    -

    Files

    flux1DevGGUFF16_f16.zip

    Mirrors

    CivitAI (1 mirror)

    Available On (1 platform)

    Same model published on other platforms. May have additional downloads or version variants.