    Flux.1 Lite 8B Alpha GGUF Q3_K_S Q4_0/Q4_1/Q4_K_S Q5_0/Q5_1/Q5_K_S Q6_K Q8_0 BF16 - Q5

    Source: https://huggingface.co/city96/flux.1-lite-8B-alpha-gguf/tree/main (uploaded by city96)

    This is a direct GGUF conversion of Freepik/flux.1-lite-8B-alpha.
    The model files can be used with the ComfyUI-GGUF custom node.
    Place the model files in ComfyUI/models/unet - see the GitHub readme for further install instructions (a download example is sketched after this section's notes).
    Please refer to this chart for a basic overview of quantization types.
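
    For convenience, a chosen quant can also be fetched straight from the Hugging Face repo with the huggingface_hub library and placed into the unet folder. This is a minimal sketch, not the official install method: the exact .gguf filename is an assumption based on the repo's usual naming scheme, so check the file listing at the source link above before running it, and adjust local_dir if ComfyUI lives elsewhere.

        # Minimal sketch: download one quant of the GGUF conversion into ComfyUI's unet folder.
        # Assumptions: the filename below matches the repo's naming (verify on the repo page)
        # and ComfyUI is installed in the current working directory.
        from huggingface_hub import hf_hub_download

        hf_hub_download(
            repo_id="city96/flux.1-lite-8B-alpha-gguf",
            filename="flux.1-lite-8B-alpha-Q5_K_S.gguf",  # assumed name; check the repo file list
            local_dir="ComfyUI/models/unet",
        )

    After the download finishes, the file should appear under ComfyUI/models/unet and show up in the GGUF Unet loader node once ComfyUI is restarted.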

    ☕ Buy me a coffee: https://ko-fi.com/ralfingerai
    🍺 Join my discord: https://discord.gg/g5Pb8qNUuP

    Description