CivArchive
    SD 3.5 Large GGUF Q4/Q4.1 Q5/Q5.1 Q8 F16 - Q4.1

    Source: https://huggingface.co/city96/stable-diffusion-3.5-large-gguf from city96

    This is a direct GGUF conversion of stabilityai/stable-diffusion-3.5-large

    As this is a quantized model, not a finetune, all the same restrictions and original license terms still apply. The model files can be used with the ComfyUI-GGUF custom node.

    Place model files in ComfyUI/models/unet - see the GitHub readme for further install instructions.
    Please refer to this chart for a basic overview of quantization types.
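    A minimal shell sketch of the install step above. The ComfyUI location and the exact .gguf file name are assumptions; check your own install path and the Hugging Face repo's file list before downloading:

    ```shell
    # Hypothetical ComfyUI location; adjust COMFYUI_DIR to your install.
    COMFYUI_DIR="${COMFYUI_DIR:-$HOME/ComfyUI}"
    MODEL_DIR="$COMFYUI_DIR/models/unet"
    mkdir -p "$MODEL_DIR"

    # Example download (file name is an assumption; verify it in the repo's file list):
    # wget -P "$MODEL_DIR" \
    #   "https://huggingface.co/city96/stable-diffusion-3.5-large-gguf/resolve/main/sd3.5_large-Q4_1.gguf"

    echo "Place .gguf files in: $MODEL_DIR"
    ```

    ComfyUI will pick the file up from models/unet on the next restart or refresh.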

    ☕ Buy me a coffee: https://ko-fi.com/ralfingerai
    🍺 Join my discord: https://discord.com/invite/pAz4Bt3rqb

    Comments (6)

    NorfolkDave · Oct 25, 2024 · 1 reaction

    What's the difference between these and the other SD 3.5 GGUFs? https://civitai.com/models/879251/stable-diffusion-35-large-gguf?modelVersionId=985076

    LiteSoulHD · Oct 30, 2024 · 1 reaction

    Can you do this but for 3.5 Medium? Thanks

    snobbias124 · Nov 2, 2024

    Big thanks for this. What would be the VRAM requirements for Q4, and how much does it differ from the non-quantized model in quality? Is it noticeable?

    Randprint · Nov 11, 2024

    Is Q6_K too large for an 8 GB card?

    topdeckmlw818 · Nov 23, 2024

    Having trouble with the workflow for this, any links?

    MissBee · Jan 22, 2025

    "ValueError: Failed to recognize model type!"

    Error when using ForgeUI; the model does not work.

    Checkpoint
    SD 3.5 Large

    Details

    Downloads
    101
    Platform
    CivitAI
    Platform Status
    Deleted
    Created
    10/25/2024
    Updated
    4/27/2026
    Deleted
    4/27/2026