    Flux.1-Heavy-17B (GGUF) - Q5_K_M

    Flux.1-Heavy-17B GGUF Quantizations

    🚀 Major Resource Optimization

    Training

    Original Requirements:

    • 40GB VRAM

    • 80GB System RAM

    Now Accessible With:

    • As low as 8GB VRAM

    • 32GB System RAM

    Description

    These are memory-optimized GGUF quantizations of the original Flux.1-Heavy-17B model (by city96), making it usable on systems with far less VRAM. The original model is a 17B-parameter self-merge of the 12B Flux.1-dev model, notable as one of the first open-source 17B image models capable of generating coherent images.
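
    To get a rough sense of why these quantizations fit in so much less memory, the sketch below estimates weight-storage size from the 17B parameter count and an assumed bits-per-weight figure for each format. The bits-per-weight values are typical approximations for GGUF k-quants, not measurements of these exact files.

    # Rough weight-memory estimate for a 17B-parameter model.
    # Bits-per-weight values are assumed approximations for GGUF k-quants,
    # not measured from these specific files.
    PARAMS = 17e9

    precisions = {
        "FP16 (original)": 16.0,
        "Q6_K (assumed ~6.6 bpw)": 6.6,
        "Q5_K_M (assumed ~5.7 bpw)": 5.7,
        "Q3_K_L (assumed ~4.3 bpw)": 4.3,
    }

    for name, bpw in precisions.items():
        gib = PARAMS * bpw / 8 / 1024**3
        print(f"{name:26s} ~ {gib:5.1f} GiB of weights")

    These are ballpark figures for the weights alone; text encoders, the VAE, and activations add overhead, so treat the VRAM tiers under Available Versions below as the practical guidance.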

    Available Versions

    Q6_K Version

    • VRAM Requirement: 16GB

    • Best balance of quality and performance

    • Recommended for users with RTX 3080/3090 or similar GPUs

    Q5_K_M Version

    • VRAM Requirement: 12GB

    • Good quality with reduced memory footprint

    • Ideal for RTX 3060 Ti/3070/2080 Ti users

    Q3_K_L Version

    • VRAM Requirement: 8GB

    • Most memory-efficient version

    • Enables running on mid-range GPUs like RTX 3060/2060 Super
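
    If you are unsure which tier applies to your card, the small helper below maps detected VRAM onto the thresholds listed above. The thresholds come straight from this page; the torch-based query is simply one convenient way to read total VRAM and assumes a CUDA GPU.

    import torch

    def suggest_quant() -> str:
        """Suggest a quantization tier using the VRAM guidance listed above."""
        if not torch.cuda.is_available():
            return "No CUDA GPU detected; check your setup before choosing a tier."
        # Total device memory in decimal GB, matching the GB figures on this page.
        vram_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
        if vram_gb >= 16:
            return f"~{vram_gb:.0f} GB VRAM -> Q6_K"
        if vram_gb >= 12:
            return f"~{vram_gb:.0f} GB VRAM -> Q5_K_M"
        if vram_gb >= 8:
            return f"~{vram_gb:.0f} GB VRAM -> Q3_K_L"
        return f"~{vram_gb:.0f} GB VRAM is below the listed 8GB minimum"

    print(suggest_quant())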

    Key Features

    • Maintains the core capabilities of the original Flux.1-Heavy-17B model

    • Optimized for different VRAM configurations

    • Enables broader hardware compatibility without requiring high-end GPUs

    • Smooth operation at specified VRAM levels

    • Dramatically reduced resource requirements compared to original model

    Installation

    1. Download the preferred quantization version

    2. Place the GGUF file in your models directory

    3. Update your configuration to point to the new model file (a scripted example of these steps is sketched below)
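
    As a concrete illustration of those steps, here is a minimal Python sketch that pulls a GGUF file from a Hugging Face mirror and drops it into a ComfyUI-style models directory. The repository id is a placeholder, and the models/unet location assumes the ComfyUI-GGUF loader; adjust both for your own frontend.

    import shutil
    from pathlib import Path

    from huggingface_hub import hf_hub_download  # pip install huggingface_hub

    # Placeholder repo id -- replace with the actual Hugging Face mirror for this upload.
    REPO_ID = "your-namespace/Flux.1-Heavy-17B-GGUF"
    FILENAME = "flux1Heavy17BGGUF_q5KM.gguf"     # file listed on this page
    MODELS_DIR = Path("ComfyUI/models/unet")     # assumed ComfyUI-GGUF location

    # Step 1: download the chosen quantization.
    local_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)

    # Step 2: place the GGUF file in your models directory.
    MODELS_DIR.mkdir(parents=True, exist_ok=True)
    shutil.copy2(local_path, MODELS_DIR / FILENAME)

    # Step 3: point your configuration/loader at the copied file.
    print(f"Model ready at {MODELS_DIR / FILENAME}")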

    Credits

    • Original model: city96 (Flux.1-Heavy-17B)

    • Base architecture: Flux.1-dev (12B parameter model)

    Notes

    • Performance may vary depending on your specific hardware configuration

    • Choose the quantization level based on your available VRAM and quality requirements

    • Lower quantization levels may show slight quality degradation compared to the original model

    Details

    Type: Checkpoint
    Base Model: Flux.1 D
    Downloads: 102
    Platform: CivitAI
    Platform Status: Available
    Created: 1/8/2025
    Updated: 9/28/2025
    Deleted: -

    Files

    flux1Heavy17BGGUF_q5KM.gguf

    Mirrors

    Huggingface (1 mirror)
    CivitAI (1 mirror)