    ⚡️FLUX Dev/Schnell (Base UNET) + Google FLAN FP16 - Dev

    Full checkpoint with an improved text encoder — do not load an additional CLIP/TE.

    FLUX.1 (Base UNET) + Google FLAN

    This model quantizes the 42 GB FP32 Google FLAN T5-XXL text encoder and pairs it with an improved CLIP-L for FLUX. To my knowledge, no one else has posted or attempted this.

    • Quantized from the FP32 T5-XXL (42 GB, 11B parameters)

    • Base UNET with no baked-in LoRAs or other changes

    • Full FP16 version is available.
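
The FP32-to-FP16 conversion described above can be sketched as a simple dtype cast over every weight in a checkpoint. This is a minimal toy illustration using NumPy arrays as stand-ins for the actual 42 GB T5-XXL weights; the function name and tensor names are hypothetical, not part of any released tooling.

```python
import numpy as np

def quantize_fp32_to_fp16(state_dict):
    """Cast every FP32 tensor in a checkpoint-style dict to FP16, halving its storage."""
    return {name: w.astype(np.float16) if w.dtype == np.float32 else w
            for name, w in state_dict.items()}

# Toy stand-in for the 11B-parameter T5-XXL state dict.
weights = {"encoder.block.0.layer.0.weight":
           np.random.rand(256, 256).astype(np.float32)}
fp16_weights = quantize_fp32_to_fp16(weights)

original = sum(w.nbytes for w in weights.values())       # bytes at FP32
reduced = sum(w.nbytes for w in fp16_weights.values())   # bytes at FP16
```

Applied to the real 42 GB FP32 checkpoint, the same cast yields roughly a 21 GB FP16 text encoder, at the cost of reduced numeric precision.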
