    FlashAttention VS Test - 12.9_CUDA_FlashAtten


    • Tests and graphs attention implementations based on your system.

    • TODO: add Sage Attention.

    • For ComfyUI users, I recommend launching with:

    cmd /k python main.py --fp32-text-enc --fp32-vae --bf16-unet --use-flash-attention

    Note: the precompiled FlashAttention build that is also included targets CUDA 12.9.

    (Tested and working with ComfyUI.)
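Before pointing ComfyUI at the bundled build, it can help to confirm that the flash_attn package actually imports in your environment. The snippet below is a minimal sketch of such a check (the helper name `flash_attention_status` is mine, not part of this upload); it reports gracefully when the package is absent rather than raising.

```python
# Sketch: check whether the flash_attn package is importable.
# (Hypothetical helper, not part of the uploaded archive.)
import importlib.util


def flash_attention_status():
    """Return a short status string for the flash_attn package."""
    if importlib.util.find_spec("flash_attn") is None:
        return "flash_attn not installed"
    import flash_attn
    version = getattr(flash_attn, "__version__", "unknown")
    return f"flash_attn {version} importable"


print(flash_attention_status())
```

If this reports "not installed", install the included wheel first; note that a prebuilt wheel compiled against CUDA 12.9 is only expected to work with a matching CUDA runtime.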

    Description

    Other

    Details

    Downloads: 16
    Platform: CivitAI
    Platform Status: Available
    Created: 8/25/2025
    Updated: 9/28/2025
    Deleted: -

    Files

    flashattentionVSTest_129CUDAFlashatten.zip