CivArchive
    Wan2.1_t2v14B_720p_FP8 - v1.0
    NSFW

    Here is my quantized FP8 version of the Wan2.1 14B 720p t2v model, so that it can run on 50- and 40-series cards, or deliver even faster inference on bigger cards.
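    For context on what FP8 quantization does: each weight is stored in 8 bits instead of 16 or 32, halving (or quartering) the checkpoint's memory footprint at a small precision cost. As an illustrative sketch only (not the tool used to produce this checkpoint), here is how rounding a value to the common float8 e4m3fn format works, assuming the usual layout of 1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7, and a saturation point of ±448:

    ```python
    import math

    def quantize_e4m3(x: float) -> float:
        """Round x to the nearest value representable in float8 e4m3fn
        (1 sign bit, 4 exponent bits, 3 mantissa bits, bias 7, max 448)."""
        if x == 0.0:
            return 0.0
        sign = -1.0 if x < 0 else 1.0
        a = min(abs(x), 448.0)           # e4m3fn saturates at +/-448
        if a < 2.0 ** -6:                # subnormal range: fixed step of 2^-9
            step = 2.0 ** -9
            return sign * round(a / step) * step
        e = min(math.floor(math.log2(a)), 8)
        step = 2.0 ** (e - 3)            # 3 mantissa bits -> 8 steps per octave
        return sign * min(round(a / step) * step, 448.0)

    # Example: 0.3 lands on the nearest of the 8 representable steps
    # between 0.25 and 0.5, which is 0.3125.
    print(quantize_e4m3(0.3))   # -> 0.3125
    print(quantize_e4m3(1e6))   # saturates -> 448.0
    ```

    The coarse 3-bit mantissa is why FP8 weights need a quality-tolerant model to begin with; in practice libraries such as PyTorch expose this as the `torch.float8_e4m3fn` dtype rather than doing the rounding by hand.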

    Description

    FAQ

    Comments (8)

    gx_ground136 · Feb 28, 2025
    CivitAI

    (Translated from Chinese:) Foreigners are really out of the loop — over here, WAN doesn't even have a single thing trained for it yet.

    yogotatara
    Author
    Feb 28, 2025

    english pls=)

    WackyLabs · Feb 28, 2025
    CivitAI

    Would you be able to do a quantized FP8 version of T2V-1.3B for those of us with lower end hardware?

    yogotatara
    Author
    Feb 28, 2025

    yes will do

    WackyLabs · Feb 28, 2025

    Awesome, thanks!

    WackyLabs · Mar 1, 2025

    @yogotatara Thank you!

    AMark78 · Mar 24, 2025

    Wow! I think what we really need is a quantized I2V-1.3B, because even on a 4090 a generation takes about 40 minutes. Is that even possible?

    Checkpoint
    Other

    Details

    Downloads
    727
    Platform
    CivitAI
    Platform Status
    Available
    Created
    2/27/2025
    Updated
    5/13/2026
    Deleted
    -
    Trigger Words:

    Files

    wan21T2v14b720pFP8_v10.safetensors

    Mirrors