CivArchive

    This page contains scaled fp8 quantized DiT models of Neta Lumina for ComfyUI.

    Also included: a scaled fp8 quantized Gemma 2 2B (the text encoder).

    All credit belongs to the original model author. License is the same as the original model.

    Note: Images from the bf16 and fp8 models are identical, as shown in the preview images. If the image from the fp8 model changes drastically, your ComfyUI has somehow enabled fp16 mode. Lumina 2 does not support fp16, and you will get deformed images.
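    The usual reason a model breaks in fp16 but survives in bf16 is dynamic range (a general property of the formats, not something stated on this page): fp16 has only a 5-bit exponent and tops out at 65504, while bf16 keeps fp32's 8-bit exponent, so large intermediate activations that fit comfortably in bf16 overflow fp16 to inf. A quick numpy illustration:

    ```python
    import numpy as np

    # fp16: 5-bit exponent, largest finite value is 65504.
    # bf16 keeps fp32's 8-bit exponent (max ~3.4e38), so the same
    # activation magnitudes that overflow fp16 survive in bf16.
    activation = np.float32(70000.0)   # a plausibly large intermediate value

    print(np.float16(activation))      # overflows fp16's range -> inf
    print(np.finfo(np.float16).max)    # 65504.0
    ```

    Once an activation becomes inf (or NaN after inf - inf), it propagates through the rest of the network, which is why the output image deforms rather than just losing a little precision.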


    Update (11/27/2025): mixed precision and fp8 tensor core support (mptc).

    This is a new ComfyUI feature that adds fp8 tensor core support, combined with scaled fp8 and mixed precision.

    In short:

    Mixed precision: Keep important layers in BF16.

    FP8 tensor core support: On supported GPUs, much faster (30-80%) than bf16 and classic scaled fp8 models, because ComfyUI performs the calculations in fp8 directly instead of dequantizing to bf16 first. torch.compile is recommended.

    More info: https://civarchive.com/models/2172944/z-image-turbo-tensorcorefp8
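    As a rough sketch of what "scaled fp8" means here (assuming per-tensor scaling; real checkpoints may use finer-grained scales, and this toy quantizer ignores e4m3 subnormals): each weight tensor is stored as fp8 values plus one higher-precision scale, and dequantization is just a multiply.

    ```python
    import numpy as np

    FP8_E4M3_MAX = 448.0  # largest finite value in the e4m3 fp8 format

    def round_e4m3(x):
        """Round values to the nearest e4m3-representable number
        (4 significant bits via frexp; subnormals ignored for simplicity)."""
        m, e = np.frexp(np.clip(x, -FP8_E4M3_MAX, FP8_E4M3_MAX))
        return np.ldexp(np.round(m * 16) / 16, e)

    def quantize_scaled_fp8(w):
        """Per-tensor scaled fp8: store fp8-rounded weights + one fp32 scale."""
        scale = np.abs(w).max() / FP8_E4M3_MAX
        return round_e4m3(w / scale), np.float32(scale)

    def dequantize(w_fp8, scale):
        # Classic path: multiply back to bf16/fp32 before the matmul.
        # The new mptc path skips this and does the matmul in fp8 directly.
        return w_fp8 * scale

    w = np.array([0.013, -0.25, 0.6, -1.1], dtype=np.float32)
    q, scale = quantize_scaled_fp8(w)
    w_hat = dequantize(q, scale)   # close to w, within ~6% relative error
    ```

    The speedup from fp8 tensor cores comes from removing that dequantize step: the matmul consumes the fp8 weights directly and applies the scale to the result.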

    Description

    mixed precision and fp8 tensor core (mptc).

    FAQ

    Comments (6)

    LyloGummy · Dec 11, 2025
    CivitAI

    Hi, many thanks for your work! If I may ask, would you consider converting NewBie AI as well? https://civitai.com/models/2197517

    reakaakasky
    Author
    Dec 11, 2025· 1 reaction

    Maybe. It needs to be supported officially by ComfyUI though. Right now it is impossible to convert...

    LyloGummy · Dec 11, 2025

    @reakaakasky Totally understandable, thank you for considering this!

    reakaakasky
    Author
    Dec 12, 2025

    @LyloGummy tbh I've already converted the NewBie DiT to standard scaled fp8 (no fp8 tensor core support). I just can't test it because it requires the latest ComfyUI to load the model.

    LyloGummy · Dec 12, 2025 · 1 reaction

    hey @reakaakasky, the original devs just posted a PR for comfy:

    https://github.com/comfyanonymous/ComfyUI/pull/11284

    I've switched to their PR on top of the latest ComfyUI version and NewBie is working for me; you might need to install flash-attn2:

    https://huggingface.co/ussoewwin/Flash-Attention-2_for_Windows/tree/main

    also custom nodes:

    https://github.com/NewBieAI-Lab/ComfyUI-Newbie-Nodes

    honestly can't believe you've already converted it, amazing work!

    reakaakasky
    Author
    Dec 12, 2025

    emmm... it seems the PR has a long way to go before it can even be considered. It's unacceptable tbh.

    Checkpoint
    Lumina

    Details

    Downloads
    320
    Platform
    CivitAI
    Platform Status
    Available
    Created
    12/10/2025
    Updated
    5/15/2026
    Deleted
    -

    Files

    netaLuminaFp8_ntymV4Mptc.safetensors

    Mirrors