    Nomos8kHAT-L_otf - v1.0

    Note: This upscaler is not mine

    Credit to Helaman

    Originally uploaded at: https://openmodeldb.info/models/4x-Nomos8kHAT-L-otf

    About version 2.0

    Everything is the same as with the first upload, but the model is converted to .safetensors. I had issues getting Forge and Automatic1111 to load the .safetensors version of the upscaler, but it works like a charm in ComfyUI.

    General info

    Hybrid Attention Transformer (HAT) combines channel attention and self-attention schemes and makes use of their complementary advantages. To enhance the interaction between neighboring window features, an overlapping cross-attention module is employed in HAT.

    Where does it go?

    To use this (and other HAT upscalers) with Automatic1111 or Forge, follow these steps:

    • Create a folder in \webui\models\ and name it HAT

    • Download the file either here or from the source

    • Place the file in \webui\models\HAT\

    • Restart your webui

    Note: If you have issues getting the model to work, change the file extension from .pt to .pth
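
    The steps above can be sketched as a few shell commands. This is a minimal illustration, not an official installer: it uses a throwaway directory so it can run anywhere, and the file name is a placeholder — substitute your actual webui install path and the file you downloaded.

    ```shell
    # Illustrative sketch of the install steps; WEBUI and the file name
    # are assumptions, replace them with your real paths.
    WEBUI="$(mktemp -d)/webui"

    # Step 1: create the HAT folder under \webui\models\
    mkdir -p "$WEBUI/models/HAT"

    # Step 2-3: place the downloaded file in the HAT folder
    # (a dummy file stands in for the real download here)
    touch "$WEBUI/models/HAT/4xNomos8kHAT-L_otf.pt"

    # If the webui will not load the model, rename .pt to .pth
    mv "$WEBUI/models/HAT/4xNomos8kHAT-L_otf.pt" \
       "$WEBUI/models/HAT/4xNomos8kHAT-L_otf.pth"

    # Step 4: restart your webui so it picks up the new upscaler
    ls "$WEBUI/models/HAT"
    ```

    On Windows the same layout applies with backslashes, e.g. \webui\models\HAT\.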
