Like my facesitting LoRA, this was trained on the exact same dataset as the Hunyuan version, using the exact same LoRA trainer with the same settings and script.
However, on WAN, this LoRA really, REALLY shines. On Hunyuan, the effect often wouldn't happen because the spatial awareness wasn't as good. On this one, it works REALLY well lol.
YOU DO NOT NEED FULL 1.0 STRENGTH! WAN LoRAs seem to do well even at lower values. I recommend 0.6-0.8 for this one, though 1.0 doesn't exactly break it either!
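For anyone curious why fractional strengths work: the strength value simply scales the low-rank delta the LoRA adds to each base weight, so 0.7 literally means "apply 70% of the learned change." A minimal NumPy sketch of that scaling (the shapes, `alpha`, and rank here are illustrative assumptions, not values from this LoRA):

```python
import numpy as np

# Illustrative dimensions, not the actual model's (assumption for the sketch)
rng = np.random.default_rng(0)
d, r = 8, 2                      # feature dim, LoRA rank
W = rng.standard_normal((d, d))  # frozen base weight
A = rng.standard_normal((r, d))  # LoRA down-projection
B = rng.standard_normal((d, r))  # LoRA up-projection
alpha = 2.0                      # LoRA alpha hyperparameter (assumed)

def apply_lora(W, A, B, alpha, strength):
    """Merge a LoRA delta into a base weight at a given strength."""
    rank = A.shape[0]
    return W + strength * (alpha / rank) * (B @ A)

W_full = apply_lora(W, A, B, alpha, 1.0)  # full strength
W_rec = apply_lora(W, A, B, alpha, 0.7)   # middle of the recommended 0.6-0.8

# The 0.7-strength delta is exactly 70% of the full-strength delta
assert np.allclose(W_rec - W, 0.7 * (W_full - W))
```

So dialing strength down just shrinks the effect linearly; it never changes the direction of the learned behavior, which is why 1.0 doesn't break things so much as overcook them.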
Description
A port of my Hunyuan LoRA, but it works way better on Wan. Like WAY better lol.
FAQ
Details
Downloads
570
Platform
SeaArt
Platform Status
Available
Created
3/6/2025
Updated
3/10/2025
Deleted
-
Files
Available On (1 platform)
