NEW MODEL UPDATE NOW WITH LOCON
PLEASE READ THE LAST PART OF THE DESCRIPTION FOR PROMPTING
OK guys, I spent two whole weeks improving this LoRA and it was worth it; I like the results now.
Updated to LoCon, so you will need the LoCon extension to load it like a normal LoRA: https://github.com/KohakuBlueleaf/a1111-sd-webui-locon
Training specifications:
Trained on Google Colab (https://github.com/Linaqruf/kohya-trainer) using selected cropped images from the manhwa. I separated them by character and tagged them by name with WD1.4, for a total of 750 images after pruning.
32 epochs
4 repeats
LyCORIS/LoCon
Resolution 768
Base model: Anime-FullFinal-Pruned
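For anyone who wants to reproduce a similar setup, the specs above roughly map to a kohya-trainer invocation like the sketch below. This is only an illustration under assumptions: the paths, output name, and batch size are placeholders I made up, and the 4 repeats are encoded in the dataset folder name per kohya's convention. Check the kohya-trainer notebook for the exact arguments it exposes.

```shell
# Hypothetical kohya sd-scripts command approximating the specs above.
# Paths, batch size, and output name are placeholders (assumptions).
# Repeats are set by the folder name: e.g. train_data/4_anna holds
# Anna's images with 4 repeats per epoch.
accelerate launch train_network.py \
  --pretrained_model_name_or_path="/models/Anime-FullFinal-Pruned.safetensors" \
  --train_data_dir="/data/train_data" \
  --output_dir="/output" \
  --output_name="KingdomStyle" \
  --resolution="768,768" \
  --max_train_epochs=32 \
  --network_module="lycoris.kohya" \
  --network_args "algo=locon"
```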
Generation Tips (Important):
If you want something close to those crazy-ratio tall pages from the manhwa, set the resolution to 512x768 and add to the prompt: Manhwa Panel, Webtoon Panel
A weight of 0.8-1 is recommended. To generate hentai, use the AnyHentai model https://civarchive.com/models/5706/anyhentai; it's simply the best for hentai, better than Grapefruit.
For SFW portraits use Counterfeit or any other model.
Use the characters' names (Anna, Sophie, Lina, Dia Song, Mia, Hyun). Sadly, Hyun doesn't generate well, but the girls have no problems.
You don't need to specify the preset girls' traits (hair color, eye color, hair length, breast size); these are defined by the character name alone. Only specify them if you want something different.
Add to the negative prompt: letterbox, cropped on sides, white bars on sides, speech bubble, text, censored, censorship. (Because of the dataset's extreme aspect ratios, I manually reduced the images to 768, which created crop bars; I tagged those bars in the captions, so this negative prompt keeps them out of your generations.)
Also, just for fun, I added the concepts face focus, ass focus, and pussy focus, to generate that iconic zoom from the manhwa.
It's recommended to use the prompt Kingdom Style for a stronger style effect; if you want less style, leave it out.
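The prompting tips above can be collected into a small helper that builds an AUTOMATIC1111-style prompt pair. This is only a sketch: the LoRA filename "KingdomStyle" in the `<lora:...>` tag is an assumption, so swap in whatever you named the downloaded file.

```python
# Sketch: assemble positive/negative prompts from the tips above.
# "KingdomStyle" as the LoRA filename is an assumption.
CHARACTERS = {"Anna", "Sophie", "Lina", "Dia Song", "Mia", "Hyun"}

NEGATIVE = ("letterbox, cropped on sides, white bars on sides, "
            "speech bubble, text, censored, censorship")

def build_prompt(character, weight=0.8, full_page=False, extra=""):
    """Return (positive, negative) prompt strings following the tips."""
    if character not in CHARACTERS:
        raise ValueError(f"unknown character: {character}")
    parts = [f"<lora:KingdomStyle:{weight}>", "Kingdom Style", character]
    if full_page:
        # Tall manhwa-page look; pair this with a 512x768 canvas.
        parts += ["Manhwa Panel", "Webtoon Panel"]
    if extra:
        parts.append(extra)
    return ", ".join(parts), NEGATIVE

positive, negative = build_prompt("Anna", weight=1.0, full_page=True)
print(positive)
# -> <lora:KingdomStyle:1.0>, Kingdom Style, Anna, Manhwa Panel, Webtoon Panel
```

Since the character name already carries hair, eyes, and body traits, `extra` is only for things you want to override.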
Comments (1)
The art style in this hentai webtoon is GODLY. It deserves better training.
The results from this Lora are subpar and you can tell that the author didn't properly crop or organize the training data. Garbage in, garbage out.