    NoobAI v-pred 1.0 with EQ-VAE - v0.1 - Usable
    NSFW

    A finetune aligning NoobAI v-pred 1.0 to Anzhc's EQ-VAE-B7, for basic usage and for further training with LoRAs or finetunes. Generations are less noisy thanks to cleaner latents at training time. The usual v-pred shenanigans still apply, but the model should be more resilient to colors exploding, and colors should be a bit cleaner overall compared to base v-pred.

    Does this behave like a slight aesthetic tuning? Yes, due to limited data. Get me a 5090 and I can do bigger finetunes instead of choking my 4090. This was trained for 370k steps on 1x4090; I slammed my head hard against the dataset and compute wall. Go donate to Anzhc for his efforts training the VAE.

    If you like it and see some future for it, donate to me on Ko-fi: this was trained at my own expense, and it took several days of attempts with single-GPU training on a 4090.


    Generation settings? Default v-pred. Just use the bundled VAE, or B7 from Anzhc's repo. Yes, he is banned; yes, I'll claim the NoobAI1.1EQ he did and the VAEs here on Civitai, as he has given me permission to claim those.
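    For diffusers users, wiring up "default v-pred" looks roughly like the sketch below. The checkpoint and VAE filenames are placeholders (use whatever you downloaded); the scheduler override is the standard way to run a v-prediction SDXL-family model, with zero-SNR rescaling as the usual companion setting.

    ```python
    # Sketch only: file paths are assumptions, not the actual release filenames.
    import torch
    from diffusers import StableDiffusionXLPipeline, AutoencoderKL, EulerDiscreteScheduler

    # Load the finetune from a single-file checkpoint.
    pipe = StableDiffusionXLPipeline.from_single_file(
        "noobai-vpred-1.0-eqvae.safetensors",
        torch_dtype=torch.float16,
    )

    # Swap in the EQ-VAE if your checkpoint does not already bundle it.
    pipe.vae = AutoencoderKL.from_single_file(
        "eq-vae-b7.safetensors", torch_dtype=torch.float16
    )

    # v-prediction needs the scheduler told about it explicitly.
    pipe.scheduler = EulerDiscreteScheduler.from_config(
        pipe.scheduler.config,
        prediction_type="v_prediction",
        rescale_betas_zero_snr=True,
    )

    pipe.to("cuda")
    image = pipe("1girl, masterpiece", num_inference_steps=28).images[0]
    ```

    UI frontends (ComfyUI, reForge, etc.) do the scheduler part for you when the checkpoint is detected as v-pred; there you only need to select the VAE.
    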

    Does this still leak random styles with some tokens due to natural-language captions? Yes, it fucking does. Fuck natural language with CLIP-L and CLIP-G.




    Some shitty comparisons
    Raw gen

    levels changed to expose the background noise, which remains very consistent

    Same image in base v-pred

    levels changed to expose the background noise, which remains very bad
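    The "levels changed" shots above can be reproduced with a simple levels stretch that remaps a narrow dark range to full range, exaggerating faint background noise. A minimal numpy sketch (the function name and thresholds are my own, not from the post):

    ```python
    import numpy as np

    def stretch_levels(img, black=0.0, white=0.25):
        """Map values in [black, white] to [0, 1], clipping the rest.

        Faint background noise near zero becomes clearly visible,
        which is how the comparison images expose latent noise.
        """
        out = (img - black) / (white - black)
        return np.clip(out, 0.0, 1.0)

    # Toy example: faint noise at 0.1 brightens to ~0.4 after the stretch.
    noise = np.full((4, 4), 0.1)
    print(stretch_levels(noise)[0, 0])
    ```

    Any image editor's levels tool (e.g. pulling the white point down) does the same thing.
    
    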

    Description

    Trained for ~370k total steps

    Batch Size=7
    Gradient Accumulation=4

    Full_BF16
    AdamW8bit Kahan Summation Optimizer
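    For anyone planning further training on top of this, the settings above imply the following effective batch size (simple arithmetic; whether the 370k figure counts steps before or after accumulation is not stated, so both bounds are shown):

    ```python
    # Settings as listed above.
    batch_size = 7
    grad_accum = 4
    steps = 370_000

    # Samples contributing to each optimizer update.
    effective_batch = batch_size * grad_accum  # 28

    # Rough bounds on total samples seen, depending on how "steps" was counted.
    samples_low = steps * batch_size       # if steps = forward passes
    samples_high = steps * effective_batch  # if steps = optimizer updates
    print(effective_batch, samples_low, samples_high)
    ```
    
    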