7PAG [bf16/fp16] [no-ema-no-vae] [SafeTensors] [Checkpoint]
=====================
Disclaimer:
I'm just a script kiddie. I have no idea what I'm doing.
So, keep calm and know nothing.
=====================
All the credit goes to LotusSnacks.
Description
7PA + Gape60 merged with Add Difference and MBW (Merge Block Weighted).
NSFW outputs are better than the base 7PA model's.
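For anyone curious what "Add Difference" means: each weight in the merge is roughly A + m * (B - C), where C is typically the common base both models were trained from. Below is a minimal sketch of that step, assuming PyTorch and safetensors are installed; the filenames and multiplier are made up, not my actual recipe, and MBW's per-block weights are left out for brevity.

    # Add-difference merge sketch: merged = A + m * (B - C), per tensor.
    # All filenames here are hypothetical.
    import torch
    from safetensors.torch import load_file, save_file

    a = load_file("7PA.safetensors")     # model A: the base you keep
    b = load_file("Gape60.safetensors")  # model B: has the traits to add
    c = load_file("sd15.safetensors")    # model C: what B was trained from

    m = 1.0  # strength of the added difference
    merged = {}
    for key, w in a.items():
        if key in b and key in c:
            merged[key] = (w + m * (b[key] - c[key])).contiguous()
        else:
            merged[key] = w  # keep A's weights where B or C lack the key

    save_file(merged, "7PAG.safetensors")

MBW then replaces the single m with a separate weight per UNet block, which is how the two techniques combine in this merge.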
Comments (3)
Which one is better for inference, bf16 or fp16? I googled it but only found that bf16 is more stable during training. Do older GPUs support bf16 checkpoints?
New GPUs: 30X0/40X0 (Ampere/Ada Lovelace) support BF16 natively, and yep, BF16 is more stable.
Old GPUs: 10X0/20X0 (Pascal/Turing) don't, so just use FP16.
For inference there is no significant difference between BF16 and FP16.
So, if you don't care, just use FP16, or let the code decide; see the sketch below.
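If you're scripting it, PyTorch can pick the dtype for you. A minimal sketch assuming torch and diffusers are installed; the checkpoint filename is made up.

    # Pick the best supported half-precision dtype (sketch, assuming PyTorch).
    import torch
    from diffusers import StableDiffusionPipeline

    # Ampere (30X0) and newer report native bfloat16 support; older cards don't.
    dtype = torch.bfloat16 if torch.cuda.is_bf16_supported() else torch.float16

    # Load the single-file checkpoint at that precision (filename is hypothetical).
    pipe = StableDiffusionPipeline.from_single_file(
        "7PAG.fp16.safetensors",
        torch_dtype=dtype,
    ).to("cuda")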
@LeaderThree Thanks a lot, that clears it up for me 👍