This model was trained on some very unexpected results that came up while testing prompts on different checkpoints.
Yes, it is supposed to look like that.
Porn_Error.safetensors