I merged the DPO UNet into my own base model. I am still testing it.
Description
This model is a merge of DPO (https://huggingface.co/mhdang/dpo-sdxl-text2image-v1) and another SDXL model I use.
It is still being tested and may change at any time.
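For reference, merging a UNet into a base model is usually done as a weighted average of the two checkpoints' weights. Below is a minimal sketch of such a linear merge; the function name `merge_state_dicts` and the `alpha` ratio are illustrative assumptions, not the exact recipe used for this model.

```python
import torch


def merge_state_dicts(base, other, alpha=0.5):
    """Linearly interpolate two state dicts: (1 - alpha) * base + alpha * other."""
    merged = {}
    for key, base_w in base.items():
        other_w = other.get(key)
        if other_w is not None and other_w.shape == base_w.shape:
            merged[key] = (1.0 - alpha) * base_w + alpha * other_w
        else:
            # Keys missing from (or mismatched in) the donor model keep the base weights.
            merged[key] = base_w.clone()
    return merged


# Toy example with dummy tensors standing in for UNet weights.
base = {"conv.weight": torch.zeros(2, 2)}
dpo = {"conv.weight": torch.ones(2, 2)}
merged = merge_state_dicts(base, dpo, alpha=0.5)
```

In practice the two state dicts would come from the real UNets (e.g. loaded with diffusers' `UNet2DConditionModel.from_pretrained`), and `alpha` controls how strongly the DPO behavior shows in the merged model.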
Details
Downloads: 1,154
Platform: SeaArt
Platform Status: Available
Created: 1/6/2024
Updated: 2/25/2024
Deleted: -
Available On (1 platform)
The same model is published on other platforms, which may have additional downloads or version variants.