Overview
This is a simple workflow designed to closely reproduce the inpainting features available in WebUI, and built specifically for use with my model, Utopian Inpainting.
Update: Changed SAM2 to SAM3.
Previous update: Added OpenPose, Detail Daemon, fill mask area, basic ADetailer, and more.
I realized that pure SDXL models work well as refiners. After testing several options, Epic Realism delivered good performance with both high-resolution and low-resolution images.
Set the sampler 2 configuration to "dpmpp_2m_sde" with the "karras" scheduler, and try adjusting the refiner ratio to around 0.6 to 0.8.
-Basic features-
Inpainting with a refiner model
Inpainting area select: full image or crop
Pre-resize image: scale by % or long-side resize
Basic KSampler and LanPaint support
Manual mask or SAM3 auto-masking select
Optional VAE and CLIP selector
Postprocess mask detailer (testing)
Upcoming update plans: outpainting support, more accurate auto-masking.
-How to use-
Resources you need:
Inpainting model: Utopian Pony Inpainting or any SDXL inpainting model
Refiner model: your favorite model
Optional Clip: Illustrious_base_clip_L and Illustrious_Base_CLIP_G
Settings:
[Optional loader]
Enable this if you want to substitute a different CLIP or VAE.
[Inpaint mode]
-Normal
Normal Inpainting.
-Refiner
Inpainting with refiner.
-Lanpaint
Use LanPaint. Its output differs from the KSampler's; it is difficult to say which is better.
[Resize method]
-percentage
Pre-resize the image by a scale percentage.
-Long side
Pre-resize the image based on the long edge.
[Inpaint area]
-Full
Inpaint at the pre-resized image size.
-Crop
Crop the pre-resized image according to the mask.
crop target size
Scales the size of the crop relative to the image:
Cropped_image = Original_img * Pre-resize * crop_target_size
crop mask extend
How far the crop extends beyond the mask.
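The crop-size arithmetic above can be sketched in a few lines (a hypothetical illustration only; `cropped_size` and its parameter names are made up for this example and are not actual node inputs):

```python
# Hypothetical sketch of the crop-size formula; all names are illustrative.
def cropped_size(original_px: int, pre_resize: float, crop_target_size: float) -> int:
    # Cropped_image = Original_img * Pre-resize * crop_target_size
    return round(original_px * pre_resize * crop_target_size)

# e.g. a 1024 px edge, pre-resized to 150%, with a crop target size of 0.5:
print(cropped_size(1024, 1.5, 0.5))  # 768
```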
[Mask method]
-Manual
Right-click on the load image and select "Open in mask editor" to manually draw a mask.
-Auto
A mask is generated automatically; use the mask prompt to define the area you want masked.
[Postprocess]
-After Detailer
Enhance the inpainted image by running a mild inpainting pass after it finishes.
-with refiner
Also use the refiner model during the detailer pass. (Using the refiner here doesn't perform well at the moment.)
Detailer denoise
Recommended 0.3 ~ 0.5
Detailer zoom
Set how much of the surrounding area is included when cropping.
The higher the value, the more area around the mask will be cropped.
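The zoom crop described here can be sketched roughly as expanding the mask's bounding box by the zoom factor before cropping (a hypothetical illustration, not the workflow's actual node code; all names are made up):

```python
# Hypothetical sketch: expand a mask bounding box by a zoom factor, clamped
# to the image bounds, so more surrounding context is included in the crop.
def zoom_crop_box(bbox, zoom, img_w, img_h):
    x0, y0, x1, y1 = bbox
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2          # center of the mask box
    w, h = (x1 - x0) * zoom, (y1 - y0) * zoom      # expanded width/height
    nx0 = max(0, int(cx - w / 2))
    ny0 = max(0, int(cy - h / 2))
    nx1 = min(img_w, int(cx + w / 2))
    ny1 = min(img_h, int(cy + h / 2))
    return nx0, ny0, nx1, ny1

# A 100x100 mask box at (100,100) with zoom 2.0 in a 512x512 image:
print(zoom_crop_box((100, 100, 200, 200), 2.0, 512, 512))  # (50, 50, 250, 250)
```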
Description
Changed SAM2 to SAM3.
I have confirmed that there is a compatibility issue with Transformers 5.
The model is downloaded to ComfyUI/models/sam3. It is 3.2 GB and may cause OOM if VRAM is low.
This is a simplified version, so it only responds to single words such as "hair." It does not support expressions such as "hair, boots" or "hair and boots."
Fixed an issue where the VAE was not switching correctly when loaded.
The inpaint model and refiner model must use compatible VAEs.
FAQ
Where is the denoise setting?
Isn't it in the seed column?
@habibing How do you adjust outpainting in your workflow? For example, starting from 512x512, I want to create a 4:3 aspect ratio.
It's nearly impossible to output an image to an exact aspect ratio. The SDXL VAE requires both height and width to be multiples of 16, and it's too complicated to calculate how far to stretch the image during outpainting so that it meets a specific aspect ratio while remaining a multiple of 16.
First, your 512x512 image will be rescaled by specifying the long edge or a scale factor. SDXL does not support sizes below 1024, so let's assume you specified 1024 as the long edge. If you set the left outpaint slider to 0.2, the left side of the image will be extended by 1024*0.2, meaning the image width becomes 1024*1.2 (about 1228), for a final size of 1228 wide by 1024 high.
If you want a 4:3 image, the width needs to be about 1365 based on the height of 1024 and the 4:3 ratio. Dividing 1365 by the width of 1024 gives 1.3330078125, so the left and right sliders together need to add up to around 0.33.
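That slider arithmetic can be sketched in a few lines (a rough illustration; `outpaint_slider_total` is a made-up helper, not part of the workflow):

```python
# Hypothetical helper: how much the left/right outpaint sliders must add up to
# so that a square image of a given height reaches a target aspect ratio.
def outpaint_slider_total(height: int, aspect_w: int, aspect_h: int) -> float:
    target_width = height * aspect_w / aspect_h   # e.g. 1024 * 4/3 ~ 1365
    extra = target_width - height                 # pixels to add via outpainting
    return extra / height                         # sliders are fractions of width

total = outpaint_slider_total(1024, 4, 3)
print(round(total, 2))  # 0.33
```

Note this ignores the multiple-of-16 rounding mentioned above, which is why the final image only approximates the target ratio.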
Tried to install it on a stock ComfyUI (downloaded the JSON and dropped it in). I'm getting some errors about unknown nodes.
In addition, my node manager gives an info line (but without context): "Some extensions are disabled due to incompatibility with your current setup [....]". It's the Impact Subpack and BrushNet that have these issues.
Does this mean that some nodes can't be installed from the Manager? In that case, you can get the zip file directly from GitHub and extract it into ComfyUI\custom_nodes.
Go to the GitHub page, click the green "Code" button, and download it via "Download ZIP".
Since these are small nodes, I don't think there will be any dependency issues.
ComfyUI-Impact-Subpack
ComfyUI-BrushNet
Or you may be able to find them by setting the channel to "dev" in the Manager settings.
For example, if you manually install the Impact Subpack, the file structure will look like this:
ComfyUI/custom_nodes/ComfyUI-Impact-Subpack-main
