Notice
This workflow is no longer state of the art; please use FLUX.1 Fill and the official ComfyUI workflows for your inpainting and outpainting needs. Details below:
Black Forest Labs has since released the official FLUX.1 Tools: https://blackforestlabs.ai/flux-1-tools/
The tools include the FLUX.1 Fill [dev] model, which is better than the alimama controlnet used in this workflow.
ComfyUI has native support for the FLUX.1 Tools: https://blog.comfy.org/day-1-support-for-flux-tools-in-comfyui/. You should use it for your inpainting and outpainting needs.
You can grab the official ComfyUI inpaint and outpaint workflows from https://comfyanonymous.github.io/ComfyUI_examples/flux/ by downloading the example images (the workflow is embedded in each image) and opening them in ComfyUI.
If you have low VRAM, you can download the FP8 version of the FLUX.1 Fill [dev] model here: https://civarchive.com/models/969431/flux-fill-fp8
This workflow will no longer be updated, because there is nothing I can add to the official ComfyUI inpaint and outpaint workflows.
Introduction
The latest version of this workflow uses
alimama-creative/FLUX.1-dev-Controlnet-Inpainting-Beta
alimama-creative/FLUX.1-Turbo-Alpha
to achieve 8-step inpainting and outpainting within the same workflow.
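The workflow itself is built in ComfyUI, but the same combination can be sketched with diffusers. A minimal sketch, assuming diffusers >= 0.31 (which ships FluxControlNetInpaintPipeline); the prompt, file names, and sampler settings below are illustrative, not the workflow's exact values:

```python
# Minimal sketch: FLUX.1-dev + alimama inpainting controlnet + Turbo LoRA
# for 8-step inpainting. Assumes diffusers >= 0.31; values are illustrative.
import torch
from diffusers import FluxControlNetInpaintPipeline, FluxControlNetModel
from diffusers.utils import load_image

controlnet = FluxControlNetModel.from_pretrained(
    "alimama-creative/FLUX.1-dev-Controlnet-Inpainting-Beta",
    torch_dtype=torch.bfloat16,
)
pipe = FluxControlNetInpaintPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    controlnet=controlnet,
    torch_dtype=torch.bfloat16,
).to("cuda")

# The Turbo LoRA is what brings the step count down to 8.
pipe.load_lora_weights("alimama-creative/FLUX.1-Turbo-Alpha")

image = load_image("input.png")   # hypothetical input image
mask = load_image("mask.png")     # hypothetical mask; white = area to repaint

result = pipe(
    prompt="a red brick wall",
    image=image,
    mask_image=mask,
    control_image=image,
    num_inference_steps=8,        # enabled by the Turbo LoRA
    guidance_scale=3.5,
    controlnet_conditioning_scale=0.9,
).images[0]
result.save("output.png")
```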
Models
FLUX.1-Turbo-Alpha.safetensors (models/loras/flux) https://huggingface.co/alimama-creative/FLUX.1-Turbo-Alpha
FLUX.1-dev-Controlnet-Inpainting-Beta-fp8.safetensors (models/controlnet) https://huggingface.co/alimama-creative/FLUX.1-dev-Controlnet-Inpainting-Beta
Download diffusion_pytorch_model.safetensors, rename the file, and use Kijai's script (https://huggingface.co/Kijai/flux-fp8/discussions/7#66ae0455a20def3de3c6d476) to convert it to FP8 so that it fits into 16GB VRAM (see the sketch after this list).
flux1-dev-fp8-e4m3fn.safetensors (models/diffusion_models/flux): https://huggingface.co/Kijai/flux-fp8/tree/main
t5xxl_fp8_e4m3fn_scaled.safetensors (models/clip): https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main
ViT-L-14-BEST-smooth-GmP-TE-only-HF-format.safetensors (models/clip): https://huggingface.co/zer0int/CLIP-GmP-ViT-L-14/tree/main
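The FP8 conversion mentioned above is essentially a dtype cast over the checkpoint. A minimal sketch following the idea behind Kijai's linked script, assuming torch >= 2.1 and a recent safetensors; the file names are examples:

```python
# Minimal FP8 conversion sketch (idea from Kijai's linked script).
# File names are examples; adjust to where you downloaded the weights.
import torch
from safetensors.torch import load_file, save_file

src = "diffusion_pytorch_model.safetensors"
dst = "FLUX.1-dev-Controlnet-Inpainting-Beta-fp8.safetensors"

state_dict = load_file(src)
fp8_state_dict = {
    # Cast floating-point tensors to FP8 (e4m3fn); leave others untouched.
    k: v.to(torch.float8_e4m3fn) if v.is_floating_point() else v
    for k, v in state_dict.items()
}
save_file(fp8_state_dict, dst)
```

Place the resulting file in models/controlnet.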
Custom Nodes
segment anything (if you have > 16GB VRAM and want to use automatic segmentation)
Various ComfyUI Nodes by Type
KJNodes for ComfyUI
Comments (6)
To anyone else struggling with recreating this:
1) You need to put the inpaint checkpoint "big-lama.pt" into a new folder named "inpaint" in the "\ComfyUI\models\" folder (see the sketch after this list). Download link:
https://huggingface.co/spaces/aryadytm/remove-photo-object/blob/f00f2d12ada635f5f30f18ed74200ea89dd26631/assets/big-lama.pt
2) This requires additional nodes. Install these:
https://github.com/WASasquatch/was-node-suite-comfyui
https://github.com/Acly/comfyui-inpaint-nodes
https://github.com/ltdrdata/ComfyUI-Impact-Pack
https://github.com/jamesWalker55/comfyui-various
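If you would rather script step 1, here is a minimal sketch using huggingface_hub; the destination path is an example, adjust it to your install:

```python
# Sketch for step 1: fetch big-lama.pt from the linked Space and copy it
# into ComfyUI/models/inpaint. Destination path is an example.
import shutil
from pathlib import Path
from huggingface_hub import hf_hub_download

cached = hf_hub_download(
    repo_id="aryadytm/remove-photo-object",
    repo_type="space",
    filename="assets/big-lama.pt",
    revision="f00f2d12ada635f5f30f18ed74200ea89dd26631",
)
dest = Path("ComfyUI/models/inpaint")
dest.mkdir(parents=True, exist_ok=True)
shutil.copy(cached, dest / "big-lama.pt")
```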
Good luck, have fun!
Thanks for the detailed instructions on how to get the workflow going. I hope you had good outpainting results with the workflow.
@PixelMuseAI I did! Thanks a lot for sharing this workflow!
ControlNet.__init__() got an unexpected keyword argument 'device'
@marcelocarvalhoa2107 Which version of the workflow are you using? Try updating your ComfyUI.
Which model should I put in the controlnet name?

