This workflow integrates IPAdapter and ControlNet into FLUX
Description
Instead of the SDXL ControlNet and IPAdapter, it uses the new FLUX IPAdapter and ControlNet.
FAQ
Comments (12)
Not working. My finished image is just super pink noise. All nodes run; the model is dev FP8.
Same here! I switched from the big model down to dev-fp8, but it's not working: out of memory error. What a waste of time...
Any ideas on why this error is happening? It's got something to do with the IPAdapter. When I bypass it, everything works.
!!! Exception during processing !!! mat1 and mat2 shapes cannot be multiplied (1x1024 and 768x16384)
Traceback (most recent call last):
File "J:\CurrentAi\StableComfyD\ComfyUI_windows_portable\ComfyUI\execution.py", line 316, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
Incompatible CLIP ViT. Try CLIP-ViT-L/14, a.k.a. clip-vit-large-patch14.
It solved this error for me.
source:
https://www.reddit.com/r/comfyui/comments/1f9mxji/flux_ip_adapter_error_every_time_mat1_and_mat2/
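The shape mismatch in the error is easy to reproduce: CLIP-ViT-H/14 produces 1024-dim image embeddings while CLIP-ViT-L/14 produces 768-dim ones, and the FLUX IP-Adapter projection expects the 768-dim input. A minimal sketch (the projection matrix here is a zero-filled stand-in, not the real weights):

```python
import numpy as np

# Stand-in for the IP-Adapter's projection weight: expects 768-dim input,
# matching the (768, 16384) shape from the error message.
proj = np.zeros((768, 16384), dtype=np.float32)

vit_h_embed = np.zeros((1, 1024), dtype=np.float32)  # ViT-H output (wrong encoder)
vit_l_embed = np.zeros((1, 768), dtype=np.float32)   # ViT-L output (expected encoder)

try:
    vit_h_embed @ proj  # same mismatch as "(1x1024 and 768x16384)"
except ValueError as e:
    print("ViT-H fails:", e)

out = vit_l_embed @ proj
print("ViT-L works, output shape:", out.shape)  # (1, 16384)
```

So the pink noise / matmul crash is not a VRAM or workflow bug; it just means the wrong CLIP Vision encoder is loaded.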
@rns96 This fixed the same error I had: wrong CLIP Vision model.
Direct Link to Model https://huggingface.co/openai/clip-vit-large-patch14/tree/main
@CManzione hi man, thank you, but which of all the files do I need to download?
@sergiogmx model.safetensors, you need to rename it to clip-vit-large-patch14 and place it into \ComfyUI\models\clip_vision
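The steps above can be sketched as a short shell snippet. The destination path assumes a default ComfyUI layout, and the renamed filename follows the comment above (keeping the .safetensors extension is an assumption); adjust both to your install:

```shell
# Fetch CLIP-ViT-L/14 vision weights into ComfyUI's clip_vision folder.
MODEL_URL="https://huggingface.co/openai/clip-vit-large-patch14/resolve/main/model.safetensors"
DEST_DIR="ComfyUI/models/clip_vision"            # assumed default layout
DEST_FILE="$DEST_DIR/clip-vit-large-patch14.safetensors"

mkdir -p "$DEST_DIR"
# Skip the ~1.7 GB download if the file is already in place.
[ -f "$DEST_FILE" ] || wget -O "$DEST_FILE" "$MODEL_URL"
```

Then select clip-vit-large-patch14 in the workflow's CLIP Vision loader node.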
The error occurred: mat1 and mat2 shapes cannot be multiplied (1x1024 and 768x65536)
Error when using IPAdapter:
XlabsSampler
Allocation on device
Change the clip_vision model: https://hf-mirror.com/openai/clip-vit-large-patch14/tree/main
It's not working! I'm getting an out-of-memory error on my 5070 Ti with 16 GB of VRAM. I switched down to the smaller model "flux1-dev-fp8.safetensors", but got the same error. I also tried 512x512 and 20 steps instead of 50, but it's still not working. If 16 GB is not enough and we need 4090 or 5090 cards for this workflow, you really should say so in the description! Otherwise it's just a waste of time. It took me 3 hours to get everything working, only to find that the memory isn't enough. Thanks!

