This is not a model and has no preview. You will see the speed change in the console after installation.
Nvidia CUDA 12.2 libs: increases processing speed by ~5-10%.
Supports: SDXL, 1.5
Updated for CUDA 12.2.2 and cuDNN 8.9.5.29 (!) (09.2023)
Add and replace the files in the following directories:
AUTOMATIC1111 - webui\venv\Lib\site-packages\torch\lib
ComfyUI - python_embedded\Lib\site-packages\torch\lib
InvokeAi - InvokeAi\.venv\Lib\site-packages\torch\lib (for cuDNN only)
(You can make a backup before replacing and compare speeds before and after.)
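The backup step above can be scripted. A minimal sketch in Python: the `TORCH_LIB` path and the `backup_dlls` helper are mine, not part of this package; point the path at your own install from the directory list above before running.

```python
import shutil
from pathlib import Path

# Hypothetical path -- adjust to your own install (see the directory
# list above for AUTOMATIC1111 / ComfyUI / InvokeAI).
TORCH_LIB = Path(r"webui\venv\Lib\site-packages\torch\lib")


def backup_dlls(lib_dir: Path, backup_dir: Path, pattern: str = "*.dll") -> list:
    """Copy every DLL matching `pattern` into backup_dir before replacing anything."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for dll in sorted(lib_dir.glob(pattern)):
        copied.append(Path(shutil.copy2(dll, backup_dir / dll.name)))
    return copied


if __name__ == "__main__":
    if TORCH_LIB.is_dir():
        saved = backup_dlls(TORCH_LIB, TORCH_LIB.parent / "lib_backup")
        print(f"backed up {len(saved)} files")
    else:
        print(f"adjust TORCH_LIB first (no directory at {TORCH_LIB})")
```

If the new files cause problems (as one commenter reports below), copy everything back from the backup directory to restore the original state.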
Comments (9)
Will this work on ComfyUI?
I haven't tested this. You can try to find the dll files and replace them. (Before that, make a backup copy.) Most likely they are also in the torch directory.
@WebH Same place (just embedded), and so far it seems to be running fine. Thanks.
I'm assuming "ComfyUI_windows_portable > python_embedded > Lib> site_packages > torch > lib" is the necessary location?
@Kaladae maybe, let us know if that works.
@EricRollei21 It did not, because one of the files was "*_12" instead of "*_11", and mixing and matching the files did not seem to help either, as it refused to load. I had to restore the original versions of the files. Also, when I tried to follow the update procedure for a conda torch update, it uninstalled CUDA altogether and went straight to CPU.
That was with the original files, so this update package had nothing to do with that last part. The whole process in and of itself, NOT RELATING TO THIS FILE, is a convoluted mess.
@Kaladae Hey, thank you for trying and reporting back. I was too scared to try it since I hate it when my ComfyUI breaks. Sorry yours got messed up, and I hope you got it back. I need to change my backup from weekly to daily, I guess, so I can try stuff like this. On a related topic, I've been reading that the latest Nvidia GPU firmware is slower for AI gen than a couple of revisions back. I thought it was slower after updating to the latest, but I'm not sure whether I want to try reverting.
I have no problem running this with my ComfyUI in a venv environment. Works a charm... thanks.
xformers version: 0.0.21
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3060 : native
@EricRollei21 It wasn't difficult to fix. I'm not entirely sure whether I'm running CPU or CUDA right NOW after the last conda updates, but there's no noticeable difference. Despite having a 3080, SOMETHING seems off with my setup, as I'm only getting 1-3 it/s, which seems low, but it doesn't bother me much. I also do a lot of upscaling, and that might have something to do with it? I'm not sure what I should expect, honestly. I'm still really new to all this!
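A quick way to settle the CPU-vs-CUDA question from the comment above is to ask torch directly. A minimal sketch, run inside the webui's own Python environment; the `cuda_report` helper is mine, the torch calls are standard:

```python
def cuda_report() -> dict:
    """Collect what this environment's torch install reports about CUDA/cuDNN.

    Returns {"torch": None} when torch is not installed at all.
    """
    try:
        import torch
    except ImportError:
        return {"torch": None}
    info = {
        "torch": torch.__version__,
        "cuda_available": torch.cuda.is_available(),
        "cudnn_version": torch.backends.cudnn.version(),  # None on CPU-only builds
    }
    if info["cuda_available"]:
        info["device"] = torch.cuda.get_device_name(0)
    return info


if __name__ == "__main__":
    for key, value in cuda_report().items():
        print(f"{key}: {value}")
```

If `cuda_available` comes back False (or `cudnn_version` is None) on a machine with an Nvidia GPU, torch has fallen back to the CPU-only build and the DLL swap described above will have no effect.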
@Piscabo Glad it works for you! I'd hate for this to not be beneficial for anyone at all. Apparently there may be something funky with my setup.

