This is amazing thank you so much! Hard to find a super updated ControlNet tutorial
@marketforcruz Thank you very much, I'm happy the video is useful!
your accent is amazing 🤩
I'm happy you enjoyed the video😀
Hello, I guess that like all SD hands videos, this only works where the hands are very visible and in an easy position. If the hands have fingers touching each other, it still won't work, right?
@ffwrude Hi, indeed fingers touching each other are more difficult and push especially OpenPose to its limits. In fact SD can generate even such hands in good quality, it's just less likely. So I would inpaint as described and generate several batches. Another option is to take a picture of your hands (or a template you have generated) in the desired position and use depth or canny, which increases the chances of good hands a lot.
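The hand-photo-as-template idea can be scripted against the Automatic1111 API. A minimal sketch of the txt2img payload with one ControlNet depth unit, assuming the field names documented for the sd-webui-controlnet extension's API and an SD1.5 depth model filename; the exact model name on your system may differ:

```python
import base64

def build_controlnet_payload(prompt, image_bytes,
                             module="depth_midas",
                             model="control_v11f1p_sd15_depth"):
    """Build a txt2img payload for the Automatic1111 API with one
    ControlNet unit, using a hand photo as a depth template."""
    # The API expects the control image as base64 inside the JSON body.
    image_b64 = base64.b64encode(image_bytes).decode("utf-8")
    return {
        "prompt": prompt,
        "steps": 25,
        "batch_size": 4,  # several batches raise the odds of good hands
        "alwayson_scripts": {
            "controlnet": {
                "args": [{
                    "input_image": image_b64,
                    "module": module,  # preprocessor; canny also works here
                    "model": model,    # must match the checkpoint family
                    "weight": 1.0,
                }]
            }
        },
    }
```

The payload would then be POSTed to the webui's `/sdapi/v1/txt2img` endpoint; swapping `module`/`model` to a canny pair uses the same structure.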
a few questions:
I noticed a considerable decrease in quality using ControlNet with the models that are commonly used with SDXL (e.g. the ones you used) in comparison with SD1.5 ControlNet models. Is there a way to rectify that?
Second, SDXL ControlNet models take considerably longer to generate than their SD1.5 counterparts. Any fixes for that?
@yumiibrahim5513 Well spotted. SDXL allows higher resolutions and shorter prompts, but that comes at a price. As I said in the video, there is no official ControlNet repository for SDXL, so the models are customized accordingly. I also noticed the difference in quality, but I can't prove it; the running times are clearly too long for me to run longer test series.
Which brings us to your second point. I still have to generate a lot of images to get satisfactory results, and that takes too long for me with SDXL. So I use SD1.5 (usually with ControlNet) and upscale as described in my other video. The result can then be used as a template for SDXL with depth or canny if I really need it.
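One practical detail of that workflow is matching resolutions: SD1.5 works best around 512px while SDXL is trained near 1024px, so the SD1.5 result should be upscaled before serving as a depth/canny template. A small sketch of that arithmetic, assuming the common requirement that dimensions be a multiple of 8:

```python
def sdxl_template_size(width, height, target_long_edge=1024, multiple=8):
    """Scale an SD1.5 result so its long edge matches SDXL's native
    ~1024px, rounding each side to a multiple of 8."""
    scale = target_long_edge / max(width, height)
    snap = lambda v: int(round(v * scale / multiple) * multiple)
    return snap(width), snap(height)
```

For example, a 512x768 portrait from SD1.5 maps to 680x1024 as an SDXL template, keeping the aspect ratio almost exactly.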
@@NextTechandAI thank you for your response. Good idea to use 1.5 then use the result as a template. Thank you.
@yumiibrahim5513 Thanks for your feedback :)
Hi, my ControlNet doesn't work with SD1.5. Can you upload your SD1.5 setup with ControlNet installed, without the models, just the files? Please.
@vargaalexander What exactly is the problem? The video is about Automatic1111 and SDXL.
@@NextTechandAI Error running process: F:\ST\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
  File "F:\ST\stable-diffusion-webui\modules\scripts.py", line 825, in process
    script.process(p, *script_args)
  File "F:\ST\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1222, in process
    self.controlnet_hack(p)
  File "F:\ST\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 1207, in controlnet_hack
    self.controlnet_main_entry(p)
  File "F:\ST\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 898, in controlnet_main_entry
    Script.check_sd_version_compatible(unit)
  File "F:\ST\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\controlnet.py", line 827, in check_sd_version_compatible
    raise Exception(f"ControlNet model {unit.model}({cnet_sd_version}) is not compatible with sd model({sd_version})")
Exception: ControlNet model control_v11p_sd15_seg [e1f51eb9](StableDiffusionVersion.SD1x) is not compatible with sd model(StableDiffusionVersion.SDXL)
@vargaalexander I pointed this out in the video. You can only combine SDXL with SDXL and SD1.5 with SD1.5. You are using an SDXL checkpoint with an SD1.5 ControlNet model. That does not work.
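The traceback above shows exactly this rule being enforced in `check_sd_version_compatible`. A simplified sketch of that check (function behavior inferred from the traceback's error message; the real extension compares richer version objects):

```python
def check_sd_version_compatible(checkpoint_version, controlnet_version):
    """The ControlNet model family must match the loaded checkpoint
    family: SDXL with SDXL, SD1.5 with SD1.5."""
    if checkpoint_version != controlnet_version:
        raise ValueError(
            f"ControlNet model ({controlnet_version}) is not compatible "
            f"with sd model ({checkpoint_version})"
        )
    return True
```

So with an SDXL checkpoint loaded, a `control_v11p_sd15_*` model (SD1x family) always fails this check; the fix is picking a ControlNet model built for SDXL.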
@@NextTechandAI Wow, thanks, I didn't know that my checkpoint was SDXL... THANKS a LOT
Thanks for confirming the solution.
Can I use it for Pony?
I haven't tried it yet, but I don't know of any reason against it.
comfy?
@yiluwididreaming6732 The same ControlNet options can be used with ComfyUI, but in the video I used Automatic1111 because it is a bit more common - or what's your question?
@@NextTechandAI Sorry, just asking about a ComfyUI workflow for this? Thanks.