ComfyUI - Learn how to generate better images with Ollama | JarvisLabs
- Published: 12 Apr 2024
- In this video we learn how to use the power of LLMs, via Ollama and the Comfy_IF_AI nodes, to generate better images. Vishnu also walks us through setting up Ollama on JarvisLabs instances.
Workflow : github.com/jarvislabsai/comfy...
Check out Ollama: ollama.com/
Check out our ComfyUI basics playlist: • ComfyUI - Getting star...
Check out our socials:
Website: jarvislabs.ai/
Discord: / discord
X: / jarvislabsai
LinkedIn: / jarvislabsai
Instagram: / jarvislabs.ai
Medium: / jarvislabs
Connect with Vishnu:
X: / vishnuvig
LinkedIn: / vishnusubramanian
This is really amazing !! Great work guys !
Awesome. Thanks.
Hi Vishnu, great video
Thanks Rishab
Phenomenal
Thanks :)
Is this what they call a magic prompt, where the Ollama model refines the user's prompt?
I cannot get this to run on my laptop; it fails to load into ComfyUI with both the Manager and a manual install. Comfy is fully updated, and I also ran the txt file to install all the required files, but it still fails. Any idea why? I am running a 2024 Asus ROG Strix with 64 GB of RAM and a 4090 with 16 GB of VRAM. I have all the requirements needed for AI generation.
Did you try checking the error log to narrow down the issue?
Hello sir... does this IF_AI node take a lot of time? For me it's taking like 15 minutes to load for every queue... using an RTX 3060.
It depends on what model you choose. Also try running ollama directly and see how fast it is.
@@JarvislabsAI Thanks, let me try that
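The reply's suggestion, comparing how fast different models respond by calling Ollama directly, can be sketched as a small timing harness. This is a minimal illustration: the commented-out Ollama client call (and the `phi3` model name) are assumptions, not something shown in the video.

```python
import time

def time_generation(generate_fn, prompt):
    # Time a single prompt -> text call; returns (seconds, output).
    start = time.perf_counter()
    text = generate_fn(prompt)
    return time.perf_counter() - start, text

# Stand-in generator so the sketch runs anywhere. With the real Ollama
# Python client you might instead pass something like:
#   lambda p: ollama.generate(model="phi3", prompt=p)["response"]
# (call and model name are assumptions, not from the video).
elapsed, text = time_generation(str.upper, "a cat in a hat")
print(f"{elapsed:.3f}s -> {text}")
```

Swapping different model names into the same harness makes it easy to see whether the 15-minute queue time comes from the model choice or from the node setup.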
Hey, this looks great, but I have a question: how much does it cost to generate these images?
This is all open-source software, so there is not much cost associated with it. If you need a GPU and the base software set up, then you pay for the compute. Pricing starts at $0.49 an hour, and the actual billing happens per minute. jarvislabs.ai/pricing
lol
@@goodchoice4410 why lol?
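The per-minute billing described in the reply above is simple arithmetic. A minimal sketch, using only the $0.49/hr rate quoted; the function name and rounding are illustrative, not a JarvisLabs API:

```python
def session_cost(minutes, hourly_rate=0.49):
    # Per-minute billing: charge the fraction of the hourly rate used.
    return round(minutes * hourly_rate / 60, 4)

print(session_cost(10))   # 10-minute session -> 0.0817
print(session_cost(90))   # 1.5-hour session  -> 0.735
```

So a quick ten-minute image-generation session costs well under ten cents at the starting rate.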
How do i get the Load Checkpoint?
You can double-click and search for the Load Checkpoint node. If you want the checkpoint model, you can download it from this link: huggingface.co/RunDiffusion/Juggernaut-XL-v8/tree/main
Thank you!@@JarvislabsAI
Ollama is super slow; I would like a faster version using LM Studio or similar. Thanks
Noted!
Slow or fast, doesn't that depend on which model you are using? phi3 in Ollama is blazing fast.
Hi, thank you very much, great tutorial ❤
Thanks for creating the node, waiting for your future works 😊
@@JarvislabsAI I made a big update, please check it out, along with my other nodes for talking avatars 😉, and thanks again for the tutorial ❤️
@@impactframes Sure, we will look into it 🙌
@@JarvislabsAI thank you :)
The topic is interesting, but (in common with most YouTube Comfy experts) the whole presentation is confusing for the 90% of the audience that has just stumbled upon this. I think to be more successful you need to be clearer about what you want to achieve and why it's a good idea. Explain how JarvisLabs fits into this, and make it clear what resources need to be downloaded, and exactly how, in the least problematic way.
I don't want to appear too negative; of course you are trying to be helpful. I'm just offering some tips on how to improve your presentation and, hopefully, increase your subscriber numbers as a result.