ComfyUI (Stable Diffusion UI), Quick Start with local installation on Linux

  • Published: 3 Oct 2024
  • ComfyUI is a web interface for Stable Diffusion that uses a node system for building complex workflows; this is an intro video on getting started with it on Linux, although it requires really powerful hardware
    ComfyUI on GitHub
    github.com/com...
    Civit AI
    civitai.com/
    Stable Diffusion Samples article
    stable-diffusi...
  • Science

Comments • 30

  • @mii_beta
    @mii_beta  8 months ago +6

    i really can't estimate my video timings anymore.. it was supposed to be so super short :( but i'll make it up to you, will do one video on creating UIs and one on creating consistent images; with renting GPUs obviously coz sooo poor also :p

  • @ArztvomDienst
    @ArztvomDienst 4 months ago +3

    Clicked for the tutorial, found an emanation of wisdom delivered by the voice of an angel.

  • @romayojr
    @romayojr 2 months ago +1

    never seen anything like you on yt before but it totally worked on wsl 2 using nvidia 3080 - thanks for the tutorial! my image took 10 seconds to generate btw :)

  • @aleksandrG
    @aleksandrG 8 months ago +3

    Well, I downloaded it and then spent full night tinkering with it. Never thought AI image generation could be so fun! Thank you for the video, will wait for the next tutorials from you.

  • @tekakutli
    @tekakutli 8 months ago +3

    try out the lcm sampler, you will need the lcm lora and set the cfg to 1; then you will only need 4 steps
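    (A minimal sketch of that recipe outside ComfyUI, using the diffusers library instead of sampler nodes; the model and LoRA repo IDs below are illustrative examples, not anything from the video:)

    ```python
    import torch
    from diffusers import StableDiffusionPipeline, LCMScheduler

    # Load an SD 1.5 checkpoint (example repo ID).
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # Swap in the LCM scheduler and load the matching LCM LoRA.
    pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)
    pipe.load_lora_weights("latent-consistency/lcm-lora-sdv1-5")

    # With LCM, cfg around 1 and only ~4 steps are needed.
    image = pipe("a cozy cabin in the woods, watercolor",
                 num_inference_steps=4, guidance_scale=1.0).images[0]
    image.save("lcm_test.png")
    ```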

  • @PavewayIII-gbu24
    @PavewayIII-gbu24 5 months ago +2

    That was philosophical bro but exactly what I was thinking the other day. If our perception of time can slow down does that mean we can have a relatively infinite last thought?

  • @modernsolutions6631
    @modernsolutions6631 8 months ago +3

    Me, who uses Rocky Linux, had to install python3.11 stand-alone anyway and mess with environments. 😢
    Also, CFG controls how closely it follows the positive prompt, with the negative prompt acting like electrical ground. That's why if you run with cfg=0.0 and something in the negative prompt, you get roughly what the negative prompt described. The positive prompt is completely ignored at cfg=0.0 but the negative prompt still matters.
    Also what intro?
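    (A toy sketch of how classifier-free guidance combines the two noise predictions, in plain Python; the names and numbers are made up for illustration, not ComfyUI internals:)

    ```python
    import numpy as np

    def cfg_combine(noise_neg, noise_pos, cfg):
        # Start from the negative/empty-prompt prediction and move toward
        # the positive-prompt prediction, scaled by cfg.
        return noise_neg + cfg * (noise_pos - noise_neg)

    # Toy arrays standing in for the model's two noise predictions.
    noise_neg = np.array([0.2, -0.1, 0.4])   # prediction for the negative (or empty) prompt
    noise_pos = np.array([0.5,  0.3, -0.2])  # prediction for the positive prompt

    print(cfg_combine(noise_neg, noise_pos, cfg=7.5))  # normal guidance
    print(cfg_combine(noise_neg, noise_pos, cfg=0.0))  # equals noise_neg: at cfg=0.0
                                                       # only the negative prompt matters
    ```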

    • @mii_beta
      @mii_beta  8 months ago +1

      first time i hear someone using Rocky on this channel!
      ( gonna check that cfg trick!!)

    • @modernsolutions6631
      @modernsolutions6631 8 months ago

      Basically the negative prompt wasn't intended to be used. Its use and spread are entirely a cargo cult.
      The algorithm was intended to be used like this:
      Take the noisy picture. Denoise with the prompt.
      Take a copy of the noisy picture. Denoise without a prompt.
      Compare both results weighted by cfg using the sampler.
      What negative prompts do is compare not against an empty prompt but against the noise guess made from the "negative prompt".
      So negative prompts are not negated in embedding space, they are subtracted in the image denoising latent space. What you are actually doing with negative prompts is messing with the GPS of the denoiser based on something you copy-pasted from someone on the internet, instead of using the AI as it was trained.
      Yes, it gives you more of an illusion of control, but not if you just copy-paste stuff from the internet. 😂
      Something far more interesting IMHO is working with changing the positive prompt. Certain things like hair color, eye color and so on usually settle in a different order. Like you can take the seed and prompt for a green-haired girl, swap out the prompt for a red-haired girl mid-diffusion, and end up with a red-haired girl that has exactly the same pose and look as the green-haired girl did.
      Experimenting with that requires more compute though. Although there is a trick with the advanced generation nodes: you can set it up to divide your sampling into multiple steps and reuse the old result (if the seed and prompt stay the same), and end up with an image identical to single sampling performed by a single node. Let me see if I can find my notes on that.

    • @mii_beta
      @mii_beta  8 months ago

      @@modernsolutions6631 just share your workflows (if you have them on Comfy)

    • @modernsolutions6631
      @modernsolutions6631 8 months ago

      I'd rather explain principles than workflows. (Don't have a workflow at hand either.)
      Basically a KSampler(add_noise=disable, noise_seed=1, control_after_generate=fixed, steps=20, start_at_step=0, end_at_step=10000, return_with_leftover_noise=disable) returns a pixel-perfect identical image to these two chained (assuming your scheduler isn't bugged):
      KSampler(add_noise=disable, noise_seed=1, control_after_generate=fixed, steps=20, start_at_step=0, end_at_step=10, return_with_leftover_noise=enable)
      KSampler(add_noise=disable, noise_seed=1, control_after_generate=fixed, steps=20, start_at_step=10, end_at_step=20, return_with_leftover_noise=disable)
      You can now keep reusing the latent from the first 10 steps while you just tweak the later parts of the pipeline. 🎉
      Rules to get a pixel-perfect match: steps describes the total number of steps of the equivalent non-advanced pipeline. start_at_step of the first element is zero. end_at_step of one node is shared with the start_at_step of the next one. end_at_step and steps must match on the last node in your pipeline.
      return_with_leftover_noise must be set to enable for all but the last advanced KSampler in your pipeline. Returning with leftover noise makes the image a bit worse, but that noise is needed for the in-between steps.
      You now have the best possible control over the generation process. You can also diverge from that in a bunch of fun ways; if you do, the jump in the scheduler's value can cause unexpected behavior. sgm_uniform seemed robust to that.
      Operating with a fixed control_after_generate is a good idea, since you are actually trying to improve an image, not just pulling the lever on a slot machine and imagining you are doing stuff while really making ineffective/harmful/superstitious changes.
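      (A toy sketch of that splitting principle in plain Python, not actual ComfyUI node code; denoise_step is a made-up deterministic stand-in for a real sampler step:)

      ```python
      import numpy as np

      def denoise_step(latent, prompt_strength, step):
          # Stand-in for one sampler step: deterministic given (latent, prompt, step).
          return latent * 0.95 + prompt_strength * np.cos(step)

      def run(latent, prompt_strength, start_at_step, end_at_step):
          # Like an advanced sampler node that only runs a slice of the schedule.
          for step in range(start_at_step, end_at_step):
              latent = denoise_step(latent, prompt_strength, step)
          return latent

      latent0 = np.array([1.0, -0.5, 0.25])   # same starting latent (fixed seed) everywhere

      full    = run(latent0, 1.0, 0, 20)      # one sampler doing all 20 steps
      first   = run(latent0, 1.0, 0, 10)      # first node: steps 0..10, result cached
      chained = run(first,   1.0, 10, 20)     # second node continues from step 10
      swapped = run(first,   2.0, 10, 20)     # same first half, different "prompt" later on

      print(np.allclose(full, chained))       # True: splitting the schedule changes nothing
      print(np.allclose(full, swapped))       # False: only the later steps were re-prompted
      ```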

  • @treakzy_9594
    @treakzy_9594 1 month ago

    what file explorer did you use?

  • @andreluiz9726
    @andreluiz9726 8 months ago +1

    currently using webui, 6900 xt on Fedora... I'll check this UI later... tho it seems not that intuitive
    thx for the video!!
    xoxo lurker

  • @satchua7367
    @satchua7367 8 months ago

    Using rx5700xt. Torch just wouldn't recognize the GPU.

    • @lucygelz
      @lucygelz 4 months ago

      use rocm

  • @47DKDS
    @47DKDS 3 months ago +1

    Maybe some day I will learn how to run this god damn program. No help from this tutorial.

  • @kal5765
    @kal5765 7 months ago

    Won't it cause future problems with pacman if you just install things to the system with pip?

  • @ПерчинПак
    @ПерчинПак 8 months ago

    0:13 on fedora 39 I just did `dnf install python-3.10`, it works and nothing exploded (`which python` is still pointing to 3.12, but `which python3.10` is pointing to 3.10)

    • @mii_beta
      @mii_beta  8 months ago +1

      umm, completely missed that we could install different versions of Python from the repos

    • @galactichog1480
      @galactichog1480 8 months ago

      No match for argument: python-3.10
      Error: Unable to find a match: python-3.10,
      Eh? How come my Fedora 39 can't find it? XD

  • @earth2k66
    @earth2k66 8 months ago

    Imagine getting Mii's GPU and her ghost pops up on every ai-art you generate on it. 💀

  • @10xFrontend
    @10xFrontend 6 months ago

    What distro is this?

  • @pordonjeterson
    @pordonjeterson 1 month ago

    Only a 1060? Are you poor? Lol.

    • @23Puck666
      @23Puck666 29 days ago

      Some of us live in shitland.

  • @PragandSens
    @PragandSens 8 months ago +2

    when will Mii unlock sex?

    • @mii_beta
      @mii_beta  8 months ago +3

      when you super thanks her for $500 ..multiple times!!

    • @PragandSens
      @PragandSens 8 months ago

      @@mii_beta deal O⍵O