BrushNet - The Best InPainting Method Yet? FREE Local Install!

  • Published: 13 May 2024
  • Inpainting with the latest BrushNet models for AI generation using stable diffusion is lots of fun! Get great results quickly thanks to both random and segmentation mask models - all for free on your own computer! Installation is quick and easy, so why not give it a go today?
    Update: BrushNet SDXL is now also available 😀
    Want to support the channel?
    / nerdyrodent
    Links:
    github.com/TencentARC/BrushNet
    huggingface.co/spaces/Tencent...
    pytorch.org/
    huggingface.co/spaces/abhishe...
    Installing Anaconda for MS Windows Beginners - • Anaconda - Python Inst...
    pixabay.com/photos/teddy-bear...
    == More Stable Diffusion Stuff! ==
    * Faster Stable Diffusions with the LCM LoRA - • LCM LoRA = Speedy Stab...
    * Video-to-Video AI using AnimateDiff - • How To Use AnimateDiff...
    * How to make Consistent Characters From 1 Picture In ComfyUI - • Reposer = Consistent S...
    0:00 Inpainting Introduction
    1:19 BrushNet 0-Install Option
    1:37 BrushNet Local Install Option
    7:26 BrushNet Examples
    12:03 ComfyUI Inpainting Example
  • Science

Comments • 61

  • @ducttapebattleship
    a month ago +10

    Thanks for still explaining everything in detail (conda, git, etc.) even after you've explained it all a dozen times already. This really helps.

  • @loubakalouba
    a month ago +6

    When I see a new Nerdy Rodent video I click instantly. It's never failed me. Thank you!

  • @forfreeiran8749
    a month ago +4

    You got the best vids man

  • @knightride9635
    a month ago +1

    This is really good, thanks!

  • @MrSporf
    a month ago +2

    I thought Fooocus was good, but this thing is next level!

  • @DivinityIsPurity
    a month ago +1

    Great tut

  • @Lamson777
    a month ago +3

    What a funny intro 😂

  • @PianoCinematix
    a month ago +1

    Excellent work as always mate. One quick question though. Where is the default output "dir" after generating the images?

  • @rudeasmith5098
    a month ago +1

    That shining suit of armor is suss! Love your videos though. 👍

    • @NerdyRodent
      a month ago

      I like the way it does the reflections too!

  • @arashsohi8551
    a month ago

    Thank you so much!

    • @NerdyRodent
      a month ago +1

      You're welcome!

    • @arashsohi8551
      a month ago

      ​@@NerdyRodent Could you help me with this if you can please?
      I tried to find out what the problem is, but found nothing :(
      I have the whole data structure and checkpoints, including "sam_vit_h_4b8939.pth", but I don't know why this happens when I run app_brushnet.py.
      Traceback (most recent call last):
        File "D:\AI-Programs\BrushNet\examples\brushnet\app_brushnet.py", line 13, in <module>
          mobile_sam = sam_model_registry['vit_h'](checkpoint='data/ckpt/sam_vit_h_4b8939.pth').to("cuda")
        File "C:\ProgramData\miniconda3\Lib\site-packages\segment_anything\build_sam.py", line 15, in build_sam_vit_h
          return _build_sam(
        File "C:\ProgramData\miniconda3\Lib\site-packages\segment_anything\build_sam.py", line 104, in _build_sam
          with open(checkpoint, "rb") as f:
      FileNotFoundError: [Errno 2] No such file or directory: 'data/ckpt/sam_vit_h_4b8939.pth'
      [process exited with code 1 (0x00000001)]

    • @NerdyRodent
      a month ago

      Looks like you forgot to download that model!
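That traceback boils down to a missing (or mis-located) file: the path 'data/ckpt/sam_vit_h_4b8939.pth' is relative, so it also fails when the script is run from the wrong working directory. A hypothetical preflight check (not part of the BrushNet repo) would make the failure obvious before anything loads:

```python
from pathlib import Path

def check_checkpoint(path: str) -> bool:
    """Return True if the checkpoint file exists; print a download hint otherwise."""
    ckpt = Path(path)
    if ckpt.is_file():
        return True
    print(f"Missing checkpoint: {ckpt} - download it into '{ckpt.parent}/' "
          "(or run the script from the BrushNet examples directory)")
    return False

# Guard the SAM checkpoint the traceback complains about:
# check_checkpoint("data/ckpt/sam_vit_h_4b8939.pth")
```

If this prints the hint even though the file was downloaded, the working directory is the likely culprit rather than the download itself.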

  • @Sandy5of5
    a month ago +3

    If only it could fill in the blank spots in our lives, 'eh? 😉🐭

  • @tonywhite4476
    a month ago +3

    Brushnet or Invoke?

  • @MarcSpctr
    a month ago +1

    Hey, can you do a video on AnyDoor, please?
    I read their GitHub and it seems like such a cool tool for changing object locations and more, but sadly the Gradio demo only offers object transfer from one image to another.
    Can you make a tutorial on how to use all the other tools that AnyDoor provides?

  • @industrialvectors
    a month ago

    What file explorer and image viewer are you using?
    Your setup looks like a mix of Linux and Windows in a good way.

    • @NerdyRodent
      a month ago +1

      I’m using Caja with the default image viewer 😉

  • @Elwaves2925
    a month ago +2

    Careful with those puns, Sebastian Kamph will become jealous. 🙂
    The results look great, but the one time I tried Anaconda it didn't work and seemed to be more of a mess. Then there's the hassle with the models, so I think I'll wait until it hits Forge, which shouldn't be too long. Cheers NR.

    • @NerdyRodent
      a month ago +2

      Anaconda is the way forward and has never let me down in over four years!

    • @Elwaves2925
      a month ago

      @@NerdyRodent That's cool, it may have been something on my end, like a misunderstanding on my part. I also realised afterwards that it might have been Miniconda, not Anaconda, that I was using.

  • @vi6ddarkking
    a month ago +3

    Ok so I am going to ask the obvious question.
    How well does it fix hands and feet?

    • @JonnyCrackers
      a month ago

      It's stable diffusion so probably not very well at all.

    • @zacharyshort384
      26 days ago

      @@JonnyCrackers ? You can *fix* hands and feet quite easily in SD with inpainting. People do it all the time...

  • @worldwidewebcap
    a month ago

    I wasn't able to get it running. I keep getting a ton of errors, like "OSError: Error no file named diffusion_pytorch_model.bin found in directory data/ckpt/segmentation_mask_brushnet_ckpt."
    But that file ends in .safetensors not .bin

    • @NerdyRodent
      a month ago +1

      You can download the required files from their Google Drive

  • @bobbyboe
    25 days ago

    I wonder which is more powerful: IOPaint (which I noticed through your video last month) or this BrushNet. Having seen both of your videos, I would tend to use IOPaint. But I'm sure you must have compared both, or at least have much better insight?

    • @NerdyRodent
      25 days ago +2

      There’s a BrushNet-inspired PowerPaint for IOPaint now - best of both worlds!

    • @bobbyboe
      15 days ago

      @@NerdyRodent Maybe we can expect a video from you about how to use this "PowerPaint" feature you described? A comparison of different models and how you use them in IOPaint would also be very interesting. I can't find much on how IOPaint can be power-used...

  • @lockos
    a month ago

    @NerdyRodent I completed every step of the installation without any issue, but in the end the local URL is unavailable - it fails to connect. Any idea why?

    • @NerdyRodent
      a month ago

      If you’ve got extra security set up, you may need to modify it to allow local connectivity

    • @lockos
      a month ago

      @@NerdyRodent What d'ya mean by extra security? I tried with the firewall temporarily disabled, checked my browser settings and flushed my DNS; nothing changed whether I use Chrome, Firefox or Edge. Now maybe there's a security setting within BrushNet that I'm not aware of - if that's the case, please could you tell me?

    • @NerdyRodent
      a month ago

      If you’ve not got any extra security running, it could be that you’re using Microsoft Windows which will fail on basic routing tasks. If that’s your case, you’ll need to use localhost as the address instead!
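One way to narrow down a "local URL unavailable" problem like the one in this thread: test whether the port is actually reachable on 127.0.0.1, independent of the browser. A stdlib sketch (7860 is only assumed here as Gradio's usual default; use whatever port the console actually prints):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If can_connect("127.0.0.1", 7860) is True but the browser still fails,
# the server is fine and the problem is browser or proxy configuration.
```

If it returns False, the app never bound to a reachable address in the first place, which matches the "use localhost instead" advice above.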

  • @LouisGedo
    a month ago

    👋

  • @MilesBellas
    a month ago

    Automatic1111 isn't deprecated?

    • @NerdyRodent
      a month ago +1

      Does kinda seem deprecated at this point, yes 🫤

    • @zacharyshort384
      26 days ago +1

      I'm using Forge which is just another implementation of it.

  • @Shadowman0
    a month ago

    I think your differential diffusion workflow doesn't use the full potential of its blending capabilities. Ideally, after segmenting the bear you could expand the mask and blur the expanded area so you get a gradient around the bear (or even blur without expanding, to give it a bit of bear to change). Depending on the blur parameters and the added padding, your results should be much better. Differential diffusion allows you to use non-binary masks and lets the grey level define the allowed changes, afaik.

    • @NerdyRodent
      a month ago +1

      Yup, I’ve got a node in there for you to set the blur amount though I find the depth maps are often pretty good without extra blur!
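The expand-then-blur idea in this thread can be sketched in plain Python (toy nested lists stand in for a real image mask; an actual ComfyUI workflow would use grow-mask and blur nodes instead):

```python
def dilate(mask, r):
    """Grow a binary 2-D mask by r pixels (square neighbourhood)."""
    h, w = len(mask), len(mask[0])
    return [[1.0 if any(mask[yy][xx]
                        for yy in range(max(0, y - r), min(h, y + r + 1))
                        for xx in range(max(0, x - r), min(w, x + r + 1)))
             else 0.0
             for x in range(w)] for y in range(h)]

def box_blur(mask, r):
    """Box-blur the mask so its hard edge becomes a grey gradient."""
    h, w = len(mask), len(mask[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [mask[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = sum(vals) / len(vals)  # grey level = allowed change
    return out

# soft = box_blur(dilate(binary_mask, expand), blur)
```

The grey values between 0 and 1 around the edge are exactly what differential diffusion interprets as "partially changeable" pixels.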

  • @eammon7144
    a month ago +1

    Are all the output images low resolution?

  • @testales
    a month ago

    The standard models are not made for inpainting! With SD 1.5 models you can do an on-the-fly merge by basically subtracting the regular base model from the 1.5 inpainting checkpoint and adding the difference to the checkpoint of your liking. Then you get way better inpainting results. This can probably be done with SDXL too, but since I still use SD 1.5 checkpoints most of the time, I can't say for sure.

    • @zacharyshort384
      26 days ago

      Hmm. Interesting. Do you have a vid link or tutorial you can point me to? :)
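The merge described in this thread is the usual "add difference" recipe: isolate what inpainting training added, (inpainting - base), and graft it onto a custom checkpoint. A toy sketch with plain floats standing in for tensors (a real merge would iterate over torch state dicts):

```python
def add_difference(custom: dict, inpaint: dict, base: dict, alpha: float = 1.0) -> dict:
    """Add-difference merge: custom + alpha * (inpaint - base) for every weight.

    (inpaint - base) isolates what inpainting fine-tuning changed; adding it
    to a custom model gives that model inpainting behaviour.
    """
    return {k: custom[k] + alpha * (inpaint[k] - base[k]) for k in custom}

# With custom == base and alpha == 1.0 this reproduces the inpainting model.
```

This is the same arithmetic as the "add difference" mode in common checkpoint-merge UIs, just written out explicitly.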

  • @Pauluz_The_Web_Gnome
    a month ago

    I noticed that the generated images are really low quality...

  • @relaxandlearn7996
    a month ago

    I don't see any difference between this and normal inpaint models.