BrushNet - The Best InPainting Method Yet? FREE Local Install!

  • Published: 29 Dec 2024

Comments • 66

  • @ducttapebattleship 8 months ago +10

    Thanks for still explaining everything in detail (conda, git, etc.) even after you've explained everything a dozen times already. This really helps.

  • @loubakalouba 8 months ago +7

    I see a new Nerdy Rodent video, I click instantly. It's never failed me. Thank you!

  • @forfreeiran8749 8 months ago +4

    You got the best vids man

  • @MrSporf 8 months ago +2

    I thought Fooocus was good, but this thing is next level!

  • @tonywhite4476 8 months ago +3

    Brushnet or Invoke?

  • @rudeasmith5098 8 months ago +1

    That shining suit of armor is suss! Love your videos though. 👍

    • @NerdyRodent 8 months ago

      I like the way it does the reflections too!

  • @MUZIXHAV3R 8 months ago +1

    Excellent work as always, mate. One quick question though: where is the default output dir after generating the images?

  • @Lamson777 8 months ago +3

    What a funny intro 😂

  • @vi6ddarkking 8 months ago +3

    Ok so I am going to ask the obvious question.
    How well does it fix hands and feet?

    • @JonnyCrackers 8 months ago

      It's Stable Diffusion, so probably not very well at all.

    • @zacharyshort384 8 months ago

      @@JonnyCrackers ? You can *fix* hands and feet quite easily in SD with inpainting. People do it all the time...

  • @MarcSpctr 8 months ago +1

    hey can you do a video on AnyDoor, please?
    I read their GitHub and it seems like such a cool tool for changing object locations and such, but sadly the Gradio demo only has object transfer from one image to another.
    Can you make a tutorial on how to use all the other tools that AnyDoor provides?

  • @knightride9635 8 months ago +1

    This is really good, thanks!

  • @worldwidewebcap 8 months ago

    I wasn't able to get it running. I keep getting a ton of errors, like "OSError: Error no file named diffusion_pytorch_model.bin found in directory data/ckpt/segmentation_mask_brushnet_ckpt."
    But that file ends in .safetensors, not .bin.

    • @NerdyRodent 8 months ago +1

      You can download the required files from their Google Drive

  • @Elwaves2925 8 months ago +2

    Careful with those puns, Sebastian Kamph will become jealous. 🙂
    The results look great, but the one time I tried Anaconda it didn't work and seemed to be more of a mess. Then there's the hassle with the models, so I think I'll wait until it hits Forge, which shouldn't be too long. Cheers NR.

    • @NerdyRodent 8 months ago +2

      Anaconda is the way forward and has never let me down in over four years!

    • @Elwaves2925 8 months ago

      @@NerdyRodent That's cool, it may have been something on my end, like a misunderstanding on my part. I also realised afterwards that it might have been Miniconda, not Anaconda, that I was using.

  • @industrialvectors 8 months ago

    What file explorer and image viewer are you using?
    Your setup looks like a mix of Linux and Windows in a good way.

    • @NerdyRodent 8 months ago +1

      I’m using Caja with the default image viewer 😉

  • @bobbyboe 8 months ago

    I wonder which is more powerful: IOPaint (which I noticed through your video last month) or this BrushNet. Having seen both of your videos, I would tend to use IOPaint. But I am sure you must have compared both, or at least have much better insight?

    • @NerdyRodent 8 months ago +2

      There’s a BrushNet-inspired PowerPaint for IOPaint now - best of both worlds!

    • @bobbyboe 8 months ago

      @@NerdyRodent Maybe we can expect a video from you about how to use this "PowerPaint" feature you described? A comparison of different models and how you use them in IOPaint would also be very interesting. I can't find much on how IOPaint can be power-used...

  • @DivinityIsPurity 8 months ago +1

    Great tut

  • @lockos 8 months ago

    @NerdyRodent I completed every step of the installation without any issue, but in the end the local URL is unavailable - it fails to connect. Any idea why?

    • @NerdyRodent 8 months ago

      If you’ve got extra security set up, you may need to modify it to allow local connectivity

    • @lockos 8 months ago

      @@NerdyRodent What'd'ya mean by extra security? I tried with the firewall temporarily disabled, checked my browser settings and flushed my DNS; nothing changed whether I use Chrome, Firefox or Edge. Maybe there is a security setting within BrushNet that I'm not aware of - if so, could you tell me?

    • @NerdyRodent 8 months ago

      If you’ve not got any extra security running, it could be that you’re using Microsoft Windows which will fail on basic routing tasks. If that’s your case, you’ll need to use localhost as the address instead!
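      For anyone hitting the same wall: a URL printed as 0.0.0.0 isn't directly browsable on some systems, because 0.0.0.0 is a "listen on all interfaces" bind address, not a destination. A minimal sketch with plain Python sockets (no Gradio or BrushNet involved) illustrating why the loopback address is the one to browse to:

      ```python
      import socket

      # Bind a listener the way a web UI does when told to serve on "0.0.0.0":
      # it means "accept connections on every interface"; it is not an
      # address you point a browser at.
      srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      srv.bind(("0.0.0.0", 0))   # port 0 = let the OS pick a free port
      srv.listen(1)
      port = srv.getsockname()[1]

      # The address that works from a local browser is the loopback one:
      cli = socket.create_connection(("127.0.0.1", port), timeout=2)
      print("connected via loopback on port", port)
      cli.close()
      srv.close()
      ```

      So if the terminal prints a 0.0.0.0 URL, try the same port on 127.0.0.1 (or localhost) instead.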

  • @joe-5D 5 months ago

    Does the Python method give higher quality/resolution results?

    • @NerdyRodent 5 months ago +1

      Vs which non-Python method?

    • @joe-5D 5 months ago

      @@NerdyRodent Or I should say: locally vs the website/huggingface.

    • @NerdyRodent 5 months ago +1

      It’s the same program whether it runs on your computer or on someone else’s 😀

    • @joe-5D 5 months ago

      @@NerdyRodent Alright, just making sure. Thanks, man!

  • @MilesBellas 8 months ago

    Automatic1111 isn't deprecated?

    • @NerdyRodent 8 months ago +1

      Does kinda seem deprecated at this point, yes 🫤

    • @zacharyshort384 8 months ago +1

      I'm using Forge which is just another implementation of it.

  • @Shadowman0 8 months ago

    I think your differential diffusion workflow doesn't use the full potential of its blending capabilities. Ideally, after segmenting the bear you could expand the mask and blur the expanded area so you get a gradient around the bear (or even blur without expanding, to give it a bit of the bear to change). Depending on the blur parameters and the added padding, your results should be much better. Differential diffusion allows non-binary masks, and the gray level defines the allowed amount of change, AFAIK.
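    The expand-then-blur idea above can be sketched in a few lines. This is an illustrative toy (a 1-D mask, pure Python, a hypothetical soften_mask helper), not BrushNet's or ComfyUI's actual implementation: the binary mask is first dilated, then box-blurred so its hard edge becomes a gray gradient that differential diffusion can read as a per-pixel change budget.

    ```python
    def soften_mask(mask, expand=2, blur=2):
        """Dilate a binary 1-D mask, then box-blur it so the hard edge
        becomes a gray gradient (0.0 = keep pixel, 1.0 = fully repaint)."""
        n = len(mask)
        # Dilation: any pixel within `expand` of a masked pixel becomes 1.
        dilated = [1.0 if any(mask[max(0, i - expand):i + expand + 1]) else 0.0
                   for i in range(n)]
        # Box blur: a moving average turns the 0/1 step into gray levels.
        out = []
        for i in range(n):
            window = dilated[max(0, i - blur):i + blur + 1]
            out.append(sum(window) / len(window))
        return out

    hard = [0.0] * 10
    hard[4] = hard[5] = 1.0    # binary mask over the "bear"
    soft = soften_mask(hard)   # values now ramp smoothly up to 1.0 around it
    ```

    A 2-D version works the same way with a 2-D dilation and blur; the blur radius then plays the role of the "blur amount" node mentioned in the reply below.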

    • @NerdyRodent 8 months ago +1

      Yup, I’ve got a node in there for you to set the blur amount though I find the depth maps are often pretty good without extra blur!

  • @secretsather 7 months ago

    Seems like the checkpoints in Google Drive are hosed!

    • @NerdyRodent 7 months ago

      Google Drive is still working here!

  • @arashsohi8551 8 months ago

    Thank you so much!

    • @NerdyRodent 8 months ago +1

      You're welcome!

    • @arashsohi8551 8 months ago

      @@NerdyRodent Could you help me with this, please?
      I tried to find out what the problem is, but found nothing :(
      I have the whole data structure and checkpoints, including "sam_vit_h_4b8939.pth", but I don't know why this happens when I run app_brushnet.py:
      Traceback (most recent call last):
        File "D:\AI-Programs\BrushNet\examples\brushnet\app_brushnet.py", line 13, in <module>
          mobile_sam = sam_model_registry['vit_h'](checkpoint='data/ckpt/sam_vit_h_4b8939.pth').to("cuda")
        File "C:\ProgramData\miniconda3\Lib\site-packages\segment_anything\build_sam.py", line 15, in build_sam_vit_h
          return _build_sam(
        File "C:\ProgramData\miniconda3\Lib\site-packages\segment_anything\build_sam.py", line 104, in _build_sam
          with open(checkpoint, "rb") as f:
      FileNotFoundError: [Errno 2] No such file or directory: 'data/ckpt/sam_vit_h_4b8939.pth'
      [process exited with code 1 (0x00000001)]

    • @NerdyRodent 8 months ago

      Looks like you forgot to download that model!

  • @eammon7144 8 months ago +1

    Are all the output images low resolution?

  • @testales 8 months ago

    The standard models are not made for inpainting! With SD 1.5 models you can do an on-the-fly merge by basically subtracting the regular base model from the 1.5 inpainting checkpoint and adding the difference to the checkpoint of your liking. Then you will get far better results when inpainting. This can probably be done with SDXL too, but since I still use SD 1.5 checkpoints most of the time, I can't say for sure.
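    The add-difference merge described above amounts to merged = custom + (inpaint − base), applied weight by weight. A toy sketch with plain dicts of floats standing in for state-dict tensors (with real checkpoints the same arithmetic runs on the loaded tensors; add_difference is a hypothetical helper name, not an actual library function):

    ```python
    def add_difference(custom, base, base_inpaint):
        """Graft inpainting ability onto a custom model:
        merged = custom + (base_inpaint - base), per weight.
        The (base_inpaint - base) term isolates what inpainting training
        added, which is then applied on top of the custom checkpoint."""
        return {k: custom[k] + (base_inpaint[k] - base[k]) for k in custom}

    # Toy "state dicts" with one scalar per layer:
    base    = {"unet.w": 1.0, "unet.b": 0.5}
    inpaint = {"unet.w": 1.5, "unet.b": 0.5}   # base fine-tuned for inpainting
    custom  = {"unet.w": 2.0, "unet.b": -0.5}  # your favourite SD 1.5 model

    merged = add_difference(custom, base, inpaint)
    # merged keeps the custom weights, shifted by the inpainting delta
    ```

    This is the same recipe the AUTOMATIC1111 checkpoint merger exposes as its "add difference" mode.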

    • @zacharyshort384 8 months ago

      Hmm. Interesting. Do you have a vid link or tutorial you can point me to? :)

  • @LouisGedo 8 months ago

    👋

  • @Pauluz_The_Web_Gnome 8 months ago

    I noticed that the generated images are really low quality...

  • @relaxandlearn7996 8 months ago

    I don't see any difference between this and normal inpainting models.