Mastering InPainting with Stable Diffusion & Auto 1111 Forge: Advanced Techniques for Perfect Scenes

  • Published: 4 Nov 2024

Comments • 32

  • @lozira559
    @lozira559 8 months ago +6

    The first video I found that explains about the soft inpainting feature!

  • @ED-wn5jw
    @ED-wn5jw 8 months ago +2

    Thanks for the videos, you're one of the best teachers on SD I found so far

  • @espritdautomne
    @espritdautomne 3 months ago +1

    Whoa! That's a lot of extensions to install just for inpainting; the more extensions you install, the more likely you are to break your automatic1111.

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  3 months ago

      Yeah, I decided to take a different route: I added image generation and inpainting capabilities to my online platform XeroGen. That takes a lot of the setup work off people's hands. Just load your image up and start inpainting.

    • @espritdautomne
      @espritdautomne 3 months ago

      @@AIchemywithXerophayze-jt1gg Can't wait to try it. Thanks.

  • @tr1pod623
    @tr1pod623 8 months ago +1

    thanks for the video!

  • @dookiesmile
    @dookiesmile 6 months ago +1

    Hi, Thank you for these amazing tutorials. I've just got Automatic1111 Forge installed and am in the process of learning how to use it better.
    I've been using the regular Automatic1111 to make desktop background images, along with images for a family quiz I play once a month with my aunts and uncles.
    I think when you did the initial upscale of your image, it changed because you forgot to also copy the negative prompt.

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  6 months ago

      Anytime you do an upscale there are going to be minor changes. Maybe I forgot to turn the denoise strength down.

  • @Mehdi0montahw
    @Mehdi0montahw 8 months ago +2

    Wow, first time I've seen this method

  • @jovanniagara
    @jovanniagara 5 months ago +1

    amazing mate

  • @zacharyshort384
    @zacharyshort384 8 months ago +1

    Awesome video. I really appreciate the longer and more in-depth content! I'll be going through many more videos on your channel after this one :)
    If you don't mind, can you elaborate on how to change the path/link to an external drive for the checkpoints? I tried the linking method and had all the options *except* for "hardlink". I tried a different method of changing the user batch file and got launch errors lol

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  8 months ago

      Nice. Glad you like them

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  8 months ago

      Sorry, I completely missed that last question you asked. If you're using Windows, look up the MKLINK command and how to use it. That's what I did to link up my original Forge installation's model folder with the original model folder I use. Although I don't know if that matters at this point because of Stability Matrix
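      As an illustration of the MKLINK approach mentioned above, here is a sketch using hypothetical paths (`C:\Forge` for the Forge install, `D:\SD\models` for the external checkpoint drive); run it from an elevated Command Prompt:

      ```shell
      :: Back up or remove the existing (empty) checkpoint folder first
      rmdir "C:\Forge\models\Stable-diffusion"

      :: Create a directory symbolic link so Forge reads checkpoints
      :: from the external drive transparently
      mklink /D "C:\Forge\models\Stable-diffusion" "D:\SD\models\Stable-diffusion"
      ```

      `/D` creates a directory symlink (requires admin rights); `mklink /J` would create a directory junction instead, which works without elevation but only for local volumes.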

  • @fatallyfresh3932
    @fatallyfresh3932 5 months ago +1

    Umm, have a link to that lightning-fast checkpoint you were using? It looks incredible. You're generating 4 images faster than I can make 1 lol. And the quality looks insane! The SDXL-turbo one.

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  5 months ago +1

      There are a couple of them that I use, but the one you're probably talking about is called Realities Edge XL Lightning Turbo V7. It can be found on Civitai here.
      civitai.com/models/129666/realities-edge-xl-lightning-turbo

  • @vindyyt
    @vindyyt 8 months ago +1

    Since I'm using my own model merge for a specific style, I don't really have a fitting inpainting model.
    Generally inpainting works okay with it, except when I'm trying to add stuff in.
    Aside from finding an inpainting model with a relatively close style, do you have other suggestions for my issue?

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  8 months ago

      Just use your model to do the inpainting, use a fairly high mask blur, and adjust the "only masked" area width and height so that whatever is generated is centered in that area. Once you've added the objects, you may have some seams that you'll need to remove. You can use an actual inpainting model on just the seams, or send the entire image over to img2img and, using your merge with a low denoise strength, rerender the entire image. It may change various things in the image, so you may need to play with what denoise level works for you.
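      The "fairly high mask blur" advice works because blurring feathers the hard edge of the binary mask, so the inpainted pixels fade into the original image instead of leaving a visible seam. A minimal pure-Python sketch of that feathering (a simple box blur on a 0/255 mask; real UIs use a Gaussian blur, but the effect is the same):

      ```python
      def feather_mask(mask, radius):
          """Box-blur a 2D binary mask (values 0 or 255) to soften its edges.

          A hard mask edge produces a visible seam after inpainting; feathering
          turns it into a gradual 0..255 ramp so new pixels blend with the old.
          """
          h, w = len(mask), len(mask[0])
          out = [[0] * w for _ in range(h)]
          for y in range(h):
              for x in range(w):
                  total = count = 0
                  for dy in range(-radius, radius + 1):
                      for dx in range(-radius, radius + 1):
                          ny, nx = y + dy, x + dx
                          if 0 <= ny < h and 0 <= nx < w:
                              total += mask[ny][nx]
                              count += 1
                  out[y][x] = total // count  # average over the neighborhood
          return out

      # A 5x5 mask with a hard-edged 3x3 painted square in the middle.
      mask = [[255 if 1 <= y <= 3 and 1 <= x <= 3 else 0 for x in range(5)]
              for y in range(5)]
      soft = feather_mask(mask, 1)
      ```

      After feathering, the center of the painted region is still fully opaque while the former hard boundary has become a gradient; a larger radius (higher mask blur in the UI) widens that blending band.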

  • @andykoala3010
    @andykoala3010 8 months ago +1

    Won't the image change slightly when using the refiner, since you are doing fewer steps (75%) with the first model and then switching to a different model?

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  8 months ago

      Very little, and that's the point of the refiner: you are trying to add detail, so it has to change the image a bit.

  • @chibamyongchu
    @chibamyongchu 8 months ago +1

    Please tell me why you just wrote "fog cabin in the forest" but the result is a cabin in a crystal ball

    • @AIchemywithXerophayze-jt1gg
      @AIchemywithXerophayze-jt1gg  8 months ago

      Sorry, if I said "fog", it was supposed to be "log cabin in forest". But if you notice, I have selected under Styles one of my styles that takes my input and inserts it into a prompt. If you go to share.xerophayze.com you can access my Google share, and my styles.csv file is there. This will give you access to the prompt that is set up to do this.

  • @jeellyz
    @jeellyz 8 months ago +1

    Hello, friend