Control Your AI Art with Blender - Blender & Stable Diffusion ControlNet Tutorial 2024

  • Published: 6 Nov 2024

Comments • 61

  • @mjsavage711
    @mjsavage711 1 month ago +4

    I came here after finding one of Storybook Studios' shorts and wanted to learn about ControlNets and how they work. Excellent demo; I was able to recreate what you taught quickly.

    • @albertbozesan
      @albertbozesan  1 month ago

      Thank you! Glad it was helpful :) more coming soon.

  • @qoph1988
    @qoph1988 1 month ago +1

    One of the best teachers for AI visual art stuff, thanks for helping me keep abreast of these tools as the industry changes

    • @albertbozesan
      @albertbozesan  1 month ago

      That means a lot, thank you! A big project is coming up with more AI lessons, stay tuned :)

  • @Rock-Of-3y3
    @Rock-Of-3y3 1 month ago +3

    Exactly what I needed. Thank you.

  • @Techne89
    @Techne89 1 month ago +6

    Nice to see you back, mate!

  • @stableArtAI
    @stableArtAI 1 month ago +2

    Only skimmed it so far, but looking forward to watching and following along in more depth with this technique. Great stuff.

  • @christophermoonlightproduction
    @christophermoonlightproduction 1 month ago +5

    Thanks for keeping it simple. I'm saving this tutorial. I saw Space Vets and thought it was really well done. I'm working on my own AI animated movie right now, so it's good to see others doing this.

    • @albertbozesan
      @albertbozesan  1 month ago +2

      Thank you!! Feel free to share your project if you want to, I’m very curious 😄

    • @christophermoonlightproduction
      @christophermoonlightproduction 1 month ago

      @@albertbozesan Thank you. It's called Escape From Planet Omega-12. It's more adult-oriented sci-fi (think old-school stuff like Heavy Metal or Fire and Ice), but it's on my YouTube page as the starter video. I would love for you to check it out. I've been doing art and film for a long time, and although I'm by no means a technician, I'm very excited about the new era of AI filmmaking. I see people like you as pioneers, making movie history, so if I can carve out a small part for myself in all of this, I'll be very happy. Please, stay in touch. Cheers.

    • @christophermoonlightproduction
      @christophermoonlightproduction 1 month ago +1

      @@albertbozesan Thanks. Yeah, it's on my channel, titled Escape From Planet Omega-12. Although, I'll say that what I'm doing right now has already gone way beyond what I've posted. I'll be updating soon. Cheers.

  • @davtech
    @davtech 1 month ago +5

    YouTube recommended this and I subscribed. Looking forward to watching future and past videos.

  • @kdrude123
    @kdrude123 1 month ago +2

    Excellent video... I'm a Blender newbie and I could follow along. I appreciate all the tips and how concise you kept it!

    • @albertbozesan
      @albertbozesan  1 month ago

      Thank you for letting me know!! Clarity was super important to me, I know how tricky Blender can get.

  • @albertwang5974
    @albertwang5974 1 month ago +2

    What a nice trick!

    • @albertbozesan
      @albertbozesan  1 month ago

      Glad you like it!

    • @_tmh
      @_tmh 1 month ago

      Actually so much more than a trick. This tech is key for any filmmaker who wants to do more than just trailers! Thanks for the awesome explanation!

  • @AZTECMAN
    @AZTECMAN 1 month ago +2

    Great job.
    One thing I would do differently is use
    y = 1 / (depth+1)
    (nodes for addition and division; no need to invert or colorize)

    • @albertbozesan
      @albertbozesan  1 month ago

      Thanks! I don't quite follow - is this not just a different node for the same result? How does it save effort?

    • @AZTECMAN
      @AZTECMAN 1 month ago +2

      @@albertbozesan It changes the shape of the fall-off.
      If I'm not mistaken, it's more effective than the original equation at keeping background elements that are far away from the camera.
      The curve has a bend rather than a straight line on the graph (see the node sketch after this thread).

    • @albertbozesan
      @albertbozesan  1 month ago

      @@AZTECMAN I see! Thanks for the tip!
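
    As a rough illustration of the 1 / (depth + 1) mapping suggested above, here is a minimal sketch assuming Blender's Python API (bpy); the view layer name "ViewLayer" is the default and may differ in your scene:

      import bpy

      scene = bpy.context.scene
      scene.use_nodes = True
      scene.view_layers["ViewLayer"].use_pass_z = True  # enable the depth (Z) pass

      tree = scene.node_tree
      tree.nodes.clear()

      render_layers = tree.nodes.new("CompositorNodeRLayers")

      add = tree.nodes.new("CompositorNodeMath")
      add.operation = 'ADD'
      add.inputs[1].default_value = 1.0        # depth + 1

      divide = tree.nodes.new("CompositorNodeMath")
      divide.operation = 'DIVIDE'
      divide.inputs[0].default_value = 1.0     # 1 / (depth + 1)

      composite = tree.nodes.new("CompositorNodeComposite")

      tree.links.new(render_layers.outputs["Depth"], add.inputs[0])
      tree.links.new(add.outputs[0], divide.inputs[1])
      tree.links.new(divide.outputs[0], composite.inputs["Image"])

    Because the result already lands in the 0-1 range and is white near the camera, no separate Invert or Normalize node is needed.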

  • @fx_node
    @fx_node 1 month ago +1

    I have found that stacking depth, line art, normal, and other ControlNets in Krita's Stable Diffusion plugin, each referencing the appropriate Blender render pass, is a good way to go. I have made several videos about this on my channel.
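
    A minimal bpy sketch of exporting matching passes for that kind of stacking (pass availability depends on the render engine; the view layer name "ViewLayer" and the output path below are assumptions):

      import bpy

      scene = bpy.context.scene
      view_layer = scene.view_layers["ViewLayer"]
      view_layer.use_pass_z = True       # depth pass
      view_layer.use_pass_normal = True  # normal pass

      scene.use_nodes = True
      tree = scene.node_tree
      render_layers = tree.nodes.new("CompositorNodeRLayers")

      # write each pass to its own file so it can be loaded as a ControlNet reference in Krita
      output = tree.nodes.new("CompositorNodeOutputFile")
      output.base_path = "//render_passes/"
      output.file_slots.new("Depth")
      output.file_slots.new("Normal")

      tree.links.new(render_layers.outputs["Depth"], output.inputs["Depth"])
      tree.links.new(render_layers.outputs["Normal"], output.inputs["Normal"])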

  • @GiancarloBombardieri
    @GiancarloBombardieri 1 month ago +1

    Great tutorial! Thanks!!
    Where can I download the Depth Xinsir ControlNet model that you use?

    • @albertbozesan
      @albertbozesan  1 month ago

      Glad you like it! You can find the model on Xinsir’s huggingface. Download diffusion_pytorch_model.safetensors and name it how you like. huggingface.co/xinsir/controlnet-depth-sdxl-1.0

    • @albertbozesan
      @albertbozesan  1 month ago

      Here you go! huggingface.co/xinsir/controlnet-depth-sdxl-1.0/tree/main
      Rename diffusion_pytorch_model.safetensors
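
    For anyone who prefers scripting the download, a small sketch using the huggingface_hub package (the destination folder is just an example; point it at wherever your UI looks for ControlNet models):

      from huggingface_hub import hf_hub_download
      import shutil

      # fetch the SDXL depth ControlNet weights from xinsir's repo
      src = hf_hub_download(
          repo_id="xinsir/controlnet-depth-sdxl-1.0",
          filename="diffusion_pytorch_model.safetensors",
      )

      # copy it under a recognizable name into your ControlNet models folder
      shutil.copy(src, "models/ControlNet/xinsir_depth_sdxl_1.0.safetensors")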

  • @seans4018
    @seans4018 1 month ago +2

    Do you have tutorials on how you did the Space Vets art??? Looks great!

    • @albertbozesan
      @albertbozesan  1 month ago

      Thank you! Check out the making-of vid on the website, and the CivitAI talk linked in the description for more info 😄

  • @fm3d-k1w
    @fm3d-k1w 1 month ago +1

    Good job!

  • @plejra
    @plejra 13 days ago +1

    Great tutorial. By the way, do you have any idea how to create a visualization of a particular house on a photo of an empty parcel? What workflow would you adopt?

    • @albertbozesan
      @albertbozesan  12 days ago +1

      If you don’t need a different angle of the house, use a Canny ControlNet, img2img of the photo with a medium denoise. Make sure your prompt is very descriptive. Then inpaint the parcel.
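
    The same idea can be sketched with the diffusers library; the tutorial itself uses a web UI, so the model names and parameters below are assumptions rather than the exact setup:

      import torch
      import numpy as np
      import cv2
      from PIL import Image
      from diffusers import ControlNetModel, StableDiffusionXLControlNetImg2ImgPipeline
      from diffusers.utils import load_image

      photo = load_image("empty_parcel.jpg")  # photo of the empty parcel

      # Canny edges of the photo become the ControlNet conditioning image
      edges = cv2.Canny(np.array(photo), 100, 200)
      canny_image = Image.fromarray(edges).convert("RGB")

      controlnet = ControlNetModel.from_pretrained(
          "diffusers/controlnet-canny-sdxl-1.0", torch_dtype=torch.float16
      )
      pipe = StableDiffusionXLControlNetImg2ImgPipeline.from_pretrained(
          "stabilityai/stable-diffusion-xl-base-1.0",
          controlnet=controlnet,
          torch_dtype=torch.float16,
      ).to("cuda")

      result = pipe(
          prompt="a modern two-storey house on the parcel, photorealistic",  # keep the prompt very descriptive
          image=photo,                # img2img source
          control_image=canny_image,  # Canny ControlNet input
          strength=0.5,               # "medium denoise"
      ).images[0]
      result.save("house_on_parcel.png")

    Inpainting the parcel afterwards would then be a separate pass, either with an inpaint pipeline or the inpaint tab of your UI.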

  • @sirmeon1231
    @sirmeon1231 1 month ago +1

    Works like a charm!
    My idea of using the depth map to drive an animation in Stable Diffusion did not work out that well though, so maybe I need to make the animations in Blender and only use generated textures from SD? 🤔 We'll see
    ...

    • @albertbozesan
      @albertbozesan  1 month ago

      It’s a perfectly valid idea! You can steer animations using depth, it just needs to be a rather complex AnimateDiff workflow in ComfyUI. I’ll have a course up semi-soon that includes something like that.

  • @MikeBurnsArrangedAccidents
    @MikeBurnsArrangedAccidents 21 days ago +3

    Don't forget to render in EEVEE, folks. Save yourself time.
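
    If you are setting the scene up from a script, the engine switch is a one-liner in bpy; depending on your Blender version the identifier is 'BLENDER_EEVEE' or 'BLENDER_EEVEE_NEXT':

      import bpy
      bpy.context.scene.render.engine = 'BLENDER_EEVEE'  # use 'BLENDER_EEVEE_NEXT' on 4.2+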

  • @LightWarriors4Life
    @LightWarriors4Life 1 month ago

    I've been looking into Blender and possibly turning some of our short films into animations.
    For copyright and trademark purposes, how safe is it, being open source? 🤔

    • @albertbozesan
      @albertbozesan  1 month ago +1

      Blender is used by massive commercial studios. It’s safe. Just make sure you download the official version off blender.org, there are some fakes out there.

  • @thibaudherbert3144
    @thibaudherbert3144 1 month ago +1

    Great tutorial. I've followed every step, but at the render stage the image is a depth map that's just a black-to-white gradient; it does not capture the depth of the meshes. I can't find the solution. It renders a flat image.

    • @albertbozesan
      @albertbozesan  1 month ago +1

      Maybe your camera is outside the room? In that case it would be rendering the outside wall - the “backface culling” I set up at the beginning is only for the viewport preview, not the render, unfortunately.

    • @thibaudherbert3144
      @thibaudherbert3144 1 month ago

      @@albertbozesan Yes, that fixed it! Thank you =)

  • @ZergRadio
    @ZergRadio 1 month ago

    Hey, I finally followed this tut, but when I dragged the window and couch into the room, they positioned themselves on top of the room or outside of the cube.
    I could not get them to sit at the 3D cursor, as in, on the floor. I had to use G and choose an axis to move them into position.
    Any ideas?

    • @albertbozesan
      @albertbozesan  1 month ago

      Is “snapping” on at the top of your viewport?

    • @ZergRadio
      @ZergRadio 1 month ago

      @@albertbozesan wrote "Is 'snapping' on at the top of your viewport?"
      No, it is not on (the magnet icon).
      It is not such a biggie.
      I follow other tuts and then I forget what I did.

  • @EternalAI-v9b
    @EternalAI-v9b 1 month ago

    Question about the two images at 12:00 and 12:06: how do you ensure the wall texture behind the guy is the same texture (like, precisely) as the one on the wall seen in the first image?

    • @albertbozesan
      @albertbozesan  1 month ago +1

      We don’t. We prompt and curate the results carefully.

    • @EternalAI-v9b
      @EternalAI-v9b 1 month ago

      @@albertbozesan Thanks for the answer!

  • @ZergRadio
    @ZergRadio 1 month ago +3

    Very, very nicely done. I love this method.
    As you said, this gives one much more control over what one wants!

  • @NerdyTrends
    @NerdyTrends 1 month ago

    Any reason why you don't use ComfyUI instead?

    • @albertbozesan
      @albertbozesan  1 month ago +1

      Ease of use for viewers! Comfy scares beginners away from AI and can be frustrating even for experienced users if you just want to do something quick and simple.
      That said, I have a ton of ComfyUI content coming soon 👀

  • @MikeBurnsArrangedAccidents
    @MikeBurnsArrangedAccidents 21 days ago

    Flux didn't care much for my reference image no matter how much I prioritized the ControlNet.

    • @albertbozesan
      @albertbozesan  20 days ago +1

      I prefer ControlNet with SDXL; it's more reliable at the moment. You could use Flux as img2img in a final pass?

    • @MikeBurnsArrangedAccidents
      @MikeBurnsArrangedAccidents 20 days ago

      @@albertbozesan Thanks! I won't bang my head against the wall looking for other workarounds.

  • @minuandtoyo
    @minuandtoyo 1 month ago

    Hi Mr. Albert. I've been following you for a long time; I just couldn't bring myself to cold message you on LinkedIn.
    I found a company (selling guitar courses), based in Helsinki, looking for an AI video creator intern. I know you are not an intern, but I thought you might be interested in reaching out to them; maybe you can collab in the future. I am not affiliated with them in any way, shape, or form, just saw the ad.
    Cheers!

    • @albertbozesan
      @albertbozesan  1 month ago

      Hi Cosmin! Thanks for reaching out and sharing this - don’t worry, feel free to connect on LinkedIn anytime!
      I’m very happy as Creative Director at Storybook Studios, but I’ll push this comment. Maybe somebody in this community will find it interesting!

    • @minuandtoyo
      @minuandtoyo 1 month ago

      @@albertbozesan OK, thank you for the reply. I'll forward the job listing in a PM then; maybe your studio might be interested.

  • @phu320
    @phu320 1 month ago

    I am likely to wait until there is a prompt app to generate .blend files. I am also likely to wait until the fucking nerds stop trying to make me learn more complicated shit to do shit nowadays !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! NO CODE SOLUTIONS !!!!!!!!!!!!!!!!!!!! PROMPT TO COMPLETE OUTPUTS ONLY !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    • @albertbozesan
      @albertbozesan  1 month ago +1

      And this is why the nerds get to make cool stuff first 🤓

    • @phu320
      @phu320 1 month ago

      @@albertbozesan NERDS !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!