Depth, mist passes and depth of field in Blender, Nuke, Natron and Fusion.

  • Published: 25 Dec 2024

Comments • 147

  • @BlenderBob
    @BlenderBob  2 years ago +10

    zDepth issue HAS BEEN FIXED!!! Thanks Brecht!

  • @simonmueck
    @simonmueck 2 years ago +2

    Exactly what I needed, fabulous!

  • @chj.schwarz
    @chj.schwarz 2 years ago +3

    Informative and entertaining! Loved the Nostalgia Critic reference 😄

  • @Maarten-Nauta
    @Maarten-Nauta 2 years ago +6

    I use the Denoising Depth pass for atmosphere. It works really well, since rendering out a scene with volumetrics in Blender tends to denoise terribly.

  • @unboring7057
    @unboring7057 2 years ago +1

    Great stuff, thanks Blender Bob!

  • @MauriciodeOliveira
    @MauriciodeOliveira 2 years ago +8

    Great content as always :)
    As for the vertical fog, one quick way to do that in comp is by using a Position Pass (P World). The output image represents the position of each pixel in 3D space, and therefore: red channel = X (horizontal values), green channel = Y (vertical values) and blue channel = Z (depth values - yes, one can use the blue channel as a Z depth as well). Knowing this, I'm sure you get the point already: use G from PWorld as the mask for a Grade node (in Nuke terms). Extra control can be achieved by grading this mask, as you also did in one of your examples ;)
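
    A minimal Nuke Python sketch of that PWorld trick (the file path is hypothetical, and it assumes the position AOV is rendered into a layer named "P"):

        import nuke

        read = nuke.nodes.Read(file="render_####.exr")

        # Copy the PWorld green channel (world-space Y) into the alpha
        # so it can drive the Grade node's mask input.
        copy = nuke.nodes.Copy(inputs=[read, read])
        copy['from0'].setValue('P.green')
        copy['to0'].setValue('rgba.alpha')

        # Grade the beauty with the height mask connected to the mask input
        # (alpha is the default mask channel).
        grade = nuke.nodes.Grade(inputs=[read, copy])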

    • @BlenderBob
      @BlenderBob  2 years ago +2

      Yeah, I totally forgot about the point position, but then again, how do you add noise like I did in Maya?

    • @MauriciodeOliveira
      @MauriciodeOliveira 2 years ago +2

      @@BlenderBob you can use a P_Noise and multiply that with the previously generated mask :)

    • @darviniusb
      @darviniusb 1 year ago +1

      @@BlenderBob With the P pass, you go into Nuke and use an Expression node, and in the empty field below the r, g and b fields you add noise(r,g,b). Now you have noise spread over the P pass in the alpha channel, which you can control by adding a gain node before the Expression; play with its RGB values to scale the noise as you want.
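
      A sketch of that Expression-node setup in Nuke Python (it assumes the P pass is already shuffled into rgb and connected upstream; the gain value is arbitrary):

          import nuke

          # Scaling the position values changes the noise frequency in world space
          # (connect the P-pass Shuffle upstream of this node).
          gain = nuke.nodes.Multiply(value=0.5)

          # The fourth expression row (the one below the r, g and b fields)
          # writes the noise into the alpha channel.
          expr = nuke.nodes.Expression(inputs=[gain])
          expr['expr3'].setValue('noise(r, g, b)')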

    • @BlenderBob
      @BlenderBob  1 year ago

      @@darviniusb Cool! Way over my pay grade though. Thanks!

  • @Cladius76
    @Cladius76 2 years ago +1

    Thank you, this is mind-blowing!!!

  • @retroeshop1681
    @retroeshop1681 2 years ago +3

    A masterclass in every video, I feel like I'm going to my university, but actually learning :')

  • @gottagowork
    @gottagowork 2 years ago +3

    Regarding a custom depth pass, here is a time-consuming workaround that might work, although I didn't test it against transparencies and displacement:
    1) In some random material, add two reroute nodes before the material output and make a material node group out of them.
    2) Inside, add a Geometry node and run its Position output through a Vector Rotate (and possibly a Vector Add), followed by a Separate XYZ whose Z goes into a custom shader AOV.
    3) Add this node group to all materials in the scene, dropping it in as the last step - the actual shading info will just pass through it, but the AOV info will get added (see the script sketch after this comment).
    4) In the view layer properties, remember to add the custom depth pass so it gets exported. Preview in small outputs and adjust the parameters. Sadly, custom AOVs can't be previewed in the render preview.
    I gave this up in Blender because of the noise and edge issues you mentioned, but try using it for depth of field control in Nuke instead of mist.
    You should be able to get something similar to a tilt lens, with a tilted focus plane. Wild effect, and not often used.
    Think the tops of a flower field in focus while the bottoms of the plants/ground are not, with the camera horizontal. Would love to see the results with some heavy bokeh :)
    Hard to explain, but here is a real-life video on it used for photography: ruclips.net/video/gTDCW_8AsSo/видео.html
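
    A minimal Blender Python sketch of steps 3 and 4 (it assumes a node group named "CustomDepthAOV" already exists, with one shader input, one shader output, and an AOV Output node inside writing to an AOV called "customDepth"; both names are hypothetical):

        import bpy

        group = bpy.data.node_groups["CustomDepthAOV"]

        # Step 4: register the AOV on the active view layer so it gets exported.
        aov = bpy.context.view_layer.aovs.add()
        aov.name = "customDepth"
        aov.type = 'VALUE'

        # Step 3: splice the group in front of every material output; shading
        # passes through it untouched, the AOV just gets written along the way.
        for mat in bpy.data.materials:
            if not mat.use_nodes:
                continue
            nodes, links = mat.node_tree.nodes, mat.node_tree.links
            out = next((n for n in nodes if n.type == 'OUTPUT_MATERIAL'), None)
            if out is None or not out.inputs['Surface'].is_linked:
                continue
            src = out.inputs['Surface'].links[0].from_socket
            grp = nodes.new('ShaderNodeGroup')
            grp.node_tree = group
            links.new(src, grp.inputs[0])
            links.new(grp.outputs[0], out.inputs['Surface'])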

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Interesting approach. I know very well what a tilt lens is. I studied professional photography and I used to own a 4x5 camera where I could tilt the front to move the DOF up and down, left to right, or even diagonally.

    • @gottagowork
      @gottagowork 2 years ago +1

      @@BlenderBob I use something similar to make walls render invisible if viewed from the outside (thin walls), so as not to obstruct my view while navigating the scene, as well as for clay rendering without affecting lights and what would be glass windows and so on. It might be possible to make a script adding this to all materials, then asking what to do if emission or transparency is found, but that's out of my league.
      I tried it with assets using alpha, and for now it's a no-go. While I can get the alpha information into the node group and do some maths with it, it still registers as a ray hit and will obscure the object behind the alpha, which is not what we want. From what I recently heard, the denoising depth pass may be better suited for regular depth in some cases, but of course it won't work for custom depth. So a proper pass for this that handles transparency/alpha would be very welcome.

  • @Nic-tg2ei
    @Nic-tg2ei 2 years ago +1

    When the nuke shuffle node popped up I got a bit nostalgic. Misted up. So used to the new one, can't believe I forgot the old one!

    • @BlenderBob
      @BlenderBob  2 years ago

      It's still there. Go into the plug-ins, refresh, or something like that, and it will come back. I made a tool so I can get access to both.

  • @The.ARCHIT3CT
    @The.ARCHIT3CT 2 years ago +2

    The one thing no one EVER mentions in any of these videos about motion blur, volumetrics and a few other features within Blender is that you HAVE to render out the image/animation in full form. Some people tend to use the "render from viewport" option because it saves time, but that's exactly why it does: it disables heavy calculations that bog down the render, like motion blur and volumetrics. I can't tell you how many people have banged their heads on walls over this.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Really! Wow. We only use it for animation previs

    • @The.ARCHIT3CT
      @The.ARCHIT3CT 2 years ago

      @@BlenderBob Same here, that's what it's intended for.

  • @ВиталийЛевкович
    @ВиталийЛевкович 2 years ago +21

    You should separate the XYZ coordinates and plug the Z value into the color ramp. By default it uses XYZ as RGB and ramps on its brightness, and the XYZ values get higher diagonally. Also, use an empty, not a plane.

    • @BlenderBob
      @BlenderBob  2 years ago +5

      Yeah, good idea. Didn't think about that. Thanks

    • @sdados3725
      @sdados3725 2 years ago +3

      @@BlenderBob Yep, you shouldn't plug the coords directly into a color ramp. The color ramp doesn't use vector coordinates, it just remaps a value between 0 and 1, so if you plug in a vector value, that's what you get. You either separate XYZ and plug one component into the color ramp, or you use a ramp texture, which can actually read coordinates and will work accordingly... as a texture :)
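
      A small Blender Python sketch of that fix (the material and empty names are hypothetical): drive the ColorRamp with only the Z component of an empty's object coordinates instead of the raw vector.

          import bpy

          nt = bpy.data.materials["Fog"].node_tree
          coord = nt.nodes.new('ShaderNodeTexCoord')
          coord.object = bpy.data.objects["Empty"]   # gradient follows this empty's Z axis
          sep = nt.nodes.new('ShaderNodeSeparateXYZ')
          ramp = nt.nodes.new('ShaderNodeValToRGB')  # the ColorRamp node
          nt.links.new(coord.outputs['Object'], sep.inputs['Vector'])
          nt.links.new(sep.outputs['Z'], ramp.inputs['Fac'])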

  • @GaryParris
    @GaryParris 2 years ago +1

    good work Blender Bob :O)

  • @unicornhuntercg
    @unicornhuntercg 2 years ago +6

    For Blender, I would recommend using Eevee for the mist pass; way faster and less buggy.

    • @BlenderBob
      @BlenderBob  2 years ago +6

      Eevee doesn't support displacement. Not yet. And even when it does, it won't exactly match Cycles'.

    • @BlenderBob
      @BlenderBob  2 years ago +3

      Same with motion blur

    • @BlenderBob
      @BlenderBob  2 years ago +4

      But I did a clip where I was showing how to mix a volume fog from Eevee with a Cycles render

  • @Altohamy
    @Altohamy 2 years ago +1

    26:00
    Have you tried using the scale in the Texture Coordinates node?

    • @BlenderBob
      @BlenderBob  2 years ago

      Yeah. For some obscure reason I didn't think about it. I was in a Maya state of mind.

  •  2 years ago +2

    Listening to you in 1.5x speed is instant Ian Hubert style

    • @BlenderBob
      @BlenderBob  2 years ago

      I do this all the time. I try to keep my clips as short as possible, but it's not always doable.

    •  2 years ago

      ​@@BlenderBob your videos are amazing, they're densely packed with valuable information. If someone doesn't have the time to learn, why are they even here? (i watch all your videos on 1.5x speed, so i don't even know what you sound like normally :D )

  • @smoothprox
    @smoothprox 2 years ago +2

    Fusion has a Channel Booleans node that works similarly to the Shuffle node in Nuke.

    • @BlenderBob
      @BlenderBob  2 years ago

      Nope! Channel Booleans only sees what is connected to the previous node; all other channels are discarded.

  • @Tumiso
    @Tumiso 2 years ago +1

    I needed this 😪

  • @Kram1032
    @Kram1032 2 years ago +2

    A very common problem you keep having is that you neglect the data types you are working with. I'm pretty sure this is not the first time you've had this exact issue on this channel:
    Blue sockets in nodes are vector values. They have three different dimensions. It's not obvious how those ought to be converted to a single value (which is grey sockets), but Blender chooses, I *think,* the average value.
    So if you try to use this with a gradient, it'll just average together into a diagonal gradient.
    It's best to add a Separate XYZ node and pick out just the one direction you like. It'd make sense for that to be the Z direction. Then, if you pick a plane or, even better, an empty, the "up" direction of that object is gonna correspond to the direction of your gradient. (Although in some situations going for X or Y might be more suitable to your needs. In most cases you'll want Z.)

    • @BlenderBob
      @BlenderBob  2 years ago

      The point was that there's no way to do a planar projection with some kind of visual reference for where the projection will happen. Same with projecting UVs: you get a cylindrical projection but no control at all over it. Yeah, I agree on the XYZ approach.

    • @Kram1032
      @Kram1032 2 years ago +1

      @@BlenderBob yeah, I'm not saying the situation is perfect. I just noticed in the past, also in your Geometry Nodes experiments and such, that that particular concept of socket type kept throwing you a curveball.
      In principle, using another object's object coordinates (as you have done) should be what you want though. Then it should work pretty much exactly like in Maya, I think? As far as I could tell, the only issue there was the socket situation.
      It'd still be nice to have, like, an immediate version of that though. Spawning a texture directly as a 3D object does indeed seem rather useful. And I think every issue you mentioned about the depth and mist passes and how the various packages respond to them is a pretty big deal.
      I wonder if there could be an approach that's somehow the "best of both worlds", giving you Cycles' precision but in a convenient "confusion mask" or something. I mean, even the best-in-class post-processing depth of field effect didn't look nearly as good as what Cycles would have given you, but the gains in flexibility are obvious, and so it's an acceptable tradeoff.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      @@Kram1032 Yeah, sometimes I go too fast in publishing clips and I get people reminding me of the mistakes I made (especially in my Wish List clip), and I actually really appreciate that. I'm still learning. Blender Bob doesn't mean Blender Master. It's just a name; I actually said that in my very first clip. I thought it could have been when they changed to Cycles X, but no, it still didn't work in earlier versions. I don't know about 2.79 though. I will make an addendum clip like I do from time to time to correct my mistakes. Cheers

    • @Kram1032
      @Kram1032 2 years ago

      @@BlenderBob I also really appreciate your videos.
      I'm pretty well-versed with technicalities, but not really with artistry and trickery, and that's a gap you're filling nicely.
      It's also great to see this broad set of programs side by side, really showing off how powerful (or not) they are. And hopefully they'll also show weaknesses some dev somewhere is eventually going to address

  • @Vijaya7225
    @Vijaya7225 2 years ago +1

    Hello sir,
    can you demonstrate for us how to get a 3D animation with toon shading, as shown in "The Very Pulse of the Machine" episode of 'Love, Death & Robots', in Blender?

    • @BlenderBob
      @BlenderBob  2 years ago +1

      I know nothing about toon shaders. Check out Lightning Boys on YT.

    • @Vijaya7225
      @Vijaya7225 2 years ago +1

      @@BlenderBob Thanks for the recommendation.

  • @ong3r1
    @ong3r1 2 years ago +1

    Petition to make Alex Bob a running segment in the show

  • @chillywilson
    @chillywilson 9 months ago +1

    I've had some luck getting cleaner depth passes by running the denoiser node and the alpha into the depth for the EXR output so it antialiases the alpha. Not perfect, but it saves you from a 4-8K render on the Z alone.

  • @JonathanTRG
    @JonathanTRG 2 years ago +7

    I came across the bug with the white dots on the Z pass and mist pass so many times while working with plants. According to what I found on the internet, it's due to the fact that when there is alpha involved in the shader, any value that's not 0 or 1 will produce this artifact, which is weird because the opacity maps used for this kind of shader are pure white or black; maybe it's a problem with the colorspace, but I don't know.
    One way of dealing with this bug is to set the "Alpha Threshold" in the View Layer Properties to 0, so Blender will ignore the alpha. The problem is that if you work with low-poly plants you will get a big difference between the geometry and the alpha... So please, Blender devs, fix this problem!
    Also, this problem appears in other utility passes like the position pass, but the cryptomatte has no problem dealing with it......
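
    For reference, that workaround is a one-liner in Blender Python (a minimal sketch; it zeroes the alpha threshold on the active view layer so the data passes ignore alpha):

        import bpy

        # 0.0 = ignore alpha in the Z/mist/position passes (the default is 0.5)
        bpy.context.view_layer.pass_alpha_threshold = 0.0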

    • @fabbrobbaf
      @fabbrobbaf 2 years ago

      For the "buggy" Z pass with lots of fireflies, use the denoised version (denoising data must be activated first)! You'll get a clean output.

    • @JonathanTRG
      @JonathanTRG 2 years ago

      @@fabbrobbaf yes but it doesn't look like a Z pass

    • @fabbrobbaf
      @fabbrobbaf 2 years ago

      @@JonathanTRG Then you have to remap it with the Map Range node, setting for instance in min = 0 and in max = 250 to out min = 0 and out max = 1 (do not normalize it, otherwise it will change depending on the min and max range of the objects in the scene). Trust me, it works, and when I found it out I was ecstatic.
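
      A compositor sketch of that remap in Blender Python (a sketch only; it assumes Denoising Data is enabled so the Render Layers node exposes a "Denoising Depth" socket, and the 0-250 range is scene-specific):

          import bpy

          scene = bpy.context.scene
          scene.use_nodes = True
          nt = scene.node_tree

          rl = nt.nodes.new('CompositorNodeRLayers')
          mr = nt.nodes.new('CompositorNodeMapRange')
          mr.inputs['From Min'].default_value = 0.0
          mr.inputs['From Max'].default_value = 250.0   # fixed bounds, not per-frame normalization
          mr.inputs['To Min'].default_value = 0.0
          mr.inputs['To Max'].default_value = 1.0
          nt.links.new(rl.outputs['Denoising Depth'], mr.inputs['Value'])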

    • @JonathanTRG
      @JonathanTRG 2 years ago

      @@fabbrobbaf The "Denoising Depth" pass is not a denoised depth pass, and it's not clear what its purpose is; I assume it's a debugging pass. Anyway, it has some glossiness information in it (parts that are closer to the camera are whiter due to the specular highlight), it is anti-aliased, and it seems to have a different white value on long lenses. I could show you instances where it's absolute trash and doesn't work, but I can't share images here. So no, it's not a depth pass, and even if it works in some cases, when it doesn't most of the time, it's not a good replacement.

    • @fabbrobbaf
      @fabbrobbaf 2 years ago

      @@JonathanTRG Are you just supposing, or is this documented? Anyway, this pass has saved my ass so many times.

  • @Meteotrance
    @Meteotrance 1 year ago +1

    In any case, it's cool to put your finger on these little problems that hinge on tiny details. I know directors and compositing people are "control freaks", but doesn't it actually cost less nowadays to generate the bokeh and depth of field directly, by real simulation at render time? Notably in Blender, where you can set up the virtual camera and even simulate the bokeh rotation and the number of diaphragm blades. Sure, there are certainly cases where controlling the depth blur and motion blur in compositing is critical, notably to match them with live-action plates, but personally I also do my motion blur natively in the render, just like the bokeh; that's fewer render passes to manage in post.

    • @BlenderBob
      @BlenderBob  1 year ago

      I also prefer rendering the depth of field directly at render time.

    • @Meteotrance
      @Meteotrance 1 year ago

      @@BlenderBob Volumetrics are problematic for me to render in Blender; it's not as simple as in Lightwave or in compositing software, and in Vue Infinite you can produce clouds almost effortlessly. I don't know Houdini, but I know it's one of the most-used packages for this kind of mist and water effect. If I need an external tool for water or smoke, it's not within my budget; I already have a DaVinci Resolve license and the Substance pack plus ZBrush for my textures and sculpts. It's not always easy to juggle, but you can import ACEScg and ACES 2065 without an add-on in Blender; the problem is having to use the color override option, which only activates in OpenEXR mode in Blender, which is why the ACES-to-Blender add-ons proliferate, though they're not strictly necessary. They should really integrate a Filmic-to-ACES option; it would simplify things in future versions.

    • @BlenderBob
      @BlenderBob  1 year ago

      @@Meteotrance Filmic is a transform. ACES is a colorspace. They're not the same thing.

  • @bro_liv
    @bro_liv 1 year ago

    It might've been asked already, but what about doing that DOF comp in After Effects?

    • @BlenderBob
      @BlenderBob  1 year ago +1

      I don't do compositing in AE. Someone else will have to try it.

  • @johntnguyen1976
    @johntnguyen1976 2 years ago +1

    You should do your entire Blender Conference talk as Alex Bob...that would be epic

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Oh man! I could only do 4 takes. My throat was killing me!

  • @gordonbrinkmann
    @gordonbrinkmann 2 years ago +1

    Good comparison video. As for the mist pass in Blender, when you mention it should be adjustable per camera... I guess it's not because it's meant to be (and that's why it's in the world settings) misty (hence the name) air in the environment, not a camera feature. And if I were looking into a (homogeneous) misty environment, the mist would increase with distance, no matter where I place my camera or how far away the farthest object of interest is. A camera can change its focus, but not how deep into the mist it can see. But I'm sure you are aware of that, and of course you are talking from an artistic point of view. Which I totally understand; I get why it would be nice to have a mist pass per camera.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Like you said, it's all about the look, not the physics. :-)

  • @coolbluesky4078
    @coolbluesky4078 2 years ago

    Size for aperture really makes sense though.

  • @Everyday_Foreman
    @Everyday_Foreman 2 years ago +3

    The Alex Jones of the Blender world. Interesting! You certainly go into detail on the Z pass and mist. Very nice job!!

  • @darviniusb
    @darviniusb 1 year ago +1

    The last time Fusion saw a real improvement was back with Fusion 9, in 2017.

  • @ian.jo_
    @ian.jo_ 2 years ago +1

    I just started using DaVinci like 2 days ago, but in the Image tab you can select the EXR layer and add extra channels, if you want to do it that way.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      That's the whole point. I don't want to assign everything manually. That's the worst workflow ever.

    • @xanzuls
      @xanzuls 2 years ago

      @@BlenderBob Well, you do have to manually assign passes in Nuke's Shuffle node as well, and in Fusion it's a bit more manual and you can use the script to split the loader into different passes. I don't see how that's a different workflow; if you want to change something later, you can just change the channels of each loader node. And BM didn't make Fusion lol, they bought it from the original devs in 2014, so if you don't like the EXR workflow, blame them lmao.

    • @BlenderBob
      @BlenderBob  2 years ago

      @@xanzuls If you have an EXR with 5 layers and each of them has diffuse, glossiness, transparency, normals, and a point position AOV, that's 70 layers. Can you even have 70 layers? And if you use the script, you will have a mess of 70 nodes that you need to update one by one every time you have a new version. BM bought Fusion 9 years ago and they still have the same team, so no excuses here, my friend.

    • @BlenderBob
      @BlenderBob  2 years ago

      So I don't care if I need many shuffle nodes. They don't need to be updated, only the top node. Much cleaner.
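
      A sketch of that one-Read, many-Shuffles layout in Nuke Python (the file path and layer names are hypothetical): a new render version only means repointing the single Read node.

          import nuke

          read = nuke.nodes.Read(file="/renders/shot/v002/shot_####.exr")
          for layer in ("diffuse", "glossy", "transmission", "normal", "P"):
              sh = nuke.nodes.Shuffle(inputs=[read], label=layer)
              sh['in'].setValue(layer)   # pull that AOV layer into rgba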

    • @xanzuls
      @xanzuls 2 years ago +1

      @@BlenderBob Yeah, I think you can. I think you can use the Channel Booleans node like the Nuke Shuffle node as well. Yeah, it's been quite some time, and they have had the same main guy as the lead dev as well, but they haven't updated many things in Fusion yet.

  • @electronicmusicartcollective
    @electronicmusicartcollective 8 months ago +1

    The news at 2:45. For a second I thought, what kind of ad did I get... hehehe :)

  • @BasedMando
    @BasedMando 2 years ago +1

    You're a maniac!
    We need more conspiracy theory rants!
    Z zeee zeeee!!!

  • @RomboutVersluijs
    @RomboutVersluijs 2 years ago +1

    Aren't those white speckles in the mist pass caused by low transparency samples in the light paths panel? I think you need more samples.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      I’ll try it but I doubt it. Thanks for the suggestion

  • @Abolautiainen
    @Abolautiainen 2 years ago +1

    Is it possible to just blur the Z pass slightly to smooth out the edges?

    • @BlenderBob
      @BlenderBob  2 years ago

      Z depth passes are never antialiased, because it would cause artifacts like the ones I show somewhere in the clip when using the mist pass instead.

  • @bologcom
    @bologcom 2 years ago +2

    Complete and thorough, as if done by a conspiracy theorist!

  • @phalhappy8612
    @phalhappy8612 2 years ago +1

    Idk if this can fix the mist pass noise in Blender rendering, but in the denoising data there are three channels: Denoising Normal, Denoising Albedo and Denoising Depth.

    • @BlenderBob
      @BlenderBob  2 years ago

      Yeah… not sure about that. They really need to fix it.

  • @nurb2kea
    @nurb2kea 2 years ago +1

    A Normalize node for the Z channel, and a ramp afterwards!?

    • @BlenderBob
      @BlenderBob  2 years ago

      As a mist pass? The Z goes to infinity, so I'm not sure how normalize would work. I will give it a try. (You're talking about using it as a mist pass?)

    • @nurb2kea
      @nurb2kea 2 years ago

      @@BlenderBob I sent you two pics via Twitter and another DM with other info!

    • @nurb2kea
      @nurb2kea 2 years ago

      @@BlenderBob So did you have a look at the images I posted to you on Twitter, or is it like the last times, where you ask for help and then don't even look at it!?

    • @BlenderBob
      @BlenderBob  2 years ago

      @@nurb2kea Hi! I didn't receive the images. All I have from you is a message about implementing the Magic Mouse. What are you referring to with "last time"? I never got anything else from you on Twitter.

    • @BlenderBob
      @BlenderBob  2 years ago

      Please understand that it's hard to keep up with all the messages I get from Facebook, LinkedIn, Twitter and RUclips. I do my best, but sometimes messages get lost in the ocean, especially when they are replies to replies to replies.

  • @DMVB2307-u6d
    @DMVB2307-u6d 2 years ago

    A solution I found is using an inverted mist pass from Blender; it solves that edge issue.

  • @fabbrobbaf
    @fabbrobbaf 2 years ago

    For the "buggy" Z pass with lots of fireflies, use the denoised version (denoising data must be activated first)! You'll get a clean output.

    • @BlenderBob
      @BlenderBob  2 years ago

      Denoising data?

    • @fabbrobbaf
      @fabbrobbaf 2 years ago

      @@BlenderBob Hi Bob, it's in the Properties panel > View Layer properties, where you can find all the render passes. Apparently I was wrong, but I got to use it successfully on my last project, where I had a large-scale forest scene with an unusable depth pass. Never mind...

    • @BlenderBob
      @BlenderBob  2 years ago

      @@fabbrobbaf How come I never realized that there's a denoising data option!?! Doh!

    • @BlenderBob
      @BlenderBob  2 years ago

      @@fabbrobbaf Actually, denoising data will not denoise the data passes. It just outputs the information that is used for denoising, if I'm not mistaken.

    • @fabbrobbaf
      @fabbrobbaf 2 years ago

      @@BlenderBob 🤷‍♂️
      Anyway, in 3.3 the denoised depth pass seems broken...

  • @rano12321
    @rano12321 10 months ago +1

    Use the standalone version of Fusion, it's much better for compositing like this. The free version inside Resolve doesn't have the proper workflow for this.

  • @hasanhuseyindincer5334
    @hasanhuseyindincer5334 2 years ago +1

    👍👍

  • @swordofkings128
    @swordofkings128 2 years ago +2

    Around 26:04 - I can't answer why using a gradient texture (edit: color ramp) behaves like that, but a (possibly) better way to get vertical depth with a ramp is to use an empty or an object in a Texture Coordinate node, then use a Separate XYZ node and take the Z output for your needs. You can scale the Z scale of the empty up or down to get more or less of a gradient vertically. I've attached a link to an image on my Google Drive:
    drive.google.com/file/d/1FtFtRoVkLudm3rezkwvOY3umazTeLflr/view?usp=sharing
    I don't know if it'll work for your purpose, but I've used this for lots of "global" kinds of effects, like dirt/grunge on a bunch of things on the floor, a dissolve effect that goes over everything, and some stylized lava glow.

    • @BlenderBob
      @BlenderBob  2 years ago

      Nice! Didn't even think about that. I could probably use the point position pass too. Forgot to try.

  • @RomboutVersluijs
    @RomboutVersluijs 2 years ago

    You could use a node group and Python to add that vertical ramp to every material and thus still use bump and displacement. The Eevee add-on material override can do this.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      The idea is to get it as an AOV without messing with the original shaders. It should be built in, like in RenderMan.

    • @RomboutVersluijs
      @RomboutVersluijs 2 years ago

      @@BlenderBob What happens if you use it as a material override and then use the diffuse pass?

  • @amswil3232
    @amswil3232 1 year ago +1

    There is a plugin for Fusion called Reactor.

  • @RomboutVersluijs
    @RomboutVersluijs 2 years ago +1

    I wish they'd add more samples to the Z depth. It still renders with just 1 sample, and that's what causes it to be so jagged.

    • @fabbrobbaf
      @fabbrobbaf 2 years ago +1

      For the "buggy" Z pass with lots of fireflies, use the denoised version (denoising data must be activated first)! You'll get a clean output.

    • @RomboutVersluijs
      @RomboutVersluijs 2 years ago

      @@fabbrobbaf I doubt that fixes it; it's 1 sample and isn't anti-aliased.

    • @fabbrobbaf
      @fabbrobbaf 2 years ago +1

      @@RomboutVersluijs Give it a try. However, stick to 3.2.2, because in 3.3 I get different results... there must be a bug. I'll report it.

  • @simoneiorio9703
    @simoneiorio9703 2 years ago +1

    Pixeltrain (a RUclips channel) has a beginner tutorial series on compositing in Blender that is extremely well done; that series also explains how to apply the mist pass to the focus properties (which is what they suggest, due to the lack of antialiasing on Z).

  • @RomboutVersluijs
    @RomboutVersluijs 2 years ago

    I've made an add-on which adds empties to a camera; with it you can control the Z depth much more easily. I built it following someone's example. I'll try to post a link to that video and my repo.

    • @RomboutVersluijs
      @RomboutVersluijs 2 years ago

      Here is that video, hope RUclips doesn't delete this link. I hate it when they do that.
      ruclips.net/video/5eoYmUfyUyw/видео.html

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Why do you need to control it? All the information is already there. It's the tool that deals with it that needs to be fixed.

    • @RomboutVersluijs
      @RomboutVersluijs 2 years ago

      @@BlenderBob In Blender it's not easy to control otherwise.

    • @RomboutVersluijs
      @RomboutVersluijs 2 years ago

      @@BlenderBob You could use it for DOF; it renders faster vs rendering with real DOF. The downside is that it's just 1 sample, and that's why it looks so jagged.

  • @jeffreyspinner9213
    @jeffreyspinner9213 2 years ago +1

    Idk how you did it, but as I inspect the border of your blue shirt against the background, it does look like a green screen... Either way, your background is uber-distracting, because I'd want to live there, or at least have a living room like that with such big windows and lots of plants... 😭 I came for the depth pass, and I got misted.

    • @BlenderBob
      @BlenderBob  2 years ago

      It is my living room. Thanks

  • @robertYoutub
    @robertYoutub 2 years ago +5

    The point with Fusion is that it doesn't automatically read many channels from Blender, because Blender's naming is very wrong. Well, EXRs and Blender are a nightmare of their own. Anyway, presets do the best job there, unless you used view layers, but that's Blender's problem. Also, Blender does depth wrong with motion blur.

    • @BlenderBob
      @BlenderBob  2 years ago

      In Nuke I just need one node, then I shuffle to get the layer I want. So I only need to update one node if I change versions.

    • @robertYoutub
      @robertYoutub 2 years ago

      @@BlenderBob Well, but if the name changes and you output a series of layers, things are different. Yes, Fusion is more work; I have both setups here, in Fusion and in Nuke. The last project was rendered in Cycles with the help of TurboRender, thanks to Garage Farm who made it work. I still needed to do a bit of denoising on diffuse and transparent... So I depend on outputting nearly all layers and also on having smaller files. A bit complex for a comment.

    • @BlenderBob
      @BlenderBob  2 years ago

      @@robertYoutub I just answered this to another comment: If you have an EXR with 5 layers and they each have diffuse, glossiness, transparency, normals, and point position AOV, that's 70 layers. Can you even have 70 layers? And if you use the script, you will have a mess of 70 nodes that you need to update one by one every time you have a new version. In Nuke, I may have many shuffle nodes but they don't need to be updated. I only need to update one node.

    • @robertYoutub
      @robertYoutub 2 years ago +1

      @@BlenderBob Yes in theory, but not if you know Fusion. It's a bit tricky though. And yes, Nuke does it better. I like Nuke, but then Fusion has some awesome stuff that I also like. You're absolutely right about the manual setup in Fusion. That sucks. But as the names are always identical in all renderers, except Cycles, it's just a replacement of the root. As said, I have setups with a single node and a bit of shuffling.

    • @BlenderBob
      @BlenderBob  2 years ago

      @@robertYoutub Nice!

  • @jkartz92
    @jkartz92 2 years ago

    But the Z depth pass in Blender has pixelated edges, why so? That's why people use the mist pass as an alternative to Z depth, I guess.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      It's supposed to be like that. The Z pass describes where the object is in space relative to the camera. If you antialias the image, then the edges of the object will be distorted because of the blurriness. Z depth functions in Nuke will compensate for that, but if another software gives you issues, it's because they are doing it wrong.

  • @user-rx3xl7zn1u
    @user-rx3xl7zn1u 2 years ago +1

    These "Blender bugs" have been there for years. I realize it's free software, but I don't understand why the Defocus node and the issue with alpha map edges haven't been addressed by now, especially considering all the support the Blender Foundation has been receiving for several years.
    The Vector Blur node is another one with bugs -- the first frame is always unusable, and it sometimes just freaks out, requiring frames to be re-rendered.
    There are always workarounds, but it's unprofessional and embarrassing. I find myself telling friends, "Yeah, sorry. That's just a Blender bug. Hopefully they'll eventually fix it, but it's been there for years, so don't hold your breath. Here's an awkward, hacky workaround."

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Usually, issues with MB on the first frame are because the animation didn't start before frame 1.

  • @gavinryan2012
    @gavinryan2012 2 years ago

    That wasn't you dressed up as John Cleese pretending to be an outrageous frenchman in Monty Python and the Holy Grail , was it?... 🤔

    • @BlenderBob
      @BlenderBob  2 years ago

      Nope! It was me doing a parody of Alex Jones :-)

  • @ObscureHedgehog
    @ObscureHedgehog 2 years ago +1

    WHY DOES NUKE HAVE TO BE SO EXPENSIVE??? 😭😭😭

    • @aaronsbot
      @aaronsbot 2 years ago +1

      Nuke Indie! $500 a year! Great deal

    • @JonathanTRG
      @JonathanTRG 2 years ago

      You can also get the non-commercial version for free; you just have some minor drawbacks.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Because it works?

    • @johntnguyen1976
      @johntnguyen1976 2 years ago

      @@aaronsbot LOVE Nuke Indie!! And $500 a year is actually less expensive than a year-long Adobe license (at least the all-apps version). I got Nuke Indie last year for freelance and still haven't hit any limitations yet.

    • @xanzuls
      @xanzuls 2 years ago

      Because it's a monopoly.

  • @Cyber_Kriss
    @Cyber_Kriss 2 years ago +1

    This is a comment.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      Really deep. Never thought about that. :-)

  • @imregpap
    @imregpap 2 years ago

    Why does Blender use f-stop values that make no sense for real-world lenses, like 0.2? The f-stop has to be set way under 1 to get a shallow DOF, while real lenses rarely go below 1.2.
    Sure, no big deal, but don't call it f-stop then. Call it DOF ratio or whatever. :)

    • @BlenderBob
      @BlenderBob  2 years ago

      Are you working in real size?

    • @imregpap
      @imregpap 2 years ago

      Yes, sir. I did not run tests; setting it to 0.4 gave the desired result and that was that. It's just an impossible real-life value. I'll make some tests; maybe it was my scene after all.

    • @BlenderBob
      @BlenderBob  2 years ago +1

      @@imregpap That reminds me that I wanted to mention that the scene has to be real size, but I don't remember if I mentioned it. I think I did make the test in one of my first clips, like 2 years ago. I will try again.

    • @imregpap
      @imregpap 2 years ago +1

      @@BlenderBob I took a quick close-up selfie in my room and replicated it in Blender with the same settings, and it looks OK! F-stop 2.4, 20mm, 25cm from my face. Focusing on the tip of my nose, the back of my head is already blurry, like in the photo. So it's accurate. I think I just made huge outdoor scenes with the focus plane meters away from the camera and expected more blur. But yeah, it depends on more factors than the f-stop alone.
      My bad, Blender seems accurate. :)

    • @BlenderBob
      @BlenderBob  2 years ago +1

      @@imregpap Four factors can influence the DOF: focal length, distance to subject, sensor size and aperture. They all need to match in order to make comparisons.
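
      For anyone who wants to check those four factors numerically, here is a small Python sketch using the standard thin-lens depth-of-field formulas (the 0.03 mm circle of confusion for a full-frame sensor is an assumption):

          def dof_limits(f_mm, n, s_mm, c_mm=0.03):
              """Near/far limits of acceptable focus; all distances in mm."""
              h = f_mm ** 2 / (n * c_mm) + f_mm                 # hyperfocal distance
              near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
              far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float('inf')
              return near, far

          # The selfie test above: 20mm at f/2.4, focused 25cm from the face
          print(dof_limits(20, 2.4, 250))   # -> roughly (240, 261), so the back of the head is blurry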

  • @thewhiteunbox
    @thewhiteunbox 1 year ago +1

    So many mistakes that it would take longer to list them than the length of the video. Sorry to say it, but this is very poor and misleading content.

    • @BlenderBob
      @BlenderBob  1 year ago +1

      Well, if you want to be constructive in your criticism instead of just complaining, please feel free to make your own video and correct all the many mistakes I made; otherwise your comment is pointless. I do make mistakes from time to time, and people telling me about them helps me and everyone else. So please, tell me how it could be better and I'll make a new clip with the changes.