There's something about Gleb's videos that puts me at ease. Is it the simple way he breaks down a complex concept? The logical workflow, his voice? I don't know, but I swear I come out a calmer human being.
I appreciate that you see the flow as logical, cause oftentimes I have doubts about it :) You're too kind, Matt!
1:49 Something to note about desaturation is that it doesn't account for relative luminance, where certain hues are usually seen as brighter than others (e.g. yellow vs blue). This usually means that desaturating will slightly flatten and wash out the value range of your image, which is nice for stylistic purposes but isn't necessarily what you want if you're trying to check your values.
There is a node under the Converter tab called 'RGB to BW', which gives a conversion that's more faithful to the original image.
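For illustration, here's a rough Python sketch of that difference, assuming a plain channel average for the desaturation and Rec.709-style luma weights for the luminance-weighted conversion (the exact coefficients Blender uses internally may differ):

    def desaturate(r, g, b):
        # Plain channel average: ignores how bright each hue is perceived.
        return (r + g + b) / 3.0

    def rgb_to_bw(r, g, b):
        # Luminance-weighted conversion (Rec.709-style weights): green and
        # yellow come out much brighter than blue, closer to how we see them.
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    yellow = (1.0, 1.0, 0.0)
    blue = (0.0, 0.0, 1.0)
    print(desaturate(*yellow), desaturate(*blue))   # ~0.67 vs ~0.33
    print(rgb_to_bw(*yellow), rgb_to_bw(*blue))     # ~0.93 vs ~0.07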
Thanks for the tip, that's valuable info!
I thought the same but didn't know about the rgb to bw node. Thanks!
Real-time compositor is so much fun!
Can't wait for render passes to also work ❤️
The passes, yes, that will be such a ground-breaking addition for sure. I can't wait for it too.
Thank you for putting this video together! I really appreciate how you broke down each section into a bite size chunk. This was very informative and let me take some of this immediately into my workflow. Much appreciated!
My pleasure! :) Glad it was useful to you!
THANK YOU for recommending the Offset/Power/Slope option for color balance. I learned that early on, and it makes such a difference in color grading. For those who don't know - Lift/Gamma/Gain assumes a lot about your color space and can clamp your values. Great video!
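For anyone curious, Offset/Power/Slope follows the ASC CDL idea. A minimal per-channel sketch of that formula in Python (not Blender's actual code, and clamping behavior varies between implementations):

    def cdl(value, slope=1.0, offset=0.0, power=1.0):
        # ASC CDL-style grade: out = (in * slope + offset) ** power, per channel.
        # Nothing here forces values into 0..1, which is part of why it behaves
        # more predictably on scene-referred data than Lift/Gamma/Gain.
        v = value * slope + offset
        return v ** power if v > 0.0 else v  # avoid fractional powers of negatives

    # e.g. lift the shadows a touch and brighten the midtones slightly:
    print(cdl(0.18, slope=1.1, offset=0.02, power=0.9))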
Makes such a difference, right? :) To be honest, I don't fully understand in what exact way Lift/Gamma/Gain is broken in Blender, but I take the word of Troy Sobotka and others who know more about it.
Gleb training his son to become mega-Landgren, nice video!
Amazing work man
This is so useful! Can’t wait for render passes to be available. 😛
That. That is precisely what could make this whole thing unstoppable.
Shout out to the background music choice though, love it.
It's not something I usually choose. This time it resonated with me, so I picked it immediately.
Blender is awesome, getting better with every release :O) Well done Gleb! Yes, more people could be donating to the Blender Development Fund to keep it growing and moving forward.
Definitely. The devs and the community are doing an amazing job shipping new features to Blender.
Very cool feature regardless of limitations. Awesome video!!!
I'm sure that the limitations will disappear over time as the new nodes are made compatible with GPU acceleration. Right before this tut got uploaded, Aidy noticed that a newer version of Blender (the 3.6 daily build, if I'm correct) DOES support the fog glow mode in the Glare node. Development is in progress :)
Learnt so much here. Just wanted to mention that there is a vignette node setup in the template section, you'll just need to make a few small adjustments 👍👍👍
Oh, really? I didn't know that, thanks! Can you give me a link or something so I can check it out, please? :)
Yes yes, thank you so much for the Realtime Compositor....
Terrific 💯💯💯
Thanks Gleb, you rock!
Blender devs & community rocks, we just help spread the fun :D
I can't believe your son is 6 already! Time flies...
Oh definitely, definitely.
love this!
Thanks George!
Thanks for creating such a great tutorial👏
I have one question, how did you learn this information? Did you just connect the nodes one by one or is there some kind of guide documentation for this?
I respect your tutorial, especially the film grain portion at 8:37, but I think increasing the post-processing Dither will also give the movie grain effect.
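If you want to try that from Python, the dither amount lives in the render post-processing settings; a quick sketch, assuming the current scene (the exact range and how grainy it looks may vary by Blender version):

    import bpy

    # Raise the post-processing dither to add more visible noise to the
    # final render, which can read a bit like film grain on smooth gradients.
    bpy.context.scene.render.dither_intensity = 2.0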
Thanks for the tip!
dang your kid has a huge future in modeling ahead of them lol
He loves sculpting! :D
The biggest mystery is why the compositor has a blur node, but the shader editor doesn't ☹️
Likewise, I really wish the shader and texture nodes were available in the compositor. Imagine the kinds of effects you could make by generating textures procedurally on the fly and then using them in the compositor.
If I recall correctly, it's not as simple as that, since the compositor works with a fixed number of pixels, whether from a rendered image or the pixels on your monitor's screen edge to edge. Projecting this same effect onto every possible shading point in 3D space, however, is either vastly more difficult or straight up mathematically impossible. So either you have infinite-resolution vector graphics or pixel-based effects (like blur), but not both.
@@penisdubs You can, if a bit differently. The texture node in the compositor utilises the procedural textures under the material tab in the 3D viewport (clouds, magic, marble, stucci, voronoi, wood, etc.).
You can do it, it's just slow without buffers, and tedious because the closest thing to loops in the shader editor is chaining node groups.
The basic idea is that you average different offsets of the texture you want to blur.
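A minimal sketch of that averaging idea in plain Python, assuming texture is any function of (u, v) such as a noise lookup; in the shader editor you'd reproduce this with repeated node groups sampling the texture at shifted coordinates and mixing the results:

    import random

    def blurred_sample(texture, u, v, radius=0.01, samples=16):
        # Average the texture at several randomly offset coordinates around
        # (u, v); more samples and a larger radius give a softer blur.
        total = 0.0
        for _ in range(samples):
            du = random.uniform(-radius, radius)
            dv = random.uniform(-radius, radius)
            total += texture(u + du, v + dv)
        return total / samples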
Because a blur node works by taking a certain radius around each pixel, averaging all of those values, and then setting that pixel to the average.
Shaders aren't working with pixels, they're working in object/world space, calculating PBR values and then assigning those values to the pixels on screen / in the render.
You can't average the values between pixels to blur when you aren't even truly working with pixels in the first place.
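For comparison, here's roughly what a pixel-space box blur does, sketched in Python on a 2D list of values (not the compositor node's actual implementation):

    def box_blur(img, radius=1):
        # Each output pixel becomes the average of the pixels inside a
        # (2 * radius + 1) square window around it, clipped at the image edges.
        h, w = len(img), len(img[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                total, count = 0.0, 0
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            total += img[ny][nx]
                            count += 1
                out[y][x] = total / count
        return out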
I always wondered why the K-Cycles build of Blender had this ability, but not actual Blender.
Indeed!
What ability?
Under a minute club
I have absolutely no idea how you do that. Some kind of magic, if you ask me.
@@GlebAlexandrov Gleb, don't look behind you :) He's probably staring at you from behind your back.
@@Igoreshkin oO!
Wait
He's already 6 years old???
Dang, time flies
Crazy how time flies, right? :D
But why is the render different?
Not sure I understood your question. 😊 What do you mean?
Amazing! I wonder if it would be possible to port this as an OpenFX plugin for use inside DaVinci Resolve/Fusion... 🤔🥹
I doubt that, but the idea sounds interesting :)
As an off-topic comment, I would definitely recommend watching Studio Petrikas' video on setting up the AgX view transform in Resolve, and using the linear 32-bit .exrs as the source footage. It's a solid foundation for the color management pipeline there, mimicking the way color management works in Blender.