I must say this is the clearest PBR explanation with actual working code. How about adding IBL in the next video~
Thank you!
IBL is on the roadmap but will take some time because I have other topics planned.
very clear and applicable, great work! I had to read the book for it in my project.
Cool, thanks!
I'm finding researching the topic through papers so difficult, and this is just the perfect gateway to the subject.
Thanks :-)
Very well simplified explanation...👍
Thank you :-)
Amazing tutorial, thank you so much for making this.
You're very welcome!
Great content as usual !!!!! Keep going and thanx for sharing!!!!!
You bet!
Great video, thanks!
You're welcome!
nice tutorial and great explanation!! :)
Thanks!
I actually did this while modding Halo 3's graphics. Thanks.
You're welcome :-)
@OGLDEV, Thank you for this upload. I have a question on the reflection of objects you briefly mentioned @ 12:53? How would you reflect other objects when using PBR? When I was working on blinn-phong, I had to render a framebuffer object of my scene and then use that as a texture for my reflective objects to simulate reflection. Is it the same for PBR too or is it much easier to do? I've done PBR as well, but couldn't figure out how to reflect objects only the skybox, thank you!
I haven't done this yet but perhaps you mean something like: learnopengl.com/PBR/IBL/Diffuse-irradiance ?
My scene feels dark when I use light sources, is there a reason for this? Colors come out less bright than I feel they should.
The question is whether this is a bug or simply incorrect or badly tuned params of the light/material properties. You can take the assets and lighting params from an existing app that seems to work correctly and plug them into your sample to compare.
When explaining the halfway vector, it looks like the light vector arrow is backwards. If I sum and normalise the two on screen, I'd get something below the surface.
I usually show the light in its "native" direction going from top to bottom and then the actual calculations of the diffuse factor (and in this case the half vector) are done by reversing the light direction. You will see the same thing in the code. I can see that the slide is a bit misleading in this case.
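Concretely, the fragment shader ends up doing something like this (the variable names here are placeholders, just to illustrate the reversal, not the exact tutorial variables):

    vec3 L = -normalize(LightDirection);          // reverse the "native" top-down light direction
    vec3 V = normalize(CameraPos - WorldPos);     // pixel to camera
    vec3 H = normalize(L + V);                    // half vector between the reversed light and the view
    float DiffuseFactor = max(dot(normalize(Normal), L), 0.0);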
In your experience, how much would Cook-Torrance cost in performance compared to, say, just doing Blinn-Phong?
I haven't actually measured so I can only guesstimate. I would assume that there isn't much of a difference even though we are adding more instructions with PBR, except maybe when the number of lights is very high. It also depends on the GPU, of course. If you have lots of light sources then you may be better off with deferred shading, but if you only have a few lights then I think the performance effect will be minor.
Hey, I have a question. Usually when calculating the reflection we don't have the "light" vector. We would have a reflection vector, but that doesn't necessarily mean that the reflected ray would bounce off toward a light source. Is that still what you meant by the "light" vector? Or do I have to somehow calculate the direction of the closest emitting material or something? Thanks in advance!
Reflecting and reversing (or negating) the light direction are two different things. When you want to calculate diffuse lighting you have to do a dot product between the normal and the light direction, but the light direction must point from the surface "up", same as the normal. For example, if the light hits the surface at maximum strength it means that the original light vector is perpendicular to the surface but opposite to the normal (which is also perpendicular, of course). So you multiply the light direction by -1 and when you do the dot product with the normal you get 1, which means maximum strength.

Reflecting is what you do when you want to calculate the specular effect. Let's say the light hits the surface at a 45 degree angle. To calc the reflection you call reflect(LightDirection, Normal) and this gives you a vector pointing 45 degrees on the "other side" of the normal. The original light vector points at the surface and the reflection points back up. This allows it to hit the camera, and the specular effect increases as the reflection and the pixel-to-camera vector become aligned. So you don't need to reverse in the specular case, just use reflect() with the original light direction.
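To make the difference concrete, a minimal GLSL sketch (placeholder names like PixelToCamera and Shininess are mine, not the exact tutorial variables):

    vec3 N = normalize(Normal);
    vec3 L = normalize(LightDirection);

    // Diffuse: negate the original direction so it points from the surface back to the light
    float DiffuseFactor = max(dot(N, -L), 0.0);

    // Specular: reflect() takes the original incoming direction and returns the bounced ray
    vec3 R = normalize(reflect(L, N));
    float SpecularFactor = pow(max(dot(PixelToCamera, R), 0.0), Shininess);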
@@OGLDEV Thanks! My question was really just whether I can replace the "light" in your video with the incoming ray, whether it comes from a light source or from an object, because, as you know, that's how raytracers work. Thanks again! Really underrated channel. Keep it up.
Oh, I see, I think you can.
Thanks!
How does shadow mapping work with PBR? Where do you apply the shadow value?
I guess it should work the same as the regular Phong model, since PBR only deals with the color of the object. Compare CalcDirectionalLight with CalcPBRDirectionalLight. The non-PBR version calculates a shadow factor which depends only on the light direction and the normal. You need to multiply it with the PBR color.
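Something along these lines (a rough sketch; the helper names and signatures here are assumptions rather than the exact tutorial code):

    float ShadowFactor = CalcShadowFactor(LightSpacePos, Normal, LightDirection);
    vec3 PBRColor = CalcPBRDirectionalLight(Normal, ViewDir);
    FragColor = vec4(ShadowFactor * PBRColor, 1.0);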
You can just multiply the light color/intensity with the shadow factor.
Hello, can you update the link to source please?
Done.
@@OGLDEV thanks
Is the metallic shine in video games due to PBR or raytracing?
Both techniques can create the effect of metallic shine but I would guess that PBR is more common than raytracing because afaik raytracing is slower and more compute intensive.
@@OGLDEV for sure, all these sports cars are made with PBR 🙂
That's why they are so expensive ;-)
@@OGLDEV and teenagers spend 2000 euros on a 4090 for fake global illumination
:-)
One thing I don't understand: you say that alpha (roughness) = 0 means a perfectly smooth surface, which means it reflects all light. But when alpha is 0 the whole normal distribution function D computes to 0 (since the numerator is 0*0), making the whole specular BRDF equal to 0, since D is multiplied by G and F... Is there something I'm missing here?
It's a good question. Obviously when alpha is zero the BRDF is zero, so we don't get specular lighting. I think that zero here serves as a kind of limit which we are not supposed to reach, and realistically the roughness values will be much higher (I consider 0.1 to be "much higher" ;-) ).

Here's an interesting experiment you may want to try: plug the GGX equation into a spreadsheet. Set an input column for alpha and have it start at zero and continue with small increments. NdotH can be constant for the entire spreadsheet. A good example is 1, where the light ray is reflected directly at the eye, but you may want to play with different values. When NdotH=1 and alpha=0.1 you are already getting 3183, which is very high, and alpha=0.01 is really crazy. So I guess alpha=0 should return infinity but the equation simply doesn't work that way. I'm pretty sure the people who invented this were well aware of this corner case and the papers may have addressed it already. Infinity is not very useful in practical applications so I guess they just ignored it. This is just my intuition.

Joey de Vries has a web-based example at learnopengl.com/PBR/Lighting which you can also play with. When alpha is very low you get a very concentrated specular reflection, and when it is zero there is nothing.
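If it helps, here is the GGX NDF as a GLSL sketch, assuming the common Alpha = Roughness * Roughness convention (with Roughness = 0.1 and NdotH = 1 it evaluates to roughly 3183, matching the spreadsheet numbers above):

    const float PI = 3.14159265359;

    float DistributionGGX(float NdotH, float Roughness)
    {
        float Alpha  = Roughness * Roughness;
        float Alpha2 = Alpha * Alpha;
        float Denom  = NdotH * NdotH * (Alpha2 - 1.0) + 1.0;
        return Alpha2 / (PI * Denom * Denom);   // grows without bound as Roughness approaches 0 when NdotH == 1
    }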
@@OGLDEV Aaah, thank you, this clarifies it! I got stuck on that believing I wasn't understanding some relatively simple math! Indeed, thinking about it, the stronger the specular reflection is, the smaller the area that is seen reflecting. So alpha must be non-zero or the reflecting area gets infinitely small. In the source you linked, alpha seems to start at 0.2 in the code. Thanks!
This is literally an irl cheat code
OMG! Is it a good thing or a bad thing??
@@OGLDEV I'm saying that this is wonderful, because it makes a hard topic very accessible.
Thanks!
You're welcome :-)
For an ideal, perfectly smooth surface, alpha=0. This makes D=0 because D is proportional to alpha^2, which makes SpecularBRDF=0. But I thought that for a smooth surface the specular BRDF should be high.
Hi, I'm on vacation. Will get back to you next week.
I'm back and turns out I already answered this question three months ago 🙂
Please see my response to @MrFacciolone below (ruclips.net/video/XK_p2MxGBQs/видео.html&lc=Ugz-W5xSJlCgSFO60EV4AaABAg.9jzRGRPKpap9k1dX8vgJdG)
Hi, first of all thanks a lot for providing us these videos. I have a doubt about the half vector calculation. Taking the direction of the light vector into consideration, shouldn't the equation for calculating the half vector be half = normalize(view - light)? As light + view would be more horizontal. Or should the direction of the light vector be the opposite (by which I mean it should be multiplied by -1)?
The light direction is provided by the application in the natural order - down towards the earth. I believe it's more intuitive for the developer to think about it like that. It is reversed in the fragment shader.
@@OGLDEV so for me to apply it to my case, I have to reverse it like he said? Multiply by -1?
@hajjex_9086 Depends on how you send in your light direction. The shader assumes that it is the original direction so it has to reverse it for the diffuse calculation because the dot product needs the direction from the surface back to the light source. But you can also optimize somewhere between the application and shader and reverse the light direction before it goes in and then the shader can use it as-is without extra ops.
@@OGLDEV I actually send my rays from the camera, and that's it. So in reality the ray is actually pointing toward the light source at some point. So does that need reversing?
@@OGLDEV Also, by original direction, you mean that it's from the light to the surface. This means we will simply do light vector + view? This confused me a little.
Wish the math was explained more with code, as math people like to use terse function names where you sometimes have to guess what they mean...
ok
Can you share the cpp file for this code?
github.com/emeiri/ogldev/blob/master/tutorial43_youtube