Blender: Using Normal and Displacement Maps Together? STOP!

  • Published: 7 Oct 2024
  • Still using Normal and Displacement Maps incorrectly? In this video, I explain why using both maps together can ruin your render results and how to use them optimally instead. I'll show you when to use the displacement map and when the normal map is the better choice.
    Forget complicated theory - I break everything down in a simple and understandable way, so you can instantly achieve better renders! Whether you’re using Blender, Unity, or other tools: These tips are essential to bring your 3D models to life the right way.
    #b3d #displacement #pbr

Comments • 22

  •  2 days ago +14

    It depends on the tessellation algorithm used (see the sketch after this list):
    - If your tessellation recalculates the normals after the displacement, then you don't need your normal-mapping information. This is very costly in general, because you need to recalculate the normals based on the adjacency of the other vertices and the surface area of the triangles that share each vertex. This is the approach shown in your video. It has one major disadvantage: you rely only on the generated geometry for your shading, so you need a lot of geometry to capture all the detail of your normal maps. And if you ever change that tessellation level dynamically, there will be popping in the shading.
    - If your tessellation only displaces the vertices but leaves the normals untouched, then you need to rely on the normal-map information to correctly shade the new geometry. This has 2 main advantages:
    -- It's faster to compute, because you don't need to calculate the new normals.
    -- You can adjust the tessellation level and still have a reasonable geometric level of detail while keeping the same shading information across the generated geometry, which helps reduce visual popping.
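
    A minimal NumPy sketch of the two strategies (my own illustration, not from the video; the height data is made up):

    import numpy as np

    # Toy displacement (height) map, values in 0..1.
    h = np.random.rand(64, 64)
    scale = 0.1  # displacement scale

    # Strategy 1: displace, then RECALCULATE the normals from the new
    # geometry (central differences stand in for the adjacency-based
    # recalculation a real engine would do per vertex).
    dz_dy, dz_dx = np.gradient(h * scale)
    n = np.dstack([-dz_dx, -dz_dy, np.ones_like(h)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)  # recalculated shading normals

    # Strategy 2: displace the vertices but KEEP the original normals,
    # and let the normal map supply the shading detail instead (here a
    # flat placeholder map pointing straight up).
    normal_map = np.dstack([np.zeros_like(h), np.zeros_like(h), np.ones_like(h)])
    n_cheap = normal_map  # no recomputation cost, stable across LOD changes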

    • @cgmechanics5855
      @cgmechanics5855  1 day ago +3

      Thank you for your detailed comment. However, I have to disagree. When applying displacement in most modern 3D software, including Blender, the normals are always recalculated because it’s essential for correct shading. If the software didn’t recalculate the normals, it wouldn’t have the information needed to determine the height changes from the displacement accurately. This is crucial as the mid-level and the displacement height must be adjusted accordingly, which requires a calculation.
      Even if you use a normal map in addition to displacement, it wouldn’t fix the issue if the normals from the displacement aren’t recalculated. The software needs to know the updated surface angles to calculate the light and shading accurately. The only time something similar might happen is with parallax mapping, which isn’t true displacement, but rather a trick to simulate depth.
      Out of curiosity, could you specify which software or engine you are referring to that doesn’t automatically recalculate the normals when displacement is applied? This seems quite unusual, as it would be difficult to achieve accurate shading without this step.

    • @ABaumstumpf
      @ABaumstumpf 22 hours ago

      @@cgmechanics5855 "When applying displacement in most modern 3D software"
      That entirely depends on the actual shading used - and it is mostly only a problem with some 3D render engines that are designed for artistic image/video production. The architecture software I have used has all had options to tweak those settings - like recalculating surface normals on tessellation, recalculating them with displacement maps, or using the normal maps and ignoring all other inputs.
      And then there is the vast world of realtime-rendering (including game-engines).
      "could you specify which software or engine you are referring to that doesn’t automatically recalculate the normals when displacement is applied"
      Maya, 3ds Max, and many more have the option for that. And it is often quite useful.
      "as it would be difficult to achieve accurate shading without this step."
      Quite the opposite.

    •  19 hours ago

      @@cgmechanics5855 I would like you to check your own video again, at 5:14 and 5:19.
      In your video you mark in red the region where the displacement occurs. Before the displacement happens, you can see that the texture in that region is darker than the brick's color. That's a "shadow" in the texture, because, I presume, the cleaning of that texture was not very good, so there is still shading information in it. When the displacement occurs, that's the region of the texture that is stretched, giving the impression that a shadow is formed at that specific point, but I don't think it is. If you check other parts of the texture that are displaced (not the brick edges, because they all, more or less, have the same "shadow"), you'll see that there is no change in shading.
      Could you do an experiment? Disconnect the texture from the shader and put in just a white color, then toggle the displacement on and off and see what happens. Does the "white texture" get shaded by the displacement?
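
      A hedged bpy sketch of that experiment (the material and node names "BrickMaterial", "Principled BSDF", and "Displacement" are assumptions - adjust them to the actual node tree):

      import bpy

      mat = bpy.data.materials["BrickMaterial"]   # hypothetical material name
      nodes = mat.node_tree.nodes
      links = mat.node_tree.links

      # 1. Disconnect the color texture and use plain white instead.
      bsdf = nodes["Principled BSDF"]
      for link in list(bsdf.inputs["Base Color"].links):
          links.remove(link)
      bsdf.inputs["Base Color"].default_value = (1.0, 1.0, 1.0, 1.0)

      # 2. Toggle displacement by muting the Displacement node, then
      #    compare renders to see whether the shading actually changes.
      disp = nodes["Displacement"]                # hypothetical node name
      disp.mute = not disp.mute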

  • @theminecraft4202
    @theminecraft4202 2 days ago +12

    The title is misleading, but yeah, I think the main point is that your normal and displacement maps should not contain overlapping information... When used correctly, you can use both at the same time to get the advantages of each: generally, displacement for the macro details and normals for micro surface detail.

    • @cgmechanics5855
      @cgmechanics5855  2 days ago

      Artificially generated textures can contain different information. But baked textures like scans will always have the same data. It's just the truth.

    • @janiscivan8470
      @janiscivan8470 2 days ago +7

      @@cgmechanics5855 I agree, though the title is not good. It's not like someone forbids you from using them together, and you could make a case for artistic choice. Also, when writing your own shaders for games, you may use the height map to displace vertices without recalculating the normals, relying on the normal map for correct shading. In other uses, like parallax occlusion mapping, using the normal map together with the height map is actually crucial. Nevertheless, for Blender's displacement you are correct: in theory it makes the normal map obsolete.
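
      A hedged 1-D Python sketch of that parallax-occlusion point (all data made up): the height map decides WHERE the view ray hits the surface, and the normal map is then sampled at that shifted coordinate for shading - neither map can replace the other.

      import numpy as np

      heights = np.array([0.0, 0.1, 0.9, 0.9, 0.1, 0.0])  # toy 1-D heightfield

      def pom_hit(u0, slope, steps=32, depth_scale=0.2):
          """March along the view ray and return the u coordinate where
          the ray first drops below the heightfield."""
          u = u0
          for i in range(steps):
              t = i / steps                        # ray depth so far, 0..1
              u = u0 + slope * t * depth_scale     # horizontal offset so far
              idx = int(np.clip(u * (len(heights) - 1), 0, len(heights) - 1))
              if 1.0 - t <= heights[idx]:          # ray entered the surface
                  return u                         # sample the normal map HERE
          return u

      u_hit = pom_hit(0.3, slope=1.5)
      # shading_normal = normal_map[u_hit]  -> the height map gave the offset,
      #                                        the normal map gives the shading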

  • @blendgat1951
    @blendgat1951 1 day ago +1

    It's a fascinating and controversial topic, and your explanation is well worth watching. Thank you very much

  • @JohnnyFehr
    @JohnnyFehr 1 day ago +1

    There is nothing wrong with using displacement maps and normal maps at the same time. Besides that, I recommend using bump maps instead of normal maps for offline rendering anyway, for many reasons.
    The issue you are pointing out really depends on whether there are overlapping details and on how the shader works; it's not the general problem you make it out to be.

  • @aksi221
    @aksi221 3 days ago +3

    very interesting video, I learned something new

  • @hoyteternal
    @hoyteternal 9 hours ago

    I believe the practice of using normal and height maps together comes from video game engines and other renderers, where displacement changes the height (or creates parallax) but doesn't recalculate the normals.

  • @SlayeRFCSM
    @SlayeRFCSM 1 day ago

    Five years in 3D, and only now do I realize I was doing this wrong. Oops. Thanks!

  • @sinachiniforoosh
    @sinachiniforoosh 1 day ago

    My understanding is that height maps are for the large-scale geometry of the surface, whereas normal maps are for the surface details that are smaller in scale. If the normal and height maps are prepared properly, they can and maybe should be used together.

  • @nicholaspostlethwaite9554
    @nicholaspostlethwaite9554 14 hours ago

    So what is needed is a way to subtract from the normal map the displacement info, up to the mesh's ability to displace it, and then use both? So only detail finer than what can be displaced is created/faked by the normal map.
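
    A hedged NumPy sketch of that split (the filter radius and data are assumptions): keep the low frequencies the mesh can actually displace in the displacement map, and bake only the residual fine detail into the normal map.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    h = np.random.rand(256, 256)          # stand-in height map

    h_low = gaussian_filter(h, sigma=8)   # what the mesh CAN displace
    h_high = h - h_low                    # detail finer than the mesh

    displacement = h_low                  # displacement map: low frequencies only

    # Normal map built from the residual detail only, so the two maps
    # no longer encode overlapping information.
    dz_dy, dz_dx = np.gradient(h_high)
    n = np.dstack([-dz_dx, -dz_dy, np.ones_like(h)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    normal_map = n * 0.5 + 0.5            # pack -1..1 into 0..1 RGB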

  • @akuunreach
    @akuunreach 21 hours ago

    One way you could use both would be to use displacement for the things only displacement can do, and use normals for high-frequency detail.
    Let's take a brick wall, for example.
    You could have a very low-poly wall and use displacement maps for the overall brick shape and for imperfect or damaged brick edges and corners.
    Basically, anything that pokes out.
    Then, for the fine texture of the brick and grout, you can use normal maps for the things that poke in, or that are too small to be noticeable if faked.
    In this example, both can be generated with procedural tools, which makes things a bit easier.
    However, we could also procedurally generate a normal map for, say, a skin texture, while the displacement would be baked from a high-poly sculpt.

    • @cgmechanics5855
      @cgmechanics5855  20 hours ago

      You’re absolutely right that there are many ways to create textures, and baking details from high-poly sculpts or generating procedural maps can be effective methods. In fact, procedural textures are particularly advantageous for close-up shots because they offer virtually infinite resolution when created and rendered within the same system. This can be incredibly efficient and yield high-quality results.
      However, the context of my video is specifically focused on the standard, downloadable, or scanned textures available on platforms like Polyhaven, which are most commonly used by the Blender community. My point is about how these types of textures function and the most efficient way to use them.
      The issue I’m addressing is the common practice in many Blender tutorials where normal and displacement maps are mixed in a way that doesn’t align with how these pre-baked textures are meant to work together. While custom and procedural methods are entirely valid approaches, they fall outside the scope of the specific workflow I’m discussing.

  • @r6201sk
    @r6201sk 5 hours ago

    Well, even a dude who develops Blender said so, so I guess you are right.

  • @peter486
    @peter486 2 days ago

    It's not only that: using height maps based on PNG fails in so many ways, because it's not a linear file format.
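
    A hedged sketch of that pitfall: an 8-bit PNG is often sRGB-encoded, so raw pixel values are not linear heights. In Blender the usual fix is to load the map as "Non-Color" data so no color transform is applied; if the values really were sRGB-encoded, they would need linearizing by hand:

    import numpy as np

    def srgb_to_linear(c):
        """Standard sRGB-to-linear transfer function, channel-wise on 0..1."""
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    # A mid-gray of 0.5 in the file is only ~0.214 linear - feeding the raw
    # value into displacement silently distorts all the heights.
    print(srgb_to_linear(0.5))  # ~0.214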

  • @ABaumstumpf
    @ABaumstumpf 21 hours ago

    Several of your claims are entirely domain-specific. The blue channel in normal maps can (and for many applications does) contain information that cannot be reproduced from the other 2 channels. And a displacement map does not have the same information about the surface normals as a normal map has, EVEN if they are generated in the way you describe here:
    The normals calculated from a displacement map would lie exactly halfway between the points on a normal map. So it is only wrong if you mistakenly apply the normal map additively on top of the normals calculated from a displacement map. But normal maps and displacement maps can also be generated from the same source geometry in a complementary way, with the normal map constructed from the already displaced surface to give even finer details.
    One great example where we used displacement and normal maps together quite successfully was metallic-paint textures. They'd otherwise require a lot of shader trickery to get consistent and realistic, or far higher-resolution textures and higher sampling rates.
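
    A small Python illustration of the halfway-point claim (toy data): finite differences of a displacement map are naturally defined BETWEEN texel centers, while a baked normal map stores normals AT the texel centers, so adding the two naively double-counts and misplaces detail.

    import numpy as np

    h = np.array([0.0, 0.3, 0.1, 0.4])  # toy 1-D displacement values

    # Forward differences: the slope between texel i and i+1 lives at i + 0.5.
    slopes_between = np.diff(h)          # defined at x = 0.5, 1.5, 2.5

    # A baked normal map instead stores a normal at x = 0, 1, 2, 3 -
    # the two never sample the same positions.
    print(slopes_between)                # [ 0.3 -0.2  0.3]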

    • @cgmechanics5855
      @cgmechanics5855  21 hours ago

      Thanks for your input. However, I’m not entirely sure how you’re interpreting the blue channel in normal maps. In PBR (Physically Based Rendering) workflows, the blue channel typically represents the Z-direction of the normal vector and is a calculated result based on the X and Y channels. If the normal map is baked correctly, the blue channel should not contain any information that couldn’t be derived from the other two channels. The sum of these calculations must align correctly, otherwise, the map itself is faulty.
      Regarding your point about the domain specificity: The video and the thumbnail clearly indicate that the context here is Blender. The issue discussed is directly related to Blender’s shading and rendering process, not just baking. If you’re referring to a different software or scenario outside of Blender, that would be important to specify, as this video is strictly focused on how Blender handles these processes.
      As for your second point, I agree that you could bake additional details into a normal map based on the already displaced surface. However, this only makes sense if the resolution of the displacement map and the normal map differ. If both are 4K, for example, the normal map wouldn’t add significant new details if the displacement map already contains the same information. The normal map would only be beneficial if it had a higher resolution or contained finer details that weren’t captured in the displacement map. Additionally, it’s important to consider the memory impact. Adding a high-resolution normal map on top of an already high-resolution displacement map significantly increases memory usage. For example, a 4K normal map can easily take up 50 MB or more, which adds up quickly in larger projects.
      Regarding your example of using displacement and normal maps for metallic paint textures, that’s a valid technique. There are indeed specific cases where using a different normal map than the displacement map makes sense, especially if it’s crafted to add specific or finer details that the displacement can’t capture. It’s a valid approach, and I agree that it can work well, particularly for smoother surfaces or materials like metallic paints where normal maps can effectively control the surface’s appearance without the need for excessive geometry. In fact, for smooth or reflective surfaces, I personally prefer using normal maps over displacement maps as well, as they are more efficient and provide better control. So, I understand your example and agree that it’s a viable technique, even if it didn’t make it into the video. It’s something I’ve considered, but I ultimately decided to keep the video focused on a different approach.
      Thank you again for your detailed and thoughtful comment. I really appreciate the time you took to elaborate on this, and it’s great to have such an in-depth discussion on the topic.

    • @ABaumstumpf
      @ABaumstumpf 20 hours ago

      @@cgmechanics5855 "The sum of these calculations must align correctly, otherwise, the map itself is faulty."
      For applications that expect the normal vector to be of unit length. But that is not a requirement, and I have seen the channel used for various other purposes, including letting the resulting vector be non-unit-length and rescaling it to 16-bit floats, which gives slightly higher accuracy than would be possible with just the 2 8-bit channels.
      It is an assumption that many programs make, but as there is no single definition of normal maps, there is also no correct or wrong way, as can be seen with rendering software having different conventions for the texture coordinates: textures designed for, say, Maya need to be converted to be used in 3ds Max - both Autodesk products.
      "Regarding your point about the domain specificity: The video and the thumbnail clearly indicate that the context here is Blender. "
      The title does not mention it, and in the video you often make it sound as if the information you present were of general validity (even making such claims directly). If you intended to portray it otherwise, that really doesn't come through.
      "However, this only makes sense if the resolution of the displacement map and the normal map differ. If both are 4K, for example, the normal map wouldn’t add significant new details if the displacement map already contains the same information."
      No. As already mentioned, even if the resolution is identical, the information contained in those 2 maps is still different, as a displacement map does not have any normal information at the coordinates where it has height information.
      As a worst-case example, take a simple 4x4 texture representing stairs with 3 steps: calculating the normals from the displacement map would result in a perfectly flat ramp, whereas the normal map would give you a flat, level surface. Combine those 2 and you would get stair steps, as now you have the elevation information plus the fact that at each elevated point the surface is level (not pretty, but visible).
      "Adding a high-resolution normal map on top of an already high-resolution displacement map significantly increases memory usage."
      Sure. But that is the same for any high-resolution, high-information textures you provide. A displacement map has inherently poor surface-normal accuracy. With reflective or transparent materials, normals calculated from displacement maps lack precision and accuracy.
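
      A worked Python version of the staircase example above (toy numbers): normals derived from the heights describe a ramp, the baked normal map says "level" at every texel, and only the combination describes stairs.

      import numpy as np

      h = np.array([0.0, 1.0, 2.0, 3.0])   # stair heights, one per texel

      slope_from_height = np.gradient(h)   # [1. 1. 1. 1.] -> reads as a flat ramp
      normal_map_slope = np.zeros_like(h)  # baked normals: level treads everywhere

      # Elevation from the displacement map + "the surface is level here"
      # from the normal map together reconstruct the steps.
      print(slope_from_height, normal_map_slope)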

    • @cgmechanics5855
      @cgmechanics5855  19 hours ago +1

      Thank you for your input. I acknowledge that different software and workflows might implement normal maps in varying ways, and not all adhere to the convention of unit-length vectors. The video specifically addresses the common use case seen in Blender and other PBR workflows where this assumption holds true. In those cases, the blue channel simply represents the Z-direction of the normal vector, and the expectation is that it aligns with the calculated result from the X and Y channels. I understand that this may not apply universally, but the focus was on the most frequently encountered implementation.
      Regarding the clarity of the context, I appreciate your feedback. I did mention Blender in the video and on the thumbnail, but I understand that it may not have been clear enough. That was not my intention, and I will update the title to make it more explicit to avoid any misunderstandings. I’ll also be more mindful of this in future videos to ensure the context and scope are clearly defined.
      As for your example with the staircase texture, I understand your point, and you make a valid case that there are scenarios where combining displacement and normal maps can result in specific visual effects. However, in the context of modern workflows, especially in Blender, using displacement maps at 4K or higher usually provides enough detail, even when objects are close to the camera. When the geometry is sufficiently subdivided, the stepping issue you mentioned is rarely a problem because Blender smooths these transitions automatically.
      The video was meant to address the use of these textures in Blender specifically, focusing on the most common applications rather than specific edge cases or other software. I could have certainly gone into more detail, but I kept the video concise due to its length limitations.
      I understand that there are other software, texture types, and procedural textures, as well as specific angles where both methods could face challenges. However, I’m curious to know if you would actually disagree with the examples presented in the video itself.