Unity Shader Graph Basics (Part 4 - The Depth Buffer)

  • Published: 8 Sep 2024
  • The depth buffer is instrumental in rendering objects correctly. Similarly, the depth texture is extremely helpful for creating certain effects. Learn how both work in Part 4 of this Shader Graph tutorial series!
    I'm using Unity 2022.3.0f1, although these steps should look similar in previous and subsequent Unity versions.
    ------------------------------------------------------------------------
    👇 Download the project on GitHub: github.com/dan...
    💀 Get the zombie sprite: kenney.nl/asse...
    📦 Get the cyberpunk character model: sketchfab.com/...
    📰 Read this tutorial in article format instead: danielilett.co...
    ------------------------------------------------------------------------
    📚 Get a copy of my shader book here (affiliate): www.dpbolvw.net...
    ✨ Grab my Hologram Shaders Pro package here (affiliate): assetstore.uni...
    ✨ Grab my Snapshot Shaders Pro package here (affiliate): assetstore.uni...
    ------------------------------------------------------------------------
    💬 Join the Discord: / discord
    💖 Support me on Patreon: www.patreon.co...
    ☕ Or throw me a one-off coffee on Ko-fi: ko-fi.com/dani...
    ------------------------------------------------------------------------

Comments • 26

  • @danielilett
    @danielilett  8 months ago +8

    A longer explanation of how the depth buffer actually stores values, since I cut a very long explanation of what the "non-linear relationship" is from the video, paraphrased from my shader book:
    Unity calculates the distance of a pixel from the camera, which we can call its z-value, hence "z-buffer". This value is between the near and far clip distances of your camera, because those are the minimum and maximum distances that actually get rendered.
    This z-value is changed into a depth value that we can store in the depth buffer by transforming the [near, far] range to a [0, 1] range. The depth buffer stores floating-point numbers between 0 and 1.
    If this mapping were linear, we could run into precision issues with close objects. Especially for small objects close together, we could feasibly get errors where objects are rendered in the wrong order.
    To avoid that, we want to use as much precision as possible to represent close objects. The exact formula that is used for converting the z-value to a depth value is:
    depth = ( 1/z - 1/near ) / ( 1/far - 1/near )
    What you end up with is a curve. For the default Unity camera values where near = 0.3 and far = 1000, 70% of all information stored in the depth buffer represents objects up to a distance of one meter from the camera. Which is amazing when you consider the remaining 30% represents the other 999 meters!
    As mentioned in the video, those depth buffer values get copied to the depth texture, and then Shader Graph gives you the tools to decode this curve into two linear formats (Linear01, where values are linearized in the same 0-1 range, and Eye where values are just the original z-values - distances from the camera - that we started with).
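    If it helps to see actual numbers, here's the same mapping as a tiny standalone C# sketch. To be clear, this isn't Unity's internal code, just the formula above and its inverse (roughly what the Eye and Linear01 modes hand back), with the default near = 0.3 and far = 1000 values plugged in; the class and method names are mine:
    using System;
    class DepthCurveDemo
    {
        // Default Unity camera clip planes (the values assumed in the example above).
        const float Near = 0.3f;
        const float Far = 1000f;
        // z-value (distance from the camera) -> non-linear depth buffer value in [0, 1].
        static float ZToDepth(float z) =>
            (1f / z - 1f / Near) / (1f / Far - 1f / Near);
        // Invert the curve: depth buffer value -> distance from the camera ("Eye" style).
        static float DepthToEye(float depth) =>
            1f / (depth * (1f / Far - 1f / Near) + 1f / Near);
        // Remap that distance linearly into [0, 1] between near and far ("Linear01" style).
        static float DepthToLinear01(float depth) =>
            (DepthToEye(depth) - Near) / (Far - Near);
        static void Main()
        {
            foreach (float z in new[] { 0.3f, 0.5f, 1f, 10f, 100f, 1000f })
                Console.WriteLine($"z = {z,6:0.0} m -> depth = {ZToDepth(z):0.0000}");
            // z = 1 m prints depth ≈ 0.70, i.e. 70% of the range covers the first metre.
            // Round trip: decode 0.5 back into a distance, like the Eye mode would.
            Console.WriteLine($"depth = 0.5 -> eye = {DepthToEye(0.5f):0.00} m, linear01 = {DepthToLinear01(0.5f):0.0000}");
        }
    }
    Running it shows just how skewed the curve is: a raw depth value of 0.5 decodes to a distance of only about 0.6 metres from the camera.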
    Hope that gives a bit more context!

  • @ripmork
    @ripmork 12 days ago

    Great explanations of basics in your videos, thank you!

  • @richardaen4195
    @richardaen4195 8 months ago +2

    Such a brilliant and helpful series. Great stuff, Daniel. (Not least because that 3-second lerp explanation was the best and most concise I've seen.)

    • @danielilett
      @danielilett  8 months ago

      Glad you liked the lerp explanation, I almost didn't even include it in the video! Sometimes it's pretty tricky to strike a balance between being concise and including all of the context.

  • @semiterrestrial
    @semiterrestrial 8 months ago +3

    Great explanation, hope you keep making this series!

  • @FarwalDev
    @FarwalDev 8 months ago +3

    These videos are helping me better understand Shader Graph, thx ❤️

  • @usercontent2112
    @usercontent2112 8 months ago +2

    Thank you, this series is helping me a lot to understand shader graph

  • @antonovivan3008
    @antonovivan3008 8 months ago +1

    Thank you. I've learned exactly what I'd been looking for over the last few days.

  • @christianschneider8516
    @christianschneider8516 8 months ago +1

    Once again nicely explained, thank you so much!

  • @lpfonseca
    @lpfonseca 8 months ago +1

    Great work! Looking forward to the vertex shader.

  • @raysiberian4346
    @raysiberian4346 2 months ago

    Thx, pure gold explanation.

  • @dopinkus
    @dopinkus 8 months ago +1

    Like #100! Your tutorials are absolutely amazing. Please please keep going :)

  • @Henry3dev
    @Henry3dev 8 months ago +1

    Loving it, thanks!

  • @StressedProgrammer
    @StressedProgrammer 1 month ago

    Thanks for the tutorial! Can I do this in the Built-In Pipeline?

  • @GameBit697
    @GameBit697 2 months ago

    What about hair in URP? Will this work the same, or could you cover a "Hair in URP" topic?

  • @AlexLozanoAcerca
    @AlexLozanoAcerca 1 month ago

    Hi Daniel! I have a question. I don't know why, but the "Depth Test" part at 2:44 only works for me if "Surface Type" is Transparent. I don't understand why it doesn't work when it's Opaque, like in your video.

  • @user-cr2tv3fu1o
    @user-cr2tv3fu1o 7 months ago

    Great explanation, I enjoyed listening~~

  • @LuizMoratelli
    @LuizMoratelli 20 days ago

    I have this problem: with a Canvas Group and some GameObjects with Images, the images on top become transparent and the colors of the images behind mix with the ones in front, giving a bad result. Is there a good way to fix it? I tried a stencil, but then the aliasing screams in your face haha.

  • @rockclimbermaca
    @rockclimbermaca 6 months ago

    Thanks for this tutorial, it's awesome! I just have one tiny question: if I have a second camera in my scene that renders the scene to a RenderTexture with a Color Format of R8G8B8A8_UNORM and a Depth Stencil Format of D32_SFLOAT, and I pass that RenderTexture to a URP Shader Graph as a Texture2D, is there a way to read the depth values from the RenderTexture in the graph? I believe it is not possible, but just wanted to confirm. It's very odd, but it seems like it's impossible :(

  • @DaveRune
    @DaveRune 13 days ago

    Is there a way to sample a pixel of the depth texture here?

  • @anttiv7109
    @anttiv7109 4 months ago +1

    Has this been changed in Unity 6, or am I just blind? I can't find the Depth Texture setting.

    • @danielilett
      @danielilett  4 months ago +3

      Do you mean the part about 5 minutes in where you have to find the tickbox? I just checked out a new Unity 6 project and it looks like it's in basically the same place for me.
      If you started with a brand-new Unity 6 URP project, then your Assets folder should contain another folder called Settings. Inside it there are a bunch of weird-looking assets. You're looking for the ones named either "Mobile_RPAsset" or "PC_RPAsset". These are both Render Pipeline Assets, which basically hold together a lot of URP's settings. The Depth Texture tickbox should be right at the top of the Inspector when you click on them.
      If you're still using 2022.3 or versions before that, then it's basically the same process except the assets you're looking for are called "UniversalRP-HighQuality", "UniversalRP-LowQuality", and "UniversalRP-MediumQuality".
      If you don't see what I'm seeing in Unity 6, then I'm not sure what's up; honestly, maybe Unity just changed something randomly in one of the Unity 6 releases so far. They like doing that. For the avoidance of doubt, I'm using 6.0.0b11.
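      As an aside, if you'd rather flip that setting from code than hunt through the Project window, something like the sketch below should do it. Treat it as an assumption-laden example rather than the official way: I'm assuming the active pipeline asset is a URP asset and relying on URP's supportsCameraDepthTexture property; the class and method names are made up for illustration.
      using UnityEngine;
      using UnityEngine.Rendering;
      using UnityEngine.Rendering.Universal;
      public static class DepthTextureToggle
      {
          public static void EnableDepthTexture()
          {
              // Grab whichever Render Pipeline Asset is currently active
              // (the same asset you'd otherwise select in the Settings folder).
              var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
              if (urpAsset == null)
              {
                  Debug.LogWarning("The active render pipeline isn't a URP asset.");
                  return;
              }
              // Assumption: this property mirrors the Depth Texture tickbox in the Inspector.
              urpAsset.supportsCameraDepthTexture = true;
          }
      }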

  • @nopepsi206
    @nopepsi206 7 months ago +1

    When is the next part going to be released?

    • @danielilett
      @danielilett  7 months ago

      Not sure yet. I am working on the next part, but I also have a couple of other videos in production right now that are closer to completion. The next part will likely be sometime in February, but I hope it's towards the start.

    • @markroacing8237
      @markroacing8237 7 months ago

      And how many parts will it have? @danielilett

  • @harshadjoshi3944
    @harshadjoshi3944 8 months ago

    Can you show this with Shader Graph in the Built-In Render Pipeline?