Emissive voxels and fancy lighting [Voxel Devlog #19]

  • Published: 25 Jun 2024
  • Try CodeCrafters for free today: app.codecrafters.io/join?via=...
    Scratchapixel's path tracing resource: tinyurl.com/y9e7wn5w
    Online demo: github.com/DouglasDwyer/octo-...
    Additional voxel models: tinyurl.com/mtk5mj7f
    Path tracing looks amazing for voxel scenes! This devlog showcases my game engine's lighting system, which uses path-traced global illumination to simulate ambient occlusion and emissive materials. I discuss the fundamental rendering equation and how I chose a radiance function. Then, I talk about my unique approach to denoising using a GPU-side hashmap.
    Music used in the video:
    Rosentwig - On My Way
    C418 - This Doesn't Work
    C418 - Divide by Four Add Seven
  • Games

Comments • 136

  • @AngeTheGreat 8 days ago +114

    Looks amazing!

    • @TheDroidsb 8 days ago +2

      Ayyy awesome to see you here 😂

    • @aspidinton 7 days ago +4

      Yooo it's the vroom vroom guy, pog

    • @AndrewTSq 5 days ago

      And if my ears caught this right, it's running on a GTX 1660 Ti :O

  • @jonathancrowder3424 8 days ago +61

    0:36 First things first: *the unfettered power of the sun*

    • @DouglasDwyer 8 days ago +5

      Get yourself some sunglasses 😎

  • @xDeltaF1x 8 days ago +37

    That's so cool that you can do temporal smoothing without smearing the screen because it's per-voxel not per-pixel!

  • @xima1 8 days ago +32

    For SVGF denoising that retains the per-voxel or per-voxel-face look, you can just remove the depth-based weighting function of SVGF and instead weight based on voxel world-space position and voxel normal. Do this weighting both temporally and spatially and you get a neat pixelated lighting look!
    I'm doing that in my own project and it's actually quite simple to implement, since with voxels you deal with very simple shapes compared to triangles :)
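The weighting this commenter describes might look something like the following minimal sketch, where SVGF's depth term is replaced by a Gaussian falloff on voxel world-space position plus the usual powered-cosine normal term. All names and constants here are illustrative, not taken from either engine:

```rust
fn dot(a: [f32; 3], b: [f32; 3]) -> f32 {
    a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
}

fn dist_sq(a: [f32; 3], b: [f32; 3]) -> f32 {
    let d = [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
    dot(d, d)
}

/// Edge-stopping weight for blending a neighbor sample into the center
/// voxel, usable both spatially and temporally. `sigma_pos` controls
/// how fast weight falls off with world-space distance; `sigma_normal`
/// sharpens the normal term (as in SVGF's powered cosine).
fn edge_weight(
    center_pos: [f32; 3], center_normal: [f32; 3],
    neighbor_pos: [f32; 3], neighbor_normal: [f32; 3],
    sigma_pos: f32, sigma_normal: f32,
) -> f32 {
    // Position term: Gaussian falloff in world space, so samples from
    // the same voxel (or face) dominate and distant voxels contribute ~0.
    let w_pos = (-dist_sq(center_pos, neighbor_pos)
        / (2.0 * sigma_pos * sigma_pos)).exp();
    // Normal term: clamped cosine raised to a power.
    let w_normal = dot(center_normal, neighbor_normal).max(0.0).powf(sigma_normal);
    w_pos * w_normal
}
```

Because the weight is keyed on voxel position rather than screen depth, filtering never bleeds across voxel boundaries, which is what preserves the pixelated look.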

  • @tomasoresky3478 8 days ago +19

    Tech nit: what you refer to as "indirect lighting" is actually still "direct lighting". You are sampling rays to the global light + a random ray based on your BRDF, but both are still just the "1st hit" contribution, which is still called "direct lighting". "Indirect lighting" is when you continue bouncing the ray based on the BSDF, which is the 2nd and further bounces.
    So more proper names for the passes would be "GlobalLightSamplingPass" and "BSDFSamplingPass".
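A toy loop makes the distinction concrete. The "scene" below is a stub where every ray hits a surface that emits a little light and reflects half of what arrives; none of this is the engine's actual code. Light that has scattered off at most one surface before reaching the camera counts as direct, everything from the second bounce on is indirect:

```rust
struct Hit {
    emitted: f32,
    albedo: f32,
}

// Stub intersection: every ray hits a surface emitting 0.25 units of
// light that reflects half of the incoming light.
fn trace_stub() -> Hit {
    Hit { emitted: 0.25, albedo: 0.5 }
}

/// Returns (direct, indirect) radiance for one camera path.
fn shade(max_bounces: u32) -> (f32, f32) {
    let mut throughput = 1.0_f32;
    let (mut direct, mut indirect) = (0.0_f32, 0.0_f32);
    for bounce in 0..max_bounces {
        let hit = trace_stub();
        let contribution = throughput * hit.emitted;
        if bounce <= 1 {
            // At most one reflection between the light and the camera:
            // the first hit's emission, plus whatever the light-sampling
            // ray or the first BRDF ray finds, is all "direct" lighting.
            direct += contribution;
        } else {
            // From the second bounce onward the light has scattered off
            // two or more surfaces: this is "indirect" lighting.
            indirect += contribution;
        }
        throughput *= hit.albedo;
    }
    (direct, indirect)
}
```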

    • @DouglasDwyer 8 days ago +7

      Thanks for clarifying! You are absolutely right - I have not yet studied computer graphics formally, so it's easy to mix up the terms :)

    • @lunatikreal1384 7 days ago

      BDSMSamplingPass

  • @prltqdf9 8 days ago +15

    The higher the resolution, the more bitrate YouTube grants a video. If you can't record above 1080p, then upscale the video to at least 1440p so that YouTube retains most of the quality of the original upload.

  • @steluste 8 days ago +6

    The optimization is so good that the game runs at better fps than real life camera wow

  • @frozein 8 days ago +3

    Wow, really impressive work! Interestingly, a few weeks ago I started work on implementing a per-voxel hashmap into my own engine for lighting. My whole pipeline was basically identical to yours, great minds think alike haha. I have since changed my technique slightly though and have thus far only used it for soft shadows. Excited to see what you do next!

    • @DouglasDwyer 7 days ago +3

      I'm not surprised to hear that. When I was exploring this idea, your original Doon Engine was one place that I looked for inspiration (since it also leverages per-voxel calculations).

    • @frozein 7 days ago +1

      @@DouglasDwyer Glad my old engine is still useful! The hashmap approach seems very promising, basically a denoiser+stylizer in one. I think DoonEngine had the right idea but doing the lighting in a separate compute shader was just too slow and memory-intensive.

    • @frozein 7 days ago +1

      @@DouglasDwyer Congrats on 10k by the way!

  • @dxred2553 7 days ago +3

    This is amazing. It's turning into exactly what I picture when I think "voxel engine"! The fact that this is all running on a 1660ti is just mind boggling, too.
    One minor nitpick, as a French speaker: "à-trous" is French for "with holes" and sounds a little more like "ah true", not "a truce". Not a big deal though, especially compared to the insane amount of code work you've put in 😂

  • @gsestream 8 days ago +3

    There is a fourth option: bake lighting values in real time, with unlimited samples per voxel. Yep, pre-calculating the voxel radiance: you don't actually sample any light in real time, but instead bake the voxel lighting as fast as possible in some priority order. Fully baked scene lighting is an example of this, and it's the best. Dynamic updates to the lighting then make it work like Minecraft RTX's lighting updates: semi-realtime, but the baked lighting is always correct. No denoising required, a higher-quality image, and actual renders. Very cool.

    • @theneonbop 8 days ago +2

      "dynamic updates to the lighting" "bake the voxel lighting as fast as possible in some priority order"
      Isn't this just effectively the same as what he's doing, per voxel lighting with temporal denoising, where closer voxels (covering more of the screen) get more samples?

    • @gsestream 8 days ago +3

      @@theneonbop Most of it is very close; the emphasis here is on separating the lighting updates from the final render completely. I.e., the lighting can remain the same over all rendered views; i.e., fully pre-baked voxel lighting. Zero denoising blurring.

  • @fredrikmalmstrom924 8 days ago +7

    Man you should be proud of this project, you've come a long way and it's looking really good!

  • @coleslater1419 8 days ago +2

    I'm not kidding when I say this is among my favorite series. It's amazing what you're able to accomplish

  • @Zevest 8 days ago +12

    Congrats on 10k!

    • @DouglasDwyer 8 days ago +4

      Thank you :)

    • @mohammadazad8350 8 days ago +2

      Don't lie, there is no way he has [insert 10,000! here] subscribers!
      _This joke failed due to technical limitations: YouTube doesn't seem to like this 35,660-digit-long number!_

    • @Zevest 8 days ago +1

      @@mohammadazad8350 I’m not gonna lie I had to read it 4 times to understand it lol.

  • @Alexey_Pe 8 days ago +18

    As soon as I see who this video is from, nothing else matters; I've already clicked to watch the video

    • @DeGandalf 8 days ago

      Same. Also one of the few channels which get an instant like, because I already know that the quality of the content and editing will be super good.

  • @terraprint9183 8 days ago +3

    Cool video! If anyone wants to learn more about the rendering equation by the way, Acerola made an entire video on it. I believe it's his most recent video about how realistic and stylized graphics all work under the same premises.

  • @tdottosama 5 days ago

    I've been watching this come along for years. Your work has been very inspiring. Thank you!

  • @dphfox 8 days ago

    This is awesome! Your work is always inspiring.
    I had a similar idea for spatial ray reuse that I wanted to leverage for my own voxel engine, though it was texel based since my engine is designed to be more Minecraft like. The fact you got this working is very cool and I can't wait to see where you go with it!
    You are truly innovating :)

  • @cosb7207 7 days ago

    Would love to see some BRDF magic next. Iirc you've got unique per-voxel normals, I'm curious to see how that would look with even some basic fresnel-driven gloss.
    Also HDR colors + basic tone mapping would make emissive voxels/skyboxes look insane

  • @TeltStory 6 days ago

    You could draw the material at blank locations as if rasterizing, and then blend it, and draw at the final bounce as if rasterizing.

  • @Warped807 8 days ago

    Congrats on 10k! Keep up the great work!

  • @dennisd7 7 days ago

    Well done, this is really good stuff.

  • @SZvenM 8 days ago

    Looking good! Exciting stuff

  • @LunarGameDev 8 days ago

    This really has come so far! Keep up the great work!

  • @ILBorz 7 days ago

    Keep it up man! You are reaching something truly awesome!

  • @interflashz 8 days ago

    Awesome! I really love the look of the per voxel path tracing

  • @fazin85 7 days ago

    Congrats on 10k❤

  • @user-vc9rh3bk8t 8 days ago

    I can't wait to see where this project goes next!

  • @GabeRundlett 8 days ago +5

    LETS GO looks amazing

    • @ThePC007 8 days ago

      Oh, you’re here, too. ^^

  • @80sVectorz 8 days ago +1

    Your perseverance and skills are very inspiring. Keep up the great work!

  • @Markyparky56 8 days ago

    Congrats on 10k subs!

  • @Cilexius 4 days ago

    This is really amazing !!

  • @alan83251 8 days ago

    Very impressive progress over the time you've spent working on this. Keep it up!

  • @deltapi8859 8 days ago

    this looks phenomenal

  • @kiriakosgavras108 5 days ago

    Amazing stuff!

  • @galaxygames3216 8 days ago

    you are doing god's work here.

  • @Saji_0 8 days ago

    Keep up the good work, it really looks amazing!!
    The lighting works very fast and stably in my testing after some small settings tweaks

  • @Gilgidane 8 days ago

    looks incredible!

  • @BodhiDon 8 days ago

    Amazing work! Great solutions for denoising, the shot of the tree at night really sells the value of the emissive voxels!

  • @jameswilcock8 4 days ago

    Very nice. Let's see Paul Allen's voxel rendering pipeline.

  • @Viola-iu1ys 3 hours ago

    you have a great sense of humor!

  • @tommycard4569 7 days ago

    So cool! I always look forward to ur videos

  • @keptleroymg6877 8 days ago

    Beautiful project

  • @Wallcraft_Official 8 days ago

    Killer work, man. Your vids are something I always look forward to, keep up the good work!

  • @Harald723 8 days ago

    Yes! Amazing work!

  • @MaddGameMaker 6 days ago

    I've watched most of your videos, you're doing a really good job! I've been working on a voxel game myself for about 2.5 years; funny that we started around the same time. Though in my case I'm not necessarily focused on the graphics engine. I went from using polygons to raymarching a few months ago, but there's still a lot of work to do on various details of it.
    My pipeline is quite different from yours. I render the scene by raymarching from each pixel in the direction of the sun; if I hit an object, I consider the pixel to be in shadow, else I calculate the intensity of light based on the normal and the direction of the sun. During this pass I create an image with the unlit colors, the intensity of sunlight, the normals and the positions. Then, I render a sphere around each light source, and within this sphere, I read the position and normal from the previously-generated images and cast another ray - towards the light source - and add the intensity to the light image. I finally combine the light image with the unlit color image. It performs surprisingly well even with thousands of light sources, but it does lag on older GPUs when lights are present.
    Your videos did give me some ideas during the process. Your technique might inspire some more improvements to my lighting system.
    Keep up the good work and congratulations on everything you've done!

  • @bytesandbikes 8 days ago

    That denoising method is awesome 😁

  • @AndrewTSq 5 days ago

    Not sure I understood half of the things you talked about, but I love the look of the game!

  • @kaede_elfen 6 days ago +1

    Just subbed. Amazing stuff!

  • @RMethod 8 days ago

    Wow. That's insane!

  • @angeloguido2660 8 days ago

    This is awesome 😮

  • @jerkofalltrades 8 days ago

    Your engine continues to impress, but your devlog skill continues to improve, as well.

  • @jason_m2003 8 days ago +1

    Amazing Douglas, I'm just a novice game dev youtuber, and I aspire to be like you!!!! Keep up the amazing work on this

  • @TiagoTiagoT 6 days ago

    In case you haven't watched it yet, you might be interested in the 2021 SIGGRAPH presentation called "Global Illumination based on Surfels"

  • @lunatikreal1384 7 days ago +3

    Every one of your videos amazes me. Excellent work, there are simply no alternatives! I wish you success in this difficult endeavor, and I really hope that someday people will start creating games on your engine!

  • @jarusca3933 8 days ago

    This is good stuff

  • @sethpyle8857 2 days ago

    Voxel de_nuke looks awesome

  • @valet_noir 8 days ago

    AMAZING !!!!!

  • @forest6008 8 days ago

    mad respect, u program in light mode

  • @cheesymcnuggets 8 days ago +2

    Tbh I don't really understand what you explain in your videos, but I still watch them and kinda just sit here in pure amazement; it's so cool. I honestly can't wait to see the finished product

  • @deltapi8859 8 days ago

    damn voxel pathtracer in rust, this is what I call the CG chad stack. I want too :D

  • @AnOliviaShapedGremlin 7 days ago

    I always wondered if rendering each voxel as a 2D point in 3D space might look or perform any better than rendering cubes.
    For example something where distant points that would otherwise occupy one pixel are averaged together into a larger, lower density grid, and where points that are close enough to the camera to where you might see between them get expanded to fill the space, maybe with some additional noise or filler texture applied to fake even smaller detail.
    No idea how well it would work in practice, but it would be interesting to see something like it tested.

  • @SLPCaires 8 days ago

    Cool!

  • @Rasteriser 8 days ago +1

    One thing I noticed is that the time it takes for light to affect the surrounding area is kinda slow; great engine and video though.

  • @rhevoramirez7969 8 days ago

    Genius

  • @parkershaw9747 6 days ago

    I wonder if you'd be able to easily extend the algorithm to do bounce lighting with a second ray from the bounce. With all the techniques you cooked up, I think it would be smooth and look great. You could even have a setting to determine n bounces for computers with very high specs.

  • @smangalang 8 days ago

    could you combine the real time path tracing with some baked lighting to make it even more smoove?

  • @JoshuaBarretto 6 days ago

    Very impressive work! I'm wondering how well this translates to directional effects like specular reflection or temporally inconsistent effects like moving objects. When I was working on my SSAO shader for OpenMW, I found I could get pretty beautiful SSAO for static scenes with just reprojection and temporal AA, but moving objects caused a lot of problems.

    • @DouglasDwyer 6 days ago

      That's a great question. For specular, I think temporal accumulation wouldn't work, so I would have to cast specular rays during the "direct" (non-path traced) lighting pass. I already do this for sunlight shadows, and the hashmap is still handy as I can cast just one ray per-voxel.
      I haven't looked at AO or emissive materials with moving objects yet. Maybe it will look weird lol. In that case, I can try to reduce the amount of temporal accumulation that is used.

  • @obviousalexc 8 days ago

  • @noxmore 8 days ago

    This is so sick!! I am just itching to get my hands on this engine and play around with it!
    Speaking of, what do you think you have to get done before you release it?

    • @DouglasDwyer 8 days ago

      I'm glad you like it! Although I will continue to release demos, there's so much to do before the engine is complete. Physics, audio, entities, gameplay, etc. My next goal for the summer is to implement a new physics engine with a proper iterative collision solver.

  • @delphicdescant 4 days ago

    Basically saying "I've achieved the holy grail of computer graphics" feels *a little* conceited, but that aside, I really like your engine. A lot of these small-voxel engines seem to always try to copy John Lin's work, and concern themselves with outdoor sunlit scenes primarily (which are easier to light), but I appreciate how well yours handles indoor scenes with more varied lighting. Really good work!

  • @dobrx6199 8 days ago +1

    This looks amazing! What is the performance like?

    • @DouglasDwyer 8 days ago +1

      You can try the demo for yourself to find out! But on my NVIDIA 1660 TI, I am able to run all scenes at 100 FPS at 1080p. On my Intel UHD 750H iGPU, I need to use a resolution of only 360p, but it runs at 60 FPS.

  • @von_nobody 5 days ago

    And use this to make the ultimate Minecraft :D How big a world with LOD can be displayed? Could it be large enough to have a similar scope to MC?

  • @georgehawryluk7976 8 days ago

    Have you got a road map? Quo vadis, Douglas? At what point does the engine become a finished product? I've been following you for a while and, yes, the progress is impressive; kudos to you for not giving up.

    • @DouglasDwyer 8 days ago +2

      Great question! I do have a general idea of what I want to build. The goal is to create a platform where users can create different games, and play games written by other users. The underlying voxel technology should allow these games to leverage building/destruction mechanics in unique ways. The engine becomes a finished product when:
      - All core game engine functionality (physics, proper building system, input handling, and third party plugins) are fully integrated and supported.
      - I have built an example game with the engine.
      I am taking this summer vacation to just work on the engine. Afterward, I'm going back to university for my senior year. My hope is to spend this summer, and the next year at university, finishing the core functionality. Once it's close to completion, I'll build an example game in it and then release the engine for others to build games with as well.
      In terms of business plan, I haven't made any concrete plans yet. But I am most interested in revenue-sharing models, where developers can create games in the engine for free, and then publish them on my platform.

  • @SulfuricDonut 8 days ago

    So to simulate the "shininess" of a surface, would you just give the Monte Carlo rays a weighting function toward the angle opposite the camera?

    • @DouglasDwyer 8 days ago +1

      I believe that's correct - that's the "BRDF" part of the rendering equation :)
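As a hedged sketch of that weighting (not the engine's code): one standard approach is to importance-sample a Phong-style cos^n lobe around the mirror reflection of the view ray, with the exponent playing the role of shininess. The inversion formulas below are the usual ones; the function samples in a local frame whose +Z axis would be aligned with the reflected direction:

```rust
use std::f32::consts::PI;

/// Sample a direction from a cos^n (Phong) lobe about the local +Z axis.
/// `u1` and `u2` are uniform random numbers in [0, 1). A higher
/// `shininess` exponent concentrates samples near the lobe axis,
/// i.e. weights rays toward the mirror direction.
fn sample_phong_lobe(u1: f32, u2: f32, shininess: f32) -> [f32; 3] {
    // Inverse-CDF sampling of the polar angle.
    let cos_theta = u1.powf(1.0 / (shininess + 1.0));
    let sin_theta = (1.0 - cos_theta * cos_theta).max(0.0).sqrt();
    // Azimuth is uniform around the lobe axis.
    let phi = 2.0 * PI * u2;
    [sin_theta * phi.cos(), sin_theta * phi.sin(), cos_theta]
}
```

With `shininess = 0` this degenerates to uniform hemisphere sampling; a full BRDF sampler would additionally weight each sample by the ratio of BRDF value to sampling PDF so the estimator stays unbiased.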

  • @Gwilo 8 days ago +1

    brush your hair between your fingers instead of using a comb; it works better and makes your hair look less flat. also, comb your fingers through your hair upwards, not sideways

  • @Randalandradenunes 8 days ago

    Next video: Hello guys, this is Douglas. Today I re-created the Matrix, but with better graphics.

  • @FoxiqueGC 8 days ago

    I was wondering if it would be possible to make lower resolution scales have sharp edges. You should already know where the edges of all visible voxels are, so instead of simply rendering the pixelized output, you could take the pixels corresponding to each individual face of a voxel, upscale them as their own image (only blending pixels from the same face), and then apply a sort of "mask" following the face's edges so there is a clear cutoff instead of a blurry one. Also, for some reason, if I set the resolution scale below 1x, the frame time stays constant at 6 ms no matter whether I use the 2x or 5x scale.

    • @DouglasDwyer 7 days ago

      Thanks for trying the demo! You can make the edges appear sharp by turning off antialiasing (which has a very particular appearance at lower resolutions). Maybe that will make things sharp in the way that you describe. As for things staying at 6 ms, you were probably hitting the limit of your display's refresh rate.

  • @noidea5597 8 days ago

    Excellent video! It looks much better. And I'm surprised by the very good performance even on my potato pc! But walls still tend to look kind of flat. Is that just a consequence of the voxel engine? Or can it be fixed?

    • @DouglasDwyer 8 days ago

      I think this might just be a consequence of the models that I'm showing off. Adding textures and bumpy surfaces (which I will do when I add proper world editing tools) should hopefully fix it :)

  • @timmygilbert4102 8 days ago

    Sounds like temporal ReSTIR

  • @UltimatePerfection 8 days ago

    Amazing; if I didn't know any better, I'd think I'm looking at regular 3D models instead of voxels. But please make a game out of it or release it to the public; otherwise we'd end up with another Euclideon or Atomontage that only a few cool videos ever came out of.

    • @DouglasDwyer 8 days ago +1

      My hope is to do both - make an example game, then release the engine for public use under a revenue-sharing model.

  • @SirDragonClaw 8 days ago

    You really should have nearest-neighbour upscaled this video to 4K and uploaded it at 4K. YouTube compression completely butchered parts of this video.

  • @MaddGameMaker 3 days ago

    I'm just wondering - how do you find the normal vector of a voxel when the ray (from eye to fragment) hits it? When I raymarch from eye to fragment, I keep track of which axis the closest plane is on, and use that to decide the normal. The problem is that when the ray hits surfaces at certain angles, roundoff errors cause the wrong normal to be used, so for example if you are really close to a wall, you see a "grid" around the voxels: this is because around the edges of voxels, the normal reported is the one for the internal (invisible) side. Since I'm not seeing you or any other raymarching youtubers struggle with this problem, I presume there must be some better way to figure out the normal of the voxel I hit. Do you have suggestions? Thank you in advance if you choose to answer.

    • @DouglasDwyer 3 days ago +1

      Hey there! There are two main approaches for this. One is to "bake" the normal into the voxel data. This is what I do, and it allows for the smooth per-voxel lighting shown in the video. If you want face normals, you should be able to extract the face normal index from your ray traversal algorithm. If you are using DDA, then the axis of the normal is determined by whichever direction the DDA algorithm last stepped.
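The DDA detail in this reply can be sketched as follows: an Amanatides & Woo-style grid traversal where the axis of the last step directly gives the entry face, so no plane-distance comparison (and none of its roundoff trouble) is needed. The grid query is a stub and all names are illustrative, not the engine's traversal code:

```rust
/// March a ray through a unit voxel grid; `solid` is a stub occupancy
/// query. Returns the hit voxel and its integer face normal, or None
/// after `max_steps`. Assumes the ray does not start inside a solid
/// voxel and that `dir` is nonzero on at least one axis.
fn dda_hit(
    origin: [f32; 3],
    dir: [f32; 3],
    solid: &dyn Fn([i32; 3]) -> bool,
    max_steps: u32,
) -> Option<([i32; 3], [i32; 3])> {
    let mut voxel = [0i32; 3];
    let mut step = [0i32; 3];
    let mut t_max = [0f32; 3];   // ray parameter of next boundary crossing
    let mut t_delta = [0f32; 3]; // ray parameter per one-voxel step
    for i in 0..3 {
        voxel[i] = origin[i].floor() as i32;
        step[i] = if dir[i] > 0.0 { 1 } else { -1 };
        let next_boundary = voxel[i] as f32 + if dir[i] > 0.0 { 1.0 } else { 0.0 };
        // Zero-direction axes get infinite t so they are never stepped.
        t_max[i] = if dir[i] != 0.0 { (next_boundary - origin[i]) / dir[i] } else { f32::INFINITY };
        t_delta[i] = if dir[i] != 0.0 { (1.0 / dir[i]).abs() } else { f32::INFINITY };
    }
    for _ in 0..max_steps {
        // Advance along whichever axis crosses its next boundary first.
        let axis = if t_max[0] <= t_max[1] && t_max[0] <= t_max[2] { 0 }
            else if t_max[1] <= t_max[2] { 1 } else { 2 };
        voxel[axis] += step[axis];
        t_max[axis] += t_delta[axis];
        if solid(voxel) {
            // The face we entered through is perpendicular to the axis
            // we just stepped, pointing back against the step direction.
            let mut normal = [0i32; 3];
            normal[axis] = -step[axis];
            return Some((voxel, normal));
        }
    }
    None
}
```

Because the normal comes from the stepping decision itself rather than from comparing distances to candidate planes, an edge-grazing ray still reports a consistent face.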

    • @MaddGameMaker 2 days ago

      @@DouglasDwyer I'm storing my voxels in an SVO so they don't really have defined 'faces', and as such I don't think I can bake the normals in. At the moment I am extracting the normals from the raymarching algorithm (I don't use DDA at the moment), but I think the problem is that if a voxel gets hit exactly on an edge, the algorithm can't decide which side it technically hit, and so sometimes it decides that it hit the invisible side. I do suspect that it might just be a bug in my implementation. Thank you for your advice!

  • @sgmvideos5175 8 days ago

    Tried running it in the browser and yeah... it's a bit laggy (like 2-5 fps I'd guess?), but personally I wouldn't be able to do this at even 1 fps, so great job, considering how great it's looking.

    • @DouglasDwyer 8 days ago

      Thanks for trying out the demo! For full 60 FPS at 1080p, a discrete graphics card is required. If you want to run the game on an iGPU (which I'm guessing is what you're doing), you can turn down the screen resolution using the in-game settings menu. This should help it run at a playable speed.

    • @sgmvideos5175 7 days ago

      Well... maybe that's what happened, I'll try to see later if I can make sure it's running on the stronger one.

  • @theneonbop 8 days ago

    Would bounce lighting be too expensive? I think at least a single bounce might be good, as an option
    I guess you could use the hit voxel's color and shadowmap to do basic lighting for the first bounce, and you could also get lighting from the last frame's hashmap to approximate more bounces (I'm pretty sure this is similar to what Godot's SDFGI and HDDAGI do for traces from the probes). This sounds pretty cheap, because you don't have to do any more traces. (I guess this would replace the ambient occlusion function)

    • @DouglasDwyer 8 days ago

      That's a great question. I haven't really explored this option yet, because the hashmap only stores visible voxels (so stuff offscreen wouldn't be stored). But it's something to try in the future!

    • @Stowy 8 days ago

      ​​@@DouglasDwyer does that mean that you only have indirect lighting from objects on screen ?

    • @DouglasDwyer 8 days ago +1

      @@Stowy nope, indirect lighting can come from offscreen objects! The hashmap only stores onscreen voxels' lighting, but to calculate their lighting, I cast rays through the world that can hit anything. What the other commenter is suggesting is that I use multiple bounces on my indirect lighting rays. These multiple bounces could take the additional lighting of offscreen voxels into account. But I don't store the lighting of offscreen voxels at present :)

    • @theneonbop 8 days ago

      @@DouglasDwyer Yes, I was thinking you could use the basic lighting model from previous videos to cover offscreen voxels, or voxels that weren't visible in the previous frame. Probably with only sun light and not ambient to avoid light leaks. It wouldn't cover everything, but I think it would be good for most of the common cases, such as sunlight coming through a window and bouncing off of the floor. It would still involve casting extra shadow rays I guess, if the shadowmap only covers screen space voxels.

  • @stickguy9109 8 days ago

    Did you come up with this on your own (7:02)? I couldn't quite understand it. Impressive how you managed to run path tracing in real time AND without needing a PC from NASA.

    • @DouglasDwyer 8 days ago

      Yes, this is my own original idea. The closest thing that I know about is NVIDIA's SHARC (github.com/NVIDIAGameWorks/SHARC), but I think that technique is slightly more complicated.
      Sorry that the explanation didn't fully make sense. The core idea is that if you average all of the path-tracing samples over an entire voxel (so the voxel just has one color, from all averaged samples), then things look pretty smooth. So, I create a hashmap or dictionary (which stores one single value per voxel) and as my shaders run, I add each lighting sample to the voxel's key in the dictionary. Then, I am able to use that final result for lighting.
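A CPU-side analogue of the dictionary idea described in this reply might look like the sketch below. The real system is a GPU-side hashmap written to by shaders; only the accumulate-then-average logic is mirrored here, and all names are made up:

```rust
use std::collections::HashMap;

/// Per-voxel accumulator for path-traced lighting samples.
#[derive(Default)]
struct VoxelLightCache {
    // (sum of sampled radiance, sample count) keyed by voxel coordinate.
    samples: HashMap<[i32; 3], (f32, u32)>,
}

impl VoxelLightCache {
    /// Called once per screen pixel whose ray landed on `voxel`; voxels
    /// close to the camera cover more pixels, so they naturally
    /// accumulate samples (and converge) faster.
    fn add_sample(&mut self, voxel: [i32; 3], radiance: f32) {
        let entry = self.samples.entry(voxel).or_insert((0.0, 0));
        entry.0 += radiance;
        entry.1 += 1;
    }

    /// One averaged value per voxel: every pixel of that voxel shades
    /// with the same result, which is what makes the output look
    /// per-voxel instead of per-pixel noisy.
    fn resolve(&self, voxel: [i32; 3]) -> Option<f32> {
        self.samples.get(&voxel).map(|&(sum, n)| sum / n as f32)
    }
}
```

Temporal reuse, as described later in this thread, would amount to carrying last frame's averaged entries forward as the starting point for the next frame's accumulation.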

    • @stickguy9109 8 days ago

      @@DouglasDwyer So it's like any path-tracing light accumulation, but instead of doing it on a per-pixel basis you do it per voxel, and since you don't know which voxel is being processed right now, you store them in a dictionary to access them later. Did I get that right? If so, doesn't that mean the noise will come back when you move too much?

    • @DouglasDwyer 8 days ago

      @@stickguy9109 Pretty much. Each pixel on the screen still casts a ray (this means that voxels close to the screen get more samples quickly), but they accumulate the results into the per-voxel map.
      Yes, when you move there might be noise - that's where the temporal accumulation comes into play. The averaged results from previous frames are re-used by looking up each voxel's entry in the hashmap from the last frame. This eliminates most flickering during motion. There may still be some in certain situations - you can play around on the demo to see how it works :)

    • @stickguy9109 8 days ago

      @@DouglasDwyer Very clever stuff. I'll try the demo now (hope it will run on a potato uhd too)

  • @oberdiah9064 8 days ago

    Shame about it only being single-bounce - did you try doing secondary/tertiary bounces too or did that end up being too intensive to run?
    IMO global illumination really starts looking amazing once you can see the colour of brightly coloured objects bleed onto other objects.

    • @DouglasDwyer 8 days ago

      Agree, multiple bounces would be really cool. I just haven't had time to explore it yet, but I do have some performance concerns. I definitely want to try it at some point :)

  • @Yagir 2 days ago

    Very low FPS in your last demo. I get 15 fps on an RTX 3050 and a Ryzen 5

    • @DouglasDwyer 2 days ago

      Thanks for trying the demo! If you are using the web version, make sure that it is actually running on your discrete GPU (as opposed to an integrated GPU). You can also try downloading the native demo instead, or decreasing the screen resolution for better framerates. But you should definitely be able to achieve 60 FPS at 1080p; I am able to do so on my 1660 TI.

    • @Yagir 2 days ago

      @@DouglasDwyer I used the .exe file from GitHub and I don't get 60 fps. Performance is very bad, and there's a delay when I try to remove something.

  • @cansado2287 8 days ago

    :)

  • @Theprobutnot 8 days ago +2

    Yes, another video! Also I'm first lol

  • @tempers_vr 8 days ago

    I swear this is just Teardown

  • @NotAFoe 8 days ago

    Amazing stuff!

  • @teabow. 8 days ago +2

    Congrats on 10k subs !