NVIDIA Just Supercharged Ray Tracing!

  • Published: 26 Sep 2024
  • ❤️ Check out Lambda here and sign up for their GPU Cloud: lambdalabs.com...
    📝 The paper "Area ReSTIR: Resampling for Real-Time Defocus and Antialiasing" is available here:
    graphics.cs.ut...
    github.com/gui...
    My free course on ray tracing / light transport:
    users.cg.tuwie...
    Our earlier paper with the spheres scene that took 3 weeks:
    users.cg.tuwie...
    📝 My paper on simulations that look almost like reality is available for free here:
    rdcu.be/cWPfD
    Or this is the orig. Nature Physics link with clickable citations:
    www.nature.com...
    🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
    Alex Balfanz, Alex Haro, B Shang, Benji Rabhan, Gaston Ingaramo, Gordon Child, John Le, Kyle Davis, Lukas Biewald, Martin, Michael Albrecht, Michael Tedder, Owen Skarpness, Richard Sundvall, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Tybie Fitzhugh, Ueli Gallizzi.
    If you wish to appear here or pick up other perks, click here: / twominutepapers
    Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
    Károly Zsolnai-Fehér's research works: cg.tuwien.ac.a...
    Twitter: / twominutepapers

Comments • 343

  • @UON
    @UON 4 months ago +625

    I've been raytracing for years and it's so exciting watching this technology evolve. I have to hold onto my papers 24/7 because of all the hot winds my GPUs generate

    • @dertythegrower
      @dertythegrower 4 months ago +7

      All PC gamers since 2001 have bought Nvidia parts and hyped it. I've got the Nvidia 2060 Super in my PC now, first-gen ray tracing, and the graphics and the reflections on glass and water are mind-blowing. These games already exist (mods for Cyberpunk, or even the famous 3DMark tests that have been PC benchmarks for years)... it is clearly a game changer for making realistic, real-world graphics.

    • @NaN_000
      @NaN_000 4 months ago +1

      Use the GPU's fan

    • @2hotflavored666
      @2hotflavored666 4 months ago

      @@dertythegrower You can run RT on a weak GPU like a 2060 Super?

    • @rano12321
      @rano12321 4 months ago +2

      @@dertythegrower Video games usually haven't done real ray tracing until recently; all the ray tracing mods were trickery to fake real-time GI and reflections, because proper ray tracing with old techniques would take a long time to compute, which is why 3D renders take minutes to sometimes hours for one frame.

    • @KalkuehlGaming
      @KalkuehlGaming 4 months ago +2

      Uon. Didn't expect to see a comment by you. :D

  • @ArminEghdamiDrums
    @ArminEghdamiDrums 4 months ago +407

    That sheep looks like it's ready to die from so much noise

    • @peanutnutter1
      @peanutnutter1 4 months ago +33

      Not a time to be alive

    • @DarkoP9.13
      @DarkoP9.13 4 months ago +5

      Noise one!

    • @TheBrother34
      @TheBrother34 4 months ago +9

      If we shear the wool, we can hold the static we saw in our televisions back when we were kids

    • @Dialethian
      @Dialethian 4 months ago +2

      Needs a visit from the laundromat guy.

    • @MagodosFrames
      @MagodosFrames 4 months ago +1

      Lol, this reference is golden; the animation is one of the best out there made in Blender.

  • @SuperWiiBros08
    @SuperWiiBros08 4 months ago +108

    I think making games look like real life is not about better hardware but better algorithms/programming that can accurately replicate realistic lighting, materials and such

    • @Hackanhacker
      @Hackanhacker 4 months ago +4

      Absolutely! Making games isn't about reproducing real-life things like physics or light-ray paths to put lighting in the right place (good example for this vid :P). It's about taking shortcuts to give the illusion that something is simulated.

    • @research417
      @research417 4 months ago +18

      Because hardware power has been steadily increasing throughout the years, we've been able to continually upgrade the graphics and capabilities of games just by waiting for the newest chips to come out. Unfortunately, it's not a sustainable system, and we're hitting plateaus in terms of price to computing power ratios.
      In the future, if games want to advance the field in any way or be considered 'next gen', they're going to have to put much more focus on algorithmic efficiency and clever programming. Which IMO is a good thing because as it stands now most games are very poorly optimized, and rarely make an impact in the field in terms of smart new algorithms.

    • @mikakorhonen5715
      @mikakorhonen5715 4 months ago +3

      Nobody will continue the research if hardware isn't also getting better over time. New ideas become possible with new hardware.

    • @detran09
      @detran09 3 months ago +3

      The sheer number of calculations that have to be made in rapid succession to emulate light's various characteristics is very demanding. Yes, we have come up with clever techniques and shortcuts to speed things up significantly, but robust hardware capable of handling these computations is still very much essential; just look at Unreal Engine 5. Hardware's importance cannot be overlooked. Hardware, software, algorithms, and techniques are all important parts of the puzzle. The fact is that many of the mathematical functions used in emulating light have been around for well over 100 years, yet it's only through modern hardware that we can use them to visualize light in modern graphics.

    • @ludvig3242
      @ludvig3242 3 months ago

      @@Hackanhacker That's not what they said 💀

  • @cappybenton
    @cappybenton 4 months ago +129

    Wunderbar. I wrote ray tracing code all the way back in 1984. All the improvements since then are indeed stupendous.

    • @Youbetternowatchthis
      @Youbetternowatchthis 4 months ago +8

      What could you render with 1984 hardware? I can't even imagine what working on ray tracing back then could look like.
      I wasn't even alive back then.
      I remember working with 3D software where just rendering a single quality image from a scene took a good PC a couple of days (I don't think it used ray tracing, but I had no idea).

    • @shadowlord0162
      @shadowlord0162 3 months ago +1

      Ray tracing has existed for that long? It's older than me lmao

    • @aarorissanen930
      @aarorissanen930 3 months ago

      @@shadowlord0162 It's a very simple and intuitive method, not a surprise we came up with it in the early days.

    • @Triv1umxxx
      @Triv1umxxx 3 months ago +3

      @@shadowlord0162 Ray tracing has been around since the 60s; in fact, it predates the modern rasterisation techniques we see today. In essence, ray tracing is 'really' simple: you can put together a renderer (albeit a very basic one) in a few hundred lines of code; it's just naturally computationally expensive. Accelerating it, though? Not so simple.

  • @AdamMi1
    @AdamMi1 4 months ago +41

    It's great to see some content about light transport simulation from you again!

  • @texlop2
    @texlop2 3 months ago +13

    we get.... this thing.... that we call.... high frequency noise.... that. makes.. almost... all of this footage.... completely.. unusable,

    • @mr_nate8911
      @mr_nate8911 3 months ago +6

      This guy makes me feel like I'm listening at 5fps

    • @teresashinkansen9402
      @teresashinkansen9402 29 days ago

      I think his way of talking has gone from being a charm of the channel to being exaggerated, and it's starting to affect how much the videos can be enjoyed.

  • @Vorexia
    @Vorexia 4 months ago +40

    The frame-time increase for the new technique is pretty large, though. 7.5 to 12ms is a 60% increase, while 10.5 to 14.5ms is a 38% increase. Still a very impressive advancement, but especially for gaming applications wherein ReSTIR PT with DLSS turned on already pushes modern GPUs to their limits, those numbers would be concerning.
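
    A quick sanity check of those frame-time percentages (a minimal sketch; the millisecond figures are the ones quoted above):

    ```python
    def pct_increase(old_ms: float, new_ms: float) -> float:
        """Percentage increase in frame time going from old_ms to new_ms."""
        return 100.0 * (new_ms - old_ms) / old_ms

    print(pct_increase(7.5, 12.0))   # 60.0  -> the ~60% increase cited
    print(pct_increase(10.5, 14.5))  # ~38.1 -> the ~38% increase cited
    ```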

    • @9308323
      @9308323 4 months ago +21

      That _is_ true. Though it's also possible that the technique to denoise the data wouldn't take as much time, considering that it's working with far more information. It could cancel out, but with the image being more accurate. Either way, considering this is open source, I'd imagine it won't be long before the frame time shrinks in a paper or two; it might not even become an issue.

    • @link99912
      @link99912 4 months ago +8

      It depends on how it scales. If the fidelity is 25x at 60% more time at 1spp, maybe you can go down to .5spp with the new method and still get better quality than 1spp in the old method. Then you're running both faster and at a higher quality.

    • @israelRaizer
      @israelRaizer 4 months ago +2

      ​@link99912 0.5spp? What would half a sample look like in path tracing? I don't think that makes sense, you either take a sample for each pixel or you don't... Unless you take one sample for every two or more pixels in the image, but that's more like using DLSS for upsampling.

    • @wanhl2440
      @wanhl2440 4 months ago +3

      @@israelRaizer That's what DLSS 3.5 ray reconstruction is already using: one sample for every two or more pixels in the image.

    • @israelRaizer
      @israelRaizer 4 months ago +2

      @@wanhl2440 Yeah, I get it, I just don't think that terminology makes much sense, because in the "tracing step" of the pipeline (where the number of samples is relevant) it's still a minimum of 1 spp; you can't have a non-integer number of samples per pixel.
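
      For the curious, here's a minimal sketch of what "one sample for every two pixels" could mean in practice (the checkerboard mask is purely illustrative, not DLSS's actual sampling pattern):

      ```python
      import numpy as np

      def half_spp_mask(width: int, height: int, frame: int) -> np.ndarray:
          """Checkerboard mask: True marks pixels that get one path-traced
          sample this frame, i.e. 0.5 samples per pixel on average; alternating
          the parity each frame lets the other half be filled in temporally."""
          ys, xs = np.mgrid[0:height, 0:width]
          return (xs + ys + frame) % 2 == 0

      print(half_spp_mask(4, 2, frame=0).astype(int))
      # [[1 0 1 0]
      #  [0 1 0 1]]
      ```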

  • @dwnsdp
    @dwnsdp 4 months ago +97

    "wooaah this is a terrible disappointment" I just spat out my drink listening to youtube in the background.

  • @NicCrimson
    @NicCrimson 4 months ago +28

    Is the project open source? I hope it gets better and eventually added to Blender even though I use an ARC gpu.

    • @theneonbop
      @theneonbop 4 months ago +8

      Blender is currently focusing on EEVEE Next for realtime and trying to keep Cycles as unbiased as possible, without any sort of spatial or multi-frame information. However, just looking at the images on their project page, it seems a lot less lossy than ReSTIR, so it seems feasible that they might implement it in the future.
      Of course, they still have to finish EEVEE Next, spectral Cycles (maybe), the animation rework, etc...
      It is open source but under an NVIDIA license, so IDK if it would be possible to use it directly in Blender.

    • @ettiSurreal
      @ettiSurreal 4 months ago +5

      @@theneonbop EEVEE and Cycles have completely separate devs; there is no "focus on EEVEE". This also extends to geometry nodes, UI, animation, etc.

    • @theneonbop
      @theneonbop 4 months ago +2

      @@ettiSurreal Yeah, you're right. Just from a glance at the Blender repo, it looks like there is a little bit of overlap in personnel between Cycles and EEVEE, but I guess it is mostly specialized. Anyway, one of the full-time devs made a WIP PR for ReSTIR about a month ago, so hopefully that will get into a usable state relatively soon.

  • @face.r
    @face.r 4 months ago +187

    she trace on my ray till i render

  • @kernelcodes
    @kernelcodes 4 months ago +43

    he was about to drop the f-bomb lol 1:17 "three f**king weeks, yes really ..."

  • @Mad3011
    @Mad3011 4 months ago +80

    Thank you for pointing out that this is a hand-crafted technique! I'm starting to get a bit of AI fatigue when it seems like every breakthrough amounts to just pouring ginormous amounts of training data into a machine learning algo.

    • @theblckbird
      @theblckbird 4 months ago +1

      This!

    • @Sibanamush
      @Sibanamush 4 months ago +5

      I get the AI fatigue, but who do you think handcrafted the AI learning algorithms? Not trying to throw shade; I just find the distinction between two different tools/workflows that use GPUs to create images interesting, when in this context they're both massively impressive, complex algorithms that some very creative and intelligent people have made.

    • @Mr.MasterOfTheMonsters
      @Mr.MasterOfTheMonsters 4 months ago +1

      Letting machines do the work for us IS the future. Wasn't that a good thing a couple decades ago?

    • @Zadamanim
      @Zadamanim 3 months ago +2

      @@Sibanamush I think it's a testament to the speed of progress that we're having breakthroughs every other day, to the point where they get boring lol

  • @pdjinne65
    @pdjinne65 4 months ago +7

    This guy Al sure worked on a lot of papers. What a brilliant man he must be!

  • @TechGamesAU
    @TechGamesAU 4 months ago +6

    I'm confused. We already get more stable images than this in real-time video games using DLSS 3.5 ray reconstruction. Is the difference here that ReSTIR is not using any ML algorithm, so it remains physically accurate using simulation?

    • @akyhne
      @akyhne 3 months ago +2

      In a game with ray tracing, the game is rendered normally, like any other game, with - usually - many light sources. Then some ray tracing effects are added to shiny surfaces that are defined in the game as being able to receive ray tracing. It's very rudimentary.
      In ray-traced 3D software, like 3DS Max, you have all your objects, but the light sources don't work like in a game; they are calculated by ray tracing.
      It's a bit hard to explain, but in 3D, everything is built out of triangles. You have a light source, but it doesn't emit light. It "shines" on a triangle of a model, and according to the camera position (your view), the renderer simply calculates how much color should be on that triangle. Say the triangle is 100% red, but it sits in what's supposed to be a dark area; according to the light source and your view, a dark red is simply applied. That's a very basic explanation.
      With ray tracing, it emulates the real world: a number of rays from the light source hit that red triangle, and it is calculated how the triangle should be lit. Not only that, but for every ray it is calculated how it bounces and hits other triangles of other objects. The calculations are even done on the basis of how shiny the triangles in the scene are supposed to be.
      It is a lot more computationally intensive than how a game works.
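
      In code, the "non-ray-traced" shading described above boils down to something like this minimal diffuse-shading sketch (illustrative only; real engines add many more terms):

      ```python
      import numpy as np

      def shade_triangle(base_color, normal, light_dir):
          """Scale the triangle's color by how directly it faces the light.
          No rays are traced: a 100% red triangle in a 'dark area' simply
          comes out as a darker red."""
          n = normal / np.linalg.norm(normal)
          l = light_dir / np.linalg.norm(light_dir)
          return base_color * max(float(np.dot(n, l)), 0.0)

      # A red triangle lit at a grazing angle comes out dark red:
      print(shade_triangle(np.array([1.0, 0.0, 0.0]),
                           np.array([0.0, 0.0, 1.0]),
                           np.array([1.0, 0.0, 0.2])))  # ~[0.196 0. 0.]
      ```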

    • @akyhne
      @akyhne 3 months ago

      And here's an example of how ray tracing works in 3DS Max. Note that the ray tracing shown is sped up; every time he makes an adjustment, it's not rendering in real time. ruclips.net/video/48YtLmJCPzE/видео.htmlsi=Awt9f14bJhHo2boL

    • @mojojojo6292
      @mojojojo6292 3 months ago

      @@akyhne There are a few titles that offer full real-time path tracing options though, like Cyberpunk and Alan Wake 2. You really need the best hardware combined with upscaling to get playable frame rates, and bounces are limited, but it's still incredible that it's possible at all in these graphically intensive games.

    • @akyhne
      @akyhne 3 months ago

      @@mojojojo6292 You can't compare ray tracing in games with ray tracing in 3D software.

    • @mojojojo6292
      @mojojojo6292 3 months ago

      @@akyhne I agree. I was just pointing out that there are some games that use full-scene path tracing for all light sources and not just some effects. The quality of the RT obviously does not measure up to 3D rendering software, with only 1 to 2 bounces and a very limited number of rays, but it is real time in very large scenes at reasonable frame rates.

  • @SkultétyBendegúz
    @SkultétyBendegúz 4 months ago +3

    It's so good that through you we get such a comprehensive picture of the latest graphics technologies! As soon as I see there's a new video, I can hardly wait to watch it.

  • @fen4554
    @fen4554 4 months ago +20

    I'm really confused. I'm a gamer and not a computer scientist, but haven't we approximated realtime raytracing and pathtracing with RTX? Why are we still fighting noise when this should just be a question of calculating more rays until most of the noise is gone? I understand that means trillions of calculations, but... aren't we there already?

    • @juanromano-chucalescu2858
      @juanromano-chucalescu2858 4 months ago +8

      As I understand it, games only simulate some reflections and shadows, not the entire scene.

    • @IngieKerr
      @IngieKerr 4 months ago +19

      TLDR: it's like having a better engine in a fast car; you can only drive an engine so fast, and at some point you're better off investing in a better-designed engine.
      We can do ray tracing with RTX in real time in such games only because it's used in *hugely* simplified [geometry-wise] scenes with minimal light sources and bounces, compared to a fully rendered photorealistic scene with who knows how many light sources and continual differences in texture, reflection, lensing, and complex edge geometry.
      And while it looks fantastic in games [usually] compared to non-RTX games, it's still objectively terrible compared to true photographs of a similar scene.
      You _could_ crank existing tech up to some super high level of iteration (and this undoubtedly will happen), but without algorithmic optimisation you're looking at rapidly diminishing returns... and a huge electricity bill.
      Additionally, outside the realm of games and more in non-realtime CGI/VFX: imagine it taking you, say, 5 minutes to render a full 4K test still for a movie at present, compared to, say, half that time on the same hardware if some new algorithm like the above is introduced... that's a lot of cost savings in your workflow. More power/GPUs is useful, but novel algorithmic optimisation is better :)

    • @bricaaron3978
      @bricaaron3978 4 months ago +7

      Throwing more and more rays is _not_ what you want to do --- especially not for real-time applications. What you want is to develop assumptions and approximations that allow you to render a _perceptibly plausible_ image while generating only a fraction of the number of rays that would be required for an accurate render.

    • @carlosmarx2380
      @carlosmarx2380 3 months ago

      More rays = higher render time.
      For games, though, we don't really need real-time ray tracing, since you can just bake all the lights in (except moving lights, but you can do those in real time and no one will really notice) and use reflection probes for the reflections. Yeah, ray-traced reflections are way better, but probes and screen-space reflections still work fine.
      As graphics got better and better, I noticed a decline in creativity, graphics-wise. Take GTA IV: the shadows and lighting are absolutely terrible, but my god is the design of the actual map beautiful. It still feels so realistic and lived-in. Then take the Matrix demo for Unreal 5: yeah, it looks kinda good, but given the lack of detail it seems way less realistic. Nowadays people think that just because their game has accurate reflections and somewhat accurate lighting, the graphics are "good", when in reality the game looks like hot garbage with accurate lighting lol

    • @hopoheikki8503
      @hopoheikki8503 3 months ago

      @@carlosmarx2380 For me, GTA IV's game design was quite horrid. When you died during a mission that had several parts, you might have to start from the complete opposite side of the map, and it took you like thirty minutes to get to where you died, with the several mission stages. Also, the game felt buggy, I didn't love the controls (especially the shooting), and the map was the most boring in the series. But maybe that's just me. I think a better example is something like Ico, which had a beautiful (game) design and style that we don't see much anymore in AAA games. Thankfully indie games are doing the heavy lifting when it comes to (graphical) innovation and (game) design that feels different and new.

  • @MetallicMedium
    @MetallicMedium 4 months ago +4

    Is that Lightwave 3D in the early screenshots? I really love that program. If we can output those older scenes into these new renderers, it will be really exciting.

    • @philosuileabhain861
      @philosuileabhain861 4 months ago +1

      It's an older version of Blender (pre-2.8 series) at 0:32, and then later a screenshot of Blender 2.8 at 0:44.

  • @megazilla344
    @megazilla344 4 months ago +2

    Graphics are great, but audio is unfortunately lagging behind. If we can figure out something like physically modelled sound with custom HRTFs, then we've got real immersion. 48,000 cycles per second of real-time modelling is expensive, though.

  • @kylebowles9820
    @kylebowles9820 4 months ago +3

    Omg video compression was hating the noise more than the denoisers lol

  • @kklol07
    @kklol07 3 months ago

    the office image with the grey sofa just made my eyes blow up

  • @ChadT-n9q
    @ChadT-n9q 3 months ago +2

    I am a little confused. I LOVE watching your videos, but isn't ray tracing in real time already achievable, and hasn't it been for years now? Yes, it's still very costly performance-wise, but the way this video talks about it doesn't align with current technology, no? If this had been posted 8-10 years ago, then absolutely I could see this being more exciting, that ray tracing would be possible soon; but it's already been implemented in thousands of video games in real time at this point. Can someone explain?
    EDIT: I think I understand the point of this video now: he's talking about path tracing, not ray tracing. I mean, path tracing IS ray tracing, but it's more advanced ray tracing that simulates more light calculations more accurately, and this video focuses more on the denoising aspect of this technology. Great video, thank you :)

  • @S9universe
    @S9universe 4 months ago +20

    update Cycles now lolol

    • @thornnorton5953
      @thornnorton5953 4 months ago +5

      Looks like it uses a temporal technique, which is likely incompatible with per-frame offline rendering. Also notice there are no volumetrics.

    • @NicCrimson
      @NicCrimson 4 months ago +1

      Yes!

    • @S9universe
      @S9universe 4 months ago

      ​@@thornnorton5953 😢

    • @theneonbop
      @theneonbop 4 months ago

      @@thornnorton5953 It is temporal, but I don't think there's any reason it couldn't theoretically be implemented in Blender; it would just probably take a lot of reworking and effort. If it were added, I would expect it to act as a 4th option in between the quality of EEVEE and Cycles.
      And yes, volumetrics don't seem to be showcased in their demos, but I would be shocked if it didn't support them - it seems like their technique relies on re-using ray paths, but it looks like it keeps the logic of a single ray hit identical. The one thing it doesn't appear to support is motion blur, but the authors of the paper say they have ideas to address this in the future.
      It looks like the Blender devs are finally finishing EEVEE Next, so hopefully some other interesting rendering stuff will come soon. It looks like one of Blender's full-time developers is currently starting work on the original ReSTIR (with temporal elements) in a PR opened about a month ago (#121023).

    • @descai10
      @descai10 4 months ago

      @@thornnorton5953 Temporal is optional, it's still a big improvement even without it.

  • @BirBilgin
    @BirBilgin 4 months ago +11

    bro's voice sounds AI-generated

    • @xPocketStaff
      @xPocketStaff 3 months ago +4

      Yes, his voice is terrible to listen to; I always mute the video and use subtitles 😅 (terrible to listen to, terrible emphasis)

    • @Cactoos
      @Cactoos 3 months ago +2

      Tbh I can't bear his videos: so much great content, and I just can't listen to his way of speaking, getting louder on words in the middle of sentences.

  • @Meepminer
    @Meepminer 4 months ago +26

    Watatime tubie a live!

  • @epicthief
    @epicthief 4 months ago +2

    Let's hope the next paper comes sooon

  • @bruh-hp7kc
    @bruh-hp7kc 4 months ago +5

    Can I ask how this ray tracing is different to the ray tracing we see in games, and what makes this more difficult?

    • @JananaYatharuth
      @JananaYatharuth 4 months ago +1

      I don't know much, but I hope this'll help.
      Feel free to correct me if I'm wrong.
      This is pure path-traced rendering, which doesn't involve rasterization. Also, path tracing is harder than ray tracing, as ray tracing only does one bounce off a surface while path tracing might do multiple bounces for better realism.
      In games we use a combination of raster and ray-traced rendering. As raster rendering is much easier, games will do most of the work in raster and use ray tracing for specific areas. Even then, the ray-traced areas will look noisy, but the noise filtering is easier as the raster layer below hides most of the artifacts.
      Please correct me if I'm wrong, I'd love to learn more.

    • @Sergeeeek
      @Sergeeeek 4 months ago +1

      Path tracing is basically: shoot a ray from the camera; if it hits anything, bounce in a random direction (it might not be that random, depending on the material); continue bouncing until you're bored. Repeat the whole process 10-10000 times to clear up the noise.
      Ray tracing: shoot a ray from the camera; if it hits anything, shoot n rays towards each nearby light. If you don't hit anything on the path to a light, then that light affects the material. Shoot a reflection ray and you're done. The first step can be skipped if you use rasterization.
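
      A minimal sketch of that path-tracing loop (the `scene.intersect` API and the material fields here are hypothetical stand-ins, not from the paper):

      ```python
      import math
      import random

      def random_hemisphere_dir(normal):
          """Uniform random direction on the hemisphere around `normal`."""
          while True:
              d = [random.gauss(0.0, 1.0) for _ in range(3)]
              n = math.sqrt(sum(c * c for c in d))
              if n > 1e-9:
                  d = [c / n for c in d]
                  break
          # Flip into the hemisphere facing the surface normal.
          if sum(a * b for a, b in zip(d, normal)) < 0.0:
              d = [-c for c in d]
          return d

      def trace_path(origin, direction, scene, max_bounces=4):
          """One noisy sample: bounce randomly until we escape or give up.
          `scene.intersect`, `hit.emission`, `hit.albedo` are placeholders."""
          radiance, throughput = 0.0, 1.0
          for _ in range(max_bounces):
              hit = scene.intersect(origin, direction)
              if hit is None:
                  return radiance + throughput * scene.sky(direction)
              radiance += throughput * hit.emission   # light sources add energy
              throughput *= hit.albedo                # surfaces absorb some of it
              origin, direction = hit.point, random_hemisphere_dir(hit.normal)
          return radiance

      # Averaging many such samples per pixel is what clears up the noise:
      # pixel = sum(trace_path(cam, ray, scene) for _ in range(spp)) / spp
      ```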

    • @bruh-hp7kc
      @bruh-hp7kc 4 months ago

      I appreciate the replies, thanks!

  • @darkesco
    @darkesco 4 months ago +3

    Not real-time, but the Blender Nvidia denoiser tool only requires 1 pass, then the quick effect clears it up. 2 or 3 passes is all you need to look really good. I used it years ago, and you had to use the Cycles renderer for it.

  • @kipchickensout
    @kipchickensout 4 months ago +5

    legend says he's a light transport researcher by trade

  • @witext
    @witext 4 months ago +6

    I would love to see a demo combining this paper with the latest & best denoisers, to see what it would look like if this were to ship in a game rn

    • @theneonbop
      @theneonbop 4 months ago +2

      Yeah, that little "denoised" box in the video isn't really enough to see it lol. I tried the demo from their GitHub - I had to turn it to half resolution or else it needs more than the 12 GB of VRAM on my 3060 (it runs at about 25 fps at half resolution, maybe fine for a cinematic game if you combined it with DLSS/XeSS), but it doesn't seem to have an option for denoising.

    • @aipowertutor
      @aipowertutor 4 months ago

      Yeah, it's amazing. With the help of AI we can change the whole world.

  • @blackshard641
    @blackshard641 4 months ago +3

    Squeezing those papers dry right now.

  • @morkaili
    @morkaili 4 months ago +1

    You know, your videos make a great drinking game: drink whenever the word "and", "with" or "paper(s)" is said, and whenever a sentence is stopped midway, only to continue strongly emphasized. :P

  • @jorgesaxon3781
    @jorgesaxon3781 4 months ago +15

    holding my papers so quickly right now

  • @roastyou666
    @roastyou666 4 months ago +1

    Wow back to prof’s strong suit

  • @ExpensivePizza
    @ExpensivePizza 4 months ago

    I remember seeing raytracing in the late 90s. It didn't make sense to me back then but I always had it in the back of my mind. It's wild to see how far we've come in 30 years.

  • @Yamyatos
    @Yamyatos 4 months ago +1

    "Slightly slower" = takes basically twice as long lol. Worth the results, but it felt a bit played down.

  • @ScraggyDogg
    @ScraggyDogg 3 months ago

    I know this issue well, and I'm very glad to see improvements.

  • @dvdganon0812
    @dvdganon0812 3 months ago

    Will we see this improvement in the next Nvidia drivers? Updated ray and path tracing for current games?

  • @氷語
    @氷語 3 months ago

    Wow, Nvidia doing something open source? Wouldn't have imagined living long enough to see this

  • @Felixxenon
    @Felixxenon 4 months ago

    When this technique merges with machine learning, things are going to get as wild as spring break! Love your work, doc♡

  • @Adam_Lyskawa
    @Adam_Lyskawa 4 months ago +9

    I have a hunch that in like less than 5 years you would be able to render UHD 3D ray traced graphics in real time on a home PC. We're very close to it.

    • @theskyblockman
      @theskyblockman 4 months ago +3

      No? Currently the most popular GPU for home computers (don't look only at the Steam hardware survey!) is the GTX 1650 (my current daily driver), soon to be dethroned by the 3060. If we can't do that on high-end consumer GPUs or server-grade GPUs now, there's no shot we'll be able to do it in 5 years. Also, Nvidia is really greedy and will probably make their GPU prices explode again by then. Not to mention that many other computers don't even have a GPU at all. If people allowed themselves to think Nvidia isn't the only company making GPUs, I could see this becoming a bit possible on an X090 Ti, but of course this won't happen; people will get Apple products even if they have clearly worse specs than Androids.
      I like the optimism though! If only everything wasn't ruined by capitalism...

    • @Adam_Lyskawa
      @Adam_Lyskawa 4 months ago

      @@theskyblockman I think the bad times for NVidia and GPUs in general are over. I think the crypto bubble burst. The RTX series is not that prohibitively expensive anymore, and it delivers considerably more than the last generation. It's not TRUE (full-blown) ray tracing yet, but we're getting there. One new feature I'm testing is DLSS. It really works, and it allows playing in 4K when otherwise only HD would work.

    • @theskyblockman
      @theskyblockman 4 months ago

      @@Adam_Lyskawa When I was talking about prices going up, I wasn't really thinking of crypto: Nvidia got their crypto infinite-money glitch patched, so now they are interested in AI and have a new reason to explode GPU prices again. Imagine a 5090 with NPU-like capabilities; this could totally double the already ludicrous price of the GPU (and people would buy it for sure), slowing the reach of the tech. Also, I am pretty sure that because of AI, Nvidia slowed down ray tracing, because they make more money on AI than on RT.
      (You can skip this, it's just a big tangent on why the GPU market is bad rn.)
      Personally, when Nvidia started non-algorithmic processing on games, it was time for me to go to AMD (today I am on Linux, so my next GPU is an Intel or an AMD). AMD here is totally the better option if people want better GPUs in the long run, but nobody thinks like that with their money. (Intel's first-gen GPU was maybe bad, but they arrived in a new market and actually did pretty well compared to Nvidia's and AMD's first-gen GPUs; still, they entered the market lagging behind, and the result is that nobody bought Intel GPUs and Battlemage is not sure to release at all.) If people bought Intel GPUs, I can assure you we would have 32 gigs of VRAM with an acceptable bus size for a fraction of the current price of GPUs, but nobody wants to support the underdog with their money.

    • @theneonbop
      @theneonbop 4 months ago +1

      Portal RTX, Cyberpunk, etc. all do path-traced ReSTIR in real time on a home PC, and you only need a $300 GPU for it...
      We're not 'close to it'; we already have it. Of course, it can be improved - I need to use heavy DLSS upscaling on my 3060 for Portal RTX, and I would expect Cyberpunk to be even more demanding, but real-time path tracing is very much currently possible.

    • @theneonbop
      @theneonbop 4 months ago

      @@theskyblockman "of course this won't happen; people will get Apple products even if they have clearly worse specs than Androids"
      Have you even looked at benchmarks in the past 5 years? Apple's mobile CPUs have been at the top of the leaderboards for a long time. And I've yet to see an Android phone with the 3D scanning features iPhones have, with both the structured-light sensor and the lidar sensor.
      And yes, a high-end GPU can currently do real-time path tracing (with ReSTIR). Just look at some of the games NVIDIA has been showing off, like Portal RTX and Cyberpunk.
      This video kind of gives the impression that they can't, because it only shows off frames before denoising, which is never what the user will actually see. But they can - ReSTIR just loses some details that can be recovered with something like this or NVIDIA's secretive 'DLSS 3.5'.

  • @Krilium
    @Krilium 3 months ago

    Another problem with realistic game graphics is that you absolutely HAVE to have realistic animations, particle FX and sound design to go with it. Most Unreal RT projects are just eye candy for that very reason.

  • @jonathanberry1111
    @jonathanberry1111 3 months ago

    I swear earlier papers seemed to do an even better job?! Were they doing something different?

  • @ABaumstumpf
    @ABaumstumpf 4 months ago

    It is great how much Intel and Nvidia are doing for the research and development of graphics and open-source in general.

  • @ngelemental2274
    @ngelemental2274 3 months ago

    Is there a way to implement this new denoiser in current games that use ray tracing, or is it all dependent on the developer or Nvidia?

  • @sarathallaka2354
    @sarathallaka2354 4 months ago

    Might be a dumb question, but I cannot stop myself from asking: aren't current diffusion models solving this instantly? They remove the noise and generate an accurate image, right? I know I am missing something; please help me understand.

  • @panzerofthelake4460
    @panzerofthelake4460 4 months ago +3

    How long till we see it in games?

  • @BOXING_LOUNGE
    @BOXING_LOUNGE 4 months ago

    @Two Minute Papers did they use your voice in the Ps3 Game, Puppeteer? 😉

  • @dxnxz53
    @dxnxz53 4 months ago

    what a time to be alive!

  • @SogonD.Zunatsu
    @SogonD.Zunatsu 4 months ago

    We're bringing back film grain with this one.

  • @CoreyJohnson193
    @CoreyJohnson193 3 months ago

    Oh My God!! Wow, that is absolutely amazing.

  • @d3xx3rDE
    @d3xx3rDE 4 months ago

    Just yesterday I was thinking about some way to make ray and path tracing more efficient using some sort of deep learning

  • @geometryflame712
    @geometryflame712 4 months ago

    This looks super cool and stylized. Would make for a great video game

  • @uw6de
    @uw6de 4 months ago

    Didn't we already see how they managed to improve the carousel in an earlier paper? But still, it's amazing.

  • @phazei
    @phazei 4 months ago

    I would have loved to see the results of the new raytracing method after being denoised. Would that have been functionally usable for games?

  • @MikkoRantalainen
    @MikkoRantalainen 3 months ago

    Just wait 3 more papers and the scenes will look noise-free at first glance.

  • @theneonbop
    @theneonbop 4 months ago +1

    Now that realtime raytracing is possible and getting better on consumer hardware, the next frontier I am wondering about is if we will ever get VR path tracing. It would need to be 90+ fps, low latency, minimal blur/artifacts, and run on 2x4k screens at once. I guess lumen is probably "close enough" for now lol

    • @Sergeeeek
      @Sergeeeek 4 months ago

      A lot of VR headsets are portable now and have weak gpus, so maybe not this decade haha

    • @ILoveTinfoilHats
      @ILoveTinfoilHats 4 months ago +1

      It already can. It's just not worth the development cost, but I'm sure some indie studio using Unreal will do it.

    • @theneonbop
      @theneonbop 4 months ago

      @@ILoveTinfoilHats Not path tracing. Lumen works, but path tracing at full res is still too slow.

    • @ILoveTinfoilHats
      @ILoveTinfoilHats 4 months ago

      @@theneonbop yes path tracing, just maybe not with first gen rt cores.

    • @theneonbop
      @theneonbop 4 months ago

      @@ILoveTinfoilHats What makes you think 2x4k 90fps path tracing is possible with current hardware?

  • @MagesIncorporated
    @MagesIncorporated 4 months ago

    Okay obviously I'm not experienced in the field, but I thought we already were integrating ray tracing into real time applications, like video games? Wasn't the tech demo of Control a few years back doing exactly this? Am I missing something? Is it more supercharged with DLSS than I was aware?

    • @mojojojo6292
      @mojojojo6292 3 months ago

      Control has very limited RT effects. Most of the rendering is still traditional methods with a basic layer of RT on top for reflections and shadows. There are more recent games that have full scene path tracing options though like Cyberpunk and Alan Wake 2 that use every trick in the book to give playable frame rates with a good image, 1-2 bounces, small number of rays, upscaling, frame generation and denoising. The result is still incredible. ruclips.net/video/qvu4aru39SA/видео.html

  • @ShahirZaman
    @ShahirZaman 4 months ago

    rendering stuff is going to be so much better :O

  • @stefanoscolari3902
    @stefanoscolari3902 3 months ago

    Hey! Why no videos on 3DGS?

  • @TorpedOvrn
    @TorpedOvrn 4 months ago

    Is it just temporal filtering?

  • @MilesBellas
    @MilesBellas 3 months ago

    Ray tracing = synthetic data for model training?

  • @legral
    @legral 4 months ago +1

    You're a jewel, Karoly!

  • @pxrposewithnopurpose5801
    @pxrposewithnopurpose5801 4 months ago

    it's going crazy

  • @krinodagamer6313
    @krinodagamer6313 3 months ago

    It's evolving, y'all

  • @5alpha23
    @5alpha23 1 month ago

    Dumb question, but wouldn't it be possible to present an AI with an unrendered scene and ask it to add realistic lighting? Sure, it wouldn't work in real time; it's just a general question.

  • @miroaja1951
    @miroaja1951 4 months ago

    Say goodbye to the video bitrate with all that noise

  • @Youbetternowatchthis
    @Youbetternowatchthis 4 months ago

    Can anyone explain how the current method of ray tracing that actually works in real-time games compares to this?
    I am confused due to a lack of knowledge, I suppose.

  • @kylepena8908
    @kylepena8908 4 months ago

    What a time to be alive!

  • @JynxedKoma
    @JynxedKoma 3 months ago

    *"Fwee Wiiks!"*

  • @emiel333
    @emiel333 3 months ago

    Looks great.

  • @TurdFergusen
    @TurdFergusen 3 months ago

    still a bit to go till it's as good as the ray tracing in our matrix

  • @liangcherry
    @liangcherry 4 months ago

    My focus has been drawn to dubbing🤣

  • @getsideways7257
    @getsideways7257 4 months ago

    Real time bokeh simulation, huh? I'm sold

  • @splashmaker2
    @splashmaker2 4 months ago

    Haven't read the paper, but I'm guessing it's a hand-crafted algorithm from human ingenuity. This makes me wonder, then: how long until AI replaces that stage in addition to denoising?

  • @willhart2188
    @willhart2188 4 months ago

    Another huge jump forward.

  • @StickerWyck
    @StickerWyck 4 months ago +6

    This one looks like it's from 4 years ago.

  • @AmaroqStarwind
    @AmaroqStarwind 4 months ago

    Have you looked into blue noise?

  • @ps3301
    @ps3301 4 months ago

    This is 2024. GPUs aren't in their final form yet. If you can create the best and fastest GPU, you can power all the future AI tech.

  • @Sintrania
    @Sintrania 4 months ago

    This video is a lot better without audio really.

  • @FlamespeedyAMV
    @FlamespeedyAMV 3 months ago

    you make a video like this every few weeks

  • @robertoaguirrematurana6419
    @robertoaguirrematurana6419 4 months ago

    This is the future of prescription lenses.

    • @robertoaguirrematurana6419
      @robertoaguirrematurana6419 4 months ago

      @capitannerevar7792 no more glasses, just real time super high quality rendering

  • @Zen_Power
    @Zen_Power 4 months ago +1

    …..what a time to be A.iiiiiiiiiiiiiiiiii

  • @callibor3119
    @callibor3119 4 months ago +1

    Reminds me of 360p and 480p resolution. It is feasible and can be played in a game. Fans can tinker with it, post their version of the papers, and even test it out in Unreal Engine 5 to see if it handles well in real time.

    • @theneonbop
      @theneonbop 4 months ago +2

      what?

    • @callibor3119
      @callibor3119 4 months ago

      @@theneonbop This is a paper that's already out for the public to try. He has always made it clear that each paper can be tested and even improved upon to reach that level of real-time graphics that people have been waiting for for so long.

  • @Pockeywn
    @Pockeywn 4 months ago

    i always have to wait til i get home to watch videos about denoising >:( grr booo youtube compression

  • @InfinitycgIN
    @InfinitycgIN 4 months ago

    Waiting for the time somebody implements it in Blender Cycles; won't have to upgrade my GPU then 😂😂

  • @hatobeats
    @hatobeats 3 months ago

    Nvidia's part in all these papers is free GPUs for the real scientists who are actually doing all the work.

  • @ScibbieGames
    @ScibbieGames 4 months ago

    They supercharged it again?

  • @vaisakhkm783
    @vaisakhkm783 4 months ago

    I really want to see how the new method looks through the denoiser... 😢

  • @kycklingris
    @kycklingris 4 months ago

    I am a bit curious about what would happen if Nvidia or AMD built a GPU with the sole purpose of ray tracing. Of course, I know pretty much nothing about GPU design, so it's possible that all of the hardware is already in use; but if it's not, then maybe the next generation of consoles could be 100% ray-tracing focused (of course including all the denoising and stuff) and run at interactive frame rates.

    • @ABaumstumpf
      @ABaumstumpf 4 months ago

      "could be 100% raytracing"
      That would be a bad idea - there are way too many things that are done way more effectively with normal rendering. In general you want to use ray tracing to get the extra data for the normal renderer - like getting the indirect light (not the direct light), volumetric materials, etc.

  • @kiminthemix4251
    @kiminthemix4251 3 months ago

    still a long way to go

  • @ya_krutyi1337
    @ya_krutyi1337 4 months ago

    They supercharged raytracing 20 times already... I hope it will get to the desktop GPUs eventually💢

  • @brianjanssens8020
    @brianjanssens8020 4 months ago

    How is this different from unreal engine 5's lumen?

  • @pandoraeeris7860
    @pandoraeeris7860 4 months ago

    Not to be confused with tray racing.

  • @XashA12Musk
    @XashA12Musk 4 months ago

    i always thought your name was "Dr. Caro JhonAifa Here", today i saw your real name in subtitles

  • @TeddyLeppard
    @TeddyLeppard 4 months ago

    With the pace of AI innovations, I suspect we'll be able to see real-time ray tracing that is twice as good in a few years.

  • @ertyly7445
    @ertyly7445 4 months ago +1

    At 2x speed your voice is pretty funny

  • @MrRandomPlays_1987
    @MrRandomPlays_1987 4 months ago

    Can't they make an AI model predict exactly how ray-traced images should look, given the scene's geometry, lighting conditions, materials, and camera angle? That way we might get fully optimized real-time ray-traced graphics without any noise to begin with.

  • @Alpha-vb3to
    @Alpha-vb3to 3 months ago

    Isn't it possible to get an even better result using an AI to correct all the image artifacts in real time?
    The training should be very simple: just train with reference images.

  • @Rustikss00
    @Rustikss00 4 months ago

    Imagine living in a time when the author has to note that a technique was fully crafted by humans

  • @FactsWithActs
    @FactsWithActs 4 months ago

    What alive to be a time