NVIDIA’s New Ray Tracing Tech Should Be Impossible!

  • Published: 11 Dec 2024

Comments • 458

  • @imsatoboi
    @imsatoboi 1 month ago +641

    i am a simple man, i see Two Minute Papers, i click

  • @mtrivelin
    @mtrivelin 1 month ago +307

    I'm a 3D artist who started on the Amiga with LightWave, and I currently (still) work with Maya and V-Ray. Nodes, lights, shaders, render, change things, render again until it looks good.
    But seeing these new technologies, which seem to perform one miracle bigger than the last every few days, makes me feel like a caveman trying to understand our world.
    I have no idea how to use these things. I feel like I became outdated instantly.

    • @jtjames79
      @jtjames79 1 month ago +33

      Nobody knows how to use any of these things.
      What you do have is experience knowing when it looks right.
      Learning workflows only takes a few days to a few months depending on how deep you want to get.

    • @SirusStarTV
      @SirusStarTV 1 month ago +32

      These things only become useful when implemented in popular software like Blender, Unity, or Unreal Engine. We've been seeing these demos for years without them being available in any software.

    • @stephanelegendre7624
      @stephanelegendre7624 1 month ago

      I tried to recommend that you try Postshot yourself, but YouTube censored me...
      If the YouTube gods allow me to explain: there are plugins out there already for Blender/Unreal/Unity. Do not hesitate to ask me more, if I'm allowed to answer...

    • @ianwatson5767
      @ianwatson5767 1 month ago +10

      As a fellow 3D artist of about 10 years, I feel it's a futile battle to learn more 3D stuff, because in a few more years CGI will likely be replaced by AI. It can already generate photorealistic renderings and animations, and it's just getting started.

    • @Caellyan
      @Caellyan 1 month ago +4

      @@SirusStarTV If you're a game engine developer, they're sometimes useful because they show what's possible and come with a "recipe" for how to do it. But for someone who's in modelling, or even game dev (using a preexisting engine), not as much. Even if you had the source (UE4/Blender) and could implement the technique yourself, it's really not worth the effort.

  • @davidrenton
    @davidrenton 1 month ago +520

    finally the mortgage for that 8090 will be justified

    • @theuserofdoom
      @theuserofdoom 1 month ago +13

      Wait you don’t game on a GB200?

    • @henrismith7472
      @henrismith7472 1 month ago

      @@theuserofdoom Anything less than an NVIDIA Tesla H100 is for poor people. I game on a DGX SuperPod. Each liquid-cooled rack features 36 NVIDIA GB200 Grace Blackwell Superchips (36 NVIDIA Grace CPUs and 72 Blackwell GPUs) connected as one with NVIDIA NVLink.

    • @honestgoat
      @honestgoat 1 month ago +8

      @@theuserofdoom You wouldn't. A consumer graphics card would smash it at gaming.

    • @honestgoat
      @honestgoat 1 month ago +10

      Considering how quickly this has come along, it'll prolly be a 6090, bro.

    • @cruz1ale
      @cruz1ale 1 month ago +3

      @@honestgoat nice

  • @SuperNick964
    @SuperNick964 1 month ago +492

    bro has a comma every 3 words

    • @nijario9690
      @nijario9690 1 month ago +20

      🤣🤣🤣

    • @PedroOliveira-sl6nw
      @PedroOliveira-sl6nw 1 month ago +17

      OMG! I will never not notice that

    • @HenriFaust
      @HenriFaust 1 month ago +28

      That's way better than no punctuation at all.

    • @Acheiropoietos
      @Acheiropoietos 1 month ago +15

      The power of commas should not be underestimated, so there.

    • @grey7603
      @grey7603 1 month ago +17

      But it gives this man his charm… I enjoy hearing him talk about all these things.

  • @RadianceFields
    @RadianceFields 1 month ago +116

    This paper is actually a complete departure from Gaussian Splatting, but both of these methods create a Radiance Field. Also, the vast majority of research will be transferable between the two methods. I interviewed the first author of this paper if you want to learn more about what this method can do! ruclips.net/video/1vxn4M1fO6c/видео.html

    • @sean748
      @sean748 1 month ago

      Do they still do the ML fitting to generate the particles from the source data?

    • @bmqww223
      @bmqww223 1 month ago

      Can you tell me what the implications of this are for a not-so-smart person like me? What I want to know is: can I run path-traced games at 60 fps on a mid-range GPU like an RTX 3060/4060?

    • @xXJeReMiAhXx99
      @xXJeReMiAhXx99 1 month ago +2

      @@bmqww223 I don't study this stuff, but it doesn't exist in gaming at all. The answer to your question is pretty much no.

    • @monad_tcp
      @monad_tcp 1 month ago +1

      radiance fields are kind of magical

    • @cushycreator9024
      @cushycreator9024 1 month ago

      @@xXJeReMiAhXx99 Unreal Engine 5 and PlayCanvas both support Gaussian Splatting.

  • @firefox8713
    @firefox8713 1 month ago +170

    What a time to be two papers down the line!

    • @WilsoniStudios
      @WilsoniStudios 1 month ago +2

      😂😂😂

    • @hombacom
      @hombacom 1 month ago +4

      But we said that two papers ago, and we're still not there.

    • @kwiz2411
      @kwiz2411 1 month ago

      What a time to lie down on two papers!

  • @Aarrenrhonda3
    @Aarrenrhonda3 24 days ago +125

    I keep telling both friends and family: now is the perfect time to own a tech stock. With everything going on, and seeing how the world is being run by AI, tech is here to stay and you don't want to miss it.

    • @Peterl4290
      @Peterl4290 24 days ago +3

      With everything going on in the market, my advice to anyone starting out is to seek guidance, as it's the best way to build long-term wealth while managing your risk and emotions with a passive investing strategy.

    • @larrypaul-cw9nk
      @larrypaul-cw9nk 24 days ago +2

      I took charge of my portfolio but faced losses in 2022. Realizing the need for a change, I sought advice from a fiduciary advisor. Through restructuring and diversification with dividend stocks, ETFs, Mutual funds, and REITs, my $1.2M portfolio surged, yielding an annualized gain of 28%.

    • @sabastinenoah
      @sabastinenoah 24 days ago +2

      Do you mind sharing info on the adviser who assisted you?

    • @larrypaul-cw9nk
      @larrypaul-cw9nk 24 days ago +1

      Annette Christine Conte. One of the finest portfolio managers in the field, and widely recognized. Just research the name; you'd find the necessary details to work with and set up an appointment.

    • @sabastinenoah
      @sabastinenoah 24 days ago +1

      Thank you for sharing. It was easy to find her, and I scheduled a phone call with her. She seems proficient, considering her résumé.

  • @theaslam9758
    @theaslam9758 1 month ago +119

    "Two Minute Papers released a video 2 minutes ago"

  • @StefanoBorini
    @StefanoBorini 1 month ago +122

    Great techniques soon to be used for videogames and movies with awful plots.

    • @Kutsushita_yukino
      @Kutsushita_yukino 1 month ago +10

      you forgot to add: awful triple-A games

    • @seekyunbounded9273
      @seekyunbounded9273 1 month ago +3

      Good thing I care more about atmosphere and vibe; I would hate to not like Dishonored just because the plot wasn't outstanding.

    • @Unliveify
      @Unliveify 1 month ago +2

      Sweet Baby has blacklisted you

    • @AngryApple
      @AngryApple 1 month ago +3

      Interactivity of radiance fields is still up in the air.
      Right now, depending on the use case, 3DGS is by far the best way to represent a single object. It's fast and highly detailed if captured right and trained properly.

    • @MischieviousJirachi
      @MischieviousJirachi 1 month ago

      LMFAO

  • @NigraXXL
    @NigraXXL 1 month ago +41

    This is unbelievable. If we could get Gaussian splatting developed a bit further, to the point where it can be rigged and animated, it would go so well with this new light simulation and could make something like Unreal's Nanite level of detail actually available on more hardware.

    • @mattmexor2882
      @mattmexor2882 1 month ago +4

      Can this technology resolve object boundaries? Can you move objects around in the scene and know when they collide with each other?

    • @NigraXXL
      @NigraXXL 1 month ago

      @@mattmexor2882 I don't know, but I'd imagine it should be closely related to animation since it's about grouping and defining relationships between points

    • @puppergump4117
      @puppergump4117 1 month ago +1

      @@mattmexor2882 I don't think model-model collision has much to do with lighting; it'd probably just clip through, and whatever's intersecting won't influence lighting.

    • @seekyunbounded9273
      @seekyunbounded9273 1 month ago

      @@mattmexor2882 Games generally don't use the visual mesh for collisions anyway; they add their own collision boxes and capsules with simpler geometry depending on the need.

    • @TimvanHelsdingen
      @TimvanHelsdingen 1 month ago

      @mattmexor2882 You could just place collider objects into the Gaussian splats that move with them, I think (the proxy idea is sketched after this thread).
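
A minimal sketch of the collider-proxy idea from this thread: physics runs against a few hand-placed primitives while the splat cloud stays render-only. Everything below is a hypothetical illustration, not any engine's API.

```python
from dataclasses import dataclass, field

@dataclass
class SphereCollider:
    center: tuple[float, float, float]
    radius: float

    def overlaps(self, other: "SphereCollider") -> bool:
        # Two spheres overlap when the distance between centers is
        # less than the sum of their radii.
        dx, dy, dz = (a - b for a, b in zip(self.center, other.center))
        r = self.radius + other.radius
        return dx * dx + dy * dy + dz * dz < r * r

@dataclass
class SplatObject:
    name: str
    gaussians: list  # render-only particle data; physics never reads it
    colliders: list[SphereCollider] = field(default_factory=list)

    def collides_with(self, other: "SplatObject") -> bool:
        # Collision is decided purely by the proxy primitives.
        return any(a.overlaps(b) for a in self.colliders for b in other.colliders)

# Usage: two scanned objects, each with a crude proxy sphere that moves with it.
bike = SplatObject("bike", gaussians=[], colliders=[SphereCollider((0.0, 1.0, 0.0), 1.0)])
wall = SplatObject("wall", gaussians=[], colliders=[SphereCollider((0.5, 1.0, 0.0), 1.2)])
print(bike.collides_with(wall))  # True: the proxies overlap even though the splats are ignored
```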

  • @phantomabid
    @phantomabid 1 month ago +29

    Research Papers: RTX ON

  • @KBRoller
    @KBRoller 1 month ago +9

    So, step 1: fly a drone through an environment to get photos from a bunch of angles. Step 2: process those images into Gaussian splat data. Step 3: render a fully ray-traced clone of the environment in 3D in real time, complete with any additional 3D objects you want to add to the scene? (That pipeline is sketched below.)
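
That is the shape of the pipeline. A sketch under loose assumptions, where every function body is a hypothetical stub rather than a real API (the actual steps would be a capture tool, a splat optimizer, and the ray tracer from the paper):

```python
def capture_photos(flight_path):
    # Step 1: gather photos of the environment from many viewpoints.
    return [f"frame_{i:04d}.jpg" for i, _ in enumerate(flight_path)]

def fit_gaussians(photos):
    # Step 2: optimize a set of 3D Gaussians (position, covariance, opacity,
    # view-dependent color) until their renders match the photos.
    return {"gaussians": [], "trained_on": len(photos)}

def ray_trace(scene, extra_objects, camera):
    # Step 3: trace rays against the Gaussian particles plus any inserted
    # synthetic objects, so both share shadows, reflections, and refractions.
    return f"frame from {camera} with {len(extra_objects)} inserted objects"

scene = fit_gaussians(capture_photos(flight_path=range(300)))
print(ray_trace(scene, extra_objects=["glass_sphere"], camera="orbit_cam"))
```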

  • @ninjatogo
    @ninjatogo 1 month ago +23

    I love Gaussian splatting technology. I just started creating some of my own to record memories of interesting places or things instead of taking photos. That way I'll be able to revisit them and share them later with a VR headset.

    • @ImpostorModanica
      @ImpostorModanica 1 month ago +3

      How is that done? Is there a tutorial somewhere?

    • @NicoAssaf
      @NicoAssaf 1 month ago +2

      @@ImpostorModanica I wanna know too

    • @ninjatogo
      @ninjatogo 1 month ago

      @@ImpostorModanica I'm using Scaniverse on an iPhone

    • @ninjatogo
      @ninjatogo 1 month ago

      @@NicoAssaf I'm using Scaniverse on an iPhone

    • @GewelReal
      @GewelReal 1 month ago

      bro living in 2077

  • @DASPRiD
    @DASPRiD 1 month ago +5

    The holy grail of graphics is always just two papers down the line!

  • @fischX
    @fischX 1 month ago +11

    Some 3D glasses and a bathtub away from the Matrix

  • @blakedehaas
    @blakedehaas 1 month ago +3

    I work on atmospheric machine learning research; visualization of advanced particle simulations, like atmospheric particle simulation, sounds like a perfect application for this technology!

  • @vlividTV
    @vlividTV 1 month ago +8

    Amazing tech. Can't wait for it to become available.

  • @cbuchner1
    @cbuchner1 1 month ago +10

    Gaussian raysplatting? Splattracing?

  • @CYI3ERPUNK
    @CYI3ERPUNK 1 month ago

    Voxels to triangles: "You could not live with your own failure. Where did that bring you? Back to me." XD
    What a time to be alive indeed.
    Every day we get another step closer to the simulation. Never mind two papers down the line, where is this going in the next several decades? It's mind-blowing.

  • @Elta305
    @Elta305 1 month ago +2

    What a time to be alive!
    I hope you will cover what gets announced at Humanoids 2024 in November!

  • @julinaut
    @julinaut 1 month ago +2

    I think what these papers really need in order to improve the world we live in is attention. You're doing God's work, Károly!

  • @BeyondTomorrow-xj2mu
    @BeyondTomorrow-xj2mu 1 month ago +4

    The funny thing is, when I tell my highly educated family members about AI and so forth, they don't believe what I tell them, and they have never heard about AI at all.

  • @theflare6306
    @theflare6306 1 month ago +1

    This is one of the first times I've ever gotten goosebumps from reading a paper.

  • @sproket343
    @sproket343 1 month ago +1

    Great video. And the icing on the cake is the narration by Ren from Ren & Stimpy.

  • @jasonshere
    @jasonshere 1 month ago +2

    Very cool; ultra photo-realistic video games and 3D rendered movies/elements are very close.

  • @xBosil
    @xBosil 1 month ago +1

    That is so crazy, almost unbelievable. It is very exciting to see progress like that.

  • @HarhaMedia
    @HarhaMedia 1 month ago +13

    This is very cool! Heaps more interesting than the generative AI papers.

  • @cyber_robot889
    @cyber_robot889 1 month ago +1

    Finally a non-AI-narrated video on this channel!

  • @AdamMi1
    @AdamMi1 1 month ago

    Great to see some light transport simulation content again!

  • @HossLUK
    @HossLUK 21 days ago +1

    there must be an infinite number of periods in this guy's script

  • @benveasey7474
    @benveasey7474 1 month ago +4

    Awesome if this could be used in Blender for fast/lightweight ArchVis backgrounds.

  • @AironExTv
    @AironExTv 1 month ago

    Convincing simulation, training, and game 3D displays look to be even better now. I didn't think I'd see the day when real-time ray tracing would start to become this fast and convincing.

  • @maxfmfdm
    @maxfmfdm 1 month ago +1

    I like this rethinking of rendering techniques.

  • @aresaurelian
    @aresaurelian 1 month ago

    1:1. Excellent performance is a must. Love it.

  • @mjfabian86
    @mjfabian86 1 month ago +6

    Károly, is there an AI where I can give it a Two Minute Papers video, and the output is the same video but the narration doesn't pause after every word? Thanks

  • @kenrampingplay
    @kenrampingplay 28 days ago

    The bike scene really looks like real life! What a time to be alive indeed.

  • @JP-rz2pp
    @JP-rz2pp 1 month ago +4

    Nvidia sharing their research for free? Now I’m impressed.

    • @iloveblender8999
      @iloveblender8999 1 month ago

      As long as you buy their hardware to run their software, they do not mind.

    • @offlinegamer6756
      @offlinegamer6756 1 month ago

      @@iloveblender8999 It's a wise move, since it's clearly something ONLY their AI-based chips can run!

    • @JP-rz2pp
      @JP-rz2pp 1 month ago

      @@iloveblender8999 Sharing technical research is not the same as giving away free software that runs only on their hardware; that's why I was impressed.

  • @benmcreynolds8581
    @benmcreynolds8581 1 month ago +6

    I'm not hating at all; it's amazing that we can compute these things nowadays. But I just cannot help imagining what we could achieve if we put all that effort and processing power into physics effects, interactive environments, reactive damage effects, particle effects, and artistic aesthetics instead of mostly focusing on realism. We definitely need stealth games to come back and blend in the new advancements we have when it comes to lighting and all that...

    • @jakubjakubowski944
      @jakubjakubowski944 1 month ago +2

      The issue lies in the fact that the cost of buying processing power is shifted to the consumer, while the cost of developing the program falls on the developer.
      This is why we often see less optimized, visually underwhelming games that still require the best GPUs available on the market; the savings on optimization are passed on to the consumer, who compensates with better hardware.
      This also explains why we rarely see truly interactive games with unique systems outside of the indie scene.

  • @dXXPacmanXXb
    @dXXPacmanXXb 1 month ago +4

    I feel like we are seeing the same papers over and over again. I keep seeing the same clips every video and never know whether it's new stuff or not.

    • @SupersonicII
      @SupersonicII 1 month ago +5

      I think there just isn't enough footage to fill the video with all-new clips. It also helps to be able to compare against previous papers. If you really aren't paying attention, the publication year is a big hint. :)

  • @ryvyr
    @ryvyr 1 month ago

    I can see a plausible future where this degree of realistic fidelity has become so efficient that it smoothly renders at 90 fps in our full-face VR/AR thin mask, which reproduces smell and taste ^^

  • @3D-vid
    @3D-vid 1 month ago

    As a day-to-day CGI artist, I can't wait for them to bring this over to Blender. I'm working with ray traces all day, so this would greatly improve my output! Can't wait to see what this will bring.

  • @gaijintendo
    @gaijintendo 1 month ago +2

    Corridor Crew will not be happy with those shadows and extra dark shadows.

  • @Sikanda.
    @Sikanda. 1 month ago

    I was just looking at relightable Gaussian avatars yesterday. This tech is truly incredible; I can't wait to see this in games, especially VR.

  • @PySnek
    @PySnek 1 month ago +2

    I think the AI filters will smash everything in the next few years.

    • @pacomatic9833
      @pacomatic9833 1 month ago

      They're too intensive to be done in real time and too unstable to actually work.

  • @roguestargun
    @roguestargun 1 month ago

    This means gaming with splats will be possible one day? What a time to be alive!

  • @cjjuszczak
    @cjjuszczak 1 month ago +1

    I am constantly STUNNED at Nvidia's investment in ray tracing, and the results, and much of it is publicly open too o.O

  • @Outmind01
    @Outmind01 1 month ago

    This reminds me of the stuff a company called Euclideon was touting around 10 years ago. They had a tech demo that showed photoreal environments and thousands of objects in a single scene running in real time due to everything being based on... I forget what, voxels maybe? In any case, they had that demo but then disappeared completely.

  • @BroHazeman
    @BroHazeman 1 month ago +1

    This new technique was developed by Nvidia & Nintendo for the next-gen Switch project ❤👍

  • @adabujiki
    @adabujiki 23 days ago

    Lol I appreciate this type of commentary and coverage.

  • @Br3ntable
    @Br3ntable 1 month ago

    Job well done!

  • @r6scrubs126
    @r6scrubs126 1 month ago

    ok these shots actually look like real life now :O

  • @cainzjussYT
    @cainzjussYT 1 month ago

    Waait a minute. Gaussian splatting already did a pretty good job of capturing the specular reflections from the scanned environment.

  • @oisiaa
    @oisiaa 1 month ago

    This tech is going to be incredible in VR!!!

  • @blipblop47
    @blipblop47 1 month ago

    This made me think of the video that explained how water was generated in the movie Antz back in 1998.

  • @RAVE_ZERO
    @RAVE_ZERO 1 month ago +1

    Iiiiiii thiiiiiink thiiiiis iiiiiis amazziiiiiiing

  • @mu.co.5018
    @mu.co.5018 1 month ago +2

    Can you get an accent coach, please?

  • @hotfightinghistory9224
    @hotfightinghistory9224 1 month ago

    This feels like alien technology! WOW!

  • @brappineau4161
    @brappineau4161 1 month ago

    Absolutely incredible

  • @sovo1212
    @sovo1212 1 month ago +1

    3:33 Where can I download this?

  • @FinalLightNL
    @FinalLightNL 1 month ago +14

    Bro, I love your videos, but can you please not pause every 2-3 words? It doesn't make it more interesting.
    It's my one and only negative about this channel; for the rest, I love your enthusiasm and your work bringing us graphics tech news.

    • @techpiller2558
      @techpiller2558 1 month ago

      I think it gives the speech some texture to grab onto.

  • @t1460bb
    @t1460bb 1 month ago

    the future is bright!

  • @Charles_Bro-son
    @Charles_Bro-son 1 month ago +10

    Looks good, but everything is static. How difficult is it to animate these points compared to polygons?

    • @michaelleue7594
      @michaelleue7594 1 month ago +3

      Look at the part where they introduce a glass object into the image and change its properties. That is, for all intents and purposes, what the animation process would be like. I get why it doesn't register to you as animation, since it's happening in real time, but that's where we're at. Computer graphics now work like claymation.

    • @Charles_Bro-son
      @Charles_Bro-son 1 month ago +1

      @@michaelleue7594 Looked to me like changing the refraction value and watching the result in real time. I was hinting more at character animation, foliage affected by wind (...), those sorts of things. Thinking back, I believe it was similar with voxels: also difficult to animate.

    • @offlinegamer6756
      @offlinegamer6756 1 month ago +1

      @@Charles_Bro-son You're right: it's good with static objects, but we already have photogrammetry for that. And to calculate the X/Y/Z of every particle on a moving object, character, or foliage composed of billions of them, every frame, as fast as possible... I can't imagine the raw power you'd need. I think a 3D engine aimed at gaming will use a mix of this new technique and rasterization for moving objects!

  • @ruperterskin2117
    @ruperterskin2117 1 month ago

    Appreciate ya. Thanks for sharing.

  • @thehahahaha88
    @thehahahaha88 1 month ago

    Thanks, 6 Minute Papers

  • @ixenroh
    @ixenroh 1 month ago +5

    I'd really appreciate it if, when we're given an algorithm's "frames per second", we were also told how parallelizable it is and what hardware it ran on. 10-78 frames per second on a hyper-cluster of H200s isn't that impressive.

    • @smeenuar
      @smeenuar 1 month ago +3

      Definitely true; this tech isn't coming to games soon if you need a $200,000 system to run it.

    • @Vladtmal
      @Vladtmal 1 month ago +3

      The paper says they used a single RTX 6000 Ada card.
      Pretty much a 4090 with double the memory. The 4090 even beats it in gaming benchmarks.
      This is a research project into rendering technology; the exact fps isn't really relevant, just the improvement versus other methods on the same hardware.

    • @ixenroh
      @ixenroh 1 month ago

      @@Vladtmal Yes, on the same hardware, but the hardware does matter, so we know it isn't a number thrown into the air.

  • @Songfugel
    @Songfugel 1 month ago

    Watching this channel, I feel more and more redundant every day. I hope our government will find an ethical and smart way to use this redundancy to ease people's lives and decrease the price of projects that used to be deemed impossible due to labour costs and the price of highly educated workers. Even if some of those fields might not be eased much by these advances, they should free up human resources from other competing fields.
    It is such a scary yet interesting time to be alive.

  • @MGPL_
    @MGPL_ 1 month ago

    I think we might be close to revamping a lot of old PC titles, like a turbo shader/wrapper on existing games without needing development.

  • @zero-loot5699
    @zero-loot5699 1 month ago

    I really hope we can one day see this technology implemented in video games, and I hope it gets done PROPERLY. Whether ultra-realistic graphics benefit a game obviously depends on the style of the game. But take a game like Dead Space or Dying Light: games like these would greatly benefit from hyperrealistic graphics. Characters could become even more fleshed out through much more organic movements and much more detailed faces, and the environment would be a lot more immersive through ultra-realism. Here it would be beneficial. Additionally, if games eventually incorporate generative AI to dynamically generate voice lines and maybe even side quests (perhaps with guidelines predetermined by the devs for the AI), they could achieve a completely new level of immersion and realism by being dynamic to a degree that cannot be achieved through pre-made objectives, dialogues, etc.
    However, this will require a lot of work. 3DGS, and the technologies based on it, are still in very early stages, and so is generative AI. If this tech wants to find its way into game development, it needs to come in the form of an engine like Unreal Engine with a similar or better feature set. Otherwise, if it's too different while not offering as much, it won't be picked up. The switch needs to be acceptable while also offering something in return. The same goes for any generative AI that might one day be used in games: it needs to work well. That means it has to be trained on a lot of controlled, high-quality data in order to produce high-quality outputs. A lot will change here in the next, say, 15 years, and who knows, maybe in 15 years games will finally incorporate all these technologies. IF. THEY. ARE. DONE. PROPERLY. I'd love to see it.

  • @zonea4860
    @zonea4860 1 month ago +1

    Finally something interesting and not AI-related. I mean, I don't want to say AI isn't interesting, but I just got a bit tired of it.

  • @hakarthemage
    @hakarthemage 1 month ago +6

    That's going to be great for museum displays, architects, etc.
    I do wonder how it will handle non-static objects, though.

  • @tgsredfield
    @tgsredfield 1 month ago

    I dream of the day Street View will use this kind of technology.

  • @soonts
    @soonts 1 month ago +7

    The rendering tech is cool, I guess, but useless without content to render. It needs other pieces of the puzzle to take off:
    an affordable 3D camera that captures scenes in this format for consumer use; an ultra-HDR 12-bit 8K pro camera for content production; modelling support in 3ds Max, Maya, and Blender for use in games and similar.

    • @chsi5420
      @chsi5420 1 month ago

      It'll probably be a while before this type of rendering can become widely viable in real-time applications. But I could see it being used for CG rendering and VFX in the near future.

  • @TK_TheFURY
    @TK_TheFURY 1 month ago

    I'm confused. By particles, do you mean a point cloud system, similar to Euclideon's Unlimited Detail?

  • @nutzeeer
    @nutzeeer 1 month ago

    They are getting closer to raymarching, where geometry is represented purely by math. No particles or vertices needed, and it's super fast! (A minimal raymarcher is sketched below.)
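
For reference, the core of that idea is sphere tracing against a signed-distance function. A minimal sketch, assuming the whole scene is one unit sphere:

```python
import math

def sphere_sdf(x, y, z, radius=1.0):
    # Signed distance to a sphere at the origin: negative inside, zero on the surface.
    return math.sqrt(x * x + y * y + z * z) - radius

def raymarch(ox, oy, oz, dx, dy, dz, max_steps=128, eps=1e-4):
    t = 0.0
    for _ in range(max_steps):
        d = sphere_sdf(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < eps:       # close enough: the ray hit the surface
            return t
        t += d            # safe step: nothing in the scene is closer than d
        if t > 100.0:     # the ray escaped the scene
            break
    return None

# Camera at z = -3 looking down +z: the ray should hit the sphere at t = 2.
print(raymarch(0.0, 0.0, -3.0, 0.0, 0.0, 1.0))  # ~2.0
```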

  • @aymericrichard6931
    @aymericrichard6931 1 month ago +1

    Waiting for this to come to 3D posing tools like DAZ.

  • @minikame2272
    @minikame2272 1 month ago

    I mean, in principle this is essentially just caching the traced paths, which is clearly no small feat, but it does compromise somewhat on the flexibility afforded by truly real-time RT: it will work great as you move around, but performance could hiccup when the lighting conditions change substantially. Nothing insurmountable, but it might need to be kept in mind by devs. (The generic caching pattern is sketched below.)
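
To make that speculation concrete, the generic pattern looks something like this. This is not the paper's method, just the textbook "cache expensive results, invalidate when lighting changes" idea, with made-up names:

```python
def quantize(v, cell=0.25):
    # Bucket continuous positions/directions so nearby queries share an entry.
    return tuple(round(c / cell) for c in v)

class RadianceCache:
    def __init__(self):
        self.entries = {}
        self.lighting_version = 0     # bumped whenever lights move or change

    def lookup(self, position, direction, compute):
        key = (self.lighting_version, quantize(position), quantize(direction))
        if key not in self.entries:
            self.entries[key] = compute()   # the expensive path trace runs once
        return self.entries[key]

    def on_lighting_changed(self):
        self.lighting_version += 1          # old entries become unreachable

cache = RadianceCache()
radiance = cache.lookup((1.0, 0.5, 2.0), (0.0, 1.0, 0.0), compute=lambda: 0.8)
cache.on_lighting_changed()                 # e.g. the sun moved: everything recomputes
```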

  • @bowlingguy7755
    @bowlingguy7755 1 month ago

    Really amazing!

  • @KryyssTV
    @KryyssTV 1 month ago +5

    Given that 8 GB graphics cards are not suitable for Gaussian splatting, and this technique uses around half the usual RAM, there is still a lot of work needed to reduce memory demands (rough numbers are sketched after this thread).

    • @03chrisv
      @03chrisv 1 month ago +3

      Or, more likely, this technology won't become mainstream in video games until the average budget graphics card has 12 GB of VRAM and PCs with 32 GB of RAM are the norm. If this takes 4 to 5 more years to happen, then so be it. We have to move on from 8 GB graphics cards; we can't keep catering to such old and low-end hardware.

    • @SaschaRobitzki
      @SaschaRobitzki 1 month ago

      @@03chrisv VRAM increases have taken way too long in recent years.

    • @KryyssTV
      @KryyssTV 1 month ago

      @03chrisv Given that Nvidia is driving towards a 100% AI rendering pipeline that does away with polygons entirely, there is merit in switching to a particle-based rendering solution for lighting, in preparation for geometry eventually becoming particle-based too.
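
Back-of-envelope arithmetic behind this thread, assuming the standard 3DGS parameterization (59 floats per Gaussian: 3 position, 3 scale, 4 rotation, 1 opacity, 48 spherical-harmonic color coefficients); real implementations vary and often compress heavily:

```python
FLOATS_PER_GAUSSIAN = 3 + 3 + 4 + 1 + 48   # = 59 in the standard 3DGS layout
BYTES_PER_FLOAT = 4                        # fp32

def vram_gb(num_gaussians: int) -> float:
    # Raw parameter storage only: no framebuffers, BVH, or training overhead.
    return num_gaussians * FLOATS_PER_GAUSSIAN * BYTES_PER_FLOAT / 1e9

for n in (1_000_000, 3_000_000, 10_000_000):
    print(f"{n:>10,} splats -> {vram_gb(n):.2f} GB of raw parameters")
# 10M splats is already ~2.4 GB before any rendering overhead,
# which is why 8 GB cards get tight quickly.
```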

  • @greenstonegecko
    @greenstonegecko 1 month ago

    As someone who knows about Gaussian splatting: it's an incredible tech. But I can't believe they are trying to merge 3DGS with RTX algorithms... it's the holy grail.
    2:56 The metal bowl has Gaussian splatting artifacts :(

  • @WarrenWolfy
    @WarrenWolfy 1 month ago

    Better looking fur?!? Rendered fast?!? I'm in!!!

  • @satellite964
    @satellite964 1 month ago

    Things are getting crazy in real-time graphics

  • @JPJosefPictures
    @JPJosefPictures 1 month ago +1

    It seems to me that in a few years we'll be able to create worlds in a computer that are more real than reality. Maybe then it'll be called realerlity.

  • @--Arthur
    @--Arthur 1 month ago

    If we can do this now, with technology today, we definitely live in a simulation.

  • @KevinHart87
    @KevinHart87 1 month ago

    I suspect this will be the method for converting real-time generative AI images into 3D, rather than trying to create traditional game geometry.

  • @Ariel-s1t
    @Ariel-s1t 1 month ago

    i, love, this, content, thank you.

  • @tomaszfalkowski7508
    @tomaszfalkowski7508 1 month ago

    I remember when Apple users and AMD users mocked Nvidia ray tracing. And now they all love it.

  •  1 month ago

    I'm starting to think that within 10-15 years we'll reach a point where it's pretty much impossible to spot changes in graphics quality between generations of GPUs.

  • @bonobo3748
    @bonobo3748 1 month ago

    Where is your research merch?
    "What a time to be alive"
    "Hold on to your papers"

  • @Johan-rm6ec
    @Johan-rm6ec 1 month ago

    Every time I start my Windows computer and see a new landscape photo, I realise today's GPUs are still at the toddler stage.

  • @tranceemerson8325
    @tranceemerson8325 1 month ago

    The Unlimited Detail guy has entered the chat

  • @IvarDaigon
    @IvarDaigon 1 month ago

    It looks like the blurry patches are in the periphery, so if you were playing a fast-paced game in real time it might look like motion blur. In other words, depending on the application, they may not even need to fix it.
    Would I put up with a bit of blur in order to have a game that looks like realistic 3D video? Hell yes.

  • @BillyViBritannia
    @BillyViBritannia 1 month ago

    A literal game changer if this gets picked up by Epic.

  • @karlstein9572
    @karlstein9572 21 days ago

    I tried a demo of Gaussian splatting, and it was very fluid on my RTX 2080S + Intel 9700K. However, since these are point clouds, you must be far enough from the "object" or you'll see all the points, which breaks immersion.

  • @Imperial_Dynamics
    @Imperial_Dynamics 1 month ago

    NVIDIA has the best research

  • @ilariaprescott2002
    @ilariaprescott2002 1 month ago +1

    Okay, now if someone can find a way to train a diffusion model on this, we can get real-time generative 3D VR environments > Holodeck.

  • @FranciscoBerkemeier
    @FranciscoBerkemeier 1 month ago

    Great video! Out of curiosity, have you ever covered Fourier Neural Operators for solving PDEs on your channel, or plan to?

  • @haidora9321
    @haidora9321 1 month ago

    I have the same LEGO bonsai shown at 2:45 lol

  • @tablab165
    @tablab165 1 month ago

    I feel like I'm smoking the papers at this point!

  • @LarryPanozzo
    @LarryPanozzo 1 month ago

    Polyscope! 👏🏼

  • @tiagotiagot
    @tiagotiagot 1 month ago

    But are they actually performing light calculations on the Gaussian splatting particles, or just using them essentially as a sort of volumetric "skybox", with a one-way interaction between the splats and the ray-traced objects, leaving unchanged the baked-in angle-dependent coloring that splats already had? (A sketch of the per-particle math follows.)
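
For what tracing rays against the particles themselves involves: the standard closed form for where a ray picks up the strongest response from one Gaussian is below. This is general radiance-field math rather than code from the paper; the function name and example values are illustrative.

```python
import numpy as np

def max_response_along_ray(origin, direction, mean, cov):
    # The response exp(-0.5 * (x - mu)^T Sigma^-1 (x - mu)) along the ray
    # x(t) = o + t*d is itself a Gaussian in t; setting its derivative to
    # zero gives the peak parameter t* in closed form.
    p = np.linalg.inv(cov)                  # Sigma^-1 (precision matrix)
    t_star = (direction @ p @ (mean - origin)) / (direction @ p @ direction)
    r = origin + t_star * direction - mean
    return t_star, float(np.exp(-0.5 * r @ p @ r))

o = np.array([0.0, 0.0, 0.0])
d = np.array([0.0, 0.0, 1.0])               # unit-length ray direction
mu = np.array([0.1, 0.0, 3.0])              # particle slightly off-axis
cov = np.diag([0.04, 0.04, 0.09])           # anisotropic particle
print(max_response_along_ray(o, d, mu, cov))  # peak near t = 3, response ~0.88
```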