11:35 I like how when the character stands in front of the film projector, not only does she create a shadow on the wall, but the light that is blocked by her body is shown on her body. Not just the light, but the animated detail of the film. And all of this requires no additional programming as it's an emergent property of the raytracing algorithm.
Who cares about graphics? We still have violent, oppressively evil gameplay. Get back to me when your character can put on a shadow puppet show with the projectors...
@@thrwawyacct Raytracing doesn't force people into violence - culture does that. Last time I checked, no culture was free from criticism on that front.
Nothing new under the sun huh. I remember doing raytracing 30 years ago. Granted, it took 5 hours to render a single frame of a very primitive scene... But arguably raytracing as a concept might even be OLDER than the various tricks we've been using for 3d games in the last 30 years or so. After all, Ray Tracing has always been a much closer approximation of the physics of light than all the weird bucket of rasterisation tricks. I mean, Ray Tracing still does a few weird things - like trace from the camera outwards (where real light obviously comes from the light sources). What I don't get is this 'graininess' stuff with modern 'raytracing'. To my knowledge that's not raytracing, but rather some variant technique (path tracing or whatever).
The thing is, Raytracing has two major flaws, which have plagued it from the start:
1. It is EXTREMELY slow. In the beginning this even made it difficult to use for pre-rendered sequences, where you can justify using a render farm and taking hours to do a single frame. Realtime use was basically out of the question until about a decade ago, and even then it was only in very carefully crafted scenes and limited resolutions.
2. Raytracing (at least, the traditional formulation of it) can't do ambient lighting. AT ALL. Just flat out incapable of it. Thus, you have to combine raytracing with something else to get any ambient lighting effects. Or... use a modified technique such as path tracing. If you see a tech demo calling itself 'raytracing', yet focusing on ambient lighting effects, I'd find that quite suspicious.
Due to the slow performance of raytracing, the original solution was to just use an ambient colour value, which basically means the ambient lighting is handled the same as in any rasterizer... Other options include combining it with Radiosity rendering, which happens to have the opposite problem - radiosity cannot really do specular lighting. Or, path tracing or similar extensions can be used, but that imposes new problems.
Speaking of new problems, the need to use denoising filters with 'Raytracing' solutions seems directly tied to the weakness of raytracing in doing ambient lighting. The approach seems to be one of treating ambient lighting as the sum/average of many specular lighting passes made with random reflection angles... Seems a rather poor choice for it, but I guess they've settled on this method for a reason...
In raytracing, a ray that hits a surface splits into several rays distributed in a hemispherical way that look for the light source. This makes noise-free images but it would be exponentially slower because of the increasing number of rays created with each bounce; that's why raytracing gives pitch-black shadows, and final gathering or photon mapping were used to simulate indirect illumination. Pathtracing just changes the direction of the ray when it hits a surface, therefore it scales linearly with each bounce. The problem is that when a surface is not a perfect mirror, meaning it has some roughness, the ray bounces in a random direction. You need several rays shot through the same pixel to average the result of all the random directions, and you need to do the same with the next pixel and the next one and so on. Every pixel is independent from its neighbor, therefore you need to shoot a lot of rays to make the average more accurate. I think they should have called it pathtracing unless they're doing some dirty tricks under the hood as you say.
Leif Pederson it's easier to rant about something not working properly than taking the time to understand why it struggles to be perfect. I'm glad you took the time to read both comments and even enjoy them, thanks 🙂
@@leecaste I believe that's not exactly correct. What you call raytracing is actually pathtracing. And, as far as I know, in raytracing when the ray collides with a rough object, it sends out a ray towards every light source, to determine whether the object is illuminated by the light, or whether it's in the shadow.
Lenar Gilmanov the first paragraph of my comment says exactly that, if the ray hits a light source, that part of the object is illuminated, if it doesn't hit a light source, that part of the object renders pitch black, it doesn't take into account indirect lighting.
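If anyone wants to see the "average lots of random rays" idea from this thread in actual code, here's a toy I sketched up: one shading point on a floor under a uniform sky, with a sphere blocking part of it. Every number here is invented for illustration, it's not from any engine:

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Sphere occluder hanging above the shading point at the origin. */
static const double sx = 0.0, sy = 1.5, sz = 0.0, srad = 1.0;

/* Does a ray from the origin in direction (dx,dy,dz) hit the sphere? */
static int hits_sphere(double dx, double dy, double dz) {
    double b = -2.0 * (dx * sx + dy * sy + dz * sz);
    double c = sx * sx + sy * sy + sz * sz - srad * srad;
    double disc = b * b - 4.0 * c;
    return disc >= 0.0 && (-b + sqrt(disc)) / 2.0 > 0.0;
}

/* Average many random sky rays; each ray is "light" unless blocked. */
static double estimate_light(int nsamples) {
    double sum = 0.0;
    for (int i = 0; i < nsamples; i++) {
        /* cosine-weighted random direction on the upper hemisphere */
        double u = (double)rand() / RAND_MAX;
        double v = (double)rand() / RAND_MAX;
        double r = sqrt(u), phi = 2.0 * M_PI * v;
        double dx = r * cos(phi), dz = r * sin(phi), dy = sqrt(1.0 - u);
        if (!hits_sphere(dx, dy, dz)) sum += 1.0;
    }
    return sum / nsamples;
}

int main(void) {
    /* few samples = noisy pixel, many samples = converged (~0.56 here) */
    for (int n = 1; n <= 4096; n *= 8)
        printf("%4d samples -> %.3f\n", n, estimate_light(n));
    return 0;
}

Run it and the estimate jumps around wildly with a few samples and settles near 0.56 with thousands, which is exactly the noise-versus-samples tradeoff being described above.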
I'd take compression and data storage over graphic power when it comes to next gen tech development. Games look good enough for now; there should be more stuff to see, more npcs and more stuff to do in games.
the problem is marketing. Pretty screen shots and videos are easier to sell. For a long time developers were making games with 30fps 60FoV in mind because pretty screen shots are (or at the time, were) effective marketing, nevermind that the game is a headache inducing, eye bleeding mess.
Raytracing has always been used in 3d modeling scenes and animation. Gaming has typically been on the low end when it comes to modeling/animation technology because of the nature of the medium.
It's the simulation of motion that the eyes can't process. In some ways, if used right, it keeps the animation from looking odd, kind of like a wire fence without anti-aliasing.
Motion blur exists in real life. But most games use a very cheap and exaggerated "camera motion blur". Most good implementations of motion blur you wouldn't even notice, since they are "movement-based motion blur".
Dedicated hardware is not needed - just software support. Minecraft raytraces on AMD hardware almost as well as on RTX :) That Neon Noir demo from Crytek ran on a Vega 56
Crytek themselves said the Neon Noir demo would run better on RTX cards because of their dedicated hardware, and stated that the Neon Noir demo is inferior to RTX titles in its level of implementation.
Ehhh... Minecraft raytracing... More specifically Optifine DOES NOT USE SPECIAL RT CORES FOR HARDWARE ACCELERATION it's just using good old CUDA... Basically RTX cards are working with RTX off but providing RTX lol Or like a car with Turbo just it's not working... Still gets you from A to B just not as fast lol
One of the biggest things I like about ray tracing is not having to deal with ugly screen space reflections. While they can look good with a lot of tricks to make up for their flaws, nothing is ever perfect and the holes in SSR are really distracting when you see them. They're really obvious in the Yakuza games or in FF15. Otherwise really nice looking games, but with ugly SSR sticking out like a sore thumb. Those games also share aliasing issues too, now that I think of it
Been looking forward to real-time raytracing for almost 20 years. I was surprised you didn't mention that the concept has existed all this time, or that it dominated the animation world in the late 90s and early 2000s.
What's so funny about prey removing the mirror is they literally had a way to render a fake room on a flat panel and give it depth. That would have literally been a mirror
Ugh, which Prey game are you talking about? Prey or Prey? FFS, who the fuck gives the new game in the series the *exact same* name as the old one? Also looking at you, "Doom"...
"the old way" to you, Shamus, isn't really the old way. I cut my gums on this stuff on the Amiga in the 90's with Imagine 3.0 (a raytracing program). It of course was not real time (an hour per 640x480 image of full raytrace rendering) but I always pined for a time when real time ray tracing could happen. It's early days, but I am glad it is starting to happen.
>"the old way" to you, Shamus, isn't really the old way ... in the 90's ... Hasn't it ever occurred to you that you are recalling things from the last millennium? prefixing a statement with "isn't really the ..." doesn't make it automatically false. you are just old, whether you accept it or not. I was playing with ray tracing in 3DSMax in late 90s' when I was but a wee dork... now I am fat, balding old dork living in a world where literal toddlers are playing with phones several dozens times faster than my first PC. I am old as are my memories. just accept the new reality... maybe start saving for a burial plot even...
Yeah the Amiga Raytracing days were great. As you know some later games on the Amiga implemented it such as Super Stardust. Am I of the understanding though that the narrator of this video presumes that Raytracing is a new technique?
@@baburik What the fuck does someone's age have to do with historical facts? Ray tracing being a decades old technique is just reality, whether someone's lived through its development or not.
Played Control with ray tracing, it was absolutely beautiful and its strength was having amazing environments that look real and then having those environments warp and change and get trippy as hell. There is a whole zone just full of fucking clocks, overflowing a story tall, spilling out everywhere, and they all seem to be rendered individually and have physics. Hats off to the level designers and the technical teams on this game. Listening to the protagonist's internal dialog as she stumbles her way through the incredibly hacky (though admittedly creepy) plot on the other hand...
Very well done and descriptive, yet concise. I hope this simplifies things when designing games for, say, Unity. Setting up the lighting in it has always been a pain in the ass to me.
Ray/pathtracing will save a ton of tedious work hours because you just put the lights, adjust the intensity and everything else works by itself. I just hope for developers to work with real world units as much as they can because this gives consistency, something that can be broken easily by artists that have the bad habit of eyeballing everything.
11:08 No they don’t. In real ray-tracers, you can fine-tune the render by specifying limits for various kinds of rays, such as diffuse versus glossy bounces, transmission rays, volumetrics etc. They are not free: every parameter has a quality-versus-computational-cost tradeoff.
8:10 "Billions of photons are pouring out of the screen" That is a gross understatement. It only took me *HALF A GOD DAMN HOUR* to find the (not at all) readily available information I needed to calculate (yeah, the information I wanted isn't even there) the number of photons being emitted from a typical phone display. About 100 quintillion (100 billion billion) photons per second. This number could be off by as much a factor of 1,000 due to the lack of quality information on the subject.
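For anyone who wants the back-of-the-envelope version: here's a rough pass where every input is assumed rather than measured (a ~600 nit screen of ~0.008 m^2, average wavelength ~550 nm, luminous efficacy ~250 lm/W), so treat it as order-of-magnitude only:

\Phi_v \approx \pi L A = \pi \times 600 \times 0.008 \approx 15\ \mathrm{lm}

\Phi_e \approx \Phi_v / (250\ \mathrm{lm/W}) \approx 0.06\ \mathrm{W}

E_\gamma = hc/\lambda \approx 3.6 \times 10^{-19}\ \mathrm{J}

\dot{N} = \Phi_e / E_\gamma \approx 2 \times 10^{17}\ \mathrm{photons/s}

That lands a couple orders of magnitude under the 100 quintillion figure, but well inside the factor-of-1,000 error bar quoted above, and either way "billions" undersells it spectacularly.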
4:05 The Sun also casts a penumbra. The angle of the penumbra is all down to the apparent (angular) size of the light, not to how near or far away it is, and the angular size of the sun is about ½°.
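To put a number on that (a back-of-the-envelope using the ~1/2 degree figure from the comment above): the penumbra width of a shadow edge cast from height h is roughly

w \approx h \tan(0.5^\circ) \approx h / 115

so a railing 1 m above the ground gets a soft edge about 9 mm wide, while a rooftop 100 m up smears it out to nearly a metre. That's why contact shadows look sharp and far-away shadows look soft, whatever the renderer.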
This is 99% accurate, but it's important to mention that current GPUs can also do raytracing and partial raytracing is already being used in major engines like UE4 and Unity. Tim Sweeney from Unreal Engine says that the use of raytracing in Unreal Engine will slowly increase and when graphics cards hit a processing power of 25 TFLOPS, full raytracing will be the new norm. Unfortunately we're still about five years away from that power in high end GPUs. Also, an interesting effect of full raytracing will be a more natural projection. The current rasterization method uses a linear function for the projection of the 3D world onto the 2D screen. This causes an unnatural distortion that becomes more pronounced toward the edges of the screen. With full raytracing it will be trivial to create a natural projection like that of a real life camera or the human eye.
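On that last point, the projection really is just a choice of how you generate the primary rays. Here's a little sketch of my own (not from UE4 or any engine; the fov numbers and the equidistant mapping are just illustrative assumptions) showing a pinhole camera next to a fisheye one, which costs the same in a ray tracer:

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

typedef struct { double x, y, z; } vec3;

/* Pinhole: project through a flat image plane (what rasterization gives
   you); straight lines stay straight, but shapes stretch near the edges. */
static vec3 pinhole_ray(int px, int py, int w, int h, double fov_deg) {
    double half = tan(fov_deg * M_PI / 360.0);        /* half plane height */
    double u = (2.0 * (px + 0.5) / w - 1.0) * half * w / h;
    double v = (1.0 - 2.0 * (py + 0.5) / h) * half;
    double len = sqrt(u * u + v * v + 1.0);
    return (vec3){u / len, v / len, -1.0 / len};      /* camera looks -z */
}

/* Equidistant fisheye: the angle off the view axis grows linearly with
   distance from the image center, like a curvilinear lens or the eye. */
static vec3 fisheye_ray(int px, int py, int w, int h, double fov_deg) {
    double u = 2.0 * (px + 0.5) / w - 1.0;
    double v = 1.0 - 2.0 * (py + 0.5) / h;
    double r = sqrt(u * u + v * v);
    double theta = r * fov_deg * M_PI / 360.0;        /* angle off axis */
    double s = (r > 0.0) ? sin(theta) / r : 0.0;
    return (vec3){u * s, v * s, -cos(theta)};
}

int main(void) {
    vec3 a = pinhole_ray(0, 0, 640, 480, 90.0);
    vec3 b = fisheye_ray(0, 0, 640, 480, 180.0);
    printf("pinhole corner ray: (%.2f, %.2f, %.2f)\n", a.x, a.y, a.z);
    printf("fisheye corner ray: (%.2f, %.2f, %.2f)\n", b.x, b.y, b.z);
    return 0;
}

Swapping one ray-generation function for the other changes the whole projection; a rasterizer is hard-wired to the first.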
Crytek recently stated that they have ray-tracing working in their CryEngine which runs pretty smoothly on cards such as the RX 570. If this is true, then it means that ray-tracing does not need cards as strong as previously thought
a bit about how standard ray tracing works. for every pixel you cast a ray, see what it hits, then from there cast more rays as needed. for example, if you have some glass, you'll need to cast two rays off of the glass, one through the glass (refraction) and one backwards (reflection). after you finish casting those rays, you sum up the colors those rays hit. one popular way to do the ray casting, at least with distance fields, is usually called sphere tracing (a form of ray marching): you take the ray's starting point and calculate the distance between the point and the closest object, then you travel along the ray for that distance. when the distance becomes really small, you know you have hit something.
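Here's that distance-stepping loop as a minimal sketch (toy scene, made-up numbers, just to show the shape of it):

#include <math.h>
#include <stdio.h>

/* Signed distance from (x,y,z) to a unit sphere centered at (0,0,5). */
static double sdf_sphere(double x, double y, double z) {
    double dz = z - 5.0;
    return sqrt(x * x + y * y + dz * dz) - 1.0;
}

/* March from the origin along (dx,dy,dz); returns distance to the hit,
   or -1 if we gave up. */
static double sphere_trace(double dx, double dy, double dz) {
    double t = 0.0;
    for (int i = 0; i < 128; i++) {
        double d = sdf_sphere(t * dx, t * dy, t * dz);
        if (d < 1e-4) return t;    /* close enough: we hit something */
        t += d;                    /* safe step: nothing is closer than d */
        if (t > 100.0) break;      /* flew past the whole scene */
    }
    return -1.0;
}

int main(void) {
    printf("straight ahead: t = %.4f (expect ~4)\n", sphere_trace(0, 0, 1));
    printf("off to the side: t = %.1f (miss)\n", sphere_trace(1, 0, 0));
    return 0;
}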
10:50 : that's sadly not true. For each phenomenon you need an extra ray, therefore it's not free. You can't figure out if the floor is shadowed and its reflection with one ray, simply because they go in different directions.
Two ways to take it. First, it's developmentally free - you don't need to put in almost any extra effort to support reflections, it's just a slightly different path. Second, it's free in the sense that if you're already spending five rays per pixel, it costs the same if those are five diffuse rays or three diffuse rays, one reflected ray and one refracted ray.
Not exactly. Each ray is essentially one path of a light ray. The calculations that will be performed on those paths, those are distinct for each effect, not the rays.
@@FeepingCreature That is exactly why it comes for free: to figure out what path a ray took from a light source in the scene to the point you're analysing, you already had to account for and calculate those things. All of that information is already baked into the ray's information.
@@louisvictor3473 That's not how it works with path tracing though, generally you won't be looking at one unique ray per pixel to the light source but a bunch of stochastic rays, because depending on material a ray may have come in via diffuse reflection or specular reflection or translucency or subsurface scattering, and directly from the lightsource or an indirect bounce from deeper in the scene, etc. So at any step of the path you'll be making more or less random choices which way to "have come from".
People has always been on about making graphix more shiny. There was once a time, a man didn't just want it to be the shiniest. He wanted it to be *stable*, even if it means that it isnt the shiniest. That man was John Carmack, 5th dimension hyper quantum robot in a meatsuit. At a time where Ultima Underworld was shitting out semi-polygonal graphics, he made a stable raycasting engine. A year after that, nope, still no poli. He just made ***THE MOST PORT-ABLE GAME KNOWN TO MAN*** Using a pseudo-3D engine. To me, graphic isnt about looking shiny. Its about being cheap, affordable and easily modifiable/accessable
The funny thing is that if we could somehow harness the best of CRT tech, we wouldn't care so much about "more pixels". Digital Foundry did a video where they ran Control on a CRT at like 720p I think, maybe not even that, and the game looked gorgeous and wasn't even blurry because of how the analog signal handles pixels (casting colors through electron beams in the cathode tube). It was crystal clear and they could turn literally every setting all the way up and run the game at 120fps consistently with perfect black levels. It was magical.
i don't understand this mindset, it seems to have appeared out of thin air and feels like more of a thing the industry is pushing people to want rather than a naturally occurring rise in standards.
@@evo2542 have you ever closed your eyes, covered one eye with your hand so no light gets through your eyelid, and looked up at the sun (eyes still closed)? it upsets the colour balance in your eyes and for a while the uncovered eye will see much bluer hues than the covered eye and if you do it right you can get a 3d anaglyph. where was i going with this... oh yeah, the effect fades with time. 30mph feels fast until you go 60mph.
Solid video. Just find it amusing that when talking about types of lighting at 5:30 the photo referring to bloom lighting is a street I often frequent in Seoul.
Glad to see someone on YouTube actually excited about raytracing. Nearly everything else I've seen has been very negative and being an early adopter of the RTX card with an empty wallet, it's nice to get a little bit of reassurance that I didn't make a bad choice!
Being the first one to buy a brand new tech is always a bad decision...sry. doesn't matter if it's about TVs, cars, computers, etc. Good for the rest of us as you are financing better technology that we can buy a bit later for half the price Thx
@@alexsupertramp4907 I was due for an upgrade anyway and TBH, I do a lot of 3D animation rendering so I'm benefiting from it massively in that regard. Still can't believe more games don't support it yet though.
I bought it over the 1080ti because of the new technology. I'm into Graphics and I've enjoyed the RTX on the games that supported it. They are pretty good with RTX off but when I enable it I'm impressed with the increase of realism and reflections. I've always been more into "Graphics" with everything on Ultra settings, than "Framerates" as long as it is not stuttery. Even with RTX on its over 60 FPS. I don't care about price, its not like I buy one every year, its a one time purchase.
@Luden UK so far 21 games support it. Problem is finding one you're into. Too bad Mount and Blade doesn't support Ray tracing. I can imagine all the shiny armor and weapons, and the blood pools with RTX.
@@alexsupertramp4907 The problem is that if nobody buys the new tech, then there's little to no incentive for Nvidia and/or game devs to invest in it. There are bound to be early adopters that just want to experience next gen from an infant stage, and it's understandable.
One interesting thing about baked lighting: you could have more than one baked texture, so for example, if you (as a developer) knew that a light could change between two colors, you could bake the lighting for those two colors and then program the game to switch between baked lighting maps when the player changes the color of the light. Half-Life 2 did that with fire: when you extinguish a fire you'll see that the lighting changes, but it seems very sudden because it is just a switch from one map to another without any kind of blending.
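For illustration, this is roughly the idea in code, with a blend added so the switch isn't sudden. A sketch with invented names and numbers, not Source engine code:

#include <stdio.h>

#define TEXELS 4

/* Two baked lightmaps for the same surface: fire lit vs. extinguished. */
static const float lit[TEXELS]   = {0.9f, 0.8f, 0.7f, 0.2f};
static const float unlit[TEXELS] = {0.1f, 0.1f, 0.1f, 0.1f};

/* t = 0 -> fully lit, t = 1 -> fully extinguished. Fading t over a
   second instead of snapping it from 0 to 1 hides the "switch". */
static void sample_lightmap(float t, float out[TEXELS]) {
    for (int i = 0; i < TEXELS; i++)
        out[i] = (1.0f - t) * lit[i] + t * unlit[i];
}

int main(void) {
    float shade[TEXELS];
    for (float t = 0.0f; t <= 1.001f; t += 0.25f) {
        sample_lightmap(t, shade);
        printf("t=%.2f: %.2f %.2f %.2f %.2f\n",
               t, shade[0], shade[1], shade[2], shade[3]);
    }
    return 0;
}

The cost is storing (and maybe transferring) both bakes, which is presumably why games of that era just did the hard switch.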
I've been looking forward to real-time raytracing since the 90s. That's about the time I wrote my first ray tracer, and let me tell you, it's simple. I mean ~100 lines of C code simple. It didn't look great, but the entire process of building up an image from the scene can be described in a shockingly small amount of code. It gets more complicated when you start adding reflection (which is just a recursion of the same process), refraction (again, just a recursion of the same process but the calculation of the angle of the new rays is different), and other things, but you're still not talking about much code. You just describe, in code, how light acts. That's it. On the other hand I'm still utterly mystified by rasterizers. I've written one but the entire thing feels like some abstract geometric puzzle with fudge smeared on top to guess at what color the pixels should be. You're right in that it's a pile of hacks, even "physically accurate" shaders are sitting on a pile of hacks. It's necessary because raytracing is extremely slow, but definitely not ideal. True realtime raytracing is still a distant goal, but it's in sight and it's very exciting.
"True realtime raytracing is still a distant goal" It depends on what you consider an acceptable result, Quake II RTX is entirely path traced in real time, minus the HUD elements and the menu (which would have no reason to be), it runs at 60fps 1080p on a $700 2080; sure it's only 1spp (if I'm not mistaken) but the denoised image looks quite good already and IMHO once you can reach 8-16spp the denoised output will look almost indistinguishable from a 1024spp ground truth.
@@Hayreddin That's interesting, I hadn't realized that Quake 2 RTX was a full raytracer, I thought it was a hybrid rasterizer and only used raytracing for lighting. I'll have to take a closer look at it.
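In the spirit of the "~100 lines of C" comment above, a whole toy "ray tracer" really does fit in a comment. This is my own sketch (one sphere, one directional light, ASCII output), not anyone's engine code:

#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec;
static vec sub(vec a, vec b) { return (vec){a.x-b.x, a.y-b.y, a.z-b.z}; }
static double dot(vec a, vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec norm(vec a) { double l = sqrt(dot(a,a)); return (vec){a.x/l, a.y/l, a.z/l}; }

int main(void) {
    vec center = {0, 0, 4};             /* sphere 4 units in front of camera */
    double radius = 1.5;
    vec light = norm((vec){-1, 1, -1}); /* direction *toward* the light */
    const char *shades = ".:-=+*#%@";   /* dark .. bright */

    for (int row = 0; row < 24; row++) {
        for (int col = 0; col < 64; col++) {
            /* primary ray through this "pixel" (terminal cells are tall,
               hence the squash on y) */
            vec d = norm((vec){(col - 32) / 16.0, -(row - 12) / 8.0, 1});
            /* ray-sphere intersection: |o + t*d - c|^2 = r^2, with o = 0 */
            double b = dot(d, center);
            double disc = b*b - dot(center, center) + radius*radius;
            if (disc < 0) { putchar(' '); continue; }
            double t = b - sqrt(disc);  /* nearest hit */
            if (t < 0) { putchar(' '); continue; }
            vec hit = {t*d.x, t*d.y, t*d.z};
            vec n = norm(sub(hit, center));
            double lambert = dot(n, light);     /* diffuse shading */
            int s = lambert <= 0 ? 0 : (int)(lambert * 8.99);
            putchar(shades[s]);
        }
        putchar('\n');
    }
    return 0;
}

It prints a shaded ball lit from the upper left. Shadows and reflections really are just "fire another ray from the hit point", which is the whole appeal.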
Excellent video! When NVIDIA first started advertising ray tracing I thought it was just "another unimportant effect" that requires a huge amount of processing power, but I gradually realized (and especially after watching your video) that everything we see on a screen can be produced by following the laws of how light behaves with different materials and objects, and this indeed can change the fundamental way that we create various visual effects. Thanks for the hard work! ; )
Phenomenal video on the topic. Speaking as a layman in game lighting processing beyond "I know what dynamic lights are and static lights are," I was still able to follow this video. I know this one was more techy and less about the social and philosophical discussions of the industry or medium, but just wanted to say that you've always been a voice of reason in the often shouty and toxic discourse and your voice is sorely needed. I really hope this channel takes off. Also, you were one of the people who inspired me to get into writing, which inspired me to make the channel I'm slowly growing now. Glad to see you throw your hat in the youtube arena, and looking forward to your next video!
With raytracing, the problem of light transport is practically solved for light in free space, but far from solved for transport inside materials, and computational physics is the next frontier. Imagine: realistic blood in Surgeon Simulator 2024!
All I want is good carved 3d shapes in game graphics without unrealistic and disturbing corners on them. That's mostly up to game developers, not to the hardware.
The sad part is that high end video cards are marketed as though they can pull it off. Depends on what the individual wants. For me, I'll be enjoying ray tracing a few years from now. Until then I'm at 4k with 144hz refresh and that beats ray tracing technology for me. Reminds me of the hype over DirectX 7 and 9.
I loled when you said "free". Like climbing Mount Everest and gliding off for 5 minutes is "free". It's still more pixels, level space instead of screen space, bouncing around compositing. I personally am still waiting for games to stop having trigger boxes and missions.
Well, once you get enough raytraced pixels, you can use some of them for gameplay, too. Accurate interaction with textured surfaces, accurate NPC vision, and of course freeing game designers from having to work around static geometry in general.
There is little cost difference between a boring diffuse surface or some fancy bump mapped glossy material or a light emitter/water/whatever, they all cost the same. In traditional engines the developers have to apply a light budget and material type budget; in RT there is no real cost difference between 1 light emitting surface or 5000 light emitting surfaces, given 2 scenes with the same number of triangles.
You could consider shadows, reflections, radiosity, refraction, etc, free in a path traced render engine, but definitely not for how games are using ray tracing today or in the near future. Shadows, reflections, ambient occlusion, global illumination, etc, all cost extra in Unreal Engine 4. Although I do believe I've read that ray traced shadows can actually be cheaper than high quality rasterized shadows that still look worse and have more limitations. Not free, and not cheaper to render than low end shadows, but still better and feasible to budget for.
this is the way i feel right now :( my computer used to be fine 10 years ago but now im well below the minimum specs for triple A games and i dont have the money for upgrades
Arstan Boranbaev I would guess games will be similar to what they are now, with the option to enable raytracing for a while, because publishers don't want to lose revenue from people who would've bought the game
@@lucywucyyy Well, be glad that computers remain relevant that long these days. It used to be that you had to upgrade every year, two years tops, if you wanted to be able to play anything recent.
Deus Ex had at least one mirror you could see your reflection in... My take these days: the more you see, the more you realize how it doesn't really correlate to how much you enjoy it.
And I am always surprised when there is a working mirror. It allows you to do so many things design-wise. Like I used it to make rooms feel bigger in The Sims 3. Mirrors have a lot of potential for level design. But now it's generally considered too much of a hassle to have in a game. :(
Suddenly reminded of the mirror in Adam Jensen’s apartment in DXHR with a sticky note from his landlord that’s chewing him out because he keeps smashing them.
same it looks so crappy but id still rather they dont work than slow my game down or god forbid the devs decide to prioritise a working mirror over gameplay
I remember reading about ray tracing being the (non immediate) future at least a decade ago, maybe as much as two. Cool if it's finally approaching... imo the biggest thing that gives a sense of atmosphere, immersion, and believability in a game is good lighting. I think it's way more important than high polys.
Yea that sometimes annoys me too. Raytracing has been around for a really long time, but the use has changed over time. It has been possible to do in real time for quite a while but it was never fully taken advantage of until very recently with the RTX cards (just to be clear, the RTX cards are not required for raytracing)
Oh at 5:18 AAA does still use that technique. That is actually the only way to do real-time moving shadows aside from ray tracing. They just change the organization methods and add a way to mix real-time shadows (which you calculate every frame) with the pre-rendered shadows (for things that don't move), and combine it with screen space shadows (which is like the shadows at the contact point, or ambient occlusion as you mentioned, such as where your feet meet the ground).
Hello? Yes, VRChat here. I'd like to order some more MIRRORS™ as there are not enough performance-killing mirrors in my user-generated content game yet.
Ray marching is also an interesting point as ray marching has the effect of ray tracing but doesn't need special hardware, nor the hit RT has in performance (though it still isn't as simple as normal lighting). Red Dead Redemption 2 actually uses ray marching for shadows, crepuscular rays, and rays in volumetrics.
If I may simplify a very complicated process; your brain is working 'even harder' (than playing the game yourself) to calculate where you are in the spinning world, and figure it all out (make sense of it all) than if you were playing, because it knows you aren't manipulating anything. That confusion of the world spinning and you not doing anything, makes your brain turn on the alarms of nausea/dizzy-ness, heh. It's bad enough when you are in front of your own screen and the world spins on a different axis than your head is at (most first-person game views spin on the axis of where the monitor is, not where your head is, in space). The brain is very smart, but gets easily confused when things don't make sense and sets off the dizzy-alarms for some people really, really quickly (as a form of self-preservation, of course). Bonus Trivia: The Brain named itself.
Just so you know, I played through Control with raytracing on my 1080ti. Raytracing does not need RTX hardware, Nvidia is just trying to monopolize the tech.
How were your framerates huh? Ray Tracing can work on any hardware but it is so sluggish there is no point in using it. Hence RTX. It has dedicated ray tracing hardware and performance is way better on them than on regular cards. It's playable
@@R4K1B- Never dropped below 40. I bet the game wouldn't use rtx cores even if you had them. My processor is getting old, so I would likely have gotten better fps with a better one.
@@R4K1B- as far as I know RT games barely touch the RT cores, or don't touch them at all in some cases, so it's likely more down to software and hardware optimization for raytrace calculations than to specialized hardware.
There are pros and cons to fixed function hardware. The RT cores in RTX cards are fixed function, with the function obviously being calculating ray intersections. We have been able to do ray tracing on GPUs since the mid 2000s/noughties, even in real time; the problem with the latter was that the resolution wasn't competitive enough. Nvidia even had a renderer for doing raytracing on the gpu shader hardware back then. It was called Gelato: en.wikipedia.org/wiki/Gelato_(software) The fixed function hardware for raytracing ensures there's a baseline of hardware dedicated specifically to raytracing, so since it's there the game developers might as well make use of it for that specific purpose. The downside is if you don't need that function you can't really use the transistors for other purposes. On the other hand if you have general purpose compute hardware such as the more traditional 'shader cores' you can do anything on it, including raytracing. The problem is there's so much you need to do on that hardware that it's tempting to just not do raytracing at all and use it for everything else instead.
@@Megalomaniakaal The other downside to the RT cores is that it gives Nvidia yet another means to monopolize. RTX is just another Gameworks. Control proves that you can have good looking raytracing without specialized hardware. Accessibility is key to the widespread adoption of new tech, and in that regard Nvidia is screwing us all.
I wrote a program to do ray traced, motion blurred animation around 1990. It even split up frame generation dynamically amongst a whole bunch of computers. It still took awhile.
This argument is beginning to sway me, particularly with how it can potentially speed up development time. My only gripe is that this is quite the performance hit for somewhat more accurate lighting. Battlefield 5 was one of the first to show it off and I’m willing to bet people would just rather prioritise framerate if the sacrifice on their part is so low. The games showing off raytracing atm don’t suffer without it. Having raytracing be the only means of lighting a scene could look nice and all, and could cut down on dev time (or make them focus on other areas instead), but that would also mean less room to work with out of the gate if you dedicate a big chunk of your processing power to this feature. As much as I can’t wait for these consoles, knowing as well that at least the Series X will have raytracing support, I don’t think the consoles will be prepared for it, nor capable of running that generation’s equivalent of Doom Eternal at 4K 60fps as well.
I would say the only downside is that it will push the artistic touch away a bit, kind of like game studios having actors do the base of a lot of the character animation with a team of animators.
this video was a shill for nvidia.. at no point did he mention the big elephant in the room.. that with all the crap he just said, a game with ray tracing is virtually indistinguishable from a game without it, at a significant cost to framerate.. most gamers, ESPECIALLY competitive ones couldn't give two shits about a .1% improvement in visuals when it costs them so much FPS
I think the misconception is that as GPUs improve over time they will have more and more performance, but not in a meaningful way. 80% better performance over the previous card sounds great, but 80% on top of 4k 144fps isn't really meaningful. Using that excess performance to power raytracing does in fact make sense though.
Great stuff! If I had the opportunity to add more, I would have added curved reflection surfaces and cube maps - and mirrored surfaces drawn within mirrored surfaces.
@@1God1Fury uhmmm, You WILL notice a difference as soon as you have 100+ fps on a monitor that can support it. Did you ever try playing on a 144hz monitor and go back to a 60hz one? I did it with osu!, LoL, Battlefield, Cod, Warframe, Minecraft, R6, just to name a few, and i can't get back to playing on a 60Hz monitor, cause i literally feel the stutter while playing, no joke. Edit: And i don't even use the FPS counter in those games, so there goes that argument as well.
@@1God1Fury What screen size are we talking about? Well, it doesn't matter. Quick question: you have a 24 inch, 27 inch, 32 inch monitor and let's say a full size living room TV. On all of those devices you play the same video (for test consistency) that has 5 fps. Where do you "see the lag better"? Answer: it's all the same, no matter the screen size. I believe you are talking about tearing (idk how it is written, sorry) or about a panel with long times to change the color (like some advertise "Gaming Monitor 1ms 27 inch..."). 1ms stands for the time the monitor in that case needs to dim the LEDs or totally change the color, for example from green to red. Tl;dr: you might be talking about tearing or the "color change rate" and in that case you would absolutely be in the right, because it's easier to notice pixel errors on a bigger display.
Nice introduction of raytracing for a general public. Great work! Though, as a computer graphics researcher, I have to point out a few things that are slightly wrong in your explanations. You're not to blame though, these are just common misconceptions that have spread widely.
1. You say you want to talk about raytracing as opposed to path tracing, and yet you've essentially talked about path tracing in the end. The difference is essentially: RAYTRACING is the general tool to answer the question "what point is visible from an arbitrary point in an arbitrary direction", and then being able to repeat this question for whatever point is the answer. PATH TRACING uses this technique to compute how light spreads throughout a scene, bounces around and interacts with surfaces. So what games are using, when they compute reflection, refraction, shadows and GI, is essentially some form of path tracing.
2. The computation is not less accurate because you start at the camera as opposed to the light. The computation is exactly the same. What you end up with is what is called a "light transport path" which connects the camera with the light source. THEN you can compute what amount of light is traveling along this path. The direction in which you compute this path is completely irrelevant to the result. It is just computationally WAAAY less expensive to start at the camera (in most cases, not always though).
3. DOOM 3's shadows weren't sharp because of the lack of indirect illumination, but because they used a new technique to compute shadows that just couldn't handle blurry shadows.
4. Shadow maps don't use the silhouette of the objects from the point of view of the light source, but the distance to the light source as seen from the point of view of the light source. Hence you don't project the silhouette onto the ground, but compare the distance of a given point to the light source against the distance that is stored in the shadow map, to figure out whether that point was visible from the light source. I assume you kind of knew this since you showed the Wikipedia article about shadow maps and just wanted to keep it simple for the viewer.
5. The computation of the phenomenon of indirect illumination isn't called "radiosity rendering". Radiosity rendering is a very particular algorithm to compute diffuse indirect illumination. There are many others though, and the one that is used these days (path tracing) works very differently.
6. The main benefit of ray tracing, and the reason why there is imho no doubt that raytracing will definitely be adopted by everyone in the games industry sooner or later, is that it tremendously reduces the complexity of modern game engines (at least the graphics part, obviously). You pretty much mentioned this very clearly. But the important detail here is, as always: MONEY. Reducing the cost of developing and maintaining the engine, as well as reducing the workload of artists, is a HUGE incentive for game companies, especially greedy management.
Again, great introduction though. And certainly better than the common "hurr durr nvidia just wants our money, RTX is a scam, games just run worse and look just as shitty ...".
@@AndrianHendrawan I was replying to the ""hurr durr nvidia just wants our money, RTX is a scam, games just run worse and look just as shitty ..."" strawman. The fact is that nvidia took advantage of a GPU drought to release a half-baked technology for an outrageous price. Nothing about the video deals with that subject so even though I did watch it, it was not required to respond to the strawman.
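Point 4 above (shadow maps compare distances, not silhouettes) in code form, for anyone who wants to see it. A bare-bones sketch with a 1D "map" and invented numbers, not any real engine:

#include <stdio.h>

#define MAP_SIZE 8

/* Pretend depth map as seen from the light: distance to the nearest
   blocker along each of 8 directions. Texels 2-3 have a blocker at 3.5. */
static const float shadow_map[MAP_SIZE] =
    {9.0f, 9.0f, 3.5f, 3.5f, 9.0f, 9.0f, 9.0f, 9.0f};

/* Is a point at 'dist' from the light, seen through texel 'tex', lit? */
static int is_lit(int tex, float dist) {
    const float bias = 0.05f; /* avoids self-shadowing ("shadow acne") */
    return dist <= shadow_map[tex] + bias;
}

int main(void) {
    /* A floor point 7 units away, behind the blocker in texel 2: */
    printf("texel 2, dist 7.0 -> %s\n", is_lit(2, 7.0f) ? "lit" : "shadowed");
    /* The same distance through a clear texel: */
    printf("texel 5, dist 7.0 -> %s\n", is_lit(5, 7.0f) ? "lit" : "shadowed");
    return 0;
}

Real shadow maps are 2D and the point is first transformed into the light's view, but the depth comparison is exactly this.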
At last, someone who knows what he is talking about! I'm so annoyed by all those people saying "that raytracing effect eats a lot of frames without adding much". It is not an effect! And as soon as we get rid of all that rasterization mess everybody will know that! I implemented a raytracer a few years ago and so I know what it is capable of and how beautiful RT is to work with. Just a couple corrections: it is path tracing (or a similar technique) that we aim for, because RT alone can't provide GI, and tracing from the eye is perfectly fine due to the bidirectional nature of the BRDF and will not impact the image quality at all. Great video, I wish you explored a bit more how complicated very basic effects are with rasterization and how easily they just emerge from RT, because the positive impact of RT on engine implementation should not be underestimated. Great work!
8:25 No, they don’t change in wavelength. Coloured objects simply absorb photons of certain wavelengths, and reflect others. Bounce red light off a green object, and the light doesn’t turn green, it simply gets predominantly absorbed and the green object looks dark.
I'm super happy the crypto binge happened over the last 5-10 years or so, and is now seemingly going away. I effectively barred myself from upgrading my graphics card for about 7-8 years (ish, not exactly sure), and now, when I'm about ready to bite the bullet and get one, this stuff is coming in and cards are the cheapest they've been in ages. Brilliant plan, me.
Whether or not end users adopt advances in graphical technology has a lot to do with how much they value realism in game graphics, and I use the term "realism" specifically because many of these advances don't necessarily make games look that much better. Realism is just one style out of many, and there are lots of games where hyper-realistic lighting would actually detract from the game's tone, including one you mentioned: Doom 3. I think game programmers and designers have a tendency to get hung up too much on details. In the real world, if you look at a grassy meadow you can see tens of thousands of individual blades of grass swaying in the wind. A game would probably treat the whole thing as one big, swaying object, like a massive carpet made of grass, but is that a bad thing? Would it be worth the trouble to have an individual grass-blade renderer? Would it even be noticeable to the player? I would err on the side of "no" for all three of these questions as far as most players are concerned. There's certainly a lot of potential in ray tracing, and I can appreciate it aesthetically when it's used in something like Minecraft, but if I were actually playing this is exactly the kind of thing my mind would overlook. I find games immersive when they're fun and have a strong atmosphere; graphical realism is a secondary consideration. If the PS5 does support ray tracing, it may end up like the PS3's Blu-Ray functionality: an expensive gimmick appreciated by only a few.
As someone who works as a game developer... If this takes off, my environmental designers will finally have enough free time to spin up some really, really cool shit. They spend SO MUCH TIME setting up tricks for the most rudimentary effects, like reflective puddles, swinging lanterns and so on. We've reached enough graphic fidelity to last us a couple decades, and we really have to focus on quality of life improvements, because making graphics got insanely complex in the last two decades.
ahh the beauty of anti-aliasing in the Source engine when it worked as it should, not like today's modern titles where you turn AA on and it's either blurry/muddy or still jagged.
I'm one of those gamers that was heavy into "Graphics". This to me is great and makes a big difference in games that support it. I don't care about 500 fps at low settings, I like to turn everything up to Maximum Ultra and enable everything offered: real hair, shadows, anti-aliasing to the max, etc. When it looked choppy I'd start turning things down. Now computers and graphics cards are so powerful I can skip the "turning down" process even with RTX on and still get 60 fps. It is awesome looking. If for some reason I wanted or needed more fps, I can just enable DLSS with just a tad loss of quality, but I have yet to need more FPS. As far as price, I don't care one bit. It's a one-time purchase for years to come.
Him: Talking about how revolutionary and simple ray tracing is but requiring the investment of $1000s and adoption from game studios Me: I'll see it in a Bethesda game 30 years from now.
Drop shadows are still a useful feature in some games, namely platformers, where they can give you a good idea of what exactly you're above, which is important to see where you'll land.
I loved it but I wish the video corresponded to what you were saying better. The Deus Ex shadows worked nicely and the Source radiosity room demonstration did a great job illustrating what you were saying - even better than the relevant screenshot in the written article. But for the most part it was just footage of Control. A very pretty game but it felt like filler here.
I always thought the way it was done in the Silent hill series was the best where they just baked the shadow textures onto the models. If you actually pull up those old models the UV wraps are beautiful paintings and compiled with the backgrounds you have extreme depth of detail with hardly any real time rendering
As a 3D artist I was sooo hyped for RTX! Just to find out the technology would be developed for games first, 3D production last. Sure, most render engines "support" it. With half their feature sets missing :/
I have thought for a while that the whole "RTX" launch as a consumer device feature was premature and really just about capturing some initial branding mindshare.
What's wrong? Perceived color depends on the wavelength of photons. I can agree that it's not changing; the spectral composition of reflected light changes, when some wavelengths get absorbed more than others.
Well actually, there's this effect called the 'doppler effect', which does influence perceived wavelength. In light it's basically negligible in normal scenarios, but just fyi :) EDIT: There's a different name for it ' shift', though i have forgotten the term.
@@pauladriaanse what you are referring to is a light shift and it's when the wavelengths in light are stretched out during the process of getting from one place in space to another. EX: A star moving away from us while we move away from it stretches its light to shift to red, while if it moves closer it shrinks to blue.
One thing people sadly forget is how sensitive the human visual cortex is to tiny changes in lighting. Lighting is actually the most complicated and most important part of a scene. Sadly, pathtracing performance is not independent from scene design or number of light sources. It is too expensive to do caustics with just one-directional ray tracing, so you have to do it bidirectionally. Moody scenes where a half open door allows the sun to illuminate a warehouse are way harder to do than simple scenes under the sky. You need way more rays to statistically hit the light source enough. If you look at ray tracing implementations in current games, this is where you see more noise, because it can't be done in time for the frame.
Thanks! As a physicist with a side-interest in video-games I have been wondering about this for years, since I realized that ray-tracing is the natural thing to do but would be too computation-heavy. A small amendment: Photons don't usually change wavelength, unless we are talking about fluorescence or non-linear optic effects, both of which are rather rare in everyday scenes. The apparent change in color comes from the fact that white light contains all wavelengths of photons and a colored surface absorbs some of them and reflects the rest. The complicated part here is not the physics, where each wavelength just behaves normally, but our eyes, which are really screwed up in terms of color perception because of our three kinds of color-sensitive cells on our retinas. That is where all the weird color-mixing effects happen which you study in arts lessons.
Game Devs: Invest a huge amount of time to create realistic shadows and light effects.
Gamer: Turns off shadows for better fps.
And competitive edge
That's the first thing I do on every game.
Me with my crappy cpu and medium spec GPU: *Turns shadows to low in every game*
AMD FX-8350 8x4,2 GHz
24GB DDR3 RAM
AMD RX 5700 XT 8GB
ARK
Textures: Epic
Shadows: Low
Reflection: off
Map: Island
FPS: 80-210
buys a $1100 card
plays it like a $200 card
Ray tracing isn't new, it's been around for decades. The reason why it seems new is because hardware is finally powerful enough to do it in real time. I recall my dad playing around with ray tracing back in the early 1990s (it took over four hours to render a single scene at 320x200 in 8-bit colour, albeit on an 80386). I realize that you (Mr. Young) probably know this, but some of your viewers (especially the younger ones) may not know this.
Ray tracing is still the main way to render 3d scenes. But yeah as you said the leap here is that it's done in real time.
I remember reading that it took cutting-edge (at the time) workstations a median of 8 hours to render a single frame of Toy Story. Real time ray tracing is amazing tech. Sadly the performance hit is not worth it yet in my opinion.
And everybody knows this. I think this video assumed you know this.
@@madfinntech You know what they say about making assumptions.
Wow, now that's some knowledge
The thing that has me really excited about RTX tech is what they can do with audio path tracing. Real time dynamic reverb from moving sources, frequency modulation depending on the surface it's reflecting off of, distance muffling, sound transparency through different materials, doppler shifting, et cetera. In real life you take in so much auditory information that subconsciously builds the sense of space you're in, but we haven't had anything like that in video games until now. The idea hasn't gotten much press, but I think the effect is going to be much more profound for a sense of realism than most people would expect.
Though the RTX visuals in Control feel like a real milestone. I hadn't been so blown away by a game's effects in years as when I saw a running movie projector picked up and it kept casting the film wherever it pointed along with perfect shadows all in real time - shockingly incredible to see happen.
@Sp3ci3s8472 I had no idea, but that's kinda disheartening if it's been around for a while and already didn't catch on. The way I'd heard is that the path tracing wasn't possible without the architecture of RTX, but it was a while ago so I could be completely wrong. I'll have to go look up the name of this preexisting tech, thanks for the tip.
but then I'd have to play with the sound on
man i never even thought about that, i take audio in games a lot more seriously than graphics and that sounds incredible, i think it'd be much less computationally heavy than graphical raytracing as well so that sounds like it could easily become a thing.
it's always bugged me in games when sounds don't sound muffled when you're in a carpeted house full of furniture
Both the Unreal Engine and the Quake Engine could do this back in the day, but much like their curved surface simulation no games did it because of the horrendous expense.
@@raycearcher5794 Pretty sure Quake 1 and 2 didn't do this. Not sure about Quake 3. Also the way it's been done is usually a bit less dynamic, involving creating volumes that change how a sound is heard by a listener. Don't think hardware-accelerated raytracing is going to be used in this way, or can be. The way this stuff is designed is pretty limited to graphics.
Doom 3 is ripe for the modding community to add raytracing.
Raytracing for Doom 3 is a complete waste of time, @SpeedWeed. Maybe not a *complete* waste of time. After all, in the beginning of the game you still have plenty of lights just about anywhere. Later in the game it is almost complete darkness wherever you go. So raytracing would just be a gimmick for Doom 3. That is, unless you have that flashlight-on-weapons mod.
@@adamgray1753 I think the game would benefit from it. Even though there aren't as many light sources later in the game, it still would look amazing with the deep shadows and realistic light that would come off the screens and dim light sources. Would make the game even scarier maybe!
For what little can be seen later in the game, @@koolin3613, most likely you are accurate.
You mean Doom 2016, the game with actual lighting?
@SpeedWeed I can literally add it in five minutes. If I don't get too lazy I'll do it one of these days
Talking as someone who loves your written content, this is also great stuff. You have a knack for explaining stuff in a way that people without a technical background can easily follow your points
When he started excitedly listing off everything that was "free!" thanks to ray-tracing it sounded like he was trying to convert me to a new religion.
I guess in some ways he is, because his next point was that you have a chicken and egg situation where we need the hardware before devs will make games for it, but we won't buy the hardware without the games. While it seems to be coming to the PS5, what about other platforms? Guess we'll see, but at some point we're all going to have to take a leap of faith and upgrade our GPUs to more enlightened models!
For us people who did coding for 3D engines, finally having realtime raytracing IS a new religion. ;)
They are not free performance-wise though, as ray-tracing is not just more costly than applying approximations, it's also inherently more complex and even harder to make any kind of latency guarantees for - they're just free for the 3D designers because they don't require fancy specialized hacks.
@@FrankHarwald He explained what he meant by free. The free part is the effects you get after you can calculate the raytracing without any extra performance hits for each individual lighting effect.
It's not like "Ok, I enabled ray-traced shadows and I got 50 fps, let's see how much fps will I get with ray-traced ambient occlusion enabled too"
I thought he was gonna add a clip of "Dobby is free!"
Oh Shamus, it's not fair you are not here to witness the continuing development of this technology
Just came across this random video of Shamus' on shuffle. RIP, you were too young.
Yeah sucks to hear him say how excited he is about the technology and the future of ray tracing and he isn't around. Just found his channel the other day too. I love content like this.
It's interesting all the things you have to unlearn. Playing through Quake 2 RTX, I saw a glowing blue light on the gun of a grunt corpse, and noticed it was casting an almost imperceptible light on the ground. I also noticed the same of the glowing lights on the grenades. My first instinct was "Wow they've really thought of everything" but that's really not true. The game just does it. Also I instinctively wince when I come to an area I know will have lots of small lights, because I know multi-directional lighting is a resource hog on previous games. Nope, it treats all light and shadow the same, so it has no problem with a room full of diffuse light panels.
The cost of multiple lights can still vary with ray tracing, it depends on the sampling algorithm used. Small lights in particular require importance sampling, which can be tricky to optimize properly.
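To unpack what importance sampling buys you, here's a toy Monte Carlo comparison (all geometry and numbers invented for the example): estimating direct light at a point from a small square light overhead, first by firing rays in uniformly random hemisphere directions, then by sampling points on the light itself. Both converge to the same answer (about 0.004 here), but at wildly different rates:

```python
import math, random

# Toy scene: shading point at the origin facing +z, and a small square
# area light (half-size s) centred at (0, 0, d), emitting radiance Le.
Le, d, s = 10.0, 5.0, 0.05

def uniform_hemisphere_estimate(n):
    # Fire rays in uniformly random hemisphere directions (pdf = 1/2pi).
    # Almost none hit the tiny light, so the estimate is extremely noisy.
    total = 0.0
    for _ in range(n):
        z = random.random()                  # uniform over the hemisphere
        phi = 2 * math.pi * random.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = r * math.cos(phi), r * math.sin(phi)
        if z > 1e-6:
            t = d / z                        # where the ray meets the light plane
            if abs(x * t) < s and abs(y * t) < s:
                total += Le * z              # cosine term at the surface
    return (2 * math.pi) * total / n

def light_area_estimate(n):
    # Importance sampling: pick points on the light itself and convert
    # the area pdf into the solid-angle measure. Every sample contributes.
    area = (2 * s) ** 2
    total = 0.0
    for _ in range(n):
        lx, ly = random.uniform(-s, s), random.uniform(-s, s)
        dist2 = lx * lx + ly * ly + d * d
        cos_surface = d / math.sqrt(dist2)   # light faces straight down,
        cos_light = cos_surface              # so both cosines match here
        total += Le * cos_surface * cos_light * area / dist2
    return total / n

print(uniform_hemisphere_estimate(10_000))   # often 0.0, or wildly off
print(light_area_estimate(10_000))           # consistently ~0.004
```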
00:15 my graphics card is the one on the top right
lol
thx for the flex
Who needs a graphics card when you have intel integrated graphics
@@Tom-jw7ii we all know Vega 3 graphics is best
Mine is the 8th card in the image
8:25 the wavelength of a photon doesn't change depending on the color of the surface!
That bugged me too. What really happens when light hits an object is that some of the light will be absorbed, some will be scattered/reflected and in some cases there will be transmittance too.
Edit: Unless we are talking about the second, third, nth harmonic. But that only happens with strong E-fields (lasers) and specific materials. BBO crystals are a good example.
There is this thing called “fluorescence”, where the re-emitted radiation has longer wavelength than that which was originally absorbed.
@@lawrencedoliveiro9104 The photon still doesn't change, and that also only happens with certain materials where the light has enough energy to excite the electrons.
@@mryellow6918 Happens to lots of materials. Happens to the surface of the Earth, for example, which is how the “greenhouse effect” works.
Yeah, the photon doesn't change, but if you ignore the layman's terms he isn't *wrong* - he just misunderstood the light unit in physics class when he was a kid and hasn't been proved wrong since
11:35 I like how when the character stands in front of the film projector, not only does she create a shadow on the wall, but the light that is blocked by her body is shown on her body. Not just the light, but the animated detail of the film. And all of this requires no additional programming as it's an emergent property of the raytracing algorithm.
Who cares about graphics? We still have violent, oppressively evil gameplay. Get back to me when your character can put on a shadow puppet show with the projectors...
@@thrwawyacct Raytracing doesn't force people into violence - culture does that. Last time I checked, no culture was free from criticism on that front.
Nothing new under the sun huh. I remember doing raytracing 30 years ago.
Granted, it took 5 hours to render a single frame of a very primitive scene...
But arguably raytracing as a concept might even be OLDER than the various tricks we've been using for 3d games in the last 30 years or so.
After all, Ray Tracing has always been a much closer approximation of the physics of light than all the weird bucket of rasterisation tricks.
I mean, Ray Tracing still does a few weird things - like trace from the camera outwards (where real light obviously comes from the light sources)
What I don't get is this 'graininess' stuff with modern 'raytracing'. To my knowledge that's not raytracing, but rather some variant technique (path tracing or whatever)
The thing is, Raytracing has two major flaws, which have plagued it from the start;
1. It is EXTREMELY slow. In the beginning this even made it difficult to use for pre-rendered sequences where you can justify using a render farm and taking hours to do a single frame. Realtime use was basically out of the question until about a decade ago, and even then it was only in very carefully crafted scenes and limited resolutions.
2. Raytracing (at least, the traditional formulation of it) can't do ambient lighting. AT ALL. Just flat out incapable of it. Thus, you have to combine raytracing with something else to get any ambient lighting effects. Or... Use a modified technique such as path tracing. If you see a tech demo calling itself 'raytracing', yet focusing on ambient lighting effects, I'd find that quite suspicious. Due to the slow performance of raytracing, the original solution was to just use an ambient colour value; which basically means the ambient lighting is handled the same as any rasterizer... Other options include combining it with Radiosity rendering, which happens to have the opposite problem - radiosity cannot really do specular lighting. Or, path tracing or similar extensions can be used, but that imposes new problems.
Speaking of new problems, the need to use denoising filters with 'Raytracing' solutions seems directly tied to the weakness of raytracing in doing ambient lighting.
The approach seems to be one of treating ambient lighting as the sum/average of many specular lighting passes made with random reflection angles...
Seems a rather poor choice for it, but I guess they've settled on this method for a reason...
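That "sum of many random passes" idea is easy to show in miniature. Here's a sketch of Monte Carlo ambient occlusion; `occluded(origin, direction)` is an assumed scene query, not any particular engine's API. With few samples the estimate is noisy, which is exactly where the denoising filters come in:

```python
import math, random

def random_hemisphere_direction(normal):
    # Uniform direction on the unit sphere, flipped into the hemisphere
    # around `normal`.
    while True:
        v = [random.gauss(0.0, 1.0) for _ in range(3)]
        length = math.sqrt(sum(c * c for c in v))
        if length > 1e-9:
            break
    v = [c / length for c in v]
    if sum(a * b for a, b in zip(v, normal)) < 0.0:
        v = [-c for c in v]
    return v

def ambient_occlusion(point, normal, occluded, n_samples=16):
    # Average random visibility rays over the hemisphere: 1.0 means fully
    # open to the sky, 0.0 means fully blocked. Few samples -> grainy.
    open_sky = sum(
        0 if occluded(point, random_hemisphere_direction(normal)) else 1
        for _ in range(n_samples))
    return open_sky / n_samples

# Toy scene: an infinite wall blocks every ray leaning toward +x.
wall = lambda origin, direction: direction[0] > 0.0
print(ambient_occlusion((0, 0, 0), (0, 0, 1), wall))  # ~0.5, and noisy
```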
In raytracing, a ray that hits a surface splits into several rays distributed in a hemispherical way that look for the light source.
This makes noise free images but it would be exponentially slower because of the increasing number of rays created with each bounce, that's why raytracing gives pitch black shadows and final gathering or photon mapping were used to simulate indirect illumination.
Pathtracing just changes the direction of the ray when it hits a surface therefore it scales linearly with each bounce.
The problem is that when a surface is not a perfect mirror, meaning it has some roughness, the ray bounces in a random direction. You need several rays shot through the same pixel to average the result of all the random directions, and you need to do the same with the next pixel and the next one and so on.
Every pixel is independent from its neighbor therefore you need to shoot a lot of rays to make the average more accurate.
I think they should have called it pathtracing unless they're doing some dirty tricks under the hood as you say.
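A sketch of the difference described above, with `intersect()` as a stand-in for a real scene query: one random continuation per bounce, many paths averaged per pixel, and the noise shrinking as the sample count grows:

```python
import random

def trace_one_path(intersect, max_bounces=4):
    # One path: at each hit, continue in ONE random direction instead of
    # splitting, so cost grows linearly with bounce depth. `intersect()`
    # is a stand-in scene query returning (emitted, albedo) or None.
    throughput, radiance = 1.0, 0.0
    for _ in range(max_bounces):
        hit = intersect()
        if hit is None:
            break
        emitted, albedo = hit
        radiance += throughput * emitted
        throughput *= albedo
    return radiance

def render_pixel(intersect, spp):
    # Average many random paths through the same pixel; low spp is why
    # realtime path-traced output looks grainy before denoising.
    return sum(trace_one_path(intersect) for _ in range(spp)) / spp

# Toy "scene": each bounce has a 10% chance of hitting a bright light.
toy = lambda: (5.0, 0.0) if random.random() < 0.1 else (0.0, 0.5)
for spp in (1, 16, 1024):
    print(spp, round(render_pixel(toy, spp), 3))
```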
@@leecaste both of these comments deserve to be read more.
Leif Pederson it's easier to rant about something not working properly than taking the time to understand why it struggles to be perfect.
I'm glad you took the time to read both comments and even enjoy them, thanks 🙂
@@leecaste I believe that's not exactly correct. What you call raytracing is actually pathtracing. And, as far as I know, in raytracing when the ray collides with a rough object, it sends out a ray towards every light source, to determine whether the object is illuminated by the light, or whether it's in the shadow.
Lenar Gilmanov the first paragraph of my comment says exactly that, if the ray hits a light source, that part of the object is illuminated, if it doesn't hit a light source, that part of the object renders pitch black, it doesn't take into account indirect lighting.
Rest in peace 🕊️
I'd take compression and data storage over graphic power when it comes to next gen tech development. Games look good enough for now; there should be more stuff to see, more npcs and more stuff to do in games.
The problem is marketing. Pretty screenshots and videos are easier to sell. For a long time developers were making games with 30fps and 60 FoV in mind because pretty screenshots are (or at the time, were) effective marketing, never mind that the game is a headache-inducing, eye-bleeding mess.
9:41 No one told me this lecture was being given by a dracula. 0/10
you forgot to say "it just works"
16x the DETAIL....... 16
4 times the size of fallout 4
With no NPCs!!!
King Crimson noises
Raytracing has always been used in 3d modeling scenes and animation. Gaming has typically been on the low end when it comes to modeling/animation technology because of the nature of the medium.
Shamus Young died on Wednesday, June 15th, 2022, at 3am, of cardiac arrest. He is greatly missed by family, friends, colleagues, and his audience.
I agree, why is Motion Blur even a thing? I turn it off whenever I can.
It's the simulation of motion that the eyes can't process. In some ways, if used right, it keeps the animation from looking odd, kind of like a wire fence without anti-aliasing.
Motion blur exists in real life. But most games use a very cheap and exaggerated "camera motion blur". Most good implementations of motion blur you wouldn't notice, since they are "movement-based motion blur".
I turn it on and to max, fight me
@@SirDella Don't have to, you're already throwing up from motion sickness
@@LordOfSilense lol
Dedicated hardware is not needed - just software support.
Minecraft raytraces on AMD hardware almost as well as on RTX :) That Neon Noir demo from Crytek ran on a Vega 56
Vega is a card meant for computation, not graphics. No wonder it ran it well.
@@jamegumb7298 Graphics are computations, just specific ones that can be optimized for in hardware architecture.
@@jamegumb7298 Even my R9 290 can run the Crytek Demo in high around 40FPS at 2560x1080.
Crytek themselves said the Neon Noir demo would run better on RTX cards because of their dedicated hardware, and stated that the Neon Noir demo is inferior to RTX titles in its level of implementation.
Ehhh... Minecraft raytracing... More specifically, Optifine DOES NOT USE SPECIAL RT CORES FOR HARDWARE ACCELERATION; it's just using good old CUDA... Basically RTX cards are working with RTX off but providing RTX lol
Or like a car with Turbo just it's not working... Still gets you from A to B just not as fast lol
Glad to have found this channel. Really nice quick form of analysis. With retrospective elements to boot.
One of the biggest things I like about ray tracing is not having to deal with ugly screen space reflections. While they can look good with a lot of tricks to make up for their flaws, nothing is ever perfect and the holes in SSR are really distracting when you see them. They're really obvious in the Yakuza games or in FF15. Otherwise really nice looking games, but with ugly SSR sticking out like a sore thumb. Those games also share aliasing issues too, now that I think of it
I'd take SSR over the awful cube map water, window and car mirror reflections in GTA V tbh.
@@NessieNep imo a combo of them is the best next to ray tracing, since the cube maps can fill the holes in the SSR
@@NessieNep I miss old planar reflections like in Half Life 2.
It also applies to SSAO
Been looking forward to real-time raytracing for almost 20 years. I was surprised you didn't mention that the concept has existed all this time, or that it dominated the animation world in the late 90s and early 2000s.
Fantastic explanation - I knew so many of these things in separate, partial ways and never really tied any of it together.
What's so funny about Prey removing the mirror is they literally had a way to render a fake room on a flat panel and give it depth. That would have literally been a mirror.
Ikr, they could have basically used a better version of the old method but at this point, it's whatever
Ugh, which Prey game are you talking about? Prey or Prey?
FFS, who the fuck gives the new game in the series the *exact same* name as the old one? Also looking at you, "Doom"...
@@EvenTheDogAgrees three words: star wars battlefront
@@JacobKinsley Lol, that one too/two... 🤣
"the old way" to you, Shamus, isn't really the old way. I cut my gums on this stuff on the Amiga in the 90's with Imagine 3.0 (a raytracing program). It of course was not real time (an hour per 640x480 image of full raytrace rendering) but I always pined for a time when real time ray tracing could happen. It's early days, but I am glad it is starting to happen.
Imagine
>"the old way" to you, Shamus, isn't really the old way ... in the 90's ...
Hasn't it ever occurred to you that you are recalling things from the last millennium? prefixing a statement with "isn't really the ..." doesn't make it automatically false. you are just old, whether you accept it or not. I was playing with ray tracing in 3DSMax in late 90s' when I was but a wee dork... now I am fat, balding old dork living in a world where literal toddlers are playing with phones several dozens times faster than my first PC. I am old as are my memories. just accept the new reality... maybe start saving for a burial plot even...
Yeah, the Amiga raytracing days were great. As you know, some later games on the Amiga implemented it, such as Super Stardust.
Am I of the understanding though that the narrator of this video presumes that Raytracing is a new technique?
@@baburik What the fuck does someone's age have to do with historical facts? Ray tracing being a decades old technique is just reality, whether someone's lived through its development or not.
@@gurriato Excellent response to Spown 👍
Played control with ray tracing, it was absolutely beautiful and it's strength was having amazing environments that look real and then having those environments warp and change and get trippy as hell. There is a whole zone just full of fucking clocks, overflowing a story tall, spilling out everywhere and they all seem to be rendered individually and have physics. Hats off to the level designers and the technical teams on this game.
Listening to the protagonist's internal dialogue as she stumbles her way through the incredibly hacky (though admittedly creepy) plot, on the other hand...
1:25 "It's been so long since I've seen something genuinely new"
Very well done and descriptive, yet concise. I hope this simplifies things when designing games for, say, Unity. Setting up the lighting in it has always been a pain in the ass to me.
Ray/pathtracing will save a ton of tedious work hours because you just put the lights, adjust the intensity and everything else works by itself.
I just hope for developers to work with real world units as much as they can because this gives consistency, something that can be broken easily by artists that have the bad habit of eyeballing everything.
11:08 No they don’t. In real ray-tracers, you can fine-tune the render by specifying limits for various kinds of rays, such as diffuse versus glossy bounces, transmission rays, volumetrics etc. They are not free: every parameter has a quality-versus-computational-cost tradeoff.
Happy to see another video!
8:10 "Billions of photons are pouring out of the screen"
That is a gross understatement. It only took me *HALF A GOD DAMN HOUR* to find the (not at all) readily available information I needed to calculate (yeah, the information I wanted isn't even there) the number of photons being emitted from a typical phone display. About 100 quintillion (100 billion billion) photons per second. This number could be off by as much as a factor of 1,000 due to the lack of quality information on the subject.
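For anyone who wants to sanity-check that order of magnitude, the back-of-the-envelope version is short; the 1 W of emitted visible light is a loose assumption (real panels vary a lot with brightness and size), but the photon energy part is just Planck's relation:

```python
h = 6.626e-34          # Planck's constant, J*s
c = 3.0e8              # speed of light, m/s
wavelength = 550e-9    # a representative visible wavelength, m
power = 1.0            # watts of emitted visible light -- an assumption!

energy_per_photon = h * c / wavelength       # ~3.6e-19 J
print(power / energy_per_photon)             # ~2.8e18 photons per second
```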
Now this is my kind of autism.
Hahah, details matter, man! :)
4:05 The Sun also casts a penumbra. The angle of the penumbra is all down to the apparent (angular) size of the light, not to how near or far away it is, and the angular size of the sun is about ½°.
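To put numbers on that: with the Sun's roughly half-degree disc, the penumbra width scales with the occluder-to-receiver distance, roughly like this:

```python
import math

sun_angular_diameter = math.radians(0.53)    # the Sun's apparent size

for gap_m in (0.1, 1.0, 10.0):               # occluder-to-receiver distance
    penumbra_cm = gap_m * math.tan(sun_angular_diameter) * 100
    print(f"{gap_m:4.1f} m gap -> penumbra ~{penumbra_cm:.1f} cm wide")
# 0.1 m -> ~0.1 cm,  1.0 m -> ~0.9 cm,  10.0 m -> ~9.3 cm
```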
This is 99% accurate, but it's important to mention that current GPUs can also do raytracing and partial raytracing is already being used in major engines like UE4 and Unity. Tim Sweeney from Unreal Engine says that the use of raytracing in Unreal Engine will slowly increase and when graphics cards hit a processing power of 25 TFLOPS, full raytracing will be the new norm. Unfortunately we're still about five years away from that power in high end GPUs.
Also, an interesting effect of full raytracing will be a more natural projection. The current rasterization method uses a linear function for the projection of the 3D world onto the 2D screen. This causes an unnatural distortion that becomes more pronounced toward the edges of the screen. With full raytracing it will be trivial to create a natural projection like that of a real life camera or the human eye.
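To illustrate the projection point: with per-pixel ray generation, swapping the usual rectilinear camera for a curvilinear one is just a different ray formula. A rough sketch, not taken from any engine:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def pinhole_ray(px, py, width, height, fov_deg=90.0):
    # Rectilinear projection (what rasterizers do): rays pass through a
    # flat image plane, so distortion grows toward the screen edges.
    aspect = width / height
    half = math.tan(math.radians(fov_deg) / 2)
    x = (2 * (px + 0.5) / width - 1) * half * aspect
    y = (1 - 2 * (py + 0.5) / height) * half
    return normalize((x, y, -1.0))

def fisheye_ray(px, py, width, height, fov_deg=180.0):
    # Equidistant ("natural") projection: the angle from the view axis is
    # proportional to the distance from the image centre. With per-pixel
    # ray generation this is no harder than the pinhole version.
    nx = 2 * (px + 0.5) / width - 1
    ny = 1 - 2 * (py + 0.5) / height
    r = math.hypot(nx, ny)
    if r > 1.0:
        return None                          # outside the circular image
    theta = r * math.radians(fov_deg) / 2
    phi = math.atan2(ny, nx)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            -math.cos(theta))
```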
CryTek recently stated that they have implemented ray-tracing in their CryEngine which runs pretty smoothly on cards such as the RX 570. If this is true, it means that ray-tracing does not need cards as powerful as previously thought
They can do it, in the way that you can dig a hole in your garden with a spoon if you want, but a spade will make it a lot quicker and easier. =P
A bit about how ray tracing can work in practice:
for every pixel, you do a process called ray marching, which basically means you cast a ray, see what it hits, then from there cast more rays as needed. For example, if you have some glass, you'll need to cast two rays off the glass: one through the glass for refraction and one bouncing back for reflections.
After you finish casting those rays, you sum up the colors those rays hit.
One common way to do the ray casting, against distance fields, is called sphere tracing:
you take the ray's starting point, calculate the distance from that point to the closest object, then travel along the ray by that distance.
When the distance becomes really small, you know you've hit something.
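A minimal sketch of that loop; `scene_sdf` here is a signed distance function you'd supply for your own scene:

```python
import math

def sphere_trace(origin, direction, scene_sdf,
                 max_steps=128, max_dist=100.0, eps=1e-4):
    # March along the ray by the distance to the nearest surface: since
    # we never step farther than that, we can't tunnel through geometry.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = scene_sdf(p)
        if dist < eps:
            return t            # hit: distance travelled along the ray
        t += dist
        if t > max_dist:
            break
    return None                 # miss

# Example SDF: a unit sphere at the origin.
sphere = lambda p: math.sqrt(sum(c * c for c in p)) - 1.0
print(sphere_trace((0, 0, -5), (0, 0, 1), sphere))   # ~4.0
```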
10:50 : that's sadly not true.
For each phenomenon you need an extra ray, therefore it's not free. You can't figure out whether the floor is shadowed and what it reflects with one ray, simply because those rays go in different directions.
Two ways to take it. First, it's developmentally free - you don't need to put in almost any extra effort to support reflections, it's just a slightly different path. Second, it's free in the sense that if you're already spending five rays per pixel, it costs the same if those are five diffuse rays or three diffuse rays, one reflected ray and one refracted ray.
Not exactly. Each ray is essentially one path of a light ray. The calculations that will be performed on those paths, those are distinct for each effect, not the rays.
@@louisvictor3473 Right but a specular reflective path is a different one than a diffuse reflective path.
@@FeepingCreature That is exactly why it comes for free: to figure out what path a ray took from a light source in the scene to the point you're analysing, you already had to account for and calculate those things. All of that information is already baked into the ray's information.
@@louisvictor3473 That's not how it works with path tracing though; generally you won't be looking at one unique ray per pixel to the light source but a bunch of stochastic rays, because depending on material a ray may have come in via diffuse reflection or specular reflection or translucency or subsurface scattering, and directly from the lightsource or an indirect bounce from deeper in the scene, etc. So at any step of the path you'll be making more or less random choices which way to "have come from".
Thanks for the insights, Shamus. I used to watch your videos back in the day, so I'm happy to hear from you again. :)
I'm really happy these kinds of videos are back! I've missed these explanations for dumb people (such as myself). Hooray for Shamus!
People have always been on about making graphix more shiny. There was once a time, a man didn't just want it to be the shiniest. He wanted it to be *stable*, even if it means that it isn't the shiniest. That man was John Carmack, 5th dimension hyper quantum robot in a meatsuit. At a time where Ultima Underworld was shitting out semi-polygonal graphics, he made a stable raycasting engine. A year after that, nope, still no poly. He just made ***THE MOST PORT-ABLE GAME KNOWN TO MAN*** using a pseudo-3D engine.
To me, graphic isnt about looking shiny. Its about being cheap, affordable and easily modifiable/accessable
10:50 "...And You get a light source, and YOU get a light source, and YOU get a light aource..."
The funny thing is that if we could somehow harness the best of CRT tech, we wouldn't care so much about "more pixels". Digital Foundry did a video where they ran Control on a CRT at like 720p I think, maybe not even that, and the game looked gorgeous and wasn't even blurry because of how the analog signal handles pixels (casting colors through electron beams in the cathode tube). It was crystal clear and they could turn literally every setting all the way up and run the game at 120fps consistently with perfect black levels. It was magical.
"Suddenly everyone decided they wanted sixty frames per second". Dude, sixty frames per second or more should be the standard, at least.
paincult I mean I can play just fine at 30. Sure “reaction time” and all that but that doesn’t matter if everyone is at the same frame rate.
i don't understand this mindset, it seems to have appeared out of thin air and feels like more of a thing the industry is pushing people to want rather than a naturally occurring rise in standards.
@@gramursowanfaborden5820 Have you ever played a game at 144 fps/hz, then switched back to 60 or 30? I just... can't.
@@evo2542 have you ever closed your eyes, covered one eye with your hand so no light gets through your eyelid, and looked up at the sun (eyes still closed)? it upsets the colour balance in your eyes and for a while the uncovered eye will see much bluer hues than the covered eye and if you do it right you can get a 3d anaglyph.
where was i going with this... oh yeah, the effect fades with time. 30mph feels fast until you go 60mph.
I think 60 fps was a standard because CRTs were 60hz.
Solid video. Just find it amusing that when talking about types of lighting at 5:30 the photo referring to bloom lighting is a street I often frequent in Seoul.
Glad to see someone on RUclips actually excited about raytracing. Nearly everything else I've seen has been very negative and being an early adopter of the RTX card with an empty wallet, it's nice to get a little bit of reassurance that I didn't make a bad choice!
Being the first one to buy a brand new tech is always a bad decision...sry. doesn't matter if it's about TVs, cars, computers, etc.
Good for the rest of us as you are financing better technology that we can buy a bit later for half the price
Thx
@@alexsupertramp4907 I was due for an upgrade anyway and TBH, I do a lot of 3D animation rendering so I'm benefiting from it massively in that regard. Still can't believe more games don't support it yet though.
I bought it over the 1080ti because of the new technology. I'm into Graphics and I've enjoyed the RTX on the games that supported it. They are pretty good with RTX off but when I enable it I'm impressed with the increase of realism and reflections. I've always been more into "Graphics" with everything on Ultra settings, than "Framerates" as long as it is not stuttery. Even with RTX on its over 60 FPS. I don't care about price, its not like I buy one every year, its a one time purchase.
@Luden UK so far 21 games support it. Problem is finding one you're into. Too bad Mount and Blade doesn't support Ray tracing. I can imagine all the shiny armor and weapons, and the blood pools with RTX.
@@alexsupertramp4907 The problem is that if nobody buys the new tech, then there's little to no incentive for Nvidia and/or game devs to invest in it.
There's bound to be early adopters that just wants to experience next gen from an infant stage, and it's understandable.
I love channels that talk about things and then will actually create something themselves to back up what they're saying. Kudos, my sir.
5:29 I always shut this off if there's the option.
One interesting thing about baked lighting: you could have more than one baked texture, so for example, if you (as a developer) knew that a light could change between two colors, you could bake the lighting for both colors and then program the game to switch between baked lightmaps when the player changes the color of the light. Half-Life 2 did that with fire: when you extinguish a fire you'll see that the lighting changes, but it seems very sudden because it is just a switch from one map to another without any kind of blending.
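The blending that hard switch lacks is a one-liner per texel. A sketch, with lightmaps represented as plain nested lists of RGB tuples (no particular engine's format assumed):

```python
def blended_lightmap(lit, extinguished, blend):
    # Cross-fade two pre-baked lightmaps. blend = 0.0 -> fully lit,
    # 1.0 -> fire out; the hard switch described above is just blend
    # jumping straight from 0 to 1 in a single frame.
    return [[tuple((1 - blend) * a + blend * b for a, b in zip(t0, t1))
             for t0, t1 in zip(row0, row1)]
            for row0, row1 in zip(lit, extinguished)]

# 1x2 lightmap example: warm firelight fading to cool ambient.
print(blended_lightmap([[(1.0, 0.6, 0.2), (0.8, 0.5, 0.2)]],
                       [[(0.1, 0.1, 0.2), (0.1, 0.1, 0.2)]], 0.5))
```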
I've been looking forward to real-time raytracing since the 90s. That's about the time I wrote my first ray tracer, and let me tell you, it's simple. I mean ~100 lines of C code simple. It didn't look great, but the entire process of building up an image from the scene can be described in a shockingly small amount of code. It gets more complicated when you start adding reflection (which is just a recursion of the same process), refraction (again, just a recursion of the same process but the calculation of the angle of the new rays is different), and other things, but you're still not talking about much code. You just describe, in code, how light acts. That's it.
On the other hand I'm still utterly mystified by rasterizers. I've written one but the entire thing feels like some abstract geometric puzzle with fudge smeared on top to guess at what color the pixels should be. You're right in that it's a pile of hacks, even "physically accurate" shaders are sitting on a pile of hacks. It's necessary because raytracing is extremely slow, but definitely not ideal.
True realtime raytracing is still a distant goal, but it's in sight and it's very exciting.
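For the curious, the core really is that small. Here's a minimal sketch in Python rather than C - two spheres, one point light, hard shadows and diffuse shading only, rendered as ASCII brightness (no reflection or refraction, which would be the recursive step described above):

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def normalize(v): return scale(v, 1 / math.sqrt(dot(v, v)))

# Scene: (center, radius, colour) spheres and one point light.
SPHERES = [((0, 0, -5), 1.0, (1.0, 0.2, 0.2)),
           ((2, 0, -6), 1.0, (0.2, 0.2, 1.0))]
LIGHT = (5, 5, 0)

def hit_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (direction must be normalized).
    oc = sub(origin, center)
    b = 2 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None

def trace(origin, direction):
    nearest = None
    for center, radius, colour in SPHERES:
        t = hit_sphere(origin, direction, center, radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, center, colour)
    if nearest is None:
        return (0.05, 0.05, 0.08)            # background
    t, center, colour = nearest
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(sub(point, center))
    to_light = normalize(sub(LIGHT, point))
    # Shadow ray: does any sphere block the path to the light?
    # (Ignores hits beyond the light, which is fine for this scene.)
    shadowed = any(hit_sphere(point, to_light, c, r) for c, r, _ in SPHERES)
    diffuse = 0.0 if shadowed else max(0.0, dot(normal, to_light))
    return scale(colour, 0.1 + 0.9 * diffuse)

# Render as ASCII brightness, one camera ray per character.
W, H = 60, 30
for y in range(H):
    row = ""
    for x in range(W):
        d = normalize(((x - W / 2) / H, (H / 2 - y) / H, -1.0))
        r, g, b = trace((0, 0, 0), d)
        row += " .:-=+*#%@"[min(9, int((r + g + b) / 3 * 10))]
    print(row)
```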
"True realtime raytracing is still a distant goal" It depends on what you consider an acceptable result, Quake II RTX is entirely path traced in real time, minus the HUD elements and the menu (which would have no reason to), it runs at 60fps 1080p on a $700 2080; sure it's only 1ssp (if I'm not mistaken) but the denoised image looks quite good already and IMHO once you can reach 8-16spp the denoised output will look almost indistinguishable from a 1024spp ground truth.
@@Hayreddin That's interesting, I hadn't realized that Quake 2 RTX was a full raytracer, I thought it was a hybrid rasterizer and only used raytracing for lighting. I'll have to take a closer look at it.
@@uzimonkey It is :) it's on github, have fun!
@@uzimonkey yup, it's fully ray traced. Which is why performance is so atrocious.
What is a really interesting thought is how realtime raytracing could affect the cost of game development in the long run.
Excellent video! When NVIDIA first started advertising ray tracing I thought it was just "another unimportant effect" that requires a huge amount of processing power, but I gradually realized (and especially after watching your video) that everything we see on a screen can be produced by following the laws of how light behaves with different materials and objects, and this indeed can change the fundamental way that we create various visual effects. Thanks for the hard work! ; )
Phenomenal video on the topic. Speaking as a layman in game lighting processing beyond "I know what dynamic lights and static lights are," I was still able to follow this video. I know this one was more techy and less about the social and philosophical discussions of the industry or medium, but I just wanted to say that you've always been a voice of reason in the often shouty and toxic discourse, and your voice is sorely needed. I really hope this channel takes off. Also, you were one of the people who inspired me to get into writing, which inspired me to make the channel I'm slowly growing now. Glad to see you throw your hat in the youtube arena, and looking forward to your next video!
With raytracing, the problem of light transport is practically solved for light in free space, but far from over for transport in material, and computational physics is the next frontier.
Imagine: realistic blood in Surgeon Simulator 2024!
Unfortunately, gameplay is far behind graphics.
In the 80s and 90s, it was the opposite.
All I want is good carved 3D shapes in game graphics without unrealistic and disturbing corners on them. That's mostly up to game developers, not to the hardware.
The sad part is that high end video cards are marketed as though they can pull it off. Depends on what the individual wants. For me, I'll be enjoying ray tracing a few years from now. Until then I'm at 4K with 144Hz refresh, and that beats ray tracing technology for me. Reminds me of the hype over DirectX 7 and 9.
Personally, my favorite lighting and shadows were in F.E.A.R. and how they were used in that game.
I loled when you said "free". Like climbing Mount Everest and gliding off for 5 minutes is "free". It's still more pixels, level space instead of screen space, bouncing around compositing. I personally am still waiting for games to stop having trigger boxes and missions.
Well, once you get enough raytraced pixels, you can use some of them for gameplay, too. Accurate interaction with textured surfaces, accurate NPC vision, and of course freeing game designers from having to work around static geometry in general.
There is little cost difference between a boring diffuse surface or some fancy bump mapped glossy material or a light emitter/water/whatever; they all cost the same. In traditional engines the developers have to apply a light budget and material type budget; in RT there is no real cost difference between 1 light emitting surface or 5000 light emitting surfaces, given 2 scenes with the same number of triangles.
You could consider shadows, reflections, radiosity, refraction, etc, free in a path traced render engine, but definitely not for how games are using ray tracing today or in the near future. Shadows, reflections, ambient occlusion, global illumination, etc, all cost extra in Unreal Engine 4. Although I do believe I've read that ray traced shadows can actually be cheaper than high quality rasterized shadows that still look worse and have more limitations. Not free, and not cheaper to render than low end shadows, but still better and feasible to budget for.
for the designers it is free
@@SerBallister but is there no cost difference because everything is cheap, or are all the effects equally, stupidly expensive?
One correction: the shadow blur actually depends on the size of the light source and not exactly on how far away it is.
Great explanation. Feeling bad now, because i will be barred from newer games soon again :(
F
this is the way i feel right now :( my computer used to be fine 10 years ago but now im well below the minimum specs for triple-A games and i dont have the money for upgrades
@@lucywucyyy F
Arstan Boranbaev I would guess games will be similar to what they are now, with the option to enable raytracing for a while, because publishers don't want to lose revenue from people who would've bought the game
@@lucywucyyy Well, be glad that computers remain relevant that long these days. It used to be that you had to upgrade every year, two years tops, if you wanted to be able to play anything recent.
Deus Ex had at least one mirror you could see your reflection in... My take these days: the more you see, the more you realize how it doesn't really correlate with how much you enjoy it.
I hate mirrors that don't work so much.
And I am always surprised when there is a working mirror. It allows you to do so many things design-wise. Like I used it to make rooms feel bigger in The Sims 3. Mirrors have a lot of potential for level design, but now it's generally considered too much of a hassle to have them in a game. :(
Suddenly reminded of the mirror in Adam Jensen’s apartment in DXHR with a sticky note from his landlord that’s chewing him out because he keeps smashing them.
same it looks so crappy but id still rather they dont work than slow my game down or god forbid the devs decide to prioritise a working mirror over gameplay
I remember reading about ray tracing being the (non immediate) future at least a decade ago, maybe as much as two. Cool if it's finally approaching... imo the biggest thing that gives a sense of atmosphere, immersion, and believability in a game is good lighting. I think it's way more important than high polys.
I like how people call ray tracing new tech but it's actually old af.
DXR, the technology allowing realtime raytracing to be implemented into games, is new tech though
Yea, that sometimes annoys me too. Raytracing has been around for a really long time, but the use has changed over time. It has been possible to do in real-time for quite a while, but it was never fully taken advantage of until very recently with the RTX cards (just to be clear, the RTX cards are not required for raytracing)
Yup, cold fusion is old too, it’ll be annoying when we can actually do it and everyone will think it’s new. 😐
@Marco Buiks That video you linked was pre rendered. We're talking about real time buddy. Get out of the 1980s.
@Marco Buiks Lmao, point to where it said that was a real time simulation
Oh, at 5:18: AAA does still use that technique. That is actually the only way to do real-time moving shadows aside from ray tracing. They just change the organization methods and add a way to mix real-time shadows (which you calculate every frame) with the pre-rendered shadows (for things that don't move), and combine it with screen space shadows (which handle shadows at the contact point, or ambient occlusion as you mentioned, such as where your feet meet the ground).
Hello? Yes, VRChat here. I'd like to order some more MIRRORS™ as there are not enough performance-killing mirrors in my user-generated content game yet.
essentially it’s a camera
Ray marching is also an interesting point, as it achieves the effect of ray tracing but doesn't need special hardware, nor the performance hit RT has (though it still isn't as simple as normal lighting). Red Dead Redemption 2 actually uses ray marching for shadows, crepuscular rays, and rays in volumetrics.
Today I learned: Watching someone else spin around in a 1st-person game makes me sicker than playing the game myself.
me too im really gonna puke right now.
If I may simplify a very complicated process: your brain is working 'even harder' than when playing the game yourself to calculate where you are in the spinning world and make sense of it all, because it knows you aren't manipulating anything. That confusion of the world spinning while you do nothing makes your brain turn on the nausea/dizziness alarms, heh. It's bad enough when you are in front of your own screen and the world spins on a different axis than your head is at (most first-person game views spin on the axis of where the monitor is, not where your head is, in space). The brain is very smart, but gets easily confused when things don't make sense and sets off the dizzy-alarms for some people really, really quickly (as a form of self-preservation, of course).
Bonus Trivia: The Brain named itself.
Good lesson to learn!
This is a fun series you've put together. I'm glad I stumbled upon this. Cheers!
Just so you know, I played through Control with raytracing on my 1080ti. Raytracing does not need RTX hardware, Nvidia is just trying to monopolize the tech.
How were your framerates, huh?
Ray Tracing can work on any hardware, but it is so sluggish there is no point in using it.
Hence RTX. It has dedicated ray tracing hardware and performance is way better on them than on regular cards. It's playable
@@R4K1B- Never dropped below 40. I bet the game wouldn't use rtx cores even if you had them. My processor is getting old, so I would likely have gotten better fps with a better one.
@@R4K1B- as far as I know, RT games barely touch the RT cores or don't touch them at all in some cases, so it's likely more about software and general hardware optimization for raytrace calculations than specialized hardware.
There are pros and cons to fixed function hardware. The RT cores in RTX cards are fixed function with the function obviously being calculating ray intersections.
We have been able to do ray tracing on GPUs since the mid 2000s/noughties, even in real time; the problem with the latter was that the resolution wasn't competitive enough. Nvidia even had a renderer for doing raytracing on the GPU shader hardware back then. It was called Gelato:
en.wikipedia.org/wiki/Gelato_(software)
The fixed function hardware for raytracing ensures there's a baseline hardware dedicated specifically for raytracing so since it's there the game developers might as well make use of it for that specific purpose. The downside is if you don't need that function you can't really use the transistors for other purposes. On the other hand if you have general purpose compute hardware such as the more traditional 'shader cores' you can do anything on it, including, raytracing. the problem is there's so much you need to do on the hardware it's tempting to just not do raytracing at all and use it for everything else instead.
@@Megalomaniakaal The other downside to the RT cores is that it gives Nvidia yet another means to monopolize. RTX is just another Gameworks. Control proves that you can have good looking raytracing without specialized hardware. Accessibility is key to the widespread adoption of new tech, and in that regard Nvidia is screwing us all.
I wrote a program to do ray traced, motion blurred animation around 1990. It even split up frame generation dynamically amongst a whole bunch of computers.
It still took a while.
This argument is beginning to sway me, particularly with how it can potentially speed up development time.
My only gripe is that this is quite the performance hit for somewhat more accurate lighting. Battlefield 5 was one of the first to show it off and I’m willing to bet people would just rather prioritise framerate if the sacrifice on their part is so low. The games showing off raytracing atm don’t suffer without it.
Having raytracing be the only means of lighting a scene could look nice and all, and could cut down on dev time (or make them focus on other areas instead), but that would also mean less room to work with out of the gate if you dedicate a big chunk of your processing power to this feature. As much as I can’t wait for these consoles, knowing as well that at least the Series X will have raytracing support, I don’t think the consoles will be prepared for it, nor capable of running that generation’s equivalent of Doom Eternal at 4K 60fps as well.
I prefer more fps, any day
I would say the only down side is they will push the artistic touch a bit away kind of like game studios having actors do the base of a lot of the character animation with a team of animators.
this video was a shill for nvidia.. at no point did he mention the big elephant in the room.. that with all the crap he just said, a game with ray tracing is virtually indistinguishable from a game without it, at a significant cost to framerate.. most gamers, ESPECIALLY competitive ones couldn't give two shits about a .1% improvement in visuals when it costs them so much FPS
I think the misconception is that as gpu's improve over time they will have more and more performance but not in a meaningful way. 80% better performance over the previous card but 80% increased performance over 4k 144fps isn't really meaningful. Using that excess performance to power raytracing does in fact make sense though.
Great stuff! If I had the opportunity to add more, I would have added curved reflection surfaces and cube maps - and mirrored surfaces drawn within mirrored surfaces.
Lmao, when he said "let's talk about graphics cards"
and showed the photo, I saw mine LMAO
THE POTATO
I know most of the stuff you explained already, but you presented it in a very clear and entertaining way. Bravo.
This guy must have been content playing pong all his life.
Raytracing... otherwise known for the last 10 years as Planar Reflections. This is why old GPUs work with "Raytracing".
Raytracing is a shitload more than just planar reflections...
1:20.... 60?! You are missing at least another 60 fps there!
Ah, yes! Something that you will never notice unless you look at the numbers of an fps counter in the corner of your screen or use a humongous TV-sized screen.
@@1God1Fury uhmmm, You WILL notice a difference as soon as you have 100+ fps on a monitor that can support it.
Did you ever try playing on a 144hz monitor and go back to a 60hz one ?
I did it with osu!, LoL, Battlefield, Cod, Warframe, Minecraft, R6, just to name a few,
and i can't get back to playing on a 60Hz monitor, cause i literally feel the stutter while playing, no joke.
Edit: And i don't even use the FPS counter in that games, so there goes that argument as well.
@@1God1Fury lol you haven't played above 60 before have you. also what does screen size have to do with it?
Bigger screen easier to spot fps differences
@@1God1Fury What screen size are we talking about?
Well, it doesn't matter.
Quick question:
You have a 24 inch, 27 inch, 32 inch monitor and, let's say, a full-size living room TV. On all of those devices you play the same video (for test consistency) that has 5 fps. Where do you "see the lag better"?
Answer: It's all the same, no matter the screen size.
I believe you are talking about tearing or about a panel with long times to change the color (like some advertised "Gaming Monitor 1ms 27".........). 1ms stands for the time the monitor in that case needs to dim the LEDs or totally change the color, for example from green to red.
Tl;dr: You might be talking about tearing or the "color change rate", and in that case you would absolutely be in the right, because it's easier to notice pixel errors on a bigger display.
Had no idea what this game was and thought that was Black Widow walking into Magneto's prison for a second.
Nice introduction to raytracing for a general audience. Great work!
Though, as a computer graphics researcher, I have to point out a few things that are slightly wrong in your explanations. You're not to blame though; these are just common misconceptions that have spread widely.
1. You say you want to talk about raytracing as opposed to path tracing and yet you've essentially talked about path tracing in the end. The difference between raytracing and path tracing is essentially: RAYTRACING is the general tool to answer the question "what point is visible from an arbitrary point in an arbitrary direction" and then being able to repeat this question for whatever point is the answer. PATH TRACING uses this technique to compute how light spreads throughout a scene, bounces around and interacts with surfaces. So what games are using, when they compute reflection, refraction, shadows and GI, is essentially some form of path tracing.
2. The computation is not less accurate because you start at the camera as opposed to the light. The computation is exactly the same. What you end up with is what is called a "Light Transport Path" which connects the camera with the light source. THEN you can compute what amount of light is traveling along this path. The direction in which you compute this path is completely irrelevant to the result. It is just computationally WAAAY less expensive to start at the camera (in most cases, not always though).
3. DOOM 3's shadows weren't sharp because of the lack of indirect illumination, but because they used a new technique to compute shadows that just couldn't handle blurry shadows.
4. Shadow maps don't use the silhouette of the objects from the point of view of the light source, but the distance to the light source as seen from the point of view of the light source. Hence you don't project the silhouette onto the ground but compare the distance of a given point to the light source with the distance that is stored in the shadow map, to figure out whether that point was visible from the light source. I assume you kind of knew this since you showed the wikipedia article about shadow maps and just wanted to keep it simple for the viewer. (There's a small sketch of that comparison right after this comment.)
5. The computation of the phenomenon of indirect illumination isn't called "radiosity rendering". Radiosity rendering is a very particular algorithm to compute diffuse indirect illumination. There are many others though, and the one that is used these days (path tracing) works very differently.
6. The main benefit of ray tracing, and the reason why there is imho no doubt that raytracing will definitely be adopted by everyone in the games industry sooner or later, is that it tremendously reduces the complexity of modern game engines (at least the graphics part, obviously). You pretty much mentioned this very clearly. But the important detail here is, as always: MONEY. Reducing the cost of developing and maintaining the engine, as well as reducing the workload of artists, is a HUGE incentive for game companies, especially greedy management.
Again, great introduction though. And certainly better than the common "hurr durr nvidia just wants our money, RTX is a scam, games just run worse and look just as shitty ...".
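The depth comparison from point 4, in sketch form; `to_light_space` is an assumed helper (not any real API) that projects a world-space point into the light's view and returns texel coordinates plus the point's distance from the light:

```python
def lit_by_shadow_map(point, shadow_map, to_light_space, bias=1e-3):
    # Project the point into the light's view, then compare its distance
    # to the light against the distance the shadow map recorded for that
    # texel. The small bias avoids "shadow acne" self-shadowing.
    u, v, depth_from_light = to_light_space(point)
    return depth_from_light <= shadow_map[v][u] + bias

# Toy 1x1 shadow map that recorded a blocker at distance 4.0.
shadow_map = [[4.0]]
project = lambda p: (0, 0, p[2])     # stand-in projection for the demo
print(lit_by_shadow_map((0, 0, 5.0), shadow_map, project))  # False: shadowed
print(lit_by_shadow_map((0, 0, 3.0), shadow_map, project))  # True: lit
```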
But nvidia did screw us over by using this technology to justify a price increase right after the crypto boom screwing up graphics cards prices.
@@ChucksSEADnDEAD you didn't watch the video, did you?
@@AndrianHendrawan I was replying to the ""hurr durr nvidia just wants our money, RTX is a scam, games just run worse and look just as shitty ..."" strawman. The fact is that nvidia took advantage of a GPU drought to release a half-baked technology for an outrageous price. Nothing about the video deals with that subject so even though I did watch it, it was not required to respond to the strawman.
Postal 2 had two-way mirrors in 2003, and Dead Rising also had amazing reflections. We don't need such things for reflections; not saying it's not useful.
At last, someone who knows what he is talking about! I'm so annoyed by all those people saying "that raytracing effect eats a lot of frames without adding much". It is not an effect! And as soon as we get rid of all that rasterization mess, everybody will know that! I implemented a raytracer a few years ago, so I know what it is capable of and how beautiful RT is to work with. Just a couple corrections: it is path tracing (or a similar technique) we aim for, because raytracing alone can't provide GI, and tracing from the eye is perfectly fine due to the bidirectional nature of the BRDF and will not impact image quality at all. Great video; I wish you had explored a bit more how complicated very basic effects are with rasterization and how easily they just emerge from RT, because the positive impact of RT on engine implementation cannot be overstated. Great work!
8:25 No, they don’t change in wavelength. Coloured objects simply absorb photons of certain wavelengths, and reflect others. Bounce red light off a green object, and the light doesn’t turn green, it simply gets predominantly absorbed and the green object looks dark.
I'm super happy the crypto binge happened over the last 5-10 years or so, and is now seemingly going away. I effectively barred myself from upgrading my graphics card for about 7-8 years (ish, not exactly sure), and now, when I'm about ready to bite the bullet and get one, this stuff is coming in and cards are the cheapest they've been in ages. Brilliant plan, me.
Guy Incognito well except they aren’t 😂
I'd noticed the mirror thing in the last decade or so, but I'd never really thought about it. Thanks.
Whether or not end users adopt advances in graphical technology has a lot to do with how much they value realism in game graphics, and I use the term "realism" specifically because many of these advances don't necessarily make games look that much better. Realism is just one style out of many, and there are lots of games where hyper-realistic lighting would actually detract from the game's tone, including one you mentioned: Doom 3.
I think game programmers and designers have a tendency to get hung up too much on details. In the real world, if you look at a grassy meadow you can see tens of thousands of individual blades of grass swaying in the wind. A game would probably treat the whole thing as one big, swaying object, like a massive carpet made of grass, but is that a bad thing? Would it be worth the trouble to have an individual grass-blade renderer? Would it even be noticeable to the player? I would err on the side of "no" for all three of these questions as far as most players are concerned.
There's certainly a lot of potential in ray tracing, and I can appreciate it aesthetically when it's used in something like Minecraft, but if I were actually playing this is exactly the kind of thing my mind would overlook. I find games immersive when they're fun and have a strong atmosphere; graphical realism is a secondary consideration. If the PS5 does support ray tracing, it may end up like the PS3's Blu-Ray functionality: an expensive gimmick appreciated by only a few.
As someone who works as a game developer... If this takes off, my environmental designers will finally have enough free time to spin up some really, really cool shit. They spend SO MUCH TIME setting up tricks for the most rudimentary effects, like reflective puddles, swinging lanterns and so on. We've reached enough graphic fidelity to last us a couple decades, and we really have to focus on quality of life improvements, because making graphics got insanely complex in the last two decades.
ahh the beauty of anti-aliasing in the Source engine when it worked as it should, not like today's modern titles where you turn AA on and it's either blurry/muddy or still jagged.
I'm one of those gamers that was heavy into "Graphics". This to me is great and makes a big difference in games that support it. I don't care about 500 fps at low settings; I like to turn everything up to Maximum Ultra and enable everything offered: realistic hair, shadows, anti-aliasing to the max, etc. When it looked choppy I'd start turning things down. Now computers and graphics cards are so powerful I can skip the "turning down" process even with RTX on and still get 60 fps. It is awesome looking. If for some reason I wanted or needed more fps, I can just enable DLSS with just a tad loss of quality, but I have yet to need more FPS. As far as price, I don't care one bit. It's a one time purchase for years to come.
About the shadows: there was a game named "Severance: Blade of Darkness" in 2001. The shadows there were AWESOME!
Oh yeah, that game was very impressive; it had a great fighting system with combos
Him: Talking about how revolutionary and simple ray tracing is but requiring the investment of $1000s and adoption from game studios
Me: I'll see it in a Bethesda game 30 years from now.
With the same core of their proprietary engine.
We may actually have 16 times the detail in 30 years.
@@semanticsamuel936 With any luck Bethesda may even find a way to make their games "just work" in 30 years.
Drop shadows are still a useful feature in some games, namely platformers, where they can give you a good idea of what exactly you're above, which is important for seeing where you'll land.
What about HDR? HDR has been added in the last few years.
I loved it but I wish the video corresponded to what you were saying better. The Deus Ex shadows worked nicely and the Source radiosity room demonstration did a great job illustrating what you were saying - even better than the relevant screenshot in the written article. But for the most part it was just footage of Control. A very pretty game but it felt like filler here.
I can imagine Jerry Seinfeld speaking the title.
I always thought the way it was done in the Silent Hill series was the best, where they just baked the shadow textures onto the models. If you actually pull up those old models, the UV wraps are beautiful paintings, and combined with the backgrounds you have extreme depth of detail with hardly any real-time rendering.
What's the Deal With Raytracing? Starting 07:57 you'll find out. Before that is a primer on the history of video game graphics.
omg thank you. youtubers just love to hear themselves talk
@@gamertechkid1490 he's giving context
As a 3D artist I was sooo hyped for RTX!
Just to find out the technology would be developed for games first, 3D production last.
Sure, most render engines "support" it.
With half their feature sets missing :/
I have thought for a while that the whole "RTX" launch as a consumer device feature was premature and really just about capturing some initial branding mindshare.
8:24 "Change in wavelength" -> What?
That's not how physics works.
What's wrong? Perceived color depends on the wavelength of photons. I can agree that it's not changing; the spectral composition of reflected light changes when some wavelengths get absorbed more than others.
@@friendlylaser If you agree what's the problem?
@@martixy2 Well, that's just kind of nitpicky.
Well actually, there's this effect called the 'Doppler effect', which does influence perceived wavelength.
In light it's basically negligible in normal scenarios, but just fyi :)
EDIT: There's a different name for it, ' shift', though I have forgotten the term.
@@pauladriaanse what you are referring to is a light shift, and it's when the wavelengths in light are stretched out during the process of getting from one place in space to another. EX: a star moving away from us while we move away from it has its light stretched and shifted to red, while if it moves closer it shrinks toward blue.
One thing people sadly forget is how sensitive the human visual cortex is to tiny changes in lighting. Lighting is actually the most complicated and most important part of a scene.
Sadly, pathtracing performance is not independent of scene design or the number of light sources. It is too expensive to do caustics with unidirectional ray tracing alone, so you have to do it bidirectionally. Moody scenes where a half-open door allows the sun to illuminate a warehouse are way harder to do than simple scenes under the sky. You need way more rays to statistically hit the light source enough. If you look at ray tracing implementations in current games, this is where you see more noise, because it can't be done in time for the frame.
Raytracing is part of the future, but Nvidia's RTX will be completely irrelevant by 2021
This, the party might be over next year even as Raytracing is already working on non RTX cards and AMD chips.
@@Tuberuser187 It always was working on other cards and cpu's. Quake Wars could do raytracing.
Thanks! As a physicist with a side-interest in video-games I have been wondering about this for years, since I realized that ray-tracing is the natural thing to do but would be too computation-heavy.
A small amendment: photons don't usually change wavelength, unless we are talking about fluorescence or non-linear optic effects, both of which are rather rare in everyday scenes. The apparent change in color comes from the fact that white light contains all wavelengths of photons and a colored surface absorbs some of them and reflects the rest.
The complicated part here is not the physics, where each wavelength just behaves normally, but our eyes, which are really screwed up in terms of color perception because of the three kinds of color-sensitive cells on our retinas. That is where all the weird color-mixing effects happen which you study in art lessons.
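Renderers model exactly this absorb-and-reflect behavior with a per-channel multiply, which also matches the "red light on a green object looks dark" example from earlier in the thread. A minimal sketch:

```python
def reflect_colour(incident_rgb, albedo_rgb):
    # Per-channel multiply: no photon changes wavelength, each band of
    # the spectrum just loses energy independently to absorption.
    return tuple(i * a for i, a in zip(incident_rgb, albedo_rgb))

red_light = (1.0, 0.0, 0.0)
green_wall = (0.1, 0.9, 0.1)
print(reflect_colour(red_light, green_wall))  # (0.1, 0.0, 0.0): nearly black
```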