Baked lighting sucks and is the reason most games have lifeless environments: they can't be changed organically without slaughtering the lighting.
@@eineatombombe Yeah, but if we're going to start innovating games and making actual unique experiences, we're going to need to phase out support for old hardware. I've been gaming since '97, boy! There has been so little actual innovation in games over the last decade, even though engine innovations have been huge.
I am not a fan of "modern games" where the ground always looks like it's been waxed over after 5 coats of gloss epoxy, then finished off with a fresh coat of baby oil. Same for all the character models: they all look like they're covered in suntan oil, even over their clothing. Who thinks "realistic" looks like this?
Nobody. Artists are more and more using real-world (!) values for the smoothness of objects, and this is exposing how bad game reflections are. The "shiny" 8th-generation look was all about using screen-space reflections to finally be able to use realistic smoothness on materials. Unfortunately, SSR has many drawbacks. In every situation where SSR breaks down, you're left with strangely "glowing" or "waxed" surfaces that stand out and ruin your immersion. The worst places are often indoors, where there isn't much light around: things should be dark, but instead they show bright reflections (local cubemaps or the skybox). Ray-traced reflections are the only solution, but due to high performance costs, RT reflections end up being BLURRY or UNSTABLE. Both of those problems immediately stand out as unrealistic.
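The SSR failure mode described above can be sketched in a toy way. This is purely illustrative Python (the 1D "screen", function names, and the toy hit test are all made up for the example, not engine code): SSR can only march through depth data that is on screen, so the moment the reflected ray leaves the screen bounds there is nothing to sample, and the renderer falls back to a cubemap/skybox color, which is exactly the out-of-place bright reflection this comment complains about.

```python
# Toy 1D screen-space reflection march over a depth buffer.
def ssr_march(depth_buffer, start_x, step, max_steps=64):
    """March along x; return the hit index, or None if the ray leaves the screen."""
    x = start_x
    for _ in range(max_steps):
        x += step
        if x < 0 or x >= len(depth_buffer):
            return None  # off-screen: SSR has no data here
        if depth_buffer[x] < 0.5:  # toy "surface hit" threshold
            return x
    return None

def reflect_color(depth_buffer, start_x, step, scene_colors, fallback):
    """Use the SSR hit color when available, else the cubemap fallback."""
    hit = ssr_march(depth_buffer, start_x, step)
    return scene_colors[hit] if hit is not None else fallback

depth = [1.0] * 8 + [0.2] * 2          # a "wall" near the right screen edge
colors = ["sky"] * 8 + ["wall"] * 2
near_wall = reflect_color(depth, 0, 1, colors, "cubemap")   # ray hits the wall
off_screen = reflect_color(depth, 9, 1, colors, "cubemap")  # ray marches off-screen
```

The second call is the problem case: a dark indoor surface whose reflection ray exits the screen gets the bright cubemap color instead of darkness.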
LOL. You are dead on. I was laughing my butt off at the tech demo for Unreal 5.4! MORE REALISTIC? WHAT? Peeps still look like 8-year-old tech IMHO! Advanced cartoons. Honestly, going back to the giants, like Crysis, and comparing them to modern games? NOT THAT IMPRESSIVE vs. what it should be! We are sold snake oil! The obvious issue is gaming consoles, and average-income folks can't buy $1500-2000 USD cards, let alone the $2-3k for the latest CPU/build/memory. Oh, and I didn't even mention a modern OLED 8K monitor. So, most likely, devs just aren't at a point where they can spend the time needed to create REAL LIFE experiences. VERY few could ever enjoy that level of tech. Agree about gloss. Devs think adding reflections is the only way to make a game look better. Pretty silly. Ever notice all the car racing demos are in the rain? LOL. Even GTA V/VI, always in the rain to exaggerate reflective surfaces. Maybe it's just me, but I can't remember the last time I went out in the rain and noticed everything being super reflective. ;)
@@Curious_Skeptic Oh yeah! Don't get me started on every E3 demo being set right after a fresh rain! But yes, it gets to a point of diminishing returns. Especially when 80% (or more) of the game market is fine playing games like Minecraft or Stardew Valley. There really is a very, very small market of graphics geeks out there, and they are simply just a super niche market (always have been). I'd say most ppl could go the rest of their lives with games at Xbox 360-level tech and be happy lil clams.
I think it depends on the game. Metro Enhanced on PC, Cyberpunk with Psycho RT, and Dying Light 2 are a few games built with great implementations; I would notice if it were turned off and wouldn't want to go back... but for the vast majority of games it seems to just be bolted on after the fact...
Path tracing in Alan Wake 2 and Cyberpunk makes an insane night-and-day difference. Path tracing, especially in CP2077, adds a whole new dimension to the immersion.
@@AdvancedGamingYT I think you need at least a 4070 Ti for that (or an equivalent AMD card). And I still play with ray tracing off, as it's just not worth the frame rate going in the toilet. I have a 4060 Ti with 8GB VRAM (came factory equipped with my PC).
Ray tracing is in a bad spot right now: there's not much improvement in quality while it costs half your fps. Path tracing is so much better than RT, but of course, because of that, only high-end cards can do it.
I think you are missing the point of why ray tracing is so nice as a goal. Yes, you can get really nice graphics by "cheating", but that "cheating" is really hard to get right. Plus, like others mentioned, reflections just don't work without ray tracing. Open any game (God of War on PS5, Cyberpunk on PC, anything), go to a body of water and move your camera up and down. You will see a really nice reflection of the sky, but if there is a building or mountain, you will see only reflections of the part of it that you can actually see in camera. For me it's one of those things I cannot unsee, and it takes me out of the immersion, because I realise it's "just a game". That is fixed with ray tracing (even in Cyberpunk). Also, maybe you are too young, but I remember when the first cards with shaders were released. They were way more expensive than cards without them, but games looked so much better with them. Look up the first Far Cry's water with and without shaders. Ray tracing is the same IMO: it will be everywhere once the hardware catches up. It's just a much bigger and harder problem to solve, so it will take more than one generation to get there. Once all the engines like Unity and Unreal implement all of this as a tickbox in the game settings, it will be a smooth ride from there. Unreal gets really close with Lumen (it is still not full ray tracing as far as I know), but CD Projekt Red will put their path tracing into Unreal. And they will make their next game in Unreal thanks to that. Be patient; just because you cannot see the point doesn't mean it's the wrong direction to go.
Good, it should be hard to get right. I'm tired of this lazy-ass implementation devs are using to excuse poorly optimized games just so they can say "it has raytracing!!!"
2:17 Dude, c'mon, I can't even see the shadows on that. I know Elden Ring's RT shadows are useless, but it's just FromSoft saying "we can do RT too, and we are improving our engine".
Ray tracing can be absolutely great if implemented properly. I'd say the games that benefit from it the most are open-world games or games with lots of dynamic lighting, as baked or probe-based lighting struggles with large, open-ended environments, especially ones with lots of occlusion, like cities. Of course, sci-fi games work great with the reflections, and games with dark interiors can really benefit from the shadows and ambient occlusion brought by ray tracing.
I don't disagree, but it's interesting to note that ray-traced ambient occlusion and real-time shadows can be done quite well without even needing to use hardware RT (as seen in some Unreal Engine 5 games, and also in Alan Wake 2, among others).
IF is the keyword here. Mostly it's badly implemented, still relying on rasterized (mediocre) graphics with a disgusting amount of post-processing, but bumping system requirements up to RTX 4000 level for ~1080p with DLSS and frame generation enabled. But hey, it's super advanced and totally worth buying a $1000+ GPU!!1 And it doesn't matter if it looks worse than some 5-7 year old games from the PS4 and Xbox One generation.
And for destruction: interior lighting never looked that good in Battlefield, since it's a hard process to update lighting at that scale, unlike in The Finals.
Ray tracing is a game changer when we're talking about realism, but when we're talking about "looking good", ray tracing can't compete with a good art director and good color grading...
Real-time ray tracing is more for devs: it makes it easier to set up scenes (and materials) and have them look good without resorting to a lot of tricks and fakes. It will eventually be commonplace.
Bla bla bla, "makes it easier for devs". No, just no. It makes it LAZIER, letting devs put less effort into adding materials and lighting to their games.
@@serily4524 Bullshit. Making it easier for devs does not translate to "laziness". It translates to more and bigger levels done in the same time. Instead of spending their time trying to optimize BS like shadow map resolution, devs can spend their time making scenes that look more realistic, with better prop placement and stuff. They can focus on artistic details instead of technical BS. People who claim it makes things "lazier" are ignorant fucks who view the world with horse blinders. Everyone who actually works in the industry realizes why RT is needed to evolve gaming.
Who cares? The grass isn't dynamic. It's just a static piece of grass in the field of view that has zero interaction with your character, sword, or anything else in the game.
@@kevinerbs2778 Yeah, I can't believe that the last major innovation in grass interactivity was being able to burn grass to your advantage in Far Cry 2, way back on 21st October 2008. I'm pretty sure the last good single-player FPS from a AAA studio was either Titanfall 2, the MW1 reimagining in 2019 (purely due to that covert mission in a house), or Doom Eternal. The rest of the current good FPS games are indie breakthroughs, Doom-likes or immersive sims that focus more on "fun" in the gameplay, like Ultrakill or Dusk or Gloomwood or something.
DEV POV: Without ray tracing, it's a lot of work to get shadows and reflections into your game. You have to render the scene to a depth buffer; for point lights, it's a cube array of textures. Each "cube" needs 6 faces, so you have to render the scene 6 times to cover everything from the perspective of the point light. If you have 10 point lights, you render the scene 60 additional times, lol... Of course culling helps, but not perfectly. We can then use this depth texture, which is just a grayscale texture, to find out what is in shadow or not. This is called shadow mapping, and it's how it's been done for decades... It's crap, it's fakery and a lot of wasted energy, so devs opted out and instead bake ray-traced light and shadows, at least where possible. The same process is also done for reflections... You can see the incredible, astonishing amount of GPU power wasted on simply re-rendering the scene to a bunch of textures, which at the end of the day does not give you perfect shadows or reflections. Ray tracing solves all of this shittery, and I couldn't be any happier. You don't see much difference at the moment because most games ship with baked ray-traced lighting anyway. But rest assured, in a few years the hardware will be able to path trace in real time, and those shadows and reflections are oh my, oh my, sweet looking.
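The pass-count math in the dev POV above can be written down directly. This is a hedged, toy Python model (invented names, not a real renderer): each point light's shadow cubemap costs 6 extra scene renders, and the shadow-mapping test itself is just a depth comparison against the value stored in the map.

```python
# Rough cost model of point-light shadow mapping, as described above.
def shadow_pass_count(point_lights, cube_faces=6):
    """Extra scene renders needed to fill depth cubemaps for all point lights."""
    return point_lights * cube_faces

def in_shadow(depth_from_light, stored_depth, bias=0.005):
    """Classic shadow-map test: the fragment is shadowed if something
    closer to the light was recorded in the depth map (bias fights acne)."""
    return depth_from_light - bias > stored_depth

passes = shadow_pass_count(10)   # 10 point lights -> 60 extra renders
shadowed = in_shadow(0.8, 0.3)   # an occluder sits at depth 0.3
lit = in_shadow(0.3, 0.3)        # fragment IS the nearest surface
```

The small `bias` term is the kind of per-scene tuning knob ("shadow acne" vs. "peter-panning") that ray tracing removes entirely.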
@@ibrenecario7357 Yes, because making good-looking games with RT is much easier and takes up less disk space. Furthermore, RT is the only way to achieve realistic lighting in dynamic scenarios such as destructible walls. Which means devs can make their games more dynamic, faster, and with bigger or more levels than before. It matters A LOT.
The only case where RT has juuuuust managed to become good would be full path tracing in CP2077 with DLSS frame gen, upscaling, and ray reconstruction on a 4090. Anything before that is just a side-grade compared with rasterization: slightly better (pick one of) shadows, global illumination, or reflections, but much worse performance, at twice the price, obviously. I've gotta say, though, that sometimes I have to really examine the image even in CP2077 to tell the difference between RT and raster, because both look pretty good.
And then you are trading fake frames and fake resolution for real-ish lighting, whereas rasterization was trading fake lighting for real frames and real resolutions, plus the awesome performance optimizations made possible by skilled devs learning all the rasterization tricks. In fact, you can still use fake frames and fake resolutions with rasterization if you want even more performance. Looking at it that way, ray tracing is still a lose-lose situation in my mind. Give it another 10 years and it will be incredible. For now it's a feature I always turn off.
I don't get it, why do people say "fake frames" and "fake resolution" are bad? If you can't even tell the difference between fake and real (other than the input delay of FG), I don't see a reason why it's considered bad @@socks2441
00:00 🕹 Ray tracing was introduced 4.5 years ago with the 20-series graphics cards, initially facing performance issues.
00:41 🎮 The RTX 4090 and 4070 improved ray tracing capabilities, but the speaker still finds ray tracing implementation lacking in many games.
01:23 🖥 Some games exhibit poor ray tracing implementations, with Elden Ring and A Plague Tale: Requiem only featuring ray-traced shadows, making the difference subtle.
03:33 🎮 Games with full ray tracing support, such as Fortnite, Hogwarts Legacy, The Witcher 3, and Cyberpunk 2077, are compared, highlighting varying visual preferences and performance impacts.
07:19 🚫 Poorly optimized games like Starfield, Jedi: Survivor, and The Last of Us Part I on PC, despite having ray tracing, suffer from overall bad performance, making ray tracing irrelevant.
08:15 💡 The speaker concludes that while ray tracing technology is cool, its current state, combined with inconsistent game optimizations, makes it not worth using, suggesting a need for more uniform and effective implementation across games.
Summarized by HARPA AI
It's dead because there's a newly discovered method that is based on rasterization, which is funny; it was invented by a Path of Exile developer who is also a physicist.
It's not the direct lighting that makes the difference; software has done a really good job imitating direct lighting. Ray tracing picks up the slack in reflections of objects projecting that ambient lighting and, more impressively, the compound ambient lighting they cast across a scene: seeing full objects in water puddles or glass like in real life, having the hue of reflected light projected onto objects in the room based on object color and texture or finish... Ray tracing allows for this stuff much better than artificial software lighting. Some games are going to do it better than others, and sometimes the interaction between the game and the card may affect the quality of the effect, but to say it's bad is BS. It's a marvel of engineering that it can do it at all, not to mention it's getting better as AI cores learn to interact with RT cores, or whatever garbage AMD is trying. The other thing that bothers me about your argument is trying to show us RT in a static image that most certainly lost a lot of its pop being compressed for YouTube consumption vs. the native screen you are viewing it on. Dude, I'm on a 1440p monitor watching a video at 1080p that I can only window, because full screen makes it look like dog water due to interpolation making up info on basically every second pixel. But it's a static image; RT is dynamic and would lend itself much better to a rendered scene than a photograph. That's kind of the point...
@7:08 I think you explained exactly why most games have bad ray tracing implementations. It's not that ray tracing technology in general is bad; it's that it's still very heavy for today's GPUs, so most devs will only implement partial feature sets, and even then they will still bake out a ton of the ray-traced lighting for the rasterized version of the game, so the comparison from a visuals perspective is a little unfair.
@@DrRhyhm No, it is heavy. Try running full RT in CP2077 with a 3070 and see how it does. Probably by next gen, when the 70-class card can match today's 4090, is when you can easily do path tracing.
@@Unicornpirate Gonna be waiting 2 generations for a 70-class card to match a 4090. The 5070 will probably be 25% slower than a 4090. The 4090 was such a huge jump.
Path tracing is the future and will eventually replace rasterization in most games (currently it's a mix of ray tracing and rasterization) now that there's enough hardware power to simulate light the 'real' way. Note that rasterization has for decades been essentially a bunch of hacks to make lighting look increasingly realistic without actually simulating it as it works in the real world, with photons bouncing around in real time.
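A hedged illustration of what "simulating light the real way" means in practice: path tracing is Monte Carlo integration, averaging many random light samples instead of relying on a precomputed hack. This toy Python sketch (illustrative only, not a real renderer) estimates how much "sky" is visible past a blocker; the estimate converges with more samples, which is also why path tracing is noisy at low ray counts and needs denoising.

```python
import random

def sky_visibility(samples, blocked_fraction=0.5, seed=1):
    """Estimate the unblocked fraction of the sky by random sampling."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        direction = rng.random()           # pick a random "sky direction" in [0, 1)
        if direction >= blocked_fraction:  # this ray escapes past the blocker
            hits += 1
    return hits / samples

noisy = sky_visibility(16)          # few rays: a grainy estimate
converged = sky_visibility(100_000) # many rays: close to the true 0.5
```

Real path tracers do the same averaging per pixel, per bounce, in 3D, which is where the enormous GPU cost comes from.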
The only real point of ray tracing is saving the devs some work. Real-time GI has been possible since Crysis 3, and real-time shadows and lights since Doom 3 (though S.T.A.L.K.E.R. is a great example too). And no one cares about physically correct reflections when you have really good SSR plus good cubemapping. Ray tracing offers real differences only in the details, if the game already has great lighting and great AO. I think we need devs who are also gamers more than we need ray tracing.
For me personally, I don't think it makes any sense to run ray tracing. Even if it looks a little nicer, it's not worth the hit you take in performance. My 7900 XT kills it at 1440p without ray tracing, and I wouldn't change a thing.
Right, you sometimes drop 50% or more performance. I just did a side-by-side comparison in CP2077, and honestly, I can hardly tell the difference. The thing is, for decades we couldn't get "real light" in games. So we got really good at faking it, and now it's too good to really improve on. Honestly, just because we CAN make light super realistic doesn't mean we should. There is more than one case where I think RT makes things look worse. Metro Exodus is the only game where I see a marginal difference with RT on, and even then I think certain scenes look better with it off. But overall I think it's an improvement.
@@devinkipp4344 You're right about Metro Exodus. I ran that on my 3070 Ti and the difference is quite telling, and it's the only game I've run with ray tracing on that didn't butcher the frame rates.
We've gotten so good at faking real light in games that the jump to ray tracing sometimes seems very small. But I'm still looking forward to every game just natively running with ray tracing, it becoming a staple, and thus being well implemented. It just looks stunning done right, but a lot of companies seem to tack it on so they can say they've got it, or to get some sponsor money from Nvidia. The difference is kinda sad most of the time, honestly, but games like Minecraft or Cyberpunk really showcase how much of a difference in render quality ray tracing can make.
This YouTuber with the low video count clearly hasn't watched Nvidia's own video showing ray tracing in still images, zoomed in on car reflections... Nvidia's most powerful GPUs were doing 1080p 60 fps on the most expensive, current tech, for differences even less noticeable than in this video of Battlefield V... Alan Wake 2 with EVERYTHING maxed at 4K on the most expensive tech, a 4090, gets a sad 32.2 fps average (no upscaling) (TechPowerUp). The VRAM for Alan Wake 2 at 4K ultra minus ray tracing is 9485MB; at 4K max with PT/FG it's 17807MB. That is a SMALL-scale game. A game like GTA 6 could easily want 23+ GB, since it's not set in a tiny town but in a city with tons of lights and people, one you don't walk through like Alan Wake but drive/fly through... Digital Foundry's video on ray tracing in what is now a SUPER old game shows how taxing it is, and it would hit single-digit fps with path tracing: ruclips.net/video/aUPwRyS15kY/видео.html&ab_channel=DigitalFoundry (1.3 million subs)
After testing RT, I always end up turning it off and preferring to increase the resolution instead. It ends up looking nicer than turning on RT and having to use upscaling methods like DLSS/FSR to achieve a playable framerate...
I think ray tracing could eventually become a fundamental feature that's always on, once GPUs get powerful enough and devs find a way to optimize their games better, but it will still probably take at least another 2+ GPU generations, at the least, IMO.
I often find myself testing this thing and going back to my usual settings without ray tracing. I don't miss anything, really. I try to convince myself, and it never happens.
That's because a lot of games only implement a small fraction of what RTX has to offer, so it's not as noticeable. A game like Cyberpunk, where everything can be ray traced, is very noticeable.
I'm sorry, but the SSR versions of reflections are quite asinine. Ray tracing does make a difference for me in shadows and reflections; the game world feels less fake and more natural and dynamic now. The main thing I want now is an increase in draw distance, because objects still pop in, and that takes me out of the immersion pretty badly, just like the ghosting and artifacts of SSR do.
Ray Tracing Deception. Here's the real deal about Nvidia's promoted RTX tech:
* The most visible effect is a massive hit to FPS
* Path tracing proved that ray tracing had been flawed all along
* Ray tracing is only partly used; games still rely on ambient occlusion for the most part
* Nvidia lists only _five_ games (two of them being Minecraft) with "Full RT" implementation after many years of marketing it
* It doesn't bring anything "new", since effects like reflections/lighting were already there
* Their marketing says "your eyes will immediately tell the difference", so who is going to admit he is "blind"?
* It's Nvidia's new "PhysX gimmick" to make easy money with overpriced GPUs with ridiculous power needs
There are some radically new gaming technologies that do improve the experience, like HDR screens, G-Sync/FreeSync, DLSS etc., but Nvidia's ray tracing is not one of them. The prime example of a ray tracing implementation is Cyberpunk 2077. It was nicknamed "Cyberbugs 2077", and ray tracing was at first very badly implemented (excluding AMD support etc.). The game was delayed and buggy, most probably because too much time was spent trying to get ray tracing working. The game was obviously built solely as a ray tracing "showcase", since the lighting is just horrible without ray tracing. Thus the comparison "ray tracing ON vs OFF" in Cyberpunk 2077 is not valid or fair. It was always meant to work only with ray tracing and shouldn't be played without it, but who can really play it with full ray tracing on? Cyberpunk 2077 with an Nvidia RTX 4080: ~110 FPS (RT OFF), ~50 FPS (RT ON) and ~30 FPS (RT with path tracing ON). ruclips.net/video/7NQqWlYZ2RA/видео.html The new path tracing implementation revealed how poor Nvidia's ray tracing actually was. ruclips.net/video/FjhQiMtTz3A/видео.html Path tracing, or "fully implemented ray tracing", wasn't available at the launch of Cyberpunk 2077, because no GPU could really handle it.
The truth is that when they fully implemented it (as path tracing), it dramatically changed the lighting, and it started to look like when RT is OFF! I remember watching an older video about ray tracing in Cyberpunk 2077, and the whole point was to show how incredible the lighting now is inside a car when RT is ON. However, when the path tracing implementation came out, the lighting inside the car totally changed and is now close to the old ambient-occlusion style! I feel sorry for the players, the developers, and the competition for swallowing the bait. Ray tracing has not been and will not be a game changer for a while, at least not this kind of ray tracing. It's not what it's been marketed as. PS. I do have an Nvidia GPU.
You *can* draw good shadows with shadow maps. You *can* draw good reflections with screen-space techniques and a cubemap fallback. You *can* emulate global illumination with a wide range of raster graphics techniques. RT just takes you from 75% to near 100% accuracy, assuming the devs did a good job implementing it.
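The "screen-space techniques and cubemap fallback" combo mentioned above can be sketched as a simple blend. Illustrative Python only; the function and the confidence term are invented for the example: the idea is that engines fade the SSR result into a baked cubemap near screen edges and at SSR misses, so the seam between the two techniques is hidden.

```python
# Lerp between a screen-space reflection color and a cubemap color.
def blended_reflection(ssr_color, ssr_confidence, cubemap_color):
    """confidence = 1.0 at a clean SSR hit, 0.0 at screen edges/misses."""
    t = max(0.0, min(1.0, ssr_confidence))  # clamp to [0, 1]
    return tuple(s * t + c * (1.0 - t)
                 for s, c in zip(ssr_color, cubemap_color))

clean_hit = blended_reflection((1.0, 0.0, 0.0), 1.0, (0.0, 0.0, 1.0))  # pure SSR
total_miss = blended_reflection((1.0, 0.0, 0.0), 0.0, (0.0, 0.0, 1.0)) # pure cubemap
```

This blend is exactly where the raster approach stops short of RT: the cubemap half of the mix is baked and can't know about dynamic objects.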
Rasterization already produces results that are decently realistic; ray tracing just improves some parts, mostly reflections and global illumination. If you compared a reflective surface, you'd see a lot of difference.
@@reinhardt3090 That is straight up not true. Cyberpunk's ray tracing does ambient occlusion and reflections in regular mode, and full-on path tracing in Overdrive mode, where every light source casts a shadow and light actually bounces around in interiors, something not possible in rasterization. Hopefully that tech becomes more accessible with time, because it makes Cyberpunk actually look like the best game graphically ever created, especially with the character lighting.
One of the problems with ray tracing is that reflective surfaces always reflect in an almost perfect way: extremely crystal clear, never blurry, which is not, in a real sense, always accurate.
@@kislayparashar CP77's PT mode completely changes the rendering algorithm to a path tracer. It isn't just "this and that"; it is literally the entire scene, every pixel, rendered with ray tracing on steroids.
The Witcher 3 can show a massive difference due to global illumination when RT is on, but this is heavily scene-dependent. Forests or indoor areas show a substantial difference between non-RT and RT, but in a flat scene such as this one you'd barely notice it. The RT looks nicer, though, all the time, even when it's subtle, because the non-RT lighting can have an almost blueish hue, especially in the shade, while with RT on you get the color of the surroundings reflected onto Geralt and the geometry around him, sort of "anchoring" everything nicely so the scene feels more cohesive. It's difficult to describe, but it's very noticeable in-game. And I think global illumination is the segment that shows the most difference, as it can change the feel of an entire scene. RT shadows are for the most part basically useless, or at least the implementations of them we've gotten so far are. I totally agree with the overall premise: the performance penalty is too great for what it offers visually, except in a handful of games (CP2077, Metro Exodus Enhanced, etc.). And yes, while CP2077 also exhibits a massive performance penalty, at least you get insane visuals in return. We'll probably have to wait at least a good decade to get RT games that justify using RT consistently.
Try 2-3 years max! ;) Even if I would prefer it didn't, the industry seems to be going all in on UE5, and for now the engine's performance is a shit show. So we have to wait for new versions to fix it, and then we'll have consistent RT implementations, since UE5 packs a ton of software-based RT features. Just look at games using UE4 today (Lies of P) and how far they've come compared to far older games.
In The Witcher 3 with RT on, in indoor areas, for instance in a tavern, it's hard to see people's faces unless you hold a torch. This too-dark effect exists in more games than The Witcher 3; it may be 'realistic' given the light sources placed in the scene, but it looks ugly and makes many details too dark to see. With the huge FPS penalty, I turn RT off for a better overall experience. I only turn RT on for the money I paid for the 3080 Ti.
What sucks, though, is that some newer games are clearly designed around ray-traced lighting, like Jedi: Survivor. It still looks great without it, but in some areas, when you turn it on, you can tell the area was designed with ray tracing in mind.
If I'm honest, the only picture where I preferred the RT version was Cyberpunk 2077. No-RT looked better most of the time. Not more realistic, but better.
@@soul-man RTGI is generally known to brighten up scenes because it allows light to bounce off different surfaces, transferring light into more parts of the image. This can easily be shown in many renders as well, and it's how light works in the real world.
Probably because in most of the other examples, the games weren't designed around ray tracing, or were just using it for some effects like reflections or slightly better shadows.
I completely agree. Fortnite, Hogwarts and The Witcher 3 looked much better without ray tracing. If I'm being honest, Cyberpunk is like THE game for ray tracing, and they really nailed it.
I get that sometimes there are instances where you can't tell it's being used. That is a credit to the artists, who have had years and years of experience "faking it". But RT is about more than just "looking better". It's about casting light, reflections, shadows, etc. in a more natural way. It's very important to have an all-ray-traced future. Artists with all that experience "faking it" using all kinds of lighting techniques to look natural only have to turn a light on, and the RT handles shadows, bounce lighting, etc. Artists can be more like set designers for film rather than having to place a bunch of light probes, animate shadow movement, hand-draw light bounce, etc. Basically, real-time RT makes scene construction less labor-intensive than the rasterized ways. I believe 4A Games (makers of Metro) have become a full-blown RT studio simply because of how much real-time RT has changed their development. I really wish AMD took real-time RT more seriously, because it would mean faster adoption of real-time RT and casting away the shackles of rasterized techniques. Then it wouldn't just be Nvidia leading the way and setting the standard. It could mean more optimizations in real-time RT, hopefully making it less "intrusive" to overall framerate. The best way to tackle a problem is to have more eyes looking at it.
Talking about 4A Games: they showed how much real-time RT changed how they work. Lighting a scene was literally adding the light from the sun and a few lamps in the room. Done, the room was lit up.
Sure, ray tracing is more accurate; it's literally simulating real lighting. But if the end result is basically the same, it is not worth the massive performance cost, and 'faking it' wins by a landslide.
@@socks2441 That's because you're looking at shitty RT implementations and not considering anything else. For faking it, it takes hours to set up a single room in a building to make sure the lighting is just right. For RT, you add the light source and it handles all the shadows and objects for you, done.
@@socks2441 There was a time when 3D rendering didn't produce the same results as sprite-based games. It was computationally expensive. But technology companies invested in it, and it eventually became trivial to do. If anything, help the developers: as 3D rendering gets more and more complicated, the harder it will be to keep faking things so they look realistic. If more companies took it seriously, then RT (and path tracing) would become trivial and no longer detrimental. Developers can just place a light in a scene at development time, and the algorithm takes over, handling how light bounces will light and shadow every object in the scene, not the person. No more light bleed. No more small objects missing a shadow. No more objects disappearing in reflections because they aren't on screen. It's all handled by an algorithm instead. Artists can focus more on the art of the scene rather than worrying about how they are going to make it look convincing. So yes, RT (and PT) is expensive. Yes, the result isn't always mind-blowing. But it's still something that should continue to be pushed and researched for the future. If companies had had the same attitude about 3D in the past, we wouldn't be where we are today arguing about whether real-time RT is worth it.
It's also difficult to see the effects in one scene, and after it went through YouTube compression. When I turned on ray tracing in Cyberpunk, I was blown away by how good it looks. However, I had to turn it off, because it also resulted in a 15fps framerate. I feel like ray tracing does not make for better screenshots, but it can immensely improve my immersion.
Better screenshots really depend on the environment. Let's stick with CP2077 as an example, as it looks pretty good without RT. If we take a non-RT and a Psycho RT screenshot during the day in the desert, it will be very hard to see the difference. If we take the two screenshots in a place with multiple light sources and multiple surfaces (some of them reflective), the difference will be easy to spot.
Same for me as well. I think it's not really a very significant thing yet, but it surely will be in the future. I won't be missing much if I don't use RT now.
What that tells me is that your gaming enjoyment will not be lifted another level by RT. Maybe in the future, when all the bells and whistles of RT are fully exploited, but even then: RT doesn't make a good game. The story, the mechanics, the music/sounds, etc. all matter more than RT.
when you use the absolute worst scenes to showcase the tech, especially when half of his examples are games with bad RT, yeah, it does make it hard to see
It's not stupid, just not there yet. Devs spend huge amounts of time faking lighting, and these examples aren't really that good either. In the future RT will be the standard, but it won't be soon.
People, stop assuming ray tracing is an end-user graphical improvement tool. It's not; it's a game-development productivity tool. Do you get it? I'm not saying that's a bad thing, but quit slurping the marketing that it's a graphical improvement tool in ANY way. AerO R2, thank you for taking the time to illustrate evidence. It will let devs design games quicker, period (provided the end user pays for the hardware to run it). It shifts the resolution COST to the end user in expensive GPUs; it's a wash in the end visual result, it's just about who pays for the resolution, devs vs. end users. How many YouTubers are eager to showcase ray tracing as looking better when in hundreds of videos you have to squint your eyes and "believe" to agree, when it just doesn't make a positive difference? Ask yourself honestly: does ray tracing look muddy? Yes it does. If you do a Fourier analysis on the images they simply have less information, and the honest human eye can tell the difference.
I'm sorry, but as a photographer: if you've played Cyberpunk or Witcher with and without ray tracing and can't see the huge improvement in realistic lighting, you either don't understand lighting and how it works or you're just blind... It doesn't make a game more fun, but if you enjoy looking at pretty things it certainly improves your enjoyment.
Where you notice it most is in games that use a lot of ray tracing. Dying Light 2, for example, uses tons of ray tracing when switched on, and comparatively it also tanks performance without DLSS or FSR... and I mean TANKS. And you can notice a massive difference in the game with it on vs. off. Most games just don't utilize it well, and consoles have no business even trying to use it lol
Honestly, I tried running RTX with my 2080 Super, and it just never got turned on until I upgraded. And when the PS5 and Xbox Series X marketing said "RTX" it just confused me.
Games where ray tracing makes as huge a difference as in Dying Light 2 are just games that didn't work on normal graphics as much as they should have. Without RT, Dying Light 2 gives you grass/foliage with no back shadowing or ambient occlusion, which is stupid, because you shouldn't need RT for something so simple. Foliage is heavily present in the game, so changes to it are quite noticeable. As for reflections, several outdoor locations don't have a cubemap generated, so all you get is some very plain glass. Screen-space reflections are also extremely low-res and short-range. If the normal lighting, AO, shadows and reflections were done well, you'd see almost no difference.
Path tracing changed everything. Looking soo good. Not videogamey at all. And we can finally play Alan Wake 2 natively on our 4K TVs, pixel-perfect without any interpolation techniques, at a buttery-smooth 30 FPS almost everywhere, using our RTX 4090s while the card pulls only about 550 watts for this awesomeness... :-)
There was a time when anti-aliasing was also a nice-to-have if you can enable it while maintaining good framerates. Now every game runs with anti-aliasing enabled.
But anti-aliasing gave games a big improvement in its time. Ray tracing isn't really ray tracing all the time; often it's only used for better sales. There are games where the ray tracing is really bad AND, after it's implemented, the game looks ugly without it. World of Warcraft, for example, had good shadows natively. Now ray-traced shadows are in the game, and the normal shadows look as bad as they did at the game's release. 16x16 resolution...
A lot of the light tricks we use (like baked lighting and reflection probes) require a pretty static scene to work, but if the scene is mostly static they can look close to ray tracing. Now try to implement those tricks in a game like Minecraft, where every block is dynamic: you can't bake lighting or reflection probes, so ray tracing really shines in these types of games. I'm excited for the future where ray tracing becomes the norm and runs at high frame rates, because then we'll start seeing more dynamic games (eventually so dynamic that a better term for them would be simulations rather than games).
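The point above can be sketched in a few lines of toy Python (all names here are invented for illustration): "baking" means evaluating light visibility once at build time and storing the result, so any later change to the scene, like a player breaking a block, leaves the stored lighting stale.

```python
def compute_light(cells, light_pos, occluder):
    """1 if a cell can see the light, 0 if the occluder sits between them."""
    lit = []
    for c in cells:
        lo, hi = sorted((c, light_pos))
        lit.append(0 if lo < occluder < hi else 1)
    return lit

cells = [0, 1, 2, 3, 4]
light = 5

# "Bake": evaluate visibility once, at build time, and store the result.
baked = compute_light(cells, light, occluder=2.5)

# Player moves the wall at runtime; ground truth changes, the bake doesn't.
truth = compute_light(cells, light, occluder=0.5)

print(baked)  # [0, 0, 0, 1, 1]
print(truth)  # [0, 1, 1, 1, 1]  <- the baked shadows are now simply wrong
```

Real lightmaps store irradiance per texel rather than a 0/1 flag, but the failure mode is the same: the stored answer only matches reality while the scene stays put.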
I agree, I can't really tell that easily. Especially in games that already prioritize graphics. It looks best in games that have basic graphics with an RT feature added.
You don't know where to look, and neither does the creator of this video. Those screenshots were the worst places to compare RT on vs. off. In The Witcher 3, go to a tavern and see how there isn't any shadow under the tables, so they look like they're floating, and how much depth they get when you turn RT on. Or go somewhere with lots of trees and see how the leaves gain depth, or go to Novigrad. Really, everyone who says RT is stupid and doesn't make any difference is looking somewhere that doesn't benefit from RT instead of the places that do. And if you don't know where to look or don't see the difference at all, turn it off and enjoy your experience, the same way console players play at 30 FPS and don't know about it, but they're enjoying it. That's what matters in a game, to be honest.
I've been playing cp2077 with ray tracing on for a year and I just had to deal with huge frame drops all the time. I recently turned it off and could tell a huge difference in graphics quality but I went from sometimes dropping to 20fps to a stable 80. Very worth it imo. you can't really appreciate how beautiful the city is when it's choppy all the time.
Control would have been a good one to look at. Out of the games I've tried (including most games on this list), I think it's among the ones that take the most advantage of it.
@@AbiRizky performed like absolute crap on my computer. terrible memory management resulting in blurry textures all over the place. really messed up my immersion and I just couldn't finish the game like that. otherwise it ran smooth and fine.
@@cyborgchimpy That's actually a bug. You can find the community fixes on Nexus, iirc. I had that exact same issue too, used the mod, and it ran great afterwards. For some reason it couldn't fix the Epic Games version, but did okay with Steam.
I think RT does make a big difference if you have the right display and the right card for it. I love using RT in CP77 and Alan Wake, but that being said, I'm also on an OLED.
Well original video is 6 months old, aged like milk. That's like saying 1440p is bad after years of launch, which was true. Now 1440p is the standard. It's the nature of tech, calling new tech bad is moot.
@Mouchoo Bruh, Steam does surveys. 1080p and 4K went down as a percentage. 1440p is currently the second most popular. That's the definition of becoming the standard.
@@Mouchoo 1440p is definitely becoming the new 1080p faster than you think. 1080p just looks ass in 2023, imo. 1440p looks ten times sharper when you go from 1080p for the first time. The same goes for 1440p to 4K, but most people probably won't be grabbing a 4K monitor, since less than 4% currently use 4K for gaming.
Real-time lighting still has a LOOOONG way to go. First of all, as many have correctly noted, RT doesn't necessarily look better. Often it just looks a bit different. That's it. Why? Because most games use baked lighting, which is just light calculation done in advance and then statically applied to the game world. But it's still light calculation, just not real-time. As of today, RT is often limited to specific surfaces and/or specific objects in the game world. It uses very few rays per pixel (often just one or two), which creates noise, which requires denoising algorithms. Also, the distance at which objects are actually included in RT is very small. In Cyberpunk 2077, only objects up to maybe 10m away actually get RT'd.
Edit: RT does have great benefits for professional applications. And once it has fully arrived and a fleshed-out version of RT can be used even by midrange and entry-level hardware, it will make lighting in games a much easier and faster process, because right now getting all the lighting and shadows right can be a difficult and time-consuming task. If you can just enable real-time lighting, all of it is done automatically instead of you as a dev having to design everything by hand. So don't get me wrong: RT is a great technology that already has its uses today. But in gaming, its full value is still a solid 15 years away.
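The low-ray-count noise mentioned above comes straight from Monte Carlo sampling: a pixel's brightness is estimated as the average of random light samples, so neighboring pixels that should be identical come out different when the sample budget is small. A toy Python sketch (names invented, a uniform random number standing in for one ray's contribution):

```python
import random

def shade_pixel(num_rays, seed):
    """Average num_rays random light samples for one pixel (true value: 0.5)."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(num_rays)) / num_rays

# 100 pixels that should all come out identical, shaded at two ray budgets:
few  = [shade_pixel(4, s) for s in range(100)]     # real-time-ish budget
many = [shade_pixel(4096, s) for s in range(100)]  # offline-render budget

spread = lambda xs: max(xs) - min(xs)
print(spread(few) > spread(many))  # True: fewer rays = a visibly noisier image
```

Error shrinks only with the square root of the sample count, which is why real-time RT leans so heavily on denoisers instead of more rays.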
Ray tracing is awesome. I have used it comfortably since it launched on my 2080. I'm on a 4080 now and it's only gotten better. I run it in every game that offers it and have absolutely zero complaints.
I'm curious how many upgrades in between those? Those cards are almost one tier below the absolute best cards in their class. I have a 3080 and use RT in some games, but the hit to frames and the drops make me have to use DLSS... It's good, but it's not native, so it just shifts the problem to weird textures, and some things here and there are wonky... In those instances, just turn RT off. I will admit that on older titles like Control, maxing out all the eye candy looks stunning.
Gamers will really talk about how much of a scam ray tracing is and then piss themselves about how amazing a game with baked RT lighting, carefully placed fakery, that explicitly avoids scenarios where the lighting would break looks.
Reflections got the biggest improvement from ray tracing. Screen-space reflections honestly look so unstable that I actually prefer having SSR turned off and just using cube maps. Heck, even older games like MGS2, Mortal Kombat: Deadly Alliance, etc. faked reflections by having a lower-poly model mirrored on the other side, which looked better and more stable than SSR. RT reflections are something I'd very much prefer in new games. The second most unstable thing to my eyes was ambient occlusion. I always hated how SSAO looked: it added a dark glow everywhere that looked worse than the baked ambient occlusion in PS2-era games. HBAO and newer methods improved the accuracy to some extent, but they all still break when something occludes the screen or at the edges of the screen. Having stable AO with the help of RT would be very nice as well. However, RT itself looks unstable at times. RT GI has a delay when the lighting changes, and that looks so weird to me that I'd sometimes just prefer rasterized lighting. Some games manage to camouflage this issue better than others. Hopefully, over the years, the stability issues will be further ironed out, just like how ray reconstruction has made significant improvements in the GI update rate. Also, until we get 4090 levels of performance in a $400-500 GPU, using RT will stay difficult for me.
The problem with SSR is that it breaks when the camera tilts. So screen-space reflections actually work well in cutscenes or third-person games where camera tilting is limited; it really depends on the camera. And yes, the RT GI delay is very annoying too.
True, but the problem is that cube maps and other simple reflection techniques can't display things that SSR or RT reflections can, or don't work with modern rendering engines at all, so SSR, just like TAA, is absolutely necessary in some scenarios despite its flaws. So yeah, I totally agree that some 5-, 10-, even 20-year-old games look beautiful despite their age, and I'd rather play them at 4K and 120 fps than some modern "realistic" games at upscaled res with generated frames.
@@J0rdan912 It works extremely well if reflective planes are just portals that render the same scene as a mirrored view and use that as the specular component. SSR can be used when the camera is controlled: driving a vehicle, cutscenes, third-person view, crash cams, aerial views... It's the first-person view, when the camera tilts, that breaks SSR. Cube maps are the fallback for curved surfaces where SSR doesn't work.
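The "mirror view" trick mentioned above boils down to reflecting the camera across the mirror's plane and rendering the scene again from that reflected position. A minimal sketch (function name and numbers are my own, assuming a plane given as dot(n, x) = d with unit normal n):

```python
def reflect_point(p, n, d):
    """Reflect point p across the plane dot(n, x) = d (n must be unit length)."""
    dist = sum(pi * ni for pi, ni in zip(p, n)) - d  # signed distance to plane
    return tuple(pi - 2.0 * dist * ni for pi, ni in zip(p, n))

# A floor mirror at y = 0, with the camera 3 units above it:
cam = (1.0, 3.0, -2.0)
mirrored = reflect_point(cam, n=(0.0, 1.0, 0.0), d=0.0)
print(mirrored)  # (1.0, -3.0, -2.0): render the scene from here for the mirror
```

In a real engine the whole view matrix gets reflected, not just the position, but the cost structure is the point: one extra full render of the scene per mirror plane, which is why the trick is reserved for a few important flat surfaces.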
@@J0rdan912 SSR looks beyond saving to me. First person, third person, cutscene or whatever, the reflection breaks at the edges of the screen, and any occluding object breaks it (especially in third-person games, where your main character blocking the centre of the screen automatically breaks SSR). Same goes for TAA; it just looks horrible compared to SSAA and even FXAA. I was playing Forza Motorsport at 1080p the other day and everything in the distance looked like a smudge fest. I went back and booted Forza Motorsport 4 on my Xbox 360, which ran at 720p but looked so much clearer it almost felt like putting on glasses. Depending on the game and how TAA is used it can look okay-ish, but I'm not happy with the direction modern rendering is taking with all these blurry images.
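Why SSR breaks at the screen edges, sketched as toy Python (everything here is invented for illustration): SSR marches a reflection ray through the depth buffer, which only exists for on-screen pixels, so a reflection whose hit point lies outside the frame simply has no data and must fall back to something else.

```python
def ssr_trace(start_px, step_px, width, depth_hit):
    """March along the screen; return the hit pixel, or None once off-screen."""
    x = start_px
    while 0 <= x < width:
        if depth_hit[x]:       # "hit" something stored in the depth buffer
            return x
        x += step_px
    return None                # walked off the frame: fall back to a cube map

depth_hit = [False] * 10
depth_hit[7] = True            # some geometry is visible at pixel 7

print(ssr_trace(2, +1, 10, depth_hit))  # 7    -> reflection resolved on screen
print(ssr_trace(2, -1, 10, depth_hit))  # None -> reflector points off-screen
```

The same failure hits anything occluded on screen: if your character hides the pixels a reflection needs, the march finds nothing, which is the hole-behind-the-player artifact described above.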
@@RiasatSalminSami Modern games are targeted at 4K, and that's where TAA is mostly okay. Also, older games aren't the same thing: without temporal processing, modern games are not going to look "clearer", they'll look like Sega Saturn faux-transparency pixel grids and a noisy, shimmering mess all over the place. Same thing with SSR: new rendering methods require limited SSR or low-res RT reflections; the older methods just aren't going to work with skinned and moving objects or particles. I do agree that some older games looked great, but at this point there's no going back. The only alternative is 2D pixel art/hand-drawn or retro-3D stylized graphics, where the older methods are still fine or even a benefit.
I guessed every one of them right (almost got fooled by Requiem), but as someone who finds ray tracing very interesting, I personally think games look a lot clearer/cleaner with RTX off, which is something I prioritize in video games. RTX (even when well implemented) sometimes just looks like a lot of colorful bloom was added. But it really depends on what people want out of a game. Great video though. Glad someone was able to lay it out nicely for people who are interested in ray tracing.
That's a weird take. The more ray-traced games there are, the better the implementations and the results will become. Sure, it's not always good (like everything), but it can be. The primary motivation to be against ray tracing is that the hardware is expensive, but things should get better on Nvidia's side (expect a 5060 to be able to do RT); it's still an unknown on AMD's side, which is lagging behind a lot. Hopefully Intel can do something in the next few years too.
A weird take is for this YouTuber, with his low video count, to clearly not have watched Nvidia's own video showing ray tracing in still images of car reflections, zoomed in... Nvidia's most powerful GPUs were doing 1080p 60 fps on the most expensive and current tech, for differences even less noticeable than this video of Battlefield 5. Alan Wake 2 with EVERYTHING maxed on the most expensive tech, a 4090 at 4K, gets a sad 32.2 fps average, no upscaling (TechPowerUp). The VRAM for Alan Wake 2 at 4K ultra minus ray tracing is 9485 MB; at 4K max with PT/FG it's 17807 MB. And that is a SMALL-scale game. A game like GTA 6 could easily want 23+ GB, since it's not set in tiny-town USA but in a city with tons of lights and people, one you don't walk through like Alan Wake but drive/fly through...
In games that actually implement the feature, it looks massively better than off, like in Cyberpunk 2077. It's also going to be a strong tool for multiplayer games as a way to prevent people from cheating or gaining a competitive advantage by tweaking shaders and graphics settings to remove shadows and change lighting. Having real-time lighting can become an actual in-game mechanic and feed into tactical decision making. RT + DLSS gives you insane detail plus performance. DLDSR + RT + DLSS = insane performance on 4000-series cards with insane graphical detail.
What you said about multiplayer makes no sense, not just because it isn't true, but because no one seriously uses shader manipulation as a cheat, and if they did, they could probably do more by messing with hidden settings. Plus, no one taking a game seriously is going to use ray tracing, halve their fps, and introduce a ton of input latency.
It looks like Cyberpunk intentionally botched some of the non-ray-traced surfaces (e.g. water) to make RTX ON look better. In games where reflections are properly implemented (Hitman), the difference is minuscule. Hell, even Cyberpunk looks better with DLSS Quality / RTX off @ 4K than with DLSS Performance / RTX Psycho, if you have to choose between the two.
Strongly disagree. I've tested DSR at 4-8K as well as 4K native, and I'd take 1440p native with RTX over 4K baked lighting any day. The game's textures aren't even 4K. Increased resolution can also make some things noticeably worse, because geometry can become too sharp or clear when it was designed not to be viewed at a higher res. I personally play the game with RTX and DSR 4K. The difference between native and DSR is very, very small.
Very very few games are designed from an artistic standpoint around assumed real-time Ray Tracing. Pretty much every game still needs to be designed to look decent even with all RT features disabled, so very few titles are utilizing an artistic direction to fully exploit the tech.
I've been playing plenty of games with RTX and maxed-out settings on a 4070 Ti, and I'm getting good frames with DLSS 3.5 (Jedi Survivor after the optimisation updates); The Witcher 3 and Cyberpunk have looked sensational.
Witcher 3 was one game where I really noticed a difference. Cyberpunk is a bit too much for me to run ray tracing, I get capped around 50-60fps, but it looks insane without it anyway.
So, I look at ray tracing like this: Nvidia created hype, and that hype turned into a category they will never lose in product reviews. Secondly, the extra cost of ray tracing tanks the price-per-fps average, almost entirely exempting Nvidia from value comparisons. And finally, AMD is being forced to pull resources and engineers to compete in this category, because the vast majority of gamers are consumers who will buy whatever is marketed in front of them.
the only thing that is an obvious upgrade is RT Reflections, so if you've got the framerate to spare turn that on, everything else just isn't worth it, especially if you have to turn on upscaling to get playable framerate. I swear, people who think upscaling looks fine compared to native resolution need to see an optometrist.
Ray tracing isn't stupid; it's amazing technology. It's just not really used properly, because of dev knowledge and, most importantly, the hardware. Yes, the 4090 exists, but it's a tiny share of the market. But full ray tracing/path tracing is amazing, and it certainly is the future. I reckon it won't take off properly until the next console generation. Maybe in a mid-gen "Pro" refresh - I hope.
The problem is that graphics are outpacing hardware, and the substitute is to lower the resolution and then upscale it, and then call it a "well-optimised game", as if lowering the resolution from 2K to 1080/720p isn't a penalty in itself. DLSS is not a good solution, but it's now required.
Usually, if the gameplay is captivating, Raytracing takes a backseat. If you are immersed by gameplay, you are not going to be noticing the lighting that much. I rarely use it, as it becomes a battle of compromises, like how low a framerate are you happy with.
For me it's a must-have. I always play my games at 30-50 fps just so I can have ray tracing and 1440p without DLSS, though if anything I'll enable DLSS first. I've ground out hundreds of hours in single-player games at 30 fps, no problem. Ten times the fun; if I turn ray tracing off I lose interest in the game almost instantly, despite it being fun. Just the missing good reflections ruin everything for me.
@@williehrmann For me it's a case-by-case basis. I have a 3060 Ti at 1080p, so I just don't have the hardware to do this in several games. Cyberpunk? I leave it off, since I'd need DLSS max performance, which looks like a 240p YouTube video. Forza Horizon 5? Yeah, why not, I still get max fps with DLSS on and the image quality isn't worse. Fortnite? Also yes, it runs a stable 60 with DLSS Performance. So for me, I just test whether I can run a game with ray tracing or not. And I also don't want to get too attached to ray tracing, since I plan to upgrade to an AMD GPU later.
@@williehrmann It's all about preferences. One can enjoy gameplay with almost indistinguishable visuals, while other prefer eye candy visuals with eye-gouging 30-50 fps and input lag.
What if RT actually produces immersion? It does for me. So I gladly offer a few frames to be more immersed. Lighting is EVERYTHING in visual presentation.
@@ItsCith It is, but only if implemented properly and with global illumination, which a pathetic number of titles do. Also, once you've been in for a couple of hours, you can't tell the difference anymore. So it's good for comparisons, screenshots, tech demos and videos, but mostly useless for actual prolonged gameplay sessions.
Actually, Hogwarts Legacy has visible ray tracing. The frame you showed is just a hallway that shows reflections either way, but if you move the camera downwards you can see that without ray tracing you don't have reflections anymore, because the game only reflects what you see on your screen. Ray tracing, instead, shows even what's behind the character if there are reflections. I think (for me) you misjudged this game.
I've been playing video games since the '80s and I can see a big difference, because I've seen every little step in the progress of 3D graphics. And if I look at a reflection in some game and see that it's artificial, wrong, I feel sad.
Old video, but you are generalizing ray tracing. NOT EVERYONE uses it for games. Real-time ray tracing/path tracing on modern NVIDIA cards is a priceless asset to those in the field of design and marketing. What took literally HOURS if not days is done almost in real time now. That would not be possible without the tech Nvidia brought to the masses. From the 20 series on, my life changed for the better as far as my work goes, thanks to RTX real-time ray tracing. When it comes to gaming, who cares? They're just GAMES. People spending 1500 to 2000 to play a video game already have issues, or no room to complain. The cards are designed for far more than the millennial gaming community.
Depends on the game. Cyberpunk, which is already a looker, becomes absolutely gorgeous with RT on, and while there's a performance hit on my mid range 3070, the trade off is worth it imo. I'd argue that RT is by far the most impressive tech I've seen in gaming in the last decade, since lighting has a more dramatic influence on a game's aesthetic than poly count.
The performance hit, even on Nvidia cards, with ray tracing on is pretty significant at the moment. I personally don't care for ray tracing or realistic graphics; I'm a gameplay guy. I remember PhysX being very popular back in 2012, and it was pretty impressive, but then they decided GPU-driven physics wasn't necessary and we didn't need that much detail. The same might happen with ray tracing.
I think the main problem with RT implementations that distinctly improve graphical fidelity (Alan Wake 2, Dying Light 2, Cyberpunk, Portal RTX, Jedi Survivor, The Witcher 3) is that they only run acceptably on RTX 4000 and maybe 3000 cards. AMD's RT performance is still very limited and only competitive if heavily scaled-down RT implementations or only single effects (like shadows) are used, as we often see at the moment. The same goes for the consoles, which run on AMD hardware as well. Intel's cards have better RT performance per price than AMD's; however, unfortunately, the cards they sell right now are not especially fast to begin with. To see good RT implementations more often, AMD needs to become competitive with NVIDIA on RT. For the consoles, this may only happen when the next generation releases, in 2028 at the earliest. Until then, I fear we will only see worthwhile implementations in games whose developers are heavily aided by NVIDIA engineers.
I don't know, man. I usually play non-ray-traced when the visual gain vs. performance loss is clearly unfavorable, but Cyberpunk with path tracing is phenomenal. I started it all over again with a 4070 and DLSS 3, and so far it looks like a freaking remake, completely overhauled. We're in a situation where ray tracing is still a gimmick, but when it's implemented well, you do want DLSS 3 on your side to play it "the way it's meant to be played".
I have a test channel where most viewers want to see ultra settings and ray tracing. High details aren't much worse than ultra graphically, but they can deliver a 20% difference in performance. Ray tracing is and will remain unplayable; no GPU generation will ever be enough for it, and to me it's the biggest piece of crap ever released. DLSS 3 FG is, again, just something Nvidia uses to increase sales. The technology has been here for 4 years and I haven't played a single game with it. And sometimes it seems to me that a game that supports ray tracing runs worse than if it didn't have it at all.
Well, frame generation and resolution scaling are useful if you don't have a high-end card or you're playing 4K ultra. It just increases your fps when working properly, but if you're already getting 180 fps... meh. I do agree that ray tracing is mostly garbage; I finally have a card that can run it, and it's rather disappointing.
ray tracing isn't bad, games just implement it poorly most of the time/rasterized version is already near perfect. Cyberpunk 2077 is an example of a game where ray tracing is VERY noticeable (especially when driving cars first person), you might not see the difference in still pictures though.
Yeah dudes will pay 2k for a 4090 just to play 2-3 games at 60fps with cranked out RTX settings. Seems like a waste of money to me… For 1000 bucks you can scoop a 7900XTX and muscle through pure rasterization at 4080 super levels
The developers overextend the feature, but you also missed the point: reflections and shadows in rasterization behave erroneously, and we've simply been accustomed to it for years. Ray tracing, when used correctly, makes things more immersive by simulating reflections and shadows with more realistic physics.
RT global illumination with AO is a big deal, but it's very hard to run. Lumen does it well in software, but the problem is that Nvidia isn't interested in performance; they're only interested in selling their top GPU. Nvidia should be aiming for Lumen-level visuals/optimization (who cares if it's not super accurate) but with a big uplift from hardware acceleration, and they should be working with Unreal to implement that. Witcher 3 GI and AO look insanely good btw, and I'll get into that and the 4070 later.

RT reflections are hit and miss, and almost always a miss. It works in Spider-Man because you're getting a blurred, muted reflection on office buildings and because gameplay is so fast. Same reason it works in Doom Eternal. It fails in Hogwarts because what you expect to look good while walking around the castle looks worse than screen-space reflections, which are almost always the better thing to use. In addition, for GOOD reflections, which are needed in something like a mirror, you can just draw things twice, which is what Uncharted 4 does at the end with the kid and what the TLOU1 remaster does the same way in the intro.

RT shadows are dogsh@^. I have never seen one implementation, with the exception of Shadow of the Tomb Raider, where they didn't look worse and have awful pop-in. The only reason they look good in that game is that they're the only RT feature used, so they can be pushed to absurd levels.

On Jedi Survivor: they didn't optimize it. They just removed ALL RT from the RT-off option, which greatly reduced CPU overhead. The console has RT on even in performance mode (where it runs just as badly as on PC), and that's one reason the game was made in record time (faster than Mass Effect 3, and that was a very rushed game compared to the previous titles in that series). The studios want RT shadows more than anyone else, because it removes work they have to pay for and they can hire cheaper labor.
EA doesn't care about RT shadows being awful and popping in 20 feet in front of you in Jedi Survivor, just like in Witcher 3. The only way to make Jedi Survivor look and run well (minus those awful RT shadows) is DLSS and frame gen, which is being worked on by a modder called PureDark. AMD paid to keep both out of that sponsored title. On Witcher 3 and your 4070: 12 GB can't run all the RT at good visual settings. Set it to ultra and only run RT GI and AO, which is exactly what the console does. The screen-space reflections look just as good on water almost all the time, and you won't run out of VRAM after an hour, where performance tanks or you crash. You can push 1440 native, skip DLSS and use DLAA (mods and tools exist for pretty much all games that have DLSS), and then, instead of in-game sharpening, I'd fine-tune it with integer scaling in the NVCP, which opens up a new sharpening option on a per-game basis, or use ReShade's CAS.fx.
Nvidia is looking to have RT replace raster in game development. They are VERY interested in performance with the technologies they release, but they always oversell their tech demos. For UE5, Nvidia decided to fully stand by Lumen GI instead of their own implementations; they are instead improving direct lighting, reflections, and denoising. Hardware RT functionality is exactly what they have worked on implementing, and currently you can enable a wicked increase in reflection and direct-lighting quality through it. If I had to choose one RT setting featured by the majority of games to keep, it would be reflections. If I had to choose an RT setting in general, it would be global illumination. Can't say I agree with your takes here.
Not making it through this rant but RT reflections look amazing in Watch Dogs Legion and even better in Cyberpunk Overdrive mode. Now with DLSS3.5 the fidelity is skyrocketing.
Crytek had software ray tracing, showing we really shouldn't have gone the hardware route. It's ridiculously wasteful on resources for very diminishing returns. Just grab the Crytek Neon Noir benchmark to see for yourself, or play the Crysis remaster games. The trend in development right now is lazy optimization with huge VRAM and RAM requirements for very little difference in visuals, and years of patches to fix broken games after release. Not to fanboy Rockstar, but it's like they're the only ones able to make these huge open-world games run well. As amazing as Elden Ring was as a game, that engine is junk. Cyberpunk was nearly unplayable at launch. Not to mention GPU makers are gimping VRAM on most of their products, making this whole ray tracing issue even more pronounced.
RT is only really relevant if you have a top Nvidia GPU. There are quite a few games with poor RT implementations, but in some games it can be transformative, since lighting really is the most important part of visual fidelity.
@@GeriatricMillenial But you don't get the RT performance of an equivalent Nvidia card like the 3090 Ti, and you also get worse image quality upscaling with FSR instead of DLSS.
@@trichi827 You're not maxing it out though, without the path tracing. With the newer games coming out you're not gonna be able to play with RT; Alan Wake 2, for example.
@@dante19890 I'm playing at max settings, at 1080p: 40 fps in the city, 60-70 everywhere else. I'm not gonna play Alan Wake, so I can't tell how it runs on my PC.
Sometimes what "looks better" is just more vivid colors, kinda like when people showcase graphical mods, it's always at night in rainy weather with many light sources reflecting.
1450 watt special ATX+ connectors incoming - I can see it. And don't you dare shower this month, because climate change is real. Ray tracing in games just does not pay off. The power consumption is unreal, and you can achieve very similar effects with much less power usage. Mathematically, there are no shortcuts for ray tracing, unless you castrate it into what you could already do with other approaches.
Ray tracing is amazing; it has the potential to make 3D rendering look more real than it ever was. Take a look at Minecraft RT and Quake 2 RT to see what it can do. Now, whether or not the developer uses it is really the question, and I feel like developers are the biggest problem nowadays. AAA titles are few, and what does come out seems like it could have been better. I have yet to see a ray-trace-only game, meaning anybody who does good RT, like Cyberpunk and Witcher 3, had to code both just to offer it. I say go RT only. Got an old PC or console? Stop being a cheap ass and upgrade. We can't have nice things if we don't drive the changes.
It began on PC being used sparingly, implemented into game engines after being sponsored by Nvidia. Even with games being made for RT-capable consoles, implementation has been slow because said consoles are generally quite weak. Lumen from Epic is RT and even more efficient than traditional methods; it is good stuff. If you want better looking games, you take frametime hits. If you're fine with rasterized effects and their problems, fine. But RT is useful and absolutely better looking than raster when done right.
Finally, some sane person. Ray tracing was dumb af at release and overhyped by the soyboy nerds like always. Turn ray tracing on and turn FPS down, yeah no thanks
A lot of games use baked lighting, which is basically raytracing saved to a texture. That's why there's no difference.
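To make "saved to a texture" concrete, here's a toy sketch (hypothetical code, nearest-neighbor lookup for brevity, not from any real engine): the expensive light transport runs once offline, and the runtime cost is just a lookup.

```python
# Toy "baked lighting" sketch: the expensive light transport happens offline,
# and at runtime a surface just looks up its precomputed value.
# (Illustrative only; real lightmaps are filtered, compressed GPU textures.)

def bake_lightmap(width, height, compute_lighting):
    """Offline step: run the expensive lighting function once per texel."""
    return [[compute_lighting(x / width, y / height) for x in range(width)]
            for y in range(height)]

def sample_lightmap(lightmap, u, v):
    """Runtime step: a cheap texture fetch (nearest neighbor, u/v in [0, 1])."""
    h, w = len(lightmap), len(lightmap[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return lightmap[y][x]
```

The tradeoff the replies point out falls out of this directly: once baked, the texture is frozen, so moving a wall or a light invalidates everything that was precomputed.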
Quake 1 says hi from the year 1996(more or less).
baked lighting sucks and is the reason most games have lifeless environments, because they can't be organically changed without slaughtering the lighting.
@@ramsaybolton9151 yeah, but they can be run on worse hardware because of it
@@eineatombombe Yeah, but if we're going to start innovating games and making actual unique experiences, we're going to need to phase out support for old hardware. I've been gaming since '97, boy! There has been so little actual innovation in games over the last decade, even though engine innovations have been huge.
@@ramsaybolton9151 unreal engine 5 is for that
I am not a fan of "modern games" where the ground always looks like it's been waxed over after having 5 coats of gloss epoxy, then finished off with a fresh coat of baby oil. Same for all the character models; they all look like they are covered in sun tan oil, even over their clothing. Who thinks "realistic" looks like this?
Nobody. Artists are more and more using real world (!) values for the smoothness of objects, and this is exposing how bad game reflections are. The "shiny" 8th generation look was all about using screenspace reflections to finally be able to use realistic smoothness on materials. Unfortunately, SSR has many drawbacks. In every situation where SSR breaks down, you're left with strangely "glowing" or "waxed" surfaces that stand out and ruin your immersion. The worst places are often indoors, where there isn't much light around. Things should be dark, but instead they show bright reflections (local cubemaps or the skybox).
Raytraced reflections are the only solution but due to high performance costs RT reflections then end up being BLURRY or UNSTABLE. Both of these problems immediately stand out as unrealistic.
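The SSR breakdown described above can be illustrated with a toy 1D sketch (hypothetical code, not from any engine): a screen-space ray march can only hit geometry that is already in the depth buffer, so any reflected ray that steps off-screen has no data and must fall back to a cubemap.

```python
# Toy 1D screen-space reflection march (illustrative only).
# The "depth buffer" is a list of scene depths per screen column; a reflected
# ray that steps off-screen can never find a hit, so we fall back to a cubemap.

def ssr_march(depth_buffer, start_col, start_depth, step_col, step_depth, max_steps=64):
    """March a reflected ray through screen space; return the hit column or None."""
    col, depth = float(start_col), start_depth
    for _ in range(max_steps):
        col += step_col
        depth += step_depth
        icol = int(round(col))
        if icol < 0 or icol >= len(depth_buffer):
            return None  # ray left the screen: SSR has no information here
        if depth >= depth_buffer[icol]:
            return icol  # ray passed behind stored geometry: counts as a "hit"
    return None

def reflect_color(depth_buffer, scene_colors, cubemap_color, *ray):
    hit = ssr_march(depth_buffer, *ray)
    # Missed / off-screen rays show the cubemap (often the sky), which is why
    # dark interiors can end up with the oddly bright "waxy" look described above.
    return scene_colors[hit] if hit is not None else cubemap_color
```

The fallback branch is the whole problem: indoors, `cubemap_color` is frequently far brighter than the surroundings, so the failure is not subtle.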
@@forasago Thanks, it's good to know where this issue has its roots.
LOL. You are dead on. I was laughing my butt off at the tech demo for Unreal 5.4! MORE REALISTIC? WHAT? Peeps still look like 8-year-old tech IMHO! Advanced cartoons. Honestly, going back to the giants, like Crysis, and comparing them to modern games? NOT THAT IMPRESSIVE vs. what it should be! We are sold snake oil! The obvious issue is, gaming consoles and average income folk can't buy $1500-2000 USD cards, let alone the $2-3k for the latest CPU/build/memory. OH, didn't mention a modern OLED 8K monitor. So, most likely, devs just aren't at a point to spend the time needed to create REAL LIFE experiences. VERY few could ever enjoy that level of tech.
Agree about gloss. Devs think adding reflection is the only way to make a game look better. Pretty silly. Ever notice all the car racing demos are in the rain? LOL. Even GTA V/VI, always in the rain to exaggerate reflective surfaces. Maybe it's just me, but I can't remember the last time I went out in the rain and noticed everything super reflective. ;)
@@Curious_Skeptic oh yeah! don't get me started on every E3 demo being set after a fresh rain! But yes, it gets to a point of diminishing returns. Especially when 80% (or more) of the game market is fine playing games like Minecraft or Stardew Valley. There really is a very, very small market of graphics geeks out there, and they are simply just a super niche market (always has been). I'd say most ppl could go the rest of their lives with games at Xbox 360 level of tech and be happy lil clams.
Gotta oil 'Em up am I right
I think it depends on the game. Metro Enhanced on PC, Cyberpunk with Psycho RT, and Dying Light 2 are a few games built with great implementations; I would notice if it's turned off and wouldn't want to go back. But for the vast majority of games it seems to just be implemented after the fact…
I play Metro Exodus without Ray tracing, honestly the game is still beautiful, I don't miss it.
@@RcpGunslinger yeh for sure it’s a fantastic looking game as is.. I play on the ally all the time and it’s beautiful base game..
@@RcpGunslinger Metro have terrible Ray Tracing. The only game where they nailed Ray Tracing in is Watch Dogs Legion.
@@V3ntilator fair enough mate
@@V3ntilator have u played the original Metro or EE? If u say EE has terrible RT then u are delusional.
Path tracing in Alan Wake 2 and Cyberpunk makes an insane night and day difference. Path tracing, especially in CP2077, adds a whole new dimension to the immersion
True, try regular ray tracing on Cyberpunk though. Shit literally looks the same as RTX off. Path tracing is the goat.
Adds lag fake fps lol
@@AdvancedGamingYT I think you need at least a 4070 Ti for that (or an equivalent AMD card). And I still play with ray tracing off, as it's just not worth the frame rate going in the toilet. I have a 4060 Ti with 8GB VRAM (came factory equipped with my PC).
LMAO!
Raytracing is literally bad right now; there's not much improvement in quality while it costs half of your FPS. Path tracing is so much better than RT, but of course, because of that, only high end cards can do it.
I think you are missing the point of why ray tracing is so nice as a goal. Yes, you can get really nice graphics by "cheating", but that "cheating" is really hard to get right. Plus, like others mentioned, reflections just don't work without ray tracing.
Open any game (God of War on PS5, Cyberpunk on PC, anything), go to a body of water and move your camera up and down. You will see a really nice reflection of the sky, but if there is a building or mountain, you will see reflections of only the part of it that you can actually see in camera. For me it's one of those things I cannot unsee, and it takes me out of immersion because I realise it's "just a game". That is fixed with ray tracing (even in Cyberpunk).
Also, maybe you are too young, but I remember when the first cards with shaders were released. They were way more expensive than cards without them, but games looked so much better with them. You can look up the first Far Cry's water with and without shaders.
Ray tracing is the same IMO - it will be everywhere once the hardware catches up. It's just a much bigger and harder problem to solve, so it will take more than one generation to get there.
Once all the engines like Unity and Unreal implement all of this as a tickbox in the game settings, it will be a smooth ride from there. Unreal gets really close with Lumen (it is still not full ray tracing as far as I know), but CD Projekt Red will bring their path tracing into Unreal. And they will make their next game in Unreal thanks to that.
Be patient; just because you cannot see the point doesn't mean it's the wrong direction to go
Good it should be hard to get right. I'm tired of this lazy ass implementation devs are using to poorly optimize their games and so they can say "it has raytracing!!!"
2:17 dude c'mon, I can't even see the shadows on that. I know Elden Ring's RT shadows are useless, but it's just FromSoft saying "we can do RT too and we are improving our engine".
Ray Tracing can be absolutely great if implemented properly. I'd say the games that benefit from it the most are open world games or games with lots of dynamic lighting, as baked or probe-based lighting struggles with large, open-ended environments, especially ones with lots of occlusion like cities. Of course, sci-fi games work great with the reflections, and games with dark interiors can really benefit from the shadows and ambient occlusion brought by raytracing.
I don't disagree, but it's interesting to note that ray traced ambient occlusion and real time shadows can be done quite well without even needing to use hardware RT (as seen in some Unreal Engine 5 games, and also in Alan Wake 2, among others)
@@syncmonism but those often run a lot worse than using hardware acceleration, which is why it exists.
IF is the keyword here. Mostly it's badly implemented, still relying on rasterized (mediocre) graphics with a disgusting amount of post-processing, but bumping system requirements up to RTX 4000 level for ~1080p with DLSS and frame generation enabled. But hey, it's super advanced and totally worth buying a $1000+ GPU!!1 and it doesn't matter if it looks worse than some 5-7 year old games from the PS4 and Xbox One generation.
Destruction and interior lighting never looked that good in Battlefield, since it's a hard process to update lighting at that scale, unlike The Finals.
Ray Tracing is a game changer when we're talking about realism, but when we're talking about "looking good", ray tracing can't compete with a good art director and good color grading...
Real time raytracing is more for devs; it makes it easier to set up scenes (and materials) and have them look good without resorting to a lot of tricks and fakes. It will eventually be commonplace.
Why the fuck do I need to pay extra cash to make devs' jobs easier? Are u serious? I didn't sign up for charity..
@@ВладиславКап-щ6в The only thing I see from "AAA" devs is that they're in it for the money...
Real time raytracing just requires an expensive GPU.. so very overrated
Bla bla bla, "makes it easier for devs". No, just no, it lets devs get LAZIER and put less effort into adding materials and lighting to their games
@@serily4524 Bullshit. Making it easier for devs does not translate to "laziness". It translates to more and bigger levels done in the same time. Instead of spending their time trying to optimize BS like shadow map resolution, devs can spend their time making scenes that look more realistic, with better prop placement and stuff. They can focus on artistic details instead of technical BS.
People who claim it makes things "lazier" are ignorant fucks who view the world with horse blinders on. Everyone who actually works in the industry realizes why RT is needed to evolve gaming.
I honestly get more excited seeing real-time lighting assets from older games like Far Cry 1 or Splinter Cell than I do with Ray Tracing!
Honestly, RTGI in The Witcher when you have tons of grass and trees looks really good. Adds good complexity to the image
Especially near any colorful areas like Toussaint. I mean, the colors just bleed with that RTGI.
who cares? The grass isn't dynamic. It's just a static piece of grass in the field of view that has zero interaction with your character, sword, or anything else in game.
@@kevinerbs2778 Yeah, I can't believe that the last major innovation regarding grass interactivity in gameplay was when you could burn grass to your advantage in Far Cry 2, way back on 21st October 2008
I'm pretty sure the last good single player FPS games from AAA studios were Titanfall 2, the MW1 reimagining in 2019 (purely due to that covert mission in a house), and Doom Eternal. The rest of the current good FPS games are indie breakthroughs, Doom-likes or immersive sims that focus more on "fun" in the gameplay, like Ultrakill or Dusk or Gloomwood or something
Yeah
DEV POV: without raytracing it's a lot of work to get shadows and reflections in your game. You have to render the scene to a depth buffer; a cube array of textures if we are talking point lights. Each "cube" needs 6 textures, so you have to render the scene 6 times to cover everything from the perspective of the point light.
If you have 10 point lights, you render the scene 60 additional times lol... ofc culling will help, but not perfectly.
We can then use this depth texture, which is just a black-and-white texture, to find out what is in shadow or not.
This is called shadow mapping, and it's how it's been done for like 30+ years... It's crap, it's fakery and a lot of wasted energy, so devs opted out and instead bake raytraced light and shadows, or at least where possible.
The same process is also done for reflections... you can see the incredible and astonishing amount of wasted GPU power on simply re-rendering the scene to a bunch of textures, which at the end of the day does not give you perfect shadows or reflections.
- Raytracing solves all of this shittery and I couldn't be any happier.
You don't see any difference atm because they have baked raytraced light in most games. But rest assured - in a few years the hardware will be able to path trace in realtime, and those shadows and reflections are oh my oh my sweet looking.
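The pass-count math and the depth test from the comment above can be sketched in a few lines (hypothetical code, a back-of-envelope illustration; real engines cull, cache and reuse passes):

```python
# Back-of-envelope sketch of the shadow-mapping cost described above.
# A point light needs a cube map: 6 faces, so 6 extra renders of the scene.

def shadow_map_passes(point_lights, spot_lights=0, directional_cascades=0):
    """Extra scene renders needed just to build the shadow maps."""
    return point_lights * 6 + spot_lights + directional_cascades

def in_shadow(fragment_depth_from_light, shadow_map_depth, bias=0.005):
    """Classic shadow-map test: the fragment is shadowed if something closer
    to the light was recorded in the depth map. The small bias avoids
    "shadow acne" from a surface shadowing itself due to depth precision."""
    return fragment_depth_from_light - bias > shadow_map_depth
```

With 10 point lights that is `shadow_map_passes(10) == 60` extra renders, which is exactly the "render the scene 60 additional times" figure in the comment.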
does any of that matter if you really can't see the difference?
@@ibrenecario7357 Yes, because making good looking games with RT is much easier and takes up less disk space. Furthermore, RT is the only way to achieve realistic lighting under dynamic scenarios such as destructible walls.
Which means devs can make their games more dynamic, faster, and with bigger or more levels than before. It matters A LOT.
@@Navhkrin Im sure there are alot of good reasons. But if I as a user cant see/feel the difference then I dont care.
@@ibrenecario7357 sounds like you need an optometrist more than anything
@@ibrenecario7357 Cyberpunk is an example of the difference ray tracing makes.
The only case where RT has juuuuust managed to become good would be full path tracing in CP2077 with DLSS frame gen, upscaling and ray reconstruction on a 4090. Anything before that is just a side grade compared with rasterization, with slightly better (pick one of) shadows, global illumination or reflections, but for much worse performance, and for twice the price obviously. I have got to say though that sometimes I have to really examine the image even in CP2077 to tell the difference between RT and raster, because both look pretty good.
and then you are trading fake frames and fake resolution for real-ish lighting, whereas rasterization was trading fake lighting for real frames and real resolutions, plus the awesome performance optimizations made possible by skilled devs learning all the rasterization tricks. In fact, you can still use fake frames and fake resolutions with rasterization if you want even more performance.
looking at it that way, raytracing is still a lose-lose situation in my mind. give it another 10 years and it will be incredible. for now it's a feature I always turn off.
@@socks2441 i don't get it, why do people say "fake frames" and "fake resolution" are bad? if you can't even tell the difference between fake and real (other than the input delay of FG), i don't see a reason why it's considered bad
minecraft ray tracing veri good :D
Dying Light 2, Alan Wake 2, Control , Metro…
@@daniel226440 DL2's RT is dodgy tbh; the rest, yeah, pretty good, but still mostly not worth the performance loss
00:00 🕹 Ray tracing was introduced 4.5 years ago with the 20 series graphics cards, initially facing performance issues.
00:41 🎮 The RTX 4090 and 4070 improved Ray tracing capabilities, but the speaker still finds Ray tracing implementation lacking in many games.
01:23 🖥 Some games exhibit poor Ray tracing implementations, with Elden Ring and A Plague Tale: Requiem only featuring Ray Traced Shadows, making the difference subtle.
03:33 🎮 Games with full Ray tracing support, such as Fortnite, Hogwarts Legacy, The Witcher 3, and Cyberpunk 2077, are compared, highlighting varying visual preferences and performance impacts.
07:19 🚫 Poorly optimized games like Starfield, Jedi Survivor, and The Last of Us Part One PC, despite having Ray tracing, suffer from overall bad performance, making Ray tracing irrelevant.
08:15 💡 The speaker concludes that while Ray tracing technology is cool, its current state, combined with inconsistent game optimizations, makes it not worth using, suggesting a need for more uniform and effective implementation across games.
Summarized by HARPA AI
5 years, 9 months, 2 weeks and 2 days later, Ray Tracing is still bad.
It's dead because there's a newly discovered method that is based on rasterization, which is funny; it was invented by a Path of Exile developer and physicist.
@@aviatedviewssound4798 oh really? dang poe2 is gonna look really good then huh. what's it called?
@@Nezzen- Radiance cascades
@@aviatedviewssound4798 sweet thanks!
it's not the direct lighting that makes a difference; software has done a really good job imitating direct ambient lighting. Ray tracing picks up the slack in reflections of objects projecting that ambient lighting and, more impressively, the compound ambient lighting they cast across a scene: seeing full objects in water puddles/glass like real life, having the hue of the reflected light projected onto objects in the room based on object colour and object texture or finish... ray tracing allows for this stuff much better than artificial software lighting.
Some games are going to do it better than others as well; sometimes the interaction between the game and the card may affect the quality of the effect. But to say it's bad is BS; it's a marvel of engineering that it can do it at all, not to mention it's getting better as AI cores learn to interact with RT cores, or whatever garbage AMD is trying.
The other thing that bothers me about your argument is trying to show us, the viewers, RT in a static image that most certainly lost a lot of its pop being compressed for YouTube consumption vs. the native screen you are viewing it on. Dude, I'm on a 1440p monitor watching a video at 1080p that I can only window, because full screen makes it look like dog water due to interpolation making up info on every second pixel, basically. But it's a static image; RT is dynamic and would lend itself much better to a rendered scene than a photograph. It's kind of the point...
@7:08 I think you explained exactly why most games have bad ray tracing implementation. It's not that ray-tracing technology in general is bad, it's that it's still very heavy for today's GPUs and so most devs will only implement partial feature sets, and even then they will still bake out a ton of the ray-traced lighting for the rasterized version of the game, so the comparison from a visuals perspective is a little unfair.
It's funny how the games that people praise for having good RT performance simply implement RT to a lesser extent.
"still very heavy for today's GPUs "
More like
"still badly coded for today's GPUs" that is why it's still bad.
@@DrRhyhm No, it is heavy. Try running full RT in CP2077 with a 3070 and see how it does. Probably by next gen, when the 70 card can match today's 4090, is when you can easily do path tracing.
@@Unicornpirate Gonna be waiting 2 generations for a 70 card to match a 4090.
The 5070 will probably be 25% slower than a 4090.
The 4090 was such a huge jump.
Path tracing is the future and will eventually replace rasterization in most games (currently it's a mix of ray tracing and rasterization), now that there's enough hardware power to simulate light the 'real' way. Note that rasterization for decades has been essentially a bunch of hacks to make lighting look increasingly realistic without actually simulating it as it works in the real world, with photons bouncing around in real time.
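The "photons bouncing around" point can be made concrete with a toy geometric-series sketch (illustrative numbers only, not a real path tracer): if a surface keeps a fraction `albedo` of the incoming light on each bounce, raster GI hacks typically truncate the series after a bounce or two, while a path tracer keeps sampling bounces until the sum converges.

```python
# Toy sketch of multi-bounce light energy (not a real path tracer).
# Direct light contributes 1; each further bounce contributes albedo^k.

def bounced_energy(albedo, bounces):
    """Total light gathered after a given number of bounces."""
    return sum(albedo ** k for k in range(bounces + 1))

def converged_energy(albedo):
    """Limit with unlimited bounces: the geometric series sums to 1/(1-albedo)."""
    return 1.0 / (1.0 - albedo)
```

For a 50% albedo surface, one bounce captures 1.5 units of light, while the converged answer is 2.0; that missing bounced light is roughly the softness and color bleed that truncated raster approximations leave out.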
Ray tracing can reflect things that are off screen, none of your examples show this
Waste of money
But I respect Nvidia, because they have customers who will buy anything at any cost, even if that "anything" isn't worth it at all
agree
market sellers do their job so far, so so far~
100% Agree
The only real point of raytracing is saving the devs some work.
Realtime GI has been possible since Crysis 3, and realtime shadows and lights since Doom 3 (though STALKER is a great example too).
And no one cares about physically correct reflections when you have really good SSR + good cubemapping.
Raytracing offers real differences only in the details, if the game already has great lighting + great AO.
I think we need devs that are also gamers more than we need raytracing.
For me personally I don't think it makes any sense to run raytracing. Even if it looks a little nicer it's not worth it for the hit you take in performance, my 7900 XT kills it in 1440P without raytracing and I wouldn't change a thing.
Right, you sometimes drop 50% or more performance. I just did a side by side comparison of CP2077, and honestly, I can hardly tell the difference.
The thing is, for decades we couldn't get "real light" in games. So we got really good at faking it, and now it's too good to really improve on. Honestly, just because we CAN make light super realistic doesn't mean we should. There is more than one case where I think RT makes things look worse.
Metro Exodus is the only game where I see a marginal difference with RT on, and even then I think certain scenes look better with it off. But overall I think it's an improvement
@@devinkipp4344 You're right about Metro Exodus. I ran that on my 3070 Ti and the difference is quite telling, and it's the only game I have run with raytracing on that didn't butcher the frame rates.
We've gotten so good at faking real light in games that the jump to raytracing really seems very small sometimes. But I'm still looking forward to every game just natively running with raytracing, it being a staple and thus being well implemented. It just looks stunning done right, but a lot of companies just seem to tack it on so they can say they've got it, or to get some sponsor money from Nvidia. The difference is kinda sad most of the time honestly, but games like Minecraft or Cyberpunk really showcase how much of a difference in render quality raytracing can make
This YouTuber with a low video count clearly hasn't watched Nvidia's own video showing ray tracing in still images of car reflections zoomed in... Nvidia's most powerful GPUs were doing 1080p 60 fps on the most expensive current tech for results even less noticeable than this video's Battlefield V footage. Alan Wake 2 with EVERYthing maxed on the most expensive tech, a 4090 at 4K, gets a sad 32.2 fps average (no upscaling) (TechPowerUp). The VRAM for Alan Wake 2 at 4K ultra minus ray tracing is 9485MB; at 4K max with PT/FG it's 17807MB. And that is a SMALL scale game. A game like GTA 6 could easily want 23+ GB, since it's not set in tiny town USA; Alan Wake is in a little town, but a city has tons of lights and people, and you don't just walk through it like in Alan Wake, you drive/fly through it...
Digital Foundry's video on ray tracing in what is now a SUPER old game shows how taxing it is - it would have single digit fps with path tracing: ruclips.net/video/aUPwRyS15kY/видео.html&ab_channel=DigitalFoundry (1.3 million subs)
After testing RT, I always end up turning it off and preferring to increase the resolution. It ends up being prettier on the eyes than turning on RT and having to use upscaling methods like DLSS/FSR to achieve a playable framerate...
@@cadcad-jm3pf Yes. Thats the point
sometimes people seem to forget or don't even consider "frame rate" as visual fidelity. I mean, reality "doesn't run" at 30 or 60 fps
If you prefer resolution to better lighting, sure. I'd rather have 1080p w/ good lighting than a flat looking 8k game.
Let's admit it, we pay a lot for garbage GPUs, that's the truth.
I think that eventually ray tracing could become a fundamental feature that's always on, once GPUs get powerful enough and devs find a way to optimise their games better, but it will still probably take at least another 2+ GPU generations imo
100 percent. The PS6 or Nvidia 6000 series will be when it starts to look good; the PS7 will be photorealistic
I can see this happening
I often find myself testing this thing and going back to my usual settings without raytracing. I don't miss anything, really. I try to convince myself and it never happens.
that's because a lot of games only implement a small fraction of what RTX has to offer, so it's not as noticeable. A game like Cyberpunk, where everything can be ray traced, is very noticeable
I'm sorry, but the SSR versions of the reflections are quite asinine.
Ray tracing does make a difference for me in shadows and reflections; the game world feels less fake and more natural and dynamic now.
The main thing I want now is for games to increase the draw distance, because objects still pop in, and that takes the immersion out pretty badly, like the ghosting and artifacts of SSR do.
Ray Tracing Deception
Here's the real deal about Nvidia's promoted RTX tech:
* The most visible effect is a massive hit to FPS
* Path tracing proved that ray tracing has been flawed all along
* Ray tracing is only partly used, and games still rely on ambient occlusion for the most part
* Nvidia lists only _five_ games (two of them being Minecraft) with "Full RT" implementation after many years of marketing it
* Doesn't bring anything "new", since effects like reflections/lighting have been there already
* Their marketing says: "Eyes will immediately tell the difference", so who is going to admit he is "blind"?
* It's Nvidia's new "PhysX gimmick" to make easy money with overpriced GPUs with ridiculous power needs
There are some radical new gaming technologies which do improve the experience, like HDR screens, G-Sync/FreeSync and DLSS etc., but Nvidia's ray tracing is not one of them.
A prime example of ray tracing implementation is Cyberpunk 2077. It was nicknamed "Cyberbugs 2077", and ray tracing at first was very badly implemented (excluding AMD support etc.). The game was delayed and buggy, most probably because too much time was spent trying to get ray tracing working.
The game obviously was made solely as a ray tracing "showcase", since the lighting is just horrible without ray tracing. Thus the comparison "ray tracing ON vs OFF" with Cyberpunk 2077 is not valid or fair. It was always meant to work only with ray tracing and shouldn't be played without it, but who can really play it with full ray tracing on?
Cyberpunk 2077 with Nvidia RTX 4080: ~110 FPS (RT OFF), ~50 FPS (RT ON) and ~30 FPS (RT with Path Tracing ON).
ruclips.net/video/7NQqWlYZ2RA/видео.html
The new path tracing implementation revealed how poor Nvidia's ray tracing actually was.
ruclips.net/video/FjhQiMtTz3A/видео.html
Path tracing, or "fully-implemented ray tracing", wasn't available at the launch of Cyberpunk 2077, because no GPU could really handle it. The truth is that when they fully implemented it (as path tracing), it dramatically changed the lighting, and it started to look like when RT is OFF! I remember watching an older video about ray tracing in Cyberpunk 2077, and the whole point was to show how incredible the lighting is inside of a car when RT is ON. However, when the path tracing implementation of ray tracing came out, the lighting inside the car was totally changed and is now close to the old ambient occlusion style!
I feel sorry for the players, the developers and the competition for swallowing the bait. Ray tracing has not been and will not be a game changer for a while, at least not this kind of ray tracing. It's not what it's marketed as.
PS. I do have Nvidia GPU.
You're one of the few who saw through the typical Nvidia scheme, as they also did with PhysX.
Nvidia is very clever with their marketing scambait.
You *can* draw good shadows with shadow maps. You *can* draw good reflections with screenspace techniques and cubemap fallback. You *can* emulate global illumination with a wide range of raster graphics techniques. RT just takes you from 75% to near 100% accuracy, assuming the devs did a good job while implementing it.
Rasterization already produces results that are decently realistic; ray tracing just improves some parts. Ray tracing mostly helps with reflections and global illumination. If you were to compare a reflective surface, there would be a lot of difference.
Except not all ray tracing does that; Cyberpunk's just adds some new shadows to some objects. No global illumination or reflections.
@@reinhardt3090 that is straight up not true. Cyberpunk's ray tracing does ambient occlusion and reflections in regular mode, and full-on path tracing in Overdrive mode, where every light source casts a shadow and light actually bounces around in interiors, something not possible with rasterization. Hopefully that tech becomes more accessible with time, because it makes Cyberpunk actually look like the best game graphically ever created, especially with the character lighting
shhhhh. He's trying to get clicks and views, don't bring facts into this.
One of the problems with ray tracing is that reflective surfaces always reflect in an almost perfect way: extremely crystal clear, never blurry, which is not, in a real sense, always accurate.
@@kislayparashar CP77's PT mode completely changes the rendering algorithm to a path tracer. It isn't just "this and that"; it is literally the entire scene, every pixel, rendered with ray tracing on steroids
Ray Tracing was the Blast Processing from our times.
A gimmick made to sell the futuristic tech of current consoles.
I got each of the On vs Off incorrect (except for CP2077) and thought Off looked the best 😅
The Witcher 3 can show a massive difference due to global illumination when RT is on, but this will be heavily scene dependent.
Forests or indoor areas show a substantial difference between non-RT and RT, but in a flat scene such as this one you'd barely notice it. The RT looks nicer though, all the time, even when it's subtle, because the non-RT lighting can have an almost blueish hue, especially in the shade, but with RT on you get the color of your surroundings reflected onto Geralt and the geometry around him, sort of "anchoring" everything nicely so the scene feels more cohesive. It's difficult to describe, but it's very noticeable in-game. And I think global illumination is the area which shows the most difference, as it can change the feel of the entire scene. RT shadows are for the most part basically useless, or at least the implementations of them that we've got so far.
I totally agree with the overall premise - the performance penalty is too great for what it offers visually, except in a handful of games (CP2077, Metro Exodus Enhanced etc). And yes, while CP2077 also exhibits a massive performance penalty, at least you get insane visuals in return.
We'll probably have to wait at least a good decade in order to get RT games that justify using RT consistently.
Try 2-3 years max! ;) Even if I would prefer it not, the industry seems to be going all in with UE5, and for now it's a shit show on the performance side of the engine. So we have to wait for new versions to fix it, and then we'll have consistent RT implementation, as UE5 packs a ton of software-based RT features. Just look at games using UE4 today (Lies of P) and how far they've come from the games on it that are years older.
In The Witcher 3 with RT on, in indoor areas, for instance in a tavern, it's hard to see people's faces unless you hold a torch. This too-dark effect exists not only in The Witcher 3; it may be 'realistic' given the light sources placed in the scene, but it looks ugly and makes many details too dark to see. With the huge FPS penalty, I turn off RT for a better overall experience. I turn RT on only for the money I paid for the 3080 Ti.
What sucks, though, is that some newer games are clearly designed around raytraced lighting, like Jedi Survivor. It still looks great without it, but in some areas, when you turn it on, you can tell the area was designed with raytracing in mind.
More like modern games now include more dynamic objects and lighting, and baked lighting sucks for those.
If I'm honest, the only picture where I preferred the RT version was Cyberpunk 2077. Without RT, it looked better most of the time. Not more realistic, but better.
In Fortnite as well.
Agreed, RT just feels like over-lighting everywhere. Without it, the shadows look more natural.
@@soul-man RTGI is generally known to brighten up scenes, because it allows light to bounce off surfaces, carrying light into more parts of the image. This can easily be shown in renders as well; it's how light works in the real world.
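To see why bounced light can only brighten a scene, here's a toy sketch (not any engine's actual code; all names are made up for illustration): a single diffuse GI bounce adds an extra non-negative term on top of the direct lighting.

```python
# Toy model: adding one indirect diffuse bounce only ever ADDS energy,
# which is why RTGI tends to brighten scenes vs. direct lighting alone.
# Purely illustrative; names and values are hypothetical.

def direct_light(surface_albedo, incoming):
    # Lambertian response to light arriving straight from the source.
    return surface_albedo * incoming

def one_bounce_gi(surface_albedo, incoming, neighbour_albedo):
    # Direct term, plus light that first hits a neighbouring surface,
    # loses some energy to that surface's albedo, then reaches ours.
    direct = direct_light(surface_albedo, incoming)
    indirect = surface_albedo * (neighbour_albedo * incoming)
    return direct + indirect

d = direct_light(0.5, 1.0)        # direct only
g = one_bounce_gi(0.5, 1.0, 0.6)  # direct + bounce, always >= d
assert g > d
```

Since albedos and light intensities are never negative, the indirect term can't darken anything: the bounce only redistributes and adds light, exactly as described above.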
Probably because most of the other examples, the games weren't designed around raytracing, or were just using it for some effects like reflection or slightly better shadows.
I completely agree. Fortnite, Hogwarts and Witcher 3 looked much better without raytracing. If I'm being honest, Cyberpunk is like THE game for raytracing, and they really nailed it.
I get that sometimes there are instances where you can't tell it's being used. That is a credit to the artists, who have had years and years of experience "faking it". But RT is about more than just "looking better": it's about casting light, reflections, shadows, etc. in a more natural way.
It's very important to have an all-ray-traced future. Artists with all that experience "faking it" with lighting techniques would only have to turn a light on, and the RT handles shadows, bounce lighting, etc. Artists can be more like set designers for film, rather than having to place a bunch of light probes, animate shadow movement, hand-draw light bounce, and so on.
Basically, real-time RT makes scene construction less labor-intensive than the rasterized ways.
I believe 4A Games (makers of Metro) have become a full-blown RT studio simply because of how much real-time RT has changed their development.
I really wish AMD took real-time RT more seriously, because it would mean faster adoption of real-time RT and casting off the shackles of rasterized techniques. It would mean it isn't just Nvidia leading the way and setting the standard, and it could mean more optimizations in real-time RT, hopefully making it less "intrusive" to the overall framerate. The best way to tackle a problem is more eyes on the problem.
Speaking of 4A Games, they showed how much real-time RT changed how they work.
Lighting a scene was literally just adding the light from the sun and a few lamps in the room. Done; the room was lit up.
Tell me you own Nvidia stock without telling me you own Nvidia stock
Sure, raytracing is more accurate; it's literally simulating real lighting. But if the end result is basically the same, it is not worth the massive performance cost, and 'faking it' wins by a landslide.
@@socks2441 that's because you're looking at shitty RT implementations and not considering anything else
When faking it, it takes hours to set up the lighting in a single room of a building to make sure it's just right.
With RT, you add the light source and it does all the shadows and objects for you. Done.
@@socks2441 There was a time when 3D Rendering didn’t produce the same results as sprite-based games. It was computationally expensive. But, technology companies invested in it and it eventually became trivial to do.
If anything, it helps the developers. As 3D rendering gets more and more complicated, it becomes harder and harder to keep faking things so they look realistic.
If more companies took it seriously, RT (and path tracing) would become trivial and no longer detrimental. Developers could just place a light in a scene at development time, and the algorithm takes over, handling how light bounces will light and shadow every object in the scene - not the person.
No more light bleed. No more small objects missing a shadow. No more objects disappearing in reflections because they aren’t in the scene. It’s all handled by an algorithm instead.
Artists can focus more on the art of the scene rather than worry about how they are going to make the scene look convincing.
So, yes, RT (and PT) is expensive. Yes, the result isn’t always mind-blowing. But it’s still something that should continue to be pushed and researched for the future.
If companies had the same attitude about 3D in the past, we wouldn’t be where we are today arguing about if real-time RT is worth it.
Also difficult to see the effects in one scene and after it went through youtube compression. When I turned on raytracing in Cyberpunk, I was blown away by how good it looks. However, I had to turn it off, because it also resulted in a 15fps framerate.
I feel like raytracing does not make for better screenshots, but it can immensely improve my immersion.
Better screenshots really depend on the environment. Let's stick with CP2077 as an example, as it looks pretty good without RT. If we take a non-RT and a psycho-RT screenshot during the day in the desert, it will be very hard to see the difference. If we take the two screenshots in a place with multiple light sources and multiple surfaces (some of them reflective), the difference will be easy to spot.
Somehow I got it backwards every single time. I kept thinking the Non-RT side was the RT. Not sure what to make of that.
Same
Same for me as well. I don't think it's a very significant thing yet, but it surely will be in the future. It's like I won't be missing a lot if I don't use RT now.
What it tells me is that your gaming enjoyment will not be lifted to another level by RT. Maybe in the future, when all the bells and whistles of RT are fully exploited, but even then: RT doesn't make a good game; the story, the mechanics, the music/sound, etc. all matter more than RT.
Same
When you use the absolute worst scenes to showcase the take, especially when half of his examples are games with bad RT, yeah, it does make it hard to see.
Raytracing makes a big difference in games lacking any modern techniques, such as the first Half-Life and Minecraft.
It's not stupid, just not there yet. Devs spend huge amounts of time faking lighting, and these examples aren't really that good either. In the future RT will be the standard, but it won't be soon.
not there yet? dude its 4.5 YEARS later. not weeks or even months, but YEARS. knock it off dude.
Yeah, after 4.5 years it's still a useless thing that cuts your fps in half for nothing.
@@deathrager2404 bro, they didn't build Rome in one day
@@elove6215 this is a video game, not Rome. Stop with the ridiculous comparison.
@@elove6215 Rome was an entire civilization, Raytracing is a gimmicky tech.
Also, it has been a thing since the early 90s.
People, stop assuming ray tracing is an end-user graphical improvement tool. It's not; it's a game development productivity tool. Do you get it?
I'm not saying that's a bad thing, but quit slurping the marketing that it's a graphical improvement tool in ANY way. AerO R2, thank you for taking the time to illustrate evidence. It will allow devs to design games quicker, period (provided the end user pays for the hardware to run it). It shifts the resolution COST to the end user in expensive GPUs; it's a wash in the end visual result, and it's just about who pays for the resolution, devs vs. end users. How many youtubers are straining to showcase ray tracing as looking better, when in hundreds of videos you have to squint your eyes and "believe" to agree, when it just doesn't make a positive difference? Ask yourself honestly: does ray tracing look muddy? Yes it does. If you do a Fourier analysis on the images, they simply have less information, and the honest human eye can tell the difference.
When standing still traditional techniques hold up very well but you usually notice the difference when moving around
Especially human eyes 😢
@@Kamehaiku what💀
I'm sorry, but as a photographer: if you've played Cyberpunk or The Witcher with and without raytracing and can't see the huge improvement in realistic lighting, you either don't understand lighting and how it works, or you're just blind... It doesn't make a game more fun, but if you enjoy looking at pretty things, it certainly improves your enjoyment.
Where you notice it most is in games that use alot of ray tracing. Dying Light 2, for example, uses tons of raytracing when switched on, and comparatively, it also tanks performance without DLSS or FSR... and I mean TANKS. And you can notice a massive difference in the game with it on vs. off. Most games just don't utilize it well, and consoles have no business even trying to use it lol
Honestly, I tried running RTX on my 2080 Super, and it just never got turned on until I upgraded. And when the PS5 and Xbox advertised ray tracing, it just confused me.
Games where ray tracing makes as huge difference as Dying light 2 are just games that didn't work on normal graphics as much as they should.
Dying Light 2 gives you grass/foliage with no back shadowing or ambient occlusion without RT. Which is stupid, because you shouldn't need RT to do something so simple.
Foliage is heavily present in the game, so changes to it are quite noticeable.
Also, for reflections, several outdoor locations don't have a cubemap generated, so all you get is some very plain glass.
Screen space reflections are also extremely low-res and have a short draw distance.
If the normal lighting with AO, shadows and reflections was done well, you'd see almost no difference.
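The missing-cubemap problem above comes from how engines typically pick a reflection source. Here's a minimal sketch of that fallback chain (illustrative only; the names and the exact chain are assumptions, not any engine's real API): try SSR first, fall back to a baked cubemap probe, and if neither exists you get flat, lifeless glass.

```python
# Hypothetical reflection-source fallback, roughly how many engines
# choose what a shiny surface samples. Names are illustrative.

def pick_reflection(ssr_hit, cubemap_available):
    if ssr_hit:                # reflected ray resolved on-screen
        return "ssr"
    if cubemap_available:      # pre-baked probe covers this spot
        return "cubemap"
    return "flat_glass"        # nothing to sample: plain surface

# The outdoor locations without generated cubemaps land in the last
# case whenever SSR also fails:
assert pick_reflection(False, False) == "flat_glass"
assert pick_reflection(False, True) == "cubemap"
assert pick_reflection(True, False) == "ssr"
```

RT reflections replace this whole chain with an actual ray query, which is why they don't need per-location probe authoring.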
*a lot
Two words, not one. Think of it like this: a few, a little, a bunch, a _whole_ bunch, a lot, a _whole_ lot.
Pathtracing changed everything. It looks so good. Not videogamey at all. And we can finally play Alan Wake 2 natively on our 4K TVs, pixel-perfect without any interpolation techniques, at a buttery smooth 30 FPS almost everywhere, using our RTX 4090s, while the cards pull only about 550 watts for this awesomeness... :-)
0.5 kWh for rendering 108,000 4K raytraced frames - absolutely amazing indeed
There was a time when anti-aliasing was also a nice-to-have if you can enable it while maintaining good framerates. Now every game runs with anti-aliasing enabled.
But anti-aliasing gave games a big improvement in its time. Raytracing isn't raytracing all the time; often it's only used for better sales. There are games where the raytracing is really bad AND, once it's implemented, the game looks ugly without it. World of Warcraft, for example, natively had good shadows. Now raytraced shadows are in the game, and the shadows look as bad as they did at the game's release. 16x16 resolution...
If you play at 4K there's no need for AA.
Not true at all. You still need AA or DLSS at 4K to prevent jaggies @@dieglhix
@@dieglhixYou will still have specular aliasing and temporal noise, even at 4k or higher.
Well, it's almost impossible to see it; probably 1% of what it is at Full HD @@ItsCith
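For what it's worth, anti-aliasing at its most brute-force is just supersampling: render more samples than you display and average them down. A toy 1D sketch (illustrative, not any game's pipeline) shows why higher resolution reduces but never eliminates aliasing; it's just more samples, not infinitely many:

```python
# Toy 2x supersampling along one row of pixels: average each pair of
# high-res samples into one output pixel. Purely illustrative.

def downsample_2x(hi_res_row):
    # Box filter: each output pixel is the mean of two input samples.
    return [(hi_res_row[i] + hi_res_row[i + 1]) / 2
            for i in range(0, len(hi_res_row), 2)]

# A hard black/white edge becomes a smoothed gray step instead of a jaggy:
edge = [0, 0, 0, 1, 1, 1]
assert downsample_2x(edge) == [0, 0.5, 1]
```

This is also why 4K alone doesn't fully solve jaggies: sub-pixel detail (thin wires, specular highlights) can still fall between samples, which is where temporal methods like TAA/DLSS come in.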
A lot of the light tricks we use (like baked lighting and reflection probes) require a pretty static scene to work, but if the scene is mostly static it can look close to ray tracing.
Now try to implement those tricks in a game like Minecraft, where every block is dynamic. You can't bake lighting or reflection probes there, so ray tracing really shines in these types of games.
I’m excited for the future where ray tracing becomes the norm and runs at high frame rates because then we will start seeing more dynamic games ( and eventually they will be so dynamic a better term for them would be simulations than games)
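The Minecraft point above can be made concrete with a toy version of its block lighting (a simplified 1D sketch, not Mojang's actual code): light spreads outward from a source, dropping one level per block, and any block change forces a recompute, which is exactly what a pre-baked lightmap cannot do.

```python
from collections import deque

# Toy Minecraft-style block light on a 1D strip: level drops by 1 per
# block and must be recomputed whenever a block is placed or broken.
# Purely illustrative.

def flood_light(width, source_x, level=15, solid=()):
    light = [0] * width
    queue = deque([(source_x, level)])
    while queue:  # BFS flood fill from the light source
        x, lvl = queue.popleft()
        if 0 <= x < width and x not in solid and lvl > light[x]:
            light[x] = lvl
            queue.append((x - 1, lvl - 1))
            queue.append((x + 1, lvl - 1))
    return light

before = flood_light(8, 0)            # light falls off with distance
after = flood_light(8, 0, solid={3})  # place one block: instant re-light
assert before[3] == 12 and after[3] == 0
```

A fully dynamic world needs this kind of recomputation every time anything changes; real-time RT generalizes the same idea to arbitrary geometry and full light transport.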
I agree I can't really tell that easily. Especially in games that already prioritize graphics. It looks the best in games that have basic graphics and a rt feature added.
You don't know where to look
Neither does the creator of this video. Those screenshots were the worst places to compare RT on vs. off. In The Witcher 3, go to a tavern to see how there isn't any shadow under the tables, so they look like they're floating, and how much depth they get when you turn RT on.
Or go somewhere with lots of trees and see how the leaves gain depth, or go to Novigrad.
Really, everyone who says RT is stupid and makes no difference is just looking in the wrong places, where you can't benefit from RT. And if you don't know where to look, or don't see the difference at all, turn it off and enjoy your experience, the same way console players play at 30 FPS without knowing it but still enjoy it. That's what matters in a game, to be honest.
I've been playing cp2077 with ray tracing on for a year and I just had to deal with huge frame drops all the time. I recently turned it off and could tell a huge difference in graphics quality but I went from sometimes dropping to 20fps to a stable 80. Very worth it imo. you can't really appreciate how beautiful the city is when it's choppy all the time.
The games should have been "optimized well", not "optimized good".
Control would have been a good one to look at. Out of the games I've tried (including most games on this list), I think it's among the ones that take the most advantage of it.
And it performs great for the most part
Its style and material usage also lend themselves to RT effects really well. Helps out a lot more there than in other games
I think it's one of the worst. The game becomes noise hell with RT on and looks worse for it.
@@AbiRizky performed like absolute crap on my computer. terrible memory management resulting in blurry textures all over the place.
really messed up my immersion and I just couldn't finish the game like that. otherwise it ran smooth and fine.
@@cyborgchimpy that's actually a bug. You can find the community fixes on Nexus, iirc. I had that exact same issue too, used the mod, and it ran great afterwards. For some reason it couldn't fix the Epic Games version, but it worked fine with Steam.
The witcher 3 with RT in some areas is stunning
I think RT does make a big difference if you have the right display for it and the right card. I love using RT on CP77 and Alan Wake but that being said I’m also on a OLED
Well, the original video is 6 months old and aged like milk. That's like saying 1440p is bad years after launch, which was true at the time. Now 1440p is becoming the standard. It's the nature of tech; calling new tech bad is moot.
@@peternash7104 1440p isn't the new standard smh
@Mouchoo bruh, Steam does surveys. 1080p and 4K went down as a percentage. 1440p is currently the second most popular. That's the definition of becoming the standard.
@@Mouchoo 1440p is definitely becoming the new 1080p faster than you think. 1080p just looks ass in 2023 imo. 1440p looks ten times sharper when you come from 1080p for the first time. Same goes for 1440p to 4K, but most people probably won't be grabbing a 4K monitor, since less than 4% currently use 4K for gaming.
*an OLED (because "OLED" starts with a vowel sound)
Real-time lighting still has a LOOOONG way to go. First of all, as many have already correctly noted, RT doesn't necessarily look better. Often it just looks a bit different. That's it. Why? Because most games use baked lighting, which is just light calculation done in advance and then statically applied to the game world. But it's still light calculation, just not real-time. As of today, RT is often limited to specific surfaces and/or specific objects in the game world. It uses very few rays per pixel (like a couple hundred, maybe 500 to 600), which creates noise, which in turn requires denoising algorithms. Also, the distance at which objects are actually included in RT is very small. In Cyberpunk 2077, only objects up to maybe 10m away actually get RT'd.
Edit: RT does have a great benefit for professional applications. And once it has fully arrived, and a fully fleshed-out version of RT can be used even by midrange and entry-level hardware, it will make lighting in games a much easier and faster process. Right now, getting all the lighting and shadows right can be a difficult and time-consuming task; if you can just enable real-time lighting, all of it is done automatically instead of you as a dev having to design everything by hand. So don't get me wrong: RT is a great technology that already has its uses today. But in gaming, its full value is still a solid 15 years away.
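The low ray-count noise mentioned above is just Monte Carlo sampling error, which shrinks roughly as 1/sqrt(N) with N rays. A tiny sketch (illustrative only; real renderers sample actual light transport, not uniform numbers) shows the effect:

```python
import random

# Toy Monte Carlo estimator: average N random "ray" samples of a
# quantity whose true mean is 0.5, and watch the error shrink as the
# sample count grows. This is why the low per-pixel ray counts above
# leave visible noise that denoisers must clean up. Illustrative only.

def estimate(n_rays, rng):
    # Each "ray" contributes a uniform sample in [0, 1]; true mean 0.5.
    return sum(rng.random() for _ in range(n_rays)) / n_rays

rng = random.Random(42)                    # fixed seed for repeatability
err_few = abs(estimate(8, rng) - 0.5)      # few rays: noticeably off
err_many = abs(estimate(8192, rng) - 0.5)  # many rays: converges
assert err_many < err_few
```

Going from ~500 rays per pixel to the thousands offline renderers use would cut the noise, but the cost scales linearly with ray count, which is exactly the performance wall real-time RT runs into.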
Ray tracing is awesome. I have used it comfortably since it launched on my 2080. I'm on a 4080 now and it's only gotten better. I run it in every game that offers it and have absolutely zero complaints.
ok mister bezos
I'm curious how many upgrades there were in between those? Those cards are almost one tier below the absolute best cards in their class. I have a 3080 and use RT in some games, but the hit to frames and the drops force me to use DLSS... It's good, but it's not native, so it just shifts the problem: weird textures, and some things here and there are wonky... In those instances, just turn RT off. I will admit that on older titles like Control, maxing out all the eye candy looks stunning.
Gamers will really talk about how much of a scam ray tracing is and then piss themselves about how amazing a game with baked RT lighting, carefully placed fakery, that explicitly avoids scenarios where the lighting would break looks.
"raytracing released about 4 1/2 years ago"
dude, the 80s had raytracing
Yes nerd, he means in gaming obviously
We mean real-time raytracing, giving you 60 frames per second, with a movable camera, light sources and boxes, at larger than 320x200px.
After these years of ray tracing, I can conclude that art direction is more important than simply turning on the real time ray tracing.
Reflections had the biggest improvement with Ray Tracing.
Screen space reflections honestly look so unstable that I actually prefer having SSR turned off and just having cube maps. Heck, even older games like MGS2 and Mortal Kombat: Deadly Alliance faked reflections by mirroring a lower-poly model on the other side, which looked better and more stable than SSR.
RT reflections are something I'd very much prefer in new games.
The second most unstable thing to my eyes was ambient occlusion. I always hated how SSAO looked: it added a dark glow everywhere that looked worse than baked ambient occlusion in PS2-era games. HBAO and newer methods improved the accuracy to some extent, but they all still break when something occludes the screen or at the edges of the screen. Having stable AO with the help of RT would be very nice as well.
However, RT itself looks unstable at times. RT GI has a delay when the lighting changes, and that looks so weird to me that at that point I'd just prefer rasterized lighting. Some games manage to camouflage this issue better than others. Hopefully the stability issues will be further ironed out over the years, just like how ray reconstruction has made significant improvements to the GI update rate. Also, until we get 4090 levels of performance from a $400-500 GPU, using RT will be difficult for me.
The problem with SSR is that it breaks when the camera tilts.
So screen space reflections actually do work well in cutscenes, or in third-person games where camera tilting is limited. It really depends on the camera.
And yes, the RT GI delay is very annoying too.
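The camera-tilt failure comes down to one constraint: SSR can only sample points that were actually rendered on screen. A minimal sketch of that validity check (normalized screen coordinates; the function name is illustrative, not any engine's API):

```python
# Minimal sketch of why SSR breaks on tilt and at screen edges: the
# reflected ray can only be resolved if the point it hits projects
# inside the visible frame, where color data exists. Coordinates are
# normalized [0, 1] screen space. Purely illustrative.

def ssr_resolves(hit_screen_x, hit_screen_y):
    # The reflected hit must land inside the rendered frame.
    return 0.0 <= hit_screen_x <= 1.0 and 0.0 <= hit_screen_y <= 1.0

# A reflection of something in view works...
assert ssr_resolves(0.4, 0.7)
# ...but tilt the camera up and the reflected ground/object now
# projects below the frame, so there is nothing to sample:
assert not ssr_resolves(0.4, 1.3)
```

This is also why controlled cameras (cutscenes, driving views) hide SSR's flaws: the framing is authored so reflected content stays on screen.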
True, but the problem is that cube maps and other simple reflection techniques can't display things that SSR or RT reflections can, or don't work with modern rendering engines at all, so SSR, just like TAA, is absolutely necessary in some scenarios despite its flaws. So yeah, I totally agree that some 5-, 10-, even 20-year-old games look beautiful despite their age, and I'd rather play them at 4K and 120 fps than some modern "realistic" game at an upscaled res with generated frames.
@@J0rdan912
It works extremely well if reflective planes are just portals that render the same scene as a mirrored view and use that as the specular component. SSR can be used when the camera is controlled, like when driving a vehicle, in cutscenes, third-person view, crash cams, aerial views... it's the first-person view, where the camera can tilt freely, that breaks SSR.
A cube map is a fallback that can be used on curved surfaces where SSR doesn't work.
@@J0rdan912 SSR looks beyond saving to me. First person, third person, cutscene or whatever, the reflection breaks at the edges of the screen and on any occluding object; especially in third-person games, your main character blocking the centre of the screen automatically breaks SSR.
Same goes for TAA; it just looks horrible compared to SSAA and even FXAA. I was playing Forza Motorsport at 1080p the other day and everything in the distance looked like a smudge fest. I went back and booted Forza Motorsport 4 on my Xbox 360, which ran at 720p but looked so much clearer; it almost felt like putting on glasses. Depending on the game and how TAA is used, it can look okay-ish, but I'm not happy with the direction modern rendering is taking with all these blurry images.
@@RiasatSalminSami Modern games are targeted at 4K; that's where TAA is mostly okay. Also, older games aren't the same thing: without temporal processing, modern games would not look "clearer", they would look like Sega Saturn faux-transparency pixel grids and a noisy, shimmering mess all over the place. Same with SSR: new rendering methods require limited SSR or low-res RT reflections; older methods just aren't going to work with skinned and moving objects or particles. I do agree that some older games looked great, but at this point there's no going back; the only alternative is 2D pixel art/hand-drawn or retro-3D stylized graphics, where older methods are still fine or even a benefit.
I guessed every one of them right (almost got fooled by Requiem), but as someone who finds ray tracing very interesting, I personally think games look a lot clearer/cleaner with RTX off, which is something I prioritize in video games. RTX (even when well implemented) sometimes just looks like a lot of colorful bloom was added. But it really depends on what people want out of a game.
Great video though. Glad someone was able to nicely lay it out for people who are interested in Ray Tracing.
bro, you made some comparisons with images taken in spots where the raster has nothing to envy the RT for
Yea...he should delete the video.
That's a weird take. The more ray-traced games there are, the better the implementations and the results will become. Sure, it's not always good (like everything), but it can be. The primary motivation to be against ray tracing is that the hardware is expensive, but things should get better on Nvidia's side (expect a 5060 to be able to do RT); it's still an unknown on AMD's side, which is lagging behind a lot. Hopefully Intel can do something in the next few years too.
A weird take is this youtuber, with his low video count, clearly not having watched Nvidia's own videos showing ray tracing in zoomed-in still images of car reflections: Nvidia's most powerful GPUs ran Battlefield V at 1080p 60 fps on the most expensive current tech, for a difference even less noticeable than in this video. Alan Wake 2 with EVERYTHING maxed at 4K on a 4090 gets a sad 32.2 fps average with no upscaling (TechPowerUp). The VRAM for Alan Wake 2 at 4K ultra minus ray tracing is 9485 MB; at 4K max with PT/FG it's 17807 MB. And that is a SMALL-scale game. A game like GTA 6 could easily want 23+ GB, since it's not set in a little town but in a city with tons of lights and people, which you drive/fly through instead of walking like in Alan Wake.
In games that actually implement the feature it looks massively better than off, like in cyberpunk 2077. It's also going to be a strong tool for multiplayer games as a way to prevent people from cheating or having a competitive advantage by tweaking shaders and graphics settings to remove shadows and change lighting. Having a real time lighting can become an actual mechanic in game and be added into elements of tactical decision making. RT + DLSS gives you insane detail + performance. Using DLDSR+RT+DLSS = insane performance on 4000 series cards with insane graphical detail.
What you said about multiplayer makes no sense, not just because it isn't true, but because no one seriously uses shader manipulation as a cheat, and if they did, they could probably do more by messing with hidden settings. Plus, no one taking a game seriously is going to turn on ray tracing, halve their fps, and introduce a ton of input latency.
It looks like Cyberpunk intentionally botched some of the non-raytraced surfaces (e.g. water) to look better with RTX ON. In games where reflections are properly implemented (Hitman), the difference is minuscule. Hell, even Cyberpunk looks better with DLSS Quality / RTX OFF @ 4K than DLSS Performance / RTX Psycho, if you have to choose between the two options.
Strongly disagree. I've tested DSR at 4-8K resolution as well as 4K native, and I'd take 1440p native with RTX over 4K baked lighting any day. The game's textures aren't even 4K. Increased resolution can also make some things noticeably worse, because geometry can become too sharp or clear when it was designed not to be viewed at a higher res. I personally play the game with RTX and DSR 4K; the difference between native and DSR is very, very small.
@@ramsaybolton9151 you freaks that play at 4k really are something else.
@@Deathstro 4090
Very very few games are designed from an artistic standpoint around assumed real-time Ray Tracing. Pretty much every game still needs to be designed to look decent even with all RT features disabled, so very few titles are utilizing an artistic direction to fully exploit the tech.
Ray tracing changes the entire atmosphere of the game. For me, when ray tracing is on, I feel like I can smell the air.
All you said is true, except that the 4070 is nowhere near as powerful as the 3090.
Progress has to begin somewhere and we're still in the early stages. Ray tracing is here to stay and will become the standard in the next generation.
Ray tracing? Yeah, sure! RTX? LMAO... It will get nowhere with how Nvidia does things... It will end up exactly like PhysX.
I've been playing plenty of games with RTX and maxed-out settings on a 4070 Ti, and I'm getting good frames with DLSS 3.5 (Jedi Survivor after the optimisation updates); The Witcher 3 and Cyberpunk have looked sensational.
Give it 2 years and you won't be able to play with RT due to unacceptable VRAM requirements.
Give them 2 years and we will have better graphics cards. What's your point lmao @@zin0gr312
I have a 4070 laptop and I can run those games at a good framerate with raytracing too; they look phenomenal.
"with RTX"
I fucking love that
@@eyescreamsandwitch52 why does that tickle your pickle?
Path tracing is what we are waiting for, in cyberpunk it is very impressive.
Witcher 3 was one game where I really noticed a difference. Cyberpunk is a bit too much for me to run ray tracing, I get capped around 50-60fps, but it looks insane without it anyway.
So, I look at raytracing like this: Nvidia created hype, and that hype turned into a category they will never lose in product reviews. Secondly, the extra cost for raytracing is tanking the price-per-fps average, almost entirely exempting Nvidia from being considered. And finally, AMD is being forced to pull resources and engineers to compete in this category, because the vast majority of gamers are consumers who will buy whatever is marketed in front of them.
Sounds like you've learned how every big company runs. They create the demand and basically force others to comply.
The only thing that is an obvious upgrade is RT reflections, so if you've got the framerate to spare, turn that on; everything else just isn't worth it, especially if you have to turn on upscaling to get a playable framerate. I swear, people who think upscaling looks fine compared to native resolution need to see an optometrist.
Ray tracing isn't stupid; it's amazing technology. It's just not really used properly, because of dev knowledge and, most importantly, the hardware. Yes, the 4090 exists, but it's a tiny slice of the market. Full raytracing/path tracing is amazing, and it certainly is the future. I reckon it won't take off properly until the next console gen, or maybe a mid-gen 'pro' model refresh - I hope.
The problem is graphics are outpacing hardware, and the substitute is to lower the resolution and then upscale it, and then call it a 'well-optimised game', as if lowering the resolution from 2K to 1080/720p isn't a penalty in itself. DLSS is not a good implementation, but it is now required.
Usually, if the gameplay is captivating, Raytracing takes a backseat. If you are immersed by gameplay, you are not going to be noticing the lighting that much.
I rarely use it, as it becomes a battle of compromises, like how low a framerate are you happy with.
For me it's a must-have. I always play my games at 30-50 fps just so I can have raytracing and 1440p without DLSS, though if anything I'll enable DLSS first. I've ground through hundreds of hours in single-player games at 30 fps, no problem. It's ten times the fun; if I turn off raytracing I lose interest in the game almost instantly, despite it being fun. Just the missing good reflections ruin everything for me.
@@williehrmann for me it's a case-by-case basis. I have a 3060 Ti at 1080p, so I just don't have the hardware to do this for several games. Cyberpunk? I leave it off, since I would need to enable DLSS max performance, which looks like a 240p youtube video. Forza Horizon 5? Yeah, why not; I still get max fps with DLSS on and the image quality isn't worse. Fortnite? Also yes; it runs a stable 60 with DLSS performance. But yeah, for me I just test whether I can run a given game with raytracing or not.
And I also don't want to get too attached to raytracing, since I plan to upgrade to an AMD GPU later.
@@williehrmann It's all about preferences. One person can enjoy gameplay with almost indistinguishable visuals, while another prefers eye-candy visuals with eye-gouging 30-50 fps and input lag.
What if RT actually produces immersion? It does for me. So I gladly offer a few frames to be more immersed. Lighting is EVERYTHING in the visual presentation.
@@ItsCith It is, but only if implemented properly and with global illumination, which is a pathetic number of titles. Also, once you've been in for a couple of hours, you can't tell the difference anymore. So it's good for comparisons, screenshots, tech demos and videos, but mostly useless for actual prolonged gameplay sessions.
Actually, Hogwarts Legacy does have visible ray tracing. The frame you showed is just a hallway where both versions show reflections, but if you move the camera downwards you can see that without ray tracing the reflections disappear, because the game only reflects what's on your screen. Ray tracing, instead, shows even what's behind the character if there are reflections. I think (for me) you misjudged this game.
Alan Wake 2 is a pretty great recent example of what Raytracing/ Pathtracing can look like when implemented well. It looks stunning.
no
Alan Wake 2 is a perfect example of forced diversity.
100% true brother, it's a total shit show, woke af game @@gzuskreist1021
@@gzuskreist1021 This is a video and comment section about computer graphics. How does your comment relate to that?
@@gzuskreist1021Go touch some grass, weirdo.
The Last of Us doesn't even have ray tracing, it's just bad optimization lol
You mean badly ported, right? Runs like a charm on PS3
I've been playing video games since the '80s and I can see a big difference, because I've seen every little step in the progress of 3D graphics. And if I look at a reflection in some game and see that it is artificial, wrong, I feel sad.
Old video, but you are generalizing ray tracing. NOT EVERYONE uses it for games. Real-time ray tracing/path tracing on modern NVIDIA cards is a priceless asset to those in the field of design and marketing. What took literally HOURS, if not days, is done almost in real time now. That would not be possible without the tech Nvidia brought to the masses. From the 20 series on, my life changed for the better in terms of my work, thanks to RTX real-time ray tracing! When it comes to gaming, who cares. They're just that: GAMES. People spending 1500 to 2000 to play a video game either already have issues or have no room to complain. The cards are designed for far more than the millennial gaming community.
Depends on the game. Cyberpunk, which is already a looker, becomes absolutely gorgeous with RT on, and while there's a performance hit on my mid range 3070, the trade off is worth it imo. I'd argue that RT is by far the most impressive tech I've seen in gaming in the last decade, since lighting has a more dramatic influence on a game's aesthetic than poly count.
Performance hit even on Nvidia cards with ray tracing on is pretty significant at the moment. I personally do not care for ray tracing or realistic graphics. I'm a gameplay guy. I remember Physx being very popular back in 2012 and it was pretty impressive but then they decided gpu driven physics is not necessary and we didn't need that much detail. Same might happen with ray tracing.
I think the main problem with RT implementations that distinctly improve graphical fidelity (Alan Wake 2, Dying Light 2, Cyberpunk, Portal RTX, Jedi Survivor, The Witcher 3) is that they only run acceptably on RTX 4000 and maybe 3000 cards. AMD's RT performance is still very limited and only competitive if heavily scaled-down RT implementations or only single effects (like shadows) are used, as we often see at the moment. The same goes for the consoles, which run on AMD hardware as well. Intel's cards have better RT performance per price than AMD's; however, unfortunately, the cards they sell right now are not especially fast to begin with.
To really see good RT implementations more often, AMD needs to become competitive with NVIDIA with regard to RT. For the consoles, this may only happen when the next generation releases, in 2028 at the earliest. Until then, I fear, we will only see worthwhile implementations in games whose developers are heavily aided by NVIDIA engineers.
You mean on an 80-series card, because it sure doesn't run well on a 4060 Ti or a 3070/3080. You need a 4090/4080/3090, else you're playing at 1080p lol
Yeah, true, my 3070 Ti isn't even hitting 60 with RT on, even in 1440p quality mode @@ogaimon3380
@@ogaimon3380 Nothing wrong with being happy with ray-traced 1080p. It's gorgeous.
Not really a problem. Just wait or pay the high prices now. RT will take over in the next ~5 years and will prolly be default by next gen
I don't know man. I usually skip ray tracing when the visual gain vs. performance loss is clearly unfavorable, but Cyberpunk with path tracing is phenomenal. I started it all over again with a 4070 and DLSS 3, and so far it looks like a freaking remake, completely overhauled. We're in a situation in which ray tracing is still a gimmick, but when it's implemented well, you do want DLSS 3 on your side to play it "the way it's meant to be played".
I have a test channel where most viewers want to see ultra and ray tracing. High details aren't much worse than ultra graphically, but they can deliver a 20% difference in performance. Ray tracing is and will be unplayable; no generation will ever be enough for it, and to me it's the biggest piece of crap ever released. DLSS 3 FG is again just something Nvidia uses to increase sales.
The technology has been here for 4 years and I haven't played a single game with it.
And sometimes it seems to me that a game that supports Ray Tracing runs worse than if it didn't have it.
Well, frame generation and resolution scaling are useful if you don't have a high-end card or you're playing 4K ultra. They just increase your fps when working properly, but if you're already getting 180 fps... meh.
I do agree that Ray tracing is mostly garbage, I finally have a card that can run it and it's rather disappointing.
I just hope games don’t start to release without the ability to turn raytracing off.
ray tracing isn't bad, games just implement it poorly most of the time/rasterized version is already near perfect. Cyberpunk 2077 is an example of a game where ray tracing is VERY noticeable (especially when driving cars first person), you might not see the difference in still pictures though.
Exactly, it depends on the game. It's also why Nvidia shows off the ray tracing feature on Cyberpunk: because it's actually noticeable.
Yeah dudes will pay 2k for a 4090 just to play 2-3 games at 60fps with cranked out RTX settings. Seems like a waste of money to me…
For 1000 bucks you can scoop a 7900XTX and muscle through pure rasterization at 4080 super levels
The developers oversell the feature, but you also missed the point: reflections and shadows in rasterization behave erroneously, we've just been accustomed to it for years. Ray tracing, when used correctly, makes things more immersive by simulating reflections and shadows with more realistic physics.
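To make the "more realistic physics" point concrete, here is a minimal sketch of the two queries at the heart of ray tracing: a ray-sphere intersection test and a shadow ray fired toward a light. The function names (`ray_sphere`, `in_shadow`) are illustrative, not from any engine; real renderers run the same queries against BVH-accelerated scenes.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance t along the ray, or None.
    Solves |o + t*d - c|^2 = r^2 for t; direction must be unit length.
    Simplified: ignores the far root (rays starting inside the sphere)."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def in_shadow(point, light_pos, spheres):
    """Shadow ray: the point is shadowed iff any sphere blocks the
    segment from the point to the light. This is why ray-traced
    shadows respect occluders that rasterized shadow maps can miss."""
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(sum(x * x for x in to_light))
    d = tuple(x / dist for x in to_light)
    for center, radius in spheres:
        t = ray_sphere(point, d, center, radius)
        if t is not None and t < dist:
            return True
    return False
```

With one sphere between a surface point and the light, `in_shadow` reports occlusion; offset the point sideways and the shadow ray misses, so the point is lit. The same intersection query, re-aimed along a mirrored direction, is what produces ray-traced reflections.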
RT global illumination with AO is a big deal, but it's very hard to run. Lumen does it well with software, but the problem is that Nvidia isn't interested in performance. They are only interested in selling their top GPU. Nvidia should be aiming to include Lumen level visuals/optimization (who cares if it's not super accurate) but with a big uplift due to hardware acceleration and they should be working with Unreal to implement that. Witcher 3 GI and AO look insanely good btw and I will get into that and the 4070 later.
RT reflections are hit and miss, and almost always a miss. It works in Spider-Man because you are getting a blurred, muted reflection on office buildings and because gameplay is so fast; same reason it works in Doom Eternal. It fails in Hogwarts because what you expect to look good while walking around Hogwarts looks worse than screen-space reflections, which are almost always the better thing to use. And for GOOD reflections, which are needed in something like a mirror, you can just draw things twice, which is what Uncharted 4 does at the end with the kid and what the TLOU1 remaster does the same way in the intro.
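The "draw things twice" trick amounts to reflecting the camera across the mirror plane, rendering the scene from that virtual camera into a texture, then sampling that texture on the mirror in the main pass. A tiny sketch of the core math (the function name is mine; this is the general planar-reflection idea, not code from any of the games mentioned):

```python
def reflect_across_plane(point, plane_point, plane_normal):
    """Mirror a point (e.g. the camera position) across a plane.
    plane_normal must be unit length: p' = p - 2((p - q)·n)n."""
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    return tuple(p - 2.0 * d * n for p, n in zip(point, plane_normal))

# A mirror lying in the y = 0 plane: a camera at height 2 gets a
# virtual twin at height -2. Pass 1 renders from the twin into a
# texture; pass 2 renders normally and samples it on the mirror.
virtual_cam = reflect_across_plane((1.0, 2.0, 3.0),
                                   (0.0, 0.0, 0.0),   # point on mirror
                                   (0.0, 1.0, 0.0))   # mirror normal
```

The cost is one extra scene render per mirror, which is why the trick is reserved for a few flat, important surfaces rather than used everywhere.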
RT shadows are dogsh@^. With the exception of Shadow of the Tomb Raider, I have never seen an implementation where they didn't look worse and have awful pop-in. The only reason it looks good in that game is that it's the only RT thing used, so they can push it to absurd levels.
On Jedi Survivor. They didn't optimize it. They just removed ALL RT from the RT off option, which greatly reduced CPU overhead. The console has RT on even in performance mode (where it runs just as bad on PC) and that is one reason why the game was made in record time (game was made faster than Mass Effect 3 and that was a very rushed game compared to the previous titles in that series). The studios want RT shadows more than anyone else because it removes any work they have to pay for and they can hire cheaper labor. EA doesn't care about RT shadows being awful and popping in 20 feet in front of you on Jedi Survivor, just like on Witcher 3. The only solution for Jedi Survivor to make it look and run well (minus those awful RT shadows) is DLSS and frame gen which is being worked on by a modder called Puredark. AMD paid to keep both out in that sponsored title.
On Witcher 3 and your 4070: 12 GB can't run all the RT at good visual settings. Set it to ultra and only run RT GI and AO, which is exactly what the console does. The screen-space reflections look just as good on water almost all the time, and you won't run out of VRAM after an hour, where performance tanks or you crash. You can push 1440 native, skip DLSS, and use DLAA instead (mods and tools exist for pretty much all games that have DLSS now). Then, instead of in-game sharpening, I would fine-tune with integer scaling in the NVCP, which opens up a new sharpening option on a per-game basis, or use ReShade CAS.fx.
I think that just sums it up, good analysis.
Nvidia is looking to have RT replace raster in game development. They are VERY interested in performance with the technologies they release, but always blow up their tech demos. For UE5 Nvidia decided they will fully stand by Lumen GI instead of using their own implementations. They are instead improving direct lighting, reflections, and denoising. Hardware RT functionality is exactly what they have worked in implementing and currently you can enable a wicked increase in reflection and direct lighting quality through them.
If I were to choose one RT setting featured by the majority of games to keep, it would be Reflections. If I were to choose an RT setting in general it would be Global Illumination.
Can't say I agree with your takes here.
Not making it through this rant but RT reflections look amazing in Watch Dogs Legion and even better in Cyberpunk Overdrive mode. Now with DLSS3.5 the fidelity is skyrocketing.
You've made me want to try Shadow of the tomb raider now. I enjoy RT shadows when nobody else does
CRYtek had software ray tracing, showing we really shouldn't have gone the hardware route. It's ridiculously wasteful on resources for very diminishing returns. Just grab the CRYtek Neon Noir benchmark to see for yourself, or play the Crysis remaster games. The trend in development right now is lazy optimization with huge VRAM and RAM requirements for very little difference in visuals, and years of patches to fix broken games after release. Not to fanboy Rockstar, but it's like they're the only ones able to make these huge open-world games run well. As amazing as Elden Ring was as a game, that engine is junk. Cyberpunk was nearly unplayable at launch. Not to mention GPU makers are gimping VRAM on most of their products, making this whole ray tracing issue even more pronounced.
RT is only really relevant if u have a top nvidia GPU.
There are quite a few games with poor RT implimentation but in some games it can be transformative since lighting really is the most important part of visual fidelity.
Meh. I play Cyberpunk with psycho RT just fine at 1440p on a 6950 XT.
@@GeriatricMillenial But u don't get the RT performance of an equal Nvidia graphics card like the 3090 Ti, and u also get worse image quality upscaling with FSR instead of DLSS.
@@dante19890 I play cyberpunk Path traced with an RTX 3070. It's not a top gpu.
@@trichi827 ur not maxing it out tho with the pathtracing. With the newer games that are comming out ur not gonna be able to play with RT , Alan wake 2 for example
@@dante19890 I’m playing at max settings. I’m on 1080p. 40 fps in the city, 60-70 everywhere else. I’m not gonna play Alan Wake, so I can’t tell how it runs on my PC.
Sometimes what "looks better" is just more vivid colors, kinda like when people showcase graphical mods, it's always at night in rainy weather with many light sources reflecting.
Me... why do I even care? I never used this thing. I have an RTX card, and yeah, I mostly play at 120 Hz or hell, even 90 Hz. RT just makes things slower.
Thank you Bro for sharing your thoughts.
Brilliant vid. Great narration, pleasant to listen to.
Raytracing is going to be amazing on the RTX 9090 Poop
1450 watt special ATX+ connectors incoming - I can see it and don't you dare shower this month, because climate change is real.
raytracing in games just does not pay off. the power consumption is unreal and you can achieve very similar effects with much less power usage. mathematically, there's no shortcuts for raytracing, unless you castrate it into what you could do with other approaches.
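A back-of-the-envelope sketch of why there's no shortcut: the number of ray queries scales multiplicatively with resolution, samples, bounces, and shadow rays. The function name and the cost model below are my own simplification for illustration, not a measurement of any real renderer:

```python
def rays_per_second(width, height, spp, bounces, shadow_rays_per_hit, fps):
    """Rough upper bound on rays traced per second for a naive tracer:
    each sample follows a path of (1 + bounces) segments, and each
    hit fires shadow_rays_per_hit rays toward the lights."""
    rays_per_path = (1 + bounces) * (1 + shadow_rays_per_hit)
    return width * height * spp * rays_per_path * fps

# 1080p, 1 sample/pixel, 1 bounce, 1 shadow ray, 60 fps:
# roughly half a billion ray queries every second, before any
# denoising or upscaling enters the picture.
print(rays_per_second(1920, 1080, 1, 1, 1, 60))
```

Doubling any single factor (4K instead of 1080p, 2 bounces instead of 1) doubles the total, which is why shipping implementations cut samples aggressively and lean on denoisers and upscalers to hide it.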
For the convenient price of $4,399
Ray tracing is amazing; it has the potential to make 3D rendering look more real than it ever was. Take a look at Minecraft RTX and Quake II RTX to see what it can do. Now, whether or not the developer uses it is really the question, and I feel like developers are the biggest problem nowadays. AAA titles are few, and what does come out seems like it could have been better. I have yet to see a ray-trace-only game, meaning anybody who does good RT, like Cyberpunk and Witcher 3, had to code both just to offer it. I say go RT only. Got an old PC or console? Stop being a cheap ass and upgrade. We can't have nice things if we don't drive the changes.
It began on PC being used sparingly, implemented into game engines after being sponsored by Nvidia. Even with games being made for RT-capable consoles, implementation has been slow because said consoles are generally quite weak. Lumen from Epic is RT and even more efficient than traditional methods; it is good stuff. If you want better-looking games, you take frametime hits. If you're fine with rasterized effects and their problems, fine. But RT is useful and absolutely better looking than raster when done right.
Games are made to sell GFX hardware and to push the tech so people keep rebuying the same shit over and over again.
Wrong
@@devinkipp4344 thanks for such a well presented opinion
@willuigi64 you're welcome, glad to enlighten others
It's easy to identify ray tracing on or off once you look at the fps counter to find half of your frame rates are gone.
Off topic, but how can you breathe?
In RE4 Remake raytracing is sooooo good compared to non RT.
Finally, a sane person. Ray tracing was dumb af at release and overhyped by the soyboy nerds like always. Turn ray tracing on and turn fps down? Yeah, no thanks.