Do you think it's worth using raytracing? Let me know your thoughts in the comments below!
So, being a 40-year vet of video games and computer stuff, I can say this: I first heard of ray tracing around '91-'93, but it was of course used for still images. When I heard it would be USED in video cards I was doubtful, so I think we're lucky to have what we have so far. Two other things, however, have been great and should have been improved more before starting up the RT: physics and shading.
@@j.j.9538 I mean, Epic Games have been doing both of those things in UE5 with Nanite and some animation tech I forgot the name of that's in use in Fortnite.
Why do I feel like the only person who sometimes doesn’t see much difference in RT vs no RT? I feel like I see a much bigger difference between resolutions….
Believe me, I felt the same before. But now I'm telling you: you should play Cyberpunk 2.0 with path tracing. Either do that, or don't play it at all. The difference is night and five days away lol! It's so glorious 😍!
Ray Tracing is much less important to me than good HDR implementation. Running a Neo G7, and good HDR games are mind-blowingly awesome, and can make pretty bad games in my opinion (e.g. Horizon, GoW) into a better than average experience.
The ray tracing scam is not working. People buying 4090s are only buying them for raw shader performance and nothing else. Only THREE GPU models sold out this generation: the RTX 4090, the RX 7900 XTX, and the RX 7800 XT. When only three cards sell out and two of those cards are from AMD, your competition, that's a dead giveaway that you have failed completely. And the AMD cards are not cheap budget trash with low profit margins: the 7800 XT is selling at over $500 and the 7900 XTX launched at $1,100, and both models sold out at launch at those prices. The 4070 and 4070 Ti did sell somewhat OK, but only because there was no competitive AMD card at those prices at the time. The second the 7900 XT dropped to $700-750, sales of the 4070 Ti dropped. The second the 7800 XT launched at $500, sales of the 4070 dropped like a stone. NVIDIA is losing market share and mind share at levels never seen before in the history of the market.
The future of GPUs may be ray tracing/path tracing, but I don't want it at the performance hit of today. There has to be a way to make this technology more accessible in the future, maybe something like a breakthrough technology in some kind of ray-tracing chip on next gen GPUs.
The biggest problem with RT is noise and artifacting. Because there's no way we have the GPU power to shoot enough rays for every pixel on the screen, denoisers have to be used to fill in the missing pixels where rays weren't traced. The denoisers smooth out a huge amount of lighting data and blend the sampled rays together, leading to lower-quality, inaccurate RT effects. Until there's a solution that can solve all these problems with RT, it doesn't seem worth using for now.
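The noise described above is ordinary Monte Carlo variance: with only a few rays per pixel, per-pixel lighting estimates scatter widely, and that scatter is what the denoiser has to smooth away. A toy Python sketch (the uniform random samples are a hypothetical stand-in for real light transport, not an actual renderer):

```python
import random
import statistics

def shade_pixel(num_samples, rng):
    # Monte Carlo shading: estimate incoming light as the average of
    # random ray samples. Here each "ray" just returns uniform noise
    # with a true mean of 0.5, standing in for real light transport.
    samples = [rng.uniform(0.0, 1.0) for _ in range(num_samples)]
    return sum(samples) / num_samples

rng = random.Random(42)
# Shade 1000 pixels at 1 ray/pixel vs 64 rays/pixel.
low_spp = [shade_pixel(1, rng) for _ in range(1000)]
high_spp = [shade_pixel(64, rng) for _ in range(1000)]

# Pixel-to-pixel scatter (the visible noise) shrinks roughly as
# 1/sqrt(N), so 64x the rays only cuts the noise by about 8x.
print(statistics.stdev(low_spp), statistics.stdev(high_spp))
```

Since real-time budgets sit near one ray per pixel, the leftover scatter has to be blurred away spatially and temporally, which is exactly where the smoothing and inaccuracy complained about above come from.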
Very true. I know Nvidia's early attempt at an answer is to use AI denoisers, and some papers did look good on paper, per se. The first proper test ground for most people was Ray Reconstruction in DLSS 3.5 (still a poorly named feature that should be renamed to be its own thing to avoid confusion).
@selohcin Ray traced video games rely on a technique called hybrid RT, a mixture of rasterization and RT, which is what makes real-time RT possible. A 1:1 ratio of rays to pixels is impossible right now because of how long it takes to render a fully ray traced frame; we just don't have the GPU power. Offline, it can take hours or days to fully render a ray traced frame, and real-time applications like video games don't have that kind of time.
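To put rough numbers on that budget gap, here's a back-of-the-envelope calculation (one primary ray per 4K pixel at 60 fps; these figures are illustrative, and real path tracing needs many rays and bounces per pixel on top of this):

```python
# Hypothetical minimum ray budget for "1 ray per pixel" real-time RT.
width, height = 3840, 2160   # 4K
rays_per_pixel = 1           # bare minimum; offline renders use hundreds+
fps = 60

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * fps
print(f"{rays_per_frame:,} rays/frame")
print(f"{rays_per_second:,} rays/second")
```

That's roughly half a billion ray-scene intersections per second before a single bounce, which is why games trace far fewer rays and reconstruct the rest.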
As a photographer, I notice the things raytracing does that rasterization can't do, like proper soft shadows (sharp at the point of contact and softer further away), true occlusion (even the best AO can't stop all light leakage), and true indirect lighting. I use raytracing whenever possible.
It's a terrible experience. 90 fps with frame generation means you get fewer than 45 real frames, and only increasing the real frame count reduces latency. Good application latency is about 10 ms or less, which can only be achieved with about 140+ *real* fps. So 90 interpolated fps is awful, garbage, absolutely unplayable; I wouldn't play a first person shooter with such huge latency unless I were getting paid for it, because it's pain and suffering.
@@detrizor7030 Have you actually played it though? Almost every techtuber has said it's barely noticeable... Even AMD fanboys are now acting like frame gen isn't that bad, now that there's the AMD frame gen mod floating around. How about you just go try it...
@@piper67890able Of course I've tried it, both the FSR version and DLSS's native one. Techtubers are just hyping, or they don't actually play games enough, because with FG "140 fps" you have about 45 ms latency, whereas with real 140 fps without FG you have about 20 ms. It's a night and day difference. I can admit FG fake fps is at least playable, meaning I can do something in game with it and not vomit, but the thing is that real 100-140 fps is just more enjoyable, more playable. The graphics difference just isn't worth it.
@@detrizor7030 140 fps is like less than 10ms, fg adds maybe 10-15 ms at most. That's almost negligible and definitely worth the extra motion clarity. 140 fps is really only a necessity for competitive games anyway, not slower paced singleplayer games
@@jayceneal5273 140 fps isn't nearly 10 ms, it's 15-20 ms at best, if we're talking about total system latency. Using FG to get "140" on the FPS counter doesn't give you the latency of 140 fps, it gives you the latency of 65 fps, which is disgusting. And 140 fps latency isn't only necessary for competitive games; that's a nonsensical myth, because for competitive games you need much more fps, at least 300, and the more the better (even if the monitor has fewer Hz, because FPS isn't the number of pictures on the monitor!). I play PUBG on a 138 Hz monitor at 350-500 fps and it is much more enjoyable than at 138 fps. 138 fps is barely enough EXACTLY for single player games.
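The disagreement in this thread mostly reduces to frame-time arithmetic. A small sketch (the numbers are illustrative; real end-to-end latency adds input sampling, render queue, and display delay on top of the raw frame time):

```python
def frame_time_ms(fps):
    # Time between frames at a given rate.
    return 1000.0 / fps

real_fps = 70
displayed_fps = real_fps * 2   # frame generation: "140" on the counter

# Input is only sampled on real frames, so responsiveness tracks the
# real frame time; interpolation typically adds delay (a frame is held
# back to interpolate between) rather than removing it.
print(frame_time_ms(real_fps))        # ms between real frames
print(frame_time_ms(displayed_fps))   # ms between displayed frames
```

So a generated "140 fps" shows a frame every ~7 ms but still responds on a ~14 ms (or worse) cadence, which is why the two sides here measure the same setting so differently.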
Completely agree. 60 FPS has been the standard for the past 20 years, and it still holds today. I don't care what the technology is; if it can't run at least 60 fps, I don't like it.
60 fps the standard? No no no... Maybe in the last 10 years, yes. I remember playing every game at 30-40 fps on mid-to-high-end graphics cards in the early 2000s and everybody was okay with it, and not even at 1080p, just 1024x768 or 1280x960.
@@PeterPauls You may not have been able to run 60 fps, but 60 fps has been the target on PC forever. It went from 30 to 60 since around when the first Rainbow Six came out in 1998.
If we're talking about games in which we rotate the camera with a mouse, then 60 fps is unplayable garbage. ~140 fps (real fps, not interpolated) is the standard; 60 fps is trash for consoles.
I use RT whenever I can because I can't stand light bleed and the inaccuracies of cubemap and screen space reflections. Alan Wake 2 is an absolute stunner. RT is a game changer for graphics and I can't wait to see what games in 2024 have to offer on the RT front. I have a 3080 Ti and was thinking I'd hold off until the 60 series, but I might cave and get a 5090 whenever that happens. Thing is, people playing games where high fps and low latency are key, such as competitive shooters, are never going to want to turn RT on anyway because it's so expensive. RT is for singleplayer games where 60 fps is perfectly fine, and in some cases even 40 with VRR. If you shell out for a 4090, you damn well know what you're paying for. If you don't care for RT, there's no reason to upgrade to a 40 series unless you're still rocking a 770 or something.
@@phrozac I mean that I don't remember any scene in Alan Wake 2 that uses ray tracing in a way that somehow differs from previous algorithms. In my opinion it requires path tracing to make a difference, but ghosting, frame generation etc. mess it up, and it also requires a dual Nvidia 4090 rig. It's not practical either, but unbiased path tracing is extremely valuable for tuning a scene, using the path tracer as a reference. Photographs of course work too.
You hit the nail on the head. I'm running high RT in Hogwarts Legacy with a 4070 Ti and getting 50-60 FPS @1440p. Perfectly playable. I'm using a cheap 4K TV that forces me to use V-sync.
I agree 100%. I bought a 4090 strictly for its raw performance. I didn't really care (and still don't) about how good it was/is at raytracing. Raytracing in its current state is not a major factor for me in games. If I have the performance budget for it at my target resolution/framerate, I'll turn it on in a game, but it *never* takes precedence over other general settings or resolution for me. We're still a couple gens away before I really start giving much of a sh*t about raytracing.
4090 owner here and I agree. All I really care about is crispy 4K. RT and all that other jazz is great when you are STANDING still looking around, but if you are a run and gun type of player or into action games, that stuff just gets lost in the noise and can interfere with your perception due to the likelihood of anomalies/aberrations/ghosting. The more complex something gets, the less reliable it is. Better to have raw performance than all the bells and whistles.
I bought a 4080 for VR horsepower. I've tried a couple of ray traced titles and I really like most of the effects. I haven't tried Cyberpunk, but I own Control and Portal RTX, and they're both amazing with RTX on.
Control has such bad art design that RT effects simply stand out more. I never understood how such a lazy game got so much hype: low-res, recycled textures everywhere, very sloppily designed levels (it basically looks like something I would have thrown together in a map editor as a teenager), and they didn't even bother to lip sync properly. Instead of pushing RT effects they should have spent some money on proper visual artists, writers and voice actors (fuck, that dialogue writing and voice acting was horrible... how on earth could such a game get praise at all?). That is mostly my problem with RT: it is heavily pushed as a gimmick, while much more important parts of a game are handled more and more lazily.
@@OmnianMIU English? Yeah, right... The game I got with a 2080 and played on my 2080 Ti transformed the GPU somehow mid-game into a Radeon ;) (funny enough, I now actually have a 6900 XT because RT turned out to be mostly placebo; I only buy high-end cards, so usually there's no need for DLSS/FSR). Also, DLSS has nothing to do with texture design, assets and animations. Do you even know what you're talking about, or are you just repeating half-understood marketing claims?
@@Xul Yeah haha, you're completely right! And Control is not the only disappointing example. I cannot understand how everyone seems to praise all those games as if they are life changing. Most of the AAA titles released recently are complete shit, and sometimes they also look like shit even though they require you to buy a NASA computer. So many games come out with ray tracing, but they cut a lot of corners to do so. When you disable RT there's not much left, even graphically. I would argue that some recently released games look worse than games published 10 years earlier, because of all the trickery used to enable RT.
I started a new playthrough last night with the Cyberpunk 2.0 patch, with Ray Reconstruction and frame gen on, and I have to say it looks amazing. Yes, there are a few bugs that I believe will be ironed out in future patches, but it's well worth it. Running a 7900X @ 5.49 GHz all cores @ 1.28 V and a 4090 @ 3.3 GHz, 6200 on the mem. Absolutely beautiful experience @ 4K 144 Hz with G-Sync.
And what exactly does look better? I found it just looks slightly different. They say it looks more realistic, but just adding more reflections and lights doesn't magically look more realistic. In real life there aren't that many reflections on reflective surfaces. So in many instances RT looks just as fake as rasterized lighting does. So it's not worth it.
@@Scornfull You are definitely wrong. Do you sometimes go out and take a look at surfaces? RT shadows don't look nearly that way. And RT reflections don't look real either; RT just adds more reflections, even on surfaces which normally wouldn't reflect. And why do we even bother? Video games are escapism. They are not meant to be "real". I have real life the whole day. It's just boring to have the same in video games.
@@BastianRosenmüller Bro, go touch some grass, look at a puddle once or twice in your life 🤡🤡🤡🤡 AMD fanboy, cope more. You clearly have no life outside your room if you're saying this bullshit. Either that or you're blind. Most probably a delusional AMD fanboy though.
I can't lie, the first time I tried HYPR-RX on my 7900 XTX I didn't care for it. Something updated and I turned it back on days later, and I haven't turned it off since. The trick is turning the resolution drop to about 33%; even path tracing at 4K becomes playable with little perceivable difference in quality. That famous AMD fine wine is already kicking in with this 7900 XTX, and I've only had it 2 months.
I've played around 5 hours of 2077 with the new DLSS 3.5 update with Path Tracing on my 4090 and it has been completely fine. Yes there is ghosting, but not as bad as this video makes it out to be. You also have to think about the benefits of playing with all these settings on in the first place! Like are we really going to turn path tracing off because of a little ghosting? LOL. It's good to point out the shortcomings of the technology, but to paint it as not having a use and not being better than traditional rendering is ludicrous.
Yeah, but look at all these people firmly believing that RDR2 looks better than CP77 with the Overdrive mode... Lots of people just haven't been educated on light representation. I would guess that a gamer who has always played at max settings will see the huge difference, whereas somebody who plays a lot of games on low settings will see more difference going to ultra rasterized settings, and RT is kind of outside their understanding scope. Just like with HDR misconceptions: people used to crank color, contrast and backlighting on low-end monitors, and then they suddenly get an HDR OLED TV and are disappointed because they just see elevated shadow areas, desaturated colors etc., when really it's all about granular detail...
I totally agree with you. And honestly I don't get people that buy high-end cards and a good 4K monitor, only to run CP 2077 with DLSS in Performance mode + FG just to get Path Tracing at playable fps, because at the end of the day it's a shooter, and anyone who tells me that anything less than about 80-90 fps with FG on is a smooth experience knows nothing and lies. Latency is just too high. I would say that for FG to really work as intended you should have at least 50 to 60 base fps before enabling FG. The majority of noise and ghosting comes from the fact that people use FG with a very low base fps. Anyway, by doing this they throw so much image quality away. I would heavily mod my game and use a good ReShade any day before enabling something that would halve my fps and make me use fake frames (aka FG) with all the bugs that follow. And by the way, there's a very good mod which fixes the reflections disappearing with SSR.
Dude, a 4080 or 4080 Super will get you 80-90 frames in 4K with PT if you use DLSS Performance mode and FG. Without FG the base FPS is actually right at 52-58 FPS using those settings, so what are you talking about?
I'm old enough to remember when AA was the new thing. Totally unusable for 3-4 generations; the perf cost was wayyy too high, and that was the early 2000s, when "playable" was 30 fps.
Yep and I never used it until it was great as the hardware caught up. Same for RT. I believe RT is the future but I will only switch it on when the hardware is ready. Maybe the 6090 in 2026
@@MarthinusSwart 2026 seems early; I would say wait till the next console gen at the end of 2027. Even then, I doubt it will become the standard in games like rasterization is now.
@@Ash-uf4fv RT will be the standard eventually. To say you doubt that is like someone in 1999 doubting that flat screens would ever be standard. Ray tracing has existed in offline renderers like 3ds Max since the late '80s, so it took us about 20 years to see it in real time on video cards at all. But make no mistake, eventually it will replace raster, unless the whole industry implodes first from wokeness.
This just sounds like ayymd fan cope. If you use proper Nvidia tech with CP2077, it looks amazing and runs well enough. Yes, it's not perfect, but in those cases it's absolutely worth using if you can.
@@christophermullins7163 They can say whatever they want... UE5 has all but invalidated the hype with Lumen and Nanite, and I have a 4090. I got it for its raw power, not RT. Having RT power is great, but people put way more hype into it than it deserves.
RT is the future... but it's not the present. It likely won't become the present until you can get 3080-level RT performance with 16GB of VRAM for $300. And even then, it won't be full path tracing on most titles - it'll be situational implementation that adds polish rather than being designed ground-up around ray tracing.
I would have to agree, RT is only really worth it if it's mixed with rasterization. Fortnite, like you mentioned, is actually one of the only games I enable RT with because it genuinely improves the quality without a ginormous hit to performance. I'm on an RX 6950 XT and at 1440p, I still average around 100 fps with lumen and RT reflections set to high.
It's snake oil, since ray tracing is in the alpha stages. Cyberpunk 2077 has a lot of different ray tracing settings; some effects might be worthwhile, like screen space reflections only.
@thedesk954 Ray tracing was made for lazy, incompetent, low-skill developers that have no artistic skill in implementing lighting and reflections. Rockstar proved it can be done, no rays required. I recently fired up Red Dead Redemption 2 on my 7900 XTX with all settings maxed out and I am always blown away. No game with ray tracing can look as gorgeous and realistic, not even Cyberpunk.
RDR2's reflections are nowhere near comparable to something like Control, which is ray traced. Go cry, Rockstar simp. In fact the game also doesn't have accurate AO or contact shadows. BeSt LiGhTiNg.
Y'all need to play Cyberpunk 2.0 with path tracing. If you can't upgrade, then wait till you can. The difference between raw and path tracing is night and a whole week. It's absurdly glorious looking. 😍 You can experience 2K maxed + path tracing + RR with a 4070. I can run mine at a solid 75 fps with a little help from DLSS Quality + FG (which even makes most images sharper 👌).
@@captain_nemo01 Well, that's weird. I'm getting the same fps on my 4070 without upscalers. Edit: You should really use your DLSS 3, bud. I mean, I hate input lag too, but it's a single player title, and I've gotten used to it already.
This is precisely why I went with a 4090 over the 7900xtx. I knew that RT would continue to be implemented more and more in games, and that it would get more impressive and demanding with time. And Nvidia has a good track record of implementing new technologies and improving them over time that help with performance and visual quality. After watching the Cyberpunk DLSS 3.5 review videos from GN, HUB, and DF, I think the 4090 was the way to go for me. Especially considering the fact that I got mine for $620 off the MSRP ($1799) making it $1179 which is pretty close to the 7900xtx in terms of price.
4090 for that price is a total no-brainer regardless of anything else you said. Even if you don't care about RT the 4090 is still the most powerful consumer card and getting one for the price of a 7900 XTX is nuts. I own a 7900 XTX and would buy a 4090 for $1100 right now if it was offered lol.
@@jakejimenez7048 To be fair, you're correct that it's a no-brainer. However, I had the 6800 XT and really liked that GPU, and I wanted to stay in the AMD family since I really liked their Adrenalin software and appreciated that the 7900 XTX did really well in raster. Plus, I think AMD is a more pro-consumer company overall, considering the openness of their software (FSR, Mantle, etc.). I didn't really want to switch back to Nvidia, but Nvidia is really knocking it out of the park with their DLSS stuff and their RT performance at the moment. Ultimately my decision came down to whether or not I wanted to compromise in some area. I didn't, and that's why I went with the 4090, since it literally is the only card on the market with zero compromises. Every other card, the 4080 and 7900 XTX included, compromises in some way or fashion.
@@jakejimenez7048 Same, I got the 7900 XTX myself, and the only reason I went with it over the 4090 was the insane price jump just to get 4K RT; all the other features I really didn't care for. I don't regret my choice, but if I could get the 4090 with a $620 discount I would have gone with that for sure lol.
Atm we're seeing advancements in ray tracing with Ray Reconstruction, but it's like two steps forward and 1.5 steps back in its ability to show the correct information. Look back at around the AMD Radeon HD 7000 series through the early R9 280 series GPUs: there were generations of GPU hardware and software improvement before tessellation stopped cratering AMD performance in terms of fps, while it was mostly fine for Nvidia. We're at this junction point again with ray tracing, and I feel we're going to need 3-4 generational GPU improvements (like the RTX 4090's ~50% boost over the RTX 3090, per generation) before ray tracing / Ray Reconstruction and all its bells and whistles can deliver decent 75+ fps performance natively at higher resolutions.
Ray Reconstruction makes all objects in the distance look like those AI generated images, and also makes 3D models look flat, especially NPC faces. It's just another Nvidia gimmick.
When path tracing (which is the best implementation of ray tracing) stops being a feature and comes integrated in games is when we will see the most difference. For now we only have a kind of early access to it.
If a developer is going to do bad basic rendering, what makes you think they are going to properly place the light sources so that RT works well at all?
This is a good take on things. RT is certainly amazing under the right circumstances, but in most scenarios you're just sacrificing too much performance for it to be worth it. I've taken to lowering certain settings that are less noticeable in order to have some semblance of performance with RT on (3080 Ti), but that's obviously not ideal. Part of the problem is that game devs seem to be developing games for future hardware, since you can have current-gen top of the line stuff and still be suffering in those games. I personally cannot wait for the time when we can someday have performance AND visual fidelity. For now it seems you just get to pick one or the other.
Bottom line is there is a noticeable difference and if you don't need super high frame rates, like in any single player game, then ray tracing is fine. 45-60 FPS for single player games is fine. For competitive online play anything that robs FPS should be disabled or set to low.
It will take 5 to 6 years to get current games to run in a playable fps range, but in 5-6 years new demanding technologies will appear, it's a vicious cycle.
Path tracing looks amazing like 95% of the time. There are some scenes that need special attention that you will notice as being crap in CP77. Unfortunately I have an AMD card, so unless I want really low fps or fuzzy/ghosty FSR2, RT and especially PT aren't worth the fps hit. We're probably 10 years away from it being more mainstream.
I'm writing this in 2024, and today I decided to go back and play BioShock Infinite for the first time in maybe 6 years or more. This game has better reflections in just the opening sequence of arriving in Columbia than most games being made now with ray tracing technology. There are some games that definitely look great with the tech, obviously Cyberpunk 2077, and I'll also throw in Control, Doom Eternal, Alan Wake 2 and RE4. However, it's not the end all be all if you can't utilize your tools to nail down a style, let alone good gameplay that's engaging.
I'm sick of the Nvidia special 3-4 games that have heavy RT (or Nvidia RT) and being told this is the future. I don't play any of those games; Cyberpunk is good, I play it at ultra at 5K.
I have a 4090 and I've tried ray tracing in so many games, and I really don't notice the difference unless the game was shit and got a ray tracing mod. What I do notice is going from 150 fps to like 40 lol.
What annoys me the most about modern games is that we finally got hardware that could gracefully deliver 4K high-fps gaming with MSAA and astonishing detail, but instead we're getting noisy/dithered yet crazy expensive algorithms with low sample counts, puny internal rendering resolutions and "reconstruction" algorithms on top, which just make the whole image blurry or even cause rendering glitches. RT makes that worse, because it requires even more tricks and cut corners in order to get the effects at a somewhat playable framerate in the first place. Instead of trying to make the most out of the HW resources available, all GPUs essentially got shifted down 2-3 tiers (despite the exorbitant prices) due to bad design decisions and questionable programming. The whole industry is f_cked!
Yeah, totally agree. RT is cool, but not really worth it yet, for sure. I've played around with CP 2077 on my 7900 XTX and compared the differences visually. It certainly looks good, but the fact that path tracing is hardly playable makes it a big no for me, except for the reflections. I have a 4K 144 Hz monitor and play at max settings with quality FSR 2.0 and ray traced reflections, and get an almost smooth 72 fps. I do think ray traced reflections have a big effect while not getting too much in the way of performance, and obviously still look stunning. I guess path tracing certainly helps future proof the game, so that people will come back to visit it for years to come to test out their hardware. lol
First of all, AMD and Nvidia path tracing are NOT THE SAME. Nvidia's path tracing not only performs better, it also looks waaay better. Why am I even bothering with AMD users lol.
There are *a few* games doing RT just fine, e.g. Cyberpunk and Control. Hell, Control was fully playable even for someone like me who has to use the FSR 2.1 mod on a 6900 XT. The modders applied FSR so well in Control that even on a 32:9 1080p display it looked absolutely crisp in Quality mode. Cyberpunk is another one of these games, though I'm not sure you could say that RT makes it a completely different game; you always have to keep in mind that you're actually moving through the game's world instead of constantly lollygagging at reflections.

And then there are games like Chernobylite or The Witcher next-gen *and most other titles in general* where the implementation is lacking and/or the performance cost doesn't justify the image quality. In both cases mentioned before, applying RT leads to a lot of noise. Chernobylite in particular also suffers from a horrific implementation of FSR and TAA, with TAA giving you ghosting paradise and FSR taking even more sharpness out of the picture. At the same time I didn't notice much difference here; actually I like the old SSR version more because it shows more contrast. In The Witcher, performance also tanks if you use RT while not necessarily giving you a nicer picture, and if you couple it with TAA and FSR, the displayed content makes you barf.

I've really found out over the years that I've become a fan of a very sharp yet fluid picture. In some cases this even lets me use good old SSAA instead of SMAA or TAA, as it's available for a lot of titles now, often named Render Scale, and it just (as SSAA does) multiplies the rendered pixels by the adjusted factor. Hence, a 1080p image becomes a 4K image when rendered at 200%, with the consequential performance hit. But after seeing a lot of RT titles on capable and not-so-capable machines, I have to say I choose sharpness over cosmetics.
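The render-scale arithmetic in that last point is easy to check. A tiny sketch (assuming the common convention where the percentage applies per axis, which is how 200% of 1080p lands at 4K; some games scale by pixel count instead):

```python
def render_resolution(base_w, base_h, scale_pct):
    # Render Scale / SSAA: render at scale_pct of the output resolution
    # per axis, then downsample. 200% quadruples the pixel count.
    factor = scale_pct / 100.0
    return int(base_w * factor), int(base_h * factor)

print(render_resolution(1920, 1080, 200))  # (3840, 2160): 1080p -> 4K
print(render_resolution(1920, 1080, 50))   # (960, 540): upscaler territory
```

That quadrupled pixel count is also why SSAA carries roughly a 4x shading cost at 200%, the "consequential performance hit" mentioned above.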
Real talk: GeForce 4080 Super, Cyberpunk 2077 settings maxed, PT on, DLSS Performance mode w/ FG, getting 82-93 FPS @4K. Input lag... nonexistent from what I can tell. It looks freaking amazing. I don't see a problem with RT at the moment.
Full Path tracing runs really well on my 4070ti in cyberpunk in 4k with dlss balanced. Frame gen too obviously. Latency being a little high is slightly noticeable, but it's totally playable and incredible looking. Above 60fps.
Are you serious? Playable fps is 120+, whereas you're getting no more than 30 real fps (generated frames do nothing for latency, and latency is the most important thing in any game with mouse camera rotation). It's an utter nightmare; I can't imagine playing a first person shooter with such enormous latency. Better not to play at all than with such terrible input lag. It's absolutely unplayable, even on a 4090, and I doubt even a 5090 will handle it.
@@detrizor7030 Playable is 60 fps as long as you're happy with the latency, which I find acceptable. It is perfectly playable. 120 FPS only matters for competitive gaming. I do just use an Xbox pad though, so I can imagine it being different with a mouse.
@@moorebags1 "Playable is 60 fps" - you obviously haven't played on a 140+ Hz display at 120+ fps, otherwise you wouldn't say such nonsense. 60 fps is playable only if we're talking about games without mouse camera rotation, or with a minimum of it, for example racing; otherwise it's a nightmare. "120 FPS only matters for competitive gaming" - I can tell you a secret: competitive gaming is good at 240 or even 540 Hz; 120 fps is not enough for it, it's barely enough for single player games. "I do just use an Xbox pad tho, so I can imagine it being different with a mouse" - and here we have the source of your misconceptions, of course. Bro, just try playing any first person shooter on a 144 Hz monitor at 120 FPS with a mouse; after just 5 minutes you'll NEVER think of 60 fps as something good.
@@moorebags1 "60 FPS is fine for most people" - most people are idiots. Seriously. "120 is obviously better, but 60 is acceptable" - again, just try it with a mouse; with a mouse, 60 and 120 fps are incomparable. Gamepad gaming is trash for consoles; real men play only with a mouse. "Chasing 120 is just pissing money away" - 120+ fps is just the decent gaming experience of a sane person who doesn't want to eat shit. And it's not that expensive if you don't pay too much attention to graphics settings, especially ray tracing.
4090 here, I have to agree. The performance hit you take for RT and the other gfx options really isn't worth it. I prefer crispy 4K@120Hz. I play too many first person / fast paced games; those GFX effects are not worth the performance hit. RT and DLSS were applied to games; now that RT and DLSS are out, game devs can optimize their games. Let's hope we see efficiency and optimization from Nvidia and game devs. Also, AMD just released FSR 3.0 and it looks good so far.
I think an alternative to screen space reflections could be a combination of that technology and cubemaps. That should be better than SSR on its own.
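That hybrid is in fact a common technique in shipped renderers: trace the reflection in screen space first, and where the SSR ray leaves the screen or misses, fall back to a prefiltered cubemap. A minimal Python sketch of the per-pixel selection logic (function and parameter names are made up for illustration; in practice this runs in a shader):

```python
def resolve_reflection(ssr_hit, edge_fade, ssr_color, cubemap_color):
    # edge_fade: 1.0 in mid-screen, falling toward 0.0 near screen
    # edges, so SSR blends into the cubemap instead of cutting off hard.
    if not ssr_hit:
        return cubemap_color  # ray left the screen: cubemap fallback
    return tuple(s * edge_fade + c * (1.0 - edge_fade)
                 for s, c in zip(ssr_color, cubemap_color))

# Miss -> pure cubemap; confident hit at mid-screen -> pure SSR.
print(resolve_reflection(False, 1.0, (0.9, 0.1, 0.1), (0.2, 0.2, 0.8)))
print(resolve_reflection(True, 1.0, (0.9, 0.1, 0.1), (0.2, 0.2, 0.8)))
```

The blend is what hides the classic SSR artifact of reflections popping out as the reflected object scrolls off screen.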
I think the amount of time it takes will depend on Nvidia and AMD bringing that level of performance to affordable cards. They could "achieve" it, let's say, by the next-gen 5090, but that's not the core market of gamers. It would need to be seen in 4060-4070 Ti tier cards for it to really be "achieved", but that's just me.
3090s perform around the same as the 4070s, and used 3090s are $500-600. Don't go for a 7800 XT or a 4070, get a last gen used 3090. Get 24 gigs of VRAM and make these companies lower their prices.
As someone who has been playing video games on PC since '95, the only true revolution and benefit brought to us was 3dfx with the Voodoo cards. Nowadays Nvidia is just playing its games and pulling customers into a vicious cycle of entrapment.
🤡🤡🤡🤡 Nvidia doesn't care about gaming, are you people serious? Nvidia is an AI chip maker first and foremost. Nvidia could literally stop making consumer gaming GPUs and wouldn't feel a thing on the stock market.
It's all marketing. The sad part is that so many people fall for it. Sure in a few cases it can be awesome but 95% of the time/games, it's not. Yet here people are buying worse GPUs "because dlss and RT"
2030-2036 before GPUs are 8x more powerful, I think, but by 2030 we'll be wanting 8K textures. I would let RT take a backseat in preference of frame rate and resolution.
And what do you turn on ray tracing in Fortnite for? It's a multiplayer game. It makes no sense to take a massive performance hit in that kind of game.
Only a "graphically challenged" naysayer would disregard RT. 😏 Now the name of the channel makes sense. 😁 I've waited my whole life for real time ray tracing and didn't think I'd see it in my lifetime. I don't play multiplayer games anymore and 30 fps is more than enough for me (remember that most movies are 24 fps!). When you compare the "artifacts" of rasterization to ray/path tracing, the few issues you mentioned for RT pale in comparison.
I've said this before, but in many cases ray tracing actually looks WORSE due to just how noisy ray traced reflections are. Light bouncing off reflective surfaces just looks so awful and distracting. DLSS 3.5 seems to be a big game changer though.
Performance at an affordable price is the real nail in the coffin for me. Maybe in 5 years ray tracing will be affordable, but right now 4K 120fps path tracing is definitely enthusiast-level only.
Honestly, I think PC gaming overall should be considered enthusiast, and consoles should be for casual gamers. In the current situation the Series X and PS5 out-compete most GPUs at the same price. I would get a PS5 over a 4060, imo.
In the meantime, Santa Monica's God of War Ragnarok is the best-looking video game out there, washing anything on PC... and it doesn't have ray tracing...
Personally I think that in CP2077, ray tracing is good for showcasing but really bad to actually play with. It's not about performance but about the many issues it brings. To begin with, RT lighting may be more accurate, but in a game where contrast is so important from a narrative standpoint, RT floods scenes with so much light that they feel flatter, since it removes many of the shadows placed there intentionally by the artists. RT shadows look good in some places, but because they're more accurate they drain away some of the drama: they'll always look more natural, and the forced shadows in this game are sometimes used to tell a tale. RT reflections are amazing for enhancing the verticality, color and detail of Night City, but they're the glitchiest of all the RT effects. You'll find them popping in and out, making characters look like chrome, and since they're not only used for floor reflections but apply to most materials, RT reflections can mess with some textures and make them look like they don't belong there.
I play at 4K with ray tracing on my 4090: Control at ultra settings with DLSS (avg 70fps), Starfield at ultra with FSR turned off (60-80fps), Portal with RTX without DLSS (looks fantastic, 70fps), Cyberpunk 2077 with DLSS set to Quality (avg 70fps)... I'm not a big fan of frame generation due to the input latency spike, so I'll probably not be doing that for the foreseeable future. And yeah, ray tracing is REALLY taxing, even for this generation's top end cards, but from what I've heard of Battlemage, we're looking at a next generation improvement of something close to 1.8x in rasterization and probably more than that in ray tracing... it's a complete overhaul of GPU design that's ray tracing first. And yeah, sadly it's over a year away, but for now I'm honestly quite happy with my 4090, and I've also heard some pretty fantastic things about AMD's next offerings re: ray tracing as well.
Honestly I don't have a system with a 4080 or 4090, but I did rent a GeForce Now 4080 just to check it out; my Internet is great and I was getting an average of 22 ping. I tried path tracing and compared it to the Psycho ray tracing setting... and regular RT looked clearer, with fewer artifacts and less noise. Faces were clearer as well. Path tracing is a good showcase, but if you're going for clear and smooth gaming, just use regular RT.
Another reason is that in the real world, where you're not in a rain-soaked glass city, reflections aren't much of a thing. And shadows, if you're not directly comparing them to their rasterised contemporaries, especially when moving in fast paced games, go largely ignored. That said, I too believe it's a goal worth pursuing. I just wish proper game development wouldn't take a back seat to it. Too much spent on eye candy and not enough on fun.
Ray tracing is something I never cared about when it appeared, and I still don't care today. I don't care about all this lighting stuff. I just want to play and have fun.
@timons777 I'm running triple 28" 4K screens in Star Citizen... ray tracing would slow things down quite a lot; I'm in the 60s as it is these days. When SC has ray tracing AND the game is better optimized... I'll definitely try it.
We will need GPUs with completely new architectures to do path tracing well. My best guess is that it will involve a very clever usage of AI at the hardware level to accelerate RT to a much higher degree. Right now there's a lot of software trickery being done on a game by game basis, which I don't think is a sustainable approach, and it's far from perfect, with ghosting and such. Much of it needs to be done at a hardware/driver level in the long run.
This is the reason why in many games the RT implementation is quite minimal and AMD GPUs can compete with Nvidia ones. The RTX 3000 and 4000 series have way better RT capabilities than the RX 6000/7000, but the RT effects are barely noticeable when activated. So every gamer should buy the best bang-for-the-buck performer in rasterization. Thanks for this piece of information.
Your example is a failure to properly adjust the contrast and color of the image. RT would only change that if there were a shader change that allowed it. It has nothing to do inherently with RT; it's only a side effect of RT being enabled.
Just played Battlefield V and ray tracing made it look like shit. Actually BF V was a step backwards if you ask me, definitely a step back from BF4's fidelity.
Pixel level bounce to extinction ray tracing isn't a setting and probably won't be for 20-40 years (if you need it in real time). But AI fake it path tracing will get good enough in a few generations. We'll use it. It will look good. It doesn't have to be "real", it only has to be better.
Right? Also, I wish they would use all of this R&D to make more realistic animations or reduce pop-in
The thing is that we had PhysX for years and then it just suddenly started disappearing for some reason. We need PhysX back!!
It's not nothing but it's mostly fomo. Idiots need validation to spend more than they should.
You are not alone. In certain scenarios it looks better, in others not much.
I'm seeing that they are looking for things to make us keep upgrading.
The future of GPUs may be ray tracing/path tracing, but I don't want it at the performance hit of today. There has to be a way to make this technology more accessible in the future, maybe something like a breakthrough technology in some kind of ray-tracing chip on next gen GPUs.
Even CDPR said Ray Tracing Overdrive was a Proof of Concept and never intended to be anything else in Cyberpunk.
The biggest problem with RT is noise and artifacting. Because there is no way we have the GPU power to shoot rays for every pixel on the screen, denoisers have to be used to fill in the missing pixels where rays weren't able to reach. The denoisers smooth out a huge amount of lighting data and blend the sampled rays together, leading to lower quality and inaccurate RT effects. Until there is a solution that can solve all these problems with RT, it doesn't seem to be worth using for now.
Very true. I know Nvidia's early answer to that is AI denoisers, and some of the papers did look good on paper, so to speak. The first proper testing ground for most people was Ray Reconstruction in DLSS 3.5 (still a poorly named feature; it should be renamed to be its own thing to avoid confusion).
Is this really true, though? Digital Foundry showed that the Ultra Ray Tracing setting in Metro Exodus shoots one ray per pixel on the screen.
@selohcin Ray Traced video games rely on a technique called hybrid RT, which is a mixture of rasterization and RT, also known as real time RT. It's impossible right now to have a 1:1 ratio of rays to pixels because of how long it takes to render a ray traced frame, and we don't have the GPU power to do that. It can take hours or days to fully render a ray traced frame, and real time applications like video games don't have the time for that.
@@Gamer-q7v You'll have to forgive me if I believe Digital Foundry over a random guy on the internet.
@@selohcin yes, some games even do 2 or 3.
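To make the denoising discussion above concrete, here's a toy sketch (purely illustrative, nothing like a production denoiser such as Ray Reconstruction): pixels that received no ray sample are filled by averaging whatever samples landed nearby, which is exactly where the smoothing and loss of detail complained about above come from.

```python
def denoise(samples, radius=1):
    """Fill and smooth a sparse 1-D signal by averaging known neighbors.
    None marks pixels that received no ray sample."""
    out = []
    for i in range(len(samples)):
        window = [s for s in samples[max(0, i - radius): i + radius + 1]
                  if s is not None]
        # Average whatever rays landed nearby; fine detail gets blended
        # away, which is the quality loss the comments describe.
        out.append(sum(window) / len(window) if window else 0.0)
    return out

# Only some pixels got a ray; a hard 1.0 -> 0.0 edge gets smeared.
noisy = [1.0, None, 0.0, None, 1.0]
print(denoise(noisy))  # [1.0, 0.5, 0.0, 0.5, 1.0]
```

Real denoisers are far smarter (they reuse samples across frames and weight by normals and depth), but the trade-off is the same: fewer rays per pixel means more aggressive blending, and more blending means softer, less accurate lighting.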
As a photographer, I notice the things raytracing does that rasterization can't do, like proper penumbras (shadows sharp at the point of origin and softer further away), true occlusion (even the best AO can't stop all light leakage), and true indirect lighting. I use raytracing whenever possible.
ditto, if you understand light, it's very hard NOT to use RT.
Exactly! When I'm trying to jack off to gaming visuals, RT must be on!
@@inrptn Yes, incorrect light usage makes me hurl. Those ghosting trails behind people and laggy shadows are fine tho, that's just how light works.
meh, I play games not to recreate real life. I don't really care, and it can be done without RT @@inrptn
Finally someone with a brain in here
You get 80-90 fps with dlss quality and frame generation with path tracing on RTX 4090 so yeah. It is an amazing experience
It's a terrible experience. 90 fps with frame generation means you get less than 45 real frames, and only increasing the real frame count reduces latency. Good application latency is about 10 ms or less, and that can be achieved only with about 140+ >real< fps. So 90 interpolated fps is awful, garbage, it's absolutely unplayable; I wouldn't play with such huge latency in a first person shooter unless I was getting paid for it, because it's pain and suffering.
@@detrizor7030 have you actually played it though? Almost every techtuber has said it's barely noticeable... Even AMD fanboys are now acting like frame gen isn't that bad now that there is the AMD frame gen mod floating around. How about you just go try it...
@@piper67890able of course I've tried it, both the FSR version and DLSS's native one. Techtubers are just hyping, or they don't actually play games enough, because with FG "140 fps" you have about 45 ms latency, whereas with real 140 fps without FG you have about 20 ms. It's a night and day difference. I can admit FG's fake fps is at least playable, meaning I can do something in game with it and not vomit, but the thing is that real 100-140 fps is just more enjoyable, more playable. The graphics difference just isn't worth it.
@@detrizor7030 140 fps is like less than 10ms, fg adds maybe 10-15 ms at most. That's almost negligible and definitely worth the extra motion clarity. 140 fps is really only a necessity for competitive games anyway, not slower paced singleplayer games
@@jayceneal5273 140 fps isn't nearly 10 ms, it's 15-20 ms at best if we're talking about total system latency. Using FG to get "140" on the FPS counter doesn't give you the latency of 140 fps, it gives you the latency of ~65 fps, which is disgusting. And 140 fps latency isn't a necessity only for competitive games, that's a nonsensical myth, because for competitive games you need much more fps, at least 300, and the more the better (even if the monitor has fewer Hz, because FPS isn't the number of pictures on the monitor!). I play PUBG on a 138Hz monitor at 350-500 fps and it is much more enjoyable than at 138 fps. 138 fps is barely enough EXACTLY for single player games.
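The frame-time arithmetic being argued over in this thread is easy to check. Note this computes frame time only; real input-to-photon latency is higher (input sampling, render queue, display), and the exact frame generation penalty is contested, so treat the model below as a simplification:

```python
def frame_ms(fps):
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

print(round(frame_ms(60), 1))    # 16.7 ms per frame
print(round(frame_ms(140), 1))   # 7.1 ms per frame

# Frame generation interpolates between rendered frames, so "140 fps"
# built from 70 real frames still samples your input at the 70 fps
# cadence (and holding a frame back to interpolate adds a bit more).
print(round(frame_ms(70), 1))    # 14.3 ms between real frames
```

This is why both sides above can be partly right: generated frames genuinely improve motion smoothness on screen, but they cannot reduce the input-to-frame interval below that of the real rendered frame rate.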
Completely agree. 60 FPS has been the standard for the past 20 years, and it still holds today. I don't care what the technology is: if it can't run at least 60fps, I don't like it.
Ignorant sub human kid AMDumb
60 fps the standard? Nononnooonono... maybe in the last 10 years, yes. I remember playing every game at 30-40 fps on mid-to-high-end graphics cards in the early 2000s, and everybody was okay with it. Not even at 1080p either, just 1024x768 or 1280x960.
@@PeterPauls you may not have been able to run 60fps, but 60fps has been the target on PC forever. It went from 30 to 60 around when Rainbow Six first came out in 1998.
@@elvo6217 Back then I didn't really count frame rate. If it felt smooth enough, it was okay.
If we're talking about games in which we rotate the camera with a mouse, then 60 fps is unplayable garbage. ~140 fps (real fps, not interpolated) is the standard; 60 fps is trash for consoles.
I use RT whenever I can because I can't stand light bleed and the inaccuracies of cubemap and screenspace reflections. Alan Wake 2 is an absolute stunner. RT is a game changer for graphics and I can't wait to see what games in 2024 have to offer on the RT front. I have a 3080 Ti and was thinking I'd hold off until the 60 series but might cave and get a 5090 whenever that happens.
Thing is, people playing games where high fps and low low latency are key, such as with competitive shooters, they're never going to want to turn RT on anyway because it's so expensive. RT is for singleplayer games where 60fps is perfectly fine, and in some cases even 40 with VRR. If you shell out for a 4090, you damn well know what you're paying for. If you don't care for RT, there's no reason to upgrade to a 40 series unless you're still rocking a 770 or something.
Alan Wake 2 doesn't seem to do anything that requires raytracing.
@@gruntaxeman3740 Nothing *requires* RT but AW2 is a premier RT showcase like CP2077 is.
@@phrozac I mean that I don't remember any scene in Alan Wake 2 that uses raytracing in a way that somehow differs from previous algorithms.
In my opinion it requires path tracing to make a real difference, but ghosting, frame generation etc. mess it up, and it also requires a dual Nvidia 4090 rig. It's not practical either, but unbiased path tracing is extremely valuable for tuning a scene, using the path tracer as a reference. Photographs of course work too.
You hit the nail on the head. I'm running high RT in Hogwarts Legacy with a 4070 Ti and getting 50-60 FPS @1440p. Perfectly playable. I'm using a cheap 4K TV that forces me to use V-sync.
Cyberpunk 2077, 4070 Ti, 2560x1440, path tracing + frame generation -> 120 fps. WHAT is the problem?? It looks fantastic!
I agree 100%. I bought a 4090 strictly for its raw performance. I didn't really care (and still don't) about how good it was/is at raytracing. Raytracing in its current state is not a major factor for me in games. If I have the performance budget for it at my target resolution/framerate, I'll turn it on in a game, but it *never* takes precedence over other general settings or resolution for me. We're still a couple gens away before I really start giving much of a sh*t about raytracing.
Fully agree as to the raw power and solid 24GB in the latest gen
4090 owner here and I agree. All I really care about is crispy 4K. RT and all that other jazz is great when you are STANDING still looking around, but if you are a run-and-gun type of player or into action games, that stuff just gets lost in the noise and can interfere with your perception due to the likelihood of anomalies/aberrations/ghosting. The more complex something gets, the less reliable it is. Better to have raw performance than all the bells and whistles.
@@SupraSav - Agreed.
@@SupraSavfake gamer
Nvidia fanboys lol
I bought a 4080 for VR horsepower. I've tried a couple of ray-traced titles and I really like most of the effects. I haven't tried Cyberpunk, but I own Control and Portal RTX, and they are both amazing with RTX on.
Control has such bad art design that RT effects simply stand out more. I never understood how such a lazy game got so much hype: low-res, recycled textures everywhere, very sloppily designed levels (it basically looks like something I would have thrown together in a map editor as a teenager) - they didn't even bother to lip-sync properly.
Instead of pushing RT effects they should have spent some money on proper visual artists, writers and voice actors (that dialogue writing and voice acting was horrible... how on earth could such a game get praise at all?).
That is mostly my problem with RT: it is heavily pushed as a gimmick, while much more important parts of a game are handled more and more lazily.
@@Xul that's because your Radeon GPU doesn't have DLSS
@@OmnianMIUEnglish?
Yeah, right... The game I got with a 2080 and played on my 2080ti transformed the gpu somehow midgame into a Radeon ;) (funny enough, I now actually have a 6900xt because RT turned out to be mostly placebo, I only buy High-end cards, so usually there is no need for DLSS/FSR).
Also, DLSS has nothing to do with texture design, assets and animations. Do you even know what you are talking about or are you just repeating half-understood marketing claims?
@@Xul Ignorant. Do you even know what DLSS is? Just another guy talking about PCs who knows nothing.
@@Xul yeah haha you are completely right! And Control is not the only disappointing example. I cannot understand how everyone seems to praise all those games as if they are life changing. Most of the AAA titles released recently are complete shit, and sometimes they also look like shit even though they require you to buy a NASA computer. So many games come out with ray tracing but cut a lot of corners to do so; when you disable RT there is not much left, even graphically. I would argue that some games released recently look worse than games published 10 years before, because of all the trickery used to enable RT.
So far the only games that i truly notice a big visual difference with RT on are cyberpunk and control
I started a new playthrough last night with the Cyberpunk 2.0 patch with Ray Reconstruction and frame gen on, and I have to say it looks amazing. Yes, there are a few bugs that I believe will be ironed out in future patches, but it's well worth it. Running a 7900X @ 5.49 all cores @ 1.28v and a 4090 @ 3.3, 6200 on the mem. Absolutely beautiful experience @ 4K 144Hz with G-Sync.
And what exactly looks better? I found it just looking slightly different. They say it looks more realistic, but just adding more reflections and lights doesn't magically look more realistic. In real life there are not that many reflections on reflective surfaces. So in many instances RT looks just as fake as rasterized lighting does. So it's not worth it.
@@BastianRosenmüller Wrong, lighting and shadows make a huge difference in realism
@@Scornfull You are definitely wrong. Do you sometimes go out and take a look at surfaces? RT shadows don't look nearly like that. And RT reflections don't look real either; RT just adds more reflections, even on surfaces which normally wouldn't reflect. And why do we even bother? Video games are escapism. They are not meant to be "real". I have real life the whole day. It's just boring to have the same thing in video games.
@@BastianRosenmüller I use my eyeballs for one
@@BastianRosenmüllerbro go touch some grass,look at a puddle once or twice in your life 🤡🤡🤡🤡 amd fanboy cope more. U clearly have no life outside your room if u are saying this bullshit. Either that or u are blind. Most probably a delusional amd fanboy tho
I can't lie, the first time I tried HYPR-RX on my 7900 XTX I didn't care for it. Something updated and I turned it back on days later, and I haven't turned it off since. The trick is setting the resolution drop to about 33% or less; even path tracing at 4K becomes playable with little perceivable difference in quality. That famous AMD fine wine is already kicking in with this 7900 XTX, and I've only had it 2 months.
Me too. It seems performance is getting better with each update. Updates been coming in very steady.
@@halrichard1969 facts 💯
I've played around 5 hours of 2077 with the new DLSS 3.5 update with Path Tracing on my 4090 and it has been completely fine. Yes there is ghosting, but not as bad as this video makes it out to be. You also have to think about the benefits of playing with all these settings on in the first place! Like are we really going to turn path tracing off because of a little ghosting? LOL. It's good to point out the shortcomings of the technology, but to paint it as not having a use and not being better than traditional rendering is ludicrous.
Yeah, but look at all these people firmly believing that RDR2 looks better than CP77 with the Overdrive mode... Lots of people just haven't been educated on light representation. I would guess that a gamer who has always played at max settings will see the huge difference, whereas somebody who plays a lot of games on low settings will see more difference going to ultra rasterized settings, and RT is kinda outside their frame of reference. Just like with HDR misconceptions: people used to crank color, contrast and backlighting on low end monitors, and suddenly they get an HDR OLED TV and are disappointed because they just see elevated shadow areas, desaturated colors etc., when really it is all about granular detail...
I totally agree with you. And honestly I don't get people who buy high end cards and a good 4K monitor and then run CP2077 with DLSS in Performance mode + FG just to get path tracing at playable fps. At the end of the day it's a shooter, and anyone who tells me that anything less than about 80-90 fps with FG on is a smooth experience knows nothing and is lying; latency is just too high. I would say that for FG to really work as intended you should have at least 50 to 60 base fps before enabling it. The majority of the noise and ghosting comes from people using FG with very low base fps. Anyway, by doing this they throw so much image quality away. I'll heavily mod my game and use a good ReShade any day before enabling something that would halve my fps and make me use fake frames (aka FG) with all the bugs that follow. And by the way, there is a very good mod which fixes the reflections disappearing with SSR.
dude, a 4080 or 4080 Super will get you 80-90 frames at 4K with PT if you use DLSS Performance mode and FG. Without FG the base FPS is actually right at 52-58 FPS using those settings, so what are you talking about?
I'm old enough to remember when AA was the new thing. Totally unusable for 3-4 generations--perf cost wayyy too high, and that was early 2000s when "playable" was 30 fps
Yep and I never used it until it was great as the hardware caught up.
Same for RT. I believe RT is the future but I will only switch it on when the hardware is ready. Maybe the 6090 in 2026
@@MarthinusSwart 2026 seems early I would say till the next console gen at the end of 2027. Even then, I doubt it will become the standard in games like rasterization is now.
@@Ash-uf4fv RT will be the standard eventually... to say you doubt that is like someone in 1999 doubting that flat screens would ever be standard. RT has existed since 3ds Max was first invented in 1989, so it took us about 20 years to see it in real time on video cards at all. But make no mistake, eventually it will replace raster, unless the whole industry implodes first.
@@DenverStarkey Of course it will come eventually. I just don't think its going to be as soon as 2026. But we will see.
This just sounds like ayymd fan cope. If you use proper Nvidia tech with CP2077, it looks amazing and runs well enough. Yes, it's not perfect, but in those cases it's absolutely worth using if you can.
Jedi Survivor without Ray tracing runs at 90+fps on a 4090 at native 4K so easily. Kills the performance turning it on
Nvidia needs to bring back SLI for path tracing. So simple: just buy another 4090.
I can honestly say that I'm just largely sick of non ray traced 3d. I'll happily drop to 30 fps for images that don't look like garbage.
RT isn't dumb...but it's definitely way over hyped.
Not for those who want to validate a $2000 GPU purchase.. most would agree
@@christophermullins7163 They can say whatever they want... UE5 has all but invalidated the hype with Lumen and Nanite... and I have a 4090. I got it for its raw power, not RT. Having RT power is great, but people put way more hype into it than it deserves.
It's just ahead of its time currently
RT is the future... but it's not the present. It likely won't become the present until you can get 3080-level RT performance with 16GB of VRAM for $300. And even then, it won't be full path tracing on most titles - it'll be situational implementation that adds polish rather than being designed ground-up around ray tracing.
Get a nice fast OLED monitor like the Asus pg27aqdm instead. Makes everything looks great.
I would have to agree, RT is only really worth it if it's mixed with rasterization. Fortnite, like you mentioned, is actually one of the only games I enable RT with because it genuinely improves the quality without a ginormous hit to performance. I'm on an RX 6950 XT and at 1440p, I still average around 100 fps with lumen and RT reflections set to high.
Ray Tracing is fake news, Red Dead Redemption 2 has the best lighting and no Rays to trace.
It's snake oil, since raytracing is still in the alpha stages. Cyberpunk 2077 has a lot of different raytracing settings; some effects might be worth it, like the reflections only.
@thedesk954 Ray tracing was made for lazy, low skill developers with no artistic skill in the implementation of lighting and reflections. Rockstar proved it can be done, no rays required. I recently fired up Red Dead Redemption 2 on my 7900 XTX with all settings maxed out, and I am always blown away. No game with ray tracing can look as gorgeous and realistic, not even Cyberpunk.
RDR 2 reflections are nowhere near comparable to something like control which is ray traced.
Go cry rockstar simp. In fact the game also doesn't have accurate AO or contact shadows. BeSt LiGhTiNg.
Assassins creed origins has amazing lighting
@yomohdz7764 Metro Exodus Enhanced looks better. That is an indisputable fact.
Y'all need to play cb2.0 with path tracing.
If you can't upgrade then wait till you can.
The difference between raster and path tracing is night and a whole week. It's absurdly glorious looking. 😍
You can experience 2k maxed+path tracing+RR with a 4070. I can run mine at solid 75fps with a little help of DLSSQ+FG(which even makes most images even sharper👌).
It does, but it's damn near a tech demo vs other games. It's not the norm yet.
I get 30 fps @ 1440p with everything maxed out with no Dlss, on my 4090.
@@captain_nemo01well that's weird. I'm getting the same fps on my 4070 without upscalers.
Edit: You should really use your DLSS 3, bud. I mean, I hate input lag too, but it's a single player title, and I've gotten used to it already.
@@captain_nemo01 lol, people are nuts. If an AMD user said what you said, they'd say he's lying.
@@captain_nemo01 that's crazy... I'm getting more than that... what's your CPU?
This is precisely why I went with a 4090 over the 7900xtx. I knew that RT would continue to be implemented more and more in games, and that it would get more impressive and demanding with time. And Nvidia has a good track record of implementing new technologies and improving them over time that help with performance and visual quality. After watching the Cyberpunk DLSS 3.5 review videos from GN, HUB, and DF, I think the 4090 was the way to go for me. Especially considering the fact that I got mine for $620 off the MSRP ($1799) making it $1179 which is pretty close to the 7900xtx in terms of price.
4090 for that price is a total no-brainer regardless of anything else you said. Even if you don't care about RT the 4090 is still the most powerful consumer card and getting one for the price of a 7900 XTX is nuts. I own a 7900 XTX and would buy a 4090 for $1100 right now if it was offered lol.
@@jakejimenez7048 To be fair, you're correct that it's a no brainer. However, I had the 6800 XT and really liked that GPU, and I wanted to stay in the AMD family since I really liked their Adrenalin software and appreciated that the 7900 XTX did really well in raster. Plus, I think AMD is a more pro-consumer company overall, considering the openness of their software (FSR, Mantle, etc.). I didn't really want to switch back to Nvidia, but Nvidia is really knocking it out of the park with their DLSS stuff and their RT performance at the moment. Ultimately my decision came down to whether or not I wanted to compromise in some area. I didn't, and that's why I went with the 4090, since it literally is the only card on the market with zero compromises. Every other card, the 4080 and 7900 XTX included, has compromises in some way or fashion. The 4090 is alone in having no compromises to speak of.
@@jakejimenez7048 Same, i got the 7900 xtx myself and the only reason i went with the 7900 xtx over the 4090 was the insane price jump just to get 4K RT, all the other features i really didnt care for, I do not regret my choice but if i could get the 4090 with 620$ discount i would have gone with that for sure lol
where did you get that or how do i get it at that price?
The XTX was terrible at release. They had the gall to put "8K capable" on the box. I returned mine after 3 weeks and ended up getting a 4090. No regrets.
Atm, we’re seeing advancements in Ray Tracing - Ray Reconstruction, but it’s like 2-steps forward and 1.5 steps back with its ability to show the correct information.
Look back at the AMD Radeon HD 7000 series through the early R9 280 series GPUs: it took generations of GPU hardware and software improvement before tessellation stopped cratering AMD performance in terms of fps, while it was mostly fine on Nvidia. We're at that junction point again with ray tracing, and I feel we're going to need 3-4 generational GPU improvements (like the RTX 4090's ~50% boost over the RTX 3090, each generation) before ray tracing / ray reconstruction and all its bells and whistles can deliver decent 75+ fps performance natively at higher resolutions.
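The compounding arithmetic behind that guess can be sketched quickly (a hypothetical back-of-the-envelope, using the commenter's assumed ~50% per-generation uplift, not a measured figure):

```python
# Back-of-the-envelope: if each GPU generation brings roughly a 50% uplift
# (the comment's assumption), how many generations until performance is
# some multiple of today's?
def generations_needed(target_multiplier, per_gen_gain=1.5):
    gens = 0
    perf = 1.0
    while perf < target_multiplier:
        perf *= per_gen_gain
        gens += 1
    return gens

# 1.5^3 = 3.375x, 1.5^4 ~= 5.06x, so a ~5x uplift takes about 4 generations
print(generations_needed(5.0))  # -> 4
```

That lines up with the "3-4 generations" estimate above, assuming the per-generation gain actually holds, which history suggests it often doesn't.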
Ray reconstruction makes distant objects look like AI-generated images, and also makes 3D models look flat, especially NPC faces. It's just another Nvidia gimmick.
When path tracing (which is the best implementation of ray tracing) stops being a feature and comes integrated into games is when we will see the most difference. For now we only have a kind of early access to it.
If a developer is going to do bad basic rendering, what makes you think they are going to properly place the light sources so that RT works well at all?
Ray tracing is cool, but it's sort of a trap to get you to buy the highest-end Nvidia GPU. Gameplay is what really matters, not graphics.
I think it's also sus how DF and HWUB never picked this up.
Well Alex loves PC stuff
I honestly don't care if screen space reflections disappear when looking around. I agree with your points. RT is OR (overrated).
I just enjoy the game in full raster on my 6900xt.
1:26, when the space is oiled up.
This is a good take on things. RT is certainly amazing under the right circumstances, but in most scenarios you're just sacrificing too much performance for it to be worth it. I've taken to lowering certain settings that are less noticeable in order to have some semblance of performance with RT on (3080 Ti), but that's obviously not ideal. Part of the problem is that game devs seem to be developing games with future hardware in mind, as you can have current-gen top-of-the-line stuff and still be suffering in those games.
I personally cannot wait for the time when we can someday have performance AND visual fidelity. For now it seems you just get to pick one or the other
Bottom line is there is a noticeable difference and if you don't need super high frame rates, like in any single player game, then ray tracing is fine. 45-60 FPS for single player games is fine. For competitive online play anything that robs FPS should be disabled or set to low.
It will take 5 to 6 years for current games to run in a playable fps range, but in 5-6 years new demanding technologies will appear; it's a vicious cycle.
Path tracing looks amazing like 95% of the time. There are some scenes that need special attention, which you will notice as being crap in CP77.
Unfortunately I have an AMD card, so unless I want really low fps or fuzzy/ghosty FSR2, RT and especially PT aren't worth the fps hit.
We're probably 10 years away from it being more mainstream.
I'm writing this in 2024, and today I decided to go back and play BioShock Infinite for the first time in maybe 6 years or more. This game has better reflections in just the opening sequence of arriving in Columbia than most games being made now with ray tracing technology. There are some games that definitely look great with the tech, obviously Cyberpunk 2077, and I'll also throw in Control, Doom Eternal, Alan Wake 2, and RE4. However, it's not the end-all be-all if you can't utilize your tools to nail down a style, let alone good gameplay that's engaging.
I'm sick of the Nvidia special 3-4 games with heavy RT (or Nvidia RT) being held up as the future. I don't play any of those games; Cyberpunk is good, and I play it at ultra at 5K.
I have a 4090 and I've tried ray tracing in so many games, and I really don't notice the difference unless the game was shit and got a ray tracing mod. What I do notice is going from 150 fps to like 40 lol
What annoys me the most about modern games is that we finally got hardware that could gracefully deliver 4K high-fps gaming with MSAA and astonishing details, but instead we're getting noisy/dithered, yet crazy expensive algorithms with low sample counts, puny internal rendering resolutions, and "reconstruction" algorithms on top, which just make the whole image blurry or even cause rendering glitches. RT makes that worse, because it requires even more tricks and cut corners in order to get the effects at a somewhat playable framerate in the first place.
Instead of trying to make the most out of the HW resources available, all GPUs essentially got shifted down 2-3 tiers (despite the exorbitant prices) due to bad design decisions and questionable programming. The whole industry is f_cked!
Yeah, totally agree. RT is cool, but not really worth it yet. I've played around with CP 2077 on my 7900 XTX and compared the differences between the three modes visually. It certainly looks good, but the fact that path tracing is hardly playable makes it a big no for me, except the reflections. I have a 4K 144 Hz monitor and play at max settings with quality FSR 2.0 and ray traced reflections, and get an almost smooth 72 fps. I do think ray traced reflections have a big effect while not costing too much performance, and they obviously still look stunning. I guess path tracing helps to future-proof the game so that people will come back to it for years to test out their hardware. lol
What's the point of running 4K max when you ruin the image with dogshit tech like FSR and DLSS
First of all, AMD and Nvidia path tracing are NOT THE SAME. Nvidia's path tracing not only performs better, it also looks way better. Why am I even bothering with AMD users lol
There are * a few* games doing RT just fine, e.g. Cyberpunk and Control. Hell, Control was also fully playable, even for someone like me who has to use the FSR 2.1 mod on a 6900XT. The modders applied FSR so well in Control that even on a 32:9 1080p display it looked absolutely crisp in Quality mode. Cyberpunk is another one of these games, though I am not sure if you could say that RT makes it a completely different game now - you always have to keep in mind that you are actually moving through the game's world instead of constant lollygagging at reflections.
And then there are games like Chernobylite or The Witcher NG *and most other titles in general* where the implementation lacks and/or the performance cost doesn't justify the image quality.
In both cases mentioned before applying RT leads to a lot of noise, Chernobylite in particular also suffers from the horrific implementation of FSR and TAA, with TAA giving you ghosting paradise and FSR taking the sharpness out of the picture even more. At the same time I didn't notice much difference here, actually I like the old SSR version more because it shows more contrast. In the witcher, performance also tanks if you use RT and at the same time not giving you a necessarily nicer picture. If you couple it with TAA and FSR, the displayed content makes you barf.
I've really found over the years that I've become a fan of a very sharp yet fluid picture. In some cases this even leads me to use good old SSAA instead of SMAA or TAA, as it's available for a lot of titles now, often named Render Scale, and it just (as SSAA does) multiplies the rendered pixels by the adjusted factor. Hence, a 1080p image becomes a 4K image when rendered at 200%, with the consequential performance hit. But after seeing a lot of RT titles on capable and not-so-capable machines, I have to say I choose sharpness over cosmetics.
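The render-scale arithmetic above can be shown in a few lines (a sketch assuming the convention that the percentage scales each axis, which matches the commenter's 1080p-to-4K example; some games apply the percentage to the pixel count instead):

```python
# Render scale as per-axis scaling: 200% doubles width and height,
# so a 1080p framebuffer is internally rendered at 4K (4x the pixels).
def internal_resolution(width, height, render_scale_percent):
    s = render_scale_percent / 100.0
    return int(width * s), int(height * s)

print(internal_resolution(1920, 1080, 200))  # -> (3840, 2160), i.e. 4K
```

Which is why 200% render scale costs roughly as much as rendering native 4K, matching the "consequential performance hit" the comment mentions.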
real talk :
Geforce 4080 super
Cyberpunk 2077
settings maxed,
PT on, DLSS performance mode,
w/ FG: getting 82-93 FPS @ 4K.
input lag... nonexistent from what I can tell. It looks freaking amazing.
I don't see a problem at the moment with RT.
I have a 4090 and I use RTX ON at all times. OP is dumb.
Enjoy having worse performance than me in my 4070 laptop because "ur rays are traced 😱" lmao
Full Path tracing runs really well on my 4070ti in cyberpunk in 4k with dlss balanced. Frame gen too obviously. Latency being a little high is slightly noticeable, but it's totally playable and incredible looking. Above 60fps.
Are you serious? Playable fps is 120+, whereas you're getting no more than 30 real fps (generated frames do nothing for latency, and latency is the most important thing in any game with mouse camera rotation). It's an utter nightmare; I can't imagine playing a first-person shooter with such enormous latency. Better not to play at all than with such terrible input lag. It's absolutely unplayable, even on a 4090, and I doubt even a 5090 will handle it.
@@detrizor7030 Playable is 60 fps as long as you're happy with the latency, which I find acceptable. It is perfectly playable. 120 FPS only matters for competitive gaming. I do just use an Xbox pad though, so I can imagine it being different with a mouse.
@moorebags1 "playable is 60fps" - you obviously haven't played on a 140+ Hz display at 120+ fps, otherwise you wouldn't say such nonsense. 60 fps is playable only in games without mouse camera rotation, or with a minimal amount of it (racing, for example); otherwise it's a nightmare.
"120 FPS only matters for competitive gaming" - I can tell you a secret: competitive gaming is good at 240 or even 540 Hz. 120 fps is not enough for it; it's barely enough for single-player games.
"I do just use an Xbox pad tho, so I can imagine it being different with a mouse" - and here we have the source of your misconceptions, of course. Bro, just try playing any first-person shooter on a 144 Hz monitor at 120 fps with a mouse; after just 5 minutes you'll NEVER think of 60 fps as something good.
60 FPS is fine for most people. I have a 120 fps TV for gaming. 120 is obviously better, but 60 is acceptable. Chasing 120 is just pissing money away.
@@moorebags1 "60 FPS is fine for most people" - most people are idiots. Seriously.
"120 is obviously better, but 60 is acceptable" - again, just try it with a mouse; with a mouse, 60 and 120 fps are not comparable. Gamepad gaming is console trash; real men play only with a mouse.
"Chasing 120 is just pissing money away" - 120+ fps is just decent gaming experience of a sane person, who doesn't want to eat shit. And it's not that expensive, if you're not paying too much attention to graphics settings, especially raytracing.
One solution is to play with RT at 1080p on a plasma TV, where 1080p 60 actually looks great.
4090 here, I have to agree. The performance hit you take for RT and the other graphics options really isn't worth it. I prefer crispy 4K @ 120 Hz. I play too many first-person/fast-paced games, and those effects are not worth the performance hit. RT and DLSS have been applied to games; now that they're out, game devs can optimize around them. Let's hope we see efficiency and optimization from Nvidia and game devs. Also, AMD just released FSR 3.0 and it looks good so far.
How would you comment on the fact that people are trying to "skip" the rendering step so that everything will be "generated"?
I think that an alternative to Screen Space Reflections could be a combination of this technology and cubemaps. That should be better than SSR on its own
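A minimal sketch of that idea, written as plain Python purely for illustration (a real version would live in a shader, and the function and parameter names here are hypothetical): trace the screen-space ray as usual, and fall back to a prebaked cubemap sample when the ray misses or leaves the screen, cross-fading near the screen edges to hide SSR pop-in.

```python
# Hypothetical SSR-with-cubemap-fallback blend, as sketched in the comment
# above. Colors are (r, g, b) tuples in 0..1.
def reflection_color(ssr_hit, ssr_color, cubemap_sample, edge_fade):
    """Blend a screen-space reflection with a cubemap fallback.

    ssr_hit        -- True if the screen-space ray march found on-screen geometry
    ssr_color      -- color from the SSR ray march (ignored on a miss)
    cubemap_sample -- color from the prebaked environment cubemap
    edge_fade      -- 0..1, fades SSR out near screen edges to hide pop-in
    """
    if not ssr_hit:
        # Ray left the screen or hit nothing: use the cubemap entirely.
        return cubemap_sample
    # Cross-fade near screen edges so reflections don't pop when the
    # reflected geometry scrolls off screen.
    return tuple(s * edge_fade + c * (1.0 - edge_fade)
                 for s, c in zip(ssr_color, cubemap_sample))

# Miss -> pure cubemap; confident on-screen hit -> pure SSR
print(reflection_color(False, None, (0.2, 0.3, 0.4), 0.0))  # -> (0.2, 0.3, 0.4)
print(reflection_color(True, (1.0, 0.0, 0.0), (0.2, 0.3, 0.4), 1.0))
```

This is roughly what several engines already do with reflection probes as an SSR fallback; the cubemap hides SSR's worst artifact (reflections vanishing at screen edges) at the cost of the cubemap being static and only correct from its capture point.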
A 2060 is a great card for RT, no need for anything higher. 1 fps is enough to see the RT in a screenshot.
Give me a break I definitely see the difference while playing cyberpunk 2077 with Ray tracing on versus off
I think the amount of time it takes will depend on Nvidia and AMD bringing that level of performance to affordable cards. They could "achieve" it, let's say, with the next-gen 5090, but that is not the core market of gamers. It would need to be seen in the 4060-4070 Ti tier of cards for it to really be "achieved". But that's just me.
When a 60-class GPU can run RT in visual-fidelity games at the currently popular resolutions at more than 60 fps, then it's acceptable.
Now 1440p is the new 1080p. We need full RT at 60 fps and partial reflection-only RT at 144 fps.
3090s perform around the same as 4070s. 3090s are $500-600 used. Don't go for a 7800 XT or a 4070; get a last-gen used 3090. Get 24 GB of VRAM and make these companies lower their prices.
This is why FF7 Rebirth is the best-looking third-party PlayStation 5 game. No RT nonsense.
As someone who has been playing video games on PC since '95, the only true revolution and benefit to us was 3dfx with the Voodoo cards. Nowadays Nvidia is just playing their games and taking customers into their vicious cycle of entrapment.
🤡🤡🤡🤡 Nvidia doesn't care about gaming, are you people serious? Nvidia is an AI chip maker first and foremost. Nvidia could literally stop making consumer gaming GPUs and wouldn't feel a thing on the stock market.
My guy spelt it " definately" and is giving people advice. Lemme help you out bucko. The word is d-e-f-i-n-i-t-e-l-y
It's all marketing. The sad part is that so many people fall for it. Sure in a few cases it can be awesome but 95% of the time/games, it's not. Yet here people are buying worse GPUs "because dlss and RT"
I like ray tracing, but my problem with it is that not many games have it, so I can't use it.
2030-2036 before GPUs are 8x more powerful, I think, but by 2030 we'll be wanting 8K textures. I would let RT take a back seat in favor of frame rate and resolution.
And what do you turn on ray tracing in Fortnite for? WTF, this is a multiplayer game. It makes no sense to take a massive performance hit in that kind of game.
It will take an RTX 6090 to make ray tracing as cheap as turning antialiasing up to 8x.
Let's wait for the RTX 6090. Sounds interesting, but I'd need to find a diamond to afford an RTX 6090, probably $3000.
The only thing that could do real-time, full-scene, perfect ray tracing would be an exascale supercomputer similar to El Capitan.
Hold on, I just checked out modded Oblivion and it had mirror reflections in the water. They must be gimping the non-ray-traced settings.
Only a "graphically challenged" naysayer would disregard RT. 😏
Now the name of the channel makes sense. 😁
I've waited my whole life for real time ray tracing and didn't think I'd see it in my life time. I don't play multiplayer games anymore and 30 fps is more than enough when I play (remember that most movies are 24 fps!).
When you compare the "artifacts" of rasterization to ray/path tracing, a few issues you mentioned for RT pale in comparison.
For me it's very simple. Fps > fancy shadows always. And I have 4090. Ray tracing is just not worth it
I've said this before, but in many cases ray tracing actually looks WORSE due to just how noisy ray traced reflections are. Light bouncing off reflective surfaces just looks so awful and distracting. DLSS 3.5 seems to be a big game changer though.
Performance at an affordable price is the real nail in the coffin for me. Maybe in 5 years ray tracing will be affordable, but for now 4K 120 fps path tracing is definitely an enthusiast-level thing.
Honestly, I think PC gaming overall should be considered enthusiast. Consoles should be for casual gamers. In the current situation the XSX and PS5 outcompete most GPUs at the same price. I would get a PS5 over a 4060, imo.
In the meantime, Santa Monica's God of War Ragnarok is the best-looking video game out there, washing anything on PC... and it doesn't have ray tracing...
Personally, I think that in CP 2077 ray tracing is good for showcasing but really bad to actually play with.
It's not about the performance but about the many issues it brings.
To begin with, RT lighting may be more accurate, but in this game, where contrast is so important from a narrative standpoint, RT lighting floods scenes with light, making them feel flatter, since it removes many of the intentional shadows placed there by the artists.
If you use RT shadows, they'll look good in some places, but because they're more accurate, a lot of the drama is taken away: they'll always look more natural, while the forced shadows in this game are sometimes used to tell a tale.
RT reflections are amazing in this game for enhancing the verticality, color, and detail of Night City, but they're the glitchiest of all the RT effects. You'll find them popping in and out, making characters look like chrome, and since they're not only used for floor reflections but apply to most materials, RT reflections can mess with some textures and make them look like they don't belong there.
And then try path tracing
I play with 4K ray tracing on with my 4090: Control at ultra settings with DLSS (avg 70 fps), Starfield with ray tracing at ultra with FSR turned off (60-80 fps), Portal with RTX without DLSS (looks fantastic, 70 fps), Cyberpunk 2077 with DLSS set to quality (avg 70 fps)... I'm not a big fan of frame generation due to the input-latency spike, so I'll probably not be doing that for the foreseeable future. And yeah, ray tracing is REALLY taxing, even for this generation's top-end cards, but from what I've heard of Battlemage, we're looking at a next-generation improvement of something close to 1.8x in rasterization and probably more than that in ray tracing... it's a complete overhaul of GPU design that's ray tracing first. And yeah, sadly it's over a year away, but for now I'm honestly quite happy with my 4090, and I've also heard some pretty fantastic things about AMD's next offerings re: ray tracing as well.
I have a 4090 and I don't use RT... maybe I'll use it with the 5090. 4K120 native with no DLSS is Pharaoh.
I say, stay away from graphics cards for 2-3 years and come back to see whether what you get for the price is still ridiculous.
Honestly, I don't have a system with a 4080 or 4090, but I did rent the GeForce Now 4080 tier just to check it out. My internet is great, and I was getting an average of 22 ping. I tried path tracing and compared it to the psycho ray tracing setting... regular RT looked clearer and there were fewer artifacts and less noise. Faces were clearer as well. Path tracing is a good showcase, but if you're going for clear and smooth gaming, just use regular RT.
I run with all fake frame generators off and raytracing off. Pure rasterization. That is the only benchmark I look at.
Another reason is that in the real world, where you're not in a rain-soaked glass city, reflections aren't much of a thing. And shadows, if you're not directly comparing them to their rasterized contemporaries, go largely ignored, especially when moving in fast-paced games.
That said, I too believe it's a goal worth pursuing. I just wish it wouldn't take a back seat to proper game development. Too much spent on eye candy and not enough on fun.
About frame gen: I would prefer to use BFI with VRR enabled. It's a shame that's impossible right now.
I have a feeling that ray tracing will have the same fate as GPU-driven PhysX. PhysX was in fact a much better gimmick than ray tracing.
Ray tracing is something I never cared about when it appeared.
And I still don't care today.
I don't care about all this lighting stuff.
I just want to play and have fun.
I've had my 4090 about 10 months. Not even bothered to try ray tracing.
Sad.
But, I hope you enjoy very, very high frames!
@timons777 I'm running triple 28" 4K screens in Star Citizen...
Ray tracing would slow things down quite a lot.
In the 60s these days.
When SC has ray tracing AND the game is better optimized... I'll definitely try ray tracing.
@@VonSpud I see a wise man right here! ;)
I really love PT/RTX in Cyberpunk 2077. Very beautiful!
We will need GPUs with completely new architectures to do path tracing well. My best guess is that it will involve very clever usage of AI at the hardware level to accelerate RT to a much higher degree. Right now there's a lot of software trickery being done on a game-by-game basis that I don't think is a sustainable approach, and it's far from perfect, with ghosting and such. Much of it needs to be done at a hardware/driver level in the long run.
This is the reason why the RT implementation in many games is quite minimal and AMD GPUs can compete with Nvidia ones. The 3000 and 4000 series have way better RT capabilities than the RX 6000/7000, but the RT effects are barely noticeable when activated. So every gamer should buy the best bang-for-the-buck performer in rasterization. Thanks for this piece of information.
bro what are you on not immersive?
Your example is a failure to properly adjust the contrast and color of the image. RT would only change that if there were a shader change that allowed it. It has nothing to do inherently with RT; it's only a side effect of RT being enabled.
Because of how it has been programmed.
It's a scam to satisfy the need to always buy the newest video card.
Is it just me, or does he look like he has *three* mustaches on his face?
Turn on RR+FG and you will see the 'stache clearer ;)
Just played Battlefield V, and ray tracing made it look like shit. Actually, BF V was a step backwards if you ask me, definitely a step back from the fidelity of BF4.
Don't game in 4k on a 4090. Let's talk about it!
Game at 1440p on a 240 Hz OLED.
I got a 4070 and it's still not worth it. Most of the time I don't even notice, but I will notice 160+ fps.
The only game I tried ray tracing in with my 3080 was Quake II RTX.
I just want a 4090 so I can mod Skyrim even further... beyond!
Tbh ray tracing maxed out has always been stupid. However, some light ray traced implementations like in RE4 and plague tale are amazing.
The Plague Tale shadow difference was barely noticeable, I found.
Pixel-level, bounce-to-extinction ray tracing isn't a setting, and probably won't be for 20-40 years (if you need it in real time). But AI-faked path tracing will get good enough in a few generations. We'll use it. It will look good. It doesn't have to be "real"; it only has to be better.