Soooo… the charts 😅 I usually use Google Sheets, which is handy because I use several different computers as well as my phone for noting down data. Unfortunately, it couldn’t handle this one, so I found myself using Libre Office for the first time. I kinda had to learn the program on the fly, and I dare say that I could have done a better job with more practice. I’m not sure if I’m going to use Libre again, but if I do, I’ll definitely work on making the charts more readable!
@@walter1824 Yes, I'm a one man band. This video took a while, it was always intended to come out in September for the 6th anniversary of Turing, but I started writing the script at the end of March. The actual video took a little over a week: benchmarking took about three days (including some related tests for another future video), making the charts took a couple more days, recording the script took 2 days because I lost my voice halfway through, and I edited everything together yesterday.
When it launched, the meme was that the 2080ti could do 4K raster, but only 1080p ray tracing. The 2060... 1080p raster, 480p ray tracing. Time has not been kind to either.
idk, i have a laptop with an rtx 2070 max-q, it should perform worse than a desktop 2060, i think, but it doesn't feel dated. it can still run anything with very little compromise. i don't think time was unkind to it at all.
@@GraveUypo Obviously thay can run games, they were objectively bad though. The 1080TI trades with the 2080ti in raster, and launched at a 30% lower msrp multiple years earlier. The 20 series are pretty the worst value cards nvidia has ever offered, in terms of fps per dollar. I mean i have a friend still running a 1060 and it's fine for what he needs. That doesnt necessarily mean it's a good or bad value card.
@@bobbylasagna 2080tI is at least 30% faster than 1080Ti. and when games start being build more with DX12 feature the gap is widen even more. and you can't really compare about price either when things getting more and more expensive as time goes by. just look at current 5nm. next year it is expected 5nm price will be as expensive as 3nm two years ago because TSMC keep increasing the price for 5nm instead of making it more cheaper as time goes by.
Horribly made, 70 screws. Glue everywhere. Terrible fans. The blower cards from Nvidia before were far better And the 30 series design and after is the gold standard for a card with good airflow
As a long-time 2080ti user, I've been gaming at 1440p since 2020, and, if you don't use raytracing, it is still a great experience. Even newer games you can usually pull a 1440p60fps high preset experience. Love this card. Definitely starting to feel its age a bit, but its still very capable. Asus ROG strix version, myself.
Yes, for ray tracing you need lots of vram, sometimes surprisingly a lot more than you'd think you'd need. The worst part of this whole rtx thing besides it being so hard to run is how nvidia has been consistently skimping on vram while acting like everything is fine. Then you have hordes of nvidia fanboys making fun of you for insisting on more vram while they pretend everything is fine. That peaked with the 4070 ti. "12gb vram is fine bro." No, it's not.
Like the 3070 ti suffering from massive stuttering and performance drops in some games, especially with RT enabled. But as usual the nvidiots just overlook and ignore it.
@@Son37Lumiere I'm an idiot for buying a gpu that meets my needs? I have a 2080ti, 3090 (got it for retail price when they were well over $2000), and a 4090. All great gpus. I have a RTX 3080 laptop which does everything well at 1080p even after 2years and being a thin and lite model. I have a old 970 which lasted a long time. I use it for testing. I have a 1070 and 2070 that lasted as long as expected and still can game at 1080 well. Whereas AMD dumped support for which GPUs last year again? Everything but the 6000 series and up? Lol. AMD fanboys always resort to name calling. Especially when a lot of them didn't even buy the cards that they claim are so great until they went on the used market. Ciao.
@@louisfriend9323 true but I'd also consider the 4090 still just a demo ''pro'' card when you have to dump resolution to 1080p to get playable rates in the most recently released UE5 games an they stutter to all hell. I've yet to see a visual improvement from RT that justifies TANKING resolution and texture quality on my 75' 4k TV. Bragging about who's failing less at offering an unjustified visual ''feature'' is far from wining. 🤣
@@pliat Never said it was a 4090 issue. It is However an issue in UE5 games which is made even worse when RT is on. And I don't need your opinion on what res you use while running RT. there is plenty of evidence and benchmarks on sites such as tom's hardware to show that 2024 titles using the the FULL suite of of RT effects are performing at sub 60fps levels at anything much more than 1080, or 1440 with upscaling. you may be of the opinion that 1080p up scaled + frame Gen to 4k at 30-60fps and 20+ ms of latency is fine. but most are not.
I just gave away my rtx 2060 to someone who was using integrated graphics. (I have 6 other rtx cards that are faster than the 2060) They think the card is amazing for their 1080p game playing. Perspective always matters.
@@Son37Lumiere eh... doubt? a 960 cannot beat something like a Radeon 780M in high-end AMD CPUs, or the new Intel Arc iGPU found in Intel Meteor Lake (and Lunar Lake) CPUs.
@theexorcist982 lol wtf,damn bro you sound offended,You his boyfriend or something Or just having a bad day buddy? Don't worry everything will be alright
90% of the time I think Ray Tracing is nice but not really necessary for a game. I recently got a "WOW!" moment in Cyberpunk, I was walking around a corner, I thought I had taken out all the guards when I saw the reflection of a guard above me on a ledge in a puddle! (I'm using a 4090)
Cyberpunk Pathtracing is the only time in play that significantly makes a difference. Most of the time without a side by side or pixel peeping, you really won't see a meaningful difference.
An important detail about Turing's RT implementation is that the RT hardware acceleration couldn't run concurrently with the rest of the GPU pipeline -- very similar of how RDNA serializes its RT implementation. That's why the 6900XT is able to match even or top out the 2080Ti despite having much less dedicated hardware RT support, by simply brute forcing itself with more FLOPs capacity (76% advantage). This deficiency of Turing was the first thing that Nvidia mended in Ampere together with doubling of the FP32 throughput.
Partially true. Turing can do concurrent operations with RT and shading, it's just pretty bad at it. When it comes to RDNA2 GPUs beating Turing GPUs with RT, it's mostly just because of the rasterized performance advantage. RDNA2 GPUs fall off much more in more complex ray traced scenes, especially path traced ones (Not that turing is that great with path tracing either). I've compared the rx 6700xt and the rtx 2070 super against each other in rt situations (They both have an equal number of rt cores/accelerators and shading units). generally the rx 6700xt was faster when it comes to simple rt effects in games, like doom eternal or spiderman remastered. In other games where the rt tasks are more complex, like control, the rx 6700xt started to struggle more than the rtx card did. In path traced titles like quake II rtx, the rtx 2070 super destroyed the 6700xt by over 30%.
@@cosmicnebula3023 Most games that feature RT, especially the ones with RTX sponsorship, are optimized specifically for nvidia with little to no optimization for AMD.
to anyone potentially saying the performance isn't that bad, please remember that the majority of Turing owners didn't have a Titan RTX or 2080 Ti any numbers you see here will be cut in half if not more by using a 2060. a 6GB buffer is an absolute no go on many games in 2024 withOUT RT enabled. if you think it's shocking to see a 6900XT to struggle in Ratchet and Clank 1440p with ray tracing, imagine how bad the rasterization performance is for a card with just over a third the VRAM
Yeah low end RTX cards were a scam when marketed for the RT, basically just like current gen RTX cards. It's just not worth the hit. "Let's play Cyberpunk fully path traced on my new RTX 4060 for the best experience" said no-one ever. It's good for screenshot mode on those cards - that's it. Now with AI hyperscalers buying all the waver supply, low end will continue to eat shit as NVIDIA will ramp up the shrinkflation. The leaked "5080" is clearly a 70 type card when you look at the core count and memory downgrade over the 5090 leaks. I bet NVIDIA won't even made a 5060 this gen (or they will, but it'll be their xx50 silicon relabeled as xx60, or just repackaged 4000 dies). AMD doesn't even try to compete at higher end this year and will just release a refresh with better RT and then just roughly price match equivalent, overpriced $600 NVIDIA products that ought to cost $300, so buyers are screwed either way. Intel is too busy sinking and even if Xe2 is 2x faster, you bet your favorite new game will have awful frame timings and there is a good change that line will no longer get game optimized drivers 6 years after release.
@@Wobbothe3rdI call BS. Im rocking a GTX 1660. Most are still rocking their 1060s and other gpus thanks to the stinky price gouging between nvidia and amd
GUYS, I did da math from steam survey. I just count every GPU that can't do RT of any kind, excluding AMD radeon graphics (cause it can be either RDNA2 based igpu, like in my amd 7700x or steam deck gpu, and it can be old vega ones, that for some reason did not identified themselves, cause Vega 11 or Vega do exist in the chart), also I didn't count "others" group that include most unidentified or less than 0.15% GPUs. So, number is... 26.9% (nice). If we include unidentified gpus that will be: 36.14%. The rest of the chart (63.86%) supports RT! Maybe not very well, like 6600xt or base RTX2060, but point stands.
Raytracing is a neat feature, but it hasn't wowed me as much as the performance impact when turning it on. Going from 120fps to 45 even on an RTX 4090 isn't a good look. We'll eventually get there, but it might take another 6 years. The next feature I want to see implemented is Gaussian Splatting in video games. That'd be neat.
It's not just a "feature" it's an entire rendering paradigm that enables entirely new gameplwy types. RT works perfectly well right now on rtx20 cards, ypu just can't expect esports framerates or high native resolutions. For DECADES Gamers lived 60hz and CRT resolutions, so clearly those things aren't NECESSARY for good image quality or smoothness.
@@Wobbothe3rd CRT *monitors* were going WELL in excess of 60Hz, WITH better latency and motion clarity than flat panels have produced, even now. People were actively PISSED when CRT died, in exchange for tech that was in its infancy and therefore objectively worse in basically every way, so try again.
What game drops to 45fps with RT? I run Cyberpunk Ultra Performance Path Tracing 20 at 1440p DLSS quality and FSR frame gen on a 3080, whilst getting between 80-110 fps depending on the scene. The image is not only better it's a completely transformative experience!
@@leviathan5207 Your 3080 is rendering at less than 20 fps at 1440p path tracing. DLSS quality would bring that up to around 35-40 fps, which is why you're using frame gen, or fake frames to gain the rest.
i've got an rtx 2080 laptop, which i dont use anyway because my main rig is more powerful, but you know what, it still runs games at high settings to this day, at +60fps pretty much across the board! :) (at 1080p)
I also have a 2080 laptop. It does very well at 1080p or 1440p. My laptop has a max of 105 watts. That's what limits it unfortunately. It's a thin and lite.
Mr. Iceberg, the charts you used to show the performance of the cards are a bit... confusing. I suggest that you make a clear distinction between the two graphs by having some lines at the top and the bottom of the graphs and then leaving some space between the blocks. The video's intro sequence was very good though Mr. Iceberg!
Yes this is not an easy watch on mobile or a small screen. Can’t read the small text next to the graphs and I’ve already got sight issues due to a retina injury I got as a kid. On desktop it’s okay.
A little hack for yall who use a first gen RTX card, install the DLSS to FSR3 frame gen on SINGLEPLAYER GAMES (IMPORTANT BECAUSE MULTIPLAYER GAMES WILL GET YOU BANNED). Since DLSS frame gen is locked behind 40 series cards, this mod has helped me a lot with Cyberpunk Ray Tracing. I may have an RTX 3070 Ti but adding the DLSS to FSR3 frame gen mod to singleplayer games, i got more than comfortable performance on games like Cyberpunk and Alan Wake 2 maxed out. Just remember that frame gen will add latency to your games so the closer to 60 FPS you can get the game without, the lower latency. 40 to 50 FPS is still playable, just try to counter the latency by predicting moves (which aint too hard).
@@empedance1933Yeah but their implementation doesn’t currently work for some reason. The fps counter will go up higher but the image isn’t actually any smoother.
Still have my 8700-2080 rig that got me into PC gaming for real. I had dabbled here and there over the years, but never took the plunge. In late 2018 I found a clearance deal on an HP Omen tower at Best Buy w/ the 8700 and a 2080. It was a blower style card, stayed fairly warm, and definitely loud. But after some tweaking, that little system served me well for about 5yrs. I didn't need to replace it honestly, but I found another deal on a Legion Tower (13700K-4070Ti) and took the plunge on that. It's obviously more performant, but I still use that Omen from time to time. And as long as you stay at 1440p or below with reasonable settings, it will play anything. After following the 10 series, and the price increase, I can see why the 20 series got hate at the time. But for things like prebuilts, it made more sense.
I was using a 2070 Super until a few weeks ago and it could handle RT just fine in a lot of games at 1440p with DLSS (Balanced is what I'd typically use). Not maxed out obviously, and there were some games where turning it on at all was out of the question (Alan Wake 2), but I played through Metro Exodus EE a few months ago just fine (which is to say 60fps+ at all times) and it could handle one or two RT effects in Cyberpunk 2077 with some optimized settings (I chose reflections, since they're the most noticeable). Honestly, I was quite happy still using it until a killer deal on a 4070 Ti Super came along.
@@CaptainKenway damn, the 2070 super is pretty on it then, i had the base 2070 and it wasnt quite up to scratch so i got the 2080ti, just to be safe, this was only about a year ago tho
I've been on a 3070ti and I swear DLSS just feels like a free upgrade on the PC itself, if DLSS is available I always run it at Quality, I appreciate the 20% performance boost and the upscaling in what I think is near most situations makes the game look even better
It's the only AA that doesn't give me eyestrain *and* it comes with a framerate uplift. As I'm on a 1080p monitor, any less than Quality can look pretty dire, but Quality often looks really great if the implementation isn't bad. I've got a handful of games I won't use it on. Warframe's implementation was not great last I checked, it did not handle stars well at all which is not ideal in a space game. But I get great performance in that anyway and am not bothered by the lack of AA, but for the rest it's so good. No Man's Sky has a decent implementation which doesn't lose stars, so I know it can be done.
Dlss looks better than the typically used TAA in games so that’s why I enable it, the fps boost is nice but as long as I’m over 60 I am happy, which is usually closer to 144 on my 4070ti.
12:11 8800GTX's MSRP was 600USD at launch (910USD in current dollars), 8800 Ultra was 830USD at launch (1260USD in current dollars). So, its not right that gpus are a lot more expensive than before
@@lucasrem Uh yes it was worse than a 2080, because it was Nvidia's low end product not sure what you mean. Also it did not come out years later, it came out 5 months after the 2080 as it was part of the same generation of GPU's.
@@lucasremthe gtx 16 series was released in 2019 as an affordable option that could run pretty much every game on the market at the time. The 1650 was garbage but 1660 super and Ti were great if you couldn’t quite afford the 2060.
Holy shit, I love the... intro screen? (not sure what it's called) banger that one. Your thumbnails are so recognisable, I put the video on as soon as I see them. Your branding is fucking immaculate, so recognisable, so clean. It's a thing I don't feel like a lot of RUclipsrs get, sure bright contrasting colours and big red arrows may get clicks, but a well branded thumbnail guarantees my view.
@@skipskip342 Nah, 30 series is more than enough to be different. 2070 series equal is around 3060 series. And 3090 is way ahead of the Titan RTX. You're confusing Nvidia with AMD. RX 7000 is just a rebadged, of RX 6000 series, and RX 6000 is just a modified RX 5000, with RT cores. Plus, AMD is going tier-to-tier in naming, not $$$. RX 7900 series comepte against the RTX 4090. RX 7800 series comepte against the RTX 4080. RX 7700 series comepte against the RTX 4070. RX 7600 series comepte against the RTX 4060.
Great testing, thx for your work.It's nice to know that the owner of the founder version of 2080 ti. Also, don't forget about the first card that could process beams without using hardware blocks. So this is Volta, it came out earlier than 2080 ti and was stronger in computing blocks than 2080 ti, but in games with beams it lost in everything, but it was the last card with the largest chip for the consumer segment, and together with the HBM chips it looked like a single whole. Not long ago, Der8auer did a review of it
Turns out ray-tracing been and path-tracing has been a concept in movies production since the 1970s just hadn't been powerful hardware to run it till recent GPUs that pack pure muscle (computing)
@@kevinerbs2778 I'd argue otherwise. There are many games where you can enable RT on modest GPUs without the need for upscaling. Yes path tracing is a different story, but games using hybrid rendering typically can be run without needing a 4090.
You are right guys, rt in games is just glossing for the sake of it. Because in movies, creating a set scene with rt and PT still takes months to complete with top of the line GPUs of today.
@@LeeericosGPUs are often not used for final renders, ONLY because they lack memory. But yes, NVIDIA GPUs ARE universally used in any situation where you aren't required to use tons of memory (like look dev). Nvidia's GPUs since Turing have had a profound impact on lookdev. No longer have to run the renderer on multiple large-memory large-cpu-count machines over the network, just to see if you corrected that problem in the shaders of a character. Now you can get a nicely-converged image SUPER quickly because you have THE fastest ray tracing processors (Nvidia) right in your workstation, laptop, etc.
Titan RTX and 2080TI had virtual link ports, RT cores, more raw compute and Cuda improvements over the 10 series and were better for VR than anything else on the market. I built and deployed workstations used for heavy renders and tasks and to this day those 20 series cards are still kicking. I have upgraded some machines to 3 and 4 series, but it's based on needs and speeds. RTX as a feature was and still is eye candy, and as hardware keeps getting better it's still being adopted. The 20 series cards have aged well for how far games have come. In time game engines have added options, and I love to see it when I do.
Screen space occlusion artefacts are impossible to ignore. SSAO is fugly and SSR surfaces having their reflections suddenly disappear when you look slightly down is the most distracting and shitty looking thing ever.
@@wile123456 Sofware Lumen is ray-tracing, just not hardware accelerated. Easier to run (bearly) but lacks any sort of precision in the fine grain, you can easily see errors in lighting on smaller objekts.
My rtx3060 ray traces on 1440p moderately okay. Not crazy high fps but it’s stable. Considering the 3060 is only $200-300 I’m surprised more people haven’t bought it. It’s now been passed up for the rtx4060 which is slightly better and slightly cheaper and a gen newer. For pc gaming being an expensive hobby a lot of people seem to spend more money on new games than new parts.
Introduction of ray tracing and dlss made more harm than good. Games are unoptimised as hell. When games put dlss on in recommend settings just to get stable 60 fps something is clearly wrong. Dlss was meant to help older hardware keep up with new releases. But games are so unoptimised now even 40 series cards can't have stable fps.
DLSS was about selling new cards, and making ray tracing possible, and also the fact that Nvidia had no competition at the time, so they put money into it's development
I had the 2080ti and it was realy nice to use raytracing in games that supported it. Yeah it costs performance but what many people still don't realise is that in the future when a game will only use raytracing instead of the old screenspace reflections, light etc. is the insane amount of work it saves developers. Yeah old reflections, light etc. looks good if done correctly but needs a lot of work to do right. With raytracing they can in theory just build their levels place light sources and tell the engine what material is used and the engine does the rest + it looks way more realistic. It will still need a long time until we reach that point but many seem to have forgotten that in the past a 1080p monitor was considered "overkill" and used way to many resources. Now we have enough performance for 4k and have the same debate about that resolution.
2 месяца назад+3
I only have a 4090 as it’s for work, and the bonus is amazing gaming performance. It is cool being able to see sort of future graphics, 4K and path traced, but at the end of the day the thing that matters is the game quality, and graphics are just polish. Only a few games really nail that and the majority don’t even have RT.
You are wrong. Alan wake 2, Cyberpunk, Control and Metro Exodus enhanced are transformed by enabling rt. DLSS makes it possible and it's 100% worth it in those games. Other games dont even run mist RT effects but even if it's only reflections it's always worth turning on, even in a game like Persona 3 Reload the RT effects enhance the presentation by a great deal and if you think otherwise, you must be trolling or have never experienced good RT.
@@leviathan5207 RT in some games are really cool, but until it is available at a mid-range gpus (I would argue real mid-range of 250-300$, not current bs of 400+$ "mid-range") it is not mature. If tech is unplayable on mid-range it means that it is non existent for majority of players. Like we know how to do RT in general for decades. Question is about real-time RT with performance fit for games. In 2 years 4060 likely will be most common gpu and it is unable to run even current RT games with any decent performance.
I think the main thing here is that the upper-end first gen RTX cards still ray trace better than current gen consoles, so they are secure in being able to run better graphics than console in newly released games until at least the next gen.
6 years on and most people still don't care about it. The bump in quality is disproportional to the money required for such an experience, and you have to drop your rendering resolution to get a decent result. I think Nvidia know their gamble didn't go as well as they expected, which is why they're apparently focusing on ai these days.
Where do you get "most" from? The people I know all got a 30 series card SPECIFICALLY for RT in games like Control, Cyberpunk and Alan Wake 2... I feel like you are projecting a lot.
@@leviathan5207 Not really, I know more people who got 30-series cards as an upgrade for rasterization than people who are actually interested in ray-tracing.
The reason why people buy expensive carbs is to get the best visual experience possible. That's the whole point for most people that spend more money. Not fps. The FPS just comes with the expensive cards.
I am currently using an RTX 2080 Ti, but I've never played an RTX enabled game, except for the upscaling feature. I needed to buy a new computer when the 2080 Ti was the current GPU for high-end gaming, but I also sent NVIDIA an email at the time, informing them that I was not impressed with their RTX release, as it cost a ton of money and would soon become obsolete. I'm now waiting to upgrade to an RTX 5090 system where (hopefully) I'll be able to play RTX titles in 4K rez, as well as using a high-end VR headset, without having to worry about stutter or low frame rates.
I can't believe you made a video talking about the first generation of RT capable NVidia GPUs, WITHOUT testing Portal with RTX. Tbh, the results would never have been great. I possess a cheap unknown-branded RTX 2070 -which my brother gave me last summer, upgraded from XFX Radeon RX 580 8G- on about the same config as your "MPG PC" test bench, and as soon as I got my hands on it, I played it. The results : 4-5 FPS average max settings no DLSS @1920x1080, minimum >1FPS, and to breach the 40 FPS mark I had DLSS on Ultra Performance (still high settings). And I was curious about how much frames I could output, which is a maximum of 140 FPS @640x480 with Ultra Performance DLSS enabled. That was a fun experiment on my side, too bad I haven't seen any mention of Portal with RTX. Also 20 Series Card FE are my favourite looking card out of all NVidia FEs. This video was still very nice to watch I always give it a like!
Well it's a path tracing game not just satndarf RT , so it is very very demanding , on my 3070 i had to use balanced on 1080p to get a decent framerate even though i generally play on 1440p
ye i got this rtx2060 laptopoff ebay over a year ago and puttin this thing to its limits has been interesting as my main unit. (few games' rtx modes work well on here actually well, save for Chorus.)
The only useful thing that came out of the RT Cores was DLSS. To this day, I still have never run RTX other than switching it on once to “let me see if it actually makes a big difference in visuals”. Then I turn it off because it invariably is not worth the performance hit. Lmao
Same here! I bought a 2070 Super to replace my GTX970. RT was never going to be worthwhile (not sure it is now, TBH) but DLSS is great. I do need an upgrade to match my now 7800X3D CPU (plus existing 2K ultrawide monitor) but the stupid costs are putting me off.
So many people in the comments arguing that RT is pointless it makes me kinda worried. Like, are you blind? Do you need glasses? Just look at a 6 year old game like Control with and without RT effects and tell me it's not a vast improvement! And that game only has basic RT effects, unlike Cyberpunk which can be played fully path traced. I swear to god, anyone who says RT is not worth enabling is just trying to cope with the fact that they are too poor to upgrade from their GTX 760... There is no other logical explenation. RT is probably the biggest boost to fidelity since SVOGI.
The main argument that I see is the performance drop, which is a real concern for those on weaker RTX cards (like the very popular RTX 3060 12GB). Those that say RTX on vs off isn't a big difference are likely still thinking of pre 2020 titles or those with minimal implementations. I myself play with RTX if I can manage above 75fps with at most DLSS quality (anything more aggressive and it's just not that great). I use an RTX 2080ti, though I do have an RTX 3080 on the way.
Truthfully allot of games with RT on do not look that much different, in a blind test you would not necessarily know if RT is on, or off without reflections. It is getting better the last year or two since it is becoming more standard in games. Also control looks good without RT, that's also part of the problem is that games still look good without RT and without an FPS hit. I think people want it but not if it's at a bad frame rate. A 4090 can drop well below 60 fps. To your point though, yes people that cannot run it decent enough will downplay it just because.
The demands of raytracing far exceed current hardware. We're having to use software resolution scaling to make up for it. Edit: Retracting my statement, the 2000 series works great for new DLSS. My bad!
Turing is supported by all DLSS features except for frame gen, which you can still get with FSR 3.1 frame gen. Neither FSR frame gen nor DLSS frame gen are recommended to enable when you can't reach a base fps of at least 60 due to artifacting and latency concerns. DLSS frame gen is also very VRAM hungry, which - as you saw in the video - not even the 2080 Ti has enough to spare.
Wtf are you talking about!? DLSS works fully on RTX20 series cards, up to Ray Reconstruction which was released in 2023(!), the only thing that's limited to rtx40 is frame generation. The 2000 series STILL ARE good at RT and pathtracing!
@Vorexia DLSS3 Frame Generation works perfectly well going from 40fps, Nvidia has never said otherwise. Even a 30fps base framerate works for slower games like Flight simulator. Stop equating DLSS FG with AMDs inferior solution, they aren't equally as good!
In many of the new games you have to use upscaling to get good frame rates on most GPUs. It's just a thing. And with most games coming up using the Unreal engine get used to it.
I remember buying the 2060 as soon as it released and was really happy with how well it worked with the highest ray tracing setting sand without everything else maxed out on Call of Duty: Modern Warfare (2019). Also, it did surprisingly well in Cyberpunk too when it first released, even though it did impact it quite harshly into the 40's and somtimes mid 30's. Now I have the RTX 4080 and I can effortlessly run Path Tracing of all things (yes with DLSS+FG enabled). Shocked how far RTX has progressed over just a few short years.
What a mammoth video! Really enjoyed this. I'm rocking a 3080 so I'm excited for the 3080 video soon. Also just wanted to note that Sometimes I struggled to take in all the data on the screen when there are so many charts on the screen for a short time. I'm not sure if there's a way you could make them bigger or show fewer of them at once though.
STOP DOING RAYTRACING - LIGH RAYS WERE NOT SUPPOSED TO BE SIMULATED - YEARS OF GAMING yet NO REAL-WORLD USE FOUND for bouncing light more than ONCE - Wanted to go higher anyway for a laugh? We had a tool for that: It was called "BAKED LIGHTING" - "Yes please give me SLIGHTY better visuals. Give me those slightly better visuals at a cost of HALF my frame rate" - Statement dreamed up by the utterly Deranged Big Graphics Card have been demanding your money for all this time, with all the decades worth of beautiful non-RT games in your backlog. "Hello I would like to run my games poorly please" They played us for absolute fools!
If you think RT is "slightly better visuals" then you are either lying or blind. Sure, Not all games have amazing RT implementations like CP77 but leave that for the future. Cyberpunk looks like a different game with Path tracing on. Tech is still in its infancy and it manages to LOOK better than decades upon decades of solutions in Raster. Give it some time. It had to start sometime. Better 6 years ago than today.
baked lighting doesn't work for sandbox games, and in games where your framerate was high anyways it's not much of a sacrifice: for example, Minecraft.unless i'm playing competitive, 80-90fps is good enough, so why shouldn't I run RT?
@@Son37Lumiere Im Pathtracing on a 4060 ti 16gb on 1080p with 80 avg FPS. game feels smooth. Only complaint is some ghosting caused by DLSS but game looks phenominal otherwise.
@@TripleSSSz That's what a 4070 gets with dlss performance. DLSS performance looks terrible too, you're rendering at 480p from 1080p, effectively negating the benefit of RT.
From what ive seen in a lgr video, RT existed way back in the mid to late 90s. Only difference is that what we have now is Real Time Ray tracing. Basicly RT RT
Yes indeed! I used to use programs like POVRay, Bryce3D etc. on my old Pentium 1 back in the 90s. I used to read a book while it rendered a single frame.
@@IcebergTech tech has come a long long way. Honestly its amazing how in the 90s it took around 30mins for a single frame and now it takes less than 1 second for a whole ass map in a game to render with ray tracing
@@flamingophone Well, that’s what this video is about, RT in games, not in general, and it’s a fact it didn’t exist until years ago for games. I saw that video earlier this year.
I haven’t fully moved over to AV1, so I’m still editing mainly H265 files, and the 7900XT does seem to scrub through them a bit better in Resolve than the 6900XT did. It’s hard to compare render times as the 6900 couldn’t output AV1, but this 32 min video exported in about 16 minutes, which isn’t bad. And the title sequence was just three frames I made in photoshop, nothing new 😁
No pc gamers just destroyed any part of the enthuaist pc that was left with their crying and whining about price. You're not an enthauist just because you play pc games. Thats not what enthauist do, or about. Theyr'e about pushing hardware and things to the limits.
I bought a 2080Ti when it was released, but not because of the ray tracing hype. I bought it to go to 4K in a game that had no ray tracing and still has no ray tracing. It served me well, but currently is OOC. Still trying to decide if I want to look into having it fixed.
My first Nvidia card was the RTX 3070 I bought on release for $500 for the FE card. Using ray tracing was a hit or miss with that card at 1440p. Somes games ran decent while others was taxing on performance. Now I own a RTX 4090 I play with everything maxed out and it brings the most realistic looking picture to any game. But of course it came with a $1750 price tag to enjoy it. When deciding to upgrade from my 3070 the performance and cost didn't make sense for the $800 12gb 4070ti or $1200 16gh 4080. The best option was $1600 for the 4090. Bought the MSI liquid cooled version due to its smaller 2 slot GPU form factor to fit my case. Worth every penny. Gave my 3070 to my best friend for her first PC build and she loves it. Plays some games with RT on and enjoys it.
RayTracing... AKA Hairworks, 10 years later. I remember RUclipsrs trying to fake emotions while watching mud puddles in Battlefield: "Whoa! Can you guys see this? The murky water is reflecting every ray of light from the game!! This...is...game...changing" ...
the first rt games were just reflections even those halved fps and it was a meme.What NVIDIA promised was fully pathtraced games which is still 7-8 years away
Hairworks is not RT. Both have their uses. I personally love RT. I've seen shadows go across a hallway and save my butt in game where the regular shadows wouldn't have cast into the hallways. It's good tech.
@@ThisisDD that's just some shit nvidia puts out, to push amd gpu's down. Nobody is using that crap. You can use it if you like "cinematic" framerates, and THAT'S FINE WITH ME.
The problem is, it doesn't even look better. Compare a properly crafted raster game with the RT slop that is being made now, and the difference without pixel peeping is simply not there. And I'm saying it's slop, because you have things like the dogshit shadows in AW2, and reflections and lighting that *could* exist in games like control, but they simply don't... All to give you that false "wow" effect RT and upscaling is partially killing gaming, and it's disgusting
Yep this is such a joke , imagine buying a video card just for the sake of "realistic shadows" - in a fucking video game - a digital product that cant be further from reality anyways.:D This is how you tell that you are a nerdy looser without actually telling that you are a looser and sitting in your Pc Room all day.
Amazing video! I have the RTX 3080 10GB and it's actually does pretty well in RT in many games that I've tried at 1440P. The thing is, if I turn it off, I typically can get significantly higher FPS just using high or optimized settings. That said, if it's a slower game, and if I can enable RT and get around 60 fps, even with some DLSS, I tend to use it.
RT hasn't shown anything of note for the 6 years it has been mainstream. It will continue to be a niche technology and eventually be baked into games like how tessellation is now with most games only using it sparingly. It really is sad to see something so hyped for so long not do anything to move the needle in terms of graphical fidelity.
Cope. Path tracing is the future, thats why AMD are desperately trying to catch up, and why Intel made it a focus on their very first try. You're just lying to yourself at this point.
I personally feel like that the technology just isn't there yet, also there just isn't enough affordable hardware powerful enough for it to make a big difference IMO. Its time will yet come in the future probably, unless there is discovered a better way for enhancing lightning effects etc.
What are you talking about? Control, Metro Exodus (especially enhanced), Cyberpunk and Alan Wake 2 are standout examples of how transformative RT is to inage quality!
@@leviathan5207 Modded 2077 is transformative, not base even with RT. Alan walks 2 is a walking sim so ....sure? and metro has done it the best ill give it that but it still tanks performance for little in the way of real graphical fidelity increase.
How did the Voodoo 1 from 1996 hold up in 2002? How well did Crysis run on the GeForce 3 in 2007? How much fun was it to play Metro 2033 Last Light with an 8800 GTX in 2013? Six years is a long time in the GPU market, and gamers' expectations have never been higher than they are now.
Awesome review my man! I can see why this took so long. That’s an awful lot of work you put into this and I highly appreciate it. That’s pretty wild that 16gbs of vram isn’t enough for ray tracing with Ratchet and Clank. Do you possibly have any numbers on how much vram was being used on the titan? I’m very much looking forward to seeing the Rtx 3080ti. As I know that gen had a big upgrade. Sadly I went with a 3070 and found out pretty quick it just couldn’t do high resolution gaming with ray tracing, or even rasterized for that matter with my game selection sadly. Which is such a shame as I loved that card. If only it had come with 16gbs of vram.
so alan wake 2 RT isn't that good, it's just that baked in shadows are so bad that it makes that difference, becomes kinda funny if we look at arkham series that did great baked in shadows also games mostly suffer from culture wars from inside, even product manager, if i recall right, said that she wouldn't take the job if she knew she were to work with alan wake 2, she wanted "strong female protag game" in Control, so she hated the game from the get go, also stated that "at every opportunity while talking with staff, [she] suggested making as many characters LGBTQIA+", i'm aware that minorities exists, but how much specifically? i'd say 20% max, more likely 10%, so when some people are trying to force and use it in political way, without a clear reason, then it becomes obnoxious, and people hate being told what to do, so pushback is and always will be natural
I bought a 1st gen RTX card originally purely for DLSS. Because raytracing is cool and all, but if you flip the comparison around "rtx off+dlss" you bought a card that outperforms for the price. Though this wouldn't apply to someone buying in the titan or 2080ti budget class. But 2060? Super? 2070... It's free real estate.
Most new AAA games have RT features. There are 600 RTX enabled titles. Most games didn't support 3d acceleration in 1999, but anybody could see that 3d acceleration was the future back then.
20 series owners like myself got royally fked. I bought a RTX2080 Super because my GTX1080 had recently died and that was the worst decision. Not only was the raytracing performance abysmal for 1440p, even with DLSS Performance in some games it just wasn't possible to even consider Ray Tracing. Then with 30 series they got ReBar support but nooooooo Nvidia *could* have enabled it on 20 series but chose not to. But hey at least 30 series got destroyed by not getting DLSS3 Frame Gen. I got a good deal on a 4070ti Super and I jumped and bought that. So far it's great & have no complaints at 1440p.
Conversely, AMD is still catching up on RT performance six years later. Which is surprising since the RT is involved in so many titles at this point and are released on the AMD based consoles. Makes me think part of the issue is the AMD software and not pushing on developers to optimize for their stuff.
Right now we're at a stage where people with a decently capable setup can run decent performance in games with (one of) RT Shadows, AO, or DDGI. It's just that often developers forgo the fast techniques in favor of RT Reflections or GI that you can really put on marketing. Another 2-3 hardware generations and whether you like it or not you'll have to own a GPU capable of high end Ray Tracing or light Path Tracing.
I believe that one day several years from now that Ray Tracing will become a more standardized and added into basically all AAA games and some higher end indies too. Right now even after 6 years in a lot of cases it's not worth bothering with Ray Tracing, but that doesn't change the fact that one day it will be the future of lighting, shadow, and reflection technology.
Don’t pick up a 3090 just go with with a 3070 vs 2080 for the masses. It’s where the volume was and dropping back a tier per generation gives a good idea of the value uplift.
You should do a deep dive into Kepler. Literally, no one ever talks about that generation because everyone is so wrapped up in how poorly it aged. Nvidia actually made a giant leap in performance with Kepler, but since Nvidia waited to release a card based on the GK110, and since the GTX 680 was based on the GK104, no one ever directly compared the flagship GF110 to the GK110. Basically a comparison of the GTX 580 and GTX 780ti. The 780ti is not that far away from being twice as fast as the GTX 580. Fermi to Kepler was a bigger performance leap than Ampere to Ada.
Very interesting topic well looked into. Thanks for that. And I'm glad my opinion wasn't too far off of your own conclusion. The performance hit is so heavy. Maybe some people prefer the looks of ray-traced art. Okay. But throwing away framerate just for a "different" visual look (not strictly better IMO, just different)... it's a big ask! Throwing framerate away but then recouping some of it by dabbling with upscaling isn't as much of an aesthetic ask when it works well, but rather it's a legitimate hassle and emotionally/philosophically not something everyone is going to be willing to do, which isn't nothing, even if it's harder to put an objective lens on it. As in, it's a crapshoot as to the quality of the implementation of the upscaling per game, and spending (wasting?) time figuring out if DLSS, FSR or XeSS works best in the particular title takes some time and energy away from just playing the game. So, less clear-cut than FPS due to a lack of objective metric to put to it, but it's its own weirdness that shouldn't be hand-waved away, IMO. On the one hand, it's good to have options. And by and large we do have options. On the other, mandatory ray-tracing seems like a bad bet. I predict one of two things: Hardware catches up and it becomes a non-issue, or less of one to the point where we don't care about it in 5 or 10 years. Or else mandatory RT will be looked at as a mistake, a "wild" time back in the mid 2020s when they didn't let you turn RT off in certain games, like Star Wars Outlaws, or made it come with big sacrifices and had a less-than-first-class non-RT mode, like Alan Wake 2. Hopefully this comment isn't so long everyone's stopped reading it, but it's the sort of in-depth discussion your video suggests, IMO. I appreciate the high quality conversation-starter and objective benchmarking content combined together -- it's a really well done video, IMO. Thanks again for that.
So, funny enough, I have a RTX 2080 Super as my current graphics card, fed by an AMD Ryzen 5 5600X. I use Ray Tracing in exactly one game, Forza Horizon 5. It plays perfectly at both 1920x1080 and 2560x1440 resolutions with Ultra settings across the board without FSR or DLSS. I do admit that my 1920x1080 monitor is 80Hz and I have V-sync enabled, the 2560x1440 monitor I normally use was limited to 60Hz. But thing is, if I can get 60 FPS or higher in most games, with steady frame times, I will take that over insane high frame rates. I don't play competitive online games, I don't require five billion frames per second, I am good with a steady 60-80 FPS. Anything steady above that is a nice bonus.
i remember getting my rtx2070 and wanting to try Battlefield with ray tracing on, it surprisingly did run but it was only reflections if i remember correctly. it wasnt anything cool so i turned it off and never thought of it again. now on my 7900xtx i still dont find ray tracing visually distinct enough to be worth the slight performance hit
I wouldn't call myself "early adopter" even I've bought 2 20 series cards. RTX 2060 in 2021 because everything else is very expensive in GPU shortage. RTX 2070 super in this year because I'm cheap
Very good video. Well done on it and the amount of work on it. I don't have and never had a ray tracing capable computer. So all my experience is through RUclips videos. For me there are only two cases which make me wish I had the option to play with ray tracing. At some level Witcher 3, but mostly Cyberpunk 2077 with path tracing. I do hope path tracing (with ray reconstruction) becomes the norm and we finally get GPUs which will be able to handle it without having to pay 4 digits for a GPU to play it with upscaling.
Soooo… the charts 😅
I usually use Google Sheets, which is handy because I use several different computers as well as my phone for noting down data. Unfortunately, it couldn’t handle this one, so I found myself using Libre Office for the first time. I kinda had to learn the program on the fly, and I dare say that I could have done a better job with more practice. I’m not sure if I’m going to use Libre again, but if I do, I’ll definitely work on making the charts more readable!
Do you research the entire documentary by yourself ?
How much time do you usually spend on a video
@@IcebergTech Not a problem. The charts were fine and easy to read.
@@walter1824 Yes, I'm a one man band. This video took a while, it was always intended to come out in September for the 6th anniversary of Turing, but I started writing the script at the end of March. The actual video took a little over a week: benchmarking took about three days (including some related tests for another future video), making the charts took a couple more days, recording the script took 2 days because I lost my voice halfway through, and I edited everything together yesterday.
@@IcebergTech I really appreciate the work man it really shows you put in the leg work! ❤
Hey iceberg do you know of a way of downloading youtube videos at max resolution and quality? Without youtube premium
there is no way its been 6 years since the rtx20xx, damn.. im growing old
mannn, you aint kidding.
Man I was thinking the same thing but i just realized my 3080ti has been in my system for over 3 years...
Time is a son of a bitch lol
Check your hairline
@@jayreed2692yeah I can’t believe it either, seems like yesterday I got my 3090 for half price. I’ll never replace it.
😢
When it launched, the meme was that the 2080ti could do 4K raster, but only 1080p ray tracing. The 2060... 1080p raster, 480p ray tracing. Time has not been kind to either.
Games have gotten more demanding. Therefore, that would be the case.
idk, i have a laptop with an rtx 2070 max-q, it should perform worse than a desktop 2060, i think, but it doesn't feel dated. it can still run anything with very little compromise. i don't think time was unkind to it at all.
@@GraveUypo Obviously thay can run games, they were objectively bad though. The 1080TI trades with the 2080ti in raster, and launched at a 30% lower msrp multiple years earlier. The 20 series are pretty the worst value cards nvidia has ever offered, in terms of fps per dollar.
I mean i have a friend still running a 1060 and it's fine for what he needs. That doesnt necessarily mean it's a good or bad value card.
@@bobbylasagna 2080tI is at least 30% faster than 1080Ti. and when games start being build more with DX12 feature the gap is widen even more. and you can't really compare about price either when things getting more and more expensive as time goes by. just look at current 5nm. next year it is expected 5nm price will be as expensive as 3nm two years ago because TSMC keep increasing the price for 5nm instead of making it more cheaper as time goes by.
The 2060 was never seen as a 1080p raster card. It was a 1440p card at launch and can still do 1080p now
The titan rtx is such a beautiful looking card.
Loved the designs on 20 series founders. 30 is interesting. 10 is beautifully classic.
Horribly made, 70 screws. Glue everywhere. Terrible fans.
The blower cards from Nvidia before were far better
And the 30 series design and after is the gold standard for a card with good airflow
its also a beautiful chunk out of your wallet
@@wile123456 Didn't people hated blower cards for the noise it made?
As a long-time 2080ti user, I've been gaming at 1440p since 2020, and, if you don't use raytracing, it is still a great experience. Even newer games you can usually pull a 1440p60fps high preset experience. Love this card. Definitely starting to feel its age a bit, but its still very capable. Asus ROG strix version, myself.
Overclocking it like hardcore overclocking it will make it like 3-7% slower than a 3080
@@moxe6841 stop the cap, even then it will be nowhere near a 3080.
@@moxe6841so you'll make up a 30% or 40% performance gap depending on the game with just a overclock, seems highly unlikely..........
I used my 2080 ti till July 2022. Replacing with RTX 3090 and sold it for a 7900xtx.
I could game on my 2080 super at 1440 with ray tracing and high settings, just use the Losless Scaling program.
Yes, for ray tracing you need lots of vram, sometimes surprisingly a lot more than you'd think you'd need. The worst part of this whole rtx thing besides it being so hard to run is how nvidia has been consistently skimping on vram while acting like everything is fine. Then you have hordes of nvidia fanboys making fun of you for insisting on more vram while they pretend everything is fine. That peaked with the 4070 ti. "12gb vram is fine bro." No, it's not.
Like the 3070 ti suffering from massive stuttering and performance drops in some games, especially with RT enabled. But as usual the nvidiots just overlook and ignore it.
This is a consistent issue with Nvidia's lower end offerings, which is what most people buy.
It all depends how the game is made and the engine used. VRAM requirements are not just for one thing.
Which is why the 6700xt is the darling of the used market while no one talks about the 3070. It's a pretty effective strategy to kill the used market.
@@Son37Lumiere I'm an idiot for buying a gpu that meets my needs?
I have a 2080ti, 3090 (got it for retail price when they were well over $2000), and a 4090. All great gpus.
I have a RTX 3080 laptop which does everything well at 1080p even after 2years and being a thin and lite model.
I have a old 970 which lasted a long time. I use it for testing.
I have a 1070 and 2070 that lasted as long as expected and still can game at 1080 well.
Whereas AMD dumped support for which GPUs last year again? Everything but the 6000 series and up? Lol.
AMD fanboys always resort to name calling. Especially when a lot of them didn't even buy the cards that they claim are so great until they went on the used market.
Ciao.
I had a RTX 2080 Ti. To me, RTX on the 2080 Ti was never more than a demo, a taster of things to come.
It's still a demo on RDNA2 and RDNA3 if you turn on path tracing xD
@@louisfriend9323 true but I'd also consider the 4090 still just a demo ''pro'' card when you have to dump resolution to 1080p to get playable rates in the most recently released UE5 games an they stutter to all hell. I've yet to see a visual improvement from RT that justifies TANKING resolution and texture quality on my 75' 4k TV. Bragging about who's failing less at offering an unjustified visual ''feature'' is far from wining. 🤣
Even at launch it was considered a demo feature by reviewers, they were right. Eventually it will be useful, but for now, not so much.
@@jonathanwilliams5329what? I have a 4090 and i don’t have to dump res to 1080p in PT. As for UE5 stutter that’s not anything to do with the 4090 lol.
@@pliat Never said it was a 4090 issue. It is However an issue in UE5 games which is made even worse when RT is on. And I don't need your opinion on what res you use while running RT. there is plenty of evidence and benchmarks on sites such as tom's hardware to show that 2024 titles using the the FULL suite of of RT effects are performing at sub 60fps levels at anything much more than 1080, or 1440 with upscaling.
you may be of the opinion that 1080p up scaled + frame Gen to 4k at 30-60fps and 20+ ms of latency is fine. but most are not.
I just gave away my rtx 2060 to someone who was using integrated graphics. (I have 6 other rtx cards that are faster than the 2060) They think the card is amazing for their 1080p game playing. Perspective always matters.
Anything is better than integrated graphics. Even a 960.
@@Son37Lumiere eh... doubt? a 960 cannot beat something like a Radeon 780M in high-end AMD CPUs, or the new Intel Arc iGPU found in Intel Meteor Lake (and Lunar Lake) CPUs.
@@jamesbrendan5170 I am fairly certain not many people are using the 780m, or arc igpu. Besides, the latter aint awesome either.
@@jamesbrendan5170 Yes well most people don't have a cpu with a 780m. I'm talking about your average slow as dirt Intel iGPU.
wished you separated the two GPU bars by color
When you have nothing smarter to say and with more sense... Or you are blind/color blind?
@theexorcist982 lol wtf,damn bro you sound offended,You his boyfriend or something Or just having a bad day buddy? Don't worry everything will be alright
@@theexorcist982 ??? this is a fair criticism, what's going on with you?
@kebab6617 I think he's just on his period
gpu segregation
90% of the time I think Ray Tracing is nice but not really necessary for a game. I recently got a "WOW!" moment in Cyberpunk, I was walking around a corner, I thought I had taken out all the guards when I saw the reflection of a guard above me on a ledge in a puddle! (I'm using a 4090)
Cyberpunk Pathtracing is the only time in play that significantly makes a difference. Most of the time without a side by side or pixel peeping, you really won't see a meaningful difference.
@@MagiofAsura uhh yeah you absolutely will lmao
@@MagiofAsura Counterpoint: *flashlights* The standard ways of faking light really show their limitations when you have a flashlight.
@@MagiofAsuraDude, load up the Witcher 3 Next Gen and start a conversation indoors. RT on vs off is a night and day difference.
An important detail about Turing's RT implementation is that the RT hardware acceleration couldn't run concurrently with the rest of the GPU pipeline -- very similar of how RDNA serializes its RT implementation. That's why the 6900XT is able to match even or top out the 2080Ti despite having much less dedicated hardware RT support, by simply brute forcing itself with more FLOPs capacity (76% advantage). This deficiency of Turing was the first thing that Nvidia mended in Ampere together with doubling of the FP32 throughput.
I don’t really know but could bios modding help or is this issue at the hardware level?
@@famousfighter2310Definitely a hardware issue, it's a problem with the Turing architecture itself.
@@famousfighter2310 You can *TRY* to mitigate that...but the problem is in hardware design itself.
Partially true. Turing can do concurrent operations with RT and shading, it's just pretty bad at it. When it comes to RDNA2 GPUs beating Turing GPUs with RT, it's mostly just because of the rasterized performance advantage. RDNA2 GPUs fall off much more in more complex ray traced scenes, especially path traced ones (Not that turing is that great with path tracing either).
I've compared the rx 6700xt and the rtx 2070 super against each other in rt situations (They both have an equal number of rt cores/accelerators and shading units). generally the rx 6700xt was faster when it comes to simple rt effects in games, like doom eternal or spiderman remastered. In other games where the rt tasks are more complex, like control, the rx 6700xt started to struggle more than the rtx card did. In path traced titles like quake II rtx, the rtx 2070 super destroyed the 6700xt by over 30%.
@@cosmicnebula3023 Most games that feature RT, especially the ones with RTX sponsorship, are optimized specifically for nvidia with little to no optimization for AMD.
to anyone potentially saying the performance isn't that bad, please remember that the majority of Turing owners didn't have a Titan RTX or 2080 Ti
any numbers you see here will be cut in half if not more by using a 2060. a 6GB buffer is an absolute no go on many games in 2024 withOUT RT enabled. if you think it's shocking to see a 6900XT to struggle in Ratchet and Clank 1440p with ray tracing, imagine how bad the rasterization performance is for a card with just over a third the VRAM
6900 XTX isn't struggling in Ratchet with ray tracing. 2080 Ti is xd
Yeah low end RTX cards were a scam when marketed for the RT, basically just like current gen RTX cards. It's just not worth the hit. "Let's play Cyberpunk fully path traced on my new RTX 4060 for the best experience" said no-one ever. It's good for screenshot mode on those cards - that's it.
Now with AI hyperscalers buying all the waver supply, low end will continue to eat shit as NVIDIA will ramp up the shrinkflation. The leaked "5080" is clearly a 70 type card when you look at the core count and memory downgrade over the 5090 leaks. I bet NVIDIA won't even made a 5060 this gen (or they will, but it'll be their xx50 silicon relabeled as xx60, or just repackaged 4000 dies). AMD doesn't even try to compete at higher end this year and will just release a refresh with better RT and then just roughly price match equivalent, overpriced $600 NVIDIA products that ought to cost $300, so buyers are screwed either way. Intel is too busy sinking and even if Xe2 is 2x faster, you bet your favorite new game will have awful frame timings and there is a good change that line will no longer get game optimized drivers 6 years after release.
Man 6 years and 90% of people still cant use it
The fuck are you talking about!? 60% of PC Gamers have RT capable GPUs.
@@Wobbothe3rdI call BS. Im rocking a GTX 1660. Most are still rocking their 1060s and other gpus thanks to the stinky price gouging between nvidia and amd
Bad take. Ray tracing is just a common thing now lol
Steam hardware survey probably has the definitive answer for y'all
GUYS, I did da math from steam survey. I just count every GPU that can't do RT of any kind, excluding AMD radeon graphics (cause it can be either RDNA2 based igpu, like in my amd 7700x or steam deck gpu, and it can be old vega ones, that for some reason did not identified themselves, cause Vega 11 or Vega do exist in the chart), also I didn't count "others" group that include most unidentified or less than 0.15% GPUs. So, number is... 26.9% (nice). If we include unidentified gpus that will be: 36.14%. The rest of the chart (63.86%) supports RT! Maybe not very well, like 6600xt or base RTX2060, but point stands.
Raytracing is a neat feature, but it hasn't wowed me as much as the performance impact when turning it on.
Going from 120fps to 45 even on an RTX 4090 isn't a good look. We'll eventually get there, but it might take another 6 years.
The next feature I want to see implemented is Gaussian Splatting in video games. That'd be neat.
It's not just a "feature" it's an entire rendering paradigm that enables entirely new gameplwy types. RT works perfectly well right now on rtx20 cards, ypu just can't expect esports framerates or high native resolutions. For DECADES Gamers lived 60hz and CRT resolutions, so clearly those things aren't NECESSARY for good image quality or smoothness.
@@Wobbothe3rd CRT *monitors* were going WELL in excess of 60Hz, WITH better latency and motion clarity than flat panels have produced, even now. People were actively PISSED when CRT died, in exchange for tech that was in its infancy and therefore objectively worse in basically every way, so try again.
What game drops to 45fps with RT?
I run Cyberpunk Ultra Performance Path Tracing 20 at 1440p DLSS quality and FSR frame gen on a 3080, whilst getting between 80-110 fps depending on the scene.
The image is not only better it's a completely transformative experience!
@@leviathan5207 Your 3080 is rendering at less than 20 fps at 1440p path tracing. DLSS quality would bring that up to around 35-40 fps, which is why you're using frame gen, or fake frames to gain the rest.
@@leviathan5207 did you really just ask what game drops to 45 FPS with RT while saying you play a game at 90 FPS *with* frame generation enabled? ...
i've got an rtx 2080 laptop, which i dont use anyway because my main rig is more powerful, but you know what, it still runs games at high settings to this day, at +60fps pretty much across the board! :) (at 1080p)
I also have a 2080 laptop. It does very well at 1080p or 1440p.
My laptop has a max of 105 watts. That's what limits it unfortunately.
It's a thin and lite.
Unless I’m looking at a side by side comparison I literally cannot tell the difference between RT on and RT off
Mr. Iceberg, the charts you used to show the performance of the cards are a bit... confusing. I suggest that you make a clear distinction between the two graphs by having some lines at the top and the bottom of the graphs and then leaving some space between the blocks.
The video's intro sequence was very good though Mr. Iceberg!
Yes this is not an easy watch on mobile or a small screen. Can’t read the small text next to the graphs and I’ve already got sight issues due to a retina injury I got as a kid. On desktop it’s okay.
A little hack for yall who use a first gen RTX card, install the DLSS to FSR3 frame gen on SINGLEPLAYER GAMES (IMPORTANT BECAUSE MULTIPLAYER GAMES WILL GET YOU BANNED). Since DLSS frame gen is locked behind 40 series cards, this mod has helped me a lot with Cyberpunk Ray Tracing. I may have an RTX 3070 Ti but adding the DLSS to FSR3 frame gen mod to singleplayer games, i got more than comfortable performance on games like Cyberpunk and Alan Wake 2 maxed out. Just remember that frame gen will add latency to your games so the closer to 60 FPS you can get the game without, the lower latency. 40 to 50 FPS is still playable, just try to counter the latency by predicting moves (which aint too hard).
Didn't cyberpunk just add FSR3 officially?
@@empedance1933Yeah but their implementation doesn’t currently work for some reason. The fps counter will go up higher but the image isn’t actually any smoother.
@@empedance1933Yeah, but they implemented an older version of FSR which has some issues that were fixed in FSR3.1
CDPR went woke their time is over now... As for their broken FSR 3 it belongs in a dumpster. Also 40 FPS with frame gen is complete trash.
Still have my 8700-2080 rig that got me into PC gaming for real. I had dabbled here and there over the years, but never took the plunge. In late 2018 I found a clearance deal on an HP Omen tower at Best Buy w/ the 8700 and a 2080. It was a blower style card, stayed fairly warm, and definitely loud. But after some tweaking, that little system served me well for about 5yrs. I didn't need to replace it honestly, but I found another deal on a Legion Tower (13700K-4070Ti) and took the plunge on that. It's obviously more performant, but I still use that Omen from time to time. And as long as you stay at 1440p or below with reasonable settings, it will play anything. After following the 10 series, and the price increase, I can see why the 20 series got hate at the time. But for things like prebuilts, it made more sense.
the 2080 ti is the only 2000 series that could do raytracing, and only just, and still somehow throws hands (at 1080p)
I was using a 2070 Super until a few weeks ago and it could handle RT just fine in a lot of games at 1440p with DLSS (Balanced is what I'd typically use). Not maxed out obviously, and there were some games where turning it on at all was out of the question (Alan Wake 2), but I played through Metro Exodus EE a few months ago just fine (which is to say 60fps+ at all times) and it could handle one or two RT effects in Cyberpunk 2077 with some optimized settings (I chose reflections, since they're the most noticeable). Honestly, I was quite happy still using it until a killer deal on a 4070 Ti Super came along.
Or towel?😂
my 2080 can do perfectly RT in 4K with DLSS (cyberpunkt) offcourse.. I was an expensive card, but hold ups very well.
@@CaptainKenway damn, the 2070 super is pretty on it then, i had the base 2070 and it wasnt quite up to scratch so i got the 2080ti, just to be safe, this was only about a year ago tho
2060 super oc here. 800x600 4tw. (generally under 720p does ok)
I never bother to enable RT. I prefer 150 fps native resolution compared to struggling to hit 60 with upscaling image quality loss.
I've been on a 3070ti and I swear DLSS just feels like a free upgrade on the PC itself, if DLSS is available I always run it at Quality, I appreciate the 20% performance boost and the upscaling in what I think is near most situations makes the game look even better
It's the only AA that doesn't give me eyestrain *and* it comes with a framerate uplift. As I'm on a 1080p monitor, any less than Quality can look pretty dire, but Quality often looks really great if the implementation isn't bad.
I've got a handful of games I won't use it on. Warframe's implementation was not great last I checked, it did not handle stars well at all which is not ideal in a space game. But I get great performance in that anyway and am not bothered by the lack of AA, but for the rest it's so good. No Man's Sky has a decent implementation which doesn't lose stars, so I know it can be done.
Dlss looks better than the typically used TAA in games so that’s why I enable it, the fps boost is nice but as long as I’m over 60 I am happy, which is usually closer to 144 on my 4070ti.
It'd be funny if doubling the VRAM of 2080 Ti fixes most of the troubles compared to Titan RTX
12:11 8800GTX's MSRP was 600USD at launch (910USD in current dollars), 8800 Ultra was 830USD at launch (1260USD in current dollars). So, its not right that gpus are a lot more expensive than before
It actually took 2 years to replace GTX. There was the GTX 1600 series alongside the RTX 2000.
GTX 1600 Ti was WORST and released years after the RTX 2080
@@lucasrem Uh yes it was worse than a 2080, because it was Nvidia's low end product not sure what you mean. Also it did not come out years later, it came out 5 months after the 2080 as it was part of the same generation of GPU's.
@@lucasremthe gtx 16 series was released in 2019 as an affordable option that could run pretty much every game on the market at the time. The 1650 was garbage but 1660 super and Ti were great if you couldn’t quite afford the 2060.
Holy shit, I love the... intro screen? (not sure what it's called) banger that one.
Your thumbnails are so recognisable, I put the video on as soon as I see them. Your branding is fucking immaculate, so recognisable, so clean.
It's a thing I don't feel like a lot of RUclipsrs get, sure bright contrasting colours and big red arrows may get clicks, but a well branded thumbnail guarantees my view.
RTX 20 series is just 10 series, with AI+RT cores (a superset of the normal Shader Cores). That's it.
And 30 series are just the 20 series in disguise
@@skipskip342 Nah, 30 series is more than enough to be different. 2070 series equal is around 3060 series.
And 3090 is way ahead of the Titan RTX.
You're confusing Nvidia with AMD. RX 7000 is just a rebadged, of RX 6000 series, and RX 6000 is just a modified RX 5000, with RT cores.
Plus, AMD is going tier-to-tier in naming, not $$$.
RX 7900 series comepte against the RTX 4090.
RX 7800 series comepte against the RTX 4080.
RX 7700 series comepte against the RTX 4070.
RX 7600 series comepte against the RTX 4060.
Wrong. Turing was a complete architectural revamp. It was also the first to introduce Mesh shading, sampler feedback and VRS.
The GTX 1600 ti ?
you forgot Battlefield V ?
You get a better video encoder and decoder too.
Great testing, thx for your work.It's nice to know that the owner of the founder version of 2080 ti. Also, don't forget about the first card that could process beams without using hardware blocks. So this is Volta, it came out earlier than 2080 ti and was stronger in computing blocks than 2080 ti, but in games with beams it lost in everything, but it was the last card with the largest chip for the consumer segment, and together with the HBM chips it looked like a single whole. Not long ago, Der8auer did a review of it
Shame you didn't bechmark Frontier: The Overgrown Avatar
Otherwise, great video, you outdid yourself with all those benchmarks!
its so annoying that so many games need frame Gen for decent frame rate instead of just optimization.
What a lucky reload!
6 years and till this day I don’t have even a PC. And I never had a GPU.
But I know soon I will have a really good PC!!
Turns out ray-tracing been and path-tracing has been a concept in movies production since the 1970s just hadn't been powerful hardware to run it till recent GPUs that pack pure muscle (computing)
It still isn't even useable in games either.
@@kevinerbs2778 I'd argue otherwise. There are many games where you can enable RT on modest GPUs without the need for upscaling. Yes path tracing is a different story, but games using hybrid rendering typically can be run without needing a 4090.
You are right guys, rt in games is just glossing for the sake of it. Because in movies, creating a set scene with rt and PT still takes months to complete with top of the line GPUs of today.
@@LeeericosGPUs are often not used for final renders, ONLY because they lack memory. But yes, NVIDIA GPUs ARE universally used in any situation where you aren't required to use tons of memory (like look dev). Nvidia's GPUs since Turing have had a profound impact on lookdev. No longer have to run the renderer on multiple large-memory large-cpu-count machines over the network, just to see if you corrected that problem in the shaders of a character. Now you can get a nicely-converged image SUPER quickly because you have THE fastest ray tracing processors (Nvidia) right in your workstation, laptop, etc.
Titan RTX and 2080TI had virtual link ports, RT cores, more raw compute and Cuda improvements over the 10 series and were better for VR than anything else on the market. I built and deployed workstations used for heavy renders and tasks and to this day those 20 series cards are still kicking. I have upgraded some machines to 3 and 4 series, but it's based on needs and speeds. RTX as a feature was and still is eye candy, and as hardware keeps getting better it's still being adopted. The 20 series cards have aged well for how far games have come. In time game engines have added options, and I love to see it when I do.
ray tracing impossible to ignore? challenge accepted
there are games where you can't turn it off. like... all unreal 5 engine games.
that's false, unreal engineers games use software lumen, which doesn't use raytracing
*engine
Screen space occlusion artefacts are impossible to ignore. SSAO is fugly and SSR surfaces having their reflections suddenly disappear when you look slightly down is the most distracting and shitty looking thing ever.
@@wile123456 Sofware Lumen is ray-tracing, just not hardware accelerated. Easier to run (bearly) but lacks any sort of precision in the fine grain, you can easily see errors in lighting on smaller objekts.
My rtx3060 ray traces on 1440p moderately okay. Not crazy high fps but it’s stable. Considering the 3060 is only $200-300 I’m surprised more people haven’t bought it. It’s now been passed up for the rtx4060 which is slightly better and slightly cheaper and a gen newer. For pc gaming being an expensive hobby a lot of people seem to spend more money on new games than new parts.
Get a 4070 for 1440p. Huge improvement over my previous 3060.
My 3060 laptop only has 6gb of vram lol
2080 Ti still capable at 1440p.
And if you run high refresh-rate 1080p, you’re definitely still good to go.
Introduction of ray tracing and dlss made more harm than good.
Games are unoptimised as hell. When games put dlss on in recommend settings just to get stable 60 fps something is clearly wrong. Dlss was meant to help older hardware keep up with new releases. But games are so unoptimised now even 40 series cards can't have stable fps.
DLSS was about selling new cards, and making ray tracing possible, and also the fact that Nvidia had no competition at the time, so they put money into it's development
the 2080ti is 6 years old? thats absurd great vid regardless
Which means my RTX 3070ti is around the same level as a 6 year old flagship.
Ugh.
@LeftJoystick 3070ti itself is 4 year old card man.
I had the 2080ti and it was realy nice to use raytracing in games that supported it. Yeah it costs performance but what many people still don't realise is that in the future when a game will only use raytracing instead of the old screenspace reflections, light etc. is the insane amount of work it saves developers. Yeah old reflections, light etc. looks good if done correctly but needs a lot of work to do right. With raytracing they can in theory just build their levels place light sources and tell the engine what material is used and the engine does the rest + it looks way more realistic. It will still need a long time until we reach that point but many seem to have forgotten that in the past a 1080p monitor was considered "overkill" and used way to many resources. Now we have enough performance for 4k and have the same debate about that resolution.
I only have a 4090 as it’s for work, and the bonus is amazing gaming performance. It is cool being able to see sort of future graphics, 4K and path traced, but at the end of the day the thing that matters is the game quality, and graphics are just polish. Only a few games really nail that and the majority don’t even have RT.
I know Raytracing affects the cpu aswell. Is the 7500F over taxed at during testing?
The tech will be mature when midrange cards - pricewise - will be able to make use of it. As it stands, it's a useless, performance draining feature.
Wow your wrong
You are wrong.
Alan wake 2, Cyberpunk, Control and Metro Exodus enhanced are transformed by enabling rt. DLSS makes it possible and it's 100% worth it in those games.
Other games dont even run mist RT effects but even if it's only reflections it's always worth turning on, even in a game like Persona 3 Reload the RT effects enhance the presentation by a great deal and if you think otherwise, you must be trolling or have never experienced good RT.
@@leviathan5207 RT in some games are really cool, but until it is available at a mid-range gpus (I would argue real mid-range of 250-300$, not current bs of 400+$ "mid-range") it is not mature. If tech is unplayable on mid-range it means that it is non existent for majority of players.
Like we know how to do RT in general for decades. Question is about real-time RT with performance fit for games.
In 2 years 4060 likely will be most common gpu and it is unable to run even current RT games with any decent performance.
@@robertkorth2911 He is right. RT is a useless gimmick being used to sell nvidia cards at inflated prices.
@@L1vv4nmidrange will never ever be under $300 again, are you kidding? That was midrange 10 years ago with Polaris.
I think the main thing here is that the upper-end first gen RTX cards still ray trace better than current gen consoles, so they are secure in being able to run better graphics than console in newly released games until at least the next gen.
6 years on and most people still don't care about it. The bump in quality is disproportional to the money required for such an experience, and you have to drop your rendering resolution to get a decent result. I think Nvidia know their gamble didn't go as well as they expected, which is why they're apparently focusing on ai these days.
Where do you get "most" from?
The people I know all got a 30 series card SPECIFICALLY for RT in games like Control, Cyberpunk and Alan Wake 2...
I feel like you are projecting a lot.
@@leviathan5207that’s a troll
@@leviathan5207 Not really, I know more people who got 30-series cards as an upgrade for rasterization than people who are actually interested in ray-tracing.
The reason why people buy expensive carbs is to get the best visual experience possible.
That's the whole point for most people that spend more money. Not fps. The FPS just comes with the expensive cards.
I am currently using an RTX 2080 Ti, but I've never played an RTX enabled game, except for the upscaling feature. I needed to buy a new computer when the 2080 Ti was the current GPU for high-end gaming, but I also sent NVIDIA an email at the time, informing them that I was not impressed with their RTX release, as it cost a ton of money and would soon become obsolete. I'm now waiting to upgrade to an RTX 5090 system where (hopefully) I'll be able to play RTX titles in 4K rez, as well as using a high-end VR headset, without having to worry about stutter or low frame rates.
I can't believe you made a video talking about the first generation of RT capable NVidia GPUs, WITHOUT testing Portal with RTX.
Tbh, the results would never have been great.
I possess a cheap unknown-branded RTX 2070 -which my brother gave me last summer, upgraded from XFX Radeon RX 580 8G- on about the same config as your "MPG PC" test bench, and as soon as I got my hands on it, I played it.
The results : 4-5 FPS average max settings no DLSS @1920x1080, minimum >1FPS, and to breach the 40 FPS mark I had DLSS on Ultra Performance (still high settings). And I was curious about how much frames I could output, which is a maximum of 140 FPS @640x480 with Ultra Performance DLSS enabled.
That was a fun experiment on my side, too bad I haven't seen any mention of Portal with RTX.
Also 20 Series Card FE are my favourite looking card out of all NVidia FEs.
This video was still very nice to watch I always give it a like!
Well it's a path tracing game not just satndarf RT , so it is very very demanding , on my 3070 i had to use balanced on 1080p to get a decent framerate even though i generally play on 1440p
Thts a mod not techinally a game
@@kevinerbs2778 yes but it's still prettier than real life.
ye i got this rtx2060 laptopoff ebay over a year ago and puttin this thing to its limits has been interesting as my main unit. (few games' rtx modes work well on here actually well, save for Chorus.)
The only useful thing that came out of the RT Cores was DLSS. To this day, I still have never run RTX other than switching it on once to “let me see if it actually makes a big difference in visuals”. Then I turn it off because it invariably is not worth the performance hit. Lmao
Same here! I bought a 2070 Super to replace my GTX970. RT was never going to be worthwhile (not sure it is now, TBH) but DLSS is great. I do need an upgrade to match my now 7800X3D CPU (plus existing 2K ultrawide monitor) but the stupid costs are putting me off.
Also, DLSS does not use the RT cores, it uses Tensor cores.
Writing this from my RTX 2070 Super PC. Still working fine :)
So many people in the comments arguing that RT is pointless it makes me kinda worried. Like, are you blind? Do you need glasses? Just look at a 6 year old game like Control with and without RT effects and tell me it's not a vast improvement! And that game only has basic RT effects, unlike Cyberpunk which can be played fully path traced.
I swear to god, anyone who says RT is not worth enabling is just trying to cope with the fact that they are too poor to upgrade from their GTX 760... There is no other logical explenation. RT is probably the biggest boost to fidelity since SVOGI.
The main argument that I see is the performance drop, which is a real concern for those on weaker RTX cards (like the very popular RTX 3060 12GB). Those that say RTX on vs off isn't a big difference are likely still thinking of pre 2020 titles or those with minimal implementations. I myself play with RTX if I can manage above 75fps with at most DLSS quality (anything more aggressive and it's just not that great). I use an RTX 2080ti, though I do have an RTX 3080 on the way.
Truthfully allot of games with RT on do not look that much different, in a blind test you would not necessarily know if RT is on, or off without reflections. It is getting better the last year or two since it is becoming more standard in games. Also control looks good without RT, that's also part of the problem is that games still look good without RT and without an FPS hit. I think people want it but not if it's at a bad frame rate. A 4090 can drop well below 60 fps. To your point though, yes people that cannot run it decent enough will downplay it just because.
It still feels like yesterday when these came out. Still using a 10 series gpu
The demands of raytracing far exceed current hardware. We're having to use software resolution scaling to make up for it.
Edit: Retracting my statement, the 2000 series works great for new DLSS. My bad!
Turing is supported by all DLSS features except for frame gen, which you can still get with FSR 3.1 frame gen. Neither FSR frame gen nor DLSS frame gen are recommended to enable when you can't reach a base fps of at least 60 due to artifacting and latency concerns. DLSS frame gen is also very VRAM hungry, which - as you saw in the video - not even the 2080 Ti has enough to spare.
Wtf are you talking about!? DLSS works fully on RTX20 series cards, up to Ray Reconstruction which was released in 2023(!), the only thing that's limited to rtx40 is frame generation. The 2000 series STILL ARE good at RT and pathtracing!
@Vorexia DLSS3 Frame Generation works perfectly well going from 40fps, Nvidia has never said otherwise. Even a 30fps base framerate works for slower games like Flight simulator. Stop equating DLSS FG with AMDs inferior solution, they aren't equally as good!
and also turn on TAA to not have it insanely shimmery
In many of the new games you have to use upscaling to get good frame rates on most GPUs.
It's just a thing.
And with most games coming up using the Unreal engine get used to it.
I remember buying the 2060 as soon as it released and was really happy with how well it worked with the highest ray tracing setting sand without everything else maxed out on Call of Duty: Modern Warfare (2019). Also, it did surprisingly well in Cyberpunk too when it first released, even though it did impact it quite harshly into the 40's and somtimes mid 30's. Now I have the RTX 4080 and I can effortlessly run Path Tracing of all things (yes with DLSS+FG enabled). Shocked how far RTX has progressed over just a few short years.
It's been 6 years....?
Can u count
What a mammoth video! Really enjoyed this. I'm rocking a 3080 so I'm excited for the 3080 video soon.
Also just wanted to note that Sometimes I struggled to take in all the data on the screen when there are so many charts on the screen for a short time. I'm not sure if there's a way you could make them bigger or show fewer of them at once though.
STOP DOING RAYTRACING
- LIGH RAYS WERE NOT SUPPOSED TO BE SIMULATED
- YEARS OF GAMING yet NO REAL-WORLD USE FOUND for bouncing light more than ONCE
- Wanted to go higher anyway for a laugh? We had a tool for that: It was called "BAKED LIGHTING"
- "Yes please give me SLIGHTY better visuals. Give me those slightly better visuals at a cost of HALF my frame rate" - Statement dreamed up by the utterly Deranged
Big Graphics Card have been demanding your money for all this time, with all the decades worth of beautiful non-RT games in your backlog.
"Hello I would like to run my games poorly please"
They played us for absolute fools!
If you think RT is "slightly better visuals" then you are either lying or blind. Sure, Not all games have amazing RT implementations like CP77 but leave that for the future. Cyberpunk looks like a different game with Path tracing on. Tech is still in its infancy and it manages to LOOK better than decades upon decades of solutions in Raster. Give it some time. It had to start sometime. Better 6 years ago than today.
@@TripleSSSz Only path traced RT actually looks better and only the 4090 can run it worth a damn.
baked lighting doesn't work for sandbox games, and in games where your framerate was high anyways it's not much of a sacrifice: for example, Minecraft.unless i'm playing competitive, 80-90fps is good enough, so why shouldn't I run RT?
@@Son37Lumiere Im Pathtracing on a 4060 ti 16gb on 1080p with 80 avg FPS. game feels smooth. Only complaint is some ghosting caused by DLSS but game looks phenominal otherwise.
@@TripleSSSz That's what a 4070 gets with dlss performance. DLSS performance looks terrible too, you're rendering at 480p from 1080p, effectively negating the benefit of RT.
You actually arent VRAM limited on a 2080ti. A 3080 using the same settings wont have the same issues. Its fill rate limited and bandwidth as well
From what ive seen in a lgr video, RT existed way back in the mid to late 90s. Only difference is that what we have now is Real Time Ray tracing. Basicly RT RT
Yes indeed! I used to use programs like POVRay, Bryce3D etc. on my old Pentium 1 back in the 90s. I used to read a book while it rendered a single frame.
@@IcebergTech tech has come a long long way. Honestly its amazing how in the 90s it took around 30mins for a single frame and now it takes less than 1 second for a whole ass map in a game to render with ray tracing
90s no such rt existed in games
@@cmoneytheman have you watched lgrs video? Also i didnt say about rt on games in the 90s and read icebergtech comment
@@flamingophone Well, that’s what this video is about, RT in games, not in general, and it’s a fact it didn’t exist until years ago for games. I saw that video earlier this year.
Trying new video effects huh...
How's the 7900 XT vs the 6900 XT for video editing?
I haven’t fully moved over to AV1, so I’m still editing mainly H265 files, and the 7900XT does seem to scrub through them a bit better in Resolve than the 6900XT did. It’s hard to compare render times as the 6900 couldn’t output AV1, but this 32 min video exported in about 16 minutes, which isn’t bad.
And the title sequence was just three frames I made in photoshop, nothing new 😁
Enthusiast hardware is always hilariously overpriced, and those who pay are highly motivated to justify the expense.
No pc gamers just destroyed any part of the enthuaist pc that was left with their crying and whining about price. You're not an enthauist just because you play pc games. Thats not what enthauist do, or about. Theyr'e about pushing hardware and things to the limits.
@@kevinerbs2778 LOL
I bought a 2080Ti when it was released, but not because of the ray tracing hype. I bought it to go to 4K in a game that had no ray tracing and still has no ray tracing. It served me well, but currently is OOC. Still trying to decide if I want to look into having it fixed.
Seriously though, how many people even own a titan or 2080ti? By far more people are going to own one of the four you crossed out at the beginning…
My first Nvidia card was the RTX 3070 I bought on release for $500 for the FE card. Using ray tracing was a hit or miss with that card at 1440p. Somes games ran decent while others was taxing on performance. Now I own a RTX 4090 I play with everything maxed out and it brings the most realistic looking picture to any game. But of course it came with a $1750 price tag to enjoy it. When deciding to upgrade from my 3070 the performance and cost didn't make sense for the $800 12gb 4070ti or $1200 16gh 4080. The best option was $1600 for the 4090. Bought the MSI liquid cooled version due to its smaller 2 slot GPU form factor to fit my case. Worth every penny. Gave my 3070 to my best friend for her first PC build and she loves it. Plays some games with RT on and enjoys it.
RayTracing...
AKA Hairworks, 10 years later.
I remember RUclipsrs trying to fake emotions while watching mud puddles in Battlefield:
"Whoa! Can you guys see this? The murky water is reflecting every ray of light from the game!! This...is...game...changing"
...
the first rt games were just reflections even those halved fps and it was a meme.What NVIDIA promised was fully pathtraced games which is still 7-8 years away
Hairworks is not RT. Both have their uses.
I personally love RT.
I've seen shadows go across a hallway and save my butt in game where the regular shadows wouldn't have cast into the hallways. It's good tech.
@@ThisisDD I meant, they're gimmicks
@@zCaptainz how is more realistic lighting a gimmick?
Hairworks worked pretty well too.
I think you're angry about nothing.
@@ThisisDD that's just some shit nvidia puts out, to push amd gpu's down.
Nobody is using that crap.
You can use it if you like "cinematic" framerates, and THAT'S FINE WITH ME.
This must've taken ages to make. Great video!
It's just a shame the graphs are so hard to understand.
The problem is, it doesn't even look better. Compare a properly crafted raster game with the RT slop that is being made now, and the difference without pixel peeping is simply not there.
And I'm saying it's slop, because you have things like the dogshit shadows in AW2, and reflections and lighting that *could* exist in games like control, but they simply don't... All to give you that false "wow" effect
RT and upscaling is partially killing gaming, and it's disgusting
Yep this is such a joke , imagine buying a video card just for the sake of "realistic shadows" - in a fucking video game - a digital product that cant be further from reality anyways.:D This is how you tell that you are a nerdy looser without actually telling that you are a looser and sitting in your Pc Room all day.
How exactly is it killing gaming?
If i remember correctly, the 5700XT beat these frames in Forza 8:27.
Six years later and ray tracing is still a marketing gimmick.
Amazing video! I have the RTX 3080 10GB and it's actually does pretty well in RT in many games that I've tried at 1440P. The thing is, if I turn it off, I typically can get significantly higher FPS just using high or optimized settings. That said, if it's a slower game, and if I can enable RT and get around 60 fps, even with some DLSS, I tend to use it.
RT hasn't shown anything of note for the 6 years it has been mainstream. It will continue to be a niche technology and eventually be baked into games like how tessellation is now with most games only using it sparingly.
It really is sad to see something so hyped for so long not do anything to move the needle in terms of graphical fidelity.
Cope. Path tracing is the future, thats why AMD are desperately trying to catch up, and why Intel made it a focus on their very first try. You're just lying to yourself at this point.
@@Wobbothe3rd come back to this stupid ass comment you made in another 6 years and just ruminate on how wrong you are.
I personally feel like that the technology just isn't there yet, also there just isn't enough affordable hardware powerful enough for it to make a big difference IMO. Its time will yet come in the future probably, unless there is discovered a better way for enhancing lightning effects etc.
What are you talking about?
Control, Metro Exodus (especially enhanced), Cyberpunk and Alan Wake 2 are standout examples of how transformative RT is to inage quality!
@@leviathan5207 Modded 2077 is transformative, not base even with RT. Alan walks 2 is a walking sim so ....sure? and metro has done it the best ill give it that but it still tanks performance for little in the way of real graphical fidelity increase.
How did the Voodoo 1 from 1996 hold up in 2002? How well did Crysis run on the GeForce 3 in 2007? How much fun was it to play Metro 2033 Last Light with an 8800 GTX in 2013? Six years is a long time in the GPU market, and gamers' expectations have never been higher than they are now.
You hit the nail on the head there, gamers expectations have gotten a bit out of hand.
ray tracing is useless just like bloom blur...
Awesome review my man! I can see why this took so long. That’s an awful lot of work you put into this and I highly appreciate it. That’s pretty wild that 16gbs of vram isn’t enough for ray tracing with Ratchet and Clank. Do you possibly have any numbers on how much vram was being used on the titan?
I’m very much looking forward to seeing the Rtx 3080ti. As I know that gen had a big upgrade. Sadly I went with a 3070 and found out pretty quick it just couldn’t do high resolution gaming with ray tracing, or even rasterized for that matter with my game selection sadly. Which is such a shame as I loved that card. If only it had come with 16gbs of vram.
so alan wake 2 RT isn't that good, it's just that baked in shadows are so bad that it makes that difference, becomes kinda funny if we look at arkham series that did great baked in shadows
also games mostly suffer from culture wars from inside, even product manager, if i recall right, said that she wouldn't take the job if she knew she were to work with alan wake 2, she wanted "strong female protag game" in Control, so she hated the game from the get go, also stated that "at every opportunity while talking with staff, [she] suggested making as many characters LGBTQIA+", i'm aware that minorities exists, but how much specifically? i'd say 20% max, more likely 10%, so when some people are trying to force and use it in political way, without a clear reason, then it becomes obnoxious, and people hate being told what to do, so pushback is and always will be natural
2080 ti is a bargain on the used market too
Raytracing turned out to be as much of a meme as megatextures
I bought a 1st gen RTX card originally purely for DLSS.
Because raytracing is cool and all, but if you flip the comparison around "rtx off+dlss" you bought a card that outperforms for the price. Though this wouldn't apply to someone buying in the titan or 2080ti budget class.
But 2060? Super? 2070... It's free real estate.
Still 2024, and most games don't support it 😮
Most new AAA games have RT features. There are 600 RTX enabled titles. Most games didn't support 3d acceleration in 1999, but anybody could see that 3d acceleration was the future back then.
Yes! Been waiting for this.
I was thinking about swooping up a Titan RTX, what do you mean they still go for $700-800?
I actually really enjoy the way Rat Tracing looks but the frame drops are just to much
20 series owners like myself got royally fked. I bought a RTX2080 Super because my GTX1080 had recently died and that was the worst decision. Not only was the raytracing performance abysmal for 1440p, even with DLSS Performance in some games it just wasn't possible to even consider Ray Tracing. Then with 30 series they got ReBar support but nooooooo Nvidia *could* have enabled it on 20 series but chose not to. But hey at least 30 series got destroyed by not getting DLSS3 Frame Gen. I got a good deal on a 4070ti Super and I jumped and bought that. So far it's great & have no complaints at 1440p.
Great work🔝! I barely imagine how much time you spent to test all those presets😆
I really appreciated that💪
Conversely, AMD is still catching up on RT performance six years later. Which is surprising since the RT is involved in so many titles at this point and are released on the AMD based consoles. Makes me think part of the issue is the AMD software and not pushing on developers to optimize for their stuff.
Right now we're at a stage where people with a decently capable setup can run decent performance in games with (one of) RT Shadows, AO, or DDGI. It's just that often developers forgo the fast techniques in favor of RT Reflections or GI that you can really put on marketing.
Another 2-3 hardware generations and whether you like it or not you'll have to own a GPU capable of high end Ray Tracing or light Path Tracing.
I believe that one day several years from now that Ray Tracing will become a more standardized and added into basically all AAA games and some higher end indies too. Right now even after 6 years in a lot of cases it's not worth bothering with Ray Tracing, but that doesn't change the fact that one day it will be the future of lighting, shadow, and reflection technology.
Don’t pick up a 3090 just go with with a 3070 vs 2080 for the masses. It’s where the volume was and dropping back a tier per generation gives a good idea of the value uplift.
You should do a deep dive into Kepler. Literally, no one ever talks about that generation because everyone is so wrapped up in how poorly it aged. Nvidia actually made a giant leap in performance with Kepler, but since Nvidia waited to release a card based on the GK110, and since the GTX 680 was based on the GK104, no one ever directly compared the flagship GF110 to the GK110. Basically a comparison of the GTX 580 and GTX 780ti. The 780ti is not that far away from being twice as fast as the GTX 580. Fermi to Kepler was a bigger performance leap than Ampere to Ada.
Interesting numbers with vram.Even 11GB is not enough at 1080P so cards like rtx3080 must be really bad today.
Very interesting topic well looked into. Thanks for that.
And I'm glad my opinion wasn't too far off of your own conclusion. The performance hit is so heavy. Maybe some people prefer the looks of ray-traced art. Okay. But throwing away framerate just for a "different" visual look (not strictly better IMO, just different)... it's a big ask!
Throwing framerate away but then recouping some of it by dabbling with upscaling isn't as much of an aesthetic ask when it works well, but rather it's a legitimate hassle and emotionally/philosophically not something everyone is going to be willing to do, which isn't nothing, even if it's harder to put an objective lens on it. As in, it's a crapshoot as to the quality of the implementation of the upscaling per game, and spending (wasting?) time figuring out if DLSS, FSR or XeSS works best in the particular title takes some time and energy away from just playing the game. So, less clear-cut than FPS due to a lack of objective metric to put to it, but it's its own weirdness that shouldn't be hand-waved away, IMO.
On the one hand, it's good to have options. And by and large we do have options. On the other, mandatory ray-tracing seems like a bad bet. I predict one of two things: Hardware catches up and it becomes a non-issue, or less of one to the point where we don't care about it in 5 or 10 years. Or else mandatory RT will be looked at as a mistake, a "wild" time back in the mid 2020s when they didn't let you turn RT off in certain games, like Star Wars Outlaws, or made it come with big sacrifices and had a less-than-first-class non-RT mode, like Alan Wake 2.
Hopefully this comment isn't so long everyone's stopped reading it, but it's the sort of in-depth discussion your video suggests, IMO. I appreciate the high quality conversation-starter and objective benchmarking content combined together -- it's a really well done video, IMO. Thanks again for that.
So, funny enough, I have a RTX 2080 Super as my current graphics card, fed by an AMD Ryzen 5 5600X. I use Ray Tracing in exactly one game, Forza Horizon 5. It plays perfectly at both 1920x1080 and 2560x1440 resolutions with Ultra settings across the board without FSR or DLSS. I do admit that my 1920x1080 monitor is 80Hz and I have V-sync enabled, the 2560x1440 monitor I normally use was limited to 60Hz. But thing is, if I can get 60 FPS or higher in most games, with steady frame times, I will take that over insane high frame rates. I don't play competitive online games, I don't require five billion frames per second, I am good with a steady 60-80 FPS. Anything steady above that is a nice bonus.
If I recall right, the 3070 could at least slightly outperform the 2080 Ti in RT except in VRAM-limited cases
i remember getting my rtx2070 and wanting to try Battlefield with ray tracing on, it surprisingly did run but it was only reflections if i remember correctly. it wasnt anything cool so i turned it off and never thought of it again. now on my 7900xtx i still dont find ray tracing visually distinct enough to be worth the slight performance hit
Im curious where you tested Shadow of the Tomb Raider, as only the 2080ti could do 60fps in that game with RT at the time even at 1080p
I wouldn't call myself "early adopter" even I've bought 2 20 series cards.
RTX 2060 in 2021 because everything else is very expensive in GPU shortage.
RTX 2070 super in this year because I'm cheap
Very good video. Well done on it and the amount of work on it.
I don't have and never had a ray tracing capable computer. So all my experience is through RUclips videos.
For me there are only two cases which make me wish I had the option to play with ray tracing.
At some level Witcher 3, but mostly Cyberpunk 2077 with path tracing.
I do hope path tracing (with ray reconstruction) becomes the norm and we finally get GPUs which will be able to handle it without having to pay 4 digits for a GPU to play it with upscaling.
Incredible how these GPUs have aged with dlss + fsr frame gen. The 2070Super is still an incredible GPU for those who bought one 5 years ago
I totally agree
no way rtx series its 6 years old...
Daniel Owen did a survey and in that survey 92% didn’t care about ray tracing and 88% didn’t care about upscaling.