10 series was the last amazing generation from nvidia Honestly i dont see myself buying any more nvidia GPUS if they keep heading this route 30 series was the limit for me And sadly AMD is starting to go this way too Perhaps its time for console gaming
Gamers Nexus has one of these, and they show it on their 4070Ti review. It also happens to be potentially the most snarky and hilarious review Steve has done yet
The fact the 4090 is so far ahead shows that Nvidia didn’t even try with the other cards. It’s blatantly obvious that due to the recent lack of competition in the GPU market on the high end they are leaving room for improvement on purpose to avoid a repeat of what the GTX 1080 was and make sure with every generation they can improve a small amount and get away with it
@@geraldmccloskey4395 They over-produced the 1080 chipset and they had to skip producing a creme de la creme GPU in that generation like TITAN, GTX Ultra, xx90ti models etc. They had too many 1080 models on stock and did not want to create a new competition with a stronger card of their own. Multiple years later they trimmed it down to 1070ti and some 1070 chipsets to 1050ti. However they produced so much and they were so on point with the performance back then, still there are more 10xx cards than any other generation.
@@zortlak Not only that, the economics of the situation are likely the biggest reason the 1000 series will remain the ultimate value upgrade, forever. 1. Planar node to FinFet The 10 series was the last nvidia GPU on a planar node, which has significantly less manufacturing cost and complexity than a more powerful FinFet node, like the one AMD were using at the time. 2. Architecture The 10 series was one of Nvidia's greatest accomplishments in terms of architecture. AMD at the time had a superior node, but their engineers could not design a more efficient architecture, even with a more efficient manufacturing process. Nvidia's maxwell not only crushed AMD's polaris in efficiency and cost, but also in performance. 3. Node Size The 10 series had an unusually small 80 class chipset, because it was all that was required to completely invalidate AMD's offerings at the time. This allowed Nvidia to create monsters like the titan who had the full die area unlocked, for the enthusiasts, and sell cheaper GPU's that performed better at cheaper prices to the masses.
@@zortlak 10 series was insane. Im still running a 7 year old 1060 i bought before the GPU price crisis hit. The performance increase they gave over the previous generation was noticeable considering the price point they were offered.
I have a GTX 1080 in my PC right now after my 2nd 2080TI failed. Still great for 1440p 60fps gaming. Everything is max detail. Sometimes FSR is even off.
@@djhokage1 GN is great for enthusiasts who want super in depth reviews, but surface level people like me, are better off just watching linus. Not to mention the production quality literally blowing GN out of the water.
I'm starting to think the same thing. But I didn't want to have to wait. I wanna try out ray tracing and high FPS, Ultra in RDR2 dammit. But I don't wanna have to sell my left foot to do it.
I think AMD is super happy with Nvidia's high prices. AMD can sell at a higher price point and if people get turned off and move to consoles well, they are indirectly buying an AMD APU. This situation is a win win for AMD.
consoles are en even worse idea, id rather just go for a deck at that point. wake me when consoles let me do basic things like use the controller i want.
It's so strange to me that AMD didn't see this as an opportunity to strike a killing blow for an entire generation of cards, which as Linus said, is lasting longer these days. Missed opportunity due to greed.
I don't believe that to be the case. In reality, Moore's law is what both of those companies are afraid of. While the cost might be relatively similar to produce these modern GPUs as it were 10 years ago, they know they can't keep improving them that much every generation as easily as they used to. So we are in some sort of GPU duopoly, where they are aware that, if they significantly drop their prices and people buy the latest ones "cheap" now, they won't be buying the expensive ones 2-4 years down the line, since there won't be that big of a difference to justify upgrading. So they have a silent acknowledgement, to not sell them as cheap as they could, since they don't plan on selling the 4000s or 7000s series at their launch, they plan on selling them 2 year down the line, when their price will be ~500$-600$ and the people with the nvidia 1000s or 2000s series are eager for an upgrade to pair with their new 1440p@240Hz/4K@144Hz OLED Mini-Led monitors.
Thats so true... My old GPU died about a month ago and I have to admit, that I never actually thought about buying an AMD GPU. However, the 7000 Series really put my "stick-to-Nvidia-mind" to the test. But with those prices... I might as well stick with scammy team green :/
amd cards are shite bought a saphire nitro 6900xt and i cant even play cs go driver timeouts every time tried everything under the sun to fix it to no AVAIL
Yes, I agree - a very thorough GPU testing job. I like that you compared across a swath of different monitor pixel densities and other parameter, and offered hints as to why various performance metrics perhaps came out the way they did. By the way, I'm pretty sure the first name of Ada Lovelace (supposedly the very first programmer) is pronounced with a long A as in Ay-duh. At least, that's how we always pronounced the DoD programming language Ada.
Honestly couldn't be happier I got into PCs right before the 10-series cards came out, I could get a 1070 for roughly $325 and I thought that was expensive back then!
I thought I was an idiot for getting my 2070 right before the 2070 Super was announced, now I see I'm one of the lucky few to still have a decent price to performance card. I could not imagine paying more than $450 for a 70 tier card, and I certainly won't entertain that notion either.
in comparison to consoles its very very expensive, i got a 1.5k rig and my girlfriend has the series x and it looks fantastic... its just limited feature and freedom wise
Got my 2080 OC from Asus new for about 700 literally two months before the the pandemic and crypto hit. It has proper DLSS support and enough power for my needs and carried me through a time where a GPU alone would cost you like an entire setup. It was pretty expensive back then and now you need to pay more than that for the "normal" version of a comparable worse card is just baffling
All the broken drivers and overheating on the 7900xt/x must be why they wouldn't even consider building for amd then I suppose. Though I do wonder how weak of a company evga had to be, being the lone wolf to exit a market that everyone else is managing to contend in. In any other industry when one company picks up their toys and goes home, it's that company. Not the one they were manufacturing for.
@@mattk6827 It was an ethics decision man. EVGA who had been a loyal and beneficial partner to NVIDIA for more than a decade decided it would not tolerate bad treatment (deteriorating ties with NVIDIA) anymore.
@@mattk6827 A companies long term succes often isnt due to high sales but consumer friendly ethics. Nobody is hating on EVGA for cutting ties they are hating on nvidia for all the shit they are pulling
@@mattk6827 EVGA never do AMD cards because AMD cards has 1/5 margin profits unlike Nvidia 5/5 Did you realize why before Ryzen, no laptop manufacturer do AMD laptop except HP??? That because nobody want to sell AMD laptop, margin profits is too low. All of this is business decision. Not because of problematic of AMD chips
@Ryuzaki Raiga what? It's not margins, it might be popularity/volume, but it's not margins. AMD literally went Chiplet to make CPUs and GPUs as good on margins as possible. Nvidia literally leaves
I remember getting a GTX 970 for under 400 euro's in 2014. It was such amazing value that I didnt mind paying more when replacing it with a 1070 when that came out. In the light of those prices, seeing this card on sale for over a thousand euro and the 4090 going for twice that is a complete mindfuck
I been running a 970 since 2016. Still a great card and can do pretty good in modern AAA games. I figured it was time to upgrade and I splurged on a 4070ti
yep i payed 300 uk pounds new at release for a 970 g1 gamer and that was the best out of the 70 series then. my 1070 amp extreme was 430 pounds and strangly the cheapest 1070 even though performance was one of the if not best. yet all the other 1070s were above 500 pounds. that wasnt so long ago and i was angry at the 100 pound increase in price. now they want me to spend 850 pounds on what really is a 4060 but called a 4070. crazy people.
Leather Jackets was cheaper back in 2014. Now at poor Jensen has to travel around the world to find quality leather jackets in a fair price. How else he can earn money to buy the latest jacket made from zebras? Only by selling over priced trash like rtx 3060/4070 for $800.
@@morgan5941 I believe they addressed this on the WAN show stating that those wouldn't age well whereas performance graphs will genuinely stay mostly the same.
@@sophieosu They don't have to age well. They just have to be the current MSRP's. Tech videos in general don't age well. Who the hell watches old WAN shows?
@@morgan5941 Well... That's the problem isn't it? If they make the graphs for the GPU msrp, it's a poor representation if there was a GPU shortage and the prices doubled. It also doesn't take into consideration board partner cards which will be over MSRP anyways because the margins for MSRP cards are non-existent.
18:29 Can confirm this is absolutely accurate, I have a gaming PC with an RX 5700. I wanted ray tracing, took one look at current GPU prices and bought a PS5.
@@ehsanzare7515 It supports it on a hardware level, but I don't expect adoption to be remain widespread due to the sheer performance costs that inevitably come with RT.
@@ehsanzare7515 Yes, that and NVMEs that use a DirectStorage-like feature were two of the main selling points of the PS5 and Xbox Series One X X Box Series One X or whatever the hell MS is calling it this time.
I can't even turn on Ray Trace on my games thanks to the FPS difference. Try 60 FPS on Witcher 3 with Ray Trace turned on. o_O I'm just glad I can play a game without my 3060 Ti spontaneously combusting into a fireball. =P
I notice that too I just didn't like that coloring as it overlapped with the color scheme of the 1% lows. If it was just gray/dimmer or something that would be nicer.
Honestly, I think Nvidia and AMD are recognizing that customers no longer need/want to be upgrading these cards every year to two years. The high end card will be able to handle next gen gaming at the end of year 5 with ease, so they simply are raising the price knowing that the customer is holding onto the device longer. Same product cycle as with iPhones- I ask for a new work phone when the battery is shot...and usually my 5 year old iPhone performs only marginally slower than a brand new iphone.
Im loving these final thoughts at the end of the reviews. Direct, honest and pretty much what a lot of people has in mind but has no far reaching voice like Linus.
8:49 For Blender benchmarks, it would be better if you labelled what renderer was used. OptiX vs CUDA makes a huge difference for Nvidia, where OptiX can take advantage of the RT cores. CUDA and HIP only use the GPU for raw compute with no special hardware acceleration. Clarifying this difference would make it much easier to check your numbers against other benchmark data.
I had intended to upgrade to the 30 series at its launch but the scalping price hike made that not feasible. And seeing the 40 series priced closer to the scalper prices of the 30 series I think my current gpu will stay in use for a while longer
Here's an idea for you guys to put the benchmarks in better context: incorporate the price somehow (both MSRP and market average if at all possible). My two potential ideas are either to add a "price per performance" line down the right, with it dipping left to indicate worse relative p/p ratio. Bonus points if instead it's a ribbon matching your color coding on the bars for p/p ratio per average/5%/1% lows, so we can spot when a card might be worse on average p/p, but we get more for our money in terms of frame-rate stability). Alternatively, if that winds up being more cluttered: when you have a specific card you're highlighting for comparison in the ranking do a simple conditional formatting style color coding of the background behind the card names (again, better if it's up to date upon final editing) to visually represent if a card is more or less expensive in relation to one-another.
Yes i need this badly. Money to performance is all that matters. Idc if its strong enough to simulate the matrix if it will cost $10billion ill never buy it. I need to see if these cards are reasonably priced for their power otherwise its just random numbers.
That’s a great idea. In the video he was comparing the 3090 Ti with the 4070 Ti and made it seem like a bad thing that the 4070 would underperform the 3090 ti but the 3090 Ti is a $2000 Gpu compared to the $800 for the 4070 Ti.
Guys, the problem is that there is no single price that they can give. Abstractly, the ideal would be them giving the price that *you* would pay. 1. Prices are different by region 2. Prices fluctuate over time, because of anything from day-to-day sales to geopolitical events 3. MSRP is just that: Manufacturer Suggested Retail Price. All three of those first words have issues: 3.1 - The manufacturers set this, and they don’t necessarily have to set it at the econ 101 ideal where supply meets demand in a perfectly competitive market; they control supply. 3.2 - It’s just a suggestion, the retailers ultimately charge what they will 3.3 - This is the retail, i.e. new, price 4 - Used prices are not just unpredictable, but unknowable. There isn’t data on what price every used card sold at; you can’t even assume the prices that you see on buy-n-sell sites because there is very probably bartering that doesn’t get seen, let alone made available by the few sites that attempt to show their sale histories 5 - Cards aren’t as fungible as we assume; 5.1 - Used cards are obviously worth less than new. And even within those, “true value” is affected by things like how much warranty the card has, intensity of use, previous operating environment, whether it comes with receipt, or was mined on, or overclocked, or cleaned regularly, comes with box and cords, etc etc 5.2 New card prices might include separate value adds, like bundled games or extra retailer warranty And the kicker is that we who might want to be surgical about price with things like price/performance are more likely in a segment where buying used makes sense, and used has the least universally knowable data A - There are sources that try to present price/performance numbers like UserBenchmark and others, and there’s good reason why nobody in the comments recomended them to you. Go look, the data’s shit.
Loving these new charts. Having the top performer consistently at the top makes them easier to understand at a glance, rather than having to rely on reading the “lower/higher is better”. Very comprehensive video!
Oh yeah, I can't remember how long that's been. I finally stopped thinking him as a review channel, just because LTT didn't do as many benchmarks anymore, so it's cool he's trying to do comprehensive benchmarks again.
Stoked I literally just picked up a RTX 3090ti from eBay for $600 and wasn’t scammed!! Had me sweating seeing this released for just a bit more and I thought I’d be leaving all sorts of power on the table. This video got me feel real good about my purchase! 😅
Wait until you see 5000 series pricing. They need people buying their old tech from 2018 and 2020. So the prices have to go higher and higher. All while killing the planet with crazy high TDP.
LTT Labs is making these charts look like testing is done by Gamersnexus, and I love it. Finally more comparisons than one-two other offerings from NVIDIA and AMD.
@@ImtheIC I love LTT but that's simply not entirely true. GN gives facts straight and all technical details are there but Steve, with all his wisdom, can sound boring. HWU same at least for me. What LTT does is give info as digestible bites that the layman can understand but the experienced can appreciate but the super techy ones will find lacking or wanting.... until Labs came in.
@@ImtheIC As much as I love LTT; Their testing breakdown is a DUMBED-DOWN version compared to GN and HWU. I'm not complaining as I know LTT targets a wider audience.
Next generation cards will be 1200 watts plus better buy a nuclear power plant before, I got me a 3070 TI for 500$ brand-new the week before the release of the 40 series at 1440p Cyberpunk 2077 Rays at high DLSS quality everything at high and ultra in between 78 and 85 FPS
I went team red as well for a similar reason and love my 7900 xt. I think $800 is fair for the performance jump I got over my 3070 which I paid almost the same amount for years ago.
they're making AMD consoles more attractive to the average consumer and focusing on making money from enthusiasts willing to spend big money. It's quite sad, really.
I skipped out on the big 2 this Gen and opted to go with an arc 16GB , TBH it hasn't disappointed. I'm not a hard-core gamer / graphics designer, but for my 32" 1080P LG it was a good upgrade from my RX580 8G. It doubled my frames in most games that I play with my sons and it renders just fine in my 3D printers software. And it seems it gets a little better with every new firmware intel pushes. And for less than $400 US. Something needs to give to bring these prices back down to earth. The high they were riding on crypto has their wires scrambled.
Exactly nvidia has lost their mind with these prices And its only alowing AMD to do the same Arc might not be a performer but damnit if it brings prices back down to earth im all for it
I really want to do the same actually I mean like... In my country atleast all other cards pretty much have trash value even second hand. And something very cool I think about arc is actually that the Ray tracing is pretty powerful! But on the other hand I'm still unsure about Intel not dropping the project they already demoted raja pretty much the head guy of the division not so long ago Not saying he did a good job or a bad job but I have not heard about anyone replacing him so that Is pretty scary.
Bro how do you even use a 32" 1080p display Unless you use it from like 2 meters it's gonna look like shit I personally have a 32 1440p lg display and this resolution is barely enough, at least for me
I stumbled upon this video and want to thank you for making it. During the 90s-2000s I built multiple computers. Last one was over 11 years ago as I slowed down gaming. Recently started to look into building a new rig and most components were inline with my expectations. Once I saw the pricing for video cards, I decided to abandon the project. I am disgusted by the pricing set by nVIDIA and will not spend any money to support this practice. I hope they get what’s coming to them.
tbh, Every single reviewer has the same take on this card. Gamer's Nexus was far, far more brutal than Linus, and pretty much no reviewer I'm away of has made any positive anything about this product.
@@JE-zl6uy I don't understand the bad reviews. The card does about 3090TI-performance for half the price and power-demand and only get beaten by AMD in non RT-games, while RT (and VR) is kind of the reason to buy such cards in the first place. And it's still just huge compare to a 2070Ti, so the price-increase is kind of justified by the size. And if the card would be 200 bucks cheaper would mean your old card it worth less as well while for that price I still get at least half the price back if I sell my 3080.
@@Leynad778 the reason is the price for the top and mid end is scaling up greatly each generation and the price-performance leap isn’t correlating with actual performance leap gotten from architecture and node shrink at the given amount of cuda cores/die size
@@Leynad778 It performs about double compared to the 2070 Super, but also costs double. What's the point then? Where's the generational improvement? I was planning on upgrading my 2070 Super, but I don't see the point honestly with 1) this price, 2) the price to performance ratio, and 3) power consumption - just the card itself consumes as much power as my entire PC with the 2070 Super. And honestly I'm not even having any trouble running games with it maxed out at 1440p. I was also hoping it would have more VRAM, AMD is really beating them with 16GB.
I think a lot of the reason why people voted they had never used ray-tracing in that poll is because they have never had hardware that supports it w/ good performance. So I'm not sure the argument that the RT performance of the 4070Ti dosen't matter much really holds up, since a lot of the people that voted 'Never' probably would enable it if they had a card as powerful as the 4070Ti
I can not stress enough how essential it is that Intel succeeds in this market. Honestly, for the average customer, the a750 is looking like a great purchase now that is drivers are improving.
Intel pisses off customers because of their inconsistency in literally everything. They are also to be blamed for making shite cpu and jacking the prices
@@yellowscarlightningscream8347 well i wouldn't trust any of these companies but I do hope they will push forward, fix drivers and release new products.
mate u have a point sure. but you are putting your hope in the most rotten company of all, Intel 🤣🤣🤣. what we need are new fresh companies with totally new architecture. Don't know, japan , korea etc. enough with this Duopoly.
@@MrFearlesskiller from their first launch it seems the main issue lies in their very undercooked drivers if they make decent gains in their drivers they could surely make a nice splash in the market
@@MrFearlesskiller Well for 300$ or so ,it is a very good card. Especially if you like RT. If you do,than then there is nothing in that price range to compare to it. Asrock Phantom Gaming D A770 is 305$. It is around 3060 normaly. But in RT it trades blows with 3060Ti
Like other people have already said, the new graph style is a much-needed improvement, helps the information clarity a lot! Also, love that you guys are always using relative performance to other GPUs now, provides a good summary of data whenever different games/programs features discrepancies.
I upgraded my gaming rig from a 1050ti (now in my plex server) to a 3060 this week - I have 3 1080p, 60hz max displays that I really don't want to replace, so this made the most sense to me. The titles I have looked at so far with RTX on - have left me with a feeling of "who cares?" It seems far too much weight is being put onto ray tracing from the manufacturers side, when the users, really couldn't care less.
they were selling GPU under third parties names on amazon and ebay during the pandemic, they are really that scum for money, probably planning another alt coin mining fever to skyrocket prices again
The LTT team is to be commended on one of the best reviews of the 4070 Ti, especially given the current climate. The comparisons across competing GPU's, extensive game and productivity benchmarks, and actual vendor based pricing considerations, sets this review at the top. The real-world commentary is refreshing and I appreciate the effort by all LTT staff involved enabling me to make informed purchasing decisions for both business and personal hardware upgrades. Well done.
@darkreaper0604 worse than that, the pour souls who paid $400 for a 3060 knowing the value was bad at the low end, because getting a card at msrp in February 2021 was nothing short of witchcraft.
3080TI was terrible pricing, their pricing has been nothing but garbage since the pandemic. they literally just doubled the prices of all their cards and kept it that way.
@@sebastianjennings1159 yeah i recently just bought a 3070ti for $300 used. It works perfectly! I think people should buy used if they cant afford MSRP then sell and buy a new one when they can. Nonetheless, buying gpus or pc parts in a whole is kind of a waste unless youre working from home. People fail to realize that video games are a complete waste of time if abused. Ill be selling this 3070 for the same price i bought it then in turn buy a new one in 2 years.
I kind of regret my used 1080 Ti purchase for 400€ though. The power consumption is insane and costs me about 1€/day. I should have spent more, even if it didn't look worth it. Really sad that even with crypto collapsing, prices are still insane.
It still blows my mind how massive these cards are, looking like a VCR in your desktop. Most of these are 4 slots now, and for what? Who are these for? I'm so glad I was able to get a 3070 (EVGA FTW3) at MSRP during the pandemic. I'm going to sit back and watch this generation, with a confused look on my face the entire time.
I think the size and the price suggests they are really struggling to innovate. Fundamentally a lot of the digital technology plateaued quite a few years ago and since then they've managed to continue to improve and refine it but nothing like the huge strides in clock speeds, ram, storage, etc that were being made from the 90s to 2010s. Ever since about 2015 they've had to find alternate ways of improving performance, some of them very clever, some of them quite crude such as just shovelling more and more and more cores onto something. But it's been a struggle. They've probably hit a wall with just how much performance they can get from fundamental technological improvements, so the only way is to make them better now is bigger and more costly. Maybe there's some other revolutionary big leap round the corner.... but probably not.
I wish I had that luck. I have been looking for a 3070 ti ftw3 for a decent price for 2 years now and I have gotten MSI Ventus 4070 ti for cheaper than the current listed price for the 30 series
@@Hantzius I put my name on a list for a computer store, and just hoped I'd get called. On my day off, I get that call, they called me and asked if I was still interested, and if I was I had an hour to get it. Hell yeah! 😆 Apparently they had like 15 in stock (3070s, 3080s) and they happened to look on the list when it got down to the last handful of cards. I got lucky and by the time I got there my card was the only one left.
Appreciate the whole upgrading bit near the end. As someone finally looking to upgrade and finally in a financial position to upgrade again, even ARC will be a massive upgrade from my 2015 GTX950M I'd been dailying for the last 7 and a half years. I think Nvidia forgets this demographic exists and if I have a good experience finally upgrading and going intel or AMD, I'll likely not bother with Nvidia in the future because too expensive
Went all AMD 8 years ago and since bulldozer "yes that mess, my system performed fine IDK why people hated it . . . " and I never went back to intel/nvidia. NO REGRETS, I'm loving ryzen and radeon, make the jump when you can :D
I want to upgrade from a GTX 770 to a used RTX 2070 Super or above (within 2000 series), but it's being very hard to find used cards at a fair price in my country. Radeons seem to be a better deal price wise, however, I need the CUDA capabilities for production work. If only AMD or Intel would step up their game in that regard...
@@ThePortuguesePlayer I managed to get a 3060ti from Nvidia at RRP so if you could do that I'd highly recommend as it trades blows with the 2080 Super.
@@asdfghjklkjhgfdsa69 I'm avoiding anything higher than 2000 series. 3000 series are generally more expensive in my used goods market and there are many other disadvantages that come with them regarding my usecase. For most other people, a 3000 series is probably a better deal, though.
The AMD 6000 series is a hell of a lot cheaper than the newest offerings from anyone and as long as you don't care about ray tracing (ie. most people) it's incredibly good value at 1080 or 1440.
I need to say, I love that your GPU comparison also includes (hopefully) realistic dimension comparisons! Whoever did opt for this on your team, please give him my acknowledgement.
Love the new graphs! .. I was thinking about how to visualize performance of different cards in different games with different settings. In addition to the individual slides you already have I would appreciate a final overview combining all data points into one view. I think a matrix with color codings might be the way to go here. You could for example increase the saturation for cells with higher scores compared to others. This would make it easier to find patterns
@@FaceyDuck I would disagree, I love GN data, analysis, honesty & professionalism, but its charts are probably not the best in readability at least with my poor 1680x1050 TN screen
@@whismerhillgaming I would also disagree, I find GN's chart readability to be perfectly fine on my screen. But LTT's charts are definitely getting better.
As a 4070 ti owner who has been gaming my entire life but has not kept up with the times the last 5-7 years. I mainly play cs go but wanted to play some AAA titles I've missed out on the last few years. I bought a 1060 6gb laptop to tie me over about 5-6 years ago and finally broke down and built a new pc. I did quick research and paired a 13600k with the 4070 ti. Have had it for about 1.5 months now and have played cyberpunk, hunt, redfall, and red dead 2. All have played awesome and I couldn't be happier. I got a little nervous after I built it that I made a mistake but after using it myself and seeing some of these benchmarks I'm happy with my choices also in my opinion the best performance for power build you can get. I don't intend on using it for 4k and will likely build another in 5-7 years. In all likelihood I'll be able to max out 1440 for that amount of time without a problem and if I have to turn a texture or two down at the end of its life I will. Should it have had 16 gb for the price well yes and I'm a little disappointed about that but the cache is like 16x more than the last gen so I think people are exaggerating the ram need. Also as games get optimized after release they will get better. Overall I'm happy and just wanted to give an opinion from someone who owns one. If I had a 30 series card I might hold out till 50 series but if you're building a new pc go for it I'm happy. And people will hate on frame gen or dlss 3 but if its single player to me it looks great and feels smooth. For something competitive I would never use something like that. For example I run medium setting at 1080 on cs go with a 165hz monitor and cap my frames at 320 because it reduces latency to .6-1.2 and uses 44 w. Also frames never dip below 200 even on the biggest spikes. CPU usage at those frames is like 10%.
I remember when I upgraded my first computer with a new gpu for 96usd and I could crank up all the settings in the latest games to ultra with no problem! Sad how we have to pay more than 10x that price today and not even being able to get a steady 60fps.
I remember trying to play games like DayZ and BF3/4 on my school laptop with integrated graphics. Then built my first tower for $400 with a AMD FX-6300 and Sapphire 270X. Boy was it magical playing a game a 1080p above 30fps. Good times.
@@sean9267 The FX-6300 :D Man, we sold more of those than any other CPU ever, and for a longer time. They just worked, and at a low price. The 270X was also a great card for the money. You made good build decisions.
@@williamdunkley5791 Anyone who rushed dark knights ruled all of azeroth. You didn't even need a GPU. Even a Tseng Labs 512KB ISA card would do the trick.
I've noticed that some charts are sorted by Average, and others are sorted by 5% low (for example Hitman 3 @ 1440p 6:01 ). This is something that's a bit confusing when trying to follow the graphs. There should really be consistency or a clear indication how the graphs are sorted in the individual game's benchmark results. Otherwise, just seems like someone didn't hit the right sort in excel when extracting the data.
Looking at the productivity graphs they seem to have sorted it by the average of all values, at 6:01 two cards have the same mean value so they have to choose
I think they are sorting by the combined scores(?) of all three lines. Whatever it is, it does make sense intuitively. High average fps is not as important if the gap between the lows and the average is way too big.
I noticed that too. Would be great to have a little up or down arrow with e.g. "1% low" in a top corner to show the sort. other than that, these graphs are much improved - very readable
I have a 1080 ti strix.... I got a 4090 gaming x trio (high oc) and wasn't all that impressed. Seemed like a negligible increase in performance across the board. Games would just utilize 50-75% of the gpu, otherwise frame rate increases again were negligible in a lot of scenarios. I sent it back.
Got a 4070TI open box - excellent from Best Buy for 750 with tax and free shipping. I wasn't initally going to get a 4070 but it just seemed to line up.
Gotta love that the price increase has doubled every gen since the 10-series. 200$ increase from 3070 sounds insane! Especially since they wont sell FE cards and the cards available to consumers most likely wont match msrp...
I feel like this is another little hiccup like to 400 series was. I think if we wait for a while it's all gonna stabilize again. We just need to stop buying them.
@@Steven-yf2ef this is because its not likely a console gamer will fight the system by purchasing an entire pc at 4 times the cost. however buying a console for a fraction the cost and saying fuch it?
When it comes to productivity I'd go for a 3090ti or even a 3090 over the 4070ti because of the VRAM, when rendering stuff in Blender etc you really don't want to run out of it since it goes quite a bit slower when it has to offload on the system memory instead.
Aaaand youre comparing a highest end card with a not highest end. Technically the performance is high but its still doesnt have 90 at the end of its name. My rtx 2070 used to be cool, today in 2023, its half the performance of 70 name.
@@fynkozari9271 The 4070ti USED to be called the 4080 then they slapped a different name on it. Are you actually stupid? Like did you just not watch the video?
If you want to productivity.. buy a 4090ti. they are cheaper than a 3090ti. and when your budget is 1200$ or so for GPU why wouldn't you go for a 4090ti..
I understand that the graphs are now more intricate due to LTT Labs being brought into the fold. But can I request that future graphs have markers that coincide with what the host is saying? I.e. @ 5:52, Linus talks about the 4070 "closing the gap among its older siblings". It would be nice to see some arrows to point to where these "older siblings" are in the graph. Otherwise, I will have to scan through the extensive graph to see where they are, making it more like a class lecture or an RTS game than a form of entertainment. I'm no PC part connoisseur, and I know some folks will say "just pause the video", but the whole idea of it being a presentation is so that I don't have to pause it and it flows along with the speaker.
I think it was quite obvious as to what Linus was referring to as "older siblings". The 4070 Ti, the card being reviewed here, is always bolded AND circled out in the graphs. I think they are doing more than enough for the graphs to be as readable as possible, all without them being too obnoxious.
@@jopa3895 We know which cards they’re talking about, but in the sea of cards, I shouldn’t have to be looking for them, I should be focusing on the text.
I think the 10x series upgrade language is directed at the steam hardware survey. a lot of 10 series cards there, so they are really trying to pull in those people, becuase they know the people that shelled out top $ for a 30x series are not going to see any value in an upgrade.
I remember when $500 was an insane price for a GPU.. 15 years ago the NV 8800 GTS was top tier and $450. Sure designs have got more complex, but at the same time manufacturing cost keep going down as well. At this point, anything above $750 for the top top tier GPU is price gouging.
Right?? The $200-250 was the sweet spot. And it got bumped by $50-150 every generation until they're now in the triple digits. 😐 The worst thing is that older generations don't come down in price when a newer generation comes out. I don't understand why... Is it the crypto mining thing or just greed? Probably just greed.
I genuinely do appreciate OS Version/Patch/Driver Version being put on visual benchmarks. It adds context. I think is the time I start looking at ebay for 3080/3090 series cards, I keep seeing some pop up starting at £500. Anything is an upgrade for me from a GTX 980
If you look at it they planned it, the 2070 super already cost almost as much as a GTX 980 cost, then a RTX 3070ti cost even more than a 980 cost, and finally get this. Even if nVidia "dropped" the price to $700, that's still them clearly planning to charge *at least* $100 more every single generation. And no, it's not fucking inflation Jesus Christ. There was no inflation making them charge that much for the RTX 2080. There was no inflation making people buy used 1080tis instead because they gouged prices so much compared to performance. The inflation (i.e. corpo price gouging when they realized everyone would just blame increased prices on inflation) hadn't set in yet when they announced that $600 MSRP for the 3070ti, a card that couldn't even perform as awell as the $580 RX 6800, and that was fully $100 above the MSRP of a card I'd already passed on 2 years prior because I thought it was a bit much, and that was when I was comparing in $50 segments. No AMD isn't better either, that is correct. AMD is just not as brazen, usually. Zen3 when they dropped the stock coolers though, that combined with adding a cost made it effectively $100 more expensive than Zen2, plus people complain about the 5600x cost compared to 2600 and 1600x and 3600x. So, I think that was trying to profit off those of us on AM4, who'd be buying it anyway (hence the cooler didn't make sense). Maybe it was a wise thing, coolers are back for Zen4 right? On AM5? But either way, I'm not praising them when they're being greedy. It's just nVidia is so brazen about it, it's like they're personally daring you not to buy it. So I won't.
30 series came out fall of 2020 or 2.25 years ago but were hard to find at retailers until spring of last year/2022. I think you meant to say 4 series OEM's stated MSRP is unobtainable.
@@pandemicneetbux2110 It all ties in with the fact that, if you have the best/fastest product and no competition, you can charge whatever you want for it. Because there's no alternative from a competitor. Plus the fact that global focus is on maximizing corporate profits at every cost, including human lives, instead of of maximizing human happiness.
@@Weshwey_ It's not without competition is the thing. In some metrics apparently there's AMD cards beating nVidia at native resolution raytracing same price bracket now. So they're even closing that gap. That's just the thing--they're not better. They're not unchallenged. Buyers are just bydlo morons, willing to be led by the nose. The 3070ti itself was bad value. It's not just the bad value of this generation all around, but even within that bad value for example the 7900XTX is clearly competing with the 4080 in performance but beating it in price. That's what has me so baffled, is the 3050 was selling for $100 more than the much, much better performing RX 6600. >but muh features A 3050 isn't strong enough to do RT regardless, it's basically a GTX level card in all but name. That's my point. They realized they can shovel any manure and someone will buy at least some of it.
I gotta say. LOVE the graphs. The 60 and 120 line keeping everything in perspective is awesome! I'd like a clearer indicator of what generation each row is and either msrp or, even better, recent sale price. That's a lot of stuff, but without it the graphs don't have clear indicators of value if you want a one stop video.
A great upgrade from 10 to 40 series? Have Nvidia forgotten about their last Gen 30 series? Like, from a 1060 3gb to a 3060 ti 8gb is a BIG upgrade and for less than $800
I've been thinking about upgrading to a 3070 ti but my 2070 runs very hot already I had to change case fans so my GPU didn't rise above 70°c so I'm a little concerned I'd have to water cool it.
That was almost exactly my upgrade path (except my 1060 was a 6gb) and man, it's pretty much all I've ever wanted. Especially because I caught it on a one-day random $549CAD sale on Amazon. All I need is 1440p because 4k gaming is basically a meme and it's perfect for my needs. Skip this generation and let Nvidia get the hint.
That's the exact path that I took. I got my 1060 3GB last year for $250 at the height of the GPU shortage era. It struggled with most games. 2 months ago saw a mint 3060ti 8GB for the exact same price and went for the upgrade. The difference is like night and day. Incredible value. No need to drop 800 for a new card
I'm still rocking a gtx 1080 I bought used in 2018 for $300. most of the 10XX series cards almost seemed like the perfect sweet spot for price to performance IMO.
Yep. That's me too. Red Dead Redemption still looks amazing at 1080p on ultra. Want a 1440p card, maybe with a high refresh rate monitor when I hop on less demanding multi-player games like Overwatch or Deep Rock Galactic, but no fucking way am I dropping this much of a bag when I already love my set up
Bought a nice PNU XLR8 triple fan 2080ti used for like $260 in October last year , I have it in my PC and finally retired my 1080ti to the second PC. For the money nowdays there are so many good deals on used GPUs that it’s tuff to ever justify buying a new GPU. Although I’m not gonna lie I don’t see the hate about the 4070ti here it’s 3090ti performance that was using 185 watts while gaming!! Like that’s amazing it’s expensive but I’d love to have one and if I was on a 1080ti still and wanted a 3090 ish card I’d buy a 4070ti for sure
As ridiculously greedy as Nvidia is, its also expected behavior from them at this point and this generation has me more disappointed with AMD. They had the chance to put out normal priced cards and actually win a larger share of the market because everyone was excited for the 7000 series to be a good performer at a fair price. Instead, they followed Nvidias lead and went for higher margins leaving us with no good options for multiple years in a row.
@@Dell-ol6hbits not reasonable at all.. The 1080ti back in the days was the best card / nvidia's flagship - high end ... and you could get it for 700-800€. Now you need to pay the same price for a 4070ti, while their new flagship is the 4090ti... the 4070ti basically stands for "low-mid" end cards... at this point. Imagine having a 3 generations newer low end card for the same price as the high end card while it can only offer 70-100% more performance.. It should bring like 200% more performance for 3/4 of the price to be on par with the 1080ti in terms of performance/money. big disappointment unfortunately
I'm happy with my $575 MSI 6800xt. Upgrading right now or even in the next couple years just doesn't seem worth the absolutely terrible pricing that they are trying to force down our throats right now. Maybe Intel will shake up the game some more in that time. Who knows.
I held off upgrading my 1080 for so long because of the crazy prices ever since the chip shortage that I mostly lost interest in gaming. Their greed cost them a customer.
I build my new pc about 2 years ago with a 5800x etc. the only thing i took from the old pc was the 1060. And guess what -it's still in there, like you i lost interest in gaming..
yes, I am not too happy with my gtx 1070. New games don't run that great at 1440p for a couple years already. Now just rarely play some older or indie titles. I can't pay 900 euros, don't want to risk with used and middle end cards cost as much as top end not long ago.
I'm beginning to think AMD is deliberately allowing this to happen so that more people are forced into the console market. They make most of the GPUs there after all.
I’m looking to upgrade from a 1080 and use a 1440p monitor and VR (especially VRChat) a lot, so despite the disappointment that was cast upon the 4070ti, it looks like the right card for people in my position especially because 10 series cards don’t have DLSS to take advantage of which can improve gameplay. Also I’ve seen a decent prebuilt with the 4070ti at $1800, which isn’t a bad price for the performance.
been there, bought the 1800 prebuilt, no regrets whatsoever. This pc is is so silent at 165Hz for all the FPS I played, I am now thinking to upgrade to a 1440p aswell because this pc can absolutely handle it. I m glad I made the purchase tbh. Just a stranger agreeing with your take.
1660 Super gamer here lol I'm also planning to build my new/2nd PC as my old one I built back in 2012 is now dead. (Rip) I want to play in 1440p on 144Hz and I actually think the 4070 Ti is the only card that is the most attractive one. (Except prices, those are 💩 anyway)
In the exact same position as you. Was gonna get a 3080 at first, but the price for a 3080 and 4070ti are the same where I'm from and the 4070ti has better performance. Kind of a no brainer
Bro its amazing for 1440p hdr. This is a great card for exactly that and that is why I built with this card. Gigabyte Gaming OC 4070 ti is phenomenal and runs super cool as well. Decent overclocking too.
Same. I managed to snag a 3080 for MSRP i'm not doing this with nvidia this generation. If AMD fixes their vapor chamber issue, i'd consider them over nvidia right now. (especially being as I kinda want to go to linux full time since gaming is getting really close there.)
Same, got my 3080 for just a bit over msrp last year and boy, it seem I got lucky with price to performance. I wouldn't pay more that $650 or $700 for the 4070ti.
I have a 3070ti and have never turned on ray tracing just because performance is too big of a hit since I don't want crippled fps lol. It feels like when we went dx8>9 I think it was around 05 with the 7000>8000 changeover too. Sad to see pricing and performance progressing the way it currently is though even though I expect fluctuation with my experience over the years.
What they and Digital Foundry will tell you is do you know how amazing RT is....although its 25fps you don't understand what you are seeing and experiencing. The truth is we are gamers and gameplay matters most...the only thing that affects gameplay is frames. Have a good one.
I have a 3070 and have literally tried finding RT settings to turn on in the games I play. Either the setting's absent or the game outright doesn't support it, in every case! *HMMMM.* Methinks NVIDIA really just does not understand how much raytracing actually matters. (It doesn't.)
@@seizonsha lmao the gatekeeping. I play some games with RT on because I like the way they look, and I don't really care for the difference between 60 and 120 fps in some games. If I play a competitive shooter, of course I'll want higher fps... but I'll also want Cyberpunk to look the best it can, even if it's at 45-ish fps. Why do people have such a hard time understanding that we can all have different preferences and that their own way of doing things shouldn't be imposed or used against others lol
I think if you've got a healthy build going, it can be easy to demonize Nvidia, but... If you own an older video card & are looking to upgrade, let's consider this: Radeon 7900 XT - $900 on Amazon. RTX 4070 Ti - $850 on Amazon. While rasterization and raw performance are important, who the Hell is buying 30 and 40 series cards, without considering Ray-tracing as a measurement of performance? In addition to that, who isn't considering DLSS or FSR as a measurement of performance? If you want to call them gimmicks (largely debatable), EVERYONE got suckered in. Until these gimmicks become less important, I think Rasterization, sadly, might have to take a back seat for a little while. If you factor in these "gimmicks" with performance and market share values, the 4070 Ti is good. It's $40 cheaper than AMD's current equivalent. BUT EVEN from a pure rasterization standpoint, the 4070 Ti still beats out ALMOST the 30 series entirely and its average FPS is 120+ in 4K - that's nothing to laugh about.
I'd like to see ARC in these charts now that they are running better with the updates they have been putting on them. They added More Direct X support so those charts should look better. Yes I know not a 4080 class gpu but still good enough for the vast majority of 1440p gamers that don't use ray tracing. Which by the charts indicate most do not use 4k nor ray tracing. So an added segment to show how much of a difference Amd to Intel to Nvidia would be helpful in the buyers choice. When it's time for my next upgrade I'm hoping Intel does more and Arc works out because that's the next Gpu I would like to try.
It is still not stable enough to give consistent results. The intel cards are very inconsistent with RAM and Processor selection. There is no benchmark for Intel Arc graphics card yet
Yes and innacurate as hell...I don't like to see the same numbers to have different sizes in graphs....like why is there 6 higher than 6 on 3090ti over the 3080....numbers are fine but the visuals are the thing that bothers me... 11:23
I'm one of the people still running a 10 series. Looking to make a new rig this year, but I'll be primarily focused on getting the best board and socket available. If I have to stick an old card into my new system until prices stabilize, then I will.
I still can' get over how big the cards are now. Soon I'll be making the jump from an old ATI 5770 to whatever I decide to go with. As life happended I got out of PC gaming for awhile and moved to console gaming. I'll be getting a new PC to hop back into PC gaming.
Yeah it's pretty nuts they basically just didn't bother optimizing for efficiency anyway and now just dump 450-600w of massive power draw on top of those chips like it's some sort of GTX 690 Mars II SLI whatever memery in those things which of course takes a pretty much full tower mesh fronted case and with a massive beefy cooler that you can't even fit the freakin thing in a full tower anymore without taking the hard drive cages out if you go with nVidia. It's frankly impossible for me to recommend them just because all of that adds up, and I quite literally can't either fit it in my case nor power it with my PSU, and I have a 850w 80+ Gold from EVGA with a full tower mesh case. Because again, the hard drive cages makes it not have enough room. Good luck with your front mounted rad either. At least AMD still has a sane 3 slot or less, 2x8pin design that doesn't catch itself on fire.
I'm finally retiring my 1070 for the 4070Ti. At $800 it really is the best value when compared to what's out there, if you don't want to go used. I think my 1070 is staring to die as my computer is crashing now.
@@MaxVids1 i feel you man, not gonna judge you, and it sucks to saay that 4070ti is better for the price at the moment. i wish you get your 4070ti for a good fat discounted price and upgrade in future to 6000 or 7000 series, or better an AMD if reds finally beat the 80 and 90 series cards with big numbers.
@@MaxVids1 me too, mine 1070ti is performing really good but im tired of waiting. Ive waited through whole 2xxx series, ppl told me to wait for 3xxx. Waited through them as well because ive heard and seen how big gap there will be with 4xxx series. Now that the 4070ti is out ive already ordered it just because im tired of waiting for price drops and other "better options". I feel like at this moment the 4070ti is a pretty good choice
I'm just gonna stick with my second hand 30 series card, runs all the games I want and probably will for several years. I thought about getting one of the 40 series cards, but once I heard they were gonna price hike them at MSRP, like what the GPU shortage did for second hand cards, I decided it's better to settle than break the bank and give card makers any more reason to price hike cards that would be far cheaper in a world that had no GPU shortage.
It's a weird prospect to even upgrade every GPU cycle anyway. Unless you had to mega cheap out and get the 3050 or something, I see no reason why anyone should bother to refresh their GPU for the few percent difference. But hey, if you're rich enough, I suppose..
@@HotdogSosage bought a 3060ti for about $300 used upgraded from a 1050ti. Was gonna wait for the 40 series to come out and willing to spend a lot more for the upgrade, but I settled for a cheaper decent upgrade when I heard the inflated MSRP prices the 40 series was gonna be.
To be fair, what percent of those who never used ray tracing also never had a card good enough to expect reasonable performance from it? I had a 1070 before my 3080, so I had no expectation of good performance with it on unless the game otherwise was ridiculously easy to run.
My thoughts exactly, it's kind of offputting the fact that he... didn't even recognise this fact? I used to have a 2060 until recently, and I couldn't even think about using ray tracing, so obviously I didn't, but now that I upgraded, I always do
Agreed. I went from a 1660Ti which couldn’t handle RT to a 3060 that still honestly couldn’t but I could at least get a look at what it might be like. Once I saw it actually in play I went for the 4070Ti and I am fully convinced it’s worth it. Granted I know it isn’t for everyone.
Let's also just admire how far AMD has come in regards to raytracing from just a few years ago. The 7900XT now nearly matches the 4070 Ti in raytracing while being just slightly more expensive. With the huge lead Nvidia had just a few years ago that's really admirable.
i dont know the rtx 4070ti is just garbage, like the ddr4 gt 1030 the 4060s seam also to be a new vram experiement also the cut down pcie lanes that have been rumored asking for the quality of a 1050(100 dollar by the way) on a 350-400 dollar price point is outrageous apparently
I don't see it, 4070 ti has 15% advantage over 7900 XTX in heavily raytraced Cyberpunk 2077. 4070 ti is, mind you, a REALLY tiny GPU chip and a THIRD from the Nvidia GPU stack behind 4080 and 4090. This is at 4K which is not the favored resolution for this card as it would be preferably tested at 1440p. Cyberpunk 2077 has none of the Shader Execution Reordering or Opacity Micromaps implementations that were introduced in Ada Lovelace (RTX 40) architecture. Not even the preview build with DLSS3 has them, as that's part of the RT Overdrive update that's probably many months away.
I have a 2070 Super and skipped 30's series hopping that 40's would have a relation between price and performance. But as you said, the industry just became a shame. Best regards from Brazil my friend, and thank you very much for this video. Edit: Thanks a lot for the answers guys. Now I known I'm not alone. For some background: I bought the 2070 for $500 in cybermonday in 2018, but it had the artifact bug, I send it to Asus to repair and they send me back an 2070 Super because they didn't have available a 2070 in stock. Before buying the 2070 I had an Radeon R9 280x. I don't change GPUs every year, I just change when one of these two things happens: One, my games stop running at high settings at least at 40 - 50 FPS average, 4k (now I play in my 50 inch TV that have 60hz). Two, I change my GPU when I have a good price/performance opportunity to at least increase 50% or more the performance and keep playing for a long time without have to worry about lower my graphics. I Play Games Like, RDR 2, Cyberpunk, Sekiro, Forza Horizon, Elden Ring and I'm waiting for Diablo 4 and I Know that I already have more than the recommended gear to play it. I can wait a little bit more to not give money to this greed corpo market. One more time, tks for your comments and thoughts. I appreciate.
I would get a 6k series if you really need an upgrade. Sound like you have a 1440p or 1080p display. So you want hurt too bad. Still if you can’t afford it like me, skip these gen and save your Pennie’s now for the next gen. I’m sure it will be expensive as well. I’m saving now. $10 hear and there for the next 2 years I guess
Yeah the pricing is nuts, I just upgraded from a 2080 Ti to a 4090, and am still debating now whether to go 13900K/ KS, or just wait until fall for the 14900K since it’s going to be a new socket most likely.
Even before the good-bad-good cycle was mentioned I had already decided that I’ll wait for the 50-series before upgrading my 2070. Crazy to think that I always felt that high-end PC gaming was expensive and unobtainable for a long time, but looking back to my first build with that 2070 card now it feels like everything was great value 😂
still running a 1060 myself. and a i7-3770 lol. still runs everything well enough for 1080p so no point wasting thousands to upgrade and play at the same settings anyway.
As a student studying game development and constantly doing work in unreal engine and blender I very much appreciate the blender benchmarks! Looks like I will be holding on to my 1080 though. It's funny to me that nvidia thinks that 800 is the price to set an upgrade for the budget college kid card I spent 300 on during the crypto mess. Feels like nvidia tried to force out rtx, thinking it would win them market supremacy. Issue is that the college kids like me that are the future of game development couldn't afford effective rtx in the form of a 2080 ti. So they came out with the 30 series, which nobody could get thanks to crypto. nvidia needed to get rtx in the hands of as many creators as possible to drive the technology forward, but failed to do so. Now here they are trying to recover the loss on their investment by just jacking up prices, when frankly if they made a 4070 ti without rtx that cost 200 less it would be far more compelling.
The funny thing is, NVidia's like oops we figured out how to not have shit algorithms for RTX(after YEARS)... oh look 2-3x more performance...whoopsie!...new series. I really think this new breakthrough was some young college guy who ACTUALLY studied older computer tech and was like, uhh they did this thing for cpus back in the day why are we not doing this for gpus? You know, order stuff properly....DERRRRP. Imagine if Intel charged $1600 for their current i9 cpus.
Do yourself a favor and get out of computer gaming (gaming in general if you can imo). There's no joy in it and you're just nickel and dimed for everything.
What I find most shocking is how good the 3080 performance is in comparison to the 4070 ti despite the price. Glad I got a 3080 for a good price when I did
yeah dude I've been feeling the same thing I was fortunate enough to get a TUF 3080 for $699.99 due to living 10mins from a microcenter, almost exactly 2 years ago. This card still killing it so hard I will prob wait another 2 years at least before upgrading.
Yep. Still kicking myself daily for not picking one up early on, 3-4 mos. before I was actually ready to build the PC it would've gone into. By the time I was ready to purchase, it would have more than doubled the price of the system I built 2 years ago.
I'm glad I got a 6700 xt. I know it doesn't compare to the other cards, but I got the Sapphire Pulse version for $410, and now it's $550. I actually do need the 12 gigs of vram, and AMD is better value for money, since I personally don't need any of the bs features from Nvidia like DLSS or RTX (I play at 1080p with mostly nonRTX games)
Idk how I could be any sooner Edit: While I may have LTT's attention... Alex is the bees knees. More videos making wack projects would be much appreciated!
Finally! Wattage graphs! They are so important and LTT is finally doing them! I hope to see a future video comparing wattage use with fps/scores for many gpus and cpus.
When it comes to efficiency, leading desktop gpus give reason for big concerns. The current overheat problems of the latest nvidia and amd top tier gpus are troublesome for a stable and safe use. Opposing to that It is astonishing what mobile gpu developers like imagination technologies, who also helped with apple's m1, can do with a fraction of the electrical power. They published very efficient ray tracing gpus, long before nvidia hyped gamers for that.
If I was a NVIDIA simp and finally got sick of their sh*t, I would not switch to console. I would switch to AMD. They’re not perfect, but at least I’m not being robbed.
I got a 3060 at a very good price last year, but when I have to look for a new GPU in a few years I'm gonna seriously consider going back to consoles. Buying a new GPU, be nvidia or AMD, for more than a PS5 to get a similar experience is simply unnaceptable.
In what way is the experience similar? Console plays its games, generally, at a minimum level of acceptable quality. And that's without the options to do all the other things that come with pc. Not being able to use something like discord is the big no for me on experience difference. Not including everything else you sacrifice for console
I share your sentiment my friend. I subscribed to nvidia geforce now (the priority tier) and I'm just playing my games from the living room and it's really good for what it is. I will be trying the exclusive tier with 4k and see how that is. I own a xbox series x and a pc but I will sell the pc and just use my console instead alongside geforce now to play any pc related games or games I haven't finished. Everyday, I have less and less reason to own a pc.
I'll probably just wait and upgrade from my 1070 system next year. Still a killer rig for 1080p gaming and moderate workloads. Raytracing isn't beneficial enough yet for me to spend that much and see little to no difference in frames the way I play at 1080p.
Even for me, I have a 3060, to this day since 2 years ago, when I bought it, I have *never* experienced ray tracing, it's just in such few games anyways, and gives 0 graphical improvement for almost all performance lost! It's a fuckin marketing strategy, have useless features. _I wonder where they learned this from.. perhaps.. apple??_
There wasn’t a 1600 series tho…1600 named cards was part of the lower end models for the RTX 2000 series without RTX cores. Instead of being an RTX 2050 for example it was a GTX 1650. Same series just o RTX cores and only had two product. 1650 and 1660 and the variants/super refreshes of each. Never was a series.
I actually really appreciated the 1080ti comparison, still rocking mine. I’m getting ready to build a new pc, and have been curious of how big of a jump I was going to get with a new card
They did compare it however they misrepresented it, they only showed 4k/1440p ultra/high. Nobody on a 1080ti is maxing 1440p graphics these days so it's just a dumb comparison, never mind playing at ultra is just stupid anyways. 1080ti is still holding up strong, it can easily reach 60fps on mw2, not that they show that.
@@MrHennoGarvie I’ve been super happy with it, that’s for sure. It’s runs a lot of games @ 3440x1440, 100 fps, @ high graphical setting. The shooters don’t really count to me, I’ve always done custom setting to maximize the fps and get the clarity I want. Still does amazing for the games I want to be maxed out (or as close as I can with out a huge it to fps). I just don’t want to feed the Green machine, they prices are ridiculous. Had high hopes for AMDs latest cards. Wish they would have stuck it to Nvidia prices wise though. Shit I’d even consider Intels gpu’s if they had a higher end version that fit my needs.
@@aeromech2155 1080ti is the best card NVDIA ever made. Best card when it was released, best card for its price and best card for longevity. 699 when it was released...
@@Alpay283 6 months after the Titan X Pascal purchase for work this beast made me cry a little inside. I spent $1200, they release the 1080 Ti to beat AMD for $699. What a monster, still!
@Zero I have yet to build my rig so I can’t give you a review of it. My build list has changed now. I’m about 90% certain that I will go with the 7900xtx. My 850w psu from my old build will power the system happily, which saves me money. I thought I wanted ray tracing but now that I have done more research into it, most people can’t see a major difference, and it’s a decent hit to performance. Any one out there with a RT card is welcome to correct me considering I don’t actually have the capability to RT at the moment.
something nvidia seems to keep overlooking when it comes to their overreliance on RayTracing in performance graphs, is that unlike previous generations (pascal and maxwell especially), mid-tier 50 and 60 series cards keep coming out significantly later than the 80 and 90 cards (and especially with 20 series, had poor if any DLSS/RT performance, made worse by the 50 and 60 cards standing to benefit from DLSS the most), even though the 50 and 60 cards are historically purchased by a lot more people because way less people buy the cards that can actually run these types of games, a lot less devs have proper RT/DLSS integration, so only people buying AAA graphical showcase games and a 90 series card (a very small subset of pc gamers) will use those features often, making the excessive prices hard to justify to most people edit: I actually appreciate that LTT is taking note of the RayTracing usage problems, because reviews in the last couple years often placed Nvidia cards over AMD cards because of RayTracing performance, even when the equivalent AMD card had comperable if not better rasterization performance (what the vast majority of people and games will actually use, at least for the foreseeable future. and as we all know, "futureproofing" is never a safe bet)
Plus, for anyone running a 50/60/70 card, you've basically got a choice - turn RT on and turn a lot of other settings down (say, high to medium, turn down resolution, go to a lower-quality DLSS, etc.), and still have a drop in FPS - or keep RT off and have higher frame rates and rasterization quality. Having had my 3080 for over 2 years, in those situations, I'll say what I'll normally do is keep RT on for a bit to look at pretty lights then turn it off once I'm actually getting deeper into the game.
Yeah its weird, like i´ve got an RTX 2070 Super that i bought in 2019 for like 520€, and honestly, the only time i really "used" Raytracing was when Minecraft RTX came out, to experiment a bit, but im playing on a 1440p 144Hz display, and i´d rather have high frame rates then some eyecandy. The only feature i use whenever its avalable is DLSS, cause its basically free perfomance, with little to no downsight, especially when using the "Quality" mode.
I'm honestly so tired of seeing ray tracing and DLSS performance... I judt want to see pure rasterization benchmarks to see what they've ACTUALLY done. Not this fake frames and fancy lighting bullshit.
@@Aqueox DLSS performance is actually important to show how far they've come, although I agree frame generation is terrible (If you look it up you can see just how terrible it is, absolutely ruins the whole idea of DLSS improving performance with minimal quality loss), ray tracing is important too because it shows how far we can go without having to use pre baked lighting and other lighting tricks
I’m really glad you’re including productivity programs in your charts now. I game, sure, but I do more than that and it helps get a more holistic view of gpu performance.
Agreed, but benchmarks don't say it all when it comes to productivity. This video makes it seem like the 4070ti might be a worthy alternative to a 3090/3090ti for 3D renderings, but it comes with HALF the vram. This means that heavy scenes that would use 51% of a 3090's vram would literally just crash on a 4070ti. This is a BIG problem, considering the 4080 still has only 16gb and the 4090 is out of everybody's budget plus you need to add a new PSU on top of it.
has anyone else noticed their gas and groceries cost lately? and anything on amazon? I've started looking back at prices i paid for consumable items 3 years ago and its insane. Everything is up 30%, at LEAST. Everything. oh wait ...
As time goes on, EVGA's decision to step away from GPUs makes more and more sense.
thoughts on the arizona water crisis
@@TheLastBeanBender liquid
@@TheLastBeanBenderwater
they are losing out on millions of dollars over something that has been an industry standard for decades now.
Ctrl + C, Ctrl + V
As a GTX 1080 user I am very grateful that you guys put these old GPUs into your charts! Thank you!
I would pay more than double what I paid over 5 years ago for about double the performance?
Something doesn't seem right.
Same my GTX 1080 has been beast for the past 6 years.
My 1080 is still going strong
10 series was the last amazing generation from nvidia
Honestly i dont see myself buying any more nvidia GPUS if they keep heading this route
30 series was the limit for me
And sadly AMD is starting to go this way too
Perhaps its time for console gaming
Im on EVGAs 1080ti FTW3. It is still a super strong card and with AMDs FSR the card feels even better, new life in to old GPUs, YEP please
I would like to see a FPS vs. PRICE scatter plot with a trendline to see which products fall above or below the line.
This would be nice
Gamers Nexus has one of these, and they show it on their 4070Ti review. It also happens to be potentially the most snarky and hilarious review Steve has done yet
Literally the only thing that matters for a lot of people
This! Every time in every video. Same game(s) and updated frequently.
we all would like that!
The fact the 4090 is so far ahead shows that Nvidia didn’t even try with the other cards. It’s blatantly obvious that due to the recent lack of competition in the GPU market on the high end they are leaving room for improvement on purpose to avoid a repeat of what the GTX 1080 was and make sure with every generation they can improve a small amount and get away with it
what are you referring to with the 1080? i wasnt around during that time. what happened?
@@geraldmccloskey4395 They over-produced the 1080 chipset and they had to skip producing a creme de la creme GPU in that generation like TITAN, GTX Ultra, xx90ti models etc. They had too many 1080 models on stock and did not want to create a new competition with a stronger card of their own. Multiple years later they trimmed it down to 1070ti and some 1070 chipsets to 1050ti. However they produced so much and they were so on point with the performance back then, still there are more 10xx cards than any other generation.
@@zortlak Not only that, the economics of the situation are likely the biggest reason the 1000 series will remain the ultimate value upgrade, forever.
1. Planar node to FinFet
The 10 series was the last nvidia GPU on a planar node, which has significantly less manufacturing cost and complexity than a more powerful FinFet node, like the one AMD were using at the time.
2. Architecture
The 10 series was one of Nvidia's greatest accomplishments in terms of architecture. AMD at the time had a superior node, but their engineers could not design a more efficient architecture, even with a more efficient manufacturing process. Nvidia's maxwell not only crushed AMD's polaris in efficiency and cost, but also in performance.
3. Node Size
The 10 series had an unusually small 80 class chipset, because it was all that was required to completely invalidate AMD's offerings at the time. This allowed Nvidia to create monsters like the titan who had the full die area unlocked, for the enthusiasts, and sell cheaper GPU's that performed better at cheaper prices to the masses.
@@zortlak 10 series was insane. Im still running a 7 year old 1060 i bought before the GPU price crisis hit. The performance increase they gave over the previous generation was noticeable considering the price point they were offered.
I have a GTX 1080 in my PC right now after my 2nd 2080TI failed. Still great for 1440p 60fps gaming. Everything is max detail. Sometimes FSR is even off.
Give whoever designed the new performance graphs a raise, so much better than what you've previously used!
Yeah they flow better, I know what's going on quicker than I did with the old graphs lol they're also much more aesthetically pleasing
It's a lot more informative, a lot less biased too
Hard agree. Beautiful graphs! Can read them at a glance and they're super easy to understand.
@@djhokage1 GN is great for enthusiasts who want super in depth reviews, but surface level people like me, are better off just watching linus.
Not to mention the production quality literally blowing GN out of the water.
dude they have dotted lines overlapping the numbers.
the graph looks nice but it's form over function. this is a step down in graphs
I feel like my 1080ti might last me till the end of the decade with pricing like that
Honestly same and I picked up a steamdeck and it's a beast for a handheld soooo I think I'm good for a while
I also have a 1080ti ftw3 in my main setup if I didn't clarify that part still kicking butt 😅
Just pick up a Radeon 6900xt, you can occasionally find them new for $550.
Same. I'd love to have ray tracing in certain games... But not at this price point and think I'd rarely use it.
I'm starting to think the same thing.
But I didn't want to have to wait.
I wanna try out ray tracing and high FPS, Ultra in RDR2 dammit.
But I don't wanna have to sell my left foot to do it.
I think AMD is super happy with Nvidia's high prices. AMD can sell at a higher price point and if people get turned off and move to consoles well, they are indirectly buying an AMD APU. This situation is a win win for AMD.
@@AXEPLAINFILM wait a month and pay 50
Yeah. They got to increase their halo product by 300 dollars and still be the cheaper competitor in comparison.
ill never go back to console
@School Film you think that's stopping me
consoles are en even worse idea, id rather just go for a deck at that point.
wake me when consoles let me do basic things like use the controller i want.
It's so strange to me that AMD didn't see this as an opportunity to strike a killing blow for an entire generation of cards, which as Linus said, is lasting longer these days. Missed opportunity due to greed.
It's called price fixing, collusion...both have been caught in it before. Keep prices high so both can profit, AMD likely won't undercut by much.
I don't believe that to be the case.
In reality, Moore's law is what both of those companies are afraid of.
While the cost might be relatively similar to produce these modern GPUs as it were 10 years ago, they know they can't keep improving them that much every generation as easily as they used to.
So we are in some sort of GPU duopoly, where they are aware that, if they significantly drop their prices and people buy the latest ones "cheap" now, they won't be buying the expensive ones 2-4 years down the line, since there won't be that big of a difference to justify upgrading.
So they have a silent acknowledgement, to not sell them as cheap as they could, since they don't plan on selling the 4000s or 7000s series at their launch, they plan on selling them 2 year down the line, when their price will be ~500$-600$ and the people with the nvidia 1000s or 2000s series are eager for an upgrade to pair with their new 1440p@240Hz/4K@144Hz OLED Mini-Led monitors.
Thats so true... My old GPU died about a month ago and I have to admit, that I never actually thought about buying an AMD GPU. However, the 7000 Series really put my "stick-to-Nvidia-mind" to the test. But with those prices... I might as well stick with scammy team green :/
Believed the hype, bought an AMD card 7900XT, it has weird motion and input lag compared to my GTX 1080. Specifically in Fortnite.
amd cards are shite bought a saphire nitro 6900xt and i cant even play cs go driver timeouts every time tried everything under the sun to fix it to no AVAIL
This was the best GPU review you have done in years, maybe ever.
Thank you and keep it up! Love the labs influence going up
but no 3060 Ti or 3080 Ti numbers😐
GN's was better though XD.
all of these GPU reviews are basically the same. same talking points
@@joshuarowe8410 what kind of story telling adventure are you looking for with graphics card reviews?
Yes, I agree - a very thorough GPU testing job. I like that you compared across a swath of different monitor pixel densities and other parameter, and offered hints as to why various performance metrics perhaps came out the way they did.
By the way, I'm pretty sure the first name of Ada Lovelace (supposedly the very first programmer) is pronounced with a long A as in Ay-duh. At least, that's how we always pronounced the DoD programming language Ada.
Honestly couldn't be happier I got into PCs right before the 10-series cards came out, I could get a 1070 for roughly $325 and I thought that was expensive back then!
I thought I was an idiot for getting my 2070 right before the 2070 Super was announced, now I see I'm one of the lucky few to still have a decent price to performance card. I could not imagine paying more than $450 for a 70 tier card, and I certainly won't entertain that notion either.
im confuse you saying you can get a 1070 for $325 now? Because on ebay at least in the US, I see as low as $100.
in comparison to consoles its very very expensive, i got a 1.5k rig and my girlfriend has the series x and it looks fantastic... its just limited feature and freedom wise
Got my 2080 OC from Asus new for about 700 literally two months before the the pandemic and crypto hit. It has proper DLSS support and enough power for my needs and carried me through a time where a GPU alone would cost you like an entire setup. It was pretty expensive back then and now you need to pay more than that for the "normal" version of a comparable worse card is just baffling
I’ve got my 1080 around 450-500 pounds back in the days and it’s an EVGA. Best card, crazy that a 4080 is 3-4x more
Now I understand why EVGA decided to cut ties with NVIDIA. They saw this coming a million miles away.
All the broken drivers and overheating on the 7900xt/x must be why they wouldn't even consider building for amd then I suppose. Though I do wonder how weak of a company evga had to be, being the lone wolf to exit a market that everyone else is managing to contend in. In any other industry when one company picks up their toys and goes home, it's that company. Not the one they were manufacturing for.
@@mattk6827 It was an ethics decision man. EVGA who had been a loyal and beneficial partner to NVIDIA for more than a decade decided it would not tolerate bad treatment (deteriorating ties with NVIDIA) anymore.
@@mattk6827 A companies long term succes often isnt due to high sales but consumer friendly ethics. Nobody is hating on EVGA for cutting ties they are hating on nvidia for all the shit they are pulling
@@mattk6827 EVGA never do AMD cards because AMD cards has 1/5 margin profits unlike Nvidia 5/5
Did you realize why before Ryzen, no laptop manufacturer do AMD laptop except HP??? That because nobody want to sell AMD laptop, margin profits is too low.
All of this is business decision. Not because of problematic of AMD chips
@Ryuzaki Raiga what? It's not margins, it might be popularity/volume, but it's not margins. AMD literally went Chiplet to make CPUs and GPUs as good on margins as possible.
Nvidia literally leaves
I remember getting a GTX 970 for under 400 euro's in 2014. It was such amazing value that I didnt mind paying more when replacing it with a 1070 when that came out. In the light of those prices, seeing this card on sale for over a thousand euro and the 4090 going for twice that is a complete mindfuck
I been running a 970 since 2016. Still a great card and can do pretty good in modern AAA games. I figured it was time to upgrade and I splurged on a 4070ti
yep i payed 300 uk pounds new at release for a 970 g1 gamer and that was the best out of the 70 series then. my 1070 amp extreme was 430 pounds and strangly the cheapest 1070 even though performance was one of the if not best. yet all the other 1070s were above 500 pounds. that wasnt so long ago and i was angry at the 100 pound increase in price. now they want me to spend 850 pounds on what really is a 4060 but called a 4070. crazy people.
@@ShootLuckGaming still on a 970 myself. Running strong for the games I play.
Leather Jackets was cheaper back in 2014. Now at poor Jensen has to travel around the world to find quality leather jackets in a fair price. How else he can earn money to buy the latest jacket made from zebras? Only by selling over priced trash like rtx 3060/4070 for $800.
did you play in 4k in 2014?
I actually really like the new testing graphs they seem far more professional and give a better overview of performance
I wish they did a price-to-performance graph like Tom's Hardware used to do.
@@morgan5941 I believe they addressed this on the WAN show stating that those wouldn't age well whereas performance graphs will genuinely stay mostly the same.
@@sophieosu They don't have to age well. They just have to be the current MSRP's. Tech videos in general don't age well. Who the hell watches old WAN shows?
@@morgan5941 Well... That's the problem isn't it? If they make the graphs for the GPU msrp, it's a poor representation if there was a GPU shortage and the prices doubled. It also doesn't take into consideration board partner cards which will be over MSRP anyways because the margins for MSRP cards are non-existent.
You want to impress me? Try this suck with ALL of the Witcher 3's Next Gen features turned on. Including Ray Trace. =P
18:29 Can confirm this is absolutely accurate, I have a gaming PC with an RX 5700. I wanted ray tracing, took one look at current GPU prices and bought a PS5.
does PS5 support ray tracing?
@@ehsanzare7515 no
@@ehsanzare7515 It supports it on a hardware level, but I don't expect adoption to be remain widespread due to the sheer performance costs that inevitably come with RT.
@@ehsanzare7515 It does, just obviously not to the extent of PCs with new flashy GPUs like the 4090
@@ehsanzare7515 Yes, that and NVMEs that use a DirectStorage-like feature were two of the main selling points of the PS5 and Xbox Series One X X Box Series One X or whatever the hell MS is calling it this time.
Can I just say whoever came up with the dotted line designating 60 fps, that's hugely helpful in getting the data in a glance.
I found them quite distracting when trying to compare individual GPUs though.
I can't even turn on Ray Trace on my games thanks to the FPS difference. Try 60 FPS on Witcher 3 with Ray Trace turned on. o_O I'm just glad I can play a game without my 3060 Ti spontaneously combusting into a fireball. =P
I notice that too I just didn't like that coloring as it overlapped with the color scheme of the 1% lows. If it was just gray/dimmer or something that would be nicer.
Honestly, I think Nvidia and AMD are recognizing that customers no longer need/want to be upgrading these cards every year to two years. The high end card will be able to handle next gen gaming at the end of year 5 with ease, so they simply are raising the price knowing that the customer is holding onto the device longer.
Same product cycle as with iPhones- I ask for a new work phone when the battery is shot...and usually my 5 year old iPhone performs only marginally slower than a brand new iphone.
Im loving these final thoughts at the end of the reviews. Direct, honest and pretty much what a lot of people has in mind but has no far reaching voice like Linus.
No one else in the world has a voice like Linus....
8:49 For Blender benchmarks, it would be better if you labelled what renderer was used. OptiX vs CUDA makes a huge difference for Nvidia, where OptiX can take advantage of the RT cores. CUDA and HIP only use the GPU for raw compute with no special hardware acceleration. Clarifying this difference would make it much easier to check your numbers against other benchmark data.
Agreed came here just for the blender bench marks
I had intended to upgrade to the 30 series at its launch but the scalping price hike made that not feasible. And seeing the 40 series priced closer to the scalper prices of the 30 series I think my current gpu will stay in use for a while longer
its so weird like how are the 3080s the same price still as the new gpu's ....
1070Ti go BRRRRRRRRRRRR
why not get a 30 series now then?
Currently have a 1080 ti, probably will move to a 6800xt. Those are around $475 used right now and that seems reasonable to me.
@@Dustmuffins My 1070 Ti is still chugging, FSR does a lot for those cards
Here's an idea for you guys to put the benchmarks in better context: incorporate the price somehow (both MSRP and market average if at all possible). My two potential ideas are either to add a "price per performance" line down the right, with it dipping left to indicate worse relative p/p ratio. Bonus points if instead it's a ribbon matching your color coding on the bars for p/p ratio per average/5%/1% lows, so we can spot when a card might be worse on average p/p, but we get more for our money in terms of frame-rate stability).
Alternatively, if that winds up being more cluttered: when you have a specific card you're highlighting for comparison in the ranking do a simple conditional formatting style color coding of the background behind the card names (again, better if it's up to date upon final editing) to visually represent if a card is more or less expensive in relation to one-another.
Yes i need this badly. Money to performance is all that matters. Idc if its strong enough to simulate the matrix if it will cost $10billion ill never buy it. I need to see if these cards are reasonably priced for their power otherwise its just random numbers.
this could also be done with power consumption would be appriciate as i am from germany and our government is just plain stupid.
That’s a great idea. In the video he was comparing the 3090 Ti with the 4070 Ti and made it seem like a bad thing that the 4070 would underperform the 3090 ti but the 3090 Ti is a $2000 Gpu compared to the $800 for the 4070 Ti.
@@jamestaylor9782 100% like this guy on "youtube" doesn't make the connection..
Guys, the problem is that there is no single price that they can give.
Abstractly, the ideal would be them giving the price that *you* would pay.
1. Prices are different by region
2. Prices fluctuate over time, because of anything from day-to-day sales to geopolitical events
3. MSRP is just that: Manufacturer Suggested Retail Price.
All three of those first words have issues:
3.1 - The manufacturers set this, and they don’t necessarily have to set it at the econ 101 ideal where supply meets demand in a perfectly competitive market; they control supply.
3.2 - It’s just a suggestion, the retailers ultimately charge what they will
3.3 - This is the retail, i.e. new, price
4 - Used prices are not just unpredictable, but unknowable. There isn’t data on what price every used card sold at; you can’t even assume the prices that you see on buy-n-sell sites because there is very probably bartering that doesn’t get seen, let alone made available by the few sites that attempt to show their sale histories
5 - Cards aren’t as fungible as we assume;
5.1 - Used cards are obviously worth less than new. And even within those, “true value” is affected by things like how much warranty the card has, intensity of use, previous operating environment, whether it comes with receipt, or was mined on, or overclocked, or cleaned regularly, comes with box and cords, etc etc
5.2 New card prices might include separate value adds, like bundled games or extra retailer warranty
And the kicker is that we who might want to be surgical about price with things like price/performance are more likely in a segment where buying used makes sense, and used has the least universally knowable data
A - There are sources that try to present price/performance numbers like UserBenchmark and others, and there’s good reason why nobody in the comments recomended them to you. Go look, the data’s shit.
Loving these new charts. Having the top performer consistently at the top makes them easier to understand at a glance, rather than having to rely on reading the “lower/higher is better”. Very comprehensive video!
Oh yeah, I can't remember how long that's been. I finally stopped thinking him as a review channel, just because LTT didn't do as many benchmarks anymore, so it's cool he's trying to do comprehensive benchmarks again.
Stoked I literally just picked up a RTX 3090ti from eBay for $600 and wasn’t scammed!! Had me sweating seeing this released for just a bit more and I thought I’d be leaving all sorts of power on the table. This video got me feel real good about my purchase! 😅
And it's even better because Ngreedia doesn't get a cent of your money!
Damn, nice job. That card will do you well for at least 5 years. Don't be pining for the next thing, just enjoy what you've got :)
They are £1000 used here
Definitely a steal
Wait until you see 5000 series pricing. They need people buying their old tech from 2018 and 2020. So the prices have to go higher and higher. All while killing the planet with crazy high TDP.
LTT Labs is making these charts look like testing is done by Gamersnexus, and I love it. Finally more comparisons than one-two other offerings from NVIDIA and AMD.
LTT is lightyears better than GN and HWU. Those only know games. LTT is light years more versatile.
@@ImtheIC I love LTT but that's simply not entirely true. GN gives facts straight and all technical details are there but Steve, with all his wisdom, can sound boring. HWU same at least for me. What LTT does is give info as digestible bites that the layman can understand but the experienced can appreciate but the super techy ones will find lacking or wanting.... until Labs came in.
@@ImtheIC As much as I love LTT;
Their testing breakdown is a DUMBED-DOWN version compared to GN and HWU.
I'm not complaining as I know LTT targets a wider audience.
RTX 4060RTX 4060 😻? Where will you come please tell me
@@ImtheIC ur insane
With EVGA leaving the market for GPUs I am firmly on team red and since I tend to skip generations I can easily avoid attempts at gouging me.
Next generation cards will be 1200 watts plus better buy a nuclear power plant before, I got me a 3070 TI for 500$ brand-new the week before the release of the 40 series at 1440p Cyberpunk 2077 Rays at high DLSS quality everything at high and ultra in between 78 and 85 FPS
evga will be there for us thats why they go full on for 5000w superquiet power supplies
I'm quitting gaming. Bought m2 mac mini. Buying a gaming PC if it will be cheaper and energy efficient
@@wonrei Yeah I have a Mac mini with both consoles. I also have a PC with a 5700 XT. I’m good. GPU prices are stupid.
I went team red as well for a similar reason and love my 7900 xt. I think $800 is fair for the performance jump I got over my 3070 which I paid almost the same amount for years ago.
This is one of the best graphics card reviews I’ve ever seen from LTT
Just checked the prices for Europe, 1000€ for a 70 class gpu...
Man this hits hard.
1300 CAD (+15% tax) here. Man, I remember getting my 1070 TI for 700$ CAD and thought that was expensive
I think someone is buying it, enabling, encouraging Nvidia to keep this greedy behavior.
they're making AMD consoles more attractive to the average consumer and focusing on making money from enthusiasts willing to spend big money. It's quite sad, really.
I skipped out on the big 2 this Gen and opted to go with an arc 16GB , TBH it hasn't disappointed. I'm not a hard-core gamer / graphics designer, but for my 32" 1080P LG it was a good upgrade from my RX580 8G. It doubled my frames in most games that I play with my sons and it renders just fine in my 3D printers software. And it seems it gets a little better with every new firmware intel pushes. And for less than $400 US. Something needs to give to bring these prices back down to earth. The high they were riding on crypto has their wires scrambled.
Give this guy your respects boys, what a madlad.
Exactly nvidia has lost their mind with these prices And its only alowing AMD to do the same
Arc might not be a performer but damnit if it brings prices back down to earth im all for it
I really want to do the same actually I mean like... In my country atleast all other cards pretty much have trash value even second hand.
And something very cool I think about arc is actually that the Ray tracing is pretty powerful!
But on the other hand I'm still unsure about Intel not dropping the project they already demoted raja pretty much the head guy of the division not so long ago
Not saying he did a good job or a bad job but I have not heard about anyone replacing him so that Is pretty scary.
I would have loved to buy that, but I ended up buying the 6700xt bc I don't have resizable bar
Bro how do you even use a 32" 1080p display
Unless you use it from like 2 meters it's gonna look like shit
I personally have a 32 1440p lg display and this resolution is barely enough, at least for me
I stumbled upon this video and want to thank you for making it. During the 90s-2000s I built multiple computers. Last one was over 11 years ago as I slowed down gaming. Recently started to look into building a new rig and most components were inline with my expectations. Once I saw the pricing for video cards, I decided to abandon the project. I am disgusted by the pricing set by nVIDIA and will not spend any money to support this practice. I hope they get what’s coming to them.
The reason why I’m still keeping my 12 year old pc, it is ridiculous
Second hand cards are so much cheaper
I have to say, Linus hits the nail on its head on this. “Good product, for Bad price, Produced by an Company that cant hide their ugly nature”.
Everyone involved in the changes to GPU reviews, great job. This one was really good.
tbh, Every single reviewer has the same take on this card. Gamer's Nexus was far, far more brutal than Linus, and pretty much no reviewer I'm away of has made any positive anything about this product.
@@JE-zl6uy I don't understand the bad reviews. The card does about 3090TI-performance for half the price and power-demand and only get beaten by AMD in non RT-games, while RT (and VR) is kind of the reason to buy such cards in the first place. And it's still just huge compare to a 2070Ti, so the price-increase is kind of justified by the size. And if the card would be 200 bucks cheaper would mean your old card it worth less as well while for that price I still get at least half the price back if I sell my 3080.
@@Leynad778 the reason is the price for the top and mid end is scaling up greatly each generation and the price-performance leap isn’t correlating with actual performance leap gotten from architecture and node shrink at the given amount of cuda cores/die size
@@Leynad778 do you know what else beats a 2070 TI? A 300$ 6650xt. Beating a 20 series card is no longer a achievement, especially for a 800$ card.
@@Leynad778 It performs about double compared to the 2070 Super, but also costs double. What's the point then? Where's the generational improvement? I was planning on upgrading my 2070 Super, but I don't see the point honestly with 1) this price, 2) the price to performance ratio, and 3) power consumption - just the card itself consumes as much power as my entire PC with the 2070 Super. And honestly I'm not even having any trouble running games with it maxed out at 1440p. I was also hoping it would have more VRAM, AMD is really beating them with 16GB.
Its insane that in the current year the gpu costs the same if not hundreds more than the rest of the pc
Outrageous prices that’s for sure
What is also insane is that 10 years ago top of the line GPU prices were just 400€ ~500€
@@draconk remember the 1080 too, an absolute beast of a GPU for its time and it was $699. Fucking wild we’ve come to this point.
@@Maverektresellers man…
Always been that way but consider everyone games now 20 years ago it was just us nerds, kids have better pcs than adults now
I think a lot of the reason why people voted they had never used ray-tracing in that poll is because they have never had hardware that supports it w/ good performance. So I'm not sure the argument that the RT performance of the 4070Ti dosen't matter much really holds up, since a lot of the people that voted 'Never' probably would enable it if they had a card as powerful as the 4070Ti
i have the card and i would not return to non ray tracing
I can not stress enough how essential it is that Intel succeeds in this market. Honestly, for the average customer, the a750 is looking like a great purchase now that is drivers are improving.
Intel pisses off customers because of their inconsistency in literally everything. They are also to be blamed for making shite cpu and jacking the prices
Ain't the GPU division getting scraped or smt?
@@N_Joji it is not getting scrapped, intel confirmed it
@@yellowscarlightningscream8347 well i wouldn't trust any of these companies but I do hope they will push forward, fix drivers and release new products.
mate u have a point sure. but you are putting your hope in the most rotten company of all, Intel 🤣🤣🤣. what we need are new fresh companies with totally new architecture. Don't know, japan , korea etc. enough with this Duopoly.
If Intel could get a solid $500 card out, I could see them making a huge splash in the market.
intel lmao what?
@@MrFearlesskiller from their first launch it seems the main issue lies in their very undercooked drivers if they make decent gains in their drivers they could surely make a nice splash in the market
You still probably believe in the tooth fairy as well. It's not easy to make a "solid $500 card" being new to the scene.
@@MrFearlesskiller
Well for 300$ or so ,it is a very good card.
Especially if you like RT.
If you do,than then there is nothing in that price range to compare to it.
Asrock Phantom Gaming D A770 is 305$.
It is around 3060 normaly.
But in RT it trades blows with 3060Ti
if Intel could make a card performs like RTX4070ti, Intel will probably charge USD800 as well.
Like other people have already said, the new graph style is a much-needed improvement, helps the information clarity a lot! Also, love that you guys are always using relative performance to other GPUs now, provides a good summary of data whenever different games/programs features discrepancies.
I upgraded my gaming rig from a 1050ti (now in my plex server) to a 3060 this week - I have 3 1080p, 60hz max displays that I really don't want to replace, so this made the most sense to me. The titles I have looked at so far with RTX on - have left me with a feeling of "who cares?"
It seems far too much weight is being put onto ray tracing from the manufacturers side, when the users, really couldn't care less.
you should upgrade at least 1 monitor! the upgraded graphics wont do much good if you can only get a max of 60fps from the monitor
@@ryanakahuss I think you missed hte point - RTX hasn't showin itself to be anytihing to really care about.
But thats your opinion. Talk for yourself. You cant say « everyone think like me » that make you sound very narrow mind
Good choice, that RTX 3060 is one hell of a video card.
Get well soon Linus. I hope you dont come down with something bad.
down baad
Glad I wasn't the only one that felt something was wrong
New Year's party 🥳
This is why we need competitors. Nvidia and AMD are collaborating at this point on pricing. They might as well be price fixing.
A duopoly is just a monopoly with slightly different lettering.
Intel?
@@Dave102693 That's like comparing two big burly, battle hardened greedy and ugly men against a wee lad barely having object permanency
they were selling GPU under third parties names on amazon and ebay during the pandemic, they are really that scum for money, probably planning another alt coin mining fever to skyrocket prices again
@@Dave102693 sure but as we have seen Intel isn't a competition at this point
The LTT team is to be commended on one of the best reviews of the 4070 Ti, especially given the current climate. The comparisons across competing GPU's, extensive game and productivity benchmarks, and actual vendor based pricing considerations, sets this review at the top.
The real-world commentary is refreshing and I appreciate the effort by all LTT staff involved enabling me to make informed purchasing decisions for both business and personal hardware upgrades. Well done.
I liked Jayztwocents review as well. You can tell he's just over the pricing bs and his feelings match my own
I have a suggestion. Put the card's typical price next to each entry on the comparison charts.
Man, when they announced the 3080 it felt like such a great stride for the company. Incredible performance at a solid price
Feels like ages ago
i remember when they first showed 3060 beat the 2080ti at a wayyy cheaper price but now theres no hope of a 4060 that can beat the 3080 or 3080ti
@darkreaper0604 worse than that, the pour souls who paid $400 for a 3060 knowing the value was bad at the low end, because getting a card at msrp in February 2021 was nothing short of witchcraft.
3080TI was terrible pricing, their pricing has been nothing but garbage since the pandemic. they literally just doubled the prices of all their cards and kept it that way.
@@Reaperzx6 the 4060 would beat the 3080ti, however it would cost more.
@@sebastianjennings1159 yeah i recently just bought a 3070ti for $300 used. It works perfectly! I think people should buy used if they cant afford MSRP then sell and buy a new one when they can. Nonetheless, buying gpus or pc parts in a whole is kind of a waste unless youre working from home. People fail to realize that video games are a complete waste of time if abused. Ill be selling this 3070 for the same price i bought it then in turn buy a new one in 2 years.
it's amazing how still today cards like the 1080 or the 1660 are some of the best cards to buy with best cost/frame efficiency.
Yeah true i upgraded from 1050ti to 1070 and now rocking 3070ti 🤟🏻
But tbh for my use the 1070 would’ve been ok.
Having my 4090 made me appreciate my 1080 soooooo much more
As it was the last good generation from nvidia , every other time it has been the worst performance increase for the biggest price increase .
I kind of regret my used 1080 Ti purchase for 400€ though. The power consumption is insane and costs me about 1€/day. I should have spent more, even if it didn't look worth it.
Really sad that even with crypto collapsing, prices are still insane.
It still blows my mind how massive these cards are, looking like a VCR in your desktop. Most of these are 4 slots now, and for what? Who are these for? I'm so glad I was able to get a 3070 (EVGA FTW3) at MSRP during the pandemic. I'm going to sit back and watch this generation, with a confused look on my face the entire time.
ngl same I got a 6600xt tho.
I think the size and the price suggests they are really struggling to innovate. Fundamentally a lot of the digital technology plateaued quite a few years ago and since then they've managed to continue to improve and refine it but nothing like the huge strides in clock speeds, ram, storage, etc that were being made from the 90s to 2010s. Ever since about 2015 they've had to find alternate ways of improving performance, some of them very clever, some of them quite crude such as just shovelling more and more and more cores onto something. But it's been a struggle.
They've probably hit a wall with just how much performance they can get from fundamental technological improvements, so the only way is to make them better now is bigger and more costly. Maybe there's some other revolutionary big leap round the corner.... but probably not.
I wish I had that luck. I have been looking for a 3070 ti ftw3 for a decent price for 2 years now and I have gotten MSI Ventus 4070 ti for cheaper than the current listed price for the 30 series
I feel the same way, just mine is the TI. Great performance and runs every game well at max 1440p.
@@Hantzius I put my name on a list for a computer store, and just hoped I'd get called. On my day off, I get that call, they called me and asked if I was still interested, and if I was I had an hour to get it. Hell yeah! 😆 Apparently they had like 15 in stock (3070s, 3080s) and they happened to look on the list when it got down to the last handful of cards. I got lucky and by the time I got there my card was the only one left.
And remember, that 7 year old card(at least the non-Ti/super version) only cost around half the price give or take(depending on the card).
Appreciate the whole upgrading bit near the end. As someone finally looking to upgrade and finally in a financial position to upgrade again, even ARC will be a massive upgrade from my 2015 GTX950M I'd been dailying for the last 7 and a half years. I think Nvidia forgets this demographic exists and if I have a good experience finally upgrading and going intel or AMD, I'll likely not bother with Nvidia in the future because too expensive
Went all AMD 8 years ago and since bulldozer "yes that mess, my system performed fine IDK why people hated it . . . " and I never went back to intel/nvidia. NO REGRETS, I'm loving ryzen and radeon, make the jump when you can :D
I want to upgrade from a GTX 770 to a used RTX 2070 Super or above (within 2000 series), but it's being very hard to find used cards at a fair price in my country.
Radeons seem to be a better deal price wise, however, I need the CUDA capabilities for production work.
If only AMD or Intel would step up their game in that regard...
@@ThePortuguesePlayer I managed to get a 3060ti from Nvidia at RRP so if you could do that I'd highly recommend as it trades blows with the 2080 Super.
@@asdfghjklkjhgfdsa69 I'm avoiding anything higher than 2000 series. 3000 series are generally more expensive in my used goods market and there are many other disadvantages that come with them regarding my usecase. For most other people, a 3000 series is probably a better deal, though.
The AMD 6000 series is a hell of a lot cheaper than the newest offerings from anyone and as long as you don't care about ray tracing (ie. most people) it's incredibly good value at 1080 or 1440.
I need to say, I love that your GPU comparison also includes (hopefully) realistic dimension comparisons! Whoever did opt for this on your team, please give him my acknowledgement.
dimension comparisons?
wait like the physical size of the cards? where?
Love the new graphs! .. I was thinking about how to visualize performance of different cards in different games with different settings. In addition to the individual slides you already have I would appreciate a final overview combining all data points into one view. I think a matrix with color codings might be the way to go here. You could for example increase the saturation for cells with higher scores compared to others. This would make it easier to find patterns
indeed congratulations on improving the visual layout and also the labs really start to show their power here
Still not as good as Gamers Nexus’ charts imo, but LTT’s charts are definitely getting better.
@@FaceyDuck I would disagree, I love GN data, analysis, honesty & professionalism, but its charts are probably not the best in readability
at least with my poor 1680x1050 TN screen
@@whismerhillgaming I would also disagree, I find GN's chart readability to be perfectly fine on my screen. But LTT's charts are definitely getting better.
As a 4070 ti owner who has been gaming my entire life but has not kept up with the times the last 5-7 years. I mainly play cs go but wanted to play some AAA titles I've missed out on the last few years. I bought a 1060 6gb laptop to tie me over about 5-6 years ago and finally broke down and built a new pc. I did quick research and paired a 13600k with the 4070 ti. Have had it for about 1.5 months now and have played cyberpunk, hunt, redfall, and red dead 2. All have played awesome and I couldn't be happier. I got a little nervous after I built it that I made a mistake but after using it myself and seeing some of these benchmarks I'm happy with my choices also in my opinion the best performance for power build you can get. I don't intend on using it for 4k and will likely build another in 5-7 years. In all likelihood I'll be able to max out 1440 for that amount of time without a problem and if I have to turn a texture or two down at the end of its life I will. Should it have had 16 gb for the price well yes and I'm a little disappointed about that but the cache is like 16x more than the last gen so I think people are exaggerating the ram need. Also as games get optimized after release they will get better. Overall I'm happy and just wanted to give an opinion from someone who owns one. If I had a 30 series card I might hold out till 50 series but if you're building a new pc go for it I'm happy. And people will hate on frame gen or dlss 3 but if its single player to me it looks great and feels smooth. For something competitive I would never use something like that. For example I run medium setting at 1080 on cs go with a 165hz monitor and cap my frames at 320 because it reduces latency to .6-1.2 and uses 44 w. Also frames never dip below 200 even on the biggest spikes. CPU usage at those frames is like 10%.
I remember when I upgraded my first computer with a new gpu for 96usd and I could crank up all the settings in the latest games to ultra with no problem!
Sad how we have to pay more than 10x that price today and not even being able to get a steady 60fps.
What card was it?
I remember trying to play games like DayZ and BF3/4 on my school laptop with integrated graphics. Then built my first tower for $400 with a AMD FX-6300 and Sapphire 270X. Boy was it magical playing a game a 1080p above 30fps. Good times.
@@sean9267 The FX-6300 :D Man, we sold more of those than any other CPU ever, and for a longer time. They just worked, and at a low price. The 270X was also a great card for the money. You made good build decisions.
pretty sure i got my first gpu for 2 chickens and i ruled all of azeroth
@@williamdunkley5791 Anyone who rushed dark knights ruled all of azeroth. You didn't even need a GPU. Even a Tseng Labs 512KB ISA card would do the trick.
I've noticed that some charts are sorted by Average, and others are sorted by 5% low (for example Hitman 3 @ 1440p 6:01 ). This is something that's a bit confusing when trying to follow the graphs. There should really be consistency or a clear indication how the graphs are sorted in the individual game's benchmark results. Otherwise, just seems like someone didn't hit the right sort in excel when extracting the data.
I guess we have to follow either one may be averages are better? Or 1%
Looking at the productivity graphs they seem to have sorted it by the average of all values, at 6:01 two cards have the same mean value so they have to choose
I think they are sorting by the combined scores(?) of all three lines. Whatever it is, it does make sense intuitively. High average fps is not as important if the gap between the lows and the average is way too big.
I noticed that too. Would be great to have a little up or down arrow with e.g. "1% low" in a top corner to show the sort. other than that, these graphs are much improved - very readable
Purchased a 1070Ti for 600 AUD at the end of 2018 and if I were to spend $600 again (4 years later) all I can afford from Nvidia is a 3060 (
1070 Ti is such a legendary card.
@@Alex-bl8uh my 1070ti is dying :( gonna have to put the beast to sleep soon
@@brandonr2188 out here with a 980 still going strong lol
@@brandonr2188 mine is dying now too. sadly the same games don't even run medium settings now.
I have a 1080 ti strix.... I got a 4090 gaming x trio (high oc) and wasn't all that impressed. Seemed like a negligible increase in performance across the board. Games would just utilize 50-75% of the gpu, otherwise frame rate increases again were negligible in a lot of scenarios. I sent it back.
It's actually kind of sad to see how the 4070 ti preforms 3 months ago now that 7900xt are at $799 for some and 7900 xtx are at 899 if open box
Got a 4070TI open box - excellent from Best Buy for 750 with tax and free shipping. I wasn't initally going to get a 4070 but it just seemed to line up.
Gotta love that the price increase has doubled every gen since the 10-series. 200$ increase from 3070 sounds insane! Especially since they wont sell FE cards and the cards available to consumers most likely wont match msrp...
There is no inflation.
I feel like this is another little hiccup like to 400 series was. I think if we wait for a while it's all gonna stabilize again. We just need to stop buying them.
it's actually $300 increase, 3070 msrp originally was $499
@@klrct5736 It's never gonna go back down, People proved that when people were buying ps5 double the price. Nvidia knows this and they won't change.
@@Steven-yf2ef this is because its not likely a console gamer will fight the system by purchasing an entire pc at 4 times the cost. however buying a console for a fraction the cost and saying fuch it?
When it comes to productivity I'd go for a 3090ti or even a 3090 over the 4070ti because of the VRAM, when rendering stuff in Blender etc you really don't want to run out of it since it goes quite a bit slower when it has to offload on the system memory instead.
Aaaand youre comparing a highest end card with a not highest end. Technically the performance is high but its still doesnt have 90 at the end of its name. My rtx 2070 used to be cool, today in 2023, its half the performance of 70 name.
@@fynkozari9271 The reason they are compairing it is because they are the same price right now.
@@fynkozari9271 they're the same price. The naming means nothing at all
@@fynkozari9271 The 4070ti USED to be called the 4080 then they slapped a different name on it. Are you actually stupid? Like did you just not watch the video?
If you want to productivity.. buy a 4090ti. they are cheaper than a 3090ti. and when your budget is 1200$ or so for GPU why wouldn't you go for a 4090ti..
I understand that the graphs are now more intricate due to LTT Labs being brought into the fold. But can I request that future graphs have markers that coincide with what the host is saying? I.e. @ 5:52, Linus talks about the 4070 "closing the gap among its older siblings". It would be nice to see some arrows to point to where these "older siblings" are in the graph. Otherwise, I will have to scan through the extensive graph to see where they are, making it more like a class lecture or an RTS game than a form of entertainment.
I'm no PC part connoisseur, and I know some folks will say "just pause the video", but the whole idea of it being a presentation is so that I don't have to pause it and it flows along with the speaker.
I think it was quite obvious as to what Linus was referring to as "older siblings". The 4070 Ti, the card being reviewed here, is always bolded AND circled out in the graphs. I think they are doing more than enough for the graphs to be as readable as possible, all without them being too obnoxious.
Easier to read graphs is always welcome! I think your suggestion is solid.
he says "bigger siblings" so it's pretty obvious...
@@jopa3895 We know which cards they’re talking about, but in the sea of cards, I shouldn’t have to be looking for them, I should be focusing on the text.
I don't even pay attention to the graphs anymore because they're too confusing, something like your suggesting would be helpful!
I think the 10x series upgrade language is directed at the steam hardware survey. a lot of 10 series cards there, so they are really trying to pull in those people, becuase they know the people that shelled out top $ for a 30x series are not going to see any value in an upgrade.
I remember when $500 was an insane price for a GPU.. 15 years ago the NV 8800 GTS was top tier and $450. Sure designs have got more complex, but at the same time manufacturing cost keep going down as well. At this point, anything above $750 for the top top tier GPU is price gouging.
Right?? The $200-250 was the sweet spot. And it got bumped by $50-150 every generation until they're now in the triple digits. 😐 The worst thing is that older generations don't come down in price when a newer generation comes out. I don't understand why... Is it the crypto mining thing or just greed? Probably just greed.
500 dollars back then were also worth a lot more than 500 dollars are worth now, that's how inflation works.
@@Nobody-su9km I took that into account when I said more then $750 ;-) in 2008 500 USD would be 681 USD today..
@@AtomicOverdrive did you also took shortage of eletronic components into account? Or the cost per chip of smaller more complex technology?
@@AtomicOverdrive and it would be 700, not 681
I genuinely do appreciate OS Version/Patch/Driver Version being put on visual benchmarks. It adds context.
I think is the time I start looking at ebay for 3080/3090 series cards, I keep seeing some pop up starting at £500. Anything is an upgrade for me from a GTX 980
This was the exact same case last year with the 3000s series card, 3rd party OEMs even stated their MSRP was unobtainable.
If you look at it they planned it, the 2070 super already cost almost as much as a GTX 980 cost, then a RTX 3070ti cost even more than a 980 cost, and finally get this. Even if nVidia "dropped" the price to $700, that's still them clearly planning to charge *at least* $100 more every single generation.
And no, it's not fucking inflation Jesus Christ. There was no inflation making them charge that much for the RTX 2080. There was no inflation making people buy used 1080tis instead because they gouged prices so much compared to performance. The inflation (i.e. corpo price gouging when they realized everyone would just blame increased prices on inflation) hadn't set in yet when they announced that $600 MSRP for the 3070ti, a card that couldn't even perform as awell as the $580 RX 6800, and that was fully $100 above the MSRP of a card I'd already passed on 2 years prior because I thought it was a bit much, and that was when I was comparing in $50 segments.
No AMD isn't better either, that is correct. AMD is just not as brazen, usually. Zen3 when they dropped the stock coolers though, that combined with adding a cost made it effectively $100 more expensive than Zen2, plus people complain about the 5600x cost compared to 2600 and 1600x and 3600x. So, I think that was trying to profit off those of us on AM4, who'd be buying it anyway (hence the cooler didn't make sense). Maybe it was a wise thing, coolers are back for Zen4 right? On AM5? But either way, I'm not praising them when they're being greedy. It's just nVidia is so brazen about it, it's like they're personally daring you not to buy it. So I won't.
30 series came out fall of 2020 or 2.25 years ago but were hard to find at retailers until spring of last year/2022. I think you meant to say 4 series OEM's stated MSRP is unobtainable.
@@pandemicneetbux2110 It all ties in with the fact that, if you have the best/fastest product and no competition, you can charge whatever you want for it. Because there's no alternative from a competitor. Plus the fact that global focus is on maximizing corporate profits at every cost, including human lives, instead of of maximizing human happiness.
@@Weshwey_ Sad hell this world we live in.
@@Weshwey_ It's not without competition is the thing. In some metrics apparently there's AMD cards beating nVidia at native resolution raytracing same price bracket now. So they're even closing that gap. That's just the thing--they're not better. They're not unchallenged. Buyers are just bydlo morons, willing to be led by the nose. The 3070ti itself was bad value. It's not just the bad value of this generation all around, but even within that bad value for example the 7900XTX is clearly competing with the 4080 in performance but beating it in price. That's what has me so baffled, is the 3050 was selling for $100 more than the much, much better performing RX 6600.
>but muh features
A 3050 isn't strong enough to do RT regardless, it's basically a GTX level card in all but name. That's my point. They realized they can shovel any manure and someone will buy at least some of it.
I got a MSI 4070Ti for 799 on Newegg.
Huge upgrade over my 1080Ti, and the 4070Ti is 100 dollars cheaper than what I paid for the 1080Ti back in 2018.
I gotta say. LOVE the graphs. The 60 and 120 line keeping everything in perspective is awesome!
I'd like a clearer indicator of what generation each row is and either msrp or, even better, recent sale price. That's a lot of stuff, but without it the graphs don't have clear indicators of value if you want a one stop video.
A great upgrade from 10 to 40 series? Have Nvidia forgotten about their last Gen 30 series? Like, from a 1060 3gb to a 3060 ti 8gb is a BIG upgrade and for less than $800
I've been thinking about upgrading to a 3070 ti but my 2070 runs very hot already I had to change case fans so my GPU didn't rise above 70°c so I'm a little concerned I'd have to water cool it.
That was almost exactly my upgrade path (except my 1060 was a 6gb) and man, it's pretty much all I've ever wanted. Especially because I caught it on a one-day random $549CAD sale on Amazon. All I need is 1440p because 4k gaming is basically a meme and it's perfect for my needs. Skip this generation and let Nvidia get the hint.
@@CYWNightmare have you tried re-applying thermal past?
i went from 6gb one to rtx 3070 2 yrs ago
That's the exact path that I took. I got my 1060 3GB last year for $250 at the height of the GPU shortage era. It struggled with most games. 2 months ago saw a mint 3060ti 8GB for the exact same price and went for the upgrade. The difference is like night and day. Incredible value. No need to drop 800 for a new card
I'm still rocking a gtx 1080 I bought used in 2018 for $300. most of the 10XX series cards almost seemed like the perfect sweet spot for price to performance IMO.
Yep. That's me too. Red Dead Redemption still looks amazing at 1080p on ultra. Want a 1440p card, maybe with a high refresh rate monitor when I hop on less demanding multi-player games like Overwatch or Deep Rock Galactic, but no fucking way am I dropping this much of a bag when I already love my set up
The whole 10 series was and still is a beast. From performance to longevity to value for money those cards were the last hurrah for us gamers.
I wanted an upgrade to my 1070 but after the disappointing holiday sales (if you can call it that) I gave up. Still wildly overpriced.
Bought a nice PNU XLR8 triple fan 2080ti used for like $260 in October last year , I have it in my PC and finally retired my 1080ti to the second PC.
For the money nowdays there are so many good deals on used GPUs that it’s tuff to ever justify buying a new GPU.
Although I’m not gonna lie I don’t see the hate about the 4070ti here it’s 3090ti performance that was using 185 watts while gaming!! Like that’s amazing it’s expensive but I’d love to have one and if I was on a 1080ti still and wanted a 3090 ish card I’d buy a 4070ti for sure
yeah , rocking with 30 fps max.
As ridiculously greedy as Nvidia is, its also expected behavior from them at this point and this generation has me more disappointed with AMD. They had the chance to put out normal priced cards and actually win a larger share of the market because everyone was excited for the 7000 series to be a good performer at a fair price. Instead, they followed Nvidias lead and went for higher margins leaving us with no good options for multiple years in a row.
Well now they have cards at more reasonable prices (relatively)
@@Dell-ol6hbits not reasonable at all..
The 1080ti back in the days was the best card / nvidia's flagship - high end ... and you could get it for 700-800€.
Now you need to pay the same price for a 4070ti, while their new flagship is the 4090ti...
the 4070ti basically stands for "low-mid" end cards... at this point.
Imagine having a 3 generations newer low end card for the same price as the high end card while it can only offer 70-100% more performance..
It should bring like 200% more performance for 3/4 of the price to be on par with the 1080ti in terms of performance/money.
big disappointment unfortunately
I'm happy with my $575 MSI 6800xt. Upgrading right now or even in the next couple years just doesn't seem worth the absolutely terrible pricing that they are trying to force down our throats right now. Maybe Intel will shake up the game some more in that time. Who knows.
I've got a 5700xt and I have no regrets, welcome to tribal team red.
Yeah rn there is no upgrade to make, it's all a downgrade when you consider the price increase to perf increase
I managed to get a 6700xt used for $259 in October. It'll realistically be at least 5 years before I would "need" to upgrade it.
I held off upgrading my 1080 for so long because of the crazy prices ever since the chip shortage that I mostly lost interest in gaming. Their greed cost them a customer.
I build my new pc about 2 years ago with a 5800x etc. the only thing i took from the old pc was the 1060. And guess what -it's still in there, like you i lost interest in gaming..
yes, I am not too happy with my gtx 1070. New games don't run that great at 1440p for a couple years already. Now just rarely play some older or indie titles. I can't pay 900 euros, don't want to risk with used and middle end cards cost as much as top end not long ago.
As other commenters have pointed out, a 550-600$ RX6900 might be a good upgrade if you're really itching.
Same. Games are cash grab, microtransaction hells these days, that doesn't help for sure
Same, their greed has lost me as a customer also.
I'm beginning to think AMD is deliberately allowing this to happen so that more people are forced into the console market. They make most of the GPUs there after all.
I’m looking to upgrade from a 1080 and use a 1440p monitor and VR (especially VRChat) a lot, so despite the disappointment that was cast upon the 4070ti, it looks like the right card for people in my position especially because 10 series cards don’t have DLSS to take advantage of which can improve gameplay. Also I’ve seen a decent prebuilt with the 4070ti at $1800, which isn’t a bad price for the performance.
been there, bought the 1800 prebuilt, no regrets whatsoever. This pc is is so silent at 165Hz for all the FPS I played, I am now thinking to upgrade to a 1440p aswell because this pc can absolutely handle it. I m glad I made the purchase tbh. Just a stranger agreeing with your take.
1660 Super gamer here lol I'm also planning to build my new/2nd PC as my old one I built back in 2012 is now dead. (Rip)
I want to play in 1440p on 144Hz and I actually think the 4070 Ti is the only card that is the most attractive one. (Except prices, those are 💩 anyway)
Same here!
In the exact same position as you. Was gonna get a 3080 at first, but the price for a 3080 and 4070ti are the same where I'm from and the 4070ti has better performance. Kind of a no brainer
Bro its amazing for 1440p hdr. This is a great card for exactly that and that is why I built with this card. Gigabyte Gaming OC 4070 ti is phenomenal and runs super cool as well. Decent overclocking too.
So glad I got a 3080 at MSRP 2 years ago. Looks like another generation to skip, just like I did with the 2000 series.
Still rocking 1070 lol
same here, and im still really happy with the performance the 3080 offers
Same. I managed to snag a 3080 for MSRP i'm not doing this with nvidia this generation. If AMD fixes their vapor chamber issue, i'd consider them over nvidia right now. (especially being as I kinda want to go to linux full time since gaming is getting really close there.)
I bought a used 3080 for $500, Asus TUF OC too
Same, got my 3080 for just a bit over msrp last year and boy, it seem I got lucky with price to performance. I wouldn't pay more that $650 or $700 for the 4070ti.
I have a 3070ti and have never turned on ray tracing just because performance is too big of a hit since I don't want crippled fps lol. It feels like when we went dx8>9 I think it was around 05 with the 7000>8000 changeover too. Sad to see pricing and performance progressing the way it currently is though even though I expect fluctuation with my experience over the years.
oi, I like your name
Never even tried it? It's worth a shot, even if just with stuff like Portal RTX.
What they and Digital Foundry will tell you is do you know how amazing RT is....although its 25fps you don't understand what you are seeing and experiencing. The truth is we are gamers and gameplay matters most...the only thing that affects gameplay is frames. Have a good one.
I have a 3070 and have literally tried finding RT settings to turn on in the games I play. Either the setting's absent or the game outright doesn't support it, in every case! *HMMMM.*
Methinks NVIDIA really just does not understand how much raytracing actually matters. (It doesn't.)
@@seizonsha lmao the gatekeeping. I play some games with RT on because I like the way they look, and I don't really care for the difference between 60 and 120 fps in some games. If I play a competitive shooter, of course I'll want higher fps... but I'll also want Cyberpunk to look the best it can, even if it's at 45-ish fps.
Why do people have such a hard time understanding that we can all have different preferences and that their own way of doing things shouldn't be imposed or used against others lol
I appreciate the nicer graphs that are softer to the eyes but still conveying a professional and accurate look.
I think if you've got a healthy build going, it can be easy to demonize Nvidia, but...
If you own an older video card & are looking to upgrade, let's consider this:
Radeon 7900 XT - $900 on Amazon.
RTX 4070 Ti - $850 on Amazon.
While rasterization and raw performance are important, who the Hell is buying 30 and 40 series cards, without considering Ray-tracing as a measurement of performance?
In addition to that, who isn't considering DLSS or FSR as a measurement of performance?
If you want to call them gimmicks (largely debatable), EVERYONE got suckered in.
Until these gimmicks become less important, I think Rasterization, sadly, might have to take a back seat for a little while.
If you factor in these "gimmicks" with performance and market share values, the 4070 Ti is good. It's $40 cheaper than AMD's current equivalent.
BUT EVEN from a pure rasterization standpoint, the 4070 Ti still beats out ALMOST the 30 series entirely and its average FPS is 120+ in 4K - that's nothing to laugh about.
I'd like to see ARC in these charts now that they are running better with the updates they have been putting on them. They added More Direct X support so those charts should look better. Yes I know not a 4080 class gpu but still good enough for the vast majority of 1440p gamers that don't use ray tracing. Which by the charts indicate most do not use 4k nor ray tracing. So an added segment to show how much of a difference Amd to Intel to Nvidia would be helpful in the buyers choice. When it's time for my next upgrade I'm hoping Intel does more and Arc works out because that's the next Gpu I would like to try.
It is still not stable enough to give consistent results. The intel cards are very inconsistent with RAM and Processor selection. There is no benchmark for Intel Arc graphics card yet
I hope they can compete with the *80/XT class GPUs with future products. But who knows how they gonna price it.
My understanding was that Arc is excellent at Ray tracing and gives Nvidia a run for the money...
@@yellowscarlightningscream8347 BS, of course it can be benchmarked in games
The current arcs are like 3060 level even id they tested it it will get pushed out of the chart
The new formatting for the charts is 🔥 SO much clearer than the old ones and you've packed more information in too 👏👏👏
I don't like how the cards keep moving up and down on the chart, it makes each chart transition really hard to follow.
@@GetterRay this video in particular was INTENSE with the data and charts!
Yes and innacurate as hell...I don't like to see the same numbers to have different sizes in graphs....like why is there 6 higher than 6 on 3090ti over the 3080....numbers are fine but the visuals are the thing that bothers me... 11:23
@@MightyScharp if the scores were given in decimal and LTT just rounded the number up, it's possible it looks like that but idk specworkstation scores
@@anishj.2390 That is true but if that's the case it should've been stated in the graph somewhere...for me it's misleading that is all
I'm one of the people still running a 10 series. Looking to make a new rig this year, but I'll be primarily focused on getting the best board and socket available. If I have to stick an old card into my new system until prices stabilize, then I will.
Where do you live?
same. I'll probably keep my 1660 ti for my next system
Prices could go up too if crypto bounces back
@@ItsArzee true. I'll just have to keep an eye on it.
1080 Ti = Beast. Ive had a few in builds that are within a few % of the 3060 Ti.
1100 euros in my country... that aged well. I guess my good old 1070 will just have to last 2 more years.
I still can' get over how big the cards are now. Soon I'll be making the jump from an old ATI 5770 to whatever I decide to go with. As life happended I got out of PC gaming for awhile and moved to console gaming. I'll be getting a new PC to hop back into PC gaming.
Yeah it's pretty nuts they basically just didn't bother optimizing for efficiency anyway and now just dump 450-600w of massive power draw on top of those chips like it's some sort of GTX 690 Mars II SLI whatever memery in those things which of course takes a pretty much full tower mesh fronted case and with a massive beefy cooler that you can't even fit the freakin thing in a full tower anymore without taking the hard drive cages out if you go with nVidia. It's frankly impossible for me to recommend them just because all of that adds up, and I quite literally can't either fit it in my case nor power it with my PSU, and I have a 850w 80+ Gold from EVGA with a full tower mesh case. Because again, the hard drive cages makes it not have enough room. Good luck with your front mounted rad either. At least AMD still has a sane 3 slot or less, 2x8pin design that doesn't catch itself on fire.
I might just hold on to my GTX 1070 for yet another generation, hoping a better option might present itself :D Thanks for the very in-depth review
That's how I'm feeling about my 3080. Prolly won't upgrade till I make a whole new build
get a 30 series card unless you don't need the upgrade.
I'm finally retiring my 1070 for the 4070Ti. At $800 it really is the best value when compared to what's out there, if you don't want to go used. I think my 1070 is staring to die as my computer is crashing now.
@@MaxVids1 i feel you man, not gonna judge you, and it sucks to saay that 4070ti is better for the price at the moment. i wish you get your 4070ti for a good fat discounted price and upgrade in future to 6000 or 7000 series, or better an AMD if reds finally beat the 80 and 90 series cards with big numbers.
@@MaxVids1 me too, mine 1070ti is performing really good but im tired of waiting. Ive waited through whole 2xxx series, ppl told me to wait for 3xxx. Waited through them as well because ive heard and seen how big gap there will be with 4xxx series. Now that the 4070ti is out ive already ordered it just because im tired of waiting for price drops and other "better options". I feel like at this moment the 4070ti is a pretty good choice
I'm just gonna stick with my second hand 30 series card, runs all the games I want and probably will for several years. I thought about getting one of the 40 series cards, but once I heard they were gonna price hike them at MSRP, like what the GPU shortage did for second hand cards, I decided it's better to settle than break the bank and give card makers any more reason to price hike cards that would be far cheaper in a world that had no GPU shortage.
It's a weird prospect to even upgrade every GPU cycle anyway.
Unless you had to mega cheap out and get the 3050 or something, I see no reason why anyone should bother to refresh their GPU for the few percent difference.
But hey, if you're rich enough, I suppose..
@@HotdogSosage bought a 3060ti for about $300 used upgraded from a 1050ti. Was gonna wait for the 40 series to come out and willing to spend a lot more for the upgrade, but I settled for a cheaper decent upgrade when I heard the inflated MSRP prices the 40 series was gonna be.
To be fair, what percent of those who never used ray tracing also never had a card good enough to expect reasonable performance from it? I had a 1070 before my 3080, so I had no expectation of good performance with it on unless the game otherwise was ridiculously easy to run.
My thoughts exactly, it's kind of offputting the fact that he... didn't even recognise this fact? I used to have a 2060 until recently, and I couldn't even think about using ray tracing, so obviously I didn't, but now that I upgraded, I always do
Agreed. I went from a 1660Ti which couldn’t handle RT to a 3060 that still honestly couldn’t but I could at least get a look at what it might be like. Once I saw it actually in play I went for the 4070Ti and I am fully convinced it’s worth it. Granted I know it isn’t for everyone.
Let's also just admire how far AMD has come in regards to raytracing from just a few years ago.
The 7900XT now nearly matches the 4070 Ti in raytracing while being just slightly more expensive.
With the huge lead Nvidia had just a few years ago that's really admirable.
Is it, catching up is always easier than continuing to innovate
Yeah next gen is supposed to be the refinement that will make them beat current gen Nvidia in both raster and rtx perf.
i dont know the rtx 4070ti is just garbage, like the ddr4 gt 1030
the 4060s seam also to be a new vram experiement
also the cut down pcie lanes that have been rumored
asking for the quality of a 1050(100 dollar by the way) on a 350-400 dollar price point is outrageous apparently
I don't see it, 4070 ti has 15% advantage over 7900 XTX in heavily raytraced Cyberpunk 2077.
4070 ti is, mind you, a REALLY tiny GPU chip and a THIRD from the Nvidia GPU stack behind 4080 and 4090.
This is at 4K which is not the favored resolution for this card as it would be preferably tested at 1440p.
Cyberpunk 2077 has none of the Shader Execution Reordering or Opacity Micromaps implementations that were introduced in Ada Lovelace (RTX 40) architecture. Not even the preview build with DLSS3 has them, as that's part of the RT Overdrive update that's probably many months away.
lol only a true fanboy would admire that
I have a 2070 Super and skipped 30's series hopping that 40's would have a relation between price and performance. But as you said, the industry just became a shame. Best regards from Brazil my friend, and thank you very much for this video.
Edit:
Thanks a lot for the answers guys. Now I known I'm not alone.
For some background:
I bought the 2070 for $500 in cybermonday in 2018, but it had the artifact bug, I send it to Asus to repair and they send me back an 2070 Super because they didn't have available a 2070 in stock. Before buying the 2070 I had an Radeon R9 280x. I don't change GPUs every year, I just change when one of these two things happens:
One, my games stop running at high settings at least at 40 - 50 FPS average, 4k (now I play in my 50 inch TV that have 60hz).
Two, I change my GPU when I have a good price/performance opportunity to at least increase 50% or more the performance and keep playing for a long time without have to worry about lower my graphics.
I Play Games Like, RDR 2, Cyberpunk, Sekiro, Forza Horizon, Elden Ring and I'm waiting for Diablo 4 and I Know that I already have more than the recommended gear to play it. I can wait a little bit more to not give money to this greed corpo market.
One more time, tks for your comments and thoughts. I appreciate.
I too have a 2070 super, and decided that nope, no upgrade this generation, unless price drops significantly.
It looks like I'll be squeezing at least one more year out of my 2060 Super...
I would get a 6k series if you really need an upgrade. Sound like you have a 1440p or 1080p display. So you want hurt too bad. Still if you can’t afford it like me, skip these gen and save your Pennie’s now for the next gen. I’m sure it will be expensive as well. I’m saving now. $10 hear and there for the next 2 years I guess
Yeah the pricing is nuts, I just upgraded from a 2080 Ti to a 4090, and am still debating now whether to go 13900K/ KS, or just wait until fall for the 14900K since it’s going to be a new socket most likely.
Im the same my man. I might just upgrade my 3700x to the 5800x3d instead for my upgrade this year.
I already had low expectations for this GPU, but holy shit (regarding the value). Still, can’t say no to an early LTT video.
RTX 4060 😻? Where will you come please tell me
Stop buying these cards for these insane prices.
VOTE WITH YOUR WALLET
Even before the good-bad-good cycle was mentioned I had already decided that I’ll wait for the 50-series before upgrading my 2070.
Crazy to think that I always felt that high-end PC gaming was expensive and unobtainable for a long time, but looking back to my first build with that 2070 card now it feels like everything was great value 😂
My 2080 Super is still holding its own. Ok I'm tempted with the 7900xtx but prices are still in rape mode.
still running a 1060 myself. and a i7-3770 lol. still runs everything well enough for 1080p so no point wasting thousands to upgrade and play at the same settings anyway.
50 series usually isnt good value, i wouldnt consider
@@DoryHayes I have my 2080 super as well. Runs MWII 60 fps at 4k. But I agree, prices are fucked.
do you think I can hang onto my 660ti any longer? :(
As a student studying game development and constantly doing work in unreal engine and blender I very much appreciate the blender benchmarks! Looks like I will be holding on to my 1080 though. It's funny to me that nvidia thinks that 800 is the price to set an upgrade for the budget college kid card I spent 300 on during the crypto mess. Feels like nvidia tried to force out rtx, thinking it would win them market supremacy. Issue is that the college kids like me that are the future of game development couldn't afford effective rtx in the form of a 2080 ti. So they came out with the 30 series, which nobody could get thanks to crypto. nvidia needed to get rtx in the hands of as many creators as possible to drive the technology forward, but failed to do so. Now here they are trying to recover the loss on their investment by just jacking up prices, when frankly if they made a 4070 ti without rtx that cost 200 less it would be far more compelling.
The funny thing is, NVidia's like oops we figured out how to not have shit algorithms for RTX(after YEARS)... oh look 2-3x more performance...whoopsie!...new series. I really think this new breakthrough was some young college guy who ACTUALLY studied older computer tech and was like, uhh they did this thing for cpus back in the day why are we not doing this for gpus? You know, order stuff properly....DERRRRP. Imagine if Intel charged $1600 for their current i9 cpus.
Do yourself a favor and get out of computer gaming (gaming in general if you can imo). There's no joy in it and you're just nickel and dimed for everything.
its on par with the 3090 at half the price. look at the specs. rtx 4070 ti where it's at.
GTX 2770 ti
The 1080 is actually at like $150 now, it’s insane to go for current gen at this point
Day by day, the 3060Ti which I bought in October seems like an amazing buy compared to the new generation.
Yeah I'm kicking myself for not grabbing a 3060 ti FE when they were readily available.
Same here.
I'm even more happy to "steal" a used 1080 last year, considering the bullshit is still going strong on the market
same
At this point getting an older GPU is probably much more sensible than paying the silly prices for a new one, no matter what you plan to do with it.
of course.. everyone is lying..polititians, doctors, companies...
What I find most shocking is how good the 3080 performance is in comparison to the 4070 ti despite the price. Glad I got a 3080 for a good price when I did
laughs in 7900xtx
this was same with 2000 vs 3000 series
yeah dude I've been feeling the same thing
I was fortunate enough to get a TUF 3080 for $699.99 due to living 10mins from a microcenter, almost exactly 2 years ago.
This card still killing it so hard I will prob wait another 2 years at least before upgrading.
Yep. Still kicking myself daily for not picking one up early on, 3-4 mos. before I was actually ready to build the PC it would've gone into. By the time I was ready to purchase, it would have more than doubled the price of the system I built 2 years ago.
I'm glad I got a 6700 xt. I know it doesn't compare to the other cards, but I got the Sapphire Pulse version for $410, and now it's $550. I actually do need the 12 gigs of vram, and AMD is better value for money, since I personally don't need any of the bs features from Nvidia like DLSS or RTX (I play at 1080p with mostly nonRTX games)
Idk how I could be any sooner
Edit: While I may have LTT's attention... Alex is the bees knees. More videos making wack projects would be much appreciated!
Same
You couldn't
Me neither
Congratulations on your only meaning in life
Finally! Wattage graphs! They are so important and LTT is finally doing them! I hope to see a future video comparing wattage use with fps/scores for many gpus and cpus.
der8auer always implement W/fps graphs too. So you can Look it Up there
When it comes to efficiency, leading desktop gpus give reason for big concerns. The current overheat problems of the latest nvidia and amd top tier gpus are troublesome for a stable and safe use.
Opposing to that It is astonishing what mobile gpu developers like imagination technologies, who also helped with apple's m1, can do with a fraction of the electrical power. They published very efficient ray tracing gpus, long before nvidia hyped gamers for that.
If I was a NVIDIA simp and finally got sick of their sh*t, I would not switch to console. I would switch to AMD. They’re not perfect, but at least I’m not being robbed.
I got a 3060 at a very good price last year, but when I have to look for a new GPU in a few years I'm gonna seriously consider going back to consoles. Buying a new GPU, be nvidia or AMD, for more than a PS5 to get a similar experience is simply unnaceptable.
if you need a gaming only machine, a console is the way to go.
But the game selection is less ample than that on a pc.
so its a 50/50 if you ask me.
@@lordrefrigeratorintercoole288 And the mods the only reason i still consider pc as superior.
@@kls1836 you dont think resolutions dropping to 920p to maintain 30fps is a bit weak?
In what way is the experience similar? Console plays its games, generally, at a minimum level of acceptable quality. And that's without the options to do all the other things that come with pc.
Not being able to use something like discord is the big no for me on experience difference. Not including everything else you sacrifice for console
I share your sentiment my friend. I subscribed to nvidia geforce now (the priority tier) and I'm just playing my games from the living room and it's really good for what it is. I will be trying the exclusive tier with 4k and see how that is.
I own a xbox series x and a pc but I will sell the pc and just use my console instead alongside geforce now to play any pc related games or games I haven't finished.
Everyday, I have less and less reason to own a pc.
I'll probably just wait and upgrade from my 1070 system next year. Still a killer rig for 1080p gaming and moderate workloads. Raytracing isn't beneficial enough yet for me to spend that much and see little to no difference in frames the way I play at 1080p.
Even for me, I have a 3060, to this day since 2 years ago, when I bought it, I have *never* experienced ray tracing, it's just in such few games anyways, and gives 0 graphical improvement for almost all performance lost!
It's a fuckin marketing strategy, have useless features.
_I wonder where they learned this from.. perhaps.. apple??_
I love how around 19:30, even Linus forgot there was the 1600 series, that gen was so forgettable due to how great the 1000 series was.
It wasn't really a generation, it came out AFTER the 2000 series cards, just to fill out the low-end product stack.
There wasn’t a 1600 series tho…1600 named cards was part of the lower end models for the RTX 2000 series without RTX cores. Instead of being an RTX 2050 for example it was a GTX 1650. Same series just o RTX cores and only had two product. 1650 and 1660 and the variants/super refreshes of each. Never was a series.
How is it forgettable when 1650 is the most used graphic cards rigjt now?
That regular model graphics card is for my Dell XPS 8940. NVIDIA
I actually really appreciated the 1080ti comparison, still rocking mine. I’m getting ready to build a new pc, and have been curious of how big of a jump I was going to get with a new card
They did compare it however they misrepresented it, they only showed 4k/1440p ultra/high. Nobody on a 1080ti is maxing 1440p graphics these days so it's just a dumb comparison, never mind playing at ultra is just stupid anyways. 1080ti is still holding up strong, it can easily reach 60fps on mw2, not that they show that.
@@MrHennoGarvie I’ve been super happy with it, that’s for sure. It’s runs a lot of games @ 3440x1440, 100 fps, @ high graphical setting. The shooters don’t really count to me, I’ve always done custom setting to maximize the fps and get the clarity I want. Still does amazing for the games I want to be maxed out (or as close as I can with out a huge it to fps). I just don’t want to feed the Green machine, they prices are ridiculous. Had high hopes for AMDs latest cards. Wish they would have stuck it to Nvidia prices wise though. Shit I’d even consider Intels gpu’s if they had a higher end version that fit my needs.
@@aeromech2155 1080ti is the best card NVDIA ever made. Best card when it was released, best card for its price and best card for longevity. 699 when it was released...
@@Alpay283 6 months after the Titan X Pascal purchase for work this beast made me cry a little inside. I spent $1200, they release the 1080 Ti to beat AMD for $699. What a monster, still!
@Zero I have yet to build my rig so I can’t give you a review of it. My build list has changed now. I’m about 90% certain that I will go with the 7900xtx. My 850w psu from my old build will power the system happily, which saves me money. I thought I wanted ray tracing but now that I have done more research into it, most people can’t see a major difference, and it’s a decent hit to performance. Any one out there with a RT card is welcome to correct me considering I don’t actually have the capability to RT at the moment.
something nvidia seems to keep overlooking when it comes to their overreliance on RayTracing in performance graphs, is that unlike previous generations (pascal and maxwell especially), mid-tier 50 and 60 series cards keep coming out significantly later than the 80 and 90 cards (and especially with 20 series, had poor if any DLSS/RT performance, made worse by the 50 and 60 cards standing to benefit from DLSS the most), even though the 50 and 60 cards are historically purchased by a lot more people
because way less people buy the cards that can actually run these types of games, a lot less devs have proper RT/DLSS integration, so only people buying AAA graphical showcase games and a 90 series card (a very small subset of pc gamers) will use those features often, making the excessive prices hard to justify to most people
edit: I actually appreciate that LTT is taking note of the RayTracing usage problems, because reviews in the last couple years often placed Nvidia cards over AMD cards because of RayTracing performance, even when the equivalent AMD card had comperable if not better rasterization performance (what the vast majority of people and games will actually use, at least for the foreseeable future. and as we all know, "futureproofing" is never a safe bet)
I'm not sure why you had to specify AAA games because that's what most people play, the only thing that makes it a small subset is the GPU they have
Plus, for anyone running a 50/60/70 card, you've basically got a choice - turn RT on and turn a lot of other settings down (say, high to medium, turn down resolution, go to a lower-quality DLSS, etc.), and still have a drop in FPS - or keep RT off and have higher frame rates and rasterization quality. Having had my 3080 for over 2 years, in those situations, I'll say what I'll normally do is keep RT on for a bit to look at pretty lights then turn it off once I'm actually getting deeper into the game.
Yeah its weird, like i´ve got an RTX 2070 Super that i bought in 2019 for like 520€, and honestly, the only time i really "used" Raytracing was when Minecraft RTX came out, to experiment a bit, but im playing on a 1440p 144Hz display, and i´d rather have high frame rates then some eyecandy. The only feature i use whenever its avalable is DLSS, cause its basically free perfomance, with little to no downsight, especially when using the "Quality" mode.
I'm honestly so tired of seeing ray tracing and DLSS performance... I judt want to see pure rasterization benchmarks to see what they've ACTUALLY done. Not this fake frames and fancy lighting bullshit.
@@Aqueox DLSS performance is actually important to show how far they've come, although I agree frame generation is terrible (If you look it up you can see just how terrible it is, absolutely ruins the whole idea of DLSS improving performance with minimal quality loss), ray tracing is important too because it shows how far we can go without having to use pre baked lighting and other lighting tricks
I’m really glad you’re including productivity programs in your charts now. I game, sure, but I do more than that and it helps get a more holistic view of gpu performance.
Agreed! I Game a lot but I also run photo editing software that often is more performant heavy than my fave games
Agreed, but benchmarks don't say it all when it comes to productivity. This video makes it seem like the 4070ti might be a worthy alternative to a 3090/3090ti for 3D renderings, but it comes with HALF the vram. This means that heavy scenes that would use 51% of a 3090's vram would literally just crash on a 4070ti. This is a BIG problem, considering the 4080 still has only 16gb and the 4090 is out of everybody's budget plus you need to add a new PSU on top of it.
has anyone else noticed their gas and groceries cost lately? and anything on amazon? I've started looking back at prices i paid for consumable items 3 years ago and its insane. Everything is up 30%, at LEAST. Everything. oh wait ...