I've been saying this since I found out the new cards are on the same 4nm node as last gen: these are gonna have nominal gains at best. The whole point of these cards is to make them cheaper for NVIDIA to produce, while upgrading their frame gen to make up the performance gains.
The 4070 Super has 1024 more CUDA cores and 25% more ROPs (80 vs 64). The question: will the slightly better RT of the 5070 reduce the loss from enabling RT by more than you lose in native raster performance?
The 5080 with only a 15% improvement over the 4080... When are they going to admit that this generation is a rehash of the previous one, and that the only decent card is the 5090?
I don't understand why people didn't understand that 6 months ago, when we found out that it would be on the same node. TSMC N3 is not ready yet for large dies. The pricing and the cut-down levels would be completely stupid (and supply would be an issue, as Apple is still taping out).
PC Centric just showed Cyberpunk running at 4K max settings with full path tracing and no DLSS or FG. The 5090 got 28-ish FPS. That's a 27% uplift from the 4090. Something else that should be mentioned: I'm pretty sure during the 40-series keynote Nvidia used an Intel CPU for their charts. They are using the 9800X3D for their 50-series charts, PLUS DLSS Performance. DLSS P (low-res render) plus X3D is a pretty big uplift.
@@midnightvibes5485 People will complain if the 5090 is only 30% better than the 4090, because Nvidia is charging 25% more. That's not a lot of improvement in value per dollar, at least for games (rough math below this thread). If you're using it for other purposes, especially AI, it's probably a much better value uplift.
Are those uplifts usually static across the board, though? We're talking about 27% but at low FPS numbers; if you got 200 fps at 1080p, should you expect 260 fps on a 5090?
So people buying a 5090 will play games using DLSS Performance mode? C'mon now, if you're buying a card in that price bracket, you want to be playing at maxed-out ultra settings.
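A quick back-of-the-envelope check of the value-per-dollar point a few comments up, treating the quoted ~30% performance gain as an assumption:

```python
# Value-per-dollar check using the figures quoted above (assumptions):
perf_ratio = 1.30            # assumed 5090 vs 4090 performance
price_ratio = 1999 / 1599    # 5090 vs 4090 MSRP, ~1.25
value_uplift = perf_ratio / price_ratio - 1
print(f"perf per dollar: {value_uplift:+.1%}")  # roughly +4%
```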
So we're paying for a fancy radiator to heat our rooms? The 5000 series just follows on from a 4000 series that had already been forced to consume more power for a 30% increase in performance.
Nobody is forcing you to do anything; get a 4090 or 7900 XTX (although that's also an insanely inefficient card) after the fact at a cheap price if you want.
@@FreddyFazbearsPizzeriaOfficial Actually, yes they are, because this is the long-term direction all 3 vendors are going in. The alternative is buying a tiny PC with an SoC that has the repairability of a phone. Server GPUs are far worse with heat and power consumption, to the point they require datacenters to be replaced, or you just give up and go to the cloud. I don't have a good feeling about any of this.
Since the CES presentations from the GPU brands, my expectations started out very high for Nvidia, given the prices and performance they showed, and very poor for AMD. But now, 9 days after CES, I am on the totally opposite side: my expectations for AMD are very, very high, and as for Nvidia, there is no way I will buy these GPUs.
AMD is a wild card this generation. My napkin math says that at best the 9070XT could match the 5070Ti, but at worst only match the 5070. So the error bars are still WAY too wide to be certain of anything.
Clock rates and transistor density for Blackwell have remained similar to those in Ada; that probably explains the "modest" raster performance boost. Most of the effort in the new architecture has been spent on AI and RT, plus a lot of software-side updates (DLSS 4).
People complain about "modest" gains constantly, but think of it this way: what if you were 33% taller? What if your car was 20% faster? These are still significant gains, just not the stratospheric gains that people expect to be pulled out of someone's ass. Everyone seems to just discount the AI and software stack as well, which is fine, but they fail to realize that the vast majority of people are perfectly happy using the upscalers and frame gen.
This is why it'll be a waste of an upgrade for anyone with a good 4000-series card. But I, with my 2070 Super, am eagerly awaiting an upgrade when I make a full AM5 build at some point.
In short, you are getting a minor increase in raster performance, and the new generation basically unlocks a feature that generates more frames, instead of being a generation that gives massive performance bumps.
Ngl, you got 9070 XT performance for 300 dollars more. I'm guessing the 9070 XT is gonna be 499~549 and is gonna be 4080 Super performance in raster, and 10 to 20 percent weaker in ray tracing, based on the Moore's Law Is Dead leaks. The future will tell, tho. We'll see how well my comment ages.
@FreddyFazbearsPizzeriaOfficial I didn't say he was stupid. He could've gotten better value by waiting longer and seeing how the market changes with the new cards around the corner.
If you check the Danish retailers, they have put the price of the 9070 XT at €1200, which is also surprising to me, since I'm waiting for that card too, but definitely not at that price!
@@SafetNeziri666 It's an unreleased card. They can put whatever placeholder price on it that they want. They could put 9999.99 on it, and until AMD actually sets the MSRP, it's meaningless.
The RX 7700 XT is 35% faster than the RTX 4060 Ti for the same price as of now, and I don't see anyone buying AMD. Why??? This low pricing never works; it's Nvidia's marketing, and the nonsense from Nvidia fanboys making fake claims about AMD drivers, that is scaring people away from buying AMD. It's the scare tactics keeping people away from AMD, even if AMD is 50% faster.
Thanks for the update! I don't expect this gen to be very exciting, because Nvidia is using almost the same silicon as they did for the 40 series (4nm vs 5nm, roughly the same architecture), and they didn't come up with any exciting new software trick this time. Maybe this would have been the chance for them to raise the VRAM capacity across the board, at least to look somewhat exciting.
The 4NP in Blackwell is still 5nm-class, the same as Ada's 4N. Both are minor improvements over 5nm but are still 5nm. Apple is consuming all of the limited 3nm production atm, which GPUs won't see until next gen.
@@premjitchowdhury262 Nah, even at that low of a price, I don't think they'll make a loss. It will be a thin margin, though. Depends on the wafer cost, as AMD is using a new node I think, but everything else on the board is last gen. Which could sort of be construed as a win in some ways.
I wish this were true, but even if they price them like this or better they'll still pick up minimal market share. I would love for them to have their Ryzen moment in the GPU space, but Nvidia's mindshare is so overwhelming.
@@laszlozsurka8991 The 1660 Ti was 40% better than the 1060 with a $20 increase, and the 1660 Super had a better performance/$ ratio at a later stage when Nvidia finally released it. EVGA got the 2060 KO out at $300 and AMD released the 5600 XT, all of them pretty good cards for a cheap price; the badly priced cards were everything from the 2060 upward.
If frame generation is producing 3 AI frames for every "real" rendered frame, then 75% of your gaming experience is AI-generated and only 25% is real. It's sad that all the creative work devs put into games (artwork, animation, characters) is so drastically watered down by generative AI. This trend risks creating a cycle where developers release games filled with AI-generated art and animations, because gamers playing with fake frames won't even notice the lack of detail. It sucks, and it's all caused by AI and the companies pushing it, like NVIDIA.
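A minimal sketch of that arithmetic, assuming an N-x frame-gen mode shows one rendered frame followed by N-1 generated ones:

```python
# Share of displayed frames that were actually rendered under N-x frame gen,
# assuming each rendered frame is followed by N-1 generated frames.
def rendered_share(gen_factor: int) -> float:
    return 1 / gen_factor

for n in (2, 3, 4):
    print(f"{n}x -> {rendered_share(n):.0%} rendered, {1 - rendered_share(n):.0%} generated")
# 4x -> 25% rendered, 75% generated, matching the comment above
```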
5090 = 4090 x 1.25
5080 = 4080 x 1.15
5070 Ti = 4080 Super
5070 = 4070 Super
We are looking at a 0-5% uplift in efficiency. This is an AI/RT uplift; nothing to speak of for standard gaming. Big skip for anyone with a 40 series.
You always skip every other generation unless you're looking to get ripped off. Going from Ultra to High is usually the same performance boost as a generation, and this year's High looks like last year's Ultra.
So my 4070 Ti is basically a 5070 without the gimmick of multi frame gen, and I still have normal frame gen, got it. Zero reason to upgrade, and a skippable generation unless you're on hella old tech or an 8GB card. Honestly I'm relieved, cause now is not the time to spend money, and this just means there's a limit to how crazy they can make games graphically. Multi frame gen might cause some of the same boneheads with mediocre games to make something that looks pretty and vapid, or just flat-out not optimize and still put out a lousy game. But honestly, this is fine. Only a handful of games I've been interested in lately anyway.
I heard that the last time; I bought mine at MSRP with a discount. Not everyone experiences your environment; they actually live in a different environment.
@@walter274 Yea, I'm surprised they didn't use the 3GB-capacity RAM chips on the desktop 5080; they did with the laptop version (5090 mobile), where it has a 256-bit bus with 24GB of VRAM.
Curious if the 4070 they used is the GDDR6 or GDDR6X one. I'd think it's the recently nerfed GDDR6 4070, so they can make it look better in these results.
The charts showing huge performance increases were mostly due to frame generation. Also, why are we comparing the new generation to the 4070 Ti and not the 4070 Ti Super?
@@Greez1337 Because the Super line is a refresh. Would you compare a 5090 to a 4090 if a 4090 Ti existed? No, you wouldn't. They clearly didn't compare to the Super cards because that would shave about 3-5% off those results, and in the case of the 5080 it would look utterly terrible, when at best it's only a 15% uplift vs the 4080 (non-Super). Also, when the 4080 Super launched, Nvidia stopped production of the 4080 (non-Super), so it's a valid comparison, since that's the better SKU under the same name.
Lovelace 2 - Electric Boogaloo
5090 ($2000) => OC 4090Ti x 4090 ($1600 MSRP)
5080 ($1000) => OC 4080S ($950) x 4080 ($1200 MSRP - no longer available)
5070 Ti ($750) => 4070TiS ($850) x 4070Ti ($800 MSRP - no longer available)
5070 ($550) x 4070S ($600) x 4070 ($550)
Sort of. The 4070 Super has 1024 more CUDA cores and 16 more ROPs (25%), so raster will be better on the 4070 Super. The 5070 has slightly better RT TFLOPS and AI TOPS. Will the slightly better RT lose less FPS than the loss in native raster potential? Will the slightly better AI power make a huge difference in visual quality? That is unknown for now. If you have a 4070 Super, I wouldn't switch; Frame Warp will extend the life of all RTX cards. Only upgrade if you can sell and do it for very little money out of pocket. But the 5070 Ti is the sweet spot: 45% more cores + 4GB more VRAM for 40% more money. With the 5080 you pay 33% more for 12% more cores.
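A rough sketch of that sweet-spot math, using the announced MSRPs ($550/$750/$1000) and taking the core deltas quoted above as assumptions rather than verified specs:

```python
# Sweet-spot check: ratio of "% more cores" to "% more money" for each step
# up the stack, using the deltas quoted above (assumed, not verified specs).
steps = [
    ("5070 -> 5070 Ti", 0.45, 750 / 550 - 1),   # +45% cores for ~36% more money
    ("5070 Ti -> 5080", 0.12, 1000 / 750 - 1),  # +12% cores (as quoted) for ~33% more money
]
for name, cores, money in steps:
    print(f"{name}: {cores / money:.2f}")  # >1 means cores grow faster than price
# 5070 -> 5070 Ti: ~1.24 (good value step); 5070 Ti -> 5080: ~0.36 (poor step)
```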
@@FreddyFazbearsPizzeriaOfficial Also, even reviewers make mistakes, because they compare cards against previous-tier cards. The number of times I saw comparisons where they forgot the history of how they named these cards, like the RTX 3080 10GB / 3080 12GB. The whole xx70 Ti to xx80 Ti level of cards has been weird for the past 3 generations.
@@heroicsquirrel3195 Pretty sure the new Indiana Jones game has mandatory ray tracing. Also, Alan Wake 2 depends on some memory optimization features you only find on the RTX 2060 and above.
I think Nvidia are playing games here. Does no one else think it is strange that the two titles they chose to show non-RT performance for are Resident Evil 4 and Horizon? Both are AMD-optimised titles...
For example, according to Nvidia's charts, Horizon Forbidden West was tested at 1440p for the 4070 Ti. According to TechPowerUp:
4080 = 107.6 FPS
4070 Ti Super = 101.2 FPS
4070 Ti = 94.3 FPS
Therefore, if the 5070 Ti is 4070 Ti + 20%, that's ≈ 113 FPS. This means the 5070 Ti is stronger than the 4080 in Horizon.
Let's be honest: Blackwell is an Ada refresh on the same node with new proprietary tech added. Let's call a duck a duck. They just fed it more power, threw on an updated memory bus interface and upgraded to GDDR7; that's what this is all about. Blackwell is Ada tweaked. The performance scaling from the 4080 to the 5080 proves it; the 4080 to the 4080 Super was a similarly weak increase. It's Ada Lovelace renamed Blackwell. Nothing like the uplift from Maxwell to Pascal; in contrast, Blackwell is a weak generation. Bearing that in mind, they threw in the AI DLSS 4 stuff and called it a day.
@@FreddyFazbearsPizzeriaOfficial Tom's Hardware even states it's on 4nm. So does TechPowerUp. They renamed the GPU series to GB variants instead of AD. It's still Ada, though, but with newer stuff bolted on: GB202 to GB205 instead of AD102 to AD107; the further the number goes up, the more the GPU gets cut down in each series.
@@FreddyFazbearsPizzeriaOfficial No, you're right, the 4000 series is 5nm. They are so close, tbh. That's most likely why you see people saying it's the same node, because it's literally that close. It's not like the jump from 8nm to 5nm going from Ampere to Ada Lovelace. It pretty much is the same node, but technically isn't, and from an engineering standpoint they would most likely say it's the previous generation's architecture tweaked, because it's such a small bump down.
@@huggywuggy594 It's literally the same latency as NVIDIA with their DLSS 4, or lower in some cases; don't say fake things about it when you don't know how to use it properly...
These numbers are more in line with what I was expecting, with the 50 series on the same node as the 40 series and counting on architectural and memory improvements only.
May as well call this series the 40XX Super Duper series. Compared to the Super series of the last gen, you're basically getting a marginal upgrade in performance, multi-frame gen, and a price-cut for the 70 series. Zero reason to upgrade if you have those cards already.
Hey, I've been saying that if you've got a 40-series card, you really don't need a 50-series. The 50-series is just a slightly better 40-series, thanks to AI... that's it. It's like when the 30-series came out: if you had a 2070 or 2080, no need to upgrade. The 30-series was just a slightly better 20-series. Same thing here, but the gap between the 20 series and 30 series was much bigger than the gap between the 40 and 50 series. And I can totally attest to that, given how I had my RTX 2070 for 4 years from 2020-2024, with zero problems running high-level games, except in 2024 when Dragon's Dogma 2 forced me to buy a 40-series card and upgrade for the next 4 years. The RTX 2070 lasted me 4 years at high/max game settings. So food for thought, ppl. If you buy a 50-series after getting a 40-series, you need to work on your money management and general gaming PC knowledge; you'll regret spending all that extra cash for basically the same thing, just with slightly better AI-boosted FPS, not native FPS improvements. The RTX 60-series, *that's* the real next-gen upgrade.
That's assuming people upgrade to the same tier of card. I'll be jumping from a regular 4070 to a 5070ti if the uplift is big enough to warrant it. It would have been a 5080 if they had more vram.
@mojojojo6292 Yeah, I get what you're saying. But my RTX 4070 Ti handles anything maxed out with ray tracing - it's a beast, especially with my CPU. So, if you think it's worth it, go for it, man. 🤷♂️🤷♀️ I'd go for a 5080 if I were you though considering your current card is a 4070, but to each their own. Personally, I wouldn't upgrade even with a 4070, not that big a jump imho. But it's your money, your decision 👍
I'm going to enjoy the 2x+ frame rate (over the 40 series) at the cost of 10% latency. I'm coming from the 30 series. Even without all the extra stuff enabled, I'm still getting nearly 2x raw raster framerate at 4k.
Hell yea. I have a 4090 and I was curious about this app, so I bought it. I didn't use the scaling option, since my system was powerful enough to run native 4K (I tested Black Ops 6). However, it wasn't powerful enough to hit 160 FPS, and I have a 160Hz display. This gave me the bump I needed without any latency. Well worth the price. The program pairs well with DLSS, so if you're playing a game that supports DLSS, you can run Lossless Scaling frame gen only, and the results are great.
@@joesantoro4964 Perfect! Here's a tip in case you need more performance, WORKS great: set LFG to 3.0, scaling to 60%, then under Scaling choose LS1. If you lower your res in the game settings, it will do fantastic scaling up to your monitor res. Just leave the scaling on LS1, because if you keep your res at 4K (monitor res) it won't scale at all, with NO performance cost; but should you choose to use it, it works REALLY well. Talk later my friend. /C
@@joesantoro4964 Without any latency? KEK, it uses frame gen, my man. It adds latency. But, and that is a huge butt: the latency is barely noticeable (rough numbers below). Same goes for regular frame gen on Nvidia and AMD cards; it's usually like 10-20ms. Anyone who claims they notice a 10-20ms difference is lying; humans don't have the reaction time to actually notice that. I am using Lossless Scaling myself and I love it. It's superb, and one guy in his basement beat NVIDIA to it; it now supports up to 20x. Not that anyone needs it, but fug it I guess.
@ I have Reflex mode set to Ultra and use a gamepad. I am sure there is some latency, but I was not able to notice it. I will do more testing this weekend. Either way, for $8 you can't really complain.
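For rough context on the latency numbers in this thread: interpolation-style frame gen has to hold back one rendered frame before it can blend, so a common ballpark for the added delay is about one base frame-time. A simplified sketch; Reflex and implementation details shift the real figure:

```python
# Ballpark added delay for interpolation frame gen: ~one base frame-time.
# Simplified assumption; Reflex and implementation details shift the real number.
def frame_time_ms(base_fps: float) -> float:
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps} fps base -> ~{frame_time_ms(fps):.1f} ms held back")
# 60 fps base -> ~16.7 ms, in line with the 10-20 ms range mentioned above
```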
Seems like the new generation is noticeably faster under heavier RT load (a bigger increase than in raster, the same as with the 3000 vs 4000 series); also, games that scale more with bandwidth get higher gains. That would explain the disparity in results. Would be interesting to see which games are really bandwidth-starved, just for science.
Here's something to ponder: everything except the 5080 has an increased L2 cache size. The 5070 also has a 25% power budget increase, which is way greater than the 5070 Ti's or 5080's. The 5070 Ti has a way larger bandwidth improvement compared to the 5080 or 5070, but the lowest power budget increase.
I guess it would be somewhat useful to have classifications for every game title saying whether it tends to show typical performance or is more of an outlier. Then, when there are results for one game, the performance in the other games could be predicted reasonably accurately as well.
The further up you go on shader count, the less it scales compared to a card with a lower shader count. That's why you get more performance per shader on the more midrange cards.
Aye, they are nice. You can flash the 552W ASRock Aqua BIOS, then use their flash tool to upgrade to the Extreme 😅 grants access to ASRock AI Quickset too.
@@toddblankenship7164 It's not a new node; both the 40 and 50 series use the same 4N process, so it's that type of generational gain, 20-ish%. We see it often. On top of the lower gains, it will run hotter and use more power, as they are pushing it harder vs the 40 series.
Honestly, the only thing wrong with Daniel's content is that he hasn't developed the technology to upload a new video every time I hit refresh. I mean, how inconsiderate can you be to my endless tech cravings xD
It sounds like their highest card this gen will be slightly below a 7900 XTX (but hopefully ~$500-600). So at least you know what you're getting; it's just a question of price and availability.
More alarm bells that something is wrong with the 5090 here. What is up with this growing gap between its uplift and its baseline spec increase, compared to the other SKUs? The real testing will be interesting. I keep suspecting heat and thermal throttling, but only because that is the obvious point of suspicion, with that two-slot cooler apparently somehow cooling 575 watts on the FE. And how can the other specs be the explanation, when they also went up on the order of 33% from the 4090, and also did not meaningfully increase gen-on-gen for the other SKUs? Still a gap against the 90.
Oh, it's 100% power throttled. I could be wrong, but with these new 12-pin connectors it specifically doesn't draw the extra 75W from the PCIe slot, so when those connectors are rated at 600W and you're capped at 575W, you haven't got much room, if any, to play with or to let clocks boost a little higher. So unless they have voltage-binned these chips, I suspect that's what's holding the 5090 back. Just look at the 5080: it sees about a 10-15% power increase, and performance seems to be about 10-15% vs the 4080 (non-Super). Compare that to the 4080 Super instead (technically the 4080 non-Super was discontinued when the Super launched), and that 10-15% uplift goes down to about 5-10% in a best-case scenario.
I really dislike how Nvidia is comparing these to the non Super cards. Obviously, doing so puts the performance increases in a better light, but it is disingenuous. I hope reviewers compare it to both the non Super and Super refreshes and include proper price to performance increases for each. Not looking great for this gen though.
The only Super card with a meaningful performance uplift is the 4070 Super. The 4080S is like 2% better than a 4080; are you that stressed over them not showing that 2%-better card?
This isn't really surprising. If you assume that MFG doesn't introduce a lot more overhead than regular FG (which it doesn't seem to in Lossless Scaling), the claim of the 5070 matching the speed of the 4090 becomes a lot less impressive when you consider that the 4070 was already half as fast as the 4090 in raster, and the 5070 is simply generating three times more frames.
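A minimal sketch of why that claim can hold in displayed FPS, assuming generated frames add negligible overhead:

```python
# Displayed FPS parity despite a 2x raster gap, assuming frame generation
# itself is free (it isn't quite, but close enough for the illusion).
def displayed_fps(rendered_fps: float, gen_factor: int) -> float:
    return rendered_fps * gen_factor

print(displayed_fps(30, 4))  # hypothetical 5070: 30 rendered fps, 4x MFG -> 120.0
print(displayed_fps(60, 2))  # hypothetical 4090: 60 rendered fps, 2x FG  -> 120.0
```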
Anyone who believed the OG charts didn't read the fine print. Edit: this seems to be one of the smallest gen-on-gen uplifts for consumer-level cards that we've ever seen. (The 4090/5090 are not for consumers.)
Actually, those MFG benchmarks need to be checked. I see some that are clearly under 2x performance gains, which means the input frame rate is actually lower than on the 40 gen, so an actually worse gaming feel (even if the motion on the monitor looks better).
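A worked example of that check, with made-up numbers: divide displayed FPS by the frame-gen factor to recover the rendered ("input") frame rate.

```python
# Made-up numbers: a 4x MFG result under 2x the displayed FPS of a 2x FG
# result implies the rendered ("input") frame rate actually dropped.
old_input = 100 / 2   # 40 series, 2x FG, 100 displayed fps -> 50 rendered fps
new_input = 180 / 4   # 50 series, 4x MFG, 180 displayed fps -> 45 rendered fps
print(old_input, new_input)  # displayed FPS up 1.8x, input frame rate down 10%
```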
The more you skip this generation, the more you save!
Yup. I was initially excited for these new cards, was planning to swap my 3080. But looking at all this stuff, I'll stick with the 3080 for 2-2.5 more years, ty :P
@@DjDanee87 3080 will do fine till the 60xx series.
3070 ti 8Gb here. I am looking at some GPU but more than willing to switch to AMD. Hoping the 9070 XT is a solid card for a reasonable price.
lmao
@@DjDanee87 3080 will be all fine until PS6 games start coming out.. no reason to spend money on new GPU until then.
Nvidia out here pretending to give us more frames but instead giving us the same frames more often...😢
good
more benchmarks ultimately gives more frames; just not at once
To increase frame rate significantly, Nvidia would have to increase clock speed, chip total area or improve architecture.
Higher clock speed means higher power consumption.
Increasing chip size means increased production costs.
Yeah, same cards with more AI cores and RT cores, time to unoptimise the new games so they look even better
Brilliant!
So the 5080 is actually a 4080 super plus 🤔
🤡
It's basically a 4080 Ti if that gpu ever existed.
4080ti super
Die-shrunk 4080 lol
So the RX9070XT might end up being between the 5070Ti and 5080 in performance.
Yeah, I’m really just praying that AMD doesn’t disappoint with the 9070 series’ price to performance
$549 take it or leave it
@@GewelReal Only if it's an improvement vs the 7800xt
@@GewelReal Too much. What a shame.
I'm sure we can rely on AMD to snatch defeat from the jaws of victory.
But hopefully they will prove me wrong.
@stuartthurstan The 9070 XT looks to be 7% slower than the 7900 XTX in raster. So basically a 4080, with around 4070 Ti Super ray-tracing performance. So it is a 5070 Ti competitor when it comes to raster performance, and that card will be $750. They could price it at $600 and would sell quite a good number of cards if FSR4 is good.
FPS is dead, all hail latency!
"Looks better but feels worse for a higher price!"
Looks worse and performs worse lol
... who's gonna tell him ?
They've worked to reduce that though?
Have you seen the hands-on tests? Latency is much lower using all the "fake frames" tech than with all the extra tech switched off in the tests I've seen
Looks like the 5000 series is just the 4000 series with slight improvements for AI
More raw performance, and it seems like wayyy better FG and upscaling. If utilized correctly, the 50 series is superior, and no one should be upset or feel disappointed. Don't use MFG at a base framerate of 20 fps and everything will be fine lol.
@@kleo1307 It's mostly a software update. No need to push out new hardware with more and more power consumption every 2 years when the main achievement is made in software.
@@bvbftw Only if you ignore literally everything else, like GDDR7.
@@kleo1307 You shouldn't feel disappointed that they only added hardware to service software? That's only good if you buy an expensive high-end card that already runs games well over ultra settings. The only gains appear through extreme-enthusiast fluff: resolutions and framerates at 100+ that you can barely notice. All the while they're continually shrinking the budget dies, while also short-changing those cards on RAM and bus size. You're literally saying don't use it when you would actually need it, i.e. it's useless.
@@bvbftw The hardware determines the possibilities. If every card had 20GB of RAM, every game would use 4K textures instead of tiny compressed ones.
Notice they don't compare the 5070 vs the 4070 Super. Almost no uplift.
That's because there will be a 5070 super which is the correct comparison.
nvidia never compared new gen against super models.. it does not make sense for marketing material to show you that your current gpu is still good
Yea, it's gonna be a tough choice between the 4000 Super and 5000 series. The 4000 Super prices are still so high this close to the next-gen launch that I'm worried there won't be price drops of any significance on the used market, and it may just be better to try to get a 5070 Ti FE at $750.
@@LB-sh7qt Incorrect. The correct comparison is what performance you could get for a certain amount in the outgoing gen (at current prices, as that's how much it would cost you to buy now) vs how much performance you can get in the new generation for the same amount.
@@cptwhite Then you never get a better performance-per-dollar number out of the new cards, because the old cards get discounted based on the new cards' value. AMD was stupid enough to discount their RDNA2 cards 1-2 months before the launch of RDNA3. The reviews then stated that the 7000 series had no value improvement compared to current products, which were heavily discounted at that point.
Considering that the 5090 costs about 25% more, we are talking a very low gen-to-gen raster performance bump, akin to the 8800 GT to 9800 GT.
9800GT was a rebadged 8800GT IIRC
The 9800 GT had exactly the same performance as the 8800 GT.
Those 32GB of RAM, tho. They don't even try to disguise them as gamer cards anymore. Not sure why they got rid of the Titan branding.
@@PopTart-j8u yeah it was
The same percentage raster performance bump over the 4090 is there; only Nvidia is not giving you the added performance or the AI/frame-generation updates for free on the flagship offering this go-round.
I have never understood why gamers always expect to get 30%+ added performance compared to the prior generation's cards, but then complain when the cost goes up for the faster, higher-performance card.
Speed costs money; how fast do you want to go?
Hot-rod car guys have been hearing this phrase for decades, and it's still true today, including for PC graphics cards!
They just can't seem to bring themselves to show a chart with even one raster-vs-raster performance comparison.
At this point, the vast majority of Nvidia customers are fully bought into DLSS. There's no reason for them not to market these cards in a way that puts them in the best light.
It's because no one really cares about raster lol
Imagine you're Jensen.
You've got the choice between continuing the race for raw hardware power, going to great lengths to miniaturize as much as possible ( exploding your R&D costs and supplier orders ) all to please 4 nerds who want to play natively, knowing that you've got almost no competition to force your hand.
Or spend peanuts and gamble on software and fake frames for more or less similar results, except that you can sell your gear for the same price as if you'd gambled on hardware, and so gorge yourself ad vitam æternam.
What would you do?
All new AAA games coming out already have ray tracing... what would be the point of spending a ton of money on a GPU and then not using the technology it has?
Because there are no raster gains, other than maybe 5 fps at the cost of some power and heat. I only care about raster, so in most cases AMD is the better pick.
I was expecting a bigger jump with more shaders, better architecture, faster memory, etc. This is proof that the new architecture actually didn't improve all that much. Also, the power draw hasn't gotten better; it got worse.
They don't have the incentive to do so, since AMD is out of the game at the top-tier level. They're keeping their powder dry for when someone else can compete. They know they will still have a monopoly and get people to buy these in droves, so they're delivering just the bare minimum this gen.
@@honeybadger3568 I think AMD is being quiet and very secretive; they are playing the long game, like they did with Intel. NVIDIA can always beat AMD, but I think AMD is secretly planning for at least one big generational win.
@@honeybadger3568 Well, they couldn't deliver anything better this gen. It's still TSMC 5nm, the same as Ada, just a slightly improved version of it. Ada is 4N, Blackwell is 4NP; both are refinements of TSMC 5nm. 3nm is the best they make atm, but there is limited production and Apple has all of that. The 3nm stuff is meant to offer big gains, but we will have to wait until next gen for that.
Yo snap, used to love watching your old Roblox videos and your dev videos. Crazy to see you here.
Edit: Hope you're doing alright!
It's on the same node and the chips have reasonably similar sizes sans the 5090, so yeah... Nvidia only has TDP to play with.
Lowering expectations but increasing the prices.
The more you buy the more you save.
@@RAZGR1Z the more you buy the more you get scammed.
@@RAZGR1Z the more you buy, the more jackets Jensen buys
They actually decreased prices, except for that 5090.
The cards are cheaper than the 40 series, so how have the prices increased?
The fact we have to count how many pixels tall the bar chart is, is PREPOSTEROUS 😂
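For what it's worth, the pixel-counting exercise is just a ratio; a tiny sketch with made-up measurements:

```python
# Estimate a chart's implied uplift from measured bar heights (pixels).
# The measurements below are made up for illustration.
def bar_uplift(new_px: int, old_px: int) -> float:
    return new_px / old_px - 1

print(f"{bar_uplift(412, 350):+.1%}")  # 412 px vs 350 px -> ~+17.7%
```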
The 5070 is essentially a 4070 Super with a software update and minor physical changes (fewer cores, but better memory).
Same with AMD: the 9070 XT is just a 7900 XT with less VRAM.
Not complaining tho, better deals for gamers when the prices come down!
@@Dempig Improved raytracing, machine learning upscaling and the chip is monolithic, so easier to produce.
@@FreddyFazbearsPizzeriaOfficial That's if the prices come down at all. The 5070 will likely sell $100 or $200 above MSRP, and Trump's tariffs will also raise prices across the board.
@ The 5000 series also has better raytracing and upscaling. You can't claim them as improvements for AMD but not Nvidia lol
50 series will be a 10-15% generational uplift without all the fake AI bullshit. That's the truth.
I'm afraid the AI BS is the future though 😐
@@rodriguezkev pretty much... RIP Moore's Law
It's not the truth, tho. Take away RT and it's looking more like 2-5%, with increased power draw.
Hey look on the bright side, significantly improved price to performance and you can get last gen for cheaper afterwards.
I hope future game requirements don’t ask for Frame Generation
Looking forward to the 5090 vs the 4090 raster, raytracing, and dlss benchmarks.
There is a video about this: the 5090 gets 26-28 fps with 4K native ray tracing, while the 4090 gets 22 fps with 4K native ray tracing in CP2077. In raster there is barely any difference between the 5090 and 4090.
@Zombiesmoker No, I am waiting for several games so we can get an average, and from several different channels: raster performance with no ray tracing, with ray tracing, and with DLSS. Also, based off the guesses I have seen, the 5090 should be at least 30% faster than the 4090, which is the normal generational jump, but I will wait for the official benchmarks from Hardware Unboxed, Gamers Nexus, JayzTwoCents, Linus Tech Tips, PC Centric, etc. There will be a big enough difference between the 5090 and the 4090; what are you talking about?
@@MrAnimescrazy Bro, PC Centric uploaded a video of him running Cyberpunk with the 5090. Zombie's claim is mostly accurate. One game doesn't represent all games, but he isn't lying about the small uplift in regard to Cyberpunk.
@@Zombiesmoker That's still a 25-30% uplift, which is normal for a generational performance uplift. Too bad Nvidia also increased the price by 20% again, so not really more performance for your money. It's what happens when there's no competition: they know people with the money will still buy, simply because there's no alternative if you want the best. It's the well-known FOMO marketing strategy that companies like Apple also use very effectively.
The 4090 was around a 70% increase over the 3090. This isn't normal. Stop normalizing incompetence. @inflatable2
AMD - we're not pushing for top spot this gen.
NVIDIA - Okay. We'll wait for you to catch up.
AMD also: we have shader rendering like in 2002 when DX9 came out, and they're trying to match RT performance from 4 years ago.
@@TheMSilenus xtx beating 4080 and 4080 super. how bad is nvidia^^
@PohnnyRico xtx for 1440p high fps native / 4080 for 4k dlss quality. fsr sucks / xtx better for native
@PohnnyRico Enable pathtracing and enjoy the XTX slideshow edition
Nvidia unironically might save it as a response to amd's counter.
I find it ridiculously misleading to compare the 5070 Ti (a 16GB card) to the 4070 Ti (a 12GB card) instead of to the 4070 Ti Super (also a 16GB card which retailed for ~ the same price that the 5070 Ti is rumoured to drop at).
The 4070Ti Super is at the price of the 5080 in my country...
@@machintrucGaming No it's not, because the 5080 hasn't been released yet.
@ Nvidia website has the "from" prices, so it's the FE edition price.
The 4070 Ti Super is like 5% better than the 4070 Ti; big deal. The only major difference is VRAM.
I think we shouldn't compare with dlss or frame gen. Just native....
5080 reviews coming out later than 5090 reviews... hmmm, I am wondering why.
Maybe the 5080 is more powerful lol
@@rodriguezkev disappointing to see that the 5080 isn't even going to be as fast as the 4090.
@ but the 5070 has 4090 performance! )) wink-wink.
Which basically means NOBODY NEEDS these new GPUs, because the 5060 8GB for 300 USD WILL BE ENOUGH FOR ANYBODY at any game; just pump out more glorious fake frames and turn on upscaling from 240p like all the nvboys do )) ENOUGH FOR ANYBODY, just like 640KB ))
I think, as a teacher, this is something we can all agree on:
"You cannot produce results that do not exist by changing how performance is defined."
Nvidia’s claimed performance gains from MFG is like putting flame decals on your car and saying it’s faster.
This analogy makes no sense. It's more like putting rocket boosters on a car: it looks like it goes really fast, but it cannot respond to turns on a track and can only go straight in one direction.
😂
@divertiti Except it isn't because it's conceptual. It isn't real. If you swap out a 4000 series card for a 5000 series card, you're buying a software upgrade, not a hardware upgrade.
@@divertiti It's more like increasing the FOV on a game, so it looks like you are running faster lol.
@@divertiti Artificial engine sounds, or drive by wire throttle remapper.
So that's why Nvidia lowered the prices: because they suck.
Of course. Nvidia ain't doing it out of the goodness of their hearts.
by suck do you mean the best?
and AMD will lower theirs even more, because they suck even more
yeah sucking all of AMD sales away
Did Nvidia release higher prices earlier? When did they lower them?!?
Looking at this, it seems the 5080 is slower than a 4090, and the 5070 is on par with the 4070 Super. Nvidia is literally just discounting the 4070 Super by $50 lol.
Not even that: the 4070 Super has 1024 more CUDA cores and 25% more ROPs (80 vs 64), so the question is, will the slightly better RT and AI of the 5070 reduce the loss from enabling RT by more than you lose in native raster performance? DOUBT
The 5xxx cards should be compared to the 4xxx Super models; that would be more informative.
My thoughts every single time...
You wanna see a 5% increase? Lol
Then the gain would be even more disappointing.
4070 SUPER vs 5070 = 10% better
4070 Ti SUPER vs 5070 Ti = 15% better
4080 SUPER vs 5080 = 22% better
That wouldn't be a correct comparison, assuming the 50 series will come out with Super models too.
The 4070 Super is the only 4000-series Super card with a decent performance increase; the 4070 Ti Super and 4080 Super are basically the same performance as the non-Super versions.
Nvidia always seems to forget that the Super series exists 😂
Nah, they just stopped production, never put the remaining stock on sale, and pretend they never happened.
Super are for a refresh
Meh. The 40-series Supers were more of a marketing move: a price cut without having to officially cut prices, not a performance boost.
@@jackinthebox301 ? The 4070 super was almost a 20% performance bump, 4070ti was a lesser but still decent 10-15% bump +16gb instead of 12, and it was only the 4080 super that was literally a price cut with everything else being the same(1-3% bump at best)
@@ultimategohan1551 I'll give you that the 4070 S was better, but it was more like 10 to 15%, with some outliers.
The 4070 S was also more expensive so I'm not sure that is a good example for either of us lol
There is a reason Nvidia is allowing reviews of the 5090 a week before launch but restricting 5080 reviews to launch day. The 5090 may be a decent 20-25% real-world improvement on the 4090, and all the big-boy pro features are there: 32GB, double the encoders, things people will drool over. The 5080 is cut in half in every way. No doubt in a real-world mix of RT and non-RT games it's not going to impress. The 5070 (no Ti) with its 12GB is going to have reviewers calling it the real 5060.
The door is open for AMD to destroy Nvidia's entire stack save the halo card. The halo is out of reach, clearly. Nvidia, however, has cut down everything else so hard that AMD really does have a shot if they come out swinging with a $500-or-less 9070 XT. If AMD is serious this gen, come in at like $450-475 on the XT and blast the non-XT out at $350: make the very idea of a higher-end Intel card this generation seem stupid, and make the XT the de facto card until you're willing to really spend $2k.
I think 9070 XT prices were already posted somewhere, and it was 575-599.
Sorry, but AMD ain't the good side vs Nvidia the bad :). Both price to make as big a profit as possible :))
@@krzyssadzynski1067 From the 9070 XT leaks, it's somewhere between the 4070 Ti Super and 4080, while the 5070 looks like a 4070 Super. So if they sell for the same, then the 9070 XT is much better value.
@@krzyssadzynski1067 AMD has said in a few interviews over the last few months that they understand they continually lose market share, even in generations where they thought "we have hit our goals, our pricing is reasonable, this should be good." They still end up losing market share. AMD and NV used to split the market; that isn't really the case anymore. I think AMD understands they need to not just slow the loss of market share but actually grow it a bit. If they don't, what is the point of even making dGPUs anymore? AMD has Strix Point/Halo parts, and the consoles are probably locked down for at least one more gen; they really don't need dGPUs. It's starting to be now or never if AMD is going to continue making enthusiast-class GPUs. We are all in serious trouble if Lisa does the actually sane thing going forward and tells them to forget consumer GPUs. The GPU team needs a win in terms of market share; just not losing money isn't enough at this point. If dGPUs don't sell this round, she is going to pull the plug. (Frankly, she sort of already has, as RDNA is dead after 4: they will be combining RDNA and CDNA into UDNA. Basically, that means they are going to do like NV and design for AI and servers first, and maybe sprinkle down some cast-offs for consumers.)
@@Psi-Storm Wait till you see tests across like 40 games; do not judge a book by its cover :)
I agree AMD does have a shot, but they have fumbled so many times already that I'm not holding my breath. People are tired of Nvidia's greed and are also itching for an upgrade without gimmicks.
Nvidia is smart. They never released a 4080 Ti, but gave us the 4080 Ti disguised as a 5080 with a few software integrations. They never had to care about a Ti version this time, because it would hit the 5000 series too hard, like the 1080 Ti did with the 2000 and 3000 series.
Nvidia is becoming more and more like old Intel's tick-tock strategy: a max of +10% performance over the previous generation.
And Intel gave you four cores; Nvidia has been giving you 8-12GB of VRAM for like a decade now.
It's not always a strategy.
You can't just be guaranteed to endlessly dramatically improve.
@@SeraSxF Nobody is asking for endless 40% improvements gen to gen. But if you look at the 30-to-40 gen uplifts, everything below the 70 Ti class was terrible. It's basically the same again. The 60 class will be even more garbage.
@@OneDollaBill Why would they even try to deliver a better gaming GPU when AMD left them a free hand?
Yup, maximum long-term milking of profit. Intel already has Celestial and is working on Druid. Nvidia has no competition and is saving their ammo; that's what I'd do. Milk me harder, daddy Jensen.
Love your videos dude, keep it up
The fact that we got almost no efficiency gain (~33% more performance for ~33% more power draw) shows silicon chip manufacturing is close to its limit.
Not to mention performance could look worse given that the older card was tested with DLSS 3 and the 50 series with DLSS 4, which supposedly provides a 20% boost in performance. So I'll be quite interested to see where everything actually lands at release.
No, it does not.
@@PreacherT5 Yeah, I don't see enough people mentioning this. They improved the performance of both the upscaling and the frame gen, but that's on the DLSS 4 side; DLSS 3.5 won't get these gains. So IMO the real gain will be even less than what they are claiming.
It hasn't reached its limit, but the improvements are getting more expensive. TSMC has recently started manufacturing chips with the GAAFET design, which will allow further improvements over the next 5 to 10 years and scaling below even 1 nanometre. But GAAFET is expensive, and NVIDIA can't use it yet for GPUs because it would cost too much.
We've still got a few more node size changes before those problems really show up. Nvidia could have moved to a newer TSMC node, but chose not to. The wait time was probably too much. Easier to book that newer silicon for the future and simply boost power consumption this gen.
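Stepping back, the perf-per-watt arithmetic behind the original claim is trivial to check. A minimal sketch, assuming the ~33%/~33% ballpark figures quoted above (rumored, not measured):

```python
# Rough perf-per-watt sketch. Both figures are the ~33% ballpark numbers
# from the comment above, treated as assumptions, not measurements.
perf_gain = 1.33   # relative performance vs previous gen
power_gain = 1.33  # relative board power vs previous gen

efficiency_gain = perf_gain / power_gain  # performance per watt
print(f"Efficiency change: {(efficiency_gain - 1) * 100:+.1f}%")  # +0.0%
```

If performance and power rise by the same factor, efficiency is flat by definition, which is the whole point of the complaint.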
Just don't understand how we're 4 generations into ray tracing technology and still can't run it natively above 30 fps.
EDIT: it seems i have misunderstood the difference between path tracing and ray tracing
We absolutely can run ray tracing (RTX), but path tracing is a different beast. Also, RTX launched alongside DLSS, so even from the start DLSS was meant to help with ray tracing, since it's very expensive.
Because it's trash technology and another bandaid for lazy devs.
Erm... my 2080 runs above 30 fps easily; even 60 is no issue, and I can go above 120 without problems. So it must be a you issue if you can't get that at 1080p.
It's because we're talking path tracing, not just regular ray tracing, which is many times more demanding.
Just don't understand how after 60 years we still cant go to mars
Jensen made the claims and now they gotta backtrack most of it.
People were shitting on AMD, but I'd much rather they just stick to real info and hopefully good pricing.
These 5080s are the exact same card just with better upscalers.
AMD has never stuck to real info when they do shows. Have you not watched their last few GPU releases? Their slides are always BS. They also always blow it on pricing, so I don't expect much.
@@frankytanky5076 Jensen claimed that I have a 10-thousand-dollar computer..
@@666stankyrich To be fair, Jensen claimed that the gaming enthusiast has a $10k gaming room, which he called the "command center".
I don't care about Nvidia, AMD or Intel. I only care about my wallet.
@@NathanAtkinson590 Yea, both exaggerate, but Nvidia also isn't letting reviews go live until launch day for everything under a 5090. You guys don't think that's absurd?
I mean, you don't know that until Gamers Nexus tests it, so you're talking out of your ass till then.
I've been saying this since I found out the new cards are on the same 4nm node as last gen: these are gonna have marginal gains at best. The whole point of these cards is to make them cheaper for NVIDIA to produce while upgrading frame gen to make up the performance gains.
RTX 4000 Super Duper
Anyone saying GDDR7 VRAM will make vast improvements is gonna be proven wrong.
So the 5070 is very similar to the 4070 super?
Yes, between 4070 super and 4070ti.
Well this is a flop 😂
@@Psi-Storm It's actually worse than the 4070 Super; just compare the specs.
The 4070 Super has 1024 more CUDA cores and 25% more ROPs (80 vs 64). The question: will the slightly better RT of the 5070 reduce the cost of enabling RT by more than you lose in native raster performance?
i bought a 7900xtx on black friday
my first pc has a 1060
still going to be AMDs best card. good purchase
My condolences
W
Bruh, why tf would you do that... that thing is about to come down to like 400 bucks after the RTX 5070, 5070 Ti and RDNA 4 GPUs drop.
I should have gotten the GRE when I saw it for $440 at Micro Center.
5080 with only a 15% improvement over a 4080...
When are they going to admit that this generation is a rehash of the previous one and the only decent one is the 5090?
5080 IS FUCKED
It is a rehash.. there was no node improvement, so they can't do anything big.
Over the 4080 (non-Super), so take about 3-5% off that 15%.
@@damara2268 Put in a lot more active shaders than half of the chip?
I don't understand why people didn't get this 6 months ago when we found out it would be on the same node. TSMC N3 is not ready yet for large dies; the pricing and cut-down levels would be completely stupid (and supply would be an issue, as Apple is still taping out).
Instead of posing as Captain America, Jensen should walk around with Cody Rhodes theme song.
PC Centric just showed Cyberpunk running at 4K max settings with full path tracing and no DLSS or FG. The 5090 got 28-ish FPS. That's a 27% uplift over the 4090. Something else worth mentioning: I'm pretty sure during the 40-series keynote Nvidia used an Intel CPU for their charts. They are using the 9800X3D for their 50-series charts, PLUS DLSS Performance. DLSS P (low-res render) plus X3D is a pretty big uplift.
People will still cry about it. Actually insane. 5090 would rock that game on 1440p, no need for that RT.
@@midnightvibes5485 People will complain if the 5090 is only 30% better than the 4090 because Nvidia is charging 25% more. That's not a lot of improvement in value per dollar, at least for games. If you're using it for other purposes, especially AI, it's probably a much better value uplift.
Are those uplifts usually consistent across the board, though?
We're talking about 27% at low FPS numbers; if you got 200 fps at 1080p, should you expect 260 fps on a 5090?
@keanu3260 No, it's never linear like that. It varies from game to game, but PC Centric gave us a pretty good raw-to-raw idea for one game.
So people buying a 5090 will play games using DLSS Performance mode? C'mon now, if I'm buying a card in that price bracket I want to be playing at maxed-out ultra settings.
5070 is like getting a new partner thats identical to current but AI adds a filter to their face
Shallow hal has a gal
So we're paying for a fancy radiator to heat our rooms? The 5000 series follows a 4000 series that was already forced to consume more power for a 30% increase in performance.
Nobody is forcing you to do anything, get a 4090 or 7900XTX(although that's also an insanely inefficient card) after the fact at a cheap price if you want.
@@FreddyFazbearsPizzeriaOfficial Actually, yes they are, because this is the long-term direction all 3 vendors are going in. The alternative is buying a tiny PC with an SoC that has the repairability of a phone. Server GPUs are far worse on heat and power consumption, to the point that they require datacenters to be refitted, or you just give up and go to the cloud. I don't have a good feeling about any of this.
@@vgernyc then i suppose you'll want to learn undervolting if you think all cards are going to go to insane wattages
Since the CES presentations from the GPU brands: my expectations for Nvidia started very high when I saw the prices and performance they showed, and my expectations for AMD were very poor. But now, 9 days after CES, I am on the totally opposite side; my expectations for AMD are very, very high, and as for Nvidia, there is no way I will buy these GPUs.
Same
AMD is a wild card this generation. My napkin math says that at best the 9070XT could match the 5070Ti, but at worst only match the 5070. So the error bars are still WAY too wide to be certain of anything.
Almost like you are a media sheep that isn't able to form an opinion without others to guide it...
Clock rates and transistor density for Blackwell have remained similar to those in Ada, which probably explains the modest raster performance boost. Most of the effort in the new architecture has been spent on AI, RT, and a lot of software-side updates (DLSS 4).
GDDR7 is pitching in too. If it were still GDDR6X, I wonder how much of a gain there would be?
People complain about "modest" gains constantly, but think of it this way: what if you were 33% taller? What if your car were 20% faster? These are still significant gains, just not the stratospheric gains people expect to get pulled out of someone's ass. Everyone seems to discount the AI and software stack as well, which is fine, but they fail to realize that the vast majority of people are perfectly happy using the upscalers and frame gen.
@@atomic3325 Uh except in this case, the entire world around you keeps enlarging due to UE5 bloat and lack of optimization.
This is why it'll be a waste of an upgrade for anyone with a good 4000-series card. But I, with my 2070 Super, am eagerly awaiting an upgrade when I make a full AM5 build at some point.
@kunka592 then complain to the devs?
In short, you are getting a minor increase in raster performance, and the new generation basically unlocks a feature that generates more frames, instead of being a generation that gives massive performance bumps.
I'd be interested in the difference between the 4070 Super and the 5070.
Stick with your 4070
Man, I scored a steal: a new 4080 Super at $800 USD during the Prime Day deals. Love the new PNY card.
damn good job, you got the 5070Ti early.
Ngl, you got 9070XT performance for 300 dollars more.
I'm guessing the 9070 XT is gonna be $499-549, deliver 4080 Super performance in raster, and be 10 to 20 percent weaker in ray tracing, based on the Moore's Law Is Dead leaks.
Time will tell, though. We'll see how well my comment ages.
"guh you should've bought AMD, ur stupid for buying nvidia"
Dude, shut up, nobody asked. If he wanted an Nvidia card, that's fine.
@FreddyFazbearsPizzeriaOfficial I didn't say he was stupid. He could've gotten better value by waiting and seeing how the market changes with the new cards around the corner.
A 9070 for $450 would slap Nvidia so hard; please, AMD, make it true.
It doesn't work like that. Nvidia would adjust prices if they started losing market share, and that would make both companies lose money.
If you check the Danish retailers, they have put the 9070 XT at €1200, which is surprising to me, since I'm waiting for that card too, but definitely not at that price!
@@SafetNeziri666 this price isn’t real for sure
@@SafetNeziri666 It's an unreleased card. They can put whatever placeholder price on it that they want. They could put 9999.99 on it, and until AMD actually sets the MSRP, it's meaningless.
The RX 7700 XT is 35% faster than the RTX 4060 Ti for the same price as of now, and I don't see anyone buying AMD. Why??? This low pricing never works.
It's Nvidia marketing, and nonsense from Nvidia fanboys making fake claims about AMD drivers, that is scaring people away from buying AMD. It's the scare tactics keeping people away from AMD, even if AMD is 50% faster.
From the looks of it, the 5070 is less powerful than the 4070 Super when not using DLSS 4.
Yep lmaoo
If they had stuck with the 4070 Super core count, or added +256 or +512 cores, it would be a good card, and there would still be almost a 2000 CUDA core spread to the 5070 Ti.
Thanks for the update!
I don't expect this gen to be very exciting because Nvidia is using almost the same silicon as they did with the 40 series (4nm vs 5nm, roughly the same architecture), and they didn't come up with any new exciting software trick this time. Maybe this would have been the chance for them to raise VRAM capacity across the board, at least to look somewhat exciting.
This is THE year to buy amd instead of Nvidia if amd prices the 9070xt aggressively
The 4NP used in Blackwell is still 5nm-class, the same as Ada's 4N. Both are minor improvements over base 5nm but are still 5nm-family nodes. Apple is consuming all of the limited 3nm production at the moment, which GPUs won't see until next gen.
@@tygovanbragt It will be Nvidia minus 15%, as always.. maybe minus 20% if AMD feels adventurous.
@mojojojo6292 Makes sense, as from what I've read, they won't be ramping up production until later this year.
@ Probably yes, unless you have high end 40 series, in which case, nothing really makes sense, just wait for next gen i would say.
AMD, this is the PERFECT time to grab market share. Make the 9070 XT $500 and the 9070 $400-425 and watch them fly off the shelves.
At a loss... 😂😂
@@premjitchowdhury262 Nah, even at that low a price I don't think they'd make a loss. It would be a thin margin, though. It depends on the wafer cost, as AMD is using a new node I think, but everything else on the board is last-gen, which could sort of be construed as a win in some ways.
I wish this were true, but even if they price them like this or better they'll still pick up minimal market share. I would love for them to have their Ryzen moment in the GPU space, but Nvidia's mindshare is so overwhelming.
No point; people will still go out and buy Nvidia anyway.
Yes sir
Man, this is really gonna be another Pascal vs Turing moment for NVIDIA...
5070 = 4070 Ti
5070 Ti = 4080 / 4080 SUPER
5080 = 4090 D
@@laszlozsurka8991 The 1660 Ti was 40% better than the 1060 with a $20 price increase, and the 1660 Super had an even better performance/$ ratio when Nvidia finally released it later. EVGA got the 2060 KO to $300 and AMD released the 5600 XT, all of them pretty good cards for a cheap price. The badly priced cards were everything from the 2060 upward.
Why the Chinese-market 4090D and not the original 4090 in the comparison?
If frame generation is producing 3 AI frames for every "real" rendered frame, then 75% of your gaming experience is AI-generated and only 25% is real. It's sad that all the creative work devs put into games (artwork, animation, characters) gets so drastically watered down by generative AI. This trend risks creating a cycle where developers ship games filled with AI-generated art and animation, because gamers playing with fake frames won't even notice the lack of detail. It sucks, and it's all caused by AI and the companies pushing it, like NVIDIA.
5090 = 4090 x 1.25
5080 = 4080 x 1.15
5070 ti = 4080 super
5070 = 4070 super
We are looking at a 0-5% uplift in efficiency. This is an AI/RT uplift; nothing to speak of for standard gaming. A big skip for anyone on the 40 series.
i wonder whether that's the case for TPU performance as well
Also a big skip for those with a 3080 and above. No decent uplift…
You always skip every other generation unless you're looking to get ripped off. Going from Ultra to High usually gives the same performance boost as a generation, and this year's High looks like last year's Ultra.
5090 = 4090 SUPER
So basically my 4070 Ti is a 5070 without the gimmick of multi frame gen, and I still have normal frame gen, got it. Zero reason to upgrade; a skippable generation unless you're on really old tech or an 8GB card. Honestly, I'm relieved, because now is not the time to spend money, and this just means there's a limit to how crazy they can make games graphically. Multi frame gen might lead some of the same boneheads with mediocre games to make something pretty and vapid, or to just flat out not optimize and still put out a lousy game. But honestly, this is fine. Only a handful of games have interested me lately anyway.
The 4% shader increase from the 4070 to the 5070 is kinda laughable when you see how many more shaders the other cards get.
Bruhh, so 50 series is basically just the 40 super series with ai 😂😂
0 tests without RT or DLSS, the only things they beefed up in this generation. They must be terrified of the raster performance comparisons.
I might be very disappointed when looking to replace a 3080Ti.
There is no way I'm using 4x frame gen from 30fps to get 120.
I'm sticking with my 3080Ti for another 2-3 years. Give me the legendary RTX 6090Ti Super when it comes out.
@@CC8771 It'll only set you back $2999, if not more.
@@simplyruben3184 plenty of time to plan and save
3080ti is awesome, just keep it.
3080 Ti still a very good card, it's just the Nvidia VRAM nonsense holding it back a bit.
I wish reviewers would get/be able to publish results more than 1-2 days before launch so that people can actually make an informed decision
yup
Some people are oblivious if they think the 5070 will retail anywhere near msrp.
It will in the US. Why wouldn't it?
I heard that claim last time too; I bought mine at MSRP, with a discount even. Not everyone shares your environment; they actually live in a different market.
I wouldn't be shocked if the 5070 doesn't sell very well and we see a 5070 Super or a 5070 18GB fairly quickly.
@@walter274 Yea, I'm surprised they didn't use the 3GB memory modules on the desktop 5080; they did with the laptop version (5090 mobile), where it has a 256-bit bus with 24GB of VRAM.
Curious if the 4070 they used is the GDDR6 or GDDR6X one. I'd think it's the recently nerfed GDDR6 4070, so they can make the 5070 look better in these results.
It probably wouldn't even make a difference on the chart, but I doubt it's the nerfed one.
The charts showing huge performance increases were mostly due to frame generation. Also, why are we comparing the new generation to the 4070 Ti and not the 4070 Ti Super?
How about making the argument for why a 5070 Ti should be compared to the 4070 Ti Super?
@@Greez1337 Because the Super line is a refresh. Would you compare a 5090 to a 4090 if a 4090 Ti existed? No, you wouldn't. They clearly didn't compare against the Super cards because that would shave about 3-5% off those results, and in the case of the 5080 it would look utterly terrible when, at best, it's only a 15% uplift vs the 4080 (non-Super).
Also, in the case of the 4080 and 4080 Super: when the Super launched, Nvidia stopped production of the 4080 (non-Super), so the Super is the valid comparison, as it's the current SKU under the same name.
Nvidia is like Omar Little, and AMD is like reverse Omar.
Nvidia; “Even if I miss I can’t miss”
AMD; “Even if I hit I can’t hit”
The 5070 Ti is looking more like a 4080, plus extra tech that could boost frames. So it's not terrible if the price stays as advertised.
True. I want the 5070 Ti, but I hope it ain't sold out at launch ;(
My thoughts as well.
Lovelace 2 - Electric Boogaloo
5090($2000) => OC 4090Ti
x 4090($1600MSRP)
5080($1000) => OC 4080S($950)
x 4080($1200MSRP - no longer available)
5070Ti($750) => 4070TiS($850)
x 4070Ti($800MSRP - no longer available)
5070($550)
x 4070S($600)
x 4070($550)
5070 = 4070 Super?
Yep
Sort of. The 4070 Super has 1024 more CUDA cores and 16 more ROPs (25%), so raster will be better on the 4070 Super. The 5070 has slightly better RT TFLOPS and AI TOPS. Will the slightly better RT lose less FPS than the deficit in native raster potential? Will the slightly better AI power make a big difference in visual quality? That is unknown for now.
If you have a 4070 Super I wouldn't switch; Frame Warp will extend the life of all RTX cards. Only do it if you can sell and upgrade for very little money out of pocket. But the 5070 Ti is the sweet spot: 45% more cores + 4GB more VRAM for 40% more money. With the 5080 you pay 33% more for 12% more cores.
@@NoSpamForYou There are 100% architectural changes, which means you can't just straight-up compare CUDA cores.. otherwise this is ass.
Remember that the 4070 Ti is the unlaunched "4080 12GB" model that Nvidia later released as the 4070 Ti.
Yes, never trust companies, only reviewers who can demonstrate credibility.
@@FreddyFazbearsPizzeriaOfficial Also, even reviewers make mistakes when they compare cards against previous-tier cards. The number of times I've seen comparisons where they forgot the history of how these cards were named, like the RTX 3080 10GB vs 3080 12GB. The whole xx70 Ti to xx80 Ti range has been weird for the past 3 generations.
Might finally retire my GTX 1080 after 8 years. Whether I'm going for NVIDIA or AMD, not sure yet.. curious to see all the benchmarks in a few weeks.
Worst case scenario, there's the B580. I know it's a downgrade rasterization-wise, but at least you'll get ray tracing...
@@machintrucGaming Raster is more important than ray tracing lol
@@heroicsquirrel3195 Pretty sure the new Indiana Jones game has mandatory ray tracing. Also, Alan Wake 2 depends on some optimization features you only find on the RTX 2060 and above.
Just sold my 1080 Ti for $200; now waiting for the 9070 XT, and the Danish retailers have put a €1200 price on it. I'm lost for words.
@@SafetNeziri666 That's just a placeholder amount, ignore it.
I think Nvidia are playing games here.
Does no one else think it's strange that the two titles they chose to show non-RT performance for are Resident Evil 4 and Horizon? Both are AMD-optimised titles...
For example, according to Nvidia's charts, Horizon Forbidden West was tested at 1440p for the 4070 Ti.
According to techpower-up:
4080 = 107.6 FPS
4070 Ti Super = 101.2 FPS
4070 Ti = 94.3 FPS
Therefore, if the 5070 Ti is the 4070 Ti + 20%, that's ≈113 FPS.
This means the 5070 Ti would be stronger than the 4080 in Horizon.
In these comparisons, is the 4080 Super counted as part of "4080"? Or does the 4080 Super sit between the 4080 and the 5080?
The 4080 vs 4080 Super is a 2% difference; you won't be able to tell.
Great work, Daniel! Thank you, sir! :)
Let's be honest: Blackwell is an Ada refresh on the same node with new proprietary tech added. Let's call a duck a duck. They just fed it more power, threw on an updated memory bus interface, and upgraded to GDDR7; that's what this is all about. Blackwell is Ada tweaked. The performance scaling from the 4080 to the 5080 proves it; the 4080 to 4080 Super was a similarly weak increase. It's Ada Lovelace renamed Blackwell. Nothing like the uplift from Maxwell to Pascal; in contrast, Blackwell is a weak generation. They threw in the AI DLSS 4 stuff and called it a day.
Blackwell is 4nm while Ada is 5nm, isn't it? Why do people keep saying it's the same node?
@@FreddyFazbearsPizzeriaOfficial Tom's Hardware even states it's on 4nm. So does TechPowerUp. They renamed the GPU series to GB variants instead of AD. It's still Ada, though, with newer stuff bolted on. GB202 to GB205 instead of AD102 to AD107; the further the number goes up, the more the GPU gets cut down in each series.
@ TechPowerUp states 5nm and 4nm FinFET on the same page lol, I'm seeing conflicting sources everywhere.
@@FreddyFazbearsPizzeriaOfficial No, you're right, the 4000 series is 5nm. They are that close, tbh. That's most likely why you see people saying it's the same node, because it's literally that close. It's not like the jump from 8nm to 5nm going from Ampere to Ada Lovelace. It pretty much is the same node, but technically isn't, and from an engineering standpoint they would most likely say it's the previous generation's architecture tweaked, because it's such a small step.
idk, on TechPowerUp it says TSMC 4nm FinFET and 5nm right next to each other.
Great information and insight Daniel. You are one of my favorite RUclipsrs... thank you!
By the specs, the 4090 should be able to do multi frame gen. Hopefully a mod comes out.
Lossless Scaling 3.0 - works perfectly.
@@christroy8047 Sure, but when I turn the mouse it takes 3 seconds to respond in 4x mode, plus the visual artifacting is insane.
@@astrea555 It does lol
@@huggywuggy594 If you have over 60 fps, it's not that bad.
@@huggywuggy594 It's literally the same latency as NVIDIA with their DLSS 4, or lower in some cases. Don't say fake things about it when you don't know how to use it properly...
These numbers are more in line with what I was expecting, since the 50 series is on the same node as the 40 series and counting only on architectural and memory improvements.
A 4% increase is nuts; no wonder they reduced the price if they are offering the same card just with AI 🤑
May as well call this series the 40XX Super Duper series. Compared to the Super series of the last gen, you're basically getting a marginal upgrade in performance, multi frame gen, and a price cut for the 70 series. Zero reason to upgrade if you already have those cards.
Hey, I've been saying that if you've got a 40-series card, you really don't need a 50-series. The 50-series is just a slightly better 40-series thanks to AI... that's it. It's like when the 30-series came out: if you had a 2070 or 2080, no need to upgrade; the 30-series was just a better 20-series. Same thing here, except the gap between the 20 and 30 series was much bigger than the gap between the 40 and 50 series. And I can totally attest to that, given how I had my RTX 2070 for 4 years from 2020-2024 with zero problems running demanding games, until Dragon's Dogma 2 forced me to buy a 40-series card in 2024 and upgrade for the next 4 years. The RTX 2070 lasted me 4 years at high/max settings. So food for thought, people.
If you buy a 50-series after getting a 40-series, you need to work on your money management and general gaming pc knowledge; you'll regret spending all that extra cash for basically the same thing, just with slightly better AI-boosted FPS, not native FPS improvements.
The RTX 60-series, *that's* the real next-gen upgrade.
That's assuming people upgrade to the same tier of card. I'll be jumping from a regular 4070 to a 5070 Ti if the uplift is big enough to warrant it. It would have been the 5080 if it had more VRAM.
@mojojojo6292 Yeah, I get what you're saying. But my RTX 4070 Ti handles anything maxed out with ray tracing - it's a beast, especially with my CPU. So, if you think it's worth it, go for it, man. 🤷♂️🤷♀️
I'd go for a 5080 if I were you though considering your current card is a 4070, but to each their own. Personally, I wouldn't upgrade even with a 4070, not that big a jump imho. But it's your money, your decision 👍
Selling my 4080 Super and getting the 5080. Then the 6080 🙌🏻
@@LIF3sAGAM3 The 4070 Ti gets fucked by Alan Wake 2.
I'm going to enjoy the 2x+ frame rate (over the 40 series) at the cost of 10% latency. I'm coming from the 30 series. Even without all the extra stuff enabled, I'm still getting nearly 2x raw raster framerate at 4k.
It's useless if you don't have a 240Hz monitor, because you need a 60 fps base frame rate to keep input lag and artifacts down.
Lossless Scaling 3.0 - save your $2500.
Hell yea. I have a 4090 and I was curious about this app, so I bought it. I didn't use the scaling option, since my system was powerful enough to run native 4K (I tested Black Ops 6). However, it wasn't powerful enough to hit 160 FPS, and I have a 160Hz display. This gave me the bump I needed without noticeable latency. Well worth the price. The program pairs well with DLSS, so if you're playing a game that supports DLSS you can run Lossless Scaling frame gen only, and the results are great.
@@joesantoro4964 Perfect! Here's a tip in case you need more performance - works great: set frame gen to 3.0, scaling to 60%, then under Scaling choose LS1. If you lower your res in the game settings, it will do fantastic scaling up to your monitor res. Just leave the scaling on LS1; if you keep your res at 4K (monitor res) it won't scale at all, with no performance cost, but should you choose to use it, it works REALLY well. Talk later, my friend. /C
@@joesantoro4964 Without any latency? KEK, it uses frame gen, my man. It adds latency. But, and that is a huge but: the latency is barely noticeable. Same goes for regular frame gen on Nvidia and AMD cards; it's usually like 10-20ms. Anyone who claims they notice a 10-20ms difference is lying; humans don't have the reaction time to actually notice that.
I am using Lossless Scaling myself and I love it. It's superb, and one guy in his basement beat NVIDIA to it; it now supports up to 20x. Not that anyone needs that, but fug it I guess.
@ I have Reflex set to Ultra and use a gamepad. I am sure there is some latency, but I was not able to notice it. I will do more testing this weekend. Either way, for $8 you can't really complain.
Seems like the new generation is noticeably faster in heavier RT loads (a bigger increase than in raster, same as with the 3000 vs 4000 series), and games that scale more with bandwidth also see higher gains. That would explain the disparity in results. Would be interesting to see which games are really bandwidth-starved, just for science.
Here's something to ponder: everything except the 5080 has an increased L2 cache size. The 5070 also has a 25% power budget increase, which is way greater than the 5070 Ti's or the 5080's. The 5070 Ti has a much larger bandwidth improvement than the 5080 or 5070, but the lowest power budget increase.
I guess it would be somewhat useful to have a classification for every game title saying whether it tends to show typical performance or is more of an outlier.
Then, when there are results for one game, the performance in other games could be predicted reasonably accurately as well.
The further you go up in shader count, the less it scales compared to a card with fewer shaders. That's why you get more performance per shader on the more midrange cards.
Daniel, have you noticed that they're specifically measuring the 5070 and 5070 Ti against their non-Super counterparts?
Bought a 7900xtx nitro+ whd for 759€ - listening to this makes me even happier i bought it lol
Aye, they are nice. You can flash the 552W ASRock Aqua BIOS, then use their flash tool to upgrade to the Extreme 😅
It grants access to ASRock AI QuickSet too.
I've had it for like 9 months now I love it
I just want to know what the 5090 does at native 4K without RT, DLSS, and frame generation!
I think Imma just get a 2nd hand 4070Ti for 450
Let's not forget the impact of cache size as well. (The reason why the 4070 Ti Super performed less than expected.)
The 5000-class cards are just the 4000 Super cards, that's all!
Speculation until we see numbers
@@toddblankenship7164 It's not a new process; both the 40 and 50 series use the same 4N-class node, so it's that type of generational gain, 20-ish%. We see it often. On top of the lower gains, it will run hotter and use more power, as they are pushing it harder than the 40 series.
Honestly, the only thing wrong with Daniel's content is that he hasn't developed the technology to upload a new video every time I hit refresh. I mean, how inconsiderate can you be to my endless tech cravings xD
Increased TDP = increased performance.
So simple, but it got mixed up with AI BS lol.
I just can't wait for raster data side by side with Nvidia's, to see how the 9070 XT does at 1440p.
Because Nvidia marketing for DLSS4 fake frames is massively misleading, my RTX 3080 will be replaced by an AMD card.
same
It sounds like their highest card this gen will be slightly below a 7900 XTX (but hopefully ~$500-600). So at least you know what you're getting; it's just a question of price and availability.
like your beloved amd will be any better XD
@@jordaniansniper934 Read my comment! I'm using an Nvidia card at the moment, and I don't have feelings for inanimate objects.
@@braveknight6008 Watch them launch it at $649 and then discount it $100 a week later.. AMD marketing is a joke.
More alarm bells that something is off with the 5090 here. What is up with this growing gap between its uplift and its increase in baseline specs, compared to the other SKUs? The real testing will be interesting.
I keep suspecting heat and thermal throttling, but only because that's the obvious point of suspicion with that two-slot cooler somehow cooling 575 watts on the FE.
And how can the other specs be the explanation, when they also went up by about 33% over the 4090 and did not meaningfully increase gen-on-gen for the other SKUs? There's still a gap for the 90.
That's exactly what I've been thinking. It would be nice if the partner cards had better performance.
Oh, it's 100% power throttled. I could be wrong, but with these new 12-pin connectors it specifically doesn't draw the extra 75W from the PCIe slot, so when those connectors are rated for 600W and you're capped at 575W, you haven't got much, if any, room to play with or to let clocks boost a little higher. So unless they have voltage-binned these chips, I suspect that's what's holding the 5090 back.
Just look at the 5080: it sees about a 10-15% power increase, and performance seems to be about 10-15% vs the 4080 (non-Super). Compare it to the 4080 Super instead (the 4080 non-Super was technically discontinued when the Super launched), and that 10-15% uplift drops to about 5-10% in a best-case scenario.
I really dislike how Nvidia is comparing these to the non-Super cards. Obviously doing so puts the performance increases in a better light, but it's disingenuous. I hope reviewers compare against both the non-Super and Super refreshes and include proper price-to-performance increases for each. Not looking great for this gen, though.
The only Super card with a meaningful performance uplift is the 4070 Super. The 4080 Super is like 2% better than a 4080; are you that stressed over them not showing that 2%-better card?
This isn't really surprising. If you assume that MFG doesn't introduce much more overhead than regular FG (which it doesn't seem to in Lossless Scaling), the claim of the 5070 matching the 4090 becomes a lot less impressive when you consider that the 4070 was already half as fast as the 4090 in raster, and the 5070 is simply generating three times more frames.
Anyone who believed the OG charts didn't read the fine print.
Edit: this seems to be one of the smallest gen-on-gen uplifts for consumer-level cards that we've ever seen. (The 4090/5090 are not consumer cards.)
The 20 series was worse. 1080 Ti to 2080 was no uplift at all.
Actually, those MFG benchmarks need to be checked. I see some that are clearly under 2x performance gains, which means the input frame rates are actually lower than on the 40 series, so an actually worse gaming feel (even if motion on the monitor looks better).
It's not perfect; usually DLSS and frame gen cost you significant amounts of performance, so who knows.
Till I see how good MFG is I will reserve my judgment for this gen.
Man, if Daniel did a thumbnail made to look like the MADtv skit "Lowered Expectations", that would be hilarious 😂