4:44 I like how the 105% chart looks almost DOUBLE the size of the 100% one with the 4090. LMAO
Yeah that's some proper manipulation via graphs. The baseline is 95% but is unlabelled. If there was only the first bar you'd have no idea that the bar height only represented 5% instead of 100%. Shady.
Funnily enough the graph on the right side is accurate, so it's 100% on purpose lmao.
scummy way of making graphs i guess. Not illegal but very misleading
holy shit that's a good find
classic nvidia
Can you guys stop showing results with DLSS 4
Why would they do that?
@@daunted232 Because it's misleading bullshit. They need to show the raw performance to show what the hardware is capable of. Anyone who isn’t a fanboy doesn't want to constantly see these fake frames and upscaling shit.
@@daunted232 because it's the only thing they've got going for them to look "good". No significant improvements since the 20 series.
Coming from someone that's never had a dedicated GPU, only integrated graphics.
Native resolution sucks 😂
@ Nvidia "hey this new gpu is all about dlss4!"
People online "OMG YOU'RE MISLEADING!"
Can't wait to finally get close to 30 fps in Cyberpunk!
DLSS 4 + MFG 4X is better. This is not bad.
30 fps for 3000 bucks
on my screen (3440x1440) the 5090 will do 70 fps with full RT on
:v
You do realize that you get around 10 fps with a 4090 at 4k full path tracing without dlss ?
lol comparing single frame gen for the 40 series vs multi frame gen for the 50 series for benchmarks... never change nvidia
truly man.
Multi frame gen is exclusive to the 50 series; bottlenecking the new tech to match would not increase the accuracy of the results
@@Zeeves why on earth wouldn't it?
@@imadudeokay the point is pretty obvious I think... If you disable capabilities of the 5090, you are handicapping it. Why would comparing a handicapped 5090 to a 4090 be a fair comparison?
@@m0ose0909 Frame gen 2x on both 4090 and 5090 would be the most accurate demonstration.
Man, that cooler design for the 5090 is a work of art. Props to the engineers of this marvel!
best part of the 5090 tbh
We don't know about temps, it might not perform that good.
Gamers Nexus has an interview with the thermal engineer that is totally worth the watch
@@pandalayreal there have been reports of Blackwell overheating, so the smaller size comes at a cost
who cares about the design for a graphic card.
I love the 4:34 graphs where 110% appears thrice as good as 100%, those marketing guys are paid big bucks for this wizardry.
Video encoding quality is measured on a logarithmic scale.
If you can't read a graph it's a you problem lol
100% is the baseline, how tiny do you want to make a graph like this?
It's perfectly readable, and if you can't read, it's a you problem
@@crateer You are American, right? Otherwise the room temperature IQ is concerning. A graph without a valid X/Y axis isn't "perfectly" readable and the concerns of the other guy are valid
@@MayoDealer as long as people ignore what PSNR is, how smart they are is totally irrelevant. This is marketing material, not a review or declared specifications; the graph is not bad and is understandable with a basic level of attention
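For anyone curious why a few percent on that chart can still mean a real quality gap: PSNR is a logarithmic (dB) metric, so small bar differences compress big error differences. A quick sketch, with made-up MSE values purely for illustration (not Nvidia's data):

```python
import math

def psnr_db(mse, max_val=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit video, from mean squared error."""
    return 10 * math.log10((max_val ** 2) / mse)

# Hypothetical per-pixel errors for an old and a new encoder (illustrative only).
mse_old, mse_new = 6.0, 4.8            # the new encoder makes ~20% less error
print(round(psnr_db(mse_old), 2))      # ~40.35 dB
print(round(psnr_db(mse_new), 2))      # ~41.32 dB -> only ~2.4% taller on a bar chart
```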
Don't forget the "at twice the performance" when it's actually only about 15 percent faster
I really feel like the performance metrics should be called "perceived frames per second." Frame Gen is good, but it isn't perfect and gets weird around text, subtitles, and some games UI elements. I don't love that Nvidia is claiming 200+ FPS when 3 of the 4 images are created using AI and might have artifacting.
Yep, but disingenuous marketing gonna do what it always does. The apples to oranges comparisons are always ridiculous too.
BTW, the 5090 is approximately 30% faster than the 4090 in rasterization. The 5080 is approximately 15% faster than the 4080.
You're paying $1,000 for 15% performance (x1 or x2).
@ well hopefully nobody is going from 40 to 50 in the same tier. You’d have to be kind of a dumbass to do a single generation upgrade from something like a 4080 to 5080.
But upscaling, say from 768p to 1080p (this isn't about frame generation), is not fake frames. It actually reduces CPU and GPU load, and that freed-up headroom can render more frames; it can also reduce latency.
@@General_M and the real problem people don't understand is that it's not even faking the next frame, just frames in between 2 real frames. So you already have the end point that you calculated, and you just wait for the AI to show you images in between. I want to get the next frame as soon as possible, not wait for it even more with 30ms response time
chilling with my 4080S until the RTX 9080 comes out
sir why are you waiting till 2033 lmfao you still think you will be playing games to even care at that point?
@@cobra2994 The joke clearly went over your head.
@@cobra2994 for me YES!
😂@@MilitaryGradeLubricant
I got the joke 😂
I'm looking forward to seeing what the Radeon 9900XTX will be like. Could end up being an absolute monster!
7:00 "Comparing at the same settings"... dude gtfo here.... generating 3 fake frames vs 1 fake frame is not same and the 5090 is using the new transofrmer DLSS model... I just don't understand this marketing, literally every reviewer will bash the crap out of this once they are allowed to show the actual apples to apples performance.
Saw a test with the actual same settings in cyberpunk no dlss framegen, 5090 got around 60 fps and 4090 got around 40fps
@@silverbashspam9701 where?
@@silverbashspam9701 +50%, not bad
@@silverbashspam9701 where?
Was just about to comment this... if only using FG then the 5090 would be around 130-140 fps compared to 4090's 110-120. Wow, what an improvement 2 years later 😆
How they love to demonstrate the capabilities of their DLSS, avoiding dynamic scenes and sharp camera turns as much as possible, because they understand that multiple artifacts will immediately appear on the screen.
@@Atreyuwu they don't, it's not magic.
no it won't, it will just look weird if they keep flinching around
Great to see Jacob doing well for himself. Spoke to him as a teenager way back in the early EVGA days and he was always super helpful.
I thought this guy was AI. The way he was acting seems so fake, like Nvidia's DLSS
Had the pleasure of chatting with him on a few support forum messages back when I ran an SR-2 Board, really awesome guy!
"aaaah, don't you mean 30% faster?"
That's still pretty good Gen-over-Gen. Especially since the 4090 is already a BEAST!
@@koek4539 yea 30% is a decent upgrade but the rest of the lineup is gonna be like 10% 🤷🏻♂️
@@koek4539 they just increased the shaders and consumption by 30%. No gen improvement
@@mikeramos91 Memory alone will make a big difference. GDDR7 has ~30% higher bandwidth (21 Gbps on the 4070's GDDR6X, for example, vs 28 Gbps on the 5070), and the whole Ada series was memory bottlenecked, so this might make a bigger difference
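Quick back-of-the-envelope on that bandwidth point. The per-pin speeds are the ones quoted above; the 192-bit bus width is my assumption for both cards, purely for illustration:

```python
def bandwidth_gb_s(gbps_per_pin, bus_width_bits):
    """Total memory bandwidth in GB/s from per-pin speed and bus width."""
    return gbps_per_pin * bus_width_bits / 8

gddr6x = bandwidth_gb_s(21, 192)   # ~504 GB/s (e.g. a 4070-class card)
gddr7 = bandwidth_gb_s(28, 192)    # ~672 GB/s (e.g. a 5070-class card)
print(f"{gddr7 / gddr6x - 1:.0%} more bandwidth")   # ~33%
```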
@@jean_yves_plongeur well obviously, because there's no new process node. Of course there are no efficiency gains; they mostly come from smaller nodes, which will happen with the 6090, but the node stayed the same between the 4090 and 5090. Unless you fucked up somewhere else earlier, you won't squeeze more efficiency out of the same node.
Can't wait to have this one when the 7090 releases!!!
you mean the AMD RX 9070 XT?
@@alb.1911 It's an old joke: whenever a new GPU releases, people say they can't wait to have it, meaning years later when it's much older. Let's say the 5090 releases and some people say they can finally get a GTX 1080 Ti.
@@dominicshortbow1828 lol... sorry.
@@Gabrilos505 2029
2029
One thing buried was that FG now uses game engine data/state and not just output frame. Will likely help with movement and latency.
How about rasterized performance? I'm a VR gamer, and my main VR game can make my 4090 sweat at times, and there's no support for DLSS or Frame Gen.
Lossless scaling dude, same fake frames, only 5 bucks ✌️🙂↕️
@@castro.ri_ lossless scaling is even worse dogshit
@@castro.ri_ That might look cursed in VR
May I ask what your main VR game is? I have VR also and my 4090 seems to do pretty well.
@@castro.ri_not the same ahahahaha😂😂😂
Watching this gives me that feeling of catching a glimpse of a happy family decorating their Christmas tree through the window as I walk back to my lonely apartment.
When he said 2x faster and showed us the games with DLSS I lost all excitement lmfao Now I have to figure out which gpu to buy this time around
I’m getting a 7900xtx. Faster than a 4080 Super in non ray traced performance and faster than a 3090 Ti in ray tracing performance, and 24gb of video memory.
Maybe going for Intel this time
Showing frame counters with frame generation enabled is not a performance benchmark. Increased motion fluidity should not be presented as improved performance. Frame generation decreases your base frame rate when GPU limited, adds additional latency and introduces motion artefacts. While your average Joe may not be able to notice the additional latency and decreased frame quality that comes along with frame generation, the fact remains that these techniques can’t be claimed as increased performance. This is coming from someone who owns a 4080.
You probably eat McDonald's. You think that's real meat???
@@johnc8327 realer than this, that's for sure.
It is when you are comparing two cards that are both using it…
you think that's air you're breathing?
Why not
DLSS 5 will offer VRAM generation so that we can finally offer you 4gb physical vram while dlss will hallucinate the remaining 16gb
-nvidia.
😂😂🤣
Frame Generation is NOT performance.
@@Fantomas24ARM yes it is
Who are you to say that?
@@maxkrug2000 someone with at least average IQ; you are below average
You live under a rock to say that? At least a uni degree to back up your claims?
yes it is
A shame that it didn't even touch 30fps on Cyberpunk natively.
Well yeah what do you expect using max settings path tracing etc😂
With Path Tracing...even the 6090 likely won't hit 60 fps without upscaling/fake frames with those settings, it's super intensive.
It's funny: if the CP devs had said "Yo, we don't do RT and PT," then you'd get 80+ fps and no one would complain.
But now you have an option for better quality (trading performance), you then CHOOSE YOURSELF to use it, and then complain about the fps drop.
Frame gen was made for a reason.
That's because the game isn't optimized.
where's my free RTX 5090? Santa missed my chimney and forgot to leave it under the tree
☠
I was wondering the same thing, lol. Best thing I've read all day.
Real-time path tracing is almost incomprehensibly difficult.
The fact we can get 30fps without AI hardware acceleration is a miracle.
On 4090 you can get the same 28fps with no Frame gen on.
@@TheAlyeid That's 5090. 4090 is 20 fps on FULL RT, native 4k. No DLSS at all.
@ I knew it was close. 28fps isn’t much better for $2k though right? lol
@@TheAlyeid it isn't close. that's a 30% increase
@ Rasterized!?
Getting real suspicious that yall refuse to show just raster or just upscale comparisons without FG.
Don't worry that's what the reviewers and early adopters are for! I certainly plan to test all scenarios if I can snag a card.
The EVGA guy 😊
Feels like the 20 series all over again except the 40 series wasn't all that good so instead of a disappointment it's just sad.
That is EXACTLY what this feels like. The 20 series had an absolutely awful performance lift from the 10 series, and you paid the early adopter tax to beta test new tech.
Except at least back then, the price to performance was not absolute dog shit like it is with the 50 series. Worst performance with the biggest price hike ever.
Now that not all frames are created equal, gamers will soon collectively stop caring about frame count and start caring about frame quality instead. Now that there's a distinction, frame rates are no longer an indicator of performance.
Can y'all please for the love of god start putting 10-12 GB VRAM on your 50 and 60 series cards? A majority of gamers don't have the money to get a 5090. Everyone, please like so Nvidia can see this
they won't listen. the 5060 is going to be seen with like 8gb again.
gtx1060 3-6gb / rtx2060 6-12gb / rtx3060 8-12gb / rtx 4060 8gb / RTX 5060 8gb 🤣"evolution"🤣, F U nvidia
HAHAHA
Nvidia is not for poor gamers, AMD is still around
Nvidia listens. 6080 will have 10 GB of VRAM
That encode time between the 5090 vs 4090 vs 3090 tells u everything 😂.
Those are just the encoders.
To a game dev, that's huge. But to everyday Joes like us, that's frankly not much; maybe a higher Time Spy score at best.
Because it doesn't utilize the matrix transformers and rt cores?
I'd imagine encoding has diminishing returns, so the closer you get to zero the harder it is to lower the time
Helps in editing! Going from my 2060S to my 4080S, I can actually get things done now 😂 but it just depends on where you're coming from
@@ShadowLancer128 How is encoding huge for a game dev?
I don't care what the cook does in the kitchen. As long as the meal looks good on my plate.
Wait until everyone realizes the raster improvement is only 6-10% faster vs the 4xxx series lol
5090 is 33% better, whereas the rest sit at 15% or lower
Pure raster performance increase is around 30%, simply based on CUDA core count, without taking improved memory into account.
Get your facts straight.
@@crateer Yeah, ONLY for the 5090. And it's actually 27%, not 30%. Also, cuda core count doesn't translate linearly to raster performance. For that, you need to look at the shader TFLOPS for each card. Based on shader TFLOPS, raster perf. increase is as follows: 5090 vs 4090 = 27% (105 vs 83 TFLOPS), 5080 vs 4080 = 14% (56 vs 49 TFLOPS), 5070TI vs 4070TI = 10% (44 vs 40 TFLOPS - also worth noting that vs the 4070 TI Super, it'll be a 0% increase because they're both 44 TFLOPS), 5070 vs 4070 = 6% (31 vs 29 TFLOPS). Shill harder.
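If anyone wants to sanity-check those percentages, the arithmetic is just a ratio of the shader TFLOPS figures quoted above (with the usual caveat that game performance won't scale perfectly linearly with TFLOPS):

```python
# Shader TFLOPS pairs as quoted in the comment above (new, old); not independently verified.
pairs = {
    "5090 vs 4090": (105, 83),
    "5080 vs 4080": (56, 49),
    "5070 Ti vs 4070 Ti": (44, 40),
    "5070 vs 4070": (31, 29),
}

for name, (new, old) in pairs.items():
    print(f"{name}: {new / old - 1:+.1%} shader TFLOPS")
# 5090: +26.5%, 5080: +14.3%, 5070 Ti: +10.0%, 5070: +6.9%
```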
@@omnustv483 yeah no shit, these are just same spec refreshes at same price point with some extra features. 5090 is a monster but will be impossible to get for even the people who can afford it.
Since we're getting fake frames. How about we pay for it with some fake monopoly money? :) It'll be fun. Come on Nvidia!
They are just as "fake" as any rendered frame. Nobody would care if they didn't tell you how the frames are generated.
@@oisiaaYou have no clue what you’re talking about.
Real frames reduce system latency and make the game more responsive and stable.
Fake frames just smooth the game and have no reduction on input latency, but rather increase it.
All these years you have been paying real money for faked graphics anyway. All the lighting, shadows, and reflections you are used to are fake. Now you are getting actual real graphics with ray tracing that actually simulates how real light works in the real world. You have two choices now: (1) play future real-graphics games at sub 1 FPS, (2) be stuck with ancient 2020 fake graphics (if developers even bother wasting their time on these ancient faking tricks).
I honestly think the reason why this is a thing is because a lot of gamers over the years have continually been stating that they want games to run at their monitor's refresh rate, not just 60 fps with every setting maxed out. If you wanted a card that would run Cyberpunk at even 165 or even 120 FPS with everything maxed out, you'd need to pay 3 times the price of a 4090/5090, and 99% of people can't do that. The fact that the new technology is offered at the same price as the last-gen card (4090) is a miracle. I'm not supporting the whole AI thing at all, but if we're being realistic, there's a reason we don't get top-of-the-line hardware without any AI, and it's because it would be impossible for the regular person to purchase. $1,999 is already the cap; could you imagine a card at $3,999, or even $2,999?
You, and anyone else with the same thinking here, have no idea what you're talking about.
Playing in native res makes zero sense nowadays and in *most* cases DLSS will give better results.
The paradigm isn't the same anymore - building the hardware to allow AI more leverage to predict and display > 'raw power' is the right move; and while the frames being inferred is technically correct, 'fake frames' as stated here is a misnomer brought about by a lack of understanding.
It's like saying a car that uses AI algorithms (mechanical hardware with software) to intelligently adjust the engine for better mileage and power is shite and doesn't actually account for anything - see how silly that sounds?
Unless a developer isn't doing their job right, the hardware in a 50-series card *in concert* with software will enable you to play that game at massively higher frames at high-to-max graphics with all RT features, etc. turned on *and look as good or better* than it would at native resolution.
Now if I turn off DLSS will it perform worse than my 3DFX Voodoo 5 5500?
No, around 25% to 40% better than 4090
@@koek4539i think it was a joke bro
it wasn't funny
@@koek4539 But can it run Unreal 98 in 1024x768 using Glide API?
@@therealkylekelly humor is subjective
Jacob is in the big leagues now. Congrats bro. We’re proud of you
Even without multi frame generation, DLSS 4 is better than FSR 4.
Very nice
Now lets see those rasterization results
The fact that, 5 years later, Cyberpunk2077 still is THE game to showcase all these features just shows how good the RED Engine is.. or was.. since this is the last game on it.
Or how badly the game is optimized
@@mythicallegendary3992 graphically well done, not so well on the cpu side.
You gotta go to the worse thing sometimes to see how good YOUR solution is
@@mythicallegendary3992 true
@@mythicallegendary3992 it's not badly optimized at all. Have you seen how it looks??? 😂😂😂
Is stock ramping up? Should be good by Oct?
When using MFG, you'll get smoothness, but you won't get responsiveness (like with real 200FPS). It's weird when 200FPS is displayed on the screen, and the responsiveness is like you're playing at 40 FPS. nVidia, as always, is silent on important points
200 with a 4x factor is probably around 60 base fps, and it's not weird; different games have different latency at the same fps, and even the same game at the same fps on a different system has different input latency
Latency is not an issue. You have Reflex, and FG should only be used with 60fps avg or more.
@erenyaeger7662
DLSS is necessary in cases where FPS is low and needs to be increased. But it doesn't work well when your initial FPS is low - it's a vicious circle.
When you try to increase the frame rate from 26 to 200 frames per second (as nVidia has shown), you will definitely have problems with image artifacts and high latency.
And Reflex won't reduce the control delay enough to make you feel like you're playing at 200 FPS. Your eyes see the smoothness, but your hands don't feel it.
@namusx clearly they didn't get from 26 to 200 with DLSS Frame Generation only; first they use DLSS Super Resolution with the Performance preset to get to 60+ fps, and then they used MFG.
If you have to create issues that aren't there in order to complain maybe there isn't much to complain about
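A rough way to frame the smoothness-vs-responsiveness argument above: under MFG only one of every N displayed frames is a newly rendered one, so fresh input only lands at the base rate. This is a simplified sketch (it ignores Reflex, render queues and the interpolation delay itself); the numbers are illustrative, not measured latency:

```python
def mfg_feel(displayed_fps, factor):
    """Base rendered fps, ms between input-sampled frames, and ms between displayed frames."""
    base_fps = displayed_fps / factor       # 1 rendered frame per `factor` displayed frames
    return base_fps, 1000 / base_fps, 1000 / displayed_fps

print(mfg_feel(200, 4))   # (50.0, 20.0, 5.0): looks like 5 ms frames, reacts like 20 ms frames
print(mfg_feel(200, 1))   # (200.0, 5.0, 5.0): a "real" 200 fps
```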
We aren't paying for his jacket anymore 😭
The price appears to be absolutely obscene though, over AU $4000 is a tough ask for a GPU in these tough economic times. We will see how well it does I guess...
There are lower end models for less
"The more GPUs you buy, the more money you save" - Jensen Huang
Let's be honest, you could absolutely implement MFG for the 40 series... complete marketing bs
technically yes, but i think the input lag would be massively different if you didn't have the extra architecture improvements to make sure the input lag with x4 frame gen wasn't much more than with only x1 frame gen. you can do MFG with the Lossless Scaling program from Steam but it's not as good in motion obviously.
You are not really bright, are you?
yeah, not sure about the other 40 series GPUs... but the 4090's specs already show it's capable vs the 5070.. it could run it, but they decided to lock it out to show a massive performance gain for the 50 series.
Should I upgrade ?
Mine is 4080 .. monitor 4k 240hz .. mainly for gaming
:O I love the overview 🤩🤩🤩some day I want a high end gpu from nvidia
And not a single one mentioning temps. I can only imagine that even in a well ventilated case you'd still hit 80-90°C just on the core itself, with memory and hotspot through the roof.
to me, I'm happier about the new driver update than the newest GPUs that are coming out
already happy with my 4080S bought last Sep, so hearing we're getting an update feels pretty cool
Video editing dream! The Founders edition is beautiful as well
1600+ euros for the 5080 and 2600+ for the 5090. The pricing on 50 series cards is RIDICULOUS. DON'T BUY, everyone, if you want things to get back to normal
Sadly fanboys will buy
@ if people don’t buy we can get 5080 close to 1K and 5090 close to 2K which is still ridiculous for the percentage uplift you get in real frames. Most people play competitive wtf are fake frames and latency going to help me with?
and i saw leaks that the 5080 will be slower than the 4090 in raster.... and it's still that expensive lol, 5080 should be like 800-900 if it's actually slower than the 4090 in raster
its 999 and 1999 LOL
@@UHD730 where?😂
would love to try these new features on some "older" games
will that be possible ?
I am waiting for the time when SLI will come back.
Same, Bring on MCM for gpu's!
Is there still a limitation on the number of DSC displays to only 2 for the 5090?
If I had a penny for every time Nvidia says AI, I would be able to afford the 5090.
Well time to wait 1 final generation before upgrading the 3060 laptop
We need RAW performance GPUs, not fake performance with AI or upscaling nonsense.
Just stick to 40 series GPUs from now on.
God bro you speak like a child
@@humble2246 There is something wrong in your head. All your comments have been ridiculously over-loyal and biased towards Nvidia. Nvidia is not perfect. No company is perfect. What the hell is wrong with preferring raw performance over AI? AI is far from perfect. Just let people have their opinions and stop with the cringey fanboy stuff. If they prefer raw performance, then keep quiet and let it be .
@@Gamer-q7v He's Just An Nvidia Fanboy. He Will Pay Anything To Get Anything Without Properly Knowing The Good Side & Bad Side Of 50 Series GPU's.
@@VengeanceVishwas Exactly. He's literally deranged in the head. That fanboy needs mental help.
@@VengeanceVishwas YouTube is playing up as always. Some of my replies only appear in newest.
make chat with RTX available on all RTX cards
all i see is EVGA, nobody can change my mind. 😮💨
Why is the power cable right in front of the "GEFORCE RTX" LED this time?? 🤦♂
I would really like nVidia to start acting like Apple with their hardware sales. Let us preorder now, and just give us an estimated arrival date. If you need to limit orders to 1 per email, great. But only nVidia can do something about all the scalpers out there including the ones that call themselves "stores". Right now, at NewEgg, the CHEAPEST 4090 you can get is $2,299, and that's from a non-major brand. Then there are "open box" cards for $2,499, so that's $1,000 over MSRP. If retailers and vendors refuse to sell things at or below MSRP, the manufacturer should just cut them out of the loop and sell exclusively to end users the same way Apple does. It's not like nVidia is benefiting from these scalper markups, as they sell the cards to the vendors at 20% below MSRP or less. NewEgg, Amazon, BestBuy, they're all doing these crazy markups or at least allowing it to happen through their sites.
Please nVidia, just let us preorder and form a line!
Rumor is that they'll be handling sales differently this time around but guess we'll know more once date gets closer. But yeah, agreed!
Anybody know when or if preorders go up?
Powerful hardware and useful software features like DLSS and Frame Gen become obsolete if the actual games remain unoptimized, which is highly dependent on the devs. DLSS is a feature, not a crutch for them to release unoptimized products. Hope the newer Unreal Engine 5 titles are not so demanding in comparison to how they look.
True
How optimized can games get before hardware upgrades are needed? Ray tracing/path tracing used to be done on massive rigs, but now it's relegated to a single GPU card ... Not saying I don't agree with you that devs are getting lazy AF with optimization, but at what point do you think the only solution is optimization?
you are possessed
Words of a poor man
@@bankaimaster999 As hardware gets better / a couple generations go by, RT is probably not going to be an issue by then. But as now is the transitional point of switching from rasterized lighting to full-on RT, I also don't have a solution other than baking in parts of RT alongside rasterization instead of raw path tracing to ease the process for the lower/mid end cards. What is your opinion on VRAM? Is Nvidia skimping on budget/mid tier cards by not giving them sufficient VRAM, or is it still okay for the base **70 series to have 12GB VRAM and the base **60 series cards (presumably) to have 8GB VRAM?
THAT "DLSS 4" is ON PERFORMANCE MODE???
omg wait...
The AV1 encoding really lets me see all the detail of this guy's ear fuzz.
LETS FAKE FRAMES AND CHARGE $2000!
Nice to see you, Jacob! Tell Jensen we need a Kingpin model! Shoot, maybe ask Vince instead lol
All I care about is 4k native, max settings, no DLSS, no frame gen, no ray tracing. Not one YouTube channel has tried that benchmark
review embargo is probably Jan 30, that's when the real benchmarks come from reviewers like Hardware Unboxed & Gamers Nexus.
raytracing actually adds a lot of quality and effects to the game and it's very intensive, it's not upscaling
Forget about native
@popkat1 high fps > ray tracing. Games are unplayable at 4k below 80 fps.
this is like the worst experience possible. just turn on pathtracing and use dlss and framegen. looks about 1000x better with more fps. native is dead. welcome to 2025
The 5090 looks so amazing!! Looking forward to buying one! I am finally upgrading my GTX 1070!
Now, do the benchmarks without any upscaling...
@@Mr.T-v4y it's 30% faster than the 4090 at 25% more MSRP... Plus more features, vram.
@@michelangelodrawcars5778 I wish they would demonstrate that.
@@michelangelodrawcars5778 technically around 12 percent more MSRP if we factor in inflation
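Roughly how a figure like that falls out, assuming the 4090's $1,599 launch MSRP and something like 11% cumulative inflation between the two launches (the inflation number is an assumption, not an official figure):

```python
msrp_4090, msrp_5090 = 1599, 1999
inflation = 0.11                                        # assumed cumulative inflation since the 4090 launch

nominal = msrp_5090 / msrp_4090 - 1                     # ~25% sticker increase
real = msrp_5090 / (msrp_4090 * (1 + inflation)) - 1    # ~13% in inflation-adjusted dollars
print(f"nominal: {nominal:.0%}, real: {real:.0%}")
```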
Let's save 12 mins of our lives
In a nutshell
Nvidia is saying 3 fake frames are better than 2 fake frames
So u gotta pay more way more
RTX 6090 will come with DLSS 5, with 1 real frame out of 8 😂
9090 will be 100% generated with ai
@ Yeah, and NVIDIA will say the RTX 9090 is 16x faster than the 8090, but in AI, because they put more and more AI frames into every DLSS
@@Haydos the moment that happens is when everybody turns to amd and intel
aannd? what's your point? this is most likely to happen. and it's awesome!
@@jackmills7758 nah they are doing the same thing.
Will there be 5080 super with added memory?
ok cool but show raw performance instead of photoshopped frames? :D
You, and anyone else with the same thinking here, have no idea what you're talking about.
Playing in native res makes zero sense nowadays and in *most* cases DLSS will give better results.
The paradigm isn't the same anymore - building the hardware to allow AI more leverage to predict and display > 'raw power' is the right move; and while the frames being inferred is technically correct, 'fake frames' as stated here is a misnomer brought about by a lack of understanding.
It's like saying a car that uses AI algorithms (mechanical hardware with software) to intelligently adjust the engine for better mileage and power is shite and doesn't actually account for anything - see how silly that sounds?
Unless a developer isn't doing their job right, the hardware in a 50-series card *in concert* with software will enable you to play that game at massively higher frames at high-to-max graphics with all RT features, etc. turned on *and look as good or better* than it would at native resolution.
@@Atreyuwu tell me you dont know what youre talking about without telling me you dont know what youre talking about
@@Hugs288 yeah, you don't know what you're talking about.
@Hugs288 tell me you're a sheep who just follows garbage tech tubers with zero understanding without telling me
What about the option to run a webcam in nvidia overlay?
We want the performance with DLSS Super Resolution.. and not multi frame gen!
Since this uses LM, will we be able to disassemble it to water cool?
This makes me shed a tear down my leg. 😅
What's the difference between this one and the laptop 5090 chip in real-life practical use?
Eh, I'd gladly take a card for review )) I'd have advertised it better ))
When do the enhanced features launch for previous gen???
Hey, nvidia. If you claim that 5070 is as fast as 4090, maybe then I'll buy 5070 and you will swap it to 4090? Deal?
The 5070 will be a 3090 I guess, but with more features
When you say full rt are you using path tracing? And I'd like to see the fps without dlss4 so I could fairly compare apples to apples with the 4090s pure power.
Why is everyone so upset about 'fake frames'? Look at the bigger picture: if the choice is between a $10k GPU that's the size of a PC and a $2k GPU (which, by the way, is only meant for a small percentage of players), I'd know which one to pick. Most of the people complaining aren't even planning to buy it, and honestly, most wouldn't even notice the difference. OFC there won't be HUGE 70% leaps every 3 years; that's normal. People are just too spoiled these days. And yes, if u spend money on a GPU above even 750 dollars, YES you are spoiled.
No it's not. They are price gouging and using deceptive practices.
@@SonyJimable Will u buy it?
We have all the right to complain about misleading performance "improvements" when they literally just locked the 40 series to a 2x framegen and allow 3x and 4x on the 50 series. Back then, they said Ampere couldn't do framegen because it lacked optical flow accelerators. Fine. But now they lock Ada to 2x when compared to the 50 series. Isn't this just forced obsolescence? Or are we supposed to believe that optical flow accelerators got a 3x generational uplift?
@@jamesmuking The thing is they clearly said it was with AI and impossible without. Plus everyone already knows this, yet the same people still complain as if they were misled.
Most of the people howling in here aren't going to buy this GPU anyway. Just salty, ignorant little children.
Can a card with this cooler design be mounted upright (i.e. IO panel facing up/down) without a negative affect on temperatures?
Yes! It was designed to be able to work in various orientations.
@@GeForce_JacobF Thanks for the reply. Hoping to buy one now!
@@GeForce_JacobF I don't believe so iirc. Vertical mounting has issues
11:05 that looks even more uncanny than before, truly horrifying and unsettling to look at
No way - it's awesome.
Think ill keep my 4090
Since the launch of the 2080 Ti, I've had trouble getting my hands on a Founders Edition card at MSRP due to bots and the overwhelming demand. Maybe Nvidia could consider introducing a loyalty program for long-time customers like us, who have been with the company since the GeForce 2 era or even earlier. This program could provide us with a genuine opportunity to purchase these cards. Personally, I've purchased 18 generations of GPUs, mostly the top dogs, and many other older nerds and I have remained loyal to the company even when it wasn't as successful. Despite the loud fans of the NV30 series or the poor AF without AA of the G70 series, we still chose GeForce products and helped make Nvidia what it is today.
Same. I've had money set aside for something since my 3080 Ti, and have yet had a chance to get one. Nvidia simply does not care enough about their actual customers to put in any proper effort to combat it, because they're making money either way. It's honestly ridiculous at this point.
they don't care about you
stop pretending that corporations are your friends
Now I can finally play rtx and immersive portals in Minecraft and my house won’t explode
I'm a big fan of DLSS and FG. Can't wait for DLSS4! 😁
Uhm, those aren't real frames. Cuda cores are a gimmick! Only real gamers use native res. Anti aliasing is fake!
yeah yeah, can it run tarkov in 2k or 4k full ultra settings?
This looks incredible in every way. Size, performance, and overall design. Can't wait for January 30th.
What's the raw performance vs the 4090? 2x? 1.5x?
1.3X
@laszlozsurka8991 so little lol
I can't wait to get this in South Africa! Only 10 years and this will be mine!
Well best of luck, but there will be better options in the future
Please tell me this will still be available in the non-"app" old control panel?
I like how everybody is upset with DLSS 4 and MFG, but guys, that GPU still has a solid 30% rise in native compared to the 4090. Also nobody forces you to use these SW AI features; you can use native settings. It's not Nvidia's fault some games are so demanding, and it is usually only a problem at 4K resolution with path tracing ON. Path tracing is still more of a prototype technology, like ray tracing was when RTX 2000 came out. Anyway, you can still have a nice solid 90 FPS in 4K using only Frame Generation 3x without DLSS, or 60-70 FPS without FG and with DLSS Q. That's great; a few years ago you would have been glad for these stats on 2K monitors. Plus guess what, the 5090 and also the 4090 are mainly built for work PCs and workstations, which is why Nvidia now concentrates more on AI and SW than HW, because the world and companies are very hungry for AI computing chips. I am sorry, but for Nvidia, gamers are in the 2nd row; in the 1st are companies like Google, Meta or Microsoft. Nvidia makes much more money from them than from sales to gamers, and still they care about us. I am glad for MFG because it gives a massive FPS reserve for the future; you will easily be able to use RTX 5000 for 5-10 years without big problems. I am still using a GTX 1070 which is 8 years old, and true, I need to replace it. But if it could generate even 1 frame like the 4000 series, I could have used it on full HD monitors till now without big issues.
Amen
+30% more performance, but also +25% price, and +28% more power draw
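Taking those three percentages at face value, here's what they do to value and efficiency (just ratio arithmetic on the figures in the comment above, not measured data):

```python
perf, price, power = 1.30, 1.25, 1.28   # relative to the 4090, per the comment above

print(f"perf per dollar: {perf / price - 1:+.1%}")   # about +4%
print(f"perf per watt:   {perf / power - 1:+.1%}")   # about +1.6%
```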
I agree, however, then why are they not showing native performance? If we are honest, the only reason is that they know it matters most. Like, why lift the 5090 embargo first, the card we all know is fastest so it doesn't matter like the rest do, while the 5080 embargo was originally set to launch day and they only let it come out a day before? I don't get hiding performance until right before launch. Personally it puts me off, not knowing sooner.
@@jmangames5562 Because the 4090 would still destroy the 5070 in native 😅. But I am more interested in comparing the 4090 with DLSS Q and FG on against the 5070 with DLSS Q and 4x MFG. If the performance were the same in that case, it could be true that the 5070 has the same FPS power as the 4090. Nvidia keeps showing that the 5070 is comparable with the 4090, but we don't know if the 4090 is in native or has FG and DLSS on; for the 5070, it's certain they're ON.
@@justinlime69 Has like-for-like power draw been tested and released yet? Theoretical TDP is about that much higher, yes, but actual use? Have never seen my 4090 get even close to 450w, for example.
The AI TOPS number was given in FP4, which isn't commonplace for developers unless building a product. Most users would be using FP16 for their models.
Given the number on the slide, wouldn't the general use case of FP16 make the AI TOPS count 838? Or was that not spec'd at any point?
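Rough check of that 838 estimate, assuming tensor throughput roughly doubles each time precision is halved (the usual pattern for these spec-sheet TOPS, though dense vs. sparse figures can muddy it). The FP4 number here is just the one implied by the comment above (838 × 4), not a quote from the slide:

```python
fp4_tops = 3352            # assumed headline FP4 figure (838 * 4)

fp8_tops = fp4_tops / 2    # ~1676, assuming 2x throughput per precision halving
fp16_tops = fp4_tops / 4   # ~838, matching the estimate above
print(fp8_tops, fp16_tops)
```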
I want real frames; I'm keeping my 4090 as I prefer my raw performance
Better turn off all dlss, all raytracing, and somehow turn off cuda cores because they aren't real!
Well no shit. I didn't buy a 4090 and overclock it to use dogshit DLSS upscaling and frame gen. I bought it to play 4k native
Dude, 4090 is plenty good and will be for years to come. If I had a 4090, I wouldn't upgrade unless I really needed the 32 GB of VRAM.
Can I buy this exact model at MSRP? I don't want third party, I don't want to overpay scalpers.
I see many upset commenters. No one forces you to buy; go for the 4000, 3000, 2000 series, or even the 1080 Ti if you prefer.
Regardless of raw raster vs MFG, still will be a beast of a card from a 4090. If I can get anywhere near close to double Blender rendering again that'll be great (rumor currently is ~70%).
nobody cares about DLSS, show us the raw performance comparison
You don't care because you're poor and don't understand technology. Let's be real here