It's technically not, because the "fps counter says so". But in reality, it's not. It's bending the rules/gray area/smoke and mirrors, whatever you want to call it. BS. LOL
When Jensen publicly stated that the 5070 will equal the 4090 in performance, he very clearly implied that all the 'crutches' should be enabled (upscaling + multi fake frame gen). Even he knows to (try to!) avoid outright crossing that line. HOWEVER, yes, I STRONGLY believe he DID cross that line, simply because the 99% of buyers NOT 'in the know' will falsely believe the deliberate marketing headline misinformation... that the 5070 = 4090. --> PLEASE let there be a class action lawsuit... if only for the bad publicity Nvidia will get!
So what I'd like to understand about frame gen is this. Let's say your card displays three frames in the following order: First it displays 1 actually rendered frame. Then there's a generated middle frame, then the next actually rendered frame. Doesn't the card have to know what the second actual rendered frame is, in order to generate the middle frame that links the two rendered frames? And if that's the case, then wouldn't the game feel like it's operating slightly behind reality? Because it would need to render the second frame before creating the middle frame, right?
Partly, yes: it does hold back the newest rendered frame, but it also takes a lot of data from the game engine, like motion vectors, to generate the new frame. Latency is also introduced because the GPU has to "pace" the frames, putting out the rendered and generated frames at smooth intervals. Frame gen in my experience increases latency quite a bit compared to native; 60fps usually has at least 10ms less latency than 120fps with 2x frame gen. Also the new 3x and 4x frame gen add even more latency on top of 2x and each other. I don't think it will feel really good unless you start from high fps, making 3x and especially 4x pretty useless, since by then you will be out of the refresh window of many monitors. They are also updating Nvidia Reflex to lower latency; that's using a trick that VR headsets use, basically decoupling the input from the rendered frame and shifting the frame in your predicted movement direction, like an overshoot of your actual movement to compensate for the lag.
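For anyone trying to picture why that latency shows up, here is a rough toy model of 2x interpolation-style frame generation (my own illustrative sketch with made-up timings, not Nvidia's actual pipeline): the GPU renders at 60 fps, the newest real frame is held back slightly so a generated frame can be slotted in front of it, and the output cadence looks like 120 fps.

```python
# Toy timing model of 2x interpolation frame generation. Purely illustrative;
# the real DLSS pipeline differs. It just shows why slotting a generated frame
# in front of the newest real frame adds latency even though output FPS doubles.

RENDER_FPS = 60
FRAME_MS = 1000.0 / RENDER_FPS            # ~16.7 ms between real rendered frames

def presentation_schedule(rendered_frames: int):
    """For each real frame N, present the generated in-between frame first,
    then the delayed real frame, paced half a render interval apart."""
    events = []
    for n in range(1, rendered_frames):
        t_render = n * FRAME_MS           # real frame N finishes rendering here
        events.append((t_render,                 f"generated frame ({n-1}->{n})"))
        events.append((t_render + FRAME_MS / 2,  f"real frame {n} (held back ~{FRAME_MS/2:.1f} ms)"))
    return events

for t_present, label in presentation_schedule(4):
    print(f"{t_present:6.1f} ms  {label}")

# Output cadence is ~8.3 ms (looks like 120 fps), but every real frame reaches
# the screen ~8 ms later than it would without frame gen - roughly the latency
# gap described in the comment above.
```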
it's basically just the image smoothing filter on TVs that Scorsese told us to turn off. My console playing friend figured this out years ago, "who needs a gaming PC when I got Image Smoothing?"
I turned frame gen on once and noticed that it screws up movements, shadows, some parts of the UI and markers and other shit....never again, that thing can fuck off. Raw 60fps is all I need. It also feels...weird. I know there's more input lag but instead of there being no new frame, there's wrong frames, which feels much worse than there being no frame at all.
I usually have FG off if my 4090 can actually do 4K 120fps+ with or without DLSS3 quality, or even somewhere around 80-90fps in the most demanding games. RT is killing performance, and bad optimization of games is even worse, like the recent UE5 games. In some games it changes the game to look visually stunning but run like crap, and vice versa. When FG is on, the input lag is noticeably higher because the base frame rate is lower. But this can also be affected by the game engine creating even more input lag, which can/will happen on any current and future AI card like the 5090 and so on. The only reason to have FG is when using RT, which is the performance killer on any card. Yes, FG will always feel smoother, but input lag will feel worse, and Reflex 2 will not help as much because games are more demanding, giving a lower base frame rate, which in turn gives worse FG frames.
@@TheEdmaster87 apparently reflex 2 is supposed to be able to accept mouse inputs even on generated frames, with something they are calling 'frame warping', thus 120 fps should feel like 120 fps instead of 30 fps.... obviously needs to be tested though.
We all remember the claim of a 4070 being three times faster than a 3090 and how that turned out. At this point anything Nvidia says cannot be trusted. However, without wanting to sound arrogant, it's not meant for us so to speak; it's for people who hear "5070 as good as a 4090 for such a cheap price", and that is all they will ever hear, and they'll go and buy it. While Nvidia may be an untrustworthy company, they are not stupid, and they know it will work on enough people to make a fortune.
If it works it works. If it provides a better gaming experience than previous gens, and the competition cannot top it, why would I care? It's about graphical fidelity for gaming. People seem not to understand that the cost of graphics cards isn't just hardware, but the billions they spent in R&D for their tech.
I take your point but it is hardware accelerated. As in you won't see the same improvement on a 40 series that you will on the 50 series even if they're both running DLSS 4. Hardware improvement for sure but that hardware doesn't actually improve raw performance at all.
Ohh they can do it in hardware. But why would they do that? They are greedy and saving money by propping it up with software. From a business standpoint it's amazing, but as a consumer you're essentially paying just for the small bump in raster and the big push of frame gen.
Graphics cards come out too often. Just like smartphones. It’s very scammy. We have like 3-5 games on the market that need a high end GPU. Everything else is pretty mid.
No, no game should need an RTX 4090 or 5090 just to play at 4K, yet there are games where even the RTX 4090 can't do 4K or even 1440p on max settings. It's called ASA, and it's only the beginning; next gen games are going to be playable only with DLSS/FSR from now on
And can you tell the difference? The whole point of this AI is to deliver the same graphical fidelity more efficiently, leading to less use of actual hardware and power..
Then invent a new kind of semiconductor to replace silicon with. That one has been hitting the wall for a while now. The reason they're jumping through hoops and doing these tricks is because you can't just pack twice as many transistors onto a chip at the same cost but half the power draw and heat emissions every 18 months anymore.
i don't understand, wouldn't the frame stay there until the new frame comes in anyway? what's the point of the generated frames? are they different from the original frame or the exact same??
I don't like that all GPU R&D is going into generative AI instead of raster performance like it used to. Maybe I'm too stuck in my ways but I prefer real rendered frames.
I think the reason for this would be because GPUs would be insanely more costly than just working on the software side, but this can only happen because AMD is severely lacking and can’t compete, so Nvidia just has free rein to charge whatever and not innovate.
@@theanglerfish I’d imagine it might get to the point where it feels like bad hit reg, where you are clearly hitting the character but it doesn’t register, because the frame you're seeing is 3 frames old……😅
Sad fact is we are reaching the limits of shrinking silicon, and within a few generations we will hit the limit and not be able to increase how many transistors we can fit onto silicon. That is why AI and software improvements are so important. Until we can switch over to something besides silicon, like graphene or living-cell computers based on human brain matter, AI improvements are all we've got.
And if the 4090 could also run multi-frame generation, it would shut down the 5070 based on the numbers we've seen so far. Which is why they will not let 40-series owners enable it
So about 27% better performance (going by pixel counting for Far Cry 6 on the poorly-labeled bar graph) than the 4090, and it only consumes about 28% more power. Huge gains Kappa
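Taking the commenter's pixel-counted figures at face value (they are estimates from a marketing chart, not benchmarks), the efficiency math works out roughly like this:

```python
# Perf-per-watt implied by the estimates above; the 27%/28% figures are the
# commenter's pixel-counted guesses, not measured results.
perf_gain  = 1.27    # ~27% more FPS than the 4090 (from the Far Cry 6 bar)
power_gain = 1.28    # ~28% more power (575 W vs 450 W rated board power)

print(f"Perf per watt vs the 4090: {perf_gain / power_gain:.0%}")   # ~99%, i.e. basically flat
```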
When competitive gamers / streamers start using this and find out real fast that something is wrong, the viewers and community at large might finally figure it out.
@@britainvernon9286 Responsiveness problems when playing competitive. Like the video said, it does not feel as fast as those fake frames make it look. The feel of the play is not there, and when playing esports with that, it will be a problem.
It actually sounds like they tried to mitigate that with additional warping based on your camera movement, taken from keyboard and mouse input. Don't know how successful they were, but it might feel almost like server lag, rather than being obviously from the AI frame gen.
Thanks for all you do, Phil...! I usually enable frame gen when my desired detail level falls below 60fps. Once it gets back over 60, it appears smooth to my one good eye.
I LOVE this explanation. Super clear and helpful for how these will actually work. As a 4090 owner who was also burned the generation before- it’s good to know the generational truths. That said- I would love your thoughts on when gaming hits a wall. We’re now firmly in cards for 4k, as none of these higher cards are needed for 1080p workloads… so when will an entire generation just all be vanity purchases?
I do not even watch live reveals or reveals in general because it is mostly just bs marketing. I wait for reviews from multiple content creators before I decide.
Around 30% increased performance for what, 40-50% more in price? But again, if you have a 4090 and are looking at the 5090, you should wait a few generations before upgrading.
First the game devs use DLSS & Frame Gen as a crutch for bad optimisation, now nVidia is doing the same. I'm shocked. With "multi-frame gen" the game devs will only get lazier. 🤦
I want to see what Reflex 2 does. Changing it to an asynchronous timewarp type function with infill means that game responsiveness could actually be matched to the internal frame rate, not what the render queue is providing. Then, MFG could provide visual fill while the game responds at a high internal frame rate, even if the renderer is spitting out a relatively low number of frames.
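For anyone who hasn't met the idea before, here is a bare-bones sketch of what "asynchronous timewarp"-style reprojection means (just the general concept, not Nvidia's Reflex 2 implementation): before a frame is shown, shift the most recent rendered image by whatever camera movement the newest mouse input implies, and fill the uncovered strip somehow.

```python
# Minimal sketch of camera-based frame warping (asynchronous reprojection).
# Not Reflex 2's actual algorithm - just the concept: present an old frame
# shifted by the camera motion that happened after it was rendered.
import numpy as np

def warp_frame(frame: np.ndarray, yaw_delta_px: int) -> np.ndarray:
    """Shift the image horizontally by the yaw accumulated since it was
    rendered (expressed in pixels here for simplicity) and crudely infill
    the strip that the shift uncovers by smearing the edge column."""
    warped = np.zeros_like(frame)
    if yaw_delta_px > 0:                         # camera turned right -> image slides left
        warped[:, :-yaw_delta_px] = frame[:, yaw_delta_px:]
        warped[:, -yaw_delta_px:] = frame[:, -1:]
    elif yaw_delta_px < 0:                       # camera turned left -> image slides right
        s = -yaw_delta_px
        warped[:, s:] = frame[:, :-s]
        warped[:, :s] = frame[:, :1]
    else:
        warped[:] = frame
    return warped

# Usage: the renderer finished `frame` some milliseconds ago, but the mouse has
# moved since; the shift amount here is a hypothetical value for illustration.
frame = np.random.rand(270, 480, 3)              # stand-in for the last rendered frame
presented = warp_frame(frame, yaw_delta_px=24)   # warp by the latest input before presenting
```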
I'm betting on yes, because otherwise the prices of AMD vs Nvidia stocks wouldn't be so massively different. AMD fumbles presentations currently, but delivers decently in price/performance. Nvidia is stellar at inserting marketing BS in presentations and delivers better RT performance + the entire ability to use AI, which afaik just doesn't run at all on AMD from what I've read on many GitHub AI repos. The last part of that is definitely not enough to drive their stock so goddamn high. Marketing BS has to be a reason why they are doing so well. They're professional liars. Never mind that they didn't innovate the actual hardware AT ALL. They increased the die size by 30%, the TDP by 30%, and they're getting 20-30% better performance than last gen... shocker.
Every Nvidia flagship GPU has had a monopoly on the market, because AMD hasn't existed in this segment for more than 15 years hahahaha. Even the 7900 XTX was only on par with a 4070 if you turn on ray tracing in both...
@@diomedes7971 Once you realize the market isn't a collective of sophisticated finance gurus and actually just a crackhead casino where barely anyone reads about or understands anything, it makes so much more sense
@@diomedes7971 they didn't fool anyone. Idk what "firms" you're referring to. If they're AI-centric firms they pretty much depend on nvidia and if they're not I don't know what they have to do with this unless they're financial firms. If they are financial firms they don't need to be fooled to invest in nvidia, they just need to think their stocks will go up.
yeah, like what was he yapping about? am i poor or something? because i "only" have a 2000€ PC at home, and not a fully gold-covered, watercooled PC? wtf is wrong with this dude :D
When you take all the fake frames and fake resolution out of the way, the generational improvement is almost non-existent. Especially when factoring price and power requirement increases. That and a 70 class card still having 12GB of RAM in the year of our lord 2025, even though it's GDDR7, is ridiculous. Consoles are starting to have more available memory than that.
Yes, Nvidia purposely set the 5090 to 32GB of VRAM and 16GB and 12GB on the lower end 5080/5070, which is so 2020, just to make the 5090 look like the ultimate enthusiast card and make people believe 32GB is the way to go. But adding that much VRAM just keeps the price up, not down.. I have the 4090 24GB and never came close to 20GB usage at 4K gaming maxed out, but I can imagine, and I know, that 12GB is not and will not be enough at the same settings, so that's why they are relying on AI to do the work... but the prices are out of this world and I believe it's because of AMD not following up.
@@Vennux Friend, I'm talking about total available memory, not overall performance. It's obvious that a top-end gaming PC is far better than a PS5 overall, exactly because a console is already old technology by the time it comes out. However, the PS5 has 12.5GB of addressable memory for developers. The PS5 Pro increases that to 13.7GB. Even if it's the slower GDDR6, that's still more than a 5070 has. Since consoles are usually the lowest common denominator when developing games, that means your fancy 5070 might already be memory-capped while trying to play today's more demanding games, let alone in 4-5 years. Nvidia is doing with memory size today what Intel did with 14nm a decade ago. Lack of competition lets the company slow innovation so they can always have some 5% improvement to sell each generation, even if they could do more.
@@TheEdmaster87 I get you. My 3090 also is more than enough for most things and I won't be replacing it anytime soon. 24GB VRAM is more than enough for almost anything nowadays and most AAA games are so unoptimized that even a 5090 will need upscaling and frame generation to play the most demanding stuff at reasonable framerates. I don't like that and don't want it to be the future of gaming.
You missed one thing: in combination with Reflex 2, the generated frame includes the mouse and keyboard information at that point in time. So the game can potentially be more responsive. I am curious how the reviewers will rate this experience.
I found it interesting that in one part of the keynote it was mentioned that, in one example, only about 2 megapixels were actually being rasterized, with DLSS/MFG producing the rest of the pixels. Good to see through the hype.
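For a sense of scale, here is my reconstruction of where a "~2 megapixels rasterized" figure could come from, assuming 4K output, DLSS performance mode (1080p internal render) and 4x multi-frame generation; the scenario is an assumption on my part, not an official breakdown.

```python
# Rough pixel budget behind a "~2 megapixels rasterized" style claim.
# Assumed scenario: 1080p internal render, upscaled to 4K, 4x multi-frame gen.
rendered_px  = 1920 * 1080          # ~2.07 MP actually rasterized per real frame
output_px    = 3840 * 2160          # ~8.29 MP in each displayed 4K frame
frames_shown = 4                    # 1 upscaled real frame + 3 generated frames

displayed_px = output_px * frames_shown
print(f"rasterized per real frame: {rendered_px / 1e6:.2f} MP")
print(f"displayed per real frame:  {displayed_px / 1e6:.2f} MP")
print(f"share produced by DLSS/MFG: {1 - rendered_px / displayed_px:.0%}")   # ~94%
```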
Literally same. I heard those words and I spit out my drink laughing. I even rewound to listen again, because it sounded so outlandish. But... wow. He actually said what I thought he said. My next words were "BuuuuulllllllllSHIT". And it was loud enough that my roommate texted me to quiet down lol
so i need some help here.. i don't understand anything about PCs and i asked a friend to build one, and he told me to wait because of this new 5070 that will supposedly do the 4090's work way cheaper, but i see people saying it won't come anywhere near that and it's all rubbish... what should i buy?
this is not the case from what people who had hands-on time are reporting, they say there was a 2ms increase between the original frame generation and the 4x frame generation
Note for Phil: latency is not the same with and without FG. They always enable Reflex with FG and had it off when FG was off. Hence the similar latency
Actually the new Reflex at least makes sense. The old solution just updates your current frame, which doesn't make much sense, but purely in terms of numbers it lowers latency. Meanwhile the new Reflex, or Warp or whatever they called it, also accounts for your in-game movements and then updates your frame. So this one could actually be useful; maybe it will improve responsiveness and 120 FPS generated from 30 FPS will feel at least like 60, who knows... Can't wait for benchmarks to debunk all those things.
I noticed that the PC latency in the LTT video comparing the 4090 to the 5090 was nearly identical. Meaning that the performance uplift was almost entirely based on DLSS 4. This really implies that the raster performance might just be the same or similar.
rewatch his vid and go to 5min30 and set playback to 0.2x speed. That's where he flips the camera around standing on the stairs, and the amount of ghosting from DLSS not adjusting fast enough to the shifting perspective is bad. Can't imagine what that looks like playing a high-paced shooter yourself.
@@ronniebots9225 you know that recording a monitor and then uploading it to youtube heavily impacts that too, and the fact that youtube isn't 240 fps either... No Nvidia fanboy here, just saying what you say has a flaw too :)
But what's with Reflex 2? Supposedly, it should be able to warp the frame to the new camera position during Frame Generation and infill the holes. Theoretically, it should be possible to run the G-Buffer at 240 fps and image rendering at 60 with both technologies. That's kinda the whole point of the new Frame Generation tech, as to NOT increase lag.
So essentially, since I have a 144 Hz 4K monitor: provided FG can boost me to 144 Hz when my GPU can't already do that, there will be no benefit whatsoever in going beyond that, because latency isn't improved at all, and going beyond the refresh rate of your monitor only benefits you if latency can be improved.
4:58 " it does look visually smoother " it's all just an illusion. I've never bought into the whole DLSS/frame gen/up-scaling rubbish. just give me native FPS/resolution, all the rest is just hyped, fake nonsense. besides anything above about 120hz/fps is pointless, your eyes can't really perceive much above that.
I agree with you, but not with the 120hz part. It varies from person to person, but most people will notice a change. But let’s be honest, my rig can’t even play the newest games on highest settings at 60 fps without all this bs AI
so let me get this straight, it looks like more fps but feels like native? so if i get under 30 fps, but with frame gen, the visuals will improve but it will still feel like low fps??
My 4090 with no DLSS enabled, at my monitor's refresh rate of 4K 144Hz, all max settings, runs at 144 FPS all day long without breaking a sweat. No upgrades needed this time.
on older games maybe, but on the new games it can't. there are people still doing videos about the 4090 and 4080 and you can tell that devs don't care about the games anymore.
More like publishers don't give devs time to finish the product. The most unoptimised games come from shovelware/asset flip devs that don't care at all about anything, you're right about that. And then there are the massive publishers like Ubi and EA that don't give the dev time needed for games... Not saying smaller publishers don't fall into the same trap, but still
@@4KGameScape Because if they sold their $1600 4090 for... let's say $1200 to entice a buyer, and then bought a $550 5070, they would then have $650 to spend on whatever they want while still having the performance of their old card. It's a profit play. I'm not defending that a 5070 is as good as a 4090, I'm just telling you the mindset of why a person who thought the cards were equal in performance would sell their 4090: For profit.
The simple question that needs answering is: does it actually deliver higher fps? The answer is no, which is absolutely insane for that price tag. You're pretty much buying a slightly larger last gen GPU and fancy software. It's bonkers. All while game devs act like optimization means soiling your pants in public, avoiding it like the plague. Once testers get their hands on the cards they will absolutely butcher them, and hopefully the Nvidia beast will have to step up its game or someone else will run past them.
They're still 20-30% more powerful than last gen. With DLSS turned off on both, the 5090 ran 26fps and 4090 ran 20fps which equals 30% faster. Numbers are hard, I know.
@@ShellShock794 Take a look at Vex's video. The real comparison should be the Super cards, not the standard 4070/4080. You'll get a decrease in performance on some features. Numbers can be hard, but apparently research is harder.
So, on a game and resolution that is limited to 144 Hz on a monitor, with DLSS 4 and 3 fake frames enabled, you could end up with latency equal to 36 FPS if the game is waiting for those 3 fake frames before it inputs its next actual frame? If the game could have been running at 60 FPS with no fake frames, then you're ruining responsiveness...
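That is basically the arithmetic, at least in the simplest reading of it (ignoring render time, Reflex and pacing details):

```python
# Frame-rate-cap math for the scenario above: a 144 Hz display cap with
# DLSS 4 multi-frame generation (1 real frame + 3 generated frames).
MONITOR_HZ = 144
MFG_FACTOR = 4                                   # frames shown per real frame

real_fps = MONITOR_HZ / MFG_FACTOR               # 36 real, input-sampling frames per second
print(f"real frames per second under the cap: {real_fps:.0f}")
print(f"time between input-sampled frames:    {1000 / real_fps:.1f} ms")   # ~27.8 ms
print(f"the same game at 60 fps native:       {1000 / 60:.1f} ms")         # ~16.7 ms
```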
Instead of focusing on generating new frames, I feel like the focus should be on how to make the game look visually better while making it easier to run with true frames. Aka upscaling in more clever ways so that each frame is easier to produce, so that the framerate increases because of that. Because I feel like having a disconnect between what's rendered on screen vs the game's actual responsiveness and how fast it's updating its physics etc is not good for gameplay.
No, it's not true that the 5070 is as fast as the 4090. Just like it was not true that the 4090 was the best investment ever, as the CEO said. That being said, the 4090 was... no, is.. a real powerhouse, I mean in a good way - a GPU beast. Thank you for another, as always, great video!
Shhhhhhhhh Phil I’m waiting for someone to list a 4090 on eBay for $500😂
If I get a 5090 on launch day I'll totally sell you my 4090 for $500 lol
@@WyFoster I'll
i'll take that deal
I'll give you $500 and 20pc chicken nuggets with fries 😔
I’m looking at the prices on eBay now; people are crazy, still wanting $1500 for their 4090s !!!!
@@Ascari9244 You drive a hard bargain sir 😮
The fact that the comment section is disabled on the nvidia keynote vid says enough.
If you needed to hear the same Nvidia comments and jokes then just click on every GPU related video ever.
No not really. They are presenting a technology update and positive press is important. There are so many needlessly negative comments around - just look at the comment section in this video for starters. It's clear that Nvidia are going all in on AI. I can see in the not too distant future that we'll depend on AI to run the latest games, and pure rasterisation will be like old-school PCs without a dedicated GPU.
Most rich people or companies cannot handle criticism, they have fragile egos.
They love censorship.
@@dtrjones "needlessly"? No, no - NOT sinking into 10 more years of stagnation, hardware requirement bloat and software slop because "the AI will fix it" is very much needed.
This exact attitude is WHY people are negative, and they should keep that negativity in mind when it's time to shell out money.
Oooof I didn't even notice that. That's funny! 😂
Still, tons of reels/shorts about the death of the 4090 are already flooding the internet. Casual gamers see “5070 is as powerful as 4090” and that's all. They won’t fact-check.
I see the 5070 being as powerful as the 4090 because after some time, the 4070 eventually became as good or better than the 3090
Don't correct them. I'll happily buy their cheap 4090 when they upgrade to a 5070
@@jordanmntungwa3311 The logic lmao.
@@MageLeaderInc The type of person to buy a 4090 is not the type of person who's going to "upgrade" to a 5070. They'll upgrade to 5090 or 5080/5080ti etc
@@MageLeaderInc indeed, the 5070 being as powerful as the 4090 is very impressive and everyone should buy its stock. Sell the 4090, you don't need that outdated trash. $549 is a bargain.
Im so excited to see AI generated frames on my monitor playing my AI generated game while on a Discord call with my AI generated girlfriend
truly living the dream
Lol everything is AI
Instant realisation, how can I trust you are not an AI??🤨🤨
whilst you are commenting on an AI generated channel 😁
And you won't own any of it. You will rent your entertainment hardware, you will subscribe to discord and your girlfriend.
@@longaugust sounds like the pro i hire at night😁
So, that's like saying that they make a 4-cylinder car's exhaust sound like it has 16 cylinders, but it still runs like a little 4-banger.
4 cylinders, 16 valves; if you focus on the valves you'll forget about the cylinder(s).
Sounds like an exhaust leak to me lmao
makes me think of those exhaust mods that end up accidentally sounding like a wet fart machine lol
So only 1/4 is actual in-game rendered footage and the rest is an "AI" interpretation 😐
Perfectly put 👍
Another one is the old phrase "putting lipstick on a pig": sure, it looks better, but it's still a pig
Is anyone surprised that a company would say something is better than it actually is?
Nvidia is a special kind of slimy tho
Well, intel undersold the performance of the b580. Nvidia is the king of the market right now, they have no need to obfuscate.
Safe and effe.... I mean faster than the 4090!
@@Plutonium239MXR the B580 is worse than a 4060 when paired with a 5600 instead of the 9800X3D they used in tests
Nope! 😛
"Rtx 5070 with rtx 4090 proformance for $549" press X to doubt
I don't think anyone actually believes that, given they are open with its specs and we can do a side by side comparison.. But if their smoke and mirrors tech improvement actually delivers a experience close to their claims while having no major draw backs.. I really could not care less.. And for the games out there that may not have it, 99% of them will be a non issue for performance with any moderately new card.
At the end of the day I care about my experience with gaming. I really do not care what smoke and mirrors they use as long as it's not a detriment to my experience.
lol the 5070 can't even beat the 4070 Super in most games that don't support DLSS 4
@@shabpnd481 Ok, so basically every game that can be easily handled by both cards at max settings anyways..
X
@@shabpnd481 pretty much. said the same and got a lot of "hate" for saying it. but i guess everyone thinks DLSS 4 is gonna be in every single game they're gonna play too, props to the devs for patching every single game from the past 3-5 years lmao & to this day i encounter new games that only support FSR3 instead of DLSS, or rely on DLSS2. it will be a very interesting release. imo the 5070 is a rebranded 5060 in terms of performance, so they can sell an entry-level 5060 later on
"Hey guys, we've gotten a lot better at faking better performance" is what I get from it. I can't wait for the reviews to show what the actual uplift is for native resolution rendering.
100% this. I f*cking hate this post pandemic/AI obsessed GPU era. It's all just smoke and mirrors and a huge PR effort to mislead customers and obfuscate facts without actually lying to the point of being held legally accountable. The new 50 series seems powerful enough to not even need this shady, slimy, greasy, misleading used car salesman style marketing. Just give us raw raster numbers, and raw raster+RT numbers. Fake frames and upscaling can be addressed in a different segment.
me thinks that the fc6 chart is exactly that.
No more than 33%: when you look at Nvidia's pages for the 4090 and 5090, both have the DLSS section. On the 4090's page it shows 21fps native with full RT in Cyberpunk, and on the 5090's it shows 28fps.
But that's assuming both numbers are for the same version of the game and drivers. It's likely that 4090 has some old number that didn't take into account some updates/optimizations so that would reduce the difference between the cards.
You will have to realize this is the future. There is no more raw-processing-only approach; at least it wouldn’t make financial sense. Performance is performance. FPS are FPS. You can make a case for whether upscaling has the same image quality, that’s valid. But we are getting to the point where it's indistinguishable or even better. There is no going back. Accept it. Why would I buy a $3000 GPU that might give me real rendering when I can buy a $500 one that would get me there, at almost no cost to image quality.
@@dv5466 but when you're pushing 240fps while your GPU is only rendering 60fps... you're literally not gaining any performance... it just looks prettier. basically scamming users with smoke and mirrors, and fake frames. this is not the future, this is false advertising and lies and inflated numbers to make dumb consumers go "ooooh more eff pee ess is good"
what good is 240 fps when games will still react at 60fps. that's right... it's not.
The issue is that you'll still get better pure performance from the 4090 than you will from the 5070. so you won't get anything useful from saving the 2500 and buying the new gen. it's all fake frames with no genuine performance uplift.
Remember when games could render decent fps natively? Pepperidge Farm remembers
Game devs are lazy af nowadays bc the market allows it
@@BIOHAZARDCURE Part of me wonders how intentional it is. So Nvidia can sell their next line of GPUs
Yup. Gamers are such little Bs these days.. 20 years ago people would have rioted over this nonsense
What is this weird obsession people have that GPUs must render everything natively for it to be “real”? Basically every aspect of video game optimisation is layers upon layers of “cheating” to create the illusion of higher fidelity…
Even the human brain “cheats” by only “rendering” a tiny portion of your vision with full “resolution” in your FOV. Did you know you have a blind spot in your vision that your brain literally fills in with made up information? If the future of video gaming is primarily neural rendering, why does it matter what the original native performance is?
@@MarylandDevin You are so right - these games already almost demand AI bullshit to run decently, and with this next iteration generating more than 1 frame per actual frame, it's going to leave everyone who doesn't or can't run that tech in the dust.
- 4090, did you bring me that frame I asked you for?
- Better, I have a picture of the frame you requested!
- Ok... What about you, 5090?
- I brought you my entire photo album of your favourite frame... You love frames, don't you!
Much moar better
It is a drawring of a frame
If they want to sell me fake frames. I’m gonna pay with fake money.
Yeah, just put 3 fake dollar bills between each real dollar bill :D
Sup illegal person
All money is fake, it is a concept invented by society to quantify trade value...
It doesn't matter if they are fake or not, as long as your eyes feel better. If that makes gaming feel smoother.. why not, it's not a bad tech. BUT... do not sell on that! I'm talking to you nVidia. Like Phil said.. latency is also important.
crypto?
So you're telling me we hit a ceiling and are just faking it with frame generation to show advancement.
but DLSS has always been one of the selling points of Nvidia. For a long time now FSR and DLSS have been focal points in this tech space and have widely been accepted as the new norm
That's not what he is saying. He is just explaining the bs marketing. The ceiling got higher with the 5090.
@@chronozeta just not $400 worth of additional ceiling.
I've heard enough 'doing more is impossible' from NVIDIA's conference that I think they stopped trying and are going for low hanging fruit. Is it really a ceiling if you don't try?
@@jordanmntungwa3311 True, but when it gets to the highest end cards like the 4090 & 5090, their core audience there is the professional space: 3D artists, game asset modelers, & even pro gamers just to still count gamers in that market. The main issue is that you don't get to use DLSS in workload tasks, just gaming. This means that they are being disingenuous to their target demographic for the 90 cards. It's not wrong to say that the 5070 can game at 4090 frame rates when the right settings are enabled, but 4090 users in the professional space can't accept the 5070 as a valid substitute for their cards, and it's worrisome that some people looking to upgrade their professional workstation may purchase these thinking so, if they don't watch videos like this one.
Thank you for this video. People are eating this up and low balling 4090 sellers right now. Nvidia pulls the same thing every year. I remember the 3070 beating a 2080Ti as well.
Saw a guy offering to trade the new 5080 he will buy for a used 4090, and some people were responding asking for contact details lol
It was on Facebook, go figure.
It was on par with 2080 ti tho
It was definitely on par with 2080ti tho. Especially once drivers matured. What cut the legs off from underneath the 3070 at the end of its life was the insufficient VRAM. The core had the power to keep on going but lacked the memory capacity to feed it data, once PS5/XbX multiplats expecting 12gb started coming out.
@@rarespetrusamartean5433 Don't worry! The scalpers will get most of the ones the OEM's don't get so they'll all be $800 + by the time they are released.
@@РаЫо In most situations, the Nvidia RTX 4070 is considered to be roughly equal to or slightly faster than the RTX 3090 as well
On the NVIDIA diagram, there is '5070 = 4090,' but in the performance diagram, they compare '5070 vs 4070.'
I’m over here sitting on my 2080 Ti still, and to be honest, I’m still very happy with it.
If you dial back some settings you will have decent performance. I have the 3080 and will wait for the 6000 generation or get an AMD if the price/performance is right.
I'm still on my 2070 and modern games are starting to piss me off XD on one hand you get a few gems like that Robocop game that seems like it will run on anything but looks like it was rendered on a supercomputer and it's just stunning. Then you have 99% of the other games acting like everyone has a 4090 with unlimited vram. The RE4 demo kept crashing on me because I only have 8GB of vram! But somehow it used to be more than enough... Hell, even Cyberpunk when it came out ran perfectly fine on my computer with RT and everything, but now it's like a slide show with everything turned down or off because they decided to update it and require higher specs. All this "AI" stuff is just going to make games worse because they will be made with it in mind.
Well my 1060 is getting a bit long in the tooth 😅
I have a 3090 (24GB VRAM), I play at 1440p and do a lot of AI rendering in Stable Diffusion.
I was planning on upgrading to a 5090, but I can still run everything quite well with my 3090, and the funny thing is.. for AI and Stable Diffusion I don’t find the 3090 “slow” or anything. It’s still quite a good card for that too.
@@PaletoB replace it with any other AMD card like the 6800 XT, all the second-hand ones are still great; Nvidia cards are still overpriced even second-hand
I’m extremely confident that the 5070 will not actually be better than the 4090.
i doubt that even 5080 will beat 4090 in real battle
Agreed
@@lo0nyk with a 256 bit bus? No way
it will be ABOUT 20% better than the 4070 Ti Super or whatever the hell it's called, at least according to Hardware Unboxed.
lol the 5070 cant even beat 4070 super on most games that doesn't support dlss4
Frame generation and upscaling - the modern equivalent of smearing Vaseline over the lens or squinting.
Just like most of the other post-processing effects that actually make things look worse: depth of field, motion blur, chromatic aberration, lens flare, bloom, etc.
Now including frame generation, upscaling, and that thing where you render a game at higher than native resolution then downscale it to your display's actual resolution.
Or shooting air bubbles into the peanut butter to "whip" it (make it lighter per jar) and then charging extra
Okay Nativecels.
When it comes to graphics in games, everything is smoke, mirrors and shortcuts to achieve the result.
can we go back to MSAA it looked so much better than the smeary mess of TAA
I work in marketing and I fucking hate when companies do this. They should be sued for false advertising bc this is completely lying to consumers and is going to lead to many people getting burnt. Not to mention how bad they are with scalpers; the 5070 will never sell for anywhere near that $549 price. $1000 on the market if we’re lucky.
Where's the false advertising? Their performance graphs list all the enabled features at the bottom of the graph. In the presentation, Jensen said "with AI features enabled".
NVIDIA's marketing has been full of tricks since day one of the company.
@@clarkisaac6372 every company does the same thing. If you're going to rely on marketing and not your own research, you probably deserve whatever scam you fall for. You can't believe anything you read anywhere these days, why would you believe any company lol.
@@sketchyx8307 oh grow up. "Performance" in this context has a decades-long defined meaning, and "fake frames that are not based on user input" does not form part of it.
They just put the 4090 worst case scenario vs the 5070 best case scenario. That's not false advertising. It sucks, sure, but if someone gets burned by that, they deserve it; you never trust first party graphs, and you wait to see the real reviews and comparisons. To me, if someone gets fooled by an obvious half truth, it's their own fault for putting so much trust in a company's word, which is only there to make money and hype a product.
From my experience using frame generation in marvel rivals, when it says "80-120" frames but it FEELS like the actual 20-30 frames in responsiveness, it's terrible. It's a terrible feeling and the disconnect between visual frames and responsiveness is another layer of discomfort on top of the discomfort of it being at such a low frame rate.
It's honestly pretty funny, Nvidia's own website has videos showing the performance of both the 4090 and 5090 with and without DLSS. When you compare the "without DLSS" FPS you see that the 5090 is only a 30% improvement at most. In the videos, Cyberpunk 2077 gets an average FPS of 26.4 with a range of 25-28 FPS on the 5090 and an average FPS of 20.2 with a range of 19-21 FPS on the 4090. Granted, there's no way to tell how much the rest of the system each GPU is attached to affects those results, as the only other performance detail provided is that Full Ray Tracing is on for both, but since these are GPU performance demos released by Nvidia, I think it's safe to say the performance difference either way from the other components would be minimal.
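Using the numbers quoted in that comment (Nvidia's own demo videos, so treat them as rough), the native uplift works out to about:

```python
# Relative native (no-DLSS) uplift implied by the FPS figures quoted above.
fps_5090 = 26.4        # average FPS in Nvidia's 5090 Cyberpunk demo, full RT, no DLSS
fps_4090 = 20.2        # average FPS in Nvidia's 4090 demo under the same settings

print(f"5090 over 4090, native: +{(fps_5090 / fps_4090 - 1):.0%}")   # ~ +31%
```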
Im so tired of upscaling and frame generation
Why ???
@@yaxo11 Graphical artifacts. Old games are clear and sharp, new games not so much.
@@yaxo11 it makes any movement a blurry soupy mess but at least screenshots with no movement are pretty
well the entire industry is going there led by nvidia so it's happening.
@@redditredditredditredditreddit you should try DLSS instead of FSR
I miss the days of the 1070/1080/1080ti, when GPUs were GPUs and not the inbred offspring of ChatGPT and a rusty toaster.
what are you yapping about, modern cards crush those cards and the framegen and upscaling are literally just a bonus, the cards can render without them and don't rely on ai for basic operation. you sound like an old man blabbering about old times
@@HELLF1RE9 it's just pure emotional ranting, not really based in reality
@@HELLF1RE9 then explain why, without DLSS, the 4090 can be brought to sub-60fps @4K by games like Alan Wake 2, the new Indiana Jones game, Black Myth: Wukong? Game optimization aside, the card shouldn't need frame gen to reach 60+fps.
There's also the price point as well. That's my biggest gripe. Even if I had a spare $2k for the 5090, I'd have issues justifying that purchase.
@@HELLF1RE9 bro u don't get what he said like at all
@@Bruhngus420 honestly, it's more or less a gripe about performance relative to price point. The value of the 5090... let's face it, isn't that high. On the 10, 20, 30 series? The performance to price was good (the whole scalping bs aside). The 40 series? Not as much. But they're not going to move a ton of 5090s anyway. Their biggest sellers are going to be the 5060, 5070 and possibly the 5080.
If you go to the web page for the RTX 4090 and scroll down, you'll see Cyberpunk side by side with DLSS/FG off and on. 4090 got 20 FPS with DLSS/fg turned off.
The RTX 5090 page also has a Cyberpunk video side by side DLSS/FG off and on. 5090 got 26 FPS with DLSS/FG off.
Soo the RTX 5090 only got to 26 fps instead of 30; it's still bad, but I think people should understand that ~30 fps at 4K max settings with path tracing is super good. Before, it would take minutes or hours to render a frame like that; now you are doing it in a fraction of a second.
So 30% faster is the real number. Good info
20 FPS? Wait a minute…does this mean most of the frames are fake?
@@Name-ho4tt obviously lol they are fake frames
So you are getting 6 fps more for like $400 more on raster? 😂
Sooooo, does that mean your FP Shooters will look smoother but you’re gonna miss your shot because your location will be different than what you’re seeing?
If it's pure frame gen issues, then you'll miss due to input lag
Bingo
If you go to the 5:30 mark and watch at 0.2x speed, the amount of ghosting you see when spinning the camera around is striking. I really wonder what that will look like playing a high-speed FPS where you are constantly spinning the camera in all different directions.
No, it generates frames in between 2 render frames.
I think your game will react to your mouse input with higher latency. I find it unplayable at 60 fps.
I'm against fake frames.
same
SAME! Always hated it.
I remember when they added DLSS to satisfactory and it made the game look like absolute ass
You won't be when we have 1000Hz, 2000Hz or 8kHz monitors. Imagine running games at 250fps natively but using frame generation to reach the monitor's refresh rate; you won't feel any latency difference compared to running at 8000fps natively.
This is a step in the right direction, you might not see it yet, but save this comment and check on it 12-14 years from now.
@@STXNCIC faking shit is NEVER a "step in the right direction". you're delusional
What’s more concerning is the people actually defending this B.S.
Don't buy it and go enjoy your actual GPU. If you're writing here worrying about people who defend this b.s., you have some problem. Do your thing and let us worry about this b.s.
@ You should probably worry more about grammar and spelling instead of GPU’s.
50 cent army out in force.
The market for people that will buy a $3k GPU to have "native" rendering will be slim. NVIDIA knows this.
PC gaming's death knell was when the whole mainstream "PCMR" era brought in a bunch of console gamer converts. They brought with them their zero standards for visual clarity and hardware performance, and it was inevitably all downhill from there.
I don't mind playing at low FPS, most of my older games only run correctly at 30, but I will never ever use frame gen.
Upscaling is one thing, a whole new frame, that's a step too far.
I think it's completely worth it. You rarely notice the artifacts just playing the game, I more notice the entire image becomes a bit less sharp. Not as bad in modern games just because they're already smeared by TAA or DLSS. Really wish we'd go back to the good old days of razor sharp MSAA.
And from using Lossless Scaling, adding more than one fake frame really doesn't make artifacts that much worse. I honestly even use it a lot on videos, just because you still for some reason find people uploading 30fps videos today.
@@Skylancer727 SMAA, MSAA, SSAA, anything but TAA please! I don't play video games standing still.
Basically how I understand this: it feels like I'm driving the fastest car that exists, except instead of its own engine someone put in a worse one and also took away the speedometer. So I might think I'm driving 150 km/h (about 93 mph) while in reality I'm doing 90 km/h (about 56 mph)?
How is this not bordering on false advertising?
That's the thing, my friend. They spend a looooot of money on engineers, marketing, and lawyers to make sure they stay riiiiiight on that line.
Technically it isn't, because "the FPS counter says so". But in reality it is. It's bending the rules/gray area/smoke and mirrors, whatever you want to call it. BS. LOL
Because if you did some research instead of commenting right away, you would know that Nvidia is technically right about it.
When Jensen publicly stated the 5070 will equal the 4090 in performance, he very clearly suggested that all 'crutches' would be enabled (upscaling + multi fake frame gen). Even he knows to (try to!) avoid crossing that line.
HOWEVER, yes, I STRONGLY believe he DID cross that line, simply because the 99% of buyers NOT 'in the know' will falsely believe the deliberate marketing headline misinformation... that the 5070 = 4090.
--> PLEASE let there be a class action lawsuit... if only for the bad publicity Nvidia would get!
@@fightnight14 "Technically the bullet killed him, I didn't" still gets you hanged.
I never eat without a brick of salt when it comes to marketing, they want our money and will lie for it.
This is as shocking as Nvidia tying the price of its gpus to the A.I. market
Hi Phil. Always nice to see you in front of the camera, and your presenting skills keep improving.
So what I'd like to understand about frame gen is this. Let's say your card displays three frames in the following order: First it displays 1 actually rendered frame. Then there's a generated middle frame, then the next actually rendered frame.
Doesn't the card have to know what the second actual rendered frame is, in order to generate the middle frame that links the two rendered frames? And if that's the case, then wouldn't the game feel like it's operating slightly behind reality? Because it would need to render the second frame before creating the middle frame, right?
Not quite, it actually takes a lot of data from the game engine, like motion vectors and such, to generate the new frame. The additional latency is introduced because the GPU has to "pace" the frames, putting out the rendered and generated frames at smooth intervals. Frame gen in my experience increases latency quite a bit compared to native; 60fps usually has at least 10ms less latency than 120fps with 2x frame gen. Also, the new 3x and 4x frame gen add even more latency on top of 2x and each other. I don't think it will feel really good unless you start from a high fps, which makes 3x and especially 4x pretty useless, since by then you will be past the refresh window of many monitors. They are also updating Nvidia Reflex to lower latency; that uses a trick that VR headsets use, basically decoupling the input from the rendered frame and shifting the frame in your predicted movement direction, like an overshoot of your actual movement to compensate for the lag.
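To add to that, here's a toy sketch of the pacing idea. Purely illustrative numbers, not Nvidia's actual scheduler:

```python
# Toy illustration of where 2x frame generation adds latency.
# NOT Nvidia's actual scheduler, just the basic pacing idea: the
# interpolated frame can only be built after the next rendered frame
# exists, and that rendered frame is then held back so output stays
# evenly spaced.
render_fps = 60
render_interval_ms = 1000 / render_fps          # ~16.7 ms between rendered frames
display_interval_ms = render_interval_ms / 2    # 2x FG -> ~120 Hz output pacing

# Rendered frame N+1 finishes ~16.7 ms after frame N, but the screen
# first shows the interpolated frame (N..N+1), then frame N+1 one
# display interval later. So the newest "real" image reaches your eyes
# roughly half a render interval late, before even counting the time
# needed to generate the in-between frame.
extra_hold_ms = display_interval_ms
print(f"Rendered frame held back by about {extra_hold_ms:.1f} ms")  # ~8.3 ms
```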
it's basically just the image smoothing filter on TVs that Scorsese told us to turn off. My console playing friend figured this out years ago, "who needs a gaming PC when I got Image Smoothing?"
After 150fps pc gaming I never used my PS4. It feels like there is a massive lag even if the gameplay is smooth.
4 times of fake frames? We should be allowed to buy 50 series cards with 1/5 of their MSRP and pay the rest 4/5 with generated fake money.
it's one real 3 fake, so we should buy at a 1/4 of the price and pay the rest in monopoly money.
@@huh0123 Are US dollars anything other than Monopoly money that the government pretends is real?
Unoriginal AMD you got the math wrong
I got a whack of old monopoly money lying around, think mail order would work?
You'll never get the suggested MSRP outside the slide, therefore 1/5 .
Also notice these comparison charts are comparing the 5070 to the 4070, not the 4090. Check the top right corner.
Should I get a 5070 or a 4070 super for a 7600x for fortnite
So 4090s were selling for almost $3k at one point, and now they have the same performance as a $600 video card? What a great deal!
I turned frame gen on once and noticed that it screws up movements, shadows, some parts of the UI and markers and other shit....never again, that thing can fuck off.
Raw 60fps is all I need.
It also feels... weird. I know there's more input lag, but instead of there being no new frame, there are wrong frames, which feels much worse than no frame at all.
I usually have FG off if my 4090 can actually do 4K 120fps+ with or without DLSS 3 quality, or even somewhere like 80-90fps in the most demanding games. RT is killing performance, and bad optimization of games is even worse, like the recent UE5 games. In some games it makes the game look visually stunning but run like crap, and vice versa.
When FG is on, the input lag is noticeably higher because the base frame rate is lower. But this can also be affected by the game engine creating even more input lag, which can/will happen on any current and future AI card like the 5090 and so on.
The only reason to have FG is when using RT, which is the performance killer on any card.
Yes, FG will always feel smoother, but input lag will feel worse, and Reflex 2 will not help as much because games are more demanding, giving a lower base frame rate, which in turn gives worse FG frames.
@@TheEdmaster87 that someone is even considering optimization on a 4090 is ridiculous...
@@TheEdmaster87 apparently reflex 2 is supposed to be able to accept mouse inputs even on generated frames, with something they are calling 'frame warping', thus 120 fps should feel like 120 fps instead of 30 fps.... obviously needs to be tested though.
We all remember the claim of a 4070 being three times faster than a 3090 and how that turned out. At this point anything Nvidia says cannot be trusted. However, without wanting to sound arrogant, it's not meant for us, so to speak; it's for people who hear "5070 as good as a 4090 for such a cheap price", and that is all they will ever hear before they go and buy it. While Nvidia may be an untrustworthy company, they are not stupid, and they know it will work on enough people to make a fortune.
They can't do it in hardware, so they'll do it in software. They are really taking the "fake it 'til you make it" business mantra to heart.
If it works it works; if it provides a better gaming experience than previous gens and the competition cannot top it, why would I care? It's about graphical fidelity for gaming. People seem not to understand that the cost of graphics cards isn't just hardware, but the billions they spent in R&D for their tech.
AMD has FSR and frame gen too, so do you keep them turned off? Stop crying about something you can't get.
I take your point but it is hardware accelerated. As in you won't see the same improvement on a 40 series that you will on the 50 series even if they're both running DLSS 4. Hardware improvement for sure but that hardware doesn't actually improve raw performance at all.
Ohh, they can do it in hardware. But why would they? They're greedy and saving money by propping it up with software. From a business standpoint it's amazing, but as a consumer you're essentially paying for a small bump in raster and a big push of frame gen.
Imagine being a tech enthusiast and saying software evolution is bad??? LMAAAAAAAAO
Still running a 1080, looking to upgrade.
Looking forward to your reviews and benchmarks!
Always refreshing to get a Phil video ❤
Graphics cards come out too often. Just like smartphones. It's very scammy. We have like 3-5 games on the market that need a high-end GPU. Everything else is pretty mid.
No, no game should need an RTX 4090 or 5090 to run at 4K, yet there are games that even an RTX 4090 can't handle at 4K, or even 1440p, on max settings. ASA is one of them, and it's just the beginning: next-gen games are going to be playable only with DLSS/FSR from now on.
It’s not scammy. You don’t have to upgrade every year.
Running every new upscaler they have, the same upscalers they have locked the 4090 out of because money, it might be as 'powerful'.
I've always said that frame generation is fake frames. And it really is. I want raw performance numbers with no frame gen.
Then buy the 5090
@@L9MN4sTCUk it has the same issue as well
And can you tell the difference? The whole point of this AI is to make the graphical fidelity more efficient, leading to less use of actual hardware and power..
I guess that means to go Radeon
Then invent a new kind of semiconductor to replace silicon. That one has been hitting the wall for a while now.
The reason they're jumping hoops and doing these tricks is because you can't just pack twice as many transistors onto a chip at the same cost but half the power draw and heat emissions every 18 months anymore.
Phil you're absolutely killing it on the videos man. Absolute pleasure to listen to you give the overview here!
I don't understand, wouldn't the frame stay there until the new frame comes in anyway? What's the point of the generated frames? Are they different from the original frame or the exact same??
I don't like that all GPU R&D is going into generative AI instead of raster performance like it used to. Maybe I'm too stuck in my ways but I prefer real rendered frames.
I think the reason for this is that GPUs would be insanely more costly than just working on the software side, but this can only happen because AMD is severely lacking and can't compete, so Nvidia just has free rein to charge whatever and not innovate.
Real frame=real head shot, generated frame=headshot in your head because you miss without realising it 😐😓
@@theanglerfish I'd imagine it might get to the point where it feels like bad hit reg: you're clearly hitting the character but it doesn't register, because that frame is 3 frames old... 😅
It's essentially junk FPS that doesn't give real value, kinda like Doritos compared to a steak.
The sad fact is we are reaching the limits of shrinking silicon, and within a few generations we won't be able to increase how many transistors we can fit onto it. That is why AI and software improvements are so important. Until we can switch over to something besides silicon, like graphene or living-cell computers based on human brain matter, AI improvements are all we've got.
So basically, with DLSS off, all those shiny 50-series graphics card will be the Emperor's New Clothes?
Yes
And with multi-frame generation on, the 4090 would shut down the 5070 based on the numbers we see so far. Which is why they will not let 40-series owners enable it
Except 5090
@@Informatic1 you think the Lossless Scaling app will be an option when combined with a 4090?
My 4090 chuckles at the audacity 😂
So does my 3090...
@SlimeyGuitarStrings 😁👌
This is the time to buy a RTX 4090
idk if you will respond to this but i dont have enough money for a 4090 so would it be better to get a 5070 ti or a 4070 ti?
@milk_savy get a used 4070 Ti. Look for VRAM.
Thanks Phil, and a very happy new year to you, and all those around you!
😊😊🍻🍻
So about 27% better performance (going by pixel counting for Far Cry 6 on the poorly-labeled bar graph) than the 4090, and it only consumes about 28% more power. Huge gains Kappa
technically my 1060 is as fast as rtx 9090 if you just close your eyes
"Phil, do NOT move from that chair we still have 34 more videos to make" 🤣
When competitive gamers / streamers start using this and find out real fast that something is wrong, the viewers and community at large might finally figure it out.
What would be wrong? What are you talking about?
@@britainvernon9286 Responsiveness problems when playing competitively. Like the video said, the game doesn't feel like those fake frame numbers suggest, even if it looks like it. The feel of the play is not there, and when playing esports like that, it will be a problem.
It actually sounds like they tried to mitigate that with additional warping based on your camera movement, taken from keyboard and mouse input. Don't know how successful they were, but it might feel almost like server lag rather than being obviously from the AI frame gen.
@@britainvernon9286 input lag. I advise you not to go down the rabbit hole, it is a huge hole
Thanks for all you do, Phil...! I usually enable frame gen when my desired detail level falls below 60fps. Once it gets back over 60, it appears smooth to my one good eye.
I LOVE this explanation. Super clear and helpful for how these will actually work. As a 4090 owner who was also burned the generation before- it’s good to know the generational truths.
That said- I would love your thoughts on when gaming hits a wall. We’re now firmly in cards for 4k, as none of these higher cards are needed for 1080p workloads… so when will an entire generation just all be vanity purchases?
I do not even watch live reveals or reveals in general because it is mostly just bs marketing. I wait for reviews from multiple content creators before I decide.
All you had to do is look at the raw performance increase in Cyberpunk between the 4090 and 5090: it went from 20 to 30 fps without frame gen lol
Around 30% increased performance for what, 40-50% more in price? But again, if you have a 4090 and are looking at the 5090, you should wait a few gens before upgrading.
@@cythose4059 20fps to 30fps gain IS a 50% gain
@@bumble3572 another commenter said only 6 fps diff in c2077. [Can be seen at the bottom of each card page.]
so yeah it could be 30%.
@@bumble3572 in reality it was a bump from 20-21 to around 27-28 fps
First the game devs use DLSS & Frame Gen as a crutch for bad optimisation, now nVidia is doing the same. I'm shocked.
With "multi-frame gen" the game devs will only get lazier. 🤦
Nice, informative and short, man I love that channel!
I want to see what Reflex 2 does. Changing it to an asynchronous timewarp type function with infill means that game responsiveness could actually be matched to the internal frame rate, not what the render queue is providing. Then, MFG could provide visual fill while the game responds at a high internal frame rate, even if the renderer is spitting out a relatively low number of frames.
NVIDIA’s Company Mission: Fake It to Make It!
The game will actually always only feel as good as your latency allows.
You mean a corporation lied to us, standing in a hotel basement wearing a forced outfit? No way man, they never lie.
Nice to see Phil back as a host. Very informative video. Keep up the good work!
So are the frames going to look smoother, but it will feel like there's a delay, like ping?
Yeah.. like the 3090 was the first 8K gaming card.. Are people still gonna believe this or the whole AI absurdity?!
I'm betting on yes, because otherwise the prices of AMD vs Nvidia stocks wouldn't be so massively different.
AMD fumbles presentations currently, but delivers decently in p/p.
Nvidia is stellar at inserting marketing BS in presentations and delivers better RT performance + the entire ability to use AI which afaik just doesn't run at all on AMD from what I've read on many github AI repos.
The last part of that is definitely not enough to drive their stock so goddamn high. Marketing BS has to be a reason why they are doing so well. They're professional liars.
Never mind that they didn't innovate the actual hardware AT ALL. They increased the die size by 30%, the TDP by 30%, and they're getting 20-30% better performance than last gen... shocker.
@@rarespetrusamartean5433 Uh huh, they fooled the entire tech industry and numerous firms, but you're smart enough to see all through it.. Clearly.
Every Nvidia flagship GPU has had a monopoly on the market, because AMD hasn't existed in this segment for more than 15 years hahahaha. Even the 7900 XTX was only on par with the 4070 if you turn on ray tracing in both...
@@diomedes7971 Once you realize the market isn't a collective of sophisticated finance gurus and actually just a crackhead casino where barely anyone reads about or understands anything, it makes so much more sense
@@diomedes7971 they didn't fool anyone. Idk what "firms" you're referring to. If they're AI-centric firms they pretty much depend on nvidia and if they're not I don't know what they have to do with this unless they're financial firms.
If they are financial firms they don't need to be fooled to invest in nvidia, they just need to think their stocks will go up.
I can’t wait to put a 5090 in my $10,000 gaming command center. Just like every other normie gamer.
yeah, like what was he yapping about? Am I poor or something, that I "only" have a €2000 PC at home and not a fully gold-covered, watercooled PC? wtf is wrong with this dude :D
When you take all the fake frames and fake resolution out of the way, the generational improvement is almost non-existent. Especially when factoring price and power requirement increases.
That and a 70 class card still having 12GB of RAM in the year of our lord 2025, even though it's GDDR7, is ridiculous. Consoles are starting to have more available memory than that.
Consoles are running 30 fps in 2025, what are you talking about... consoles are bad for gaming, they're like a decade behind in technology.
Yeah I think we'll be looking at 15% tops, in _real_ performance gain tier-to-tier.
Yes, Nvidia purposely set the 5090 to 32GB of VRAM and put 16GB and 12GB on the lower-end 5080/5070, which is so 2020, just to make the 5090 look like the ultimate enthusiast card and make people believe 32GB is the way to go. But adding that much VRAM just keeps the price up, not down.
I have the 4090 24GB and never came close to 20GB usage at 4K gaming maxed out, but I can imagine, and I know, that 12GB is not and will not be enough at the same settings, so that's why they are relying on AI to do the work... but the prices are out of this world, and I believe it's because AMD isn't keeping up.
@@Vennux Friend, I'm talking about total available memory, not overall performance. It's obvious that a top-end gaming PC is far better than a PS5 overall, exactly because a console is already old technology by the time it comes out.
However, the PS5 has 12.5GB of addressable memory for developers. The PS5 Pro increases that to 13.7GB. Even if it's the slower GDDR6, this is still larger than a 5070's. Since consoles are usually the lowest common denominator when developing games, that means your fancy 5070 might already be capped on memory while trying to play today's more demanding games, let alone in 4-5 years.
Nvidia is doing with memory size today what Intel did with 14nm a decade ago. Lack of competition lets the company slow innovation so they can always have some 5% improvement to sell each generation, even if they can do more.
@@TheEdmaster87 I get you. My 3090 also is more than enough for most things and I won't be replacing it anytime soon. 24GB VRAM is more than enough for almost anything nowadays and most AAA games are so unoptimized that even a 5090 will need upscaling and frame generation to play the most demanding stuff at reasonable framerates. I don't like that and don't want it to be the future of gaming.
You missed one thing: in combination with Reflex 2, the generated frame includes the mouse and keyboard information at that point in time. So the game can potentially be more responsive. I am curious how the reviewers will rate this experience.
Give this man more talking-point videos like this, if he wants them, ha. Good work.
Maybe at 1080p, on one game, on a specific motherboard, with some throttle, and a frame rate cap on…
As a 4090 owner, I am gonna skip the next generation of cards.
well the 5090 is the only upgrade for you, so yeah, no reason to upgrade.
Also on a 4090 and waiting for LG OLED TVs with HDMI 2.2 and 240Hz. A 5090 at 4K 240Hz will be great even for old games like Max Payne 3.
I will be upgrading mine. Hello Ebay, here is my 4090.
Same. I can kind of see the appeal but not for the staggering amount of cash involved.
The 4090 can't even max everything at 4K in Alan Wake 2, it struggles, so if you need a really high-end PC the 5080 or 5090 is the only upgrade choice.
5070 raw performance not even on par with the 4080 and 4080S let alone 4090.
Can't complain it's cheaper 😂
It used to be every previous gen 80 was next gen 70 perf wise
Yes, thank you, finally someone said it!!!!
Which is weird that it was you guys, as you're sometimes recklessly positive, but ok.
I found it interesting that part of the keynote mentioned only about 2 megapixels being rasterized in one example, with DLSS/MFG producing the rest of the pixels. Good to see through the hype.
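Rough arithmetic on that keynote example. I'm assuming 4K output and 4x multi frame gen; the 2-megapixel figure is the one quoted in the keynote, everything else here is my own back-of-the-envelope:

```python
# Back-of-the-envelope for the "~2 megapixels rasterized" keynote example.
# Assumes 4K output and 4x multi frame generation (1 rendered + 3 generated).
output_pixels = 3840 * 2160          # ~8.3 MP per displayed frame
rendered_pixels = 2_000_000          # rasterized pixels per rendered frame (keynote figure)
frames_per_rendered = 4

displayed_pixels = output_pixels * frames_per_rendered
fraction_rendered = rendered_pixels / displayed_pixels
print(f"Fraction of displayed pixels actually rasterized: {fraction_rendered:.1%}")  # ~6%
```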
When Jensen claimed 4090 performance in the 5070 I said aloud "that is bullshit isn't it"
Literally same. I heard those words and I spit out my drink laughing. I even rewinded to listen again, because it sounded so outlandish. But...wow. He actually said what I thought he said.
My next words were "BuuuuulllllllllSHIT". And it was loud enough that my roommate texted me to quiet down lol
but it's not, it's stronger than the 4090
@@GhostMan407 good joke
I guess that's why you didn't listen to the next part where he says with the AI features like frame gen
@@sketchyx8307 Nvidia isn't going to kiss you. You don't need to shill for them.
AI is the death of gaming. We all need to stop with this frame gen. Only raw performance please.
Don't sell your 4000 series card, or even a 3090 Ti, just yet; tests will reveal the actual performance.
So I need some help here... I don't know anything about PCs, and I asked a friend to build one. He told me to wait because of this new 5070 that will do the 4090's job way cheaper, but I see people saying it won't come close and it's all rubbish... what should I buy?
This is not the case from what people who had hands-on time are reporting; they say there was only a 2ms increase between the original frame generation and 4x frame generation.
Note for Phil: latency is not the same with and without FG. They always enable Reflex with FG and had it off when FG was off. Hence the similar latency
Actually the new Reflex at least makes sense. The old solution just updates your current frame, which doesn't make much sense, but in terms of numbers it lowers latency. Meanwhile the new Reflex, or Warp or whatever they called it, accounts for your in-game movements too and then updates the frame. So this one could actually be useful; maybe it will improve responsiveness and 120 FPS generated from 30 FPS will feel at least like 60, who knows... Can't wait for benchmarks to put all those claims to the test.
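My mental model of that warp step, as a very crude sketch. This is not Nvidia's implementation; the real thing reprojects per pixel using depth and motion and then inpaints the exposed regions, while this just shifts the image by the newest mouse delta and patches the edge:

```python
import numpy as np

def warp_frame(frame: np.ndarray, dx_px: int) -> np.ndarray:
    """Crude stand-in for camera-based frame warping: shift the image
    horizontally by the latest mouse delta (in pixels) and fill the
    uncovered strip by repeating the edge column. A real implementation
    would reproject per pixel using depth and then inpaint the holes."""
    warped = np.roll(frame, -dx_px, axis=1)
    if dx_px > 0:
        warped[:, -dx_px:] = warped[:, [-dx_px - 1]]   # repeat last valid column
    elif dx_px < 0:
        warped[:, :-dx_px] = warped[:, [-dx_px]]       # repeat first valid column
    return warped

# Example: a 4-pixel shift applied to the most recent frame right before
# scanout, so the displayed image reflects mouse input that arrived after
# the frame was rendered (or generated).
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
displayed = warp_frame(frame, dx_px=4)
```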
I noticed that the PC latency figures in the LTT video comparing the 4090 to the 5090 were nearly identical, meaning the performance uplift was almost entirely from DLSS 4. This really implies that the raster performance might be the same or similar.
Rewatch his vid, go to 5:30 and set playback to 0.2x speed. That's where he flips the camera around standing on the stairs, and the amount of ghosting from DLSS not adjusting fast enough to the shifting perspective is bad. Can't imagine what that looks like playing a high-pace shooter yourself.
It's a new version of vsync.
@@_PatrickO Would you care to elaborate?
so lol 😂 @@ronniebots9225
@@ronniebots9225 you know that recording a monitor and then uploading it to YouTube heavily impacts that too, plus the fact that YouTube isn't 240 fps either...
No Nvidia fanboy here, just saying what you say has a flaw too :)
I don't mind the frame generation in games. What I care about is how well it renders 3D projects in Blender.
But what's with Reflex 2? Supposedly, it should be able to warp the frame to the new camera position during Frame Generation and infill the holes. Theoretically, it should be possible to run the G-Buffer at 240 fps and image rendering at 60 with both technologies. That's kinda the whole point of the new Frame Generation tech, as to NOT increase lag.
So essentially, since I have a 144 Hz 4K monitor, beyond FG boosting me to 144 Hz if my GPU can't already do that, there will be no benefit whatsoever in going further, because latency isn't improved at all, and going beyond the refresh rate of your monitor only benefits you if latency can be improved.
Nvidia will always claim that new gen is so much better but we all know that is just marketing BS, like if you agree👍
4:58 " it does look visually smoother " it's all just an illusion. I've never bought into the whole DLSS/frame gen/up-scaling rubbish. just give me native FPS/resolution, all the rest is just hyped, fake nonsense. besides anything above about 120hz/fps is pointless, your eyes can't really perceive much above that.
I agree with you, but not about the 120Hz. It varies from person to person, but most people will notice a change. Let's be honest though, my rig can't even play the newest games on highest settings at 60 fps without all this BS AI.
Stuff above 120 FPS is important (even if your display can't show it) for reaction time.
And what we see is just our brain's interpretation of the light our eyes receive.
Nvidia has lost all respect from me since the 30 series. 40 was a joke, 50 is spitting in your face..
the 1000 series were the worst of them all and what enabled them to be what they are now and the market to be what it is now.
4090 is not a joke
@@rattlehead999 The 10 series was the best price-to-performance jump from the 900 series... wtaf are you waffling on about.
@@dieglhix compared to previous pricing yh that card is a bargain 😭
@@dieglhix at that power draw and msrp it wasn't good either.
So let me get this straight, it looks like more FPS but feels like native? So if I get under 30 FPS, but with frame gen, the visuals will improve but it will still feel like low FPS??
It's like saying I am as fast as Usain Bolt...sitting in my car.
My 4090 with no DLSS enabled, at my monitor's refresh rate of 4K 144Hz, all max settings, runs at 144 FPS all day long without breaking a sweat. No upgrades needed this time.
On older games maybe; on the new games it can't. There are people still doing videos on the 4090 and 4080, and you can tell that devs don't care about the games anymore.
More like publishers don't give time for devs to finish the product.
The most unoptimised games come from shovelware/asset-flip devs that don't care at all about anything, you're right about that. And then there are the massive publishers like Ubi and EA that don't give devs the time needed for games... Not saying smaller publishers don't fall into the same trap, but still.
Laughing at people who sell their 4090 rn to get a 5070. 😂
Nobody does that; please show me data on who did that.
Why would anyone sell a card to buy a card of the same performance? Hello?
@@4KGameScape Because if they sold their $1600 4090 for... let's say $1200 to entice a buyer, and then bought a $550 5070, they would then have $650 to spend on whatever they want while still having the performance of their old card. It's a profit play. I'm not defending that a 5070 is as good as a 4090, I'm just telling you the mindset of why a person who thought the cards were equal in performance would sell their 4090: For profit.
The simple question that needs answering is: Does it actually deliver higher fps? The answer is no, which is absolutely insane for that price tag.
You're pretty much buying a slightly larger last-gen GPU and fancy software. It's bonkers. All while game devs think optimization means soiling your pants in public, since they avoid it like the plague. Once testers get their hands on these cards they will absolutely butcher them, and hopefully the Nvidia beast has to step up its game, or someone else runs past them.
What? Clearly you didn't look at the research; it's cheaper than last-gen cards so you can't complain 😂
@@kirby21-xz4rx fanboy much?
They're still 20-30% more powerful than last gen. With DLSS turned off on both, the 5090 ran 26fps and 4090 ran 20fps which equals 30% faster.
Numbers are hard, I know.
@@ShellShock794 Take a look at Vex's video. The real comparison should be the Super cards, not the standard 4070/4080.
You'll get a decrease in performance on some features.
Numbers can be hard, but apparently research is harder.
So, on a game and resolution that is limited to 144 Hz by the monitor, with DLSS 4 and 3 fake frames enabled, you could end up with latency equal to 36 FPS if the game is waiting for those 3 fake frames before it puts out its next actual frame? If the game could have been running at 60 FPS with no fake frames, then you're ruining responsiveness...
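That's the arithmetic behind the worry. These are hypothetical numbers; whether the frame cap actually forces the rendered rate down like this is exactly what reviews will need to check:

```python
# If output is capped at the monitor's 144 Hz and 4x MFG keeps a strict
# 1 rendered : 3 generated ratio, the rendered rate falls to 144 / 4.
# Hypothetical numbers, just to illustrate the concern above.
monitor_hz = 144
frames_per_rendered = 4                      # 1 rendered + 3 generated

rendered_fps = monitor_hz / frames_per_rendered
print(f"Rendered FPS under the cap: {rendered_fps:.0f}")                 # 36
print(f"Time between rendered frames: {1000 / rendered_fps:.1f} ms")     # ~27.8 ms
print(f"Uncapped 60 FPS native for comparison: {1000 / 60:.1f} ms")      # ~16.7 ms
```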
Instead of focusing on generating new frames, I feel like the focus should be on how to make the game look visually better while making it easier to run with true frames. Aka upscaling in more clever ways so that each frame is easier to produce and the framerate increases because of that. Having a disconnect between what's rendered on screen and the game's actual responsiveness, and how fast it updates its physics etc., is not good for gameplay.
No, it's not true that the 5070 is as fast as the 4090.
Just like it was not true that the 4090 was the best investment ever, as the CEO said.
That being said, the 4090 was... no, is... a real powerhouse. I mean that in a good way - a GPU beast.
Thank you for another, as always, great video!