Soon nVidia gpus will completely eliminate user input lag by simply eliminating the user and playing the game for you by using the latest bleeding edge AI innovations.
😂😂
That's called watching streamers on Twitch.
Finally, I can play Hunie Pop hands free
This sounds expensive lol.
Probably already do. Rendering many frames ahead properly means predicting what a user would do and already doing it for them. Many users would be subconsciously affected by this, taking the alternatively rendered frames essentially as a guide for what they should do, and then actually doing that. Of course, unless you're already at a super high fps where such generation wouldn't be needed.
So essentially they probably already play it for the user, both doing things they didn't do yet, and literally controlling the user's actions by showing them those things (humans are like monkeys and naturally try to copy things on instinct).
Jensen's new leather jacket has full 4k path tracing enabled.
Only half the price of a 5090.
😂😂😂😂😂😂
That was the ONLY thing worth watching 😂
Looks a bit gay nonetheless.
@@archibaldikowski3646 only u think like that.
I'm really mad that basically every significant "performance jump" is mainly AI.
I remember last time, Jensen said something like "1 pixel is calculated, then the other 7 pixels are generated by AI", and that was just the upscaling, before Frame Generation. Now they are doing that, and generating 3 entire frames on top of it. In the end, only 1 out of 32 pixels is actually rendered; everything else is an educated guess by AI.
Someone will surely make a comparison between the AI image and a fully rendered image, I am really curious about that. Still, I am most interested in actual no-AI performance.
Though on a positive note, it's a nice surprise that they didn't increase (and even lowered) the prices of the lower models.
I get what you're saying, but I think software/AI enhancements will be the main way we see big performance leaps moving forward. You can see by the results, the frames aren't just "educated guesses" but highly accurate and reflective of the source material. If we are getting big performance gains, that's a win, regardless of how it's done.
@@fogofwar342 Moving forward yes. And yes, they are getting better and better. But specifically for Frame generation, look how last gen messed up the UI in some games. And that was just with 1 frame inserted in between. Now they are inserting 3. Unless the UI processing improved significantly, it's going to be a huge mess.
@@fogofwar342 You know how we get actual performance uplift? By game devs actually optimizing games, and they don't do that at all nowadays, and that's a fact. They just use their Nanite, import billion-polygon models into the game, then use their framegen and add copious input lag. It's all s***id.
Yep. Now so many AAA developers are using all of the Nvidia tricks as a crutch to release badly optimized games, and they even now include "with DLSS" under their recommended specs. DLSS was never meant to completely replace raw rasterization performance. It was supposed to be an addition, but now it's being treated as the baseline... 🤬
We don't know exactly how it will perform. I assume they have addressed the UI and such.
1:28 4090 performance, until you need VRAM.
absolutely correct 😁😁😁
Don't need VRAM with DLSS
If you got a 4090 why would you get anything less than a 5090? People go broke that quick?
@@thewayz9629 dlss and frame gen use a lot of vram
@@RedWolfenstein dlss reduces vram usage
Heavily relying on frame generation is a bad thing. Saying the 5070 has double the perf of the 4090 just because DLSS 4 on the 50 series gets multi frame generation while the 40 series only gets single frame gen is just a scam.
No they said 5070 has 4090 performance. They said 5090 has double 4090 performance.
@@ZackSNetwork with frame gen and dlss
@@RedWolfenstein and fg and dlss for 4090 too
@@ZackSNetwork wait what?
They had both compared GPUs running with their full suite of frame help.
Can’t wait for 5070. 200 fps with the input latency of 50fps. How cool.
Papa Jensen gimme more fake frames please!
Might as well grab a crayon and write 1000fps on the screen and just play pretend…
Oh, no problem then, stop crying and be satisfied with your real 20fps on AMD.
PS: You get roughly +30% raster for every class and DLSS4 is the cherry on top. Even if you look at raster only, Nvidia gives you more frames/$. And then 50fps latency for single-player titles is actually pretty fine, but it's more like a 70 to 200fps boost.
@@filippetrovic845 As someone who plays exclusively in the 40-60fps range, it's not that bad. That being said, it's going to be frustrating as hell if I'm reacting to shit and the game is being sluggish due to the input latency.
@@filippetrovic845 Bro really has Nvidia down his throat. Instead of focusing on how the customers are the ones losing, and Nvidia is relying on AI's educated guesses for frames instead of actual frames. Doesn't matter if it's AMD or Nvidia, the consumers lose.
@@liebrat you weren't blessed with reading comprehension were you? 😢
@@liebrat It's even worse than this. If only we could go back to MSAA and culling instead of shitty unoptimized game development.
I can't help wondering if 3-frame generation might look somewhat similar to motion blur, especially as physical objects change trajectory.
I expect they have dealt with all this kind of thing.
yes, try playing competitive with DLSS, you will notice weird frame drops, even though the counter says 140+
It has to be interpolation. And that would look awful
We have had frame interpolation in TVs for a long time. Remember "600hz" plasma tvs?
Last gen DLSS looks like motion blur to me and gives me a headache
The presentation was a typo, they meant to say 5070 = 4090 price.
Did anyone else notice that Nvidia disabled chat, comments, and downvotes on their CES stream? It's telling me that they knew angry gamers would have mobbed them.
The problem with fake frames is that, from what I've noticed, when combined with even moderate amounts of ray tracing, they result in a noisy, flickering image. This can be observed in games like the new Indiana Jones, as well as in older titles like Cyberpunk 2077
Maybe if you use DLSS Performance, but with the new performance boosts everyone will be using DLSS Quality.
@@filippetrovic845 no fanboy
@@filippetrovic845 Hopefully the new architecture and more advanced AI will resolve this. This is what Jensen was implying.
I mean you can barely notice it unless you look extremely closely
when we know nothing: everything about 5090 leaked!
when we know everything: you don't know anything about 5090!
5090 level of -AI hallucinations- performance my ass
You should watch the 5090 without DLSS 4 and with only RT. Let's put it simply: 1440p here we come, down from 4K, for $1,999.99. Lower than 30 FPS at 4K with only RT, but with DLSS 4 and RT, 260 FPS at 4K. Like the video said, stop and wait for reviews before buying anything. You really need to think about the AMD 9700X vs 9800X3D, and possibly the Intel Ultra series processors. Like it or not, salt needs to be used, so sit and wait for reviews.
@@mattstansbeary3068 Nah I'll get the 5090. AMD won't deliver. I want top AI performance. I want PT in 4k with high FPS and that is what NVIDIA is good at. For people who don't want that and care more about raw native fps without PT, I'm sure AMD has got them covered.
@@chrisking6695 Good luck, you'll be sleeping very strange hours on Jan. 29th and 30th, and by that I mean not sleeping at all. You'd better have a Best Buy or Micro Center in your town, because if you need to do it online you'd better jump on F5 like your life depends on it. In other words, the GeForce 5090 & 5080 will be sold out in seconds.
😂😂😂😂😂😂 So literally the 5000 series is a 4000, but after smoking a very good blunt or hallucinogenic mushrooms or peyote.
The msrp means absolutely nothing, wait till vendors get it. It’s going to go from 549 to almost 900 bucks
Just buy the Founders Edition. Other versions of the cards will be priced higher because they have better cooling and in most cases higher clock speeds, so yes, you will pay extra for the bells and whistles. But I think going from £549 to £900 is a little too much; I would expect them to go to around £650 or even £700.
@@azorees7259 I agree, without scalpers in the equation. I really hope I am wrong, but probably nobody can really buy them because bots buy everything up and it gets sold for a huge markup on eBay...
@@azorees7259 Try finding one without a buying bot, or camping Micro Center 8 hours before opening. Hell nah.
Think that's bad? Scalpers are going to be running their programs to buy all these up before you can even put your card number in, and sell them on eBay for 3 to 4 times the price...
@@We_never_die_we_just_respawn Yep regardless of the stock, no matter what we will always have a 3-7 month wait.
As a PC builder, I welcome the 45deg angled power connector, as it is more case side-panel compatible.
How about this....
MOVE THE CONNECTOR TO END OF THE CARD AND DOWNWARD!...
You know, like the rear right corner below the card? So it is OUT OF THE WAY AND SIGHT without making card longer?!
That connector DOES NOT NEED TO BE VISIBLE!
@@TK13M then do it if it is ez. make ur own company.
@@TK13M Running power cables over a large gap and distance probably has issues.
@@TK13M terrible idea, trust me. Some of the high end EVGA Kingpin cards did this and caused big problems with front case fans, even in full-tower cases. 45 deg is the better of the options here since all cases expect the cables to be there.
@@TK13M The Gigabyte 4080 Super Windforce V2 has this.
I'd like to see rasterization numbers compared to the 4090. I don't play sims with DLSS😢
20-30% improvement
@BaSiC47 is this verified?
@@cdrseabee If you look at the graphs from Nvidia where there is no Multi Frame Generation, the uplift is about 20-30%.
Also, those 2 games that were on the slide that didn't use DLSS were both still using ray tracing, and if the ray tracing is significantly faster on the 5000 series compared to the 4000, that would probably make up the 25% increase in FPS that was shown. If that's the case, there isn't going to be much in the way of actual raster performance improvement.
VERY valid point. Can't wait to see benchmarks of raw performance so we can finally see the value of each GPU, as well as a true performance comparison to Ada Lovelace.
Full specs were released on the Nvidia website.
so we mostly getting only new upscaling... fuck this shit...
are you acoustic? look at the price difference. What else are they going to do other than improve AI? There is no limit to AI. Also, the raw specs of the 5090 are actually better than the 4090's.
@gamingsol7521 you love fake frames and lag
75% fake frames, that is nvidia for you, do not forget to enable the fake resolution too
It will probably be slower at raster because the electron-hungry AI will gobble up half the power ..
@gamingsol7521 Just to let you know, adding more frames to improve performance sounds like a bad idea.
Correct me if I'm wrong: frame generation is like an animator adding more stills to an animation in real time. It's like adding a buffer that isn't necessarily needed.
Reminds me of the day 300K webcams had a 5MP resolution......
3090 vs 4070 all over again.
4070 is slightly slower than a 3080 10gb.
the 4070ti is about on par with 3090 raw power
@@nsmilitia and the vram?
Here's to hoping AMD can come with more raw performance than AI performance, that way we can choose if we want to fake it until we break it or if we want true graphics.
I’m waiting for Jensen to start dancing and BEAT IT with that jacket on!!! The 5090 price is not the one..hee..hee!!
Sucks to be the people that were paying $2500-3000 for a 4090 the past couple months after Nvidia stopped producing them and supply dried up. People are so unbelievably dumb sometimes.
Yes haha😂 I already see panic sellers offering 4090's at MSRP, soon they will realise no one will buy them for anything close to MSRP. Hell, I would rather take a 5080 if they are the same price. I don't care for raster, only RT, since all these cards already have strong raster for older games anyway.
AMD buyers are not that dumb ...
Still am happy with my gigabyte rtx 3060 eagle oc 12gb😊
Ya, I was half tempted to sell my Gigabyte water-cooled 4090 and sit on the cash for the ROG water-cooled 5090 and make money... Whoops.
They can afford it
The more money Nvidia makes the shinier Jensen's coat gets
the scalpers will buy all the cards in the first batch so nvidia will hit its sales quota, then we will see how many people want to pay scalper prices after raw benchmarks come out.
You 💯
2:28 you know what else is massive
is the frame-generated shitness 3x as bad too?
just buy it, 75% fake frames, fake resolution, it just works
Can it predict a shot or new jacket material in 2 years? +20% pure performance with +30% power usage. Single frame (morph) is fine on 4090/4080, thank you.
Basically all of the improvements are from the Tensor cores that are used for A.I., not the CUDA cores or shader performance, core per core and clock for clock. Has there really been any improvement in shader performance other than the old AMD strategy, MORE CORES!?
If the 5070 is $549 .. that makes me more excited for AMD prices 😂
if amd comes in at 450 it will sell well, if it comes at 500, it will not sell well, over that, it will not sell until it falls to 450
@@arch1107 I can tell you, the 9070 XT will be lowered to $499 MSRP, and when it comes to stores it will be immediately discounted to $449, and still no one will buy it over the 5070 just based on raster performance alone, because the 5070 will match 7900 XT raster while the 9070 XT will match the 7900 GRE while lacking in RT big time.
So now game developers can be even lazier about optimization because nvidia will generate more frames 🤦🏻‍♂️
yes, they will offer 15fps, then the 75% fake frames will make it close to 60fps, then you have the fake resolution to reach close to 80fps, everybody will love this lie
nvidia just hit the wall.
500+ watts is too much of a power draw for me.
Back then the CPU usually generated one frame at a time due to CPU cycles, but now AI cores can render more than 4 frames, because GPUs can use their onboard AI processors to render multiple frames simultaneously instead of the traditional one frame per CPU cycle. It seems like NVIDIA is replacing the CPU workload with AI processors that do it all. It's similar to what data centers do with their GPUs: not rendering one frame per processor when they can output 128 frames at the same time. That's what produced the CNN model, since this architecture is based on networked GPU data mining, now that we're in petaflop territory using all the FP instructions. Gamers don't even realize it when buying an Nvidia GPU now; it's quite remarkable tech.
I'm very skeptical whether it's going to be good or not. I'm expecting crazy framerate with a lot of artifacts.
i wish they showed some results for 3D renders where OptiX is used, as there we could see the difference between new and old graphics cards for path tracing.
i also want to add that the 5090 isn't for gaming, it's for multitasking / rendering / ML-AI; pretty sure the 5080 would handle 4K with great frames using DLSS4 and the transformer model.
Thanks for the video!
Is that actually mesh over the fans? That is an interesting design, no visible fans.
fans are on the other side
The 5080 pricing is better than the leaks. All the leaks were saying the 5080 would be around $1300.
Yeah, it's so funny seeing how many people were fear-buying the 40xx series thinking all the GPUs were gonna be double the price.
Well, you saw the AI performance estimate on the slideshow, so it's explainable... however the 24GB model could easily be as high as $1300, and it will only come with the higher VRAM.
That's what you will be paying with the AIB partners. But you won't be paying the $999 because the scalpers will beat you to it.
@@OzzySafa Even if we pay scalper prices it would be around the $1300 mark everyone was expecting to pay anyway. Not saying the scalping thing is OK.
no leaks said they were going to fake 75% of the frames, so keep that in mind. the $1300 price was not considering this crap, people expected genuinely better performance
I wish framegen didn’t exist
it just works, just buy it
if people in 2018 decided to give nvidia the middle finger none of this would be happening, but here we are
The most important thing: the 5090 is not exactly a gaming GPU. It is really powerful, but you definitely don't need that performance just for gaming. It is a more universal GPU, optimized for large parallel workloads, and at this price it shouldn't be bought just for gaming, unless you have the money anyway.
Yes! I tell people Nvidia could cancel the 5090 and it wouldn't change a thing for gaming. The 5080 and 4080 are the real high-end GPUs, and comparing them to the 1080 Ti and 2080 Ti you can see how justified they are as high-end GPUs.
That's a good point. I wonder if they wanted the 5070 to be directed at gaming or gaming development.
Has anyone stopped to consider if those 3 additionally generated frames are actually consecutive? wouldn't it make more sense for it to generate 3 possible frames that it could then choose from depending on your input? I mean it doesn't know for sure its initial prediction is right which is where we get the artefacts rn
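A toy sketch (in Python) of the selection idea in the comment above (purely hypothetical: nothing Nvidia has published says MFG picks among candidate frames, and the Candidate class and numbers here are made up for illustration):

from dataclasses import dataclass

@dataclass
class Candidate:
    predicted_turn: float  # camera yaw change this candidate frame assumes, in degrees

def pick_frame(candidates, actual_turn):
    # choose the candidate whose assumed camera motion best matches the real input
    return min(candidates, key=lambda c: abs(c.predicted_turn - actual_turn))

# three speculative futures: turn left, hold steady, turn right
candidates = [Candidate(-2.0), Candidate(0.0), Candidate(2.0)]
print(pick_frame(candidates, actual_turn=1.5))  # -> Candidate(predicted_turn=2.0)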
Remember when dropping $2K on a PC meant you got an absolute monster of a machine w/ custom hardline tubing, temps in the teens, maybe even a custom case? Pepperidge Farm remembers.
The RTX 5090 uses AI and DLSS to get some pretty impressive results.
Is it all software boosting not raw power?
Boosts with DLSS4 and frame gen
@@samtarooo which sucks
Yes
Not exactly.. the RTX 4070 is 40% slower than the RTX 4090.
The RTX 5070 is 40% faster than the RTX 4070, so its normal performance will be on par with the RTX 4090 in most games, even without DLSS or frame gen.
@@Tapsby7 False, the 4080 Super is 40% slower than a 4090.
wow - that is a fully ray-traced ultra 8K jacket on Jensen....
i'm glad we are getting back to 2-slot cards, but can we get back to the sub-1.3k pricing like back in the days of the 10 series, and can we bring the power draw down to sub-400 like the 30 series? I don't feel comfortable putting in a card that draws more than 450 watts, especially after the fires of the 4080 and 4090.
Thumbs up.👍
You have made an appropriately mild and cautious comment on the NVIDIA announcement.
Let's see how it comes out and what real gaming benchmarks will show.
Does it make sense to upgrade from 3080?
The way I'm taking it is that it retains the first frame, modifying it 3 times before acquiring the fifth frame. Is a higher framerate required, or will it be noticeable at 60, and what about fast-changing scenes? I definitely won't be spending $2000 again (that was covid desperation). But I'll probably do the 5080, if it's ever available.
Can we get a video on Nvidia's timeline for architecture? Seeing dates thrown around that Blackwell Ultra is coming late 2025 and Rubin in early 2026.
I recently got a 4070 Ti Super, did I make the wrong choice? Or is it too early to say?
Bad choice as the 5070 will absolutely outperform it while costing less
@@kloklojul depending on how long he owns the card. This is the question.
@@kloklojul We'll see when the benchmarks come out, I have little faith
I have the same card and will skip the 50 series gen
No, you're fine, the 5070 Ti will perform like a 4080 Super.
*Nvidia's performance claims are just BONKERS !! And $999 for a 16GB VRAM GPU (5080) is still way too much !* Just like the 12GB VRAM 5070 for $549 !
I'm sure they are betting that their 5070 Ti will be their BEST seller ! *But ... I think it will be the Radeon RX 9070 XT, for $449 to $549 !!*
It's looking like there is about a 1/3 raw performance increase over last gen per tier, based upon the limited graphs we've seen.
The thing I don't understand: if MFG is completely flawless (big assumption) and you're pushing 240fps @ 4K in Cyberpunk, when the hell would you ever need to upgrade again? It'd be a very long time indeed. I'm wondering if this is the final GPU series from Nvidia, and they will stop making gaming GPUs for the foreseeable future and focus on the CPUs they have planned.
I think anyone crapping on this 4-to-1 ratio will soon realize that AI can easily create real frames, even if they aren't technically real. This is all just AI fear. Imagine being on the skeptical side of the computer when it first came out. Oof. I think people are reacting wrong.
I don't think frame generation is even worth the extra latency and visual artifacts when the graphics solution doesn't have more memory than their competition. Because when that meagre amount of memory is sufficient for the graphics solution, it is already producing frames fast and smooth enough. So Nvidia is following an anti-consumer path once again here.
It's just occurred to me. I think his claim of it "predicting the future" is a 100% lie, because the technology they're using is called in-between interpolation. You need two frames to generate in between them for it to work. So it would have to render the next frame before showing you it, generate the 3 frames between it and the last rendered frame, show the generated frames, show the next rendered frame, then repeat.
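A minimal sketch of the interpolation pipeline described above, assuming plain in-between blending (Nvidia markets MFG as prediction, so treat this as the commenter's model, not a confirmed implementation). The point it illustrates: the newer real frame must already exist before any in-between frame can be shown, which is exactly where the added latency comes from.

def in_between_frames(prev_frame, next_frame, n=3):
    # n evenly spaced blends between two already-rendered frames
    # (frames represented here as flat lists of pixel values)
    out = []
    for i in range(1, n + 1):
        t = i / (n + 1)  # 0.25, 0.5, 0.75 for n=3
        out.append([(1 - t) * a + t * b for a, b in zip(prev_frame, next_frame)])
    return out

prev, nxt = [0.0, 0.0], [1.0, 4.0]  # two "rendered" frames with toy pixel values
for f in in_between_frames(prev, nxt):
    print(f)  # [0.25, 1.0], [0.5, 2.0], [0.75, 3.0], all shown before nxt is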
Interesting, but for instance in flight sims with TAA mode, there will be almost no performance and FPS increase (except the 5090 with more VRAM), and as for DLSS and Frame Gen, they won't work for VR, am I correct?
Ppl were paying 3k for 4090 not surprised
I don’t like DLSS. I can see the difference with and without it. Looks like I’m sticking to my 4090 for another 2 years
To be fair most people will get the RTX 5080 and be perfectly happy. Only select people will have 4k 240hz monitors needed for the 5090. Not to mention some gamers still prefer 1440p 300+ FPS.
I plan on using Newegg's trade-in program to upgrade my 4090 to a 5090. The prices aren't bad imo. It could be worse. I just hope they can compete with the reduced added latency that AMD's frame gen gives. Nvidia's current iteration doubles system latency, while AMD's only adds 1 or 2ms.
The new Reflex reduces latency by x4.
as long as it works well, i am excited!!!
it works well by faking 75% of the frames generated under a fake resolution
where do you see it can work well? wake up!
We doubled your fake frames because we cannot make a GPU anymore.
At this point they should rename GPU to: Frame Generating Unit
the situation could not be more clear, could it?
fake frames, fake resolutions, it is over
What do you think about convolutional neural networks?
All smoke and mirrors, Jensen. If fake frames are all that you have, I'm not buying. I feel really disappointed right now.
What if you can't tell a negative difference?
@@Safetytrousers if you cant tell, play with tetris
@@quietus5077 I have no idea what I would feel, no user does yet.
Why would you put a 45-degree power cable on a GPU? I think that's a bad choice.
So people don't bend the 12VHPWR, I guess. It will probably be FE exclusive, and those are hard to get anyway.
People with case limitations can struggle with getting the connector into the GPU due to space, and this eliminates that problem, as small form factor builds are becoming more popular.
@@TheRanscat I still don't get how people have that problem. I still have the very same power cable that came with my 4080 FE and no problem, 2 years and still going.
@@theicewitch9328 Cool story. Tell it again.
To avoid stress on the connector and the PCB, and to make cable management easier. It's not a bad idea for this POS connector that likes to burn down to ashes.
How would this affect games like CoD where it's trying to predict player movements? I mean, we can already break cameras.
Digital Foundry found that there was only a 7ms increase in latency compared with DLSS 3 frame generation. So the improved performance is pretty good.
You can always trust Nvidia. They never lied, like AMD did. I was super satisfied with the DLSS 3.5 promises with Ada, and now I'm very positively surprised with the huge RT gains.
So the real performance of the RTX 5070, without hacks, will be half of an RTX 4090, because the 40 series can only generate 1 frame after each real picture while the 50 series frame gen can do 3. Nvidia pulling the same bs and people applauding. 🙃
Fine add this to XBOX AND WE POOR GAMERS CAN SAVE 5 000 DOLLARS. THIS SHIT IS A MONEY GRAB....
how did you get Jack Black to cameo in the thumbnail?
ok, I might be a caveman, but if I set every game I can to 60fps, is the 5080 worth it? or should I just wait till the 4090 goes down in price? currently using a 2080 OC
Having a 240hz monitor, MFG will be kinda cool for single player games
If your current GPU is doing what you need, then that's all that matters. Who cares about the 50 series? Not me, I love my 4090. No need to sidegrade; I'd rather take the money I would spend on a 5090 and beef up the other things in my PC. The 4090 is far from obsolete just because a new gen comes out.
Nvidia fanboys are hating on AMD for not adding RDNA 4 features to the 7000 series cards, but ignoring the fact that most of their gains are AI-created images, or that the 40 series won't get the new tech. They're a special bunch.
Actually the improvements of DLSS4 trickle down to all RTX cards from 2000 series up. The only thing cards older than 4000 series don’t get is frame gen and multi frame gen is restricted to 5000 series. Pretty reasonable honestly.
the AI cores wouldn't handle multi frame gen, so it wouldn't be possible to add it to older cards. everything they can bring is going to older cards, it seems. even regular frame gen probably wouldn't run on the 30 series; i can't be mad at that.
OMG 50 series have crystal balls..
I own a 4090, and I'm skipping this generation. Nvidia really massively cheaped out on their VRAM. VRAM is king these days, as I use my card for generative AI applications. The 5070 should have 16GB minimum, and the 5080 should have come with 20+ GB of VRAM. If you're paying for a $600 USD 12GB VRAM card in 2025, then you're getting ripped off.
I think my GTX 1050 Ti can handle 2 more years of waiting before it gets retired
I hope the new 16-pin power connector will mean the card won't burst into flames.
Can someone please explain? I am building a new PC and I was thinking of buying a 4070 Super. Should I wait for the 5070, or just build a 4070 Super system and after 1 or 2 years buy a 5000 Super? Which one is the best option?
How bad is the latency going to be with 3 fake frames for each real?
It's on their website, I checked it out earlier: 30-33ms for DLSS3 and 32-36ms for DLSS4.
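A rough back-of-envelope on those figures (my own arithmetic with an assumed 240fps output rate, not Nvidia's methodology):

output_fps = 240                    # assumed MFG output rate
frame_time_ms = 1000 / output_fps   # ~4.17 ms between displayed frames
for label, latency_ms in [("DLSS3", 33), ("DLSS4 MFG", 36)]:
    print(f"{label}: input lags ~{latency_ms / frame_time_ms:.1f} displayed frames")
# DLSS3: ~7.9 frames, DLSS4 MFG: ~8.6 frames; the extra 2-3 ms is under one displayed frame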
Apparently the additional frames are predictive frames rather than an interpolation between two frames, so we're going to be in territory where the issue shifts from how bad the input lag is to how accurately the image reflects what's actually going on in real time from player inputs. Frankly, I'm just upgrading from my 3090 to a 5090 because the raw raster performance improvement is enough to justify it; if DLSS 4 works well, that's just icing on the cake for me.
@@scarx4181 That makes sense. But the lower the native FPS is, the longer into the future the AI needs to predict. Like, if the native fps is 100 then it only needs to predict 0.03 seconds into the future, while 20 fps = 0.15 seconds. So I'm guessing this will work great at increasing fps from 100 to 400, but not so much at going from 20 to 80. Of course, if you're getting a 5090 then you probably won't get 20 native fps in any game for quite a few years.
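Working out the horizons above (this assumes, like the comment, that each of the 3 predicted frames lands a full native frame interval ahead; if they are instead spaced at the 4x output rate, the horizons shrink by 4x):

def prediction_horizon_s(native_fps, generated=3, spacing="native"):
    # horizon = number of predicted frames / the rate at which they are spaced
    rate = native_fps if spacing == "native" else 4 * native_fps
    return generated / rate

for fps in (100, 20):
    print(fps, "fps native ->", prediction_horizon_s(fps), "s ahead")
# 100 fps native -> 0.03 s ahead; 20 fps native -> 0.15 s ahead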
The new Reflex reduces latency by x4.
@@frankjohannessen6383 unless it has path tracing.
Thank you Meld. I'm going to get a 5070 Ti, do you think that will be good for future-proofing?
Likely just until 6000 series releases.
No GPU was ever good at future-proofing. AMD fanboys thought the RX 6800/6900 would be future-proof because of 16GB of VRAM, but then Cyberpunk was not even worth playing with RT, and then Alan Wake 2 and Wukong put the last nails in its coffin. I'd say the 5080 will do the best in the future, as 16GB of VRAM will still be plenty, while 12GB is really pushing it, especially considering how impressive the performance of these cards is.
Despite haters saying the 40 series lacked VRAM, it was not true. So far, GPUs have really lacked performance more than VRAM in every single case. But now that you get 2x RT perf, VRAM starts to be an issue, except that they claim the new DLSS4 uses 30% less VRAM, which is really good to know.
@@Averagedude-mi3fl But doesn't that say the new GPUs are really impressive, if they immediately retire last gen?
@@filippetrovic845 Their presentation was good. I'll wait to see what 3rd party reviews look like, but Nvidia bringing out exclusive features every gen could be a way of getting people almost onto a subscription. In other words, giving folks a reason to have to buy a new card every generation by locking features away from the older cards.
That Yellowstone hat is the real winner of the video
Is it a gpu?
4090 performance from the 5070 sounds as if it can achieve that due to DLSS4 (software) more than raw performance?
Not true. It is hardware-based, even if AI-generated. BUT, no one said the 5070 will be absolute trash in raster only. In fact I expect a 30% uplift.
obviously, plus 75% of fake frames w/ new frame generation
@@filippetrovic845 maybe compared to the 4070? and it won't be 30%, probably around 20-25%
For anyone reading this planning to upgrade from what they have. Will you go FE or aib? 🤔
im skipping like i did with 3000 series
@theicewitch9328 I used to upgrade every 2 generations, but with scalpers and crypto it's been 3 generations since my last GPU upgrade. I recently upgraded my monitor to a 4K OLED so I need a new GPU xD
@@theicewitch9328 good for you, how does it feel playing AAA at sub-30fps (real frames tho)😂
I am looking for a Zotac MSRP (Trinity) RTX 5080. I was impressed by their 4090 so I am sticking with Zotac.
He was very specific on the AI usage here. The 5070 will still likely equal, or surpass the prior 4080 in raw horsepower.
Doesn't frame generation massively increase input delay? I tried playing with frame generation on and the fps counter was at 100 but when turning the camera the delay was so bad it felt like 20 fps. Upscaling tech is good but frame gen is worthless.
it's increasing it by 4x but Reflex is reducing it by 4x, so it's back to 0 latency :D The 50XX is gonna sell like never before. I'm so excited to get my hands on a 5090 Astral quad-fan GPU.
In the thumbnail, he really looks like Jack Black.
Yeah I thought it was Jack Black.
should i get an rtx4060 laptop now or wait for an rtx5060 laptop?
u should wait
wait till 6000 series i feel this new generation is a joke
For those mad at the chart, let me break it down: it's DLSS3 FG vs DLSS4 MFG, except for the 2 games at the front. Now, the difference between the 30 series and 40 series was completely insignificant in native rendering; however, we see there is about a 1.3x-1.4x performance upgrade between the 40 series and 50 series in native rendering. I predict the 5060 8GB will hit 50fps where the 4060 Ti 8GB hits 30fps.
Dude, the 4080 destroyed Ampere with raster only, and DLSS 3.5 was a cherry on top, and it was mighty impressive. Now it's a very similar situation but with even more DLSS gains. So basically +30% raster, +100% RT for every class, and the 5080 is even cheaper.
The 5090 is a special case because it's more expensive and so powerful that I don't know how you would even utilise it at this point lol. You need a new expensive 2K 480Hz monitor to play single-player games at those crazy frame rates, which is pointless.
PS: Ampere to Ada was a huge die shrink and this is just a slight process improvement, yet we see even bigger gains overall. It's crazy.
All the 5090 got was basically Ti-version pricing, since it has more VRAM and the full 512-bit bus, from what I have seen.
It got double the CUDA cores, and a radically new architecture.
For what I saw in a graphic on their page, real rasterisation will go more like this:
5070 = 4080
5070 Ti = between 4080 and 4090
5080 = 4090
5090 = 40-50% increase in performance over the 4090
Being an RTX 3080 owner, I think the 5070 Ti will be a good upgrade.
Good logic you have. Haters say "fake frames" while ignoring the fact that raster is going to be improved too. And DLSS 3.5 is very impressive already.
People really need to start considering the 90 series as the new Titans. Remember, the 2000 series RTX Titan was $2500... Most people will have no need to go above the 80 series these days.
Although it probably won't be too noticeable, I imagine sudden changes in motion might make things a lil... odd.