People are crying, but your all still going to buy it over AMD and Intel GPU's so hush up and buy your software dependent soon to be E waste.... Cus your an elite gamer who's team green all the way baby!!! This is what gamers wanted back in 2010, why the sudden change of heart!?!? "Team green, team green, team green!!!"
lol wat. There are no months subscriptions to receive updates for rtx cards. Where you pulling this out of. edit: omfg i know hes talking about the future, my point is that its speculation, nvidia hasnt talked about going to a subscription process any time soon so dont worry about it. my point is dont worry about something that doesnt exist.
You joke, but I’ve actually wondered if they’ve ever had a similar idea float around during their shareholder meetings on how they can go about getting more from people without rustling too many jimmies. Then again, the prices we’ve seen and the people that still justify it … people will whine and complain, but they know their loyal customer base. Just throw around a few stats, show some graphs, play some videos, throw on a few gaudy looking outfits, and in due time: “No, you just don’t get it. They’re _innovating_ a path forward for the future! I don’t see what the big deal is? They’re the ones making technological advancements, why should I not pay monthly for that? This is the future, I’m helping shape it.” You just know it’s coming lol.
@@romanboi3115I wonder why… people will claim it’s the future but we are no where near ready for this “future” in any logical sense. At least with tessellation it made some more sense on paper and didn’t require several times the current available hardware power, yes it was hard to run but not literally “we need something 90x faster to achieve real time high refresh rate rendering” this is insanity. What we have right now isn’t even real ray or path tracing in any sense, they’re already heavily gimped down and underbaked versions of what could be with enough raw power, it’s only a taste if you will and it’s crippling several thousand dollar pcs… this is so so sad
@@romanboi3115 Indiana Jones and the Great Circle... but global illumination works there quite well, even lowend cards can do it.... so yeah, depends on devs and stuff, biggest problem now is UE 5 and how badly optimized that is, but it's easy to port games to consoles and pc, so it's widely used
@@1Grainer1the biggest difference with Indiana Jones is it's running on a modified iDtech engine. Those results are going to be significantly different than a UE5 game. By that I mean noticeably better.
I've been playing games since the 1980's and it's only ever got worse. As the specs and resources available to devs increase, the more bloated games seem to get for no discernible benefit lol.
Optimization isn't worth it. A better optimized game is much more expensive to produce but doesn't bring in any more money for the company. Better put that effort into gambling mechanics like lootboxes
@@JensenBooton It's getting worse at a much faster rate now. The DLSS crutch is getting bigger and bigger. Atleast before they were expected to atleast somewhat try now it's just who cares AI will fix it. I mean I get why he's doing it. Line goes up and get a new leather jacket but it's just going to keep making things worse by extension.
As long as devs have to impress investors with “looks” and not game play it unfortunately will never change. They make games for the guys writing their checks not the people playing them. Games won’t get better until the big guy says so.
@@yogichopra6606 then it will cease to be some uncommon gimmick and instead become a widespread technical failing on the behalf of game developers. using frame gen to get from 30 to 60 fps is not free frames and leaves you with latency issues and picture quality issues, and this is not mentioning the fact that it is dishonest marketing to claim these generated frames as real performance by comparing the 5070 and the 4090.
Dont tell me Asmongold is falling for the scam that is 5070. Nvidia is winning. Yet WE are getting screwed. Makes you wonder why some of the product is cheaper than expected for this huh? So many reports coming in that the 5070 is weaker than expected. Pure dishonesty from Nvidia which I suppose is expected at this point Also feel free to correct me but doesn't frame gen increase input lag by a considerable magin to make up for the frames?
Afaik frame gen lag is nearly unnoticeable. You'd only notice it in an esports title where you wouldn't notice. They apparently cut back on delay for frames this generation as well
the first thing i thought was that it could never ever be real that 5070 = 4090 and i was thinking that they probably included the dlss frames... well i was right.
@@RemedyTalon But unlike my real frames fake frames don't lower input latency which is the whole point of gaming at high fps in the first place. Why does no one defending Nvidia seem to get that?
E-sports and older games dont need ai frame generation. Just turn it off and use the gpus raw power for that. Frame gen, dlss and all this shit are meant for the single player triple a titles.
you rather they focus on rasterization which is less performant than the ai hardware parts of the card? no, it doesn't get you 20-30 fps. everyone using it uses dlss, and most people use frame gen, efficiently getting multiples of times more frames.
@@badwolf8112 Uhm, no, "people" really dont. It still gives better raw performance then the best AMD Card, thats why most buy NVIDIA. AI Stuff is on top of it, but not the reason people buy it. Now this release is a strange spot because the 5080 is really close to 4070 TI spec wise and uses a higher TDP, it will be really interesting how it perfoms in comparison outside of DLSS optimized games like Cyberpunk.
So 50% performance increase for 25% price increase, what else do you want? Expect them to go from 20 fps in 40 generation to 200 in 50 generation and make all of the 50 generation GPUs free? lmao
Leather craftsman here. Jenson is wearing an embossed jacket, so it's just cow leather stamped to look like alligator/croc it's also massive overpriced since for the money he paid you could buy a real croc/alligator jacket (therefore being higher quality) several times over.
@@BaxxyNut In terms of implementation in gpus? Frame gen has a not insubstantial input lag compared to actual frames so they're still unpleasant to game with.
Not an improvement . 5090 is 30% more performance with a 30% larger chip and 30% more power draw and is also a 25% increase in cost compared to the 4090. What a technological marvel.....
"Nobody would've thought we can ever ray trace every single pixel." Goes on to explain how they actually ray trace about every 5th pixel and have the LLM hallucinate the rest This is beyond parody
@@mattheweaton9470 bruh. the AI is literally inventing the extra frames between the ones that actually get rendered. have you listened to the presentation?
@@eventhorizon853 yes because it studied patterns, all the frames are REAL , one is rendered by the GPU manually and the others Through AI , there is nothing fake in this. its like saying the answers you get from chatgpt are fake and not real. such a stupid opinion.
Reminder that these FPS are achieved WITH frame generation, there's a reason they don't show latency / input lags. they're generating 80% of the frames out of thin airs! even if you have 100FPS the actual frame still stuck in 24fps which means everytime you make an input it still being updated at 24fps, to put it simply it is like playing on europe server from California WHICH STILL SUCKS, FPS doesn't matter, input latency and frame time does. it's like having 10Trillion Zimbabwean dollars doesn't make you a trillionare because it's fake money.
Your own fault for not properly using it with a base fps and input delay that already feels good, and not turning on the new reflex 2.0, which will further reduce your input delay by 75%
@@DBTHEPLUG Reflex 2 does not actually reduce input lag, it it's camera space warp to make mouse input feel more responsive (like we already have in VR to reduce head tracking latency)
@rtyzxc If it's just like how dlss isn't considered increasing performance, it doesn't matter. No one cares about technicalities anymore. It's AI. New technology. You can't apply old thinking to it.
NVIDIA is like: *Look, we've got these new cards, they're amazing, 26% more efficient, and consume 25% more energy. Plus, we can sell you some AI bullshit too. yaaayyy*
@@avatarion i caught on long ago, nvidia or intel does anything, pro amd youtubers make video on it bashing it and creating their own narrative, those video get posted and shared on reddit, redditors come to youtube and repeating what the pro amd youtubers told them. none of this matters and everyone buys it anyways
Not if nvidia gets so big they start getting everyone to put RT and all that crap in and don't bother to optimise non RT then if you don't activate RT your game runs like ass and looks like ass.
Raytracing has been available in games and supported by GPUs for 7 years now. There is no excuse for it still be struggling this bad. What you're saying is the equivalent of telling someone in 2007 that if they want playable FPS on the newest GPU at the time, they can just play without shaders and normal maps.
@@TechnoMinarchist and it still has issues with many games not casting enough rays to be not grainy which means graphics cards will need be even more powerful before ray tracing has no downsides. this is still true with ray reconstruction but instead of having grain you have blur and loss of texture detail in motion.
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't). The 5070 could be sold at $400 and the 5080 at $650.
@@TheUnkindness for the past 5 years gpus especially at the top line have been anywhere between 2k-7k when 2020 scalpers were a thing. we havent seen prices like this in probably a decade. this was a very successful reveal. a 5070 for 547 that can match native 4090 using ai upscaling is insane.
@@Thegooob95 You've been fooled once again if you think these are the prices you will actually pay. And still, the 5070 will only match the 4090's performance in ray traced scenarios. It will be far outclassed in rasterized performance.
All this hype, shouldn't it bother you all that this is purely generated from marketing? Not reviews, testimonials or actual independent data, pure marketing..
Thankfully, the majority of tech minded people ive talked to are just as jaded as i am about the whole situation. Will there be droves of braindead trend chasers buying these? Yes. But at least those aware of the typical Nvidia marketing tactics are suuuper unimpressed and very skeptical until 3rd party reviews are released.
The same people who tell me that Nvidia knows more than everyone else about how to run a computer parallel to the people who got vaccinated. Its a paradox of "anti-intellectualism" vs people who surrender their opinions to corporations, and for the same reasons actually. These people have more familiarity with computers than you, who are you to argue with fkn NVIDIA.. would be the thought process.
I had the same thought, pretty disingenuous. I'm sure it was previously recorded "in real time" but I also wouldn't be surprised if they did multiple renders of each section for the best parse.
this is a horrific stagnation. same node, basically the same number of cuda cores, same clocks. the raw performance will be 5-10% higher. take it back nvidia, we don't want it.
delusional brain dead take based off pure speculation. You haven’t seen a single benchmark yet you are making sweeping generalizations based off stat sheet numbers LMAO
If you call paying 2 grand for 28fps winning. Cool. I want cards to play games at 200 fps WITHOUT nasty DLSS. All it does is smear, ghost, and artifact. It looks like absolute ass compared to 15 years ago.
If you are doing raw computation thats all you will ever get, get used to it and quit complaining. Unoptimized games from lazy developers cause the issue not the GPU.
Its not the card that's the problem there, its the game programmers who don't bother to optimize games anymore. Programming USED to be about making the code as efficient as possible since there was limited ram and processing power. When you had kb's of ram in a pc and the processor doing a few thousand calculations a second then programs will run like garbage if the code is bad. Look at the Witcher 3, and Doom 2016 that were put onto the Switch. While perhaps not optimized perfectly, those games run fairly well on hardware that nobody would expect to work like that. Or even look at current mods or games released for retro systems. They are doing wonders with the limited hardware, pushing what you could expect out of those systems to their limits.
With lossless scaling you can do the same thing with a 2080, or hell even a 1650 mobile. These people are scamming themselves. When the 60 series comes out and features some new exclusive feature we’ll see how everyone truly feels just wait
19:50 he said he want to look like Harry Potter from cyberpunk, that listens to death metal, but he wants to be able to camouflage at his grandmas wallpaper! 👌😂
"i think a lot of people like frame gen"...no..just no. it creates so much input lag. If youre the type that enjoys camera mode in games, you probably love this AI slop. But if youre into skill based gaming, frame gen is a cancer
tbh input lag isn't really too bad unless you are playing really competitive games, i am more use to the N64 input lag and honestly for single player games it wouldn't be that much of an issue at all
@@RahzZalinto please, enlighten my pea-brain as to what "the point of what he is saying" is frame gen is nothing more than smoke and mirrors. Youre basically paying for software, not hardware. If a game you want to play does not support frame gen, youre relying on your raw rasterization power which means youre at the mercy of game devs with your $2K space heater. If youre, cool with that then...well...theres nothing more i can say
I think calling it a cancer might not be nuanced enough, because to be fair to it with non competitive games with a high base framerate and refresh rate monitor it can be nice. the issue is the marketing when they make ridiculous claims like the 5070 being as fast as the 4090 and games that like MHWilds that planned on using it to get from 30 to 60fps(in a competitive game much less).
4:13 "Predict the future" ... This is not correct. DLSS4 still requires two existing frames to produce intermediate frames, nothing is predicted. Someday we will have predictive interpolation but that's not what Nvidia is doing right now.
I don't know why people keep saying it's win. To me it always feels so false with DLSS and frame generation. Essentially, you only get real render on the first true frame. The rest is just buffer. No matter how much fps you have, only the real rendered one actually show what's really happening in game as a result of your and games interaction. It's nice to have as option for low end devices, but I am only interested about the real performance.
In older games like Farcry 3 we had “gpu buffered frames” which did the exact opposite of frame generation, it buffered frames ahead of time… god we are going backwards and half the people in the comments are happy to pay for it we are so f#%^3d as enthusiasts. People arguing about “input delay” and “frame times” when the games it wouldve truly mattered in… it doesn’t even matter in because they already run so good! It’s just like framegen, you’d only want to use it when your game runs bad but it’s useless with low fps. Instead of focusing on real tangible features they’re reeling in the lowest common denominator with more buzzwords about frame times and big fps numbers when these people don’t even understand what’s being explained to them…
AI dlss is meant for gaming laptops on single player games. there you would really have 5fps on everything and will tolerate anything to have the 100fps on them on modern demanding games. i only use laptop because of my on-the-move work and lifestyle, i am constantly on a train or on a plane, so, for me, personally, this is.. okay..the 5090 mobile version could be good, for me.e-sport games are very low demanding graphically and dont need AI dlss, thats what most hardcore gamers play.
If people rejected frame generation like they did DEI, we'd have mid grade cards with 20+ GB of Vram and solid rasterization/RT performance. But if they can convince people to buy 12-16 gb cards with more software simulating the difference, why wouldn't they just do that?
If that were the case AMD would have accomplished that ages ago, people are expecting performance that doesnt exist and isnt at all possible and when NVIDIA finds ways around it (even if they aren't great) they get hated on by enthusiasts, its possibly the dumbest market ive ever seen outside of the trading card collectors.
@MontySlython I personally think the performance on my 7900 xtx without frame gen and upscaling is great. And if they made a 8900 xtx with 20% more performance, people would buy it if it was priced well enough. But the problem is they're more concerned with consoles than high end DIY GPUs, and it's more beneficial for them to focus on how to generate performance on a much smaller GPU to keep the system costs as low as possible. There's no reason for them to focus on a niche high-end now, because people accept AI card features. Which means they can produce cards that handle AI workloads, and sell them to a wider market outside of gamers. And because the gamers mostly are happy to buy them too, they have no reason to go another direction.
@@ElderGamerX Thats not a satisfactory defense at all, in fact the fact that their tech and R&D focus entirely on keeping chips small and efficient should translate to way stronger GPUs than NVIDIA anyways (at least in the minds of the NVIDIA detractors), if they dont make them its not because they cant or arent focusing on it, its because they simply dont want to for whatever reason. People are mad at NVIDIA for presenting mostly AI stuff in their presentation even though thats where they make the big bucks and are acting as if NVIDIA making their GPUs the way they do is purely greed when they dont see that same mindset applies to AMD as well and focus all their hatred on NVIDIA. NVIDIA is much the same, they have no reason to make GPUs that are way stronger than the current ones if they have to be way bigger, they can wait until their R&D comes up with something new, in the meantime though they can fit more AI cores for better software improvements that translate to far better performance for single player games, which is absolutely needed considering how awful games have been with optimizations.
@MontySlython I mean, I agree that there's market shifted towards AI, and now that all the GPU makers in town want to chase it, people don't have any choice. But the problem is that more AI features will never catch up with poorly optimized games. Devs are already accounting for upscaling and frame gen in game specs. Which just means they'll lean on the tech and do even less, rather than the tech getting people ahead. Personally, I wasn't that impressed with DLSS 3. I have a friend with a 4080 super hooked up to a 4k display, and I really didn't care for it. It felt floaty, and while sometimes it looked good, sometimes (especially when there was a lot of movement) it looked streaky and weird. I acknowledge that obviously this is the direction the tech is going. And there likely wouldn't be a market for too much longer on a pure rasterized card. But I do think it would have been fine for a couple more gens, and I also feel it's a bit early to make the shift to AI graphics. I'm guessing by the 70 series, it'll be indistinguishable from raw raster in 99.9% of cases. And I feel THAT is when the market should have shifted towards the adoption of the new tech. But hey, AMD taking the hit now to angle towards the future will end up being a short term loss, that will hopefully translate to survival in the 2030s. I just wish we could have had a 8900 xtx before they downgraded to RDNA 4, since it looks like they won't be competitive until UDNA anyway. Because as it stands, there's basically no reason for anyone who had a GPU above a 7700 xt to get a 9070 xt, unless they're holding back something nuts. AMDs "top end" card this gen will be a $499 card that is generally worse than a 5070 despite the 4gb extra Vram, because their software isn't as good. So in a year when people can get the card for $350 on sale, it'll be good value for anyone who held out from RDNA 3. Which will likely set AMD back to single digit market share, since people will buy 5070s thinking they're 4090s.
@MontySlython The problem is they didn't though. They thought they'd be able to stack this chip, and effectively SLI 2 together on the PCB to make a 9090 xt. But it doesn't work at scale like that. So now they're left with a card that's weaker than a 7900 gre, and are relying on FSR 4 to keep up. But given their track record, FSR 4 will probably look like DLSS 3, which isn't good enough to keep their market share. So now they've backed themselves into a corner, because unless they can find a way to improve upon their console designs with UDNA and make a significantly more powerful card, they've removed themselves from the high end completely, and engaged with a battle with Intel on the low end. And considering Intel has better frame gen software than FSR 3 already, there's no guarantee they win that battle.
Nvidia's pricing is laughable. $50 price reductions after $300-500 price hikes is barely even charitable, and Multi-frame Generation is not a substitute for paltry 10-15% gains on native performance for the 70/80-class cards.
You had years to get these bandaids up. You just sound like the type of person to look into your (empty) wallet after seeing the product, instead of having your wallet prepared and ready to go before the product is even announced.
Seems like a type of person who doesn't even want to look at the price history + add the inflation. From 2020 inflation has been around 25%, from 2022 10%. Prices have stayed the same pretty much 3 generations. 50xx is even lower priced than anyone could have predicted. @@DBTHEPLUG
@@DBTHEPLUG "You had years to get these bandaids up." I'm not sure I follow? I'm just criticizing their wild price hikes starting with the 40-series. The 3080 was $700, the 4080 started at $1200 before they corrected after backlash. The 80-class GPUs in particular have lost their place as an affordable enthusiast card and moved into GTX TITAN territory without offering the performance to back it up.
@@DBTHEPLUG "instead of having your wallet prepared and ready to go." Lol spoken like a true Consoomer. God forbid people try to be discerning with their purchases and make smart choices with their money.
@@enmanuel1950 Of course. I want a new gpu, and have to work with the options that are presented. Instead of being ready to complain with an emtpy wallet in the RUclips comment section of some random RUclipsr (who isn't forwarding any of my complaints to nvidia), I'm instead preparing my wallet and having it ready to go for whenever it's time to go. I've seen you guys complain about these prices since 2020. We are 5 years further and where have those complaints gotten you? Exactly, nowhere. In the period you guys were complaining (2020-2025), nvidia's stock rose by 2200%. Just think about what that indicates.
People always saying that RT is performance heavy so it's okay to only get 28fps in Cyberpunk (a 5 year old game) on world's newest most powerful gpu... RT wouldn't be so damn performance heavy if they dedicated resources to optimising it and R&D to make their chips run it better instead of dumping all their money into Fake Frames instead. Real-time raytracing has been on consumer hardware for a long time now; 7 years. There is simply no excuse that we're still struggling with it without Fake Frames. Look at what little improvements have been made from 2018 to 2025, and compare that from 2000 to 2007. No excuse.
Probably because rasterized performance has hit a plateau in recent years compared to the early 2000’s. Nvidia was extremely clear that this was impossible to do at this scale without ai. I have no desire to spend 10k on a chip that’s 20 pounds and has to sit independently from my pc to play cyberpunk 8 k native at 60 fps. And if they went that route, it wouldn’t scale to the laptops. Ai is the future, get used to it
@@Thegooob95if frame gen works as good as Nvidia claims, a 5080 will run Racing sims at triple 4K120 max settings and triple frame rates in CPU limited games. Thats insane tech if it works. Imagine playing any game at 4K with 8x supersampling AA at 100fps on midrange cards. Thats worth a small latency hit in 90% of games.
Bet you feel smart capitalizing Fake Frames like an idiot who doesn't comprehend how ludicrous running PT would've been even running on the first RTX gen cards.
@@deathtoinfidelsdeusvult2184 no shit. but it also for no reason cost 2000$ when get this its only a 15% increase in performance from the 4090. an already outrageously over priced card.
I personally hate frame gen. Not because it is a bad thing to have, but because just like TAA it is a bandaid fix to problems that could be solved better. It allows for lazyness in optimization on the game dev's side which who at this point probably assume that you are using frame gen and dlss/fsr meaning that they only care if the game runs well enough with those enabled and don't care about actual performance. The graphics programmers should be forced to work on midrange GPUs, ceneter the game on those with no frame gen or dlss and then switch to a top tier pc for the ultra settings. To me it feels like most games develop for ultra and turn off features completely from there. The worst example that I know is Ark: Survival Ascended like, do not want fully ray traced shadows? I guess you ain't getting any shadows then. If we want ray tracing we should start adding more RT specific cores, or hell maybe even an RT specific card rhat does nothing but that instead of afding AI to halucinate 3/4 of the frames for us and on top of that most of the pixels too. I really hope this trend turns around soon where we pay sky high prices for fake frames.
All your reasons for disliking NVIDIA stem from game devs and not NVIDIA itself, if it were possible for them to push far more raster performance they would have done it, why are the expectations for NVIDIA 10x higher than for AMD even though AMD only markets itself on raster performance and low prices? If it were magically possible for them to make their GPUs more powerful by just not pushing AI at all then wouldn't AMD have reached this mythical raster performance that NVIDIA isn't hitting? it makes no sense and nobody acknowledges this dichotomy.
@MontySlython I have the same expectations for amd too. I just hate that nvidia leaned into AI generated fake frames as "perfomance". But part of that might come from the fact that I often use GPUs for compute too. And yes it is mostly unreal forcing certain feauters that causes some game dev issues, but selling GPUs based on how well they can fake frames for you doesn't make anything better.
@U_Geek doesn't make things worse either, it's adding some type of performance that they otherwise couldn't figure out how to get on the card, or maybe they could have but didn't because their research and development is trying to push past the need for raster altogether, I see both sides of the argument but the overwhelming hate for nvidias approach is absurd when their techniques are pushing us past the need for extremely strong cards in the future (hopefully), if game devs make things worse thats mostly on them not nvidia imo.
That's exactly why your games look like blurry, smeary shit in recent years instead of crispy like they used to 10 years ago. Rasterisation is king. F*ck fake frames.
What is NVidia losing? They literally have made the most powerful GPUs in existence for years now. AMD gave up, there is no other competitor. They won, not lost.
Lmao 30% is huge. The jump from the 980 to 1080 was 30% and is considered historic yet you petulant children are not satisfied with this 30%? If I were a Jensen I’d just stop marketing to gamers entirely. You people are not worth the time of day.
@@apersonontheinternet8006all of your comments are extremely angry and negative in nature while also consistently toxic, you’re also completely ignoring what op just said. The raw performance increase may be acceptable but the actual new features of the cards suck. They are being marketed towards features that still don’t even make sense in real use cases. Nobody is saying the cards are bad the misleading marketing is, and so is framegen inherently. The only time you’d really want to depend on it is when it’s useless; when your performance is already bad Will my understanding of this topic impact gpu sales? No. Did I say it would bother me? Not at all. Just covering my tracks before you try to respond with a rude comment. People are obviously going to continue buying Nvidia and that’s ok, but it’s important to inform consumers of what they’re getting into
raw raster got around 30% upgrade as usual with new generation, so why are you crying that there is a new software bonus that you can use if you want to?
While I'll probably just rock my little 4060 till the doors fall off, ...I would happily "volunteer" to receive a free 5090 as an Asmongold subscriber. If only one of us must make that sacrifice, ...I nominate myself to bear that burden.
Problem is that the game does not get "more responsive", because the game only "knows" about 1 of those four frames. The Jayz2Cents video explains it brilliantly.
Nah this is an L announcement they keep raising the 90 series price by a couple hundred every time. Their 5070 better than 4090 is so disingenuous because raw power it won't be. Plus the performance difference on their lower end cards keep coming closer together so its pointless to upgrade unless you get their best card.
22:52 DLSS is Deep Learning Super Sampling, which means it completes the frame but only rendereing a fraction of it (e.g. render at 1080p and then use DLSS to generate a 4k frame) and then Frame Generation takes that one completed frame, and generate 3 extra frames. the video on the left is DLSS OFF, so basically it has no super sampling nor frame generation. my guess is, 27 frames is at 4k dlss (as stated), and regular DLSS Quality will render it at 2k and frame rate will often go up by 1 to 1.5 times, say +1.2 times. so you get 60 frames. now DLSS4 gives you 3 extra frame per rendered frame, so you take 60 multiply by 4, you get 240. the math checks out. Whats confusing is its demo-ing both DLSS and DLFG, which makes the difference so huge.
12:15 It's backwards because of the illution that the price is decreasing rather than increasing. It's the new manipulative way. You will see more of this, maybe apple too etc.
cant wait for new games to run like crap or have 3 frames of visual latency and zero in-between.. we have very old games that look beautiful and run way better. feels like we are just wasting so much time on AI graphics that we are just leaving rendering and optimization in the dust
I like how some really clever folks cry at 25fps like it's not running a 4k path tracing at ultra settings. They could've easily shown us 300+ fps without path tracing.
Nobody is unhappy with the performance they’re unhappy with framegen and deceptive marketing. it’s realistically just internet drama but it’s interesting to see people’s favoritism in action and just ignoring why people are upset, the card is going to sell because they have no high end competitor regardless. People are just worried at where the market is headed and that’s totally fair we should be cautious but we aren’t Would you also like to explain the deceptive marketing to these “clever folks”? Or would that make you angry becuase im apparently an AMD fanboy for understanding how things really are? Or hell im just an intel fanboy at this point too! Just covering any typical response you would hit me with. I am not a fanboy, I am well informed.
@@opticalsalt2306 those people can't even understand that it's 25fps because using path tracing, why would I explain anything to them. They need to read information before commenting, at first. They won't change anything with being "unhappy", they can't do anything at this point. Even buying a product from other companies won't help, because they are doing the same thing.
If they can make a pixel perfect representation of what raw compute can convey, then it doesn't matter how they get there. The ghosting and input lag, once completely fixed, solves the final problem of reaching beyond 4k 120hz. From what I have seen, the tech has reached the boundaries of what is possible for modern transistors, so we have to rely on new tech to produce pixels on the screen.
@@Onetooth7997 He probably already use AMD or Intel, Xess is quite good in term of quality but if he only use FSR, well if he's happy with the quality that's good for him
upscaling technology is pretty good though especially at higher resolutions (especially DLSS and that is coming from someone who has an AMD gpu). that being said unless under a very niche gaming circumstance frame gen kinda sucks and should not be shown as real performance.
probs also coz of possible incoming tariffs. building a rig now with minor bank charges vs building a couple months of paychecks down the road w/ ~25% increased cost. 5090 will be 2k when it releases, but 2.5k come summer time.
No surprise there, offering people loans with high interest and capitalizing on their stupid financial decisions by them purchasing crap that they don't need. A fool and his money are easily parted.
Went from not owning games to not owning frames.
Very true, I would rather pay a premium to have a proper gpu that uses no ai crap.
People are crying, but your all still going to buy it over AMD and Intel GPU's so hush up and buy your software dependent soon to be E waste....
Cus your an elite gamer who's team green all the way baby!!! This is what gamers wanted back in 2010, why the sudden change of heart!?!? "Team green, team green, team green!!!"
@@AlexHonger-fj3nxnope I’ll keep buying AMD.
Amd 6000 series ftw
Happy with my 7900XT
Bro selling so much snake oil they had to start turning the leftover snake skins into jackets..
lmfaoooo 😂😂😂 i’m dying from this comment
Imagine paying thousands of dollars only to get PS4 and Xbox One performance natively 🤣
@idindunuphenwong
Horrible take 💀
@@idindunuphenwong Imagine thinking PS4 and xbox performance was native itself lmao.
this snake aswell as AI has the performance of 10 4090s here now only for $3000
I like how you can see the state of Nvidia by how fancy Jensen's jacket is
It's a symbol of their greed
Crafted from alligator material, thats some bada$$ leather armor
That's why their gpus keep increasing in price he needs the money to bring back the dinosaurs so he can make a new jacket out of T rex skin
It's an 8k jacket from Tom Ford lol
His jacket is a part of his body
Cant wait for your GPUs preformance to not actually be based on the card itself but in the software updates you pay monthly for.
lol wat. There are no months subscriptions to receive updates for rtx cards. Where you pulling this out of.
edit: omfg i know hes talking about the future, my point is that its speculation, nvidia hasnt talked about going to a subscription process any time soon so dont worry about it. my point is dont worry about something that doesnt exist.
You joke, but I’ve actually wondered if they’ve ever had a similar idea float around during their shareholder meetings on how they can go about getting more from people without rustling too many jimmies.
Then again, the prices we’ve seen and the people that still justify it … people will whine and complain, but they know their loyal customer base. Just throw around a few stats, show some graphs, play some videos, throw on a few gaudy looking outfits, and in due time: “No, you just don’t get it. They’re _innovating_ a path forward for the future! I don’t see what the big deal is? They’re the ones making technological advancements, why should I not pay monthly for that? This is the future, I’m helping shape it.”
You just know it’s coming lol.
@@Thegooob95 up your reading comprehension. they're talking about the future when saying 'can't wait'.
@@Thegooob95 his point is basically in the future its all ai doing the thing not the card anymore so yeah we might pay for the ai as well lol
@@thunderlion4897 at this point thats complete speculation.
Let me tell you how to boost your 20fps to a 100fps in one simple technique, turn off raytracing.
Yup, until games hardbake Raytracing for no good reason.
@@romanboi3115I wonder why… people will claim it’s the future but we are no where near ready for this “future” in any logical sense. At least with tessellation it made some more sense on paper and didn’t require several times the current available hardware power, yes it was hard to run but not literally “we need something 90x faster to achieve real time high refresh rate rendering” this is insanity.
What we have right now isn’t even real ray or path tracing in any sense, they’re already heavily gimped down and underbaked versions of what could be with enough raw power, it’s only a taste if you will and it’s crippling several thousand dollar pcs… this is so so sad
@@romanboi3115 Indiana Jones and the Great Circle... but global illumination works there quite well, even lowend cards can do it.... so yeah, depends on devs and stuff, biggest problem now is UE 5 and how badly optimized that is, but it's easy to port games to consoles and pc, so it's widely used
@@romanboi3115 not enough people are willing to drop that much on a beefy computer
@@1Grainer1the biggest difference with Indiana Jones is it's running on a modified iDtech engine.
Those results are going to be significantly different than a UE5 game. By that I mean noticeably better.
Optimization is going to get even worse isn't it
I've been playing games since the 1980's and it's only ever got worse. As the specs and resources available to devs increase, the more bloated games seem to get for no discernible benefit lol.
We've gone from optimised to sloptimised
Optimization isn't worth it. A better optimized game is much more expensive to produce but doesn't bring in any more money for the company. Better put that effort into gambling mechanics like lootboxes
@@JensenBooton It's getting worse at a much faster rate now. The DLSS crutch is getting bigger and bigger. Atleast before they were expected to atleast somewhat try now it's just who cares AI will fix it.
I mean I get why he's doing it. Line goes up and get a new leather jacket but it's just going to keep making things worse by extension.
As long as devs have to impress investors with “looks” and not game play it unfortunately will never change.
They make games for the guys writing their checks not the people playing them. Games won’t get better until the big guy says so.
4090 performance for third the price. I'm sure there's no catch, right?
With games like Alan Wake 2 and Monster Hunter Wilds being made with frame gen in mind it starts to become less of a gimmick and more of a standard.
@@yogichopra6606 then it will cease to be some uncommon gimmick and instead become a widespread technical failing on the behalf of game developers. using frame gen to get from 30 to 60 fps is not free frames and leaves you with latency issues and picture quality issues, and this is not mentioning the fact that it is dishonest marketing to claim these generated frames as real performance by comparing the 5070 and the 4090.
DLSS , Mult generation, AI... yeah there is, its not native performance.
@@yogichopra6606Input latency kills it.
@@kouyasakurada5547Nvidia reflex is getting closer to completely eliminating input delay caused from frame gen
Dont tell me Asmongold is falling for the scam that is 5070.
Nvidia is winning. Yet WE are getting screwed.
Makes you wonder why some of the product is cheaper than expected for this huh?
So many reports coming in that the 5070 is weaker than expected.
Pure dishonesty from Nvidia which I suppose is expected at this point
Also feel free to correct me but doesn't frame gen increase input lag by a considerable magin to make up for the frames?
Afaik frame gen lag is nearly unnoticeable. You'd only notice it in an esports title where you wouldn't notice. They apparently cut back on delay for frames this generation as well
the first thing i thought was that it could never ever be real that 5070 = 4090 and i was thinking that they probably included the dlss frames... well i was right.
He just pretending to fall for it because he knows that starforge will use it and make more profit
@@BaxxyNut They delay has been published and the new frame gen adds 57 miliseconds of delay.
@@Jankmastathat's a fucking lot of latency. That's >4x the latency of 60FPS, double the latency of 30FPS, and higher than most people's ping.
39:19 The guy who randomly yelled out Linux had his brain extracted and is now being used to train the next model of DLSS.
POV your entire new GPU relies solely on ai generating frames which dont make the gameplay more responsive
But Reflex 2 does. The AI generated frames follow the camera movement, which hides the latency.
Make real frames great again
They are generated by the GPU.. just like your "real" frames.
@@RemedyTalon But unlike my real frames fake frames don't lower input latency which is the whole point of gaming at high fps in the first place. Why does no one defending Nvidia seem to get that?
@@RemedyTalon so I guess minecraft graphics and cyberpunk graphics look the same to you since they are both rendered by the same gpu?
E-sports and older games dont need ai frame generation. Just turn it off and use the gpus raw power for that. Frame gen, dlss and all this shit are meant for the single player triple a titles.
Rasterization Gang
Nvidia wins and the consumers keep on getting fked.
And the ones that invested a while back🤑
well consume something else
No one forcing you to buy it. Go buy your obsolete amd gpus.
Seriously, go buy AMD or Intel they're literally the other competitors, but I know you won't because you still want Nvidia.
Why?
And he reacted to a gamedev, who explained why we have smear, ghosting, and inputlag in games.
All that is going to be way worse with 3 fake frames for every real one.
but now we have AI FRAME CORE FRAME AI FRAM AI AI FRAME JIZZ AI FRAME TECH AI AI technology with AI technology. duh,
Now like 15 out of 16 pixels can be ai generated or upscaled. It's insane @@salikali2
@@salikali2just wait for frame gen 3 with 10 FAKE FRAMES 😂
The latency of 30fps at 100+ FPS
Jensens jacket is 25% real leather and 75% AI simulated leather. That’s why it shimmers from heavy artefacting.
$1600 gets you 20 FPS and $2000 gets you 30 FPS. Truly innovative
50% Boost
why?
because fabrication is stuck at 4nm, our hardware stuck, but demand want more and more graphic quality
you rather they focus on rasterization which is less performant than the ai hardware parts of the card?
no, it doesn't get you 20-30 fps. everyone using it uses dlss, and most people use frame gen, efficiently getting multiples of times more frames.
@@badwolf8112 Uhm, no, "people" really dont. It still gives better raw performance then the best AMD Card, thats why most buy NVIDIA. AI Stuff is on top of it, but not the reason people buy it. Now this release is a strange spot because the 5080 is really close to 4070 TI spec wise and uses a higher TDP, it will be really interesting how it perfoms in comparison outside of DLSS optimized games like Cyberpunk.
So 50% performance increase for 25% price increase, what else do you want? Expect them to go from 20 fps in 40 generation to 200 in 50 generation and make all of the 50 generation GPUs free? lmao
Leather craftsman here. Jenson is wearing an embossed jacket, so it's just cow leather stamped to look like alligator/croc it's also massive overpriced since for the money he paid you could buy a real croc/alligator jacket (therefore being higher quality) several times over.
Asians don’t care much
So he paid premium to be eco friendly since there's more cows than gators.
I'm pretty sure money is not an issue for him.
AI shouldnt be needed in order to reach playable framerates.
What exactly discounts AI?
A I is a feature,
@@BaxxyNut In terms of implementation in gpus? Frame gen has a not insubstantial input lag compared to actual frames so they're still unpleasant to game with.
@@Barthea-oe1el i use framegen and also don't use it. No noticeable difference other than a raise in frame.
Lmao if you want to have the best possible graphics then yes, these are gaming gpus not 3d moddeling so having this feature makes it possible
Not an improvement . 5090 is 30% more performance with a 30% larger chip and 30% more power draw and is also a 25% increase in cost compared to the 4090. What a technological marvel.....
careful NVIDIA fanboys will tear you apart
It's 130% more performance if you enable the triple AI hallucinations to 4090's single AI hallucinations.
i thought they're both 2000 usd
@@thewhyzergeekin off the NVIDIA tabs
well the 5080 is 30% faster at the same price as 4080 so at least thats not a bad deal. We are getting 4090 performance for $1000
"Nobody would've thought we can ever ray trace every single pixel."
Goes on to explain how they actually ray trace about every 5th pixel and have the LLM hallucinate the rest
This is beyond parody
fake tracing
Why would a llm be hallucinating pixels for ray tracing?
@@mattheweaton9470 ignore him he doesnt really know what he's saying :)
@@mattheweaton9470 bruh. the AI is literally inventing the extra frames between the ones that actually get rendered. have you listened to the presentation?
@@eventhorizon853 yes because it studied patterns, all the frames are REAL , one is rendered by the GPU manually and the others Through AI , there is nothing fake in this. its like saying the answers you get from chatgpt are fake and not real. such a stupid opinion.
The frames are LIES people, they're just a bunch of AI slops disguised as frames.
Reminder that these FPS are achieved WITH frame generation, there's a reason they don't show latency / input lags. they're generating 80% of the frames out of thin airs! even if you have 100FPS the actual frame still stuck in 24fps which means everytime you make an input it still being updated at 24fps, to put it simply it is like playing on europe server from California WHICH STILL SUCKS, FPS doesn't matter, input latency and frame time does. it's like having 10Trillion Zimbabwean dollars doesn't make you a trillionare because it's fake money.
take that back about zimbabwe 😡
Bro the amount of research gone into it to get those fres generated is immense, I don't know why they don't acknowledge it.
@@Jim-og7jz I am sorry :(
@@Jim-og7jz I am sorry :(
@@Jim-og7jzI am sorry :(
Everyone love frame gen until frame gen messed up their input delay.
Your own fault for not properly using it with a base fps and input delay that already feels good, and not turning on the new reflex 2.0, which will further reduce your input delay by 75%
@@DBTHEPLUG Reflex 2 does not actually reduce input lag, it it's camera space warp to make mouse input feel more responsive (like we already have in VR to reduce head tracking latency)
mine doesnt
And ghosting
@rtyzxc If it's just like how dlss isn't considered increasing performance, it doesn't matter.
No one cares about technicalities anymore. It's AI. New technology. You can't apply old thinking to it.
Asmongold falling for the dirty 5070 marketing
5070 same performance as an 4090 using AI
Yeah using Fake frames SMH
@@fuentesjuanjose90 fake frames don't exist.
The fake frames can’t hurt you. He immediately understood it was with DLSS and MFG.
For the price 5070 is still a good card, mark my words it will be the highest selling card in the 50 series.
Going to run out of vram in a couple years in 1440p games.
Nvida is winning, even Jensen's jacket has Ray Tracing on
3x the fake FPS. 3x the price. 3x the input lag. 3x the motion blur. 3x the scam. 3x the melted cables. Did i forget anything?
3x the tears from poors
why do you think it will have input lag, current nvidia frame gen doesnt
NVIDIA is like: *Look, we've got these new cards, they're amazing, 26% more efficient, and consume 25% more energy. Plus, we can sell you some AI bullshit too. yaaayyy*
@@bigturkey1 It just has to have, that's the current comment section meta. Haven't you caught on yet?
@@avatarion i caught on long ago, nvidia or intel does anything, pro amd youtubers make video on it bashing it and creating their own narrative, those video get posted and shared on reddit, redditors come to youtube and repeating what the pro amd youtubers told them. none of this matters and everyone buys it anyways
Did yall know you can play games without ray tracing and it won’t look trash 😧
and it can look worse with ray tracing
Not if nvidia gets so big they start getting everyone to put RT and all that crap in and don't bother to optimise non RT then if you don't activate RT your game runs like ass and looks like ass.
*turns ray tracing on* "huh hat is cool but i prefer 90+fps in games"
*turns ray tracing back off* "aaaah that is better"
Raytracing has been available in games and supported by GPUs for 7 years now. There is no excuse for it still be struggling this bad.
What you're saying is the equivalent of telling someone in 2007 that if they want playable FPS on the newest GPU at the time, they can just play without shaders and normal maps.
@@TechnoMinarchist and it still has issues with many games not casting enough rays to be not grainy which means graphics cards will need be even more powerful before ray tracing has no downsides. this is still true with ray reconstruction but instead of having grain you have blur and loss of texture detail in motion.
People skin jacket is wild
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't).
The 5070 could be sold at $400 and the 5080 at $650.
They’re an insane deal the 4080 is going for almost a thousand right now
Of course it is a good deal. MSRP for 4080 was 1200
@@Thegooob95 Being a better deal doesn't mean it's an ''insane'' deal.
@@TheUnkindness for the past 5 years gpus especially at the top line have been anywhere between 2k-7k when 2020 scalpers were a thing. we havent seen prices like this in probably a decade. this was a very successful reveal. a 5070 for 547 that can match native 4090 using ai upscaling is insane.
@@Thegooob95 You've been fooled once again if you think these are the prices you will actually pay. And still, the 5070 will only match the 4090's performance in ray traced scenarios. It will be far outclassed in rasterized performance.
If your GPU can't give me locked 144FPS at 1440p NATIVE resolution with maxed out graphics, then it shouldn't cost more than $450.
1440p native is more demanding than 4k upscaled
In what game, counterstike? Lol. Asking for the impossible bruh.
5 percent hardware 95 percent ai software
All this hype, shouldn't it bother you all that this is purely generated from marketing? Not reviews, testimonials or actual independent data, pure marketing..
Thankfully, the majority of tech minded people ive talked to are just as jaded as i am about the whole situation. Will there be droves of braindead trend chasers buying these? Yes. But at least those aware of the typical Nvidia marketing tactics are suuuper unimpressed and very skeptical until 3rd party reviews are released.
Almost like it’s their marketing event. U guys just are insufferable
@Thegooob95 yeah, really worked out well for the shareholders this week 😂😂😂 but fr, buy the stock while it's on sale 😉
The same people who tell me that Nvidia knows more than everyone else about how to run a computer parallel to the people who got vaccinated. Its a paradox of "anti-intellectualism" vs people who surrender their opinions to corporations, and for the same reasons actually. These people have more familiarity with computers than you, who are you to argue with fkn NVIDIA.. would be the thought process.
for the record I am unvaccinated
"The more you buy... the more you lose"
Me watching at 480p: Damn that looks amazing
Why do we need these new GPUs and games are still not being made optimized. Business is a crazy thing
"That was real-time computer graphics"
Yet it was clearly a pre-recorded and edited video...
I had the same thought, pretty disingenuous. I'm sure it was previously recorded "in real time" but I also wouldn't be surprised if they did multiple renders of each section for the best parse.
and looking at how they make 28fps to 240fps in CP, then this trailer having 140means, it was almost 10-15fps before DLSS performance and 4x frame gen
this is a horrific stagnation. same node, basically the same number of cuda cores, same clocks. the raw performance will be 5-10% higher. take it back nvidia, we don't want it.
It’s 30% faster. You should stick to mobile gaming.
@@apersonontheinternet8006you really got him with that one Mr sigma, destroyed his entire logic and reasoning… wow…
I don't think so. The 50 series makes 4K more affordable. Previously only possible for 4090 users.
delusional brain dead take based off pure speculation. You haven’t seen a single benchmark yet you are making sweeping generalizations based off stat sheet numbers LMAO
Their "wins" are fake. Just like their frames.
NVIDIA Truly the Masters of AI Slop
amd always the master of failing to copy AI slop
Sloptimised for the modern market
no, the 5070 is nowhere near as powerful as a 4090, calm your horses.
NVIDIA turned GPUs into social media: instead of taking time to calculate facts, spamming whatever feels right at the moment
Last time I paid 2k for something this fake, I needed penicillin for a week.
Lmfao
If you call paying 2 grand for 28fps winning. Cool. I want cards to play games at 200 fps WITHOUT nasty DLSS. All it does is smear, ghost, and artifact. It looks like absolute ass compared to 15 years ago.
If you are doing raw computation thats all you will ever get, get used to it and quit complaining. Unoptimized games from lazy developers cause the issue not the GPU.
Path tracing is 28 fps. Not actually gaming.
Your crappy games would probably run at 400+ fps
Its not the card that's the problem there, its the game programmers who don't bother to optimize games anymore. Programming USED to be about making the code as efficient as possible since there was limited ram and processing power. When you had kb's of ram in a pc and the processor doing a few thousand calculations a second then programs will run like garbage if the code is bad.
Look at the Witcher 3, and Doom 2016 that were put onto the Switch. While perhaps not optimized perfectly, those games run fairly well on hardware that nobody would expect to work like that. Or even look at current mods or games released for retro systems. They are doing wonders with the limited hardware, pushing what you could expect out of those systems to their limits.
you want something that doesn't exist for an consumer
DLSS is good. Game devs using it as a crutch for their unoptimized mess is the problem.
Classic we can't have good things scenario.
"4090 power" yeah, no. Only if you're using the 4x generating AI frames and adding latency
With lossless scaling you can do the same thing with a 2080, or hell even a 1650 mobile. These people are scamming themselves. When the 60 series comes out and features some new exclusive feature we’ll see how everyone truly feels just wait
you sure about that
19:50 he said he want to look like Harry Potter from cyberpunk, that listens to death metal, but he wants to be able to camouflage at his grandmas wallpaper! 👌😂
Dont think Asmon can talk bad about nvidia since he's a co-owner of Starforge unfortunetly
"i think a lot of people like frame gen"...no..just no. it creates so much input lag. If youre the type that enjoys camera mode in games, you probably love this AI slop. But if youre into skill based gaming, frame gen is a cancer
Irrelevant. Normies dont even know what input lag is. Youre missing the point of what he is saying....as yall always do lol
tbh input lag isn't really too bad unless you are playing really competitive games, i am more use to the N64 input lag and honestly for single player games it wouldn't be that much of an issue at all
you clearly haven't played with framegen because its barely noticeable, most of the time not at all.
@@RahzZalinto please, enlighten my pea-brain as to what "the point of what he is saying" is
frame gen is nothing more than smoke and mirrors. Youre basically paying for software, not hardware. If a game you want to play does not support frame gen, youre relying on your raw rasterization power which means youre at the mercy of game devs with your $2K space heater.
If youre, cool with that then...well...theres nothing more i can say
I think calling it a cancer might not be nuanced enough, because to be fair to it with non competitive games with a high base framerate and refresh rate monitor it can be nice. the issue is the marketing when they make ridiculous claims like the 5070 being as fast as the 4090 and games that like MHWilds that planned on using it to get from 30 to 60fps(in a competitive game much less).
modern day pimps. they pimp pixels
Reminds me of Bible pimp lmao
4:13 "Predict the future" ... This is not correct.
DLSS4 still requires two existing frames to produce intermediate frames, nothing is predicted.
Someday we will have predictive interpolation but that's not what Nvidia is doing right now.
I don't know why people keep saying it's win. To me it always feels so false with DLSS and frame generation. Essentially, you only get real render on the first true frame. The rest is just buffer. No matter how much fps you have, only the real rendered one actually show what's really happening in game as a result of your and games interaction. It's nice to have as option for low end devices, but I am only interested about the real performance.
In older games like Farcry 3 we had “gpu buffered frames” which did the exact opposite of frame generation, it buffered frames ahead of time… god we are going backwards and half the people in the comments are happy to pay for it we are so f#%^3d as enthusiasts. People arguing about “input delay” and “frame times” when the games it wouldve truly mattered in… it doesn’t even matter in because they already run so good! It’s just like framegen, you’d only want to use it when your game runs bad but it’s useless with low fps. Instead of focusing on real tangible features they’re reeling in the lowest common denominator with more buzzwords about frame times and big fps numbers when these people don’t even understand what’s being explained to them…
AI dlss is meant for gaming laptops on single player games. there you would really have 5fps on everything and will tolerate anything to have the 100fps on them on modern demanding games. i only use laptop because of my on-the-move work and lifestyle, i am constantly on a train or on a plane, so, for me, personally, this is.. okay..the 5090 mobile version could be good, for me.e-sport games are very low demanding graphically and dont need AI dlss, thats what most hardcore gamers play.
If people rejected frame generation like they did DEI, we'd have mid grade cards with 20+ GB of Vram and solid rasterization/RT performance. But if they can convince people to buy 12-16 gb cards with more software simulating the difference, why wouldn't they just do that?
If that were the case AMD would have accomplished that ages ago, people are expecting performance that doesnt exist and isnt at all possible and when NVIDIA finds ways around it (even if they aren't great) they get hated on by enthusiasts, its possibly the dumbest market ive ever seen outside of the trading card collectors.
@MontySlython I personally think the performance on my 7900 xtx without frame gen and upscaling is great. And if they made a 8900 xtx with 20% more performance, people would buy it if it was priced well enough. But the problem is they're more concerned with consoles than high end DIY GPUs, and it's more beneficial for them to focus on how to generate performance on a much smaller GPU to keep the system costs as low as possible.
There's no reason for them to focus on a niche high-end now, because people accept AI card features. Which means they can produce cards that handle AI workloads, and sell them to a wider market outside of gamers. And because the gamers mostly are happy to buy them too, they have no reason to go another direction.
@@ElderGamerX Thats not a satisfactory defense at all, in fact the fact that their tech and R&D focus entirely on keeping chips small and efficient should translate to way stronger GPUs than NVIDIA anyways (at least in the minds of the NVIDIA detractors), if they dont make them its not because they cant or arent focusing on it, its because they simply dont want to for whatever reason. People are mad at NVIDIA for presenting mostly AI stuff in their presentation even though thats where they make the big bucks and are acting as if NVIDIA making their GPUs the way they do is purely greed when they dont see that same mindset applies to AMD as well and focus all their hatred on NVIDIA.
NVIDIA is much the same, they have no reason to make GPUs that are way stronger than the current ones if they have to be way bigger, they can wait until their R&D comes up with something new, in the meantime though they can fit more AI cores for better software improvements that translate to far better performance for single player games, which is absolutely needed considering how awful games have been with optimizations.
@MontySlython I mean, I agree that there's market shifted towards AI, and now that all the GPU makers in town want to chase it, people don't have any choice. But the problem is that more AI features will never catch up with poorly optimized games. Devs are already accounting for upscaling and frame gen in game specs. Which just means they'll lean on the tech and do even less, rather than the tech getting people ahead.
Personally, I wasn't that impressed with DLSS 3. I have a friend with a 4080 super hooked up to a 4k display, and I really didn't care for it. It felt floaty, and while sometimes it looked good, sometimes (especially when there was a lot of movement) it looked streaky and weird.
I acknowledge that obviously this is the direction the tech is going. And there likely wouldn't be a market for too much longer on a pure rasterized card. But I do think it would have been fine for a couple more gens, and I also feel it's a bit early to make the shift to AI graphics. I'm guessing by the 70 series, it'll be indistinguishable from raw raster in 99.9% of cases. And I feel THAT is when the market should have shifted towards the adoption of the new tech.
But hey, AMD taking the hit now to angle towards the future will end up being a short term loss, that will hopefully translate to survival in the 2030s. I just wish we could have had a 8900 xtx before they downgraded to RDNA 4, since it looks like they won't be competitive until UDNA anyway. Because as it stands, there's basically no reason for anyone who had a GPU above a 7700 xt to get a 9070 xt, unless they're holding back something nuts.
AMDs "top end" card this gen will be a $499 card that is generally worse than a 5070 despite the 4gb extra Vram, because their software isn't as good. So in a year when people can get the card for $350 on sale, it'll be good value for anyone who held out from RDNA 3. Which will likely set AMD back to single digit market share, since people will buy 5070s thinking they're 4090s.
@MontySlython The problem is they didn't though. They thought they'd be able to stack this chip, and effectively SLI 2 together on the PCB to make a 9090 xt. But it doesn't work at scale like that. So now they're left with a card that's weaker than a 7900 gre, and are relying on FSR 4 to keep up. But given their track record, FSR 4 will probably look like DLSS 3, which isn't good enough to keep their market share.
So now they've backed themselves into a corner, because unless they can find a way to improve upon their console designs with UDNA and make a significantly more powerful card, they've removed themselves from the high end completely, and engaged with a battle with Intel on the low end. And considering Intel has better frame gen software than FSR 3 already, there's no guarantee they win that battle.
Nvidia's pricing is laughable. $50 price reductions after $300-500 price hikes is barely even charitable, and Multi-frame Generation is not a substitute for paltry 10-15% gains on native performance for the 70/80-class cards.
You had years to get these bandaids up.
You just sound like the type of person to look into your (empty) wallet after seeing the product, instead of having your wallet prepared and ready to go before the product is even announced.
Seems like a type of person who doesn't even want to look at the price history + add the inflation. From 2020 inflation has been around 25%, from 2022 10%. Prices have stayed the same pretty much 3 generations. 50xx is even lower priced than anyone could have predicted. @@DBTHEPLUG
@@DBTHEPLUG "You had years to get these bandaids up."
I'm not sure I follow? I'm just criticizing their wild price hikes starting with the 40-series. The 3080 was $700, the 4080 started at $1200 before they corrected after backlash.
The 80-class GPUs in particular have lost their place as an affordable enthusiast card and moved into GTX TITAN territory without offering the performance to back it up.
@@DBTHEPLUG "instead of having your wallet prepared and ready to go." Lol spoken like a true Consoomer. God forbid people try to be discerning with their purchases and make smart choices with their money.
@@enmanuel1950 Of course. I want a new gpu, and have to work with the options that are presented.
Instead of being ready to complain with an emtpy wallet in the RUclips comment section of some random RUclipsr (who isn't forwarding any of my complaints to nvidia), I'm instead preparing my wallet and having it ready to go for whenever it's time to go.
I've seen you guys complain about these prices since 2020. We are 5 years further and where have those complaints gotten you? Exactly, nowhere.
In the period you guys were complaining (2020-2025), nvidia's stock rose by 2200%. Just think about what that indicates.
I would be surprised if 5070 actually had 80% of 4090 performance.
I wonder what could be 4090 price in aftermarket
Dont be, because it isnt 😂
The 5070 is gonna have like 60% of the 4090's performance.
it won't even have 20%
the snakes he got for the jacket is also the same snakes he used for the oil.
People always saying that RT is performance heavy so it's okay to only get 28fps in Cyberpunk (a 5 year old game) on world's newest most powerful gpu... RT wouldn't be so damn performance heavy if they dedicated resources to optimising it and R&D to make their chips run it better instead of dumping all their money into Fake Frames instead.
Real-time raytracing has been on consumer hardware for a long time now; 7 years. There is simply no excuse that we're still struggling with it without Fake Frames.
Look at what little improvements have been made from 2018 to 2025, and compare that from 2000 to 2007. No excuse.
Probably because rasterized performance has hit a plateau in recent years compared to the early 2000’s. Nvidia was extremely clear that this was impossible to do at this scale without ai. I have no desire to spend 10k on a chip that’s 20 pounds and has to sit independently from my pc to play cyberpunk 8 k native at 60 fps. And if they went that route, it wouldn’t scale to the laptops. Ai is the future, get used to it
this is, optimizing it ? why do you think RT works better in Nvidia GPUs than in AMD GPUs
@@feelsDankMan_ AMD's RT cores are 2 generations behind at this point.
@@Thegooob95if frame gen works as good as Nvidia claims, a 5080 will run Racing sims at triple 4K120 max settings and triple frame rates in CPU limited games. Thats insane tech if it works. Imagine playing any game at 4K with 8x supersampling AA at 100fps on midrange cards. Thats worth a small latency hit in 90% of games.
Bet you feel smart capitalizing Fake Frames like an idiot who doesn't comprehend how ludicrous running PT would've been even running on the first RTX gen cards.
His jacket is rendering at 25 fps and TAA
Man.. Jensen is selling snake oil, and Asmon is buying
What?! Nah man, you just don't understand.. He's not buying it, not at all..
He's "buying it" and selling it himself.
You say that like you would feel scammed getting a 5070 for $550
@@DBTHEPLUG that is a scam
@@bruhmoment9429 genuinely curious for a 5070 vs 7900 gre comparison whenever it's available
Well hes not exactly the smartest person on the planet
winning with fake frames? nah
Yah
5090 will still be more powerful than all other cards even without frame gen.
What if I told you all frames are fake. They are all generated by the GPU so what does it matter?
@@deathtoinfidelsdeusvult2184 no shit. but it also for no reason cost 2000$ when get this its only a 15% increase in performance from the 4090. an already outrageously over priced card.
@@DeputyFish do you have any proof 5090 is only 15% better than 4090?
The waste of money series.
If you are poor yes. If you think it's useless just don't bought. Personally I going to upgrade it :)
@@fenneck9676dude I’m so happy all the other 2024 models will go dirt cheap , good time to get a 2024 asus .
Nobody is making you buy a 50 series man.
@@fenneck9676 good morning saar - i see we are doing the 'r u poor' marketing tactic like with the 40 series
I'm getting a 5070 ti or 5080 if im lucky
33:25 but wouldn't a blue knife turn red after you... ya know... use it?
Clearly, red blood will also be illegal. Just use blue blood.
Jarvis running on my raspberry pi 5
Nah the 5090 costs twice my current computer that I have had for 6 years.
im gonna go ahead and guess your computer you got 6 years ago is no longer worth even that.
Anyone believing Nvidia’s lies are the real jokers here
Engineers: engine get 25% more power with the same size and fuel consumption.
Designers: add more weight to the car.
I personally hate frame gen. Not because it is a bad thing to have, but because just like TAA it is a bandaid fix to problems that could be solved better. It allows for lazyness in optimization on the game dev's side which who at this point probably assume that you are using frame gen and dlss/fsr meaning that they only care if the game runs well enough with those enabled and don't care about actual performance. The graphics programmers should be forced to work on midrange GPUs, ceneter the game on those with no frame gen or dlss and then switch to a top tier pc for the ultra settings. To me it feels like most games develop for ultra and turn off features completely from there. The worst example that I know is Ark: Survival Ascended like, do not want fully ray traced shadows? I guess you ain't getting any shadows then. If we want ray tracing we should start adding more RT specific cores, or hell maybe even an RT specific card rhat does nothing but that instead of afding AI to halucinate 3/4 of the frames for us and on top of that most of the pixels too. I really hope this trend turns around soon where we pay sky high prices for fake frames.
All your reasons for disliking NVIDIA stem from game devs and not NVIDIA itself, if it were possible for them to push far more raster performance they would have done it, why are the expectations for NVIDIA 10x higher than for AMD even though AMD only markets itself on raster performance and low prices? If it were magically possible for them to make their GPUs more powerful by just not pushing AI at all then wouldn't AMD have reached this mythical raster performance that NVIDIA isn't hitting? it makes no sense and nobody acknowledges this dichotomy.
@MontySlython I have the same expectations for amd too. I just hate that nvidia leaned into AI generated fake frames as "perfomance". But part of that might come from the fact that I often use GPUs for compute too. And yes it is mostly unreal forcing certain feauters that causes some game dev issues, but selling GPUs based on how well they can fake frames for you doesn't make anything better.
@U_Geek doesn't make things worse either, it's adding some type of performance that they otherwise couldn't figure out how to get on the card, or maybe they could have but didn't because their research and development is trying to push past the need for raster altogether, I see both sides of the argument but the overwhelming hate for nvidias approach is absurd when their techniques are pushing us past the need for extremely strong cards in the future (hopefully), if game devs make things worse thats mostly on them not nvidia imo.
That's exactly why your games look like blurry, smeary shit in recent years instead of crispy like they used to 10 years ago. Rasterisation is king. F*ck fake frames.
my 4080 never smear
the 5090 sounds awesome but I think I'm going to upgrade to a 4090 when people start selling them
Frame gen is like going from a CRT monitor to LCD - SO MUCH LATENCY
But with Reflex 2.0 you'll be able to track hallucinated enemies at low latency!
Last graphics card I bought was a voodoo😂
I have a 4090, I won't be spending 2k for 30% more real frames. Nvidia is losing, maybe they will convince some with their smoke and mirrors tactics.
4090 owners already lost.
What is NVidia losing? They literally have made the most powerful GPUs in existence for years now. AMD gave up, there is no other competitor. They won, not lost.
Lmao 30% is huge. The jump from the 980 to 1080 was 30% and is considered historic yet you petulant children are not satisfied with this 30%?
If I were a Jensen I’d just stop marketing to gamers entirely. You people are not worth the time of day.
@@apersonontheinternet8006all of your comments are extremely angry and negative in nature while also consistently toxic, you’re also completely ignoring what op just said. The raw performance increase may be acceptable but the actual new features of the cards suck. They are being marketed towards features that still don’t even make sense in real use cases. Nobody is saying the cards are bad the misleading marketing is, and so is framegen inherently. The only time you’d really want to depend on it is when it’s useless; when your performance is already bad
Will my understanding of this topic impact gpu sales? No. Did I say it would bother me? Not at all. Just covering my tracks before you try to respond with a rude comment. People are obviously going to continue buying Nvidia and that’s ok, but it’s important to inform consumers of what they’re getting into
if you have a 4090 you probably wont have to upgrade until they start making 8k high refresh rate gaming monitors
Engineer here. It's completely pointless for you to go from a 4090 to a 5090 if you aren't playing on a 4k monitor.
For real, the entire purpose of upgrading cards is to get the same frame rates at higher resolution
You know what I think when I see this? I think I feel foolish for not having bought AMD stock a year ago, and perhaps foolish waiting for a dip.
I need to see a comparison between the 4090 and the 5090, RT off DLSS off
Then with RT on and dlss on
Same
Raw raster is what maters not some pseudo magical ai shit...
Well then enjoy playing games at shit fps forever
raw raster got around 30% upgrade as usual with new generation, so why are you crying that there is a new software bonus that you can use if you want to?
While I'll probably just rock my little 4060 till the doors fall off, ...I would happily "volunteer" to receive a free 5090 as an Asmongold subscriber. If only one of us must make that sacrifice, ...I nominate myself to bear that burden.
Problem is that the game does not get "more responsive", because the game only "knows" about 1 of those four frames. The Jayz2Cents video explains it brilliantly.
I may have spoken too quickly. The new Reflex system and the ability to frame skip on the fly may offset the added latency, apparently around 57 ns.
Got 4070 TI Super, not gonna buy gpu every year just to get dlss upgrade, i would rather leave gaming
That's the best one to have right now for the money. All this 5000 series stuff is bs
Reason its cheaper is because its gonna have a subscription attached to it. You just watch
^ 💯
2025 and Nvidia can’t be stopped.
Nvidia is making a huge claim they are upscaling from 480p to 4k!
Zack totally need this to generate hair pixels for his livestream
Nah this is an L announcement they keep raising the 90 series price by a couple hundred every time. Their 5070 better than 4090 is so disingenuous because raw power it won't be. Plus the performance difference on their lower end cards keep coming closer together so its pointless to upgrade unless you get their best card.
A fool and his money are easily parted.
He's trying too hard with that jacket 😒
Jensen: I own five of these jackets
Mark Baum: There's a bubble
22:52
DLSS is Deep Learning Super Sampling, which means it completes the frame but only rendereing a fraction of it (e.g. render at 1080p and then use DLSS to generate a 4k frame)
and then Frame Generation takes that one completed frame, and generate 3 extra frames.
the video on the left is DLSS OFF, so basically it has no super sampling nor frame generation.
my guess is, 27 frames is at 4k dlss (as stated), and regular DLSS Quality will render it at 2k
and frame rate will often go up by 1 to 1.5 times, say +1.2 times. so you get 60 frames.
now DLSS4 gives you 3 extra frame per rendered frame, so you take 60 multiply by 4, you get 240.
the math checks out. Whats confusing is its demo-ing both DLSS and DLFG, which makes the difference so huge.
Don't think I need a 5090 for playing Terraria and Project Zomboid. All new Triple A games are trash
I respect Asmon, he is the only guy that can stand so much slop and keeps moving forward. A true warrior!
but their shadowplay on new nvidia bug app wont record microphone they might build good gpu's but their software app sucks hard
Nah I suck harder
@@cheems6193 Gonna need first hand experience on this one ( ͡° ͜ʖ ͡°)
12:15 It's backwards because of the illution that the price is decreasing rather than increasing. It's the new manipulative way. You will see more of this, maybe apple too etc.
cant wait for new games to run like crap or have 3 frames of visual latency and zero in-between..
we have very old games that look beautiful and run way better. feels like we are just wasting so much time on AI graphics that we are just leaving rendering and optimization in the dust
If you skipped the 4090 and invested that $1600 into NVIDIA you would be able to buy 10 5090's now.
I like how some really clever folks cry at 25fps like it's not running a 4k path tracing at ultra settings. They could've easily shown us 300+ fps without path tracing.
Nobody is unhappy with the performance they’re unhappy with framegen and deceptive marketing. it’s realistically just internet drama but it’s interesting to see people’s favoritism in action and just ignoring why people are upset, the card is going to sell because they have no high end competitor regardless. People are just worried at where the market is headed and that’s totally fair we should be cautious but we aren’t
Would you also like to explain the deceptive marketing to these “clever folks”? Or would that make you angry becuase im apparently an AMD fanboy for understanding how things really are? Or hell im just an intel fanboy at this point too! Just covering any typical response you would hit me with. I am not a fanboy, I am well informed.
@@opticalsalt2306 those people can't even understand that it's 25fps because using path tracing, why would I explain anything to them. They need to read information before commenting, at first.
They won't change anything with being "unhappy", they can't do anything at this point. Even buying a product from other companies won't help, because they are doing the same thing.
Why are we pushing so much for graphics when the games are coming out buggy and crappy and not optimized
Because now you can use AI to make those unoptimized games run butter smooth now! Why optimize games when you should just buy a better pc?
If they can make a pixel perfect representation of what raw compute can convey, then it doesn't matter how they get there. The ghosting and input lag, once completely fixed, solves the final problem of reaching beyond 4k 120hz. From what I have seen, the tech has reached the boundaries of what is possible for modern transistors, so we have to rely on new tech to produce pixels on the screen.
It wasn't a mistake. They just generated another preview with AI so it looks like if you've seen it twice.
Mark my word: I will never use DLSS ever in my lifetime
Then just buy amd at that point if you wanna upgrade
Then you are living in 2015 and will remain there until a technological sci fi computer breakthrough occurs, good luck to you.
@@Onetooth7997 He probably already use AMD or Intel, Xess is quite good in term of quality but if he only use FSR, well if he's happy with the quality that's good for him
upscaling technology is pretty good though especially at higher resolutions (especially DLSS and that is coming from someone who has an AMD gpu). that being said unless under a very niche gaming circumstance frame gen kinda sucks and should not be shown as real performance.
Lol, such an idiot😂
I work at a small bank and we're literally offering loans to buy computer parts, specifically GPU's.
probs also coz of possible incoming tariffs. building a rig now with minor bank charges vs building a couple months of paychecks down the road w/ ~25% increased cost. 5090 will be 2k when it releases, but 2.5k come summer time.
No surprise there, offering people loans with high interest and capitalizing on their stupid financial decisions by them purchasing crap that they don't need. A fool and his money are easily parted.
4:45 DLSS upscales, e.g. a 5x5 becomes 25x25; the rest isn’t magic either - they've been faking frames longer than nvidia
Why i invested in this company in 2020
I can almost see the horrifying "Lagging? Want to buy FPS? Buy our new AI DLC today!"-ad in front of my eyes.