You can just buy the B580 instead and less money going to them will get your point across. So hope you're not one of those hypocrites with their cards ...
9:18 Don't be fooled. The GPU's raw performance is 28 fps. Like huh? This is so stupid. There is no way their raw power only gets 28 fps. This is even more proof that Modern Games are extremely unoptimized and heavily relied on DLSS. EDIT: I've seen some dumb take here like it's beneficial for EVERYONE to have a minimum standard of 60fps instead of 30fps. It's also beneficial for everyone that developers optimize their game from the very beginning. I know that they are using PT and RT, which is why it only got 28fps. Yes, I understand that, but even a 4090, the highest GPU on the market, ran like garbage in modern AAA games without PT and RT. Like it's crazy how people just straight up fine with unoptimized games when it's beneficial for everyone.
Please note that this new technology will not be effective with most games in your Steam library that do not have frame generation or DLSS options available in the in-game settings.
lmao these cards will kill the performance with the games that dont get support because the internal specs and vram is already just as good or better than the 4000 series.
@@palniok Yup, from how I understand it, they can force inject DLSS into pretty much any game with the new cards. Not sure how it works, but that's pretty crazy if true.
@@CertifiedSunset AMD already has that for FSR and frame gen. you can turn it on by force in the driver. Kinda weird Nvidia took this long to put it in.
A bunch of tech channels already commented that what DLSS4 offers is 3 "fake" frames for every 1 true frame. It only works in games that support it and while LOOKING smooth has noticeable delay.
The regular frame gen is only every second fake. DLSS4 offers 1 true frame for every 4 frames, while those remaining 3 are interpolated frames that DO NOT react to your inputs. 30 fps framegened to 120fps will look like 120fps but will play like 30fps. Framegen from DLLS3 was only good when you were hitting like 70-90fps natively and you wanted to increase motion fluidity since most people have 144Hz+ monitors.
as.owner of the 4080 super, the extra fake frames feels terrible, it does not feel smooth and your output from your keyboard and mouse feels really delayed, I tried them a few times and never turned back on
Be careful. I just said the same on another video and got 10 fanboys telling me I'm against technology and I'm stupid etcetc. Supposedly if they can't notice then the issue doesn't exist🤡
@@РаЫо It's wild they are going that route, every compute that is going to AI, means it's not going to actually generating authentic pixels the way devs want them.
The reason the lower end cards are cheaper than the last generation is because they cut them down even more compared to the last generation. They are upselling lower tier cards, so actually they are more expensive. They are pulling an RTX 4080 12GB situation again but this time nobody seems to be complaining. And they are doing it for the whole lineup below the 5090.
ik right, when i saw how they cut down the 5080 in half compared to the 5090 i was like wtf is this, the 5070 is even worse , without dlss those cards are head to head with the 40 series
True but tbh human eyes don’t get a real benefit over 120 fps. So what’s the point in trying to go higher? Edit: in case those want to comment, focus on the term “real benefit”. Not actual limit. It’s like watching something in 8k vs 4K. Technically better but the difference becomes marginal.
@@smokinace926 tbh it's harder and harfer to notice, sure, but proper 240 is already noticebly better. For those who chase further and further fluidity it makes sense I guess
At least now we know Asmon isn't afraid to tell us what kind of FPS he gets even if he is being a liar while getting ripped off for 110 FPS with a 4090
You're basically paying and banking on Nvidia to provide you performance through software updates rather than buying a solid and powerful GPU. No thanks.
this is it 1000% its like a subscription. you want to unlock the new software features we locked behind the new hardware thats not even much better, give is 2k...
@joonashannila8751 and the moment Nvidia boinks an update, your performance goes to the crapper. Or better yet, when they decide to stop supporting the card.
If even that. A lot of this ai shit also depends on the dev implementation. If they implement it bad, it could turn out almost unplayable. And if they can't make optimized games anymore, who says they can do anything else?
@@PerciusLivebut that's not what they're doing. New games are so bad now that you literally need to enable the AI software to get you above 60fps. Edit= After reading your comment again I realized we're saying the exact same thing. My bad.
18:10 Zack still doesn't understand the retail price of goods is not determined by the costs of production. They are aimed to be priced at the optimum price that generates the most aggregate profit.
These are two separate things. Of course the price is determined by the cost of production. And they come down from the price after they have developed cheaper productions methods.
@@damazywlodarczyk They are not separate, they are priced above BOM/production and margined at what people will spend for the product. It is tech not a necessity.
yep, and I'm going to buy it anyways because of the 32GB vram required to play skyrim modded and then rent it out when I'm done to make $10/day and hopefully pay off the entire PC in 1 years time. this is however, perhaps my last video card purchase, I'll just hold onto it for 20 years unless they go back to generating real frames because devs are going to get lazy, why work hard when AI can smooth everything out? but you'd think there'd be a parallel economy for real frame GPUs however idk that the lay person will value it over the fake BS and us real frame enjoyers will just get the shaft where if you want a complete game, dlss is required.
@SBlazeable the Vram increase is literally the only reason why I'm interested as well lol. 8gb more is decent. If not for that, well it honestly doesn't look all that compelling
One of the problems with fake frames is that performance doesn't work in VR or other high demand applications. 3D modeling, Video editing, multi screen, etc
yip. pure rasterization performance is always more important and more useful in general for various tasks. imho the die space for dlss, raytracing and ai is just wasted resources that could have been put toward more powerful gpu's.
@@socks2441 you seem to not understanding current engineering capabilities, the reason we have software solutions are because of our hardware limitations. GeForce is geared towards gaming, not productivity. DLSS and Ray Tracing are both beneficial for gaming.
When he says "ai" all I can hear is "horrible TAA" and "blurry in motion" and also "horrific ghosting artifacts", too. Stalker 2 is one great recent example of all of those.
You havent seen DLSS Transformer model. It will be available for all RTX cards since 20series. And it fixes a lot of the ghosting and blurry issues. Go look it up, Digital Foundry already has a video on it
Something overlooked is that it isn't just a more dense version of the 4000 GPU chip with a bunch of more cores added on (a percentage chunk more CUDA cores is always the go-to performance bump for any new generation). It's supposed to be an entirely new architecture, chip layout, new process node etc. therefore driver updates over time will likely vastly increase performance of the new 5000 series too as the software department begin to take advantage of what the hardware boys built.
I cannot believe that frame generation feels good at all. Each frame that is generated does not have any direct input from your mouse and keyboard. This would feel like the input lag we had with v-sync 60hz, where you also have to wait for the monitor's refresh rate (it creates small gaps between control-input and video-output). Also, the fps counter can be manipulated by frame generation by simply showing the same generated frame twice. If you have enough different frames in a second, you won't notice it except for some blur - which the AI conveniently generates anyway, and the fps counter going up.
I've tried it with just regular frame gen and it feels like Jell-O. It'll be even worse, it'll feel like a monitor with slow pixel response time despite having more frames displayed.
From my experience, the only game it feels decent on (with a 4090 mind you) is Cyberpunk. Otherwise, every other game I enabled framegen on felt terrible.
@@wolfsbane9985 every game I've tried frame gen with felt like I was fighting the game to do what I wanted, input delay felt like trying to swim in honey
@lolsalad52 that's awful. It wasn't that bad with my setup, but in Space Marines 2, for example, enabling it output a higher framerate, but moving around/looking around felt like my framerate was lower than if I had just left it off. The best way I can describe it is stuttery. If good/high fps is the feeling/texture of running your hands through really smooth sand, then turning on framegen felt like running your hands through gravel. I maintain the only game it feels good in is Cyberpunk, but otherwise, it's not a good experience.
Roger that. 👍 I have more games in my Steam Library than I'll probably ever get around to playing. I'd rather have my Steam Library overflowing with games I might never play, than an RTX 4090 in February 2025. Folks were paying $3000 for a 4090!
Not me! I miss a day of work for every launch. I pay retail every chance I get. I'm severely in debt to the point that I'm living at my friend grandmother's neighbors apartment at the senior living home. I must bath her daily that will haunt any sane man dreams. But it's worth it!
Just putting it out there: on a 4K monitor (27inch), you can turn off all the anti-aliasing options because there’s no need for them. On a 1440p monitor, however, you might still need them.
@@Hamborger-wd5jg "Yeah, that's right on a 27-inch. I can't say for anything over that, but you seem to point out that on a 32-inch, you might need (jokingly, 8K). Jokes aside, you're right. ;)"
@@Mr.Stephane_L I came from a 32" 1440p monitor and I didn't feel like downsizing to 27" 1440p, and I felt like getting a 27" 4k monitor would be fruitless, so that's why I went for 4k 32". I saw a $350 black Friday deal on a UPF322 and sniped it. Difference is night and day on the 4k one compared to the 1440p one. My 7900 GRE is able to drive games of similar demand to FH5 around 130 fps once overclocked using basic taa without upscaling. Given of course Forza is somewhat AMD optimized, I can't say if it can run anything more intensive, but I do know it can do this.
Yup, listen to the guy thats been personally invited backstage on Nvidias HQ with multiple Nvidia executives listen to every word cause if there is one negative thing, he'll lose Nvidia as a potential sponsor. I like LTT, but this felt forced. Almost like gunpoint.
He vowed not to get the 4090 but he said he's gonna get the 5090. However the elephant in the room is he didn't get the 4090 because of the price, and the 5090 is even more expensive.
Hes always expressed distaste for Nvidia. He says they're a huge pain to work with all the time. The card is more powerful. And he's got the money to spend. I don't blame him for getting the best card available. But he's probably just as miffed as most of us that this is all dlss AI performance. You're still going to get great performance without craptracing on.
It feels like the biggest issue with DLSS is not to do with it's performance, but that developers are relying on it instead of developing actual good graphics processes. DLSS does have it's negative quirks, such as producing frames that are not responsive to user input, and smearing between frames, but it does produce a lot of "performance" for much less effort.
@@datbo1jay1 yea I think you are 100% on the money there man. When it was introduced I was hoping it was intended to keep older GPU's alive/relevant or something
@@datbo1jay1 That's precisely what it is. Now that these tech guys have a way to make life easier on themselves and become lazy that's what's happening.
There's also the issue with games being CPU intensive. I heard that with Dragon's Dogma 2, some people had worse performance when they enabled DLSS because of that. I had really bad performance with No Rest for the Wicked and I'm not too optimistic about DLSS in that game (haven't tried it yet iirc). I wonder if part of the reason it was so poorly optimized is bc they expected DLSS to fix everything.
14 дней назад+2
''i measure time in wow expansions...'' absolute cinema right there asmon
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't). The 5070 could be sold at $400 and the 5080 at $650.
More like the opposite. 5080 is not even half of 5090. And other cards show no significant improvement over previous gen as well. So cards for poor, and A SINGLE card that actually delivers performance.
@@alexturnbackthearmy1907 You're thinking of 4060 (eventually 5060), an intentionally horrendous value meant to push buyers to the 4060ti (eventually 5060ti). The 5090 has 102% more cores, the 5080 has 14% faster clock. The 5080 is half the price, more than half the performance.
The 5080 isn’t horrible as back in the day when I was building PC’s in 2005 a top tier GPU was $500 and with inflation is $851 today so they aren’t too far off but the 5090 is yes absolutely horrible value because it actually is a top tier GPU which should be close to the $851 if they followed inflation but they added another $1,150 on top of inflation which is very frustrating. It’s literally the same with housing too. Do I think the 5080 will actually sell for $999…absolutely not. I’ll just wait for some sales to pop up to get a 5080 to upgrade my 3080.
@@indeedinteresting2156 I'd say that anything that can't run on a 1000/1600 series card on lowest settings without any frame gen and upscaling bullshit deserves to go under
Because companies - not the Devs - are lazy and only see Dollar signs. There's no time for optimisations so whatever shortcut they can take to make their games faster out the door the better. Hence this dlss/fsr crap.
Because NVIDIA is the king of AI hardware. They are going to push that everywhere so game devs don't need to optimize, you can just use their tech and get easy gains. Every GPU company is doing this.
What are these graphic cards good for then if their native frame rate is just 25 FPS? What would be the point of buying them beside the DLSS/FSR technologies they have? Like, are they really good for other tasks like AI image generation or stuff like that, or nah?
Nobody is unpaid nowadays except everyone willing to say no to getting ripped off for that much. Gamers Nexus... Digital Foundry... EVERYONE is collecting a bag.
LOL yeah it's funny how LTT don't even try and hide it anymore. They'll release another "nvidia I am disappoint" video this year or next, to try and seem neutral to redditors, and they'll fall for it once again.
I think that from now on GPUs should no longer be called "hardware" but here you are practically paying for a "software" since 3/4 is generated by dlss4
@@HellPedre so you really think that artificially generated frames are better than physical frames generated by the gpu? 🤣I think that dlss4 should only "help" a gpu to "compensate" with additional frames only when it becomes obsolete... not become the main feature. otherwise here we are no longer talking about GPU but about FGU (frame generating unit)
You say most of the average consumers are not going to notice but who's going to afford 5090 is going to be the people are looking for the all that graphics.
You mean the Austin millionaire who doesn't leave his home or bath, forgot people in trailer parks, district 8 and 3rd world countries want to play games too but can't afford to and nor has the infrastructure to handle it? If intel doesn't mess up and meets market demand remembering poor people like games too and even more at higher rates, Intel will become the wealthiest company in the world by country miles; it always happens a third company swoops in to fill the entire low-end market of a product or service which is the major majority then later buys the competition just in this case they have the aid of the fed and it's the largest product today.
@@JunkYardDawgGohan Waiting for the NVIDIA consoomer to spawn in and inform you that “PC gaming is a ✨luxury✨ and that if you can’t afford it, its not the hobby for you … go be a poor elsewhere.” You’re not a true power-bottom gaymur like they are if you don’t rush out and slop up the slop.
A good amount of people In the military have PC builds. Or at least in my community I did. Everyone has a built PC. I've served in the Marines for 7 years now. So that's just my experience still a good amount of people. And we don't get paid much. Also I'm not for PC builds. A decent laptop and your chilling.
@@TwoBs I ain't a fanboy of Nvidia, but this idea is ridiculous. You don't like it don't buy it. And if you can't buy it within reason, then you shouldn't even think about it. Gaming is absolutely a luxury, for every second you spend on a game rather than being productive, you lose money or some gain you could've otherwise had. Now I'm not one of those anti-gaming people. That narrative is just as stupid. However, it's pretty wild that there's so many people out there that think 2000 or anything like that is a genuinely impossible number. I guess most people don't know how to invest or save money so it makes sense, but if they learned how to do that one simple thing they could buy whatever they wanted. It's financial illiteracy, not being poor or anything like that. Shouldn't be spending money on any PC part if you can barely make ends meet. But if you live comfortably and have excess every now and then, yeah it won't harm you to keep money on the side and save up for something big. And for people in crappy low-income areas of the world, and 3rd world countries, they have bigger problems to focus on anyway.
@@PufflordYou make good points, however, $2k for this frame generated tech just sounds like a gimmick with very questionable value. The delay you experience with it can in some instances feel worse than low frame rates (speaking from experience, good luck playing any fps with it). So yeah personal financing is obvs. the 1st priority, and it’s not that “$2k is an impossible number,” but rather the value incentive for that price is not compelling.
M32, tech professional in Australia here… I own a gaming laptop, we live in a space constrained apartment and had to give up my desktop to reduce clutter / make space. A few of my friends have had to do the same when they have kids and give up their office / study. A lot of younger people I know get a mid-range gaming laptop they can take to uni but also game on as they can’t afford both. The market for gaming laptops is pretty huge.
the big difference is still , they used different settings on the systems why linus was not allowed to show the settings in cyberpunk of the 5090 system
The settings are the same, it's just that the 4090 had 1/2 frames generated by AI and the 5090 having 3/4 frames generated by AI. This would roughly equate to double the framerate if the base performance of the cards were about equal.
The hottest take anyone can give: We should have NEVER left the GTX lineup. RTX and the very IDEA of raytracing has done nothing but irreversible damage to the entire gaming industry as a whole to the point of ruin. Raytracing was not only a mistake, but flat out the worse direction that industry has taken on mass due to the negative impact it has had on development (industry-wide) and then by proxy, performance. Gaming would be in a MUCH better place right now if raytracing technology never existed.
I wouldn't go as far as it shouldn't exist as much as it shouldn't be on a commercial product. Production workloads still benefit from RTX cards for ray traced rendering.
Or was released in done state - as in, all games would be fully ray-traced out of the box. Nowadays this is not the case - RT is unoptimized mess that runs on top of another unoptimized mess, and is also completely butchered by needing a proper implementation from dev team which happens...extremely rare.
Gaming laptops are not usually used for gaming. Most of the time gaming laptops are used by developers for modeling or creating video games a I work things of that nature.The reason why they buy the gaming laptops.Because of the superior gpu performance
Agree I see this on Ark ascended and I thought it was my monitor but I have a new one with thats OLED 240hz and still the same. DLSS I don't think is the future but it might kill gaming for allot of us if it keeps going this direction.
@@michaelmichaelagnew8503 DLSS is only needed because developers CANNOT and WILL NOT optimize their own games, so GPU manufacturers are forced to do shit like this with fake frames because the game devs are incompetent.
Serious question, if you've been playing games since way back (my first card was a voodoo 2), which means you were happy playing extremely low res games for at least a decade how do you rationalize your current "standards"? This sounds to me more like the curse of adaptability, essentially people are so quick to take the latest image/graphical improvements for granted due to the constant progress we've experienced over the last 20 years, and as a result they cannot be happy with anything for too long because they are constantly adjusting/adapting to the latest improvements. If you look at it from that perspective our adaptability is a kind of curse, doomed to never be satisfied with anything as long as a new, shinier, fancier version of it comes out. It would seem that the only way to remedy this would be to first be conscious of this reality and try to remember where we came from, what we had. I do this all the time, I bitch about imperfect image quality but then remember the evolution over the years, and then suddenly I'm 14 years old again and blown away, and this allows me to see what we have now in a clearer light, and my bitching turns to thankfulness, and appreciation. And I'll tell you, thinking like this makes everything better, makes me less bitch less, makes me a better person....
@@caldale4941 Well said, and it needs to be said more. This is my perspective exactly. Which is why I'm still HAPPILY using my 2070 super, even though I can afford a 4090. But why would I bother. During actual gameplay the differences are hardly noticeable. I'm still running everything on high / ultra with this card. To help this situation I always stay a couple years behind on games as well. Basically what we see these days are a bunch of benchmark chasers that don't understand diminishing returns.
@@DingleBerryschnapps It actually means a lot that you said that, sometimes I feel like everyone is missing something when I see how entitled and demanding they are, with no awareness of what's happening to their minds, completely swept up by the time and devoid of introspection and the ability to shift perspective to something more wholesome and grounded. Not a week goes by when I dont look at a game and go "Wow!", and this goes for all types of games not just the cutting edge ones. I refuse to be a slave to my adaptability because I've learned to see how it corrupts and lessens my ability to see what is good and valuable. Anyhoo, your comment reminded me that there are some people out there who really do think outside the box and understand that they have internal processes that actually work against them when they go unchecked and arent acknowledged.
I had to recheck becuase I didnt even know that was the input lag...If 5ms is already noticeable but playable, I can't imagine what 50ms does to a person HAHAHAHA
100% I can't even look at Christmas led's without seeing flickering, it drives me bonkers after just a few seconds. This would have me throw my monitor across the room in under a minute if I was forced to play like that.
yeah that was surprisingly low 1ntelligence from asmon. like gee you think the people that dont care that much about gaming are going to spend $2000 on just their gpu?
Not the case, people who buy 5090 are not the people who know a lot of stuff about tech and computers, it's just rich people, if you get 200k a year working in finance, you won't notice shit.
@NajaToxicus I have a 4090. I don't use dlss because it is a blurry mess. I bought it because it was not only the best card on the market but also the best cost to performance. A lot of the people with 4090s will notice, i assure you. I'd imagine most pro players are on 4090s currently.
@@mistrsynistr7644 And? You think those people are all professionals that understand diminishing returns just because they purchased a high-end gpu? If I got a nickel every time a person thought they needed something, or they could notice a difference in graphics, when there was no difference there, I'd be rich. As a photographer I've seen this countless times. Sure, it looks better when you're pixel peeping and adjusting the settings, but when you're actually playing, it's meaningless, unless you're looking for 200 plus frame rates, which most people are not. Someone purchasing a xx90 doesn't tell me that person knows more. It's a fact that people that know the "least" about something, pay the most. They don't understand the nuances of things that would allow them to make an educated decision and save money. They just have money to spend, and listen to Tech influencers, and benchmark chasers in the comments. People that understand diminishing returns and are frugal with their money are the smart ones. They understand how fast technology moves now and only buy what's necessary at the moment. I know people that have spent $6000 on gpus in 5 years when I only spend $500 every 5 years, and we play the same games. Btw, "best card on the market" is completely subjective. Best for what? I guarantee my 2070 super I'm still using plays all the same games yours does at perfectly acceptable frame rates, on high / ultra settings. 👍
@@aegd LOL a GPU does not draw more power than a fridge unless it's a mini fridge that GPU does not draw more power than a fridge you can't plug a fridge into an extension cord like a power supply for a GPU or at least nobody I know is cracked out enough to use extension cords and surge protectors just to plug in a fridge
Didn’t the 4070 super compare to the 3090 in performance? This AI and scaling is getting ridiculous. I already have to use some type of scaling when I play with my 2080Ti at 1440p. Some games I can get away at Native and lowering my fps cap but I might end up going for a used 40 series card or something.
And 100% bought my first prebuilt in 2019 for $800 after taxes for a 1660Ti and an i5-9400F and upgraded that and eventually built my own with the 2080Ti in late 2020/ early 2021.
21:00 as someone who bought a gaming laptop i think it's because I wanted to be able to take it to university, play games when I'm on my lunch breaks, and be able to take my laptop to a friend's house and game together with split screen games
@@alexturnbackthearmy1907 it's just to go to a friend's dorm and casually game together if we were bored. There's a lot of uses normies get out of gaming laptops was my point, I couldn't take my pc with me
Yes I got a gaming laptop for that purpose too. Ended up getting a desktop PC and a cheap laptop. Just remote into the gaming desktop with moonlight and sunshine. Extremely low latency. Works out perfectly. Laptop gets great battery life because it's only streaming video. No sound from the loud desktop as it's in another room. And I can stream it to any screen anywhere.
I think you vastly underestimate how many people care about image quality going backwards and input lag. Also on what planet are "casual gamers" getting a 4090 or 5090. We are also not hit the ceiling on visuals. Visuals have literally gone backwards.
Image quality is subjective. How would you define the quality of an image? I can claim that Super Mario 64 has superior graphics over modern games, does that make my opinion a fact?
@@maxttk97 when people refer to image quality they are usually not referring to the objective value of pixel information but are instead talking about the aesthetics. 2D vs 3D, art styles, etc. Obviously a 480p texture has less detail than 720 or 1080 but a majority of games today are created with at least HD textures so the objective quality of the pixels has not gone down and in fact has gone up with the rise of 1440p and 4k usage which would debunk OP's theory of image quality going down if we are strictly speaking about the pixel fidelity.
@@bigturkey1 maybe because you are using a damn 4080? on a 3080ti i have noticed almost every AAA game of the last few years is terribly optimized. indiana jones being the one exception.
@@bigturkey1 exactly. its incredible. especially after so many games that require dlss and low settings and still cant hold 60fps. but yeah, i was shocked to find my 3080 ti could max the game out at NATIVE 4k60. i wish they had not limited the game to ray tracing hardware though, because clearly this game is well enough optimized for any decent GTX gpu to play. i dont think the forced ray tracing is doing much in this game.
@@socks2441 It's probably because they didn't spend thousands of hours to make "fake lights" in the game and rather went for the better looking and easier ray tracing way. Which is why ray tracing was made, just like nVidia forced PhysX into the games years ago.
19:50 more context though: 3080 was $700, then the 4080 jumped to $1200, now the 5080 is $1000. So it's still a massive increase for that tier compared to two gens ago. New gens used to be similar price for the same tier but better performance. Around covid it turned into huge price increase to get the performance increase. It's good that it decreased but it still could be better, and the limited vram is just sad.
7:40 "you cant change the settings" sounds weird to me, im sure its not because they have to reset it for the next journalist, more like "it only works if you put it like that" basically what Triple A devs do today "the game only works, if you play as we want you to play!" if you do it a bit differently, everything brakes
They don’t want Linus to turn off DLSS. Nvidia is building up hype around the 5090 performing twice as good as the 4090 but only with DLSS which is meaningless.
they have it set up so the 4090 runs at a crawl, and the 5090 has all the new ai software for fake frames. if he could turn that off he could actually compare raw performance numbers of the actual hardware instead of the software. they dont want that.
Yep. But the problem for Nvidia will be that after few days of official launch, we will have tech reviews from GN, Hardware Unboxed etc., and we will see the reality. Making Linus not change things is stupid, they know very well that every major tech reviewer will play with the settings and the truth will shine.
@@roshunepp Bruh, been running my 2080TI since 2019, intend to keep going for a few more years and praying maybe something that isn't AI dogslop comes out with actual hardware improvements.
so... i basically skipped the 4xxx gen nvidia's and it looks like I will be skipping the 5xxx gen as well and wait until they either perfect this system to avoid the latency and ghosting or change it to simply improve the native rendering. I cannot ignore the blurring/ghostness, it actually bothers me.
20% more performance , 100% more dlss4 performance lmao imagine AI generating 15 out of 16 pixels and calling it performance lol , unlike FSR this will only work with a few titles that Require Nvidia to work with the developers , Nvidia must be mad that the 1080TI Was relevant for too long and they want performance to be locked behing "software compatibility" while a old and weak gtx1050 from a decade ago can run AMD Frame generation technology lol
yeah, but since that frame generation is 50series exclusive, its still produced double the frames. And think about the fact that game devs dont give a shit about opimization anymore, so they just implement these frame gen features. So in the end it does not matter, still double the frames. BTW did no one notice that the resolution was even on 4k???? With the new Displayport upgrades to 2.1 and the new 4k OLED 240hz monitors, the quality of pixels will be insane.
Real frames are generated. These are more like hallucinated, as they're that bad. GPU just went on lsd-level of artifacts. But somehow it's amazing and I'm just stupid because others can't notice the problem so it must not exist.
@@Jay-vt1mw Same. I used Frame Generation on some titles now. Some were broken at release, but later fixed, some had input-lag issues, but some also ran perfectly fine as if there was no Frame Gen (it was on) with high FPS. What Nvidia needs to do it to push it onto more games or engines like UE5 OR make it available via drivers, so you can use it on all games (but it might be broken).
I feel like the people mainly buying gaming laptops are engineers doing BIM or 3d modeling for construction projects. This is the case in our company and all the contractors we work with. If you're working in a construction site powered by generators, it's much more convenient to have a battery than lose power suddenly and see all your unsaved work fade away with the flickering lights every now and then.
I have one computer, my gaming laptop. The practicality of a laptop, the capacity to play things like BG3. I don't game enough, or care enough frankly, to maintain a tower/monitor setup. Though I can appreciate other's joy in those spaces.
I shift places often as my job requires me to shift in every few years and my gaming laptop is like my pet dog i can carry anywhere to game and it also helps me with my work sometimes.
@@Darkdiver28you know modded minecraft Especially with the more demanding sharders bring any pc to it's knees like yeah minecraft is more CPU & ram base than gpu but those more Demanding shaders you need a powerful asf Graphics card to even run. Like a RTX 4090 can't deal with those more demaning shaders like having it's like Ray tracing just to much for modern Consumer hardware.
1440p to 4k on 27 is an immediate and noticeable difference on my end. I tried an ultrawide high refresh 1440p Alienware Oled and ended up returning it because of how big of a difference the sharpness was compared to my 4K ultrawide IPS. On top of that, the motion was still not as clear as I hoped, maybe we need 480Hz, or MicroLED.
wasn't the point of more expensive hardware that we don't NEED dlss? It's cool that the price is competitive but all i heard was that nvidia can't really squeeze much more performance than they already have with the 40xx series and from now onward the improvements are all gonna be DLSS based
The problem for Nvidia will be that after few days of official launch, we will have tech reviews from GN, Hardware Unboxed etc., and we will see the reality. Making Linus not change things is stupid, they know very well that every major tech reviewer will play with the settings and the truth will shine.
But the first impression (from linus) will be seeded in many people's minds. Most people will not bother to double check. They'll just watch the first, most popular thing and basing on it form their opinion. That's what they're going for.
These new GPU cards should comfortably be able to do: native 4k 60fps with full raytracing for the amount they're charging, especially the: 4080, 4090, 5070, 5080, & 5090
@@jackofsometradesmasterofnone I still have my 6900XT. It is "old" but it works and I can play everything I want. I wanted to build a new PC in a month but it looks like the new GPUs from Nvidia and AMD are just "fake" with their AI trash.
@@wolfsbane9985 You must have extensive experience shipping games with full ray tracing at 4k 60fps to make that statement. What have you worked on, and what have you optimized?
@computron1 What are you implying with your question? That my statement is false or I don't know what I'm talking about? I own a 4090 and do a lot of testing myself. It's a hobby of mine. You don't need to be a game developer with years of experience to see how broken and unoptimized AAA release titles have been these past few years. It doesn't take much to see that developers are heavily relying on AI techniques to get games to acceptable framerates, and it may get worse. Take Silent Hill 2 remake for example. Why does that game have worse performance than Cyberpunk 2077 on a 4090? Because it's terribly unoptimized. That's just one example. Do your own research. It's not hard to see where the market is going.
Yeah, many gamers / it engineers buy those gaming laptop so they can have dual use our of it. These laptops are generally connected as a desktop setup with screens and staff to actually play the game.
The main problem here will be system latency doesnt help to have 200 or more frames, when system latancy is 60 ms. Add 24-60ms latancy to server and its like you play on over 100ms. Wont be a pleasant experience for online games. The ghosting and smushing in fast paced games will be really noticable.
Those tech demos are like a limit of what the card can handle in a very especific scenario, ghraphics cards have a long road to improve. The limitation of looking realistic is not the only one, but you also can increase the number of things in a scenario, movement on things, and so on, and in those aspects we still have a long road. Games and tech demos dont look that realistic, they are build to give that impression, but they still work on really big limitations
DLSS Is great as an option for those who wants better performance... The issue is nowadays developers just don't care about optimization at all, and DLSS has become a REQUIREMENT instead of an option. DLSS and frame gen has done a lot of damage to the videogame industry.
so what? lets stop progress? it is not nvidia who deals damage to games, but big publisher managers trying to save some costs using newest tech. because from the pure profit point of view - why bother with optimization when you can just turn on the couple of setting switches? This is what fked up
@@PhilosophicalSock Who said to stop the tecnology? I like DLSS and FSR , they're great options for players. My problem is with developers who rely on upscaling software to get "OK" performance instead of optimizing their games...
Gamers have been destroyed gaming, more than any developer could ever do. Gamers are responsible for destroying the physical side of gaming. Collecting in real ownership, it is pretty much gone because gamers are just not very smart. A group of millions of people conditioned to rent entertainment.
@@RobK-rl6sn lol no, gaming was destroyed in 2013 when the industry was showing billions of profits to people with BA degrees who think mass market and subcription models is how you build a business.
I dont get why they cant get the DLSS to have different layers. Make it seperate certian render sections. Crucial items that shouldnt blur ever would always be rendered while less important items would be using the AI cores. That way you get performance + the accuracy neccessary.
I think we brought this to ourselves. Many gamers are so obsessed with frame rates that graphic cards no longer provide "visuals" as such but they are simply focused on generating frame rates, while games look like their 2015 counterparts but in 4k...
That's a lie. We're talking about the 60 fps standard, the lowest possible bar, set more than 10 years ago. I, nor anyone else, should be gaslit into believing that "gamers have unreasonable standards." Games should run on reasonably affordable hardware, at 60 fps, minimum, before any frame generation. That has been the standard for more than 10 years, the absolute minimum standard, and it's the industry's fault for failing to meet even the minimum standard.
@@afelias Games can run at 60fps though if everything else sacrificed. Frankly, I see no issue with the rtx50 generation. The AI and frame generation is for people wanting 4K and 60 fps and ray tracing and pathtracing. If you play at 1080p or 2K 60 fps and no raytracing or pathtracing you won't need the AI frame generation.
@@sadiqraga3451 Yes, but you shouldn't advertise any sort of improvement if it comes at the cost of the minimum bar. If you have to turn off RT, lower the texture quality, lower the resolution or use supersampling, just to hit 60 FPS, then that is the limit of the performance uplift. If the games can't look better at 60 FPS then there would be no reason to spend more or even an equal amount of money for new hardware. That's the point. It's not like it's anyone's fault but NVIDIA's for aggressively pushing for RT in the first place. So they can't advertise any kind of uplift if they have to start below-minimum to create that comparison.
8:11 DLSS Performance Screen Space Reflections Quality not on the maximum setting This is 1/4th of the resolution of the original, meaning this "4k" gameplay is only a 1080p one upscaled, also 3 out of 4 frames is AI generated, by doing a simplified calculation you have to subtract by ~7 (you have to calculate with the actual calculation of the upscaling so amore realistic number is around subtracting by 5-6) to get the "real" performance of this card, for 4k native So the 110 FPS you see at 8:41 would be about 20-25 FPS
@@StefanRial-i4f This is the raw performance increase that makes most sense. 5090 is a 4090 that draws more power and has better AI performance and that's it.
For most. Same way that 1080p was standard for 2020, and 720p was for 2015/2016, 1440p should be standard for 2025. Benchmarking to 60 frames at ultra at 1440p native should be the ideal for optimization now.
@@PerciusLive I feel like 1080 has been a standard for waaay longer...Like 2010's late, pretty much every single person had a 1080p monitor. Besides that i couldnt agree more.
Even 1080p is plenty enough for basically almost everyone, and its soso much easy to run compared to 1440p, especially 2160p. I do however think for gaming 1080p is enough but for more productive work having multiple windows side by side 1440p is the sweetspot. For only gaming 1080p is enough for most people
I dont know how im every gonna afford this shit, so annoying with PC games being unoptimized shit stains nowdays and you need DLSS and 5070TI's to run them well
Not long ago, playing CS2, I noticed how viscous the controls were, and they often killed me as if they saw me before I did them. When I turned on the FPS counter, I saw that my frame time was jumping by 18-24 milliseconds. There are several factors here, of course, after working on a lot of things, I eventually achieved 11-14 milliseconds and this radically changed my feeling about the game. If you think about what 10 milliseconds are, it's almost nothing, but in reality everything became faster, controls, reaction time, even skill increased and it became easier to kill. What I'm getting at is that they feed us 300-500 fake frames with a delay of over 35 milliseconds, this is just hell for multiplayer shooters.
For me I like it but my reflexes aren’t as good as yours so I’m not best at competitive shooters. I am curious how 5090 will work with 1080p just feel gpu wise were not there for gaming but monitor wise we are getting there
@@mew1534 " I am curious how 5090 will work with 1080p" dude. the 1060 is a 1080p card. if you are buying 5090 wondering "how it will do" in 1080p i am speechless
Honestly, I think Nvidia took this approach because their marketing strategies are lacking, and collaborating with RUclipsrs might help them come across as more genuine.
49” Ultra wide monitor is the way to go. It’s two 27” screens in one. You can game at the 27” screen size with streamlabs or other go super wide screen full gaming mode which is sick.
Gaming laptops are pretty cost effective if you consider the included monitor. They aren’t as powerful as a desktop but they’re pretty close. Plus they are portable so nice to take to a friends house or play party games on the tv
All of these examples of artifacts and jittering is why I DO NOT use RT and FSR4 / DLSS. I always turn off Motion blur on games....I'm old and I hate the new shit that covers the major flaws....
1000$ is unrealistic for every reason being all the different manufacturers for each different PC component charging what they want - even for very basic bare bones features. And then the periodic gimmick advancements we see in hardware/software. It'll never really happen without making "non-name brand" types of cost reductions
so basically the new gpus are using ai to upscale/ downscale to keep performance up that sounds like something we already have 'Dynamic Resolution' but its implemented to the gpu by default for every game i guess? and dlss gets a boost
the 5000 series prices for me is they found the price cap for each class with 4000 series, and used the successful 'super' card prices to gauge the 5000 series cards, the 5090, they can price it whatever they want... because... what else are you buying at that level? 🤣
This 50s series is probably just a substandard byproduct of their ongoing Blackwell AI research. The money shifted from their GPUs to their AI enterprise instead. It's like petrol byproducts. You really want the petrol, but why not sell the byproducts too.
I think the telling thing that no one seems to be taking into account about DLSS4 is the input latency. While it is generating frames inbetween your current and next frame to give you a smoother visual, it is not reacting to your inputs. For many games, this will be imperceptible. However, for many other games, include most FPS games, games like Rocket League, rhythm games, action combat games, they are all going to feel somewhat sluggish to play regardless of the visual smoothness. A lot of people wont even know what the issue is, they will probably blame the game.
Can we pay fake money for fake frames?
Bitcoin
You can just buy the B580 instead and less money going to them will get your point across. So hope you're not one of those hypocrites with their cards ...
We already do lol, our money is backed by nothing.
@@nw932 never was
AI upscaling subscription coming to a GPU near you.... I wouldn't be surprised if they swap to a model like that in the future.
9:18 Don't be fooled.
The GPU's raw performance is 28 fps.
Like huh? This is so stupid.
There is no way their raw power only gets 28 fps.
This is even more proof that Modern Games are extremely unoptimized and heavily relied on DLSS.
EDIT: I've seen some dumb take here like it's beneficial for EVERYONE to have a minimum standard of 60fps instead of 30fps.
It's also beneficial for everyone that developers optimize their game from the very beginning.
I know that they are using PT and RT, which is why it only got 28fps. Yes, I understand that, but even a 4090, the highest GPU on the market, ran like garbage in modern AAA games without PT and RT.
Like it's crazy how people just straight up fine with unoptimized games when it's beneficial for everyone.
It is because it runs full path tracing in 4k that is ridiculously demanding. But yeah fake frames suck it isn't performance.
In the same scenario with full RT-overdrive, 4090 gives you 17-18 of raw perfomance. Just to be clear.
@@KraszuPolis You wouldn't need DLSS/FSR for full RT if the games were made properly. You'd just need rasterisation.
@@sanek9500 4090 gives 20 fps but keep in mind this "jump" in "raw performance" will be even lower when you not using RT
Yep, I commented the exact same thing.
Graphics Processing Unit ❌
Frames Generating Unit✅
Hallucination lsd-trip unit 🤡
27 frames on native resolution ahh card
@@shreyasdharashivkar8027 this is the future you don't understand!!11!!! 😠😠
@@shreyasdharashivkar8027 on 4k with path tracing
😂😂😂 @@sadge0
Please note that this new technology will not be effective with most games in your Steam library that do not have frame generation or DLSS options available in the in-game settings.
But most of those games don’t need this technology to run well.
lmao these cards will kill the performance with the games that dont get support because the internal specs and vram is already just as good or better than the 4000 series.
Haven’t nvidia announced driver level DLSS enforcement?
@@palniok Yup, from how I understand it, they can force inject DLSS into pretty much any game with the new cards. Not sure how it works, but that's pretty crazy if true.
@@CertifiedSunset AMD already has that for FSR and frame gen. you can turn it on by force in the driver. Kinda weird Nvidia took this long to put it in.
A bunch of tech channels already commented that what DLSS4 offers is 3 "fake" frames for every 1 true frame.
It only works in games that support it and while LOOKING smooth has noticeable delay.
It doesnt even look smooth, there is noticeable ghosting and smear even on these early reviews. Which means it will be even worse in person.
a lot of the tech channels on youtube are just influencers posing as reviewers
@@bigturkey1Very true. If it ain’t GN, I ain’t interested.
The regular frame gen is only every second fake. DLSS4 offers 1 true frame for every 4 frames, while those remaining 3 are interpolated frames that DO NOT react to your inputs. 30 fps framegened to 120fps will look like 120fps but will play like 30fps. Framegen from DLLS3 was only good when you were hitting like 70-90fps natively and you wanted to increase motion fluidity since most people have 144Hz+ monitors.
And in that 1 frame one 1/4 of it is actually rendered the 3/4 rest is DLSS
as.owner of the 4080 super, the extra fake frames feels terrible, it does not feel smooth and your output from your keyboard and mouse feels really delayed, I tried them a few times and never turned back on
Finally somebody else said it.
Be careful. I just said the same on another video and got 10 fanboys telling me I'm against technology and I'm stupid etcetc. Supposedly if they can't notice then the issue doesn't exist🤡
@@GFClocked i doubt they have a 40 series card lmao
That’s why they showed a reduced latency. Everything will be AI in the future
@@РаЫо It's wild they are going that route, every compute that is going to AI, means it's not going to actually generating authentic pixels the way devs want them.
The reason the lower end cards are cheaper than the last generation is because they cut them down even more compared to the last generation. They are upselling lower tier cards, so actually they are more expensive. They are pulling an RTX 4080 12GB situation again but this time nobody seems to be complaining. And they are doing it for the whole lineup below the 5090.
ik right, when i saw how they cut down the 5080 in half compared to the 5090 i was like wtf is this, the 5070 is even worse , without dlss those cards are head to head with the 40 series
Linus would have been an infomercial star like 30 years ago
He was.
The Billy Mays of tech.
He was for Gerber
This is the closet thing to willingly watching an ad
Shamwow type or Billie type?
I'm not paying 2k for double the fake frames. Real frames peaked
True but tbh human eyes don’t get a real benefit over 120 fps. So what’s the point in trying to go higher?
Edit: in case those want to comment, focus on the term “real benefit”. Not actual limit. It’s like watching something in 8k vs 4K. Technically better but the difference becomes marginal.
@@smokinace926 Well when it can't even go over 28fps at native....
Tell me when they hit 60fps@@smokinace926
@@smokinace926 tbh it's harder and harfer to notice, sure, but proper 240 is already noticebly better. For those who chase further and further fluidity it makes sense I guess
@@smokinace926 the point is, the real frames are only 28 so...
5090: “Hey we heard you love AI generated frames so we put more AI in your AI”
Truly a yo dawg moment
truly an Xzibition of our time
Let's play Unreal 98!
RTX 5090: Hey, i can't play that. it doesn't use DLSS!
"You can't, but I can!" *3DFX Voodoo 3 3000*
Why is it on performance and DLSS? I run a 5k ultrawide with a 4090, no. dlss, quality etc. Runs like a champ. No need for DLSS and all of the crap.
At least now we know Asmon isn't afraid to tell us what kind of FPS he gets even if he is being a liar while getting ripped off for 110 FPS with a 4090
You're basically paying and banking on Nvidia to provide you performance through software updates rather than buying a solid and powerful GPU. No thanks.
Yes 💯% True!
this is it 1000% its like a subscription. you want to unlock the new software features we locked behind the new hardware thats not even much better, give is 2k...
The software “updates” are only for their next gen cards everytime
Ehh its still RTX card. A very powerful graphics card.
@joonashannila8751 and the moment Nvidia boinks an update, your performance goes to the crapper. Or better yet, when they decide to stop supporting the card.
If Asmon want to see by himself how the AI generated frames looks he can fire up a game himself and play it in framegen, he has a 4090 right?
Now we wait for game devs to release unoptimized slop running at 15fps and leaving it to MFG to reach their 60fps target.
60 frames being the target is very disingenuous now because 60 frames at native should be the target. Not 60 frames after all these AI bs
If even that. A lot of this ai shit also depends on the dev implementation. If they implement it bad, it could turn out almost unplayable. And if they can't make optimized games anymore, who says they can do anything else?
@@PerciusLivebut that's not what they're doing. New games are so bad now that you literally need to enable the AI software to get you above 60fps.
Edit= After reading your comment again I realized we're saying the exact same thing. My bad.
dont you dare be so correct
@@PerciusLiveI always disable every motion optimisations on my tv, they all make the image look weird with all kinds of artifacts and ghosting
18:10 Zack still doesn't understand the retail price of goods is not determined by the costs of production. They are aimed to be priced at the optimum price that generates the most aggregate profit.
You're absolutely correct in saying this. It's why ink cartridges have a nightmarish profit margin. They know you'll pay.
Sad that Asmon will read comments on videos he reacts to and not his own YT comment section.
These are two separate things. Of course the price is determined by the cost of production. And they come down from the price after they have developed cheaper productions methods.
@@joev3783 It's a different business model, ink cartridges works like a subscription. You pay for the printer in ink.
@@damazywlodarczyk They are not separate, they are priced above BOM/production and margined at what people will spend for the product. It is tech not a necessity.
$2000 to play a few ultra unoptimized and buggy games at 300fps (75% IA generated frames) with a disgusting 60ms input delay
yep, and I'm going to buy it anyways because of the 32GB vram required to play skyrim modded and then rent it out when I'm done to make $10/day and hopefully pay off the entire PC in 1 years time.
this is however, perhaps my last video card purchase, I'll just hold onto it for 20 years unless they go back to generating real frames because devs are going to get lazy, why work hard when AI can smooth everything out? but you'd think there'd be a parallel economy for real frame GPUs however idk that the lay person will value it over the fake BS and us real frame enjoyers will just get the shaft where if you want a complete game, dlss is required.
Let's see you design something better.
Woke DEI slop nonetheless
Time to go back to good old pixel games.
@SBlazeable the Vram increase is literally the only reason why I'm interested as well lol. 8gb more is decent. If not for that, well it honestly doesn't look all that compelling
Lul the monitor turning off when he slammed the case close
1000 bucks for 16GB of vram.
this is the worst timeline
Ai reduces vram usage who cares
Fr plus it's ddr7, but people love complaining@@РаЫо
@@РаЫо HAH. Good one.
@@РаЫо and that's the problem the over reliance on AI
@@РаЫо In their tests it only reduced 400mb tho
One of the problems with fake frames is that performance doesn't work in VR or other high demand applications. 3D modeling, Video editing, multi screen, etc
So no reason to get 5090 if those don't work
yip. pure rasterization performance is always more important and more useful in general for various tasks. imho the die space for dlss, raytracing and ai is just wasted resources that could have been put toward more powerful gpu's.
That's a good point
@@socks2441 you seem to not understanding current engineering capabilities, the reason we have software solutions are because of our hardware limitations. GeForce is geared towards gaming, not productivity. DLSS and Ray Tracing are both beneficial for gaming.
Frame gen can be nauseating even on regular monitors, I can't imagine how vomit inducing it would be in VR.
When he says "ai" all I can hear is "horrible TAA" and "blurry in motion" and also "horrific ghosting artifacts", too. Stalker 2 is one great recent example of all of those.
we'll see with benchmarks if Nvidia really solved that with DLSS 4.0
You havent seen DLSS Transformer model. It will be available for all RTX cards since 20series. And it fixes a lot of the ghosting and blurry issues. Go look it up, Digital Foundry already has a video on it
Remember - GSC sacrificed A-life for aggresive optimisation in Stalker2
Hallucinated frames. Soon we'll have full hallucinated games with 6000 series !
@@BlazeBluetm35Probably not...
Something overlooked is that it isn't just a more dense version of the 4000 GPU chip with a bunch of more cores added on (a percentage chunk more CUDA cores is always the go-to performance bump for any new generation). It's supposed to be an entirely new architecture, chip layout, new process node etc. therefore driver updates over time will likely vastly increase performance of the new 5000 series too as the software department begin to take advantage of what the hardware boys built.
Hard disagree
@@skzeeman007 about... ?
none of that matters when it’s fake frames dependent on software updates and games that are compatible with the new system
I cannot believe that frame generation feels good at all.
Each frame that is generated does not have any direct input from your mouse and keyboard.
This would feel like the input lag we had with v-sync 60hz, where you also have to wait for the monitor's refresh rate (it creates small gaps between control-input and video-output).
Also, the fps counter can be manipulated by frame generation by simply showing the same generated frame twice. If you have enough different frames in a second, you won't notice it except for some blur - which the AI conveniently generates anyway, and the fps counter going up.
I've tried it with just regular frame gen and it feels like Jell-O. It'll be even worse, it'll feel like a monitor with slow pixel response time despite having more frames displayed.
1st time admitting to the fact that new video games run like dog !@#$ compared to older ones?
From my experience, the only game it feels decent on (with a 4090 mind you) is Cyberpunk. Otherwise, every other game I enabled framegen on felt terrible.
@@wolfsbane9985 every game I've tried frame gen with felt like I was fighting the game to do what I wanted, input delay felt like trying to swim in honey
@lolsalad52 that's awful. It wasn't that bad with my setup, but in Space Marines 2, for example, enabling it output a higher framerate, but moving around/looking around felt like my framerate was lower than if I had just left it off. The best way I can describe it is stuttery. If good/high fps is the feeling/texture of running your hands through really smooth sand, then turning on framegen felt like running your hands through gravel. I maintain the only game it feels good in is Cyberpunk, but otherwise, it's not a good experience.
build a new pc every 5-10 years, buy games late.. you save TONS of money by being patient.
Roger that. 👍 I have more games in my Steam Library than I'll probably ever get around to playing. I'd rather have my Steam Library overflowing with games I might never play, than an RTX 4090 in February 2025. Folks were paying $3000 for a 4090!
Or don't play games like Tarkov that are so poorly optmized it requires you to drop $4,000 on a PC for a chance at giving decent performance.
I agree I stay a year or two away from new hardware let others beta test lol
@@user-jq3jh7hp2w So true plus that game for as cool as it seems to play, just sucks. The best thing about it is the gun builder
Not me! I miss a day of work for every launch. I pay retail every chance I get. I'm severely in debt to the point that I'm living at my friend grandmother's neighbors apartment at the senior living home. I must bath her daily that will haunt any sane man dreams. But it's worth it!
25% price increase for 25% performance at 25% more electricity used.
The question is do you need that 25% increase to play a game? Most people would say no. Band wagoners would say probably yes. 😂
best case scenario.
Im still on my 1080ti lol
@RubenGugis Still using my RX6650XT that cost me $250 two years ago. No need for anything more. I play at 60fps.
@@TheOneTrueFett Yooo! That's sick! I'm still on a 2070s, not sure I'll even upgrade to a 50xx LMAO.
Just putting it out there: on a 4K monitor (27inch), you can turn off all the anti-aliasing options because there’s no need for them. On a 1440p monitor, however, you might still need them.
i still need antialiasing on my 4k 32" monitor. Might not be necessary on a 27" one, but i absolutely need it on my 32" one.
@@Hamborger-wd5jg
"Yeah, that's right on a 27-inch. I can't say for anything over that, but you seem to point out that on a 32-inch, you might need (jokingly, 8K). Jokes aside, you're right. ;)"
@@Mr.Stephane_L I came from a 32" 1440p monitor and I didn't feel like downsizing to 27" 1440p, and I felt like getting a 27" 4k monitor would be fruitless, so that's why I went for 4k 32". I saw a $350 black Friday deal on a UPF322 and sniped it. Difference is night and day on the 4k one compared to the 1440p one. My 7900 GRE is able to drive games of similar demand to FH5 around 130 fps once overclocked using basic taa without upscaling. Given of course Forza is somewhat AMD optimized, I can't say if it can run anything more intensive, but I do know it can do this.
Yup, listen to the guy thats been personally invited backstage on Nvidias HQ with multiple Nvidia executives listen to every word cause if there is one negative thing, he'll lose Nvidia as a potential sponsor.
I like LTT, but this felt forced. Almost like gunpoint.
He vowed not to get the 4090 but he said he's gonna get the 5090. However the elephant in the room is he didn't get the 4090 because of the price, and the 5090 is even more expensive.
Hes always expressed distaste for Nvidia. He says they're a huge pain to work with all the time.
The card is more powerful. And he's got the money to spend. I don't blame him for getting the best card available. But he's probably just as miffed as most of us that this is all dlss AI performance.
You're still going to get great performance without craptracing on.
@@Grodstark poor linus🤣
Yeah, with the whole room full of Nvidia execs looking at you while making a video, it must felt akward as hell.
@@truckywuckyuwu If he truly thought Nvidia was distasteful and a pain to work with he simply wouldn’t work with them.
Bro got abducted to the Nvidia Backrooms to make content for them.
They even got a "casting"couch in there 🤣
He was shilling HARD.
Bro got a big fat bag to lie about poor performance being responsible for lazy development and 3x the fake frame generation.
To make content for us*
He's basically Jesus.
It feels like the biggest issue with DLSS is not to do with it's performance, but that developers are relying on it instead of developing actual good graphics processes.
DLSS does have it's negative quirks, such as producing frames that are not responsive to user input, and smearing between frames, but it does produce a lot of "performance" for much less effort.
True. Should have been used as a booster, like a floor raiser not just a crutch for devs to get lazy and go “well I guess we can stop trying as hard”
@@datbo1jay1 yea I think you are 100% on the money there man. When it was introduced I was hoping it was intended to keep older GPU's alive/relevant or something
DLSS is for hiding mistakes and bad programming its expensive too.
@@datbo1jay1 That's precisely what it is. Now that these tech guys have a way to make life easier on themselves and become lazy that's what's happening.
There's also the issue with games being CPU intensive. I heard that with Dragon's Dogma 2, some people had worse performance when they enabled DLSS because of that. I had really bad performance with No Rest for the Wicked and I'm not too optimistic about DLSS in that game (haven't tried it yet iirc). I wonder if part of the reason it was so poorly optimized is bc they expected DLSS to fix everything.
''i measure time in wow expansions...'' absolute cinema right there asmon
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't).
The 5070 could be sold at $400 and the 5080 at $650.
More like the opposite. 5080 is not even half of 5090. And other cards show no significant improvement over previous gen as well. So cards for poor, and A SINGLE card that actually delivers performance.
@@alexturnbackthearmy1907 You're thinking of 4060 (eventually 5060), an intentionally horrendous value meant to push buyers to the 4060ti (eventually 5060ti).
The 5090 has 102% more cores, the 5080 has 14% faster clock. The 5080 is half the price, more than half the performance.
Cope my guy. 4090 user spotted.
@@ivancouto1028 nice argument, emotionally insulting a person instead of actually trying to argue his point with logic.
The 5080 isn’t horrible as back in the day when I was building PC’s in 2005 a top tier GPU was $500 and with inflation is $851 today so they aren’t too far off but the 5090 is yes absolutely horrible value because it actually is a top tier GPU which should be close to the $851 if they followed inflation but they added another $1,150 on top of inflation which is very frustrating. It’s literally the same with housing too. Do I think the 5080 will actually sell for $999…absolutely not. I’ll just wait for some sales to pop up to get a 5080 to upgrade my 3080.
NVIDIA doing the Apple move increasing pricing even higher. Chip shortage am I right
The 5090 is hardly a consumer card, if you only plan to game then can just pretend it doesn't exist.
People with financial difficulties shouldn’t be buying halo tech products. For moderately successful middle-aged people, $2K isn’t a big deal.
@@ivyr336 IKR. Anything that can't run smoothly on a 3000 series card deserves to go down under. Unoptimized bullshit.
@@indeedinteresting2156 I'd say that anything that can't run on a 1000/1600 series card on lowest settings without any frame gen and upscaling bullshit deserves to go under
@@ivyr336 It's very much a consumer card. Won't be buying it though.
Bro even the fucking DLSS was on Performance and frame gen on wtf is this .... Why is Raw not the normal stuff nowadays. The input delay must be crazy
Because companies - not the Devs - are lazy and only see Dollar signs. There's no time for optimisations so whatever shortcut they can take to make their games faster out the door the better.
Hence this dlss/fsr crap.
@@jackofsometradesmasterofnone devs are getting lazy af as well, no one is free from criticism at this point.
Devs work whatever they are told.
They dont have free will
Dont blame them@@EduardoSantiagoDev
Because NVIDIA is the king of AI hardware. They are going to push that everywhere so game devs don't need to optimize, you can just use their tech and get easy gains. Every GPU company is doing this.
What are these graphic cards good for then if their native frame rate is just 25 FPS? What would be the point of buying them beside the DLSS/FSR technologies they have? Like, are they really good for other tasks like AI image generation or stuff like that, or nah?
As someone who upgraded to a 43" 4k monitor its scary how fast it becomes your norm and its beautiful
I'm just waiting for real reviews from unpaid testers.
Nobody is unpaid nowadays except everyone willing to say no to getting ripped off for that much. Gamers Nexus... Digital Foundry... EVERYONE is collecting a bag.
@@idindunuphenwonggamers nexus still gives data. Do you think that data is false? 😊
@@idindunuphenwong Gamers nexus don't sell ad space to Nvidia, AMD or Intel. They also know what they're talking about
LOL yeah it's funny how LTT don't even try and hide it anymore. They'll release another "nvidia I am disappoint" video this year or next, to try and seem neutral to redditors, and they'll fall for it once again.
linus wasn't paid lol
he's very vocal about never doing "paid reviews" because that's illegal
I think that from now on GPUs should no longer be called "hardware" but here you are practically paying for a "software" since 3/4 is generated by dlss4
does that make it feel laggy or can you not even tell its on like current nvidia frame gen
they literally invented the HARDWARE to make DLSS and real time RT possible........................................ SMH
@@HellPedre so you really think that artificially generated frames are better than physical frames generated by the gpu? 🤣I think that dlss4 should only "help" a gpu to "compensate" with additional frames only when it becomes obsolete... not become the main feature. otherwise here we are no longer talking about GPU but about FGU (frame generating unit)
@@HellPedre haha you think dlss is hardware? good joke!
@@HellPedre that's exactly what they want you to think.
You say most of the average consumers are not going to notice but who's going to afford 5090 is going to be the people are looking for the all that graphics.
You mean the Austin millionaire who doesn't leave his home or bath, forgot people in trailer parks, district 8 and 3rd world countries want to play games too but can't afford to and nor has the infrastructure to handle it?
If intel doesn't mess up and meets market demand remembering poor people like games too and even more at higher rates, Intel will become the wealthiest company in the world by country miles; it always happens a third company swoops in to fill the entire low-end market of a product or service which is the major majority then later buys the competition just in this case they have the aid of the fed and it's the largest product today.
@@JunkYardDawgGohan Waiting for the NVIDIA consoomer to spawn in and inform you that “PC gaming is a ✨luxury✨ and that if you can’t afford it, its not the hobby for you … go be a poor elsewhere.” You’re not a true power-bottom gaymur like they are if you don’t rush out and slop up the slop.
A good amount of people In the military have PC builds. Or at least in my community I did. Everyone has a built PC. I've served in the Marines for 7 years now. So that's just my experience still a good amount of people. And we don't get paid much. Also I'm not for PC builds. A decent laptop and your chilling.
@@TwoBs I ain't a fanboy of Nvidia, but this idea is ridiculous. You don't like it don't buy it. And if you can't buy it within reason, then you shouldn't even think about it.
Gaming is absolutely a luxury, for every second you spend on a game rather than being productive, you lose money or some gain you could've otherwise had. Now I'm not one of those anti-gaming people. That narrative is just as stupid.
However, it's pretty wild that there's so many people out there that think 2000 or anything like that is a genuinely impossible number. I guess most people don't know how to invest or save money so it makes sense, but if they learned how to do that one simple thing they could buy whatever they wanted.
It's financial illiteracy, not being poor or anything like that. Shouldn't be spending money on any PC part if you can barely make ends meet. But if you live comfortably and have excess every now and then, yeah it won't harm you to keep money on the side and save up for something big.
And for people in crappy low-income areas of the world, and 3rd world countries, they have bigger problems to focus on anyway.
@@PufflordYou make good points, however, $2k for this frame generated tech just sounds like a gimmick with very questionable value. The delay you experience with it can in some instances feel worse than low frame rates (speaking from experience, good luck playing any fps with it). So yeah personal financing is obvs. the 1st priority, and it’s not that “$2k is an impossible number,” but rather the value incentive for that price is not compelling.
M32, tech professional in Australia here… I own a gaming laptop, we live in a space constrained apartment and had to give up my desktop to reduce clutter / make space.
A few of my friends have had to do the same when they have kids and give up their office / study.
A lot of younger people I know get a mid-range gaming laptop they can take to uni but also game on as they can’t afford both.
The market for gaming laptops is pretty huge.
Frame generation is like painting stripes and changing exhaust your car and say that it's faster now just because it looks and sounds faster.
Nah, frame generation is parking at the finish line and bribing a ref to affirm you ran the race
Comment section on this video is making me cry hahahaah, good one dude, really puts things into perspective
Lol the copium is through the roof.
Add stripes to your Lada to make Adidas car
frame gen is like making games like hogarts and msfs run smooth for the first time ever
the big difference is still , they used different settings on the systems why linus was not allowed to show the settings in cyberpunk of the 5090 system
It's all set to low. And the 80% is AI generated using frame generations. And blurry gaming. It's a joke.
Yeah, everything there has been manipulated, we definitely need to real reviews to know what are we getting.
The settings are the same, it's just that the 4090 had 1/2 frames generated by AI and the 5090 having 3/4 frames generated by AI. This would roughly equate to double the framerate if the base performance of the cards were about equal.
The hottest take anyone can give:
We should have NEVER left the GTX lineup. RTX and the very IDEA of raytracing has done nothing but irreversible damage to the entire gaming industry as a whole to the point of ruin.
Raytracing was not only a mistake, but flat out the worse direction that industry has taken on mass due to the negative impact it has had on development (industry-wide) and then by proxy, performance.
Gaming would be in a MUCH better place right now if raytracing technology never existed.
I wouldn't go as far as it shouldn't exist as much as it shouldn't be on a commercial product. Production workloads still benefit from RTX cards for ray traced rendering.
Or was released in done state - as in, all games would be fully ray-traced out of the box. Nowadays this is not the case - RT is unoptimized mess that runs on top of another unoptimized mess, and is also completely butchered by needing a proper implementation from dev team which happens...extremely rare.
Its so funny to me that raytracing can look worse but needs far more ressources then baked lighting. But GUYS THE RAYS, THE TRACING, SPEND 2000$ NOW!
@@sebastianrosenheim6196 Bought into the hype with a 3070, I'll turn RT on to see how it looks, then back off when I actually go to play.
you can still buy a 1080ti if you want, they are like $100
Gaming laptops are not usually used for gaming. Most of the time gaming laptops are used by developers for modeling or creating video games a I work things of that nature.The reason why they buy the gaming laptops.Because of the superior gpu performance
The blur and smear is driving me out of high end gaming. I've been on the high end of graphics since 3dfx. I cannot 'unsee' the bad
Agree I see this on Ark ascended and I thought it was my monitor but I have a new one with thats OLED 240hz and still the same. DLSS I don't think is the future but it might kill gaming for allot of us if it keeps going this direction.
@@michaelmichaelagnew8503 DLSS is only needed because developers CANNOT and WILL NOT optimize their own games, so GPU manufacturers are forced to do shit like this with fake frames because the game devs are incompetent.
Serious question, if you've been playing games since way back (my first card was a voodoo 2), which means you were happy playing extremely low res games for at least a decade how do you rationalize your current "standards"? This sounds to me more like the curse of adaptability, essentially people are so quick to take the latest image/graphical improvements for granted due to the constant progress we've experienced over the last 20 years, and as a result they cannot be happy with anything for too long because they are constantly adjusting/adapting to the latest improvements.
If you look at it from that perspective our adaptability is a kind of curse, doomed to never be satisfied with anything as long as a new, shinier, fancier version of it comes out. It would seem that the only way to remedy this would be to first be conscious of this reality and try to remember where we came from, what we had. I do this all the time, I bitch about imperfect image quality but then remember the evolution over the years, and then suddenly I'm 14 years old again and blown away, and this allows me to see what we have now in a clearer light, and my bitching turns to thankfulness, and appreciation. And I'll tell you, thinking like this makes everything better, makes me less bitch less, makes me a better person....
@@caldale4941 Well said, and it needs to be said more. This is my perspective exactly. Which is why I'm still HAPPILY using my 2070 super, even though I can afford a 4090.
But why would I bother. During actual gameplay the differences are hardly noticeable. I'm still running everything on high / ultra with this card. To help this situation I always stay a couple years behind on games as well.
Basically what we see these days are a bunch of benchmark chasers that don't understand diminishing returns.
@@DingleBerryschnapps It actually means a lot that you said that, sometimes I feel like everyone is missing something when I see how entitled and demanding they are, with no awareness of what's happening to their minds, completely swept up by the time and devoid of introspection and the ability to shift perspective to something more wholesome and grounded. Not a week goes by when I dont look at a game and go "Wow!", and this goes for all types of games not just the cutting edge ones.
I refuse to be a slave to my adaptability because I've learned to see how it corrupts and lessens my ability to see what is good and valuable.
Anyhoo, your comment reminded me that there are some people out there who really do think outside the box and understand that they have internal processes that actually work against them when they go unchecked and arent acknowledged.
9:25 50 ms input lagg oh my Lord, its disgusting.
And thats the problem with fake frames. The engine is still bogged down, and itll feel like shit.
I had to recheck becuase I didnt even know that was the input lag...If 5ms is already noticeable but playable, I can't imagine what 50ms does to a person HAHAHAHA
Just wait for Deep Learning Input Generation.
yeah, 60FPS has 16.6ms of latency, on 50ms everything would feel like 20FPS (1000/fps=latency)
@@Basuko_Smoker you arent going to notice 5ms thats monitors latency already unless you are on CRT or OLED
Does this mean that my 3DFX Voodoo can outperform the RTX 50 series because it doesn't rely on AI & just runs off the 3D Acceleration?
Yes
Several times over!
Is cramming all components rught next to the ref hot gpu a good idea vs spacing them over a full size pcb?
@14:10 but casual players aren't buying 5090s. The people who get 5090s will notice the issues for sure.
100%
I can't even look at Christmas led's without seeing flickering, it drives me bonkers after just a few seconds. This would have me throw my monitor across the room in under a minute if I was forced to play like that.
yeah that was surprisingly low 1ntelligence from asmon. like gee you think the people that dont care that much about gaming are going to spend $2000 on just their gpu?
Not the case, people who buy 5090 are not the people who know a lot of stuff about tech and computers, it's just rich people, if you get 200k a year working in finance, you won't notice shit.
@NajaToxicus I have a 4090. I don't use dlss because it is a blurry mess. I bought it because it was not only the best card on the market but also the best cost to performance. A lot of the people with 4090s will notice, i assure you. I'd imagine most pro players are on 4090s currently.
@@mistrsynistr7644 And? You think those people are all professionals that understand diminishing returns just because they purchased a high-end gpu? If I got a nickel every time a person thought they needed something, or they could notice a difference in graphics, when there was no difference there, I'd be rich. As a photographer I've seen this countless times.
Sure, it looks better when you're pixel peeping and adjusting the settings, but when you're actually playing, it's meaningless, unless you're looking for 200 plus frame rates, which most people are not.
Someone purchasing a xx90 doesn't tell me that person knows more.
It's a fact that people that know the "least" about something, pay the most. They don't understand the nuances of things that would allow them to make an educated decision and save money. They just have money to spend, and listen to Tech influencers, and benchmark chasers in the comments.
People that understand diminishing returns and are frugal with their money are the smart ones. They understand how fast technology moves now and only buy what's necessary at the moment. I know people that have spent $6000 on gpus in 5 years when I only spend $500 every 5 years, and we play the same games.
Btw, "best card on the market" is completely subjective. Best for what? I guarantee my 2070 super I'm still using plays all the same games yours does at perfectly acceptable frame rates, on high / ultra settings. 👍
This GPU draws more power than my fridge.
That's true for practically any GPU genius
Can your fridge play CP2077?
@@aegd LOL a GPU does not draw more power than a fridge unless it's a mini fridge that GPU does not draw more power than a fridge you can't plug a fridge into an extension cord like a power supply for a GPU or at least nobody I know is cracked out enough to use extension cords and surge protectors just to plug in a fridge
You got 550W fridge? For cooling 1 can of coke? (Joking)
@@swiftrealm can your gpu cool your food?
There are no "tech bloggers".
There are only freelance marketers for tech corporations.
and then there's Gamers Nexus
Didn’t the 4070 super compare to the 3090 in performance? This AI and scaling is getting ridiculous. I already have to use some type of scaling when I play with my 2080Ti at 1440p. Some games I can get away at Native and lowering my fps cap but I might end up going for a used 40 series card or something.
And 100% bought my first prebuilt in 2019 for $800 after taxes for a 1660Ti and an i5-9400F and upgraded that and eventually built my own with the 2080Ti in late 2020/ early 2021.
21:00 as someone who bought a gaming laptop i think it's because I wanted to be able to take it to university, play games when I'm on my lunch breaks, and be able to take my laptop to a friend's house and game together with split screen games
Split screen? At least i hope you are doing it on TV at least, even the regular monitor is way too small for that.
@@alexturnbackthearmy1907No reason you can't. all you need is a hdmi cable
@@alexturnbackthearmy1907 it's just to go to a friend's dorm and casually game together if we were bored. There's a lot of uses normies get out of gaming laptops was my point, I couldn't take my pc with me
Yes I got a gaming laptop for that purpose too.
Ended up getting a desktop PC and a cheap laptop. Just remote into the gaming desktop with moonlight and sunshine. Extremely low latency. Works out perfectly.
Laptop gets great battery life because it's only streaming video.
No sound from the loud desktop as it's in another room.
And I can stream it to any screen anywhere.
@@hotrodhunk7389 the simplest explanation I can say is, this correlates as to why mobile gaming is so big too. It's all about convenience.
I think you vastly underestimate how many people care about image quality going backwards and input lag. Also on what planet are "casual gamers" getting a 4090 or 5090.
We are also not hit the ceiling on visuals. Visuals have literally gone backwards.
Image quality is subjective. How would you define the quality of an image? I can claim that Super Mario 64 has superior graphics over modern games, does that make my opinion a fact?
@@therealkylekelly so a 480p texture has more detail then a 1080p texture? or even a 4k texture?
@@maxttk97 when people refer to image quality they are usually not referring to the objective value of pixel information but are instead talking about the aesthetics. 2D vs 3D, art styles, etc. Obviously a 480p texture has less detail than 720 or 1080 but a majority of games today are created with at least HD textures so the objective quality of the pixels has not gone down and in fact has gone up with the rise of 1440p and 4k usage which would debunk OP's theory of image quality going down if we are strictly speaking about the pixel fidelity.
You 100% nailed it. That screen tearing and shimmering and ghosting gave me nightmares. Nvidia fanboys about to hit you with snarky replies though.
Games are becoming a blurry mess, I always turn all of it OFF!
I'll rather play with less fps and crispy aliasing than turn on fake frames.
Imagine how bad the optimization is going to be in future releases oh god, they're already stuttery messes on high end current hardware
i use a 4080 at 4k and never felt like a game was unoptimized
@@bigturkey1 maybe because you are using a damn 4080?
on a 3080ti i have noticed almost every AAA game of the last few years is terribly optimized. indiana jones being the one exception.
@@socks2441 Indiana jones at 4k wtih a 3080 gets 80-90 in the jungle areas and 110-120 in doors.
@@bigturkey1 exactly. its incredible. especially after so many games that require dlss and low settings and still cant hold 60fps.
but yeah, i was shocked to find my 3080 ti could max the game out at NATIVE 4k60.
i wish they had not limited the game to ray tracing hardware though, because clearly this game is well enough optimized for any decent GTX gpu to play. i dont think the forced ray tracing is doing much in this game.
@@socks2441 It's probably because they didn't spend thousands of hours to make "fake lights" in the game and rather went for the better looking and easier ray tracing way.
Which is why ray tracing was made, just like nVidia forced PhysX into the games years ago.
19:50 more context though: 3080 was $700, then the 4080 jumped to $1200, now the 5080 is $1000.
So it's still a massive increase for that tier compared to two gens ago. New gens used to be similar price for the same tier but better performance. Around covid it turned into huge price increase to get the performance increase. It's good that it decreased but it still could be better, and the limited vram is just sad.
1:07 "the most beautiful real time rendered demo" that ball is blurring... in the demo.
Not to mention the gem completely loses color when it got "realistic"
I don't wanna dig at the devs of Stalker 2. But the performance of that demo looked like Stalker 2 gameplay....
All the hyped tech demos and when it's coming to actual games they all look like poop
You gotta have a 4090 with a 4k screen to accurately watch this video and comment on the picture quality 😂
7:40 "you cant change the settings" sounds weird to me, im sure its not because they have to reset it for the next journalist, more like "it only works if you put it like that"
basically what Triple A devs do today "the game only works, if you play as we want you to play!" if you do it a bit differently, everything brakes
They don’t want Linus to turn off DLSS. Nvidia is building up hype around the 5090 performing twice as good as the 4090 but only with DLSS which is meaningless.
they have it set up so the 4090 runs at a crawl, and the 5090 has all the new ai software for fake frames. if he could turn that off he could actually compare raw performance numbers of the actual hardware instead of the software. they dont want that.
Yep. But the problem for Nvidia will be that after few days of official launch, we will have tech reviews from GN, Hardware Unboxed etc., and we will see the reality. Making Linus not change things is stupid, they know very well that every major tech reviewer will play with the settings and the truth will shine.
The intention is so everyone gets the same experience
@@Aqualightnin i see no problem to change few things in game settings, just for the show and set them back, its LTT afterall not just random guy.
And then it dies after exactly 2 years of warranty
Don't you already get a new card before the warranty runs out?
c o n s u m e
@@roshuneppBro, what?
This is what bothers me most. Paying 2000 should get you a minimum of 4 year warranty. I'd be happy with that.
@@roshunepp Bruh, been running my 2080TI since 2019, intend to keep going for a few more years and praying maybe something that isn't AI dogslop comes out with actual hardware improvements.
so... i basically skipped the 4xxx gen nvidia's and it looks like I will be skipping the 5xxx gen as well and wait until they either perfect this system to avoid the latency and ghosting or change it to simply improve the native rendering. I cannot ignore the blurring/ghostness, it actually bothers me.
20% more performance , 100% more dlss4 performance lmao imagine AI generating 15 out of 16 pixels and calling it performance lol , unlike FSR this will only work with a few titles that Require Nvidia to work with the developers , Nvidia must be mad that the 1080TI Was relevant for too long and they want performance to be locked behing "software compatibility" while a old and weak gtx1050 from a decade ago can run AMD Frame generation technology lol
If it actually works i dont see any issue.
:D
yeah, but since that frame generation is 50series exclusive, its still produced double the frames. And think about the fact that game devs dont give a shit about opimization anymore, so they just implement these frame gen features. So in the end it does not matter, still double the frames. BTW did no one notice that the resolution was even on 4k???? With the new Displayport upgrades to 2.1 and the new 4k OLED 240hz monitors, the quality of pixels will be insane.
Real frames are generated. These are more like hallucinated, as they're that bad. GPU just went on lsd-level of artifacts. But somehow it's amazing and I'm just stupid because others can't notice the problem so it must not exist.
@@Jay-vt1mw Same.
I used Frame Generation on some titles now. Some were broken at release, but later fixed, some had input-lag issues, but some also ran perfectly fine as if there was no Frame Gen (it was on) with high FPS.
What Nvidia needs to do it to push it onto more games or engines like UE5 OR make it available via drivers, so you can use it on all games (but it might be broken).
I feel like the people mainly buying gaming laptops are engineers doing BIM or 3d modeling for construction projects.
This is the case in our company and all the contractors we work with.
If you're working in a construction site powered by generators, it's much more convenient to have a battery than lose power suddenly and see all your unsaved work fade away with the flickering lights every now and then.
Machinists and Heavy Duty Mechanics too
laptop is superior in almost every way. if it could also perform as good as desktop and not burn my balls then it'll be the best.
I have one computer, my gaming laptop. The practicality of a laptop, the capacity to play things like BG3. I don't game enough, or care enough frankly, to maintain a tower/monitor setup. Though I can appreciate other's joy in those spaces.
I shift places often as my job requires me to shift in every few years and my gaming laptop is like my pet dog i can carry anywhere to game and it also helps me with my work sometimes.
That tech demo had less frames than my modded Minecraft playthrough with Shaders...
@@EliteInExile minecraft looks worse
@@Darkdiver28you know modded minecraft Especially with the more demanding sharders bring any pc to it's knees like yeah minecraft is more CPU & ram base than gpu but those more Demanding shaders you need a powerful asf Graphics card to even run. Like a RTX 4090 can't deal with those more demaning shaders like having it's like Ray tracing just to much for modern Consumer hardware.
1440p to 4k on 27 is an immediate and noticeable difference on my end. I tried an ultrawide high refresh 1440p Alienware Oled and ended up returning it because of how big of a difference the sharpness was compared to my 4K ultrawide IPS. On top of that, the motion was still not as clear as I hoped, maybe we need 480Hz, or MicroLED.
Built my lad his first pc Christmas day will never forget it.
Starting him off with fake FPS and paying more for less. May god have mercy on his gamer soul.
4070 to 5070
$549 for software upgrade "AI frame"
$549 for more fake FPS*
wasn't the point of more expensive hardware that we don't NEED dlss? It's cool that the price is competitive but all i heard was that nvidia can't really squeeze much more performance than they already have with the 40xx series and from now onward the improvements are all gonna be DLSS based
the view at the 4.50 mark of the card, looks so nice! They've done a beautiful job aesthetically
The problem for Nvidia will be that after few days of official launch, we will have tech reviews from GN, Hardware Unboxed etc., and we will see the reality. Making Linus not change things is stupid, they know very well that every major tech reviewer will play with the settings and the truth will shine.
But the first impression (from linus) will be seeded in many people's minds. Most people will not bother to double check. They'll just watch the first, most popular thing and basing on it form their opinion. That's what they're going for.
If only we had games, worth playing
If only we knew how to use commas.
Just because you cant find a game you enjoy doesnt mean „we“ dont have good games😂😂🤦🏽♂️
@@megaplexXxHDOK concord player lmao
Concord
Banana
These new GPU cards should comfortably be able to do: native 4k 60fps with full raytracing for the amount they're charging, especially the: 4080, 4090, 5070, 5080, & 5090
I don't even care about ray tracing, I just want stable 4k 60fps at this point with no gimmicks.
@@jackofsometradesmasterofnone I still have my 6900XT. It is "old" but it works and I can play everything I want. I wanted to build a new PC in a month but it looks like the new GPUs from Nvidia and AMD are just "fake" with their AI trash.
They can, IF the game is optimized. But many games simply aren't anymore and rely on dlss and framegen to get them there.
@@wolfsbane9985 You must have extensive experience shipping games with full ray tracing at 4k 60fps to make that statement. What have you worked on, and what have you optimized?
@computron1 What are you implying with your question? That my statement is false or I don't know what I'm talking about? I own a 4090 and do a lot of testing myself. It's a hobby of mine. You don't need to be a game developer with years of experience to see how broken and unoptimized AAA release titles have been these past few years. It doesn't take much to see that developers are heavily relying on AI techniques to get games to acceptable framerates, and it may get worse. Take Silent Hill 2 remake for example. Why does that game have worse performance than Cyberpunk 2077 on a 4090? Because it's terribly unoptimized. That's just one example. Do your own research. It's not hard to see where the market is going.
Yeah, many gamers / it engineers buy those gaming laptop so they can have dual use our of it. These laptops are generally connected as a desktop setup with screens and staff to actually play the game.
0:46 no wonder there employees are retiring
No we’re not.
They're employee's
@@Emperor_tomato_ketchuptheir* lol
Nope they stay and work 80 hours weeks because they want to make millions more
@@Emperor_tomato_ketchuptheir*
Yes, lets trust Linus's words.
Linus is a shill
he also promo honey scam ADS why everyone trust him word
The only person you can really trust is Steve.
especially when the NVIDIA staff watching him while he's doing the review. Totally not holding him on a "gunpoint" 😂
You realize every youtuber didnt know honey was scamming right??@@HitomiYuna1
Frame generation is a cancer on gaming... As in AI and the push for tech for it :/
Wait till it's used in everything to cut down on streaming data.
It's tech word slop to hide the fact that the new hardware is just as big a scam as the old new hardware.
consoles using that for 10 years.. non of them NPCs can tell tho lol
@@noobandfriends2420 LOL clearly you've never thought about how much streaming data that bots have already been hogging nowadays
The main problem here will be system latency doesnt help to have 200 or more frames, when system latancy is 60 ms. Add 24-60ms latancy to server and its like you play on over 100ms. Wont be a pleasant experience for online games. The ghosting and smushing in fast paced games will be really noticable.
Those tech demos are like a limit of what the card can handle in a very especific scenario, ghraphics cards have a long road to improve. The limitation of looking realistic is not the only one, but you also can increase the number of things in a scenario, movement on things, and so on, and in those aspects we still have a long road. Games and tech demos dont look that realistic, they are build to give that impression, but they still work on really big limitations
DLSS Is great as an option for those who wants better performance... The issue is nowadays developers just don't care about optimization at all, and DLSS has become a REQUIREMENT instead of an option. DLSS and frame gen has done a lot of damage to the videogame industry.
Ya I think we will see a revolt like we saw against SBI and the influence.
An the industry will be better for it.
so what? lets stop progress? it is not nvidia who deals damage to games, but big publisher managers trying to save some costs using newest tech.
because from the pure profit point of view - why bother with optimization when you can just turn on the couple of setting switches?
This is what fked up
@@PhilosophicalSock Who said to stop the tecnology? I like DLSS and FSR , they're great options for players. My problem is with developers who rely on upscaling software to get "OK" performance instead of optimizing their games...
Gamers have been destroyed gaming, more than any developer could ever do. Gamers are responsible for destroying the physical side of gaming. Collecting in real ownership, it is pretty much gone because gamers are just not very smart. A group of millions of people conditioned to rent entertainment.
@@RobK-rl6sn lol no, gaming was destroyed in 2013 when the industry was showing billions of profits to people with BA degrees who think mass market and subcription models is how you build a business.
Wait for benchmarks. Every launch nvidia has had since the 970 has had an issue on day 1. Plus youre paying more $$$ for artificial frames.
I dont get why they cant get the DLSS to have different layers. Make it seperate certian render sections. Crucial items that shouldnt blur ever would always be rendered while less important items would be using the AI cores. That way you get performance + the accuracy neccessary.
I think we brought this to ourselves. Many gamers are so obsessed with frame rates that graphic cards no longer provide "visuals" as such but they are simply focused on generating frame rates, while games look like their 2015 counterparts but in 4k...
That's a lie. We're talking about the 60 fps standard, the lowest possible bar, set more than 10 years ago. I, nor anyone else, should be gaslit into believing that "gamers have unreasonable standards."
Games should run on reasonably affordable hardware, at 60 fps, minimum, before any frame generation. That has been the standard for more than 10 years, the absolute minimum standard, and it's the industry's fault for failing to meet even the minimum standard.
did you even game in 2015?
@@afelias Games can run at 60fps though if everything else sacrificed.
Frankly, I see no issue with the rtx50 generation. The AI and frame generation is for people wanting 4K and 60 fps and ray tracing and pathtracing.
If you play at 1080p or 2K 60 fps and no raytracing or pathtracing you won't need the AI frame generation.
@@sadiqraga3451 Yes, but you shouldn't advertise any sort of improvement if it comes at the cost of the minimum bar.
If you have to turn off RT, lower the texture quality, lower the resolution or use supersampling, just to hit 60 FPS, then that is the limit of the performance uplift. If the games can't look better at 60 FPS then there would be no reason to spend more or even an equal amount of money for new hardware. That's the point.
It's not like it's anyone's fault but NVIDIA's for aggressively pushing for RT in the first place. So they can't advertise any kind of uplift if they have to start below-minimum to create that comparison.
@@bigturkey1 I gamed in the 1990s on Amiga 600 :P
NGL, that chain did add charisma. Now, it's gone. RIP Chain Asmon
8:11
DLSS Performance
Screen Space Reflections Quality not on the maximum setting
This is 1/4th of the resolution of the original, meaning this "4k" gameplay is only a 1080p one upscaled, also 3 out of 4 frames is AI generated, by doing a simplified calculation you have to subtract by ~7 (you have to calculate with the actual calculation of the upscaling so amore realistic number is around subtracting by 5-6) to get the "real" performance of this card, for 4k native
So the 110 FPS you see at 8:41 would be about 20-25 FPS
Damn I can't believe DLSS isn't actually real, can't believe we've been lied to
Could you do some calculations but on the 5090 side?
@@enderwigin7976 We don't have to. They showed the 5090 FPS raw performance without DLSS, it was 28FPS, about 27% better than the 4090
@@StefanRial-i4f This is the raw performance increase that makes most sense. 5090 is a 4090 that draws more power and has better AI performance and that's it.
LOL 28 FPS natively LMAO no bag is big enough to make me ignore that big of a sign
You killed me with the chain breaking debuff 🤣👌🏼
Not a fan of the AI shit. As others have said, all it really brings to the table is blur and texture smear.
DLSS fan girls will flame you for this.
my 4080 never smear anything
@@bigturkey1 then you're blind.
@@mikschultzyevo what nvidia card do you use to game at 4k and what monitor, i use 4080 and 55" 4k oled. no smearing no blur
@@Grodstark and here one is
I dOnt GeT SmEar wItH mY 4o80
1440p for a monitor is plenty enough
nah you need to play at 8k
For most. Same way that 1080p was standard for 2020, and 720p was for 2015/2016, 1440p should be standard for 2025. Benchmarking to 60 frames at ultra at 1440p native should be the ideal for optimization now.
@@PerciusLive you need to get an 8k tho
@@PerciusLive I feel like 1080 has been a standard for waaay longer...Like 2010's late, pretty much every single person had a 1080p monitor. Besides that i couldnt agree more.
Even 1080p is plenty enough for basically almost everyone, and its soso much easy to run compared to 1440p, especially 2160p. I do however think for gaming 1080p is enough but for more productive work having multiple windows side by side 1440p is the sweetspot. For only gaming 1080p is enough for most people
I dont know how im every gonna afford this shit, so annoying with PC games being unoptimized shit stains nowdays and you need DLSS and 5070TI's to run them well
do you need to run the latest games at 4k 60? no. If you can be satisfied with 1080p there are many options for you.
Gaming laptops are pretty commonly used as work laptops as well when the work requires graphic heavy programs.
People in the military and people that travel for work a lot use gaming laptops
Not long ago, playing CS2, I noticed how viscous the controls were, and they often killed me as if they saw me before I did them. When I turned on the FPS counter, I saw that my frame time was jumping by 18-24 milliseconds. There are several factors here, of course, after working on a lot of things, I eventually achieved 11-14 milliseconds and this radically changed my feeling about the game. If you think about what 10 milliseconds are, it's almost nothing, but in reality everything became faster, controls, reaction time, even skill increased and it became easier to kill. What I'm getting at is that they feed us 300-500 fake frames with a delay of over 35 milliseconds, this is just hell for multiplayer shooters.
It's great for movies pretending to be video games though
For me I like it but my reflexes aren’t as good as yours so I’m not best at competitive shooters. I am curious how 5090 will work with 1080p just feel gpu wise were not there for gaming but monitor wise we are getting there
@@mew1534 " I am curious how 5090 will work with 1080p" dude. the 1060 is a 1080p card. if you are buying 5090 wondering "how it will do" in 1080p i am speechless
literally has zero to do with this conversation. cs is an ancient game that can run on a potato
@@cloudnine5651 CS2 updated the game engine so its more demanding than the original counter strike.
Honestly, I think Nvidia took this approach because their marketing strategies are lacking, and collaborating with RUclipsrs might help them come across as more genuine.
their marketing is so far behind AMD at this point it's sad.
@@thescarletpumpernel3305 10-15% market share AMD Radeon doesn't really scream "leadership in marketing strategy" like you think it does.
@@TheDravic Yeah...at least nvidia has something to show. Coming empty handed is worse then not participating at all.
49” Ultra wide monitor is the way to go. It’s two 27” screens in one. You can game at the 27” screen size with streamlabs or other go super wide screen full gaming mode which is sick.
just a side note: I think it should be called Big Black Case.
The "Zuckerberg free speech" video wore the chain
13:30 and now seeing it broken by next video it's like seeing the conclusion side quest story
Gaming laptops are pretty cost effective if you consider the included monitor. They aren’t as powerful as a desktop but they’re pretty close. Plus they are portable so nice to take to a friends house or play party games on the tv
Do they have ethernet ports?
They do @@zRidZz
@@zRidZzthey do have 5ghz wifi
The advertising on Linus videos always reminds me of the early 2000’s with all the popup ads.
RTX 5090 giving 20 fps at native 4k with full ray tracing
As the video shows it does 28fps. and with path tracing on.
My 3080ti and 1440p is getting 4-5 fps doing that.
@@Primafaveo so we waited 2 years for 25% more computing power, 575w tdp and 2500 price tag. very nice development cant wait for the future
@@mewtilate420 25% to 30% is a pretty normal jump in a 2 year time frame though.
@@infinite683 LOL for these price points it's a rip-off if you don't get a happy ending just admit it we've peaked technologically
@@Primafaveo it also wasnt 4k.
All of these examples of artifacts and jittering is why I DO NOT use RT and FSR4 / DLSS. I always turn off Motion blur on games....I'm old and I hate the new shit that covers the major flaws....
1000$ is unrealistic for every reason being all the different manufacturers for each different PC component charging what they want - even for very basic bare bones features. And then the periodic gimmick advancements we see in hardware/software. It'll never really happen without making "non-name brand" types of cost reductions
so basically the new gpus are using ai to upscale/ downscale to keep performance up that sounds like something we already have 'Dynamic Resolution' but its implemented to the gpu by default for every game i guess? and dlss gets a boost
the 5000 series prices for me is they found the price cap for each class with 4000 series, and used the successful 'super' card prices to gauge the 5000 series cards, the 5090, they can price it whatever they want... because... what else are you buying at that level? 🤣
Jensen is elbow deep up in Linus.
This 50s series is probably just a substandard byproduct of their ongoing Blackwell AI research.
The money shifted from their GPUs to their AI enterprise instead.
It's like petrol byproducts. You really want the petrol, but why not sell the byproducts too.
I think the telling thing that no one seems to be taking into account about DLSS4 is the input latency. While it is generating frames inbetween your current and next frame to give you a smoother visual, it is not reacting to your inputs. For many games, this will be imperceptible. However, for many other games, include most FPS games, games like Rocket League, rhythm games, action combat games, they are all going to feel somewhat sluggish to play regardless of the visual smoothness. A lot of people wont even know what the issue is, they will probably blame the game.