@@Manganization you are watching a recording of a screen on RUclips which is compressed Some of it *might* be somewhere between the 5090 and your eyes, lol.
@@justinlabarge8178 I would believe that if I wasn't seeing none of the on the monitor with the 4090. Although, it's entirely possible the Monitor may be flickering in a way the recording doesn't like.
The price comparisons to last gen aren't exactly fair. Saying the 5080 is set cheaper than the 4080 when considering scalping prices of the latter, and that the 4080S was brought down to $1000. Also that the 40series prices are still not exactly good, so being $50 less than them isn't that good.
@@animegeek6079 using frame generation to go from 20 to 200 fps will not feel like 200fps. It will feel like something worthy of 60-80 fps. That is one issue, the other one is that gamestudios basically dont optimize their games anymore for older gpu's, they are basically like: "if you want more frames, buy better shit and use frame gen and upscaling and shit"
Revolutionary new technology allows it to benchmark at an impressive INFINITY points in Cinebench. This is because it notices the rendered image is the same every iteration, so it just duplicates the frame over and over without all that pesky calculation.
And so you should, just like myself and everyone else should. Frame generation and DLSS are clouding everyone's perception of actual performance gains.
For reals. I love Linus and his excitement for new hardware. But this isn't it chief. There are so many problems with what Nvidia is doing, and Linus should be leading that charge against it. But he's not. He's just getting people excited to pull out their wallets. ~Jenson liked that~
@Clutch4IceCream This is not meant to be rude, I actually like your comment, but also, it can be said, It's not fake money. It's more of an alternate currency used to exchange goods that would not be available with traditional currency like the USD. Thanks again, though, for the clarity, but it's still upsetting since the price to real frames' performance does not correlate to their price spike.
Nvidia doesn't just pick a price arbitrarily, it depends on many factors, chiefly supply and demand. $699 was way too cheap as clearly indicated by the massive shortages of the card (so you couldn't actually buy it at that price). Also 700$ in 2020 are 850$ now, so half of that difference is already made up by inflation.
Given I was tempted by a 7900xtx a 5080 for the same money feels like a better deal…. Of course…. I won’t be able to BUY one for £999 for a year but still…
@@DerMef I absolutely agree. It was more of a reaction to Linus saying that because 5080 is 200$ cheaper then 4080 at launch, it's a good deal. Once those cards are properly tested and compared to AMD's latest and to what's on the used market, we can say if it's a good deal or not.
4:01 DLSS enabled, set to performance (upscaled 1080p -> 4K), rather than Quality (1440p -> 4K), DLSS sharpness at 0 (to be fair, I heard that this shouldn't be used in game anymore) and DLSS frame generation enabled, so fewer true frames. So it's more of: look how high we can get the frame counter with our AI shenanigans, rather than: look at what kind of raw performance we can get from our new card!
To be fair, performance mode in 4K looks much better than even 1440p native, so it's not as bad as it sounds. Performance mode is absolutely acceptable and commonly used at 4K, as opposed to at 1440p or 1080p where it looks like dogshit. Although I do agree they could have just ran it native, and the 5090 would still be impressive.
Next-Gen gaming will be... New game rendering engine, with additional FPS out of the box. It generates TWO frames with MFG and then reports a third additional frame just because it can.. so that you can get even higher fps numbers!
They wanted to show off the AI shenanigans. It is a big selling point. And hey, if it looks good and feels good (debatable) I’m not complaining too much
@@khunthaasmyseed-h6v There was criticism? There are positives and negatives, He's just presenting the information as is and isn't focusing heavily on either negatives or positives.
@ayyalicia why do you appreciate it? It's an expectation of the consumer for honest disclosures. Something ltt has long struggled with and shouldnt be given props for doing a very basic transparency to not prevent misleading audience.
@@AVdE10000 I mean 4090 at MSRP was 1599 dollars that today would be ~1724 dollars (inflation). 5090 is 1999 dollars MSRP which is a ~15% price increase for 29% fps increase (without MFG) not as bad as some people are making out. If you discount inflation (which wouldn't be a true comparison), a 25% price increase for a 29% fps increase is not great but isn't unreasonable imo.
@@tizzlegaming8688are you serious? Technology advances and gets cheaper. This performance improvement is a joke and should be the same price as last gen.
@@Lightwish4K You mean other than the part where motion looks terrible in almost every modern game and only extremely specific examples are good, slow pans protect the terrible motion guesses, and that this is a fundamentally negative thing for anyone who is interested in a responsive and maximum clarity gaming environment? I have these same complaints about FSR but worse than with DLSS obviously, as FSR is specifically more worse in motion. Unreal Engine is setting gaming back by over a decade and GPUs are all for it!
@@Lightwish4K I don't want the game developers to ignore good optimization and just assume AI is going to fix everything for them. Cyberpunk is definitely an oddball having the best utilization of RT and DLSS out there, but in most games I don't want DLSS and I can definitely tell that I am using DLSS.
@@CirnoTales listen, you're telling me you watched a 12 minute long video and missed the point that this was a direct comparison of the 4090 and 5090?? The comment your commenting on is talking about the 5090 being only slightly better than the 4090...
Those bastards knew what they were doing by not letting you change the settings, I really want to see the difference of both systems running natives res.
Worth noting that you can see the 4090 was running the older CNN model as the DLSS preset. Presumably the 5090 was using the new model. Really sketchy.
It should be mandatory to reveal raw raster performance comparisons, minus any "augmented" visuals. I'd love to know the raster of the 5070 vs say 7900XTX
Raw raster is meaningless since it's approaching the end of the road for meaningful performance gains. It's all ray-tracing and AI now, whether you like it or not (and you better like it).
@@fujinshuraw performance is still very much relevant, it's just that ngreedia wants you to believe that ai is the future so you buy their overpriced garbage
Man Linus tech's take on tech stuff are SO damn worth my time. Joy to listen as always. My bit of kidney for you today Linus. Wish you well! Keep it real and coming !
If framegen basically „doubles the performance“ and you get 100 on the 4090 and 240 on the 5090 but the new framegen makes it basically „4 times the performance“ you get a raw performance difference of 50fps on the 4090 and 60 on the 5090…
Every part of DLSS has its own performance cost. Frame gen, too. 60fps never translated to 120fps. It was always like 105-110fps. I imagine extra frames will add even more performance cost.
@@EmperorLicht well not exactly. They stated in the video that both the 4090, and 5090 were using AI features in that demo. (I know this for a fact because I have a 4090, and the 4090 CANNOT run cyberpunk anywhere near 100 FPS Maxed at 4k naturally) The difference here is that the 5090 is able to utilise the AI up-scaling and frame generation more efficiently. Factoring this out of the equation, you’re probably still looking at close to double the performance
i was amazed when the 5090 was running the game at 263fps but a few seconds later you said it was using frame generation and all my hope and shock was gone.....
Trash more trash. I have a 4090 and a 4k 240hz oled I'm waiting for them to stop burning g us and make something that can run 240 hz in 4k without the fake frames that add latency . Monitors are so far ahead of us and getting faster and we can't even use them to their potential 😢
@advertentiegolf I have a 4090 and was very disappointed with the raw performance of rtx 5090... it's less than even 30% for 2000usd... that is straight robbery.
Comparing a GPU with DLSS 4 vs DLSS 3.5 and saying "performance doubled" is BS. It's like winning a race by starting at the finish line. Only benchmarks without DLSS give you a clear picture of the performance.
@@xXDarknetMagicXx The point is that Moores law is basically over these days and the only way they can continue to make noticeable performance improvements at a reasonable power draw and cost is with AI and not raw horsepower. Whether this is a good thing or not is up for debate.
To be fair most GPUs from 30XX series at this point handle the job very well for gaming, even without DLSS.. However, if you want to run an LLM model locally on your machine, the 32GB of VRAM and lot of AI-specific core and doubled bandwidth and completely worth the upgrade if you can afford it
4090 was using the CNN/Old DLSS 3 model. 5090 was using the new DLSS 4 Transformer model, otherwise MFG wouldn't work? Not an equal comparison then... Still happy to get 4.25/5 of the new DLSS features for my 4090 though.
when he says "ai" all i can hear is "horrible TAA" and "blurry in motion" and also "horrific ghosting artifacts", too. stalker 2 is one great recent example of all of those.
The thing that really shocked me in stalker 2 was how much noticeable delay there was to otherwise natural mouse movements, if you tried to flick a 180 it looked like there was no problem but it felt like you just pulled five g's.
Guessing you play 1080p with an outdated DLSS .dll set to performance mode? Not seeing many people on 1440p using quality DLSS (most games run fine on my 6 year old card with this) complain about it.
I recently got a 4070 Ti Super and have been playing Indiana Jones at 1440p. I thought it would rip through the game at ultra settings, but when I turned ray tracing on I'm stuck around 20-30 fps with path tracing and 50 fps with high rat tracing. So I thought "gee wiz, good thing I can turn on DLSS quality and frame gen on my new 40 series card!" Omg the artifacts are AWFUL. Even though Im getting 100 fps with path tracing, a lot of the time it's so blurry and has so many artifacts that it breaks the immersion completely. Plus the latency at 30ms is noticeable. I am not sold on MFG whatsoever and for now I'm just glad I snagged a 16gb card for $700, especially since it looks like the 4070 ti super has better raw power overall than the 5070...
I love how the selling point is basically "If you know what to look for it's not the best, but if you put someone less knowledgeable in front of it then they probably won't notice so it's good!"
@@ruby_fireleader on your phone looking at a compressed yt video you can still see artifacts... Like c'mon, wtf is it gonna look like on my 4k Tv, it's first world problems I know but man Its so sad that instead of real performance improvements or optimization advancements we are given these ai tricks.
Those robots coming up on stage gave me "Justin Hammer" drone vibes. Are we sure we didn't just witness a super-villain moment? Because it felt like we did.
Comparing a GPU with DLSS 4 to DLSS 3.5 and claiming "performance doubled" is false. It's similar like winning a race by beginning at the finish line. Only benchmarks without DLSS provide a clear view of the performance.
8:56 - Down $200 from an already ridiculous price, where they didn't sell any, and had to fix it with a super model priced at, wait for it, $999. Actually what we end up with is 5% more cuda cores and no more VRAM, for the same price. Doesn't really get me excited, at all. Nvidia have conditioned us to accept higher prices and we're lapping it up, just like they planned. Anyone forget that the 3080 was $699?
Yep, until scalpers snapped them up anyway and they were overpriced to hell and back, which gave NVIDIA the bright idea to juice their stock with scalper demands.
Exactly. No point referencing a MSRP for a product no one bought. In Australia when the 4080 super released it was higher price then the 4080. The 4080 sold so poorly here they had to reduce prices below the $999 to sell any.
It's literally a carrot and a stick. First create insane price to set the standard, and then make the next gen a little cheaper to create fake fomo for a pc part that's still at insane price and markup.
CUDA cores are similar to last gen. If your game doesn't support DLSS, frame generation, and ray tracing based on previous generations, it seems like a 20%-30% uplift over last gen would be reasonable to me. So a 5070TI is a 4080 Super and a 5080 is still 20% shy of a 4090 without any AI trickery. We'll see if I'm wrong.
pretty much every game coming out now has DLSS, FSR, XeSS and FG support for DLSS and FSR. I dont understand the hate from strictly gamers. "Fake Frames" or not, if the image looks good with more FPS then I'm okay with it. I've been using FG on 4080 Super for the past year and it's great
@aHungiePanda maybe it looks good while standing still. I have some motion sickness and playing for some minutes with the shenanigans on starts to make me sick. All the ugly blurry mess while moving the mouse to feeling sluggish and not looking good as you say makes me sick, literally sick, needing 10-15 minutes of just eyes closed and regulate the breathing to be ok again. TAA and TSR made games look worse, MSAA and FXAA and I think SSAA made games look crisp and nice. Gaming now is just a blurry mess, CRTs and games then looked better than the blurry mess today. If I see it with my eyes that the images look good in motion and it doesn't make me sick, then I'll truly say that nVidia has done it. Otherwise, is just ugly fake frames. And no top-end/high-end card should get 100-200 FPS with DLSS on performance and FG enabled on Max settings in any game. Top-end/High-end should get 60-80FPS without all that crap and then 200+ with that crap on, max settings, native 4K. Everyone saying 4K and very high FPS, is just f-ing 1080p upscaled and fake frames. Give me TRUE 4K, NATIVE 4K, MAX SETTINGS 60FPS WITH NO BS and then you'll get the praise. Cards back in the day like the GTX 980-980Ti could do that and games looked so much better and were better optimized. Now just laziness and let the AI do the work. Most likely because studios hire based on race and stuff and not on capability so most devs don't know wtf are they doing nowadays. Just "tUrN tHiS On AnD lEt Ai dO It'S tHiNg", "I'm sO dUmB, hE heHe".
Moore's Law is over. The laws of physics mean we can't get huge improvements anymore. Better fake frames is all we can hope for. Tbh, if you have to zoom in to spot the differences between fake and real frames then there is no difference from a practical perspective. So the key issue is how good are the fake frames not merely raw performance.
Stop praising them for the "price cut" on the 5080. It only appears that way because they screwed everyone on the $1200 price tag of the 4080 which it never should have been. But now they get to be the heroes because they so graciously cut the insane price down to *only* $999. The 980 launched at $549.
@@p_serdiuk At a 30% higher price, power draw, and dye size. Not really generational uplift for compute. Only for AI. Which sucks because as games get harder to run, we will have to rely on that AI, that only a select few cards have access to, for 75%-90% of performance.
the card design is sooo good, the angled power connector is wicked, and the fact its smaller and probably more powerful, is kinda sick, the PRICE KINDA SUCKS THO
Its with utmost pleasure that I have received this information, now a GTX 1030 can be affordable (in a few weeks), such a great time to be a gamer with tight budget.
are you willing to pay for a GPU that does double raw FPS? then you will all complain about the 5k price))))) what are you expecting, alien technology for half the price? common sens in business
We're squeezing almost as many transistors into a single space as allowed by physics and material science as we know it. So unless there's some major breakthrough in computer engineering, we're not going to see a GPU with double the fps from hardware alone for a while.
With NVIDIA and AMD getting legitimately +20-30% raster performance each single gen, it is hardly the cards fault. Video game Devs need to optimise their games better or dumb them down properly for consumer hardware. Forcing users to use frame gen for 60fps is a Dev side problem, not a Card side problem.
They aren't even selling us GPU's now... they're just selling us an updated version of DLSS and Frame Generation (Frame gen especially being bad since it causes input delay)
Yep. The consumer is almost forced to buy a new graphics card every generation because the cards are really not that great at their raw power input (when it comes to price value). It's an expensive software update. And it will only get worse because the average consumer gets basically blinded by Ai buzzwords.
Nah -- I think they're indeed selling GPU's; and yup part of the sauce is indeed using (hardware accelerated) magic tricks, but it speaks to the maturity of the industry that they're at a point where hardware+software updates can essentially 'double' FPS on a flagship game in just a generation. Flashback to 2007 / 2010 GPU era -- where the generational difference was effectively 25% more VRAM. And no, it still couldn't run Crysis. This era, you can expect to essentially double the FPS on max settings on flagship games? Ya'll don't know how good you have it.
honestly it wouldnt even be a problem if game developers could pull their shit together and make an optimized title. I have a 6800xt and running Silent Hill 2's remake is a fucking endeavor for it because the dumbasses at Konami RENDER EVERYTHING BEYOND THE FOG AT FULL DETAIL. It's not a hardware or software issue its a game issue where games are being made by actual miscreants who just use upscalers and Ai as their bandaid fixes for them not knowing how to do their job.
The enhanced DLSS will be available on all RTX cards. Enhanced Single Frame Gen on 4000 cards. Multi frame gen is only on 5000 cards. (The only valid thing here to say they are selling you) Frame Gen does not "cause" input delay. It just doesn't reduce input delay. Input latency matches your real fps which can feel like a delay with higher fps. Nvidia Reflex already helps with this but Reflex 2 can be up to 2x as fast by modifying already rendered frames in accordance to new input. Reflex 2 will also come to other RTX cards. Should make both versions of frame gen feel better.
@@raitonodefeat6503 They’re not fake frames. They’re AI generated frames. And they’re going to be completely indistinguishable from non AI generated frames in another generation or two Progress happens slowly. This technology is the future, and it gets better every generation. There’s a reason why Nvidia is now the second biggest company in the world
@@tylerclayton6081 So what you're saying is pretty soon a game engine will just be a Notepad file describing what's supposed to happen on the screen and your GPU just generates the entire thing off that narrative? EDIT: AI is fake, and the trendy AI terminology that people have been throwing around in the last couple of years is not actual AI. There are games with "AI" smarter than these generative algorithms are.
@@TehButterflyEffect All it does is interpolate frames using algorithms. To the same extent that the image that your GPU maps 3D faces onto is real, these frames are not fake either. A capped 60 put to 120 with FG is a better experience than fluctuating 70-90. As long as you have the headroom to play with minimal input lag, it feels good, and this is coming from the lossless scaling app on Steam, not even nvidia frame gen.
oh brother you guys are never happy. Must live pretty sad lives huh? Have you ever played on a rtx50? How do you know 200fps feels like 30fps? stop making baseless accusations
@@Justme-jp8ih the assumption is correct however in this case, the number on the frame counter is not raw frames, they are interpolated or AI generated frames to fill in the gap between real frames, so no, 200 fps is not actually 200fps, that’s what frame generation is all about at the cost of input latency.
@@Justme-jp8ih you just proved that you dont know anything about how frame generation works. imagine you are getting 30 fps. you turn on frame generation. you get 130 fps technically, but you still experience the same input lag as 30 fps.
@@RageBaiter1 hey man, we know this because frame generation literally makes higher frame rate, but does not increase latency. its because frame gen is creating fake frames to fill in the space between each frame, creating a more smooth motion experience and showing higher FPS, but the input lag is still as if nothing changed. so experience 30 fps input lag but 90 fps motion. it feels bad but at least it has smoother motion.
My favourite part of the keynote was when Jenson joked about how expensive the 4090 was then said the 5070 was double the performance (giant ghostly asterisks looming), while quietly forgetting to mention that the new 90 tier card is 25% more expensive.
This release is just sad, they could easily get DLSS 4 to run on the 40 series, you're essentially paying for premium software, not raw performance, you'd probably be able to get the same raw performance uplift by overclocking the 4090.
@bbqR0ADK1LL you playing the game of telephone? He didn't say the 5070 was "double" the performance of the 4090 he said it was 4090 performance with the AI The 5090 is double with the ai
@@theMF69if it was so easy then why aren't they doing it? It seems to me they need the extra Cuda to make the new processes work correctly. I'm just happy my 2070 is still getting some love. 😂
That's why you get 5070Ti with 16GB. I'm just waiting for 5060 with 8GB xdddddd I'll laugh so hard, that i'm probably gonna shit myself :D 5060 - 8GB 5060Ti - 10GB 5070 - 12GB 5070Ti - 16GB 5080 - 16GB 5080Ti - 20GB Maybe super version will have 24GB 5090 - 32GB
it's going to perform worse than a 4070 ti, which had 12gb of vram, and probably not much better than a 4070 super or 3080 ti which both also have 12gb of vram. Its not a crime, if you don't want to buy it, dont. But 12gb its just fine for 90%+ of titles at 1080p or 1440p where 99% of people will be playing. the crazy part is the price. if the regular rasterization gen over gen from the 4070 super is 0-5% this is basically a 50$ price reduction for the same performance, which is wild.
Only thing that is important to me is, whats the performance on both without DLSS? A shitty frame generation tech that blurs out everything does not tell me anything. Can't wait for you to break it apart Linus!
@@ischysyt looking at the keynotes from both of the 4090 releases and the 5090 releases, the raw performance seems to be about a 20% increase in the best case. Take from that what you will.
The Titan XP in 2017 was the full die at 3840 CUDA cores. They cut that in half down to 1920 and sold it as the 1070. The 5080 has a bit less than half the CUDA cores of the 5090, meaning they re-labled a 70 class card as an 80 class and are charging $1000 for it.
I am really excited to see the waterblocks for 5090s. I bet we will be able to pack a ton of performance in SFF machines this gen. I am withholding judgement until we see DLSS-off performance though.
@AndreasHGK It's pretty typical of recording high refresh rate screens with a camera (which has a slower shutter speed than the update speed of the monitor), though I wouldn't doubt the AI having a little bit to do with that as well. A lot depends on persistence of vision as to how clear it feels in real life.
@@beirch That would be a fair point, but at 3:48 it looks like motion blur is turned off. As @iotkuait fairly points out tho, it could also have something to do with just the way it was recorded and we won't know until reviewers get their hands on this.
The prices are still prohibitively expensive. I will skip the 50 series and run my 3080ti until it wears out or I win the lottery of something. I am tired of $2000 or even $1000 cards.
Well, good plan, but these costs are nothing new and maybe 5-20% more expensive depending on the card. We should expect prices to be 20-30% more than ten years ago. Based on inflation alone, these 5000 series cards are only a little more than expensive previous generations. If these were released ten years ago, only the RTX 5090 would actually cost over $1000 Here's a full breakdown compared to a decade ago: RTX 5090 = $1500, before the RTX 3090, the last '90' card was the GTX 690 in 2012 for $1000 (or $1,375 adjusted), the RTX 3090 would be $1830 adjusted RTX 5080 = $750 = the GTX 1080 was $600 RTX 5070ti = $560, the GTX 1070ti costed $449 RTX 5070 = $415, the GTX 1070 costed $379 at launch If we ever get an RTX 5060, it will probably cost about $400 in today's money. Me personally, I'm happy with my RTX 3090 and besides, video game graphics CAN get better, but since many cards can get 4K / 60fps we are getting diminishing returns. I still think the regular PS5 looks phenomenal even though it's like 1/3 as powerful as my computer
@@mrmrmrcaf7801 people who bought 4K 240fps and want 240fps most likely for a competitive gamers. And competitive games like CS2 or maybe R6, Valorant are relatively easy to run.
Yeah on a 4090 it has dlss PERFORMANCE AND frame generation and every single other helping hand you could turn on😭😭😭 this is NOT honest reporting Bruh, even the FOV is set to 80 which nobody would do
@@GoldenSW it was cooked the moment he wasnt allowed to change settings. its literally perfectly designed to "look" marketable as an upgrade without actually being much of one.
Optimisation was killed when Sony showed the soulless corporations that gaming had money to be made for non-gaming companies, DLSS and its ilk will only add to the problem.
@@Nobody-absolutelynobody lol this is so dumb they told him exactly what to day and do and he sold put like a ...common chicken he's nothing but a stooge... I think it's funny I go to comment Nd it tells me to behave ... u like.linus and his team with that one girl
@@ReportingEligiblePodcast would they though? I bet some black market out there somewhere would sell partially working eyes for less than 2000$ in that alternate reality
It doesn't start at 27fps, but from much higher fps, 27 is at native 4k, but the gen frame is probably applied at 1080p upscaled with dlss, so it will be going to 70 80fps which are then quadrupled. The input lag will be less than 15 ms which all in all is not bad
frame gen adds disgusting latency - its so stupid that nobody talks about this, play some stalker 2 with it , it feels like your are drowning in a swamp, but fake fps counter shows 150 lmao
@@playPs232 have u tried ? As long as u are around 90-100 fps there is basically none , also if u are using a controller ( like single player rpg , the imput lag of the controller is bigger than the fg one)
Ya, they do look like Nvidia goons.. good fellas but with green logos, lol. Modern-day mobsters work for corporations. The look on their faces.. say, "Let's break his legs." Lol
Nvidia was watching, and specifically would not let anyone at the preview event change any of the settings. Should tell you something about the completely BS claim of "5070 = 4090 performance" and that's without even getting into the insulting 12 GB of VRAM.
just wait for the 6000 series or maybe when they come out with the 5080 super or something that has like 24 GB of VRam or something im not spending over 2k for the 5090 if its like barely 10% faster than the 4090 i get the feeling the 5090 is actually a 4090Ti in disguise
Id suggest looking up how CGI or just 3D renders in general are made, rendering ridiculously simple scenes with optimized settings doesn't even come close to 1fps. 28fps with path tracing is insanity that should not be possible, cyberpunks engine and ofc the gpu are doing amazing things but the tech is too complicated for basic consumers to understand and appreciate
Remember the time, where a bonkers spec Nvidia GTX Titan X set it self apart with a ridiculous pricetag of ... $1,000!!! "But! This is a flagship device, a halo product! Nobody would ever pay a thousand bucks, just to play games...." - yeah, that aged like fine milk.
Dude the GTX 690 was $1000 in 2012. Companies have been making $1000 GPUs for a very long time. I mean they were selling flagship GPUs in 2004 that had an MSRP of $600.
@@DMSparky Yeah and in the days of SLI the top-end card might only be $600 but you were expected to get TWO of them if you wanted to waste your money on a luxury PC. And you probably needed to watercool everything because air cooling sucked.
@@Emetsys they are doing it to cause people to upgrade every 1-2 years. Even 12GB is pretty limiting in newer titles .. I was hitting the VRAM limit in RE4 Remake and the settings weren't even maxed out.
Can't change the settings. Yeah because as you stated the 5090 was running Frame gen 3x to make it look better than it was. Turn that shit off and then put them head to head and that difference will suddenly be about 20 fps instead of 120. Don't get me wrong, I like the tech, but if we feel we have to rely on it to bandaid other shortcomings then I think we went the wrong direction here.
@@CaptToilet stop bitching. Why do you care about raw vs AI. The whole point with more fps is for shooter games to feel better. More fps is more fps who cares if it’s not raw
NVIDIA won't innovate unless the competition does first now, they only want to sell more cards for higher margins, if they started researching how to make an extremely high performance cards affordable to everyone they wouldn't have 1000% price increases in their stock. They are going to just raise the performance by 20% and sell cards to people on hype alone convincing them they need to upgrade every 1-2 years.
And that’s on terribly optimized “new” games, if you play actual games made with love, not for money then you’ll see how big of a difference optimization is.
@@ryutsukishiba2943 are they really worth it? I have been recently hearing about them and wanted to try as I am not a huge fan of the ones on my current headset. I just get wary of buying RUclips influencer ads as most products aren't worth it
I used their cushions for a 3+ year old Corsair headset and it felt great. Headset was still partly broken so wasn't a "like new" experience, but it made the headset usable until I could get a new one. Just thought it would be worth mentioning since I see so many overhyped products - and now this is sounding like free advertising...
I just hope more and more people wake up to this and stop supporting these malicious practices. Unfortunately this is what happens when there is no/limited competition. The same thing is happening in consoles now that Xbox has essentially given up. PlayStation is about to go full Nvidia. Gaming is in the saddest state it's ever been.
Interesting i noticed cyber running on performance mode, i play in 4k on quality modes with the strix 4090, the fps sits comfortably with most games around 60 (not supported) while with supported games (or battlefront 2 because its engine is amazing) the fps flies over 120
It is depressing to have finally hit the point where I realize that my love of being at the highest end of PC performance has well exceeded any of my gaming use cases. It would be epic to upgrade to the 5090..... except I do not play anything anymore that even stresses my 4090, and I realize I have no desire to jump to the games that could use it because I am now just the newest generation of old school gamer.
You don't need to upgrade anyway because the performance of the 5090 is barely higher (mostly attributed to gddr7 and a bit higher clock speed - else essentially the same besides some npu architecture shenanigans). So if you're not using dlss4 aka getting 15/16 of the content you see on your screen generated into a blurry mess ... well then just be happy with your 4090 bro.
PCL is not render time, it's latency. Those can be correlated, but don't have to be. I'm pretty skeptical on the front of non-DLSS perf. gains for this generation, but this isn't information that backs that up in any way.
I think the interpolating technologies will work for a few games at 4K resolution, and all other games will support them and other resolutions to varying degree. It would be informative to know how these place in a bang for buck raw-performance chart of TFLOPS + 1080p 1% lows *without* interpolating tech, in other games where they didn't ask the devs to make a special version for a demo. Most of the games I play are outside the very limited list of "the only five AAA titles this year", and that's (hopefully) true for most gamers.
When he panned to the people in the back that confirmed they kidnapped him.
They're gonna break his legs after the video. 😮
@@TampaGigWorker 😂
@@codybriseno4550 Jokes on you. He kidnapped them to get the card.
He’s definitely at gun point
lol
Great, now I can finally afford a 980Ti.
You're only 5 credit cards away from buying a 5090
go for a 1060 dude
rx580 dropped 60$ after this... so epic
Lucky you. I'm still running 750ti
Someone can probably donate that for free to you.
Linus was held gunpoint by those dudes
yeah. I immediately noticed the blurriness during motion before Linus pointed it out.
Lol it does look this way
@@Manganization you are watching a recording of a screen on RUclips which is compressed
Some of it *might* be somewhere between the 5090 and your eyes, lol.
@@justinlabarge8178 I would believe that if I wasn't seeing none of the on the monitor with the 4090. Although, it's entirely possible the Monitor may be flickering in a way the recording doesn't like.
The price comparisons to last gen aren't exactly fair. Saying the 5080 is set cheaper than the 4080 when considering scalping prices of the latter, and that the 4080S was brought down to $1000. Also that the 40series prices are still not exactly good, so being $50 less than them isn't that good.
NVIDIA: *shows graphics card with more fake frames*
Me: *shows wallet with more Monopoly money*
I would surely Wall-IT
What difference does it make if the frames is fake or not if it plays the same way? they are not even physical, unlike money.
@@animegeek6079 using frame generation to go from 20 to 200 fps will not feel like 200fps. It will feel like something worthy of 60-80 fps. That is one issue, the other one is that gamestudios basically dont optimize their games anymore for older gpu's, they are basically like: "if you want more frames, buy better shit and use frame gen and upscaling and shit"
@@indigovanhoof4941 and input delay
Revolutionary new technology allows it to benchmark at an impressive INFINITY points in Cinebench. This is because it notices the rendered image is the same every iteration, so it just duplicates the frame over and over without all that pesky calculation.
still going to wait for a non supervised non sponsored video giving more details
smart idea. I wouldn't listen to this hype video.
Same. Waiting for Gamers Nexus to make a video on it.
epcially when nvidia official numbers showed the 5090 non dlss frame rate was 27 compared to 4090 22 fps , all these gains are generated frames lol
And so you should, just like myself and everyone else should. Frame generation and DLSS are clouding everyone's perception of actual performance gains.
it's only 30%, you deserve to lose 3,000 dollars if you are seriously this blind
Bro got abducted to the Nvidia backrooms to make content for them.
Linus blink twice for help.
Good comparison.
And to segue to our sponsors, don't forget
For reals. I love Linus and his excitement for new hardware. But this isn't it chief. There are so many problems with what Nvidia is doing, and Linus should be leading that charge against it. But he's not. He's just getting people excited to pull out their wallets. ~Jenson liked that~
Yeah, this is $$$ sponsor.
1:45 "Yeah, the thing"
"Oh right, the THING"
>Drops 5090
@@danwhite3224 THAT Thing
I was thinking the same thing lol
it’s clobbering time
I wish xD
Sounds about right
Ok, now let's see Paul Allen's card.
If we pay for fake frames, does that mean we can also use fake money?
@@Clutch4IceCream nope 1080ti made it look nothing more is feasible but here we are
@Clutch4IceCream This is not meant to be rude, I actually like your comment, but also, it can be said, It's not fake money. It's more of an alternate currency used to exchange goods that would not be available with traditional currency like the USD.
Thanks again, though, for the clarity, but it's still upsetting since the price to real frames' performance does not correlate to their price spike.
@@skippy336 we'll if you're from the US like me then I will be using fake money as my country is cooking the books
@@Clutch4IceCream Fake frames based off fake pixels. How hard is it to not grasp this?
@@Clutch4IceCream i mean, if developers would actually optimize games, but thats not going to happen eiter due to all the AI performance as an excuse
Good job, nvidia. Sell 3080 for $699, the increase 4080 to absurd $1199, so that the 5080 for $999 can now look like great value :)
Nvidia doesn't just pick a price arbitrarily, it depends on many factors, chiefly supply and demand. $699 was way too cheap as clearly indicated by the massive shortages of the card (so you couldn't actually buy it at that price). Also 700$ in 2020 are 850$ now, so half of that difference is already made up by inflation.
Given I was tempted by a 7900xtx a 5080 for the same money feels like a better deal…. Of course…. I won’t be able to BUY one for £999 for a year but still…
I’ll wait for a 5080 TI with 20gb VRAM for $1200 💪
@@mwhizzler476I'll wait for a 5080 ti with 20gb vram for $250. I can wait up to 20 yrs
@@DerMef I absolutely agree. It was more of a reaction to Linus saying that because 5080 is 200$ cheaper then 4080 at launch, it's a good deal.
Once those cards are properly tested and compared to AMD's latest and to what's on the used market, we can say if it's a good deal or not.
4:01 DLSS enabled, set to performance (upscaled 1080p -> 4K), rather than Quality (1440p -> 4K), DLSS sharpness at 0 (to be fair, I heard that this shouldn't be used in game anymore) and DLSS frame generation enabled, so fewer true frames.
So it's more of: look how high we can get the frame counter with our AI shenanigans, rather than: look at what kind of raw performance we can get from our new card!
To be fair, performance mode in 4K looks much better than even 1440p native, so it's not as bad as it sounds. Performance mode is absolutely acceptable and commonly used at 4K, as opposed to at 1440p or 1080p where it looks like dogshit.
Although I do agree they could have just ran it native, and the 5090 would still be impressive.
if its at a reasonable price, looks fine, runs great, then raster doesnt matter as much imo although i do personally prefer more raw power
Next-Gen gaming will be... New game rendering engine, with additional FPS out of the box. It generates TWO frames with MFG and then reports a third additional frame just because it can.. so that you can get even higher fps numbers!
plus 50 series has double frame gen so...
They wanted to show off the AI shenanigans. It is a big selling point. And hey, if it looks good and feels good (debatable) I’m not complaining too much
Linus being held gun point by nvidia employees: this is Fine
nvidia having five goons to babysit linus is kinda funny
I wonder that 5 were enough. I would have doubled the number
@@tomsite2901uk double goons a feature of the next DLSS version
ZTT probably can handle all of those goons alone
Just so you know, those goons are actual millionaires :) No joke here.
theyre just 5 computer corporate bros simping until they get unceremoniously laid off in a few quarters
2:24 they look so dissapointed lmao
Real, they look like they finna kill em.
Like buldogs in Tom & Jerry
@@aclutchboy5389 "finna" Jesus christ
@@darkdraconis Cry about it. It's modern day english. Womp Womp
@@darkdraconis one day you will learn English is spoken many ways and will be able to respect it.
I appreciate the team finding a way to disclose they were being supervised by multiple NVIDIA employees during this. 🙏
Yet they had no criticism? How is that proper journalism?
This is inept and unprofessional.
@@khunthaasmyseed-h6v There was criticism? There are positives and negatives, He's just presenting the information as is and isn't focusing heavily on either negatives or positives.
@@khunthaasmyseed-h6v you didn;t watch the video then? he did criticize DLSS 4.0...
@ayyalicia why do you appreciate it? It's an expectation of the consumer for honest disclosures. Something ltt has long struggled with and shouldnt be given props for doing a very basic transparency to not prevent misleading audience.
@@khunthaasmyseed-h6v You're watching LTT for journalism?
"bro what are you shooting"
"oh the ai put something thats not supposed to be there"
@@rileybeez what I was thinking lol 😆 😂 🤣 😅 🙃 😄 😆 😂 🤣 😅 🙃 😄 😆
Native fps is going up from 21fps on the 4090 to 27fps with the 5090
@@MikeG-rh1lk now lets see the 5070 if it delivers the same power as the 4090 nvidia said
I mean, that's a 29% increase. That's not too bad for... Wait, it's 400 dollars more expensive and 100+W power increase??? Nevermind, lol
@@AVdE10000 I mean 4090 at MSRP was 1599 dollars that today would be ~1724 dollars (inflation). 5090 is 1999 dollars MSRP which is a ~15% price increase for 29% fps increase (without MFG) not as bad as some people are making out. If you discount inflation (which wouldn't be a true comparison), a 25% price increase for a 29% fps increase is not great but isn't unreasonable imo.
There's no game out there that a 4090 can't run at native 4K at well over 100 fps with basically all settings maxed out.
@@tizzlegaming8688are you serious? Technology advances and gets cheaper. This performance improvement is a joke and should be the same price as last gen.
YAY, We'll now get 30 naturally aspirated fps instead of 20 on Cyberpunk max settings
I hope I can at least somehow pirate dlss 4 so my actually still good 3080 can breathe on any new games abusing it.
@@Lightwish4K It's the only actuallyy meaningful way to count because it's the only way to have bearable latency
@@Lightwish4K You mean other than the part where motion looks terrible in almost every modern game and only extremely specific examples are good, slow pans protect the terrible motion guesses, and that this is a fundamentally negative thing for anyone who is interested in a responsive and maximum clarity gaming environment?
I have these same complaints about FSR but worse than with DLSS obviously, as FSR is specifically more worse in motion.
Unreal Engine is setting gaming back by over a decade and GPUs are all for it!
@@Lightwish4K I don't want the game developers to ignore good optimization and just assume AI is going to fix everything for them. Cyberpunk is definitely an oddball having the best utilization of RT and DLSS out there, but in most games I don't want DLSS and I can definitely tell that I am using DLSS.
@@Lightwish4K no we won't
Not allowed to change settings. Of course. Because turning off DLSS would make it barely better than the 4090
It actually would made it slightly worse, 4090 is about 90TFlops, 5070 is 30, just like the 4070, not much improvements in terms of raw performance.
@@CirnoTales if the 5070 is slightly worse than the 4070 that's still huge since it's 3x less in term of price than the 4090
@@erythreas34 And you’re complaining? It’s 1/3 the cost and only the 5070.
@@CirnoTales listen, you're telling me you watched a 12 minute long video and missed the point that this was a direct comparison of the 4090 and 5090?? The comment your commenting on is talking about the 5090 being only slightly better than the 4090...
And buying a car with no AC or seats is barely better than one with those.
The custom cushions alter the sound profile of certain headphones, making them sound worse, be aware!
Let's see that FPS without DLSS.
xD
This. I want to see the native rendering performance.
It would get around 130-140 without fps
stay mad 4090 user.
why though? we have features for a reason. use them.
Those bastards knew what they were doing by not letting you change the settings, I really want to see the difference of both systems running natives res.
"more for the money" yeah fkn right lol
5090 optimised for AI post-processing, not actual processing.
@@Pyroteknikid bingo
Worth noting that you can see the 4090 was running the older CNN model as the DLSS preset. Presumably the 5090 was using the new model. Really sketchy.
@@PyroteknikidThis
It should be mandatory to reveal raw raster performance comparisons, minus any "augmented" visuals. I'd love to know the raster of the 5070 vs say 7900XTX
Raw raster is meaningless since it's approaching the end of the road for meaningful performance gains. It's all ray-tracing and AI now, whether you like it or not (and you better like it).
Honestly, I don't like how upscaling looks. i like to play with native render @fujinshu
wait for actual reviews
@@fujinshuraw performance is still very much relevant, it's just that ngreedia wants you to believe that ai is the future so you buy their overpriced garbage
@@fujinshu AI bot comment
Man Linus tech's take on tech stuff are SO damn worth my time. Joy to listen as always. My bit of kidney for you today Linus. Wish you well! Keep it real and coming !
should of bought a coffee instead tbh
@TerryJackson-o5u well, I believe good organic content deserves some good organic rewards. Ai and bot-generated content will become a plague soon
@@samrichardson9827 Linus tech tips does not need your money. Genuinely this is like half a penny to them
@@samrichardson9827 a decent comment from a good man, sadly nieve. move with the times and use what youve got
If framegen basically „doubles the performance“ and you get 100 on the 4090 and 240 on the 5090 but the new framegen makes it basically „4 times the performance“ you get a raw performance difference of 50fps on the 4090 and 60 on the 5090…
Shhh.. Don't say the quiet part out loud. Nvidia might send their lawyers after you.
exactly, it looks like the actual gains between the 40 and 50 gens are ridiculously small...
Except it’s all generated garbage. Which for single player games is fine but, anything that requires low latency it’s absolutely garbage for
Every part of DLSS has its own performance cost. Frame gen, too. 60fps never translated to 120fps. It was always like 105-110fps. I imagine extra frames will add even more performance cost.
@@EmperorLicht well not exactly. They stated in the video that both the 4090, and 5090 were using AI features in that demo. (I know this for a fact because I have a 4090, and the 4090 CANNOT run cyberpunk anywhere near 100 FPS Maxed at 4k naturally)
The difference here is that the 5090 is able to utilise the AI up-scaling and frame generation more efficiently. Factoring this out of the equation, you’re probably still looking at close to double the performance
i was amazed when the 5090 was running the game at 263fps but a few seconds later you said it was using frame generation and all my hope and shock was gone.....
Running DLSS performance, so quarter of the resolution and quarter of the frames. 260/16=~16, nearly 50% overhead too
Trash more trash. I have a 4090 and a 4k 240hz oled I'm waiting for them to stop burning g us and make something that can run 240 hz in 4k without the fake frames that add latency . Monitors are so far ahead of us and getting faster and we can't even use them to their potential 😢
@@Somthn-sliteReally? Is the 5090 not going to give us the 240hz 4k frames? I really needed the 5090 for that!
@advertentiegolf I have a 4090 and was very disappointed with the raw performance of rtx 5090... it's less than even 30% for 2000usd... that is straight robbery.
Of course it's frame generation. We are nowhere near that raw performance yet for native non generated frames
Comparing a GPU with DLSS 4 vs DLSS 3.5 and saying "performance doubled" is BS. It's like winning a race by starting at the finish line. Only benchmarks without DLSS give you a clear picture of the performance.
THIS!!! someone gets it!!!
@@LootiumSZN - I think anyone who isn’t shilling for NVIDIA gets it honestly. They sell AI, not GPUs at this point.
@@xXDarknetMagicXx The point is that Moores law is basically over these days and the only way they can continue to make noticeable performance improvements at a reasonable power draw and cost is with AI and not raw horsepower. Whether this is a good thing or not is up for debate.
To be fair most GPUs from 30XX series at this point handle the job very well for gaming, even without DLSS..
However, if you want to run an LLM model locally on your machine, the 32GB of VRAM and lot of AI-specific core and doubled bandwidth and completely worth the upgrade if you can afford it
AMD does the exact same thing, thats what FSR is.
4090 was using the CNN/Old DLSS 3 model.
5090 was using the new DLSS 4 Transformer model, otherwise MFG wouldn't work?
Not an equal comparison then... Still happy to get 4.25/5 of the new DLSS features for my 4090 though.
when he says "ai" all i can hear is "horrible TAA" and "blurry in motion" and also "horrific ghosting artifacts", too. stalker 2 is one great recent example of all of those.
The thing that really shocked me in stalker 2 was how much noticeable delay there was to otherwise natural mouse movements, if you tried to flick a 180 it looked like there was no problem but it felt like you just pulled five g's.
100%.
ai is a gpu thing and the things you mentioned are game things. no amount of gpu innovation is gonna make ALL games run good
Guessing you play 1080p with an outdated DLSS .dll set to performance mode? Not seeing many people on 1440p using quality DLSS (most games run fine on my 6 year old card with this) complain about it.
I recently got a 4070 Ti Super and have been playing Indiana Jones at 1440p. I thought it would rip through the game at ultra settings, but when I turned ray tracing on I'm stuck around 20-30 fps with path tracing and 50 fps with high rat tracing. So I thought "gee wiz, good thing I can turn on DLSS quality and frame gen on my new 40 series card!" Omg the artifacts are AWFUL. Even though Im getting 100 fps with path tracing, a lot of the time it's so blurry and has so many artifacts that it breaks the immersion completely. Plus the latency at 30ms is noticeable.
I am not sold on MFG whatsoever and for now I'm just glad I snagged a 16gb card for $700, especially since it looks like the 4070 ti super has better raw power overall than the 5070...
I love how the selling point is basically "If you know what to look for it's not the best, but if you put someone less knowledgeable in front of it then they probably won't notice so it's good!"
@@ViralC1 You'd need to be Digital Foundry to notice flaws
to be fair would you notice it while playing a game?
@@ruby_fireleader you definitely can notice the blurriness of UE5, DLSS, and TAA
@@ruby_fireleaderYES, YOU CAN, STOP PARROTING THIS EXCUSE
@@ruby_fireleader on your phone looking at a compressed yt video you can still see artifacts... Like c'mon, wtf is it gonna look like on my 4k Tv, it's first world problems I know but man Its so sad that instead of real performance improvements or optimization advancements we are given these ai tricks.
Those robots coming up on stage gave me "Justin Hammer" drone vibes. Are we sure we didn't just witness a super-villain moment? Because it felt like we did.
And giving them commands like:
"Ok, T-800 now go to this school..."
Knew I wasn't the only one thinking it
Literally my first thought lol
Do you relate everything in the real world with children's comics?
@@Zoran54321 That was a movie, not a comic.
But I would relate things to almost anything, that's how relating things works.
Comparing a GPU with DLSS 4 to DLSS 3.5 and claiming "performance doubled" is false. It's similar like winning a race by beginning at the finish line. Only benchmarks without DLSS provide a clear view of the performance.
Both were tested with dlss 4
@@SienaEaton 30% more power draw, 30% better performance without DLSS.
@donjohn8699absolutly not
this used comment btw its a bot
@donjohn8699 4090 is running dlss 3.5, it cant run dlss 4.
8:56 - Down $200 from an already ridiculous price, where they didn't sell any, and had to fix it with a super model priced at, wait for it, $999. Actually what we end up with is 5% more cuda cores and no more VRAM, for the same price. Doesn't really get me excited, at all. Nvidia have conditioned us to accept higher prices and we're lapping it up, just like they planned. Anyone forget that the 3080 was $699?
Yep, until scalpers snapped them up anyway and they were overpriced to hell and back, which gave NVIDIA the bright idea to juice their stock with scalper demands.
I'm definitely NOT buying another Nvidia card.
Raster performance seems to be abysmal on their new gen. I'll just go back to AMD.
Exactly. No point referencing a MSRP for a product no one bought. In Australia when the 4080 super released it was higher price then the 4080. The 4080 sold so poorly here they had to reduce prices below the $999 to sell any.
Until AMD or Intel come out with a card that is actually competitive at a cheaper price this is the new future.
It's literally a carrot and a stick. First create insane price to set the standard, and then make the next gen a little cheaper to create fake fomo for a pc part that's still at insane price and markup.
CUDA cores are similar to last gen. If your game doesn't support DLSS, frame generation, and ray tracing based on previous generations, it seems like a 20%-30% uplift over last gen would be reasonable to me. So a 5070TI is a 4080 Super and a 5080 is still 20% shy of a 4090 without any AI trickery. We'll see if I'm wrong.
Agree !
pretty much every game coming out now has DLSS, FSR, XeSS and FG support for DLSS and FSR. I dont understand the hate from strictly gamers. "Fake Frames" or not, if the image looks good with more FPS then I'm okay with it. I've been using FG on 4080 Super for the past year and it's great
4090 was about 20-25% over the 4080. So the 5080 trading blows in raw raster should be about right.
@aHungiePanda maybe it looks good while standing still. I have some motion sickness and playing for some minutes with the shenanigans on starts to make me sick. All the ugly blurry mess while moving the mouse to feeling sluggish and not looking good as you say makes me sick, literally sick, needing 10-15 minutes of just eyes closed and regulate the breathing to be ok again.
TAA and TSR made games look worse, MSAA and FXAA and I think SSAA made games look crisp and nice. Gaming now is just a blurry mess, CRTs and games then looked better than the blurry mess today.
If I see it with my eyes that the images look good in motion and it doesn't make me sick, then I'll truly say that nVidia has done it. Otherwise, is just ugly fake frames. And no top-end/high-end card should get 100-200 FPS with DLSS on performance and FG enabled on Max settings in any game. Top-end/High-end should get 60-80FPS without all that crap and then 200+ with that crap on, max settings, native 4K.
Everyone saying 4K and very high FPS, is just f-ing 1080p upscaled and fake frames. Give me TRUE 4K, NATIVE 4K, MAX SETTINGS 60FPS WITH NO BS and then you'll get the praise.
Cards back in the day like the GTX 980-980Ti could do that and games looked so much better and were better optimized. Now just laziness and let the AI do the work. Most likely because studios hire based on race and stuff and not on capability so most devs don't know wtf are they doing nowadays. Just "tUrN tHiS On AnD lEt Ai dO It'S tHiNg", "I'm sO dUmB, hE heHe".
Moore's Law is over. The laws of physics mean we can't get huge improvements anymore. Better fake frames is all we can hope for. Tbh, if you have to zoom in to spot the differences between fake and real frames then there is no difference from a practical perspective. So the key issue is how good are the fake frames not merely raw performance.
Stop praising them for the "price cut" on the 5080. It only appears that way because they screwed everyone on the $1200 price tag of the 4080 which it never should have been. But now they get to be the heroes because they so graciously cut the insane price down to *only* $999. The 980 launched at $549.
i paid 1400$ for rtx 3080 :(
But 980 didn't have ray tracing. Compared to the 2080 it's only a 200 dollar difference. MSRP was 799
@@kylegarvey3370 accounting for inflation it would be around $750. It is still $250 over what it should be lmao
Still agree on the bs of the 4080 though that was embarrassing
$723 if adjusted for inflation, still way cheaper.
The 5090 alone costs more than my entire setup which includes a 32 inch OLED monitor.
@@mattcollins1521 so?
Lol that Minimap burn in is gonna be a slap to the wallet when u need another Oled
@@SilverStallion.eSports Burn-in isn't really a concern. It's more likely to break while I'm moving before burn-in becomes even slightly visible.
@@SilverStallion.eSports do you say that to everyone that has an oled lol
@@SilverStallion.eSports bro is stuck in 2018 😂😂😂
4090 raster+ RT = 20fps
5090 raster+ RT = 28fps
Min 75 watts more.
All Ai. What are you buying again?
Software that needed more expensive GDDR7 to run xD
+30% performance sounds good
@@p_serdiuk At a 30% higher price, power draw, and dye size. Not really generational uplift for compute. Only for AI. Which sucks because as games get harder to run, we will have to rely on that AI, that only a select few cards have access to, for 75%-90% of performance.
@@p_serdiuk not even getting 30FPS and saying its sounds good is crazy
@@p_serdiuk For a 30% increase on side of die, power usage and of course a 50%+ increase in price.
Jensen's jacket had raytracing enabled 💀
_dang 💀_
That's what I thought too. It was probably the point of that jacket
Nah it looks like someone modded his jacket like one of those "GTA V uLtRa pHoToReAliStiC" ones.
idk you could see all the polygons. Seems like their DLSS jacket has a ton of artifacting.
the card design is sooo good, the angled power connector is wicked, and the fact its smaller and probably more powerful, is kinda sick, the PRICE KINDA SUCKS THO
Its with utmost pleasure that I have received this information, now a GTX 1030 can be affordable (in a few weeks), such a great time to be a gamer with tight budget.
"all settings cranked" >dlss set to performance mode, LOL
and frame generation
At Nvidia's order. They wouldn't let anyone change any of the settings.
Double FPS from Frame generation is NOT the same as double FPS from raw power alone
very well said
are you willing to pay for a GPU that does double raw FPS? then you will all complain about the 5k price))))) what are you expecting, alien technology for half the price? common sens in business
How about double the power usage of the GPU?
We're squeezing almost as many transistors into a single space as allowed by physics and material science as we know it. So unless there's some major breakthrough in computer engineering, we're not going to see a GPU with double the fps from hardware alone for a while.
@@trinodot8112Maybe developers could start optimizing games again?
Be sure to turn DLSS on or you´ll barely scratch the 30 fps mark
what a joke
PATH TRACING PATH TRACING DO YOU PEOPLE NOT UNDERSTAND THATS WHY IS SAID 30FPS
Honestly now thinking about it I think nividia just rebranded the 4090ti
What are u getting for the fps on ur 4k pc right now?
With NVIDIA and AMD getting legitimately +20-30% raster performance each single gen, it is hardly the cards fault. Video game Devs need to optimise their games better or dumb them down properly for consumer hardware. Forcing users to use frame gen for 60fps is a Dev side problem, not a Card side problem.
2:10 That "1 Kidney Please" on the intro 😅😆🫣
They aren't even selling us GPU's now... they're just selling us an updated version of DLSS and Frame Generation (Frame gen especially being bad since it causes input delay)
Yep. The consumer is almost forced to buy a new graphics card every generation because the cards are really not that great at their raw power input (when it comes to price value). It's an expensive software update. And it will only get worse because the average consumer gets basically blinded by Ai buzzwords.
Nah -- I think they're indeed selling GPU's; and yup part of the sauce is indeed using (hardware accelerated) magic tricks, but it speaks to the maturity of the industry that they're at a point where hardware+software updates can essentially 'double' FPS on a flagship game in just a generation.
Flashback to 2007 / 2010 GPU era -- where the generational difference was effectively 25% more VRAM. And no, it still couldn't run Crysis. This era, you can expect to essentially double the FPS on max settings on flagship games? Ya'll don't know how good you have it.
they dont want peasants physically owning any real chip power, especially when they are close to forcing everyone to use their cloud servers only.
honestly it wouldnt even be a problem if game developers could pull their shit together and make an optimized title. I have a 6800xt and running Silent Hill 2's remake is a fucking endeavor for it because the dumbasses at Konami RENDER EVERYTHING BEYOND THE FOG AT FULL DETAIL. It's not a hardware or software issue its a game issue where games are being made by actual miscreants who just use upscalers and Ai as their bandaid fixes for them not knowing how to do their job.
The enhanced DLSS will be available on all RTX cards.
Enhanced Single Frame Gen on 4000 cards.
Multi frame gen is only on 5000 cards. (The only valid thing here to say they are selling you)
Frame Gen does not "cause" input delay. It just doesn't reduce input delay. Input latency matches your real fps which can feel like a delay with higher fps.
Nvidia Reflex already helps with this but Reflex 2 can be up to 2x as fast by modifying already rendered frames in accordance to new input.
Reflex 2 will also come to other RTX cards. Should make both versions of frame gen feel better.
"and if everything is fake frames, nothing will be"
15:1 (fake frames:Native frame)is the way god intended us to game, it's the golden ratio.
@@raitonodefeat6503 They’re not fake frames. They’re AI generated frames. And they’re going to be completely indistinguishable from non AI generated frames in another generation or two
Progress happens slowly. This technology is the future, and it gets better every generation. There’s a reason why Nvidia is now the second biggest company in the world
@@tylerclayton6081 So what you're saying is pretty soon a game engine will just be a Notepad file describing what's supposed to happen on the screen and your GPU just generates the entire thing off that narrative?
EDIT: AI is fake, and the trendy AI terminology that people have been throwing around in the last couple of years is not actual AI. There are games with "AI" smarter than these generative algorithms are.
@@TehButterflyEffect All it does is interpolate frames using algorithms. To the same extent that the image that your GPU maps 3D faces onto is real, these frames are not fake either. A capped 60 put to 120 with FG is a better experience than fluctuating 70-90. As long as you have the headroom to play with minimal input lag, it feels good, and this is coming from the lossless scaling app on Steam, not even nvidia frame gen.
$2000 to see your game run at 200+ fps, but play like its running at 30 fps.
The future is now, old man!!!
oh brother you guys are never happy. Must live pretty sad lives huh?
Have you ever played on a rtx50?
How do you know 200fps feels like 30fps?
stop making baseless accusations
@@Justme-jp8ih what kind of a strange argument is that dude
@@Justme-jp8ih the assumption is correct however in this case, the number on the frame counter is not raw frames, they are interpolated or AI generated frames to fill in the gap between real frames, so no, 200 fps is not actually 200fps, that’s what frame generation is all about at the cost of input latency.
@@Justme-jp8ih you just proved that you dont know anything about how frame generation works. imagine you are getting 30 fps. you turn on frame generation. you get 130 fps technically, but you still experience the same input lag as 30 fps.
@@RageBaiter1 hey man, we know this because frame generation literally makes higher frame rate, but does not increase latency. its because frame gen is creating fake frames to fill in the space between each frame, creating a more smooth motion experience and showing higher FPS, but the input lag is still as if nothing changed. so experience 30 fps input lag but 90 fps motion. it feels bad but at least it has smoother motion.
@1:32 that’s 2 legends siting together - Thor and Hercules 🤘🏻⚡️
My favourite part of the keynote was when Jenson joked about how expensive the 4090 was then said the 5070 was double the performance (giant ghostly asterisks looming), while quietly forgetting to mention that the new 90 tier card is 25% more expensive.
This release is just sad, they could easily get DLSS 4 to run on the 40 series, you're essentially paying for premium software, not raw performance, you'd probably be able to get the same raw performance uplift by overclocking the 4090.
@bbqR0ADK1LL you playing the game of telephone? He didn't say the 5070 was "double" the performance of the 4090 he said it was 4090 performance with the AI
The 5090 is double with the ai
He never said double. He said it equals the performance of the 4090. The 5090 is what doubles the 4090.
Ghostly, I see what you did there.
@@theMF69if it was so easy then why aren't they doing it? It seems to me they need the extra Cuda to make the new processes work correctly. I'm just happy my 2070 is still getting some love. 😂
5070 with only 12GB of VRAM is a CRIME
That's why you get 5070Ti with 16GB.
I'm just waiting for 5060 with 8GB xdddddd I'll laugh so hard, that i'm probably gonna shit myself :D
5060 - 8GB
5060Ti - 10GB
5070 - 12GB
5070Ti - 16GB
5080 - 16GB
5080Ti - 20GB Maybe super version will have 24GB
5090 - 32GB
it's going to perform worse than a 4070 ti, which had 12gb of vram, and probably not much better than a 4070 super or 3080 ti which both also have 12gb of vram. Its not a crime, if you don't want to buy it, dont. But 12gb its just fine for 90%+ of titles at 1080p or 1440p where 99% of people will be playing. the crazy part is the price. if the regular rasterization gen over gen from the 4070 super is 0-5% this is basically a 50$ price reduction for the same performance, which is wild.
Apparently using ai, it will use the VRAM more efficiently is what I’m understanding.
@@grimleemer Yea, the whole neural shader thing boasts 7x compression, we will see if it's easy to implement though.
Better than the 3070 with 8GB.
Only thing that is important to me is, whats the performance on both without DLSS?
A shitty frame generation tech that blurs out everything does not tell me anything.
Can't wait for you to break it apart Linus!
Not much. Its noticeably better, but not much extra. Absolute majority of extra performance is DLSS 4 and extra generated frames.
@@ischysyt looking at the keynotes from both of the 4090 releases and the 5090 releases, the raw performance seems to be about a 20% increase in the best case. Take from that what you will.
Yes, exactly! That's what I wanna know too. And I wonder, do people still care about that?
@@ischysyt it's roughly 35% faster without any dlss
Why?
Adding 2 frames for 2000$ is a jackpot for sure
Went from not owning games to not owning frames
He really is being held hostage 6:01
switch 2 by nvidia when
@@semihylmaz3949 this comment was made by a bot bro
Nice rack!
@sometime- Its a bot dvmbfvck 💀
This made me laugh so hard
The Titan XP in 2017 was the full die at 3840 CUDA cores. They cut that in half down to 1920 and sold it as the 1070. The 5080 has a bit less than half the CUDA cores of the 5090, meaning they re-labled a 70 class card as an 80 class and are charging $1000 for it.
That about sums it up nicely.
Then buy a Titan if it's so good...
@@JeremyBell its a 2017 card....
there was no 1090 but there was a 1050
Wasn't it the case for 40 series also? Looking at the memory bus
I am really excited to see the waterblocks for 5090s. I bet we will be able to pack a ton of performance in SFF machines this gen. I am withholding judgement until we see DLSS-off performance though.
5:40 I love that as the glazing is happening, the shadows of the people walking up the stairs are ghosting. AI is gonna AI
is it just me or when he flicks (in-game) camera at 5:30 literally the entire scene is ghosting very badly
Good catch, thanks
@AndreasHGK That looks like motion blur, not ghosting from DLSS.
@AndreasHGK It's pretty typical of recording high refresh rate screens with a camera (which has a slower shutter speed than the update speed of the monitor), though I wouldn't doubt the AI having a little bit to do with that as well. A lot depends on persistence of vision as to how clear it feels in real life.
@@beirch That would be a fair point, but at 3:48 it looks like motion blur is turned off. As @iotkuait fairly points out tho, it could also have something to do with just the way it was recorded and we won't know until reviewers get their hands on this.
The amount of people defending the 5070 = 4090 claim is concerning
@@soapa4279 exactly. We will have to wait to see the real performance difference when users can disable all the AI frame gen bs and see how it fares
No where near on 4090.
@Creed5.56 why would you disable something that gives you 3x higher framerate?
Look at the cores and clockspeed lmao. Then tell me it matches a 4090
Found someone who over payed for a 4090
The prices are still prohibitively expensive. I will skip the 50 series and run my 3080ti until it wears out or I win the lottery of something. I am tired of $2000 or even $1000 cards.
True
Yep my 3090 going strong
Bought a 3080 not too long ago and I'm not planning on replacing it for a while.
tbf I gave up on nvidia long time ago, I'm here just to check the competition
Well, good plan, but these costs are nothing new and maybe 5-20% more expensive depending on the card. We should expect prices to be 20-30% more than ten years ago. Based on inflation alone, these 5000 series cards are only a little more than expensive previous generations. If these were released ten years ago, only the RTX 5090 would actually cost over $1000
Here's a full breakdown compared to a decade ago:
RTX 5090 = $1500, before the RTX 3090, the last '90' card was the GTX 690 in 2012 for $1000 (or $1,375 adjusted), the RTX 3090 would be $1830 adjusted
RTX 5080 = $750 = the GTX 1080 was $600
RTX 5070ti = $560, the GTX 1070ti costed $449
RTX 5070 = $415, the GTX 1070 costed $379 at launch
If we ever get an RTX 5060, it will probably cost about $400 in today's money. Me personally, I'm happy with my RTX 3090 and besides, video game graphics CAN get better, but since many cards can get 4K / 60fps we are getting diminishing returns. I still think the regular PS5 looks phenomenal even though it's like 1/3 as powerful as my computer
Im personally thinking of getting the 5080 once i gather the money for a custom build
I dont need top end, just VRAM and good VR support
DLSS and PT issues:
5:00 ghosting
5:26 blury stairs texture and shadows with ghosting
Just like previous DLSS this will get better with time. It's still super impressive.
Get glasses?
@@geetargato😂
dude, the yt video is not even at 60fps...ofc u will see ghosting...lol
so nothing has changed since the first DLSS then?
If im paying $1000 or more for a GPU I want triple figure FPS without frame gen or a half res image upscaled....
sure you can. any non-RT games are already playable easy without FG or Upscaling. in hundreds of fps. no need for $1000 gpu
My 3080 can do that just fine. Just not in Cyberpunk at 4K Ultra settings with raytracing, obviously.
@@PHIplaytesting Naaa...not @ 4k 240hz which is a thing because all the top monitors are like that and UE5 kills the fps in all new games ..
Okay, go invent and build a GPU then
@@mrmrmrcaf7801 people who bought 4K 240fps and want 240fps most likely for a competitive gamers. And competitive games like CS2 or maybe R6, Valorant are relatively easy to run.
LOL 3:24 the monitor shut down when He hit the lid, I can't 🤣
"Absolutely everything is cranked" @4:00...proceeds to show the settings with DLSS Performance mode selected.
Tells you everything you need to know about nvidia. Greed and manipulation.
Yeah on a 4090 it has dlss PERFORMANCE AND frame generation and every single other helping hand you could turn on😭😭😭 this is NOT honest reporting Bruh, even the FOV is set to 80 which nobody would do
Absolutely!! What even is this comparison? With DLSS Performance mode, FOV set to 80 and Screen Space Reflections set to Ultra and not Psycho?
@@GoldenSW it was cooked the moment he wasnt allowed to change settings. its literally perfectly designed to "look" marketable as an upgrade without actually being much of one.
@ congratulations nvidia!!!!! You’re running the same embarrassing helping hands as my 2060 super!!!!🤪🤪
Uploads been diabolical since CES
2 minutes ago is CRAZY
It might just be *because* they're at CES
Great fo UK
@@KaiCross-l5inah b it really isnt
@@NahBNah you really had to use your profile name in your reply 😂
So can we talk about 3:22 when the display connected to the 5090 blacked out as soon as Linus closed the case??
@@TilLakeover it went to sleep mode
I noticed that too. Probably just went to sleep at the same time which made it look like he knocked the display cable out lol
it’s a conspiracy, you should start a religion about it
This openness is amazing, and shows how confident they are in what the card is capable of! They actually outdid themselves here
dlss will kill optimisation and gpus
Lazy VG developers already did that brotha. Talent and dedication are no longer the norm.
Optimisation was killed when Sony showed the soulless corporations that gaming had money to be made for non-gaming companies, DLSS and its ilk will only add to the problem.
TAA already did, we need MSAA back.
This i hate it
@@ni55an86 that's called the woke dev era
2:44 he just shows us the employees to show that they are holding guns to his head
@@Nobody-absolutelynobody lol this is so dumb they told him exactly what to day and do and he sold put like a ...common chicken he's nothing but a stooge... I think it's funny I go to comment Nd it tells me to behave ... u like.linus and his team with that one girl
😂 must have been so awkward
6:33 interestingly, the way the numbers and lights blur is exactly how the world looks for me when I’m not wearing my glasses
yes and nobody would pay $2000 for your eyes
@@gameclips5734 damn, harsh bro
@@gameclips5734 HAHAHAHAHA
That was wildly unnecessary, but i loved it hahahaha
@@gameclips5734 technically a blind person would if it was guaranteed to make them see
@@ReportingEligiblePodcast would they though? I bet some black market out there somewhere would sell partially working eyes for less than 2000$ in that alternate reality
One important thing: how does the 5090 perform against the 4090 without AI frame prediction? Just raw power?
Anyone else notice the left monitor turn off after he slammed the case door @3:22? 👀😂
😂🤣
@@liquidalloy yeah, that was weird
My monitor does that too idk why.
My monitor flickers off when I turn on/off the adjacent room's light 🤷♂️
@@Un1234l get an electrician at your house because that sounds like a problem
36ms input lag with 268 frames is wildly shit
It doesn't start at 27fps, but from much higher fps, 27 is at native 4k, but the gen frame is probably applied at 1080p upscaled with dlss, so it will be going to 70 80fps which are then quadrupled. The input lag will be less than 15 ms which all in all is not bad
frame gen adds disgusting latency - its so stupid that nobody talks about this, play some stalker 2 with it , it feels like your are drowning in a swamp, but fake fps counter shows 150 lmao
i thought you're exaggerating until i saw the video. Holy.
@@playPs232 have u tried ? As long as u are around 90-100 fps there is basically none , also if u are using a controller ( like single player rpg , the imput lag of the controller is bigger than the fg one)
That's 20fps with max settings, like all AA settings at max values. I always game at 4k above 60fps, never run AA, it's a waste on gpu power.
100% that linus is being held at gunpoint
Ya, they do look like Nvidia goons.. good fellas but with green logos, lol. Modern-day mobsters work for corporations.
The look on their faces.. say, "Let's break his legs." Lol
Turn off DLSS and test again. (only raster performance matters in the end)
Nvidia was watching, and specifically would not let anyone at the preview event change any of the settings. Should tell you something about the completely BS claim of "5070 = 4090 performance" and that's without even getting into the insulting 12 GB of VRAM.
just wait for the 6000 series or maybe when they come out with the 5080 super or something that has like 24 GB of VRam or something im not spending over 2k for the 5090 if its like barely 10% faster than the 4090 i get the feeling the 5090 is actually a 4090Ti in disguise
cyberpunk running at 28 fps is *not* the flex nvidia thinks it is
ever heard of path tracing and native 4k? not exactly easy things to render lol
@@theseventhdman they should be tho
@@theseventhdman For a 50 series card it should be more than 28fps.
@@iambear.6526 according to who? Path tracing wasn't even possible in real time until recently and it becomes even heavier at high resolution
Id suggest looking up how CGI or just 3D renders in general are made, rendering ridiculously simple scenes with optimized settings doesn't even come close to 1fps. 28fps with path tracing is insanity that should not be possible, cyberpunks engine and ofc the gpu are doing amazing things but the tech is too complicated for basic consumers to understand and appreciate
Remember the time, where a bonkers spec Nvidia GTX Titan X set it self apart with a ridiculous pricetag of ... $1,000!!! "But! This is a flagship device, a halo product! Nobody would ever pay a thousand bucks, just to play games...." - yeah, that aged like fine milk.
Remember when the dollar could actually buy you something of note at a dollar store?
that was back when rent was $500. Compared to the rent that's now $1500, that gpu is a bargain /s
That was 10 years ago though...
Dude the GTX 690 was $1000 in 2012. Companies have been making $1000 GPUs for a very long time. I mean they were selling flagship GPUs in 2004 that had an MSRP of $600.
@@DMSparky Yeah and in the days of SLI the top-end card might only be $600 but you were expected to get TWO of them if you wanted to waste your money on a luxury PC. And you probably needed to watercool everything because air cooling sucked.
Now I know the reason why god gave us 2 kidneys.
1 for us,
1 for Nvidia
Trick is dont use your own kidney
😂😂
If people sell 1 for apple, you need two for NVIDIA
@@Snafu2346 nvidia have so many kidneys that you have to give 2 now....
Yeah, Jensen needs another leather jacket
What is the game called
6:01 he really is being held hostage
Nvidia advertising performance with only dlss and multi frame generations as reaching 4090 performance with a 4070 is unacceptable fr
Or maybe raw frames are a dated way to look at it ?
@@Lightwish4KBros an Nvidia employee
@@BugattiBoy01 Maybe there's more nuance than "rah, AI bad"?
@@Lightwish4Kthey look better than AI generated ones
"The 4070 isnt as powerful as the 4090 when you disable everything that gives it an advantage"
I like real frames better than fake frames all day every day
If you can't actually tell the difference what does it matter? It would be like saying "I'll take real video to polygons any day"
you won't recognize them
@@edgeldine3499 i can tell the difference 100%
@@EasternUNO wont there be input lag?
@@edgeldine3499it’s not the frames that’s the issue but it’s just the fact the Gabe looks ugly as hell
Will your internet speed effect the card performance? asking for a friend.
The 5080 should have 20 gigs of vram imo
Yeah, and 5070 16GB, they created a nice VRAM hungry technology and are limiting it to get people to get the more expensive card,...
same vram as 4060ti for double the price what a joke
@@Emetsys they are doing it to cause people to upgrade every 1-2 years. Even 12GB is pretty limiting in newer titles .. I was hitting the VRAM limit in RE4 Remake and the settings weren't even maxed out.
ti/super coming end of the year, they gonna serve ya witht that juice 24gb
And at least have 12000-14000 cores with 10k does not make any sense
Can't change the settings. Yeah because as you stated the 5090 was running Frame gen 3x to make it look better than it was. Turn that shit off and then put them head to head and that difference will suddenly be about 20 fps instead of 120. Don't get me wrong, I like the tech, but if we feel we have to rely on it to bandaid other shortcomings then I think we went the wrong direction here.
It’s ddr7 so probably a little more but ya DLSS is doing all that shit
@@CaptToilet stop bitching. Why do you care about raw vs AI. The whole point with more fps is for shooter games to feel better. More fps is more fps who cares if it’s not raw
@Noobnubnoob you ever hear of a thing called input latency? Educate yourself on it
@@Noobnubnoob Nobody is gonna play shooters with AI mods lol, latency is way higher
@@Noobnubnoob tell me you never played an fps game with dlss without telling me you never played an fps game with dlss
Can't wait to purchase this when the 10090 comes out
27 fps raw performance of $2000 RTX 5090. all you need to know.
exactly , those cards seem to be a scam...
mind me asking for the timestamp?
Explain?
NVIDIA won't innovate unless the competition does first now, they only want to sell more cards for higher margins, if they started researching how to make an extremely high performance cards affordable to everyone they wouldn't have 1000% price increases in their stock. They are going to just raise the performance by 20% and sell cards to people on hype alone convincing them they need to upgrade every 1-2 years.
And that’s on terribly optimized “new” games, if you play actual games made with love, not for money then you’ll see how big of a difference optimization is.
I was surprised to see wicked as a sponsor. I replaced the pads on my Bose with them years ago and they are still in great condition today.
@@ryutsukishiba2943 are they really worth it? I have been recently hearing about them and wanted to try as I am not a huge fan of the ones on my current headset. I just get wary of buying RUclips influencer ads as most products aren't worth it
This isn't the first time; I think they sponsored a WAN Show awhile back, and they've been using the cushions on the WAN Show headsets since then.
I used their cushions for a 3+ year old Corsair headset and it felt great. Headset was still partly broken so wasn't a "like new" experience, but it made the headset usable until I could get a new one. Just thought it would be worth mentioning since I see so many overhyped products - and now this is sounding like free advertising...
1:21
Yeah, but only with the new Frame Generation x4 Activated. Other than that the 5070 is only like 20 percent faster as a 4070.
Wait till you compare it to 4070 Ti super, which is also 799 $ and realize 5070 is slower for about 50 $ less without MFG.
5070 is 549@@crimsonninja5090
I just hope more and more people wake up to this and stop supporting these malicious practices. Unfortunately this is what happens when there is no/limited competition. The same thing is happening in consoles now that Xbox has essentially given up. PlayStation is about to go full Nvidia. Gaming is in the saddest state it's ever been.
I don't like the ghosting in frame gen. Still on a 3080Ti so guess I either have to grab a 5080 or wait till next gen. Probably just gonna wait.
Interesting i noticed cyber running on performance mode, i play in 4k on quality modes with the strix 4090, the fps sits comfortably with most games around 60 (not supported) while with supported games (or battlefront 2 because its engine is amazing) the fps flies over 120
11:51 props of the editors that was an amazingly good sync
I was looking for this comment haha
It is depressing to have finally hit the point where I realize that my love of being at the highest end of PC performance has well exceeded any of my gaming use cases. It would be epic to upgrade to the 5090..... except I do not play anything anymore that even stresses my 4090, and I realize I have no desire to jump to the games that could use it because I am now just the newest generation of old school gamer.
It's not you bro, games do suck nowadays, objectivly
You don't need to upgrade anyway because the performance of the 5090 is barely higher (mostly attributed to gddr7 and a bit higher clock speed - else essentially the same besides some npu architecture shenanigans). So if you're not using dlss4 aka getting 15/16 of the content you see on your screen generated into a blurry mess ... well then just be happy with your 4090 bro.
@@Konayo_ 5090 is actually downclocked compared to 4090
That's a hell of an irony there my guy, but I totally understand it.
Newer games are getting bad
"I can't think of any setting I'd wanna change" I can. DLSS. Shouldn't we see the difference it provides? I would think Nvidia would like that
I think because they want to advertise the card not dlss itself.
Nvidias website advertises 4090 with 21fps and 5090 with 28 fps.(dlss and frame generation off) 4k Max path trace settings. IN Cyberpunk
The difference is 130fps
Nvidia would most definitely not like that because knowing cyberpunk, the performance would probably TANK.
Damn! how many 5090 did it take to render Jensen's leather jacket?
I am hoping that an alligator or crocodile did not give up its life for that ridiculous jacket
bros jacket be having that full path tracing
@HadTooMuchToDream why? Theyre a nuasance in most parts of the world since they breed so fast. Like wild boar and coyotes.
@@BradUSMCVETrider We don't see alligators or Crocodiles wearing human skinned jackets. Besides, they don't breed as much as humans
1:32 hey guys, is that Austin?
Yes
yes
yes
yes
yes
4:40 ok so it's only AI, 36 vs 38 milliseconds render time basically means these have the same underlying framerate and latency
PCL is not render time, it's latency. Those can be correlated, but don't have to be. I'm pretty skeptical on the front of non-DLSS perf. gains for this generation, but this isn't information that backs that up in any way.
5% better for 300 more dollars letsgoo
Exciting times ahead with Adaxum! Just joined the presale, can't wait to see where this project goes
I think the interpolating technologies will work for a few games at 4K resolution, and all other games will support them and other resolutions to varying degree. It would be informative to know how these place in a bang for buck raw-performance chart of TFLOPS + 1080p 1% lows *without* interpolating tech, in other games where they didn't ask the devs to make a special version for a demo. Most of the games I play are outside the very limited list of "the only five AAA titles this year", and that's (hopefully) true for most gamers.
6:40 probably your siblings wouldn’t pay 2000$ msrp for a graphics card
Video cut 3:13
Came here to say that
I'd be interested to know how it handles 10-20 year old games. That's part of the beauty of PCs after all running old games
Still not good enough for GTA4.
can it run crysis
Perfectly fine wym lol. You can run 10 year old games on max settings on cards from 5+ years ago and demolish it
My 7900 XT crushes old games.