I remember when console people were saying we don't need more than 30fps to have an enjoyable game, and the PC people were bragging about glorious 60fps. But yeah 120 is getting to be plenty.
@@scottwatrous 120 was about the point past which I couldn't see much difference, back in the CRT days. Once you hit around there it was good for almost any type of game. Yes, twitch FPS guys can benefit from more, but most games... nope lol. These days I care more about pixel response and individual lighting, so OLED and similar is king. A 48" C2 is my daily driver and I absolutely love it with an XTX Nitro and 7800X3D.
It's plenty for single-player story games, but nowadays the majority of gamers play competitive shooters and actually need the fps to reach their refresh rate. I do agree games like Warzone and Fortnite need to be optimized better, because after all the years of occasionally loading into Fortnite I still feel like I have to put my settings at low to get my fps to match my monitor's refresh rate, despite having a 4070 Super.
In single-player titles you don't need more than 120 fps, but in multiplayer you only want maximum fps without any FG. Interesting marketing Nvidia has right there... huh
People who utter this sentence are amusing to me. Who forces you to use very intensive settings in a game? If they added a feature to push path tracing accuracy even further for users who want to enable it, would you be upset about that too?
This is the reason I grabbed a 4090, I didn't think they could get much better performance than it over the next couple of years… and I was right. Prob won't be upgrading again until the 6090 or maybe even the 7090.
Same here, don't see myself upgrading until the gen after the PS6 is released, and that will probably be the 70-series. I also hope that I can get at least 4090 +50% performance at 250-300W then, since the 4090 is already making my room into a sauna in the summer months (the 5090 would probably do this 6-9 months a year).
As of recent data, a small percentage of PC gamers use 4K monitors. According to the Steam hardware survey, only 1.74% of PC gamers run single-monitor setups at 4K resolution. (What is the point of building a GPU that caters to less than 2% of potential buyers?)
@@superior_nobody07 I guess you don't see the massive trails behind the cars in Cyberpunk 2077 with DLSS on to get a playable frame rate. Or the massive loss in detail on everything. You probably play on a 24" LCD monitor at 1080p. I use a 45" Corsair OLED 3440x1440 240Hz monitor. I'm not paying $2,000 to upgrade from the 4090 to the 5090 for at most a 10% increase in raw performance, that's insane. I was personally expecting way more.
Appreciate the in depth test. It’s all the sort of experimenting I would have done. Great to see an influencer/reviewer really taking advantage of their early access to benefit of their audience. I feel I have a pretty good understanding of the performance gains of the 50 series. Thank you!
Something to notice here is that the 5090 gets ~68fps at DLSS Quality, while the 4090 was only able to get the same framerates with DLSS Performance (as far as I remember). With the new transformer model the improvement to image quality should be substantial
both still look crap. the transformer demo frames didn't even have the same damn textures or texture maps, it's like throwing away the original game and making a fake one out of cardboard. trash
It was most likely at DLSS Performance, the video didn't show what supersampling preset they chose after disabling FG, notice the cuts at 8:14 and 10:31. I'm happy to be wrong though
@@damara2268 well at that point we could argue the same about raster vs ray tracing
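For anyone puzzling over the Quality vs Performance question above, the difference is just the internal render resolution DLSS starts from. A minimal sketch, assuming the commonly cited per-axis scale factors (roughly 0.667 for Quality and 0.50 for Performance; exact ratios can vary by game and build):

```python
# Rough sketch: internal render resolution implied by common DLSS preset
# scale factors. These ratios are the commonly cited defaults and are an
# assumption here; individual games and builds can override them.
PRESET_SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS preset."""
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for preset in ("Quality", "Performance"):
        w, h = internal_resolution(3840, 2160, preset)
        print(f"4K output, DLSS {preset}: ~{w}x{h} internal")
    # 4K Quality     -> ~2561x1441 (close to 1440p)
    # 4K Performance -> ~1920x1080
```

So at 4K output, Quality upscales from roughly 1440p while Performance upscales from roughly 1080p, which is why the two presets behave so differently in path-traced scenes.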
I play games to win and having artefacting, ghosting and motion blur at high refresh rates kind of makes high refresh rate monitors useless no? Now I know this is going to be hard to grasp but - other people are different than you, they aren't the same as you. Try to understand this simple concept in the most simple manner and the rest shall follow. Peace
It's faster than a 4090 while native rendering with path tracing enabled, but not by much. DLSS means we're NOT rendering at native resolution, and it can't get decent frames without it. This isn't a huge jump in REAL performance over a 4090. It's just a huge jump in software crutches to make things appear faster. I have a 4090, but my personal rule is to not upgrade until a GPU is 50% faster in native rendering performance. Good native performance plus the software crutches will enhance the experience far more. I'll save up $3000 for that 6090 I guess😂
Both the 4090 and 5090 were built on TSMC 4nm, meaning the only increase in raster comes from more power and more cores. There's no performance or efficiency gain from a reduction in fabrication node size. Meaning the next gen of Nvidia will likely drop to a smaller node and be a monster. TSMC now has 2nm fabrication, so hell yeah.
Thats what I wanna know. RAW power. Thats all that matters to me, as I dont like frame gen or DLSS or RT or any of that nonsense (MOST of the time). I just love a steady smooth high framerate with "normal" max settings.
@@bra1nc417d then you'll still be fine with this card, or really any of the 40 series and most of the 30 series cards. Most games aren't anywhere near as taxing as Cyberpunk
For me raw performance is no ray tracing and no AI frame generation or DLSS stuff like that. That is raw performance and what I wanted to see. I don't think he showed it in the video.
Thanks for the video, I for one will stick with my 40 series card, I can't be bothered to pay more for better fake fps. I hope one day the native side will be upgraded!!!!!
I have a 4090 card and I was going to sell it and buy a 5090, but after watching this video I will keep my card and will not buy the bullshit you, Nvidia, are selling
I honestly don't understand all this drama over "fake frames". The entire point of a gaming GPU is to provide optimal gaming performance. That means graphical fidelity and smooth, high FPS. Why does it matter how it achieves that? Is my game experience going to be ruined if there's an artifact in the background every once and a while? Am I going to be panning around as I'm playing on the hunt for imperfections resulting from AI frame generation? No. When I was in the hospital, I didn't reject morphine because it would provide "fake comfort" to combat my pain, so why would I reject AI frames that resolve low FPS and stutter? What am I missing here? I know it has to be me because I trust that a million people aren't wrong and I'm the one who's right.
most of these people are in an echo chamber. 99% of people think the way you do, including me. People are always pessimistic when it comes to new things, especially new tech/software. Just wait for the reviews to come out and youll see most of these people will backtrack on their statements about fake frames. I have an RTX 3080 and I'm eyeing the 5080. With frame gen it would be 3-4x better then my current card. For $1k that is not a bad deal at all.
@@behcetozer6356 what do you think about the VRAM on the 5080? My 3080 10GB is having issues at 4K and I'm worried the 16GB of VRAM on the 5080 is going to cause issues again in 2-3 years
They're just parroting statements they've heard, and most of their examples make no sense or don't actually matter at all. Stuff like "why can it only run Cyberpunk at 30fps 4K native" just undermines the technological achievement of playing a real-time path-traced game at over 100 fps. People complaining about the latency in competitive multiplayer games are ignoring that there is no fps game demanding enough to need MFG in the first place, and everyone's suddenly silent about pricing on something like the 5070, because it's actually lower than what the 4070 launched at. VRAM and the 192-bit bus are the only things to really complain about on the 5070, but the importance of VRAM has also been grossly exaggerated. I'm still occasionally using a laptop with 6GB of VRAM and it's perfectly fine as long as you turn down a couple of settings that don't really change how the game looks. I run a 7800 XT now and it definitely feels more comfortable, but I still don't think VRAM is nearly as big an issue as people make it out to be
We are reaching the limit of the technology. We could force frames up to 300 fps without AI, but the GPU would have to be huge. The 40 series reached that limit with brute-forcing frames, so instead of brute-forcing frames we are using AI to make them. The main difference is visual bugs: blurring during quick rotation and combat, and with ray tracing. For example, in Hogwarts Legacy the AI frames can delay certain visuals used in combat while you still hear the sound, meaning instead of the sound and visuals being in sync you get the thunder before the lightning. With brute-forced frames you get both at the same time. 99% of games don't have issues with AI-generated frames, but in the ones that do it affects gameplay and forces the devs to address it; in the case of Hogwarts Legacy they patched the game so those visuals and the sound are synced. This is forcing devs to do Nvidia's job.
I can't wait for the side by side of the 9070XT and 5070, if they are in the same price bracket I'll probably pick the one with the best raster performance within reason, if it's close NVidia features will probably make the difference, this is going to be an incredible generation for mid range competition and a horrible one for mid range value
Nah, no matter what I'll still pick Nvidia, for the sole reason that ray tracing is the future and it's here to stay, DLSS is superior, and even on a 9070XT you'd still be stuck with FSR instead of the DLSS transformer model. Heck, I'd rather choose Intel over AMD because of the AI too
This is exactly where my mind is right now. Totally agree with you. I do want a little bit of future proofing though so I may just shut my eyes and click "buy" on the 5080 because of the increased vram.
They have to update the fps counters to show CPU fps and GPU fps. The CPU is still processing only real frames; adding fake frames won't make it run faster, but it will seem like it's taking longer to process because the fake frames keep each real frame on screen for longer. This is obvious to anyone with a brain.
What the RTX 5090 can do:
28 FPS - path tracing, ray tracing, ultra, 4K, no DLSS and no multi frame generation
60 FPS - path tracing, ray tracing, ultra, 4K with DLSS 4.0, no multi frame generation
115 FPS - path tracing, ray tracing, ultra, 4K with DLSS 4.0, multi frame generation x2
170 FPS - path tracing, ray tracing, ultra, 4K with DLSS 4.0, multi frame generation x3
Image quality doesn't drop! There's no input lag! This is a revolution!! Sell your 4000, 3000 and 2000 series on Avito while they're still worth more than pennies!
@@МихаилЛипатников-у3ю yeah, sure ) The 5000 series is going to be expensive. The 5090 will be in brutal shortage. AMD already gave up, right at CES. They were so shell-shocked that they just shut up, because it's better to stay silent than to spout nonsense about their new series supposedly being capable of anything. FSR 4 showed a lot less smearing, but that's not enough )
Thank you for this very informative video. It helped me to visually understand some things about AI, DLSS and ray tracing that I was struggling to figure out. I also just noticed something interesting. The game is being played on a Falcon NW PC. That might have something to do with how quiet the machine is. Back in the day, my first serious computer was a Falcon NW. Everything they say about Falcon NW is true. Superb build quality and incredible direct support. Unfortunately, it is also true that they are extremely expensive. I couldn't afford to keep up with the pace of the rapid changes in computers at their price point. So I learned to build my own, at least partially through my interactions with their customer service (I5 8600K, GTX 1080, 1TB M.2 Sata SSD). I have to admit it has held up amazingly well, mainly due to the incredible longevity of the 1080 card. However, it is getting to be time to get a new rig. This video was very helpful to me in my research. I have to say the first thing I do when I am researching a new setup is go to the Falcon NW website and see what they are putting in their machines.
It will be interesting to see how much Reflex 2 brings things down. I'm building a new PC and torn on a GPU. I feel the future is geared towards AI, and anyone in my shoes would be stupid not to buy a 5000 series; my only hang-up is that latency.
@@bluelinewhitetails6205 it's only an issue if you need ray tracing. With it off you can game decently at 4K with a 4070 Ti Super, so a 5070 Ti or up would be good depending on your budget. Get a good surge protection plug adapter. Never forget it.
Great vid, heaps of info and lots to think about. I've got a 4090 and have found that it does struggle sometimes at 4K. There is no doubt that with DLSS 4 and multi frame gen it will run very well in supported titles, but it's those unsupported titles I'm worried about. Obviously it's the path tracing that is the insane performance killer in CP2077, and if there is only ~30% performance uplift over the 4090 in rasterisation, I'm not sure I can justify the expense of the 5090. I'll just wait until the 24th for more info I think, but hopefully with DLSS 4 the 4090 will continue to perform well enough to hold out another generation.
Can't get over the massive nauseating Motion Blur... to hide the AI artifacts. Doubling the framerate with the same latency is essentially double the processing time. Also for games that do not support DLSS4, this is completely useless.
I think Cyberpunk 2077 is designed to run at low fps and high latency even on the newest GPU. Then people with money build the best PC with the latest, best GPU to get the best performance, and those companies make a lot of money. I think the companies that make games and the companies that make PC parts (GPU, CPU...) have some kind of deal going. For example, some people with money give up their old iPhone and buy the latest one, but there is no big difference between last year's iPhone and the new one. PC Centric, if you can, test other games in 4K in addition to Cyberpunk and show us. That would be a big help. Another thing is that we really need a real-world test to see whether the most graphically demanding game is actually Cyberpunk or another game. Even though Cyberpunk shows low fps and high latency in the test (DLSS off, frame gen off), we cannot say from that alone that Cyberpunk has the best graphics.
EDIT - After publish, Nvidia asked me to remove the "Exclusive" and "Benchmark" aspects of the title, which I have obliged.
FAQ - This was filmed at CES in an Nvidia briefing, embargoed until today. Full Review embargo is separate.
Does it use reflex 2? No, that's going to be a future release for esports titles only. FG does require Reflex 1 though
Does Frame Gen look better than last gen? Hard to say without a side by side, but for normal gameplay it works great; it will struggle with fast non-linear movement and hair
Linus did this a few days ago. Same game too.
Wait, is it really just for esports titles only? I haven't heard about that.. If that's true then it's a shame :/
Cyberpunk with 4K RT Overdrive + DLSS Performance + RR + FG = 100+ FPS, 8-20ms (in-between) Latency [RTX 4090]
Cyberpunk with 4K RT Overdrive + DLSS Performance + RR + FG = 200+ FPS, 35-40ms (in-between) Latency [RTX 5090]
Just posting this for myself, Pls correct me if I'm wrong
@@jamieferguson935 Actually no. Rewatch his vid and then rewatch this vid.
what CPU was in the system?
"The video isn't seen by Nvidia" if any Nvidia employee is watching this, please close your browser immediately.
LOL! I'm guessing it was OK'ed by NVIDIA, but they didn't restrict going into the settings and playing around with them, so I can't imagine they'll be taking him to court or anything. I'm guessing NVIDIA OK'ed this to counteract all the "fake frames" rhetoric (some of which I find pretty toxic and/or opportunistic) that has flooded YouTube, and this video gives more context.
@@larkinc Yeah they oked their demo, and changing the settings, but not showing temps or power usage. Basically they want the demo to be as public as possible, without inviting millions of people to their HQ
@@PcCentric What model monitor was that?
do you think nevidia dos t look their presentation, where they show everything by them self😅😂
@@skyserf Lg 32GS95UV-W
A $2000 GPU only running CP2077 at 34 fps without fake frames.... impressive
Genuinely, unless they start making physically larger GPUs, it will not get above 60 fps at 4K path-traced rendering. Unless there is an incredible leap in architecture, of course.
I prefer 300 Fake frames compared to native 60
A test build of the 5090 with Cyberpunk as the cutting-edge game to look at makes sense. Cyberpunk isn't really made for optimization, and if you are going to play on the highest settings you can't do it without DLSS or FSR
Path traced games basically require dlss
I think it actually averaged 27-28. Still quite a jump from the 4090's 21fps though. But yeah.....2000 bucks....sheesh.
I love how Crysis used to be the benchmark for testing graphics cards before. Now it is Cyberpunk.
Good
Only because people gave up trying the impossible task of running Crysis.
@@EarthRising97 😅😅
@@EarthRising97 It is not impossible. I have an RTX 4080 and the game in 1440p (my monitor's resolution) can produce 90-130 fps maxed out + PT. Of course I'm using DLSS, Ray Reconstruction and Frame Generation. I don't know, everybody wants to play in 4K but for me 1440p is the sweet spot, and most people forget that Path Tracing is only an option; there's the "old RT" in the game which looks fine, and you can even turn that off and simply get a locked 144 fps without RT.
The Witcher 3 was too, for quite a while!
The developers of cyberpunk must be so happy they made an extremely hard game to run, now they get so much advertising! Lol
Cp2077 doesn't need advertisement anymore
Literally everyone knows about this game
I mean, it runs on my Steam Deck at 45fps. Obviously lower resolution and not max settings, but that shows how well optimised this game is
It’s actually pretty well optimized nowadays.
There's no pride in making an unoptimized game which can't run with decent performance at max settings even on top-of-the-line hardware. With such logic the Stalker 2 devs should be even happier, because that crap can give decent latency only on minimum settings. On a 4090 (!!!).
I'm surprised how many people say how unoptimised it is. I have a modest system (Ryzen 5 5600 and 6600-XT) and it runs fantastic. Only basic RT of course and I'm on 1080p but with my TV's 4k upscaling, it looks nice and i'm happy.
They didn't even allow Linus to tweak the settings. Bro, you have given us a lot.
Would you let Linus mess with settings? Let's be honest here...
@@Ahamshep Linus would try to frankenstein the 5090 into running SLI with a 1050 ☠️
@@Ahamshep
And just like that. the first 5090 in Linus's hands hits the floor with a thud.
Learn to read. The embargo got lifted today, that's why he was able to mess with the settings. The full review embargo is on the 24th.
That's how time works.
Not bad, but I think most of us are interested in the 70 or 80 class GPUs, and how well they will do at 1440p and 4K
Same as the 40 series +5-10% based on SMs, with a slightly better result with RT on than in standard raster, would be my assumption right now
Indeed. Many of us have a QHD 180-240 Hz monitor.
@@B_Machine 180-240 wtf i'm still in 60
@@Mayeloski actually interested in the difference between the 5070 and 5070 Ti compared to my current card, a 4070 Ti Super.
I wanna see if the 5070 Ti would be worth the money. I am looking to build a PC, switching over from laptops. Really keen to see if the 5070 Ti is worth it or if I should just go with a 4070 Ti Super and upgrade way into the future lols
Just wanted to say, out of all the big youtubers covering the RTX 5000 CES launch, this video gives the best and most detailed information on what we want to know. You basically allowed people watching this to calculate the performance uplifts between the 4090 and 5090 in all the different scenarios. Exactly what I was looking for. Thanks so much!
the big tech tubers know if you make pro amd content it gets shares and views on reddit and more money for them
and to me there is almost no point in getting the 4090 or 5090
The big tech channels were not allowed to show fps because Nvidia is scared of their massive influence 😂. Let’s be real, they didn’t let Linus show fps in his video because that guy can instantly tank nvidia’s gpu sales by a single comment
If you can sell your 4090 for $1,500 then it's worth getting the 5090 for an extra $500.
@@franklinokeke7848 Linus did show fps though didn't he? He wasn't allowed to change settings but he was allowed to post his video days earlier.
A GPU for the price of a used car running cyberpunk 2077 at 28 FPS. Wonderful.
It's not a GPU's fault a game is made so you can adjust settings to make it unplayable. It's like saying a car sucks because it barely makes it up a jagged icy mountain.
@@getrekkedon to be fair, it's like your speedometer says you're going 160mph, but it's fake, you're only going 100, and AI makes the engine sound louder and tricks you.
@@getrekkedon You are so cringe lol.
@ Is there an AMD card that can push path tracing with no upscaling? What are you saying? You know you can turn those settings off right? Just like you can choose not to drive your car up icy cliffs. I think most people who have the option to play graphical single player games with path tracing on using AI will because it actually looks that good.
@ Unlike car race times, it's a subjective experience one in which the majority will appreciate actually being able to play full path traced aaa titles because of the AI.
Does anyone actually play the game? Get into a gunfight, with explosions and particles flying. Why are we just standing at an intersection? There is no purpose praising the latency if you’re not showing the conditions in which the latency needs to be good.
I was thinking the same. To truly test a card you need to play the game. Period.
Yeah, not only this, but the latency standing still in a low-density area is 40 ms, and as soon as he switched to quality mode it was 50 ms. For a guy that claims he's optimistic about latency, why is he not complaining? 40-50 ms is horrendous, it would be like playing on GeForce Now in the cloud lmao.
@@welshe222 That's what I'm saying, like are we really gassing up 40 ms
THANK YOU.... this is what every single comment should be talking about. You are standing still doing nothing
That main area of the city is usually one of the worst for performance so I think it works.
Will wait for a 4090 vs 5090 comparison with raw frames rather than dlss/Ai
the 5090 is 33% faster
Just take the numbers here and compare it to a 4090 Cyberpunk 2077 Overdrive benchmark.
RAW: 5090: 28-35 fps | 4090: 15-20 fps
DLSS Q FG 2x: 5090: 110 fps | 4090 80fps
@@aHungiePanda So 80% more performance in ray tracing.
cope AI is the future
It's only an 8-frame gain with the 5090 vs the 4090
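For what it's worth, here is the arithmetic behind the uplift claims in this thread, using the unverified numbers quoted above; a quick sketch of how the percentages are derived, not a benchmark:

```python
# Quick arithmetic on the (unverified) fps figures quoted in the thread above,
# just to show how the percentage uplift claims are derived.
def uplift(new: float, old: float) -> float:
    """Percentage improvement of 'new' over 'old'."""
    return (new / old - 1) * 100

# Raw path tracing, midpoints of the quoted ranges: 5090 ~31.5 fps vs 4090 ~17.5 fps
print(f"Raw PT uplift:       {uplift(31.5, 17.5):.0f}%")   # ~80%
# DLSS Quality + 2x frame gen: 110 fps vs 80 fps
print(f"DLSS Q FG 2x uplift: {uplift(110, 80):.0f}%")      # ~38%
```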
Standing still and playing with motion blur enabled surely is the best way to demonstrate how this over-2000-dollar GPU fares. Bravo!
Appreciate you showing the latency and giving more awareness about it- definitely makes a huge difference in how a game 'feels' when you're actually playing it
Yeah, 40-50 ms latency is unplayable. For first person shooter it's nice to be below 20 ms, MAX 30 ms, anything higher is garbage. Fake frames nonsense is just a scam, it's such a shame that most people don't even understand it.
@@detrizor7030 everyone understands it. It's a trade-off. If you want path-traced, super graphical games, AI is a must right now. Esports titles where that low a latency is a must are typically not graphically intensive, are highly optimized on top of that, and don't need AI. If you're really competitive you don't care much at all how beautiful the game is and will turn settings down to get rock-bottom latency and visual clarity anyway. The 5090 will likely be the most powerful GPU with or without the AI, so having the AI on top to play single-player games where lightning reflexes don't matter and you like incredible graphics is nice.
@@detrizor7030 how is 50 ms that bad? I usually play online games at that ping and honestly I don't feel a difference, and in online games it should matter more. How come it's a problem in a game like Cyberpunk 2077? Since when did 50 ms become that bad?
@@getrekkedon finally someone gets it. Leave the AI features off for FPS games, turn them on for single-player games
@@getrekkedon "It's a trade off." - no, it's just nonsense. There's absolutely no sense in playing first person shooter with 40-50+ ms of input lag, this path tracing stuff just isn't worth it. Better not play a game at all, than with input lag higher than 30 ms.
"Esports where that low of a latency is a must" - that's what i'm talking about - almost everybody keep talking some gibberish about multiplayer/singleplayer games and so on. That's NOT about esports, we need decent input lag (below 20 ms, 30 ms tops) NOT ONLY in multiplayer games, but in ANY game with mouse input, ANY. Come on, we need it even without any game, in plain operating system. In esports we need not just decent input lag, but lowest input lag possible, there's a VERY big difference between these 2 things.
"where lightning reflexes don't matter" - what are you even talking about? For you there's only 2 options available - "lightning reflexes" and 50 ms ugly mess? Nothing in between? Seriously?
Thank you Centric for the benchmark with no bells and whistles
Would have loved to have known the temps. That Nvidia card is quiet and if temps are good, the AIBs are going to have to compete.
10:55 nice npc walking path there
wht
😂 💔
failed game, used as a benchmark
@@MilosCsrb 2nd best singleplayer game I've ever played besides Elden Ring.
This is amazing! I have walked like that sometimes, but it's the first time seeing NPC doing that.
I honestly don't think I'm going to notice the occasional smudged graphic generated by the AI unless I plunge my face on the monitor, actively looking for it
tool
In a multiplayer game online with many movements and actions taking place around the screen you can notice the MFG..
@@Billy_bSLAYER don't waste your time explaining. this person has accepted the lie and justified believing it....
@@Billy_bSLAYER hopefully it gets better. Cause it seems Nvidia isn't gonna change. They're gonna go all in on this and it sucks. We are gonna get to a point where our GPUs can only handle fake frames 😂
What lies? You are getting more frames. Ai or not, 200 frames is still better than 45@@albertcamus6611
Cant wait for the 5090 vs the 4090 raster, dlss, raytracing benchmarks.
Didn't DLSS 4 have problems? And Nvidia also lied about their AI framerates.
it couldn't even get 30 fps on cyberpunk.
xD
Raster seems to be like 33% difference based on this game
5070= 2x4090
@@Celatraless
the 5080 vs 4080 will be hilarious since its a mere 8-14% raw performance increase lmao
If 5090 gets released please do one video on RTX 4090 vs RTX 5090 Raw Performance and Native benchmarks 😅
Why does that even matter? 5000 series is about AI and new tech that 4090 doesn't have and can't compete with.
@@ryanhamrick8426 Clearly you aren't listening, all of that AI performance means nothing if the base performance is shit. If all you care about are generated frames then great buy a 5000 series GPU.
For those of us that care about 0 AI generated frames, or just base DLSS with no frame gen, we want to know how much they actually improved the raw performance compared to the 4000 series. Is the dollar per base fps worth it or not?
@@RedSky8 what's truly concerning is the latency which will worsen
Jensen: STFU 🤡 & GO EARN ME A NEW JACKET!
@@ryanhamrick8426 why would you want AI frames that have artifacts and increase latency, when you can have raw power with lower latency and fewer artifacts? Nothing cool about new technology that is lazier and less useful than how they were doing things before
yes please everyone hate on the 50 series. DO NOT buy a 50 series. wait AT LEAST 6 - 7 months. do not pre-order. just keep posting on the internet about how bad it is, This might help me get one at launch. thanks.
amen to that. sick of these crybabies
If they have a 4090 yeah it is senseless to jump on the new line. If you come from a gtx card that is another story
@@laszlodajka5946 no shut your mouth. It's trash. No one should buy one. Terrible. Fake frames ya know
@@TheFlyinFisch FACTS. You'd think Nvidia is out here with guns to everyone's head DEMANDING they buy a 5090 the day it drops lol weirdos 🤣 wonder if they get all mad when Rolex drops a new expensive watch they can't afford too?
The funniest thing imo is: the vast majority of those who cry about the "disgusting fakery Huang scam" are the same ppl who either sit on AMD or on a GTX 1060. They are not going to buy the new GPUs anyway. And they have no clue how the new tech FEELS with your own eyes and hands.
Also, they are sure that the 88% of PC gamers who buy Nvidia are plain stupid hamsters scammed by Huang.
Yes, I don't think I would upgrade from a 4090 atm, but the new features still look decent if you have a much older GPU
I'm a 1998 Tomb Raider II guy, so I know that gameplay is the main consideration, like about 70% game design and gameplay. Graphics are great but there's a point of diminishing returns; a glitch here and there makes rock all difference to the gaming experience. These GPUs are put on a pedestal and you think they will make you happy, but in actual fact they're just way too expensive for 99% of the population.
Well yeah, these gpus are obviously not targeted towards 99% of population
Totally agree with you except when it comes to VR.
@ ok then 99.999% of gamers. How can u spend £2000+ on a GPU, like it's a Rolex or a Gibson? It's a complete rip off, like more or less everything today; KFC, cars, whatever. You pay LOTS more and get a lot less value. Get real, £600 is too much for a GPU
@@outsidethepyramid if u have the money who cares, it's your choice and it's your money
@@jaaqob2 gamers will probably get the 5080/5070
Wow, this may be one of the best videos I have seen from you!! I love how you went through many settings and even did a noise test on the card in a quiet environment; this is way better than Linus's video on this!
Great video, exactly what I wanted to see, love how you were tweaking the settings, very informative! Any chance of a similar video for the RTX 5080?
I spent too much time making this to try the other demos sadly
I told you why he didn't test the 5080 but he deleted my comment.
Damage control.
Why? What happened? @@CrashBashL
@@VoidBass6412 Each time that I answer, he is deleting my comment.
No bad words are being used.
He's just doing damage control.
@@CrashBashL I can assure you PC centric doesnt manually delete comments like this lol.
Great Video tbf, ultra comprehensive!!!
I will use my 3090 with upscalers. It can still spill out decent framerates.
Im rocking my 3090, still not convinced to upgrade.
it honestly amazes me that people complain about the 2k price tag when the 3090 for a long time was sold above 2k and plenty of people bought it
You will get DLSS4 upgrade too, the transformer model with much better image quality as it was said in the video.
@veniulem5676 Yep, I went daft in COVID £2,150 for a 3090 Suprim X.
Not sure i can stomach the big prices again. Need to wait for 60 series I think. Unless the 5080 comes out really good.
Yup! Pair it with Lossless Scaling and you basically got yourself a 5090 lol
send me the card i can benchmark it too
Sell it for a 2011
sure thing lil bro
Yeah just send me your adress and your mums credit card information so I can ship it lmao.
Do you mean keep it?
I can test it for free and never return it pls
Thanks for going into the latency and not just being a fps fan boy. I really appreciate deeper dives like this and has answered many of my questions.
Optimize games. We don't want fake frames. We want native resolution.
Wooo UK person does the 5090 review!
He passed Jensen a botta-of-woaa
@@AmanSingh-xk1me tasted slightly better than wadder
Better than review on red note ? 😂
Buncha wanks
@@AmanSingh-xk1me Woaa > Wahda
Definitely feeling better about having recently gotten the 4080s. The frame generation is a nice buffer but without it, it's a marginal improvement. Something I expect to develop more for cards that are not the 50 series. Still cool, and way more appealing to those building a newer pc with a smaller budget
Only 1000 for the 5080 idk mate….
@ I think they are no brainer for the story games, but for the fps they don’t seem like must upgrades. I’d like to see a VR run with the frame gen, that might be epic
@@nlluke5207
If you could even get it in the first place lmao.
People who got a 4080s at launch made the right choice.
@@ThisLuvee ppl that bought the 1080 Ti back in 2016 made the right choice hihi…
@@nlluke5207 it will be just a 15% uplift unless you prioritize fake frames lol, and the 4080S dropped in price.
Thought this was Scott the Woz, I should wear my glasses more
I believe he is a character from The Lord of the Rings
4 years later and Cyberpunk still can't run at 60fps 4k Max settings natively on the most "powerful" gaming GPU without cutting corners.
Yeah and the 5090 is nearly twice the speed of a 3090.
Well tbf they added path tracing way later after the game released
Now that Moore's Law is dead, short of tech paradigm shift the way we are going to get more performance is by 'cutting corners'. Thankfully it looks like we won't really be able to tell corners have been cut.
@aquacube The current state of DLSS is really noticeable in a lot of games that feature any kind of hair or foliage. Maybe one day it'll be better
@ fingers crossed for future generations, I'm sure it will as AI/software tech is always evolving.
I feel like you should've explained that PC latency isn't the same thing as frametimes, since that seems to be a common misconception and something there was tons of outrage about on Reddit.
That could be its own video.
100 percent I can't stand when I see people say they are getting 16-17ms of latency when at 60fps. They literally don't understand that they are looking at only the frame time and not the total system latency.
@@Sp3cialk304 system latency also depends on your keyboard and monitor, and even the quality of your HDMI/DisplayPort cables
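To make the frametime-vs-latency distinction concrete, here is a minimal sketch. The per-stage numbers are made-up round figures purely for illustration (not measurements of any specific system); the point is only that end-to-end latency stacks several stages on top of the bare frame time:

```python
# Illustration of why frame time != total system latency. Frame time at 60 fps
# is ~16.7 ms, but end-to-end ("click to photon") latency adds several stages.
# Stage values below are illustrative assumptions, not measurements.
def frame_time_ms(fps: float) -> float:
    """Time budget for a single rendered frame, in milliseconds."""
    return 1000.0 / fps

stages_ms = {
    "input sampling / USB poll": 2.0,            # assumption
    "CPU simulation + render submit": 8.0,       # assumption
    "GPU render (one frame time)": frame_time_ms(60),
    "display queue + scanout": 10.0,             # assumption
}

print(f"Frame time at 60 fps: {frame_time_ms(60):.1f} ms")
print(f"Illustrative end-to-end latency: {sum(stages_ms.values()):.1f} ms")
# The second number is what an in-game 'PC latency' readout tries to
# approximate, and it is always larger than the bare frame time.
```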
At 05:40 the latency jumps to 72 for a second.
I'ma stick with my 4080 Super for the next 7+ years.
Amazing GPU! 4080S. All you need. Though 16GB VRAM will become needed for some games.
Not a bad idea.
I have that one and the card is strong enough for 24GB of VRAM. But with 16GB I'm limited in terms of modding and certain games with path tracing. So I'm considering the 5090.
16GB of VRAM won't even carry you 3 years if you play at 4K
@@prenti1205 The 4080 Super won't remain a 4K card forever. Games are inevitably going to get more demanding and it will drop down to a 1440p card and then to a 1080p card. So 16GB of VRAM isn't going to be much of an issue for the future.
Reflex low latency is activated automatically with Frame Gen, but NOT if you disable Frame Gen; you will get less input lag with Reflex ON
Yeah with reflex 2 the latency should be 1/4th of what he was seeing with frame gen off
So 9-12ms
Reflex 2 won't actually work with FG due to how R2 changes frames to get the lower latency @crispycrusader1
It stays on if you turn frame gen off
@@crispycrusader1This game doesn’t have reflex 2 and reflex is on still
@@notbrokebrolt6281 Ah i thought that it was a combined system, like if they updated to dlss4 they would update reflex as well to newest mode
I dont think gamers truly understand how much is required to run a game in native 4k let alone at 60 frames per second.
It’s like going from traditional CDs to MP3 CDs. 12 songs on a disc can now be 100+ songs on the same disc, on the same CD player. The difference? Compression. Now we’re back to lossless audio options, and I assume gaming will go from lossless like now, to DLSS which is like MP3, then back to a mix depending on user preference. Eventually, lossless and DLSS rendering will be so close, it’ll be like arguing about hearing the difference of high bitrate MP3s, and lossless (or Bluetooth versus cable)!
MP3 will never compare to the quality of vinyl
I prefer the quality of record players thank you very much
Jk this is a pretty good example tho
10:55 Good to see npcs in cyberpunk are still half baked after 4 years.
😂. Unfortunately no amount of ai is gonna fix that lol
No game is perfect even Elden ring has bugs here and there
I have been seeing this kind of stuff in GTA 5 for 10 years now.
i’m disappointed, i wanted to buy the 5090 but it doesn’t seem worth it, and now the worst part is we have to wait for the 60 series for a major raw performance increase.
The more generated frames you add, the higher the latency, but at infinite MFG the additional latency, assuming perfect scaling (i.e. generating a frame has zero cost), will approach the cost of rendering one frame. So a 60fps game will have a max latency penalty of 16.7ms (1s divided by 60) even with infinite MFG. The main contributor to the latency is not the processing cost but the fact that the real frame has to be delayed for frame pacing and interpolation. If you search for FSR3 on GPUOpen, you can see why it introduces additional latency. Basically the game engine renders a frame but, instead of it being shown, it is held for half the frame time (for normal FG); then, when a new frame is ready to be displayed, the game doesn't display that new frame but uses that frame plus the previous frame for interpolation and presents the generated frame. The actual new frame is held for around half the frame time, and the process repeats. Because the game holds the real frame for half the frame time, normal FG adds around 8.3ms for a game with a 60fps base frame rate. For 4X MFG, instead of half, it holds the real frame for around 3/4 of the frame time, meaning 16.7ms * 0.75 = 12.5ms of additional latency, or around 4ms more than 2X FG. Again, none of these numbers account for the time needed to actually generate the frames, but as you can see, while more generated frames do add latency, each subsequent increase is smaller than the initial jump from 2X frame gen.
Basically if you're not bothered with the increased latency from 2x frame gen, 4x frame gen shouldn't bother you in terms of added latency unless we are talking about really low base frame rate.
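A minimal sketch of that idealized hold-back model, following the reasoning above and assuming zero generation cost (so real-world latency will be somewhat higher):

```python
# Idealized model described above: with N-times frame generation the real frame
# is held back by (N-1)/N of a base frame time before being shown, and the cost
# of actually generating the in-between frames is assumed to be zero.
def added_latency_ms(base_fps: float, fg_factor: int) -> float:
    """Extra display delay from interpolation-based frame gen (idealized)."""
    frame_time = 1000.0 / base_fps
    return frame_time * (fg_factor - 1) / fg_factor

for n in (2, 3, 4):
    print(f"{n}x FG at 60 fps base: +{added_latency_ms(60, n):.1f} ms")
# 2x: +8.3 ms, 3x: +11.1 ms, 4x: +12.5 ms -> the step from 2x to 4x adds only
# ~4 ms, matching the argument that most of the added latency comes from the
# first (2x) step.
```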
So are you saying the higher your base frame rate, the less latency? For example, I play Marvel Rivals with frame gen on. I get about 100fps +- without it, but when I turn on frame gen I don't feel the difference in latency, which makes me wonder why it's so shit on
Because most of the people commenting on it have no idea what they are talking about.
Never thought we'd get 4ki in 2025 lmfao
Honestly 40-50ms latency isn't even noticeable unless you compare it side by side with 15ms. It's incredible Nvidia has figured out how to generate so many fake frames with sub-50ms latency.
so basically fram gen sucks ass
If you don't understand what it takes to run Full Ray Tracing, let alone RT with full Path Tracing in 4K, then it's time to just walk away. The real question is, how is the image quality on the 4x DLSS vs Raster? Cause if our human eyes cannot tell, then who cares if it is AI.
"if our human eyes cannot tell, then who cares if it is AI"
brain-healthy people do not care, while the crybabies in comments somehow do.
probably a religious cult of "rAw FpS", no matter what
30 fps with path tracing is legit good tbh. I would still wait for raw offline rendering benchmarks though.
Legit good ? No, it's a facking disappointment of a product... should have been getting at least 50!!!!! Ffs !!!
I'm coming from a 3070 and planning on a 3080 barring a terrible release. I just wanna play my games at 1440p and maybe 4K. I have an overkill monitor and I wanna finally take full advantage of it.
My 3070 Ti plays fine on 1440p 21:9. Just the VRAM is low. Why a 3080 lol just get a RTX 5070 Ti instead
What did the vents smell like? Or is that under embargo?
Smells like Jensens 9k leather jacket he wore
Money leaving your bank account
Who cares?
lmao
65 fps in PT DLSS Q 4K is pretty good actually, 4090 does around 40. Let's say it's 60 to 40 because the scene was pretty basic, it's still a massive 50% jump. I think 30% average is more likely but if it gets close to 50% in the workloads where 4090 doesn't cut it (4k dlss Q PT) it's a worthwhile upgrade for enthusiasts.
Um. My 4090 gets 60 fps. Not 40. Ngl this doesn’t seem like a big jump at all without the MFG
For enthusiasts 10% will do, but that's not what nvidia could aim for.
@@guerrierim15 No it doesn't, I have a 4090 and it does not get to 4K 60 on Quality DLSS
@@guerrierim15 ummmmm yea, my 4090 gets 500 fps at native 4K so basically 5090 sucks ass
@@marekavro Pls stop, for god's sake
I’m a 4090 owner and am severely triggered and threatened by a new product coming on the market *dribble *
You will continue to give them money no matter what they do 😂
@@Mike-Dynamike True. There's no real competition to choose from. Not like in the auto industry, where there are so many options to choose from.
@@TampaGigWorker True, there's no competition in the high-end GPU market, but 90% of users don't really need a 4090... they make a conscious decision to give Nvidia money and they already know how Nvidia operates. Just don't cry about it afterwards.
@@Mike-Dynamike I made a conscious decision to spend $1,069 on the Gigabyte 4080 Super OC. A tough pill to swallow, but no complaints after making the final payment through Affirm financing.
I'm happy with it for now. I'll upgrade in about 2 years and stay in the 80 Ti/Super bracket.
@@Mike-Dynamike lmao fax
With the increased input lag, it's basically the same thing as Lossless Scaling.
That artifacting is definitely in the "Now I can't unsee it" category
Nice video mate. Should have tried without RT and without DLSS/ frame gen !
Thanks for this comment. Now I know not to watch this shit. Did he really not test native res😂😂😂
@theotherLewbowski no no, he did. Native, but full RT. I was interested in native, no RT 😅 but the other tests he made were good, watch it !
true
If Nvidia didn't soft-lock the 40 series out of the new MFG x4, there would really be no difference
there is an app with a duck on steam
lossless scaling
Why do people keep saying this? It is blowing my mind. Do you actually believe there will be a ZERO PERCENT difference in raw performance? I think the consensus is that there will be around a 30-35% uplift, which is pretty standard.
@@tucci06 By pretty much having 30% more power consumption. :) It's basically like an overclocked version. Don't get me wrong, I get that nowadays you don't pay for HW but for frame gen... that's where the future is heading, but don't lie to yourself regarding HW improvement...
@@tucci06 A 5090 with MFG enabled: 400 fps, OK. A 4090 with MFG enabled: 350 fps, OK. A new 5090: 2k-plus dollars. Do you think that's worth an upgrade 🤣 ok
Awesome video, thanks!
I have a 4090 and trust me, we don't need more than 120 fps in games; developers just need to optimize their games.
I remember when console people were saying we don't need more than 30fps to have an enjoyable game, and the PC people were bragging about glorious 60fps.
But yeah 120 is getting to be plenty.
@@scottwatrous 120 was about where I couldn't see much difference past back in the CRT days. Once you hit around there it was good for almost any type of game. Yes twitch FPS guys can benefit from more but most games... nope lol. These days I care more about pixel response and individual lighting.. so OLED and similar is king. 48" C2 is my daily driver and absolutely love it with an XTX Nitro and 7800x3d.
It's plenty for single-player story games, but nowadays the majority of gamers play competitive shooters and actually need the fps to reach their refresh rate. I do agree games like Warzone and Fortnite need to be optimized better, because out of all the years of occasionally loading into Fortnite I still feel like I have to put my settings on low to get my fps to match my monitor's refresh rate, despite having a 4070 Super.
In singleplayer titles you dont need more than 120 fps, but in multiplayer you only want maximum fps without any FG.
Interesting marketing Nvidia has right there.. huh
I want a stable 120 FPS.
10:50 Spend $2,000 to play at 28 fps... sounds good to NVIDIA
You think people with 4090 are playing Cyberpunk at 20 fps? Use your brain
There's no saving y'all I swear 😂
People who utter this sentence are amusing to me. Who forces you to use very intensive settings in a game? If they added a feature to increase the path-tracing accuracy even further for users who want to enable it, would you be upset about that too??
This is the reason I grabbed a 4090: I didn't think they could get much better performance than it over the next couple of years... and I was right. Probably won't be upgrading again until the 6090, or maybe even the 7090.
Same here, don't see myself upgrading until the gen after the PS6 is released, and that will probably be the 70-series. I also hope that I can get at least 4090 +50% performance at 250-300W then, since the 4090 is already making my room into a sauna in the summer months (the 5090 would probably do this 6-9 months a year).
@ I mean I'm waiting till they come out with a card that can run 4K 240Hz on RAW performance, not fake AI frames.
Exactly.
@@jakek3653 Waiting what now?? 😂😂
@@BladeRunner031 a card that can run 240hz at 4k.
Really nice bit of analysis, especially for doing this on the fly, at CES in a hotel room. Kudos.
wouldn't that latency be horrible for multiplayer games tho?
Yes exactly this feature is only good if you want to play story games
If you plan to play competitive games at 4K with path tracing and frame gen to reach 240Hz, then yes.
Who plays competitive games in 4K with DLSS and frame gen? Pls explain
As of recent data, a small percentage of PC gamers use 4K monitors. According to the Steam hardware survey, only 1.74% of PC gamers use single-monitor setups at 4K resolution. (What is the point of building a GPU that's catered to less than 2% of potential buyers?)
@@jaaqob2 Hey, people think they're playing competitive gaming if they play COD online lol
No one wants fake blurry frames, just pure raw speed. Reviewers are just kissing ass.
More frames equals more frames. I'm not going to notice minor, minuscule blurs every so often, but I sure as hell will notice 120fps compared to 40fps.
@@superior_nobody07 I guess you don't see the massive trails behind the cars in Cyberpunk 2077 with DLSS on to get a playable frame rate. Or the massive loss in detail on everything. Probably play on a 24" LCD monitor at 1080p. I use a 45" Corsair OLED 3440x1440 240Hz monitor. I'm not upgrading for $2,000 from the 4090 to the 5090 for a 10% at MAX increase of raw performance that's insane. I was personally expecting way more.
I just want to know what the 5090 does at native 4K without RT, DLSS and frame generation vs a 4090!
We will know in 9 days. IDC about RT, DLSS, Frame Gen or any of that shit. I want the raw MAX performance at 4k.
True, never liked DLSS... it makes blurry lines... I'm not a console gamer.
i think about 20-30% better
Appreciate the in depth test. It’s all the sort of experimenting I would have done. Great to see an influencer/reviewer really taking advantage of their early access to benefit of their audience. I feel I have a pretty good understanding of the performance gains of the 50 series. Thank you!
Something to notice here is that the 5090 gets ~68fps at DLSS Quality, while the 4090 was only able to get the same framerates with DLSS Performance (as far as I remember). With the new transformer model the improvement to image quality should be substantial
both still look crap. the transformer demo frames didn't even have the same damn textures or texture maps, it's like throwing away the original game and making a fake one out of cardboard. trash
It was most likely at DLSS Performance, the video didn't show what supersampling preset they chose after disabling FG, notice the cuts at 8:14 and 10:31. I'm happy to be wrong though
@@kam9223 At 8:57 he says that it's DLSS Quality
@@N4CR What are you on about 😂😂😂
@@N4CR Imagine calling the greatest graphics available to humans crap, lmao
You will notice the MFG frame-gen issues even more in scenarios with a lot of vegetation.
Question is, do you play games to have fun or to zoom in on grass and shake the camera?
@@damara2268 Well, at that point we could argue the same about raster vs ray tracing.
I play games to win and having artefacting, ghosting and motion blur at high refresh rates kind of makes high refresh rate monitors useless no? Now I know this is going to be hard to grasp but - other people are different than you, they aren't the same as you. Try to understand this simple concept in the most simple manner and the rest shall follow. Peace
2:08 If you have played multiplayer games at 50+ ping, I don't think you will have any problem playing story-mode games at 30+ ping.
I play games with 5 to 9 ms.. this will kill me when playing competitive
@@ruven782 Look... up to 70ms it's fine
Those have rollback tho, this would be straight up delay so it’d feel a lot worse
Meh...getting 70 fps, 4k, DLAA (no frame gen) with path tracing using 3080 Ti. I want raw power like the 1080Ti 😅
It's faster than a 4090 while native rendering with path tracing enabled, but not by much. DLSS means we're NOT rendering at native resolution, and it can't get decent frames without it. This isn't a huge jump in REAL performance over a 4090. It's just a huge jump in software crutches to make things appear faster. I have a 4090, but my personal rule is to not upgrade until a GPU is 50% faster in native rendering performance. Good native performance plus the software crutches will enhance the experience far more. I'll save up $3000 for that 6090 I guess😂
Yes I have 4090 and I will not upgrade until the next generation 6000
Both the 4090 and 5090 were built on TSMC 4nm, meaning the only increase in raster comes from more power and more cores; there is no performance or efficiency gain from a reduction in fabrication node size. Meaning the next gen of Nvidia will likely drop to a smaller node and be a monster. TSMC now has 2nm fabrication, so hell yeah.
I don't understand why anyone would think a 50 ms delay is acceptable.
Why didn't you test 4K native res without frame gen, with DLAA or nothing at all, no DLSS, no PT or RT, and max settings?
Thats what I wanna know. RAW power. Thats all that matters to me, as I dont like frame gen or DLSS or RT or any of that nonsense (MOST of the time). I just love a steady smooth high framerate with "normal" max settings.
@@bra1nc417d Then you'll still be fine with this card, or really any of the 40 series and most of the 30 series cards. Most games aren't anywhere near as taxing as Cyberpunk.
@@bra1nc417d What you want is airplane performance out of a car engine.
For me, raw performance means no ray tracing and no AI generation or DLSS, stuff like that. That is raw performance and what I wanted to see. I don't think he showed it in the video.
He kept yapping 😂
Thanks for the video. I for one will stick with my 40 series card; I can't be bothered to pay more for better fake fps. I hope one day the native side will be upgraded!!!!!
fr
And lossless scaling does the same job for you without making you bankrupt
I have a 4090 and I was going to sell it and buy a 5090, but after watching this video I will keep my card and will not buy the bullshit Nvidia is selling.
@@cxngo8124 It's not the same thing but it's a useful program.
Just get Lossless Scaling...
It looks better, has more options, and will always be updated.
I honestly don't understand all this drama over "fake frames". The entire point of a gaming GPU is to provide optimal gaming performance. That means graphical fidelity and smooth, high FPS. Why does it matter how it achieves that? Is my game experience going to be ruined if there's an artifact in the background every once in a while? Am I going to be panning around as I'm playing on the hunt for imperfections resulting from AI frame generation? No. When I was in the hospital, I didn't reject morphine because it would provide "fake comfort" to combat my pain, so why would I reject AI frames that resolve low FPS and stutter? What am I missing here? I know it has to be me because I trust that a million people aren't wrong and I'm the one who's right.
Most of these people are in an echo chamber. 99% of people think the way you do, including me. People are always pessimistic when it comes to new things, especially new tech/software. Just wait for the reviews to come out and you'll see most of these people backtrack on their statements about fake frames. I have an RTX 3080 and I'm eyeing the 5080. With frame gen it would be 3-4x better than my current card. For $1k that is not a bad deal at all.
@@behcetozer6356 What do you think about the VRAM on the 5080? My 3080 10GB is having issues at 4K, and I'm worried the 16GB of VRAM on the 5080 is going to cause issues again in 2-3 years.
They're just parroting statements they've heard, and most of their examples make no sense or don't actually matter at all. Stuff like "why can it only run Cyberpunk at 30fps 4K native" just undermines the technological achievement of being able to play a real-time path-traced game at over 100 fps. People complaining about the latency in competitive multiplayer games are ignoring that there is no FPS game demanding enough to need MFG in the first place, and everyone's suddenly silent about pricing on something like the 5070, because it's actually lower than what the 4070 launched at.
VRAM and the 192-bit bus are the only things to really complain about when it comes to the 5070, but the importance of VRAM has also been grossly exaggerated. I'm still occasionally using a laptop with 6GB of VRAM and it's perfectly fine as long as you turn down a couple of settings that don't really affect how the game looks anyway. I run a 7800 XT now and it definitely feels more comfortable, but I still don't think VRAM is nearly as big an issue as people make it out to be.
We are reaching the limit of the technology. We could force frames up to 300fps without AI, but the GPU would have to be huge. The 40 series reached that limit of brute-forcing frames, so instead of forcing frames we are now using AI to make them. The main differences are visual bugs, blurring, and problems with quick rotation, fast combat, and ray tracing. For example, in Hogwarts Legacy the AI frames would delay certain combat visuals while you could already hear the sound, so instead of sound and visuals being in sync you got the thunder before the lightning. With brute-forced frames you get both at the same time. 99% of games don't have issues with AI-generated frames, but in the ones that do it affects gameplay and forces the devs to address it; in the case of Hogwarts Legacy they patched the game so those visuals and the sound are synced. This is forcing devs to do Nvidia's job.
Optimal? When the picture has artifacts? That is NOT optimal.
Raw performance from 10:24: 30 fps
One of the best vids out there!
If the new Reflex works well, i dont even care about the fake stuff anymore, just enjoy the features
They would not release a feature that does not work well.
Reflex 2 is tested and approved by esports gamers QA btw
@@PhilosophicalSock Paid experts approve everything. Even not-tested vaccines 😂
Reflex 2 has an AI component to it... to "predict" the next frame.
I can't wait for the side-by-side of the 9070 XT and 5070. If they are in the same price bracket I'll probably pick the one with the best raster performance, within reason; if it's close, Nvidia's features will probably make the difference. This is going to be an incredible generation for mid-range competition and a horrible one for mid-range value.
Me too.
Nah, no matter what I'll still pick Nvidia, for the sole reason of...
Ray tracing is the future and it's here to stay no matter what
DLSS is superior, and you'd still be using FSR even on a 9070 XT
DLSS Transformer
Heck, I'd rather choose Intel over AMD because of the AI too
@@LeafyTheLeafBoy FSR 4 looks very similar to DLSS tho, I hope AMD finally fixes their upscaler (only took them 4+ years😅).
Amd 😅😂, better pick nvidia 😅
This is exactly where my mind is right now. Totally agree with you. I do want a little bit of future proofing though so I may just shut my eyes and click "buy" on the 5080 because of the increased vram.
turn off DLSS lol
Fr the one thing we want to see
The moment when fps drops to 30 is like hell
So the 5090 is just a 4090 with a 600W TDP and AI
No
@bwv582 yes
@@AB-td5oo No.
No
DLSS Performance, does that render at 1080p? Or less?
For 4K, it's 1080p upscaled on DLSS Performance.
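For anyone checking that 1080p figure, here is a rough sketch using the commonly cited per-axis DLSS scale factors (Quality about 0.667, Balanced about 0.58, Performance 0.5, Ultra Performance about 0.333); the exact factors can vary by game and DLSS version, so treat them as assumptions rather than official values:

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    # The GPU renders at this resolution and DLSS upscales the result to the output resolution.
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080), the 1080p mentioned above
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)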
9:14 16ms is the frame time of 60fps. You have the latency of console players. So what kind of "good latency" did you mean to say???
2:30 You can see all the artifacts on screen, and there are a lot of them! Looks like garbage! Fake frames are garbage!
If we get double or triple the fps for a small increase in latency and some graphical distortion, this is a massive win.
They have to update the fps counters to show CPU fps and GPU fps.
The CPU still only processes the real frames; adding fake frames won't make it run faster, but it can seem like it's taking longer because the generated frames keep each real frame on screen for longer.
This is obvious to anyone with a brain.
They should. I've been wanting to know what my real frames were
What the RTX 5090 can do:
28 FPS - path tracing, ray tracing, ultra, 4K without DLSS and without multi frame generation
RTX 5090
60 FPS - path tracing, ray tracing, ultra, 4K with DLSS 4.0, no multi frame generation
115 FPS - path tracing, ray tracing, ultra, 4K with DLSS 4.0, multi frame generation x2
170 FPS - path tracing, ray tracing, ultra, 4K with DLSS 4.0, multi frame generation x3
Image quality doesn't drop! There is no input lag! This is a revolution!!
Sell your 4000 and 3000/2000 series cards on Avito while they're still worth more than pennies!
For now these fools are busy with their greed, but they'll drop their prices right after the first real tests
@@МихаилЛипатников-у3ю Yeah, sure ) The 5000 series will be expensive. The 5090 will be in severe shortage. AMD has already given up, right at CES. They were so stunned that they just went silent, because it's better to keep quiet than to talk nonsense about how their new series can supposedly do something. FSR 4 showed the blur is much reduced, but that's not enough )
No input lag?? hahahahaha
@@dibaklp Haha. Watch the video. With Yandex translation. There is no lag in Cyberpunk.
Fake frames
Thanks for the sound test. This is the only part that matters for me
Now we are waiting for non-corrupt bloggers
Great work! This is how you showcase a new GPU in detail 🖥
Thank you for this very informative video. It helped me to visually understand some things about AI, DLSS and ray tracing that I was struggling to figure out. I also just noticed something interesting. The game is being played on a Falcon NW PC. That might have something to do with how quiet the machine is. Back in the day, my first serious computer was a Falcon NW. Everything they say about Falcon NW is true. Superb build quality and incredible direct support. Unfortunately, it is also true that they are extremely expensive. I couldn't afford to keep up with the pace of the rapid changes in computers at their price point. So I learned to build my own, at least partially through my interactions with their customer service (I5 8600K, GTX 1080, 1TB M.2 Sata SSD). I have to admit it has held up amazingly well, mainly due to the incredible longevity of the 1080 card.
However, it is getting to be time to get a new rig. This video was very helpful to me in my research. I have to say the first thing I do when I am researching a new setup is go to the Falcon NW website and see what they are putting in their machines.
10:38 Gonna destroy the pc doing that 😅 Good vid 👍
It's wild that the latest stuff is still being benchmarked on a game that is over 4 years old now.
10:55 That NPC casually walking over that road pillar
Remember that these are the first drivers. It will only be better :)
Just for reference, the frame time for 60fps is 16.67ms, for 30fps it's 33.33ms, and for 24fps it's 41.67ms.
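Those numbers are just the reciprocal of the frame rate, which is easy to verify:

def frame_time_ms(fps: float) -> float:
    # One frame's duration in milliseconds at a given frame rate.
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms per frame")
# 24 -> 41.67, 30 -> 33.33, 60 -> 16.67, 120 -> 8.33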
Turn to the monitor
Turn to the camera
Turn to the monitor
Turn to the camera
37 milliseconds of latency isn't amazing. Not great, not terrible.
It will be interesting to see how much Reflex 2 brings things down. I'm building a new PC and torn on a GPU. I feel the future is geared towards AI, and anyone in my shoes would be stupid not to buy a 5000 series; my only hang-up is that latency.
@@bluelinewhitetails6205 It's only an issue if you need ray tracing.
With it off you can game decently at 4K with a 4070 Ti Super, so a 5070 Ti or up would be good depending on your budget.
Get a good surge-protection plug adapter. Never forget it.
10:55 lol that NPC 😂
Great vid, heaps of info and lots to think about. I've got a 4090 and have found that it does struggle sometimes at 4K. There is no doubt that with DLSS 4 and multi frame gen it will run very well in supported titles, but it's the unsupported titles I'm worried about. Obviously it's the path tracing that is the insane performance killer in CP2077, and if there is only ~30% performance uplift over the 4090 in rasterisation, I'm not sure I can justify the expense of the 5090. I'll just wait until the 24th for more info I think, but hopefully with DLSS 4, the 4090 will continue to perform well enough to hold out another generation.
Can't get over the massive nauseating Motion Blur... to hide the AI artifacts.
Doubling the framerate with the same latency is essentially double the processing time. Also for games that do not support DLSS4, this is completely useless.
THIS is literally the best video on the 5090 so far. Thank goodness for a vid that shows what we ACTUALLY want to see. Def a like & share
I think Cyberpunk 2077 is designed to run at low fps and high latency even on the newest GPU. Then people with money build the best PC with the latest, best GPU to get the best performance, and those companies make a lot of money. I think the companies that make games and the companies that make PC parts (GPU, CPU...) have some kind of deal going. For example, some people with money give up their old iPhone and buy the latest one, even though there is no big difference between last year's iPhone and the new one. PC Centric, if you can, test other games in 4K in addition to Cyberpunk and show us; that would be a big help. Another thing is that we really need a real-world test to see whether Cyberpunk actually is the most graphically demanding game or whether another game is. Even though Cyberpunk shows low fps and high latency in the test (DLSS off, frame gen off), we cannot say from that alone that it has the best graphics.
The cooler on the 5090 is the most impressive thing: a two-slot cooler, and even at high load, drawing over 500 watts, it's quiet enough that you could whisper in the room.