gambling at 1000fps with AI gen frames
It will generate the 1000x max wins hooooly
book book book
I think the tech is cool, but I always turn it off, I do not like how it feels. Maybe multigen is better, but I doubt it.
@@AndrewTSq regular frame gen has been updated and you won't need the new GPUs to use it
@@AndrewTSq 5090 andy
X is gonna buy this and use a 1080p monitor
It is insane to me how he has not upgraded to at least 1440p since he has such a high end rig. Even if you run competitive games at lower settings, the higher resolution makes things sharper so you notice things better.
As he should, twitch doesn't even have 1440p let alone 4k
1440p is the sweet spot, 4k will only give you performance issues for a little more resolution
It's pretty much marketing for you to buy better GPUs when even 1080p is a great pixel density for your eyes
@@nexuhs. lil bro, he’s literally using a 4090 and it’s underutilized because he’s using 1080p. Even if he did buy a 5090 he’d see like an average 5% increase in fps unless he used the new frame gen. Also, no one said he had to stream in 1440p with a 1440p monitor
and then hes going to complain that hes not getting better performance because he is cpu bottlenecked
@@nexuhs. You don't use 1440p then. It's not pretty much marketing. Yes, 1080p is great pixel density. When developers aren't lazy and making their games blurry af. But developers decided 5+ years ago to start offloading their labor onto your $1000 GPU and high resolution monitor so they don't have to work hard. So modern games at 1080p do not look like older games at 1080p. They are a blurry mess and you can barely see anything in some titles.
I used 1080p 360 Hz for a few years with a 2080 and recently a 4070 Ti Super but finally decided to go for 1440p 240 Hz instead. The difference in some games is minimal, but in others its staggering. Much like the extra cache in the X3D CPUs which I have also been using, high res is technology that helps when the developers are too lazy to optimize. And these days, they are getting lazier by the day... 50% of modern games are blurry messes. The other 50% run like crap and get a massive boost from a higher CPU cache.
I think this is most clearly visible in games like R6 Siege, where 1080p with TXAA and 100% sharpness on the in-game sharpness slider looks super sharp and crisp. If all games looked like this at 1080p, I'd never go for 1440p. But then you fire up the latest Warzone or Battlefield and all you see is blur at 1080p but looks decent at 1440p. Tarkov is somewhere in-between. It definitely gets a lot crisper but the difference isn't staggering like with the latest CODs and Battlefields.
As for Twitch, this is completely irrelevant. OBS' down scaler barely takes an extra 1% of GPU power. And there is no visual difference. I was concerned about this too before I upgraded, but then I did and there's functionally no difference in the stream quality at 1080p. I'm now running 3x 1440p and I feel good. Oh, and don't even get me started on text... I code for a living and it's so much more enjoyable looking at text all day. The 1440p is a massive difference there.
Ridiculous that modern GPUs are pulling 500+ watts. That's half the power it takes to run your fucking microwave.
1080ti has 12 billion transistors
5090 has 92 billion transistors
You get what you buy 🤷♂️
@@Dom-zy1qy
1080ti has 12 billion transistors and 250W TDP
4070 Super has 35.8 billion transistors and 220W TDP
@@purplewastaken Why are you comparing two completely different class of GPU?
@ so did the first guy
@@purplewastaken And the 5070 only draws 250W and will still have 2x transistors compared to the 4070.
Also, the selling point of the 5070 series is frame gen not raw performance bumps.
So they increased the price by 25% and power consumption around 30% but we only get 30% more performance
that's the price for the Founders Edition, imagine the price for a GPU from MSI or ASUS, holyyyy molyyyy
@@jay_deeeyna5680 These are rebranded TITAN cards... no games will ever need 32GB VRAM for at least 10-15 years.
Remember how the 3090 Ti was also $2,000 and it was only 15% faster than the 3090?
the person buying a xx90 card does not care about anything other than performance.
because its the same size process so you cant get more efficient
@turbo________ believe it or not having $2000 to spend on yourself doesnt make someone instantly rich. people buying the 5090 still absolutely care about price to performance and no one is happy about the massively smaller gains from the 4090 -> 5090 compared to the 3090 -> 4090
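A quick back-of-the-envelope check of the percentages in this thread, using the Founders Edition MSRPs and rated board power (4090: $1,599 / 450 W; 5090: $1,999 / 575 W). The ~30% uplift is only the rough average reviewers reported, not a measured value here.

```python
# Rough check of the deltas discussed above. MSRPs and board power are the
# Founders Edition figures (4090: $1,599 / 450 W, 5090: $1,999 / 575 W);
# the ~30% raster uplift is an approximate review average, not a measurement.
price_4090, price_5090 = 1599, 1999
power_4090, power_5090 = 450, 575
perf_uplift = 0.30

price_increase = (price_5090 - price_4090) / price_4090   # ~0.25
power_increase = (power_5090 - power_4090) / power_4090   # ~0.28

print(f"price +{price_increase:.0%}, power +{power_increase:.0%}, perf +{perf_uplift:.0%}")
```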
im gonna swap to red team
edit: im swapping to team red if the price is right on the 9000 series since Im in need of an upgrade. I didnt say anything about whos better in performance, you guys just assumed
Good luck 😅
They are worse,
@randomroIIs much cheaper tho lmfao, the 7900 XTX is like 1k while the 5090 is like double the price; price-wise the XTX is way better
AMD wasnt great last gen and this gen they are dropping the high end. You are too late to swap to red team bro. They were on par in the 3000/6000's series. 6800XT was about the same performance as 3080. Since then its been going downhill for them.
@@AquaLai06 The 7900xtx is basically a "7800xt" card, aka same tier as the 6800xt that cost $650. So while yeah, it's cheaper.. it's still +50% more expensive than the generation before it.
optimum GIGACHAD
RTX 5090 + 1080p Monitor 💀
1080p with dlss on 💀
what is this background music.. I thought my car had bad bearings at first lol. Also, about the video's point on the gains: the 2080 Ti to 3090 was on different nodes, and so was 3090 to 4090.. but the 4090 and 5090 are on the same node, so it's not that weird that it just uses more power.
The input delay from frame generation is way worse than the tearing. You can hardly even notice the screen tearing in games; the most noticeable thing is the input delay, which makes it feel like you're playing at 60fps even when it "says" 120.
its not really delay its just tied to your actual FPS right? it doesnt add input lag
@@propertymanager9149 Turning frame gen on will definitely give you input delay no matter what fps you're playing at. The delay is caused by the AI frames being generated.
@@propertymanager9149 it does add input lag. Frame generation uses a video processing technique called "frame interpolation" to essentially insert new frames between existing "real" frames. This requires a lot of calculations and processing power. It creates latency because your system has to wait for those calculations to finish before the frame that reflects your input (controller, keyboard, etc.) can be shown.
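A rough sketch of where that wait shows up (the numbers are assumptions, not NVIDIA's actual pipeline): the newest real frame has to be held back while the in-between frame built from it and the previous frame is generated and displayed first.

```python
# Rough sketch of where frame interpolation adds delay (numbers are assumed,
# this is not the actual DLSS frame-gen pipeline). The newest real frame has
# to be held back while the in-between frame is generated and displayed.
native_fps = 60
frame_time = 1000 / native_fps      # 16.7 ms between real frames
interp_cost = 3.0                   # assumed ms to build the interpolated frame

latency_without_fg = frame_time                              # show newest frame right away
latency_with_fg = frame_time + frame_time / 2 + interp_cost  # held back ~half a frame + cost

print(f"~{latency_without_fg:.1f} ms vs ~{latency_with_fg:.1f} ms to display the newest real frame")
```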
The 5090 is lowkey an engineering marvel considering the size it is, and now Nvidia is going to force AIBs to improve their thermal designs as well. No one wants fat cards, let's be honest.
Man, fair play to anyone excited about NVidia's tech, but any sensible desktop gamer (most PC gamers, in other words) is on a 1440p monitor with a high refresh rate. Meaning every high-end card from both AMD and NVidia can get you high frame-rates at max settings.
This card solves a problem that either doesn't exist, or one you've made for yourself. For gaming, the only reasonable use for 4K is if you game on a large TV screen, which actually makes use of those pixels. Once you're on a desktop, you don't need 4K, and once you don't need 4K, you don't need products like these.
Gotta always remember to mentally block out chat during these kinds of videos so i can save what sanity i have left
i put on 64kx frame gen I pressed left click 5 minutes ago, maybe ill kill this enemy when it registers
When it registers!
For context, the 5090 is like duct taping a 4060 Ti to a 4090! Or duct taping a 7800 XT to a 7900 XTX in raw raster performance!
It is also getting bottlenecked by a 9800X3D @ 4K in some titles.... A 9800X3D!! @4K!!
The thing is a monster!
That being said the 5090 is a Dev Tool!!
what kills me is I can see Optimum never using Multi Frame Gen in his Competitive games...which is why you don't see him showing that part off...
I also recommend checking out the video called "20% Efficiency Boost by RTX 5090 Power Target Tweaking" by der8auer EN. He set the 5090's power limit to 70% and it used less power than a stock 4090 while having more performance
The paused picture tearing is because of the monitor updating the pixels right? The fragments on the edges are bad though
the tear is the refresh and the artifacts near the moving stuff are the mfg
No, tearing is from the GPU. The monitor refreshes all of its pixels at once so tearing is not from the monitor
7:58 it looks like the framebuffer not clearing glitch in windows xp lol
"Right around 500W draw on average" Meanwhile average is 468
Screen tearing is from the recording and not the framegen.
Bro the 4090 can already power half my city, why the hell is there a 5090
1080p gamer gives his opinion on modern hardware
My best memory gaming is on a crt. Im sure he enjoys 1080.
TRUE.
Funny how people cant understand that getting these kinds of frames without AI would be possible maybe in 3-4 generations of cards. Without AI the cards would be getting bulkier, more expensive to produce and the power draw would be completely insane. We are living in the greatest times right now and the AI tech will be getting more insane every year. Smaller, faster and less expensive to run. AI is at its infancy.
what benefit is there when the input lag is still tied to your ACTUAL fps?
the truth hurts people, but the rate at which we can improve performance is slowing down as we get closer to the theoretical limit. Not saying that Nvidia is the good guy here, but people should be blaming the game devs who release unoptimized trash; til they actually do, AI will be the next jump in increasing FPS
AI Overview: To calculate the cost of running a 500-watt appliance 24 hours a day for a year, you would need to know your local electricity cost per kWh, but assuming an average US rate of around 13 cents per kWh, running a 500-watt appliance continuously for a year would cost approximately $1,800.
Pretty much if you dont buy the 5090 for 2 years you can afford to use the GPU for a year 24/7 or 2 years 12 hours a day. 1 year for the GPU cost and 1 Year of use
This does not include gaming for 2 years with your current GPU; you'd have to run integrated graphics or something, or it will eat into your watt savings
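For what it's worth, a quick sanity check of the quoted AI Overview figure, keeping its own assumptions (500 W drawn 24/7 at 13 ¢/kWh), lands closer to $570 a year than $1,800:

```python
# Sanity check of the quoted figure, keeping its own assumptions:
# 500 W drawn 24/7 for a year at an average US rate of $0.13/kWh.
watts = 500
hours_per_year = 24 * 365                       # 8,760 h
kwh_per_year = watts / 1000 * hours_per_year    # 4,380 kWh
annual_cost = kwh_per_year * 0.13
print(f"{kwh_per_year:.0f} kWh -> ~${annual_cost:.0f} per year")   # ~$569, not $1,800
```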
TSMC and Samsung Foundry being stuck on the same node really set this architecture back. If Nvidia can get some of the 3nm slots that Apple bought out last year, we might see a Ti or Super with 20% extra performance over the 4090 at reduced power draw. Sadly this is it for now.
It's not the tech that's the problem, frame gen is fine, you can use it if you want. The problem is the devs optimizing games with frame gen in mind. I hope we replace them with AI soon enough
For me, there just aren't any games that I feel justify upgrading my 3080.
Bro my rooms already heated with aah 3060
exaggerating. My 4080 doesn't even heat up my room.
I got the same. Bro in summer I'm sweating my ass off it gets so hot in my room
Time to get your xqc reacts channel verified you got 131000 subscribers
People acting like they know the difference between real frames and upscaling 🤣🤣🤣
Funny thing is upscaling looks better than real frames in a lot of games due to TAA.
1:38 dude i've been wanting this for so long
i have a question for those of you who surely know what they are talking about. Does locking your fps with frame gen to your monitor's refresh rate make the ghosting, input lag and tearing better? I mean, to me it would make sense that it would be horrible at the highest possible amount of generated frames. For example, if you don't generate 60 -> 200+ fps but instead only generate up to like 144 fps, wouldn't that make it much better?
Remember these wattage draws are for a STOCK factory GPU, imagine 3rd party GPUs like ASUS.
I feel like this new ai frames technology is debuting just like ray tracing did with the 20 series. It’s still in the works and prob the 60 series will have it perfected.
I want to clarify for everyone: the main issue with fake 240 FPS isn’t visual artifacts, but latency. When paid-off reviewers like this one are confronted about it, they dismissively say, 'But the latency doesn’t increase!' - as if they don’t understand that 240 FPS should come with a latency of around 4-5 milliseconds, not 30 ms. With such massive delay, it’ll feel like you’re playing at 30 FPS, even though the counter claims 240. And let’s not forget the 1% lows - they don’t improve, so you’ll still experience annoying microstutters that ruin the experience. Don’t fall for this nonsense.
Optimum Tech is an amazing reviewer so I promise you he is not some "paid reviewer". He built his entire channel around not taking sponsorships, buying everything with his own money, and always being honest. I understand that you're upset that he didn't mention the "true latency", but you need to understand that he is mostly a PC builder channel, not a Linus Tech Tips kinda dude. He stated multiple times that you should watch other reviewers so you can get the full picture.
The increased power draw is due to the 512 bit memory bus
One thing he doesn't mention in the video is that while 4x frame gen does increase the frames per second, it does NOT increase the responsiveness you get with that many frames. If you can only run the game at 20 fps natively and then you use 4x frame gen to go past 100 fps, while it will look smoother, it will NOT FEEL smoother. If you get 20 fps natively, it can be fine to frame gen up to maybe 40 or even 60 fps and it might work well for you, not a big disparity between visuals and responsiveness. Or if you have 100 fps, you could frame gen up to maybe 200 fps and things will be decent.
frame gen is not a way to win. it is a way to win more if you are already winning. so to speak.
He literally shows that, did you watch the video?
he literally talked about it. and showed the latency numbers.....
it will obviously feel smoother
other nvidia reflex improvements help a lot in this regard - they have already thought about it
We watch Daniel Owen too 😂
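A toy way to see the responsiveness point from the comment above: the game only samples input on real frames, so the input interval tracks the base framerate, not the displayed one. The framerate/multiplier pairs below are just example numbers.

```python
# Toy illustration of the point above: input is only sampled on real frames,
# so responsiveness follows the base framerate, not the displayed one.
# Framerate/multiplier pairs are just examples.
def input_interval_ms(base_fps: float) -> float:
    """Time between consecutive real frames, i.e. how often your input can take effect."""
    return 1000.0 / base_fps

for base_fps, multiplier in [(20, 4), (50, 2), (100, 2)]:
    shown = base_fps * multiplier
    print(f"{base_fps:>3} fps native x{multiplier} FG -> shows ~{shown} fps, "
          f"inputs still land every {input_interval_ms(base_fps):.1f} ms")
```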
tbh, most modern games already look freaking good on the highest settings without RT. I don't see why you would want to turn on RT for better lighting, reflections etc while dipping your performance. Used to be team green but I'm getting sick of their ridiculous pricing for something even their current hardware couldn't handle, to the point they have to sell you fake frames.
no one is a fan of fake frames. but no one is like 5% of the market so they are not gonna worry about us. Every casual gamer will enjoy this and probably not even feel the latency or anything. This new tech is for casuals.
I'll buy my next gpu when this technology is perfected, I'll stick with my 3080ti and wait for the 8090
nahh thats crazy 500 watts imagine the electricity bill 💀 💀
I watched a clip of xQc saying 1080p is still peak. Guy's a moron.
30ms is pretty bad. I'd say this is a game changer for flight sims and other immersion games that don't require fast response times, but for anything that has guns in it or even action games this is unplayable.
What are you talking about. You can't even comprehend 30ms latency. Games start to get noticeable delay above 60-70ms
Yeah. But it's pretty impressive that there is zero increase between the old 2x and the new 4x despite getting twice the amount of frames
@@AmigoAmigo-w5p say u have 60 ping and then u add 30 ms of input latency on top of that, now you effectively have 90 ping, thats quite the substantial latency for anything competitive
no one using frame gen in comp games its for story games
@@MrRjizz That is not how latency works lol. You can't just add them together.
The 90 cards are so absurdly worthless. You buy this card and then the bill goes up +$100 a month
a person buying a 90 card does not care about that tho
Not really, probably just not aimed at you. For high end consumers the difference they make for 4k high Hz gaming and VR is pretty noticeable. And for professionals the time you save rendering, editing, scientific computing, etc is worth it.
And the consumption difference isn't that bad, even here where it's 21.25 cents per kWh, if used 10 hrs a day it's about $170 more per year than my 355W TDP card. And if power consumption is such a concern, you can limit the power it uses and it'll still perform better than anything else available
The card isn't aimed at 14yo Fortnite players like you. People buying a 4090 don't care about the cost and need the VRAM or performance for productivity or high-res gaming.
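The yearly-cost estimate a couple of comments up roughly checks out if you take the 5090's 575 W rated board power against a 355 W card, 10 hours a day at 21.25 ¢/kWh, and assume both cards sit at full power the whole time:

```python
# Checking the running-cost estimate above: a 575 W card vs a 355 W card,
# 10 hours a day at $0.2125/kWh, assuming both sit at full power the whole time.
extra_watts = 575 - 355                             # 220 W
extra_kwh = extra_watts / 1000 * 10 * 365           # ~803 kWh per year
extra_cost = extra_kwh * 0.2125
print(f"~${extra_cost:.0f} extra per year")         # roughly $170
```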
1 in 3 Canadians (canada born) are on food banks. Only the foreigners from the big city's will get these. By design btw. If a white kid in Canada wants this card he better start pushing snake charmers out of the timmies line for a job
during winter you can use it as a stove
Funniest part about X is he always pretends to know what hes talking about when it comes to tech. Also, hes 1080p andy
optimum pog
Paid review... 34 fps with maxed-out settings without DLSS, but of course, they won't show you this because it would reveal the true performance of this "upgrade".
so i just realized all the games i play are stylized and require a potato. why do i keep upgrading? for what games? All the newest games run perfectly on 2 year old hardware. No one is making games for these beasts, and no, Star Citizen is a benchmark not a game, silly.
When we gonna get XQC uploads in 2160p? Or at least 1440? Watching a reaction to this in 1080 kinda ruins it
When twitch supports 1440p
@@nexuhs. it does
xQc still plays in 1080p
Running Cyberpunk at maxed out graphics with path tracing at 4k resolution, running 100 fps, is impressive, no dlss
A fridge is 300-800 watts on average 😂this shi bein in the middle kinda crazy
I dont like ai i dont wanna play on fake stuff
Imagine: the people arguing about latency that frame gen is "good and competitive players are only 0.1% of the population" are the same people who bought a 2K/4K 144/240/360hz monitor to play an ugly $70 single player game with a pad.....
can it run stardew valley smoothly ?
Am I tripping or is 35ms really high?? Isn't the normal 6-8ms?
35 isn't high in itself. 6-8 isn't normal. But the problem with frame gen is that your inputs are not running at, let's say, 100 FPS but at the non-generated 50 frames.
99% of people can't differentiate 35ms and 55ms, so no, it's not noticeable unless you're playing competitive shooters
@@Muspellheimrr 35ms is fine but something like 30ms is very low.
35ms is system latency, which isnt bad, but it definitely could be lower or down to 30ms tbf
frame gen is cool, im tired of pretending it's not
1 view in 24 seconds. bro fell off
What happened to technology becoming more compact and efficient? Soon your monthly electricity bill will cost you the same amount you paid for that card 😅
4:55 That looks like shit to me
Compression
8:10 by the time its better, next gen will be out. just like dummies who bought a 2060 for rt. yeah good one lmao
energy demand 📈
Frame gen is so shit for competitive games especially when ping/ms matters
It's so cut down from the full GB202 die. They are just cheaping out on us. Could've been 50% over last gen but instead it's 30.
it was also supposed to have 2 power connectors but they saw the single one getting to 85C under IDEAL conditions and were like: nah it's fine guys, save the $5 per card
pretty sure everyone uses DirectX / Vulkan lol
bro I think 4090 is already the greatest.. 5090 wont do shit with ai
poorfrogs gripped by complete psychosis
Boys I got a 3070ti am I upgrading to 5070ti?!?!?
In the long run the 5080 will be better because of more VRAM and performance... depends on how often you change GPUs
@@lir4e ah yes 5080 has more vram than 5070ti
Let's be honest, why do people even watch xQc react to other content? Because 90% of the time he just watches the video and says literally nothing, and the other 10% is just him asking chat a question
straighten up lil nga
I like react content (unironically) 🙌
Its the chat
i like hate watching him
u clicked on the video and ur asking us why?
600watts hell no my energy bill cant handle that hahahaha
32gb vram for mining it isn't expensive 😂
thanks. now the entry pc for any AAA games will be 10k dollars and no one realizes that before you could play AAA games with a 400 dollars build (BO2, GTA IV etc)
A 400 dollar build? 😂 are you playing at 30fps 😂
f*** 240fps when its blocky and blurry mess with input lag of 30 fps
why is he so mald about this
If you have a PCIe Gen 4.0 motherboard you'll limit the 5090 right out of the gate, you need a 5.0 board to get the full slot speed. Just putting that out there since nobody is really talking about it. 👌🏽 After launch we may see a big difference in performance when you have the right pieces with the beast
The performance difference is negligible, there's plenty of Info out there on gen5, inform yourself.
I'll always choose AMD. I'll never have the budget to invest on the top 0.1% absolute best GPU which is what Nvidia provides. When it comes to affordable GPUs, you get much more value per $ with AMD
4:55 Am I the only one who thinks that looks like utter garbage? NFS MW looked better on my PS2 lmao
Yeah you need to get some sleep
I use Lossless Scaling frame generation for Escape from Tarkov, and I must say, even though there are some artifacts, the overall gameplay feels a lot better, and due to inertia you don't really feel the latency difference
waiting for the 5080
it's half the size of the 5090 to meet restrictions to China. Probably only gonna be 10% better than a 4080. Just buy a used 4080 now for cheap
@@xii-nyth4101 15-20% faster than the 4080 Super, there were some leaked benchmarks already, but it would depend on the gaming title i guess.
Watching this has made me realise in the like 17 years I’ve been playing games I don’t think I’ve ever played a game above 60fps(120fps I lied)… that’s tragic.
i have tasted 144hz for a while and now im back at 60 bcs my 144hz monitor broke, its like night and day bro
@ I have a 180hz monitor but I'm on an Xbox Series S, it's set to run at 120hz and does on a few games so I just completely lied lmao, most games I play definitely don't hit 120 though, I'm lucky to get 50fps consistently in Arma, but yeah, definitely night and day man, I was on a 60hz 49 inch TV before switching to my monitor lmao, massive difference
MFG is useless tech unless you have a 5090 and a 480hz 4k monitor. You need at least a 60-80 base framerate anyway and you won't get that with ray tracing on anything less than a 5090 lmao. Still, if your base frame rate is at like 90 already, why add input lag and artifacts on top of it?
80 x 4 is not 480 lol. 2x FG at 60fps only gets you to 110fps on average because it doesn't scale exactly. Most gamers have a 144hz screen minimum, so 3x MFG will get you to your monitors refresh rate. Why add input lag and artifacts? Because most people don't notice the difference between 20ms and 40ms. The only difference they notice is that their game is smoother and more enjoyable to play.
@demetter7936 3x MFG will go over 144hz and out of VRR range so it's useless too, unless you cap it at 142 fps, but even then the image quality is terrible. 4x is even more useless; as I said you need like a 480hz monitor, 240hz minimum, and only the 5090 can push that. Only the traditional 2x mode is actually usable imo
@@delayeedbms look at how much better dlss 4 is compared to other versions, theres no reason to believe that frame gen wont be as impressive in the future.
still not worth it
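A rough sketch of the scaling numbers from this thread. The overhead figure is an assumption picked so that 2x at a 60 fps base lands near the ~110 fps mentioned above; real overhead varies by game and GPU.

```python
# Rough sketch of the scaling numbers from this thread. The overhead figure is
# an assumption picked so 2x at a 60 fps base lands near the ~110 fps mentioned
# above; real overhead varies by game and GPU.
base_fps = 60
fg_overhead = 0.08                                  # assumed ~8% hit to base framerate

for multiplier in (2, 3, 4):
    effective_base = base_fps * (1 - fg_overhead)   # ~55 fps while FG runs
    shown = effective_base * multiplier
    print(f"x{multiplier}: ~{shown:.0f} fps shown, inputs tied to ~{effective_base:.0f} fps")
```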
Your mom won't buy you a laptop with a 4060, stop acting like you could get this anyway 🤣
Imagine playing at 1080p with a 4090, lowering your settings to medium to achieve high FPS which your monitor can't even handle, but then raging over how bad the 5090 is in quality, MFG and so on... that's xQc right there lol.
2:16 Yeah you are... They will be forced upon you. RT and frame gen will be a standard for future gaming; in fact, old GPUs that are still capable, like the 1080 Ti, will not be supported in most newer titles because of this.
not really a bad thing. As with any tech, stuff advances and older stuff gets left behind. I do wish RT and stuff was more optimised but we will get there. AMD is slacking though.
I mean yeah.... Just like graphics cards from the 2000s aren't relevant now. What do you expect?
People complaining about "fake frames" not realizing every frame is fake its literally a video game none of it is real
are you braindead or ragebaiting?
Multi-millionaire complaining about 2-3 spins. Just buy it lil pup
respect for all who hate dlss !!!
that guy knows jack shit about PCs and sim racing, always gives false information, hate him. He's just cashed up
kids games at 777777fps
optimum makes great videos
doesn't your psu just pull 1.5kW flat if it says 1.5kW, or does it fluctuate depending on the usage?
if it is flat then there is no difference between 5090 \4090
kick chat 💤🥱
heard of undervolting?
the guy also made a video on undervolting the card, reducing power by 100w with little to no fps loss, making it draw basically close to a 4090 with better performance
@@lir4e these people dont know what theyre talking about, theyre just making up copes because they wouldnt be able to afford the card in the first place. its a very good card, when it comes out everyone will want it despite the whining about "fake frames" lmao
they are gonna release their stock card at the optimal power to frequency curve
it's the same shit as FG on the 40xx, just with more frames
it will still have the same problems and artifacts you've had for 2 years now, just more of it, the same latency, the same ghosting etc.....
even with MFG, if you don't have the native FPS it will look like shit, sure you can set it to 2x to have less ghosting and artifacts, but at that point you could also stay at normal FG.....
nvidia's 50xx gpus are just a scam for gamers, barely more native performance and built only around AI, and we all know why
they don't give a shit about gaming and want to sell those gpu dies in AI cards
because they are just AI cards with new ram and maybe 2-4 more rt cores ....
I mean just look at it, they ended production of the 4090/4080 2 months ago, that has never happened before, and they still talk about a shortage of gpus for launch....
now they want to end production of the 4060/4070 gpus before the new ones come out and I bet we will still see a shortage of gpus when they come out
and no, it's not because they rushed these gpus, this time they even had 3 more months to get them out and still fucked up
and I don't think it's getting better with MFG. FG never did, because unlike DLSS they need to make images out of nothing (they pull it out of their ass)
ok, not out of nothing but from the frames before, so you will always have latency because the gpu does not know what you will do in the future
it guesses what will come next, sure, based on what you did in the frames before, but it's still a guess, no more no less
I don't mind them as long as people call them GFPS (generated fps). You're not getting actual fps lil pup
3060 is KING
good reaction
I prefer 40 clean sharp frames over 250 blurry frames per second that also have visual bugs
Have you used frame gen? It's good, it turns 50 fps to 90 fps in Wukong and is much more enjoyable.
is this a real image of him? not a meme?