4090 is the best gpu upgrade over a generation in probably the last 20 years and nothing comes close Keep coping. 4090 is a beast and well worth the money to whoever can afford it
I think people forget just how mind-bogglingly, stupidly fast new GPU's like the 4090 are. Half of these games should be easily pushing 200+ FPS WITHOUT frame gen or DLSS. Unfortunately, modern games completely brush off game optimization like it's nothing because they think these new technologies and cards can handle it... they can, but these should ONLY be supplemental, not a reliance.
i have a 4080, if any game right now refuses to run on it at 60fps in 1440p without ai trickery or lowering details i'm not playing it, simple as and just to be clear, it's not good performance for the price either, it's just the lowest i can set the bar at... atomic heart, for all it's flaws, was looking good & running beautifully, 180fps 1440p maxed out, before frame gen, many devs could take notes
@@project_anti Me too man! I love my 240Hz. Anything less feels nasty. I don't aim for high FPS with games like, Grounded, Satisfactory, Path of Exiles, Ark, 7 Days To Die and many more because the gpu ramps up heat and energy usage which I don't really need. Competative shooters though! Ramp that thing all the way up.
As a guy who grew up poor and with no family, I had a crappy 2gb ram dx9 igpu laptop, barely playing GTA SA at 20ish fps. Now that I'm more fonancially stable, I bought an RTX 3070 laptop and my friends laugh at how I still play some games sub 1080p. Some things never change and I'm a glutton for crappy visuals lmao.
Don't worry man you do you and don't worry about what people say. I also have an RTX 3070 and it still is a good GPU. I bought mines 3 years ago and I'm still using it till this day. I got it for my first pc build and it was really popular around the time I got it so it felt good to get it. Now that time has passed and new GPU's have released I still look at my 3070 and admire it for what it has done. Enjoy your gaming man
2016: $380 GTX 1070 for 144fps 1080p native ultra. 2023: $1600 RTX 4090 for 144fps 720p upscaled to 1080p with interpolated frames. Being happy about 70fps dips at upscaled 720p on a $1600 gpu is just crazy to me. Edit: Since people can't seem to understand my comment, I'll explain it. In 2016, you only needed to pay $380 for a GPU guaranteed for 120+ frame rate at 1080p ultra for triple A title games released within that generation. In 2023, you need to pay $1000-$1800 to reach that performance target for triple A titles of this generation, and even still, a game like Cyberpunk that was released 3 years BEFORE the 4090, cannot achieve 144fps in 1080pMax on a 4090 without the use of AI. So, who do we BLAME? The game developers? NVIDIA and AMD? The blame goes on the consumer! We keep buying up these standards. Nvidia and AMD release the 40 series and 7000 series, where the only performances gains are in the high end market, because we'll buy them anyway. Developers are lazy and don't work on optimizating games, because they know we'll buy them anyway. It's not about DLSS, or FSR, frame generation AI crap, it's that the pc gaming industry saw how truly desperate the community is for the products during COVID and the semi-conductor shortage, and then matched their marketing to satisfy that enormous demand. They've been riding out the same wave for 4 years now. Hell, AMD literally said that they won't be making high end cards next generation. They don't need to! They're so satisfied with the amount of demand and desperation we show them, to where they only need to work on worse product to increase profit overhead.
He is just another paid shill by nvidia lol. Talking about how pretty a game looks while playing at upscaled 720p is just hilarious. The moment these people have to pay out of their own pocket they will drop the facade
LOL I only have 2,800 subs dude! I'm already getting called a shill, this is great! That means I'm on my way!! I bought this card for mah self like a big boy!! :D
Man, that is just disrespectful. I got the 7900 XTX 5 months ago and I'm still adding high-end equipment to my setup. When my 50" 120Hz super ultrawide monitor arrives next week, my life will be complete for 5 minutes.
Thanks for sharing! I plan on grabbing one of the newest gen CPUs that they are coming out with soon, and also upgrading my motherboard and ram too, so I can really see this GPU fly! LOL
@@casmithh Did AMD never actually fix that? I know for content creation you either want the normal 7950X, 7800X3D or 13900K if you want peak gaming performance (it has more overclocking headroom as the Ryzen processors massively ramp in heat with clock speed/vcore increases). I was hard Ryzen only for the past few years since Zen 2, decided to switch back to Intel for 13th gen and will likely keep upgrading through Intel even though it uses different sockets every gen fuck them.
@@Nitedontdie i'm decently sure there still have been some reported issues trying to get good performance out of the 7950x3d. 7800x3d works like a charm but specifically the 7950x3d (and maybe 7900x3d) have problems related to the ccd.
For a 4090, 1440p is totally doable at high frame rates. I've got a 4090 and a 7900X3D and the only game that I need to use DLSS is cyberpunk with PT. Without DLSS it can run natively, but there's some dips. My monitor goes to 265hz at 1440p so in more competitive games, dropping the graphics gets you the framerate with no issue, while in more story focused games I aim for 100 ish.
Oh nice! That's awesome! I'm sure that looks awesome! I am planning on getting either a 14th gen Intel, or an 8000 series AMD cpu when they drop, I had no idea my CPU was so outdated after only 2 years... LOL
I was looking for such a video the entire weekend. To see, I'm not the only person, who is in general aiming high fps 1080p with my newly arrived 4090 (I'll do use it mainly for work), but I have to aggree, never had such a great experience and smooth gameplay with these titles! Thanks for the video!
You are not alone! I've been buying high-end GPUs for high refresh rate gaming and content creation as well since 2017 with the 1080 Ti! Glad I could help in any way! And thank you for watching! :)
I also use my RTX 3090 for 1080p gaming paird with Ryzen 7 5800 X3D monitor Alienware 27 FHD IPS 240hz like yu said just love high FPS max graphic on my PC 😀
@@RealSlxzz yeah i recently switched to 1080p 144hz monitors from my old 4k monitor which i bought when i was a console gamer 5 years back, i barely noticed the resolution downgrade i guess my eyes prefer high fps over graphics which is great
Those micro stutters could be due to the frame timing aka the time inbetween each frame thats produced. Try rivatuner to alternately cap your fps as Rivatuner's fps cap (has a setting to fix screen tearing as well) tightens the time inbetween frames making it suuuuper smooth
Yea he kept blaming it all on CPU limitations, but it never even went above 90% utilization on his monitor software. Wasn’t sure if I was missing something or not.
Jackrabbit, CPU bottleneck doesn't need to be above 90% usage. That's the old meaning of it when all we had were low core count cpus. Now, with high core count CPUs, when a game only uses 4 cores, that's only 25% of a 16 core cpu. So the total usage won't show 90% or higher. What shows a CPU limitation nowadays is GPU usage, the GPU slows down to wait for the CPU. Checkout my Avatar video to really see a bottleneck from the cpu, even though CPU usage says it's low, and the frames are still good
@@MrJays Hm, interesting. I definetly see where you're coming from. From all my studies and research I've only ever found CPU bottlenecks being considered when said processor can't send enough data to the GPU to render. So what you're saying is the primary game core isn't enough HZ and or large enough, then the CPU is limiting the computing power in certain facet(s)? I suppose I could see it, especially when loading screens are baked into normal gameplay now. I personally rock an I7-13700f, RTX 4090, DDR5 6k MHZ and a lower wattage motherboard solely to keep things cool for air cooling+longevity of PC while multi-tasking, and haven't really seen a bottleneck using all cores. I believe my CPU benchmarks lower than the one in this video as well. Thanks for the input, the multi-use of the definition of bottlenecking would make this confusing.
With ray tracing on and everything maxed out, 4090 is barely a 1080p monster, having hard time stay above 100 fps without frame generation…. Ray tracing era is different, there’s no monster gpu, only monster games.
I get similar framerates at 4K in Plague Tale on the same settings. CPU bottleneck literally means that your GPU is sitting there idle so your pretty much capping your frames regardless. Anything less then 1440P and you're literally wasting GPU power. You would probably get comparable framerates at 1080P even with lower spec cards since at that point they'd be running closer to 100% utilization across both metrics.
So, are you wasting the monitor if you can't get max framerate that your monitor supports in every game? It doesn't matter if you are not able to utilize certain parts to its full potential, some people just want to have the most power efficient build or have bragging rights or whatever else is on their mind, maybe they just don't like money or love capitalism, etc. You're not going to be driving 100mph in your car everywhere you can just because every new car these days can drive that fast.
@carssucksince1800s Those analogies don't really work. That's more like saying, "I bought a Lamborghini, but I put a limiter on it so it can't drive faster then 20km/h." Like okay, you can waste the money to do that... but a Honda Civic could also do that at a fraction of the price.
@@TheInfiniteDraw It would work if Lamborghini could achieve a much better fuel efficiency than a Civic or a Prius. Playing anything above 1080p is a waste of money, but everyone still does it.
@carssucksince1800s You're adding on nonsensical hypotheticals to try and make flawed logic work. The massive engine of a Lamborghini isn't ever going to be as fuel efficient as a tiny Civic engine, because that isn't what it was designed to do. Even if you put a limiter on it. In the same way that a 4090 isn't going to be as efficient as say a 4060. As for anything above 4K being a waste, well that's just objectively wrong. The clarity and bump in visual fidelity 4K provides is massive over 1080p. You might not want to put together a rig that can play 4K because it isn't within your budget, or it isn't something you're interested in, but calling it a waste is just incorrect.
@@TheInfiniteDraw You obviously haven't done the research. 4090 will absolutely be more power efficient than a 4060, not just as efficient when performing the same task at the same intensity. If you would stop buying crappy TVs and monitors you'd know that 1080p is sufficient. Going above it is a waste of money due to increased demand of higher resolution. You can essentially reach similar clarity by rendering games at 4K. 1080p will look worse on a 4K display due to the way digital displays work, however comparing a high quality 1080p monitor to a cheap monitor that most people get for more pixels is going to show that 1080p is providing essentially the same image quality. Obviously more pixels are going to provide more clarity, but at what cost? Saying that someone doesn't use a 4K because they can't afford it or whatever is the dumbest thing anyone can say. A 4K resolution is simply not worth it when the content does not change, but demand for greater performance from your hardware increases dramatically. If 4K is not a waste in your opinion, then getting a far more powerful card for a 1080p gaming is definitely not a waste due to far greater benefits at lower resolution than at 4K. Do some research and you will maybe realize what a bunch of nonsense you've spit out just for some eye candy. Go look at your instagram models and wait until one of them falls on your lap automagically.
After around a year of using a 4k monitor, it's extremely hard to go back to something like 1080p, to the point where it doesn't matter what kind of graphics you're running, since it will just look pixelated no matter what (though youtube makes it look much worse than it would look in person as well). That being said, I was somehow expecting higher frame rates, and when you said let's see how it runs without frame generation, it was shocking, since I was thinking that was the native fps lmao.
am i the only one unimpressed? DLSS + Framegen on a 2K GPU and Plague tale runs at an average of around 140 with dips as low as 100fps at like 100% utilization. i'm more impressed that any other card can run it adequately if anything
same the "ooooh aaaaah wooow" telling us about the groundbreaking 260 fps in nfs unbound. bro doesnt know that the 130 fps frame gen nets him dont actually do anything. yet he still used 1080p WITH DLSS thats pathetic imo
To be honest,you did a right thing. I personally bought 4060ti,but decided to stay at 1080p,so i won't have to upgrade my hardware over and over again because of res being too high. Gives me 100-144+ fps in Cyberpunk on ultras + rt ultra on with FG and DLSS,and it's with the fact that i haven't upgraded my cpu yet and it's heavily bottlenecked.
nfs has a really optimized proprietary engine that handles mulithreading like no other engine. Note how all of the workload of the game was eavenly spread between all of the 32 threads you have resuling in maxing out the gpu all the time.
@@AntiGrieferGamesyou're on something.. works great on 4090fe and 13900ks. The only thing that's bad is that TAA doesn't work on windowed borderless so using my multi monitors kinda sucks. Needs to be fullscreen exclusive. Game is fantastically optimized
Massive thanks! I tested this on Robocop Rogue City because the combination of DLSS and Lumen caused glitchy shimmering on the streets. Suddenly, I had a clear picture and a wonderfully high frame rate. I tested this with many of my other games as well. When I disabled DLSS or FSR, set the resolution to 1080p, I was shocked by how clear the picture was, especially in motion. All this upscaling produces a decent image when standing still, but as soon as you move, it blurs. I didn't understand why modern games looked so blurry, despite using 4k DLSS quality and acceptable frame rates. Your approach fixed it for me.
Each to their own. But I don't know how you can use a 1080p screen in 2023. You can't even see half of the extra detail games are adding in these days. And aliasing is so bad. I suppose it's one of those things where it's what you get used to. You bought a high refresh 1080p screen and so can't go back to a lower refresh one. I adopted 4k early and so can't go back to 1080p without feeling like I forgot to put my glasses on lol. With a card like this tho 1080p makes no sense. You could get the same frame rate at 1440p And these days you can get the best of both worlds high refresh and 4k or 1440p. Got myself a 4k 160Hz monitor for £360, I don't see a reason to use a 1080p screen with a card that expensive when 4k has gotten that cheap.
My question is.. what frame rate is enough? Surely you don't care about 1080p 300fps more than 1440p 240fps for example. You gotta agree there is a point of diminishing returns, especially in single player games. You don't need 200fps on plague tale requiem for example
So, I can only speak for myself, but I would absolutely love to try a 1080p 360hz monitor! I've only ever seen up to 144hz, personally with my own eyes. And I can say that my eyes can see the differences between 90 and 120, so I'd want to see what the highest values feel like, or look like! But for me I do agree that 1440p 240hz is more appealing in the aspect of future-proofing than keeping 1080p and going like 360hz
To be frank (no pun intended) ya'll probably know more about this shit than me lmao. On my laptop bringing the resolution down to 1080p and playing at 120fps~ vs 1440 (or using the AMD super resolution to go higher) at 50-80fps just feels and looks better for my eyes. Then again I go mainly play FPS's like Destiny so wth do I know?
The problem is a 3090 or 4080 would be getting the same fps because of the massive cpu bottleneck. This kinda sucks because people see this and think that 70% utilization is a good thing lol
Thank you for making this video! It’s something ive always had on mine but never had the resources to buy the very best, it would be amazing for you to try even lower resolutions!
Bro, this is absolutely me. I bought an RTX 3090 in 2021 to play VR and flatscreen games in 1080p, or triple screen 1080p... always at 60hz. It's a fever dream and I love it.
I'm at 1440p 155hz, and it's amazing. I had a feeling right from the start that even the juggernaut 4090 was gonna be a 1440p card if you like graphics AND framerates. Sometimes I have too much GPU and my i9 9900k is CPU limited, but not all the time-and with the advent of UE5, the card will still get solid FPS on Ultra at native, even when maxed out. The next upgrade will be a cutting-edge CPU and RAM. Eventually I'd like to experience 240hz in games that allow, and "next-gen" stuff will still run smooth, but probably closer to 120.
@@speed999-uj5kr Proof of what? That high fps is amazing, or that even the 4090 can hit its limit at max graphics and high fps? The "proof" is seeing it for yourself, which you have to do in person since you need the tech-and streaming services all seem to cap at 60 anyway.
@@speed999-uj5kr Whatever dude. Even if I had lied about that it wouldn't help, 'cause the original comment doesn't make me cool or popular in the RUclips comment section, lol. Seems to me like your jealousy is showing
You’re going to be bottlenecked no matter what, cpu or gpu bound it really doesn’t make a difference lol. Why people get worked up on it, just get the best cpu you can afford and best gpu you can afford and play the game who gives a shit about whats bottlenecked. Also ram wont make a difference in gaming either unless you have like 16gb then maybe.
Ryzen Master has a game setting that drops half the CPU for higher clock speeds on the other half. It may work better to turn off cores based off maximizing CDX availability.
I used a 3090 for 1080p for a while. The trick to getting the most out of it was to run my desktop at 4k and downscale down to 1080p. As close to having a 4k monitor without having one until I got a 4k monitor. Now I got an ultra wide 1440p oled monitor though, and I freaking love this thing.
1440p is the best i think the quality will increase a lot and the fos won t be mediocre btw i love that i found someone who have this idea i feel 4k isn t that important
@@MrJays The great part about 1440p is that it actually looks significantly better than 1080p but is much less demanding than 4K. I've never cared about 4k too much because I know what it means for performance, but I'd never wanna go back to 1080p. Upgrading my 2080ti to a 4090 while staying at 1440p 155hz has been a wonderful experience.
@@eniff2925 it is veeery big but i agree that 4k is even a bigger jump, but the performance cost is what makes 1440 p a bigger deal for me because it is much much cheaper to get high framrates
@@abdogaming4613 i play starfield at 720p upscaled to 1080 with xess on my 1660 super and honestly its fine. I have to use a pretty strong sharpening filter though, i use marty's immerse depth aware sharpening filter. The final result is suprisingly good. I would have never thought about going under 1080p but it is honestly fine. This way i can put the game on medium and it looks much better than if it was on low.
Yeah, I'd love to try 4K gaming at some point, but I don't know if I'd make it my default resolution due to having much lower frame rates. so 1440p is my sweet spot :D
@@MrJaysyea but u can play on 4k monitor with dlss quality or balanced preset with similar fps like on 1440p native so if I have 4090 I would get that new oled 240hz 32" that's coming i think early next year idk but u can get oled 27" 1440p 240hz already that's great for 4090 too
Someone who actually talks sense 😅 4k it pretty much pointless below a certain screen anyway, I'll sure it's around 60-65" as well.. 1080p is still the sweet spot 1440p is super nice 1440p ultra wide is where its at 1440p mega wide though 👀
@BornJordan im not saying it's not better... It's more of a case that it's not worth it, it comes down to the dpi and pixel count needed for such a small screen. 4k isn't really needed for anything less than 65" with a viewing distance of 2-3 meters 😶🌫️, think back to those old CRT screens 480p, you know when you could see the pixel even while sitting across the room on the sofa or even at 1080p you make them out on the 21" plus screens.... but how often have you noticed the pixels on say a 32" 1440p screen ? Probably never so is 4k worth it? for sub 60 FPS in many games? unless you own a 13900ks and an 3rd party 4090... see PC gamers tend to aim for 120 FPS or more... not 30 fps.... Personally, I'd opt for ultra wide 1440p over 4k, 3440x1440 is just shy of 5 million pixels vs 8.3 million of a 4k screen... so the FPS takes a massive hit trying to refresh all those pixels, unless you ops for DLSS or FSR, but then you're degrading the quality, making the 4k screen pointless for competitive gaming 🤷♂️
i got a 4070 super for 1080p, i use 1080p instead of 1440p because it will last way longer, future games will play good at 1080p but not 1440p, im not rich enough for a 4090 but 4070 super is still considered "overkill" for 1080p lol. cool vid
Lmao i feel you the only reason i made the hd upgrade (remember those AV cables anyone?) was because in Mortal Kombat on the PS3, i couldn't read the small text doing those tower challenges. So i bought the HDMI cable but dayuuuum the difference it made. 4k is cool if you're an editor and need to zoom stuff. Gaming? I guess if you're doing 4 player split screen so you and the boys each get a full HD screen.
@@DfOR86 I will gladly. I don’t expect anything insane coming from the gaming industry any time soon. At worst they’ll continue to poorly optimize and create paywalls for most new games. Problem is everyone will drop out of gaming if they had to start coughing up 5-800 every year to play the newest games. The 4090 will most certainly be a great card which also means it might not hit 500 in 5 years
I would not mind an RTX 4090 at 1080P. I did the same thing with an RTX 3080. Higher frame rates rule, and the best part about running a high-end GPU at 1080P is that you don't have to use DLSS even on the most demanding games.
I understand why people say 8k and 4k gaming performance is a waste to focus on in 90% of cases but... you also have to remember VR. VR requires super super high resolutions at 90FPS or higher. the better the resolutions at high FPS the bettter VR can become. I play VR and I just bought a 3060 but now I need a new PSU so I can actually use the card, that should give me 1440p easily for my headset at 90FPS (in most titles and depending on graphics settings) which is super exciting.
@@MrJays playing VR feels like the opposite of a FPS over graphics gamer; 90hz is enough to get fully accurate tracking yet you easily need 4k for easily legible text; also nothing in VR necessarily goes obsolete; the index is still a good VR headset, only downside being the low resolution.
@@scarecrow5848 That's definitely a concern of mine, is the FPS being too low in VR. But I think the 4090 should be able to handle it on most games, right? I haven't looked into VR much with this card lol
Those micro stutters are because you didn't do any OS optimizing, or latency reduction aimed setti gs changes in your os. It takes a few hours to do even if you know what ur doin to accomplish but your fps would raise 10-30% still. My 3070 with i7 7700k does nearly as good at 1080p but no stutters lol. Use latencymon to view your system latency and don't stop learning about reducing system latency till you see 5-50ms overall system latency. And don't forget to use regedit to get the default windows data queue on your mouse and keyboard to reduce the data queue from default windows settings of 100 down to 5-16ms or else u have minimum up to 100ms input lag
I found the jump from 1440p monitor to 4K 144Hz actually not that drastic. Most games I play stay over 144fps at 4K, most settings on high. Forza Horizon for example I got 190fps at 1440p but 160 at 4K and looks SO much nicer and sharper. Got the Gaming X Trio 4090.
U mean the performance decrease was not that drastic, or the res? For me, the switch from 1440p to 2160p was just as impressive as 1080p to 1440p. I remember in 2013 when I had sli gtx 780 and I picked up the first 60hz 4k monitor that Dell made. The only other 4k 60hz monitor was an ASUS. They were the exact same monitor. It was so sick, but the 3GB of vram wasn't enough for newer games.
@@keybladerasta4142 Hi! So if you have a 4090, you can basically play at any resolution you want, with any and all settings at the maximum without any issues! What we were discussing is how lower resolutions handle the immense power of the 4090 :D But if you have any questions, feel free to ask!
Here's the thing with literally every high end gpu that comes out. I don't think its that unspeakably dumb to use 4090 at 1080p because some day.. say 6 or 7 years from now were going to probably look back and say "Yeah the 4090 is a great used option for 1080p gaming"
LOL okay guy. You sound very chronically online right now. The video was for fun. And, there's nothing wrong with the generated frames *as someone who has played a wide variety of games with it*. And DLSS is amazing no matter the resolution you use it at. The DLAA portion of it can help fix bad AA issues in games, and then DLSS can help increase your frames further - but generally I only turn on DLSS when the games I'm playing limit Frame Gen to DLSS.
Thanks for posting this. I have 12700k, a 1080p monitor and my 4090 is on its way to me now. I'm gonna do it just like you, ultra settings with ultra FPS. LOL Obs.: yeah, eventually I'll get a 1440p monitor. And F' 4k monitors!! haha
I’ve tried 60 hz and 240 hz and really don’t care enough to go beyond 60fps. But holy shit 4k is incredible compared to 1080p. 1080p just looks super blurry to me now . OLED is absolutely insane as well. I’d rather take 4k OLED 60hz over 1080p 500hz any day of the week, but to each his own
That's cool and all but how are you testing 1080p on games? If you're using a 4k monitor and setting the resolution to 1080p it will look a lot worse than a native 1080p monitor.
I'm actually envious of the people like that who can't see, or don't care for higher than 60hz! If I were like that, I'd have a 4K HDR monitor by now LOL
@@MrJays I know a few people who can't see the difference and I feel the same way as you. I love high hz and recently I bought a 360hz IPS panel with HDR and my goodness is it beautiful compared to a TN panel. I don't own any 1440p or 4k monitors but I do have friends who do and I can't see too big of a difference to switch from 1080p.
@@MrJays 1080p and 1440p looks like absolute trash, a blurry fuckin mess. It’s hilarious because this guy spent all his money on a 4090 like a retard when lower end GPUs get the exact same FPS at 1080p . 4090 is only optimized to get higher fps in higher resolutions in most games. 3080 gets practically the same FPS as 4090 at FHD in many titles. The 4090 only becomes faster at 4k
I was looking at the benq ex270qm for 1440p 240hz. Much like you and frames, I’m the same way and also want a no bs blurry ghosting lagging monitor. What did you wind up getting ?
Upgrading your RAM to DDR5 would help with bringing up average FPS and your 1% lows as long as its stable. Some games can use more than 32GB of RAM too which will cause unnoticeable microstutters. In testing it will affect your 0.1% lows but you wouldn't really notice in actual gameplay. That said, it's good to know what works and what doesn't. AMD does not support DDR5 8000 yet, and only goes up to 5600 if I'm not mistaken. Maybe 6000 MT/s DDR5 can be utilized by high-end Ryzen systems, but I'm just going off memory right now. I read an OC3D article about it.
Just got one from a 2070 super, the super was good but at 1440p it ls average was pretty much low 40s to mid 60s. And if you turned rtx on it basically halfed that. But these games are all 100+ fps and then drop to 60 - 80 with rtx in most titles
I never understand people who go "I want the highest settings, the max settings" Then turn on DLSS which removes anti-aliasing which removes smooth edges.. of .. everything.
Well, I just want the fancy graphics, and DLSS uses DLAA as well, or at least can use it. So essentially, you get better AA than like MSAA or FXAA. And it uses less performance than traditional AA.
@@MrJays I got the point watching further, it's only about fps and not resolution so it's all fair. I really do recommend just mentioning something in your video about how DLSS and FSR actually take away "max graphics" because you can only run max settings in native. Also, if you notice, all of your reflections/lighting effects are reduced greatly in cyberpunk, DLSS basically shadow bans max settings.
The 4090 is such a ridiculous GPU... 😂 Cyberpunk 2077 2.0/Phantom Liberty DLC test at 4K! - ruclips.net/video/vS48ZWsK39A/видео.html Like the video if you enjoyed, and subscribe if you're new! 🤍
But I'm literally saying it DOES run good with DLSS... So DLSS DOES help. And any other 40 series GPU will have the same benefit. I'm not denying the game runs poorly, but DLSS absolutely does help.
When the 4090 is at 1080p it doesn’t use all its power source your not going to get insane frames sometimes I get even more fps on 1440p in certain games
With an OLED tv with BFI setting maxed (or a CRT/Plasma TV or VR headset), 60FPS has the motion clarity equivalent of 150FPS on a regular PC gaming monitor. At 120hz, it's equivalent to 250FPS of motion clarity. I'm saying this because most gaming monitors have no BFI or strobbing mode. So it always makes me laugh when people claim to be framerate snobs but only understand the numbers and miss out on the whole aspect of how the human eye sees those frames and how you can make games looks even smoother with a simple trick without requiring a 4090 or fake AI frames or bad motion smoothing.
Pro-tip: if you sit on 1080p, you don't need something like 3090 or 4090 to get 100 fps in the latest AA or AAA titles. Even 4060, 3060 or even 20XX will be plenty without dlss. If you want to update your PC, here's the instruction: 1) If you don't have 16 gigs of RAM - update RAM first, you can even go with DDR3 sticks if your motherboard doesn't support it 2) See how powerful your power unit, if it's less than 600V - I recommend updating it (of course, the targeted CPU/GPU combination also matters) 3) After you done with first two, get a new GPU. 3060ti or 3080 are great for how much they cost. If you can do a test of a used GPU before buying - do so by all means, mined 30XX series can be a real deal when it comes to price/performance as long as they're in decent condition 4) Finally, update your CPU and possibly motherboard 5) If you haven't done 1 or have DDR3 - time to update RAM
People sitting at 1080p don't want 100 fps, they want 240 fps, people if you want to get 200+ fps buy a 4070 ti, it'll perform the same as a 4080 at 1080p with the cost of less dollars, i played with a 3070 ti for 2 years and never reached the 240 fps in a game like warzone but a 4070 ti smashed it, so please don't settle for less and you'll thank me
Holy...this is amazing thank you for the effort put it in... BUT MY GOD 9:55 the music is sSOOOOOOOOOOOOOOOOOOOOOOOOOOO GOOOD, do you mind sharing the music XD ?
I have some advices. I have Ryzen 5800X, 32GB RAM, RTX 3080 12GB and 1080p@144Hz monitor (actually three monitors, I used to play on all three for sim racing before moving to VR). Do yourself a favor and go to Nvidia Control Panel - Manage 3D Settings - Global Settings - DSR Factors. The ones with DL Scaling use DLDSR, the ones with Legacy scaling use normal DSR. It lets you use higher resolutions on 1080p monitor - you just can select higher reslution in games settings with Fullscreen enabled. For me it shows that 1440p and 1620p use DLDSR and other resolutions (like 4K) are normal DSR. I still play most games in 1080p, but there are some exceptions which have terrible aliasing (for example some of older Unreal Engine 4 games) and I prefer to play them in 2160p locked at 60 FPS (or more if I have some performance room) to reduce the aliasing. With RTX 4090 you can probably have high FPS with these settings. If you'll like it, a video about it would be nice, as not many people know about these settings. Another thing - good GPU is great for VR gaming and I'd love to get a card with performance similar to RTX 4090 just to play VR games at max settings, including VR mods for "flat" games. Some mods support VR controlers, some just let you play with an xbox controller/keyboard and mouse (like VR mod for Cyberpunk). People rarely go and buy a VR headset on a whim, so I won't go further into details, unless you'd like to. Also worth mentioning - if you care about temperatures, undervolting GPU is also a good idea. I have a slight undervolt on my GPU and my performance is just a little worse (like 3-5 FPS less in Cyberpunk on max settings with Path Tracing and Ray Reconstruction, 1080p DLSS Quality, but I'm still at 60 FPS most of the time), but the card uses around 100W less than on stock settings. During summer it helped me keep my room temperature at a reasonable level. For single player, locking FPS to your monitor refresh rate is also good, especially when you're CPU limited - it will also keep your CPU temperatures down. I usually do that per game basis - if there's an option in game, I have V-sync disabled, but Max framerate limit set to 144 FPS. I also have G-Sync enabled in Nvidia Control Panel, so you don't see that much of a difference if the game runs below your monitor refresh rate and it feels bad for me without G-Sync when it happens.
I did experienced CPU Bottleneck few years ago on BFV ( i7 4770 / GTX 1070 / 1080p) , i lowered the graphics to get to 144fps but it would push the CPU to 100% and cause some freezes so i raised graphics to cut a bit of load on the CPU, dropping down to 80/90 fps and i was fine. On NFS you said if was pretty much CPU bounded, and we could see that most game you played CPU was running around 30/40%, and on NFS you were around 70%. So i guess unless your CPU goes up to 100% you're not CPU Bottleneck, the reason you're not getting much more fps than 4080 etc is probably related to the fact that these GPU were engineered and optimized for higher resolution.
False broski cpu bottleneck is cpu is not sending frame data fast enough gpu is just waiting hence 70-90% usage….gpu only displays frames on what cpu gives it…if your gpu isn’t at 99/100% usage it’s a cpu bottleneck…even tho 4090 wasn’t made for 1080p gaming it’s that we currently don’t have fast enough CPU’s to push it at 1080p make sense ?
@@TheConfed01 The low usage to me is related to which resolution the GPU was design for. I'm 100% sure if he benchmark the card with that CPU it would reach full usage. Used to have a R5 3600x with a 3080 in 1080p, GPU was bottlenecked because CPU was almost at full usage, overclocked it gave me a 10-15% margin and no bottleneck anymore. i'm still on 1080p, with a 7950X3D and 4070 Ti obviously i don't have any bottleneck, but i bet going with a 4090 in 1080p wouldn't cause any bottleneck, if his CPU wasn't sending frames fast enough he would experience frame drops which wasn't the case. Just some kinda low fps that " doesn't make sense " considering the GPU he's running. To me CPU no reaching full usage = no bottleneck, because bottleneck is indeed a CPU to slow to the send frame at the right time to the GPU, but when it's happenning the GPU is forced to skip frames, which results in freezes and sudden frame drops
What's actually "insane" to me is that everywhere with DLSS "Quality" is actually rendered at 720p.. upscaled... I feel like 720p ultra at 70ish FPS isn't as impressive as it sounds
i know its just a prefrence but putting a face cam with animated characters is bad in my opinion and the video would look better without it its just what i think and it doesnt have to be the right thing but i just wanted to tell you and the video is fun man good content
I appreciate how you worded this comment! Thank you for the kind words! I have received this feedback a couple times on this video, so for videos going forward, I will be more selective on when I use my avatar :)
Nice man the card will keep you in high frame rates for years to come. I just got a 4080 (upgrading from a 3080) and now I can get 144fps in all my games consistently at 1440p
Friends, do not forget that the RAM is using 3600mhz, you need to look at it that way to avoid a sudden fps drop, and I will also test it on the i9 13900kf today :)
So far I've been able to crank almost every game i play to max settings 1080p with just a rtx 3070 and a Ryzen 7600x Usually 120fps - 144fps. Games such as helldivers 2, horizon zero dawn, horizon forbidden west, jedi survivor, etc have all run max settings at 120fps+
Well, I've had this 1080p 144hz monitor for about 6-7 year, and back then 1440p was really taxing. I plan on going for 1440p 240hz (maybe 4K 120) now that graphics cards are powerful enough to run that. As soon as I get a new CPU, since the comments pointed out how much better the 13900K is compared to my 5950x LOL
this actually sucks not even fully utilizing your gpu + cpu to squeeze out more frames. devs don’t even optimize their games anymore, you have to have a 4090 to be able to use these high refresh rate monitors with 1080p gaming thats a terrible reality, don’t even get me started with dlss and frame gen litteral upscaled lower resolutions and ai artificial frames just to get over 100fps is pitiful for the best graphics card and cpu combo
bro they want to push us into using cloud gaming. Thats why don't optimize them anymore. Just look at nvidia's stock right now. They aren't hyped cuz people are buying 4090s like hot shit, no. They are building data centers so they can sell the cloud gaming subscription services. They don't like local gaming, too much piracy n all that. A 4090 is like $2800 dollars in my country. Nobody can buy that shit.
benchmarks and reviews are just sad now, most of them use frame generation and upscaling as if it was normal on a 1600$ GPU, i mean in some games the 4090 couldn't even get more than 60 fps ( without frame gen)
Had my 4090 for a year, I use a 240 fps 1440p monitor and with no dlss, I'm getting ~100 fps for games like red dead and forza motorsport, while for others like forza horizon 5 im getting 200-230 easily. I honestly feel like for the price I paid, that 4090 is far more worth it than the new 5090, especially for my personal gaming needs.
As a person that 1) has not perfect eye sight and 2) doesn't breath down my monitor, I never really got the whole craze of anything above 1080p. I personally use my own 960p as it's really hard to tell the difference above that. And even though I have a 3070 Ti I still like to optimize my graphics before starting a game lol.
those frames would look better with a 7800x3d but 1440 is really the best of both worlds 1080 is just ugly after you use 1440p and I know what you mean with frames and the 4090 but it's still a waist of money you literally cant see the difference between 4090 and a 4070 lol
Yeah, my new plan is to get a 14900k, and a 1440p 240hz monitor to really let this 4090 stretch its legs lol But also, there's a reason I bought the 4090 over other cards, so not really a waste of money for me
@@MrJaysi just want to clear something up. I like watching your videos, but I think you had something on here. You were doing 300 frames per second. At 1080B, regardless of which video card you have? It's insanity to think that you can tell the difference between 200 frames per second and 300 frames. It's nonsense but anyway cool video
Honestly, I'm pretty much with you. I want to play at 1440p though. I just got a 4080 super and it's great. Maxxing it most games at native 1440p with fps above 90 is amazing. It's what I've always wanted
The only insane thing about the 4090 is the price people are willing to pay. IMO.
thats an opinion relative to an individuals wallet size, unfortunately.
for those who have the money, 4090 is actually the best value for the money
It's the only powerful 40 series card worth the upgrade @@spicepirate
@@arefx I could buy it and refuse to.
Even if I had a million lying around, I'd buy socket SP86 F and cpu over an RTX4090.
4090 is the best gpu upgrade over a generation in probably the last 20 years and nothing comes close
Keep coping. 4090 is a beast and well worth the money to whoever can afford it
So I bought a rocket for my toilet.
"My name is Johnny Knoxville, and welcome to Jackass!!" 😂
So I bought a skibidi for my toilet
They marketed the 4090 for 1080p 😂
@@ichisenzythat is most cancerous shit ive ever seen
@@ichisenzystop
I think people forget just how mind-bogglingly, stupidly fast new GPU's like the 4090 are. Half of these games should be easily pushing 200+ FPS WITHOUT frame gen or DLSS. Unfortunately, modern games completely brush off game optimization like it's nothing because they think these new technologies and cards can handle it... they can, but these should ONLY be supplemental, not a reliance.
the 4090 can only hit 60 fps on black myth wukong at 4k max settings
@@johndank2209”only” is crazy considering that game and the resplution and the preset
i have a 4080, if any game right now refuses to run on it at 60fps in 1440p without ai trickery or lowering details i'm not playing it, simple as
and just to be clear, it's not good performance for the price either, it's just the lowest i can set the bar at...
atomic heart, for all it's flaws, was looking good & running beautifully, 180fps 1440p maxed out, before frame gen, many devs could take notes
@@johndank2209 my rx 460 can easily hit underrated 40fps in 480p
4K Ultra High Settings ❌
1080P Ultra High settings ✅
I used to do this until I realized I didn't need 150+ fps in all games creating unnecessary heat and gpu stress.
@@SL1PSTAR I can't deal without my game going smooth. The first time I had an 144hz monitor, I hated anything less then 100 fps.
@@project_anti Me too man! I love my 240Hz. Anything less feels nasty.
I don't aim for high FPS with games like, Grounded, Satisfactory, Path of Exiles, Ark, 7 Days To Die and many more because the gpu ramps up heat and energy usage which I don't really need.
Competative shooters though! Ramp that thing all the way up.
@@SL1PSTARi used to be happy with 75hz until i got 165hz... now anything less than 120 looks choppy. high refresh rate is blessing and a curse
@@smalltowngoose The diffference is like night and day!
As a guy who grew up poor and with no family, I had a crappy 2gb ram dx9 igpu laptop, barely playing GTA SA at 20ish fps. Now that I'm more fonancially stable, I bought an RTX 3070 laptop and my friends laugh at how I still play some games sub 1080p. Some things never change and I'm a glutton for crappy visuals lmao.
Don't worry man you do you and don't worry about what people say. I also have an RTX 3070 and it still is a good GPU. I bought mines 3 years ago and I'm still using it till this day. I got it for my first pc build and it was really popular around the time I got it so it felt good to get it. Now that time has passed and new GPU's have released I still look at my 3070 and admire it for what it has done. Enjoy your gaming man
Wow 3070 sub 1080p?
fps must be amazing
I have a 4gb 1050ti and play at 1080 too lol
I have a RTX 3060 12gb with 1600x900 75hz Monitor (Sceptre)... XD
1050ti club here)
@@RedTrainerZrtx 3060 240hz
I still find it surprising? That a 4090 at 1080p still dips below 144 quite often.
2016: $380 GTX 1070 for 144fps 1080p native ultra.
2023: $1600 RTX 4090 for 144fps 720p upscaled to 1080p with interpolated frames.
Being happy about 70fps dips at upscaled 720p on a $1600 gpu is just crazy to me.
Edit:
Since people can't seem to understand my comment, I'll explain it.
In 2016, you only needed to pay $380 for a GPU guaranteed for 120+ frame rate at 1080p ultra for triple A title games released within that generation.
In 2023, you need to pay $1000-$1800 to reach that performance target for triple A titles of this generation, and even still, a game like Cyberpunk that was released 3 years BEFORE the 4090, cannot achieve 144fps in 1080pMax on a 4090 without the use of AI.
So, who do we BLAME?
The game developers? NVIDIA and AMD?
The blame goes on the consumer!
We keep buying up these standards. Nvidia and AMD release the 40 series and 7000 series, where the only performances gains are in the high end market, because we'll buy them anyway. Developers are lazy and don't work on optimizating games, because they know we'll buy them anyway. It's not about DLSS, or FSR, frame generation AI crap, it's that the pc gaming industry saw how truly desperate the community is for the products during COVID and the semi-conductor shortage, and then matched their marketing to satisfy that enormous demand. They've been riding out the same wave for 4 years now. Hell, AMD literally said that they won't be making high end cards next generation. They don't need to! They're so satisfied with the amount of demand and desperation we show them, to where they only need to work on worse product to increase profit overhead.
He is just another paid shill by nvidia lol. Talking about how pretty a game looks while playing at upscaled 720p is just hilarious. The moment these people have to pay out of their own pocket they will drop the facade
LOL I only have 2,800 subs dude! I'm already getting called a shill, this is great! That means I'm on my way!! I bought this card for mah self like a big boy!! :D
Yeah, that's not how this works, but okay lol
@@MrJays Lmao ok
@@AndInMyBalls2I understand that not all 40 series cards are good, but the 4090 is actually worth the price to performance difference
i think the ideal screen size is ultrawide 1440p. a little bit less demanding than 4k, and more immersive (imo)
Yes 🔥 I have a 5120x1440 31:9 monitor it’s literally amazing
@@MIGHTYYGAMEShey dude what’s the name of the monitor?
@@Xxx-wd8rb AOC AGON (AG493UCX)
Actually not cause it wont push the card hard enough.
@@JoeMama-yl1ow it doesn’t I have a 7900XTX and I play on a ultra wide 31:9 the card doesn’t struggle at all
Man, that is just disrespectful. I got the 7900 XTX 5 months ago and I'm still adding high-end equipment to my setup. When my 50" 120Hz super ultrawide monitor arrives next week, my life will be complete for 5 minutes.
7950x3d + 4090 has been an amazing combo for me for 1440p 144hz gaming as well as for VR. Definitely recommend an x3d chip for gaming with the 4090.
Thanks for sharing! I plan on grabbing one of the newest gen CPUs that they are coming out with soon, and also upgrading my motherboard and ram too, so I can really see this GPU fly! LOL
@@MrJays yep, make sure to go for a 7800x3d instead of a 7950x3d because those have problems
@@casmithh Did AMD never actually fix that? I know for content creation you either want the normal 7950X, 7800X3D or 13900K if you want peak gaming performance (it has more overclocking headroom as the Ryzen processors massively ramp in heat with clock speed/vcore increases). I was hard Ryzen only for the past few years since Zen 2, decided to switch back to Intel for 13th gen and will likely keep upgrading through Intel even though it uses different sockets every gen fuck them.
@@Nitedontdie i'm decently sure there still have been some reported issues trying to get good performance out of the 7950x3d. 7800x3d works like a charm but specifically the 7950x3d (and maybe 7900x3d) have problems related to the ccd.
@@MrJaysBetter go to at least 1440p so you can see your gpu fly
For a 4090, 1440p is totally doable at high frame rates. I've got a 4090 and a 7900X3D and the only game that I need to use DLSS is cyberpunk with PT. Without DLSS it can run natively, but there's some dips. My monitor goes to 265hz at 1440p so in more competitive games, dropping the graphics gets you the framerate with no issue, while in more story focused games I aim for 100 ish.
Oh nice! That's awesome! I'm sure that looks awesome! I am planning on getting either a 14th gen Intel, or an 8000 series AMD cpu when they drop, I had no idea my CPU was so outdated after only 2 years... LOL
I was looking for such a video the entire weekend. To see, I'm not the only person, who is in general aiming high fps 1080p with my newly arrived 4090 (I'll do use it mainly for work), but I have to aggree, never had such a great experience and smooth gameplay with these titles! Thanks for the video!
You are not alone! I've been buying high-end GPUs for high refresh rate gaming and content creation as well since 2017 with the 1080 Ti! Glad I could help in any way! And thank you for watching! :)
I also use my RTX 3090 for 1080p gaming paird with Ryzen 7 5800 X3D monitor Alienware 27 FHD IPS 240hz like yu said just love high FPS max graphic on my PC 😀
@@BAGIX20 cus imo smoothness beats visuals
@@RealSlxzz yeah i recently switched to 1080p 144hz monitors from my old 4k monitor which i bought when i was a console gamer 5 years back, i barely noticed the resolution downgrade i guess my eyes prefer high fps over graphics which is great
I think the fact that I grew up with 8-bit and 16-bit and CRT TV's that I'm completely fine with my big curved 240hz 1080p monitor.
Those micro stutters could be due to the frame timing aka the time inbetween each frame thats produced. Try rivatuner to alternately cap your fps as Rivatuner's fps cap (has a setting to fix screen tearing as well) tightens the time inbetween frames making it suuuuper smooth
fps cap isnt always the end all be all solution tho
Yes, in most cases I've solved this problem by using V-Sync or an in game FPS cap.
Yea he kept blaming it all on CPU limitations, but it never even went above 90% utilization on his monitor software. Wasn’t sure if I was missing something or not.
Jackrabbit, CPU bottleneck doesn't need to be above 90% usage. That's the old meaning of it when all we had were low core count cpus. Now, with high core count CPUs, when a game only uses 4 cores, that's only 25% of a 16 core cpu. So the total usage won't show 90% or higher. What shows a CPU limitation nowadays is GPU usage, the GPU slows down to wait for the CPU. Checkout my Avatar video to really see a bottleneck from the cpu, even though CPU usage says it's low, and the frames are still good
@@MrJays Hm, interesting. I definetly see where you're coming from. From all my studies and research I've only ever found CPU bottlenecks being considered when said processor can't send enough data to the GPU to render. So what you're saying is the primary game core isn't enough HZ and or large enough, then the CPU is limiting the computing power in certain facet(s)? I suppose I could see it, especially when loading screens are baked into normal gameplay now. I personally rock an I7-13700f, RTX 4090, DDR5 6k MHZ and a lower wattage motherboard solely to keep things cool for air cooling+longevity of PC while multi-tasking, and haven't really seen a bottleneck using all cores. I believe my CPU benchmarks lower than the one in this video as well. Thanks for the input, the multi-use of the definition of bottlenecking would make this confusing.
Its like buying a supercar to drive around a karting circuit.
Better than buying a supercar to look nice in your garage or drive at 10 KM/H because of traffic
now do 480p
Lol
240p next
144p
1p
pp
I think the 4090 is a 4k monster, the 4080 is a 1440p monster, the 4070 a 1080p monster and the 4060 a 720p monster
That’s true
Not at all.
With ray tracing on and everything maxed out, 4090 is barely a 1080p monster, having hard time stay above 100 fps without frame generation….
Ray tracing era is different, there’s no monster gpu, only monster games.
I play 4k on 4080 on all ultra settings, havent found a game under 70fps yet, that includes cyberpunk.
idk but i have a 4060 and playing in 1080p max setting get me 60+ fps in games
I get similar framerates at 4K in Plague Tale on the same settings. CPU bottleneck literally means that your GPU is sitting there idle so your pretty much capping your frames regardless. Anything less then 1440P and you're literally wasting GPU power. You would probably get comparable framerates at 1080P even with lower spec cards since at that point they'd be running closer to 100% utilization across both metrics.
So, are you wasting the monitor if you can't get max framerate that your monitor supports in every game?
It doesn't matter if you are not able to utilize certain parts to its full potential, some people just want to have the most power efficient build or have bragging rights or whatever else is on their mind, maybe they just don't like money or love capitalism, etc.
You're not going to be driving 100mph in your car everywhere you can just because every new car these days can drive that fast.
@carssucksince1800s Those analogies don't really work. That's more like saying, "I bought a Lamborghini, but I put a limiter on it so it can't drive faster then 20km/h." Like okay, you can waste the money to do that... but a Honda Civic could also do that at a fraction of the price.
@@TheInfiniteDraw It would work if Lamborghini could achieve a much better fuel efficiency than a Civic or a Prius.
Playing anything above 1080p is a waste of money, but everyone still does it.
@carssucksince1800s You're adding on nonsensical hypotheticals to try and make flawed logic work. The massive engine of a Lamborghini isn't ever going to be as fuel efficient as a tiny Civic engine, because that isn't what it was designed to do. Even if you put a limiter on it. In the same way that a 4090 isn't going to be as efficient as say a 4060. As for anything above 4K being a waste, well that's just objectively wrong. The clarity and bump in visual fidelity 4K provides is massive over 1080p. You might not want to put together a rig that can play 4K because it isn't within your budget, or it isn't something you're interested in, but calling it a waste is just incorrect.
@@TheInfiniteDraw You obviously haven't done the research. 4090 will absolutely be more power efficient than a 4060, not just as efficient when performing the same task at the same intensity.
If you would stop buying crappy TVs and monitors you'd know that 1080p is sufficient. Going above it is a waste of money due to increased demand of higher resolution. You can essentially reach similar clarity by rendering games at 4K. 1080p will look worse on a 4K display due to the way digital displays work, however comparing a high quality 1080p monitor to a cheap monitor that most people get for more pixels is going to show that 1080p is providing essentially the same image quality.
Obviously more pixels are going to provide more clarity, but at what cost?
Saying that someone doesn't use a 4K because they can't afford it or whatever is the dumbest thing anyone can say. A 4K resolution is simply not worth it when the content does not change, but demand for greater performance from your hardware increases dramatically.
If 4K is not a waste in your opinion, then getting a far more powerful card for a 1080p gaming is definitely not a waste due to far greater benefits at lower resolution than at 4K.
Do some research and you will maybe realize what a bunch of nonsense you've spit out just for some eye candy. Go look at your instagram models and wait until one of them falls on your lap automagically.
0:24 I thorougly enjoyed the content that started at that point in time. Thank You.
After around a year of using a 4k monitor, it's extremely hard to go back to something like 1080p, to the point where it doesn't matter what kind of graphics you're running, since it will just look pixelated no matter what (though youtube makes it look much worse than it would look in person as well). That being said, I was somehow expecting higher frame rates, and when you said let's see how it runs without frame generation, it was shocking, since I was thinking that was the native fps lmao.
Yeah, 4090 is made for higher resolutions. It's not like it's much faster than the 4070, it can just handle so many more computations at once.
It's because his CPU is holding back his GPU from reaching max FPS. There would be almost no bottleneck with a 7800X3D
@@stove.d It is also a lot faster but yes when it comes to scalablity 4090 is such a beast that he sits on his own peak
am i the only one unimpressed? DLSS + Framegen on a 2K GPU and Plague tale runs at an average of around 140 with dips as low as 100fps at like 100% utilization. i'm more impressed that any other card can run it adequately if anything
same the "ooooh aaaaah wooow" telling us about the groundbreaking 260 fps in nfs unbound. bro doesnt know that the 130 fps frame gen nets him dont actually do anything. yet he still used 1080p WITH DLSS thats pathetic imo
Most of the games he ran were ue5 which is what most games are prolly going to do soon plus he was cpu bottlenecked not much tho
To be honest,you did a right thing.
I personally bought 4060ti,but decided to stay at 1080p,so i won't have to upgrade my hardware over and over again because of res being too high.
Gives me 100-144+ fps in Cyberpunk on ultras + rt ultra on with FG and DLSS,and it's with the fact that i haven't upgraded my cpu yet and it's heavily bottlenecked.
The 4060 Ti is a 1080p GPU though guv.
@@killerra It is more 1440p leaning,which obviously eats performance,like any other higher res does,which is why i'm staying on 1080p.
@@pumcaliber7483你好,请问你的cpu是什么型号的呢
@@SW-by3yt Ryzen 5 3600,going to upgrade to Ryzen 9 7900x next month
@@pumcaliber7483 谢谢
Dayum recently I bought a 3060ti for 720p gaming as well it can really pull 30fps effortlessly 😊
@stefan9252 stephan my brother in christ that was sarcasm
As someone who uses an RTX 5090 for 144p gaming I can confirm
nfs has a really optimized proprietary engine that handles mulithreading like no other engine. Note how all of the workload of the game was eavenly spread between all of the 32 threads you have resuling in maxing out the gpu all the time.
I LOVE what they did for NFS Unbound! It blew me away with how it handled my hardware!
The newer Need for Speed games are the poorly optimized racing games ever made...
@@AntiGrieferGamesyou're on something.. works great on 4090fe and 13900ks. The only thing that's bad is that TAA doesn't work on windowed borderless so using my multi monitors kinda sucks. Needs to be fullscreen exclusive. Game is fantastically optimized
Also forgot to add my old PC 5800x 3060ti OC the game runs great on medium to medium high very great. Looks better than on console.
@@coreytaylor7367 On what resoltuion?
Massive thanks!
I tested this on Robocop Rogue City because the combination of DLSS and Lumen caused glitchy shimmering on the streets.
Suddenly, I had a clear picture and a wonderfully high frame rate. I tested this with many of my other games as well.
When I disabled DLSS or FSR, set the resolution to 1080p, I was shocked by how clear the picture was, especially in motion. All this upscaling produces a decent image when standing still, but as soon as you move, it blurs.
I didn't understand why modern games looked so blurry, despite using 4k DLSS quality and acceptable frame rates. Your approach fixed it for me.
i got a 4090 like 2 weeks ago and cyberpunk at 1440p ultrawide is ridiculous. i love it.
Dude that's awesome!! Enjoy the 4090 and the ultrawide! Sounds great!
4090 with 1440p? get 8k 390 hertz
Great video, Jay
Hey thanks, Mistborn!! Glad you enjoyed it! And yeah, this game is a monster!
@@MrJays the 4090 could have given much better performance if it had a better display port
I bought a 4090 for 800x600 gaming at 1000fps, just waiting on 1000hz monitor.
the 4090 is the powerhouse of the computer
Im surprised 1080p didn’t bottleneck the card
i love the cam replacement you're using , the cat that reacts to your inputs is pretty clever ngl
I concur, I also prefer 1080p at higher frames vs 1440p/4K at lower frames.
Each to their own. But I don't know how you can use a 1080p screen in 2023. You can't even see half of the extra detail games are adding in these days. And aliasing is so bad.
I suppose it's one of those things where it's what you get used to. You bought a high refresh 1080p screen and so can't go back to a lower refresh one. I adopted 4k early and so can't go back to 1080p without feeling like I forgot to put my glasses on lol.
With a card like this tho 1080p makes no sense. You could get the same frame rate at 1440p
And these days you can get the best of both worlds high refresh and 4k or 1440p. Got myself a 4k 160Hz monitor for £360, I don't see a reason to use a 1080p screen with a card that expensive when 4k has gotten that cheap.
My question is.. what frame rate is enough? Surely you don't care about 1080p 300fps more than 1440p 240fps for example. You gotta agree there is a point of diminishing returns, especially in single player games. You don't need 200fps on plague tale requiem for example
So, I can only speak for myself, but I would absolutely love to try a 1080p 360hz monitor! I've only ever seen up to 144hz, personally with my own eyes. And I can say that my eyes can see the differences between 90 and 120, so I'd want to see what the highest values feel like, or look like! But for me I do agree that 1440p 240hz is more appealing in the aspect of future-proofing than keeping 1080p and going like 360hz
To be frank (no pun intended) ya'll probably know more about this shit than me lmao. On my laptop bringing the resolution down to 1080p and playing at 120fps~ vs 1440 (or using the AMD super resolution to go higher) at 50-80fps just feels and looks better for my eyes.
Then again I go mainly play FPS's like Destiny so wth do I know?
The problem is a 3090 or 4080 would be getting the same fps because of the massive cpu bottleneck. This kinda sucks because people see this and think that 70% utilization is a good thing lol
Thank you for making this video! It’s something ive always had on mine but never had the resources to buy the very best, it would be amazing for you to try even lower resolutions!
You are so welcome! I'm glad you enjoyed it! I have actually considered trying this at 720p as well, for fun :D
@@MrJays I like the video but you really need a lower latency memory tuned CPU to do 720 meaningfully.
@@Wobbothe3rd I'm glad you liked the video! And my plan is to upgrade my CPU and get a new 1440p 240hz monitor :)
Bro, this is absolutely me.
I bought an RTX 3090 in 2021 to play VR and flatscreen games in 1080p, or triple screen 1080p... always at 60hz.
It's a fever dream and I love it.
I'm at 1440p 155hz, and it's amazing. I had a feeling right from the start that even the juggernaut 4090 was gonna be a 1440p card if you like graphics AND framerates. Sometimes I have too much GPU and my i9 9900k is CPU limited, but not all the time-and with the advent of UE5, the card will still get solid FPS on Ultra at native, even when maxed out.
The next upgrade will be a cutting-edge CPU and RAM. Eventually I'd like to experience 240hz in games that allow, and "next-gen" stuff will still run smooth, but probably closer to 120.
What's the proof ?
@@speed999-uj5kr Proof of what? That high fps is amazing, or that even the 4090 can hit its limit at max graphics and high fps? The "proof" is seeing it for yourself, which you have to do in person since you need the tech-and streaming services all seem to cap at 60 anyway.
@@ryo-kai8587 claiming that you have all that stuff ! 🧢
@@speed999-uj5kr Whatever dude. Even if I had lied about that it wouldn't help, 'cause the original comment doesn't make me cool or popular in the RUclips comment section, lol. Seems to me like your jealousy is showing
You’re going to be bottlenecked no matter what, cpu or gpu bound it really doesn’t make a difference lol. Why people get worked up on it, just get the best cpu you can afford and best gpu you can afford and play the game who gives a shit about whats bottlenecked. Also ram wont make a difference in gaming either unless you have like 16gb then maybe.
I'm using the 5950x as well... I find the main game threads tank the CPU on a couple of Cores and that holds the whole thing up
Ryzen Master has a game setting that drops half the CPU for higher clock speeds on the other half. It may work better to turn off cores based off maximizing CDX availability.
I used a 3090 for 1080p for a while. The trick to getting the most out of it was to run my desktop at 4k and downscale down to 1080p. As close to having a 4k monitor without having one until I got a 4k monitor. Now I got an ultra wide 1440p oled monitor though, and I freaking love this thing.
1440p is the best i think the quality will increase a lot and the fos won t be mediocre btw i love that i found someone who have this idea i feel 4k isn t that important
I'm thinking the same thing! I think 1440p is the best middle ground for FPS and a slight resolution bump!
@@MrJays The great part about 1440p is that it actually looks significantly better than 1080p but is much less demanding than 4K. I've never cared about 4k too much because I know what it means for performance, but I'd never wanna go back to 1080p. Upgrading my 2080ti to a 4090 while staying at 1440p 155hz has been a wonderful experience.
it's only 33% more pixels, not that big of a jump. I would go to 4k if it was more affordable. will probably wait it out
@@eniff2925 it is veeery big but i agree that 4k is even a bigger jump, but the performance cost is what makes 1440 p a bigger deal for me because it is much much cheaper to get high framrates
@@abdogaming4613 i play starfield at 720p upscaled to 1080 with xess on my 1660 super and honestly its fine. I have to use a pretty strong sharpening filter though, i use marty's immerse depth aware sharpening filter. The final result is suprisingly good. I would have never thought about going under 1080p but it is honestly fine. This way i can put the game on medium and it looks much better than if it was on low.
If I had an RTX 4090 I would prefer 1440p 144hz gaming too, I think 4k is awesome for playing above 42 inches
Yeah, I'd love to try 4K gaming at some point, but I don't know if I'd make it my default resolution due to having much lower frame rates. so 1440p is my sweet spot :D
@@MrJaysyea but u can play on 4k monitor with dlss quality or balanced preset with similar fps like on 1440p native so if I have 4090 I would get that new oled 240hz 32" that's coming i think early next year idk but u can get oled 27" 1440p 240hz already that's great for 4090 too
Someone who actually talks sense 😅 4k it pretty much pointless below a certain screen anyway, I'll sure it's around 60-65" as well..
1080p is still the sweet spot
1440p is super nice
1440p ultra wide is where its at
1440p mega wide though 👀
@@Denbot.Gaming Sitting with a 32" 4K screen on my desk and it's perfect. Image is way more detailed than any 1440p.
@BornJordan im not saying it's not better... It's more of a case that it's not worth it, it comes down to the dpi and pixel count needed for such a small screen.
4k isn't really needed for anything less than 65" with a viewing distance of 2-3 meters 😶🌫️, think back to those old CRT screens 480p, you know when you could see the pixel even while sitting across the room on the sofa or even at 1080p you make them out on the 21" plus screens.... but how often have you noticed the pixels on say a 32" 1440p screen ? Probably never so is 4k worth it? for sub 60 FPS in many games? unless you own a 13900ks and an 3rd party 4090... see PC gamers tend to aim for 120 FPS or more... not 30 fps....
Personally, I'd opt for ultra wide 1440p over 4k, 3440x1440 is just shy of 5 million pixels vs 8.3 million of a 4k screen... so the FPS takes a massive hit trying to refresh all those pixels, unless you ops for DLSS or FSR, but then you're degrading the quality, making the 4k screen pointless for competitive gaming 🤷♂️
i got a 4070 super for 1080p, i use 1080p instead of 1440p because it will last way longer, future games will play good at 1080p but not 1440p, im not rich enough for a 4090 but 4070 super is still considered "overkill" for 1080p lol. cool vid
It's like buying a porsche 911 to buy groceries and never take it to the track
But that's what 99% of 911 owners do.
@yowhatsup9909 all about showing off
wtf i thought this was 1 mil view video with how well edited this is!
That just made my day, thank you so much! We're still a very small community, but we're a great community! And I always try my best for videos!
Not too far off
Lmao i feel you the only reason i made the hd upgrade (remember those AV cables anyone?) was because in Mortal Kombat on the PS3, i couldn't read the small text doing those tower challenges. So i bought the HDMI cable but dayuuuum the difference it made.
4k is cool if you're an editor and need to zoom stuff. Gaming? I guess if you're doing 4 player split screen so you and the boys each get a full HD screen.
I just want so say how polished your videos are. I looking forward for more content like this :)
Thank you so much for the kind words! I try my best for each video! I hope you will enjoy my future content, and the other content on my channel! :D
when you decide to play on a 4k 65inch tv with a custom-made widescreen aspect ratio, you'll never want to go back to 1080p
I'd love to try that!
so true the only down side is not having the high 244Hz but I have got the LG QNED 65inch which can run 1440p at 120Hz and 4K at 60Hz
Genuinely enjoyed your video dude! I can't believe I watch it all
Thank you so much for watching, and I am glad you enjoyed the video! :)
This was such a great video!!! That intro tho!!! 😂
Thank you so much!! I'm so glad you liked it! hehehe, that intro was fun to make! :)
The future looks so bright. Just imagine where we’ll be five years from now.
The future is here, and my god, it feels GOOD!! I can't wait to see the future tech 5 years from now :D
im sure your excited to pay 3000 for a gpu and not 2000
@@Nicc93lmao exactly I’ll be excited when the 4090 is 500
@@Boosted_aj Wait another five years 😆😂😆😂
@@DfOR86 I will gladly. I don’t expect anything insane coming from the gaming industry any time soon. At worst they’ll continue to poorly optimize and create paywalls for most new games. Problem is everyone will drop out of gaming if they had to start coughing up 5-800 every year to play the newest games. The 4090 will most certainly be a great card which also means it might not hit 500 in 5 years
I never understood the appeal of 1080p gaming in 2024 until yesterday. I tried 1080p with my 4070ti super and was shocked by the low latency.
High quality video, but sadly it deserve much more views
OMG Thank you for the kind words! Here before it pops off?? 👀
@@MrJays i hope so 🔥
Well you can play at this resolution for another 9-10 years or 1440p for 6-7 years. So I'd say it's a well earned investment.
Maybe 3to 4 year more
GPU's lifespan is like 5-7 yrs at most. Even if he did play for 1080p it would be long gone until that time period
@@midnightrunners1470 Im still running a 980ti so would last abit longer than that
Shouldn't new GPUs be like more stronger enough to last long? @@midnightrunners1470
I would not mind an RTX 4090 at 1080P. I did the same thing with an RTX 3080. Higher frame rates rule, and the best part about running a high-end GPU at 1080P is that you don't have to use DLSS even on the most demanding games.
1440p > 1080p
I have eyes of a 70-year-old man. 1080p is starting to look blurry to me. Especially on games with large maps.
It's probably TAA if you mean newer games
I understand why people say 8k and 4k gaming performance is a waste to focus on in 90% of cases but... you also have to remember VR. VR requires super super high resolutions at 90FPS or higher. the better the resolutions at high FPS the bettter VR can become. I play VR and I just bought a 3060 but now I need a new PSU so I can actually use the card, that should give me 1440p easily for my headset at 90FPS (in most titles and depending on graphics settings) which is super exciting.
If I had more space, I'd love to get a VR headset and do more VR content! This 4090 would be soo good for VR games!
@@MrJays playing VR feels like the opposite of a FPS over graphics gamer; 90hz is enough to get fully accurate tracking yet you easily need 4k for easily legible text; also nothing in VR necessarily goes obsolete; the index is still a good VR headset, only downside being the low resolution.
@@scarecrow5848 That's definitely a concern of mine, is the FPS being too low in VR. But I think the 4090 should be able to handle it on most games, right? I haven't looked into VR much with this card lol
@@MrJays not at 4k but yeah, it can get close. again, you really just need to reach 90 and your good 👍x3d helps the most with it
Quality ..It's not about the highest fps we can achieve but how consistent it can be maintained.
Those micro stutters are because you didn't do any OS optimizing, or latency reduction aimed setti gs changes in your os. It takes a few hours to do even if you know what ur doin to accomplish but your fps would raise 10-30% still. My 3070 with i7 7700k does nearly as good at 1080p but no stutters lol. Use latencymon to view your system latency and don't stop learning about reducing system latency till you see 5-50ms overall system latency. And don't forget to use regedit to get the default windows data queue on your mouse and keyboard to reduce the data queue from default windows settings of 100 down to 5-16ms or else u have minimum up to 100ms input lag
Epic comment, thanks.
bro make a guide for us please
I found the jump from 1440p monitor to 4K 144Hz actually not that drastic. Most games I play stay over 144fps at 4K, most settings on high. Forza Horizon for example I got 190fps at 1440p but 160 at 4K and looks SO much nicer and sharper. Got the Gaming X Trio 4090.
Hmmm, interesting! I'll keep this in mind! Glad you're enjoy your 4090 as well!
U mean the performance decrease was not that drastic, or the res? For me, the switch from 1440p to 2160p was just as impressive as 1080p to 1440p. I remember in 2013 when I had sli gtx 780 and I picked up the first 60hz 4k monitor that Dell made. The only other 4k 60hz monitor was an ASUS. They were the exact same monitor. It was so sick, but the 3GB of vram wasn't enough for newer games.
This is literally me in 2015 buying a gtx Titan X graphics card for 1000+ euro only to have it run like garbage because the rest of my pc was bad
You NEED to enable DLDSR! It'll be even more beautiful and still get insane FPS!
You could also try playing at 1440p with FSR so its like running it at 1080p but at a higher resolution.
The big brain! 1440p with DLSS making it 1080p, and frame gen giving me 200 FPS LOL
@@MrJaysI've just bought a PC and this is like alien talk is this important my PC has the same gpu
@@keybladerasta4142 Hi! So if you have a 4090, you can basically play at any resolution you want, with any and all settings at the maximum without any issues! What we were discussing is how lower resolutions handle the immense power of the 4090 :D
But if you have any questions, feel free to ask!
Right! I had forgotten about this. It's like the optimal anti-aliasing! I'm gonna give that a go haha
why does my rx7800xt get more frames than a 4090
Only the ray tracing performance is carrying the 4090 it seems
Here's the thing with literally every high end gpu that comes out.
I don't think its that unspeakably dumb to use 4090 at 1080p because some day.. say 6 or 7 years from now were going to probably look back and say
"Yeah the 4090 is a great used option for 1080p gaming"
Exactly, they all become 1080p cards eventually - I'm just using it now to get stupidly high FPS LOL
2:15 no it looks like an oilll painting cus I guess you have zero knowledge abt what frame gen and dlss is. using dllss at 1080p should be a crime.
LOL okay guy. You sound very chronically online right now. The video was for fun. And, there's nothing wrong with the generated frames *as someone who has played a wide variety of games with it*. And DLSS is amazing no matter the resolution you use it at. The DLAA portion of it can help fix bad AA issues in games, and then DLSS can help increase your frames further - but generally I only turn on DLSS when the games I'm playing limit Frame Gen to DLSS.
If I had the money I would buy a super high end gaming pc just to play old games at 1080p
Hell yeahh
Thanks for posting this. I have 12700k, a 1080p monitor and my 4090 is on its way to me now. I'm gonna do it just like you, ultra settings with ultra FPS. LOL
Obs.: yeah, eventually I'll get a 1440p monitor. And F' 4k monitors!! haha
WOOP WOOP!! FPS is KING!! LOL Enjoy the 144hz or higher, and thanks for watching! :D
if i were you, i would grab the gigabyte m28u great budget 4k gaming monitor.
@@pcgamerzzzchannel3407I have the gigabyte 28 inch and it is decent. Price should be dropping.
Thanks! @@pcgamerzzzchannel3407
@@Ajwhayahaha here my setup , be quiet okay?
Pc Specs
Aorus Master z790
Corsair Venquence 2x 16gb DDR5 5600 CL36
Intel Core i9 13900k
Corsair H170i elite LCD
Gigabyte Gaming OC 4090
SSD samsung evo 850 500gb
SSD samsung evo 860 1TB
SSD samsung evo 870 4TB
Samsung 990 PRO 2TB M.2 SSD
Seagate HDD 2TB
Corsair 7000D
Corsair hx1200
Creative Soundblaster AE-5 Plus
Logitech-G G915 Lightspeed TKL GL Tactile QWERTY US
Logitech-G Pro X wireless Gaming Headset
Logitech-G Pro X Superlight
Logitech speakers Z623
MSFS Setup:
HP Reverb G2
Logitech-G Pro Flight Yoke System
Logitech-G Pro Flight Rudder Pedals
Win 11
Monitors
AOC AG271QG 1440p 165hz
Asus Rog Strix XG27UQ 4K 144hz
LG 32GQ950-B 32" 4K 144Hz
4k is just too glorious for me to ever want to go back to 1440p even if the 4090 starts struggling
I'd love to try it out some day! I've never really seen true native 4K gaming before!
11:28 NFS Unbound, I enjoyed this part of the video so much. Like watching a anime racing cartoon show xD
I LOVED that game! It was so cool!
rtx 4090 and stays on 144hz?!?! bro 1080p 360hz why get that good of a card?
Honestly, it seems I overestimated how good my CPU was, and under estimated how crazy the 4090 actually is LOL
I’ve tried 60 hz and 240 hz and really don’t care enough to go beyond 60fps. But holy shit 4k is incredible compared to 1080p. 1080p just looks super blurry to me now . OLED is absolutely insane as well. I’d rather take 4k OLED 60hz over 1080p 500hz any day of the week, but to each his own
That's cool and all but how are you testing 1080p on games? If you're using a 4k monitor and setting the resolution to 1080p it will look a lot worse than a native 1080p monitor.
@@JosiahKreifels I have a native 1080p monitor . And a native QHD monitor . They both look like dogshit now after switching to 4k
I'm actually envious of the people like that who can't see, or don't care for higher than 60hz! If I were like that, I'd have a 4K HDR monitor by now LOL
@@MrJays I know a few people who can't see the difference and I feel the same way as you. I love high hz and recently I bought a 360hz IPS panel with HDR and my goodness is it beautiful compared to a TN panel. I don't own any 1440p or 4k monitors but I do have friends who do and I can't see too big of a difference to switch from 1080p.
@@MrJays 1080p and 1440p looks like absolute trash, a blurry fuckin mess. It’s hilarious because this guy spent all his money on a 4090 like a retard when lower end GPUs get the exact same FPS at 1080p . 4090 is only optimized to get higher fps in higher resolutions in most games. 3080 gets practically the same FPS as 4090 at FHD in many titles. The 4090 only becomes faster at 4k
I was looking at the benq ex270qm for 1440p 240hz. Much like you and frames, I’m the same way and also want a no bs blurry ghosting lagging monitor. What did you wind up getting ?
Upgrading your RAM to DDR5 would help with bringing up average FPS and your 1% lows as long as its stable. Some games can use more than 32GB of RAM too which will cause unnoticeable microstutters. In testing it will affect your 0.1% lows but you wouldn't really notice in actual gameplay. That said, it's good to know what works and what doesn't.
AMD does not support DDR5 8000 yet, and only goes up to 5600 if I'm not mistaken. Maybe 6000 MT/s DDR5 can be utilized by high-end Ryzen systems, but I'm just going off memory right now. I read an OC3D article about it.
Just got one from a 2070 super, the super was good but at 1440p it ls average was pretty much low 40s to mid 60s. And if you turned rtx on it basically halfed that. But these games are all 100+ fps and then drop to 60 - 80 with rtx in most titles
Nice!! Congrats on getting one!! I love mine, totally worth the price of rent!! LOL
@@MrJays lol yeah i financed mine instead, still on backorder actually but it should be here soon.
the anticipation is insane
I never understand people who go "I want the highest settings, the max settings"
Then turn on DLSS which removes anti-aliasing
which removes smooth edges..
of ..
everything.
Well, I just want the fancy graphics, and DLSS uses DLAA as well, or at least can use it. So essentially, you get better AA than like MSAA or FXAA. And it uses less performance than traditional AA.
@@MrJays I got the point watching further, it's only about fps and not resolution so it's all fair.
I really do recommend just mentioning something in your video about how DLSS and FSR actually take away "max graphics" because you can only run max settings in native.
Also, if you notice, all of your reflections/lighting effects are reduced greatly in cyberpunk, DLSS basically shadow bans max settings.
The 4090 is such a ridiculous GPU... 😂
Cyberpunk 2077 2.0/Phantom Liberty DLC test at 4K! - ruclips.net/video/vS48ZWsK39A/видео.html
Like the video if you enjoyed, and subscribe if you're new!
🤍
if you "play" starfield the 4090 becomes a literal 1080p card
@@Blox117 Not if you install the DLSS mod :) I can get 100+ FPS in 4K with it. And soon it will be natively added in
@@MrJays if it cant run it natively at good FPS on the MOST POWERFUL GPU, then DLSS wont fix the problem. neither will people buying garbage
But I'm literally saying it DOES run good with DLSS... So DLSS DOES help. And any other 40 series GPU will have the same benefit. I'm not denying the game runs poorly, but DLSS absolutely does help.
I think you are the person that new 500hz 1080p monitor aims for
When the 4090 is at 1080p it doesn’t use all its power source your not going to get insane frames sometimes I get even more fps on 1440p in certain games
I do mention that in the video. And, I need a newer CPU, I'd get like 30% more performance if I did
@@MrJays my bad dude didn’t finish the video yet I was at work! Thanks for the reply. Great video man I’m not hating
Bro buy a 4090 for 1080p60 fps max settings and it will keep you going for a decade.
4090 already struggling at 4k max native at recent UE5 games
With an OLED tv with BFI setting maxed (or a CRT/Plasma TV or VR headset), 60FPS has the motion clarity equivalent of 150FPS on a regular PC gaming monitor. At 120hz, it's equivalent to 250FPS of motion clarity.
I'm saying this because most gaming monitors have no BFI or strobbing mode. So it always makes me laugh when people claim to be framerate snobs but only understand the numbers and miss out on the whole aspect of how the human eye sees those frames and how you can make games looks even smoother with a simple trick without requiring a 4090 or fake AI frames or bad motion smoothing.
U suggest gaming on a TV?
I'm sorry but if you can't feel or see the difference between 60fps with BFI and 120+FPS no BFI you're insane.
Pro-tip: if you sit on 1080p, you don't need something like 3090 or 4090 to get 100 fps in the latest AA or AAA titles. Even 4060, 3060 or even 20XX will be plenty without dlss.
If you want to update your PC, here's the instruction:
1) If you don't have 16 gigs of RAM - update RAM first, you can even go with DDR3 sticks if your motherboard doesn't support it
2) See how powerful your power unit, if it's less than 600V - I recommend updating it (of course, the targeted CPU/GPU combination also matters)
3) After you done with first two, get a new GPU. 3060ti or 3080 are great for how much they cost. If you can do a test of a used GPU before buying - do so by all means, mined 30XX series can be a real deal when it comes to price/performance as long as they're in decent condition
4) Finally, update your CPU and possibly motherboard
5) If you haven't done 1 or have DDR3 - time to update RAM
from 1080p 144hz tn 4790k 1070ti 16gb to 1440p ips qled 5800x 3080 32gb and a nvme its feels better a lot better :)
People sitting at 1080p don't want 100 fps, they want 240 fps, people if you want to get 200+ fps buy a 4070 ti, it'll perform the same as a 4080 at 1080p with the cost of less dollars, i played with a 3070 ti for 2 years and never reached the 240 fps in a game like warzone but a 4070 ti smashed it, so please don't settle for less and you'll thank me
Holy...this is amazing thank you for the effort put it in...
BUT MY GOD 9:55 the music is sSOOOOOOOOOOOOOOOOOOOOOOOOOOO GOOOD, do you mind sharing the music XD ?
I have some advices. I have Ryzen 5800X, 32GB RAM, RTX 3080 12GB and 1080p@144Hz monitor (actually three monitors, I used to play on all three for sim racing before moving to VR).
Do yourself a favor and go to Nvidia Control Panel - Manage 3D Settings - Global Settings - DSR Factors. The ones with DL Scaling use DLDSR, the ones with Legacy scaling use normal DSR. It lets you use higher resolutions on 1080p monitor - you just can select higher reslution in games settings with Fullscreen enabled. For me it shows that 1440p and 1620p use DLDSR and other resolutions (like 4K) are normal DSR. I still play most games in 1080p, but there are some exceptions which have terrible aliasing (for example some of older Unreal Engine 4 games) and I prefer to play them in 2160p locked at 60 FPS (or more if I have some performance room) to reduce the aliasing. With RTX 4090 you can probably have high FPS with these settings.
If you'll like it, a video about it would be nice, as not many people know about these settings.
Another thing - good GPU is great for VR gaming and I'd love to get a card with performance similar to RTX 4090 just to play VR games at max settings, including VR mods for "flat" games. Some mods support VR controlers, some just let you play with an xbox controller/keyboard and mouse (like VR mod for Cyberpunk). People rarely go and buy a VR headset on a whim, so I won't go further into details, unless you'd like to.
Also worth mentioning - if you care about temperatures, undervolting GPU is also a good idea. I have a slight undervolt on my GPU and my performance is just a little worse (like 3-5 FPS less in Cyberpunk on max settings with Path Tracing and Ray Reconstruction, 1080p DLSS Quality, but I'm still at 60 FPS most of the time), but the card uses around 100W less than on stock settings. During summer it helped me keep my room temperature at a reasonable level.
For single player, locking FPS to your monitor refresh rate is also good, especially when you're CPU limited - it will also keep your CPU temperatures down. I usually do that per game basis - if there's an option in game, I have V-sync disabled, but Max framerate limit set to 144 FPS. I also have G-Sync enabled in Nvidia Control Panel, so you don't see that much of a difference if the game runs below your monitor refresh rate and it feels bad for me without G-Sync when it happens.
I did experienced CPU Bottleneck few years ago on BFV ( i7 4770 / GTX 1070 / 1080p) , i lowered the graphics to get to 144fps but it would push the CPU to 100% and cause some freezes so i raised graphics to cut a bit of load on the CPU, dropping down to 80/90 fps and i was fine.
On NFS you said if was pretty much CPU bounded, and we could see that most game you played CPU was running around 30/40%, and on NFS you were around 70%.
So i guess unless your CPU goes up to 100% you're not CPU Bottleneck, the reason you're not getting much more fps than 4080 etc is probably related to the fact that these GPU were engineered and optimized for higher resolution.
False broski cpu bottleneck is cpu is not sending frame data fast enough gpu is just waiting hence 70-90% usage….gpu only displays frames on what cpu gives it…if your gpu isn’t at 99/100% usage it’s a cpu bottleneck…even tho 4090 wasn’t made for 1080p gaming it’s that we currently don’t have fast enough CPU’s to push it at 1080p make sense ?
@@TheConfed01 The low usage to me is related to which resolution the GPU was design for.
I'm 100% sure if he benchmark the card with that CPU it would reach full usage.
Used to have a R5 3600x with a 3080 in 1080p, GPU was bottlenecked because CPU was almost at full usage, overclocked it gave me a 10-15% margin and no bottleneck anymore.
i'm still on 1080p, with a 7950X3D and 4070 Ti obviously i don't have any bottleneck, but i bet going with a 4090 in 1080p wouldn't cause any bottleneck, if his CPU wasn't sending frames fast enough he would experience frame drops which wasn't the case. Just some kinda low fps that " doesn't make sense " considering the GPU he's running.
To me CPU no reaching full usage = no bottleneck, because bottleneck is indeed a CPU to slow to the send frame at the right time to the GPU, but when it's happenning the GPU is forced to skip frames, which results in freezes and sudden frame drops
I got a 1440p OLED monitor for my 4070ti and dude it’s insane
Dang dude! nice, enjoy!
What's actually "insane" to me is that everywhere with DLSS "Quality" is actually rendered at 720p.. upscaled... I feel like 720p ultra at 70ish FPS isn't as impressive as it sounds
Unironically amazing for people with 1080p 360Hz monitors. Playing games feels like looking through a window.
Dang dude I thought you were a massive channel until I read a comment. Keep it up bro you're very professional and entertaining!
Thank you so much!! I appreciate that! I get so many comments on my videos being rude or mean, but your comment has made my day! 😄
of course man I saw those negative ones and I was like what is wrong with everybody that was sick! Have a great day man!@@MrJays
Did this with my 1080ti. It’s still going strong and chews through most of what I play.
Hey me too! I bought a 1080 Ti back when they came out for the same reason as my 4090! Same with my 3080 as well!
i know its just a prefrence but putting a face cam with animated characters
is bad in my opinion and the video would look better without it
its just what i think and it doesnt have to be the right thing
but i just wanted to tell you
and the video is fun man good content
I appreciate how you worded this comment! Thank you for the kind words! I have received this feedback a couple times on this video, so for videos going forward, I will be more selective on when I use my avatar :)
i'm the same way lol I have a 1080ti and an 8700 running on a 1080p monitor
Nice man the card will keep you in high frame rates for years to come. I just got a 4080 (upgrading from a 3080) and now I can get 144fps in all my games consistently at 1440p
Nice! That sounds awesome! I hope you enjoy your 4080! :D
0:47 , what game is this?
That is Ratchet & Clank Rift Apart! :)
@@MrJays thank you!!!
2:40 MrJay: "wow, look at this river"
Me: this river flows upwards? o.O"
LOL It does look like that, huh. I assure you, it's just a visual optical illusion
Friends, do not forget that the RAM is using 3600mhz, you need to look at it that way to avoid a sudden fps drop, and I will also test it on the i9 13900kf today :)
So far I've been able to crank almost every game i play to max settings 1080p with just a rtx 3070 and a Ryzen 7600x
Usually 120fps - 144fps.
Games such as helldivers 2, horizon zero dawn, horizon forbidden west, jedi survivor, etc have all run max settings at 120fps+
But why not 1440p
Well, I've had this 1080p 144hz monitor for about 6-7 year, and back then 1440p was really taxing. I plan on going for 1440p 240hz (maybe 4K 120) now that graphics cards are powerful enough to run that. As soon as I get a new CPU, since the comments pointed out how much better the 13900K is compared to my 5950x LOL
this actually sucks not even fully utilizing your gpu + cpu to squeeze out more frames. devs don’t even optimize their games anymore, you have to have a 4090 to be able to use these high refresh rate monitors with 1080p gaming thats a terrible reality, don’t even get me started with dlss and frame gen litteral upscaled lower resolutions and ai artificial frames just to get over 100fps is pitiful for the best graphics card and cpu combo
bro they want to push us into using cloud gaming. Thats why don't optimize them anymore. Just look at nvidia's stock right now. They aren't hyped cuz people are buying 4090s like hot shit, no. They are building data centers so they can sell the cloud gaming subscription services. They don't like local gaming, too much piracy n all that. A 4090 is like $2800 dollars in my country. Nobody can buy that shit.
I got a 640x480 don't be such a snob playing at 1080p. I upgraded from the gtx 1060 to the rtx 4090 and hardly notice a difference.
thanks 4 this piece of docu-science
Thank you so much for the kind words! I greatly appreciate it, really! :)
benchmarks and reviews are just sad now, most of them use frame generation and upscaling as if it was normal on a 1600$ GPU, i mean in some games the 4090 couldn't even get more than 60 fps ( without frame gen)
Had my 4090 for a year, I use a 240 fps 1440p monitor and with no dlss, I'm getting ~100 fps for games like red dead and forza motorsport, while for others like forza horizon 5 im getting 200-230 easily.
I honestly feel like for the price I paid, that 4090 is far more worth it than the new 5090, especially for my personal gaming needs.
Your face cam is epic bro
As a person that 1) has not perfect eye sight and 2) doesn't breath down my monitor, I never really got the whole craze of anything above 1080p. I personally use my own 960p as it's really hard to tell the difference above that. And even though I have a 3070 Ti I still like to optimize my graphics before starting a game lol.
those frames would look better with a 7800x3d but 1440 is really the best of both worlds 1080 is just ugly after you use 1440p and I know what you mean with frames and the 4090 but it's still a waist of money you literally cant see the difference between 4090 and a 4070 lol
Yeah, my new plan is to get a 14900k, and a 1440p 240hz monitor to really let this 4090 stretch its legs lol
But also, there's a reason I bought the 4090 over other cards, so not really a waste of money for me
@@MrJaysi just want to clear something up. I like watching your videos, but I think you had something on here. You were doing 300 frames per second. At 1080B, regardless of which video card you have? It's insanity to think that you can tell the difference between 200 frames per second and 300 frames. It's nonsense but anyway cool video
By the quality of your content I assumed you had more subscribers, I have no doubt you'll get more.
This means so much to me! Thank you! :D
I try my best!
Honestly, I'm pretty much with you. I want to play at 1440p though. I just got a 4080 super and it's great. Maxxing it most games at native 1440p with fps above 90 is amazing. It's what I've always wanted