Turn off RT if you're at all concerned with the performance difference between 16:9 and 21:9. There's still nothing in 2024 that's worth the performance degradation of RT if you don't have performance to spare. I'm sure there's a few people out there that won't agree, but Cyberpunk playing at a higher resolution, without FG, at a good framerate is the better experience versus RT with FG, higher latency, and a lower base resolution. I'm sure there's a couple walking simulators where it's debatable, but for an FPS, it's a no-brainer. This is a very good video to set the record straight on resolution scaling between aspect ratios.
@@joeykeilholz925 No, he's spot on. If you want high res and RT, then why are you watching a video on frame rates? If you can hit 60fps, why would you care?
@@MrFerrari458gto Maybe because he has a GPU that can hit 90fps with RT and he doesn't want to compromise that? RT or no RT is irrelevant when it comes to wanting to know the performance hit of upping the resolution. He watched this video for the exact same reason you watched this video: he doesn't want to compromise his framerate. Though in his case he also doesn't want to compromise his graphics, unlike you.
A month ago I weighed the idea of swapping my RTX 2060 for an RX 7700. But I figured I would not notice any significant difference since I had a 1080p 60Hz monitor. Very basic. The RTX 2060 already let me play 98% of my games at ultra settings at 60 fps; an RX 7700 would not have made any difference. My real bottleneck was my monitor. So I went ultrawide 1440p 160Hz. In the case of Helldivers 2, which I played at high settings, the change to 1440p took a heavy toll on the poor 6GB of VRAM of my GPU, so I had to lower the settings and use upscaling to barely reach 60 fps. But in the end, you are wondering, was it worth it? HELL YES. I CAN SEE MORE!
Just got the new Alienware OLED ultrawide, absolute beast, although on certain settings I get headaches so I had to do some tinkering. The last ultrawide I had was the 2015 Asus PG349Q; they have come a long way!
I've been gaming on ultrawide since 2019. I'm currently using an Alienware AW3821DW that I bought at the beginning of 2022. It can be irritating that I still have to do a hex edit or some other modification for some games, but it is what it is.
Luckily, for most games I've played I've found a way to even edit the cutscenes to display in ultrawide. Pre-rendered scenes are a different story though. @@ultrawidetechchannel
The true test for me would be lending ultrawide monitors to FPS pros. The extra vision alone would be great, especially for battle royale games, but I'm very worried about the supposed image warping at the edges. Besides that, having two monitors where you can game on one and watch something on the other is very nice. Right now I have a performance monitor and a picture-quality monitor. I would love to just have one monitor for convenience's sake.
So, I've got a 3440x1440 monitor paired with a 3070. I've been seeing the limits of just 8GB and I'm not sure where to go from here: 4070TiS/4080(S), or, as much as I enjoy it, sell the monitor. Any thoughts?
The 16GB on the 4070TiS and the 4080S will keep you safe at that resolution for quite a long while. Even if you were to go down to a 16:9 1440p monitor it wouldn't mean you would be safe with only 8GB of VRAM. Some newer games are still going to get you.
I'm on two ultrawides, an OLED and an IPS, plus a third 24" IPS. My 3080 Ti handles the games I play just fine with max settings and ray tracing, but the latest games I've played are the RE4 remake and RE Village. I've been meaning to replay Elden Ring and start Cyberpunk; maybe I'll notice it more then.
What are your thoughts on the effect of changing the field of view? Theoretically you should run a bigger FOV angle on an ultrawide. And in theory this could cause lower fps. From my tests and a little looking around on YT this performance impact is a lot less than I expected. So, if you didn't factor that into your test, I think you've been right to do so, but I would be interested in your thoughts on this.
I used just the base settings you get when picking a preset, as most people will never touch the FOV slider in a game. In most games I don't bother with it at all; I don't feel 21:9 is wide enough to need an adjustment. At 32:9, depending on what kind of experience you want to get out of it, FOV may be a necessary setting to play around with.
I have a Westinghouse 34" ultrawide being pushed by a 5800X3D and 6900 XT. The setup works pretty well. Cyberpunk runs at around 65 fps with ultra settings and high RT settings.
It is also worth mentioning that in ultrawide you can activate VRS without really noticing it, increasing performance even more with basically no quality difference.
The fact that my 3060 is handling high to ultra graphics in ultrawide 1440p and hovering between 40-80 fps in every title pretty much proves that you don't need overwhelmingly good specs to run it.
I just got a 4070 Super, coming from a 1080 Ti, and I'm planning to get the Alienware AW3423DWF. I've seen some testing where certain games (Alan Wake 2 for example) were reaching towards 12GB of VRAM; should I be worried about VRAM?
12GB should be fine for that resolution for everything except path tracing in Cyberpunk and Alan Wake 2. Regular ultra ray tracing, especially with DLSS on, which you will definitely be using, will still stay within your memory budget.
My brother has changed his monitor to a 34-inch ultrawide 1440p OLED. In a few weeks I will build him a new PC and I am still hesitating whether to recommend the 7900 XT or the 4070 Ti Super. The 20GB of the 7900 XT seems to me a great option for a monitor like this, but I know that the Nvidia card is a safe bet: DLSS, RT, the Nvidia encoder, etc. What would you recommend in this case?
The same applies to 4K and even 8K: increasing resolution does not increase load across all aspects of the render pipeline equally. Even if you calculate pixels rendered per second, you're almost always getting the best value from a higher resolution display.
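One way to sanity-check that "value" argument is to multiply framerate by pixel count. A rough sketch with made-up framerates (the fps numbers are illustrative assumptions, not from the video):

```python
# Pixels-per-second throughput with hypothetical framerates: even as fps
# falls at higher resolutions, total pixel throughput keeps rising.
resolutions = {
    "2560x1440": (2560 * 1440, 100),  # hypothetical 100 fps
    "3440x1440": (3440 * 1440, 85),   # hypothetical 85 fps
    "3840x2160": (3840 * 2160, 60),   # hypothetical 60 fps
}
for name, (pixels, fps) in resolutions.items():
    print(f"{name}: {pixels * fps / 1e6:.0f} megapixels/second")
# -> roughly 369, 421, and 498 megapixels/second respectively
```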
That's because games are never made for ultrawide, or even super ultrawide 32:9. The image is stretched wider and wider with the screen, and your performance is mostly affected by going from 1080p to 1440p. That's why my 3070 Ti could still run most games at 90+ fps on my 5120x1440 monitor. It was struggling in the newest titles though, so I upgraded to a 4070 Ti that I got for cheap, and it is amazing.
Wish I had found this video before I bought my 5120x1440 monitor... I had a 4060 Ti at the time, and ended up with a 4070 Ti Super, which I'm really happy with! I now have a proper GPU for my 5120x1440 and 3440x1440 screens.
I just whipped out my old 1070 FE because I've been wanting to build a portable mini-ITX system with it. I ran it at 2560x1080 and that thing slaps!! It even did about as well as my 5700 XT at 3440x1440!! Frakkin crazy! Now I'm wondering why I even upgraded 4 years ago in the first place lol
I built a system using a 4070 Ti and 5800X a couple years ago. Great 16:9 1440p rig. Then, without much research, I got an ultrawide a month later. I don't regret it, and most games look and run well, but I do wish I had a 4080 or didn't need to lean so heavily on frame generation. I think the biggest problem is that legacy and some newer games don't natively support UW. I mean... Elden Ring... what are we doing, FromSoft? And UW "fixes" can trip the anti-cheat.
Am I the only one that thinks these performance numbers are insane? I don't mean anything against the author, but I just looked at a decent 1440p ultrawide build, and going by these benchmarks a 7900 GRE is not gonna cut it for 60fps gaming. I'm talking about a 3400€ build. And what is going to happen with games 2 years down the line? I was almost about to buy that config (PCPartPicker) but this benchmark has me go nope. I'll wait for the next CPU and GPU gen. I can't spend 3400€ on a PC that can barely do 1440p non-ultrawide.
Like I say in the video, I made these settings as punishing as possible to try and prevent any bottlenecking other than from the GPU, because I previously found that even on regular ultra settings some of my test games were not able to fully leverage the 4090, even at 3440x1440 ultrawide. So, going even lower in resolution, I had to make sure that happened as little as possible. You can get much better frame rates than what is shown here for almost no visual downgrade; there are very few of these games I would actually play at the tested settings, and I would classify myself as a pretty bad visuals snob.
@@ultrawidetechchannel yeah but a 7900 XT barely makes the cut now in almost all the games (40fps); what's gonna happen in 3 years? I'm not gonna change computers every 3 years to keep playing at 1440p.
Depends on the game and whether you are willing to use DLSS or FSR upscaling and frame gen. If you can get to about 90fps with upscaling, then frame gen can take you the rest of the way to your monitor's max refresh rate.
What I'd really love to know is how much the two side monitors next to my center 3440x1440 screen are hurting my gaming performance. I have a 27" 3840x2160 60Hz monitor on the left, my 35" 3440x1440 100Hz screen in the middle, and a 27" 2560x1440 60Hz on the right. I usually only have Steam and Discord running on the side monitors, so nothing that should be using a lot of GPU horsepower, but still, it would be interesting to know if I'm losing noticeable performance by having those side monitors on while gaming.
Any way you could get your hands on a 2080 Ti and run it like a 2080, just to see how much of a performance difference there would be if it had more VRAM? As a 2080 owner that moved from 1440p 16:9 to 1440p 21:9 a few months back, the 8GB really aged the card, performance-wise. It may not have been a 30% performance reduction, but it sure felt like one! Now I have a 4080S on the way, and seeing an over 100% performance increase makes spending all that money feel a little bit better. 😅
I don't know anybody who has one, and the cost of acquiring one would definitely not pay for itself with the content I could make from it. Right now the 2080 is a good stand-in for the 4060 and any other 8GB card that you might be considering. You will be delighted to no end by the performance improvement you're getting with the 4080 Super.
I have the 4070 Ti. I did some calculations because what people say about how much performance you lose going to an ultrawide just didn't make sense, so thank you so much for proving me right lol. I have a 3440x1440 and I love the extra screen immersion you get while not losing the "34%" performance they claim.
I run an ultrawide 1080p monitor and I was considering a 1440p one to pair up with my 5800X3D and 7800 XT, and I have been wondering how much of a drag ultrawide 1440p would be on that. But I also typically run games with FSR (frame generation if available) and no ray tracing, at high to ultra settings. Now I can make a more educated decision.
The performance drop always needs to be weighed against how noticeable the benefit is. I like to compare it to feeding 1080p to my 60" LG OLED and letting the TV handle the upscaling to 4K. The benefits of a smaller 1440p monitor, regardless of whether it's 16:9 or 21:9, are irrelevant. Bigger is better than smaller and OLED is better than LCD. And that's that.
Cyberpunk is not "the worst performing" game in your ranking. The linear performance change for Cyberpunk shows how well optimized this game is. Every bit of your GPU is used and nothing is wasted. This is not a trivial thing to achieve, and the developers had to put a lot of effort into it.
I think the most important part here, which surely isn't obvious to many: if you are going ultrawide, just be sure to have a GPU with more than 8GB of VRAM. Many games suck up 8-10GB of VRAM already. Be more future-proof and go with 12GB/16GB etc.
*Fun Fact...* The "Rise of the Tomb Raider" benchmark got a higher FPS score with a hard drive than it did with an SSD. Why? Because many of the textures didn't load in quickly enough to get processed. So you got more pop-in but a higher FPS. Benchmarking can get weird (I'm referring to the lack of VRAM for this video causing difficulties).
I'm now running a 4060 on a 24" 1080p monitor... do you think I should get a 27" 1440p or a 32" ultrawide? I know I probably won't be able to play at native 1440p, but I can still rely on DLSS and frame generation. What should I do?
I think a 4060 might struggle a bit too much with ultrawide, personally. I suppose it partly depends on what types of games you like. If it's games like Fortnite and CS:GO then you would be fine, but if you're playing games like Black Myth: Wukong or Alan Wake 2, I'd stick with 1440p.
Huh, it would never have occurred to me to do the calculation that way (dividing by the pure increase in pixel count), precisely because that doesn't work between 1440p and 4K either. My thought process was: this resolution includes about 25% of the extra pixels you'd render when going from 1440p to 4K, so I'll estimate performance by adding 75% of the 1440p framerate to 25% of the 4K framerate. I'd be interested to know how far off I am with this approach. I figure it would work pretty well when there are no memory walls involved; it'd likely still underestimate cards where 4K would be memory-walled but 3440x1440 isn't, but it should do OK where that isn't the case.
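For the curious, that blend is easy to write down. A minimal sketch of the estimator; the exact pixel-based weight works out to about 27.5%, close to the 25% guess above, and the example framerates are made up:

```python
def estimate_uw1440_fps(fps_1440p: float, fps_4k: float) -> float:
    """Blend 2560x1440 and 3840x2160 benchmark results to guess 3440x1440.

    3440x1440 adds about 27.5% of the extra pixels you'd render going
    from 1440p to 4K, so the 4K result gets roughly a quarter weight.
    """
    w = (3440 * 1440 - 2560 * 1440) / (3840 * 2160 - 2560 * 1440)  # ~0.275
    return (1 - w) * fps_1440p + w * fps_4k

# Made-up example: a card doing 100 fps at 1440p and 60 fps at 4K
print(estimate_uw1440_fps(100, 60))  # -> 89.0
```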
I have a 3440x1440 and thinking about upgrading to 5120x1440 for sim racing ... I know my 3060 will struggle so I hope 5000 series gpus will have affordable options that can work.
Not only will it struggle from a pure performance perspective; its low VRAM will cause unrectifiable issues even at the lowest settings in several new games.
I'm going PC next year, selling my PS5 and Series X. I bought a ROG Ally and fell in love with the PC platform. My question is: can I play ultrawide on my 65" OLED TV? Sorry for the ignorance.
As @Dystopikachu said, you can use your driver control panel to change the resolution to display an ultrawide image with black bars at the top and bottom, the same as a movie shot in cinematic widescreen, though I would probably choose 3840x1600 for the resolution, as that will not cause any blurriness since it matches your TV's horizontal resolution. Make sure you choose the 1:1 pixel setting so you don't just get your image stretched tall.
For my situation with the 5800X3D, the only GPU that is really CPU-bottlenecked is the 4090, with the 7900 XTX sometimes hitting it. What racing games are you seeing those bottlenecks in?
GPU companies should start pushing out software that lets you render those edges at a lower detail or something, since most of the time it's just filling your peripheral vision anyway.
When it comes to gaming, is ultrawide really worth all the hassle? Having to worry about performance, paying more money for the monitor, and whether the game properly supports the setting doesn't seem like a good trade-off just to get some more screen on the sides, which might even make the experience worse for many people. Why not go from 1440p to 4K instead, where you still have to pay the extra money and need a better GPU, but the visual benefits are more noticeable, and you don't get the cringe widescreen that might cause nausea, nor do you have to worry about whether it will actually work in certain games?
I have a 34" 1440p ultrawide and I still haven't found a game made in the last 10 years that doesn't support this resolution. And even for older games that don't support it, there's usually a mod that makes it work (Skyrim for example). I personally like ultrawide because I play a lot of games where there's a lot of UI navigating (RTS, city builders, management, simulation). I no longer want a non-ultrawide because the games I play need a lot of info to be displayed at all times. And yes, I have tried multiple monitors. I barely used the second one while I always have 2 windows open on my ultrawide. Watching content is the real issue, yes, but I got used to the black bars when in fullscreen. Just like we got used to watching movies with the black bars on top and below. And I sit slightly more than an arm's length away from the screen so there's absolutely no risk of feeling sick.
@@metalface_villain Unfortunately I haven't. But I guess it wouldn't be incredibly different from what I currently do, as I would be able to place smaller windows comfortably on a 16:9 4k display as I do on my 21:9 1440p. Same goes for UI elements in games.
I made the switch to 3440x1440 4 years ago and enjoyed it, till recently going back to a larger 1920x1080, and I don't regret it to be honest; it's just a lot easier to see everything at my age lol
Is it that bad if you consider you can get them for as much as, or even less than, the 4080/4080S? Also, it's kinda strange to compare an 800€ GPU with a 1000-1400€ and an 1800-2000€ GPU with RT always enabled.
Ya, it has been hit or miss with a lot of the competitive online titles that started before ultrawide was a significant thing, whether or not they update themselves to support it.
Great video, thanks a lot... we need more of these niche videos. Can you please test the Dell S3422DWG, the best budget VA 34" ultrawide monitor of the past few years, against the much cheaper 2024 monitor, the AOC CU34G2XP? The AOC is almost 50% cheaper in my country, but it's supposed to have some HVA panel that almost removes all the black smearing and ghosting problems that all VA panels have. I think this would be super interesting to a lot of people, as they are the 2 best budget 34" UW monitors.
It's having to render about 3.7 million pixels every frame versus 5 million, or 26% fewer (the same gap is a 34% increase going the other way). In reality the extra pixels mostly sit on the less-used sides of the displayed content, so the difference should be 10-20%, not the full 26%. I recently picked up a new 16GB RX 6750 XT for under $300 and it handles almost everything at 1440 with no issues (with a 5600X and 32GB DDR4-3200).
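For reference, the exact pixel arithmetic behind those two percentages; the same gap reads as ~26% or ~34% depending on which resolution you take as the base:

```python
# Pixel counts behind the "26% vs 34%" figures.
p_1440 = 2560 * 1440  # 3,686,400 pixels
p_uw   = 3440 * 1440  # 4,953,600 pixels
print(f"1440p -> ultrawide: {p_uw / p_1440 - 1:+.1%} pixels")  # +34.4%
print(f"ultrawide -> 1440p: {p_1440 / p_uw - 1:+.1%} pixels")  # -25.6%
```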
I just want a little 32" flat-screen 4K OLED, but I sure as hell can't afford a grand for a fricking monitor that I couldn't even utilize with my current GPU, a 2080. I'll wait until the prices inevitably come down to reality.
"hundreds of dollars of needless spending" is pretty much the entire hobby
haha, it feels that way sometimes.
Before I bought my 3840 x 1600 monitor I used to look at benchmarks for 1440p and 4K and then just figure out the average FPS between the two.
Ya getting good benchmarks for ultrawides has been hard for a long time but I'm trying to fix that.
Ya, me too. I recently upgraded from 3440x1440 to a 4K, but it has a 3840x1600 mode which I use for racing and Witcher/Fallout-type games. Best of both worlds, and automatically detected in the driver, so no futzing after switching. First video which actually gives a closer idea of where it falls between the two. Thumbs up.
Same, I compare 4K to 1440p and pick the numbers in between them for a rough guess. It's worked well for the last 10 years I have had my old 2560x1080 (calculated from 1440p-1080p) and then my 3440x1440 monitor, and if I get better performance, all the better. I am a firm believer in "better to have it and not need it, than to need it and not have it".
Who complains that they have performance left on the table? Unless of course the game runs like crap and all your hardware is snoozing. I have always had odd-resolution monitors; even back in the 90's I used to have to modify/hack games to get them to work on my CRT monitor. I guess it says something about me lol
I upgraded to a 4070 Ti Super from an RTX 2080 because the 8GB of VRAM was becoming a problem. Even DLSS couldn't fix that in a lot of games. As I like to keep my graphics card for a while I didn't feel 12GB would be enough, and 12GB can already struggle in a few titles, which isn't a good sign for the future...
@@tonyc1956 Which 4K monitor did you upgrade to?
@@Rafael_B Samsung Odyssey Neo G7 43". It had some reported issues when it first launched, which I fortunately missed as I bought it in early 2024 (for half MSRP). I applied the latest firmware right after hooking it up and am very happy with it. I watched a few YT videos on tweaking it for HDR, which got me in the ballpark to tweak it further. I'm only running it at 120Hz as that's where 4:4:4 chroma support ends, but my graphics card usually can't even push it that fast, and I'm in need of a GPU upgrade now. Ah, I've opened the proverbial can of worms...
This guy is actually the goat, making videos for a niche community which will hopefully grow!!!
Thanks, I do believe the ultrawide community will continue to grow; it seems 9/10 ultrawide users never go back to standard widescreen monitors. And with so many new sizes, resolutions and price points, the ultrawide ecosystem will keep it going.
Once you go ultrawide you can never go back... For the first time I can play first person without feeling like I have horse blinders wrecking my peripheral vision, and third-person games feel even more cinematic than ever. The larger problem is the idiot game developers that never do their cutscenes in ultrawide, meaning I usually have to go into the game's EXE with a hex editor and hack it so the cutscenes are ultrawide too. Sure, it's simple enough to do, but that's all the more reason it's stupid that the game developers don't do it with a few simple lines of code.
@@longjohn526 Preach. Even when the gameplay is flawless in ultrawide, those stupid 16:9 cutscenes just kick you right out of the action.
Yup, needs a lot more subscribers if you ask me!
Been using 3440x1440 for 4-5 years now, currently an Alienware oled with a 4090. I truly believe 3440x1440 is the sweetspot.
You have a killer setup, and I would agree that 3440x1440 is kind of the sweet spot resolution right now: almost no card is too powerful for it, yet a lot of cards can still work very well at it. And it has some damn fine monitors available whose prices aren't too outrageous.
Same setup for me, but I'm very tempted by the new 32 inch 4k OLEDs because of the uneven wear with 16:9 content
I'm currently using a 27-inch 1440p monitor, and I'm giving serious consideration to getting an ultrawide monitor. It's a bit annoying that not all games support the 3440 resolution, but there is usually a workaround. Question: could you ever go back to a 16:9 monitor after using a 21:9 ultrawide for so long?
@@Lennox032 Personally I'm not interested in going back. For a brief time there I was eyeballing one of those 42" LG TVs, but then the AW3423DW OLED came out and I lost all interest in them.
@@Lennox032 I switched my main monitor from an LG 27" IPS 1440p 180Hz to an LG 34" IPS ultrawide 160Hz about half a year ago; I would never go back. Having that extra space is simply amazing. Very minor downsides are cutscenes being 16:9 in games, and most YouTube content being 16:9, but there are extensions where you can toggle the aspect ratio / crop the video.
Thanks a lot for this video. I am currently looking into buying a new graphics card for my 3440x1440 screen, but most fps examples I find are for 1440p, and this helps me understand where I would roughly end up. Initially I was making the same mistake of multiplying the fps by a factor based on the resolution quotient.
I have both individual and head-to-head videos for all the graphics cards at the resolution you're interested in. They cover both average and 1% low performance, as well as show DLSS and FSR performance for ray tracing.
I made the jump to my 49" ultrawide and definitely see a hit with my 3080 FTW3, but mainly in Cyberpunk.
Ya, doubling your resolution like that will definitely give you a hard time in Cyberpunk. It really hurts you for every pixel added, especially when ray tracing.
Going from 1440p to ultrawide on my 3080 in Cyberpunk at ultra with RT, I lose about 10 fps.
But it looks so good on the 49" lol. I can't go back.
From what I have observed through numerous reviews online over the last few years, you need approx. 73 FPS at 1440p to get 60 FPS at 1440p UW. Interestingly enough, that almost exactly reflects your findings :D
Glad I could help reinforce your personal research
I think if we set aside GPUs that are far too powerful or too weak, it's fair to estimate the fps we lose with UW at 20%.
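Putting the two comments above together: the 73-to-60 rule implies about a 22% cost, in line with the ~20% estimate and well under the raw 34.4% pixel increase. Just arithmetic, no benchmark data:

```python
# The 73 fps -> 60 fps rule of thumb implies a ~22% cost, well under
# the raw 34.4% pixel increase from 2560x1440 to 3440x1440.
pixel_ratio = (3440 * 1440) / (2560 * 1440)  # 1.344
fps_ratio = 73 / 60                          # 1.217
print(f"pixels: +{pixel_ratio - 1:.1%}, observed cost: +{fps_ratio - 1:.1%}")
```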
Very informative. I noticed this while playing Star Citizen at 3840x1600, and while using all 3 of my screens for a 6000x1600 resolution I saw only a small drop in frames (6800 XT and 5900X, 32GB RAM). Unfortunately the screens on the sides are not FreeSync, so I get some tearing, and they're not full screen because of their resolution.
I'll be doing a 4K vs super ultrawide video next, and I'm very curious whether I'll get a lower margin than the 12.5% resolution scale would suggest when just scaling up, or if the nature of the super ultrawide showing a 50% different scene will make it actually scale better than the 12.5%, like you were seeing using a multi-monitor display.
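For reference, the 12.5% figure is straight pixel arithmetic between the two resolutions:

```python
# 4K has exactly 12.5% more pixels than 32:9 super ultrawide.
p_4k  = 3840 * 2160  # 8,294,400 pixels
p_suw = 5120 * 1440  # 7,372,800 pixels
print(f"4K / super ultrawide: {p_4k / p_suw:.3f}")  # -> 1.125
```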
I've done various triple-monitor setups for the last 15 years. AMD used to have far more advanced driver support for triple monitors. Back in the days of the AMD 5850 1GB GPU in 2009, I had triple matching 1080p 60Hz monitors and it worked great. I then tried an ultrawide 1080p in the middle with two similar bezel-matched LG 16:9 1080p monitors on the sides. I never could get away from the tearing or get a proper combined resolution without black bars.
I now have three 27" 1080p 240Hz LGs combined on my wife's PC, all on DisplayPort, and even though one of them isn't the same model they all play nice together due to the matching resolution. They all run 240Hz with VRR working correctly on my 6800XT. I even have a 4th 27" 1080p 144Hz installed as the center top monitor. The default configuration tool in the AMD driver doesn't allow for a proper config, always placing the 4th monitor in a quad-wide setup instead of leaving it standalone. What I learned from a YT video is that AMD has an Eyefinity monitor configuration application inside the AMD driver folder that you can run, which actually did provide a bit more config options. It likely won't fix your mismatched-resolution issues with black bars, but it did allow me to configure my setup properly.
I had to select 3x1 and then pick which of the 4 monitors I wanted to set up. It then set the 4th monitor as a standalone, and it now properly positions itself on top of the combined triple setup.
The built-in config application is called EYEFINITYPRO.EXE and can be found here: C:\Program Files\AMD\CNEXT\CNEXT\eyefinitypro.exe
I love PLP setups. Used to have HP 20-30-20 PLP back in the day. Now also 3840x1600 but still interested in side monitors. What 1600p monitors do you use for the portrait monitors?
This is why I also factor in Panel size + Seating distance into my performance budget. It saves me a lot.
My usual seating distance of 2.5 - 3 feet away allows me to use 2560 x 1080P @ 29" quite comfortably, including with the panel size.
With a 3440 x 1440 monitor, I'd need to sit much closer but would lose the peripheral-vision benefit of having the full monitor in view without needing eye movement. I would also need to use 125% Windows scaling @ 3440 x 1440 to see better.
Cards like the 6950 XT ($499) / 7900 GRE ($550) become much more viable for me versus needing $1000 cards like the 7900 XTX to satisfy the nearly 5 million pixel budget of UW1440p. UW1080p is nice.
Went from 1920x1080 24-inch to 2560x1080 29-inch and it was all I wanted. The performance hit was there, but manageable (i7 8700 + RX 6600). Upgraded to 3440x1440 in May this year and also upgraded to an R7600 + RX 6800. It is wonderful. BUT: while 1440p is great and clear, 1080p UW does not look bad either. It really is something to consider when you don't have much budget, and a 29-inch UW works wonderfully next to a 24-inch 16:9.
If you really, really are drawn to UW, get it. Do not get it if your system already struggles at 16:9 resolutions. But 1080p UW exists, and it's not that bad.
I would also like to see 3440x1440 vs 3840x2160. Watching reviews of whatever new hardware comes out, I've always estimated about half of the difference between the 1440p results and 4K. I actually have no idea what the difference should be.
3840x1600 is a wonderful resolution that doesn't seem common enough in monitors.
Those 45 ultrawides released in 2022-2023 with 3440x1440 resolution really should have been 3840x1600 instead.
So my 4070 can actually play at this resolution... since my esports days are far behind me, I'll go with ultrawide then.
Ya, the 4070 will be fine at this resolution; you will just need to leverage DLSS in some of the ray-tracing titles to get the frames you want.
And those who are well won't go with an ultrawide? I don't agree.
You didn't show 1% lows and 0.1% lows. I have a 34" Alienware and a 4090, and the immersion is much better on ultrawide.
Yup, especially racing games, and I prefer shooters in ultrawide also.
That's my plan. I recently bought the AW3423DWF and am now waiting for the 4090 to go on sale.
Crazy how many people have the same monitor including myself.
Same! But with a 4070 Super, still an amazing experience.
Great video. Liked and subscribed.
Not as much good info about Ultrawide as I wish, but this guy is helping.
3440×1440 is amazing. My GTX 1080 does really well even in newer games. I can run medium to high graphics on Forza Horizon 5 and average between 50-90 FPS.
I have a 144Hz monitor, and although I don't really ever hit it in games 8 years old or newer, I rarely get anything below 90fps, and it's perfect as is.
Keep having a blast with your ultrawide.
This is the kind of data I've always wanted to see but no one else seems to do it. Subbed!!
3 out of 5 people should go ultrawide, the other two being competitive FPS gamers and super casuals.
I prefer ultra height. Everything is closer, and there's not much information on those sides anyway. You are closer to the action. Squarer ftw.
Very nice video, keep up the good work
Thanks, will do!
Thanks for taking the ideas. Love this video and truly, this was very helpful. Love your channel to death
Thank you very much.
@@ultrawidetechchannel
One more question, is it possible to consider the power consumption of those cards for future videos like this? This way we can compare which resolution is more energy efficient or to have more data to compare in addition to the fps.
Thanks again for all your work and dedication, you really are the only YouTube channel creating content like this
@@lucashernandez922 I don't really have the equipment to properly collect that data, and on the tiny bit of money I'm getting from YouTube I'm not going to be able to invest in it for a while. Eventually I would love to be able to test everything, but that is still a ways down the road.
@@ultrawidetechchannel
Still incredible!
Thank you for your dedication and for your videos
Brilliant video. I played FF14 at 1080p with an FX-8350 / R9-290, then at 2560x1080 upgrading to a 980Ti, then at 3440x1440 with a 6850K / 980Ti x 2, then a 1080Ti followed by a 2080Ti, followed by a 3900X / 2080Ti, and then at 4K, first on a 13700K / 6800XT and now a 7900XTX. As one might imagine, each upgrade in CPU yielded FPS gains, and each upgrade in GPU allowed me to use higher settings and higher resolutions. But how much I would gain each time I upgraded was never predictable. This helps explain why.
Ya, every time you upgrade, the GPU isn't necessarily improving every aspect at the same rate, and changing resolution only complicates the variables.
Well said. Using the math in your video, as well as going one tier above your typical 1440p GPU of choice, is a good decision-maker. No need to overspend on some cards unless ray tracing or other features are important, because on pure rasterization performance, or even with some upscaling, you could easily get 100+ in a lot of these titles.
But would you still recommend the 7800 XT for 1440p ultrawide if you're going purely off rasterization performance/FSR without ray tracing, and high/optimized settings instead of ultra? Or is it worth spending the extra for the 7900 XT in the long run? I still haven't seen many 7800 XT ultrawide benchmarks out there despite it being months now since the card's release.
Sadly, I don't have the 7800 XT so I can't say definitively, but personally I would like to have the extra power of the 7900 XT at the ultrawide resolution, just so I know that high refresh rates are almost always going to be on the table if I feel like a game will really benefit from them. At standard 1440p I think the 7800 XT would satisfy, but for the ultrawide I would still want the 7900 XT, the same way I recommend the 4070 Ti over the 4070 at this resolution.
Great video, but you're not a loud talker. That's completely fine, but get closer to the mic, please. :)
Edit: I thought the sound that plays when new bars appear was my HDD scratching, scary stuff.
Thanks for the feedback. I do most of my recording while the kids are sleeping, so I can't get too loud. And sorry for the tech scare.
The only problem I have with ultra wide is the fact that now every lesser quality monitor I look at has poor quality in comparison. Once you’re playing in 5k everything else looks like shit
Even if you go 1440p QD-OLED? That's wild.
@@AshLordCurry You know ultrawides also come in OLED, right?
U guys must be rich
Is there a budget IPS 3440x1440p 144hz-170hz monitor you would recommend? Just upgraded to a 7800x3d and 4080 super.
While I don't have any hands-on experience with a true budget monitor, the KOORUI 34-inch ultrawide 165Hz amzn.to/3Pl2I2M is getting some good press from several outlets and has a pretty satisfied Amazon user score. It's not IPS; it's using what I think is one of Samsung's fast VA panels, but those are very good. Here is a slightly more expensive LG option amzn.to/3veCNCP, again I haven't used it myself but have had good experiences with my LG monitor and TVs, and here is the Samsung G5 amzn.to/48TR1qt, the quintessential midrange ultrawide.
@@ultrawidetechchannel Awesome recommendations! Just found your channel and its pretty great!
@@ajhylton thanks
LG 34GP83A-B
Think it's a shame to cheap out on the monitor. You have a killer setup dude.
I love you, thx. Trying to make sense of benchmarks to get an idea of where my ultrawide frame rate would land was a nightmare.
I went to 34" a couple years ago but accidentally broke my monitor, so I had to go back to 27" for a while. I hated it after trying ultrawide, so now I'm back with a 34" monitor lol. But finding benchmarks for it isn't the easiest.
I'm here trying to help you out. I won't be able to get to everything, but I'll try to do the best I can for ultrawide users.
@@ultrawidetechchannel Thanks!
Thank you, this is very helpful! I myself have just upgraded to a 4080 Super and was seriously considering the 4090.
However, after trying out the 4080S on my ultrawide AW3423DW, I'm really happy I didn't pay extra for the 4090 - everything runs great at perfect FPS and high graphics settings.
Ya the 4080 Super is plenty for that resolution and the only reason to really go more is for full path tracing.
Your 4080S could handle 4K; why stick with 1440, if I may ask?
@@yasink1995 I really like my OLED AW3423DW, so I never thought of replacing it with a 4K screen. Plus a 4K OLED would cost extra money and it would hit my fps.
Thank you very much for this review that no one else ever seems to have done or cared about. There are never any game reviews using ultrawide screens. I've been using ultrawide 3440x1440 for many years now (a 60Hz Samsung SE790C for 8 years and a 144Hz AOC CU34G2X for 4 years) and I love them to bits. I always had a feeling I got better framerates than extrapolating from the claimed 34.5% performance drop would suggest; my games ran faster than I expected based on game reviews with my graphics cards. Glad to see it was not just wishful thinking on my side.
Very happy with the present gaming screen. However, I would like to have a 3840x2160 screen beside it for office work and the few games that look better on a non-ultrawide screen.
Hate that I can't really play Skyrim on an ultrawide screen... and for that a 32" 4K-ish screen would be awesome.
Thanks. I do have some game reviews, but it's hard and expensive to keep up with game releases as I don't get any press codes or anything like that. I do try to tackle all the big releases that hit PC Game Pass on day one though.
You can mod Skyrim to handle ultrawide.
This is such an excellent video! Ultrawide is very rarely covered so well. I appreciate this a great deal!
Thanks for the video! I was deciding between the MSI 271QRX and the 341CQP, specifically around how bad the performance hit would be.
You made a great point that the extra pixels on the sides are not seeing much action, so it actually doesn't need that much GPU!
I'm glad I could help you make that decision with more confidence.
This is a good video. I can't go back from ultrawide now. Just got an OLED 240Hz 3440x1440 monitor, it's so unreal!!
When I upgraded to a 1440p ultrawide, my 1070 was having a hard time driving the Total War: Warhammer series. I had to upgrade to a 2080 Ti to get a decent framerate.
x70-series cards would be the bare minimum for getting good fps at UW1440p, though I'd recommend x80.
Great content, I subscribed.
Thanks for the sub. Yeah, the 4070 series at that resolution is providing good performance today, but likely not-so-good performance in a few years. Whereas the 4080 will give you amazing performance now and good performance in a couple of years.
If you want to go ultra wide, be comfortable with needing to either mod, config edit, or run third party apps to make it work in a lot of games
Honestly, that's my main hesitation. I know a lot of newer games are doing better with natively supporting it, but I also play a lot of older games, and I'm debating whether the additional hassle would be worth it to me, or if I should just get a normal OLED instead of a nice ultrawide.
@@slayerdwarfify once you get used to it, it’s not that bad. But, it can be annoying when you want to play a game that just released, but doesn’t support it, and there’s no fix available for a couple days to a few weeks or possibly months. Or if there’s no support at all, and you’re forced to play the game with black bars. I hate that even more. Actually there’s even worse 😬 when there is no support, and instead of black bars, it’s white bars! That sucks
Do you know how I can record my gameplay at 16:9 while using an ultrawide monitor? I'm having problems, man.
If you just set the internal game resolution, while in fullscreen mode, to whatever resolution you want your recording to be, then it will record at that resolution even if the game shows the screen all stretched out. The recording should be in standard 16:9.
This video makes me hesitate a lot between a 3440x1440 OLED and 4K…
If my performance will decrease by switching to ultrawide (even a little), would it be smarter to move to a 32" 4K for a sharper, crisper image?
Nice video btw 😊
I'm not quite clear on what you're saying, but 4K (3840x2160) would be significantly harder to drive than ultrawide (3440x1440). In games, the sharpness advantage of 4K will be less noticeable than the frame rate drop, but on the desktop and in other apps the sharpness advantage will be clearly noticeable.
In my experience, there isn't a massive visual difference between the image clarity of my 3440x1440 Alienware OLED and my 4K TV. What made the biggest difference was the extra width for immersion and going from IPS to OLED. I used to always lug my computer upstairs to play games in 4K, but since the ultrawide, I never do, because I much prefer the aspect ratio and the image clarity is super high fidelity. The difference is nil.
@@ghulcaster I bought an MSI QD-OLED 3440x1440 and yes, I fully agree, the ultrawide ratio is awesome and I don't regret my purchase.
I can notice less clarity in games because I just love pausing and looking at the landscape, but it doesn't bother me a lot (except for games with bad TAA). When a game is a little blurry you can use the NVIDIA filter and enjoy the ultrawide clarity :)
If you overspend and find out the GPU can't keep up, maybe your monitor can do picture-by-picture, so one 5:9 region can be the OS "second monitor" and the remaining 16:9 can be for "narrow" widescreen 1440p. You might regain some FPS that way. Just be sure to auto-hide the taskbar and use a rotating desktop background to reduce OLED burn-in in the OS region :)
While an interesting idea, I think just turning up DLSS or lowering the settings is a more practical option.
What a helpful video! I'm actually planning to buy another monitor and I'm considering WQHD, so this comparison was very useful for me. Thanks to the author for their effort.
Glad it was helpful
THANK YOU SO MUCH FOR THE TESTS!!! Everyone has been telling me that 3440x1440 makes no sense without a top-tier GPU, but after your vid I decided to go for ultrawide. I picked an Iiyama VA panel since it has nearly no ghosting (I hate IPS glow) and paired it with a 7900 GRE (undervolted, underclocked, 45-55°C at 100% usage). Every game runs super smoothly, at 100+ FPS ^^
Good video. The performance hit isn't quite as bad as the calculated 34.4%, but losing about 20% on average is still pretty significant. My 6700 XT does well enough for me at 1440p, but it would likely struggle with ultrawide. Something to consider if I move up in performance with a future GPU upgrade. Also worth considering is whether you like to record your gameplay and upload it to YouTube. Most people have 16:9 monitors and probably won't like the black bars when viewing 21:9 content. 18:9 (2:1) content seemed popular for a bit with Netflix shows; I wonder if they'll ever make monitors in that aspect ratio.
Superb video! My 3080 and 165hz ultra wide salute you and are saying "it's not that bad!"
I always wondered how much FPS dropped from going UW. Thank you for testing
My pleasure! It's my goal to fill in the gaps in the gaming community's ultrawide knowledge.
Turn off RT if you're at all concerned with the performance difference between 16:9 and 21:9. There's still nothing in 2024 that's worth the performance degradation of RT if you don't have performance to spare. I'm sure there are a few people out there who won't agree, but Cyberpunk running at a higher resolution, without FG, at a good framerate is the better experience vs FG, high latency, RT, and a lower base resolution. I'm sure there are a couple of walking simulators where it's debatable, but for an FPS, it's a no-brainer.
This is a very good video to set the record straight for resolution scaling between aspect ratios.
That's pure opinion lmao.
@@joeykeilholz925 No, he's spot on. If you want high res and RT, then why are you watching a video on frame rate? If you can hit 60fps, then why would you care?
@@MrFerrari458gto Maybe because he has a GPU that can hit 90fps with RT and he doesn't want to compromise that? RT or no RT is irrelevant when it comes to wanting to know the performance hit from upping the resolution. He watched this video for the exact same reason you watched it: he doesn't want to compromise his framerate. Though in his case he also doesn't want to compromise his graphics, unlike you.
Funnily enough, I went from a 1440p CRT that I got for free straight to ultrawide 1440p.
Ran out of VRAM immediately.
A month ago I weighed the idea of changing my RTX 2060 for an RX 7700. But I figured I would not notice any significant difference since I had a 1080p 60Hz monitor. Very basic. The RTX 2060 already allowed me to play 98% of my games at ultra settings while achieving 60 fps; an RX 7700 would not have made any difference. My real bottleneck was my monitor.
So I went ultrawide 1440p 160Hz.
In the case of Helldivers 2, which I played at high settings, the change to 1440p took a heavy toll on the poor 6GB of VRAM on my GPU, so I had to lower the settings and use upscaling to barely achieve 60 fps. But in the end, you're wondering: was it worth it?
HELL YES. I CAN SEE MORE!
I felt the same way when getting my first one. I always end up buying more monitor than I have GPU, then upgrading to accommodate the monitor.
Just got the new Alienware OLED ultrawide, an absolute beast, although on certain settings I get headaches, so I had to do some tinkering. The last ultrawide I had was the 2015 Asus PG349Q; they have come a long way!
That's a sweet monitor; hopefully you've solved your headache problem.
I've been gaming on Ultrawide since 2019. I'm currently using an Alienware AW3821DW that I bought in the beginning of 2022.
It can be irritating that I still have to do a hex edit or some other modification for some games, but it is what it is.
Most new games have no problem with 21:9 for gameplay, but cutscenes often lamely remain 16:9.
Luckily, in most games I've played I've found a way to edit even the cutscenes to display in ultrawide. Pre-rendered scenes are a different story though. @@ultrawidetechchannel
The true test for me would be lending ultrawide monitors to FPS pros. The extra vision alone would be great, especially for battle royale games, but I'm very worried about the supposed image warping at the edges. Besides that, having two monitors where you can game on one and watch something on the other is very nice. Right now I have a performance monitor and a picture-quality monitor. I would love to have just one monitor for convenience's sake.
You may be interested in checking out this video of mine ruclips.net/video/GvyKtMzInT4/видео.html
So, I've got a 3440x1440 monitor paired with a 3070. I've been seeing the limits of just 8GB and I'm not sure where to go from here: a 4070 TiS / 4080 (S), or, as much as I enjoy it, sell the monitor. Any thoughts?
The 16GB on the 4070TiS and the 4080S will keep you safe at that resolution for quite a long while. Even if you were to go down to a 16:9 1440p monitor it wouldn't mean you would be safe with only 8GB of VRAM. Some newer games are still going to get you.
@@ultrawidetechchannel OK, that's very helpful, thank you. The more people I talk to about this, the more I'm leaning towards the 4080S.
I can't unhear "Ulta wide"
*whispers* ultra...wide.
I'm on two ultrawides, an OLED and an IPS, plus a third 24" IPS. My 3080 Ti handles the games I play just fine with max settings and ray tracing, but the latest games I've played are the RE4 remake and RE Village. I've been meaning to replay Elden Ring and start Cyberpunk; maybe I'll notice it more then.
As long as you use DLSS you should be able to avoid most of the VRAM pitfalls.
What are your thoughts on the effect of changing the field of view? Theoretically you should run a bigger FOV angle on an ultrawide. And in theory this could cause lower fps.
From my tests and a little looking around on YT this performance impact is a lot less than I expected. So, if you didn't factor that into your test, I think you've been right to do so, but I would be interested in your thoughts on this.
I used just the base settings you get when selecting a preset, as most people will never touch the FOV slider in a game. In most games I don't bother with it at all; I don't feel 21:9 is wide enough to need an adjustment. At 32:9, depending on what kind of experience you want to get out of it, FOV may be a necessary setting to play around with.
I have a Westinghouse 34" ultrawide being pushed by a 5800X3D and 6900 XT. The setup works pretty well. Cyberpunk runs at around 65 fps with ultra settings and high RT settings.
I kept waiting on you to say "ultrawide" to hear how you're gonna say it next 😂
Surprised the 4070 held up so well.
I got an ultrawide monitor and since then I can't imagine life without it.
It is also worth mentioning that on an ultrawide you can activate VRS (variable rate shading) without really noticing it, increasing performance even more with basically no quality difference.
The fact that my 3060 is handling high-to-ultra graphics at ultrawide 1440p and hovering between 40-80 fps in every title pretty much proves that you don't need overwhelmingly good specs to run it.
Right now it's the sweet spot resolution for immersion and high FPS.
That means the RTX 4060 Ti should do fine.
@@kasadam85 yes, the 4060 ti is by no means a bad card, it is just very expensive for what you get.
It's a good thing you don't own an OLED 4K monitor; those need at least an RTX 4080 to get 4K 120 on medium graphics.
@@kasadam85 If you get the 16GB version, that is.
I was surprised to see my 4060 still got reasonable FPS when I switched to UW.
The uptick isn't that harsh, and going from Ultra to High or High to Medium will offset most of the FPS loss.
I just upgraded to a 4070 Super from a 1080 Ti and am planning to get the Alienware AW3423DWF. I've seen some testing where certain games (Alan Wake 2, for example) were reaching towards 12GB of VRAM. Should I be worried about VRAM?
12GB should be fine at that resolution for everything except path tracing in Cyberpunk and Alan Wake 2. Regular ultra ray tracing, especially with DLSS on (which you will definitely be using), will still stay within your memory budget.
@@ultrawidetechchannel Thanks again for the response!!
My brother has changed his monitor to a 34-inch ultrawide 1440p OLED. In a few weeks I will build him a new PC, and I am still hesitating over whether to recommend the 7900 XT or the 4070 Ti Super. The 20GB of the 7900 XT seems like a great option for a monitor like this, but I know that Nvidia is a safe bet: DLSS, RT, the Nvidia encoder, etc. What would you recommend in this case?
Go with the 4070 TiS, or with whichever one has the better price (promo).
All AMD cards are shit unless you are playing one specific game that works with one specific driver.
@user-vl4iq7bj5e
I have yet to find one that doesn't run on my 7900 XTX.
Great video!
Glad you enjoyed it
The same applies to 4K and even 8K: increasing resolution does not increase load across all aspects of the render pipeline equally. Even if you calculate pixels rendered per second, you're almost always getting the best value using a higher-resolution display.
That's because games are never made for ultrawide, or even super ultrawide 32:9.
The image is stretched more and more the wider the screen gets, and your performance is mostly affected by going from 1080p to 1440p.
That's why my 3070 Ti could still run most games at 90+ fps on my 5120x1440 monitor.
It was struggling in the newest titles though, so I upgraded to a 4070 Ti that I got for cheap, and it is amazing.
I wish I had found this video before I bought my 5120x1440 monitor... I had a 4060 Ti at that time and ended up with a 4070 Ti Super, which I'm really happy with!
I now have a proper GPU for my 5120x1440 and 3440x1440 screens.
I'm glad to hear you ended up with a super pleasing setup.
thank you for testing these for us
Glad you liked it.
I just whipped out my old 1070 FE card because I've been wanting to build a portable mini-ITX system with it. I ran it at 2560x1080 and that thing slaps!! It even did about as well as my 5700 XT at 3440x1440!! Frakkin' crazy! Now I'm wondering why I even upgraded 4 years ago in the first place lol
I built a system using a 4070 Ti and 5800X a couple of years ago. Great 16:9 1440p rig. Then, without much research, I got an ultrawide a month later. I don't regret it, and most games look and run well, but I do wish I had a 4080 or didn't need to lean so heavily on frame generation.
I think the biggest problem is that legacy games and some newer games don't natively support UW. I mean… Elden Ring… what are we doing, FromSoft? And UW "fixes" can trip the anti-cheat.
Am I the only one who thinks these performance numbers are insane? I don't mean anything against the author, but I just looked at a decent 1440p ultrawide build, and judging by these benchmarks a 7900 GRE is not gonna cut it for 60fps gaming. I'm talking about a 3400€ build. And what is going to happen with games 2 years down the line? I was almost about to buy that config (PCPartPicker), but this benchmark has me going nope. I'll wait for the next CPU and GPU gen.
I can't spend 3400€ on a PC that can barely do 1440p non-ultrawide.
Like I say in the video, I made these settings as punishing as possible to try to prevent any bottlenecking other than from the GPU, because I previously found that even on regular ultra settings some of my test games were not able to fully leverage the 4090, even at 3440x1440 ultrawide. So when going to even lower resolutions, I had to make sure that happened as little as possible.
You can get much better frame rates than what is shown here with almost no visual downgrade. There are very few of these games I would play using the tested settings, and I would classify myself as a pretty big visuals snob.
@@ultrawidetechchannel Yeah, but a 7900 XT barely makes the cut now in almost all the games (40fps); what's gonna happen in 3 years?
I'm not gonna change computers every 3 years to keep up just to play 1440p.
So, let's say for a 160Hz 3440x1440 monitor, it requires a 4090 to run properly, right?
Depends on the game and whether you are willing to use DLSS or FSR upscaling and frame gen. If you can get to about 90fps with upscaling, then frame gen can take you the rest of the way to your monitor's max refresh rate.
What I'd really love to know is how much the two side monitors I have next to my center 3440x1440 screen are hurting my gaming performance.
I have a 27" 3840x2160 60Hz monitor on the left, my center 35" 3440x1440 100Hz screen in the middle, and a 27" 2560x1440 (60Hz) on the right.
I usually only have Steam and Discord running on the side monitors, so nothing that should be using a lot of GPU horsepower, but still, it would be interesting to know if I'm losing noticeable performance by having those side monitors on while gaming.
Likely not much, maybe 1-3%, depending on the level of activity in the feed and the number of media shares.
Any way you could get your hands on a 2080TI and run it like 2080 just to see how much of a performance difference it would be if it had more VRAM?
As a 2080 owner who moved from 1440p 16:9 to 1440p 21:9 a few months back, the 8GB really aged the card, performance-wise. It may not have been a 30% performance reduction, but it sure felt like one!
Now I have a 4080S on the way, and seeing over a 100% performance increase makes spending all that money feel a little bit better. 😅
I don't know anybody who has one, and the cost of acquiring one would definitely not pay for itself with the content I could make from it. Right now the 2080 is a good stand-in for the 4060 and any other 8GB card that you might be considering.
You will be delighted to no end by the performance improvement you're getting with the 4080 super.
I have the 4070 Ti. I did some calculations because the claims about how much performance you lose going to an ultrawide just didn't make sense, and thank you so much for proving me right lol. I have a 3440x1440 and I love the extra screen immersion you get while not losing "34%" performance as they say.
I run an ultrawide 1080p monitor and I was considering a 1440p one to pair with my 5800X3D and 7800 XT, and I have been wondering how much of a drag ultrawide 1440p would be on that. But I also typically run games with FSR (and frame generation if available) and no ray tracing at high-to-ultra settings. Now I may be able to make a more educated decision.
Check this video out and just assume like a 15% lower performance. ruclips.net/video/utUF_3z-lPE/видео.html
Does ultrawide reduce the CPU bottleneck because of the extra resolution?
Yes, the extra resolution of the ultrawide puts more pressure on the GPU, so a CPU bottleneck is less likely.
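To put a toy model on that (just a sketch under the simplifying assumption that the CPU's frame-rate cap is resolution-independent while the GPU's falls with pixel count; all the numbers are hypothetical):

```python
# Toy bottleneck model: delivered FPS is capped by the slower of CPU and GPU.
# The CPU cap is roughly resolution-independent; the GPU cap falls as the
# pixel count rises, so higher resolutions push the bottleneck onto the GPU.
def delivered_fps(cpu_cap: float, gpu_fps_at_16x9: float, pixel_scale: float) -> float:
    gpu_cap = gpu_fps_at_16x9 / pixel_scale  # crude inverse-pixel scaling
    return min(cpu_cap, gpu_cap)

# Hypothetical rig: CPU caps out at 120 fps, GPU manages 140 fps at 2560x1440
print(delivered_fps(120, 140, 1.00))  # 120.0 -> CPU-bound at 2560x1440
print(delivered_fps(120, 140, 1.34))  # ~104.5 -> GPU-bound at 3440x1440
```

Same hardware, but the wider resolution moves the binding constraint from the CPU to the GPU, which is why CPU bottlenecks show up less at ultrawide.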
I run a 32:9 Samsung Odyssey. I can never go back. It has its pains, but playing third-person games on it is such a blast.
The performance drop always needs to be weighed against how noticeable the benefit is. I like to compare it to feeding 1080p to my 60" LG OLED and letting the TV handle the upscaling to 4K.
The benefits of a smaller 1440p monitor, regardless of whether it's 16:9 or 21:9, are irrelevant. Bigger is better than smaller and OLED is better than LCD. And that's that.
Cyberpunk is not "the worst performing" game in your ranking. The linear performance change for Cyberpunk shows how well optimized this game is.
Every bit of your GPU is used and nothing is wasted. This is not a trivial thing to achieve, and the developers had to put a lot of effort into it.
I ordered a UWQHD OLED 34”. I've had vanilla 1440 until now. I'm afraid my 4080 won't handle this resolution well.
Your 4080 will do very well on a 3440x1440p ultrawide, no need to worry.
If performance scaled purely with pixel count, 4K FPS would be only a quarter of 1080p.
Yes, that would be the case.
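As a back-of-the-envelope check of that claim (a sketch only; the example framerate is hypothetical, and real games don't scale this cleanly, which is the whole point of the video):

```python
# Naive scaling assumption: FPS is inversely proportional to pixels rendered.
def pixels(width: int, height: int) -> int:
    return width * height

ratio = pixels(3840, 2160) / pixels(1920, 1080)  # 4.0: 4K is 2x wider, 2x taller
fps_1080p = 120                                  # hypothetical measured framerate
print(f"4K renders {ratio:.0f}x the pixels of 1080p")
print(f"Naive 4K estimate: {fps_1080p / ratio:.0f} fps")  # 30 fps
```

In practice the measured drop is smaller, since not every stage of the render pipeline scales with pixel count.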
I think the most important point here, which surely isn't obvious to many: if you are going ultrawide, just be sure to have a GPU with more than 8GB of VRAM. Many games suck up 8-10GB of VRAM already. Be more future-proof and go with 12GB/16GB, etc.
Yes, more than 8GB of VRAM is essential for going beyond 1440p and may become essential even for running 1440p.
*Fun Fact...*
The "Rise of the Tomb Raider" benchmark got a higher FPS score with a hard drive than it did with an SSD. Why? Because many of the textures didn't load in quickly enough to get processed. So you got more pop-in but a higher FPS. Benchmarking can get weird (I'm referring to the lack of VRAM for this video causing difficulties).
Ultrawide only causes a linear change in performance. It's different if you change both width and height, where you get a quadratic change.
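Putting rough numbers on that (a sketch using the resolutions discussed throughout this thread):

```python
# Widening only (2560x1440 -> 3440x1440) scales the pixel count linearly
# with width; scaling both axes (1440p -> 4K) scales it quadratically.
h = 1440
uw_scale = (3440 * h) / (2560 * h)  # ~1.34x pixels, same as the width ratio
axis_scale = 3840 / 2560            # 1.5x per axis going 1440p -> 4K
uhd_scale = axis_scale ** 2         # 2.25x pixels: the quadratic case
print(f"21:9 ultrawide: {uw_scale:.2f}x pixels (linear in width)")
print(f"4K: {uhd_scale:.2f}x pixels (quadratic in the per-axis scale)")
```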
I'm now running a 4060 on a 24" 1080p monitor......do you think I should get a 27" 1440p or a 32" ultrawide? I know I probably won't be able to play at 1440p, but I can still rely on DLSS and frame generation. What should I do?
I think a 4060 might struggle a bit too much with ultrawide, personally. I suppose it partly depends on what types of games you like. If it's games like Fortnite and CS:GO then you would be fine, but if you're playing games like Black Myth: Wukong or Alan Wake 2, I'd stick with 1440p.
People who play 16:9 1080p or 4K don't know what they are missing from 21:9.
I agree
Huh, it would never have occurred to me to do the calculation that way (dividing by the pure increase in pixel count), precisely because that doesn't work between 1440p and 4K either.
My thought process was to say this resolution includes about 25% of the extra pixels you'd render when going from 1440p to 4K. So I'll estimate performance by adding 75% of the 1440p framerate to 25% of the 4K framerate.
I'd be interested to know how far off I am with this approach. I figure it would work pretty well when there are no memory walls involved; it would likely still underestimate cards where 4K is memory-walled but 3440x1440 isn't, but it should do OK where that isn't the case.
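For reference, that blended estimate can be written out like this (a minimal sketch of the commenter's method; the exact pixel-based weight works out to ~27.5%, close to the 25% round figure, and the example framerates are hypothetical):

```python
# Estimate 3440x1440 FPS by blending measured 2560x1440 and 3840x2160 results,
# weighted by where the ultrawide's pixel count falls between the two.
def estimate_ultrawide_fps(fps_1440p: float, fps_4k: float) -> float:
    px_1440p = 2560 * 1440  # ~3.69M pixels
    px_uw    = 3440 * 1440  # ~4.95M pixels
    px_4k    = 3840 * 2160  # ~8.29M pixels
    t = (px_uw - px_1440p) / (px_4k - px_1440p)  # ~0.275
    return (1 - t) * fps_1440p + t * fps_4k

print(f"{estimate_ultrawide_fps(100.0, 55.0):.0f} fps")  # ~88 fps estimated
```

Because the 4K number is a real measurement rather than a pixel-count extrapolation, this interpolation already bakes in some of the non-linear scaling the video demonstrates.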
Ultrawide is the future.
I have a 3440x1440 and I'm thinking about upgrading to 5120x1440 for sim racing... I know my 3060 will struggle, so I hope the 5000-series GPUs will have affordable options that can handle it.
Not only will it struggle from a pure performance perspective, its low VRAM will cause unrectifiable issues even at the lowest settings in several new games.
wonderful video thank you for making it
Glad you enjoyed it!
I'm going PC next year, selling my PS5 and Series X. I bought a ROG Ally and fell in love with the PC platform. My question is: can I play ultrawide on my 65" OLED TV? Sorry for the ignorance.
You can always manually set the resolution to 3440x1440 or whatever you like on PC, but you will have large black borders on a 4K television.
As @Dystopikachu said, you can use your driver control panel to change the resolution and display an ultrawide image with black bars at the top and bottom, the same as a movie shot in cinematic widescreen. I would probably choose 3840x1600 for the resolution, though, as that will not cause any blurriness since it matches your TV's horizontal resolution. Make sure you choose the 1:1 pixel scaling setting so your image doesn't just get stretched tall.
@@ultrawidetechchannel nice will do. Just waiting for the next generation of RTX. Bought a used Ally to start dabbling.
On some games, mostly racing games, I end up being CPU bottlenecked on ultrawide (13700k and 6900 XT for context)
For my setup with the 5800X3D, the only GPU that is really CPU-bottlenecked is the 4090, with the 7900 XTX sometimes hitting it. What racing games are you seeing those bottlenecks in?
@@ultrawidetechchannel EA SPORTS WRC mainly, it could just be something specific to my setup I'm not sure
GPU companies should start pushing out software that lets you render those edges at lower detail or something, since most of the time they're just filling your peripheral vision anyway.
When it comes to gaming, is ultrawide really worth all the hassle? Having to worry about performance, paying more for the monitor, and wondering whether the game properly supports the aspect ratio doesn't seem like a good trade-off just to get some more screen on the sides, which might even make the experience worse for many people. Why not go from 1440p to 4K instead, where you still have to pay the extra money and need a better GPU, but the visual benefits are more noticeable, you don't get the cringe widescreen that might cause nausea, and you don't have to worry whether it will actually work in certain games?
I have a 34" 1440p ultrawide and I still haven't found a game made in the last 10 years that doesn't support this resolution. And even for older games that don't support it, there's usually a mod that makes it work (Skyrim for example).
I personally like ultrawide because I play a lot of games where there's a lot of UI navigating (RTS, city builders, management, simulation). I no longer want a non-ultrawide because the games I play need a lot of info to be displayed at all times.
And yes, I have tried multiple monitors. I barely used the second one while I always have 2 windows open on my ultrawide.
Watching content is the real issue, yes, but I got used to the black bars when in fullscreen. Just like we got used to watching movies with the black bars on top and below.
And I sit slightly more than an arm's length away from the screen so there's absolutely no risk of feeling sick.
@@fluttzkrieg4392 how would you say ultra wide 1440p compares to 4k, in case you have tried em both?
@@metalface_villain Unfortunately I haven't. But I guess it wouldn't be incredibly different from what I currently do, as I would be able to place smaller windows comfortably on a 16:9 4k display as I do on my 21:9 1440p. Same goes for UI elements in games.
I made the switch to 3440x1440 4 years ago and enjoyed it until recently going to a larger 1920x1080, and I don't regret it, to be honest; it's just a lot easier to see everything at my age lol
I don't think people are looking at how bad those AMD GPUs are performing.
Is it that bad when you consider you can get them for the same price as or even less than the 4080/4080S? It's also kinda strange to compare an 800€ GPU with a 1000-1400€ one and an 1800-2000€ one with RT always enabled.
I thought that too. Those frame rates were low for not being 4K
The 7900 XTX performs better than a 4080 Super for $250 less; wtf are you saying, doggy, lmao.
Essentially, if I went to 1440p ultrawide, with an RTX 4070, it wouldn't be THAT much more demanding?
Yes, that is correct
@@ultrawidetechchannel Awesome, that's very reassuring, as a jump to 4K would absolutely destroy my GPU and CPU 😆 Thanks for the reply ^^
I didn't go ultrawide because the game I play the most (Squad) doesn't support it properly.
Ya, it has been hit or miss with a lot of the competitive online titles that started before ultrawide was a significant thing, whether or not they update themselves to support it.
Great video, thanks a lot... we need more of these niche videos. Can you please test the Dell S3422DWG, which has been the best budget 34" VA ultrawide monitor of the past few years, against the much cheaper 2024 AOC CU34G2XP? The AOC is almost 50% cheaper in my country, but it's supposed to have an HVA panel that removes almost all of the black smearing and ghosting problems that VA panels have. I think this would be super interesting to a lot of people, as they are the 2 best budget 34" UW monitors.
It's having to render about 5 million pixels per frame instead of 3.7 million; if FPS scaled inversely with pixel count, that would be about a 26% drop. In reality the extra pixels are on the lesser-used sides of the displayed content, so the difference should be 10-20%, not the full 26%. I recently picked up a new 16GB RX 6750 XT for under $300 and it handles almost everything at 1440p with no issues (with a 5600X and 32GB DDR4-3200).
I just want a little 32" flat-screen 4K OLED, but I sure as hell can't afford a grand for a fricking monitor that I couldn't even utilize with my current GPU, a 2080. I'll wait until the prices inevitably come down to reality.