This is the video I'd been looking for for a while, great work man...
Glad I could help
Thanks a lot for all the work! I've read about the performance differences between those resolutions multiple times, but I never saw actual benchmarks, great insight! If I may suggest a future video, I wonder what the performance difference would be between a 5120x1440 monitor and a 7680x2160 one, like the 49" and 57" Neo G9s.
If I can manage to get ahold of something with that resolution, or manage to simulate it in a satisfactory way, then I'll do a video like that.
Thank you for the video. Leaving a comment for the algorithm. Hope you get more views.
Much appreciated!
This video is incredibly helpful. Thanks so much!
One small tip: make your graphs a bit clearer. The difference between 4K and SUW is hard to see. Maybe white/grey/black?
And I'd love to see a video comparing RT, RT with DLSS/FSR, and no RT.
The white, grey, and dark grey bars are background data to support the colored percentages, so I don't want them to be too contrasty and draw your attention away from the true information I'm trying to convey. Though in some cases that seems to have only amplified how much those bars were scrutinized.
I work with business intelligence and data visualization. These charts gave me headaches.
Just wanted to give a huge thumbs up 👍. You've been doing the tests that us "ultrawiders" have always wanted. Keep it up.
Also, I currently have a 5600X and intended to buy a 3080 for my 3440x1440 144Hz monitor, but never got round to it. Now that the 40 series is out, along with AMD's latest releases, what GPU would you recommend to maximise performance whilst minimising bottlenecking? I really can't afford to upgrade my whole setup from AM4 to AM5, so I don't mind going to the 5800X3D, and what GPU would you recommend to pair with that? I only have a 750W PSU, so I can't run a 7900 XTX or a 4090, which kinda ruins my upgrade path, as it's not really economical right now to get an 850W PSU for my case (I have an SFF PSU).
Thx and keep up the good work. 👍
Thanks. If you plan on getting that 5800X3D, then the only GPU that could really get CPU-bound by that processor would be the 4090, which you've already said is outside of your power budget. With that power budget in mind, I would say the 7900 XT or the 4070 Ti will both give you good gameplay at that resolution without overloading your power supply.
For me 21:9 3440x1440 is the best aspect ratio and resolution. Can't go back to 16:9 and 32:9 is too much for me. 🙈
Absolutely agree! It’s great for work, for watching movies and for games. 🎉🎉
I do like the 21:9 aspect ratio a lot, and I think some of the larger sizes coming out in that aspect ratio may negate some of the advantages of 32:9, which are mainly its sheer size and ability to fill your vision.
Being exactly the right size for movies is a big bonus to this aspect ratio.
Same here. The only way I could see myself going back to 16:9 would be if I got a 4K monitor that's at least 38 inches and is either OLED or Mini LED, so I could run games in ultrawide on it with black bars.
Great data, great info, great channel. I hope your channel is seen as a valuable resource to new adopters and blows up.
Thank you. I'm trying to provide a resource for all ultrawide users giving them the information they lack.
So when I want to compare GPUs for my Odyssey G9 (5120x1440) and I watch videos, I need to look at the performance between QHD (2560x1440) and 4K (3840x2160)? Or maybe 8% below the 4K performance?
If I don't have the game you're looking for in one of my 5120x1440 benchmarks, I would just look at the 4K results and know that you're going to get a few FPS more than what they are showing. I would ignore the 2560x1440 benchmarks.
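The reason 4K results are a good conservative stand-in comes down to simple pixel-count arithmetic. A quick sketch (illustrative only; real FPS scaling depends on the game and GPU, which is the whole point of the video):

```python
# Pixel-count comparison between 32:9 super ultrawide and 4K.
# Illustrative arithmetic only; actual FPS differences vary per game.

def pixels(width: int, height: int) -> int:
    """Total pixels rendered per frame at a given resolution."""
    return width * height

suw = pixels(5120, 1440)   # 7,372,800
uhd = pixels(3840, 2160)   # 8,294,400

# 5120x1440 renders about 11% fewer pixels than 4K, which is why
# 4K benchmarks slightly understate super ultrawide performance.
shortfall = 1 - suw / uhd
print(f"{shortfall:.1%}")  # ~11.1%
```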
So, in other words, no cards are able to do 5120×1440 at 240hz without the help of upscaling?
Upscaling or lowering the settings. I'm sure Fortnite on competitive settings will get close to that frame rate without upscaling.
@ultrawidetechchannel crazy to think we're able to create this tech, but not fully use it yet lol
Omg! Super channel! Found you - subscribed and liked ❤!
I have a little question. I'm now on a 38 inch Dell 3840x1600 144Hz (migrated from an AOC 3440x1440 144Hz). Performance dropped a little, but for me this size is perfect. My GPU is a 4070 Ti 12GB (non-Super). So, what will the difference in FPS be if I go to 5120x1440 later? (P.S. MAYBE I will go, because I love my resolution 😊)
This video of mine covers the 4070 Ti at both resolutions: ruclips.net/video/ZNPlTwWgaqs/видео.html
So, should I upgrade to 5800X3D from a 5800X on 49" 5120x1440p @ 165Hz monitor with 7900 XT ?
1440p super ultrawide is closer to 4K, and it's a known fact that the "3D upgrade" is NOT worth it at that resolution? 🤔
In that situation the CPU will not likely be the limiting factor outside of esports titles, using competitive settings.
If you play games that utilize 3D V-Cache, and if you can swing it, go 7800X3D. Games like Rust, DayZ, and 7 Days to Die, for example, all utilize the 3D cache and will see a performance boost. Going from a 12900K to a 7800X3D, I saw an FPS boost at 4K and super ultrawide in the games that use the 3D cache.
As an example of a game that utilizes 3D cache: in Star Citizen at 4K, I sit at around 80 FPS in a space station and 110+ in space, which is very good at 4K for a game notorious for bad FPS. Compared to that, my 12900K sat at around 50ish in a space station and 80 in space.
But like Ultrawidetech said, not every game will see an FPS increase, BUT I would say most modern titles probably would.
Thank you so much for this video. I gave my old G9 to my father and I'm thinking about whether I want to go back to an OLED 32:9 screen. I just started 4K gaming and have been loving it, BUT for MMOs and sims I truly think 32:9 is a superior aspect ratio. It's so hard to find a benchmark page comparing these resolutions.
Glad I could help
I'm trying to pick up a 44 inch 5120x1440 but idk if my PC can run it 😅
Using an RX 6900 XT Red Devil Ultimate with a Ryzen 7 5700X and 32GB RAM at 2999MHz.
I'm trying to hit 70-90 FPS in AC games, the Total War genre, and also Divinity: Original Sin 2 and Bannerlord.
And around 160-180 FPS in R6, For Honor, Forza Horizon 4/5, Age of Empires, and Overwatch.
Would that be enough, or what would you recommend I upgrade?
If you have any advice, please do tell.
I'm using the MSI Optix MPG341CQR right now, but it's always been a little dream of mine to play on a 44.5/49 inch monitor.
Also, I don't need to play on high/ultra settings; I play most games on medium/high right now.
I have been looking at some reviews on it and it looks amazing.
Any recommendation on which monitor I should get? I'm flip-flopping back and forth between the G9 OLED 5120x1440 and the G8 4K 32 inch. Both have 240Hz, but the G9 is the only one with OLED; the 32 inch has HDR2000.
If you're looking at a non-OLED 4K, even with its higher max brightness it won't have as good HDR as an OLED. Personally I would go for the G9 OLED, but I don't know all your use cases, and OLED may not be suitable if you're going to run Excel and PowerPoint on it 90% of the time.
It seems like your 4K 7900XTX results are very different from other test videos, any idea how come?
From the games that seem to differ and the specific configuration of my system, I would pin it on my motherboard. It is an old 2000-series AM4 board that did not have native support for Smart Access Memory, and AMD cards are much more reliant on SAM for their performance than Nvidia cards. Since SAM only came to my mobo with a BIOS update, older games that seem to benefit from SAM, like Horizon Zero Dawn, have less reliable performance from one driver version to the next.
Hey man for 7600x/7800xt, what fps will I get with 5120x1440p? I mainly play rpg games and sim racing
For the 7800 XT, you could split the difference you're seeing between the 4070 and 4070 Ti, and that is about what you can expect. As for the 7600X, I really couldn't say with accuracy.
Hey man, thanks for this. How about 3840x2160 vs 5120x2160?
4k vs 4k Ultrawide
Samsung monitors tend to have DSC enabled for 5120x1440 though, so you lose out on things like G-Sync, since DSC disables it. It also disables HDR and some other things.
Any monitor at 5120x1440 or 4K that goes over 144Hz has to use DSC if it doesn't have DisplayPort 2.0, because there just isn't the bandwidth. Unfortunately that means pretty much every monitor on the market that goes that high, as DP 2.0 is still exceedingly rare. I was not aware DSC prevented G-Sync, and HDR would all depend on whether it could fit in the allotted bandwidth increase DSC provides.
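The "over 144Hz needs DSC" claim checks out with back-of-envelope bandwidth math. A rough sketch, assuming 8-bit RGB (24 bits per pixel) and counting only active pixels (real links also carry blanking overhead, which only makes the shortfall worse):

```python
# Back-of-envelope link-bandwidth check for when DSC becomes mandatory.
# Assumes 8-bit RGB (24 bpp) and active pixels only; blanking adds more.

DP14_PAYLOAD_GBPS = 25.92  # DisplayPort 1.4 HBR3 max payload over 4 lanes

def required_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Uncompressed video bitrate in Gbit/s for the active pixels alone."""
    return width * height * hz * bpp / 1e9

print(required_gbps(5120, 1440, 144))  # ~25.5 -> barely fits uncompressed
print(required_gbps(5120, 1440, 240))  # ~42.5 -> needs DSC on DP 1.4
```

Bumping to 10-bit HDR (`bpp=30`) pushes even 144Hz well past the DP 1.4 payload, which is why HDR support under DSC depends on how much headroom the compression frees up.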
@@ultrawidetechchannel Yeah, that refresh rate on the monitor seems to be the sticking point from what I could tell as well. But from what I read, even lowering the refresh rate on those DSC monitors would still keep DSC running. Well, at least for the Samsung monitors.
But there's a trick I think you'd be interested in, which someone on reddit came up with. It's for the rog swift pg49wcd, where you can keep HDR, G-Sync and DLDSR running all at the same time.
Requires a few changes in the firmware via CRU to pull it off, but there's not really much to it.
The monitor is 144Hz, which the person who made the Reddit thread about it specified as the reason for picking that monitor.
But anyway, you can achieve that "faux" 7680x2160 on the monitor with DLDSR without disabling everything else, and also while keeping that refresh rate.
Evidently this monitor is an outlier in the market, and honestly it seems like it was designed with this goal in mind, even though they can't officially endorse it of course, since CRU isn't to be messed around with by regular consumers.
However, it shows there's still room for modern monitors to improve, even without the displayport upgrade.
The trick works with both HDMI 2.1 and displayport 1.4 btw.
Requires an insane rig to pull it off though, so it definitely isn't for people on a budget.
@@ultrawidetechchannel From what I read on reddit, there's a 5120x1440 monitor on the market where you can mess around with the firmware to reach 7680x2160 at 144Hz via DLDSR, while also keeping HDR and G-Sync enabled.
From what I read, the Samsung super ultrawides keep DSC enabled even when lowering the refresh rate.
It's not for people on a budget, but it certainly shows that boundaries are being pushed even now, without the higher DisplayPort standard. The trick seems to work with both HDMI 2.1 and DisplayPort 1.4.
If I use 5120x1440 32:9 on the G9 OLED, I can't use HDR???
Definitely considering going to 21:9 from 32:9… my 4090 can handle it, but I can't help but feel like the performance hit really isn't worth it.
I feel you. Ever since getting my 5Kx2K ultrawide editing monitor, while it has been nice to enjoy its size and fidelity, I really feel the inability to hit high refresh rates on it, even with the 4090. If I had the cash and space to set up a separate gaming machine, I would just go for a 3440x1440 OLED monitor and enjoy amazing contrast and high refresh rates.
I have no idea what you're trying to show here, the graphs are confusing. Why are you "calculating" what it should be? And why are you headslamming the desk when toggling to the next calculation haha
The reason for the calculations: in the past, and to some extent still today, there were not enough benchmarks for ultrawide monitors to actually know what performance you could expect at those resolutions. The common advice was to just take the resolution you're using, calculate the percent difference in pixels, and treat that as the performance you should expect. But that is not how it works in reality.
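That old rule of thumb can be sketched in a few lines. To be clear, this is the naive heuristic being critiqued, not how performance actually scales; measured results usually beat it because games are rarely purely pixel-bound:

```python
# The old rule of thumb: scale a known FPS figure by the ratio of pixel
# counts. This is the naive estimate the video's benchmarks are checked
# against; real-world results usually land above it.

def naive_fps_estimate(known_fps: float,
                       known_res: tuple[int, int],
                       target_res: tuple[int, int]) -> float:
    kw, kh = known_res
    tw, th = target_res
    return known_fps * (kw * kh) / (tw * th)

# e.g. 100 fps at 2560x1440 naively predicts 50 fps at 5120x1440,
# since the target resolution has exactly twice the pixels.
print(naive_fps_estimate(100, (2560, 1440), (5120, 1440)))  # 50.0
```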
I have a G9 OLED and a 7900 XTX.
I love and hate it...
Something to consider when buying a 32:9 OLED is that whenever you want or have to play a game in 16:9 or 21:9, it might not be great for your OLED if 50% of the screen isn't used.
If you like to play some retro games or other games where 32:9 isn't an option, maybe don't go OLED.
OLEDs do complicate matters by bringing back screen burn-in worries. For 21:9 it's less of an issue, since so many games ship ready for that aspect ratio, but 32:9 still has fairly significant gaps in its coverage.
How about the 2080 Ti then?
Sorry, I don't have one to test. Wish I did though.
Diablo's issues are because Blizzard deemed 32:9 an unfair advantage in PvP.
Ah, thanks for that bit of info. It still seems like they should update the way the game renders to give you that bit of performance back.
@@ultrawidetechchannel Agreed. And thanks for doing these Super Ultrawide benchmarks.
@@chuckb6782 My pleasure
Watching in 4k.