Thank you so much for doing this. I was looking for this exact subject!
Thanks for watching!
I couldn't have wished for a better tech video this morning as an early adopter of 3440x1440 - what a great surprise! I'm even using the same CPU.
BTW, I've seen someone test a 3080 with an older CPU at 3440x1440 vs 4K on ultra and discover that 21:9 is significantly more CPU-heavy (especially in open-world games). Have you noticed anything of the sort?
Thank you! It makes sense that 3440x1440 would be more CPU-heavy than 4K, because usually the more pixels you push, the more GPU-bound you are. 4K has more pixels than 3440x1440, so it's more demanding on the GPU. If there are other reasons, I haven't come across them, though I haven't thought about it either. I'll keep it in mind, thanks for the info.
@@DontCallTechSupport I was talking about KatimaGaming; he tested 21:9 and 4K in two separate consecutive videos. Ultrawide gets 20% to 50% worse performance than 4K because the 4790K is a harder bottleneck at that resolution. He speculates that 21:9 shows more of the game world at a given time and is therefore more CPU-heavy than 16:9. I was wondering if there's any truth to it, but I can't test it myself, because I don't have a 4K screen and my setup is horrendously GPU-limited anyway.
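To put rough numbers on this exchange, here's a quick Python sketch. The pixel counts are plain arithmetic; the field-of-view part assumes Hor+ scaling (vertical FOV held fixed while the horizontal FOV grows with aspect ratio), which many modern engines use but individual games may handle differently, and the 60° vertical FOV is just an example value:

```python
import math

# Pixel counts: 4K pushes ~67% more pixels than 3440x1440,
# which is why 4K shifts the load further onto the GPU.
uwqhd = 3440 * 1440   # 4,953,600 pixels
uhd_4k = 3840 * 2160  # 8,294,400 pixels
print(f"4K / UWQHD pixel ratio: {uhd_4k / uwqhd:.2f}x")

# Horizontal FOV under Hor+ scaling: 21:9 widens the view, so more of
# the world is on screen at once, which can mean more objects for the
# CPU to cull, animate, and issue draw calls for.
vfov = math.radians(60)  # example vertical FOV; games vary
for name, aspect in [("16:9", 16 / 9), ("21:9", 3440 / 1440)]:
    hfov = 2 * math.atan(math.tan(vfov / 2) * aspect)
    print(f"{name}: horizontal FOV = {math.degrees(hfov):.1f} deg")
```

With those example numbers, 16:9 gives roughly a 92° horizontal FOV versus about 108° for 21:9, which is consistent with the "more of the game world per frame" speculation above.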
Dropping a like on here because you showed some love to ACC. I was a bit worried that ACC would prove too heavy for the 3060 at 3440x1440 and high settings. This shows that I'll be just fine! Thanks!
I was curious about this exact thing, but no other videos talked about it, so great job!
Thank you!
I have to say that I got an RX 6600 XT for what was my local launch price, and it's driving a 3440x1440 75Hz monitor with FreeSync (gen 1). I've had it for two weeks and I'm very pleased with my purchase. Prices still haven't come down anywhere close to what they were at launch in my country, especially for the higher-performing cards; even the RX 6700 XT costs 50% more for a 20-30% performance increase. My reasoning is that a mid-range card will serve me well enough until next gen arrives, when spending more would make much more sense than it does now, and I'll even be able to wait out the inevitable initial hyper-demand that next gen's performance leap will bring. This video helped me greatly, and after getting the new graphics card I had the strong realization that my gaming backlog is nowhere near as demanding as the games that would require a more powerful graphics card for UWQHD (like CP2077 or WD: Legion). Such titles are very few, and testing them is mostly useful for perspective on what performance will look like in next-gen titles; by the time they've grown in numbers, there will (hopefully) be the option of getting even faster next-gen cards.
I've been running an RX 6600 since release with an i7-8700. I run a 29-inch 21:9 at 1080p. It works wonderfully in almost every situation/game. But 34-inch 1440p... I guess I'll wait until an upgrade. I am absolutely fine with 1080p UW, and 1440p would just be an "I want that" upgrade. But since I play a lot of unoptimized games plus modding and sim games, I will have to wait until my RX 6600 non-XT is too weak for almost anything. I seldom run everything on the Ultra preset and always use custom video settings when it makes sense, and yes, upscaling is and will be a thing.
One year later, open-source frame generation (FG) is a thing.
This is the video I needed and didn't think I would find.
I'm upgrading from an RTX 2070 to a 6600 XT (my monitor has FreeSync, and Nvidia's G-Sync implementation doesn't work to a satisfactory standard consistently). I managed to get the 6600 XT for £285 and will be selling the RTX 2070 to make up at least two thirds of the difference in price, so in essence I'm spending around £100, which I think is acceptable for native FreeSync support and a slight performance boost.
Looks like I'll be getting very similar performance, which is expected and what I was looking for.
Thanks for the video!
That's not really an upgrade lol. Should go to a 6700 XT minimum.
These cards are honestly better suited for 2560x1080. I'd get the 6600 XT and a 3440x1440 monitor, run Radeon Super Resolution and let it upscale from 1080p ultrawide back to 1440p. It looks pretty damn good and improves performance.
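To put numbers on that suggestion: Radeon Super Resolution renders the game at a lower resolution and spatially upscales the result to the native panel. A quick sketch of the arithmetic, using the 2560x1080 render target the comment proposes:

```python
# RSR-style spatial upscale: render at 2560x1080, display at 3440x1440.
render = (2560, 1080)
native = (3440, 1440)

render_px = render[0] * render[1]  # 2,764,800 pixels
native_px = native[0] * native[1]  # 4,953,600 pixels

# The GPU only shades ~56% of the native pixel count, which is roughly
# where the performance gain comes from; RSR then scales the image up
# to fill the native 3440x1440 panel.
print(f"Shaded pixels vs native: {render_px / native_px:.0%}")
print(f"Per-axis scale: {render[0] / native[0]:.2f} x {render[1] / native[1]:.2f}")
```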
Great comparison, quick and to the point!
Thank you!
Thanks for solving my problem, bro 🙇
Thanks for doing this!
Thanks for watching!
Thanks for this video! I've got a Ryzen 5800X and a 6600 XT (the GPU was the only option available during the COVID lockdowns), but I am now looking to upgrade my monitor, because it's developed issues lately, and my GPU in the near future as well. Seeing that the 6600 XT can handle 3440x1440 decently, I might be able to stretch the GPU upgrade out a bit longer to get an even better deal.
PS: Could you do a 3440x1440 test with the 6600 XT and Baldur's Gate 3?
I don't own the game, so unfortunately I can't help you. I might pick it up when I get some time, but I can't make any promises. I'm sorry.
@@DontCallTechSupport No worries, thanks for considering it though. And thanks to your video I know my 6600 XT can at least handle 3440x1440, even if settings might need to be lowered. Personally, medium settings is when I slowly start to look for an upgrade.
Why use ultra settings instead of medium? That would be more realistic. I don't run ultra even on my 1080p monitor.
True. Medium and high are much more representative if you consider practical use.
My PC is a Ryzen 2400G + RX 6600. I want to buy an AOC 3440x1440 monitor; can it still play on high settings at 30-50 fps? Next I will upgrade the processor to a 5600.
A: I've had an RX 6600 with a 3440x1440 165Hz display, and games still run playable at ±35 fps. The CPU + motherboard (my guess for the bottleneck) is a 2400G + B450, lmao, my old parts, bruh, in an Axel cube gaming case.
When gaming, GPU Sensor 1 reads 66°C and Sensor 2 can hit ±90°C. Is that normal?!?! Because RDNA2 GPUs have two sensors.
Q: Do I need to upgrade my motherboard + CPU, maybe to a B550M + 3600, to get PCIe 4.0 lanes and a better CPU?
Sensor 2 is the hotspot on the GPU, the hottest spot. 90°C is normal and safe; the max is 110°C if I remember correctly. For PCIe 4.0 you need at least a Zen 2 CPU and a B550 board, yes. There are videos testing the performance difference between PCIe 3.0 and 4.0 on your GPU, so you'll have to reference those for that information.
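If you want to check both sensors yourself on Linux, here's a minimal sketch assuming the amdgpu driver's hwmon interface, where temp1 is labeled "edge" (Sensor 1) and temp2 "junction" (Sensor 2, the hotspot); the exact hwmon index varies per system, so the script searches for the amdgpu entry:

```python
from pathlib import Path

# Find the amdgpu hwmon entry and print every labeled temperature.
# Values are reported in millidegrees Celsius.
for hwmon in Path("/sys/class/hwmon").glob("hwmon*"):
    if (hwmon / "name").read_text().strip() != "amdgpu":
        continue
    for temp_input in sorted(hwmon.glob("temp*_input")):
        label_path = temp_input.with_name(
            temp_input.name.replace("_input", "_label"))
        label = label_path.read_text().strip() if label_path.exists() else "?"
        print(f"{label}: {int(temp_input.read_text()) / 1000:.1f} C")
```

On an RDNA2 card this typically prints edge, junction, and mem temperatures, matching the two (plus memory) sensors the comment above asks about.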
@@DontCallTechSupport thanks bruh for the reply, +1
Damn, DLSS is bad, but DLSS 2.0 looks better than native.
This seems CPU-bottlenecked... the results are too similar.
You're not going to be CPU-bottlenecked at 1440p ultrawide with these settings unless you're using an AMD FX CPU or playing poorly optimized games.
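One rough way to settle a debate like this on Linux: the amdgpu driver exposes a gpu_busy_percent file, and if it sits near 100% while a game runs, the GPU is the limit; if it dips well below while frame rates stay flat, the CPU (or the game engine) is the more likely culprit. A minimal sketch, assuming the card is card0 (the index may differ on your system):

```python
import time
from pathlib import Path

# amdgpu reports instantaneous GPU utilization here; card0 is assumed.
busy_file = Path("/sys/class/drm/card0/device/gpu_busy_percent")

samples = []
for _ in range(20):            # sample for ~10 seconds while a game runs
    samples.append(int(busy_file.read_text()))
    time.sleep(0.5)

avg = sum(samples) / len(samples)
print(f"Average GPU busy: {avg:.0f}%")
print("likely GPU-bound" if avg > 95 else "GPU has headroom, suspect the CPU")
```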