7900 XTX seems like the better buy atm. Better raw performance. I also lament the loss of EVGA for Nvidia cards. I have a 3080 FTW3 and it's beautiful.
EVGA deciding not to make AMD cards is ignorance tbh. Loads of people must have lost their jobs in the downsizing after the whole EVGA/NVIDIA GPU debacle. That's not how you treat all of those employees, that's for sure.
I don’t know if you’ve made your decision, but the frame time spikes look far better on the 4080.
@@ryanthurman92 Also way more features
Nice, I use the 38 inch LG, so it's nice to see benchmarks for this resolution for once lol
Yeah, I recently picked up an Alienware 38, loving it so far!
@@Timo_Tech Just tried some of the benchmarks; in Cyberpunk I only get 74 FPS compared to your 86. I think it's because I'm running a 3800X CPU. Holding out for the 7900X3D. Other games are lower too, and sometimes I get lower FPS in old games than I did with my GTX 1080.
A video testing the 4080 with my exact monitor :) Thank you!
I bought an MSI Suprim X 4080 two days ago. Can’t wait for it to arrive. I’m currently running on my spare GPU, a 3060 Ti, to get by. (Got an amazing deal!)
I had a 3070, which wasn’t my first choice, but it was all I could get at the time because of shortages, and I planned to get a 3080 eventually. Ended up with a 3080 Ti. But it had an issue a couple of months in, and I eventually managed to get it replaced under warranty.
Never installed it, as I was slowly doing other things to the computer, and went through a phase of watching movies/series instead haha.
So I had a brand new 3080 Ti sitting there sealed, and then the 40 series released... so I sold it!
Horrific pricing, though not that different from the 3080 Ti pricing when I got it. So after a few months I finally snapped up a Suprim X version, close to retail price.
With DLSS 3 and future performance updates, I shouldn’t want or need a new card till the 60 series haha.
Still probably a crappy justification for getting an overpriced GPU when I already had a great one sitting there, but tech is my life and I’m in a decent living/work situation without many expenses. 🤷🏻♂️
I also have the Alienware 1440p 240Hz monitor I use for games that don’t support UW, like Skyrim. But mostly as a second info screen while I play games on the 38” UW.
So my build now is an i5 12600K, MSI 1000W PSU, AORUS Z690 Elite, 32GB DDR4, and a 4080 :)
I’m a very happy camper atm 😊
Oh that's awesome! As for games that don’t support UW, have you tried "Flawless Widescreen"? I've had pretty good luck with it in the past.
I have a Suprim X 4090, but I do enjoy my ROG Strix 4080 OC that I found open box at Micro Center for $1,014.
Great shit Timo!
Thanks! If folks only knew what went into a 5min video like this... 😂
Well, nice that you show a resolution other than 4K/WQHD/Full HD, but I expected a little bit more in MW2 xD. For comparison, my RX 5700 XT managed 80-90 FPS at 3440x1440 on high settings + FSR 1.0 Ultra Quality. But it's nice to see where the RX 7900 XT/XTX should land when they're released.
Personally I never really cared about 4K, 3440x1440 & 3840x1600 is where it's at!
I have a 3840x1600 monitor, and you need to go from a 3080 to a 4080?
Need to? Depends on what frame rate and graphic settings you want to run.
@@Timo_Tech MW2, 120 FPS, ultra, 3840x1600
Hi, I have a 9900K + 4080 with a 3440x1440 monitor and found out I'm bottlenecked if I'm not playing at 4K. Not planning on upgrading my CPU yet, but would I still be bottlenecked at 3840x1600, or should I just go 4K all the way? How many frames am I even losing compared to 13th gen CPUs?
How much of a bottleneck are you seeing? Side note, I just switched to 3840x1600 and love it!
@@Timo_Tech I read about it, so I'm not actually sure how much of a bottleneck it is. I actually don't mind, but I just wanted to know. 3840x1600 is very tempting, though.
@@theshahtwentyfour How do you know you have a bottleneck? A CPU bottleneck would be your CPU running at 100% while your GPU sits at 80%. I wouldn't think that combo would have an issue.
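The rule of thumb above can be written down as a tiny check. This is just a sketch of the heuristic from this comment, not a real diagnostic tool; the 95%/85% thresholds are my own illustrative assumptions, and you'd read the utilization numbers off Task Manager or MSI Afterburner while gaming.

```python
def looks_cpu_bound(cpu_util: float, gpu_util: float) -> bool:
    """Heuristic from the comment above: a CPU pegged near 100% while
    the GPU sits well under full load suggests a CPU bottleneck.
    Thresholds are illustrative assumptions, not hard rules."""
    return cpu_util >= 95.0 and gpu_util <= 85.0

# CPU at 100%, GPU at 80% -> likely CPU-bound
print(looks_cpu_bound(100.0, 80.0))  # True
# CPU at 60%, GPU at 99% -> GPU-bound instead
print(looks_cpu_bound(60.0, 99.0))   # False
```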
How big is the difference between this resolution and 3440x1440?
Ha! That's a great question; I was actually thinking about making a video on the performance difference. But here's my example: when I was on 3440x1440 and first got my 3090 Ti, I could max out RDR2 and get 100+ FPS, and when I made the jump to 3840x1600 I went down to the mid 70s.
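That drop lines up with a quick pixel-count estimate. In GPU-bound games, FPS tends to scale roughly inversely with the number of pixels rendered — an approximation I'm assuming here, not an exact rule:

```python
def estimate_fps(fps_at_old: float, old_res: tuple[int, int],
                 new_res: tuple[int, int]) -> float:
    """Rough GPU-bound estimate: FPS scales inversely with pixel count."""
    old_pixels = old_res[0] * old_res[1]
    new_pixels = new_res[0] * new_res[1]
    return fps_at_old * old_pixels / new_pixels

# 3440x1440 = 4,953,600 px; 3840x1600 = 6,144,000 px (~24% more pixels)
print(round(estimate_fps(100, (3440, 1440), (3840, 1600))))  # 81
```

The estimate predicts roughly 81 FPS from a 100 FPS baseline, so the mid-70s figure reported above is in the right ballpark (real games rarely scale perfectly).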
Thank you for your testing, but the 10900K is too weak, btw.
How is the 10900K weak? It's only hitting 30-40% usage, and we still got a huge difference in GPU performance.
Kyle's an AMD fanboy
You're wrong bud
@@jaren-xd2zp Recently picked up a 12900K & 13900K; so far not a huge difference with the 12900K. (Haven't tested the 13900K yet.)
Wow, I've never seen content like this! Oops, kidding — it's the same as all the other big channels.
What were you hoping to see when you clicked the video? In all fairness, I doubt they ran benchmarks at this resolution...
@@Timo_Tech I only say bad things because I think no one will read them. Just do whatever you want and ignore me! Everybody starts somehow; maybe this is you starting. Who am I to say anything?
@@gabrielmattos6337 bro, was there even a need to comment this? tf is your problem? And he's right, no one benches at ultrawide.
@@gabrielmattos6337 lol you're an absolute loser
You were stuck with a 10900K build because you were unwilling to just swap boards over? Why even make the video if you have the hardware but no will to set it up? What a waste.
Yes, I would have liked to use the 13900K build, BUT I was mostly interested in comparing GPUs at 3840x1600 (not CPUs). We still got a huge difference between the 3090 Ti and the 4080. I was mainly using that 13900K build for size reference, to show that you'd better have a case with room for it.