An error was made with F1 2022: the preset should say 'ULTRA HIGH'.
Just ordered my 7900xt for $800, super excited. Seems to have no problem hitting 1440p 144Hz; it should double my current card
So it's been a month now, what do you think about it?
Update mfr where’d you go 🤣
Don't ever buy an AMD card, it gets very hot and crashes a lot, and the drivers are shit
@@unitor699industries that’s complete bullshit LOL
@@unitor699industries average Nvidia Buyer
You have no idea how much I searched for this combo. I have what I need now, thank you
Perfect, thanks for this, you're using a really similar setup to me
Wow, this is exactly the sort of benchmark video I was searching for! Just one issue: Could you make sure that all benchmarks, from now on, are at least one minute per game? 30 seconds isn't enough to get a really good feel. You'd be surprised at how much the 1% low can change just by adding another 30 seconds. I feel a bit nervous about spending $900 on a graphics card without getting a bit more of a look.
Sympathise on the $900, kinda disappointed that AMD didn't have a much lower MSRP for this card!
So typically the benchmark runs themselves are about three minutes. With canned benchmarks like F1 2022 and Warhammer I'll often just show the last 30 seconds because I don't wanna bore people. So yeah, where there's a 30 second clip you can be confident that it's showing the end of a longer bench run.
Is the cp2077 test with fsr quality?
I ran a lot of these games with a 5700xt just fine. The 7900xt looks like it does really well.
My Hellhound version should be in today. Waiting for them to start rolling out better drivers though.
Btw, turn screen space reflections off or set them to low in Cyberpunk. There was a driver bug, also present on Nvidia, that causes a massive loss of frames for some reason.
I get my Hellhound tomorrow, how do you like it?
@@chloescoolcreations9719 It's a lot better with the new drivers they released. Doesn't get hot in most games. OW2 for one stays right at 42C while playing at 165fps with vsync.
@@TitusFFX Super excited, THANKS
JUST placed an order for a 1440p monitor; this CPU and GPU is exactly what I have, so it looks like it _might_ be worth it!
I'm using a 7900xt + 3700x and not getting nearly the same performance as you, especially in RDR2 where I average 90-100fps in the benchmark. Would moving to a 5800x3d really make that much of a difference at 1440p??
I'm not sure what difference it would make, but that CPU will be a bottleneck for the 7900XT for some games/settings at 1440p
@@fpscomputers976 I just put in a 5800x3d yesterday, the chip is like a cheat code, does get hot tho on some games
@@Protato675 Glad you like it. It is a pretty hot chip. Definitely one for an AIO cooler.
@@Protato675 True, you can undervolt it for quite a good fps boost and approximately 10 degrees Celsius less heat
3700x was definitely bottlenecking your 7900 XT
Thank you!
Thank you for posting!
Have you had issues with this card? Drivers are good? No freezing up ? Etc.... Thank you!
Issues are that the memory seems to run quite hot at about 90c after prolonged heavy loads. Idle power draw is around 90w and in games it'll pull around 300w no problem. Plague Tale Requiem will not launch and The Witcher 3 reliably crashes with ray tracing on, or sometimes just in DX12 mode. Ngl I expect that everything but the memory thermals and power draw under load will be fixed soon. Happy to say that general thermal issues like those derbauer covered aren't an issue with my card and outside of The Witcher and Plague Tale everything has been running nicely.
@@fpscomputers976 I guess you need to undervolt, like all AMD cards
Don't listen to him, I have the Sapphire Vapor-X model and everything runs great. I have The Witcher 3 and it hasn't crashed once, and the Adrenalin software is awesome.
I have a 3060 12GB, would it be a good idea to get this card now?
yes 100%
Sad to see some games at 4k won’t even get a stable 60fps. 😢 my brand new 7900xt already feels disappointing at times.
Just turn off RT, that's all you need to do.😅
@@Эпикфайт-н3ы Or turn on FSR :D
The 7900xt is more of a high refresh 1440p card than a 4k card. The 7900xtx/4080/4090 are what you should have bought if you were gaming at 4k
@@JustS0meK1dd lol wrong. After upgrading my cpu everything is playing at 4k 60 minimum :) wrongy wrong wrong
@@Bezzerkus I mean, you only have to look at a few 7800x3d/7900xt videos online of games like Cyberpunk (with or without RT), Hogwarts Legacy (with RT), The Witcher 3 (with RT), TLOU Part 1 (drops to 50fps), MSFS and quite a few others. I didn't claim you can't run games at 4k if you are happy to lower settings or use upscaling, but AMD themselves marketed it as a high refresh 1440p card because that's where it gives the best experience.
Just ordered these two parts, perfect 👍. Is there a liquid cooler I should get, or are air coolers good enough?
my 5800x3d works great with an air cooler, just make sure to get a big beefy one
The Arctic Liquid Freezer is great, I would recommend it. It's cheap as well and beats out even the expensive ones. It's about £66.99 where I live in the UK and it beats the NZXT Kraken and Corsair ones.
Nice video mate, but I would like to see this card at 2K in a game like Vindictus (an MMO). If you ever decide to make an fps metrics video for it I would appreciate it :)
question: Before the tests did you undervolt the GPU and CPU? My GPU and CPU don't keep those temperatures nearly as stable bro.
What case and fans have you got? If both have temp issues it's most likely not the hardware.
I don't think he's undervolted; at 1440p the 5800x3d clocks are all over the place. He'd get more fps with an undervolt
Would be nice to see these tested on VR 😢
I don't have the gear for that, unfortunately. I've read on forums that 7000 series cards aren't very good for VR, but take that with a pinch of salt.
@@fpscomputers976 He is right, it performs like a mid-range card in VR. It would be better to buy an Nvidia 4080. I bought the MSI 6950 XT Gaming X Trio a month ago for €650 and it works well in VR with a 9600K; today I will install a 5800x3d to do better...
Hi Hank, was there a bottleneck with the 5800X3D + 7900XT? I have a 5600X + 1070 and I want to upgrade to a 7900 XT. Will I face a bottleneck?
I bought my 7900xt last year and I don't understand people saying this card is an entry for 4k. If you are getting this card, that means you're mainly playing AAA games and you expect a decent >100 fps in 2k. You wouldn't buy a 7900xt to play Terraria or Stardew at 4k right?
So how is this card a 4k entry card when you can only play Spiderman at ~43 fps. I guess people have different expectations. But for me I value smooth fps over stuttering 4k fps even though it's superior in quality. I mean who enjoys 40 fps spiderman for real..
Anyways I was gonna go for a 4k monitor but I think I have changed my mind for a decent 2k monitor. I think there's still some time before 4k gaming becomes mainstream so I can enjoy smooth games.
Hey, I have this setup and all my games sit at 144... basically locked there on both Ultra and low. Am I doing something wrong or have I configured something wrong?
Probably your monitor; mine is also locked at 144fps because my monitor is 2K 144Hz, so it's fine.
I always turn off ray tracing on my AMD cards and get 100+ fps at ultra settings in 4K/1440p etc. It's pointless, it's Nvidia-oriented software; leave it off if you have an AMD card
Then what's the point in any GPU beyond an rx580?
There isn't one if your logic is retarded like this
What's the power draw, like 300 watts or something like that? Would a 750W PSU hold up?
750W is the minimum required for the 7900XT.
@@Rakshasa1986 I ended up getting a Sapphire 7900xtx with a 1000 watt PSU, works awesome
Got a 7900xt for $720 on Black Friday, happy to replace my 3080 😊
Why such random presets? Sometimes medium settings and on other games extreme + ultra RT?
I understand that high does not mean the same on every game, but that seems all over the place.
Sir, is the PSU you are using enough??
Yeah. 850 covers the founders versions of the 4090 and 7900XTX. AIB models probably need more but there will be listed specifications where that's the case.
@@fpscomputers976 thanks man,
Just one more doubt: is a Corsair RME 850 watt PSU enough for a Ryzen 7 5800X3D and an RX 7900XT (AIB model)? ..just a rough idea..
@@arkamukherjee6051 Yeah so that is the exact config in this benchmark. It's within AMD's recommendations. Under a heavy game load the 5800X3D will pull about 100w, but it's usually more like 60-80. The 7900XT under load pulls 308 watts. You can probably find detailed information about power spikes with the 7900XT, but under usual conditions it's well within spec for an 850 watt PSU.
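As a back-of-the-envelope sketch of that reply's arithmetic: the 100 W CPU and 308 W GPU figures come from the thread, while the 75 W allowance for the rest of the system and the 1.2x transient-spike factor are rough assumptions, not measured values.

```python
# Rough PSU headroom check. CPU/GPU wattages are the figures quoted
# in this thread; overhead_watts and spike_factor are assumptions.

def psu_headroom(psu_watts, component_draws, overhead_watts=75, spike_factor=1.2):
    """Remaining wattage after summing component draws, a flat
    allowance for the rest of the system (fans, drives, RAM), and
    a multiplier margin for transient GPU power spikes."""
    steady_state = sum(component_draws) + overhead_watts
    worst_case = steady_state * spike_factor
    return psu_watts - worst_case

# 5800X3D (~100 W) + 7900XT (~308 W) on an 850 W unit:
print(psu_headroom(850, [100, 308]))  # roughly 270 W of margin left
```

The spike factor matters because modern GPUs can briefly draw well above their rated board power; a PSU sized only for the steady-state sum can still trip on transients.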
@@fpscomputers976 okk thanks for the info man 🙂..
Any UV on 5800X3D? Which cooler do you use?
No, it's stock. Its cooler is an Arctic Liquid Freezer II 360 and it's inside the Corsair 5000D
@@fpscomputers976 I've got an Arctic Liquid Freezer II 280 and Curve Optimizer tuned with Project Hydra. ~6-7 degrees gained in Warzone 2.0 and 4.45 GHz fully boosted frequencies. You should try it.
I would like to know your RAM CAS latency
Ryzen 7 or Ryzen 9?
For future gpus
Cuz Ryzen 9 3900X I own now is not enough I guess
I'd be tempted to get the upcoming R7 X3D, depending on the cost.
Did you have Resizable BAR / Smart Access Memory on?
Yes
What is FSR?
Hi, friend! I have the same build, but in MW2 Warzone and some other titles with the same settings I get 50-60 fps less. Please tell me, what could the problem be?
I'm not sure if this applies to you, but I was experiencing the same fps loss. I fixed it when I updated my BIOS, and now I get around the same fps this guy gets in WZ
@@tairkyu thank you very much i will try to update
@@linkenssphere7699 any update? I’m not getting the same fps either
@@Wickery. Make sure Resizable BAR is active in the BIOS as well. Make sure your RAM is in slots 2 and 4 in dual channel mode, and make sure XMP for the memory is on in the BIOS too. Download the latest BIOS. Make sure your GPU is in PCIe slot 1...
It is strange, same CPU but with an XTX. In Cyberpunk with Ultra RT and FSR Quality at 1440p I get 70 fps, just like in this video at 12:54
Because it's the straight RT Ultra preset, meaning the FSR setting is 'auto'. IIRC FSR auto usually runs at about Balanced or even Performance. I think it might be dynamic, though.
Oh I see, that explains it. Let's see what FSR 3 will do
Do you think it's worth upgrading from a 3070 if I want to play at 1440p ? I get an average of 80-100fps at 1080p on high or ultra settings. I don't really plan on playing with RT most of the time.
Probably not. The 3070 is still a really good card, keep it imo.
@@fpscomputers976 Even if I want to switch to 1440p, you think?
@@bastiwen In most cases it's still a really good 1440p card, especially if you aren't going to be using ray tracing often.
@@fpscomputers976 Thanks for the answer :)
@@bastiwen It's double the performance, so I'd say go for it if you need the fps
4K is a waste of time, you can do 1440p and get better fps in all games
That means the 5800X3D doesn't bottleneck with that GPU??? AMAZING
FSR not activated at 2160p?
This looks like a raw performance benchmark
Switching from a 1060 6GB and Ryzen 7 2700X to this setup, I think I will see a difference lol
Got a 7900xt brand new for $500, traded an old GPU in
Why test cp2077 with RT...?
I assume because CP2077 was marketed as a showcase for Raytracing. It makes a big difference to the visual presentation depending on the settings/location.
I literally have the same specs but I get less fps, like 10-15 fps less than you at 1440p... 😪
What RAM are you using?
@@evalac2840 the problem was my cpu was overheating to 90 degrees so I was losing on some fps, I changed my case and AIO and it's cool now
@@Hecbertgg thank you for sharing, I'm trying to aquire these same parts
@@evalac2840 nice just make sure you have proper airflow because this cpu can be hot
You'll gain more fps by undervolting your cpu btw
Why are you running a 5800x3d cpu with the RX7900XT? The 5800x3d is too weak for the RX7900XT.
It was the best CPU when I got it, and I haven't upgraded yet. It does bottleneck the 7900XT at 1440p in some titles though, you're right
@@fpscomputers976 I thought the 5800x3d doesn't bottleneck at 1440p gaming?
Says who? Its still powerful af with newer cards. It can run a 4090 perfectly fine.
It does not bottleneck
5800x3d does not bottleneck an 7900xt
WOULD A 5600X BE OK WITH THIS, AS THE X3D ISN'T EVEN AT 50 PERCENT?
CPUs never really show 100% usage while gaming because a single game isn't likely to utilize all threads. Based on these results, it looks like you would probably be fine with a 5600X, but some performance may be left on the table, especially at 1440p high refresh rate.
5600x3D is within like 10% performance gap of this CPU in gaming for 30% less. So you're still in a good spot. Only difference is 6-core vs. 8-core which matters more in workstation and content creation workflows.
3D has much better fps 1% low
Getting fewer frames in RDR2 at 1080p ultra with nearly the same system
In 2023, at 4K, you still can't make a card get over 100fps in every game? Wow! Trash cards! 40 series and 7000 series cards are garbage! Not worth it in my eyes because I only play 4K.
Ray tracing ruins everything; RT OFF = +100 FPS in every game
RT always tanks performance. 1440p is what most people play, i’m satisfied with my 79XT
@@_N4VE_ I don't think it's worth using RT
Don't think there is a huge difference
Better to never use it so you don't become dependent on it
@Thomas Shears Facts! We are spoiled nowadays! To think years ago 1080p 60fps was the high standard! Not to mention gaming on consoles that only did 30fps with bad graphics! I tried playing some old games on my 360 that were “amazing” back in their time and was blown away by how bad they are compared to my gaming laptop with 120fps and modern graphics
With current manufacturing tech there is literally no other way to render at higher frame rates unless you do pretty much most of the below:
1. Power the computer from 240v so it draws fewer amps from the wall socket. Standard 120v household sockets can only support 1800w power supplies, assuming your PC is the only thing plugged into a single breaker in your house. This will be mandatory to power multiple GPU dies.
2. Make the graphics card and cooling solution comically large across multiple dies and/or bring back SLI (which NVIDIA won't do because it cannibalizes the value of cards in their GPU lineup). Your PC tower will now have to be in a damn server rack on wheels.
3. Developers will need to spend an exponentially larger budget and take longer to release games to focus more on performance optimization, then charge you $10 more for the PC game while you complain about it being pushed back 6 more months.
4. YOU WILL PAY FOR IT. $1800 High End GPUs sound like a lot?
TRY $15,000-20,000 for what your spoiled a$$ wants. Nvidia's A100 80GB workstation GPUs are that expensive.
Br4ts never satisfied.
What is FSR?
AMD's upscaling tech. The game renders at a lower resolution and is then upscaled to your display resolution using an algorithm... basically it translates to a minor dip in visual quality in exchange for higher frames per second and smoother gameplay.
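To make the "lower resolution" part concrete, here is a small sketch of the internal render resolutions implied by FSR 2's per-axis scale factors. The factors below are the standard FSR 2 preset values AMD documents; treat them as indicative, since individual games can expose different or dynamic scaling.

```python
# Internal render resolution for AMD FSR 2's standard quality modes.
# The per-axis scale factors match AMD's published FSR 2 presets.
SCALE_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(display_w, display_h, mode):
    """Resolution the game actually renders at before upscaling."""
    factor = SCALE_FACTORS[mode]
    return round(display_w / factor), round(display_h / factor)

# At a 1440p display resolution, each mode renders internally at:
for mode in SCALE_FACTORS:
    print(mode, render_resolution(2560, 1440, mode))
```

For example, FSR Quality at 1440p renders internally at roughly 1707x960, which is why the fps uplift comes with only a minor quality dip: most of the pixels are reconstructed rather than rendered.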