The RTX 4070 Ti CPU Bottleneck - How Much? Compared to the RTX 3090.
- Published: 30 Sep 2024
- How much of a CPU Bottleneck is there? Do you need a new CPU to get the most FPS from your RTX 4070 Ti? What "old" CPUs are still good? What about the RTX 3090? Will a newer CPU get you more FPS?
#4070tiCPU #CPUBottleneck #RTX3090CPU
5 years ago?! Oh my poor i9 9900k.
It's hard to believe some people build new PCs every five years.
5 years is a decent timeframe for a PC. More than decent. If you buy/build a high-end-ish PC now, in 5 years you will likely struggle to keep the FPS up in modern games using the same settings and resolution as 5 years ago. And you don't buy a high-end PC to set quality to "medium" or even "high". It's Ultra or Very High, nothing else :P
3 years is more like it😂😂😂
Not me I just got the 4070ti super paired with my 9900k @ 5Ghz. Works great at 100fps 3440x1440p extreme settings.
don't most of us just switch parts?
I had a 3600x and 2060 for 1080p gaming and everything was smooth. I then upgraded to a 3080 for 1440p and some games gave me bad frame drops. I ended up upgrading to a 5700x and my 1% lows and averages went up. I will stick with this combo for a while and wait for the next gen gpus and cpus to release
4080 coming down in price, but I still lean towards the 7900 XTX.
As pointed out by Hardware Unboxed, the 7950X 3D is slower because Windows doesn't properly select the correct chip to use. Instead of selecting the 3D V-Cache chip, it selects the other one. For this reason alone, the 7800X3D is a better choice, offering more performance, and also cheaper.
Is it bad to buy 4070ti for 1080p gaming?
@@Zelchinho depends on the rest of your setup. If you have a high end cpu and a monitor with 165hz or higher you should be good
My build is 4070ti, potentially 7800x3d, 32gb 6000mhz CL30-36-36-76 1.35v, b650 aorus pro elite ax ice edition
cpu will be water cooled with a kraken x53
Just because the GPU isn't 100% utilized doesn't mean it's bottlenecked; it can be the game engine, the game itself, there are a lot of different things to account for. JayzTwoCents made a video on it, it was pretty helpful.
100% correct.
I have an r5 5600 always running at stable 4.4 GHz paired with an rx 6650 XT. This combo, generally, never gives a CPU bottleneck at 1080p low gaming.
Apex Legends launched its newest season recently and dropped a new map. This new map added TONS of textures and complex objects. Now, in some scenarios my GPU usage goes down to 70-75% with an avg CPU usage of 60-75%. Apex uses the Source engine, which even though it looks good is showing its age (in Apex it even has a cap of 300fps). Apparently, adding a ton more complex maps to that engine shows the limits of the engine itself when it comes to CPU-GPU utilization.
Ya it greatly depends on the games you play. I mostly play open big cities games and the bottleneck is a betch in watch dogs and mafia definitive.
I told you so, AMD fanboys... AMD were not the first ones with that idea in mind, since Intel's Broadwell introduced the first cache die on the same substrate. AMD is late to the game with ideas now.
You said people should not buy a 4070 ti for 1080p. But what if someone just wants to max out every game and play at 1080p and never dip below 60 fps? I'm seeing a lot of games drop below 60 on 1440p, which would really irk me if I did have a 1440p monitor and bought a 4070 ti. Hell, with RT enabled on a few games it drops below 60 even in 1080p. I thought I figured out the whole "bottleneck" thing a few years back, but now that games come out unoptimized I am completely lost about it yet again.
Agree, not everyone wants to play at 1440p or 4K, because 1080p with the right monitor can still look very good. I have a 4070 Ti with a 1080p monitor and it's a cool experience.
For 1080p 60fps a 3060 Ti will suffice. A 4070 is for those with high refresh monitors, you know, competitive gaming. If your problem is lack of optimization, well, Frame Generation solves it. I wouldn't pay more than a 4060 or 3060 Ti for 1080p.
@@gerardotejada2531 I don't play online games, only single-player, so the 4070 Ti is still a better choice over the 3060 Ti, and I already have a 4070 Ti playing single-player games at 1080p and it's good
@@JacoB-wp4ws what CPU?
4090 drops below 60 in RT overdrive maxed out settings 😂
Recently bought 5800x3d and want to pair it with 4070ti if it's price comes down. Great video
Is 5800x 3d the best for 4070ti ?? I too thinking of getting that
@@ramprasad9277 it's really good; I would recommend it if you have AM4 or want to save a bit of money over the 7800X3D
@@ramprasad9277 just don't forget that this CPU is very hot, I had to upgrade my cooler
@@makiroll6815 What did you upgrade from and to, and what were the temperature drops, please?
Just bought a 5800x 3D as well. I'm waiting for prices for GPU's to drop as well.
Tyvm for the vid. Been running a 2070 & 9900K in my HTPC and I wanted to upgrade the GPU to a 4070 Ti, but was slightly concerned about a CPU bottleneck. Nice to know that the bottleneck is negligible for as old as the 9900K is. I never run ultra settings, typically a mix of High & Medium at 1440p, and my LG display is capped at 120fps anyway. I just don't bother with 4K because the PPI on the LG displays is so good I don't feel it's necessary for the cost of power and heat. So getting a nice smooth 1440p 120fps with low power consumption I find to be a nice balance for my use case.
The 9900K still can get it done in 2023. I agree, 1440p/120fps is the best place to be since chasing 4k will cost big money going forward.
Same here. 9900k + 2070, upgraded to a 4070 ti, same brand of display (LG) at 1440p, also capping it at 120fps. My reasoning was the same. Very happy with the results!
I'd like to know how +3 fps = 27% GPU bottleneck out of 160+ fps?
Even in all the other fps listed it's nowhere near above 10%.
You need to test more than one old game.
True... as time allows. The nice thing about the old Shadow of the Tomb Raider benchmark is how well it still scales with hardware and how it does not get continual patch updates, so you can always compare results.
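For what it's worth, a "GPU bound" percentage can coexist with a tiny average-fps difference if the benchmark counts per-frame limiters instead of comparing averages. The sketch below is purely illustrative, with invented frame times; it is not necessarily how SOTTR actually computes its figure.

```python
# Hypothetical per-frame limiter counting. Invented numbers, NOT SOTTR's
# actual method; it just shows how a "27% GPU bound" style figure can
# appear even when average frame times (and fps) barely differ.

def gpu_bound_percent(cpu_ms: list[float], gpu_ms: list[float]) -> float:
    """Percentage of frames where the GPU took longer than the CPU."""
    bound = sum(1 for c, g in zip(cpu_ms, gpu_ms) if g > c)
    return 100.0 * bound / len(cpu_ms)

# 8 frames: the GPU is slower on only 2 of them -> 25% "GPU bound",
# although the averages of the two columns are almost identical.
cpu = [6.0, 6.1, 6.0, 6.2, 6.0, 6.1, 6.0, 6.1]
gpu = [5.8, 5.9, 6.3, 5.9, 5.8, 6.4, 5.9, 5.8]
print(gpu_bound_percent(cpu, gpu))
```

So a few slow GPU frames can push the reported percentage up without moving the average fps much.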
Just got a Gigabyte Aorus WaterForce 3090 and it's a beast. Obviously I paid a lot less for it as it's an ex-mining card, but it's still performing great 👍
I mean the 9900K is still fine, but it can bottleneck games like Battlefield 2042 or Warzone 2, heavy multiplayer games. In single-player games I dare say even an 8700K is enough for 1440p with a 4070 Ti.
That's a really good distinction between types of games.
So you are saying the 9900K is not enough for a 3090 at 1440p? Jumping to 13th gen isn't worth it IMO.
@@fiece4767 well, we are discussing bottlenecks, and a 9900K OCed will bottleneck a 4070 Ti in those heavyweight multiplayer games for sure
If you're getting hundreds of fps then you're not being bottlenecked by anything; what are you even talking about? I have a 4070 Ti and run 4K 144Hz no problem. Nothing in this video makes sense, this is dumb, and you're lying to people, making them think they should have 300fps as if that's normal.
I just built my new PC and skipped over the 7950X3D in favor of the normal 7950X instead. Not too crazy about the way you seem to have to jump through hoops to "sometimes" see extra performance in "some games". I have my 7950X paired with a 4070Ti and couldn't be happier with my system performance.
I might have to get that cpu over the 7800x3d :/
The X3D will perform better
I have a wicked fast setup with a 4070 Ti, there are still some scenarios where there is a bottleneck. It 'seems' like it's the Cpu at times, but I think it's more likely Nvidia's garbage driver overhead, API issues, and general bad optimization of games at the higher tier graphics settings. Kinda sucks we have to throw expensive hardware at these problems that could at least in part be fixed by Nvidia and Microsoft.
Do I risk a bottleneck if I make a build with a Ryzen 5 7600 and RTX 4070, playing at 2K resolution?
I've got the 10700K @ 5.1GHz and the 4070 (non-Ti) is bottlenecking my CPU (even at 1440p).
Heard that the 40 series are bottlenecking almost all CPUs.
I guess Nvidia did something wrong with the drivers.
@@mr.electronx9036 try the 13th generation
@@mr.electronx9036 you want your gpu to be the bottleneck though.. that means that you are getting all of its performance and it’s not being held back by the cpu. You want to see 99% gpu usage as often as possible. But when you go down in resolution like to 1080p it is so easy for your gpu to pump out frames that it’s not even trying, and the maximum fps your cpu can do handling its part of the equation is now what is the cap or cpu bottlenecked. There will always be a “bottleneck” though or else you would just be at infinity fps. When you go up to 4K however the resolution is much more taxing on the gpu, so it is pumping out much fewer frames which is well below the CPU’s max frame rendering and thus the max fps of the gpu is rendered and it is now the bottleneck.
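The explanation above boils down to a simple rule of thumb: effective fps is roughly the minimum of what the CPU and the GPU can each deliver, and raising resolution lowers the GPU side until it becomes the limiter. A minimal sketch with made-up numbers (not measurements from the video):

```python
# Rough bottleneck model: the slower of the two stages caps the frame rate.
# All numbers are illustrative, not measurements.

def effective_fps(cpu_fps: float, gpu_fps: float) -> tuple[float, str]:
    """Return the achievable fps and which component is the limiter."""
    if gpu_fps <= cpu_fps:
        return gpu_fps, "GPU-bound"   # the desirable case: GPU fully utilized
    return cpu_fps, "CPU-bound"

# Same CPU, three resolutions: the GPU gets slower as pixel count grows,
# so the bottleneck shifts from the CPU (1080p) to the GPU (4K).
cpu_limit = 180.0  # max fps the CPU can simulate/submit draw calls for
for res, gpu_limit in [("1080p", 300.0), ("1440p", 180.0), ("4K", 90.0)]:
    fps, limiter = effective_fps(cpu_limit, gpu_limit)
    print(f"{res}: {fps:.0f} fps ({limiter})")
```

In this toy model there is always some "bottleneck", exactly as the comment says; the only question is which stage it is at a given resolution.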
@@eternalabysx we will see what 14th gen brings to the table.
i3 13100F. I wouldn't worry about GPU scaling on an old game. 100+ FPS is plenty. You could certainly use a 4090 with the i3 and have zero CPU bottlenecks at 8K.
I needed this video a couple of weeks ago. When I got my 4070ti I had a 5700G and noticed that whilst in most games I got a huge jump in performance, I noticed that in Cyberpunk and The Witcher with heavy RT it wasn't performing as expected despite CPU usage showing around 40%. I picked up the 5800X3D and was blown away as the Cyberpunk benchmark at 1440p RT Ultra DLSS Auto went from 76fps average (97 max) to 100 fps average and 127 max, in many like for like scenes I was gaining over 50% performance and the same for The Witcher 3
But what else does it tell you?
Buying a GPU is an easy install and it doubles the framerate.
Buying a whole system for about 680 costs you between 3 hours and a weekend and delivers only 20-30 frames, not even +50%.
@@memoli801 pretty sure it was just an easy CPU upgrade for him as well; as far as I know you can use the same motherboard and RAM with both CPUs, and I mean the GPU costs $800-900 and the CPU is way cheaper. In the end you need the whole PC around the same performance level, and you gotta know what you're doing with your PC, which games are played, whether they're heavy on the CPU or on the GPU, and so on
How much fps did you get in Cyberpunk with RT enabled?
Learn to overclock your ram.
@@abdullahdanze2061 for? 2% gains?
Sounds like Shadow of the Tomb Raider might not be all that reliable of a test at this point; I'd say try several other CPU-bound games to get a better picture of how much those CPUs' performance deviates from one another.
I have an I7-8700k and a 4070ti.
With max settings at 1440p with no DLSS on, I get 27% GPU bound as the results.
Do you have any issues with 1% lows on your 8700K?
Is that good?
Because I want to buy an i7 11700K with a 4070; does it have a bottleneck?
@@doler2570 i would recommend going to at least a 12th gen cpu if you are building new. More of an upgrade path and prices have dropped a lot lately
I just got a 7900XTX for my 3700X. Funny thing: in Hunt: Showdown on Low settings I have almost the same FPS as with my old ROG 3060 Ti. And I also have the same FPS in Hunt on Low and on Ultra settings. The difference is minimal. So, okay, now I can play every game on Ultra but I'm not getting "any" FPS boost.
Can you confirm my results here? Does it make sense? Because I was wondering whether my ASRock 7900XTX, with its super loud coil whine, is a bad unit.
Also my results in SOTR: 3700X + 7900XTX + 32GB 3600MHz CL16 at 3440x1440 21:9.
- approx. 18,569 points, 118 avg. fps, 0% GPU bound.
I don't want to invest in AM4 anymore :(. I'm waiting for next gen CPUs :(
The 3700x which is based on Zen 2 Ryzen architecture released in July 2019 and will bottleneck that XTX. The SOTR benchmark is the big indicator with 0% GPU bound. I also would not invest in AM4 at this point as even Zen 3 can bottleneck at less than 4k resolutions. I don't think you need to wait for next gen CPUs. You can get an AM5 motherboard today and it will be good for a couple of years including Zen 5 coming next year. If I were in your position, I would wait for the holiday Black Friday sales and look for a CPU/MB/RAM combo deal. Even a Ryzen 5 7600 would be much better. Then upgrade to the next gen CPU I really want next year and sell the 7600. There could be some bottlenecking in games that use a lot of cores with that 7600 however, at least in MOST games, that 7600 would provide much higher frame rates with that XTX than the 3700x ever could. In the meantime, I would sell that AM4 MB/CPU/RAM while it's still worth something.
OR, return/sell that XTX until you are ready to upgrade your system. Hardware only gets cheaper over time. Good Luck!
I wish I saw this a couple months back when I bought a 5800x3d and a 4070ti. I wasn't sure if my CPU was good enough and some ppl and sites were saying it was a bad bottleneck. Looks like it does just fine 💀
People online aren't sane and like to troll for no reason
You need to test about 10-15 games at the minimum. I have a 5800X3D and a 7950X3D; the new chip is a good bit faster in gaming, and people who get CPUs like this like to play around and tune. Most people are not following the steps for installing a new X3D, like updating Windows, chipset drivers, BIOS and so on, plus running Windows on Balanced, not Performance. You can't make an informed decision off one game; there are so many variations in games, you need many of them to test different engines and APIs: DX11, DX12 or Vulkan.
I did some benchmarking in SOTTR over the last two years. I don't know if it's of any use, but I'll throw some of the results at you, and if it helps in any way, then great. If not, well, I guess I just wasted a bit of your time :D I did two upgrades in total between now and December 2021: CPU and GPU. I really wish I had tested my new GPU with the old CPU (4790K), as that would've been more of what you asked about at the end of the video.
And even though the video was about the 4070 Ti, which I can't help or contribute any sort of info towards, I figured I would throw some other numbers at you anyway. Feel free to ignore lol ;)
Anywho, I have 4K, 1440p, and 1080p results.
1440p: (4790k + gtx1080 = 65fps. 99% gpu bound ) -- (12700k + gtx1080 = 65fps(!) 100% gpu bound) -- (12700k + 3080Ti = 165fps. 85% bound)
1080p: (4790k + gtx1080 = 89fps. 50% bound) -- (12700k + gtx1080 = 96fps. 99% bound) -- (12700k + 3080Ti = 198fps. 42% bound)
4K: (sadly no test with 4790k) (12700k + gtx1080 = 34fps 100%bound) -- (12700k + 3080Ti = 93fps. 100% bound)
Probably zero interesting things to gather from this. But meh, at least you have more numbers to look at ;-) I'm by no means an expert, nor do I know much. But I always found the difference at 1440p between the 4790K and 12700K to be "funny". Exact same fps. Sure, 1440p is more GPU-heavy than say 1080p, of course. But still, zero increase going from that CPU to the 'new' one. Would've really loved it if I had tested 4K with the 4790K too, just for the sake of it. But way too lazy to remove my 12700 and replace it with the 4790K just for a test ;-) I always do these kinds of benchmarks before I upgrade, as it's fun to see the difference between then and now. SOTTR is just one of several games I use for that. Guardians of the Galaxy and Forza 5 are used as well.
Just as in SOTTR, Guardians also saw zero change at 1440p between the 4790K and 12700K. And as mentioned, 1440p is from my understanding more GPU-heavy than CPU-dependent. But I still find it weird that there wasn't even a slight upgrade with a 'better' CPU. Even at 1080p, I felt the improvement was abysmal, where the 4790K gave me 89fps while the 12700K pushed it to 96fps. Not a big jump IMO. In the Guardians of the Galaxy built-in benchmark there was at least a bit more improvement between the 4790K and 12700K: it went from 75fps to 95fps (and at 1440p there was 1fps difference between the two CPUs, using the same GPU of course).
Oh and also, every test only differed in which CPU/GPU was in it. In other words, same mobo, same RAM, same everything other than the CPU/GPU.
Anyways, like I said, this might've been somewhat interesting, or a complete waste of time of a comment lol.
I love more data. I still have my 4790k but didn't think of testing with this generation of GPU hardware...but maybe it I should reconsider? Thank you very much for sharing!
Unfortunately this is missing the 4790K + 3080 Ti test; that would give the best idea of how much FPS the processor update actually gave.
Also, some games really push the CPU side, like Total War games or Warhammer 3, much more than your usual FPS.
As this data is missing from everywhere: how does the CPU scale from a 3700X/4790K/5800X(3D) to these last-gen monsters with a mid/upper-mid tier GPU? I don't care about the 4090; it's not something 99.5% of people are going to have.
A lot of Bottleneck will depend on Specific Game you are playing and how much it leverages CPU..
4070Ti with 7800X3D it's a BEAST in gaming ...you should try it ;)
It's the same problem as Intel had when they did their big.LITTLE design; scheduling is the problem.
Turn off the die in the BIOS that has no V-Cache and see your fps fly.
5800x3d is a beast can't wait to buy it :D
13600k is way better tbh
@@Alex-bl8uh it really isn't
@@justinsuerbier4301 ok, tell me why? Same price but better performance in gaming on average, plus extremely better productivity performance.
@@Alex-bl8uh Platform price, B550 + DDR4 are much cheaper. Also if you go Intel now, you need to change mobo anyway once 14th gen hit. I'm personally wait for 14400f as well, should be the same uArch as 14600k but less ecore and clock.
I am waiting for the price to drop more before changing my 5600X to the 5800X3D. I have no problems with it on my 4070 Ti build, but to use its full potential for the next 3 years, I'll need the 5800X3D.
The focus of these graphics cards is on ray tracing. I think you should consider that. The 60 RT cores should also be used to get a decent result.
I have an i7 8700K @ 4.8GHz and play at 3440x1440. The only game where I had a CPU bottleneck with a 4070 Ti was Spider-Man Remastered. It got even worse when I used DLSS, because the GPU load was at 50% or something. Even with ray tracing and everything maxed out my GPU load is around 65-70%. In all other games my GPU load is between 90-100%.
i7 13700K and 4070 Ti, no bottlenecking in the 13 most popular games. DDR5 5600 CL36.
I have a 5900x with the rtx 4070ti. They seem a perfect match
See, I had this combo at first as well, but I've noticed bottlenecks that weren't there before, once I swapped to the 4070 Ti.
I have a 5900X with a 4070 Ti but I do have a bit of a bottleneck! About 80% GPU bound.
Anybody want my 3080 12GB for 400, or my 4070 Ti Ventus 2X for 600? I have pics and video of both.
Awesome video, I am using an Ryzen 7 5700X3D and frankly I want to get the 4070 Ti for 1440p gaming, especially FPS
your tests and information are perfect for my situation.
Wanna buy a 4070 Ti Super for my 5800X with 64GB CL14 on a Unify board, but I always wanted a 3090 Ti Suprim X.
Right now the 4070 Ti Super is 900€;
a used 3090 Ti Suprim X is 750-900€.
For the future I think the 4070 Ti S is better, but I want the 24GB Suprim X.
Just got a 4070 Ti Super and I'm still on AM4 Ryzen 9 5900x. I'm perfectly happy with my frame rates in gaming, but my PC is also a workstation. The Cuda performance is more important.
i have a 3080 and 5600x, i will get a 5800x3d. and my next upgrade will be a new build with intel
this is why i got a 5900x, 1080p is not what I play for anything.
I've loved my 4070Ti so far. It's been excellent with a 13700K. Coming from a 8700/2080.
I somehow managed to build my very first computer earlier this year with an i9-9900K, 32 GB of 3200MHz RAM, and a 4070 Ti for 1000 bucks. I literally had no idea what I was doing lol, but I can confirm it's just enough; the GPU can still hit 100% usage, especially if you overclock the CPU and undervolt the GPU. I wouldn't go any lower than that though lol.
Im thinking 3090 ti with my i9 9900kf but not sure if its a good match at 1440p.
Is the 9900k with a 4070ti fine for 1080p? I wanna upgrade from my 2060
@@veqr7266 way overkill, just get a 4070 for 1080p; it has plenty of headroom for 1080p gaming for the next few years unless you're going for something crazy like 240 fps.
@StuffIThink would a 4070 Super still be fine, because it's the same price?
@@veqr7266 they lowered the price of the 4070 since releasing the Super series. The 4070's lowest price new is $540 while the Super is $610, and the 4070 is already kind of overkill for 1080p right now, so I would just stick with the 4070.
pfft, who cares past 150fps? I don't care past 30fps for a bunch of games. When you maximise the use of a 4K or at times even a 2K monitor, your graphics card will be the bottleneck for a long time to come lol.
I got a 4070 ti "coming soon" and i have a ryzen 3700x. wondering if i should stay with it for now / any thoughts appreciated
I have similar setup. 3700x as well. Check my comment here and let me know if you have similar behaviour.
Hi. I have the 2700X and the 4070. Is there any bottleneck?
In Valheim at 1440p my fps dips below 60.
In Enshrouded the same.
Will the 5800X3D end this problem?
Greetings from Germany
The 2700X isn't powerful enough to utilise a 4070. The 5800X3D will be great. Make sure you're using the latest BIOS, install the latest motherboard chipset drivers, and use 3600MHz RAM in dual channel.
I see that it tends to vary depending on the game and how well its engine can utilize the CPU cores. Some games can spread the load over 8 cores while other games will only fully utilize 1, 2, or 4 cores. So in the instance that the game engine isn't spreading the workload over all the cores, it requires a CPU with a faster single-core speed to get the best FPS. One really easy example of this is RuneScape on the low graphics preset. It's a very old game engine that can't take advantage of all the CPU cores. The 8-core 16-thread 11900K would get hundreds more FPS than the 9900K even though they both had the same core count that the game engine couldn't fully use. The faster single-core speed is what gave the biggest performance improvement. To be clear, I'm not saying all games behave this way, just that it's something I notice a lot in the games I tend to play. The single-core speed of my old 5900X was a bottleneck at times, and with my current 12900KS system I don't see the same CPU bottleneck anymore.
I still have my older 9900k system and it performs very respectably in games that can use all the cores. I am curious to try the new 3d v-cache CPU's but I'd like to wait until the single CCD 7800x3d comes out because I think that one will be better for gaming because it shouldn't have the latency and scheduling problems like 7950x3d with its 2 CCD design. Just my thoughts.
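The point above about engines that only load a few cores can be illustrated with a simple Amdahl-style estimate. The split between serial and parallelizable per-frame work below is purely hypothetical, just to show why single-core speed dominates when a game's frame work is mostly serial:

```python
# Toy Amdahl-style model of per-frame CPU cost. All numbers hypothetical.

def frame_fps(serial_ms: float, parallel_ms: float,
              cores_used: int, single_core_speed: float = 1.0) -> float:
    """Estimated fps when the engine spreads only the parallel part of a
    frame over cores_used cores; single_core_speed is relative (1.0 = base)."""
    frame_time_ms = (serial_ms + parallel_ms / cores_used) / single_core_speed
    return 1000.0 / frame_time_ms

# Mostly serial engine: 8 ms serial, 2 ms parallelizable work per frame.
print(frame_fps(8.0, 2.0, cores_used=2))   # baseline CPU, engine on 2 cores
print(frame_fps(8.0, 2.0, cores_used=8))   # 4x the cores: only a small gain
print(frame_fps(8.0, 2.0, cores_used=2, single_core_speed=1.2))  # +20% ST speed
```

With work that is mostly serial, a 20% single-thread speedup lifts fps by roughly 20%, while quadrupling the usable cores barely moves it, matching the RuneScape observation in the comment.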
There is no latency, The damn game will only run off of one CCD if it only uses 1 CCD, which 95% of all games ONLY use 8 cores or less!
@@cdiff07 up to 16 threads to be correct, cause my 5600X was lacking in new games like Control.
@@kevinerbs2778 hyperthreading isn't really a big factor in gaming; the best example is actually the 9700K vs the 9900K in gaming: both 8 cores, around the same clock speed, same architecture, same performance in gaming. In some games the 9700K was even a tiny bit better, because some games seem to struggle with Hyper-Threading. So your 5600X was struggling because it has only 6 cores, not because it has only 12 threads.
@@kevinerbs2778 I have an i5-11600K paired with an RTX 3090. I can definitely tell that the 6 cores are sometimes causing an issue, but usually I just leave my games at 60fps. Eventually I'm going to upgrade to an i9-11900K once the prices drop more.
I have a 5950X with a 7900 XT; it can be considered equivalent to or a little better than a 4070 Ti in rasterization performance, and after testing with more recent and demanding games such as Dead Space Remake, Far Cry 6, Sons of the Forest, Kingdom Come: Deliverance, Doom Eternal, the Resident Evil 4 Remake demo and Hunt: Showdown, I consistently get 95-100% GPU usage at 1440p max settings, so I would imagine it's not far off with the 4070 Ti. 50-60% GPU usage is rather extreme, and either that's just the way this game is or there is something not quite right with your setup. Update: also just tested the 5950X/7900 XT with SOTR and again getting 100% GPU utilization, so I'm convinced it's something to do with the configuration of your rig.
Radeon GPUs in general have better GPU utilization compared to Nvidia.
9900k @ 5000 Mhz all core no probs 40 gpu
How much a 5700g will bottleneck a 4070 super?
This was cool and all, but what about those 1% and 0.1% lows?
I have a i9 9900k with rtx4070 playing 1440p. Maybe I have a little of bottleneck, but I know that if I buy better GPU, its not worth it because it will bottleneck. Next gpu I will upgrade all the set.
Ya, the 9900K bottlenecks the 4070. I have the same GPU but with a 10700K. Pretty much the same CPU, and it bottlenecks in open-world games. Cannot maintain a stable 144fps, it keeps fluctuating. Sick of it.
It has to do with CCDs; the 7950X3D does not perform properly due to it having to deal with both CCDs and the latency between them. You should see better performance with only 1 CCD enabled. I believe Hardware Unboxed simulated these results in their gaming benchmarks, with a simulated 7800X3D.
Makes sense, there should be that latency present. Switching off a CCD is not really optimal though; hoping for the people that own these CPUs that they get new microcode.
@@lemonhaze715 it's not that bad if you're only gaming; a chip acting as a theoretical fake 7800X3D with 8 cores will be just fine
Thats true but the main difference why other dual ccd chips performs well E.G. 5950x, 7950x is because they have symmetrical CCD's while 7950x3d/7900x3d do not and scheduling and managing is a complete nightmare with those chips
Again, it has NOTHING to do with latency. If the right CCD is chosen and said game utilizes 8 cores or less, there is no god damn latency! By the way, 95% of all games use 8 cores or less; it may even be higher than that!
@@cdiff07 it will still use both. There is a reason why the simulated results from Hardware Unboxed showed very good performance gains when only 1 CCD is used.
It's not always a CPU bottleneck, but memory and memory access (a higher cache alleviates it).
Good point! It's the combination with the platform.
No it doesn't; the math doesn't support that. The benchmark itself is in fact incorrect. You can't mathematically get 27% out of the numbers on either GPU render or CPU render fps in any of the calculations.
Well, I saw that Windows might be the culprit here, because it chooses the wrong cores.
The 7950X3D and 7900X3D are failures of CPUs; only the 7800X3D is worth it this generation. Those two CPUs are LARPing as "best of both worlds" while in reality they are neither. Even completely ignoring Intel, AMD already offers better choices. The 7950X performs better in productivity, is cheaper and has more consistent fps across a wide plethora of games. The 5800X3D is amazing value and still has very good gaming performance; even the 5950X with good memory is not that far from base Zen 4 (non V-Cache).
AMD artificially delaying the 7800X3D launch by 5 weeks is a real scummy move; ignore the 7950X3D/7900X3D and wait for the 7800X3D.
4:26 7950x3d looking like a slow turd with 8+8
My stock 13600k with D5 Samsung B-die 6800c36 push CPU Game stats: min 252 / max 413 / average 322 / 95% 256
I have the Ryzen 7 7800X3D, no bottleneck.
Use timestamps if you want to grow your channel
Imma pair my I7 6950x (heavily overclocked) with a 3090 for the lols, I'll upgrade to sapphire rapids one day
My mindset is to get 60fps maxed out in single-player and 120fps in multiplayer, end of story. I was literally using a 2500K for 11 years and recently bought a 5950X and mobo (real cheap from a guy moving to a 7800X3D) since I needed it for Handbrake and Blender; good for another X years. Never understood chasing FPS and getting the best of the best every 1-2 years. Great test.
Depends on what you are doing and what parts are in your PC, right? If all you're playing is Minecraft and some story games you're fine with 10-year-old CPUs and GPUs; if you wanna play the latest and greatest games with decent quality settings to get the full experience you might need a better GPU and CPU, right?
That is a way to save power draw and keep temps lower too... If you get a VRR display then you don't necessarily need a fixed frame rate either.
Intel didn't make decent improvements until 8th gen Core series where they upped the core counts. Good job riding out the years of BS 5% improvements from intel.
I think it's the best card for the i9-9900K; for anything stronger a new CPU is gonna be required 😅
It’s the same scheduling problems intel 12th gen had being new P&E core split architecture, it’s only some games and it’ll be fixed with a windows scheduler update.
I have a 12900KF 4090 strix oc 6400mhz ddr5 c36 z690e gaming wifi 2t p5 ssd 1650psu ryujin 2 360 and I did the same test
1080p highest preset: 247 fps (test 1), 246 (test 2), 247 (test 3); 44°C to 50°C,
and 0% GPU bound.
1440p: 242fps, 7% GPU bound; 55°C to 61°C GPU, 46°C to 51°C CPU.
4K: 186fps, 99% GPU bound; 61°C to 65°C GPU and 41°C to 51°C CPU.
GPU overclocked to 2990MHz, +400MHz RAM.
CPU overclocked 5.5/5.1, SP 85... 93 P-cores, 70 E-cores.
I mostly play in 4k so I don't really need to upgrade my CPU in this game but in other games I'm cpu bound in 4k when I don't use dlss. I really want 13900kf. I have 5900x if I have some time I'm totally going to test it with the 4090. Right now I'm using it with a 6900xt on water.
Buying a 12 GB GPU in mid-2023 is not a good choice. Why? Because Windows eats about 1 GB of VRAM out of the box, so you are left with 11 GB for games.
Many of the latest games use close to 11 GB at 1080p and over 11 GB at 1440p, for example The Last of Us, Forspoken and Hogwarts Legacy.
In 2024, with the upcoming UE5, games will use even more VRAM.
Here: 13600KF, max perf in BIOS, Aorus Z690, DDR4 3600MHz, 5.2GHz P-cores / 4.1GHz E-cores, 4070 Ti Aero stock. 1080p highest: 19% GPU bound, 216 fps.
P.S. QHD: 187 fps, 60% GPU bound.
I have a 7700X with a 4070 Ti Suprim X and I get way better results than this: 1440p 197 fps 95% GPU bound, 1080p 252 fps 40% GPU bound. Budget CPU for the win?
I have a 3090 with an i9 10900K and play a few games, Warzone the most, and I'm losing fps quickly. At 1440p I used to get 240; now I'm lucky if I'm getting 140fps. If I drop to 1080p I get a little bit more.
I gotta laugh at myself... i5 11600KF, EVGA RTX 3090 FTW3, 32GB RAM 3600MHz (4x8). Can somebody recommend a motherboard to overclock with these parts?
With my i7 12700 I can mimic GPU boundedness:
with Ultra I'm at 100%,
with High settings I'm at 58%.
I will play with whichever settings give me 144 fps at 1440p.
Who cares about GPU bound, man?
Sorry for cancelling out your test 😅😅😂
Do you think a 4070 Ti will be bottlenecked by a Ryzen 7 5800X at 1440p 16:9 or 3440x1440 ultrawide 21:9? 32GB DDR4 3200MHz RAM. Please help.
I have a 3800X with the 4070 Ti and I've been on the fence about the 5800X3D vs 5950X vs 7950X3D upgrade.
I have similar setup. Can you check my comment here and say if you have similar behaviour?
@@MrJirkaCZ no, I'm well into the hundreds of fps on ultra ray tracing in almost everything I play. The only reason I want to upgrade the CPU is some CPU-intensive games I want to play while streaming, and that will be a bit much for my current 3800X.
What res was this on?
1440
Clickbait
It's a 5 year old game though
Great video, but some visual number comparisons on screen to accompany the commentary would make things so much clearer to understand. Admittedly my focus wasn't 100% on your video; if it were, I would be able to follow, but in a world full of distractions and multitasking my suggestion would make your video clearer to understand. I see that you're at 8k subs, so maybe this is the next step in video quality. All the best.
Thank you for the feedback, very helpful!
I have a 3090 with a i9-9900k, am I missing out on frames? should I upgrade cpu?
I don't know how accurate the bottleneck calculators are, but a 3070 Ti and a 5950X should not have a bottleneck at 1440p with a 21:9 aspect ratio.
So... the 4070 Ti is better on a 5800X3D than on a 7800X3D because of bottleneck issues? And if paired with a 7600X, is it the same as the 5800X3D in fps, or did I miss the point? Trying to understand, noob here.
The 5800X3D is king here.
Does anyone think that the 5800X3D will be a good combo with the upcoming 4070 Ti Super?
Looks like a memory-system bottleneck, not the CPU. Set the benchmark to run on the V-cache cores.
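"Run on the V-cache cores" in practice means restricting the benchmark process's CPU affinity to the CCD that carries the 3D V-cache. Below is a minimal sketch, assuming a Linux machine where cores 0-7 belong to the V-cache CCD; that mapping is an assumption and varies by system, so verify it on your own hardware. `os.sched_setaffinity` is Linux-only; on Windows the rough equivalent is `start /affinity FF game.exe` (FF = hex mask for cores 0-7) or setting affinity in Task Manager.

```python
# Hypothetical sketch: pin a process to the cores assumed to carry 3D V-cache.
# Linux-only (os.sched_setaffinity); on Windows use `start /affinity` instead.
import os

VCACHE_CORES = set(range(8))  # assumption: first CCD carries the V-cache

def pin_to_vcache(pid=0):
    """Pin a process (0 = current process) to the assumed V-cache cores.

    Returns the affinity mask actually in effect afterwards."""
    available = os.sched_getaffinity(pid)
    target = VCACHE_CORES & available  # never request cores we don't have
    if target:
        os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)
```

Usage would be to call `pin_to_vcache()` at benchmark start, or pass the target game's PID. This does by hand what the Xbox Game Bar / chipset driver scheduling is supposed to do automatically on the 7950X3D.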
As long as everything is smooth, not choppy, with no frame drops, then the CPUs are technically not bottlenecking. Yes, some CPUs add or take away a few frames, but nothing really noticeable unless you're looking at the stats. In the grand scheme of things, unless you're having frame drops or seriously low 1% lows, you're fine.
The 5800X3D has been rock solid with the 4070 Ti at 3440x1440.
Will a 7800X3D bottleneck the 4070 Ti, or should I pair it with a 4070? Can anyone help, please?
I have an i9 9900K and a 4070 Ti in the rig I am typing this on! My CPU is running a 5.0GHz all-core overclock, and I feel this really helps push it up to modern standards. When overclocked to 5.2GHz it can even hold its own against the likes of the 5800X3D, beating it in synthetic benchmarks. So it's a good chip despite its age, and I would say most games are optimized for 4-8 core systems now.
I just don't like the 45-50 degree idle temps with the 5.2GHz OC applied, as I need to give it about 1.45V, whereas I can get away with 1.28V stable at 5.0GHz. I will consider upgrading to the next-gen Intel platform next year (maybe when they have fixed the DDR5 issues), but for now the 9900K is still a beast for gaming!
It would take a lot of convincing to get me to switch over to AMD anytime soon even with all the recent improvements
What kind of cooler do you have? I have mine at 5.2GHz, 1.4V, and I'm getting 35-40°C at idle or doing random browser stuff, on an NH-D15.
@@Taldirok Corsair H115i Elite
Guys, stop bothering with core clocks; it needs fast RAM, not 5.2GHz.
I really have a problem with the logic of the testing here. How can you benchmark only one game and present the results as if they were valid for all games? That's a sweeping conclusion from narrow testing.
At 1440p ultra settings it's mainly the GPU being used, so even an i7-8700 non-K can work fine with a 4070.
I'm using a 4070 Ti at 1080p and it's amazing. Not everyone should follow the rule that you have to play only at 1440p or 4K; when you buy a good monitor, 1080p can still look very good.
No one is gonna play RPGs at 1080p 240Hz; you should test on PvP games.
I agree. Most people aiming for high fps are gonna be pairing high-end GPUs with decent enough CPUs to max out that GPU usage for the highest frames in fast-paced competitive PvP titles.
Oh, but people do it, so when you say "no one" you must be talking about yourself. Comp gamers will lower their res to 1080p to get the best frames. Not all do it, but a lot do.
@@DETERMINOLOGY He did say RPGs as in roll playing games to which I think his statement is mostly correct. You mentioned competitive gamers like e-sports or competition players so you are not comparing the same groups of games and gamers but in that context you are also correct but not in the same genre as the original posters comment.
@@dracer35 "no one is gonna play rpgs in 1080p 240hz you should test on pvp games"
He said "no one" as if no one is doing it at all. I know plenty of people who play at 1080p for the highest fps to stay competitive, so that's out. If they can hit 165/240 fps, they will take fps over graphics, as they're not trying to show the game off, they're trying to win.
Simple
@@DETERMINOLOGY To stay competitive in a single-player role-playing game? Sounds like you are talking about a completely different type of game, but it's not worth wasting more time trying to explain. I give up. You win, since that seems to be the answer you were looking for. Have a good day.
I have a 10900K with a 3090 and I see no need to upgrade until 14th or 15th gen Intel.
Still using an i5 9600K for 1440p ultra gaming :(
So would an i7 11700F be okay with a 4070 Ti Super?
In your opinion, what GPU would bottleneck even a 5800X3D in 1080p games?
I have an i9 9900K + 3080 Ti, so my bottleneck is even worse. Thanks!
Can my 5800X handle the 4090? I'm upgrading from a 1050 Ti.
This is heavily dependent on the game and resolution. Old games will run blazingly fast with new cards; test with an Unreal Engine 5 game firing on all cylinders with features like Nanite, Lumen, virtual shadow maps, etc.
Mac, you need to turn the Xbox Game Bar on for the V-cache scheduling to work. This will be something they'll have on by default in Windows 12. So turn that on and rerun the test with the 7950X3D.
Thanks for the feedback, but it was turned on. A clean install of Windows 10 on a new SSD, following AMD's reviewer's guide, gets it to work... most of the time.
Did he really just try to base his whole opinion of these CPUs on one game? LMAO, no wonder he only has 8k subs. If you know anything about CPUs, you know that depending on the game and how the CPU is utilized/optimized for it, results will be drastically different from game to game. Also, what were your memory frequency and timings with the 7950X3D? Did you make sure you were using the CCD with the V-cache on it? Did you try running it on the CCD without the V-cache? Some games prefer frequency over V-cache, which is the beautiful thing about the 7950X3D. It doesn't matter if Windows doesn't auto-pick the right CCD; it's pretty easy to do it yourself for the game you're playing, with proper research into what the game likes more.
Do you think AMD knows anything about CPUs? Do you think AMD has the resources to determine which game should run on which CCD? Do you think AMD is capable of doing the proper research? Given that this configuration of one V-cache die and one non-V-cache die was shown at Computex in May of 2021, do you think AMD had enough time?
Mostly AMD as well. If you're going to do a CPU bottleneck video, cover a good range of both AMD and Intel CPUs, from low end to high end. As for me, though, I wouldn't want a strong GPU and a weak CPU, or vice versa. I had a GTX 1080 hooked up with a 6700K at first and it was OK; I switched to a 13600K and got much better performance. Since then I've upgraded to an RTX 4070 Ti, no issues at 1440p.
@ImaMac-PC It's a Windows scheduling issue! AMD and Microsoft both need to work together on it! Plus, you never answered my question.
Holy crap, you are obnoxious, dude. Not even worth arguing with.
@innocentiuslacrim2290 Your YouTube buddy's testing methodology is flawed!
Is there a bottleneck with a 5600X and a 4070 or 4070 Ti?
AMD poo
Hi all! Can you please say whether a 7600X will bottleneck a 4070 Ti at 1080p and 1440p?
I will wait until the Ryzen 8000 3D CPUs release before I make the jump to AM5. For now my 5700X is more than enough for my gaming needs.
Thank you, good to see a different point of view. But the question is: why?