THIS is what Bottlenecking REALLY looks like! AVOID THIS!

  • Published: 26 Sep 2024

Comments • 2.3K

  • @Ration999 · 1 year ago · +7781

    Personally my biggest bottleneck is my wallet, paired with GPU prices.

    • @demontekdigital1704 · 1 year ago · +66

      Same. For me neither is a bottleneck. More like trying to fit a softball in a garden hose, lol. My PC has tagged in my house, and I'm currently being choke-slammed by all the repairs, lol.

    • @camotech1314 · 1 year ago · +111

      😂😂😂 poor people are always bottlenecked

    • @demontekdigital1704 · 1 year ago · +190

      @@camotech1314 LMAO! So painful, and so true. I'm so poor I can't even pay attention.

    • @RainyFoxUwU · 1 year ago · +20

      that comment killed me

    • @rchapman801 · 1 year ago · +13

      This comment made me laugh. So true.

  • @ee2610 · 1 year ago · +421

    I upgraded from an Intel i5-7600K to a Ryzen 7 7700X but kept my GTX 1080. My framerates have increased 2-3x in CPU-bound games like Risk of Rain 2 and Rust. God bless Microcenter for that $400 bundle

    • @Davincibeats · 8 months ago · +27

      That's a huge upgrade. Mine is similar. I upgraded from the i5-8600K to the Ryzen 7 7800X3D ... I've only tested one game... But Kingdom Come went from 40 FPS at ultra to 60 FPS at ultra. Never knew my CPU could bottleneck my PC like that.
      I also massively upgraded the cooling, which helped keep temps insanely low. My GPU used to run at 90+ and my CPU at 90+ ... Now my CPU runs at 75 and my GPU at 75 at max.

    • @Wawawalulu · 7 months ago · +9

      I just went from a 6700K (4.8 GHz) to a 14700K with a 3070, and from a Z170 with 2600 MHz RAM to a Z790 with 6000 MHz RAM; yet to pick it up, stoked!
      A 7900 XT is planned next 😁
      I'm at 3440x1440 120 Hz so I didn't think my CPU was bottlenecking super hard until I got into Act 3 in BG3 😂

    • @Tiagocross · 6 months ago · +2

      Yep! I'm building a new PC, and for the moment my GTX 1080 paired with a Ryzen 7 7800X3D will have to work

    • @balin1 · 6 months ago · +3

      Those Microcenter bundles are the real value

    • @PhilipKerry · 6 months ago

      @@balin1 In the UK Microcenter doesn't exist; we have Currys PC World, but the choice of peripherals is limited.

  • @berserkslayer8638 · 1 year ago · +491

    Your video about bottlenecking (the one with the i3 and 2080Ti) was the one that started my journey in PC building and gaming. Before watching it I was so lost and afraid of all this stuff, but you kept it simple and easy to understand. So after 6 months of watching your videos and saving some money, I was able to build my own PC back in 2019. I have nothing to say but thank you!

  • @northwestrepair · 7 months ago · +13

    Lowering graphics settings will allow the GPU to generate more FPS.
    The more frames the GPU generates, the more work the CPU has to do to keep feeding it those frames.

  • @luck9837 · 5 months ago · +28

    My job's been bottlenecking my life

  • @thigo94 · 1 year ago · +351

    the frametime graph in Afterburner would be a great visual aid, because as you said, it is hard to see jitter/stutter in the video. Adjusting the max and min properly gives a great indication of these issues.

    • @x0Fang0x · 1 year ago · +8

      or just use Intel's GPU Busy graph to know what settings to use.

    • @Sulphur_67 · 1 year ago

      @@x0Fang0x tried it and it didn't work at all with my RX 7600

    • @disser3849 · 11 months ago · +1

      YES! Always show frametimes pls.

    • @squirrelattackspidy · 11 days ago

      How do you use it?

    • @thigo94 · 3 days ago · +1

      @@squirrelattackspidy just search for how to set up the MSI Afterburner / RivaTuner overlay. Once you have it set up, you can turn on the OSD for the frametime with the "graph" option. The minimum for the graph can be set to 2 ms and the maximum to 20 ms; with those values you can monitor from 50 to 500 fps. At a perfect 500 fps the line would be flat at the bottom of the graph; the lower the fps, the higher the line goes. What you want is always a flat line without a lot of lumps, which means the game feels smooth; any spike means a big stutter.
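      A quick Python sketch of the arithmetic behind those bounds; the frametime samples are made up for illustration, and this is not Afterburner's own code:

          # fps = 1000 / frametime_ms, so 2 ms and 20 ms bracket 500 and 50 FPS.
          def fps(frametime_ms: float) -> float:
              """Instantaneous FPS for one frametime sample in milliseconds."""
              return 1000.0 / frametime_ms

          print(fps(2.0), fps(20.0))  # 500.0 50.0 -> the suggested graph min/max

          # A flat line means smooth pacing; one tall spike reads as a stutter
          # even when the average FPS still looks fine.
          frametimes = [6.9, 6.9, 6.9, 40.0, 6.9, 6.9]  # hypothetical samples
          avg = sum(frametimes) / len(frametimes)
          for t in frametimes:
              tag = "  <-- stutter spike" if t > 2 * avg else ""
              print(f"{t:5.1f} ms = {fps(t):6.1f} fps{tag}")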

  • @Liaret · 1 year ago · +145

    A small detail/correction: In most game engines, such as Unreal, the Game Thread (the main engine thread running the world etc.) and the Render Thread(s) (the threads responsible for "talking" to the GPU and giving it instructions) are separate. The main thread (GT, Game Thread as it's called in Unreal) "ticks" the world, but does not send render instructions to the GPU or wait for the GPU (for the most part). That's what Render Threads do.
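    A toy Python sketch of that split (not Unreal's actual code): the game thread ticks the world and hands draw data through a small queue to a separate render thread, so the simulation never waits on the GPU directly:

        import queue
        import threading

        draw_queue: "queue.Queue[int]" = queue.Queue(maxsize=2)  # tiny frame buffer

        def game_thread(frames: int) -> None:
            for frame in range(frames):
                # ... tick gameplay, physics, AI for this frame ...
                draw_queue.put(frame)  # hand this frame's draw data to the renderer
            draw_queue.put(-1)         # sentinel: no more frames

        def render_thread() -> None:
            # Stand-in for the thread that turns draw data into GPU commands.
            while (frame := draw_queue.get()) != -1:
                print(f"submitting GPU commands for frame {frame}")

        rt = threading.Thread(target=render_thread)
        rt.start()
        game_thread(frames=5)
        rt.join()

    The bounded queue is the telling detail: if the renderer (or the GPU behind it) is slow, the game thread blocks on put(); if the game thread is slow, the renderer starves on get(). Those are the two faces of a bottleneck.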

    • @rodiculous9464 · 5 months ago

      Is there a way to check exactly which are the game threads and render threads? I don't see it in Afterburner/RTSS.

    • @noth606 · 3 months ago

      @@rodiculous9464 It depends on how it's done, to a degree, but in general no, since you don't see threads anywhere, only separate processes/exes. Also, there is no inherent reason for a game engine to have N-way threading enabled; it's unnecessary overhead to code support for. It's not impossible by any means, but it would bring a bunch of extra junk eating CPU time without enough gain in return. What I mean is, I can code a process to always spawn 4 threads, or just one, or *sniff* the load on the machine, take a count of cores and roll a pair of magic dice to decide how many threads to spawn, and check again every 10 seconds to decide whether to spawn new threads or cull and queue old ones, etc.
      I'm an ex software dev; most of the time, for small things, I never spawned multiple threads even if the task could support it, because it's more overhead than gain. For major worker loops I usually set a max of 4 or 8, depending on what it is. I did very occasionally code something to spawn MANY threads at once, because I wanted a sort of snapshot processed coherently: for example, 40 different parameters of a changing dataset sampled at once, each spawning a worker thread to process, so that once done I had the processed result of a "snapshot" of all the different parameters at one time.
      But all these things change faster than a user can perceive in many cases. It depends on how it was coded more than anything else. The OS cannot decide to spawn new threads for an application; the application can stay single-threaded if it wants to. The application decides more than the OS does, since it has to be coded specifically to spawn threads and then manage them; there is no "automagical" mode to it. If it isn't coded for it, it won't, nor can it, use multiple threads. And even if it is threaded in code, the developer(s) decide how many threads, when, and for what. The OS has no crystal ball to see what the application would benefit from, and thus has no positive control, and very little negative control.

  • @Gravgon · 1 year ago · +152

    I just wanted to thank you for all the videos you make. You have helped me so much with building, upgrading and troubleshooting my PC.

  • @wkeyser0024 · 8 months ago · +27

    Thank you for all the information you and your team provide. It makes a world that seemed unattainable approachable for the layman. Haven't built a computer since a 486, but that changes this weekend. Thank you sir, be well.

  • @princexaine5340 · 10 months ago · +23

    Really good guide Jay. I see this all the time, and I try my hardest to explain bottlenecking in its simplest form wherever possible - but a lot of people seem to think bottlenecking happens "immediately" when you pair two components together. And that just isn't true. It depends on the application. Very well thought out. Thank you.
    I actually paired an X5470 from 2008 (at 4.5 GHz) with my 4090 just to prove that the card can still run close to specification.

    • @chrischaf · 10 months ago · +4

      A lot will depend on how the game itself balances demands between the CPU and GPU, *and* what you are doing in the background.
      (I came back up here to mention, this got *very long*, so tldr people should just skip this post and have a good day ;D And, also, I *don't* have big issues with any other games, *including* what little I've played of Cyberpunk -8.2 hours- so this is a story of an "exception to the rule" sort of thing: *one particular game* that brought my system to its knees due to bottlenecking. For reference, I run an i7 7700 at stock speed, and a Zotac 1080 Mini at stock speed. SSDs all around. It's a prebuilt -Zotac MEK1- except for the 1080, a beefier power supply, 32 gigs of RAM and being stuffed with more SSDs than Zotac ever intended lol. And while it may sound very old by today's standards, to build *the same tier rig* in today's market would be about 3 times what I could afford. Plus, since I use a 65" TV running 1080p@120Hz I'm basically limited to 1080p gaming, which the current system already handles quite well *in most games*. STRESS the *in most games* part lol, hence this bottlenecking story)
      Like, I play a lot of 7 days to die, which actually relies *a lot* on the cpu when there are hordes (lots of zombies at once).
      Things worked pretty well on the base game, but when I started running a mod (Darkness falls) that tended to increase the size of the hordes a bit, and some various other things, it was dropping the game from completely playable, to completely UNplayable (12-14 fps while dozens of zombies are trying to pound on you, with occasional drops to 0 fps. yes *ZERO* fps lol!).
      So I started messing with game settings, trying to see what I could turn down in hopes of squeezing out a few more fps to get by...
      But...
      What I discovered, was that I could actually *turn a whole lot of graphics settings UP* and it didn't really matter, and I was actually already running the game at much lower settings than I really needed to, because it simply wasn't really the graphics being too high that were killing me.
      There was a major game update right in the middle of my settings-testing so I never got to learn if there were any particular settings worth turning down...
      But what I did manage to learn fairly quickly, was that I was almost constantly pegging the CPU up to 100% (in the task manager. I didn't have anything at the time to look at single core usage, but if it's riding at 100% in the task manager a majority of the time what's happening per core is probably less important lol).
      Now... I'll bring up one of the *big* issues that I hit on as a problem right away.
      I *stream* my games. So I always have OBS running in the background, and I always keep Chrome open to monitor my stream (just that one window, on a second monitor), and often I have Firefox open in the background for this and that (I still have all the bookmarks from back to the early 2000's in my browser, cause I always move them forward to my next PC, so Firefox sort of serves as *alternative/extra* memory for my brain, which I rely on heavily since I have such a bad memory. *Plus* I tend to always open things in new tabs, and my browser loads with all tabs from the previous sessions, so I can look back over my previous tabs to help keep track of time and when I did things, what I was looking up yesterday and the day before, etc. I rely on this so much that I often end up with hundreds, or even *thousands* of tabs in my main browser. I believe the record number of tabs I've had in Firefox at one time was 3,172 or something like that; somewhere over 3 thousand, I'd have to find the pic to know the exact number for sure. But, for reference, Chrome with my one Twitch page open tends to hog just as much or more CPU than Firefox with 2,000+ tabs).
      Chrome tended to want to use about 14% cpu most of the time (it liked to randomly use 60+% for no obvious good reason) and firefox tended to want to use 7%-14% as well, so the first thing I had to do, was stop streaming for testing.
      Aaaand, well I didn't get much further than that before the big game update changed too much to use what I'd already done, which for various reasons had taken a couple of days.
      And it's that part that was really showing me how cpu-limited/bottlenecked my system was with this particular game.
      The basic problem was, that getting REAL reliable AND *consistently* repeatable test results on that game was a heck of a lot harder and more time intensive than I expected. It was *literally* taking me *hours-per-setting* to get AND verify consistent/reliable/meaningful differences between runs.
      And the primary problem was that my video card was simply a bit *too* good for single changes to make an obvious impact.
      As in, the difference between low/off and ultra/on could be between 0-5 fps difference, while the variation between runs *with the same setting* could ALSO be between 0-5 fps.
      So I wasn't able to do just simple run/spin around tests and get real numbers. They were all over the place.
      I had to build a big square base with a particular place to stand, run around to all 4 corners and spin around to get everything to load, reset the time and weather to be the same, set my crosshairs on a particular point in the distance, then spawn a given number of zombies set to run towards me, and log the fps while they'd be beating on the base, and while I'd be shooting, etc. I had to do that multiple times for *every single setting* to have consistent, accurate results. And just running around in the city wouldn't have told me anything, because *that* part wasn't what was killing my fps. It was when my CPU was having to do this and that, when there were large groups of 25-50 or more trying to chase/attack at once.
      Which, ironically, was the absolute WORST time to have major fps drops.
      But, anyway, yeah. I *thought* that all these fancy green flamey effects that were added to some of the zombies (added by the mod) were going to be what was killing my fps, but no. spawning zombies that had lots of effects, had almost no impact compared to ones that didn't.
      That was all within the realm of what the graphics card could handle, if it was being allowed to handle things *at all*, which it *wasn't* when there were large groups, due to the cpu bottle-necking.
      So, yeah, if you go to do some graphics troubleshooting/tuning, and find that turning a bunch of settings down doesn't seem to actually help, and/or actually seems to make the game run slower/worse, you may very well be CPU-limited/bottlenecked, because your GPU is stuck waiting around for the CPU to take care of some things before it can do its next job.
      And, since I still can't afford to upgrade to a similar-tier modern system, I'm probably looking at needing to upgrade from my 7700 to a 7700K, since the K model's base clock is as fast as the standard model's full/max "turbo" speed.
      My MEK1's proprietary BIOS may not allow me to actually overclock the K, but just the default speed bump would essentially be an overclock, so it's all good. And with that, I think this system will be about as maxed as it can get.
      Although... I actually have a factory-refurbished 2080 Super in the mail that I got for a decent price, which is probably the max video card I could use in this system. lol
      That might seem silly considering I've been talking about my system already being bottlenecked by the CPU, *but*, there are some caveats.
      1) My system is only this heavily CPU-bottlenecked *on this one game*. Other games, which balance things more heavily on the graphics side, don't have such a game-breaking limitation.
      2) The new video card cost $276 (including tax and shipping). I couldn't buy anything near that tier from Nvidia for that price, and I will *never* touch another "no one would ever notice that it doesn't have a 256-bit memory interface during normal gaming" card again... I bought a lower-tier card that was recommended on that advice once, and had to suffer a year of seasickness-inducing micro-stutter before I could afford another 256-bit card to replace it. And since Nvidia is now even chopping down their xx70 series cards, that only leaves me xx80 series and higher, which cost... well you know what they cost ;P
      3) The 2080 Super is a "blower card". My system was designed to use a 1070 Ti blower card, which sits in a little chamber, and the blower card blows all the heat out the back.
      I swapped it for my 1080 Mini, because I already had the 1080 and I didn't want to use a 70 Ti when I already had an 80, even though, in my quick round of testing, the 80 didn't really perform much differently than the 70 Ti.
      But, yeah, the temps with the Mini are *bad*. I even tried adding a fan to pull the hot air out, and it made literally no difference. AND I found during recent testing that my 1080 *will not throttle* even though it's supposed to, so it's a risk to leave it running in there unmonitored. And it's actually the heat issue that was my biggest reason to buy the newer card.
      now it's time for lunch :) lol
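      The run-to-run noise described above can be put into numbers: a settings change only matters if the difference between averaged runs is bigger than the spread between repeated runs at the same settings. A small Python sketch, with made-up FPS figures:

          from statistics import mean, stdev

          baseline = [52.0, 49.5, 51.0, 50.5]  # repeated runs, same settings
          changed = [53.0, 50.0, 52.5, 51.0]   # repeated runs, one setting lowered

          diff = mean(changed) - mean(baseline)
          noise = max(stdev(baseline), stdev(changed))
          print(f"gain {diff:.1f} fps vs run-to-run noise {noise:.1f} fps")
          if abs(diff) <= noise:
              print("inside the noise: no measurable difference from that setting")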

    • @al3xb827 · 6 months ago

      Can you please help me? I have a Ryzen 5 3600X and a 6750 XT and there is a significant bottleneck. What should I do?

    • @princexaine5340 · 6 months ago · +1

      @@al3xb827 Turn graphics settings up. You can probably play at better settings without losing FPS due to the CPU bottleneck. You can try to OC your CPU if you have thermal headroom. You can try limiting the framerate, if you are experiencing large hitches in performance.
      ...Or just upgrade the CPU.

    • @al3xb827 · 6 months ago

      @@princexaine5340 thanks. Somehow it looks like in some games, if you change the graphics settings, there isn't such a big impact on FPS. But yeah, that was my idea, to OC the CPU at least. Since I just bought the GPU, sometime later this year I may upgrade the CPU.

    • @princexaine5340 · 6 months ago · +1

      @@al3xb827 Right, and I know hearing "upgrade" after you've already dropped money on a component isn't the answer we all want to hear, but in reality, you can only do so much to mitigate the bottleneck your setup is experiencing in titles where a powerful cpu is beneficial.

  • @Gravgon · 1 year ago · +66

    HAHA! At 6:18 I thought his hand coming up to point was part of the animation after putting the sword away.

    • @lennyshoe · 5 months ago · +1

      same 😂😂😂

    • @rezaimran98 · 5 months ago · +1

      RTX Overdrive

    • @SnarlyCharly · 20 days ago

      I was looking for this comment. I thought the exact same thing, "whoa how is he getting his character to point in those exact spots like that?"

  • @ice.3000 · 1 year ago · +124

    My biggest bottleneck is my wallet ..

    • @camotech1314 · 1 year ago · +7

      Just stop being poor 😂 bottleneck solved 😅

    • @ice.3000 · 1 year ago

      @@camotech1314 ok, I will stop being poor; from now on I will be a millionaire. Thanks for the lifehack!

    • @VioIetteMolotov · 7 months ago · +15

      Just stop being solved 😂 bottleneck poor 😅

    • @YA-mr9zx · 6 months ago · +3

      Just stop being bottleneck 😂 poor solved 😅

    • @BumpkinBros · 6 months ago · +4

      Just bottleneck being poor, stop solved 🔥🔥

  • @CaptToilet · 1 year ago · +141

    Interested to see this test again when the 2.0 update lands. The devs have been saying the CPU can be hit hard due to better utilization across the board.

    • @HanCurunyr · 1 year ago · +4

      I don't see recommending a 7800X3D or a 12700K for 1080p High without RT as "better utilization"; there is nothing good in that

    • @max16 · 1 year ago · +1

      ohhhh. that would be why my 4770K and 3080 have been acting weird after the update. Frames are still there as normal but CPU utilization is like... 60% now.

    • @justcrap3703 · 1 year ago · +2

      @@max16 But if it's because of better utilization, shouldn't your fps increase along with the usage?

    • @DavidTMSN · 1 year ago

      @@justcrap3703 Not necessarily.

    • @alexandruilea915 · 11 months ago · +1

      @@justcrap3703 maybe the GPU was already at its limits as well.

  • @o_Domo · 1 year ago · +29

    the iFixit ad never gets old

  • @hamzaalhassani4154 · 6 months ago · +135

    Out here playing Cyberpunk on an i3-9100F and a GTX 1660 with a CPU bottleneck at 40 fps. When Jay said "UGH! feels like playing on a potato" it hit me hard in the feels, man.

    • @tentymes10x · 6 months ago · +5

      I play Enshrouded on my i7 4790 and an RX 570 at 30 fps... I don't mind tho cuz I'm old

    • @hamzaalhassani4154 · 6 months ago · +2

      @@tentymes10x I've been playing at 60 fps a lot lately. I can't handle 30 fps anymore XD

    • @YourMother6yrsago · 5 months ago · +1

      @@hamzaalhassani4154 same, I upgraded to a 1650 Super build from a 1050, and now I can play every game at 60 fps where before I was struggling to get 30. I can't get used to 30 anymore

    • @megapet777 · 5 months ago · +1

      That's not such a bad setup. Cyberpunk is just a really demanding game. I bet you would get 60 fps in Elden Ring, for example.

    • @hamzaalhassani4154 · 5 months ago · +1

      @@megapet777 I can get 60 fps, you're right. But I have to sacrifice details and use upscaling whenever possible.

  • @micb3rd · 1 year ago · +58

    Also one more topic which is very interesting: not all bottlenecks feel equal. Jay is right that hitting a CPU limit gives large frametime spikes; it is very noticeable as stutters, and it feels horrible. When a GPU is at its limit the delivered frametimes are often quite stable, so it is a much, much nicer feeling. This is why it really is often best for motion fluidity to either load your GPU to 98% or set a framerate cap a little below where your CPU FPS limit is.
    This is also why Nvidia and AMD work on technologies like Reflex and Anti-Lag+ to allow a small buffer in the GPU rendering output (a few FPS below your set refresh rate) to ensure latency does not spike up when the current workload is saturating the GPU to its limit. The gameplay experience is much nicer.
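    A tiny Python sketch of that rule of thumb; the 3% margin is an illustrative choice, not a vendor-recommended value:

        def frame_cap(cpu_bound_fps: float, margin: float = 0.03) -> int:
            """Cap slightly below the measured CPU-limited framerate so the GPU
            never outruns the CPU and frame pacing stays even."""
            return int(cpu_bound_fps * (1.0 - margin))

        print(frame_cap(144.0))  # a measured 144 FPS CPU limit -> cap at 139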

    • @raven4k998 · 1 year ago · +1

      don't you just love it when you get an FPS sissy crying about not getting the same FPS as Jay or some other YouTuber's video on the high end with their GPU, even though the game is still butter smooth to play? 🤣🤣🤣

    • @micb3rd · 1 year ago · +10

      @@raven4k998
      What looks and feels butter smooth to one person is not butter smooth to another person.
      Lots of people are happy with 60 FPS from a motion clarity and input lag perspective.
      There are lots of people who have a much more enjoyable experience at 90 FPS to 144 FPS.
      It is down to their preference.
      I don't judge other people; I just help educate them to get a smoother, faster, better-looking gaming experience.

    • @raven4k998 · 1 year ago

      @@micb3rd cry me a river; the point he made was that 20 fps is nothing at all

    • @silverfoxvr8541 · 1 year ago · +3

      This really messes up VR, where frametime is king.

  • @BlackHoleForge · 1 year ago · +6

    Sometimes while tweaking my system, the numbers and specs just blur together from test after test.
    Thanks Jay for keeping it simple.

  • @PindleofKujata · 1 year ago · +64

    I'm looking forward to seeing how Cyberpunk 2.0 will handle CPU core load. It's supposed to utilise them far more effectively instead of just parking 90% of your cores and using one or two of them.

    • @PC_Pineapple · 1 year ago · +6

      apparently 90% usage on an 8-core CPU, from what a developer said on Twitter

    • @zagan1 · 1 year ago · +4

      Depends on what Windows does; you'll only see ~90% on a 4-core/8-thread CPU, for example.
      But with 32 threads etc. you'll be hard pressed to see the total go over 30 to 50% usage.
      Plus CPUs do so much more now, which also reduces usage.

    • @maolcogi · 1 year ago · +6

      I'm pretty excited for this personally. I have a 4090 and a 5800X3D and it already runs the game amazingly; with DLSS 3.5 and the CPU usage upgrades I feel like the game will be jaw-droppingly beautiful.

    • @durrik · 1 year ago · +2

      @@maolcogi I'm in the same boat but with a 13900K, extremely excited but concerned about temps lol

    • @PC_Pineapple · 1 year ago

      @@durrik The Cyberpunk dev did say to make sure your CPU cooling is up to the task 😅 We'll find out tomorrow

  • @oistyer · 11 months ago · +1

    Thanks so much for this, now I don't feel so nervous about my 3060 and my i5 12400

  • @calebjit · 2 months ago · +1

    I had been going insane trying to figure out where my bottleneck was, and this video single-handedly corrected my PC. I definitely am glad I stumbled on it

  • @ragetist · 1 year ago · +53

    I see many people talk about bottlenecking and just compare the GPU and CPU. If you run a graphically heavy game at 4K/144Hz you're very unlikely to be bottlenecked by your CPU. CPU + GPU is like a company with a painter and a mathematician, and work being hindered by either depends on the work you give them: if you ask them to paint a fresco the math guy is gonna sit idle, and if you ask them to do tax returns it's the other way around.
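    The analogy maps onto a two-stage pipeline where throughput is set by the slower stage. A hedged Python sketch with made-up per-frame costs:

        def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
            # In a pipelined renderer, the slower stage sets the frame rate.
            return 1000.0 / max(cpu_ms, gpu_ms)

        print(effective_fps(cpu_ms=4.0, gpu_ms=10.0))   # GPU-bound: 100 FPS
        print(effective_fps(cpu_ms=12.0, gpu_ms=10.0))  # CPU-bound: ~83 FPS, GPU idles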

    • @demontekdigital1704 · 1 year ago · +10

      That's an excellent analogy because that's basically exactly what happens. My only addition to this is while they're sitting around waiting for each other, the painter would be sipping some soy latte abomination while the mathematician would be sucking down double shot espressos, lol.

    • @Jason_Bover9000 · 1 year ago · +2

      @@demontekdigital1704 it's still in use when hitting 144 fps

    • @justinpatterson5291 · 1 year ago · +3

      That would mean your VRAM, SSD and RAM are like the accountant/warehouse manager who holds and keeps stock of what's being used/needed... Right?

    • @NickSteffen · 1 year ago · +4

      Also if you limit your fps to below your monitor's refresh rate then it's very hard to get CPU bound as well. There's no reason to have more frames than your monitor's refresh rate.
      You can just set this globally in the Nvidia Control Panel or AMD's settings. (Don't use in-game settings for it (or v-sync) as they are often terrible and cause problems).

    • @samuelsulaiman · 1 year ago · +1

      It's like a restaurant operation where the GPU is the kitchen and the CPU is the front of house. When the kitchen is beefy enough and can send food out as fast as it can, but the FOH just can't keep up delivering the food, that's CPU bottlenecking. The kitchen will end up slowing down production, because why cook when no one is picking up the food anyway?

  • @zackzeed · 1 year ago · +19

    I love these 'refresh' videos. Also it seems like Jay is the only techtuber that does this nowadays... correct me if I'm wrong!
    Much appreciated guys!

  • @Th3King0fHearts · 1 year ago · +9

    I will never not enjoy that iFixit ad 😂 Putting my build together this week so I have been binging your content! Hope your and your family's health are well!

  • @TheHoodSite · 11 months ago · +3

    The ad has gotta be my favorite part of the video; haven't seen something that funny/rad in a while.

  • @Altrop · 1 year ago · +5

    People need to realize that a bottleneck will vary depending on the game (unless you have a serious hardware mismatch). In fact, there are often both CPU bottlenecked moments and GPU bottlenecked moments in the same game. My old 5600X frequently bottlenecked my old 6700XT even though they are a good pair.
    CPU bottleneck is usually the worst.

    • @jaket1520 · 1 year ago · +2

      What did you upgrade to from the 5600X? I have an RX 7800 XT now with a 5600X and it kind of feels like the 5600X is a little bit of a bottleneck in some situations.

    • @dragonclubracing8669 · 10 months ago

      What happens when the CPU bottlenecks the GPU? I can currently get a 4090 at a good price and am considering buying, but I currently have a 5800X3D CPU. If I bottleneck at 1440p will games stutter or drop frames etc.? Thanks

    • @NoDFX_ · 9 months ago · +4

      @@dragonclubracing8669 A 5800X3D will work fine with a 4090; hell, you can get away with a 5600X.
      Some CPU-heavy games will have a little bottleneck on a 5600X, but no matter what, you have a bottleneck in your system, which was the entire point of this video.
      Any CPU that has come out in the past 5 years that isn't an i3 or Ryzen 3 will run a 4090 just fine. Even more fine if you're planning on playing at 1440p or 4K.

  • @liamcaroline448 · 1 year ago · +9

    Meanwhile the FPS where the 4090 thinks it has no load is where my 1070 is at full tilt.

  • @veraxis9961 · 1 year ago · +84

    I completely agree with the conclusions here. I have seen a trend over time towards this idea that a CPU and GPU need to be "matched" (i.e. an upper-mid tier GPU has to be used with an upper-mid tier CPU or else you will get a bottleneck) rather than the wisdom of 5-10 years ago that for gaming it tends to be most cost effective to buy a slightly better GPU even with a slightly worse CPU. I think that logic still holds. Aside from maybe specific examples of CPU-heavy games, the numbers seem to support that most modern CPUs should be able to handle a mid- or upper-mid tier GPU just fine without bottlenecking. You might get a small performance boost from a better CPU, but most of your base performance is still going to be scaled off of your GPU.

    • @Muppet-kz2nc · 1 year ago · +7

      The beauty of computing, whether productivity, gaming, or other, is tailoring something to your use case. I see it all the time in subreddits where people throw out recommendations with very little information on the use-case scenario. As a professional, most of the stuff I read and watch on YouTube really misses the mark. Nvidia has iterated leaps and bounds with its GeForce Experience gaming optimization settings and slider. It used to be horrid but does a pretty bang-up job dialing in settings if you take the time to use it. Decide on a target FPS you want, then turn the slider until you find a breakpoint that you like.

    • @Mr.Morden · 1 year ago · +7

      I got a 5600X with a 3060 Ti and live in Miami, FL. Even if I were willing to buy a peak-end system, I can't tolerate the heat that dumps out of anything more powerful. The ventilation in this house isn't set up to handle a hot spot like that. Nvidia would need to give me a voucher for a central AC upgrade and more duct installation.

    • @jondonnelly3 · 1 year ago · +3

      @@Mr.Morden A 5600X will handle a 4070 at 1440p, no issue.

    • @Yerinjibbang · 1 year ago

      just got a 7800 XT but still on my R5 2600 lol, looks like a 5600X will do as a fine upgrade @@jondonnelly3

    • @zackwalkman8574 · 1 year ago · +3

      I have an RTX 3060 Ti in my old i7 3770K machine. It can play most newer games without issues. Even the newest i3 is better than that, but it's still going.

  • @4N5W3R5 · 6 months ago · +3

    Been running my RTX 4090 with my old i7 8700K and gaming in 4K for nearly a year and a half now, and it's surprisingly good still (about 10-15 fps down on best-case benchmarks)... was just waiting for another massive jump in CPU performance before sinking more cash into my build.

  • @lckrgl · 11 months ago · +5

    I feel this one.
    Had a pretty old setup with a GTX 650 Ti and an FX-8320e, and upgraded the GPU first, to an RX 6600.
    That upgrade made almost no difference besides allowing me to play BG3 (the intended goal) with very bad performance.

  • @blahorgaslisk7763 · 1 year ago · +23

    Bottlenecking can give pretty weird results. Some years back both I and a friend upgraded our graphics cards to the RTX 2070 Super. He had a machine with an Intel Core i9 9900K while my machine was running a Core i7 6700. There's a pretty huge difference in CPU performance between these two, and in games his machine tended to be a lot faster. But when benchmarking The Division 2 we got some pretty interesting results. Like I said, we both got 2070 Super cards, but they were different makes and models. And in The Division 2 my machine consistently benched one to two FPS higher. The difference was that my graphics card had a very slight factory overclock, which was just enough to make a difference in the benchmarking of the game. However, what wasn't obvious when looking at the average FPS was that on my machine the CPU was pegged at over 90% for the entire benchmark. I actually got a good laugh when I sat watching the CPU utilization as reported by the game and it at some point reported a utilization of 105%...
    Meanwhile my friend's machine never broke 35% CPU utilization in the benchmark.
    When playing the game this was a lot more obvious. Loading times were a lot faster, and there were a lot fewer cases where the game slowed down.

    • @erikbritz8095 · 1 year ago · +4

      This is why I say if you have a CPU bottleneck you should instantly do a 15% overclock on the GPU, as the card makes up for the gap a tiny bit, but you still have to know there will be slowdowns and loading issues etc. Thing is, these issues really don't bother me much right now in my current combo, so I'm good.

    • @iikatinggangsengii2471 · 1 year ago

      I'm alright; if my belongings are all returned they'll keep me busy for years

    • @anhiirr · 1 year ago · +1

      I mean, some people adopt B-die RAM and OC the snot out of it, without realizing the price difference is close to grabbing a better CPU/board/chipset/RAM combination instead of buying "benchmarker flagship" RAM kits. Purely on price, if you sell what you have for an upgrade, the cost difference is almost negligible.

    • @MJSGamingSanctuary · 11 months ago · +1

      @@erikbritz8095 Yeah, OC'ing can improve things, but it could potentially also have lasting long-term effects on the CPU. I support OC'ing on test benches, but for gaming it's kinda walking into a black hole a bit. Any bugs or glitches that occur are like a can of worms XD. Most devs will just be like, WTAF are you playing on hardware from the late 90's for XD.

    • @erikbritz8095 · 11 months ago · +1

      @@MJSGamingSanctuary nah, my idea is a low-level overclock, enough to feel a difference but not enough to hurt anything. Plus my PC is doing fine post-GPU-upgrade, so now I'm saving up for an i5 13600K or Ryzen 5 7600X combo.

  • @TheModeRed · 1 year ago · +14

    I think I speak for most when I ask for a video on what settings to use for Starfield that firmly put the load on the GPU but give you the max FPS without hurting graphical quality too much. Firmly on the GPU is key. I understand this completely depends on your specific PC hardware, but a tutorial on how to min/max would be great.

    • @alfredthibodeaux2414 · 1 year ago · +3

      HUB has a couple of videos on this topic.

    • @insomniacjack729 · 1 year ago · +6

      Starfield has me confused. According to their min specs I'm below or at min with a 1700X and a 1080, but it runs just fine at 1440p medium settings. It dips below 60 in the large cities, but I can deal with that. What I don't understand is why people with better CPUs and GPUs are having the same issues.

    • @xSkylar64 · 1 year ago

      I used this on my 3070 Ti rig and it worked great. Highly recommend @@alfredthibodeaux2414

    • @fredEVOIX · 1 year ago · +4

      @@insomniacjack729 the game engine doesn't really like more than 8 threads, aka 4 cores. On 8 cores you didn't really see this, but now that we have 10-16 cores it became relevant: games don't know what to do and jump between cores all the time, creating stutter and FPS drops. Imagine talking on the phone but every word plays on a different one and you have 16 in front of you... that's the problem, but devs know it; a lot of recent games limit core usage by default now.

    • @jyubei_ichimonji · 1 year ago · +4

      @insomniacjack729 Starfield is one of this year's worst offenders, though. It's very badly optimized.
      Someone discovered recently that the Bethesda developers have a bad GPU driver implementation.

  • @SeaMonkeyMetals · 1 year ago · +134

    Rather than artificially slowing the CPU, you should have run this test on a legacy system. For example, drop that 4090 into a system with an Athlon X4 CPU and you will really see the effect.
    Before I understood bottlenecking, I paired an R9 390 with an 880k Athlon X4 and wondered why I didn't get much better performance over the HD 5860...
    Little performance increases and bad stability. I regret ever buying that R9 card.
    But at the time I didn't know about bottlenecking, and thought a new GPU would solve my problems.

    • @iikatinggangsengii2471 · 1 year ago

      yeah, did that with a 4670 a long time ago, absolutely no difference

    • @fortigan2174 · 1 year ago · +5

      The issue with the test you suggest is that the mobo of that setup will not have sufficient PCIe lanes. So at that point you are bottlenecking on the motherboard before the CPU even comes into the picture. That renders the test inconclusive as to how much of the bottleneck would be from the CPU.

    • @SeaMonkeyMetals · 1 year ago · +2

      @@fortigan2174 the Crossblade Ranger has a 16-lane PCIe slot... cards only use 8. I'm not sure what the lane issue you speak of is; however, I do realize that older boards have older-gen lanes. Now speed can have a huge impact, but you have to go way, way back in time to drop below PCIe 8x...
      My point was, gimping the CPU does not give an accurate reflection of a real-world scenario, where someone might be trying to use an overpowered graphics card in an old system that cannot keep up.

    • @SeaMonkeyMetals · 1 year ago

      @@fortigan2174 I'm no expert, and if I am missing something in my previous comment, I am open to clarification. Thank you.

    • @anteep4900 · 1 year ago

      haha!

  • @ivankong1065 · 11 months ago · +20

    Thank you for showing this video Jay. As your video from a couple of years ago pointed out, it is not that easy to bottleneck a GPU with modern tech these days unless you are really going bottom dollar to pair an i3 (e.g. a 13100) with a 4090. Thank you for demonstrating that here.

    • @SolarTara · 9 months ago

      My god, thank you for mentioning that it's an i3 he mentioned. I was confused because I thought he misspoke and meant an i7 13700 and was like... whoa... how would that be a bottleneck

    • @10th_Doctor · 9 months ago

      Or my antique i5-6600K paired with an RTX 3090. That situation is being resolved this month when all my new build parts arrive.

    • @Alex96194 · 8 months ago

      @@10th_Doctor I'm running an i5 9400F with an overclocked RTX 3070 Ti. Will also solve the issue soon.

    • @10th_Doctor · 8 months ago

      @@Alex96194 parts arriving by the end of the week, except a few like the Thermaltake CPU frame and AIO arriving next Monday.

    • @10th_Doctor · 8 months ago

      @@Alex96194 I am going with an i7-13700K, not a 14700K, as the "refresh" doesn't really make it much better than the 13700 but does cost extra, and 96GB of DDR5-6400. I also bought a Seasonic 1600W PSU with a 10-year warranty that will likely outlive my system even if I have it for another 20 years, and also has plenty of headroom for higher-powered components down the line. I am waiting to see what the 5th-gen Nvidia GPUs look like, so I'm not going with a 4th-gen one, as my 3090 is still good enough for really any game at 5120x1440 super ultrawide.

  • @Mr_Jimbo · 10 months ago · +12

    I thought I'd defined the term "bottleneck" by testing my new 4080 in my old Q9300 system (2 cores, OC'd to a whopping 3.3 GHz), as it was the only case/system I had that fit the monster cooler of the 4080. But in pure GPU benching it doesn't seem to be massively affected compared to my 9900K system, about 500-750 points down in a Furmark bench for example.

    • @ward7337 · 8 months ago · +3

      Now play a normal game

  • @SpacemanSpifff · 1 year ago · +171

    I'd love to see a video showing the same kind of thing with slightly older CPUs. The ones people are using that may be thinking of a GPU upgrade.

    • @GregBurgess360 · 1 year ago · +8

      Yeah, I have a 3600 and am wondering if it would bottleneck a 4070 hard

    • @Bello.. · 1 year ago · +3

      @@GregBurgess360 depends on your setup, but most likely it would be heavily bottlenecked. You should consider upgrading your CPU to something better on the AM4 platform if you don't want to spend much more

    • @oneonone8855 · 1 year ago

      Just turn on the graph in-game and you can see the GPU and CPU load in %. If you play a GPU-intense game like Cyberpunk at 1440p with high settings and your GPU is below 98% utilization, it's for sure bottlenecked. @@GregBurgess360

    • @Kmmlc · 1 year ago · +4

      I have a new (bought 2 months ago) 6750 XT with a 9900K. I can tell you the bottleneck is real with that pairing.

    • @justinjesse2107 · 1 year ago · +1

      ​@@GregBurgess360 as a fellow 3600 owner, it would, but not by that much. It would also depend on what res you're playing at. I play 1440p so my CPU doesn't work as hard

  • @dipakgosain · 1 year ago · +8

    The CPU is sending frames to the GPU so slowly the GPU is like, bro, we're not doing anything 😂

  • @MeeMoo220 · 1 year ago · +27

    This makes me feel better about pairing a used 3090 FE with the Ryzen 5 3600 w/ Prism cooler in my secondary gaming rig. Thanks Jay!

    • @rusudan9631 · 1 year ago · +3

      I set my 3600 to 2.2 GHz all cores; still hitting a stable 60 fps at 1440p in RDR2 and Cyberpunk with a 3060 Ti Eagle

    • @vincentvega3093 · 1 year ago · +6

      @@rusudan9631 clearly GPU limited 😂

    • @rustler08 · 1 year ago · +6

      If you're using a 3600 in games like Starfield or Cyberpunk, you should feel bad. You are leaving so much performance on the table, and I can tell you this considering I had a superior 3700X and swapped to a 7600X.
      Unless you're running lighter games, a 5600X3D or a 5800X3D would massively improve your gaming performance in Cyberpunk and Starfield.

    • @MeeMoo220 · 1 year ago · +1

      @@rustler08 You’re right. Thankfully, I’m only playing BG3 and LoL at 1440p 144Hz. If I wanted to run Starfield I’d need to sell my kidney. Not interested.

    • @__-fi6xg · 1 year ago

      Yeah, I remember playing For Honor on a new 6700 XT with an R5 2600, thinking it looks nice but something is off; when I moved to AM5 with the same GPU and a 7600X, the 1% lows and all the hiccups were gone.

  • @williamscott6209 · 9 months ago · +5

    Just recently upgraded my RTX 3060 to a Radeon 6950 XT. I was quite concerned that I'd be severely CPU bottlenecked since I'm running a 5600x and I was thinking "damn, am I gonna have to get 5800X3D to run this thing adequately?" but this video makes me think the bottleneck won't be nearly as bad as I thought. Hopefully my CPU can hold out for another generation or 2 until I end up upgrading the whole motherboard.

    • @BansheeNornPhenex · 9 months ago

      That's a downgrade..

    • @williamscott6209 · 8 months ago

      @@BansheeNornPhenex How is that a downgrade? The 6950 XT is twice as fast

    • @unnamed715 · 3 months ago

      I'm running a 5600X with a 4070 and having a pretty good experience. In worst case scenarios I just have to tweak the settings a bit.

  • @ThiagoMatuo · 1 year ago · +3

    Me with an R5 3600 / RTX 3070: I was playing at 1080p for a long time, and in some games that would reach above 120 fps I could feel a lot of bottleneck. My monitor was 75 Hz, so it was better to lock the fps to 75 than to try to go over it and have a lot of stuttering.
    Now I've upgraded my monitor to a UW 1440p and the bottleneck is better, but I still need a CPU upgrade.

  • @BeeWhere · 1 year ago · +5

    The biggest bottleneck for me has always been my monitors. Going to 1440p 144hz was a game changer but 165 or 240hz doesn't have the same impact. And 4k 144hz is still a bit out of my budget.

    • @dmytrosoboliev935 · 1 year ago · +2

      Yeah, I went from 1080p 60 Hz to 1440p 165 Hz recently and it's mind-blowing. Of course, only if you have at least 100 fps in games. At 60 fps it's not a big difference, pretty much the same, just a higher resolution. But in competitive titles 144 Hz or above is a gamechanger.

  • @l3lue7hunder12 · 1 year ago · +49

    As Jayz just demonstrated, the issue with bottlenecking isn't really that your games don't run - if you throw something like an RTX 4090 at it, most computers of the last 6 years should do at least reasonably well.
    The real issue for most is the price, because price-conscious buyers seeking to upgrade their system tend to buy graphics cards that according to benchmarks should be enough for their needs, instead of emptying their bank account to go all overkill with the latest, fastest graphics card model available.
    The problem here is that you won't get those same benchmark results if your CPU can't keep up, which means you just wasted money.

    • @LSSMIRAK · 1 year ago · +14

      Pretty much every benchmark video that compares GPUs is inaccurate due to this. They pair a 13900K with every GPU, misleading people by making them think that they'll get the same performance with their crappy 9-year-old CPU.

    • @DavidTMSN · 11 months ago · +2

      @@LSSMIRAK Exactly.
      You can have a bunch of cores, but if they're all clocked low and use an older IPC then the CPU is gonna be the bottleneck if paired with newer-architecture GPUs.
      I run a 10th-gen i9 with a 3090 and it works, but the improvements since then are pretty substantial - especially now with the 14700K/7800X3D - seems like the best upgrade path for my situation.

    • @richardrassat614 · 10 months ago · +3

      I have a 1920X Threadripper that is 6 years old and would bottleneck my 2070 Super. Didn't realize it until I started my new build, dropped my 4090 in, and got the same fps as the 2070 Super. The 1920X had a mild OC and never felt unplayable, but apparently it still has more headroom in it.

    • @MaddaxxxE · 10 months ago · +2

      Just upgraded my 11700F to a 7800X3D and can confirm that I was not getting the most out of my 4080 for almost a year at 3440x1440. I was so mad when I didn't get the same performance that I saw in YouTube videos in new games and realized that I was bottlenecked. I was missing 30-40 frames in some games.

    • @jouniosmala9921 · 9 months ago

      I upgraded my i7 920 with a 2070 Super and a 4K TV. That's what a real CPU bottleneck looks like in gaming. It proved to me that average FPS and 1% lows are often bullshit numbers; the real issue happens occasionally, when there's a spike in CPU load, for instance an enemy appearing on my screen a second after the ambush because the CPU didn't have enough power left to decompress its model while it was rendering the frames. Oh, the upgrade wasn't for gaming; I needed a new GPU for a GPGPU programming project. I did use it like that for almost a year, until I could afford to, and really needed to, replace the rest of the parts.

  • @leonbigio5499 · 1 year ago · +4

    Brother, games have gotten so good that when Jay put his hand on the monitor at 6:23 I thought it was an animation from the game. No kidding, I searched how to point in Cyberpunk lol

    • @xXXEnderCraftXXx · 1 year ago · +2

      The exact same thing I thought!

    • @daniell9834 · 1 year ago · +2

      lol I was just going to comment that, I was like how the fuck do you do that?

    • @xXXEnderCraftXXx · 1 year ago · +2

      @@daniell9834 RTX being too real 😂

    • @RainyFoxUwU · 1 year ago

      same here!

  • @patricklee8552 · 3 months ago · +1

    the calling card for a CPU upgrade is when you buy a new game and all that loads is a black screen, or it just doesn't load at all

  • @reinhardswart753 · 2 months ago

    Recently upgraded my old 3rd-gen i5 to a 10th-gen i7 and gained anywhere from 40-50 fps on my GTX 1070 in more modern games. After seeing it in action, I finally understood what bottlenecking is.

  • @LukeTheJoker · 1 year ago · +13

    Awesome video, I didn't realise how far you would have to go to cause a real bottleneck.

  • @JoeStuffzAlt · 1 year ago · +5

    I once upgraded my GPU and my CPU was bottlenecking. However, the bottlenecked GPU was overall faster than before. Of course, this was part of a phased upgrade. This was also one of those eras where I had to go from DDR2 to DDR3, which adds more to the CPU upgrade cost.

    • @anhiirr · 1 year ago

      For me the bigger fear was the operating voltages of DDR3, 1.65V RAM vs 1.5V. In that era it grew to be quite troublesome for a lot of end users, where a newer GPU also required a relatively ideal range of 12V rail operation/capability, paired with aspects of vdroop etc. when running 1.65V RAM. Those sorts of scenarios largely fried/killed a lot of people's builds/systems, especially as they were little-by-little upgrading aspects of their build like the GPU/SSD etc., without realizing how integral a tight system, from an operating-voltage standpoint, building a PC really became... even to this day.

  • @RysenKai · 1 year ago · +4

    I don't have any bottlenecking, because I just put all the iFixit pieces into my case.

  • @Evrae04 · 11 months ago · +1

    I was using a Core i7 6700T with a 3080. The card was crying for a new computer. Awesome video Jay.

  • @ericwhitney8277 · 1 year ago · +1

    I wish the whole gaming community would see this. I've had to explain this concept to so many people. Most people think that because a 7800X has higher FPS than a 5800X, the CPU is therefore bottlenecking their GPU.

  • @mackan072 · 1 year ago · +18

    I (briefly) ran an RTX 3080 coupled with an i7 4790K OC'd to 4.8 GHz, and at 3440x1440 I was significantly more GPU rather than CPU limited in most games. Sure, upgrading to the 5800X (once it arrived, mine got delayed...) did improve performance, but typically only in 1% lows rather than "general performance" for most games. All in all, my ancient i7 4790K fared significantly better than I expected, given how huge of a mismatch it is in hardware.

    • @TheModeRed · 1 year ago · +2

      I agree with the sentiment of confusion here. It's not just about bottlenecks. It's: how do I maximize the potential of a 3080 or a 7900 XT so that my older CPU really isn't a problem? Are certain areas of a game really CPU-intensive (bottlenecked) regardless of settings? 3440x1440 is a good use case. You'd think it'd stay GPU-bound, but lower a setting and maybe now you're CPU-bound. But if you have everything on ultra, you get less FPS. I think most YouTubers are almost getting there, but it's still confusing. The best recommendation I have is to turn on metrics and test every setting yourself to see where the bottleneck is when min/maxing for FPS vs quality.

    • @Gastell0 · 1 year ago · +2

      I have a 4790K and 4K as well, and just upgraded to a 7900 XTX; I still don't get CPU bottlenecks that would warrant a CPU upgrade.
      Might even modify my Z97 motherboard to support ReBAR (there's a GitHub project for that) to get more performance out of that ol' reliable CPU!

    • @mackan072 · 1 year ago · +2

      @@Gastell0 I still did find it worthwhile to upgrade my CPU though. There are some games where the CPU performance was subpar, and the upgrade to a better CPU made stutters less frequent.
      Just because I tended to be more GPU rather than CPU limited, it doesn't mean that I always was limited by the GPU. Better CPU performance still improved on the general gaming experience overall. More so in some titles than in others though.

    • @TheModeRed · 1 year ago

      @@Gastell0 So say we're playing Starfield on ultra settings. We want more FPS but not so much of a loss in visual quality. Does the CPU start bottlenecking in the city no matter what, or just when we lower settings trying to improve FPS? There seems to be a fine line between min/maxing and shifting load back onto the CPU that I think we will just have to figure out for ourselves on a per-game, per-area, per-rig basis.

    • @Gastell0 · 1 year ago · +2

      @@mackan072 That's definitely true. I was debating what to upgrade, MB+CPU+RAM or GPU, and went with the GPU as it provides the most gains (I changed from a GTX 1080)

  • @TheNtoukgr · 1 year ago · +5

    This is why I love your videos along with Steve's: proper execution with proper info and more analysis. Because of your help I could run my 3950X in its release period back in 2020. Well done Jay, you are AWESOME

  • @mr.gamerchannel2970 · 1 year ago · +145

    In short, the gamer's worst nightmare

    • @fififiz · 1 year ago · +12

      aside from showers

    • @Justlivin00 · 1 year ago · +10

      @@fififiz I love showers

    • @LilTachanka · 1 year ago · +20

      @@fififiz that's a redditor's worst fear

    • @drinkintea1572 · 1 year ago · +4

      @@LilTachanka nah, that should be "responsibilities"

    • @iHaveTheDocuments · 1 year ago

      @@fififiz What is a shower?

  • @flare9612 · 10 months ago

    Bottlenecking demo 101: Launch Age of Empires II, set the map to 12 players, max units to 500, and make them all meet in one place for glorious combat. Congratulations, even a Threadripper will be sweating at that task.

  • @ShaighJosephson · 13 days ago · +1

    This showed how well the 4090 responds when the CPU is not up to par... which is pretty damn good actually... 😮

  • @Shadow0fd3ath24 · 1 year ago · +17

    Too many people worry about this it seems. Yes, it's a concern, but you almost have to try to bottleneck a system in gaming, especially on any Ryzen or any gen of Intel i5/i7 mainline chip from the past 5+ years

    • @TheRealDlo · 1 year ago · +1

      I think the point is that Synergy matters 👍

    • @orangeapples · 1 year ago · +5

      Yeah. As long as things are of similar age and of similar price tiers you're okay.
      Don't pair a Celeron with an RX 7900 XT. Don't pair a 7800X3D with an RX 6400.
      And people need to know there is no magic set of hardware that REMOVES bottlenecks. They don't realize that different software will use hardware differently.

    • @neiltroppmann7773
      @neiltroppmann7773 1 year ago

      Not entirely true. If your CPU and mobo can only do PCIe Gen 3 (which is Intel 10th gen and earlier), and you buy a lower-end GPU like the RX 6600 that only has 8 PCIe lanes, you will see stutters and lower FPS than if you had Gen 4 x8 or Gen 3 x16.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 1 year ago

      @@neiltroppmann7773 That's not due to CPU choice, that's just the 6600 XT itself, and it would be FPS, not stuttering. PLUS the 6600 XT isn't fast or powerful enough to benefit from Gen 4 PCIe much at all; you lose MAYBE 5% in actual tests I've seen. If you're on a 6600 XT you're already gonna have stutters and lower FPS. Plus the 6600 XT is a HORRIBLE buy; a 2080 Ti is a solid 65-80% better and 150 bucks USD less, and a 3080 is like 50 bucks cheaper and even better

    • @HiluxForEveryone
      @HiluxForEveryone 1 year ago

      @@neiltroppmann7773 Valid point, although the stutter part is a bit far-fetched, as from what I know the only thing that'd actually change is the framerate

  • @NSA-admin
    @NSA-admin 10 months ago +4

    Your iFixit spots are still the best in the business. XD

  • @Redsfanatic32
    @Redsfanatic32 10 months ago +4

    I was a little worried about an 11700K paired with a 4070, but it seems my worries were misplaced.
    My biggest mistake was making my secondary storage drive a hard disk instead of an SSD.

  • @DigitalRecollections
    @DigitalRecollections 7 months ago +2

    Yeah, I had a 5700 XT with an i7-8700K... upgraded the GPU to a 7900 XT and only saw somewhat of an increase in 1440p gaming... ended up with a new AM5 build and all is well now lol

    • @AlejandroMagnoRivas
      @AlejandroMagnoRivas 3 months ago

      Maybe because ultra settings, or the latest heavy games like Hellblade 2 or Avatar, or anything with ray tracing, take all the power of your GPU.
      That Radeon isn't so powerful that it would bottleneck.
      I've seen an i7-7700K maintain the frames with your card.
      The i7-7700K doesn't bottleneck a GTX 1080 Ti, but it does bottleneck a 2080 Ti a little, I think.

  • @TheSpiikki
    @TheSpiikki 6 months ago

    I bought a 4070 Ti while I still had my i7-8700K. While it's been a great CPU for years, it started to show its age in larger games, so I had major bottlenecking. My GPU usage hovered anywhere from 50 to 70ish percent depending on the title. Today I upgraded to an i7-14700K and the difference is IMMENSE.

  • @TheRealDlo
    @TheRealDlo 1 year ago +7

    It is important to have SYNERGY in your system! Thanks J

  • @dakrawnik4208
    @dakrawnik4208 1 year ago +5

    Jay using RTSS without the frametime graph, while talking about noticing stutters and just spinning around watching FPS, is pure comedy 😂 (see the frametime sketch after this thread)

    • @WayStedYou
      @WayStedYou 1 year ago

      Not like he does this for a job or anything.
      Oh
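
    A minimal sketch of why the frametime graph matters: two runs with nearly the same average FPS can have wildly different 1% lows. The frametime lists below are invented for illustration.

      # Two synthetic runs: similar average FPS, very different stutter.
      def stats(frametimes_ms):
          avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
          worst = sorted(frametimes_ms, reverse=True)
          slowest = worst[: max(1, len(worst) // 100)]   # worst 1% of frames
          low_1pct_fps = 1000 / (sum(slowest) / len(slowest))
          return avg_fps, low_1pct_fps

      smooth   = [10.0] * 100               # steady 10 ms frames
      stuttery = [8.9] * 99 + [120.0]       # mostly fast, one 120 ms hitch

      for name, run in (("smooth", smooth), ("stuttery", stuttery)):
          avg, low = stats(run)
          print(f"{name:8s} avg {avg:5.1f} fps | 1% low {low:5.1f} fps")

    Both runs average roughly 100 fps, but the stuttery one's 1% low collapses to about 8 fps, which is exactly the hitching a bare FPS counter hides.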

  • @AmritpalMoga
    @AmritpalMoga 1 year ago +8

    How big of a bottleneck is RAM speed? Would be interesting to test.

    • @Cravenfr
      @Cravenfr 1 year ago

      IMO, a lot of testing for not-huge differences; there are tests out there where you see like +3 fps of difference, but the RAM kit also costs 3 times the reference kit 😂

    • @jordanlazarus7345
      @jordanlazarus7345 1 year ago

      It depends. It can make a massive difference, but that depends on how high the speeds already are and what CPU you're using. Sometimes a shitty RAM config can literally cut your performance in half, other times it's not the limiting factor.

  • @platinumgrit
    @platinumgrit 1 year ago +1

    LOL, I've literally bought additional iFixit toolkits because of Jay's ads 😆 It's how ads should be done!

  • @brokenmailman
    @brokenmailman 1 year ago +1

    When I first started to stream, the PC I used was an i7-7700 (non-K) with an RX 6600. The bottleneck was INSANE! I used to watch the framerate go from 120 fps to 30 fps, just all over the place.

  • @constitutionalright827
    @constitutionalright827 1 year ago +48

    Love the video, Jay. I would have liked it better to see this done with a basic card like a 2070... Basically, in line with your PC industry destruct video, I'd like to see a discussion like this on bottlenecking with mainstream, average components like those in 90% of the PCs out there.

    • @julianvera1098
      @julianvera1098 9 months ago +2

      Agree with this, also because I use a 2070 Super lol. I would like to see a Ryzen 5600 or so, which is the most popular for most gamers, with a 2070, 3060, 3070, or maybe a 3080: different cards with different CPUs

  • @CampamentoUL
    @CampamentoUL 1 year ago +3

    Really good video, man. I've been interested in PCs since I was a kid; I built my PCs when I was 16 and 19, and now I'm building my third at 25. When I built those I thought I knew everything, but now I have more perspective. Watching your videos I'm learning a lot, and I'm only a motherboard away from my next build. Really happy with my component decisions; thank you for sharing your knowledge!

  • @watercannonscollaboration2281
    @watercannonscollaboration2281 1 year ago +14

    “Bottlenecking”, the only word on par in terror with the phrase “future-proofing”

  • @GriffinCorpOne
    @GriffinCorpOne 1 year ago +1

    "Ten lane highway going down to a single lane dirt road" LOL - I love this channel

  • @rexyoshimoto4278
    @rexyoshimoto4278 11 months ago

    I did my last clock-up on the CPU this year. My rig has finally reached the age at which I'd hoped to replace it; it's running hotter to keep up in new and more demanding games. It's an Intel i7-8700K, 32 GB of 3200 RAM, and an Nvidia RTX 3080 Ti. I had two other GPUs (a Vega 64 and a 2080 Super), two years each. But for the 6 years of pleasure I had with the rig, it was a monster. I'm gonna retire it as my daily workhorse. Love this machine.

    • @vegg6408
      @vegg6408 11 months ago +1

      My PC died a week ago and it was an i5-2400 and a 1050 Ti

    • @rexyoshimoto4278
      @rexyoshimoto4278 11 months ago

      @@vegg6408 You kept it going that long? That's like grandma's car she putts around in forever. That's great. 2012. I have an old Dell Studio XPS 435MT in the garage. It runs, but it's as slow as a rug. Intel i7-920, the first-year i7, the smallest one. It came with an AMD HD 4850 2GB. The last GPU it had was an Nvidia GeForce GTX 970 SS. Still in the PCIe slot.😀

  • @TheGamefaq
    @TheGamefaq 1 year ago +4

    Jay, do that with Starfield. It also requires a powerful CPU! Changes to clock speed and/or the number of cores will become noticeable much sooner and more strongly.

    • @CarlosXPhone
      @CarlosXPhone 1 year ago +1

      No it doesn't. Starfield as a game is DESIGNED for the Series S. Jay just did a Starfield video saying the PC version is poorly optimized, which tells me that the PC version is in the same lane as the Series S.
      So bottlenecking Starfield is gonna require a last-gen card.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 1 year ago

      @@CarlosXPhone Starfield does require a powerful CPU. My 5800X3D is in the 50-80% range in New Atlantis at 1440p max lol. ON ALL CORES.

    • @CarlosXPhone
      @CarlosXPhone 1 year ago +1

      @@Pand0rasAct0r_ By PC standards it does, but I mean Starfield isn't that powerful as a game. It's just like Jay said: it's poorly optimized. He put a powerful CPU and a powerful GPU in there, and there was no improvement. I thought Starfield would've been the best place to game on a PC because of upgrades. Howard was lying to people when he said to get a powerful computer.

    • @TremorX
      @TremorX 1 year ago

      @@Pand0rasAct0r_ 5800X3D and 4090; it's not a hardware problem. It's 100% Starfield being a turd. Some of the INI file tweaks prove just how full of it Todd is about optimization. I get the feeling all they did was "make sure the game loads" and called it a day.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 1 year ago

      @@CarlosXPhone Mate, this game simulates thousands of objects in real time, even more than Skyrim or Fallout 4 did. Yes, it's Creation Engine jank as well, but damn did they improve its capabilities. It could be better optimised for sure, but it's simply wrong to say it's not a "powerful" game, because it very much is.

  • @sterilyte
    @sterilyte 1 year ago +4

    Hey Jay, I love what you're doing here, but I think a better example would be doing this with mismatched hardware, like putting a first-gen Ryzen (or even an FX chip) with the 4090, and doing the reverse, using something like a 1060 with a modern CPU. (I say Ryzen/FX because I don't know Intel chips that well.)

    • @nhf7170
      @nhf7170 1 year ago

      I had an FX chip with a 1070. Before that, an HD 7950. I don't think either ever saw usage over 75%.

    • @sterilyte
      @sterilyte 1 year ago

      @@nhf7170 I used an FX-8350 with an AMD 480 and it was a pretty good match, though the CPU was the bottleneck more often than the 480 was.

  • @Amin_2k
    @Amin_2k 1 year ago +6

    Exactly! I was pressured into changing my i7-4770K to a Ryzen 5 5600 (and I am still happy I did it, because I can run Windows 11 now), but in terms of gaming the difference I noticed was barely noticeable. Gaming is mostly GPU-bound, and CPU bottlenecks are very rare it seems, even if you have a 9-year-old CPU. I must add that if you play at 1080p you might notice a bigger difference, but if you play at 1440p or 4K the older CPUs are still amazing.

    • @secondc0ming
      @secondc0ming 1 year ago +9

      If you didn't see any difference in games going from a 4770k to a Ryzen 5600, you either play old games or you have an old GPU.

    • @HackedGlitch265
      @HackedGlitch265 1 year ago +4

      I think that stems from the fact that while GPUs have heavily increased in power, higher resolutions also require vastly more power to render.
      So at 1080p, which was once high end, GPUs can now scale that mountain, but CPUs are slower, so they struggle. Push up to 4K, and while the GPU is faster, it ends up scaling a sheer cliff while the CPU climbs merely a slope, and they end up reaching the top at around the same time.

    • @bigdaisy19k
      @bigdaisy19k 1 year ago

      @@HackedGlitch265 Best description ever. 👍

    • @lilpain1997
      @lilpain1997 1 year ago +4

      Yeah, I really want to know what GPU you were running to get barely any gains at all from a 4770K to a 5600. I went from an i7-3770K (not far off the 4770K at all) to a 3600, then a 5800X3D, and the jumps were massive. I play at 3440x1440 and noticed the jump more in the 1% and 0.1% lows. Averages really don't matter much, as you can get decent averages out of older CPUs, but your 1% and 0.1% lows are much worse, and if you play anything CPU-limited like Satisfactory you will notice a massive jump.

    • @Amin_2k
      @Amin_2k 1 year ago +2

      @@secondc0ming I have a 2070 Super, and I played newer games (Watch Dogs Legion, Forza Horizon 5, Metro Exodus). I have a benchmark comparison on my channel and the difference is not noticeable, at least not at 1440p. My question to you is: did you ever pair an older CPU like the 4770K with a medium/high-end modern GPU, or are you just speaking from what you think should happen?

  • @LeitoAE
    @LeitoAE 11 months ago

    The problem is that in the past people were saying that for gaming all you need is a strong GPU, because the GPU renders the graphics and games only need graphics...
    Against this, tech people were explaining that in certain situations you are going to bottleneck your GPU with a very old and slow CPU, or not enough memory, etc. That bottleneck term became so popular that nowadays we see bottleneck anxiety. People think that, as you said, there is only one perfect combo of GPU and CPU, and any other combination is going to give you a bottleneck.
    What is important is to understand what a bottleneck is and realise that you always have a bottleneck. It changes and swings from GPU to CPU all the time: not only when you play a different game, but also when you are in a different place on the map and do different things in the game itself. For example, in shooters it might be an explosion; in racing games it's going to be a burnout, where tire smoke is a heavy thing for the GPU to render. In that same racing game you can be bottlenecked when you play against many AI opponents, as the behavior of each one of them needs to be calculated by the CPU.

  • @curbthepain
    @curbthepain 10 months ago +1

    Man, that (the overclocked run at the beginning) is the smoothest I've seen a game run since I visited my local Milwaukee PC and played Borderlands 2 on an Nvidia-branded PC like 11 years ago.

  • @UserNamesAreObsolete
    @UserNamesAreObsolete 1 year ago +15

    Hello Jay,
    thanks for your video. People can now identify a bottleneck once they have built a PC, but how can people identify a bottleneck before buying new components?
    When my 1070 broke, it took me quite a lot of time to decide which GPU to buy next, since my system was already built, basically.
    I already had a 5600X and was waiting for better GPU prices before getting a new card.
    Well, after checking a lot of videos on pricing and performance, I decided to buy a 6800 XT for 1440p gaming.
    Later on I asked myself if I had created a bottleneck for myself.
    Could you make a video on recognizing the potential for a bottleneck on paper, so before you buy?
    Can you see which CPU / GPU are a good match just by comparing their technical data, and if so, how?
    I believe such a video would be a real help, which people could use as a rough guideline on what to look out for.
    Old rules like "the higher the frequency, the better the CPU" don't apply any more, and neither can you tell the best match just by looking at the core count of a CPU, else any EPYC would constantly be the best CPU for everything.
    What GPU would you pair with an R5 7600X? A 7800 XT? 7900 XT? 4070?
    And if you paired this setup, which PSU would you pick? 650 W or 750 W?
    What would change in your selection if ray tracing was considered unnecessary?
    Thanks in advance.

    • @michisauer
      @michisauer 1 year ago +4

      It's quite easy to work out even before buying a card:
      Hard bottlenecks should not happen anymore as long as you have at least 8 cores.
      Even a 6-core CPU running at high speeds will seldom drop you to framerates that count as a bottleneck in the fullest sense.
      Best advice is:
      Running a midrange CPU -> buy a midrange GPU.
      Pairing a 5600X with a 4090 is not a good choice.
      Funny thing is:
      Any 8-core CPU nowadays (if it's 8 perf cores) should be safe from forcing you into any bottlenecks.
      So, best advice in the end:
      If you have a CPU, read the tests to see how it performed with the GPUs that were out at that time. Then find a GPU test with the cards you want to use plus one of the cards from the CPU test in it.
      Check the performance difference between the cards.
      The trick behind this is that usually the best-performing card of the CPU's generation will be listed. If the CPU was able to run that card unlimited, any card with equal or slightly higher performance won't cause the CPU to bottleneck. Most times, even cards with much higher performance will only be slowed down a little (a rough sketch of this min-of-two-ceilings estimate follows this thread).
      Real bottlenecks occur only when the CPU or the GPU can't handle the workload anymore and they have to wait for each other.
      Therefore: buy accordingly and get your game settings right. Most times, when running into bad frame times, increasing the resolution or using GPU-intensive settings will lower the average FPS a bit but smooth out frame consistency, which in the end gives you the better experience.
      I know this answer is long and hard to read, but a list of which GPU to buy for which CPU would take years to make, just because there are so many combinations out there.
      Plus, you would need to do it for every resolution and a long list of games to get consistent results.

    • @tool46296
      @tool46296 1 year ago

      Google "bottleneck calculator". There is a nifty website that lets you choose a CPU, a GPU, what you'll be using them for, and at what resolution.
      Check it out. 👍🏼

    • @UrBeastyBalz619
      @UrBeastyBalz619 1 year ago +2

      Honestly, you have a very good pairing with the 5600X and 6800 XT. As for bottlenecking on paper, it really comes down to the individual game you're looking at playing. Some are CPU-heavy and others GPU-heavy.
      Hypothetically, you could get similar performance in a game with your 6800 XT paired with a 5950X or your 5600X, if it's GPU-heavy. But the next game may play way better with the 5950X if it's a CPU-heavy game.
      The argument could be made that in your previous pairing, 1070 and 5600X, your CPU was held back by your GPU. This is where I'd want to be, btw, not the other way around. But I digress.
      I don't upgrade anything until I see something in my system isn't providing the level of performance I'm looking for. Then I look at deals and wait for the best bang for buck, kinda like you did looking at the 6800 XT.
      I just upgraded to a 5600X from my 3300X because I started playing games that needed more CPU to push frames to my GPU fast enough (GPU at 60% util). But before that I never needed a faster CPU, just a GPU.
      To touch on your PSU question: I'd go with the 750 W; I'd even push for a good 850 W. Cards are getting more and more power hungry, so when you need another upgrade in the future you don't want to be buying ANOTHER PSU just to power the next GPU. Buy one that's good now so you can slot in a GPU later. Your PSU is a component that can grow with your rig.
      And for your pairing question on the 7600X: I'd buy the best GPU for your budget, even if your CPU bottlenecks it a bit in specific titles. Your GPU's potential isn't going anywhere. AMD will be releasing another gen on AM5 (fingers crossed it's the same story as AM4), so you can upgrade then to unlock your GPU's potential when you need to.
      Buy what you can afford and have fun gaming!
      But I'm just a dude who has his own opinions on stuff and I'm sure someone will shoot me down 😂
      I hope this helps. Cheers.

    • @imo098765
      @imo098765 1 year ago

      Hardware Unboxed and other benchmarking tech channels post CPU scaling videos every few months; that's the best place to look

    • @UserNamesAreObsolete
      @UserNamesAreObsolete 1 year ago

      @@imo098765 Thanks for the answer; that's how I decided on my GPU. I got it for 509 (the average price for a 6800 XT was 560 euros, a 6950 is 640 euros). But it took me some time to find a channel providing the necessary information. I could have used a video as a guide for it, including possible bottlenecks for certain CPU / GPU combos.
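
    A rough sketch of the "read two reviews" method michisauer describes above: treat the FPS a CPU review measured (with a top-end card) as the CPU's ceiling, the FPS a GPU review measured (with a top-end CPU) as the GPU's ceiling, and expect roughly the lower of the two. All numbers below are invented placeholders, not real benchmark data.

      # Min-of-two-ceilings pairing estimate (illustrative figures only).
      def expected_fps(cpu_ceiling: float, gpu_ceiling: float) -> float:
          """A system runs at roughly the lower of the two component ceilings."""
          return min(cpu_ceiling, gpu_ceiling)

      cpu_ceiling = 140  # FPS your CPU hit in a review, paired with a top GPU
      gpu_ceiling = 170  # FPS your candidate GPU hit with a top CPU, same game

      fps = expected_fps(cpu_ceiling, gpu_ceiling)
      limiter = "CPU" if cpu_ceiling < gpu_ceiling else "GPU"
      print(f"Expect ~{fps:.0f} fps, {limiter}-limited "
            f"({abs(cpu_ceiling - gpu_ceiling):.0f} fps of headroom unused)")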

  • @KimBoKastekniv47
    @KimBoKastekniv47 11 months ago +5

    Your other bottlenecking video from 2019 was much clearer to understand.

  • @ThickpropheT
    @ThickpropheT 1 year ago +12

    Thanks for sharing this! I've been meaning to test out my 4770K and 1070 combo to see how much bottlenecking there is, but didn't really have a good idea of what I'd be looking for; this clears it right up. Gonna check that out later

    • @fredEVOIX
      @fredEVOIX 1 year ago +3

      If you have Shadow of the Tomb Raider, run the benchmark at 1080p and it will tell you. Check "GPU BOUND": if it's 99%, your GPU is limiting the game; if it says 0%, it's your CPU. This will tell you the balance of your setup

    • @ThickpropheT
      @ThickpropheT 1 year ago

      @@fredEVOIX Thanks for the tips. Cheers!

    • @m8x425
      @m8x425 1 year ago +1

      My brother had a system with a stock 3770K and upgraded to a GTX 1070 back in 2017. He got a couple of years out of that setup and had no complaints. He did upgrade to a 9900K, but not for the sake of gaming performance.

    • @ThickpropheT
      @ThickpropheT 1 year ago

      @@friendlysloth hmm. I see. I guess I know what to expect now lol

    • @illustriousinc8608
      @illustriousinc8608 11 months ago +1

      @@ThickpropheT I'm still running a 3770K with a 1070 and there is nearly no bottlenecking. At best like 2-3% depending on the game/task, so it's not noticeable.

  • @akuma_soul
    @akuma_soul 1 year ago +1

    Rocking an old Gen 1 Threadripper and a 3090. After using my PC a LOT for 3D rendering and work, I noticed the following: in Cyberpunk, enemies just pop in when I arrive somewhere and can't be attacked... guess it's time to go for the Ryzen 7950X rig for the new update 👌

  • @esunisen3862
    @esunisen3862 11 months ago

    My rig: 4790K + 3060 Ti, everything at 100% in Starfield at 1080p high.
    Jay: This is a potato.

  • @dudenda757
    @dudenda757 1 year ago +14

    Awesome info here. Thanks Jay! I would love to see the same kind of testing done with an AMD CPU/GPU combo though.

  • @farlonmuentes6004
    @farlonmuentes6004 11 months ago

    Bottlenecking issues mostly come up for people with the budget to pick from a huge selection of parts. For us common folk who need to wait months or years to buy the next budget component, we rarely hit these bottleneck issues. In other words, bottlenecking is a rich man's first-world problem.

  • @TheRogueWolf
    @TheRogueWolf 1 year ago +5

    And, of course, bottlenecking is also going to vary depending on which game you're playing. So forget about building the perfect system that plays everything at 100% efficiency, because there's no such thing.

  • @Intrepid17011
    @Intrepid17011 1 year ago +5

    Currently bottlenecking my 7800X3D with a 6700 XT, upgraded from an i5-6600K.
    I chose to do it that way since I plan to get a new GPU anyway.
    Also, that way my CPU won't bottleneck my next GPU, which needs to be more powerful than my 6700 XT.
    Also, there are so many titles that are SO CPU-dependent, like Star Citizen (which I play a lot).
    I also feel like a GPU bottleneck is not as bad as the other way around:
    if your CPU is too slow for a title you can't do much; if your GPU is too slow you can turn down the settings.

    • @insertnamehere4419
      @insertnamehere4419 1 year ago +2

      lol, stop using the word bottleneck. Your CPU is not "bottlenecked" by your GPU.

    • @jiboo6850
      @jiboo6850 1 year ago

      Huh? What?? I think your PC is a trash can that needs to be emptied. No way in a million years would your PC bottleneck in a normal "good" situation. Something is wrong with your PC.

    • @insertnamehere4419
      @insertnamehere4419 1 year ago

      @@jiboo6850 There is always a "bottleneck". You are either CPU-limited or GPU-limited. You can also be IO-limited. You people need to just stop using the word. Pair a good CPU with a good GPU and you'll be fine. You will benefit from a better CPU when CPU-limited and a better GPU when GPU-limited. You have to have a really low-end CPU to literally "bottleneck" your GPU.

    • @jiboo6850
      @jiboo6850 1 year ago

      @@insertnamehere4419 Yes, but there's a difference between a bottleneck that is 3 lanes to 1 vs 10 lanes to 1. The first one is easier to deal with, just by finding the balanced settings that ask an equal job from both, and it's not difficult to figure out. What I meant in the first comment is that his config is high enough to pull Starfield easily; he just hasn't got the right settings to let it express itself at its best.

    • @HiluxForEveryone
      @HiluxForEveryone 1 year ago

      GPUs do not hold CPUs back. Stop using that term for that instance.

  • @pixelcutterofficial
    @pixelcutterofficial 1 year ago +1

    I recently got bad advice for a build, and the bad pairing of an i5-13400F and a 4060 Ti caused dramatic drops in FPS and stuttering. Got my money back on the original CPU and upped to a 13700KF, solving the problem, but DAMN was the bottleneck noticeable

  • @heyiamnick4144
    @heyiamnick4144 7 months ago

    I had a 3060 Ti and a 5700G, and usage for both rarely went past 50% while also having barely decent FPS on even low settings. Now I've upgraded my CPU to a 5800X3D; the GPU is properly utilized and I got a massive FPS boost even with high settings.

  • @dvr1337
    @dvr1337 11 months ago +2

    Cyberpunk uses 1 core, yet people buy 16-core CPUs 😂

  • @craigreustle2192
    @craigreustle2192 11 months ago

    When you went from 16c to 4c, the maths didn't math for the CPU utilization. With 16c, one core was doing 50%, a couple were doing 2%, and most were doing 0%. With 4c, one core was 98%, one was 42%, one was 27%, one was 23%, and one was 21%.
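
    The comment's numbers check out if you treat overall CPU utilization as roughly the mean of the per-core loads, which is why one pegged core barely registers on a 16-core chip. The per-core figures below are the ones quoted above (note the 4c run quotes five values, possibly a thread readout); the averaging itself is just arithmetic.

      # Overall utilization is approximately the mean of per-core loads.
      def overall_utilization(per_core_percent):
          return sum(per_core_percent) / len(per_core_percent)

      sixteen_core = [50, 2, 2] + [0] * 13
      four_core = [98, 42, 27, 23, 21]  # five values quoted for the 4c run

      print(f"16c overall: {overall_utilization(sixteen_core):.1f}%")  # ~3.4%
      print(f" 4c overall: {overall_utilization(four_core):.1f}%")     # ~42%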

  • @Castor586
    @Castor586 2 months ago

    Aight, I just subscribed because that's the most brilliant sponsor gig I've ever seen

  • @AKG58Z
    @AKG58Z 1 year ago

    An easy explanation:
    1: Imagine you're standing in a long line of people, waiting at the end to get a burger to eat.
    2: The person at the counter selling people burgers is working TOO slowly, so you will have to wait a long time for your turn.
    3: The owner walks in, sees what's causing this line of people, notices that the employee at the counter is working too slowly, and says "OK, so this person is the bottleneck here."
    4: Then he replaces him with someone faster, so the line gets shorter fast.
    So now you get it: in a PC, anything can be the bottleneck, causing the other components to wait for it to finish. (A tiny simulation of this idea follows.)
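
    A minimal sketch of that burger-line idea in code: a pipeline's throughput is capped by its slowest stage, just as the slowest PC component sets the frame rate. The stage rates are made up for illustration.

      # Throughput of a pipeline = rate of its slowest stage.
      def pipeline_fps(stage_rates):
          """Each rate is the FPS a stage could sustain on its own."""
          return min(stage_rates)

      stages = {"CPU (game logic)": 70, "GPU (rendering)": 144, "display": 240}
      bottleneck = min(stages, key=stages.get)
      print(f"Bottleneck: {bottleneck} -> {pipeline_fps(stages.values())} fps")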

  • @Chris-techgamesfood
    @Chris-techgamesfood 4 months ago +1

    At 1080p in Starfield with an RTX 2080 and i7-9700F I don't see any bottleneck.
    The CPU is getting on now, but it's still viable; I can run a lot of games at 120 fps and not notice any drops.

  • @TheNerd
    @TheNerd 2 months ago

    A hard CPU bottleneck is easy to find: reduce your game's resolution, say from 1440p to 1080p. If you still get the same or similar FPS, your CPU is too slow.
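
    A minimal sketch of that resolution-drop test; the 10% threshold is my assumption, not something from the comment.

      # If lowering resolution barely raises FPS, the GPU wasn't the limit.
      def diagnose(fps_high_res: float, fps_low_res: float,
                   threshold: float = 1.10) -> str:
          """Compare average FPS measured at two resolutions."""
          if fps_low_res < fps_high_res * threshold:
              return "CPU-bound: lowering resolution barely helped"
          return "GPU-bound: lower resolution raised FPS as expected"

      print(diagnose(fps_high_res=62, fps_low_res=64))  # CPU-bound
      print(diagnose(fps_high_res=62, fps_low_res=95))  # GPU-bound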

  • @kieferonline
    @kieferonline 1 year ago +1

    Ha! I love the iFixit commercial!! So funny. It's clear someone had a good time making that.

  • @trig984
    @trig984 1 year ago +2

    Thanks man... I get into "conversations" all the time about this CPU bottlenecking that GPU... I appreciate this content. So many people point to bottlenecking for PC issues where it doesn't apply. Nice

  • @ALLinx87
    @ALLinx87 2 months ago

    I have a 5950X paired with a 7900 XT, and it works great! This is the last upgrade.
    The next time I need to upgrade, it will be to build a new PC.

  • @biggil4
    @biggil4 11 months ago

    Your worst bottleneck scenario is my life with the Ryzen 7 2700X/2080 while playing Cyberpunk.
    This made a lot of sense. Thanks, bud!

  • @mikepawlikguitar
    @mikepawlikguitar 8 months ago +1

    My biggest bottleneck is my brain: I can't decide between building a brand-new PC on AM4 (5600) or AM5 (7600), with 16 GB or 32 GB of memory respectively, and a 4070. Something tells me AM5, but it's a fairly steep price premium over AM4.

    • @Lero_Po
      @Lero_Po 8 months ago +1

      For a 40-series card, definitely go AM5. As far as memory goes, it really depends on your budget and what you do with your computer. For gaming, 16 GB is pretty standard, but if you regularly play demanding games I would go for 32 GB.

    • @PHANT0M410
      @PHANT0M410 7 months ago +1

      Microcenter has a bundle right now for $499 that gets you an R7 7800X3D, a B650 board, and 32 GB of DDR5-6000 RAM. If your budget is $1,500 you can easily fit a 4070 Super into it

    • @mikepawlikguitar
      @mikepawlikguitar 7 months ago

      @@PHANT0M410 I love this... it's a sick deal. How TF can they afford to do that? If I lived in the US near a Microcenter, this would be my first stop! Unfortunately, Canada is the cheap knockoff version of the US, so we get shafted in so many ways 😂

  • @peateargriffen36
    @peateargriffen36 1 year ago +1

    I think turning on ReBAR helped my CPU stay more even with my GPU. I have a Ryzen 3700X with an RTX 3080.

  • @naut96
    @naut96 9 months ago

    As someone who is currently running a 7600K with a 3060, this confirms my thoughts.

  • @MasterJangleLeg
    @MasterJangleLeg 10 months ago +1

    I'm upgrading my i9-9900K in about 4-5 months to a 13700K for my 4070 Ti, but at the moment it ain't too bad.

  • @LandoKarzuk
    @LandoKarzuk 1 year ago

    I was going to skip the in-video advert, but I couldn't. Too good