i7 2600K 60FPS+ in 2022 Games?

  • Published: 16 Jan 2025

Comments • 267

  • @TheGoodOldGamer
    @TheGoodOldGamer  2 года назад +15

    Forgot to add: TUNIC at 300+ FPS, and Return to Monkey Island locked at 60 FPS. Obviously no issues with those two 2022 games either ;)

    • @3dkiller
      @3dkiller 2 года назад

      It will also run Red Dead Redemption 2 easily.

    • @danpazzo73
      @danpazzo73 Год назад

      Hi everyone! I have 3 old machines: a Xeon X5675 (X58, PCIe 2.0) with 12 GB of DDR3-1600 in triple channel, an i7 2600K (PCIe 3.0) with 16 GB of DDR3-1600, and an FX 8350 (PCIe 2.0) with 16 GB of DDR3-1866. I also have 3 graphics cards: an RX 6600, a GTX 970 and an RX 580... What would be the best way to pair them all up? And sorry for my English... 😊

    • @natejennings5884
      @natejennings5884 Год назад

      I have an i7-3770 (3.4GHz, 4 cores, 8 threads), 16GB DDR3 RAM, and an Nvidia RTX 3050 8GB. It runs No Man's Sky, MechWarrior 5, Batman: Arkham Knight, Dying Light, etc. damn well.

  • @lancebermejo3319
    @lancebermejo3319 2 года назад +29

    It's crazy how technology has changed so much. Back in 2002, a $750 PC from 1999 would be considered useless. Nowadays, even an 11 year old CPU is still going strong, wow...

  • @philscomputerlab
    @philscomputerlab 2 года назад +45

    Nice CPU OC and also very fast memory, good to see this old CPU still going strong 🙂

    • @TheGoodOldGamer
      @TheGoodOldGamer  2 года назад +11

      I literally mentioned that people with DDR3 platforms should go get 2133+ RAM, because it's cheap today and will extend the life of the platform.

    • @globalzero
      @globalzero 2 года назад +2

      @@TheGoodOldGamer the problem is that I can't run 2133 MHz; the PC won't boot, so I have to stay at 1600 MHz max.

    • @daytimerocker3808
      @daytimerocker3808 2 года назад

      @@TheGoodOldGamer even just 1866 is going to be a big jump over 1333 or 1600, and there's always a 2x8GB Patriot kit on Newegg for $45 and under; I've seen it on sale for $36 before.

    • @Wushu-viking
      @Wushu-viking Год назад

      Fast RAM is more important than it was back then. Many modern games are optimized for memory speed. Even the "ancient" X58 platform can surprise today if used with some 1600+ MHz low-latency RAM in triple channel. I just did a build with DDR3-1600 CL7 in triple channel, and it blew me away how it can still manage most games. Huge difference compared to the old "standard" 1333 CL9. Ten years ago the difference wasn't noticeable in games.
      Some high-end 2133 RAM at CL9 or CL10 will put extra life into Sandy/Ivy Bridge.
      Haswell especially, with 2133/2400 RAM, can compete with its 4-core 6th-10th-gen (DDR4 platform) counterparts at the same frequency.

    • @tvrtkocro28
      @tvrtkocro28 Год назад

      @@Wushu-viking I'm running a 2600K at 4.8GHz with 1900MHz CL11 RAM (an overclocked 1600 kit). How much am I losing by not using 2133MHz lower-latency RAM? It's paired with an AMD RX 580 4GB.

  • @anthonyyoung9810
    @anthonyyoung9810 Год назад +15

    Just scored a 2600k, mobo & ram for $50. I'm loving the old beast! What an absolute weapon!

    • @jorgmarowsky7242
      @jorgmarowsky7242 5 месяцев назад

      I just got mine with a case, GPU, mobo and RAM for 30€.

    • @anthonyyoung9810
      @anthonyyoung9810 5 месяцев назад

      Good work mate! Enjoy!!!​@@jorgmarowsky7242

  • @jeffro5032
    @jeffro5032 2 года назад +17

    Well, this makes me happy to see. I'm still rocking my OC'd 2600K at 4.533GHz on an Asus Sabertooth Z77 motherboard, with 32GB of RAM and a Sapphire R9 Fury. I mostly do sim racing, so it's been holding up well. Built this rig in 2013. I've thought about building a new machine later this winter or early spring. Prices finally seem to be dropping down to almost affordable again.

    • @bazvenom33
      @bazvenom33 Год назад

      That is an absolute beast of a MB. It's a shame that I cheaped out on the MB and got a P67, and now it can't OC my 2600k consistently anymore. Sigh. I bought a 1080 to upgrade my original 580. Thinking about upgrading as well, just not so sure a 40-series GPU is the way to go.

    • @jeffro5032
      @jeffro5032 Год назад

      @@bazvenom33 Yeah Baz, I have to say this MB is probably the most durable I've ever had. It's going on 10 years and has outlasted any other MB I've owned. Had to replace a few things in this rig over the years, but not the MB. Still running strong.

    • @Ofjelge
      @Ofjelge Год назад

      @@bazvenom33 I'm still using my i7-2600K with my Asus P8P67 Pro (B3) motherboard. It's slightly overclocked and runs optimally.
      As my son needed a better gaming PC, I recently installed a second-hand 2.5" SSD, upgraded the RAM by buying a couple more second-hand sticks (totalling 16 GB of DDR3-1333), and "splurged" on a Palit GeForce GTX 1650 StormX. Spent about the equivalent of $150, and now it can even run Horizon Zero Dawn with fairly decent graphics.

    • @bazvenom33
      @bazvenom33 Год назад

      @@Ofjelge Good to hear yours is still going strong! Yeah, I did upgrade to 16GB of RAM back when it was dirt cheap, and I've got multiple SSDs in my rig. I built a PC with a 1660 and a Ryzen 2600 for the in-laws about 3 years ago, and it seems to run games better than my own rig, at 1440p and 40-60 fps. Really made me think about upgrading my 2600k, hence looking at this video.

    • @tvrtkocro28
      @tvrtkocro28 Год назад

      @@bazvenom33 Oh, so it's the P67 chipset's fault? The best I can get out of my 2600k is 4.8GHz at 1.4V, and I tried a few 2600k chips on a P67 board; this was the best one. Would I maybe get more with a higher-end chipset?

  • @RCD4444
    @RCD4444 2 года назад +39

    I blame the lack of competition from X58 until Ryzen for this obsession with insanely high refresh rates. It feels like, to sell products during the PS4's lifespan (which held back software), we were tricked into upgrading needlessly. YES, 144, 165 and 240Hz are great, but it's insane to say 58 fps isn't playable. FPS obsession has replaced what made the PC scene so awesome in the late 90s and 2000s: price-to-performance and just enjoying games. Now I see all my friends spending half their time watching MSI Afterburner in the top left of the screen rather than focusing on the game they're "playing", wtf.

    • @RCD4444
      @RCD4444 2 года назад +6

      @@christopherjames9843 The slightly less tech-savvy friends of mine from that era didn't even know about fps. A game was playable when it launched and ran. Most of the time we upgraded our PCs back then, it was because our system was no longer compatible with a game we wanted to play. Now people spend more on RGB fans than we used to spend on GPUs.

    • @RCD4444
      @RCD4444 2 года назад

      @@christopherjames9843 A lot of the blame lies at the feet of the tech reviewers. It's tech porn that has warped consumers' brains and hooked them into an endless early-upgrade cycle.

    • @Zeakros
      @Zeakros 2 года назад +2

      During that time (mid-90s to mid-2000s), all the consoles from the PS1 to the PS3 and from the Xbox to the Xbox 360 delivered exactly 30 fps in every game, and no one called them unplayable.

    • @RCD4444
      @RCD4444 2 года назад +2

      @@Zeakros Don't get me wrong, high refresh is great... My first CRT monitor was 100Hz and I had my first 144Hz panel in 2012, but at no point did I think 144Hz was required.

    • @Zeakros
      @Zeakros 2 года назад +1

      @@RCD4444 Also, at that time I personally upgraded my PC when a new game would run below playable performance, i.e. below 30fps. Additionally, each CPU upgrade would give 3 to 5 times the performance of the previous system, not the mere 10% to 20% we see today (and even less in single-core performance). Such leaps were happening back then in a 1-2 year time frame. Today it's more about bragging rights than actual gaming and needed performance, and I think more and more people are beginning to realize that. I should also add that where I live (Europe), a max CPU + GPU upgrade (including MB and RAM) in 2013 would cost about 1500 to 2000 euros (I paid 1320 EUR back then for a 3770k and GTX 780 with MB and RAM). Today a 4090 with a 7950X, MB and RAM is priced here at about 4500 euros.

  • @ToneRetroGaming
    @ToneRetroGaming 2 года назад +21

    1% low figures would have painted a much fuller picture for us. Thanks for testing this out, though; it's really nice to see older hardware still performing well.

    • @b0ne91
      @b0ne91 2 года назад +1

      @@pf4934 He's using 2133 CL9. That is SUCH a huge difference compared to your 1600 CL11 (iirc Corsair Vengeance is CL11, might be CL9 tho). You can sometimes find cheap 2133 or 2400MHz DDR3 kits on the used market. Picked one up last week for 20€. It's a worthy upgrade if you come across one.

    • @Zesserie
      @Zesserie 2 года назад +1

      @@b0ne91 Is it really worth an upgrade though? It's DDR3 on a CPU which, when I tested it, was just really underpowered.
      For me there's no point in buying RAM for a 2600K when you're going to struggle anyway. An AMD 5600, motherboard and RAM can be had for less than 300 dollars new, and down to around 150-200 dollars used.
      And in games like CS:GO, Valorant, Apex etc., the CPU is very important if you want to be fairly competitive. Trying to squeeze more life out of a 10+ year old system is admirable, but only a fool would do it :D

    • @b0ne91
      @b0ne91 2 года назад

      @@Zesserie Not everyone has the US tech market available and often just buying RAM at roughly the price of your current RAM (which you will obviously sell) is more or less a free upgrade. I generally assume anyone sitting on a 2600k doesn't have a well priced tech market in their country. 2133C11 DDR3 will give a significant boost compared to 1600CL9. Huge boost, even.

    • @Zesserie
      @Zesserie 2 года назад

      @@b0ne91 i dont have access to the US tech market, but I can still buy it from ebay or aliexpress most of the time

    • @GTFour
      @GTFour Год назад +1

      @@Zesserie I went from 1600MHz to 2400MHz RAM on my OC'd 3570K just to see, and I couldn't notice a single difference whatsoever. Upgraded to a Ryzen 3600 and that was night-and-day faster.

  • @dalieu_Gaming
    @dalieu_Gaming Год назад +3

    In the Spider-Man test, you essentially proved that the video card was the bottleneck, not the i7-2600K. If you re-ran the test with a faster graphics card, say an RTX 4090, you would get a higher frame rate with RT on.

  • @ravioli0239
    @ravioli0239 2 года назад +10

    I must say, I'm quite surprised how it's holding up. I'm also impressed that you've got it running at 4.8GHz

    • @AshishKotnala29
      @AshishKotnala29 2 года назад +4

      5GHz in Spider-Man, if you look again. I'm more than 100% sure that high overclock is what pushed it to 60fps in Spider-Man.

    • @h1tzzYT
      @h1tzzYT 2 года назад +5

      @@AshishKotnala29 Even then, if you look at the footage without RT it dips to ~63-64fps. Sure, that's not bad, but considering that this 2600k platform is basically maxed out (super good RAM, close to max CPU OC) and he tested lots of indie games rather than actually demanding AAA games, his whole point is debatable. What about Cyberpunk, Far Cry 6, COD Warzone, BFV multiplayer, Metro Exodus Enhanced Edition (or the regular version), RDR2? What about an average OC with average RAM, which is what most people who still run a 2600k will actually have, not this maxed-out setup?
      GOW 2018 is an old game now, ported from PS4; Destroy All Humans is known for terrible optimization; Spider-Man is the most relevant title in this video and it cannot run at ultra-high settings at 60+fps. The others are either super meh games or just not demanding. The answer to this video is "yes***" with exceptions; it depends on your criteria.

    • @AshishKotnala29
      @AshishKotnala29 2 года назад +4

      @@h1tzzYT I completely agree mate. I just didn't wanna point it out in an obvious way. He's basically running the 2600K in a best-case scenario, or at least very close to it. Also, I guess the video was meant to prove to one person that it can run Spider-Man with RT, and to test games that HE purchased in 2022 on this CPU.

    • @h1tzzYT
      @h1tzzYT 2 года назад +1

      @@AshishKotnala29 Yeah, I guess so. Still, IMO it was a missed opportunity for a wider audience given the questionable testing conditions, but I always enjoy old CPU revisits :)

    • @King_ShadowX_X
      @King_ShadowX_X Год назад

      I'm still using it with an rtx 3060 and i can't complain

  • @jangelelcangry
    @jangelelcangry 2 года назад +4

    4.8 GHz?
    Sandy Bridge was something.

  • @gdjaybee742
    @gdjaybee742 Год назад +2

    We're really living in an age where even officially obsolete tech can run anything. It's a huge advantage: an almost complete $100 system (add a video card from $50 up to $500) will run anything.

  • @joselopeslinsneto3176
    @joselopeslinsneto3176 Год назад +2

    I overclocked my old one to 6GHz just for fun, and the motherfucker is still working like hell!

  • @ahmedfadzil3744
    @ahmedfadzil3744 Год назад +3

    I finally upgraded from my 2600K to a brand new 7800X3D machine. I still have the 2600K, and since it doesn't support Windows 11, I'm considering turning it into a Linux box.

  • @fabiusmaximuscunctator7390
    @fabiusmaximuscunctator7390 2 года назад +14

    Thank you for checking this out!
    I had a 3770K for years. I changed to a 9900K because I had some crazy stuttering in most of my games, like Kingdom Come: Deliverance while running through the big towns. In Vermintide 2 there were also a lot of frame drops in the big battles. I guess it really depends on the titles you play. In Tomb Raider and Metro Exodus the i7 was just fine. Nevertheless, I'm really glad I changed the CPU, as my games run way smoother and the averages nearly doubled in most of them. I guess if you really have to, you can still get some life out of it.

    • @Timmy51m
      @Timmy51m 2 года назад +4

      That's interesting, I still have a 3770 at 4.1 non k oc, 2133 ddr3, gtx 980, played KCD a couple of years ago and it was all good, played through RDR2 locked 30fps last year fine, playing through AC Origins at the moment fine.
      I definitely see where you're coming from, I built it so many years ago it seems ridiculous to still be gaming on it now, even though I can afford to build a new system I'm generally tight as hell lol and hate parting with things that are still in working order.

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 2 года назад +3

      The reason you didn't see that is the slower GPU. I had a 1080 that was simply too powerful in most games. To be fair though, KCD only felt laggy in the towns. My RAM could also have been a bit faulty, as I've seen some strange behavior at times.
      I think it's great you are still using it 👍🏻

    • @Timmy51m
      @Timmy51m 2 года назад +1

      @@fabiusmaximuscunctator7390 Could be right, it's quite well balanced as is, slightly more powerful than an Xbone or PS4, so anything they can play it can play at equivalent or slightly higher settings.
      It would bottleneck anything more powerful than an RTX 2060 from what I've seen, as I was looking at videos on here to see if it was worth it. I think I would look to build a new system like you have. It's great to hear other people's experiences as it helps make those decisions!

    • @tayloroxelgren264
      @tayloroxelgren264 2 года назад

      @@Timmy51m how did you overclock the non K? Just through base clock?

    • @Timmy51m
      @Timmy51m 2 года назад +3

      @@tayloroxelgren264 It's a feature on some Z-series mobos. I have an ASRock Z77E-ITX; it has a simple enable/disable toggle in the BIOS called "non k oc". If you enable it, it boosts the turbo frequencies by 400MHz, so 4.3 max and 4.1 on all cores.

  • @valentinvas6454
    @valentinvas6454 2 года назад +4

    That 4.8 GHz must have really boosted performance over stock.
    I played Elden Ring with a stock i7 3770K and it was quite a poor experience.

    • @haisitir
      @haisitir Месяц назад

      At 4.8 GHz performance is hugely increased, 15+ more fps, 2600k is a beast.

  • @Zeakros
    @Zeakros 2 года назад +3

    The reason such an old CPU as the 2600K is still viable today is that between 2009 and 2017 there was no meaningful evolution in CPUs. I have personally tested an i7 920 against an i7 3770K, both OC'd to the same frequency, and both had exactly the same performance in games. The main reason for this was that Intel had no rival during that period, and the second reason was that the same games had to be able to run on consoles. I personally made only 2 CPU upgrades during that period (i7 920 and i7 3770K), and the 3770K upgrade wasn't that necessary from a performance perspective.

  • @mirthumos
    @mirthumos 2 года назад +6

    Thanks for these tests. I'm still looking to get a 2600k myself. I've had an i5 2500k since 2011 and that CPU is still kicking, running Win 10 x64.

  • @m8x425
    @m8x425 2 года назад +3

    I've owned two 2600k CPUs and I still have a 2700k lying around. The original one overclocked to 4.8GHz with ease, and it could pass all stress tests at 1.345v with high LLC. The second one fought tooth and nail to hit 4.6GHz and needed 1.375v to remain stress-test stable. The 2700k overclocked to 4.8GHz pretty easily, but I haven't had much of a chance to test it because my P67 WS Revolution board died last year.
    At one point I got my original 2600k to 5.0GHz and it could run games at around 1.45v, but to get it stress-test stable it needed 1.52v. It also had a weird quirk where it wouldn't run at 4.9GHz unless I set the multiplier to 48x and then raised the base clock. Beyond 4.8GHz it needed water cooling... which motivated me to build my first loop. The CPU coolers on the market weren't too advanced back in 2011.
    Back in 2011 I built my dad a rig around the 2500k. After I upgraded his CPU to the 3770k, I tried my hand at overclocking the 2500k, but it needed too much vcore to hit 4.6ghz.

  • @giserson2
    @giserson2 2 года назад +1

    I have a cousin who has been using his 2600k since 2011. He had some trouble running some of the later Battlefield games (1, V), so I helped him overclock it to 4.2GHz (which needed very little extra voltage for rock-solid stability). He then got another 8GB of 1333MHz RAM for free (16GB total), and with that he's quite happy with the performance.
    With his configuration he probably wouldn't be able to play Elden Ring at a consistent 60+ FPS or Watch Dogs Legion either, but since he doesn't play those games he doesn't have a problem.

  • @TechLabUK
    @TechLabUK 2 года назад +1

    So cool to see these retro videos still. Want to get hold of the same CPU as this to play with. Have an i7 3770 hanging around but want a second gen for fun.

  • @MiKeSu223
    @MiKeSu223 2 года назад +2

    I can confirm it works with modern games. I've used it for 3-4 years and it's solid with a decent cooler.

  • @TheVanillatech
    @TheVanillatech 2 года назад +3

    Still hanging on to the Devil's Canyon 4790K locked at 4.4GHz. Still handles everything I play.

    • @3dkiller
      @3dkiller 2 года назад +1

      OC it higher, shouldn't be a problem. My 4770k does 4.8GHz water-cooled with a 280 rad.

  • @laejxela
    @laejxela 2 года назад +2

    I literally just replaced an i7 3930K (the Sandy Bridge-E 6-core variant) at 4.2GHz with a 2070 Super, and it's been an absolute MVP.
    It still coped fine with most games, but the heavily multithreaded ones were starting to hurt.
    Got a 7900X now.

  • @magottyk
    @magottyk 2 года назад +2

    A critical factor for old CPUs driving modern GPUs is where the point of diminishing returns lies.
    You're testing with a 3060 Ti, while the contemporary top card for Sandy Bridge was the GTX 580. The performance difference between those GPUs, according to TechPowerUp, is 474%.
    And while you're asking whether a 2600K can hit 60FPS in modern games, you're running a lot of DX11 titles which can in no way be considered modern; new games on old engines are essentially new releases of old games.
    In the two DX12 titles we can consider modern, CPU usage was very high: in Elden Ring you had up to 89%, and Spider-Man hit up to 95% on a thread and was consistently above 80% on all threads.
    One DX11 title, God of War, was also up to 98% and consistently above 90% on all threads.
    That raises the question of system bottlenecks: how do all these games run on a higher-performance platform and CPU driving a 3060 Ti?
    Looking elsewhere for just such a matchup, someone kindly uploaded a 5.0GHz OC'd 2600K vs a stock 2700, 3700X, 9900K at 4.7GHz and 10700 at 4.6GHz, with a much weaker 1080 GPU at low settings. Needless to say, the overclocked 2600K didn't fare very well and lost to the stock 2700 by 20-30%.
    In your on-screen stats we can see that GPU usage in most of the games is low: Elden Ring ~75%, Spider-Man 40% without RT and 50% with RT, and GOW ~80%.
    So it's obvious that the overclocked 2600K cannot drive a modern GPU effectively in modern games. What's really needed for perspective is the same tests on a modern CPU that can properly drive the GPU at the test settings, to gauge just how much the OC'd 2600K is bottlenecking the experience.
    A useful bit of info for anyone stuck with a 2600K is what GPU at what settings makes sense, and in this case I'd expect that going beyond an RX 580 or GTX 1060/1070 at 1080p is just not going to be worthwhile.
    So perhaps in future you can set a baseline so the tests give end users some meaningful reference for balancing their system's capabilities.
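    A very rough way to read those utilization numbers (an illustrative back-of-the-envelope sketch in Python, not anything from the video; the gpu_headroom helper is made up here and the figures are just the ones quoted in this thread): when a game is GPU-bound the card sits near 100% usage, so observed fps divided by GPU usage crudely estimates what the card could deliver if the CPU weren't the limit.

        # crude CPU-bottleneck estimate: fps the GPU might reach if the CPU weren't the limit
        def gpu_headroom(observed_fps: float, gpu_usage: float) -> float:
            # assumes fps scales roughly linearly with GPU utilization once the CPU stops limiting
            return observed_fps / gpu_usage

        # Spider-Man without RT: ~63 fps at ~40% GPU usage (figures quoted in this thread)
        print(round(gpu_headroom(63, 0.40)))  # ~158 fps -> most of the 3060 Ti is left idle

    It's only a heuristic (frame pacing and per-thread limits muddy it), but it shows how low GPU usage points at a CPU bottleneck.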

    • @magottyk
      @magottyk 2 года назад

      @@Phaethon569
      5900X @5.05GHz on X570, 32GB of B-die 3600 16-16-16, and an RX 6900 XT undervolted and power-limited to 220W (still better than stock specs), at 1440p.
      I'm already there and then some.
      In the link you listed, I find it interesting that an FX 6350 is outperforming a 2500K with a 2080 Ti, but these CPUs and platforms cannot shift the bottleneck to the GPU, being 20+% behind the top tier, which starts with the Ryzen 1600X being as good as a 7600X for these purposes.
      If you can't bottleneck at the GPU, then upgrading the GPU is pointless. And switching to lower-specced GPUs, which should shift the bottleneck, doesn't; so either there's an inherent platform penalty of around 20% on Sandy/Ivy Bridge systems, or these numbers are mostly algorithmic speculation and not real-world benchmarks.
      The problem I have with a lot of GOG's testing is that there is no control reference to work with, and not even a contemporary Piledriver system for a competitive slant.

  • @Quadratball
    @Quadratball 2 года назад +1

    Today I will replace my i7 2600k, and it feels kinda wrong. Every day it did a nice job, and it still does. I normally never replace working things.

  • @Kynareth6
    @Kynareth6 2 года назад +4

    Doesn't PCI-E 2.0 bottleneck anything faster than an RTX 2060 though?

    • @Kynareth6
      @Kynareth6 2 года назад +2

      I was wrong. It doesn't.

  • @richarddunne3689
    @richarddunne3689 2 года назад +4

    Had my 2600k o/c 4.7 for many, many years until I got a 5800x two years ago. Paired with a 570, a 970 then a 1080, was a great system.

  • @TheDennys21
    @TheDennys21 Год назад +3

    The most legendary CPU of all time, damn impressive!

  • @joeylogan5469
    @joeylogan5469 2 года назад +1

    I'm still rocking the I7 2600k with a RX580. I have 2 5600x builds but when all my grandkids are over one of them plays on it just fine.

  • @silentmenace9286
    @silentmenace9286 2 года назад +1

    I agree with you and Paul, just because it's not brand new doesn't mean that it's no good.

  • @megapixeler
    @megapixeler 2 года назад +3

    We need more videos like this one... People seem desperate to get the very latest thing out there, as if what they already have has suddenly become obsolete.

  • @Mazz-OmarAhmed
    @Mazz-OmarAhmed Год назад +3

    And if you have issues with the CPU hitting 100% usage, you can just cap the FPS at 60, or at 75 if you have a 75Hz monitor, and it will work fine.

  • @hindel6141
    @hindel6141 2 года назад +2

    Just to add, if I may, Mr. GoG: do not try to push the RAM past 2133MHz on that chip; the memory controller on it will not go beyond 2133. Also, you appear to have a golden sample. I mean, 5GHz at almost full load, only around 80W, and stable, is really golden. I pushed my i7 2600k too hard and I think it degraded; now the best stable clock I can get is 4.2GHz. Wish I weren't across the world, or I'd send it to you for testing.

  • @adi6293
    @adi6293 2 года назад +4

    What about some heavy hitters? Cyberpunk 2077, AC Valhalla or Far Cry 6? :P:P

  • @notwhatitwasbefore
    @notwhatitwasbefore 2 года назад +5

    Great work, great video. The 2600k did better than I expected overall, but it just goes to show that if you have enough threads and a high frequency, games are going to run at playable speeds for the most part.

  • @GodOfGamingBG
    @GodOfGamingBG 2 года назад +1

    Serious Sam: Siberian Mayhem does not use Unreal Engine; it has its own in-house "Serious Engine". In fact, the series started out as a tech demo for the first version of that engine.

  • @jielibai2912
    @jielibai2912 2 года назад +3

    GPU-intensive games will still run well on old systems... but trying a game like CK3 or Warhammer III on even a 6700k is rough.

  • @neonlost
    @neonlost 2 года назад +4

    i had a 2700k for so long almost a decade lol, the 5800x was a nice upgrade for the work i do tho

  • @NorthernLime
    @NorthernLime 2 года назад

    Thanks!

  • @WhiteSkyMage
    @WhiteSkyMage 2 года назад +1

    Hey Chris, it's surprising that my sister's i7 3770 is playing Bannerlord with mods quite well on just a Vega 64... in 2022.

  • @duckshepherd
    @duckshepherd Год назад +2

    Unfortunately the 2600k is not supported by Windows 11. This was the only thing that caused me to recently upgrade to a 5900x. Otherwise, I’d probably still be rocking my 2600k that I got in 2012. At the end of 2025, you’ll basically be forced to upgrade once Microsoft ends support of Windows 10.

    • @simontan5295
      @simontan5295 Год назад +1

      You can get Windows 11 running on it; I have it running on an old dual-core laptop with 3GB of RAM lol

    • @duckshepherd
      @duckshepherd Год назад

      @@simontan5295 Oh wow. Maybe just add more ram and you'll be good lol

  • @3dkiller
    @3dkiller 2 года назад +2

    4770k at 4800MHz, 2400MHz DDR3 at CL10-10-12, 1T. Runs flawlessly.

  • @ptown77
    @ptown77 2 года назад +2

    I'm curious how my old 2600X would fare against the 2600K. Battle of the 2600s!

  • @SRPMotors
    @SRPMotors 2 года назад +1

    I still run a 2600k in my gaming PC and it still works great. I have it at 4.9GHz with a 980 Ti. I haven't had any issues with any games, so I haven't upgraded yet.

  • @unclecreepy4324
    @unclecreepy4324 2 года назад +1

    I had this processor for ten years and it handled everything I threw at it, and I overclocked it to boot. I finally retired it when the octa-cores came out.

    • @Blackoutkingbeats
      @Blackoutkingbeats Год назад

      I don't understand why people ditched their quads for octa-core CPUs though. Nothing out there actually uses 16 threads... certainly no games. Unless you're running a server, you get the exact same performance whether you have 4 or 8 cores.

    • @unclecreepy4324
      @unclecreepy4324 Год назад

      @@Blackoutkingbeats If you're into video editing, octa-cores are great.

    • @Blackoutkingbeats
      @Blackoutkingbeats Год назад

      I mean, if you're into video editing, shouldn't you have your GPU doing the work?

    • @unclecreepy4324
      @unclecreepy4324 Год назад

      @@Blackoutkingbeats It helps, but the CPU is the most important thing in video editing.

  • @gubercc411
    @gubercc411 2 года назад +2

    I gave my gf my old i7-3770k (delidded and relidded with liquid metal) with 32GB of 1600 CL9, and she uses it for streaming and gaming, and it works. But streaming DOES start to slow it down. I think it does eventually get limited by the lack of cores when doing 20 different things at once. Still works great with a 6700 XT GPU.

  • @Knuckism
    @Knuckism 2 года назад +3

    Wow, I didn't expect this old CPU to still run this well in games. As a matter of fact, I still have a mobo + a 2600k + DDR3 1600MHz sitting in a box; what do you think I could sell it for?
    Also, yes, when I used my 2600k system it could easily do 4.8 to 5.0GHz on a single-fan AIO from Corsair, so your overclocks are definitely possible without really crazy cooling solutions. Great work on the video again. I really would love to give that old CPU a spin now just to see how well it could do, since 60 fps is basically all I need.

  • @LiLBitsDK
    @LiLBitsDK 2 года назад +1

    I still miss my old 2700K, it ran like a charm :D

  • @zShooK
    @zShooK Год назад +1

    I remember PC gaming for the first time (I was 13 at the time) when my oldest brother bought his first PC, with Crysis and Bad Company 2. I didn't have a clue what an i7 or 2600K meant at the time, but man, I think I played his PC more than him lol, he worked a lot 🤪 The 2600K was so legendary back then; the difference between PC and console was drastic around that time. 3 years later I ended up getting a PC of my own with a 4790K once I got a job. Never looked back at consoles since.

  • @adiffkindofswag1148
    @adiffkindofswag1148 2 года назад +4

    It's a wrap for CPUs and even GPUs now. We need another Crysis type game to push things forward.

    • @adiffkindofswag1148
      @adiffkindofswag1148 2 года назад +1

      @JacobTech except Cyberpunk doesn't look that impressive.

    • @rattlehead999
      @rattlehead999 2 года назад

      It's impressive how many objects there are on screen at the same time.

    • @Phaethon569
      @Phaethon569 2 года назад

      msfs2020 +vr checks all points ;)

  • @ilovehotdogs125790
    @ilovehotdogs125790 2 года назад +2

    2133c9 is insane! Super op ram

  • @scottstamm7022
    @scottstamm7022 2 года назад +2

    4.8GHz all the way till Spiderman, then 5GHz....why?

  • @melangkoh4184
    @melangkoh4184 Год назад +1

    rocking rtx2070 with i7 2700k

  • @zodwraith5745
    @zodwraith5745 2 года назад +2

    But that doesn't fit the narrative that you have to buy, buy, buy. High refresh is nice, but it's mostly a scam to sell CPUs, GPUs and monitors. This is why I still keep my 4790k around for the kids' gaming machine, and it's never lacking. That thing is still a beast, and I only replaced it for new features like NVMe storage. I'm sure the kids will still get another 4-5 years out of it for Roblox, platformers, and Switch emulation. The 4790k was really the first i9 before i9s became a thing. Paired with a 1660 Super it easily emulates Switch games at 4k/60, and the kids don't touch the Switch anymore.
    Sure, I would like a 240Hz monitor, but I'm not giving up 4k for it. I've always preferred fidelity over FPS, as long as I can get at least 60fps. Especially given the exponentially higher grunt needed for those super high refresh rates and the ever-diminishing returns, it just turns into a massive money sink chasing a fad propagated by companies trying to sell you something. I'm perfectly happy being a Good Enough Gamer when prices for everything are insane. The cost of a 40-series card would be much better spent upgrading 2 systems for more family members to enjoy than 1 powerful system I keep to myself.

  • @soucouyant
    @soucouyant 2 года назад +2

    Fantastic video. What about rendering videos? Productivity software? Thanks

  • @thaddeus2447
    @thaddeus2447 2 года назад +3

    I used an i7 2600k with a 1080 Ti for almost two years and then upgraded to an R5 3600. But the i7 with a 4k monitor and that Pascal monster was a great experience. 1440p at around 100fps in most titles was good too.

  • @stevin47
    @stevin47 2 года назад +2

    Heat may be an issue for V-Cache on a 7700X3D. The 7000 series runs way hotter than the 5000 series, and clocks had to be lowered on the 5800X3D to reduce heat. I don't see V-Cache coming to Ryzen 7000 until they sort out the die and solder thickness and cooling on the 7000 series.

    • @budgetking2591
      @budgetking2591 2 года назад +1

      7000 series are hotter, but they are made to run that hot, there will be no problems with the 7700x3d, the core package will be thicker, so the heatspreader can be thinner.

    • @Decki777
      @Decki777 2 года назад

      @@budgetking2591 High temperature is always bad, man.

    • @rattlehead999
      @rattlehead999 2 года назад +2

      Lower the clock speeds on the 7950x to 4.8GHz on all cores and it is only 0-5% slower, consumes 125W down from 230W, and runs at 55C with a decent cooler, down from 95C.
      Why AMD were such morons as to almost double the power consumption and heat for an extra 5% performance is beyond me.

    • @jjlw2378
      @jjlw2378 2 года назад +2

      @@rattlehead999 They did it so they didn't lose to Alderlake in benchmarks.

    • @stevin47
      @stevin47 2 года назад

      @@budgetking2591 LOL

  • @SelbyKendrick-to5si
    @SelbyKendrick-to5si 2 года назад +1

    Did I miss it or was the resolution not mentioned?

  • @josephnorris4095
    @josephnorris4095 2 года назад +9

    That would mean that at 1080p an overclocked FX-8350 would also be able to game with the same or similar results. However, you would need to tune the HyperTransport bus and the memory speeds and timings as well.

    • @Luke357
      @Luke357 2 года назад +2

      That is why Sandy Bridge will always be better. Sandy Bridge doesn't need tons of tuning like bulldozer did.

    • @AdiiS
      @AdiiS 2 года назад +2

      No AMD fanboy! FX8350 is shit compared to 2600k

    • @3dkiller
      @3dkiller 2 года назад

      Bulldozer, aka Faildozer, AMD's worst CPU so far. Complete trash.

  • @DannyzReviews
    @DannyzReviews 2 года назад +7

    You're definitely right about hardware far outpacing the needs of software now. Back in 2011, when the 2600K came out, you'd find nobody running a Pentium 4 (Prescott), but now you can find many content 2600K owners. The same goes for GPUs: you'll still find people today rocking R9 290s from 2013, but was anyone running a 2002 GPU in 2013?
    Games just need something revolutionary to happen to them, and it's not ray tracing lol.

    • @musek5048
      @musek5048 2 года назад +2

      Games went from being rough-looking and fun to being super realistic with terrible gameplay and storylines lol. Sounds like the priorities shifted to selling shiny new things, since they know how consumers function.

    • @DannyzReviews
      @DannyzReviews 2 года назад +1

      @@musek5048 I want a new conkers bad fur day!

    • @musek5048
      @musek5048 2 года назад +1

      @@DannyzReviews and not just a remaster, but a proper continuation of the story using the latest tech to make the visuals look like a pixar movie.

    • @Icureditwithmybrain
      @Icureditwithmybrain 2 года назад +2

      I realized this years ago. I used to upgrade my dad's computer hardware every few years so he could do more on his PC: make things load faster, make videos run more smoothly, etc. When I put an i7-6700, a GTX 1050 and an SSD in it, that was it. I realized his PC could do everything he required of it perfectly and that he would never need upgrades again.

  • @nuffsaid7759
    @nuffsaid7759 2 года назад +3

    This is the best CPU Intel ever made; it almost pushed AMD to bankruptcy and went on to monopolize the entire market.

  • @michaelthompson9798
    @michaelthompson9798 2 года назад +3

    If you look at many recently released titles, many only list a 6-core as a recommendation, not as a minimum. Besides, many game engines still only need 4 cores / 8 threads and, as mentioned, indie titles require only 4c/8t at recommended settings. We'll maybe need a 6+ core CPU as a minimum spec in the next 5+ years, as Unreal Engine 5 and the like slowly make their way into new titles over the next few years.

  • @jreamer971
    @jreamer971 2 года назад +1

    Whoever is still using this CPU definitely got their money's worth.

  • @zenstrata
    @zenstrata 2 года назад +2

    So, you are saying I don't need to replace my 5950x yet? ;D

    • @nuffsaid7759
      @nuffsaid7759 2 года назад +1

      You should obviously replace it, man; the 7950X gives you 1 extra frame, why wouldn't you take advantage of it?

    • @zenstrata
      @zenstrata 2 года назад

      @@nuffsaid7759 Gotta have them frames!!! So.. a quad 7950x system? ;)

  • @leerobinson8709
    @leerobinson8709 2 года назад +1

    I use a pair of Xeon E5-2690s in an HP Z620, which uses the same architecture as Sandy Bridge but has 8 hyperthreaded cores per CPU, which in modern multithreaded titles is probably helping more than the higher clock speeds of the 4-core 2600k. I am getting about 20%+ more performance in a lot of these titles, with a worse GPU.

  • @chrissraceporting7055
    @chrissraceporting7055 2 года назад +1

    3570k and I play my games fine. But it's time to upgrade a bit

  • @slimjimjimslim5923
    @slimjimjimslim5923 2 года назад +4

    Man I got a 8700, but after seeing this video maybe I don't need to upgrade to 12600 this year. XD Maybe I'll just get a bigger computer case and 3080ti.

  • @jerrycheung6414
    @jerrycheung6414 Год назад +1

    What GPU is used for testing ?

  • @Julian-eb5zq
    @Julian-eb5zq Год назад +1

    I got a 2500k at 5.1GHz and a 2600k at 4.9GHz; I'm about to swap to an i7 5820k cuz more cores.

  • @Hyperion1722
    @Hyperion1722 2 года назад +1

    Using a 6700K (4.7GHz) / 3080 Ti, it's just fine pushing most games at 4K to make the most of the GPU. With ray tracing / DLSS Quality, I can play Cyberpunk at 1620p (DLDSR) at 70 fps. Quite OK with me, and no need to upgrade.

  • @kommandokodiak6025
    @kommandokodiak6025 2 года назад +1

    2400Mhz ddr3 was affordable in 2015/16

  • @singular9
    @singular9 2 года назад +1

    This has been the case for a while. If you're okay with 60fps, just get a 4k monitor and a GPU that can handle the resolution and quality settings. I noticed that on my 60hz TV a lot of games are more than smooth enough. For eSports though where I demand 400+fps on 240hz... Yeah even my 6700k at 4.8ghz isn't enough

  • @taith2
    @taith2 2 года назад +1

    I'm sporting an i7 2600 myself; it works well at 1440p for the games I play. The motherboard also auto-overclocks it, despite it being the non-K version. I think I cheaped out on the K version, which was only ~$15 more back then.

  • @renderserik
    @renderserik Год назад +1

    Did you mention what graphics card?

  • @evgkul9767
    @evgkul9767 2 года назад +1

    Cool! Which card in 2022-2023 will be optimal and stable(if amd) for the i7 2600k ? ty

    • @BierzeItboxer
      @BierzeItboxer Год назад

      Every card you can afford. The CPU Bottleneck depends highly on the game you play.

  • @tezflowerxell7647
    @tezflowerxell7647 2 года назад +1

    Everyone needs to game at 4k!

  • @scarletspidernz
    @scarletspidernz 2 года назад +1

    I had an overclocked i5 2500k and was experiencing a lot of micro stutters while still hitting 60fps+. Switching to a Ryzen 2700 and keeping the same GPU/SSD, that stutter was gone. You won't always notice it, but it was there and it was annoying with the 2500k.

    • @fajaradi1223
      @fajaradi1223 2 года назад

      Yep, I've been there too. But I switched to a Ryzen 1600. The micro stutter is still there, but less frequent and less severe.
      In my case, it seems the micro stutter happens when the game demands more VRAM than my GPU has and then goes searching for available memory elsewhere.

  • @JohannDaart
    @JohannDaart 2 года назад +1

    Its price in my country is nearly the same as an R5 1600AF/2600 ;) So unless someone can get it dirt cheap in some old workstation, it's not worth it at all, apart from having fun with it.

  • @shanilsam
    @shanilsam 2 года назад +2

    Fantastic video! Love seeing analysis focused on keeping older hardware. Can I say , it's also wonderful to see you showcase and be positive about some modern titles. There are a lot of great titles out there and, to your point , very few need a killer system to be enjoyed.
    If you like Westerns, you may like Hard West 2. Actually preferred it to Weird West.

  • @stevieelder642
    @stevieelder642 2 года назад +1

    Hmm, I still have an old PC with an i7 2600k and an AMD HD 7970, but it hasn't been used for several years since I moved over to AMD; my current build is a Ryzen 5950x and an RTX 3080.

  • @extreme123dz
    @extreme123dz 2 года назад +1

    I have the i7 3770 but will change it, because in real life it's not only the "fps" that matters but other things too.

  • @VoldoronGaming
    @VoldoronGaming 2 года назад +1

    Cool. I have a 4770k clocked to 4.5 Ghz paired with a GTX 1070. DDR3 clocked at 1600 Mhz. I suppose I should get some 2400 Mhz ram.

    • @3dkiller
      @3dkiller 2 года назад

      Get the gskill ripjaws 2400mhz ones with 10-12-12 timings.

  • @ThePowersSix
    @ThePowersSix Год назад

    Whats the resolution, sorry if I missed it.

  • @denvera1g1
    @denvera1g1 2 года назад +1

    I need to do this with my 2009 Xeon X5690(6 cores 3.6Ghz boost with a good GPU)

  • @ahah1785
    @ahah1785 2 года назад

    1080TI: Still good in 2027? YEP!

  • @TatsuyaWow
    @TatsuyaWow 2 года назад +1

    I still have this CPU and it still runs well, but in CPU-demanding games you'll see the difference. I will upgrade soon tho.

  • @goregejones7248
    @goregejones7248 2 года назад

    1:46 12600k?

  • @adiffkindofswag1148
    @adiffkindofswag1148 2 года назад +1

    If Sandy Bridge can still play your games at over 60FPS after 11 years, then how long is Skylake going to last with another 25% IPC? And then CPUs like Zen 3 are 24% faster than Skylake, and ADL is 40% over Skylake.

  • @jacqli69
    @jacqli69 2 года назад +2

    Pretty sure 5800X3D will last me good until 2030 lolz...

    • @nuffsaid7759
      @nuffsaid7759 2 года назад +1

      Until consoles get better.
      Game companies don't make games for PC; they make games that run fine on consoles. That's why we got that fake Cyberpunk 2077, so different from the demo, and even The Witcher 3, and why people always talk about Ubisoft downgrades. Those downgrades happen because of console limitations.

  •  2 года назад

    5GHz from a 3.4 GHz CPU? I have an i5 4690k and I was only able to OC it to 4.0GHz...

    • @Phaethon569
      @Phaethon569 2 года назад +1

      a good z77 based mb + hydro cooling makes this possible

  • @Nianfur
    @Nianfur 2 года назад

    It's all about the 0.1% and 1% lows.

  • @kevincampbell989
    @kevincampbell989 8 месяцев назад

    Kinda proves you don't always need the latest and greatest. Just gotta go in with the right expectations.

  • @WbosonLP
    @WbosonLP 2 года назад +1

    Sure, for SP games, but in online MP shooters I gained 2x the MIN FPS when switching from an R7 2700X to an R5 5600X, and I'm sure there's more to gain going from 6 cores to 8 cores.
    Also, in SOTR my 3770k at 4.2GHz was bottlenecking hard; even the 2700X didn't have any gains in the SOTR jungle city, which is also a SOTR fault.
    The new Zen 4 is a waste of money IMHO; Zen 3, or Alder Lake from Intel, would be better options.

  • @mikebruzzone9570
    @mikebruzzone9570 2 года назад

    Chris what was the graphics card? mb

  • @ryanmalin
    @ryanmalin 2 года назад +1

    I had a 7700k and 1660ti. I could run any game with that system at decent fps. If all you do is gaming, 4 core is still enough in 2022.

  • @DC3Refom
    @DC3Refom 2 года назад

    Simulation software, no chance, e.g. Fernbus, X-Plane 12, etc.

  • @yurimodin7333
    @yurimodin7333 2 года назад

    all the guys who didn't cheap out on hyperthreading are still laughing now.....

  • @Nekropsyy
    @Nekropsyy 2 года назад +1

    i just pulled this cpu out of an old computer tower and now im going to throw a 1660ti together with it that i just had laying around. siiick

  • @orgrimdoomhammer2161
    @orgrimdoomhammer2161 2 года назад

    I need advice: should I go for an RTX 3060? I've had an i7 2600 non-K for 11 years now and don't wanna change my PC :D

  • @m3mph1st0
    @m3mph1st0 2 года назад

    Why didn't you show the frame rate versus the 12100?