Forgot to add: TUNIC at 300+ FPS, and Escape from Monkey Island locked at 60 FPS. Obviously no issues with those two 2022 games either ;)
It will also run Red dead redemption 2 easily.
Hi everyone! I have three old machines: a Xeon X5675 (X58, PCIe 2.0) with 12 GB DDR3-1600 in triple channel, an i7-2600K (PCIe 3.0) with 16 GB DDR3-1600, and an FX-8350 (PCIe 2.0) with 16 GB DDR3-1866. I also have three graphics cards: an RX 6600, a GTX 970 and an RX 580... What would be the best way to pair them all up, please? And sorry for my English... 😊
I have an i7-3770 (3.4 GHz, 4 cores, 8 threads), 16 GB DDR3 RAM, and an Nvidia RTX 3050 8GB. It runs No Man's Sky, MechWarrior 5, Batman: Arkham Knight, Dying Light, etc. damn well.
It's crazy how technology has changed. Back in 2002, a $750 PC from 1999 would be considered useless. Nowadays, even an 11-year-old CPU is still going strong, wow...
Nice CPU OC and also very fast memory, good to see this old CPU still going strong 🙂
I literally mentioned that people on DDR3 platforms should go get 2133+ RAM because it's cheap today and will extend the life of the platform.
@@TheGoodOldGamer The problem is that I can't run 2133 MHz; the PC won't boot, so I have to stay at 1600 MHz max.
@@TheGoodOldGamer Even just 1866 is going to be huge over 1333 or 1600, and there's always a 2x8 GB Patriot kit on Newegg for $45 and under; I've seen it on sale for $36 before.
Fast RAM is more important than it was back then. Many modern games are sensitive to memory speed. Even the "ancient" X58 platform can surprise today if used with some 1600+ MHz, low-latency RAM in triple channel. I just did a build with DDR3-1600 CL7 in triple channel, and it blew me away how it can still manage most games. Huge difference compared to the old "standard" 1333 CL9. Ten years ago, the difference wasn't noticeable in gaming.
Some high-end 2133 RAM at CL9 or CL10 will put extra life into Sandy/Ivy Bridge.
Haswell especially, with 2133/2400 RAM, can compete in performance with its 4-core 6th-10th gen (DDR4 platform) counterparts at the same frequency.
@@Wushu-viking I'm running a 2600K @ 4.8 GHz and 1900 MHz CL11 RAM (an overclocked 1600 kit); how much am I losing by not using 2133 MHz lower-latency RAM? It's paired with an AMD RX 580 4GB.
Just scored a 2600K, mobo & RAM for $50. I'm loving the old beast! What an absolute weapon!
I just got one with a case, GPU, mobo and RAM for 30€.
@@jorgmarowsky7242 Good work mate! Enjoy!!!
Well, this makes me happy to see. I'm still rocking my OC'd 2600K @ 4.533 GHz on an Asus Sabertooth Z77 motherboard, with 32 GB RAM and a Sapphire R9 Fury. I mostly do sim racing, so it's been holding up well. Built this rig in 2013. I have thought about building a new machine later this winter or early spring. Prices finally seem to be dropping down to almost affordable again.
That is an absolute beast of a MB; it's a shame that I cheaped out on the MB and got a P67, and now it can't OC my 2600K consistently anymore. Sigh. I bought a 1080 to upgrade my original 580. Thinking about upgrading as well. Just not so sure a 40-series GPU is the way to go.
@@bazvenom33 Yeah Baz, I have to say this MB is probably the most durable MB I've ever had. It's going on 10 years, and it has outlasted any MB I've ever owned. Had to replace a few things in this rig over the years, but not the MB. Still running strong.
@@bazvenom33 I'm still using my i7-2600K with my Asus P8P67 Pro (B3) motherboard. It's slightly overclocked and runs optimally.
As my son needed a better gaming PC, I recently installed a second-hand 2.5" SSD, upgraded the RAM by buying a couple more second-hand sticks (totalling 16 GB of DDR3-1333) and "splurged" on a Palit GeForce GTX 1650 StormX. Spent about the equivalent of $150, and now it can even run Horizon Zero Dawn with fairly decent graphics.
@@Ofjelge Good to hear yours is still going strong! Yeah, I did upgrade to 16 GB of RAM back when it was dirt cheap and got multiple SSDs in my rig. I built a PC with a 1660 and a Ryzen 2600 for the in-laws about 3 years ago and it seems to run better than my own rig, playing games at 1440p at 40-60 fps. Really made me think of upgrading my 2600K, hence looking at this video.
@@bazvenom33 Oh, so it's the P67 chipset's fault? The best I can get out of my 2600K is 4.8 GHz @ 1.4 V, and I tried a few 2600Ks on this P67 board; this was the best one. Would I maybe get more with a higher-end chipset?
I blame the lack of competition from X58 until Ryzen for this obsession with insanely high refresh rates. It feels like, in order to sell products during the PS4's lifespan (which held back software), we were tricked into upgrading needlessly. YES, 144, 165 and 240 Hz are great, but it's insane to say 58 fps isn't playable. FPS obsession has replaced what made the PC scene so awesome in the late 90s and 2000s: price-to-performance and just enjoying games. Now I see all my friends spending half their time watching MSI Afterburner in the top left of the screen rather than focusing on the game they are "playing". WTF.
@@christopherjames9843 The slightly less tech-savvy friends of mine from that era didn't even know about fps. A game was playable when it launched and ran. Most of the time we upgraded our PCs back then, it was because our system was no longer compatible with a game we wanted to play. Now people spend more on RGB fans than we used to spend on GPUs.
@@christopherjames9843 There is a lot of blame that lies at the feet of the tech reviewers. It's tech porn that has warped consumers' brains and hooked them into an endless early-upgrade cycle.
During that time (mid-90s to mid-2000s), all consoles from the PS1 to the PS3 and the Xbox to the Xbox 360 ran at exactly 30 fps in every game, and no one called them unplayable.
@@Zeakros Don't get me wrong, high refresh is great... My first CRT monitor was 100 Hz and I had my first 144 Hz panel in 2012, but at no point did I think 144 Hz was required.
@@RCD4444 Also, at that time I personally upgraded my PC when a new game would run below playable performance, i.e. below 30 fps. Additionally, each CPU upgrade would result in 3 to 5 times the performance of the previous system, and not the mere 10% to 20% we see today (and even less in single-core performance). Such leaps were happening back then in a 1-2 year time frame. Today it's more about bragging rights than actual gaming and needed performance, and I think more and more people are beginning to realize that. I should also add that where I live (Europe), a max CPU + GPU (including MB and RAM) upgrade in 2013 would cost about 1500 to 2000 euros (I paid 1320 EUR back then for a 3770K and a GTX 780 with MB and RAM). Today a 4090 with a 7950X, MB and RAM is priced here at about 4500 euros.
1% low info would have painted a much fuller picture for us. Thanks for testing this out though; it's really nice to see older hardware still performing well.
@@pf4934 He's using 2133 CL9. That is SUCH a huge difference compared to your 1600 CL11 (IIRC Corsair Vengeance is CL11, might be CL9 though). You can sometimes find cheap 2133 or 2400 MHz DDR3 kits on the used market. Picked one up last week for 20€. It's a worthy upgrade if you come across it.
@@b0ne91 Is it really worth an upgrade though? It's DDR3 on a CPU which, when I tested it, was just really underpowered.
For me there's no point in buying RAM for a 2600K, where you're going to struggle anyway. An AMD Ryzen 5 5600, motherboard and RAM can be had for less than 300 dollars, and down to around 150-200 dollars used.
And that matters in games like CS:GO, Valorant, Apex etc., where the CPU is very important if you want to be fairly competitive. Trying to squeeze more life out of a 10+ year old system is admirable, but only a fool would do it :D
@@Zesserie Not everyone has the US tech market available, and often just buying RAM at roughly the price of your current RAM (which you will obviously sell) is more or less a free upgrade. I generally assume anyone sitting on a 2600K doesn't have a well-priced tech market in their country. 2133 CL11 DDR3 will give a significant boost compared to 1600 CL9. A huge boost, even.
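For anyone wondering how DDR3-2133 CL11 can beat 1600 CL9 despite the higher CAS number, here's a quick back-of-envelope sketch using the nominal figures (just the arithmetic, not a benchmark, and it ignores real-world sub-timings):

```python
# Rough comparison of the two DDR3 kits discussed above.
# First-word latency (ns) = CAS latency * 2000 / transfer rate (MT/s).
# Peak bandwidth (GB/s) ~= transfer rate * 8 bytes * channels / 1000.
def first_word_latency_ns(mt_per_s, cas_latency):
    return cas_latency * 2000.0 / mt_per_s

def peak_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000.0

for name, speed, cl in [("DDR3-1600 CL9", 1600, 9), ("DDR3-2133 CL11", 2133, 11)]:
    print(f"{name}: {first_word_latency_ns(speed, cl):.2f} ns first-word latency, "
          f"{peak_bandwidth_gbs(speed):.1f} GB/s dual-channel peak")
# DDR3-1600 CL9:  11.25 ns, 25.6 GB/s
# DDR3-2133 CL11: 10.31 ns, 34.1 GB/s
```

So the 2133 CL11 kit is slightly ahead even on raw latency while offering roughly a third more bandwidth, which is why it tends to feel like a clear upgrade on these memory-sensitive platforms.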
@@b0ne91 I don't have access to the US tech market, but I can still buy it from eBay or AliExpress most of the time.
@@Zesserie I went from 1600 MHz to 2400 MHz RAM on my OC'd 3570K just to see, and I couldn't notice a single difference whatsoever. Upgraded to a Ryzen 3600 and that was night-and-day faster.
In the Spider-Man test, you essentially proved that the video card was the bottleneck, not that the i7-2600K was the bottleneck. If you re-ran the test with a higher-end graphics card, say an RTX 4090, you would get a higher frame rate with RT on.
I must say, I'm quite surprised how it's holding up. I'm also impressed that you've got it running at 4.8GHz
It was 5 GHz in Spider-Man if you look again. I'm more than 100% sure that high overclock is what pushed it to 60 fps in Spider-Man.
@@AshishKotnala29 Even then, if you look at the footage without RT, it dips to ~63-64 fps. Sure, it's not bad, but consider that this 2600K platform is basically maxed out (super good RAM, close to max CPU OC) and that he tested lots of indie games rather than actually demanding AAA games, so his whole point is debatable. Like, what about Cyberpunk, Far Cry 6, COD Warzone, BFV multiplayer, Metro Exodus Enhanced Edition (or the regular version), RDR2? What about an average OC with average RAM, which most people who still run a 2600K will actually have, not this maxed-out setup?
GOW 2018 is an old game now, ported from the PS4; Destroy All Humans is known for having terrible optimization; Spider-Man is the most relevant title in this video, and it cannot run at ultra-high settings at 60+ fps. The other ones are either super meh games or just not demanding. The answer to this video is "yes***" with exceptions, depending on your criteria.
@@h1tzzYT I completely agree, mate. I just didn't want to point it out in an obvious way. He's basically running the 2600K in a best-case scenario, or at least very close to it. Also, I guess the video was just meant to prove to one person that it can run Spider-Man with RT, and to test games that HE purchased in 2022 on this CPU.
@@AshishKotnala29 Yeah, I guess so. Still, IMO it was a missed opportunity for a wider audience given the questionable testing conditions. Still, I always enjoy old CPU revisits :)
I'm still using it with an RTX 3060 and I can't complain.
4.8 GHz?
Sandy Bridge was something.
We're really living in an age where even literally, officially obsolete tech can run anything. It's a huge advantage that $100 of nearly complete hardware (add a video card from $50 up to $500) will run anything.
I overclocked my old one just for fun to 6 GHz and the motherfucker is still working like hell!
I finally upgraded from my 2600K to brand new 7800X3D machine. Still have the 2600K and since it doesn't support Windows 11 considering turning it into a Linux box.
Thank you for checking this out!
I had a 3770K for years. I changed to a 9900K because I had some crazy stuttering in most games, like Kingdom Come: Deliverance while running through the big towns. In Vermintide 2 there were also a lot of frame drops in the big battles. I guess it really depends on the titles you play. In Tomb Raider and Metro Exodus the i7 was just fine. Nevertheless, I'm really glad I changed the CPU, as my games run way smoother and the averages nearly doubled in most of them. I guess if you really have to, you can still get some life out of it.
That's interesting. I still have a 3770 at 4.1 GHz with the non-K OC, 2133 DDR3 and a GTX 980; I played KCD a couple of years ago and it was all good, played through RDR2 locked at 30 fps last year fine, and I'm playing through AC Origins at the moment fine.
I definitely see where you're coming from. I built it so many years ago it seems ridiculous to still be gaming on it now; even though I can afford to build a new system, I'm generally tight as hell lol and hate parting with things that are still in working order.
The reason you didn't see that is the slower GPU. I had a 1080 that was simply too powerful in most games. To be fair though, KCD only felt laggy in the towns. My RAM could have been a bit faulty, as I've seen some strange behavior at times.
I think it's great you are still using it 👍🏻
@@fabiusmaximuscunctator7390 Could be right, it's quite well balanced as is, slightly more powerful than an Xbone or PS4, so anything they can play it can play at equivalent or slightly higher settings.
It would bottleneck anything more powerful than an RTX 2060 from what I've seen, as I was looking at videos on here to see if it was worth it. I think I would look to build a new system like you have. It's great to hear other people's experiences as it helps make those decisions!
@@Timmy51m How did you overclock the non-K? Just through the base clock?
@@tayloroxelgren264 It's a feature on some Z-series mobos. I have an ASRock Z77E-ITX; it has a simple enable/disable toggle in the BIOS called "non K OC". If you enable it, it boosts turbo frequencies by 400 MHz, so 4.3 GHz max and 4.1 GHz on all cores.
That 4.8 GHz must have really boosted performance over stock.
I played Elden Ring with a stock i7-3770K and it was quite a poor experience.
At 4.8 GHz performance is hugely increased, 15+ more fps; the 2600K is a beast.
The reason such an old CPU as the 2600K is still viable today is that between 2009 and 2017 there was no meaningful evolution in CPUs. I have personally tested an i7-920 against an i7-3770K, both OC'd to the same frequency, and both had practically the same performance in games. The main reason for this was that Intel had no rival during that period, and the second reason was that the same games had to be able to run on consoles. I personally made only two CPU upgrades during that period (i7-920 and i7-3770K), and the 3770K upgrade was not that necessary from a performance perspective.
Thanks for these tests. I'm still looking to get a 2600K myself. I've had an i5-2500K since 2011 and that CPU is still kicking, running Win 10 x64.
I've owned two 2600K CPUs and I still have a 2700K lying around. The original one overclocked to 4.8 GHz with ease, and it could pass all stress tests at 1.345 V with high LLC. The second one fought tooth and nail to hit 4.6 GHz, and it needed 1.375 V to remain stress-test stable. The 2700K overclocked to 4.8 GHz pretty easily, but I haven't had much of a chance to test it because my P67 WS Revolution board died last year.
At one point I got my original 2600K to 5.0 GHz and it could run games at around 1.45 V, but to get it stress-test stable it needed 1.52 V. It also had a weird quirk where it wouldn't run at 4.9 GHz unless I set the multiplier to 48x and then raised the base clock. Beyond 4.8 GHz it needed water cooling... which motivated me to build my first loop. The CPU coolers on the market weren't too advanced back in 2011.
Back in 2011 I built my dad a rig around the 2500K. After I upgraded his CPU to the 3770K, I tried my hand at overclocking the 2500K, but it needed too much vcore to hit 4.6 GHz.
I have a cousin who has been using his 2600K since 2011. He had some trouble running some of the later Battlefield games (1, V), so I helped him overclock it to 4.2 GHz (which needed very little extra voltage for rock-solid stability). He then got another 8 GB of 1333 MHz RAM for free (16 GB total), and with that he feels quite happy with the performance.
With his configuration he probably wouldn't be able to play Elden Ring at a consistent 60+ FPS or Watch Dogs Legion either, but since he doesn't play those games he doesn't have a problem.
So cool to see these retro videos still. I want to get hold of the same CPU as this to play with. I have an i7-3770 hanging around but want a second-gen for fun.
I can confirm it works with modern games; I've used it for 3-4 years and it's going solid with a decent cooler.
Still hanging on to the Devil's Canyon 4790K locked at 4.4 GHz. Still handles everything I play.
OC it higher, shouldn't be a problem. My 4770K does 4.8 GHz water-cooled with a 280 rad.
I literally just replaced an i7-3930K (Sandy Bridge-E 6-core variant) @ 4.2 GHz with a 2070 Super, and it's been an absolute MVP.
Still coped fine with most games but the heavily multithreaded ones were starting to hurt.
Got a 7900X now
A critical factor for old CPUs driving modern GPUs is where the point of diminishing returns lies.
You're testing with a 3060 Ti, and the contemporary top card for Sandy Bridge was the GTX 580. The performance difference between those GPUs, according to TechPowerUp, is 474%.
Now, while you're asking if a 2600K can run 60 FPS in modern games, you're running a lot of DX11 titles which in no way can be considered modern; new games on old engines are essentially newly released old games.
In the two DX12 titles that we can consider modern, the CPU usage was very high: in Elden Ring you had up to 89%, and Spider-Man was up to 95% on a thread and consistently above 80% on all threads.
Also, one DX11 title, God of War, was up to 98% and consistently above 90% on all threads.
That brings up the question of system bottlenecks: how do all these games run on a higher-performance platform and CPU driving a 3060 Ti?
Perusing elsewhere for just such a matchup, someone kindly uploaded a nice comparison of a 5.0 GHz OC'd 2600K vs. a stock 2700, a 3700X, a 9900K at 4.7 GHz and a 10700 at 4.6 GHz, with a much weaker 1080 GPU at low settings. Needless to say, the overclocked 2600K didn't fare very well and basically lost to the stock 2700 by 20-30%.
In your on-screen stats we see that the GPU usage for most of the games is low: Elden Ring ~75%, Spider-Man 40% without RT to 50% with RT, and GOW ~80%.
So it's obvious that the overclocked 2600K cannot drive a modern GPU in modern games effectively. What is really needed for perspective is the same tests against a modern CPU that can properly drive the GPU at the test settings, to gauge just how much the OC'd 2600K is bottlenecking the experience.
A useful bit of info for anyone stuck with a 2600K is what GPU at what settings makes sense, and in this case I'd expect that going beyond an RX 580 / GTX 1060/1070 at 1080p is just not going to be worthwhile.
So perhaps in future you can set a baseline so that the tests give end users a meaningful reference for balancing the system's capabilities.
@@Phaethon569
5900X @ 5.05 GHz, X570, 32 GB B-die 3600 at 16-16-16, RX 6900 XT undervolted and power-limited to 220 W yet still better than stock spec, @ 1440p.
I'm already there and then some.
In the link you listed I find it interesting that an FX-6350 is outperforming a 2500K with a 2080 Ti, but these CPUs and platforms cannot shift the bottleneck to the GPU, being 20+% behind the top tier, which starts with the Ryzen 1600X being as good as a 7600X for these purposes.
If you can't bottleneck at the GPU, then upgrading the GPU is pointless. Also, switching to lower-specced GPUs should shift the bottleneck, but it doesn't, so either there's an inherent platform penalty of around 20% on Sandy/Ivy Bridge systems or these numbers are mostly algorithmic speculation and not real-world benchmarks.
The problem I have with a lot of GOG's testing is that there is no control reference to work with, and not even a contemporary Piledriver system for a competitive slant.
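On the GPU-utilization argument above, here's a rough way to turn those on-screen numbers into a bottleneck estimate. It's a crude heuristic, not a measurement, and the FPS figures below are just ballpark values pulled from this thread, not benchmark data:

```python
# If the GPU is not pegged near 100%, the observed FPS roughly equals the
# CPU-side ceiling, and observed_fps / gpu_util approximates what the GPU
# could deliver if it were fully fed.
def bottleneck_estimate(observed_fps, gpu_util):
    cpu_limited_fps = observed_fps               # roughly the CPU-side ceiling
    gpu_potential_fps = observed_fps / gpu_util  # headroom left in the GPU
    return cpu_limited_fps, gpu_potential_fps

# Ballpark figures from this thread, not measured data.
# (Elden Ring is also engine-capped at 60 fps, which muddies this further.)
for game, fps, util in [("Elden Ring", 60, 0.75), ("Spider-Man (no RT)", 63, 0.40)]:
    cpu_fps, gpu_fps = bottleneck_estimate(fps, util)
    print(f"{game}: ~{cpu_fps:.0f} fps CPU-limited, GPU has headroom for ~{gpu_fps:.0f} fps")
```

By this logic a GPU sitting at 40-75% utilization has a lot of unused headroom, which is the core of the "pair it with a weaker card" argument above.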
Today I will replace my i7-2600K, and it feels kind of wrong. Every day it did a nice job, and it still does. I normally never replace working things.
Doesn't PCI-E 2.0 bottleneck anything faster than an RTX 2060 though?
I was wrong. It doesn't.
Had my 2600K OC'd to 4.7 GHz for many, many years until I got a 5800X two years ago. Paired with a 570, then a 970, then a 1080; it was a great system.
The most legendary CPU of all time, damn impressive!
I'm still rocking the i7-2600K with an RX 580. I have two 5600X builds, but when all my grandkids are over, one of them plays on it just fine.
I agree with you and Paul, just because it's not brand new doesn't mean that it's no good.
We need more videos like this one... People seem desperate to get the very last thing out there, as if what they have has immediately become obsolete.
And if you have issues with the CPU hitting 100% usage, you can just cap the FPS to 60, or to 75 if you have a 75 Hz monitor, and it will work fine.
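To illustrate why a frame cap takes load off a maxed-out CPU, here is a minimal sleep-based limiter sketch. It's a toy stand-in (real games use in-engine or driver-level limiters with much better pacing), and the dummy workload is just a placeholder:

```python
import time

# Minimal sleep-based frame limiter: the thread idles for whatever is left of
# the frame budget instead of churning out frames as fast as the CPU allows.
TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_capped(render_one_frame, frames=300):
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()                       # stand-in for simulating/rendering a frame
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                 # the CPU sleeps instead of working

# Dummy workload standing in for a real frame.
run_capped(lambda: sum(i * i for i in range(20_000)))
```

The sleep time is CPU cycles the game no longer burns on extra frames, which is exactly the headroom an old quad-core needs.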
Just to add, if I may, Mr. GOG: do not try to push the 2133 MHz RAM any further on that chip; the memory controller on it will not go beyond 2133. Also, you appear to have a golden sample; I mean, 5 GHz at almost full load, only around 80 W, and stable is really golden. I pushed my i7-2600K too hard and I think it degraded; now the best stable clock I can get is 4.2 GHz. Wish I weren't across the world, or I'd send it to you for testing.
What about some heavy hitters? Cyberpunk 2077, AC Valhalla or Far Cry 6? :P:P
Great work, great video. The 2600K did better than I expected overall, but it just goes to show that if you have enough threads and a high frequency, games are going to run at playable speeds for the most part.
The 2700K vs. the 7700K is only about a 25% difference in speed, which is why.
Serious Sam: Siberian Mayhem does not use Unreal Engine; it has its own in-house "Serious Engine". The series started out as a tech demo for the first version of the engine, in fact.
GPU-intensive games will still run well on old systems... but trying a game like CK3 or Warhammer III on even a 6700K is rough.
I had a 2700K for so long, almost a decade lol; the 5800X was a nice upgrade for the work I do though.
Thanks!
Hey Chris, it's surprising that my sister's i7-3770 is playing Bannerlord with mods quite well with just a Vega 64... in 2022.
Unfortunately the 2600k is not supported by Windows 11. This was the only thing that caused me to recently upgrade to a 5900x. Otherwise, I’d probably still be rocking my 2600k that I got in 2012. At the end of 2025, you’ll basically be forced to upgrade once Microsoft ends support of Windows 10.
You can get Windows 11 running on it; I have it running on an old dual-core laptop with 3 GB of RAM lol.
@@simontan5295 Oh wow. Maybe just add more ram and you'll be good lol
4770K at 4800 MHz, 2400 MHz DDR3 at CL10-10-12 1T. Runs flawlessly.
I'm curious how my old 2600X would fare against the 2600K. Battle of the 2600s!
I still run a 2600K in my gaming PC; it still works great. I have it at 4.9 GHz with a 980 Ti. I haven't had any issues with any games, so I haven't upgraded yet.
I had this processor for ten years and it handled everything I threw at it, and I overclocked it to boot. I finally retired it when the octa-cores came out.
I don't understand why people ditched their quads for octa-core CPUs though. Nothing out there actually uses 16 threads... certainly no games. Unless you're running a server, you get the exact same performance whether you have 4 or 8 cores.
@@Blackoutkingbeats If you're into video editing, octa-cores are great.
I mean, if you're into video editing, shouldn't you have your GPU doing the work?
@@Blackoutkingbeats It helps, but the CPU is the most important part in video editing.
I gave my gf my old i7-3770K (delidded and relidded with liquid metal) with 32 GB of 1600 CL9, and she uses it for streaming and gaming, and it works. But streaming DOES start to slow it down. I think it eventually gets limited by the lack of cores when doing 20 different things at once. It still works great with a 6700 XT GPU.
Wow, I didn't expect this old CPU to still run this well in games. Matter of fact, I still have a mobo + a 2600K + DDR3-1600 sitting in a box; what do you think I could sell it for?
Also, yes, when I used my 2600K system it could easily do 4.8 to 5.0 GHz on a single-fan AIO from Corsair, so your overclocks are definitely possible without really crazy cooling solutions. Great work on the video again, and I really would love to give that old CPU a spin now just to see how well it could do, since 60 fps is basically all I need.
I still miss my old 2700K; it ran like a charm :D
I remember PC gaming for the first time (I was 13 at the time) when my oldest brother bought his first PC, with Crysis and Bad Company 2. I didn't have a clue what an i7 or a 2600K meant at the time, but man, I think I played his PC more than he did lol, he worked a lot 🤪 The 2600K was so legendary back then though; the difference between PC and console was drastic around that time. Three years later I ended up getting a PC of my own with a 4790K once I got a job. Never looked back at consoles since.
It's a wrap for CPUs and even GPUs now. We need another Crysis type game to push things forward.
@JacobTech except Cyberpunk doesn't look that impressive.
It's impressive how many objects there are on screen at the same time.
MSFS 2020 + VR checks all the boxes ;)
2133 CL9 is insane! Super OP RAM.
4.8 GHz all the way till Spider-Man, then 5 GHz... why?
Rocking an RTX 2070 with an i7-2700K.
But that doesn't fit the narrative of "you have to buy, buy, buy". High refresh is nice, but it's mostly a scam to sell CPUs, GPUs and monitors. This is why I still have my 4790K around for the kids' gaming machine, and it's never lacking. That thing is still a beast, and I only replaced it for new features like NVMe storage. I'm sure the kids will still get another 4-5 years out of it for Roblox, platformers, and Switch emulation. The 4790K was really the first i9 before i9s became a thing. Paired with a 1660 Super it easily emulates Switch games at 4K/60, and the kids don't touch the Switch anymore.
Sure, I would like a 240 Hz monitor, but I'm not giving up 4K for it. I've always preferred fidelity over FPS, so long as I can get at least 60 fps. Especially given the exponentially higher grunt needed for those super high refresh rates and the ever diminishing returns, it just turns into a massive money sink to chase a fad propagated by companies trying to sell you something. I'm perfectly happy being a Good Enough Gamer when prices for everything are insane. The cost of a 40-series card would be much better spent upgrading two systems for more family members to enjoy than just one powerful system I keep to myself.
Fantastic video. What about rendering videos? Productivity software? Thanks
I used an i7-2600K with a 1080 Ti for almost two years and then upgraded to an R5 3600. But the i7 with a 4K monitor and the Pascal monster was a great experience. 1440p at around 100 fps in most titles was good too.
Heat may be an issue for V-Cache on a 7700X3D. The 7000 series is way hotter than the 5000 series, and clocks had to be lowered on the 5800X3D to reduce heat. I don't see V-Cache coming to Ryzen 7000 until they address the die, the solder thickness, and cooling for the 7000 series.
The 7000 series runs hotter, but it's made to run that hot; there will be no problems with the 7700X3D. The core package will be thicker, so the heat spreader can be thinner.
@@budgetking2591 High temperature is always bad, man.
Lower the clock speeds on the 7950X to 4.8 GHz on all cores and it is only 0-5% slower, consumes 125 W (down from 230 W), and runs at 55°C with a decent cooler (down from 95°C).
Why AMD were such morons as to almost double the power consumption and heat for an extra 5% performance is beyond me.
@@rattlehead999 They did it so they didn't lose to Alder Lake in benchmarks.
@@budgetking2591 LOL
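A rough sanity check on that clock-versus-power trade-off: CMOS dynamic power scales roughly with frequency times voltage squared, so a small clock drop that also allows a big voltage drop cuts power disproportionately. The voltages in the sketch below are illustrative guesses, not measured 7950X values:

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f (CMOS switching power).
# The voltages below are illustrative guesses, not measured 7950X values.
def relative_power(f_new, v_new, f_ref, v_ref):
    return (f_new / f_ref) * (v_new / v_ref) ** 2

stock_ghz, stock_v = 5.2, 1.30   # hypothetical stock all-core operating point
eco_ghz, eco_v     = 4.8, 1.05   # hypothetical lowered-clock operating point

ratio = relative_power(eco_ghz, eco_v, stock_ghz, stock_v)
print(f"~{ratio:.0%} of stock dynamic power at {eco_ghz / stock_ghz:.0%} of the clock")
# Roughly 60% of the power for ~92% of the clock; the same direction as the
# 230 W -> 125 W drop quoted above, which this toy model only approximates.
```

That is the whole "eco mode" argument in one line: the last few hundred MHz cost far more power than they return in performance.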
Did I miss it or was the resolution not mentioned?
That would mean that at 1080p an overclocked FX-8350 would also be able to handle gaming with the same or similar results. However, you would need to tune the HyperTransport bus and the memory speeds and timings as well.
That is why Sandy Bridge will always be better. Sandy Bridge doesn't need tons of tuning like Bulldozer did.
No, AMD fanboy! The FX-8350 is shit compared to the 2600K.
Bulldozer, aka Faildozer, AMD's worst CPU so far. Completely trash.
You're definitely right about hardware far outpacing the needs of software now. Back in 2011 when the 2600K came out, you'd find nobody running a Pentium 4 (Prescott), but now you can find many content 2600K owners. The same goes for GPUs: you'll still find people today rocking R9 290s from 2013, but was anyone running a 2002 GPU in 2013?
Games just need something revolutionary to happen to them, and its not ray tracing lol.
Games went from being rough-looking and fun to being super realistic with terrible gameplay and storylines lol. Sounds like the priorities shifted to selling shiny new things, since they know how consumers function.
@@musek5048 I want a new Conker's Bad Fur Day!
@@DannyzReviews And not just a remaster, but a proper continuation of the story using the latest tech to make the visuals look like a Pixar movie.
I realized this years ago. I used to upgrade my dad's computer hardware every few years so he could do more on his PC, make things load faster, make videos run more smoothly, etc. When I put an i7-6700, a GTX 1050 and an SSD in it, that was it. I realized his PC could do everything he required of it perfectly and that he would no longer need upgrades ever again.
This is the best CPU Intel ever made; it almost pushed AMD to bankruptcy and let Intel go on to monopolize the entire market.
If you look at many recently released titles, many only list a 6-core as a recommendation, not as a minimum. Besides, many game engines still only require 4 cores / 8 threads, and as mentioned, indie titles require only 4c/8t at recommended settings. We'll maybe need a 6+ core CPU as a minimum spec in the next 5+ years, as Unreal Engine 5 etc. slowly make their way into new titles.
Whoever still using this CPU definitely got their money's worth.
So, you are saying I don't need to replace my 5950x yet? ;D
You should obviously replace it, man; the 7950X gives you 1 extra frame, why wouldn't you take advantage of it?
@@nuffsaid7759 Gotta have them frames!!! So.. a quad 7950x system? ;)
I use a pair of Xeon E5-2690s in an HP Z620, which uses the same architecture as Sandy Bridge but has 8 hyperthreaded cores per CPU. In modern multithreaded titles that's probably helping more than the higher clock speeds of the 4-core 2600K. I'm getting about 20%+ more performance in a lot of these titles, with a worse GPU.
3570k and I play my games fine. But it's time to upgrade a bit
Man, I've got an 8700, but after seeing this video maybe I don't need to upgrade to a 12600 this year. XD Maybe I'll just get a bigger computer case and a 3080 Ti.
What GPU is used for testing ?
I've got a 2500K at 5.1 GHz and a 2600K at 4.9 GHz; I'm about to swap to an i7-5820K because of the extra cores.
Using a 6700K (4.7 GHz) / 3080 Ti, it's just fine pushing most games at 4K to make the most of the GPU. With ray tracing / DLSS Quality, I can play Cyberpunk at 1620p (DLDSR) at 70 fps. Quite OK with me, and no need to upgrade.
2400 MHz DDR3 was affordable in 2015/16.
This has been the case for a while. If you're okay with 60 fps, just get a 4K monitor and a GPU that can handle the resolution and quality settings. I noticed that on my 60 Hz TV a lot of games are more than smooth enough. For esports though, where I demand 400+ fps on 240 Hz... yeah, even my 6700K at 4.8 GHz isn't enough.
I'm sporting an i7-2600 myself; it works well at 1440p for the games I play. The motherboard also auto-overclocks it, despite it being the non-K version. I think I cheaped out on the K version, which was only about $15 more back then.
Mention which graphics card?
Cool! Which card in 2022-2023 would be optimal and stable (if AMD) for the i7-2600K? Ty.
Every card you can afford. The CPU Bottleneck depends highly on the game you play.
Everyone needs to game at 4k!
I had an overclocked i5-2500K and was experiencing a lot of micro stutters while still hitting 60+ fps. After switching to a Ryzen 2700 and keeping the same GPU/SSD, that stutter was gone. You won't always notice it, but it was there and annoying with the 2500K.
Yep, I've been there too, but I switched to a Ryzen 1600. The micro stutter is still there, but less frequent and less severe.
In my case, it seems that micro stutter happens when the game demands more VRAM than my GPU has and then searches for available memory elsewhere.
Its price in my country is nearly the same as an R5 1600AF/2600 ;) So if someone can't get it dirt cheap in some old workstation, then it's not worth it at all, apart from having fun with it.
Fantastic video! Love seeing analysis focused on keeping older hardware. Can I say, it's also wonderful to see you showcase and be positive about some modern titles. There are a lot of great titles out there and, to your point, very few need a killer system to be enjoyed.
If you like Westerns, you may like Hard West 2. Actually preferred it to Weird West.
Hmm, I still have an old PC with an i7-2600K and an AMD HD 7970, but it hasn't been used for several years since I moved over to AMD; my current build is a Ryzen 5950X and an RTX 3080.
I have the i7-3770 but will change, because in real life it's not only the "fps" that matter but other things too.
Cool. I have a 4770K clocked to 4.5 GHz paired with a GTX 1070, and DDR3 clocked at 1600 MHz. I suppose I should get some 2400 MHz RAM.
Get the G.Skill Ripjaws 2400 MHz ones with 10-12-12 timings.
What's the resolution? Sorry if I missed it.
I need to do this with my 2009 Xeon X5690 (6 cores, 3.6 GHz boost) with a good GPU.
1080TI: Still good in 2027? YEP!
I still have this CPU and it still runs well, but in CPU-demanding games you'll see the difference. I will upgrade soon though.
1:46 12600k?
If Sandy Bridge can still play your games at over 60 FPS after 11 years, then how long is Skylake going to last with another 25% IPC? And then CPUs like Zen 3 are 24% faster than Skylake, and ADL is 40% over Skylake.
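Compounding the uplifts quoted in the comment above (taking those percentages at face value, per-clock only), the chain works out like this:

```python
# Chaining the generational uplifts quoted above (face value, per-clock only).
sandy_bridge = 1.00
skylake      = sandy_bridge * 1.25   # "+25% IPC over Sandy Bridge"
zen3         = skylake * 1.24        # "+24% over Skylake"
alder_lake   = skylake * 1.40        # "+40% over Skylake"

for name, x in [("Sandy Bridge", sandy_bridge), ("Skylake", skylake),
                ("Zen 3", zen3), ("Alder Lake", alder_lake)]:
    print(f"{name}: {x:.2f}x Sandy Bridge per clock")
# Skylake ~1.25x, Zen 3 ~1.55x, Alder Lake ~1.75x Sandy Bridge per clock.
```

So by those numbers, a Skylake chip starts with a 25% head start over the 2600K and should stay "good enough" correspondingly longer, all else being equal.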
Pretty sure 5800X3D will last me good until 2030 lolz...
Until consoles get better.
Game companies don't make games for PC; they make games that run fine on consoles. That's why we got that "fake" Cyberpunk 2077, so different from the demo, and even The Witcher 3, and people always talk about Ubisoft downgrades. Those downgrades happen because of console limitations.
5 GHz from a 3.4 GHz CPU? I have an i5-4690K and I was only able to OC it to 4.0 GHz...
A good Z77-based MB + water cooling makes this possible.
It's all about the 0.1% and 1% lows.
Kinda proves you don't always need the latest and greatest. Just gotta go in with the right expectations.
Sure, for SP games, but in online MP shooters I gained 2x the minimum FPS when switching from an R7 2700X to an R5 5600X, and I'm sure there's more to gain going from 6 cores to 8 cores.
Also, in SOTR my 3770K @ 4.2 GHz was bottlenecking hard; even the 2700X didn't have any gains in the SOTR jungle city, which is also a SOTR fault.
New Zen 4 is a waste of money IMHO; Zen 3, or Alder Lake from Intel, would be even better options.
Chris what was the graphics card? mb
I had a 7700K and a 1660 Ti. I could run any game with that system at a decent fps. If all you do is gaming, a 4-core is still enough in 2022.
(4 cores, 8 threads)
Simulation software, no chance, e.g. Fernbus, X-Plane 12, etc.
All the guys who didn't cheap out on hyperthreading are still laughing now...
I just pulled this CPU out of an old computer tower and now I'm going to throw it together with a 1660 Ti that I just had lying around. Siiick.
I need advice: should I go for an RTX 3060? I've had an i7-2600 non-K for 11 years now and don't wanna change my PC :D
Why didn't you show the frame rate versus the 12100?
That wasn't the point of the video.