What happened with the Phenom was the Map was still loading which is why the buildings were see-through and there was a lower FPS. I usually like to load the Italy map in Freeroam before opening the Main benchmark. I suspect that it would have had a 3-4 fps increase on the Phenom
Would have liked to see a higher end X6, like 1075T or above. The 1035T is the lowest model, so it's impressive how well it held up, but it would be interesting how much better the top end is, or if it's worth it at all. I'm currently trying to put together an 1100T system, but maybe the lower end would be just fine too. :)
Maybe I'll do another and pick up the 1100. This vid was really just a spur of the moment "what if". Really impressed how well it did all things considered.
I would love to see the best Phenom II, both stock and overclocked. Had an fx-8350 in the day and had no idea how good the phenoms were at that time. Nice video.
@@jims_junk I'll also back up that the Phenom 1100T would be amazing to see. I used one and was blown away by how a 45nm chip from 2010 could hold up so well
What would be interesting to see is comparing the Phenom to the FX 6100, since that was the FX release that actually competed with it at the time, and it was slower than the 6300 you benchmarked. That would likely demonstrate the disappointment that FX was for a while
Upgraded from a Phenom II X6 1045T to an FX-8350, with the same GPU (Radeon HD7770 GHz Edition). The games I was playing, Need for Speed The Run along with Crysis 2, were far smoother; Need for Speed especially ran significantly better! No glitches, no stuttering, gameplay was smooth and fast.
If AMD had given the Phenom SSE4.1 and SSE4.2, I can imagine how long it would have lived; really good CPU. FX is bad, especially the first generation, but at the same time they're decent processors. They have a big issue with L3 speed and memory bandwidth, and because of that the 4100 lost in games to 1st-gen i3s and the 4300 to 2nd-gen, but after the release of some big games like The Witcher 3 the FX started to pull ahead because of multicore performance. Multitasking was better too because of the number of threads. My friend had a 4350 and could play any game he wanted in 2018; after I gave him my 8320e he completed Cyberpunk 2077 at release at 40 fps. On the other hand, his PC had 8+2 GB of memory at 1333 MHz and stutters were everywhere. The i5 was better for games and the i7 was more stable for work, but the FX was cheaper, and in good hands with good RAM these processors can play anything; worse than Intel, but cheaper. Sorry for my bad English, but I wanted to give you some support for your work.
I now have an FX8350 to play with, and 2x 4GB sticks of DDR3-1600 CL8. I think the absolute lowest CL at 1600 was 7, so not the fastest ever. Bog-standard 1600 is CL11; I've had some CL10, but nothing lower.
@@rattlehead999 I've had 3 1090Ts that all hit 4.5 stable, and plenty that hit 4.3, but I was also using 970 and 990FX boards, 'cause them beefy MOSFETs and VRMs really help with just cranking the voltage. 1090Ts were an extremely common CPU in my flips from 2012-2015; I've probably had at least 15 of them over time.
Btw I'm also a console gamer, I've still got my PS3 and PS2. That Asus gaming PC I had, I was doing some experiments with it: I tried an AMD Phenom II X4 965 and an AMD Phenom II X4 970, and with either one, on air cooling, I easily overclocked to close to 4.1GHz, though I quickly turned it back down to normal. That same motherboard was AM3 with DDR2; I flashed a new BIOS update and put in an AMD Phenom II X6 1035T, so now it's a 6-core system, but with no insane overclock like the Phenom II X4 965 or 970. I'd say the 4-core may have better performance than the 6-core because of the much higher CPU frequency. Whether it's Ryzen, AMD FX (I have that gaming PC too), or Phenom II, at the end of the day what really matters to me for gaming is the graphics card; that's what makes the overall difference.
Hey Jims, btw I've always been a Phenom fan. Personally I think the Phenom IIs, all of them, were good. I even have an AMD FX 8320 gaming PC, and I've got Ryzen parts too. I think AMD should have gone straight from the Phenom II lineup to Ryzen instead of FX.

Also, I have an army of Phenom II CPUs, plus another gaming PC with an Asus motherboard; that same board supports like 10 different Phenom II CPUs. Matter of fact, I think if an AMD Phenom II X4 965 is overclocked to 4.2GHz (I used an MSI graphics card by Nvidia GeForce), it'd still be a monster against the AMD FX. Of course the FX got more instructions, but Phenom IIs generally had better performance than certain AMD FX processors.

My setup on that older gaming PC at some point even had the AMD Phenom II X4 945 at 3GHz, and I only overclocked the graphics card (the MSI Nvidia one; I believe its GPU RAM was only 748MB), just a little bit, just the graphics card, not the CPU, but even with the Phenom II X4 945 at 3GHz I still had great performance for gaming. I still love Phenom CPUs. I flashed the BIOS on that same board and put in an AMD Phenom II X6 1035T; it's still great, but my main PCs are the AMD FX 8320 for gaming and one Ryzen laptop. Soon I'll build an AMD Ryzen gaming PC. Great memories of the AMD Phenom II, brother, take care, I'll subscribe to your channel. Sorry, it's almost like I wrote you an essay lol
I've always liked comparing the Phenom/II architectures to the Bulldozer/Piledriver architectures. It really demonstrates the ills of getting rid of a CPU architect such as Jim Keller.
The Phenom II performs better because it has a better FPU pipeline than the FX, where the FPU is shared between the two cores of a module, meaning at floating-point calculations it effectively behaves as having half the cores. This holds true especially for the FX8xxx and FX9xxx, which act as quad-core CPUs when dealing with floating-point math. In practice, the FP performance of the FX6300 would be comparable to the oddball Phenom X3 and its three cores.
The argument over cores/threads on the FX, though: it is NOT 6 fully independent cores, as each pair of cores shares two half-FPUs and also shares the L1 I-cache and L2 cache
Honestly, the AMD Phenom II X6 has nothing to envy in the FX except the SSE4.1 and 4.2 instructions. That's the only reason I want to replace my AMD Phenom II X6 1055T, even if just with an FX 6300, though what I really want is an FX 8350
Bro just skipped the most interesting and complex part of the AM3+ platform and the AMD FX: skipping the CPU-NB, FSB and HT overclocking on the FX6300 was a missed opportunity to show off the real power of the CPU. And testing 6-core CPUs in games that barely use 1 or 2 cores is really genius
What's kind of crazy is... You're testing TWO generations after the last Phenom series, and this is all the FX series could muster up. I'm thinking clockspeed wise, the previous equivalent would have been something like the FX-6100 or 6120, and those performed even WORSE, maybe 10% or so worse..? I had an FX-8350 for many years, but looking back, these had to be among the biggest screwups in tech history.
Yup, but AMD really pulled themselves out with the Ryzen lineup. Biggest CPU screwup, though, goes to Intel for the Pentium 4 and ESPECIALLY the Pentium D
That FX only had 3 full cores while the Phenom II had 6 real cores. Also, the best K10 CPU IPC-wise was actually the Athlon II 651K. If they had just added 2 cores to that CPU, plus SSE4.1 and 4.2, they wouldn't have nearly gone bankrupt making these shitty FX CPUs.
The 6-core FX is really a 3-core with hardware SMT, a lot like what Intel is going to be doing with their Rentable Units. So you are really benching a 3-core vs a true 6-core. Edit: also, the FX chip cost a lot less when it released compared to the 6-core Phenom. And benchmarking DX12 games will show you just how much better the 3-core FX chip is than the 6-core Phenom.
1055T = £125 and 125W; the most common FX6300 = £80 and 95W, cheaper than most Phenom X4s. I have 3 Phenom X4s and a 1055T among many others I built and sold over the years. We might as well be comparing a 3470 vs a 2600.
3:25 I think you mean 10%?
4:48 Okay... you're still measuring performance in absolutes, not percentages. If you go from 20fps to 30fps that's a "10fps" increase, but if you go from 100fps to 110fps that's also a "10fps" increase, so do you think both scenarios are equal? Because the former matters a lot more than the latter.
Edit: actually I'm here because I just upgraded an M4A89TD Pro from a Phenom X4 to an X6 but can't use it because of no UEFI support. Haven't got round to making a non-UEFI install key as I've crippled my back, so I'm lying in bed. I'd choose an FX6300 @ 4.5GHz over a Phenom X6 at 3.5 (250x14, DRAM 1066>1333).
The first FX series, like the 6100, was terrible; it lost to Phenom very badly. The next FX series, like the 6300, was better but still not worth it. The only thing that gave the CPU and platform longevity was the newer instructions, which kept other applications and games usable, and it was very cheap.
You only start hitting epic garbage tier when you go RDRAM Pentium 4. Slow, expensive memory with higher latency than the DDR speed equivalent and a space heater. Couple that with a GTX580 and you have your own whole house heater.
I remember that when the FX series was released, I checked a few reviews and immediately bought a Phenom II X6 1055T.
LOL!
Kinda stupid, 'cause you could OC the FX and have more than an OC'd Phenom...
@@MarekKafarek-i6m And I would've needed an expensive motherboard and cooling to do that. No thanks.
@@RuruFIN Nothing is true :) AMD boards were cheap back then, most AM3 boards supported AM3+ with as little as a BIOS update, and best of all, the (very effective) Phenom heatsinks fitted the socket, so either you are ignorant or a liar :)
I upgraded from Phenom II X6 1045T to FX-8350, there was a huge difference in gameplay in the game I was playing at the time, NFS The Run. Same motherboard and RAM, with Phenom I encountered stuttering and glitches, with the FX everything was smooth.
I built a Phenom II 1090T system 14 years ago. The only upgrades it had over the years were a new graphics card (a GTX960, about 7 years ago) and quieter CPU cooler (Noctua NH-C14) - and some drives in the RAID-5 were replaced due to bad sectors, but that's not really an upgrade. It still runs most things (including games) just fine.
I work in post-production and visual effects, so I do have newer, faster systems, but (other than render times) I hardly notice the difference.
If only Phenom II could use newer instruction sets
The sad thing about instruction sets is they aren’t physical limitations just support limitations since people used to mod instruction sets for h264 decoding to lga 775
@@josephdias3968 Not true, they emulate instructions, which kills performance, while having native hardware instructions increases performance.
How many extra instruction sets would you like?
Modern AMD64 CPUs: Yes.
@@josephdias3968 They are in fact physical limitations; you have to have the hardware on the chip to support SSE4.2, for example. You can emulate support in microcode, but real implementations are hardware
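For anyone curious which of these instruction sets their chip actually implements in hardware, a rough Linux-only sketch (reading `/proc/cpuinfo`; the field name `flags` is x86-specific) looks like this. A Phenom II stops at SSE3/SSE4a, while FX adds SSE4.1/4.2 and AVX:

```python
# Rough Linux-only sketch: list which SIMD extensions the running CPU reports.
def cpu_flags(path="/proc/cpuinfo"):
    """Collect the feature-flag tokens the kernel reports for the CPU."""
    flags = set()
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):  # x86 feature-flag field
                flags.update(line.split(":", 1)[1].split())
    return flags

def report(wanted=("sse3", "sse4a", "sse4_1", "sse4_2", "avx")):
    """Map each requested extension name to whether this CPU has it."""
    have = cpu_flags()
    return {isa: (isa in have) for isa in wanted}

if __name__ == "__main__":
    for isa, ok in report().items():
        print(f"{isa:8s} {'yes' if ok else 'no'}")
```

Software that needs, say, SSE4.2 typically checks this at startup (via CPUID) and refuses to run or falls back; that fallback path is the "emulation" that kills performance.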
Answer to your last question: yes and yes! Phenom II was insanely good, especially the first gen. Everyone was blown away by the 955 Black Edition, for example. They kept pulling ahead with the 1xxx series. The later FX chips really were that bad: power hungry, running hot, for the sake of not much performance increase over the previous gen. Thanks for the content!
And thank you! Yeah, I'll have to keep an eye out for an 1100T or something and try this again. Trouble is, people want to sell these things for far more than many would expect. I'm not paying $150 for a very plentiful 13-year-old CPU, so gotta keep an eye out for one that's cheap. But yeah, definitely will revisit this; I was really impressed by the results as well.
The FX x300 chips consumed very little power, about two-thirds of their TDP. The 9590 was the crazy one that was needlessly overvolted; instead of running at 200W you could undervolt it to run at 140-150W.
Also, the FX CPUs weren't bad for their time. For $90 you got an FX 6300, or for $125 an FX 8320, and motherboards were $45-50, so for this little money you could play all AAA games at 60fps+ until late 2016, the exceptions being the same games that Intel couldn't run at 60fps either. You could also play non-AAA games and older AAA games at 120fps+, and can still play many non-AAA games at 60-120fps to this day.
@@jims_junk Would love to see the 955 BE vs an i7 920.
I remember being a huge proponent of AMD when Thuban came out, less so when FX came out, they were INCREDIBLY power hungry. And when FX8150 came out, they weren't cheap either.
And bloody slow. Many reasons one would have bought an FX later didn't even exist when Bulldozer came out. FX6100? Even worse
@@jims_junk Get a 1090T. I got one in a motherboard, RAM and cooler combo for 50 bucks on eBay.
I loved the Phenom II. I honestly believe that if the Phenom II had been put on the same nm process and given the newer instruction set, it would have been that much faster than FX! Thank you.
Totally agree
Yes it would, but it would still be limited to a 6-core CPU, and it would have cost a lot more than the FX CPUs. FX CPUs were very cheap: the FX 6300 was $90 and the FX 8320 was $125, while the Phenom II X6 started at $230 and reached close to $300. And the power consumption would have been higher.
Keep in mind the FX CPUs were delayed by a year because AMD had to reduce the transistor count per core by 40%, since Global Foundries overestimated their 32nm process node (thus straight up lied to AMD about what was achievable).
@@rattlehead999 Except that's total bullshit.
The Phenom II X6 came out far EARLIER and the 95W Thubans still smashed the FX6100's face in.
And the worst part? You conveniently ignored Bulldozer, which wasn't even cheap either.
Lastly, Bulldozers/Piledrivers didn't even have 8 real cores
@@DLTX1007 The first-gen FX, sure; but the second gen was faster than the Phenoms and much cheaper. Board prices also dropped: back in the beginning of 2013 I got an FX 6300 for $90 and a 970 motherboard for $50.
And FX CPUs have real physical cores, but they share resources, which in 2020 was deemed to mean they weren't fully independent and thus not real cores; technically, though, they are two physical threads, not like SMT or Hyper-Threading.
Now, the Phenoms are among my favorite chips, no doubt, but they were quite a bit higher-end than the FX CPUs.
Keep in mind the inflation from 2008-2011 was huge thanks to the global recession; of course, if you ask the government it was 10-12%, just like the recent inflation spikes...
I really appreciate the handbrake tests. As someone who does video transcoding and compression with ffmpeg daily, it's very nice to have a benchmark that's an easy frame of reference to what I have now.
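For anyone who wants to reproduce this kind of comparison themselves, a simple CPU-encode benchmark can be built around ffmpeg's `-benchmark` flag with the output discarded, so disk I/O doesn't skew the numbers. A small sketch (the file name is just a placeholder):

```python
import shlex

def transcode_benchmark_cmd(src, codec="libx264", preset="medium"):
    """Build an ffmpeg command that measures pure encode speed:
    -benchmark prints utime/rtime/maxrss when done, and `-f null -`
    discards the encoded output so only CPU encode time is measured."""
    return ["ffmpeg", "-hide_banner", "-benchmark", "-i", src,
            "-c:v", codec, "-preset", preset, "-f", "null", "-"]

if __name__ == "__main__":
    # e.g. subprocess.run(transcode_benchmark_cmd("sample.mp4"))
    print(shlex.join(transcode_benchmark_cmd("sample.mp4")))
```

Running the same command with the same source clip on both chips gives a directly comparable wall-clock and CPU-time figure, which is essentially what the Handbrake test in the video does.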
👍
Great video 👍
Thanks 👍
I got my FX6300 in 2012, I think. It didn't take long to find out my cousin's older Phenom 1100T 6-core was still better. I think he could overclock to 4.5GHz, my FX6300 would do 5.1GHz, and even then it was neck and neck. Later I switched to an FX8320 @ 5.1GHz to barely beat him out in a few benches. Phenom was definitely superior to FX.
Yes, you should have switched to Intel. I went from an FX6200 to an i5 2550K and the i5 was miles snappier and faster. It also overclocked to 5GHz easily on a cheap air cooler, never going above 72C even in summer when the room was over 30C.
@@LawrenceTimme I thought about it, but I'd spent a lot on an Asus Formula Z ROG 990FX mobo and felt obligated to stick with AMD. Later in 2014 I switched to a 4790K, my cousin went with a 4690K, and we both got GTX980 cards. Great setups; we ran them for years.
The difference is that the FX 6300 consumed less power, was much cheaper than the 6-core Phenoms, and had many more instructions. Keep in mind the FX CPUs were delayed for a year and castrated by 40% per core due to Global Foundries overestimating their 32nm process node.
Not if you overclocked the CPU-NB instead of the CPU clock. You could push the CPU-NB +600 on the 6300, higher than any other FX proc, and it would at least bench higher than any other overclocked FX proc.
@@timmyp6297 I don't remember all the details. I had the RAM OC'd to 2133 or 2400. It was winter at the time, so we closed the door and cracked a window to bring the room temp down; I think it was like 55 or 60°F. I had the original FX 120mm AIO; we removed the fan from the radiator and placed it in a large bowl of ice water next to the PC. We ran out of ice quickly, so we had to go outside and make a bunch of snowballs my cousin could bring in as needed. Doing this and keeping temps down, we hit 5.3GHz on the FX6300, setting the highest Passmark score at the time for an FX6300. We also got my FX8320 to 5.3GHz, and my other friend came over to help and we got his FX8350 up to 5.5GHz with this method. It was a fun night with too much beer confidence. And the Asus 990FX Formula Z board is a beast and held strong, I'm sure making most of it possible.
Man, I still remember my old Phenom II 1100T Black Edition. Used it daily until 2017 and primarily replaced it with a 2600K because I got a free LGA1155 board and the Phenom II was lacking modern instructions. It still works in the other computer I installed it in, though.
Hope my current 2700X can do the same for just as long. Looking good so far.
I had a 1075T (upgraded from a Phenom II 965) and OC'd it to 3.9GHz for most of its life. Gave it to a friend a few years ago along with my old GTX680, and it's still playing GTA5 at 60FPS and doing audio recording and editing with no problems. The only thing that let it down was the limited instruction set. At the time I built it, it was an absolute beast. Pretty sure I initially had Crossfire 4870 Radeons with it, which was total overkill at the time. Oh yeah, and it was watercooled, which was rare back then. I loved that PC.
I managed 5 years with a Phenom II 910e, but had to upgrade to an E3-1240 v5 (Skylake), because I wasn't going to touch the FX series.
I had the Xeon only about 2 1/2 years and went to Ryzen 2700X and never looked back. Lots of Ryzens here, from Zen+ to Zen4.
Yup, especially the AM4 Zen CPUs were excellent. The 7000 series, particularly the X variants, are just dumb.
I used to have a 1055T rig OC'd to 4GHz daily; it even hung with the FX8xxx chips at some tasks
Hope you still do the i5 against the Phenom. This one was crazy: they could have saved so much money and just improved what they already had. What they already had was already faster than the new stuff they put out.
Yep, that's the plan
TL;DR: AMD did do a K10 evolution actually, and it also wasn't great (Llano on 32nm), Bulldozer was a very painful but necessary modernization for AMD. There's a Chips and Cheese article about it, which I highly recommend people read.
It's forgotten now, but the thing that really suffocated Bulldozer is that the GlobalFoundries 32nm process it ran on had bad teething issues in its first years, and even implementing K10 on it would have (and did) cost AMD in a big way. They did actually implement K10 on the process with the Llano APUs, which tended to have lower clocks and worse overclocking than K10 on 45nm, and that was already after a large engineering effort. AMD had originally envisioned a wider, higher-IPC architecture than what ended up releasing, but it suffered a death by a thousand cuts in execution unit layout, cache layout, and reordering capacity, among other things, in order to reach the clocks they wanted on the 32nm process. While yields improved by the time Piledriver came out in 2012 and some fixes were made, AMD was more or less chained to the architectural decisions they had to make to get Bulldozer out the door in the first place. Chips and Cheese has an excellent two-part breakdown on how this all happened, which I highly recommend people read.
By 2011 K10 was fast becoming dated on an architectural level, being more or less a slight evolution of the K8, which in turn was a slight evolution of the original K7 Athlon. It was mostly still competitive against Nehalem, but a 32nm Phenom that regressed in clocks would have been left far behind just like Bulldozer was when Intel did their comprehensive redesign with Sandy Bridge. Just compare the stock 1035T to the 3570 in the charts in this video.
Bulldozer was not a particularly successful product. But it gave AMD's engineers the necessary experience in implementing modern architectural techniques in caching, branch prediction, out-of-order execution, and multithreading that would ultimately be put to use to much greater effect with Zen six years later.
Thanks for the vid. As always it was interesting to see how differently programs behave on each chip. The Phenom II X6 series was really ahead of its time; shame it doesn't have the newer instruction sets, not even full SSE4, let alone AVX, which is already mandated by several games. AMD had already invested too much into the FX arch so they had to launch it, but if they had only done a Phenom refresh with newer instruction sets on a smaller node, it would have been not a bad generation. Instead we got excavators, bulldozers, combines... 😁 Like from that Chernobyl HBO series: "not great, not terrible"
The FX really reminds me of the old days when Intel went with NetBurst on the Pentium 4. The Pentium IIIs were still just as fast and sometimes faster.
@@jims_junk That is a pretty good analogy. First-gen FX sucked; later gens improved greatly at some cost in power and heat, but still didn't compare to their newer competition. Sums up Willamette through Prescott.
I have a 1055T, and you gave me the idea of doing some OC, although I'm worried the board and the power supply won't hold up, especially since it's an old board with DDR2.
If you do OC it on an old board, cool the chipset and VRMs; they can get toasty on AM2+ boards.
I know there's probably going to be some people who will turn up and go "but the FX only has 3 cores!!11one!". They're not right, but they're also not fully wrong. The big mistake AMD made on the FX was making two cores share an FPU. So it's got 6 integer cores, but only 3 floating-point units. That'll make anything that depends on a lot of floating-point crunching perform like hot garbo.
the difference between these is that phenoms were decent core2 competitors, and FXes were terrible core i5 competitors. or, more to the point, budget alternatives to them.
the only amd i had was k6-2 450
i only remember that everybody was saying that in winter, they did not need to turn on the heat at home.
The k6-2 450 consumes 11W, you can't heat a room with it...
Also, Zen 2, 3 and 4 have been superior to Intel's offerings.
What FX had going for it was that you could effortlessly get really good overclocking results. My FX 6300 does 4.6GHz @ 1.4V.
What happened with the Phenom was the Map was still loading which is why the buildings were see-through and there was a lower FPS. I usually like to load the Italy map in Freeroam before opening the Main benchmark. I suspect that it would have had a 3-4 fps increase on the Phenom
I'm with ya. Problem was, I tried loading multiple times. That benchmark scenario just starts too quick.
Would have liked to see a higher end X6, like 1075T or above. The 1035T is the lowest model, so it's impressive how well it held up, but it would be interesting how much better the top end is, or if it's worth it at all.
I'm currently trying to put together an 1100T system, but maybe the lower end would be just fine too. :)
Maybe I'll do another and pick up the 1100. This vid was really just a spur of the moment "what if". Really impressed how well it did all things considered.
I would love to see the best Phenom II, both stock and overclocked. Had an fx-8350 in the day and had no idea how good the phenoms were at that time. Nice video.
@@bmanrockwell2174 thank you. I'll see what I can do. Keep an eye out !
@@jims_junk I'll also back up that the Phenom 1100T would be amazing to see. I used one and was blown away how a 45nm chip from 2010 could hold up so well.
What would be interesting to see is comparing the Phenom to the FX 6100, since that was the FX actually released at the time and was slower than the 6300 you benchmarked. That would likely demonstrate the disappointment that FX represented for a time.
I used the FX 6300 for 3 years and for me it was fine :)
Upgraded from a Phenom II X6 1045T to an FX-8350 with the same GPU, a Radeon HD 7770 GHz Edition. The games I was playing, Need for Speed: The Run along with Crysis 2, were far smoother; Need for Speed especially ran significantly better! No glitches, no stuttering; gameplay was smooth and fast.
If AMD had given the Phenom SSE4.1 and SSE4.2, I can imagine how long it would have lived. Really good CPU.
FX is bad, especially the first generation, but at the same time they're good processors. It has a big issue with L3 speed and memory bandwidth, and because of that the 4100 lost in games to the 1st-gen i3 and the 4300 to the 2nd gen, but after the release of some big games like The Witcher 3 the FX started to pull ahead because of multicore performance. And multitasking was better because of the number of threads.
My friend had a 4350 and could play any game he wanted in 2018; after I gave him my 8320E he completed Cyberpunk 2077 at release with 40 fps. On the other hand, his PC had 8+2 GB of memory at 1333 MHz and stutters were everywhere.
The i5 was better for games and the i7 was more stable for work, but the FX was cheaper, and in good hands with good RAM these processors could play anything: worse than Intel, but cheaper.
Sorry for bad EnGleSh LangUage, but I wanted to give you some support for your work.
I now have a FX8350 to play with, and 2x 4GB sticks of DDR3 1600 CL8 - I think the absolute lowest CL at 1600 was 7, so not the fastest ever
Bog standard 1600 is CL11, I've had some CL10, but nothing lower
Them Phenoms can easily hit 4.0 GHz; a 1090T usually can hit 4.2-4.5 depending on the CPU's binning.
4.0-4.2Ghz is most common, 4.5Ghz is an exceptional binning
@@rattlehead999 I've had 3 1090Ts that all hit 4.5 stable and plenty that hit 4.3, but I also was using 970 and 990FX boards, 'cause them beefy MOSFETs and VRMs really help with just cranking that voltage. 1090Ts were an extremely common CPU in my flips from 2012-2015; I've probably had at least 15 of them over time.
@@josephdias3968 Damn, that's quite the feat, you must have had great cooling and a ton of voltage for that.
would be a better decade in cpu war if we have phenom iii or even phenom iv 😯😯😯
They need new phenoms for am5
Agreed!
Zen1 is Phenom III XD
@@rattlehead999 "It would have been a better decade in the CPU war if we had Phenom III back then." sorry english isnt my native tongue 😢🙏
btw im also console gamer i still got my ps3 and ps2
That Asus gaming PC I had, I was doing some experiments with it. I tried an AMD Phenom II X4 965 and a Phenom II X4 970; either one, on air cooling, I easily overclocked to close to 4.1 GHz, but I quickly turned it back down to normal. That same motherboard was AM3 with DDR2; I flashed a new BIOS update onto it and put in an AMD Phenom II X6 1035T, so now it's a 6-core system, but with no insane overclock like the Phenom II X4 965 or 970.
I'd say the 4-core may have better performance than the 6-core because of the much higher CPU frequency.
btw whether it's Ryzen, AMD FX (which I have that gaming PC too), or AMD Phenom II, at the end of the day what really matters to me for gaming is the graphics card; that's what makes the overall difference.
hey jims, btw I've always been a Phenom fan. Personally I think the Phenom IIs, all of them, were good. btw I even have an AMD FX 8320 gaming PC, and I've got Ryzen parts too. I think AMD should have gone straight from the Phenom II lineup to Ryzen instead of the FXs.
Also I have an army of Phenom II CPUs, and another gaming PC that has an Asus motherboard; for that same board I have over like 10 Phenom II CPUs.
Personally I think, as a matter of fact, that if an AMD Phenom II X4 965 is overclocked to 4.2 GHz (I used to pair it with an MSI GeForce graphics card by Nvidia), it would still be a monster against the AMD FX; of course the FXs have more instructions.
But Phenom IIs generally had better performance than certain AMD FX processors.
My setup on that older gaming PC I mentioned, on that same board, at some point even had the AMD Phenom II X4 945, but the board supported like 10 different Phenom II CPUs.
When I had the AMD Phenom II X4 945 it was at 3 GHz and I only overclocked the graphics card (an MSI Nvidia card; I believe its VRAM was only 748 MB), just a little bit, just the graphics card, not the CPU. But with the Phenom II X4 945 at 3 GHz I still had great performance for gaming. I still love Phenom CPUs. That same board, I flashed the BIOS and put in an AMD Phenom II X6 1035T; it's still great, but my main PCs are an AMD FX 8320 for gaming and one Ryzen laptop.
Soon I'll build an AMD Ryzen gaming PC.
Great memories of the AMD Phenom II, brother. Take care, I'll subscribe to your channel.
sorry for this but almost like i Wrote an Essay to you lol
nice video! i like this very OLD comparison
I remember watching some hardware reviewers on yt claiming that the phenom II was faster than the fx 8150... on ipc iirc
good stuff
Thank You!
/me watching this on a Phenom II X4 925 in 2023, wishing I had a 1035T 😀
It may have been a closer match with something like the 1055T. Make the clock speeds comparable and it's a more interesting and fair comparison.
I've always liked comparing the Phenom/Phenom II architectures to the Bulldozer/Piledriver architectures. It really demonstrates the ills of getting rid of a CPU architect such as Jim Keller.
The Phenom II performs better because it has a better FPU pipeline than the FX, where the FPU is shared between both cores of a module, meaning that at floating-point calculations it effectively behaves as having half the cores. This holds true especially for the FX-8xxx and FX-9xxx, which will act as quad-core CPUs when dealing with floating-point math. In practice, the FP performance of the FX 6300 would be comparable to the oddball Phenom X3 and its three cores.
The argument over cores/threads on the FX though: it is NOT 6 fully independent cores, as each pair of cores shares one FPU (two 128-bit halves) and also shares the L1 I-cache and L2 cache.
RandomX mining needs 2 MB of L3 cache per thread, so it won't use all threads if it doesn't have enough, hence the reason the FX is better.
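The budgeting rule in the comment above can be sketched as a quick calculation. This is a hypothetical helper (the function name is made up; the 2 MB-per-thread figure comes from the comment, and the cache sizes used in the example are the real specs: 6 MB L3 on the Phenom II X6, 8 MB on the FX 6300):

```python
# Hypothetical sketch: cap RandomX mining threads by the L3 cache budget,
# assuming ~2 MB of L3 per mining thread, as the comment states.

def usable_mining_threads(l3_cache_mb: float, logical_threads: int,
                          mb_per_thread: float = 2.0) -> int:
    """Threads the L3 budget allows, capped at the CPU's thread count."""
    cache_limited = int(l3_cache_mb // mb_per_thread)
    return max(1, min(cache_limited, logical_threads))

# Phenom II X6 1035T: 6 MB L3, 6 threads -> cache limits it to 3 threads.
print(usable_mining_threads(6, 6))  # -> 3
# FX 6300: 8 MB L3, 6 threads -> 4 threads fit the budget.
print(usable_mining_threads(8, 6))  # -> 4
```

Which would line up with the FX coming out ahead here despite its weaker cores: it simply gets one more cache-resident mining thread.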
Where do you find the BeamNG benchmark? I can't find it anywhere
Forget what it's called. When I have a chance I'll look and report back. Unless someone comments before then.
www.beamng.com/resources/beamng-fps-benchmark-bustling-streets.19304/
In case it doesn't work, it's called Bustling Streets.
I played NFS MW this month on an A4-5000, getting 30-60 fps.
Honestly, the AMD Phenom II X6 has nothing to envy the FX for except the SSE 4.1 and 4.2 instructions. That's why I want to replace my AMD Phenom II X6 1055T, even if it's with an FX 6300, though what I really want is an FX 8350.
Bro just skipped the most interesting and complex part of the AM3+ platform and the AMD FX: skipping the CPU-NB, FSB, and HT overclocking on the FX 6300 was a missed opportunity to show off the real power of the CPU. And testing 6-core CPUs in games that barely use 1 or 2 is really genius.
phenom probably pulls more watts than fx while not having instructions to play newer games.
no on the powerdraw.. FX uses more
@@ProcessedDigitally way more even
1:00 Not again. Comparing the Phenom (6 cores x 3 ALU/AGU x 3.1 GHz = 55.8) vs Bullcrap (6 x 2 ALU/AGU x 4.1 GHz = 49.2 points), let me guess: the Phenom will go head to head.
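For what it's worth, the napkin math in that comment checks out arithmetically. Cores x ALUs per core x clock is only a crude peak-throughput guess, not a benchmark, but here it is reproduced (the helper name is illustrative):

```python
# Reproducing the commenter's napkin math: cores x ALUs per core x clock (GHz)
# as a crude peak integer-throughput score. Illustrative only, not a benchmark.

def napkin_score(cores: int, alus_per_core: int, clock_ghz: float) -> float:
    return cores * alus_per_core * clock_ghz

phenom = napkin_score(6, 3, 3.1)  # Phenom II X6: 3 ALU/AGU pairs per core
fx = napkin_score(6, 2, 4.1)      # Bulldozer-family: 2 ALUs per integer core
print(round(phenom, 1), round(fx, 1))  # -> 55.8 49.2
```

So even granting the FX its higher clock, the wider K10 integer cores come out ahead on this back-of-envelope measure.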
The fact the Phenom is as close as it is really shows how much of a failure Bulldozer/Piledriver really were
Agreed!
Lol, just like the whole Northwood/Prescott fiasco.
Prescott was utter CRAP. Its only saving grace was its inclusion of SSE3.
What's kind of crazy is... You're testing TWO generations after the last Phenom series, and this is all the FX series could muster up. I'm thinking clockspeed wise, the previous equivalent would have been something like the FX-6100 or 6120, and those performed even WORSE, maybe 10% or so worse..? I had an FX-8350 for many years, but looking back, these had to be among the biggest screwups in tech history.
Yup, but AMD really pulled themselves out with the Ryzen lineup. Biggest CPU screwup though goes to Intel for the Pentium 4 and ESPECIALLY the Pentium D.
That FX had only 3 cores while the Phenom II had 6 real cores. Also, the best K10 CPU IPC-wise was actually the Athlon II 651K. If they had just added 2 cores to that CPU, plus SSE 4.1 and 4.2, they would not have almost gone bankrupt making these shitty FX CPUs.
6:26 vs 5:56 is half a minute, not a minute and a half.
the 6 core FX is really a 3 core with hardware SMT. A lot like what Intel is going to be doing with their Rentable Units.
So you are really benching a 3-core vs a true 6-core.
Edit: also, the FX chip cost a lot less when it released compared to the 6-core Phenom. Benchmarking DX12 games will also show you just how much better the 3-core FX chip is than the 6-core Phenom.
That's only true for the FPU, as the cpu does have 6 integer cores in the die.
1055T = £125 and 125 W most common
FX 6300 = £80 and 95 W, being cheaper than most Phenom X4s
I have 3 Phenom X4s and a 1055T among many others I built and sold over the years; we might as well be comparing a 3470 vs a 2600.
3:25 I think you mean 10%
4:48 Okay... you're still measuring performance in absolutes, not percentages, wtf? If you go from 20 fps to 30 fps that's a "10 fps" increase, but if you go from 100 fps to 110 fps that's also a "10 fps" increase. So do you think both scenarios are equal? Because the former matters a lot more than the latter.
Edit: actually I'm here because I just upgraded an M4A89TD PRO from a Phenom X4 to an X6 but can't use it because of no UEFI support. Haven't got round to making a non-UEFI install key as I've crippled my back, so I'm lying in bed.
I'd choose an FX 6300 @ 4.5 GHz over a Phenom X6 at 3.5 (250x14, DRAM 1066>1333).
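The absolutes-vs-percentages point a few lines up can be made concrete with a one-liner (the numbers are the same illustrative ones from that comment):

```python
# Same absolute FPS gain, very different relative improvement.

def percent_gain(before_fps: float, after_fps: float) -> float:
    return (after_fps - before_fps) * 100 / before_fps

print(percent_gain(20, 30))    # -> 50.0 (a huge uplift where it matters)
print(percent_gain(100, 110))  # -> 10.0 (barely noticeable)
```

Both are "+10 fps", but one is a 50% improvement in a barely-playable range and the other a 10% bump where you'd hardly notice.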
Amd’s fx series was their pentium 4
Agreed!
The first FX series, like the 6100, was terrible and lost to the Phenom very badly. The next FX series, like the 6300, was better but still not worthy. The only thing that gave longevity to the CPU and platform was the instruction support, which let you keep using newer applications and games, and it was very cheap.
lmao! Epic garbage tier vs garbage tier.
You only start hitting epic garbage tier when you go RDRAM Pentium 4. Slow, expensive memory with higher latency than the equivalent-speed DDR, and a space heater. Couple that with a GTX 580 and you have your own whole-house heater.