My HTPC has an FX-4300 and a GTX 1060 6GB, good for game streaming (to my main PC) and watching YouTube. I have it overclocked to 4.8GHz stable. Edit: Streaming in 4K 60 HDR
I had an FX 8320 still running up until some 2 years ago, when the motherboard died (and new ones were hard to come by here in Brazil). It lasted some 8 years. It was actually a decent processor, even though it had terrible temps.
I thought the GTX970 I'd bought was broken when it performed much, much worse than the internet suggested. Mixed feelings when I found out it was massively bottlenecked by the FX-6300. Upgraded to a Ryzen 3200 afterwards, card ran fine.
Knew a guy with the same deal. But after two years of us both owning an i7 7700, his comment was "When everyone bought Intel, I bought AMD. And when everyone bought AMD, I bought Intel". Both of us are using the i7 7700 to this day.
@@alexfrideres1198 haha right. Ran my 3150 a tad under 5GHz for a decade and was always so depressed seeing my buddies' Core i5s kicking my ass. I always loaded into games first, but the fps dips were terrible.
I still run a basic setup, with an FX-8350, RX 580, and 16GB RAM. Upgraded from a 6100 at launch, to a 6300, and now the 8350. From memory, one of the two biggest issues people had with FX was that AMD pushed multi-core use at a time when single-core programs were still the majority. Now that multi-core games are more common, FX has more longevity late in the game. The other issue was that the original FX chips performed worse than the old AM2 Phenom chips, which AMD not only stopped selling in favor of FX, but whose name it re-used for a line of FX Phenoms, which pissed off AMD fans.
I loved the Phenom chips. I had an X4 840T from an HP prebuilt that could be unlocked to a 6-core, equivalent to the X6 1055T. It overclocked really well. Those were the good days of hardware tinkering.
If you can swing it, a significant upgrade from an FX doesn't cost a lot of money now. About 6 or 7 years ago I went from an 8350 to an R5 2600 (now also long gone) and even that jump was mind-blowing. Well worth the money you'll pay for an early Ryzen chip now.
At the time, upgrading from the upper third of the Phenom II lineup to an FX was... too much money for so little gain. Also, the FX chips that really did have more performance were so power hungry it wasn't even funny.
Ah, I forgot: AMD burned the FX moniker, which up to that point was reserved for the fastest versions of its processors, to give the FX lineup some more shine.
@@pushatsinfrared Doing so would require a completely new mobo, as Ryzen is AM4. Plus, since no AM4 boards support DDR3, that means buying new DDR4 RAM. Honestly, even the basic setup I have can play most games fairly well. Most of my gaming is games from before 2020, so I'm still good for now.
That weird weather in GTA Online is just the Halloween event that's going on right now. I hope that clears it up a little. Amazing video as always though!
A recent video, "AMD FX in 2024 - Fine Wine?... or Vinegar?", puts things into perspective. I am running an FX-8300 and, following the advice in that video, it's not as bad as I remember.
Definitely aged better than it launched, with games, programs, and even Windows now knowing what to do with more than 4 cores and multi-core optimizations. It's crazy how different things were only 12 years ago.
@@markskonecki2050 What'd you use to cool that thing? I always remember hearing not to go over 62C and I used a Hyper 212 Evo and struggled to keep it at 62C under full load at 4.3ghz. I realize not all silicon is the same, but still... I'd imagine that thing was putting out a ton of heat. Was it water cooled or something? Or did it have a massive air cooler? I use a Fuma 2 Rev B on my 5900x, now, with 3x fans on the heatsink lol.
@JustAGuy85 It was a 240mm AIO, I don't remember the brand offhand... yeah, the FX8300 I had was a good one. I had an 8350 that would never make it past 4.6 before it would freeze or reboot... every CPU is definitely different. Temps were in the upper 50s under load and in the low 30s at idle.
Nearly the same here. I got an FX-6300 + RX 570. It was my Witcher 3 build and it ran very well in Full HD. Still using it as my main PC for internet, office, and playing contemporary games. ❤
I just learned this. It makes soooo much sense. An 8350 should be compared to a quad core in FP heavy titles... And in that respect, it is not too bad. I knew something like this was happening in the background. It's like the GTX970 3.5 + 0.5 scenario
Back in 2016 I somehow snagged a FX9590 CPU from eBay for less than $100. The seller couldn't make it work on their MB, and so switched to a FX8xxx. I did some research, and eventually found a Sabertooth 990fx on eBay for about $100. I got a case, some memory, and an old AMD Radeon GPU and slapped them together along with a used 1000w power supply. Everything I used was used with the exception of a DVD drive I purchased. Anyways, I still have that beastie today. It has 32gb memory now, and an AMD Vega56 BIOS'ed to 64. I don't play 'modern' games, but World of Warcraft and the WarGaming "world of" games still play beautifully. I'll admit that it's not my primary machine, but that's only because a few years ago I switched over to the Intel ARC graphics cards; and you have to have a Core series machine to unlock all the memory voodoo of the ARCs. Regardless, my FX9590 still runs great and has survived 8 years of constant use.
I ran an FX-8350 for like 6 years; they were never as bad as some portray them. Mine even kept up with Ivy Bridge and Haswell i5s for a fair bit less money, they all overclocked well, and frankly they weren't any hotter than a Nehalem i7. That system's still humming along with a Vega 64 at 4.8GHz and it's still 100% capable of playing modern games nicely. For the many years of service it gave me, there's no other CPU I remember more fondly than my FX.
I'm glad I looked at the comment section. When I switched from my 8300 to a 3600X when it released, I thought I was suffering from Stockholm syndrome and that I was the only person who actually liked it. Yeah, I like my 3600X more, but I like the 8300 more than my previous CPUs: C2Q 6600, C2D E2200, Pentium (retching sounds) 4 (more retching sounds), Pentium 3, Pentium 1, and i486. The 8300 made me switch from Intel to AMD, when AMD was at its worst. It isn't so bad at its worst, to be honest.
FX will always be in my heart. My first gaming PC (2 years ago) had an FX-6350; I found the motherboard, with the CPU and 2 of the 4 RAM slots broken, in the trash while going for a walk. Paired it with a GTX 660 1.5GB I got for free from a local marketplace. The case was a clapped-out Fujitsu from 1999. About 1 year ago I upgraded the PC with an old Sharkoon case, a used SSD, a PowerColor RX 580 8GB and an FX-8350. I read hundreds of bad reviews online, but I couldn't have been happier with it. (Paid 50€ total for the used parts.) About 6 months ago a PC popped up on that same marketplace for free. Luckily I got it, and it has an i7-6850K, a GTX 1080, and 4TB of SSDs, all in a Lian Li PC-Z70. It's definitely a huge performance jump, but it doesn't feel the same as my old PC did.
For lighter tasks, the FX series still delivers decent, playable performance in older and some modern games; thanks to DirectX 12 and Vulkan it's aging like _fine wine_.
@@AffectionateLocomotive When FX launched, games were single-thread heavy and rarely used even 2 cores well. DX12/Vulkan makes up for that big time, to the point that first-gen i7s, or even FX processors or Phenoms with high core counts, are somewhat usable while dual cores are essentially useless.
I can say the same about my Core 2 Quad system. Still doing fine for lighter tasks. But a really interesting question now would be to compare the 4-module FX chips, especially the 8320 and 8350, to the i5-2500K and 3570K. They were around at the same time, and the FX chips were claimed to age better once games started using those threads. But how do they hold up a decade later? Games do use more threads now, and many aren't even playable on a quad anymore.
I'm currently running an FX 8350, 16GB DDR3, an R9 390 8GB and some cheap Chinese 500GB SSD. It's my main gaming rig, runs World of Tanks at ultra settings, and I use it for work.
I would like to point out that the modules of FX processors share more than cache: they actually share a floating point unit, while each core has its own integer unit. So in tasks that are integer heavy, they behave much more like a true X-core processor, and in floating-point-heavy tasks they function much more like a processor with half the cores but with hyperthreading.
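A toy model of that idea (my own sketch to illustrate the comment above, not anything from AMD's documentation; the `fp_heavy` flag is a made-up simplification of "instruction mix"):

```python
# Rough sketch of the module layout described above: each Bulldozer-style
# module has two integer clusters but a single shared FPU, so the number
# of units that can work in parallel depends on the instruction mix.
def effective_parallel_units(modules: int, fp_heavy: bool) -> int:
    int_cores = modules * 2  # two integer clusters per module
    fpus = modules           # one shared FPU per module
    return fpus if fp_heavy else int_cores

# An FX-8350 has 4 modules:
assert effective_parallel_units(4, fp_heavy=False) == 8  # integer work: ~8 cores
assert effective_parallel_units(4, fp_heavy=True) == 4   # FP work: ~4 cores + SMT-ish
```

Which is why FP-heavy games treated an "8-core" FX more like a quad, while integer-heavy workloads actually scaled across all eight clusters.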
I ran an FX-6300 for 2 years until I could afford an 8350. From there I upgraded RAM to 16gb and ran that until ryzen 3rd gen. It's kept me loyal to AMD
If you have never owned an FX CPU, you will never understand the bond it makes with its user. My FX-8320 served me well, constantly overclocked to 4.5GHz, for 10 years. I upgraded to an AMD 5950X after I got fed up with the thermal requirements of the 8320, so the first thing I did with the 5950X was underclock it, figuring I didn't need that much performance and wanting to reduce thermals even further so I could run it fanless at around 50 watts. So an FX CPU changes you; it becomes a part of you and lives within :)
I used an FX 8350 until 2019 and my GF's daughter is still using it. It works and you can play games. Once games started utilizing more than one core, the FX CPUs just kept on going. I have not used Intel since the P4.
I built a gaming rig with the "six-core" FX CPU and a GTX660. It was... fine. Ran BF3 at max settings at 1080p. Still have a laptop with the "4 core" variant, and it's fine. Perfectly adequate at the time, most of the time. Goes to show how OP most CPUs are for gaming these days. I've got a 12c/24t Ryzen 9 7900X now, and on some games it's reporting like 4% usage. My RTX4080 will generally max out when my CPU is under 50%.
@@ChrisGrump The X3D chips are awesome. I was thinking 'hey four more cores at similar clocks is the way to go' I should have spent 10 seconds to look at the benchmarks, it's like having a V8 and losing a drag race to a V6 with a turbo. Still not a bad CPU by any means, but bang for buck goes to the X3D line all day long.
@@TheWarmotor Only in gaming. *Most* working applications (encryption, decryption, compression/decompression, rendering, etc.) don't really care about the extra cache, so you can save money AND gain performance by opting for the higher-clocked version without the 3D cache if you aren't gaming with your CPU.
To be fair the 7900x does have a lot more cores than most games will use nowadays. I'd say it's more towards workstation in performance rather than gaming cpu.
The FX series was made on an old and inefficient node, with some models pushed particularly hard to achieve high frequencies at the expense of extremely high power consumption and terrible energy efficiency. Also, the architecture didn't seem to be very good for most applications when the chips were first released. However, the architecture itself seems to have been relatively advanced; it performed well in some applications even early on, and ended up aging fairly well even for gaming, though it took a long time before a wide range of games were really taking good advantage of it. These CPUs also offered decent value for money from the beginning, especially if your electricity costs were low enough that the power consumption wasn't likely to cost you very much, such as in some parts of Canada or the US.

It was nice to see someone taking a stand on the core-count controversy. Even a lot of tech journalists will repeat the outright falsehood that these CPUs didn't have the number of cores they said they had, and use it as an example to argue that AMD isn't a trustworthy or honest company. Sure, by all means, it's good advice to tell people not to trust AMD, but this was not an example of dishonesty. The AMD engineers themselves considered their FX CPUs to have 8, 6, and 4 cores.

The lawsuit didn't prove anything. AMD only agreed to a settlement because they didn't think paying more money in legal fees was worth it. There was never any conspiracy to misrepresent the number of logical cores, and the whole legal argument that their products were misleading was completely dishonest nonsense. There was never a widely agreed upon definition of a CPU core which the cores in the FX series chips failed to adhere to. The lawsuit against AMD was just 100% pure dishonest legal sophistry and opportunism by lawyers.
Anybody who expected an 8-core CPU to be faster than Intel's best quad-core CPUs of the same period was an idiot, because Intel's best quad-cores were significantly MORE expensive, and it's never been a good idea to assume a CPU will be more powerful just because one of its specifications has a larger number than a competitor's. The FX CPUs offered decent value, there were no legal grounds for the lawsuit, and AMD should have fought it tooth and nail for the marketing value of vindicating themselves in court, imo.
Great videos, and since you have a working AM3+ board, I hope we see some more FX content. FX-8000 "8 cores" diffused after 2014 are insanely good overclockers, and do it while sipping energy at the same base clocks as the FX-9590 or whatever that beast was: like 1.28V Vcore under a 4.8GHz all-core load, compared to 1.45V for the old top FX chip, while also being much better RAM/FSB overclockers. They were fun to play with, and it was funny seeing a chip I pulled from an Acer OEM recycling-centre special smoke my old budget champion i5-2500K in games like AC Origins and TW Warhammer 2.
I made the huge mistake of buying an FX-4100 back in the day. The architecture was simply at odds with how Windows gaming worked at the time, kept tripping over itself. A Phenom II X6 would have been a better purchase. Once I got a new job, I dumped it all for a 4770k.
Part of the issue was that they doubled the integer units but kept a low number of FPUs, and games since Quake are quite FPU heavy. NetBurst did work a bit better in that regard. It was also a low-IPC, long-pipeline layout that, when fed well, could perform well, and it had a full complement of FPUs. So in terms of performance it could compete just fine, but also scale up insanely with the right workload. I remember stuff like video rendering or MP3 encoding being much faster on a Pentium 4 than on an Athlon XP.
@@saricubra2867 Exactly! The FX8350 cost less than an i5 35XX and performed much better in multithread... against an i7 3770K/4770K it lost, but the sweet spot of the FX chips was the cost/benefit ratio.
@@saricubra2867 Compared to an i5 without hyperthreading, yes. Compared to an i7? An FX8320 was £100, about the same price as an i3; a 2500K was £150 and an i7 2600K £280. The i5 is not 50% faster and the i7 is not 180% faster; I know, as I still have them all.
@@janiss2926 Only the ROG Crosshair, Sabertooth and UD7 were 8-phase, but it doesn't matter; the elbow of the efficiency curve was ~4.2GHz. The MSI 970 Gaming was a 3+1 POS and the Gigabyte UD3P was a (good) 4-phase; IIRC some higher-end Asus was 6-phase.
My first ever gaming PC build was an FX 4300 and a 1050 Ti. Those were simpler days; I put a couple hundred hours of OG Siege, Dying Light, and OG Rust on that thing, and it ran them quite well back then. Now Rust and Siege would probably get 10 fps on that machine lol
My student house never had working heating, I am not joking here. I overclocked my FX8320 to something silly and that machine kept the room warm; I just ran some stress tests if I needed more heat. Honestly, it wasn't that bad of a CPU.
I built a dirt cheap system in 2016 with an 8320e with a mild overclock, 16GB of RAM, an RX 480 and a fast SSD. It was extremely cheap to build because nobody wanted FX, and it performed much better than earlier FX builds because of DX12 and Vulkan, as well as many newer games taking advantage of more cores. It was a very good budget gaming and video editing system for 4 years.
Part of the problem with FX was that games of the time didn't know what to do with the modules. Later games, such as those from 2015-2017, were designed around using them (because of the Xbox One and PS4), so they tended to run a lot better. When the modules are properly optimized for, it helps a lot with performance. I own an FX 6300, and it honestly impressed me how good it still is (I really need to break it out again though).
I wouldn't really go that far, but instead take an FX-8350, or even better the 8320, which was a budget recommendation for quite a while. They overclocked to pretty much the same speeds the FX-9590 came at stock. See how it can keep up with the i5-2500K and 3570K, which were the direct competition at first. And it was said the FX would age better in the long run, once games actually used those available threads. Let's see how true that holds.
Part of the performance can be explained by the more modern tendency to prefer multicore CPU with several threads VS strong single core performance. Now everything from mobile phones and consoles to handheld gaming devices is based around (sometimes) lower speeds but multiple cores. Do you recall ARM/RISC performance back in the day? Nothing was optimized for that.
And then came big.LITTLE: pair a set of fast but hungry cores with a set of small but efficient ones. Pretty much what Intel is doing with their P and E cores: a few fast cores for narrow tasks that don't scale well over many cores, and lots of small cores for things with better parallelism. Later versions could even use both blocks simultaneously, in case a task needs lots of CPU. My current phone even has three sets of cores: four small ones, three medium ones, and a single fast one.
I have a friend whose girlfriend has a PC with an FX, and she runs Baldur's Gate 3 on that thing, despite pairing it with a weaker GPU (not an RTX), so it doesn't run all that well. Tbh it does seem like the FX was hated for being a power hog and not particularly great at anything, which is... well, kind of a shame. It's a surprise it held up this well.
In 2011 I was in game design school and starting to make gaming videos on YouTube, so I built a new PC for Unreal 3 rendering (even with the 8 cores of the FX CPU, rendering an Unreal level sometimes took 8 hours) and editing videos. I built an FX-based system with 2 AMD HD 6950s (later I swapped them out for a single R9 270X), an Asus MB and my very first SSD... it was a good system for about 6-8 years (I forget), then I moved on to an i7 6700K based system... I even made a video on my channel about this system (my dog was the star of the video).
I still have the 4300 I bought over 10 years ago, semi-recently bought a 6300 and plan to buy a 8300 in the near future. The 4300 is slated to run a 24/7 minecraft server for friends, the 6300 runs BOINC decently at 50% CPU usage limit, temps rarely over 45° with a wraith cooler. The 8300 is something I'd stare at on various web stores with beer pockets and champagne taste.
In the previous generation (Athlon/Phenom), cores were cores. In the FX, each 2-"core" module had independent integer clusters but a shared FPU (as well as shared cache and a shared pool of instruction decoders), so for integer workloads it didn't face the same limitations as Intel hyperthreading, where EVERYTHING is shared.
And now Intel is doing big and small cores, so not even a core is equal to a core on the same CPU.

And caches get even more complex. Intel's caches tend to be inclusive, meaning a cache at a certain level also holds a copy of the closer caches. An Intel quad with 256K L2 (per core) and 8 MB L3 (shared) would have 8 MB available in total, but only 7.25 MB for a core (minus the 0.25 MB copies of the other cores' L2). The same layout on AMD would give a core 8.25 MB at max (8 MB L3 + 0.25 MB L2).

And while everyone was talking about core-to-cache latency on Ryzen, the same also holds true for Intel. They used a ring bus, with a stop for the iGPU, a stop for the PCIe lanes, a stop for the MMU, etc., but also a stop for each core, including a 2 MB slice of L3 cache. Latency tests were showing that the first 2 MB were quick, 4 MB latency was slower (since it needed to take a hop on the bus and back to get to a different slice) and 8 MB was even slower (since it needed to do 3 hops to the furthest slice and then 3 hops back). In fact, my ancient Core 2 Quad has lower 4 MB latency than a Haswell i7.
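The inclusive-vs-exclusive arithmetic above can be sketched like this (a toy calculation matching the numbers in the comment, not a real cache model):

```python
# Toy calculation of per-core usable cache.
# Inclusive (Intel-style): the L3 also holds copies of every core's L2,
# so one core cannot use the slices duplicating the OTHER cores' L2.
# Exclusive (AMD-style): a core's own L2 adds on top of the shared L3.
def per_core_cache_mb(l2_per_core_mb, l3_mb, cores, inclusive):
    if inclusive:
        return l3_mb - (cores - 1) * l2_per_core_mb
    return l3_mb + l2_per_core_mb

# Quad core, 256K (0.25 MB) L2 per core, 8 MB shared L3:
assert per_core_cache_mb(0.25, 8, 4, inclusive=True) == 7.25
assert per_core_cache_mb(0.25, 8, 4, inclusive=False) == 8.25
```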
I fitted the 8-core FX 8300 to a friend's dad's PC recently, to upgrade from an FX 6100. It was a big upgrade for very little money and is absolutely viable as a daily PC. Snappy, and handles multiple Chrome tabs, YouTube videos, MS Office etc. all together nearly as well as my i9. No, I'm not kidding!!!
As a young teen I spent the whole summer working around the house at home to earn some money. After hours of hard labour (we were removing tons of stone from the cellar) I had 700€ for a full gaming setup. Joke's on me, I got the FX 4100 along with the fantastic HD 7850. In my young idiotic mind I found out you can overclock for more performance, as my 4100 struggled in games like Battlefield, so I gave it a try... on a stock cooler. Needless to say that didn't go well, but it was a good life experience and a perfect excuse to get a new and better CPU later on.
They were decent chips for the time. PC builders had good times indeed. I ran (and still own but don't run) an FX 9590. Yup, on a MSI 990fxa gaming board. 16 gigs Balistix 1866. It didn't suck.
Yeah, people remember that gaming was better on Intel chips with their better single-threaded performance, but they forget that running anything in the background could occupy a core or two, and the only Intel chips someone on a budget could get didn't have both four cores and overclocking at the same time. That was a big deal because early Turbo Boost didn't come anywhere close to maxing out the chip, and the all-core max clocks were kinda low. So for anyone who did more than game, and could tolerate the power consumption more than the initial sticker price, they just worked.
Still rocking an FX8300 black. Does me fine for 90% of tasks I do and games I play. Got access to beefier cpus than this but don’t have a high end GPU, so can’t take too much advantage
Brazilian IT guy here: I do IT services for a local manufacturing company, and roughly around 2012-2015 I built 5 computers: one FX-4100, three FX-6300 and one FX-8320e. Over the years I've upgraded the RAM to 8 GB and 16 GB on all machines, swapped the HDDs for SSDs, and if I'm not mistaken 3 of the machines are running discrete graphics. It's amazing how well these machines still hold up for daily use in an office-like environment. All of them are on the cheapest possible motherboard at the time and only the original AMD coolers (one runs an aftermarket cooler because of a lightning strike). However, I've never cheaped out on the power supply. At the time I bought them because they were much, much cheaper than Intel, but after all those years the extra cores really came through!
I bought an FX-4100 in 2012 it was a huge bump in performance for me coming from an old Intel Pentium dual core. It was really cheap and allowed me to use all the power of my HD 5770 and recording my video games. Then I changed for an i7 3770k + HD 7870 in 2013 and it was a lot better of course. Good times. Now running R7 3700X + RTX 3060
My first self-built gaming PC was an FX-8350; it got a surprisingly huge bump in performance when Mantle, Vulkan, and DX12 could actually use the cores without the FPU bottleneck getting in the way. It also still works great under Linux. I later upgraded to the 1800X as soon as it came out.
4100 king! I still have a system with the FX 6300 running, and it runs very smoothly on a Gigabyte 970 UD board. It's undying, permanently at 4.5GHz :) Sweet throwback video.
I had an FX-8320 that outperformed i7s back then. It took considerable effort, but I overclocked it everywhere possible. Occasionally the i7 won, but I had that system tuned exceptionally well. However, the time and effort I invested in making the 8320 beat the i7 weren't worth it. It was still fun, though. But no one will ever believe me.
Stock i7s? Because you had a pretty much guaranteed 25% OC over the all-core boost, and quite a few could do almost 40%. I still have an old i5 here with a 20% CPU OC and 33% RAM OC that still delivers adequate performance in many cases, even in 2024.
I got an FX6100 for free and beat the absolute hell out of it. 5GHz on all cores with a 25€ air cooler from a random Chinese brand, paired with an MSI 970A Krait. Great board, apart from the power delivery system... it was overwhelmed by even a stock FX6100, and I had to add massive amounts of airflow to reach the 5GHz without it blowing up. MSI later removed support for the FX9590 because the board literally blows up if you just try booting with it.
I got one in my PC too, and I also overclocked it to 5GHz. At 5GHz, paired with a modern GPU, you can play SBK 22 and MotoGP 22 on Ultra at 2160p 60+ FPS, Rage 2 on Ultra at 2160p 60+ FPS, Monster Jam Steel Titans, Terminator Resistance, and of course older games maxed out at 2160p. I know because I recorded them on my YouTube channel XFORCE667. Even MotoGP 23 runs maxed out at 1080p or higher using this OCed CPU. Enjoy!
The FX 6300 was my chip at the time also (upgraded to a 4790K several years later). Had the chip overclocked to high heavens, 1866MHz RAM too. It wasn't perfect, but it was a beast.
i had an fx-8320 that i overclocked to just under 5ghz. it had its drawbacks, but i always liked it. it was significantly cheaper than intel, taught me about overclocking, and kept my bedroom nice and toasty during the harsh montana winters. oh and getting paid like 6 years later in the class action suit was nice too.
I had my AMD FX-8320, then the FX-8350, for about 10 years. The 8350 lasted me up until earlier this year, 2024. Overclocked to around 4.8GHz, its 8 cores and high frequency smashed most gaming tasks I threw at it over the years. Loved those chips!
A few months ago I built 3 PCs with 3 different AMD FX CPUs, specifically for my kids. First PC, for my eldest daughter: FX 8150 paired with 8GB DDR3 and an RX 5500 XT 8GB. Second PC, for my middle daughter: AMD FX 6100 with 8GB DDR3 and an R7 250 2GB. Third PC, for my youngest son: FX 8320e with 8GB DDR3 and an RX 550 4GB. They mostly play Roblox but also Genshin Impact and Dragon Ball Xenoverse 2, and surprisingly enough, all the systems play GTA Online without issues: the FX 8150 at high settings with way over 60 FPS, and the weakest system with the FX 6100 on normal settings but with over 50 fps. The FX 8320e was even able to play Ghostbusters: Spirits Unleashed on normal settings at over 50 fps; I was not expecting it to be able to play it at all. So yeah, they're fine for what they play. Also, browsing YouTube or Netflix is pretty smooth. I wouldn't touch the 4-core FX CPUs though.
@@prostmahlzeit I don't see temps over 73°C on any of the FX chips. They are running at default settings without issues. They do however use quite a bit of power, from 96W to 125W TDP. Not a problem though, energy is very cheap where I am located.
Overclock the HT bus and do an FSB overclock instead of a multiplier one, as they performed far better that way, and you have a decent little quad (dual) core. But for less money you can buy a Xeon 2680 v3 (7 USD ish) and a cheap X99 mobo and RAM for less than 50 USD combined, from Ali and eBay respectively, and it will run just about any game out there (think between Ryzen 2000 and 3000 performance).
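To illustrate why the FSB route helps (a rough arithmetic sketch; the multipliers and ratios here are hypothetical round numbers, not a tuning guide):

```python
# On AM3+ the major clocks are derived from the reference ("FSB") clock,
# so raising it lifts the CPU, memory, and HT clocks together, while a
# multiplier-only OC raises just the CPU clock.
def derived_clocks(ref_mhz, cpu_mult, mem_ratio, ht_mult):
    return ref_mhz * cpu_mult, ref_mhz * mem_ratio, ref_mhz * ht_mult

# Multiplier-only OC to 4.5GHz: RAM and HT stay at their stock speeds.
cpu, mem, ht = derived_clocks(200, 22.5, 8.0, 13)
assert (cpu, mem, ht) == (4500, 1600, 2600)

# Same CPU clock reached via the reference clock: RAM and HT speed up too.
cpu, mem, ht = derived_clocks(250, 18, 8.0, 13)
assert (cpu, mem, ht) == (4500, 2000, 3250)
```

That uplift to the uncore clocks is what the comment means by the FSB route "performing far better" than a pure multiplier overclock.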
Yup, the same also holds true for the K10 and to a degree for the K8 chips, the whole frontend scales up performance quite well. I think it was 2400-2600 on the northbridge and maybe 2200 on the HT, paired with fast RAM in the 1866-2000-2133 range and it should fly.
I had one of the 9000-series chips; it ran hot as can be and was basically unusable with stock cooling. Ended up doing my first Intel build after that; I don't even remember which CPU it was, but it ran much nicer. Nowadays the 8000-series chips are best known for the absolutely horrifying OCs people can get on them, being the first to crack the 8GHz ceiling. My condolences go out to the guys who broke said record, only to no doubt be cooked alive by the miniature sun they'd created with all the heat it must have kicked out.
I'll never forget my FX 6350. Despite how god-awful it was at lots of things, it could still run games. I remember barely managing to get Minecraft VR running on it a while back with my 970.
I had an FX-8320 overclocked from 3.8 to 4.2 GHz from October 2013 to October 2019, my first 1080p-capable computer ever. I built an R9 3900X Ryzen PC that year, which I still use today.
Still rocking an FX8350 for the last 7 years now. The first and only PC I've ever built, and it's working perfectly for me. Paired up with a 1050 2GB GPU, 16GB of DDR3 and loads of RGB in a glass tower case. Got no reason to upgrade as it does all I need it to do 🙂
@@nesyboi9421 Older and optimized game would run fine on an FX chip. Modern AAA slop probably won't though. With any older machine you just have to play games that the hardware can run.
@@Compact-Disc_700mb True and not true, as shown here in this video. It depends entirely on the game, triple-A or not. For example, I know for a fact a game like Space Engineers is going to run like crap on it, or at least it used to; dunno if they have improved optimization much, but I know it used to mostly run on one core. I was more referring to having a 2GB 1050, if anything. The 1050 is a good card, it's just that the 2GB one is less than optimal; I feel like 4GB is the minimum nowadays for most games. Sure, if all you play is Garry's Mod, Spore, Minecraft, things like that, it will be fine, but some other titles are going to struggle. I had a 4GB laptop GTX 1050, and although it still plays a lot of games I like, it's started to show its age nowadays, which is why I relegated that laptop to just school work. Even with 4GB there are some games that just don't take well to such a low amount nowadays. But yeah, I guess if you only play older games or 2D-type indie titles you could get away with it.
@@Compact-Disc_700mb It seems my reply never went through on mobile. Yeah, sure, if you only want to play old games, but it isn't just triple-A slop that you can't play on older chips and graphics cards; you miss out on a lot of the good new titles too. I mean, if all you want to do is play Minecraft, Spore, or Garry's Mod, sure, that CPU/GPU combo will suffice, you're right, but you miss out on a lot of new stuff, indie and big studio. I can tell you one thing: that combo won't be able to play Helldivers 2, at least not well, nor would it run Space Engineers all too well either. And honestly, I think they would be a lot better off with a 4GB GTX 1050 or 1050 Ti, or at the very least a 3GB 1060. 2GB is so little nowadays; even my 4GB GTX 1050 was starting to show its age like 4 years ago, when a bunch of the sandbox games I played got harder for it to run (i7-7700HQ laptop, 16GB DDR4, GTX 1050 4GB). 3GB 1060s are a dime a dozen and would make a great upgrade, and so would a board with first-gen Ryzen or Intel 6th gen. I do understand though that maybe they don't want or need an upgrade, which is why I asked if they played many games. Maybe I should have asked if they only played old games or light titles.
@@nesyboi9421 I have an i7-6850k with a gtx 1080 for about 6 months now. There are no games I play now that I couldn't have with my old pc (fx8350, rx580 8gb) And yes, I play a lot of games.
My goood, this was my processor when I built my first PC, I was so proud of it until after some time the CPU heat warning would start blaring almost constantly while gaming lol.
The instruction set helps a lot, like AVX. That's why Sandy Bridge and FX CPUs are still usable for modern games. The main bottlenecks of this CPU were the shared FPU and the lack of modern features on AM3+ motherboards, like PCIe 3.0.
Had an FX-8120, and I don't think I have any good memories of it, to be quite honest. I don't even think it was average to begin with, as the CPU kept running right up at its Tmax (62.5°C). No matter the cooler, the CPU stayed super hot and was very unstable. It also didn't help that the ASUS motherboard I had was a modified pre-built board, though it did give me the best value in terms of longevity (until the motherboard died). I would never recommend FX to anyone unless you absolutely know what you are doing, and even then I'd still point to a used first-gen Ryzen over FX. You'd have a much better time.
Same situation with my A8-7650K. I think it's the paste inside the IHS drying out so badly that it ends up overheating our CPUs to death. I bought a 7600X. Super duper fast, i9-11900K levels of speed, and super duper cool (32 to 40°C max, overclocked), while my A8 goes from 38 to 60°C...
I ran FX from 2011 up until 2016, when I switched over to X99, and then finally went back to AMD with AM5. I owned the 6100, the 8150, the 6350, and the 8350. They were all good chips, honestly: extremely easy to overclock, with no problems running the multi-GPU SLI and CrossFire setups of the day. I also wish I hadn't sold all my FX gear; I wouldn't mind finding out what my 8350 could have achieved with my old 1080 Ti. The higher-tier FX chips also had 32 PCIe lanes, which meant you could run two GPUs in x16 mode, something their Intel counterparts couldn't do, at least not on the mainstream platform.
This hits the feels. My first build was an FX4350, a Radeon 4870, an awful OCZ 650W PSU, and an overly expensive Thermaltake case... I still have pics of that setup with the enormous Noctua cooler in a standard ATX case... what was I thinking in 2012 🤦♂️🤦♂️😂
I had an FX 6300 on that same motherboard before I made the jump to a Sandy Bridge i5, then to an i7 (which I currently have and am replacing in a couple of weeks with an 8th gen). IDK if it was my video card or what, but I remember being so disappointed with the performance of some then-current games in 2015, particularly GTAV. I sat on the threshold of going to an FX 9590... but when I saw the TDP of the 9590 (TWO HUNDRED AND TWENTY WATTS!) vs the i5, switching to team blue was a slam dunk.
I had the Phenom II X4 955 and later the 8350 (the latter with the wraith cooler). Bought both of them new, long after release, at a budget price. Both were really good, capable budget options. Still have them in their box on the shelf.
Time to set up a retro system for all those old games that don't run well on modern platforms. The Phenom could totally get something like a HD 5850, HD 6950, GTX 570, etc. And the FX would probably be happy with a GTX 670 or HD 7950
I still daily drive an FX-6300, no issues. Also, 97W under load? Mine never uses over 50W. You can't overclock the snot out of a CPU and then blame it for running hot.
I remember playing some heavy mid 2010s games with an FX 9590 system that I got as a farewell gift from my first-ever workplace. Kept it until 2018 when I switched to a Haswell Xeon system (E3-1270 v3). At the end of 2020 I traded the AMD platform for an RTX-2060-Super, which still runs in my old Haswell system to this day. Sometimes I wish I still had the FX, to test new stuff with it.
@@Dagoth666Ur Technically 6 cores, sure, but it's 3 modules sharing resources, bottlenecked as such, and sharing the same cache. It doesn't deliver actual 6-core performance, and there are logical reasons why; for all the "technically 6 cores" arguments, there's a reason they settled.
@Sprier " With not having actual 6 core performance" is highly arguable, they hade lower ipc than intel cpu' of the era witch amd planed to compensate with high clocks but they where unable to achieve them, that was biggest problem with FX cpu's. Then intel began dirty campaign of bribery of oem manufacturers and rest is history...
I started out on the FX-4300 in 2011. I upgraded the CPU a few years ago to the FX-8370, recently doubled the RAM to 16GB, and upgraded from a pair of Radeon 7700s in CrossFire to a GeForce 1650, plus a 2TB SSD. A few months ago I got a new Intel i5 13600. I still have the AMD PC and it runs very well. 13 years and counting.
People tend to be overly dramatic with how bad hardware actually is. In reality if it performs decently and sits at the right price it's not a bad product at all.
Yeah, I never understood the hate for FX chips. As long as it's stable and performs fine for the price, then good. Older and optimized games run fine on old hardware; the new trash is not going to run well regardless. So many people just hate anything that can't hit 300fps in the latest AAA slop. You just need to keep expectations in check with old hardware.
@@HappyBeezerStudios Assuming you are running Windows, you might want to check your background processes and installed software to see if something is using all your resources, because that does not sound right. You might want to do a clean OS install if it was installed years ago. It could also be GPU or RAM limitations.
I still have my old FX-8350 somewhere.... It was perfectly adequate for 1080p 60 FPS gaming and it did just fine for general computing, too; I didn't retire it until 2019. Huge improvement over the crappy 2-core Intels I could afford at the time, it gave me 80% of the performance of the then-king Intel I7-3770K for a third the price. I slapped a Cooler Master Hyper 212 Evo cooler on it and it was pretty cool and quiet unless I was rendering video or playing demanding games; my GPU was far noisier though so it didn't really matter.
It was a good chip for its time and still kicks ass today. In 2018 I did the full FX-8370, 16GB DDR3-2133 and RX 580 combo. Averaged between 45-60FPS in Unity VR stuff, 90-120FPS in anything Unreal and a steady 144FPS in most Unity desktop games, which was fair. After building a Ryzen 5 3600 system, I racked the FX and turned it into a recording server. It really deserves better than this but without streaming, it has no home. I promoted a single core AM2 box to be my full time web and storage server just to retire the FX to container duty but stopped doing containers.
I am using an FX-9590 in a Crosshair V Formula-Z mobo with a modified BIOS to use NVMe drives. This processor, while hot, runs Windows 11 fine. That is, until Microsoft shafted everyone in the Windows Insider group who tested 11 on FX processors. We had no errors or issues, but Microsoft's uncompromising greed won out over people who couldn't afford a Win11-compatible PC, and we were kicked out of the Win11 Insider program.
FX was never really compatible with W11 anyway, in fairness; the lack of TPM 2.0 required workarounds. And I hate to say it, but considering how cheap older Ryzen is getting, FX is really hitting the point of effective uselessness. Even the 9590 at full-whack overclock will get walked by a second-gen 6-core Ryzen you can get for *30 bucks*, and DDR4 is cheap now. Might be time to move on.
I used to have the FX-6300 "Black Edition" in my first PC. I paired it with 2x4GB 1600MHz RAM and a Radeon R9 280 3GB. After a couple of years, however, I started to notice some very serious stuttering in games. In 2017 I decided to upgrade to the Ryzen 5 1400, since I could get it for 70 euros, and paired it with an ex-miner RX 480 8GB that I got for 75. It was night and day: the stuttering suddenly stopped and my fps doubled. In fact, it is still my main config, since I mostly play older or indie games.
Games are much more multi-core aware nowadays than they were back then though right? That might account for the "less bad than remembered" performance.
The first setup I built back in 2012 had an FX-4170 paired with an AMD HD 7770 :D Later on I upgraded to an FX-8320, which smoothed out some of those Bulldozer kinks and gave the socket some more life, but ehh... When I saw my friend play GTA V at a locked 60fps with a 4690K, I knew I had to change (for a while) xD
They were fine - the first Semprons were basically Athlon XP Bartons for Socket 754. The later ones were even 64 Bit. Of course as soon as there were better options (e.g. Q6600) it didn't make any sense anymore - especially because they were single core parts...
@moezarella1261 I am a former Network Engineer & started building PCs in 1992. I was even part of the Cyrix 386-to-486 upgrade era. I gave up after getting the infamous "magic smoke" on the 3rd 386 I tried using the upgrade kit on. So I know how much the Sempron sucked. It was an IT support nightmare, people (aka management) who didn't understand how a PC worked went & bought these things, because of the price point.
The first PC I got as a kid had an FX 6300, purely because of the price difference at the time compared to Intel, as you mentioned. I still remember having to run it with the side panel off bc the stock cooler (literally as you described it here) was just not sufficient; funnily enough, I use that cooler now as a makeshift solution when I'm testing motherboards 😂. This video brought back many memories ngl. I went to a 6700K after, and then back to a Ryzen 3800X; to this day that's the only Intel CPU I ever used, and I am not planning on going back any time soon. AMD definitely started something good in the long run with their development of chips at that time, I think. I still have the system under my desk catching dust atm; might put it back to some use, especially after seeing this video showing how competent it still is in 2024. Amazing content, keep it up my man!
That lawsuit was such BS. It's 4 cores because it's got 6 execution units, divided into two integer units and one floating-point unit per module, and the cherry on top is that it's a superscalar processor anyway, with each of the integer units and both floating-point units fully capable of executing more than one instruction at a time. AMD settled because they'd have had to reveal precisely how they accomplished this, giving away trade secrets to prove how it worked, and they might not have won the case anyway. The legal "team" earned about 12 million for their trouble, and those who asked for a partial refund got less than 2 dollars.
I still have my AMD FX 8320 on a Gigabyte Ultra Durable MB with 16GB RAM, a SATA SSD, and a GTX 1650. An old beater that I play old games on. Still going strong. lol.
The plastic cooler hold-downs breaking put an end to 2 of mine. The stock cooler falling off and shorting out on everything as it fell inside the case took out the CPU, motherboard, and PSU. Oh, the memories.
Still running an FX8350 with 16gb RAM and an RX580. Plays Elden Ring at high across the board. Going from Win7 to Win10 gave a huge boost in performance.
My family PC is still running on a FX-6300, 8gb of ddr3 and a GTX 750 (1gb VRAM version). Considering it's only my mom using it nowadays, and that it's mostly web browsing and browser/emulated mobile games, it's still holding up just fine! For security reasons I'll be getting my mom a new one once Win 10 goes EOL, but I'm sure she would be perfectly happy with keeping the old one if she could.
I built an FX6300 8GB pc with an RX460 and I used it for years, it handled games, virtualization, college and Uni without a single issue, a great PC that I only recently gave away to a young lad who's parent couldn't afford to buy him a PC. Very under rated processors.
I had an AMD FX-8350 in my old PC and it handled itself well enough for what I used it for at the time. I still have the old PC, but got an 8th-gen i7 laptop after I started getting graphical issues on it. It was paired with a Sapphire R9 380X; loved that PC when I got it back in 2014-2015 (approx.).
Man, this video has me wanting to dust off my old rig I rebuilt back during the winter of 2020 (Covid - Round 1 - Fight), after I finally had some time to myself since my job was classified as Essential Worker. Like you, it's what I called my "first real" gaming rig that I built years before, as I got a heck of a bundle on it and needed something better than the ol' Core 2 Duo system I was on at the time. Honestly, if I hadn't used the stock cooler (of course I was on a budget), I probably would have used it longer into its life. But yeah, still using the same Gigabyte board, 16 gigs of G.Skill RAM, and the FX 8120 the bundle came with. The current graphics card in it is an EVGA GTX 1660 Super 6 gig I picked up in March of 2020, the one that took me through the dark times to follow on my first-gen Ryzen system. It's one of those cards I'll never get rid of, working or not, given EVGA pulled outta the graphics card market due to Nvidia shenanigans. Given the underdog status of the FX line, and all the gaming I did on the rig back in the day, it's got a special place in my heart :) Thus! Great video :)
Built my first pc with a fx-6350. I got it framed on my desk, was a decent little processor that did all the gaming I could have hoped for at the time. Lots of fond memories using that cpu. Super cool video!!
Are you still on AMD FX… (I’m so sorry) if you are let me know how you’re finding it 👍
Back then, AMD's top end kept increasing in core count and cost US$300-350. Now, at that price, you just get a glorified low-end 6-core.
Yep, I still use it; I'm on it now.
My HTPC has an FX-4300 and a GTX 1060 6GB, good for game streaming (to my main PC) and watching YouTube. I have it overclocked to a stable 4.8GHz.
Edit: Streaming in 4K 60 HDR
my fx6300 is still chugging along. I don't play modern games and I'm more of a tinkerer using NixOS for occasional programming, CAD and 3d printing
I had an FX 8320 still running up until some 2 years ago, when the motherboard died (and new ones were hard to come by here in Brazil). It lasted some 8 years. It was actually a decent processor, even though it had terrible temps.
I was surprised when I went from an FX-8350 to an I7 7700K and saw my framerate literally double.
I went from FX-8350 to a Ryzen 5 3500X, the difference was noticeable. The feeling might be because of a clean OS installation though.
I used more my laptop (7700HQ + GTX 1050 Mobile) than my tower (FX 8350 + GTX 1060) at the time for this reason
Same moment for me but from 8700k to 7800x3d
I thought the GTX970 I'd bought was broken when it performed much, much worse than the internet suggested. Mixed feelings when I found out it was massively bottlenecked by the FX-6300. Upgraded to a Ryzen 3200 afterwards, card ran fine.
I know a guy with the same deal. Two years after we both purchased i7 7700s, his comment was: "When everyone bought Intel, I bought AMD. And when everyone bought AMD, I bought Intel." Both of us are using the i7 7700 to this day.
AMD Bulldozer did have the frequency record for like 11 years.
Yes, probably the longest-lasting record for a long time to come.
Was a lot of fun to push them as far as you could
shame that frequency advantage was gimped by abysmal IPC and the bulldozer architecture's really strange FPU situation
@@alexfrideres1198 haha right. Ran my 3150 a tad under 5GHz for a decade and was always so depressed seeing my buddies' Core i5s kicking my ass. I always loaded into games first, but the fps dips were terrible.
Some stupid single-core Celeron got it, unfortunately.
I still run a basic setup, with an FX-8350, RX 580, and 16GB Ram. Upgraded from a 6100 at launch, to a 6300, and then now at the 8350.
From my memory, the two biggest issues people had with FX was that they pushed for multi-core uses at a time where single-core programs were still the majority. Now that multi-core games are more common FX is able to have more longevity late in the game.
Another issue was that the original FX chips performed worse than the old AM2 Phenom chips, which AMD not only stopped selling in favor of FX, but re-used the name for a line of FX Phenoms, which pissed off AMD fans.
i loved the Phenom chips. i had an X4 840T from an hp prebuilt that could be unlocked to be a 6 core, equivalent to the X6 1055T. it overclocked really well. those were the good days of hardware tinkering.
If you can swing it then not a lot of money can get you a significant upgrade from an FX now. About 6 or 7 years ago now I went to an R5 2600 (now also long gone) from an 8350 and even that jump was mind-blowing. Well worth it for the money you’ll pay for an early Ryzen chip now
At the time, upgrading from the upper third of the Phenom II lineup to an FX was... too much money for so little gain. Also, the FX chips that really did have more performance were so power hungry it wasn't even funny.
Ah, I forgot: AMD burned the FX moniker, which up to that point had been reserved for their fastest processors, to give the FX lineup some more shine.
@@pushatsinfrared Doing so would require a completely new MOBO, as Ryzen is AM4. Plus since no AM4 boards support DDR3, that means buying new DDR4 RAM.
Honestly, even the basic setup I have can play most games pretty fairly. Most of my gaming is on games from before 2020, so I'm still good for now.
That weird weather in GTA Online is just the Halloween event that's going on right now. I hope that clears it up a little. Amazing video as always though!
A recent video "AMD FX in 2024 - Fine Wine?... or Vinegar?" puts things into perspective. I am running a FX-8300 and it's not as bad as I remember following the advice in that video.
Definitely aged better than it released with games and programs, even windows knowing what to do with more than 4 cores and multi core optimizations. It's crazy how different it was only 12 years ago
@@Rouxenator I had an 8300 that I could overclock to 4.7GHz without any problems. Amazing CPU.
The big channels tend to gloss over the fact that you need to tune Bulldozer CPUs differently than you would an Intel machine lmao.
@@markskonecki2050 What'd you use to cool that thing? I always remember hearing not to go over 62C and I used a Hyper 212 Evo and struggled to keep it at 62C under full load at 4.3ghz.
I realize not all silicon is the same, but still... I'd imagine that thing was putting out a ton of heat. Was it water cooled or something? Or did it have a massive air cooler? I use a Fuma 2 Rev B on my 5900x, now, with 3x fans on the heatsink lol.
@JustAGuy85 It was a 240mm AIO; I don't remember the brand offhand. Yeah, the FX-8300 I had was a good one. I had an 8350 that would never make it past 4.6 before it would freeze or reboot; every CPU is definitely different. Temps were in the upper 50s under load and the low 30s at idle.
I still have my old FX8300+RX580 PC. Not a daily driver anymore, but still a nice pretty capable machine.
How about a Steam machine conversion?
Nearly the same here. I've got an FX-6300 + RX 570. It was my Witcher 3 build and it ran very well at Full HD. Still using it as my main PC for internet, office work, and playing contemporary games. ❤
Lol, my backup pc has an fx8350+rx570 8gb 😂
It was my first capable gaming pc and only cost me like 40€ about 2 years ago
@@Nightingale1887 It'll run a lot better than it does on Windows.
I still have my FX 6300 and Radeon HD 6970 which have served me well all the way to early 2019. It played all of my games just fine at 1080p.
Sorry to hear about the loss of your doggo😢
It's exactly the same feeling as losing a family member xxxxxxx
Made me a bit teary since I lost mine recently so I can understand.
@@CableWrestler It's a ruined wolf, what about it?
@@GrainGrown bad bait; try harder next time
@@ponponpatapon9670 I'm just speaking facts.
The actual floating-point unit (FPU) is shared between 2 cores.
I just learned this. It makes soooo much sense. An 8350 should be compared to a quad core in FP heavy titles... And in that respect, it is not too bad.
I knew something like this was happening in the background. It's like the GTX970 3.5 + 0.5 scenario
Back in 2016 I somehow snagged a FX9590 CPU from eBay for less than $100. The seller couldn't make it work on their MB, and so switched to a FX8xxx. I did some research, and eventually found a Sabertooth 990fx on eBay for about $100. I got a case, some memory, and an old AMD Radeon GPU and slapped them together along with a used 1000w power supply. Everything I used was used with the exception of a DVD drive I purchased.
Anyways, I still have that beastie today. It has 32gb memory now, and an AMD Vega56 BIOS'ed to 64. I don't play 'modern' games, but World of Warcraft and the WarGaming "world of" games still play beautifully. I'll admit that it's not my primary machine, but that's only because a few years ago I switched over to the Intel ARC graphics cards; and you have to have a Core series machine to unlock all the memory voodoo of the ARCs.
Regardless, my FX9590 still runs great and has survived 8 years of constant use.
I bet your home air temperature is over 40°C😂
@DraponDrako That old FX certainly keeps my room warm in the winter! I use it mostly to run scientific research with BOINC from UC Berkeley.
I ran an FX-8350 for like 6 years; they were never as bad as some portray them. Mine even kept up with Ivy Bridge and Haswell i5s for a fair bit less money, they all overclocked well, and frankly they weren't any hotter than a Nehalem i7. That system's still humming along with a Vega 64 at 4.8GHz, and it's still 100% capable of playing modern games nicely. For the many years of service it gave me, there's no other CPU I remember more fondly than my FX.
I'm glad I looked at the comment section.
When I switched from my 8300 to a 3600X when it released, I thought I was suffering from Stockholm syndrome and that I was the only person who actually liked it.
Yeah, I like my 3600X more, but I like the 8300 more than I do my previous CPUs that are C2Q6600, C2D2200, Pentium (retching sounds) 4 (more retching sounds), Pentium 3, Pentium 1, and i486.
8600 made me switch from Intel to AMD, when AMD was at its worst.
It isn't so bad at its worst, to be honest.
FX will always be in my heart. My first gaming PC (2 years ago) had an FX-6350; I found the motherboard, with the CPU and 2 of its 4 RAM slots broken, in the trash while going for a walk. Paired it with a GTX 660 1.5GB I got for free from a local marketplace. The case was a clapped-out Fujitsu from 1999.
About 1 year ago I upgraded the pc with an old sharkoon case, a used ssd, a powercolor rx 580 8gb and a fx 8350. I read hundreds of bad reviews online, but I couldn't have been happier with it. (Paid 50€ total for the parts used)
About 6 months ago a pc popped up on that same marketplace for free. Luckily I got it and it has an i7-6850k, a gtx 1080, 4tb of ssds all in a Lian Li PC Z70
It's definitely a huge performance jump, but it doesn't feel the same as my old PC did.
For lighter tasks, the FX series still delivers decent, playable performance in older and some modern games; thanks to DirectX 12 and Vulkan it's aging like _fine wine_.
dx12 is optimized?!
@@AffectionateLocomotivewhen fx launched games were single thread heavy and rarely using well 2 cores, dx12/vulkan makes up for that big time to the point first gen i7 or even Fx processorers or phenoms with high core count are somewhat usable while dual cores are essentially useless
@@oimazzo2537 Damm. DAMMMMMMMMMMMM. so how many cores do games that have dx12 use?
I can say the same about my Core 2 Quad system. Still doing fine for lighter tasks.
But a really interesting question now would be to compare the 4-module FX, especially the 8320 and 8350, to the i5-2500K and 3570K. They were around at the same time, and the FX chips were claimed to age better once games started using those threads. So how do they hold up a decade later? Games do use more threads now; many aren't even playable on a quad anymore.
@@AffectionateLocomotive I have no idea if there is an upper limit but sure it can use more than 1
I'm currently running an FX 8350, 16GB DDR3, an R9 390 8GB, and some cheap Chinese 500GB SSD. It's my main gaming rig; it runs World of Tanks at ultra settings and I use it for work.
I had the 8350 too , good chip in its day .
😂😂😂😂
My condolences...
@@Ignisan_66 If it does all you need it can't be beaten.
@Ignisan_66 lol
I would like to point out that the modules of FX processors share more than cache: they actually share a floating-point unit, while each core has its own integer unit. So in integer-heavy tasks they behave much more like a true X-core processor, and in floating-point-heavy tasks they function much more like a processor with half the cores but with hyperthreading.
This makes a lot of sense. Thanks for the facts!
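The sharing described above can be sketched as a toy model (my own simplification for illustration, not real hardware timing): two threads on one Bulldozer-style module overlap on the separate integer pipes but serialize on the single shared FPU.

```python
def module_cycles(task_a, task_b):
    """Rough cost of running two tasks on one module.
    A task is (kind, cycles) with kind 'int' or 'fp'.
    Integer tasks get their own pipe, so they overlap;
    FP tasks contend for the one shared FPU, so they serialize."""
    (kind_a, cyc_a), (kind_b, cyc_b) = task_a, task_b
    if kind_a == "fp" and kind_b == "fp":
        return cyc_a + cyc_b        # shared FPU: one after the other
    return max(cyc_a, cyc_b)        # separate units: run side by side

# Two integer threads: the module acts like 2 real cores.
print(module_cycles(("int", 100), ("int", 100)))  # 100
# Two FP threads: the module acts like 1 core running both.
print(module_cycles(("fp", 100), ("fp", 100)))    # 200
```

Under this toy model, an FX-8350's four modules look like 8 cores to integer code but closer to 4 cores with SMT-style sharing under heavy floating-point load, which matches the comparison in the comment.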
I ran an FX-6300 for 2 years until I could afford an 8350. From there I upgraded RAM to 16gb and ran that until ryzen 3rd gen. It's kept me loyal to AMD
If you have never owned an FX CPU, you will never understand the bond it makes with its user. My FX8320 served me well, constantly overclocked to 4.5GHz, for 10 years. I upgraded to an AMD 5950X after I got fed up with the thermal requirements of the 8320, so the first thing I did with the 5950X was underclock it, figuring I didn't need that much performance and wanting to reduce thermals even further so I could run it fanless at around 50 watts. So an FX CPU changes you; it becomes a part of you and lives within :)
I used a FX 8350 until 2019 and my GFs daughter is still using it. It works and you can play games. Once games started utilizing more than one core the FX cpu just kept on going. I have not used intel since P4.
I built a gaming rig with the "six-core" FX CPU and a GTX660. It was... fine. Ran BF3 at max settings at 1080p. Still have a laptop with the "4 core" variant, and it's fine. Perfectly adequate at the time, most of the time.
Goes to show how OP most CPUs are for gaming these days. I've got a 12c/24t Ryzen 9 7900X now, and on some games it's reporting like 4% usage. My RTX4080 will generally max out when my CPU is under 50%.
This. I "only" have a 5700X3D, but it can keep up with my 4080 Super. Processors have gotten incredibly good.
@@ChrisGrump The X3D chips are awesome. I was thinking 'hey four more cores at similar clocks is the way to go' I should have spent 10 seconds to look at the benchmarks, it's like having a V8 and losing a drag race to a V6 with a turbo. Still not a bad CPU by any means, but bang for buck goes to the X3D line all day long.
@@TheWarmotor Only in gaming. *most* working applications (encryption, decryption, compression/decompression, rendering, etc) doesnt really care about the extra cache, so you can save money AND gain performance by opting for the higher clocked version without the 3d cache if you arent gaming with your cpu
@@wills.5762 Yeah 😅 except there isn't any non-gaming computing I do that wouldn't run just fine on a raspberry pi.
To be fair the 7900x does have a lot more cores than most games will use nowadays. I'd say it's more towards workstation in performance rather than gaming cpu.
The FX series was made using an old and inefficient node, with some models being pushed particularly hard to achieve high frequencies at the expense of extremely high power consumption and terrible energy efficiency.
Also, the architecture didn't seem to be very good for most applications when they were first released. However, the architecture itself seems to have been relatively advanced, and performed well in some applications even early on, and ended up aging fairly well even for gaming, though it took a long time before a wide range of games were really taking good advantage of the FX architecture. These CPUs also offered decent value for money from the beginning, especially if your electricity costs were low enough that the power consumption wasn't likely to cost you very much, such as in some parts of Canada or the US.
It was nice to see someone taking a stand on the core-count controversy. Even a lot of tech journalists will repeat this outright falsehood that these CPUs didn't have the number of cores that they said they had, and use it as an example to argue that AMD isn't a trustworthy or honest company. Sure, by all means, it's good advice to tell people not to trust AMD, but this was not an example of dishonesty. The AMD engineers themselves considered their FX CPUs to have 8, 6, and 4 cores. The lawsuit didn't prove anything. AMD only agreed to a settlement because they didn't think paying more money in legal fees was worth it.
There was never any conspiracy to misrepresent the number of logical cores, and the whole legal argument that their products were misleading was completely dishonest nonsense. There was never a widely agreed upon definition of a CPU core which the CPU cores in the FX series chips failed to adhere to. The lawsuit against AMD was just 100% pure dishonest legal sophistry and opportunism by lawyers. Anybody who expected an 8-core CPU to be faster than Intel's best quad-core CPUs during the same time period was an idiot, because Intel's best quad-cores were significantly MORE expensive, and it's never been a good idea to assume that a CPU will be more powerful because any one of its specifications has a larger number in it than a competitor. The FX CPUs offered decent value, there was no legal grounds for the lawsuit, and AMD should have fought it tooth and nail for the marketing value of vindicating themselves in court, imo.
Great video, and since you have a working AM3+ board, I hope we see some more FX content. FX-8000 "8 cores" diffused after 2014 are insanely good overclockers, and they do it while sipping energy at the same clocks as the FX-9590: something like 1.28V Vcore under a 4.8GHz all-core load, compared to 1.45V on the old top FX chip, while also being much better RAM/FSB overclockers. They were fun to play with, and it was funny seeing a chip I pulled from an Acer OEM recycling-centre special smoke my old budget champion i5-2500K in games like AC Origins and TW Warhammer 2.
I made the huge mistake of buying an FX-4100 back in the day. The architecture was simply at odds with how Windows gaming worked at the time, kept tripping over itself. A Phenom II X6 would have been a better purchase. Once I got a new job, I dumped it all for a 4770k.
People often dumped them; I've picked up two since I moved to my current place.
Part of the issue was that they doubled the integer units, but kept a low amount of FPU units. And games since Quake are quite FPU heavy.
NetBurst did work a bit better in that regard. Also a low IPC, long pipeline layout, that when fed well, could perform well. And it had a full amount of FPUs.
So in terms of performance, it could compete just fine, but also scale up insanely with the right workload. I remember stuff like video rendering or mp3 encoding being much faster on a Pentium 4 than on an Athlon XP.
FX-8350 was a criminally underrated chip.
Compared to an i5 without hyperthreading, yes. Compared to an i7, definitely no.
@@saricubra2867 Exactly! The FX8350 cost less than an i5 35XX and performed much better in multithreaded work... it lost against an i7 3770K/4770K, but the sweet spot of the FX chips was cost/benefit.
An FX8320 was £100, about the same price as an i3; a 2500K was £150 and an i7 2600K £280.
The i5 is not 50% faster and the i7 is not 180% faster; I know, as I still have them all.
It was fine for its price; I ran the 8370E comfortably @ 4.7 on a budget 8-phase board with the mid-range chipset.
@@janiss2926 The only 8-phase boards were the ROG Crosshair, Sabertooth, and UD7, but it doesn't matter; the elbow of the efficiency curve was ~4.2GHz.
The MSI 970 Gaming was a 3+1 POS, and the Gigabyte UD3P was a (good) 4-phase; IIRC some higher-end ASUS board was 6-phase.
I was kinda mad how you could handle an Athlon 64 FX like the processor in the thumbnail, then I remembered the other AMD FX series.
My first ever gaming PC build was an FX 4300 and a 1050 Ti. Those were simpler days; I put a couple hundred hours of OG Siege, Dying Light, and OG Rust on that thing, and it ran them quite well back then. Now Rust and Siege would probably get 10 fps on that machine lol
My student house never had working heating, I am not joking here. I overclocked my FX8320 to something silly and that machine kept the room warm; I just ran some stress tests if I needed more heat. Honestly, it wasn't that bad of a CPU.
I built a dirt cheap system in 2016 with an 8320e with a mild overclock, 16 GB of RAM, an RX 480, and a fast SSD. It was extremely cheap to build because nobody wanted FX, and it performed much better than earlier FX builds because of DX12 and Vulkan, as well as many newer games taking advantage of more cores. It was a very good budget gaming and video editing system for 4 years.
Part of the problem with FX was that games of the time didn't know what to do with the modules. Later games, such as those from 2015-2017, were designed around using them (because of the Xbox One and PS4), so they tended to run a lot better. When the modules are properly optimized for, it helps a lot with performance. I own an FX-6300, and it honestly impressed me how good it still is (I really need to break it out again though).
Nice stuff. You should see how the FX-9590 holds up today.
Mine still holding up pretty good 🙂
I wouldn't really go that far, but instead take an FX-8350 or even better, the 8320, which was a budget recommendation for quite a while. They overclocked pretty much to the same speeds as the 5950 came stock.
See how it can keep up to the i5-2500K and 3570K, which were the direct competition at first. And it was said the FX would age better in the long run, once games actually use those available threads. Let's see how true that holds.
First ever pc i build was with a fx-6300. Definitely wasn't the best but it did the job. Good times.
Maybe it's just because I was moving from a Core 2 Duo, but I remember the performance being very good. The only game that made me get rid of it was Beam.
My first gaming PC had an FX-6350; I built a €0 PC with parts from a local marketplace. Definitely good times!
The £80 CPU everyone compares to £130 Core i5s. The alternative at the time was a 3 GHz Pentium dual core (no HT).
9:30 That was not a hacker; that's actually the Halloween weather at this time of year, during the night.
Part of the performance can be explained by the more modern tendency to prefer multicore CPU with several threads VS strong single core performance. Now everything from mobile phones and consoles to handheld gaming devices is based around (sometimes) lower speeds but multiple cores. Do you recall ARM/RISC performance back in the day? Nothing was optimized for that.
And then came big.LITTLE: pair a set of fast but hungry cores with a set of small but efficient ones. Pretty much what Intel is doing with their P and E cores: a few fast cores for narrow tasks that don't scale well across many cores, and lots of small cores for things with better parallelism. Later versions could even use both blocks simultaneously, in case a task needs lots of CPU.
My current phone even has three sets of cores, four small ones, 3 medium ones, and a single fast one.
I have a friend whose girlfriend has a PC with an FX, and she runs Baldur's Gate 3 on that thing, despite pairing it with a weaker GPU (not an RTX), so it doesn't run all that well. Tbh it seems like the FX was hated for being a power hog and not particularly great at anything, which is... well, kind of a shame. It's a surprise it held up this well.
In 2011 I was in game design school and starting to make gaming videos on YouTube, so I built a new PC for Unreal 3 rendering (even with the 8 cores of the FX, rendering an Unreal level sometimes took 8 hours) and editing videos. I built an FX-based system with two AMD HD 6950s (later swapped out for a single R9 270X), an Asus motherboard, and my very first SSD... it was a good system for about 6-8 years (I forget), then I moved on to an i7 6700K system...
I even made a video on my channel about this system... (my dog was the star of the video)
I still have the 4300 I bought over 10 years ago, semi-recently bought a 6300, and plan to buy an 8300 in the near future. The 4300 is slated to run a 24/7 Minecraft server for friends; the 6300 runs BOINC decently at a 50% CPU usage limit, with temps rarely over 45°C on a Wraith cooler. The 8300 is something I'd stare at on various web stores with beer pockets and champagne taste.
In the previous generation (Athlon/Phenom), cores were cores. In the FX, each 2-"core" module had independent integer clusters but a shared FPU (as well as shared cache and a shared pool of instruction decoders). For integer workloads it didn't face the same limitations as Intel hyperthreading, where EVERYTHING is shared.
And now Intel is doing big and small cores, so not even a core is equal to a core on the same CPU.
And caches get even more complex. Intel's caches tend to be inclusive, meaning a cache at a given level also holds a copy of the closer caches.
An Intel quad with 256 KB L2 (per core) and 8 MB L3 (shared) would have 8 MB available in total, but only 7.25 MB for one core (minus the 0.25 MB copies of the other cores' L2). The same layout on AMD would give a core 8.25 MB at most (8 MB L3 + 0.25 MB L2).
And while everyone was talking about core-to-cache latency on Ryzen, the same also holds true for Intel.
They used a ring bus, with a stop for the iGPU, a stop for the PCIe lanes, a stop for the memory controller, etc., but also a stop for each core, including a 2 MB slice of L3 cache. Latency tests showed that the first 2 MB were quick, 4 MB was slower (since it needed a hop on the bus and back to reach a different slice), and 8 MB was slower still (since it needed 3 hops to the furthest slice and 3 hops back).
In fact, my ancient Core 2 Quad has a lower 4 MB latency than a Haswell i7
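The capacity arithmetic in that comment can be sketched in a few lines. This is a simplification that ignores associativity and real replacement behavior; the numbers are just the ones from the example above (quad core, 256 KB L2 per core, 8 MB shared L3):

```python
def effective_cache_kb(cores, l2_kb, l3_kb, inclusive):
    """Rough KB of distinct data one core can keep cached."""
    if inclusive:
        # An inclusive L3 holds a copy of every core's L2, so the core's
        # own L2 adds no extra capacity, and the other cores' L2 copies
        # eat into the L3 it can use.
        return l3_kb - (cores - 1) * l2_kb
    # Exclusive (AMD-style): L2 and L3 hold distinct lines, so they add up.
    return l3_kb + l2_kb

print(effective_cache_kb(4, 256, 8192, inclusive=True) / 1024)   # 7.25 (MB)
print(effective_cache_kb(4, 256, 8192, inclusive=False) / 1024)  # 8.25 (MB)
```

This reproduces the 7.25 MB vs 8.25 MB figures from the comment; real caches are messier, but the direction of the trade-off holds.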
I fitted the 8-core fx 8300 to a friend's dad's PC recently, to upgrade from a fx 6100. It was a big upgrade for very little money and is absolutely viable as a daily PC. Snappy and handles multiple chrome tabs, youtube videos, MS office etc. all together nearly as well as my i9. No I'm not kidding!!!
As a young teen I spent the whole summer working around the house at home to earn some money. After hours of hard labour (we were removing tons of stone from the cellar) I had €700 for a full gaming setup. Joke's on me, I got the FX-4100 along with the fantastic HD 7850. In my young, idiotic mind I found out you can overclock for more performance, as my 4100 struggled in games like Battlefield, so I gave it a try... on a stock cooler. Needless to say, that didn't go well, but it was a good life experience and a perfect excuse to get a new and better CPU later on.
They were decent chips for the time. PC builders had good times indeed.
I ran (and still own but don't run) an FX-9590. Yup, on an MSI 990FXA Gaming board, with 16 gigs of Ballistix 1866.
It didn't suck.
Yeah, people remember that gaming was better on Intel chips with their better single-threaded performance, but they forget that running anything in the background could occupy a core or two, and the only Intel chips someone on a budget could get didn't offer both four cores and overclocking at the same time. That was a big deal, because early Turbo Boost came nowhere close to maxing out the chip and the all-core max clocks were kinda low. So for anyone who did more than gaming and could tolerate the power consumption better than the initial sticker price, they just worked.
Still rocking an FX8300 black. Does me fine for 90% of tasks I do and games I play. Got access to beefier cpus than this but don’t have a high end GPU, so can’t take too much advantage
Hey, that's my processor, I still use it with my GTX750 ti, 8 years going strong! 💪
Brazilian IT guy here: I do IT services for a local manufacturing company, and roughly around 2012-2015 I built 5 computers: one FX-4100, three FX-6300, and one FX-8320e. Over the years I've upgraded the RAM to 8 GB and 16 GB on all machines, swapped the HDDs for SSDs, and, if I'm not mistaken, 3 machines are running discrete graphics. It's amazing how well these machines still hold up for daily use in an office-like environment. All of them are on the cheapest possible motherboard at the time, with only the original AMD coolers (one of them runs an aftermarket cooler because of a lightning strike). However, I never cheaped out on the power supply.
At the time I bought them because they were much much cheaper than Intel, but after all those years the extra cores really came through!
I bought an FX-4100 in 2012 it was a huge bump in performance for me coming from an old Intel Pentium dual core. It was really cheap and allowed me to use all the power of my HD 5770 and recording my video games. Then I changed for an i7 3770k + HD 7870 in 2013 and it was a lot better of course. Good times. Now running R7 3700X + RTX 3060
FX cores weren't just sharing cache, they even shared arithmetic compute units like the floating-point units.
They really weren't complete cores.
Yeah, the 9590 was really a hyperthreaded quad core, but still kinda nice.
@NolanWayne-h9n it wasn't hyperthreading
As I said, they were cores with shared arithmetic units.
@ which in function is incredibly similar to hyperthreading. And in form too
@@NolanWayne-h9n Hyperthreading uses special branch prediction to run multiple threads on one CPU.
There aren't multiple units.
@ That's not true. There are parts that are duplicated in hyperthreading, just like in AMD's simultaneous multithreading.
My first built gaming PC was an FX-8350; it got a surprisingly huge bump in performance when Mantle, Vulkan, and DX12 could actually use the cores without the FPU bottleneck getting in the way. It also still works great under Linux. I later upgraded to the 1800X as soon as it came out.
The 4100 is king. I still have a system with the FX-6300 running, and it runs very smoothly on a Gigabyte 970 UD board. It's undying, permanently at 4.5 GHz :)
Sweet throwback video.
I have my FX rig next to my Ryzen config. I still love it!!
I had an FX-8320 that outperformed i7’s back then. It took considerable effort, but I overclocked it everywhere possible. Occasionally, the i7 won, but I had that system tuned exceptionally well. However, the time and effort I invested in making the 8320 beat the i7 weren’t worth it. It was still fun, though. But no one will ever believe me.
Stock i7's? Because you had pretty much a guaranteed 25% oc over the all-core boost, and quite some could do almost 40%
I still have an old i5 here with a 20% CPU OC and 33% RAM OC that still delivers adequate performance in many cases even in 2024.
I got an FX-6100 for free and beat the absolute hell out of it: 5 GHz on all cores with a €25 air cooler from a random Chinese brand, paired with an MSI 970A Krait. Great board, apart from the power delivery system... it was overwhelmed by a stock FX-6100, and I had to add massive amounts of airflow to reach 5 GHz without it blowing up. MSI later removed support for the FX-9590 because the board literally blows up if you even try booting with it.
I got one in my PC too, and I also overclocked it to 5 GHz. At 5 GHz, paired with a modern GPU, you can play SBK 22 and MotoGP 22 on Ultra at 2160p 60+ FPS, Rage 2 on Ultra at 2160p 60+ FPS, Monster Jam Steel Titans, Terminator: Resistance, and of course older games maxed out at 2160p. I know because I recorded them on my YouTube channel XFORCE667.
Even MotoGP 23 runs maxed out at 1080p or higher using this OCed CPU.
Enjoy!
The FX 6300 was my chip at the time also (upgraded to a 4790K several years later)
Had the chip overclocked to high heavens. 1866Mhz Ram too. It wasn't perfect, but it was a beast.
i had an fx-8320 that i overclocked to just under 5ghz. it had its drawbacks, but i always liked it. it was significantly cheaper than intel, taught me about overclocking, and kept my bedroom nice and toasty during the harsh montana winters. oh and getting paid like 6 years later in the class action suit was nice too.
I used an AMD FX 8350 for a long time and overclocked the heck out of it and it was actually pretty decent imo
I had my AMD FX-8320, Then the FX-8350 for about 10 years. the 8350 lasted me up until earlier this year in 2024. Overclocked to around 4.8GHZ, its 8 cores and high frequency smashed most gaming tasks i threw at it over the years. Loved those chips!
A few months ago I built 3 PCs with 3 different AMD FX CPUs, specifically for my kids. The first PC, for my eldest daughter, has an FX-8150 paired with 8GB DDR3 and an RX 5500 XT 8GB. The second, for my middle daughter, an FX-6100 with 8GB DDR3 and an R7 250 2GB. The third, for my youngest son, an FX-8320e with 8GB DDR3 and an RX 550 4GB. They mostly play Roblox, but also Genshin Impact and Dragon Ball Xenoverse 2, and surprisingly enough all the systems play GTA Online without issues: the FX-8150 at high settings at way over 60 FPS, and the weakest system with the FX-6100 on normal settings but with over 50 fps. The FX-8320e was even able to play Ghostbusters: Spirits Unleashed on normal settings at over 50 fps; I was not expecting it to even run. So yeah, they're fine for what they play. Also, browsing YouTube or Netflix is pretty smooth. I wouldn't touch the 4-core FX CPUs though.
GTA V/Online does fine as long as you give it 3 threads or more. There are some issues with dual cores, but beyond that it's pretty well optimised.
Ready to heat the house during winter season
@@prostmahlzeit I don't see temps over 73°C on any of the FXs. They run at default settings without issues. They do however use quite a bit of power, from 96 W to 125 W TDP. Not a problem though; energy is very cheap where I'm located.
@@EtaYorius I had an FX-8320 for 7 years and am now running a Ryzen 5 3600 and a Ryzen 5 5600; it's like a 3x improvement.
Overclock the HT bus, and do an FSB overclock instead of a multiplier one, as they performed far better that way, and you have a decent little quad (really dual-module) core.
But for less money you can buy a Xeon 2680 v3 (~$7) and a cheap X99 motherboard and RAM for less than $50 combined, from AliExpress and eBay respectively, and it will run just about any game out there (think between Ryzen 2000 and 3000 performance).
Yup, the same also holds true for the K10 and to a degree for the K8 chips, the whole frontend scales up performance quite well.
I think it was 2400-2600 on the northbridge and maybe 2200 on the HT link; paired with fast RAM in the 1866-2000-2133 range, it should fly.
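The reason the reference-clock (FSB) route helps can be sketched with the clock arithmetic. The multipliers below are illustrative, not the commenter's actual settings:

```python
# On AM3+ the core clock is reference clock x multiplier, and the
# northbridge (which carries the memory controller and L3) and HT link
# are also derived from the reference clock. Raising the reference
# clock therefore speeds up the uncore too, not just the cores.

def clocks(ref_mhz, cpu_mult, nb_mult, ht_mult):
    return {
        "cpu_mhz": ref_mhz * cpu_mult,
        "northbridge_mhz": ref_mhz * nb_mult,
        "ht_link_mhz": ref_mhz * ht_mult,
    }

# The same ~3.5 GHz core clock reached two different ways:
mult_oc = clocks(200, 17.5, 11, 13)  # multiplier-only overclock
ref_oc = clocks(250, 14, 11, 13)     # reference-clock overclock

print(mult_oc["northbridge_mhz"])  # 2200
print(ref_oc["northbridge_mhz"])   # 2750 - faster L3 and memory controller
```

Same core frequency, but the reference-clock overclock lifts the northbridge and HT link as well, which is the gain the comment describes.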
I had one of the 9000-series chips; it ran hot as can be and was basically unusable with stock cooling.
Ended up doing my first Intel build after that. I don't even remember which CPU it was, but it ran much nicer.
Nowadays the 8000-series chips are best known for the absolutely horrifying overclocks people can get out of them, being the first to crack the 8 GHz ceiling. My condolences go out to the guys who broke that record, only to no doubt be cooked alive by the miniature sun they'd created with all the heat it must have kicked out.
I'll never forget my FX-6350. Despite how god-awful it was at lots of things, it could still run games. I remember barely managing to get Minecraft VR running on it a while back with my 970.
The weakest would actually be FX-4100. ;P
I had an FX-8320 overclocked from 3.8 to 4.2 GHz from October 2013 to October 2019, my first 1080p-capable computer ever. I built a Ryzen 9 3900X PC that year, which I still use today.
Still rocking an FX-8350 for the last 7 years now. It's the first and only PC I've ever built, and it's working perfectly for me, paired with a 1050 2GB GPU, 16 GB of DDR3, and loads of RGB in a glass tower case. Got no reason to upgrade, as it does all I need it to do 🙂
You don't play many games do you?
@@nesyboi9421 Older and optimized games run fine on an FX chip. Modern AAA slop probably won't though. With any older machine you just have to play games the hardware can run.
@@Compact-Disc_700mb True and not true, as shown in this video; it depends entirely on the game, triple-A or not. For example, I know for a fact a game like Space Engineers is going to run like crap on it, or at least it used to; dunno if they've improved optimization much, but I know it used to run mostly on one core.
I was referring more to the 2GB 1050 than anything. The 1050 is a good card; it's just that the 2GB version is less than optimal, and I feel like 4GB is the minimum nowadays for most games. Sure, if all you play is Garry's Mod, Spore, Minecraft, things like that, it will be fine, but some other titles are going to struggle. I had a 4GB laptop GTX 1050, and although it still plays a lot of games I like, it's started to show its age, which is why I relegated that laptop to just schoolwork. Even with 4GB, some games just don't take well to such a low amount nowadays.
But yeah, I guess if you only play older games or 2D indie titles you could get away with it.
@@Compact-Disc_700mb It seems my reply never went through on mobile.
Yeah, sure, if you only want to play old games. But it isn't just triple-A slop that you can't play on older chips and graphics cards; you miss out on a lot of the good newer titles too. If all you want to do is play Minecraft, Spore, or Garry's Mod, that CPU/GPU combo will suffice, you're right, but you miss out on a lot of new stuff, indie and big studio. I can tell you one thing: that combo won't play Helldivers 2, at least not well, nor would it run Space Engineers all too well either.
And honestly, I think they would be a lot better off with a 4GB GTX 1050 or 1050 Ti, or at the very least a 3GB 1060. 2GB is so little nowadays; even my 4GB GTX 1050 was starting to show its age about 4 years ago, when a bunch of the sandbox games I played got harder for it to run (i7-7700HQ laptop, 16GB DDR4, GTX 1050 4GB).
3GB 1060s are a dime a dozen and would make a great upgrade, as would a board with first-gen Ryzen or Intel 6th gen.
I do understand that maybe they don't want or need an upgrade, which is why I asked if they played many games. Maybe I should have asked if they only played old games or light titles.
@@nesyboi9421 I've had an i7-6850K with a GTX 1080 for about 6 months now. There are no games I play now that I couldn't have played on my old PC (FX-8350, RX 580 8GB).
And yes, I play a lot of games.
My god, this was my processor when I built my first PC. I was so proud of it, until after some time the CPU heat warning started blaring almost constantly while gaming lol.
9:36 that's because of Halloween, it's not any hacking shenanigans
The instruction set helps a lot, like AVX. That's why Sandy Bridge and FX CPUs are still usable for modern games.
The main bottlenecks of this CPU were the shared FPU and the lack of modern features on AM3+ motherboards, like PCIe 3.0.
Had an FX-8120, and I don't think I have any good memories of it, to be quite honest. I don't even think it was average to begin with, as the CPU ran hot, reaching its Tmax (62.5°C). No matter the cooler, the CPU stayed super hot and was very unstable. It also didn't help that the ASUS motherboard I had was modified, since it was a pre-built machine, but it did give me the best value in terms of longevity (until the motherboard died). I would never recommend FX to anyone unless you absolutely know what you are doing, and even then I'd still point to a used first-gen Ryzen instead of an FX. You'd still have a much better time than with an FX.
Same situation with my A8-7650K; I think it's the paste inside the IHS drying out so badly that it just leads to our CPUs overheating to death.
I bought a 7600X.
Super duper fast, i9-11900K levels of speed.
Super duper cool (32 to 40°C max, OC'd).
While my A8 goes from 38 to 60°C...
I ran FX from 2011 up until 2016, when I switched over to X99, and then finally went back to AMD with AM5. I owned the 6100, the 8150, the 6350, and the 8350. They were all good chips honestly, were extremely easy to overclock, and had no problems running the multiple SLI and CrossFire setups of the day. I also wish I hadn't sold all my FX gear; I wouldn't mind finding out what my 8350 could have achieved with my old 1080 Ti. The higher-tier FX chips also had 32 PCIe lanes, which means you could run two GPUs in x16 mode, which the Intel counterparts couldn't, at least not on the mainstream platform.
Nice, I had an FX-6350 (with a GTX 660), then an 8350 (RX 580 8GB), and I now have an i7-6850K with a GTX 1080.
RIP Stig the Dog
This hits the feels. My first build was an FX-4350, a Radeon 4870, an awful OCZ 650W PSU, and an overly expensive Thermaltake case... I still have pics of that setup with the enormous Noctua cooler in a standard ATX case... what was I thinking in 2012 🤦‍♂️🤦‍♂️😂
I upgraded my FX-6300 to a Ryzen 5 5600 a month ago, and it was the best investment I've ever made.
I had an FX-6300 on that same motherboard before I made the jump to a Sandy Bridge i5, then to an i7 (which I currently have and am replacing in a couple of weeks with an 8th gen). IDK if it was my video card or what, but I remember being so disappointed with the performance of some then-current games in 2015, particularly GTA V. I sat on the threshold of going to an FX-9590... but when I saw the TDP of the 9590 (TWO HUNDRED AND TWENTY WATTS!) vs the i5, switching to team blue was a slam dunk.
Having a quad-core CPU nowadays is a major bottleneck.
It depends on what you are doing. The i3-13100F isn't a bad CPU, for example, and if you only play light games it'll get the job done.
@@ItsNeverMe Why would you buy that though when you could get an R5 5600 for a similar price.
I had the Phenom II X4 955 and later the 8350 (the latter with the wraith cooler). Bought both of them new, long after release, at a budget price. Both were really good, capable budget options. Still have them in their box on the shelf.
Time to set up a retro system for all those old games that don't run well on modern platforms.
The Phenom could totally get something like a HD 5850, HD 6950, GTX 570, etc. And the FX would probably be happy with a GTX 670 or HD 7950
I still daily drive an FX6300, no issues
Also, 97 W under load? Mine never uses over 50 W.
You can't overclock the snot out of a CPU and then blame it for running hot.
You’re very brave
I remember playing some heavy mid 2010s games with an FX 9590 system that I got as a farewell gift from my first-ever workplace. Kept it until 2018 when I switched to a Haswell Xeon system (E3-1270 v3). At the end of 2020 I traded the AMD platform for an RTX-2060-Super, which still runs in my old Haswell system to this day. Sometimes I wish I still had the FX, to test new stuff with it.
I bought an FX-6300 in 2014 and it was already slow as hell. It felt more like it had 3 cores, not 6.
Technically it did 👀
yup, since their "6 cores" meant 2 weak cores making up one ok core.
My 8320e was supposedly 8 core yet it really wasn't
@@Sprier Technically it did NOT, ffs. So many years have passed and people still can't understand the FX architecture?!
@@Dagoth666Ur Actual 6 cores, sure, but 3 modules sharing resources and bottlenecked as such, also sharing the same cache. It didn't have actual 6-core performance, and there are logical reasons why; as for the technicality argument of "actually 6 cores", there's a reason they settled.
@Sprier " With not having actual 6 core performance" is highly arguable, they hade lower ipc than intel cpu' of the era witch amd planed to compensate with high clocks but they where unable to achieve them, that was biggest problem with FX cpu's. Then intel began dirty campaign of bribery of oem manufacturers and rest is history...
I started out on the FX-4300 in 2011. I upgraded the CPU a few years ago to the FX-8370, recently doubled the RAM to 16 GB, and upgraded from a pair of Radeon 7700s in CrossFire to a GeForce 1650. Also a 2 TB SSD.
A few months ago I got a new Intel i5 13600. I still have the AMD PC and it runs very well. 13 years and counting.
People tend to be overly dramatic with how bad hardware actually is. In reality if it performs decently and sits at the right price it's not a bad product at all.
Yeah, I never understood the hate for the FX chips. As long as it's stable and performs fine for the price, then good. Older and optimized games run fine on old hardware; the new trash is not going to run well. So many people just hate anything that can't hit 300 fps in the latest AAA slop. You just need to keep expectations in check with old hardware.
@@Compact-Disc_700mb Meanwhile I'm sitting here with an i5 of the same vintage, unable to hold a stable 60 fps in some non-AAA games from around 2013-16.
@@HappyBeezerStudios Asuming you are running windows, you might want to check your background processes and installed software to see if something is useing all your resources because that does not sound right. You might want to do a clean OS install if it was installed years ago. Could also be GPU or ram limitations.
@@Compact-Disc_700mb just the game using 95%+ of all my cores.
@@HappyBeezerStudios Probably an unoptimized game, sadly. Still a good CPU.
I still have my old FX-8350 somewhere.... It was perfectly adequate for 1080p 60 FPS gaming and it did just fine for general computing, too; I didn't retire it until 2019. Huge improvement over the crappy 2-core Intels I could afford at the time, it gave me 80% of the performance of the then-king Intel I7-3770K for a third the price. I slapped a Cooler Master Hyper 212 Evo cooler on it and it was pretty cool and quiet unless I was rendering video or playing demanding games; my GPU was far noisier though so it didn't really matter.
9:35 wasn’t a modder, that’s part of the Halloween event.
I ran the 6300 on the stock cooler for 9 years. Never re-pasted, never had any issues. Good series of processors.
The weakest FX was FX 4100, not 4300.
I have one.
Yes, it is the 4100. People always say FX-8350; I couldn't get it, so I got the 8120. The 8120 is better than the 8350. 😊
@@airmicrobe But 8350 is 230 more, right? So it must be better.
It was a good chip for its time and still kicks ass today.
In 2018 I did the full FX-8370, 16GB DDR3-2133 and RX 580 combo.
Averaged between 45-60FPS in Unity VR stuff, 90-120FPS in anything Unreal and a steady 144FPS in most Unity desktop games, which was fair.
After building a Ryzen 5 3600 system, I racked the FX and turned it into a recording server.
It really deserves better than this but without streaming, it has no home.
I promoted a single core AM2 box to be my full time web and storage server just to retire the FX to container duty but stopped doing containers.
I am using an FX-9590 in a Crosshair V Formula-Z motherboard with a modified BIOS to use NVMe drives. This processor, while hot, runs Windows 11 fine. That is, until Microsoft shafted everyone in the Windows Insider group who tested 11 on FX processors: we had no errors or issues, but Microsoft's uncompromising greed won out over people's inability to afford a Win11-compatible PC, and we were kicked out of the Win11 Insider program.
FX was never really compatible with W11 anyway, in fairness; the lack of TPM 2.0 required workarounds.
I hate to say it, but considering how cheap older Ryzen is getting, FX is really hitting the point of effective uselessness. Even a 9590 at a full-whack overclock will get walked by a second-gen 6-core Ryzen you can get for *30 bucks*, and DDR4 is cheap now.
Might be time to move on.
@@deepbludreams TPM isn't needed on consumer platforms.
I used to have the FX-6300 "Black Edition" in my first PC, paired with 2x4GB 1600 MHz RAM and a Radeon R9 280 3GB. However, after a couple of years I started to notice some very serious stuttering in games. In 2017 I decided to upgrade to the Ryzen 5 1400, since I could get it for 70 euros, and paired it with an ex-miner RX 480 8GB which I got for 75. It was night and day: the stuttering suddenly stopped and my fps doubled. In fact it is still my main config, since I mostly play older or indie games.
Games are much more multi-core aware nowadays than they were back then, though, right?
That might account for the "less bad than remembered" performance.
Yes, games today are far more multithreaded.
Would like to see a 2500K or 2700K thrown at the same games, just to see how true the claims of the time turned out to be.
The first setup I built, back in 2012, had an FX-4170 in it, paired with an AMD HD 7770 :D Later I upgraded to an FX-8320, which smoothed out some of those Bulldozer kinks and gave the socket some more life, but ehh... when I saw my friend play GTA V at a locked 60 fps on a 4690K, I knew I had to change (for a while) xD
I have an FX-8350 with 16 GB of memory, a 128 GB SSD, and an RX 580 8GB. I've been using it for many years now with Windows 11, and it's still as good today as when I built it.
The worst FX is like the worst Pentium 4.
At the start of the pandemic I was still running an 8320 machine, I still look on the FX fondly.
Nothing will ever suck as much as Sempron.
They were fine - the first Semprons were basically Athlon XP Bartons for Socket 754. The later ones were even 64 Bit. Of course as soon as there were better options (e.g. Q6600) it didn't make any sense anymore - especially because they were single core parts...
@moezarella1261 I am a former Network Engineer & started building PCs in 1992. I was even part of the Cyrix 386-to-486 upgrade era. I gave up after getting the infamous "magic smoke" on the 3rd 386 I tried using the upgrade kit on. So I know how much the Sempron sucked. It was an IT support nightmare, people (aka management) who didn't understand how a PC worked went & bought these things, because of the price point.
My first PC as a kid had an FX-6300, purely because of the price difference compared to Intel at the time, as you mentioned. I still remember having to run it with the side panel off because the stock cooler (literally as you described it here) was just not sufficient; funnily enough, I use that cooler now as a makeshift solution when I'm testing motherboards 😂. This video brought back many memories, ngl. I went to a 6700K after, and then back to a Ryzen 3800X; to this day that's the only Intel CPU I've ever used, and I'm not planning on going back any time soon. AMD definitely started something good in the long run with their chip development at that time, I think. I still have the system under my desk catching dust atm; might put it back to some use, especially after seeing this video showing how competent it still is in 2024. Amazing content, keep it up my man!
That lawsuit was such BS. It's "4 cores" because the execution units are divided into two integer clusters and one floating-point unit per module, but the cherry on top is that it's a superscalar processor anyway, with each of the integer clusters and both halves of the floating-point unit fully capable of executing more than one instruction at a time.
AMD settled because they'd have had to reveal precisely how they accomplished this and give away trade secrets to prove how it worked, which might not have won the case anyway. The legal "team" earned about 12 million for their trouble, and those who asked for a partial refund got less than 2 dollars.
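The "is it really 8 cores" argument is visible in how the OS reports topology. A small sketch, assuming Linux-style /proc/cpuinfo text (the kernel's actual core-id reporting for FX varied across versions, so the sample below is synthetic):

```python
# Compare logical processors against distinct core ids - the same
# numbers the "how many cores is an FX really" argument is about.
# Each FX module has two integer clusters sharing an FPU, front end,
# and L2, so whether a module counts as one core or two is the debate.

from collections import defaultdict

def topology(cpuinfo_text):
    """Return (logical processor count, total distinct core ids)."""
    cores = defaultdict(set)
    logical = 0
    package = "0"
    for line in cpuinfo_text.splitlines():
        if ":" not in line:
            continue
        key, _, value = (part.strip() for part in line.partition(":"))
        if key == "processor":
            logical += 1
        elif key == "physical id":
            package = value
        elif key == "core id":
            cores[package].add(value)
    return logical, sum(len(ids) for ids in cores.values())

# Synthetic cpuinfo for a module-style chip: 8 threads, 4 shared modules.
sample = "".join(
    f"processor : {i}\nphysical id : 0\ncore id : {i // 2}\n\n"
    for i in range(8)
)
print(topology(sample))  # (8, 4)
```

On a real machine you would feed it `open("/proc/cpuinfo").read()`; the point is simply that "8 logical processors" and "4 shared-resource modules" can both be true at once.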
Exactly right that.
I still have my AMD FX-8320 on a Gigabyte Ultra Durable motherboard with 16 GB of RAM, a SATA SSD, and a GTX 1650. An old beater that I play old games on. Still going strong, lol.
I run an 8150 bulldozer. Still runs great mind you I'm not a gamer.
The plastic cooler hold-downs breaking put an end to two of mine. The stock cooler falling off and shorting out on everything as it fell inside the case took out the CPU, motherboard, and PSU. Oh, the memories.
fx 8350
The bulldozer Stockholm syndrome is real
My bro still uses my old FX-8350 and it's running every game he wants to play: Helldivers, Elden Ring, Red Dead 2. Pretty good for an old-as-hell CPU.
Still running an FX8350 with 16gb RAM and an RX580. Plays Elden Ring at high across the board. Going from Win7 to Win10 gave a huge boost in performance.
My family PC is still running on a FX-6300, 8gb of ddr3 and a GTX 750 (1gb VRAM version). Considering it's only my mom using it nowadays, and that it's mostly web browsing and browser/emulated mobile games, it's still holding up just fine!
For security reasons I'll be getting my mom a new one once Win 10 goes EOL, but I'm sure she would be perfectly happy with keeping the old one if she could.
I built an FX-6300 PC with 8GB and an RX 460, and I used it for years; it handled games, virtualization, college, and uni without a single issue. A great PC that I only recently gave away to a young lad whose parents couldn't afford to buy him one. Very underrated processors.
I had an AMD FX-8350 in my old PC and it handled itself well enough for what I used it for at the time. I still have the old PC, but got an 8th-gen i7 laptop after I started getting graphical issues on it. It was paired with a Sapphire R9 380X; loved that PC when I got it back in 2014-2015 (approx.).
Man, this video has me wanting to dust off my old rig, which I rebuilt back during the winter of 2020 (Covid - Round 1 - Fight) after I finally had some time to myself, as my job was classified as Essential Worker. Like you, it's what I called my "first real" gaming rig, built years before, when I got a heck of a bundle on it and needed something better than the ol' Core 2 Duo system I was on at the time. Honestly, if I hadn't used the stock cooler (of course I was on a budget), I probably would have used it longer into its life. But yeah, still using the same Gigabyte board, 16 gigs of G.Skill RAM, and the FX-8120 the bundle came with.
Current graphics card I have in it is an EVGA GTX 1660 Super 6 Gig I picked up in March of 2020, the one that took me through the dark times to follow on my First Gen Ryzen system, which is just now one of those cards I'll never get rid of working or not, given EVGA pulled outta the graphics card market due to Nvidia shenanigans.
Given the underdog status of the FX line, and all the gaming I did on the rig back in the day, it's got a special place in my heart :)
Thus! Great video :)
Built my first pc with a fx-6350. I got it framed on my desk, was a decent little processor that did all the gaming I could have hoped for at the time. Lots of fond memories using that cpu.
Super cool video!!