FX-8350 vs i7-960 in 2023 - What If AMD FX Wasn't Delayed?

  • Published: 8 Sep 2024

Comments • 231

  • @RATechYT
    @RATechYT  1 year ago +5

    Thanks for watching! You can support my work here: www.buymeacoffee.com/ratechyt
    discord.gg/PFb9cMstZH ▪ instagram.com/ratechyt ▪ twitter.com/ratechyt ▪ facebook.com/ratechyt

    • @mastroitek
      @mastroitek 1 year ago

      Hey man, just found your channel and you have some great content! I also saw how you handled the now two-year-old Hardware Unboxed story, way to go!!
      I was just wondering if you would be interested in, or have thought about, investigating the high idle power draw of AMD CPUs. I discovered this issue when I built a 5900X system; afterwards I investigated the problem but never found an explanation or any kind of "rigorous" testing on the internet. I know some people exaggerate the problem, but many seem to be completely unaware of it, and I'm not sure how widespread the issue is. I also find it weird that big channels don't mention or talk about this behaviour. As far as I can tell, the only channel that has somewhat covered it is Tech Notice, where he asked his followers what the idle power consumption of their AMD CPUs was; I found the results of the poll, along with the comments, quite interesting.
      Edit: I also sent emails regarding this subject to HU and GN but never received any reply.

    • @RATechYT
      @RATechYT  1 year ago +1

      @@mastroitek Unfortunately I don't have the latest Ryzen processors so I can't really say much, and I'm not planning to make a video on this topic anytime soon, but I just watched Tech Notice's video and think he already did a great job of explaining everything in detail. I agree with him: it really is unfortunate that most channels only talk about power draw under full load and draw conclusions from that. Adding a more realistic use case, such as browsing the web or editing a video, definitely would've been great, and it's something I'll be doing in my videos from now on as well.
      From what I've seen, it looks like the reason for the higher power consumption at idle and in lighter workloads on AMD is the chiplet design.

    • @mastroitek
      @mastroitek 1 year ago

      @@RATechYT I see, thx for taking a look anyway :)
      Yes, from the information I have gathered it seems that multi-chiplet SKUs struggle to keep idle power usage low; the problem seems to be somewhat present in single-chiplet SKUs too, but not to the same degree.

  • @jamezxh
    @jamezxh 1 year ago +109

    It's funny how people criticised the 8350's power consumption. Now look where Intel CPUs are in 2023 in that regard.

    • @gradystephenson3346
      @gradystephenson3346 1 year ago

      I miss my 8320; the 4790 non-K I use now seems generic

    • @nepnep6894
      @nepnep6894 1 year ago +14

      Intel's power consumption is fine. It has significantly lower idle power than chiplet-based Zen CPUs, and under moderate gaming loads power consumption is very close.

    • @jforce321
      @jforce321 1 year ago +12

      It probably has more to do with performance per watt than anything else. We can say for sure that chips from AMD and Intel consume a lot of power compared to what they used to, but they also both deliver insane performance. We also know that the companies overclock them way past their efficiency curve to get those last few percent and sit at the top of benchmark charts, and that by lowering your power consumption by around 30% you can still keep about 95% of the performance.

    • @KarolusTemplareV
      @KarolusTemplareV 1 year ago +4

      The issue with the entire FX line is that if you had a Phenom X4 955 or better, the only decent upgrade at the time was the FX with its 220W TDP. Bulldozer was kinda awful for games, and Piledriver was too little, too late.

    • @KarolusTemplareV
      @KarolusTemplareV 1 year ago +7

      @@gradystephenson3346 If you used the 8320 next to that 4790 now, you would not miss it. Haswell somehow is still kinda enough; that Piledriver you had would crawl in comparison today.

  • @ramiretz
    @ramiretz 1 year ago +22

    believe it or not, I replaced my AMD FX just 1 week ago with a Ryzen 5700X setup..... it was a very good setup for me for a long time, and if I weren't gaming from time to time I would still use it for my daily tasks like YouTube, chat, email and so on....... rest in peace my dear FX, you were always a very good buddy all those years

    • @LeitoAE
      @LeitoAE 5 months ago

      I have a quad-core 2nd-gen Phenom; a few months ago I sold my R7 1700X (it got replaced by an R7 5700X) and bought an FX 8320 just for fun and out of nostalgia, as I used to have an 8350 for some time.
      So as a person who has recently used a Phenom 955, FX 8320, R7 1700X and R7 5700X, I can tell you that the performance gains, even in simple web browsing, are enormous. The Phenom and FX now have trouble with Windows updates. Games also take forever to launch compared to Ryzen. If you try some older, single-threaded games from the FX era, you will have a much worse experience. AMD went for high core counts too early; the world and its software were not ready for that.
      The Ryzen 1700X is much better in loading times, but it also didn't age well. Overall system performance is good, but it tends to stutter in games due to its low single-core performance, and some instruction sets didn't work well on 1000- and 2000-series Ryzens. I mean... I had an i7 2600K that, after an OC to 4.5 GHz, had similar single-core performance to that 1700X, which by the way didn't overclock at all - it couldn't hit 4 GHz no matter what... All I did was increase its efficiency a little by decreasing the voltage, and I kept it at stock clocks.
      In my opinion, FX was not that bad, even when it debuted and fought the 2nd-gen Intel Core. People forget that a typical 8-core FX was priced similarly to the i5s, so it should be compared to them. The FX 9000 series was a joke - priced extremely high for just a few hundred more megahertz, but hey - I guess it was the first consumer CPU that clocked to 5 GHz at stock, right? XD

  • @AvroBellow
    @AvroBellow 1 year ago +7

    I used an FX-8350 from 2012 to 2017, five full years. At no time did I feel the "need" to upgrade, as all of my games were plenty fast. The FX-8350 was like a diesel truck. Sure, it wasn't overly fast, but it was fast enough and it didn't slow down when increased loads were put on it. The fact that programs became more threaded over time only helped it and, if you're using an SSD, it runs Windows 10 just fine, if perhaps not as well as it ran Windows 7.

  • @lepatenteux592
    @lepatenteux592 1 year ago +17

    I am still using my FX-8370 for gaming and it is an amazing machine... I built the PC back in 2011 with a six-core 1100T CPU, and upgraded it to the FX-8370 and 1866 MHz RAM a few years later.
    It performs great and I can't yet justify replacing it... I bought a second-hand 8350 setup for my kid, and I have to say, it does not run as smoothly... the 8370 refresh was a step forward....
    Both systems use the 990FX chipset; mine is a Gigabyte board, my son's is an Asus.
    I will be giving my 8370 to my second son this year... with a tear in my eye! (And building a new PC for myself, 13 years later!)

    • @ramiretz
      @ramiretz 1 year ago +2

      i also replaced my FX setup 1 week ago with a Ryzen 5700X setup.... my FX was also a trusty buddy for more than 10 years

    • @Trick-Framed
      @Trick-Framed 1 year ago

      It depends on the silicon lottery, which was still in effect back then. I've had 8370s that couldn't clock past 4.4 and FX 8350s that can hit 5.2 GHz on water.

    • @lepatenteux592
      @lepatenteux592 1 year ago +1

      @@Trick-Framed Yeah, that's true if you are overclocking your PCs... I don't! I undervolt them as much as possible and that's it... This might explain why my PCs last me years!

    • @Trick-Framed
      @Trick-Framed 1 year ago

      @@lepatenteux592 That would explain the performance gulf. Thank you. Good luck with your 5700X. They are excellent value for money - the first AMD 8-core/16-thread to beat the venerable i9 9900K OCed to 5.2 GHz.

  • @ironhead2008
    @ironhead2008 1 year ago +40

    You can blame the FX being delayed on GlobalFoundries being unable to get their house in order: the node AMD had planned for the FX was hopelessly delayed

    • @ivankovachev8835
      @ivankovachev8835 8 months ago +4

      And GlobalFoundries promised AMD that the 8-core FX CPU would have 2 billion transistors instead of 1.2 billion at the same power consumption and die area... So AMD had to very swiftly redesign the architecture before release.

    • @ironhead2008
      @ironhead2008 8 months ago +1

      @@ivankovachev8835 Yep, one of the places (amongst many) where they had to cut was the FPUs. They were originally supposed to be MUCH wider - fat enough that they could handle the workload from both compute units. Take a good look at the mess surrounding the FX series from that angle and you can see one of the big reasons they went fabless. Yes, Intel had similar issues with 10nm being hilariously delayed, but that's one of the perks of having 70 to 80 percent market share: you can kick the can down the road and buy yourself some time.

    • @ivankovachev8835
      @ivankovachev8835 8 months ago +1

      @@ironhead2008 Not just the FPUs; the integer cores had 2 ALUs instead of 3 like in the Phenom II. Now, in the Phenom there were some bottlenecks where the 3rd ALU would be underutilized with certain instructions, but they could have widened the front-end to compensate for that too.
      And yeah, Meteor Lake is the first good Intel product since Sandy Bridge. Keep in mind they recently got 10-15% better performance from a BIOS update without increasing power consumption.

    • @ironhead2008
      @ironhead2008 8 months ago +1

      @@ivankovachev8835 Yeah, I know a lot of the original design got cut; I mentioned the FPUs because they were one of the more commonly cited weaknesses of the FX design.

    • @ivankovachev8835
      @ivankovachev8835 8 months ago +1

      @@ironhead2008 At the time, 256-bit floating-point operations were primarily benchmark-score boosters, but the integer performance suffered a lot too, and that mattered much more back then.

  • @euX222X
    @euX222X 1 year ago +5

    I still have my old FX 8300; for almost 10 years it played all my games at 4K 30 FPS (before that, 1080p 30 FPS), with 24GB DDR3 1866 and an RX 580 8GB. But at the beginning of 2023 I got one of those "China miracle CPUs", a Xeon E5-2666 v3 3.5 GHz (10c/20t) with 64GB of 2400 ECC DDR4 in quad-channel, and at the same time (the reason for replacing the FX 8300) got a used Radeon VII 16GB for just US$125, a bargain...

  • @koreannom
    @koreannom 1 year ago +9

    Man, it feels like yesterday when my friend helped me build my first ever PC with an FX 6300 + GTX 760 (4GB model, yup) and 16GB DDR3. I wish I had played more games while I still had the PC and while it was still relevant haha.
    My fondest memory on that PC was playing BioShock Infinite, loved that game.

  • @Pruflas-Watts
    @Pruflas-Watts 1 year ago +5

    My FX 8350 is still soldiering on. I have it paired with an ASRock 970M Pro3, and I got it booting from a PCIe NVMe M.2 drive running at PCIe 2.0 x4; I'm getting 1650 MB/s read and 1550 MB/s write speeds. I paired it with an RX 6600. Very happy with its performance all these years later.

    • @patrickc8007
      @patrickc8007 1 year ago +1

      A GTX 1060/RX 580 would already be bottlenecked by this CPU, let alone a 6600. The most absurd setup ever.

  • @gulskjegglive
    @gulskjegglive 1 year ago +4

    I am typing this on my AMD FX-8370 PC running Linux Mint. Going strong and stable almost a decade later.

  • @Konkretertyp
    @Konkretertyp 1 year ago +22

    The FX 83xx lineup was actually great in my opinion, if you knew how to configure the system for the best stability and performance. At one point I replaced my nephew's i7 920 system (which was dying because of a bad PSU that I had actually wanted to replace) with an AMD FX 8320, and he was happy about it, because he got a completely new mobo + CPU + RAM combo without any significant performance loss. Later on I gave him a GTX 1050 Ti (at the time of the mining craze in 2017, it was one of the few GPUs I could get new without overpaying) to replace his aging 9800 GT, plus 16GB of 1866 DDR3 RAM. He was satisfied with this config for a long time, up until 2021 (when he got his hands on a Ryzen 5 2600 for free and built a new system around it by himself, with an RTX 2060 Super that I gave him for Christmas that same year). I used an FX 8350 myself (later replaced by an 8300 that I got for cheap, because I somehow managed to kill my 8350) and was astonished at what it was capable of, but at the same time disappointed that some games didn't work well with it at all and even fell behind some Core i5 2xxx chips because of the bad single-core performance. The FX lineup wasn't as bad as people say; they definitely had their benefits, especially later on when they became very cheap (and in my area much cheaper new than I could get any i7 2600/2700 used), but I can understand the frustration some people had with them. You had to have the patience to learn how to configure them and what to watch out for (like RAM speeds and HT/NB clocks, the type of mobo/chipset, whether the VRMs on the board are properly cooled, etc.) to make them run as well as they could.
    Nice comparison video, I always appreciate this type of video, great job 👍

    • @RATechYT
      @RATechYT  1 year ago +3

      Thank you!

    • @KarolusTemplareV
      @KarolusTemplareV 1 year ago +1

      They became very cheap because no one wanted one. I was expecting them to be a decent replacement for my 955BE at the time, but it made no sense: Bulldozer was crap, and Piledriver was too little gain on the normal-TDP CPUs to the point of making no sense, so you had to go for the 220W one, which seemed (and was) ridiculous. Not many motherboards had power delivery good enough for that while staying reliable, and if you had to change the motherboard... well, then you could go Intel at the time for a similar price.

    • @НААТ
      @НААТ 1 year ago +3

      Not only the FX 83xx. The AMD FX 63xx was also an absolute beast and still is! Especially the 6350, if you know what you're doing with overclocking.

  • @KitsuneVoss
    @KitsuneVoss 1 year ago +2

    I rebuilt my FX 8350 just "because." It sat for a while not being used, but my roommate's computer was having issues - likely just a corrupt OS, but it might be the motherboard. Either way, it let me give her a perfectly working computer.

  • @gamecomparisons
    @gamecomparisons 1 year ago +3

    I paid $360 for my FX-8150 full system upgrade (motherboard, CPU and RAM) in May 2012 and I have never looked back. The reviews all told me to buy a Phenom II X4 or iSomething instead, but I looked at RAM speeds (versus Phenom II systems) and cost-for-performance in the case of Intel, and bought FX instead. The early performance jump from my Athlon X2 4200+ and 6400+ was exactly what I hoped for, and the cost of the upgrade was right where I always keep my system upgrades.
    I hoped that AMD would get better software support over time, and that definitely happened, probably thanks to the Xbox One and PS4 using AMD. So my system performance increased year after year until I finally decided to overclock, and then it jumped again. Just last week I bought Gotham Knights and it is smooth with my RX 480 despite the recommended system specs.
    One thing that might affect this particular comparison is that multicore was still very much a new thing for software even in 2009. I was still waiting for my 64-bit X2 processors to see their full potential then. I had only just moved on from X360 games to my HTPC in 2008 because it performed better at higher resolutions. But in 2009 I doubt games would have benefited from quad-core, much less octa-core, processors - much less 64-bit versus 32-bit - as much as they have since 2014 or so.

  • @wil8115
    @wil8115 1 year ago +3

    I used my old 8370 OC'd for many years. It was a workhorse.

  • @b0ne91
    @b0ne91 1 year ago +5

    I was just having this argument on Reddit, where someone genuinely claimed that the i5 650 (iirc) is way better than the 8350.

    • @RATechYT
      @RATechYT  1 year ago +7

      Unless he is talking about a specific task and the i5 is also fully overclocked, that ain't happening lol. The 8350 should be better in 99% of situations.

  • @logger589
    @logger589 1 year ago +2

    I have been a proud FX-8350 owner for about 10 years.

  • @FinnLovesFP
    @FinnLovesFP 10 months ago +1

    I used FX from 2014 to 2019. Started with the 4350, moved to the 6350, the final bastion being the 8320 that I OC'd to 4.5 GHz. I started with the 4350 and a GTX 660 Ti, and topped it off with the OC'd 8320 and an RX 480. It was clear I was CPU-limited in a fair bit of games by mid-2019, as games like Metro Exodus challenged the CPU; there were moments where keeping a locked 60 wasn't possible. But it never put me into unplayable territory in all the years I gamed on FX. It wasn't as good as Intel's offerings, sure, but it was cheap enough to get me off the Core 2 Quad Q8200, which by 2014 was showing its age. It was hated, but it got me through many years of fun, and I couldn't have asked for more.

  • @MDFGamingVideo
    @MDFGamingVideo 1 year ago +4

    I... SERIOUSLY... need to finish my media PC and clear my bench so I can start tinkering with retro stuff. I have a pair of 9590s with some TIGHT DDR3 RAM that I need to test. For SCIENCE! 😁
    Another great video!

  • @RyanLeCocq
    @RyanLeCocq 3 months ago

    This is a rare video where I was watching on my TV and normally wouldn’t find my remote to like the video, but I immediately pulled my phone out to like and comment. Max effort always appreciated.

    • @RATechYT
      @RATechYT  3 months ago

      Thank you so much!

  • @hitbm4755
    @hitbm4755 1 year ago +6

    I think the people who benefitted most from AMD FX were those who couldn't afford any kind of PC until 2012-2015 and then got a good deal on an FX 8350 system. These are great systems for people who never had a PC, and they're still compatible to this date.
    Furthermore, pairing it with an Nvidia GTX 1000-series card gives the best results in my opinion, something like a GTX 1060 6GB. I admit this even though I prefer a Radeon; Nvidia's GTX drivers just pair much better with it on Windows 10.

  • @mikloskoza9044
    @mikloskoza9044 1 year ago +7

    Back in the day, around 2016, one of my friends helped me build a PC. It was an FX 6300 with 8GB of single-channel RAM (later upgraded to 16GB) and an R7 370 GPU. It was waaay better than the Pentium 4 PC that I'd been using (at least it was able to play YT videos xd). But tbh, after 1 year I noticed that its performance was all over the place. Later I got educated about PC hardware in general and tried to OC it. It won the silicon lottery: at 1.275 V it could do 4.6 GHz with ease, but my PSU and cooler weren't ideal. Overall, FX was a moderately successful CPU back in the day, and it became awful later down the line. Could it play games? Sure. With great performance? That depended on the game. But this comparison shows how bad the FX lineup was in general: we can all see that even the 2nd-gen FX CPUs can barely win against the 1st-gen Core top-of-the-line chip. I wonder what would happen if it were the OG Bulldozer chip - still a disaster. Of course, if you paid a small price for it and had a great mobo, it was a budget king - better than the i3s of the time, and some of the i5 CPUs. I just have mixed emotions about that chip in general. But I used to warm my feet with it when it was cold :D .

    • @mastroitek
      @mastroitek 1 year ago

      I built my first PC 10 years ago. I was tempted to get an FX 8350 but opted for the i7 3770K, and I think that was a great choice: I kept the i7 in my personal rig for 7 years and then moved it into a second build for office work. Between 2013 and 2020 I noticed how my build was holding up far better than the FX 8350 system a friend of mine had; the FX was really struggling in some modern games, to the point that in 2017 he switched to an i5.
      Obviously my i7 build also came at a higher cost, but in the end it was worth it.

  • @herbertvonsauerkrautunterh2513
    @herbertvonsauerkrautunterh2513 1 year ago +4

    I should have kept my FX system. The AMD CPUs I liked the most were for Socket A and 939. I still have two Opteron 165 CPUs and others, and a full FM2+ system sitting on the floor here...

    • @andreewert6576
      @andreewert6576 1 year ago +2

      Keep everything. The retro wave is unstoppable.

  • @ivankovachev8835
    @ivankovachev8835 8 months ago +2

    Many people forget that GlobalFoundries promised AMD that the 8-core FX CPUs would have 2 billion transistors, but they ended up with 1.2 billion, which is why AMD delayed and redesigned them. If the FX CPUs had had 67% more transistors, performance would have been at least 30% higher.
    Another reason FX CPUs struggle is that they have a write-through cache instead of a write-back cache, and almost no compiler is optimized for write-through. FX CPUs also had new extra instructions that no software took advantage of.

  • @kevinedward6132
    @kevinedward6132 1 year ago +3

    Imagine if game developers had actually used the cores back then, instead of catering to Intel's single-core performance bollocks. If Ryzen hadn't come along with the cores and decent IPC, we'd still be suffering with 2 cores / 4 threads as the standard.
    It's crazy how well the 8-core FX still runs today, considering how supposedly "bad" it is.

  • @AlyxSharkBite-2000
    @AlyxSharkBite-2000 1 year ago +13

    It was the FPU that really killed FX: the fact they had 2 ALUs but only 1 FPU, and the FPU was a lot weaker than it could have been. It is really too bad, because the CMT design is honestly better than SMT, all things being equal. In SMT you are using the excess resources for the extra thread, while CMT gives you a full extra core. Take a look at how well the DEC Alpha did with CMT. They just really cut corners; if they hadn't, FX would have been a good CPU. But in the end it did lead to Ryzen, so there is a silver lining to it.

    • @Ivan-pr7ku
      @Ivan-pr7ku 1 year ago +6

      Not really. The weak points of the Bulldozer architecture were the undersized front-end (the shared fetch and decode serving two threads was slow) and the high-latency cache/memory subsystem. All of this hit hard on complex single-threaded code with long dependencies, typical of consumer applications, so the only countermeasure was to turbo-clock the CPU as high as possible. To compound this, there were thread-scheduling issues in Windows that weighed performance down even more, but that was corrected over time. The FX line was AMD's "Pentium 4" moment, and it happened at the worst possible time in the company's entire existence - the overvalued acquisition of ATi overlapping with the financial crisis of 2008 and the botched spin-off of the foundry business.

    • @andreewert6576
      @andreewert6576 1 year ago +4

      If you think of the FX 8150 as being positioned against the Sandy Bridge 2500 price-wise, it wasn't too bad to have only 4 FPUs; the competition also only had 4. There were tasks (like compiling code) where all the ALUs really stretched their legs and even the 2600 got beaten. Of course, Sandy was also a great overclocker, which Bulldozer wasn't. And gaming was firmly ruled by single-core performance, where Intel had better IPC *and* higher clocks.

    • @Trick-Framed
      @Trick-Framed 1 year ago +1

      @@Ivan-pr7ku @teatimewithjuiblex3286 has valid points in layman's terms. Both the front-end and FPU issues hampered this design. Their reach exceeded their grasp here, and in order to make the chip cheaper they started removing features and cache. They were in the red and this chip was supposed to save them. CMT can be more powerful than SMT, even with a poor implementation.

  • @Javadamutt
    @Javadamutt 1 year ago +3

    You are a bit off base about why FX "sucked". The layout of the integer units ultimately played very little part in the performance results, but it had a big impact on floating-point operations, which had indeed been shifting to GPUs.
    In K10, AMD gave each core's integer units three decoders, while Bulldozer had four decoders shared between two integer cores. This means Bulldozer's two cores had 4 integer units while a dual-core K10 had 6. This had the advantage of saving a lot of power (nearly 1/3) for a small IPC hit.
    The main reason the whole Bulldozer design suffered was the long instruction pipeline. To work effectively, every stage of the pipeline had to be kept filled. This required a lot of complex development of the scheduler, prefetch and branch prediction, in silicon and in software. Failing to keep the pipeline full would result in a stall, requiring the pipeline to be flushed and the required instructions pulled in from cache. In a short pipeline this is less of a problem (Intel also suffered branch-prediction stalls; reducing them was part of the improvements from Sandy to Ivy to Haswell).
    For context, the minimum branch-misprediction penalty was 20 cycles for the Pentium 4 and Bulldozer, 12 for K10, 15 for Core 2 and 14 for Sandy Bridge.
    Lastly, AMD did micro-op and macro-op fusion later in the pipeline. Intel did it much earlier, resulting in a decode bandwidth of 5 x86 instructions on Intel since Nehalem, and no benefit from branch fusion on Bulldozer. Putting it earlier in the pipeline would have increased power consumption and complexity, and therefore development time and cost that AMD didn't have.
    Hardware.fr ran a four-core Bulldozer with only one core per module active, resulting in a 3-5% single-thread performance gain on high-IPC loads. Essentially, the front-end decoder was weak on Bulldozer.
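
    To put rough numbers on the pipeline point above, here is a quick sketch using the classic CPI model; the misprediction penalties are the cycle counts cited in the comment, while the branch frequency and mispredict rate are illustrative assumptions, not measured values:

        # Effective CPI = base CPI + branch_freq * mispredict_rate * penalty_cycles
        BASE_CPI = 1.0          # assume 1 cycle/instruction when nothing stalls
        BRANCH_FREQ = 0.20      # assume ~1 in 5 instructions is a branch
        MISPREDICT_RATE = 0.05  # assume the predictor misses 5% of the time

        penalties = {"K10": 12, "Sandy Bridge": 14, "Core 2": 15, "Bulldozer": 20}
        for arch, penalty in penalties.items():
            cpi = BASE_CPI + BRANCH_FREQ * MISPREDICT_RATE * penalty
            print(f"{arch:>13}: effective CPI = {cpi:.2f} "
                  f"({(cpi - BASE_CPI) * 100:.0f}% slower than ideal)")

    Under these assumptions, Bulldozer's 20-cycle penalty alone costs about 20% of throughput versus 12% for K10, which illustrates why the deep pipeline hurt so much on branchy consumer code.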

  • @O.Shawabkeh
    @O.Shawabkeh 1 year ago +1

    Interesting video, and thanks for your time.
    The i7-960 wasn't cheap either; I owned a 920 back then.
    Coming from the Athlon XP/64/64 X2 (AM2) era, I had lost faith in AMD and stopped even following their news; heck, I started catching up again only after Ryzen 2 (even their stock price at one point was pricing in potential bankruptcy).
    But now I'm moving from a Core i7 6700K to an R7 5800X along with an ATI Radeon 7900 XT (can't manage to call them AMD yet).

  • @pauls4522
    @pauls4522 8 months ago +1

    A bigger question would be: what would Piledriver - and Bulldozer for that matter - have been if AMD had not used so much automated design software on this architecture? I read in multiple places that they used a lot of automation to lay out the chip, which left areas that could have been faster. A number I have seen is as much as a 10% IPC penalty, and there was not much room for improvement beyond that because of how Bulldozer was designed at a high level, with the shared floating-point units.
    Imagine if Bulldozer did not have shared floating point... Then again, how much more expensive would it have been to manufacture on 32nm if it didn't?

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 year ago +6

    Good video. The six-core Xeon processors gave Socket 1366 a MUCH longer usable lifespan than most CPU sockets get.

    • @andreewert6576
      @andreewert6576 1 year ago +1

      That and the triple-channel memory (not on all 1366 boards though). Triple-channel DDR3-1600 has the same bandwidth as dual-channel DDR4-2400, but much lower latency.
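
      A quick check of that bandwidth claim (peak theoretical numbers; the DDR4 side is assumed to be the usual dual-channel consumer configuration):

          # Peak bandwidth = channels * transfer rate (MT/s) * 8 bytes per transfer
          def peak_bw_gbs(channels, mt_per_s):
              return channels * mt_per_s * 8 / 1000  # GB/s

          print(peak_bw_gbs(3, 1600))  # triple-channel DDR3-1600 -> 38.4 GB/s
          print(peak_bw_gbs(2, 2400))  # dual-channel DDR4-2400   -> 38.4 GB/s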

    • @Pasi123
      @Pasi123 1 year ago +1

      I'm still using a 6c/12t X5670 @ 4.4 GHz with a GTX 1080 in my main PC. I first got onto X58 in 2013 with a W3520 (i7-920 equivalent), and in 2016 I used a W3550 (i7-950 equivalent) for a few months before buying the X5670.

    • @andreewert6576
      @andreewert6576 1 year ago +1

      @@Pasi123 Good, honestly - if it still works, why not? It is about as fast (or slow) as a Ryzen 5 1600, and PCIe 2.0 surely holds it back for gaming, but honestly there's no huge gain for day-to-day tasks going from X58 to modern platforms.

  • @MarcWeavers
    @MarcWeavers 1 year ago +2

    That lawsuit in one state in the USA was stupid! There is nothing that says each core must have its own FPU to be considered a single core!

  • @BREEZYM6015
    @BREEZYM6015 1 year ago +2

    I still use an FX 8320. 😢

  • @MrMilli
    @MrMilli 1 year ago +6

    Piledriver represents Bulldozer's biggest gen-to-gen leap (20%), so framing the fictional what-if review against the i7-960 is a bit skewed. The FX-8150 wouldn't have stood a chance even if it had launched in 2009. After that, Steamroller and Excavator each added around 10% more performance on average (clock-for-clock). AMD didn't bother shrinking Excavator to 14nm because it was a waste of resources, considering 14nm wouldn't have allowed it to clock higher anyway.
    However you turn it, it was AMD's P4 moment, because even to this day most CPUs hover around 5-5.5 GHz. AMD would have needed a 6 GHz Excavator in 2016 to even stand a chance. It was a lost cause.

    • @andreewert6576
      @andreewert6576 1 year ago +2

      You are absolutely right. Then again, no one with the money for an i7-960 and a triple-channel X58 board would have touched an FX, no matter which gen; its rival would have been the 860.
      And I still wish - like RATechYT - that we'd seen quad-module Steamrollers and Excavators on AM3+. It wouldn't have made financial sense, of course, and AMD was in no position to waste money. But it would have allowed us AM3+ users a real upgrade in 2015/16, or maybe just a power-sipping FX build - imagine that.

    • @DFX4509B
      @DFX4509B 1 year ago +2

      @@andreewert6576 I like the idea of an 8-core 'FX' based on an upscaled variant of the 8th-gen consoles' Jaguar CPU cores (minus the iGPU) better, as Jaguar had higher IPC than Steamroller.

  • @MDFGamingVideo
    @MDFGamingVideo 11 months ago +1

    5:07 - Watched this again. Best part of the video right here. 🤣

  • @SlowHardware
    @SlowHardware 1 year ago +5

    I'd love to see an i7 990X vs an FX 9590

    • @RATechYT
      @RATechYT  1 year ago

      Don't have the 9590 anymore unfortunately, but we will be revisiting the X5670 in one of my upcoming videos.

    • @SlowHardware
      @SlowHardware 1 year ago

      @RATechYT Awesome, I'm hyped :) The 8350 acts similarly with an OC anyway

  • @kevinvanneste2500
    @kevinvanneste2500 6 months ago

    That triple-channel memory is definitely helping with gaming performance; it's why the six-core X58 Xeons held out for so long, especially with a great OC.

  • @DanielGT_93
    @DanielGT_93 1 year ago +3

    I have an 8120 that I like to play around with, but one thing that Socket 1366 has that AM3+ does not is six-core options. Any six-core i7 destroys any FX. But that's today, when both are cheap; at the time, 1366 was crazy expensive.

  • @Skrubmeister
    @Skrubmeister 5 months ago

    You should check out the FX-4200. It's a real quad-core because it doesn't share modules on the Bulldozer architecture. I snagged one and overclocked mine to 4.2 GHz at 1.3 V with medium LLC and a 210 MHz FSB, with 8GB of dual-channel DDR3-1333 running @ 1400 MHz. It runs pretty darn well for such an old chip.

  • @RuruFIN
    @RuruFIN 1 year ago +3

    I just went for a Phenom II X6 when the 1st gen FX was released.

  • @damasterpiece08
    @damasterpiece08 1 year ago +2

    Looks like the 8350 was usable for at least 5 years if you didn't care about >60 fps; if games had been optimized better, these companies wouldn't have made as much money.

  • @pamus6242
    @pamus6242 1 year ago +1

    The FX 8350 was the best processor I ever owned.
    The day I parted ways with it, a part of my soul left my body.
    I'm now with a 5700G, but it doesn't instill the same feelings; it's just another "product".

    • @nevernicemeadow
      @nevernicemeadow 1 year ago

      Efficiency with the 5700G is on a whole other level. I recently downgraded from a 5800X to a 5700G, and oh boy. I disabled PBO and set it to 4.0 GHz at 1.13 V.
      Still 13k in Cinebench R23 multicore, while sipping only 60 watts. Playing Fallout 76 requires just a freaking 18 watts. At the other extreme, the 5800X with PBO enabled pulled 80 watts in Fallout 76 while delivering only around 10% more performance paired with my 1080 Ti. Sure, the 1080 Ti might bottleneck both, but still (and/or Fallout 76's optimization is non-existent).
      PS: I came from a Xeon 1231 v3 I used for 8 years. Part of me still regrets replacing it, but this time I won't sell the old Xeon + board I used for so long. Selling my old Phenom II X4 965 Black Edition (used 5 years) for 200 bucks back then still hurts today. But yeah, I was a student and 200 bucks was a lot.

    • @pamus6242
      @pamus6242 1 year ago +1

      @@nevernicemeadow
      You are absolutely 100% correct.
      It's just that AMD was trying to be honest with the FX, but it came out bonkers - a risk no other company would ever take, and they owned it!
      And I paid almost nothing for it.
      Now imagine if AMD released an FX 8950X today with 32c/64t... based on the Ryzen architecture, DDR5, etc.

    • @DFX4509B
      @DFX4509B 1 year ago

      @@nevernicemeadow Wonder how well a 5800X3D undervolted with PBO disabled would do for efficiency then.

    • @nevernicemeadow
      @nevernicemeadow 1 year ago

      @@DFX4509B Probably extremely well FPS-wise, probably on par with the 5800X with PBO, though power draw would still be higher than the 5700G, as the latter is monolithic rather than chiplet-based... for chiplet CPUs you can always add 15-20 watts on top for the Infinity Fabric, which runs at half your RAM's transfer rate.
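
      For reference, the fabric-vs-RAM relationship mentioned here, in its common 1:1 mode on Zen 2/Zen 3 (DDR4-3200 is just an assumed example value):

          ddr_rate = 3200          # DDR4-3200, in MT/s
          fclk = ddr_rate / 2      # 1:1 mode: FCLK equals the real memory clock
          print(fclk)              # -> 1600.0 MHz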

  • @ivankovachev8835
    @ivankovachev8835 8 months ago +1

    FX CPUs gain less from overclocking than first-gen Core i CPUs because you have to look at the percentage of the overclock, not the raw MHz increase. Going from 3.2 to 4.1 GHz is a 28% increase in clock speed; going from 4.0 GHz to 4.7 GHz is a 17.5% increase. Usually FX 8350s overclock to at least 4.8 GHz (a 20% overclock), many do 4.9-5.0 GHz, and the rare ones hit 5.1 GHz.
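
    The percentage arithmetic, spelled out (the stock and OC clocks are the ones given in the comment):

        def uplift_pct(base_ghz, oc_ghz):
            return (oc_ghz / base_ghz - 1) * 100

        print(f"i7-960:  3.2 -> 4.1 GHz = {uplift_pct(3.2, 4.1):.1f}%")  # 28.1%
        print(f"FX-8350: 4.0 -> 4.7 GHz = {uplift_pct(4.0, 4.7):.1f}%")  # 17.5%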

  • @bjarne431
    @bjarne431 1 month ago

    Just got an FX 8350 for virtually nothing. I plan to play around with it. I got some 1866 MHz RAM to use with it :-) (out of the box with default BIOS settings the RAM runs at its full 1866 MHz speed; I wasn't expecting that haha :-) )

  • @Will-ex7kt
    @Will-ex7kt 4 months ago

    I have both configs 🤗: an FX 8350 with an ASRock 990FX Fatal1ty Killer, and an i7 960 with an Asus P6TD Deluxe.

  • @lflyr6287
    @lflyr6287 1 year ago +2

    RA Tech: the cases where the FX architecture doesn't scale properly are due to shared resources (like the FPU and L2 cache shared between the 2 cores in 1 module), but also due to long-lived hidden Intel compiler optimizations that Intel has been bribing game and app devs to implement since 2011 to hurt non-Intel CPU architectures. CB15 and CPU-Z's built-in benchmark, for example (since v1.79, AMD Ryzen CPUs were the only ones hit by around a 30% performance drop compared to Shintel).

    • @KarolusTemplareV
      @KarolusTemplareV 1 year ago

      The issue at the time was that even against AMD's older offerings, the FX didn't make much sense for games; effectively speaking, the architecture was crippled for gaming, especially with Bulldozer.

    • @lflyr6287
      @lflyr6287 1 year ago +1

      @@KarolusTemplareV Yes, mainly because Microsoft had a very obsolete way of thinking: when they made DX11, it only added 3 very small features, and in terms of linearity of computing it was exactly the same as DX 9.0c from the WinXP era. It only wanted 1-2 cores at their fastest frequency. The Windows scheduler also didn't understand inter-core communication... hence Intel's 4-core CPUs - with their faster frequency, 1 FPU per core, and hidden Intel compiler optimizations in many of that era's games - appeared faster.
      Today the FX 8350/8370 is faster than an i5 3590K or i5 4590K, due to still having 8 cores to compensate for its weak IPC in modern games. So FINE WINE is taking effect here.
      But if back in 2012, when AMD developed the Mantle API - the first truly parallel API with ASYNCHRONOUS COMPUTE capabilities - that API had been implemented in Win 7 / Win 8.1 and later, in 2015, in Win 10, these FX CPUs would in reality be much, much faster and behave like actual 6- and 8-core chips.

  • @SiliconPower74
    @SiliconPower74 1 year ago +1

    You said you can't OC the FX further due to the voltage limit.
    What was your Vcore LLC setting? On those Gigabyte boards, you have to use Medium.
    Also, did you check VRM temps?
    I had an FX 8320E @ 4.7 GHz 1.5 V running on the same mobo you used (mine was Rev 5).
    The CPU didn't exceed 61°C, but it was throttling due to the VRM going past 110°C.
    I added a fan over the VRM heatsink and managed to get 4.8 GHz at 1.55 V with the VRM peaking at only 75°C under a continuous stress test.
    The improvement between stock and full OC was just insane.
    CBR15:
    Stock @ 3.2 GHz: 495 cb
    OC @ 4.8 GHz: 765 cb
    And I used a SilverStone HE01 - NO liquid cooling needed.

    • @RATechYT
      @RATechYT  1 year ago +1

      LLC was set to medium. VRM temps were below 90°C.

  • @goranzarkovic7350
    @goranzarkovic7350 1 year ago +1

    I remember when I was buying a PC in 2013: I had the option of an AMD FX (8350) or an Intel 4th gen like the i7 4770... it was a no-brainer... but now I might buy an 8350 just to have it in my collection.

    • @goranzarkovic7350
      @goranzarkovic7350 1 year ago +1

      At that time Intel's CPUs were so much faster that they were selling processors capable of 4.4 GHz (horrible chip lottery) to almost 5 GHz (the best ones) clocked at only 3.2-3.9 GHz with Turbo Boost on... 1 GHz underclocked... and today, when both are neck and neck, they are squeezing every MHz out of each core.

  • @teapouter6109
    @teapouter6109 1 year ago +2

    How is a first-gen Core this good in modern titles?
    Really shows the CPU stagnation.

  • @happycat0411
    @happycat0411 1 year ago +5

    The i7 960 really only has 4 real CPU cores. Threads are just threads: another path to deliver information to the CPU. The FX 8350, however, has 8 REAL CPU cores (coupled with 4 math processors). Which would a person rather have - 8 REAL processing cores, or 4 cores + 4 threads? Remember, threads CANNOT process information whereas REAL CPU cores CAN (provided one has software capable of feeding the CPU simultaneous multi-core instructions).
    For most people, however, there won't be any noticeable difference between the i7 960 and FX 8350, since both processors are extremely capable when paired with a decent video card. Where the FX 8350 will REALLY shine is when it is coupled with software that can fully use all eight cores running simultaneously, such as video editing.

    • @Trick-Framed
      @Trick-Framed 1 year ago +3

      FX is multithreaded too. It has CMT, or Clustered Multi-Threading. What it does is use two cores in one module: one runs the code while the other either runs whatever code it can or waits for the resources it needs to free up. AMD's CMT gave it 65% more performance per module than a single core alone would have, whereas Intel's Hyper-Threading (their version of SMT, Simultaneous Multi-Threading) in Nehalem/Lynnfield only netted about 12.5% per core, or 50% in total. Multithreading to help feed the cores data actually works so well that it increases the throughput of each core it works on. There are chips with many threads per core - Sun has some. When Intel rumors about a bunch of cores and way more threads came around, a lot of people thought it'd be more threads per core instead of E-cores.
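
      Taking the comment's scaling figures at face value (65% extra throughput per CMT module, ~12.5% extra per Hyper-Threaded core - the commenter's numbers, not measured values), the totals work out as:

          cmt_module = 1.0 + 0.65   # one Bulldozer module with both cores busy
          smt_core = 1.0 + 0.125    # one Nehalem core with Hyper-Threading
          print(f"4 CMT modules: {4 * cmt_module:.2f}x one core")  # 6.60x
          print(f"4 SMT cores:   {4 * smt_core:.2f}x one core")    # 4.50x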

  • @mungojerrie86
    @mungojerrie86 1 year ago

    Solid content as usual, mate! But I really believe you should put your beef with HUB to rest.

    • @RATechYT
      @RATechYT  1 year ago +1

      Just joking around, no reason to take it seriously.

    • @mungojerrie86
      @mungojerrie86 1 year ago

      @@RATechYT Good to know! But not everyone will take it like that.

    • @RATechYT
      @RATechYT  1 year ago

      @@mungojerrie86 Well, they should've read carefully :)

  • @herobrinecyberdemon8104
    @herobrinecyberdemon8104 1 year ago +2

    I'd love to see a comparison with DXVK on the DX11 titles - or with one of the few well-written Vulkan titles; World War Z was getting impressive results on the 2700X if I recall correctly. Outside of FX, it would be nice to see where devs have hidden the so-called "optimization" that we need in order for our CPUs not to become "obsolete" after 3 years - without having to OC, since coolers, cases, proper motherboards and power from the wall cost money. Yeah, everyone should buy a 5.5 GHz Intel CPU, plus a water cooler, plus the needed motherboard and case; it has "more value than AMD". After 3 years we'll need a 7 GHz Intel CPU - I dunno, maybe we'll need two water coolers to cool it - and it will still offer the "best value" for consumers. I'm not a native English speaker: what do we call those that consume shit?

  • @kirillsemen
    @kirillsemen 1 year ago +1

    Super!

  • @ChadKenova
    @ChadKenova 1 year ago +2

    I remember I almost bought an 8350, but luckily I decided on the 3770K for the first PC I built. Still have it on the shelf; I had it in a tinker build with a 1060 for a while. I was an Intel-only guy for a long time, but my last Intel chips were an 8700K and a 9900KS; I've been buying Ryzen for my main rigs, running a 7950X3D and a 5800X3D. But I would like to try a 13900K. Might do a 14900K build - I'm kind of a PC parts hoarder.

  • @Skynet6219
    @Skynet6219 9 months ago

    It would be great to see a comparison between the FX 8350 and the Intel i7 3770, both in old games and new games, just to see how powerful the AMD FX series truly was.

    • @lordmaztema
      @lordmaztema 6 months ago +1

      Man, it couldn't beat a gen-1 Intel; of course it would lose to a gen-3 Intel

  • @jamezxh
    @jamezxh 1 year ago +1

    I'd love to see a comparison video between the Phenom II X6 and the 8350 with 2 cores disabled.

    • @ismaelsoto9507
      @ismaelsoto9507 1 year ago +1

      That's basically the FX 6350: it has the same amount of L3 cache as the FX 8350, and the L1/L2 caches are unchanged besides having two fewer cores to work with.

    • @RATechYT
      @RATechYT  1 year ago +3

      I already compared the FX-6300 to the X6 1090T in the past.
      ruclips.net/video/Lzh6rBTN-iw/видео.html
      The video is a few years old at this point, so the quality won't be the same as in my recent videos.

    • @andreewert6576
      @andreewert6576 1 year ago +1

      I'll save you time: it is a bloodbath. Full Vishera (8350, 8320) is on par with Thuban (1090T, 1100T) at similar clock speeds. The only thing Vishera has going for it is that it clocks way higher - 4.7 vs 4 GHz typical. If you take away 25% of the cores, then even an overclocked 63x0 loses to the good old Phenom II X6.

    • @negrusz
      @negrusz 1 year ago

      @@andreewert6576 I've had a PII X4 965 at 4 GHz for 12 years (LoL), paired with a Nitro+ RX 580 8GB for the last 4. I'm in my mid-30s, so I haven't played much in 7-8 years and didn't want to spend much on the PC, but the good old Phenom II (because it lacks some SSE instructions) is unable to run some new games I'd like. Is a 2nd-hand 8350 worth buying? It's not big money, but if I get lower FPS, what's the point?

  • @baladi921
    @baladi921 6 months ago

    The FX 8350 is a highly underrated processor.

  • @wrusst
    @wrusst 1 year ago +1

    I'm waiting for the moment you build up to Zen 2 - 8-core vs the 1800/2800 lol

  • @azimuth2142
    @azimuth2142 5 months ago

    Still using my FX 12 years after buying it. It'll be retired to AI image duties or game server at some point.

  • @danb4900
    @danb4900 1 year ago +1

    It was delayed for a reason; why assume that its release performance matches its 2009 prototype performance?
    Intel's 1st and 2nd gen were G O D L Y for the time; it's ridiculous how much staying power those mfers had.

  • @JuanDiaz-jo1rw
    @JuanDiaz-jo1rw 7 months ago

    The i7 960 can do way more than 4.1 GHz.
    I have an i7 960 OC'd @ 4.3 GHz with an aftermarket air cooler. Cinebench R20 score: 1300.
    70°C max load.
    50°C to 63°C in gaming.
    40°C to 45°C idle.
    I got up to a 4.5 GHz OC with the air cooler, but the temps are up in the 80s (°C) at max load.
    Cinebench R20 score: 1410.
    If I had a water cooler, the i7 960 could hold up to 5.0 GHz stable.

    • @RATechYT
      @RATechYT  7 months ago

      Unfortunately I was temperature-limited; with a better cooler, though, I'd be able to push both CPUs another 200 MHz for sure.

    • @JuanDiaz-jo1rw
      @JuanDiaz-jo1rw 7 months ago

      @RATechYT I use the be quiet! Dark Rock 4 air cooler. Also, your voltages might still be a bit high for your clocks. After overclocking, I also optimized Windows 10 for performance and power consumption to keep the temps in check: I cut my CPU's power in Windows 10 to 99% and tweaked a bunch of other stuff.

    • @RATechYT
      @RATechYT  7 months ago +1

      @@JuanDiaz-jo1rw The voltage is as low as it gets, going any further makes the system unstable.

    • @JuanDiaz-jo1rw
      @JuanDiaz-jo1rw 7 months ago

      @RATechYT
      Can you pls post what your OC settings are?
      I would like to see them.

    • @RATechYT
      @RATechYT  7 months ago

      @@JuanDiaz-jo1rw 7:55

  • @obetemojkardi6473
    @obetemojkardi6473 1 year ago +2

    Now try it with an i7 980X or a similar-spec six-core i7/Xeon with an overclock

    • @RATechYT
      @RATechYT  1 year ago

      I'll be making a video about the Xeon X5670 in the near future.

    • @obetemojkardi6473
      @obetemojkardi6473 1 year ago +1

      @@RATechYT Great, just make sure you OC the uncore frequency as well, not just the core clock.
      You can trade a little core frequency, like 100-200 MHz, for a higher uncore; it helps a lot

  • @kommandokodiak6025
    @kommandokodiak6025 1 year ago +2

    They'd have had more success with a Phenom III

  • @mr.x2512
    @mr.x2512 1 year ago

    The mythical triple channel - a pity that it is no longer used (at least for now) on new consumer platforms.

    • @andreewert6576
      @andreewert6576 1 year ago +2

      X58 wasn't a consumer platform, arguably. It was firmly in workstation territory. Their server-grade Xeon CPUs used the same platform. And it was not cheap, by the standards back then. A >$300 board was obscenely expensive. Still is imho.

  • @badass6300
    @badass6300 1 year ago

    It's not that FX CPUs gain less performance when overclocking: you are going from 4 GHz to 4.7 GHz, which is a 17.5% increase, while going from 3.2 GHz to 4.1 GHz is a 28.1% increase in clock speed. You should think in percentages, not raw numbers.

    • @RATechYT
      @RATechYT  1 year ago +1

      Initially that's how I thought about it too, but I believe it makes more sense to just mention the uplift in frequency. When calculating the percentage uplift from stock, Intel's uplift seems higher simply because it starts at a lower frequency. A 900 MHz uplift can mean a 45% increase if you start at 2 GHz, but only an 18% increase from 5 GHz.

  • @roccociccone597
    @roccociccone597 1 year ago

    It's crazy to see that if FX hadn't been delayed, it would've been received much better.

  • @tyrkukulkan
    @tyrkukulkan 1 year ago +1

    Bulldozer was AMD's NetBurst. It was a terrible architecture for efficiency. Yes, you could get some performance out of it with high power consumption, but that is never good. I do have an FX system, but only as an HTPC.

  • @milenaveleva5274
    @milenaveleva5274 1 year ago +2

    Overclock the northbridge, man

  • @Trick-Framed
    @Trick-Framed 1 year ago +1

    You once said that on air the max you get from your FX 8350 is 4.7 GHz, but that you could achieve 5 GHz on water. If that's the case, why use air for this test?

    • @KavorkaDesigns
      @KavorkaDesigns 1 year ago

      I have over 5 GHz on air with an NH-U12A; add an iPPC 3K fan in the rear and use a Sabertooth 990FX R2 board

    • @RATechYT
      @RATechYT  1 year ago +3

      I'm voltage-limited; increasing the frequency an extra 100 MHz requires 1.54+ V, which isn't worth it.

    • @Trick-Framed
      @Trick-Framed 1 year ago

      @@KavorkaDesigns You hit the silicon lottery with that chip. Mine is better than his too, but I thought he had said that on water he could get 5 GHz without much fuss. I didn't know it was a voltage situation. I get it if you are trying to keep that CPU forever; I wouldn't want to overvolt it too much either, but I'd rather use water cooling than air for FX either way.

    • @Trick-Framed
      @Trick-Framed 1 year ago

      @@KavorkaDesigns I have the next board down, which has the same VRMs, and I am using a Cryorig H7 on it. I see that is my issue: your Noctua has nearly double the heat dissipation of my Cryorig. Guess I need an AIO or a Noctua model.

    • @KavorkaDesigns
      @KavorkaDesigns 1 year ago +1

      @@Trick-Framed I have the 8320, 8350, 8370 and 9590; I didn't hit the lottery on all those chips lol, the voltage is just too high. None of my CPUs is past 1.44 V IIRC. I'll gather the BIOS settings from each system, but there are many guides on YT on how to OC the 8300 series. The Sabertooth 990FX R2 is the best-overclocking AM3+ board; I have the Crosshair V Formula-Z too, but the ST R2 is better IMO - the CeraMix coating helps. It may be a while before I can get the info, but I'll do my own video when I get more time. I'm currently mining with all the CPUs at 0.6¢ US/day/CPU; they've paid for themselves many times over since 2012-2013, all have been rock-solid, and yet there's nothing but hate online from tech reviewers. I love these CPUs. I use 3700Xs in my other rigs; they're down to $100 second-hand.

  • @MusicHavenSG
    @MusicHavenSG 1 year ago

    You should see if you can use the six-core ones for the Intel test, like the 970 or 980X

    • @RATechYT
      @RATechYT  1 year ago +1

      We'll be revisiting the X5670 in one of my upcoming videos.

  • @evilqtip7098
    @evilqtip7098 7 months ago

    ❤😊 For the time, those CPUs were the breakthrough: the first 64-bit CPUs working on a 32-bit system... 😮
    Yup!!!
    Now on 64-bit they shine!!

  • @mlzphoto-official
    @mlzphoto-official 1 year ago

    I only know of two very good AMD CPUs: one from the 486 era (133 MHz) and one from the modern era (Ryzen 9 5950X). All else is nothing to write home about.

    • @andreewert6576
      @andreewert6576 1 year ago +2

      You should read up on the OG Athlon then, the first CPU to break the GHz barrier. AMD was straight-up faster than the Pentium IIIs of that time, and the P4 was its own disaster.
      Then there's the Athlon 64, which brought us the x86-64 architecture we use today. It was a great CPU to have, especially for gamers. Basically AMD dominated the desktop CPU market until Intel released Core 2, and then it took AMD until third-gen Ryzen to catch up again.

    • @mlzphoto-official
      @mlzphoto-official 1 year ago

      @@andreewert6576 What good is a broken GHz barrier when the CPU die cracks if you don't mount the cooler exactly the right way? Or the CPU simply smokes its life away when the cooler is a bit off? I've been there, I've done it all - I've built my own PCs since the mid-'90s.

    • @andreewert6576
      @andreewert6576 1 year ago +2

      @@mlzphoto-official Well, your memory must be cloudy then, because the 1 GHz Athlons came in Slot A and had no issue with cracking dies.
      Plus, in all my years of meddling with Socket A CPUs I've only managed to damage one - and that one still worked, it just couldn't be reset anymore. An open die needed a teensy bit of care or (especially once screw-down coolers became a thing) a proper CPU cooler.
      Thousands of people managed to build with those back in the day, even if admittedly the CPUs were not as idiot-proof as Intel's.

    • @mlzphoto-official
      @mlzphoto-official 1 year ago

      @@andreewert6576 Yes, you are right. I was thinking of the socketed Athlons - my bad.

    • @DFX4509B
      @DFX4509B 1 year ago +1

      The Athlon Thunderbird was the fastest CPU on the market for its time, and the Phenom II line on K10 was also competitive IIRC. Also, Jaguar had higher IPC than Steamroller and was even faster than Bay Trail, which was its direct Intel competition, too IIRC.

  • @OrjonZ
    @OrjonZ 1 year ago +1

    Gaming perf is what most people cared about. By Zen 1, multi-core perf was also important because we had been stuck with 4 cores for so long. In gaming, Zen 1 was also pretty slow vs Skylake. Really, AMD did not catch up in gaming until Zen 3, and won with the Zen 3 X3D.

  • @thudtheace
    @thudtheace 1 year ago +8

    The Steves are annoying, and their test methods lacking and narrow in scope...

    • @trueheart5666
      @trueheart5666 1 year ago +3

      They have a superiority complex, so they're cocky.

  • @kinzie3915
    @kinzie3915 1 year ago +1

    Well, good to know that the 8350 runs at a base 1866 memory clock while the Intel runs only at 1066. You should set the memory clock on the AMD to 1066 as well; it would be much fairer to the 3-year-older Intel.

    • @RATechYT
      @RATechYT  1 year ago

      We got what we got, but I'm sure performance would match in games; the i7 would maybe even be slightly ahead in some titles with a higher memory frequency.

    • @johndoh5182
      @johndoh5182 1 year ago +1

      If you do this, is it fair?
      This is always the question I ask when I see reviewers gimping a system.
      An FX-8350 defaults to 1600 as long as the memory is at least that fast. If you run 4 sticks it runs at 1600 at best. No enthusiast in their right mind would have bought an FX-8350 and paired it with 1066 memory. So what you're saying is take away the advantage of one system to benefit the other. You're also saying run a config that no one would have ever run. OK, did you only allow a certain OC for the Intel CPU because the AMD CPU only OCs to a certain percentage?
      These are different architectures. If you're going to allow one its full advantage then you should allow the other to have its full advantage.
      I've now seen reviewers FINALLY understand this point and from time to time different reviewers will compare systems when each is given its full potential, without overclocking. But even with that there's a lot of gray area because memory overclocking is common, and even the companies tell you how fast memory clocks can run and what is optimal memory for it even though it's theoretically an OC.

  • @zoranzorand8529
    @zoranzorand8529 1 year ago

    So, let's keep it fair: we'll test a 4-core chip that's 3 years older against an 8-core chip, and I tell you, we're gonna need Afterburner to decide the winner :)

  • @JohnSmithWesson
    @JohnSmithWesson 1 year ago

    Had an "8-core" AMD FX; it sucked compared to a 4th-gen i7, especially in games that can't utilize more than 2 cores

  • @amdeshnikforlife1058
    @amdeshnikforlife1058 1 year ago +1

    😎👍

  • @KavorkaDesigns
    @KavorkaDesigns 1 year ago

    The 8350-8370 can easily hit 4.9 GHz stable; even my 8320 is at 4.7 GHz stable. The voltage you're using isn't right, and that's the issue behind the limited overclocks on your FX. Get a Sabertooth 990FX R2 board and use it along with a Noctua U12A and a 3,000 RPM iPPC fan and you can get well over 5 GHz. Your BIOS settings aren't correct if you're using over 1.5 V; I'm in the low 1.4s.

    • @RATechYT
      @RATechYT  1 year ago

      No matter what you do, this chip requires this amount of voltage at this frequency to be stable. Increasing the frequency any higher won't improve performance much anyways, all I'll get is just a few extra frames in games and cut a few seconds on render times.

    • @KavorkaDesigns
      @KavorkaDesigns 1 year ago

      @@RATechYT Copied from my comment above: I have the 8320, 8350, 8370 and 9590, and I didn't hit the lottery on all of them lol - the voltage is just too high. None of my CPUs goes past 1.44 V IIRC. I'll gather the BIOS settings from each system, but there are many guides on YT on how to OC the 8300 series, and the Sabertooth 990FX R2 is the best-overclocking AM3+ board. They've all been rock-solid since 2012-2013, and I'll do my own video when I get more time.

    • @KavorkaDesigns
      @KavorkaDesigns a year ago +1

      We're all here fighting over a CPU from 10+ years ago; you know the product is good when..

  • @mistandyork
    @mistandyork a year ago

    You should do a video with half the cores disabled as well, making it a true 4-core with no shared resources. I believe it could perform better in games that way, at least back in the day (a sketch for trying this on Linux follows below).
    Edit: The FX processor should also be compared with the 2600K or 3770K, not the MUCH older 960.
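
    That module trick can be tried on Linux without touching the BIOS. Here is a minimal Python sketch, assuming the two cores of each FX module are reported as siblings in topology/thread_siblings_list (recent kernels expose CMT pairs this way) and that it runs as root; the cores come back after writing "1", or after a reboot:

      import glob

      # Collect each sibling pair first, since a core's topology directory
      # disappears once that core goes offline.
      pairs = set()
      for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list"):
          with open(path) as f:
              ids = f.read().strip().replace("-", ",").split(",")  # e.g. "0-1" or "2,3"
          if len(ids) == 2:
              pairs.add(tuple(sorted(ids, key=int)))

      # Offline the higher-numbered core of each pair so the remaining core
      # gets the module's shared front-end and FPU to itself.
      for _, second in sorted(pairs):
          with open(f"/sys/devices/system/cpu/cpu{second}/online", "w") as f:
              f.write("0")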

    • @cardboardsnail
      @cardboardsnail a year ago +3

      You didn't read the title of the video; it explains why he compared it with the i7 960.

    • @andreewert6576
      @andreewert6576 a year ago +1

      The 8150 came out after Sandy Bridge, but was priced to compete with the i5, so the 2500K at best. It lost that fight, and badly, but I think GN did a revisit and it wasn't all that clear-cut 10 years later. The 8350 would have been competing with the 3570K, which AFAIR was a smaller gap. Also, many gamers of that era were happily buying 61xx and 63xx chips over Intel's locked i3s (again, for the same money).

  • @theandregustmysteriouswolf917
    @theandregustmysteriouswolf917 Год назад

    what's your main CPU?

  • @extracoolboy
    @extracoolboy 7 months ago

    Kind of a meme CPU; I used it from 2014 to 2020. RIP FX.

    • @RATechYT
      @RATechYT  7 months ago

      Same, 2013 to 2020.

  • @cosmicusstardust3300
    @cosmicusstardust3300 a year ago

    Seems like most of AMD's existence was spent playing catch-up with the competition lol

  • @herobrinecyberdemon8104
    @herobrinecyberdemon8104 a year ago +2

    Ah, software optimization. That's what always bottlenecked these systems, not the FX. Intel has enjoyed government backing since the 80s by all appearances; just count the court cases they walked away from without being made to pay up. AMD was the first to bring a monolithic multi-core design to the PC: the Athlon 64, particularly its X2 models, was designed for dual cores from the start, with an integrated memory controller and an interconnect bus. Look up the comparisons against the Pentium 4 and Pentium D; it was an absolute slaughter, and also the backdrop for the largest bribery case Intel has initiated to date, with Uncle Sam deliberately looking the other way. Look up the court case against Intel and its findings... $0 in fines after Intel paid OEMs billions of dollars (!!!) quarterly not to use any AMD chips.
    There were a couple of reasons AMD went for the module design, but as you pointed out, they mainly circle around shrinking die size. AMD's other new core of that era was Bobcat, used in APUs for compact notebooks and tablets; its successor Jaguar supplied the cores for the PS4 and Xbox One APUs. All of these are APUs with an iGPU. Bobcat offered roughly the same instruction set as K10 (Phenom II), which is indeed a shorter list. These parts were aimed at low-power devices, hence the need for a proper iGPU and a small power envelope. If anyone digs up comparisons of those, look at them first and see how "objective" and "unbiased" the review channels are. They're all propagandists for the billionaires holding shares in the big corporations, so no one should be surprised. Bulldozer does well at integer work, and the architecture itself made quite a bit of sense for datacenters. Let's not forget the 6000-series Opterons, which were all, or at the very least mostly, capable of running in 4-way CPU systems, another thing the credible reviewers forgot to mention back in the day. Yes, Intel did offer 4-way Xeons for socket 2011. With Sandy Bridge, Intel branded its Xeons as XYZZvB: X is the number of CPUs usable in a single system (motherboard); Y is the socket (2 = 115x; 4 = 1356; 6 = 2011; 8 = either 2011v1 or 1567, I don't remember exactly; Intel tried to milk its customers dry, it didn't play out, and their most "premium" platforms had to be dropped); ZZ is the model number; and vB is the generation, starting with Sandy Bridge. Well, quite the essay I'm writing. Anyway, open cpu-world.com and look for the 46xx Intel Xeons, just check the prices, or if you manage to dig up the 2012-2013 reviews you could even make a video on the BS the review (commercial propaganda) industry pours over our heads. A comparison of motherboard prices for those servers would also be nice, but that's extremely hard; we can't even get one for AM3+ boards, and that belongs in any AMD-vs-Intel comparison covering the period when Intel started charging extra for enabled overclocking.
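
    To make that naming scheme concrete, here is a tiny Python sketch that decodes a model number according to the mapping described above; it illustrates the commenter's scheme and is not an official Intel decoder:

      # Decode a Sandy Bridge era Xeon model like "E5-2670" per the scheme above.
      SOCKETS = {"2": "LGA 115x", "4": "LGA 1356", "6": "LGA 2011", "8": "LGA 2011-1/1567"}

      def decode(model: str) -> dict:
          digits = model.split("-")[1][:4]  # e.g. "2670"
          return {
              "max CPUs per system": int(digits[0]),
              "socket": SOCKETS.get(digits[1], "unknown"),
              "SKU": digits[2:],
          }

      print(decode("E5-2670"))  # {'max CPUs per system': 2, 'socket': 'LGA 2011', 'SKU': '70'}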
    Why was Bulldozer delayed? Well, the people designing the architecture are mortal, they need to sleep, and projects usually slip; AMD wasn't as organized as it is now, Lisa Su had to gather most of the engineers onto Zen when she became CEO, and people were working on all sorts of things, including ARM CPUs. Bobcat did launch more than 6 months before Bulldozer, since it was on a different node (40nm at TSMC, versus GloFo's 32nm for Bulldozer and Piledriver), and there were issues with GlobalFoundries' 32nm yields. AMD spun off its foundry in 2009; that foundry is the same GlobalFoundries we know today. After Piledriver came Steamroller, and the last Bulldozer offspring was Excavator, both on 28nm. Neither of these successors offered 8 (4x2) cores, and all their dies had an iGPU; there were parts sold without one, though I'm quite sure that was always a disabled one. Intel was always on a roll because they made their own chips. Well, until the capabilities of their foundry people sank to the level of their architecture engineers, ha-ha!
    Oh, and I've heard there were traces of SMT in Bulldozer, likely groundwork. On the Hyper-Threading topic: yes, physical cores beat logical cores on broadly comparable architectures, of course (comparing against a smartphone CPU won't cut it), and many have probably heard that AMD's SMT is more effective than Intel's HT. Look at who invented SMT, and when: patents.google.com/patent/US5944816?oq=Microprocessor%20configured%20to%20execute%20multiple%20threads%20including%20interrupt%20service%20routines&fbclid=IwAR3pjwXX_dfBAax1s3Jo_zuEf4VXvXEfbGZoVHhhhHK7M3qKT35dnVGm7-s
    I've grown to despise Intel "engineers" and their creations; delve deeper into their practices and you'll understand why. The patent above simply puts another nail in the "Intel is better" coffin.
    What's going on currently is Intel being unable to raise clocks because they can't shrink their nodes. Intel once planned to keep refreshing the Pentium 4 until it reached 10GHz; nowadays they seem to have returned to that track, since I'm hearing 6.5GHz for the Raptor Lake refresh. We'll see how absurd the power draw gets once Intel manages to get its 7nm-class node (Intel 4) into working condition. But anyway, software optimization was always done for Intel. Yes, Intel paid and pays for it, but the true cause is Intel's mediocre architectures and the fact that they don't change them unless they have to: add some instructions, definitely raise clocks, add cores only when forced, so the architecture never has to be redesigned; Intel's engineers would be lost if it were. Software devs are also a lazy bunch, and they love doing nothing useful while collecting fat salaries. OK, I've heard only some get paid that well; Intel outsources work to other companies, the idea being that they pay a fraction of the salaries and other expenses because people aren't hired directly. That shouldn't surprise us either: the first optimization under capitalism is cutting expenses, moral standards notwithstanding. AMD has reworked its architecture from the ground up twice in the last 15 years, and Zen keeps being progressively modified, with AMD once again promising large IPC gains for Zen 5 (we can count on that). Too bad we aren't versed enough in the science behind CPUs to dig into the architectures and see for ourselves, and we wouldn't get full access to the architecture schematics anyway. I don't know what's going on inside AMD; everyone outside the military-industrial complex in the collective West is laying off workers right now. What I do know is that Intel and Nvidia keep up their cheap propaganda tricks against AMD, and people still buy them. Mass media has always been fake; you have to look for yourself. I guess it's time we learned how to conduct proper investigative journalism, so we can make an independent choice when buying PC hardware... Nice world we live in nowadays, isn't it?
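
    On the physical-versus-logical-cores point, it's easy to check how an OS actually classifies an FX chip. Here is a short sketch using the third-party psutil package; note that a modern Linux kernel typically reports an FX-8350 as 4 physical cores with 8 logical CPUs, treating each module's core pair like SMT siblings:

      import psutil  # third-party: pip install psutil

      # Compare what the scheduler sees with the count of distinct physical cores.
      print("logical CPUs:  ", psutil.cpu_count(logical=True))
      print("physical cores:", psutil.cpu_count(logical=False))
      # An FX-8350 on a modern Linux kernel typically prints 8 and 4 here.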

  • @BonusCrook
    @BonusCrook a year ago

    FX is still a terrible CPU architecture; most people are better off getting an i3-12100F, i5-12400F, or R5 5600.

  • @clanjade1
    @clanjade1 8 months ago +1

    lol still using fx 8350

  • @patrickc8007
    @patrickc8007 a year ago

    "What if AMD FX wasn't delayed," and then you show gameplay of modern games? What's the point? LOL. Include some older titles for that, especially the single-core-heavy ones.

    • @RATechYT
      @RATechYT  a year ago

      Oh, no doubt the i7 would do better there, given how well it did in modern games.

    • @patrickc8007
      @patrickc8007 a year ago

      @@RATechYT Honestly, I was surprised that the i7 960 did so well against the FX.

  • @uhohwhy
    @uhohwhy a year ago +1

    lol, 45nm vs 32nm? Lame; try an X5680 (32nm Westmere on the same LGA1366 platform).