REMATCH In 2019: Core i5 3570K vs. FX 8350

  • Published: 15 Oct 2024

Comments • 758

  • @hikikomorihachiko
    @hikikomorihachiko 5 years ago +258

    Still crazy to see how well these older platforms are holding up; even a stock 8350 is more than fine for the large majority of users in a ton of games and programs. Weird to think how dated my 2003-era P4 system was in 2010, 7 years after launch, compared to how 7-year-old processors handle today's tasks. Great video as always, it's always good to see a fair benchmarking battle!

    • @LocoMe4u
      @LocoMe4u 5 years ago +17

      It's because performance stagnated for so long. There was no increase in mainstream core count between the Core 2 Quad and the i7-7700K! (They're both 4 cores.) That's why at home I had a 1st-gen i7, then skipped straight to an 8th-gen i5!

    • @Gamevet
      @Gamevet 5 years ago +10

      Too many frame spikes. I had to move on from a 4.7 GHz 2500K to a 4790K, because the frame lag became a problem in some games.

    • @Writeous0ne
      @Writeous0ne 5 years ago +3

      @@Gamevet Same with the FX... it's like they've stopped optimising games for them. I can play Fallout 76 with an i5 3570 no problem, but it's unplayable on an FX...

    • @theoneyoudontsee8315
      @theoneyoudontsee8315 5 years ago +1

      I have an FX-6350 @ 4.2 GHz, NB at 2200 MHz, HT at 2400 MHz, Windows core parking disabled, 8 GB of 1866 MHz CL 9-9-9-26 dual-channel RAM, an EVGA GTX 750 Ti FTW and a Samsung 500 GB 850 EVO. It's still a console killer in 2019; only the Xbox One X actually wins.

    • @avikdas9649
      @avikdas9649 5 years ago

      @@Gamevet How much did the 4790K cost you?

  • @BudgetBuildsOfficial
    @BudgetBuildsOfficial 5 years ago +132

    Very nice video. I'm really surprised by the FX 8350... I mean, over here it's dropped in price a tonne, but the only issue is that the boards haven't... Whereas Sandy boards have dropped an absolute tonne, to the stage where H61 boards are now only £20-30... Which helps a lot when you deal with the low, low price bands I end up using.
    Still an absolutely brilliant review.

    • @BudgetBuildsOfficial
      @BudgetBuildsOfficial 5 years ago +3

      @@cyberman7348: If you wanna stay on the same socket, go for an Ivy i7 or something.
      Honestly, the other option is to go down the AM4 route, but that depends on your budget.

    • @LuigiCastrotti
      @LuigiCastrotti 5 years ago

      @@BudgetBuildsOfficial I am looking to build a budget PC for a friend with an FX CPU. Any recommendations?

    • @BudgetBuildsOfficial
      @BudgetBuildsOfficial 5 years ago +5

      @@LuigiCastrotti: Erm I mean the recommendation in the video is pretty good. You're better off going for 1155 mostly, especially in terms of low low budget stuff.

    • @JT-ko2ib
      @JT-ko2ib 5 years ago +1

      @@LuigiCastrotti None; the video explained why to get the used Intel system, and with reason, unless your friend really needs those extra threads. I hope energy is cheap where you are, but if the computer is only on a few hours a day, stressed half the time, with a graphics card that needs no PCIe power connector and no overclocking, power consumption won't be astronomical.

    • @Pieteros21
      @Pieteros21 5 years ago

      In Poland it's the opposite... AM3+ boards are cheap, but LGA1155 ones are a bit expensive, especially the better ones if you wanna overclock...

  • @WafretTheWaffle
    @WafretTheWaffle 5 years ago +58

    Finally, someone who OC'ed the CPU/NB on an FX.

  • @RATechYT
    @RATechYT 5 years ago +47

    It's kind of funny that you uploaded this exactly when I'm about to drop a rant video on why showing just frame rate graphs in CPU tests doesn't tell the whole story, which unfortunately a lot of channels get away with. This is the kind of testing everyone should be doing, and I appreciate you showing us the benchmark runs and frame time as well as frame rate graphs. This is as in-depth as it gets.
    P.S. - lol, just realized you uploaded this 11 days ago

    • @danb4900
      @danb4900 1 year ago +1

      It's also kinda funny how this video showcases that you cherry-picked Division 2 for comparisons, when you know a 4-core Intel chip wrecks that architecture in most games.

    • @angeltzepesh1
      @angeltzepesh1 6 months ago

      @@danb4900 That's what you usually get from fanboys.

  • @JanghanHong
    @JanghanHong 5 years ago +84

    Some 2012 Dreams do come true, unlike Kony.

    • @z3roo0
      @z3roo0 4 years ago +1

      I'm surprised hardly anyone remembers or mentions the KONY 2012 fiasco anymore.

  • @StarDruid
    @StarDruid 5 years ago +45

    Still using my FX 8350 today; it plays all my games just fine. Though I plan to buy a 3600X if I can hold on to the money long enough. Sadly, bills seem bound and determined to drain my savings again this year.

    • @brtcobra
      @brtcobra 5 years ago

      lol

    • @edi4ever1
      @edi4ever1 4 years ago +1

      are you running that nice 3600?

    • @MrBiky
      @MrBiky 4 years ago +1

      The sooner you buy a newer CPU, the sooner you'll pay lower bills. And you should probably settle for the Ryzen 5 3600; there's not much of a difference.

    • @StarDruid
      @StarDruid 4 years ago +1

      @John Hooper Nope, the budget is too tight for stuff like PC parts. Too many bills and unforeseen costs. The tax return is the only time for that, but vehicle repairs, tires and medical bills have to come first.

    • @deXXXXter2
      @deXXXXter2 4 years ago +1

      @@StarDruid With the money saved on power if you'd gone Intel in 2012, you could buy a whole new CPU now.

  • @goge6337
    @goge6337 5 years ago +36

    I still use a 3570K as my main, glad to see a video about it.

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 5 years ago

      I have the i7 and I'm glad not to game on the FX 🤣

    • @goge6337
      @goge6337 5 years ago

      Fabius Maximus Cunctator I don't game anymore, but it can still game very well.

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 5 years ago

      For Windows, anything above a first-gen Core i CPU is more than sufficient.

    • @goge6337
      @goge6337 5 years ago

      Fabius Maximus Cunctator Yeah, but I used to game, and now I'm left with the 3570K. Don't wanna change it; I'm probably gonna use it for the next 6-7 years.

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 5 years ago

      @@goge6337 6 years is quite a long time. I thought I'd use my C2Q system forever, then swapped it for the i7, and I'm so damn happy I did. YT was lagging with 1080p60 footage all the time, and 4K wouldn't even start. The lack of new instruction sets will make it obsolete sooner rather than later, I think, but if it stays useful, go for it. We throw away too many goods in working condition anyway!

  • @Chaos8282
    @Chaos8282 5 years ago +18

    Still running my i5 3570K. It's been almost 7 years with zero issues and good performance. It's only slightly showing its age for what I play and want out of it.

    • @vinayvnr
      @vinayvnr 2 years ago

      Which GPU are you using ?

    • @marioloncar2169
      @marioloncar2169 2 months ago

      @@vinayvnr The best GPU for this CPU is a GTX 1660 or RX 6600.

  • @connarcomstock161
    @connarcomstock161 5 years ago +39

    I had an 8320E @ 5 GHz with an R9 390 @ 1175 MHz; the system, from the wall, at full load, would pull ~700 W. It beat a 750 W Gold-certified Corsair power supply into submission within a year.

    • @LocoMe4u
      @LocoMe4u 5 years ago +6

      Meanwhile, I've got an OCed i5 8600K + OCed memory + OCed 1070 Ti + 3 HDDs + 1 NVMe... etc., on a 486 W unrated grey OEM PSU I reused 😂👌

    • @mt441
      @mt441 5 years ago

      Under A Bridge Too Far wow so cool budget buy ayyy i want Intel xd xd

    • @Punkz83
      @Punkz83 5 years ago +2

      See, I don't understand that, because my 8350 @ 4.5 GHz with a GTX 1060 6GB @ 1.9 GHz on a Bronze-certified 650 W Corsair PSU never breaks a sweat...

    • @balomir2544
      @balomir2544 5 years ago

      @@Punkz83 Is it hot in your room? XD

    • @connarcomstock161
      @connarcomstock161 5 years ago +4

      @@Punkz83 Your GPU pulls ~200 W tops. The R9 390 with an OC will *easily* pull 400 W under load, and the CPU will pull ~250 W give or take, so that's 650-700 W, no problem.

  • @joannaatkins822
    @joannaatkins822 5 years ago +16

    This was a really high quality video, and a really fair and balanced look at this aging but still relevant tech!
    It's good to see games making better use of higher thread counts, and the frame times really surprised me in the more CPU demanding games like SotTR.

  • @critiqalerror
    @critiqalerror 5 years ago +83

    My second PC is rocking an i5 3570K at 4.2 GHz paired with a 1060 6GB and 16 GB of RAM. Still as reliable as ever.

    • @filippetrovic845
      @filippetrovic845 5 years ago +10

      Meanwhile, my first PC is rocking an i3 2100, HD 6670, 4 GB of RAM and an HDD with bad sectors.

    • @LocoMe4u
      @LocoMe4u 5 years ago

      Lol, my first PC was a Pentium 3 on Windows 95; now I'm rocking a sweet-spot i5 8600 + OCed 1070 Ti + 16 GB RAM + Samsung 970 Pro :D

    • @garytaft
      @garytaft 5 years ago +2

      Similar setup, but I'm going to upgrade to Zen 2 and Navi after so many years of good gaming...

    • @budgetking2591
      @budgetking2591 5 years ago +1

      @@LocoMe4u Sorry, but "sweet spot" is taken by the Ryzen 2600 ;) Great setup though! I'm still on an i7 3770K and 1070 Ti.

    • @LocoMe4u
      @LocoMe4u 5 years ago +1

      @@budgetking2591 Ryzen has a shit framerate in all my favorite games.

  • @mesicek7
    @mesicek7 5 years ago +54

    4:57 Props to you.
    Most youtubers who do CPU benchmarks have no clue what they're doing at times.

    • @enchantress7928
      @enchantress7928 5 years ago +3

      Low settings are NOT CPU-bound.

    • @felixgamingvlog6702
      @felixgamingvlog6702 5 years ago +1

      Enchantress, nope, you're wrong.

    • @enchantress7928
      @enchantress7928 5 years ago +1

      ​@@felixgamingvlog6702
      (This is a piece I wrote on a hardware site about where game settings go wrong for CPU testing. It is in Dutch, so I translated it via Google Translate.)
      A short explanation of game engines, and where the problems lie when testing processors:
      I have explained the graphics pipeline and how it is translated into threads for the CPU, which is where the problem lies with testing on Medium or even lower settings.
      The big multithreading bottleneck in game engines is the so-called render thread, not to be confused with the main thread, which is a collection of game logic, physics, AI and player interaction.
      The render thread is sequential in nature because its stages depend on each other; see the graphics pipeline.
      You can create additional parallel threads via techniques such as deferred contexts and Multiple Render Targets (MRT) en.wikipedia.org/wiki/Multiple_Render_Targets, but everything is merged back into the render thread at a given moment.
      In most games, a modern 6C/6T CPU has already reached the limit at which these games cannot batch any better, and adding extra threads yields diminishing returns.
      Keep in mind that this also depends partly on the driver layer of the video card; Nvidia and AMD mainly differ in this regard with DirectX 12. Hardware.info recently published an article about this, mainly on software vs. hardware scheduling; see my post below that article for my opinion about the test.
      So, as they say: batch, batch, batch! www.nvidia.com/docs/IO/8228/BatchBatchBatch.pdf If you execute everything sequentially, the render thread only sends simple draw calls to the GPU, which then spends 95% of its time waiting for the CPU.
      The GPU is a collection of very small processors (shader units) that you want to put to work all at once.
      The more polygons on screen, the longer the GPU's geometry shaders take to calculate everything, and the more the tasks need to be divided.
      With a concept called "deferred rendering/shading" it has been possible since DirectX 11 to divide complex tasks (including geometry) into render targets en.wikipedia.org/wiki/Render_Target while the game logic continues.
      Incidentally, geometry only makes up part of the complete frame render. With Intel Frame Analyzer I made a trace and a frame grab to analyze as far as possible.
      And yes, this is very complex matter, also for me, so I will only discuss it briefly. The first image is the frame trace, which shows exactly what the processor was doing and at what timings. It covers exactly one frame, which you can see from the two VBLANK en.wikipedia.org/wiki/Vertical_blanking_interval instructions, which are exactly 16.7 ms (60 Hz) apart. In modern operating systems, programs are divided into pieces, so-called slices, to make optimal use of the resources in cache/memory, even when they run in real time. en.wikipedia.org/wiki/Preemption_(computing) At the bottom of the trace you can see the "context thread"; this is the render thread, which I tagged for convenience. First, the geometry is calculated by the CPU, which can be batched, so you will also see additional threads being created in parallel. On low settings there are three times fewer (three times fewer primitives, as I had already indicated in the pipeline). The CPU places the geometry in the various render targets, which are then (not shown) sent to the GPU. If you compare this with the frame grab, which I have divided into render targets, you will see the MRT concept. In these targets, with their specific assets, pixel rendering is done by the GPU. The lighting, tessellation, shadows and textures are provided in all render targets.
      You may now see where this goes if you make the scene too simple with lower settings, and also at high frame rates (>100 fps).
      The load shifts more and more towards the render thread, fewer simultaneous deferred-context threads are created, and the entire game engine parks at a draw call limit.
      A draw call from the CPU must be sent to the driver via the API.
      Certainly in the DirectX 11 API (Application Programming Interface) the overhead is quite large (see the earlier image with the threads; the kernel and user driver mode are two large chunks of your render thread), and at a certain number of draw calls (multiplied by the frame rate) the render thread is really just waiting for the CPU while the call passes through the application layers.
      DirectX 12 partially resolves this by having fewer application layers between the CPU and the GPU, but this only raises the point at which the CPU runs into the same problem (the maximum number of draw calls).
      You will have to batch the tasks in your game engine to make use of the additional threads.
      So the moral is: do not test with medium or low settings if the aim is to test CPU performance in general. Keep the frame rates within limits.
      If you go beyond 150-200 fps in your benchmark, you may have to put another scene or game in the suite, because you are definitely also testing DirectX 11 driver overhead on a single core.
      en.wikipedia.org/wiki/Preemption_(computing)
      en.wikipedia.org/wiki/Vertical_blanking_interval
      www.nvidia.com/docs/IO/8228/BatchBatchBatch.pdf
      en.wikipedia.org/wiki/Multiple_Render_Targets
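
      A toy way to see the point about low-settings testing (a minimal sketch; the per-draw-call and GPU costs below are invented for illustration, not measured): model frame time as the slower of the CPU-side render-thread work and the GPU work. As settings drop, the GPU time shrinks but the per-draw-call CPU/driver overhead stays, so a low-settings "CPU test" converges on DirectX 11-style driver overhead rather than general CPU performance.

          # Toy model: frame time is bounded by the slower of CPU and GPU work.
          # All costs are invented for illustration.
          def frame_time_ms(draw_calls, gpu_ms, submit_cost_ms=0.002, other_cpu_ms=2.0):
              cpu_ms = max(draw_calls * submit_cost_ms, other_cpu_ms)  # render thread vs. game logic
              return max(cpu_ms, gpu_ms)

          for label, draw_calls, gpu_ms in [
              ("ultra", 6000, 16.0),   # heavy GPU load: GPU-bound
              ("medium", 4000, 7.0),   # mixed
              ("low", 3000, 2.5),      # light GPU load: draw-call submission dominates
          ]:
              t = frame_time_ms(draw_calls, gpu_ms)
              print(f"{label:>6}: {t:5.1f} ms/frame = {1000 / t:6.1f} fps")

      With these made-up numbers, "low" lands above 150 fps and is bounded by draw-call submission, matching the 150-200 fps warning above.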

    • @felixgamingvlog6702
      @felixgamingvlog6702 5 years ago +1

      @@enchantress7928 Do you really need a wall of text to make me understand what you're writing?

    • @felixgamingvlog6702
      @felixgamingvlog6702 5 years ago

      @@enchantress7928 Not sure exactly how it works; I'm just an electronics hobbyist into hardware like MOSFETs and power delivery. But this benchmark is about how fast the CPU is without, let's say, the GPU bottlenecking the system. In a realistic case you'd normally test at high settings, because most players play on high settings, or maybe, say, to see hitboxes clearly.

  • @CossackHD
    @CossackHD 5 years ago +51

    A proper overclock of the FX platform with the memory subsystem taken care of? Like!

  • @papankunci
    @papankunci 5 years ago +10

    Awesome ReMatch!... I still use the i5 3570K OCed to 4 GHz, but with a lower Vcore to keep the temperature really low, 16 GB of RAM and an RX 580... My monitor is only 72 Hz, so I run my games on ULTRA locked to the monitor refresh rate for smooth gaming (Battlefield V, 1 and so on)... This oldie is still a goodie.
    Thanks for the vid, man! You ROCK!

  • @CuttingEdgeRetro
    @CuttingEdgeRetro 5 years ago +63

    Great look back, nice benchmarks and data. I would love to see an "AMD CPUs over the years" video. Something like a Phenom II 1090T vs. FX 8350 vs. Ryzen 1700(X). I wonder how close the 1090T is these days to the 8350. Idea for ya, Mike ;)

    • @F2FTech
      @F2FTech  5 years ago +7

      🤔

    • @gojomodu9934
      @gojomodu9934 5 years ago +2

      I saw a comparison of an FX 6100 and an X6 Phenom and the gaps are pretty big. The FX has gotten better now.

    • @LocoMe4u
      @LocoMe4u 5 years ago +1

      6 core AMD vs Intel 6 cores... = RIP AMD

    • @7MBoosted
      @7MBoosted 5 years ago +3

      @@gojomodu9934 That's because Phenom doesn't have AVX baked in. Back in the day a Phenom II X6 would match the FX 6-cores all day and in some tasks beat the FX 8-core parts.

    • @Pieteros21
      @Pieteros21 5 years ago +3

      @@gojomodu9934 The FX didn't get better... the lack of instructions in the Phenom is just starting to show.

  • @Phoenix_SW20
    @Phoenix_SW20 5 years ago +28

    The 3570K still holds up decently today. I had mine clocked to 4.6 GHz at 1.275 volts. I've never gamed on an FX 8-core, but I'm surprised to see how well it performs today considering how much hate Bulldozer/Piledriver got back in the day. Great comparison.

    • @matthewsmith1779
      @matthewsmith1779 5 years ago +7

      The Windows scheduler didn't know how to handle them. Initially it treated them like true independent cores, tanking performance.
      Now it treats the modules as true cores and the Bulldozer cores more like SMT threads.

    • @SrWolf90
      @SrWolf90 2 years ago +1

      The FX was misunderstood, and the board builders didn't know how to properly adjust the necessary settings to make them work well; they always put too much voltage on them, so they generated too much heat.
      I run my FX8350 at 1.3 V at 4.2 GHz and it stays cool (it sits around 45 degrees, and the maximum temperature it has reached was 65 degrees). I've had this CPU for almost 10 years, and today it still serves for a server and virtual machines.
      In addition, this CPU was designed to take advantage of its 8 threads; on 4 threads it performs poorly, but when all 8 threads are used it is at the level of an i7 2600.
      To date, the FX8350 has been my best purchase: it cost me €160, it is 10 years old, and the CPU still isn't short on server tasks.
      It can even play current games that require the F16C instruction, something Intel's 2nd generation did not have.
      Without a doubt, the FX8350 was a weirdo, because AMD made a new architecture with all the modern instruction sets, but it only lasted 2 generations.
      Then Ryzen was its spiritual successor, continuing the idea of creating 8-core CPUs to dethrone Intel's 4-core rehashes.
      If AMD had given each core in the modules its own full FPU, they would have dethroned Intel.

    • @Phoenix_SW20
      @Phoenix_SW20 2 years ago +1

      @@SrWolf90 I actually have an FX 8120 paired with an ASRock 990FX Extreme3. I was also using it as a server until recently. I got it from a friend a few years ago who was upgrading and didn't want the old parts. It mainly acted as a NAS for file storage but also did a bit of transcoding for my Jellyfin media server. It was a fantastic chip for server duty. The system used to run FreeNAS with an Ubuntu Server VM and a couple of BSD jails. It had 16 GB of RAM. You're right about the default motherboard settings; the standard voltage was way too high. I set the voltage to 1.2 V and the frequency at a static 3.8 GHz. With a Wraith Prism cooler, peak temperatures were around 46°C under full load and the fan was practically silent. I really liked that system. It wasn't the fastest out there, but it cost me nothing and was always reliable.
      That's great that you've gotten so much life out of your FX 8350. You're right about the multithreaded performance: with 8 threads loaded it really is on par with a 2nd-gen i7. I don't have a use for my FX 8120 anymore, but I've been thinking of turning it into a retro gaming system lately.

    • @SrWolf90
      @SrWolf90 2 years ago +1

      @@Phoenix_SW20 Totally agree, and the good thing about the FX's SB950 chipset is that it has 6 SATA 6G ports; Intel only had 2 SATA 6G ports in the 2nd and 3rd generations, which is why the FX is the best for a file server.
      For me it was a very reliable platform, with all the most modern standards that existed at the time; it is incredible that it is 10 years old and still capable.
      That processor will be plenty for emulating retro consoles, and besides running cool, it will work wonders for you for emulation.

  • @Kottesque
    @Kottesque 5 years ago +5

    I recently purchased an FX8350 for €80 with a Wraith cooler included. I did this to update an older machine. I currently have a 9900K and 2080 Ti, but this replaced my last machine, which is the same as the updated PC: an FX8350 and a Sabertooth 990FX V2 mobo in both. I was so impressed by how well the Sabertooth and FX8350 overclocked and how stable they are that I had to get another one for the update. It is probably the best long-term-performing processor I have ever purchased.

    • @JDC2389
      @JDC2389 5 years ago

      A 240 AIO on an open test bench with a dedicated fan blowing on the VRMs; the cooling alone is worth more than the CPU. This video is a joke.

  • @trucker9652
    @trucker9652 5 years ago +7

    When I bought my 8320E it was 99 bucks, the i5 was 199, and the AMD motherboard was much, much cheaper than Intel's.
    It would OC to 4.6 on air with the base clock at 223 and turbo enabled.
    Thank you for this vid.

  • @bocconom
    @bocconom 5 years ago +4

    Still running my Intel i5-3570K on the exact same motherboard as in this video, with 16 gigs of DDR3-1600 memory and a GTX 970, and it performs well with every game I throw at it (it replaced my AMD Socket 939 system, which still runs stable to this day). Nice video.

  • @Jpilgrim30
    @Jpilgrim30 5 years ago +4

    Older CPUs are holding up well. That's the reason I'm still holding onto my 4790K build. It still chews through any game I play. I am using a 1080 Ti though, so that helps. It doesn't seem to be bottlenecked much, if at all.

  • @Dave-dh7rt
    @Dave-dh7rt 5 years ago +3

    Your test methodology is just as good as GN's. Subbed. Also, SotTR had very interesting results in the 1% lows!

  • @MrGenoHydra
    @MrGenoHydra 5 years ago +4

    I was using the 3570K's gimped little brother, the 3470, for over 6 years (paired with a GTX 660) up until a year ago, when I swapped to an R7 2700X. Quite a wonderful CPU for the time; it held up quite well for 6 years without any issues. I would probably still be using it if I'd had the 3570K version and a board that allows OC.
    Great video, glad to have you back!

  • @picoherbie9987
    @picoherbie9987 5 years ago +37

    I was like, "god, this again!" We get it, the i5 is going to mop the floor with the 8350. Then I saw the video and was like, oh snap, that isn't bad.

    • @bestopinion9257
      @bestopinion9257 4 years ago +4

      What those biased guys didn't understand 7 years ago is that their beloved i5 wasn't better for having fewer cores; the software was just more suitable for it. When you need more cores, it can be a different story. The CPU wasn't guilty of having many unusable cores; the software was guilty of not utilizing many cores.
      Now, in The Outer Worlds, my Haswell i5 has FPS drops while my friend's FX does much better. The game just needs more than 4 cores to run well.

    • @glenwaldrop8166
      @glenwaldrop8166 3 years ago +2

      @@bestopinion9257 Yep.
      The big guys argue a point we never tried to make. The FX is nothing compared to Ryzen, but for the same money vs. an Intel i5 from the same time, it runs better in modern multithreaded apps.

    • @pauls4522
      @pauls4522 3 years ago +1

      Old comment, but very true. Also, the RTX 2060 is even slightly overkill for both CPUs, but only slightly. Paired with something like an RX 580, a 1060 6GB, or a more modern 1650 Super, the margins would have been even smaller.
      Since this video was released I finally upgraded my main rig to a Ryzen 5800X, but kept my RX 580 for like 8 months until GPU prices got more reasonable once the silicon shortage eased. Going from an FX8350 at 4 GHz with 16 GB of 1866 MHz RAM to a Ryzen 5800X with 32 GB of DDR4-3600, in a typical GPU-bound game I usually only saw up to a 15% FPS boost between the two CPUs, with the 5800X being one of the fastest CPUs on the market at this point... The big gains were of course in CPU-bound titles like old StarCraft 2, where I doubled my framerate! But since I mostly played GPU-bound games, the difference was not night and day. So as long as you stick with an RX 580-tier GPU or slightly faster, the FX8350 will not hold you back by that much.

  • @alpay_z_7025
    @alpay_z_7025 5 years ago +11

    My i5 3570K is OCed to 4.2 and paired with an RX 570 4GB and 24 GB of DDR3 RAM; all of my games work well on high settings.

  • @lewderoge7386
    @lewderoge7386 5 years ago +2

    Just found your channel through recommendations! Your channel is really a BUDGET PC channel. You really can't find good channels that go back to old GPUs/CPUs that can handle modern games. Very simple and informative. It's good for newcomers that don't have $1000 or more to afford a PC, or anyone that thinks old stuff just isn't worth it. Well, definitely subbed and HIT THAT BELL 🔔. Keep them coming!
    Edit: Keep the Lo-Fi BGM coming!

  • @derluan2008
    @derluan2008 5 years ago +4

    Very professional analytics, thumbs up!

  • @WickedRibbon
    @WickedRibbon 5 years ago +3

    Fantastic tests. Very comprehensive 👍

  • @AnalogFoundry
    @AnalogFoundry 5 years ago +11

    Man, very underrated channel :(
    It deserves way more subs + views. Wish we could give more than 1 like for a video.

    • @F2FTech
      @F2FTech  5 years ago +1

      Much appreciated

  • @TheMarc1k1
    @TheMarc1k1 5 years ago +2

    As someone who used this CPU for nearly five years, and who has a brother who used the 8350 for the same amount of time or longer, this willlllll be fun.

  • @FullyBuffered
    @FullyBuffered 5 years ago +6

    Fantastic work, Mike 👍 Impressive performance from both CPUs, and interesting to see how the 8350 can now pull ahead or offer greater 1% low performance in some titles. The power consumption, on the other hand... those numbers are pretty telling... 2.6 times the power consumption is just laughable, and is also a testament to just how efficient that i5 is.

  • @MrYendor65
    @MrYendor65 5 years ago +1

    The first gaming PC that I built myself was an AMD FX 8350, paired with an ASUS Sabertooth motherboard, an ASUS R9 280X OC II Edition, 16 gigs of Corsair 1600 RAM and an ASUS 1080p 60 Hz monitor. Since the monitor could only give me 60 true frames per second, I never had any issue with any game! It ran them all beautifully. But I had to have something better, so I kept the good old faithful 8350 for a couple of years, all the while saving to build my new rig. I finally got it built: i7 8700K, MSI Duke 1080 Ti, 16 gigs of 3200 MHz CL14 DDR4. All is good in the world. But I must say, that AMD FX 8350 did a pretty good job. I gave the old girl a new home: I gave it to my youngest daughter. Thanks for the memories, 8350!

    • @F2FTech
      @F2FTech  5 years ago

      Glad to see you passed it down 👍

  • @inkysteve
    @inkysteve 5 years ago +11

    I'm a cheapskate, so I like my old CPUs. My main systems use Xeon W3670s with cheap ECC memory, and they still do very well gaming. I like the fact that my entire platform costs less than a midrange modern CPU.

    • @marty64thornton
      @marty64thornton 5 years ago +1

      What's your system build?

    • @inkysteve
      @inkysteve 5 years ago +2

      @@marty64thornton It's not in a case at the moment. Asus Rampage II Extreme, 12 GB of 1600 MHz ECC RAM, Xeon W3670, 120 GB and 960 GB SSDs, 2x HD 7970s.

    • @syncmonism
      @syncmonism 5 years ago +1

      @@inkysteve What a hipster!

    • @Jack233
      @Jack233 5 years ago

      Did you OC that W3670?

    • @inkysteve
      @inkysteve 5 years ago +1

      @@Jack233 Yeah, nothing extreme: 3.8, boosting to 4.0. It would probably go a lot higher with better cooling, but I can't be bothered, as it's fast enough for me.

  • @ChrisCotzraiZ
    @ChrisCotzraiZ 5 years ago +1

    Why am I watching this? A week ago I switched from an FX8350 to an R5 2600. Well, I guess because I'm still in love with my little old beast. It gave me some pretty good years of gaming.

  • @h1tzzYT
    @h1tzzYT 5 years ago +2

    Excellent video, proper overclocks, fair setup, frame times, avg/1%/0.1%, live gameplay telemetry, new games. I cannot nitpick anything, bravo :)

    • @h1tzzYT
      @h1tzzYT 5 years ago

      Honestly, I'm surprised by both CPUs: the 3570K holds up pretty well with its 4 threads in CPU-heavy games, and the FX catches up and sometimes surpasses the i5 with better frametimes here and there. The i5 used to obliterate the FX in all games when it came out, though it still does in power consumption; holy moly, that's a big difference.

    • @h1tzzYT
      @h1tzzYT 5 years ago

      And props to you for testing both the lowest and highest settings. Usually many benchmarkers just test at the lowest settings and call it a day; it's still a good way to benchmark, but it's a poor way to show how a typical user would run a game, distorting realistic expectations. Interestingly, the results differ a lot.

  • @AshenTechDotCom
    @AshenTechDotCom 5 years ago +4

    Depending on the title, you will also find that the low presets do not use features that take advantage of some advanced CPU capabilities, or at least this is true for 2 titles I have beta tested. As you turn things up, the game actually takes better advantage of what the processors are capable of, up to the limits of the video card... mostly. We discovered that in one title our FX chips actually performed 20% better at the ultra-max settings than at the mid-to-ultra-high settings. The devs looked into it and said a future patch to that beta will enable those optimizations at all levels as long as the system supports them; they were unsure why it had been walled off to a formerly "beta-beta, this could lock up your system" level. The gains were also there for Intel chips, just not to the same extreme degree; it went from giving our CPUs the meh processing path to the fully optimized path. Since I'm still under NDA I can't name the game, but the company has other products out that one of the devs strongly suggested had similar tiered optimization they needed to sort out: legacy dev code that should have been replaced with a proper CPU and GPU check, then removed.
    The devs of that game actually went through the trouble of tracking down the parts to build 4 decked-out FX 8-core systems to test on, and one came back to say he was utterly shocked; he took one of them home to test on, it was better than his 2700K @ 4.6 GHz, and he and the other guys had been having a blast overclocking it. (They actually have 8 systems now, 4 in-house and 4 at devs' homes; they found out quite a few of their players have 6- and 8-core versions, and how easy it is to disable cores in the BIOS and test lower/higher core counts.) The devs who took them home did so because they were having fun with the systems. As one told us on Mumble, he hadn't had so much fun overclocking in years, since bus clocking became a bad idea on Intel. He even bought a 3x140mm custom water cooling loop, put the rad in an external stand with fans on both sides and a box that holds filters over that... it's actually... amazing... I want it... He's managed 5.4 GHz 24/7 (we talked him into switching it to a turbo OC for the last 400 MHz; he actually found it can do a 500 MHz turbo OC from 5 GHz, and that's a damn 8300E, the low-power chip), an offset OC with high LLC... on a Sabertooth rev 1, no less (very good airflow over the VRMs; they don't get hot).

    • @Thelango99
      @Thelango99 5 years ago

      This can't be real, can it?

  • @MidoChan808
    @MidoChan808 5 years ago +2

    I absolutely love my 8350. I bought it and did a super-budget build earlier this year. It works amazingly in almost all the games I play, including Fortnite, Apex Legends, Sekiro and even CoD! I've since built another AMD system, but with a 2700X and an RTX 2080. I now game on the 2700X and use a capture card to stream from the 8350. The 8350 has not once dropped a frame on stream, and it can put out 1080p at 60 fps. It's good for gaming, but as a dedicated streaming PC it's amazing. 4.6 GHz, btw :)

  • @pundewhee
    @pundewhee 5 years ago +3

    Saw this on Reddit; it didn't get much love there (r/pcgaming is mostly as bad as the other gaming subs), but I enjoy these visits to previous gens.
    Nice video.

  • @ratulxy
    @ratulxy 5 years ago +1

    For the frame time comparison, I think it is better to show the variance of the data set. It is essentially a measure of the extent to which a data set deviates from its average; in short, the smaller the variance, the more consistent the frame times. You could even show the standard deviation, which is simply the square root of the variance. The reason I bring this up is that a number will be a better indicator than a plot of seemingly gibberish data.
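
    A minimal sketch of the statistic being suggested (the frame times are made-up sample data, in milliseconds):

        import statistics

        # Frame times in ms for one benchmark run (invented sample data).
        frame_times = [16.7, 16.9, 16.5, 17.1, 16.8, 33.4, 16.6, 16.7, 17.0, 16.8]

        mean = statistics.mean(frame_times)
        var = statistics.pvariance(frame_times)   # population variance
        std = statistics.pstdev(frame_times)      # square root of the variance

        print(f"avg {mean:.2f} ms, variance {var:.2f} ms^2, std dev {std:.2f} ms")
        # The single 33.4 ms spike nudges the average but dominates the
        # variance, which is why variance works as a consistency metric.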

    • @F2FTech
      @F2FTech  5 years ago +1

      Great feedback. I've tried to do this before, but I ended up with too much data on one screen. I'll play with it again and see if I can make it more presentable.

  • @deyeatdapoopoo7582
    @deyeatdapoopoo7582 5 years ago +3

    Awesome video and comparison. Some critique regarding the CPU recommendation though: don't bother getting a pre-built with a Core i5 since it can't be overclocked. Instead go with one with a Core i7 or Xeon with 8-threads since those are only a little bit more. I had a pre-built with an i5-2500 which has the same performance as an i5-3470 and ended up replacing the CPU with a Xeon E3-1240 because the 2500 was having big problems with dips and stuttering in a lot of the games, especially when there were a lot of NPCs or particle effects (explosions, smoke) on screen. It was also keeping GPU utilization below 90% consistently in some games. This also meant running anything else in the background was not an option. With the 8 threads pretty much all the stuttering and dips went away and now the lowest GPU utilization I see is 95% plus Discord or the web browser can be open without causing issues. This is all with a Radeon RX 470/570. Go for something with a Core i7-2600/Xeon E3-1240 or 3770/Xeon E3-1240 v2 instead, it's very much worth it.

    • @F2FTech
      @F2FTech  5 years ago

      Great feedback 👍

  • @izidor
    @izidor 5 years ago +16

    Still have a 3570K, but to be honest the CPU is at its limits and doesn't provide fun anymore... and I'm very glad I'll be swapping to Ryzen 3000 in July.

    • @hugo9016
      @hugo9016 5 years ago +1

      I switched from an i5 3570K to a 2600 last year. It was 100% worth it.

    • @Trancelistic
      @Trancelistic 5 years ago +1

      @@hugo9016 The 2600 is surely nice, but a 3600 is totally the win at that price. I'm swapping my 2600 for a 3700X soon.

  • @MrMarcost2
    @MrMarcost2 5 years ago +2

    It amazes me how much the gap has closed. Even so, when the FX is losing, you can usually see that its CPU usage sits around 70% while the i5 sits around 95%. It seems that most games still don't use more than 4 cores. And in those where the game can use the extra cores, the FX takes a small lead.

  • @CasualGamers
    @CasualGamers 5 years ago +4

    Great video, man! What software are you using to capture frametimes?

    • @F2FTech
      @F2FTech  5 years ago

      Thanks. Mostly OCAT

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 5 years ago +3

    Awesome video, keep them up.

  • @MrSnapy1
    @MrSnapy1 4 years ago +1

    Just built a good friend a gaming PC (he never had one) out of some older parts I had lying around. I chose the FX 8350 and paired it with an R9 390, and was amazed how viable it is as a gaming system. It ran pretty much all the games I threw at it at 1080p high/ultra at 60 fps (capped); many games would do 2K at 60 fps on high/medium settings. Slapped on a Hyper 212 cooler, OC'ed the FX to 4.8, and it ran cool, never getting above 65 degrees under the heaviest of loads. I was getting 100% GPU usage in all games, so no CPU bottleneck. The newer the games, the better, since they use all of its threads. I appreciate this CPU more after giving it away than I did when I ran it.
    When I bought it originally I chose it over the i5 due to its workstation performance at the time. I said it would age better than the four-thread i5, and in many ways it has...

  • @berebecants3664
    @berebecants3664 5 years ago +1

    Great video. Here I use the i5 3570K @ 4.2 with a DeepCool 400 Silent. Pure joy!

  • @srrocknroll4755
    @srrocknroll4755 5 years ago +4

    Ivy Bridge was the first family of Intel processors supporting PCIe 3.0, so you bought the future in 2013.

    • @MrEscanaba
      @MrEscanaba 4 years ago

      Actually, the Sandy Bridge-E i7 3960X and i7 3820 on an Asus P9X79 Pro can run PCIe 3.0 speeds. I had to hack the registry in order to fully get 3.0 on a 660 Ti 3GB from Galaxy; an XFX 290X 8GB required no such hack. I'm itching to try a Vega 56 on that board; however, the Ryzen 1600 kicks the crap out of the $1000 i7 3960X in CPU performance benchmarks.

  • @shadowopsairman1583
    @shadowopsairman1583 5 years ago +2

    My 8350 is at 5.0 at 1.5 Vcore on air using a Scythe Ashura, with 2133 G.Skill Ripjaws X at 2400 (Trident spec) at 10-10-12-20-20 1T (tighter than Trident) and 1.65 Vdimm, on an Asus Sabertooth 990FX R2.0 motherboard.
    Sapphire Radeon R9 290 Vapor-X 4G card.

    • @miki476
      @miki476 4 years ago

      Good chip. I'm hitting the wall at 4.65 GHz on the same board... FSB and all overclocked.

  • @The_Nihl
    @The_Nihl 5 years ago +1

    The performance of the FX chip here is quite consistent with what I get. I run my FX at 4.4 GHz with the NB/HT at 2.6 GHz, 1866 MHz DDR3 at 10-10-10-30-40, and a Vega 56.
    It crushes through games and, surprisingly, doesn't bottleneck the Vega at all.

  • @santinojoshuatorre1695
    @santinojoshuatorre1695 5 years ago +1

    Ivy Bridge is still pretty relevant. I love that generation. All three of my PCs are Ivy Bridge: 3770K (main), 3570K (HTPC), 3320M (laptop). I scouted parts for my brother's build too, and that was a 3470. Good Z77 boards are extremely rare these days, though. If any of these break, they go straight to H61.

    • @Big1hzrig
      @Big1hzrig 5 years ago

      Santino Joshua Torre I still have a good Z77 board.

  • @Ballfade90
    @Ballfade90 5 years ago +5

    Quality viewing as always, thank you sir. Keep up the great work, be safe, and have a solid weekend!

  • @IamAWESOME3980
    @IamAWESOME3980 4 years ago +2

    More cores finally begin to make a difference. 2020 and onwards will be a battle of cores.

    • @patrickc8007
      @patrickc8007 1 year ago +1

      Not in this case, though; these are weak cores bottlenecked by 4 floating point units. An overclocked quad-core i5 is still better than the FX, even in multithreaded games.

  • @JohnRussellViral
    @JohnRussellViral 5 years ago

    It was heartbreaking letting my old rig with my 3570K go, but I had no real use for it anymore. Knowing someone else is having a great time with a chip that's cheap on the market and a great performer makes me happy. Hope she's treating you well, Will.

  • @f4z0
    @f4z0 5 years ago +4

    This is as good as it gets for the FX. 2019 games are using a lot of threads, which is where the FX should perform best.
    Back in 2012, the whole Linus Tech vs. Logan (Tek Syndicate) debate was about this, plus a touch of futureproofing. Few games were using up to 4 threads, never mind 8, while many were still heavily single-threaded. The future looked likely to favor those with more threads, but the thing with the FX is not only the threads; it was the 2:1 ALU-to-FPU ratio. Games started using a lot of complex math that requires the FPU. So the performance gap tightened, but never closed, nor swung to favor the FX by a whole lot.
    To make things worse, back in 2012 you could swap the 3570K for a 2500K and it would show the same exact performance, since the 2500K had a small IPC penalty but overclocked like crazy. On the other hand, if you swapped the 8350 for the earlier 8150, the performance penalty was insane. Early adopters of FX were right to be mad. Plus there was nothing else to buy from AMD, and it stayed that way for 5 more years. Meanwhile on Intel, you could still drop in a mainstream i7 or jump to the Extreme line of Sandy-E, Ivy-E and whatnot. Dark times for AMD. So glad they pulled off Ryzen and are now whipping Intel with every launch.

    • @tiger.98
      @tiger.98 4 years ago

      My English was not good enough to watch English channels in 2012; who said what (Linus vs. Logan)?

    • @MrButtons252
      @MrButtons252 4 years ago

      When the 8150 launched, I got a 1090T Thuban. Overclocking the northbridge helped quite a bit.

  • @rossmclaughlin7158
    @rossmclaughlin7158 5 years ago +2

    Awesome video. The FX seems to deal with background stuff a bit better as well, on the likes of an older install of Windows. Personally I run both an i5 4590 and an FX 8350, and the 8350 is my main gaming rig; obviously I have a locked i5 though, so no OC for me there.

  • @MrMeoow91
    @MrMeoow91 5 years ago +1

    Thank you so much for this video. I am still rocking an FX 8350 (OC'd to 4.5 GHz at 1.368 V), but my RAM is very slow at 1600, with an R9 390X, for over 5 years. I am looking to upgrade soon and this video is very helpful.
    I will upgrade the GPU (probably a 5700 XT) this year, then later on upgrade to Ryzen 3000. Or I might hold out till next year, since AMD might have something even bigger, as stated by "Moore's Law Is Dead".

  • @classic_jam
    @classic_jam 5 years ago +2

    Another great video, very in depth.

  • @ViperDent
    @ViperDent 5 years ago +4

    I've got to say, even if I had picked up an FX processor when it came out, I wouldn't have been disappointed. Considering the price difference and the cost of motherboards at the time, FX was definitely a good deal.

    • @daytimerocker3808
      @daytimerocker3808 1 year ago

      Yeah, but at the time of release the FX was really bad.

    • @ViperDent
      @ViperDent 1 year ago

      @@daytimerocker3808 If you paired it with any mid-range GPU at the time, it wasn't a bottleneck, so on a tight budget it was a good pick. A similarly priced platform from Intel would've come with a Pentium. The FX was also unlocked, so there was something for budget tweakers if you were into that.

    • @daytimerocker3808
      @daytimerocker3808 1 year ago

      @@ViperDent You could get an older i5 that blew it out of the water; even an i3 ran games better than the FX back then. Not to mention how much power it consumed and how much heat it produced. It bottlenecks a 970. It's hard for me to say it was a good deal, but everyone is entitled to their opinion, of course.

  • @SNE4KZ
    @SNE4KZ 5 years ago +7

    I still have an FX 8350; I just bought a used 1050 Ti last year for $100 xD All I play is WoW/League. I'm waiting for Ryzen 3000 to rebuild atm.

    • @mr_tea753
      @mr_tea753 5 years ago +1

      I bought a GTX 980 and an FX 8350 with a motherboard for €150; best deal ever.

    • @bikboi3292
      @bikboi3292 4 years ago

      @@ulysses2162 how much is your power bill?

    • @bikboi3292
      @bikboi3292 4 years ago

      @@ulysses2162 how much is your power bill now?

  • @Punkz83
    @Punkz83 5 years ago +11

    Just out of interest, do the same tests but set the 8350 to 1 core per module so it's a full-powered, full-cache quad-core like the i5... then OC them both as high as they'll go.

    • @silvy7394
      @silvy7394 5 years ago +1

      FX CPUs are about 80% efficient, meaning if you took the speed of the 4 cores (1 core per FPU, etc.) and added the additional 4 cores, you'd get an 80% performance improvement, not 100% or 2x.
      Or in other words, and as tested, it won't perform better. At all.

    • @Punkz83
      @Punkz83 5 years ago +1

      @@silvy7394 Another intel shill.

    • @-eMpTy-
      @-eMpTy- 5 years ago +1

      It might not make a big difference; it probably yields worse results. I tested this myself around 3 years ago and didn't notice any higher fps. I also tested a program called Process Lasso, in which you can prioritize which cores get used first, but that didn't change much either.

  • @ej_tech
    @ej_tech 5 years ago +4

    I went with the Dell prebuilt route with an i5-3470 and an RX 570

  • @MrSamadolfo
    @MrSamadolfo 5 years ago +2

    🙂 neat, thanks for the comparisons

  • @jeremyellis269
    @jeremyellis269 5 years ago +1

    In 7-Zip, my 8350 at 5 GHz was only 10-13% slower than my i7 3960X (6C/12T) at 4.8 GHz. The 3960X was $1000 vs. $200 for the FX. In some workloads the FX 8350 really was a beast for the money.

  • @SHUTENSEPC
    @SHUTENSEPC 5 years ago +4

    And yes, finally someone who knows how to OC an FX. Thank you for the NB overclock.

  • @BoGy1980
    @BoGy1980 5 years ago +1

    I still have a 3570K on a Gigabyte Z77X-UD3, and I've never had the urge to upgrade; it's still a very great CPU to play games with... I'm happy when my games hit a stable 60 fps at 1080p on ultra settings, as most games do. I sometimes disable ultra-quality shadows or Nvidia-specific settings, or just override some game settings through Radeon Settings (per-game settings), to get the most fps out of my AMD RX 580 while still maintaining the best possible scenery; I always use the best-quality textures (8 GB of VRAM helps a lot with textures).
    I also have 16 GB of RAM, 2 SSDs (one M.2 for the system and a second for temporarily storing the more demanding games on my 'Now Playing' list; PUBG has been on that drive for over a year now...), and then a few classic hard drives for storing the other thousands of games (total storage is somewhere between 12 and 16 TiB of used disk space purely for gaming; 99% of my Steam library is installed, same for Uplay, Epic, Roberts Space Industries... a wild guess is about 4000 installed games). Lol, weekly I download between 150 and 250 GB from Steam just updating my games; around the holidays it's a lot more, and also around Halloween, because a lot of devs add special themed updates which are extra big in size. Last year I had 433 GB of updates in the second week before Halloween.
    Anyway, the 3570K is still a good CPU... and only now, so many years after its release (I had it two days before the official release date), is it becoming interesting to look for a mobo+CPU+RAM upgrade. I might switch over to camp green (AMD) again; my last AMD was an Athlon 64, then I went to a Core 2 Duo E6850, then to the 3570K. In recent years there wasn't a lot of development on the CPU side compared to the periods before the Core i3/i5/i7 series... back then, CPUs became like 25-50% faster almost every year to 18 months; lately it's been down to only a few percent, so there isn't a lot to gain if your CPU is only two or three years old. Back in the day, a 3-year-old CPU was half the speed of the newer ones, making upgrades viable after a few years; now it's the core count which, combined with decent IPC (so the i5/i7 K-series, not the i7 T-series), will finally push me over the wall to buy something faster in the not-so-distant future.

  • @sheetlorde3415
    @sheetlorde3415 4 years ago +1

    Just bought a used PC with a 3570K; I plan to use it for another 5 years.

    • @JoshuaOwens
      @JoshuaOwens 3 years ago

      Look up delidding and replace the IHS with a Rockit Cool aftermarket one; temps could drop 22 degrees or so. Some guys got to 5.0 GHz.

  • @Peterowsky
    @Peterowsky 5 years ago +4

    So, as expected, the 3570K outperformed the 8350 by around 10-20%, while costing 10-20% more at launch.

    • @turtleneck369
      @turtleneck369 5 years ago +1

      In older games it mostly performed around 10-20% better than the FX, but new games are starting to use more cores, etc.

  • @evilqtip7098
    @evilqtip7098 4 years ago

    Great video, man!! You've always done a good job and made awesome videos. Good job!

  • @DDR-jg3gt
    @DDR-jg3gt 5 years ago +1

    Hmmmm, interesting; nice to see another recent comparison. I also have a couple of FX8350s, and with a decent GPU they play all recent games, usually maxed out at 1080p.
    I will be looking at the 3700X upon release, though, and seriously contemplating it.

  • @trappack6291
    @trappack6291 5 years ago +1

    I have an FX 8350 and a GTX 1060 6G; is the bottleneck bad enough to warrant upgrading?

    • @AnalogFoundry
      @AnalogFoundry 5 years ago +1

      You might struggle to hit consistent 60fps in recent games due to that CPU... If it's not something that bothers you, then this setup is more than enough for another year or two.

    • @AshenTech
      @AshenTech 5 years ago +1

      Get to 4.5 or so on the core and you will be fine.

  • @ComputerProfessor
    @ComputerProfessor 5 years ago +1

    Nice to see how well the FX-8350 fared. The i5 at the time cost about $50 more and had to be paired with a Z-series chipset to overclock, whereas the FX-8350 did not need a special chipset to overclock.

    • @Dave-dh7rt
      @Dave-dh7rt 5 years ago +1

      Computer Professor *uses 2x power and is slower overall*

    • @ComputerProfessor
      @ComputerProfessor 5 years ago +1

      @@Dave-dh7rt It doesn't really affect my electricity bill, and in the games I play it outdoes the i5 3550, which was priced about the same as the FX-8350.
      110 W vs. 180 W is about a $15 difference in electricity cost per year, assuming you use your computer 4 hours a day, every day, under full load.
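
      The arithmetic checks out as a back-of-the-envelope estimate (assuming a rate of about $0.15 per kWh, which the comment does not state):

          # Sanity check of the ~$15/year figure quoted above.
          watts_i5, watts_fx = 110, 180   # full-load draw claimed in the comment
          hours_per_day = 4
          rate_usd_per_kwh = 0.15         # assumed rate; varies a lot by region

          extra_kwh = (watts_fx - watts_i5) / 1000 * hours_per_day * 365
          print(f"{extra_kwh:.0f} kWh/year extra -> ${extra_kwh * rate_usd_per_kwh:.2f}/year")
          # ~102 kWh/year extra -> ~$15.33/year at $0.15/kWh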

    • @zoomzabba452
      @zoomzabba452 5 years ago

      I'd argue that an 870 or 970 chipset was necessary to get a real OC out of the AMD chips. The 990 usually had one extra power phase to squeeze out 100-200 MHz more at the same voltage.

    • @ComputerProfessor
      @ComputerProfessor 5 years ago

      @@zoomzabba452 Not really. I got 4.4 on an 8320 on a basic Gigabyte board. I can grab the model number if you're interested.

    • @zoomzabba452
      @zoomzabba452 5 years ago

      @@ComputerProfessor I would be curious to know (for the history books). Most people I talk to in the comments couldn't OC much on lower-end boards; they usually had black-screen shutdowns.

  • @MrAzrealDragon
    @MrAzrealDragon 4 years ago +1

    I'm glad I got the FX 8350; it seems to have been a really good choice and I've gotten a lot out of it. It lasted me nearly 8 years and I'm hoping to get a few more out of it, but it was well worth it, and 40-60 fps is enough for me.

    • @MrAzrealDragon
      @MrAzrealDragon 4 years ago

      I will say, though, thank you for showing the power consumption at the end there. I had no idea my FX took so much power. I wonder if underclocking is a thing? lol

  • @panzerbuern4138
    @panzerbuern4138 5 years ago +1

    Still rocking a 3570K @ 4.9; no problems found as of yet. Might delid it though, as the temps seem to be slowly but steadily climbing.

  • @johnpaulbacon8320
    @johnpaulbacon8320 5 years ago +1

    Nice video. When my PC had the following hardware - Gigabyte 990FX-Gaming mobo, AMD FX-9590 CPU, Corsair H100i GTX CPU cooler, MSI Radeon RX 580 Gaming X 8GB GPU, 32 GB of DDR3 @ 1866 MHz RAM, 1 TB Samsung 970 Evo NVMe M.2 SSD, Seasonic Prime 650W FM 80+ Titanium PSU - it was still a very well-performing system. Somehow the motherboard got ruined; from what or how, I still don't know, but if it hadn't, most likely I would still be using that setup today. I was considering switching out the 32 GB of non-ECC DDR3 @ 1866 MHz for 64 GB of ECC DDR3 @ 1866 MHz. Now my PC is an AM4-based build.

    • @zoomzabba452
      @zoomzabba452 5 years ago

      I've lost the ethernet on a Gigabyte 990FX and the audio on an Asus 990FX. It's super annoying. 8350 @ 4.7 on air.

  • @Player-xi9dx
    @Player-xi9dx 4 years ago +1

    A minute of silence for those who chose to buy the FX 8350 toaster instead of an i5 K, only to save $30 and lose it on the electricity bill over the next 3 months, and, worst of all, with lower FPS.

  • @Spider7782
    @Spider7782 5 years ago +5

    Awesome video mate!
    What are your thoughts on the RTX 2060?

    • @F2FTech
      @F2FTech  5 years ago +1

      So far so good. Big fan of the global illumination in Metro Exodus. 40-60 fps is possible at 1440p with DLSS and no tessellation.

  • @xenonkay
    @xenonkay 5 years ago +1

    I wonder how that would go now with new microcode and security mitigation updates applied.

    • @austinr09
      @austinr09 5 years ago

      That's all a scam created by Intel to sell new processors.

    • @xenonkay
      @xenonkay 5 years ago

      They're doing a fine job of selling new AMD processors but I'm not sure that's what they meant to do.

  • @chasberndt2255
    @chasberndt2255 5 years ago

    Are you testing with SMT disabled or with all mitigations in place? Really curious to see how the Intel performs when you take into account things like Spectre, Meltdown, etc. Wondering how much of their performance lead was due to vulns.

    • @F2FTech
      @F2FTech  5 years ago

      I am not. I will in my follow-up a while from now.

    • @chasberndt2255
      @chasberndt2255 5 years ago

      @@F2FTech Cool, thanks for replying. Well done, btw.

  • @thomaswojahn
    @thomaswojahn 5 years ago +1

    That was a lot of work. Thank you. Keeping the OC'd 8350 at around 51° is not easy; TDP = 220-250 watts... good cooling. IMHO the GPU is the bottleneck at nearly 100%, while the CPU sits around 50 to 70%.

    • @F2FTech
      @F2FTech  5 years ago +1

      Yeah definitely in quite a few games, that’s why I included the 1080/Low results.

  • @DragonProtector
    @DragonProtector 5 years ago +5

    I've got a PC running an FX-9590 with an M.2 PCIe drive and it's a beast.

    • @LocoMe4u
      @LocoMe4u 5 years ago +1

      You don't pay the electricity where you live do you? XD

    • @zoomzabba452
      @zoomzabba452 5 years ago +1

      @@LocoMe4u electricity is included in my rent. #8350@4.7GHz silently on AIR.

    • @DragonProtector
      @DragonProtector 5 years ago +1

      @@LocoMe4u No, my parents do :)

    • @nenoneno9591
      @nenoneno9591 5 years ago

      @@DragonProtector You're a damn lucky boy.

    • @DragonProtector
      @DragonProtector 5 years ago +1

      @@nenoneno9591 well I just finished college and have helped them save money in other ways. I prob will start paying something once I get a job and all settled and what not.

  • @frshunter
    @frshunter 5 years ago +4

    Wow, surprising game performance for the FX8350 after all the hate. Although the power draw argument is actually extremely valid on this chip as we fire up the nuclear reactor :). The i5 3570K is as good as the love it got and still looks like a great chip!

  • @vgjhlgjhfuyfjhfjkhgyghj
    @vgjhlgjhfuyfjhfjkhgyghj 5 years ago +1

    I know you probably won't see this,
    but why the heck don't you have more subs? This content is gold 👍

    • @F2FTech
      @F2FTech  5 years ago

      I see every comment 👍 I really appreciate it though. I don’t post often, so less exposure.

  • @we_all_die_onceyt8509
    @we_all_die_onceyt8509 5 years ago

    Funny thing: I just upgraded to an i7 6700K with a new MSI Tomahawk motherboard, from my old ASRock budget motherboard (I believe that's what it's called) and FX 8350. Call me old, but I love how that CPU still runs like a champ.

  • @benruss4130
    @benruss4130 5 years ago +11

    I mean... the FX8350 is the faster chip... BUT it suffers greatly from optimization issues. The FX8350 was never that popular, so even early in its life it was rarely the focus of optimization, especially because of its core structure wonkiness and the fact that DX didn't support multicore CPUs very well. Nowadays, *NOBODY* is optimizing for it. They are optimizing for Ryzen- and Core-based architectures. The i5 3570K benefits indirectly from these optimizations, because many tricks that work on the newer Intel architectures also help earlier Intel processors (if to a lesser degree), whereas the FX 8350 gets no indirect optimizations from Ryzen, due to the greatly different architectures. The only reason the FX8350 has been making strides in performance is that its additional core count is being somewhat utilized by DX12 and Vulkan games.

    • @AshenTechDotCom
      @AshenTechDotCom 5 years ago +2

      Not exactly true; you have to consider the era these chips came out in and what they were up against. Most Intel chips of that era lack the AVX and FMA support the FX line has, and there are developers who have been testing and optimizing for the 6- and 8-core FX parts for years without making a huge deal of it. The FX line makes up a lot of 4-, 6- and 8-core systems, APU- and CPU-based. I'm told the 6- and 8-core parts started getting more attention from devs when they ran hardware surveys of their user base and checked the specs of the systems having performance issues. The FX line's core architecture isn't really the performance problem; it's the VERY slow L3 cache, which runs at near-RAM speeds.
      The gains these chips are seeing come in two parts. First, Windows 10 is actually optimized for them: it has a proper scheduler for this architecture, where XP, Vista and 7 don't really know what to make of it.
      I'm personally testing for three devs who keep FX systems in house and do at least basic optimization for them. They've told us the optimizations that benefit FX also tend to benefit 8-thread Intel chips, though to a lesser degree in most cases, and they're now working with Ryzen systems as well.
      The other big issue for Intel is the security patching to mitigate hardware-level CPU flaws, which the FX line, by and large, doesn't suffer from or need patches for (in fact, last I checked, exploiting the AMD equivalents requires physical access to the system, not just a remote hack or malware). AMD is gobbling up market share in desktop, workstation and server, leading MS to ship AMD- and Intel-specific security patches rather than blanket patches that can hurt performance for everybody, even people not on affected hardware.
      Anyway, the claim isn't really true. To a point it is, but many devs do optimize for these chips at least somewhat, and that's mostly been the case since Windows 10 dropped and gave us a 10+% boost.
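
For what it's worth, the "extra cores only help so far" point both of these comments circle around can be put in numbers with Amdahl's law. A minimal sketch, using made-up parallel fractions rather than measured ones:

```python
# Sketch: Amdahl's law - ideal speedup on n cores when a fraction p
# of the frame's work is parallelizable. The p values are illustrative.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.8, 0.95):            # hypothetical parallel fractions
    four = amdahl_speedup(p, 4)       # e.g. an i5 3570K's 4 cores
    eight = amdahl_speedup(p, 8)      # e.g. an FX 8350's 8 integer cores
    print(f"p={p:.2f}: 4 cores -> {four:.2f}x, 8 cores -> {eight:.2f}x")
```

With p = 0.5 the jump from 4 to 8 cores is worth barely 10%; only heavily threaded DX12/Vulkan workloads push p high enough for the FX's core count to pay off.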

  • @TrueThanny
    @TrueThanny 5 years ago

    Saying "threaded" is incorrect. That term refers to SMT, which is a duplication of registers, not execution units. Bulldozer modules have two independent integer execution units per module, so four-module Bulldozer-type processors have eight integer cores and four floating-point cores. (The snippet below shows how the logical/physical split looks from software.)
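
A quick way to see the distinction @TrueThanny is drawing on your own machine. Caveats: psutil is a third-party package, and how an FX chip reports depends on the OS's topology mapping - some schedulers treat a module's two integer cores as SMT siblings, so treat this as a sketch, not a verdict on the architecture:

```python
# Sketch: compare logical vs. physical core counts as the OS reports them.
# On an SMT chip (e.g. an i7), logical > physical. What an FX 8350 reports
# depends on whether the OS maps its modules as cores or as SMT siblings.
import psutil  # third-party: pip install psutil

logical = psutil.cpu_count(logical=True)
physical = psutil.cpu_count(logical=False)
print(f"logical cores:  {logical}")
print(f"physical cores: {physical}")
print("SMT/module sharing detected" if logical > physical else "1:1 cores")
```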

  • @jameshogge
    @jameshogge 5 years ago +1

    Nostalgia moment: I still have my old 3570K in a server.

  • @AgentSmith911
    @AgentSmith911 5 years ago +3

    It would be interesting to see whether the 8350 could handle 8K gaming with a 2080 Ti, since the CPU difference matters less at higher resolutions (the toy model below shows why).
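
A toy model of that effect: each frame costs roughly max(CPU time, GPU time), and raising the resolution inflates only the GPU side. All the millisecond figures here are invented for illustration:

```python
# Sketch: frame rate limited by the slower of CPU and GPU per-frame work.
# All timings are hypothetical illustration values, not measurements.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0  # per-frame CPU cost: roughly constant across resolutions
for res, gpu_ms in [("1080p", 8.0), ("4K", 30.0), ("8K", 110.0)]:
    bound = "CPU" if cpu_ms > gpu_ms else "GPU"
    print(f"{res:>5}: {fps(cpu_ms, gpu_ms):6.1f} fps ({bound}-bound)")
```

At 8K the GPU term dominates so completely that a faster CPU barely moves the result, which is why the gap between these two chips shrinks as resolution climbs.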

  • @gt362gamer
    @gt362gamer 4 years ago

    Any idea why my i5 4690K at 4.3 GHz scores about the same as the overclocked i5 3570K you used in Cinebench R15 multi-thread, but about the same as the overclocked FX 8350 you used in Cinebench R20 multi-thread?

  • @battyflaps5410
    @battyflaps5410 5 years ago +8

    I'd recommend not using HDDs for newer open-world games. In a couple of titles, particularly Black Desert Online at the time, I was getting huge hitching when performing attacks - it would hitch every time I clicked a button - and as soon as I put the game on my SSD (after clearing enough space to make it fit) it was gone. Performance was otherwise similar, same FPS, just no hitching, and the same when moving across the map. I hear the same problems can occur in some ass creed titles, but then again, when have those ever been particularly well optimized? Look at Black Flag: I can't hit 100% utilization on anything in that game and it still drops frames like a mf. (There's a rough disk-latency sketch below this thread if you want to measure it.)
    Maybe you should try testing PlanetSide 2 sometime; its performance has improved massively with the new DX11 patch. I'm getting higher FPS at ultra than I was at medium-low in big battles now, but my i5 4670K at 4.5 GHz is constantly pinned at 100% usage - that game is still demanding as fuck, the FPS just got better. I now see 60s more often than 40s with lower settings, lol. (The smoke effects do kill the FPS now that I have them turned up, but they're so pretty I need a better GPU.)
    Honestly, I got a 500 GB SSD, and while it's kind of expensive for 500 GB, it is so, so much better than an HDD. Except for Fortnite - the new season seems broken as fuck, half the time my character doesn't load in or is invisible, just a pickaxe, or the emotes don't come up. Idk, Epic have ruined that game lately. Good video, man. Maybe gameplay performance next instead of built-in benchmarks? More of an experience thing - or have you already done this? YouTube's pretty crap at keeping me up to date with channels I actually follow, tbh.

    • @F2FTech
      @F2FTech  5 years ago +1

      It's funny you mention that. I was just having a conversation with a friend on Discord about the same thing - frame-time issues with mechanical drives in certain games. I plan to switch my testing over at some point, but since both systems were tested with the same drive, the comparison is still valid. I'm also curious how many people still use mechanical drives vs. SSDs as their game drives... Maybe it's time for a poll!

    • @juanca2807
      @juanca2807 5 years ago

      What you describe is bad optimization, to put it mildly; from the point of view of someone involved in programming, it's more like shoddy coding.
      One thing that overwhelms modern SATA disks is large audio files: they're loaded in-game, and every request has to unpack the file and seek out the encoded audio the game is asking for. You can imagine they do the same for some secondary effects. These things can be managed properly with modular files, caching, and loading files into RAM correctly.
      SSDs aren't mechanical and dodge the problem with sheer bandwidth, but be aware that a game shouldn't be continuously rewriting anything on your SSD either; that just shortens its lifespan. Developers don't really care about this, because SATA hard drives are still commonly the main mass storage.

    • @doge820
      @doge820 5 years ago

      @@F2FTech I have a 480 GB SSD and two HDDs. I put the games I'm currently playing on the SSD and move the ones I've finished or don't play anymore to the hard drives.

    • @AMRAMRS
      @AMRAMRS 5 years ago +2

      I'd be worried if your games were constantly rewriting sectors on your SSD... you could end up with a worn-out solid state drive.

    • @pauliusgruodis137
      @pauliusgruodis137 5 years ago +1

      Off topic, but lmao, "ass creed".
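
For anyone who wants to put numbers on the hitching @battyflaps5410 describes, a crude random-read probe makes the HDD-vs-SSD latency gap obvious. The path and sizes are placeholders, and this is a sketch, not a proper benchmark - the OS page cache will flatter repeat runs:

```python
# Sketch: crude random-read latency probe against a large file on a drive.
# Replace TEST_FILE with any big file on the drive you want to test.
# Not rigorous: the OS page cache will flatter repeated runs.
import os, random, time

TEST_FILE = "D:/Games/some_large_asset.pak"  # placeholder path
READ_SIZE = 64 * 1024                        # 64 KiB, a typical asset chunk

size = os.path.getsize(TEST_FILE)
latencies = []
with open(TEST_FILE, "rb") as f:
    for _ in range(200):
        f.seek(random.randrange(0, max(1, size - READ_SIZE)))
        start = time.perf_counter()
        f.read(READ_SIZE)
        latencies.append((time.perf_counter() - start) * 1000)

latencies.sort()
print(f"median read: {latencies[len(latencies) // 2]:.2f} ms")
print(f"worst 1%:    {latencies[int(len(latencies) * 0.99)]:.2f} ms")
```

An HDD's worst-case seeks in the tens of milliseconds are exactly the stutters you feel when a game streams assets mid-fight; an SSD's stay well under a millisecond.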

  • @gt362gamer
    @gt362gamer 5 years ago +1

    18:06 Seriously? That power draw is insane (quick cost math below).
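
Rough math on that, with assumed numbers: say the overclocked FX pulls ~120 W more under load than the i5, at 4 hours of gaming a day and $0.13/kWh. All three inputs are placeholders, not figures from the video:

```python
# Sketch: yearly electricity cost of an assumed extra 120 W under load.
# All inputs are assumptions - adjust to your own measurements and tariff.
extra_watts = 120       # assumed FX-vs-i5 load-power delta
hours_per_day = 4       # assumed daily gaming time
price_per_kwh = 0.13    # assumed electricity price in $/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * price_per_kwh:.2f}/year")
```

That works out to roughly $20-25 a year under those assumptions - not nothing, but it takes years to eat the price difference of a platform swap.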

  • @dgfgable
    @dgfgable 5 years ago +6

    The FX 8350: a perpetual processor.

  • @genomeprojekt
    @genomeprojekt 5 years ago

    I wonder how an AMD video card would affect the results. Would Nvidia vs. AMD driver overhead matter for performance?

  • @thatxonexguy5438
    @thatxonexguy5438 5 years ago +1

    Great vid, man.

  • @davidroopnarine7089
    @davidroopnarine7089 5 years ago

    How much better would an overclocked i5 3570K perform than my FX 6300? I want to go Intel, but I can't really afford DDR4 RAM and want to reuse my DDR3. And would I see a performance improvement with my GTX 960 using the i5 3570K?

    • @F2FTech
      @F2FTech  5 years ago

      I haven’t tested it, but I’d expect a noticeable difference. Probably in the 20% range?

  • @kylerogers8782
    @kylerogers8782 5 years ago +1

    My system is an i5-3570K at 5 GHz, air cooled, paired with an R9 380 4 GB and 32 GB of system memory. Only now, with new games, am I having trouble at anything above high settings.

  • @capdas4585
    @capdas4585 5 years ago +1

    I can't get to 4.4 GHz or more at acceptable voltages, with an ASRock Z77 Extreme4 and a good PSU. Is the ASRock board the bottleneck for me?

  • @anasevi9456
    @anasevi9456 5 years ago +2

    Great video as always. And 1.5 V for an all-core 4.7 GHz? Eww... what a dog. My only experience with FX is an 8300 pulled from an OEM system, diffused in 2015 when GloFo had finally sorted out their joke of a 32 nm process: 4.8 GHz at 1.28 V. It's like a weird, privileged window into what could have been but never was [by the time relevance had long died of old age].

    • @F2FTech
      @F2FTech  5 years ago

      Yup. Not great, but not too terrible for an FX 8350. ASUS even has a guide for the CHV motherboard (linked) - it pretty much says raise the voltage to 1.5 V, increase the multiplier to get 4.8 GHz, and done :) (The multiplier math is sketched below.)
      rog.asus.com/articles/hands-on/guide-overclocking-fx-8350-to-4-8ghz-on-crosshair-v-formula-z/
      I'm sure I could've tweaked it a bit more, but it wasn't stable at 4.85 GHz with 1.5 V.
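
The arithmetic behind that guide is simple: core clock = base clock x multiplier, and AM3+ boards run a 200 MHz reference clock at stock. A trivial sketch:

```python
# Sketch: FX overclocking arithmetic - core clock = base clock x multiplier.
BASE_CLOCK_MHZ = 200  # stock AM3+ reference clock

for multi in (21, 22, 23.5, 24):  # 21 = FX 8350 stock, 24 = the guide's target
    print(f"x{multi:<5} -> {BASE_CLOCK_MHZ * multi / 1000:.1f} GHz")
```

By the same math, a 4.7 GHz overclock corresponds to a 23.5 multiplier, assuming the base clock is left at 200 MHz rather than raised.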

  • @hpcarlos2255
    @hpcarlos2255 5 years ago +1

    HEYNOW ~ cool guide. Many peeps ask me whether overclocking your GPU or CPU makes a noticeable difference = yes 🎮 always yes. The reason graphics cards and processors don't ship overclocked is that the maker has to guarantee the product's specs. Either way, if you have either of these two processors overclocked with a decent graphics card and you mostly use your computer for gaming = game on 🕹 Thanx for posting 🇵🇹

  • @cssorkinman
    @cssorkinman 5 years ago +1

    You do a great job with the videos. One thing to note: the FX was awfully close to throttling temps during at least a couple of the benchmark runs. I'd like to see the same comparison using DX11 and AMD video cards.

    • @F2FTech
      @F2FTech  5 years ago

      Thanks. Yeah, I did multiple runs prior to capturing and monitored with HWiNFO. These runs are short, and I never saw any throttling during my preliminary testing. Also, I probably won't revisit this matchup for some time, but there will be a future revisit.

    • @cssorkinman
      @cssorkinman 5 years ago +1

      @@F2FTech Appreciate the reply. The one that concerns me in particular is the AC benchmark, where the beginning of the run shows the CPU at 66 °C on the OC'd FX. That, coupled with the erratic frame-time measurements at the same point in the bench, leads me to believe it was either throttling or throwing cache errors due to heat. Is that also where the 0.1% lows were established? The data for the overclocked FX shows lower performance in those numbers, which is hard to account for. I've noticed that my FX machines will sometimes start a benchmark while the CPU is still being taxed by loading the program; this was most evident in the canned benchmark for BioShock Infinite, and Sniper Elite's benchmark would start counting frames during the load screen... not sure anyone is really interested in that, lol. (A log-parsing sketch for spotting near-throttle samples is below this thread.)

    • @F2FTech
      @F2FTech  5 years ago +1

      Valid point. Yes, the frame-time issues were at the beginning of the benchmark, then smoothed out through the middle and the end. I probably ran this test 10-12 times with a number of different settings - dropped the FSB and clocks slightly, dropped the memory clocks as well, but I didn't drop the vcore. Performance was nearly the same, and I still had frame-time issues at the start of the benchmark. I didn't start the capture until the camera starts to move in the built-in benchmark, so no loading screens were captured; the amount of time captured is shown at the top of the graph. You've piqued my curiosity, so I'll have to go back and take a look. The odd thing is, during the run at low settings the CPU generally saw higher utilization and these frame-time issues didn't occur. Strange - when the rig is set up again, I'll monitor the cores during this run. I appreciate the feedback 👍

    • @cssorkinman
      @cssorkinman 5 years ago +1

      @@F2FTech Great to see a YouTuber whose curiosity > ego! Earned a sub :)
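
One way to settle the "was it throttling?" question offline is to log the run with HWiNFO's CSV export and flag samples sitting near the thermal ceiling. The file path and column name below are placeholders to match to your actual log layout, and the 70 °C limit is approximate - FX core-temp throttle points are commonly cited in the low-to-mid 60s depending on which sensor you read:

```python
# Sketch: flag near-throttle samples in a HWiNFO-style CSV temperature log.
# LOG_FILE, TEMP_COL and LIMIT_C are placeholders/assumptions - adjust them
# to your actual export header and your chip's documented limit.
import csv

LOG_FILE = "hwinfo_log.csv"
TEMP_COL = "CPU Package [C]"
LIMIT_C = 70.0
MARGIN_C = 4.0  # flag anything within a few degrees of the limit

with open(LOG_FILE, newline="") as f:
    for i, row in enumerate(csv.DictReader(f)):
        temp = float(row[TEMP_COL])
        if temp >= LIMIT_C - MARGIN_C:
            print(f"sample {i}: {temp:.1f} C - close to the throttle point")
```

Cross-referencing the flagged sample indices against the frame-time graph would show whether the early-run spikes line up with peak temperatures.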

  • @Rhamboll
    @Rhamboll 5 years ago +2

    You should've included a "control" CPU such as a 3770K or something newer, because a head-to-head comparison alone isn't enough context for real-world performance.

    • @LocoMe4u
      @LocoMe4u 5 years ago

      Comparing against something like an i5-7600K would show how much games rely on single-core performance!

  • @shreddherring
    @shreddherring 5 years ago +2

    I'm surprised the 8350 held up this well; it strikes me as having improved with age! Speaking as someone still running a 3770K, I'm still glad I went with Intel at the time, though since this thing runs hotter than the sun, perhaps I ought to have gone with a 3570K instead.

    • @F2FTech
      @F2FTech  5 years ago +2

      Yeah, it's definitely aged well thanks to a lot of games being multi-threaded. Give it a few years and I'll test it again 👍

    • @shreddherring
      @shreddherring 5 years ago

      @@F2FTech It might be interesting to see a productivity/non-gaming comparison between them too.