Radeon 6900xt Vs AMD FX 8370... hehehe

  • Published: Oct 2, 2024
  • How bad is the bottleneck? Thanks to David Reid for the Mobo/CPU! Want to send me something? Email: me@timmyjoe.com! Consider Supporting Me On Patreon? / timmyjoe
    Shopping on Amazon? Use these links and it supports the channel: www.amazon.com...
    I have a website:
    www.timmyjoe.com
    Check out the MERCH!!! : www.redbubble....
    I have an Instagram:
    / watchtimmyjoe
    I have a Twitter:
    / watchtimmyjoe
    I have a facebook: who cares about facebook...
    -
    Intro instrumental produced by Chuki, the best beats on YouTube!
    / chukimusic
    (Other music provided by the Epidemic Sound Library...)
  • Science

Comments • 330

  • @DawidDoesTechStuff
    @DawidDoesTechStuff 3 years ago +103

    Hey! I had that motherboard back in the day. 😁
    For what it's worth, I get why you compared the two graphics cards. It's interesting to see how much of an advantage features like DLSS can be. Curious to see how AMD's version compares.

    • @qasimallawati4049
      @qasimallawati4049 3 years ago +3

      I love your vids

    • @TimmyJoePCTech
      @TimmyJoePCTech  3 years ago +19

      Thank you my dude!

    • @suntryp
      @suntryp 3 years ago +4

      @@TimmyJoePCTech I think he was talking about Dawid's vids lmao, but hey, I love your vids, Timmy

    • @Ace12GA
      @Ace12GA 3 years ago +7

      Dawid, Timmy Joe has ruined your channel title for me. I always read it as "Dawid Does Butt Stuff". You should challenge him to something again to redeem yourself, because I think I watch your channel more than Timmy Joe's now. Sorry Timmy Joe.

    • @TimmyJoePCTech
      @TimmyJoePCTech  3 years ago +8

      @@Ace12GA haha everyone watches Dawid's channel more than mine man, he's a juggernaut!

  • @Hardwareunboxed
    @Hardwareunboxed 3 years ago +60

    Ceiling lasers are what you want to look out for, no doubt!

    • @MirceaPrunaru
      @MirceaPrunaru 3 years ago +4

      Still waiting for the FX-CPU revisit in 2021

  • @m.r.a.allesklar4597
    @m.r.a.allesklar4597 3 years ago +34

    I like the FX processors; I use them for cheap budget builds and they can play most games at 1080p. Okay, mostly at low settings, but they're playable👍

    • @chriswright8074
      @chriswright8074 3 years ago +1

      Sorry, I'd take a second-gen Ryzen 2600 or above

    • @timhartherz5652
      @timhartherz5652 3 years ago +6

      They aged well, and I wasn't ashamed of picking one back then and keeping it around for ~10 years, while laughing at people who chose the "superior" Intel dual-core option in the same price class.
      But I wouldn't use one now, even for a budget build. There are better options available today with better upgrade paths.
      Well, IF they are available, that is, but that's more of a general problem.

    • @JorgieTV01
      @JorgieTV01 3 years ago +5

      Absolutely, I've got 8320, 8150, 8120 (95W), 6100 and 4100 CPUs lying around. Recently made a cheap Linux box with the FX 6100, averaging 80W during everyday use, cheap and fun; just hunting down good AM3+ boards can be a challenge

    • @acdcjor
      @acdcjor 2 years ago

      I have an FX 8320e + RX 470 8GB and right now I'm playing Kena: BoS at 1080p, 50 fps, medium settings.
      When it was new, I could play everything (except badly optimized titles) at ultra 60 fps, much better than just okay at low settings.

  • @pf100andahalf
    @pf100andahalf 3 years ago +32

    I just upgraded from a phenom ii x4 to a ryzen 5 last year, and your video is accurate.

    • @merlin1649
      @merlin1649 3 years ago +5

      The feels. FX 6300 to R5 3600 myself. "I'm not greedy but I get it if people are." ~Leviticus: near the back.

    • @kiavashasadi8687
      @kiavashasadi8687 3 years ago +3

      Yeah, I went from a Phenom II X6 to an i3 10100f... well, it was a huge upgrade, obviously...

    • @Kenji1685
      @Kenji1685 3 years ago +1

      Wow! What a leap!

  • @poorsap1598
    @poorsap1598 3 years ago +18

    Congratulations on the world record.

  • @ImperiousImperator
    @ImperiousImperator 3 years ago +3

    I probably would have found it funnier if I wasn't still stuck using an 8350 😅

  • @PeteKay
    @PeteKay 3 years ago +6

    idk how I feel about this series.... You should probably temper your expectations and insert a little tact in your vids -- especially when the source of said hardware isn't from a sponsor or your wallet.

  • @spencerwaters5962
    @spencerwaters5962 3 years ago +29

    Can't complain when you break a world record!! lol
    On a side note, I'm interested to see how a 3rd- or 4th-gen Intel Core processor does versus the 8370 when paired with the 6900 XT.

    • @DuneRunnerEnterprises
      @DuneRunnerEnterprises 3 years ago +1

      X3470 WILL beat it hands down!!!

    • @talvisota327
      @talvisota327 3 years ago

      @@DuneRunnerEnterprises I doubt it will be that much better. You need a 6-core Xeon like the X5650 to beat the FX significantly

    • @DuneRunnerEnterprises
      @DuneRunnerEnterprises 3 years ago

      @@talvisota327 Maybe...

    • @lagginswag
      @lagginswag 3 years ago +1

      Rocked an i5-3470, 12GB RAM and an R9 280 in a spare rig till I ruined it delidding due to temps lmao. Replaced it with an E3-1235 on the cheap, like 5 dollars more than I paid for the i5. Main rig: FX 8320 @ 4.3, 16GB 1866 and a Nitro 8GB 480. The FX actually beats the i5 in a lot of titles and can be overclocked/RAM-tweaked further. Granted I'm on a weak Intel chipset (H61), but the FX overall aged better than a Sandy/Ivy i5. 👻

    • @DanielGT_93
      @DanielGT_93 3 years ago

      An i5-2400 beats any stock FX. And I'm a big AMD fan: I used an Athlon on Socket 754, then an Athlon X2 on AM2 and a Phenom II on AM3, but when Sandy Bridge came out, boy, there was no comparison.

  • @dionmiller8547
    @dionmiller8547 3 years ago +6

    You should do a video where you test to see what's the best card to use for your overclocked FX. Is a 580 really the best? 1070?? 5600xt?

    • @findRED
      @findRED 3 years ago

      7950 / R9 280X / RX 460. And even then some games will bottleneck with the FX. I know because I used one for quite some time lol.

    • @ravenof1985
      @ravenof1985 3 years ago

      RX570/RX580 will be bottlenecked by an FX 8 core at times

    • @thebadman16v1
      @thebadman16v1 3 years ago +1

      I recently upgraded from FX8350 @ 4.6ghz to Ryzen 3700x using GTX 1060 6GB. Gained around 10 - 15% across a range of benchmarks. Hope this helps.

  • @johnfox2521
    @johnfox2521 3 years ago +5

    Man, my current setup is an FX 8320, with the same mobo, and an RX 580. Thanks Timmy for reminding me I'm too broke to upgrade lol. I will say that my system runs a hell of a lot better since I switched my OS to M.2 from SSD via a PCIe adapter.

    • @kanuh
      @kanuh 3 years ago

      which adapter did you buy?

    • @johnfox2521
      @johnfox2521 3 years ago

      @@kanuh according to Amazon it's an MHQJRH M.2 NVMe to PCIe adapter; if you need it I can grab the link.

    • @MJ-uk6lu
      @MJ-uk6lu 3 years ago +1

      You should have saved that cash and got an i5 10400F with a B460 mobo and some cheap 16GB DDR4. 310 euros and you are set for 8 years.

  • @pilsen8920
    @pilsen8920 3 years ago +3

    I have 2 RTX 3090s. DLSS is not a feature, it's trash; it looks horrible and has a lot of weird graphical anomalies. Shadows have weird stripes in them, and things like digital clocks or text in games are all pixelated. RTX and DLSS are only in like 20 games, so they really shouldn't be part of your buying decision, because honestly they don't work.

    • @maciejbartosik1424
      @maciejbartosik1424 3 years ago +1

      Thanks for your comment. It is amazing how often people turn a blind eye to the facts. The number of titles supporting DLSS 2 is pretty mediocre. Ray tracing is still a gimmick for the few who can pay a premium for the 3090. Other cards? We will see how fast 10GB of memory becomes insufficient. If it weren't for AMD and their cards, Nvidia would milk its customers without mercy. Now they feel AMD's breath on their back and are trying to fix the low-memory issue on their cards.

  • @LGN_Sniper
    @LGN_Sniper 3 years ago +3

    I had an 8320 with a 580 for a couple of years, wasn't that bad really... but I'm glad I upgraded lol. I also tried two R7 370 4GB cards in CrossFire; it was good for BOPs 3 zombies (over 200fps at points, but horrible low dips) and that's about all it was good for lol.

  • @GaryReedUnfrequentedWorld
    @GaryReedUnfrequentedWorld 3 years ago +4

    Back to form, Timmy... LOL. Your last video was not a waste at all. Now you just need to redo the tests when FidelityFX comes out... all's fair in this GPU arena at that point...

  • @Ziggy405
    @Ziggy405 3 years ago +5

    This is the kind of shenanigans I'm here for! I upgraded from an 8350 to a 1600 AF last January. So much better

    • @pauls4522
      @pauls4522 3 years ago

      What GPU have you got? Going to a Ryzen 5800X from an FX 8350 with an RX 580, I only gained maybe a 25% performance boost. (I'm waiting on getting a 6800 XT)

  • @nightbirdds
    @nightbirdds 3 years ago +4

    Oh the FX CPU. I have fond memories of it. I loved the tin boxes they came in. Might've been the best part of the whole deal. :)

  • @Revener666
    @Revener666 3 years ago +3

    *Looks up the 8370: released in 2014, 2 years after the 8350, and basically the same. Hmm..... Well, my 8350 with a 1060 still runs in this house. :)

  • @Nick_R_
    @Nick_R_ 1 year ago +2

    I was glad to see you running the fx at full tilt with great supporting components. After a decade of use, my fx 8350 still serves my needs really well, including gaming. You just have to adjust settings and expectations. And pair it with a sensible GPU. My recent upgrade from a HD7970 to a GTX1660ti works well. Note that the fx processors are limited to PCIE 2, so cards that need the bandwidth of PCIE 4 are a poor pairing. Of course I could get a new CPU, motherboard, RAM, NVME drive... and I would feel the difference. But now and for the last six years money has been tight. Hence the GPU upgrade to get more life out of my old beast.

  • @kodacko
    @kodacko 3 years ago +2

    I had an FX 8370, and it was bottlenecking my 1060 6GB. I would NOT pair it with a 1070/1660 Ti/2060/2070 at all. An RX 570/580 or 1060 is this thing's max, IMO. Good vid Timmy!

    • @hattemabdein9065
      @hattemabdein9065 2 years ago

      My friend, which is the stronger one, the FX 8370 or the FX 8350?

  • @arturpisarek4692
    @arturpisarek4692 3 years ago +3

    You don't need an AIO for this; use a Wraith Max from a Ryzen, it will fit and temps are OK.

  • @mix3k818
    @mix3k818 3 years ago +6

    Haven't upgraded much in years and some of my specs are over a decade old at this point...
    So your build still seems better than mine

    • @basshead.
      @basshead. 3 years ago

      I bet my PC is older than yours.
      BSEL modded Intel Pentium E2160
      vmodded ATi HD4670
      4GB of DDR2 RAM

    • @basshead.
      @basshead. 3 years ago

      I have my studio tour and gaming setup video on my channel.

    • @mix3k818
      @mix3k818 3 years ago

      @@basshead. oh wow you might be right
      Pentium G4400
      ASUS GeForce 9600GT 512MB
      4GB of DDR3 RAM

  • @abelneto9945
    @abelneto9945 3 years ago +3

    It's a CPU like the ones in the PS4 and Xbox One, so locking to 30 fps in cases like CP2077 should be fine

  • @itsdeonlol
    @itsdeonlol 3 years ago +6

    The 8370 was a heater back in the day!

  • @slavomirfabian7087
    @slavomirfabian7087 3 years ago +1

    Five years ago I tried an FX 8320 @ 5.0 with a 1070 in GTA5 and it was suffering. Pair this CPU with a 1060 power level at most.

  • @gsp3987
    @gsp3987 3 years ago +1

    You could try running 4 8GB sticks at 1333MHz and see if that gains you a few frames. Now try 3090 SLI on that FX, but have it inside a PC case, fully built, for great effect.

  • @fintrollpgr
    @fintrollpgr 3 years ago +1

    @Timmy Joe PC Tech: How did you overclock? Only the multiplier? You can get a lot of extra performance with an FSB overclock, and matching NB and HT overclocks. I mean, I get about the same CPU result in Fire Strike at 30 fps that you do at 300 MHz more. And I am bottlenecked by my RX 570 at UWHD and pretty high settings in RDR2. CPU load only hits about 70% while the GPU is pegged at 100% (still happy that it does almost 35 fps, smooth enough for me)
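
    A minimal sketch of the arithmetic behind this suggestion, with assumed multipliers and dividers rather than anything from the video: on AM3+ every clock is derived from the ~200 MHz reference clock, so raising that "FSB" speeds up the NB, HT link and RAM along with the CPU unless their ratios are lowered to compensate.

    def am3plus_clocks(ref_mhz, cpu_mult, nb_mult, ht_mult, dram_ratio):
        # Every bus on AM3+ is the reference clock times its own ratio.
        return {
            "CPU MHz": ref_mhz * cpu_mult,
            "NB MHz": ref_mhz * nb_mult,
            "HT MHz": ref_mhz * ht_mult,
            # DDR3 transfers twice per memory clock, hence the extra x2.
            "DRAM MT/s": ref_mhz * dram_ratio * 2,
        }

    # Assumed example: stock 200 MHz reference vs. a 240 MHz reference-clock
    # overclock with the CPU multiplier and DRAM divider dropped so the CPU and
    # RAM stay roughly where they were, while the NB and HT link end up faster.
    stock = am3plus_clocks(200, cpu_mult=21.5, nb_mult=11, ht_mult=13, dram_ratio=4)
    bumped = am3plus_clocks(240, cpu_mult=18, nb_mult=10, ht_mult=10, dram_ratio=3.33)

    for name in stock:
        print(f"{name}: {stock[name]:.0f} stock -> {bumped[name]:.0f} at 240 MHz ref")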

  • @dhgodzilla1
    @dhgodzilla1 3 years ago +1

    I just threw together an older AMD build because I could. An AMD 1090T Black Edition with 16 gigs of DDR3 1600 (Ripjaws), an R9 390 8G Gaming, a 500GB SSD & a 200GB SSD + a 2TB hard drive. A 750W 80+ PSU in it, with LED & RGB fans, in a decent budget case with a side window. Gave it to my brother for Christmas (he was still using an old Core 2 Quad), so he is happy. Yeah, it is technically obsolete, but it is still 2-3x the performance of his old comp.

  • @terminator4625
    @terminator4625 3 years ago +2

    Timmy Joe hating that FX users still get 60 average 10 years later. XD. No, seriously though, my brother swapped his Phenom X6 for an 8350 and as long as he gets 50-60 he doesn't care. Console gamers... Last-gen console gamers.

    • @talvisota327
      @talvisota327 3 years ago +1

      The FX performs just as well as in 2013, or maybe even a bit better... over the last few years, games have finally started making use of all 8 threads, compared to just 2 or 4 threads in 2013

    • @terminator4625
      @terminator4625 3 years ago

      @@talvisota327 no doubt. I never had the pleasure of being an FX user, as I finally got to upgrade from the Phenom I originally had to a Ryzen system. He got the old system and the Phenom it still used, a 1090T. It served me well through middle and high school.

  • @GeorgeJFW
    @GeorgeJFW 3 years ago +4

    Don't apologize, you were completely on the money in the last video. It's great AMD is making cards that can hang on the high end, but they are lacking features that will be important over the next few years. If they had been aggressive on pricing that would be fine, but apples to apples they don't compare. IMO

  • @effexon
    @effexon 3 years ago +1

    What blasphemy is this? When FX was a thing, all I saw was every reviewer bashing it mercilessly while praising overpriced Intel 4-cores. Now suddenly people (reviewers on YT) try FX and say, well, it's not that bad. :S

  • @James-qu6ul
    @James-qu6ul 3 years ago +1

    Timmy was not way too hard on the 6900 XT. It really sucks for the money. It's literally within like 5% of a 3080 on average in rasterization, without all the features and with total shit ray tracing, while costing way more. The 3090 is stupid too, but at least its only competition is the 3080.

  • @SubZero8007
    @SubZero8007 3 years ago +7

    "It's cinematic" Love that comment!

  • @Dimondgamer123
    @Dimondgamer123 3 years ago +1

    FX is still okay with a 1650 or 1050 for people playing older games, but newer CPUs like Ryzen or older Intel are better. I still have an FX PC in my collection and it's not bad

  • @ytmandrake
    @ytmandrake 3 years ago +1

    Don't apologize for not pleasing AMD fanboys. The only good thing that came out of AMD is cheaper Intels :)

  • @cinerir8203
    @cinerir8203 3 years ago +2

    Haha, "Butt poo @ 4.65 GHz", fastest poo ever :D
    Also, when showing Tomb Raider, the graphic still said "Wolfenstein YB", intended or an oversight?

  • @eury360
    @eury360 3 years ago +1

    If you increase the resolution to 4K, the CPU shouldn't matter that much anymore.
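
    A rough way to see why, with made-up frame times rather than measured ones: per frame, the CPU's work barely changes with resolution, while the GPU's work scales with pixel count, and the frame rate is limited by whichever of the two takes longer.

    def fps(cpu_ms, gpu_ms):
        # A frame can't finish before both the CPU and the GPU are done with it.
        return 1000 / max(cpu_ms, gpu_ms)

    slow_cpu_ms, fast_cpu_ms = 16.0, 6.0   # assumed per-frame CPU times
    gpu_1080p_ms, gpu_4k_ms = 5.0, 20.0    # assumed GPU times; 4K is ~4x the pixels

    print(f"1080p: slow CPU {fps(slow_cpu_ms, gpu_1080p_ms):.0f} fps, fast CPU {fps(fast_cpu_ms, gpu_1080p_ms):.0f} fps")
    print(f"4K:    slow CPU {fps(slow_cpu_ms, gpu_4k_ms):.0f} fps, fast CPU {fps(fast_cpu_ms, gpu_4k_ms):.0f} fps")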

  • @mbass-tt5dc
    @mbass-tt5dc 3 years ago +1

    shoulda done an FX 8350 nitrogen-cooled at 8.1GHz with a 6900 XT

  • @chriswright8074
    @chriswright8074 3 years ago +7

    I accept the apology and I can understand your point of view

  • @at0mac
    @at0mac 3 years ago +2

    I paired a 6800 XT with an 8600K and see a bit of the same result: the IPC is low and the GPU is too powerful, but well, an upgrade at some point should smooth that out

  • @ZeR0goth
    @ZeR0goth 3 years ago +2

    I'm using a non-overclocked FX 8350 paired with an RX 580 and 16GB of 1866. I'm still having an enjoyable experience at 1080p in modern games. Settings are usually low to medium, but it gets me by.

  • @h1tzzYT
    @h1tzzYT 3 years ago +2

    Great video, I always enjoy these old CPU + modern GPU tests. Missed opportunity to test this rig at 4K instead, though, since it's obvious you are CPU limited.

  • @Keimzy
    @Keimzy 3 years ago +1

    I've had my 8350 for about 6 years at this point, those were the days. It's a server nowadays and I got myself the R7 3700X, the boost is just amazing. Had to keep the R9 390, but it still chugs along with an OC and new paste

  • @DrMuFFinMan
    @DrMuFFinMan 3 years ago

    Everyone is entitled to their own opinion, and as an owner of an FX-6300, an FX-8320, and, as my last upgrade, an FX-8370, it's always fun to see what they can do today. Personally though, the new AMD cards are pretty good unless your only interest is ray tracing, and at that point neither Nvidia nor AMD is truly good because of the massive performance penalty for running any form of ray tracing.

  • @d4mb20
    @d4mb20 3 years ago +1

    My friend is still using the 8320 with a 580; he doesn't have stutters though, maybe there's something wrong with the OC

    • @miki290576
      @miki290576 3 years ago +1

      I have the same components :)

  • @matthewclemons1574
    @matthewclemons1574 3 years ago +1

    If FidelityFX works in every game, it already beats DLSS.

  • @austinr09
    @austinr09 3 years ago +1

    Jimmy Toe, we love these kinds of vids. Nobody else thinks to put a 6900 XT on an FX. This is why we all subscribed.

  • @MrEscanaba
    @MrEscanaba 3 years ago +1

    I don't even play those games at all. It's hard to compare my motherboard to yours, as mine was a much newer FX-series board from just before Ryzen first came out; I bought it for $130, made by Gigabyte with their best golden BIOS. It's a Gigabyte 990FXA-UD3 Ultra CF, their last FX board, with no reviews.
    www.gigabyte.com/Motherboard/GA-990FXA-UD3-Ultra-rev-10#ov
    The Sabertooth was the best board in 2012, and the 2.0 version improved on what was already good: a USB Type-C connector, an onboard M.2 SSD x4 2.0 lane, and more PCIe lanes for expansion slots, with extra tweaks over the rest of the 990FXA-series boards. That is, if you can find one; they are very rare. Mine can be set up to run any GPU as primary in PCIe slot 1 or 2, with slot 3 being the slowest lane. Great audio too, which I did not expect to sound so good.
    I wouldn't touch the sequel to Wolfenstein 2; that game wasn't good and was better to skip despite the hype, never follow lemmings or hype. I can't compare any of those games, but the board can still run a Vega 56 about equal to a stock CrossFire pair of XFX R9 290X 8GB cards (scoring as high as 27,000 overclocked in Fire Strike, almost 24,000 with the Vega 56 MSI Air Boost OC).
    The FX chip isn't dead in the water yet. I doubt you can get your chip to 4.6GHz at 1.416V with the NB at 2500MHz and dual-channel RAM at 2000MHz with 9-10-9-24 timings, overclocked from 1600MHz. I broke many records with a Vega 56 on the FX 8350 class scoreboard.

    • @MJ-uk6lu
      @MJ-uk6lu 3 years ago

      FX is truly dead. Ryzens downclocked to 2GHz or even less beat them.

    • @MrEscanaba
      @MrEscanaba 3 years ago

      @@MJ-uk6lu Why compare? You've got a lot of learning curve ahead before you understand that framerate bars do NOT tell the whole story, even with the 0.1% and 1% lows.

    • @MJ-uk6lu
      @MJ-uk6lu 3 years ago

      @@MrEscanaba Because it shows exactly how outdated and poorly performing FX truly is. And why do you mention minimum framerates? FX stuff was awful at them too.

  • @ItsJustJoeC
    @ItsJustJoeC 3 years ago +1

    At one point my friend and I both had the same MSI RX 480 8GB. I was running an i5 6600K at 4.5GHz and he was running an FX 6300 at 4.6GHz. I would typically get about 20% higher FPS depending on the game. We both had 16GB of RAM, although mine was DDR4 3000 and he obviously had DDR3; I believe it was 1866 if I remember correctly. Where my system really pulled away was when using each other's cards to run the 480s in CrossFire. In CrossFire you may also want to factor in that I was running the cards at PCIe 3.0 x8 while the FX system only supports PCIe 2.0. Although I am not sure how much difference that made with an RX 480-class card, even in CrossFire.

  • @ardennielsen3761
    @ardennielsen3761 3 years ago +2

    A 4.427GHz FX-4100 and a 1.270GHz RX 570 are a 1:1 match... can't do any more than that on the stock cooler with two heat pipes on it.

    • @ardennielsen3761
      @ardennielsen3761 3 years ago

      Fortnite at 45 fps @ 1280x1024 ultra settings; an FX 8370 could push 200 fps at that resolution? Does it need to be upgraded? No, it clearly runs fine... just kick the keyloggers that boot ya from games just because they see certain hardware. Give 'em the big leather boot.

    • @MJ-uk6lu
      @MJ-uk6lu 3 years ago

      @@ardennielsen3761 Dude, just save up for an i5 10400F. They are dirt cheap now and can handle any GPU out there.

    • @ardennielsen3761
      @ardennielsen3761 3 years ago

      @@MJ-uk6lu meh

  • @antoinemorfos7321
    @antoinemorfos7321 3 years ago +1

    Seeing you be that clumsy, or even a little bit sloppy, when moving hardware around taught me not to treat hardware like a newborn.

  • @cj_zak1681
    @cj_zak1681 3 years ago +1

    Just learned that Steve from GN uses this CPU....wow

  • @zero0core
    @zero0core 3 years ago

    I'm still running an FX 8350 w/ 16GB RAM & a Red Devil RX 580 and it plays most modern titles just fine. But come March I will have a new system: Ryzen 9 5950X, 64GB 3200MHz RAM, RTX 2060. To be honest I have not had too many issues with my current system. And yes, the 8350/580 combo plays Cyberpunk just fine if you're OK with console-like performance. 😎

  • @dodgydruid
    @dodgydruid 3 years ago

    Just like a lot of people who have unlimited access to the latest, greatest things, you don't see in real time what consumers do, and that is Nvidia pulling the carpet out from under anything they deem "old", with 9xx and 10xx card owners just finding out how their cards are being sidelined because Nvidia wants you to buy new, new, new all the time. Meanwhile, AMD has kept support and longevity at the forefront; my Radeon 570 8GB can handle quite a lot, and what I have to turn down in settings is not the greatest of losses. The 580 and 5700 XT do even better, and these will STILL be usable twice as long as an Nvidia card would be. I was told Watch Dogs: Legion wouldn't work on my Ryzen machine... oh no, that card is TOO old, said the blurb, and then we see it running pretty well on ancient hardware. Oh, and I have my settings in WD:L pretty much maxed... as are my Cyberpunk, FH4, Metro Exodus and Doom Eternal settings, all on a card said to be not up to the mark :S

  • @jeffmorse645
    @jeffmorse645 3 years ago +1

    To think I was such a loyal AMD fan back then that I kept beating my head against the wall with FX processors (an FX-6300 at first, then an FX-8300). I OC'd them as much as I could and they still couldn't properly support the GTX 670 I had at the time. Finally broke down and bought an i5-4690K and a Z97 motherboard. It was like night and day in gaming. I was actually pissed at myself for waiting so long to switch to Intel. Terrible chapter in AMD history.

  • @FinnLovesFP
    @FinnLovesFP 3 years ago

    I think the more evil thing to do is pair an RTX 3090 with a Core 2 Quad or a Pentium 4 on a late LGA 775 board that supports DDR3. LMAO, the only game a Pentium 4 could run okay-ish in the past few years was Doom 2016, on a Pentium 4 and a GTX 970. Ridiculous test, but quite entertaining tho.

  • @chomper720
    @chomper720 3 years ago

    Well even the i7-4790K would struggle with this card to feed it enough frames... Edit: The FX still beat the PS4 and XBONE! XD

  • @lflyr6287
    @lflyr6287 3 years ago

    Timmy Joe PC Tech: first of all, all the games that you ran were BASED ON OLD GRAPHICS ENGINE DERIVATIVES (Cyberpunk 2077 started development back in 2010, which means its game engine originates from 2010 and was already finished in 2014). Second of all, you should try running an RX 6900 XT or an RTX 3090 on a Shintel Core i5 2500K 4c/4t or a Shintel Core i7 2600K 4c/8t CPU, which were also released in 2012 :). You'll be surprised that half the games won't even run on the i5 anymore, and you'll also be surprised how badly they both perform, especially in the 1% and 0.1% lows. Trust me, much lower than games do on the FX 8370 8c/8t CPU :).

  • @Babbages
    @Babbages 3 years ago

    Why don't you, or really anyone else, do baseline benchmarks on non-overclocked FX chips compared to after the OC, or with your OC matching the chip you're versing? Because I bet the 0.1 percent lows would be higher than they are with the overclock, because I swear those chips are more stable non-overclocked or only slightly OC'd.

  • @thebeaner687
    @thebeaner687 3 years ago

    Hey man, that's basically my setup. Except it's an FX-8350 at 4.2 GHz with 1866 DDR3 and an AMD reference RX 6800 XT, and I am getting 45 FPS with my 6800 XT graphics card. I know it's a BIG bottleneck, but the 5900X is back-ordered and I have a Dark Hero X570 motherboard with Ballistix Max DDR4 4000MHz RAM waiting for my 6800 XT

  • @d.b.8684
    @d.b.8684 3 years ago

    Come on now, don't knock the Sabertooth lol, one of the best boards I've ever owned. Matter of fact, I just put together a gaming system with that board, an FX 8350, 16GB RAM, a 250GB SSD (new), a 1TB WD HDD, and topped it off with an RX 590 8GB, all powered by a Montech BETA 550W 80+ Bronze PSU (new). I plan on selling it as a budget 1080p gaming rig. Custom cables as well; it looks fantastic btw. Not sure of the price tag, any suggestions?

  • @JamesJones-zt2yx
    @JamesJones-zt2yx 2 years ago

    Actually, I'm typing this comment on my Linux desktop machine that's sneaking up on 7 years old: an FX-8370, 16 GB of RAM (soon to be 24; it was so inexpensive I couldn't pass it up) and a GTX 960, which I'll replace with an RX 5600 XT (no doubt the CPU will be a bottleneck, but at least I can be rid of nVidia, not to mention that I'm no gamer).

  • @jasonianduke4985
    @jasonianduke4985 3 years ago

    I've got a PC with an FX-9590, a Crosshair V Formula-Z MB, 16GB Corsair Dominator 2133MHz, an RTX 2060 Super and a Samsung 850 PRO 1TB SSD. Plays most games OK. Remastered Battlefield games do take some performance hit with this hardware, and some triple-A titles take a hit too. The game Control seems to run just fine with all graphics settings maxed out plus ray tracing on. Have to put some games on medium graphics settings, etc.

  • @zr1519
    @zr1519 3 years ago

    I think the last video was harsh... and the response was a bit harsh... but AMD being competitive with Nvidia should really show how badly Nvidia has done, to let AMD get right back up on their coattails.
    Please overclock AM3+ with the reference clock (the "FSB"). I usually have mine set to 240 (and adjust the divider to keep the RAM at the same speed), and then single-core performance is boosted significantly (I've done this on an Athlon X2 370K, a 4100, two 6300s, an 8350 and a 9590). Also raise the HyperTransport and NB (I think it is?) clocks to 2200 each, or 2400 each if you can.
    The Intel chips from that era also gain from reference-clock tweaks, but not by much, and most people don't like to learn.
    Also, you have to stop core parking. QuickCPU can do this easily. The OS is built to hurt FX chips (check it, Intel did several deals to hamper FX).
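
    For the core-parking part, a minimal sketch (not the QuickCPU tool itself): the same unparking tweak can be applied with Windows' built-in powercfg by raising the minimum number of unparked cores on the active power plan, run from an elevated prompt.

    import subprocess

    def unpark_all_cores():
        # CPMINCORES is powercfg's alias for "Processor performance core parking
        # min cores"; setting it to 100 keeps every core unparked on AC power.
        subprocess.run(["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
                        "SUB_PROCESSOR", "CPMINCORES", "100"], check=True)
        # Re-apply the current scheme so the change takes effect immediately.
        subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

    if __name__ == "__main__":
        unpark_all_cores()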

  • @ColJonSquall1
    @ColJonSquall1 3 years ago

    I have the 990FXA Gaming board from MSI, with an FX3820, and it was my primary gaming PC for quite a while. I had gone through a few GPUs over the years, to get better performance, but opted to rebuild after getting an RX 470 8GB OC from Sapphire and wanted to get ready for Cyberpunk 2077. So I took the plunge last year, and rebuilt with a 2920X Threadripper and running on an Nvidia 2070 Super from EVGA. I opted to go Nvidia this time around, just due to (at the time) AMD's driver support was rough. I'm actually happy to see the 6000 series get the spotlight, with the 6800 XT getting monster performance gains with some of the liquid cooled variants.

  • @pauls4522
    @pauls4522 3 years ago

    As a joke you should have done 4K, since it's a 4K GPU and 4K is less CPU-bound.
    But jokes aside, yes, if you are planning on using anything faster than an RX 580 / 1660 then you should stay away from the FX 8350.
    I saw a good difference in CPU-bound games like StarCraft 2 when upgrading to a Ryzen 5800X from an FX 8350, but in mainstream non-CPU-bound titles I saw maybe a 25% performance bump. Which is pretty good, but the 5800X is a $450 processor. If I had an i7 Coffee Lake or a 3700X, that gain might be less than half. And those are still expensive CPUs versus a second-hand FX 8350.

  • @ModernGeekReview
    @ModernGeekReview 3 years ago

    Playing the wrong games! Sims 4, Baldur's Gate, try fighting games: DEAD OR ALIVE 5LR, STREET FIGHTER, etc. But the latest AAA titles, then of course not! I would like a "can it BlueStacks" test, Timmy Joe PC Tech. Don't throw out that AMD FX 8370, there's still a lot of life in that CPU! And if it can hold a solid 60fps then it's still awesome!

  • @josejuanandrade4439
    @josejuanandrade4439 3 years ago

    The title of this video should have been: TJ bends the knee to the AMD fanboys.
    Never apologize for your opinion. You are entitled to it. And I agree that there aren't many reasons to recommend AMD right now, unless you find it in stock with no Nvidia cards in stock, and at a reasonable price. The 6800 XT and 6900 XT are NOT bad by any stretch of the word, but they lack key features: encoding is better on Nvidia, Nvidia has DLSS, and ray tracing is better on Nvidia IF you care about that last one.
    Also, the rest of the video was great. I love people testing these old FX CPUs.
    EDIT: Besides the fact that FidelityFX may not work as well as DLSS at launch (which honestly is like 99% sure, just look at ray tracing on the AMD cards), there's the fact that games WILL NEED to get patched for it... and that is the bigger problem. FidelityFX is NOT DLSS, because that is a trademark of Nvidia, and who knows how long it will take devs to patch it into games... Remember how long it took Shadow of the TR to get patched with RT and DLSS?

  • @cracklingice
    @cracklingice 3 years ago

    It's just overpriced, that's the problem. If the 6900 XT had launched at $599 then it'd be walking all over Nvidia, but AMD couldn't make enough of them to do that, and if they had, they couldn't parade it as a top-end GPU because it wouldn't have top-end pricing.
    I mean, if they took the 6800, cut the RAM to 8GB and priced it at $449, it would WALK the 3060 Ti and 3070 in value.

  • @KubanKevin
    @KubanKevin 3 years ago

    As a 3090 owner, I think you are way too confident in NVIDIA, and you continuously hint/jab at DLSS/ray tracing when just about any poll you find shows that the majority of people don't care about ray tracing. DLSS is not a product to be praised, in my opinion, because it's just a way for NVIDIA to be lazy and not have to push more IPC or clock speed out of their cards. Remember that NVIDIA, AMD, and INTEL are not your friends.

  • @Anakeish
    @Anakeish 3 years ago

    FX still has problems with AMD GPUs, you should consider that. FX from the get-go was just built wrong. That's why AMD was sued: "Bulldozer-based chips weren't truly multicore processors to the extent that AMD claimed. AMD advertised the CPUs as eight-core chips, but each chip only had four "dual-core modules" with separate execution units; other resources like cache and a single floating point unit (FPU) were shared across the module. AMD says that those modules counted as two cores each, for a total of eight, but customers alleged that since the modules couldn't actually run separate processes, they only should count as a single core for a total of four cores, not the eight that AMD claimed".
    For comparison, it would be nice to test a 2600K and see how that CPU handles modern games.

  • @JustJeff_LazyGamer513
    @JustJeff_LazyGamer513 3 years ago +1

    still rocking an FX 6300 at 3.9GHz with an RX 570 8GB, and I get stuttering if I go above 1080p on high

  • @hi_tech_reptiles
    @hi_tech_reptiles 3 years ago

    The current FidelityFX is great, their sharpening stuff (which is technically ReShade), but it's not DLSS 2.0. Hopefully the new version is better, but you can hardly get the cards it works with anyway... Nvidia's AI/ML stuff is going to be more and more important as time goes on.

  • @BjornsTIR
    @BjornsTIR 3 years ago

    The point of the 6900 XT is to make a $1600-2100, 400-500W card look stupid, which it did imo. Also, part of why DLSS doesn't suck anymore is that they kinda cut the AI BS and made it more of a hardware thing. If AMD takes a similar approach to it (or a better approach with dynamic resolution), it might be better at release than expected.

  • @FireFoxBancroft
    @FireFoxBancroft 3 years ago +1

    That 6900XT is still cheaper than the RTX 3080 Ti.

  • @cidsapient7154
    @cidsapient7154 3 years ago

    Yeah, idk, I'd rather have a 6800 XT than a 3080 for gaming myself.
    Maybe you were just expecting too much from the 6900 XT; the only reason I'd prefer a 3080 would be for compute, as the full potential of Big Navi is far away.
    But yeah, Nvidia's flagship 3090 should still outperform it, especially in ray tracing.
    Big Navi is kinda like "Zen 2", so by "Zen 3" you should see them overtaking Nvidia in some areas.

  • @Phynellius
    @Phynellius 3 years ago

    It's just your opinion. It's wrong, but that's the nice thing about opinions: you can be wrong and it's not the end of the world. Nobody is arguing AMD has the edge in software and features (are they?); you just have to keep in mind that in some titles DLSS looks like hot garbage, so using it as a baseline is inherently biased and wrong. You should still test with it to show how good it can be, and I agree that overall Nvidia still has the edge for the most part, but at least AMD has a competitive offering if all you care about is rasterization performance. I'll probably run both the 6900 XT and the 3080 if I can ever find one of those EK water-blocked 3080s from Asus... maybe wait until next gen or the gen after, when they start bringing the 3000 series back to fill in for the lack of stock of the 5000 series

  • @Gibbons-q5y
    @Gibbons-q5y 3 years ago

    Don’t apologise man. You’re right. Radeon cards only make sense if they’re significantly cheaper than the equivalent Nvidia card. Nvidia has an objectively better feature set, which is valuable to the consumer. RX 6000 is a RTX 2000 series competitor in terms of features.

  • @DeathAdder71
    @DeathAdder71 3 years ago

    HeHe, I tried (almost) the same thing with my dad's old 8350 and a 6800XT just for laughs ... I wasn't disappointed (I laughed a lot); Felt as if I was Trolling myself 😁 Good Times

  • @andyjoelharper3448
    @andyjoelharper3448 3 years ago

    Really? There would be a monopoly if AMD/ATI didn't exist! And Nvidia would charge even more and get sloppy with their designs. Nvidia wouldn't even strive to do better if it wasn't for AMD. Their cards are getting better. Now no one can buy ANY card!

  • @keegan7736
    @keegan7736 3 years ago

    I don't even need to watch this video to say "poor AMD FX 8370" lol. Also, 69 dislikes. Niiiiiice haha

  • @mix3k818
    @mix3k818 3 years ago +1

    What do you think will be the best year to upgrade in the next 5 years?

  • @ericcarver9250
    @ericcarver9250 3 years ago

    Shit, I still use my 8320 machine daily; proper maintenance over the last 10 years has kept her running like she was still new. While it may not compete with the Intel counterparts, these things will run forever, even heavily overclocked. I still

  • @1NIGHTMAREGAMER
    @1NIGHTMAREGAMER 3 years ago +1

    Ahahaha, old CPU go brr on the new Time Spy bench. Bro, my 3700X struggles in Time Spy lol

  • @forog1
    @forog1 3 years ago

    Everything you said to me last video is more or less true. The 6900 XT is a letdown; the 6800 XT is the better product of the lineup. The 6900 XT really should not exist if all it's got is like 10% more performance over the 6800 XT and nothing else... At least the 3090 has 14GB more VRAM along with its 10% performance uplift over a 3080.

  • @myleghurts7419
    @myleghurts7419 3 years ago

    My wife's PC has an FX 8350 and an RX 480 that she plays Fortnite, Planet Zoo and Sims 4 on (the only games she will play), and her performance is better than that in Fortnite. Fortnite runs smooth for her. She runs 1080p high. Seems like something is up with your setup.

  • @mezar4854
    @mezar4854 3 years ago

    I had an FX-6300 for about 10 years and recently upgraded to an R7 3700X because why not, right? Will I ever use all dem cores... no. Anywho, see you in 10 years when I upgrade again

  • @PapaMav
    @PapaMav 3 years ago

    I can attest to a 1070 and an FX 8320 being a terrible mismatch. Both came in the first PC I bought for my son, a CyberPower pre-built. It only took 2 years before that was shown to be the case, w/ the FX being the weak sister.

  • @puregaming3976
    @puregaming3976 3 years ago +1

    I'm still using an 8370 lol

  • @pankoza
    @pankoza 1 year ago

    I still have that CPU
    but my GPU is still a good old GTX 960 2GB MSI Gaming

  • @HeyDan1983
    @HeyDan1983 3 years ago

    OMG, what happened here... You were an old-tech lover... Now you're just the average "LTT" capitalist Nvidia-fanboy tech YouTuber.

  • @pufaxx
    @pufaxx 3 years ago

    We all know - FX wasn't the best choice for gaming, even in its time. That's why I think comparing an i7 4770 could be interesting, too.

  • @emilberkhahn4504
    @emilberkhahn4504 3 years ago

    Hey Timmy Joe, do you still have the R9 Fury X? If the answer is yes, could you do another extreme OC video on it? You could add a little more power limit and voltage :) I like your overclocking videos.

  • @Stelio_Contos
    @Stelio_Contos 3 years ago

    I still want an anniversary edition of the FX series: use the new Zen cores & "Infinity Fabric", up the cache sizes, but keep the same architecture structure. A new chipset, and you'd have what the FX series was supposed to be: future-proof

  • @trueheart5666
    @trueheart5666 3 years ago

    In Gamers Nexus and Hardware Unboxed charts, a 4.9GHz FX 8370 gets even lower fps than 2-core/4-thread CPUs.

  • @OneCosmic749
    @OneCosmic749 3 years ago

    Wasn't AMD FX always trash? I mean, that architecture was losing to Intel at launch, and prices were never low enough to justify its performance deficit versus Intel.

  • @davem7722
    @davem7722 3 years ago

    I think you're wrong; considering the next-gen consoles both run AMD hardware, you're going to see a lot more AAA games optimized for Big Navi

  • @ZoruaZorroark
    @ZoruaZorroark 3 years ago

    I had that motherboard and an FX 8350; I can say that a Cooler Master Hyper 212+ with dual fans can cool it, just with too little thermal headroom for the overclocks I was aiming for at the time

  • @alekpo2000
    @alekpo2000 1 year ago

    I got the same rig but with the FX 9370 and the RX 6600 XT; they're both a perfect match cuz the motherboard sucks so much ass that it's impossible for either of them to perform higher lol

  • @VeritronX
    @VeritronX 3 years ago

    I would use DLSS in Control and Wolfenstein, but I hate it in Cyberpunk. I hate motion blur and run a fast monitor; I hate the trailing artifacts DLSS gives you in Cyberpunk with a passion.

  • @IndyMiraaga
    @IndyMiraaga 3 years ago

    Still more playable than PS4 or Xbox One. Tweaking some settings should make most of those games very playable.

  • @petenielsen6683
    @petenielsen6683 3 years ago

    I am running my RX 5500 XT on an MSI 970 Gaming Mobo with an FX 8350. I win the bad match sweeps!