It's the Best Gaming CPU on the Planet.. AND I'M MAD. - Ryzen 7 7800X3D Review

  • Published: Dec 26, 2024

Comments • 4.8K

  • @AlexTheDane_ 1 year ago +7422

    I think it would be a nice touch if you could highlight the CPU in the benchmarks. It can be a little confusing with so many different CPUs at the same time and it took me a bit to actually find the 7800X3D. I know it might be a little thing, but for me at least, it would make it a bit easier to spot the CPU that the review is about. Otherwise great video!

    • @Jjrage1 1 year ago +517

      Nah, it's not just you, the charts in this are all over the place. The order of the CPUs changes around and they're not even sorted by fps in some of them (5:02). I don't really get it. It'd be much easier if each CPU stayed in the same spot so we could compare its bar to the others.

    • @AaronShenghao 1 year ago +184

      Yeah GN highlights the CPU in their videos, not sure why LMG doesn’t…

    • @b4ttlemast0r 1 year ago +96

      yeah, Gamers Nexus always highlight the ones that they're talking about

    • @SmashMysticSaiyan 1 year ago +32

      Yeah... the charts are pretty bad for this one. There doesn't seem to be any rhyme or reason to how the CPUs are listed from chart to chart. Sometimes it looks like it's by performance, but other times it just seems random.

    • @daftpunk672 1 year ago +19

      I'd like them to take it a step further and just separate AMD and Intel on the charts. So in the case of this video, put AMD on top and Intel on the bottom, and vice versa for an Intel review.
      I don't know if this would be better or worse, but whenever I want to find a specific CPU I have to go through the whole list because everything just gets scrambled.
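
The fixed-order / highlight suggestion in this thread is easy to prototype. A minimal sketch in Python; all CPU names and fps numbers here are illustrative placeholders, not the video's actual data:

```python
# Sketch of the charting suggestion: keep every CPU in one fixed order
# across charts and visually mark the reviewed part with a '>>' marker.
# Names and numbers are made up for illustration.

FIXED_ORDER = ["7800X3D", "13900K", "7950X3D", "5800X3D"]  # same order on every chart

def render_chart(title, fps_by_cpu, highlight):
    """Return an ASCII bar chart; the highlighted CPU gets a '>>' marker."""
    lines = [title]
    peak = max(fps_by_cpu.values())
    for cpu in FIXED_ORDER:  # never re-sorted, so the eye can track one row
        fps = fps_by_cpu[cpu]
        bar = "#" * round(20 * fps / peak)
        marker = ">>" if cpu == highlight else "  "
        lines.append(f"{marker} {cpu:<8} {bar} {fps}")
    return "\n".join(lines)

print(render_chart("Game A, 1080p (avg fps)",
                   {"7800X3D": 241, "13900K": 233, "7950X3D": 238, "5800X3D": 205},
                   highlight="7800X3D"))
```

Because the row order never changes, successive charts can be compared at a glance, which is exactly what the commenters are asking for.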

  • @psihius 1 year ago +4097

    The 7800X3D will sell like hot cakes in the Factorio community the same way the 5800X3D did - the performance boost is so massive that it allows you to scale to much, much, MUCH bigger factories.
    Also, memory latency really REALLY __REALLY__ matters to Factorio. Going from DDR4-2666 to DDR4-3200 made like a 10-15% difference in performance given similar CAS latency ratings. On DDR5 this is even more pronounced and could be a fascinating rabbit hole for the LTT Lab.

    • @tech_you 1 year ago +66

      CAS latency doesn't matter at all, it's more about tRRDS/L, tFAW and tRFC for timings.

    • @mrbobgamingmemes9558 1 year ago +121

      I just hope this CPU can get 60 fps in my modded Cities: Skylines, although most of my mods are gameplay tweaks; very few are asset mods

    • @TheHighborn 1 year ago +117

      in factorio the performance is totally nuts

    • @duckilythelovely3040 1 year ago

      LTT lab lol.
      That thing is a joke. Tell me how 1 dude from hardware unboxed has FAR more games tested, with far more settings than an entire fucking "lab" does.
      Sorry, but their content is pretty much a joke. Far too quick.

    • @muSPKwow 1 year ago +65

      All of that community already bought the 7950X3D.
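
For a rough sanity check of the DDR4-2666 vs DDR4-3200 point above: first-word read latency can be estimated as 2000 × CL / data rate, in nanoseconds. A small sketch; the example kits are assumed, not from the video, and as a reply notes, secondary timings like tRFC also matter in practice:

```python
# Back-of-the-envelope first-word latency. DDR transfers twice per clock,
# so latency_ns = CL cycles / (data_rate_MTs / 2) * 1000 = 2000 * CL / data_rate.

def first_word_latency_ns(data_rate_mts: int, cas_latency: int) -> float:
    """Approximate time for the first word of a read to arrive, in ns."""
    return 2000 * cas_latency / data_rate_mts

for kit, (rate, cl) in {
    "DDR4-2666 CL16": (2666, 16),   # assumed kit with the same CAS rating
    "DDR4-3200 CL16": (3200, 16),
}.items():
    print(f"{kit}: {first_word_latency_ns(rate, cl):.1f} ns")
```

At the same CL, DDR4-3200 lands around 10.0 ns versus roughly 12.0 ns for DDR4-2666, i.e. about 17% lower first-word latency, which is the same ballpark as the 10-15% Factorio gain the comment describes (the extra bandwidth helps too).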

  • @MarioGoatse 1 year ago +2674

    AMD’s performance is great in the last few years, but look at that efficiency! 85W while the 13900K is hitting over 250w. That’s insane.

    • @RicochetForce 1 year ago +179

      No joke, pairing this with a 4090 is going to lead to an incredibly cool and quiet PC as neither part is using much power at all. Efficiency monstrosities.

    • @AwesomePossum510 1 year ago +291

      @@RicochetForce A 4090 not using much power? Are you high? Efficiency is decent but even that would be way better on a 4090 if they underclocked it a little.

    • @muizzsiddique 1 year ago +24

      @@RicochetForce Forget to switch alts?

    • @mbj920 1 year ago +97

      @@AwesomePossum510 I assume if you cap the frames of a game, a 4090 would be more efficient reaching 144 frames than a 4070 would be, if that makes sense. It's easier for a bodybuilder to lift heavy dumbbells than it would be for a teenager. Easier for a Lamborghini to reach 150 MPH than a typical car. That's the logic, but I haven't seen any comparison videos or stats.

    • @manfredDmeer 1 year ago +14

      @@AwesomePossum510 have you heard of undervolting?

  • @dejanzie 1 year ago +150

    As someone who uses my desktop PC for gaming and home office work, I highly appreciate the increased focus on energy efficiency. While I'm not sure of the impact on menial workloads that don't require full CPU usage, I do take it into account when buying parts (CPU as well as GPU etc.). Especially here in Europe, where energy costs have exploded since February 2022.

    • @Proxymated 1 year ago

      Blame your government for not knowing their place beneath Russia's boot.

    • @TheRealSykx 10 months ago +3

      hmm what happened in february 2022 🤔 /s

    • @techietisdead 7 months ago +8

      @@TheRealSykx A certain nation decided to invade its neighbor who was not a threat in any way at all

    • @Pt08020 4 months ago

      @@techietisdead not a threat at all...but definitely put NATO troops at the border and didn't expect anything to happen...crazy. Asking for it.

    • @fearfulSPARTAN 3 months ago

      @@Pt08020 ..........

  • @skyfighter_64 1 year ago +1416

    Quick improvement idea:
    I think keeping the charts at 9:00 in a constant order, instead of sorting them by value every time, would make them much faster to read, especially with many of them in a row.

    • @KillerDonuts27 1 year ago +112

      Agreed! Either that or 'Freeze' the tested product at the top and sort the rest by best to worst.

    • @Taracktor 1 year ago +64

      Bigly agree or even BOLD or HIGHLIGHT the main product being tested.

    • @ubermidget2 1 year ago +10

      I like the "best performer" sort order, but definitely something can be changed. Others have mentioned fixing just the review product or bold/colouring the entry.
      If bold/colouring is used, it would also be nice to have a second entry for the nearest price competitor (at time of release)

    • @JamesTK 1 year ago +7

      The charts all through the video are impossible to follow and therefore useless fluff. More data doesn't make it better when it's not easy to consume

    • @mafi288 1 year ago +1

      Or maybe add a vertical line at the end of the bar for the product being talked about, to quickly compare the other ones with it. Comparing, for example, the top bar with the bottom one is really hard without a grid of one kind or another

  • @iben3271 1 year ago +1268

    Why do these reviews never test simulation speed? Like Stellaris, Civ 6, HOI4, Total War end-of-turn times (maybe even Football Manager?)... It's one of the main strengths of this CPU that pretty much no one knows about.
    There's a test that exists where a 5900X is put up against a 5800X3D. The 5800X3D destroyed the 5900X in Stellaris speed, simulating 25-30% more days in the same amount of time. If this happens with a lot of similar games, then this CPU could be a gamechanger for a lot of people, but no one seems to explore this further.
    I still see people recommending non-X3D chips for these types of games because of "higher single-core clocks", and I think a lot of people are still making this mistake. Hoping to see someone test this extensively someday, but here we are 1 year later, and still no one has.
    edit: please stop saying "no one plays those games". FM23 & Civ 6 alone make up 110k players rn on Steam. That's not counting any of the Total War games or Paradox games, or any other simulation/grand strategy game... (which is probably another 200k+ total)

    • @Elijah1573 1 year ago +138

      Fr, the games I care about with these CPUs never get tested

    • @whateverfitshere 1 year ago +131

      because most players aren't playing them.

    • @KaloianBostandziev 1 year ago +2

      IDK

    • @iben3271 1 year ago +2

      @@whateverfitshere might wanna take a look again at steamcharts. very wrong take

    • @DiscombobulatingName 1 year ago +69

      Definitely. My biggest CPU-bound game right now is flipping turns in Civ 6 when the game gets late.

  • @MisterAnderson91 1 year ago +1262

    Just on the graphs: it would be nice if you could highlight the reviewed CPU's result on every page. It jumps around up and down (since the graphs are ordered by performance) and because each graph has short screentime, it's really frustrating to try and quickly see the position on each one.

    • @lucasgondra5725 1 year ago +8

      Up

    • @klericer 1 year ago +77

      This. Also, don't shuffle them around. Keep the same order, then it's much easier to compare even if the graphs are only shown for a short time.

    • @chrisfow87 1 year ago +59

      This! The graphs are always terrible, LTT has acknowledged they're terrible and said Labs will improve them, but why can't we have some QoL improvements while we wait for the step change?

    • @huzaifavawda8383 1 year ago

      this is why you are part of a failed society, cant pause the video but want things highlighted for you. You are pure laziness exemplified and a brokie

    • @catsonicc5979 1 year ago +1

      they totally should!

  • @DBofficial125 1 year ago +574

    One big thing to take from this is that the 5800X3D is still an elite CPU, hanging in there with this one, and there's no need to rush to AM5 and upgrade everything until prices settle and we're forced to.

    • @ex0stasis72 1 year ago +20

      Interesting. I'm currently looking into upgrading my PC for Starfield, and I have an Intel i5-7600K. I want to switch to AMD, in part, because I like how AMD doesn't force you to buy a new motherboard after every CPU, so in that regard, I feel like I should go with AM5 because AM4 is on its last generation of CPUs. But I'm also on a tight budget (if you couldn't tell by the fact I still use an i5-7600K). And I'm in a hurry to build a PC that can play Starfield at 60 FPS.
      Maybe my other option is to buy used AM4 parts; that way, when I'm ready to upgrade in the future, the difference in resale value isn't so large.

    • @Legendaryplaya 1 year ago +4

      @@ex0stasis72 Digital Foundry did a video on Starfield that may interest you. They claim there probably will not be ray tracing because it would be basically impossible to do all the calculations in the game. They explain why, but it will certainly be a CPU-limited game, as all Bethesda games are. So I imagine you really could build a computer under $1000 and run the game at fairly high settings, especially if you use 1080p.

    • @ex0stasis72 1 year ago +5

      @@Legendaryplaya thanks! After giving it some thought, I'm probably going to go with a used AM4 motherboard and the 5800x3D. Buying a much cheaper used motherboard solves my concern about it being the last generation on AM4 because I can always just resell it used again for not that much less later.

    • @Legendaryplaya 1 year ago +1

      @ex0stasis72 nice. If it was my money, I'd grab a 3060 ti/4060 ti for DLSS, and it'd be a smooth 60 fps.

    • @ex0stasis72 1 year ago +1

      @@Legendaryplaya Good choices. I already have an RTX 2070 Super, so I'm not in as dire a need to upgrade just yet. But I am often bottlenecked in CPU-bound games, which Starfield very much will be.

  • @SiiiVR 1 year ago +594

    I personally really appreciated the performance per watt cost graphs since they give a good estimate of the proportional difference in cost of ownership. Thanks Linus, Team and Labs!

    • @rockenrooster 1 year ago +32

      it skews even more drastically when you add AC/cooling costs to keep your room at a comfortable temp!

    • @TheFPSPower 1 year ago +8

      They just missed talking about idle power while watching videos and stuff; Ryzen CPUs are notoriously horrible at that, while Intel has it figured out.

    • @majstealth 1 year ago +6

      @@TheFPSPower stock yes, but one could always throttle the voltage a bit

    • @WaspMedia3D 1 year ago +13

      @@TheFPSPower It's literally like 5 - 10 watts, total system. If you bought a $500 best of the best gaming CPU with 8 cores that can do well at productivity to leave it sit Idle the whole time, that would be stupid, and you could just turn it off, or set it to sleep after inactivity. I don't know why people keep bringing this up, 1 hour of productivity or six hours of gaming will displace 2 weeks of the difference at idle.
      The only point this argument ever makes is people like to grasp at straws to promote their favourite brands.

    • @constantin58 1 year ago +7

      It's not just electricity savings; a low-wattage CPU also lets you run a cheaper cooler, with lower temperatures, less noise, and longer longevity without needing to repaste often.

  • @TheMarioOne 1 year ago +281

    I really like the comparison of power usage and running costs between different CPUs, it's not something I've considered when building PCs in the past. Would be great to see this in GPUs and the full PC build videos as well.

    • @tshcktall 1 year ago +4

      nice avatar. Also, I always try to build a really power efficient pc.

    • @FriarPop 1 year ago +2

      If you are worried about $50 over 5 years, you have problems.

    • @tshcktall 1 year ago

      @@FriarPop I save way more than that, but how would you know? You don't know how I use my PC, or what my electricity prices are. You just pulled a number out your §§§.

    • @handlealreadytaken 1 year ago +1

      @@tshcktall even if it’s 10x, you have problems if that’s your financial undoing.

    • @tshcktall 1 year ago +3

      @@handlealreadytaken I just don't like wasting money when there's a better way for me, that's it. Nothing more. Why do you guys even care so much what other people prioritise?

  • @yellingintothewind 1 year ago +507

    You miss an important bit with Factorio. It's a game about automation and scaling out. While it's capped at 60 TPS no matter how powerful your system, it *will* dip below 60 TPS once you get a complex enough map. The better scaling means running bigger maps before you hit "TPS death". Would be interesting to develop a benchmark that tests how complex you can get before that happens.

    • @SurgStriker 1 year ago +7

      maybe they could ask ChatGPT to write a script that automates the game in a predictable way to reliably ensure growth happens equally every run, then just test for when the TPS drops.
      Unless Factorio has procedural generation (random maps, etc). I've never played it so not sure on that part

    • @jma89 1 year ago +47

      @@SurgStriker Yes, the maps are procedural, but actual gameplay is quite deterministic. A better test may be a single mega-base that strains even the latest gen hardware and see what the UPS (updates per second) falls to. I'm sure there are plenty of folks in the community who would love to share their multi-thousand SPM (science per minute) bases as a benchmark.

    • @yellingintothewind 1 year ago +23

      @@SurgStriker The maps are procedural, but that's easily solved using a fixed seed. No need for chat gpt, since the game logic is implemented in Lua. So it would be quite easy to write a "mod" that tiles out parallel construction segments at a set rate using the dev-mode infinite source / sink stuff till the game grinds to a halt. With the fixed rate expansion, that lets you plot performance as "time till TPS drop".

    • @NightKev 1 year ago +36

      @@SurgStriker ChatGPT isn't some magic spell that can do everything for you.

    • @SovereignThrone 1 year ago

      @@NightKev anime girls aren't magically gonna take your virginity, but here you are with this profile pic
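
Factorio does ship a deterministic headless benchmark mode (`factorio --benchmark <save> --benchmark-ticks <n>`), which fits the fixed-seed idea discussed above. A sketch of turning its summary into an effective-UPS number; the exact log-line format here is my assumption of that mode's "Performed N updates in M ms" summary, so verify against a real log before relying on it:

```python
# Derive effective updates-per-second (UPS) from a Factorio benchmark run.
# A map is "healthy" while effective UPS >= 60; below that you're in TPS death.

import re

def ups_from_benchmark_log(log_text: str) -> float:
    """Parse an (assumed) 'Performed N updates in M ms' summary into UPS."""
    m = re.search(r"Performed (\d+) updates in ([\d.]+) ms", log_text)
    if not m:
        raise ValueError("no benchmark summary found")
    updates, ms = int(m.group(1)), float(m.group(2))
    return updates / (ms / 1000.0)

sample = "Performed 6000 updates in 125000.000 ms"   # hypothetical run
print(f"{ups_from_benchmark_log(sample):.1f} UPS")   # 6000 updates / 125 s = 48.0 UPS
```

Combined with a fixed map seed and a mod that expands the base at a fixed rate, as the thread suggests, this would let you plot "time until UPS drops below 60" per CPU.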

  • @madpistol 1 year ago +336

    That power usage on the 7800x3D is mindblowing. It would have been enough for AMD to retake the gaming crown decisively, but they did it with a part that sips power. Freakin magic.

    • @madpistol 1 year ago +84

      @@wasd____ you don’t buy a 7800x3D for production. You buy a 13900k or 7950x.

    • @madpistol 1 year ago +34

      @@wasd____ buy a 7700x.

    • @AECH_CH 1 year ago +11

      @@wasd____ lmao, it's still a killer CPU and just looking at charts doesn't do it justice at all.
      Intel and AMD both build good CPUs; the cache here is the selling point. It has like double the cache of similar CPUs, so 10% less GHz really isn't the point here.

    • @Ordoscc 1 year ago +28

      ​@@wasd____If you're losing money over 5% performance lost, upgrade you concern troll.

    • @mr.x408 1 year ago +21

      ​@@wasd____ well if a person couldn't afford a dedicated fast cpu then their "production" does not need one.

  • @kiri101 1 year ago +206

    As a UK viewer (and someone who generally tries not to waste power) the 7800X3D is clearly for me. I had a great experience building a GPU-less AM5 7700 video editing system for someone recently.

    • @GoobyDev 1 year ago +2

      Ah yes the 7700, I chose that for my build specifically because of its power efficiency (UK too)

    • @sophieedel6324 1 year ago +4

      if you cared about power consumption, you wouldn't buy a 7800X3D to begin with, you would get an i3 or a console

    • @GoobyDev 1 year ago +46

      @@sophieedel6324 you only buy a console if you want gaming and nothing else. It doesn't beat the usefulness of a computer. Also, i3s don't necessarily use less power than these chips, and why sacrifice massive amounts of performance for that anyway?

    • @dex6316 1 year ago +23

      @@sophieedel6324 then you wouldn’t get an i3 or console either. You’d get a laptop if you truly cared about power consumption. Efficiency matters a lot. The fact of the matter is that those i3s won’t be more efficient than a 7800X3D; you just lose so much in performance and power consumption is barely different. You could also get a 4080 and power limit it to kill consoles in efficiency.

    • @nelsonmendoza561 1 year ago +16

      @@sophieedel6324 they simply don't provide the performance necessary for everyday tasks and gaming in one machine. An i3 is great value, don't get me wrong, but in performance per watt the 7800X3D is really quite amazing. Just unfortunate about the clock speeds.

  • @ChrisK251 1 year ago +721

    This CPU with the extra cache is an absolute monster for World of Warcraft, up to 30% extra frames

    • @Sherfps 1 year ago +11

      thats good to know.

    • @watermelon58 1 year ago +23

      Star Citizen too

    • @janfrederick7753 1 year ago +8

      I think MSFS also likes the huge L3 Cache

    • @_Zane__ 1 year ago +6

      World of War craft? 😂

    • @Monsux 1 year ago +12

      Yep... Even the older 5800X3D was performing insanely well in WoW, miles ahead of other higher-core-count CPUs. This new 7800X3D will do even better.

  • @OTechnology 1 year ago +674

    I love how Linus never misses any opportunity to mention how AMD abandoned X399 lol

    • @SyRose901 1 year ago +37

      I'm not a person who remembers CPUs by its chipset names, do you mean the Threadripper non-pro lineup?

    • @-FOXX 1 year ago +20

      I'm mad too; 20 grand down the drain, damn near

    • @tiedye001 1 year ago +3

      ​@@SyRose901 Yes

    • @jakesnussbuster3565 1 year ago +10

      ​@@SyRose901 you could just look it up considering you're on the internet

    • @SyRose901 1 year ago +4

      @@jakesnussbuster3565 True, didn't think of that.

  • @RippanYT 1 year ago +247

    Still happy the 5800X3D is up there at the top representing AM4!
    It's insane we went from the Ryzen 3 1200 to the 5800X3D on the same socket... just freaking insane.

    • @XDSerialKiller 1 year ago +1

      What MOBO btw?

    • @heliosaiden 5 months ago +1

      2 mobos (B350 -> X570) and 4 CPUs later, I still rock AM4

    • @InnerFire6213 5 months ago +2

      @@heliosaiden I went from a 2600 to a 5700X3D. It essentially doubled the gaming performance, it's so crazy. Games that heavily use the CPU run buttery smooth; I'm seeing crazy gains in Dyson Sphere Program, Stellaris, and Beyond All Reason, all on my 7-year-old Gigabyte AB350 Gaming 3

  • @haukewalden2840 1 year ago +283

    Finally! I'm so happy that you are starting to consider the energy usage of CPUs and what they truly cost in the long run.

    • @ZaikoEAG11 1 year ago +12

      This. A lot of people (me included) like to keep their PC parts for more than just a couple of years

    • @Naokarma 1 year ago +1

      Especially with such a big difference. If you're not playing the highest-fidelity games, you can still benefit from spending a bit more today on this thing rather than paying for it in energy over the next 3 years or so

    • @WrexBF 1 year ago +6

      That power consumption section is a little misleading though. The 13600k pulls more power in productivity workloads but it's faster. Also, they didn't include idle power consumption numbers. Your CPU spends the majority of the time at idle, and AMD CPUs pull more power because of the I/O die.

    • @mars_12345 1 year ago +4

      I think it's time for energy consumption testing in different workloads, similar to how laptops are tested for battery life. Idle, web browsing, 4K playback, gaming (this will be hard, you need some sort of average load). It will be harder, because components need to be tested more in isolation than in conjunction like in laptops, but now that they have the new lab... ;)

    • @wasd____ 1 year ago +6

      Hot take: a difference of maybe $100 in total energy cost between CPUs amortized over a 5 year presumed lifespan is not a consideration. $20/year, less than $2/month, is literally just maybe some random loose change from your pockets. It's nothing you'd ever even notice or think about. It doesn't matter.
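
To put numbers on this thread's argument either way: with roughly the 85 W vs 250+ W gaming draws discussed in the video, the yearly cost gap depends entirely on hours played and the electricity price. A quick sketch where the hours per day and the $/kWh rate are assumptions to plug your own values into:

```python
# Rough cost-of-ownership math for the efficiency debate above.
# watts -> kWh/year -> dollars/year; all inputs are illustrative.

def yearly_cost_usd(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Energy cost per year for a component drawing `watts` while in use."""
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

HOURS, RATE = 4, 0.30            # assumed: 4 h/day gaming at $0.30/kWh
cheap = yearly_cost_usd(85, HOURS, RATE)     # 7800X3D-class gaming draw
hungry = yearly_cost_usd(250, HOURS, RATE)   # 250 W-class gaming draw
print(f"85 W class: ${cheap:.2f}/yr, 250 W class: ${hungry:.2f}/yr, "
      f"delta ${hungry - cheap:.2f}/yr")
```

At these assumptions the gap is around $72/year; at lower hours or cheaper power it shrinks toward the "$20/year, who cares" figure above, which is why both sides of this thread can be right for different users.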

  • @lolowski6826 1 year ago +552

    I am so glad that you finally started adding performance from games like Factorio! As someone who plays a lot of automation and 4X games, those are very important metrics to me and many other automation/4X enjoyers. It is also very difficult to find reviews that include them; they all just focus on frames per second in big AAA games.

    • @Austynrox 1 year ago +21

      Agreed. I don’t necessarily play factorio (but want to try it) but i do play games like rimworld, they are billions and recently was gifted satisfactory. And i do enjoy city builder/management games also so it is nice to have some mention outside aaa

    • @Kenstro949 1 year ago +16

      Yep exactly, I'm looking to upgrade just to improve performance in games like Stellaris etc.

    • @Crusader-eh2cv 1 year ago +6

      @@Austynrox You should try Oxygen Not Included and Dyson Sphere Program, both VERY good.

    • @ex0stasis72 1 year ago +4

      Also, heavily modded Minecraft tech modpacks. Those are very CPU intensive.

    • @MrSherhi 1 year ago +2

      Maybe even a Civ 6 turn-time test could become a standard imo.

  • @DKTD23 1 year ago +204

    The power efficiency has been incredible on the 5/7800X3D chips. Truly ideal.

    • @spacepioneer4070 1 year ago +4

      Agreed, I thought the days of air cooling were behind us for flagship gaming CPUs... I was wrong. I'm actually blown away!

    • @madpistol 1 year ago +3

      The Ryzen 7000 parts are pushed to the limit, so naturally, when AMD has to rein in some of that power due to voltage limitations on 3D V-Cache, you get a very efficient chip. I just don't think any of us were expecting it to be THAT efficient. It's insane.

    • @wasd____ 1 year ago

      It's not efficient when it takes longer on productivity tasks and doesn't really idle any more efficiently. You lose all those savings and more in the extra time it takes the processor to get things done.

    • @DKTD23 1 year ago +2

      @Winston Deleon that is a moot point though; this chip is marketed as a high-end gaming CPU that has the capability to perform productivity tasks, with the vast majority of buyers focusing on the former and not the latter. For those types of tasks the 7900X or 7950X would be more suitable

    • @niko1even 1 year ago

      ​@@spacepioneer4070 air cooling is far from behind, good air coolers have never been behind AIOs.

  • @tom940 1 year ago +35

    I just picked up a 5800X3D. I started with a 3060 Ti and a 3600; when I upgraded my middle monitor to 1440p it left just a little to be desired. I got a good deal on a 3090, but then the 3600 was a HUGE bottleneck. My FPS in games like Far Cry 6 straight up doubled switching to the 5800X3D.

    • @InertGamer 1 year ago +1

      I'm having that issue now but with Intel. I have an i7-8700K and had a base 2080. Then I upgraded to a 3080 Ti and now most games are running sluggish. Cyberpunk, SCP, Assetto Corsa

    • @sparda9060 4 months ago

      @@InertGamer your CPU is too weak for modern heavy games like Cyberpunk. If you get an Intel 12th-gen or Ryzen X3D chip, your gains with the 2080 or 3080 Ti would be huge.

  • @moon200070 1 year ago +75

    Is it possible, on all the graphs representing the CPUs and their relative performance, to have a highlight of sorts for the two CPUs being compared? So maybe a red "highlight" around the 7800X3D, and a "blue" highlight for the CPU that costs the same or is priced to compete, so that the viewer has a more intuitive feeling for the difference in the charts? Otherwise it feels like just a wall of numbers, and I usually watch these not because I am going to buy them, but for entertainment. So a simple visual indicator of where they are on the list would be cool!

    • @AwesomePossum510 1 year ago +3

      Totally agree. I like how they go through the graphs quickly but with a little indicator of what’s what I wouldn’t need to pause the video as much.

  • @ThFnsc 1 year ago +143

    I'm so happy about this focus on performance per watt. It's such an important metric that's been overlooked for far too long. I truly hope LTT keeps talking this much about it in future reviews - not just for CPUs, but for everything.
    I can't help but think the absurd power consumption Nvidia and Intel have been pulling lately is partially a result of the community and reviewers, on average, not caring much or at all about efficiency - just raw performance and to hell with the costs.
    There's only one more thing that would make me even happier: measuring and caring about idle consumption

    • @rockenrooster 1 year ago +8

      I agree! But I would take it a step further because it skews even more drastically when you consider AC/cooling costs to keep your room at a comfortable temp, especially with summer coming up in NA. Very few like to game while sweating.

    • @marcuscook5145 1 year ago +8

      Performance per watt is pretty much the ultimate measure of who has the best CPU architecture. It's fine looking at the benchmarks, but if the results are very close while one company needs a ton of extra wattage and heat output to achieve that result, I'm obviously going to pick the more efficient and cooler running chip if the prices are anywhere near comparable.

    • @SadMatte 1 year ago

      @@marcuscook5145 well, one thing about it though: some architectures can't consume too much power before they ultimately fail, like Ryzen, while something like 13th-gen Intel was built to handle very high amounts of power.
      All in all, I wouldn't say it necessarily makes one architecture superior to the other, but I think it's always a better foundation for improvements if you can supply the CPU sufficiently with fewer watts overall.
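
Performance per watt as a single ranking figure, as this thread advocates, is a one-line calculation. The fps and wattage values below are illustrative placeholders, not the video's measurements:

```python
# Rank CPUs by fps per watt. Figures are made-up examples for illustration.

cpus = {
    "7800X3D": (230, 85),    # (avg gaming fps, gaming package power in W)
    "13900K":  (235, 250),
    "7950X3D": (228, 120),
}

def fps_per_watt(entry):
    fps, watts = entry
    return fps / watts

ranked = sorted(cpus, key=lambda name: fps_per_watt(cpus[name]), reverse=True)
for name in ranked:
    print(f"{name}: {fps_per_watt(cpus[name]):.2f} fps/W")
```

This is also why near-identical fps bars can hide a big architectural gap, as the comment above argues: divide by the wattage and the chart stops looking close.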

  • @SIW808 1 year ago +480

    Just look at how the old 5800X3D is keeping up with these new CPUs in games. What a chip!

    • @DKTronics70 1 year ago +50

      @@dukejukem413 The PLATFORM (AM4) it runs on is very old - September 2016 - that is ancient in PC times.

    • @gumbi79 1 year ago +42

      @@DKTronics70 it's just a socket... the platform is the chipset, and it's evolved over the years. And DDR5 and PCIe 5 are not worth the extra outlay at all

    • @macse7en 1 year ago +1

      Yeah, and I’m still trying to get one without obliterating my bank account lol

    • @clark85 1 year ago +3

      @@Malc180s oh, it's old in computer years, come on, enough excuses

    • @sabrevanson4412 1 year ago +18

      @@DKTronics70 the platform/socket, yes. The 5800X3D itself was released in March 2022... so not old.

  • @TheJovialBrit 4 months ago +3

    I placed an order for a custom-built PC today and these two will be my GPU and CPU, coupled with 32GB of DDR5-6000 RAM. Can't wait to get it!

  • @explosev6513 1 year ago +563

    Super happy to see the 5800X3D still very close in performance, gotta thank AMD for blessing us AM4 peeps.

    • @hypnotize107 1 year ago +32

      Bought mine yesterday because I didn't want to spend 300€ on an ITX motherboard again

    • @chadofwar6554 1 year ago +24

      got mine a month ago on sale and happy I did. It should last me a while. AM5 is still too much money

    • @BMC2 1 year ago +4

      Just got one today but forgot to flash the BIOS lol. Anyway, it's a great value CPU

    • @spacepioneer4070 1 year ago +10

      The 5800X3D gets beaten by 20-30 fps in some titles by the 7800X3D. I was shocked; I haven't seen a generational performance increase like this in a while. ruclips.net/video/TVyiyGGCGhE/видео.html

    • @ik1llpeeple4fun 1 year ago +9

      @@spacepioneer4070 what he meant was the 5800X3D is about the same as the 7800X3D in 4K gaming. The 7800X3D, according to the charts and the video you linked, only has a noticeable gap over the 5800X3D at 1080p. Even at 1440p the gap shrinks though, and at 4K the gap is negligible.

  • @spill1t
    @spill1t 1 year ago +240

    Glad you guys actually showed the cost savings related to the efficiency. I think that speaks volumes about how great the solution is, and if AMD spent more time making the top-end chips behave the way we'd expect, it would be very hard to recommend anything other than AMD.

    • @blodstainer
      @blodstainer 1 year ago

      Having the top CPU only matters because we consumers say so

    • @spinyjr
      @spinyjr 1 year ago

      @enriqueamaya3883 Too true 😂

    • @JBlNN
      @JBlNN 10 months ago

      That's the thing: AMD will never take risks or push development ahead. That's why Intel/Nvidia will always be the best.

    • @generalpluto369
      @generalpluto369 7 months ago

      @@JBlNN I would argue that they've taken a major risk already by challenging Intel when they were so small. Now they're much larger and growing fast. I've used both Intel and AMD before; I just got a new AMD CPU after using Intel for 7 years, and I'm excited to see the differences

  • @ShotGunner5609
    @ShotGunner5609 1 year ago +74

    That breakdown of expected power consumption was a very nice metric. Please continue adding that into reviews!

  • @EkoFrisch
    @EkoFrisch 4 months ago +6

    I just love when gamers have a simple choice of CPU.
    The 7800X3D is just great: it doesn't use a lot of watts, is fast, and is decently cheap.
    Just a perfect choice without thinking too much.

  • @Infigo96
    @Infigo96 1 year ago +29

    What amazes me the most is that power draw. My 5800X3D draws ~55 W, which is really nice in my quite small and compromised mATX build... they managed to cut 15 W from that, with more performance. Now I'm seriously considering an SFF build, since any low-profile cooler will do very well and the options open up a lot while still having a quiet machine.

  • @geoffreybassett6741
    @geoffreybassett6741 1 year ago +113

    Thank you for mentioning performance per watt. With how performant these chips are, it's now become a very important factor in my purchasing decisions.

    • @alphacompton
      @alphacompton 1 year ago +10

      Yeah, the performance-per-watt analysis is very good. I think he should also have mentioned how you wouldn't need as beefy a cooler for the 7800X3D vs other top-performing chips. That means a smaller and/or quieter PC.

    • @ThatGoat
      @ThatGoat 1 year ago +2

      Power companies. Continuously motivating us to make better choices. It's for the environment, m'kay?

    • @GrzegorzMikos
      @GrzegorzMikos 1 year ago +2

      I don't get why they even show it in a review of a gaming CPU; a $15 difference in a year is nothing. Isn't it normal in the USA to tip the delivery driver $5-7 on each order? So don't tip twice a year and you're done.

    • @mateuszsa7
      @mateuszsa7 1 year ago +3

      They still messed it up though :/
      No idle power consumption.

    • @GoobyDev
      @GoobyDev 1 year ago +1

      @@GrzegorzMikos It's certainly not nothing; every dollar counts. Also, what they showed is what I would consider very minimal use. If you're a heavy gamer or leave your PC on for longer periods, you could be saving triple-digit dollars with this chip as opposed to, say, a 13900K, which is already crazy when you consider that it's cheaper than the 13900K with similar or better gaming performance.
      Also, it's just good to note all the strengths and weaknesses of any product you review, like how this particular chip isn't a great investment for pure productivity.

  • @josephjoestar515
    @josephjoestar515 1 year ago +251

    I loved seeing the estimated power costs during ownership. Really made me think about it and realize that the extra cost of the 7800X3D could easily be justified (plus less heat during summers, or all year if you live in Florida like me).

    • @Duglum666
      @Duglum666 1 year ago +18

      The difference is even really small in the US... I was shocked how extremely cheap your power still is. In Germany we went up to about 65 US cents per kWh. The difference over a few years can easily be hundreds of euros.

    • @WouldDieForChomusuke
      @WouldDieForChomusuke 1 year ago +12

      @Duglum666 Honestly, I never thought about it that way. As of right now I still live with my parents (not a leech, only 16) and I just get the best thing I can afford. I totally didn't think power was a factor in cost, even though it adds up long term.

    • @MisterPikol
      @MisterPikol 1 year ago +1

      @@Duglum666 Where in Germany? That used to be the case, but now in Hesse it has dropped back to about 38 cents per kWh

    • @Masterrunescapeer
      @Masterrunescapeer 1 year ago +5

      @@Duglum666 The 65c was during the peak; it shouldn't hit that again, and most places are coming down quite a bit. I'm seeing contracts here in Austria in the low-20c range again. It basically doesn't matter long-term for home users (also energy subsidies).

    • @abaj006
      @abaj006 1 year ago +16

      The extra 100 W of cooling load on the aircon will easily add up over a year. Not only do you have to pay for the extra 100 W every hour, you also have to pay for your AC to get it out of your room. The 3D chips are a no-brainer, and fantastic value for money.
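
The back-of-the-envelope math in this thread is easy to check. A minimal sketch of the annual-cost arithmetic (the 100 W delta, 4 hours/day, and $0.42/kWh rate below are illustrative assumptions, not figures from the video):

```python
def annual_energy_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly electricity cost of an extra power draw, in the currency of price_per_kwh."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: a CPU that draws an extra 100 W, gamed on 4 hours a day, at $0.42/kWh
print(round(annual_energy_cost(100, 4, 0.42), 2))  # 61.32
```

If an air conditioner has to pump that same heat back out of the room, the real figure is higher still, depending on the AC's efficiency.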

  • @Hr1s7i
    @Hr1s7i 1 year ago +13

    This 3D cache advantage reminds me of how overpowered L3 cache was when it got introduced. I did a little test between an Athlon II X4 620 and a Phenom 720: parked both of them down to two cores at the same clock speed, and the L3 cache gave me up to about a third more fps, IIRC. It stands to reason that more of it is better as long as we keep improving it.

    • @andreewert6576
      @andreewert6576 1 year ago +1

      Also, that was then; L3 cache (or a larger cache vs. a smaller one) has a bigger impact later in a product's life cycle. That's why the Phenoms stayed relevant well into the Core i era, and it's (part of) why the Intel X58 platform is still kind of usable for gaming today, the other part being triple-channel memory.
      These X3D parts will have some absurd longevity: not in max FPS, but in min FPS/frametimes.

    • @Hr1s7i
      @Hr1s7i 1 year ago +1

      @@andreewert6576 Indeed. The minimum FPS, or 0.1% low, or whatever they call it, is something I use as a general guide when it comes to capping framerate. My current gear, for example, handles Deep Rock Galactic at 1440p at roughly 120-ish fps but dips down to 80-90 regularly, so I decided to cap it at 100. A 10-15 fps drop will not bother me, but 30-40 will be felt. Smooth gameplay is paramount for a chill experience, in my personal opinion.

    • @sparda9060
      @sparda9060 4 months ago

      @@Hr1s7i In any game that has a lot going on on screen at once, it's very desirable to keep FPS consistently high, even if you have to cap it to avoid frame dips, which makes for much smoother gameplay. With an X3D CPU the 1% lows are so high, like never seen before, which just makes the gaming experience butter smooth, and if you can tune it to be almost locked at a certain high FPS, that is golden...

  • @AlmightyJu2
    @AlmightyJu2 1 year ago +301

    I really love the energy price graphs! Would love to see this more often for GPUs too; as a Brit it's defo becoming more of a concern when looking at hardware, and nobody does real-world data on actual power usage.

    • @JohnDoe-hw8ge
      @JohnDoe-hw8ge 1 year ago +5

      I did hope to see the 13700K or 13900K there; it would have been quite shocking :O

    • @Psythik
      @Psythik 1 year ago +2

      Hey you stole my username

    • @AlmightyJu2
      @AlmightyJu2 1 year ago +3

      @@Psythik 😯 Psythiks unite! The important question is: has anyone called you "fish sticks" in voice chat because they don't know how to pronounce it?

    • @oxfordsparky
      @oxfordsparky 1 year ago

      Fellow Brit here and I completely disagree; the cost of electricity is beyond negligible. Even using their price of 42p per kWh, if £22 a year in extra running costs is enough to convince you, then you shouldn't be gaming on a PC.

    • @AlmightyJu2
      @AlmightyJu2 1 year ago +7

      @@oxfordsparky Sure, if you take a single component by itself it might be a small figure, but even then, if you're paying £10 a year for a 2 fps increase, is there a point in that extra cost? No. But without performance-per-watt info it's all speculation.

  • @thetraitor3852
    @thetraitor3852 1 year ago +374

    You should also try benchmarking CPUs on 4X games like Civ or Stellaris, or city builders, where they matter the most.

    • @Bunster
      @Bunster 1 year ago +31

      We asked for this on the other 3D CPU review; I guess they didn't care or didn't read it

    • @nicholaskiser6146
      @nicholaskiser6146 1 year ago +4

      Was thinking this myself watching the video. No clue whether the gaming-heavy or productivity-heavy CPUs work best for the video games that are productivity-heavy :( Would love to see 4X benchmarks

    • @LS-es8zv
      @LS-es8zv 1 year ago +2

      4X games don't care about 3D cache; the 13900K was already faster than the 5800X3D

    • @badjackjack9473
      @badjackjack9473 1 year ago +25

      I just wanna see a high-end CPU try to play Stellaris late-game on high speed.

    • @thetraitor3852
      @thetraitor3852 1 year ago +2

      @@badjackjack9473
      With AMD you will get exactly the same performance out of a 7600X as out of a 7950X. It will be a very similar story with Intel's i5 and i9 if it's a K-type CPU.

  • @FFXfever
    @FFXfever 1 year ago +106

    Another really cool thing about the 7800X3D being so efficient is what could happen if they decide to put it into mobile platforms. I can imagine a 20-watt desktop replacement for field work processing.

    • @sihamhamda47
      @sihamhamda47 1 year ago +12

      Can't wait for a Ryzen HX3D series mobile processor (if AMD wants to)

    • @PileOfEmptyTapes
      @PileOfEmptyTapes 1 year ago +3

      They would invariably have to stack a _monolithic_ die with 3D V-Cache first before it would be attractive for mobile use. Desktop Ryzen 7000 CPUs consume 20+ watts at idle, and almost none of that is due to the cores (which might take 0.6 W).

    • @loider3365
      @loider3365 1 year ago +3

      A 7800X3D on mobile would be hella efficient for gaming but very inefficient when idling, so battery life would still be poor

    • @deansmits006
      @deansmits006 1 year ago

      But would you see the advantages, since notebooks get lower-power GPUs? You can't generally upgrade the GPU in a notebook, so you are more likely to be GPU-bound and waste the CPU horsepower, right?

    • @XX-121
      @XX-121 1 year ago

      DESKTOP AND MOBILE CPUs ARE NOT THE SAME.
      Do you really think that if they could make the desktop part have that much performance at 20 W, they wouldn't? LOL!!!!! WTF!!? USE YOUR BRAIN!!!! The people in this comment thread are the same people on the Steam forums saying their games won't run... "buuut i has a I7" lol

  • @qm3ster
    @qm3ster 1 year ago +10

    Now all we need are consumer desktop dual-socket AM5 motherboards, so that you can put your interactive tasks on the 7800X3D and everything else on the 7950X.

  • @Workmusic1988
    @Workmusic1988 1 year ago +74

    I live in the UK. Those cost-of-ownership metrics really frigging matter. I've been waiting for this type of analysis for a while now and am so happy to finally see it. I would MUCH rather give up 5-10% performance for, in this case, a new component, to save significantly on cost of ownership. :)

    • @purplepenguin43
      @purplepenguin43 1 year ago +12

      This is going to be huge for me. I live in Alaska and energy is actually pretty cheap because of hydro, but nobody has A/C, so my room gets toasty quickly in summer with a 300+ watt system running for long sessions. I was thinking of getting a 13900K, but that would literally have doubled the wattage output into my room. This new chip runs at 100 watts, which is the same as or less than the ancient AMD FX 8-core chip I have now. Holy cow, I'm going to be going from 32 nm to 5 nm tech; this is going to be a massive jump. I'm holding off, though, for at least one reviewer to test it in Arma 3 and DCS and compare it to Intel, as those are my most-played games. Especially in Arma 3, Intel has always had a huge lead over AMD just because of optimization and single-thread reasons, but that doesn't matter if it comes at a 200+ watt extra cost.

    • @oll13
      @oll13 1 year ago +5

      Definitely relevant these days. Features like Radeon Chill, which limits your FPS while you're idling in games, can actually save you very noticeable sums of money, as can buying efficient hardware. I miss the days where you didn't have to consider these things, but here we are..!

    • @th3_g3ntl3man
      @th3_g3ntl3man 1 year ago +9

      The 7800X3D is a no-brainer in the UK. The price difference will pay for itself in lower energy costs.

  • @CreativeMindsAudio
    @CreativeMindsAudio 1 year ago +5

    This is why I wait for an entire line of products to be released/reviewed before deciding on a purchase. I'm on a 5800X right now. No need to upgrade my PC other than the video card, but it's nice to know where things are headed. I built it in 2021, so it has a lot of life left in it.

  • @BrunoRibeiro-po2bv
    @BrunoRibeiro-po2bv 1 year ago +121

    The way Linus completely obliterated Paulo Coelho 🐰 was something to behold

    • @manuntvg01
      @manuntvg01 1 year ago +9

      Indeed, I cringed a bit

    • @1ZeeeN
      @1ZeeeN 1 year ago +29

      As a Portuguese speaker, I paused the video and went back just to hear it again hahaha. Linus destroyed the pronunciation hahaha
      Of course not his fault... but it was funny! 🤣

    • @gruiadevil
      @gruiadevil 1 year ago +5

      @@1ZeeeN I'm Romanian, and even I cringed hearing his pronunciation.

    • @rslanna
      @rslanna 1 year ago +2

      indeed

    • @bible1944
      @bible1944 1 year ago +1

      @@1ZeeeN same xD

  • @moonmode3232
    @moonmode3232 1 year ago +50

    I've been waiting years for the PC to finally get efficient CPUs. The 7800X3D seems like the real deal. Nice job AMD!
    I just don't understand how Intel has had a few years to update their CPUs after Apple released their super-efficient M chips, but has done nothing for efficiency.

    • @MarkLikesCoffee860
      @MarkLikesCoffee860 1 year ago +9

      Intel has a 62.8% market share, so they don't need efficient chips. 62.8% of the world loves Intel and keeps buying Intel. AMD only has a 35.2% market share, so they need to work much harder than Intel and produce better products if they want to grow beyond 35.2%. The CPU company with the bigger market share will always be complacent.

    • @shringe9769
      @shringe9769 1 year ago +8

      Apple uses an entirely different CPU architecture (Apple is ARM, while AMD/Intel are x86); x86 chips will probably never come close to ARM. ARM has many of its own issues relative to x86, so ARM releasing into the mainstream desktop market is not happening anytime soon. Comparing ARM efficiency to x86 desktop chips is not a fair comparison by any means.

  • @mtbSTL
    @mtbSTL 1 year ago +116

    Idk what it does, but Escape from Tarkov seems to heavily benefit from the increased cache in the 5800X3D, and the whole community went crazy over the massive performance increases over any other chip on the market, even with older 10-series cards. I can only imagine how big a jump this is in games like that

    • @murd4638
      @murd4638 1 year ago +1

      Oh boy, we're not ready...

    • @Odeezee
      @Odeezee 1 year ago +13

      Same thing for Star Citizen; it loves X3D chips. Glad AMD are coming through for us again!

    • @frale_2392
      @frale_2392 1 year ago +4

      AI computations benefit the most from having a bigger cache (see how massive the gains are in the graphs for Warhammer 3). I've never played Tarkov, but I guess there is some heavy and possibly unoptimized AI logic running in the background.

    • @murd4638
      @murd4638 1 year ago +8

      @@frale_2392 You can play Tarkov at 4K with this chip in good quality. I'm already running Tarkov at 4K high, FSR on ultra quality, with a 5800X3D, and I have a stable 130 fps. The new 7800X3D is gonna be a monster too.
      Clueless... Tarkov is dead because of the cheaters

    • @rkan2
      @rkan2 1 year ago +6

      In MSFS it doesn't really matter which of the 40-series GPUs released so far you run, but from the X3D you get 10-20% more fps lol. So instead of 50 you get 55-60, which is a lot.

  • @FatherLamb
    @FatherLamb 1 year ago +10

    The only thing I learned from this video today is that it's "segue" @0:53 and not "Segway".

  • @joshuagray6061
    @joshuagray6061 1 year ago +73

    The performance of basically every mid-to-high-end CPU is so good that the choice of CPU almost doesn't matter for most consumers. For the more hardcore audience, it's best to pick one that is tailored to your preferred games. You really can't go wrong, though.

    • @makishae9811
      @makishae9811 1 year ago +3

      As someone who really wants the stuff they buy for their PC to last, I'm always looking to get the absolute best I can for the price.

    • @Kevin-sg5xc
      @Kevin-sg5xc 1 year ago +2

      this is the real answer

    • @ChrisWijtmans
      @ChrisWijtmans 1 year ago

      It really depends on what you are looking for.

    • @GiorgosKoukoubagia
      @GiorgosKoukoubagia 1 year ago

      The performance is good today for most of them, indeed. But as Linus and others have said, buying a better CPU today helps you avoid having to upgrade it (along with the mobo and potentially RAM) when you upgrade your GPU in the future, especially with higher-end GPUs.

    • @joshuagray6061
      @joshuagray6061 1 year ago

      @@GiorgosKoukoubagia Yep, I'd argue AM5 itself is a good investment, regardless of the CPU you choose. It should be supported for many years to come. That's why I bought a Godlike board and a 7950X3D. Pricey, but I'm set for a long time.

  • @myravied7965
    @myravied7965 4 months ago +2

    Running a 7800X3D for now, and at the desktop it uses just 35-38 watts (and it's a gaming PC). The GPU takes around 26 W; it's a 4070, also undervolted to the max. So around 61 W total power usage at the desktop, which competes even with some gaming laptops now!!!
    PS: Using a Samsung 990 Pro 2 TB and DDR5-5600.
    Got this PC just 2 weeks ago for $1,200 used and I'm happy!

  • @parasitex
    @parasitex 1 year ago +33

    I think the reason the 7900X3D performed so badly with dual CCDs in Factorio is that Game Mode probably didn't kick in for the benchmark, so the CPU probably didn't recognize it as a game, which otherwise would have parked the non-V-cache CCD.

    • @PQED
      @PQED 1 year ago +3

      No, it's because it has 2 CCDs, each with 6 cores. So when one CCD is shut down in favor of the cache, it essentially becomes a 6-core CPU with V-Cache.

    • @parasitex
      @parasitex 1 year ago +2

      @@PQED Well, yes. But to determine when it should switch to the V-cache-only CCD, it relies on Windows Game Mode to determine if a game is running. If Game Mode doesn't trigger, it won't know which CCD to prioritize. And my guess is that the Factorio benchmark doesn't trigger Game Mode, as the benchmark doesn't seem to run like the game normally does.

    • @PQED
      @PQED 1 year ago +3

      @@parasitex I'm aware of how unreliable (and incompatible) Game Bar is, and that's why it's ridiculous to employ it in this fashion.

  • @watercannonscollaboration2281
    @watercannonscollaboration2281 1 year ago +35

    7:08 I love how Adam's in this bit, because back in the first AMD Extreme Tech Upgrade video he really wanted a 3D V-Cache CPU but missed out on the timing

    • @djorgs
      @djorgs 1 year ago +1

      I will be involuntarily dreaming of Rabid Adam for years to come.

  • @TheMongolPrime
    @TheMongolPrime 1 year ago +18

    I really loved seeing the pricing for running the chips at California prices. I don't live there, but I super appreciate the graphs, as I find that incredibly important when choosing between options, instead of saying "well, I have the money, so I might as well get the more expensive chip... right?"

    • @tad2021
      @tad2021 1 year ago

      A 7950X3D costs around 300-350 millirentmonths.

  • @jimirayvaughan5767
    @jimirayvaughan5767 1 year ago +1

    The "Bar Nun" graphic/pun was so good I don't even need to watch the rest of the video.

  • @xuerian2511
    @xuerian2511 1 year ago +17

    Glad to see power efficiency and TCO making it into these calculations.

    • @TippyHippy
      @TippyHippy 1 year ago +1

      I put my hamster in a sock and slammed it against the furniture.

  • @SethanderWald
    @SethanderWald 1 year ago +17

    Man, it's crazy how basically every graph was showing triple-digit fps across the board... Doesn't seem that long ago that these kinds of graphs were in the 60-90 fps range. 😅

  • @niravramdarie9898
    @niravramdarie9898 1 year ago +37

    I never thought I would see a 5800X3D outperform a 13900K in any game 😮

    • @airBornFpv
      @airBornFpv 1 year ago +3

      Now consider how much less power it draws = how much less heat it puts into your cooling system = how much more headroom your graphics card will have (you may even be able to get more from the video card due to better thermals!)

    • @niravramdarie9898
      @niravramdarie9898 1 year ago +1

      @@airBornFpv I know. It's painful to watch.

    • @jamesmackinlay4477
      @jamesmackinlay4477 1 year ago +1

      That is of course gaming; in other PC workloads the 13900K will do better. But the reality is that with the difference in power draw and overall real-world performance, for most everyone the AMD is still the better pick.

    • @airBornFpv
      @airBornFpv 1 year ago

      @@jamesmackinlay4477 Well yeah, hence "the best gaming CPU" 😄😘

  • @brownhorn111
    @brownhorn111 5 months ago +8

    'Bout to upgrade from my old trusty Ryzen 5 3600 to this "little boy" of a nuke of a CPU. I am financially responsible, I swear

    • @giffya
      @giffya 5 months ago +3

      Did you do it? I'm in the same position with the 3600; he's getting a little worn out these days

    • @brownhorn111
      @brownhorn111 5 months ago +3

      @@giffya Yes, I did it. It is a wonderful beast, a significant upgrade. Sadly, though, you need a whole new mobo and RAM :/

    • @jermasus
      @jermasus 5 months ago

      Might be worth waiting until Ryzen 9000, when the 7800X3D drops in price?

    • @mashedpotatoes5323
      @mashedpotatoes5323 5 months ago +2

      I just upgraded to this from an i7-6800K and it's definitely significantly better. I paid around $530 for a CPU, motherboard, and RAM combo from Micro Center.

  • @reg3dit20
    @reg3dit20 1 year ago +21

    Great contextual content; keep calling out all the companies in the space for their "smoke and mirrors" PR BS. Good, clean data and info ❤

  • @yobson
    @yobson 1 year ago +27

    Would love to see some software improvements that would better assign the cores on the 7950X3D to appropriate workloads to close the gap

    • @SCYN0
      @SCYN0 1 year ago +1

      You can deactivate the cores manually, but then you've still paid 200% more for the same chip

    • @achillesa5894
      @achillesa5894 1 year ago +3

      The 7950X3D basically becomes a 7800X3D when you turn off half the cores in the BIOS. Hopefully they fix it with software updates so it always behaves like that without turning half of it off.

    • @pennypenguin15
      @pennypenguin15 1 year ago

      You mean processor affinity..? Because that's been a native Windows feature for some time now.

    • @johnsteves9158
      @johnsteves9158 1 year ago

      Interesting
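
Manually pinning a game to the V-cache CCD is the DIY version of what the scheduler is supposed to do. A minimal standard-library sketch (Linux only; the assumption that the first 16 logical CPUs map to the V-cache CCD on a 7950X3D is illustrative and worth verifying on your own system, e.g. with lscpu):

```python
import os
import sys

def vcache_cpus(threads_on_ccd0: int = 16) -> set:
    """Logical CPUs assumed to live on CCD0 (the V-cache die on a 7950X3D)."""
    return set(range(threads_on_ccd0))

if sys.platform == "linux":
    # Restrict the current process (and anything it launches) to the V-cache CCD,
    # intersected with the CPUs actually present so this also runs on smaller chips.
    target = vcache_cpus() & os.sched_getaffinity(0)
    if target:
        os.sched_setaffinity(0, target)
    print(sorted(os.sched_getaffinity(0)))
```

On Windows, the equivalent is the "Set affinity" entry in Task Manager's process context menu, which is the native feature mentioned above.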

  • @korvish111
    @korvish111 1 year ago +48

    100% agree about the delayed launch of a better, cheaper product.
    If there is a defence, it's this: the 7950X3D performing worse seems to be a software issue (firmware, drivers, or OS). It should be the best of both worlds rather than the worst of both worlds.
    I'm sure the original expectation was best of both worlds, and they didn't adjust their plan when it didn't pan out that way. Maybe there will be firmware updates in the future.

    • @TheMessiah1337
      @TheMessiah1337 1 year ago +2

      Yeah, really hoping they improve the scheduler or the parking somehow for the people who have the 7950X3D and want to use it for production as well as gaming.

    • @achillesa5894
      @achillesa5894 1 year ago +2

      Yeah, if they get the scheduler perfect it will be the ideal CPU for streamers or people who game and work on the same PC

    • @TheMessiah1337
      @TheMessiah1337 1 year ago +1

      @@achillesa5894 Yeah, currently there are scheduling issues with some games I play, where the CPU underutilizes the 3D cache by about 30-50%, so it can be like a 20-40 fps loss, which really sucks -.-

    • @griffin1366
      @griffin1366 1 year ago +6

      They delayed it to sell people more 7950X3Ds, as they knew that the 7800X3D would be *THE CHIP* to go to for most people.
      The 7950X3D relies on Windows Game Bar to schedule itself, so good luck...

    • @TheMessiah1337
      @TheMessiah1337 1 year ago

      @@griffin1366 I know how it works lol, I've got one. A lot of people chose the 7950X3D for productivity and gaming purposes....

  • @user-ym4nc6sn8b
    @user-ym4nc6sn8b 1 year ago +740

    70 bucks for a screwdriver???

    • @AlexGames-jm6ik
      @AlexGames-jm6ik 8 months ago +128

      It's a really nice screwdriver

    • @codykyle511
      @codykyle511 8 months ago +137

      Still, no one needs a $70 screwdriver

    • @jonathangwahlstedt
      @jonathangwahlstedt 7 months ago +43

      @@codykyle511 What's the point of your comment then, besides giving something you think no one wants or needs more attention?

    • @justaguywithapowerpole
      @justaguywithapowerpole 7 months ago +1

      @@codykyle511 he's rich, sooo

    • @williamkelley1971
      @williamkelley1971 7 months ago +28

      Bought one for my dad this year; I'd say it's well worth it. Buy the screwdriver and a couple of bit sets and you'll never need another screwdriver or bit set in your life.

  • @phx4669
    @phx4669 1 year ago +8

    I finally jumped in when the 5800X3D hit $319, and I had points on Prime, so it only cost me points. I was already running a 5900X; I popped in the 3D chip and gained 20 fps in the same game at the same settings, running at 1.16 V, and switched to an air cooler. The X3D is surely efficient and cool as long as you run PBO2.

  • @StaySic4Ever
    @StaySic4Ever 1 year ago +12

    It's a straight-up beast. Exactly what I expected; really a BiS chip, and worth the wait for the AM5 platform, especially after some time for boards and memory to get cheaper too.
    Definitely amazing in certain games that use the extra cache to its max potential, like WoW and others. Very nice!

  • @astrea555
    @astrea555 1 year ago +8

    I'm buying that for emulation, to avoid CPU bottlenecks for years after I buy it. And also for the upgrade path; I'm planning to keep this AM5 motherboard for years.

    • @Benefits
      @Benefits 1 year ago

      I could be wrong, but I was always under the impression that emulation performance relied on high clock speeds as opposed to tons of L3 cache. If that's the case, X3D chips wouldn't be ideal for emulators, since their clock speeds are significantly lower than their normal X counterparts. Someone correct me if I'm wrong though!

    • @shinyhappyrem8728
      @shinyhappyrem8728 1 year ago

      @@Benefits: It's always about what the program is doing. A SNES game (plus the structures required for emulation) will probably fit easily into the L3 cache, but a PS2 game won't.

  • @hansangb
    @hansangb 1 year ago +4

    I wish LTT would test VR performance. Typically, clock speed is really important there.

  • @Hoigwai
    @Hoigwai 1 year ago +11

    I love my 5800X3D. I will be staying with it for quite a while. What I feel playing on it now is the reduction or removal of a lot of micro-stutters. Load hitching doesn't happen like it used to, either.
    I do feel it in Cyberpunk 2077; it's not in the framerate, it's in the input response. Turning a corner in a vehicle at 100+ MPH on my old CPU would feel sluggish, and those little hitches as it tried to keep up with the GPU are gone.
    Yes, I am a 1080p player.

  • @WilliamOwyong
    @WilliamOwyong 1 year ago +6

    Well done on showing running costs. That has always been a factor in pricing my PCs. The excuse "rich people don't care about the bills" wears thin after a while (it doesn't apply to most of us), so seeing those metrics is refreshing.

  • @MrLagzy
    @MrLagzy 1 year ago +26

    Having the 5800X3D and now a 7900 XTX, I'm set for a couple of years. I might end up skipping the entire AM5 generation and seeing, when AM6 comes out, what both AMD and Intel have at that point. Right now I don't need an upgrade, not for the next 5-6 years. Still, it's astonishing the new implementations and innovations that AMD and Intel come out with. I wonder when AMD will also adopt a big.LITTLE structure and add some efficiency cores, or if they'll just go straight to 3D V-Cache on all their future CPUs to make them as efficient as possible. They have already won that game.

    • @retovath
      @retovath 1 year ago +7

      They are planning big.LITTLE for Zen 5. Essentially, their plan is to use Zen 4 with AVX-512 and some other SIMD instructions removed and call it Zen 4 Dense. This shrinks core size and interconnect size, enabling low-power Zen 4 compute while leaving full-fat Zen 5 to do the high-power, high-speed stuff.

    • @gruiadevil
      @gruiadevil 1 year ago +4

      They already tried a big.LITTLE chip.
      It's just for laptop/mobile though (2 performance + 4 efficiency cores). That's where efficiency matters.
      On desktop, all the efficiency cores are parked and used only for production tasks.

    • @NahBNah
      @NahBNah 1 year ago

      Also, they are going to add 3D cache onto their GPUs, I hear.

    • @Rippedyanu1
      @Rippedyanu1 1 year ago +4

      Same setup as you: 5800X3D on a B550 Taichi board with a 7900 XTX Taichi. The rig is amazing at 4K and not too insane on power draw, so it's perfect for me for the time being. I don't see a need to upgrade in the coming years at all. I do plan to use AM5 for a home NAS build though, with a B650 ITX board and a Jonsbo N1 case. I'll use one of the non-X CPUs because of how power-efficient they are.

    • @ChrisWijtmans
      @ChrisWijtmans 1 year ago

      Personally, I hope this big.LITTLE fad goes away. It makes the schedulers too complex.

  • @K1ngFoxGaming731
    @K1ngFoxGaming731 1 month ago +1

    I just got my 7800X3D today, and oh my god, what an upgrade from my 7700

  • @honaker326
    @honaker326 1 year ago +8

    I upgraded from my 2600X to a 5800X3D with a BIOS update for $299. Sometimes the best medicine is to wait for the best of last gen to go on sale lol.

    • @MindShackleFilms
      @MindShackleFilms 1 year ago

      I did the exact same upgrade last month. My GPU is a 5600 XT on a 1080p monitor, so the gains were minimal. Bottleneck Calculator says the 5800X3D is a bottleneck for a 4070 Ti or higher... not sure what GPU I should upgrade to. I'll get a 1440p monitor as well. Wait for the 4070?

  • @Targetlockon
    @Targetlockon 1 year ago +11

    This is the CPU we were anticipating, waiting to see the results after seeing the 7950X3D and knowing how the 5800X3D performs

  • @Deathsead747
    @Deathsead747 1 year ago +9

    I honestly think that long-term reviews of these chips, and future forecasting in terms of energy consumption and compatibility, need to be an integral part of these reviews. Most people aren't building a new rig every year.

  • @0bzen22
    @0bzen22 Год назад +3

    Just did my upgrade.
    7800X3D, B650-Plus, 32GB DDR5 6000, and a 6950XT (should have made the jump to 7900XT but whatever). Now, that's component prices I wasn't prepared for, but coming from a i7-6700K and 1080, it was about time, although the old i7 and 1080 still hang.

    • @N.K.---
      @N.K.--- Год назад

      Damn bro great upgrade

    • @razgaror
      @razgaror Год назад

      I got literally the same upgrade from my old system to the new one, besides the new GPU. How does it feel compared to your old rig?

    • @0bzen22
      @0bzen22 Год назад +1

      ​@@razgarorIn general, not much has changed. Gaming is a leap forward, obviously, especially in newer titles like Darktide that gobble VRAM and CPU.
      Mainly because I use a 3440x1440 34'' monitor, the 1080 was getting destroyed, then the CPU lagged behind. I was getting 40-60% GPU utilisation with the 6950XT, so that needed to change.
      In some games, the old stuff worked perfectly well (Back 4 Blood, older titles...), with some visual reductions. I think if you game in 1080p, you can probably get away with a system as old as mine for a while. Less VRAM, easier on the GPU... a cheaper GPU, like a 6700, as long as you have a decent amount of VRAM, imo.

    • @0bzen22
      @0bzen22 Год назад

      Oh and I've switched to two NVMe drives (1TB, 4TB). But really, I had a 500GB SSD, 2TB NVMe, and a 2TB HDD, and yes, it's faster, but diminishing returns.
      All in all, I could have kept the old system. If you're short on cash, second hand will save you a lot because all that shit is expensive. And as I said, my old PC still works fine and could still game OK. Besides, not many exciting things on the horizon. It's a bad time for gamers.

    • @Oxley016
      @Oxley016 3 месяца назад

      @@0bzen22What kind of cooler are you running on the CPU now and what are the temps like? Looking to get a 7800x3d myself!

  • @AustnTok
    @AustnTok Год назад +53

    I'm extremely happy with my 7900x. I initially wanted to wait for the 7900X3D, but I do a lot of recording and streaming while gaming, and I plan on keeping my 7900x instead of upgrading.

    • @Minimal_M
      @Minimal_M Год назад +3

      Good choice

    • @cvbattum
      @cvbattum Год назад +8

      For game _developers_ the 3D variants don't make a whole lot of sense either. We're getting top-tier gaming performance, only just shy of the best, but a massive improvement in the productivity department. Compiling, loading, and editing take up a whole lot of time in my daily schedule, and I'd gladly give up even 15% of fps 5 years down the line for the massive gain in productivity.

    • @AustnTok
      @AustnTok Год назад +3

      @@gozutheDJ yeah. I was using the AV1 encoder with my 4080 but there isn’t a whole lot of support for AV1 encoded videos yet. So I use Nvidia NVENC H.264 but I’ve been wondering about trying out x264

    • @rkan2
      @rkan2 Год назад

      You would probably be lucky to get the 7800x3D in 6 months lol - so it is not necessarily a bad choice.

    • @AustnTok
      @AustnTok Год назад

      @@gozutheDJ is CRF similar to CQ level? I have CQ set to 14.

  • @sirchaps8489
    @sirchaps8489 Год назад +27

    Had a 2700X on my AM4 board, now rocking the 5800X3D. And 32GB of 3200MHz RAM.
    Thank you AMD. 🙏

  • @asldfjkalsdfjasdf
    @asldfjkalsdfjasdf Год назад +23

    The power consumption story is missing idle and low-load values like web browsing or watching video.
    If the 3D chips consume more power at idle or in low-performance tasks, that might also be of interest for those who use their computer for extended periods with only small bursts of power.

    • @ShersGarage
      @ShersGarage Год назад

      How much more? 50w more? If primary use for the PC is browsing, x3d CPU is a wrong tool for the task.

    • @asldfjkalsdfjasdf
      @asldfjkalsdfjasdf Год назад

      @@ShersGarage Sure, but many only need the CPU power in short bursts. And why not get an X3D CPU if you like to game on the same computer as well?
      I found this video which gives a better look at efficiency: ruclips.net/video/JHWxAdKK4Xg/видео.html

    • @ShersGarage
      @ShersGarage Год назад

      @@asldfjkalsdfjasdf doesn't look like there is that much difference in the power consumption. Maybe 5GBP difference? My take is, if you plan on using the PC mostly for gaming, get the X3D. Otherwise it is a waste of money. In some cases non-X3D CPUs are just as good or better than their X3D counterpart.

  • @0Turbox
    @0Turbox Год назад +4

    Dude is a great salesman. You stare at a graph that shows barely any differences, and he still manages to use words like "win" and "lead" to shine some light on single products. No one can convince me that you will notice 10% gains at 200+ FPS.

    • @baboka2616
      @baboka2616 4 месяца назад

      True, but when you are buying a new cpu why not spend 50€ more for a much better cpu?

    • @0Turbox
      @0Turbox 4 месяца назад

      @@baboka2616 50? I got my 7600x for 195,00 and the 7800X3D was around 350,00.

  • @johnhughes6847
    @johnhughes6847 Год назад +20

    Nice work, Team LTT! I especially appreciated the efficiency cost/kWh comparison. This is a significant factor in the northeast US where it's over $0.42/kWh.

  • @jpierce1987
    @jpierce1987 Год назад +26

    I'm running the 5800X3D with a 4090 @4K and I'm very happy with it! Definitely going to let me get a few more years out of my AM4 setup!

    • @Hugosanches594
      @Hugosanches594 Год назад

      You have the 100% of gpu usage with dlss quality or have a little bottleneck like 1440p?

    • @randomguy-
      @randomguy- Год назад +1

      @@Hugosanches594 I have the "same" setup and at 4K 120Hz there are very few games where the 5800X3D will bottleneck the 4090. I see some performance loss in Witcher 3 and in Hogwarts, mainly because of low CPU utilization/bad game optimization. These games would have benefitted from faster core speeds than the X3D chips have.
      In Hogwarts a quick .exe tweak mostly fixed it, and in both games DLSS/FG made it pretty much a non-issue.
      (Running around in Skellige in Witcher 3 I had to turn off Frame Generation for a while though, because of some bug.)
      In 1440p there are several games where you will get a CPU bottleneck, but that is true for all CPUs currently available when you have a 4090.

    • @andrewt.5567
      @andrewt.5567 Год назад +2

      @@randomguy- Hogwarts is pretty new, so it may get some optimization with updates.

    • @thenonexistinghero
      @thenonexistinghero Год назад +2

      I don't think anything for you can last a few years when you upgrade every 1-2 years.

    • @cuatrokoop32v
      @cuatrokoop32v Год назад

      Yeah, I'm switching all my 3600's to 5600X's, and making sure I have at least B550 boards. I just got an ITX B550 last week to replace the ITX Fatal1ty 450, and the last 5600X arrived on Friday. I'll be able to recoup probably 75% of my expenditure.

  • @admistroy
    @admistroy Год назад +8

    For the performance graphs, I would love for them to be color-coded, so the product being reviewed has a different color to differentiate it from the others.

    • @Tamdrik
      @Tamdrik Год назад

      I'd also like them to highlight the product currently under review to easily pick out its position.

  • @Yoshi_206
    @Yoshi_206 Год назад +13

    I'm all for holding companies' feet to the fire. But the pushback over AMD launching the 7950X3D and 7900X3D 5 weeks before the 7800X3D is just too much for me. It was announced at the same time and included the MSRP, so everyone knew they only had to wait a little more than a month and how much they would save. Those who need the bleeding edge or can't control themselves really just need to look in the mirror if they have regrets today. All major review sites (at least the ones I saw) said they expected the same performance with the 7800X3D and to wait to buy.

    • @KH-cs7sj
      @KH-cs7sj Год назад

      Four months later it wouldn’t matter. I just bought a 7950x3d and got a free motherboard. It’s equivalent to 7800x3d plus a motherboard, only with 8 more cores.

    • @KH-cs7sj
      @KH-cs7sj Год назад

      @@Cooe. with the latest chipset driver the non-vcache core parking is automatic.

  • @Rikorage
    @Rikorage Год назад +33

    Your inclusion of the energy efficiency over time makes me really happy that you invested in that lab. Power draw is one of the main reasons I refuse to get any current-gen NVIDIA product, even though I really really really wanted a 3070 Ti. Will most likely buy an Arc A770 when that's on sale.

    • @r3mxd
      @r3mxd Год назад

      lmfaoo magine caring about an xtra couple pennies in ur bill this mf boycotts nvidia 😂😂😂😂 go hug a tree while i enjoy superior fps

    • @CALiiGeddonTwo
      @CALiiGeddonTwo Год назад +7

      Do you happen to know that Nvidias 40 series cards use less power than AMDs 7000 series cards?

    • @babagandu
      @babagandu Год назад +1

      nVidia 👑

    • @decus9544
      @decus9544 Год назад +2

      The 4070 (released after your comment I realise) is very power efficient. I'm mid-life upgrading my current computer with it and a 5800x3D.

    • @RiceWitch-dingus-400
      @RiceWitch-dingus-400 Год назад

      4070 sips power for how good it is

  • @danielet8172
    @danielet8172 Год назад +8

    It would add value to this video (and I know, more time to test things) to see the difference in gaming performance between the 7800X3D and 7950X3D while streaming at 1080p with a reasonable number of programs open for streaming (Elgato Stream Deck, XLR, RGB controller, etc).
    I think it is a pretty common environment nowadays, and it would be useful to see if the extra money for the 7950X3D is a better deal than building an entire PC dedicated to streaming the video output from the gaming PC (which it most probably is), or if the difference between the CPUs is negligible.

    • @pendeltonthecook
      @pendeltonthecook Год назад +1

      Remember that the drivers effectively shut off the other 8 cores on the 7950X3D while gaming, so it would be purely clock speed making the difference, and I can't imagine the extra few hundred MHz would be worth the extra cost for that application.

  • @Eli-zb2yj
    @Eli-zb2yj Год назад +206

    The X in the name= expensive

    • @thatguy3000
      @thatguy3000 Год назад +7

      Nah, it'll go back down after a while.

    • @Jibisgig
      @Jibisgig Год назад +2

      Tru

    • @sauravpradhan7881
      @sauravpradhan7881 Год назад

      Xdd

    • @RandomDude1999L
      @RandomDude1999L Год назад +2

      What about Xbox series X 0_0

    • @joemarais7683
      @joemarais7683 Год назад +1

      @@thatguy3000 I doubt it this time. These are going to be paper CPUs, with AMD putting the CCDs into the Ryzen 9s first.

  • @MrLegendL2118
    @MrLegendL2118 3 месяца назад +2

    Literally just got the Ryzen 7 7800x3d from microcenter with a bundle deal for $267.12 ❤‍🔥

    • @gtALIEN
      @gtALIEN 3 месяца назад +1

      so lucky

  • @Dj-Mccullough
    @Dj-Mccullough Год назад +10

    Currently happy with my 5800x3d and 6800xt for 1440p. while this cpu is obviously faster, I think ill wait things out to the 8000 series.

  • @Alovon
    @Alovon Год назад +10

    Honestly I would like to see benchmarks in CPU-intensive ports like Witcher 3's updated version (with the caveat that it was tested as of this point and CDPR may eventually fix it).

  • @namuzed
    @namuzed Год назад +9

    I wish you'd include the 5900x or 5950x, just for some perspective for those of us running a fairly recent-ish AMD CPU.

    • @ji3200
      @ji3200 Год назад

      There's a 5800x3d vs 5900x benchmark video on YouTube. Search it up, brother.

  • @CoolSilver
    @CoolSilver Год назад +5

    I'm still rocking an i7 5820K, a 6-core processor that you guys hyped as one of the best long-term choices even though there were faster chips that were still quad core.

  • @VanethReaper
    @VanethReaper Год назад +58

    Surprised to see how well my 5800X3D holds its own still next to these newer chips, and it most definitely isn't slowing down my 4080 so I guess I'll be holding onto it till the next CPU generation 😄

    • @Forke13
      @Forke13 Год назад +24

      It's not even a year old, bud

    • @Petar98SRB
      @Petar98SRB Год назад

      What are your temperatures? Mine goes to like 90 and I don't like that...

    • @megadeth8592
      @megadeth8592 Год назад +1

      @@Forke13 its not current gen. no one thinks its old ffs

    • @spacer125
      @spacer125 Год назад +1

      @@Petar98SRB my 5800x3d is super inconsistent. it idles around 39-44c and during gaming it likes to sit around 50-60, and jumps around 70c regularly before going back down to 60ish c.
      and cinebench makes it go to like 84-85.
      You may want to check your thermal paste application/mounting, or reexamine your current cooling solution.

    • @VanethReaper
      @VanethReaper Год назад

      @@Petar98SRB X3D chips do get somewhat warm yes, but 90 is definitely on the high end. Gaming temps average 60-65c for me, cinebench load caps around 80-83c usually. I would definitely double check your thermal paste application and cooler mounting pressure, or possibly your case airflow setup.

  • @Cybey
    @Cybey Год назад +12

    The main takeaway from this video is that the word is not spelled as Segway but as Segue

  • @drakors
    @drakors Год назад +100

    47 years to see Linus quote Paulo Coelho with a ''perfect'' accent hahaha, very good!

  • @Max15691
    @Max15691 Год назад +37

    The problem with the 7900X3D and 7950X3D is that they simply need Microsoft to fix Windows so that it runs games on the 3D-cache CCD, much like it had to do with Intel's P cores vs E cores.

    • @FLMKane
      @FLMKane Год назад +2

      Windows fix it's scheduler and memory management?
      BAAHAHAHAHAHAHAAAAA😂

    • @Max15691
      @Max15691 Год назад +1

      @@FLMKane Yeah, a really silly request 🤣

    • @cyclechris6591
      @cyclechris6591 Год назад

      I'm a 5950X owner, but don't you just point Microsoft's Xbox Game Bar to treat a program as a game, and then the game runs on the cached chiplet?

    • @Warhorse469
      @Warhorse469 Год назад

      It's been 2 months and it still isn't fixed; safe to say it will never get fixed.

  • @awsm253
    @awsm253 Год назад +15

    "Raises some big questions like-
    What the F***?"
    Exactly my reaction when I saw the 7800x3D bullying literally everything in most gaming scenarios. I've gone through numerous reviewers and seen the same results. Amazing.

  • @FairlySadPanda
    @FairlySadPanda Год назад +11

    If you heavily play a Unity game (say, VRChat or Escape From Tarkov) then the extra cache of these chips vs Intel's is gigantic. Unity hits cache a lot.

    • @Dyils
      @Dyils Год назад +3

      What makes you think it's unity? A sample size of 2? No, cache is good for anything that isn't optimized well.

    • @FairlySadPanda
      @FairlySadPanda Год назад

      @@Dyils True! There's not been a huge amount of Unity-focussed testing of these. VRChat is a good general stress test of Unity on BIRP though given you can throw a very wide variety of content into it and really stress the engine out in high-load instances. The hypothesis is that it's a Unity quirk, something that has been mentioned by VRC devs, due to the reality of Unity games essentially being injected assets and scripts into a core engine that can only do limited build optimization compared to bespoke engines or Unreal. I think the same finding was made by Tarkov players: see here ruclips.net/video/Gi8X99f70D8/видео.html
      To clarify: it's good for lots of games, but I think Unity games in particular are friendly towards high levels of cache.

    • @smaxfpv1337
      @smaxfpv1337 Год назад

      Great to know as a VRchatter looking for a new system! Didn’t know that

    • @TheUnlocked
      @TheUnlocked Год назад

      ​@@Dyils Cache is good for anything that needs to access the same area of memory very frequently. It has nothing to do with poor optimization.

    • @califorupha1574
      @califorupha1574 Год назад

      I have the 7950X3D and play VRChat. I'm not sure if it's worth downgrading to the 7800X3D for the simplicity of being on one CCD; can someone help me?

  • @hubertnnn
    @hubertnnn Год назад +25

    I think improving the 7950X3D is just a matter of software.
    All AMD needs is a better scheduler, and looking at the numbers, this kind of stacked cache is going to be the future.
    So let's wait a few months; a driver update could improve it even beyond the 7800X3D.

    • @wasd____
      @wasd____ Год назад +3

      Nope. Infinity fabric latency and having to "guess" at which core to use in the multi-die setup is always going to be a weakness that keeps 7950X3D from being what it should. AMD played customers with a chip they knew was no better than a cheaper offering coming shortly after.

    • @mitch075fr
      @mitch075fr Год назад +3

      Strange how everybody cut some slack for Intel designing a CPU with different cores that needed a new OS to work, but AMD get flak because Windows can't allocate a RAM-heavy process to a core with lots of cache...

    • @bezzieog3827
      @bezzieog3827 Год назад +5

      Just use Process Lasso and make a personal CPU set; with PBO it's instantly faster than the 7800X3D. I've already tested this. The issue is the Windows scheduler not being set up for gaming unless you use Windows Game Mode, but that hinders some performance.

    • @PQED
      @PQED Год назад +4

      @@mitch075fr It struck me as odd as well that Intel got so much help from Microsoft to make their product work, but AMD has to rely on Game Bar of all things? It's not reliable in the slightest.
      AMD really does need to work on their own hardware scheduler though, but I don't see how it excuses the above.

    • @teemuvesala9575
      @teemuvesala9575 Год назад

      Nah, not true. It needs a hardware thread scheduler like Intel has. AMD just cheaped out, as is usual for AMD. Never touched their trash products because they pull off cheap crap like this.

  • @clerklierbrush0869
    @clerklierbrush0869 Год назад +10

    I have 2 monitors and do gaming/streaming and light 3d modeling. 13600k has had no issues and no lag, super quick. I feel like most modern cpus are all overkill for 95% of users.

  • @TheEclecticDyslexic
    @TheEclecticDyslexic Год назад +12

    PSA: it is possible to run your non-X3D chips power-limited to achieve pretty much the same watts per frame as the X3D variants. You will fall a little short in cache-limited scenarios, but you will get pretty close, and it barely even impacts performance on all-core loads.

    • @jamiealeksic8428
      @jamiealeksic8428 Год назад

      I second this; on both of my Ryzen CPUs I was able to get a solid undervolt whilst maintaining higher than stock performance.

  • @oneplusoneisnotfive
    @oneplusoneisnotfive Год назад +12

    can't wait to get one in 4 years!

  • @NicosLeben
    @NicosLeben Год назад +4

    I really would like to see an improvement with your charts, especially if there are so many. It's hard to focus on them without pausing the video a lot. Sometimes low values are better, sometimes high values are better, nearly every time the order of the CPUs changes, the spacing between the lines is different each time. The transitions between the charts/graphs could be better I think. I don't know how you could achieve that because I am not a layout designer but you definitely could do something here. That would be great!

  • @XolaresTiberius
    @XolaresTiberius 9 месяцев назад +1

    I got the 7950x3d since I game and work. Paired with 128gb ram and 4090. So it's a good balance for me.

  • @wyldecorey
    @wyldecorey Год назад +7

    Would have loved to see the 7900x in the comparisons. As of writing you can get one for $430 (admittedly at heavy discount, but multiple retailers have the same price), the closest price comparison out there.

    • @jiamingzhou2836
      @jiamingzhou2836 Год назад +1

      You can just take a look at the 7700x results. They perform similarly in gaming, and because games don't usually benefit from the extra cores, in some games the 7700x performs even better than the 7900x by a very small margin.

  • @BobYeeterson
    @BobYeeterson Год назад +15

    the power efficiency of that 7800X3D CPU with that epic performance makes it an instant buy for me :)

    • @sophieedel6324
      @sophieedel6324 Год назад

      because people buying $500 CPU care about power efficience, stupid shill

  • @ioandragulescu6063
    @ioandragulescu6063 Год назад +12

    all I can say is again, thank you AMD for releasing the 5800X3D and keeping my X470 platform still up and running :)

    • @Quasiguambo
      @Quasiguambo Год назад

      Mmm, I only recently built a new PC with a 5950X and a 3090 Ti... the thought of having to get a new motherboard and rebuild the PC from near scratch next time around, when my last PC lasted 10 years... it's pretty annoying and not something I want to do at all.
      I'll see how long I can hold on I guess... but with all the UE5 advances, I can't imagine my current PC will keep pace in 5 years' time. Meh.

    • @SttravagaNZza
      @SttravagaNZza Год назад +1

      @@Quasiguambo ??? I got a Ryzen 1700X, built it at release; that's 6 years ago. I can just swap in a 5800X3D and the same system will work for years. Do you expect the 5950X to only last a couple??

  • @45eno
    @45eno 9 месяцев назад +1

    Just bought a sealed-in-retail-box 7800X3D for $340 from a local ad about a month ago. Then found another 7800X3D listed for $300, new but with no box. I snagged it for $260 for my son. I was only shopping for a used 7600, but the extra $100 was worth it. First time my kid has the same PC spec as me, CPU/MB/RAM-wise at least.