It's the Best Gaming CPU on the Planet.. AND I'M MAD. - Ryzen 7 7800X3D Review

  • Published: 20 Jun 2024
  • Thanks to Mine for sponsoring this video! Discover where your data is, and take it back at bit.ly/LTT-Mine-Privacy
    Get rid of your old devices while getting paid! Check out Gizmogo at lmg.gg/gizmogo
    Hey! We got this one out on time! AMD already introduced their fancy 3D V-Cache tech to their Zen 4 series of Ryzen processors with the 7900X3D and the 7950X3D, but this is the one that everyone has been waiting for: the follow-up to their so-good-it's-a-problem-for-AMD 5800X3D. Can AMD bottle lightning twice? Can they take down the mighty 13900K, the king of Intel Core processors? ...Something is making me think that Intel might be developing a fear of thunder...
    Discuss on the forum: linustechtips.com/topic/14988...
    Buy an AMD Ryzen 7 7800X3D CPU: geni.us/BzdOS
    Buy an AMD Ryzen 7 5800X3D CPU: geni.us/U6q9
    Buy an AMD Ryzen 7 7700X CPU: geni.us/3wRrP01
    Buy an AMD Ryzen 9 7950X3D CPU: geni.us/e6msNmv
    Buy an AMD Ryzen 9 7900X3D CPU: geni.us/tfm6v
    Buy an Intel Core i9-13900K CPU: geni.us/uK3UAK
    Purchases made through some store links may provide some compensation to Linus Media Group.
    ► GET MERCH: lttstore.com
    ► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
    ► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
    ► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
    ► OUR WAN PODCAST GEAR: lmg.gg/wanset
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    1:09 V-Cache
    2:30 1080p Results
    3:17 Why it beats 79xxX3D
    4:50 1440p Results
    7:00 AMD Chicanery
    8:37 4K Results
    9:26 Productivity
    10:03 Power Consumption
    11:24 Market Placement
    13:29 Outro
  • Science

Comments • 4.8K

  • @AlexTheDane_
    @AlexTheDane_ 1 year ago +6854

    I think it would be a nice touch if you could highlight the CPU in the benchmarks. It can be a little confusing with so many different CPUs at the same time and it took me a bit to actually find the 7800X3D. I know it might be a little thing, but for me at least, it would make it a bit easier to spot the CPU that the review is about. Otherwise great video!

    • @Jjrage1
      @Jjrage1 1 year ago +488

      Nah, it's not just you, the charts in this one are all over the place. The order of the CPUs changes around, and some charts aren't even ordered by fps (5:02). I don't really get it. It'd be much easier if each CPU stayed in the same spot so we could compare its bar to the others at a glance.

    • @AaronShenghao
      @AaronShenghao 1 year ago +179

      Yeah GN highlights the CPU in their videos, not sure why LMG doesn’t…

    • @b4ttlemast0r
      @b4ttlemast0r 1 year ago +94

      yeah, Gamers Nexus always highlight the ones that they're talking about

    • @Stephen__White
      @Stephen__White 1 year ago +71

      Honestly I feel it would help if they would just put them in some sort of order, some charts are best to worst performance, some are grouped, some are randomly placed. It can not only make it hard to find the CPU you are looking for but also to actually see how it stacks up.

    • @SmashMysticSaiyan
      @SmashMysticSaiyan 1 year ago +31

      Yeah... the charts are pretty bad for this one. There doesn't seem to be any rhyme or reason to how the CPUs are listed from chart to chart. Sometimes it looks like it's by performance, but other times it just seems random.
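The fixed-order-plus-highlight idea this thread keeps coming back to can be sketched in a few lines of Python. The CPU list and fps numbers below are illustrative placeholders, not LTT's actual charting code or data:

```python
# Hypothetical helper for the suggestions in this thread: keep every chart in
# one constant CPU order and flag the CPU under review for highlighting.
FIXED_ORDER = ["7950X3D", "13900K", "7800X3D", "7700X", "5800X3D"]

def chart_rows(fps_by_cpu, reviewed):
    """Return (cpu, fps, is_highlighted) rows in a constant order."""
    return [(cpu, fps_by_cpu[cpu], cpu == reviewed)
            for cpu in FIXED_ORDER if cpu in fps_by_cpu]

rows = chart_rows({"7800X3D": 181, "13900K": 175, "5800X3D": 152}, "7800X3D")
# Every chart now lists CPUs in the same spots; the True flag marks the bar
# to draw in an accent colour (e.g. via matplotlib's per-bar color argument).
print(rows)
```

Because the order never changes between charts, a viewer can track one bar across many graphs without re-reading the labels each time.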

  • @psihius
    @psihius 1 year ago +3887

    The 7800X3D will sell like hot cakes in the Factorio community the same way the 5800X3D did - the performance boost is so massive that it allows you to scale to much, much, MUCH bigger factories.
    Also, memory latency really REALLY __REALLY__ matters to Factorio. DDR4 2666 vs 3200 MHz made like a 10-15% difference in performance given similar CAS latency ratings. On DDR5 this is even more pronounced and could be a fascinating rabbit hole for the LTT Lab.

    • @tech_you
      @tech_you 1 year ago +62

      CAS latency doesn't matter at all, it's more like tRRDS/L tFAW and tRFC for timings.

    • @joeykeilholz925
      @joeykeilholz925 1 year ago

      @@tech_you not this shit again

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 1 year ago +118

      I just hope this CPU can get 60 fps in my modded Cities: Skylines, although most of my mods are gameplay tweaks; very few of them are asset mods.

    • @TheHighborn
      @TheHighborn 1 year ago +108

      in factorio the performance is totally nuts

    • @duckilythelovely3040
      @duckilythelovely3040 1 year ago

      LTT lab lol.
      That thing is a joke. Tell me how 1 dude from hardware unboxed has FAR more games tested, with far more settings than an entire fucking "lab" does.
      Sorry, but their content is pretty much a joke. Far too quick.
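The latency point from the top of this thread can be made concrete. First-word latency is CAS cycles times the memory cycle time, and for DDR the cycle time in nanoseconds is 2000 divided by the transfer rate in MT/s; the CL16 kits below are illustrative examples:

```python
# First-word memory latency: cycle time (ns) = 2000 / transfer rate (MT/s),
# latency (ns) = CAS cycles * cycle time. Example kits are illustrative CL16.
def cas_latency_ns(transfer_rate_mts, cas_cycles):
    return 2000.0 * cas_cycles / transfer_rate_mts

ddr4_2666 = cas_latency_ns(2666, 16)   # ~12.0 ns
ddr4_3200 = cas_latency_ns(3200, 16)   # 10.0 ns
print(f"{ddr4_2666:.1f} ns -> {ddr4_3200:.1f} ns "
      f"({1 - ddr4_3200 / ddr4_2666:.0%} lower latency)")
```

At the same CAS latency rating, the faster kit reaches the first word about 17% sooner, the same ballpark as the 10-15% Factorio gain quoted above.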

  • @RippanCSGO
    @RippanCSGO 1 year ago +129

    Still happy to see the 5800X3D up there at the top representing AM4!
    It's insane we went from the Ryzen 3 1200 to the 5800X3D on the same socket... just freaking insane.

  • @dejanzie
    @dejanzie 1 year ago +86

    As someone who uses my desktop PC for gaming and home office work, I highly appreciate the increased focus on energy efficiency. While I'm not sure of the impact on menial workloads that don't require full CPU usage, I do take it into account when buying parts (CPU as well as GPU etc.). Especially here in Europe, where energy costs have exploded since February 2022.

    • @Proxymated
      @Proxymated 9 months ago

      Blame your government for not knowing their place beneath Russia's boot.

    • @TheRealSykx
      @TheRealSykx 4 months ago +1

      hmm what happened in february 2022 🤔 /s

    • @techietisdead
      @techietisdead 1 month ago

      @@TheRealSykx A certain nation decided to invade its neighbor who was not a threat in any way at all

  • @MarioGoatse
    @MarioGoatse 1 year ago +1998

    AMD’s performance has been great over the last few years, but look at that efficiency! 85W while the 13900K is hitting over 250W. That’s insane.

    • @RicochetForce
      @RicochetForce 1 year ago +148

      No joke, pairing this with a 4090 is going to lead to an incredibly cool and quiet PC as neither part is using much power at all. Efficiency monstrosities.

    • @AwesomePossum510
      @AwesomePossum510 1 year ago +219

      @@RicochetForce A 4090 not using much power? Are you high? Efficiency is decent but even that would be way better on a 4090 if they underclocked it a little.

    • @muizzsiddique
      @muizzsiddique 1 year ago +20

      @@RicochetForce Forget to switch alts?

    • @mbj920
      @mbj920 1 year ago +82

      @@AwesomePossum510 I assume if you cap the frames of a game, a 4090 would be more efficient reaching 144 frames than a 4070 would be, if that makes sense. It's easier for a bodybuilder to lift heavy dumbbells than it would be for a teenager, easier for a Lamborghini to reach 150 MPH than a typical car. That's the logic, but I haven't seen any comparison videos or stats.

    • @manfredDmeer
      @manfredDmeer 1 year ago +9

      @@AwesomePossum510 have you heard of undervolting?
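The 85 W vs 250+ W figures quoted in this thread translate directly into running cost. A rough sketch, treating hours per day and electricity price as illustrative assumptions:

```python
# Rough annual gaming-energy cost; hours/day and $/kWh are illustrative.
def annual_cost_usd(watts, hours_per_day, usd_per_kwh):
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

cost_7800x3d = annual_cost_usd(85, 2, 0.30)    # ~$18.6/yr
cost_13900k = annual_cost_usd(250, 2, 0.30)    # ~$54.8/yr
print(f"~${cost_13900k - cost_7800x3d:.0f}/yr difference at 2 h/day, $0.30/kWh")
```

At those assumptions the gap is roughly $36 a year; heavier use or pricier electricity (as in much of Europe) scales it linearly.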

  • @skyfighter_64
    @skyfighter_64 1 year ago +1328

    Quick improvement idea:
    I think keeping the charts at 9:00 in a constant order, instead of sorting them by value every time, would make them much faster to read, especially with many of them in a row.

    • @KillerDonuts27
      @KillerDonuts27 1 year ago +106

      Agreed! Either that or 'Freeze' the tested product at the top and sort the rest by best to worst.

    • @Taracktor
      @Taracktor 1 year ago +61

      Bigly agree or even BOLD or HIGHLIGHT the main product being tested.

    • @ubermidget2
      @ubermidget2 1 year ago +9

      I like the "best performer" sort order, but definitely something can be changed. Others have mentioned fixing just the review product or bold/colouring the entry.
      If bold/colouring is used, it would also be nice to have a second entry for the nearest price competitor (at time of release)

    • @donloder1
      @donloder1 1 year ago +9

      Totally! Their graphs/charts are a snooze to digest (and it's a trend among many tech channels), and that's why I tend to skip these videos unless there's substantial shock value. But at that point I only watch until they name-drop the sponsor.

    • @JamesTK
      @JamesTK 1 year ago +7

      The charts all through the video are impossible to follow and therefore useless fluff. More data doesn't make it better when it's not easy to consume

  • @madpistol
    @madpistol 1 year ago +278

    That power usage on the 7800x3D is mindblowing. It would have been enough for AMD to retake the gaming crown decisively, but they did it with a part that sips power. Freakin magic.

    • @wasd____
      @wasd____ 1 year ago +6

      It's not mindblowing when it loses 5-10% on productivity tasks. That lost time is just going to eat any savings.

    • @madpistol
      @madpistol 1 year ago +81

      @@wasd____ you don’t buy a 7800x3D for production. You buy a 13900k or 7950x.

    • @wasd____
      @wasd____ 1 year ago +6

      @@madpistol Not everyone wants to or can afford to buy whole separate computers for production vs. gaming. Most people don't.

    • @madpistol
      @madpistol 1 year ago +32

      @@wasd____ buy a 7700x.

    • @AECH_CH
      @AECH_CH 9 months ago +10

      @@wasd____ lmao, it's still a killer CPU, and just looking at charts doesn't do it justice at all.
      Intel and AMD both build good CPUs; the cache here is the selling point. It has like double the cache of similar CPUs, so 10% less GHz really isn't the point here.

  • @DBofficial125
    @DBofficial125 1 year ago +505

    One big thing to take from this is that the 5800X3D is still an elite CPU, holding its own up there. There's no need to rush to AM5 and upgrade everything until prices settle and we're forced to.

    • @ex0stasis72
      @ex0stasis72 1 year ago +18

      Interesting. I'm currently looking into upgrading my PC for Starfield, and I have an Intel i5-7600K. I want to switch to AMD, in part because I like how AMD doesn't force you to buy a new motherboard after every CPU, so in that regard I feel like I should go with AM5 because AM4 is on its last generation of CPUs. But I'm also on a tight budget (if you couldn't tell by the fact I still use an i5-7600K), and I'm in a hurry to build a PC that can play Starfield at 60 FPS.
      Maybe my other option is to buy used AM4 parts; that way, when I'm ready to upgrade in the future, the difference in resale value isn't so large.

    • @Legendaryplaya
      @Legendaryplaya 1 year ago +3

      @@ex0stasis72 Digital Foundry did a video on Starfield that may interest you. They claim there probably won't be ray tracing because it would be basically impossible to do all the calculations in the game. They explain why, but it will certainly be a CPU-limited game, as all Bethesda games are. So I imagine you really could build a computer under $1000 and run the game at fairly high settings, especially at 1080p.

    • @ex0stasis72
      @ex0stasis72 1 year ago +4

      @@Legendaryplaya thanks! After giving it some thought, I'm probably going to go with a used AM4 motherboard and the 5800x3D. Buying a much cheaper used motherboard solves my concern about it being the last generation on AM4 because I can always just resell it used again for not that much less later.

    • @Legendaryplaya
      @Legendaryplaya 1 year ago +1

      @ex0stasis72 nice. If it was my money, I'd grab a 3060 ti/4060 ti for DLSS, and it'd be a smooth 60 fps.

    • @ex0stasis72
      @ex0stasis72 1 year ago +1

      @@Legendaryplaya Good choices. I already have an RTX 2070 Super, so I'm not in as dire a need to upgrade just yet. But I am often bottlenecked in CPU-bound games, which Starfield very much is.

  • @MisterAnderson91
    @MisterAnderson91 1 year ago +1204

    Just on the graphs: it would be nice if you could highlight the reviewed CPU's result on every page. It jumps around up and down (since the graphs are ordered by performance) and because each graph has short screentime, it's really frustrating to try and quickly see the position on each one.

    • @lucasgondra5725
      @lucasgondra5725 1 year ago +8

      Up

    • @klericer
      @klericer 1 year ago +78

      This. Also, don't shuffle them around. Keep the same order; then it's much easier to compare, even if the graphs are only shown for a short time.

    • @chrisfow87
      @chrisfow87 1 year ago +60

      This! The graphs are always terrible, LTT has acknowledged they're terrible and said Labs will improve them, but why can't we have some QoL improvements while we wait for the step change?

    • @huzaifavawda8383
      @huzaifavawda8383 1 year ago

      this is why you are part of a failed society, cant pause the video but want things highlighted for you. You are pure laziness exemplified and a brokie

    • @catsonicc5979
      @catsonicc5979 1 year ago +1

      they totally should!

  • @yellingintothewind
    @yellingintothewind 1 year ago +482

    You miss an important bit with Factorio. It's a game about automation and scaling out. While it's capped at 60 tps, no matter how powerful your system, it *will* dip below 60tps once you get a complex enough map. The better scaling means running bigger maps before you hit "TPS death". Would be interesting to develop a benchmark that tests how complex you can get before that happens.

    • @SurgStriker
      @SurgStriker 1 year ago +6

      Maybe they could ask ChatGPT to write a script that automates the game in a predictable way, to reliably ensure growth happens equally every run, then just test for when the TPS drops.
      Unless Factorio has procedural generation (random maps, etc.). I've never played it, so not sure on that part.

    • @jma89
      @jma89 1 year ago +45

      @@SurgStriker Yes, the maps are procedural, but actual gameplay is quite deterministic. A better test may be a single mega-base that strains even the latest gen hardware and see what the UPS (updates per second) falls to. I'm sure there are plenty of folks in the community who would love to share their multi-thousand SPM (science per minute) bases as a benchmark.

    • @yellingintothewind
      @yellingintothewind 1 year ago +22

      @@SurgStriker The maps are procedural, but that's easily solved using a fixed seed. No need for chat gpt, since the game logic is implemented in Lua. So it would be quite easy to write a "mod" that tiles out parallel construction segments at a set rate using the dev-mode infinite source / sink stuff till the game grinds to a halt. With the fixed rate expansion, that lets you plot performance as "time till TPS drop".

    • @NightKev
      @NightKev 1 year ago +34

      @@SurgStriker ChatGPT isn't some magic spell that can do everything for you.

    • @SovereignThrone
      @SovereignThrone 1 year ago

      @@NightKev anime girls aren't magically gonna take your virginity, but here you are with this profile pic
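The fixed-seed benchmark idea in this thread could be scored with a tiny harness around Factorio's headless benchmark mode. The "Performed N updates in T ms" summary-line format below is an assumption based on recent Factorio builds; adjust the regex to whatever your version actually logs:

```python
import re

# Sketch of a harness for the thread's fixed-seed idea: run Factorio headless
#   factorio --benchmark mega-base.zip --benchmark-ticks 1000
# and score the result as effective UPS (updates per second).
def updates_per_second(log_text):
    m = re.search(r"Performed (\d+) updates in ([\d.]+) ms", log_text)
    if not m:
        raise ValueError("no benchmark summary found in log")
    updates, ms = int(m.group(1)), float(m.group(2))
    return updates / (ms / 1000.0)

sample = "Performed 1000 updates in 25000.000 ms"  # hypothetical log line
print(f"{updates_per_second(sample):.1f} UPS")     # 40.0 UPS: below the 60 cap
```

Running the same save with a fixed map seed on each CPU and comparing UPS (or, with the fixed-rate expansion mod idea above, the time until UPS drops below 60) would give a directly comparable score.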

  • @thetraitor3852
    @thetraitor3852 1 year ago +364

    You should also try benchmarking CPUs on 4x games like Civ, Stellaris or city builders where they matter the most.

    • @Bunster
      @Bunster 1 year ago +30

      We asked for this on the other X3D CPU reviews; I guess they didn't care or didn't read it.

    • @nicholaskiser6146
      @nicholaskiser6146 1 year ago +3

      was thinking this myself watching the video. no clue if the videogame heavy or productivity heavy CPUs work best for the videogames that are productivity heavy :( would love to see 4x benchmarks

    • @LS-es8zv
      @LS-es8zv 1 year ago +2

      4X games don't care about 3D cache; the 13900K was already faster than the 5800X3D.

    • @badjackjack9473
      @badjackjack9473 1 year ago +25

      I just wanna see a high end CPU try to play stellaris late game on high speed.

    • @thetraitor3852
      @thetraitor3852 1 year ago +2

      @@badjackjack9473
      With AMD you will get the exact same performance out of a 7600X as out of a 7950X. It will be a very similar story with Intel's i5 and i9 if it's a K-type CPU.

  • @spill1t
    @spill1t 1 year ago +222

    Glad you guys actually showed the cost savings related to the efficiency. I think that speaks volumes about how great the solution is, and if AMD spends more time making the top-end line of chips behave the way we'd expect, it will be very hard to recommend anything other than AMD.

    • @blodstainer
      @blodstainer 11 months ago

      Having the top CPU only matters because we consumers say so.

    • @spinyjr
      @spinyjr 10 months ago

      @enriqueamaya3883 too true 😂

    • @JBlNN
      @JBlNN 3 months ago

      That's the thing: AMD will never take risks or push development forward; that's why Intel/Nvidia will always be the best.

    • @generalpluto369
      @generalpluto369 1 month ago

      @@JBlNN I would argue they already took a major risk by challenging Intel when they were so small. Now they're much larger and growing fast. I've used both Intel and AMD before; I just got a new AMD CPU after using Intel for 7 years, and I'm excited to see the differences.

  • @SiiiVR
    @SiiiVR 1 year ago +564

    I personally really appreciated the performance per watt cost graphs since they give a good estimate of the proportional difference in cost of ownership. Thanks Linus, Team and Labs!

    • @rockenrooster
      @rockenrooster 1 year ago +30

      It skews even more drastically when you add AC/cooling costs to keep your room at a comfortable temp!

    • @TheFPSPower
      @TheFPSPower 1 year ago +8

      They just missed talking about idle power while watching videos and stuff; Ryzen CPUs are notoriously bad at that, while Intel has it figured out.

    • @majstealth
      @majstealth 1 year ago +6

      @@TheFPSPower stock yes, but one could always throttle the voltage a bit

    • @WaspMedia3D
      @WaspMedia3D 1 year ago +12

      @@TheFPSPower It's literally like 5-10 watts, total system. If you bought a $500 best-of-the-best gaming CPU with 8 cores that does well at productivity just to leave it sitting idle the whole time, that would be stupid, and you could just turn it off or set it to sleep after inactivity. I don't know why people keep bringing this up; 1 hour of productivity or six hours of gaming will displace 2 weeks of the difference at idle.
      The only point this argument ever makes is that people like to grasp at straws to promote their favourite brands.

    • @constantin58
      @constantin58 1 year ago +6

      It's not just electricity savings; a low-wattage CPU also lets you run a cheaper cooler, with lower temperatures, less noise, and longer longevity without the need to repaste often.

  • @iben3271
    @iben3271 1 year ago +1217

    Why do these reviews never test simulation speed? Like Stellaris, Civ 6, HOI4, Total War end-of-turn times (maybe even Football Manager?)... It's one of the main strengths of this CPU that pretty much no one knows about.
    There's an existing test where a 5900X is put up against a 5800X3D. The 5800X3D destroyed the 5900X in Stellaris speed, simulating 25-30% more days in the same amount of time. If this happens with a lot of similar games, then this CPU could be a game-changer for a lot of people, but no one seems to explore it further.
    I still see people recommending non-X3D chips for these types of games because of "higher single-core clocks", and I think a lot of people are still making this mistake. Hoping to see someone test this extensively someday, but here we are 1 year later, and still no one has.
    edit: please stop saying "no one plays those games". FM23 & Civ 6 alone make up 110k players right now on Steam. That's not counting any of the Total War games or Paradox games, or any other simulation/grand strategy game... (which is probably another 200k+ total)

    • @Elijah1573
      @Elijah1573 1 year ago +133

      For real, the games I care about with these CPUs are never tested.

    • @lostkostin
      @lostkostin 1 year ago +128

      because most players are not playing it.

    • @KaloianBostandziev
      @KaloianBostandziev 1 year ago +2

      IDK

    • @iben3271
      @iben3271 1 year ago +2

      @@lostkostin might wanna take a look again at steamcharts. very wrong take

    • @DiscombobulatingName
      @DiscombobulatingName 1 year ago +67

      Definitely. My biggest CPU bound game right now is flipping turns on Civ 6 when the game gets late.

  • @tom940
    @tom940 1 year ago +28

    I just picked up a 5800X3D. I started with a 3060 Ti and a 3600; when I upgraded my middle monitor to 1440p it left just a little to be desired. I got a good deal on a 3090, but then the 3600 was a HUGE bottleneck. My FPS in games like Far Cry 6 straight up doubled switching to the 5800X3D.

    • @InertGamer
      @InertGamer 7 months ago +1

      I’m having that issue now but with Intel. I have an i7-8700K and had a base 2080, then I upgraded to a 3080 Ti, and now most games are running sluggish: Cyberpunk, SCP, Assetto Corsa.

  • @RiGz_Nz
    @RiGz_Nz 1 year ago +2

    That was a fantastic review, man. I've been testing and loving all this (don't get me started on my self-made PhD in memory chips and settings, above and beyond lol). I've been a little behind on my AMD tech, but consider me updated and upgraded. Thanks for your enthusiasm & love for hardware. Much love from New Zealand. (Not that you would ever read this, 4309 comments so far lol)

  • @kiri101
    @kiri101 1 year ago +191

    As a UK viewer (and someone who generally tries not to waste power) the 7800X3D is clearly for me. I had a great experience building a GPU-less AM5 7700 video editing system for someone recently.

    • @GoobyDev
      @GoobyDev 1 year ago +2

      Ah yes the 7700, I chose that for my build specifically because of its power efficiency (UK too)

    • @sophieedel6324
      @sophieedel6324 1 year ago +4

      if you cared about power consumption, you wouldn't buy a 7800X3D to begin with, you would get an i3 or a console

    • @GoobyDev
      @GoobyDev 1 year ago +46

      @@sophieedel6324 You only buy a console if you only want gaming and nothing else; it does not beat having a computer. Also, i3s don't necessarily use less power than these chips, and why sacrifice massive amounts of performance for that anyway?

    • @dex6316
      @dex6316 1 year ago +24

      @@sophieedel6324 then you wouldn’t get an i3 or console either. You’d get a laptop if you truly cared about power consumption. Efficiency matters a lot. The fact of the matter is that those i3s won’t be more efficient than a 7800X3D; you just lose so much in performance and power consumption is barely different. You could also get a 4080 and power limit it to kill consoles in efficiency.

    • @nelsonmendoza561
      @nelsonmendoza561 1 year ago +17

      @@sophieedel6324they simply don’t provide the performance necessary for everyday tasks and gaming in one machine. An i3 is great value, don’t get me wrong, but I think in power per watt, the 7800X3D is really quite amazing. Just unfortunate about the clock speeds.

  • @TheMarioOne
    @TheMarioOne 1 year ago +269

    I really like the comparison of power usage and running costs between different CPUs, it's not something I've considered when building PCs in the past. Would be great to see this in GPUs and the full PC build videos as well.

    • @tshcktall
      @tshcktall 1 year ago +4

      nice avatar. Also, I always try to build a really power efficient pc.

    • @FriarPop
      @FriarPop 1 year ago +2

      If you are worried about $50 over 5 years, you have problems.

    • @tshcktall
      @tshcktall 1 year ago

      @@FriarPop I save way more than that, but how would you know? You don't know how I use my PC or what my electricity prices are. You just pulled a number out your §§§.

    • @handlealreadytaken
      @handlealreadytaken 1 year ago +1

      @@tshcktall even if it’s 10x, you have problems if that’s your financial undoing.

    • @tshcktall
      @tshcktall 1 year ago +3

      @@handlealreadytaken I just don't like wasting money, when there's a better way for me, that's it. Nothing more. Why do you guys even care so much, what other people prioritise?

  • @Whoshotmyrifle
    @Whoshotmyrifle 1 year ago +5

    3:17 I loled at the Paulo Coelho

  • @Hr1s7i
    @Hr1s7i 1 year ago +10

    This 3D cache advantage reminds me of how overpowered L3 cache was when it was introduced. I did a little test between an Athlon II X4 620 and a Phenom 720: parked both of them down to two cores and the same clock speed, and the L3 cache gave me up to about a third more fps IIRC. It stands to reason that more of it keeps helping as long as they keep improving it.

    • @andreewert6576
      @andreewert6576 1 year ago +1

      Also, that was then; L3 cache (or larger cache vs. smaller) has a bigger impact later in a product's life cycle. That's why the Phenoms stayed relevant well into the Core i era, and it's (part of) why the Intel X58 platform is still kind of usable for gaming today, the other part being triple-channel memory.
      These X3D parts will have some absurd longevity, not in max FPS but in min FPS/frametimes.

    • @Hr1s7i
      @Hr1s7i 1 year ago +1

      @@andreewert6576 Indeed. The minimum FPS, or 0.1% low or whatever they call it, is something I use as a general guide when it comes to capping framerate. My current gear, for example, handles Deep Rock Galactic at 1440p at roughly 120-ish fps but dips down to 80-90 regularly, so I decided to cap it at 100. A 10-15 fps drop will not bother me, but 30-40 will be felt. Smooth gameplay is paramount for a chill experience, in my personal opinion.

  • @haukewalden2840
    @haukewalden2840 1 year ago +271

    Finally! I'm so happy that you are starting to consider the energy usage of CPUs and what they truly cost in the long run.

    • @ZaikoEAG11
      @ZaikoEAG11 1 year ago +12

      This. A lot of people (me included) like to maintain their pc parts for a little more than a couple of years

    • @Naokarma
      @Naokarma 1 year ago +1

      Especially with such a big difference. If you're not playing the highest-fidelity games, you can still benefit from spending a bit more today on this thing than paying for it in energy for the next 3 years or so

    • @WrexBF
      @WrexBF 1 year ago +6

      That power consumption section is a little misleading though. The 13600k pulls more power in productivity workloads but it's faster. Also, they didn't include idle power consumption numbers. Your CPU spends the majority of the time at idle, and AMD CPUs pull more power because of the I/O die.

    • @mars_12345
      @mars_12345 1 year ago +4

      I think it's time for energy consumption testing in different workloads, similar to how laptops are tested for their battery life: idle, web browsing, 4K playback, gaming (this will be hard, you need some sort of average load). It will be harder because components need to be tested more in isolation than in conjunction like in laptops, but now that they have the new lab... ;)

    • @wasd____
      @wasd____ 1 year ago +5

      Hot take: a difference of maybe $100 in total energy cost between CPUs amortized over a 5 year presumed lifespan is not a consideration. $20/year, less than $2/month, is literally just maybe some random loose change from your pockets. It's nothing you'd ever even notice or think about. It doesn't matter.

  • @ThFnsc
    @ThFnsc 1 year ago +137

    I'm so happy about this focus on performance per watt. It's such an important metric that's been overlooked for far too long. I truly hope LTT keeps talking this much about it in future reviews, not just for CPUs but for anything.
    I can't help but think the absurd power consumption Nvidia and Intel have been pulling lately is partially a result of the community and reviewers, on average, not caring much (or at all) about efficiency: raw performance, and to hell with the costs.
    There's only one more thing that would make me even happier: measuring and caring about idle consumption.

    • @rockenrooster
      @rockenrooster 1 year ago +8

      I agree! But I would take it a step further because it skews even more drastically when you consider AC/cooling costs to keep your room at a comfortable temp, especially with summer coming up in NA. Very few like to game while sweating.

    • @marcuscook5145
      @marcuscook5145 1 year ago +8

      Performance per watt is pretty much the ultimate measure of who has the best CPU architecture. It's fine looking at the benchmarks, but if the results are very close while one company needs a ton of extra wattage and heat output to achieve that result, I'm obviously going to pick the more efficient and cooler running chip if the prices are anywhere near comparable.

    • @mattemathias3242
      @mattemathias3242 1 year ago

      @@marcuscook5145 Well, one thing about it is that some architectures can't consume too much power before they ultimately fail, like Ryzen, while something like 13th-gen Intel was made to handle very high power draw.
      All in all, I wouldn't say it necessarily makes one architecture superior to the other, but lower power draw while still feeding the CPU sufficiently is always a better foundation for improvements.

  • @kohta20
    @kohta20 1 year ago

    Please make a video about sound bar/wireless satellites vs AV receiver/wired speakers. You and Jake were super impressed with the HT-A9 from Sony and the Q990C is out from Samsung now. So a cost analysis and obviously the experience between the two would be amazing especially with all the options. Tons to talk about and compare between brands, pricing, setup, etc!

  • @hexacarbide268
    @hexacarbide268 1 year ago

    Great review given the complexity. Thx team!

  • @ShotGunner5609
    @ShotGunner5609 1 year ago +72

    That breakdown of expected power consumption was a very nice metric. Please continue adding that into reviews!

  • @moon200070
    @moon200070 1 year ago +71

    Is it possible for all the graphs representing the cpus and their relative performance, that there could be a highlight of sorts for the two CPUs being compared? So maybe a red "highlight" around the 7800X3D, and a "blue" highlight for the CPU that costs the same or is priced to compete so that the viewer has a more intuitive feeling for the difference in the charts? Otherwise it feels like just a wall of numbers, and I usually like watching these not because I am going to buy them, but for entertainment. So a simple visual indicator for where they are on the list would be cool!

    • @AwesomePossum510
      @AwesomePossum510 1 year ago +1

      Totally agree. I like how they go through the graphs quickly but with a little indicator of what’s what I wouldn’t need to pause the video as much.

  • @M3rkatron5000
    @M3rkatron5000 1 year ago

    Eye-opening stuff. I was looking to upgrade, and I know what I'm going with now! Thanks guys

  • @jaspervandemoortel5447
    @jaspervandemoortel5447 1 year ago +4

    Would love to see some graphs comparing the CPUs in non-gaming environments, like compiling code, building applications, or training machine learning / deep learning models. 😅❤❤❤

  • @OTechnology
    @OTechnology 1 year ago +653

    I love how Linus never misses any opportunity to mention how AMD abandoned X399 lol

    • @SyRose901
      @SyRose901 1 year ago +36

      I'm not a person who remembers CPUs by their chipset names, do you mean the Threadripper non-Pro lineup?

    • @-FOXX
      @-FOXX 1 year ago +20

      I'm mad too; 20grand down the drain damn near

    • @arsiarskila
      @arsiarskila 1 year ago +3

      who cares about who you love

    • @tiedye001
      @tiedye001 1 year ago +3

      ​@@SyRose901 Yes

    • @jakesnussbuster3565
      @jakesnussbuster3565 1 year ago +10

      ​@@SyRose901 you could just look it up considering you're on the internet

  • @lolowski6826
    @lolowski6826 1 year ago +541

    I am so glad that you finally started adding performance from games like Factorio! As someone who plays a lot of automation and 4X games, those are very important metrics to me and many other automation/4X enjoyers. It is also very difficult to find reviews that include them; they all just focus on frames per second in big AAA games.

    • @Austynrox
      @Austynrox 1 year ago +20

      Agreed. I don't necessarily play Factorio (but want to try it), but I do play games like RimWorld and They Are Billions, and I was recently gifted Satisfactory. I also enjoy city builder/management games, so it is nice to have some mention outside AAA

    • @Kenstro949
      @Kenstro949 1 year ago +16

      Yep exactly, I'm looking to upgrade just to improve performance in games like Stellaris etc.

    • @Crusader-eh2cv
      @Crusader-eh2cv 1 year ago +6

      @@Austynrox You should try Oxygen not Included and Dyson Sphere both VERY good.

    • @ex0stasis72
      @ex0stasis72 1 year ago +4

      Also, heavily modded Minecraft tech modpacks. Those are very CPU intensive.

    • @MrSherhi
      @MrSherhi 11 months ago +2

      Maybe even civ6 turn time test could become a standard imo.

  • @moonmode3232
    @moonmode3232 1 year ago +46

    I've been waiting years for the PC to finally get efficient CPUs. The 7800X3D seems like the real deal. Nice job AMD!
    I just don't understand how Intel has had a few years to update their CPUs after Apple released their super efficient M chips, but has done nothing for efficiency.

    • @MarkLikesCoffee860
      @MarkLikesCoffee860 11 months ago +8

      Intel has a 62.8% market share so they don't need efficient chips. 62.8% of the world love Intel and keep buying Intel. AMD only have a 35.2% market share, so they need to work much harder than Intel and produce better products if they want to grow beyond 35.2%. The CPU company with the bigger market share will always be complacent.

    • @shringe9769
      @shringe9769 11 months ago +7

      Apple uses an entirely different CPU architecture (Apple is ARM, while AMD/Intel are x86); x86 chips will probably never come close to ARM's efficiency. ARM has many of its own issues relative to x86, so ARM breaking into the mainstream desktop market is not happening anytime soon. Comparing ARM efficiency to x86 desktop chips is not a fair comparison by any means.

  • @CoolSilver
    @CoolSilver 1 year ago +5

    I'm still rocking an i7 5820K, a 6-core processor that you guys hyped as one of the best long-term choices even though there were faster chips that were still quad core.

  • @Cybey
    @Cybey 1 year ago +11

    The main takeaway from this video is that the word is not spelled as Segway but as Segue

  • @-Animal_
    @-Animal_ 8 months ago

    I was trying to find a comprehensive video on Ryzen 9s and this one was the ONLY video that expressed all my concerns. Thank you

  • @qm3ster
    @qm3ster 1 year ago +7

    Now all we need are consumer desktop dual AM5 socket motherboards, so that you can put your interactive tasks on the 7800X3D and everything else on the 7950X.

  • @explosev6513
    @explosev6513 1 year ago +558

    Super happy to see the 5800X3D still very close in performance, gotta thank AMD for blessing us AM4 peeps.

    • @hypnotize107
      @hypnotize107 1 year ago +32

      Bought mine yesterday because I didn't want to spend 300€ on an ITX motherboard again

    • @chadofwar6554
      @chadofwar6554 1 year ago +24

      Got mine a month ago on sale and I'm happy I did. It should last me a while. AM5 is still too much money

    • @BMC2
      @BMC2 1 year ago +4

      Just got one today but forgot to flash the BIOS lol. Anyway, it's a great value CPU

    • @spacepioneer4070
      @spacepioneer4070 1 year ago +10

      The 5800X3D gets beat by 20-30 fps in some titles to the 7800X3D. I was shocked, i haven't seen a generational performance increase like this in a while. ruclips.net/video/TVyiyGGCGhE/видео.html

    • @ik1llpeeple4fun
      @ik1llpeeple4fun 1 year ago +9

      @@spacepioneer4070 What he meant was the 5800X3D is about the same as the 7800X3D in 4K gaming. According to the charts, and the video you linked, the 7800X3D only has a noticeable gap over the 5800X3D at 1080p. Even at 1440p the gap shrinks, and at 4K it's negligible.

  • @SIW808
    @SIW808 1 year ago +459

    Just look at how the old 5800X3D is keeping up with these new CPUs in games. What a chip!

    • @dukejukem8843
      @dukejukem8843 1 year ago +45

      its not old though

    • @DKTronics70
      @DKTronics70 1 year ago +45

      @@dukejukem8843 The PLATFORM (AM4) it runs on is very old - September 2016 - that is ancient in PC times.

    • @gumbi79
      @gumbi79 1 year ago +38

      @@DKTronics70 It's just a socket... the platform is the chipset, and that has evolved over the years. And DDR5 and PCIe 5 are not worth the extra outlay at all

    • @macse7en
      @macse7en 1 year ago +1

      Yeah, and I’m still trying to get one without obliterating my bank account lol

    • @clark85
      @clark85 1 year ago +3

      @@Malc180s Oh, it's old in computer years, come on, enough excuses

  • @alexkass84
    @alexkass84 1 year ago

    Thanks to you guys I have finally chosen a platform for my PC.

  • @AM-im8jb
    @AM-im8jb 1 year ago

    Pablo Coil-Oh is a closer pronunciation. Great vid btw

  • @geoffreybassett6741
    @geoffreybassett6741 1 year ago +108

    Thank you for mentioning performance per watt. With how performant these chips are it's now become a very important factor in my purchasing decisions.

    • @alphacompton
      @alphacompton 1 year ago +10

      Yeah, the performance-per-watt analysis is very good. I think he should also have mentioned that you wouldn't need as beefy a cooler for the 7800X3D vs other top-performing chips. That means a smaller and/or quieter PC.

    • @ThatGoat
      @ThatGoat 1 year ago +2

      Power companies. Continuously motivating us to make better choices. It's for the environment, m'kay?

    • @bradhaines3142
      @bradhaines3142 1 year ago +6

      @@ThatGoat Or not trying to heat the house that I'm trying to cool

    • @GrzegorzMikos
      @GrzegorzMikos 1 year ago +2

      I don't get why they even show it in a review of a gaming CPU, a $15 difference over a year is nothing. Isn't it normal in the USA to tip the delivery driver $5-7 on each order? So skip the tip twice a year and you're done.

    • @mateuszsa7
      @mateuszsa7 1 year ago +3

      They still messed it up tho :/
      No idle power consumption.

  • @josephjoestar515
    @josephjoestar515 1 year ago +247

    I loved seeing the estimated power costs during ownership. It really made me think and realize that the extra cost of the 7800X3D could easily be justified (plus less heat during summers (or all year if you live in Florida like me)).

    • @Duglum666
      @Duglum666 1 year ago +18

      The difference is even really small in the US... I was shocked how extremely cheap your power still is. In Germany we went up to about 65 US cents per kWh. The difference over a few years can easily be hundreds of euros.

    • @WouldDieForChomusuke
      @WouldDieForChomusuke 1 year ago +12

      @Duglum666 Honestly never thought about it that way. As of right now I still live with my parents (not a leech, only 16) and I just get the best thing I can afford; I totally didn't think power was a cost factor, even though it is long term.

    • @MisterPikol
      @MisterPikol 1 year ago +1

      @@Duglum666 Where in Germany? That used to be the case, but now in Hesse it has dropped back to about 38 cents per kWh

    • @Masterrunescapeer
      @Masterrunescapeer 1 year ago +5

      @@Duglum666 the 65c were during peak, shouldn't hit that again, and most places are coming down quite a bit, seeing contracts here in Austria for low 20c range again, it basically doesn't matter long-term for home users (also energy subsidies).

    • @abaj006
      @abaj006 1 year ago +16

      The extra 100W of cooling load on the aircon will easily add up over a year. Not only do you have to pay for the extra 100W every hour, you now have to pay for your AC to get it out of your room. The 3D chips are a no-brainer, and fantastic value for money.

  • @aidenholmes5588
    @aidenholmes5588 1 year ago

    I just got this CPU for my first build like 20 days ago. I'm pretty happy to see this.

  • @OccipitalVision
    @OccipitalVision 1 year ago +16

    I truly appreciate the cost analysis on power usage. Although I'm not in a state with ridiculous energy costs, that's a great point to delve into.
    Those who are willing to be patient generally win these pricing games the companies play, but even with inflation, the fact that you can build a 7800X3D system on the newest gen for around $1500, when my friend helped me build my now 9-year-old system for $1200, looks good

  • @ChrisK251
    @ChrisK251 1 year ago +697

    This CPU with the extra cache is an absolute monster for World of Warcraft, up to 30% extra frames

    • @Sherfps
      @Sherfps 1 year ago +11

      thats good to know.

    • @watermelon58
      @watermelon58 1 year ago +22

      Star Citizen too

    • @xepros4560
      @xepros4560 1 year ago +14

      Really good to know. Only Game that matters ❤

    • @janfrederick7753
      @janfrederick7753 1 year ago +8

      I think MSFS also likes the huge L3 Cache

    • @_Zane__
      @_Zane__ 1 year ago +6

      World of War craft? 😂

  • @Infigo96
    @Infigo96 1 year ago +28

    What amazes me the most is that power draw. My 5800X3D draws ~55W, which is really nice in my quite small and compromised mATX build... and they managed to cut 15W from that, with more performance! Now I'm seriously considering an SFF build, since any low-profile cooler will do very well and the options open up a lot while still having a quiet machine.

  • @dontstarebra
    @dontstarebra 1 year ago

    nice video man. very well displayed, good info. easy to digest.

    • @supersans3249
      @supersans3249 2 months ago

      I used Intel's best, the i7-14700K, vs the AMD Ryzen 7 7800X3D. AMD is more efficient and better, trust me, the video's wrong.

  • @CapitanGreenhat
    @CapitanGreenhat 11 months ago +2

    I'm curious to see how well this holds up when the drivers mature, and also with an all-RDNA 3 system. Seems like the performance gains are entirely driven by how the 3D cache is implemented.

  • @AlmightyJu2
    @AlmightyJu2 1 year ago +300

    I really love the energy price graphs! Would love to see this more often for GPUs too; as a Brit it's defo becoming more of a concern when looking at hardware, and nobody does real-world data on actual power usage

    • @JohnDoe-hw8ge
      @JohnDoe-hw8ge 1 year ago +5

      I did hope to see the 13700K or 13900K there, would have been quite shocking :O

    • @Psythik
      @Psythik 1 year ago +2

      Hey you stole my username

    • @AlmightyJu2
      @AlmightyJu2 1 year ago +3

      @@Psythik 😯 Psythix's unite! The important question is has anyone called you "fish sticks" on voice chat because they don't know how to pronounce it?

    • @oxfordsparky
      @oxfordsparky 1 year ago

      Fellow Brit here and I completely disagree, the cost of electricity is beyond negligible. Even using their price of 42p a kWh, if £22 a year in extra running costs is enough to sway you, then you shouldn't be gaming on a PC.

    • @AlmightyJu2
      @AlmightyJu2 1 year ago +6

      @@oxfordsparky Sure, if you take a single component by itself it might be a small figure, but even then, if you're paying £10 a year for a 2fps increase, is there a point in that extra cost? No. But without performance-per-watt info it's all speculation
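
The £22-a-year figure being argued over above falls straight out of a one-line formula. Here is a minimal sketch of it (the 42p/kWh rate comes from the thread; the 100 W delta and 4 hours/day are hypothetical placeholders, not measured numbers):

```python
def annual_energy_cost(extra_watts, hours_per_day, price_per_kwh):
    """Extra yearly electricity cost of running a hungrier CPU."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Hypothetical: 100 W more draw, 4 h of gaming a day, at 0.42 GBP/kWh
extra = annual_energy_cost(100, 4, 0.42)  # roughly £61/year
```

Whether that number is "negligible" is exactly the disagreement in this thread; it scales linearly with hours played, the wattage gap, and the local rate.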

  • @Workmusic1988
    @Workmusic1988 1 year ago +71

    Live in the UK. Those cost-of-ownership metrics really frigging matter. Been waiting for this type of analysis for a while now and am so happy to finally see it. I would MUCH rather give up 5-10% performance for, in this case, a new component to save significantly on cost of ownership. :)

    • @purplepenguin43
      @purplepenguin43 1 year ago +11

      This is going to be huge for me. I live in Alaska and energy is actually pretty cheap because of hydro, but nobody has A/C, so my room gets toasty quickly in summer with a 300+ watt system running for long sessions. I was thinking of getting a 13900K, but that would literally have doubled the wattage output into my room. This new chip runs at 100 watts, which is the same or less than the ancient AMD FX 8-core chip I have now. Holy cow, I'm going to be going from 32nm to 5nm tech, this is going to be a massive jump. I'm holding off though until at least one reviewer tests it in Arma 3 and DCS against Intel, as those are my most played games. Especially in Arma 3, Intel has always had a huge lead over AMD just because of optimization and single-thread reasons, but that doesn't matter if it comes at 200+ watts more cost.

    • @oll13
      @oll13 1 year ago +4

      Definitely relevant these days. Features like Radeon Chill, which limits your FPS while you're idling in games, can actually save you very noticeable sums of money, as can buying efficient hardware. I miss the days where you didn't have to consider these things, but here we are..!

    • @th3_g3ntl3man
      @th3_g3ntl3man 1 year ago +8

      7800X3D is a no-brainer in the UK. Price difference will pay for itself in lower energy costs.

  • @ethanthompson3934
    @ethanthompson3934 1 year ago

    Just ordered it, can’t wait to start my new build

  • @Nidaloove
    @Nidaloove 10 months ago

    Do nada o cara lançou um Paulo Coelho, Brabo! 😂😅
    All of a sudden you mentioned Paulo Coelho, Awesome!

  • @watercannonscollaboration2281
    @watercannonscollaboration2281 1 year ago +34

    7:08 I love how Adam’s in this bit because back in the first AMD extreme tech upgrade video he really wanted a 3DVcache CPU but missed out on the timing

    • @djorgs
      @djorgs 1 year ago +1

      I will be involuntarily dreaming of Rabid Adam for years to come.

  • @FFXfever
    @FFXfever 1 year ago +104

    Another really cool thing about the 7800X3D being so efficient is what happens if they decide to put it into mobile platforms. I can imagine a 20-watt desktop replacement for field-work processing.

    • @sihamhamda47
      @sihamhamda47 1 year ago +11

      Can't wait for Ryzen HX3D series mobile processor (if AMD want to)

    • @PileOfEmptyTapes
      @PileOfEmptyTapes 1 year ago +3

      They would invariably have to stack a _monolithic_ die with 3D V-Cache first before it would be attractive to mobile use. Desktop Ryzen 7000 CPUs will consume 20+ watts in idle, and almost nothing of that is due to the cores (which might take 0.6 W).

    • @loider3365
      @loider3365 1 year ago +3

      A 7800X3D on mobile would be hella efficient for gaming but very inefficient when idling, so battery life would still be poor

    • @deansmits006
      @deansmits006 1 year ago

      But would you see the advantages since notebooks get lower power GPUs? You can't generally upgrade the GPU in a notebook, so you are more likely to be GPU bound and waste the CPU horsepower, right?

    • @XX-121
      @XX-121 1 year ago

      DESKTOP AND MOBILE CPU'S ARE NOT THE SAME.
      do you really think if they could make the desktop part have that much performance at 20w they would LOL!!!!! WTF!!? U S E YOUR BRAIN!!!! the people in this comment thread are the same people on the steam forums saying their games won't run. ...."buuut i has a I7" lol

  • @tokie2100
    @tokie2100 11 months ago

    I am doing my very first PC build and I have a 7800X3D on the way with a 4090😊. I'm so freekin excited!!!!

  • @Maddesen
    @Maddesen 1 year ago

    Great topic to pick up. I think your focus on energy consumption is definitely worth the talk as well. Considering the worldwide energy shortage and global sustainability, everyone talking about tech should be talking about efficiency too. You could multiply the lowered power consumption by an estimated number of PC owners and then compare that result to relevant energy/CO2 savings..
    Anyway, seems like I'm gonna look at a B650/7800X3D build rather than a 13600K/Z790 one.. 👌🏻

  • @FatherLamb
    @FatherLamb 1 year ago +6

    The only thing I learned from this video today was that it's spelled segue @0:53 and not Segway.

  • @niravramdarie9898
    @niravramdarie9898 1 year ago +32

    I never thought I would see a 5800x3d out perform a 13900k in any game 😮

    • @air21511
      @air21511 1 year ago +3

      Now consider how much less power it draws = how much less heat it puts into your cooling system = how much more headroom your graphics card will have (you may even be able to get more from the video card due to better thermals!)

    • @niravramdarie9898
      @niravramdarie9898 1 year ago +1

      @@air21511 I know. It's painful to watch.

    • @jamesmackinlay4477
      @jamesmackinlay4477 1 year ago +1

      That is of course gaming; in other PC workloads the 13900K will do better. But the reality is, given the difference in power draw and real-world performance overall, for almost everyone the AMD is still the better pick.

    • @air21511
      @air21511 1 year ago

      @@jamesmackinlay4477 well yeah - hence “the best gaming cpu” 😄😘

  • @austinfowler2707
    @austinfowler2707 1 year ago

    They need to show more CPU-bound games when showcasing anything with X3D architecture.
    Something like DCS, MSFS, or heck, even Arma Reforger. X3D does wonders for sim games, so it's very important we see that, especially since there are communities that need that information.
    Like me, as I was contemplating whether or not to upgrade to an AM5 platform for Arma 3/Reforger.

  • @liang5345
    @liang5345 1 year ago

    Excellent review! Thanks

  • @SethanderWald
    @SethanderWald 1 year ago +16

    Man, it's crazy how basically every graph was showing triple digit fps across the board... Doesn't seem that long ago when these types of graphs were in the 60-90 fps range. 😅

  • @BrunoRibeiro-po2bv
    @BrunoRibeiro-po2bv 1 year ago +119

    The way Linus completely obliterated Paulo Coelho 🐰 was something to behold

    • @manuntvg01
      @manuntvg01 1 year ago +9

      indeed, I cringed a bit

    • @1ZeeeN
      @1ZeeeN 1 year ago +29

      As a Portuguese speaker I paused the video and go back just to hear again hahaha Linus destroyed the pronunciation hahaha
      Of course not his fault... but it was funny! 🤣

    • @gruiadevil
      @gruiadevil 1 year ago +5

      @@1ZeeeN I'm Romanian, and even I cringed hearing his pronunciation.

    • @rslanna
      @rslanna 1 year ago +2

      indeed

    • @bible1944
      @bible1944 1 year ago +1

      ​@@1ZeeeN same xD

  • @TheOriginalCoda
    @TheOriginalCoda 4 months ago

    11:10 UK: Thanks for the mention. Did you also know it's raining all the time here?

  • @starfirextx
    @starfirextx 1 year ago

    My 7800x 3D is on its way, ordered this morning 😊 I’m really glad that I waited.

  • @mtbSTL
    @mtbSTL 1 year ago +112

    Idk what it does, but Escape from Tarkov seems to heavily benefit from the increased cache in the 5800X3D, and the whole community went crazy over the massive performance increases over any other chip on the market, even with older 10-series cards. I can only imagine how big of a jump this is in games like that

    • @murd4638
      @murd4638 1 year ago +1

      Oh Boi we're not ready...

    • @Odeezee
      @Odeezee 1 year ago +13

      Same thing for Star Citizen, it loves X3D chips. Glad AMD are coming through for us again!

    • @frale_2392
      @frale_2392 1 year ago +4

      AI computations benefit the most from having a bigger cache (see how massive the gains are on the graphs for Warhammer 3). I've never played Tarkov, but I guess there is some heavy and possibly unoptimized AI logic running in the background.

    • @murd4638
      @murd4638 1 year ago +8

      @@frale_2392 You can play Tarkov in 4K with this chip and in good quality. I'm already running Tarkov at 4K high, FSR on ultra quality, with a 5800X3D, and I have a stable 130 fps. The new 7800X3D is gonna be a monster too.
      Clueless, Tarkov is dead because of the cheaters

    • @rkan2
      @rkan2 1 year ago +6

      In MSFS it doesn't really matter which of the 40-series GPUs released so far you run, but from X3D you get 10-20% more fps lol. So instead of 50 you get 55-60, which is a lot.

  • @honaker326
    @honaker326 1 year ago +7

    I upgraded from my 2600X to a 5800X3D with a BIOS update for $299. Sometimes the best medicine is to wait for the best of last gen to go on sale lol.

    • @MindShackleFilms
      @MindShackleFilms 1 year ago

      I did the exact same upgrade last month. My GPU is a 5600XT on a 1080p monitor, so the gains were minimal. Bottleneck Calculator says the 5800X3D is a bottleneck for a 4070 Ti or higher... not sure what GPU I should upgrade to. Will get a 1440p monitor as well. Wait for the 4070?

  • @0Turbox
    @0Turbox 1 year ago +2

    Dude is a great salesman. You stare at a graph that shows you barely any differences, and he still manages to use words like "win" and "lead" to shine some light on individual products. No one can convince me that you will notice 10% gains at 200+ FPS.

  • @davidmay268
    @davidmay268 1 year ago +9

    It would be really useful if you could include DAWBench in your test suite. It's really difficult to make decisions for audio loads based on the current suite of tests, because realtime audio work is such a specific load which is very sensitive to latency and the ability to maintain high constant speeds.

    • @hugoapresname
      @hugoapresname 1 year ago +1

      DAW benchmarks are interesting for me too. I realize that energy efficiency and cost are important factors, because with audio work computers run for long stretches, and they might also be quieter since they consume less power.

    • @darkpolution468
      @darkpolution468 11 months ago

      @@hugoapresname Have you guys come to a conclusion? I can't decide which platform I should go with.

  • @TheMongolPrime
    @TheMongolPrime 1 year ago +17

    I really loved seeing the pricing for using the chips in California prices. I don't live there, but I super appreciate the graphs, as I find that incredibly important when choosing between the options and saying "well I have the money, so I might as well get the more expensive chip... right?"

    • @tad2021
      @tad2021 1 year ago

      A 7950X3D costs around 300-350 millirentmonths.

  • @joshuagray6061
    @joshuagray6061 1 year ago +71

    The performance of basically every mid to high end CPU is so good, the choice of CPU almost doesn't matter for most consumers. For the more hardcore audience, it's best to pick one that is more tailored to your preferred games. You really can't go wrong though.

    • @makishae9811
      @makishae9811 1 year ago +3

      As someone who really wants the stuff they buy for their pc to last, I’m always looking to get the absolute best I can for the price I get it at.

    • @Kevin-sg5xc
      @Kevin-sg5xc 1 year ago +2

      this is the real answer

    • @ChrisWijtmans
      @ChrisWijtmans 1 year ago

      it really depends what you are looking for.

    • @GiorgosKoukoubagia
      @GiorgosKoukoubagia 1 year ago

      The performance is good today for most of them indeed. But as Linus and others have said, buying a better CPU today helps you not have to upgrade it (along with mobo and potentially RAM) when you upgrade your GPU in the future, especially with higher-end GPUs.

    • @joshuagray6061
      @joshuagray6061 1 year ago

      @@GiorgosKoukoubagia yep I'd argue AM5 itself is a good investment, regardless of the CPU you choose. Should be supported for many years to come. That's why I bought a Godlike board and 7950X3D. Pricy, but I'm set for a long time.

  • @mdd1963
    @mdd1963 1 year ago +8

    $69 for a screwdriver? LOL!

  • @enrgys_world
    @enrgys_world 1 year ago

    Your pc's are really cool and I really like your videos!

  • @xuerian2511
    @xuerian2511 1 year ago +16

    Glad to see power efficiency and TCO making it into these calculations.

    • @TippyHippy
      @TippyHippy 1 year ago +1

      I put my hamster in a sock and slammed it against the furniture.

  • @korvish111
    @korvish111 1 year ago +47

    100% agree about the delayed launch of a better, cheaper product.
    If there is a defence, it's this: the 7950X3D performing worse seems to be a software issue (firmware, drivers, or OS). It should be the best of both worlds rather than the worst of both worlds.
    I'm sure the original expectation was best of both worlds, and they didn't adjust their plan when it didn't pan out that way. Maybe there'll be firmware updates in future.

    • @TheMessiah1337
      @TheMessiah1337 1 year ago +2

      Yeah, really hoping they improve the scheduler or the core parking somehow for the people who have the 7950X3D and wanna use it for production and also gaming.

    • @achillesa5894
      @achillesa5894 1 year ago +2

      Yeah if they get the scheduler perfect it will be the ideal CPU for streamers or people who game and work on the same pc

    • @TheMessiah1337
      @TheMessiah1337 1 year ago +1

      @@achillesa5894 Yeah, currently with some games I play there are scheduling issues, unfortunately, where the CPU under-utilizes the 3D cache by about 30-50%, so it can be like a 20-40 fps loss, which really sucks -.-

    • @griffin1366
      @griffin1366 1 year ago +6

      They delayed it to sell people more 7950Xs, as they knew that the 7800X3D would be *THE CHIP* to go to for most people.
      The 7950X3D relies on Windows Game Bar to schedule itself, so good luck...

    • @TheMessiah1337
      @TheMessiah1337 1 year ago

      @@griffin1366 i know how it works lol ive got one. A lot of people chose the 7950x3d for productivity and gaming purposes....

  • @ammertos1517
    @ammertos1517 1 year ago +12

    It would be so cool if hardware YouTubers would include a World of Warcraft benchmark. WoW benefits like crazy from these X3D chips. I just went from a 3900X to a 5800X3D and I almost doubled my framerate in raids and BGs

  • @Yoshi_206
    @Yoshi_206 1 year ago +11

    I'm all for holding companies' feet to the fire, but the pushback over AMD launching the 7950X3D and 7900X3D five weeks before the 7800X3D is just too much for me. It was announced at the same time, MSRP included, so everyone knew they only had to wait a little more than a month and how much they would save. Those who need the bleeding edge or can't control themselves really just need to look in the mirror if they have regrets today. All major review sites (at least the ones I saw) said they expected the same performance from the 7800X3D and to wait to buy.

    • @KH-cs7sj
      @KH-cs7sj 10 months ago

      Four months later it wouldn't matter. I just bought a 7950X3D and got a free motherboard. It's equivalent to a 7800X3D plus a motherboard, only with 8 more cores.

    • @Cooe.
      @Cooe. 10 months ago +1

      @@KH-cs7sj If you totally disable the non-V-Cache CCD before gaming, it's actually SUPERIOR to the R7 7800X3D thanks to its better binning and thus higher clock speeds 🤷 (5.2GHz max vs 5GHz).

    • @KH-cs7sj
      @KH-cs7sj 10 months ago

      @@Cooe. with the latest chipset driver the non-vcache core parking is automatic.

  • @parasitex
    @parasitex 1 year ago +32

    I think the reason the 7900X3D performed so badly with dual CCDs in Factorio is that Game Mode probably didn't kick in for the benchmark, and as such the CPU probably didn't recognize it as a game, which would have parked the non-V-cache CCD.

    • @PQED
      @PQED 1 year ago +3

      No, it's because it has 2 CCDs, each with 6 cores on them. So when one CCD is shut down in favor of the cache, it essentially becomes a 6-core CPU with V-Cache.

    • @parasitex
      @parasitex 1 year ago +1

      @@PQED Well, yes.. But to determine when it should switch to the V-cache-only CCD, it relies on Windows Game Mode to detect that a game is running. If Game Mode doesn't trigger, it won't know which CCD to prioritize. And my guess is that the Factorio benchmark doesn't trigger Game Mode, as the benchmark doesn't seem to run like the game normally does.

    • @PQED
      @PQED 1 year ago +3

      @@parasitex I'm aware of how unreliable (and incompatible) Game Bar is, and that's why it's ridiculous to employ it in this fashion.

  • @reg3dit20
    @reg3dit20 1 year ago +20

    Great contextual content, keep calling out all companies in the space for their "smoke and mirrors" PR BS. Good and clean data and info ❤

  • @marceloangeloa.9628
    @marceloangeloa.9628 1 year ago

    I'm happy that Linus Tech covers this; it's where I learn how to deal with components and stay knowledgeable on PC tech

  • @cloudstrife7083
    @cloudstrife7083 1 year ago +2

    Greatly considering buying a 4090 and a 5800X3D when the prices drop a bit this summer or winter. I'm in no hurry, and I can run 128GB of RAM on AM4/DDR4. The little extra boost in performance between AM4 and AM5 (talking just gaming) isn't worth it in my view to re-buy a whole motherboard/RAM/CPU combo etc

  • @yobson
    @yobson 1 year ago +25

    would love to see some software improvements that would better assign the cores on the 7950X3D to appropriate workloads to close the gap

    • @SCYN0
      @SCYN0 1 year ago +1

      You can deactivate the cores manually, but then you still paid 200% more for effectively the same chip

    • @achillesa5894
      @achillesa5894 1 year ago +3

      The 7950X3D basically becomes a 7800X3D when you turn off half the cores in the BIOS. Hopefully they fix it with software updates so it's always like that without turning off half of it.

    • @pennypenguin15
      @pennypenguin15 1 year ago

      You mean processor affinity..? Cause that's been a native windows feature for some time now.

    • @johnsteves9158
      @johnsteves9158 1 year ago

      Interesting
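
Several replies above mention manually disabling cores or using processor affinity as a workaround. On Linux, a rough sketch of per-process pinning looks like this; the idea that the V-cache CCD occupies logical CPUs 0-15 is an assumption for illustration and depends on the actual topology of your chip:

```python
import os

# Assumption: on a dual-CCD X3D part the V-cache CCD exposes logical
# CPUs 0-15 (8 cores with SMT). Verify with `lscpu` before relying on it.
VCACHE_CPUS = set(range(16))

def pin_to_vcache_ccd(pid=0):
    """Restrict a process (pid 0 = the calling process) to the V-cache CCD."""
    target = VCACHE_CPUS & os.sched_getaffinity(pid)  # stay within CPUs that exist
    os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)
```

On Windows the equivalent is Task Manager's affinity menu or a tool like Process Lasso; the chipset driver's Game Mode core parking is supposed to make this automatic, which is exactly what the comments hope gets fixed.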

  • @IgorFradi
    @IgorFradi 1 year ago +3

    Having a brazilian author cited on this video was certainly unexpected hahah :P

  • @rene.s.s
    @rene.s.s 1 year ago +1

    That power usage chart hits home. I run my $555 7950x at full tilt on a 360mm AIO encoding at least 5+ hours daily, converting 2+ 4K discs to h265. I wish someone would crunch those numbers between 7950x and 7950x3d but as far as I know it’s heavily core frequency based so I’d probably not save too much power since it’s running longer.

    • @Matti6950
      @Matti6950 1 year ago

      Every last 100MHz above 4000MHz is less power efficient on AMD 7000 CPUs. In fact, all-core 4900-5500MHz doubles the power usage, while 16 cores all-core at 4000MHz or slightly higher can crunch through it at 65W (88W true consumption).
      A 7950X tuned to exactly the clock speeds of the 7950X3D will use a similar amount of power, but alas AMD markets it differently, and is more protective of the 3D models because of the 3D cache; and the 3D cache makes them perform better despite the lower clock speeds, so they don't lose in benchmarks.

    • @Cooe.
      @Cooe. 1 year ago

      @Matti6950 Actually, this is almost 100% true, but not quite. The R9 7950X3D uses slightly better-binned CCDs than the vanilla 7950X, meaning the former is still slightly more efficient at the same clock speed (it needs less voltage to get there). Think of the Threadripper vs. mainstream Ryzen die/CCD bin quality difference in previous generations; it's much the same here. It's nothing dramatic, but it IS there. 🤷

  • @slgundam
    @slgundam 1 year ago +1

    I always hear that the extra V-Cache doesn't benefit some workloads (mainly productivity).
    But it's never compared apples to apples at the same clock speeds.
    Maybe the extra V-Cache does help, just not enough to make up for the clock speed difference.
    That would mean a V-Cache CPU clocked comparably to its non-V-Cache variant could be superior in all workloads. Maybe it just needs time to mature and for the issues that currently hamper its performance to get fixed.

  • @CreativeMindsAudio
    @CreativeMindsAudio 1 year ago +4

    This is why i wait for an entire line of products to be released/reviewed before deciding on a purchase. I’m on a 5800X right now. No need to upgrade my PC right now other than a video card, but it’s nice to know where things are headed. I built it in 2021, so it has a lot of life left in it.

  • @StaySic4Ever
    @StaySic4Ever 1 year ago +11

    It's a straight-up beast. Exactly what I expected; really a BIS chip and worth the wait for the AM5 platform, especially after some time for boards and memory to get cheaper too.
    Definitely amazing in certain games that use the extra cache to its max potential, like WoW and others. Very nice!

  • @45eno
    @45eno 3 months ago +1

    Just bought a sealed-in-retail-box 7800X3D for $340 from a local ad about a month ago. Then found another 7800X3D listed for $300, new but with no box. I snagged it for $260 for my son. I was only shopping for a used 7600, but the extra $100 was worth it. First time my kid has the same PC spec as me, CPU/MB/RAM-wise at least.

  • @clerklierbrush0869
    @clerklierbrush0869 1 year ago +10

    I have 2 monitors and do gaming/streaming and light 3d modeling. 13600k has had no issues and no lag, super quick. I feel like most modern cpus are all overkill for 95% of users.

  • @sirchaps8489
    @sirchaps8489 1 year ago +26

    Had a 2700X on my AM4 board, now rocking the 5800X3D. And 32GB of 3200 MHz RAM.
    Thank you AMD. 🙏

  • @Deathsead747
    @Deathsead747 1 year ago +8

    I honestly think that long-term reviews of these chips, and future forecasting in terms of energy consumption and compatibility, need to be an integral part of these reviews. Most people aren't building a new rig every year.

  • @THEGAMERKH
    @THEGAMERKH 1 year ago

    Linus, can you donate a gaming PC to me? I live in Uganda, Africa, and can't afford a PC, but I want to do streaming, gaming, and YouTubing. Please help.

  • @hansangb
    @hansangb 1 year ago +3

    I wish LTT would test VR performance. Typically, clock speed is really important.

    • @Cooe.
      @Cooe. 1 year ago

      Nah. Latency (cache & memory) is WAAAAAAY more important for VR performance than raw clock speed, which is where the X3D CPUs are absolutely KING!!! This is the outright best CPU for PCVR if all you care about is in-game fps & frametimes.
      Being only an 8-core, though, means it might not be the best choice for stuff like simultaneous mixed-reality video capture and/or live-streaming, if that's by any chance something you're into. 🤷
      If that's more your workload, I would get a more core/thread-heavy i7-13700K or i9-13900K, if you're OK with having no future in-socket upgrade path plus Raptor Lake's straight BANAYNAYS power & thermals. But if you DO significantly care about the latter stuff, then the R9 7950X3D is probably the best choice.

  • @phx4669
    @phx4669 1 year ago +7

    I finally jumped in when the 5800X3D hit $319, and I had points on Prime so it only cost me points. I was already running a 5900X; I popped in the 3D chip and gained 20 fps in the same game at the same settings, running at 1.16 V, and switched to an air cooler. The 3D is surely efficient and cool as long as you run PBO2.

  • @Targetlockon
    @Targetlockon 1 year ago +10

    The CPU we were all anticipating and waiting to see results for, after seeing the 7950X3D and knowing how the 5800X3D performs.

  • @W0ND3RB0Y1
    @W0ND3RB0Y1 4 months ago

    Thank you for adding power consumption! Really worth a ton.

  • @rohultima
    @rohultima 1 year ago +1

    Glad to see these charts. Seems my 5800X3D will hold its own for a while. Just need to get a better video card and, well... wait for a change. Otherwise I might have this chip longer than my i7-4790.

  • @namuzed
    @namuzed 1 year ago +9

    I wish you'd include the 5900x or 5950x, just for some perspective for those of us running a fairly recent-ish AMD CPU.

    • @ji3200
      @ji3200 1 year ago

      There's a 5800X3D vs 5900X benchmark video on YouTube. YouTube it, brother.

  • @MrLagzy
    @MrLagzy 1 year ago +25

    Having the 5800X3D and now a 7900 XTX, I'm set for a couple of years. I might end up skipping the entire AM5 generation and seeing what both AMD and Intel have when AM6 comes out. Right now I don't need an upgrade, not for the next 5-6 years. Still, it's astonishing the implementations and innovations that AMD and Intel come out with. I wonder when AMD will also have a big.LITTLE structure and add some efficiency cores, or if they'll just go straight 3D V-Cache on all their future CPUs to make them as efficient as possible. They have already won that game.

    • @retovath
      @retovath 1 year ago +7

      They are planning big.LITTLE for Zen 5. Essentially, their plan is to use Zen 4 with AVX-512 and some other SIMD instructions removed and call it Zen 4 dense. This will shrink core size and interconnect size, enabling low-power Zen 4 compute while leaving full-fat Zen 5 to do the high-power, high-speed stuff.

    • @gruiadevil
      @gruiadevil 1 year ago +4

      They already tried a big.LITTLE chip.
      It's just for laptop/mobile though (2 performance + 4 efficiency cores). That's where efficiency matters.
      For desktop, all the efficiency cores would be parked and used only for production tasks.

    • @NahBNah
      @NahBNah 1 year ago

      Also they are going to add 3D cache onto their GPU I hear.

    • @Rippedyanu1
      @Rippedyanu1 1 year ago +4

      Same setup as you: 5800X3D on a B550 Taichi board with a 7900 XTX Taichi. The rig is amazing at 4K and not too insane on power draw, so it's perfect for me for the time being. I don't see a need to upgrade in the coming years at all. I do plan to use AM5 for a home NAS build though, with a B650 ITX board and a Jonsbo N1 case. I'll use one of the non-X CPUs because of how power efficient they are.

    • @ChrisWijtmans
      @ChrisWijtmans 1 year ago

      Personally I hope this big.little fad goes away. It makes the schedulers too complex.

  • @citizenzero5437
    @citizenzero5437 1 year ago +1

    Would you consider comparing CPU performance in some MMOs? Apparently the 3D cache wins there, especially in Star Citizen. Yes, that janky alpha mess actually runs relatively well on X3Ds.

  • @TheDrummist88
    @TheDrummist88 4 months ago

    Still rocking my 3700X with an ex-mining (probably) eBay-bought 3090 Founders. Working a treat. Wanting to upgrade my mobo to get more than 1 Gb LAN though, now that I finally have full fibre at 1160 Mb, so I'm thinking of buying a combo with this CPU.

  • @johnhughes6847
    @johnhughes6847 1 year ago +20

    Nice work, Team LTT! I especially appreciated the efficiency cost/kWh comparison. This is a significant factor in the northeast US, where it's over $0.42/kWh.