Bought AMD? You got played - Ryzen 7000 non-X Review

  • Published: 1 Oct 2024

Comments • 3.6K

  • @yobb1n544
    @yobb1n544 Год назад +5393

    X = Xspensive

  • @Gargantura
    @Gargantura Год назад +6683

    Linus, we all know we're gonna buy this CPU like 3 years after it arrives, at 50% of the initial price

    • @txmits507
      @txmits507 Год назад +102

      People don't want to buy a 3 yo cpu

    • @literallyhuman5990
      @literallyhuman5990 Год назад +229

      3 years? You mean 5 years?

    • @jakobmax3299
      @jakobmax3299 Год назад +141

      Lol, like with the 5600...

    • @devaraft
      @devaraft Год назад +437

      @@txmits507 lmao speak for yourself

    • @donatedflea
      @donatedflea Год назад +13

      @@txmits507 what's the point eh haha

  • @yaseen157
    @yaseen157 Год назад +812

    That thermals chart is insane! I'm so happy to see a return to awesome midrange CPUs, and am very excited for the budget ones.

  • @cpypcy
    @cpypcy Год назад +1330

    Fun fact: the Factorio devs mentioned in one of their blogs that they optimized the game code so much that it's mostly bottlenecked by CPU cache rather than clock speed. Also, RAM speed boosts performance by a lot too!

    • @petercollins797
      @petercollins797 Год назад +86

      Feeling good about the 3DVCache CPU I just bought then

    • @BeastOrGod
      @BeastOrGod Год назад +79

      These suckers bought 7700x instead of 5800X3D with 96mb cache!!! MUAAHAHAH

    • @jaxrammus9165
      @jaxrammus9165 Год назад +66

      @@petercollins797 i literally bought the 5800x3d for factorio lmfao

    • @DanielWillen
      @DanielWillen Год назад +51

      @@jaxrammus9165 7800x3d looking juicy as well for factorio, tarkov and other games .

    • @flipout1977
      @flipout1977 Год назад +2

      @@jaxrammus9165 I did the same thing but for rust lmao

  • @JohnSheppard92WasTakenThxYT
    @JohnSheppard92WasTakenThxYT Год назад +1213

    Short feedback on the new performance graphs: they are very nice and clean, but it is a little confusing how they are ordered. You seem to order them randomly, either by average or by 5% lows, but never consistently. You have graphs where some CPU sits in the middle by its average, directly behind it one that has a lower average but higher 5% lows, and after that one that has a higher average but lower 5% lows, which makes it hard to quickly see which CPU is actually faster. Both ordering methods, average or 5% lows, are totally viable, but you should choose one and stick to it to make the graphs more easily readable, especially since you show them for a limited amount of time. As it is, pausing the video constantly is a necessity to take in the information accurately, which makes it harder to follow the script while those graphs are on screen. Other than that, the Lab seems to really be doing a great job here!

    • @ParvinderSinghSaini
      @ParvinderSinghSaini Год назад +60

      Felt similarly confused as well

    • @johndododoe1411
      @johndododoe1411 Год назад +25

      The graphs are completely unreadable blurs on mobile at 360p.
      Plus the tests seem irrelevant for anyone not playing those particular games. It was much easier back when everyone used synthetic tests that were designed to give realistic results across all CPU generations since the 1981 launch of the PC platform. Famous tests that would still be relevant include LINPACK, Whetstone and 3DMark, though each test may need updates to deal with new inventions by the 3 chip giants.

    • @LeoLeahy
      @LeoLeahy Год назад +16

      I was going to comment on the same thing. Maybe get some inspiration from Hardware Unboxed. Their graphs are also often packed with information, but theirs is a bit easier to digest.

    • @EvilTim1911
      @EvilTim1911 Год назад +20

      Looks to me like a calculated score is assigned to each CPU based on all 3 categories in which they're measured

    • @svgPhoenix
      @svgPhoenix Год назад +70

      @@johndododoe1411 if they made their graphs readable at 360p it would look like granny zoom to the majority of their audience. Change the quality and wait for the frame to buffer if the graphs are that important to you.
      Only using synthetic tests that were relevant on the first PCs would be more useless to the average viewer than testing even one modern indie game that nobody plays.
      Get some perspective

  • @__-fm5qv
    @__-fm5qv Год назад +97

    The performance per watt is insanely impressive. I can imagine it being useful for, say, a university environment, where you might have hundreds of machines working on students' engineering projects or similar. You want the performance but also don't want to use too much energy.

    • @Manysdugjohn
      @Manysdugjohn 5 месяцев назад

      Once you turn on PBO they consume almost exactly the same wattage.

    • @colinkirkpatrick5618
      @colinkirkpatrick5618 2 месяца назад

      @@Manysdugjohn what is PBO?

    • @Manysdugjohn
      @Manysdugjohn 2 месяца назад

      @@colinkirkpatrick5618 Precision Boost Overdrive. It's a setting in the BIOS that lets the CPU boost to its maximum capabilities until it hits 75°C (for older Ryzen) or 95°C for AM5 Ryzen.

    • @gaspaider7392
      @gaspaider7392 Месяц назад

      @@colinkirkpatrick5618 the built in automatic overclocking

    • @fietsindeschie
      @fietsindeschie 13 дней назад

      @@colinkirkpatrick5618 peanut butter orange

  • @CanIHasThisName
    @CanIHasThisName Год назад +385

    Labs are definitely showing their value, great review.
    I think you should also include idle power draw. It can be a big deal for some use cases and sometimes people may find that their systems are drawing more at idle than they should (had that happen recently and it turned out to be a driver issue), so it's useful to know where approximately they should be.

    • @marsovac
      @marsovac Год назад +10

      People are noticing high idle power draws because they install crap like animated wallpapers and 10 game launchers, or they have multi-monitor setups with high refresh rates. Or because they enable the high performance power profile in Windows.
      Tldr: there are too many factors for idle power draw today, since there is never a real idle. What you see drawn by the CPU is not what you measure from the cord.
      I use a 15-minute sleep timer with wake-up via USB (keyboard/mouse). So it uses a few watts to power the memory and that's it. And for the convenience of not having to push the button manually, it powers on just by pressing the mouse.

    • @Anankin12
      @Anankin12 Год назад +4

      Usually idle power draw is around 5W whatever platform you're on.
      At least, mine draws the same with the 3600 and 5800X3D. Note that the 3600 never puts its cores to sleep, while the 5800X3D will run just 2 cores/4 threads if the workload is light enough, with the others almost literally turned off.

    • @Anankin12
      @Anankin12 Год назад +2

      You can't measure the whole system's power draw for a review though: changing ANY component will massively change your results, so the possible combinations are too many to test or even just list.

    • @spoots1234
      @spoots1234 Год назад

      i'm interested in the idle power draw of the non X SKUs. I've seen a few people note 60-80w idle power draws on 7000 series compared to their intel systems.

  • @RCmies
    @RCmies Год назад +1165

    AMD being able to pull off these power draws just gives me a lot of faith in their engineers.

    • @sovo1212
      @sovo1212 Год назад +68

      Actually, what they did is CPU binning 101. They just stockpiled "golden" 7000x CPUs for months and are now selling them for the 65w lineup. Genius move.

    • @EuclidesGBM
      @EuclidesGBM Год назад +22

      @@sovo1212 Not actually golden. That 0.2GHz drop is enough for parts that were subpar at higher clocks to work a lot better.

    • @lua-nya
      @lua-nya Год назад +55

      If you're amazed at this performance and these power draws you should check the work they did with the Steam Deck's APU. I've got a lot of hope for their future mobile chips.

    • @sovo1212
      @sovo1212 Год назад +30

      @@EuclidesGBM Grab any 7000x CPU, try to underclock that 0.2ghz and undervolt it to get to 65w TDP, it won't be stable.

    • @EuclidesGBM
      @EuclidesGBM Год назад +6

      @@sovo1212 considering I managed to overclock my 3600X to 4.3GHz all cores whilst undervolting it to 58W... I think it is quite possible

  • @Hittorito
    @Hittorito Год назад +141

    The 7600 is ice cold, which is lovely. Can't wait for it to age and see cheaper options and small form factor options!

    • @teamfishbowl1076
      @teamfishbowl1076 Год назад +2

      i got mine to 93c building shaders on the last of us lol

    • @kliibapz
      @kliibapz Год назад

      @@teamfishbowl1076 mine goes above 95°C with an Aerocool Verkho 2 (2-pipe cooler), while my R5 1600X (95W) never went above 60°C with the same cooler. I don't understand that.

    • @teamfishbowl1076
      @teamfishbowl1076 Год назад +1

      @@kliibapz yeah it's a bit odd. Prime95 on an AM5 7600 for an hour with the Prism cooler will easily hit 90°C+.
      My Ryzen 2600 with the Dark Rock Pro 4 can do 3-4 hours of torture tests on Prime95 and never go above 65-70°C.
      I want to test the Dark Rock Pro on the AM5 and see what happens, but either way the 7600 runs NOWHERE NEAR ICE COLD.

    • @Physics072
      @Physics072 Год назад

      ICE cold? They run around 93°C with the stock cooler; that's 200°F, enough to cook meat easily. Not sure I would use ice to describe something that can cook steaks.

    • @GeneralS1mba
      @GeneralS1mba Год назад

      @@Physics072 Is it much worse than the 7600X?

  • @Mraz565
    @Mraz565 Год назад +905

    If you are still on an AM4 mobo, the 5800X3D is a great gap filler without the need for a new mobo/RAM, and it seems to be staying on par with the next-gen CPUs

    • @DangerangTheAIMan
      @DangerangTheAIMan Год назад +136

      I'm not gonna upgrade from a 5800X to a 5800X3D xD

    • @Shadow-zo8xj
      @Shadow-zo8xj Год назад +6

      Well, I don't wanna be the Karen today, but the 5800X3D is better for content creators or AI people, though more cores could mean better performance even though a game only runs on 8

    • @kahtyman7293
      @kahtyman7293 Год назад +17

      yeah, i found recently that my b450 mobo has a bios update for 5xxx series... and i'm running 1600 so i'll probably save a buck and just buy brand new r9, and i don't even have to buy other parts, yay

    • @DangerangTheAIMan
      @DangerangTheAIMan Год назад +115

      @@Shadow-zo8xj I thought it was the other way around, 5800X3D for gaming mostly, but I'm no expert

    • @paulssnfuture2752
      @paulssnfuture2752 Год назад +6

      For games yes.

  • @mes0gots0its
    @mes0gots0its Год назад +350

    THANK YOU for finally including Factorio benchmarks in your CPU reviews.
    I think CPU benchmarking in industry reviews is way too focused on AAA GPU-bound action games instead of processing-intensive strategy/sim games. Titles like Factorio and Civilization 6 are very relevant for people shopping for the optimal gaming CPU.

    • @halfbakedproductions7887
      @halfbakedproductions7887 Год назад +19

      We need more benchmarks for productivity as well. Everything focuses on AAA gaming at 8K with full RTX, paired with the best DDR5 and a 4090 overclocked to the point of blowing up.
      That's not my use case and the data is unhelpful.

    • @archerboy2714
      @archerboy2714 Год назад +45

      @@halfbakedproductions7887 they did the same number of productivity benchmarks as they did gaming....

    • @padnomnidprenon9672
      @padnomnidprenon9672 Год назад +5

      I play games like Factorio and Timberborn and I dream about reaching 60 UPS with a really large base. Triple-A games are GPU-bound, that's well known

    • @Leraiton
      @Leraiton Год назад +7

      Also games like Victoria 3 (seeing how far it can get as observer from start after x amount of time) or Dyson Sphere Program

    • @blehbleh9283
      @blehbleh9283 Год назад

      Phoronix has a great benchmarking suite

  • @vexedhulk9817
    @vexedhulk9817 Год назад +89

    Looks like they made the 7600 with the same idea that the 3600 turned out to embody. I really like it!
    I'm still using an R5 3600 and don't have any intention of upgrading soon. I'm still running at stock, but when it starts to wear down, I'll simply need to OC a bit to match whatever GPU I have (currently an RX 6650 XT).

    • @Postman00
      @Postman00 Год назад +5

      Due to ass prices in a small market, I still often buy Ryzen 5 3600s for my office.

    • @Tupsuu
      @Tupsuu Год назад +1

      @@Postman00 5700x is like only 200€

    • @Winnetou17
      @Winnetou17 Год назад +6

      @@Tupsuu He mentioned "a small market". That makes me think he's not in US, nor UK.

    • @Tupsuu
      @Tupsuu Год назад +1

      @@Winnetou17 Im in Finland

    • @Winnetou17
      @Winnetou17 Год назад +1

      @@Tupsuu Ok, fair enough.

  • @NanzoGonzo
    @NanzoGonzo Год назад +389

    Waiting for the 3D prices, but that 7900 is looking amazingly efficient for a nice price

    • @sophieedel6324
      @sophieedel6324 Год назад +10

      It's not a nice price at all. AM5 motherboards are very expensive, so is the DDR5 memory you have to buy.

    • @TheEpxMaster
      @TheEpxMaster Год назад +70

      @@sophieedel6324 it’s better than buying a motherboard and CPU every 2 years with Intel

    • @Distress.
      @Distress. Год назад +19

      @@TheEpxMaster not really I upgrade after 5 years and by then every component is worth upgrading.

    • @fynkozari9271
      @fynkozari9271 Год назад +21

      Ryzen 7900 76 mb cache. I would kill for that cache. 65 watt tdp is the best.

    • @MrZodiac011
      @MrZodiac011 Год назад +1

      @@TheEpxMaster It's realistically the same since the AMD motherboards are twice the price of Intel's. The 7900 sounds like a pretty mediocre deal; when you're spending that much on the AM5 boards and DDR5, you're probably better off with a 7900X3D when that comes out, because chances are if you want a board for a 7900, you're not an average user and shouldn't be cheaping out

  • @TheJackiMonster
    @TheJackiMonster Год назад +458

    Upgraded from 2700X to the 5900X recently and even with its eco mode capped at 65W it's a massive improvement to me. So I will probably wait for the next huge efficiency jump and another increase in thread count.

    • @Barrysteezy
      @Barrysteezy Год назад +4

      I'm curious. I also have a 2700x but with a rtx 2070. What gpu are you using and what is your workload?

    • @norkshit
      @norkshit Год назад +11

      @@Barrysteezy You should ponder these questions for a while, if you can confidently answer them then you’ll have an easier time making a choice:
      1. What chipset is ur MOBO, and is it one that supports 5000 series?
      2. Do you plan on upgrading to an entirely new system in the next 5 years
      3. Do you plan on doing any creative work or CPU-intensive work such as live-streaming to twitch while gaming?
      These are all things you should consider before spending any moneys on upgrades

    • @TheJackiMonster
      @TheJackiMonster Год назад +15

      @@Barrysteezy I recently got an RX 6800 but had an RX 5700 before that. One thing helping a lot with the GPU was enabling resizable bar in the BIOS.
      Honestly though my setup is pretty screwed by PCIe limitations since I'm using 8 lanes for some SSD adapter card. Therefore my GPU is limited in bandwidth (only using 8 lanes) and I only have PCIe 3.0 with my motherboard. However it's still fine but I assume with PCIe 4.0/5.0 I could easily get 5~10% more performance in games. The reason for upgrading was mostly to get a newer feature set with mesh shaders and ray tracing pipelines (I wanted to do some graphics development with that).
      However the CPU upgrade still made a lot of sense to me because I'm compiling a lot of code during development. So it adds up easily now that I draw less power while it takes less time as well.
      The main reason I didn't go for AM5 is the pricing, while my old mainboard (X370 chipset) got a BIOS update which enabled Ryzen 5000 support (even with more than 8 cores). So the upgrade was much cheaper, and the benefit of Ryzen 7000 over 5000 is not that huge to be honest.
      But if you consider it, I would wait for the reviews of the 3D cache chips, especially if you care most about gaming. There's still a chance that they won't perform as well at launch though, since they rely on software optimizations to utilize their stacked cache most efficiently. So hopefully reviews will cover that part.

    • @yuralmspw4911
      @yuralmspw4911 Год назад

      Where exactly do you see improvement and did you really need it? The 2700X is still capable of any task, you're just wasting money for higher numbers in benchmarks.

    • @Frozoken
      @Frozoken Год назад +3

      Why would you buy a 12-core chip just to cap it at 65W? 😂 I can almost guarantee you that you'd lose barely any performance with a 65W 7900X

  • @qwertimus
    @qwertimus Год назад +307

    Very excited to see what the 3D V-Cache chips offer. Amazing how competitive the 5800X3D is a generation later

    • @tilapiadave3234
      @tilapiadave3234 Год назад +5

      5800X3D, WAY WAY overpriced, EXTREMELY overhyped

    • @TheNicePIXEL
      @TheNicePIXEL Год назад +48

      @@tilapiadave3234 wrong :)

    • @tilapiadave3234
      @tilapiadave3234 Год назад

      @@TheNicePIXEL Indeed, you are wrong, very wrong

    • @TheNicePIXEL
      @TheNicePIXEL Год назад +32

      @@tilapiadave3234 keep using your Intel CPU and let us use our OVERPRICED and OVERHYPED X3D's 😁

    • @tilapiadave3234
      @tilapiadave3234 Год назад

      @@TheNicePIXEL I most certainly will continue to use the vastly superior CPUs.
      My task is to INFORM all the newbies of the LIES spread regarding the 5800X3D; I hate when newbies waste hard-earned dollars

  • @aaronthewhiz3160
    @aaronthewhiz3160 Год назад +744

    So the 7600 is the new budget king, the 7700 is DOA, and the 7900 is a power efficient mid/high-end monster? If only Radeon could deliver like Ryzen
    Edit: changed the 7900 from mid-range to mid/high end

    • @TimSheehan
      @TimSheehan Год назад +127

      7700 isn't exactly DOA because you can just turn on PBO in the BIOS and get pretty close to X performance for less $

    • @PDXCustomPCS
      @PDXCustomPCS Год назад +28

      The budget king will be the 13400F. With BCLK OCing becoming more common on boards at good prices, a 13100F overclocked will even give the 7600 a run for its money.

    • @MiGujack3
      @MiGujack3 Год назад +46

      No chance of it being a budget king with those board prices.

    • @PDXCustomPCS
      @PDXCustomPCS Год назад +19

      @@MiGujack3 I was thinking that too; AM4 is a dead end but will still be relevant. Z690s are very affordable. B660s are plenty for a 13600K or better. Mortar Max + 12400F is where it's at right now: 5.3GHz and 13600K performance in gaming.

    • @nonameyet2205
      @nonameyet2205 Год назад +45

      @@PDXCustomPCS The 13400F is still based on Alder Lake cores, so gaming performance will be far behind the 7600 and 13600K and only about the same as the Zen 3 5600X, which you can get for $135. You can see at 7:10 the 7600 is a bit faster than even the 13600K in gaming; it's pretty much impossible for the 13400F and 13100F to match the 13600K and 7600, and if you OC them you can also OC the 13600K or 7600.

  • @JRFKSERRANO
    @JRFKSERRANO Год назад +279

    As an SFF user, those temps are impressive. I would definitely go this route once I move to am5.

    • @halocubed6788
      @halocubed6788 Год назад +1

      how small? i want to go sff but i’m afraid to make the jump

    • @defro125
      @defro125 Год назад +8

      @@halocubed6788 I built in an 11-liter Dan A4 H2O case with a 3070 Ti and a 5800X3D about 2 weeks ago. It has 60-degree temps for the GPU and the CPU never goes higher than 70 degrees when gaming. It has been running great and I overall love it.

    • @dotafarm2699
      @dotafarm2699 Год назад +1

      @@halocubed6788 try tecware fusion. its itx friendly

    • @cuerex8580
      @cuerex8580 Год назад +2

      as a sff fan myself, i gotta keep the 5700x until am5 reaches its last gen

    • @cybersamiches4028
      @cybersamiches4028 Год назад

      @@cuerex8580 nice, EOL is going to be pretty awesome on AM5!

  • @ApolloniusOfTyana0
    @ApolloniusOfTyana0 Год назад +31

    Wow, those are impressive thermals for the 7900! Think it might be going into my next build (which will be SFF)! I want to see what the X3D line does this time around compared to the 7900 and the 5800X3D

  • @tungstentaco495
    @tungstentaco495 Год назад +280

    I was planning on skipping the 7000 series, but the price and TDP of the non-X chips definitely have me reconsidering. If only the 600 series boards and DDR5 RAM weren't so overpriced, then it would be a really easy choice to upgrade.

    • @officialteaincorporated243
      @officialteaincorporated243 Год назад +32

      DDR5 ram isn't that overpriced in my opinion, however those 600 series boards are extremely expensive.

    • @fynkozari9271
      @fynkozari9271 Год назад +4

      @@officialteaincorporated243 Easy, just buy a B560 mobo. How much is DDR5 currently? DDR4 3200MHz in my country is $58, the current price before discounts.

    • @reformierende_person
      @reformierende_person Год назад +13

      @@fynkozari9271 i believe zen4 doesnt support ddr4 ram

    • @fynkozari9271
      @fynkozari9271 Год назад +3

      @@reformierende_person B550 is AM4, B650 is AM5.

    • @DigitalJedi
      @DigitalJedi Год назад +1

      I'm hoping we see some Zen 3+ AM5 chips for the budget 6000 series. Imagine something like the 6980HS becoming the 6700G or something like that.

  • @kabadisha
    @kabadisha Год назад +235

    This is quality reporting. Coverage like this helps real people filter the marketing hype and holds manufacturers to account. Nice job LMG 👌 And nice job AMD!

  • @ThisSteveGuy
    @ThisSteveGuy Год назад +2

    You need to explain your ordering on the gaming benchmarks graphs because it looks bonkers at first glance. How exactly are you combining lows with the average FPS? Are you doing a mean average or what? Also, why are you doing this? Just seems like it's overly complicated and goes against the purpose of a CPU benchmark, which is simply to measure the relative power of the chips. Lows can vary wildly between games and are more about software optimization than hardware power. If a game has micro-stutters on one chip and not another, that could be due to so many factors which have nothing to do with power, things which could be patched tomorrow for all you know.

  • @TheTalamier
    @TheTalamier Год назад +294

    All of the X version chips were available for $10 -$20 above these new chips between Thanksgiving and Christmas. I picked up a 7700X for $339 from Newegg. Currently, the new chips are a great value but early adopters of the X series did just fine.

    • @BoberFett
      @BoberFett Год назад +15

      I went with the 7700 and have no regrets. I got 32GB of free DDR5 from Microcenter. I did a direct comparison against a free mobo deal they were running on Intel 12 series, and made the decision to go with the new platform instead of the tail end of an old one. I don't feel bad about it. And I'm happy with the 6800xt I paired it with for $500 after seeing the latest gen video cards.

    • @altersami9660
      @altersami9660 Год назад +2

      I got $20 off the CPU and mobo combo, in addition to the discount on the CPU, so I kinda got it cheaper than the non-X version. 😂

    • @duohere3981
      @duohere3981 Год назад

      I don’t believe you

    • @SlimedogNumbaSixty9
      @SlimedogNumbaSixty9 Год назад +1

      I paid $30 more for a 13700k, z690 for $150, and carried over ddr4 b-die ram I bought two generations ago. A more normie 32gb ram kit would've only been $70 anyway

    • @maanmahmoud4537
      @maanmahmoud4537 Год назад

      @@SlimedogNumbaSixty9 please tell me total price?

  • @forestrf
    @forestrf Год назад +282

    It's nice seeing a product being able to get a good review from time to time, not what we are getting with gpus

    • @NinjaForHire
      @NinjaForHire Год назад +3

      When iGPU integration starts shitting on a dedicated GPU.

    • @Gabu_
      @Gabu_ Год назад +17

      @@NinjaForHire Have you seen the AMD laptop chips announced at CES? iGPUs are going to DEMOLISH low end dedicated GPUs.

    • @cooleyYT
      @cooleyYT Год назад

      @@Gabu_ lmao yeeaa suuuurre

    • @cooleyYT
      @cooleyYT Год назад

      @@NinjaForHire bad troll. Wasn't funny. Nobody but you laughed. Cus an integrated GPU beating my dedicated GPU is more of a 1 in 9 billion chance. Get back to me when integrated steps up their game *mic drop*

    • @filipkuchta_main
      @filipkuchta_main Год назад +7

      @@cooleyYT who hurt you...

  • @PaulManington
    @PaulManington Год назад +36

    Love the frames per watt charts. Would love a real deep dive on this with eco modes on cpus and maybe even power limits / fps limits on gpus too!

  • @ZachStein
    @ZachStein Год назад +146

    7900 will be great for building a quiet entry-level workstation at a very reasonable budget. This is pretty exciting. I think productivity-wise for devs, this is a great chip.

    • @bigdoublet
      @bigdoublet Год назад +1

      What about for gaming? Good as well?

    • @youtubewatcher4603
      @youtubewatcher4603 Год назад +15

      @@bigdoublet CPUs haven’t mattered too much in gaming for a while. A well chosen, cheap, older CPU is usually the way to go.

    • @someonesomewhere5325
      @someonesomewhere5325 Год назад +18

      @@youtubewatcher4603 every recent cpu released has increased ipc to the point where something like a second hand i7-8700k is actually a bad purchase rn, you really dont wanna go below 10th gen even if you gotta go entry level to stay on the newer platform unless you wanna deal with microstutters on older cpus.

    • @Haskellerz
      @Haskellerz Год назад +1

      DDR5 128 GB ram (4400 Mhz is the highest stable config) costs like $800
      DDR4 3600 128 GB ram costs $450 and has the same performance
      7900 sucks for virtualisation workstation builds or CAD simulation workstations

    • @bastianm5478
      @bastianm5478 Год назад

      @@Haskellerz sucks for virtualisation? does that mean VMware machines? that's my use case. please respond with what I should go for. I know I don't upgrade a lot coming from a 4690k but I like the idea of having the latest and greatest. was almost sure to buy a 7900 but why would it suck for vmware?

  • @leungoscar4126
    @leungoscar4126 Год назад +176

    The fact that 5800X3D is still here for comparison still surprises me 👀

    • @TiPeteux
      @TiPeteux Год назад +5

      Why? As an owner I'm more than happy tho haha.

    • @leungoscar4126
      @leungoscar4126 Год назад +5

      @@TiPeteux I love how 5800X3D still performs, would've gotten one if I didn't have 3950X right now :P

    • @blunderingfool
      @blunderingfool Год назад +2

      @@leungoscar4126 I have a 5600, worth an upgrade?

    • @thecarrot4412
      @thecarrot4412 Год назад +4

      @@blunderingfool That's basically the upgrade I made. Depending on what you play and the price you can find the 3D chip at, it's a bit equivocal. If you are happy with performance right now it's not worth it. If you play things like flight sim in VR or really logic-heavy games like Civ a lot though, it'll be a great upgrade without needing a whole new PC, especially if you sell the 5600 and get a good deal

    • @nupagagyi
      @nupagagyi Год назад +6

      It's a last-gen chip. It's not that surprising.

  • @SmarterGaming
    @SmarterGaming Год назад +2

    This COULD have been a good review if you had bothered to include the AMD-provided PBO option. Why LTT would test without showing the performance gained by using PBO (a simple click-and-forget boost) that almost everyone will use is baffling. It seems you are too far gone into building out state-of-the-art labs and staffing highly qualified people, rather than using those assets to showcase good, real-world testing procedures. It is like running Intel and not using XMP... why not show both the "stock" and "PBO" performance by using the manufacturer-provided optimizing software? Another LTT clickbait title, with less than thorough test results and conclusions... Try doing better next time?

  • @tatec22
    @tatec22 Год назад +134

    It would be great if you could include the idle power consumption (e.g. on the Windows desktop) in future tests as well. Like you said in the video, energy costs are higher than ever. When I'm using my PC I'm usually browsing the web or listening to music for the same amount of time or even more than I'm actually gaming.

    • @tmnt9001
      @tmnt9001 Год назад +8

      Besides, I believe that it was AMD graphics cards that had unusually high numbers on idle. Including that in the tests puts pressure on the manufacturers to fix whatever issue they might have.

    • @simonvetter2420
      @simonvetter2420 Год назад +4

      Very good point, I always wish for the same thing.

    • @Daisudori
      @Daisudori Год назад +6

      Intel has lower power usage at idle. The CCD chiplet design is inherently more power hungry at idle/very low loads.
      About the AMD GPU: the memory doesn't downclock properly at idle when using multiple monitors (of differing refresh rates, i.e. a 165Hz main and a 60Hz second like I have) and it consumes a constant 30-40 watts at idle (in my case on an RX 6800).

    • @joshjlmgproductions3313
      @joshjlmgproductions3313 Год назад

      @@Daisudori Your GPU only consumes 30W at idle? My 2080 Ti consumes 60 - 70W at idle, and I only have 1 1080p monitor.

    • @_--_--_
      @_--_--_ Год назад

      @@joshjlmgproductions3313 If your GPU pulls 60W at "idle" then your GPU never actually goes to idle.
      You probably have your Windows misconfigured; set the power setting to balanced instead of high performance. It doesn't have any real impact on game performance anyway, it's just a waste of energy.
      A 2080 Ti should pull around 10-15W at idle.

  • @nfohkens20
    @nfohkens20 Год назад +95

    Happily surprised to see Factorio there as a game to compare at; no idea how that got on your radar but I look forward to hearing more benchmarks with that game, maybe with the UPS/FPS on a 500+ hour mega base since that's where it can really matter :D

    • @KarrasBastomi
      @KarrasBastomi Год назад +11

      The Factorio devs are magicians or something. My mid-size base would run happily on a 12-year-old i5-2410M. Just amazing.

    • @Mindpron
      @Mindpron Год назад +6

      Factorio is great for cache benching, because it just beats the living shit out of CPU and GPU cache.

    • @jammo7370
      @jammo7370 Год назад +5

      People kept spam requesting it during the intel arc gaming stream so I guess it stuck in their heads afterwards lmao

    • @ACE112ACE112
      @ACE112ACE112 Год назад

      Destiny no lifing the game maybe.

    • @ACE112ACE112
      @ACE112ACE112 Год назад

      @@jammo7370 Probably Destiny streamer fans.

  • @SoarDoesGaming
    @SoarDoesGaming Год назад +5

    I got the 7700 non-X because not only is there very little performance difference that will be noticeable, but that 65W TDP in an SFF build is a huge bonus for thermals

    • @chrisrica6830
      @chrisrica6830 7 месяцев назад +2

      It has been 7 months. Have you enjoyed it so far?

  • @twocows360
    @twocows360 Год назад +86

    i'm very happy to see efficiency being targeted for once. now if only the GPU market would go this route.

    • @defnotatroll
      @defnotatroll Год назад +3

      AMD have always cared about efficiency though. It's intel who show no interest in it

    • @ultratronger
      @ultratronger Год назад +6

      @CompilationHUB since 2017, yes they have; the RX 7000 series isn't as efficient because it's not as powerful as they expected, but it's not a hardware flaw, they need to fix the drivers

    • @Freestyle80
      @Freestyle80 Год назад

      @@defnotatroll here we go, fangirls spreading misinformation just to shill for a corporate company

    • @XantoS771
      @XantoS771 Год назад +1

      @CompilationHUB r9 290x....only the psu's can tell the horror stories from that card haha

    • @SirCrest
      @SirCrest Год назад

      @@defnotatroll Not until Zen.

  • @JammyBriton
    @JammyBriton Год назад +144

    I'm still using my first CPU, R5 2600. Love the direction AMD have taken with value, which means I'll be looking at potentially going AM5 when I upgrade probably next year. (Gotta let those prices drop a bit more). I can only hope GPUs will follow at some point

    • @guancoco
      @guancoco Год назад +1

      You will see that in 10 years, when other companies come out.

    • @AndyMitchellUK26
      @AndyMitchellUK26 Год назад +19

      Even if you could find a decently priced 5600 you'd still see a decent performance jump over the 2600. Then when it comes time to upgrade you would have a decent base system to sell, as the 5000 is the last series of the AM4 platform. I had a 3700X and didn't need to upgrade but still did (5800X) and honestly did not regret doing so one bit.

    • @LuLeBe
      @LuLeBe Год назад +8

      What's the issue with the 2600? I have a 3700, I know it's zen2 and 8core but at least for many home workloads it seems pretty good. Games will probably not run much better unless you have a really great GPU like at least a 3070 or so, and the same goes for blender renders or whatever else you might be doing for fun.
      I would only upgrade if there's a clear advantage in your particular system combination and your use case, and then also only if you need it. So many people play on 1080p 60fps but want 4k120 for no reason. If you play battlefield, will you really be better at 4k120 or notice the visual difference enough to warrant spending $1500 on all that stuff?
      I upgraded my whole system a while ago (3700, rtx2070, 32gb RAM) and my secondary PC with a 1050ti & i7 4770k runs most things well enough to not really care during gameplay.

    • @AndyMitchellUK26
      @AndyMitchellUK26 Год назад +12

      @@LuLeBe It's not an issue but it's just a less mature processor. AMD made huge improvements between 2000 and 3000 and even bigger up to the 5000 series. Memory optimisation was one of the biggest changes and in some scenarios you can see a massive uplift in performance as the way the infinity cache works relies heavily on memory speeds and latency. I upgraded my 3700x to a 5800x to take full advantage of my RTX 3070 and triple wide 5760x1080 setup. I get a solid 60 fps now in most games whereas before I did have quite a few frame time issues as well as 1% lows dipping. You have to remember that everyone has different uses for their computers and what may seem like a small upgrade can turn out to be a big one for some people. That, and AMD have made huge improvements between each generation of Ryzen. It's not like the old days where Intel only improved by a few percent each generation.

    • @raikkappa23
      @raikkappa23 Год назад +2

      I always wanted an R5 2600! Of course, my upgrade got delayed substantially, and I got an R7 7700X for christmas. Best thing ever. Good luck with the upgrade! I'm still using a mid-tier GPU...

  • @thomasberdon2849
    @thomasberdon2849 Год назад +5

    Sir Linus, good evening. I'm Thomas E. Berdon from the Philippines... my rig is already complete except for the CPU... if you ever have any spares to give away that you never use anymore, an AMD Athlon 3000G or AMD Ryzen 3 3200G would be awesome... this is my rig, I built it on my own: an old micro ATX computer case, an A320M-S2H Gigabyte motherboard, 2x4GB Kingston Beast 2666MHz CL16 RAM, a Walram 128GB M.2 SSD for the OS, a Ramsta 128GB 2.5" SSD for storage, a WD Blue 500GB HDD for storage and a DVD-ROM drive... that's it... I always watch your videos on RUclips as you assemble builds one by one until you're done... I learned a lot from you, sir... thank you very much for taking the time to read this...

  • @Homiloko2
    @Homiloko2 Год назад +46

    That's amazing. Love the lower TDPs and temps with an almost equal performance. It's definitely worth the small performance loss.

    • @NinjaForHire
      @NinjaForHire Год назад +6

      I'm with this guy; you save so much power relative to performance it's crazy. Makes Intel look like they don't care about your wallet in the long run.

    • @albundy06
      @albundy06 Год назад +2

      @@NinjaForHire LOL.
      AMD is a corporation too. They don't give a flying F about your damn wallet.
      They only released these non-X chips because their X sales and new platform sales were low.
      It's almost as if the average user didn't like the idea of spending all that money on a new everything for a chip that stupidly ran itself factory overclocked to 95°C.
      Funny how, as soon as AMD became the better choice for CPUs, the X series was all there was. And the prices went up!
      But in a fanboy's mind, AMD AND ITS SHAREHOLDERS ARE LOOKING OUT FOR MY WALLET!

    • @ablet85
      @ablet85 Год назад +1

      @@NinjaForHire if AMD cared, the 7900 XT wouldn't cost so astronomically much. Don't simp for a company. Any company, let alone a billion-dollar company

  • @SD-ud9sh
    @SD-ud9sh Год назад +9

    The 13600K's score is wrong by a lot and the 13900K is probably hitting thermal throttling. I am not an Intel fanboy, but the testing lately is very light on gaming and in general not accurate or consistent

  • @n-o-i-d
    @n-o-i-d 8 месяцев назад +3

    The Ryzen 5 7600 for my new build is ordered and on its way, along with the other components. It's my first AMD build since 1999, when I had an AMD K6-2.

  • @Neoxon619
    @Neoxon619 Год назад +36

    I’m probably still gonna hold out for the 3D CPUs, but I’m glad for the better deal here.

  • @roqeyt3566
    @roqeyt3566 Год назад +160

    That r9 7900 was pretty much what I was waiting for, I'll grab one in the summer when supply/demand settles on the entire platform

    • @fynkozari9271
      @fynkozari9271 Год назад

      But the X670 price. Are you gonna get a B650?

    • @roqeyt3566
      @roqeyt3566 Год назад +12

      @@fynkozari9271 why? I just care about the power consumption. 1kWh is 1€ here, so.
      If I go for a lower-end B650, I cannot safely assume it'll run the next gen of chips, meaning that the 50-70€ I save by not going X670 means I gotta redo half my build and spend another 150-200 bucks.
      I'll go X670; the 50€ saved on a 700€ investment is not worth the compromise

    • @nlemonj
      @nlemonj Год назад +2

      @@roqeyt3566 wow, your power is about 10x what I pay right now. The margin drops to 4x what I pay in summer though.

    • @dalzy25
      @dalzy25 Год назад +6

      @@roqeyt3566 Even A320 motherboards can run the 5800X3D. I don't think future compatibility is really an issue on B650. There's a reason B650 is so expensive. They come with good VRMs and I/O. Buy only what you need and want in a motherboard. No point having a X670E if it doesn't have the features you want.

    • @roqeyt3566
      @roqeyt3566 Год назад

      @@dalzy25 You technically can, but how many first-gen mobos got BIOSes for the 5000 series?
      On top of that, many mobos show weird behavior (sluggishness, microstutters, performance loss) if they're too low-end or old, as they were designed for their generation of chips and not the future ones. Not only that, power delivery, especially with a 200W+ socket, becomes much more important compared to the old limited AM4 socket.
      So while you could, it's a question of whether you should. It'll be a pain in the behind if history is anything to go by

  • @paulreeves1787
    @paulreeves1787 2 месяца назад +3

    This one aged well, whoever bought AMD still has a working pc 😂

  • @TetraSky
    @TetraSky Год назад +33

    it's nice to see optimization to reduce power consumption and thermals instead of just throwing more watts at their chip like Intel and Nvidia have been doing for the past few years

    • @damara2268
      @damara2268 Год назад +5

      behold i9 14900K 500W

    • @PentaFox5
      @PentaFox5 Год назад +1

      @@damara2268 more like 600+ xDD

    • @withjoe1880
      @withjoe1880 Год назад +1

      To be fair, a lot of people forget that Intel is still on a 10 nm process node while AMD is on a 5 nm process node. This is what accounts for the current difference in power efficiency.
      We can look at the Intel roadmap to see that they plan to release 14th gen processors in the second half of this year. These are on the 7 nm node, which should greatly reduce power consumption. Similarly in 2024, they plan to release the 15th gen, which should use a 2 nm process. This is slated to compete with the AM5 release in 2024 which should be using a 3 nm process.
      So, we should expect the Intel 14th gen to be a large jump in efficiency even if the designs are just scaled, though there are also likely going to be some other improvements. This should help them compete with the non-X AMD cpus.
      Then, we are set up for another head-to-head in 2024: Intel 15th gen vs Zen5. It is great to see Intel potentially back in the game and helping to spur more competition in the CPU space.
      In recent times, Intel has been stagnant, but now, they are showing signs of moving back into the competition, which is great for us consumers.

    • @YounesLayachi
      @YounesLayachi Год назад

      @@withjoe1880 intel pulled a sneaky on ya and renamed their 10 nm ESF to "intel 7" , they are the same process, just a different marketing number

    • @withjoe1880
      @withjoe1880 Год назад +2

      @@YounesLayachi I know the current gen is 10 nm (Intel 7), I said 14th gen is 7 nm (Intel 4) and 15th gen is 2 nm (Intel 20A).
      Thanks for watching out, though.

  • @ash36230
    @ash36230 Год назад +121

    Meanwhile I'm satisfied with my 5900X, but it does set good expectations for the next gen (8000), when I'm more likely to upgrade. I eagerly await seeing benchmarks for the X3D variants of these chips.

    • @ShinyMooTank
      @ShinyMooTank Год назад +13

      Same with my 5900x. I'll probably upgrade in 5 years right as AM5 socket support ends so I can get a mature and cheap CPU.

    • @hrayz
      @hrayz Год назад +8

      I'm happy having my 5950X running perfectly on an original x370 board! Longevity!

    • @HilleCine
      @HilleCine Год назад +1

      Same. I went 2700x then 5900x. Gonna stay a while

    • @arsim612
      @arsim612 Год назад +1

      wouldnt next gen be 9000?

    • @wixardo
      @wixardo Год назад

      Got a 5900x too but I can't seem to get the temps under control, 85C when gaming no matter what the hell I do. I'm not used to having temps that high, feels uncomfortable.

  • @jyadel
    @jyadel Год назад +18

    As someone who was just about to upgrade their SFF build to a 5700X, the 7700 is definitely the way I'll be going (plus it's an excuse to get an all-new platform and shiny new things)

    • @Calisota
      @Calisota Год назад +1

      Probably the right decision; while more costly now, you ain't stuck on an EoL socket with no further upgrades but rather a new one that has just started shining. Plus new RAM that can only increase in quality, which might be a real game changer in the future

    • @zacsolo1594
      @zacsolo1594 Год назад +1

      Which is also currently extremely expensive. You'll need to buy new RAM anyways so why pay a lot for underwhelming new gen stuff when the old one works just fine

    • @zaknafein8184
      @zaknafein8184 Год назад

      Just upgraded to the 5700X. Very happy with the performance gain and the undervolting and OC potential. Will last me a long time (still on X370), but shiny new tech is shiny new tech

  • @AhamsOwO
    @AhamsOwO Год назад +55

    I just got the 5800X3D with a way more affordable motherboard + RAM. Now I've got a solid system thanks to the cheap price of the AM4 platform :P

    • @timcesnovar978
      @timcesnovar978 Год назад +2

      It's actually a better CPU than the 7600 or the 7600X lol, and a much cheaper mobo and RAM, so yeah, great deal. I got the same CPU. It's a no-brainer right now if you need gaming performance

    • @sumedhtech1526
      @sumedhtech1526 Год назад

      @@timcesnovar978 how does the 5800X3D beat the 7600 or X variant? The new AM5 platform supports DDR5 RAM, which massively increases the performance?

    • @timcesnovar978
      @timcesnovar978 Год назад +2

      @@sumedhtech1526 3D Vcache. You do know how to use google right? And no the ram doesn't make a big difference at all.

    • @sumedhtech1526
      @sumedhtech1526 Год назад

      @@timcesnovar978 if i want to build a pc purely for gaming is there a more affordable cpu than 5800x3d which has a good price to performance ratio?

    • @timcesnovar978
      @timcesnovar978 Год назад +1

      @@sumedhtech1526 price to performance, no, but there are more affordable options like the 5600X and 5700X, although if you're building a new PC I'd go for the 5800X3D and some 3600MHz RAM

  • @Cruzz999
    @Cruzz999 Год назад +38

    Holy shit, a factorio benchmark!? This is amazing! X3D appears to be amazing for Factorio, which will almost certainly factor in to my next CPU purchasing decision.

    • @TheYCrafter
      @TheYCrafter Год назад +2

      Enjoyed seeing this as well, but you have to be really far into Factorio before these differences get noticeable :,)

    • @WohaoG
      @WohaoG Год назад

      As soon as I closed the replies he said "Perhaps the most interesting game we tested though is Factorio"

    • @padnomnidprenon9672
      @padnomnidprenon9672 Год назад

      Wait 1 month and you'll have the 7800X3D available

    • @chillhour6155
      @chillhour6155 Год назад

      Enjoy Factorio!!

  • @AustnTok
    @AustnTok Год назад +6

    The MSRP for the 7900 is $449. I got my 7900X on Amazon during Christmas time when it was $449. So I'm still extremely excited about it!! got a good deal if you ask me. Especially when you consider that the 7900x went back up to $549

    • @Mewzyc
      @Mewzyc Год назад +1

      I find it weird they increased the prices back up. I thought they would keep them down since the X3D versions are coming out, and people were saying the X versions weren't selling well either.

    • @AustnTok
      @AustnTok Год назад

      @@Mewzyc I was surprised too tbh. I think they will drop the prices once the X3D chips actually come out. They're probably gonna milk every cent out of the MSRP of these that they can.

  • @am53n8
    @am53n8 Год назад +35

    Those temps and power consumption 👀 Very exciting to see this for small form factor pcs

  • @terryshrader3909
    @terryshrader3909 Год назад +45

    Hey just wanted to say thank you for the channel. I used to be a system builder back at the turn of the century and into the sli Era of gaming. I'm slowly catching up with the new tech. So glad to see some other enthusiasts make these videos.

    • @fridaycaliforniaa236
      @fridaycaliforniaa236 Год назад +3

      Same situation here =)

    • @TehButterflyEffect
      @TehButterflyEffect Год назад

      Yeah me too. Did my first modern build (first one in 12 years) late last year. Amazing how much faster everything has gotten for about the same money.

  • @zodwraith5745
    @zodwraith5745 Год назад +2

    WTH is wrong with your 13600K? You're showing notably worse gaming performance than other outlets. It's still annihilating Zen 4 in productivity, but those gaming benchmarks look off from what we've come to expect.
    Also, at 8:45 you're making a big deal over the 7600 being a whole 7 minutes slower than the X variant, but don't mention the 7700 getting absolutely f*cking clobbered by its X variant by a staggering 28 minutes? Either the 7700 non-X is a steaming pile or your team is dropping the ball and you're not actually reading anything besides the script. Can we get some quality control here Linus?

    • @SameAtol
      @SameAtol Год назад +1

      Yeah 13600k charts look way off, even comparing them with their own review, they don't match up at all

  • @EragoEntertainment
    @EragoEntertainment Год назад +14

    I am excited to upgrade to an R7 7700 for a hopefully cheap price in 2 years.
    My 3700X with its low power consumption is great, and I am hoping to get more performance for the same power consumption.

  • @YeezyTeezy
    @YeezyTeezy Год назад +84

    Considering I bought a 7600X just last week I should probably feel some buyer's remorse, but since it was on sale for 230€/$ I am actually very happy that I got it a day before AMD announced the new chips, since it's now back to the MSRP

    • @duranium4445
      @duranium4445 Год назад +7

      I've bought a 7900X but for the same price as the 7900 currently. So I don't have buyer's remorse (yet).

    • @00vaag
      @00vaag Год назад +6

      I got the 7600X last week. It's 10 euros more expensive than the 7600. So a pretty negligible difference, just as negligible as the performance difference.

    • @TooEasyL2P
      @TooEasyL2P Год назад +5

      @@00vaag correct me if I'm wrong, but I think the 7600X doesn't come with a cooler and the 7600 does. So depending on what cooler you got, the difference should be a bit bigger.

  • @dani43321
    @dani43321 Год назад +5

    It would be interesting to know if there's any difference in performance between these new non-X chips and the old X chips running in 65W Eco Mode.

    • @duranium4445
      @duranium4445 Год назад

      That does also interest me. You can boost a 7900 up to a 7900X, but I'm sure you can also turn down your 7900X.
      Kinda looks like those are the same chips lol.
      The 7900 is actually currently more expensive than the 7900X in Germany.

  • @Mirra2003-f9s
    @Mirra2003-f9s Год назад +8

    I'd still go AMD just for the upgrade path. With AM5 you will be able to get new processors, unlike the 13th gen that uses the same socket as the 12th gen, meaning you're going to be stuck with a 13th gen processor forever. I already made this mistake back when Ryzen came out: got a 7th gen i7 instead of a 1700X and now I'm stuck with it, but if I had picked the AM4 platform I would have upgraded that 1700X easily

    • @Daisudori
      @Daisudori Год назад

      While correct, there is no evidence AM5 support will be as long as AM4's. They only guarantee it until 2025, which is only 2 years away. Buying a $300 CPU and upgrading in 2 years' time is monkey business.
      Also this is assuming AMD is still competitive in 2-3 years... who knows about the future. Buy what's best now, or what matches your wallet. Which might also be AM5 if board prices come down.

    • @Mirra2003-f9s
      @Mirra2003-f9s Год назад

      @@Daisudori Honestly i don't even know what to get. i'm looking to upgrade to either a 13th gen i7 or a 7700x. Board prices don't really matter to me cause for some reason in my country those AM5 boards cost just as much as a good LGA 1700 board so in my case it's just a matter of preference.

  • @oscartrujillo5223
    @oscartrujillo5223 Год назад +50

    Picked up the 7600x for $240 while it was still available at that price when AMD announced the non x and x3d cpus. I'm hoping AM5 will be just as good as AM4 and adopting now also means ready for new features and getting the most life out of the socket 🤘🏼

    • @hman6159
      @hman6159 Год назад +7

      Same, I got my 7600x for 245 and I’m happy

    • @LPConde
      @LPConde Год назад

      Same here, in the process of building, just waiting for my mobo to arrive.

    • @joshpaton5892
      @joshpaton5892 Год назад

      I got mine on black Friday, I get the same performance for only 10 dollars more, 2 months earlier.

    • @Vykno
      @Vykno Год назад +1

      Got my 7600x for 229 last week, saw the release of the non X's and laughed

    • @lunakoala5053
      @lunakoala5053 Год назад

      Do you actually upgrade during a socket's life though?
      I bought first-gen Ryzen and actually ended up swapping motherboards, not CPUs, as I went from ATX to ITX.

  • @dbackscott
    @dbackscott 5 месяцев назад +2

    Just yesterday I ordered a Ryzen 7600 (not X), new MB and RAM. My old Ryzen 1700 CPU and MB took a dump.

    • @danialabd7473
      @danialabd7473 15 дней назад

      How is it going so far? I'm also thinking about ordering the 7600 (non-X).

  • @Idontcarewhathandleihave
    @Idontcarewhathandleihave Год назад +25

    6:24 This is one of the moments when having some more clear documentation and dedicated explanation regarding the Labs SOPs and testing methodology, not necessarily directly in the video but either in a stand-alone video or a link to a forums thread, would be really helpful. It’s no doubt hard to get a high quality testing lab put together, especially when there is a well known focus on developing testing automation. However, when there’s been issues with previous tests that don’t get caught before the video is published and then get at most a pinned comment, it makes me cautious when I start hearing about “strange data” that was discovered which for all I know could be human error but is immediately passed over without much explanation regarding how this result was validated to be accurate.
    Reputation is so important, and LTT as a brand has often made transparency a prominent feature, so please continue to apply that to the labs as they develop so that detail oriented viewers, the primary target of labs level content, won’t view the labs as having a reputation for letting errors slip by or having a murky methodology more focused on obtaining results quickly over accurately.
    To be clear, I love the content and don’t believe that there is any issue per se in this video, but hearing that line and no follow-up immediately made me remember other errors and think “well, they’ve been wrong in the past so I’ll just check GN instead” which is not the feeling I want to get out of labs related content because I’m really excited about what the labs have the potential to do. I am looking forward to seeing more content making use of the Labs capabilities and how your testing evolves in the future!

  • @Just_a_commenter
    @Just_a_commenter Год назад +19

    The more pressure on competitors to provide lower prices, the better. People shouldn't need to sacrifice a kidney for a good system.

    • @blunderingfool
      @blunderingfool Год назад

      Oh joy, now we have tech support scams.
      Also, yea, price improvements mean nothing whilst bloated governments make all of your dosh worthless.

  •  8 месяцев назад +1

    My 7900 gets 30K Cinebench points in R23 and 1750 in 2024. Unlocked PBO and 6000MT/s RAM, 2200MHz Fabric. If I limit PBO to 44 watts and slow the RAM, it gets 18K in R23. But my cooler is the worst 240 AIO.

  • @DubstepsHub
    @DubstepsHub Год назад +47

    Alright, so while the big CPUs are the ones that normally draw a lot of attention, I have to say I'm very impressed with what the 7600 is putting up in performance.
    For gaming, in most titles it's putting up FPS within about 10-20% of the 7900X despite being $320 less. For productivity, it's within spitting distance of the 7600X for $70 less. If you're not building a high-end rig, it seems like the budget tier is fantastic this go around.

    • @johnsherby9130
      @johnsherby9130 Год назад +8

      I was watching that chip the entire time too, super high clocks, low prices, low power draw, low temps. Low in all the right places and high where it matters

    • @roetemeteor
      @roetemeteor Год назад +12

      @@johnsherby9130 I'm hardcore interested in the low temps. After all, above all else, temperature kills. Anything that stays cool generally will not die from abuse. If it barely hits above 50 when going full swing, then that means it'll be exceptionally resilient. Beat it, boost it, aim your hairdryer at it for the shiggles, but it'll still be arctic cool compared to what the rest of the big boys are pulling.
      I'm sold.

    • @MaxDad7
      @MaxDad7 Год назад +2

      I very much agree. Just because it's not at the top of these charts, doesn't mean it's bad. It's still going to be great at productivity and gaming when compared to older CPUs, or having no PC at all right now.

  • @Bleuducky
    @Bleuducky Год назад +7

    Love the Factorio benchmarks, the factory must grow and so must the UPS

  • @kingyachan
    @kingyachan Год назад +9

    I'm actually super interested in the non-X chips after watching this, purely because of that incredible thermal performance. I live in Australia, so getting almost the same performance with incredible thermals means I'm genuinely going to consider a new build.

  • @jocerv43
    @jocerv43 Год назад +10

    Does this not happen on every release? There's always a price cut on last gen, and the non-X SKUs have refined power usage. I've gotten the 1800X, 2600, and 3700X all on good price cuts; I always buy when the price cuts happen. The 5800X dropped to like $300 and the 5800X3D to $350. I have both, and the 5800X3D does wonders for my MMOs.

  • @tobiwonkanogy2975
    @tobiwonkanogy2975 Год назад +13

    I'm loving the multiple approaches to computing; it feels like the companies are specializing for different needs. It will be very difficult to tell who is moving in the correct direction. I commend AMD for squeezing as much as possible out of raw cores, and Intel for going through a period of trying new things with CPU, GPU, and storage.

  • @liftoffandtravel
    @liftoffandtravel 4 месяца назад +1

    I was pretty certain I'd give a nod to AMD until you started talking about the power usage. That won me over! 😁 Also worth noting: yes, DDR5 is more expensive, but it seems more future-proof.

  • @johnkneebuoy
    @johnkneebuoy Год назад +14

    Are there any specs for idle power draw? If you want a small home server, 90% of the time it's going to be on and idle, so that's where the real power savings come in. Also, we'd need to compare the total power of the system at idle to team blue's offerings to get a fuller picture.
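
    To put rough numbers on why idle draw matters for an always-on box, here's a quick yearly-cost sketch in Python. The wattages and the electricity price are assumed example values, not figures from the video.

        # Back-of-the-envelope yearly cost of an always-on home server at idle.
        # The idle wattages and the price per kWh are assumed example values.

        HOURS_PER_YEAR = 24 * 365
        PRICE_PER_KWH = 0.30  # assumed electricity price, $/kWh

        def yearly_cost(idle_watts: float) -> float:
            """Cost of holding the given idle draw for a full year."""
            kwh = idle_watts * HOURS_PER_YEAR / 1000
            return kwh * PRICE_PER_KWH

        for label, watts in [("assumed 40 W idle", 40), ("assumed 90 W idle", 90)]:
            print(f"{label}: ~${yearly_cost(watts):.0f} per year")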

    • @johnkneebuoy
      @johnkneebuoy Год назад

      @@rustler08 I'm curious to know what prompted the aggressive tone in your reply?
      Your initial question was good, asking why I would consider this for a server application. From that we could have had a productive dialogue and possibly shared ideas and learned from one another.
      But then you shut it down by shooting down my proposal without any context. You have no knowledge of my use case, and therefore cannot make the assumption that it is overkill.
      I happen to be a software, site reliability, cloud, network, machine learning, and big data engineer with over a decade's experience. I'm the head of systems development at a successful FinTech, and I have a side business getting off the ground.
      I have multiple clustered home servers for fun, side-project work, and learning, running Kubernetes and deployed with config as code. Most of the time they run close to idle, but when I crank the big data stuff, do large compiles, or render or transcode video (a hobby of mine, which I do in software if it's for long-term archival), the cluster can run pegged at 100% for hours or days at a time. I'm also off-grid, so energy matters to me.
      So the idea of CPUs which are significantly better performance per watt is of interest to me, and I earn well, so cost isn't a significant factor. But I am conscious of my carbon footprint, and so what my devices draw while idle is of interest to me.

  • @egerlachca
    @egerlachca Год назад +20

    The work the lab is doing is really starting to show, and it's awesome. It's a great investment.

  • @Seizuqi
    @Seizuqi Год назад +1

    AMD is really the more mainstream industry leader now.
    I wonder when it will switch again? 6, 7 years maybe?

  • @insanecreeper9000
    @insanecreeper9000 Год назад +6

    I love that Factorio is now a benchmark game. Such a nice introduction.

  • @mcflu
    @mcflu Год назад +46

    Listen, if you dove into AM5 at the prices they started at, you already knew you were paying way more than if you just waited a few months. So then you played yourself.
    Side note, the 7600 reminds me a lot of the old 2600, and why it was such a good deal compared to the 2600x, at least till the 3000 series dropped lol

    • @wixardo
      @wixardo Год назад +5

      Yeah, AMD made the CPU market interesting to follow again. The Intel-only era was really boring.

    • @doro626
      @doro626 Год назад +1

      I agree, but the price dropped in weeks, not months. Still stings.

    • @Tential1
      @Tential1 Год назад

      If you want a successful review video for gamers, you need to cry about something.

    • @ElderlyAnteater
      @ElderlyAnteater Год назад

      The 5600 and 5600X are the same story again: practically no difference in performance at all, but a much lower cost.

    • @CGMossa
      @CGMossa Год назад

      This review doesn't really show that those people played themselves... I mean, the X series *is* performing better than the non-X.
      Productivity is just clearly a win for the X parts.
      But no matter what, lower power consumption and cheaper performance-ready CPUs should make *everyone* happy.

  • @sunewjenkuzo2835
    @sunewjenkuzo2835 Год назад +3

    My friend decided to put the R7 7700 in my build and I can't thank them enough; it runs much better than my 4-5 year old PC, and this build is about $300 cheaper than my old one lol. Going from an i7 8600k to this makes me wonder why I was ever on team blue.

    • @Physics072
      @Physics072 Год назад

      Teams? That makes you a fanboy. Go with what works, not the team.

  • @SirHackaL0t.
    @SirHackaL0t. Год назад +7

    Did you test the X variants in Eco Mode compared to the non-X variants? I wonder what the temps would do if you limited the X versions to 65W.

    • @PatrickJanz
      @PatrickJanz Год назад

      My 7700X in Eco Mode runs about the same, degrees-wise, as the 7700 in this video, but faster thanks to higher clocks.

  • @mushieslushie
    @mushieslushie Год назад +9

    I went with a 5600 recently and am still happy. The base clocks on these make way more damn sense, but the overall cost of the build would be pretty high. These also come with heatsinks, which may just be because AMD didn't design a better one for the high TDP on the X parts.

    • @Porouskilldeathratio
      @Porouskilldeathratio Год назад

      AMD has 3 tiers of heatsink/cooler for their Ryzen range, and all 3 are actually pretty decent. I'm still running the stock cooler on my Ryzen 3700X and it doesn't thermal throttle unless it's being hammered by CPU-Z.

    • @AndyMitchellUK26
      @AndyMitchellUK26 Год назад

      To be fair, if you're buying a top-end chip, odds are you'd be buying your own decent cooling anyway. It makes much more sense for budget-oriented chips like the non-X parts to include a heatsink, and I actually like that AMD made the decision to do it this way.

    • @fynkozari9271
      @fynkozari9271 Год назад

      Yeah, a 65W CPU doesn't need special cooling. It maxes out at only 88 watts, so temperatures stay low. It's the big, highest-end CPUs with high temps that need big cooling.

  • @NonyaDamnbusiness
    @NonyaDamnbusiness Год назад +1

    Let me tell you where the non-X CPUs are going to wind up: homelab servers. They're *perfect* for that, what with the high core/thread counts, low TDP, and 64GB RAM support.
    A lot of us are already using AMD Ryzen 5600G CPUs in dedicated TrueNAS Scale servers just for that reason. It allows you to run 40-50 different moderate-load Docker containers without breaking a sweat. Hell, I'm running 30 containers in a VM on an ancient Core i7 Lenovo tiny PC running as an XCP-NG host and that VM is backed up to my NAS every night.

  • @jarlerak137
    @jarlerak137 Год назад +5

    I appreciate the Factorio benchmark :)

  • @CliffordArk
    @CliffordArk Год назад +4

    How are people liking the video when it's only been out for a minute?

  • @jdolzpcgamingnocommentary9k7y4
    @jdolzpcgamingnocommentary9k7y4 Год назад +1

    That's why I'm sticking with my Core i5-12600K, the best bang-for-buck CPU I've ever bought, period. 10 cores and 16 threads for basically nothing in terms of $$$; it's like stealing candy from a baby, I love it. Of course I also have a 13900K and 10th gen processors (I skipped 11th gen, no point in those), and I own older Intel and AMD CPUs too. I mean, I've got a whole kit full of CPUs, GPUs, RAM, motherboards, etc. that even Linus would love to see!

  • @infinitea4258
    @infinitea4258 Год назад +10

    I recently upgraded my CPU from a 1700X to a 5700X. I absolutely love the difference in performance, with the added benefit of it running at 65W as well!

    • @Seppe1106
      @Seppe1106 Год назад +3

      2600 to 5600x here. Performance difference was nuts. Never knew how much the 2600 - even overclocked - actually bottlenecked my 2060. lol

    • @infinitea4258
      @infinitea4258 Год назад

      @@Seppe1106 yeah same story here, my 1700x was bottlenecking my 2070 super so much!

    • @ZackSNetwork
      @ZackSNetwork Год назад +1

      @@Seppe1106 That’s crazy because your 2060 is equivalent to a GTX 1070ti in rasterization performance.

    • @fynkozari9271
      @fynkozari9271 Год назад +3

      AMD users can brag about 65W to Intel users whose Core i7 pulls 241 watts and i9 350 watts, while getting the same FPS for a fraction of the energy.
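
      Taking the wattages quoted above at face value and assuming the frame rate really is identical, a quick Python sketch of the energy-per-frame ratio looks like this (the 120 FPS figure is just a placeholder; only the ratios matter):

          # Energy per frame using the package powers quoted in the comment.
          # The frame rate is an assumed placeholder shared by all three chips.

          FPS = 120  # assumed identical frame rate

          chips = {"65 W Ryzen non-X": 65, "Core i7 (quoted)": 241, "Core i9 (quoted)": 350}

          baseline = chips["65 W Ryzen non-X"] / FPS  # joules per frame at 65 W
          for name, watts in chips.items():
              j_per_frame = watts / FPS  # watts / (frames per second) = joules per frame
              print(f"{name}: {j_per_frame:.2f} J/frame "
                    f"({j_per_frame / baseline:.1f}x the 65 W part)")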

    • @sovo1212
      @sovo1212 Год назад +1

      I got the 5800X like a month before the 5700X was released. CPUs with over 100W of TDP have always been a nightmare for me. Fortunately I was able to do some nice undervolting with it, so it never reaches 90W. Still, my biggest frustration isn't that I didn't buy the 5700X, it's that I didn't buy the 5800X3D instead.

  • @FSAPOJake
    @FSAPOJake Год назад +6

    I'm glad to see these well-priced options showing up, but do keep in mind that the X CPUs are selling way below their MSRP right now. For the past month, the 7950X has been $550-$570 on both Amazon and Newegg.

  • @AetherProwl
    @AetherProwl Год назад +5

    Actually, the RAM speed thing changed on Zen 4: there's no longer a penalty for desyncing the Infinity Fabric and RAM. That said, Zen 4's memory controller generally can't handle much faster than 6000 MT/s, and the 6800 kit used on the Intel bench is way more expensive.

  • @matthewwilliamson7881
    @matthewwilliamson7881 Год назад +7

    Got my 7700X and it came with 32GB of free DDR5 RAM. I'm good with my purchase lol.

    • @jmboyd78
      @jmboyd78 Год назад

      Micro center?

    • @mangatom192
      @mangatom192 Год назад +3

      I wish my country had a Micro Center... 😭

    • @beanishone3998
      @beanishone3998 Год назад +1

      Got the same deal, and it was in fact DDR5-6000 CL36 RAM, which is great for AMD in general. Micro Center is wonderful.

  • @Felic_HRZN
    @Felic_HRZN Год назад +7

    Having a 5950X, I wasn't looking to upgrade anytime soon. However, as someone really into Star Citizen right now, I've heard that the X3D cache is really helpful for it, so the 7950X3D might be my next upgrade.

    • @JakeNKlappy
      @JakeNKlappy Год назад +1

      Same with Tarkov. I just upgraded from a 3700X to a 5900X, and everyone who has the 5800X3D gets better frames than me. So I can either kinda downgrade but get better game performance, or wait for the next-gen 3D chips.

    • @sovo1212
      @sovo1212 Год назад

      Fortunately I stopped worrying about my PC not being able to run Scam Citizen decently a long, long time ago. My latest build doesn't even reach 60 fps at 1080p in it. And now that it's a 10-year-old "game", it doesn't even look that good anymore.

    • @Felic_HRZN
      @Felic_HRZN Год назад

      @@sovo1212 Well, that would be because it's still in alpha and is pretty unoptimized currently. I get 1440p 60fps+, so I'm not too worried about it, really :) but to each their own.

  • @andrewleno591
    @andrewleno591 Год назад +1

    I never could understand those remarks about the Intel platform supporting both DDR4 and DDR5. It sounds like that, but it would be more correct to say it supports DDR4 or DDR5: you can find a motherboard with DDR4 only or DDR5 only, but not both, because DDR4 and DDR5 are physically different. So there's actually no real benefit; I'd say it's the opposite. You can buy a DDR4 motherboard and later upgrade the CPU, but only by one more generation, and you can run into situations where DDR4 becomes a bottleneck for some applications. So it's better to have DDR5 from the beginning...

  • @pineppolis
    @pineppolis Год назад +7

    I liked how the non-X CPUs were highlighted in the charts, good job.

  • @xiaowong6651
    @xiaowong6651 Год назад +5

    I didn't get a 7000 because AM5 boards are still 300-900€

  • @mutherbrotherHU
    @mutherbrotherHU Год назад +1

    The non-X Ryzens aren't just cheaper on the price tag, they're cheaper on power consumption too!
    It's like they wanna do the M1 chip, but... *non-Apple series*

  • @RockinHorea
    @RockinHorea Год назад +5

    Thank you for including those watts-per-FPS graphs after all! That's where we can clearly see gaming efficiency. It's very interesting.

    • @rubiconnn
      @rubiconnn Год назад

      Watts per FPS is kind of pointless though. The only thing that should matter is FPS. Would you really want to lose 20% of your FPS to save a few watts? I mean it makes sense for laptops or anything else that runs on batteries but it's useless for desktops.

    • @406Steven
      @406Steven Год назад +1

      @@rubiconnn It shows efficiency and helps people who aren't building super gaming computers make an informed decision. It's not the be-all and end-all of CPU choice, but it's very helpful. My computer idles between 150 and 180 watts, so I turn it off when I'm not using it because that's a lot of power to just throw away. If I were to do a PC build right now, that 7600 would be on my short list.
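
      To put rough numbers on that trade-off, here's a small watts-per-FPS sketch in Python; the frame rates and wattages are assumed placeholders, not figures from the video.

          # Quantifying the "lose some FPS to save watts" trade-off with assumed numbers.

          def watts_per_fps(watts: float, fps: float) -> float:
              """Package power spent per frame per second of output."""
              return watts / fps

          # Hypothetical parts: a fast, power-hungry chip vs. a slightly slower, efficient one.
          fast = {"fps": 150, "watts": 140}
          efficient = {"fps": 120, "watts": 70}   # ~20% fewer FPS at half the power

          for name, chip in [("fast part", fast), ("efficient part", efficient)]:
              print(f"{name}: {chip['fps']} FPS at {chip['watts']} W -> "
                    f"{watts_per_fps(chip['watts'], chip['fps']):.2f} W/FPS")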

  • @KeyT3ch
    @KeyT3ch Год назад +14

    This is the way it should be. I was afraid I wouldn't be able to get a 65W CPU in the future, especially seeing just how cool and efficiently my 5600X performs. AMD needs to keep this up.

  • @fridaycaliforniaa236
    @fridaycaliforniaa236 Год назад +2

    I'm from the future: yesterday we saw a conference about the Core i9 24900KS. Intel says it needs a small-form-factor nuclear plant to reach its 9 GHz boost clock.

  • @doodskie999
    @doodskie999 Год назад +6

    This is why I chose AM5
    The upgrade path is just awesome.
    Sure, Intel may have the crown, but only by a small margin. Once the X3D versions launch, AMD will take the charts back.

    • @Freestyle80
      @Freestyle80 Год назад +2

      Right, so you bought something today because you want to upgrade 5 years later?
      99% of people don't do this.

    • @WilliamNFMS
      @WilliamNFMS Год назад

      @@Freestyle80 That doesn't go against the OP's original statement.

    • @doodskie999
      @doodskie999 Год назад

      @@Freestyle80 I started with the R5 2600 in 2018, then jumped to a 3600, then a 5600X, all while rocking a Gigabyte B450 board. I may not be part of the 99%, but I sure will keep upgrading in the future.

  • @AverageJoeBot
    @AverageJoeBot Год назад +21

    Loving the new layouts for your data! They made it very clear which data was which and easier to take in at a glance without pausing. Fantastic work by your graphics team!

  • @skywalker1991
    @skywalker1991 Год назад +2

    The R9 7900 is the sweet spot: only 2% to 4% slower than the X CPUs and even Intel's CPUs, while using 50% less power. Amazing. Now AM5 mobos just need to come down in price.

  • @gurpremsingh
    @gurpremsingh Год назад +5

    With those temps and power consumption, it seems to be a great improvement!

  • @Barrett_Fodder
    @Barrett_Fodder Год назад +5

    Great price-to-performance here. Looking forward to what the X3D really offers to round out the lineup.

  • @MrGaZZaDaG
    @MrGaZZaDaG Год назад +1

    Haha, "AMD runs hotter, AMD uses more power"... okay Intel fanboys, this finally proves Intel is way behind on power-to-performance. And Intel's fire-inside processors have been hot chips for a long, long time; they always hit upwards of 90-100°C on their top products.

  • @elfishmoss1457
    @elfishmoss1457 Год назад +4

    What happens if you didn't buy AMD and instead went with an i7-12700 last year? :)

    • @batorian03
      @batorian03 Год назад +3

      Then you lost anyway because you chose Intel to begin with.

    • @elfishmoss1457
      @elfishmoss1457 Год назад +4

      @@batorian03 Fair enough, but honestly that's not really a problem for me, since I'm upgrading from an 8-year-old PC with no discrete GPU, so it's not all lost.

    • @joezmoma19285
      @joezmoma19285 Год назад

      @@batorian03 Amd copium

  • @incadavis5210
    @incadavis5210 Год назад +4

    Hey Linus, when watching the data graphs, I feel it would be helpful in this style of comparison to color-code the CPU names. For example, the 7900 and 7900X could be blue, then the 7700 and 7700X could be green, and so on and so forth. Just a suggestion :)

  • @WrexBF
    @WrexBF Год назад +1

    8:09 Another day, another LTT benchmark mistake. The 13600K gets around 25K points in Cinebench R23, not 19K.
    4:17 That's false information as well: the Infinity Fabric on Ryzen 7000 maxes out at around 2000 MHz, not 3000 MHz.

  • @Axeiaa
    @Axeiaa Год назад +6

    It's the motherboard prices that kill the potential of AM5, not the CPUs, imo.
    Even recently I recommended someone go AM4, where the motherboards are still surprisingly expensive as well, both used and new. I guess that's the downside of motherboards supporting many generations of CPUs (Ryzen 1000, 2000, 3000, and 5000 series): prices never go down all that much because everything stays relevant.
    It spells doom for bargains in the far future too, since in the end it's almost always the motherboard that ends up being the pricier part. Over time motherboards kick the bucket while CPUs are practically immortal; even the good old 486 CPUs mostly still work like new, but finding functional motherboards is a lot harder.

  • @shawkindustries9521
    @shawkindustries9521 Год назад +5

    Gotta love seeing the 5800X3D sneaking into the lineup 😎🔥

    • @battalionstallion3894
      @battalionstallion3894 Год назад +2

      Fr, it brings a smile to my face seeing my last-gen CPU still topping the board in some games.

  • @claucmgpcstuf5103
    @claucmgpcstuf5103 Год назад +1

    Well, the DDR5-6800 is something else compared to the DDR5-6000 on the AMD bench... but even so, it's very cool that the Ryzen 7000 non-X parts run at 65 watts. Totally awesome, cool stuff from AMD again! Yeah, awesome! A cool system for the new year, for sure!

  • @tylergorman3231
    @tylergorman3231 Год назад +3

    I wasn't planning on upgrading until I watched this video. Thank you for always being excited about value in the PC space. Helps a lot of us make informed decisions! -Wage Slave