I’m tired of winning (and it's awesome) - AMD COMPUTEX 2022

  • Published: 8 Sep 2024

Comments • 4.2K

  • @phyrexiancoffee6324
    @phyrexiancoffee6324 2 года назад +4202

    It's still baffling to me that Ryzen is now 5 years old. I run a 1700x in my rig, and it still feels brand new to me. It's amazing how fast things are moving again now that AMD is fighting back.

    • @ZackSNetwork
      @ZackSNetwork 2 года назад +87

      Seriously, Zen 1 feels new? Stop the cap.

    • @8SnazzY8
      @8SnazzY8 2 года назад +167

      I had a 5-year-old 1700x as well, but just 3 months ago I upgraded to a 5900x and it feels like a new damn computer, even though all I changed was the CPU and mobo.

    • @antont4974
      @antont4974 2 года назад +44

      I have one too; however, I feel like it's not very stable. Picking the right RAM sticks and motherboard was hard - BSODs galore. That was the biggest disappointment of the 1700: stability and ease of use.

    • @phyrexiancoffee6324
      @phyrexiancoffee6324 2 года назад +49

      @@8SnazzY8 I know that I will lose that feeling when I upgrade, but it's just the fact that while everything has advanced tremendously in the past 5 years, I am still completely fine with what I have. I couldn't say that before my last upgrade.

    • @phyrexiancoffee6324
      @phyrexiancoffee6324 2 года назад +68

      @@ZackSNetwork I am not insinuating that Zen 1 is still punching at the top of the list, simply that I can be more than satisfied with my 5-year-old CPU this late in its life. I couldn't say that with my last system pre-Ryzen.

  • @paulbrooks4395
    @paulbrooks4395 2 года назад +1585

    The best part of generation improvements is “low end” systems are vastly more capable, cheaper, and power efficient.

    • @Mrwigglyau
      @Mrwigglyau 2 года назад +72

      The new-gen Intel i3 and i5 lines' performance-to-price ratio is insane.

    • @fabrb26
      @fabrb26 2 года назад +42

      When today's i3 does better than the top-notch i7 from 10 years back!
      But you're still using a 10-year-old i5 that only matches today's low-end toaster in performance and August 1945 in terms of heat, meeeeeh...

    • @zeph6182
      @zeph6182 2 года назад +11

      @@Mrwigglyau Even the Pentium line has been impressive since Kaby Lake came out and added HT to them.

    • @jfaristide
      @jfaristide 2 года назад +18

      @@fabrb26 i3 12100F keeps up with the i9 9900k

    • @duck6966
      @duck6966 2 года назад +9

      @@jfaristide in terms of gaming

  • @carbine781
    @carbine781 2 года назад +837

    As someone who's about to upgrade their computer for the first time in 8 years, all this new tech is both exciting and overwhelming at the same time

    • @cesco1990
      @cesco1990 2 года назад +18

      Hahah, mine was a while back as well, but I'm pretty sure it's gonna be a 6900xt paired with a 5900x. The old system is all AMD; I have no reason not to do that again, especially with Nvidia's mobster prices.

    • @ERZAY2
      @ERZAY2 2 года назад +19

      Based on the rumors, invest in a good power supply.

    • @andor001
      @andor001 2 года назад +4

      Have fun with your upgrading.

    • @liamness
      @liamness 2 года назад +5

      Yep, still on a Z97 board, CPU significantly overclocked so it doesn't bottleneck the 3060 Ti unbearably. Definitely seems like I should stop putting off upgrading! Think I'll do a rebuild early next year at the latest.

    • @anon_y_mousse
      @anon_y_mousse 2 года назад

      @@ERZAY2 Yes! A beefy power supply, especially for overclocking and discrete graphics, for water cooling pumps if you use them, and if you have more than a couple of spinning disk drives. The only people who don't need a good power supply anymore are those who air cool with 2 fans, have no disc or disk drives, and use onboard graphics.

  • @TheRipeTomatoFarms
    @TheRipeTomatoFarms 2 года назад +189

    Back to the good old days of yearly upgrades/new builds? Oddly enough, I miss those days! My wallet doesn't.....but my PC enthusiast brain does!

    • @fahimfaisal7571
      @fahimfaisal7571 2 года назад +1

      My wallet is already digging its grave cause I'm definitely upgrading from 3000 to this generation.

    • @lagufit
      @lagufit 2 года назад

      What's the next best after the 3000G? I'm on a budget, please.

    • @MicJagger-micycle
      @MicJagger-micycle 2 года назад +1

      @@lagufit A rather "low" CPU, but a good step up from a 3000G, would be a 5600G (6c/12t).
      It has lower single-core performance than the rest of the 5000 series, but it has decent all-around performance as an APU and is currently $165 on Amazon.
      Just pair that with a B550 motherboard and 2x8GB of DDR4 and you can make a really low-budget gaming setup, and it would be way ahead of the 3000G.

    • @retropcs88
      @retropcs88 2 года назад +1

      Haha, remember Pentiums stuffed into 10-year-old XT cases, since the only things that stayed were the case, PSU and floppy drives?

  • @Artista_Frustrado
    @Artista_Frustrado 2 года назад +3255

    Linus: "even 5-year-old CPUs are showing their age"
    [me patting my 10-plus-year-old PC]: it's ok buddy, you're still a good YouTube machine...
    EDIT: shoutouts to all the peeps sharing their Old Rigs in the replies, you guys are great, also shout outs to the guys who clearly missed the joke & are acting all uppity about it LMAO

    • @nachomolaolivera7580
      @nachomolaolivera7580 2 года назад +68

      Mine is already 12.

    • @DigitalLiquid
      @DigitalLiquid 2 года назад +53

      Same here! I was planning on upgrading this upcoming month but now I’m holding off until the new chips release

    • @megajor232
      @megajor232 2 года назад +80

      Just beat elden ring on my 2012 rig lol

    • @DragonboltBlastter
      @DragonboltBlastter 2 года назад +44

      Bruh, I still have my desktop with AMD Athlon II x4 635!

    • @he.lena21
      @he.lena21 2 года назад +5

      Hahahaha, being Brazilian is rough

  • @spectralsarah
    @spectralsarah 2 года назад +592

    That 1MB L2 cache sparked a memory for me: the first system I built for myself had an AMD K6-2 300 paired with a Soyo SY-5EH motherboard, and it also had a 1MB L2. In 1998! But instead of integrated in the CPU, it was soldered to the motherboard.

    • @CC-bi8hw
      @CC-bi8hw 2 года назад +15

      Awww yeah! That was my first self-built desktop too: a K6-2 300 with a Voodoo 3!

    • @rudysal1429
      @rudysal1429 2 года назад +10

      Yeah, the memory controller was on the northbridge before being moved to the CPU; I can't remember if it was AMD or Intel that moved it first. AMD was the first to 1GHz, 64-bit and dual core, and I think they were first with the integrated memory controller too, but I can't remember.

    • @dabombinablemi6188
      @dabombinablemi6188 2 года назад +7

      Would have been 1MB L3 if you'd later moved to K6-III

    • @interlace84
      @interlace84 2 года назад

      Same here back in the day; someone needs to benchmark & submit one to CPUID for comparison.

    • @Rysysys
      @Rysysys 2 года назад

      Yeah, even that L2 cache was something relatively new at the time (I think the first Pentiums introduced it?), and just a few years later, boom, we got multi-purpose L3 technology. I also remember when dual cores came to market and we complained that the integrated memory controller would make CPUs more failure-prone. Fortunately it didn't. Wild days.

  • @cn8299
    @cn8299 2 года назад +610

    I'm kind of bummed I upgraded earlier this year but it doesn't really matter, I have no need for such power. I'll be happy to know that when I do upgrade again in 4-5 years, I should see MASSIVE improvements.

    • @thedominator87
      @thedominator87 2 года назад +8

      Yeah, I built my PC in 2019, and I'm wondering whether to upgrade or not. I really don't want to, with how stressful it was the first time lol (it took 3 days just to install the M.2 lmao, and I had the fire-hazard H1 case before switching to Lian Li).

    • @SenonGaming
      @SenonGaming 2 года назад +36

      Bro, for real, I just got the Ryzen 5700G and RX 6700 XT this year, like how the heck are we supposed to keep up!! I am excited for them though

    • @MrFakeGriffin
      @MrFakeGriffin 2 года назад +10

      Running a Ryzen 7 3700X still. And it still is chugging along perfectly fine. It destroys all of my daily tasks.

    • @Bdot888
      @Bdot888 2 года назад +5

      @@MrFakeGriffin Hell yeah, Ryzen 3rd gen is still great. I got a 3600x in 2020 and I still wanna keep pushing it along until any kinks and prices get worked out with the new AM5 platform. It's not always the best idea to hop on the first variation of a new platform unless you really need a new build. I may just pop in a 5800x when it drops below $300, which it's very close to.

    • @markt8912
      @markt8912 2 года назад +13

      Ddr5 is still really expensive anyways. It will be a while before amd's new generation gets you a similar bang for buck.

  • @Dreamx13
    @Dreamx13 2 года назад +209

    The only downside is that, between the surge in scalping and inflation in general, it's never been more expensive to keep your computer high end.

    • @flaccgh8799
      @flaccgh8799 2 года назад +8

      I think a huge improvement to this issue will come with companies (and their webstores) implementing better anti-bot programs to make purchasing mass quantities of products difficult/impossible. Can’t agree more though - such a bummer how expensive resells are for even the outdated stuff.

    • @ghosthunter0950
      @ghosthunter0950 2 года назад +8

      @@flaccgh8799 do they actually have a reason to do that? I mean they're selling out.

    • @flaccgh8799
      @flaccgh8799 2 года назад +3

      @@ghosthunter0950 definitely why they don’t 😂😭 why improve the system when the bots buy the stock the moment it drops? They prob love the bots.

    • @enzowahlberg4072
      @enzowahlberg4072 2 года назад +2

      The reason prices surged is because of stay-at-home COVID and mining being so profitable, not because of inflation.

    • @marekholub8668
      @marekholub8668 2 года назад +3

      @@flaccgh8799 No, not really. They would sell the whole stock anyway. And scalpers don't buy extended warranty or use financing which brings in even more money. They also could keep higher prices if they actually had stock for consumers. Plus there probably could be a great PR boost, too.

  • @aoyuki1409
    @aoyuki1409 2 года назад +437

    *I like how no one mentioned that the Ryzen 7000 chip demoed running 16 cores at a stable 5.4-5.5GHz with less than 150 watts of power delivery... IS AN ENGINEERING SAMPLE*

    • @TheBURBAN111
      @TheBURBAN111 2 года назад +15

      Yeah under 150w while gaming isn't special... even the 12900ks power hog does that.

    • @edtheking1006
      @edtheking1006 2 года назад +138

      @@TheBURBAN111 he means it’s an engineering sample not the finished product, the finished product will most likely be more efficient than the engineering sample

    • @ILoveTinfoilHats
      @ILoveTinfoilHats 2 года назад +14

      Likely one of the final ones, seeing as, for the leaked release date to be met, they'd have to have already started manufacturing.

    • @dampo2754
      @dampo2754 2 года назад +9

      5.4-5.5 on all cores? Geez, I thought it was 1 core and the rest were like 5GHz or so 😂

    • @aoyuki1409
      @aoyuki1409 2 года назад +56

      @@TheBURBAN111 You missed the point. The AM5 socket cannot provide power beyond 150 watts. If it was actually running all cores at 5.5GHz while drawing under 150 watts, compared to the 12900KS's 241 watts at 5.5GHz (I think it's less when all-core), that means you don't have to spend extra on a motherboard with top-tier VRMs, the beefiest CPU cooling, and extra PSU wattage. We are reaching the point where the CPU doesn't really impact FPS that greatly; even a Ryzen 3 or an i3 can handle reasonable frames at 1080p. At this point, top-end CPUs are just fighting over whether it's 250 fps or 240 fps, which at that high an fps is almost negligible. AMD doesn't really need to make the world's fastest gaming CPU; the world wants the most reasonable gaming CPU - something that runs fast *enough*, doesn't hog 241 watts at the socket, and doesn't need nuclear-reactor-level cooling. (The raw wattage numbers are compared just below.)
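
For context on the numbers traded in this thread, here is a minimal arithmetic sketch. The 150 W figure is the commenter's claim about the engineering-sample demo and 241 W is the 12900KS's rated maximum turbo power; both are treated as given inputs here, not verified measurements:

```python
# Power figures quoted in the thread above, taken at face value.
zen4_demo_w = 150    # claimed all-core power of the Ryzen 7000 engineering sample
i9_12900ks_w = 241   # Intel 12900KS maximum turbo power

savings_w = i9_12900ks_w - zen4_demo_w
savings_pct = savings_w / i9_12900ks_w * 100
print(f"~{savings_w} W less heat to dissipate, a ~{savings_pct:.0f}% reduction")
```

If the claim holds, that is roughly 91 W (about 38%) less heat for the VRMs, cooler and PSU to handle, which is the commenter's point about platform cost.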

  • @ItsZim0
    @ItsZim0 2 года назад +222

    I hope they keep one-upping each other generation after generation. This is really good for the consumers.

    • @blazedyoda8608
      @blazedyoda8608 2 года назад +16

      I agree it’s also the same in the graphics card department with Radeon vs Nvidia

    • @stepbro1305
      @stepbro1305 2 года назад +5

      @@blazedyoda8608 I wouldn't say all that...but they really are catching up to Nvidia though and are damn close and still stay more efficient, I'm in love with my 6900xt it was about damn time they put out something with 80 series performance or better

    • @blazedyoda8608
      @blazedyoda8608 2 года назад +8

      @@stepbro1305 See, in my own personal opinion I am not a fan of the top-end cards from AMD. I had driver issues with my Gigabyte 6800 XT, and in the end I just sold it and bought a 3080 Ti… drivers are such a big issue for AMD cards

    • @stepbro1305
      @stepbro1305 2 года назад +4

      @@blazedyoda8608 you mean from AMD but I still understand lol, and eh I've had very little to minimum problems with AMD personally but maybe you're unlucky in the GPU silicon lottery or maybe I'm lucky lol who knows. The only issues I've ever had was COD cold war being a massive stutter mess when it launched (But it wasn't me it was also a bunch of other people and a handful of Nvidia users so I chalk that up to mostly being Activision's fault) and AMD being the last company gaming companies optimize for (or care about)

    • @Bwalston910
      @Bwalston910 2 года назад +4

      @@blazedyoda8608 Probably because it's Gigabyte; their software and driver support sucks and their features barely work half the time, at least in my experience.

  • @shapelessed
    @shapelessed 2 года назад +2780

    Imagine a CPU that goes up to 5.5GHz without sucking >250W and needing a freezer to cool it like the competition...

    • @girlsdrinkfeck
      @girlsdrinkfeck 2 года назад +53

      Thermodynamics says no, it's impossible: the smaller the transistor, the more it leaks and causes heat. Smaller nm = less efficient performance.

    • @gnomatic
      @gnomatic 2 года назад +6

      How the times have changed.

    • @Girvo747
      @Girvo747 2 года назад +533

      @@girlsdrinkfeck Thats not at all how it works lol. Process improvements come with lower power requirements, and higher efficiency (depending on what the customer is after).

    • @PNWAffliction
      @PNWAffliction 2 года назад +5

      yea dont have to raid all the spare liquid nitrogen to cool the damn thing

    • @girlsdrinkfeck
      @girlsdrinkfeck 2 года назад +16

      @@Girvo747 If that's true, then why do we need bigger and bigger coolers every new generation? PCs in the 80s didn't even need fans, ffs.

  • @doakmickes693
    @doakmickes693 2 года назад +45

    I'm still on an i7 930 build from twelve years ago and I must say this year is looking even more promising than last year for a new build.

    • @monke2361
      @monke2361 2 года назад +4

      Yeah because gpu prices are finally coming down

    • @stefan_jank
      @stefan_jank 2 года назад

      Same here with a i7 3770

    • @blessthismessss
      @blessthismessss 2 года назад

      God, I used to have that CPU. it did me very well back in the day, good luck

    • @questmarq7901
      @questmarq7901 2 года назад

      Well the most basic current cpu will be a monstrous update for you

  • @SpecialEDy
    @SpecialEDy 2 года назад +1190

    Still running my de-lidded Intel 7700K at 5.2GHz, with something crazy like 1.45vcore.
    My cat is going to miss lying in front of the case's exhaust fans when I finally upgrade CPUs; the spot in front of my computer is the warmest place in my apartment.

    • @MikeG4936
      @MikeG4936 2 года назад +89

      Wow.... highest I could possibly push my de-lidded 7700K is 4.9GHz. You must have a lucky piece of silicon.

    • @notavailable453
      @notavailable453 2 года назад +75

      Thats a stable overclock for you? That kind of boost on that chip is basically equivalent to winning the powerball

    • @Thatonefuckinguy
      @Thatonefuckinguy 2 года назад

      @@notavailable453 ruclips.net/video/mk7VWcuVOf0/видео.html

    • @janisir4529
      @janisir4529 2 года назад +36

      I'm just happy I didn't believe those people who said 6600k is enough

    • @Benefits
      @Benefits 2 года назад +8

      My R7 5800x hits 1.5vcore with stock settings. Is that a good or a bad thing?

  • @Taterisstig
    @Taterisstig 2 года назад +73

    Including an iGPU in every chip is a very, very positive move!
    From a troubleshooting perspective it's amazing, and from a budget gaming PC perspective it's nice to know you don't have to sacrifice CPU performance to leave room for future upgrades.

    • @dieterdebruyne4826
      @dieterdebruyne4826 2 года назад +1

      if it's any good though. I bought an amd cpu for regular desktop use with integrated graphics and had to eventually buy a gpu because the igpu would randomly crash the pc. It was a nightmare

    • @Taterisstig
      @Taterisstig 2 года назад

      @@dieterdebruyne4826 weird! My guess is there was something up with drivers

    • @Gabu_
      @Gabu_ 2 года назад +3

      @@dieterdebruyne4826 When was this, what OS were you using, what was your use case, etc.? I've never heard of such a thing happening with AMD APUs.

  • @seanthiar
    @seanthiar 2 года назад +342

    I think it's time to include energy efficiency in the rating of a CPU and GPU, with energy prices rising. AMD shows again that speed and low power are not mutually exclusive. In my opinion, Intel just wastes energy and converts it to heat instead of computing power, and then we have to spend even more power to remove that waste heat. (A rough points-per-watt sketch follows at the end of this thread.)

    • @supermasterfighter
      @supermasterfighter 2 года назад +43

      Intel and Nvidia both. AMD may not be top of the line in terms of performance, but they are the kings when it comes to price to performance and efficiency. Intel + Nvidia machines run hot while an AMD machine will run far cooler without much performance impact.

    • @AxR558
      @AxR558 2 года назад +21

      Absolutely, but it really needs a third party to test both sides under equal conditions, rather than leaving it up to manufacturers to come up with some figure that doesn't compare between AMD and Intel (like TDP). I'll be seriously looking at the thermals of my next CPU upgrade, as my current CPU (65W TDP) pumps out far too much heat even at idle/browsing compared to my previous one (91W TDP).
      GPUs appear to be the worst for that, though, as the generational jumps seem to come at the cost of simply throwing more power (and more massive coolers) at them, which isn't sustainable.

    • @Azarilh
      @Azarilh 2 года назад +1

      @@supermasterfighter Not sure about that. There are some recent AMD GPUs that run hotter than most Nvidia GPUs.

    • @AD-zs5ce
      @AD-zs5ce 2 года назад +4

      absolutely agree. I think for laptops especially this is getting very misleading with power throttling on certain parts and acting like it's the same or better than other devices.

    • @gzaos
      @gzaos 2 года назад +3

      @Suuntakulma omg Bulldozer why you have to remind us
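
For concreteness, here is a minimal sketch of the points-per-watt rating the comment above asks for. The CPU names, scores and wattages are made-up placeholders; a real rating would need third-party measurements under identical conditions, as the replies point out:

```python
# Hypothetical benchmark scores and measured package power, for illustration only.
cpus = {
    "CPU A": {"score": 24_000, "watts": 145},
    "CPU B": {"score": 27_000, "watts": 240},
}

for name, d in cpus.items():
    efficiency = d["score"] / d["watts"]  # points per watt
    print(f"{name}: {efficiency:.0f} points/W")
```

By a metric like this, a slightly slower but much lower-power chip can come out well ahead, which is exactly the trade-off the thread is debating.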

  • @hg-ir8tb
    @hg-ir8tb 2 года назад +86

    I don't see a lot of comments about this, but I find AMD putting iGPUs in their desktop lineup really exciting news. While multi-GPU pipelines are still not fully baked (with DX12 support generally ignored by developers), iGPUs allow for multi-monitor setups that can offload lower-priority windows away from the dGPU.
    My wish is that this market grows bigger so that AMD/Intel/NVIDIA have a reason to fix the issues caused by such split-screen setups. (I had VLC crash multiple times, and some programs cause fatal graphics driver crashes fairly consistently.)

    • @humzabari8757
      @humzabari8757 2 года назад

      Why would you want that? Their drivers for integrated graphics are horrendous. I got a laptop that had AMD's integrated graphics and my god it was a nightmare. I had issues for a year, practically since I opened it. Switched to Intel and haven't even had 1 issue. I'd be so much happier if they DIDN'T include iGPU's in all their processors so I can actually use the CPU and not have everything shitting itself.

    • @susnojutsu2525
      @susnojutsu2525 2 года назад +7

      @@humzabari8757 I think that's more of an unfortunate personal experience than a company problem or a design problem. I have a laptop with an AMD CPU that has integrated graphics and a GTX 1650 mobile, and I've had 0 issues.

    • @hg-ir8tb
      @hg-ir8tb 2 года назад +3

      @@humzabari8757 Laptops have their own issues (e.g. multiplexing) and I'm not sure if your woes were caused by hardware or software issues. But, if your issues were software-related, these new products will expand the market in which AMD iGPUs are applicable and incentivize AMD to improve their software support.
      Also, on a desktop, you don't have built-in monitors. If your dGPU fails, you might have a functioning monitor, but if you don't have an iGPU, you're going to have to buy a dGPU to validate.
      And, if you don't want to use the iGPU, that's totally fine on a desktop: If you have a dGPU, there's usually a setting in the BIOS/UEFI that allows you to turn off the iGPU. But it's great to have the option.

    • @ferwiner2
      @ferwiner2 2 года назад +5

      Desktop iGPUs are great for people who need lots of processing power, but not much in terms of graphics other than video out and occasional video decoding. This includes programmers, for example. Ryzen is more than capable as a professional rig, but it feels wasteful to buy a generations-old cheap Nvidia GPU just to output video. And if you don't have to do that, you can use the extra budget to get a higher-end CPU. Which effectively means AMD is taking the money these people would otherwise spend on Nvidia products. Sounds like profit to me.

    • @chexo3
      @chexo3 2 года назад

      I also hope open source drivers that are actually open source with no weird caveats are a thing.

  • @HansStrijker
    @HansStrijker 2 года назад +44

    I bought a Core 2 Duo E8400 in 2008. It lasted me about 10 years, until I bought a Ryzen 1700x in 2017, which has already been retired to my file server; I'm running a 3900 on my desktop now. It's been crazy, and indeed, I kinda like it. Though, on the flip side, during those 10 stagnant years I noticed software started being written with performance and optimization in mind, and now I'm noticing a return to less optimized software because the processors are again fast enough to take up the slack.

    • @5at5una
      @5at5una 2 года назад +5

      meanwhile nothing is optimized with intel 12 gen in term of power efficiency.. lol

    • @patlefofort
      @patlefofort 2 года назад +10

      The plague of electron and web based apps.

    • @aprofondir
      @aprofondir 2 года назад

      Web apps and stupid python shit can go die in a fire. There is 0 reason for a messaging app to use 2 gigs of ram.

  • @ExarchGaming
    @ExarchGaming 2 года назад +205

    I mean... jesus. They've literally shaken the entire industry up for the last 3 generations. When Zen 2 hit, the multi-core performance was staggering against Intel's 9000 series, and Zen 3 was a foot through the door, beating Intel's single thread and easily retaining the lead against 10th and 11th gen Intel. 12th gen Intel managed to edge its superiority back... for a few months, as Zen 4 is coming with a spinning back fist to Intel.
    Whether you're an intel fanboy, or an amd fanboy, you really must be tired of winning so hard.

    • @MAZZAR0TH
      @MAZZAR0TH 2 года назад +4

      Do you really have to use Jesus' name in vain? show some respect buddy.

    • @paulpietschinski3282
      @paulpietschinski3282 2 года назад +64

      @@MAZZAR0TH Jesus please make my AMD STOCKS RISE

    • @beryyjams
      @beryyjams 2 года назад +21

      @@MAZZAR0TH Jesus, I'm glad I'm too broke to afford some top-of-the-line rig, 'cause by the time I have enough money the competition might die down a bit and I can keep the same parts for longer than a year before they're outdated

    • @GeneralKenobi69420
      @GeneralKenobi69420 2 года назад +6

      They aren't really, though. A 15% boost in single thread is pretty much what you'd expect from going to 5nm alone. Also, I feel like showing clock speed while gaming is kind of a weird metric; why not just show, you know... frame rate? Which is the only thing people care about at the end of the day? Intel got the lead with Alder Lake and it looks like it's going to again with Raptor Lake. All in all, this looks to be a fairly disappointing generation, and I wouldn't recommend it unless it's fairly cheaper than its Intel counterparts. Not really sure why Linus seems to be all over it.

    • @Arwel22597
      @Arwel22597 2 года назад +2

      @@beryyjams this. Bro I'm on a 6700k and am now starting to struggle to hold 100hz at 1440p with other tasks. Buy pricey once, reap the benefits for the years to come

  • @cyjanek7818
    @cyjanek7818 2 года назад +197

    If AM5 could live as long as AM4, then it would be everything anyone could ask for.
    I don't know if it's possible, since DDR5 is new tech that might need refining, but it would be amazing.

    • @TwilightWolf032
      @TwilightWolf032 2 года назад +22

      Actually, since AM5 is coming so early in the life of DDR5, the platform could end up living even longer. DDR4 came out in 2014 and it's still relevant today, despite DDR5 modules being available to the public for over a year now. If DDR4 is anything to go by, we could be looking at 8 years of support for AM5 platforms.
      Of course, it will all depend on how Intel (and possibly Nvidia) react to the AM5 platform - if they have something better, or at least something that can put a huge deal of pressure on AMD, they might end up releasing an AM6 platform to stay competitive much sooner than we could expect. Only time will tell.

    • @acmenipponair
      @acmenipponair 2 года назад +4

      @@TwilightWolf032 On the other hand, with the new LGA socket they have many more pins available, so even an upgrade to PCIe 6.0 should be within the scope of that socket. AM4 was much more limited, and the 5xxx series used most of the pins it could (you need to leave some as ground too).

    • @Mr.Morden
      @Mr.Morden 2 года назад +2

      Zen 4 (specifically Ryzen 7000) will be limited to the small high end market only. That's because Zen4 only uses DDR5 and DDR5's price/performance/availability is too poor for the whole PC industry to switch over right now. DRAM vendors won't have their new fabs up until Q4 2024 and that's if they happen to come online according to plan. Only Samsung has a new fab coming online later this year, nobody else in the DRAM business. In the meantime actually-fast DDR5 will continue to be significantly more expensive and affordable DDR5 won't be able to compete with DDR4 in terms of price/performance. This is why Intel's 13th gen still maintains support for DDR4, if they didn't then the whole PC industry would come to a screeching halt from DDR5 unavailability. Remember that Intel is still in the vast majority of all PCs sold, non-enthusiast PCs like HP and Dell. If Intel went DDR5 only like Zen4 it would make the GPU shortage look inconsequential.

    • @NightMotorcyclist
      @NightMotorcyclist 2 года назад

      IIRC they tend to leave pins or contacts that don't do anything so that they can be mapped later on for whatever reason and slot right into the same socket or one with the same pin out if the VRMs or traces on the motherboard cannot handle the new processor.

    • @acmenipponair
      @acmenipponair 2 года назад

      @@Mr.Morden True. But on the other hand, it also shows the companies' different philosophies: Intel always tries to sell you their newest generation, yet (see DDR4/5) has to make compromises so that they can penetrate the whole market, while AMD will simply have two lines of CPUs on sale: the 7xxx series for the heavy enthusiasts, who are admittedly some kind of guinea pigs, and the 5xxx series, which will become more available now and is a proven concept for the rest of the market. They even introduced the G CPU line with the 5xxx already, so that they can get into the embedded and office system market (where you need a GPU on the chip to compete with Intel, as no office wants to spend an extra 300 dollars on a low-end GPU).

  • @Jason-ir5ig
    @Jason-ir5ig 2 года назад +292

    I hope AMD continues to offer efficient performance. A top-of-the-line CPU doesn't appeal to me if it's drawing several hundred watts

    • @steve88luv
      @steve88luv 2 года назад +11

      It's going to get to that point though; all this extra performance needs energy. They might be able to stay below Intel most of the time, but the trend is basically upwards every generation.

    • @tomgibbs5751
      @tomgibbs5751 2 года назад +41

      @@steve88luv yes and no. Let's not forget how a 5600x can outperform zen 1 CPUs with half the draw

    • @steve88luv
      @steve88luv 2 года назад +5

      @@tomgibbs5751 Yeah there will be leaps sometimes in performance without more power but eventually chips are going to be very power hungry.

    • @ekifi
      @ekifi 2 года назад +10

      @@steve88luv The point of Moore's Law and manufacturing improvements should be exactly that: offering exponentially better performance while staying within the limits of what's physically possible, so it shouldn't be a given for power requirements to go up. That just means something's going kinda wrong somewhere in the design and production process of these products. A theoretically new uArch that performs 15% better than the previous one, on silicon a full node ahead, yet manages to consume more power is a bit concerning in my opinion.

    • @nikhafizal1246
      @nikhafizal1246 2 года назад +5

      Ever considered Apple m1 Max? I'm waiting for the am5 reviews to come out before purchasing an Apple

  • @chaschuky999
    @chaschuky999 2 года назад +638

    Lmao literally just finished watching their keynote. You guys had that upload time so perfect.

    • @robert19
      @robert19 2 года назад +31

      they probably had it scheduled - amd would have been working with them prior to the public keynote.

    • @benjaminoechsli1941
      @benjaminoechsli1941 2 года назад +10

      @@robert19 Yep. The major tech media outlets are given the slides you see in the video as long as they promise not to release them until the keynote finishes.

    • @uddinna
      @uddinna 2 года назад +1

      Also, the part where there was no one in the room with him. Because all the information was still corporate secret.

    • @mar2ck_
      @mar2ck_ 2 года назад +1

      Its called embargo time

    • @panzerofthelake4460
      @panzerofthelake4460 2 года назад

      lmao nobody aske- jk

  • @4Leka
    @4Leka 2 года назад +32

    I'm glad that none of those things actually make older PCs obsolete. The vast majority of people won't need or even benefit from PCIe Gen5 for several years.

  • @bend9064
    @bend9064 2 года назад +237

    AMD: launches a huge leap in performance
    Me who just bought a 5600G: goddamnit

    • @gotworc
      @gotworc 2 года назад +11

      I just bought a 5700x lmao

    • @nmotschidontwannagivemyrea8932
      @nmotschidontwannagivemyrea8932 2 года назад +97

      It doesn't make your chip any worse.

    • @olthdorimirth6055
      @olthdorimirth6055 2 года назад +4

      Bababoey... My r9 5900x is gonna get outpaced. Immediately.

    • @MoonRegolith
      @MoonRegolith 2 года назад +57

      Don't worry, you just saved yourself from being an early-adopter of a new chipset. That's always a rocky first three months. Just get onto the new platform in 18 months, you'll want to be closer to the launch of 7000 series 3D versions anyway. Those will be gaming beauties.

    • @KaRuNaRuGa
      @KaRuNaRuGa 2 года назад +30

      LMAO. I've been caught up in the *Just wait™* moment for so long that I didn't even bother anymore. Still going strong with my i7-3770 🥲

  • @xhhdbdhsb8192
    @xhhdbdhsb8192 2 года назад +118

    I still remember the days when Intel was king and had no real competitors. Glad to see that there is actual competition in the market now.

    • @dethtour
      @dethtour 2 года назад +19

      It's sad that Intel's decades-long streak as king has ended. AMD this past gen has killed Intel. When we had the shortage, the shelves were always stacked with Intel chips, but you could never buy an AMD walking into the store. People want AMD. Now their GPUs are better than Nvidia's, minus the ray tracing, for a lower price. AMD has always been the underdog. It blows my mind how they are crushing the competition in every aspect.

    • @SamuraiGuy
      @SamuraiGuy 2 года назад +5

      I mean, that wasn't that long ago.

    • @dethtour
      @dethtour 2 года назад +5

      @HorrorG.C.L how? If you would mind explaining

    • @nadir4562
      @nadir4562 2 года назад +1

      @HorrorG.C.L Explain lmao

    • @zo_to
      @zo_to 2 года назад +2

      Really, when the competition started, Intel reduced the prices by 50%!!

  • @benjaminoechsli1941
    @benjaminoechsli1941 2 года назад +198

    AM4 users: We want more chipset lanes! Moar USB!
    AMD: _backing the dump truck up_

    • @N1lav
      @N1lav 2 года назад +13

      TBH I think the thing that was holding AMD back was the socket. They had limited number of pins to work with. I hope they overprovision AM5 like they did with AM4, so it too can run for 4-5 generations of CPUs before AM6 is required

    • @scarletspidernz
      @scarletspidernz 2 года назад +2

      @@N1lav AM4 multi generation compatibility has been a huge selling point for AMD and dropping the A series was a good move. Definitely feels like a good move in the right direction with AM5

  • @maximiliandeitrick9468
    @maximiliandeitrick9468 2 года назад +841

    Just caught this announcement live, can't wait to see what am5 has to offer later this year

    • @harveyhandog8736
      @harveyhandog8736 2 года назад +1

      Wait, is DDR5, when it's back in stock, expensive too?

    • @bagasfabianmaulana
      @bagasfabianmaulana 2 года назад +43

      @@thelightsilent wdym is marginal changes? did you even watch the announcement or this video? and about liquid nitrogen tank, are you talking about i9-12900KS?

    • @aimannorazman7959
      @aimannorazman7959 2 года назад +34

      @@thelightsilent lol AMD hater, can't you just be happy that we have competition, what's wrong with that?

    • @Mundilfari_
      @Mundilfari_ 2 года назад +25

      @@thelightsilent Shilling for intel so hard and for what?

    • @BrickTamlandOfficial
      @BrickTamlandOfficial 2 года назад +2

      its gonna be a paper launch until the next processors are announced lol

  • @technetium_tech
    @technetium_tech 2 года назад +354

    The AI acceleration is an interesting approach, as usually it would be on the GPU, as with Intel and Nvidia. I wonder if it could be used for AI-accelerated denoising in Blender... Also, 5+ GHz without using 250W is amazing efficiency relative to Intel.

    • @broxy6350
      @broxy6350 2 года назад +16

      To be fair, some models train faster on a CPU than a GPU.

    • @F1ll1nTh3Blanks
      @F1ll1nTh3Blanks 2 года назад

      Might hurt open sourcing a bit.

    • @whitehavencpu6813
      @whitehavencpu6813 2 года назад +1

      I saw this exact same comment on Steve's channel.

    • @technetium_tech
      @technetium_tech 2 года назад +5

      @@whitehavencpu6813 Which Steve? Upside-down Steve or tech Jesus Steve? Either way, this comment will be on both!

    • @whitehavencpu6813
      @whitehavencpu6813 2 года назад

      @@technetium_tech lol

  • @nohomeforfreepeople2894
    @nohomeforfreepeople2894 2 года назад +210

    I like seeing these jumps, but would like to finally see a jump from software makers to get better memory management and give us an "upgrade" by making their software better. That might justify all of these subscriptions.

    • @daitedve1984
      @daitedve1984 2 года назад +14

      +1, more compact memory consumption will let the caches (at every level) work to their full potential.

  • @Kenjionigod
    @Kenjionigod 2 года назад +180

    Honestly, AMD is being more competitive on the GPU side of things as well. It's a good time to be into PC gaming.

    • @xicofir3737
      @xicofir3737 2 года назад +10

      Have you seen GPU prices?
      They are coming down, but an RTX 3060 Ti still costs as much as my GTX 1080 Ti where I live.
      The price/performance ratio is still the same as 5 years ago for rasterization performance.

    • @Rappoltt
      @Rappoltt 2 года назад +5

      Per dollar, they win in gaming. They're only closely losing in productivity, but their encoder subjectively looks very bad.

    • @Erunest
      @Erunest 2 года назад +1

      only thing i miss on a amd gpu is broadcast

    • @IgorBrendo
      @IgorBrendo 2 года назад

      AMD drivers sucks ass, never going back to them if they continue in this path

    • @MostlyPennyCat
      @MostlyPennyCat 2 года назад +1

      Yep, time to get turbo excited about Radeon 7000!
      The idea that we're starting to move into a true MCM GPU world is mouth-wateringly exciting.
      That dream they said was impossible, having multiple GPU chiplets transparently acting as a single GPU.
      This is the stuff dreams are made of.
      The APUs that can come from CPU and GPU chiplets is something very special if it actually comes to fruition.
      I are excite.
      Excite a lot.

  • @ultimateclappage9109
    @ultimateclappage9109 2 года назад +48

    I first got into computers when the Intel 8000 series and Ryzen 2000 series were fresh. I was deciding between the 8350K and the 2200G, and ultimately went with the 2200G for better pricing. I'm glad I went AMD; I have been switching parts out once a year and have never had to upgrade the platform, it's awesome.

    • @milescarter7803
      @milescarter7803 2 года назад +1

      Now you can find used 9600k for $80-120, still good platform for a couple years out.

    • @bullshitdepartment
      @bullshitdepartment 2 года назад +1

      Lol, I got into computers when the i7 4770 was decent... still is, somehow.

    • @leviathan19
      @leviathan19 2 года назад

      damn my first computer was a compaq 486 dx II now I feel old as F

    • @Hyperion9700
      @Hyperion9700 2 года назад

      I'm running a 2200G and it's an amazing little CPU

    • @CanuckGod
      @CanuckGod 2 года назад +1

      @@leviathan19 I got into computers... when I got a 486 SLC2/66 in 1994; to be fair, I was into computers before that, but that was the first PC I owned, and nearly 30 years later, it's Ryzen 5 3600. So yeah, you and I are ancient 😂

  • @haylog1543
    @haylog1543 2 года назад +436

    AMD keeps getting W's at the moment, it's brilliant

    • @Mike-kr5dn
      @Mike-kr5dn 2 года назад +6

      Intel might be beaten after the longest time in single core performance, we will see.

    • @Mundilfari_
      @Mundilfari_ 2 года назад +24

      @@Mike-kr5dn They’re probably gonna have their engineers working permanent overtime to come up with something. Just like the rushed 12900KS

    • @hardrivethrutown
      @hardrivethrutown 2 года назад

      Well yes... Other than desktop 4000 chips and the RX 6500/6400

    • @allenwalker9928
      @allenwalker9928 2 года назад +7

      @@Mike-kr5dn Lolz, Zen 4 is 15% faster than Zen 3, but Alder Lake is 25% faster than Zen 3 in Cinebench R23, which is what AMD used for the comparison

    • @Mike-kr5dn
      @Mike-kr5dn 2 года назад +4

      @@allenwalker9928 can’t wait to get my hands on 13900k 😉

  • @Armetron
    @Armetron 2 года назад +5

    I've been holding off on upgrading my machine for quite some time now, and seeing how AMD has been pushing the CPU market with Ryzen, I knew I wanted my next build to be AMD-based. Now, with the announcement of the AM5 socket and its promised 5 years of future support, I think it's finally time to upgrade.

  • @teighan7829
    @teighan7829 2 года назад +80

    I have a Ryzen 5 3600X and man she feels like a beast to me, amazing how fast stuff is improving

    • @erick2328
      @erick2328 2 года назад +4

      im still using 2600 and it still feel good to use

    • @redragongaming
      @redragongaming 2 года назад +8

      I am using a Ryzen 5 1600 overclocked to 3.75 Ghz and it seems enough to me, i like it because of its speed.

    • @stepbro1305
      @stepbro1305 2 года назад +2

      I gave my 3600x to my brother same with my 5600xt when I upgraded and he loves it

    • @yuvrajbhattal8213
      @yuvrajbhattal8213 2 года назад +6

      Bro im using r7 1700 with gtx 1650s and it feels completely fine

    • @blayde4577
      @blayde4577 2 года назад

      did u assume 3600x gender ?

  • @Nostalgia_Realm
    @Nostalgia_Realm 2 года назад +827

    Mendocino also used to be the name for some late 90s Celeron CPUs. Product names are getting more confusing the longer I've been in the PC space.
    Not that it's easy to confuse products that were released 20 years apart, but it can still be confusing if you need info on older parts but the search results have been overshadowed by newer parts with the same name.

    • @exscape
      @exscape 2 года назад +47

      Not just "some" Celeron, the legendary Celeron 300A!

    • @johngangemi1361
      @johngangemi1361 2 года назад +9

      I had one of those processors.

    • @VoldoronGaming
      @VoldoronGaming 2 года назад +15

      AMD may run into a trademark problem with Intel over that name if Intel trademarked it.

    • @shawnpitman876
      @shawnpitman876 2 года назад +9

      If you're even attempting to use computer parts from 20 years ago, you deserve to have those issues. There is NO REASON to be running systems that old, are you the US military who still launches nuclear weapons with floppy disks?

    • @Diapolo10
      @Diapolo10 2 года назад +7

      And don't even get me started with getting AMD CPUs and Radeon GPUs mixed up due to the similar naming scheme.

  • @RealMephres
    @RealMephres 2 года назад +141

    "Every new Ryzen 7000 CPU will have integrated graphics" is pretty damn good news, let alone the massive frequency bump and performance upgrades. Once I got a new GPU, I'll switch to the 7000 series for replacement of my 3600X.

    • @tapp3r109
      @tapp3r109 2 года назад +6

      My plain old 1600 is still chugging along. It's gonna be a huuuge upgrade whenever it's time to replace it

    • @MrMrRubic
      @MrMrRubic 2 года назад +4

      Though not many people actively use an iGPU (especially on higher-end systems), having one there for diagnostic purposes is an absolute win!

    • @blvck.8197
      @blvck.8197 2 года назад +1

      Just bought a 5800x now its obsolete 😭

    • @RealMephres
      @RealMephres 2 года назад +1

      @@MrMrRubic It's a win nonetheless, and as long as the price stays reasonable, it's perfect.

    • @Six_Gorillion
      @Six_Gorillion 2 года назад +2

      Dont really see any point for mid to higher tier cpus having that, they will be paired with GPUs anyway 99% of the time.

  • @tomwaller6893
    @tomwaller6893 2 года назад +67

    Wonderful, Linus. Now, if the economy would just allow all this to be affordable, I would get excited - but the world is facing a massive hurt.

  • @shaneneyome
    @shaneneyome 2 года назад +200

    Absolutely love being outdated every time I upgrade. Makes me feel like upgrade paths are actually feasible

    • @takezokimura2571
      @takezokimura2571 2 года назад +16

      I'm still using a GTX 970, and I will only upgrade to RTX 3060 next year at the bare minimum. I'm always 2-3 generations behind.

    • @NuggetsXInfinite
      @NuggetsXInfinite 2 года назад +16

      @@takezokimura2571 go 3070 or 3060ti at the minimum, the 3060 doesn’t even hold up well enough against 20 series cards. The 3060ti does however

    • @IrisCorven
      @IrisCorven 2 года назад +3

      @@takezokimura2571 I upgraded from a Phenom II x4 965 right before COVID to a Ryzen 5 3600, and I'm still using my GTX 1060 that I got when it came out. It was the first PC component I caved and bought at launch.

    • @Aereto
      @Aereto 2 года назад +5

      Intel makes us replace our motherboards every generation by changing sockets every generation.
      AMD only forces the change every 2-3 generations at most, depending on whether the chipset can keep up.

    • @Slayiden
      @Slayiden 2 года назад +2

      I literally just bought a b550 motherboard and a ryzen 9 5900x and then this drops a few days after 💀

  • @joelconolly5574
    @joelconolly5574 2 года назад +82

    In the end, the consumer wins.
    We literally just get what we've wanted all these years. And I'm all for it.

    • @logipilot
      @logipilot 2 года назад +5

      More old PCs in the landfill... the planet loses

    • @he.lena21
      @he.lena21 2 года назад +1

      @@logipilot Woke

    • @AsianFlex
      @AsianFlex 2 года назад +17

      @@logipilot that's if the people upgrading don't sell the old pcs or keep them for some reason

    • @anxiousearth680
      @anxiousearth680 2 года назад +12

      @@logipilot Not everyone is rich enough to buy new lol. And stuff like processors and ram last long enough to change hands in used form.
      So even if people upgrade, old stuff still gets use.

    • @kuma8030
      @kuma8030 2 года назад

      @@logipilot your woke ass loses. the planet doesnt care you animal. humans aint the planet. the planet is an orbiting rock you think an orbiting rock cares bout the insects leaching on it die or not?

  • @sofiaknyazeva
    @sofiaknyazeva 2 года назад +153

    I remember when AMD was falling down, around 2007. I'm impressed by how they've pushed their limits. Seems like now Intel has to deal with a new era.

    • @Gatorade69
      @Gatorade69 2 года назад +9

      Even then in 2007 their processors were decently priced. AMD has always been decently priced for the performance provided.

    • @Orthoyt
      @Orthoyt 2 года назад

      AMD should die, Intel is better; their Core i5 12600K is cheap and nearly as fast as a Core i9

    • @Drace_The_Ace
      @Drace_The_Ace 2 года назад +13

      @@Orthoyt and the award to the worst obvious troll goes tooo...

    • @Orthoyt
      @Orthoyt 2 года назад

      @@Drace_The_Ace I'm not joking

    • @Orthoyt
      @Orthoyt 2 года назад

      @@markonw6661 and amd isntt

  • @Delta0429
    @Delta0429 2 года назад +38

    3:25 I think the most interesting part of this presentation was this portion right here, where they showed the Blender render performance vs. the 12900K. If you look at the footnotes of the presentation, they reveal the actual time each render took: 204 seconds for AMD and 297 seconds for Intel. They actually did the math backwards, so the new 7000-series processor is actually 45% faster than Intel, not 31%. That is an absolutely MASSIVE difference, and clearly pushes AMD to the top in that specific scenario. It makes me wonder if that holds true for their other performance reports, and if they did those the wrong way too... If that's the case, we could be looking at huge performance gains from this generation. (The arithmetic is worked through at the end of this thread.)

    • @phillip_iv_planetking6354
      @phillip_iv_planetking6354 2 года назад +2

      LOL.
      Have you seen the latest Raptor lake score that was reported on?
      20% faster than the 12900k at a measly 4.6ghz....

    • @Delta0429
      @Delta0429 2 года назад +7

      @@phillip_iv_planetking6354 I'm happy with that too, competition is amazing for consumers. Intel is finally being pushed to get better after a decade long hold of minor performance gain, and AMD is pushing them to do it by consistently making good performance gains generation after generation for that last 5 or so years. I love this, it's great for me and for you, as it makes our performance better and better quicker than ever

    • @phillip_iv_planetking6354
      @phillip_iv_planetking6354 2 года назад +1

      @@Delta0429 I love the competition I just hate fanboyism.

    • @user-fq4oq9qv7b
      @user-fq4oq9qv7b 2 года назад +2

      I'm not sure what math you did, bro, but it's wrong?
      297-204=93
      93/297*100=31.31%
      So ye?

    • @dxdux
      @dxdux 2 года назад +8

      @@user-fq4oq9qv7b When talking about the increase, it should be 93/204. But it is really about the way you word it: 1. You can say Intel is 31% slower than AMD, or 2. AMD is 45% faster than Intel; in that regard he is right.
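
To make the arithmetic in this thread concrete, here is a minimal worked check using only the 204-second and 297-second render times quoted above:

```python
# Blender render times quoted from AMD's footnotes (seconds, lower is better).
amd_s, intel_s = 204, 297

# Throughput is 1/time, so the two common phrasings are:
intel_slower_pct = (1 - amd_s / intel_s) * 100  # Intel's throughput vs AMD's: ~31.3% lower
amd_faster_pct = (intel_s / amd_s - 1) * 100    # AMD's throughput vs Intel's: ~45.6% higher

print(f"Intel is ~{intel_slower_pct:.1f}% slower; AMD is ~{amd_faster_pct:.1f}% faster")
```

Both phrasings describe the same data; the only difference is which chip's figure sits in the denominator.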

  • @joshuasedgwick5557
    @joshuasedgwick5557 2 года назад +163

    It’s great to see how good all of these improvements are getting!

    • @DeepakThakur-wl3gr
      @DeepakThakur-wl3gr 2 года назад +16

      @@thelightsilent Every angry Intel supporter right now LMAO

    • @Namamiri
      @Namamiri 2 года назад +14

      @@DeepakThakur-wl3gr Ah, don't listen to that guy, he copy-pasted the same text in multiple threads. I love that this back and forth is happening. I have so much choice, and I only believe the third-party benchmarks that will come out after release. It's a good time, after all the years Intel sucked us dry.

    • @eddiemate
      @eddiemate 2 года назад +3

      ​@@thelightsilent
      What are you saying? What AMD has announced *is* a big change. A marginal change is going from Intel's 10th gen to 11th gen processors. The step from Ryzen 5000 to Ryzen 7000 is *huge*.

    • @bgop346
      @bgop346 2 года назад +1

      @@DeepakThakur-wl3gr wot i buy whatever is better value and dont care about amd or intel and tbh i expect zen 4 to win in gaming by a few fps, but it depends on price/productivity perf as you wouldnt notice a few fps

    • @bgop346
      @bgop346 2 года назад

      @@eddiemate Bro, it's saying 15% ST (clocks/IPC) at the top end too, so I don't think it will be HUGE (it was also validated with Cinebench, so that would put it below a 12600K: 1930 vs 1883; a 5950X gets 1635, do the math)

  • @FutureKLX300SMowner
    @FutureKLX300SMowner 2 года назад +11

    I'll be happy to get one used for 50-60% less a few years after release. worked great for me so far. my first computer was an i7 4770k, 16gb ddr3 2400, and a GTX 780 for 200$ Canadian 3.5 years ago.

    • @fabrb26
      @fabrb26 2 года назад

      While I'm still on a 2-core i5, 4GB of DDR3-1066, no GPU but an HD 4000, and a 1350 euro bill from the Apple store in 2012. It's all about not knowing shit about computers but trusting the salesman like a prophet 🤣🤣

    • @monke2361
      @monke2361 2 года назад

      wow what a deal, I never usually buy the newest generations anyways.

    • @FutureKLX300SMowner
      @FutureKLX300SMowner 2 года назад

      @@monke2361 yeah it was this guy's old Alienware x51 that had a power supply blow up, couldn't test anything so was definitely taking a risk lmfao

  • @paladingeorge6098
    @paladingeorge6098 2 года назад +57

    My 3700x, even 3 years later, still feels like a beast to me. I'm super excited to see what the future holds.

    • @swissix4947
      @swissix4947 2 года назад

      I've had a 3800x for about a year. It's fast and runs every game effortlessly with my 1080 Ti. It's no comparison to my old i5 4690, but even that was still usable, just not for decent gaming. Though I also could never really reach its full potential, because my old GT 1030 limited it even in CS:GO.

    • @shubs463
      @shubs463 2 года назад

      I've got a 3700X also, and yes, it's still a beast.
      Also, quick question: what are your typical clock speeds? Mine never hits above 4.05GHz when it's supposed to be able to do up to 4.4GHz according to the box. (On the highest profile on my Asus motherboard.)

    • @paladingeorge6098
      @paladingeorge6098 2 года назад

      @@shubs463 Mine usually sits between 4.00 and 4.2 all core, though keep in mind the boost frequency AMD lists on the box is single core only, meaning that only one core will boost that high at a time and only for a short while.

    • @michaelkeudel8770
      @michaelkeudel8770 2 года назад

      Same here, 3700x only a little over a year old, looking at a 5800X3D maybe in a year, then no need to upgrade for at least 2 more years after that. I don't need the newest shiniest tool in the shed to do what I do these days, I can wait.

    • @michaelkeudel8770
      @michaelkeudel8770 2 года назад

      @@shubs463 that's a single core boost, I've undervolted mine and locked it to all core 4.3ghz all the time, running the stock cooler, and has never gone above 61C at full bore.

  • @Starfals
    @Starfals 2 года назад +9

    I'm gonna be very tempted to finally upgrade from i5-2500k... It's been 10 years now... it really feels like the right time. We shall see of course :)
    P.s. I've been saying this for 10 years now LOL.

  • @stevenjack6283
    @stevenjack6283 2 года назад +265

    Gosh, with all this new exciting tech coming out it would be a shame if there were to be, I don't know, say a global shortage of the necessary materials to make sophisticated microprocessors in large quantities at a reasonable price. Haha good thing that could never happen...

    • @razerow3391
      @razerow3391 2 года назад +6

      Yeah but AMD doesn't pay Linus to say that.

    • @kaldo_kaldo
      @kaldo_kaldo 2 года назад

      @@razerow3391 AMD didn't pay them to say anything in this video. You realize this is a tech channel, about tech, that talks about tech right? And when new tech shows up, they talk about it.
      We can't know what the future will hold, they aren't omniscient; how would they talk about something that hasn't happened, and may not happen? Two big parts of the chip shortage were the COVID shutdowns and the increased demand due to the pandemic - people being stuck at home, buying things for entertainment, getting into hobbies they had never touched, getting computers, webcams, etc. to work from home. That phase has passed. Things are recovering. It's not over yet, but there's no guarantee that availability will be a problem this time around.
      CPUs have been widely available for over a year now. GPUs are the thing majorly impacted until recently, and crypto has a big role to play there. But the recent crypto crash has lowered the prices nearly to MSRP and the availability is pretty good. Still needs work, but it's not impossible to get something and to get it for a good price.

    • @hammerdick82
      @hammerdick82 2 года назад

      @@razerow3391 Judging by your past comments and this one, you should take your negative little ass off this channel and stay wherever you go. My god you people are nothing but a bridge troll destroying the joy of all those you come across

    • @wylelias
      @wylelias 2 года назад +9

      Yeah new tech for inflated prices should go over well in the booming economy.

    • @kaldo_kaldo
      @kaldo_kaldo 2 года назад

      @@wylelias What's the price going to be?

  • @kizzericgraves5921
    @kizzericgraves5921 2 года назад +170

    I love the way the tech world keeps progressing, even if you think there's no way things can get better they always do! :)

    • @jakobe328
      @jakobe328 2 года назад

      i encourage you to research something called moore's law.

    • @enraversadi2903
      @enraversadi2903 2 года назад +8

      i also encourage you to research something called GPU MSRP creep.

    • @masterdynamite1015
      @masterdynamite1015 2 года назад

      Not if you're stuck with the technology of a decade ago :)

    • @olloi91
      @olloi91 2 года назад +7

      Tech is improving too fast. Look at game developers: they try so hard, yet the games we get are bad.

    • @deeomayall
      @deeomayall 2 года назад +5

      Until they don't.
      Linus's optimism is great and it feels so 2001. However, AMD is no stranger to overpromising and underdelivering. Even worse, even if all this were true, out there in the real world we're still plagued by component shortage, inflation and spiralling operating costs, plus good old greed. Imagine the likely future in which this CPU architecture is nowhere to be found for 6 months from launch, and when it appears it's priced at 2x what you were prepared to pay. The current market completely killed Ampere, it can kill Zen 4 just as easily.

  • @How23497
    @How23497 2 года назад +111

    That integrated graphics announcement is especially exciting for me. Will be a big upgrade from my 3200G

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 2 года назад +1

      Just buy a GPU... it'll be way cheaper, easier, and much more powerful 🤦‍♂️ Even the 10-series mini GPUs will blow away any APU on the market quite easily and cost less than JUST a single AM5 chip with that graphics capability, and that's before you upgrade the mobo and RAM too

    • @harryhall4001
      @harryhall4001 2 года назад +8

      @@Shadow0fd3ath24 True when there isn't a shortage. I appreciate things are getting better on that front but we are still not at MSRP yet and it could still be a while before we are

    • @How23497
      @How23497 2 года назад +24

      @@Shadow0fd3ath24 maybe I didn’t state my use case. The games I play are very CPU intensive and for that RDNA 2 integrated graphics is more than enough. You are also underestimating future APU performance.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 2 года назад

      @@harryhall4001 The 10 series isn't in a shortage lol. 1050 Tis are 210% as powerful as the TOP APU on the market.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 2 года назад

      @@How23497 Even a 1050 Ti is 210% more powerful than the TOP APU SKU on the market and will work alongside that APU if needed; you'd be better off getting a slightly better CPU and a normal GPU than this new chip lol. There's no circumstance where what you're describing makes sense!! That's the whole point.
      Even a GTX 1060 and a slightly better CPU (that'll even work with your board) would give you 300% more performance in every game for the same price as this CPU alone lol, and that's with current prices factored in

  • @evilemperordude
    @evilemperordude 2 года назад +18

    I'm sure it'll be a mind blowing upgrade when I buy a new CPU in 5+ years.

    • @benruss4130
      @benruss4130 2 года назад

      Ya, it was when I upgraded last year. I went from an i7-5930K (6C/12T Intel Extreme from 2014) to a TR PRO 3975WX, and I've been blown away. In single-core scenarios it's around 25% faster (unless it's a memory-bound single-core workload, then it's more like 50+% faster), and in multi-core scenarios it is literally 5-6x (500-600%) faster.

  • @lucasmelee
    @lucasmelee 2 года назад +42

    A ryzen 7000 with RDNA2 integrated graphics sounds so crazy in terms of cost-benefit. 2002 was the last time I was actually playing games like Vice City on onboard graphics and having no issues at all. Are those times back?

    • @user-ww6dw4by9w
      @user-ww6dw4by9w 2 года назад +2

      Possibly even better, I'm afraid. An APU with RDNA 2 makes entry-level and low-budget GPU cards pointless. Just imagine what a PS5 does with a similar APU...

    • @FVBmovies
      @FVBmovies 2 года назад +5

      @@user-ww6dw4by9w Means finally price drops for all them 750s, 1030s, 1050s and 1660s. Long deserved.

    • @user-ww6dw4by9w
      @user-ww6dw4by9w 2 года назад

      @@FVBmovies Dude, all of the GPUs you listed will only be good for a museum... It's OK if you already have one of them. After all, I recently got a 6500 and I'm fine. But with a very strong APU capable of 2K gaming, there will be no place for such GPUs in the market. My opinion.

    • @Lolkork
      @Lolkork 2 года назад

      The graphics on the standard CPUs will probably be fairly under-powered to save on cost, mainly just serving as display output for computers without discrete graphics. A CPU with gaming-capable integrated graphics will likely cost a bit extra.

    • @user-ww6dw4by9w
      @user-ww6dw4by9w 2 года назад +2

      @@Lolkork The 5600G with its *poor* Vega 7 is already good for 1080p... AMD does that; Intel doesn't...

  • @andrewmontague9682
    @andrewmontague9682 2 года назад +34

    Thank goodness this has come along! For a moment I thought the only thing I couldn’t afford to upgrade was my graphics card; now it’s the whole system! Yay.

  • @Nate7.75
    @Nate7.75 2 года назад +9

    I actually really like the fact that B650 has PCIe 5.0 for storage only, as that's really the only thing that saturates the 5.0 lanes. GPUs are just not there yet; only the very top end is beginning to use 4.0 bandwidth

    • @logipilot
      @logipilot 2 года назад

      I'd like to carry over my PCIe 3.0 GTX 1070 Ti, so B650 for me

    • @benjib2691
      @benjib2691 2 года назад +6

      Actually, even really high-end GPUs like the 3090 and 3090 Ti or the 6950 XT only begin to saturate PCIe Gen 3 x16 bandwidth. I also feel that if you don't plan on overclocking, B650 is the better choice compared to X670 and X670 Extreme, as PCIe Gen 4 x16 should provide all the bandwidth GPUs need for at least 4 to 5 years.
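      For context on that bandwidth claim, here is a minimal sketch of the theoretical one-direction x16 throughput per PCIe generation, derived from the published 8/16/32 GT/s rates and 128b/130b encoding (my numbers, not anything stated in the video):

        # Theoretical one-direction PCIe x16 bandwidth per generation.
        RATES_GT_S = {"Gen3": 8, "Gen4": 16, "Gen5": 32}  # transfer rate per lane

        def x16_bandwidth_gb_s(gen: str) -> float:
            # 128b/130b encoding: 128 payload bits per 130 transferred bits,
            # 8 bits per byte, 16 lanes in an x16 link.
            return RATES_GT_S[gen] * (128 / 130) / 8 * 16

        for gen in RATES_GT_S:
            print(f"{gen} x16: ~{x16_bandwidth_gb_s(gen):.1f} GB/s")
        # Gen3 x16: ~15.8 GB/s, Gen4 x16: ~31.5 GB/s, Gen5 x16: ~63.0 GB/s

      So a card that only begins to stress Gen 3 x16 has roughly 4x headroom on a Gen 5 slot.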

    • @kuma8030
      @kuma8030 2 года назад +2

      @@benjib2691 Unless you need a shit ton of SATA ports and USBs, B650 will be the go-to

    • @arrgghh1555
      @arrgghh1555 2 года назад +1

      I'll make that decision when I see some products and pricing.

    • @bronin2642
      @bronin2642 2 года назад +1

      @@logipilot That doesn't matter. PCIe is backwards compatible. You can use your 1070 with every PCIe 5.0 slot

  • @ThegreatCMFB
    @ThegreatCMFB 2 года назад +2

    Man, with how expensive DDR5 RAM is, I'm glad I got my 5800x on sale, but this is super exciting for my next build! In 5-7 years...

  • @mAtT47739
    @mAtT47739 2 года назад +26

    I'd love to see an AM4 platform retrospective where you compare all of the main Ryzen CPUs (8-core, 6-core, 4-core, etc.) to better show the growth. Maybe even lock everything to the same clock to show IPC growth too.

    • @anhiirr
      @anhiirr 2 года назад

      I think they messed up big on the APU front. Their 3000 series had the better iGPUs, but instead of trying to compete with Intel on IPC while keeping a strong iGPU, they put smaller iGPUs on the newer parts to prioritize pure CPU power and market themselves to offices, students and similar end users. Then they launched the 5000-series APUs at an insane price and with such low cache that they're not even worth using with a GPU or as an APU. For people trying to build an ideal HTPC / light-gaming system, AMD couldn't deliver on that front.
      So now I get to (have to) run a 5800X3D and a 3070 on my $200 ITX board with three video outputs and 90 A VRMs, or outright sell the whole build, because my MAIN rig already has a 5900X and a 3070. The ITX build was my "backup": the plan was to sell my main rig, run an AM4 chip and the 3070 in the ITX box until Q4 2022, then move to 13th gen or AM5 in a full ATX build, and still keep the ITX build (or a viable pure APU) for hack-and-slash games, sim/arcade racers, simulators and Humble Bundle titles, WHILE still having two other video outputs going to a projector or TVs for watching sports, racing and so on. That board is two years old with three video outputs, and it's painfully obvious it's purely meant for "work", with no gaming, streaming or game-room potential from a pure APU. Even funnier, the APUs need B-die 4400 MHz+ RAM just to get by, when B-die isn't remotely a requirement for Zen 3 otherwise, beyond K-chip-style custom OC scenarios.
      Before Zen 3, getting the best gaming performance out of an AMD CPU meant a 3950X across four CCXs, because it had the highest single- and dual-core IPC/clock rates. That also meant Ryzen Master / CTR / DRAM Calculator reboots to switch from an all-core 4.4 GHz OC back to letting 1-4 cores fully turbo for games - sometimes with two of your top four fastest cores on the same CCX, sometimes with a dormant CCX kicking on for background processing - all of which caused latency issues and transient ripple. Even an R9 on Zen 3 still has the interconnect between cores, the cache dividers and the RAM access path adding inherent latency versus Intel, right up to the end of AM4.
      And I'm not complaining: the day I was able to pull my 3950X, sell it for $500 and swap in a new 5900X that runs BONE STOCK with epic single- and multi-core clocks out of the box, never having to install or touch Ryzen Master again, was pretty nice - especially when you constantly run a stream or multiple monitors and still play AAA games, Warzone or Tarkov seamlessly in windowed fullscreen. Not once have I had a crash, BSOD or failure to boot versus my past issues with Zen 2 and RAM. For me that means a lot. I just wish they had managed to DELIVER on the APU front.

  • @Vikotnick
    @Vikotnick 2 года назад +104

    I love AMD. I'm a little sad that they are leaving Threadripper behind, but the next processor I buy from them will probably more than make up for the lost extra cores.

    • @mduckernz
      @mduckernz 2 года назад +2

      Where did you see that they are abandoning TR? I've seen no such news

    • @Orthoyt
      @Orthoyt 2 года назад +3

      intel better

    • @sparkythewildcat97
      @sparkythewildcat97 2 года назад +22

      @@Orthoyt In what metric, specifically?

    • @ryanm7868
      @ryanm7868 2 года назад +23

      @@sparkythewildcat97 Don't worry about him, he's just an Intel fanboy

    • @Maxime-ho9iv
      @Maxime-ho9iv 2 года назад +2

      Are they?

  • @nonae9419
    @nonae9419 2 года назад +10

    The AI ISA is known. It's called AVX-VNNI, but only the 256-bit wide instructions will be supported. Check the Intel Intrinsics Guide for more information.

  • @cyclone5299
    @cyclone5299 2 года назад +2

    A 15% gain isn’t even worth me getting out of bed. An Intel Raptor Lake 13900K paired with a new 4090 Ti will destroy AMD's chances of regaining any crown they had wished for. It’s true and you know it. 😍😍

  • @Zullfix
    @Zullfix 2 года назад +119

    Finally AMD is offering more PCIe lanes!

    • @MegaDrum1234
      @MegaDrum1234 2 года назад +3

      That's what got me confused with this.
      AM4 currently offers 24 PCIe lanes with Ryzen 3, 5, 7 and 9; only Threadripper gets 128.
      AM5 still has the same number of lanes, just much, much faster.
      So I don't actually see the increase in available PCIe lanes on the platform per se.
      Edit: Unless you look at the APU side, which currently gets 16 PCIe lanes - then I guess an argument could be made that there is an increase, since all new AM5 chips are APUs...

    • @LtdJorge
      @LtdJorge 2 года назад

      @@MegaDrum1234 4 are for the chipset. So 20 usable. I guess they are doing away with the chipset entirely, looking at those 14 USBs.
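      A quick tally of the lane budget being described in this thread, as a sketch based on the commonly cited AM4 split (the exact AM5 split isn't spelled out here):

        # Commonly cited AM4 (Ryzen 3000/5000) CPU PCIe lane budget.
        am4_cpu_lanes = {
            "x16 PEG slot (GPU)": 16,
            "M.2 NVMe": 4,
            "chipset uplink": 4,
        }

        total = sum(am4_cpu_lanes.values())               # 24 lanes off the CPU
        usable = total - am4_cpu_lanes["chipset uplink"]  # 20 left for user-facing devices
        print(f"total from CPU: {total}, user-facing: {usable}")  # total from CPU: 24, user-facing: 20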

    • @Mysteoa
      @Mysteoa 2 года назад

      @@LtdJorge They just have two chipsets on the X670E, possibly running off 2 PCIe 5.0 lanes each.

    • @Wildcard65
      @Wildcard65 2 года назад

      By the looks of it, those 24 lanes are AFTER slicing lanes off for the chipset(s)

  • @Aaron-zl5gq
    @Aaron-zl5gq 2 года назад +14

    The best part about this is seeing what Intel will do for 13th gen. I love competition

    • @Bunce9908
      @Bunce9908 2 года назад +3

      Agree wholeheartedly. From what I've seen they are giving an extra 10-15 percent single-threaded performance and DOUBLING the amount of efficiency cores. Should be coming out by the end of the year too!

    • @LightMCXx
      @LightMCXx 2 года назад +1

      @@Bunce9908 I heard they are more efficient than 12th gen.

    • @ascathon7069
      @ascathon7069 2 года назад

      @@LightMCXx They should be. If not, then they are fucked.

    • @1vend7
      @1vend7 2 года назад +1

      I'm also looking forward to it. I hope they bring competition, because when it launches I'll probably upgrade to 13th gen after 9 years of using an Intel i5-4670K overclocked to 4.8 GHz. I'm desperate for more power haha

    • @Bunce9908
      @Bunce9908 2 года назад

      @@LightMCXx sounds abt right

  • @igeekone
    @igeekone 2 года назад +40

    YES! Back to the glory days of rapid improvements. I cannot stand that we had a full decade of stagnation in CPUs. Seems like these announcements are propelling PCs to a future we should have today.

    • @jasonjenkinson2049
      @jasonjenkinson2049 2 года назад +3

      I think there's so much more money being spent on hardware since 2020 that the big companies have more money to invest in improvements

  • @poopy9172
    @poopy9172 2 года назад +2

    If I recall correctly, AMD already stated that 3D cache will not be there at the first release of Ryzen 7000, but that some sort of CPU with it will come later.
    I think it makes sense, because it lets AMD stay everywhere in the media while waiting for Intel, before dropping the bomb and regaining the advantage or even increasing it further...
    It also makes sense because 3D cache only really matters for gaming anyway (for now at least), and the first release of these CPUs should be focused on getting the entire manufacturing pipeline (chips, packaging, shipping and so on) as streamlined as possible, to keep things flexible for the "real" money-making CPUs: the ones for servers and HPC.
    But to be honest, I think the 6nm I/O die is really unexpected... I didn't think they would use such an expensive node for the I/O die, which historically doesn't consume that much power anyway...

  • @ingoos
    @ingoos 2 года назад +13

    Just goes to show: vigorous competition quickly spurs rapid innovation which, ultimately, benefits everyone!

  • @embarkingolive
    @embarkingolive 2 года назад +9

    I honestly have a hard time justifying any upgrades right now. On a 3800x, 2080 super and 32GB of ram. Although I know the newer gen tech is huge leaps ahead, my hardware still holds up incredibly well. I game at 1440p and modern triple A games maxed out still average 100fps+.

    • @barronhelmutschnitzelnazi2188
      @barronhelmutschnitzelnazi2188 2 года назад +1

      Those specs still hold up very well today. Give it about another 4 years or so and it will finally show its age. Might be sooner if Unreal Engine 5 is as good as they say it is, because there will be beautiful games running on that engine.

  • @cspreach1963
    @cspreach1963 2 года назад +23

    Wonder if we’re gonna get to a time where we just render stuff in 1 second. No need to upgrade anymore lol, we’ve achieved the perfect CPU.

    • @Anankin12
      @Anankin12 2 года назад +17

      No because we'll increase complexity

    • @theSato
      @theSato 2 года назад +16

      Well... you can render stuff in 1 second now. Try rendering a Nintendo 64 model or a 2-second 640x480 video.
      It's all perspective. As the power of CPUs and GPUs goes up, so do our resolutions, the number of tris in our models, texture sizes, lighting complexity, etc.

    • @zzzpoma9961
      @zzzpoma9961 2 года назад +1

      @@theSato But someday there won't be a point in increasing complexity apart from doing it for its own sake. UE5 photorealistic demos are nearing the "Is this real?" category.

    • @BingBongBagels
      @BingBongBagels 2 года назад +2

      @@zzzpoma9961 In 30 years I feel like VR is going to be like entering the Matrix. I'm sure we will be blown away by future innovations

    • @saricubra2867
      @saricubra2867 2 года назад

      "render stuff"
      Use a graphics card, CPUs aren't designed for vectors...

  • @Carnage8
    @Carnage8 2 года назад +7

    To think that just a couple of years ago I had the top-speed PCIe 3.0 spec, and a few short years later that is 25% of the bandwidth of the modern PCIe Gen 5.0 spec. Advancement in the industry has finally sped up because of competition from AMD, and it's finally exciting to see new products release again :)

  • @smokeyeyes__
    @smokeyeyes__ 2 года назад +9

    AMD is starting to feel like the Airbus of the computing world, and Intel the Boeing. Intel has been the established industry darling for so long but began to falter before a relative newcomer blew them out of the water and pushed them to compete. The consumer is the ultimate winner in both industries!

    • @AbRaSkZo
      @AbRaSkZo 2 года назад

      AMD ISN'T a newcomer.

    • @smokeyeyes__
      @smokeyeyes__ 2 года назад +1

      @@AbRaSkZo Definitely not, but relative to Intel they are, and the Zen architecture has really taken it to a new level

  • @kayburcky7146
    @kayburcky7146 2 года назад +4

    I'm sorry I'm sorry I just have to ask based on this title:
    "Are ya winning son?!"

  • @austinveenstra7186
    @austinveenstra7186 2 года назад +7

    I'm so happy to hear about AMD pushing the market like this. I'm excited to see what's available when I need to upgrade in 2-4 years and what kind of performance it'll deliver

  • @cajmo8635
    @cajmo8635 2 года назад +2

    Could I suggest doing a video where you take all the old CPUs and run them through similar benchmarks, from old stuff like R15 to new stuff like R23, just to see the scale of the improvement? (Though this probably isn't a new idea.)

  • @NickFje
    @NickFje 2 года назад +17

    The LTT crew are not wasting a second when they are uploading new content 😅

  • @julianmitchell2115
    @julianmitchell2115 2 года назад +15

    I think the most important part of this progression for CPUs and other technologies across the board is the variety available to the consumer.
    You can be on the bleeding edge if you want to, but you can also hang a generation back and have an upgrade to your current hardware for a great price.

  • @lester44444
    @lester44444 2 года назад +92

    I feel like every time AMD announces something, it's been unparalleled exponential growth 😂😂 so honestly not even surprised

    • @deathab0ve
      @deathab0ve 2 года назад +7

      Here is the issue though: every RUclipsr says that every time and doesn't say it for Intel, yet they are basically neck and neck. Which means that if AMD grew significantly this time and Intel didn't, AMD must have been worse than Intel before. And the same for last gen, and the gen prior, and the one before that. Which to me almost admits RUclipsrs overhyped and lied about the original Ryzen models - the 1000, 2000 and 3000 series.

    • @lester44444
      @lester44444 2 года назад +5

      @@deathab0ve I don't mind it too much, though I agree that that may be true. But they're punching up, with Intel being the giant that it is and has been, and AMD is the underdog that has forced the industry to be properly competitive as far as I know. LTT literally just released a video of Linus going to the intel facility, and heaped soooo much praise on them, so I'm not too too worried about bias haha

    • @user-pc8vn3ms2h
      @user-pc8vn3ms2h 2 года назад +1

      @@lester44444 Yes but AMD consistently release fundamentally worse drivers. The AMD hype train is fan boys who are trying to be part of the in-group.

    • @lester44444
      @lester44444 2 года назад +2

      @@user-pc8vn3ms2h i mean that's not something I'm saying is untrue, just that it still is rivaling Intel enough to push a far larger rival company that used to be totally comfortable in their industry to do better. And AMD drivers are improving enough compared to the past, at least IMO.

    • @dnawalahmed2979
      @dnawalahmed2979 2 года назад

      @@user-pc8vn3ms2h That's only on Windows. On Linux, since they're open source, the drivers as far as i know are pretty good for gaming

  • @Brimstonewolf
    @Brimstonewolf 2 года назад

    I'm still using a PC based around an i7 + mobo + RAM I bought in December '08. It's really only in the last few years that it started to show its age - quite a contrast to how quickly the 2013 GPU I put in it went obsolete.

  • @Mundilfari_
    @Mundilfari_ 2 года назад +29

    Intel engineers watching that announcement were probably dreading going back to work, because you know they’re gonna be putting in lots of overtime to come up with something.

    • @reaper15a
      @reaper15a 2 года назад +1

      Heads are still rolling almost an hour later, I can guarantee it.

    • @nachomolaolivera7580
      @nachomolaolivera7580 2 года назад +5

      13th gen is already locked in. They can't come up with something new at this point.
      Its strength will come from how well they've learned to handle 12th gen's new features.

    • @MsMonster128
      @MsMonster128 2 года назад +2

      Why would they?
      With only 15% more ST performance they would still be behind Alder Lake, and 30% more MT - considering the 5950X is already 20% ahead in MT - is not a big leap forward... Raptor Lake is increasing their ST performance and doubling the E-cores from 8 to 16 for MT performance

    • @Mundilfari_
      @Mundilfari_ 2 года назад +1

      @@MsMonster128 Intel is obviously still gonna be top dog when it comes to those peak performance numbers, but AMD is letting us hit 5+ GHz without creating enough heat to warm an entire room, and at a competitive price. Intel must do some major rethinking of their R&D/marketing philosophy for 13th gen or the market will keep shifting even more towards AMD, like it has for the last 5 years.

    • @awen4784
      @awen4784 2 года назад +3

      @@MsMonster128 Power efficiency

  • @tonyprice1293
    @tonyprice1293 2 года назад +7

    WANT ONE NOW! I can't wait till they release this for a laptop. This plus 64 GB of DDR5 and an RTX 3090 in a laptop would rock. By the way, all you guys and girls at LTT friggin' rock! Thanks for all the content and keep it going.

    • @iricalexis7508
      @iricalexis7508 2 года назад +2

      A 3090 in a laptop sounds like a waste. You'll never get the full experience unless it's a laptop with external water cooling.

    • @Bunce9908
      @Bunce9908 2 года назад +2

      Better yet, the 4000-series and 7000-series GPUs are coming out at the end of the year. Ready for a 4080 laptop that outperforms a 3090 desktop?

  • @GiulianoMazzina
    @GiulianoMazzina 2 года назад +16

    My younger inner self is saying "YES, time to build a new desktop" while the adult in me is saying that my 5800x & 6700XT system is just fine and to stop wasting money.

    • @YuProducciones
      @YuProducciones 2 года назад +1

      LOL so true Giuliano. S T A P!

    • @ferwiner2
      @ferwiner2 2 года назад

      5800x? That is like brand new. I would hope not to have to upgrade for a few years.

    • @GiulianoMazzina
      @GiulianoMazzina 2 года назад

      @@ferwiner2 I'm a junkie. I upgrade far FAR too often. And the CPU will be 2 years old pretty soon. Plus my kids get the hand me downs. Gamers all around.

    • @fairuldeng7426
      @fairuldeng7426 2 года назад

      Stop wasting money.

  • @declanmckeown323
    @declanmckeown323 2 года назад

    Shout out to Micro Center. I went to the Cambridge location for the first time yesterday. I live an hour away and have always wanted to go, but held off due to the distance. I can happily say it was well worth the drive for me, and I'll be back!

  • @BigMan7o0
    @BigMan7o0 2 года назад +5

    I got an X570 motherboard a few months ago, so now it seems like this might get me a good deal on an upgrade from my 3700X to something like a 5950X once the new chips launch

  • @timothycarr
    @timothycarr 2 года назад +11

    It is fascinating to me that the new low-end APUs seem to share the same architecture as the Steam Deck (Zen 2 / RDNA 2).

    • @williampaabreeves
      @williampaabreeves 2 года назад +4

      Makes sense for the amount of investment required to build the chip to use in laptops too, if they have the stock. Steam Decks selling like hot cakes of course

  • @lukeperryglover
    @lukeperryglover 2 года назад +8

    Love the intro :P
    Performance aside, the fact that they said they'll support AM5 for a similar length of time as AM4 (an article I saw, if it's to be believed, said AMD told them so - I never watched the announcement video, just articles) and that existing coolers work is awesome on its own. Also, they all have iGPUs now, instead of just a few models. So woo!
    (I'm still on a Ryzen 5 2600X.)

    • @anhiirr
      @anhiirr 2 года назад

      As long as I can potentially run a pure APU system without having to SHELL OUT big $$$ for a 6000 MHz RAM kit, or without the fastest RAM being a bare necessity for APU performance. I'm really looking forward to board manufacturers making "EXTREME" B650 boards so I can try to squeeze a good OC out of a well-priced DDR5 kit and have a multi-monitor-output APU build that could possibly play Witcher 3 at 1080p 60 fps, or lesser games at 4K - think a three-TV setup or a two-4K-TV/projector setup for a game room. Because they really shilled me into thinking I could do that with a three-output B550I.
      I REALLY want that kind of RAM OC/support viability with AM5. So when I buy my FIRST AM5 build with a legit gaming CPU, and then down the road - say in 4 years - buy the newest AM5 build, I'd ideally like to keep using that first AM5 board without having to sell it or turn it into a complete build to sell: just drop the best AM5 APU of the day into it and keep using its onboard graphics and video outputs. Given how AMD has operated, I HIGHLY doubt that will be possible. Going by their history with new chips, chipsets and APUs, if I wanted the best onboard graphics from, say, a future AMD "9000 series" APU, I doubt I'd get full support and features on a 600-series board, a B650 especially - unless we start seeing some EXTREME B650s and some CLAIMS of future higher DDR5 speed support or support for later chips. Given the inherent limits of chipsets and RAM support, it's hard to guarantee both, let alone still supporting the most modern APU on a B650.
      That's a little alarming to me. Or it's the SAME exact used-market sell-off upgrade path AMD set with each launch of Zen, Zen+, Zen 2 and Zen 3, where each chip supported faster RAM (and largely got better frames because of it) while AMD launched WORSE iGPUs on its AM4 APUs. The 5700G has now been proven to be BETTER with B-die memory at 4400 up to 5000 MHz, because it doesn't have the Infinity Fabric dividers and limitations Zen has shown. Essentially, what I'm trying to say is: IF AMD made it possible to use your first-gen AM5 board and DDR5 memory in a viable newer-gen pure APU build, you could repurpose that 1st/2nd-gen DDR5 RAM and B650 board while buying into the newest CPU/board/GPU market to keep your main build fresh.

  • @GodlikeIridium
    @GodlikeIridium 2 года назад +3

    "We're reaching the end of moores law"
    AMD: "Hell no! We go full turbo now!"
    Nice^^

  • @babablekshep8485
    @babablekshep8485 2 года назад +17

    Man, I was just waiting to buy a laptop before uni this year, but now I'm thinking I might wait for next year's Ryzen. But knowing how bad availability is in my region (I still can't buy the 2022 Zephyrus G14) I might buy this year. Someone pls help, I'm stuck in analysis paralysis.

    • @capnskiddies
      @capnskiddies 2 года назад +3

      Shit or get off the pot. Buy the best you can afford by the start of August. Keep it right through uni, buy yourself a graduation present when yer done.

    • @logipilot
      @logipilot 2 года назад +1

      For Word and Excel, any old dual-core ThinkPad is sufficient

    • @kuma8030
      @kuma8030 2 года назад +1

      @@logipilot Not true. That's for old Word and old Excel. I know people who need a Threadripper for Excel.

    • @logipilot
      @logipilot 2 года назад +1

      @@kuma8030 ...at uni?

    • @innocentiuslacrim2290
      @innocentiuslacrim2290 2 года назад

      You need a laptop for uni, so buy one. Your main considerations should be battery life, screen, keyboard, and possibly the webcam (depending on how things are at your school). You obviously want gaming capability too - get that if you can cover the priorities first.

  • @devoid-of-life
    @devoid-of-life 2 года назад +6

    AMD 7000 series + RTX 40 series is gonna be crazy

  • @mqaisataloss.5951
    @mqaisataloss.5951 2 года назад +7

    I miss incremental upgrades. For the first time since the early '90s, PCs are unaffordable again, but now you're forced to own one.

  • @IbanezGuitarz87
    @IbanezGuitarz87 2 года назад +13

    AMD is doing a fantastic job. Keep it up!!!

  • @Zerasad
    @Zerasad 2 года назад +8

    Hmm, not sure I agree with this take. 12% frequency uplift is already almost the 15% performance increase they mentioned. And if you add the node shrink, doubled L2 cache, rumored 60% TDP increase, DDR5, the IPC increase might be in the negatives.

    • @Vernas_R
      @Vernas_R 2 года назад

      "Rumored"

    • @MsMonster128
      @MsMonster128 2 года назад

      @@Vernas_R Take that out of the equation and everything else is still facts

    • @MsMonster128
      @MsMonster128 2 года назад

      Not in the negatives but around 2%-3%... Underwhelming to say the least

    • @Postman00
      @Postman00 2 года назад

      I'm not sure what you mean, considering IPC does not take TDP into its equation. Even if it did, I would want some power to be allocated to the graphics portion on the I/O die.
      The 12900K is between -1% and 6% faster than the 5950X in Blender BMW, depending on which reviewer's numbers you use. Assuming the BMW scene and the AMD demo render have similar performance characteristics, and taking the 12900K to be on average 3% faster than the 5950X, the 16-core Zen 4 will need a ~20.4% IPC uplift to account for the rest of its performance over the 5950X that isn't attributed to clock speeds.
      While 170 W is indeed a 62% increase over high-end AM4's 105 W, the 3900X through 5950X actually consume closer to 140 W. Although it would suck if the 7950X or equivalent actually consumed 267 W in reality.
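      The arithmetic in this thread can be sanity-checked with the multiplicative model perf = clock x IPC. A rough sketch follows; the 15% and 12% figures are the ones quoted above, the 3% Alder Lake lead is this commenter's assumption, and the ~31% Blender demo lead over the 12900K is as reported, not a confirmed spec:

        def implied_ipc_uplift(total_uplift: float, clock_uplift: float) -> float:
            # Solve (1 + ipc) * (1 + clock) = (1 + total) for the IPC term.
            return (1 + total_uplift) / (1 + clock_uplift) - 1

        # Single-thread: a 15% claimed gain on a ~12% clock bump leaves ~2.7% for IPC.
        print(f"ST IPC: {implied_ipc_uplift(0.15, 0.12):.1%}")

        # Multi-thread: ~31% over a 12900K that is taken to be ~3% ahead of the 5950X
        # gives ~35% over the 5950X, i.e. ~20% IPC once the ~12% clock gain is removed.
        total_vs_5950x = 1.03 * 1.31 - 1
        print(f"MT IPC: {implied_ipc_uplift(total_vs_5950x, 0.12):.1%}")

      That reproduces both the 2-3% single-thread estimate and the ~20% multi-thread estimate quoted in this thread.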

  • @meatballg8655
    @meatballg8655 2 года назад +6

    Of course, the downside to all this new tech is the magpie envy I get when I see something new and better than what I have.
    My bank account is telling me no, but my frames are telling me yes

  • @Fuzkin
    @Fuzkin 2 года назад +12

    As much as I love powerful tech, I do miss low power draw. With electricity bills skyrocketing in the last year, this will just make it worse. Of course PCs don't draw that much, but I still run mine every day, at least 7 hours.

    • @clark85
      @clark85 2 года назад +4

      dont buy intel garbage then

    • @WereCatStudio
      @WereCatStudio 2 года назад

      Just because something draws more power doesn't mean it will use more energy. If a CPU that takes 100 W finishes a job in 10 s, but another CPU that draws 200 W finishes the same job in 4 s, then the 200 W CPU uses less energy and is way more efficient.
      The problem of higher power usage typically comes from overclocking and pushing clocks on the same architecture high enough that the performance increase can't justify the power increase - like on the RTX 3000 series, for example. The jump in power from the RTX 3080 to the RTX 3090 Ti is tremendous while the performance increase is very small.
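      That point boils down to energy = power x time; a tiny sketch with the same hypothetical numbers from the comment above:

        def energy_joules(power_watts: float, seconds: float) -> float:
            # Energy consumed is average power multiplied by how long the job runs.
            return power_watts * seconds

        slow_cpu = energy_joules(100, 10)  # 100 W for 10 s -> 1000 J
        fast_cpu = energy_joules(200, 4)   # 200 W for 4 s  ->  800 J
        print(slow_cpu, fast_cpu)          # the higher-power CPU uses less energy overall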

    • @iricalexis7508
      @iricalexis7508 2 года назад +1

      That is why they are now starting to research hybrid CPUs with low- and high-power cores, like ARM systems have. I think that is the future of CPUs, until they can make x64 systems run on (or evolve toward) another architecture like ARM or another RISC design.

    • @logipilot
      @logipilot 2 года назад

      Steam deck with 15 watt tdp ftw!

    • @joshjlmgproductions3313
      @joshjlmgproductions3313 2 года назад

      @@iricalexis7508 I'm still waiting for big.LITTLE to prove itself. Intel chips usually draw more power with fewer cores than AMD chips do.

  • @skutterbuster666
    @skutterbuster666 2 года назад +1

    I just got a 5600G for £165 new.
    Upgrading from an i5-7600K... even with the newer series about to drop, I'm still excited to see what the difference is.
    I always go for an integrated graphics solution when possible.

  • @SVD_NL
    @SVD_NL 2 года назад +22

    The last time I was this excited about new PC hardware announcements was the first Ryzen announcement.
    AMD just won't stop being disruptive and I love it.

    • @laurencefraser
      @laurencefraser 2 года назад +2

      And then you start thinking about the current state of the logistical pipeline for making these things... and the resulting price...

  • @Granturion
    @Granturion 2 года назад +49

    Improved efficiency and more performance. Thanks.
    Intel can go home with their brute-force CPUs when energy prices are skyrocketing.

    • @TobiasCarlsson1
      @TobiasCarlsson1 2 года назад

      My new 5700X is so cool I'm freezing in here. Thank you 65W.

    • @1vend7
      @1vend7 2 года назад +2

      you talk like AMD hasn't already done this with Bulldozer haha, but yes it is good to see that they are at the top of efficiency nowadays

  • @BillyMcBillface
    @BillyMcBillface 2 года назад +8

    As a northern Californian, Linus calling Mendocino “Mendichino” is painful.

    • @TeddieBean
      @TeddieBean 2 года назад +2

      is it "sea-no"? I would have called it cino as I assumed with all the Italian generational codenames, that's how it's pronounced

    • @BillyMcBillface
      @BillyMcBillface 2 года назад +3

      @@TeddieBean Yes, it's an adjective form of a Spanish name, literally “of Mendoza”

    • @tedtolman9006
      @tedtolman9006 2 года назад +2

      Grew up in NorCal, came to the comments looking for friends. Also cringed.

    • @TeddieBean
      @TeddieBean 2 года назад +1

      @@tedtolman9006 if you don't know, you just don't know! 😂🤣 I would have pronounced it the same way 😅✨

    • @tedtolman9006
      @tedtolman9006 2 года назад +1

      @@TeddieBean No worries! I think many Californians forget that not everyone knows about the Spanish missionaries and how they pretty much named everything in California. And not everyone knows the difference between Italian and Spanish pronunciation rules. I still get confused in Italian between a single 'c' or a double 'cc'. I rely on remembering an old friend named Ceccini. Pronounced cheh-kini.

  • @MusicLife2030
    @MusicLife2030 2 года назад

    A huge upgrade for the PC industry. I was actually using a Japan-made Intel Core 2 Duo with a GT 210 GPU, and just got an upgrade last month to an Intel i5-11400 with a Gen 4 SSD and 3200 MHz memory when I decided to play games. The speed is outstanding and more reliable. Thanks to Intel's competitors - it shows anything is possible. Japan is reportedly testing a big bike that rides not on the road but in the air. I will not be surprised if we have flying big bikes in the next 3-5 years.

  • @Cram962
    @Cram962 2 года назад +90

    If Linus ever feels like giving out “outdated” components, I’m right here. No PC, in debt, and gaming on Tetris.

    • @saintjupi
      @saintjupi 2 года назад +11

      I feel that. I love seeing all the new tech while I use my Xbox 360 and 2015 MacBook Pro

    • @davidy22
      @davidy22 2 года назад +8

      Modern tetris is absolutely cutthroat, a single frame drop and you'll be feeling the hardware difference

    • @FaZekiller-qe3uf
      @FaZekiller-qe3uf 2 года назад +11

      @@saintjupi you can run Linux on xbox360, therefore it’s all you need /s

    • @raandomplayer8589
      @raandomplayer8589 2 года назад

      @@FaZekiller-qe3uf You can? I've seen the thumbnail with the PS2; didn't know the 360 can too.

    • @imo098765
      @imo098765 2 года назад

      @@saintjupi thats your problem you bought a mac

  • @syakirbinabdkarim8302
    @syakirbinabdkarim8302 2 года назад +21

    I think that Intel, AMD and Nvidia should focus on efficiency rather than speed and power once they've reached the limit of what air and AIO coolers can handle at a sensible wattage - not pocket-sized nuclear reactor levels.

    • @chupasaurus
      @chupasaurus 2 года назад +1

      AMD does just that with Zen and RDNA (my 3900X's whole power budget is the same as i9's at base clocks lol), GPUs are sky-rocketing in TDPs though.

    • @jakobe_bryantgaming5580
      @jakobe_bryantgaming5580 2 года назад

      @@chupasaurus I mean, even the 6800 XT doesn't draw a whole lot more than a 980 Ti. Until it's overclocked, but the point still stands.

    • @minespeed2009
      @minespeed2009 2 года назад

      Fun fact: CPUs have nearly surpassed the power density of nuclear reactors and rocket nozzles (in power per area)

    • @chupasaurus
      @chupasaurus 2 года назад

      @@jakobe_bryantgaming5580 Good point, but since Pascal, Team Green GPUs have had insane power-draw spikes, and both Ampere and the coming Lovelace generation are stepping up the TDP. E.g. my 3070 has a 300 W budget; the OC model just runs at that point when fully loaded, while non-OC ones can boost to it.

    • @chupasaurus
      @chupasaurus 2 года назад

      @@minespeed2009 And they'll never surpass fusion reactors.

  • @guillaumejoop6437
    @guillaumejoop6437 2 года назад +4

    Kudos to Lenovo for the analog WASD, because let's be honest, we don't need the whole board to be optical switches. Maybe this will open the door for other manufacturers to follow

  • @valdyprawhesmara3658
    @valdyprawhesmara3658 2 года назад

    Still running a Ryzen 5 1600 here... and it still runs very nicely on everything.
    Every 6 or 7 years I've always built a new one instead of upgrading (well, I occasionally add/upgrade something around year 3 or so, but on this build the only things I added were a second HDD and an acrylic side panel hahahha)...
    It was my best investment ever; AMD struck gold introducing Ryzen.
    Before Lisa Su announced Ryzen 7000, I was planning on buying a Ryzen 5 5600G and a B550 mobo to replace mine. But now, considering that AM5 is on the horizon, I might want to wait a while.

  • @SuperSkandale
    @SuperSkandale 2 года назад +4

    The faster the hardware gets, the less we seem to need it. In 2017 there was a real need to upgrade your hardware to run games like PUBG and even Fortnite. Now there is nothing interesting on the scene that requires the upcoming CPU and GPU generations.

  • @YOEL_44
    @YOEL_44 2 года назад +10

    AMD hasn't just improved performance but also looks. Have you seen how sexy that CPU looks sitting in the new LGA socket? Damn...

    • @suibora
      @suibora 2 года назад

      @@digital_feline 😂

    • @milescarter7803
      @milescarter7803 2 года назад

      Yah, just like a Skylake Xeon, so pretty

    • @monke2361
      @monke2361 2 года назад +2

      Cleaning thermal paste off that thing won't be fun. Who cares what it looks like? I only care about how it performs

    • @YOEL_44
      @YOEL_44 2 года назад

      @@monke2361 I care how it looks

    • @monke2361
      @monke2361 2 года назад +2

      @@YOEL_44 you can't even see it though, you put a cooler on it

  • @srosner0
    @srosner0 2 года назад +12

    I think a lot of people would buy this as their gaming CPU without needing an additional GPU, maybe considering adding a discrete GPU in the future once prices drop enough

  • @Chuck_vs._The_Comment_Section
    @Chuck_vs._The_Comment_Section 2 года назад +1

    All these great new developments, and yet SATA III (maximum 600 MB/s) is still the dominant interface for drives in our PCs.