Is M1 Max worth $400 extra? - MacBook Pro 14"

  • Published: 1 Jun 2024
  • Get 20% OFF + Free Shipping @manscaped at manscaped.com/TECH
    Get 10% off all Jackery products with code LinusTechTips at www.jackery.com?aff=27
    Apple Silicon has left a lasting impression for both its CPU and GPU power, but could a 14 inch laptop really do the M1 Max SoC justice, or is this MacBook too specialized for its own good?
    Buy MacBook Pro 14" 2021
    On Amazon: geni.us/udqd1h
    On Best Buy: geni.us/mJK2
    On B&H: geni.us/rjd0W
    Buy MacBook Pro 16" 2021
    On Amazon: geni.us/3J6B36S
    On Best Buy: geni.us/5chC
    On B&H: geni.us/qwPIu
    Buy ASUS ROG Zephyrus M16
    On Amazon: geni.us/XjtG6
    On Best Buy: geni.us/y6jFmh
    On Newegg: geni.us/mqZ13G
    Purchases made through some store links may provide some compensation to Linus Media Group.
    Discuss on the forum: linustechtips.com/topic/13944...
    ►GET MERCH: www.LTTStore.com/
    ►SUPPORT US ON FLOATPLANE: www.floatplane.com/
    ►LTX EXPO: www.ltxexpo.com/
    AFFILIATES & REFERRALS
    ---------------------------------------------------
    ►Affiliates, Sponsors & Referrals: lmg.gg/sponsors
    ►Our WAN Show & Podcast Gear: lmg.gg/podcastgear
    ►Private Internet Access VPN: lmg.gg/pialinus2
    ►Our Official Charging Partner Anker: lmg.gg/AnkerLTT
    ►Secretlabs Gaming Chairs: lmg.gg/SecretlabLTT
    ►MK Keyboards: lmg.gg/LyLtl
    ►Nerd or Die Stream Overlays: lmg.gg/avLlO
    ►Green Man Gaming lmg.gg/GMGLTT
    ►Amazon Prime: lmg.gg/8KV1v
    ►Audible Free Trial: lmg.gg/8242J
    ►Our Gear on Amazon: geni.us/OhmF
    FOLLOW US ELSEWHERE
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    Twitch: / linustech
    FOLLOW OUR OTHER CHANNELS
    ---------------------------------------------------
    Mac Address: lmg.gg/macaddress
    Techquickie: lmg.gg/techquickieyt
    TechLinked: lmg.gg/techlinkedyt
    ShortCircuit: lmg.gg/shortcircuityt
    LMG Clips: lmg.gg/lmgclipsyt
    Channel Super Fun: lmg.gg/channelsuperfunyt
    They're Just Movies: lmg.gg/TheyreJustMoviesYT
    MUSIC CREDIT
    ---------------------------------------------------
    Title: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro Screen Music Credit: Approaching Nirvana - Sugar High / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    1:05 M1 Max
    1:50 Specs
    2:50 Comparison
    3:29 Benchmarks
    5:16 Davinci Resolve
    5:51 Cinema4D Redshift
    7:07 Gaming Benchmarks
    8:50 CPU Discussion
    10:41 Thermals
    11:22 Battery Life
  • Science

Comments • 3.7K

  • @googleone5867
    @googleone5867 2 года назад +6586

    Tailoring the chip to video makers is essentially marketing genius, since it'll satisfy all the YouTube reviewers

    • @dominikhanus9320
      @dominikhanus9320 2 года назад +442

      They’ll satisfy every Mac user LOL, that was the point of every MacBook for years
      The difference is this year’s Macs are even better at being Macs than all of the previous Macs, and that is not a bad thing

    • @Jo21
      @Jo21 2 года назад +134

      @@dominikhanus9320 well they've been crap since 2016

    • @dominikhanus9320
      @dominikhanus9320 2 года назад +217

      @@Jo21
      Yes, but these are simply not.
      There is a bunch of people with priorities that this device is tailored for, and they are going to be incredibly happy with it.
      The same way there are people who are going to love Asus G15 and people who will tell you that for their priorities it’s complete trash.

    • @surrealbeats4487
      @surrealbeats4487 2 года назад +4

      Lol

    • @D71219ONE
      @D71219ONE 2 года назад +95

      @@dominikhanus9320 The problem is the price though. It’s just not a good value.

  • @Excalibaard
    @Excalibaard 2 года назад +2183

    Therapist: Gangthony isn't real, he can't hurt you.
    Gangthony: 'MAXED OUT MACS ARE MAD MACS WITH M1 MAX TO THE MAXXX'

    • @eddusii
      @eddusii 2 года назад +40

      It’s like 2000’s cringe all over again lol
      No offense I love LTT and I get it’s ironic

    • @t20594
      @t20594 2 года назад +26

      Thugthony

    • @randomrdp3356
      @randomrdp3356 2 года назад +8

      Max max super max max super super max max.

    • @anharun
      @anharun 2 года назад +8

      Gangthony > Punk Linus

    • @reikisano4542
      @reikisano4542 2 года назад +2

      @@randomrdp3356 Max Super Max Max Super Super Max Max Max

  • @K31TH3R
    @K31TH3R 2 года назад +263

    2:54 That's a bold choice to use a 90's brick phone as a pointing device instead of literally _anything_ else. What a delightfully weird repurposing of e-waste, I fully support it.

  • @parkeredwards9165
    @parkeredwards9165 2 года назад +590

    I wish you guys did more testing in audio production, I'd be curious to see how Logic Pro and Ableton Live run with lots of VSTs on the M1 Max

    • @User-ik2kc
      @User-ik2kc 2 года назад +60

      THIS!! They need to focus more on the audio production side

    • @Omalleypike
      @Omalleypike 2 года назад +24

      Alas, for sound is often forgotten but always essential

    • @clickbaitpro
      @clickbaitpro 2 года назад +22

      It's a no-go if you're serious about music production. Rosetta is literal trash to run VSTs on, and more than 70% of 3rd-party plugins are just not compatible on M1, which glitches out and has performance issues. The audio interface sometimes doesn't show up, and if you have to change your choice of VSTs based on the machine, it's not a good machine to begin with

    • @User-ik2kc
      @User-ik2kc 2 года назад +35

      @@clickbaitpro How come other videos I've watched people have little to no problem with their VSTs on the new mac?

    • @machinesworking
      @machinesworking 2 года назад +23

      @@clickbaitpro You're dead wrong about pretty much all of that. There are very, very few plugins that cannot run in Rosetta, and the roughly 10% hit is far less than the CPU advantage that the M1 provides. NI are lagging, but NI have traditionally lagged; they're pretty much the worst at adapting to anything new. That said, Kontakt runs in Rosetta etc.

  • @TheRealAtello
    @TheRealAtello 2 года назад +871

    Man, that single photo of Linus has gotten more mileage than all of the cars I've ever owned combined.

    • @DavidNgo86
      @DavidNgo86 2 года назад +1

      Lol

    • @maddrone7814
      @maddrone7814 2 года назад +52

      But not as much mileage than you mom lolololol get rekt kid

    • @mozly9080
      @mozly9080 2 года назад +39

      @@maddrone7814 Bro how old are you

    • @neurocidesakiwi
      @neurocidesakiwi 2 года назад +29

      @@maddrone7814 nice, keeping it old school.

    • @sinuslebastian6366
      @sinuslebastian6366 2 года назад +15

      mozly not as old as your mom lolololol get rekt kid

  • @k680B
    @k680B 2 года назад +2379

    i love that the 5950X testbench is so much faster that it just goes off the charts
    i know it's not a fair comparison, it just looks funny

    • @notme8232
      @notme8232 2 года назад +323

      Well, anyone doing serious work who can afford a $3000 MacBook can afford, and should use, a $3000 desktop, and can eat the added peripheral cost

    • @Parth_Soni
      @Parth_Soni 2 года назад +208

      It was just to remind fan boys it's still a laptop and not a monstrosity

    • @gerardopadilla2666
      @gerardopadilla2666 2 года назад +283

      Isn't it a fair comparison? I've seen a lot of guys "reviewing" or showcasing their M1 Max Macs as equal or better than a desktop! A desktop which they don't have or had a 5 year old one with middling specs! 🤣
      I've also gotten comments from people who bought the propaganda and tell me it has RTX 3080 (not mobile) performance! 🤣

    • @JonathanKingstonFear
      @JonathanKingstonFear 2 года назад +27

      I’m just sad that there was no Davinci on Linux AMD performance comparison in the graph. Genuinely interested in how it compares either way.

    • @iliasiosifidis4532
      @iliasiosifidis4532 2 года назад +127

      It is absolutely fair. If I'm going to spend $3000 on a computer, I need all the comparisons

  • @dugtrioramen
    @dugtrioramen 2 года назад +195

    I was wondering why the rtx had an arrow head on its bar graph, while the others were normal rectangles. Then I realized the rtx was so much higher than the others that it was being truncated so you could even compare the other bars 😂

    • @GeorgeGraves
      @GeorgeGraves 2 года назад +3

      ^this

    • @ryanw8664
      @ryanw8664 2 года назад +17

      Yes, because it makes good sense to compare one of the most expensive desktop configs you can buy to these LAPTOPS. There are not enough eyeballs available in the world to roll for this asinine comparison…

    • @HuyTran-sb2ql
      @HuyTran-sb2ql 2 года назад +4

      @@ryanw8664 whatever it takes for them to make the mac look like a piece of trash. Honestly what trashy review

    • @nexuhs.
      @nexuhs. 2 года назад +22

      @@HuyTran-sb2ql malding comment

    • @Emcfree2084
      @Emcfree2084 2 года назад +9

      I wouldn’t doubt yourself so quickly. Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented in this video, suggesting either serious issues with testing methodology or massive pro-Intel bias. An update is urgently needed from the Linus team to respond to those observations and recover lost credibility

  • @kenbash6213
    @kenbash6213 2 года назад +30

    The professional presentation and eloquent voice of my favorite Linus media group personalities makes this review very entertaining and informative!

  • @nyanray
    @nyanray 2 года назад +2165

    I dunno if/how you're coaching Anthony to host these things, but whatever you're doing, keep doing it. He just keeps getting better and better with every video, and he was good by a decent margin to begin with!

    • @Hunterboy2407
      @Hunterboy2407 2 года назад +61

      Bro Anthony is the best!!! I just wanna be best friends with him 😂

    • @LeFatalpotato
      @LeFatalpotato 2 года назад +117

      Plot twist: Anthony is coaching Linus so that he doesn't drop thousands of dollars' worth of electronics every show.

    • @casewhite5048
      @casewhite5048 2 года назад +28

      i think he is just smart enough to do it naturally

    • @MiguelRPD
      @MiguelRPD 2 года назад +15

      I was originally put off by him, but I now love the man and need him to make more videos. He's great!

    • @CyrilJap
      @CyrilJap 2 года назад +8

      Anthony is the best ❤️

  • @Bellenchia
    @Bellenchia 2 года назад +372

    You guys should really add model training to your benchmark. Find a basic TensorFlow notebook and see how M1 fares against a neural workload.

    • @touisbetterthanpi
      @touisbetterthanpi 2 года назад +13

      Please please please!

    • @DeeSnow97
      @DeeSnow97 2 года назад +28

      Especially compared to Nvidia's Tensor cores on 20 and 30 series cards, which are both crazy powerful and have great platform support

    • @OG_ALviK
      @OG_ALviK 2 года назад +37

      I clicked on the video expecting very good comparisons in different scenarios and such. All I got was the Dolphin emulator and some random bench data.
      How disappointing.

    • @jesusbarrera6916
      @jesusbarrera6916 2 года назад +10

      @@OG_ALviK check hardware unboxed, LTT is like fast food despite being the biggest tech channel out there

    • @Workmusic1988
      @Workmusic1988 2 года назад +8

      100%, no idea wtf the dolphin review was. Odd at best.

  • @andreasneukoetter157
    @andreasneukoetter157 2 года назад +37

    If you want to actually see the promised performance gains:
    Use it for software development.
    Build times went from 4 minutes (2019, 16" MBP, max spec) to

    • @andremessado7659
      @andremessado7659 2 года назад +2

      I used to respect the guy, but I'm not sure what to think about him or LTT at this point. If they don't address this I'm unsubscribing.
      - ruclips.net/video/g1EyoTu5AX4/видео.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
      - ruclips.net/video/OMgCsvcMIaQ/видео.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
      - ruclips.net/video/JM27aT9qhZc/видео.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
      - ruclips.net/video/YX9ttJ0coe4/видео.html 16" M1 Max MacBook Pro vs. My $6000 PC
      The list keeps going. These results have been out for a while too, so LTT really have no excuse. They didn't use optimized software, they didn't compare laptop performance while on battery, they didn't use readily available GPU benchmarking software that's already proven to place the M1 Max around the 3080 level. They need to explain.

  • @stolenromeo
    @stolenromeo 2 года назад +176

    One thing to note: you can’t configure the M1 Max chip with 16GB of memory, so if you don’t NEED 32GB, it’s actually a $600 difference to go from the base M1 Pro to the base M1 Max: $200 for the chip upgrade itself and $400 for the memory upgrade.

    • @stewardappiagyei6982
      @stewardappiagyei6982 2 года назад +7

      Nope, it is currently not possible to buy an M1 Max Macbook Pro 14 with 16GB unified memory. 32GB is the lowest option on Apple's site.

    • @thomaswesleymiller
      @thomaswesleymiller 2 года назад

      Yeah when I bought my M1 Pro 16 in November I wanted to spring for the max but like you said it quickly became a grand difference in price and I’m really happy with the Pro.

    • @HearMeLearn
      @HearMeLearn 2 года назад +27

      @@stewardappiagyei6982 ... that's what he said...

    • @QuakerAssassin
      @QuakerAssassin 2 года назад +3

      I would not buy a laptop with 16GB of memory. I know it's use-case dependent, but I'm constantly running up against 16GB on both laptop and desktop. Granted, it's usually when doing 3D workflows or container development, but it does feel like even more casual use and gaming workloads are going to be pushing up on 16GB soon enough that the cost of the upgrade is worth it to keep the computer relevant longer. Like, I'm using 10.6GB of memory right now just to have like 20 tabs of Firefox/Chrome and Spotify open.

    • @thomaswesleymiller
      @thomaswesleymiller 2 года назад

      @@QuakerAssassin yeah it’s def application specific. I bought my Mac for just working with Lightroom and Photoshop and to work with raw files and it’s amazing for that. But I really don’t render anything or game so it works great for me as far as productivity goes.

  • @toshineon
    @toshineon 2 года назад +1641

    I'm a little puzzled about Apple's claims. On the CPU side, it makes sense. The M1 is using a much more efficient architecture compared to the x86 that AMD and Intel are using. But that's not how GPUs work, right?

    • @hilal_younus
      @hilal_younus 2 года назад +258

      I’m pretty sure we can’t tell that much, since most apps and engines are not even optimised for the ARM architecture, much less this SoC…
      Until now, it was mostly Rosetta 2 doing most of the work, so it doesn’t really mean much against other PCs

    • @theondono
      @theondono 2 года назад +187

      It’s also using a more efficient architecture for the GPU.
      Normally you’ll have the CPU feeding data to the GPU, and the GPU storing it in its own memory. This is why high-end GPUs have higher bandwidths, because this is a limiting factor.
      These new chips don’t need to do this: the memory is unified, and the CPU and GPU can share memory directly. This obviously requires massive changes in the application.
      Right now, what we can see is that the M1 Macs have very limited graphics performance because Rosetta can’t use this trick; it emulates the previous architecture by copying data from CPU memory to GPU memory (in this case they’re the same). This essentially halves the throughput, and that’s why performance is so poor.

    • @Badjujubee
      @Badjujubee 2 года назад +29

      It's kind of the stepped up version of Smart Access Memory. The M1 series is the first modern implementation of a consumer unified direct access memory architecture. Just like how the most effective/efficient Mining GPU's are really the best memory bus implementations, these unified CPU/GPU/DRAM chips are going to start eating the modular systems lunch as long as they can get a large enough memory pool.

    • @wnxdafriz
      @wnxdafriz 2 года назад +152

      @@ezicarus8216 shhhh, they might realize that the imaginary system reserved memory for the igpu is actually a thing
      also.... x86 consoles go as far back as the original xbox... which yea it was shared memory for the gpu/cpu

    • @iliasiosifidis4532
      @iliasiosifidis4532 2 года назад +9

      GPUs have way more to them than raw performance. For example, Nvidia's OptiX is way smarter, and therefore better for rendering, than CUDA on the same card. Or think of RT cores in games: for ray tracing they're incomparable to raw shader performance, even though they consume less die space and power

  • @nocturn9x
    @nocturn9x 2 года назад +450

    This "the answer may surprise you. It sure surprised me" thing is starting to be a distinctive mark of Anthony's videos and I like it. Love the energy!

    • @waitwhat1144
      @waitwhat1144 2 года назад

      To the max. Haha ditto!

    • @tradingbits7345
      @tradingbits7345 2 года назад

      same here, I enjoy Anthony's vids

    • @cleanlens
      @cleanlens 2 года назад

      ok

    • @dmt1994
      @dmt1994 2 года назад

      I thought he was going to say Apple was true to their word and that their marketing accurately reflected their products; that would have been shocking.

    • @nocturn9x
      @nocturn9x 2 года назад

      @@dmt1994 Yeah, same

  • @zizimugen4470
    @zizimugen4470 2 года назад +7

    Anthony, your screen-presence has improved so much from your debut. You’ve clearly gotten much more comfortable in front of the camera, and you provide a wonderfully logical insight (pun intended) into the things you present. I know you’re going by a script, but surely you contribute, and you make it yours.

  • @andremessado7659
    @andremessado7659 2 года назад +18

    It would have been nice to see the 2020 Intel MacBook Pros included in these graphs.

  • @jonathanschmidt7874
    @jonathanschmidt7874 2 года назад +533

    These recent reviews with Anthony hosting are so damn high quality that I can't wait for some of the LTT Lab content to drop in the next year. It's gonna be absolutely sick.

    • @BR-pz7lx
      @BR-pz7lx 2 года назад +8

      Read this before video started. Then the first 3 seconds hit me…

    • @JaydevRaol
      @JaydevRaol 2 года назад

      Yeah

    • @vijeykumar7429
      @vijeykumar7429 2 года назад +2

      LTT labs content won't be in the form of videos, they'll be mostly print media, articles and posts on their website. He said so himself.

    • @jonathanschmidt7874
      @jonathanschmidt7874 2 года назад +2

      @@vijeykumar7429 no. He said most of it. Can’t imagine them spending so much money without making use of the information in videos.

    • @dbased1915
      @dbased1915 2 года назад +2

      This guy's voice alone is 10000x better than Linus's IMO.

  • @hovant6666
    @hovant6666 2 года назад +348

    I'm really tired of mobile parts being called the same name (eg: 3080) as their exponentially more powerful discrete counterparts. They're fundamentally different parts I feel

    • @djsnowpdx
      @djsnowpdx 2 года назад +80

      I mean, they’re up to twice as powerful on desktop, but that’s plenty to mislead consumers. AMD and Apple aren’t doing that, though. Just Nvidia.
      I take issue with your use of the word “discrete” here - the 3080 laptop GPU is still discrete graphics because it’s not on-die with the CPU. Still, I take your point, and I second it.

    • @jesusbarrera6916
      @jesusbarrera6916 2 года назад +9

      Technically the 3060 is different, has more cuda cores than the desktop variant and that's why they are actually comparable

    • @gustavrsh
      @gustavrsh 2 года назад +24

      That's their intention

    • @hovant6666
      @hovant6666 2 года назад

      @@djsnowpdx That's a fair distinction, is there a category to describe desktop + workstation + server GPUs? The only thing I can think of is 'PCIe GPUs', vs mobile GPUs and iGPUs. There's also the distinction between the specially-made rackmount-only versions, like the A100, which although use PCIe, are not PCIe-socketable, which futher muddies things

    • @hovant6666
      @hovant6666 2 года назад +1

      @@gustavrsh Probably right, might just be an upselling tactic

  • @branpod
    @branpod 2 года назад +11

    Blender 3.1's metal support is very nice. I still don't think it beats out some of the higher end RTX cards, but it still performs very well, even in the alpha stages

  • @corygyarmathy8867
    @corygyarmathy8867 2 года назад +1

    Really loved the style of this video - it felt like there was a conscious effort to the presentation and it flowed really well.

  • @BrianJones-wk8cx
    @BrianJones-wk8cx 2 года назад +33

    Great perspective, appreciate the continued, in-depth coverage on these. I also appreciate what feels like an objective, enthusiastic investigation of the tech, neither a takedown nor blind exaltation, thank you so much for your work!

  • @TheWeakLink101
    @TheWeakLink101 2 года назад +118

    Can we talk about the B-roll camera shots? Seems like they’re trying some new techniques here and I love it!

    • @yeti4269
      @yeti4269 2 года назад +2

      They're so clean!

    • @thatguyalex2835
      @thatguyalex2835 2 года назад +2

      @@yeti4269 Unlike Linus' humor. Lol ... :)
      All jokes aside, LTT quality has up ticked in the last few months. :) I like this new style a lot.

  • @ScrungleGaming
    @ScrungleGaming 2 года назад +3

    After rewatching this review, I went ahead and bought the base model of the 14-inch M1 Pro. I will be doing more CPU- than GPU-heavy work, and I didn't think the 2 extra cores were worth the money

  • @sigelsegal2297
    @sigelsegal2297 2 года назад +49

    I’m a video editor; I have used Mac and PC for a long time. Recently built a nice PC and I game too much on it lol, so now I’m thinking of getting the M1 Max for portability. Glad to hear it’s a beast at what I need it for. This is definitely not for everyone

    • @almuel
      @almuel 2 года назад +12

      It definitely is a beast, especially when the software has native support for Apple silicon. If you game, unfortunately there isn’t any game that natively supports it yet; if there were, then you’d get close to a 3080’s performance at far greater efficiency. The biggest advantage of these chips is the performance you get on the go versus any other laptop. The MacBooks just smoke them there, and if you travel a lot, getting a MacBook over the others is going to be a no-brainer. Just remember that you’d have to sacrifice playing some AAA game titles, though if Apple themselves release some AAA games for the Mac, I’m sure more game devs would see the potential in the Mac and port titles to them. That possibility definitely exists but it’s going to be a gamble.

    • @skadi7654
      @skadi7654 2 года назад +3

      @@almuel you should watch the video.......... it is Rtx 2060 rather than 3080

    • @Emcfree2084
      @Emcfree2084 2 года назад +4

      Interestingly, Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented, suggesting either serious issues with testing methodology or massive pro-Intel bias. Either way, an update is urgently needed from the Linus team to respond to those observations and recover lost credibility

    • @Emcfree2084
      @Emcfree2084 2 года назад +4

      @@skadi7654 no, they misrepresented data for whatever reason. Others have proven the reality, but although IMO LTT were raising an important and valid concern about these laptops, they did it in a very sketchy and either underhanded or unprofessional way. See the Max Tech response for more details.

    • @ViPulaNz
      @ViPulaNz 2 года назад

      Same situation: 3080 gaming desktop, but wanted an M1 for portability. Are you getting the M1 Max or the M1 Pro?

  • @stormburn1
    @stormburn1 2 года назад +431

    Something I think was missing was battery life under load. A key part of Apple’s claims was sustained performance even on battery and at much greater efficiency. So I’m curious how the gaming comparisons would look if you capped framerates to 60 or 30 across machines and compared battery life then. You showed Apple exaggerated how close they were in raw performance, and now I want to know how much they exaggerated on efficiency.
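    A rough sketch of how that kind of comparison could be logged, assuming the cross-platform psutil package is acceptable (capping the framerate itself would happen in the game or driver, not here); the one-minute interval and one-hour duration are arbitrary choices, not anything from the video:

      # Battery-drain logger to run alongside a frame-capped workload (assumes psutil).
      import time
      import psutil

      samples = []
      for _ in range(60):                     # one reading per minute for an hour
          batt = psutil.sensors_battery()
          if batt is None:
              raise RuntimeError("no battery information available on this machine")
          samples.append(batt.percent)
          print(f"{time.strftime('%H:%M:%S')}  {batt.percent:5.1f}%  plugged={batt.power_plugged}")
          time.sleep(60)

      print(f"battery dropped {samples[0] - samples[-1]:.1f} points over {len(samples) - 1} minutes")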

    • @andreasbuder4417
      @andreasbuder4417 2 года назад +69

      Well, to actually make a comparison, the PC Laptops would need to deliver full performance on battery, which they can't.

    • @Natsukashii1111
      @Natsukashii1111 2 года назад +64

      @@andreasbuder4417 It would need to be a laptop that costs as much as the Macs. The Zephyrus was half the cost of the Macs here

    • @GeckoClever
      @GeckoClever 2 года назад +23

      If you have such a workhorse, why use it on battery, where it would die in less than 4 hours even IF it started at 100%? Seems like an extremely unrealistic scenario.

    • @kjeldkaj
      @kjeldkaj 2 года назад +1

      @@Natsukashii1111 It is the same price? At least in Denmark

    • @kjeldkaj
      @kjeldkaj 2 года назад +10

      Wait, my mistake, it is the same price as the low-end M1 Pro 14-inch (2,600 dollars), therefore cheaper than the M1 Max 14-inch (5,000 dollars)

  • @anno203
    @anno203 2 года назад +132

    I’d be interested in the power consumption comparison during these tests

    • @jackburrows5850
      @jackburrows5850 2 года назад +33

      What nvidia doesn’t want you to hear.

    • @cristhiantv
      @cristhiantv 2 года назад +21

      I don't think you will ever see this on this channel... the other machines would look like crap

    • @HuyTran-sb2ql
      @HuyTran-sb2ql 2 года назад +15

      @@cristhiantv they’ll probably say some delusional things like ”pro users always have their laptop plugged in anyway so power consumption isn’t an issue”.

    • @evolicious
      @evolicious 2 года назад +5

      Power consumption affects nothing in your life and costs next to nothing extra, unless you live in a shithole without reliable power.

    • @cristhiantv
      @cristhiantv 2 года назад +6

      @@evolicious You don’t know his life, do you? Also, power consumption is important if you’re rendering videos on the go… But you’re probably gonna reply with something telling us how stupid we are, judging by your earlier comments… so don’t mind answering, have a good day

  • @dgallitelli
    @dgallitelli 2 года назад +125

    I would love one day to see Deep Learning Benchmarks as well ... as a DL practitioner, looking forward to the comparison for both CPU and GPU workloads.

    • @calsimeth1588
      @calsimeth1588 2 года назад +1

      I know! The code to get a simple run of MNIST going is just a couple blocks of copy paste.
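      A minimal sketch of the kind of MNIST run being described, assuming TensorFlow is installed (on Apple silicon that would typically be the tensorflow-macos and tensorflow-metal packages); the model shape and epoch count are arbitrary illustrations, not anything benchmarked in the video:

        # Tiny MNIST training timer (assumes TensorFlow is installed).
        import time
        import tensorflow as tf

        (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
        x_train = x_train.astype("float32") / 255.0  # scale pixels to [0, 1]

        model = tf.keras.Sequential([
            tf.keras.Input(shape=(28, 28)),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])

        start = time.perf_counter()
        model.fit(x_train, y_train, epochs=3, batch_size=128, verbose=1)
        print(f"3 epochs took {time.perf_counter() - start:.1f} s")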

    • @MrGeometres
      @MrGeometres 2 года назад

      Get a workstation grade laptop. (Dell Precision / Thinkpad P-series)

    • @MrGeometres
      @MrGeometres 2 года назад +1

      @Christos Kokaliaris You can get these notebooks with 500nits, 4k 120Hz displays if you are willing to spend the cash. Personally I use external monitors.

    • @cmfrtblynmb02
      @cmfrtblynmb02 2 года назад +7

      @@MrGeometres if you run stuff on cloud, nothing beats a 900 dollar Macbook Air. You get a wonderful display, great touchpad, nice keyboard. At some point you have to run stuff on cloud if you are doing serious business. It does not makes sense to put thousands dollars to workstations that don't run most of the time and don't scale at all.

  • @shwarma_ketchup.123
    @shwarma_ketchup.123 2 года назад +1

    The actual video to the sponsor sequence transitions are always smooth af ngl

  • @robemanchester2277
    @robemanchester2277 2 года назад +196

    It's no wonder all of the reviews were so glowing when these laptops came out: almost all of them focus exclusively on video editing and the Adobe suite. "Benchmarking" is oftentimes just video render times, and it's frustrating because, as you can clearly see, that doesn't paint a good picture overall. The Zephyrus is what, at least $1k less? And it performs largely the same, at the cost of battery life? I guess efficiency is a good thing, but these laptops are good for only really very specific purposes, and I question whether they entirely deserved the ubiquitous glowing reviews when they dropped.

    • @aritradey8334
      @aritradey8334 2 года назад +45

      If you also consider programming, the M1 Pro and Max also outshine the competition. Android and Java projects build significantly faster than on even top-end machines running Linux. Python and TensorFlow builds are also faster, although there, for some reason, the M1 Pro trains and builds ML models faster than the M1 Max. So in the departments of media creation and programming these laptops are truly top of the class.

    • @Lodinn
      @Lodinn 2 года назад +18

      Apple's gig has never been good value. I would actually consider buying it for the hardware if not for the OS lock-in. $1k for weight/battery life/build quality? Sure, why not.

    • @robemanchester2277
      @robemanchester2277 2 года назад +1

      @@Lodinn This is why, despite its many downsides, I still kind of like the 2019 16-inch MacBook with the updated keyboard and i7. Boot Camp gives it longevity, and since it runs x86, it runs all modern-day apps. Obviously the efficiency isn't nearly there, but all the other MacBook perks are, which makes it a rather nice machine. Outclassed for sure by these last few years of laptops by orders of magnitude, but hey, until Razer or Microsoft can get the build quality down as well as Apple has, it's an attractive option.

    • @robemanchester2277
      @robemanchester2277 2 года назад +8

      @@aritradey8334 That's fair! I haven't seen too many benchmarks, I guess, from the programming world, which I feel is telling when it comes to the reviewer landscape. With that being said, I remember some of the Hardware Unboxed review, and now this one, and they are such a stark contrast to the uniform praise these received upon launch. Great machines for sure, especially for those who work in the areas they excel at. I guess I'm just rather exhausted by all of the review outlets only reviewing things for videography, simply because that's what they do. Their reviews shouldn't be a general "review" and should be more of a "videographer review", so that those who don't watch/read 18 reviews, like a lot of us here who do this for fun, don't get the wrong ideas.

    • @bencze465
      @bencze465 2 года назад +6

      I did wonder, and it reminded me of how Volkswagen optimized their software for specific use cases. I considered the M1 briefly for a Linux laptop but then quickly reconsidered - if for nothing else then the keyboard - and went for a ThinkPad P series. I don't think these Macs are good as general-purpose computers. They are fine for the same tasks a Chromebook is also good for, or for the special video editing stuff. Seems quite niche; lucky them they can sell it with marketing.

  • @Mediumpimpin69
    @Mediumpimpin69 2 года назад +61

    Nobody could have predicted that their claims of integrated graphics beating a 3080 would be a load of crap.

    • @RoachDogggJR
      @RoachDogggJR 2 года назад +5

      My non-overclocked 1070 with an i5, on ultra settings, gets better frames in Crysis than the M1 does at low settings lmao

    • @Stikkzz
      @Stikkzz 2 года назад +2

      @Slayer Developer Sure they did, in a ShortCircuit video

    • @battlebuddy4517
      @battlebuddy4517 2 года назад +3

      @Slayer Developer Even if it was, it still won't change much

    • @RoachDogggJR
      @RoachDogggJR 2 года назад +5

      @Slayer Developer Let's be real too, no one's gamed on an Apple computer since '99

    • @shrayesraman5192
      @shrayesraman5192 2 года назад +5

      @@RoachDogggJR What an idiotic claim. Let's try running a Mac-specific task like FCP or Logic on that i5 and then compare...

  • @Hermiel
    @Hermiel 2 года назад +25

    M1 can throw a lot of weight around as a DAW host, especially running Logic and AS-native plugins. It's reportedly less well suited to realtime audio tasks (like recording live guitars through software amp sims at low latency in a busy session), but it absolutely pummels at mixing and composing tasks that don't require super-low RTL figures under load. The 32GB Max variant will benefit a serious composer who wants all of the orchestral libraries and soft synths loaded at once, although all that GPU will be drastically underutilized in the same scenario.

  • @bc3316
    @bc3316 2 года назад +1

    Very nice review; concise and informative; cheers

  • @NightForce
    @NightForce 2 года назад +102

    I really like the confidence Anthony has grown over time standing in front of the camera :)

  • @SirSicCrusader
    @SirSicCrusader 2 года назад +10

    So what Macs did you get?
    "M1 max"
    Yeah I know you got M1 Macs, but what model
    "M1 Max..."
    **flips desk**

  • @HOLDMAHHHHDIK
    @HOLDMAHHHHDIK 2 года назад +1

    The progression of Anthony and how much better/more confident he has become on camera should be an inspiration for everyone to practice confidence in social settings (which is even harder on camera, when you're staring into a lens instead of talking to people)

  • @darklighter4475
    @darklighter4475 2 года назад +1

    One thing not mentioned when doing the benchmarks: how do all the laptops (MacBooks and Zephyrus) perform while only on battery? Yes, battery runtime is great, but how is the horsepower of the CPU/GPU affected when running apps on battery? I think some surprises might arise.

  • @andrewlalis
    @andrewlalis 2 года назад +40

    Apple's deprecation of OpenGL support is nasty.

    • @markjacobs1086
      @markjacobs1086 2 года назад +5

      They pretty much had to for this M1 chip anyway. Can't really run widely compatible API's if you're going to do specialised hardware & also claim it slays a top of the line dGPU while using less than half the power. They just don't tell you that the software to actually get the claimed performance isn't widely available (yet).

    • @DeeSnow97
      @DeeSnow97 2 года назад +15

      @@markjacobs1086 Just wait until the community implements OpenGL using Metal, similar to MoltenVK. It's not really "specialized hardware", it's just a graphics API, that's how every GPU works. That's why OpenGL support is still ubiquitous on non-Apple GPUs, even though they're architecturally much more geared towards Dx12 and Vulkan, which are very similar to Metal (in fact, Metal itself is barely anything more than a deliberately incompatible clone of Vulkan because Apple is still Apple).
      The M1 CPU may be awesome at clearing up decades-long inefficiencies of the x86 architecture, but the GPU world has long progressed way beyond that. Apple has no such advantage there. The only reason they are even remotely competitive in a performance per watt benchmark is TSMC's 5nm node, to which they currently have exclusive access, but from an architectural standpoint they have a lot of catching up to do with both AMD and Nvidia.

    • @djsnowpdx
      @djsnowpdx 2 года назад +3

      @@DeeSnow97 well, Apple couldn’t “just wait.” They had a product they were ready to sell.

    • @DeeSnow97
      @DeeSnow97 2 года назад +8

      @@djsnowpdx lol, what a horrible take, Apple could have just kept on supporting OpenGL and not sold an incomplete product

    • @markjacobs1086
      @markjacobs1086 2 года назад +4

      @@DeeSnow97 The M1 just sucks for "community anything though" since Apple doesn't really do much of anything to have "the community" fix up their slack. Most of the time they specifically go down the path where they like "the community" to be able to do absolutely nothing. Like doing basic servicing of a device...

  • @Watchandlearn91
    @Watchandlearn91 2 года назад +16

    I love the way that Mario Sunshine is used as a benchmark here lmao

    • @notme8232
      @notme8232 2 года назад +9

      Only thing that Macs can run

  • @gabrielfair724
    @gabrielfair724 2 года назад +2

    Things I still want to see covered:
    1) How much can the USB-C take? 8 hubs fully loaded with all the native monitors going, plus X extra monitors using DisplayLink, while running a USB-connected NAS and a 10Gb Ethernet dongle
    2) eGPU support? If not, what happens if you try it? What if you try to force the Nvidia or AMD drivers with Rosetta?
    3) Wipe one of the systems and use it as a daily driver for a week, but this time refusing to install Rosetta. How do the performance numbers change without the emulator running or even installed?

  • @tsong1492
    @tsong1492 2 года назад +1

    Absolutely MAD MAX intro! And that was the smoothest and most hilarious segue ever to a sponsor! Well done haha

  • @dexterman6361
    @dexterman6361 2 года назад +299

    would've been interesting to also include a G15, seems like a fair competitor (3070 and a pretty good ryzen chip, and about 7 hours of battery life)

    • @theNimboo
      @theNimboo 2 года назад +46

      I mean this comparison is fair because the Zephyrus is cheaper. So it actually isn't fair to the Zephyrus if anything.

    • @thatguyalex2835
      @thatguyalex2835 2 года назад +2

      Would be interesting if they used TeraFLOPS as a unit of measurement to determine estimated GPU performance. :) Now it's not the best unit to use, but the FLOP can show 32-bit precision calculations per second.
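      For reference, the usual back-of-the-envelope FP32 figure is shader cores × 2 (one fused multiply-add per cycle) × clock; a minimal sketch of that arithmetic is below, and the core counts and clocks in it are purely illustrative placeholders, not the specs of any machine in the video:

        # Theoretical FP32 throughput in TFLOPS (illustrative figures only).
        def fp32_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
            return shader_cores * 2 * boost_clock_ghz / 1_000  # GFLOPS -> TFLOPS

        print(fp32_tflops(4096, 1.3))   # e.g. 4096 cores at 1.3 GHz ≈ 10.6 TFLOPS
        print(fp32_tflops(6144, 1.7))   # e.g. 6144 cores at 1.7 GHz ≈ 20.9 TFLOPS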

    • @ZerograviTea.
      @ZerograviTea. 2 года назад +35

      Not only is it not the best, teraflops is quite possibly the worst measurement to use, since performance per FLOP can differ so much between generations and architectures.
      The only thing it's good for is marketing number shows (also relative estimated performance within one GPU family of the same generation, but that's beside the point).

    • @ACE112ACE112
      @ACE112ACE112 2 года назад +3

      Lenovo Legion would be better because it has a MUX switch

    • @thatguyalex2835
      @thatguyalex2835 2 года назад

      @@ZerograviTea. Wow. I didn't know it was the worst. So, what is the best unit of measurement for GPU performance? GPU bandwidth (GB/s), throughput, or something totally different?

  • @asamird
    @asamird 2 года назад +9

    Blender 3.0 now runs natively on M1, so that could be a nice comparison.

  • @PrinceWesterburg
    @PrinceWesterburg 2 года назад +2

    I just auditioned for an animation job. I was put on a last-gen Intel iMac, fired up Blender, and put a normal map on one surface in the scene, and the GPU almost caught fire while the whole macOS GUI dropped to 0.5 fps. I'm not sh1tting you!!!

  • @astronemir
    @astronemir 2 года назад +5

    Thanks Anthony. Now I have to decide whether the better memory is worth it, or whether it’s not good enough and I’ll have to push code off to the compute cluster anyway.
    In the future, could you guys run some memory-intensive data analysis code, like something that inverts large matrices in memory in Python? That’s a good reason to get these if it lets me avoid the hassle of pushing code to a network machine, so I can run it on my laptop and play with it.
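    A minimal sketch of the kind of memory-hungry test being asked for, assuming NumPy is available; the matrix size is an arbitrary illustration (one 10,000 x 10,000 float64 matrix alone is about 0.8 GB):

      # Dense matrix inversion timer (assumes NumPy; size is illustrative only).
      import time
      import numpy as np

      n = 10_000                              # n*n*8 bytes ≈ 0.8 GB per float64 matrix
      a = np.random.default_rng(0).standard_normal((n, n))

      start = time.perf_counter()
      inv = np.linalg.inv(a)                  # heavy on both memory bandwidth and FLOPs
      elapsed = time.perf_counter() - start

      # small sanity check so the result is actually used
      print("max residual:", np.abs((a @ inv)[:5, :5] - np.eye(5)).max())
      print(f"inverted a {n}x{n} matrix in {elapsed:.1f} s")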

  • @RogueAfterlife
    @RogueAfterlife 2 года назад +4

    Loved the reference, in the intro, to the open air unboxings Linus made many years ago :))

  • @butters_147
    @butters_147 2 года назад +1

    Watching Anthony go from absolutely HATING being on camera to being so much more comfortable that he cracks me the eff up with an intro like that! Bravo Anthony! 👏 👏 👏 I almost spit out my coffee lol'ing at that. Great work.

  • @TelcoGeek
    @TelcoGeek 2 года назад +2

    I’d be interested to know how it performs in doing other graphics workloads, like QGIS with lots of data displayed.

  • @pauljohnston568
    @pauljohnston568 2 года назад +76

    Unfortunately, the answer on HDMI 2.1 adapters is currently no, for software reasons. I think if you guys make a video on it that could get Apple’s attention to finish it

    • @antibullshit2732
      @antibullshit2732 2 года назад +7

      sure, cause they're apple's favorite reviewers.

    • @666Tomato666
      @666Tomato666 2 года назад +5

      yes, because Apple is well known to take into consideration what people outside Apple are saying /s

  • @Fearinator
    @Fearinator 2 года назад +12

    Anthony, you are nailing reviews recently! Your voice acting/narration is SO professional :) great stuff mate

  • @sushi_donut
    @sushi_donut 2 года назад +217

    Anthony was the best decision LTT has done recently. Congrats to the both of you!

    • @nnnnnn3647
      @nnnnnn3647 2 года назад +4

      this test is pure crap. They should be sued by Apple for misinformation and lies.

    • @damara2268
      @damara2268 2 года назад +3

      @@nnnnnn3647 wut? They can't show results which they measured?
      Or are you gonna say they should have only used software that works better on macs?

    • @nnnnnn3647
      @nnnnnn3647 2 года назад +1

      @@damara2268 They should use software that people really use.

    • @bobanders6672
      @bobanders6672 2 года назад

      @@nnnnnn3647 ok

    • @RoshiGaming
      @RoshiGaming 2 года назад +3

      @@nnnnnn3647 cope and seethe

  • @xalvarez07
    @xalvarez07 2 года назад +17

    I'm interested in whether the laptops were connected to power. Also interested in what the battery percentages would be at the end of the test with all laptops disconnected from power, and how hard the fans blew.

    • @PeterKoperdan
      @PeterKoperdan 2 года назад

      I think it's pretty clear that Macs run much better on battery power than most PCs. At least until the latest Intel and AMD chips are properly put to the test.

    • @petereriksson6760
      @petereriksson6760 2 года назад +5

      That is probably how Apple got theirs to look so good in their comparisons... they unplugged the PCs...

    • @xalvarez07
      @xalvarez07 2 года назад

      @@petereriksson6760 lol u right. So I guess Apple actually makes laptops... while PCs are designed to be lost in house fires.. got it.

    • @angeloangibeau5814
      @angeloangibeau5814 2 года назад

      @@petereriksson6760 that's exactly what they've done, and there's nothing wrong with that because laptops are meant to be used unplugged!

    • @theyoutuber273
      @theyoutuber273 2 года назад +1

      @@angeloangibeau5814 I disagree heavily. The point of laptops is portability, but that doesn't mean I will use them unplugged.
      Battery life is good but not as important as Apple makes it out to be. It's not as important as it is on phones.
      When I am using my laptop for more than an hour, it's usually on a desk, and almost all places I visit with a desk have an outlet.

  • @RomanPapush
    @RomanPapush 2 года назад +82

    In laptop comparisons I believe having separate benchmarks for plugged AND unplugged scenarios would shine more light on Apple claims.

    • @Prithvidiamond
      @Prithvidiamond 2 года назад +8

      This, the mac slaughters every laptop on battery lol!

    • @Natsukashii1111
      @Natsukashii1111 2 года назад +7

      @@Prithvidiamond Every laptop, you say? You do know that there are laptops with a desktop CPU and a desktop GPU? I mean, they are absolutely huge and barely transportable, but they are still laptops, and they will be 2 to 3 times more powerful than M1 Macs for the same price.
      It's not a fair comparison, but you might want to lower your expectations on Apple's claims.

    • @neolordie
      @neolordie 2 года назад +14

      @@Natsukashii1111 "on battery"

    • @vlada909
      @vlada909 2 года назад +9

      yeah but why would you do something resource intensive on battery....

    • @OG_ALviK
      @OG_ALviK 2 года назад +5

      @@Natsukashii1111 A laptop and a portable computer aren't the same thing.
      A MacBook is a laptop. The Clevos you are talking about are "portable" computers with which you can do everything as long as you have a desk and a power socket. Without those two, it's a bigass brick good for nothing.

  • @niekversteege
    @niekversteege 2 года назад +99

    Great review, but I'm curious about the differences between Pro and Max in development benchmarks, i.e. code compilation. This is generally a very large use case for these MacBooks.

    • @defeqel6537
      @defeqel6537 2 года назад +11

      They use the same CPU, so while the extra bandwidth (and cache?) may make a difference, it's unlikely to be a huge one.

    • @Masterrunescapeer
      @Masterrunescapeer 2 года назад +5

      Depends on what you're compiling, if your stuff can compile on mac and is not using GPU acceleration, then the difference is minimal/non-existent.
      The efficiency cores on Intel next year will be very interesting, and AMD finally moving to 5nm, though that is supposedly end of year, will be very interesting to see performance jump with that including the new cache stacking. It's great getting past the stagnation.
      I'm probably upgrading end of next year, will move from laptop (i7 9750H, it's 3 years old now) to PC since moved continents, and things like Rider and VS Code having remote support means I can just have home PC host the stuff (which I do often enough on my NUC if I need to run overnight).

    •  2 года назад +8

      Check the Alexander Ziskind YouTube channel for many, many development benchmarks done on the M1/Pro/Max machines; most videos are very short and to the point.
      In general, CPU-bound work sees very little difference between the Pro and Max chips; you end up seeing more differences caused by the number of cores available on the different versions than by the kind of chip. In some cases, especially single-threaded ones like some JavaScript tests, an MBP16 running a maxed-out i9 might beat the numbers, but if the workflow is multithreaded the M1 chips do it better.
      Unless your workflow really needs more than 32GB of RAM, a 10-core M1 Pro is probably the "sweet spot" for development at the moment.

    • @CinemaSteve
      @CinemaSteve 2 года назад

      My friend is a senior engineer for Apple and he does both iOS and MacOS compiling. He got a Pro for himself and they gave him a Pro for work too because the Max isn't necessary for their developers for the most part. Only certain developers would get allocated a Max but he hasn't heard of any devs getting them.

  • @Banezarian-
    @Banezarian- 2 года назад +2

    Honestly, i love every review Anthony does. His voice is like butter on a subwoofer

  • @legofan2284
    @legofan2284 2 года назад +3

    I would like to see a comparison in machine learning tasks. Maybe the GPU can keep up with the 30-series in that?

  • @EvanBoldt
    @EvanBoldt 2 года назад +8

    The lack of Vulkan, CUDA, or OpenCL support on Macs is absolutely killing multi-platform compatibility even for professional workloads, and games have taken a giant leap backwards.

    • @nigratruo
      @nigratruo 2 года назад +1

      That is Apple's doing; they just remove and destroy industry standards like OpenCL and OpenGL / CUDA (they never supported the most powerful GPUs, which are Nvidia's). On Linux and Windows, when you get a new standard, they let you keep using the old one; it does not just get removed, which would destroy a lot of software. You can still run 32-bit apps on Windows and Linux very well, and that is how you must do it. Apple is just typically arrogant and does not care about its users. That is the reason why they have not had more than 10% market share globally, not once in the 44 years the company has existed.

    • @LiLBitsDK
      @LiLBitsDK 2 года назад +1

      @@nigratruo x86 is stagnant and needs a complete reboot... but no one has the guts for it... Apple did, and they now have quite powerful machines that use little power... perfect? Not yet... but way better for what they are meant for, and on top of that they can game decently... though again not perfectly... yet. But the extra power of the M1 chips, especially the Pro and the Max? Well, it could (should) be interesting for game devs to tap into

  • @madebythebird
    @madebythebird 2 года назад +3

    The 2020 M1/16GB has very similar performance in AE and Daz/Blender to our 2016 3.6GHz, 6-core i7, RTX 2070 PC.
    However, Premiere 2020 is blazingly awesome on the M1, although the PC exports in half the time.

  • @hwstar9416
    @hwstar9416 2 года назад +1

    you guys really need to do code compilation tests, that is honestly all that I'm interested in.

  • @johnreynolds6074
    @johnreynolds6074 2 года назад

    I enjoyed the details and the clarification of the test results. This was a very good video and I am looking forward to your 16" review. Do you think the results would be better if and when these software companies write programs which don't require going backwards to use Rosetta? Why are these software companies not writing more versions that use the capabilities of the Apple SOC? Thanks again for your great reviews.

  • @evoboy67
    @evoboy67 2 года назад +4

    What about software compilation, data analysis and heavy crunching like that? Can you 🙏 test compiling Linux or something similar workflow for the 16” review? Pretty please 🥺 It’s a lot more relevant for someone like me

  • @Gibbbrayden
    @Gibbbrayden 2 года назад +5

    I have the M1 Max. It's the first Apple computer I have owned, and I am nothing but impressed... Sure, I could find something I don't like about it, but I could show you a list of complaints with my last laptops that are far worse. How efficient it is does have a lot of value. My last laptop was $2,000 when I purchased it from Lenovo. I needed a GoPro for a project, realized the memory was full, and it killed my laptop battery before I could get the footage off. Even Chrome would noticeably kill battery life. Having a laptop that is useless without being plugged in sucks.

  • @dbased1915
    @dbased1915 2 года назад +2

    I bought the 16-inch M1 Max with 32GB. It's a beast, a huge improvement over the lacking MacBooks from years prior.

  • @anthonydugarte
    @anthonydugarte 2 года назад +1

    Great content!
    Question: what software can I use to monitor my MacBook Pro's hardware?
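    One hedged option, assuming Python with the psutil package is acceptable (its temperature and fan readings are largely unavailable on macOS, so dedicated monitoring apps go further):

      # Quick hardware snapshot via psutil (assumes `pip install psutil`).
      import psutil

      print("CPU usage per core (%):", psutil.cpu_percent(interval=1, percpu=True))
      mem = psutil.virtual_memory()
      print(f"Memory: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")
      batt = psutil.sensors_battery()
      if batt is not None:
          print(f"Battery: {batt.percent:.0f}% (plugged in: {batt.power_plugged})")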

  • @SirRelith
    @SirRelith 2 года назад +3

    Hope they get eGPUs up and running for the M1 chip soon. Imagine the possibilities.

  • @alexatkin
    @alexatkin 2 года назад +26

    So glad I got the Pro. From what I hear it's the same story with AI compute (something I personally bought it for): since it uses the Neural Engine, there is ZERO improvement moving up to the Max. It's a shame they didn't double those cores. It will be interesting to see if the new Mac minis will differ from the MacBooks or just use the same SoCs; if they double the Neural Engine I will probably get one. Interestingly, AI is where you only get about a doubling in performance going from the M1 Pro to a 5950X + 3080, so a bigger Neural Engine could really make up a ton of ground there.
    What's the battery life, performance and noise level of the Zephyrus when running 3D on battery, though? A big reason I got the Pro was that the performance per watt and noise level were terrific, even manually pushing the fans up so the SoC never goes far over 80C under heavy load.

  • @chris_0725
    @chris_0725 2 года назад +1

    I love how you had to change the graphs to accommodate the Intel/Nvidia notebook

  • @Jayesh8881
    @Jayesh8881 2 года назад

    Really loved the manscaped review.

  • @Ttarler
    @Ttarler 2 года назад +11

    I’m primarily interested in the M1 for ML workloads. While it offers potential for edge applications, most enterprise ML is on RHEL using the x86_64 architecture, so the Intel CPU Macs end up being the better choice for a development workstation (Windows just ain’t it, and find me an IT department willing to deploy native *nix laptops. I’ll wait). That dynamic will probably shift over time, but I’ll probably treasure my Intel MacBooks for longer than I should.
    All this is to say, LTT: love your content, but would appreciate reviews with a data scientist perspective.

    • @NaryVynnsark
      @NaryVynnsark 2 года назад

      Sorry for such a straightforward question. My work is based on Java most of the time. Are the compiling times better on M1? Right now I'm on 10th gen i7 10700k. Should I upgrade to alder lake or M1?

    • @calaphos
      @calaphos 2 года назад +1

      The CPU stuff on NumPy seems really promising. But training on the GPU seems rather meh; at least in computer vision the performance improvement from using tensor cores is just way too large. Seems like performance per watt is roughly on par with a full-power 3090 - not exactly an efficient GPU. But IMHO any GPU-heavy workload is much better suited to remote servers than a laptop.

    • @djagofficial
      @djagofficial 2 года назад

      @@NaryVynnsark m1 pro 10 core model would be better but wait for the mac mini or imac pro

  • @MapleJokerRofl
    @MapleJokerRofl 2 года назад +2

    Cant wait to come back to this video 10 years from now.
    Whats up future me :)
    I love you

  • @Cosmozorb
    @Cosmozorb 2 года назад +2

    Anthony for the win! Thanks for these thorough tests. Interesting results!

  • @keegzwhittal
    @keegzwhittal 2 года назад +1

    But the important question is the battery life on the Zephyrus vs the Mac when pushing their performance capabilities

  • @mikehathaway3659
    @mikehathaway3659 2 года назад +25

    So much of this is really about optimization in code. For those of us who lived through the changes from Carbon to Cocoa to Metal and from Motorola to PPC and then to Intel, one of the things that happened was that after a giant change in architecture, the Macs would get faster over time as software got updated. Even Apple's own OS is still hitting Rosetta. The review is still fair, but in a year the results from the same hardware will most likely be significantly different.

    • @Cat-kp7rl
      @Cat-kp7rl 2 года назад +3

      Steam drains the battery on my 16” Mac (M1 Max) faster than running Windows on Arm (Parallels) + ECAD (Altium) or Keysight ADS for EM field solving. Yeah… just having the Steam launcher running, not even with a game going.
      Oh well, I never intended to game on the Mac anyway since I have a gaming PC… but in terms of work, the Mac can do everything I need it to do in a portable form factor, while maintaining all-day battery life.

    • @LiLBitsDK
      @LiLBitsDK 2 года назад

      @@Cat-kp7rl but you CAN game on a mac, but I am surprised what I can push out of my OG 13" M1 Air

    • @Cat-kp7rl
      @Cat-kp7rl 2 года назад +1

      @@LiLBitsDK I was just backing up his point that unoptimized things can run really bad no matter the device. Like in my case, something as trivial as the Steam launcher

    • @cmfrtblynmb02
      @cmfrtblynmb02 2 года назад

      Exactly. That's another crazy thing about M1. It will just get faster as we get updates. Normally machines will be slower as they age since software gets more complicated.

    • @cmfrtblynmb02
      @cmfrtblynmb02 2 года назад

      @@LiLBitsDK Yep. I was running Diablo 3 without issues. It heats up my laptop from the same year like crazy

  • @kkollsga
    @kkollsga 2 года назад +15

    What's hilarious is that if you read reviews of the Zephyrus, it's constantly referred to as overpriced and underpowered 😂

  • @chico4832
    @chico4832 2 года назад +1

    Anthony, with that last ad transition one can only conclude that you have achieved your final form as a true artist. A poet!

  • @nerdimmunity7672
    @nerdimmunity7672 2 года назад +2

    The results are impressive when weighed against the power consumption.

  • @psycl0ptic
    @psycl0ptic 2 года назад +10

    Not on the 14-inch. If needed, yes on the 16-inch. Best to stick with the Pro on the 14-inch, or if needed the Max with the 24-core GPU; the 32-core is voltage-limited in the 14-inch.

  • @xmetal280
    @xmetal280 2 года назад +19

    These results look to be spot on, and not terribly surprising. The major thing you didn't really touch on, however, is that the Macs deliver this exact performance whether plugged in or unplugged; it makes no difference to them. From the tests I have seen elsewhere, none of the high-end PC laptops even come close to their plugged-in performance on battery, or their battery life drops so significantly that they become much less usable as a work tool.

    • @HoJSimpson
      @HoJSimpson 2 года назад +1

      Exactly this. I mean, who buys a Mac for gaming... I got my PC for that. My new MBP is just for being productive, and it excels at that.

  • @anderty4088
    @anderty4088 2 года назад +1

    [Looks at video] [Slaps it]: This bad boy can fit more than 3 ads!
    Love the sponsors.

  • @CL0WN
    @CL0WN 2 года назад +1

    God I love this guy as a Linus replacement when he's gone / too exhausted to do videos / building his sound setup...

  • @gbessone
    @gbessone 2 года назад +27

    I agree with what you say: the M1 Max is literally only for professional video editors, which is a super ultra niche market. For everyone else, it's not worth it.

    • @wykananda
      @wykananda 2 года назад +10

      I think it'd be more accurate to say media professionals and developers in general. It's absolutely fantastic for professional audio production and software development. Silent the vast majority of the time, and it can easily handle on-location and remote tasks with its awesome battery life, at full power whether plugged in or not. The high-impedance-capable headphone jack and the best sound in a laptop ever don't hurt either. I think it's important to compare Apples to Apples here (pun intended). They're not designed for gamers, they are designed for professionals. As an equal Windows and macOS user, my experience with these has been top-notch. For pros, Apple has hit a home run here IMHO. Also, I think the performance-per-watt should not be ignored, and I don't believe this was mentioned - add that factor to the benchmarks and you'd see some very different results. Energy costs money and affects the environment. And a hot, noisy laptop isn't particularly enjoyable to use day in and day out.

    • @theNimboo
      @theNimboo 2 года назад +3

      Super niche. Because let's face it, the M1 Air can do 4K editing. How many editors need to edit 12 simultaneous 4K streams? Most youtube viewers don't even watch in 4K yet, rofl. I really wish it performed better at 3D design.

    • @pupperemeritus9189
      @pupperemeritus9189 2 года назад

      @@wykananda For audio professionals, though, most were fine with an older-generation MacBook in a high-memory configuration. Also, for people who aren't video-editing/audio professionals, macOS is really, really difficult to use - even more so with ARM. Basic stuff like a volume mixer and any sign of useful window management is absent out of the box. What is the point of spending such a premium to get a subpar experience if you're not a video-editing/audio professional?

    • @wykananda
      @wykananda 2 года назад

      @@pupperemeritus9189 Hi pupper. I'm not sure I understand your comments. Sadly, the previous MacBook laptop generations were all limited to 16GB of RAM, so high-memory configs were simply not possible. Moving to the ARM architecture did not change the underlying operating system, macOS; it simply made the laptop hardware run faster, smoother, quieter, and for much longer on a single battery charge.
      As for the difficult-to-use / sound control / window management points: the latest Windows and macOS are both more than reasonably user-friendly and well-equipped in all these areas - these OSs have both been around for many years and generations now, and it shows. As a multi-OS power user I could nit-pick plenty at both OSs here and there, for sure. However, in my experience, for the countless newbies that I've trained and continue to help, macOS has to get the nod for getting productive and comfortable more quickly, with less frustration and confusion and fewer problems over the long haul. Let's face it, both operating systems are DEEP. They're both very capable and stable at this stage, but either will take time and effort to learn to get the most out of them.
      Curiously, my current "go to" Windows-based laptop is a 2015 MacBook Pro running Boot Camp - ironically, it's easily the best Windows laptop I've ever owned: cool, quiet, fast, stable, good battery life, well-built, expandable - and, of course, it runs macOS like a champ too. I'll likely get another 3-4 good years out of it before I hand it down the line. IMO, the 2015 MBP was the best overall professional laptop ever made for Windows, macOS, or Linux - until now. While I can run the ARM version of Windows on the latest MBP via Parallels and so on, I'll have a new laptop king if/when Microsoft gets Windows fully up to ARM speed and these new killer Macs can boot into it natively.

    • @pupperemeritus9189
      @pupperemeritus9189 2 года назад

      @@wykananda i appreciate your patient reply

  • @zero_burrito
    @zero_burrito 2 года назад +4

    I'm so glad these new macbooks have proper thermals rather than the "bluetooth heatsink" of models prior...

    • @DigitalJedi
      @DigitalJedi 2 года назад

      I know right. It honestly looks like the inside of my gaming laptop in there.

  • @naniSinek
    @naniSinek 2 года назад +2

    Additionally, you have to consider that the M1 Max in the 14-inch model has a cap on GPU frequency compared to the same M1 Max in the 16-inch, as you can see in a RUclips video from Max Tech.

  • @StefanUrkel
    @StefanUrkel 2 года назад +1

    Anthony: could you shed some light on how the M1 Pro 16" compares in Resolve workloads vs Zephyrus G15 3070?

  • @raytsh
    @raytsh 2 года назад +3

    That's the first time I've heard of someone preferring the silver color. I also got a silver one, and looking around online it seems like I'm way in the minority with that decision.

  • @MichaelHickman3D
    @MichaelHickman3D 2 года назад +177

    I am happy Apple is making great arm processors, and I’m also happy Anthony did the review for this episode again. Keep up the great work guys.

    • @davide4725
      @davide4725 2 года назад +3

      Apple makes nothing, thank TSMC

    • @parkerdavis7859
      @parkerdavis7859 2 года назад +16

      @@davide4725 “Thanks TSMC”
      You sound like the kind of guy who loves bringing up John Lennon’s wife beating tendencies every time someone mentions they like the Beatles lmao

    • @davide4725
      @davide4725 2 года назад +1

      @@parkerdavis7859 Cute assumptions kid. Good bye now...

    • @DigitalJedi
      @DigitalJedi 2 года назад +1

      I am also loving the progression in the ARM space. What really excites me isn't the CPU or GPU in these, it's the optimizations they made to get ARM this competitive. They're getting ASIC-like performance for a lot of low-level stuff.

    • @lolomo5787
      @lolomo5787 2 года назад +8

      @@davide4725 I find it funny that you called the other guy "kid" when you clearly have no idea how R&D, design, audit, documentation, subcontracting and manufacturing work in the tech industry.
      "Thank TSMC" lol. Kid, please.

  • @adabujiki
    @adabujiki 2 года назад

    Lol LOVED!!!! the intro.
    Nice touch

  • @Logqnty
    @Logqnty 2 года назад +14

    You could try benchmarking the Macs with the machine learning library TensorFlow; it has a Metal plugin so it can use the GPU.
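
    A minimal sketch of what such a benchmark could look like, assuming the tensorflow-macos and tensorflow-metal packages are installed on the Mac (the model size and synthetic data below are arbitrary, just meant to be run unchanged on each machine):
      import time
      import tensorflow as tf
      # With the Metal plugin installed, the GPU is registered automatically; this confirms it is visible.
      print(tf.config.list_physical_devices("GPU"))
      # Tiny synthetic training run; compare wall-clock time across machines.
      x = tf.random.normal((8192, 1024))
      y = tf.random.normal((8192, 10))
      model = tf.keras.Sequential([
          tf.keras.layers.Dense(2048, activation="relu"),
          tf.keras.layers.Dense(10),
      ])
      model.compile(optimizer="adam", loss="mse")
      start = time.time()
      model.fit(x, y, batch_size=256, epochs=5, verbose=0)
      print(f"5 epochs took {time.time() - start:.1f}s")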

  • @MaddTheSane
    @MaddTheSane Год назад

    IIRC, there was a regression in MoltenVK that was found when they added the Metal back-end.

  • @leonfrancis3418
    @leonfrancis3418 2 года назад +34

    So much for faster than a 3060, lol.
    The media engine seems to be giving the GPU the illusion of more performance than it's actually capable of.

    • @shibbychingching4845
      @shibbychingching4845 2 года назад +12

      Yeah, it's all smoke and mirrors. People that actually believed these claims..I mean..first time? 🤣

    • @bear2507
      @bear2507 2 года назад +10

      @@shibbychingching4845 I mean, nowhere did I hear Apple say it was a gaming machine. The illusion is what you keep telling yourself.

    • @foxley95
      @foxley95 2 года назад +5

      @@bear2507 The illusion is that Apple claimed the performance is about the same as an RTX 3080. The M1 not only barely beat the RTX 3060, it's not even close to the RTX 3080 - and I mean the mobile RTX GPUs. An RTX is a GAMING GPU, so when they made those claims people obviously thought about its gaming performance. They should have compared it to a professional GPU like a Quadro instead of being either brave or stupid enough to compare it to RTX.

    • @-Blue-_
      @-Blue-_ 2 года назад +1

      @@foxley95 The M1 doesn't have the capability to beat even 1% of a Quadro GPU in 3D tasks.

    • @marcusaurelius6607
      @marcusaurelius6607 2 года назад

      @@foxley95 Yeah, I'll go tell my research lab to shut down our datacenter with hundreds of 3080s, because some kid on youtube said these GPUs are for games only and not generic compute. The comments are full of children who have never touched anything outside Minecraft but have an opinion on everything, hahah.

  • @jordananderson2728
    @jordananderson2728 2 года назад +10

    Anthony is a wonderful personality and knows how to mix humour and information supremely well. Love his Mac content!

  • @pedropicapiedra5365
    @pedropicapiedra5365 2 года назад

    Great review, thanks for sharing.

  • @mhoeij
    @mhoeij 2 года назад +2

    Could the difference in battery life be caused by the 64 vs 32 GB of RAM rather than the M1 Max vs M1 Pro chip?

  • @SaarN1337
    @SaarN1337 2 года назад +12

    0 dislikes! Great vid LTT👍👍👍

    • @Z3t487
      @Z3t487 2 года назад +3

      Starting to hate this joke. RUclips should make the dislikes public again.

  • @jimmydoo
    @jimmydoo 2 года назад +20

    Really looking forward to hearing about the results of that Thunderbolt to HDMI 2.1 adapter, and if it will finally allow for the use of an 8k@60hz display with MacOS!

  • @xern_
    @xern_ 2 года назад

    That intro was the best thing I’ve seen all year 😂 much love Anthony!

  • @cliffgeo
    @cliffgeo 2 года назад +1

    That 14" silver looks dope doe, I would literally trade my 2018 4-port MB Pro for it.

  • @frostilver
    @frostilver 2 года назад +3

    I thought you'd add battery life tests with gaming on the M1 Maxes...

  • @_graymalkin
    @_graymalkin 2 года назад +8

    I’d really love a software dev take on this. For my use case a fast CPU, good battery life and 64GB of RAM are compelling - but that's distinctly not video rendering.

    • @DeeSnow97
      @DeeSnow97 2 года назад +2

      Developer here, I wouldn't buy any of these besides the base-level MacBook non-Pro. You can literally code on a Raspberry Pi; unless you're compiling something crazy-complex like an entire browser, you're not going to feel the difference, so why pay extra for literally nothing? A USB-A port would have been a compelling addition, but oh well.

    • @JackiePrime
      @JackiePrime 2 года назад +13

      Other developer here. I've never found myself desperate for a USB-A port while developing, but I have definitely found a use for a better CPU and more RAM. Not sure what serious developers are developing on trash hardware, tbh.

    • @DeeSnow97
      @DeeSnow97 2 года назад +2

      @@JackiePrime Web, for example. I don't develop on trash hardware because I can afford better equipment, but if I still had my old FX-8320 it wouldn't slow me down in any way. Peripherals are way more important at that point.
      Also, every single hardware debugger uses USB-A, and even if you just want hobbyist stuff have fun hunting down a USB mini-B (not micro-B) to USB-C cable just because you can't use the included mini-B to A.
      But it does make sense, if you only develop for iOS (which is literally the only reason I've ever considered buying a Mac) then you won't run into any of those issues, and Xcode being a hot mess does necessitate a faster CPU and more RAM. But there's a lot more to development than just Apple's walled garden, and if you step out of it it's a lot more important to be able to mess with any device you want to.

    • @seren1ty755
      @seren1ty755 2 года назад +10

      Also a developer here. The GPU on the Max is absolutely useless and 64 GB of RAM is overkill for my line of work. 32 GB of RAM and the 10-core Pro is plenty; I plan to keep it for about 4 to 5 years.

    • @Juan.Irizar
      @Juan.Irizar 2 года назад +2

      Another developer here. I have the M1 Max with 64GB, the 32-core GPU and a 1TB SSD. While this setup is overkill, first, I can afford it, and it feels good not having to worry about performance while working. On the technical side, running WebStorm and other IDEs, multiple Node apps, multiple Docker containers, and Electron apps that suck (like Slack) takes a toll on any computer. If you can afford it, especially since software engineering is a well-paid job, plus the resale value down the line, why not?

  • @kc-st
    @kc-st 2 года назад

    The opening sequence just blew me away man...

  • @kelownatechkid
    @kelownatechkid 2 года назад

    Great video Anthony!